Kolanaki ,
@Kolanaki@yiffit.net avatar

If they could do that, they would probably see how God damn miserable most people are. If they used that to change and make them not miserable, I don’t see it being dangerous. But more than likely it will be more “your sadness doesn’t vibe with us. You’re fired.”

Onii-Chan ,
@Onii-Chan@kbin.social avatar

[Thread, post or comment was deleted by the author]

    Lyrl ,

    Sounds like you are fighting on behalf of the whole world. I hope you get some time with yourself, or with a smaller circle that's positive, and a break from the dumpster fires of modern civilization.

    HubertManne ,
    @HubertManne@kbin.social avatar

    I hope they keep a lot of drive space for the depression folder.

    Rozauhtuno ,
    @Rozauhtuno@lemmy.blahaj.zone avatar

    Everyday we get closer to a cyberpunk dystopia.

    ThatWeirdGuy1001 ,
    @ThatWeirdGuy1001@sh.itjust.works avatar

    I DON’T NEED TO BE/ACT HAPPY TO GET MY JOB DONE

    fckreddit ,

    This needs to be shut down. It’s the most dystopian thing I have ever read.

    Anticorp ,

    What’s crazy is that this was already fully functional and in use at least 8 years ago. Idk how this has stayed out of the headlines until now. Microsoft had a working demo of this in their visitor center in 2015 and was already using it in multiple places.

    AllonzeeLV , (edited )

    Too bad it gets the emotion and not the context.

    I’d love to be fired because “I hate making money for these greedy ass capitalist douchebags” pops up on a screen whenever I come in.

    The idea that employers should even be allowed to know what their employees are feeling is a new low for our modern Orwellian dystopia.

    thanevim ,

    The thing is, though, I don't see how something like this could even work out.

    Like, you hire employee 1, they get frustrated at something overnight. You fire them for being upset. Now you have to fill the seat. Employee 2 is brought on. They get told what happened to the person they replaced. They leave or are fired for having emotion and being human. This repeats ad nauseum.

    AllonzeeLV , (edited )

    I’m guessing it’s going to be implemented as identifying “persistent negative attitudes” and as justification to fire anyone in non-fire-at-will locales.

    It could also be used as bullshit to deny raises and promotions if your grateful or motivated indexes weren’t high enough.

    FringeTheory999 ,

    so, basically a tool to suss out which employees have undisclosed mental health issues that the employer can’t legally ask about. cool. cool.

    radioactiveradio ,

    Let’s be real, most of us would get weeded out at the interview when they start spilling all the “we’re like a family” bullshit.

    randon31415 ,

    What type of family? Found family? The kind of family that requires restraining orders for abuse? The kind that only sees each other on Christmas?

    Fapper_McFapper ,

    Great, now I have to walk around with a permanent poker face.

    kool_newt ,

    It’s about time we start holding the engineers building these technologies responsible directly.

    I’m not talking about scientists expanding knowledge, I’m talking specifically about the engineers building these technologies.

    Is mood recognition a tool useful for anything other than maintaining power over others (actually curious)?

    const_void ,

    Seriously. Why are people choosing to work for these companies? There are other ways to make a buck. Have some fucking morals.

    Olgratin_Magmatoe ,

    The threat of homelessness and starvation is quite coercive. That’s why people still work at these kinds of jobs.

    PeterPoopshit , (edited )

    At a certain point, not just the companies doing this are to blame but the people working for them as well. Who tf can support this kind of thing? People need to have some self fucking respect.

    For example, we could probably have the cure for cancer by now if they spent half the effort on it that they spent making unbeatable THC drug tests. It’s clear where society’s priorities are. Improving lives does not generate profit.

    Neato ,
    @Neato@kbin.social avatar

    You can to an extent, but that's a losing venture. If public opinion goes against this tech hard enough, it'll keep some people from working in those industries. BUT if those products are profitable enough, they will simply pay more and that'll be moot.

    Attacking the people who are earning a living isn't the answer. Most people take the job with the best combo of pay and work/life balance they can find in their area, or wherever they can afford to move. Not that many have the luxury to pick and choose based on their morality. And if compensation is high enough, moral objections are even less likely to win out.

    It's far easier to try to prevent this tech from being used at all. I know political action is hard as hell but it's a lot easier than trying to ostracize an entire industry's worth of workers. It may feel easier to denigrate faceless individuals but that won't accomplish anything. Plenty of people work for weapons manufacturers and such.

    kool_newt , (edited )

    Plenty of people work for weapons manufacturers and such.

    And those are bad people. If you work to build technology used to maintain power when you have an option not to, what else can that be called? These people are not desperate for a job.

    I’m an engineer, I quit Intel (after the startup I worked for was acquired) because Intel powers much of the MI complex. I quit Illumina when it became clear I was directly assisting with state level genetic experiments. As an engineer I could easily get a job elsewhere where I was not directly contributing to the downfall of my fellow humans.

    Take McDonald’s for example. There’s a difference between someone who needs a job working in a restaurant and an engineer working for McDonald’s figuring out how to more efficiently slaughter animals, paid only to be concerned about their employer’s profit – that engineer could go work to more efficiently bake cookies.

    Neato ,
    @Neato@kbin.social avatar

    These people are not desperate for a job.

    You're painting with a firehose. Some people are.

    I’m an engineer, I quit Intel (after the startup I worked for was acquired) because Intel powers much of the MI complex. I quit Illumina when it became clear I was directly assisting with state level genetic experiments. As an engineer I could easily get a job elsewhere where I was not directly contributing to the downfall of my fellow humans.

    You are what we call privileged. Maybe you should...check it?

    PiecePractical ,

    Yeah, I was a field service tech at a machine tool distributor for 15 years. One day about 7 years ago I realized that more of our customers than not were involved in some kind of arms manufacturing. Everything from components to military armaments to places making parts for AR-15s. Didn’t start that way but the business drifted into that market over time.

    I decided to move on, and it took me all of 5 years to find a position that: a) I was qualified for, b) paid enough that I wouldn’t lose my house, and c) was relatively safe from drifting into the same customer base as the last company.

    I don’t even have kids and this whole process was absolutely terrifying. I can easily see how someone with a family to support or less stability in their life wouldn’t feel like leaving was a possibility.

    Buelldozer ,
    @Buelldozer@lemmy.today avatar

    Is mood recognition a tool useful for anything other than maintaining power over others (actually curious)?

    If you ever want a real General AI then it will need the ability to recognize the mood of the person it’s interacting with. ESPECIALLY if you want to use it for things like Mental Health Counseling.

    PopOfAfrica ,

    Good thing. I DON’T want a general AI.

    kool_newt ,

    Mental Health Counseling.

    Thanks, that’s a valid answer, the kind I was looking for. Though we don’t have actual AI and probably won’t have actual AGI for at least a good decade (what we currently have is machine learning and complex decision trees, which appear kinda intelligent to us in 2023).

    Anticorp ,

    There are uses for it. They can track the average mood of an entire room over a period of time. If you use that somewhere like a restaurant, or a banquet venue, then that information can be useful for tweaking the policies, environment, prices, etc. Of course an actual human could do this too, just by being there. I think it’ll get the most use at places like casinos where they’re always using psychological tricks to make people want to gamble. Ironically I don’t think that “happy” is the mood they’ll be aiming for.

    kool_newt ,

    Ya, I guess I can see some uses for it, but nothing that makes the risks of its existence worth it.

    It seems like every tool/tech will be used by good people to do good things and bad people to do bad things. Some things like a spoon are handy for getting good things done but not very useful to bad people to do bad things with. Other tools like mood recognition might be quite handy for bad people looking to control others, but only moderately useful to good people.

    Tools in that second group I think we should be wary of letting them exist. Just because something can be done doesn’t mean it should be done or that it can be called “progress”.

    Anticorp ,

    It has already existed for a decade or so. I’m surprised it hasn’t made headlines before. I saw a working demo of it at the Microsoft Visitor Center about 8 years ago. In addition to estimating your mood, it also assigns you a persistent ID, estimates your height, weight, eye color, hair color, ethnicity, and age. It is scarily accurate at all of those things. That ID can be shared across all linked systems at any number of locations. I completely agree with you that there are a lot of concerning, if not downright terrifying implications of this system. It’s a privacy nightmare.
