• ZILtoid1991@lemmy.world · 1 point · 10 minutes ago

    That image reminds me of a meme from “Scientific diagrams that look like shitposts”. It was titled something like “Mask of Damascus(?)/Triagones(?) - Acquire it (from a prisoner(?)) with a scimitar!”

  • skisnow@lemmy.ca · 15 points · 2 hours ago

    But what if bias was not the reason? What if your face gave genuinely useful clues about your probable performance?

    I hate this so much, because spouting statistics is the number one go-to of idiot racists and other bigots trying to justify their prejudices. The whole fucking point is that judging someone’s value based on physical attributes outside their control is fucking evil, and increasing the accuracy of your algorithm only makes it all the more insidious.

    The Economist has never been shy to post some questionable kneejerk shit in the past, but this is approaching a low even for them. Not only do they give the concept credibility, but they’re even going out of their way to dishonestly paint it as some sort of progressive boon for the poor.

  • pyre@lemmy.world · 3 points · 1 hour ago

    this should be grounds for a prison sentence. open support for Nazism shouldn’t be covered by free speech laws.

  • buttnugget@lemmy.world · 20 points · 4 hours ago

    Actually, what if slavery wasn’t such a bad idea after all? Lmao they never stop trying to resurrect class warfare and gatekeeping.

  • wilfim@sh.itjust.works · 22 points · edited · 8 hours ago

    This is so absurd it almost feels like it isn’t real. But indeed, the article appears when I look it up.

      • 6nk06@sh.itjust.works · 2 points · 2 hours ago

        I always pity the Germans, who don’t deserve this but have carried this shame since the war, and it’s worse now that the Nazis have become an international club.

  • Bob Smith@sopuli.xyz · 177 points · edited · 2 days ago

    Wow. If a black box analysis of arbitrary facial characteristics is more meritocratic than the status quo, that speaks volumes about the nightmare hellscape shitshow of policy, procedure and discretion that resides behind the current set of ‘metrics’ being used.

      • scratchee@feddit.uk · 10 points · 4 hours ago

        I really must commend you for overcoming your natural murderous inclinations and managing to become a useful member of society despite the depression in your frontal lobe. Keep resisting those dark temptations!

        • 6nk06@sh.itjust.works · 2 points · 2 hours ago

          Do we lump all the teenagers with acne in the incel category, and put them in prison? I’m just asking questions.

    • UnderpantsWeevil@lemmy.world · 40 points · 2 days ago

      The gamification of hiring is largely a result of businesses de-institutionalizing Human Resources. If you were hired on at a company like Exxon or IBM in the 1980s, there was an enormous professionalized team dedicated to sourcing prospective hires, vetting them, and negotiating their employment.

      Now, we’ve automated so much of the process and gutted so much of the actual professionalized vetting and onboarding that it’s a total crapshoot as to whom you’re getting. Applicants aren’t trying to impress a recruiter; they’re just aiming to win the keyword search lottery. Businesses aren’t looking to cultivate talent long term, just fill contract positions at below-contractor rates.

      So we get an influx of pseudo-science to substitute for what had been a real sociological science of hiring. People promising quick and easy answers to complex and difficult questions, on the premise that they can accelerate the churn of staff without driving up cost of doing business.

      • Bob Smith@sopuli.xyz · 19 points · 2 days ago

        Gotcha. This is replacing one nonsense black box with a different one, then. That makes a depressing kind of sense. No evidence needed, either!

    • Bob Smith@sopuli.xyz · 31 points · 2 days ago

      All of that being typed, I’m aware that the ‘If’ in my initial response is doing the same amount of heavy lifting as the ‘Some might argue’ in the article. Barring the revelation of some truly extraordinary evidence, I don’t accept the premise.

    • technocrit@lemmy.dbzer0.com · 13 points · 2 days ago

      A primary application of “AI” is providing blackboxes that enable the extremely privileged to wield arbitrary control with impunity.

  • psycotica0@lemmy.ca · 101 points · edited · 2 days ago

    "Imagine appearing for a job interview and, without saying a single word, being told that you are not getting the role because your face didn’t fit. You would assume discrimination, and might even contemplate litigation. But what if bias was not the reason?"

    Uh… guys…

    Discrimination: the act, practice, or an instance of unfairly treating a person or group differently from other people or groups on a class or categorical basis

    Prejudice: an adverse opinion or leaning formed without just grounds or before sufficient knowledge

    Bias: to give a settled and often prejudiced outlook to

    Judging someone’s ability without knowing them, based solely on their appearance, is, like, kinda the definition of bias, discrimination, and prejudice. I think their stupid angle is “it’s not unfair because what if this time it really worked though!” 😅

    I know this is the point, but there’s no way this could possibly end up with anything other than a lazily written, comically clichéd, Sci Fi future where there’s an underclass of like “class gammas” who have gamma face, and then the betas that blah blah. Whereas the alphas are the most perfect ughhhhh. It’s not even a huge leap; it’s fucking inevitable. That’s the outcome of this.

    I should watch Gattaca again…

    • Tattorack@lemmy.world · 29 points · 2 days ago

      Like every corporate entity, they’re trying to redefine what those words mean. See, it’s not “insufficient knowledge” if they’re using an AI powered facial recognition program to get an objective prediction, right? Right?

      • JackbyDev@programming.dev · 1 point · edited · 6 hours ago

        The most generous reading I can think of is that facial structure is not a protected class in the US, so they’re saying it’s technically okay to discriminate against.

    • morriscox@lemmy.world · 12 points · 2 days ago

      People see me in cargo pants, polo shirt, a smartphone in my shirt pocket, and sometimes tech stuff in my (cargo) pants pockets and they assume that I am good at computers. I have an IT background and have been on the Internet since March of 1993 so they are correct. I call it the tech support uniform. However, people could dress similarly to try to fool people.

      People will find ways, maybe makeup and prosthetics or AI modifications, to try to fool this system. Maybe they will learn to fake emotions. This system is a tool, not a solution.

    • WhyJiffie@sh.itjust.works · 7 points · 2 days ago

      I think their stupid angle is “it’s not unfair because what if this time it really worked though!”

      I think their angle is “it’s not unfair because the computer says it!”. Automated bias. Offloading liability to an AI.

  • panda_abyss@lemmy.ca · 67 points · edited · 2 days ago

    Racial profiling keeps getting reinvented.

    Fuck that.

    They then used data on these individuals’ labour-market outcomes to see whether the Photo Big Five had any predictive power. The answer, they conclude, is yes: facial analysis has useful things to say about a person’s post-MBA earnings and propensity to move jobs, among other things.

    Correlation vs causation. More attractive people will be defaulted to better negotiating positions. People with richer backgrounds will probably look healthier. People from high stress environments will show signs of stress through skin wrinkles and resting muscles.

    This is going to do nothing but enforce systemic biases, but in a kafkaesque Gattaca way.

    And then of course you have the garden of forking paths.

    These models have essentially no constraints on their features, so we have an extremely large feature space, and we train the model to pick features predictive of the outcome. Even the process of training, evaluating, and then selecting the best model at this scale ends up being essentially p-hacking.
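That selection effect is easy to demonstrate. A toy sketch (all sizes, seeds, and names are illustrative, not the study’s method): score thousands of pure-noise features against a random outcome, keep the best one, and the “winner” still looks predictive.

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_features = 200, 5000

# Arbitrary features with no real relationship to the outcome.
X = rng.normal(size=(n_people, n_features))
outcome = rng.normal(size=n_people)  # stand-in for "earnings": pure noise

# "Train, evaluate, select the best": keep the single most correlated feature.
Xc = X - X.mean(axis=0)
yc = outcome - outcome.mean()
corrs = (Xc * yc[:, None]).mean(axis=0) / (X.std(axis=0) * outcome.std())
best = int(np.argmax(np.abs(corrs)))
print(f"best of {n_features} noise features: r = {corrs[best]:+.2f}")

# The selected correlation looks meaningful even though every feature was
# noise: the selection step is the p-hack.
```

With enough features, some correlation always clears whatever bar you set; an unconstrained model search does this implicitly.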

    • Jason2357@lemmy.ca · 1 point · 28 minutes ago

      I can’t imagine a model trained like this /not/ ending up encoding a bunch of features that correlate with race. It will find the white people, then reward itself as the group does statistically better.

    • ssladam@lemmy.world · 1 point · 4 hours ago

      Exactly. It’s like saying that since every president has been over 6’ tall we should only allow tall people to run for president.

    • GenderNeutralBro@lemmy.sdf.org · 5 points · 14 hours ago

      The problem here is education.

      And I’m not just talking about “average joes” who don’t know the first thing about statistics. It is mind-boggling how many people with advanced degrees do not understand the difference between correlation and causation, and will argue until they’re blue in the face that it doesn’t affect results.

      AI is not helping. Modern machine learning is basically a correlation engine with no concept of causation. The idea of using it to predict the future is dead on arrival. The idea of using it in any prescriptive role in social sciences is grotesque; it will never be more than a violation of human dignity.
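A minimal sketch of that correlation-vs-causation gap (variable names are invented stand-ins, not anything from the article): a least-squares fit happily credits a proxy that causes nothing, because a hidden confounder drives both the proxy and the outcome.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

confounder = rng.normal(size=n)            # e.g. family wealth (unobserved)
proxy = confounder + rng.normal(size=n)    # e.g. an appearance score it shapes
outcome = confounder + rng.normal(size=n)  # e.g. earnings it also shapes

# Least-squares slope of outcome on the proxy alone: clearly nonzero.
slope = np.cov(proxy, outcome, ddof=0)[0, 1] / proxy.var()
print(f"fitted slope on the proxy: {slope:.2f}")

# Yet intervening on the proxy would change nothing about the outcome:
# a pure correlation engine "predicts" with a causally inert variable.
```

The fit is a perfectly good predictor and a completely wrong guide to action, which is exactly the prescriptive failure mode.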

      Billions upon billions of dollars are being invested in putting lipstick on that pig. At this point it is more lipstick than pig.

  • AmbitiousProcess (they/them)@piefed.social · 49 points · 2 days ago

    The study claims that they analyzed participants’ labor market outcomes, that being earnings and propensity to move jobs, “among other things.”

    Fun fact, did you know white men tend to get paid more than black men for the same job, with the same experience and education?

    Following that logic, if we took a dataset of both black and white men, then used their labor market outcomes to judge which one would be a good fit over another, white men would have higher earnings and be recommended for a job more than black people.

    Black workers are also more likely to switch jobs, one of the reasons likely being because you tend to experience higher salary growth when moving jobs every 2-3 years than when you stay with a given company, which is necessary if you’re already being paid lower wages than your white counterparts.

    By this study’s methodology, that person could be deemed “unreliable” because they often switch jobs, and would then not be considered.

    Essentially, this is a black box that gets to excuse management saying “fuck all black people, we only want to hire whites” while sounding all smart and fancy.

    • shawn1122@sh.itjust.works · 12 points · 2 days ago

      The goal here is to go back to a world where such racial hierarchies are accepted, but without human accountability. This way you are subjugated arbitrarily, but hey, the computer said so, so what can we do about it?

    • Basic Glitch@sh.itjust.works · 2 points · edited · 4 hours ago

      Haven’t you heard? Palantir CEO Says a Surveillance State Is Preferable to China Winning the AI Race.

      Trump’s current Science Advisor (who was selected by Peter Thiel) gave an interview back in ~2019 where he kept insisting the U.S. was at a disadvantage to China in the AI race because we didn’t have access to the level of surveillance data China had (which it turns out we fucking created and sold to them). He also used this point to argue against any regulations for facial recognition tech because, again, it would put us at a disadvantage.

      But don’t worry, because the goal is to have an authoritarian surveillance state with “baked in American values,” so we won’t have to worry about ending up like China did with the surveillance tools we fucking sold them.

      I’m not sure what values he’s claiming will be somehow baked into it (because again, we created it and sold it to China). My mind conjures up a scenario of automatic weapons and a speaker playing a screeching bald eagle, but maybe we’ll get some Star-Spangled Banner thrown in there too.

  • entwine@programming.dev · 26 points · 2 days ago

    This fascist wave is really bringing out all the cockroaches in our society. It’s a good thing you can’t erase anything on the internet, as this type of evidence will probably be useful in the future.

    You’d better get in on a crypto grift, Kelly Shue of the Yale School of Management. I suspect you’ll have a hard time finding work within the next 1-3 years.

    • 3abas@lemmy.world · 6 up, 1 down · 2 days ago

      They absolutely can erase things on the internet, are you archiving this for when the other archives die? Are you gonna be able to share it when the time comes? And will anyone care?

      • valek879@sh.itjust.works · 2 points · 2 days ago

        I have some spare storage! What I want to do with it is archive very important information, documents, and/or scientific papers. I don’t mind if it’s the same shit others have; I just want to be part of retaining information. I’m trans, and last time fascists were in power we lost 100 years of progress toward being able to exist openly, so I’m pretty eager to archive information.

        Either this, or it’d be cool to be part of a decentralized database that is searchable and readable.

        I could probably find somewhere between 1 and 10 TB to donate to the cause in perpetuity. But I don’t know how to do this myself, what to save, or whether there are groups already doing this type of thing.