this post was submitted on 04 Feb 2024
85 points (97.8% liked)

Technology


I lost my job after AI recruitment tool assessed my body language, says make-up artist

A make-up artist says she lost her job at a leading brand after an AI recruitment tool that used facial recognition technology marked her down for her body language.

top 32 comments
[–] FlyingSquid@lemmy.world 72 points 9 months ago (4 children)

Absolute bigotry against neurodivergent people. Normalizing body language is exactly the sort of prejudice neurodivergent people have to put up with all the time.

[–] celerate@lemmy.world 18 points 9 months ago (1 children)

Yeah, I am very likely autistic. I wonder when the lawsuits will start.

[–] FlyingSquid@lemmy.world 30 points 9 months ago (1 children)

Thinking more on this, you don't even need to have autism or ADHD or any other form of diagnosable neurodivergence. You could just be an introverted person who doesn't do well in such situations.

And then there's the nationality issue. Different nationalities and cultures have different body languages. It is disrespectful to make eye contact in Japan and expected in the U.S. So what does this AI do with a Japanese applicant?

[–] originalucifer@moist.catsweat.com 11 points 9 months ago (1 children)

resting bitch face is a thing. i am constantly having to tell people i am not angry, that's just what i look like.

like jim breuer and his 'resting stoned face'

[–] FlyingSquid@lemmy.world 7 points 9 months ago

It sounds like a few people stand to get rich from suing whatever company made this crap into oblivion.

[–] psud@lemmy.world 17 points 9 months ago

And people from a different culture than the people who trained the AI

[–] Brainsploosh@lemmy.world 6 points 9 months ago

Or brought up by a neurodivergent parent, or sibling, or have an ND partner

[–] agitatedpotato@lemmy.world 4 points 9 months ago

That was my first thought too, anyone with a condition with a sensory component is inherently discriminated against.

[–] Meron35@lemmy.world 28 points 9 months ago (2 children)

This isn't new. Recruiting firms such as HireVue have been pushing out "AI" interviewing platforms that automatically judge your body language, fashion choices, tone of voice, etc. since at least 2018.

Companies are using AI to stop bias in hiring. They could also make discrimination worse. - https://www.vice.com/en/article/qvq9ep/companies-are-using-ai-to-stop-bias-in-hiring-they-could-also-make-discrimination-worse

[–] Shirasho@lemmings.world 23 points 9 months ago (2 children)

For a brief moment I worked in that industry as a programmer. The whole point is not to find the most qualified candidate but to find the one that fits into the company culture the most in order to reduce turnover. These algorithms will throw away applications from people of color because they have "behaviors not in line with the company culture" or applications from disabled people because they would "not react properly to certain situations".

Of course they aren't explicitly rejecting these people, but the questions and answers on the tests for applications are specifically and painstakingly crafted to filter out these people without making it clear what type of person the question is trying to filter out.

This doesn't necessarily have to do with the AI in question, but my point is that the entire hiring/firing process is totally fucked, and companies are constantly looking for ways to get around discrimination laws.

[–] linearchaos@lemmy.world 11 points 9 months ago

Using an AI to grade someone's body language seems like a horrible thing.

Although I will say there is some validity to being careful about who you hire company culture wise and I'm not talking about race gender or disability.

We've turned down the 'best programmer' numerous times, some people that really had some solid skills, because they came in aggressive and brash.

One guy got his "sorry, but no thanks," and said, look at my resume, I'm an absolute master at everything you do. He wasn't kidding; he was very good. We told him we recognized his skills, but that socially he was difficult and abrasive even in interviews, and there was no way we could subject the rest of the company to him. He unleashed a string of profanities and asked couldn't we just have him work somewhere else on his own separate projects. No, that wasn't going to be an option.

Nobody wants to hire somebody who's going to make a workplace toxic. That means sometimes you turn down some of the more skilled candidates, but you can always find somebody nicer and train/educate them.

As far as race, gender, quirks, we have you meet with everybody, a group takes you out to lunch. You can be shy, flighty, uncomfortable, awkward, the basic test is, can you mostly do the job and would other people want to work with you. And if the people come back with the answer of no, we don't bring you on. We've done that since the very beginning, so everybody there is already pretty much a tolerant nice person.

I had this one guy interview for my department. He made it through the morning interviews, no problems. Gold star. The lunch crew took him out to lunch. He turned it into a people-watching affair and started making horrible comments about all the people coming in the door. One of the strongest personalities I know on the lunch crew came back to me and said he made them very uncomfortable. I sent him packing.

I hope we don't get to the point where all jobs are using AI to weed people out without humans checking behind it.

[–] Nawor3565@lemmy.blahaj.zone 6 points 9 months ago

Funny. That sounds exactly like how they tried to use "intelligence tests" to prevent Black people from voting. The questions didn't explicitly exclude Black people, but were written in a vague and subjective way so that the test-giver could claim that any answer was right/wrong and thereby exclude anyone they wanted.

[–] TheObviousSolution@lemm.ee 12 points 9 months ago

They are not using it to stop bias. If history has proven anything, it's that AI is biased as shit. They are using AI to excuse bias, because "computers ergo cold hard logic," while ignoring that these systems aren't trained on ethical and moral considerations.

[–] RotaryKeyboard@lemmy.sdf.org 23 points 9 months ago (1 children)

If a company requires you to re-apply for the job you already have, you lost your job long before you ever recorded yourself with HireVue.

[–] treadful@lemmy.zip 15 points 9 months ago (3 children)

These kinds of tools can easily just be the fall guy. An excuse they use to get rid of you. Then if you complain, they can just be like "the AI did it!"

[–] Taleya@aussie.zone 16 points 9 months ago* (last edited 9 months ago)

Suddenly, viciously reminded of that quote: "If a machine cannot be held accountable, it cannot be allowed to make a management decision" (paraphrased)

...should prooooobably start legislating that shit

[–] wizardbeard@lemmy.dbzer0.com 6 points 9 months ago

Just like return to office, relocations, and the overwhelming majority of performance metrics.

Cheap excuses for shit managers that can't or won't handle their employees (and firing of them) properly.

[–] BearOfaTime@lemm.ee 0 points 9 months ago

BTDT, and seen it a number of times with other tools.

[–] HeavyDogFeet@lemmy.world 23 points 9 months ago (1 children)

This shit seems like the perfect basis for a discrimination lawsuit.

[–] Khanzarate@lemmy.world 10 points 9 months ago

Only if the AI used discriminatory criteria from a protected class.

They CAN fire you because they feel you're likely to sue. They can't retaliate against a lawsuit, but there isn't one yet. At-will employment sucks, and the thing that protects against this is a union, not discrimination laws.

[–] TheObviousSolution@lemm.ee 20 points 9 months ago (2 children)

It classified her as "most likely to be critical and sue company for ethical violations".

But seriously, I don't know what it is with the AI craze. Today, HAL 9000 seems like a documentary, because that's how most of these AIs behave: highly reliable until they suddenly go completely off the deep end. They are at their worst when deployed in highly subjective and dynamic situations, like the one mentioned in this article.

[–] Theoriginalthon@lemmy.world 5 points 9 months ago

All I'm reading from this is that the company has "ethical violation" problems and should probably be investigated.

[–] TORFdot0@lemmy.world 1 points 9 months ago (1 children)

What the hell, is this some sort of yearbook superlative? Absolutely ridiculous.

[–] TheObviousSolution@lemm.ee 1 points 9 months ago

This is lemmy's, sir.

[–] _number8_@lemmy.world 14 points 9 months ago (3 children)
[–] GraniteM@lemmy.world 1 points 9 months ago

What's twisted is that, if you go way out at the edge of the curve, you'll find people who either actually do enjoy any horrible job you can think of, or else are willing to do an earnest imitation of enjoying it. Large enough employers can search through a large enough pool, using sophisticated enough tools, to find these weirdos and try to employ only the kinds of lunatics who would say "I love this job, and I think a pizza party is an acceptable alternative to a raise!"

[–] HerbalGamer@sh.itjust.works 1 points 9 months ago

Woo David Mitchell! <3

[–] randon31415@lemmy.world 12 points 9 months ago
[–] bionicjoey@lemmy.ca 7 points 9 months ago (1 children)

To paraphrase Groucho Marx, I don't want to work anywhere that would use an AI body language analyzer in their recruitment pipeline.

No way a place that does that has a good culture.

[–] BearOfaTime@lemm.ee 3 points 9 months ago

Well, at least not very soon, for sure.

May have had a good culture, then the C-class jackasses decided to destroy it. And that's being optimistic (my default is cynicism).

[–] LodeMike@lemmy.today 1 points 9 months ago