this post was submitted on 07 Aug 2023
312 points (97.9% liked)

Technology


In every reported case where police mistakenly arrested someone using facial recognition, that person has been Black

Facial recognition software has always had trouble telling Black people apart, yet police departments are still using it.

all 10 comments
[–] dbilitated@aussie.zone 38 points 1 year ago

to be fair that happened a lot before AI existed

[–] Mic_Check_One_Two@reddthat.com 26 points 1 year ago (2 children)

This isn’t new. It’s been a known problem for a long time, because facial recognition software is mostly trained on white faces. So it gets really, really good at differentiating between white people. But with Black people making up only a tiny fraction of the sample data, it basically just learns to differentiate them in broad strokes. It’s good at telling them apart from white people, but not much else.

[–] phx@lemmy.ca 29 points 1 year ago* (last edited 1 year ago) (1 children)

It's not just a training issue. Lighter tones reflect light; darker tones absorb it. Cameras and even simple sensors have long had trouble with dark skin tones because of the lower reflectivity and contrast. 3D scanners, even current models, have similar issues with objects that have black parts, for the same reason. Training on more diverse data can help, but there's still an underlying technical difficulty to overcome as well (which is also a good reason that using facial recognition in this manner is just bullshit, period).

[–] Shihali@sh.itjust.works 3 points 1 year ago

As a technological problem it could have a partial technological solution: the darker the skin, the higher the threshold to declare a match. This would also mean more false negatives (real matches not caught by the software), but there's not much to be done about that.
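A tone-dependent threshold like that is easy to sketch. Everything below — the function names, the 0.80 base score, the 0–1 skin-tone scale — is made up for illustration, not taken from any real recognition system:

```python
# Hypothetical sketch: raise the similarity bar as estimated skin tone darkens.
# All names and numbers here are illustrative, not from a real system.

def match_threshold(skin_tone: float,
                    base: float = 0.80,
                    max_boost: float = 0.15) -> float:
    """Similarity score required to declare a match.

    skin_tone: 0.0 (lightest) .. 1.0 (darkest), as estimated by some
    upstream model. The darker the tone, the higher the bar.
    """
    return base + max_boost * skin_tone

def is_match(similarity: float, skin_tone: float) -> bool:
    return similarity >= match_threshold(skin_tone)

# The same similarity score of 0.88 passes at a light tone (threshold
# 0.815) but fails at a dark one (threshold 0.935): fewer false
# positives on dark-skinned faces, at the cost of more false negatives.
print(is_match(0.88, skin_tone=0.1))  # True
print(is_match(0.88, skin_tone=0.9))  # False
```

The trade-off the comment describes shows up directly: every point the threshold rises converts some true matches into misses.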

[–] Chunk@lemmy.world 6 points 1 year ago

I'm interested in what dataset they're using, because simply adding more Black people to the training set seems like a pretty straightforward fix.
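The crudest version of that fix — naively oversampling the underrepresented group until the classes balance — is easy to sketch, though duplicated rows add no genuinely new faces, which is part of why imbalance is harder to fix than it looks. The arrays and labels below are made up for illustration:

```python
# Hypothetical sketch: naive random oversampling of minority classes.
# The data here is synthetic; real pipelines need genuinely new samples.
import numpy as np

rng = np.random.default_rng(0)

def oversample(X: np.ndarray, y: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Duplicate minority-class rows until every class matches the majority count."""
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    parts_X, parts_y = [], []
    for cls, count in zip(classes, counts):
        idx = np.flatnonzero(y == cls)
        # Draw (target - count) extra copies, with replacement.
        extra = rng.choice(idx, size=target - count, replace=True)
        keep = np.concatenate([idx, extra])
        parts_X.append(X[keep])
        parts_y.append(y[keep])
    return np.concatenate(parts_X), np.concatenate(parts_y)

# A 90/10 imbalance becomes 90/90 after oversampling.
X = rng.normal(size=(100, 8))
y = np.array([0] * 90 + [1] * 10)
Xb, yb = oversample(X, y)
print(np.bincount(yb))  # [90 90]
```

The balanced counts only hide the problem the thread is describing: the minority rows are repeats, so the model still sees the same narrow slice of faces.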

[–] Obsession@sh.itjust.works 16 points 1 year ago

Isn't face recognition just going to be inherently less reliable on darker-skinned people? Their features would certainly have less contrast on darker skin, no?

[–] Diplomjodler@feddit.de 15 points 1 year ago

Who could possibly ever have foreseen that.

[–] Nacktmull@lemm.ee 6 points 1 year ago

Fuck the police!

[–] Blackdoom@lemmy.world 6 points 1 year ago

If Black Then arrest