
Racist Facial Recognition Software

samzenpus posted more than 4 years ago | from the ebony-and-irony dept.

Programming 49

An anonymous reader writes "A black man found that his HP facial-tracking recognition software wouldn't work. Then he discovered it worked fine for a white co-worker. From the article: 'HP's Tony Welch thanked Desi and Wanda, the video's creators, and promised that he and the team at HP were looking into why the camera was behaving the way it was. "The technology we use is built on standard algorithms that measure the difference in intensity of contrast between the eyes and the upper cheek and nose," he said. "We believe that the camera might have difficulty 'seeing' contrast in conditions where there is insufficient foreground lighting."'"
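
For the curious, the check Welch describes boils down to comparing average pixel intensities between an eye region and the cheek/nose region below it. Here is a rough Python sketch of that idea; the region coordinates, threshold, and function names are invented for illustration and are not HP's actual code.

import numpy as np

# Sketch of the contrast test described above: compare the mean intensity of
# an eye region against the cheek/nose region beneath it. All coordinates and
# the threshold are hypothetical.
def eye_cheek_contrast(gray, eye_box, cheek_box):
    """gray: 2-D uint8 array; boxes are (y0, y1, x0, x1) tuples."""
    eye = gray[eye_box[0]:eye_box[1], eye_box[2]:eye_box[3]]
    cheek = gray[cheek_box[0]:cheek_box[1], cheek_box[2]:cheek_box[3]]
    return float(cheek.mean()) - float(eye.mean())

def looks_like_a_face(gray, eye_box, cheek_box, threshold=15.0):
    # With dark skin or weak foreground lighting, both means drop and the
    # difference shrinks, so a fixed threshold starts rejecting real faces.
    return eye_cheek_contrast(gray, eye_box, cheek_box) > threshold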


Black / white or male / female? (0)

Anonymous Coward | more than 4 years ago | (#30542598)

With the extensive testing performed on this camera (only two subjects), I wouldn't dare say for sure that it's because he is black. The two differ in more ways than skin color (e.g., sex).

Anyway, if I had to bet I would also bet on "blackness", but I'm just pointing out some flaws in their straightforward conclusion.

Contrast to speculation (1, Interesting)

Anonymous Coward | more than 4 years ago | (#30542822)

I do not think HP is outright racist, but his skin color does have an effect. His eyes, cheeks, and nose are of similar color, so the recognition software can't find any contrast between them, which makes triangulation impossible. Simply a bug in software design.

A possible solution would be a user-adjustable contrast ratio for the recognition points, though that would be hard to explain to people who prefer plug-and-play settings.
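
A minimal sketch of that suggestion, assuming a grayscale frame as a NumPy array; the slider scale and default are made up, not anything HP actually exposes.

import numpy as np

# Stretch contrast around mid-grey before the frame reaches the detector.
# slider = 1.0 leaves the image untouched; 2.0 doubles the contrast.
def boost_contrast(gray, slider=1.0):
    stretched = (gray.astype(np.float32) - 128.0) * slider + 128.0
    return np.clip(stretched, 0, 255).astype(np.uint8)

# frame = boost_contrast(frame, slider=user_setting)  # then run face tracking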

Re:Contrast to speculation (0, Offtopic)

thisnamestoolong (1584383) | more than 4 years ago | (#30543416)

This isn't even a bug in software design -- it's simple physics. Darker skin = less contrast = software has a harder time seeing him.

Re:Contrast to speculation (0)

Anonymous Coward | more than 4 years ago | (#30545088)

In related news, it also discriminates against people with camo facepaint.

Re:Contrast to speculation (0)

Anonymous Coward | more than 4 years ago | (#30547750)

This isn't even a bug in software design -- it's simple physics. Darker skin = less contrast = software has a harder time seeing him.

Uh. It's facial recognition software. If it can't recognize a face, that's clearly a bug. The fault is with (horribly) inadequate testing, not physics.

Re:Contrast to speculation (-1, Flamebait)

Anonymous Coward | more than 4 years ago | (#30550408)

You are an idiot. Get off the internet and make it a better place for the rest of us.

Re:Contrast to speculation (0)

Anonymous Coward | more than 3 years ago | (#30556504)

Uh, unless programming can magically improve the quality of the hardware and lighting conditions, uh, it's not. There is no control in this test. Perhaps if the subjects were front lit instead of rear lit, the camera would not distinguish her features as they would be "whited-out". Would that, then, be racist? The camera and software are not racist. Stop anthropomorphizing objects.

Re:Contrast to speculation (0)

Anonymous Coward | more than 4 years ago | (#30616294)

It's not a bug, it's a feature.

Re:Contrast to speculation (0)

Anonymous Coward | more than 4 years ago | (#30632546)

Well obviously physics is racist.

Re:Contrast to speculation (1)

lena_10326 (1100441) | more than 4 years ago | (#30632918)

This isn't even a bug in software design -- it's simple physics. Darker skin = less contrast = software has a harder time seeing him.

Watch the video again. Clearly there is MORE contrast between the background and the black guy than with the white woman, because the background in that video is the light-colored ceiling.

Re:Contrast to speculation (1)

dmizer (1081799) | more than 4 years ago | (#30654624)

Watch the video again. Clearly there is MORE contrast between the background and the black guy than with the white woman, because the background in that video is the light-colored ceiling.

It's not about contrast between the face and the background. It's about contrast within the face itself. The facial recognition feature is programmed to triangulate differences in contrast on the face, which is how the software tells the difference between a face and something else that just happens to be near the camera.

You can't create more contrast on a dark face, so unfortunately the software cannot distinguish a dark-skinned face from something else that happens to be near the camera.

Clearly (1)

geekoid (135745) | more than 4 years ago | (#30545390)

it just doesn't trust white people and needs to keep an eye on them~

It's also pretty clear that the creators of the software were white.

Don't be quick to complain (5, Funny)

mechsoph (716782) | more than 4 years ago | (#30546980)

When the robot wars come, then we'll see who's laughing...

Re:Don't be quick to complain (1)

GNUALMAFUERTE (697061) | more than 4 years ago | (#30548928)

Best. Comment. Ever.

Re:Don't be quick to complain (0)

Anonymous Coward | more than 4 years ago | (#30572472)

No. It's. Not.

Re:Don't be quick to complain (0)

Anonymous Coward | more than 4 years ago | (#30599934)

It was pretty good, though.

Re:Don't be quick to complain (0)

Anonymous Coward | more than 4 years ago | (#30631634)

Ahahahah. Like, "get embarrassed". Best comment ever.

Quick to judge (1)

diamondsandrain (1628327) | more than 4 years ago | (#30547738)

Notice how Slashdot frames the item in as inflammatory a way as possible... calling it racist. Way to be fair, Slashdot.

Re:Quick to judge (0)

Anonymous Coward | more than 4 years ago | (#30548458)

I think people are just having fun with it. No one seriously believes HP or the computer is racist simply because it has a problem detecting a dark-skinned face.

There is very little contrast in that situation, and the software probably can't see the facial features that tell it a face is present. They might be able to fix it by artificially cranking the contrast way up in software.

Sorry, that's just the nature of the problem. Low-contrast dark areas are always going to be hard to make out; it's hard for humans to do, too.
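
For what it's worth, "cranking the contrast way up in software" is roughly what adaptive histogram equalization does. A small OpenCV sketch, purely illustrative and not HP's pipeline:

import cv2

# Boost local contrast with CLAHE (contrast-limited adaptive histogram
# equalization) before handing the frame to a face detector. The parameters
# are common defaults, not tuned values.
def enhance_for_detection(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(gray)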

Re:Quick to judge (1)

GameboyRMH (1153867) | more than 4 years ago | (#30550460)

MSNBC article calling software "racist" in 3,2,1...

Re:Quick to judge (0)

Anonymous Coward | more than 4 years ago | (#30662094)

So what you're saying is "They all look alike anyway"?

Re:Quick to judge (0)

Anonymous Coward | more than 4 years ago | (#30549310)

Actually, that was the article. And even the original article put it in quotes because they were mocking the idea. Take a joke, kid.

Missed the Irony (0)

Anonymous Coward | more than 4 years ago | (#30568654)

No, this is funny and ironic. It isn't inflammatory. Most racism singles out "people of color", and this facial recognizer matches only white faces.

Re:Missed the Irony (0)

Anonymous Coward | more than 4 years ago | (#30624984)

This system is a stroke of genius: use it at a security checkpoint for your favorite gated community. Set it so that all detected faces will open the gate for one single person. If anyone complains about racism, blame insufficient technology and a constrained budget. Enjoy a peaceful neighborhood.

Re:Quick to judge (1)

blhack (921171) | more than 4 years ago | (#30601146)

Or it could be that they're quoting what the guy...you know...said in the video.

"I'm going on record...Hewlett Packard computers are racist, there I said it!"

Re:Quick to judge (1)

diamondsandrain (1628327) | more than 4 years ago | (#30601480)

I had heard about the item before Slashdot, so I was aware of the situation. The way it's phrased here, though, is a shock headline designed to get you to read it. Seems to be standard slimy practice; not unusual for Slashdot.

TV to the rescue (2, Informative)

Krishnoid (984597) | more than 4 years ago | (#30549290)

Luckily, a similar situation was addressed and resolved in a Better Off Ted [imdb.com] episode.

Viola-Jones (0)

Anonymous Coward | more than 4 years ago | (#30552600)

They're describing the Viola-Jones algorithm. Because it doesn't use colour (frames are converted to monochrome and normalised to account for different lighting conditions) it usually works well regardless of skin colour.
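
For reference, that detector is what OpenCV ships as its stock Haar cascade. A minimal Python sketch of running it on one webcam frame (the cascade file is the one bundled with opencv-python):

import cv2

# Viola-Jones face detection with OpenCV's bundled frontal-face Haar cascade.
# The frame is converted to grayscale and histogram-equalized first; the
# detector works on intensity patterns, not colour.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)               # default webcam
ok, frame = cap.read()
if ok:
    gray = cv2.equalizeHist(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cap.release()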

Re:Viola-Jones (2, Informative)

gravisan (1179923) | more than 4 years ago | (#30566620)

I have worked with Viola-Jones before; it is extremely robust to lighting conditions. What it is not robust to is angular change (twisting the face sideways). It is possible that HP is using some kind of naive algorithm to achieve face tracking. An easy example would be simple edge analysis for eye detection (easier if you have an infrared emitter, to exploit the red-eye effect), then extrapolating facial dimensions from that to facilitate tracking. It is also possible they use Viola-Jones for the initial stage, to locate the face region, and only then begin tracking, so even though the VJ detector is very good, the processing further along the chain isn't so robust. Part of the reason for doing this might be that the VJ detector is expensive in terms of compute cycles.
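
The detect-once-then-track-cheaply pipeline the parent guesses at might look something like this in OpenCV. This is pure speculation about HP's design, using plain template matching as the cheap per-frame step:

import cv2

# Run the expensive Viola-Jones detector only occasionally; in between,
# follow the face with cheap template matching. Speculative sketch, not
# HP's actual pipeline.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)
REDETECT_EVERY = 30                     # arbitrary choice
template, frame_no = None, 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if template is None or frame_no % REDETECT_EVERY == 0:
        faces = cascade.detectMultiScale(gray, 1.1, 5)
        if len(faces):
            x, y, w, h = faces[0]
            template = gray[y:y + h, x:x + w]          # costly step, done rarely
    else:
        res = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, (x, y) = cv2.minMaxLoc(res)
        if score > 0.5:                                # cheap per-frame tracking
            h, w = template.shape
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    frame_no += 1
    cv2.imshow("face tracking", frame)
    if cv2.waitKey(1) == 27:                           # Esc quits
        break
cap.release()
cv2.destroyAllWindows()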

Re:Viola-Jones (1)

Nova77 (613150) | more than 4 years ago | (#30671442)

VJ is expensive?? It's one of the cheapest face detection algorithms out there!
I implemented my own for my ML class in 2004, and it ran at 60 fps on a super-cheap laptop.

blame testing (0)

Anonymous Coward | more than 3 years ago | (#30554928)

Software developed and tested on the faces of pasty white software engineers.

Not racist at all - here's why (2, Interesting)

MindPrison (864299) | more than 3 years ago | (#30554990)

The recognition software looks for contrast spots, such as the shadow from the nose, the eye sockets, and ultimately the head shape; if these criteria aren't met, it has a problem.

It's the same with, e.g., Sony's PlayStation Eye on the PS3 and the new game EyePet: try that at home against a dark floor or a dark carpet and, no matter how much light you add, the pet won't react to you.

Same with me. I'm a white guy, and unfortunately my floor is sort of white colored (wood, flesh-like... if your skin is slightly yellow, don't worry, bad indoor light WILL look pale yellowish) ;)
Even with 200 watts of lighting I had problems with my EyePet and the PS camera; it's virtually useless and incredibly annoying.

Trust me buddy, there's no difference with HP's camera software: same stuff. Wait until Microsoft's Project Natal comes out, then you're going to be pleased. It will recognize ALL colors, because it has an extra infrared camera!

You're kind of funny, though ;) You have a talent for talking on TV. Not everybody has that gift; you do!

Also not racist for another reason (0)

Anonymous Coward | more than 4 years ago | (#30688300)

racism

1 : a belief that race is the primary determinant of human traits and capacities and that racial differences produce an inherent superiority of a particular race

I haven't asked the camera but I don't believe it has such a belief.

hm... (0)

Anonymous Coward | more than 3 years ago | (#30558786)

As I was watching this, I got a "Find an African Date" ad from YouTube... Something strange is going on here.

Not Racist, But... (1)

Trip6 (1184883) | more than 4 years ago | (#30562752)

...this was an "edge case" they obviously did not test for!

The only racism (1)

WindBourne (631190) | more than 4 years ago | (#30570748)

is the person who submitted this. It is DIFFICULT to get this kind of thing done correctly. Heck, America has spent BILLIONS since 9/11 trying to get facial recognition to work, and we're still nowhere close. To the original submitter: please keep your racism to yourself.

Oh, please (2, Insightful)

new death barbie (240326) | more than 4 years ago | (#30571862)

SOMEBODY developed this facial tracking software, and HP vetted it for installation on thousands or millions of computers.

Either this problem will come as a complete surprise to them, or they knew about it and released it anyway. Both alternatives are pretty upsetting.

Because either there were no test cases involving black people -- for an algorithm that depends on skin contrast, you'd think this would be a no-brainer -- or they knew there was a problem, but never expected black people to buy it.

Re:Oh, please (1)

tuxedobob (582913) | more than 4 years ago | (#30582228)

Or they had enough lighting on their black test subjects and it wasn't a problem.

Re:Oh, please (0)

Anonymous Coward | more than 4 years ago | (#30666308)

If you didn't notice, they were only using fluorescent lighting mounted what looks like 60 feet above them. Put a light in front of the guy and it should work. You are just jumping to conclusions and pointing out to everyone that you have racist tendencies. Besides, the guy in the video is obviously just having fun with it and thinks it is funny. He was being sarcastic, if you didn't notice.

Seriously? (1)

msgyrd (891916) | more than 4 years ago | (#30581958)

A project that doesn't get outsourced to India and look what happens.

What, no GNAA trolls? (0)

Anonymous Coward | more than 4 years ago | (#30607676)

You'd think this story would bring them out of the woodwork...

This is a good thing (1)

vvaduva (859950) | more than 4 years ago | (#30612332)

When Skynet does in fact become self-aware, this will be a good thing for the dude :)

PROFIT!!! (1)

pyromega (1582935) | more than 4 years ago | (#30620354)

1. Get an HP laptop.
2. Tell people the laptop tracks people's souls.
3. Claim you can get their souls back for a fee.
5. ???
6. PROFIT!!!

What? it doesn't even have a (0)

Anonymous Coward | more than 4 years ago | (#30627538)

Beer tap?

What kind of entertaining robot doesn't serve beer?
I mean, come on, this is 2010.

Re:What? it doesn't even have a (1)

scott666 (1008567) | more than 4 years ago | (#30688534)

Since when does all-in-one PC = robot?

Glare? (1)

ins0m (584887) | more than 4 years ago | (#30640554)

Not to be offensive, but how greasy is this guy's face? You'd think there would be *more* contrast between his skin and the whites of his eyes and teeth, which would make a general "face" easier to pick out (even if it were a Cheshire-cat caricature, it's a baseline). However, there's a noticeable glare off his forehead that I'd imagine would skew the results, since it takes a significant chunk out of his otherwise-round head, to the point it wouldn't even hit on a fuzzy match.

My suggestion is to get this guy some Noxzema and try again.

Re:Glare? (0)

Anonymous Coward | more than 4 years ago | (#30677032)

Wow, that's not offensive, that's old-fashioned rude.

lmao (0)

Anonymous Coward | more than 4 years ago | (#30653344)

lmao
