
Face-Scanning Loses by a Nose in Palm Beach

jamie posted more than 12 years ago | from the prima-facie dept.

Technology 232

Rio writes: "A story from myCFnow.com reports that Palm Beach International Airport officials said face-scanning technology will not become part of their airport's security system." Looks like the ACLU was right. Checking a database of 15 employees, the technology gave false-negatives -- failed to recognize the test subjects -- over 50% of the time. A spokesperson said, "There's room for improvement." The Pentagon said the same thing in February. The false-positive rate is more important -- it isn't mentioned, but even if it were just 0.1%, Bruce Schneier argues, it'd be useless.
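Schneier's objection is a base-rate argument: scale even a tiny false-positive rate by the number of innocent travelers and the alarms swamp any real matches. A rough sketch of that arithmetic in Python, using assumed figures (annual traffic, number of genuine suspects) rather than anything reported in the article:

# Base-rate sketch behind Schneier's argument; every figure here is an
# assumption for illustration, not a number from the article.
passengers_per_year = 30_000_000   # assumed traffic through a large airport
false_positive_rate = 0.001        # the 0.1% rate discussed in the summary
true_suspects = 100                # assumed real suspects passing through per year
detection_rate = 0.5               # Palm Beach test missed over half its subjects

false_alarms = passengers_per_year * false_positive_rate   # 30,000 per year
true_alarms = true_suspects * detection_rate                # 50 per year
print(f"one real match per {false_alarms / true_alarms:.0f} alarms")  # ~1 in 600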


first sunday night post (-1)

Anonymous Cowrad (571322) | more than 12 years ago | (#3589462)

hey gang!

love you much

Re:first sunday night post (-1, Troll)

Anonymous Coward | more than 12 years ago | (#3589473)

i think you mean monday morning. asshole.

Re:first sunday night post (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3589477)

that's the fastest troll mod i've ever seen! GOOD JOB!

Re:first sunday night post (-1)

Anonymous Cowrad (571322) | more than 12 years ago | (#3589519)

PST

Re:first sunday night post (0)

Anonymous Coward | more than 12 years ago | (#3589524)

no such thing.

Re:first sunday night post (0)

Anonymous Coward | more than 12 years ago | (#3589539)

Ha! You'll be telling me next that tomorrow (!Monday) is a national holiday and I don't need to go into work.

Re:first sunday night post (0)

Anonymous Coward | more than 12 years ago | (#3589551)

are you serious? it's tomorrow/today (monday)? i've honestly been thinking it was today the whole time.

shit (0)

Anonymous Coward | more than 12 years ago | (#3589552)

shit it's a holiday. and i didn't get piss drunk tonight even though i could have. aww shit. but remember **tips glass to slashdot trolls*** its never too late

Memorial Day? (1)

tdelaney (458893) | more than 12 years ago | (#3589637)

Isn't Memorial Day supposed to be your day of remembrance for the war dead, etc. (like Anzac Day in Australia)?

So what the hell makes it a big party/movie weekend? If that's all it is to most people, I suggest you scrap it. If all you see it as is a holiday then it shouldn't be.

US citizens are always going on about how they "fought for their freedom" etc but it sure doesn't look like you respect those who actually *did* the fighting.

To put this on-topic ... both false positives and false negatives are bad. In both cases, the operators will nearly always assume that "the machine is right". Training can eliminate some of the problems with false positives (you must always do a human recognition check and the computer is merely narrowing down the possibilities) but false negatives are worse than useless as operators won't even think to check anyone who doesn't set off alarms in the computer.

i like marshmallows (-1)

Anonymous Cowrad (571322) | more than 12 years ago | (#3589558)


Re:i like marshmallows (-1)

Mao Zedong (467890) | more than 12 years ago | (#3589578)

Alton Brown is my father, and I have nothing but the foremost laudation for him, and the rampaging he gave my virgin sphincter twice a day during my 13th birthday.

correction for da niggas (0)

Anonymous Coward | more than 12 years ago | (#3589560)

r ya sure u dont mean "no such thang"?

watched battlefield earth (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3589573)

and it was a great film to me. i dont understand why so many hated it. can someone, preferably many people, explain it to me.

slashdotters dont need to worry (-1, Troll)

Anonymous Coward | more than 12 years ago | (#3589465)

unless they are criminals and have something to hide.

mjl

Re:slashdotters dont need to worry (1)

MisterBlister (539957) | more than 12 years ago | (#3589557)

unless they are criminals and have something to hide.

This means you Randall "merlyn" Schwartz.

Re:slashdotters dont need to worry (2, Informative)

God Takeru (409424) | more than 12 years ago | (#3589577)

I don't know if you're being sarcastic or not, but if you're serious, you're wrong.

Right now, in your own eyes, you are not a criminal. But what keeps you that way? What if the government decides something you do is a criminal offense? Perhaps they'll decide that Slashdot, as a part of Hax0r culture (I wouldn't call it that, but the people in power in this country are stupid enough to do so), must be outlawed, and its users are all 'terrorists.' Of course, fifty years ago we'd all be 'communists,' but times change, and so does the way you make the idea of a subversive sound like the enemy.

You see, anything is potentially a crime. Leaving my house, attending class, writing papers, playing water polo, jacking off ten hours a day- these are things that take up most of my time. The fact is, no one is to say that these are not crimes. If using drugs is a crime, if someone who feeds a non-violent subversive activist is a 'terrorist' now, any of these activities could become criminal.

In the majority of the United States, it is still legal to fire someone for quite simply being gay. There is no amendment to protect from this, there is no federal law. And it will be this way for a long time, most likely. In fact, some of the anti-discrimination laws that keep this from being true everywhere are being repealed. What's to say that you aren't a criminal in such an unjust nation?

We are not the land of the free, don't buy that. You aren't safe. Unless you work for the government in a high ranking office (as in you were either elected or appointed), or have a LOT of money, you can be screwed at any time.

Slashdotters need to worry. Fight surveillance! Fight for your freedom, no matter the cost.

Re:slashdotters dont need to worry (1)

Troller Durden (253387) | more than 12 years ago | (#3589650)

It's interesting how you handwave a desire that there be laws to prevent behavior you dislike (firing someone for being gay) into a rant about how the government is bad for making many things illegal.

Having the government force people to conform to your (and my) opinion that gay people are not bad is just another example of the denial of personal freedom. If I start a business with my own money, then why should the government make special cases about who I can and cannot fire at will?

Re:slashdotters dont need to worry (1)

ObviousGuy (578567) | more than 12 years ago | (#3589661)

Here's what Jill Nelson [msnbc.com] thinks.

Re:slashdotters dont need to worry (1)

God Takeru (409424) | more than 12 years ago | (#3589691)

I used the example because it is something many slashdotters would agree with me on (as Slashdotters, in my experience, tend to be relatively liberal). First rule of the argument: know your audience. Still, it is true that I should have chosen something enforced by law and not something NOT enforced by law. Sodomy laws would be a far better example, and looking back I wish I had used that, instead.

As for the issues of reasons for firing people, this is rooted not in gay rights essentially but rather in the rights of the employee. Because the employer is given to a position of power, controls must be placed to prevent his abuse of the workforce. Now, in true capitalist society, that isn't theoretically the case...but this is getting so off topic, it doesn't really matter. As for my point of debate, you're right.

Holy shit you jack off 10 hours a day? (-1, Troll)

Anonymous Coward | more than 12 years ago | (#3589660)

You must have a huge arm? How do you compensate to make sure you don't look deformed? Just wondering because I jerk off at least 1 to 2 hours a day and my right arm is getting huge.

For chrissakes... (-1, Offtopic)

ObviousGuy (578567) | more than 12 years ago | (#3589668)

Alternate arms. Use lube. Loosen your grip.

Better yet, get a girl to do the hand work for you.

Re:Holy shit you jack off 10 hours a day? (-1, Offtopic)

God Takeru (409424) | more than 12 years ago | (#3589698)

No, actually, my left arm isn't any bigger. Yeah, it's a bit of an exaggeration, but I do masturbate more than the common man (4 hours is a common day). And don't you think it would be awful if the government outlawed this traditional practice of my indigenous peoples (non-dating geeks, that is)?

That explains alot... (4, Funny)

chriso11 (254041) | more than 12 years ago | (#3589478)

Perhaps this is why I can't remember anyone's name - half the people look the same

only 15 employees? (2, Insightful)

Atrax (249401) | more than 12 years ago | (#3589479)

a bit of a small sample, don't you agree? and how was it composed...?

Re:only 15 employees? (4, Informative)

Triskaidekaphobia (580254) | more than 12 years ago | (#3589487)

If it is a small sample then a high false negative rate is even worse.

If it can't identify 1 of 15, then what chance has it got of finding 1 person out of millions?

Re:only 15 employees? (1)

Atrax (249401) | more than 12 years ago | (#3589503)

that's exactly what I mean. pretty poor showing, all in all. A sample of 15 doesn't cut it at all.

The Future of Open Source and Free Software (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3589481)

What is the future of Open Source and Free Software? I asked myself that very question because Open Source and Free Software is the most revolutionary movement today. It is changing everything.

To answer this question I went to slashdot.org, this website, since slashdot is known across the internet when it comes to Open Source and Free Software. I read the stories and the comments. I read the comments at the -1 threshold because I wanted to read real people's opinions and knowledge on Open Source and Free Software, not the sanitized version that gets modded up. What I found was very interesting.

So, what is the future of Open Source and Free Software? Wideness. That's right. Wideness. This concept of wideness is so powerful that it is invading domains beyond computer software. Take HDTV for instance. It is widescreen compared to normal tvs. TV is becoming wider.

The best examples of wideness are from slashdot of course. First, page widening posts. Slashdot pages weren't wide enough so they have to be widened manually. In the future slashdot pages will be wider.

I also found links to the goatse.cx website [goatse.cx] . Again, another example of wideness, namely a wide open anus. People who use Open Source and Free Software aren't boring heterosexuals. They are homosexuals, bisexuals, etc. As the goatse.cx website shows wideness is being added to the sexuality of Open Source and Free Software users.

Like me, you are probably excited about this wide future. The following email shows the future is closer than you think.

From: "Larry Augustin"
To: "Company - all"
Subject: Acquisition of latest OSDN holding

As you may be aware, our stock certificates are now unfit to even wipe our own asses in the restrooms. However, soon this will all change with our latest opensource acquisition. This is such a revolutionary paradigm shift that we have decided to coin a new term for it: "WideOpensource". The following letter was recently sent to the management of our prospect:

From: "Larry Augustin"
To: contrib@goatse.cx
Subject: Open source business opportunity

Dear Sir,

We at OSDN are continually looking to expand our growing network of opensource-related web real estate. Through intense analysis of comment traffic on our premier site, SlashDot.org, we have determined that your site holds considerable value to the community at large. As recent IDC surveys have shown, your site is one of the 10 most popular on the Internet. That, combined with its decidedly opensource bent, makes it a prime target for OSDN banner ads, our flagship product. We would like to acquire your site and employ you as a member of our OSDN team. Please consider this carefully, you aren't likely to see an opportunity like this every day!

Love,
Larry

Try telling the Aussies that. (4, Informative)

serps (517783) | more than 12 years ago | (#3589482)

Airport face identification isn't practical? Try telling the Australian Government that. [news.com.au] They are trialling a hybrid face-recognition/biometric passport system that sends shivers up my spine.

what does this do (4, Interesting)

vectus (193351) | more than 12 years ago | (#3589485)

but delay its deployment for a couple years? this isn't really a victory at all.. I mean, I bet this will only delay the technology two years.. maybe less.

If anything, it should be a call for all Americans to protest this kind of thing (should you disagree with it).

oh, but i want funding! (1)

australopithecus (215774) | more than 12 years ago | (#3589488)

i think that we should make all pregnancy tests have a 50% false positive rate also. that would be almost as fun as getting thrown into a cell and being denied legal counsel for a few days.

Re:oh, but i want funding! (0)

Anonymous Coward | more than 12 years ago | (#3589540)

  1. The only people being thrown into jail and "denied legal counsel" are camel jockeys illegal immigrants, many of whom harbor a severe hatred towards the US and would fly an airplane into the White House in the belief they will go to heaven and receive 72 virgins.
  2. Isn't it funny how Tom Dasshole, etc, are complaining that George Bush didn't prevent the Sept. 11 attack from happening. Never mind that Bill Clinton had received the same warnings that Osama Bin Laden had a hard on for the US. Never mind that George Bush had papers on his desk, ready to sign, to try and whack him. Never mind that the FBI refused to investigate Arabs training at US flight schools because they feared the liberals would rake them over the coals for "ethnic profiling".

Now shut the fuck up.

Re:oh, but i want funding! (1)

australopithecus (215774) | more than 12 years ago | (#3589591)

wow...i was just trying to make a joke about pregnancy tests. but i mean, hell, if you want to make a schizz about it, by all means, pick on me.

Re:oh, but i want funding! (1)

Triskaidekaphobia (580254) | more than 12 years ago | (#3589619)

You won't be joking when it turns pink [panix.com]

Human oversight (2, Informative)

enjo13 (444114) | more than 12 years ago | (#3589491)

I think a 0.01% false positive rate would be perfectly acceptable. I have not seen one proposal for a face scanning system that has not also included human oversight.

It's exactly the system that casinos have successfully deployed to keep known "cheaters" out. The face scanning technology merely provides POSSIBLE matches; the actual decision on further investigation rests with a human operator...

This seems perfectly reasonable to me from a technology standpoint, I'll argue the ethics of this technology some other time:)

Re:Human oversight (0)

Anonymous Coward | more than 12 years ago | (#3589545)

I basically agree with you. A 0.01% false-positive is fairly decent for a "closed" environment such as a mall, bus depot, airport, amusement park, etc... That would be the highest false-positive rate I'd accept though. Even 0.01% in an airport would probably bring up between 0 and 10 false-positives a day -- depending on airport size (up to 100,000 people per day for 10). On public streets 0.01% would be kinda high in places like NY and LA. If they deployed a full camera system like in Britain, then those cities would probably have up to 75-100 false-positives per day. That is actually a lot of false-positives for the police to check. Over the course of a year, that means that 27,000 to 36,000 false positives would occur. One thing I wonder is what the false-negative ratio is? Perhaps I've got Slashdot blindness, but I couldn't find it in the articles linked to from the main story.

Re:Human oversight (0)

Anonymous Coward | more than 12 years ago | (#3589556)

I couldn't find it

It's even in the summary.

From the article

The airport tested the system to see if it could detect faces in a database of 15 employees. The system looks for 80 facial features and a match occurs when 14 features are the same.


Officials said the test results were mixed. Less than half of test subjects were detected when they should have been.
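Taken at face value, that 14-of-80 rule is just a counting threshold. A toy sketch of such a rule in Python (the real Visionics matcher is proprietary and certainly more involved; the feature values and tolerance below are assumptions):

import random

# Toy N-of-M matcher, *not* the actual Visionics algorithm.
NUM_FEATURES = 80     # features measured, per the article
MATCH_THRESHOLD = 14  # features that must agree to declare a "match"

def is_match(probe, reference, tolerance=0.05):
    # Count features that agree within a tolerance; match at 14 or more.
    agreeing = sum(1 for p, r in zip(probe, reference) if abs(p - r) <= tolerance)
    return agreeing >= MATCH_THRESHOLD

reference = [random.random() for _ in range(NUM_FEATURES)]   # enrolled employee
probe = [f + random.gauss(0, 0.02) for f in reference]        # noisy re-capture of same face
stranger = [random.random() for _ in range(NUM_FEATURES)]     # unrelated face
print(is_match(probe, reference), is_match(stranger, reference))  # typically True False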

Rise in number of cavity searches in Holland, MI (-1, Troll)

Anonymous Coward | more than 12 years ago | (#3589492)

Wednesday 22 May, 2002

HOLLAND, MI - Authorities are investigating the ever increasing incidents of cavity searches at Township Airport, which is situated 4 miles northwest of the town of Holland. Official records show that there has been over a 500% increase in the number of full cavity searches conducted by its officials between 1994 and 2002. The number of searches being conducted has steadily risen each year, and now authorities are becoming concerned that there is a problem.

A security officer within the airport revealed that the majority of these searches are being conducted on a very small group of regular travellers. The issue was first raised by an administrative worker, who noticed the spending on lubricant and latex gloves and raised the alarm with senior management.

One source within the airport told us that the cavity searches were even being conducted at the request of the passengers themselves. On one occasion, a passenger even removed his trousers prior to leaving the aircraft in anticipation of the check.

A female passenger lodged a complaint with airport staff after finding a man inserting aniseed sweets up his rectum outside a phone kiosk. The man, known only as Katz, has been spotted on several occasions, walking back and forth past the guard dogs and generally looking suspicious. Experts examining the case think that this behaviour is consistent with attempting to induce the need for a rectal examination.

In March, Robert Malda was arrested by Michigan State Police for sexually harassing security staff at the airport. Stephen Young, a guard at the airport, filed a complaint after Malda cornered him by an X-Ray machine. Malda was dragged away from the airport screaming "Stevie's pinkie is bigger than CowboyNeal's fist!"

Airport officials say the investigation is still underway, but have hinted that the majority of these problems seem to be caused by men carrying laptop computers and wearing t-shirts emblazoned with a penguin.

This AC unwittingly presents a good point (1)

littlerubberfeet (453565) | more than 12 years ago | (#3589514)

Although the above post is a troll, it exemplifies the problem with false positives and negatives. If someone starts reading, they might believe out of hand that it is real, until the last two paragraphs.
The recognition required to notice that this article is false involved a mere 5% (approx.) of the article. That is the same issue being faced by the developers of the technology: humans have similar faces. How does one draw on the 5% that is different?

useless (-1, Troll)

Anonymous Coward | more than 12 years ago | (#3589494)

it'd be useless.

Much like Jaime's penis.

Re:useless (1)

littlerubberfeet (453565) | more than 12 years ago | (#3589523)

yes, the technology is useless, and can be made so. A skilled makeup artist can definitely fool the computers.

Jaime's Penis? jaime Gumb? as in Silence of the Lambs?

Re:useless (-1, Troll)

Metrollica (552191) | more than 12 years ago | (#3589605)



Jaime's penis does serve one purpose. It's the sole giver of all his exercise. That is if you don't count clicking a mouse and typing on a keyboard... or taking it in the ass from Jon Katz.

good idea...now extend this (3, Insightful)

I Want GNU! (556631) | more than 12 years ago | (#3589500)

Not using faulty technology is a great idea! Now all they need to do is repeal the law taking away school or library aid if they don't use filter technology, since the filters don't have open lists of sites and often block sites they shouldn't and don't block sites they should!

Re:good idea...now extend this (1)

Phroggy (441) | more than 12 years ago | (#3589669)

I'm quite pleased that the Multnomah County library system has been fighting this - they offer filtering software, but it's optional, because they realize that while it blocks some legitimate material and fails to block some porn, it also does block a lot of stuff that people don't intend to be looking at.

(I don't live in Multnomah County, but do pass through it every day on my way from home in Clackamas County to work in Washington County. Yes, my commute sucks.)

More info here. [epic.org]

False positive rate (3, Interesting)

Triskaidekaphobia (580254) | more than 12 years ago | (#3589502)

A similar system in Florida [nando.net] (not an airport, but probably a vaguely-similar number of people) had 14 false positives in the first 4 days of operation.
(Two of the false positives even got the sex of the suspect wrong)

Since they state that it was the first days, perhaps it just needed tuning?

It's 0.1%, not 0.01% (2, Insightful)

Papineau (527159) | more than 12 years ago | (#3589504)

Bruce talks about 99.9%, so there's 0.1% left, not 0.01% as the story says right now.

If a person is mistaken once for a terrorist (or a "normal" criminal), don't you think other recognition points will make the same mistake? What do you do then? Plan a few extra hours each time you take the plane? Get an official letter stating "I'm not a terrorist"? If a simple letter can get you through, terrorists will get some.

Re:It's 0.1%, not 0.01% (2)

khym (117618) | more than 12 years ago | (#3589601)

If a person is mistaken once for a terrorist (or a "normal" criminal), don't you think other recognition points will make the same mistake? What do you do then? Plan a few extra hours each time you take the plane?
I presume that the system would also have a photo of the suspect, so that a human could compare the photo to the person the system flagged as a possible terrorist. If you happen to look almost exactly like a person in the database, so that even humans mistake you for him/her, then it'll be as if you looked almost exactly like someone on the FBI's "most wanted list"; it sucks, but, well, what can you do?

Alan Cox's wife (-1, Troll)

Anonymous Coward | more than 12 years ago | (#3589509)

Did he meet her on the short bus or what? How could anybody fuck such a hideous creature.

Re:Alan Cox's wife (0)

Anonymous Coward | more than 12 years ago | (#3589528)

Alan is hardly a Casanova himself, is he now!

Re:Alan Cox's wife (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3589581)

that was his point. How could anyone fuck Alan cox?

Re:Alan Cox's wife (-1, Offtopic)

Metrollica (552191) | more than 12 years ago | (#3589656)

Dog muzzle and paper bag over the head?

Broken promise ring (2)

Graymalkin (13732) | more than 12 years ago | (#3589512)

What is so retarded about these supposedly security-engendering technologies is that they can only catch someone (if they work at all) if they are in the database. This stops absolutely zero sleepers from committing some act of terrorism, which is exactly what the terrorists in September were. The only way they would have possibly been prevented from boarding those planes was if there was some ultralarge database that collected all the information from all possible channels and picked them out of the crowd for having expired student visas. Even then it isn't terribly likely they would have been prevented from boarding the planes; they're paying customers who will get something in the mail from the INS warning them their visas are expired.

Re:Broken promise ring (0)

Anonymous Coward | more than 12 years ago | (#3589518)

How do you propose to prevent a sleeper from committing an act of terrorism?

Re:Broken promise ring (3, Insightful)

larry bagina (561269) | more than 12 years ago | (#3589607)

The september 11 terrorists weren't "sleepers"

Atta (the scary looking ringleader) had previously been arrested in Israel for being a terrorist. He was released as part of Bill Clinton's mideast "peace" initiative, but was still on various US gov't lists of terrorists.

If the INS wasn't totally useless, if the FBI, FTC etc. shared information, they would have been deported when they were caught being here illegally, driving with an expired license, failing to show up for court, or buying airline tickets.

Tom Daschle and the democrats want to blame George Bush because the FBI and CIA, in hindsight, had the information to see this coming.

The real tragedy is that they, and thousands of others, were here illegally, and we did nothing.

It does ONE thing right (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#3589515)

It wouldn't have any trouble recognizing Jon Katz's ugly face or his bad book/movie reviews. Any decent /. reader can identify those in a second!

Unpopular View (5, Insightful)

deebaine (218719) | more than 12 years ago | (#3589517)

I don't necessarily understand the objections to face scanning technology. To be sure, I don't want computers flagging people to be arrested. But computers sift through enormous amounts of information, making them ideal for a first pass. If they are used to flag people to be scrutinized by humans, I don't have any objections. In fact, if a computer can flag 20 of the hundreds of thousands of faces so that human experts can give a closer look, so much the better.

Incidentally, by this reasoning, it is in fact the false negatives that are more important. False positives can presumably be discarded by humans providing closer scrutiny. False negatives in this scenario, however, present a major difficulty.

Face scanning technology isn't innately evil. Like everything else, if we use it wisely, it can help. If we use it irresponsibly, it can hurt. No surprises there.

-db

Re:Unpopular View (3, Insightful)

h0rus (451357) | more than 12 years ago | (#3589549)

Yes, and the reason for this tension is we expect these methods to be misused.

Re:Unpopular View (5, Insightful)

achurch (201270) | more than 12 years ago | (#3589618)

I don't necessarily understand the objections to face scanning technology. [...] Like everything else, if we use it wisely, it can help. If we use it irresponsibly, it can hurt.

You just hit the nail on the head there; most people who don't like this technology don't like it because (they believe) it will be used irresponsibly, eventually if not immediately. Power corrupts, as the old saying goes, and people are unfortunately easily corruptible. Ordinarily I wouldn't be quite so pessimistic, but given all the hoopla over the "War on Terrorism", I'm inclined to side with the Slashdot popular view.

(Note to moderators: Yes, I do realize that there are many points of view represented on Slashdot, thankyouverymuch.)

Re:Unpopular View (1)

joshki (152061) | more than 12 years ago | (#3589653)

You're absolutely correct when you say that the technology isn't innately evil. Technology is never evil -- it's neutral. It's how we use it that determines its value.
The main reason I don't like facial scanning is quite simple. I view it as a slippery slope -- we start scanning for a few "bad guys" now, and what happens a few years down the road when it becomes feasible to scan everyone to make sure they're not doing something "wrong"? If we give our government the power to watch us all the time, we've given up the ability that was guaranteed to us in the Constitution to think and speak freely. If you've never read 1984 you really need to. The descriptions of the lengths that the man in the book went to in order to avoid being observed will drive you nuts -- and make you really think about where this is going. Orwell was off by a few years -- but it wouldn't surprise me if it turns out he was only wrong by about 20-25 years.

Re:Unpopular View (4, Insightful)

TheCage (309525) | more than 12 years ago | (#3589662)

But what about the fact that it doesn't solve the problem? This is presumably to stop terrorism, yet a significant number of terrorists are not part of any database, which is required for something like this to work. Seems just like a waste of money to me.

Re:Unpopular View (0)

Anonymous Coward | more than 12 years ago | (#3589715)

False positives can presumably be discarded by humans providing closer scrutiny.

It'll just be another reason for the cops to stop and search the ones they want to stop and search.

False positives okay (2, Insightful)

ObviousGuy (578567) | more than 12 years ago | (#3589525)

False positives are fine, though a failure rate of 50% is clearly way too high. False positives mean that in those cases a suspect was actually identified correctly.

It is the false negatives that are truly scary. If a known terrorist sympathizer can board a plane without setting off any signals then it is clearly a useless product.

Luckily, humans have the ability to fuzzily predict terrorist-like behavior (now that everyone's on high alert, that is).

Re:False positives okay (3, Insightful)

ivan256 (17499) | more than 12 years ago | (#3589543)

Let's see if you think a false positive is ok when the guy with the rubber gloves is up your ass to the elbow looking for explosives.

False positives are as bad as, if not worse than, false negatives.

Re:False positives okay (1)

ObviousGuy (578567) | more than 12 years ago | (#3589567)

Would you feel better with the explosives container found lodged in your ass at the crash site?

Re:False positives okay (1)

ivan256 (17499) | more than 12 years ago | (#3589667)

Considering the low chance of hijacking, I'll pass on the 1 in 1000 chance of anal intrusion, thank you.

Maybe someone should tell these guys... (0)

Anonymous Coward | more than 12 years ago | (#3589526)

Maybe someone should tell these guys [washingtonpost.com] it doesn't work, as they seem to have so much hope in such a useless thing. Two interesting quotes:

I think it's great. It's a good safety precaution that is definitely necessary

and the better of the two:

I've got nothing to hide, and neither should anyone else

Either the reporters looked real hard to find these comments or (fill in the blank with anything)

I demand a recount! (0)

Anonymous Coward | more than 12 years ago | (#3589544)

Face recognition won the popular vote. Let's recount only a few ballots from face recognition-friendly districts, change the counting standards mid-stream (multiple times), and whine and cry until we get our way. Remember, the Florida Supreme Court ruled that confusion between may and shall == cannot.

statue of liberty (0)

Anonymous Coward | more than 12 years ago | (#3589550)

It's funny, after reading this article, that face scanning was just added to the statue of liberty!

Cameras to Seek Faces of Terror in Visitors to the Statue of Liberty [majcher.com]

(NY Times article)

I'm crying (-1)

Mao Zedong (467890) | more than 12 years ago | (#3589563)

These [mycomputer.com] precocious shvoogies suscitate tears in my eyes.

False Positives are OK (1, Interesting)

MikeD83 (529104) | more than 12 years ago | (#3589564)

How many times have you gone to a store and bought an item with an electronic anti-theft tag and not had it removed properly, only to be stopped once you begin to exit? A loud alarm goes off, and everyone in the front of the store looks at you wondering... is that a thief? Extremely embarrassing. False positives happen all the time. As long as they are dealt with in a timely manner it is still OK; and already deemed acceptable by MOST of society.

Re:False Positives are OK (2, Insightful)

Triskaidekaphobia (580254) | more than 12 years ago | (#3589572)

Clerks in bookshops don't have machine guns and don't have the authority to arrest and strip search you.

Given the choice of a false positive in a bookshop and one at the airport I know which I would want to avoid.

Re:False Positives are OK (4, Interesting)

Phroggy (441) | more than 12 years ago | (#3589718)

Actually, true story: I was at Fred Meyer's a few weeks ago (for those not fortunate enough to live in the Northwest, they sell pretty much everything, at decent quality and decent prices). In addition to my groceries, I'd picked up a pair of khaki pants. They've now got those self-checkout scanner things, in addition to the regular checkout lines, so I decided I'd try it. I didn't do so well. Anyway, in particular, I hadn't noticed that the pants had a security tag on them, and I neglected to remove it. I'm not sure how I would have removed it anyway, but the really large man keeping an eye on the self-checkout lines would surely have taken care of it.

So I cram the pants and half my groceries into my backpack, the other half in plastic bags. I leave. The alarm goes off. It occurs to me that the pants must have a security tag that I didn't remove. I glance around, and nobody even looks my direction. I proceed to leave the building.

Then I remember that I've forgotten to buy a bus pass. I go back in. The alarm goes off. I head over to the customer service counter, and shell out $56 for a little card that will enable me to get to/from work for the next month. I leave again, and the alarm goes off. I wait a few minutes for the bus, and go home.

I completely forget about the security tag until I'm wearing the pants and am on my way to catch the bus to work. I've gotten about a block when I hear a noise as I'm walking. Sure enough, there it is. I run home, try unsuccessfully to get it off, give up, change pants, and run to catch the bus. I arrive at work 15 minutes late. When I get home I finish mutilating the tag. Tough little buggers.

So anyway, the moral of the story is that those little tags are absolutely worthless if store security is asleep at the wheel.

Re:False Positives are OK (1)

Triskaidekaphobia (580254) | more than 12 years ago | (#3589768)

You were lucky; some of those tags can blow your hands off (or something [sensormatic.com] ) if you try to remove them yourself.

Re:False Positives are OK (0)

Anonymous Coward | more than 12 years ago | (#3589753)

Everyone looks at you? Not where I'm from. Around here, the false positives produced by those systems happen so often that they get about as much reaction as a car alarm. Nobody even looks up.

I was in a Best Buy store during a very busy period recently and there were alarms going off everywhere. The small product rack alarms were going off every 5 minutes or so. Nobody ran over to see what was happening. The front door alarm was going off even more often. Heck, it went off when I went through. Even though I'd blown off their [illegal] exit check and just walked past security, nobody even looked at me when the alarm went off.

THAT is the problem with false positives. Eventually, nobody will pay attention to the alarms and a true positive will be ignored.

If we want to make this technology work... (2, Insightful)

Indras (515472) | more than 12 years ago | (#3589565)

we need to take a minute to figure out why it doesn't work. Or maybe, instead of that, look at recognition that does work.

I, for one, am pretty much 99.99% correct when it comes to making positive recognition of those people around me that I see often. People I haven't seen in a few years, I have more trouble identifying. Why? Because people's faces change. Facial hair, glasses (or removal of them), makeup, etc. can throw a lot of people off. Can this technology compensate for that?

I personally think that these cameras need to look at people the way we do, with two eyes. What do we get when we look at the world with two eyes? Depth perception. We can see objects in three dimensions, because we see them from two angles at once. If facial recognition computers were able to take in two separate data streams, like two cameras a foot apart, it would be possible to create a three-dimensional image of that person's face. And though it would require more computing power, it is much easier to make a positive match using three-dimensional data as opposed to two. Ever seen a perfect frontal view photograph of a person's face? Can you tell how long their nose is when you're looking at it? Isn't the length of a person's nose a significant facial feature? (Oh, and I know, if you see a person from the side, you see that, but these cameras are always only getting one angle, so they're always throwing out a lot of data. If you see a person's face from the side, you are not seeing how wide their face is, and so on.)

Re:If we want to make this technology work... (0)

Metrollica (552191) | more than 12 years ago | (#3589672)



If you're thinking if going that far just make mandatory DNA testing at the gates.

99.99% accurate?? (3, Insightful)

FaithAndReason (112179) | more than 12 years ago | (#3589758)

I suspect that your own accuracy rate is not nearly as high as you believe it is.

First, as you state, that 99.99% accuracy rate only applies to a group of people you meet regularly; this probably includes perhaps a few hundred people, and a significant part of your total memory and processing capability is devoted to recognizing and updating your memory of those faces (check out a brian map for how much of our cortex is dedicated to face recognition.) Even duplicating that feat (i.e. identifying a small group of faces) would be a major undertaking for a computer system.

Second, that 99.99% isn't nearly as impressive as it sounds, because it represents the positive rate, i.e. the chance that you will correctly identify an individual in the target population. That corresponds to a false negative rate of 0.01% -- you're saying that once in ten thousand times, you'll actually fail to recognize somebody you see on a regular basis. Not too encouraging, that.

Third, that figure says absolutely nothing about the false positive rate, which I suspect is much higher. In other words, how often do you see somebody that you think you recognize, but can't quite remember exactly? From my own experience, I would say that number is as high as one in a hundred. Our own built-in face recognition system is simply designed that way -- to generate a large number of "near misses".

So, the bottom line is: even the supposedly high accuracy of human facial recognition isn't accurate enough, and undoubtedly doesn't scale very well.

Re:99.99% accurate?? (2, Funny)

Anonymous Coward | more than 12 years ago | (#3589775)

check out a brian map for how much of our cortex is dedicated to face recognition

How much is used for transposing of letters? :)

Re: brain = abstraction (2)

fferreres (525414) | more than 12 years ago | (#3589762)

Yes, stereo imaging and depth are needed. But when you look at a person the brain stores a "pattern" of how to recognize this guy again. It discards a shit load of unneeded information.

If you don't believe me, try to draw a portrait of a close friend with pencil and paper. You'll find out you can't, or that it doesn't correspond to the real look. It's NOT that you can't draw (you can perfectly copy it if you have a B&W photograph). The thing is that you really abstract the look and only store tiny bits of angles, distances, colors, patterns, movements and facial expressions.

You don't even know WHAT you are storing in the first place. Perception and pattern-matching are a very complex thing, and a thing far different than what one might guess.

Re:If we want to make this technology work... (2)

Jerf (17166) | more than 12 years ago | (#3589764)

And though it would require more computing power, it is much easier to make a positive match using three-dimensional data as opposed to two.

Unfortunately, this apparently simple statement is not as true as it would seem to you, a human being equipped with staggeringly immense computational power and a brain specially equipped for this very task.

In vision, there are two problems (at least). One is the usual problem of creating algorithms that can recognize things. The other is the staggering amount of data these algorithms must cope with.

Many common vision applications (by which I mean not necessarily face recognition) involve taking the picture, which may start life at any resolution you please, sampling it down to 32x32 or 64x64 (if you're willing to stretch), dropping down to 4 or 6 bits color, and proceeding to do the analysis on this exponentially smaller sample size.

Facial recognition algorithms do not always (often?) do this, but the problem of dealing with immense amounts of data does not go away. It simply exists in different ways. You're still trying to get invariant data (recognizing "bob #2423" no matter what bob is doing to fool the camera) out of a domain that has 2^(100*100*24) possible images (for a 100x100 full color RGB image; keep going up if you want something larger than 100x100, which is barely ID-photo sized.)

Throwing more data at the problem does not necessarily get you ahead. You must always throw out the vast majority of it anyhow to get any real work done.

(Also, you may be surprised; depth perception in humans is an interesting field of study. Less of it comes from your eyes than you may think; most of it comes from image processing. Your binocular vision has effectively no discrimination past six feet or something like that; I'd have to look the exact number up, but it's shorter than most people would think.)
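A sketch of the shrink-and-quantize preprocessing step described above, using the Pillow imaging library (an assumption; systems of that era used their own pipelines, and the filename is hypothetical), so the recognizer only ever sees a few thousand values per frame:

from PIL import Image

# Reduce a captured frame to the small, coarse representation described
# above before any recognition work is attempted.
frame = Image.open("camera_frame.jpg")        # hypothetical captured frame
small = frame.resize((64, 64))                # down to 64x64 pixels
coarse = small.quantize(colors=16)            # roughly 4 bits of color
pixels = list(coarse.getdata())               # 4,096 palette indices, not millions
print(len(pixels))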

Re: If we want to make this technology work... (2)

Black Parrot (19622) | more than 12 years ago | (#3589766)


> I, for one, am pretty much 99.99% correct when it comes to making positive recognition of those people around me that I see often. ... I personally think that these cameras need to look at people the way we do, with two eyes. ... Ever seen a perfect frontal view photograph of a person's face? Can you tell how long their nose is when you're looking at it?

I certainly am not an expert in these matters, but based on half a lifetime's self-observation, I'm pretty sure that your recognition of your fellow humans is based on subtleties of appearance and mannerisms rather than on some hyper-analytical form-matching mechanism.

I know that on several occasions I have been in a grocery store or somewhere and caught a former schoolmate out of the corner of my eye, recognizing him or her instantly. But as I approach to say 'hi' I get a better look and suddenly think that I have mis-recognized a stranger instead of correctly recognizing a former associate. It's only on the third or fourth look that I decide for sure that I should go ahead and say 'hi'.

Also notice the frequent situation where half your friends think Little Joey looks like Mom and the other half think he looks like Dad. I hypothesize that that's because some are looking at (say) the shape of his nose and others are looking at (say) the shape of his eyes. I.e., humans apparently recognize people on a fairly arbitrary subset of subtle cues rather than matching a remembered 3-D 'mask' to their faces.

As in so many other fields of AI, the technology that's on the market today falls far, far short of the basic abilities that humans -- and animals -- take so much for granted.

I wonder what the best that today's technology could actually deliver is. If you set a threshold of (say) a maximum of 0.1% false positives, what are the chances of actually recognizing someone in your criminal/terrorist database if they are actively trying not to be recognized? I suspect the performance is going to be pretty dismal.

Face-scanning? (-1, Troll)

Metrollica (552191) | more than 12 years ago | (#3589571)



I won't be convinced till it can recognize my ass.

Troubling (1)

NickRob (575331) | more than 12 years ago | (#3589604)

Thank God that the scanners are out... even if it's not quite by the nose it said it was.

Besides the infringement on civil liberties, what was troubling to me about the scanners is the reduction to a mathematical sequence... meaning, quite literally, that we're just another number. How depressing.

I got only one thing to say .... (1)

tcc (140386) | more than 12 years ago | (#3589610)

I've got only one thing to say to the creators of this big brother device:

IN YOUR FACE!

heh.

Grousing... (2, Offtopic)

Mulletproof (513805) | more than 12 years ago | (#3589616)

2002-05-19 16:06:51 Florida Face Recognition Fails (articles,privacy) -Rejected

Gee, only beat this submission by about a month.

Re:Grousing... (0)

Anonymous Coward | more than 12 years ago | (#3589658)

You've had a month to think about this issue and all you can manage to contribute to the discussion is a petty whine about how CmdrTaco doesn't love you any more?

Re:Grousing... (0)

Anonymous Coward | more than 12 years ago | (#3589678)

Look who is talking, Mr. Coward.

Sense the irony now??? hahahha

Yep. (0)

Mulletproof (513805) | more than 12 years ago | (#3589701)

And what did you have in mind? Should I comment about how the system isn't all bad? Or that it had an 80% success rate in other areas? Or that it could be used for racial profiling? Or that it is too unreliable to be taken seriously now? Or that... Oh, wait... People have already posted all that shit...

Yep. I had a month to think about it, and now I'm wading in the warm jello bath of irony ^__^

False positives, false negatives, and wasting time (5, Insightful)

markwelch (553433) | more than 12 years ago | (#3589622)

The notion that someone will repeatedly be "identified" as matching a particular face, is a very real concern for travelers. Already, we find that Americans with brown skin and beards, and especially persons who "look" Muslim, are hassled every time they enter an airport and often miss their flights. Non-citizens who "appear Muslim" should probably just give up on any idea of flying in the next few years.

As noted, there can be no "get past ID check free" letter or ID card, since those would immediately be forged. And with a 50% false negative rate (missing a suspect 50% of the time), the system seems hardly worth using.

I have not traveled by air since returning from Europe on September 19 (delayed from Sept. 12).

In the past, I would have flown between the San Francisco Bay Area and the Los Angeles area (a 1-hour flight, using any of the airports on either end), but now it's actually likely to be faster to drive (around six hours each way), after including all the "waiting in line time," the increased flight delays, and of course the time to get into and out of the airports (park here, rent a car there).

To be fair, of course, a system with a 50% false negative rate is presumably able to detect "known suspects" 50% of the time, which is almost certainly much better than human beings will ever do. Of course, the tests are probably being conducted under very favorable conditions, with an extremely small sample of "suspects." And of course, if the false-positives were equally distributed, we'd all be willing to suffer a one-in-a-thousand delay, if it actually had any meaningful benefit. (But we know that the false-positives won't be equally distributed, they will mostly affect persons in certain ethnic groups or with beards, etc., and while that means I'm less likely to be inconvenienced, I can't tolerate a system that punishes people for their skin color or ethnic background.)

What's scary, to me, is that we are giving up so much (in many little bits and pieces) for so little benefit. On Saturday, I discovered that I couldn't use the restrooms in the BART (train) stations again, because they were closed to prevent someone from planting a bomb in them. Okay, so I had to hold it for an hour until I got home, big deal. And armed troops in the airports, and on bridges, okay, I can live with that one thing. And I can't drop off my express mail without handing it to a postal clerk now.

But ding, ding, ding, we add up all the little "show-off" gimmicks and what we face is a huge impact that provides pretty much zero actual benefit. All the gimmicks combined might provide about 1% or 10% improved safety, at a much greater cost.

While I was stuck in London during the week after September 11, I worried that things would change for the worse, not because of things that terrorists did, but because of the things we would do out of panic and fear and prejudice and idiocy. Things are nowhere near my worst fears, but I think things are very bad, and ultimately I believe that the terrorists have already "won" by causing most Americans to change multiple aspects of our "way of life."

Re:False positives, false negatives, and wasting t (0)

Anonymous Coward | more than 12 years ago | (#3589643)

Bah, don't throw the "look Muslim" bull out there. I'm 30, white, male, very short hair (shaved), and on EVERY flight since Sep 11 I've been pulled aside and my luggage checked. I sure as h*ll don't meet the "profile" of a Sep 11 terrorist.

So don't make it sound like ONLY those of "Arab" descent are being targeted.

Who watches...? (1)

Mulletproof (513805) | more than 12 years ago | (#3589685)

What's scary, to me, is that we are giving up so much...

I really don't see the problem with having this system installed within the nation's mass transit systems (air, rail, maybe bus). Eliminate a potential hijacking not just in the US, but in Europe and everywhere else? I wouldn't complain. I agree that the current system needs a lot of work, but it's also fair to mention that it scored upwards of 80% in some of the other airports in which it was tested. It's also true that it has the potential to become scary (like anything in government) if the proper countervailing balances aren't installed to watch the watchers.

The Ultimate System (2, Insightful)

MikeD83 (529104) | more than 12 years ago | (#3589641)

At the metal detector a passenger's picture is taken. It is then compared to the database of known criminals.
A security guard is sitting in front of a computer next to the x-ray machine ready for a positive match.
If you look nothing like the person (different race or something like that), you would be let through to the gate and not even know you were positively identified.
If it may be a good match, you get stopped. The operator already has some information about the criminal in front of him. The operator will do an on-the-spot quick check. One thing that criminals are notorious for is tattoos. If the passenger doesn't have them (or signs of removal surgery) let them go. If the passenger is a very close match do a more thorough examination.
Every night there can be an audit of the matches to make sure the security personnel are doing their job. The system seems very effective to me.

The system by Visionics looks at 80 different facial characteristics. The configuration used by the airport only needed 14 matches to produce a positive. It seems this is a setting in software and could probably be lowered to produce more positives. Even if they are false positives, the system I mentioned above would do the job.

Face scanning at the Statue of Liberty ferry (1)

telematx (17553) | more than 12 years ago | (#3589679)

A spokesperson said, "There's room for improvement."
Someone should tell that to NY officials. According to the AP [salon.com] , a face-scanning system has been installed at the Statue of Liberty & Ellis Island ferry.

Why on earth (2)

CaptainSuperBoy (17170) | more than 12 years ago | (#3589710)

Why on earth would you post a story about some Florida airport giving up on face recognition, when we just heard that New York City is already using this technology? [yahoo.com] It is already being used at the Statue of Liberty and Ellis Island to scan faces as people board the ferry. Way to go Jamie - raising that journalistic bar of integrity and thoroughness for everyone!

What's the hardware...? (1)

Mulletproof (513805) | more than 12 years ago | (#3589725)

Besides my grousing earlier in the section, does anybody know what hardware is running this face recognition system? How beefy does a computer need to be to sort through 20-100 people at a time and match their faces with potentially thousands of profiles, given factors such as the target's speed and aspect to the camera? I'm figuring if they want this thing any more accurate, they're going to have to map more points on the face and are going to need the hardware to back it up.

Look-alikes? (5, Insightful)

TheSHAD0W (258774) | more than 12 years ago | (#3589738)

Here's one for you: What would you do if you looked like a terrorist?

Let's say, some time in the future, they get the face-scanning technology to work right. 0.000001% false-positive rate. And it's implemented all over the US.

Let's also say that, among the 250 million people in the United States, one or more people had facial structures similar enough to terrorists' that they would trigger those scanners. In fact, they'd trigger every scanner that person was surveiled by. And let's say that person were you.

What would you do?

You couldn't go to an airport. You couldn't go to a major public attraction. You probably couldn't go to a public place without fear of some alarm going off and people waving automatic weapons in your face. Would you cower at home? Would you wear a bag over your head? Would you sue the US government? How would you cope?
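Even at the deliberately generous 0.000001% rate above, scanning a whole country leaves a couple of permanent lookalikes in expectation; quick arithmetic using only the hypothetical numbers from this comment:

# Expected number of people who would trip the scanners everywhere they go,
# using the hypothetical figures in the comment above.
population = 250_000_000
false_positive_rate = 0.000001 / 100      # 0.000001% expressed as a fraction (1e-8)
print(population * false_positive_rate)   # 2.5 people, on average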

Re:Look-alikes? (0)

Anonymous Coward | more than 12 years ago | (#3589761)

You'd probably want to shave that ZZ Top beard off, for one.

If you don't want to be mistaken for a terrorist, don't go out of your way to look like one.

Everybody runs. (2, Funny)

Mulletproof (513805) | more than 12 years ago | (#3589767)

It's the future. The Police stop crime before it happens. They are never wrong [apple.com] .

No such thing as a cure-all (2, Insightful)

jonman_d (465049) | more than 12 years ago | (#3589742)

I think the problem with security these days is that many people are looking for a one-solution-fixes-all type of thing. People need to realize that (and geeks know this, we do it on our computers ;) there are, and should be, multiple layers of security.

Employing facial recognition is just one thing we can do - granted, we need to get the technology to work better, but we need to realize that it's multiple systems working together that is going to stop terrorists, not one or two "miracle systems."

False positives (5, Insightful)

Restil (31903) | more than 12 years ago | (#3589759)

We're not talking about using this technology to make courtroom identifications. We're using it to notify security that you MIGHT have someone in front of you that is of less than reputable character. This doesn't mean you immediately cuff him and throw him in jail, but if he tries to walk through a screener checkpoint it MIGHT be a good idea to do a little better check than a simple wand wave. In the meantime, someone can be checking the pictures to see if that person's face actually matches the match the computer made. With a .1% false positive rate, you could have a couple paid employees just looking at matching pictures to see if there's really cause for concern or not. At the rate people go through screening checkpoints now, they'll get a "match" about once every 10 minutes or so; your mileage may vary with larger airports, it's all a matter of scale.
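That once-every-10-minutes figure roughly checks out for a busy airport; a quick sanity check with assumed traffic numbers (not from the article):

# Rough check of the alarm frequency at a 0.1% false-positive rate.
passengers_per_day = 100_000    # assumed daily traffic at a large airport
false_positive_rate = 0.001     # the .1% rate mentioned above
screening_hours = 16            # assumed hours the checkpoints run each day

alarms_per_day = passengers_per_day * false_positive_rate        # 100 alarms
print(screening_hours * 60 / alarms_per_day, "minutes between false alarms")  # ~10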

As for false negatives, even 50% is better than nothing as long as the false positive rate is much, MUCH lower. Imagine catching 50% of the hijackers on September 11 before they boarded the planes. A lot of red flags could have gone up, flights could have been delayed, and the rest of the passengers could have been more carefully scrutinized. No, this is not the solution to any problem. And no, it should not be used legally any more than a lie detector can be. It's a guide. It tells us where we might need to concentrate more of our efforts.

As far as threats to privacy go, this makes sense in an airport, but it does not make sense out on the street. People go into an airport expecting to be searched, questioned, carded, etc. They do not have the same expectation while walking down the street. So unless the cops are currently chasing someone, they lose him, and you have a striking resemblance, they shouldn't bother you at all.

-Restil