
Homeland Security Department Testing "Pre-Crime" Detector

timothy posted more than 5 years ago | from the when-dowsing-meets-voight-kampff dept.

Privacy

holy_calamity writes "New Scientist reports that the Department of Homeland Security recently tested something called Future Attribute Screening Technologies (FAST) — a battery of sensors that determine whether someone is a security threat from a distance. Sensors look at facial expressions, body heat and can measure pulse and breathing rate from a distance. In trials using 140 volunteers those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,' says a DHS spokesman."


sensors... (5, Insightful)

adpsimpson (956630) | more than 5 years ago | (#25121825)

Sensors look at facial expressions, body heat and can measure pulse and breathing rate from a distance

...And most importantly, skin colour?

Seriously, is there anything a device like this can do that's either more useful or less invasive than a human watching people walking past and profiling/screening them on what they can see?

Re:sensors... (5, Insightful)

$RANDOMLUSER (804576) | more than 5 years ago | (#25122109)

Why yes, yes there is. It can randomly spurt out false positives, subjecting people to random stops and questioning. It can still miss the real terrorists who are doing their damnedest to look normal and unthreatening. It can further the "show us your papers" society we've been building and seem so enamored of. It can supply the mindless thugs at security checkpoints an ironclad "the machine says so" excuse to hassle harried, irritated travelers. It can further the "security theatre" in all aspects of everyday life. In short, it can do nothing positive.

Re:sensors... (5, Insightful)

electrictroy (912290) | more than 5 years ago | (#25122241)

Good point. A real terrorist doesn't show signs of distress, because he doesn't consider his actions immoral. He thinks killing IS the moral thing to do.

Re:sensors... (4, Insightful)

moderatorrater (1095745) | more than 5 years ago | (#25122343)

He'll still show signs of stress, though. Just because you think it's right to get into a fight doesn't mean that the adrenaline doesn't start pumping.

The real problem with this is that the number of wrongdoers is small while the pool for false positives is huge. If 5% of people have some intent that should be picked up by this, then at 78% detection about 4% of all people will be correctly flagged. At that rate, they'd have to have a false positive rate below about 4% just to reach the point where half the people it flags actually have ill intent. What are the chances that it's going to have a false positive rate that low?

And that's assuming that 1/20 people have some intent that would need to be picked up by this, while the actual rate is almost certainly smaller. Millions of people fly on airplanes every year, yet every year only a handful try something stupid. This is security theater at its finest.
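The base-rate arithmetic in this comment can be checked with a short Python sketch (the 5% prevalence is the commenter's hypothetical; 78% is the DHS detection figure from the summary):

```python
# Hypothetical numbers from the comment above: 5% of travelers have
# ill intent, and the machine catches 78% of them (the DHS figure).
prevalence = 0.05   # fraction of people with ill intent (hypothetical)
sensitivity = 0.78  # fraction of those the machine flags (DHS figure)

# Correctly flagged people, as a share of everyone screened: ~3.9%
true_positives = prevalence * sensitivity

# For half of all flags to be genuine, false positives must equal
# true positives: (1 - prevalence) * fpr == true_positives
break_even_fpr = true_positives / (1 - prevalence)

print(f"true positives: {true_positives:.1%} of all travelers")
print(f"break-even false-positive rate: {break_even_fpr:.1%}")
```

So even under the generous 5% assumption, the machine needs a false-positive rate under about 4% before half of its flags point at actual wrongdoers.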

Re:sensors... (1)

IgnoramusMaximus (692000) | more than 5 years ago | (#25122349)

Brilliant point. Particularly, a religious fanatic will be in a state of peace and righteousness-filled euphoria because he is finally "fulfilling his destiny" in life and just hours away from being rewarded by his God for being a faithful "Holy Warrior".

Re:sensors... (2, Interesting)

gnick (1211984) | more than 5 years ago | (#25122509)

Particularly, a religious fanatic will be in a state of peace and righteousness-filled euphoria because he is finally "fulfilling his destiny" in life and just hours away from being rewarded by his God for being a faithful "Holy Warrior".

I've got to disagree there. I don't want to praise the machine - This thing is nuts. And I agree that, just before detonation, a fanatic may experience a sense of euphoric peace. But, when going through security, it's a toss up between beautiful martyrdom and failure resulting in a good long stretch in Guantanamo Bay being questioned unmercifully by the infidels. A good lot of training may help them deal with that stress. And their faith may provide them with confidence that their gods wouldn't allow them to fail. But until you actually get through security, there's got to be a lot of stress to deal with - Probably even more than when they actually push the button / flip the switch / light their shoe laces.

Re:sensors... (5, Insightful)

Otter (3800) | more than 5 years ago | (#25122419)

Absolutely untrue. Suicide bombers fail as often as they do (in Israel, Iraq, Sri Lanka,...) because they're usually bug-eyed, sweating, twitching, and frequently high. Highly trained operatives might be reliably calm, but the run-of-the-mill terrorist is usually pretty obvious, although they can still often kill people before someone can stop them.

Re:sensors... (2, Funny)

TheVelvetFlamebait (986083) | more than 5 years ago | (#25122311)

It can randomly spurt out false positives, subjecting people to random stops and questioning. It can still miss the real terrorists who are doing their damnedest to look normal and unthreatening.

Sheesh! I've never seen a bunch of geeks so opposed to developing an immature technology before! Perhaps a toning down of the pessimism would be in order, and perhaps we may see some improvements in our understanding of human behaviour, and the programs built to understand it.

Re:sensors... (3, Insightful)

Aphoxema (1088507) | more than 5 years ago | (#25122423)

It's okay since only a few people will get hurt in the process.

Re:sensors... (1)

mi (197448) | more than 5 years ago | (#25122493)

It can randomly spurt out false positives, subjecting people to random stops and questioning. It can still miss the real terrorists who are doing their damnedest to look normal and unthreatening.

Humans can — and already do — do all that too. The question is, will a device ever be better at it. Will it have — not zero — fewer false-positives? Will it catch — not all — more terrorists?

In fact, it does not even have to be better. If it is the same or even slightly worse than a human guard, it may still be worth deploying for its price (human labor is way too expensive) and objectivity, as it will not be blinded by things like

  • familiarity (a terrorist may try to befriend a guard in advance)
  • physical attractiveness of the (would-be) suspect
  • the guard's mood swings — or a hangover.

Re:sensors... (1)

geoffspear (692508) | more than 5 years ago | (#25122535)

Why yes, yes there is. It can randomly spurt out false positives, subjecting people to random stops and questioning.

I know I'm glad a human watching people go past has a 0% false positive rate.

Re:sensors... (3, Funny)

nedlohs (1335013) | more than 5 years ago | (#25122155)

It can't be sued for being racist...

Re:sensors... (4, Insightful)

Otter (3800) | more than 5 years ago | (#25122259)

...And most importantly, skin colour?

That's precisely the point of using an automated system instead of humans, to avoid accusations of racial or ethnic profiling.

Re:sensors... (3, Insightful)

arth1 (260657) | more than 5 years ago | (#25122399)

That's precisely the point of using an automated system instead of humans, to avoid accusations of racial or ethnic profiling.

So, who are the non-humans that calibrate the systems?

Re:sensors... (1)

Hairy Heron (1296923) | more than 5 years ago | (#25122575)

So this system just built itself without any outside human interaction?

Re:sensors... (2)

arth1 (260657) | more than 5 years ago | (#25122267)

Quite frankly, I don't think that DHS is pushing for less invasive procedures.

Anyhow, if this thing can be refined so it accurately detects people intent on deception, it will mean that few politicians or lawyers ever will be able to fly. It'll get nixed, no worries.

Re:sensors... (1, Funny)

Anonymous Coward | more than 5 years ago | (#25122407)

When shown a picture of Dick Cheney, the detector started spinning in circles, waving its cables haplessly while emitting blasts of "WARNING! WARNING! DANGER WILL ROBINSON! DANGER!!!" and reduced itself to a molten clump of plastic and fused metal.

Re:sensors... (1)

Cartack (628620) | more than 5 years ago | (#25122527)

What about people with anxiety disorders and other psychological problems that affect how nervous they appear? All of you geeks with GAD (generalized anxiety disorder), SAD (social anxiety disorder), Asperger's, etc., beware at security checkpoints.

Err (5, Insightful)

InvisblePinkUnicorn (1126837) | more than 5 years ago | (#25121831)

Does this sound idiotic to anyone else? Of course it's going to work for people who are told how to act in order to get the device to flag them.

My first thought, too... (5, Insightful)

Joce640k (829181) | more than 5 years ago | (#25122065)

All we've got is a device which can spot normal people trying to be visibly "suspicious".

Re:My first thought, too... (3, Funny)

HTH NE1 (675604) | more than 5 years ago | (#25122569)

All we've got is a device which can spot normal people trying to be visibly "suspicious".

Doc Brown: Get yourself some fifties clothes.
Marty McFly: Check, Doc.
Doc Brown: Something inconspicuous!

Re:Err (5, Insightful)

Yvanhoe (564877) | more than 5 years ago | (#25122179)

If I recall correctly, the last time I traveled to the USA, I had to fill out a form stating that the intent of my travel was not to kill the US president. People who create such forms would probably fund research on a "suspicious person detector".

Re:Err (2, Interesting)

Shadow Wrought (586631) | more than 5 years ago | (#25122217)

Does this sound idiotic to anyone else?

Yep. But this is Slashdot. To the powers that be it probably shows "great promise" and, since it is a machine, would be "unbiased."

All the things it is tagging as "suspicious" could also be explained by a bad phone call just before you come in range. Maybe your wife just called to say she's leaving you for your sister. Again.

Re:Err (1)

electrictroy (912290) | more than 5 years ago | (#25122297)

>>>To the powers that be it probably shows "great promise" and, since it is a machine, would be "unbiased."

That's how we got those junk Diebold voting machines; they were supposedly better than a simple handcount of paper ballots. Not.

Re:Err (1)

penguinbrat (711309) | more than 5 years ago | (#25122257)

Idiotic or not (a point-of-view thing, I suppose), I personally view it as more scary than anything — how could you possibly raise a kid to be prepared for and deal with crap like this? The Minority Report and Big Brother scenarios are only going to be the tip of the iceberg in 30+ years...

Re:Err (1)

InvisblePinkUnicorn (1126837) | more than 5 years ago | (#25122405)

The Minority Report and Big Brother scenarios are only going to be the tip of the iceberg in 30+ years...

Careful, with comments like that, you're liable to wake up on an iceberg, with a GPS tracking device implanted in your skull.

Re:Err (1)

etully (158824) | more than 5 years ago | (#25122355)

And what do those stats 78% and 80% mean?

If you have 100 people approach and ALL 100 of them have mal-intent... and the system only alerts you to 78 or 80 of them... then sure, you've got a 78-80 percent success record, I guess.

It'd be much nicer if 100 people approached and only ONE of them had mal-intent and it was able to spot that person 78-80% of the time... AS LONG AS it is 100% perfect at correctly identifying the other 99 innocent people as innocent people.

78-80% success rate doesn't sound so good if it means that I have a 20-22% chance of getting a full body cavity search every time I get within 500 feet of the police.
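The two readings of the 78% figure described above lead to very different outcomes; a rough sketch (the all-hostile crowd and the 1-in-100 prevalence are illustrative assumptions, not figures from the DHS trial):

```python
# Reading 1: all 100 approaching people are hostile and the machine
# flags 78% of them -- the headline number looks fine.
caught = round(100 * 0.78)
print(caught, "of 100 hostiles flagged")

# Reading 2: 1 hostile in 100, and the 22% error rate applies to
# everyone.  The innocent crowd then generates ~22 false alarms:
false_alarms = 99 * 0.22
print(round(false_alarms), "innocent people flagged per 100 screened")
```

Under the second reading, nearly every person pulled aside is one of the 99 innocents, exactly the cavity-search scenario the comment describes.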

Re:Err (1)

mcgrew (92797) | more than 5 years ago | (#25122441)

"Welcome to Security Masterpiece Theatre! And here's your host, Mister Alistair Crowley!"

Re:Err (2, Funny)

gsslay (807818) | more than 5 years ago | (#25122449)

Does this sound idiotic to anyone else?

Yes indeed it does.

Testing on my new device starts tomorrow. It has a remarkable 98% accuracy in identifying people told to dress completely in purple and sing "I Love You, You Love Me". Even at a distance. As long as the terrorists play along (and who wouldn't?) we'll win this war on terror in no time. And even if they don't, think of all the Barney impersonators we'll get off the streets. It's an everybody-wins scenario.

Pre-Crime? (1)

AZScotsman (962881) | more than 5 years ago | (#25121835)

Insert obvious Tom Cruise - Minority Report references here.

Philip K Dick was prescient (1)

JeffSchwab (1159723) | more than 5 years ago | (#25121841)

Minority Report, anyone?

Designing the ad (1, Funny)

Anonymous Coward | more than 5 years ago | (#25121847)

Hi, I'm a terrorist, and I've been made into a stereotype.

Re:Designing the ad (0)

Anonymous Coward | more than 5 years ago | (#25121989)

That's right... for some reason, people wearing turbans set off the sensor 100% of the time... :-)

Re:Designing the ad (2, Funny)

Ethanol-fueled (1125189) | more than 5 years ago | (#25122173)

...and people who look like this [photobucket.com] , too.

"Told to act suspicious"? (5, Insightful)

fprintf (82740) | more than 5 years ago | (#25121855)

The summary talks about the subjects being told to act suspicious. So, if you are told to act suspicious, does this make you any different from someone who is actually planning something nasty? I suppose it is difficult to find subjects who are unaware they are being observed, and yet also intent on doing something bad. Nevertheless, I'd hypothesize there might be significant, observable differences between the two groups.

Re:"Told to act suspicious"? (3, Insightful)

Anonymous Coward | more than 5 years ago | (#25121991)

You will always get these sorts of results with forced actions. If I made a happiness detector (via facial expressions), and told half of the group to smile, and the other half not to, I bet it would pick that up. Now, what if half the group were given a personal responsibility toy, and the other half were given a cuddly teddy bear? I bet it wouldn't be accurate anymore...

A better test would be to give the group water bottles. Most of the group are given real water in bottles. A few of the group are given water bottles filled with vodka. All subjects know what they are carrying. The goal is to finish an AA meeting, drinking your drink. If you get through the meeting, you are given a reward (say $20). If you don't, you owe $20.

What's the bet that would be much harder to figure out?

Re:"Told to act suspicious"? (0)

Anonymous Coward | more than 5 years ago | (#25122463)

Now, what if half the group were given a personal responsibility toy,

Is that what we're calling vibrators these days?

Not even close (5, Interesting)

ShawnCplus (1083617) | more than 5 years ago | (#25121865)

Sorry, but 78% is not even REMOTELY accurate to consider someone dangerous. There is already a high enough false accusation rate.

Re:Not even close (3, Funny)

pizzach (1011925) | more than 5 years ago | (#25122209)

In other words, 22% of the time it is wrong. Saying it's right 78% of the time is pure and simple market speak.

The interesting thing about this is that if people started acting suspicious en masse, the numbers would become fudged and mostly meaningless. One way to accomplish this is to stand around handing out complimentary eye patches, telling people it's Act Like a Pirate Day.

Re:Not even close (1)

SimonGhent (57578) | more than 5 years ago | (#25122433)

Sorry, but 78% is not even REMOTELY accurate to consider someone dangerous

Especially as they were

told to act suspicious

This really is an utter crock.

78% isn't the number you care about (3, Interesting)

patio11 (857072) | more than 5 years ago | (#25122583)

Most AIDS tests are 99%+ accurate at telling you that a person with HIV actually has HIV. They're also 99% accurate at saying a person who doesn't have HIV, doesn't have HIV. It's the combination of those two facts plus "Very few people in the general population have HIV" which makes mass one-time AIDS screenings a bad idea -- you successfully pull out the one guy in 100 who has HIV, but you also flag one innocent bystander, and you end up adding 99% accurate + 99% accurate to get 50% accurate.

There are a heck of a lot fewer terrorists than 1% of the flying public.

There is a countermeasure, of course -- you use the magic machine not as a definitive test but as a screening mechanism. Know why we aggressively screen high risk groups for AIDS? Because they're high risk -- if 1 out of every 4 screenies is known to be positive (not hard to reach with some populations) then the 99%/99% math adds up to better than 95%. Better news. (You then independently run a second test before you tell anyone they're positive. Just like you wouldn't immediately shoot anybody the machine said is a terrorist -- you'd just escalate the search, like subjecting them to a patdown or asking for permission to search their bags or what have you.)

So you could use the magic machine to, say, eliminate 75, 90, 99%, whatever of the search space before you go on to whatever your next level of screening is -- the whole flying rigamarole, for example. Concentrate the same amount of resources on searching 20 people a plane instead of 400. Less hassle for the vast majority of passengers, and the examinations that do happen become less cursory.

The quick among you will notice that this is exactly the mechanism by which racial profiling works -- we know a priori that the 3-year-old black kid and the 68-year-old white grandmother are not holding a bomb, ergo we move on to the 20-year-old Saudi for whom holding a bomb is merely extraordinarily improbable. That would also let you lop a huge section off the top of the search space.

The difference between the magic machine and racial profiling is that racial profiling is politically radioactive, but the magic machine might be perceived as neutral. Whether you consider that a good or a bad thing is up to you. Hypothetically assuming that the machine achieves, oh, 80% negative readings for true negatives, many people might consider it an awfully nice thing to have 80% of the plane not have to take off their shoes or get pat down -- they could possibly get screened as non-invasively as having to answer two of those silly, routine questions.

(Of course, regardless of what we do, people will claim we're racially profiling. But that is a different issue.)
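The screening arithmetic in this comment is Bayes' rule for positive predictive value; a minimal sketch using the comment's 99%/99% test numbers:

```python
def ppv(prevalence, sensitivity, specificity):
    """P(actually positive | test says positive), by Bayes' rule."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# 99%/99% test, 1-in-100 prevalence: half of all positives are false.
print(ppv(0.01, 0.99, 0.99))   # ~0.5

# Same test on a high-risk group (1 in 4 positive): better than 95%.
print(ppv(0.25, 0.99, 0.99))   # ~0.97
```

This is why the same test is nearly useless for mass screening but quite informative once some other mechanism has concentrated the risk.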

Facial expressions? (3, Funny)

AioKits (1235070) | more than 5 years ago | (#25121867)

In other news today, Homeland Security has detained the entire Chili Cook-off Carnival event after their new FAST software registered positive hits on EVERYTHING there, including some domesticated animals and a squirrel with three legs.

Doesn't matter (5, Insightful)

MadMidnightBomber (894759) | more than 5 years ago | (#25121873)

"In trials using 140 volunteers those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,' says a DHS spokesman."

None of that matters - what's important is the false positive rate, ie. the proportion of people with no malicious intent who get flagged up. If it's as high as 1% the system will be pretty much unworkable.

Re:Doesn't matter (1)

networkconsultant (1224452) | more than 5 years ago | (#25122187)

Well, if I was planning something nasty I would act as if I was not planning anything at all. Most suicide bombers are taught to infiltrate, gain confidence, then blow themselves up, and since they are committing suicide you might find that they will not appear any different from their targets. And what about people with disorders, like, say, someone prone to agoraphobia or panic attacks? (That's a full 1/3 of the population, by the way.)

Re:Doesn't matter (1)

Thelasko (1196535) | more than 5 years ago | (#25122321)

"In trials using 140 volunteers those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,' says a DHS spokesman."

None of that matters - what's important is the false positive rate, ie. the proportion of people with no malicious intent who get flagged up. If it's as high as 1% the system will be pretty much unworkable.

Exactly, the NewScientist article fails to mention false positives. However, the attached PDF [dhs.gov] goes into it in great detail. I don't have time to read 274 pages though.

Even if this new technique is only intended to help law enforcement determine which individuals to pay extra close attention to, it will inevitably be abused much like Tasers are today. What makes this "tool" completely useless is the fact that it can be tricked by individuals acting suspicious but not actually committing a crime. This tells me that this device is vulnerable to attacks that involve misdirection. [wikipedia.org]

Re:Doesn't matter (1)

digitalderbs (718388) | more than 5 years ago | (#25122533)

Agreed. I crossed the border yesterday, and I had to renew my NAFTA visa. Completely at the discretion of the US border officer, he can decide whether to turn me back -- if I don't "look" right -- or if he doesn't like my paperwork. Searches are another concern. I had to be at work today, and getting on the US flight yesterday was important. The process can take one to two hours, too.

As I was standing in line, I noticed that I may have exhibited some of these characteristics inadvertently: increased heart rate, looking around frequently, jitteriness. The point is that it's a stressful situation. I'm not convinced that someone who is nervous is necessarily malicious -- a very poor correlation, I'd imagine. A detector like this would further exacerbate the situation.

Really? (3, Insightful)

gehrehmee (16338) | more than 5 years ago | (#25121875)

In trials using 140 volunteers those told to act suspicious were detected with 'about 78% accuracy on mal-intent detection, and 80% on deception,

Isn't this a little off-base? People who are really about to commit a crime, as a rule, will be explicitly trying not to look suspicious.

Re:Really? (1)

Manfre (631065) | more than 5 years ago | (#25122403)

There are also those who commit crimes without thinking they are doing anything wrong. False positives will be a serious issue. An awkward guy on a first date or around a girl he likes, or a person who had a bad day at work, will probably have a higher pulse and body temp and be acting suspicious or annoyed.

Additional Locations (3, Interesting)

UncleWilly (1128141) | more than 5 years ago | (#25121885)

I propose the House, Senate and White House also.

Re:Additional Locations (5, Funny)

antifoidulus (807088) | more than 5 years ago | (#25121939)

Can the sensors even handle that much mal-intent and deception?

Minority Report (2, Funny)

Swampcritter (1165207) | more than 5 years ago | (#25121893)

I think someone has been watching Minority Report a bit too closely. I can just see it now... the 'Pre-Crime' Division of the DHS.

Only as good as its success rate (1)

TheVelvetFlamebait (986083) | more than 5 years ago | (#25121895)

Things like these are only as good as their success rate. If they get a whole lot of false positives, then they're going to be worth squat when it actually comes down to hard evidence.

Then again, perhaps they might be useful as a general indicator of "mal-intent". Not as a method of proof, but just a way of optimising the job of certain DHS officials.

So a jogger who's lying to his trainer... (1)

mr_mischief (456295) | more than 5 years ago | (#25121897)

So if I'm running and about to lie to my trainer or doctor about how far I ran today, my pulse rate, breathing rate, and body temperature are up. I'm thinking about deceiving someone. So I guess that means it's now a crime to lie to your trainer according to the DHS?

Re:So a jogger who's lying to his trainer... (1)

ivandavidoff (969036) | more than 5 years ago | (#25122163)

This one time in the late '70s I was stopped and frisked by a couple of officers in broad daylight in a quiet little town in Southern California. When I came up clean, they were happy to tell me they stopped me because I had long hair and was running. I missed my bus and was late to class that morning.

Re:So a jogger who's lying to his trainer... (1)

penguinbrat (711309) | more than 5 years ago | (#25122353)

Take it one step further -- you just finished running, your heart rate is up, your body temp is up and you're sweating -- you're about to commit a crime, according to the DHS...

Was this like the Missile Defense Shield tests? (1)

Chris Burke (6130) | more than 5 years ago | (#25121901)

Were the 'positive' participants in the test told to "act suspicious" by carrying a radio transponder on their person?

Re:Was this like the Missile Defense Shield tests? (1)

Actually, I do RTFA (1058596) | more than 5 years ago | (#25121985)

Were the 'positive' participants in the test told to "act suspicious" by carrying a radio transponder on their person?

Nope, only 78% of them were told to carry a radio transponder. Didn't you RTFS?

Government screws private sector again. (5, Funny)

bigtallmofo (695287) | more than 5 years ago | (#25121911)

I was just about to finish up my patent application for a device that could accurately detect a human pretending to be a monkey 80% of the time when a human test subject is asked in advance to pretend to be a monkey.

Why do I even bother?

Minority Report (1)

ireallylovelinux (589360) | more than 5 years ago | (#25121931)

But will this new detector include Tom Cruise's crimes in its database? Will he have to get new eyes?

What a bunch of BS (1)

mbone (558574) | more than 5 years ago | (#25121951)

Those told to act suspicious? WTF, did they give them Groucho Marx glasses? And a 20% false negative rate even on that.

IMHO, every person involved with this project should be summarily fired, up to and including the Department Head.

Re:What a bunch of BS (3, Informative)

Chris Burke (6130) | more than 5 years ago | (#25122475)

Just an FYI: the accuracy number doesn't directly tell you the ratio of false negatives. It's a measure not just of how many true positives it gets (that's the sensitivity), but also of true negatives (that's the specificity), in that it should both identify the "suspicious" correctly and correctly identify the non-"suspicious".

You can't go from the accuracy directly to the specificity and sensitivity, since it's a combination of several measurements. The result, though, will be highly dependent on the prevalence of "suspicious" people in their test -- the ratio of how often the thing you're trying to detect actually occurs.

I'm willing to bet that the prevalence they used in their testing is way, way higher than it would be in real life (like 1/4 to 1/2 of the test subjects were "suspicious", while in real life the odds of a random person in an airport being a terrorist is more like 1/1e6 on a bad day). So this would skew the accuracy measurement towards detecting the suspicious and understate the importance of figuring out correctly that someone is not suspicious. The problem is that when you're dealing with something very rare, even if your specificity is very high, the odds that someone you pull out of line because the machine flagged them is in fact innocent is extremely high (it's going to be over 99% chance unless this machine is -very- specific), and if your test methodology doesn't worry as much about specificity, then it's going to be even worse.
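The dependence of "accuracy" on prevalence described above can be made concrete (the 50/50 lab split and the 1-in-a-million airport prevalence are this commenter's guesses, not published figures):

```python
def accuracy(prevalence, sensitivity, specificity):
    # Overall accuracy mixes the two error types, weighted by prevalence.
    return prevalence * sensitivity + (1 - prevalence) * specificity

# On a 50/50 lab population, 78% sensitivity and 78% specificity
# report as "78% accuracy":
print(accuracy(0.5, 0.78, 0.78))

# At an assumed airport prevalence of 1 in a million, those same rates
# mean a flagged person is almost certainly innocent:
prev = 1e-6
true_pos = prev * 0.78
false_pos = (1 - prev) * (1 - 0.78)
print(true_pos / (true_pos + false_pos))  # odds a flag is real: ~3.5e-6
```

In other words, a headline accuracy figure measured on a "suspicious"-heavy test population says almost nothing about how the machine behaves on real crowds.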

That's brilliant! (4, Funny)

Minwee (522556) | more than 5 years ago | (#25121971)

All you need to do now is post signs reminding any potential evil-doers to "act suspicious" and the system will work perfectly.

Re:That's brilliant! (1)

Anonymous Coward | more than 5 years ago | (#25122479)

All the system has to do is flag 100% of people as suspicious, and they will have a 100% hit rate on catching criminals. Everyone else is just a casualty of the system.

Testing procedure (1)

Johnny Mnemonic (176043) | more than 5 years ago | (#25121993)


those told to act suspicious

I am much more interested in how this unit would perform against people with evil intent who were trying to hide it than against people without evil intent who were trying to display it.

At this point, it would be better suited to helping critique a theater performance than actually improving security. It detects and evaluates actors, not real world situations.

Let alone the thought-crime implications. I'll be more worried when it's actually demonstrated to, you know, work as advertised.

Interesting Research (1)

Craqshot (1131645) | more than 5 years ago | (#25122009)

I was visiting the University of Arizona earlier this year, and they have some similar research going on. They were using lasers to measure pulse, breathing, and body temperature. The whole project was involved in deception detection and they had a lot of funding from DHS and other government sources. This article might even be referring to some of their technology.

How about this? (0)

Anonymous Coward | more than 5 years ago | (#25122025)

I am OK with it as long as the following is true: if the system flags an innocent person, the authorities must punish one of their own with at least half of whatever the (now cleared) innocent person would have suffered.

Fancy that: burkas protect civil rights. (4, Interesting)

tjstork (137384) | more than 5 years ago | (#25122027)

If everyone was wearing a burka, there's no way this system would work. It may seem strange, but what right does the public have to know my face?

yeah, that'll work.... (1)

dfm3 (830843) | more than 5 years ago | (#25122029)

Brilliant idea... install them at airport security checkpoints and border crossings. So, let's say I'm stressed and aggravated because I'm late for my flight, or tired of standing in line being shoved around and treated like a criminal (or an animal). Can you say "false positive"?

Deception? Mal intent? (1)

nerdacus (1161321) | more than 5 years ago | (#25122041)

Like if I intend to sneak up on my wife and give her a scare for fun? Or if I know I have a 3.5 ounce toothpaste tube in my bag, 0.5 ounces past the restriction? I wonder how many other forms of innocent "deception" will automatically call in the jackboots?

So they're called Sensors now? (1)

meist3r (1061628) | more than 5 years ago | (#25122045)

I thought they were called "Soldiers".

Good thing about sensors though: They don't aks no quest'ns to 'dem dangerous peepol!

What if a sensor decided you were bad and hit the automatic firing system? Or does a light go on and you're quickly and politely escorted out ... out of the range of human rights?

Maybe you'd just be pissed at someone, but it would still take 12 hours of flight to beat their ass. Can't one even have a violent pen pal anymore? Congrats, you prove to us every day that the terrorists have won.

The new polygraph? Maybe not... (2, Interesting)

mveloso (325617) | more than 5 years ago | (#25122075)

The device relies on the assumption that the physiology of people up to no good may differ from that of normal people.

And that may be true.

However, this'll be much more useful somewhere like an embassy or checkpoint than in an airport. In a sea of potentially hostile people, it's harder to pick out the ones who may actually do something. In a sea of basically docile people, it should be relatively simple to visually pick the nervous ones.

I'm all for it! (2, Funny)

JamesP (688957) | more than 5 years ago | (#25122091)

If it helps nailing Tom Cruise

Re:I'm all for it! (0)

Anonymous Coward | more than 5 years ago | (#25122443)

The Katie Holmes bot that DHS built took care of that already... O, wait...

When they showed it to Dick Cheney... (1)

Alien Being (18488) | more than 5 years ago | (#25122095)

the machine overloaded and took out the power grid.

Will be fun at the airport (4, Insightful)

dbyte (1207524) | more than 5 years ago | (#25122101)

God help the nervous flier :)

Re:Will be fun at the airport (1)

Stypen (720346) | more than 5 years ago | (#25122559)

God help the nervous flier :)

God help the person who just lost his luggage..

everyone's on edge going through airport security (1)

petes_PoV (912422) | more than 5 years ago | (#25122113)

Not just people set on doing something bad (which may "merely" involve stealing your laptop). Everyone worries whether they'll be detained - more so if these completely inadequate machines get rolled out - whether they'll miss their flight, or even if they'll get lost.

If these machines pick up on stress then they'll get near to a 100% hit rate for travellers. Possibly the most serene people in an airport lounge are those who've already accepted their fate and are willingly going to meet their makers shortly after the plane takes off.

Re:everyone's on edge going through airport securi (1)

TheVelvetFlamebait (986083) | more than 5 years ago | (#25122207)

I'm sure that 78% success rate figure was calculated with all that in mind. In fact, with a bit of extra work, perhaps we can lower the chance of you being selected wrongly for a strip search to a mere 1 in 5!

Argh! (1)

Cillian (1003268) | more than 5 years ago | (#25122131)

For the love of god, I have read too many replies which suggest people think this is actually Minority Report-style pre-crime. I would presume they are going to use this device to decide who to investigate further, i.e. at airports, rather than inferior methods of picking people (i.e. race). If they then find evidence against you, hard luck.

Very Reliable! (1)

Javarufus (733962) | more than 5 years ago | (#25122133)

Hey, if we can detect with about 2% assurance that Iraq has WMD's from space and start a war that costs trillions and kills many thousands, what excuse would the government need to explain why they killed a stadium of fans whom they detected, with their new whiz-bang device, that about 65% of them lied about something at work last week and about the number of beers they told their buddy that they've drank since the start of the 2nd quarter?

Holy run-on sentence Batman!

Hey, if you want to detect malfeasance of any kind with 100% accuracy, you need my wife. I can't get away with anything!

Suspicious People? (1)

shellster_dude (1261444) | more than 5 years ago | (#25122135)

So, this doesn't work for sociopaths, or for people who firmly believe in their religious reasons for doing something and thus have no fear? What about people wearing a balaclava or some other sort of head covering? What about the businessman who is in a hurry and has to make a big sale because his job depends on it? This machine is a joke. It will never be as good as a human profiler because it can't infer context. Context is where this machine will really trip up, and yet it is a crucial part of analysis. Another thing the tests overlook: an actor playing a "suspicious" person is going to exaggerate the suspicious behavior, so these tests are of little value in determining the effectiveness of the machine in a real scenario.

Oh, yeah, just great. (1)

oahazmatt (868057) | more than 5 years ago | (#25122141)

Sensors look at facial expressions, body heat and can measure pulse and breathing rate from a distance.

I have severe allergies which affect my breathing, and a faster-than-usual heartbeat due to another medical condition. I also have hyperhidrosis, so I sweat constantly, which would make me look more suspicious in an interrogation room. Oh, and when I start getting antsy, it's not because I'm nervous, it's because of my hypoglycemia and I need to eat.

But at least this new technology will keep me off the streets.

hmm (2, Insightful)

lucky130 (267588) | more than 5 years ago | (#25122171)

I notice both of those success rates are less than 100%. Personally, I don't want to be one of those innocent 20+% that gets harassed.

Bunch of false positives (1)

MouseR (3264) | more than 5 years ago | (#25122223)

Stupid things like this make my blood boil. Watch me not cross the border anymore. Be it by choice or because some dumb-ass program decided I was too pissed (or needing to take a piss) for me to be a Good Citizen.

oh well (1)

BigBadBus (653823) | more than 5 years ago | (#25122235)

....this means no more hot curries for me then :(

I have 100% detection rate (1)

akgooseman (632715) | more than 5 years ago | (#25122245)

in my patent-pending pre-crime detector. My machine flags everyone as a possible criminal. It's win-win for everyone. We're all safer and the DHS gets to interrogate everyone.

I'm also developing a new detector (1)

Xelios (822510) | more than 5 years ago | (#25122261)

It's called the Pants Urination Screening SYstem (PUSSY). Although I haven't finished the tests on volunteers yet, I'm confident that I can achieve at least a 90% pants urination detection rate in those told to piss their pants. This, combined with a study I've done which found terrorists may piss their pants before an attack, would make an effective tool in combating terrorism. One interesting result of my study so far: a surprisingly large number of people who get really, really drunk seem to be terrorists. This can be seen as evidence that the device does in fact work.

Funding please...

Isn't this our gov't? (0)

Anonymous Coward | more than 5 years ago | (#25122327)

Aren't these people supposed to be doing what WE want them to do, implementing processes and laws that WE WANT? Why...WHY are we ALLOWING THIS? PLEASE WAKE UP!!!!!!!!!!!!!!!!!

I guess they are trying but come on... (1)

yoshi_mon (172895) | more than 5 years ago | (#25122347)

I'm not totally opposed to them trying new stuff out but...

- Looking suspicious because you're cheating on someone.
- Looking suspicious because you're trying to blame someone else for a ripping fart.
- Thinking up ways to lie about how you totally read the book for your book-of-the-month club when you really only watched the movie.

I mean, need I go on? We humans, well, us normal humans anyway, not the robots they want us to be, are normally always up to something. Unless some machine can tell that we are, as well as what exactly we are planning, it's going to be useless.

Where is our freedom? (1)

ronz0o (889697) | more than 5 years ago | (#25122377)

I am from the future. Welcome to America, the land and home of free speech. But, if we think we suspect you of doing ANYTHING, you are going to jail to prevent any crime. That's how we, BigBrothers, believe it should be! We get off to security, and making sure that YOU can't live your lives. We want to make sure everyone is legal, and not intending to harm the US. Illegal aliens? Not a problem any more. With space lasers, we zap anyone who is crossing the border. Don't think of going to Canada or Mexico illegally, either. We will zap those trying to commit treason by leaving. We make it so you don't have to worry about security. We take the strain off of you. We think for you. We judge for you. Welcome to America. Please put your face in the scanner.

SARS (0)

Anonymous Coward | more than 5 years ago | (#25122409)

I remember about four or five years ago they had these all over the world, looking for people who could have SARS.

false positives vs false negatives (1, Interesting)

Anonymous Coward | more than 5 years ago | (#25122429)

I feel like I have to bring this example up about once a month what with security news and all, but:

Suppose there's a rare illness that occurs in 1 out of a million people. The test for that illness is 99.9% accurate, meaning that one in a thousand well people will be falsely diagnosed as sick and one in a thousand sick people will be falsely diagnosed as well.

A million people come in for the test. On average, one has the illness. But a full *thousand* are going to test positive. So if this 99.9% accurate test says you're sick, you in fact have a 99.9% chance of being well.

This is a real and well studied problem in medicine (it has a name, but I forget it and if someone knows, please post). I've yet to see any evidence that the problem has been addressed or even acknowledged in the case of mass security screening.

So, even if this crime test is accurate 8 or 9 times out of 10, because most people (of any race or religion) are not criminals or terrorists, the positives are going to be meaningless without further screening. Since "further screening" will almost certainly represent a gross violation of an innocent person's rights, this should be abhorrent to anyone who values a free society.

This of course raises the question: if it's so useless, why bother? I can think of two reasons, one cynical and one very cynical.

The first: a well connected contractor is making the device. As our government gets more and more privatized, this kind of thing is running rampant.

The second: an authoritarian police state loves a pretense to hold anyone they choose under suspicion.

The (even more cynical) reality is probably a combination of the two.
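The base-rate arithmetic above is easy to check for yourself; here's a minimal Python sketch using the same numbers as the example (1-in-a-million prevalence, a test that's wrong 1 time in 1000 in both directions):

```python
# False-positive paradox: 1-in-a-million illness, 99.9% accurate test.
prevalence = 1 / 1_000_000
error_rate = 0.001            # 1 in 1000 misdiagnosed, in either direction
population = 1_000_000

true_positives = population * prevalence * (1 - error_rate)    # ~1 sick person caught
false_positives = population * (1 - prevalence) * error_rate   # ~1000 well people flagged

# Chance that someone who tests positive is actually sick
# (what medicine calls the positive predictive value):
ppv = true_positives / (true_positives + false_positives)
print(f"flagged: {true_positives + false_positives:.0f}")
print(f"P(sick | positive) = {ppv:.4%}")
```

Run it and the positive predictive value comes out at roughly 0.1%, i.e. a positive result still means you're about 99.9% likely to be well, exactly as the parent says.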

Cory Doctorow said it best (1)

Guysmiley777 (880063) | more than 5 years ago | (#25122453)

Very apt excerpt from Little Brother [craphound.com] :

If you ever decide to do something as stupid as build an automatic terrorism detector, here's a math lesson you need to learn first. It's called "the paradox of the false positive," and it's a doozy.

Say you have a new disease, called Super-AIDS. Only one in a million people gets Super-AIDS. You develop a test for Super-AIDS that's 99 percent accurate. I mean, 99 percent of the time, it gives the correct result -- true if the subject is infected, and false if the subject is healthy. You give the test to a million people.

One in a million people have Super-AIDS. One in a hundred people that you test will generate a "false positive" -- the test will say he has Super-AIDS even though he doesn't. That's what "99 percent accurate" means: one percent wrong.

What's one percent of one million?

1,000,000/100 = 10,000

One in a million people has Super-AIDS. If you test a million random people, you'll probably only find one case of real Super-AIDS. But your test won't identify *one* person as having Super-AIDS. It will identify *10,000* people as having it.

Your 99 percent accurate test will perform with 99.99 percent *inaccuracy*.

That's the paradox of the false positive. When you try to find something really rare, your test's accuracy has to match the rarity of the thing you're looking for. If you're trying to point at a single pixel on your screen, a sharp pencil is a good pointer: the pencil-tip is a lot smaller (more accurate) than the pixels. But a pencil-tip is no good at pointing at a single *atom* in your screen. For that, you need a pointer -- a test -- that's one atom wide or less at the tip.

This is the paradox of the false positive, and here's how it applies to terrorism:

Terrorists are really rare. In a city of twenty million like New York, there might be one or two terrorists. Maybe ten of them at the outside. 10/20,000,000 = 0.00005 percent. One twenty-thousandth of a percent.

That's pretty rare all right. Now, say you've got some software that can sift through all the bank-records, or toll-pass records, or public transit records, or phone-call records in the city and catch terrorists 99 percent of the time.

In a pool of twenty million people, a 99 percent accurate test will identify two hundred thousand people as being terrorists. But only ten of them are terrorists. To catch ten bad guys, you have to haul in and investigate two hundred thousand innocent people.

Guess what? Terrorism tests aren't anywhere *close* to 99 percent accurate. More like 60 percent accurate. Even 40 percent accurate, sometimes.

What this all meant was that the Department of Homeland Security had set itself up to fail badly. They were trying to spot incredibly rare events -- a person is a terrorist -- with inaccurate systems.

Is it any wonder we were able to make such a mess?
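The excerpt's New York arithmetic can be reproduced in a couple of lines; a quick Python sketch with its stated numbers (a city of 20 million, 10 terrorists, a detector that flags 99 percent of terrorists and wrongly flags 1 percent of everyone else):

```python
# Doctorow's NYC example: what a "99 percent accurate" terrorism
# detector actually produces in a city of 20 million.
population = 20_000_000
terrorists = 10
accuracy = 0.99   # catches 99% of terrorists, clears 99% of innocents

flagged_innocent = (population - terrorists) * (1 - accuracy)
flagged_guilty = terrorists * accuracy

print(f"innocents hauled in: {flagged_innocent:,.0f}")
print(f"terrorists caught:   {flagged_guilty:.0f}")
```

That's about two hundred thousand innocent people investigated to catch roughly ten actual terrorists, matching the numbers in the excerpt.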

Guilty until innocent (3, Insightful)

theverylastperson (1208224) | more than 5 years ago | (#25122469)

Awesome, now we have a great tool to accuse people with. How can anything with an accuracy of 78% be worth using? On a grading scale it's a C+. How many innocent people (22%) will be caught up in this mess? If the government is trying to create a rebellion by the people, then this is a perfect method.

How about hiring intelligent guards? Or people with common sense?

If we spent 10% of what we spend on this kind of crap on actually solving the real problems we face, then we might actually get somewhere. But as long as we live in this ultra-paranoid world filled full of invisible terrorists, then we'll never get the chance to overcome the real problems. What a shame and what a waste.

Acting fools the sensors? (1)

zotz (3951) | more than 5 years ago | (#25122519)

In trials using 140 volunteers those told to ACT!!! suspicious...

So wait, people who were only acting could fool the sensors into thinking they had bad intent when in fact, they did not have bad intent... And... We are to believe that acting like you have good intent when you don't can't possibly work?

all the best,

drew

Racial Profiling - Lawsuits (1)

McFly69 (603543) | more than 5 years ago | (#25122521)

Racial profiling comes to my mind immediately, along with some lawsuits. Obviously it is a system that will have some sort of AI based on certain parameters, and as a result it can create many false positives. For example, a Caucasian male walking awkwardly (he has a bad leg) with sunglasses on can trigger the system. Or perhaps an African-American male in a black hoodie (because he is cold), in a middle/upper-class neighborhood (taking a walk around his home), can trigger the system. Or perhaps an older lady (a granny) with a funny grin on her face (bad eyesight), struggling to open the car door with the wrong keys, can be seen as breaking in because of her high body temperature (frustration), while the multiple keys can indicate they are tools.

Just a bad idea... Poor granny might get locked up, and I like hoodies!

It's not just Big Brother watching you... (1)

No Grand Plan (975972) | more than 5 years ago | (#25122525)

George Orwell is likely spinning in his grave.

This just in! (1)

Phleg (523632) | more than 5 years ago | (#25122541)

A new scientific breakthrough allows law enforcement officials to automatically detect people trying to look suspicious! News at 11...

Next up: (0)

Anonymous Coward | more than 5 years ago | (#25122555)

FAST, also known as voter registration list.
Are your papers in order?
