
Video Surveillance System That Reasons Like a Human

ScuttleMonkey posted more than 4 years ago | from the robotic-overlords dept.


An anonymous reader writes "BRS Labs has created a technology it calls Behavioral Analytics which uses cognitive reasoning, much like the human brain, to process visual data and to identify criminal and terroristic activities. Built on a framework of cognitive learning engines and computer vision, AISight provides an automated and scalable surveillance solution that analyzes behavioral patterns, activities and scene content without the need for human training, setup, or programming."


143 comments

Of course (4, Insightful)

sopssa (1498795) | more than 4 years ago | (#29497305)

Nothing can go wrong!

Re:Of course (0)

Anonymous Coward | more than 4 years ago | (#29497573)

With ever better and smarter AI, ever more cameras, and ever-improving computer science in general, so much for the views of those who still don't believe Big Brother is possible.

Re:Of course (2, Funny)

Anonymous Coward | more than 4 years ago | (#29497659)

Nothing can go wrong!

Monday September 21, 6:08 PM > System Pawn, ID:1498795, "sopssa" making sarcastic joke regarding system. Execute Order 66. Will be a huge success.

Re:Of course (1)

GrumblyStuff (870046) | more than 4 years ago | (#29497885)

Wow! It sounds almost too good to be true!

Wait, what's that you say?

Re:Of course (5, Insightful)

bugi (8479) | more than 4 years ago | (#29498411)

The best of both worlds! Human stupidity plus the compassion of a machine.

Re:Of course (1)

Nein Volts (1635979) | more than 4 years ago | (#29498665)

I can see it now! It hits the covers of all the magazines! 'Robot with synthetic brain gets angry over mistake and kills 50!'

Re:Of course.. (1)

Bob_Who (926234) | more than 4 years ago | (#29498849)

Fire all the cops and judges, convert them all to prison guards, and we'll make the city a jail.

Re:Of course (1)

sevenfootchicken (1268690) | more than 4 years ago | (#29499469)

Nothing can go wrong!

Isn't that what they said about Skynet?

Proof? (3, Interesting)

FlyingBishop (1293238) | more than 4 years ago | (#29497313)

Source or it doesn't work.

Re:Proof? (1, Funny)

Anonymous Coward | more than 4 years ago | (#29497719)

Source? Come on man, CAMERAS! Tits or doesn't work.

Re:Proof? (4, Insightful)

Jurily (900488) | more than 4 years ago | (#29498925)

Mod parent up. Said AI first needs to distinguish between "activity" and "the wind blew a leaf across the screen". Then it needs to distinguish between "lights a cigarette" and "lights the fuse on dynamite".

So, if it already does all that, just one more question: how do you define "criminal and terrorist activities" programmatically when not even the law is clear? Even shooting people can be a non-criminal act.

Re:Proof? (3, Insightful)

TheWingThing (686802) | more than 4 years ago | (#29498985)

It must first differentiate between "time flies like an arrow" and "fruit flies like a banana". Then, and only then, can the system be trusted.

Re:Proof? (3, Funny)

beav007 (746004) | more than 4 years ago | (#29499145)

What I want to know is: whose cognitive reasoning is it based on, exactly?

Male?

Ooh, low cut top! Zoom zoom zoom!
Wait, the wind is picking up! Initiate scan for pleated skirts!

Or female?

Ooh, there's a sale over there! *zoom* Do they have my colour?
Wait, that handbag's a knockoff! *Dials DHS*

Bit more info - can it be as good as humans? (4, Interesting)

xmas2003 (739875) | more than 4 years ago | (#29497317)

A little more info from the BRS Labs website: [brslabs.com]
"The system takes the input from existing video security cameras (no need to change equipment); recognizes and identifies the objects in each frame and passes that data to its Machine Learning Engine. There, the system 'learns' what activity is normal for each unique area viewed by each camera. It then stores these LEARNED memories, much the same way the human brain does, and refers back to them with any and all future activities observed by the camera. If any behavior falls outside of the norm, alerts are generated."

Sounds impressive, but will the algorithms be sophisticated enough to watch grass grow [watching-grass-grow.com] and realize that it's normal behavior for the garbage truck to come by weekly [watching-grass-grow.com] ... but still send an alarm when a burglar steals your stuff! [grisby.org]
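
BRS Labs publishes no algorithmic details, but the "learn the norm, alert on deviation" loop that quote describes can at least be sketched. Everything below (class name, grid size, learning rate, threshold) is invented for illustration; it is a toy, not their system:

    # Toy per-camera "normality" model: keep a running mean/variance of
    # motion energy per grid cell, alert when a frame deviates too far.
    import numpy as np

    class NormalityModel:
        def __init__(self, grid=(8, 8), alpha=0.01, threshold=4.0):
            self.mean = np.zeros(grid)   # learned "normal" motion per cell
            self.var = np.ones(grid)     # learned variability per cell
            self.alpha = alpha           # small alpha = long memory
            self.threshold = threshold   # z-score that triggers an alert

        def update(self, motion):
            # motion: (8, 8) array of motion energy for the current frame
            z = np.abs(motion - self.mean) / np.sqrt(self.var)
            # Keep learning, so the weekly garbage truck becomes "normal".
            self.mean += self.alpha * (motion - self.mean)
            self.var += self.alpha * ((motion - self.mean) ** 2 - self.var)
            return bool(z.max() > self.threshold)   # True = raise an alert

In a sketch like this, the weekly truck on the grass-grow cam stops alerting after a few passes, while a one-off burglar should still spike the z-score.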

Re:Bit more info - can it be as good as humans? (3, Insightful)

RightSaidFred99 (874576) | more than 4 years ago | (#29497375)

My guess is that it applies a few simple heuristics to analyze the behavior, and that the real trick is identifying the behavior in the first place.

Example: In an alley behind a hotel people frequently walk out a door, put something in a container, and walk back in. This becomes "normal". Then someone goes out back and starts smoking. Whoops, wtf is this! Alert, alert. OK, so this gets flagged as OK a few times. The system decides it's OK. However, when two people hold a third at gunpoint and linger in an area of the alley not usually used for smoking, this would now trigger as abnormal.

Another thing it might notice is the same person coming back to the front of a convenience store, waiting a minute, then leaving, then coming back again. Most people only walk in, walk out - this is abnormal.

So it won't tell you someone is burglarizing you, but it might focus your attention on a camera where something could be happening. I'd assume it would get better over time as things were flagged "ok" or "not ok", but at best it would provide some simple pre-filtering to focus human attention on scenes that are slightly more likely to be "interesting".
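
The flagging scheme described above, reduced to a skeleton (all names and the rarity cutoff are hypothetical, just to make the smoker/gunpoint distinction concrete):

    # Guess at a "rare pattern = alert" heuristic with operator feedback.
    from collections import Counter

    class EventFilter:
        def __init__(self, min_count=5):
            self.seen = Counter()
            self.whitelist = set()          # events a human marked "ok"
            self.min_count = min_count

        def observe(self, signature):
            # signature: hashable event description, e.g. ('loiter', 'zone3')
            self.seen[signature] += 1
            if signature in self.whitelist:
                return False                # operator already cleared this
            return self.seen[signature] < self.min_count  # rare -> flag it

        def mark_ok(self, signature):
            self.whitelist.add(signature)   # feedback: stop alerting on this

The smoker gets whitelisted after a few flags; the gunpoint scene, being a never-seen signature, keeps alerting.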

Re:Bit more info - can it be as good as humans? (2, Insightful)

mhajicek (1582795) | more than 4 years ago | (#29497503)

So it's a video Zone Alarm. I imagine the first stretch of operation would be rather labor-intensive.

Re:Bit more info - can it be as good as humans? (1)

xmas2003 (739875) | more than 4 years ago | (#29497789)

Yep - I also wonder if the underlying code shares some source with Motion [lavrsen.dk], which is what Duncan used to catch the perp.

Re:Bit more info - can it be as good as humans? (2, Interesting)

Brian Gordon (987471) | more than 4 years ago | (#29498639)

No way that it's as complex as that. My guess is that it gets used to linear motion like cars driving by and develops a tolerance for humans walking by on the way to work, but when there's lots of irregular motion in different directions (i.e. not just from one side of the frame to the other) there's a good chance something unusual is happening.

Your system lacks the element of "no human training" mentioned in the summary.
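
The direction-tolerance idea above can be sketched with a smoothed histogram of motion directions; the numbers here are invented and this is a guess at the approach, not anyone's actual system:

    # Learn the usual flow direction; flag frames whose motion directions
    # are unusually "surprising" given what this camera normally sees.
    import math

    class FlowWatcher:
        def __init__(self, bins=8, alpha=0.01, surprise=3.0):
            self.hist = [1.0] * bins        # smoothed direction histogram
            self.alpha = alpha
            self.surprise = surprise

        def observe(self, angles):
            # angles: motion directions (radians) detected this frame
            total = sum(self.hist)
            score = 0.0
            for a in angles:
                frac = (a % (2 * math.pi)) / (2 * math.pi)
                b = min(int(frac * len(self.hist)), len(self.hist) - 1)
                p = self.hist[b] / total
                score += -math.log(p)       # rare directions score high
                self.hist[b] += self.alpha  # slowly learn the usual flow
            return bool(angles) and score / len(angles) > self.surprise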

Re:Bit more info - can it be as good as humans? (1)

RightSaidFred99 (874576) | more than 4 years ago | (#29499405)

Not needing human training to function and functioning much better with human training are two separate things. Just like speech recognition: it will work without training, but there are still cases where it needs training.

I didn't think what I described was that crazily complex. If the camera is stationary and you line everything up on a grid and do edge detection to find outlines of people, you can probably implement something like this. I'm just pulling stuff out of my ass, though; it's certainly not my field.
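
For what it's worth, the edge-detection speculation above isn't crazy; a bare-bones version is a few lines of OpenCV (this assumes OpenCV 4, and the thresholds and the "person-sized" test are arbitrary placeholders):

    # Rough "find person-sized outlines on a stationary camera" sketch.
    import cv2

    def person_sized_outlines(frame, min_area=500):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 100, 200)   # edge map of the frame
        # OpenCV 4 returns (contours, hierarchy).
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        boxes = []
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            # Keep blobs big enough, and taller than wide, to plausibly
            # be a standing person.
            if w * h > min_area and h > w:
                boxes.append((x, y, w, h))
        return boxes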

Re:Bit more info - can it be as good as humans? (1)

Jurily (900488) | more than 4 years ago | (#29499003)

Another thing it might notice is the same person coming back to the front of a convenience store, waiting a minute, then leaving, then coming back again. Most people only walk in, walk out - this is abnormal.

So now I'm verboten to look at the lottery numbers on the door each Sunday morning? Either you flag every alarm as OK or people will get pissed off that you question them about perfectly legal activities.

This might just be the thing needed to finally get the cameras off the streets.

Re:Bit more info - can it be as good as humans? (1)

Memroid (898199) | more than 4 years ago | (#29499135)

Alert, alert. OK, so this gets flagged as OK a few times. The system decides it's OK.

Doesn't this contradict what the summary says? "without the need for human training"

what activity is normal for each unique area (1)

nurb432 (527695) | more than 4 years ago | (#29497383)

So if it watches Gary Indiana, murder and mayhem will be programmed in as normal?

Re:Bit more info - can it be as good as humans? (1)

Jane Q. Public (1010737) | more than 4 years ago | (#29497581)

Does that which we call pattern recognition, by any other name, stink as badly?

Re:Bit more info - can it be as good as humans? (1)

Jane Q. Public (1010737) | more than 4 years ago | (#29497701)

That is to say: I'm not smelling any roses here.

Re:Bit more info - can it be as good as humans? (5, Funny)

droopycom (470921) | more than 4 years ago | (#29497845)

If it really thinks like a human, the main feature will be automatically uploading videos of people having sex in elevators to the web.

that was just a press release (1, Informative)

Anonymous Coward | more than 4 years ago | (#29497319)

That was a press release for the company's product. It has no reliable or interesting information whatsoever.

Photos (1)

Dyinobal (1427207) | more than 4 years ago | (#29497325)

So I guess this means that the camera is going to harass people taking photos now?

Re:Photos (2, Insightful)

The Archon V2.0 (782634) | more than 4 years ago | (#29497405)

So I guess this means that the camera is going to harass people taking photos now?

Even better. It will call some rentacops and tell them that there's "suspected terroristic activity" taking place, and suddenly a tourist will get a taser up some orifice because "the computer" already labeled him a terrorist and therefore Osama's second in command.

I'll know it when I see it. (4, Insightful)

Jason Pollock (45537) | more than 4 years ago | (#29497337)

It's a press release pretending to be journalism.

If it doesn't need training, how does it define "terroristic activity"? Is it the "I'll know it when I see it" definition?

The article seems to indicate it works like a Bayesian filter on the video - pointing out things that aren't typical for the camera.

Much like any automated system that is supposed to filter out false positives, it is probably pretty easy to train either the operators or the system itself to throttle back the sensitivity to a point where it ignores everything.
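
The Bayesian-filter analogy - and the throttling worry - can be made concrete. In a sketch like the following (class name, features, and threshold all invented), one raised threshold silences everything:

    # Spam-filter-style novelty score over scene features, with the
    # sensitivity knob the parent is worried about.
    import math
    from collections import Counter

    class SceneFilter:
        def __init__(self, threshold=8.0):
            self.counts = Counter()
            self.total = 0
            self.threshold = threshold      # crank this up, alerts vanish

        def score(self, features):
            # features: e.g. ['person', 'night', 'loitering']
            s = 0.0
            for f in features:
                p = (self.counts[f] + 1) / (self.total + 2)  # smoothed
                s += -math.log(p)           # unseen features score high
            return s

        def observe(self, features):
            alert = self.score(features) > self.threshold
            for f in features:              # learn after scoring
                self.counts[f] += 1
            self.total += 1
            return alert

Nothing stops an operator drowning in alerts from setting the threshold absurdly high, after which the system "works" perfectly quietly.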

Re:I'll know it when I see it. (0)

Anonymous Coward | more than 4 years ago | (#29497655)

inputted
getted
terroristic

Which of these tragically stupid sounding words is not an actual word?

Numenta? (1)

Louis Savain (65843) | more than 4 years ago | (#29497703)

I wonder if it's based on Numenta [numenta.com]'s Bayesian HTM (hierarchical temporal memory). My understanding of neuro-like learning systems is that, unless its knowledge base is organized hierarchically like a tree, it could not possibly do the things its promoters are claiming for it.

Re:Numenta? (0)

Anonymous Coward | more than 4 years ago | (#29497795)

I doubt it. Numenta is based on patented technology and they claim that theirs is patented as well. Also, Numenta vision demos are not very impressive.

Re:I'll know it when I see it. (0)

Anonymous Coward | more than 4 years ago | (#29498047)

If it doesn't need training, how does it define "terroristic activity"? Is it the "I'll know it when I see it" definition?

According to the article it does autonomous training. It probably doesn't define "terroristic activity" in any way, but instead classifies normal and abnormal situations and behaviours. From the descriptions given in the article and on the BRS Labs website (see the previous post #29497317), it is probably a set of separate neural nets, such that for each camera there is a separately evaluated and trained net. There are probably also some higher-level nets working to form the "hypocepts", as they call them.

Re:I'll know it when I see it. (1)

randy of the redwood (1565519) | more than 4 years ago | (#29498075)

Can we please see some examples of pornography detected by this system?

That seems to be something people can identify on an "I'll know it when I see it" basis. At least Supreme Court justices can...

Re:I'll know it when I see it. (1, Informative)

Anonymous Coward | more than 4 years ago | (#29498137)

With good heuristics, some Bayesian analysis (is there an object or not?), or neural nets... I can actually imagine a lot of possibilities here. Suppose I've got a gate at an airport - traffic should all be going in one direction. Anything going the other way - anomaly. I could imagine ML systems picking that up fairly easily.

Similarly, if I've got a physically secured compound with double fences (the first fence is for screening - the second is your secure perimeter), foot traffic in the secure area should be predictable, on a schedule, and follow certain rates and patterns. Anything outside the pattern would be worth throwing a red box around and drawing human attention to. Even if it didn't get a "time" input, the presence of motion in unusual planes, or of non-human shapes, might be a viable alert.

Detecting terrorism... okay... that seems funky. But I can definitely imagine that with a little bit of context you could very easily rig a good camera system up to alert on unusual events, if the light levels were handled correctly.
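
The one-way-gate case really is almost trivially codable. A sketch, with a made-up coordinate convention and "allowed" direction:

    # Flag a track whose net displacement opposes the allowed direction.
    def wrong_way(track, allowed=(1, 0)):
        # track: list of (x, y) positions for one person over time
        (x0, y0), (x1, y1) = track[0], track[-1]
        dx, dy = x1 - x0, y1 - y0
        # Negative dot product = net movement against the permitted flow.
        return dx * allowed[0] + dy * allowed[1] < 0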

Re:I'll know it when I see it. (1)

Jason Pollock (45537) | more than 4 years ago | (#29498211)

The trick is to train it to ignore you.

So, if you want to enter an area with a backpack, you start walking in with a hump, and make the hump bigger with every entry. Better yet, give out a bunch of free backpacks (in increasing numbers) over a period.

A human operator would go "WTF"; a machine would simply recognise it as normal and increase the threshold.

You want to walk back through a door? Start by looking over your shoulder as you walk through it normally.

That fence? Add an automated fence wiggler. If the fence goes off every night, they'll turn the sensitivity down in short order.

It all depends on how patient the attacker is.

Re:I'll know it when I see it. (0)

Anonymous Coward | more than 4 years ago | (#29498519)

It's a press release pretending to be journalism.

In this day and age? When Presidents can get away with "Won the war" speeches given on command centers and battleships decked out by Hollywood set designers? Surely not.

It Comforts Me To Know.... (4, Funny)

BJ_Covert_Action (1499847) | more than 4 years ago | (#29497341)

...that somewhere else in the world, there is a young, badass mother fighting off robots from the future that were designed to look like my Governor in a heroic attempt to destroy this new technology along with her scrappy, but as-of-yet slightly immature son....

At least, I think that's where we are in the time-line right?

It's a lie (3, Insightful)

blhack (921171) | more than 4 years ago | (#29497349)

The "machine learning engine" is a "datacenter" (warehouse) full of cheap African laborers who are all watching the cameras.

(this is a joke, it just isn't funny, and it is meant to illustrate a point. See the next line):
God/nature/FSM/evolution/al gore/$deity has done a pretty damn good job at building our brains, why are we trying to reinvent that wheel in a computer?

 

Re:It's a lie (1, Informative)

Anonymous Coward | more than 4 years ago | (#29497485)

So it's a subsidiary of Spinvox [bbc.co.uk] then?

Re:It's a lie (1)

Ponga (934481) | more than 4 years ago | (#29497769)

God/nature/FSM/evolution/al gore/$deity has done a pretty damn good job at building our brains, why are we trying to reinvent that wheel in a computer?

We're lazy.

Next!

Re:It's a lie (3, Funny)

evanbd (210358) | more than 4 years ago | (#29498007)

The "machine learning engine" is a "datacenter" (warehouse) full of cheap African laborers who are all watching the cameras.

(this is a joke, it just isn't funny, and it is meant to illustrate a point. See the next line): God/nature/FSM/evolution/al gore/$deity has done a pretty damn good job at building our brains, why are we trying to reinvent that wheel in a computer?

Because the owners of those brains get all whiny when you try to stick them in jars and make them solve the problems you want to solve, rather than sitting around watching porn? Really, sticking a bunch of brains in a 19" rack is harder than you'd think.

Re:It's a lie (1)

FlyingBishop (1293238) | more than 4 years ago | (#29498599)

Actually, it's a lot easier than you think, it's just hard to get them to do anything.

Re:It's a lie (1)

dissy (172727) | more than 4 years ago | (#29499303)

God/nature/FSM/evolution/al gore/$deity has done a pretty damn good job at building our brains, why are we trying to reinvent that wheel in a computer?

Because one could imagine that if we actually do have the ability to reinvent a mind, we might also be able to improve upon it. And we will not know whether we can create a mind until we try to do so.

If that is possible, then that better mind could arguably invent a mind better than itself, and that much better than ours. Humankind would no longer be the bottleneck of technological achievement.

Now if we could also just manage not to be stupid as usual and piss those minds off, they might even play nice and share their awesome toys with us :P

Let me explain why... (0)

Anonymous Coward | more than 4 years ago | (#29499399)

"God/nature/FSM/evolution/al gore/$deity has done a pretty damn good job at building our brains, why are we trying to reinvent that wheel in a computer?"

Because nobody likes shitty, repetitive, unfulfilling, degrading work. Now I recommend you look up the definition of "Weak AI". You may be able to find it in an encyclopedia in a library near you, or maybe in your house, if for some reason you disapprove of the not-evil corporation that shall remain nameless forcing hundreds of teraflops' worth of enslaved magical pixies (that is a lot of pixies, oh my) locked in "servers" (steel boxes) to do the thinking for you.

Panopticon, you're doing it wrong. (1)

w0mprat (1317953) | more than 4 years ago | (#29497365)

Eagle Eye is not a blueprint for your surveillance computers. Thanks.

Re:Panopticon, you're doing it wrong. (1)

HungryHobo (1314109) | more than 4 years ago | (#29498281)

I thought it was a pretty awesome AI; we just need to make sure that when we build it, we don't feed it an ancient document written by revolutionaries, and instead have programmers write it some "no murdering your admins" rules.

yes, but... (3, Funny)

gandhi_2 (1108023) | more than 4 years ago | (#29497379)

...does it run racial profiling?

Yes, absolutely (0)

Anonymous Coward | more than 4 years ago | (#29497623)

From the description, and provided it is smart enough to identify the race of an individual, it will sound an alert if a white person enters a monitored area normally only inhabited by non-whites, and vice versa. Gee, now we're advanced enough to build our racism into our computers. Wonderful.

Re:yes, but... (1)

Jane Q. Public (1010737) | more than 4 years ago | (#29497685)

Haha, I wonder if a young "person of color", acting like a typical inner-city person of color, would get flagged on cameras in a white neighborhood because he "wasn't acting normally"?

I bet that in fact he would.

Re:yes, but... (1)

kumanopuusan (698669) | more than 4 years ago | (#29499005)

As long as detection is based on behavior and not skin color, there's nothing wrong with it.

If "acting like a young person of color" involves trespassing or loitering the system should flag it just as readily as anything else. Assuming that you're talking about legal behaviors, again, I don't see a problem. This system doesn't know anything about race, and is perfectly ignorant of it. This is an example of a "color-blind" system. There are people who claim to want a "color-blind" society, yet they always seem to want exceptions.

Ocean's Thirteen that a system like that. (0)

Joe The Dragon (967727) | more than 4 years ago | (#29497399)

Ocean's Thirteen that a system like that.

Re:Ocean's Thirteen that a system like that. (0)

BJ_Covert_Action (1499847) | more than 4 years ago | (#29497437)

Ocean's Thirteen that a system like that.

Uuuuum, yes?

Was that a question or a statement?

Re:Ocean's Thirteen that a system like that. (2, Funny)

swanzilla (1458281) | more than 4 years ago | (#29497495)

Ocean's Thirteen that a system like that.

Brad Pitt that an actor in that.

Re:Ocean's Thirteen that a system like that. (0)

Anonymous Coward | more than 4 years ago | (#29497611)

*unusual grammar anomaly detected*

Recommend immediate detention of "Joe The Dragon" for torture. Questioning is not recommended at this time.

Re:Ocean's Thirteen that a system like that. (1)

Joe The Dragon (967727) | more than 4 years ago | (#29498563)

so you can a black level player?

Human Intelligence (4, Insightful)

Reason58 (775044) | more than 4 years ago | (#29497421)

What a great way to absolve any personal responsibility. Detained wrongfully? Not our fault, the machine said you were moving like a terrorist.

Re:Human Intelligence (1)

Reason58 (775044) | more than 4 years ago | (#29497433)

Also, I wonder how well these systems will handle contextual clues that people pick up on automatically? Is that person moving in a suspicious manner because they are a terrorist, or because they are just carrying some heavy bags? Are they going to blow the place up, or are they just rushing to catch up with someone?

Re:Human Intelligence (3, Insightful)

radtea (464814) | more than 4 years ago | (#29498189)

Also, I wonder how well these systems will handle contextual clues that people pick up on automatically?

"Contextual clues" like a dark-skinned guy in London rushing to catch the Tube wearing a ski jacket on a warmish day?

Those are the kind of "contextual clues" that people use all the time to make lethal misjudgements, and in the case at hand resulted in a completely innocent Brazilian who was legally in Britain going legally about his legal business being murdered by police.

Given how badly humans are known empirically to suck at making these kinds of judgments only an arrogant idiot would think of programming a machine to emulate us. But of course, arrogant idiots are incapable of adjusting their beliefs in response to empirical data, so they probably aren't even aware of how badly they suck at this.

Re:Human Intelligence (0)

Anonymous Coward | more than 4 years ago | (#29497551)

What a great way to absolve any personal responsibility. Detained wrongfully? Not our fault, the machine said you were moving like a terrorist.

They do claim it is as smart as a human. Now think about how smart your average security guard is.

"You have an insulin needle, therefore you look like a terrorist."
"Your skin is a little dark, therefore you look like a terrorist."
"You won't show me your tits, therefore you look like a terrorist."

Re:Human Intelligence (0)

Anonymous Coward | more than 4 years ago | (#29497729)

What a great way to absolve any personal responsibility. Detained wrongfully? Not our fault, the machine said you were moving like a terrorist.

That sounds like something a terrorist would probably say.

Re:Human Intelligence (1, Insightful)

Anonymous Coward | more than 4 years ago | (#29497837)

Well, how's "our software" different from "our training", "our briefing", etc? I mean police are supposed to follow detailed regulations, not act as judges. It's only an officer's 'fault' right now if they don't follow regs.

Re:Human Intelligence (2, Funny)

LifesABeach (234436) | more than 4 years ago | (#29497871)

So while facing away from the camera, I see what looks like a quarter, and while bending over, I have this irresistible urge to scratch myself where my back pocket is. At least that's what the camera will show, on CNN news and, if I have my way, YouTube also. Convincing my wife that that's all it was is going to be rough.

Fairness? (0)

Anonymous Coward | more than 4 years ago | (#29497431)

How is this fair to those who have a complex that makes them appear to act criminally, even though they are trying their hardest to act normal and not commit crime?

Scary (2, Insightful)

celibate for life (1639541) | more than 4 years ago | (#29497435)

Human judgment isn't accurate enough to distinguish between an actual terrorist and someone who merely looks like one. Why would anyone expect good results from a machine emulating a judgment that isn't reliable in the first place?

False positive and false negative readings (4, Insightful)

mjensen (118105) | more than 4 years ago | (#29497473)

Much like detecting terrorists by facial recognition, this is vaporware until they publish some numbers.

I once had someone misplace a sales call to me, proud that his facial recognition system was 70% accurate. He had no idea how much of a pain in the ass his system is when it's wrong, and for the airport security business he was trying to get, 90% accuracy is considered terrible.

Wow, just like a human! (0)

Anonymous Coward | more than 4 years ago | (#29497487)

Thank God it reasons like a human does. As we know, humans have a 0% false positive rate at identifying potential terrorists. Now, not only can we identify old ladies in wheelchairs with oxygen tanks as the terrorists they are in a completely automated fashion, but we can do it at speeds previously only dreamt of!

Security Cameras that shake you down for donuts (1)

leftie (667677) | more than 4 years ago | (#29497511)

Maybe they shouldn't have used human cops as their behavior model after all.

Reasons like a human (0)

Anonymous Coward | more than 4 years ago | (#29497593)

Surveillance System becomes self-aware, becomes as "efficient" as any other TSA at spotting terrorists, realizes the futility of security theater, quits in disgust, writes a best-selling autobiography.

Sick and tired (2, Insightful)

WillRobinson (159226) | more than 4 years ago | (#29497613)

Really, I am sick and tired of the surveillance realm. If anybody really wants to do something nefarious they will make sure the cameras don't work - simply pull them down, spray them with paint, or whatever. The authorities will not come running. After-the-fact usage is good, but it really doesn't stop any crime, even random ones. We are the ones funding this and we do not even have a say in it.

So, (0, Troll)

supersloshy (1273442) | more than 4 years ago | (#29497637)

The whole point of this is:

1. Sell to customers who blindly trust in it.
2. Fail to detect anything on many an occasion because it most likely isn't perfect.
3. ???
4. PROFIT!!!

Re:So, (1)

emjay88 (1178161) | more than 4 years ago | (#29498487)

I think the order would be more like:

1. Sell to customers who blindly trust in it.
4. PROFIT!!!
3. ???
2. Fail to detect anything on many an occasion because it most likely isn't perfect.

The secret ingredient (1)

russotto (537200) | more than 4 years ago | (#29497645)

The central processor in this thing isn't a computer at all... it's a dead salmon!
 

Didn't happen (1)

dandart (1274360) | more than 4 years ago | (#29497715)

xkcd, source, video, pictures, audio, podcast or it didn't happen.

Boobies! (2, Funny)

pavon (30274) | more than 4 years ago | (#29497725)

So, it instinctively directs the cameras towards the hot women all the time, getting distracted from the important things it should be looking at?

Hopefully not like humans (3, Interesting)

D4C5CE (578304) | more than 4 years ago | (#29497741)

Who, under video surveillance, tend to act rather irresponsibly:
  • Feeling safe(r) when and where they are not, because of the false promise that BB is watching (over) them.
  • Mostly turning a blind eye to crime (and its victims), as the all-seeing eye of BB and/or "someone (else)" will surely take care of it.
  • Having learned from an early age, out of preference falsification, to show only herd mentality in desperate attempts to please the watchmen and be seen to obey "like every other good citizen".
  • In the rare instances of courage, not fleeing insurmountable dangers, out of the feeling that someone has got to be watching and will send backup any moment now.

Interestingly, in Europe, after a series of dreadful incidents on live video, this is finally being debated on the eve of general elections: http://www.piratenpartei.de/node/920/29268#comment-29268 [piratenpartei.de] - because at the other end of the line, in a situation room (that may be on the next floor or in the next station, and yet too far away), officers will have to watch events unfold and wish in vain to be out there with a gun again (or to have sufficient forces to dispatch), e.g. to stop an attacker they can only videotape and helplessly watch wreak havoc on screen.

Fun in the UK (1)

RayMarron (657336) | more than 4 years ago | (#29497779)

I can just see the kids in the UK figuring out what kind of innocent activity triggers police reactions. When the flood of false-positives starts, the cameras will be back to being as use[ful|less] as they are today.

porno, Porno, PORNO! (1)

SoVeryTired (967875) | more than 4 years ago | (#29497785)

What are the odds these cameras won't be able to distinguish between people fighting and people shagging?

Re:porno, Porno, PORNO! (0)

Anonymous Coward | more than 4 years ago | (#29498155)

Two dogs FIGHTING? He'd have given his right arm to be called Two Dogs Fighting!

Re:porno, Porno, PORNO! (0)

Anonymous Coward | more than 4 years ago | (#29498165)

You know, you're right. Sad, isn't it: security is bound to be wrongly dispatched because normal, decent people beating each other to a pulp with crowbars in a socially completely acceptable way generated a false positive about an omg-pr0n-think-of-the-children sexual act taking place.

Like a human? (1)

RNLockwood (224353) | more than 4 years ago | (#29497811)

"That Reasons Like a Human"

Spends most of its time ogling - wait, there's a honky in this black neighborhood, must be up to no good.

In other words (1)

countertrolling (1585477) | more than 4 years ago | (#29497851)

It will lie.

Please put down your weapon. You have 20 seconds.. (2, Funny)

netsharc (195805) | more than 4 years ago | (#29497867)

to comply!

No one has talked about ED-209 [3dblasphemy.com] yet?

Can we install this in congress? (1)

AnonymousX (1632759) | more than 4 years ago | (#29497941)

Seriously, I would love to see some of this surveillance tech turned back on the government. Install this thing in congress and train it to watch for corruption. It would probably fill up a massive disk array in a couple hours with positive hits.

And it will work as soon as . . . (2, Insightful)

taustin (171655) | more than 4 years ago | (#29498019)

you give us another billion dollars to finish it.

Yeah, right.

Given how rare real incidents are, a 1% error rate will produce a hundred times as many false positives - all innocent people accused of a crime - as real positives. And a 20% error rate is far, far more likely.

Scams like this are the reason why you have to show up at airports three hours early now.

Is it smart enough to know that "terroristic" isn't a real word, at least?
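
The base-rate arithmetic behind that hundred-to-one claim, made explicit under an assumed rate of one real offender per ten thousand people watched:

    # All numbers assumed, purely to illustrate the base-rate effect.
    people = 10_000      # people passing the cameras
    real = 1             # actual offenders among them (assumed base rate)
    fp_rate = 0.01       # the hypothetical 1% error rate

    false_alarms = (people - real) * fp_rate   # ~100 innocents flagged
    print(false_alarms / real)                 # ~100 false alarms per hit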

Re:And it will work as soon as . . . (1)

icebraining (1313345) | more than 4 years ago | (#29498461)

"all innocent people accused of a crime"

Why? Even if the system flags the people as criminals, the operators will still be able to see the recordings, and then decide if it was a crime or not, no?

no need for programming? (0)

Anonymous Coward | more than 4 years ago | (#29498197)

Hmm... actually, this idea is great! Except for one bit of a problem: this will have a profound effect on future human behavior. Since criminal acts/behavior would have to change incredibly fast to keep beating this 'sauron'-bot, every 'illegal behavior' might become extinct... is that really a good thing, given the laws we have today?

Not Like a Human (1)

florescent_beige (608235) | more than 4 years ago | (#29498229)

This system does not operate like a human at all. A human operator does not look for signs of terrorist activities. A human operator looks at boobs.

Reasons like a human... (1)

Nekomusume (956306) | more than 4 years ago | (#29498285)

So, it makes wild guesses, allows others to tell it how it should be thinking and/or bases vital decisions on obviously false beliefs?

Sounds a lot like.... (1)

EMG at MU (1194965) | more than 4 years ago | (#29498357)

I didn't RTFA or This one [slashdot.org] , but it looks similar.

And so when they're filming Terror 2010 (1)

Ralph Spoilsport (673134) | more than 4 years ago | (#29498385)

The entire film crew, actors, and craft people are fried into plasma because they were Acting Like Terrorists.

It's just another bloated pentagon pork project of no real value or merit. We see this all the time. It reminds me of avant garde art, only 10,000x more expensive and twice as pointless. But only half as ugly.

RS

Two things (1, Insightful)

Anonymous Coward | more than 4 years ago | (#29498695)

a) Terroristic? Not just terrorist activities?
b) Terrorism /is/ usually a crime, nothing special.

Will it play chess? (0)

Anonymous Coward | more than 4 years ago | (#29498723)

Colossus: The Forbin Project

I can't wait to see this in action (1)

hyades1 (1149581) | more than 4 years ago | (#29498745)

...uses cognitive reasoning, much like the human brain...

So how long before this thing figures out how to pork a co-worker on lunch break, record the act on one of the cameras it's supposed to be monitoring, and piss in the boss's coffee?

I'm betting about three weeks.

the most important question (0)

Anonymous Coward | more than 4 years ago | (#29498789)

it's great that it will alert you to possible terrorist activity, but will it also alert you to female nudity?

So basically a "red light camera" for people. (1)

tlambert (566799) | more than 4 years ago | (#29499001)

So basically a "red light camera" for people.

And like the red light cameras, there's no way to appeal to human judgement: if the camera says you're guilty, you must be guilty unless you can prove you are innocent (for red light cameras, at least in California, that means proving the amber light lasted less than 4.8 seconds).

I love the presumption of guilt they're slowly building into the system in the name of revenue generation. "The war on terror" has been going on for 8 years now, and they finally arrested _a_ suspect with no concrete plans of timing, location, or a target. I will bet that this system will ultimately be used to automatically issue jay-walking tickets.

-- Terry

Just imagine... (0)

Anonymous Coward | more than 4 years ago | (#29499291)

...what sort of fun we could have if this system was used as the eyes and brains of an armed security robot?

Yes, yes... just like a human... (0)

Anonymous Coward | more than 4 years ago | (#29499397)

It will get bored and fall asleep. I can't wait to see R&D explain this one: "Well, at 9:35 PM, the image became out of focus and aimed at the floor under the camera..."
It will look around for interesting "sights". "Yes sir, the system was fixating on this lady; we deduced that this new camera was bogus and probably a terrorist weapon of some sort..."

There is one rule to remember here: Humans are best at behaving like humans.
