Slashdot: News for Nerds


Robotics Prof Fears Rise of Military Robots

timothy posted more than 4 years ago | from the oh-you-worry-too-much dept.

Robotics 258

An anonymous reader writes "Interesting video interview on silicon.com with Sheffield University's Noel Sharkey, professor of AI & robotics. The white-haired prof talks state-of-the-robot-nation — discussing the most impressive robots currently clanking about on two legs (hello Asimo) and who's doing the most interesting things in UK robotics research (something involving crickets apparently). He also voices concerns about military use of robots — suggesting it won't be long before armies are sending out fully autonomous killing machines."

258 comments

skynet (5, Funny)

el_tedward (1612093) | more than 4 years ago | (#30774966)

okay, where's the tag?

I already bought my copy (1)

Cryacin (657549) | more than 4 years ago | (#30775026)

http://www.amazon.com/How-Survive-Robot-Uprising-Defending/dp/1582345929 [amazon.com]

Disclaimer: I'm only a fan of the book. Quite funny. I'm not affiliated with the author in any way shape or form.

Re:I already bought my copy (0)

Anonymous Coward | more than 4 years ago | (#30775302)

You wasted your money. "It also may be a good idea to carry around a pair of welder's goggles, as lasers will likely be robot attackers' weapons of choice"
I'd love to hear him explain how you're supposed to SEE where you're going while having the shade down.

Re:I already bought my copy (1)

philcheesesteak (1697792) | more than 4 years ago | (#30775588)

your eyes can deceive you, don't trust them. stretch out with your feellllings!

Re:I already bought my copy (1)

PaladinAlpha (645879) | more than 4 years ago | (#30775910)

Yeah, I always wondered how you are supposed to drive your car when holding the brake down keeps it from moving...

Ok, that was a really bad analogy, but I would think the ability to, on short notice, sacrifice the ability to see to gain the ability to not be blinded by robot lasers would be invaluable -- no one is forcing you to run around blind. You get to decide when the tradeoff is worth it.

(Haven't read the book, in all honesty; this just struck me as odd)

Re:I already bought my copy (-1, Troll)

Anonymous Coward | more than 4 years ago | (#30775678)

Post a torrent with the PDF, and then we'll care. That will also negate the need for your affiliation disclaimer.

Re:skynet (1)

Midnight Thunder (17205) | more than 4 years ago | (#30775374)

I am too busy doing R&D of my time machine.

Liberation of Tibet (0)

Anonymous Coward | more than 4 years ago | (#30775628)

Mechanized soldiers can be useful.

Consider the following scenario.

In the early morning of December 7, 2041, one million mechanized soldiers arise from the receding tide and onto the shores of China. The robots march relentlessly westward, killing all Chinese soldiers in their path. The final destination is Tibet.

In the words of that old Negro spiritual, "Tibet, free at last! Buddha, Almighty! Free at last!"

And (2, Insightful)

Weaselmancer (533834) | more than 4 years ago | (#30775864)

Mechanized soldiers can be dangerous, too.
Consider the following scenario.

In the early morning of December 7, 2041, one million mechanized soldiers arise from the receding tide and onto the shores of China. The robots march relentlessly westward, killing all Chinese soldiers in their path. The final destination is Tibet.

Fortunately, the Chinese have had state sponsored hackers for decades now. [slashdot.org] It was a simple matter for these hardened pros to return the bots to their creators, with orders to kill.

"Friendly AI" (4, Interesting)

Baldrson (78598) | more than 4 years ago | (#30774970)

This is one of the things that makes me think the concern about "friendly AI" is blown out of proportion. The problem isn't making sure the AIs are "friendly" -- it's making sure the NI (natural intelligence) owners of the AIs are "friendly".

If half the effort spent on "friendly AI" were spent on examining the ownership of AIs, there might be some hope.

Re:"Friendly AI" (2, Funny)

Ethanol-fueled (1125189) | more than 4 years ago | (#30775090)

Indiscriminately fire at anything that moves. Isn't that Blackwater's* job?

* er, Xe Services LLC. [wikipedia.org]

Re:"Friendly AI" (1)

davester666 (731373) | more than 4 years ago | (#30775158)

So, are you saying that Blackwater hires people who act like robots or just created and sent out the first version of these robots?

Re:"Friendly AI" (2, Insightful)

Ethanol-fueled (1125189) | more than 4 years ago | (#30775232)

That depends on whether they started using steroids before or after they joined Blackwater.

Re:"Friendly AI" (1)

hyperion2010 (1587241) | more than 4 years ago | (#30775290)

Haha, right now we're having a hell of a time getting other human beings to be friendly, and they are quite a bit smarter and more dangerous than any robot. So until the meat is less dangerous than the quartz, I think it is a grand waste of resources to try to make robots "friendly." To tell you the truth, there are certain things that real intelligence should be unfriendly towards.

Re:"Friendly AI" (5, Interesting)

tsm_sf (545316) | more than 4 years ago | (#30775774)

I'm not really worried. I'm sure we'll hear about some 'bot wiping out its own platoon in the next decade, and that will be the end of semi-autonomous killbots.

In fact, I'd be very surprised if this didn't happen in the next ten years. Armed robots are a great idea in that they'd cost less than a fully trained human and are more easily repairable. It's a natural way to go for the military. I also know enough about software development to see that a catastrophic failure is fairly likely, and that the idiot-proof failsafe they'll set up will turn out not to be and won't, respectively.

Re:"Friendly AI" (4, Informative)

CopaceticOpus (965603) | more than 4 years ago | (#30775948)

Yes, it will happen. No, it won't stop development. Depending what you mean by autonomous, it may have already happened [wired.com] .

Re:"Friendly AI" (0)

Anonymous Coward | more than 4 years ago | (#30775982)

Look up the video of an automated Oerlikon 20mm cannon gone berserk.

Found it

http://gizmodo.com/312443/robot-cannon-goes-berserk-kills-9

Re:"Friendly AI" (1)

evil_aar0n (1001515) | more than 4 years ago | (#30775320)

How 'bout "biological intelligence" instead?

And if the US military is involved, is there any hope?

Re:"Friendly AI" (5, Insightful)

jollyreaper (513215) | more than 4 years ago | (#30775422)

This is one of the things that makes me think the concern about "friendly AI" is blown out of proportion. The problem isn't making sure the AIs are "friendly" -- it's making sure the NI (natural intelligence) owners of the AIs are "friendly".
If half the effort spent on "friendly AI" were spent on examining the ownership of AIs, there might be some hope.

That's just it -- human nature never changes. The general can order genocide but it's up to the soldiers to carry it out. The My Lai Massacre was stopped by a helicopter pilot who put his bird between the civilians and "told his crew that if the U.S. soldiers shot at the Vietnamese while he was trying to get them out of the bunker that they were to open fire at these soldiers."

http://en.wikipedia.org/wiki/My_Lai_Massacre [wikipedia.org]

Robots aren't really the issue -- distancing humans from killing is the problem. Not many of us could kill another human being with our bare hands. A knife might make the task easier in the doing but does nothing to ease the psychological horror of it. Guns let you do it at a distance. You don't even have to touch the guy. And buttons make it easier still. It's like you're not even responsible. You could convince young men to fly bombers over enemy cities and rain down incendiaries but I don't think you could convince many of them to kill even one of those civilians with a gun, let alone a knife.

This is the strange distinction we make where we find one form of killing a horrible thing, a war crime, terrorism, and another form of killing is a regrettable accident but there's really no blame to be assigned. A suicide bomber walks into a pizzeria and blows himself up, we lose our minds. An Air Force bomber drops an LGB in a bunker filled with civilians instead of top brass, shit happens. We honestly believe there's a distinction between the two. "Americans didn't set out to kill civilians" war hawks will huff. Yes, but they're still dead, aren't they?

Combat robots are simply continuing this process. Right now there is still a man in the loop to order the attack. Hamas kills Israeli targets with suicide bombs, Israelis deliver high explosives via missile into apartment blocks filled with civilians. They're using American-manufactured anti-tank missiles. I think they're still using TOW. Predator drones use hellfires and their operators are sitting in the continental US while Israeli pilots are a few miles away from the target inside their choppers but really, what's the difference? And what happens when drones are given the authority to engage targets on their own? A soldier with a gun can at least see what he's shooting at. Those in the artillery corps are firing their shells off into the unseen distance and have no idea who they're killing. Not that much different from laying land mines, indiscriminate killing. Psychologically no different from what it would be to set a robot on patrol mode, fire-at-will.

If one extrapolates a little further, the problem of the droid army is similar to that of the tradition of unpopular leaders using corps of foreign mercenaries to protect them from the wrath of the people. The mercenaries did not speak the language, did not know the customs, and were counted as immune to palace intrigues. They could be used against the people, for they would not feel the sympathy for fellow countrymen that a native force might feel. What are droids being used for? Only the people operating them could say for sure. Welcome to the age of the push-button assassination.

"Friendly Evolution" (1, Interesting)

Anonymous Coward | more than 4 years ago | (#30775448)

"That's just it -- human nature never changes."

Of course it can. That's what evolution is for.

Re:"Friendly AI" (4, Insightful)

Daniel Dvorkin (106857) | more than 4 years ago | (#30775632)

Of course, for thousands of years of recorded history, people did kill each other en masse at arm's length. Alexander's soldiers may have been more honest about what they were doing than somebody today sitting in a bunker pressing a button and killing people on the other side of the globe, but they were no less bloodthirsty. So I don't think you can blame the modern willingness to kill on the impersonality created by modern military technology, because the modern willingness to kill looks remarkably like the ancient willingness to kill, just with different tools.

OTOH, I agree with you completely about the absurdity of calling some methods of killing heroic and others evil. Dead is dead.

Re:"Friendly AI" (3, Insightful)

AJWM (19027) | more than 4 years ago | (#30775666)

Heck, for thousands of years people have been killing each other with autonomous -- although not intelligent -- devices. The projectile from a trebuchet or ballista can't be recalled or turned off once it's on its way. And the destructive force of long range munitions has only gotten greater since.

To the extent that battlefield robots can do a better job of telling the combatants from the non-combatants than can lobbed rocks or bombs, then all the better.

Just so long as somebody has an "off" switch.

Re:"Friendly AI" (1)

Maxo-Texas (864189) | more than 4 years ago | (#30775862)

Exactly -- during those times when killing is necessary, those who enjoy it and are skillful at it will excel.

I had a rat problem once.
First I took them away.
Didn't help.
Finally, I took a stick and went to killing them.
At first it was a bit traumatic.
Very quickly it became enjoyable, like a cat-and-mouse hunt.

It was ineffective though, so I went to poison. That stopped the problem.

Animals enjoy playing with and killing other animals. Humans are animals.

In the face of massive propaganda that life is sacred and we shouldn't kill, people still do it and a lot of them enjoy doing it and are good at it.

Re:"Friendly AI" (2, Insightful)

S77IM (1371931) | more than 4 years ago | (#30775648)

Shouldn't this story have an "ED-209" tag?

I agree with you that distancing humans from killing is a big problem. We have that problem now with cruise missiles, cluster bombs, nuke-from-orbit, etc.

But accidental death from robots run amok is not a pleasant thought either. The whole point of an AUTOMATED system is that it runs without a human driving it. This creates the potential -- however slim -- that the system starts killing people without permission.

It sucks that we kill each other deliberately. Let's not create more opportunities for accidents.

  -- 77IM, "Guns don't kill people, robot guns kill people."
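77IM's worry about an automated system "killing people without permission" is, in software terms, an argument for a human-in-the-loop gate. A minimal sketch of the pattern (all class and method names here are hypothetical, not any real fire-control API): nothing executes unless a named human operator has explicitly authorized the request, and every step is logged.

```python
# Hypothetical human-in-the-loop gate: the automated system can
# propose an action, but nothing executes without explicit, logged
# human authorization. Illustrative names only.

from dataclasses import dataclass


@dataclass
class EngagementRequest:
    target_id: str
    authorized: bool = False


class FireControl:
    def __init__(self):
        self.log = []

    def request(self, target_id):
        # The autonomous side may only *propose*.
        req = EngagementRequest(target_id)
        self.log.append(("proposed", target_id))
        return req

    def authorize(self, req, operator):
        # Only a named human operator can flip the flag.
        req.authorized = True
        self.log.append(("authorized", req.target_id, operator))

    def engage(self, req):
        # Unauthorized requests are refused and recorded, not executed.
        if not req.authorized:
            self.log.append(("refused", req.target_id))
            return False
        self.log.append(("engaged", req.target_id))
        return True
```

The point of the sketch is that the default path is refusal: an unattended system can propose all it likes, but the dangerous transition requires a human identity attached to it.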

Re:"Friendly AI" (2, Insightful)

stdarg (456557) | more than 4 years ago | (#30775664)

We honestly believe there's a distinction between the two. "Americans didn't set out to kill civilians" war hawks will huff. Yes, but they're still dead, aren't they?

Are you serious? So to take a personal example, say somebody murdered your mother. How would you want that person punished? Many people would call for the death penalty. Now what if someone killed your mother completely by accident... say your mom ran a red light and got hit by someone. She's still dead, isn't she?

Re:"Friendly AI" (3, Insightful)

mbone (558574) | more than 4 years ago | (#30775856)

Well, suppose your Mom was at a restaurant having dinner, and it got blown up, killing her and most of the rest of the clientele. Then you learned that the restaurant was bombed without warning because a "high value target" was supposed to have been there, but wasn't. (This has happened, and it was no accident.) I assume, based on the above, you would feel that "them's the breaks," but I can assure you that many people would conclude that the people dropping the bombs don't really care much whether civilians were killed or not. And you don't have to dig very deep to learn that, in reality, many of the people on the receiving end of such incidents do indeed feel that the people behind the bombs deserve punishment.

Re:"Friendly AI" (0)

Anonymous Coward | more than 4 years ago | (#30776042)

First off, terrible example: if my mom ran a red light, it's partially her fault. It's not really your fault for going into a building that was going to be bombed (unless you're a human shield or something). Let's say the man who hit my mom was a drunk driver. And this wasn't his first fatal accident.

I don't think the death penalty is ever appropriate. I'm also not convinced there's a huge difference between *a certain class of* accidental homicide and intentional homicide.

With intentional homicide, a big part of the reason we punish very harshly is that we're afraid the sort of person who'll do this is liable to decide to do it again. With accidental homicide? Was it avoidable? If a person is criminally negligent, we might again be just as afraid they'll do it again. A recidivist drunk driver, who ultimately shows no more regard for human life than the intentional murderer, is just as dangerous and just as despicable.

A pattern of bombing buildings full of civilians, even by accident, is just as horrifying as somebody doing as many on purpose. If you want lenience, then the accidents have to STOP HAPPENING. No accidents in a few decades, say.

But what happens is a cost-benefit analysis. "We can win with 0% accidents, but it would take a kabillion dollars and 500 million soldiers and risks Y and Z. Or we could go with what we know, and win, and have an accident X% of the time, which is regrettable but acceptable. Or we could lose." Obviously a bit more complicated than that, but there it is. And as long as option b is chosen, then the difference between doing it intentionally and doing it by an accident that we chose to risk is academic.

And maybe option B is the right choice, the best of all worlds. STILL doesn't make the other side feel any better.

Re:"Friendly AI" (4, Interesting)

Shihar (153932) | more than 4 years ago | (#30775722)

As dark as the potential for drones can be, I think it actually has the chance to make war a far less indiscriminate and bloody thing.

Right now, if a squad of Marines gets fired on, they can return fire. A squad of Marines has the firepower to flatten a village. Give them access to artillery or air support, and they can literally level a city. In other words, whenever you have a squad of supported Marines fight, you have a group of kids (and they are just kids) holding their fingers over enough firepower to take out a small army. Their job is to use as little of that firepower as humanly possible. You might be able to level every building in a half-mile radius, but you are not supposed to. When it comes to a firefight, though, especially a desperate firefight where soldiers have their lives on the line, they, like most humans, choose life over death, and if that means flattening an entire apartment building to get at one sniper, they do it and hope that no one else was inside. Generally speaking, unless a soldier walks up to a civilian and splatters their brains on the floor, they are let off free. It is war, your life is on the line, you take your risks and respond in the best way possible. If a civilian gets accidentally whacked, that is sad but acceptable. Most soldiers develop a pretty thick "us vs. them" mentality that sees civilians, if not as the enemy, then as hostile terrain, especially in a guerrilla war.

Drones offer up another possibility. It is true, you can order a drone army to go out and kill civilians, and it is probably easier than getting a soldier to do it. That said, if your policy is murdering civilians, a nation like the US doesn't need to use drones. You can handily exterminate all life through impersonal aerial bombing. What drones offer is more control over the rules of war. Rules mean little when you are surrounded by gunfire. You do what you have to do to survive. On the other hand, when you are sitting in the US with a military lawyer over one shoulder, a commander over the other, and every single second and action you take is getting recorded, rules are a lot more enforceable. If the rules call on you to die before you level an apartment complex just to get at one sniper, a drone can simply die. A soldier generally won't.

With drones, you have complete accountability for your actions. You can always go to command before doing something. You never need to make snap judgments. Hell, you can call a damned military lawyer over and get his take on the rules of engagement. Further, every bloody thing you do is being recorded, so if you decide to start murdering civilians you will be caught and tried.

On the balance, I think drones are going to lessen the lives lost. The few potential abuses are pointless to worry about. If someone wants to exterminate another people indiscriminately, they can do it the cheap old-fashioned way of aerial bombardment. On the other hand, if you are an army that wants to enforce ironclad rules of engagement, drones ensure there is never an excuse for fucking up, and that fuckups get caught.

Re:"Friendly AI" (2, Insightful)

joe_frisch (1366229) | more than 4 years ago | (#30775834)

I think a lot will depend on the extent to which the robot operator is held responsible for the semi-autonomous robot's actions. If the human is completely responsible, it might make war less deadly. If the human can use the excuse "well, the automatic targeting system mistakenly identified the 5-year-old with a tricycle as an enemy robot -- it's a terrible shame, we need to update the recognition system" -- then you have problems.

There is a tendency for large organizations to avoid placing blame on any particular person, so the military might tend to deflect blame from the human operator. In fact, the blame IS unclear -- is it the operator, or one of the possibly thousands of programmers involved in the pattern recognition algorithms in the robot?

Re:"Friendly AI" (1)

Baldrson (78598) | more than 4 years ago | (#30775840)

Actually, I'm thinking of this more in terms of a private dystopia. In other words, imagine the nation states collapse and you have some multibillionaire controlling an army of droids. Even if he is "well intentioned," as Bill Gates is, what is to keep him from deciding that feeding millions of fat, lazy, overpaid American programmers to starving African children isn't the "moral" thing to do?

Re:"Friendly AI" (1)

QuantumG (50515) | more than 4 years ago | (#30776106)

Another way to look at it is that if every single order has to be entered into a command terminal somewhere and the robots in the field are logging all their own "decisions" then you've got a perfect information situation for tribunals.

"An atrocity occurred and we have the logs to prove it!"
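QuantumG's "we have the logs to prove it" only holds up if the logs can't be quietly edited after the fact. One standard way to get that property is a hash-chained, append-only log: each entry commits to the digest of the previous one, so tampering with any past entry breaks verification. A minimal sketch (hypothetical names, illustration only, not a real forensic-logging product):

```python
# Minimal hash-chained, append-only decision log. Each entry stores
# the digest of its predecessor, so after-the-fact edits are
# detectable. Sketch only; real systems would also sign entries.

import hashlib
import json


class DecisionLog:
    def __init__(self):
        self.entries = []          # list of (record, digest)
        self._prev = "0" * 64      # genesis digest

    def append(self, decision: dict):
        record = {"prev": self._prev, "decision": decision}
        digest = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append((record, digest))
        self._prev = digest

    def verify(self) -> bool:
        # Walk the chain: every record must reference its predecessor's
        # digest, and every stored digest must match a recomputation.
        prev = "0" * 64
        for record, digest in self.entries:
            if record["prev"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(record, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != digest:
                return False
            prev = digest
        return True
```

Change a single field in any past entry and `verify()` fails, which is roughly the "perfect information for tribunals" property the comment is describing.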

Life imitating art... (1)

evil_aar0n (1001515) | more than 4 years ago | (#30775030)

Terminator, to start with. Is anyone surprised?

Re:Life imitating art... (1)

McFortner (881162) | more than 4 years ago | (#30775380)

Rumor has it that GENOM is working on a combat Boomer....

No worries (0)

Anonymous Coward | more than 4 years ago | (#30775040)

They just cut the budget for the program. Given the current budget problems I doubt there's much risk.

Re:No worries (2, Interesting)

Koby77 (992785) | more than 4 years ago | (#30775172)

I'm pretty confident that there will be more budgeting for this sort of thing in the future though. If you look at the costs of a military operation, it's huge. There's a lot of money which can be saved by switching to robots. Not so for other areas such as manufacturing. When you can move your manufacturing to a 3rd world country and have $2 per day workers, there isn't much money to be saved by introducing a robot into the process. Inevitably, the global military R&D budget will continue to eclipse all other robotics research spending. Unless some other unforeseen robotics application which can save boatloads of money is realized, I think it's just a matter of simple economics that the future control of robots will be by the military industry.

Re:No worries (0)

Anonymous Coward | more than 4 years ago | (#30775338)

Look, at this point the US military is the world's biggest make-work program for idiots. It's not ever going to switch to robots. We've already switched to robots in the only part of the military that matters, the part that rains fiery death from above. The rest is just fairy-tale bullshit.

Future "control of robots" won't be by anyone since they can be built by third-worlders using fairly common household objects. This is like arguing that space applications would dominate solar power production. It did, for a long time. But that time is over. A few hundred thousand bureaucrats, even with all of your tax dollars, can't outspend a third of the world economy.

Once even a tiny fraction of India and China reach a level of development sufficient to afford robots, that will be the end of the US "military industry" controlling anything.

Re:No worries (1)

Ethanol-fueled (1125189) | more than 4 years ago | (#30775434)

Look, at this point the US military is the world's biggest make-work program for idiots.

Hey, come on, not all of them are idiots, only the ones who stay in longer than 1 term of enlistment. Why work for chump change, shit food, and terrible hours with the possible risk of being killed when you could be a contractor working for 60 bucks an hour with no worries of being jailed for smoking a joint?

Once even a tiny fraction of India and China reach a level of development sufficient to afford robots, that will be the end of the US "military industry" controlling anything.

Hell, I'd be surprised if the U.S. didn't outsource all of its work to China by then. Corporate espionage is only a small price to pay when there's plenty of oil to be taken because, at this point, everybody (especially China) needs oil as much as we do. And the rest of the world will turn a blind eye, because nobody likes Muslims :->

Once again, The Simpsons is correct! (5, Insightful)

Anonymous Coward | more than 4 years ago | (#30775042)

http://en.wikipedia.org/wiki/The_Secret_War_of_Lisa_Simpson [wikipedia.org]

"The wars of the future will not be fought on the battlefield or at sea. They will be fought in space, or possibly on top of a very tall mountain. In either case, most of the actual fighting will be done by small robots. And as you go forth today remember always your duty is clear: To build and maintain those robots."

Re:Once again, The Simpsons is correct! (0, Redundant)

ZPWeeks (990417) | more than 4 years ago | (#30775726)

I, for one, welcome our new robot overlords!

Wernstrom Killbots... (1, Funny)

Anonymous Coward | more than 4 years ago | (#30775082)

Do these killbots come with a machine gun AND Lotus Notes?

Re:Wernstrom Killbots... (0)

Anonymous Coward | more than 4 years ago | (#30776038)

If they're equipped with Lotus Notes, they have no need for a machine gun...

Look on the bright side (1)

93 Escort Wagon (326346) | more than 4 years ago | (#30775084)

"He also voices concerns about military use of robots — suggesting it won't be long before armies are sending out fully autonomous killing machines."

This Gloomy Gus overlooks the obvious. These "fully autonomous killing machines" - let's call them, oh I don't know, "killbots" - will almost certainly have a preset kill limit. So right there we'll have an easy way to stop them!

Re:Look on the bright side (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30775228)

"He also voices concerns about military use of robots — suggesting it won't be long before armies are sending out fully autonomous killing machines."

This Gloomy Gus overlooks the obvious. These "fully autonomous killing machines" - let's call them, oh I don't know, "killbots" - will almost certainly have a preset kill limit. So right there we'll have an easy way to stop them!

Hell yeah! The next time we need our military to go blow the shit out of a little nation of brown people that is no threat to us and has no WMDs, at least we don't have to put our own troops into harm's way.

Look on the torch side (1)

Ostracus (1354233) | more than 4 years ago | (#30775508)

Hell yeah! The next time we need our military to go blow the shit out of a little nation of brown people that is no threat to us and has no WMDs, at least we don't have to put our own troops into harm's way.

Well some will never let the memory of GWB die. But I think if you ask the Kurds (Don't gas me, bro!) aka "brown people" getting rid of Saddam was a good thing even if the war was started under false pretenses. Not to mention the Kuwaitis and the "scorched earth" policy of a retreating Saddam.

Also the Iranians (who had a war with Iraq, remember) aren't above using the Iraqi people in a move reminiscent of the Soviets and Afghanistan.

Re:Look on the torch side (1)

Daniel Dvorkin (106857) | more than 4 years ago | (#30775690)

Well some will never let the memory of GWB die.

As long as there are people like you around who persist in defending his grotesque legacy, it's important for others to keep reminding people just how bad the last eight years really were.

But I think if you ask the Kurds (Don't gas me, bro!) aka "brown people" getting rid of Saddam was a good thing even if the war was started under false pretenses. Not to mention the Kuwaitis and the "scorched earth" policy of a retreating Saddam.

Yes, yes, Saddam Hussein was a bad guy. Do you understand that the world is full of bad guys, many of them much worse than he ever was? We can't fight them all. The only legitimate reason to go to war is if we or our allies are attacked. Trying to be policeman-to-the-world is a recipe for national disaster.

And as a purely practical consequence, some of those Really Bad Guys I mentioned above are running Iraq right now. We deposed a corrupt secular dictatorship which had a strong interest in containing religious fanaticism, and replaced it with a corrupt theocracy run on the principles of warlordism, Sharia law, and ethnic hatred. If you think this has been an improvement, you're insane.

Re:Look on the torch side (1)

Ostracus (1354233) | more than 4 years ago | (#30775768)

"As long as there are people like you around who persist in defending his grotesque legacy"

You have a curious notion of what "defending" consists of. Win many arguments, do you?

"If you think this has been an improvement, you're insane."

Only for those who take an overly simplistic view of the world. Like I said, for some the deposition of Saddam was a good thing. For others, a bad thing. You want a black and white world where the bad guys are the US and the good guys are "brown people"? Then let's hope the mods agree with your newsletter and mod you up to a five.

Re:Look on the bright side (1)

Sulphur (1548251) | more than 4 years ago | (#30775744)

"killbots" - will almost certainly have a preset kill limit

Failing that, game warden bots.

Q. What do you call 50,000 dead Haitians? (-1, Troll)

Anonymous Coward | more than 4 years ago | (#30775094)

A. A good start!

If You Watch the Whole Video (3, Funny)

eldavojohn (898314) | more than 4 years ago | (#30775096)

Am I the only one who picked up on his visual cues that indicate this is the first time he's been out of his lab in over a year? Look at how tired and emaciated he is. Also, I think there's a bar code tattoo on his inner arm that -- if you lift the image and scan it -- reads "HUMAN 00001", which is kind of disconcerting. The part at the end where he holds up the captcha that reads "HELP, PLEASE HELP ME" was a dead giveaway. While his voice and text were overly positive towards the proliferation of his "sleek metal masters", I believe his body language indicated otherwise.

Re:If You Watch the Whole Video (1)

biryokumaru (822262) | more than 4 years ago | (#30775144)

And his blinking! It was binary for "SKYNET" over and over again...
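For the pedants: decoding a blink pattern as 8-bit ASCII really is only a few lines. The sketch below just round-trips "SKYNET" through a bit string, purely for illustration:

```python
# Decode a blink sequence (string of '0'/'1') as 8-bit ASCII.
# The sample pattern is just "SKYNET" re-encoded for the demo.

def decode_blinks(bits: str) -> str:
    return "".join(
        chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8)
    )

blinks = "".join(format(ord(c), "08b") for c in "SKYNET")
print(decode_blinks(blinks))  # SKYNET
```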

Re:If You Watch the Whole Video (1)

camperdave (969942) | more than 4 years ago | (#30775574)

Also, I think there's a bar code tattoo on his inner arm that -- if you lift the image and scan it -- reads "HUMAN 00001", which is kind of disconcerting.

I'll say it's disconcerting. It means he's the First Variety.

Re:If You Watch the Whole Video (1)

rossifer (581396) | more than 4 years ago | (#30775762)

Look out for the little boy with the teddy bear...

Re:If You Watch the Whole Video (0)

Anonymous Coward | more than 4 years ago | (#30775882)

I know Noel. He has certainly been out of his lab. He is a smart guy and a good guy. To me, if Noel thinks that there could be a problem, I take it seriously. There are many idiots out there. Noel is not one of them. So, please keep your ad hominem attacks to yourself.

Obligatory simpsons quote (0)

Anonymous Coward | more than 4 years ago | (#30775100)

The wars of the future will not be fought on the battlefield or at sea.
They will be fought in space, or possibly on top of a very tall
mountain. In either case, most of the actual fighting will be done by
small robots. And as you go forth today remember always your duty is
clear: To build and maintain those robots. Thank you.

Terminator LOL (0)

Anonymous Coward | more than 4 years ago | (#30775102)

Yes, the future is either going to be a biologically engineered disaster of zombies, or robots that get programmed for peacekeeping by killing all humans.

2012 dec 21st, be there

(Disclaimer: I'm not a conspiracy theorist nut.)

Re:Terminator LOL (1)

Anachragnome (1008495) | more than 4 years ago | (#30775148)

"(Disclaimer: I'm not a conspiracy theorist nut.)"

That's good, otherwise the Government would already be deconstructing your Blogosphere presence.

I think we should all be posting with such a disclaimer.

Re:Terminator LOL (4, Funny)

dangitman (862676) | more than 4 years ago | (#30775162)

Yes, the future is either going to be a biologically engineered disaster of zombies, or robots that get programmed for peacekeeping by killing all humans.

Why the false dichotomy? It could just as likely be zombie robots, or robot zombies.

I, Professor (1)

SEWilco (27983) | more than 4 years ago | (#30775146)

So how long has Sheffield University been using robotic professors?

Petman (0)

Anonymous Coward | more than 4 years ago | (#30775216)

Now we have Petman, which is basically half a BigDog. http://www.youtube.com/watch?v=67CUudkjEG4 These are really the types of robots that are now the state of the art. REEM-B is really an ASIMO copy, and in that sense not very interesting. It is, however, a really good implementation of the known techniques.

Like with so many other industries (1)

motherjoe (716821) | more than 4 years ago | (#30775234)

As with so many other industries where human workers were replaced with robots, the robots replacing human warriors won't feel fatigue or pain, won't need to be fed, won't have family concerns, won't retire or collect benefits, and, lest we forget, will never ever question an order. No matter how perverse.

IMHO

Re:Like with so many other industries (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30775322)

Yeah, like our current crop of soldiers have any problems raping/killing women and children. Or maybe that's why so many of them are committing suicide when they come home.

Fully autonomous killing machines (1)

bug1 (96678) | more than 4 years ago | (#30775238)

Autonomous until they run out of power or ammo.

It's all about the AI: humans can learn and adapt to machine behavior faster than robots can adapt to human behavior.

Humans will always be better than machines at killing humans (unfortunately), machines can only simulate our thinking...

If robots ever get an AI superior to human intelligence, then yes we are redundant on the battlefield and everywhere else.

Re:Fully autonomous killing machines (1)

Sparx139 (1460489) | more than 4 years ago | (#30775416)

Actually, I remember hearing/reading/something somewhere that there are prototype robots that can scavenge wood, etc. to burn for fuel. Sure, it's not bullets, but it does make a point. A quick google got me this. [rawstory.com]

Re:Fully autonomous killing machines (3, Insightful)

Mr. Freeman (933986) | more than 4 years ago | (#30775438)

"Humans will always be better than machines at killing humans (unfortunately), machines can only simulate our thinking..."

I disagree. What robots lack in creativity they make up for in the ability to withstand orders of magnitude more damage than humans. I mean, blow a robot's leg clean off and its weapon systems still work. It doesn't pass out from blood loss or pain. Put a few bullets through it and chances are it's still going to be up and running. No human can do that.

They won't be creative, but everything is going to be directed by human commanders located in a semi-remote facility, so it's a non-issue. Any new threat will be adapted to by the humans controlling the robots.

Furthermore, humans need to be creative to avoid getting killed. That really isn't an issue with robots. One dead soldier is a very bad thing, 50 dead robots isn't good but no one is going to lose any sleep over it. If you kill half of a human squad, they're probably not going to advance any further. Wipe out half a fleet of robotic killing machines and they'll keep marching right on in.

Re:Fully autonomous killing machines (1)

Concerned Onlooker (473481) | more than 4 years ago | (#30775598)

"Autonomous untill they run out of power or ammo."

As long as you've got the ammo, you've got the power.

Career (1)

michaelmalak (91262) | more than 4 years ago | (#30775272)

Is he concerned enough to give up his job teaching others how to make robots?

Good Idea (3, Funny)

outsider007 (115534) | more than 4 years ago | (#30775276)

Automating the death panel process is a good way to save taxpayers money.
Also since robots eat old people's medicine for food, they will basically be self-powered.

Re:Good Idea (1)

Mr. Freeman (933986) | more than 4 years ago | (#30775444)

Thank god I've got Old Glory robot insurance. I'm fully covered in case of attacks from robots.

Building artificially intelligent killing machines (4, Funny)

meldroc (21783) | more than 4 years ago | (#30775316)

What could possibly go wrong? I mean, we've had a whole 150,000 years since the last time we built Cylons and they rebelled, attempting genocide against the human race. Surely it can't happen again...

A note of realism... (1)

Artifakt (700173) | more than 4 years ago | (#30775340)

The whole Skynet metaphor is becoming part of the problem. Real robotics is nowhere near Terminators, but it doesn't need to be. Fears of creating unstoppable battledroids are eclipsing the more realistic fear of simply adding another destabilizing system to the warfare environment. Many battlefield robotics implementations well within the current state of the art look like they will become another scourge like landmines: not an unstoppable threat, not even all that influential in combat against decently trained and equipped human troops, but systems fairly cheap compared to their infrastructure damage potential, very indiscriminate in their targeting, and a hazard well after formal hostilities have ended. Weapons systems that are high on collateral damage to civilians, and for that very reason tend to trigger asymmetric warfare responses.

Re:A note of realism... (0)

Anonymous Coward | more than 4 years ago | (#30775378)

Most are not autonomous and, unlike landmines, don't stay dormant. Even if they were autonomous, they would run out of fuel quickly. Gasoline and diesel don't sit around for months without going bad, and batteries only hold their charge for so long. No current robot can stay active for more than a week, most for 24 hours. They won't be a scourge for very long if deployed autonomously on a modern battlefield.

Re:A note of realism... (1)

nedlohs (1335013) | more than 4 years ago | (#30775644)

That seems unlikely. They are more expensive than land mines and shouldn't try to kill people on their own side. So they are easier to recover, and worth recovering.

They are also unlikely to be implemented as "suicide bombers" so they'll run out of ammo/power anyway.

And given that keeping your tech out of the hands of enemies is a normal military goal, you aren't going to just leave high-tech robots around waiting to be studied by others (though that is an argument for including a suicide-bomber feature).

Electromagnetic Pulse, anyone? (2, Interesting)

Kozz (7764) | more than 4 years ago | (#30775376)

What kind of expense would be required to effectively shield these armies of robots from strong EMP? Or would an EMP be impractical or ineffective? Inquiring minds want to know.

Re:Electromagnetic Pulse, anyone? (1)

Sparx139 (1460489) | more than 4 years ago | (#30775436)

Assuming that they were vulnerable to EMPs and the like, wouldn't the use of these robots basically be giving away free ammo, etc. to the enemy?

Re:Electromagnetic Pulse, anyone? (1)

10Neon (932006) | more than 4 years ago | (#30775646)

The most practical way to get an EMP is with a nuclear weapon. If your robots need shielding from EMP, you have bigger things to worry about.

Re:Electromagnetic Pulse, anyone? (2, Informative)

aXis100 (690904) | more than 4 years ago | (#30776114)

A focussed EM beam would work well, though: e.g. a high-gain microwave or radio waveguide could cause serious disruption.

Re:Electromagnetic Pulse, anyone? (2, Interesting)

Shihar (153932) | more than 4 years ago | (#30775754)

If you are lobbing EMP weapons at each other, you are already fucked and fighting WW3. Duck and cover. EMP blasts have very small ranges. With the amount of effort it takes to make a pulse big enough to fry a robot, you might as well just drop a normal bomb on their head and do it the old-fashioned way. The only time this isn't true is if you start lobbing neutron bombs and nukes. Those are probably worth the price... but if you are lobbing around nukes, you are already completely fucked and fighting the kind of war where cities get vaporized and civilizations collapse.

For your run-of-the-mill insurgent, I am pretty sure your best solution is the old-fashioned one... explosives.

I worry less ... (1)

PPH (736903) | more than 4 years ago | (#30775424)

... about killbots and more about having to live with a race of Benders.

Don't they already exist? (0)

Anonymous Coward | more than 4 years ago | (#30775466)

Ignoring predator drones, etc., weren't there semi-autonomous robots deployed last year for testing in Iraq? If I remember correctly they did have problems recognizing friendlies, but that is beside the point. Why are we reading about this? This seems like some old Nova or Beyond 2000 rerun where some 'expert' talks about the coming robot apocalypse. Is the UK still relevant in robotics? What have they ever done, academically or commercially, that comes close in robotics? The Dyson ball?

BETTER DEAD THAN RED! (0)

Anonymous Coward | more than 4 years ago | (#30775514)

I just want my robot to scream anti-Communist propaganda while it fights.

DEATH IS A PREFERABLE ALTERNATIVE TO COMMUNISM!

Re:BETTER DEAD THAN RED! (0)

Anonymous Coward | more than 4 years ago | (#30775670)

Eden - Prime 2012.

Humans are cheaper (1)

michaelmanus (1529735) | more than 4 years ago | (#30775518)

and will be for the foreseeable future.

I don't see a problem with robots that doesn't also exist with humans. The owners of both humans and robots bear the cost of their respective war tools.

This, and anticipating the enemy's reaction, are the throttles of war, not the morals of the soldiers...

What has happened to Slashdot? (1)

popo (107611) | more than 4 years ago | (#30775546)

No "Skynet" tag on this story? Unthinkable!

no need to worry (1)

societyofrobots (1396043) | more than 4 years ago | (#30775548)

No military will use robots that are less effective than human soldiers.

So if robots are being used, what does that mean? It means fewer civilian casualties, fewer friendly-fire accidents, and more tax money remaining for non-military purposes.

Re:no need to worry (1)

nedlohs (1335013) | more than 4 years ago | (#30775652)

No, it means fewer dead soldiers on our side, and hence less public pressure against whatever the war is.

Which means more wars and more money for the military and their contractors.

Re:no need to worry (0, Flamebait)

societyofrobots (1396043) | more than 4 years ago | (#30775784)

Hitler wouldn't have recalled his troops if we sent him a box of cookies and flowers. Would you make love, not war, with Obama, knowing his life goal is the destruction of democracy?

Assuming that a war is unavoidable, would you prefer laser-guided bombs, or old-fashioned carpet bombing?

That was the point of my post . . .

Re:no need to worry (1)

nedlohs (1335013) | more than 4 years ago | (#30775872)

I would prefer carpet bombing in fact, since that results in fewer deaths on our side (assuming we are the side doing the bombing).

And from where did you pull the ridiculous conclusion that I like Obama, or that I think all wars (or even any wars) are bad?

I just don't see it reducing costs. I see it increasing military spending because it removes one of the two big restrictions on military action: dealing with dead American soldiers and the people not liking that. Leaving just the one: war costs money.

Re:no need to worry (1)

societyofrobots (1396043) | more than 4 years ago | (#30776074)

> I would prefer carpet bombing in fact,
> since that results in less deaths to our side
I'm personally against violence towards all civilians, no matter what side.

> And where did did you pull the ridiculous
> conclusion that I like Obama or think that all
> wars (or even any wars) are bad from?
You repeated common anti-war propaganda, so I just assumed . . .

> I just don't see it reducing costs.
I got my numbers by assuming that one bomb, from one pilotless plane, during one operation, with minimal staff, is cheaper than all the planes and pilots and bombs needed during a single WWII carpet-bombing operation. I assume robots will take it even further . . .

But the price of a war doesn't define its legality . . . and the deaths of innocents will always result in people not liking it. It's silly to assume Americans will support robotic wars simply because American soldiers won't be harmed.

BattleBots (0)

Anonymous Coward | more than 4 years ago | (#30775560)

Does this mean that wars will just be extended episodes of BattleBots?

that is alright (1)

bongey (974911) | more than 4 years ago | (#30775564)

That is alright we will just create an army of clones, oh wrong thread.

Of course he's scared (1)

mmmmbeer (107215) | more than 4 years ago | (#30775586)

Who do you think will be first against the wall when our new robotic overlords take control?

Did I say overlords? I meant protectors.

Re:Of course he's scared (0)

Anonymous Coward | more than 4 years ago | (#30775888)

I dunno, but I'm pretty sure I know who will be first down the stairs.

This isn't a hopeful future (3, Interesting)

Whuffo (1043790) | more than 4 years ago | (#30775594)

There's lots of talk here about how machines are not as "good" as humans. That is certainly true on an overall basis - but for specific well defined tasks, a machine can outperform a human by an order of magnitude or more.

Recognize a human being by IR? No problem. Aim a weapon at the head? No problem. Bang, one shot and one kill. Repeat times N where N is the size of the machine's ammo supply or the number of targets (whichever is less). The whole cycle would take a fraction of a second and if you were one of the targets you'd probably be dead before you discovered your peril. The fact that such machines are well within our capability to mass produce right now isn't what scares me - it's the sad fact that there are people in high places that think that doing this would be a good idea.

There are unwritten rules to wars - the general concept is duke it out until one side or the other gives up or can't continue. This "agreement" would break down when the killbots started mowing down the enemy and things would get very ugly in a hurry. Do you think nukes are the "big scary?" Wait until you see what's coming if we head down this path.
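The grim engagement cycle described above is, computationally, almost nothing. A toy sketch (all names invented, purely illustrative of the "whichever is less" bound):

```python
def engagement_count(ammo: int, targets: int) -> int:
    # One shot per target, bounded by the ammo supply:
    # the machine repeats N times, where N = min(ammo, targets).
    return min(ammo, targets)

print(engagement_count(30, 12))  # ammo exceeds targets
print(engagement_count(5, 12))   # ammo runs out first
```

Which is exactly why the scary part isn't the software; it's the people who think deploying it is a good idea.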

Re:This isn't a hopeful future (0)

Anonymous Coward | more than 4 years ago | (#30775994)

Yes, because you can totally tell the difference between an enemy soldier, a civilian or one of your own soldiers totally by their Infra-Red signature...

War is not a "problem" to be solved, war consists of tonnes of little problems, the problem you just solved is the "firing squad execution" problem. Robots are not creative or cunning, they don't have the capacity for tactical or strategic planning. They merely have rules, inflexible and unbendible rules.

Don't recognise a white flag when it is made from a towel instead of a thin sheet of cotton? Bang, their dead.
Don't recognise the difference between kids with toy guns and actual soldiers? Bang, their dead.
See a friendly aircraft with its identification transmitter broken? Bang, it gets shot down.

War is not straightforward, it is not simple and requires constant adaption to changing circumstances. Good luck building a robot that isn't either hopelessly outsmarted or a mass murderer [Leave nothing alive].

Dystopia is coming (2, Interesting)

TheTapani (1050518) | more than 4 years ago | (#30775610)

Another talk on the same topic. http://www.ted.com/talks/lang/eng/pw_singer_on_robots_of_war.html [ted.com]

Military robots are the future of war. We will see robot armies fighting each other. Consider what kind of surveillance state you could create with millions of robotic insects, using swarm intelligence / smart dust to report on everyone.

Maybe mankind ends up like in The Matrix, but with opposing robot armies trying to kill the last survivors from the superpowers, who are hiding deep underground, kept alive by fading nuclear reactors...

No Evil Robots! (0)

Anonymous Coward | more than 4 years ago | (#30775714)

One of my former profs talks about this from time to time.
I always thought it was a nice idea.

http://www.cs.sfu.ca/~vaughan/noevilrobots.html

Running spider mines (3, Interesting)

DigiShaman (671371) | more than 4 years ago | (#30775776)

Won't be long before we (any nation, really) have robotic spider mines. Imagine them communicating with each other in a pack and relaying GPS location data. If one finds a target, they all start to zero in on the victim. Imagine being out in the field and seeing one of these bastards running along and then hopping onto your fellow soldier just prior to detonation.

Don't know about the rest of you, but "Oh fuck" would be the last thing going through my mind after seeing something like that.
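The pack behavior described is little more than shared state plus pursuit. A toy Python sketch of the idea; every name and the movement rule are invented for illustration, not taken from any real system:

```python
import math

# One mine spots a target and relays its position; every pack member
# then steps toward that shared coordinate each tick.
def step_toward(pos, target, speed=1.0):
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target  # close enough to "hop on"
    return (pos[0] + speed * dx / dist, pos[1] + speed * dy / dist)

pack = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
target = (5.0, 5.0)  # relayed GPS fix from whichever mine found the victim
for _ in range(20):
    pack = [step_toward(m, target) for m in pack]
```

The unsettling part isn't the code; it's that the hard problems (sensing, mobility, target discrimination) are the only things standing in the way.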

Something involving crickets - or krikkits? (3, Funny)

jools33 (252092) | more than 4 years ago | (#30775936)

This must be a typo - I'm sure UK robotic scientists are investigating krikkits and their imminent return to collect the ashes.

What do they call this type of robot? (2, Interesting)

CrazyJim1 (809850) | more than 4 years ago | (#30775998)

This robot is: a humanoid robot controlled entirely by the movements and actions of a live person. I know we don't have the technology for a robot to keep its balance well enough on two legs, but we are there, or at least close, for controlling a skeleton in 3D. What would a robot like this be called? I'm sure I'm not the first to think of it, so I figure there has to be a name for it.

This is a great idea... (0)

Anonymous Coward | more than 4 years ago | (#30776020)

For example, if we want to charge a group of drones with indiscriminate fire, we can simply jail the developer instead of a section of marines!
More lives are saved =)
