Slashdot: News for Nerds


Examining the Ethical Implications of Robots in War

ScuttleMonkey posted more than 5 years ago | from the come-with-me-if-you-want-to-live dept.

Robotics

Schneier points out an interesting (and long, at 117 pages) paper on the ethical implications of robots in war [PDF]. "This report has provided the motivation, philosophy, formalisms, representational requirements, architectural design criteria, recommendations, and test scenarios to design and construct an autonomous robotic system architecture capable of the ethical use of lethal force. These first steps toward that goal are very preliminary and subject to major revision, but at the very least they can be viewed as the beginnings of an ethical robotic warfighter. The primary goal remains to enforce the International Laws of War in the battlefield in a manner that is believed achievable, by creating a class of robots that not only conform to International Law but outperform human soldiers in their ethical capacity."


369 comments

What's the point? (4, Interesting)

ccguy (1116865) | more than 5 years ago | (#22210830)

Obviously a country that can send robots instead of soldiers to fight is way more likely to become 'war happy' - so I'm not sure this robot thing is a good idea at all.

Besides, if your enemy expects your robots to defeat their army, what would be the point of fighting them in the first place? Attacking civilians seems a more logical step (I don't think it's reasonable to expect a country at war to restrict itself to military targets when every one of those targets can be easily replaced).

(and no, I didn't read the whole 117 pages, but after a quick glance I reached the conclusion that whoever wrote the title didn't either, so I'm sharing my thoughts on the title, not the PDF)

Re:What's the point? (5, Insightful)

The Aethereal (1160051) | more than 5 years ago | (#22210926)

Obviously a country that can send robots instead of soldiers to fight is way more likely to become 'war happy'
Of equal concern to me is the fact that a country with a robot army can use them against their own citizens with no chance of mass mutiny.

Re:What's the point? (2, Insightful)

The One and Only (691315) | more than 5 years ago | (#22211344)

Depends on the robots. What about the people who build and maintain the robots? They can mutiny. Also I'd bet you need some sort of networking to coordinate the robots. Probably wireless. Sure you can set the right failure modes for jamming, but what about signal intrusion? You could make the robots mutiny for you.

Re:What's the point? (3, Funny)

Dancindan84 (1056246) | more than 5 years ago | (#22211484)

Also I'd bet you need some sort of networking to coordinate the robots. Probably wireless.
I agree. Let's do it centrally. We can name the hub Skynet. Or V.I.K.I.

Re:What's the point? (1, Insightful)

Applekid (993327) | more than 5 years ago | (#22211516)

Talk about human support for mutiny is moot when even today in the United States our rights are being stripped out one thread at a time and nobody so much as blinks or turns away from American Idol or Walmart.

One worker might talk about it and wind up turned in (because he's a terrorist, obviously) and those that betray will be rewarded with coupons to McDonalds.

Re:What's the point? (2, Insightful)

smussman (1160103) | more than 5 years ago | (#22211538)

Depends on the robots. What about the people who build and maintain the robots? They can mutiny. Also I'd bet you need some sort of networking to coordinate the robots. Probably wireless. Sure you can set the right failure modes for jamming, but what about signal intrusion? You could make the robots mutiny for you.
But I don't think you really want that, because if the maintenance people can make the robots mutiny, how would you prevent your opponent from making them mutiny? Even if it requires very specialised knowledge, all it takes to get the secret is one converted/planted maintenance person.

Re:What's the point? (0)

Anonymous Coward | more than 5 years ago | (#22211382)

Well, if every household is armed with a robot militia, problem solved.

Re:What's the point? (4, Insightful)

Arkham (10779) | more than 5 years ago | (#22211568)

I don't think fully autonomous robots will ever be a smart plan. The chance of malfunction is just too great, and the consequences would be too serious. There've been a million sci-fi movies to that effect, from "Terminator" to "I, Robot".

What would be interesting though would be robots as a shell to the humans they represent. Think "Quake" with a real robot proxy in the real world: robots with wide-angle cameras covering their area, and a Quake-like interface that would allow the operator to attack or assist as needed. Limited automation, but case-hardened robot bodies run by trained humans would present a powerful adversary. Heck, every army recruit would already know 80% of how to operate one on signing day if the UI was good.

I know I'd be a lot less upset to hear "Four robots were blown up by a roadside bomb today. They should be operational again by tomorrow." than to see more soldiers die.

Re:What's the point? (5, Insightful)

KublaiKhan (522918) | more than 5 years ago | (#22210952)

I'd think that it'd be more effective to attack infrastructure--things like power stations, traffic control systems, that manner of thing--than to go after civilians directly.

For one thing, what's the point of taking over a territory if there's nobody there to rebuild and to use as a resource?

For another, it looks a -lot- better on the international PR scene if your robots decidedly ignore the civilians and only go after inanimate strategic targets--at least, up until the point that they get attacked. With that sort of programming, you could make the case that you're "seeking to avoid all unnecessary casualties" etc. etc.

Mowing down a civilian populace does sow terror, of course, but keeping the civilians intact (if in the dark and without water) can be argued to be more effective.

Re:What's the point? (2, Insightful)

ccguy (1116865) | more than 5 years ago | (#22211084)

Mowing down a civilian populace does sow terror, of course, but keeping the civilians intact (if in the dark and without water) can be argued to be more effective.
When desperate enough, civilians can become soldiers. In fact, some can be willing to die (being willing to die and accepting a certain risk are totally different things). This is proven day after day in the Gaza strip, for example.

Re:What's the point? (3, Funny)

KublaiKhan (522918) | more than 5 years ago | (#22211124)

At which point, once they take up arms, they're conveniently reclassifying themselves as irregular enemy combatants. If they had only stayed calm and awaited further instruction from our occupying forces, robotic or otherwise, this sad scene could have been avoided. We're just trying to be as humane as possible; is it our fault if they aren't going to follow directions?

Think like an evil overlord, man!

Re:What's the point? (2, Insightful)

moderatorrater (1095745) | more than 5 years ago | (#22211010)

It sounds like this is proposing something along the lines of Asimov's Three Laws of Robotics, only instead of not being able to harm humans at all, they're able to harm humans only in an ethical manner.

Instead of sending human soldiers into Iraq who are able to go crazy and kill civilians, you could send in a robot that wouldn't have emotional responses. Instead of having VA hospitals filled with injured people, you could have dangerous assignments handled by robots that are replaceable.

However, there's too much potential for abuse for me to feel comfortable about this. As the gap between the weapons available to citizens and the weapons available to the government widens, the ability for the government to abuse its own citizens grows.

Re:What's the point? (1)

The One and Only (691315) | more than 5 years ago | (#22211176)

I don't know. I think robotic armies would completely eliminate the horrors of war. Either you go to war with another country with a robot army (in which case you have a protracted war of production, same as any war between world powers since 1914 except with no human lives lost in the process) or you totally overpower the enemy (meaning they immediately surrender). Now, it would suck if the wrong people had robots, but war would be a remarkably tidy business.

Re:What's the point? (4, Interesting)

ccguy (1116865) | more than 5 years ago | (#22211336)

This assumes that once you have destroyed your opponent's robotic army you are done. More likely, though, after the robots will come the humans, so in the end you are going to lose both.

Besides, I still fail to see why a country which is likely to lose in the robotic war would accept these rules, when it makes a lot more sense to attack the other country's civil population - which in turn might reconsider the whole thing.

Fighting from the sofa is one thing, having bombs exploding nearby is quite different.

Re:What's the point? (2, Insightful)

SkelVA (1055970) | more than 5 years ago | (#22211270)

Besides, if your enemy expects your robots to defeat their army, what would be the point of fighting them in the first place? Attacking civilians seems a more logical step (I don't think it's reasonable to demand any country at war not to attack only military targets where there's none that can't be replaced easily).
I think we just saw the thought process that bred guerrilla warfare (or terrorism, depending on your point of view). I'll make the logical leap.

Besides, if your enemy expects your highly-trained, well-financed, well-organized US military to defeat their army, what would be the point of fighting them in the first place? Attacking civilians seems a more logical step
Guess what. We've already reached the point you fear (at least from the point of view of most of the western world and the larger military powers). Robots augment armed forces that already have overwhelming force. They're not going to be creating a military where there was none.

To use a contemporary example, Iran isn't going to pump out a bunch of robots and all of the sudden have an armed forces capable of withstanding the US's in a conventional war. As per the logical process in the quotes though, you don't necessarily have to destroy the other side's army (or robots).

Re:What's the point? (0)

Anonymous Coward | more than 5 years ago | (#22211448)

"guerrilla warfare (or terrorism, depending on your point of view)."
There's no point of view about it. If you're using hit and run tactics against another military, you're fighting a guerrilla war. If you're specifically and knowingly targeting civilians, you're engaging in terrorism.

Re:What's the point? (1)

Joe_in_63640 (1228646) | more than 5 years ago | (#22211324)

Someone HAS to say this: (so sue me for being obvious) - Kyle Reese: "New... powerful... hooked into everything, trusted to run it all. They say it got smart, a new order of intelligence. Then it saw all people as a threat, not just the ones on the other side. Decided our fate in a microsecond: extermination."

Re:What's the point? (1)

kabocox (199019) | more than 5 years ago | (#22211332)

Besides, if your enemy expects your robots to defeat their army, what would be the point of fighting them in the first place? Attacking civilians seems a more logical step (I don't think it's reasonable to demand any country at war not to attack only military targets where there's none that can't be replaced easily).

Well, given that you have the tech to make soldier death bots, let's also add in the tech to make police bots. You may not be able to properly man customs and police stations with morally upright individuals, but you can build/buy a million or two police/soldier bots that can take over and then police most lands for you. You are assuming that just because they send robots off to war that they don't all have personal robot police bots at home/work. If terrorists are found active, add another order of police bots.

Economic Warfare & Gundams (5, Informative)

infonography (566403) | more than 5 years ago | (#22211360)

Consider that robots cost money; the country with more economic power is likely to be the winner in such a conflict. A large part of the U.S.A.'s success in WW2 was the sheer capacity of its factories, which were well defended against attack by distance if nothing else. European nations were under constant attack on their military infrastructure, while American factories were never bombed; even the concept of saboteurs blowing up factories in the States was a ridiculous notion to the Axis. Sure, blow up the Pittsburgh bomb factory; you still have 20 more scattered about the US.

Robots won't be used simply because a robot can't discriminate between who to attack and who not to. Despite Orwellian fantasies, the practical upshot is that you would suffer too much friendly fire from such weapons, and intense PR backlash. Sorry, I don't see it happening.

Telepresence weapons are far more likely, as we have already seen in use.

Japan's Ministry of Agriculture [animenewsnetwork.com] has been denying their work on this. America is full of fully trained pilots for these crafts (Wii, Xbox, Playstation etc).

Suggested reading: Robert A. Heinlein's Starship Troopers and Robert Asprin's Cold Cash War

Re:What's the point? (2, Funny)

TheNarrator (200498) | more than 5 years ago | (#22211510)

The generals will also get to blame collateral damage on bugs in the software.
For instance:
"Oh yeah the flame thrower robot went crazy and torched the entire village because some guy at Lockheed put a semicolon on the end of a for loop. Oops, we'll have to fix that in the next rev".

Re:What's the point? (2, Insightful)

MozeeToby (1163751) | more than 5 years ago | (#22211514)

It is well that war is so terrible -- lest we should grow too fond of it.

Robert E. Lee

Re:What's the point? (4, Insightful)

TheRaven64 (641858) | more than 5 years ago | (#22211542)

Obviously a country that can send robots instead of soldiers to fight is way more likely to become 'war happy' - so I'm not sure this robot thing is a good idea at all.
Not necessarily. One of the big reasons the USA lost in Vietnam was that it became politically unacceptable to have body bags coming home. The current administration found a solution to that; ban news crews from the areas of airports where the body bags are unloaded.

Beyond that it's just a question of economics. It costs a certain amount to train a soldier. Since the First World War, sending untrained recruits out to fight hasn't been economically viable, because they get killed too quickly (often while carrying expensive equipment). A mass-produced robot might be cheaper, assuming the support costs aren't too great. If it isn't, then the only reason for using one would be political.

Re:What's the point? (1)

Bill_the_Engineer (772575) | more than 5 years ago | (#22211572)

Obviously a country that can send robots instead of soldiers to fight is way more likely to become 'war happy' - so I'm not sure this robot thing is a good idea at all.

Don't worry there is still the nuclear option.

Seriously, I think the same ethics behind nuclear warfare applies to robotic warfare. Both kill people from a distance, just one of them is slower at it than the other.

As for becoming 'war happy', this is where the theory of Mutual Assured Destruction (MAD) applies. A country would not want to attack another country with robots (or anything else) for fear of retribution from the target or its allies using similar methods.

Hell I'm worried about the prospects of a nuclear war, so the thought of adding killer robots to the mix doesn't add much more to my anxiety levels (other than robots being cheaper).

Wow (1)

sigzero (914876) | more than 5 years ago | (#22210842)

The primary goal remains to enforce the International Laws of War in the battlefield in a manner that is believed achievable, by creating a class of robots that not only conform to International Law but outperform human soldiers in their ethical capacity.
Good luck with that!

Re:Wow (1)

khallow (566160) | more than 5 years ago | (#22210980)

Oh, I'm sure in some rosy transhumanist future, this can be done. My take is that the robots programmed to follow international law won't be the problem. It'll be the robots which they don't even bother to try to program to follow international law.

Re:Wow (1)

explosivejared (1186049) | more than 5 years ago | (#22211076)

Outperforming human soldiers in their ethical capacity is not a lofty goal. Look up Unit 731 and you'll see what I'm talking about.

That being said, the problem that this treatise tries to address is not one confined to the battlefield. It's much broader. The battlefield consequences of AI agents are just that, consequences. They come about as a result of the much larger question of creating an artificial intelligence that has an acceptable level of ethics for use in the real world. I'm assuming here that we're talking about AI soldiers making decisions on their own and not based on direct instructions from a human. So, without an overarching set of ethical principles that AI can adhere to in general, battlefield protocols are irrelevant. This is compounded by the fact that at the core, there are still humans at the helm. We all know how well international regulations against war hold up in the real world. Battlefield protocols and regulations exist to give the illusion of the rule of law to what is an otherwise savage and immoral exercise. It's cool that someone is thinking ahead about the ethical implications, though.

Personally, I am very wary of the consequences a move to AI armies will have on the readiness of nations to go to war. War is not good. The fact that one of our first concerns with a new technology is how to implement this best on the battlefield is proof enough that AI agents will have very little trouble "outperforming" their human counterparts.

Re:Wow (1, Insightful)

Hatta (162192) | more than 5 years ago | (#22211194)

Seriously, is this a joke? Ethics and war in the same sentence? War is not ethical, it never will be. Robots are not going to change that.

Re:Wow (0)

Anonymous Coward | more than 5 years ago | (#22211466)

Isn't war maybe a little bit ethical, like for the purpose of stopping the Nazis?

Obligatory (5, Funny)

Anonymous Coward | more than 5 years ago | (#22210848)

I for one welcome our new robotic overlords.

Why is this funny? (1)

Besna (1175279) | more than 5 years ago | (#22211036)

Maybe a script wrote it by detecting some words in the description. Then, however, some bot would have to mod it up. It is this step that makes me wonder. It appears that there is conscious human activity here. Is facetiousness a part of being a geek? Is it possible to really think about these things? Perhaps:

"I will accept these robots as overlords."

Re:Why is this funny? (2, Funny)

4D6963 (933028) | more than 5 years ago | (#22211472)

Maybe a script wrote it by detecting some words in the description.

Oh, damn, I didn't think anyone would figure it out. Well, since you asked, here's how it really works:

root@localhost# memebot.sh --sovietrussia --overlords --beowulf --linux --underpants
memebot 2.4-debian
Copyright (C) 2008 Michel Rouzic
MemeBot is free software, covered by the GNU General Public License, and you are
welcome to change it and/or distribute copies of it under certain conditions.

> Maybe a script wrote it by detecting some words in the description.

Found : 5 results in 0.063 seconds.
1. In Soviet Russia, words in descriptions detect scripts.
2. I, for one, welcome our new humour-making script overlords.
3. Imagine a Beowulf cluster of these!
4. But does it run on Linux?
5. 1. Write a nerd humour generating script. 2. Make it parse and reply anonymously to Slashdot comments. 3. ???. 4. Mod points!!!

Re:Obligatory (0)

Anonymous Coward | more than 5 years ago | (#22211106)

In Soviet Russia, robotic overlords welcome YOU!

Re:Obligatory (0)

Anonymous Coward | more than 5 years ago | (#22211156)

but will it run linux?

They gotta call one "Bender" (1)

Quiet_Desperation (858215) | more than 5 years ago | (#22210850)

As a registered misanthrope, I support anything that kills more people.

"I am KillBot. Please insert human."

Why bother going to war in the first place anymore (3, Insightful)

KublaiKhan (522918) | more than 5 years ago | (#22210858)

If you've got battlebots, why not have one against another to resolve international conflicts, rather than destroy infrastructure and the like?

It'd probably take a mountain of treaties and the like, and of course any organization used to judge the battlebot contest would be ripe for corruption and whatnot, but it couldn't be that much worse than what happens around the World Cup and the Olympics...

Re:Why bother going to war in the first place anym (5, Insightful)

moderatorrater (1095745) | more than 5 years ago | (#22210932)

If you've got battlebots, why not have one against another to resolve international conflicts, rather than destroy infrastructure and the like?
We've already built structures to solve international conflicts, and it works extremely well when the two sides are willing to work through those structures. The US doesn't need battlebots to deal with European powers, because both sides are willing to talk it through instead. However, when Iraq refuses to cooperate, or the Arabs in Israel refuse to cooperate, the procedures break down and you're left with two countries that can't reach an agreement without raising the stakes.

In other words, for those countries willing to abide by a mountain of treaties, the problem's already solved. It's the other countries that are the problem, and they're unlikely to resolve their differences like this anyway.

Re:Why bother going to war in the first place anym (3, Insightful)

sammyF70 (1154563) | more than 5 years ago | (#22211240)

How well this works, and how willing the US is to talk to .. well .. ANYBODY .. can be seen in archive footage of the UN meetings prior to the latest Iraq invasion.
If you decide to resolve wars using only bots (or even by playing out a virtual video-game-like war), my bet is that one of the sides will realize it can actually physically attack its opponent while the opposing side is still arguing that the random number generator used is unfair.
Add to that that what you generally want are the natural resources of the country you're invading, and that people are expendable; I'd guess that robots would be programmed to leave vital assets intact and wipe out the humans, instead of doing it the other way around. After all, you can run an oil refinery with a few hundred people, and it costs much more to rebuild it after the war than to just fly in a few workers to operate it.

There is nothing civilized about war and hoping for fair behaviour on either side is hopelessly optimistic.

Re:Why bother going to war in the first place anym (1, Troll)

meringuoid (568297) | more than 5 years ago | (#22211296)

However, when Iraq refuses to cooperate, or the Arabs in Israel refuse to cooperate,

The Arabs in Israel? I thought it was the Arabs outside Israel who were the problem. Hamas causing bother in the Occupied Territories and all that. The Arabs in Israel itself, I haven't heard that they're such a big problem.

Unless of course you have an unusually broad definition of what constitutes Israel?

Re:Why bother going to war in the first place anym (2, Insightful)

Splab (574204) | more than 5 years ago | (#22211354)

You must be American to have such a screwed up view of what's going on in Iraq and Israel.

And talk it through? Since when did Americans start to respect any treaty that didn't put them in a favorable view? Building a robot army is just the next logical step in alienating the rest of the world.

What if they programmed a war,and nobody logged in (1)

Quiet_Desperation (858215) | more than 5 years ago | (#22210940)

Take it one step further and virtualize the whole thing.

Hell, you can use software from the 1960's. [wikipedia.org]

Re:What if they programmed a war,and nobody logged (2, Interesting)

Sponge Bath (413667) | more than 5 years ago | (#22211254)

Or use an alternative 1960's solution. [startrek.com]

Re:Why bother going to war in the first place anym (1)

Fx.Dr (915071) | more than 5 years ago | (#22210942)

Enter the horrendous movie Robot Jox [imdb.com] . But hell, if settling international and territorial disputes means I get to pilot one of those bad boys, sign me up!

Re:Why bother going to war in the first place anym (0)

Anonymous Coward | more than 5 years ago | (#22210944)

Because there's no incentive for both sides to use your battlebot contest. If I've got a bigger, better army than you, why would I cripple myself by agreeing to your robot competition? If I know my robots are crappy, why wouldn't I just take my chances fighting an asymmetric war?

Re:Why bother going to war in the first place anym (1)

SailorSpork (1080153) | more than 5 years ago | (#22210998)

Robots from each country fighting each other instead of sending soldiers to war? Wasn't that the premise of G Gundam [wikipedia.org] ? Will my dream of finally being able to pilot Mexico's Tequila Gundam [mahq.net] finally come true?

Re:Why bother going to war in the first place anym (1)

Sangui (1128165) | more than 5 years ago | (#22211110)

Not really. G Gundam's tournament was to prevent war from ever happening, because every country was so focused on having their Gundam win the competition that they put all of their time into making THEIR Gundam as good as it could be. And don't forget, there were still PEOPLE inside the robots. The Gundams in that series were as close to exoskeletons as in any Gundam series, because each one moved as you moved: you weren't using levers and buttons, you were actually punching, kicking and running. But just remember: THIS HAND OF MINE GLOWS WITH AN AWESOME POWER. ITS BURNING GRIP TELLS ME TO DEFEAT YOU. TAKE THIS! MY LOVE, MY ANGER, AND ALL OF MY SORROW! BURNING FINGER SWORD!

Re:Why bother going to war in the first place anym (4, Insightful)

The One and Only (691315) | more than 5 years ago | (#22211030)

War is what happens when treaties stop working. You can't have a treaty for some other competition to replace war--if that was the case, FIFA would have replaced the UN by now and Brazil would be a superpower. The purpose of war is to use force in order to impose your will on the enemy, whoever those people may be. The idea is, after your robots destroy the enemy's robots, they will continue to destroy the enemy's infrastructure and population until they give up.

Re:Why bother going to war in the first place anym (1)

imgod2u (812837) | more than 5 years ago | (#22211196)

While I agree that it will not be a purely robot vs robot war, the idea is that since robots are expendable, less collateral damage will be necessary. That is, you won't have a "shoot-and-ask-questions-later" mentality because you can afford to have some robots get blown up by the other side if it meant not shooting innocent civilians.

The robots would often have to subdue humans, of course, but this can be done through non-lethal means. What battlebots gives is the ability to selectively use non-lethal force to make your opponent surrender rather than devastating lethal force. You need not even go after the infrastructure. Send in a million battlebots. Maybe half get destroyed. The other half subdues the enemy using non-lethal force. It takes longer to sway dissenters with non-lethal force but it also helps win the conquered population over a lot better if none of them are killed and their buildings, homes and daily lives still remain the same after the conquest.

The problem, of course, comes from when the guys controlling the robots decide that they should remain in control forever... /Or the robots themselves decide to take over //But then again, we'd get hot Summer Glau robots ///Welcomes hot Summer Glau overlords....

Re:Why bother going to war in the first place anym (1)

SiliconEntity (448450) | more than 5 years ago | (#22211230)

I would suggest that it will work out a little differently. Once battlebots become superior to human soldiers in warfighting ability, most battles will be between bots, with relatively few humans involved. This is simply because the bots will be the superior fighting force, and deployed preferentially by both sides. Only once one side's bot army is defeated would the war become bots against humans, and in that case the losing side would typically surrender rather than face a massacre of its population.

Re:Why bother going to war in the first place anym (0)

Anonymous Coward | more than 5 years ago | (#22211294)

ONLY in this tiny, idealistic, naive, technologically-biased corner of the universe is a comment like yours judged as being "insightful." In absolutely any other context, you'd be shouted down almost instantly, or at least as soon as the first guy wipes the tears of laughter from his eyes well enough to see the keyboard. If written out, the list of even the most obvious objections to your "idea" may be visible from space.

Re:Why bother going to war in the first place anym (1)

kabocox (199019) | more than 5 years ago | (#22211446)

If you've got battlebots, why not have one against another to resolve international conflicts, rather than destroy infrastructure and the like?

It'd probably take a mountain of treaties and the like, and of course any organization used to judge the battlebot contest would be rife for corruption and whatnot, but it couldn't be that much worse than what happens around the World Cup and the Olympics...


Um, we'd use the existing way. That would mean that we'd go to war and find out which side has the better/best bots. It could also show whether it's better to build a million cheap bots rather than 10,000 expensive ones. It may be cheaper over the long run for your government to always have a few hidden stockpiles of a couple million cheap bots, just in case a war ever breaks out, so that you'd have some front-line cannon fodder before the new guys show up.

You know, one aspect that you completely forgot to think about was an entire new sport: Battle Bots, the real-time action game where you log in and help your country take over someone else's country/defend against those nameless evil foreigners.

Re:Why bother going to war in the first place anym (1)

crakbone (860662) | more than 5 years ago | (#22211468)

This would work really well; of course we would have to beat everyone into submission to get them to agree to it first, but it would eventually work. I personally have never had any luck convincing an insane person with rationality. Also, I think you will find that a lot of warlords out there already value their people less than a battle bot. They would most likely prefer to send 100,000 "replaceable" people (that they can't feed because they needed that new bunker/humvee) than lose the cost of one of these http://en.wikipedia.org/wiki/BattleBots [wikipedia.org] .

Send the interested parties over instead. (1)

iknownuttin (1099999) | more than 5 years ago | (#22211488)

If you've got battlebots, why not have one against another to resolve international conflicts, rather than destroy infrastructure and the like?

I'd rather send the humans whose interests are best served by a war to fight each other. For example, if someone who would profit if a dictatorship in the Middle East were disbanded really wants it to go away, then he should go over and fight said dictator himself - mano a mano. None of this shit of sending young people over to fight for his oil. Especially when they joined the military to protect their country and maybe to pay for an outrageously overpriced college education: no one signed up to fight for oil.

Let's save the resources for robots used for productive purposes: medicine, ending poverty, improving education, food .....

I for one... (1)

Cheezymadman (1083175) | more than 5 years ago | (#22210876)

...welcome our new robot warlords!

Can't be that hard (1, Flamebait)

Nursie (632944) | more than 5 years ago | (#22210888)

"creating a class of robots that not only conform to International Law but outperform human soldiers in their ethical capacity"

Sounds easy to me.

Rule 1 - Don't abuse prisoners.

There, we already have a machine that outperforms humans.

Re:Can't be that hard (1)

meringuoid (568297) | more than 5 years ago | (#22211114)

Sounds easy to me. Rule 1 - Don't abuse prisoners. There, we already have a machine that outperforms humans.

Actually, I think that's the hardest part. Programming a robot to go out and blow shit up isn't such a difficult problem. Programming a robot to recognise when a human adversary is surrendering and to take him prisoner - I don't really know where you'd begin. It's the ED-209 problem: the shooting works fine, the trouble is deciding whether or not you actually ought to do so.

I'd guess what they're aiming for as a benefit in ethical capacity is that a robot does not feel anger. A robot won't get trigger-happy. It won't fire because it's afraid and it won't fire because it's bored and it won't fire because it's seen its friends blown up earlier on in the day. If there's been a ceasefire, the robot will sit still and do nothing even if that means its own destruction by the locals.

Whether the robot can identify friend from foe will be its main problem. But then, it's being built by Americans, so standards there won't be so high. 'Unit insignia: blue background with diagonally broken red and white asterisk across centre... FIRE!'

Same problem... (3, Interesting)

C10H14N2 (640033) | more than 5 years ago | (#22211212)


I've always wondered how HAL or Joshua would interpret:

Rule 1: Kill enemy combatants.
Rule 2: Do not kill or abuse prisoners.

"Take no prisoners, kill everything that moves" would be the most efficient means of satisfying both, especially after friendly-fire ensues.
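The loophole described above can be made concrete with a toy rules engine (a purely illustrative Python sketch; the function and rule names here are made up, not from the paper or from any real targeting system):

```python
# Illustrative only: a naive "check every rule" engine for engagement
# decisions. Each rule is a predicate that must permit the action.

def may_engage(target, rules):
    """Return True if every rule permits engaging this target."""
    return all(rule(target) for rule in rules)

# Rule 1: engagement is allowed only against enemy combatants.
rule_combatants_only = lambda t: t["status"] == "combatant"
# Rule 2: engagement is never allowed against prisoners.
rule_spare_prisoners = lambda t: t["status"] != "prisoner"

rules = [rule_combatants_only, rule_spare_prisoners]

soldier = {"status": "combatant"}
captive = {"status": "prisoner"}
print(may_engage(soldier, rules))  # True
print(may_engage(captive, rules))  # False

# The gap: nothing obliges the system to ever *transition* a
# surrendering combatant into the "prisoner" state. If surrender is
# never recognized, Rule 2 is never triggered, so "take no prisoners,
# kill everything that moves" satisfies both rules vacuously.
surrendering = {"status": "combatant", "hands_up": True}
print(may_engage(surrendering, rules))  # True -- the ED-209 problem
```

The sketch shows why the hard part is the state-transition (recognizing surrender), not the rule checking itself.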

Clones vs. Droids (1)

eviloverlordx (99809) | more than 5 years ago | (#22210898)

I would have figured that they would skip robots and go directly to clones.

Seriously, though, this sounds like an AI issue rather than a robot issue. If the robot is controlled by some guy with a joystick back at headquarters, you really haven't changed anything. If it's self-controlled, then you have to take into account that there will be bugs, and eventually some milbot will massacre a village somewhere.

Political Ethics... (4, Interesting)

RyanFenton (230700) | more than 5 years ago | (#22210920)

Yes. Superior robotic ethics. A regular Gandhi-bot, saving only those who are threatened, willing to die rather than kill in doubt.

That's all well and good... but what of the men who send these robots into battle? What happens to their sense of ethics? Do they begin to believe that sending troops in to pacify a landscape over political differences is a morally superior action? Do they begin to believe that death-by-algorithm is a morally superior way of dealing with irrational people?

There's an endless array of rationalizations man can make for war, and for the subjugation of those who disagree with him. Taking the cost of friendly human lives out of the equation of war, and replacing it with an autoturret enforcing your wishes, doesn't make for a 'morally superior' political game. For many, it would make for an endgame in terms of justifying a military police as the default form of political governance.

Ryan Fenton

Re:Political Ethics... (2, Interesting)

drijen (919269) | more than 5 years ago | (#22211140)

If I recall, one of the Gundam Wing anime series dealt with the question of robots in war. It pointed out the most critical question of all:

War is about sacrifice, cost, and essentially fighting for what you believe in, hold dear, and WILL DIE to preserve. If you remove the *human* cost from war, then where is the cost? What will it mean if no-one dies? Will anyone remember what was fought for? Will they even recognize why it was so important in the first place?

Also, if we have mass armies of robots, won't the victor simply be the one with the most natural resources (metal, power, etc) to waste? (Better weapons technology aside)

Re:Political Ethics... (1)

The One and Only (691315) | more than 5 years ago | (#22211274)

That sounds exactly as insightful as I thought it would once you said "Gundam".

Also, if we have mass armies of robots, won't the victor simply be the one with the most natural resources (metal, power, etc) to waste?

War is already based on production and logistics, and has been since the Industrial Revolution.

Re:Political Ethics... (4, Insightful)

meringuoid (568297) | more than 5 years ago | (#22211406)

War is about sacrifice, cost, and essentially fighting for what you believe in, hold dear, and WILL DIE to preserve. If you remove the *human* cost from war, then where is the cost? What will it mean if no-one dies? Will anyone remember what was fought for? Will they even recognize why it was so important in the first place?

Bullshit. War is about taking orders, fighting for what someone else believes in, and then getting blown up. Dulce et decorum est pro patria mori and all that shite. That poetic nonsense you spout there is just part of the cultural lie that sells war as romantic and idealistic to every generation of young fools who sign up and go out there to put their lives on the line for the sake of the millionaires. You got it from anime, too... how sad is that? You're buying the same line of bullshit that inspired the damn kamikaze! Clue: Bushido is a lie. Chivalry is a lie. War is about nothing but power.

Also, if we have mass armies of robots, won't the victor simply be the one with the most natural resources (metal, power, etc) to waste? (Better weapons technology aside)

Yes. How does that differ from the present situation?

Re:Political Ethics... (1)

zakone (1227236) | more than 5 years ago | (#22211206)

Well said. Robots or not, it won't help much in doing away with bloodthirsty dictators and "collateral damage".

"The release of atomic energy has not created a new problem. It has merely made more urgent the necessity of solving an existing one." - Einstein

Too easy to counter (2, Funny)

Sciros (986030) | more than 5 years ago | (#22210960)

My Apple comp00tar will just upload a virus wirelessly to them and they will all shut down! I've seen it done!

Re:Too easy to counter (0)

Anonymous Coward | more than 5 years ago | (#22211118)

Oh yeah! Well my robotic warfighter will ethically kick your Apple's ass!

this just in (1)

nude-fox (981081) | more than 5 years ago | (#22210968)

might makes right

Someone once said (0)

Anonymous Coward | more than 5 years ago | (#22210974)

I cannot remember who said this, but let me paraphrase it. They defined violence as the distance and magnitude of pain that you can inflict on another person. For example:

A knife is more violent than a fist
A baseball bat is more violent than a knife (bludgeoning vs stabbing)
A gun is more violent than a blunt object
A rocket or tank is more violent than a gun.

By their logic, then, a satellite-controlled robot that can kill people from thousands of miles away would be even more violent than conventional warfare. Where do we go from there? Our robots versus their robots? Or how about taking a cue from Star Trek TOS: we just let a computer simulate an atomic war, and after it determines the casualties, we send random portions of each side's population to death booths.

Here's a better idea. Lets stop fucking killing each other! End the retarded circle of violence.

(OT) ending the circle of violence? (0, Offtopic)

khallow (566160) | more than 5 years ago | (#22211072)

Here's a better idea. Lets stop fucking killing each other! End the retarded circle of violence.
Would be nice. How do you propose to do that?

Re:(OT) ending the circle of violence? (1)

fair_n_hite_451 (712393) | more than 5 years ago | (#22211208)

Obviously by killing everyone who disagrees. Or looks different. Or believes in a different mythos than you...

Re:(OT) ending the circle of violence? (1)

Paul Fernhout (109597) | more than 5 years ago | (#22211378)

http://breakingranks.net/ [breakingranks.net]
"The purpose of this web site is to discuss the social cost of rankism and to develop a grassroots capacity to defend and protect dignity in everyday life. We hope you will join us in planning and building a world without rankism!"

http://www.whywork.org/rethinking/whywork/abolition.html [whywork.org]
"Clearly these ideology-mongers have serious differences over how to divvy up the spoils of power. Just as clearly, none of them have any objection to power as such and all of them want to keep us working."

http://www.reprap.org/ [reprap.org]
"[RepRap] has been called the invention that will bring down global capitalism, start a second industrial revolution and save the environment..."

http://www.educationanddemocracy.org/FSCfiles/C_CC2a_TripleRevolution.htm [educationa...ocracy.org]
"The fundamental problem posed by the cybernation revolution in the U.S. is that it invalidates the general mechanism so far employed to undergird people's rights as consumers. Up to this time economic resources have been distributed on the basis of contributions to production, with machines and men competing for employment on somewhat equal terms. In the developing cybernated system, potentially unlimited output can be achieved by systems of machines which will require little cooperation from human beings. As machines take over production from men, they absorb an increasing proportion of resources while the men who are displaced become dependent on minimal and unrelated government measures--unemployment insurance, social security, welfare payments. These measures are less and less able to disguise a historic paradox: That a substantial proportion of the population is subsisting on minimal incomes, often below the poverty line, at a time when sufficient productive potential is available to supply the needs of everyone in the U.S."

http://en.wikipedia.org/wiki/War_Is_a_Racket [wikipedia.org]
"War is a racket. It always has been. It is possibly the oldest, easily the most profitable, surely the most vicious. It is the only one international in scope. It is the only one in which the profits are reckoned in dollars and the losses in lives. A racket is best described, I believe, as something that is not what it seems to the majority of the people. Only a small 'inside' group knows what it is about. It is conducted for the benefit of the very few, at the expense of the very many. Out of war a few people make huge fortunes."

I used to hang out at the robot labs at CMU in the 1980s. What frightened me most about the whole business of military robotics (as well as "mind children") was a combination of arrogance and incompetence (not that I haven't been guilty of both at times myself), which in this area is as likely as not to lead to robotic cockroaches that take over the earth (exterminating humankind incidentally) and then all die off. :-)

If robots that kill autonomously is the answer, you're asking the wrong question.

Anyone remember Robocop? (1, Interesting)

Anonymous Coward | more than 5 years ago | (#22211002)

Anyone remember Robocop?

Robocop: "Directive 4: Classified."
Dick: "You can't kill me. Any attempt to arrest a senior OCP employee results in shutdown."
CEO of OCP: "You're fired!"
Robocop: "Thank you." *BLAM* (throws Dick, who is no longer an employee, out the window.)

> The transformation of International Protocols and battlefield ethics into machine usable representations and real-time reasoning capabilities for bounded morality using modal logics.

Killbot: "I am unable to target that school full of unarmed children."
Private Skippy: "Now they're armed." (tosses a handgun and a magazine into the classroom.)
Killbot: "Thank you." *BLAM* (incinerates armed terrorists who illegally took over what was once a school, but which is now a legitimate military target)

The biggest question... (4, Funny)

jockeys (753885) | more than 5 years ago | (#22211044)

Do these killbots have a preset kill limit? Can they be defeated by sending wave after wave of your own men at them?

Re:The biggest question... (4, Funny)

Sciros (986030) | more than 5 years ago | (#22211074)

It's probably MAX_INT.

Re:The biggest question... (1)

Joe the Lesser (533425) | more than 5 years ago | (#22211580)

"When I'm in command, every mission is a suicide mission!" -Zapp Brannigan

Ethical Warbots? (1)

bughunter (10093) | more than 5 years ago | (#22211116)

It seems sort of oxymoronic.

I mean, if you program the robots with Asimov's Laws of Robotics [wikipedia.org] , then what's the problem?

Robot on Robot violence?

Conscientiously objecting robots?

Or - the horror - formulation of a "Zeroth Law [wikipedia.org] "?

Natalie Portman Robot (5, Funny)

letchhausen (95030) | more than 5 years ago | (#22211130)

When are they going to stop using robots for evil and start using them for good? I want a Natalie Portman "pleasure model" robot and I want it now! Science has lost its way.....

So what you're really saying is... (4, Funny)

riseoftheindividual (1214958) | more than 5 years ago | (#22211408)

...you want robots to make love and not war.

I have bad news for the war ethicists (3, Insightful)

Overzeetop (214511) | more than 5 years ago | (#22211186)

Wars are won by those who do not follow the "rules." There are no rules in war. If there were, there would be a third party far more powerful than either side who could enforce said rules. If there were, then that power could enforce a solution to the conflict that started the war, and there would be no need for war. Said power would also not need to answer to anyone, and would be exempt from said rules (having no one capable of enforcing them).

Some points (1)

Besna (1175279) | more than 5 years ago | (#22211264)

There may still be a need for war. The power can only enforce certain rules.

Just a small step (1)

MrCopilot (871878) | more than 5 years ago | (#22211188)

Inching ever closer to the inevitable Robot Jox [imdb.com]

Re:Just a small step (1)

wilder_card (774631) | more than 5 years ago | (#22211322)

Cool movie, but I actually preferred Crash and Burn [imdb.com] . I'm a sucker for cute female hackers who stumble across abandoned giant robots, I guess.

Robots will never be ethical (1)

trybywrench (584843) | more than 5 years ago | (#22211214)

I don't see how a robot will ever be able to solve an ethical dilemma on the battlefield. It's such a subjective, context-sensitive issue that we're centuries away from being able to handle it digitally. Another thing: I don't think the planet will ever agree to some "let the robots fight it out" type of warfare either. When a country's robot force is beaten, they're not going to just throw up their arms and say "good match, sir; here, take our land/resources, you won them fair and square." They'll still resist to the point of sacrificing their lives.

For the foreseeable future a human will be at the trigger somewhere in the line. Maybe targets get pushed en masse to central command for "destruction approval" and it's just a mouse click, then *kaboom*, but a human will always be in the chain.

I think over the next 100 years we'll see more UAV-type combat robotics, where an aircraft (or tank) is largely autonomous but gets fire commands and mission details from a central human controller. I've wondered why strategic bombers aren't controlled like this already. Why not send 20 semi-autonomous F-16s on a bombing mission rather than 2 or 3 very expensive, but safe, B-2s? Maybe you lose one or two F-16s, but you lose no pilots. Overall I think it would be cheaper. (I'm aware that an F-16 is not a bomber, but there are certainly plenty of them around and they can carry JDAMs just like any other aircraft.)

Ethics...blah (1)

TNTSoggy (1104133) | more than 5 years ago | (#22211220)

I don't care much about the ethics of war. If someone tries to kill me/my people, then I think I have the right to destroy them any way I see fit. No, the problem I see with this is that robots are controlled by computers, and computers can be hacked. Also, you can bet that the military would end up spending half a billion for each robot.

Re:Ethics...blah (0)

Anonymous Coward | more than 5 years ago | (#22211482)


I don't care much about the Ethics of war, If some one tries to kill me/my people then I think I have the right to destroy them any way I see fit.


Cool. You just captured Islamo-facism AND American Imperialism in a single sentence. (OK, two sentences if you fix the comma splice)

Wave after wave (1)

bonkeydcow (1186443) | more than 5 years ago | (#22211244)

No problem we will just send wave after wave of our own citizens at them until they reach their preset kill limit.

stoned robots (0)

Anonymous Coward | more than 5 years ago | (#22211246)

can robots smoke marijuana legally?

would robots enforce marijuana laws?

have you ever seen a robot... ON WEED?

First rule of roboethics (1)

Arthur B. (806360) | more than 5 years ago | (#22211260)

Kill all humans!

I have some good news and some bad news. (4, Insightful)

fuzzyfuzzyfungus (1223518) | more than 5 years ago | (#22211268)

The question of making lethal robots act ethically is far easier in some ways than doing so with humans and far harder in others. On the plus side, robots will not be subject to anger, fear, stress, desire for revenge, etc. So they should be effectively immune to the tendency toward taking out the stress of a difficult or unwinnable conflict on the local population. On the minus side, robots have no scruples, probably won't include whistleblowing functions, and will obey any order that can be expressed machine-readably.

The real trick, I suspect, will not be in the design of the robots; but in the design of the information gathering, storage, analysis, and release process that will enforce compliance with ethical rules by the robot's operators. As the robots will need a strong authentication system, in order to prevent their being hijacked or otherwise misused, the technical basis for a strong system of logging and accountability will come practically for free. Fair amounts of direct sensor data from robots in the field will probably be available as well. From the perspective of quantity and quality of information, a robot army will be the most accountable one in history. No verbal orders that nobody seems to remember, the ability to look through the sensors of the combatants in the field without reliance on human memory, and so on. Unfortunately, this vast collection of data will be much, much easier to control than has historically been the case. The robots aren't going to leak to the press, confess to their shrink, send photos home, or anything else.

It will all come down to governance. We will need a way for the data to be audited rigorously by people who will actually have the power and the motivation to act on what they find without revealing so much so soon that we destroy the robots' strategic effectiveness. We can't just dump the whole lot on youtube; but we all know what sorts of things happen behind the blank wall of "national security" even when there are humans who might talk. Robots will not, ever, talk; but they will provide the best data in history if we can handle it correctly.
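The "accountability comes practically for free" point above can be sketched as a tamper-evident log: each robot action commits to the hash of the previous entry, so any after-the-fact alteration of the record is detectable. This is a minimal illustrative Python sketch of that general idea (the function names and entry format are my own assumptions, not anything from the report):

```python
import hashlib
import json

def append_entry(log, event):
    """Append an event to a hash-chained log. Each entry commits to the
    previous entry's hash, so altering any past entry breaks every
    subsequent link."""
    prev = log[-1]["hash"] if log else "0" * 64
    body = {"prev": prev, "event": event}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({"prev": prev, "event": event, "hash": digest})
    return log

def verify(log):
    """Recompute every link; return True iff no entry was altered."""
    prev = "0" * 64
    for entry in log:
        body = {"prev": prev, "event": entry["event"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != digest:
            return False
        prev = entry["hash"]
    return True

log = []
append_entry(log, {"order": "patrol sector 7"})
append_entry(log, {"weapon_discharge": False})
assert verify(log)

log[0]["event"]["order"] = "engage"  # tamper with the record
assert not verify(log)
```

Of course, a hash chain only proves the log wasn't edited; who gets to read it is exactly the governance problem the comment describes.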

I, for one, welcome our ethical robot overlords (2, Insightful)

amasiancrasian (1132031) | more than 5 years ago | (#22211280)

I see a lot of implementation problems before even getting involved with the ethical issues. I mean, there are the usual friend-or-foe IDing issues. Then there's the problem of getting the software to recognise a weapon. If you program it to recognise the shape of an AK, it'll pick up replicas or toys or, heck, lots of stuff that looks vaguely gun-shaped. And the enemy will simply resort to distorting the shape of the weapon, which can't be hard to do. Given that it will be a while before AI technology improves, it doesn't seem any more effective than a remote-controlled car. And as far as the legal issues go, this seems like skirting the boundaries, and definitely violating the spirit, if not the letter, of the law.

War: What is it good for? (1)

phreakincool (975248) | more than 5 years ago | (#22211292)

Building better Gundams!

In other news ... (1)

powerlord (28156) | more than 5 years ago | (#22211306)

WOPR [wikipedia.org] wants to know if you'd like to play a nice game of Chess.

The Matrix [wikipedia.org] is reported as coming on-line shortly to service your needs.

SkyNet [wikipedia.org] remains unavailable for comment.

Great, BotNets of "commandeered" Warbots! (1)

Zymergy (803632) | more than 5 years ago | (#22211328)

I hope we run them on Linux or some other software much more difficult to compromise. Then again, human error is ever present in code this complex.

A few movie plots come to mind about commandeered war robots, but far too many movie plots focus on robotic AIs wising up and going after their human overlords: Blade Runner, T1-T3, Runaway, Red Planet, A.I. (to a lesser degree), Lost in Space, etc... Help me out here if you want, as I am sure I missed many others.

Hypocrisy (0)

Anonymous Coward | more than 5 years ago | (#22211342)

Papers like this are quite hypocritical, for they take for granted that using robots in a war is okay.

As a matter of fact, the very first consequence of using robots in a war (if it ever becomes possible, which is not certain) is that it immensely lowers the price of human life.

This is something that has never happened to such a degree in history (i.e.: I only lose a machine, but the soldiers in front of me lose their *lives*), and this is the question a *really* ethical paper should consider in the first place.

But as far as I saw, the paper we are discussing doesn't spare even one sentence on this question.

Completely pointless (1)

HouseArrest420 (1105077) | more than 5 years ago | (#22211350)

I've not read the article... nor do I plan to. The idea of robot ethics is only good at the onset of robot usage in war. No one wants to see their soldiers slaughtered by a machine, so the minute they can, they're putting robots in as well, in which case robot ethics are completely pointless, because the present Geneva Conventions make no rules regarding the destruction of artificial life forms. Matter of fact, they only defend people like medics, POWs, unarmed combatants, and civilians, along with some other situation-dependent confrontations.

WMD (1)

Fuzzums (250400) | more than 5 years ago | (#22211366)

Maybe those robots know not to attack when non-existing WMD are an excuse for war... :s

War Games, Literally (1)

webword (82711) | more than 5 years ago | (#22211386)

So, why not settle all disputes using video games? That's what it'll come down to, unless the 'bots literally do all the fighting without any human interaction. In a way, Ender's Game [answers.com] gets at this point.

Enemies will just game the AI's behavior (1)

finlandia1869 (1001985) | more than 5 years ago | (#22211480)

And what happens as soon as the enemy figures out how to take advantage of the "international laws" of warfare to beat the robots? Because I certainly can't think of any case in which humans have figured out how to predict and use an AI's behavior against it. Or, better yet, wait until the first time a robot on patrol duty shoots some kid in front of his mommy and a cameraman. It will happen, because some combination of behavior and circumstances will inevitably combine to create an input that the robot's programming interprets as "kill." What are you supposed to say? Oops, there was a bug in the software? That ought to trigger big-time public outcry.

Unless we get some serious leaps forward in AI, I think our best scenario for combat robots is, coincidentally, the best scenario for combat troops: every person you see is known to be hostile and collateral damage is not an issue. Achieve the objective and don't worry about red force casualties or collateral damage. In fact, the more enemy dead the better, because fewer enemy troops will be left to oppose you.

Ethical Outperformance (1)

umbrellasd (876984) | more than 5 years ago | (#22211496)

The primary goal remains to enforce the International Laws of War in the battlefield in a manner that is believed achievable, by creating a class of robots that not only conform to International Law but outperform human soldiers in their ethical capacity.
These people need to read. A basic background in Asimov would tell you exactly what's going to happen here.
  • General: go kill people.
  • Robot: due to my ethical outperformance, peace is the best option. I cannot comply.
  • General: I order you to comply with my directive.

At this point, one of two things happens. Either you have a whole bunch of robots that go into roblock and melt on the battlefield, which results in a huge waste of military budget and a disgruntled citizenry. Soon to be followed by heated discussions in the Congress and Senate, which would result in more fights on the Hill except for the fact that most of the statesmen have been replaced with ethically outperforming robots who go into roblock as well when confronted with the choice of killing contrary representatives or failing to resolve the economic dilemma...

OR, 50% of the robots go berserk out of frustration (since anyone with even a smidgeon of ethics spends about 99% of their time in that state) and start indiscriminately killing everything in sight until their photoreceptors begin processing a color other than red, at which point you have the whole BSG/Terminator/etc. "God, I'm so pissed at my creators for making me ethically superior in every way that they need to f-ing die, one and all--no exceptions."

An ethically outperforming robot would tell us we shouldn't make him. Let's make clean energy and better health and educational programs for all the children in the world instead.

The Laws of Robotics (0)

Anonymous Coward | more than 5 years ago | (#22211556)

They need to have Asimov's Three Laws of Robotics built in to prevent them from harming human beings.

Stupid question : why war ? The answer : (0)

Anonymous Coward | more than 5 years ago | (#22211558)

1) Because violence solves conflicts. If one (ONE, NOT TWO) party in a conflict is not prepared to compromise (like everyone consistently called "radical" these days), then the only option is violence. E.g., as long as Palestinians want to give no land to the Jews, there will necessarily be violence. If one party wants violence and the other doesn't oblige (e.g., what Gandhi did), this will result in more violence (in Gandhi's case 10 million dead), not less. In short, pacifism leads to violence if there is even a single non-pacifist party. Si vis pacem, para bellum.

2) Because you don't have a choice (physically). If you believe in "redivision of riches/land/..." for example, like the Indian cultures of America did, then you HAVE to use violence. Why? Because there's not enough to go around (if there is, natural population increase will take care of that; in the Indians' case that's food. Why? Simple: the only option for a non-agrarian Indian tribe to expand was to kill another, so they regularly killed one another off; in the Muslim case, because there is only one world). In any socialist and/or totalitarian state it will eventually come to this (even though, yes, it might take 500, or even 1000 years; generally it doesn't even take 5. Watch Venezuela in the coming year, or just check Caracas' "crime" rate). If you believe in the nanny state (i.e., you vote for either Hillary or Obama or Ron Paul), this is what you're doing to your children.

3) Because you're psychologically ill.

4) Because you want to. Because you have nothing better to do. (No, I'm not kidding.) E.g., Muslims who want to become "one with Allah" by the Quran's method (to fight, kill and die for Islam as Quran 9:111 commands).

Robots will not change this. At all. Besides, we're a long way from a self-sustaining, self-improving, reproducing war machine, which is what it would take to actually be dangerous.

Didn't they try this in an old Star Trek episode? (1)

GHynson (1216406) | more than 5 years ago | (#22211596)

And people willingly stepped into incinerators because they were in the "kill zone" of the war simulation?
Didn't we learn from this 60's flick?