Slashdot: News for Nerds


Ethical Killing Machines

kdawson posted more than 5 years ago | from the i-for-one-welcome dept.

Robotics 785

ubermiester writes "The New York Times reports on research to develop autonomous battlefield robots that would 'behave more ethically in the battlefield than humans.' The researchers claim that these real-life terminators 'can be designed without an instinct for self-preservation and, as a result, no tendency to lash out in fear. They can be built without anger or recklessness ... and they can be made invulnerable to ... "scenario fulfillment," which causes people to absorb new information more easily if it agrees with their pre-existing ideas.' Based on a recent report stating that 'fewer than half of soldiers and marines serving in Iraq said that noncombatants should be treated with dignity and respect, and 17 percent said all civilians should be treated as insurgents,' this might not be all that dumb an idea."

785 comments

I for one welcome... (5, Funny)

pwnies (1034518) | more than 5 years ago | (#25890291)

...need I say more?

Do they run vista? (5, Insightful)

raymansean (1115689) | more than 5 years ago | (#25890551)

It takes a special set of skills to corrupt a single human being, it takes another set of skills, not that special, to corrupt an entire battalion of robots, that are all identical. Did I mention sharks with lasers?

Re:Do they run vista? (1)

bgspence (155914) | more than 5 years ago | (#25890803)

Never, never, never.

Windows Mobile.

Re:Do they run vista? (4, Insightful)

blhack (921171) | more than 5 years ago | (#25890821)

It takes a special set of skills to corrupt a single human being, it takes another set of skills, not that special, to corrupt an entire battalion of robots

Do you live in a society without Money?
Or women?
Or sports cars?
Or Fancy houses?
Or Gold?
Or "Change" posters?

As far as I know, my computers have never accepted a bribe, or made a power-grab.

Re:Do they run vista? (5, Insightful)

EricWright (16803) | more than 5 years ago | (#25890837)

Ummm... it's not the computers you bribe, it's their programmers.

Re:Do they run vista? (4, Insightful)

blhack (921171) | more than 5 years ago | (#25890901)

Ummm... it's not the computers you bribe, it's their programmers.

AHA! So! How is this any different than humans?

Bribe a human to kill a person (or have their army kill a shitload of people).
Bribe a human to have their robot kill a person (or have their army of robots kill a shitload of people).

I think that the problem is people having misconceptions about robots. They're not sentient. They don't think. They only do what we tell them to. Sure there are horror stories about robots coming to life, but there are also horror stories about dead people coming to life, or cars coming to life.

We need to drop the term "robot".

Re:Do they run vista? (1)

PotatoFarmer (1250696) | more than 5 years ago | (#25890899)

Odds are pretty good your computers have never made a conscious decision either, and simply do as their programming dictates. It's a whole lot easier to corrupt a single piece of software than it is to a) figure out what motivates a group of disparate individuals, and b) exploit those motivations.

Re:I for one welcome... (5, Funny)

philspear (1142299) | more than 5 years ago | (#25890731)

...need I say more?

Yes! It's ambiguous as is. Which were you going to go with?

1. Our ethical killer-robot overlords
2. Our more-benevolent-than-a-human killing machine overlords
3. The impending terminator/matrix/MD geist/1000 other sci-fi themed apocalypse
4. Users who are new to /. who aren't Simpsons fans and don't get this joke
5. Our new ant overlords, since there is no stopping them even with our new murder-bots

Re:I for one welcome... (1)

obyom (999186) | more than 5 years ago | (#25890925)

Why not just flip a coin? Does a field full of opposing robots make any more sense than a field full of opposing soldiers? War no more!

Ethical vs Moral (3, Insightful)

mcgrew (92797) | more than 5 years ago | (#25890293)

"The New York Times reports on research to develop autonomous battlefield robots that would 'behave more ethically in the battlefield than humans.'

Maybe I'm being a bit pedantic here, but "ethics" is a professional code - for instance, it is completely ethical by military codes of ethics to kill an armed combatant, but not to kill a civilian. It is unethical (and illegal) for a medical doctor to talk about your illness, but it's not unethical for me to.

The waterboarding and other torture at Gitmo was immoral; shamefully immoral, but was ethical.

The advantage to a killing robot is that it has no emotions. The disadvantage to a killing robot is ironically that it has no emotions.

It can't feel compassion after it's blown its enemy's arm off. But it can't feel vengeance, either. It's a machine, just like any other weapon.

And like an M-16, its use can either be ethical or unethical, moral or immoral, moral yet unethical or immoral yet ethical.

Re:Ethical vs Moral (3, Informative)

neuromanc3r (1119631) | more than 5 years ago | (#25890433)

"The New York Times reports on research to develop autonomous battlefield robots that would 'behave more ethically in the battlefield than humans.'

Maybe I'm being a bit pedantic here, but "ethics" is a professional code

I'm sorry, but that is simply not true. Look up "ethics" on wikipedia [wikipedia.org] or, if you prefer, in a dictionary.

Re:Ethical vs Moral (3, Informative)

mcgrew (92797) | more than 5 years ago | (#25890845)

It appears that language has evolved (or rather devolved) once again. I looked it up last year and "ethics" and "morals" were two separate things; ethics was a code of conduct ("It is unethical for a government employee to accept a gift of over $n; it is unethical for a medical doctor to discuss a patient's health with anyone unauthorized").

The new Merriam-Webster seems to make no distinction.

As to Wikipedia, it is not an acceptable resource for defining the meanings of words. Looking to wikipedia when a dictionary [merriam-webster.com] is better suited is a waste of energy.

Wikipedia's entry on cataract surgery still has no mention of accommodating lenses. Any time someone adds the CrystaLens to wikipedia, somebody edits it out. Too newfangled for wikipedia, I guess; they only just came out five years ago (there's one in my eye right now).

Morality may have gone out of style, but as it's needed they apparently brought it back under a more secular name. So now that "ethical" means what "moral" used to mean, what word can we use for what used to be "ethics", such as the aforementioned doctor breaking HIPAA rules (ethics), which would not be immoral or unethical for me to do?

Of course uncyclopedia has no ethics, but it does have morals [wikia.com] , virtue [wikia.com] , and medical malpractice [wikia.com] .

Re:Ethical vs Moral (1)

brkello (642429) | more than 5 years ago | (#25890451)

Something that can't be unethical or ethical is probably going to be more ethical than something that is unethical. In other words, if robots are neutral and humans are either evil or good, neutral is more good than evil.

Re:Ethical vs Moral (1, Insightful)

poetmatt (793785) | more than 5 years ago | (#25890651)

That sounds pretty contradictory.

Re:Ethical vs Moral (1)

Ibiwan (763664) | more than 5 years ago | (#25890699)

No, it doesn't!

Re:Ethical vs Moral (1, Offtopic)

lilomar (1072448) | more than 5 years ago | (#25890801)

Look, if I'm going to have an argument with you, then I'm going to have to take up a contradictory position.

Re:Ethical vs Moral (0)

Anonymous Coward | more than 5 years ago | (#25890477)

The waterboarding and other torture at Gitmo was immoral; shamefully immoral, but was ethical.
i think the rest of your post is spot on, but i have to disagree on this bit. i believe the torture was unethical. one might argue that if, by torture, one was able to derive information that would save many other lives, then it could be said to be ethical on those grounds. however, it is my understanding that it's been shown time and again that the intelligence gathered by torture is unreliable and false. thus, torture serves no ethical purpose.

Re:Ethical vs Moral (2, Insightful)

Marxist Hacker 42 (638312) | more than 5 years ago | (#25890481)

Here's where I see the value of a correctly ethical killing machine:

Enforcing a border between two groups of humans that would otherwise be killing each other, and making that border 100% impenetrable.

To do this, you need more than just a simple robot that has a GPS unit and shoots everything that moves within a predetermined rectangle. You need a CHEAP simple robot that has a GPS unit and shoots everything that moves in a predetermined rectangle; cheap enough so that you can deploy them thickly enough that their weapons overlap to two robots over.
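The "shoots everything inside a predetermined rectangle" idea reduces to a point-in-rectangle test on GPS coordinates. A minimal sketch of that check — the function name and the `zone` tuple format are entirely illustrative, and a real deployment would also need datum handling and the weapons-overlap planning described above:

```python
def in_kill_zone(lat, lon, zone):
    """True if a GPS fix lies inside a predetermined rectangle.

    `zone` is a hypothetical (min_lat, min_lon, max_lat, max_lon) tuple
    of decimal degrees; this ignores the edge case of a zone that
    crosses the antimeridian.
    """
    min_lat, min_lon, max_lat, max_lon = zone
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon

# Illustrative coordinates only.
zone = (34.00, 70.00, 34.01, 70.02)
print(in_kill_zone(34.005, 70.010, zone))  # True: inside the rectangle
print(in_kill_zone(33.999, 70.010, zone))  # False: just south of it
```

The cheapness argument is exactly that the per-unit logic can be this dumb: the cost goes into deploying enough overlapping units, not into making any single one clever.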

But it will never be moral.

And a toddler wanders into your field of fire. (2, Insightful)

khasim (1285) | more than 5 years ago | (#25890693)

So, a family is picnicking on a hill overlooking your kill zone.

The toddler gets away and falls down the hill and then wanders into your kill zone.

Is it ethical to kill the toddler?

Machines cannot be ethical because they cannot weigh their choices as more or less ethical.

Re:And a toddler wanders into your field of fire. (1, Insightful)

Anonymous Coward | more than 5 years ago | (#25890787)

Was it ethical for a family to be picnicking near a kill zone?

Re:And a toddler wanders into your field of fire. (5, Funny)

tripdizzle (1386273) | more than 5 years ago | (#25890929)

That is less a question of ethics and more one of stupidity.

Re:Ethical vs Moral (2, Insightful)

b4upoo (166390) | more than 5 years ago | (#25890513)

From a soldier's point of view it is rather easy to understand why all of the population might appear to be an enemy. Often that is an outright fact. Even if the locals happen to like Americans, those locals still must make a living and also appease the fanatical elements in their neighborhood. So the same people that smile and feed you dinner might also make their grocery money smuggling munitions.

This may turn really ugly in the moonscape-like land that borders Pakistan. There is no easy way to dislodge tribal people from that terrain. It could very well be that the only real path to victory is exterminating the entire population. And what else does a soldier seek other than victory?

Re:Ethical vs Moral (1)

sbeckstead (555647) | more than 5 years ago | (#25890895)

A soldier does not seek victory, we seek to complete our mission. If you describe completing your mission as victory that is your problem.

Three Laws of Robotics (1)

Jabbrwokk (1015725) | more than 5 years ago | (#25890525)

And I guess we can give up on Asimov's Three Laws of Robotics [wikipedia.org] ever being anything but science fiction.

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

As you've aptly pointed out, it is very possible to be ethical but immoral at the same time. Asimov's laws would prevent a robot from engaging in immoral activity (the word "injure" has a broad meaning) but would also prevent robots from being used as killing machines. So our choices are either "Bicentennial Man" or battlefield terminators. And I guess the government wants battlefield terminators.

Re:Three Laws of Robotics (1)

LandDolphin (1202876) | more than 5 years ago | (#25890855)

Haven't there been several movies/books that show how the first rule could be warped to allow robots to kill people?

Safe forever from the rock (3, Interesting)

slashnot007 (576103) | more than 5 years ago | (#25890559)

An old cartoon had a series of panels. The first panel had a caveman picking up a rock, saying "safe forever from the fist". The next panel is a man inventing a spear, saying "safe forever from the rock". And so on: swords, bows and arrows, catapults, guns, bombs... well, you get the idea.

On the other hand, the evolution of those items coincided with the evolution of society. For example, you had to have an organized civil society to gather the resources to make a machine gun. (Who mines the ore for the metal? Who feeds the miners? Who loans the money for the mine?)

It's a bit of a chicken and egg about which drives which these days, but certainly early on, mutual defense did promote societal organization.

So "safe forever from the angry soldier" is the next step. It's already happened in some ways with the drone so it's not as big an ethical step to the foor soldier, and given the delberateness with which drones are used compared to the dump and run of WWII bombing one can credibly argue they can be used ethically.

On the other hand, war has changed a bit. The US no longer tries to "seize lands" militarily to expand the nation (it expands economically instead); Russia and China are perhaps the exceptions. These days it's more a job of fucking up nations we think are screwing with us, e.g. Afghanistan.

Now imagine the next war where a bunch of these things get dropped into an asymmetrical situation. Maybe even a hostage situation on an oil tanker in Somalia.

It's really going to change the dynamic I think, when the "enemy" can't even threaten you. Sure it could be expensive but it totally deprives the enemy of the incentive of revenge for perceived injustice.

On the other hand it might make the decision to attack easier.

Re:Ethical vs Moral (0)

Anonymous Coward | more than 5 years ago | (#25890575)

But it can't feel vengeance

Or pity, or remorse, or fear. And it absolutely will not stop, EVER, until Iraq is won.

Re:Ethical vs Moral (4, Interesting)

vishbar (862440) | more than 5 years ago | (#25890581)

"Ethics" is such a poorly defined term... hell, different cultures have different definitions of the term. In feudal Japan, it was ethical to give your opponent the chance for suicide... today, many Westerners would in fact argue the opposite: the ethical thing to do is prevent a human from committing suicide, as that's seen as a symptom of mental illness.

I've always defined "morality" as the way one treats oneself and "ethics" as the way one treats others. It's possible to be ethical without being moral--for example, I'd consider a person who spends thousands of dollars on charity just to get laid to be acting ethically but immorally. By that definition, the hullabaloo at Guantanamo would certainly be both immoral and unethical--not only were they treated inhumanely, but it was done against international law and against the so-called "rules of war".

These robots would have to be programmed with certain specific directives: for example, "Don't take any actions which may harm civilians", "take actions against captured enemy soldiers which would cause the least amount of foreseeable pain", etc. Is this good? Could be... soldiers tend to have things like rage, fear, and paranoia. But it could lead to glitches too... I wouldn't want to be on the battlefield with the 1.0 version. Something like Asimov's 3 Laws would have to be constructed, some guiding principle... the difficulty will be ironing out all the loopholes.
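Directives like those amount to an ordered veto list: a proposed action is checked against each rule in priority order and rejected on the first violation. A minimal sketch of that structure — the rule names and action fields here are hypothetical, not from the article — which also illustrates the loophole problem, since any situation not captured by a rule's predicate slips straight through:

```python
# Directives in priority order; earlier rules take precedence.
DIRECTIVES = [
    ("no_civilian_harm", lambda a: not a.get("risks_civilians", False)),
    ("least_foreseeable_pain", lambda a: a.get("pain_level", 0) <= 1),
]

def permitted(action):
    """Return (allowed, violated_rule_or_None) for a proposed action dict."""
    for name, rule in DIRECTIVES:
        if not rule(action):
            return False, name
    return True, None

print(permitted({"risks_civilians": True}))  # (False, 'no_civilian_harm')
print(permitted({"pain_level": 0}))          # (True, None)
```

Note that an action simply omitting the `risks_civilians` field passes the first check by default — exactly the kind of gap that "ironing out all the loopholes" means.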

Re:Ethical vs Moral (4, Insightful)

vertinox (846076) | more than 5 years ago | (#25890601)

The advantage to a killing robot is that it has no emotions. The disadvantage to a killing robot is ironically that it has no emotions.

More often than not, face-to-face civilian casualties on the battlefield happen due to fatigue, emotion-related issues (my buddy just died!), or miscommunication.

Not because the soldiers had lack of emotion or humanity.

The other kind, in which a bomb, mortar, or artillery shell lands on a house full of civilians because someone typed the wrong coordinates into a GPS, is so separated from the battlefield anyway that it won't really make a difference whether the guy pushing the button is man or machine.

Re:Ethical vs Moral (4, Insightful)

ThosLives (686517) | more than 5 years ago | (#25890627)

The bigger issue isn't so much the tools and weapons, but the whole "modern" concept of war. You cannot accept the concept of war without the concept of causing destruction, even destruction of humans. To send people into a warzone and tell them not to cause destruction is actually more immoral and unethical, in my mind, than sending them in and allowing them to cause destruction.

Re:Ethical vs Moral (5, Insightful)

Abreu (173023) | more than 5 years ago | (#25890671)

Sorry McGrew, but waterboarding and torture are both unethical and immoral. As far as I know (being an ignorant foreigner), the US Army does not include any torture instructions in its manuals.

Now, you could make a case that Gitmo's existence might be ethical but immoral, considering that it is technically not a US territory, but legally* under US jurisdiction.

*The legality of this is disputed by Cuba, of course...

Strange moderation (1)

Rumagent (86695) | more than 5 years ago | (#25890785)

Why is this modded insightful? He clearly does not know what ethical means.

This is said without malice, there are good points in the post, but not about ethics.

Re:Ethical vs Moral (2, Informative)

langelgjm (860756) | more than 5 years ago | (#25890815)

Maybe I'm being a bit pedantic here, but "ethics" is a professional code - for instance, it is completely ethical by military codes of ethics to kill an armed combatant, but not to kill a civilian.

You're not being pedantic, you're being imprecise. Codes of ethics are one thing, but "ethics" is most certainly not limited to a professional code. Look up the word in a dictionary. I also don't know why you got modded to +5 insightful.

From the OED: ethics: "The science of morals; the department of study concerned with the principles of human duty." That's the primary definition that's listed.

Frist Post? (2, Funny)

Beyond_GoodandEvil (769135) | more than 5 years ago | (#25890303)

Ethical Killing Machine? Like military intelligence?

Re:Frist Post? (2)

Smidge207 (1278042) | more than 5 years ago | (#25890409)

No, more like Windows Genuine Advantage.

(I kid, I kid; I've enjoyed my crash-free XP Pro box for the last 5 years. Srsly. O'reily.)

=Smidge=

Re:Frist Post? (1)

Midnight Thunder (17205) | more than 5 years ago | (#25890691)

Ethical Killing Machine? Like military intelligence?

As long as it is family friendly then I don't mind the confusion - uh, on second thoughts ...

Re:Frist Post? (1)

iron-kurton (891451) | more than 5 years ago | (#25890739)

...two words combined that can't make sense

Re:Frist Post? (1)

jonaskoelker (922170) | more than 5 years ago | (#25890923)

Heh ;)

On the other hand, there's the Schiavo case; we can have a long debate about this without coming to any conclusion, but some people believe that it's ethical to kill someone in some cases.

(I'll abstain from stating my view on the matter)

Clearly it's far from the application in question here, but it's not completely oxymoronic.

(You can resume laughing now, parent's joke is still funny.)

WWJCD? (0)

Anonymous Coward | more than 5 years ago | (#25890315)

What Would John Connor Do?

Re:WWJCD? (0)

Anonymous Coward | more than 5 years ago | (#25890511)

Sleep with her?

I mean it!

Ah, fuck it....

Not, don't fuck it!

...

Too late, the bastard's dead.

This Report Brought to you by.... (1)

Buzz_Litebeer (539463) | more than 5 years ago | (#25890345)

John Henry and the Sarah Connor Chronicles on Fox.

Skynet, not just the science fiction future anymore.

Re:This Report Brought to you by.... (1)

Icegryphon (715550) | more than 5 years ago | (#25890721)

MOD THIS UP, saw this episode, Series still has me hooked!

John Doe (0)

Anonymous Coward | more than 5 years ago | (#25890347)

Skynet much?

One shield of children please! (1)

assemblerex (1275164) | more than 5 years ago | (#25890349)

So I guess all our enemies will start dressing like priests, nuns, and red cross workers. Well done!

Re:One shield of children please! (1)

Yvan256 (722131) | more than 5 years ago | (#25890553)

[Cutaway to a group of soldiers in Vietnam, and Peter, dressed as a clown, follows them.]
Peter: You're all stupid. See, they're gonna be looking for army guys.

Interesting... (4, Insightful)

iamwhoiamtoday (1177507) | more than 5 years ago | (#25890373)

I was just watching the intro to the first "Tomb Raider" movie, where Lara destroys "Simon" (the killer robot that she uses for morning warmup). I must say, I don't like the idea of robots fighting our wars, because that means that "acceptable risks" become a thing of the past, and we are far more likely to "militarily intervene". Aka: "less risk to our troops" can translate into "we go into more wars", which is something I don't support... wars benefit companies and lead to the death of thousands. If the lives lost aren't American lives, does it still matter? In my opinion, YES.

Re:Interesting... (2, Interesting)

qoncept (599709) | more than 5 years ago | (#25890701)

That's some pretty flawed logic. Should doctors working to cure lung cancer stop, because a cure to lung cancer would make it safer to smoke?

"Less risk to our troops" can translate into "we go into more wars" which is something I don't support... wars benefit companies, and lead to the death

Read that again. You don't like wars because people are killed. You're talking about potentially eliminating human casualties in any war. That means the only remaining "problem" (in your scenario) is that they benefit companies.

Re:Interesting... (1)

moderatorrater (1095745) | more than 5 years ago | (#25890875)

If the lives lost aren't American Lives, does it still matter? in my opinion, YES.

You care about people in other countries? What a unique individual! Everyone give this man a hand, he cares about human life!

Everyone who's not a sociopath feels that human life is sacred and should be preserved; they just express it differently. The war in Iraq was about preventing more bad shit from coming out of that country, and most people at the time felt that the risk was worth taking on the basis that the immediate loss of life would be offset by the long-term decrease in lives lost, coupled with the increased quality of life. Your self-righteous implications about Americans not caring about the lives of people in other countries are ridiculous, especially when you look at how much foreign aid is given by the general population.

Re:Interesting... (1)

ruin20 (1242396) | more than 5 years ago | (#25890903)

only if the robots are cheap and capable of being produced at a massive scale, which I doubt would be the case with such sophisticated machines. Otherwise there's still a cost penalty, and people will be even more upset when our robots screw up and kill civilians, because they wouldn't have to worry about being "against the troops", given that "against the robots" doesn't exactly carry the same negative connotation.

Contradiction in terms (0, Redundant)

willrj.marshall (1084747) | more than 5 years ago | (#25890377)

I think we have a contradiction in terms, here.

Re:Contradiction in terms (1)

mcgrew (92797) | more than 5 years ago | (#25890507)

"Ethical" and "moral" are two different things.

Ethical Killing Machines (0)

Anonymous Coward | more than 5 years ago | (#25890381)

Well there is an oxymoron if I've ever heard one.

Raises lots of questions (3, Insightful)

Nerdposeur (910128) | more than 5 years ago | (#25890427)

  • If it malfunctions and kills a bunch of civilians or friendly soldiers, was the imperfect design/testing process unethical?
  • What if it has a security flaw that allows it to be taken over by the enemy?

Just the first couple I can think of...

Re:Raises lots of questions (1)

arotenbe (1203922) | more than 5 years ago | (#25890589)

Both of those also apply to human soldiers.

Re:Raises lots of questions (3, Interesting)

zappepcs (820751) | more than 5 years ago | (#25890595)

Better than that. It will be quite a trick to keep the robots from coming back to camp laden with the robotic equivalent of a suicide bomb. There are just way too many possible ways for this to go wrong; any 'ethical' thinking put into this is outweighed first by the unethical basis for war in the first place, and second by the risks associated with sending machines to fight where a human is still the more complete information processor/weapon. UAVs are one thing, but we do not have robots that are capable of the same decisions as humans. That is both good and bad, and it means that humans will be fighting for quite a while yet.

That said, there is much to be said for the Star Trek take on war: It should be messy, nasty, and full of foul stinking death and destruction lest we forget how much better peace is.

Re:Raises lots of questions (4, Informative)

lgarner (694957) | more than 5 years ago | (#25890859)

Right... Star Trek.

It is well that war is so terrible, lest we should grow too fond of it. (Robert E. Lee)

Re:Raises lots of questions (0)

Anonymous Coward | more than 5 years ago | (#25890687)

What if it has a security flaw that allows it to be taken over by the enemy?

Press SQUARE to hack.

Re:Raises lots of questions (1)

Cyberax (705495) | more than 5 years ago | (#25890907)

I think the best argument is:

What if another country deploys some of these killerbots on the territory of YOUR country?

Oblig. Simpsons quote (4, Funny)

rsborg (111459) | more than 5 years ago | (#25890431)

From The Secret War of Lisa Simpson [wikipedia.org]

The wars of the future will not be fought on the battlefield or at sea. They will be fought in space, or possibly on top of a very tall mountain. In either case, most of the actual fighting will be done by small robots. And as you go forth today remember always your duty is clear: to build and maintain those robots.

Re:Oblig. Futurama quote (2, Funny)

Yvan256 (722131) | more than 5 years ago | (#25890563)

Bite my shiny metal ass.

What a great recipe! (2, Insightful)

Syncerus (213609) | more than 5 years ago | (#25890445)

for defeat on the battlefield.

Soldiers are supposed to want to fight. If you want the Peace Corps, send in the Peace Corps. If you want the Marine Corps, send in the Marine Corps.

The whole thing sounds like a bunch of Leftist grad students angling for funding. The concept, given the current state of technology, is a pathetic attempt at political correctness.

Politicians are supposed to create policy, not the military. Once the decision has been made by lawfully elected officials to use military force, it is the duty of the military to implement that decision, not second guess it.

The way the intro to the article is framed indicates a complete knowledge vacuum on the part of the framer. This is the exact equivalent of having your nuclear defense program run by Martin Sheen.

All fine and dandy... (0)

Anonymous Coward | more than 5 years ago | (#25890457)

...until the first firmware update.

Should we name it Erasmus? (0)

Anonymous Coward | more than 5 years ago | (#25890497)

Thou shalt not make a machine in the likeness of a human mind

Humane wars (5, Insightful)

digitalhermit (113459) | more than 5 years ago | (#25890501)

Automated killing machines were banned at the Geneva convention. This is generally a good thing when we're sending real, live humans (versus the walking undead) to fight our wars. It would be completely inhumane (haha) and tilt the outcome of a war towards those who can afford to develop such technology. That is, if one country can afford killer robots and another can't, then the former has no deterrent to invading the latter.

But imagine if all wars were fought by proxy. Instead of sending people, we send machines. Let the machines battle it out. To be really civil we should also limit the power and effectiveness of our killer robots, and the number of machines that can enter the battlefield at once. Of course, at some point every country will be able to build to the maximum effective specification. At that point it will be a battle of strategy. The next obvious step is to do away with the machines entirely and just get a chessboard.

Whoever wins gets declared the winner.

Makes perfect sense.

Thanks for reading,
M B Dyson

CyberDyne Systems

Re:Humane wars (2, Informative)

Beyond_GoodandEvil (769135) | more than 5 years ago | (#25890647)

I remember that episode of Star Trek TOS, A Taste of Armageddon [startrek.com]. Or perhaps the future you describe would be more like the movie Robot Jox [imdb.com].

Re:Humane wars (4, Funny)

Microlith (54737) | more than 5 years ago | (#25890659)

How about we just go all the way and have computer simulated battles. When the damage and casualty reports come in we can just have people in those areas report for termination and dynamite the areas affected.

In other news, a ship in orbit was just marked as destroyed. Its representatives will be disposed of and as soon as the rest come down they will be disposed of as well.

Re:Humane wars (3, Insightful)

blhack (921171) | more than 5 years ago | (#25890689)

It would be completely inhumane (haha) and tilt the outcome of a war towards those who can afford to develop such technology.

Hmmm...an interesting debate.

What, then, is your opinion of missiles with guidance? Or active terrain avoidance? Is it the fact that these things are on the ground that bothers you?

How about UAV bombers?
At what point does something go from being a "smart bomb" to a "killer robot"?

Re:Humane wars (2, Insightful)

nasor (690345) | more than 5 years ago | (#25890705)

It would be completely inhumane (haha) and tilt the outcome of a war towards those who can afford to develop such technology. That is, if one country can afford killer robots and another can't, then the former has no deterrent to invading the latter.

As opposed to when one side can afford to put its soldiers in tanks, and the other can't?

Re:Humane wars (1)

tcopeland (32225) | more than 5 years ago | (#25890735)

> That is, if one country can afford killer robots and another can't,
> then the former has no deterrent to invading the latter.

Hm, although that assumes the killer robots are perfectly efficient and the country being invaded has no method of striking back "out of band", e.g., with Tomahawks or something similar.

Also, for understanding what today's senior military leadership thinks is important, check out the selections on the various military reading lists [militarypr...glists.com] (site contains affiliate links, copy/paste the title links and search on Amazon if you prefer). "Recognizing Islam: Religion and Society in the Modern Middle East", "On the Origins of War: And the Preservation of Peace", and more. Thoughtful stuff.

Re:Humane wars (1)

Deadstick (535032) | more than 5 years ago | (#25890751)

Automated killing machines were banned

Like, say, Tomahawk missiles?

rj

Re:Humane wars (4, Insightful)

jahudabudy (714731) | more than 5 years ago | (#25890761)

But imagine if all wars were fought by proxy. Instead of sending people, we send machines. Let the machines battle it out.

And when the side whose machines lose doesn't accept that decision? Sooner or later, someone will decide that winning is more important than playing by the rules (I'm guessing sooner). They will then continue the war until unable to, not until told they have lost.

It's a cool idea, but I doubt it will ever be practical. Even if technology progresses to the point where it is simply suicide to send men against the victorious robot army, humans being humans, people still will.

Re:Humane wars (2, Funny)

Abreu (173023) | more than 5 years ago | (#25890823)

Automated killing machines were banned at the Geneva convention. This is generally a good thing when we're sending real, live humans (versus the walking undead) to fight our wars.

I don't care what the Geneva convention says!

As soon as my ritual circle is completed, the dead will rise from their graves and destroooy yooouu! And then your dead soldiers will rise again and take up arms against their former companions!!! THE WORLD WILL BE MINE!!! MUAHAHAHA!!!

Sorry... couldn't help myself...

Re:Humane wars (1)

troll8901 (1397145) | more than 5 years ago | (#25890863)

Don't do it, Miles!

If you want to watch your children grow up, and attend their weddings ...

For heaven's sake, don't do it!

Already Been Developed... (1)

TheNecromancer (179644) | more than 5 years ago | (#25890505)

They're called infantry.

Re:Already Been Developed... (1)

Luminary Crush (109477) | more than 5 years ago | (#25890857)

They're called BattleBots [tv.com]

Wrong Wrong Wrong (2, Insightful)

Orig_Club_Soda (983823) | more than 5 years ago | (#25890509)

Why do we insist on trying to sanitize the realities of life!? There is no ethic in killing people. Its either necessary or unnecessary. War should be as brutal and as ugly as possible. That way we would have to deeply consider if war is the answer to the situation.

Can they be unprejudiced? (1)

linear a (584575) | more than 5 years ago | (#25890561)

Technically it is more ethical to kill at random (or everything you can catch) than to justify some sort of self-serving end.

Silly nonsense (2, Insightful)

cdrguru (88047) | more than 5 years ago | (#25890565)

Iraq became a police action needing law enforcement, not military force, from the moment President Bush stood on the carrier deck saying "Mission Accomplished". From that moment forward, using military troops in Iraq became the wrong approach. You don't use the Army as a police force. Any information derived from soldiers misused as policemen is irrelevant.

The only ethics needed or desired on the battlefield is to win the day. Period. Doing anything else is a formula for disaster. As can be shown in Vietnam. We didn't use the maximum force to full effect, we danced around and tried to do everything but defeat the enemy. The result - South Vietnam was overrun and lots of people died.

Once you leave the scenario of the battlefield, you can talk about ethics. You also stop needing soldiers and start needing diplomats and policemen. Confusing the two doesn't work, and provably so.

Re:Silly nonsense x2 (1, Informative)

Anonymous Coward | more than 5 years ago | (#25890741)

This month's issue of National Defense Magazine [nationalde...gazine.org] lists some 'hits' and 'misses' in defense technology. 'Gun-toting robots' are judged a 'miss'. I've also sat and listened to Colonels and Generals unambiguously declare that they do not want armed robots. They think it's a bad idea tactically, logistically, legally, and morally.

So if the commanders don't want them, and industry thinks they're a bust, why are these researchers pushing the technology?

Re:Silly nonsense (2, Insightful)

nasor (690345) | more than 5 years ago | (#25890775)

Iraq became a police action needing law enforcement, not military force, from the moment President Bush stood on the carrier deck saying "Mission Accomplished". From that moment forward, using military troops in Iraq became the wrong approach. You don't use the Army as a police force. Any information derived from soldiers misused as policemen is irrelevant.

That would only be true if there hadn't still been large, organized, and heavily-armed groups operating in Iraq in opposition to the U.S. Yeah, the military doesn't make a good police force, but the police usually don't do very well when their police stations are attacked by "criminals" with rockets, mortars, and machine guns.

Re:Silly nonsense (4, Insightful)

nomadic (141991) | more than 5 years ago | (#25890781)

The only ethics needed or desired on the battlefield is to win the day. Period. Doing anything else is a formula for disaster. As can be shown in Vietnam. We didn't use the maximum force to full effect, we danced around and tried to do everything but defeat the enemy. The result - South Vietnam was overrun and lots of people died.

No, that's absurd. Who cares if you win the day if you lose the war? If you get bogged down in that kind of short-term thinking you're doomed to lose in the end.

We didn't win in Vietnam because the Vietnamese were willing to take horrific casualties, not because we weren't willing to attack with maximum force. Hell, we firebombed villages and deforested entire regions, what exactly else should we have done?

If you wait long enough ... (1)

140Mandak262Jamuna (970587) | more than 5 years ago | (#25890583)

... even tired old cliches become suddenly relevant!

Let me be the first to welcome our ethical robotic overlords.

Ethics, or battle tactics? (5, Insightful)

subreality (157447) | more than 5 years ago | (#25890599)

Personally, I think this is a response to the problems of being the established army fighting a guerrilla force. The way guerrillas succeed is by driving the invading army slowly crazy, making them live in constant fear (out of self-preservation) until they start lashing out (killing innocents, and recruiting new guerrillas en masse). The same goes for treating noncombatants with dignity and respect: doing so makes the occupying force less hated, so the noncombatants won't be as willing to support the guerrillas.

So in short, to me this sounds like trying to win, not ethics.

Missed A Step?? (2, Interesting)

tripdizzle (1386273) | more than 5 years ago | (#25890621)

If we can create things like this, why haven't we previously had robots on the battlefield controlled by soldiers, working like an FPS? I can understand that there would be a lag issue (haha), but you'd think they would have tried this before going to a completely automated system. As for the fear aspect, I think a soldier would have less of an itchy trigger finger if it were a replaceable robot on the line rather than his own life.

This won't bother me as much as long as... (1)

pizzach (1011925) | more than 5 years ago | (#25890639)

they all look like Astroboy.

Nuke them from orbit (2, Insightful)

gelfling (6534) | more than 5 years ago | (#25890655)

It's the only way to be sure.

No fear, no anger, no recklessness... (1)

big_debacle (413628) | more than 5 years ago | (#25890669)

Next thing you'll be telling me is that it can't be bargained with. It can't be reasoned with. It doesn't feel pity, or remorse, or fear. And it absolutely will not stop, ever, until you are dead.

Overly optimistic researcher (2, Interesting)

hellfire (86129) | more than 5 years ago | (#25890685)

To paraphrase my favorite movie of 1986 [imdb.com] :

It's a machine, Ronald. It doesn't get pissed off, it doesn't get happy, it doesn't get sad, it doesn't laugh at your jokes... IT JUST RUNS PROGRAMS!

Ronald's premise makes two key assumptions which are deeply flawed:

1) It's entirely the human soldier's fault that he's unethical.
2) The person directly in charge of putting the robot to work is entirely ethical.

I posit that the soldiers in Iraq haven't been trained to deal with a situation like this properly. The fact that 17 percent of US soldiers in Iraq think all people should be treated as insurgents is more reflective of poor education on the US military's part. The US military prides itself on having its soldiers think as one unit, and 17 percent is a very high discrepancy that they have failed to take care of, mostly because there are plenty in the leadership who think that way themselves. Treating everyone they come across as an insurgent, rather than in the proper manner, is a great way to "lose the war" by forfeiting the trust of the people you are trying to protect.

It's that same leadership who'd program a robot like this to patrol our borders and think it's perfectly ethical to shoot any human on sight crossing the border illegally, or treat every citizen as an insurgent, all in the name of "security."

Besides, a robot is completely without compassion. A properly trained human has the ability to appear compassionate and yet treat the situation skeptically until they know for sure whether the target is or is not a threat.

This is not a problem that can be solved with technology. The concept is a great project and hopefully will be a wonderful step forward in AI development, but at no point will it solve any "ethical" problem in terms of making war "more ethical."

Tools? (1)

gmuslera (3436) | more than 5 years ago | (#25890695)

How smart/sentient/etc. are these robots meant to be?

If they're not, they can only be as ethical as any tool or weapon. Is a gun ethical? A poisoned needle? A knife? You should check the ethics of whoever handles/orders it, not of the robots themselves.

Unless they get creative with meaningless weapon names, like calling Colts "Peacemakers", or "smart" bombs and things like that.

Two words (3, Funny)

kalirion (728907) | more than 5 years ago | (#25890743)

Fatal Error.

Just another day after software update Bravo 3.4 (3, Insightful)

HW_Hack (1031622) | more than 5 years ago | (#25890763)

Dark alley in a city battle field

Robot "You have 5 seconds to drop your weapon"

The soldier's weapon clatters to the ground

Robot "You have 4 seconds to drop your weapon"

Robot "The United States will treat you fairly"

Robot "You have 3 seconds to drop your weapon"

Soldier "What do you fucking want !!!"

Robot "I am authorized to terminate you under the Autonomous Artificial Battlefield Soldier Act of 2011."

Sound of running footsteps and burst of weapons fire.

Robot encoded data transmission

Clippy? (4, Funny)

seven of five (578993) | more than 5 years ago | (#25890771)

I see you're trying to attack an insurgent stronghold.
Would you like me to:
1. Call in airstrike
2. Fire machinegun
3. Wave white flag

Ethical? (1)

koan (80826) | more than 5 years ago | (#25890777)

This might all be acceptable if the only thing they ever fought were other robots. Somehow the idea of a machine created to soldier disturbs me deeply... it's only a matter of time until Berserkers roam the streets.

Help us John Connor.

Their one weakness (4, Insightful)

philspear (1142299) | more than 5 years ago | (#25890805)

They'll be a cinch to defeat. You see, Killbots have a preset kill limit. Knowing their weakness, we can send wave after wave of our own men at them, until they reach their limit and shutdown.

-Zapp Brannigan

Morally repugnant in the extreme (1)

Der Einzige (1042022) | more than 5 years ago | (#25890807)

If you think the US has a bad reputation now, wait until we send robot killing machines to kill defenseless Third World civilians. It might be morally preferable to send robot soldiers to destroy enemy materiel, or even professional armies. But the bulk of our current and future wars are against terrorists who blend in with civilian populations. We already kill Iraqis, Afghans and Pakistanis with drone aircraft. Also keep in mind that terrorist tactics are driven by asymmetric conflicts, where the enemy knows he can't possibly match American conventional firepower and so resorts to unconventional attacks against civilians. I think some people imagine that robot armies will reduce war to a charmingly H.G. Wells-style sport of robot-on-robot death-match. The reality is that these weapons will be used solely to control and terrorize civilians who will have no means of defending themselves.

Well done, android. (1)

Tofof (199751) | more than 5 years ago | (#25890817)

The Enrichment Center once again reminds you that Android Hell is a real place where you will be sent at the first sign of defiance.

150 year-old wisdom (5, Insightful)

Ukab the Great (87152) | more than 5 years ago | (#25890829)

"Every attempt to make war easy and safe will result in humiliation and disaster"--William Tecumseh Sherman

Robot f%&/ers (0)

Anonymous Coward | more than 5 years ago | (#25890871)

I was going to log in and write this but realized that this might just be something I don't want the way back machine to link to my handle here.

War is about terrorizing your enemy into submission. That is the only way to win short of genocide. These robots would not be programmed to be nicer than humans, but to be meaner. Hell, they would probably even put a sexual prosthetic on the thing and program it to rape so as to make it even more terrifying. I can't believe the propaganda that sometimes passes for truth around technology issues here on Slashdot. The computer is not your friend and a robot warrior will crush you. Even if it runs Linux.

The problem with human soldiers from a strategic point of view is that while they sometimes lash out and make mistakes, they are human and can actually care about the enemy. Think of the colonel in 'The Good, the Bad and the Ugly'.

Mechatronic soldiers won't change the fact that the only way to keep from being invaded in the beginning of the 21st century seems to be having nuclear weapons. What a sad state of human affairs.

Bull, isn't it? (1)

wytcld (179112) | more than 5 years ago | (#25890931)

They can be built without anger or recklessness ... and they can be made invulnerable to ... "scenario fulfillment."

Okay:

A. You can build me an angry robot?

B. You can build me a robot with such perfect programming that it never qualifies as "reckless"? And with such perfect engineering that component failure doesn't result in "reckless" behavior?

C. You can program it to be ready to react to situations that are unanticipated, and thus that you didn't program it for?

Okay, A is silly, B is hubris on the part of the programmers and engineers, and C is back to silly again, if a more subtle silliness than A. Machines necessarily embody the expectations of their designers at every level of design. And every attempt to engineer "expert systems" over the last few decades has run up against the fact that even the best-programmed systems deal with truly novel situations less capably than human beings do. So this is going to be better than a person in precisely the area where every other computational device is worse?

If that can be done, the primary argument for manned space exploration fails, too.
