
'Ban Killer Bots,' Urges Human Rights Watch

Unknown Lamer posted about 2 years ago | from the assessment-will-not-be-by-humans dept.


Taco Cowboy writes "A self-proclaimed 'Human Rights Group' — the 'International Human Rights Clinic' from Harvard Law School — has teamed up with 'Human Rights Watch' to urge the banning of 'Killer Robots.' A report issued by Human Rights Watch, titled 'Losing Humanity,' claimed autonomous drones that could attack without human intervention would make war easier and endanger civilians. Where's the 'Robot Rights Watch' just when you need 'em?"


Sounds like a great idea (5, Insightful)

OrangeTide (124937) | about 2 years ago | (#42034603)

We should go back to using cruise missiles and carpet bombing.

Re:Sounds like a great idea (0)

Anonymous Coward | about 2 years ago | (#42034663)

When can I buy one for my property?

DIY (0)

Anonymous Coward | about 2 years ago | (#42034697)

You should be able to make German V-1 flying bombs with your homemade 3D printer. It's just a pulse jet engine. I'll post the source for a Arduino-based guidance computer later tonight.

Re:DIY (0)

Anonymous Coward | about 2 years ago | (#42035111)

Yeah, have fun with that plastic pulsejet: an 8-bit AVR at 16 MHz, sampling off junk sensors with a low-samples-per-second ADC, doing its best to get a useful read off that $15 GPS module.

Re:DIY (0)

ArcadeMan (2766669) | about 2 years ago | (#42035209)

I never understood why they clock the AVR at 16MHz when that part is rated for 20MHz.

Re:DIY (1)

thygate (1590197) | about 2 years ago | (#42035443)

Only more recent models are rated at 20 MHz; 16 MHz used to be the norm.

Re:DIY (1)

Anonymous Coward | about 2 years ago | (#42035841)

I'm pretty sure the vacuum-tube and gyroscope guidance used in those old rockets and cruise missiles was a bit less sophisticated than what today's $5 microcontrollers can manage. Given that those V-1s were mostly plywood, I suspect plastic parts are not going to be a significant problem. As for tracking with a $15 GPS, the pulse jet stuff isn't terribly fast anyway; it's a cruise missile, so it operates at a cruising speed (whatever you determine is a useful speed above your stall speed).

You tear down the idea, but wait until some nut job makes one and pisses every security organization off.

Re:Sounds like a great idea (1, Informative)

Icegryphon (715550) | about 2 years ago | (#42034709)

Agreed, Winston Churchill had it right. William Tecumseh Sherman had it right. Destroy every bit of the enemy's infrastructure and they won't have anything to wage war with.

Re:Sounds like a great idea (2)

Trepidity (597) | about 2 years ago | (#42034837)

"Totaler Krieg – Kürzester Krieg", as they say

bring everyone back to the iron age? (2, Insightful)

Anonymous Coward | about 2 years ago | (#42035863)

A scorched-earth policy, where you knock down granaries, cripple tractors and plows, break down dams, and salt the earth if you have to, is not really the kind of society we wish to represent.

Re:Sounds like a great idea (5, Insightful)

ThatsMyNick (2004126) | about 2 years ago | (#42034769)

Nope, but we don't need fully autonomous killer robots either. Would you rather have a robot, rather than a human, determine whether a target is worth killing?

Re:Sounds like a great idea (2)

Ambassador Kosh (18352) | about 2 years ago | (#42034797)

I would trust the robot more. You could program it to not take things like emotions into account. You can have it judge if someone is hostile or a combatant and only exercise the force required. Humans are far more likely to overreact.

Re:Sounds like a great idea (1)

ThatsMyNick (2004126) | about 2 years ago | (#42034877)

A robot controlled remotely by a human (even if only for the kill) would accomplish the same. The human is so detached that it would be easy for him to control his emotions.

Re:Sounds like a great idea (1)

Ambassador Kosh (18352) | about 2 years ago | (#42034997)

And yes, those humans do violate the various rules that govern war, and do shoot into crowds of civilians. I would say that, so far, that approach is not working very well.

Re:Sounds like a great idea (1)

ThatsMyNick (2004126) | about 2 years ago | (#42035061)

Assuming humans still control these autonomous robots, autonomous robots wouldn't solve these problems either. Would you expect these autonomous robots to refuse an order given by their human commander? Would you expect these robots to be programmed to be capable of refusing a command?

Re:Sounds like a great idea (1)

Ambassador Kosh (18352) | about 2 years ago | (#42035093)

They absolutely should be programmed to refuse orders like that. However, I don't expect that to happen which is very very sad.

Re:Sounds like a great idea (1)

BradleyUffner (103496) | about 2 years ago | (#42034885)

I would trust the robot more. You could program it to not take things like emotions into account. You can have it judge if someone is hostile or a combatant and only exercise the force required. Humans are far more likely to overreact.

How do you code to detect hostility?

Re:Sounds like a great idea (2)

r1348 (2567295) | about 2 years ago | (#42034955)

Adult male = terrorist, as everyone in Pakistan knows.

Re:Sounds like a great idea (1)

Ambassador Kosh (18352) | about 2 years ago | (#42035043)

That is the way that humans controlling the robots are doing the classifying right now. Clearly that is WRONG as hell. A robot programmed to follow things like the Geneva Conventions would not fire until there was actual evidence of someone being a combatant (i.e., they actually pulled out a gun and fired at the robot or at something the robot is supposed to protect).

Re:Sounds like a great idea (1)

Ambassador Kosh (18352) | about 2 years ago | (#42034987)

Give the robot reflexes fast enough that you don't have to calculate it. You just scan for it and shoot back.

Time is not the same for computers as it is for us. What is an instant for a human is a very, very long time for a computer. We have computers that can visually pick out a bad product from a free fall of products and use a small puff of air to move just that one bad product out of the way. To a human, the whole thing looks like a solid fall of stuff.

Things like that are used for french fries, for instance.

The point is, the world is very slow for a computer. Since we don't care if the robot is shot, let it take the first shot and then shoot back. We could also program it with various parameters to protect things classified as nonhostile.
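The parent's timescale point can be made concrete with some back-of-the-envelope arithmetic. A quick Python sketch (the 16 MHz clock matches the AVR discussed upthread; the human reaction time, fall speed, and camera-window size are illustrative assumptions, not measurements from any real optical sorter):

```python
# How "slow" the world looks to even a modest microcontroller.
# All figures are ballpark assumptions for illustration only.

avr_clock_hz = 16_000_000   # 8-bit AVR at 16 MHz, as discussed upthread
human_reaction_s = 0.25     # typical human visual reaction time, ~250 ms

# Clock cycles available during a single human reaction
# (most AVR instructions take 1-2 cycles, so this bounds instruction count).
cycles_per_reaction = int(avr_clock_hz * human_reaction_s)
print(f"Cycles per human reaction: {cycles_per_reaction:,}")  # 4,000,000

# An optical sorter inspecting product in free fall: at an assumed
# 3 m/s, an item crosses a 10 cm camera window in about 33 ms.
fall_speed_m_s = 3.0
window_m = 0.10
decision_budget_s = window_m / fall_speed_m_s
print(f"Decision budget per item: {decision_budget_s * 1000:.0f} ms")

# Even that brief window leaves hundreds of thousands of cycles
# to classify the item and fire the air puff.
cycles_per_item = int(avr_clock_hz * decision_budget_s)
print(f"Cycles per item: {cycles_per_item:,}")  # 533,333
```

So even under these conservative assumptions, the controller gets roughly half a million cycles to decide on each falling item, and millions of cycles inside one human "instant."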

Re:Sounds like a great idea (2)

Black Parrot (19622) | about 2 years ago | (#42035881)

We should go back to using cruise missiles and carpet bombing.

Where do you draw the definitional line? Isn't a cruise missile a robot that kills people?

Re:Sounds like a great idea (1)

WrecklessSandwich (1000139) | about 2 years ago | (#42036109)

Where do you draw the definitional line? Isn't a cruise missile a robot that kills people?

Someone had to push the launch button.

Dwight from the Office already addressed this (0)

Anonymous Coward | about 2 years ago | (#42034615)

Just limit them to 6-foot power cables.

Killing without human intervention? (4, Interesting)

osu-neko (2604) | about 2 years ago | (#42034699)

That's nothing new. It's no different than land mines...

Oh, wait... [wikipedia.org]

killer robots are the new landmines (0)

Anonymous Coward | about 2 years ago | (#42034701)

if a killer robot slaughters a bunch of little kids, no worries, no one's responsible, the robot did it!

Re:killer robots are the new landmines (1)

Jetra (2622687) | about 2 years ago | (#42034751)

Until PETA becomes PETSCO (People for the Ethical Treatment of Sentient Creatures and Objects). Think they were bad when they went to war with Pokemon and Mario? It'll be a humorous variety show when they start posting statements like "Robots have feelings too!" and "Viruses must live!"

Oh great, I gave them ideas.

Human rights (4, Insightful)

Marxdot (2699183) | about 2 years ago | (#42034705)

Why would you deride human rights groups, Taco Cowboy? And yes, drones that attack autonomously are a very bad idea.

Re:Human rights (1)

Shavano (2541114) | about 2 years ago | (#42035819)

What do you mean? I want my Cylon Centurions!

the danger of abstracted combat (5, Insightful)

wierd_w (1375923) | about 2 years ago | (#42034713)

While cliche, take a look at "wargames".

Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment, removes the innate guilt of killing those people.

It makes it fantastically easier to justify and ignore wholesale slaughter.

A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!) Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Never mind those uppity pool boys with dipnets! Can't make an omelette without breaking some eggs, right?!

When you can simply push a button and walk away without having to witness the atrocities you cause, you abstract away a fair bit of your conscience.

The military probably thinks that's a GREAT thing! Kids with guns won't cause mental trainwrecks to drones when they get mowed down, and the operator doesn't have to see it!

The reality is that deploying terminators is the same as turning a blind eye to consequences, and the innately terrible thing that war is, and why it should always be avoided whenever and however possible.

Re:the danger of abstracted combat (4, Insightful)

ThePeices (635180) | about 2 years ago | (#42034847)

But we have all been taught from an early age that it is wrong to feel guilt for killing bad guys. If you feel guilty, then you are *for* the bad guys, and therefore one of *them*. (Remember, it's a binary good/evil world we live in, amiright?)

Killing bad guys is doing your country a service, we are taught. We are making the world a better place, a safer place, when we kill our enemies.

This we are taught. If any one disagrees with that, then they are unpatriotic, and aiding and abetting the enemy.

This we are taught, so it must be true.

Re:the danger of abstracted combat (4, Interesting)

wierd_w (1375923) | about 2 years ago | (#42035031)

What is an enemy?

Is it a person who wishes to do you harm?
A person who wants to take something you have?
A person with whom you disagree?
Or just someone in the way of what you want to do?

In a war, do not both sides, regardless of the motives of either side, satisfy all of those? Is it any wonder that both sides refer to the other as "the enemy"?

In this case, what do the "terrorists" represent, that they merit being exterminated without conscience or remorse?

"They killed a shitton of people when they bombed the trade center!" you say?

Why? Why did they blow up the trade center?

It couldn't be because our countr(y/ies) was(were) meddling in their affairs, causing them harm, taking things from them, and fundamentally in disagreement with their way of life?

Certainly not! They should be HAPPY that we want to destroy their culture, because we view certain aspects of it as being backwards and primitive! Our way is simply BETTER!

Now, let's do a thought experiment here. Powerful aliens come down from wherever out in space they are from, find our culture to be backward and primitive, and start strongarming us to cease being who we are, and become like them. They say it's a better way. Maybe it is. That isn't the point. The point is that they don't give us the choice. They do this because it makes it easier for them to establish trade with us, or to fit us into their stellar economy, or whatever. They profit by eliminating our culture.

Would we not go to war with them, fighting their influence in every possible way, and even resort to guerilla and "terrorist" acts when faced by such a superior foe?

After thinking about that, can you really say you are any different than the "terrorists" we condemn with military machines daily?

We kill them, because they don't submit. They don't submit, because we are destroying and marginalizing their culture, because we feel it isn't worth retaining/is backward.

They don't want our help. They don't want our culture. They don't want our values. They don't want us. We insist on meddling in all of those things.

We started the war.

Re:the danger of abstracted combat (1, Troll)

CRCulver (715279) | about 2 years ago | (#42035929)

It couldn't be because our countr(y/ies) was(were) meddling in their affairs, causing them harm, taking things from them, and fundamentally in disagreement with their way of life?

Only the last is really true. The current wave of Islamist violence, and hatred of the United States in particular, is traceable in large part to the writings of Sayyid Qutb. He visited the United States in the late 1940s and condemned it for its culture (e.g. its sexual openness, or at least its perceived sexual openness), not for meddling foreign policy. America's interventions in the Middle East certainly added fuel to the fire, but the fire was burning before them.

Sometimes easy to tell (1)

SuperKendall (25149) | about 2 years ago | (#42036007)

What is an enemy?

Anyone firing an un-tagged large rocket in a region you control.

In a war, do not both sides, regardless of the motives of either side, satisfy all of those?

Satisfy all of your absurdly soft-boiled and meaningless definitions of enemy? Yes.

It couldn't be because our countr(y/ies) was(were) meddling in their affairs

Actually no, it was simply that they wanted to collapse our economy. Our non-religious existence is an affront many of the terrorists wished to correct.

That's the really sad thing; that even now you cannot understand such a simple and obvious truth.

Re:the danger of abstracted combat (0)

Anonymous Coward | about 2 years ago | (#42034923)

You could say the same of bows/archers -- all they see is just some silhouette falling in the distance, not like swordsmen who have to smell the blood and guts -- they don't feel the guilt of felling an enemy in combat.

Re:the danger of abstracted combat (1)

wierd_w (1375923) | about 2 years ago | (#42035717)

Not exactly... snipers still feel remorse for their first kill, at the very least. Becoming numb and losing your humanity comes later.

A robot never suffers that. It doesn't lose any humanity, because it never had it to lose.

Re:the danger of abstracted combat (0)

Anonymous Coward | about 2 years ago | (#42034935)

The reality is that deploying terminators is the same as turning a blind eye to consequences

Tell me about it. I lost an entire squad to the Warp. I just didn't think it could happen to me.

Re:the danger of abstracted combat (0)

Anonymous Coward | about 2 years ago | (#42034947)

" It makes it fantastically easier to justify and ignore wholesale slaughter. "

Yes, indeed, slaughtering the Arabs and Afghans like livestock. It would be murder if the weapons were to be used against actual humans.

" Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Nevermind those uppity pool boys with dipnets! "

They are indeed "poor," because they have no patios to sweep or pools to clean. If they could afford even a broom then they'd be sweeping dirt floors, and what is the point of that? As for pools, perhaps they could dig a big hole in some dirt, but then the water would be so murky they couldn't see what they're trying to net.

" Kids with guns won't cause mental trainwrecks to drones when they get mowed down, and the operator doesn't have to see it! "

Well, then kids shouldn't be carrying guns through town.

-- Ethanol-fueled

Re:the danger of abstracted combat (0)

Anonymous Coward | about 2 years ago | (#42034963)

People need to learn more about what they're protesting. Crazy hypothetical what-ifs like "What if all the robots became sentient, went insane, shook off human control, and started killing all humans with two arms" only serve to weaken the arguments against using them. First off, it's not the soldiers who are deciding to conduct the war. Civilian governments are the ones who decide to go to war, and they don't suffer from the innate guilt of killing people. Furthermore, humanity has been distancing itself from combat ever since we invented bows and arrows. You may as well be asking that we return to clubbing each other to death so that our soldiers are as traumatized as possible. And even if that kind of gruesome demand was moral, a drone operator with a camera who has been watching the target for six hours is certainly more connected with the morality of the action than an artilleryman or a bomber pilot, or even an infantryman firing from 200 meters away.

Re:the danger of abstracted combat (2)

wierd_w (1375923) | about 2 years ago | (#42035195)

First off, it's not the soldiers who are deciding to conduct the war. Civilian governments are the ones who decide to go to war, and they don't suffer from the innate guilt of killing people.

Exactly. The soldier is to the government what the drone is to the soldier: a layer of abstraction that makes the burden of killing another person easier to bear.

The government points soldiers and says "Kill!" They don't see the faces of those they killed, nor have to face the families of the slain. The numbers killed are just nameless, nonhuman statistics. The enemy isn't a person anymore, and killing doesn't induce guilt.

The soldier suffers psychological harm from following those orders, and sometimes even refuses to follow them, if they are inhuman enough. To the government, the abstraction then fails.

They solved it by abstracting the killing away from the soldier too.

Who will protest an order to commit a war crime, when nobody with control over the automated weapons cares?

What you have said just now is not license to use drones, because of an existing evil. It is proof of the argument I tendered.

We make war too easily already. We DON'T need to make it even easier.

Re:the danger of abstracted combat (5, Insightful)

Anonymous Coward | about 2 years ago | (#42035071)

Dude have you seen what happens when people are forced to kill face-to-face? They don't rely on their "conscience" to limit human casualties, they mentally reassign their opponents as non-human and murder them like you or I would murder a termite colony infesting our houses. History is nothing but one long string of horrific atrocity after atrocity committed by warring factions against the opposing side, or civilians, or even their own comrades in arms if there isn't a convenient "other" nearby they can target. Moving to a more abstracted method of fighting isn't just about saving our own forces' physical and mental well-being, it's also about limiting the damage they cause to others when they snap from the pressure and take their aggression out on whoever's available.

Of course we need to monitor our use of robots - we need a system of checks and balances in place to keep the controllers from engaging in unnecessary combat. But drones don't mass-rape, they don't torture old men and little children for fun, they don't raid houses to steal anything of value within, they don't build towers out of the skulls of their enemies, and they won't burn entire villages to the ground massacring everyone within because they're upset that their buddy was killed in combat the other day. Human involvement isn't always a good thing.

Re:the danger of abstracted combat (4, Insightful)

wierd_w (1375923) | about 2 years ago | (#42035299)

You are not comprehending what I am telling you.

War is to be avoided, because nothing about it is good, just, or honorable. War scars the minds of those who engage in it, live through it, or even witness it firsthand. The damage and price of war is more than just soldiers killed and buildings blown up. It is the destruction of people's lives, in every imaginable sense. Surviving a war might be less humane than dying in it.

The point was that by removing the consequences of war (soldiers becoming bloodthirsty psychos that rape, kill, torture, and lose respect for the lives of others, all others, in addition to simply having people die, and having economic and environmental catastrophes on your hands), you make war look more and more desirable as an option.

What I was trying to get you to see is that war is always a bad thing, and trying to make it seem like less of a bad thing is the WRONG way to go about it.

Re:the danger of abstracted combat (0)

Anonymous Coward | about 2 years ago | (#42035629)

You're right, I don't comprehend. Are you saying that anything that limits the terrible consequences of war is a bad thing, because people are more willing to accept "less bad" consequences than "more bad" consequences? Because I'm pretty sure the world has no shortage of people willing to engage in war no matter how horrific the consequences are.

I guess what I'm saying is that this seems like a situation which the phrase "the perfect is the enemy of the good" was made for - I'd love it if nobody engaged in warfare ever, but since that is not going to happen in the world we have right now I want to at least minimize the damage. By all means critique the current methods - I think the way we do drone strikes right now is unacceptably inaccurate, leading to too many civilian casualties - but I do think drones are a step in the right direction because they're not as fallible as ground troops.

A ripple in time (0)

Anonymous Coward | about 2 years ago | (#42035073)

I'm certain humanity has had this discussion before.

How is the abstraction of using robots any different from the abstraction of dropping bombs from 30,000 feet?
How is the abstraction of dropping bombs different from shooting someone in the face at 500 yards?
How is shooting someone different from hitting someone with an arrow at 100 yards?
How is hitting someone with an arrow different from dropping boiling oil on someone 30 feet below?
How is dropping oil different from spearing someone?
How is spearing any different from slashing with a sword?
How is a sword different from a rock?

Every time we have this conversation it gets easier. Every time humanity loses.

Re:A ripple in time (1)

wierd_w (1375923) | about 2 years ago | (#42035347)

Indeed. Humans NEVER accept that the answer is so simple.

Don't resort to war. If your cause requires forcing somebody else at gunpoint to comply, it isn't just, it isn't honorable, and it cannot be justified. So, just don't do it.

But no. Humankind is OBSESSED with making other people OBEY, even if it kills everyone else.

War and Pacifism (5, Insightful)

gd2shoe (747932) | about 2 years ago | (#42035977)

Indeed. Humans NEVER accept that the answer is so simple.

Don't resort to war. If your cause requires forcing somebody else at gunpoint to comply, it isn't just, it isn't honorable, and it cannot be justified. So, just don't do it.

Let's say that China attacks Guam tomorrow, and starts moving for Hawaii and the US mainland. What should be done? What should France have done when Germany invaded them in the blitzkrieg?

Clearly somebody isn't justified in any war. Frequently it's both parties. However, it is the height of intellectual dishonesty to say that war is never justified for any of the participants.

Yes, war is never, ever a good thing. Sometimes, though, it really is better than the alternative.

Re:the danger of abstracted combat (0)

Anonymous Coward | about 2 years ago | (#42035109)

Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment, removes the innate guilt of killing those people.

I get your point, but I'm pretty sure we moved past that goal line years ago. When the United States goes to war, the first thing that happens is that a lot of self-guided missiles are launched from ships at targets, to disable all communications and any type of radar/detection devices.

One might argue that having a self guided weapon be just a little smarter is a better alternative to a precision blind strike.

Baby steps.

Re:the danger of abstracted combat (5, Insightful)

Eevee (535658) | about 2 years ago | (#42035131)

A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!) Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Never mind those uppity pool boys with dipnets! Can't make an omelette without breaking some eggs, right?!

So you're for robots and drones, right? Because right now the glitch in programming is that when human soldiers in a combat area see someone with something that might be a weapon, they tend to shoot them. Why? Because the ones going "Is that a weapon or is it a broom?" don't tend to last when it actually is a weapon. A drone operator, on the other hand, can take the time to evaluate the situation, since they aren't in harm's way.

Re:the danger of abstracted combat (1, Insightful)

wierd_w (1375923) | about 2 years ago | (#42035451)

No. You fail to comprehend my position at all.

There shouldn't be anyone making that decision. At all.

Making that decision easier, by having a machine do it to alleviate the guilt of a human operator and his chain of command, is the WRONG direction.

Want to know where it ends? The creation of things like "perfect" WMDs. Kills all the people, spares everything else. Push the button, war is over. A whole society dies, and the one pushing the button loses nothing. What possible reason would that society have to NOT simply push that button whenever it didn't get what it wanted, or to threaten to push it when it didn't get its way?

THAT is the danger of abstracted warfare. It makes the decision to go to war easier. It makes war a more desirable option.

I support 'war isn't a real option, it's an outcome of aggression. It should never BE CHOSEN.'

So, no. I don't support drones, and I don't support soldiers.

Re:the danger of abstracted combat (0)

Anonymous Coward | about 2 years ago | (#42036085)

Ok then. You have your beliefs, I have mine. My belief says that it's my religious duty to execute you as a heretic. Please stand by while I execute anyone else in your household before I put an AK-47 round through your skull. /sarcasm

A pacifist mentality only works when EVERYONE has a pacifist mentality (or it's protected by those with a non-pacifist mentality).

Re:the danger of abstracted combat (0)

Anonymous Coward | about 2 years ago | (#42035407)

Maybe we can get to a stage where autonomous warfare IS the only warfare. Then again that would give some impressive power to military organizations, especially government owned ones. But hey, maybe it'll turn into a for-profit industry by then?

Re:the danger of abstracted combat (1)

wierd_w (1375923) | about 2 years ago | (#42035619)

That does not solve the problem.

We have a problem with war because we have a problem with believing we can (and should!) force other people to do what we want them to do against their will.

It is the same crime as with rape, and with slavery: "I have the biggest gun, do what I say!"

The ONLY time to take up arms is when an aggressor comes to visit YOU. You should NEVER take up arms against another to conquer. Your failing economy is not justification. Your need for cheap energy is not justification. Your fucking god is not justification. Proving your dick is bigger is not justification.

There should only be defensive armies. Drones are not a defensive tool. They should not exist.

Re:the danger of abstracted combat (1)

pkthunders (2777383) | about 2 years ago | (#42035839)

1) I encourage you to not use the word "you" too blatantly; it makes it seem like I am at fault for all of this. Try "one" instead. I mean, I don't have a failing economy; my country does (though it's not really my country, but that's beside the point).
2) My needs and wants are mine and mine only. I am also an atheist. I think that you need to understand that generalizing a society is a pathetic excuse to condemn people that you don't truly understand (aka the individual).
3) My comment was satirical.
4) Yes, I am OP. And yes, I did finally care enough to make an account...
5) Also, I believe your whole argument ignores the fact that modern society is based upon forced manipulation into conformity, caused by the want of individuals to feel like they belong. The media (such as this site) is a strong catalyst for this. It's true, but that's enough cynicism for today.
6) Eh, defense drones exist/are being used currently, and I also think that your views are too idealistic. Although I do commend you for them, it will take quite a while for peace within humanity. I actually hope that a major incident involving drones occurs, resulting in their being labeled as taboo (such as atomic bombs). Hopefully that could even help everyone see that this is pointless. Memento mori. But seriously, I think you are very offensive.

Re:the danger of abstracted combat (1)

gd2shoe (747932) | about 2 years ago | (#42036025)

There should only be defensive armies. Drones are not a defensive tool. They should not exist.

Of course they are. They actually work better as defensive weapons. Just because they are often used offensively doesn't mean they aren't natural peace keepers.

Re:the danger of abstracted combat (1)

Hentes (2461350) | about 2 years ago | (#42035831)

Abstracting away the reality that you are killing people, by making a machine do the actual deed after deployment, removes the innate guilt of killing those people.

It makes it fantastically easier to justify and ignore wholesale slaughter.

A glitch in the program makes the drone think that anyone carrying a cylinder 2 ft long and 1 inch in diameter is a combatant? (Looks like a gun barrel!) Well, all those poor fuckers carrying brooms and sweeping their patios had it coming! Never mind those uppity pool boys with dipnets! Can't make an omelette without breaking some eggs, right?!

While the US Army has developed software that can theoretically command drones autonomously, they don't use it, for exactly these reasons. Current drones are human-controlled. Your argument is based on a false assumption.

Killer robot overlords (0)

Anonymous Coward | about 2 years ago | (#42034727)

I for one do NOT welcome our new killer robot overlords.

Ban (3, Insightful)

ThePeices (635180) | about 2 years ago | (#42034747)

Trying to ban killer robots is a waste of time and won't work. There is also little desire to ban them overall, in the interests of health and safety.

It's safer to kill people using a robot than going out and risking your own skin with guns and/or explosives.
Remember, in this day and age, safety is paramount. You want to be able to kill people from a distance, safely and easily. Why run the risk of getting injured, or even worse, getting killed, when you can kill people using safer methods? Using a robot to kill people just makes sense.

Even worse, you could get sued for endangering the safety of others and breaking health and safety regulations. Killing other people can be a dangerous business, so reducing potential hazards and minimizing harm is a very prudent and right thing to do. You need to be able to kill people safely and efficiently. If you can kill people at a lower cost, then that is even better.

That's why drones are so popular nowadays. All the benefits of killing people, without all the personal risk. It's a win-win all round.

Makes sense doesn't it?

Re:Ban (1)

AwesomeMcgee (2437070) | about 2 years ago | (#42034845)

What we really need is *more* killer robots. TONS of them. Then we can just have them fight our wars for us, between each other, and we don't get human casualties on either side. As soon as your killer robots have killed all of the enemy's killer robots, their people will obviously cede power to you, because it's not like they could defeat your killer robots without theirs, so it would be stupid for them to even continue bothering.

For maximum safety, we just let the killer robots fight each other on the south pole, and war as we know it is over (though there would be a lot more wars, but they'd be nothing like our current wars.)

Re:Ban (1)

dcollins (135727) | about 2 years ago | (#42034993)

"Then we can just have them fight our wars for us, between each other and we don't get human casualties on either side."

Maybe you're joking, but enough people honestly believe this to say -- This is one of the top, ludicrously insane myths among the geek set.

If people are willing to die for a cause, or if they feel life is not worth living without principle or resource X, then they will not stop fighting until they are dead. Simple as that. War is the final extremity, when all agreements break down, and one side is convinced that only extermination will stop the other side. QED.

Re:Ban (0)

Anonymous Coward | about 2 years ago | (#42035187)

Heroin is cheaper. The Talibs are doing a fine job that way.

Re:Ban (1)

AwesomeMcgee (2437070) | about 2 years ago | (#42035251)

Heh, I appreciate you're trying to ensure people don't take me seriously, but you don't need to correct me, I was being completely sarcastic. People who fight wars are dumb, killer robots won't fix that, it'll just make it easy for dumb people to kill more.

Re:Ban (1)

Kjella (173770) | about 2 years ago | (#42035433)

As soon as your killer robots have killed all of the enemy's killer robots, their people will obviously cede power to you, because it's not like they could defeat your killer robots without theirs, so it would be stupid for them to even continue bothering.

And then we all sing kumbayah? Or is that when most of the human population gets the ultimatum to obey that unstoppable killer robot army or die? I don't think you want to find out what 21st century slavery would be like. No more free countries to run off to. Tagged with a microchip, a GPS foot bracelet, cameras and sensors everywhere, and merciless and incorruptible robots enforcing and possibly supervising the system. Every form of communication like phone, email, facebook and whatever monitored and the rest outlawed. Sabotage? Revolt? Execute a few civilia...sorry, slaves, and they'll rethink what they're doing. The big question with any robot army is who has the controls, and who does anyone controlling a robot army answer to? No one, most likely...

Re:Ban (1)

AwesomeMcgee (2437070) | about 2 years ago | (#42035485)

The really funny thing is when the dude holding the controls dies and then the robots no longer answer to anyone, just running everything the same way it's been for years.

Man, there'll sure be egg on our faces that day!

Re:Ban (1)

JonySuede (1908576) | about 2 years ago | (#42034883)

That's why drones are so popular nowadays. All the benefits of killing people, without all the personal risk. It's a win-win all round. Makes sense doesn't it?

It does, way too much for my own taste. That's why it should be banned: it does make it too safe to kill...

Re:Ban (1)

j00r0m4nc3r (959816) | about 2 years ago | (#42035095)

Do you have any idea how many things there are that make it safe to kill? We can't just ban everything that has the potential to be used in warfare; there would be nothing left. What is required is a total shift in human consciousness away from even needing such machines. Our lust for war/power/goods is the real problem. Obviously in the meantime we need to defend ourselves, but banning things like this won't make that any easier, and will only stunt our technological progress. If we don't build ThingX some other country WILL decide to build it, and we'll be at a disadvantage. It's a sad game of deterrence, yes, but until people learn to stop fighting each other and start working together, it's a necessary evil...

Why so cynical? (0)

Anonymous Coward | about 2 years ago | (#42034779)

Taco Cowboy must have a very small penis, which he compensates by having a verrrry big gun.

Ban the drones (1)

Anonymous Coward | about 2 years ago | (#42034789)

The majority of drone strikes that the US participates in kill innocent civilians in countries we're not even at war with.

Ban the drones.

Those aren't the drones you're looking for. (1)

Shavano (2541114) | about 2 years ago | (#42035859)

The drones people complain about in (for instance) Afghanistan aren't autonomous robots. They're flown and their weapons are targeted by human pilots.

Move along.

Robot rights? (1)

girlintraining (1395911) | about 2 years ago | (#42034799)

Where's the 'Robot Rights Watch' just when you need 'em?"

They don't have feelings. If you don't believe me, go stab your toaster. I think what you meant was "human rights" and the effect widespread use of robots with the ability to kill would have on them.

Backwards (1)

Anonymous Coward | about 2 years ago | (#42034817)

Combat robots will almost certainly SAVE civilian lives, not cost them.

A robot needs no personal safety. It doesn't get nervous or make foolish choices or act for revenge. A human soldier may have to make a split-second decision about whether a suspicious person is a terrorist or a civilian, and if he's wrong he's dead. That leads to a lot of casualties. A robot doesn't need to act until it's sure. The worst that can happen is that the Army loses some money.

Re:Backwards (0)

Anonymous Coward | about 2 years ago | (#42034939)

The assumption being that it's in the attacking country's best interest NOT to have excessive casualties in the defending country's civilian populace.

Contrary to popular belief, that is rarely held as true in modern society, given the overabundance of potential colonists in whatever major power's populace you wish to consider this against.

I have a feeling in the near future we're going to see genocide on a scale that will make all prior genocides pale in comparison. I just sincerely hope I'm not in the receiving demographic.

how appropriate (0)

Anonymous Coward | about 2 years ago | (#42034841)

...that a poster whose previous post is whining about how he was bullied at school by people who "knew they were inferior" is now glorifying automated killing and deriding those genuinely intelligent men and women (Harvard has a few trustafarian idiots, but they're a tiny minority) with a sense of humanity.

No.... (0)

Anonymous Coward | about 2 years ago | (#42034843)

Just specify that robots can only kill other robots.
And we can make war obsolete.

KILL THOSE WHO WILL BAN KILLER BOTS !! (0)

Anonymous Coward | about 2 years ago | (#42034867)

All in favor say Aye !!

Killer Robots meet the killer apps (1)

DarksideCoatiMundi (2777343) | about 2 years ago | (#42034897)

Killer robots are inevitable. Equally inevitable is the hacking of killer robots.

Shover robots okay (0)

Anonymous Coward | about 2 years ago | (#42034915)

I don't mind them at all as long as they're shover robots. It's the pusher robots we need to worry about.

In related news ... (1)

PPH (736903) | about 2 years ago | (#42034931)

... the trend giving killer drones an initial lead seems to have been reversed [slashdot.org] in humans' favor recently.

Antisemitism (0)

Anonymous Coward | about 2 years ago | (#42034933)

But banning emotionless murdering kill bots is surely antisemitic?

Guns don't kill people... (1)

Nyder (754090) | about 2 years ago | (#42034941)

...Gun Wielding Robots do!!!!!

That horse left the barn long ago... (1)

Kjella (173770) | about 2 years ago | (#42035001)

Robots doing the killing is not going to be very different from bombing from 10000 ft, launching a cruise missile or long range artillery bombing. It's a long time since you had to look your opponent in the eye as you stabbed him with sword and spear. And that didn't seem to help much to stop war, either. Potentially you can do better with robots, because robots are expendable: you don't have to return fire until you're sure you've isolated the enemy. Even if you were willing to sacrifice your own soldiers to reduce collateral losses, the soldiers in the field probably aren't willing to hold fire while only 90% sure that's a terrorist, or 90% sure the grenade won't kill anyone else.

Of course if you want to act with reckless disregard of - or worse, reign of terror over - the civilian population, there's no real fighting back. But if a modern army wants to raze a city they don't need robots to do it. The only real game changer I see is that a small clique could hold control over a 100% loyal military that'd ruthlessly crush any rebellion. But most of the gruesome things they could want to do, they already have the big guns for. Of course the argument is that clean war leads to more war, but well... we've seen big and dirty war in Hiroshima and Nagasaki. I think for civilians, more wars with cruise missiles still beats fewer wars nuking whole cities off the map.

Re:That horse left the barn long ago... (1)

BeanThere (28381) | about 2 years ago | (#42035829)

Most people are missing the point. It's not about the method of killing, it's whether a particular act of killing is justifiable self-defense or not. If a 'killer robot' protects an innocent woman and/or child from being raped and/or murdered, then it stands to reason this is good. Denying innocent parties a valid method of self-defense (e.g. banning methods of self-defense) is, on the other hand, wrong. When 'killer robots' become intelligent enough to be used for e.g. home security then I'll be getting one... it will keep my wife and children safer from murderous thugs.

As for military, again, what if 'killer robots' could be used to help conduct surgical precision strikes against ruthless murderous dictators like Kim Jong Un, minimizing loss of innocent life? Would they still be 'bad'? Wouldn't banning them in such a case actually prevent killers like Kim Jong Un from being stopped?

The problem with nuclear weapons (that you mention) is that they inherently and unavoidably kill thousands of innocent people. 'Killer robots' can in fact do the exact opposite - they can be used for surgical precision strikes of precisely the 'bad guys' - which is actually the correct thing, in war or not. (Yes they could be used for evil, but that doesn't mean they should be banned.)

User Error (0)

Anonymous Coward | about 2 years ago | (#42035059)

Drones are not to blame.

In an all robot army, the drones would not have to worry about protecting the allied meatbags on the ground. There would be much less of an incentive to "take the shot."

They could be designed within specifications and the people who designed and implemented these specifications could be held responsible because they were able to make the decisions in a stress free environment.

By putting allied humans in the danger zone along with civilians, and spreading the responsibility across a hierarchy of operators, administration, and engineers, we have decided to implement the case that results in the absolute maximum number of dead civilians.

Banning something which doesn't exist (3, Insightful)

poity (465672) | about 2 years ago | (#42035085)

No one has autonomous battlefield drones yet, and I highly doubt any military would rely on them, ever. Well.. unless it's a robot military after they gain sentience and create their own civilization, but then they would be as human as us.

Re:Banning something which doesn't exist (0)

Anonymous Coward | about 2 years ago | (#42035255)

Nobody can develop real estate on other planets yet either, but we already have a treaty on that.

Re:Banning something which doesn't exist (1)

ThatsMyNick (2004126) | about 2 years ago | (#42035755)

It is easier to ban something that doesn't exist. Governments can be persuaded to sign these, with a promise that every country is signing. Now try the same for nuclear weapons (which exist), and you would have trouble even bringing it up for discussion.

This worries me too (1)

Omnifarious (11933) | about 2 years ago | (#42035293)

The 'Berserker' novels of course are an examination of the end result of building such killer robots. It will happen eventually. But I don't want it to happen until some of us are no longer in the solar system.

Re:This worries me too (0)

Anonymous Coward | about 2 years ago | (#42035475)

See also, The Battle by Robert Sheckley.

"http://libertydwells.com/archive/index.php/t-3177.html"

The only thing I find worth noting (0)

Anonymous Coward | about 2 years ago | (#42035361)

Seems like that the manufacturer might have to claim responsibility for autonomous actions. Paramilitary organizations/industries, assemble? I mean it's not like they care what they kill as long as they get paid.

Reality (1)

TRRosen (720617) | about 2 years ago | (#42035365)

Cruise missiles are more of a robot than a drone is. Fact is, drones are just remotely piloted planes. The problem isn't their existence but their misuse.

Re:Reality (0)

Anonymous Coward | about 2 years ago | (#42035889)

Cruise missiles are more of a robot than a drone is. Fact is, drones are just remotely piloted planes. The problem isn't their existence but their misuse.

Yes, missiles are the worst. I think Killer Robots cannot deal out more than the 100% collateral damage that nukes do... unless we arm the robots with nukes.

Obligatory: Robots Are Our Friends (0)

Anonymous Coward | about 2 years ago | (#42035523)

Robots Are Our Friends [albinoblacksheep.com] .

Wrong target (3, Insightful)

gmuslera (3436) | about 2 years ago | (#42035739)

Killer Bots don't kill people, people kill people. Ban the people responsible for those killer bots, and, uh... oh, wait, they just got reelected.

A question from hoi polloi (0)

Anonymous Coward | about 2 years ago | (#42035761)

Aren't jihadis killer robots?

But how do we enforce such a ban.. (1)

AwesomeMcgee (2437070) | about 2 years ago | (#42035817)

If we are to ban such a thing, I think we must ensure appropriate even-handed enforcement of this ban, as such I propose we enlist a force strong enough to subdue any killer robot army should someone break said ban, therefore I suggest we build an army of large mechanized automatons heavily laden with weaponry to subdue any would-be killer robot army, or anyone who might be suspected of attempting to build such an army for that matter.

Re:But how do we enforce such a ban.. (1)

pkthunders (2777383) | about 2 years ago | (#42035861)

Brilliant: nothing can go wrong with this!

Not needed (0)

Anonymous Coward | about 2 years ago | (#42035911)

We don't need to ban killbots. What we need to do, is give them a preset kill limit.

Fry: "I heard one time you single-handedly defeated a horde of rampaging somethings in the something something system"
Brannigan: "Killbots? A trifle. It was simply a matter of outsmarting them."
Fry: "Wow, I never would've thought of that."
Brannigan: "You see, killbots have a preset kill limit. Knowing their weakness, I sent wave after wave of my own men at them until they reached their limit and shut down."

The Milgram Experiments: the reason why (2)

GodfatherofSoul (174979) | about 2 years ago | (#42035933)

As has been stated in other posts, every level of abstraction away from the act of violence removes a layer of conscience from the execution of the act; whether it be robots, drone strikes, or trigger-happy, 60-year-old politicians who ducked service in Vietnam.

http://en.wikipedia.org/wiki/Milgram_experiment [wikipedia.org]

You can have my autonomous killer drones... (1)

SuperKendall (25149) | about 2 years ago | (#42035947)

...when you pry the controls out of my centuries old desiccated hands in my underground mountain fortress.

Sounds like someone's a sissy (1)

AwesomeMcgee (2437070) | about 2 years ago | (#42035951)

This whole idea reeks of wussiness. One weenie is scared of killer robots and makes a big fuss, so none of us get to have killer robots? I don't think so, hombre.