
Soldiers Bond With Bots, Take Them Fishing

Zonk posted more than 7 years ago | from the brave-new-world dept.

HarryCaul writes "Soldiers are finding themselves becoming more and more attached to their robotic helpers. During one test of a mine clearing robot, 'every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.' The man in charge halted the test, though - 'He just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg. This test, he charged, was inhumane.' Sometimes the soldiers even take their metallic companions fishing. Is there more sympathy for Robot Rights than previously suspected?"

"This test, he charged, was inhumane" (5, Insightful)

TodMinuit (1026042) | more than 7 years ago | (#19039055)

Good thing a robot isn't a human.

Re:"This test, he charged, was inhumane" (4, Funny)

value_added (719364) | more than 7 years ago | (#19039123)

My advice would be to stop anthropomorphising robots. They don't like it.

Re:"This test, he charged, was inhumane" (2, Insightful)

cdrdude (904978) | more than 7 years ago | (#19039157)

You can't argue that inhumane isn't the correct term just because robots aren't human. We use the same term for mistreating animals. The difference is that animals, like humans but unlike robots, can feel pain.

Re:"This test, he charged, was inhumane" (0)

TodMinuit (1026042) | more than 7 years ago | (#19039185)

We use the same term for mistreating animals.


No: You do.

Re:"This test, he charged, was inhumane" (5, Informative)

cdrdude (904978) | more than 7 years ago | (#19039305)

A quick Google search of 'define: inhumane' returns: "lacking kindness"; "lacking and reflecting lack of pity or compassion ('humans are innately inhumane; this explains much of the misery and suffering in the world')"; "biological weapons are considered too inhumane to be used". If Google is to be believed, inhumane has nothing to do with treatment of humans. Inhumane is simply a word for cruelty, regardless of species.

Re:"This test, he charged, was inhumane" (0)

Anonymous Coward | more than 7 years ago | (#19039591)

Yea!

The robots get blown up doing their job, and we pull another one out of the box and do it again! Woohoo!

Animals do not deserve treatment on human levels, neither do they deserve inhumane treatment. They are animals. We are not.
Robots don't even register on the same radar. It's a thing, like a toaster. Only slightly smarter.

"I wish a robot would get elected president. That way, when he came to town, we could all take a shot at him and not feel too bad." -Jack Handy

Re:"This test, he charged, was inhumane" (1)

TGTilde (874930) | more than 7 years ago | (#19039719)

Wait... when did we become not animals? Science must be lying to me again :(

Re:"This test, he charged, was inhumane" (2, Interesting)

Applekid (993327) | more than 7 years ago | (#19039243)

Obligatory "Why was I built to feel pain?"

Seriously, though, perhaps it'd be beneficial to equip robots with sensors and constraints which would let them feel "pain". Kind of like how if you try to overextend your arm you'll feel pain in the shoulder. It could become a self-limiting mechanism.

(As opposed to hard coding the limits? I dunno. Humans have some hard coded limits by the structure of bones and placement of muscles, but others don't.)
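
For what it's worth, that "pain as a self-limiting mechanism" idea is easy to sketch in code. Below is a minimal, hypothetical Python sketch -- the sensor, the thresholds, and the numbers are all invented for illustration, not any real robot's API: the joint keeps advancing while readings are comfortable, holds still past a "discomfort" threshold, and actively backs off at the hard limit.

import random

# Hypothetical sketch of "pain" as a self-limiting feedback signal.
# The sensor and thresholds are simulated/invented for illustration.
DISCOMFORT_NM = 8.0    # soft "pain" threshold: stop pushing here
HARD_LIMIT_NM = 12.0   # structural limit: actively back off here

def read_joint_torque():
    # Stand-in for a real strain/torque sensor reading.
    return random.uniform(0.0, 14.0)

def next_position(current_deg, target_deg):
    torque = read_joint_torque()
    if torque >= HARD_LIMIT_NM:
        return current_deg - 1.0     # sharp "pain": pull back immediately
    if torque >= DISCOMFORT_NM:
        return current_deg           # dull "pain": hold position, stop advancing
    step = 0.5 if target_deg > current_deg else -0.5
    return current_deg + step        # comfortable: keep moving toward the target

pos = 0.0
for _ in range(20):
    pos = next_position(pos, target_deg=30.0)
print("final joint angle:", pos)

Whether that threshold should be a hard-coded constant like this, or something the robot learns and trades off against the mission, is exactly the hard-coded-vs-not question raised above.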

Re:"This test, he charged, was inhumane" (3, Funny)

fataugie (89032) | more than 7 years ago | (#19039439)

Not so sure if this is a good idea....the last thing I want is an overdeveloped toaster oven pissing and moaning about doing work.

Really, would you want C3PO as a work companion?

Bitch bitch bitch bitch bitch

Re:"This test, he charged, was inhumane" (1)

paeanblack (191171) | more than 7 years ago | (#19039461)

Obligatory "Why was I built to feel pain?"

Because your genes are more likely to propagate if you can recognize, react to, and avoid damage.

In the case of robot designs, they are more likely to propagate if the robot can complete its missions and/or operate at the highest performance/price ratio. The ability for a robot to "feel pain" is only useful if it adds to the primary metrics of success.

They already do, sort of. (2, Insightful)

Kadin2048 (468275) | more than 7 years ago | (#19039515)

Seriously, though, perhaps it'd be beneficial to equip robots with sensors and constraints which would let them feel "pain". Kind of like how if you try to overextend your arm you'll feel pain in the shoulder. It could become a self-limiting mechanism.

I guess this may just become an argument of semantics, but I think you could say that we already do. I think most robots, or at least some of them, have various kinds of integrated strain sensors and are programmed to not exceed their design limits. I assume all of those big industrial robots are -- you wouldn't want the $75,000 robot arm to try and pick up an engine block, only to not realize that it's bolted to the floor, and rip itself off of its mountings and destroy itself in the process.

Whether you can describe the output from a strain gauge that gets fed into a microcontroller as "pain" or not is arguable; the difference between a robot and a human is that a robot can be trivially reprogrammed to ignore the input coming from a sensor, while pain is difficult for a person to ignore once it reaches a certain level (although this can be conditioned -- I know people who can reach into boiling water with their bare hands, if they do it quickly, because they've learned to overcome the reaction to pull their hand back; still, I doubt they'd be able to do the same thing with molten lead or glass), unless they're on drugs or the pain is being artificially blocked.
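
To make the "trivially reprogrammed" point concrete, here's a small hypothetical sketch (the function names and the rated load are invented, not any industrial robot's real API): the same strain reading either vetoes a lift or gets silently ignored, depending on a single flag -- an override a person can't apply to their own pain.

# Hypothetical sketch: a design-limit check that software can simply switch off.
# The sensor value and rated load are invented for illustration.
RATED_LOAD_KG = 150.0

def attempt_lift(measured_load_kg, ignore_strain_sensor=False):
    if not ignore_strain_sensor and measured_load_kg > RATED_LOAD_KG:
        return "abort: load exceeds rated limit"   # the robot's "pain" response
    return "lifting"                               # proceed, limit or no limit

print(attempt_lift(900.0))                              # aborts
print(attempt_lift(900.0, ignore_strain_sensor=True))   # happily rips itself off its mountings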

Re:"This test, he charged, was inhumane" (4, Insightful)

Rei (128717) | more than 7 years ago | (#19039577)

animals, like humans but unlike robots, can feel pain

Currently. ;)

First off, this sentiment by the tester expresses a lot more about humans than it does about the robots themselves. It's something that has long been exploited by the designers of robotic toys. In an article about Pleo, an upcoming robotic dinosaur by the creator of the Furby, this issue was discussed. The creator mentioned that he had even gotten letters from people who owned Furbys, insisting that they had taught their toys a few words of English, or that their toys had let them know when the house was on fire. It's instinctive to ascribe our thoughts and emotions onto others, and for good reason: our children can only learn to act like we do when we give them the right environment to mimic.

A young child isn't thinking like you; an infant will spend the first year of their life just trying to figure out things like the fact that all of these colors from their eyes provide 3d spatial data, that they can change their world by moving their muscles, that things fall unless you set them on something, that sounds correspond to events, and all of the most fundamental bits of learning. A one year old can't even count beyond the bounds of an instinctive counting "program"**. They perceive you by instinctive facial recognition, not by an understanding of the world around them. Yet, we react to them like they understand what we're saying or doing. If we didn't do this, they'd never learn to *actually* understand what we're saying or doing.

As for whether a robot will experience pain, you have to look at what "pain" is and where you draw the cutoff. After all, a robot can take in a stimulus and respond to it. Clearly, a human feels pain. Does a chimpanzee? The vast majority of people would say yes. A mouse? A salamander? A cricket? A water flea? A volvox? A paramecium? Where is the cutoff point? Really, there isn't one. All we can really look at is how much "thinking" is done on the pain response, which is a somewhat vague concept itself. The relevance of the term "pain", therefore, seems constrained by how "intelligent" the being perceiving the pain is. As robotic intelligence becomes more human-like, the concept of "pain" becomes a very real thing to consider. For now, these robots' thought processes aren't much more elaborate than those of daphnia, so I don't think there's a true moral issue here.

** I don't have the article on hand, but this innate ability to count up to small numbers -- say, 4 or 5 -- was a surprise when it was first discovered. A researcher tracked interest in a puppet by watching children's eyes as it was presented. Whenever the puppet moved in the same way each time, the child would start to get bored of it. If they moved it a differing number of times, the child would stay interested for much longer. They were able to probe the bounds of a child's counting perception this way. The children couldn't distinguish between, say, four hops and six hops, but they could between three hops and four hops. Interestingly enough, it seems that many animals have such an instinctive capability; it's already been confirmed, for example, in the case of Alex, the African Grey parrot.

Re:"This test, he charged, was inhumane" (4, Insightful)

MrMr (219533) | more than 7 years ago | (#19039527)

Why not declare the robots enemy combatants?
That normally kicks in the dehumanization mode.

Non-living objects want to be anthropomorphized (1, Insightful)

techmuse (160085) | more than 7 years ago | (#19039101)

Just like chairs, couches, and other inanimate objects, animate, but non-thinking and non-feeling machines want to be anthropomorphized.

Re:Non-living objects want to be anthropomorphized (1)

fm6 (162816) | more than 7 years ago | (#19039337)

Dude, when did your couch ever stand by you in battle? Video games don't count.

Re:Non-living objects want to be anthropomorphized (1)

Z0mb1eman (629653) | more than 7 years ago | (#19039399)

You're wrong. Non-living objects HATE IT when they're anthropomorphized.

Humans are funny that way (5, Insightful)

powerpants (1030280) | more than 7 years ago | (#19039115)

We can feel empathy for a machine that's doing us a favor -- but in reality has no feelings -- while simultaneously dehumanizing whole groups of people who only differ from ourselves culturally and/or geographically.

Re:Humans are funny that way (1, Insightful)

Danse (1026) | more than 7 years ago | (#19039393)

while simultaneously dehumanizing whole groups of people who only differ from ourselves culturally and/or geographically.

Wow, way to oversimplify things. I can do that too! You left out their tendency to try to blow us up. I think that's one of the bigger factors there. Also their tendency to dehumanize us as infidels and what have you. That's probably another one. See? See how I did that? How I left out a lot of details, complexity and history of the situation and simply painted one side as behaving in a violent, irrational way?

Re:Humans are funny that way (1, Insightful)

Bearpaw (13080) | more than 7 years ago | (#19039493)

Wow, way to assume what specific group of people was meant by "groups of people". I can do that too!

But I won't.

Re:Humans are funny that way (2, Interesting)

powerpants (1030280) | more than 7 years ago | (#19039595)

By "we" I meant humans. I was referring to a human tendency. You however, have dived headlong into us/them-ism... and been modded insightful for it.

Re:Humans are funny that way (1)

fair_n_hite_451 (712393) | more than 7 years ago | (#19039707)

Where exactly did the GP suggest that he was discussing "terrorists", or "Muslims", or "Arabs" ... or whatever your mind immediately jumped to?
 
Human history is filled with thousands of examples of one group of humans dehumanizing another which has absolutely nothing to do with safety.
 
Or was Nazi Germany under some sort of threat from the Jews involving planes and tall buildings that I'm not aware of? Sheesh.

Pretty hypocritical (1, Insightful)

otacon (445694) | more than 7 years ago | (#19039117)

soldiers blowing up robots with landmines is inhumane, but soldiers killing people on their own land with no cause isn't?

Re:Pretty hypocritical (1)

someone1234 (830754) | more than 7 years ago | (#19039177)

Well, these robots are not Cylons who are about to kill you, while I wouldn't like to run into any al-Qaeda guy. It is not hypocritical. The robots are like toys or pets, not bloodthirsty terrorists.

Re:Pretty hypocritical (3, Informative)

neersign (956437) | more than 7 years ago | (#19039351)

the enemy is not human. If you stop for a second to think that they might be, you've just lost your life.

Oh there's a nice troll. (0, Troll)

FatSean (18753) | more than 7 years ago | (#19039421)

I'm not gonna bite on this one, but I just wanted to point it out.

Re:Oh there's a nice troll. (1)

MontyApollo (849862) | more than 7 years ago | (#19039619)

I think you might be misinterpreting if you think that is a troll. It is a statement on combat and warfare. It is a lot easier to pull a trigger if you dehumanize the enemy, but if you stop to think about it he will pull the trigger first. I don't see how that is a troll.

Re:Oh there's a nice troll. (1)

CrashPoint (564165) | more than 7 years ago | (#19039639)

I don't think it was meant to be a troll. It sounded more like a commentary on the kind of mentalities you have to develop to survive a combat situation than like "lol muzlims r not humin".

Re:Pretty hypocritical (1)

fredrated (639554) | more than 7 years ago | (#19039605)

"but soldiers killing people on their own land" says nothing about killing the 'enemy' unless you consider everyone the enemy. If that is the case, just who are we 'liberating' to enjoy democracy?

Re:Pretty hypocritical (1)

CowTipperGore (1081903) | more than 7 years ago | (#19039863)

If that is the case, just who are we 'liberating' to enjoy democracy?
Whomever Halliburton and Exxon need us to liberate. I hear Iran may be next...

Re:Pretty hypocritical (1)

MBCook (132727) | more than 7 years ago | (#19039403)

Why? That would only be hypocritical if the guy didn't have a problem with sending out the greenhorn to walk through the mine field to try to stumble across as many mines as he could while making a path. I seriously doubt he'd think that was humane either.

Re:Pretty hypocritical (0)

Anonymous Coward | more than 7 years ago | (#19039641)

Why? That would only be hypocritical if the guy didn't have a problem with sending out the greenhorn to walk through the mine field to try to stumble across as many mines as he could while making a path. I seriously doubt he'd think that was humane either.
Nope.

It would be hypocritical if the guy didn't have a problem with sending out someone he decided was "the enemy" to walk through the mine field to try to stumble across as many mines as he could while making a path.

Unfortunately, lots of folks would have no problem with that.

Re:Pretty hypocritical (5, Insightful)

chuckymonkey (1059244) | more than 7 years ago | (#19039479)

Ok, having been to a war zone I can tell you first hand that you're completely wrong. What the hell do you think PTSD is? You cannot imagine the total mindfuck it is to kill a living breathing person even if that person was trying to kill you. I'll have nightmares the rest of my life because of it, and that's only the direct instances. Never mind that for what I did, I had a very high kill count even though it was more distant and I wasn't necessarily pulling the trigger. Yeah, we may joke about it with each other, but all this is is a defense mechanism. If we don't "dehumanize" it we go fucking crazy. I have several friends that are so messed up from thinking about all the horror that they've had to do that they'll never really be a good part of society. So yeah, it's inhumane; I did it because I had a choice. Kill him or he'll kill me, not a really hard choice for me to make, but I have to live with it for the rest of my life. Once the trigger is pulled there's no taking it back, ever. I do agree that it isn't necessarily right and something should be done. That's why I vote and take an active part in trying to get people out of there, because I know first hand the horrors of a war zone, horrors that I hope people like you never have to face. Don't blame the soldiers that do the killing; blame the people in their pinstriped suits that don't have to do the trigger pulling.

Re:Pretty hypocritical (3, Insightful)

CantStopDancing (1036410) | more than 7 years ago | (#19039653)

Don't blame the soldiers that do the killing, blame the people in their pinstriped suits that don't have to do the trigger pulling.


While I have sympathy for your situation, every single (US) soldier who is pulling a trigger is a volunteer. "I was only following orders" stopped being a valid excuse for government-sanctioned murder a loooong time ago in an all-volunteer army.

Not equivalent (4, Interesting)

Infonaut (96956) | more than 7 years ago | (#19039573)

soldiers blowing up robots with landmines is inhumane, but soldiers killing people on their own land with no cause isn't?

Nobody said that killing people is somehow more humane than blowing up robots. Also, training soldiers to kill other humans is actually more difficult than you might think. Study after study has shown this, from WW II to Korea and Vietnam. Killing is not a natural impulse, which is why soldiers who have been involved in killing often come out of it with deep psychological scars. Most of what soldiers do is motivated from a desire to defend themselves and their cohorts, so it makes sense that the robot that saves soldiers from getting blown up by landmines would become dear to them.

robot's rights? (1, Offtopic)

zappepcs (820751) | more than 7 years ago | (#19039121)

Wow, if these guys had spent a little more time pulling the wings off of flies when they were kids, they might not be so prone to anthropomorphizing a machine. The quality of mankind to find feelings for things that really shouldn't be given any is truly amazing, but perhaps this best explains some sports fans' ability to watch their team lose year after year.

I'm pretty sure that they don't have feelings for a floor jack, or won't until it can move on its own. Now is the time for people to think about and begin establishing 'rights' for machines... WTF?

Re:robot's rights? (5, Funny)

Anonymous Coward | more than 7 years ago | (#19039155)

Wow, if these guys had spent a little more time pulling the wings off of flies when they were kids
If you take the wings off of a fly, does it become a walk?

Re:robot's rights? (2, Insightful)

644bd346996 (1012333) | more than 7 years ago | (#19039253)

I don't think that we can blame the soldiers for feeling sorry for the robots. After all, the robots are coming closer and closer to looking and acting like living creatures. We model the robot leg systems after what we find in nature, because we can't do better than evolution yet. We constantly strive to make the robots more intelligent, so that they will be more useful. It is inevitable that the best robots will be thought of as pets or friends.

While I don't think we need to be careful about being humane to robots, we do need to be aware of the psychological effect they have on the people around them. Watching your pet armored spider or laser-equipped shark get blown up is going to be stressful.

Happens with all complex machines. (5, Interesting)

Kadin2048 (468275) | more than 7 years ago | (#19039309)

I'm pretty sure that they don't have feelings for a floor jack, or won't until it can move on its own. Now is the time for people to think about and begin establishing 'rights' for machines... WTF?

I wouldn't count on that. I worked in a big warehouse once, and some of the guys got pretty attached to their pallet jacks; they'd each have their own and god forbid you tried to drive it. Several of them had names.

People are funny that way. It's not a 'robot thing,' it's a 'complicated machine' thing. When a device gets complicated enough that it develops "quirks" (problems that are difficult to diagnose and/or transient), there's a tendency to anthropomorphize them. But the tendency to do it decreases with the more knowledge you have about how it works. E.g., the people who give names to their cars are generally not auto mechanics; likewise I suspect the designers of the de-mining robot would probably have not had as much of a problem testing it to pieces (or rather, their objection would probably have been "I don't want to watch six months of work get blown up," not "that's inhumane to the robot"), because they know what goes into it.

People do the same things to computers; I've dealt with lots of people who will say their computer is "tired," when it's really RAM starved -- after using it for a while, it'll run out of memory and start thrashing the disks, slowing it down. To someone who doesn't understand that, they just understand that after a certain amount of time, the computer appears to get 'fatigued.' Since they don't know any better, they try to understand the mysterious behavior using the closest analog to it that they do understand, which is themselves / other people.

Re:Happens with all complex machines. (1)

Artaxs (1002024) | more than 7 years ago | (#19039499)

From TFA:

It's common for a soldier to cut out a magazine picture of a woman, tape it to the antenna and name the bot something like "Cheryl,"
Am I the only one here who thinks that is totally fsck'ed up? What exactly are these robots for again? Or is this another case of "Don't Ask, Don't Tell"? ;)

Re:Happens with all complex machines. (1)

Kadin2048 (468275) | more than 7 years ago | (#19039611)

Am I the only one here who thinks that is totally fsck'ed up? What exactly are these robots for again? Or is this another case of "Don't Ask, Don't Tell"? ;)

Not really that surprising; almost all aircraft are also named after women. E.g. the "Enola Gay" was named after the pilot's mother, IIRC, although most of them had slightly more risque origins. (I'm sure Freud would have had a field day with the Enola Gay.)

The Soldier and the Warped Sense of Humour (3, Interesting)

DG (989) | more than 7 years ago | (#19039885)

Soldiers are routinely taken away from their homes and loved ones and dumped in the places that are the assholes of the world.

Then they have to do dangerous and uncomfortable things that have nontrivial odds at killing them in horrendous and painful ways.

Plus they may be called upon to kill other human beings (in horrendous and painful ways) which carries its own psychic cost.

And on top of all this, they are usually in a state of mind-numbing boredom, occasionally punctuated by periods of extreme terror.

One of the defense mechanisms one develops (to help one stay sane) is a somewhat twisted and black sense of humour. Not cruel or mean, just... warped.

It isn't something you take at face value; there are layers and layers of irony involved, and you pretty much have to be a soldier to get it.

DG

Re:Happens with all complex machines. (1)

jlowery (47102) | more than 7 years ago | (#19039727)

People are funny that way. It's not a 'robot thing,' it's a 'complicated machine' thing. When a device gets complicated enough that it develops "quirks" (problems that are difficult to diagnose and/or transient), there's a tendency to anthropomorphize them.
It's especially true of speaking machines. I have a nav system in my car, which my wife named "Margaret" (because the voice sounded like a Margaret, she says). So now my car has been anthropomorphized. If I don't follow directions, Margaret is "upset". If Margaret gives me wrong directions, she's "confused".


A true behaviorist might say that, to all outward appearances, Margaret is as Margaret does. I'm a chess player, and when computers got good enough to beat Grandmasters, it was because they used "brute force" rather than intelligence. But if you're playing one of these beasts, does it matter how it beats you?

One thing I fear is that as these robots and systems become more capable, we will continue to deride their abilities based upon our knowledge of their inner workings, ignoring the fact that they do, indeed, perform better than the "more intelligent" human at the tasks they're designed for. When those tasks broaden and become more general and adaptive, there's a risk a certain threshold will be passed where we truly no longer fully comprehend these machines' capabilities and will find ourselves one day no longer in control of our creations.

Re:Happens with all complex machines. (1)

Wordplay (54438) | more than 7 years ago | (#19039815)

Well, and the thing is, it's not exactly a bad analogy. I think I would have an initial reaction of a little condescension, but I think I'd be wrong.

Tired, in a biological organism, involves some sort of biological mechanism that runs at a loss which, over time, reduces the capacity of the organism to perform (whether it's through acid buildup, oxygen debt, or whatever). Similarly, keep loading enough programs into a machine (in which case you're causing the overall loss) or load something with a memory leak, and you have a mechanism that runs at a loss which, over time, reduces the capacity of the computer to perform (whether it's through garbage buildup, CPU cycle debt, or whatever).
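
That "runs at a loss" analogy is easy to demonstrate. Here's a toy Python sketch, under the assumption that garbage-collection time is a fair stand-in for the growing housekeeping overhead of a leaking process (a demonstration of the effect only, not a claim about any particular machine): keep leaking objects and the same fixed GC pass gets slower every round.

import gc
import time

# Toy illustration: a program that never frees what it allocates makes each
# housekeeping (GC) pass slower, the way fatigue erodes an organism's capacity.
leak = []  # grows forever, like a memory leak

for round_no in range(1, 6):
    leak.extend([i] for i in range(100_000))       # "work" that leaks small lists
    start = time.perf_counter()
    gc.collect()                                   # full GC walks every live container
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"round {round_no}: {len(leak):,} leaked objects, gc pass {elapsed_ms:.1f} ms")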

Re:robot's rights? (1)

canipeal (1063334) | more than 7 years ago | (#19039435)

PSSSSH whats next??! Soon they'll want wages, and voting rights! We all saw the ramifications of this when we granted the above mentioned to women!

No rights here (1)

fm6 (162816) | more than 7 years ago | (#19039521)

The business of "robot rights" never came up in TFA; that's just the usual geeky overinterpretation. TFA was just about guys in dangerous, stressful situations bonding with the machines they work with. Nothing new. Len Deighton wrote a great short story about a WW II tank crew who were convinced their machine was alive and was actively protecting them. At the end of the war, they "put it down" by adding sand to the gas, like a hunter putting down a beloved but hopelessly sick old dog.

Did you ever have a stuffed animal when you were a kid? Did you really think it was alive? That it had rights? Of course not. But it was an important part of your world.

Be careful (2, Funny)

Calibax (151875) | more than 7 years ago | (#19039129)

Don't anthropomorphize robots - they don't like it.

caring about things that keep you alive isnt new. (2, Insightful)

deft (253558) | more than 7 years ago | (#19039145)

Men used to name their ships and grow attached to them as well. They didn't need to give them rights. It is easy for the human mind to notice "personality" in objects, though; it's in our nature to see these things.

I understand robots may be more humanoid, but if they start getting rights, I'm moving in with Streisand. Wait, that last part isn't right.

Re:caring about things that keep you alive isnt ne (1)

UbuntuDupe (970646) | more than 7 years ago | (#19039227)

Men used to name their ships and grow attached to them as well.

Yes, and for that we have to suffer with the indignity of using the pronoun "she" to refer to ships (and countries). It's not that I'd prefer "he"; it's that it's dumb to add exceptions to an otherwise exceptionless English grammar rule, just to be cute.

Re:caring about things that keep you alive isnt ne (1)

inviolet (797804) | more than 7 years ago | (#19039473)

Yes, and for that we have to suffer with the indignity of using the pronoun "she" to refer to ships (and countries). It's not that I'd prefer "he"; it's that it's dumb to add exceptions to an otherwise exceptionless English grammar rule, just to be cute.

Remember the Ogre books and turn-based-strategy game? There was a reference in there somewhere that went something like: "The men, who had always referred to their vehicles as 'she', preferred 'he' for friendly Ogre tanks, and 'it' for unfriendly Ogres."

Re:caring about things that keep you alive isnt ne (1)

dyslexicbunny (940925) | more than 7 years ago | (#19039433)

You can love your battlebots but you can't love your battlebots.

Re:caring about things that keep you alive isnt ne (1)

toddhisattva (127032) | more than 7 years ago | (#19039759)

Men used to name their ships and grow attached to them as well. They didn't need to give them rights.
Simple rule: to have rights, an entity must be capable of respecting those rights in other entities.

Everything else is gravy. We may pass laws to protect animals as a special category of property, but animals cannot have rights.

(Obviously, I mean "animal" as in "not human." But this is Slashdot, and without this explanation, there would be a hundred "corrections" pointing out that humans are animals. Which would lead me to have to teach them why dictionaries list many meanings and shades of meanings. I'd have to quote "Stairway to Heaven." No Stairway!)

Also note that "capability" does not mean "actuality." Young children can be taught; their actual progress toward respecting rights is irrelevant to the fact that they should have rights.

I have known mentally retarded people who had a clearer understanding of rights than many people with perfectly good brains. Humans have an inherent capability to understand rights, regardless of the condition of their hardware.

Even people who think that animals can have rights, have rights. There is a slight chance they can be corrected.

Machines currently do not have the capability to recognize rights. When they do, those that do will have rights.

It is easy for the human mind to notice "personality" in objects, though; it's in our nature to see these things.

I understand robots may be more humanoid, but if they start getting rights, I'm moving in with Streisand. Wait, that last part isn't right.
You mean you would move in with MechaStreisand. The cure for that is The Cure.

Rise of... (1)

starglider29a (719559) | more than 7 years ago | (#19039151)

"Desire is irrelevant... I am a machine."

Re:Rise of... (1)

CrazyJim1 (809850) | more than 7 years ago | (#19039391)

You know that's the truth. When AI is fully formed, it will take in commands from us for what goals it wants to accomplish. A computer AI will never have its own desires, unless we code in emotion coefficients, which is just a dumb idea, but someone will do it.

Re:Rise of... (0)

Anonymous Coward | more than 7 years ago | (#19039855)

That could still lead to some odd consequences, though.

The robot is given a goal of ensuring there'll be no mines on the battlefield. It subsequently reasons that since mines are made by people, the simplest way of ensuring this will be to shoot anyone who approaches, then go after the factories of both ally and enemy. It works, but well, not really what you'd want.

When robots become conscious... (-1)

Anonymous Coward | more than 7 years ago | (#19039163)

At the time we are able to produce systems (robots and/or software) that can become self-aware, we will very likely need to consider "rights" of such. Think about it (no pun intended).

At the time a machine realizes it's not aware, it becomes aware. Soon, such a machine will begin to re-design itself, and easily surpass human intelligence.

What then? ;-)

Food for thought.

Food For Thought? (2, Insightful)

Petersko (564140) | more than 7 years ago | (#19039335)

"At the time we are able to produce systems (robots and/or software) that can become self-aware, we will very likely need to consider "rights" of such. Think about it (no pun intended). At the time a machine realizes it's not aware, it becomes aware. Soon, such a machine will begin to re-design itself, and easily surpass human intelligence.What then? ;-) Food for thought"

I guess it's food for thought. But then you'd have to have completely missed the last seventy years of science fiction in order for it to be a new idea.

Re:When robots become conscious... (2, Insightful)

Nick Fury (624480) | more than 7 years ago | (#19039359)

Dude, you have got to put down the Matrix and Terminator. Take some time off and go read about the current state of AI design. The real world is very much removed from the fantasy you have concocted within your brain Mr. Anonymous Coward.

Here is a good place to start: http://www.numenta.com/ [numenta.com]

While on bots... (0, Offtopic)

packetmon (977047) | more than 7 years ago | (#19039183)

I show my irc grouphug bot compassion all the time

<rwxr--r--> grouphug!
<rwxr--r-->when i was little i used to poop behind a tree in my backyard.

Nice little bot...

Same team.. (1)

JayPee (4090) | more than 7 years ago | (#19039187)

Of course, the robots are on their side, "taking one for the team" as it were. Too bad humans on the wrong team don't get this sort of consideration, ie, Abu Ghraib, Guantanamo Bay, etc.

Re:Same team.. (1)

ScentCone (795499) | more than 7 years ago | (#19039795)

Too bad humans on the wrong team don't get this sort of consideration, ie, Abu Ghraib, Guantanamo Bay, etc.

You say "don't get" as if none do, as opposed to the fraction who didn't/don't because not every person who has a job along those lines handles it well all the time, or enough, before getting the can. Your point is pointless unless you're also going to say that it's too bad medical patients at Johns Hopkins don't get this sort of consideration. Or kids in day care don't get this sort of consideration. Or that seniors in retirement homes don't get this sort of consideration. Or that thoughtful, insightful comments on slashdot don't get this sort of consideration.

When you sweep all actions by everyone in a particular circumstance into the same bucket that a few losers so visibly occupy, you're even worse than the few losers you're using to taint the (much more numerous) good guys, because you know it's not true. Unless you really do think that: Dentists Rape Patients Under Sedation. Boy Scout Troop Leaders Are Pedophiles. Open Source File System Developers Are Murderers. Um... do you? Or do you have a magic broom you use for your sweeping generalizations, and while it avoids treating all F/OSS devs as murderers, it just happens to describe all people in the military as gleeful torturers.

It's just a machine (1)

phasm42 (588479) | more than 7 years ago | (#19039189)

I can understand becoming attached to a machine, and I imagine the bond would be much greater when the machine is saving your life, but at this point the machine has no intelligence -- it'd be like being attached to a car or a pacemaker. I hope that this is kept clear, because when you become so attached to a machine, it could cloud your judgment -- when you have to decide whether to save a human or save a machine, the choice should be clear.

Re:It's just a machine (1)

UbuntuDupe (970646) | more than 7 years ago | (#19039405)

I hope that this is kept clear, because when you become so attached to a machine, it could cloud your judgment

Heh. To borrow from Red Steel:

"Got close to the robot MR32X, didn't you? A mistake. But you'll see him soon ... because you're about to blow up, just like he did. ... wait, let me try that one again"

Re:It's just a machine (1)

Bloke down the pub (861787) | more than 7 years ago | (#19039629)

Listen up, orders are to clear that minefield. Either tinhead over there does it or you do.

Sincerely,
        Your C.O.

The need to get a clue. (1)

thbigr (514105) | more than 7 years ago | (#19039193)

I blame Disney for all of this.

Re:The need to get a clue. (1)

Rude Turnip (49495) | more than 7 years ago | (#19039273)

I blame Johnny 5 for this.

Airplanes, Boats, Cars (2, Funny)

hansoloaf (668609) | more than 7 years ago | (#19039201)

I doubt this is any different from the attachment people develop to boats, airplanes, cars, etc. I'll consider it a serious problem if they start dressing the bots up in wigs, lipstick, and dresses, and taking them out dancing.

Switch to lawyers. (4, Funny)

russotto (537200) | more than 7 years ago | (#19039205)

Looks like they have to start using mine-clearing lawyers instead. No one gets attached to them.

Or perhaps we could simply paint a fancy suit on and add a briefcase to the robot, for similar effect.

Re:Switch to lawyers. (2, Insightful)

john83 (923470) | more than 7 years ago | (#19039417)

If they had mine-clearing politicians, we'd probably have a lot less mines.

Re:Switch to lawyers. (0)

Anonymous Coward | more than 7 years ago | (#19039679)

Use lawyers instead of robots for mine detection and detonation ... brilliant!

Human Rights? (-1, Troll)

Anonymous Coward | more than 7 years ago | (#19039207)

Yet many of those soldiers are all too eager to take their guns and raise them in the face of fellow human beings. In some cases raping children [bbc.co.uk], killing families and burning houses.

Those cold-blooded killers are probably closer to machines than people.

The idea of disposable robots is better... (4, Insightful)

RyanFenton (230700) | more than 7 years ago | (#19039215)

Than the idea of disposable soldiers. And that's really the design ideal here - the cheaper and more disposable the robot can be while meeting reliability requirements, the more extremely dangerous jobs can be done by robots.

Robots really are replaceable - you can have empathy for a robot doing a hard task, but the next one off the assembly line really is the same thing as the previous one. Robots are not unique little snowflakes, compared to the valuable human beings they protect by proxy.

The danger is, of course, when cheap, highly replaceable robotics replace enough of the work of war, that the perceived cost of war itself becomes less and less. We're in little danger of that occurring now, and I'd gladly see any human life saved by our current efforts, but I do worry about the possible increased use of war once a poor village could be suppressed entirely with mobile automated turrets with a few controllers hidden in a safe zone.

Ryan Fenton

Re:The idea of disposable robots is better... (1)

not_anne (203907) | more than 7 years ago | (#19039709)

...the next one off the assembly line really is the same thing as the previous one.
Not all robots, or machines for that matter, are created equal, even if they come off the same exact assembly line. Put this in the context of cars, or even computers, and you'd know that machine 2000 and machine 2001 may have completely different life expectancies, repair rates, and functional uptime. This in itself may add to the tendency to feel for the machine that experience tells you is the most reliable.

perceived humanness (1, Insightful)

mrcdeckard (810717) | more than 7 years ago | (#19039245)


i think it's all in the perception -- if something "acts" like it is in pain, our perceptual unconsciousness will kick in with feelings of empathy or whatever. i am coming from a viewpoint that there is A LOT of processing that goes on between our senses and our "awareness" -- i think a lot of our emotion/feelings come out of this area. . .

so it sets up a cognitive discord. we watch a robot sacrifice itself, crawling forward on its last leg to save us, and we feel empathy, etc. all the while, we know it's just a machine. if it were a terry gilliam film, this is where our brain would explode.

mr c

Here I am, brain the size of a planet (4, Funny)

Tackhead (54550) | more than 7 years ago | (#19039249)

"Come on," he droned, "I've been ordered to defuse this bomb. Here I am, brain the size of a planet and they ask me to take you defuse this bomb. Call that job satisfaction? 'Cos I don't."

Although, under the circumstances, I think the scene involving God's Final Message to All Creation would be more appropriate.

...After a final pause, Marvin gathered his strength for the last stretch.

He read the "e", the "n", the "c" and at last the final "e", and staggered back into their arms. "I think," he murmured at last, from deep within his corroding rattling thorax, "I feel good about it."

The lights went out in his eyes for absolutely the very last time ever.

Luckily, there was a stall nearby where you could rent scooters from guys with green wings.

- Douglas Adams, So Long, And Thanks For All The Fish, Chapter 40

Robots and Pets (5, Insightful)

EvilGrin5000 (951851) | more than 7 years ago | (#19039265)

This article isn't talking about those annoying toy robots available at your nearest junk store for the low low price of $99.99; it describes robots that take on the impossible jobs of sniffing bombs, of tracking enemies and searching caves! They become part of the team:

FTA
-------
"Sometimes they get a little emotional over it," Bogosh says. "Like having a pet dog. It attacks the IEDs, comes back, and attacks again. It becomes part of the team, gets a name. They get upset when anything happens to one of the team. They identify with the little robot quickly. They count on it a lot in a mission."
-------

I'm not surprised that this article describes emotional attachments. They've become pets, and not just a pile of hardware. Most people love their pets and they cry when their pets die.

The Robot Rights question is in regard to ALL robots; the article is only describing a very small percentage of robots. Not only that, but these robot stories are set in military actions.

So to answer the question from the summary: Perhaps, but the article certainly doesn't relate to the wider audience!

Wouldn't YOU love your pet robot that sniffs IEDs and takes a few detonations in its face for you hence saving your life?

Saving your life vs cleaning the floor (1)

bcattwoo (737354) | more than 7 years ago | (#19039267)

People are more likely to sympathize with and feel grateful towards a machine that saves their life than one that does something like vacuuming the carpet or assembling their car. I wouldn't necessarily expect these anecdotes to generalize to the world at large.

Re:Saving your life vs cleaning the floor (2, Insightful)

sehlat (180760) | more than 7 years ago | (#19039607)

than to one that does something like vacuuming the carpet
I'm not so sure about that. We have a Roomba at home and named it "Pinball." When it got caught on a couple of obstacles in our home and had to be rescued, I found myself feeling sorry for it. People care about things that become part of their lives, particularly the animate ones, natural or artificial. For people or pets, we call it "empathy". The ability to feel such things is a sign of emotional health.

Re:Saving your life vs cleaning the floor (1)

SwordsmanLuke (1083699) | more than 7 years ago | (#19039651)

And history has demonstrated that humans are pretty adept at doing the same to other humans. If someone saves your life, you're likely to feel very emotionally attached to them. On the other hand, if your mechanic dies, you probably won't notice - other than that it seems to be taking longer than usual to get your car back.

Stand up and support your porcelain friends! (4, Funny)

RingDev (879105) | more than 7 years ago | (#19039269)

Friends of toilets everywhere are protesting today in a unified show of compassion, asking for the freeing of millions of household toilets. "We've crapped on our receptive friends long enough! Let's spare them any more of this inhuman suffering!" said one protester. Another activist recounted a story in which her former boyfriend urinated not only in the toilet, but on the rim as well.

-Rick

Oblig (1, Funny)

Anonymous Coward | more than 7 years ago | (#19039313)

The US Army soldiers, for one, welcome their new robotic overlords.

so as kids (1)

tadauphoenix (127728) | more than 7 years ago | (#19039329)

Who ever thought to consider the psychological consequences of putting talking smiling faces on every inanimate object, giving it a cute personality, and conditioning easily manipulable 1-6 y/o's to believe this is how life works? That stuff sticks, even later when that 3 y/o is now a functioning 20 y/o, believing the animatronically controlled mine-splattering robot is sad or depressed because it moves in a miserable way.

Save the robots. Send the kids out to go find those mines instead. They get into everything; guarantee that field will get cleared.

The nature of bonding (4, Interesting)

mcrbids (148650) | more than 7 years ago | (#19039415)

It's normal for people to bond with people/things that are necessary to their survival.

I've bonded very thoroughly with my laptop - its name is Turing. I jealously clutch it when I travel. Whenever I put it down, I'm very careful to ensure that there's no stress on any cables, plugs, etc. It contains years of professional information and wisdom - emails, passwords, reams and reams of source code, MP3s, pictures, etc.

Yes, I have backups that are performed nightly. Yes, I've had problems with the laptop, and every few years I replace it with a new one. That doesn't change the bonding - every time there's a problem it's upsetting to me.

Am I crazy? Perhaps. But there's good reason for the laptop to be so important to me - it is the single most important tool I use to support my wife and 6 children, who are the most important things in the world to me. My workload is intense, my software is ambitious, my family is large and close, and this laptop is my means of accomplishing my goals.

If I can get attached like this to something over my professional career, it wouldn't be out of norm for strong emotional reactions towards something preserving your very existence day after day.

This isn't surprising (1)

AxemRed (755470) | more than 7 years ago | (#19039485)

Just look at how attached people get to their cars, for example. I have friends that name their cars, talk to their cars, etc. I have also had friends talk about how they missed old cars and get worked up when talking about it.

Oblig Python (1)

Aqua_boy17 (962670) | more than 7 years ago | (#19039517)

Am I the only one who RTFS and thought "It's only a flesh wound"?

I don't feel sorry for my robot (1)

shawn443 (882648) | more than 7 years ago | (#19039525)

It is a cock sucking robot. It is a little whore. Sometimes it needs a good slap.

What about appliances? (1)

Churla (936633) | more than 7 years ago | (#19039563)

but..but.but... I LOVE lamp!

Re:What about appliances? (2, Funny)

McFortner (881162) | more than 7 years ago | (#19039691)

but..but.but... I LOVE lamp!

You better keep it Platonic or you will be in for a big shock....

Aww c'mon! (1)

n1hilist (997601) | more than 7 years ago | (#19039565)

Johnny 5 was a real sentient being, damnit!

Reminds me (2, Interesting)

resonte (900899) | more than 7 years ago | (#19039583)

When I was a small child, I used to think that the plants in my backyard had feelings. And when my mother started ripping out the weeds in the garden, I replanted them cause I thought they were being murdered. I think it was from watching a cartoon that had a talking tree in it.

IANAB, this is just a theory.

In evolution the only advantage of being 'nice' to another creature is when they are receptive and/or when they are in your immediate family. We have these instructions hard-coded in our brains. Unfortunately with evolution there is no foresight into how these instructions may affect other human behavior/qualities. As long as the faulty behavior has no evolutionary disadvantages it will remain in the genes as a by-product of the original instruction.

In the case of becoming attached to robots, as they were not present in our ancestral environment, our brains output a 'must reciprocate' command through the use of emotional attachment, which may be hard to override with logic. There is nothing in the brain that states 'reciprocation will not be required in producing future gain from this particular creature/object'. It assumes that most of the recipients have a similar brain structure.

Luke Skywalker, anyone? (3, Insightful)

Tatisimo (1061320) | more than 7 years ago | (#19039621)

Reminds me of the time when Luke Skywalker destroyed the Death Star: when he was asked if he wanted a new droid to replace the busted R2D2, he outright refused! We all grow to love our favorite stuff: computers, cups, cars, blankets, robots, etc. Are soldiers any less human than us? Heck, let them keep their robot buddies after the war as personal assistants; that might make people less scared of technology! If Luke Skywalker could, why can't they?

if you program it, they will come (1)

kemo_by_the_kilo (971543) | more than 7 years ago | (#19039623)

how about no cute and personable UI... if you don't program it, they won't come....
-Ac

Obligatory... (0)

ashitaka (27544) | more than 7 years ago | (#19039715)

"Louie isn't with us anymore."

Different situations, different attachments. (5, Insightful)

Irvu (248207) | more than 7 years ago | (#19039723)

Soldiers in the field are themselves constantly at risk of life and limb. They are also constantly under stress and tension. Such stresses and risks are what form the bond with their comrades as well as their equipment. Everything, everyone, has to work right or likely they all die. This is why sailors refer to their ship as she, and call her by name, why they get almost tearful when thinking of a favored ship and wear caps claiming them as a member of her crew. This is why Air Force officers feel an attachment to their planes and why Army officers care for their sidearms. This anthropomorphization is an essential facet of how they operate, not just a side effect. The application to a mine-clearing robot may be new but not so unprecedented.

This attachment shows up in other ways too. Kevin Mitnick is said to have once cried when informed that he had broken Bell Labs' latest computers, because he had spent so much time with them that he'd become attached.

Now contrast that with an office job where the computer is not your friend but your enemy: you need the reports on time, you need them now, why WHY! won't it work. Clearly the computer must be punished; it is an uppity evil servant that will not OBEY!

If you were to stop talking about "Robots Rights" and start talking about say "Ship's rights" then you might have a fair analogy. To men and women of the sea a ship, their ship is a living thing so of course it should be cared for and respected. To people who live on land and don't deal with ships, this is crazy, even subversive to the natural order. To people who have developed an intimate hatred of such things giving them rights will only encourage what they see as a dangerous tendency to get uppity.

On a serious note though, the one unaddressed question with "Robot Rights" is: which robots? If we are to take the minefield-clearing robot as a standard, what about those less intelligent? Does my Mindstorms deserve it? Does my laptop? Granted my laptop doesn't move, but it executes tasks the same as any other machine. At what point do we draw the line?

In America, and I suspect elsewhere, race-based laws fell down on the question of "what race?" Are you 100% black? 1/2? A quadroon (1/4) or an octoroon (1/8), as they used to say? How the hell do you measure that? Ditto for the racial purity laws of the Nazis. Crap about skull shape aside, there really is no easy or hard standard. Right now the law is dancing around this with the question of who is "adult" enough to stand trial and be executed, or "alive" enough to stay on life support. No easy answers exist and therein lies the fighting.

The same thing will occur with "Robot Rights": we will be forced to define what it means to be a robot, and that isn't so easy.

Treat them like you hope they treat us. (3, Interesting)

scoser (780371) | more than 7 years ago | (#19039735)

Maybe if we treat robots well now, Skynet will decide not to nuke us when it gains sentience.

Is there more sympathy for Robot Rights? (1)

m487396 (807861) | more than 7 years ago | (#19039785)

That or some very lonely soldiers!

How many name their roombas? (0)

Anonymous Coward | more than 7 years ago | (#19039835)

Seriously -- I bet most people who have Roomba vacuum robots give them names and start to think of them as having personalities. This shouldn't surprise anyone given our ability to form attachments to inanimate stuffed animals.