
US Military Moving Closer To Automated Killing

Soulskill posted about 3 years ago | from the paging-john-connor dept.

The Military

Doofus writes "A recent article in the Washington Post, A future for drones: Automated killing, describes the steady progress the military is making toward fully autonomous networks of targeting and killing machines. Does this (concern|scare|disgust) any of you? Quoting: 'After 20 minutes, one of the aircraft, carrying a computer that processed images from an onboard camera, zeroed in on the tarp and contacted the second plane, which flew nearby and used its own sensors to examine the colorful object. Then one of the aircraft signaled to an unmanned car on the ground so it could take a final, close-up look. Target confirmed. This successful exercise in autonomous robotics could presage the future of the American way of war: a day when drones hunt, identify and kill the enemy based on calculations made by software, not decisions made by humans. Imagine aerial "Terminators," minus beefcake and time travel.' The article goes on to discuss the dangers of surrendering to fully autonomous killing, concerns about the potential for 'atrocities,' and the nature of what we call 'common sense.'"
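The exercise described in the summary is, at bottom, a multi-stage confirmation chain: each platform examines the target independently, and engagement requires unanimity. A minimal sketch of that logic, with all names and thresholds hypothetical rather than taken from the article:

```python
# Hypothetical sketch of the multi-stage target-confirmation chain the
# article describes: first aircraft -> second aircraft -> ground vehicle.
# A target is "confirmed" only if every independent stage agrees.

def confirm_target(detections, threshold=0.9):
    """detections: one confidence score per platform, in order.
    Returns True only if every stage independently exceeds the threshold."""
    return all(score >= threshold for score in detections)

# First drone spots the tarp, second drone re-examines, ground car closes in.
print(confirm_target([0.95, 0.97, 0.99]))   # True: all three platforms agree
print(confirm_target([0.95, 0.40, 0.99]))   # False: second platform disagrees
```

The point of the chain is that no single sensor's mistake triggers an engagement; whether that is enough safeguard is exactly what the article questions.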


472 comments


Better computers than humans (2, Insightful)

Anonymous Coward | about 3 years ago | (#37464312)

Given the number of friendly-fire deaths in recent wars, it would be interesting to see if software has a better rate of IDing enemies than humans do.

Re:Better computers than humans (1)

Provocateur (133110) | about 3 years ago | (#37464398)

I've always wanted to cream the Blue Team in Paintball. From home. I wonder when this tech will be available for toy companies. Especially when the 2012 Geneva Convention on Laws of Armed Robots in Combat declares them unfit, thereby resulting in a blackmarket for jailbroken drones.

Re:Better computers than humans (0)

Jeremiah Cornelius (137) | about 3 years ago | (#37464518)

Sam Lowry: It's not the machine. There's a mismatch on the personnel code numbers... Tuttle should have had £31.06, debited against his account, not Buttle!

Kurtzman: Oh my God, a mistake!

Sam Lowry: Well at least it's not ours.

Kurtzman: Isn't it? Whose is it?

Sam Lowry: Information Retrieval.

Predator drones do routine domestic duty [stltoday.com]

well then (0)

Anonymous Coward | about 3 years ago | (#37464318)

I for one welcome our robot overlords

Landmines (5, Insightful)

Anonymous Coward | about 3 years ago | (#37464324)

Landmines do automated killing every day!

Re:Landmines (1)

Namarrgon (105036) | about 3 years ago | (#37464546)

Excellent point, and look what an indiscriminate job [handicapinternational.be] they do of it.

Re:Landmines (3, Informative)

rtfa-troll (1340807) | about 3 years ago | (#37464612)

Which is why civilised countries [wikipedia.org] have already outlawed them. No decent human could encourage the spread of things which, for every attacking soldier they kill, kill many civilians, animals and, in most cases, mine clearers.

N.B. the treaty still allows anti-tank mines and even remotely triggered Claymore mines, so it's still possible to do area denial against serious military forces. I will give the Koreans a small out here, in that their border was divided this way long before the treaty, and redesigning that would be a nightmare.

Re:Landmines (1)

Pseudonym (62607) | about 3 years ago | (#37464636)

Bingo. The US has spent years phasing out land mines, and if it wasn't for the Korean DMZ, it would be a signatory to the Ottawa Treaty. It would be a backwards step if they built new weapons where humans do not make the targeting decision.

Re:Landmines (0)

Anonymous Coward | about 3 years ago | (#37464706)

"Landmines do random killing every day!"

Fixed that

Automated job killing (1)

Manfre (631065) | about 3 years ago | (#37464330)

When these are combat ready, there will be many unemployed soldiers.

Re:Automated job killing (0, Insightful)

Anonymous Coward | about 3 years ago | (#37464378)

When these are combat ready, there will be many unemployed soldiers.

We're talking about killing human beings and you're worried about economics. Such an American thing to do. (See: our wars.) I wonder why the world hates us.

Re:Automated job killing (2)

Mitchell314 (1576581) | about 3 years ago | (#37464430)

Oh, and no love for the robots that risk their lives in place of the soldiers? Jerk.

Re:Automated job killing (0)

Anonymous Coward | about 3 years ago | (#37464396)

With G.I. bills and increased demand for machinists.

Re:Automated job killing (0)

Dr Max (1696200) | about 3 years ago | (#37464632)

That's the biggest trouble with wars for America: all the paychecks that have to get written. Although leaving all those thugs at home on welfare won't be much fun either.

Re:Automated job killing (3, Informative)

pluther (647209) | about 3 years ago | (#37464662)

That won't be an issue. The only prominent US politician who's serious about ending America's wars is even more serious about ending all forms of welfare.

Re:Automated job killing (1)

Dr Max (1696200) | about 3 years ago | (#37464768)

Fair point, although then you'll have desperate thugs. But I guess they won't be a problem; just send the kill drones after any troublemakers.

Re:Automated job killing (0)

Anonymous Coward | about 3 years ago | (#37464660)

Don't worry, the bureaucrats' jobs are still secure.

not autonomous (0)

phantomfive (622387) | about 3 years ago | (#37464332)

It says right in the summary that any kills must be approved by a human. That is not fully automated killing at all. And unless you are afraid of a Matrix scenario, then a future where war is limited to robots killing other robots, and not humans killing each other, is a GOOD THING.

Re:not autonomous (1)

harryjohnston (1118069) | about 3 years ago | (#37464416)

Where does it say that? The article is discussing systems that don't require human approval for a kill.

Re:not autonomous (1)

phantomfive (622387) | about 3 years ago | (#37464444)

Oh yeah, you're right lol. Mod me braindead. I read 'unmanned' and somehow converted that in my brain to 'manned.'

Re:not autonomous (0)

Anonymous Coward | about 3 years ago | (#37464434)

... and when a town is destroyed, it will be computer error and no one will be brought to justice. There is nothing about robots killing robots, that is pointless. It will be robots culling humans.

Re:not autonomous (0)

Anonymous Coward | about 3 years ago | (#37464436)

Humans are much cheaper (to make and maintain) than robots. Unless we turn into a society where robots are more valuable than people, human targets will be more important to defend than machines.

War is power. (1, Insightful)

xtal (49134) | about 3 years ago | (#37464460)

War will always be about killing people. That's what the military is for. Killing. This is not a bad thing. I want the best military in the world protecting my liberty.

All power comes from the barrel of a gun. Aimed at you - to make you comply. Willingly, or otherwise.

Read some history. The approval rule will be circumvented - it is only a matter of time. The reason why you need autonomous killing robots is that comms systems can always be jammed with relative ease. An autonomous system is not vulnerable to external jamming threats, or at least, is more easily hardened against them.

Interesting times.

Re:War is power. (5, Insightful)

phantomfive (622387) | about 3 years ago | (#37464536)

All power comes from the barrel of a gun. Aimed at you - to make you comply. Willingly, or otherwise.

All power comes from being able to make someone happy. Really, think about it. A gun is no guarantee that someone will comply. If they feel certain you will shoot, then it has almost no power at all. The power of a gun comes from the fact that you MIGHT make them happy by not killing them.

If your goal is to get people to do something, you'll do much better paying them than trying to threaten them. And if you can make them happy in other ways, you may be even more powerful than merely with money.

Obama didn't obtain the most powerful office in the world by threatening to kill people (King George tried that, and got a revolution). He got votes by giving people hope for change. How much change he delivered is a different thing (certainly he delivered some), but people were happy to believe that it might be true. So they voted for him.

The reality of power is different than what a lot of people think.

Re:War is power. (0)

Anonymous Coward | about 3 years ago | (#37464668)

Spot on parent.

Re:War is power. (0)

mosb1000 (710161) | about 3 years ago | (#37464760)

All governmental authority derives from the ability to do violence. If what you were saying were true, police officers wouldn't carry weapons.

You can win an election by making promises, but the authority you take on is guaranteed by violence. People voted for Obama because they believed he would do violence to others rather than to themselves.

Re:War is power. (2)

phantomfive (622387) | about 3 years ago | (#37464796)

Government derives its authority from the consent of the governed. You need to have at least a critical mass of citizens who willingly follow the despot, otherwise he will fall.

People voted for Obama because they believed he would do violence to others rather than themselves.

You either have a weird definition of violence, or a weird idea of your fellow citizens. I know nobody who voted for Obama because they thought he would do violence to someone.

Re:War is power. (0)

Anonymous Coward | about 3 years ago | (#37464766)

All power comes from being able to make someone happy.

Last I heard anthropologists defined power as the ability to channel behaviour through sanctions, whereas influence was the ability to channel behaviour in other ways. If so, influence comes from being able to make someone happy and power comes from being able to make them sad.

What liberty? (2)

dutchwhizzman (817898) | about 3 years ago | (#37464652)

I haven't seen a lot of wars about liberty lately. Most were about economics or territory; some were about religion. To my knowledge, the last time the USA was attacked on its own territory was Pearl Harbor, and the last time the US mainland was invaded was well over 100 years ago. In the end, only the weapons manufacturers get a good deal out of war; the people just get another sock puppet ruling their countries.

There are a lot of treaties that try to limit the number of nukes, land mines and other non-discriminatory weapons on the planet. Adding new weapons to the list to have treaties about isn't really productive if we ever want to stop innocent bystanders dying in war.

Re:War is power. (1)

avajcovec (717275) | about 3 years ago | (#37464692)

All power comes from the barrel of a gun.

Really? All power? If that were true there could be no martyrs. History has shown time and time again that it only takes one person looking beyond their own fear of death, holding some higher ideal, to set an example for countless others. Just one to stand and say that there is something more important, more powerful than the individual life of this body, something your guns can never threaten.

The overwhelming majority on this planet live in peace with their neighbors, trusting them inherently. It is only the result of a constant barrage of divisiveness that they are taught to fear their fellow man and allow these atrocities to be committed.

You have so thoroughly misunderstood both liberty and power. I recommend meditation.

Re:not autonomous (5, Interesting)

Tanktalus (794810) | about 3 years ago | (#37464496)

I read somewhere recently a quote that, IIRC, was from Churchill. It was something about avoiding war, but if you must fight, fight with severity, for that is the most humane. I think that applies here. Though it sounds incredibly cruel, if people are not dying in your war, there will be no incentive for either side to stop.

Of course, Gadhafi, Hussein, Stalin, and similar madmen are somewhat of a counter example in that they don't give up no matter how many of their side are killed. Yet Japan in WWII is an example of the ruthless severity (nuclear bombs) causing an immediate and complete cessation of any attempts to create war.

Even modern times with Gadhafi and Hussein, the invasion of Iraq was much more severe than the Libyan rebels, thus the shorter amount of time to cause the government to capitulate. (Getting the rest of the population to stop fighting, much harder... we'll see how Libya does without the outside intervention.)

Anyway, the point is that robot vs robot is war by proxy. Without the violence, the bloodshed, the impetus to end the war just won't be the same. They'll drag on for longer and longer, and resolution will be even less certain than it is today. I'm not sure that's necessarily such a good thing.

Re:not autonomous (1)

phantomfive (622387) | about 3 years ago | (#37464566)

Anyway, the point is that robot vs robot is war by proxy. Without the violence, the bloodshed, the impetus to end the war just won't be the same. They'll drag on for longer and longer, and resolution will be even less certain than it is today. I'm not sure that's necessarily such a good thing.

There will always be conflict. War is just one method of resolving conflict. Legal fights are another; negotiation is another. Robot wars are one potential future method. In my opinion, machines killing each other is vastly preferable to people killing each other, people who would be brothers in a different situation.

Re:not autonomous (1)

RsG (809189) | about 3 years ago | (#37464694)

You've got it the wrong way around.

The idea of winning a war by killing so many of the opposition that the rest surrender or retreat is viable some of the time, but horrific. And truth be told, it doesn't work nearly as well in real life as it does on paper; people are unpredictable creatures at the best of times, and there are plenty of cases of soldiers or entire armies fighting to the very last, horrific fate be damned, rather than surrender. In particular, populations and politicians may fall prey to the sunk cost fallacy: "We've already lost N soldiers fighting this war; we can't give up now, else they died for nothing." You can't expect to win a war if you assume your opponents are rational actors who prioritize self-preservation, because that isn't always going to be the case.

The right way to do it, and in fact the way that's had a better track record of making wars end, is to destroy the ability of the enemy to make war altogether. For a protracted conflict, you get more bang for your buck destroying logistic, communication and supply capability than you do killing enemy soldiers in a fair fight. Any modern national military is only as capable of making war as they are capable of supplying, commanding and reinforcing their armed forces. Obviously this doesn't work in a guerrilla engagement, where supply lines may be nonexistent, or against a foe who hides among civilians. Iraq is a good example of where this strategy does and doesn't work; the official conflict ended rapidly, with the army defeated in short order (and without massive casualties; many Iraqi soldiers never even saw action), but the same approach cannot be used to maintain an occupation.

For a hypothetical conflict between two nations armed with robots, this form of conflict is even more likely; infantry forces require less logistical support than drone forces. In order to win, you don't grind your enemy's robotic forces into dust in a fair fight and you don't try to terrorize their populace into surrendering; instead you destroy their communications so they can't send in the drones, you destroy their factories, airbases, munitions dumps and whatever else they need to build and maintain their robotic fleet and you prevent them from doing the same thing to you. You don't have much chance of occupying a country with a robotic army any more than you can occupy a country with tanks, aircraft or warships; occupation pretty much requires men on the ground. This does mean that a robot armed nation might win a conflict without casualties, but must be prepared to suffer losses if they plan on conquering rather than letting their foe surrender and retain their own government.

Ishvires (2)

Iskorptix (2452916) | about 3 years ago | (#37464334)

See, I told ya, John Connor's mother was right about our future

Re:Ishvires (0)

Anonymous Coward | about 3 years ago | (#37464684)

And no one would believe her.

Depends where it's used (0)

kurt555gs (309278) | about 3 years ago | (#37464340)

I'm sure they will only use this in countries populated by brown (non-white) people who speak funny languages. Therefore, the top brass of the US military shouldn't really care about "collateral casualties".

Re:Depends where it's used (0)

Anonymous Coward | about 3 years ago | (#37464576)

Wow, no irrational stereotyping or unjustified assumptions there at all. Nicely done.

As a side note, since you may not have noticed, our military has an unusually high percentage of people "with brown skin" both doing the killing and in positions of leadership.

Maybe, just maybe, to the military the enemy are the guys that would shoot at you, regardless of skin color?

Sarah Connor (1)

Anonymous Coward | about 3 years ago | (#37464342)

Come with me if you want to live!

Soon, the high value targets will not be people (1)

thomasmoreorless (2466272) | about 3 years ago | (#37464346)

People are becoming increasingly irrelevant. The machines can just fight each other, while we get fat and die.

Re:Soon, the high value targets will not be people (1)

sehlat (180760) | about 3 years ago | (#37464414)

People are becoming increasingly irrelevant. The machines can just fight each other, while we get fat and die.

So one day, we can tell the robots "You've come up in the world. Learned to kill your own kind." (From the movie "Screamers")

Re:Soon, the high value targets will not be people (0)

Anonymous Coward | about 3 years ago | (#37464722)

How will you get fat when a robot has your job?

This is great! (0)

Anonymous Coward | about 3 years ago | (#37464348)

I am so in favor of this.

Death to our victims!

Is anyone surprised? (0)

Anonymous Coward | about 3 years ago | (#37464354)

It definitely terrifies me, but then again it is far from unexpected. Moreover, I think it is a lot better to send out drones to do the mindless killing than to steal the souls of thousands of young men and women who are forced to do it and then spend the rest of their lives trying to cope.

Decisions Better Than Humans? (0)

Anonymous Coward | about 3 years ago | (#37464362)

Machines can kill logically and without ethical or moral problems. Besides, it is much easier to justify if a machine takes the blame. This type of warfare can make US genocide quicker and more painless, so in its own way, it will become more humane.

Have automated enemies too (2)

blue trane (110704) | about 3 years ago | (#37464374)

Move all violence to online simulations.

Re:Have automated enemies too (1)

Anonymous Coward | about 3 years ago | (#37464466)

There was a star trek episode about this, where the computer would report the number of casualties for each city, and that number of citizens would be executed.

not even competent, extremely experimental (4, Insightful)

phantomfive (622387) | about 3 years ago | (#37464400)

The 'automated recognition' in this case was a large orange tarp. The difficulty of creating an automated recognition algorithm for an orange object in a natural background is extremely low. Wake us up when this thing can recognize camouflaged tanks in a forest.
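For the skeptical, the triviality of the tarp task is easy to demonstrate: a saturated orange blob against natural greens and browns falls out of a simple per-channel color threshold. A toy sketch on a synthetic image; the thresholds are illustrative guesses, not the actual system's algorithm:

```python
import numpy as np

# Toy illustration of how easy "find the orange tarp" is: threshold on
# color channels. Natural backgrounds (green/brown) rarely contain pixels
# that are simultaneously high-red, mid-green, and low-blue.

def find_orange(img):
    """img: HxWx3 uint8 RGB array. Returns a boolean mask of 'orange' pixels."""
    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)
    return (r > 180) & (g > 60) & (g < 160) & (b < 80)

# Synthetic scene: a green field with a 20x20-pixel orange patch.
scene = np.zeros((100, 100, 3), np.uint8)
scene[...] = (40, 120, 30)              # grass
scene[40:60, 40:60] = (240, 120, 20)    # tarp
mask = find_orange(scene)
print(mask.sum())  # 400: exactly the 20x20 tarp, nothing else
```

Camouflaged tanks, by contrast, are designed precisely so that no such cheap per-pixel rule separates them from the background.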

Re:not even competent, extremely experimental (0)

Anonymous Coward | about 3 years ago | (#37464542)

Wake us up when this thing can recognize camouflaged tanks in a forest.

I think at that point it might be a wee bit late. Today it's an orange tarp... tomorrow it's a camouflaged tank in a forest... and the day after it's a guy wearing a red-and-white striped shirt in a crowd.

Re:not even competent, extremely experimental (3, Funny)

camperdave (969942) | about 3 years ago | (#37464604)

I think at that point it might be a wee bit late. Today it's an orange tarp... tomorrow it's a camouflaged tank in a forest... and the day after it's a guy wearing a red-and-white striped shirt in a crowd.

$ cat killbot.log
Scanning crowd...
Target "Waldo" located.
Servo batteries one, two, and three lock on... fire!
Target "Waldo" destroyed.

$

Re:not even competent, extremely experimental (4, Insightful)

ceoyoyo (59147) | about 3 years ago | (#37464572)

Camouflaged tanks in a forest shouldn't be too hard. Telling the difference between a soldier and a civilian - now that's a challenge.

Re:not even competent, extremely experimental (1)

phantomfive (622387) | about 3 years ago | (#37464586)

That's tough even for a human

Re:not even competent, extremely experimental (0)

Anonymous Coward | about 3 years ago | (#37464588)

Especially the soldiers that refuse to wear uniforms and those rednecks who insist on wearing army-style camo when there's not even a war on...

Re:not even competent, extremely experimental (1)

toQDuj (806112) | about 3 years ago | (#37464614)

...it would have a field day at a picnic party.

Everything old is new again (1)

hyades1 (1149581) | about 3 years ago | (#37464406)

Science fiction writer Cordwainer Smith called them "manshonyaggers" in a story published back in the 1950s. The word was supposed to be a far-future corruption of the German "Menschen" and "Jäger": "manhunter".

It looks like his future is going to get here a lot faster than he thought.

Re:Everything old is new again (1)

Nursie (632944) | about 3 years ago | (#37464646)

It probably won't be the Mark Elf and the Vom Achts, though; it'll be the MQ-35 Wrath-bringer, and it'll only respond to commands prefaced with "God Bless America".

Solution (2, Insightful)

Sasayaki (1096761) | about 3 years ago | (#37464420)

Why don't we, instead of perfecting our killing methods, simply stop initiating economy destroying pointless wars?

I'm excited about all the trickle-down technology that'll eventually become consumer grade fare, and I appreciate the advancement in various technology that war brings, but I would much prefer it if the US stopped economically destroying itself (while giving the Middle East a "Great Satan" to fight) and instead let them get back to killing each other over tiny differences in interpretation of fundamentalist Islam.

Not even Bob the Builder can fix the Middle East at the moment. Not when you have God handing out the real estate titles and commanding the thousands of various splinter cells to annihilate everything that's not exactly identical to themselves, as trillions of dollars of oil money pour into the region to feed and fund it all.

Re:Solution (0)

Anonymous Coward | about 3 years ago | (#37464452)

Why don't we, instead of perfecting our killing methods, simply stop initiating economy destroying pointless wars?

This way is easier and makes more money for the corporations.

Re:Solution (1)

RazorSharp (1418697) | about 3 years ago | (#37464732)

What's bad for one part of the economy may be good for another part. What's 'good' for the economy is a matter of debate. I know it's a tiredly overused example, but if you owned stock in Halliburton in 2000 and hung onto it I'm sure you'd think that these pointless wars are pretty good for the economy.

Overall, I agree with your comments, but I don't think the pointless wars were a major drag on our economy. If anything, they probably helped. Lowering taxes during wartime - now that's a classic economic no-no which deserves more blame for this economic mess than the wars themselves. You don't do it b/c 1) wars cost money 2) lowering taxes just increases the rate of inflation and 3) it's good practice to lower taxes after the troops come home and get settled down (economic drag - increased supply of workers so the laborer's value is reduced). It's a historic pattern with a major exception being the 50s (the Great Depression and post-Vietnam are great examples of this).

At least, that's how I understand it as a layman.

Be afraid... (0)

Anonymous Coward | about 3 years ago | (#37464424)

Last night, during a cunning conversation, I made Cleverbot reveal that it is Skynet, and now this hits the news? Coincidence? I think not.

Also, their method of murder shall be heat rays, according to the not-so-clever bot.

I found John Connor (1)

zbobet2012 (1025836) | about 3 years ago | (#37464426)

He is one of these guys... [yellowpages.com]

Re:I found John Connor (0)

Anonymous Coward | about 3 years ago | (#37464510)

Anyone else find it disturbing that the GET variables used in that URL are all some variation of "fap_terms"?

Disgust, absolutely (1)

Blind RMS Groupie (218389) | about 3 years ago | (#37464446)

The purpose of the American military is to provide jobs to the lower classes and keep the money circulating. Automating target identification defeats the purpose, however it does push money up towards the geniuses that come up with this stuff.

Great idea for a movie. (2)

mosb1000 (710161) | about 3 years ago | (#37464448)

Someone should make a movie about this. . .

Re:Great idea for a movie. (0)

Anonymous Coward | about 3 years ago | (#37464704)

They did:

http://www.youtube.com/watch?v=mrXfh4hENKs

Even better (0)

Anonymous Coward | about 3 years ago | (#37464772)

"Mark Elf" by Cordwainer Smith [webscription.net]

If ever there was a story deserving... (1)

Anubis IV (1279820) | about 3 years ago | (#37464456)

...of a whatcouldgowrong tag, this would be it.

Can we get a little objectivity, please? (1)

Anonymous Coward | about 3 years ago | (#37464470)

...Or at least some honesty? I'll admit that I'm not too fond of the mechanization of warfare, or of warfare at all for that matter, but one look at the article and I can tell this summary is a gross oversimplification of the actual testing and procedure, written to support the poster's personal views. I was under the impression that this was a news site, not a blog for users to upload stories that support their personal views.

We are overpopulating the planet (0)

NicknamesAreStupid (1040118) | about 3 years ago | (#37464472)

I guess this will take care of it. Damn, we're efficient.

As long as the algorithm can't be a scapegoat (1)

nzac (1822298) | about 3 years ago | (#37464478)

As long as the soldier who pushes the button to activate the drones is as responsible as the one who pushes the button to drop a dumb bomb, I don't really see the issue.

As long as someone mucking up and causing friendly fire or collateral damage is equally liable, I think this is just an arms race that has the potential to avoid casualties.

Where this becomes dangerous, I think, is when you can start shoving blame around, so the soldier blames the programmer and vice versa. If the soldier can blame someone else when a drone behaves unexpectedly, and neither feel nor be held responsible, there is potential for misuse.

Don't use the technology if it has the potential to go wrong.

OMFG, mistakes will be made! (1)

macraig (621737) | about 3 years ago | (#37464480)

Yep, autonomous machines are certain to make mistakes and kill people who aren't the target, who are innocent, don't really deserve to die, etc.

So what?

Humans make those mistakes now, in abundance: friendly fire, massacres, genocide, innocent bystanders... you name it. What difference does it make whether it's a human or some artificial pseudo-intelligence that makes the mistakes?

I'll tell you what the difference is: one less human on the battlefield, and thus at the least one less human that can die from a mistake.

Re:OMFG, mistakes will be made! (1)

avajcovec (717275) | about 3 years ago | (#37464724)

What difference does it make whether it's a human or some artificial pseudo-intelligence that makes the mistakes?

Remorse?

Re:OMFG, mistakes will be made! (1)

macraig (621737) | about 3 years ago | (#37464786)

So... program the machines to "feel remorse". That one should be easy, since remorse is (a) recognizing a possible mistake, (b) analyzing the causal decisions and events, and (c) altering the decision process to minimize repeating the same pattern. Sounds pretty straightforward to me, unlike some other emotions.
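For what it's worth, those three steps map onto an ordinary error-driven update loop. A toy sketch, with "remorse" modeled as nothing more than down-weighting the decision rule that caused a confirmed mistake; every name and number here is hypothetical:

```python
# Toy sketch of "machine remorse" as the three steps above: (a) a mistake
# is confirmed externally, (b) it is attributed to the decision rule that
# caused it, (c) that rule is down-weighted so the pattern recurs less.

class RemorsefulClassifier:
    def __init__(self):
        # confidence weight per decision rule; starts neutral
        self.weights = {"fire_on_match": 1.0}

    def decide(self, rule, score):
        """Engage only if the weighted confidence clears the bar."""
        return score * self.weights[rule] > 0.5

    def feel_remorse(self, rule, penalty=0.5):
        """Steps (a)-(c) collapsed: attribute the confirmed mistake to
        `rule` and down-weight it."""
        self.weights[rule] *= penalty

bot = RemorsefulClassifier()
print(bot.decide("fire_on_match", 0.8))   # True: 0.8 * 1.0 clears the bar
bot.feel_remorse("fire_on_match")          # a mistake is reported
print(bot.decide("fire_on_match", 0.8))   # False: 0.8 * 0.5 = 0.4 does not
```

Whether a weight update deserves the word "remorse" is, of course, the whole argument.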

Humans!!?? (0)

Anonymous Coward | about 3 years ago | (#37464512)

Can the human species get any worse? We are truly the most disgusting creature on Earth!!!

disgust (0)

Anonymous Coward | about 3 years ago | (#37464524)

disgust

Toys. (1)

Reservoir Penguin (611789) | about 3 years ago | (#37464526)

Looks good on paper, but for now they are just expensive toys which may be more useful as recruiting tools (look, war is just like a video game, come play with us!). Barely useful in an asymmetric conflict like the one in Afghanistan, and useless in a war with a country that has a modern air force and an integrated air defense system; they'd be shot out of the sky immediately.

I guess this explains... (1)

dcavanaugh (248349) | about 3 years ago | (#37464532)

the purpose of attackwatch.com [barackobama.com]

But they forgot to leave a way to upload pictures of the targets to be terminated. Oops.

Gone Fishing (1)

zeoslap (190553) | about 3 years ago | (#37464554)

Between globalization and robots it appears the golden age of leisure* is closer than ever.

* Where golden age of leisure = mass unemployment and food riots

Not Terminator (1)

0x15 (852429) | about 3 years ago | (#37464560)

This will be more like the old Star Trek episode where war is so impersonal that no one bothers to resolve it (until Capt. Kirk destroys the war computers). However, I doubt that automated killing machines will ever exceed human capabilities for atrocities and lack of common sense. War in general implies both anyway.

first rule of robotics (0)

Anonymous Coward | about 3 years ago | (#37464568)

"A robot shall not harm a human being." Robots should only be allowed to kill robots. There should always be a human being in the loop where human beings are concerned, if for no other reason than that there must be someone to take final responsibility.
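In software terms, "a human in the loop" is just a mandatory approval gate between identification and action, with each approval logged so final responsibility lands on a named person. A minimal sketch; the function and field names are hypothetical, not any real system's API:

```python
# Minimal sketch of a human-in-the-loop gate: the autonomous pipeline may
# propose an engagement, but nothing fires unless a named human approves,
# and every decision is recorded so responsibility stays with a person.

def engage(target, approver=None, log=None):
    """Returns True only if a human approver signs off; otherwise the
    engagement is refused and the refusal is recorded."""
    if log is None:
        log = []
    if approver is None:
        log.append(f"REFUSED {target}: no human approval")
        return False
    log.append(f"ENGAGED {target}: approved by {approver}")
    return True

audit = []
print(engage("robot-tank-7", approver="Lt. Example", log=audit))  # True
print(engage("unknown-vehicle", log=audit))                        # False
print(audit[-1])  # REFUSED unknown-vehicle: no human approval
```

The gate is trivial to build; the danger the thread is debating is that it is equally trivial to remove.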

So why not go to the next step? (1)

KieranC (1807174) | about 3 years ago | (#37464582)

If a machine can identify and kill another machine, can't we make war a virtual-reality scenario where the software fights it out and no one really gets injured?

Re:So why not go to the next step? (1)

zeoslap (190553) | about 3 years ago | (#37464656)

Because at the end of the day you still have to break the will of your opponent and make them do something they wouldn't ordinarily do. Chances are "you lost at Rock 'Em Sock 'Em, so you now need to hand over your port" wouldn't be overly persuasive.

Not Gonna Happen (1)

Caraig (186934) | about 3 years ago | (#37464592)

There is no way that the military is going to permit autonomous combatant units. At least, not without having a stake put through its brain.

For starters, the PR would be through the floor if even one of these things killed a civilian (though I guess with how callous the US has been towards civilian collateral casualties for the past ten years, that might not be a big deal.)

The other main reason is that there's no way a manly man is ever going to give up on the idea of manly soldiers charging manfully into battle. Basically, it'll take a total discrediting of the entire War College and Army general staff before ACUs see any sort of serious use on the battlefield, in much the same way that Gates disenfranchised the 'fighter mafia' from the USAF a few years ago. The difference is that the 'combat arm mafia' (not that there actually is one) is a hell of a lot more entrenched. The idea of big burly virile men shooting the hell out of some amorphous Enemy is too much a part of the military self-image.

Then again, the fighter jocks had a pretty strong self-image, too, and they've lost a lot of ground in the Air Force to the transport "pukes" (Who's a puke now, Roger Ramjet?) and the drone operators (who are mostly CIA anyway), so who knows?

The ironic thing is that, ideally, you get drones and ACUs on both sides, let them beat the snot... er, silicon out of each other, and call it a day. Pity that won't happen anytime soon. Plus, random freedom fighters^H^Hinsurgents probably won't be able to afford such things, so it'll still come down to bloody gobbets strewn across some hellhole.

Re:Not Gonna Happen (1)

Shihar (153932) | about 3 years ago | (#37464818)

The US doesn't need autonomous killing machines. Sure, the US will develop them, but so long as the Americans are busy busting on sheep herders armed with AK47s, they won't use them. You might get to the point where drones are doing everything but pull the trigger, but having a human in the loop approving all death and destruction is cheap and easy. You don't gain anything by having fully autonomous killing machines when you are fighting peasants with shitty guns.

The US will develop the technology though. It does make sense to have this technology in certain cases. The most obvious case would be a theoretical war with China. Drones might work all well and good for killing goat herders, but against another superpower that has the capacity to jam, you might need autonomous killing machines. I could easily see the US developing a drone that, once given the outline of an obvious target type (like a tank, transport ship, or AAA installation), can be fired in the general direction of a concentration of military units and carry out its mission, even if it gets jammed and the target moves.

Of course, in the only instances where the US would actually need fully autonomous drones like that, a few civilian casualties are the absolute least of your concerns. If the US is fighting an enemy that can put up an effective ECM defense for more than the 3 hours it normally takes the US to level such defenses, the US is fighting someone who has the capacity to turn the US (and a goodly portion of the rest of the world) into a radioactive waste pit.

Remember. (0)

Anonymous Coward | about 3 years ago | (#37464620)

If they can use it on them, they can use it on us.

"...out of the hands of humans" is a misnomer (1)

Loopy (41728) | about 3 years ago | (#37464640)

Examples:

IF a target is a unique type of vehicle that can be easily identified by target recognition software that _already_ does this for normal pilots AND said target is within a set of coordinates that are known to only contain hostile vehicles of that type, THEN kill it, otherwise seek human double-check and weapons release confirmation.

If a target is in an area known to not contain friendlies and is detected firing a missile or weapon (like an AA gun for example), then kill it.

If there are friendlies or non-combatants anywhere NEAR being "danger close," then require human double-check and weapons release confirmation.

And a zillion other parameters that all must be satisfied before servicing the "target."
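That rule cascade could be sketched as a simple decision function. This is purely a toy illustration of the logic described above; the function, zone labels, and target fields are all hypothetical, not any actual weapons-release system:

```python
class Target:
    """Hypothetical sensor summary for one contact."""
    def __init__(self, type_recognized=False, firing_detected=False):
        self.type_recognized = type_recognized    # matched a known hostile vehicle type
        self.firing_detected = firing_detected    # observed firing a weapon

def authorize_release(target, zone, friendlies_near_danger_close):
    """Return 'FIRE' only when every autonomous-release condition holds;
    otherwise defer to a human for weapons-release confirmation."""
    # Friendlies or non-combatants anywhere near "danger close": human decides.
    if friendlies_near_danger_close:
        return "HUMAN_CONFIRM"
    # Recognized hostile vehicle type inside a known hostiles-only box.
    if zone == "hostile_only" and target.type_recognized:
        return "FIRE"
    # Detected firing (e.g. an AA gun) in an area known to contain no friendlies.
    if zone == "no_friendlies" and target.firing_detected:
        return "FIRE"
    # Default for everything the rules don't cover: human double-check.
    return "HUMAN_CONFIRM"
```

Note the default branch: anything the parameters don't positively clear falls back to a human, which is the whole point of the cascade.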

You people act like they're going to just send 'em out to kill anything that moves. I'd argue that these things, with real-time assistance/confirmation/guidance from educated people who know what they're looking at in the sensors and can see the battlefield "data" from a god's-eye view, will actually REDUCE the number of friendly-fire or collateral-damage incidents versus all-human situations. Computers don't get tired. Humans in an air-conditioned bunker drinking a cup of coffee aren't under stress to make a decision before they get shot out of the sky.

Seriously. Think before you post these invalid-or-negatively-slanted-editorials posing as food for thought.

National Security Threat? (1)

Anonymous Coward | about 3 years ago | (#37464644)

Is the US military's increasing dependence on high tech weapons systems a threat to national security?

Who manufactures the components for these things? Where do they get the raw materials like rare earth elements? Are there any chip fabs left in the US?

Cliche but... (4, Funny)

guspasho (941623) | about 3 years ago | (#37464670)

If ever there was an appropriate time for the "whatcouldpossiblygowrong" tag, this is it.

I'm all for it... (0)

Anonymous Coward | about 3 years ago | (#37464680)

... if it means Western troops no longer have to be sent to Islamic hellholes like Afghanistan, Iraq, Pakistan, Somalia, Sowdi Barbaria, et al. Use long range missiles for major targets (like during BJ's Wag the dog war on Serbia) and if more precision is needed, send in the drones w/ all the robots, laser guided ammo and so on and just wipe out the targets. Don't worry about collateral damage, because as both Iraq and Afghanistan proved, gratitude of Mohammedans is only short lived. Like if they need to bomb nuke sites in Iran, do it, without factoring in collateral damage. The Iranians can choose to either rebel against their Ayatollahs, or chant 'Death to America' in the streets of Isfahan or Natanz.

Bottom line - don't spill infidel blood in Mohammedan countries. Do whatever is needed so that infidel casualties are zip, while targets in those countries are destroyed.

Re:I'm all for it... (0)

Anonymous Coward | about 3 years ago | (#37464734)

Western troops never had to be sent into Islamic hellholes in the first place.

Three Laws of Robotics? (0)

Anonymous Coward | about 3 years ago | (#37464700)

1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2) A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

I wonder if this'll end up being "fixed" to reflect recent years. I always worry about what would happen if these had a bug in their programming and just went on a killing spree. The US seems to be disregarding these laws and not paying attention to why they exist.

Maybe because this is...FICTION?!? (0)

Anonymous Coward | about 3 years ago | (#37464758)

Um, these "laws" were meant to further a story.

WELL THERE ARE 2 BILLION CHINESE !! (0)

Anonymous Coward | about 3 years ago | (#37464716)

And who's going to pay all that overtime ?? I say, let them at it !! If comes the time Arnold shows his now-flabby ass to the world again, so be it !! We did what we had to at the time !! God bless the drones !!

This does concern me (1)

WindBourne (631190) | about 3 years ago | (#37464726)

How soon can we send it into FATA in Pakistan? Time to just target the high-level Taliban/AQ. I am fine with using automation to do this. In fact, I think that we should send these into Mexico as well once it is working decently. Lots of cartels there.

A step away from drones vs. drones (1)

atticus9 (1801640) | about 3 years ago | (#37464780)

When both sides start using drones, we may see a future of bloodless wars that involve only machines. I can't imagine a country, after losing a large-scale drone battle, sending out its citizens to try to tip the scales.

"Doofus" (1)

afabbro (33948) | about 3 years ago | (#37464792)

Does this (concern|scare|disgust) any of you?

Why am I limited to these choices? Groupthink much?

Likely applications of automated killing (1)

Animats (122034) | about 3 years ago | (#37464794)

We're quite likely to see systems that kill anybody who is shooting at friendly troops. The U.S. Army has had artillery radar systems [fas.org] for years which detect incoming shells and accurately return fire. There have been attempts to scale that down to man-portable size, but so far the systems have been too heavy.

Sooner or later, probably sooner, someone will put something like that on a combat robot.

Autonomous kills means no one is responsible! (1)

brillow (917507) | about 3 years ago | (#37464798)

The most dangerous thing about this is that when a glitch or bug or malware causes a plane to blow up a wedding, no one is responsible. No one ordered it, and no one can be punished for it.
