Not Quite a T-1000, But On the Right Track

Unknown Lamer posted about a year and a half ago | from the dancing-robot-death-machines dept.

Robotics 159

New submitter misanthropic.mofo writes with a look at the emerging field of robotic warfare, adding: "Leaping from drones to recon 'turtlebots', humanity is making its way toward robo-combat. I for one think it would make it interesting to pit legions of robot warriors against each other and let people purchase time on them. Of course there are people that are for and against such technology. Development of ethical robotic systems seems like it would take some of the fun out of things; there's also the question of who gets to decide on the ethics."

159 comments

T-100, huh? (1)

Anonymous Coward | about a year and a half ago | (#43075351)

Either I'm not the geek I thought I was, or you, sir, are not. What is a T-100? Did you mean T-1000?

One of us is turning in their geek card tonight.

Re:T-100, huh? (1)

Anonymous Coward | about a year and a half ago | (#43075385)

No, you fool. He was talking about a line of semi-autonomous, lethal pickup trucks. [wikipedia.org] Who's turning in what now?

Re:T-100, huh? (1)

MatrixCubed (583402) | about a year and a half ago | (#43075445)

T-101

Re:T-100, huh? (1)

c4tp (526292) | about a year and a half ago | (#43075601)

I thought the robots were the enemy in that reference?!? The last thing I want to hear about are turtlebots with frickin' laser beams attached to their heads.

Hell, the Big Dog [bbc.co.uk] is scary enough for me.

Re:T-100, huh? (2)

davester666 (731373) | about a year and a half ago | (#43076129)

Hell no. I know which is the winning side. I'm teaching the robots everything I can about other humans.

Re:T-100, huh? (0)

Anonymous Coward | about a year and a half ago | (#43075775)

ITT nerds obsessing over a typo.

Re:T-100, huh? (1)

Anonymous Coward | about a year and a half ago | (#43075821)

Interesting. The Slashdot RSS feed has both versions of the headline... corrected and incorrect.

Robot wars (1)

Sussurros (2457406) | about a year and a half ago | (#43075353)

So perhaps the next cold war will be fought as bunches of skirmishes between the US and China in Third World countries, using flying, land, and water robots to protect small numbers of imported humans controlling large numbers of worker robots. Rather like when ants go to war against each other: the first one to take over the other's nest and larvae becomes the winner while the losers scatter.

Re:Robot wars (2)

icebike (68054) | about a year and a half ago | (#43075389)

And perhaps that will last just as long as it takes for one country to face the defeat of its robots, whereupon a switch will be flipped and humans become a legitimate target for autonomous machines.

You abhor drone strikes now? Wait till there is no human in the loop.

Re:Robot wars (2)

DigiShaman (671371) | about a year and a half ago | (#43075795)

It may start off as robot wars, but it will quickly end in a thermonuclear exchange. They are the trump card. They will always be the trump card.

Re:Robot wars (1)

NotQuiteReal (608241) | about a year and a half ago | (#43076227)

I'll bet 100 on cockroaches! What's the prize?

Re:Robot wars (1)

Sussurros (2457406) | about a year and a half ago | (#43076373)

The prize is top spot in the food chain of the future. I'm betting on mice to take it. They eat anything, they're small and clever, they won't be outside when the bombs start blowing, and their favourite food is cockroaches, so they'll always have something to eat. Perhaps we'll eventually have one-tonne mouse predators hunting ten-tonne mouse herbivores, and even Rodent sapiens making mouse warrior robots to fight their mouse wars for them against the Rodent erectus scum.

Re:Robot wars (1)

TheLink (130905) | about a year and a half ago | (#43076495)

Reminds me of this: Later Than You Think [google.com]
Click on quick view on the PDF :)

Re:Robot wars (1)

Ghaoth (1196241) | about a year and a half ago | (#43075875)

Warning, warning Will Robinson. Activating Skynet now.............

Re:Robot wars (2)

Seumas (6865) | about a year and a half ago | (#43076975)

All the "robots" and machines in the world doing battle won't ever change the fact that only slaughtering young men and women (sent their, usually, by men wealthy men closer to their death than their birth) really has an impact on societies and the need to push for or withdraw from war. Frankly, not even much demand over humans these days, either as witnessed by the last twelve years.

Pigeons? (1)

Scared Rabbit (1526125) | about a year and a half ago | (#43075359)

Am I the only one who misread as "...it interesting to pit pigeons..."? I was just thinking jeez, can't we just leave birds alone? First sparrows, now pigeons?

Re:Pigeons? (2)

fyngyrz (762201) | about a year and a half ago | (#43075505)

it interesting to pit pigeons

I may be winging it, but I think you're just squabbling about the title.

Tard. (0)

Anonymous Coward | about a year and a half ago | (#43076119)

If only the "Am I the only idiot who misread 'blah' as 'bhal'??" posts would cease! Learn to read. You are lazy or dyslexic or both.

Re:Tard. (0)

Anonymous Coward | about a year and a half ago | (#43076481)

Am I the only person who thinks we should take the people developing these weapons, sterilize them, and put them in labor camps where they can't do any harm?

Sigh (5, Insightful)

Kell Bengal (711123) | about a year and a half ago | (#43075373)

Hello - robotics researcher here (specialising in UAVs). I wonder when these breathless articles about battlefield robotics will end. There is nothing new about battlefield robots - we've had tomahawk missiles since the early 80s. It's just that these days we think about them as robots rather than as cruise missiles. Drone strikes? What about the missile strikes from the Gulf War? They were the champions of good and (along with stealth technology) the gold hammer of the Forces of Good.

The only thing that has changed is more penetration of robots into our militaries and more awareness of some of the ethical considerations of automated weapons. Don't forget - the machine gun and landmine have killed far more people than drones likely ever will. They kill mindlessly so long as the trigger is pulled or they are stepped on. And yet, their ethical considerations were long debated. It's just that "omg a robot!" is headline magic.

(To wit - the author of this article must not know that much about robotics if they're claiming "The turtlebot could reconnoitre a battle site". No it can't - it's a glorified vacuum cleaner. I just kicked the one in my lab. It can barely get over a bump in the carpet.)

Let's focus on the real ethics of robotic warfare: how our leaders choose to use the tools we have made.

Re:Sigh (0)

Anonymous Coward | about a year and a half ago | (#43075405)

Don't forget - the machine gun and landmine have killed far more people than drones likely ever will.

I should have been a 'robot researcher' because they seem to know what happens in the future. With this future knowledge I woulda made a killing in the stock market.

Re:Sigh (4, Insightful)

Anonymous Coward | about a year and a half ago | (#43075411)

The only thing that has changed is more penetration of robots into our militaries and more awareness of some of the ethical considerations of automated weapons. Don't forget - the machine gun and landmine have killed far more people than drones likely ever will. They kill mindlessly so long as the trigger is pulled or they are stepped on. And yet, their ethical considerations were long debated. It's just that "omg a robot!" is headline magic.

Both machine guns and landmines are pretty easy to avoid: Go where they are not.
The game changes when the killing device can move itself around and decides (by itself) if it wants to kill you.

Re:Sigh (4, Insightful)

hairyfish (1653411) | about a year and a half ago | (#43075631)

Both machine guns and landmines are pretty easy to avoid: Go where they are not.

Like a movie theatre [wikipedia.org] for instance? a School [wikipedia.org] maybe? What about summer camp? [wikipedia.org] Or the humble old supermarket? [wikipedia.org]

Re:Sigh (2)

nedlohs (1335013) | about a year and a half ago | (#43075923)

There were no machine guns or landmines used in any of those, so yes.

Re:Sigh (1)

hairyfish (1653411) | about a year and a half ago | (#43075989)

No Terminator robots either, yet people still got murdered. Weird, huh?

Re:Sigh (1)

cold fjord (826450) | about a year and a half ago | (#43076591)

Not so much weird as off-topic.

Re:Sigh (1)

camperdave (969942) | about a year and a half ago | (#43076297)

Most people consider semi-automatic rifles to be machine guns.

Re:Sigh (1)

Anonymous Coward | about a year and a half ago | (#43076313)

Most people are ignorant.

Re:Sigh (2)

cold fjord (826450) | about a year and a half ago | (#43076595)

And those people are wrong.

Re:Sigh (2)

Phrogman (80473) | about a year and a half ago | (#43076829)

The difference between being killed by a semi-automatic rifle and being killed by a machine gun (sub or otherwise) is lost on me. The point was that previous technologies have most likely killed more people than the newer technologies will (particularly as the newer technologies will most likely incorporate some of the older technologies), not to argue whether the person got their terms exactly right.
And in the popular mind I would agree, the distinction between semi-automatics and machine guns is generally lost. The average person probably thinks the categories are: pistol, shotgun, rifle, machine gun, and that's pretty much it.

Re:Sigh (5, Insightful)

rohan972 (880586) | about a year and a half ago | (#43077311)

The difference between being killed by a semi-automatic rifle and being killed by a machine gun (sub or otherwise) is lost on me.

If someone is specifically talking about the risk of being killed by one or the other it becomes relevant, otherwise not so much.

The average person probably thinks the categories are: pistol, shotgun, rifle, machine gun, and that's pretty much it.

I aspire to more intelligent discussion than the average person, I suppose. I don't see how this is possible unless words are used correctly.

When people with a political agenda of banning guns use incorrect terminology that confuses semi-auto with full-auto weapons, it seems like they are deliberately obfuscating the issue to exploit the average person's ignorance. That requires correction, unless you're in favor of deceiving people to sway their political opinion. I know that's a popular tactic, to the point of being near universal, but I always live in hope of conversing with people who prioritize truth over their own opinion.

Re:Sigh (0)

Anonymous Coward | about a year and a half ago | (#43076707)

WHHOOOOSSSHHHHHHH

Re:Sigh (2)

cold fjord (826450) | about a year and a half ago | (#43076583)

Both machine guns and landmines are pretty easy to avoid: Go where they are not.

Like a movie theatre [wikipedia.org] for instance? a School [wikipedia.org] maybe? What about summer camp? [wikipedia.org] Or the humble old supermarket? [wikipedia.org]

Yes, as explained in the second line of the quote - you know, the one you omitted.

Both machine guns and landmines are pretty easy to avoid: Go where they are not.
The game changes when the killing device can move itself around and decides (by itself) if it wants to kill you.

I suppose I should commend you, at least in a left-handed fashion. You did manage to turn what is essentially an off-topic or redundant remark into a +5 Insightful by showing instances of exactly what the parent to your post directly stated and which you glossed over. I guess it must be time we rehash the whole violence / gun violence / "assault weapon" topic in this discussion on robotics / drones, since I'm not sure it has otherwise come up today on Slashdot, has it?

Re:Sigh (4, Insightful)

Jeremi (14640) | about a year and a half ago | (#43075819)

Both machine guns and landmines are pretty easy to avoid: Go where they are not.

Landmines have an annoying habit of being buried where you can't see them. This makes it difficult to ensure that you are going where they are not.

Re:Sigh (4, Insightful)

camperdave (969942) | about a year and a half ago | (#43076237)

Landmines have an annoying habit of being buried where you can't see them.

Plus they have the nasty habit of remaining active long after the conflict has ceased.

and in the rainy season they can move (1)

fantomas (94850) | about a year and a half ago | (#43077033)

And in the rainy season, if they are on soft ground, they can get washed downhill to another place - even if their location was recorded in the first place (which was not always done, by locals or superpowers).

A friend in Cambodia says this is a real problem in hilly areas: dirt roads are cleared, and then after heavy rains you have to assume the road to the next town might be live with UXO again and has to be checked before you can drive out.

Stuff that was dropped or planted in 1975 is still killing people.

Re:Sigh (2, Interesting)

Anonymous Coward | about a year and a half ago | (#43075487)

I think the sticking point that you might be missing is that we're reaching a point where it's conceivable, in the mind of the public, that robot autonomy is becoming more mainstream. There are certain ethical questions that go along with a program going from targeting something specified to making decisions about potential targets. People see ever more advanced drones performing all sorts of little mini-miracles, tossing sticks around and building small structures, which leads people to see a pattern of "higher function," for lack of a better wording, and to draw lines into the future: if robots can do THAT now, what will they be able to do down the road?

I think the future of robotics has very serious ethical concerns that should probably be addressed before they NEED to be addressed.

Re:Sigh (0)

Anonymous Coward | about a year and a half ago | (#43075539)

It's not just about pure technology. It's also about how it's used, and also about the dropping price and barriers to entry. When home enthusiasts can literally drop bombs on your head now, with usable optics and payloads, then of course we would do well to care more about this stuff (i.e. more 'alarmist' or 'sensationalist' articles).
We were debating machine guns back then... but now we're debating machine guns mounted on top of enthusiasts' backyard flying UAVs. And we have the internet now, so basically anyone can look up the details and order one made, or look it up and order the parts to make one themselves (or hack one to make a 'stronger' and more 'useful' version of an RC plane, etc).

Of course, compared to the most expensive manned fighter jets with heaps of automated technology built into them, these UAVs aren't much different in terms of destructive capabilities... but that's not all that we're discussing and/or worried about.

The irony of military robotics (4, Insightful)

Paul Fernhout (109597) | about a year and a half ago | (#43075643)

http://www.pdfernhout.net/recognizing-irony-is-a-key-to-transcending-militarism.html [pdfernhout.net]
"Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead? ... There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all."

There are only so many hours in the day. If we put those hours into finding new ways to kill other people and win conflicts, we will not be putting those hours into finding new ways to heal people and resolve conflicts. Langdon Winner talks about this topic in his writings when he explores the notion of whether artifacts have politics.
http://en.wikipedia.org/wiki/Langdon_Winner [wikipedia.org]

Albert Einstein wrote, after the first use of atomic weapons, that everything had changed but our way of thinking. You make some good points about us long having cruise missiles, but on "forces of good", here is something written decades ago by then retired Marine Major General Smedley Butler:
http://www.warisaracket.com/ [warisaracket.com]
"WAR is a racket. It always has been. It is possibly the oldest, easily the most profitable, surely the most vicious. It is the only one international in scope. It is the only one in which the profits are reckoned in dollars and the losses in lives. A racket is best described, I believe, as something that is not what it seems to the majority of the people. Only a small "inside" group knows what it is about. It is conducted for the benefit of the very few, at the expense of the very many. Out of war a few people make huge fortunes. ..."

Just because it was "hot" before, with cruise missiles and nukes and poison gases, does not mean we will be better off when our society reaches a boiling point -- with robotic soldiers and military AIs and speedier plagues and so on. Eventually quantitative changes (like lowering prices per unit) become qualitative changes. Every year our planet is in conflict is a year of risk of that conflict escalating into global disaster. So, the question is, do our individual actions add to that risk or take away from it?

I'm impressed with what some UAVs can do in terms of construction vs. destruction, so obviously there are a lot of different possibilities in that field.
http://www.extremetech.com/extreme/107217-real-life-constructicon-quadcopter-robots-being-developed [extremetech.com]

Re:Sigh (1)

Anonymous Coward | about a year and a half ago | (#43075651)

There is a difference between guided munitions, and automated combatants eventually making the decision about when to attack.

Re:Sigh (0)

Anonymous Coward | about a year and a half ago | (#43075915)

Well sure, but we're not there yet, assuming we're ever comfortable enough letting our machines decide what to kill.

As for what he said, a Roomba with a Kinect on board isn't exactly a capable scout. Perhaps something more like:

http://en.wikipedia.org/wiki/Foster-Miller_TALON

or

http://en.wikipedia.org/wiki/Gladiator_Tactical_Unmanned_Ground_Vehicle

Re:Sigh (0)

Anonymous Coward | about a year and a half ago | (#43075673)

Frankly, you lost most of your credibility in my eyes at "Forces of Good"

Re:Sigh (0)

Anonymous Coward | about a year and a half ago | (#43076351)

Looks like he forgot to include the (TM). I think he was being sarcastic.

Re:Sigh (1)

Anonymous Coward | about a year and a half ago | (#43075797)

I think the point of contention is that the Patriot missile did not decide to kill you; a human did. A completely autonomous machine programmatically deciding whether you should live or die seems like something completely different to me.
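
To make that distinction concrete, here is a minimal sketch in Python of the two control policies (all names are hypothetical illustrations; no real system's design is implied):

    # Human-in-the-loop vs. fully autonomous engagement, as a policy sketch.
    from dataclasses import dataclass

    @dataclass
    class Track:
        track_id: str
        classified_hostile: bool  # output of an imperfect sensor/classifier

    def human_in_the_loop_engage(track: Track, operator_authorized: bool) -> bool:
        # A human must explicitly authorize every engagement.
        return track.classified_hostile and operator_authorized

    def fully_autonomous_engage(track: Track) -> bool:
        # The machine's own classification is the only gate; no human decides.
        return track.classified_hostile

The debate in this thread is about moving from the first function to the second: the same imperfect classifier output, but no person making, or accountable for, the decision.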

Re:Sigh (1)

locofungus (179280) | about a year and a half ago | (#43076609)

I would agree. I don't really see a difference between an autonomous robot deciding whether you should live or die and an engineered virus deciding whether you should live or die.

Tim.

Re:Sigh (2, Insightful)

Anonymous Coward | about a year and a half ago | (#43076225)

The other obvious issue is the "arms race" aspect to this discussion. If it is mandated that all robots are designed not to kill humans, you can guarantee that someone will make one that doesn't comply, or complies conditionally.
Something about genies and bottles.

Re:Sigh (1)

turbidostato (878842) | about a year and a half ago | (#43076441)

"There is nothing new about battlefield robots - we've had tomahawk missiles since the early 80s."

And since when have tomahawks decided what their targets should be, based on general autonomous and situational considerations?

A tomahawk is not a robot; it is a tool.

Well, there are still no robots on the battlefield and it'll be a long time before there are, so this is more sci-fi than anything. But, answering the question "who gets to decide on the ethics", I thought that one was obvious: Isaac Asimov, of course!

Re:Sigh (1)

Intrepid imaginaut (1970940) | about a year and a half ago | (#43076443)

Let's focus on the real ethics of robotic warfare: how our leaders choose to use the tools we have made.

I'm more interested in the imbalance of power robots have the potential to create. Not an imbalance between countries, but between the personal power of a few and the rest of us. What if Tony Stark or Superman were real, and complete sociopaths to boot? Why wouldn't they rule this rock like god-kings? When a few wealthy people or politicians can remote-control an entire army sans restriction, we're going to start seeing a new and very ugly kind of tyranny emerging. Maybe not in western democracies, hopefully not anyway, but the potential is there.

Re:Sigh (0)

Anonymous Coward | about a year and a half ago | (#43076985)

The flip side to this is that in the near future, a kid with a handheld computer system and access to fabrication systems (3D printer, circuit board printer) will be able to create a robot that can assassinate the above persons. The crossbow did not bring power to the nobility, it brought power to the peasantry (so much so that it was banned in many countries for the ease with which an assassin could strike). Eventually this sort of thing (cheap killer robots), much like cruise missiles, will be accessible to the common man.

Anyone with the will to do so (and a thousand bucks or three) could quite readily (today) build a programmable guided cruise missile capable of landing within a quarter mile of the target. With skill they could make it capable of hitting a house. With great skill they could land that sucker on a moving car.

Re:Sigh (0)

Anonymous Coward | about a year and a half ago | (#43076493)

Let's focus on the real ethics of robotic warfare: how our leaders choose to use the tools we have made.

So, it's not the gun that kills but the person who fires it. Yet not everybody would feel comfortable or happy spending his or her life perfecting, for argument's sake, biological weapons that could take out the whole race in the wrong hands.
I for one think that engineers would be much better off thinking about the moral consequences of the choices they can directly affect than about those of their leaders.

Re:Sigh (1)

triffid_98 (899609) | about a year and a half ago | (#43076651)

Don't forget - the machine gun and landmine have killed far more people than drones likely ever will.

As have carpet bombing and guys with swords. As long as they're controlled by humans (at least during targeting) they aren't true robots.

...and yes, our leaders have been sending drone strikes and thousands of troops to go kill people based on a pack of lies. Afghanistan harboring Osama? No, it was Pakistan, the same country we keep sending huge bundles of cash and free F-16s to. Iraq having weapons of mass destruction? We didn't even find a camel loaded with sparklers.

Re:Sigh (1)

Anonymous Coward | about a year and a half ago | (#43076939)

Of course, when a group with limited technological and economic force faces a "cowardly" opponent who attacks humans without risking harm to their own humans, it becomes completely ethical to launch a biological massacre upon the economic forces (workers) that enable the factories to produce the machines.

Do I have... (0)

Anonymous Coward | about a year and a half ago | (#43075381)

... a 2nd Amendment right to own a combat robot?

Re:Do I have... (2)

viperidaenz (2515578) | about a year and a half ago | (#43075483)

Only if it has arms.

Re:Do I have... (1)

Ungrounded Lightning (62228) | about a year and a half ago | (#43075877)

It qualifies as an "arm" (armament). It's useful in war. So yes, you do.

But good luck trying to enforce it, in an environment where the legal system has only occasionally given it even lip service since 1938.

Will sentient robots get the right to bear arms... (1)

Paul Fernhout (109597) | about a year and a half ago | (#43075705)

...again those who would enslave them as guards and soldiers? http://www.metafuture.org/Articles/TheRightsofRobots.htm [metafuture.org]

Re:Will sentient robots get the right to bear arms (1)

cold fjord (826450) | about a year and a half ago | (#43076629)

Will sentient robots get the right to bear arms...

We've got some breathing space to think about it as strong AI or AGI doesn't seem much closer than it was 30 years ago.

Re:Will sentient robots get the right to bear arms (3, Interesting)

Paul Fernhout (109597) | about a year and a half ago | (#43077393)

"AI" has always been that which AI can't do. Here are several activities that once were considered sci-fi-level AI but are no longer considered AI in a broad sense because we know how to do them more-or-less:
* Looking stuff up for us (Google);
http://www.google.com/ [google.com]
* Inferring questions from examples and answering questions posed in natural language (IBM's Watson);
http://en.wikipedia.org/wiki/Watson_(computer) [wikipedia.org]
* Generating hypotheses and doing hands/grippers-on scientific experiments (Adam);
http://en.wikipedia.org/wiki/Robot_Scientist [wikipedia.org]
* Reading text in multiple fonts reliably and quickly and cheaply;
http://en.wikipedia.org/wiki/Optical_character_recognition [wikipedia.org]
* translating one human language to another on the fly;
http://domino.watson.ibm.com/comm/research.nsf/pages/r.uit.innovation.html/ [ibm.com]
http://www.gizmag.com/go/1833/ [gizmag.com]
* reading and translating signs;
http://questvisual.com/us/ [questvisual.com]
* Making portraits;
http://www.slate.com/articles/technology/future_tense/2012/11/tresset_robot_artist_artist_engineers_robots_to_make_art_and_save_his_own.single.html [slate.com]
* Playing the piano including from sheet music;
http://www.synthgear.com/2009/music-misc/synth-playing-robot/ [synthgear.com]
http://gizmodo.com/5963137/watch-this-adorable-horde-of-intelligent-swarm-robots-play-piano [gizmodo.com]
* Driving a car in busy traffic (Google, Stanford, CMU, others);
http://en.wikipedia.org/wiki/DARPA_Grand_Challenge#2007_Urban_Challenge [wikipedia.org]
* Winning chess games (IBM's Deep Blue and pretty much any PC now against a mid-level player);
http://en.wikipedia.org/wiki/Computer_chess [wikipedia.org]
* Image recognition for quality control in factories;
http://www.general-vision.com/products/mtvs.php [general-vision.com]
* Recognizing faces;
http://en.wikipedia.org/wiki/Facial_recognition_system [wikipedia.org]
* Figuring out the name of a musical composition from a few notes as well as making new compositions and dynamic accompaniments;
http://www.wikihow.com/Identify-Songs-Using-Melody [wikihow.com]
http://en.wikipedia.org/wiki/Music_and_artificial_intelligence [wikipedia.org]
* The diagnostic aspect of being a doctor (Watson again);
http://www.wired.co.uk/news/archive/2013-02/11/ibm-watson-medical-doctor [wired.co.uk]
* Investing in volatile financial markets;
http://en.wikipedia.org/wiki/Program_trading [wikipedia.org]
* Serving as a sentry with a machine gun;
http://www.youtube.com/watch?v=v5YftEAbmMQ [youtube.com]
* Twirling a cell phone;
http://www.hizook.com/blog/2009/08/03/high-speed-robot-hand-demonstrates-dexterity-and-skillful-manipulation [hizook.com]
* Identifying things by smell;
http://news.bbc.co.uk/2/hi/technology/6614567.stm [bbc.co.uk]
* fairly-reliable wide-ranging speech recognition in a mobile device improved by using neural networks (Google again);
http://venturebeat.com/2012/10/07/google-uses-its-artificial-intelligence-to-improve-speech-recognition/ [venturebeat.com]
* simulate a brain, a planet, and a galaxy;
http://www.gizmag.com/ibm-supercomputer-simulates-a-human-sized-brain/25093/ [gizmag.com]
http://www.nbcnews.com/technology/futureoftech/living-earth-simulator-seek-planets-future-its-data-944707 [nbcnews.com]
http://www.haydenplanetarium.org/resources/ava/galaxies/G0601andmilwy [haydenplanetarium.org]
* purchasing a sandwich from a Subway store in another location upon a request for a sandwich and bringing it back (after checking the fridge for one first);
http://www.engadget.com/2011/10/06/robot-uses-semantic-search-to-get-a-subway-sandwich-do-jareds/ [engadget.com]
* lots more:
http://p2pfoundation.net/backups/p2p_research-archives/2009-November/005926.html [p2pfoundation.net]
http://homes.cs.washington.edu/~lazowska/cra/ai.html [washington.edu]

Almost all of these were pretty-much entirely "sci-fi" thirty years ago. Example, James P. Hogan's 1979 sci-fi "The Two Faces of Tomorrow":
http://www.jamesphogan.com/books/book.php?titleID=28 [jamesphogan.com]
http://www.baenebooks.com/chapters/0671878484/0671878484.htm [baenebooks.com]

Or to go further back, Isaac Asimov (who called me a "rotten kid" :-) like in "I, Robot" or "The Last Question":
http://en.wikipedia.org/wiki/I,_Robot [wikipedia.org]
http://en.wikipedia.org/wiki/The_Last_Question [wikipedia.org]

Hans Moravec (who I spent some time hanging out with around 1985) makes a good point here about the "Big Freeze" of AI research platforms at 1MIPS which started to thaw around the year 1990-2000:
http://www.transhumanist.com/volume1/moravec.htm [transhumanist.com]

So, I guess it depends what the "strong" means in "strong AI", but certainly we are seeing a lot of developments in this area, including the emergence of "deep learning":
http://www.newyorker.com/online/blogs/newsdesk/2012/11/is-deep-learning-a-revolution-in-artificial-intelligence.html [newyorker.com]

There was a typo in my original post: "again" -> "against". Still, it is possible this has happened many times before and we are the AIs? :-)
http://www.simulation-argument.com/ [simulation-argument.com]

Or:
"The World Was Probably Already Destroyed"
http://www.digitalcosmology.com/Blog/2012/12/06/t/ [digitalcosmology.com]
"Some people wonder if our planet will be destroyed on December 21, 2012. I have friends asking me every day whether I think the world will end in a few weeks. But it is possible that our planet was already destroyed and before that occured its scientists managed to send a capsule in space with a supercomputer running its simulation. ... Will the destruction happen again in the simulation? Probably not since the conditions that caused it were of stochastic nature. However, even if the destruction takes place in the simulation, the computer will restart it and the world will be created again in an endless fashion. ..."

Our path out of any technological singularity may have a lot to do with our path going into one. Every little effort may make a difference. So, as I see it, given the pervasive emerging surveillance of everything we do on the internet, which is being processed by simpler algorithms now and will be re-processed by more advanced AIs in the future, with every email I send I'm potentially programming the values of computers that won't exist for decades. :-) Well, or statistically as above, I guess I'm most likely programming an infinite chain of future simulated versions of the same computers that are already simulating me? What an admittedly odd way to spend so much time... :-)

That process of recursive re-creation is not that different from the plant growth algorithm my wife and I developed for our Garden Simulator. That's a bit like how people talk about the interrelation between the universe coming into being and our own personal growth. And of course everyone is doing that kind of programming too with every Facebook post or tweet or text message; I'm just more aware of the possibility perhaps. See also:
http://en.wikipedia.org/wiki/Trim_tab#Trim_tab_as_a_metaphor [wikipedia.org]
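
As an aside, a recursive growth rule of the kind mentioned above can be sketched in a few lines of Python (a generic L-system illustration only, not the actual Garden Simulator algorithm):

    # A generic Lindenmayer-system sketch of a recursive plant-growth rule.
    RULES = {"X": "F[+X][-X]FX", "F": "FF"}  # rewrite rules applied each generation

    def grow(axiom: str, generations: int) -> str:
        # Rewrite every symbol by its rule, repeatedly: each generation
        # re-creates a larger copy of the same branching pattern.
        s = axiom
        for _ in range(generations):
            s = "".join(RULES.get(ch, ch) for ch in s)
        return s

    print(grow("X", 3))  # the string encodes branches; a turtle-graphics renderer would draw it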

Which gets back to a point I made in another reply about the unrecognized irony of creating robots and AIs to fight over and destroy resources when they could be made to cooperate with us and create resources, so there would be little real scarcity to fight about:
http://www.pdfernhout.net/recognizing-irony-is-a-key-to-transcending-militarism.html [pdfernhout.net]
"Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead?"

But that would require rethinking "artificial scarcity" as the moral basis of our economy at this point:
http://slashdot.org/comments.pl?sid=3513955&cid=43074385 [slashdot.org]

And it would require eventually rethinking "slavery" as the default relationship to our ever smarter machine companions:
http://www.rfreitas.com/Astro/LegalRightsOfRobots.htm [rfreitas.com]

At least, while we still have time to think about these things and do something about them like Marshall Brain suggests:
http://www.marshallbrain.com/manna1.htm [marshallbrain.com]

I talk about one possibility for that here:
"The Richest Man in the World: A parable about structural unemployment and a basic income "
http://www.youtube.com/watch?v=p14bAe6AzhA [youtube.com]

Not sure if a Robot Army is a good idea. (2)

Lisias (447563) | about a year and a half ago | (#43075387)

Billions of dollars can be deactivated by a simple PEM.

You know... the bombs that emits an electro-magnetic pulse that disables everything that are digital...

They are so simple to build that the USA would restrain itself from using them, as the enemy would easily figure out how to build one by analyzing the bomb's scraps...

Re:Not sure if a Robot Army is a good idea. (2)

Sussurros (2457406) | about a year and a half ago | (#43075497)

That's why the Russians still make vacuum tubes and possibly still use them in military equipment. Vacuum tube circuits are much more resistant to EMP attacks, and as I recall the Soviets designed a few EMP weapons, although of course a nuclear bomb does a fine job as it comes.

Small EMP devices are very easy to make, and the designs for basic circuits were easily available in any large book store last time I looked, maybe eight years ago, while browsing for other books about electronics. They always gave me a cold shiver when I came across them, imagining one of the neighbourhood kids making one and wiping out my computer.

Re:Not sure if a Robot Army is a good idea. (3, Informative)

viperidaenz (2515578) | about a year and a half ago | (#43075503)

You know... the bombs that emits an electro-magnetic pulse that disables everything that are not adequately shielded...

FTFY.
I think you also mean EMP, not PEM.

Re:Not sure if a Robot Army is a good idea. (0)

Anonymous Coward | about a year and a half ago | (#43075717)

EMP in Spanish is PEM.

http://es.wikipedia.org/wiki/Pulso_electromagn%C3%A9tico

Re:Not sure if a Robot Army is a good idea. (0)

Anonymous Coward | about a year and a half ago | (#43075729)

You know... the bombs that emits an electro-magnetic pulse that disables everything that are not adequately shielded...

FTFY.
I think you also mean EMP, not PEM.

PEM = Pulsing Electromagnet
http://utini420.blogspot.co.nz/2009/12/pem-bombs.html

Re:Not sure if a Robot Army is a good idea. (1)

geekmux (1040042) | about a year and a half ago | (#43077197)

Billions of dollars can be deactivated by a simple PEM.

You know... the bombs that emits an electro-magnetic pulse that disables everything that are digital...

They are so simple to build that the USA would restrain itself from using them, as the enemy would easily figure out how to build one by analyzing the bomb's scraps...

Perpetual revenue streams are the end goal here. Destruction of hardware is a by-product of that, and thus is a goal as well. We're not sending drones to a battlefield to teach them patty-cake, although that would be one of the funniest hacks ever witnessed by man ("Did that drone just try to mount my drone?").

Is killing a robot or a drone an act of war? (0)

Anonymous Coward | about a year and a half ago | (#43075393)

Let's say Canada creates an army of robots and drones, then makes them attack the USA. Would taking out a foreign robot be an act of war? Would Canada spin it in such a way that it made the robot look innocent and the Americans the aggressors? Would the natives have far fewer moral qualms about destroying a machine compared to a real human?

You can change the name of the countries there, but there are some questions about taking robots to war.

Re:Is killing a robot or a drone an act of war? (1)

viperidaenz (2515578) | about a year and a half ago | (#43075513)

I think war would be declared when Canada makes an attack on the USA. Does it really matter that an act of war is carried out during a war?

Re:Is killing a robot or a drone an act of war? (0)

fyngyrz (762201) | about a year and a half ago | (#43075543)

Would Canadian robots want to be lumberjacks? Would they want to sleep all night and dance all day? Would they be ok? Would they pack buttered scones and poutine? Would they wear women's clothing and hang around in bars?

I would like a grant to study these matters. Also, some Canadians would be useful in this pursuit. I'll need some comely females as a control group to study... while the others are out chopping wood.

Re:Is killing a robot or a drone an act of war? (0)

Anonymous Coward | about a year and a half ago | (#43077261)

Also, some Canadians would be useful in this pursuit. I'll need some comely females as a control group to study... while the others are out chopping wood.

Methinks the intersections of the sets "females", "comely", and "Canadian" might be too small to get a statistically significant number of members for your control group.

To quote Gundam Wing.. (1)

hilather (1079603) | about a year and a half ago | (#43075509)

"When people stop fighting battles for themselves war becomes nothing more than a game." -- Quatre

Re:To quote Gundam Wing.. (0)

Anonymous Coward | about a year and a half ago | (#43075597)

If you are a general or political leader, war is a game. If you are a soldier, war is hell.

Re:To quote Gundam Wing.. (0)

Anonymous Coward | about a year and a half ago | (#43075653)

When did people fight in war for themselves? It always seems to be their master or their government or whatever ideal that they are told to follow. People rarely fight for themselves, and never on a large scale.

Re:To quote Gundam Wing.. (1)

Anonymous Coward | about a year and a half ago | (#43075937)

When did people fight in war for themselves?

Back in the old days, when the King and nobility were not only on the field, but your opponents would slit your throat, burn down your village, and rape your women - even if you didn't want to be there, because it was fun.

Myrtle Seashore for Traveler Golf and additionally (-1, Offtopic)

TimeMart (2856887) | about a year and a half ago | (#43075615)

Cross-stitch embroidery [timemart.vn] Myrtle Seashore for Traveler Golf and additionally also Outdoor Activities

Robtic warfare? (0)

Anonymous Coward | about a year and a half ago | (#43075627)

Is that like Roblimo warfare, only sexier?

You want ethics? Go to war. (0)

Anonymous Coward | about a year and a half ago | (#43075635)

People have been ethical "robots" (military) of their "robotic overlords" (current rulers / hirers / whatever) since time immemorial. If you wish your "ethics" in war, go to war. Otherwise, it's a videogame. And you should be dis-empowered. If you are "in charge" of these "robotic warriors", you should have your own "ass"ets at risk. Otherwise, don't bother. If you are "commander in chief", you should have served, E-4 or below, before commanding. Otherwise, not allowed. If you vote "go to war", you should be REQUIRED to report to the nearest recruiting station for immediate deployment to Wherever-stan. If you wish to vote, you must first deploy to Wherever-stan or be a legal resident. If you're so darned proud of it that you show flags in your windshield and bumper stickers, GO BACK THERE. Otherwise, sign up, shoulder up, pay up, and get on with life. That is exactly what "they", the American forefathers, offered in their Declaration and Constitution.
        Security. Safety. Freedom. Pick one.

Winner decides (1)

blamelager (1152861) | about a year and a half ago | (#43075691)

... obviously! Some things will never change.

It's been done (1)

WhatAreYouDoingHere (2458602) | about a year and a half ago | (#43075751)

... I for one think it would make it interesting to pit legions of robot warriors against each other...

It's been done. [wikipedia.org] According to the all-wise Wikipedians, "Storm 2" was the last world champ.

dude i got some of the stankiest farts right now (-1)

Anonymous Coward | about a year and a half ago | (#43075783)

i hope they clear up before work tomorrow cuz they smell like a 2 day old dog shit out behind the garage

It will happen (1)

clarkkent09 (1104833) | about a year and a half ago | (#43075787)

Of course there are people that are for and against such technology.
 
They should find something else to occupy themselves with, because the technology will happen regardless. As long as we have a world of competing nation states (generally a good thing), if one doesn't develop a technology to its full military potential, another one will. Even nuclear weapons are slowly but surely proliferating, despite major technological difficulties and the most intensive legal/diplomatic/etc. efforts ever made to prevent any device from being made.

Editors = cunts (-1)

Anonymous Coward | about a year and a half ago | (#43075869)

"T-100"
"robtic"

Editors, do you fucking do ANYTHING to earn your pay?

Philosophy 101 (2)

zedrdave (1978512) | about a year and a half ago | (#43075951)

Re:Philosophy 101 (2)

clarkkent09 (1104833) | about a year and a half ago | (#43076003)

Most of it is ridiculous, whether it's based on "rigorous rational thinking and an axiomatic approach" or not (actually not). Besides, which school of ethics do you apply? Consequentialists (the ends justify the means, a.k.a. screw the principles), utilitarians (the happiness of the majority is key, a.k.a. screw the minority), pragmatists (whatever society decides... so much for rigorous rational thinking), etc., etc.

Re:Philosophy 101 (2)

zedrdave (1978512) | about a year and a half ago | (#43076065)

You are mixing up personal ethics and ethical theories that can be applied to a community (of people/countries).

Given the question at hand, I'll venture a wild guess and say Military ethics [wikipedia.org] are most applicable. You might know them (in large part) as the Nuremberg Code, the Helsinki Declaration, or the Geneva Convention. In the modern mainstream world (outside of religious/political nuts), there isn't a lot that's controversial about them. That is, until a country decides that breaking them might possibly give it the upper hand, thus effectively, knowingly, stepping outside defined ethical bounds.

Re:Philosophy 101 (0)

Anonymous Coward | about a year and a half ago | (#43076549)

It's you who's mixing things up. That's the law. Nothing to do with ethics.

Re:Philosophy 101 (2)

Sigg3.net (886486) | about a year and a half ago | (#43077459)

*sigh*

Put it like this: ethics is the study of the human being. With that in mind, and a hypothetical ultimate morality as a goal, the truth of what we are must guide us. The endeavour must be scientific, which entails that some ethical theories are empirically false (Hobbes and children are inherently flawed, for instance) while appeals to religion are cultural and will often prove moot (understood as early attempts at the same task).

How will a discussion about the "ultimate morality" come about with any hope of success? So far my money's on Habermas' Formal Pragmatics. It is an extremely important contribution but not easily understood.

Please ask if you want clarification.

When official institutions today talk about "ethics", however, it is paradigmatically Christian Protestant morality and not ethics proper.

Re:Philosophy 101 (1)

aaaaaaargh! (1150173) | about a year and a half ago | (#43077159)

As a philosopher who currently works at the borderlines between philosophy of language, logic, and ethics (work that is overlapping with AI research and also working together with computer scientists occasionally), I have something to say about that. You might not like it, though.

Ethics is neither rigorous nor particularly rational nor is most of it axiomatic. Rationality has been undermined for decades now by recent trends like 'moral intuitionism', 'moral contextualism', and 'moral particularism'. These are horrible abominations of the critical thinking culture, 'post-structuralism', and a dumbed-down, politically correct education system, yet they are highly influential. Furthermore, even what is known as 'axiology' in the deontic tradition can rarely be considered axiomatic in the mathematical sense of the word. Axiology mostly concerns normative systems in general without any details and without really formalizing rule conflicts etc. From my personal experience with them, mainstream ethicists also tend to be sloppy thinkers. They often do not think things through, sometimes even ignore obvious problems or impossibility theorems, and many of them are proud of their anti-mathematical attitude. (Like always there are exceptions, I'm just laying out the tendency.)
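
To give one concrete example of the kind of rule conflict such formalization would have to handle (a generic textbook case in standard deontic logic, not drawn from any particular ethicist; the rule contents are invented for illustration):

\[
O(p), \qquad O(\lnot p), \qquad \text{Axiom D: } O(p) \rightarrow \lnot O(\lnot p)
\]

Two obligations, O(p) ("protect the convoy") and O(not-p) ("do not enter the area"), are jointly inconsistent once axiom D is assumed. Standard deontic logic cannot even represent a genuine moral dilemma without contradiction, so a richer formalism has to say which obligation yields, and that extra machinery is exactly what is usually left unspecified.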

Ethicists also do not agree upon common definitions of the foundations of their discipline. Words like 'value', 'reason', 'normative system', or 'justification' have no commonly agreed meaning among professional moral philosophers. If you doubt my words, just ask them. Write an email to 10 famous moral philosophers and ask them about their definition of these words (not definitions according to school X, which they can of course repeat but do not agree with). You will be surprised about how different the answers will be. Amongst philosophers you will likely find any opinion on a moral topic you can imagine and quite a few nonsensical ones you cannot imagine.

I'm not saying that moral philosophy is useless, otherwise I wouldn't do it, I'm just saying that like in other branches of philosophy around 80%-90% of the texts in it are crap or, to be more modest, not much more than a source of inspiration and certainly not rigorous, as you have suggested. Personally, I would not trust a moral philosopher any more to make moral decisions than I would trust the morality of any other educated and seemingly reasonable person. Perhaps I would even trust them a bit less, because they tend to be moralists (in the bad sense of the word) and also lack a sense of realism sometimes. Still I believe that most of them would be reasonable enough not to voluntarily work in a weapon research programme ...

All of what I said includes "military ethics", which is a niche about as small as the "philosophy of football".

Re:Philosophy 101 (1)

rohan972 (880586) | about a year and a half ago | (#43077343)

I doubt that military killbots will be programmed by the philosophy department of a university. More likely by engineers working for generals.

NP Hard (1)

BradleyUffner (103496) | about a year and a half ago | (#43075969)

Ethics are NP-hard; good luck with that.

HK-Aerial is a better Terminator reference (0)

Anonymous Coward | about a year and a half ago | (#43076057)

"HK-Aerial" refers to any of a wide variety of Skynet's large, airborne, VTOL-capable non-humanoid Hunter-Killers. Derived from the original HK-Drone, it retains the form and maneuvering system, but on a much larger scale. With wingspans of up to 108 feet[1] and a devastating array of under-slung and wing-mounted lasers, missiles, and plasma cannons, the HK-Aerial is fearsome and terrifying to behold.

T-1000? (0)

Anonymous Coward | about a year and a half ago | (#43076089)

Ye gods,
your brains have really been poisoned by Hollywood if you think that a title mentioning the T-1000 is apposite after reading that pile of garbage on the BBC site OP pointed to.

Google 'Second Variety', or, if you really can't live without a movie point of reference, 'Screamers'. (And even then, these are pushing it.)

Why this is bad (1)

meekg (30651) | about a year and a half ago | (#43076143)

Wars don't end because either (or both) of the sides are tired of committing atrocities.
Wars end because either (or both) of the sides can't sustain its own casualties.
See Iraq. See Vietnam.

Robot soldiers mean that atrocities can take place with no human toll, no witnesses.
No battle fatigue.

Robots will do to war what Facebook did to idle chat...
(How about that?)

Re:Why this is bad (1)

geekmux (1040042) | about a year and a half ago | (#43077151)

Wars don't end because either (or both) of the sides are tired of committing atrocities. Wars end because either (or both) of the sides can't sustain its own casualties. See Iraq. See Vietnam.

Robot soldiers mean that atrocities can take place with no human toll, no witnesses. No battle fatigue.

Robots will do to war what Facebook did to idle chat... (How about that?)

How in the hell is sending a robot on behalf of my emotions going to resolve anything? You think I'm going to FEEL any differently about my enemy when they "kill" my drone? War requires emotion to realize it is wrong. Robot warfare won't even seem wrong in the eyes of children. It will seem like a damn game.

Unfortunately, paying the admission price for that "game" means budgets spent on warfare rather than welfare. People will simply starve to death rather than die on the battlefield. Gee, I feel so much better now. Wow, look at that, Egypt wants to play. Questionable stability, so perhaps we should lend them a few hundred billion to, you know, get on the playing field and have some "fun" with us. Yes, that kind of spending will go over well in the face of a sequester as we fund our battle buddies (like we haven't before).

The true opposite of war is peace. Instead of that being the goal, we're merely trying to make what we seemingly cannot stop doing somehow easier on our conscience by using robots. You know what else they call that? Lipstick on a pig.

And wars never end? Seems there are a whole lot of us that get along just fine after being involved in WWI, WWII, and Korea to name a few. Doubtful that Hyundai, Daewoo, or Kia would be in the US if we were really that butt-hurt about the Korean war. Same goes for Honda and Toyota.

Remember, follow the money. Logic doesn't exist anymore in programs or policy. Right now, seems like the drone vendors are pulling the strings.

Speechless (1)

JanneM (7445) | about a year and a half ago | (#43076175)

Of course there are people that are for and against such technology.

The depth and razor-sharp incisiveness of this analysis leaves me breathless.

Add a quote from a taxi driver in Beirut and it could be Thomas Friedman under a pseudonym.

War is not a video game (2)

kaldari (199727) | about a year and a half ago | (#43076559)

"Development of ethical robotic systems seems like it would take some of the fun out of things"

What kind of twisted fantasy world are you guys living in? War means killing people. It isn't fun. It isn't a video game. And in response to Kell's comment above, we aren't the "Forces of Good" battling the "Forces of Evil". We are a nation state with imperfect leaders and selfish short-sighted goals just like every other nation state on the planet. The difference between having real armies and having robot armies is that robots don't suffer any decline in morale from massacring large numbers of civilians. Think about the implications of that for a minute.

Re:War is not a video game (0)

Anonymous Coward | about a year and a half ago | (#43077291)

We are a nation state with imperfect leaders ... just like every other nation state on the planet.

I strongly object to your assertion that the leaders of the nation state I am residing in are merely imperfect. A more accurate adjective would be found somewhere in the range between "idiot" and "invidious", if you must use words starting with an I.

You Insensitive clod.

PETR (0)

Anonymous Coward | about a year and a half ago | (#43076611)

You know it's coming.... People for the Ethical Treatment of Robots - they'll be showing up and throwing motor oil on fans.

This is wrong. (1)

loneDreamer (1502073) | about a year and a half ago | (#43076729)

We are in a really bad spot if you think wars are "fun". Wars are bad and are supposed to stay that way. Just give one country a disproportionate budget to build battle robots and human beings on the other side will die like flies... without even the inconvenience of pulling the trigger, no risk, no seeing the blood, no living with the remorse (no need to worry about war crimes either, right? We can at most talk about a regrettable malfunction). Just a permanent, very profitable war industry fighting wherever, for no real cause, and people comfortable at home, oblivious to the real suffering. Sound familiar?

So, war as a commodity, as a business, or as entertainment: I can't even tell which one is more wrong.

"who gets to decide on the ethics"? (0)

Anonymous Coward | about a year and a half ago | (#43076923)

The same people who always do - those with no ethics, who serve the 1% and never let anyone stand in the way of their greed and bloodlust. We are doomed.

I was about to say "Dupe post" (1)

gravis777 (123605) | about a year and a half ago | (#43077437)

Probably won't be a post that gets scored up, but I came very close to saying "dupe post". I know I saw this article yesterday or Sunday with the exact same headline. After spending 20 minutes digging through the Slashdot archives, and digging through all the other news sites I have read in the past couple of days, it finally occurred to me that I saw this in the Firehose yesterday. Oops.
