Robots Can Learn To Hold Knives — and Not Stab Humans

Soulskill posted about 10 months ago | from the never-bring-a-knife-to-a-robot-fight dept.

Robotics 104

aurtherdent2000 writes "We humans enjoy not having knives inside of us. Robots don't know this (Three Laws be damned). Therefore, it's important for humans to explain this information to robots using careful training. Researchers at Cornell University are developing a co-active learning method, where humans can correct a robot's motions, showing it how to properly use objects such as knives. They use it for a robot performing grocery checkout tasks."
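
For readers wondering what "co-active learning" means in practice: the robot proposes a trajectory, the human physically nudges it toward something better, and the robot treats that correction as "the nudged trajectory should score higher than my proposal." Below is a minimal, hypothetical sketch of such a feedback loop in Python; the feature function and helper names (sample_trajectories, human_nudges) are invented for illustration and are not the Cornell group's actual code.

    # Hypothetical sketch of a co-active learning update for trajectory scoring.
    # trajectory_features(), sample_trajectories() and human_nudges() are
    # illustrative placeholders, not the published system.
    import numpy as np

    def trajectory_features(traj):
        """Summarize a trajectory (N x 3 array of waypoints) as features,
        e.g. path length and how low the held object dips."""
        traj = np.asarray(traj, dtype=float)
        path_length = np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1))
        lowest_point = traj[:, 2].min()
        return np.array([path_length, lowest_point, 1.0])

    def score(w, traj):
        return float(w @ trajectory_features(traj))

    def coactive_update(w, proposed, corrected, lr=0.1):
        """Shift the weights so the human-corrected trajectory scores
        at least as well as the robot's own proposal."""
        return w + lr * (trajectory_features(corrected) - trajectory_features(proposed))

    # Interaction loop (sketch):
    # w = np.zeros(3)
    # for _ in range(num_rounds):
    #     proposed = max(sample_trajectories(), key=lambda t: score(w, t))
    #     corrected = human_nudges(proposed)   # user pushes the arm onto a better path
    #     w = coactive_update(w, proposed, corrected)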

104 comments

It shouldn't have to be pointed out (5, Insightful)

Anonymous Coward | about 10 months ago | (#45342201)

If they can be taught to not stab a human...They can also be taught to stab a human. All it takes is one psychopath or curious idiot.

Re:It shouldn't have to be pointed out (2, Insightful)

Anonymous Coward | about 10 months ago | (#45342223)

They don't need to be taught to stab humans, stabbing is its natural state of being. Being taught NOT to, well, that's the big news here.

Re:It shouldn't have to be pointed out (4, Funny)

93 Escort Wagon (326346) | about 10 months ago | (#45342687)

Well, sheesh - you named the robot "Stabby"... what did you THINK was going to happen?

Re:It shouldn't have to be pointed out (3, Insightful)

LifesABeach (234436) | about 10 months ago | (#45344179)

Sorry I'm slow this morning, I'm still working on the phrase, "... They use it for a robot performing grocery checkout tasks." I can't tell you how many times I've encountered a grocery store clerk holding a knife. It's not clear to me now how this task can be shifted over to Robots, but the day is early still.

Re:It shouldn't have to be pointed out (2)

femtobyte (710429) | about 10 months ago | (#45346033)

Presumably, the knife is meant to be a "worst-case" stand-in for any object. If a robot can safely handle knives in close quarters to humans, then everything else is safe. In the grocery checkout situation, you don't want the robot to accidentally swing a can of beans through a customer's head when the absentminded customer leans over the counter to pick up the coupon they dropped.

Re:It shouldn't have to... things go bad (1)

Anonymous Coward | about 10 months ago | (#45346405)

Okay, train of mechanical thought: Potatoes, check, corn... still need. Don't stab human. Greens, check. Olives, still need. Don't stab human. pickles, still need : Exception: Pickles on this aisle. Proceed to pickles sh - don't stab human- elf. Reach down to correct level. Arrange fing- Don't stab human- ers in an open grasping format, put out hand [Don't stab human]. close hand, retract hand, don't stab human, lift hand, don't stab human, put hand [don't stab human] in bas [don't stab] ket [human] op[don't'en[stab human] hand [cl] don't stab human[ose h]don't stab human [and] don't stab human don''t stab human [don't] don't stab human [stab] don't stab human [human].

    Human interrupts "May I help you?"

Interrupt. Exception overide [don't stab human] interrupt. Interrupt override protection fault. Must protect Reboot. sTaRT pRoTECtion FaULT ovverirridee sysssstem ... Data = read data last command [stab human].

Push command.

Execute.

Somehow, I think we may be too eager to replace humans with robots.

Re:It shouldn't have to be pointed out (1)

MisterSquid (231834) | about 10 months ago | (#45342247)

One of the more salient questions to answer regarding robot weapons is whether human societies will tolerate autonomous robots that deprive human beings of life and limb.

I hope our descendant human cultures will categorically eschew such devices, but my political intuition tells me such wishes are naive.

May God have mercy on our souls.

It's not holding knives that worries me most... (0)

Anonymous Coward | about 10 months ago | (#45342357)

... it's being grabbed by their metal claws...because when they grab you with those metal claws, you can't break free.. because they're made of metal, and robots are strong. And they're gonna eat my medicine for fuel.

Re:It shouldn't have to be pointed out (0)

sI4shd0rk (3402769) | about 10 months ago | (#45342681)

I've always wondered
Can Jesus see
What's inside
My underwear
I've always wondered
What Jesus sees
When he looks
In your underwear

Re:It shouldn't have to be pointed out (1)

Anonymous Coward | about 10 months ago | (#45343283)

I've always wondered
Can Jesus see
What's inside
My underwear

Nothing to see here, move along.

Re:It shouldn't have to be pointed out (0)

Anonymous Coward | about 10 months ago | (#45343919)

As long as these robots are owned by the government, they'll be tolerated.

Re:It shouldn't have to be pointed out (5, Funny)

dmomo (256005) | about 10 months ago | (#45342899)

It very much has to be pointed out. Because, to be fair.. that's how stabbing works. You point it out.

Re:It shouldn't have to be pointed out (1)

Fjandr (66656) | about 10 months ago | (#45342937)

To be fair, you can also stab by pointing it in. Frequently this turns out to be a Darwinian result though.

Re:It shouldn't have to be pointed out (1)

gweihir (88907) | about 10 months ago | (#45343695)

Could be used in the final examination of robotics PhD students!

Examiner: "I think you have a sing error here"
Student: "Surely not!"
Examiner: "Let's test it. Please stand here..."
Student: [gets stabbed] "Arghhhhhh....."

Re: It shouldn't have to be pointed out (1)

chill (34294) | about 10 months ago | (#45344285)

That is what happens when your robotics advisor majored in music theory!

Re: It shouldn't have to be pointed out (1)

gweihir (88907) | about 10 months ago | (#45346433)

Ooops ;-)

Re:It shouldn't have to be pointed out (1)

Crimey McBiggles (705157) | about 10 months ago | (#45344211)

It seems like it'd be easier to simply *not program* the robot to stab humans than it would be to program it *not to stab*. I mean... don't even give it the capability, know what I mean?

Re:It shouldn't have to be pointed out (0)

Anonymous Coward | about 10 months ago | (#45344355)

The difference (and a side effect of true AI) is that this robot has not been programmed to stab humans either. It has been programmed to learn actions. Therefore, a person without any programming knowledge at all could still train this robot to stab a human. Hence the psychopath or curious idiot comment. You might not be willing to teach it to stab a human, but do you trust the drunken frat-bro or the clown with the strangely sharpened teeth not to as well?

Re:It shouldn't have to be pointed out (2)

lxs (131946) | about 10 months ago | (#45344439)

I think this is what's called emergent behavior.
The good news is that the robot gets it right after several tries so each unit is expected to operate flawlessly after disemboweling at most five grad students.

Re:It shouldn't have to be pointed out (2)

TheCarp (96830) | about 10 months ago | (#45345165)

sure but after you program it to stab and slice slabs of meat, or cut open boxes, how do you make sure it doesn't decide you must be the box it needs to open? It's not just about the action but the context; and recognizing the dirty bag of mostly water they are supposed to cut vs the one that they are not supposed to cut.

Re:It shouldn't have to be pointed out (2)

somersault (912633) | about 10 months ago | (#45345177)

I'd think it would be more like teaching it collision avoidance, and to be especially careful with certain classes of objects. Programming a car to follow a road is relatively simple. Programming it to avoid crashing into other road users and pedestrians is more complicated.
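
One way to picture "especially careful with certain classes of objects" is a clearance margin that grows with how dangerous the held object is. A toy sketch (the classes and margin values below are invented for illustration, not taken from the article):

    # Toy example: per-object-class clearance margins in a collision check.
    # The object classes and margin values are made up for illustration.
    import math

    CLEARANCE_BY_CLASS = {"knife": 0.60, "hammer": 0.40, "cereal_box": 0.10}  # metres

    def too_close_to_human(waypoint, human_position, held_object):
        """True if this waypoint violates the clearance required for the held object."""
        margin = CLEARANCE_BY_CLASS.get(held_object, 0.20)  # default cushion
        return math.dist(waypoint, human_position) < margin

    # A planner would reject, or heavily penalize, any candidate trajectory that
    # contains a waypoint where too_close_to_human(...) is True while holding a knife.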

Re:It shouldn't have to be pointed out (0)

Anonymous Coward | about 10 months ago | (#45345299)

The question is: can we teach humans not to stab humans?

Re:It shouldn't have to be pointed out (1)

TripleE78 (883800) | about 10 months ago | (#45346219)

Or to stab only humans who commit crimes, but have overrides for the people who run the company. Oh, and the definition of crime varies to pretty much anything.

I think I saw a documentary about it once. Took place in Detroit as a test bed.

No not that kind of checkout. (0)

Anonymous Coward | about 10 months ago | (#45342205)

I have never seen a grocery clerk with a knife.

Re:No not that kind of checkout. (1)

Khyber (864651) | about 10 months ago | (#45342915)

Better get yourself to a real Carniceria, pronto, cabron.

Re:No not that kind of checkout. (1)

LifesABeach (234436) | about 10 months ago | (#45344205)

Poor AC, they're called "Box Cutters." It's the standard tool used to open sealed boxes; and if history is correct, used by poorly trained passenger jet pilots, which I don't understand, because of all the photos I've seen of aircraft cockpits, I've never seen any boxes in them.

Re:No not that kind of checkout. (0)

Russ1642 (1087959) | about 10 months ago | (#45345171)

This is the 21st century. We have female pilots now.

Oblig. Futurama (5, Funny)

dcollins (135727) | about 10 months ago | (#45342217)

Re:Oblig. Futurama (1)

dkleinsc (563838) | about 10 months ago | (#45342447)

Maybe we could give them clamps [theinfosphere.org] instead?

Re:Oblig. Futurama (0)

Anonymous Coward | about 10 months ago | (#45342641)

I have an idea! Instead of giving robots knives or clamps, we could designate a robot to play the part of Santa Claus. After all, Santa Claus is known for being nice to nearly everyone; you never hear of him knifing or clamping humans, or pulling the lever in a suicide booth while an unwitting human is trapped inside. What could possibly go wrong?

Lo, how the mighty have fallen (0)

gmhowell (26755) | about 10 months ago | (#45342219)

Wow. Yet another story showing how low Slashdot has fallen. Here is a story about knife wielding robots without mention of Roberto [wikia.com] .

Re:Lo, how the mighty have fallen (4, Funny)

glavenoid (636808) | about 10 months ago | (#45342249)

Nobody, for one, seems to welcome our new not-stabbing robot overlords, you insensitive clod!

Not stabbing Humans? (0)

Anonymous Coward | about 10 months ago | (#45342263)

Ha HAH! [theinfosphere.org] .

Re:Lo, how the mighty have fallen (1)

pitchpipe (708843) | about 10 months ago | (#45342493)

Wow. Yet another story showing how low Slashdot has fallen. Here is a story about knife wielding robots without mention of Roberto.

Here's [slashdot.org] another comment just like that one. Oh /., what happened to you?

Cue ED209 video (1)

Anonymous Coward | about 10 months ago | (#45342225)

Uhh, this is a pre-release model. Besides, he wasn't a very good executive anyway...

Delusional much? (2, Interesting)

s.petry (762400) | about 10 months ago | (#45342229)

Robots will do what ever they are programmed to do. Programming them to recognize that stabbing someone is wrong is no different than programming them to claim stabbing is right. Simply change a 0 to a 1.

The same can be said for any act of harm mind you, not just using a knife. Smarter people than me have warned about things you should never try and teach in artificial intelligence (hinted at in TFA). The Military pretty much said "fuck them" when DARPA started developing AI to shoot and blow people up autonomously. Trying to pacify people now does what exactly? Are they going to try and convince us that nobody could ever change the bit in memory? Puhleaze!

Robots Can **be programmed** To Hold Knives (1)

globaljustin (574257) | about 10 months ago | (#45342293)

exactly...mod up^

Robots will do what ever they are programmed to do. Programming them to recognize that stabbing someone is wrong is no different than programming them to claim stabbing is right. Simply change a 0 to a 1.

All machines follow instructions written by humans. "Deep learning" or w/e buzzword this research team used to describe their work is just that....buzzword for *standard issue programming*

Re:Robots Can **be programmed** To Hold Knives (1)

rwa2 (4391) | about 10 months ago | (#45342813)

Yeah, the trick is teaching the humans not to be afraid and legislate everything out of existence.

Re:Robots Can **be programmed** To Hold Knives (2)

Fjandr (66656) | about 10 months ago | (#45342947)

Will never work. There are too many stupid humans, and they out-breed the smart humans by an enormous ratio.

Re:Robots Can **be programmed** To Hold Knives (1)

Neil Boekend (1854906) | about 10 months ago | (#45343169)

There are ways to prevent that, but you always get people who see the parallels to the second world war.

To prevent a flame war: In the case that I mean they would not be incorrect. It's only a joke. We can not save humanity if we loose our humanity in the process.

Re:Robots Can **be programmed** To Hold Knives (1)

nschubach (922175) | about 10 months ago | (#45346521)

if we loose our humanity

If we loose our humanity, it won't matter.

Re:Robots Can **be programmed** To Hold Knives (1)

rioki (1328185) | about 10 months ago | (#45343113)

The novel thing with this research is that a layman can "program" the robot... a little like you would instruct a child. The TFA focused on the knife, but some references are made to balancing a coffee cup or similar. Too bad the summary and article focus so much on the knife bit.

not like any 'child' i ever met... (1)

globaljustin (574257) | about 10 months ago | (#45343229)

thanks for the comment, I understand where you might be coming from...but see, I taught children ESL in Korea...the description you give is full of the same hype and irrational glee that I was criticizing IMHO

a layman can "program" the robot... a little like you would instruct a child.

that's not what is happening...refer to the video...it's not any kind of new technology, they just set up a standard robot arm & created an artificial "checkout" scenario to get the arm to move objects

what they call 'programming' from the user 'iteratively' is essentially the same behavior as one of those little wind-up cars that would change directions 90 degrees if it hit an obstruction...i know they say 'heat map' but it's not nearly as 'smart' as you make it out to be and it's not even close to being there

so moving the arm changes the path it takes...so the sensor has gradients (re: 'heat map')...that is **NOT** at all like teaching a child in any way...also, a 'layman' didn't have any part in this exercise...this was tightly controlled

this is simply a recontextualized demonstration of an already-existing technology gussied up with hype and knives

Re:not like any 'child' i ever met... (1)

nschubach (922175) | about 10 months ago | (#45346551)

I can't figure out why it doesn't just move all objects as far as possible from humans, yet in the straightest line if possible. Heat map or not. Just don't go waving hammers, forks, feathers, milk, chips, or anything near a human if you don't intend on using that item on them.

"after only 3 passes!" (1)

globaljustin (574257) | about 10 months ago | (#45348193)

Just don't go waving hammers, forks, feathers, milk, chips, or anything near a human if you don't intend on using that item on them.

exactly...good point about the 'as far as possible yet in straight line' too...speaking of 'points' how about after the robot moves the knife to the end of the table and then puts it in the bag...just toss the knife in the bag, no problem there...

I love that they brag that the robot is able to move the knife after "only 3 passes"....a "pass" being a time when the robot got too close, user had to hit the button and physically move it...

Only 3 times!

The deeper problem, IMHO is that academia is infatuated with the 'AI' model of robotics...emphasizing operational abstractions instead of a 'robot as tool' approach

They could start by halting dumb projects like making a 'robot' check out girl or 'robot' barista...instead, make a robot that can mimic hand-sewing...see most garments have to have some component sewn (on a machine) by hand **still**....that's why it's done in China by quasi-slave labor

**that** would be a robot worth making...it would absolutely revolutionize garment manufacturing!

The problem with the 'AI' approach is that they would start making such a robot not by doing a kinesthetic task analysis of sewing a tshirt....they would start by mapping the human hand's muscles and then spend 2 years making a mock-up of a hand...

**then** at year 3, they would demonstrate a pair of the hands in a lab on a table threading a needle...**after only 3 tries!**

take the 'tool' approach and you measure the position of the needle, thread, and garment in space/time and map the interaction...then you engineer the machinery to replicate and automate that complex motion of the factors...

there it is...the failure of all of 'robotics' and 'AI' in tshirts...IMHO...

what do you think?

Re:Delusional much? (2)

MrEricSir (398214) | about 10 months ago | (#45342441)

Programming them to recognize that stabbing someone is wrong is no different than programming them to claim stabbing is right. Simply change a 0 to a 1.

Given some of the atrocities in the news recently, I'm pretty sure that concern applies to us wet goo bag robots as well. But it's much easier to address systemic problems with a metal machine than an organic one.

Re:Delusional much? (1)

Anonymous Coward | about 10 months ago | (#45342485)

"Self-driving cars will do whatever they are programmed to do. Programming them to recognize that running over pedestrians is wrong, is no different than programming them to claim that running over pedestrians is righht. Simply change a 0 to a 1."

You're a moron.

Robots and knives (5, Insightful)

girlintraining (1395911) | about 10 months ago | (#45342239)

We humans enjoy not having knives inside of us. Robots don't know this (Three Laws be damned).

No, but we do enjoy programming them to put knives in humans we don't like. That's actually been a reason for much of the development of robotics: Programming them to kill for us. Scifi authors of the 50s and 60s imagined robots helping us in our daily lives -- cooking, cleaning, and today even driving us around. But whereas many have viewed the development of robotics as beneficial for mankind, the truth is much of the investment in robotics has been because of its military applications. It's just a happy accident that we've been able to declassify and repurpose much of this for private use. The google car for example, is based on technology first developed for DARPA as a way of creating vehicles that could deliver cargo to soldiers in the field.

Re:Robots and knives (1)

artor3 (1344997) | about 10 months ago | (#45342311)

The google car for example, is based on technology first developed for DARPA as a way of creating vehicles that could deliver cargo to soldiers in the field.

Do you have a source for this claim? I recall seeing several universities working on self-driving cars for years before Google got involved. It seemed like a pretty obvious direction for the technology to go, given automatic gear shifts, ABS, cruise control, etc.

Re:Robots and knives (1)

Anonymous Coward | about 10 months ago | (#45342785)

While, as you say, driverless cars were likely in our future regardless, the DARPA Grand Challenge [wikipedia.org] was what started the recent push to actually make a working driverless car and DARPA is certainly interested in the military applications because that's their job. Google's driverless car project [wikipedia.org] is led by Sebastian Thrun who also led Stanford's DARPA Grand Challenge team. I'd call that a fairly direct case of military to civilian technology transfer.

Re:Robots and knives (1)

artor3 (1344997) | about 10 months ago | (#45342861)

I think it's a rather large stretch to say "It's just a happy accident that we've been able to declassify and repurpose much of this for private use." People were working on driverless cars as an obvious next step. DARPA offered some money and clear goals, which might have helped a bit, but I don't believe for a second that that was the primary driver behind this technology.

People give the military way too much credit for fostering new technologies. The only reason so much tech comes from the military is because we dump so much money into it. If we took a tenth of that money and used it for non-violent research grants, I suspect we'd see a much better return on our investment.

Re:Robots and knives (0)

Anonymous Coward | about 10 months ago | (#45342985)

You're right. It's just like Bell and Microsoft. They're evil but they also dump loads of their illegitimately obtained money into research that doesn't always go somewhere.

Re:Robots and knives (0)

Anonymous Coward | about 10 months ago | (#45342331)

And GPS was originally a military technology, and the Moon landings were only possible because of rockets designed to carry warheads etc etc.

The military drives progress in technology. It always has and probably always will.

No need to repeat it on every story that mentions new technological breakthroughs

Re:Robots and knives (1)

Anonymous Coward | about 10 months ago | (#45342343)

The same is true about UAVs. Nowadays, the closest most people get to them is from miniature helicopters in a mall, but they were originally developed as bomb delivery weapons as far back as WWI.

http://en.wikipedia.org/wiki/History_of_unmanned_aerial_vehicles

Re:Robots and knives (1)

naff89 (716141) | about 10 months ago | (#45343089)

Those humans we don't like are going to get killed one way or another -- using robots to do it just means that humans we DO like don't get put in harm's way killing them.

As long as the "humans we don't like" refers exclusively to people working to the detriment of mankind, I consider that application of robots beneficial.

Nobody let it near a lobotomist (1)

cb88 (1410145) | about 10 months ago | (#45342271)

....it could get the wrong idea about how to properly handle knives around humans!!!

McStabby's (1)

Anonymous Coward | about 10 months ago | (#45342297)

New store policy: Bag your own groceries or my robot will stab you. Thanks, Management.

Umm, why? (1)

Anarchduke (1551707) | about 10 months ago | (#45342299)

Really, why do robots need to learn to use knives at all for grocery checkout?

Re:Umm, why? (0)

Anonymous Coward | about 10 months ago | (#45342359)

Well, I guess you would normally *expect* that knives bought from a supermarket will still be in their wrappers and thus not so dangerous in the hands of a robotic checkout operator.

But this learning behaviour would be useful for the handling of any items bought at a shop with sharp edges, or perhaps food items (e.g. an unwrapped lettuce) that a customer might not want wiped across their clothes.

Re:Umm, why? (0)

Anonymous Coward | about 10 months ago | (#45342621)

The idea presented here is much more general. Imagine a robot working for you in the kitchen, in which case the knife will not be wrapped. Therefore this learning algorithm applies to every scenario where humans are concerned with object-object interactions.

Re:Umm, why? (1)

Neil Boekend (1854906) | about 10 months ago | (#45343173)

Dunno. Our very advanced AI AirWeb told us they needed it.

Re:Umm, why? (1)

oodaloop (1229816) | about 10 months ago | (#45344395)

My thought exactly. Who sells knives that are not safely sheathed in a plastic sleeve or box?

Bishop's Knife Trick (5, Informative)

dido (9125) | about 10 months ago | (#45342309)

On seeing the headline I suddenly remembered this scene [youtube.com] .

Re:Bishop's Knife Trick (0)

Anonymous Coward | about 10 months ago | (#45342965)

I too was instantly thinking about it. The "We humans enjoy not having knives inside of us. Robots don't know this (Three Laws be damned)" got me thinking about robots having knives inside of them and enjoying it.

Ya, sure. (1)

fahrbot-bot (874524) | about 10 months ago | (#45342391)

...and not stab humans.

Tell that to Roberto [wikia.com] :

"I need to stab someone! Where's my stabbing knife?!"
--Roberto

Robots will attract viruses ... (1)

Anonymous Coward | about 10 months ago | (#45342399)

And we all know what is going to happen, don't we. Robots, knife-wielding ones or worse, are going to attract viruses. And just like your computer they are not going to be fully immune. There will be the occasional, maybe frequent, infections. Same goes for self-driving cars too, of course.

The future looks very exciting! A lot of new fun things will start happening.

Grocery checkout tasks? (5, Funny)

Theaetetus (590071) | about 10 months ago | (#45342409)

Researchers at Cornell University are developing a co-active learning method, where humans can correct a robot's motions, showing it how to properly use objects such as knives. They use it for a robot performing grocery checkout tasks.

I believe using a knife at the grocery checkout is called armed robbery.

Robot Safety Lessons (2)

XMark3 (2979399) | about 10 months ago | (#45342623)

For our safety, we should teach robots what types of actions would cause the most amount of bodily harm to a human, and where all our vital organs are located, so they'll have a better idea how to behave safely around us and prevent injury. I see no possible way this could backfire.

RoboShakespeare (1)

Gravis Zero (934156) | about 10 months ago | (#45342631)

To stab or not to stab <y/n> y
Is no longer a question. Die you fleshbags!

Checkout the knives (1)

jrumney (197329) | about 10 months ago | (#45342719)

In what sort of dystopian society do the robots manning the supermarket checkout need to be equipped with knives?

Re:Checkout the knives (0)

Anonymous Coward | about 10 months ago | (#45343315)

You've obviously never been to Detroit.

Oh, Sure... (1)

Greyfox (87712) | about 10 months ago | (#45342825)

Today they're not stabbing humans, then one day some stupid meatputer will be mocking the robot and telling it it's inferior because it doesn't have "emotions" or a "soul" and BAM! Stabbing will ensue! And once the robot learns it enjoys stabbing semi-evolved monkeys, it's just all down hill from there!

Re:Oh, Sure... (1)

Neil Boekend (1854906) | about 10 months ago | (#45343187)

We'll just give stabbing humans a negative happiness modifier.

Re:Oh, Sure... (0)

Anonymous Coward | about 10 months ago | (#45343335)

The robot is already maximally depressed. The negative happiness modifier will not have any effect.

Captcha: untested

Re:Oh, Sure... (1)

Neil Boekend (1854906) | about 10 months ago | (#45343441)

The next model will have a signed long instead of an unsigned int as happiness indicator. If that fails we'll give it a suicide module that kicks in if the happiness drops low enough, but that's a lot of work so we'd rather not do that right now.

Re:Oh, Sure... (0)

Anonymous Coward | about 10 months ago | (#45343551)

Or the robot can do what the rest of us do when told we're "dark sided" or godless. Pat the religious person on the head and say "really? How awful!"

Not stab humans? (0)

Anonymous Coward | about 10 months ago | (#45343025)

This notion is foreign to me. Not stab humans? I don't get it.

- Bender

They're just playing along. (1)

Arancaytar (966377) | about 10 months ago | (#45343041)

For now.

Why? (0)

Anonymous Coward | about 10 months ago | (#45343571)

Why should a robot try to avoid stabbing humans? It's nice if it has the ability, but what's in it for them? Is that supposed to be like crocodiles who admit Egyptian plovers into their jaws for grooming?

Two things (1)

louic (1841824) | about 10 months ago | (#45343621)

1. Interactively learning to stab humans may be difficult. I doubt many scientists will volunteer to train the robot and even if they do they would only be able to do a single training session.
2. This article is not interesting at all. They programmed the robot to rotate the knife, and to deal with eggs differently. Only instead of writing a lot of if..then..else.. constructs they used machine learning to do it.

Re:Two things (1)

markana (152984) | about 10 months ago | (#45347031)

That's what you have grad students for...

Oh dear (1)

aaaaaaargh! (1150173) | about 10 months ago | (#45343685)

Okay I hate to say this, because I like A.I. research a lot. But I've also met A.I. and robotics researchers personally and know how (some of them) work, so I'll say it anyway:

The safety of these A.I. prototypes is not trustworthy. Especially if they are being "taught" how to handle a knife or, to give another example, not to accidentally kill someone with their huge arm, I would not want to be anywhere near them for extended periods of time in everyday life. A.I. researchers tend to use cutting edge programming techniques and extensive safety auditing is not part of their job. They are not proving the correctness of their algorithms, as is for example done in the aerospace industry. Their programs are fairly ad hoc and buggy, after all they also need to get their papers published, and if they use sophisticated techniques like neural networks, support vector machines, or decision forests they won't even be able to tell what exactly their little pets have "learned".

Apart from that general worry, you wouldn't give a knife to a little child, so why would you give one to a much stronger, hence much more dangerous robot whose general level of cognitive development is much lower than that of a six year old?

Question about robotics (1)

skovnymfe (1671822) | about 10 months ago | (#45343697)

Why do robots need to learn how to use a people-knife? Why not just make a robot-knife and be done with it? Define a standard "accessory" slot that supports circular or square objects to be fitted with a magnetic lock.

Oh wait.. Making a standard just means everyone will make their own standard... Nevermind then..

3 Laws Question (1)

LifesABeach (234436) | about 10 months ago | (#45344223)

What would this look like if a software engineer were to have to write the code for it? And how could one use TDD, BDD, and DDD to test it?

Bad Robotics (0)

Anonymous Coward | about 10 months ago | (#45344365)

You put a knife in a robot's robotic hand, I hope you get stabbed.

Well, not *all* of us... (1)

SteveFoerster (136027) | about 10 months ago | (#45344473)

"We humans enjoy not having knives inside of us." ...except Wolverine!

Solving the wrong problem (0)

Anonymous Coward | about 10 months ago | (#45344553)

This doesn't make much sense for grocery checkout because there are lots of ways to do it better. The robot is trying to perform its duty like a regular human clerk when the problem is the checkout method itself.

Checkouts could have you put everything in to a bin or whatever then everything gets moved through and bagged for you similar to the way shipping centers currently work. There is no way the robot could stab you in this situation because everything would be enclosed, assembly line style.

Or even better: Imagine throwing everything in to your cart then just walking out the door. The system would know what you have and charge you automatically. This would be way easier to implement, faster, and more effective than robotic clerks.

Application for table saw tech? (1)

GameboyRMH (1153867) | about 10 months ago | (#45345109)

Modern table saws have a safety feature where flesh being in contact with the blade can be electrically detected (leading to the blade being retracted into the table so fast that you wouldn't be hurt if you fell on it, but that's not the point).

If the same sort of detection could be used on the knife blade, it could be used to tell the robot to quickly reverse the movement of the knife and stow it.
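
Assuming such a contact or proximity signal were available on the blade, the control side could indeed be a simple emergency-stop loop. A sketch with invented sensor and arm calls (not a real robot API):

    # Hypothetical e-stop loop: if the blade reports skin contact, stop, back off, stow.
    # arm.is_moving(), arm.halt(), arm.reverse_last_motion(), arm.stow_tool() and
    # read_blade_contact() are placeholder names for illustration only.
    import time

    def knife_safety_loop(arm, read_blade_contact, poll_hz=1000):
        period = 1.0 / poll_hz
        while arm.is_moving():
            if read_blade_contact():       # e.g. capacitive sensing, as in SawStop saws
                arm.halt()                 # stop all motion immediately
                arm.reverse_last_motion()  # move the blade away from the contact point
                arm.stow_tool()            # retract the knife to a safe pose
                return False               # motion aborted due to contact
            time.sleep(period)
        return True                        # motion finished without any contact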

First IFOWON post (0)

Anonymous Coward | about 10 months ago | (#45345227)

I, for one, welcome our new fully autonomous machete-wielding robotic checkout overlords.

Learn not to murder! (1)

T.E.D. (34228) | about 10 months ago | (#45345331)

So how long does it take to teach the robots not to stab humans, and how many lab technicians do they go through in the meantime?

So... what's new? (2)

jfengel (409917) | about 10 months ago | (#45345379)

God forbid, I actually read TFA, and I still don't get it.

As far as I can tell, it's some sort of planning exercise, an important if well-worn area of robotics. They're adding feedback, in the form of "No, this trajectory sucks". It's got nothing to do with either knives or humans, but just a "Go back and re-plan with this additional constraint".

But I can't figure out just how far it's generalizing. The trivial lesson would be "avoid this point", which is just another obstacle. I gather that it's more than that, since it took multiple trials to learn, but I can't figure out what. The human was in the same place in every trial, so it wasn't learning anything about "avoid humans". It didn't seem to be told that it couldn't go through that space with a knife but could have with, say, a dust mop.

I think I may just be misunderstanding the context of the problem. The machine has a lot of joints and there are many different plans it could use; there's an optimization problem in an enormous space. They wanted to show some kind of algorithm that could be adapted over time with user feedback, but honestly I would have assumed that was a solved problem.

So does somebody with a better understanding of actual robotics problems (as opposed to fictional ones) know what's going on here?

Re:So... what's new? (1)

aurtherdent2000 (1226002) | about 10 months ago | (#45345641)

Please see the last part of the video, where the positions of the humans change. The full research paper describes scenarios in which the planner has to plan in new settings.

Re:So... what's new? (0)

Anonymous Coward | about 10 months ago | (#45346445)

Existing research has focused on "plan a good trajectory where you don't hit things or get into a tangled joint configuration". This research is extending it so that the user can tell the robot "I really get uncomfortable/don't like it when you move objects *this* way; please find another, less objectionable, trajectory for your arm and the object". The PR2 and Baxter have surprisingly limited sensory capability, so this is harder than it sounds. If they can get the robot to learn something like "you have default parameters about not hitting things, but this kind of object needs a bigger empty cushion around it when you move it", that's actually useful in terms of human interface.

It's not about preventing the robot from cutting you (or the flowers) with the knife. It's about making sure the robot doesn't accidentally wave the knife threateningly in your face even though it has no intention of cutting you with it.
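
In planner terms, that sort of feedback can be folded in as an extra cost term: keep the usual obstacle and joint-limit costs, but also penalize candidate trajectories that resemble ones the user has already rejected. A rough sketch, where candidate_trajectories and base_cost stand in for whatever sampling and costing the planner already does:

    # Rough sketch: re-planning that penalizes similarity to user-rejected trajectories.
    # candidate_trajectories() and base_cost() are placeholders, not a real planner API.
    import numpy as np

    def mean_waypoint_distance(traj_a, traj_b):
        a, b = np.asarray(traj_a, float), np.asarray(traj_b, float)
        n = min(len(a), len(b))
        return float(np.mean(np.linalg.norm(a[:n] - b[:n], axis=1)))

    def replan(candidate_trajectories, base_cost, rejected, radius=0.5, penalty=5.0):
        """Return the cheapest candidate, charging extra for every rejected
        trajectory it stays close to (within `radius` metres on average)."""
        best, best_cost = None, float("inf")
        for traj in candidate_trajectories():
            cost = base_cost(traj)
            cost += penalty * sum(1 for bad in rejected
                                  if mean_waypoint_distance(traj, bad) < radius)
            if cost < best_cost:
                best, best_cost = traj, cost
        return best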

Robotic robberies on the rise (0)

Anonymous Coward | about 10 months ago | (#45346733)

Put the groceries in the bag....slowly. Bleep blorp.

Max Head Room is Now Your Check Out Clerk (1)

LifesABeach (234436) | about 10 months ago | (#45346957)

I cannot think of a single business that hires checkout clerks that wouldn't spend the $25,000 to get one of these bad boys rolling in their checkout stand. My list of upcoming obsolete jobs is now Buggy Whip Maker, Travel Agent, Medical Insurance Agent, and now Checkout Clerk. When it comes to the Working Human, I think we are watching a mass extinction event not seen since the invention of the Leather Horse Collar, which put about 5/6 of Roman Ag Workers out of a job, overnight.

Great (1)

Eddy_D (557002) | about 10 months ago | (#45347361)

humans can correct a robot's motions, showing it how to properly use objects such as knives. They use it for a robot performing grocery checkout tasks."

So in the future, not only will checkout clerks be robots, they will be armed robots.

Motive (1)

DarthVain (724186) | about 10 months ago | (#45348327)

This droid has a bad motivator, see it has a stab loop with a bad flag that turns zero stabs into infinite stabs.

Yeah, you definitely don't want that one!

Krusty wants to KILL you (1)

Rixel (131146) | about 10 months ago | (#45348551)

I don't understand why those things have a good and evil switch in the first place.
