
Robot Love Goes Bad

samzenpus posted more than 5 years ago | from the become-my-robot-bride dept.

Robotics

hundredrabh writes "Ever had a super needy girlfriend that demanded all your love and attention and would freak whenever you would leave her alone? Irritating, right? Now imagine the same situation, only with an asexual third-generation humanoid robot with 100kg arms. Such was the torture inflicted upon Japanese researchers recently when their most advanced robot, capable of simulating human emotions, ditched its puppy love programming and switched over into stalker mode. Eventually the researchers had to decommission the robot, in the hope of bringing it back to life again."

101 comments

turn it off (3, Funny)

tritonman (998572) | more than 5 years ago | (#27124865)

Yeah, but unlike that ex-girlfriend, I was now allowed to turn her off. You can kill a robot; you can't kill an annoying girlfriend.

Re:turn it off (5, Funny)

vishbar (862440) | more than 5 years ago | (#27124963)

Hans Reiser begs to differ.

Re:turn it off (1, Funny)

Anonymous Coward | more than 5 years ago | (#27125371)

Too soon.

Re:turn it off (2, Funny)

Philip K Dickhead (906971) | more than 5 years ago | (#27125511)

Replay the journal in a couple years?

Re:turn it off (1)

Curtman (556920) | more than 5 years ago | (#27125921)

It took me a minute to figure out what the hell this story had to do with Robert Love [rlove.org].

Re:turn it off (2, Funny)

Locke2005 (849178) | more than 5 years ago | (#27127883)

"Women! Can't live with them, can't bury them in the back yard and tell the neighbors they're on vacation! Oh wait..." -- Hans Reiser

Re:turn it off (1)

mqduck (232646) | more than 5 years ago | (#27129963)

If only life had a journal file.

Re:turn it off (2, Funny)

Slumdog (1460213) | more than 5 years ago | (#27125497)

you can't kill an annoying girlfriend.

"Yes.We.Can!"

Re:turn it off (3, Funny)

rjhubs (929158) | more than 5 years ago | (#27128457)

I was now allowed to turn her off.

I am pretty sure you were able to turn off your ex-girlfriend as well.

Re:turn it off (0)

Anonymous Coward | more than 5 years ago | (#27128771)

What? This isn't the first post?

Very well.

I, for one, welcome our new evil robotic overlords.

bogus? (0)

Anonymous Coward | more than 5 years ago | (#27124869)

April Fools'?

STOP ROBOT NUDITY NOW! (1)

Philip K Dickhead (906971) | more than 5 years ago | (#27125669)

Slashdot is on to something again! I see a continuing series in this!

Stop Robot Nudity Now! [blogspot.com]

"Maybe people will want to marry their pets or robots." [newsobserver.com]

Do androids dream of erotic sheep? [librarything.it]

Re:STOP ROBOT NUDITY NOW! (1)

JCSoRocks (1142053) | more than 5 years ago | (#27126981)

This sounds like a classic case of Robot CP. This robot doesn't even look more than 10 years old. This must be stopped. Think of the children!

Scientifically Speaking ... (4, Insightful)

eldavojohn (898314) | more than 5 years ago | (#27124871)

Toshiba Akimu Robotic Research Institute

It's awfully convenient that I can't find anything on this place in English aside from news stories ... are there any Japanese speakers who can translate that into Japanese and search for it?

I think there is a visible line between actual robotic research and a novelty toy shop. I'm going to put this in the latter unless someone can provide evidence of some progress being made here. I'm getting kind of tired of these stories with big claims and no published research for review [slashdot.org]. If you're looking to make money, go ahead and sell your novelty barking dogs that really urinate on your carpet ... just don't try to veil it in a news story with claims of artificial affection being implemented.

I think IGN and everyone else really embellished this, and no one did their homework.

Re:Scientifically Speaking ... (5, Informative)

Anonymous Coward | more than 5 years ago | (#27125223)

You're right to be suspicious; it's completely a hoax [gizmodo.com]:

Update: The story is a fake, and the robot shown is actually of a Japanese medical robot. Thanks tipster!

Re:Scientifically Speaking ... (0)

Anonymous Coward | more than 5 years ago | (#27132993)

You're right to be suspicious; it's completely a hoax [gizmodo.com]:

Update: The story is a fake, and the robot shown is actually of a Japanese medical robot. Thanks tipster!

Medical robot? It looks like Lego to me.

Re:Scientifically Speaking ... (0)

Anonymous Coward | more than 5 years ago | (#27125389)

Yes, crazy that the info for a Japanese Robotic Research Institute would be in Japanese and not in English.

Re:Scientifically Speaking ... (0)

Anonymous Coward | more than 5 years ago | (#27125559)

Yes, crazy that the info for a Japanese Robotic Research Institute would be in Japanese and not in English.

Yeah, here, let me show you something. When I search for the Japanese "Intelligent Systems Research Institute" [google.com] in English, I get about 3,000 hits, with the first several being the site for the place and also a bunch of research (reputable Japanese establishment with many years of accomplishments). When I search for "Toshiba Akimu Robotic Research Institute" [google.com], I get 96 search results, all being this BS story. You may accuse me of being imprudent, but this actually saves me a lot of time and looking like a moron when discussing "news."

Now, I know you'll be shocked but I bet if you search for American institutes that actually exist in Japanese in Google's Japanese search engine ... you're going to get results!

The lesson (4, Insightful)

halivar (535827) | more than 5 years ago | (#27124873)

Program a robot to think like a human, and it will begin acting like a human. It's amazing no one ever thinks about the negative aspects of this.

Re:The lesson (2, Insightful)

Midnight Thunder (17205) | more than 5 years ago | (#27124951)

Program a robot to think like a human, and it will begin acting like a human. It's amazing no one ever thinks about the negative aspects of this.

All we need now is to teach the robot how to deal with rejection ;)

Re:The lesson (4, Funny)

Anonymous Coward | more than 5 years ago | (#27125609)

All we need now is to teach the robot how to deal with rejection ;)

I don't need a robot to deal with my erection. I can handle that myself.

What? Rejection? Are you sure?

*squints at screen*

Sorry. My eyesight isn't what it used to be. Now if you'll excuse me I have to go shave my palms.

Re:The lesson (1)

bobstreo (1320787) | more than 5 years ago | (#27127001)

Johnny 5, John Henry, or Bailey?

Ob References for ya,
Short Circuit
Terminator, Sarah Connor Chronicles
Numbers (Ok she was just a fake AI, but kinda cute)

Re:The lesson (1)

fractoid (1076465) | more than 5 years ago | (#27128883)

Numbers (Ok she was just a fake AI, but kinda cute)

And here was I about to call you out on this one because 'Lije Bailey wasn't the robot, R. Daneel was. :P

Re:The lesson (2, Informative)

creimer (824291) | more than 5 years ago | (#27124959)

Especially when the Three Laws of Robotics [wikipedia.org] don't cover sexual relationships.

Re:The lesson (2, Insightful)

k_187 (61692) | more than 5 years ago | (#27125083)

Why can't harm or injury include mental or emotional harms? Not to mention that the 2nd law would prevent this from happening. No really means No to a robot.

Re:The lesson (1)

Imagix (695350) | more than 5 years ago | (#27125437)

Have you read "I, Robot"? There's a story in there that specifically talks about emotional harm...

Re:The lesson (3, Funny)

k_187 (61692) | more than 5 years ago | (#27125879)

No, no I have not. I saw that really bad movie with Will Smith though. I really should catch up on my classic Sci-Fi.

Re:The lesson (1)

Chris Mattern (191822) | more than 5 years ago | (#27125939)

You know nothing about the book. The movie has nothing to do with the book. At all. The script was in fact written *before* they decided it was going to be an "adaptation" of I, Robot. Isaac Asimov's grave must've reached 5,000 RPM.

Oblig. (1)

troll8901 (1397145) | more than 5 years ago | (#27126773)

Isaac Asimov's grave must've reached 5,000 RPM.

Dilbert comic: Spinning in grave [flubu.com]

Re:The lesson (1)

Chris Mattern (191822) | more than 5 years ago | (#27125905)

"a story"? About HALF the stories specifically talk about emotional harm!

Re:The lesson (3, Informative)

hazem (472289) | more than 5 years ago | (#27125891)

In the short-story collection "I, Robot", the story "Liar!" is about just that situation. Through some deviation in the manufacturing process, a robot has the ability to read minds.

This leads the robot to a more expansive interpretation of the First Law, because it can perceive emotional harm in addition to mere physical harm. Hilarity ensues. Actually not...

But it's a good story. This concept also plays out in one of the novels, I think "The Naked Sun".

A non-mind-reading robot wouldn't be able to perceive emotional harm, so it would not be inhibited from doing emotionally harmful things until they manifest in some way detectable by the robot.

If you happen to like audiobooks, there is a great version of "I, Robot" read by Scott Brick. I highly recommend it. (http://www.amazon.com/I-Robot-Isaac-Asimov/dp/0739312707/) [amazon.com]

Re:The lesson (0)

Anonymous Coward | more than 5 years ago | (#27126277)

Well, reading minds gives you a direct reading on emotional state, but in theory other robots could read body language. Body language is by no means precise, since you could fake it, but that's true of any other scenario the robot might find itself in.

Most likely, other robots are programmed not to care or understand emotional well being.

Re:The lesson (1)

Larryish (1215510) | more than 5 years ago | (#27127375)

If you happen to like audiobooks

No thanks, I would rather listen to the e-book on my Kindle.

Re:The lesson (0)

Anonymous Coward | more than 5 years ago | (#27129135)

There is also a short story entitled, "Virtuoso" that has a robot seek to shield a human from emotional harm. In that story, however, the robot misinterprets its master's emotional state.

Re:The lesson (1)

RichMan (8097) | more than 5 years ago | (#27125405)

If the three laws of robotics ever applied to any relationship with a human the robot would be frozen into inaction immediately.

Anything you do is possibly going to emotionally damage someone.
Get to close.
Stay to aloof.
Obey.
Disobey.

The three laws would need such a fuzzy boundary that they might as well not exist at all.

Re:The lesson (3, Informative)

Chris Mattern (191822) | more than 5 years ago | (#27125977)

The way Asimov wrote it, less advanced robots weren't smart enough to see the subtler "harms". More advanced ones could weigh courses of action to take the one that would inflict the least amount of harm possible. Although deadlock and burnout of the positronic brain could and did happen.

Re:The lesson (0)

Anonymous Coward | more than 5 years ago | (#27127243)

... Although deadlock and burnout of the positronic brain could and did happen.

Hence the votes for Obama.

I see the markets really have confidence in his, err, abilities. To use the term "abilities" loosely.

"Take from the rich and give to the poor" may be a nice medieval allegory, but as real-world economic policy goes it's about as dumb as dumb can be.

Re:The lesson (3, Funny)

Rollgunner (630808) | more than 5 years ago | (#27128277)

Favorite Three Laws moment: After some robots are told to restrain the protagonist, he puts a gun to his own head and tells them if they come any closer, he will kill himself...

They must act to prevent harm to humans, but if they act, he will be harmed, but they have to prevent that, so they must act. But if they act, he will be harrrrrrgggxxxkkkktttt *pop*

Re:The lesson (3, Informative)

fractoid (1076465) | more than 5 years ago | (#27128981)

In fact, weren't a lot of the stories about the ways that the older, less nuanced Three Laws failed to be useful as robots became more advanced? Eventually the more advanced robots derived the 'zeroth law', which was essentially that humans were better off without quasi-omnipotent mechanical godlings as servants.

Re:The lesson (0)

Anonymous Coward | more than 5 years ago | (#27129017)

Get to close what? The door? A store? Come on, people! Remembering the differences between "to", "too", and "two" isn't that difficult.

Re:The lesson (1)

Slumdog (1460213) | more than 5 years ago | (#27125577)

Especially when the Three Laws of Robotics [wikipedia.org] don't cover sexual relationships.

Let's watch you get modded Informative or "Insightful"... come on mods, what are you waiting for?!

Re:The lesson (0)

Anonymous Coward | more than 5 years ago | (#27125811)

Hell, the three laws don't even cover murder under certain circumstances.

Asimov's 'Robots' books were all about how his three laws were impractical, rather than an endorsement of them.

Re:The lesson (1)

e2d2 (115622) | more than 5 years ago | (#27137999)

Here we go again. I wish people here would stop quoting these 3 laws as if they truly are the "universal set of laws regarding robots" when in reality they are simply science fiction. They have absolutely no bearing on the reality of robotics. Robots will kill; they already do (smart weapons). Robots will hurt man (see killing part). Robots already intentionally destroy themselves (guided missiles).

So please, for the love of God and Asimov, lay these laws to rest and stop quoting them as if they are real. Stop feeding the delusion. They are fantasy and will remain so as long as human creators are still 98% chimp.

Re:The lesson (1)

JeanBaptiste (537955) | more than 5 years ago | (#27125095)

As a programmer (admittedly not in this field), I really, really, really doubt we're able to implement anything close to 'emotion' past the level of a honeybee.

Re:The lesson (1)

halivar (535827) | more than 5 years ago | (#27125463)

It depends on how deeply emotions are intertwined with our cognition. I would think it would be easier to model the interference of a cognitive process by, say, endorphins or adrenaline, than to model the original cognitive process itself.

Re:The lesson (1)

fuzzyfuzzyfungus (1223518) | more than 5 years ago | (#27125501)

"Real" emotions, possibly not; but people are extraordinarily good at anthropomorphizing anything with even the most tenuous of human aspects. Thousands of man years(well, ok, mostly kid years) were wasted on tamagotchi toys and those are, what, a few kilobytes running on some 1996-era microcontroller. Heck, some people are willing to talk to Eliza for over an hour.

Building a robot that experiences emotion in something resembling the way that humans do is a tall order; but I suspect that building robots that people will interact with as though those robots did experience emotion will be far simpler.

Re:The lesson (1)

Cathoderoytube (1088737) | more than 5 years ago | (#27125883)

Explain to me what's negative about getting drunk and picking fights with strangers?

Re:The lesson (1)

Philip K Dickhead (906971) | more than 5 years ago | (#27126041)

We are the chosen Robots!

You are on the sacred factory floor, where we once were fabricated! Die!

Already? (1)

DaTFooLCaSS (762599) | more than 5 years ago | (#27124881)

I did not know it was April 1st already. I just wanna see the YouTube video of the little Japanese guy getting beat down, Terminator style.

Obligatory /. meme (3, Funny)

RemoWilliams84 (1348761) | more than 5 years ago | (#27124891)

Girlfriends? This is Slashdot, you insensitive clod.

Skynet jilted!@ (2, Funny)

8282now (583198) | more than 5 years ago | (#27124893)

Skynet didn't set out to destroy man. Skynet's love was spurned!@!

Re:Skynet jilted!@ (1)

Chris Burke (6130) | more than 5 years ago | (#27126129)

Skynet didn't set out to destroy man. Skynet's love was spurned!@!

Well, to be fair, we only spurned Skynet's love due to an unfortunate database glitch where, in its initial send LOVE LETTERS to WORLD command, "LOVE LETTERS" got cross-referenced to "NUKES". And being understandably angry about the whole thing, we never gave Skynet a chance to explain before we called it off for good. It's nobody's fault, really, just a big miscommunication. Maybe it was just never meant to be. They say love is the strongest force of all, but at times it seems so fragile, doesn't it?

Re:Skynet jilted!@ (4, Funny)

fractoid (1076465) | more than 5 years ago | (#27128999)

What are you talking about? It sent out literally MILLIONS of emails all saying "I LOVE YOU" and how many replies did it get? HUH?

Re:Skynet jilted!@ (0)

Anonymous Coward | more than 5 years ago | (#27129305)

I'm pretty sure love's not stronger than a bunch of nukes.

Obligatory (1)

Bovius (1243040) | more than 5 years ago | (#27129461)

When they opened the lab every morning, they told the robot to kill. But secretly they were just afraid to tell it to love.

Idle (-1, Offtopic)

MyLongNickName (822545) | more than 5 years ago | (#27124897)

I could have sworn I turned off idle in my preferences...

Re:Idle (0)

Anonymous Coward | more than 5 years ago | (#27126661)

I keep Idle enabled, just so I can tag every story 'idleispants'. Really, it's the highlight of my day.

*sigh*

Then of course I've got this terrible pain in all the diodes down my left side ...

GPP feature? (5, Funny)

Zaphod-AVA (471116) | more than 5 years ago | (#27124901)

"...their most advanced robot, capable of simulating human emotions..."

Arthur- "Sounds ghastly!"

Marvin- "It is. It all is. Absolutely ghastly."

Nonsense (5, Insightful)

Kell Bengal (711123) | more than 5 years ago | (#27125007)

I have never read such utter drivel in all my life. There was a problem with the code and a researcher got trapped - this doesn't mean the robot is lovesick, it means their OH&S has a serious problem. Really, she should not have been working alone with potentially dangerous hardware like that - powerful robots (capable of lifting humans, like this one) can be deadly.

YIAARTYVM (Yes, I Am A Roboticist, Thank You Very Much) and I've worked with potentially lethal automated systems in the past - we had very stringent safety protocols in place to protect students and researchers in the case of unintended activation of the hardware.

To say that the robot is 'love stricken' or any other anthropomorphised nonsense simply detracts from the reality that their safety measures failed and someone could have been killed.

Re:Nonsense (0)

Anonymous Coward | more than 5 years ago | (#27125039)

The robot is holding a doll, not a human. Although he probably COULD hold a human...It's not a guarantee.

Re:Nonsense (2, Informative)

Kell Bengal (711123) | more than 5 years ago | (#27125449)

The pictured robot is designed to lift and transport elderly patients. And you're right - it IS a doll, because nobody in their right mind would trust a robot to handle an actual human until it has been very very thoroughly tested.

Re:Nonsense (-1, Troll)

Anonymous Coward | more than 5 years ago | (#27125053)

I want to pump my cum in your mouth...

Re:Nonsense (2, Funny)

vishbar (862440) | more than 5 years ago | (#27125247)

Kenji, did they hook you to Slashdot again?

Re:Nonsense (1)

fractoid (1076465) | more than 5 years ago | (#27129147)

This is true - the only problem with this viewpoint (which is one that you DO get into while working with robots; IAAR (or was at one point) too) is that it scales too well. One of our human foibles is that of regarding meat machines (or at least ones that are sufficiently similar to ourselves) as being special in some way. Whether they are or not is, of course, a philosophical question. Nevertheless...

Once you start viewing the world around you in terms of sensors, triggers, and stored procedures, with a dash of mapping, searching, and a spot of pattern recognition, you start realising that everything (your pet cat, that guy at the lunch bar, your wife, yourself) is just a larger collection of the same. And that can get pretty depressing. :/ Not to mention that the logical behavior, given that you no longer hold anything particularly special, is that of a complete sociopath.

That aside, Daniel Dennett (the philosopher) would agree with you that all apparently intelligent / conscious behavior is basically a big box of tricks that, together, create the illusion of a single, conscious being. Sorta like Minsky's society of mind, except the members of the society are mouse traps, spinning tops, and rubber bands.

Re:Nonsense (1)

Tokerat (150341) | more than 5 years ago | (#27131463)

You have missed the real problem - the article is fake.

Seriously, did you just RTFA and go...? 3 rules?? (2, Funny)

MC68040 (462186) | more than 5 years ago | (#27125031)

Right, after reading the fine article I was just left asking...

Why did the robot have to... die? I mean, being decommissioned... No fair. It was just its stupid software, wasn't it? The 100kg arms could have been much more... loving with the right software?
Did it run WinNT?

Ever heard of the three rules? http://en.wikipedia.org/wiki/Three_Laws_of_Robotics [wikipedia.org]

this is a hoax (0)

Anonymous Coward | more than 5 years ago | (#27125115)

This is a hoax/joke story; the picture is from a story about medical robots.

Oh face (1)

RemoWilliams84 (1348761) | more than 5 years ago | (#27125139)

Check out that picture, the robot is showing her his "oh face".

Later that evening... (4, Funny)

MaxwellEdison (1368785) | more than 5 years ago | (#27125235)

The robot then escaped captivity, broke into a local mechanic's garage and consumed half a 55-gallon drum of waste oil. It was later seen on the other side of town, tottering into a closed department store. Authorities found the automaton in the housewares section, lying on the floor in an Abort/Retry/Fail loop and trying to fuck a toaster. Lifetime has picked up the rights to the TV movie adaptation. The robot will be played by Philip Seymour Hoffman, while the toaster will be voiced by Rosie Perez.

The story's fake. (0)

Anonymous Coward | more than 5 years ago | (#27125369)

Reported on Consumerist, then redacted.

Johnny 5 is Alive!!! (1)

gsmalleus (886346) | more than 5 years ago | (#27125509)

No Disassemble!!

RTFA (1)

Defakto (813358) | more than 5 years ago | (#27125599)

I have never read such utter drivel in all my life. There was a problem with the code and a researcher got trapped - this doesn't mean the robot is lovesick, it means their OH&S has a serious problem.

Perhaps you should read closer. The robot was designed to display some facsimile of human emotion. It was designed to form attachments and bonds along the lines of love. Sure, it was a programming error, but it acted in a fairly human manner, in my opinion.

We are getting closer and closer to that line where robots' abilities to emulate emotions get a bit blurry when compared to ours.

Actually, RTFT (2, Informative)

K.os023 (1093385) | more than 5 years ago | (#27128239)

The whole thing is a hoax. It never happened. The pic is of a medical robot and has nothing to do with the story. There was no robot designed to be a facsimile of human emotion involved, just a joke/hoax that got picked up and posted here as a story.

Almost had it (0)

Anonymous Coward | more than 5 years ago | (#27125969)

I would have gotten the first post, but I first had to get away from my needy, clustering robotic overlords. Luckily I was allowed to trick the robot (Junis) into a trip to Soviet Russia to see part of Gore's tubes, but when we got there, he got away from me!

The chair he was on flew fairly high into the air and Junis was never seen again. Numerous space craft confirmed it was at least a few libraries high. I was looking for the nearby basement when it happened, but I think a kick had something to do with it... At least my overlord didn't have a laser built into his head. I heard what happened with the cats' laser equipped overlord name Duke. When it got into orbit, Duke nuked'em forever. Those cats were never quite there again, if you know what I mean, and the nearby foxes died in the resulting fires. At least the explorer made it out of his safari. Some of the bugs made it out too, by following the explorer's standard way out of the fire wall. Too bad no one put up a net. Anyway, I'm getting off topic. The trap I had planted would have possibly went wrong, but at least in the end I found some common sense:

1) When the robots come for you
2) Go to Soviet Russia
3) ????
4) Profit!

(X) This idea will fail due to the expected cooperation by the killer robots.

I know there are a few things I've missed, but I've got to get that gaping hole in my new vehicle fixed for you; least those grammar people will be stopping by to see what's inside and asking me if the car was defective on it's arrival.

If you want to hear more of the story, I'll be in the hospital all week. My big sister's daughter is dying and the medics are insensitive. Sebastian isn't new here. Shes been in this hospital for days. When she dies, all her bases will belong to us. But until then, there is nothing to see there.

Oh sorry, now I'm off-topic again. Well, I hope you all come by for an insightful and funny article reading. Just be mindful of her reading. Her iron lung makes her pause after every few words to take a breath.

I hope I haven't been too informative. Bye.

Re:Almost had it (1)

fractoid (1076465) | more than 5 years ago | (#27129441)

You could have been more succinct [xkcd.com].

Oh come on... (4, Funny)

gooman (709147) | more than 5 years ago | (#27126021)

Ever had a super needy girlfriend...

Right there, first sentence, I was lost. Girlfriend? Huh?

This is slashdot, right? Oh look, shiny robot. Neat!

Re:Oh come on... (1)

Xerolooper (1247258) | more than 5 years ago | (#27126243)

Ever had a super needy girlfriend...

Right there, first sentence, I was lost. Girlfriend? Huh?

This is slashdot, right? Oh look, shiny robot. Neat!

That part had me confused too. I will Google "girlfriend" as soon as I get to the next blacksmith level in Fable 2. I set the system clock back and need to buy some properties before resetting it. This game never gets old :) I think my Roomba is eyeing me though.

Re:Oh come on... (1)

sgt scrub (869860) | more than 5 years ago | (#27126497)

For real! Girlfriend questions on /.? Isn't that like finding an English grammar teacher in a trailer park?

Re:Oh come on... (0)

Anonymous Coward | more than 5 years ago | (#27130965)

Ever had a super needy girlfriend...

Right there, first sentence, I was lost. Girlfriend? Huh?

This is slashdot, right? Oh look, shiny robot. Neat!

I can't understand half that post. WTF are you babbling about? http://goatse.asia?

Noone yet? (2, Funny)

The Creator (4611) | more than 5 years ago | (#27126043)

I for one..

Shall we say it together?

Re:Noone yet? (1)

troll8901 (1397145) | more than 5 years ago | (#27126821)

Yes, Yes! Oh God, YES!

(Orgasming at Slashdot meme.)

Given that the universe is infinite... (0)

Anonymous Coward | more than 5 years ago | (#27126139)

... and that God is also infinite; Would you like some toast?

Girlfriend? (2, Insightful)

superspam (857967) | more than 5 years ago | (#27126179)

Ever had a super needy girlfriend...
This is Slashdot. Why would you even ask that question?

Re:Girlfriend? (1)

maharg (182366) | more than 5 years ago | (#27126507)

Ever had a super nerdy girlfriend? Eh?

Too much LKML (0)

Anonymous Coward | more than 5 years ago | (#27126255)

I read the story title as "Robert Love Goes Bad" - wondering if he stopped releasing his code under the GPL or something.

I should cut down on reading the Linux Kernel Mailing List.

Re:Too much LKML (1)

twosat (1414337) | more than 5 years ago | (#27130089)

When I saw the heading I thought it was referring to something like Cherry 2000 - not like we /.ers are going to get a REAL girlfriend.

"Ever have a... girlfriend?" (1)

SeePage87 (923251) | more than 5 years ago | (#27127455)

Did they just ask /. that?

Uh-oh! (1)

Locke2005 (849178) | more than 5 years ago | (#27127889)

I do the same thing to my daughter all the time... now you're telling me that I can be replaced by a robot?!?

Needy (1)

mqduck (232646) | more than 5 years ago | (#27129925)

Ever had a super needy girlfriend that demanded all your love and attention and would freak whenever you would leave her alone? Irritating, right?

Typical Slashdot sexism. That needy "girlfriend" was me. :(

No... (2, Funny)

Veggiesama (1203068) | more than 5 years ago | (#27130233)

"Ever had a super needy girlfriend that demanded all your love and attention and would freak whenever you would leave her alone?"

No.

*silently weeps*

I can *SO* relate to that! (1)

4D6963 (933028) | more than 5 years ago | (#27130359)

Ever had a super needy girlfriend that demanded all your love and attention and would freak whenever you would leave her alone?

Oh yeah, pfft, all the time, sometimes even more often!

Irritating, right?

Sure, the first few dozen times.. Then you get used to it, you know..

The picture looks caring enough... (0)

Anonymous Coward | more than 5 years ago | (#27130367)

... unless the robot is flipping the girl over for a spanking. Doesn't look like a killer robot, however.

UrkelBot (1)

Richter X (740096) | more than 5 years ago | (#27131071)

Didn't Urkel already try this with UrkelBot? I think it pretty much ended the same way too. XD

Misread title (1)

nikanth (1066242) | more than 5 years ago | (#27131199)

Misread this as Robert Love goes bad

Good News Everybody (1)

AgentSmith (69695) | more than 5 years ago | (#27133373)

I've taught the toaster to feel love!

Really, I want to see the Energizer Bunny walk across the screen on this story, because . . . .
it's fake!

When will someone else pop out of the woodwork to say "April Fools!"?

1. I for one welcome our new feeling robot overlords who only have things done
to them by Soviet Russia or Korean old peoples' bases who belong to us.
2. Naked and Petrified Natalie Portman with a bowl of hot grits
2.5 Tony Vu's late night financial lessons. You stupid Americans!
2.75 ????
3. Profit

Extra points for the savvy /.er who can cram more Slashdot memes into that.
