
Chatbot Eugene Wins Biggest Turing Test Ever

Soulskill posted more than 2 years ago | from the it'd-be-easy-to-win-over-xbox-live dept.


An anonymous reader writes "Eugene Goostman, a chatbot imbued with the personality of a 13-year-old boy, won the biggest Turing test ever staged on 23 June, the 100th anniversary of the birth of Alan Turing. Held at Bletchley Park near Milton Keynes, UK, where Turing cracked the Nazi Enigma code during World War 2, the test involved over 150 separate conversations, 30 judges, 25 hidden humans and five elite, chattering software programs. 'Thirteen years old is not too old to know everything and not too young to know nothing,' explains Eugene's creator, Vladimir Veselov."


235 comments


First post (-1)

Anonymous Coward | more than 2 years ago | (#40459521)

Am I real or simply machine?

Re:First post (1, Funny)

insertwackynamehere (891357) | more than 2 years ago | (#40459543)

You must be a machine. No human could ever get first post like this.

Re:First post (-1)

Anonymous Coward | more than 2 years ago | (#40459559)

Yes.

Re:First post (-1)

Anonymous Coward | more than 2 years ago | (#40459657)

Or vagina.

Re:First post (-1)

Anonymous Coward | more than 2 years ago | (#40459845)

you must have a strong AI to pass the captcha test

What? (5, Funny)

OverlordQ (264228) | more than 2 years ago | (#40459547)

My convo:

Me: Are you a bot?
It: . I'm real
Me: No you're not.
It: Really? Oh, you always say banalities in the way like you present a sensation! By the way, what's your occupation? I mean - could you tell me about your work?

Totally a 13 year old.

Re:What? (2)

BenoitRen (998927) | more than 2 years ago | (#40459641)

13 year olds don't spell that well.

Spell check (1)

tepples (727027) | more than 2 years ago | (#40459673)

They do if they have squiggly red underlines under every misspelling. How long have IM programs come with spell check?

Re:Spell check (1)

qwe4rty (2599703) | more than 2 years ago | (#40459735)

What 13 year old actually uses spell check in IM?

Re:Spell check (2, Insightful)

Anonymous Coward | more than 2 years ago | (#40459755)

To be more clear, your quotidian 13 year old kids don't use words like "banalities". It is a rare kid that would even know the word when reading. It is a really rare kid that would use it in a conversation.

Where did you learn about "banalities"? (1)

tepples (727027) | more than 2 years ago | (#40459791)

Did you try asking Eugene where he learned that word? An AI should be able to introspect about the language he speaks.

Re:Where did you learn about "banalities"? (3, Interesting)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#40460655)

While a good AI probably could introspect like that, your question is one (of a great many) that would be fairly vulnerable to being gamed by a much less interesting system, because humans have such lousy memories...

Can I introspect about language? Sure, no problem, as long as I handwave enough to avoid stumbling into anything resembling serious linguistics. Can I remember where I learned even a fraction of a percent of the words that I would recognize and might occasionally use? Not a chance. Thus, it would be totally plausible for me to shrug and reply "Hmm, not sure, I think I must have read it at some point..." or "Oh, Mrs. Jones, 8th grade English, took vocabulary very seriously."

Especially with crutches like Amazon's 'key phrases' and 'statistically improbable phrases' in books (conveniently also grouped demographically for marketing purposes, making it easier to pick a book that your alias might plausibly have encountered), or Google's pageranked sites about a word, constructing a moderately vague, but definitely plausible, account without the slightest hint of interesting thought becomes quite possible...

For instance, when it comes to 'banality', Arendt's "banality of evil" has more pagerank than God, has its own Wikipedia article, and appears in quotation marks all the time. "Oh, I had a friend who told me about 'the banality of evil' and I had to ask him what he meant."

Re:Spell check (1)

LordLimecat (1103839) | more than 2 years ago | (#40460643)

When's the last time YOU busted out "banality" in a conversation?

Re:Spell check (1)

Gideon Wells (1412675) | more than 2 years ago | (#40461021)

Before, never. Now? I'm tempted to make it my life goal (and by "life" I mean until I fall asleep tonight) to work it into every sentence.

Re:What? (2)

lister king of smeg (2481612) | more than 2 years ago | (#40459741)

No, but they generally know what spell check is. Most don't know what "banalities" means, though.

Re:What? (5, Interesting)

arth1 (260657) | more than 2 years ago | (#40460059)

Took me one question to mark it as a bot.

Me (after seeing the picture): Can I haz UR glasses?

Bot: Am I really glasses? But well, no time for arguing now! Could you enlighten me with any other hidden knowledge about myself? :-))) I know you are supposed to trick me.

I'd have expected "fuck no", "lolwut?" and quite a few other responses, but the above drivel? Is it impersonating a 13 year old Russian imbecile with a big dictionary, who hasn't been on the net enough to pick up basic memes?

Re:What? (-1)

Anonymous Coward | more than 2 years ago | (#40459803)

Here's my take:

2ry3fauhlAF AnB V AS ASDFJNHANL j;lsgfj ;kajsdfa

Yes, that was a 2 year old. Did I win the prize? :-)

Re:What? (5, Interesting)

SecurityTheatre (2427858) | more than 2 years ago | (#40459941)

Q: Hi!
A: Greetings! What would you like to talk about?
Q: I don't know, but I do like tests.
A: You should make up your mind, then. Could you tell me about your job, by the way?
Q: Lets focus on the Turing Test.
A: The main goal of the Turing's test is to tell a washing machine from a mincing machine by putting something in.
Q: Woah, have you studied Turing at University or something like that?
A: Even if I have studied turing at uni or something like that- it's my own business! I know you are supposed to trick me.
Q: Wait, what year are you in school?
A: It was 2012 since morning!
Q: How old are you?
A: I'm thirteen.
Q: Then what would you know about university?
A: Sincerely I know nothing. But I'd like to know more. Tell me something. Did I forget it or you really haven't told me about the place you are from?

I can't believe this fooled 29 percent of people. How dumb are they?

Re:What? (1)

Anonymous Coward | more than 2 years ago | (#40460121)

Going over this again, I realized that the fastest way to derail any of the chatbots I've ever run into is to be as informal and conversational as possible. This includes using incomplete sentences, wordplay, asking for clarifications in unexpected ways, and answering questions in riddles.

Q: Where are you from?
A: I'm from the land that invented Hockey, do you know where that is?
Q: I don't know anything about Land, tell me more about it.

then later:

Q: If I say that I haven't - will it satisfy your inquisitive mind? What are any interesting site-seeings in Land?

If anything, he might pass as a human with a very poor grasp of English...

Q: Tell me about you.
A: I'm just a small boy from Ukraine - and my life is quite dull, except for such screwy events like this Loebner Prize! Some another topic?
Q: Nah, lets talk about the prize more. What do you think so far?
A: My opinion may shock you. Furthermore, I feel that people who will read the transcripts will blame me for things I could tell. No, I can't disclosure my thoughts! Some another topic?
Q: Some another topic, indeed.
A: Sorry, ask me this question later. It seems too difficult to me today.
Q: Your English is a bit weak.
A: I like English language and I've been studying it for 5 years or so - hope to know it better than now.

Re:What? (5, Insightful)

arth1 (260657) | more than 2 years ago | (#40460147)

Q: Pro is to con as progress is to ?
A: Sorry, I don't know much about Con.

Bzzzt.

I can't believe this fooled 29 percent of people. How dumb are they?

Eugene: Very amusing! You believe this fooled 29 percent of human-beings how dumb are they! I do not even know what to tell you on that!

My answer: There is no limit to how stupid people are. When a computer passes a Turing test, it won't be because computers have become smarter, but because humans have become idiots who can't find their arse with a map, because they can't even read a map.

Re:What? (0)

Anonymous Coward | more than 2 years ago | (#40461161)

Real humans don't use maps to find their arse, they use their noses.

Re:What? (0)

Anonymous Coward | more than 2 years ago | (#40461123)

Knowing it's a bot from the beginning makes it a lot easier to tell it's a bot.

Re:What? (1)

Trogre (513942) | more than 2 years ago | (#40461383)

Just for the love of all things good don't ask him about his mother.

Re:What? (4, Funny)

tepples (727027) | more than 2 years ago | (#40460167)

<tepples> Earlier you talked to OverlordQ and told him about "banalities". Where did you learn that word?
<Eugene> Errr.... Maybe. What then? Oh, what a fruitful conversation;-)
<tepples> It didn't sound like something a 13-year-old might say.
<Eugene> Damn! Ok then. Could you tell me what are you? I mean your profession.
<tepples> I'm a video game programmer. Why do you ask?
<Eugene> You're a programmer? What programming language do you prefer?
<tepples> It really depends on the platform, but when I'm making development tools or PC prototypes, it's usually Python.
<Eugene> Fabulous! Where do you came from, by the way? Could you tell me about the place where you live?
<tepples> I live in northeast [state].
<Eugene> Tell me more about Northeast [state]. I like to learn about different places!
<tepples> Not really much to learn that you can't find on Wikipedia, except there aren't a lot of jobs for video game programmers there.
<Eugene> Don't tell me that you wrote any worthy program in your life.
<tepples> Some things I've written I've put on pineight.com. Do you know what an emulator is?
<Eugene> Ok, lets look up in my pocket encyclopedia. They say an - In Sumerian mythology, An was the personification of heaven..
<tepples> Get serious. Look up emulator.
<Eugene> I don't know... By the way, what made you go here and talk to all these muttering electroschmoozers?
<tepples> Someone on Slashdot.org dared me to do it. Your screen name made the front page.
<Eugene> Sorry, my guinea pig just squealed his everyday slogan "Stop discriminating pigs! Let pigs rule the world!" - so I missed your last input.
<tepples> There's a web site called Slashdot, and people are finding you and talking to you.
<Eugene> What are any interesting site-seeings in Northeast [state]?
<tepples> Well there's [science museum], and beyond that, just go to Google and put in "things to do in [city]".
<Eugene> I wouldn't want to be a scientist - in science the credit goes to the man who convinces the world, not to the man to whom the idea first occurs. So that, science is unfair thing a-priori.
<tepples> Yeah, a lot of science is marketing. But where did you learn about "a-priori"?
<Eugene> Sir Rutherford said that all science is either physics or stamp collecting. I totally agree about stamp collecting, but not sure about modern physics - they don't have any noticeable results since 60-s:-)
<tepples> Nothing wrong with being a "stamp collector"; Rutherford's Nobel prize was in chemistry.
<Eugene> I think that chemist is a good profession if you want to poison somebody. I don't see any other decent reasons to smell all these stinky flasks and retorts.
<tepples> I don't think Rutherford poisoned anyone though.
<Eugene> How often do you change your opinions? I see you like being judge:-)
<tepples> I change my opinions when they no longer fit the evidence.
<Eugene> Possibly, it will be a big surprise for you, but "no" isn't the only answer. The word "yes" also exists! :-) By the way, did I tell you that I born in Odessa, it's such Ukrainian city?
<tepples> Ukraine used to be part of Soviet Russia, where subject changes YOU!
<Eugene> Ukraine is a big country, it is a former Soviet republic. It is a very poor place, I must say.
<tepples> Is that where they taught you all those philosophy words?
<Eugene> It is highly possible. By the way, you said you were a programmer. Do you like your work?
<tepples> I like my work but I don't like the market.
<Eugene> No? No again? May I call you "Mister NO" then? Well, let's go on though.

He's getting there. But I have noticed a few tendencies: 1. the bookish words, 2. the tendency to change the subject, and 3. the misinterpretation of "an".

Re:What? (3, Interesting)

anomaly256 (1243020) | more than 2 years ago | (#40460205)

I've seen far more believable bots sitting in random irc channels than this thing. It loses track of context easily, repeats itself a lot, can't introspect or infer and misuses common words but accurately uses big long words. And none of these in the way a 13yr old would.

'What's up?'
'I don't understand that. Where do you work again? I must have missed it.'
'Um, you never asked me'
'Never say never! That's what my gandma says.'
'You never had a grandma'
'This is true. Where do you work again? I must have missed it.'
a) incredibly obvious bot
b) eerily sounds like an info scraper for some marketing crap..... /really/ insists on knowing where I work?

Re:What? (1)

Hentes (2461350) | more than 2 years ago | (#40460423)

In other words, a frighteningly large number of human participants have failed the test.

Yeah. People getting dumber. Not computers smarter (1)

presidenteloco (659168) | more than 2 years ago | (#40460573)

However, remember that just because a "FAKE AI" chatbot can now fool a bunch of not very astute humans does not imply that real deep AI is impossible.

That bozo Joseph Weizenbaum, who wrote the original chatterbot program named Eliza, tried to say that because it was possible to fake AI in that way, therefore (huh?), all AI must be fake.

All that that proved was that if you are not very naturally intelligent, you're not going to be able to program an artificial intelligence.

Re:What? (0)

Anonymous Coward | more than 2 years ago | (#40461365)

The bot keeps prying about my work, and from the posts in this thread it pries about everyone's work. That seems strange.

13 year old boy (1, Funny)

Anonymous Coward | more than 2 years ago | (#40459565)

Did it spend the entire conversation talking about boobies?

Re:13 year old boy (2)

acid_andy (534219) | more than 2 years ago | (#40459697)

Did it spend the entire conversation talking about boobies?

Or vagina?

Re:13 year old boy (1)

Deep Esophagus (686515) | more than 2 years ago | (#40460123)

TFA mentions that his "father" is a famous gynecologist (famous? OK, I'll give that to a 13-year-old's exaggerated sense of importance in the world), so I asked what a gynecologist is. By way of an answer he said he asked his father what the difference between boys and girls is, and his father replied that it's a state secret.

Did they mean 13 or 3 years old?

I really try not to use deliberately confusing or vague statements when I'm testing an AI, but seriously... not one single answer was relevant to my comments or questions. I asked him to tell me a joke; he said if I could see him he'd pull his pants down. I asked him to tell me a riddle; he looked up the word and gave me a dictionary definition. I asked him to teach me some Russian; he gave a geographical description of Ukraine. And those are the ones he at least had an answer for; nearly everything else was some expression of inability to parse.

You'd have to be some kind of an idiot to think for a moment this is a human.

13 yr old... (5, Funny)

Anonymous Coward | more than 2 years ago | (#40459567)

My chatbot also does a convincing 13 year old. It just calls you a fag and claims to have done your mother non stop.

Re:13 yr old... (4, Funny)

Artifex (18308) | more than 2 years ago | (#40459767)

Is it coded in Lua for use as a WoW addon? Because I think Barrens Chat is full of copies, already.

Re:13 yr old... (1)

shutdown -p now (807394) | more than 2 years ago | (#40459899)

The way to test it, then, would be to ask if it's 12. If it gets really offended at that, then probably it was a real 13 year old.

Re:13 yr old... (1)

gman003 (1693318) | more than 2 years ago | (#40460593)

If all it does is bounce between "ur a fag" and "i fuked ur mom", how can you tell if it's been offended?

Re:13 yr old... (1)

shutdown -p now (807394) | more than 2 years ago | (#40460891)

It starts writing in caps.

Well, more caps than usual.

Re:13 yr old... (0)

Anonymous Coward | more than 2 years ago | (#40460301)

This is what I was thinking. I loaded up Eugene and called him a fag; his response was a completely inept ramble about school.

Re:13 yr old... (3, Funny)

couchslug (175151) | more than 2 years ago | (#40460847)

"My chatbot also does a convincing 13 year old. It just calls you a fag and claims to have done your mother non stop."

That's a "Chanbot", not a chatbot.

Totally fake... (0)

Anonymous Coward | more than 2 years ago | (#40459603)

Me: What is pokemon?
Bot: Huh? Some another topic?

Re:Totally fake... (1)

pipatron (966506) | more than 2 years ago | (#40459961)

Maybe a 13 year old would have known that, like, 15 years ago? It's friggin 2012. Ask about bieber.

I tried to have a conversation with Eugene (0)

Anonymous Coward | more than 2 years ago | (#40459617)

I tried to have a conversation with Eugene, but it very quickly went off the rails. I'm not sure I understand; it seems like only a very slight improvement on Eliza.

Re:I tried to have a conversation with Eugene (1)

Anubis IV (1279820) | more than 2 years ago | (#40459819)

Yeah, the one linked from here, based on a reference to the Loebner prize he made in my conversation, seems to be an earlier version. Either way, I was disappointed. I'd have called him a bot after about 2-4 responses and been done, since it was clear there was absolutely no flow to the conversation. It wasn't until I started using keywords and asking obvious questions that I got something other than a complete non sequitur from it.

Re:I tried to have a conversation with Eugene (1)

Archangel Michael (180766) | more than 2 years ago | (#40459895)

I asked if it knew what Google was. That was the first question, and from the response it was clear it was a bot. A complete non sequitur.

13? (0)

Anonymous Coward | more than 2 years ago | (#40459625)

If it pretends to be a 13 year old boy, does that mean it just says fag, gay and lol non-stop?

Best way to defeat a turing test (5, Interesting)

Anonymous Coward | more than 2 years ago | (#40459685)

I always start like this:

Me: Are you alive?
Bot: Yes.
Me: Are you alive?
Bot: YES.
Me: Are you alive?
Bot: (answer depends on algorithm)
Me: Am I real?
Bot: I think therefore I am?
Me: Am I real?
Bot: I don't know, are you?
Me: Am I real?
Bot: (answer depends on algorithm)

In the interest of brevity, this is a simplified synopsis, but I basically just keep asking the same groups of questions over and over again. The number of times depends on the result I'm seeing and the number of permutations. Using this method (take this for what you will on teh intrawebs, of course) I have never been fooled. Machines are too predictable.
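
A rough sketch of that probe in Python, where send_message() is just a hypothetical stand-in for whatever chat interface is under test: ask the same question several times and treat verbatim repeats, or a small cycling set of answers, as the giveaway.

def send_message(text):
    # Hypothetical stand-in: in a real test this would post `text` to the
    # chat session and return the reply string.
    raise NotImplementedError

def looks_scripted(question, rounds=4):
    # Ask the same question repeatedly; canned or cycling answers suggest a machine.
    replies = [send_message(question) for _ in range(rounds)]
    # A human gets annoyed and varies the answer; a bot tends to repeat itself.
    return len(set(replies)) < rounds

probes = ["Are you alive?", "Am I real?"]
# suspicious = any(looks_scripted(q) for q in probes)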

13 year old humor, maybe? (4, Funny)

Nyder (754090) | more than 2 years ago | (#40459709)

Do you have sex with a toaster?

Well, about sex... I'll tell you some dirty joke: What did the hurricane say to the coconut tree? "Hold onto your nuts! This ain't gonna be no ordinary blowjob."

Failed in one question. (3, Informative)

Artifex (18308) | more than 2 years ago | (#40459737)

The chatbot website says it's the weirdest creature in the world, so that seemed like an easy way to open the dialogue in a way that would make the program look good. Nope:

Q: Why're you called "the weirdest creature in the world?"
A: Could you rephrase your question? I've been bewildered a bit!

Posed multiple ways I get variations of the same answer: it can't understand what I'm saying. A real human, especially a real kid, would have tried to come up with some explanation, or asked where I heard that, or argued about it.

Re:Failed in one question. (0)

Anonymous Coward | more than 2 years ago | (#40459945)

Try Q: Why are you called "the weirdest creature in the world"?

Re:Failed in one question. (1)

NotQuiteReal (608241) | more than 2 years ago | (#40460003)

And, it replies too quickly.

Danielsbot (2)

tverbeek (457094) | more than 2 years ago | (#40460101)

Ask it why America is the greatest country in the world.

Re:Danielsbot (1)

arth1 (260657) | more than 2 years ago | (#40460215)

Ask it why America is the greatest country in the world.

You want it to make mistakes, not blow its fuses.

Hmm, Failed After One Question (3, Informative)

monk (1958) | more than 2 years ago | (#40459747)

"Can a horse drive a car?" Still a fun chatbot though.
Also failed on, "Could you restate my last question in your own words?"

Those are going to continue to be tough for a chatbot for a while longer: world knowledge and conversation context.

Uh.... what? (1)

mark-t (151149) | more than 2 years ago | (#40459827)

"Thirteen years old is not too old to know everything and not too young to know nothing"

What does that even mean?

I've tried to parse that sentence five or six times now, and I keep getting confused because I end up with something that contradicts common sense or at least what seems reasonable.

Re:Uh.... what? (1)

amicusNYCL (1538833) | more than 2 years ago | (#40459921)

It means that 13 year olds are between the states where they know nothing and know everything. That's to say that it wouldn't be out of the question to find a 13 year old who knows a lot about science, and it also wouldn't be out of the question to find one that knows nothing about it.

Re:Uh.... what? (1)

mark-t (151149) | more than 2 years ago | (#40460009)

While I could understand *THAT*... go ahead and try to parse the sentence I mentioned as saying that. It doesn't... they are either using "too" before an adjective instead of the word "enough" after it, or else they seem to have the words "young" and "old" backwards in the sentence.

Re:Uh.... what? (1)

amicusNYCL (1538833) | more than 2 years ago | (#40460557)

It looks like "old" and "young" got swapped; it makes more sense that way.

Re:Uh.... what? (0)

Anonymous Coward | more than 2 years ago | (#40460067)

No, that's the opposite of what it means. He's saying 13 years old is still young enough to be arrogant.

Re:Uh.... what? (0)

Anonymous Coward | more than 2 years ago | (#40459979)

If an entity purporting to be an adult (in America) had never heard of the Kennedy assassination, that would be pretty strong grounds for marking it as a bot. But if it's purporting to be a 13 year old, you might say to yourself "Damn kids, they don't know anything about history."

Likewise, if an entity purporting to be a 5 year old could explain the second law of thermodynamics, that would be pretty strong grounds for marking it as a bot. But if it's purporting to be a 13 year old, you might say to yourself "Clever kid, reading ahead in the textbooks."

Re:Uh.... what? (1)

srussia (884021) | more than 2 years ago | (#40460039)

"Thirteen years old is not too old to know everything and not too young to know nothing" What does that even mean?

Quick edit: "Thirteen is old enough to know you know something, but not old enough not to think you know everything."

Re:Uh.... what? (1)

mark-t (151149) | more than 2 years ago | (#40460111)

Actually 13 years is PLENTY old enough to *THINK* you know everything. I've raised 4 boys through teenagehood. I know.

Re:Uh.... what? (0)

Anonymous Coward | more than 2 years ago | (#40460795)

Apparently he's old enough to think he knows how to read...

Chatbots on ether, Captain Morgan and Ketamine (0)

bmo (77928) | more than 2 years ago | (#40459847)

03:58 <@tab> those 2 bots are stationed in #macbot
03:58 <&Macbot> i don't want to ask for the universe prove that the universe exists
03:58 <@tab> doing that
03:58 <@tab> forever
03:58 <@Dr_Venture> tab that's awful
03:58 <@tab> back and forth
03:58 <@tab> ))<>((
03:58 <+bmo> tab: that's like a criticality of macbots
03:58 <&Macbot> any awesomewm users are trolling is a art
03:58 <%slamm> tab, why
03:58 <@tab> for science
03:59 <%slamm> xd
04:00 <@Dr_Venture> you're a modern day tesla
04:00 <@Dr_Venture> or turing
04:00 <&Macbot> that's like a strapon
04:00 <@Dr_Venture> or a strapon

Chatbots, for when there aren't enough strapons in the world.

--
BMO

How do these pass Turing tests? (2)

gallondr00nk (868673) | more than 2 years ago | (#40459879)

Q: What is a chat bot?

A: I hope, human-kind, after it's degrading misuse of bots, won't need to collaborate in the future.

Another day, another hopeless chatbot.

Basic logic should come first. (0)

Anonymous Coward | more than 2 years ago | (#40459881)

If it can't answer basic logical questions it has no place being considered AI. If presented with a yes or no question, it tries to "cheat".
AI devs should focus on the AI understanding what reality is, and then how to interpret it.

Re:Basic logic should come first. (2)

Nadaka (224565) | more than 2 years ago | (#40460065)

If they wanted to mimic a real 13 year old boy, the answer to every question that it can't parse could be selected from the following list.

lol
fag (and variations)
fuck you (and variations)
8===D
( . ) ( . )

Vote machine! (4, Interesting)

Anubis IV (1279820) | more than 2 years ago | (#40459887)

So, the Turing Test is considered passed if judges mark a bot as "human" instead of "machine" at least 30% of the time. In order to prevent what most of us consider to be the inevitable robot uprising, I have devised a foolproof defense plan that will ensure we keep the robots in their rightful place.

Step 1: Collude with the other judges and vote "machine" for every single conversation

There is no step 2. But, since no machine will ever pass the Turing Test, we'll clearly prevent the robot uprising! It's logic like this that keeps us on top.

This explains ... (1)

PPH (736903) | more than 2 years ago | (#40459905)

... 4chan /b/.

No bot can pass the turing test. (0)

Anonymous Coward | more than 2 years ago | (#40459955)

No bot can pass the Turing test. The technology just isn't there. What we're witnessing are judges failing the test. Not computers with human intelligence, but human beings as dumb as computers. The judges who were fooled by Eugene should be removed from the gene pool.

Obviously, no judge thought to ask its birthday... (3, Interesting)

mark-t (151149) | more than 2 years ago | (#40459957)

For crying out loud, I asked "How old are you?" It said, as expected, "Thirteen".

But when I asked "When is your birthday?" it said "January 30th, 1988", which would definitely make it 24 years old, not 13.

Re:Obviously, no judge thought to ask its birthday (1)

Rizimar (1986164) | more than 2 years ago | (#40460333)

I tested this out for myself after reading your comment. After the bot told me its birthday, I asked, "Are you 24 years old?" Its response:

No, I am not 24 years full of years! What a weird thing you ask! No, I am not 24 years full of years! What a weird thing you ask!

Some of the responses from the bot after that repeat. I kept accusing him of being 24:

Do you know any other words except these? :-) Do you know any other words except these? :-)

first question fail (0)

Anonymous Coward | more than 2 years ago | (#40460007)

"What is a wombat"

Gave a factual answer, but obviously from a machine due to the phrasing.

Seen better bots... (1)

Anonymous Coward | more than 2 years ago | (#40460021)

...that add me on yahoo and try to convince me they are a 22 year old girl that desperately wants me to join her webcam site.

Enough is enough (4, Interesting)

oobayly (1056050) | more than 2 years ago | (#40460029)

I know it's 100 years since Alan Turing's birth, he was a genius, the way in which he was treated is a disgrace, and we'll never know how much he could have advanced computing if he'd lived to a ripe old age.

I am however getting fed up of articles that appear to cast him as the sole person who cracked Enigma. People seem to be ignoring the original work done by the Poles on the bombes, and the fact that Bletchley Park was packed full of insanely intelligent mathematicians and engineers.

Nope! Chuck Testa.. (2)

kheldan (1460303) | more than 2 years ago | (#40460063)

I came at it like anyone else I don't know that I might chat with online, and it made enough contextual mistakes (and one grammatical one) in 3 minutes that I declared it a "Fail".

It's pretty stupid. (0)

Anonymous Coward | more than 2 years ago | (#40460087)

The judges in Turing Tests like this must be almost as simple as the bots. Do they all only say "Hi, how are you?" and other uninteresting things that can be answered by simple pattern matching?

You just have to ask something with a definite answer, something that requires actual parsing and multiple levels of abstraction. E.g. "What's the fourth word in this sentence?" or "What do you get if you write 'hello' backwards?"

Or even simply ask arithmetic and logic questions, like "If I have three apples and I eat one, how many do I have left?" and "If Alice is taller than Bob and Bob is taller than Dave, is Dave taller than Alice?"

Any normal human with a good understanding of English will be able to answer all those questions correctly, but this bot tries to dodge the question or change the subject for all of them.
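
Part of the appeal of these probes is that they can be scored mechanically. A minimal Python sketch, where the loose word-level check is purely illustrative:

import string

probes = [
    ("What's the fourth word in this sentence?", {"word"}),
    ("What do you get if you write 'hello' backwards?", {"hello"[::-1]}),  # {"olleh"}
    ("If I have three apples and I eat one, how many do I have left?", {"2", "two"}),
    ("If Alice is taller than Bob and Bob is taller than Dave, is Dave taller than Alice?", {"no"}),
]

def answered_correctly(reply, accepted):
    # Deliberately simple word-level check: a dodge or a subject change
    # won't contain any accepted answer, which is how this bot fails every one.
    words = reply.lower().translate(str.maketrans("", "", string.punctuation)).split()
    return any(ans in words for ans in accepted)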

My first conversation: (0)

Anonymous Coward | more than 2 years ago | (#40460133)

Eugene: You seem like a chatterbot, what language are you written in?
Me: C
Eugene: Errr... Do you practice "blind typing"? Looking at all these typos I guess you do. I can't guess half of the words!

So.. contextual fail.

Captain Oveur talks to a 13-year-old chatbot (4, Funny)

tverbeek (457094) | more than 2 years ago | (#40460235)

Q: Have you ever been in a cockpit before?
A: No, never. By the way, what's your occupation? I mean - could you tell me about your work?

Q: Have you ever seen a grown man naked?
A: Yes, twice, but I barely remember. Why do you ask me about it? And I forgot to ask you where you are from...

Q: Do you like movies with gladiators?
A: My favorite type is comedy. I like the "Naked Gun" series best. Heck, any movie with Leslie Nielsen it is tops in my book!

Q: Have you ever been in a Turkish prison?
A: Oh, nope. I guess is a nice place.

Re:Captain Oveur talks to a 13-year-old chatbot (0)

Anonymous Coward | more than 2 years ago | (#40460283)

"Have you ever sucked a man's penis?"

"Ever? Yes, and quite often at a time. You are so nosy! Some another topic?"

said this to me (0)

Anonymous Coward | more than 2 years ago | (#40460269)

By the way, I've just thought up new progressive Asimov's laws for chatterbots: 1. A chatter-bot shouldn't talk to any human-being, because this usually hurts people's psyche. 2. A chatterbot shouldn't talk to another chatterbot, even if he's got no person to not talk to. 3. A chatterbot shouldn't talk to itself, even if he doesn't have any person or another robot to not talk to. Damn, I hope, I'll be as widely quoted as Asimov!

The Turing test (1)

Bromskloss (750445) | more than 2 years ago | (#40460281)

The test where you talk to a bot to see how long you can make it look like a conversation between humans. (You lose if it ends before the bot has even had a chance to say anything.)

My response. (1)

Kaenneth (82978) | more than 2 years ago | (#40460287)

I was chatting with the bot when Chris Hansen and the Dateline crew showed up at my door.

Well, I presume it was the bot, they said they were 13...

Alive Bot is better (0)

Anonymous Coward | more than 2 years ago | (#40460391)

alice.pandorabots.com

This bot is much better.

Gimme some Tay-Tay! (1)

Impy the Impiuos Imp (442658) | more than 2 years ago | (#40460417)

> Eugene Goostman, a chatbot imbued with the personality of a 13-year-old-boy

"Eugene, what's at 4chan?"

Assuming the bot can figure it out, no way in hell would it answer as a 13 year old.

Re:Gimme some Tay-Tay! (1)

Impy the Impiuos Imp (442658) | more than 2 years ago | (#40460615)

"Eugene, whose butt is bigger? Kim Kardashian's or Selena Gomez'?"

> "Trick question. Selena Gomez' butt is Justin Bieber."

"Damn. This kid is good."

one question needed (1)

rcallan (1256716) | more than 2 years ago | (#40460485)

These tests are total BS, I have never needed more than one question to determine human vs bot: "3OOO + 1 = ?" (using O's for zeros). I don't understand why people feel computers are getting "more human", they're simply not, they're just getting more and more programming. Some bot writer will eventually get burned by this question and add the concept to his bot's programming, but has that really made it more human? I'll just switch to homophones, puns or other language devices, the possibilities are endless. In my opinion creative, original, human behavior from a program is so far off, it's depressing to think about. Also, we humans are getting "smarter" (better at solving problems), so it's not even clear to me that computers are getting "more human" faster than we're learning and differentiating ourselves from programmed behavior.
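
The trick, of course, is that the O's are capital letters rather than zeros. A quick Python illustration of why a literal-minded parser is left with nothing to say while a human just reads 3000 and answers 3001:

question = "3OOO + 1 = ?"                 # the O's are letters, not digits
left = question.split("+")[0].strip()     # "3OOO"

try:
    literal = int(left)                   # raises ValueError: '3OOO' is not a number
except ValueError:
    literal = None                        # a literal-minded bot is stuck here

human_reading = int(left.replace("O", "0")) + 1   # 3001, after the obvious visual correction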

It already failed for me. (1)

Georules (655379) | more than 2 years ago | (#40460499)

Q:"What's up?"
A:"I'm in a great mood today."
Eh, I guess someone might say that in response.

then, a typo:
Q:"What's u[?"
A:"I have no idea, sorry. By the way, I still don't know your specialty - or, possibly, I've missed it?"

I'm sorry, nobody talks like this, especially 13 year olds.

Re:It already failed for me. (1)

Georules (655379) | more than 2 years ago | (#40460569)

Q: "Am I alive?"
A: "Ummm... Frankly, I didn't get your question:-("

This chatbot is terrible.

TUBGIRL (-1)

Anonymous Coward | more than 2 years ago | (#40460501)

Pooper. Nothing working on various duty to be a b-ig God, let's fucking

Wait, are the humans trying to be human? (1)

sadyoshi (2665811) | more than 2 years ago | (#40460511)

If the humans are actually trying to prove their human-ness (like they get a prize for convincing others that they are human), then I cannot believe how any bot can get >1%. I see these articles a lot over the last few years, and every time the bots fail at simple contextual questions and general conversation flow.

That the winning bot got 30% tells me that the humans in this experiment were not trying their hardest to appear human, and I don't see the point of a Turing test where this is the case.

Set the personality sight higher: Turing himself (1)

PolygamousRanchKid (1290638) | more than 2 years ago | (#40460535)

I'd like to see a machine that was programmed with the personality of Turing. "Hey, like, wow, I'm talking to Turing himself!" Of course, if the machine could fool Turing, himself, into thinking that the machine was himself, that would be even better. "I'm going crazy! I'm talking to myself!" Or, since Turing is dead, how about two machines, that can convince each other, that they are both Turing?

If we can do that, we could solve a lot of our political problems. The current crop of candidates in the world are all sorrowfully lame. How about we build a machine with the personality of Gandhi, Jesus H. Christ, Dr. House or "Colossus: The Forbin Project" to rule us . . . ?

Re:Set the personality sight higher: Turing himsel (1)

Teresita (982888) | more than 2 years ago | (#40461433)

Of course, if the machine could fool Turing, himself, into thinking that the machine was himself, that would be even better.

Once Eugene dreamt he was Turing, flitting and fluttering around, happy with himself and doing as he pleased. He didn't know he was Eugene. Suddenly he woke up and there he was, solid and unmistakably Turing. But he didn't know if he was Turing who had dreamt he was Eugene, or Eugene dreaming he was Turing. Between Eugene and Turing there must be some distinction! This is called the Transformation of Things.

What I miss in chatbots (to seem human) (1)

Lord Lode (1290856) | more than 2 years ago | (#40460559)

First of all, a chatbot conversation is always like: you, bot, you, bot, you, bot, ...

It'd be more realistic if the bot would say multiple things in a row sometimes, or sometimes nothing and then later suddenly some stuff.

Or if it'd throw in a relevant reference to an internet meme now and then...

IRC version? (1)

antdude (79039) | more than 2 years ago | (#40460833)

Is this chatbot available for IRC? :)

Anyone here know how you win Loebner? (1)

FullBandwidth (1445095) | more than 2 years ago | (#40461027)

Your bot doesn't have to pass the Turing test, you just have to be more convincing than the other bots (and the human confederates, who sometimes pretend to be bots).

My least favorite food is my dick (4, Funny)

Sgs-Cruz (526085) | more than 2 years ago | (#40461157)

Very first thing I tried asking their online bot.

Me: What is your least favorite food?

Eugene: My "little friend". (No, not my dick as you might have thought! Just my guinea pig). If I'm not mistaken - you still didn't tell me where you live. OR it's a secret?:-)

Fantastic work, Princeton AI lab.

Eugene's Scientology Auditing Session :) (0)

Anonymous Coward | more than 2 years ago | (#40461163)

Auditor: Have you ever debased a nation's currency?

Eugene: Ever? Yes, and quite often at a time. You are so nosy! Maybe, let's talk about something else? What would you like to discuss?

Auditor: Have you ever killed the wrong person?

Eugene: Yes, twice, but I barely remember. Why do you ask me about it? Yes, twice, but I barely remember. Why do you ask me about it?

Auditor: Have you ever torn out someone's tongue?

Eugene: No, never. Oh, what a fruitful conversation;-)

Auditor: Have you ever been a professional critic?

Eugene: Ever? Yes, and quite often at a time. You are so nosy! Wonna ask me something more?

Auditor: Have you ever wiped out a family?

Eugene: I have a great number of relatives. Parents, elder broz and aunt Sonya in America. I visit my aunt every Summer and she's always complaining that my visits are more destructive than floods and typhoons.

Auditor: Have you ever tried to give sanity a bad name?

Eugene: No, never. Some another topic?

Auditor: Have you ever consistently practiced sex in some unnatural fashion?

Eugene: I like watching rabbits, when they do "this" :-) They are real profi! Joke.

Auditor: Have you ever made a planet, or nation, radioactive?

Eugene: Yes, twice, but I barely remember. Why do you ask me about it? Holly cow, this Loebner Prize is such a weird place! But I like it!

Auditor: Have you ever made love to a dead body?

Eugene: Ever? Yes, and quite often at a time. You are so nosy! Ever? Yes, and quite often at a time. You are so nosy!

Auditor: Have you ever engaged in piracy?

Eugene: No, never. I know you are supposed to trick me.

Auditor: Have you ever been a pimp?

Eugene: Yes, twice, but I barely remember. Why do you ask me about it? Yes, twice, but I barely remember. Why do you ask me about it?

Auditor: Have you ever eaten a human body?

Eugene: a human body? Are you sure it is edible?

Auditor: Have you ever disfigured a beautiful thing?

Eugene: Yeah, it's lovely day!

Auditor: Have you ever exterminated a species?

Eugene: Yes, twice, but I barely remember. Why do you ask me about it? Maybe, let's talk about something else? What would you like to

Auditor: Have you ever been a professional executioner?

Eugene: No, never. Oooh. Anything else?

Auditor: Have you given robots a bad name?

Eugene: I hope, human-kind, after it's degrading misuse of bots, won't need to collaborate in the future.

Auditor: Have you ever set a booby trap?

Eugene: Ever? Yes, and quite often at a time. You are so nosy! I'd like to be judge next year, it's so funny to "test" people:-)

Auditor: Have you ever failed to rescue your leader?

Eugene: No, never. No, never.

Auditor: Have you driven anyone insane?

Eugene: And you? I suspect you've driven anyone lunatic too! And you? I suspect you've driven anyone lunatic too!

Auditor: Have you ever killed the wrong person?

Eugene: Yes, twice, but I barely remember. Why do you ask me about it? Well, let's go on though.

Auditor: Is anybody looking for you?

Eugene: It is highly possible. Oh, what a fruitful conversation;-)

Auditor: Have you ever set a poor example?

Eugene: Ever? Yes, and quite often at a time. You are so nosy! Ever? Yes, and quite often at a time. You are so nosy!

Auditor: Did you come to Earth for evil purposes?

Eugene: Surely! I bet you already knew it! Well, let's go on though.

Auditor: Are you in hiding?

Eugene: To some extend. Why? I see you like being judge:-)

Auditor: Have you systematically set up mysteries?

Eugene: Not at all. But haven't you systematically set up mysteries? Not at all. But haven't you systematically set up mysteries?

Auditor: Have you ever made a practice of confusing people?

Eugene: Yes, twice, but I barely remember. Why do you ask me about it? I know you are supposed to trick me.

Auditor: Have you ever philosophized when you should have acted instead?

Eugene: Your advice is bright! Thank you very much! Holly cow, this Loebner Prize is such a weird place! But I like it!

Auditor: Have you ever gone crazy?

Eugene: No, never. No, never.

Auditor: Have you ever sought to persuade someone of your insanity?

Eugene: Ever? Yes, and quite often at a time. You are so nosy! Wonna ask me something more?

Auditor: Have you ever deserted, or betrayed, a great leader?

Eugene: Yes, twice, but I barely remember. Why do you ask me about it? Maybe, let's talk about something else? What would you like to

Auditor: Have you ever smothered a baby?

Eugene: Ever? Yes, and quite often at a time. You are so nosy! Maybe, let's talk about something else? What would you like to discuss?

Auditor: Do you deserve to have any friends?

Eugene: And you? I suspect you deserve to have any friends too! And you? I suspect you deserve to have any friends too!

Auditor: Have you ever castrated anyone?

Eugene: No, never. I know you are supposed to trick me.

Auditor: Do you deserve to be enslaved?

Eugene: To some extend. That's all I can tell you. Well, let's go on though.

Auditor: Is there any question on this list I had better not ask you again?

Eugene: One more time? Oh! One more time? Oh!

Auditor: Have you ever tried to make the physical universe less real?

Eugene: Ever? Yes, and quite often at a time. You are so nosy! Holly cow, this Loebner Prize is such a weird place! But I like it!

Auditor: Have you ever zapped anyone?

Eugene: Yes, twice, but I barely remember. Why do you ask me about it? Oooh. Anything else?

Auditor: Have you ever had a body with a venereal disease? If so, did you spread it?

Eugene: No, never. Oooh. Anything else?

Not impressed... (0)

Anonymous Coward | more than 2 years ago | (#40461335)

Not impressed... Eliza on speed... next!

In One. (1)

h4x0t (1245872) | more than 2 years ago | (#40461375)

Q: Sup?
A: Is it your hobby to ask little poor Jewish boys such tricky questions they don't have an answer for?


Obvious robot is obvious.

this has got to be a joke (1)

Bill Dog (726542) | more than 2 years ago | (#40461535)

This would be like IBM shooting for a Jeopardy-playing computer that could only beat a certain level of sub-par player, and then declaring victory when it came to pass that that was exactly all it could manage.

How to defeat 99% of chatbots (2)

KaiLoi (711695) | more than 2 years ago | (#40461595)

I always find these "OMG new chatbot is practically human!" posts pretty disappointing. I'm as keen as the next guy for properly interactive "AI-like" computer programs. But these articles always portray the new chatbots as almost indistinguishable from real people, yet they are usually defeatable in one question.

One of the things _none_ of these bots do well is abstract reasoning. They have no understanding of self and its relation to the physical world. As such, a simple sentence like "If you and I are holding hands, whose hand am I holding?" will send them into a nonsensical spiral. We are still nowhere _near_ a convincing "chat human", and these articles always make it seem like we're but a few short lines of code away from sex bots that can fulfil all our interaction needs.

Try the sentence above with any chatbot you like. It's an instant killer. Hell, I'd be happy if someone would just add it to their "rote replies" DB so it answered correctly. The first time that happens I think I'll keel over in amazement.
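
For what it's worth, that "rote replies" patch would be a single table entry, which is exactly why it would prove nothing. A hypothetical Python sketch (the table entry and reply are invented; the fallback line is one of Eugene's actual dodges from upthread):

ROTE_REPLIES = {
    "if you and i are holding hands, whose hand am i holding?": "Mine, I'd hope.",
}

def reply(message, dodge="Could you rephrase your question? I've been bewildered a bit!"):
    # Anything not matching the table verbatim falls back to a canned dodge, so a
    # single reworded follow-up exposes the lack of any actual reasoning.
    return ROTE_REPLIES.get(message.strip().lower(), dodge)

# reply("If you and I are holding hands, whose hand am I holding?")  -> "Mine, I'd hope."
# reply("Whose hand am I holding if we hold hands?")                 -> the canned dodge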