
BotPrize: a Turing Test for bots

Anonymous Coward writes | more than 5 years ago

Programming

Philip Hingston writes "Computers can't play like people — yet

An unusual kind of computer game bot programming contest has just been held in Perth, Australia, as part of the IEEE Symposium on Computational Intelligence and Games. The contest was not about programming the bot that plays the best. The aim was to see if a bot could convince another player that it was actually a human player. Game Development Studio 2K Australia (creator of BioShock) provided $7,000 cash plus a trip to their studio in Canberra for anyone who could create a bot to pass this "Turing Test for Bots".

People like to play against opponents who are like themselves — opponents with personality, who can surprise, who sometimes make mistakes, yet don't robotically make the same mistakes over and over. Computers are superbly fast and accurate at playing games, but can they be programmed to be more fun to play — to play like you and me?
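The kind of "human-like imperfection" described above can be sketched in code. This is a minimal, hypothetical illustration, not taken from any actual BotPrize entry: the function name, parameters, and constants are all my own, and a real bot would layer many more behaviours on top.

```python
import random

def humanized_aim(target_x, target_y, skill=0.7, rng=random):
    """Return an aim point with human-like error.

    skill in [0, 1]: higher skill means tighter aim.
    A small chance of a much larger miss models the occasional
    blunder a human makes but a perfectly accurate bot never would,
    and the Gaussian jitter keeps the errors from repeating
    identically shot after shot.
    """
    spread = (1.0 - skill) * 10.0      # base jitter in game units (illustrative)
    if rng.random() < 0.05:            # rare, non-systematic blunder
        spread *= 5.0
    return (target_x + rng.gauss(0, spread),
            target_y + rng.gauss(0, spread))
```

The design point is that the error is drawn fresh each shot, so the bot misses in a different way every time, rather than making the same robotic mistake over and over.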

Teams from Australia, the Czech Republic, the United States, Japan and Singapore competed in the final. Competitors created bots to play a specially modified Unreal Tournament 2004 Death Match. Expert judges then tried to tell whether they were playing against a bot or a human, judging only by the way their opponent played. The judges included AI experts, a game development executive, game developers, and an expert human player.

The result?

The winning team AMIS, from Charles University in Prague, managed to fool 2 out of the 5 expert judges, and achieved an average "human-ness rating" of 2.4 out of 4. All the human players were judged more human than the bots overall, but the judges were fooled often enough to suggest that in next year's contest, some bots may be able to pass the test by fooling 4 out of 5 judges. AMIS won $2,000 cash plus an all expenses paid trip to 2K's Canberra studio.
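The pass criterion implied above (fooling 4 out of 5 judges) can be stated as a one-line check. The threshold comes from the article; the function name and default values are mine, purely for illustration:

```python
def passes_botprize(judges_fooled, total_judges=5, threshold=4):
    """A bot passes this criterion if it fools at least
    `threshold` of the `total_judges` expert judges."""
    return judges_fooled >= threshold

# AMIS fooled 2 of the 5 judges, so it does not yet pass.
```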

You can check out the full results and competition videos, and try an online quiz that lets you be a judge yourself at"


