
How To Index and Search a Video By Emotion

timothy posted about 4 years ago | from the black-beauty-special dept.

Input Devices 76

robotsrule writes "Here's a demonstration video of EmoRate, a software program that uses the Emotiv 14-electrode EEG headset to record your emotions via your facial expressions. In the video you'll see EmoRate record my emotions while I watch a YouTube video, then index that video by emotion, and then navigate that video simply by remembering a feeling. The web page for EmoRate explains how I used Emotiv's SDK to build the software program, and how I trained the system by watching emotionally evocative videos on YouTube while wearing the headset."
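Not the actual EmoRate code, but a minimal illustrative sketch of the indexing-and-seeking idea described above: record (timestamp, emotion) samples while the video plays, then jump to the first stretch of the video tagged with the emotion you recall. The sample data and emotion labels are made up.

```python
# Illustrative sketch, not EmoRate's real implementation: index a video by a stream of
# (timestamp, emotion) samples and seek back to the first segment matching a recalled emotion.
from collections import defaultdict

def build_emotion_index(samples):
    """samples: iterable of (seconds_into_video, emotion_label) pairs."""
    index = defaultdict(list)
    for t, emotion in samples:
        index[emotion].append(t)
    return index

def seek_to_emotion(index, emotion, after=0.0):
    """Return the first indexed timestamp for `emotion` at or after `after` seconds, or None."""
    for t in sorted(index.get(emotion, [])):
        if t >= after:
            return t
    return None

# Made-up samples, as a headset-driven classifier might emit them once per detected event.
samples = [(3.0, "neutral"), (12.5, "amused"), (40.2, "sad"), (41.0, "sad"), (75.8, "amused")]
index = build_emotion_index(samples)
print(seek_to_emotion(index, "amused"))             # 12.5
print(seek_to_emotion(index, "amused", after=20))   # 75.8
```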


76 comments


Obvious trial run. (0)

Anonymous Coward | about 4 years ago | (#33387552)

Wonder how this would perform for porn and porn + regular videos? Would you end up getting one category as the dominant selection?

Re:Obvious trial run. (0)

Anonymous Coward | about 4 years ago | (#33388474)

I dunno about that... But it would be useful to catalog porn videos by the action going on. That way you can skip the boring parts.

The real EmoRate (5, Funny)

Kitkoan (1719118) | about 4 years ago | (#33387556)

is right here. [ratethatemo.com]

Oh...my... (1)

sznupi (719324) | about 4 years ago | (#33387566)

Brain-computer interfaces really taking off via emos?

This is not the augmented / towards transhuman future I imagined...

Re:Oh...my... (1)

ZeroExistenZ (721849) | about 4 years ago | (#33390388)

Brain-computer interfaces really taking off via emos?

For me this raises the question: if an emo watches EmoRate, should EmoRate show the emo video material matching the emo's emotional state, thus amplifying and affirming the emo's emotions and entering a self-referencing emo loop...

Or should one design an algorithm that detects the emotional changes of users watching EmoRate and serves emo content countering the emo's current emotional state, pushing it in the opposite direction?

Say, distilling your Facebook data:

"Subject x in age range y who lists keywords a, b, c as interests matches category 'Emo'."
"Category 'Emo' responds well to imagery of young kittens yet negatively to imagery of maternal figures."

Re:Oh...my... (1)

maxwell demon (590494) | about 4 years ago | (#33390470)

For me this raises the question: if an emo watches EmoRate, should EmoRate show the emo video material matching the emo's emotional state, thus amplifying and affirming the emo's emotions and entering a self-referencing emo loop...

Or should one design an algorithm that detects the emotional changes of users watching EmoRate and serves emo content countering the emo's current emotional state, pushing it in the opposite direction?

I'd say that depends on whether it's a positive or negative emotion. If I feel extraordinarily happy, I certainly don't want to be shown photos countering that. On the other hand, if I'm very sad, seeing pictures which counter that sadness might be very welcome.

Say, distilling your Facebook data:

"Subject x in age range y who lists keywords a, b, c as interests matches category 'Emo'."
"Category 'Emo' responds well to imagery of young kittens yet negatively to imagery of maternal figures."

Sounds like an advertiser's wet dream. Targeted advertising based on your personal emotional profile!

Re:Oh...my... (1)

flyneye (84093) | more than 3 years ago | (#33398324)

I think you could use the headset for assigning particular functions to particular facial expressions, as they have. I'll take the "magic" out of this contraption by noting that it doesn't read emotions; it merely senses predefined facial expressions and compiles the data. Over many years of research with EEG, both Dr. Walter and I found that even one sensor was enough, and that holding it between fingertips (where nerve endings are plentiful) rather than pasting it on the face or scalp produced far better sensing results.
Simply put, they could have used facial recognition software or mechanical means to produce the same results. I suspect the overkill is there to drive the price up.
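Along the lines of the "just use facial recognition software" point above, here is a rough camera-only sketch using OpenCV's bundled Haar cascades for face and smile detection; the tuning thresholds and the "treat a smile as a positive tag" mapping are arbitrary choices, not anything from the product in the story.

```python
# Rough sketch of the camera-based alternative: classify a webcam frame as "smile" or
# "neutral" with OpenCV's stock Haar cascades. Tuning values are arbitrary.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_smile.xml")

def frame_expression(frame):
    """Return 'smile' or 'neutral' for a single BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        roi = gray[y:y + h, x:x + w]
        if len(smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=22)) > 0:
            return "smile"
    return "neutral"

cap = cv2.VideoCapture(0)   # default webcam
ok, frame = cap.read()
if ok:
    print(frame_expression(frame))
cap.release()
```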

Oh boy... (0)

Anonymous Coward | about 4 years ago | (#33387580)

This sure will come in handy when I'm looking for GOOD porn! There is a God!!!!

Re:Oh boy... (0)

Anonymous Coward | about 4 years ago | (#33388822)

But it only retrieves good porn when you're fully recalling the emotion you get from good porn.

So it's like a loan -- you can only get it by proving you don't need it. :(

Emo (0)

Anonymous Coward | about 4 years ago | (#33387620)

The best word in all of creation. Possibly followed by emu.

I assume you wouldn't always respond the same way (1)

Tynin (634655) | about 4 years ago | (#33387630)

It would be interesting to run this a few times... not sequentially, but maybe once every few months to give yourself time to reset (for lack of a better word), rewatch whatever it was you recorded with this device, and then diff the results to see if there was a drift, and in what areas. I have to assume you wouldn't always respond the same way, and the results could be highly interesting, perhaps even more so to the field of psychiatry in allowing a more exact gauging of the effectiveness of whatever drug they are administering to a patient.
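A minimal sketch of the "diff the results" idea, with invented data: compare what fraction of each session was spent in each emotion and report the per-emotion drift.

```python
# Sketch of diffing two recording sessions: compare the share of viewing time spent in
# each emotion and report the change. Session data here is invented.
from collections import Counter

def emotion_shares(labels):
    """labels: list of emotion labels, one per fixed time step."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {e: c / total for e, c in counts.items()}

def drift(session_a, session_b):
    """Per-emotion change in share of viewing time from session A to session B."""
    a, b = emotion_shares(session_a), emotion_shares(session_b)
    return {e: round(b.get(e, 0.0) - a.get(e, 0.0), 3) for e in set(a) | set(b)}

january = ["amused"] * 6 + ["neutral"] * 3 + ["sad"] * 1
april   = ["amused"] * 3 + ["neutral"] * 6 + ["sad"] * 1
print(drift(january, april))   # {'amused': -0.3, 'neutral': 0.3, 'sad': 0.0}
```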

Re:I assume you wouldn't always respond the same w (1)

Tynin (634655) | about 4 years ago | (#33387676)

Sorry for replying to myself. After a casual search I found it was naive of me to think EEGs haven't been used in psychiatry for a while now. It seems they're rather commonplace. Still, it's cool that the tech is being used in new and more accessible ways.

Re:I assume you wouldn't always respond the same w (2, Interesting)

Kitkoan (1719118) | about 4 years ago | (#33387738)

It would be interesting to run this a few times... not sequentially, but maybe once every few months to give yourself time to reset (for lack of a better word), rewatch whatever it was you recorded with this device, and then diff the results to see if there was a drift, and in what areas. I have to assume you wouldn't always respond the same way, and the results could be highly interesting, perhaps even more so to the field of psychiatry in allowing a more exact gauging of the effectiveness of whatever drug they are administering to a patient.

I doubt you'd get nearly the same reactions. Things like boredom (reruns don't always hold everyone's attention) and (since these are static videos) predictability can and will detach your emotions, and their intensity, from what you're watching. Think of horror movies: sure, they can make you feel great fear during the first watch, but they rarely cause that much fear during the second viewing, let alone the third, fourth, etc...

Use Current Boredom/Interest to adjust speed (1)

billstewart (78916) | more than 3 years ago | (#33399336)

Unfortunately, it's probably harder for these things to detect boredom vs. interest than simpler emotions, but it would be cool if you could set it so your player runs faster when you're bored and slower when you're interested (and there are already sound-adjuster programs out there so you can run faster or slower without distorting the sound pitch badly.)

You want to use this in interactive mode, not batch, so it's reacting to what you're interested in or bored about now, not what you felt about it last time. WAIT! WHAT? BACKUP! No, not that far - Yeah, there!
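A minimal sketch of the speed-adjustment idea, assuming the headset (or whatever classifier you trust) hands you an engagement estimate in [0, 1]; the mapping and rate limits are arbitrary.

```python
# Sketch of boredom-adaptive playback: low engagement (bored) maps to a faster rate,
# high engagement maps back toward normal speed. The engagement score is assumed input.
def playback_rate(engagement, min_rate=0.9, max_rate=2.0):
    engagement = max(0.0, min(1.0, engagement))
    return max_rate - engagement * (max_rate - min_rate)

for e in (0.1, 0.5, 0.9):
    print(e, round(playback_rate(e), 2))   # 0.1 -> 1.89, 0.5 -> 1.45, 0.9 -> 1.01
```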

What's the big deal? (1)

dirtyhippie (259852) | about 4 years ago | (#33387650)

So the guy moves his face in accordance with his emotions and then, guess what, he can make the same gestures to go back to the place in the video where he previously made those gestures. Woot? If anything this only shows just how far off this sort of technology really is. Last I checked, I'm pretty sure I had more than 4 emotions... Am I missing something about how amazing this is?

Re:What's the big deal? (1, Interesting)

Anonymous Coward | about 4 years ago | (#33388226)

Yea, I get the feeling the software trained the person rather than the other way around. But this is the first step in software thinking just like people; it'll be here in 15 years.

Too bad the emotiv SDK costs $10,000 (2, Informative)

BitHive (578094) | about 4 years ago | (#33387686)

If you want the raw EEG data, you have to buy emotiv's $10,000 SDK. I'll stick with my Neurosky headset for now.

Re:Too bad the emotiv SDK costs $10,000 (2, Interesting)

c0lo (1497653) | about 4 years ago | (#33387756)

Neither has a Linux port for the SDK (wink - wonder why this is posted on /. then?)
Seriously, anyone with some references on similar devices that have Linux support?

Re:Too bad the emotiv SDK costs $10,000 (1)

MrBandersnatch (544818) | about 4 years ago | (#33388024)

There's an SDK version in the works with a promise of a beta at any point. Closed source drivers though which of course sucks but this is a small company trying to make it in a very small market; could have been worse, could have needed a dongle. Oh, wait :(

So yes, the Linux support could be better, but they are at least making an effort in this area rather than having to be browbeaten about it for many years.

Re:Too bad the emotiv SDK costs $10,000 (3, Informative)

MrBandersnatch (544818) | about 4 years ago | (#33388002)

Nonsense!

The version to access EEG data is $750. They have a $500 developer version and a $299 consumer version - I don't even think they have a $10K product! As for Neurosky, do you mean that toy where you move the pong ball up and down? Sorry, genuinely interested since I hadn't thought they had done much beyond that.

Re:Too bad the emotiv SDK costs $10,000 (1)

BitHive (578094) | about 4 years ago | (#33388318)

It was $10,000 a year or so ago. I seem to remember a $15k "research" version too but I'm guessing there weren't very many takers.

Re:Too bad the emotiv SDK costs $10,000 (3, Informative)

Idiomatick (976696) | about 4 years ago | (#33389050)

Still completely fucking retarded. I was really excited about this product before it came out. Was a part of the forums. Then they decided they would charge HEAVILY for the SDK to regular users. Who is their audience? People that want to toy with the thing. That is their ONLY audience until there are several thousand apps for the thing and it is integrated into a bunch of games like the Star Wars MMO. And that will only happen if you have lots of developers. Which Emotiv is shitting on.

They likely made 20-50 grand selling the dev kits. Whereas the Star Wars MMO having mind control in it would sell at least 1,000 headsets, likely way, way more. They would have to try not to sell an additional 500-1000 headsets if they opened the SDK.

Explaining this obvious failure of business on the forums got me tossed then my post deleted. And now the old forums are gone. (When I explained it back then I was much more encouraging).

Seriously, this thing was super hyped at release but if you google them or look their crap up on youtube they almost completely died within weeks of release. The forum has 2-3 active devs.

So sad to see such a cool toy ruined by such a stupid, stupid, obvious business decision. :( That, and proving that Ballmer is smarter than you has to be embarrassing. DEVELOPERS!

Re:Too bad the emotiv SDK costs $10,000 (0)

Anonymous Coward | about 4 years ago | (#33389562)

That's a good explanation. Currently they're forcing you to buy the $750 SDK to get full access to the headset. WHAT! I just wanted to get my meditation on using the $299 model, oh well. It wouldn't be such a big deal if they had more than 4-6 apps to buy (which are not interesting whatsoever). I must say they have good CS. If quality apps don't come soon, I don't think their capital will hold out...

Re:Too bad the emotiv SDK costs $10,000 (1)

RattFink (93631) | about 4 years ago | (#33390222)

Correct me if I'm wrong but isn't this what you are looking for?

http://www.emotiv.com/apps/sdk/179/ [emotiv.com]

$500 may not be cheap, but it's a lot more reachable for the hobbyist than $10k.

Re:Too bad the emotiv SDK costs $10,000 (1)

Idiomatick (976696) | about 4 years ago | (#33392388)

Still biting the hand that feeds them. The choice to charge $500 for it was suicide, simple as that. Charging 10k may have been bad enough that they would sell ZERO copies. Which would have been sort of embarrassing.

Re:Too bad the emotiv SDK costs $10,000 (1)

wjousts (1529427) | about 4 years ago | (#33392448)

The developer edition doesn't include raw EEG data. Just their blackbox interpretation of those signals.

Re:Too bad the emotiv SDK costs $10,000 (1)

hellop2 (1271166) | about 4 years ago | (#33390758)

So, load up some USB monitoring software and make your own opensource SDK.
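For anyone tempted by the "monitor the USB traffic yourself" route, here is a bare-bones pyusb sketch that just dumps raw reports from a device; the vendor/product IDs are placeholders (not Emotiv's), and decoding whatever comes out, including any encryption, is entirely up to you. The legal concern raised in the reply below still applies.

```python
# Sketch only: dump raw USB reports with pyusb. The IDs below are placeholders, and the
# payload format is left undecoded.
import usb.core
import usb.util

VENDOR_ID, PRODUCT_ID = 0x1234, 0x5678   # placeholder IDs for illustration

dev = usb.core.find(idVendor=VENDOR_ID, idProduct=PRODUCT_ID)
if dev is None:
    raise SystemExit("device not found")
if dev.is_kernel_driver_active(0):       # on Linux, take the interface from the HID driver
    dev.detach_kernel_driver(0)
dev.set_configuration()

intf = dev.get_active_configuration()[(0, 0)]
ep_in = usb.util.find_descriptor(
    intf,
    custom_match=lambda e: usb.util.endpoint_direction(e.bEndpointAddress) == usb.util.ENDPOINT_IN,
)

for _ in range(10):                      # print ten raw reports as hex
    data = dev.read(ep_in.bEndpointAddress, ep_in.wMaxPacketSize, timeout=1000)
    print(bytes(data).hex())
```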

Re:Too bad the emotiv SDK costs $10,000 (1)

wjousts (1529427) | about 4 years ago | (#33392466)

I'm pretty sure their lawyers would have something to say about that.

Re:Too bad the emotiv SDK costs $10,000 (1)

Santzes (756183) | about 4 years ago | (#33391172)

Wow. Doing some face recognition and other image recognition stuff lately, I left this article open for the night as I was probably going to spend some time with it later. Not gonna happen now.

Re:Too bad the emotiv SDK costs $10,000 (0)

Anonymous Coward | more than 3 years ago | (#33445234)

Died?? They sold out three batches and I had to wait a couple of months to get mine! It's great. 750 bucks for an EEG machine and SDK, and I can play games. There are some great videos on Youtube showing a disabled girl playing the demo game. Over a month she changes from being completely disconnected and with no neck control, to a switched on and enthusiastic person who can support her own head. She had no reason before. The videos are at
http://www.youtube.com/watch?v=RhtrdBvUb-U
http://www.youtube.com/watch?v=HucKhwmenJY&feature=channel

Re:Too bad the emotiv SDK costs $10,000 (0)

Anonymous Coward | about 4 years ago | (#33390672)

Yeah, gee, if only they had a website with all the details.

Re:Too bad the emotiv SDK costs $10,000 (1)

wjousts (1529427) | about 4 years ago | (#33391174)

Not entirely correct either. The EnterprisePlus edition with raw EEG output is $7,500. The research $750 version is for individuals, research institutions and companies with turnover

The other thing that isn't clear to me: if you develop an application using the raw EEG output, do you need other $7,500 headsets to use that application, or can you actually use it with the cheaper consumer headsets? If we develop something on our $7,500 headset and then want to roll it out globally within our organization, that's trivial at $299 a headset, not so much at $7,500.

Re:Too bad the emotiv SDK costs $10,000 (1)

wjousts (1529427) | about 4 years ago | (#33392486)

I was trying to post in a hurry - the above is supposed to read "turnover less than $100,000"

Re:Too bad the emotiv SDK costs $10,000 (0)

Anonymous Coward | more than 3 years ago | (#33445118)

Jeez, you are so far out of date! If you want EEG data you can buy Emotiv's Research Edition SDK for $750.

EmoRate (1)

YoshiDan (1834392) | about 4 years ago | (#33387708)

Does it cut itself too?

Re:EmoRate (1)

PerfectionLost (1004287) | about 4 years ago | (#33392088)

No it just pretends it does, so that you'll pay attention.

Goatse (1)

JimWise (1804930) | about 4 years ago | (#33387718)

What was his reaction to goatse videos? Actually, I think I'd rather not know. I'm sure such an experiment with a normal person would push the Emotiv beyond its capabilities.

Re:Goatse (1)

Kitkoan (1719118) | about 4 years ago | (#33387766)

Re:Goatse (0)

Anonymous Coward | about 4 years ago | (#33388234)

I love the girl with the hands protecting her head, as if to thwart the evil manifestation from further invading the sanctity of her mind. Goatse is a symbol of all the innocence that was lost in this cruel world.

Why? (1)

Required Snark (1702878) | about 4 years ago | (#33387726)

So let me get this straight: you hook this up, train it, and then in the future you can hook it up again and it will tell you how you feel. Because at some time in the future you won't know what you are feeling, so you have to ask the computer about what you just experienced.

I understand that there is a market for this in testing products, say video games, but who else would use this stuff? I just don't see this as being very commonly used. I would guess it is about as useful as a "lie detector", which doesn't do a very good job, and how many people have one of those in their home?

Re:Why? (1)

MrBandersnatch (544818) | about 4 years ago | (#33388162)

There's a BIG market for this technology, and it's only going to grow. It's already being used in neuro-marketing (market research), I'm personally looking to apply it to usability, and down the line context-sensitive affective interactions are going to play a big part in how you interact with software (think of a computer that can tell it gave you an unsatisfactory answer by the tone of your voice, and thus does another search in the background to try to improve the results).

I'm personally still sceptical regarding the performance of these "consumer" level BCIs, but the Emotiv headset (EPOC) and SDK have been turned to some very interesting applications (e.g. wheelchair control for the disabled, or a neurophone using the P300 signal, which could be more broadly applied to a range of applications).

Right now this is probably at the level of home computing in the 80s and the EPOC headset might actually be the ZX81 in terms of putting practical BCIs in enthusiasts hands. But that's a rather big "might" since I'm still trying to scrape funding together for mine *sigh*
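A toy sketch of the "does another search in the background" pattern mentioned above, assuming you have some affect signal to read; `read_frustration()` and `search()` are stand-ins, not any real API.

```python
# Sketch of an affect-aware retry: show results, and if the user looks frustrated,
# quietly rerun a broadened search in the background. All inputs are stand-ins.
import threading

def read_frustration() -> float:
    return 0.8                     # placeholder: pretend the headset reports high frustration

def search(query, broaden=False):
    return [f"result for {query!r} (broadened={broaden})"]

def show(results):
    print(results)

def respond(query):
    show(search(query))
    if read_frustration() > 0.7:   # the user seems unhappy with the answer
        worker = threading.Thread(target=lambda: show(search(query, broaden=True)))
        worker.start()
        return worker
    return None

worker = respond("emotiv sdk price")
if worker:
    worker.join()
```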

Re:Why? (1)

pyrosine (1787666) | about 4 years ago | (#33390122)

For this "EmoRate" program that is true but the device itself not so much. Using their app, "EmoKey", you can bind neurological impulses (I think they really doomed their product by saying it was based on emotions) to key combinations. One application of this technology is simulating a forward feeling, backwards, sideways etc (I know I for one can send fake impulses) and binding it to a WASD setup. Setting up like this would actually help you game faster as it removes the physical lag. I know on their site they featured a picture browser where movement in a specific direction (you train the exact movement, could be moving your head, eyes etc) will change photo.

Re:Why? (1)

maxwell demon (590494) | about 4 years ago | (#33390486)

Hmmm ... if you combine this with something like Dasher, you would be able to thought-type.

For those curious about the test video (3, Informative)

Kitkoan (1719118) | about 4 years ago | (#33387750)

His test video he's watching is Sintel [blender.org] which is a free, open source CG movie soon to be finished.

Re:For those curious about the test video (1)

blair1q (305137) | about 4 years ago | (#33387780)

If it's open source, can I make a fork of it where Han shoots first?

Re:For those curious about the test video (2, Informative)

Kitkoan (1719118) | about 4 years ago | (#33387798)

Sure, here is the download link for their previous movie, Big Buck Bunny [bigbuckbunny.org], where you can download the movie in multiple formats and video sizes, and at the bottom is the entire studio backup (over 200 GB) where you can download every part of the movie made and used.

can someone wake me up.... (0)

Anonymous Coward | about 4 years ago | (#33387752)

can someone wake me up when there is official YouTube support and the headset doesn't cost $299+tax

Can we play poker? (1)

blair1q (305137) | about 4 years ago | (#33387770)

When I'm watching a video alone, I don't usually have facial expressions, unless something is insanely funny, or I've got into the scotch.

Re:Can we play poker? (1)

pyrosine (1787666) | about 4 years ago | (#33390136)

An EEG doesn't read facial expressions; its input is brain activity, which the program then translates into emotions. The facial expressions were just an exaggeration so it would look good on camera.

Re:Can we play poker? (1)

maxwell demon (590494) | about 4 years ago | (#33390242)

However, it has been shown that your facial expression does affect your emotions. So if he was making those facial expressions intentionally during the actual test, it may well have affected the results.

Re:Can we play poker? (2, Interesting)

blair1q (305137) | about 4 years ago | (#33395244)

Don't complain to me. That's what the summary said.

As for EEG, I wonder what mine looks like when I'm playing Poker.

I bet it's not too readable. I'm pretty good; mechanistic even when I'm bluffing.

Innovation (1)

retaj (1020999) | about 4 years ago | (#33387822)

Who wants to bet the porn industry is the first to monetize this?

Amalgamate the results (1)

ksandom (718283) | about 4 years ago | (#33387844)

That's awesome. This would become very powerful once these results are amalgamated with those of millions of other viewers. It would also be a very effective way of improving search results, because rather than simply clicking the close button, there could be feedback saying the results sucked. And, combined with another technology, which results sucked.

I remember seeing a documentary about this technology on Beyond 2000 years ago. It's great to see that it has made it into the consumer world.
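A small sketch of the amalgamation idea above: pool per-viewer emotion tags for a video into a crowd-level profile that a search or recommendation system could rank by. The viewer data is invented.

```python
# Sketch of crowd-level aggregation: pool each viewer's emotion tags for one video and
# turn them into a normalized profile. Viewer data here is made up.
from collections import Counter

def crowd_profile(per_viewer_tags):
    """per_viewer_tags: list of lists of emotion labels, one inner list per viewer."""
    counts = Counter(tag for tags in per_viewer_tags for tag in tags)
    total = sum(counts.values())
    return {emotion: round(count / total, 2) for emotion, count in counts.most_common()}

viewers = [
    ["amused", "amused", "neutral"],
    ["amused", "bored"],
    ["neutral", "amused"],
]
print(crowd_profile(viewers))   # {'amused': 0.57, 'neutral': 0.29, 'bored': 0.14}
```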

For those that laughed at it (1, Funny)

Anonymous Coward | about 4 years ago | (#33387878)

Since you enjoyed the video "2girls1cup". EmoRate thinks you might also enjoy...

Re:For those that laughed at it (0)

Anonymous Coward | about 4 years ago | (#33387904)

Since you enjoyed the video "2girls1cup". EmoRate thinks you might also enjoy...

This [meatspin.com] and this. [tubgirl.ca]

Don't explain the joke! (1)

Psaakyrn (838406) | about 4 years ago | (#33388058)

Seriously, the first time you hear a joke is much more effective than subsequent times. So how would you be able to find jokes then?

this is perfect! (1)

ohiovr (1859814) | about 4 years ago | (#33388106)

Now I have the perfect tool to fine tune my propaganda and advertising!

Interesting, but... (2, Insightful)

flimflammer (956759) | about 4 years ago | (#33388126)

I can't say I see the benefit to this sort of system. My facial expression rarely changes throughout movies, unless I laugh over something funny or flinch due to a movie trying to scare me with a loud noise. I'm hardly alone; I showed the video to a few people and they had similar concerns.

I can't see myself giving off a heartwarming smile when I see something happy or frowning when I'm sad. At that point it seems like I'm merely trying to appease the technology to make it work, instead of just doing my natural thing and it picking up on that.

Re:Interesting, but... (1)

BJ_Covert_Action (1499847) | about 4 years ago | (#33389734)

I can't see myself giving off a heartwarming smile when I see something happy or frowning when I'm sad.

Well that's too bad. Where's the fun in watching a movie if you can't get lost enough in it to actually feel something? I mean, sure, it's fiction. That doesn't mean you can't let yourself empathize with the characters, or smile at their triumphs, or beetle your brow at one of their more perplexing decisions. Don't get me wrong, movies are not a substitute for real life, but being unable to watch a performance and not feel anything is ... well ... sad.

I hope you can open yourself up a bit someday, for your own sake and enjoyment.

Re:Interesting, but... (0)

Anonymous Coward | about 4 years ago | (#33391514)

Well that's too bad. Where's the fun in watching a movie if you can't get lost enough in it to actually feel something? I mean, sure, it's fiction. That doesn't mean you can't let yourself empathize with the characters, or smile at their triumphs, or beetle your brow at one of their more perplexing decisions. Don't get me wrong, movies are not a substitute for real life, but being unable to watch a performance and not feel anything is ... well ... sad. I hope you can open yourself up a bit someday, for your own sake and enjoyment.

You are absolutely correct, but please read what he wrote.

Feeling something and giving off an expression are two different things. I can imagine that for Americans it may feel weird, but for me (Finnish) it's completely natural to have a completely emotionless face although there's a lot going on in my head - I guess the same goes for many other cultures, too (Japanese, for example?)

Re:Interesting, but... (1)

BJ_Covert_Action (1499847) | about 4 years ago | (#33393366)

Huh, I hadn't thought of that. Strange, but interesting.

Re:Interesting, but... (0)

Anonymous Coward | about 4 years ago | (#33390002)

Think a little bit further: this kind of device could be a tremendous help for people with physical disabilities. I see the example here as a proof of concept that you can have some level of control on some basic actions, only by thinking. The applications to accessibility are interesting.

Re:Interesting, but... (0)

Anonymous Coward | about 4 years ago | (#33390558)

Maybe you're watching the wrong kind of multimedia? As you said, your facial expressions (rarely) change throughout movies. That means you could automatically tag those rare occasions and later find those bits of multimedia with ease.
Furthermore, you could learn to use that extra input device, so you can get hands-free tagging for multimedia while getting another task done in another thread. It's already dangerous to drive while watching YouTube, even without thumbing in keywords for tags.

Re:Interesting, but... (1)

bjourne (1034822) | about 4 years ago | (#33392992)

The device is an EEG reader; it does not read facial expressions! I don't understand why the submitter had to mention facial expressions just to confuse poor slashdotters. Anyway, the applications you can develop with a consumer-priced and accurate real-time EEG reader are boundless. Marketers could use it to find out which commercials are the funniest, every psychology program at every university could use it for a limitless number of experiments, and you could use it on your porn collection or music collection to automatically and passively get a rating for which porn/music you think is best. If you are one of those people who send links to "funny" videos to your "friends", you could have the EEG reader app do it automatically for you when you watch a video that is funny enough.

If this gadget is as cool as it looks, it is definitely what I will get for Christmas. Though I'm fairly sure it will suffer the exact same problems early accelerometers did, with too low resolution and too much noise to be useful as input devices.
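A tiny sketch of the passive-rating idea above: accumulate an average enjoyment score per file as you play it and sort the collection by it. The per-play score is assumed to come from the headset; here it's hard-coded.

```python
# Sketch of passive collection rating: keep a running average "enjoyment" score per file
# and rank the collection by it. Scores here are hard-coded stand-ins for headset output.
from collections import defaultdict

class PassiveRater:
    def __init__(self):
        self.totals = defaultdict(float)
        self.plays = defaultdict(int)

    def record_play(self, path, enjoyment):
        """enjoyment: score in [0, 1] reported while `path` was playing."""
        self.totals[path] += enjoyment
        self.plays[path] += 1

    def ranking(self):
        return sorted(self.totals, key=lambda p: self.totals[p] / self.plays[p], reverse=True)

rater = PassiveRater()
rater.record_play("music/track_a.ogg", 0.9)
rater.record_play("music/track_b.ogg", 0.4)
rater.record_play("music/track_a.ogg", 0.7)
print(rater.ranking())   # ['music/track_a.ogg', 'music/track_b.ogg']
```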

Facial Muscle vs. Brainwave Commercial Products (1)

billstewart (78916) | more than 3 years ago | (#33399272)

There have been a couple of products like this out there on the market, with varying numbers and locations of sensors. Some of them are doing EEG type detection to try to see what your brain is doing, while others are mostly sensing facial muscles. It's hard to keep track of which products are which, especially when they're initially marketed toward gamers because they think that's a potential market.

If this is the one doing actual brain behaviour detection, and if the SDK weren't so expensive*, it'd be fun to use for things like neurofeedback experimentation. On the other hand, if it's yet another facial muscle detector, that's less interesting to me, but probably easier to program to do useful things for gamers.
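For the neurofeedback angle, a minimal sketch assuming you can get raw samples at some known rate: estimate alpha-band (8-12 Hz) power over a one-second window and print a crude feedback bar. The signal here is synthetic, not from any headset.

```python
# Sketch of one neurofeedback step: estimate alpha-band (8-12 Hz) power from a one-second
# window of raw EEG and print a bar as feedback. The window here is synthetic.
import numpy as np

FS = 128                                   # assumed sampling rate in Hz

def alpha_ratio(window, fs=FS):
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].sum() / spectrum.sum()

t = np.arange(FS) / FS
window = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(FS)   # fake 10 Hz rhythm plus noise
score = alpha_ratio(window)
print("#" * int(score * 40), round(score, 2))                     # crude visual feedback
```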

Re:Facial Muscle vs. Brainwave Commercial Products (0)

Anonymous Coward | more than 3 years ago | (#33445176)

It's both - you always get facial muscle signals in EEG data - it's usually a problem. These guys have turned it into an advantage by classifying the noise into facial expressions. But it also reads real brain signals and makes some cool stuff out of them. You can cast magic spells and it can detect excitement, engagement, boredom, meditation and frustration.
The SDK is not that expensive - $750 for the EEG enabled Research Edition.

Re:Interesting, but... (1)

Lambeco (1705140) | about 4 years ago | (#33396120)

My facial expression rarely changes throughout movies[...].

I would bet that you give off more readable facial expressions than you realize...

Jar Jar Binks and Yoda.... (1)

syousef (465911) | about 4 years ago | (#33388304)

....make the software explode.

I recorded my emotions (1, Funny)

Anonymous Coward | about 4 years ago | (#33388554)

while watching the demonstration video. The result is: 'meh'

UAE: (-1, Offtopic)

Anonymous Coward | about 4 years ago | (#33388874)

United Arab Emorates!

(Yes, this would be funnier in the context of a joke. But I can't think of one just now, and if you were as drunk as I am, you wouldn't care either...)

An emotion reader- just for guys (1)

wombat1966 (1886522) | about 4 years ago | (#33389144)

I see a great market in helping guys understand women. The computer could watch her face and text him, "She's bored. Enough football talk." "She's getting upset. Stop talking about her mother." "Uh-oh. She didn't REALLY want to know if she looks fat." Could revolutionize relationships. Pam http://www.thebrewmag.com/ [thebrewmag.com]

Someone has invented... (1)

Alsee (515537) | about 4 years ago | (#33389306)

Pavlov's Orgasm Recall Navigation.

-

Oh, an emotion detector... (0)

Anonymous Coward | about 4 years ago | (#33392518)

... *that's* a really useful invention

<Watches 2 girls 1 cup> <EmoRate explodes>

Finally... (1)

Modern Primate (1503803) | about 4 years ago | (#33393244)

Finally something I can use to easily sort my porn.

WTF is that video thumbnail? (0)

Anonymous Coward | about 4 years ago | (#33394016)

It kinda looks like an alien frog-man in some kind of B&D outfit, being "dominated". W. T. F.

Re:WTF is that video thumbnail? (1)

osu-neko (2604) | more than 3 years ago | (#33396814)

WTFV (Watch The Friendly Video)

(It's the Sintel [blender.org] trailer...)
