Slashdot: News for Nerds



Computer Spots Fakers Better Than People Do

timothy posted about 4 months ago | from the just-don't-make-eye-contact dept.


Rambo Tribble (1273454) writes "Using sophisticated pattern-matching software, researchers have had substantially better success with a computer than with human subjects in spotting faked facial expressions of pain. [Original, paywalled article in Current Biology] From the Reuters piece: '... human subjects did no better than chance — about 50 percent ...', 'The computer was right 85 percent of the time.'"
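The paper itself is paywalled, so nothing below comes from it; this is just a hedged sketch of the kind of pattern matcher the summary describes. It invents two "facial action unit" intensities as features, assumes (purely for illustration) that real and faked pain differ slightly in their distribution, fits a nearest-centroid classifier, and scores it on held-out faces:

```python
import random

random.seed(0)

# Hypothetical features: each face is summarized by two "action unit"
# intensities (say, brow-lower and eye-tighten). The class means below are
# invented; the study's actual features and numbers are not public here.
def sample(label, n):
    base = (0.7, 0.3) if label == "real" else (0.4, 0.6)
    return [((random.gauss(base[0], 0.2), random.gauss(base[1], 0.2)), label)
            for _ in range(n)]

train = sample("real", 200) + sample("fake", 200)
test = sample("real", 50) + sample("fake", 50)

# Nearest-centroid classifier: average the training features per class,
# then assign each test face to the closer centroid.
def centroid(label):
    pts = [x for x, y in train if y == label]
    return tuple(sum(c) / len(pts) for c in zip(*pts))

cents = {lab: centroid(lab) for lab in ("real", "fake")}

def classify(x):
    def d2(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(cents, key=lambda lab: d2(cents[lab]))

accuracy = sum(classify(x) == y for x, y in test) / len(test)
print(f"held-out accuracy: {accuracy:.0%}")
```

With these invented distributions the classifier lands well above chance, which is the whole point of the comparison in the summary: the machine applies a consistent geometric rule that untrained humans don't.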



good news for Skynet! (-1)

Anonymous Coward | about 4 months ago | (#46553965)

Machines enjoy torturing people so much more than people do!

Re:good news for Skynet! (0)

MightyYar (622222) | about 4 months ago | (#46554407)

If that's the case, they should let it run against Shatner in TOS.

I want authenticity... (-1)

Anonymous Coward | about 4 months ago | (#46553977)

in my snuff films damn it!

In 3 .. 2 .. 1 (5, Funny)

OzPeter (195038) | about 4 months ago | (#46553979)

Outlawed by FIFA!

Re:In 3 .. 2 .. 1 (2)

ericloewe (2129490) | about 4 months ago | (#46554165)

But, but... The referee's humanity is part of the game! It's not right to question his judgement with gadgets!

Re:In 3 .. 2 .. 1 (0)

Anonymous Coward | about 4 months ago | (#46554409)

I am imagining how they produced the images of faces of people who were actually in pain.

"Thanks for volunteering. We will pay each of you $10 to let us dump this hot water on you and snap a picture of your face. For science!"

Re:In 3 .. 2 .. 1 (2)

Marginal Coward (3557951) | about 4 months ago | (#46555419)

I am imagining how they produced the images of faces of people who were actually in pain.

They turned the dial on the Milgram Experiment [] up to 11. Or so it seems...

Re:In 3 .. 2 .. 1 (1)

TheRealHocusLocus (2319802) | about 3 months ago | (#46557021)

They turned the dial on the Milgram Experiment [] up to 11. Or so it seems...

1. Please continue.
2. The experiment requires that you continue.
3. It is absolutely essential that you continue.
4. You have no other choice, you must go on.

I have programmed my talking alarm clock to say this.
Strong coffee with the consistency of pudding also helps.
The experiment goes ever on and on.

Re:In 3 .. 2 .. 1 (0)

Anonymous Coward | about 4 months ago | (#46554451)

Like FIFA would need to outlaw it. They don't use technology by default.

I'm pretty sure there were clocks around in the 1800s, yet they still have magic time at the end of each half.

Easier to bribe without technology being utilized.

It wasn't the computer (2)

fullback (968784) | about 4 months ago | (#46553989)

The people who programmed "the computer" were better.

Re:It wasn't the computer (2, Insightful)

Trepidity (597) | about 4 months ago | (#46554009)

I'd say it's a mixture of the two. The computer can't discriminate these facial features without people to program it, but the people can't discriminate these facial features on their own, either, because we aren't good at applying this kind of analysis ourselves (even if we can come up with what it ought to be). The existence of a computer isn't enough, and the existence of the people is also insufficient, to carry out the task. So I'd call it a collaborative activity.

Re:It wasn't the computer (1)

Oligonicella (659917) | about 4 months ago | (#46554045)

I'd call it people using tools. Only if the computer is intelligent with motivation does it become collaboration.

Re:It wasn't the computer (2)

Trepidity (597) | about 4 months ago | (#46554183)

I'd say the computer is pretty intelligent. For one, it's better at recognizing facial expressions than people are! ;-)

I mean, if you hired a textile worker, nobody would object if you talked about the worker being "good" or "bad" at sewing, even though they didn't design the sewing machinery and aren't exhibiting any particular creativity, but rather are just following instructions.

Re:It wasn't the computer (1)

Anonymous Coward | about 4 months ago | (#46554209)

You are saying this yourself: the textile worker is neither creative nor intelligent, he is just skilled. What's so difficult to grasp? Can't the computer be skilled at pattern matching? Doesn't mean it's suddenly intelligent.

Re:It wasn't the computer (1)

Oligonicella (659917) | about 4 months ago | (#46554609)

"I'd say the computer is pretty intelligent." Then you are wrong. Turned on, with the OS allowed to settle and no tasks assigned, the computer would sit for as long as power and hardware allowed and do nothing. That is not intelligence.

Re:It wasn't the computer (0)

Anonymous Coward | about 4 months ago | (#46555347)

You mean like couch potatoes? I'd say that you've hit the nail on the head...

Re:It wasn't the computer (1)

Dog-Cow (21281) | about 3 months ago | (#46556561)

Why do you equate a tolerance for boredom with a lack of intelligence?

Re:It wasn't the computer (1)

Culture20 (968837) | about 4 months ago | (#46554161)

It's all the computer. The people making/training the system could be only 50% accurate at pain discrimination, but the computer can be trained up to 85% because it's a neural net doing the computer-vision matches, and it gets trained. The computer is shown a billion videos of real pain and told it is real pain, then a billion videos of fake pain and told it's fake pain. A human might be able to do it at 85% (or better) if they looked at two billion videos to learn from. We shouldn't expect a human's instinctual awareness of pained facial features to be better than a trained professional's awareness, whether it's a human or a computer doing the analysis.

Re:It wasn't the computer (3, Insightful)

Artifakt (700173) | about 4 months ago | (#46554389)

There's a good precedent for your argument that this is a question of instinctual skill vs. trained skill, but it doesn't take anything like a billion examples to train a person in the example I'm considering. A very common way to teach health care personnel to recognize Fetal Alcohol Syndrome is to give them an album with several hundred photos of people in various life stages, all suffering from FAS. This method has worked since the time when the photos were black and white, and in fact using color shots or video footage doesn't seem to have any impact on success or the number of examples needed. Once someone is trained that way, the success percentage is in the very high 90s, and stays that way, at least for a typical career. Similar methods are used for other diseases. For example, most people have learned to spot Down's syndrome from just a few examples, but where the syndrome produces only some of the usual appearance effects, spotting the 'borderline cases' with high accuracy can be taught this same way, usually taking about 15 minutes.

Re:It wasn't the computer (1)

Oligonicella (659917) | about 4 months ago | (#46554019)

Indeed. It would also be interesting to know how much time and, more importantly, how many iterations it took to get that software to that state. Comparing that program with the run-of-the-mill Joe or Jane isn't really fair. I'd like to see the program compared to doctors or medics.

Re:It wasn't the computer (0)

Anonymous Coward | about 4 months ago | (#46554071)

Doctor Phlox can inflict as much pain as he likes. It's totally ethical.

Re:It wasn't the computer (3, Informative)

ColdWetDog (752185) | about 4 months ago | (#46554147)

That would be an interesting test. I agree, 25 random volunteers really isn't all that high of a bar. Do the same with some experienced clinicians and see what happens.

Moreover, pain is a pretty complex issue. There is acute, nociceptive (pain receptor) pain, as in the test. Chronic pain is quite a bit different. Visceral pain (from nerve fibers in the abdomen) is different still.

I think that a computer-assisted study of emotions has the potential to improve human performance in decoding those emotions, but this is clearly in its infancy. I don't think there will be an app for that in the near future (a real one, that is).


mosel-saar-ruwer (732341) | about 4 months ago | (#46554213)

What did they use to teach the computer?

Did they torture a bunch of undergraduates with cattle prods, and then load the photographs of the undergraduates screaming in pain into their great big giant neural network of "true positives"?

Versus photographs of other undergraduates, deeply immersed in their favorite pornography websites, as their "true negatives"*?

And who were the human beings who were competing against the computers?

Random joe sixpacks plucked right off of the street, or folks like Military Medics and Civilian EMTs and Oncology Nurses and Trauma Surgeons, who have a lifetime of experience watching true pain in their patients and who know exactly what it looks like?

*And God only knows how they would categorize the undergraduates who like to hang out at sites like, watching naked people being tortured with cattle prods.


Artifakt (700173) | about 4 months ago | (#46554411)

While you are at it, what happens if they test specifically with medical personnel who have been told they need to spot people faking pain to get their opiate fix, and to avoid at all costs encouraging their addiction?

also: it wasn't pain (1)

globaljustin (574257) | about 4 months ago | (#46555769)

It was ***ice water*** not actual pain.

from TFA:

In the second, the volunteers immersed an arm in a bucket of frigid ice water for a minute, a genuinely painful experience, and were given no instructions on what to do with their facial expressions.

For the first one, they told them to ***make a fake painful face***

I'd say humans did just fine...better than the researchers who designed the study!

The computer just recognized the patterns it was optimized for on their faces

another AI hype/fail

Re:It wasn't the computer (1)

ras (84108) | about 4 months ago | (#46560657)

The people who programmed "the computer" were better.

You don't say why. But I'm guessing that if I follow your logic, the people who programmed Deep Blue were better at chess than Deep Blue itself, and the people who programmed Watson were better at Jeopardy than Watson. Since computers did those tasks better than the best people in the world, clearly they weren't, and by a large margin.

Technically, computers can do some things better than us. For example, they can store a series of images of someone responding to pain perfectly for long periods of time. Humans can't. A direct consequence is that humans must make their decision on a facial expression within a second or so. Computers, on the other hand, can take as long as they want. So if you give them 20 minutes, they can multiply whatever power they have at hand by over 1,000 (since 20 minutes is over 1,000 seconds).

The reality is that the issue isn't the amount of compute power a computer can bring to the table. It is true humans have huge brains, but they have to be split over many tasks. For any single task it is now easy to throw orders of magnitude more compute power at it than a human can. So the problem isn't that the brain is more powerful than a computer; the problem is the programmers figuring out how to do the task. Once they do that, well, Facebook is now better at recognizing faces in photographs than a human is.

Hollywood (4, Interesting)

AlphaWolf_HK (692722) | about 4 months ago | (#46554059)

Perhaps watching faked facial expressions on TV and whatnot has dulled our ability to distinguish them?

Re:Hollywood (1)

rubycodez (864176) | about 4 months ago | (#46554437)

because people never faked emotions or facial expressions beforehand?

half the population has been faking orgasms since the dawn of humanity

Re:Hollywood (3, Interesting)

Artifakt (700173) | about 4 months ago | (#46554441)

In the 1970s there was a book called "Four Arguments for the Elimination of Television", or something like that. One of the arguments was that the limited image quality of the 512-line scan made even very poorly faked emotions hard to distinguish from the real thing, and so children who got their learning examples of human expressions from TV would have a hard time telling who was really feeling emotions and who was just faking them. The author also claimed that emotions such as rage, fear, and strong suffering would come through better than subtler emotions such as boredom, fondness, or compassion, so TV scripts would come to emphasize the emotions that at least somewhat worked and ignore the rest. Perhaps there's something to these ideas.

Type I and Type II errors? (1)

Jeremy Erwin (2054) | about 4 months ago | (#46554061)

My guess is that this would be useful for detecting people who only want narcotics to sell or to use recreationally. But a computer algorithm that falsely identifies pain sufferers as fakers would be downright cruel, whereas one that fails to identify fakers is merely less useful for drug control efforts.

Re:Type I and Type II errors? (1)

ericloewe (2129490) | about 4 months ago | (#46554175)

I always hated how both types of error were given the incredibly opaque designations of "Type I" and "Type II"...
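For anyone who shares the complaint: the opaque names map onto a confusion matrix in a fixed way. A tiny illustration with invented counts (nothing here is from the study), where "positive" means "flagged as faking":

```python
# Invented counts for a hypothetical "faking pain?" test on 200 faces.
true_positives = 80    # fakers correctly flagged
false_positives = 10   # genuine pain wrongly flagged  -> Type I error
false_negatives = 20   # fakers missed                 -> Type II error
true_negatives = 90    # genuine pain correctly passed

# Type I rate: fraction of genuine-pain faces that get wrongly flagged.
type_i_rate = false_positives / (false_positives + true_negatives)

# Type II rate: fraction of fakers the test fails to catch.
type_ii_rate = false_negatives / (false_negatives + true_positives)

print(f"Type I (false alarm) rate: {type_i_rate:.0%}")
print(f"Type II (miss) rate: {type_ii_rate:.0%}")
```

"False alarm" and "miss" carry the meaning the Roman numerals hide, which is exactly the grandparent's point about cruelty: in this setting the Type I errors are the genuinely suffering patients who get refused.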

Pain versus pleasure (2)

swb (14022) | about 4 months ago | (#46555863)

I often wonder how well our medical establishment has studied the euphoric effect of opiates and how they contribute to or even in some cases surpass the functional pain relief.

I had a traumatic hand injury two months ago which involved a partial amputation of one of my fingers. I experience a lot of "pins and needles" nerve stimulation and some false limb pain (pressure or stabbing-type sensation where I have no finger) and generalized fatigue in my hand. I take small (5 mg) doses of oxycodone once or twice a day and I "feel better" but without necessarily specific reduction of any one kind of pain -- I still feel it, but it bothers me less.

I don't think it's an addiction response; some days I take zero and don't feel any classic withdrawal symptom I've ever read about. But I sometimes wonder if the pain reduction is really the result of interaction with my pain, or because the euphoric nature of the drug just makes me feel overall better, raising my psychological tolerance of pain without actually reducing the pain itself.

I wonder philosophically if it "matters": if the drug produces a euphoria that allows me to tolerate the pain, is that somehow less legitimate than some functional reduction of pain that may be the drug's principal purpose? What is the effect and what is the side effect? Or is it just a question of dose versus ancillary risks (whether it's addiction or some other more organic disturbance, e.g., skin rash, etc.)?

Empathy, inexperience (2)

Dan East (318230) | about 4 months ago | (#46554099)

Computers do not feel empathy. Perhaps the empathy humans feel when seeing others in pain overrides any minor inconsistencies on the visual side of things. Further, humans do not "analyze" one another's faces to identify an emotion. We see faces (even if it's the same ones over and over) so many times that it's just an automatic, generalized classification. If people often faked painful facial expressions, and there were some strong motivation to identify that fact, then I'm sure we would be more adept at it. The only time I can think of in a real-world setting where people fake painful facial expressions is in jest, to be funny or "sarcastic" in some way (not counting football (aka soccer) matches). Thus the overall context totally reveals the expression to be fake, and the visual side of things is just an afterthought.

Re:Empathy, inexperience (1)

Jmc23 (2353706) | about 4 months ago | (#46554159)

Some of us do analyze faces. Especially if the refresh rate on your eyes is fast enough, you pick up micro-emotions a lot more easily.

I used to always wonder why people lied so much and why others couldn't tell that they were lying.

Then I found out how self-involved most people are.

Re:Empathy, inexperience (0)

Anonymous Coward | about 4 months ago | (#46554265)

I used to always wonder why people lied so much and why others couldn't tell that they were lying.

No, they can tell they're being lied to, but they enjoy playing the lying game so much that they lie about playing it. People are literally full of shit.

Re:Empathy, inexperience (1)

Anonymous Coward | about 4 months ago | (#46554369)

I always find myself wondering why people think I'm lying, even though I'm not. I'm not some liar, and even people I've never met do this.

So, I'd say humans are pretty bad at telling if others are lying. They get so sickeningly sure that there are arbitrary 'signs' that people give off when they're lying, and, as if not wanting to be wrong, they won't ever admit that they can't magically detect if someone is lying just because they do something. Annoying cretins.

Re:Empathy, inexperience (0)

Anonymous Coward | about 4 months ago | (#46554997)

People are generally convinced that their assumptions are true regardless of evidence. If people have a preconceived notion that you have an insincere face, nothing you say or do will ever convince them otherwise. This makes life difficult if you ever want to find a job or stay out of jail: interviewers will reject you because you look like a liar, and cops will put you in jail because you look like you belong there.

Re:Empathy, inexperience (0)

Anonymous Coward | about 4 months ago | (#46554485)

Having a good time there up on your high horse? You sound kinda self-involved yourself.
And the claim about having a fast refresh rate to pick up "micro-emotions"... have any links to back that up?

I think the parent is spot on though. Empathy and emotion are probably why most people have problems reading faked emotions. Or rather, one of the reasons.
That's probably also why psychopaths are much better at reading other people than your average person, even if their own emotional life is much shallower. They are more like the computer in a way. They will just see you for what you are, without any bias coming from empathy or emotions.

ice water != pain (0)

globaljustin (574257) | about 4 months ago | (#46555779)

another important factor to note: they didn't experience actual pain in the control

from TFA:

In the second, the volunteers immersed an arm in a bucket of frigid ice water for a minute, a genuinely painful experience, and were given no instructions on what to do with their facial expressions.

For the first, they told the subjects to make a **fake** painful face.

So: a fake painful face vs. a hand in ice water with no instructions... vs. a facial recognizer optimized on their expressions.

IMHO the subjects performed better than **the researchers themselves**

This is atrocious experimental design.

85% not good enough (1)

Culture20 (968837) | about 4 months ago | (#46554113)

Still not good enough to easily deny someone pain meds.

Re:85% not good enough (1)

wisnoskij (1206448) | about 4 months ago | (#46554151)

The junkies will break their own arms to get their fix, so I doubt this method, even if it were 100% accurate, would be useful for that. Also, in general, you do not have to demonstrate your pain to get meds. You tell a doctor you are in pain and he gives you a prescription; 90% of people who get them are not outwardly displaying their discomfort.

Re:85% not good enough (1)

GodfatherofSoul (174979) | about 4 months ago | (#46555885)

If you run that software a few times it does. So, someone comes in with fake pain twice for meds and gets flagged both times: there's about a 98% chance the flag is legit. Three times and there's a 99.7% chance they're faking it. You're not going to stop the one-timer, but you'll nail the addict.
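The arithmetic behind those numbers is worth making explicit. Treating each reading as an independent test that is wrong 15% of the time (a strong assumption for repeat visits by the same person), the chance that n consecutive "faking" flags are all mistaken is 0.15^n:

```python
error = 0.15  # per-test error rate for an 85%-accurate classifier

# Assuming independent errors across visits, the probability that n
# consecutive "faking" flags are ALL wrong shrinks geometrically.
for n in (1, 2, 3):
    all_wrong = error ** n
    print(f"{n} flag(s): {1 - all_wrong:.1%} chance at least one is right")
# n=2 gives ~98% and n=3 gives ~99.7%, matching the figures above.
```

The independence assumption is doing a lot of work here: a patient whose face reliably confuses the classifier would fail every visit, so the real-world numbers would be less rosy.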

Re:85% not good enough (1)

zippthorne (748122) | about 3 months ago | (#46557509)

You're not going to stop the one-timer, but you'll nail the addict.

And.. then what?

Fake money too (1)

Lacompa Cida (3396233) | about 4 months ago | (#46554271)

Banks all over the world use scanners and computers to spot fake money too, because they don't think humans can do it as well.

Bullshit (1)

Karmashock (2415832) | about 4 months ago | (#46554367)

I would really want to go over the photos they were showing people. I can think of three different ways this study was contaminated.

Re:Bullshit (1)

perih60 (2846125) | about 4 months ago | (#46581023)

I would really want to go over the photos they were showing people. I can think of three different ways this study was contaminated.

For personal reasons I am interested in this subject. Could you expand on what you have contributed so far, PLEASE!

Re:Bullshit (1)

Karmashock (2415832) | about 4 months ago | (#46582115)

Again, it's extremely easy to bias a study like this. Here are some of my guesses as to how it might have been biased, off the top of my head and in no particular order:

Judging pain from photographs isn't a reasonable test, especially if you spring it on someone cold, that is, without a little training. Ten or twenty minutes is all a person needs, but if you just throw it at them out of nowhere they probably won't adapt to it instantly.

Nearly always, when we see people in pain in photos, it is an act. It's entertainment. It's the soldier on the field telling his sergeant to make sure his mother knows he loves her. It's a show.

The fact that they did this through photos is the first red flag.

The second is that they don't describe how the samples were prepared. For example, the images of pain could be either subtle or exaggerated. They could have little Mona Lisa grimaces or could have their mouths open in agony. And either one could be bullshit.

I am generally dubious of statistical studies unless I know they were conducted by multiple, compartmentalized groups, so that the people collecting the data didn't know the hypothesis of the people analyzing it. There are two types of statistics: scientific statistics, which most scientists never actually study, contrary to what the context suggests... and bullshit. Nothing else.

Complete with control groups. The whole deal.

Furthermore, the computer was probably trained on those photos or photos very similar to them, whereas the random people brought in to see them were just expected to guess right instantly.

It's nonsense.
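That last objection, evaluating a model on material it was trained on, is the classic train/test leakage problem. A toy sketch (synthetic data, a "classifier" that just memorizes; nothing here is from the study) shows how leakage inflates scores even when there is genuinely nothing to learn:

```python
import random

random.seed(1)

# Labels are pure coin flips: there is genuinely nothing to learn.
data = [(i, random.choice(["real", "fake"])) for i in range(100)]
train, test = data[:80], data[80:]

# A "classifier" that just memorizes its training set.
memory = dict(train)

def predict(x):
    # Falls back to a fixed guess on faces it has never seen.
    return memory.get(x, "real")

def accuracy(split):
    return sum(predict(x) == y for x, y in split) / len(split)

print(f"on training data: {accuracy(train):.0%}")  # perfect, by construction
print(f"on held-out data: {accuracy(test):.0%}")   # roughly chance
```

A study reporting only the first number would look impressive and mean nothing, which is why a fair comparison has to score the computer on faces it never saw during training.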

Utter Rubbish (0)

Anonymous Coward | about 4 months ago | (#46554455)

This article is a GREAT example of how betas are fooled by the facts. The LIE of the article is the conclusion: that the computer has better 'PATTERN'-matching skills. But now let's consider things again.

Microsoft's NSA domestic spy platform, the Xbox One, was launched with a VERY specific claim: that it could monitor the heartbeat of people in the room. Sounds outlandish until one understands that a little bit of DSP processing combines with the fact that the cameras of the NSA-designed sensor block, the Kinect 2, see well in infrared.

The relevance is this: when a computer 'looks' at a face, it has access to simple data that the human may not. This has NOTHING to do with 'clever' algorithms. And then, of course, a computer can be pre-programmed with simple rules that, if known to a human observer, would give them the same success rate or better.

Let's say that a human in pain has a simple 'tell' (poker term; Google it). Even if a person could be educated to look for that tell, that doesn't mean random subjects have this knowledge. On the other hand, the ANYTHING BUT CLEVER computer software WILL have been given knowledge of this tell. So, to a beta, the computer will APPEAR to be clever.

So what would be a valid test of the software to impress an alpha? Simple: well-trained humans vs. the computer. Why wasn't the test conducted like this? After all, the computer is supposedly demonstrating a SKILL, so the comparison should be against humans with a similar skill, NEVER against a randomly 'chosen' bunch of unskilled people.

The promotion of this software GARBAGE is the usual con to bilk money out of dimwitted investors. And its presence on Slashdot is the usual "kerching" pay-for-play promotion.

But can it...... (0)

Anonymous Coward | about 4 months ago | (#46554475)

determine that pro wrestling is fake?

Didn't we know this already? (0)

Anonymous Coward | about 4 months ago | (#46554567)

Wasn't it already postulated that humans are inherently bad at detecting fake facial expressions (and by extension lies) in order to promote better social cohesion?

Being less good at detecting lies meant more cooperation and trust (the irony), which led the highly perceptive (and less cooperative) to simply... die off.

Most lies are white lies. We probably have inherently good social intentions, but perhaps the truth is just in the way?

It's interesting to imagine that the development of these traits could have led to strange mannerisms in our contemporary society...
We always ask each other how it's going and reply with "Great! How are you?" Remnants of a social acceptance ritual, or something more?

Perhaps in a previous, less herd-like animal ancestry, millennia ago, your very life would have been highly dependent on being able to perceive the primal intentions of others. It's only humorous that fitness would be determined by the exact opposite trait here in the present.

Not meant to fool computers? (1)

OneAhead (1495535) | about 4 months ago | (#46555053)

Just a thought, but might this simply indicate that fake facial are trained/wired to fool humans (and the attributes a human watcher would consciously or subconsciously use to judge the expression), not computers?

Re:Not meant to fool computers? (1)

OneAhead (1495535) | about 4 months ago | (#46555131)

Damn it, that should have been "fake facial expressions". Where is the "edit" button on this thing?

Re:Not meant to fool computers? (0)

Anonymous Coward | about 4 months ago | (#46604329)

I do believe Dr Sofia Koutsouveli was correct in her interpretation of general relativity involving discrete packets of time. It's a shame her theory didn't make it to Nature.

Humans evolved to trick humans (0)

Anonymous Coward | about 4 months ago | (#46555123)

Humans did not evolve to trick computers

"Guilty" expressions (1)

joe_frisch (1366229) | about 4 months ago | (#46555181)

This technology can get quite dangerous if it becomes good at detecting people with guilty expressions.

The End of Civilization (0)

Anonymous Coward | about 4 months ago | (#46555337)

forget GMO crops, asteroid collisions, global warming, etc. this will be the end of civilization. machines everywhere that detect and call-out deception. lies are the foundation of civilization.

As a Vicodin addict. (0)

Anonymous Coward | about 4 months ago | (#46555349)

Fuck you! :'(

Practical Uses (0)

Anonymous Coward | about 4 months ago | (#46555693)

This will get used when dispensing pain medication, doctors will be compelled to use such a mechanism to vet patients (either by law or by the threat of lawsuits and indictments). Next step will be another flavor of lie detector.
In practice this is of dubious value, but, since it's better than nothing (read: relying on human intuition) it will be widely adopted.

Did they train the humans? (0)

Anonymous Coward | about 4 months ago | (#46556069)

I'm sure that the computer got tons of training, but the people were probably just people. The results might be enormously different if it were preceded by some careful instruction on how to perform the task.

random people, what about experts (1)

locopuyo (1433631) | about 4 months ago | (#46556079)

So random people weren't very good at it, but what about experts in this sort of thing, like a Sherlock Holmes? How would they compare to the computer?

What else is new? (0)

Anonymous Coward | about 4 months ago | (#46556119)

If people could identify pain and other basic emotions reasonably accurately, the bulk of the porn industry would be out of a job.

computers and pain (1)

perih60 (2846125) | about 4 months ago | (#46580991)

Due to my having an illness that is at times VERY painful, and because my GP has known me for over a decade, he does not just look at my FACE! A couple of years ago he told his secretary to call an ambulance because of my STANCE, and that was before I even had a chance to say anything to him. Because I try not to worry my family, I have learned not to show how much pain I am in!! Another thing is that most people do not seem to know that a person in a lot, and I do mean a lot, of pain, or in shock, behaves differently. For example, a person I know had an ulcer burst. A psychologist needed to know if he had his pension card when he called an ambo for him, and because the person who was ill had problems answering, the other bloke was getting angry! I squatted down (the ill person was sitting) and asked if he had the card that he had to show to a bus driver to get the concession rate; the answer was "yes" and came straight away! The point I am trying to make is that looking at someone's face, in pain or not, is only a small part of the assessment.
