
Wolfenstein Ray Traced and Anti-Aliased, At 1080p

timothy posted more than 2 years ago | from the sounds-like-an-in-n'-out-burger-order dept.

Graphics 158

An anonymous reader writes "After Intel showed its research demo Wolfenstein: Ray Traced on tablets, the latest progress at IDF focuses on high(est)-end gaming, now running at 1080p. Besides image-based post-processing (HDR, depth of field), there is now also an implementation of a smart way of calculating anti-aliasing, using mesh IDs and normals and applying adaptive 16x supersampling. All of that is powered by the 'cloud': a server holding eight Knights Ferry cards (256 cores / 1024 threads total). A lot of hardware, but the next iteration of the 'Many Integrated Core' (MIC) architecture, named Knights Corner (and featuring 50+ cores), might be just around the corner."



first ray trace (2)

russ1337 (938915) | more than 2 years ago | (#37412194)

first ray trace...

now where is a decent link.

Re:first ray trace (0)

Anonymous Coward | more than 2 years ago | (#37412262)

I was about to say the same thing. I bounced through 3 slashdot articles before I reached a "Problem Loading Page" error.

Re:first ray trace (1)

Anonymous Coward | more than 2 years ago | (#37412504)

cache:http://blogs.intel.com/research/2011/09/wolfenstein_gets_ray_traced_-_2.php [googleusercontent.com]

Oh, look, there's something Bing can't do.

Re:first ray trace (1)

Dahamma (304068) | more than 2 years ago | (#37412690)

Mostly useless, though. None of the image links work...

Re:first ray trace (2)

TheRaven64 (641858) | more than 2 years ago | (#37412870)

On the other hand, the wayback machine's version does have working images [archive.org]. And it doesn't use your page view to harvest information about you.

Re:first ray trace (0)

Anonymous Coward | more than 2 years ago | (#37413420)

But it has been slashdotted...

Re:first ray trace (3, Funny)

suso (153703) | more than 2 years ago | (#37412652)

That was like the ultimate combination of keywords on slashdot. That server had no chance.

Re:first ray trace (0)

Anonymous Coward | more than 2 years ago | (#37414146)


A video about Wolfenstein, narrated by a German engineer, and posted on 9/11.

Re:first ray trace (1)

Anonymous Coward | more than 2 years ago | (#37414268)

Another IDF has started and we are excited to show our latest progress. Since previous demos we have enhanced our cloud-based setup, which used four Knights Ferry (Intel MIC) cards as the "cloud," to now run Wolfenstein: Ray Traced on eight cards in a single machine. In order to utilize this huge amount of horsepower, we are now running our demo for the first time at 1080p.


As additional eye candy, we included several post-processing special effects (thanks to Ben Segovia). Just to clarify: these are not specific to ray tracing and have been seen in some games already. They operate on the pixels of the rendered image (not on the 3D scene), in our case directly on the Knights Ferry card. They can improve the perception of the rendered scene dramatically.

        Depth of Field: The effect is well known to photographers. If we want the viewer to focus on a certain area in the picture, the less relevant parts can be blurred. The object of interest stays sharp and attracts the main attention.

        Depth of field on/off (3% performance difference)
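The image-space approach described here can be sketched roughly as follows (a hypothetical Python illustration, not Intel's implementation): each pixel's blur radius is driven by how far its depth lies from the chosen focal depth.

```python
import numpy as np

def depth_of_field(image, depth, focal_depth, max_radius=3):
    """Image-space depth of field: pixels far from the focal depth
    get a larger box blur; in-focus pixels stay sharp."""
    h, w = depth.shape
    out = np.empty_like(image, dtype=float)
    for y in range(h):
        for x in range(w):
            # Blur radius grows with distance from the focal plane.
            r = int(min(abs(depth[y, x] - focal_depth) * max_radius, max_radius))
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = image[y0:y1, x0:x1].mean()
    return out
```

A real post-process would use a smoother kernel than a box blur, but the principle is the same: only the depth buffer decides how much each pixel is softened.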

        HDR Bloom: When, in reality, we step from a dark room into the bright outdoors, our eyes adjust over a few seconds to the new brightness. The same can be observed with digital (video) cameras, which mimic this behavior and adjust the brightness spectrum to produce a pleasant-looking image. While doing this, cameras might produce a bloom that can also "bleed" into other objects.

        Overbright scene with HDR on/off (2% performance difference)

        Bloom effect
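A minimal sketch of such a bloom pass (hypothetical Python, not the Knights Ferry code; threshold and strength are made-up parameters): clamp out the over-bright portion of the frame, blur it, and add it back so bright areas bleed into their surroundings.

```python
import numpy as np

def hdr_bloom(image, threshold=1.0, strength=0.5):
    """Simple bloom: isolate the over-bright part of the frame,
    blur it, and add it back onto the original image."""
    bright = np.maximum(image - threshold, 0.0)
    # 3x3 box blur via shifted copies (a stand-in for a Gaussian).
    blurred = np.zeros_like(bright)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            blurred += np.roll(np.roll(bright, dy, axis=0), dx, axis=1)
    blurred /= 9.0
    return image + strength * blurred
```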

        Inter-lens reflections: While camera manufacturers try to avoid lens flares, computer games and movies often add them as an artistic element. In this implementation, several smaller-sized versions of the image, each shifted to a specific color (e.g. green, blue, and orange), are blended into the original image.

        Subtle (image-based) inter-lens reflections (0.1% performance difference)
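That blend can be sketched as follows (a toy Python illustration; the tint colors, scale, and strength values are made-up parameters, not Intel's):

```python
import numpy as np

def lens_reflections(image, tints=((0.2, 1.0, 0.2), (0.2, 0.2, 1.0)),
                     scale=2, strength=0.1):
    """Blend smaller, color-tinted copies of the frame back into it,
    mimicking reflections between lens elements."""
    h, w, _ = image.shape
    out = image.copy()
    for tint in tints:
        # Naive downscale by integer striding (stand-in for real resampling).
        small = image[::scale, ::scale] * np.asarray(tint)
        sh, sw, _ = small.shape
        # Center the tinted copy and additively blend it in.
        y0, x0 = (h - sh) // 2, (w - sw) // 2
        out[y0:y0 + sh, x0:x0 + sw] += strength * small
    return np.clip(out, 0.0, 1.0)
```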

Another step we are taking for the first time is a smart way of doing anti-aliasing (thanks to Ingo Wald and Ben Segovia). There are different ways to do anti-aliasing. Most of them work pretty much brute-force and therefore invest additional calculations in areas where the improvement might not be noticeable. Our implementation is applied after the image has been rendered. As ray tracing easily allows shooting just a few extra rays for refinement, we analyze each pixel against two factors to decide whether it requires more anti-aliasing:

        The angle of the polygon that got hit at that pixel
        The polygon mesh ID of that object

If there is high enough variation in the angle, or a different mesh ID is found, we shoot 16 more rays (supersampling) for that specific pixel and average the resulting colors into that pixel. (Please note that the difference is best seen in the full-sized images that appear after clicking the thumbnails.)

Courtyard view: Smart Anti-Aliasing off

Courtyard view: Smart Anti-Aliasing on (59% performance difference)

Close-up on cable: Smart Anti-Aliasing off

Close-up on cable: Smart Anti-Aliasing on (32% performance difference)
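Putting the two criteria together, the per-pixel decision and refinement loop might look something like this (a hypothetical Python sketch; `shade` stands in for tracing one ray through a sub-pixel position, and the angle threshold is a made-up value):

```python
import numpy as np

def needs_refinement(normals, mesh_ids, y, x, angle_threshold=0.9):
    """Flag pixel (y, x) for supersampling using the two criteria above:
    the surface normal at the hit point and the mesh ID of the hit object."""
    h, w = mesh_ids.shape
    for dy, dx in ((0, 1), (1, 0), (0, -1), (-1, 0)):
        ny, nx = y + dy, x + dx
        if not (0 <= ny < h and 0 <= nx < w):
            continue
        # A different object under the neighboring pixel -> geometry edge.
        if mesh_ids[ny, nx] != mesh_ids[y, x]:
            return True
        # Normals diverge strongly -> silhouette or sharp crease.
        if np.dot(normals[y, x], normals[ny, nx]) < angle_threshold:
            return True
    return False

def adaptive_aa(shade, normals, mesh_ids, image, samples=16):
    """Shoot `samples` extra jittered rays only at flagged pixels and
    average them in; all other pixels keep their single-sample color."""
    rng = np.random.default_rng(0)
    out = image.copy()
    h, w = mesh_ids.shape
    for y in range(h):
        for x in range(w):
            if needs_refinement(normals, mesh_ids, y, x):
                jitter = rng.random((samples, 2)) - 0.5
                colors = [shade(y + jy, x + jx) for jy, jx in jitter]
                out[y, x] = np.mean(colors + [image[y, x]])
    return out
```

The appeal for ray tracing is that the 16 refinement rays are only paid for at the small fraction of pixels that sit on edges, rather than across the whole frame as in brute-force supersampling.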

For future implementations, more criteria could be added, such as the color of the pixel (e.g. imagine an almost-black spot in the picture: aliasing will not be noticeable there) or the color difference between neighboring pixels. Further, before comparing those colors, MLAA could be used to reduce aliasing first, and only certain areas could then be refined by shooting new rays. Tweaking the number of additional rays to 4 or 8 might lead to a better performance/quality tradeoff.

We would be happy to show you the real-time demo at IDF at the Intel Labs pavilion, booth 5080 in the exhibition hall.

Additional thanks to Ram Nalla, Nathaniel Hitchborn, Sven Woop, Alexey Soupikov, Alexander Reshetov.

The system that holds the eight double-width PCI Express cards was provided to us by Colfax International. Thanks!


Hmm... (1)

Lyrata (1900038) | more than 2 years ago | (#37412224)

Slashdotted already? I think not!

Re:Hmm... (1)

MichaelKristopeit412 (2018834) | more than 2 years ago | (#37412436)

not very impressed by intel hardware and their ability to process large datasets

Slashdotted? (0)

Anonymous Coward | more than 2 years ago | (#37412236)

They should use one of the MIC architecture systems to host their website...

Get some artists already (1)

Goaway (82658) | more than 2 years ago | (#37412260)

I think Intel would find it easier to get people excited about this technology if they actually used it to render something that looked interesting, or at the very least looked good at all.

Re:Get some artists already (0)

Anonymous Coward | more than 2 years ago | (#37412410)

Intel is wooing game studios, not the public.

Re:Get some artists already (1)

Goaway (82658) | more than 2 years ago | (#37414530)

Wooing them by showing them graphics that look worse than pretty much everything else on offer? How is that supposed to work out for them?

Re:Get some artists already (2)

im_thatoneguy (819432) | more than 2 years ago | (#37414684)

I think Intel is wooing the Wall Street Journal more than anyone else.

Intel: "Look at these amazing graphics!"
WSJ: "Wow! Raytraced you say?
Intel: "Yeah the future!"
WSJ: "Ooooo! Buy stock!"

Re:Get some artists already (0)

Anonymous Coward | more than 2 years ago | (#37412864)

Yes, I don't understand why they don't do something like full radiosity, or something that actually looks like real-world stuff, like the renderers of several commercial offerings for 3D apps do (something like Maxwell Render). Then it would be cool. This Wolfenstein ray tracing stuff looks like shit.

Ray Traced on Tables (1)

skovnymfe (1671822) | more than 2 years ago | (#37412292)

But it's not ray traced on tables, is it? It's ray traced on a 256 core system and then somehow displayed on a tablet. Or am I reading this summary completely backwards?

Re:Ray Traced on Tables (1)

Aladrin (926209) | more than 2 years ago | (#37412372)

The summary clearly says it's rendered in the cloud on 256 cores. That link should read: "Wolfenstein: Ray Traced" on Tablets.

Re:Ray Traced on Tables (1)

Guspaz (556486) | more than 2 years ago | (#37412868)

So, in other words "OnLive but with a software raytracer on the server-side instead of a GPU."

Raytraced or not (1)

Osgeld (1900440) | more than 2 years ago | (#37412310)

It's still subpar compared to a budget 360 title's visuals.

Re:Raytraced or not (1)

blair1q (305137) | more than 2 years ago | (#37412632)

It's code and art that's 13 years older than any 360 title.

Re:Raytraced or not (1)

Osgeld (1900440) | more than 2 years ago | (#37412752)

It's based on Return to Castle Wolfenstein, not the original.

Re:Raytraced or not (1)

blair1q (305137) | more than 2 years ago | (#37412944)

Okay. 5 years. But W3D is still based on engine tech and visual standards that are much older. And I'll call "shenanigans" on your "any." There are likely many Xbox 360 games that look like crap compared with this.

And if Intel had a good-looking Xbox 360 title's source code, this box would out box that box by 359-1 at least.

MIC, now with ALOT (2)

Quartus486 (935104) | more than 2 years ago | (#37412318)

'Many Integrated Core'? Sounds like something from a parody. 'Many Integrated Core', with 'A Lot Of Thread'. They also come in a high-end version, 'Several Interesting Rate'. Abbreviated SIR MICALOT on Knights Corner...

Re:MIC, now with ALOT (0)

Anonymous Coward | more than 2 years ago | (#37412536)


If they like big butts, then that would make perfect sense.

Re:MIC, now with ALOT (1)

elrous0 (869638) | more than 2 years ago | (#37413012)

Yeah, but it opens up the opportunity for them to make a lot of lame jokes about "Rocking the MIC."

Not even very good performance (1)

gr8_phk (621180) | more than 2 years ago | (#37412398)

I can ray trace that at about 1FPS per core. Why do they need 256 cores? And who can play anything rendered in the cloud?

Re:Not even very good performance (1)

MozeeToby (1163751) | more than 2 years ago | (#37412770)

Depends on whether the rendering server is halfway across the country or halfway across the house. I remember people talking a while back (7 years or so) about using a home server to do the number crunching and moving back toward thin clients to access it. Wireless-N bandwidth and latencies are pretty good; with modern technology you could probably make the idea work. Offer a suite of products that play well together: a powerful and easily upgraded server, lightweight laptops, and tablets. If you could make the price right for the clients, you could have a quite expensive server with a ton of horsepower while still making it a cheaper option for families that often have 4+ computers running.

Nintendo beat them to it (1)

tepples (727027) | more than 2 years ago | (#37413430)

Depends if the rendering server is halfway across the country or halfway across the house

A rendering server halfway across the house is called a "Wii U console".

Re:Not even very good performance (1)

iYk6 (1425255) | more than 2 years ago | (#37412808)

And who can play anything rendered in the cloud?

One person at a time.

Next - Ray-Traced Nethack (1)

billstewart (78916) | more than 2 years ago | (#37412406)

The umber hulk hits! - more
The umber hulk hits! - more
The umber hulk hits! - more
You die - more

Re:Next - Ray-Traced Nethack (0)

Anonymous Coward | more than 2 years ago | (#37412568)

I'd rather see ray-traced, 1080p Dwarf Fortress.
It would be a great demo of the technology, what with all the hairy dorfs, zombie fish somehow capable of walking, and computers the size of skyscrapers to calculate 8+8.
Oh, and let's not forget the cats. Millions and millions of cats. It will be like that PS3 demo with the leaf tornadoes, except with cats... a cat tornado.
All that... IN 3D! AWE!

Re:Next - Ray-Traced Nethack (1)

Thud457 (234763) | more than 2 years ago | (#37413076)

OK, I'll give you that the cat tornado would be awesome.
Almost as much as pirates on flying sharks that are on fire that shoot fireballs.

The real market for this is server farms for Progress Quest.

Side effects of ray tracing (4, Funny)

sl4shd0rk (755837) | more than 2 years ago | (#37412496)

Intel is apparently running the ray tracing process on the same server their blog is on.

Re:Side effects of ray tracing (4, Funny)

blair1q (305137) | more than 2 years ago | (#37412668)

Their blog is ray-traced.

Re:Side effects of ray tracing (0)

Anonymous Coward | more than 2 years ago | (#37413890)

I ray-traced your mom.

Then Snotty beamed her. Twice. She claims it was wonderful.

Re:Side effects of ray tracing (-1)

Anonymous Coward | more than 2 years ago | (#37412960)

Oh, the pleasure I have of seeing other people use the logical word (server(s)) and not the marketing bullshit buzzword "cloud". I almost have a crush on you just for that.

Annoying trendy jackasses. [/RANT]

Re:Google Cache Link (0)

Anonymous Coward | more than 2 years ago | (#37412722)

It has some nice pictures about the last generation of games. For example, this one: http://blogs.intel.com/research/assets_c/2011/09/01_bloom1.php

Cluster = Cloud (1)

HalAtWork (926717) | more than 2 years ago | (#37412622)

Ok, so cluster = cloud now? Even though they both serve very different purposes?

Re:Cluster = Cloud (2)

loufoque (1400831) | more than 2 years ago | (#37412782)

*remote* cluster = cloud

Re:Cluster = Cloud (1)

MozeeToby (1163751) | more than 2 years ago | (#37412812)

*remote* cluster = cloud = unacceptable latencies for gaming.

The only way this concept works is if the rendering farm is running in a closet somewhere in your house.

Re:Cluster = Cloud (1)

Guspaz (556486) | more than 2 years ago | (#37412900)

OnLive made it work with acceptable latencies, but then they did it with a cheap GPU and not a 256-processor cluster.

Re:Cluster = Cloud (1)

MozeeToby (1163751) | more than 2 years ago | (#37412940)

Acceptable or not depends on where you live and how good your ISP is. Personally, my mediocre cable internet regularly has latency in the 200s which is annoying enough trying to play online games, I can't imagine having that kind of latency for the basic I/O layer of the game. And that's not even at 12 midnight, when they decide to push out the schedule updates to every single cable box on their network simultaneously.

Re:Cluster = Cloud (1)

Guspaz (556486) | more than 2 years ago | (#37414014)

While this may be true in your particular case, many people are within a 1,000-mile radius of an OnLive data center on a decent connection.

People talk a lot about how the network latency would make the input lag to OnLive unbearable, but consider this: 50ms of latency gets you from Montreal to Dallas (~2800km), and GTA IV on the XBox 360 has 133-200ms of input lag [eurogamer.net] despite being local. In fact, every console game that Eurogamer measured had at least 67ms of latency, and they claim that the average seemed to be about 133ms. Gamers are clearly willing to accept this latency (GTA IV, with latency higher than OnLive in many cases, is clearly a very popular game), making OnLive seem much more practical.
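The 50 ms figure is easy to sanity-check with a back-of-the-envelope model (pure fiber propagation only; real routes add routing and queuing delay on top):

```python
# Light in fiber travels at roughly two thirds of c.
C_FIBER_KM_S = 200_000  # km/s, a typical approximation

def round_trip_ms(distance_km):
    """Round-trip propagation delay over fiber, ignoring routing overhead."""
    return 2 * distance_km / C_FIBER_KM_S * 1000

mtl_dallas = round_trip_ms(2800)  # ~28 ms of pure propagation
```

So roughly 28 ms of the Montreal-Dallas round trip is physics; the rest of the quoted 50 ms is network overhead.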

I'm by no means close to an OnLive datacenter, or even in the same country (I live in Montreal, and the nearest OnLive datacenter is in D.C., if memory serves), and to me the latency would seem to be on par with a laggy console game. That is to say, not great, but no worse than I've seen with some console games. To me, the real issue with OnLive was the low bitrate; it looks OK (just OK) when there's no movement, but playing a match of UT3 on vehicles was unpleasant due to all detail being lost while in motion (and this on either a 50 meg down 14 meg up VDSL2 connection, or a 60 meg down 3 meg up cable line).

The good news is that it's a lot easier to increase the bitrate of a video stream than it is to break the speed of light ;)

Re:Cluster = Cloud (1)

wagnerrp (1305589) | more than 2 years ago | (#37414452)

Why would schedule updates be sent individually over the internet, rather than simply broadcast to everyone on a spare channel? That just sounds like a hideously inefficient use of spectrum.

Re:Cluster = Cloud (1)

afidel (530433) | more than 2 years ago | (#37413224)

Uh, most modern GPUs have a lot more than 256 "cores"; my fairly low-end HD 5750 has 720, and a GTX 560 has 336 (yes, a CUDA core and an SP are different, I know). These chips are the continuation of Larrabee, which was meant as a GPU chip.

Re:Cluster = Cloud (2)

Guspaz (556486) | more than 2 years ago | (#37413868)

Sure, but a "core" in a GPU is far simpler than a "core" in a CPU, and Larrabee wasn't stripped down anywhere near that far. Larrabee was supposed to feature 32 cores in one package initially on a 45nm process, bumping it up to 48 on a later 32nm process. Intel is still on a 32nm process, so when they talk about a "256-core cluster", they're almost certainly talking about multiple systems; an 8-chip 32-core-per-chip system (or 4-chip 64-core) would not be a "cluster" in and of itself. And such a system does not sound cheap by any stretch of the imagination. Remember, Intel cancelled Larrabee because the performance, even with software rasterization, wasn't remotely competitive with modern GPUs, and software rasterization would be a heck of a lot faster than software ray tracing!

Re:Cluster = Cloud (1)

afidel (530433) | more than 2 years ago | (#37414164)

Fab42 is going to be 14nm and is being built right now, I assume they have lab equipment capable of the same so for a demo chip I can easily see them using that process node. Going from 48 cores on 32nm to 256 on 14nm doesn't seem all that incredible.

Re:Cluster = Cloud (1)

Guspaz (556486) | more than 2 years ago | (#37414612)

It would help if I had read the summary. They've got a single server with eight Knights Ferry cards, each having 32 cores. That's where they get their 256 cores from. And they're calling the single server a "cloud".

What makes this most unimpressive is that nVidia has been making a GPU-accelerated real-time raytracing engine for years now (you can even download working demos [nvidia.com]), and before that they were selling a GPU-accelerated final-frame renderer (non-real-time raytracing). Intel is showing off in-house demos of stuff running on expensive hardware, while nVidia has been giving away the same thing to customers for years, and it's something that's actually out there that you can use. Heck, so far as I can tell, it's free.

Re:Cluster = Cloud (1)

nitrowing (887519) | more than 2 years ago | (#37413332)

The only way this concept works is if the rendering farm is running in a closet somewhere in your house.

With liquid cooling, you could pipe it to the central heating system.

Re:Cluster = Cloud (1)

doogledog (1758670) | more than 2 years ago | (#37413092)

Apparently so.

From the video that someone linked to and may be somewhere in TFA:

'But the calculations are not done on this device themselves, they're done in the cloud *air quotes*... which is over here' *points*
I chuckled. And wanted to punch someone in marketing.

Re:Cluster = Cloud (1)

Amouth (879122) | more than 2 years ago | (#37413878)

Personally, I love that in the pictures of the box with the cards, they used a WD drive instead of an Intel drive in an Intel box.

I don't get it (2)

msobkow (48369) | more than 2 years ago | (#37412852)

What's the big deal?

Ray tracing isn't new.

Parallel processing isn't new.

It's an old game.

What makes this news?

Re:I don't get it (1)

MozeeToby (1163751) | more than 2 years ago | (#37412996)

They're getting close to commodity hardware. A large 256 core server today is a run of the mill desktop in 5 years. Intel wants you to believe that GPUs have a limited lifespan, that they'll last only until real time ray tracing on the CPU can produce equivalent or better results. They could be right... but the only way to find out is going to be to wait until the hardware catches up to the point that it's economically competitive and see what the GPU makers have done in the meantime. All in all, these kind of demos cost them hardly anything at all and if they can get a couple of game developers to start seeing things their way it could pay off hugely in the future.

Re:I don't get it (0)

Anonymous Coward | more than 2 years ago | (#37414362)

> A large 256 core server today is a run of the mill desktop in 5 years

No, it isn't. Even with the full benefit of Moore's law assumed, and rounded up, we'll only have 8 times as many cores five years from now. A run of the mill desktop has 2-4 cores, so that makes 16-32. A very long way from 256.

Intel keeps slogging raytracing (1)

Sycraft-fu (314770) | more than 2 years ago | (#37413002)

About once or twice a year they go on a big press buzz about ray tracing. The reason is that they would like you to not spend money on a graphics card, and instead spend that money on a bigger processor. So they are looking into something that GPUs don't do so well, which is ray tracing. They keep trying to get people excited about the idea of ray-traced games, which would be done by systems with heavy-hitting Intel CPUs, rather than rasterized games done mainly on a GPU.

As long as they keep doing press blitzes on this, expect to keep hearing the same kind of story.

Re:Intel keeps slogging raytracing (1)

Bengie (1121981) | more than 2 years ago | (#37413418)

Based on what I've read about ray tracing vs. rasterization, ray tracing *will* win out in the long run. I gather RT scales better than rasterization, but the overhead is expensive. Once we get to the point where RT is about the same speed as rasterization, it will only take 1-2 generations before RT is several factors faster.

Whichever company is ready to push out ray tracing will stomp the market. If you release too early, your product will just be a gimmick; if you release too late, the competition will be several times faster than you.

Re:Intel keeps slogging raytracing (1)

Bram Stolk (24781) | more than 2 years ago | (#37413926)

Yes... ray tracing will indeed win in the long run.
This is because the performance is almost independent of primitive (triangle) count.

As a matter of fact, ray tracing really complex models is currently already faster than rasterizing them. A few hundred million triangles or more can be done faster with RT.

This is why their choice of content baffles me: they should NOT be using Doom datasets for this; they should be using hundreds of millions of polygons in their dataset. That is where RT shines.

Rasterizing is O(N) in the number of triangles. Ray tracing is better than O(log N), approaching O(1) even.
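That asymptotic argument can be written down as a toy cost model (illustrative only; the constants in real renderers differ wildly, and this ignores shading cost entirely):

```python
import math

def rasterize_cost(n_triangles):
    """Rasterization must consider every triangle: O(N)."""
    return n_triangles

def raytrace_cost(n_triangles, n_rays):
    """With an acceleration structure (e.g. a BVH), each ray walks a
    tree of depth ~log2(N), largely independent of total scene size."""
    return n_rays * math.ceil(math.log2(max(n_triangles, 2)))

rays_1080p = 1920 * 1080  # one primary ray per pixel
```

At a few thousand triangles the linear scan wins easily, but at hundreds of millions the logarithm takes over, which is exactly the huge-dataset regime the parent describes.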


Re:Intel keeps slogging raytracing (1)

Anonymous Coward | more than 2 years ago | (#37414702)

The example they showed in one of the videos posted was an engine that looked similar to Return to Castle Wolfenstein, not Doom-era Wolfenstein.

They had a 1-million-polygon glass chandelier, with accurate refraction, as the central point of the demo they were giving. It isn't your hundred-million-polygon demo, but it certainly wasn't Doom-era graphics.

(Personally, I suspect they could do Wolf3d or even Doom real-time raytracing on current CPU hardware... Especially at native frame sizes of 640x480)

Re:Intel keeps slogging raytracing (1)

Amouth (879122) | more than 2 years ago | (#37413936)

Yep, and this is Intel, a company that knows how to play for the future (to an extent). An example is Hyper-Threading: most people dismiss it, but honestly, if you plan for it and optimize some things for it, you can see an ~80% increase in performance. The group that came up with it and started designing it began their research in, I believe, 1992.

Some companies know how to do R&D and some don't; Intel is one that does.

Re:Intel keeps slogging raytracing (0)

Anonymous Coward | more than 2 years ago | (#37413522)

But GPUs can do ray tracing very well. They've been doing it for a couple of years now.

Which one? (1)

ZeroSerenity (923363) | more than 2 years ago | (#37412856)

Ye old Wolfenstein for DOS and SuperNES or Wolfenstein for Windows, PS3 and Xbox?


Not practical (1)

J-1000 (869558) | more than 2 years ago | (#37413588)

Their idea is to render the graphics on the server farm and stream them over to a thin client? If the server farm is local, then it's an expensive solution that only a small subset of users can afford. If it's cloud-based then there will be massive control lag. Neither idea is practical.

Re:Not practical (0)

Anonymous Coward | more than 2 years ago | (#37414290)

OnLive is already offering cloud based gaming, which is apparently quite good with a solid 'net connection.

'course, they're also not running games that take a cluster of servers to run.

Re:Not practical (1)

J-1000 (869558) | more than 2 years ago | (#37414724)

Wow, I never thought people would consider 150ms of lag (for *every movement*) acceptable, but apparently they do! I wouldn't mind so much for RPGs I guess, but something like Super Mario Brothers would drive me nuts.

Modern online games not running through the cloud are designed to mask online latency by providing you with instant feedback for certain actions (movement, firing animations). OnLive games, unless custom-coded, would not have this luxury. It would be akin to playing the original Quake (aka NetQuake) online... press button, wait, press button, wait.

Re:Not practical (0)

Anonymous Coward | more than 2 years ago | (#37414722)

Not sure Onlive would agree with you

Wrong Wolfenstein (0)

Anonymous Coward | more than 2 years ago | (#37414388)

Bother. Wrong Wolfenstein.

Lag? (1)

FrootLoops (1817694) | more than 2 years ago | (#37414824)

Is the lag for this type of solution tolerable? I can see it going both ways and (shockingly) don't have the necessary hardware to test this combo out myself. I have no experience with OnLive either.