
With 8 Cards, Wolfenstein Ray Traced 7.7x Faster

timothy posted more than 2 years ago | from the but-there-are-children-starving-in-africa dept.

Cloud 97

An anonymous reader writes "As Slashdot readers know, Intel's research project on ray tracing for games has recently been shown at 1080p, using eight Knights Ferry cards with Intel's 'Many Integrated Core' architecture. Now a white paper goes into more detail, documenting near-linear scaling for the cloud setup with 8 cards, and gives details on the implementation of 'smart anti-aliasing.' It will be interesting to see how many instances of Intel's next MIC iteration — dubbed Knights Corner, with 50+ cores — will be required for the same workload."


Nice scaling (1)

symbolset (646467) | more than 2 years ago | (#39229875)

The scaling with 8 cards is very near linear. These cards are going to be great if we ever see them at retail.
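For context, "near linear" here means 7.7x on 8 cards (the figure in the headline); a quick back-of-the-envelope check of what that implies, using Amdahl's law:

```python
# How "near linear" is a 7.7x speedup on 8 cards?
cards, speedup = 8, 7.7

# Parallel efficiency: achieved speedup as a fraction of ideal linear scaling
efficiency = speedup / cards  # 0.9625 -> 96.25%

# Amdahl's law: speedup = 1 / (s + (1 - s) / N); solve for the serial fraction s
s = (cards / speedup - 1) / (cards - 1)  # ~0.0056 -> about 0.6% serial work

print(f"efficiency: {efficiency:.2%}, implied serial fraction: {s:.2%}")
```

So the workload behaves as if only about half a percent of it is serial, which is why the comment calls the scaling "very near linear."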

Re:Nice scaling (0)

Anonymous Coward | more than 2 years ago | (#39229961)

I suspect it might have more to do with the type of task at hand than with the cards themselves.

But let's hope I'm wrong.

Re:Nice scaling (2)

moderatorrater (1095745) | more than 2 years ago | (#39229973)

So you're implying that Intel, a company notorious for building shoddy GPUs, used Wolfenstein for their demonstration not because Wolfenstein is the latest and greatest game that requires good hardware, but because it shows their cards in a good light?

Yeah, that makes sense.

Re:Nice scaling (0)

Anonymous Coward | more than 2 years ago | (#39229985)

No, I was referring to raytracing.

Re:Nice scaling (1)

Anonymous Coward | more than 2 years ago | (#39230015)

My thought was that if it takes 8 Intel GPUs, you should be able to run it on at most 4 nVidia or AMD GPUs.

Re:Nice scaling (3, Informative)

Creepy (93888) | more than 2 years ago | (#39232435)

Wrong. Nearly the entire raster pipeline would be ignored for ray tracing, and you don't really need a lot of shading units for the rest (no need to multitexture in the background and whatnot). The main use for the GPUs in ray tracing would be collision detection, which could be written into shaders as long as the entire scene was loaded into each GPU's memory, so Wolfenstein is actually a good choice; a large scene would have problems because of memory constraints. Ray tracing works very well with lots of parallel CPUs, but in that scenario it is usually memory constrained (dependent on memory access more than anything else). Splitting it off onto multiple GPUs is a way to remove that constraint, but it still basically works like a lot of parallel CPUs accessing the same scene in memory.
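The per-ray collision test mentioned above is the core primitive such a shader would run; a minimal ray-sphere intersection sketch in Python (illustrative only, no relation to Intel's actual implementation):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance t along the ray to the nearest hit, or None on a miss.
    `direction` is assumed to be normalized."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # discriminant of |origin + t*direction - center|^2 = r^2
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t > 0 else None

# A ray down the z-axis toward a unit sphere centered at z=5 hits at t=4
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

In the setup described above, every card would run this sort of test against its own full copy of the scene, which is why scene size, not shading power, is the limiting factor.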

Re:Nice scaling (4, Informative)

hairyfeet (841228) | more than 2 years ago | (#39230027)

Not to mention that those 8 cards probably don't have as much raw pixel-pushing power as a pair of SLI or CF cards from the other guys; really not that impressive. Frankly, Intel should have tried to buy out Nvidia years ago. The thing that gimps their chips more than anything is the crappy IGPs; instead they shoot themselves in the face by killing the Nvidia chipset business, thus making Atom completely worthless. With ION it was actually a nice little unit, but with Intel IGPs Atom is horrible.

That is one place you really have to give AMD credit: they saw the direction the market was going with low-power mobile devices, bought ATI, which makes excellent chips, and ended up with Brazos, a dual core with HD graphics that sucks less than 18W under full load. I mean, sure, 8 cards might be nice for a render farm, but how many are gonna be willing to pay that electric and cooling bill with the prices constantly going up?

While I appreciate the effort, Intel's answer to graphics has never been very good, and if they need to throw 8 fricking cards at it I don't even want to know how many watts this sucker is pulling. The future is obviously mobile, and while Nvidia has Tegra and AMD has Brazos, what's Intel got that can compete with that level of graphics in that small a wattage envelope?

Re:Nice scaling (4, Insightful)

billcopc (196330) | more than 2 years ago | (#39230127)

Spot on! ION was the best thing that ever happened to the Atom platform. Really, it was the only thing that made it into a usable HTPC or ultra-low-power desktop. They really need to stop shitting down NVidia's throat because they are precisely the kind of aggressive, performance-driven company that would fit alongside Intel's model.

Re:Nice scaling (2)

hairyfeet (841228) | more than 2 years ago | (#39233193)

Try the new Brazos, friend, especially the E350 and E450 units; they are what Atom+ION USED to be. I have built several low-power office units and HTPCs out of them and was impressed enough by the performance that I sold my full-size laptop for a 12 inch EEE with an E350. I get nearly 6 hours watching HD videos, it never gets hot, video is smooth, hell, I even play L4D and GTA:VC (I could play the newer ones but don't care for them) on it and it doesn't skip a beat. Also, because AMD isn't in bed with Microsoft or trying to force you to upsell, there are no artificial limits on their chips: most will run 8GB of RAM and the lowest max RAM you'll find is 4GB, compared to Intel's 2GB limit and 10 inch screen mandate. Check out Tiger and Newegg and you can find several units where all you do is slap in the HDD and RAM and you're good to go, and these babies can be passively cooled and just sip power.

But what happened with Intel and Nvidia is just a classic example of what happens when a company gets too greedy. Intel couldn't stand the thought of not getting 100% of the chipset business, and instead of seeing the added value a powerful GPU could bring to a weak chip like Atom, they killed the entire line and made the chip all but worthless. All my customers that bought non-ION Atom units quickly returned them because they were just too damned slow; even under XP the units felt like they were constantly dragging. I don't know how many times I had an Atom unit brought in by a customer asking "Can you make this any better?" only to have to tell them "Yeah, sell it and get an AMD, because this sucks." My EEE has 8GB of RAM and came with Win 7 HP X64 and runs that full OS as well as any desktop.

Oh, a word of advice: if you do try a Brazos unit, get "Brazos Tweaker"; it's a free download from Google Codeplex, and by dropping the voltage a little you can squeeze out even longer battery life or get an even quieter unit. I managed to drop my EEE down 200MHz at idle and 5% voltage under load, which translates to nearly an hour more on my 6 cell. Really a great chip, though: often cheaper than Atom alone yet gives better performance than Atom+ION, makes a great HTPC, and both Tiger and the Egg have some really small passive units. Just a great chip.

Re:Nice scaling (1)

billcopc (196330) | more than 2 years ago | (#39258075)

Thanks for the tip! I'll give them a try, I've been wanting a low-power office/surf machine anyway.

Re:Nice scaling (5, Informative)

subreality (157447) | more than 2 years ago | (#39230247)

Note that these are raytracing cards, not rendering. Raytracing is a very different technique which can do cool effects like refraction through glass (shown in the chandeliers and scopes), jawdropping water, and realistic lighting effects that rendering cards simply cannot do.

It's also much more demanding on hardware. One of the big drawbacks is that it requires a lot of scattered reads out of memory, making caching much less effective. You need tons of bandwidth to low-latency memory to make it happen. We're still a very long way from making this possible in reasonably-priced consumer GPUs.

Rag on Intel for their integrated graphics if you want (though I consider them a good non-gaming graphics chip with very good open source support), but these cards are not related to those in any way. These are full-featured x86/x64 processors with 32 cores per die. In other words, they created a 256-core system capable of software-raytracing the whole thing at high resolutions.

That is quite an accomplishment, and rest assured, it is top-tier performance in the raytracing world. This isn't meant to be a practical gaming system; this is pretty clearly being done by Intel to show off the benefits of their many-core processors, and it is an impressive show.

To the GP: They're using Wolfenstein because it's one of very few games that has a ray-traced variant, and it exists only because Intel created it as a testbed. More on that here: []

Re:Nice scaling (5, Insightful)

root_42 (103434) | more than 2 years ago | (#39230357)

The problem with these demos is that they use ray tracing like we did in 1980 (i.e. Whitted style). All computations are highly coherent and efficient. As soon as you want more natural rendering, with diffuse illumination etc., parallelization doesn't scale proportionally anymore: rays become heavily incoherent, memory access scatters, you get cache misses, and so on. So the real feat would have been if they showed 7.7x speedup with diffuse global illumination.

Re:Nice scaling (1)

myurr (468709) | more than 2 years ago | (#39230751)

And this is illustrated beautifully by the pretty shoddy visuals that Intel are showing off. These graphics would have been jaw dropping 15 years or so ago, but frankly today they look amateur and desperately outdated. Reflection and refraction are nice gimmicks but it'll be a rare game that actually makes use of them to improve the gameplay. For all the other titles out there these effects are usually faked to a high enough quality that it doesn't make much of a difference to the gamer.

If Intel really want to revolutionise the graphics market, just shooting for ray tracing in and of itself is not going to be enough. They need to look for something truly compelling that is beyond what is possible with current polygon-based rendering. If they were showing off, I dunno, a real-time volumetric renderer with natural ray-traced lighting that looks as good as today's AAA titles but allowed for fully deformable worlds AND reflections and refractions, then I would be impressed and predicting a revolution. As it stands, Intel have a gimmick that they can point at whilst the rest of the world shrugs and gets on with their lives.

Re:Nice scaling (1)

justforgetme (1814588) | more than 2 years ago | (#39232655)

Judging from my limited experience with 3ds Max and real diffuse material pipelines, I would suggest that state-of-the-art RT algos won't come into the real-time scene for, oh, at least two decades. That is for real implementations. AFAIK you can still `hack` reflection and refraction behaviors to kind of simulate true diffuse refractions.

I can remember blowing render jobs' render times into thousandfolds with misuse of diffuse reflections.

Re:Nice scaling (1)

symbolset (646467) | more than 2 years ago | (#39233083)

What a coincidence. That's about how long it will take for certain patents to expire.

Re:Nice scaling (0)

Joce640k (829181) | more than 2 years ago | (#39230897)

Note that these are raytracing cards, not rendering. Raytracing is a very different technique which can do .... and realistic lighting effects that rendering cards simply cannot do.

Nope. Raytraced images are some of the least realistic. Once you get past doing glass balls on a checkered floor it's downhill all the way.

Raytracing has it completely backwards, light doesn't work that way in the real world.

Re:Nice scaling (4, Informative)

subreality (157447) | more than 2 years ago | (#39231045)

I believe you're mistaken. Raytracing IS the technique where you're tracing light much the way it happens in the real world. The techniques usually used in GPUs are quite backward. It hasn't really been all that downhill, though; they've gotten pretty good at faking a lot of the effects, but when it comes to things like shadows, local lighting, radiosity, and refraction, Raytracing is where it's at.

Examples: [] [] [] [] []

All of those are from POV-Ray. There are plenty more in their gallery over here: []

Feel free to send some counterexamples of other techniques doing it better.

Re:Nice scaling (1)

PhrstBrn (751463) | more than 2 years ago | (#39232237)

I would have mistaken them for miniature figures if you didn't tell me it was CGI using ray tracing.

Re:Nice scaling (0)

Joce640k (829181) | more than 2 years ago | (#39232603)

I believe you're mistaken. Raytracing IS the technique where you're tracing light much the way it happens in the real world.


In the real world light comes from a light source, bounces around a bit, then a lucky few photons arrive at your eyes.

In raytracing you trace a ray from your eye to the world and try to get back to the light source. That's the opposite direction.

Re:Nice scaling (2)

subreality (157447) | more than 2 years ago | (#39232717)

So what? It's essentially the same end result with orders of magnitude less computation from wasted photons.

Where do you get that raytracing is "some of the least realistic"? It's one of the most realistic techniques that's actually in use, far better looking than anything that's currently done in realtime.

Re:Nice scaling (0)

RubberChainsaw (669667) | more than 2 years ago | (#39234311)

Most raytraced images are 100% in focus, which is very different from what we expect from a traditional image. So it appears that the image is fake. It is the same effect that a lot of people have upon seeing high definition movies on a good TV. The enhanced framerate and better contrast displays make the image different from what was expected, so the viewer ends up with a negative reaction to what are meant to be positive enhancements.

Re:Nice scaling (1)

Mr.Z of the LotFC (880939) | more than 2 years ago | (#39235297)

I raytraced a simple scene once where I solved the rendering equation analytically (to see if it was practical), & the result was badly out of focus because I had neglected to include a lens (I did not expect it to be that realistic). Better raytracing does in fact produce non-perfectly-focused images, even with approximate solutions (although it is possible that said improvements to raytracing have different technical names than "raytracing").

Re:Nice scaling (1)

jackbird (721605) | more than 2 years ago | (#39256547)

Nah, it's raytracing, you just scatter the rays you shoot for each pixel taking into account the lens's circle of confusion (and shoot more rays overall), with biases for things like the number of leaves on the camera's iris for extra realism.

Most of the time a 2D DoF effect using a rendered zbuffer is just fine, but raytracing will give you proper defocusing of reflections and refractions, as well as showing objects that would be completely obscured in the in-focus render.

Like a poster above said, though, it's gonna be a while before approaches that shoot that many rays are going to be viable in realtime applications.
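The scatter-over-the-aperture idea described above can be sketched in a few lines; a toy Python version (function name and parameters are invented for illustration, not from any real renderer):

```python
import math, random

def lens_rays(pixel_dir, focal_dist, aperture, n):
    """For one pixel, build n rays whose origins are jittered over a
    disc-shaped lens but which all converge on the same focal point,
    so geometry off the focal plane blurs when the results are averaged."""
    # The point on the focal plane this pixel converges on
    focus = tuple(c * focal_dist for c in pixel_dir)
    rays = []
    for _ in range(n):
        # Uniform sample on a disc of radius `aperture` (sqrt gives uniform area density)
        r = aperture * math.sqrt(random.random())
        theta = 2.0 * math.pi * random.random()
        origin = (r * math.cos(theta), r * math.sin(theta), 0.0)
        d = tuple(f - o for f, o in zip(focus, origin))
        length = math.sqrt(sum(x * x for x in d))
        rays.append((origin, tuple(x / length for x in d)))
    return rays
```

Averaging the shaded results of these rays is what gives proper defocusing of reflections and refractions, which a 2D z-buffer blur cannot reproduce.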

Re:Nice scaling (1)

subreality (157447) | more than 2 years ago | (#39237127)

Raytracing definitely allows depth of field (a certain focal length). They're showing it in TFA, and in many of the examples I linked.

Re:Nice scaling (1)

gargll (1682636) | more than 2 years ago | (#39232825)

In raytracing, there is no requirement to trace rays from the camera or the light source; either direction is valid. Actually, a variant called bidirectional path tracing traces rays from both the camera and the light source. This is also the approach of photon mapping. Raytracing in itself is simply the process of modeling the physics of light transport. There are however some limitations. For instance, it's not really suited to modelling diffraction, but that's hardly limiting.

Re:Nice scaling (0)

Anonymous Coward | more than 2 years ago | (#39234011)

Fortunately for us, the laws of physics state that all light paths are reversible. So we can start wherever we want and we usually start at the camera since that usually is the most efficient way.

Re:Nice scaling (0)

Anonymous Coward | more than 2 years ago | (#39235827)

Physics is clearly not your forte.

Re:Nice scaling (1)

Hast (24833) | more than 2 years ago | (#39231633)

If you use the simplistic style of raytracing then yes, but there are many additions which make it possible to do extremely realistic scenes.

The fundamental problem is that ray tracing is only half of the puzzle. Typically you trace from each pixel on a "screen" into the 3d scene and look at where that ray intersects with an object. You then calculate the color of the object at that point and this becomes the color of that pixel on the screen. (In a real scenario you typically calculate multiple rays per pixel.)

The problem is how you calculate the color. With simple algorithms (such as the color of the surface plus the distance and angle to the nearest light source), the effect is not very realistic. But global illumination methods like photon mapping (where you first run a ray trace from all the light sources to "light" the scene, then run the screen ray trace and use the data from the first pass to calculate the pixel colors) or image-based rendering (where you use an HDR image as a light source) can produce very good results.

There is a reason why most professional rendering is done using ray-tracing techniques. It may be slow, but if you can trade speed for quality you can get very realistic results.
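The "angle to the nearest light source" shading step described above is just Lambert's cosine law; a minimal sketch (a made-up helper for illustration, not from any engine):

```python
import math

def lambert_shade(hit_point, normal, light_pos, base_color):
    """Color at a ray hit point scales with the cosine of the angle between
    the surface normal and the direction toward the light (clamped at 0)."""
    to_light = [l - p for l, p in zip(light_pos, hit_point)]
    length = math.sqrt(sum(x * x for x in to_light))
    to_light = [x / length for x in to_light]
    cos_theta = max(0.0, sum(n * l for n, l in zip(normal, to_light)))
    return tuple(c * cos_theta for c in base_color)

# A surface facing the light head-on keeps its full color...
print(lambert_shade((0, 0, 0), (0, 0, -1), (0, 0, -10), (1.0, 0.2, 0.2)))  # (1.0, 0.2, 0.2)
# ...while a light at a grazing angle contributes nothing
print(lambert_shade((0, 0, 0), (0, 0, -1), (0, 10, 0), (1.0, 0.2, 0.2)))  # (0.0, 0.0, 0.0)
```

This is the "simple algorithm" case the parent calls not very realistic; photon mapping and image-based rendering replace this single light query with data gathered from a full light-tracing pass.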

Re:Nice scaling (1)

Joce640k (829181) | more than 2 years ago | (#39232673)

Hybrid engines which combine raytracing, radiosity and photon mapping can give very good results, yes.

OTOH saying that raytracing alone gives realistic lighting is very naive.

Re:Nice scaling (1)

gargll (1682636) | more than 2 years ago | (#39232847)

Photon mapping is a raytracing technique. Photons are traced from the light source, precisely the same way that rays are traced from the camera. Where photon mapping differs is that the contribution of the photons is computed by local density estimation. This reduces the noise in the output, at the expense of introducing bias (i.e. blurriness).
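The local density estimation mentioned above can be sketched as a k-nearest-photon lookup: sum the power of the k closest stored photons and divide by the area of the disc enclosing them (a toy 2D Python sketch; the data layout is invented for illustration):

```python
import math

def radiance_estimate(photons, point, k=3):
    """photons: list of ((x, y), power) pairs stored by the light-tracing pass.
    Estimate radiance at `point` as (sum of k nearest powers) / (disc area).
    Small k = noisy estimate; large k = biased (blurry) - the trade-off above."""
    nearest = sorted(photons, key=lambda p: math.dist(p[0], point))[:k]
    radius = math.dist(nearest[-1][0], point)  # disc just enclosing the k photons
    total_power = sum(power for _, power in nearest)
    return total_power / (math.pi * radius * radius)

# Three photons of power 1.0 within radius 1 of the query point
photons = [((0, 0), 1.0), ((1, 0), 1.0), ((0, 1), 1.0), ((5, 5), 1.0)]
print(radiance_estimate(photons, (0, 0), k=3))  # 3 / pi, about 0.955
```

Real photon maps use a kd-tree for the nearest-neighbor search rather than a full sort, but the estimate itself has this shape.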

Re:Nice scaling (1)

Hast (24833) | more than 2 years ago | (#39231577)

I think the term you're looking for regarding "normal" graphics cards is "rasterising". Both rasterisation and ray-tracing are examples of rendering, which is the general term for turning data into an image. (Typically 3D data onto a 2D screen.)

Intel have been trying this technique of putting x86 cores on a board for quite some time now, but they still seem to be struggling to figure out a good use for them. One thing traditional GPUs have going for them is that they are rather dumb and limited in their capabilities: they basically do a lot of calculations, very fast, and that's it. Intel is obviously hoping that their "one trick pony" x86 will be able to do some new things here. But in the end you have a limited number of transistors, and at least so far, putting lots of more limited GPU cores on there has outperformed putting fewer, more complex x86 cores there instead.

If you want more on this topic, PCPer did a few interviews with John Carmack. The first, from 2008, covers rasterisation vs raytracing (there is a podcast for that too, which I recommend). Back then Carmack talks about the possibilities of a combined raytracer and rasteriser (because a rasteriser is a lot more efficient, and for most of the things in a game you will not see much of a difference). There is also one from last year (there is a video for that online as well) where he has updated his research a bit. And Carmack's speech at QuakeCon is also interesting if you want to hear more.

Well worth reading/watching/listening to if you want a more critical view on the possibilities of these (and other future) technologies.

Re:Nice scaling (1)

subreality (157447) | more than 2 years ago | (#39231693)

You are correct. I guess I learned some of the terminology wrong back in the day. :)

Re:Nice scaling (3, Interesting)

godrik (1287354) | more than 2 years ago | (#39232865)

It's also much more demanding on hardware. One of the big drawbacks is it requires a lot of scattered reads out of memory making caching much less effective. You need tons of bandwidth to low latency memory to make it happen. We're still a very long ways out from having this possible in reasonably-priced consumer GPUs.

Yes, that is exactly what the Intel MIC cards are awesome for. They are generic x86 cores with 4-way SMT and a buttload of memory bandwidth. I worked with Knights Ferry prototypes and studied the scalability of the worst case for scattered memory access: graph algorithms. (The paper will be published soon but the preprint is available at [] .) Basically, we achieve close to optimal scalability on most of our tests.

These MIC cards are designed to scale in the good cases (compact memory and SIMD-izable operations such as dense matrix-vector multiplication, or image processing) but also in the bad cases (lots of indirection, accessing cache lines in pathological scenarios such as sparse matrix-vector multiplication, or graph algorithms).

I am excited to get hold of the commercial cards (we worked on prototypes) to make a CPU/GPU/MIC comparison.

Re:Nice scaling (0)

Anonymous Coward | more than 2 years ago | (#39230629)

EXCEPT, you know, if you tried to do ray tracing at 1080p with NVIDIA or ATI cards you would need 16 or more of them, not to mention a beefy CPU? NVIDIA/ATI cores are great for pushing triangles, textures and shaders, but they suck at ray tracing (same as Intel chipset GPUs). BUT these knight-whatever cards are optimized for ray tracing and can do in REALTIME the graphics and special effects you see in big blockbuster movies.

The way I see the future, you will have ATI or NVIDIA (or Intel integrated GPU) graphics integrated on the CPU or motherboard, like you had a math coprocessor 10 years ago. Everyone will have it for plain old games, for example Crysis or Skyrim, and then real gamers will have ray-tracing cards for real high-end gaming (the same ones that have a GeForce 580/590 currently).

I only hope NVIDIA and ATI start making ray-tracing cards too, because if Intel is the only one doing it, prices will be insane and the technology will not advance further.

Re:Nice scaling (0)

Anonymous Coward | more than 2 years ago | (#39232721)

BUT this knight-whatever cards are optimized for ray-tracing and can do in REALTIME graphic and special effects you see in big blockbuster movies

An interesting quote from John Carmack in this interview [] where he says, " the real world where people make production renderings, even if they have almost infinite resources for movie budgets, very little of it is ray traced. There are spectacular off line ray tracers but even when you have production companies that have rooms and rooms of servers they choose not to use ray tracing very often because in the vast majority of cases it doesn't matter. It doesn't matter for what they are trying to do and it's not worth the extra cost."

Re:Nice scaling (0)

Anonymous Coward | more than 2 years ago | (#39230787)

"One company to rule them all" is just a bad business model. Intel should stick to what it does, processor design. It should simply reach a business agreement with Nvidia to integrate its capabilities into their processors. They don't have to "own" Nvidia to make that happen. Greedy control freak.

Re:Nice scaling (1)

laffer1 (701823) | more than 2 years ago | (#39230815)

Intel would have destroyed NVIDIA. They give up on graphics every few years, and they only make enough for the low-end market. They similarly would have killed any progress at NVIDIA after a few years. It only would have caught them up for a while.

Intel doesn't get graphics. It's so bad, I recommended an AMD A4 yesterday over an Atom build because of the GPU.

The point isn't really raytracing (1)

symbolset (646467) | more than 2 years ago | (#39233049)

Raytracing is an example of an embarrassingly parallel vector math problem. It's not the only such example nor the only use these cards are being put to. They're being used in thermodynamic, aerodynamic and hydrodynamic modelling of systems for computer design, for mineral exploration, for climate modelling. It would not surprise me if NASA has a cluster with them for certain space physics applications. No doubt for financial modelling too.

The point of displaying the cards doing real-time 1080p raytracing of a classic game is to shock and awe some of us geeks who understand the scale of this application. It's pretty extreme scale computation for a single PC at this point in technology history.

Intel could add some people to this project with less geek and more art, it's true, if they want the maximum emotional impact from the gamer contingent. But those folks from Illumination Entertainment and Weta Digital are awfully hard to get, even on loan.

These are not laptop or desktop chips, and won't ever be, barring some crazy improvements in power consumption before we hit the minimum node size available on silicon. That doesn't mean this work doesn't need to be done.

Re:Nice scaling (2)

wisty (1335733) | more than 2 years ago | (#39230045)

Actually, I remember Intel doing a lot of work on a new Wolf 3D engine specifically designed for excellent scaling, rather than raw performance on a single GPU.

This isn't Wolf3D. It's a much better engine (especially if you are Intel), with Wolf3D content.

Re:Nice scaling (1)

moderatorrater (1095745) | more than 2 years ago | (#39230103)

I didn't know that. Thank you.

and yet (2, Insightful)

Osgeld (1900440) | more than 2 years ago | (#39229899)

it honestly looks like an old game to me. Yes, there are some impressive features, but I really have to look for them in the images, something that is not going to happen at 60Hz (and if it's not running at real speed, who cares; that's a movie, which can take its sweet-ass time rendering frame by frame)

This isn't a fucking game announcement (2)

FatLittleMonkey (1341387) | more than 2 years ago | (#39229963)

Wolfenstein is an old game. The ray-traced version is being used as a "Utah Teapot", a standard object to develop and compare rendering techniques.

Re:and yet (4, Insightful)

rhyder128k (1051042) | more than 2 years ago | (#39230071)

Welcome to the current generation of ray-traced game engine demos. They're poor looking, apart from the metal sphere floating above the water and the odd glass refraction effect.

You'd think that they could hack something impressive-looking together, particularly as they are competing against whatever the next generation of polygon rendering will come up with.

Re:and yet (2)

poly_pusher (1004145) | more than 2 years ago | (#39230453)

I'm sorry, Michael, but you have a poor understanding of rendering technology. They are showing off technology to potential partners and customers. They are not proposing an engine that utilizes the technology and will go to the consumer. It won't look pretty until it is developed into an end product.

What is important is that these rays are being cast in a relatively efficient manner allowing for realtime feedback. An engineer doesn't care specifically what those rays may be used for but just that they are being calculated efficiently and accurately.

Ya see, with raytracing you can use the cast rays for many effects, like reflection and refraction. But why develop an advanced, layered shading system which uses those rays in all sorts of cool ways if this is just a research project?

Re:and yet (1)

fast turtle (1118037) | more than 2 years ago | (#39231887)

There's another aspect of raytracing that many don't even get: the military aspects, such as being able to efficiently calculate the origin of a shot (backtrace). Think about the final action in The Last Starfighter and you suddenly realize that Intel isn't working on this for gamers but for the military. Lots more money to be made when you consider all of the CIC systems that would benefit from the ability to backtrace incoming fire and take it out with the appropriate weapon (Starship Troopers/Battlestar Galactica). That's where they're going with this.

Re:and yet (1)

GuldKalle (1065310) | more than 2 years ago | (#39233313)

You've been drinking too much. Or too little, I forget how it is with you. Anyway, you haven't been drinking the exact right amount.

Re:and yet (4, Insightful)

JoeMerchant (803320) | more than 2 years ago | (#39230759)

Rent the Pixar Shorts DVD and watch what they did before Toy Story 1. Red's Dream, Tin Toy and Andre'B are all pretty crappy looking short films that demonstrate what a couple of guys in a lab could do at the time - when viewing them, you're supposed to imagine what could be done by a larger studio with funding, not bash them because a couple of guys in a lab given 3-6 months aren't producing something competitive with hundreds of people given millions of dollars and a couple of years.

Re:and yet (0)

Anonymous Coward | more than 2 years ago | (#39233603)

I don't know if I would call them crappy; they were phenomenal at the time. Red's Dream (1987) predates OpenGL (1992) and the wave of CGI from movies like The Lawnmower Man (1992) and Beyond the Mind's Eye (1992), and shows like ReBoot (1994). There wasn't much to compare it to out there. They're all short effects, like the stained glass knight from Young Sherlock Holmes (1985) or the Genesis sequence from The Wrath of Khan (1982).

Re:and yet (1)

jackbird (721605) | more than 2 years ago | (#39256565)

Also, while Red's Dream didn't win any awards, Luxo Jr. from the previous year was a nominee for the best animated short Oscar, and Tin Toy from the following year won. So yes, cutting-edge.

Ummmm except (2)

Sycraft-fu (314770) | more than 2 years ago | (#39235401)

That Intel is not just some small outfit, and they are the ones who want to push this change from rasterization to ray tracing. Rasterization works great, looks good, and is what runs well on all the GPUs out there today. That makes AMD and nVidia happy; they make billions doing it. Intel is unhappy: they want you spending less, or rather nothing, on those products and more on Intel products. So they are on about ray tracing, something that GPUs aren't as good at.

Well guess what? To convince people the change is worthwhile, they've got to show it as being better. I'm not interested in something that requires expensive new processors unless it gives me a benefit. So let's see it then. Let's see a ray tracing engine blowing rasterization out of the water.

It's not like they can't hire someone to do some work on it. Look at the demos that come from places like Unigine and 3DMark. Let's see something that makes us go "Wow, that is sweet."

Instead, we see lame demos that don't look much if any better than the original (and that being an old game), while on the rasterization front we see photorealistic skin rendering by a few guys at a university that you can run on your PC at home [] .

You can see why maybe people are not so impressed with Intel carrying on about raytracing as the future of games.

Re:Ummmm except (1)

JoeMerchant (803320) | more than 2 years ago | (#39235763)

That Intel is not just some small outfit,...

Yes, but... this is apparently not a big part of their greater plans at the moment. Not everything that has the Intel name on it is given billion dollar backing.

I think that the realtime raytracing thing is coming, not this year, probably not with 22nm processes, but by the time 6nm processes and 3D packaging are here, there are going to be way more than 8 cards worth of transistors on a single chip.

Re:Ummmm except (1)

Sycraft-fu (314770) | more than 2 years ago | (#39235873)

Have to compare that to what will be available from nVidia and AMD though. There really isn't a "right" rendering technology; people are not all in with ray tracing even in the high end world. 3ds Max uses a scanline renderer by default, and there are plugins for it like the Indigo Renderer, which basically uses various Monte Carlo methods to get really realistic images.

In terms of realtime rendering it will be whatever can give the best perceived quality on the least amount of hardware. Maybe that'll end up being raytracing in the end but maybe not.

I think too many Slashdot types think it is the be-all, end-all either because they think that all high quality rendering is done using it (it's not) or because they learned about it in CS class, heard the "O(log n)" talk, and don't really understand the limits in the real world.

I've nothing against raytracing. However the one and only thing I care about for realtime graphics is what looks the best, and can do it with the most affordable hardware.

Re:Ummmm except (1)

JoeMerchant (803320) | more than 2 years ago | (#39236417)

The evolution I have seen, for better or worse, over the last 30 years is from impossible to barely possible to practical to so-easy you can do it with stupid simple algorithms, and most people do because the hardware is cheaper than writing clever software.

Clever software will always have a great economy of scale, but when people have the equivalent of a 1990s supercomputer in their cell phone running for 7 days on a battery that weighs 20 grams, clever software won't matter as much as it used to.

Ray tracing isn't clever, it's a straightforward, stupid-simple algorithm. When that's all you need, it sure helps the software delivery schedule to use it instead of something more efficient and complex.

I'm looking forward to 2020 when we will be able to have a 50 core FPGA system, not because I want to endlessly tweak the hardware designs for clever little things, but because I will be able to configure "the right sledgehammer for the job" and get it done without burning so many brain cycles.

Re:Ummmm except (1)

Sycraft-fu (314770) | more than 2 years ago | (#39238057)

The problem is that ray tracing doesn't do the trick. As I pointed out in another post, ray tracing sucks at indirect lighting. Since you're tracing back from the display to the sources, it only does direct lighting well. Thing is, most of the lighting we see in the real world is indirect. So you've got three choices:

1) Deal with poor lighting. Suboptimal, particularly since rasterization isn't so problematic with this. You can handle indirect lighting a number of ways and have it work fairly well.

2) Use a trick, something like photon mapping along with the ray trace. Fair enough, but all the tricks that do a good job take a bunch more calculation and thus make it even less able to be implemented in realtime.

3) Give up on ray tracing and go to a fully unbiased renderer, like one of the ones that use a Monte Carlo system. Of course those are even worse in terms of the calculations.

I'm fine with the idea of "do it simple" if the hardware can handle it. However we are a long, long way from that in graphics.

My bigger point is just that nothing Intel has produced has convinced me that ray tracing is a better way to go. Never mind that they are still talking about hardware that doesn't exist; I want to see the demo showing how good it can look. That determines a whole lot about how useful it realistically is. I want to see how it compares against something like the SSSS demo, Battlefield 3, Unigine, and so on.

The reason is that I'm suspicious that, other than smooth corners thanks to lots of polygons (which may have limits; what they don't tell you in CS is that O(log n) for raytracing isn't taking memory accesses into account) and maybe nice shiny surfaces, it isn't really going to look that special. You also have the problem that rasterizers are solving some of the same problems with neat tricks like tessellation (things get more geometry when you get close).

Basically, talk is cheap; let's see some demos that show what they are on about. They don't have to be long or large. The resource thing isn't a valid argument: remember that the SSSS demo I talked about was done by 2 dudes at a university in Spain, who also, like 9 months ago, released a new method for anti-aliasing with a rasterizer that takes very few resources (SMAA).

Re:Ummmm except (1)

JoeMerchant (803320) | more than 2 years ago | (#39241099)

I'm fine with the idea of "do it simple" if the hardware can handle it. However we are a long, long way from that in graphics.

My bigger point is just that nothing Intel has produced has convinced me that ray tracing is a better way to go. Never mind that they are still talking about hardware that doesn't exist.

Ray tracing may not (or, eventually with photon mapping, may) be the way to go. If by long, long way you mean 8 years, then, yes, I'd agree.

At my age, 8 years goes pretty quick, and even when I was younger I only replaced my computers at most every 4 years; I've had a couple of systems for 8 or more years.

And, if they weren't talking about hardware that didn't exist, I'd be pretty bored - the existing stuff is pretty well understood, and yes, on the existing stuff, realtime ray tracing is pretty sucky compared to the more clever hacks.

Re:and yet (1)

thetoadwarrior (1268702) | more than 2 years ago | (#39230135)

Because there are so many brand new games with tons of mod tools and open source engines.

Re:and yet (2)

Charliemopps (1157495) | more than 2 years ago | (#39230343)

You're only saying this because you don't have a clue how 3D engines work. A raytraced game could potentially look "real," unlike current games, which continue to just look like more and more sophisticated animated cartoons. Google raytraced images sometime; a lot of the time it's hard to tell that they are CG and not real.

Re:and yet (0)

Osgeld (1900440) | more than 2 years ago | (#39232217)

I know exactly how this works, and I have written a couple of crappy little ray-tracers in the past. Ray-tracing is one of those things like the space program... yeah, it could do a lot, but it doesn't, because in reality it's not very practical and not very useful. Displaying pixels on a grid, you're always going to have a margin of display error, and who cares if you can see that it's a 100% perfect circle, as long as the computer knows and correctly calculates it?

No it is pretty easy (1)

Sycraft-fu (314770) | more than 2 years ago | (#39235433)

Raytracing falls down bigtime in the lighting department. It can't handle indirect lighting well and you get this situation of everything looking too perfect and shiny. Reflective metal spheres it is great at. Human flesh, not so much.

Now there are solutions, of course. You do photon mapping or radiosity and you can get some good illumination that can handle diffuse lighting, caustics, and that kind of shit. However, ray tracing by itself? Not so much. Problem is, none of that other shit is free. You don't just "turn it on"; it takes anywhere from a lot more calculation to "holy fuck this is like 50x as slow" more calculation.

Now on the other side of things, have a look at the SSSS demo. This is a demo of photorealistic human skin that you can run on a normal computer right now. All you need is a Windows system with a reasonable DirectX 10 or better GPU. Works with current rasterization technology.

People need to stop treating ray tracing like it is some be-all, end-all of computer graphics. No, it is a method that has some good points (easy implementation being a big one) and some bad points (indirect lighting being one).

Re:No it is pretty easy (1)

Charliemopps (1157495) | more than 2 years ago | (#39264101)

It IS the be-all end-all of computer graphics. Indirect lighting is only a problem due to the limitations of CPU speed. Specifically, when you set up a render you set the number of "bounces" a ray will make. When you're doing live video, those bounces are set to about 3, and it's hard to get ambient lighting with that. Is a raytracing engine the solution to computer graphics right now? Probably not. But in 100 years, when computers are likely smarter than we are and have us hosted in a matrix-like virtual environment so they can feed off of our souls, what kind of rendering will they use? Ray-tracing. Set the number of ray bounces to 1,000, or 10,000, and your ambient lighting issues suddenly go away.

Even now, raytracing could do a lot more. Remember that the ONLY reason DirectX works at all is that there are hardware developers building specific equipment to take advantage of it and accelerate its content. There are no raytracing video cards, although there have been several attempts at creating proofs of concept. I'm not sure how much it would help, but I know for a fact that DirectX would be dead without the existence of Nvidia and ATI.
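
The bounce-cap intuition above can be illustrated with a toy geometric-series model (my own illustration, not any real engine's math): if each bounce retains a fixed fraction of the light's energy, the indirect light a path gathers converges as the bounce limit grows, which is why cranking the cap from 3 toward 1,000 makes the ambient-lighting problem fade.

```python
# Toy model (invented numbers, not a real renderer): each bounce retains
# `albedo` of the remaining energy, so the total indirect light gathered
# along a path is a geometric series in the bounce limit.
def gathered_light(albedo, max_bounces):
    return sum(albedo ** k for k in range(1, max_bounces + 1))

for bounces in (3, 10, 1000):
    print(bounces, round(gathered_light(0.5, bounces), 4))
# With albedo 0.5 the series converges to 1.0; three bounces already
# capture 0.875 of that, and a thousand are indistinguishable from it.
```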

Re:and yet (1)

poly_pusher (1004145) | more than 2 years ago | (#39230413)

This is all still proof of concept. Just the fact that you can raytrace an image like this is impressive. Once realtime raytracing is a reality, more advanced shading systems will be developed which do more interesting things with those cast rays. An example shown in the article is physically based refraction for glass and water. A more challenging application would be subsurface scattering (light penetrating a surface and bouncing around before it bounces back out, e.g. wax) or light dispersion (colored light being separated). The advantages of raytracing are very real, but the tests that are run just won't look very special. Another good example is soft shadowing and ambient light: if you have a high detail surface (bricks, cracks, mortar and such), these features will really shine. When it's being tested on a level with 60,000 total polygons and only diffuse textures, not so much.

Re:and yet (1)

Rockoon (1252108) | more than 2 years ago | (#39230749)

The real advantage to ray tracing is how it scales only logarithmically with scene geometry.

Games (and so forth) are using more and more on-screen polygons, whose cost scales linearly under rasterization but logarithmically under ray tracing. Ray tracing will inevitably be as efficient as rasterization for the same quality if things continue as they are, and from then on rasterization will never be able to keep up (just like bubble sort can't keep up with any O(N log N) sort for sufficiently large N).

But the real problem with ray tracing is anti-aliasing. All current ray tracers anti-alias by costly super-sampling, and unless the number of super-samples is extremely high, it's quite plainly inferior in quality to how rasterizers do it. The solution is to not do ray tracing but instead do what's called beam tracing, but that has its own scaling issues with regard to reflection and refraction (beams that grow to encompass the entire world geometry) that make the current cache issues look like a joke in comparison.
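
The asymptotic argument above can be sketched with a throwaway cost model (every constant here is invented for illustration; where the crossover lands on real hardware is exactly the open question):

```python
# Throwaway cost model (all constants made up) contrasting rasterization,
# whose per-frame work grows linearly with triangle count, against a ray
# tracer doing roughly O(log n) BVH traversal per primary ray.
import math

PIXELS = 1920 * 1080              # one primary ray per pixel
RASTER_COST_PER_TRI = 1.0         # arbitrary units
TRAVERSAL_COST_PER_STEP = 50.0    # assume BVH steps are pricier per unit

def raster_cost(n_tris):
    return RASTER_COST_PER_TRI * n_tris

def raytrace_cost(n_tris):
    return PIXELS * TRAVERSAL_COST_PER_STEP * math.log2(max(n_tris, 2))

for n in (10**5, 10**7, 10**9, 10**11):
    cheaper = "raytrace" if raytrace_cost(n) < raster_cost(n) else "raster"
    print(f"{n:>14} triangles -> {cheaper} wins")
```

Under these made-up constants rasterization wins until scene complexity gets enormous, which matches both halves of the parent's point: the logarithm wins eventually, but "eventually" may be a long way out.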

intel 3d (5, Funny)

maestroX (1061960) | more than 2 years ago | (#39229917)

so, how does it stack up against a Riva TNT2?

Re:intel 3d (0)

Anonymous Coward | more than 2 years ago | (#39229941)

Or my old Voodoo Extreme?

Re:intel 3d (1)

Kymermosst (33885) | more than 2 years ago | (#39229955)

Hell, I had a Voodoo Rush and I think it was faster than a new Intel 3D card.

Re:intel 3d (1)

Tastecicles (1153671) | more than 2 years ago | (#39230553)

I've still got a 32MB Voodoo 3 3k (given to me as surplus to requirements - also the guy couldn't get the driver to work on NT, which I managed to get going on Slackware 8)... still works, too. I'm using it as a head for one of my thin clients.
Another client has a NVidia Riva TNT2 Model 64 32MB dual head AGP (my first AGP card).
The third has an ATI Rage Pro 8MB (upgraded from 4MB). This was the first PCI graphics card I ever bought.

Ridiculously old cards, but they still work as advertised - which is plenty good enough for low-power machines which are only booted up when someone wants to write a document, do some light browsing, send a quick email or hook into the media server.

Re:intel 3d (1)

justforgetme (1814588) | more than 2 years ago | (#39232713)

Weren't the Voodoo 3 cards 3D accel only? How did you manage to push frame buffers through them?

Re:intel 3d (1)

nbehary (140745) | more than 2 years ago | (#39232843)

No, the Voodoo 3 was both 2D/3D. It was the first of their cards that wasn't 3D only. (Well, in the main line; I think there was a variant of the 1 or 2 that was less powerful but also did 2D. Not sure about that though.)

Re:intel 3d (1)

Anaerin (905998) | more than 2 years ago | (#39233251)

Voodoo 1 and 2 were 3D only. The Voodoo Banshee and Voodoo 3+ had framebuffers too, so they could be used as full video cards.

Meh. (0)

Anonymous Coward | more than 2 years ago | (#39230467)

Hercules Stingray 128 is where it's at.

Hercules, man! It's like some kind of Greek Demigod of SVGA Awesome.

Re:intel 3d (1)

Albert Sandberg (315235) | more than 2 years ago | (#39230215)

How was your fillrate with TNT2 on 1080p resolution? :-)

Re:intel 3d (0)

Anonymous Coward | more than 2 years ago | (#39231301)

with realtime raytracing?

I did some raytracing back when those particular cards were popular. That sort of video would have taken weeks to months to render.

While raytracing is 'embarrassingly' parallel it is also embarrassingly expensive.

By my math it is 4 channels (probably not 8 bit, usually floating point) at 1900x1080, which is 8,208,000 calculations. That does not include the pre-calc z-buffer elimination (another 2 million) or the backscatter rays (probably another 20 million). Oh, and do it at 60fps. So yeah, it is actually semi impressive.

Also, didn't they show this same demo a couple of years ago, just before they canned their 32-way Atom processor? This seems more like a tech demo to sell to movie studios or people who do animated movies.
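
As a sanity check, the per-frame arithmetic above holds up (keeping the commenter's 1900x1080 figure, though a true 1080p framebuffer is 1920x1080):

```python
# Reproducing the parent's numbers: 4 channels at 1900x1080.
width, height, channels = 1900, 1080, 4
per_frame_values = width * height * channels
print(per_frame_values)        # 8208000, matching the 8,208,000 claimed
print(per_frame_values * 60)   # 492480000 shading values/second at 60 fps
```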

Good (2)

Grindalf (1089511) | more than 2 years ago | (#39229931)

Good, FHD 1080p is old now too, UV Ray disks are about 4 times as wide and that's coming "Real Soon Now."

Nested links are nested (1)

Anonymous Coward | more than 2 years ago | (#39229953)

Jesus guys, how many Slashdot articles do I have to go back through until I can find the original Wolfenstein thing?

Raytracing is embarrassingly parallel (2)

gentryx (759438) | more than 2 years ago | (#39229957)

...which is why it's easy to scale up. Thus the speedup isn't that impressive. Scalability on tightly coupled apps would be much more interesting.

Re:Raytracing is embarrassingly parallel (0)

Anonymous Coward | more than 2 years ago | (#39231653)

As a parallel software developer, I think you have it backwards (and yes, I know I sound crazy). The scalability of applications which have lots of data dependencies and serialization points is actually UNinteresting. This is because it's easy to predict how such software will scale: the prediction is "like crap." The real fun of parallel programming is trying to optimize those curves, not map out all the different shapes they take.

Maybe plotting out Joe Blow's brain-dead pthreads implementation of parallel merge sort which achieves a 2x speedup on 16 cores is interesting in a theoretical sense, but if you want to have an entertaining conversation, skip it.
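
The contrast between the two kinds of scaling can be made concrete with Amdahl's law (a back-of-envelope sketch, assuming the simple Amdahl model even applies to this renderer):

```python
# Amdahl's law: speedup on n workers given serial fraction s of the work.
def amdahl_speedup(s, n):
    return 1.0 / (s + (1.0 - s) / n)

print(amdahl_speedup(0.0, 8))    # perfectly parallel: 8.0x
print(amdahl_speedup(0.05, 8))   # just 5% serial already caps it near 5.9x

# Inverting it for the article's figure: 7.7x on 8 cards implies
# s = (1/7.7 - 1/8) / (1 - 1/8), i.e. roughly 0.56% serialized work.
s = (1 / 7.7 - 1 / 8) / (1 - 1 / 8)
print(round(s * 100, 2))
```

Which is the grandparent's point from the other direction: a 7.7x speedup on 8 cards is exactly what you'd expect from an embarrassingly parallel workload with almost no serialization, so the curve itself tells you little.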

Game makers take note. (0)

Anonymous Coward | more than 2 years ago | (#39229979)

Making your game LOOK this good has nothing to do with getting me to buy it.

Making it play well. Telling a good story. Having customizable control options. Having replayability.
Installing and uninstalling quickly and cleanly, with no extra processes running or hoops to jump through to play...

Those ARE good ways to get me to buy your game.

Re:Game makers take note. (1)

maxwell demon (590494) | more than 2 years ago | (#39230123)

Intel doesn't make games. Intel makes hardware. You can use that hardware to play great games, or you can use the same hardware to play bad games. GPUs cannot help with the story, the replayability or the installation, but they can help with the graphics.

Re:Game makers take note. (1)

icebraining (1313345) | more than 2 years ago | (#39230259)

And with the physics simulation.

Re:Game makers take note. (1)

Rockoon (1252108) | more than 2 years ago | (#39230781)

Without a lot of work being done, PhysX and the like are just a high-latency, wholly inefficient transaction. Since games must support people who cannot accelerate the physics, they also cannot do lots of it, culminating in PhysX just being that high-latency transaction. It becomes just a bullet point on the box rather than something advantageous.

Maybe in a few more years, when games give up on supporting the current mid-range GPUs....

Re:Game makers take note. (0)

Anonymous Coward | more than 2 years ago | (#39230433)

And game makers will look at this demo as something they want to do eventually. It's good to be clear with what we want.

Re:Game makers take note. (1)

maxwell demon (590494) | more than 2 years ago | (#39230489)

And game makers will look at this demo as something they want to do eventually. It's good to be clear with what we want.

While game makers will probably look at the demo, they will certainly not look at a random comment of an Anonymous Coward in a Slashdot discussion about that demo.

How is it a "cloud" ... (1)

dbIII (701233) | more than 2 years ago | (#39229991)

... if it's all in the same datacentre?

Re:How is it a "cloud" ... (1)

Anonymous Coward | more than 2 years ago | (#39230039)

It's a "cloud" because people don't understand what it means and because it sounds better to marketing.

For airsoft it's "Lipoly Ready," which means nothing; all it means is that it's physically possible to attach a lipoly battery to it, that's it.

For food it's "healthy," "natural," etc. Then you read the label and you find out the so-called natural ingredients make up almost nothing of the product.

For cars I can't even think of one, as they try to pull too many things, and I look into the car specs and performance more than the marketing. So I failed the car analogy.

Re:How is it a "cloud" ... (1)

gl4ss (559668) | more than 2 years ago | (#39230059)

they forgot where in the datacentre the machines were.

then it qualifies as cloud.

came for a 1080p vid of wolfenstein 3d raytraced.. (1)

gl4ss (559668) | more than 2 years ago | (#39230053)

read the article, got disappointed. it's a reboot they're raytracing :(

and couldn't find a video (youtube has an older vid..).

and one of the links in the article is broken.

shoddy. now someone do a hack to make OnLive's servers do this parallel setup..

Desperate attempt to make it seem feasible (1)

fa2k (881632) | more than 2 years ago | (#39230169)

A really cool article, but why do they spin it as a 'cloud' setup?

In my experience, the gamers who care about such beautiful graphics are happy to spend a few grand on hardware. They are not happy with jitter due to the internet connection, or waiting in line for a server.

cool (0)

arthuro (2587621) | more than 2 years ago | (#39230299)


Not the original :( (4, Funny)

shish (588640) | more than 2 years ago | (#39230449)

Apparently this is the newer wolfenstein games; I wanted to see what 8 GPUs worth of fancy effects could do to the original pre-Doom Wolfenstein :(

Re:Not the original :( (0)

Anonymous Coward | more than 2 years ago | (#39231779)

That would be Wolfenstein 3D then... not what the title of the article stated at all.

Here's as close as you'll get (0)

Anonymous Coward | more than 2 years ago | (#39233927)

New Wolf (the original Wolf3D, albeit redone in OpenGL)


Is it really that special? (1)

poly_pusher (1004145) | more than 2 years ago | (#39230461)

I'm impressed by the raytracing speeds and all, but is it surprising that it has near-linear scaling? Raytracing is very well suited for parallel processing, and scaling is nearly linear on CPUs if the software is well optimized and you're on a good network.

Re:Is it really that special? (1)

Tastecicles (1153671) | more than 2 years ago | (#39230589)

hmmm... optimised software. Read: custom code for massively parallel clusters. Oh, yeah. :)
Good network. Read: a 2-ary 4-tree with twin redundant fibre switching. Or, for home users with a bit of spare cash rather than a University department with an EOY budget to blow: several lengths of cat5, some PCI Gigabit ethernet cards, and redundant Gigabit switchgear (what I did with a pair of DLink 24-port Gigabit switches and a boatload of surplus cat5 patch cables; oh yeah, that's one fast network).

IAAG (I Am A Geek).

What's with the X deep links? (1)

dutchwhizzman (817898) | more than 2 years ago | (#39232419)

I want my visual porn, not an endless link farm of old /. articles.

ray/path tracing already possible with current gen (0)

Anonymous Coward | more than 2 years ago | (#39234273)

cool, but... (0)

Anonymous Coward | more than 2 years ago | (#39236315)

really it makes rasterization look damn good.

My new gaming system (0)

Anonymous Coward | more than 2 years ago | (#39245561)

So now I just need a server room in my basement to run a render farm so that I can play Wolfenstein. God help us when we find out the requirements for running Skyrim on this.
