
Ray Tracing To Debut in DirectX 11

CmdrTaco posted more than 6 years ago | from the directx-11-is-confusing-to-my-brain dept.


crazyeyes writes "This is breaking news. Microsoft has not only decided to support ray tracing in DirectX 11, but they will also be basing it on Intel's x86 ray-tracing technology and get this ... it will be out by the end of the year! In this article, we will examine what ray tracing is all about and why it would be superior to the current raster-based technology. As for performance, well, let Intel dazzle you with some numbers. Here's a quote from the article: 'You need not worry about your old raster-based DirectX 10 or older games or graphics cards. DirectX 11 will continue to support rasterization. It just includes support for ray-tracing as well. There will be two DirectX 11 modes, based on support by the application and the hardware.'"
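For the curious, the core operation the summary is talking about can be sketched in a few lines of Python. This is a toy illustration of what a ray tracer computes per pixel, not anything from DirectX or Intel's implementation: cast a ray and intersect it analytically with scene geometry (here, a single sphere).

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Return the nearest positive hit distance t along the ray, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    Assumes `direction` is normalized.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# Camera at the origin looking down -z; one primary ray through the middle pixel.
hit = ray_sphere_t((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(hit)  # nearest intersection at t = 4.0
```

A rasterizer instead projects triangles onto the screen and fills them; the article's "two DirectX 11 modes" claim amounts to supporting both approaches side by side.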





Anonymous Coward | more than 6 years ago | (#22921214)

God created Adam and Eve, not Adam and Steve!!


Anonymous Coward | more than 6 years ago | (#22922132)

No, it did say Adam and Steve. That's what you get for reading a shoddy translation.

Only available with Windows 7 (-1, Flamebait)

inTheLoo (1255256) | more than 6 years ago | (#22921216)

and other vaporware that's just around the corner. Pay no attention to the Vista reality, OSX or that Linux thing with its OpenRT.

Sleep little techie, don't say a word.
Stevie is going to buy you a VR world.

Re:Only available with Windows 7 (4, Insightful)

Vigile (99919) | more than 6 years ago | (#22921736)

This is very obviously a lie, or an early April Fools' joke. I didn't know Slashdot fell for those. Did anyone actually read the last page?

Re:Only available with Windows 7 (5, Insightful)

Shade of Pyrrhus (992978) | more than 6 years ago | (#22922052)

Yeah, for anyone who reads up to the last page, it seems pretty clear that it's not true. Something like this would more likely be announced by Microsoft PR a good while before release, in order to build some hype.

TFA states

"As DirectX 11 is a work in progress, Microsoft does not have an exact timeline. But the source claims that DirectX 11 could be part of Windows Vista by late 2008."
I don't know where these guys get their information, but even Microsoft plans ahead for the products it creates - especially ones due to ship the same year! The absurdity climaxes on the third page. Do yourself a favor and read it for a little laugh.

"They also plan to have DirectX 11 ready in time to debut with Windows Vista Service Pack 2"
Service Pack 2? Sure, SP1 wasn't much of an improvement and SP2 might be needed - but, again, plans for this would have been announced well in advance by Microsoft.

Sorry guys, article is simply BS.

Call me old and grumpy (5, Funny)

Bullseye_blam (589856) | more than 6 years ago | (#22921230)

But I am really annoyed that April Fool's has now become a multi-day event.

Re:Call me old and grumpy (5, Insightful)

Nimey (114278) | more than 6 years ago | (#22921636)

Shit. That reminds me that I'm going to have to ignore Slashdot tomorrow because it'll be full of unfunny-because-they're-trying-too-hard stories.

Re:Call me old and grumpy (5, Funny)

pohl (872) | more than 6 years ago | (#22921704)

No doubt. At this point the best slashdot could do on April 1 is post 100% real stories and watch everybody try to figure out where the silly fake stuff is.

Re:Call me old and grumpy (3, Insightful)

StarvingSE (875139) | more than 6 years ago | (#22921848)

That's how it used to be before the OMG PONIES era...

Re:Call me old and grumpy (4, Interesting)

Nimey (114278) | more than 6 years ago | (#22921928)

They've been doing this for a while before OMG PONIES actually. There was one year where they had a really good April Fools' post about how Microsoft was doing something evil to Slashdot (cease-and-desisting or something) but too many people forgot what day it was & it got blown out of proportion.

So now we have the current situation of stories so stupid that anyone can tell they're fake.

Re:Call me old and grumpy (1)

shawn(at)fsu (447153) | more than 6 years ago | (#22921788)

But you might miss the next "OMG Ponies"!

Re:Call me old and grumpy (5, Funny)

Ed Avis (5917) | more than 6 years ago | (#22921818)

Just like any other day of the year then?

Re:Call me old and grumpy (1)

gstoddart (321705) | more than 6 years ago | (#22922212)

Shit. That reminds me that I'm going to have to ignore Slashdot tomorrow because it'll be full of unfunny-because-they're-trying-too-hard stories.

I'm far more worried about that evil friggin' pink color scheme which burned itself into my retinas last year.

But, yes, excellent point. Must avoid Slashdot on April 1st. It just gets way too friggin' stupid.


Re:Call me old and grumpy (4, Interesting)

Dachannien (617929) | more than 6 years ago | (#22921892)

This way, they can post it as a dupe tomorrow, thus making it funny for multiple reasons.

Re:Call me old and grumpy (1, Interesting)

Anonymous Coward | more than 6 years ago | (#22922220)

I demand ponies! Seriously, Slashdot should revisit that joke.

Just ignore the people that are going to call it a dupe.

You're not alone. (1)

pragma_x (644215) | more than 6 years ago | (#22922308)

I just looked at the calendar and let out a deep sigh. The entire internet is going to be unusable all week, isn't it?

Poster is really excited (4, Insightful)

pembo13 (770295) | more than 6 years ago | (#22921234)

If one thing's for sure. Pity DirectX 11 will work on so few platforms.

Re:Poster is really excited (0)

Anonymous Coward | more than 6 years ago | (#22921396)

If visiting Slashdot was illegal, would you do it?
No, it's not really that entertaining, and the few pieces of interesting factual information I get here I can find a million other places.

OpenGL (1)

phorm (591458) | more than 6 years ago | (#22921564)

I haven't really been in the 3d-graphics-API scene for awhile, so I'm wondering what's available for OpenGL raytracing. There are a bunch of plugins etc for 3d-rendering that I remember, such as POVRay, etc, but how about realtime?

Anyone know if there's anything available/in-the-works?

Re:OpenGL (4, Informative)

Yetihehe (971185) | more than 6 years ago | (#22921678)

There is currently only OpenRT, which has "Open" in its name only for similarity with OpenGL (it is a fully proprietary implementation, but its API is similar to OpenGL's).

Re:OpenGL (0)

Anonymous Coward | more than 6 years ago | (#22921900)

I wish they would create a DirectX/OpenGL API for game stories with minimum requirements. Most games these days are just a new graphical shell on some licensed game engine with a story tacked on for advertising/promotional purposes. Pretty graphics do not make a game. I miss the days when you could get totally immersed in a game story and feel like you were a part of things going on.

Re:Poster is really excited (1, Offtopic)

YaroMan86 (1180585) | more than 6 years ago | (#22922032)

It gives me great comfort that the so-called "industry leader", Microsoft, seems to be leading from last place in the group.

Why the fuck did it take DirectX eleven iterations to implement an old technology? And why is Microsoft putting DirectX on fewer platforms now?

Surprisingly forward thinking on MS' part (4, Insightful)

Froze (398171) | more than 6 years ago | (#22921236)

Or maybe just obvious to anyone in the industry. Since clock speeds are bounded and not getting any faster and you can only lower voltages so much before signals get lost in the noise, the only way forward is in parallelism and ray tracing is wondrously parallelifyable (is that a real word?).
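The parallelism point can be made concrete: each pixel's primary ray depends only on its own coordinates plus read-only scene data, so an image splits across workers with no coordination during rendering. A toy sketch (a stand-in shader replaces real ray tracing; threads are used for brevity, though a real renderer would use processes or GPU threads for actual speedup):

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 8, 4

def shade(x, y):
    # Stand-in for tracing one primary ray; depends only on (x, y) and
    # read-only scene data, which is what makes the problem parallel.
    return (x * 31 + y * 17) % 256

def render_row(y):
    return [shade(x, y) for x in range(WIDTH)]

# Each row is rendered with no communication between workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    image = list(pool.map(render_row, range(HEIGHT)))

# Same result as a purely serial loop, regardless of worker scheduling.
serial = [[shade(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]
assert image == serial
```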

One new M$ feature = buy a new OS (1)

diodeus (96408) | more than 6 years ago | (#22921288)

Yeah, but you'll have to buy a new post-Vista operating system just to get this nifty new feature.

Re:Surprisingly forward thinking on MS' part (1)

The Living Fractal (162153) | more than 6 years ago | (#22921328)

It's probably parallelable.

Anyway, clock speeds aren't increasing much right now but I suspect that this is only a limitation of current tech. Someday I hope we can get clock speeds to reach much higher levels.. like say a frequency close to the composition of planck times [] inherent in the circuitry of the CPU.

Re:Surprisingly forward thinking on MS' part (0)

Anonymous Coward | more than 6 years ago | (#22921352)

parallelifyable (is that a real word?)
No. The word you want is "parallelizable".

Re:Surprisingly forward thinking on MS' part (4, Funny)

the_humeister (922869) | more than 6 years ago | (#22921498)

...parallelifyable (is that a real word?).

Yes, it's a very cromulent word.

Re:Surprisingly forward thinking on MS' part (1)

gabebear (251933) | more than 6 years ago | (#22921592)

I don't think ray tracing has any inherent advantage over rasterization when it comes to parallelism. Both techniques require that each rendering node have access to all of the data for the scene. Plenty of parallel rasterization hardware cards are on the market, the first one I used was the Voodoo2 SLI [] .

Software rendering may come back into style with these faster CPUs, but I'm doubting we are going to see ray tracing gain any serious ground in real-time 3D rendering.

Re:Surprisingly forward thinking on MS' part (1)

CastrTroy (595695) | more than 6 years ago | (#22921726)

I remember a comment from a previous story on ray tracing that basically said: since raster processing is faster, you will always be able to get a better image out of a raster graphics processor than from a ray tracer in the same amount of time. Ray tracing is nice if you have time to wait around, but if you wait around the same amount of time with raster processing, you'll get a better image.

Re:Surprisingly forward thinking on MS' part (1)

Froze (398171) | more than 6 years ago | (#22921748)

The benefit is that ray tracing can generate better scenes by evaluating the physics of lighting in a more accurate way. The simple parallelism aspect comes into play when you have 32 GPUs that can each render individual frames at rates higher than one per second, giving you more realistic animation and better lighting physics, not to mention the possibility of intra-scene parallelism. As a reference, the Wikipedia article here has corroboration []

Re:Surprisingly forward thinking on MS' part (1)

poetmatt (793785) | more than 6 years ago | (#22922004)

That may sound like a good idea, but what about the fact that Carmack says the stuff is a waste [] (and isn't planning to program it in)? Let alone that we're talking about Intel integrated graphics, which is not the stuff gamers use. So where is this magically supposed to make a difference, other than breaking compliance?

Personally, I'd have people pay more attention to textures and make things more efficient than concentrate on a new shiny method to improve shadows. Shadows are the first thing to go from non high end systems. Does any part of intel graphics sound like high end systems? No, and it never will. So who is going to really use this stuff? About scalability: all graphics have scalability. Ray tracing is low on that priority level. With 32 GPUs, things like textures will come first, not ray tracing, not rasterization, and not anything involving shadows.

The wiki link talks a lot more about Rasterization (current method) than Ray Tracing. Ray tracing has its potential uses, but we are WAY way off from it being necessary or something.

Re:Surprisingly forward thinking on MS' part (1)

Wavebreak (1256876) | more than 6 years ago | (#22921820)

Quite true; one of the major factors behind rasterization being so ubiquitous is its inherent parallelizability. Modern GPUs are basically just loads of somewhat slow special-purpose processors working in parallel. However, ray tracing is just as parallelizable, and it might scale better on future hardware, so we might start seeing some movement towards it.

Re:Surprisingly forward thinking on MS' part (0)

Anonymous Coward | more than 6 years ago | (#22921746)

If it's a word, it'd be spelled "parallelifiable".

Re:Surprisingly forward thinking on MS' part (1)

plague3106 (71849) | more than 6 years ago | (#22921772)

Why do you assume advances won't let us move to higher clock speeds and lower voltages? I think the word you're looking for is "parallelizable."

oh, please (1)

nguy (1207026) | more than 6 years ago | (#22921844)

Ray tracing has been around for decades, and it's been steadily getting faster. Even in the 1980's, it was clear that eventually it would be doable in real-time. And there have been real-time ray tracing demonstrations around for several years now.

So, there is nothing "forward thinking" about this, Microsoft is simply following industry trends.

Re:Surprisingly forward thinking on MS' part (1)

MoparMark (1043238) | more than 6 years ago | (#22922034)

is wondrously parallelifyable (is that a real word?).
President Bush would think so.

Re:Surprisingly forward thinking on MS' part (1)

SQLGuru (980662) | more than 6 years ago | (#22922244)

Dan Quayle thought so first.

Re:Surprisingly forward thinking on MS' part (0)

Anonymous Coward | more than 6 years ago | (#22922168)

Also, most realtime raytracing is currently done with an OpenGL like api (OpenRT), which is completely cross platform and does not use any proprietary Microsoft technologies. Given that DirectX seems to be such a good... argument for people to buy the latest and greatest Windows, this is good forward thinking indeed!

It's surprising, though, that they would exclude certain hardware vendors by standardizing on a specific architecture. You would think that they need all the support they can get.

Re:Surprisingly forward thinking on MS' part (0)

Anonymous Coward | more than 6 years ago | (#22922292)

Ray-tracing will never beat rasterization. I said it ten years ago, I've said it to every vendor demoing "real-time" ray-tracing hardware (none of which has been successful), and I stick with that assessment today (and yes, I've designed shipping 3d hardware). Pixels are grossly parallel, that's a fact that's taken advantage of by all the 3d hardware out there. Modern rasterizers already have multiple triangles' worth of pixels in-flight on hundreds or thousands of threads in parallel, and each thread is far simpler than even the most cut-down x86 core. Rasterizers use a tiny amount of fixed-point hardware to step through pixels within triangles, and utilize wide lines of memory very very efficiently. Modern hierarchical techniques are equally applicable to rasterizers, so there's no magic bullet for ray-tracers there. Ray-tracers waste most of the memory bandwidth they need because the data structures are tree-like rather than well suited for streaming, and all the tricks to increase coherence make the hardware vastly more complicated. The cost of memory bandwidth is what drives 3d capabilities at the consumer level, and for any given amount of bandwidth you will have vastly more flexibility and realism with a rasterizer (not to mention far greater realism per Watt).

To Tech ARP... (1)

yoblin (692322) | more than 6 years ago | (#22921284)

Say it with me: 30 days has September, April, June and November. All the rest have 31, except February which has 28... (someone messed up their April Fools joke, judging by the "industry responses")

Re:To Tech ARP... (1)

91degrees (207121) | more than 6 years ago | (#22921414)

They're (allegedly) in Malaysia. Granted, it's still an hour or so too early but getting an April fools joke in shortly before midnight for the next day's edition would explain things.

Either that or the entire Tech industry has started hiring PR people via "investment opportunity" emails.

Re:To Tech ARP... (1)

Dunbal (464142) | more than 6 years ago | (#22921474)

omg ponies?

Re:To Tech ARP... (1)

F-3582 (996772) | more than 6 years ago | (#22921948)

Read the version tag on the third page. It was filed on April 1st, so it is definitely an April Fools joke.

Re:To Tech ARP... (3, Funny)

F-3582 (996772) | more than 6 years ago | (#22921432)

Yep, like Microsoft's:

Who told you this? We have been monitoring your articles based on leaked Microsoft information and like this one, they are ALL incorrect. Please let us know who your source is so we can correct him. (Editor : Or fire him???) Note that we have notified our legal department and the FBI as all Microsoft internal documents are not meant to be taken out of the building. They will be in touch shortly. Please extend all courtesies in cooperating with their investigation. (Editor : Good luck! We are in Malaysia!)

By the way: It is already April 1st over there.

Translation (0, Offtopic)

TheSpoom (715771) | more than 6 years ago | (#22921292)

Microsoft: "Pleeeeeease buy Vista! We'll even give you more eye candy!"

Bah! (1)

neowolf (173735) | more than 6 years ago | (#22921294)

If this isn't an April Fools joke- maybe they could get DX-10 to work first, before worrying about DX-11?(!)

Re:Bah! (1)

0racle (667029) | more than 6 years ago | (#22921606)

The date on the article is April 1.

Re:Bah! (0)

Anonymous Coward | more than 6 years ago | (#22921686)

Could this also be seen as a move to lock AMD/ATI out, too, since it's based on Intel's implementation? At the very least, it seems to be another way to extract more licensing fees from AMD.

"This is breaking news" (0, Offtopic)

lampsie (830980) | more than 6 years ago | (#22921296)

Dugg, for breaking news.

Re:"This is breaking news" (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#22921604)

How the hell is this "insightful"? "Dugg" is not insightful... it's inane.

Looks like a shun to current GPUs (4, Interesting)

LiquidCoooled (634315) | more than 6 years ago | (#22921322)

It says nvidia will be locked out because DirectX 11 ray tracing will be based on x86.
Wasn't DirectX meant to be a generic middleman that lets developers abstract away from specific implementations?

Isn't this a backwards step that basically cuts anyone developing for it off from using the code on other systems (and I mean even the Xbox 360)?

Re:Looks like a shun to current GPUs (0, Troll)

Dunbal (464142) | more than 6 years ago | (#22921374)

Wasn't DirectX meant to be a generic middleman to allow developers to abstract away from the specific implementations?

      What part of EXTINGUISH don't you understand? This is Microsoft standard procedure. You will ALL be assimilated.

Re:Looks like a shun to current GPUs (1)

GreggBz (777373) | more than 6 years ago | (#22922314)

Now, hopefully I'm explaining this right. I'm sure a developer will set me straight if it's wrong.

DirectX works by talking right to the driver. Hence the name, DirectX. The hardware vendor is responsible for translating said DirectX function to operations in their hardware.

It's good in that there's less overhead and the drivers can be optimized per the vendor. It's bad in that some of the features may or may not be supported in hardware, and you are at the mercy of the vendor.

OpenGL, on the other hand, is a type of state machine, kind of a blob of functions and states you can use in a software VM, so there is an abstraction layer between the hardware and the API calls.

Basically, all OpenGL functions and state queries should be available, and the ones that are not supported by functions in hardware are run in software via the state machine.

Which one's better? I dunno. I personally prefer DirectX. Shame though, it's one OS only.
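The hardware-or-software fallback the parent describes can be sketched as a toy state machine (hypothetical names throughout, not the real OpenGL or driver interface): commands mutate state, and capabilities the "hardware" lacks are transparently routed to a software path.

```python
class ToyGL:
    """Toy state machine loosely in the spirit of OpenGL's model.

    Hypothetical API for illustration only: every capability can be
    enabled, but ones the pretend hardware lacks fall back to software.
    """

    HW_CAPS = {"DEPTH_TEST"}  # pretend the hardware supports only this

    def __init__(self):
        self.state = {}

    def enable(self, cap):
        # The caller never has to care which backend runs the feature.
        self.state[cap] = "hardware" if cap in self.HW_CAPS else "software"

gl = ToyGL()
gl.enable("DEPTH_TEST")
gl.enable("FANCY_RAYTRACE")
print(gl.state)  # {'DEPTH_TEST': 'hardware', 'FANCY_RAYTRACE': 'software'}
```

The flip side, as the parent notes, is that the software path can be drastically slower, which is why "supported" and "usable" are not the same thing.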

Misread as ... (1)

gzipped_tar (1151931) | more than 6 years ago | (#22921324)

Ray tracing to debut in Direct X11

Vista only, BTW. (1, Insightful)

snarfies (115214) | more than 6 years ago | (#22921344)

DX10 is Vista-only. I'm going to guess DX11 will be the same. Which means I'll never see it in action, as I will switch to Linux before I switch to Vista.

Wow direct X 11 (1, Funny)

Dunbal (464142) | more than 6 years ago | (#22921356)

This is great news for the 14 people who actually own Vista.

Re:Wow direct X 11 (1)

The MAZZTer (911996) | more than 6 years ago | (#22921552)

I own Vista. I got it for free through an MS promo (no, not the "TPB promo"). It's not great news for me, since all my games run much better in XP than Vista and I rarely boot into Vista anymore.

Re:Wow direct X 11 (1)

hairyfeet (841228) | more than 6 years ago | (#22921888)

Which is why I don't understand WTH MSFT thinks they are doing, believing it is a good idea to kill off XP. What, are they nuts? First, they've got little cheap EEE-style laptops taking off - those sure as hell aren't going to run Vista. Gamers hate it as it kills framerates dead. Casual users hate it because they changed too damn many things and now they can't find anything, and businesses hate it as it needs twice the hardware to run half as fast.

If anyone at MSFT has any common sense left, they'll repackage XP with SP3 like they did with SP2 and ride the sales of that until they can get Windows 7 out the door. And unless they want 90+% of games to stick with DirectX 9c, they should seriously think about backporting DX10 to XP. I do a lot of Windows repair work and upgrade a lot of boxes for gamers, and when asked if they'll go Vista they always look at you like you have a screw loose. I was one of the beta testers and gave away the copy I was given, because it made my 3.06GHz Celeron with 2GB of RAM feel like a 486 with 1MB running Win95. It was just too painful for words.

Funny thing is, last I heard my copy of Vista had already changed hands four times. It is becoming like one of those fruitcakes that just keeps getting passed around. And everyone that has brought a Vista machine into the shop has hated it, and either wanted me to somehow make it act like XP or just gave up and asked me to pick up a copy of XP and install it. After word got around that I still build XP boxes, guys that were still running Win98 and Win2K (one even running WinME - yikes) have come in wanting one built before the cutoff date. The bad press and word of mouth have everyone around here avoiding Vista like the clap. If MSFT doesn't want a mass exodus to Apple and Linux land, they should really rethink killing XP. But that is my $0.02 from down here in the trenches, YMMV.

Re:Wow direct X 11 (2, Funny)

Charcharodon (611187) | more than 6 years ago | (#22921658)

I think you made a minor math error....

....You forgot to carry the one a hundred million times or so.

Re:Wow direct X 11 (1)

CastrTroy (595695) | more than 6 years ago | (#22921806)

Even better for the people who live 6 years in the future and who are running Windows 7.


Anonymous Coward | more than 6 years ago | (#22921366)

I can't wait until I run Windows to be able to experience the awesome benefits of all the hard work that has gone in to coding this masterpiece.

John Carmack on Ray Tracing (5, Interesting)

Reality Master 101 (179095) | more than 6 years ago | (#22921418)

An interesting read on this very subject here [] . Quote:

"I have my own personal hobby horse in this race and have some fairly firm opinions on the way things are going right now. I think that ray tracing in the classical sense, of analytically intersecting rays with conventionally defined geometry, whether they be triangle meshes or higher order primitives, I'm not really bullish on that taking over for primary rendering tasks which is essentially what Intel is pushing."

Carmack admits he has his own personal preference, but generally he's pretty sensible about these things. He's usually called it correctly in the past when people have pushed various technologies that were supposed to take over the world, and they've fallen by the wayside.

Hopefully he'll chime in on this latest article with some further thoughts.

Re:John Carmack on Ray Tracing (0)

Anonymous Coward | more than 6 years ago | (#22921530)

It's not actually Carmack's horse. It was Hanan Samet's horse far before Carmack's. Carmack is a horse thief!

Re:John Carmack on Ray Tracing (1)

Dr. Eggman (932300) | more than 6 years ago | (#22921558)

That article was also discussed on Slashdot [] .

Re:John Carmack on Ray Tracing (1)

eebra82 (907996) | more than 6 years ago | (#22921560)

PC Perspective also features an article [] written in January on the impact of ray tracing in games. It provides many pictures of what it will look like and what the benefits are vs rasterization. It's written by Daniel Pohl, a research scientist at Intel.

Re:John Carmack on Ray Tracing (2, Informative)

Azarael (896715) | more than 6 years ago | (#22921662)

There's this one from PC Perspective as well which is an interview with NVidia's Tech Director: []

His view on ray tracing is pretty much summed up by:

David Kirk, NVIDIA: I'm not sure which specific advantages you are referring to, but I can cover some common misconceptions that are promulgated by the CPU ray tracing community. Some folks make the argument that rasterization is inherently slower because you must process and attempt to draw every triangle (even invisible ones)--thus, at best the execution time scales linearly with the number of triangles. Ray tracing advocates boast that a ray tracer with some sort of hierarchical acceleration data structure can run faster, because not every triangle must be drawn and that ray tracing will always be faster for complex scenes with lots of triangles, but this is provably false.

There are several fallacies in this line of thinking, but I will cover only two. First, the argument that the hierarchy allows the ray tracer to not visit all of the triangles ignores the fact that all triangles must be visited to build the hierarchy in the first place. Second, most rendering engines in games and professional applications that use rasterization also use hierarchy and culling to avoid visiting and drawing invisible triangles. Backface culling has long been used to avoid drawing triangles that are facing away from the viewer (the backsides of objects, hidden behind the front sides), and hierarchical culling can be used to avoid drawing entire chunks of the scene. Thus there is no inherent advantage in ray tracing vs. rasterization with respect to hierarchy and culling.
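The backface culling Kirk mentions boils down to a single dot-product test per triangle. A minimal sketch, assuming counter-clockwise winding marks front faces (the usual OpenGL default); the vector helpers are just for self-containment:

```python
def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_front_facing(v0, v1, v2, view_dir):
    """A triangle faces the viewer when its geometric normal points
    against the view direction. Assumes CCW winding for front faces."""
    normal = cross(sub(v1, v0), sub(v2, v0))
    return dot(normal, view_dir) < 0.0

view = (0.0, 0.0, -1.0)                    # camera looking down -z
tri = ((0, 0, 0), (1, 0, 0), (0, 1, 0))    # CCW as seen from +z
assert is_front_facing(*tri, view)                       # drawn
assert not is_front_facing(tri[0], tri[2], tri[1], view) # culled
```

Roughly half the triangles in a closed mesh fail this test and are skipped before any per-pixel work, which is part of why rasterizers scale better than the naive "draw every triangle" argument suggests.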

Even slower Windows 7 graphics ... (1)

scruffy (29773) | more than 6 years ago | (#22921420)

... will be ensured by using ray tracing to render characters in your word processing application! Finally, Vista will get some love.

Out by end of the year? (1)

Mr.Fork (633378) | more than 6 years ago | (#22921424)

Out by end of the year in MICROSOFT TIME means OUT BY 2011 - Q4. Maybe.

Direct X11? (0)

Anonymous Coward | more than 6 years ago | (#22921448)

How does it differ from regular X11?

Re:Direct X11? (1)

TeknoHog (164938) | more than 6 years ago | (#22921644)

How does it differ from regular X11?

Must be something to do with direct rendering. Which reminds me, there's nothing new about Vista's DRM, after all you need the DRM module to get accelerated 3D in Linux/X11 ;)

More from David Kirk? (3, Interesting)

argent (18001) | more than 6 years ago | (#22921476)

"I'll be interested in discussing a bigger question, though: 'When will hardware graphics pipelines become sufficiently programmable to efficiently implement ray tracing and other global illumination techniques?'. I believe that the answer is now, and more so from now on! As GPUs become increasingly programmable, the variety of algorithms that can be mapped onto the computing substrate of a GPU becomes ever broader.

As part of this quest, I routinely ask artists and programmers at movie and special effects studios what features and flexibility they will need to do their rendering on GPUs, and they say that they could never render on hardware! What do they use now: crayons? Actually, they use hardware now, in the form of programmable general-purpose CPUs. I believe that the future convergence of realistic and real-time rendering lies in highly programmable special-purpose GPUs."
Very interesting. A couple of years later he was arguing against special-purpose GPUs for ray tracing, and for the use of "general-purpose GPUs", and the new nVidia 8xxx series seems to be following that path... away from dedicated rendering pipelines and towards a GPU that's more like a highly parallel CPU.

More comments from David Kirk. []

I would be very interested in what he learned between 2002 and 2004 that led him to argue so eloquently against Phillip Slusallek. I'd also like to know what Professor Slusallek is doing at nVidia, where he's "working with the research group on the future of realtime ray tracing" [] .

Some less breathless articles (5, Interesting)

jmichaelg (148257) | more than 6 years ago | (#22921478)

Intel has this article [] about the hardware needed to run at 50fps at 1920x1080p. They're claiming you need 8 cores. In a couple of years, that could well be within reach for most gamers.

There's also this John Carmack Interview [] . Carmack isn't too optimistic about ray tracing replacing rasterized graphics.

Duelling articles! (1)

argent (18001) | more than 6 years ago | (#22921594)

SaarCOR [] was getting about 10 FPS for Quake3 with a minimal FPGA-based implementation of a hardware raytracer running at less than 100 MHz with a fraction of the gate budget of a modern GPU... in 2005. Raytracing is highly scalable - it's an "embarrassingly parallelizable" problem - so if nVidia is really working on raytracing hardware they could well be able to beat Intel to the punch.

not "embarrassingly parallel" (1)

Speare (84249) | more than 6 years ago | (#22922030)

A number of people refer to raytracing as an "embarrassingly parallel" process. The implication of the term is that there's no need for communication between each core or thread or process: they each just get handed a rectangular portion of the offscreen frame buffer, they do their job alone, and when they're all done, the screen can be flipped to show the results.

I will grant that the actual rendering of pixels is indeed independent, but that's not the proposal. Nobody wants the same geometry shot from the same camera angle with the same lights to be rendered at 50Hz. They want motion. The lights move. The camera moves. The geometry moves. Every frame is different. And while the pixel-pushing is embarrassingly parallel, each one of those cores is going to have to be told what to draw each time around. Shared memory throttles the effectiveness of parallel processing. Shared caches, shared pipelines, shared buses, shared anything.

As the core count goes up, so does the cost of fanning out the new geometry updates every frame. I'm not going to say it's a deal-breaker, but it's hardly an "embarrassingly parallel" problem.

So they say (1)

esocid (946821) | more than 6 years ago | (#22921512)

It also obviates the need for the GPU which has stolen much of the limelight in recent years.
So that $250 EVGA 8800GTS [] I just bought will soon be used as a doorstop? I haven't even checked out DX10 with it; I'm still kicking along with DX9. I may test out Vista x64 Ultimate and see how Crysis runs there as opposed to XP, though I doubt the difference will be that dramatic. I somehow don't see GPUs disappearing when DX11 premieres, since not many people will actually have 8- or 16-core CPUs.

As DirectX 11 is a work in progress, Microsoft does not have an exact timeline. But the source claims that DirectX 11 could be part of Windows Vista by late 2008.
Whew, that makes me feel much better. In layman's terms, that means 2011, give or take 5 years (give).

But will it ... (1)

Alpha Whisky (1264174) | more than 6 years ago | (#22921528)

run Duke Nukem Forever?

Major breakthrough for Business Software (4, Funny)

Thanshin (1188877) | more than 6 years ago | (#22921584)

Raytracing allows the implementation of mirrors in 3d environments.

Finally all business software will have the feature of showing the cause of most problems. (See also "Error Id: 10T" and PEBKAC)

Vista Adoption (1)

Hythlodaeus (411441) | more than 6 years ago | (#22921610)

DX11, like DX10, will probably be Vista-only. So, will Intel build OpenGL support, roll their own API, or tie the success or failure of their graphics architecture to Vista?

Windows XP will soon go out of print (2, Insightful)

tepples (727027) | more than 6 years ago | (#22921962)

DX11, like DX10, will probably be Vista-only.
So will new PCs built with these new chips, if only because Windows XP will have been taken out of print by this July.

Get this! (0, Offtopic)

Improv (2467) | more than 6 years ago | (#22921618)

You will be dazzled and wowed by the suggestion that, get this, submitters should learn style (and how!) and those who approve stories should edit them for style! Get this! Yeah!

povray won't look outdated, yet (1)

G3ckoG33k (647276) | more than 6 years ago | (#22921664)

povray won't be outdated anytime soon, I guess. Today there is more to it than raytracing, like light-scattering effects etc. Still, if those additional effects are done in hardware too, povray and other renderers may face an uphill battle. Like within just a few years.

Re:povray won't look outdated, yet (1)

Yetihehe (971185) | more than 6 years ago | (#22921870)

if those additional effects are done in hardware too, povray and other renderers may face an uphill battle
What do you mean, battle? They will happily use these hardware effects for faster rendering!

Re:povray won't look outdated, yet (1)

G3ckoG33k (647276) | more than 6 years ago | (#22922010)

"What do you mean battle? They will happily use this hardware effects for faster rendering!"

How is that? Didn't realize that!

From: []

  _ _

"Will POV-Ray render faster if I buy the latest and fastest 3D videocard?"


3D-cards are not designed for raytracing. They read polygon meshes and then scanline-render them. Scanline rendering has very little, if anything, to do with raytracing. 3D-cards can't calculate typical features of raytracing as reflections etc. The algorithms used in 3D-cards have nothing to do with raytracing.

This means that you can't use a 3D-card to speed up raytracing (even if you wanted to do so). Raytracing makes lots of float number calculations, and this is very FPU-consuming. You will get much more speed with a very fast FPU than a 3D-card.
  _ _

Ok, as this is a CPU hardware improvement it can be used by povray I guess.
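The FAQ's point about reflections can be made concrete. Here is a hypothetical sketch (not from the POV-Ray FAQ or POV-Ray's actual code) of why reflections fall out naturally in a raytracer: when a ray hits a mirror, you simply recurse with the reflected ray, something a scanline pipeline has no equivalent step for.

```python
# Hypothetical scene: a mirror plane in front of the camera and a sphere
# *behind* it. A raytracer sees the sphere in the mirror "for free" by
# recursing on the reflected ray; a scanline renderer needs special hacks.
MAX_DEPTH = 4
MIRROR_Z = -5.0                  # mirror plane z = -5, facing the camera
SPHERE = (0.0, 0.0, 3.0, 1.0)    # red sphere behind the camera at the origin

def reflect(d, n):
    """Reflect direction d about unit normal n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def hit_sphere(o, d):
    """Return the nearest positive ray parameter t hitting SPHERE, or None."""
    cx, cy, cz, r = SPHERE
    oc = (o[0] - cx, o[1] - cy, o[2] - cz)
    a = sum(di * di for di in d)
    b = 2.0 * sum(di * oci for di, oci in zip(d, oc))
    c = sum(oci * oci for oci in oc) - r * r
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None
    t = (-b - disc ** 0.5) / (2.0 * a)
    return t if t > 1e-6 else None

def trace(o, d, depth=0):
    t_sphere = hit_sphere(o, d)
    # Distance to the mirror plane (only if the ray heads toward it).
    t_plane = (MIRROR_Z - o[2]) / d[2] if d[2] != 0.0 else None
    if t_plane is not None and t_plane <= 1e-6:
        t_plane = None
    # Nearest hit wins.
    if t_sphere is not None and (t_plane is None or t_sphere < t_plane):
        return "red"
    if t_plane is not None and depth < MAX_DEPTH:
        # Reflection = recursion with the mirrored ray direction.
        hit = (o[0] + t_plane * d[0], o[1] + t_plane * d[1], MIRROR_Z)
        return trace(hit, reflect(d, (0.0, 0.0, 1.0)), depth + 1)
    return "background"
```

A ray fired straight at the mirror bounces back and "sees" the sphere behind the camera, which is exactly the kind of effect the FAQ says scanline hardware cannot compute directly.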

predictable (0)

Anonymous Coward | more than 6 years ago | (#22921680)

So basically you have one company who is looking for a reason to force people to upgrade their hardware working in tandem with another company who wants to force people to upgrade their software to push a technology that no current system is able to support adequately.

I am shocked! SHOCKED I SAY!!

End of year (1)

gmuslera (3436) | more than 6 years ago | (#22921690)

Don't be unfair. It's not that Microsoft intentionally delivers what they promise far later; it's that they measure time on an exponential curve while we measure it on a linear one, so their last month of this wait will take several of ours (if it happens in our lifetime, at least).

OpenRT (0)

Anonymous Coward | more than 6 years ago | (#22921692)

Of course, there's an existing, vendor-neutral raytracing API that is like OpenGL, only for raytracing. Will Microsoft implement it? Will they fuck. []

But will DirectX 11 support.... (1)

pandrijeczko (588093) | more than 6 years ago | (#22921720)

...non-linear non-regurgitations of non-previously released games titles that take more than 8 hours to complete for thirty five of our Earth pounds?

No, I thought not...

Re:But will DirectX 11 support.... (0, Troll)

phoenix321 (734987) | more than 6 years ago | (#22922098)

I take that as less than an educated guess. In fact, gameplay quality can only improve with raytracing: less effort has to be spent on the rendering engine, no more hacks to make it look good, fewer chances to screw up on complexity and bugs, and more chances of using the same 3D world and engine on multiple platforms, from workstations to handhelds.

This widens the market window for quality games while also increasing the budget percentage that can be spent on level design and storyboarding. I'm pretty optimistic about this one, as I hope a rendering engine using raytracing can become a generic commodity, so any independent game studio can easily get into the market and "hit the ground running", if you'll excuse the bad marketing metaphor. So I hope we see more independent and less derivative games, and maybe even some new genres, in the near future. New technologies always allow for that; I certainly remember when Wolfenstein 3D and others opened the door to the entire FPS genre we have today. I admit that while we have pixel-perfect glory by now, we're still stuck shooting Nazis, but the same technology led to the development of the modern 3D engines and hardware used for almost all games today.

And even if raytracing doesn't bring any more diversity into the market, I am absolutely, positively sure we will have an in-game rendering of Omaha Beach in perfect 1080p HD, so *maybe* some game developers get the clue that players do not want to virtually land on Normandy beaches ever again.

Re:But will DirectX 11 support.... (1)

pandrijeczko (588093) | more than 6 years ago | (#22922142)

Thanks for the detailed explanation, and I am no games developer - but hasn't just about the same thing been said for every new iteration of DirectX?

Makes Quad cores look a little more attractive (0)

RobinH (124750) | more than 6 years ago | (#22921758)

This is interesting to those of us buying new PCs right now. I was not sure whether to go with a dual or quad core. The problem is that very few applications that I would run actually make use of quad cores, but stuff like raytracing is highly parallel, so this gives me hope that purchasing a quad core processor won't be a waste of money.

Modern Ray Tracing is OVERRATED!!! (2, Funny)

SomeoneGotMyNick (200685) | more than 6 years ago | (#22921786)

I'll hold on to Imagine [] for my Amiga until it's pried from my cold, dead hands.

34.2 minutes per rendered frame gives me plenty of time to do other things around the house.

Actually, I would have mentioned Turbo Silver instead if there were any good links for it.

OpenGL (0)

Midnight Thunder (17205) | more than 6 years ago | (#22921884)

So Direct X is getting it. Are we likely to see something similar with OpenGL?

Re:OpenGL (1)

WhiteWolf666 (145211) | more than 6 years ago | (#22922208)

I think that is the realm of OpenRT [] instead.

April Fool's Day already? (0)

Masa (74401) | more than 6 years ago | (#22922124)

Oh, come on! This has to be either an early April Fool's Day joke or a false rumor. Maybe someday we will see something in this regard, but I think we are still quite a long way from practical implementations that could deliver ray tracing at a performance usable for gaming. Besides, there are no sources mentioned in the article; it seems to be based completely on rumors (or lies?), and at the end of the article there are some "quotes" from manufacturers that read as if a 12-year-old kid had written them. And what's the deal with all those smiley faces?

Re:April Fool's Day already? (1)

UtsuMaster (874626) | more than 6 years ago | (#22922306)

I think so too. The second page of the article is completely absurd. What kind of PR departments are those?

But in the Microsoft response, the editor comments that they are in Malaysia, so maybe it's already April 1st there? Joke lag, anyone?

RT vs Raster. (1)

phoenixwade (997892) | more than 6 years ago | (#22922228)

Instead of a reply buried in the RT vs. Raster debate that this article generated, I thought a reply to the entire thread would be more appropriate: WHHHOOOOOOOOOSSSSHHHHHHHHH.... That was the joke flying over your heads. It's an early April Fools bit, people. However, it might serve some of us to step back and examine our need to defend our own prejudices... {nah... what am I thinking... this is Slashdot..... Carry on.}

More MS lock-in (1)

jfbilodeau (931293) | more than 6 years ago | (#22922264)

Is anyone else seeing this as MS's tactic to ensure that games developed for Windows will only run on Windows? What I mean is that MS is almost forcing developers to adopt a new technology that forces gamers to upgrade to Vista (most likely) and upgrade their machines. Real-time ray tracing seems experimental to me, but I can see MS pushing it down developers' and users' throats. Why? Amongst other things, MS wants to make sure that the next generation of Windows games will not be available to Cedega (or other platforms like the Mac) for as long as possible.

I'm not against ray-tracing or new APIs for games. I'm against MS using it as another strategy to lock gamers into Windows.

Oh, and don't get me started on the patents that MS will probably acquire around that technology.

Look at the date on last revision of the article.. (0, Redundant)

ironwill96 (736883) | more than 6 years ago | (#22922320)

The "revision date" of the article is listed as 01-04-2008 (which, pretty much anywhere outside the United States, is the way of writing 04-01-2008). This is obviously an April Fool's joke; just read the last page with the comments from the various companies.

Oh, sorry, I forgot nobody actually reads articles on Slashdot, this is merely a forum for complaints!