Slashdot: News for Nerds

NVIDIA Doubts Ray Tracing Is the Future of Games

kdawson posted more than 6 years ago | from the pick-your-horse dept.

Graphics 198

SizeWise writes "After Intel's prominent work in ray tracing in both the desktop and mobile spaces, many gamers might be thinking that the move to ray-tracing engines is inevitable. NVIDIA's Chief Scientist, Dr. David Kirk, thinks otherwise as revealed in this interview on rasterization and ray tracing. Kirk counters many of Intel's claims of ray tracing's superiority, such as the inherent benefit to polygon complexity, while pointing out areas where ray-tracing engines would falter, such as basic antialiasing. The interview concludes with discussions on mixing the two rendering technologies and whether NVIDIA hardware can efficiently handle ray tracing calculations as well."

198 comments

Counterpoint (4, Funny)

peipas (809350) | more than 6 years ago | (#22677248)

Kirk should talk to Picard [demoscene.hu] who is quite enthused about real time raytracing.

Re:Counterpoint (4, Funny)

moderatorrater (1095745) | more than 6 years ago | (#22677428)

That lends a lot of weight to the raytracing argument, since I generally prefer Picard over Kirk...

Re:Counterpoint (1)

youngdev (1238812) | more than 6 years ago | (#22677538)

I liked cisco from ds9

Re:Counterpoint (4, Funny)

peipas (809350) | more than 6 years ago | (#22677660)

Yes, I loved the episode where Odo shape-shifted into a layer 3 switch during the Romulan invasion.

Re:Counterpoint (1)

Araxen (561411) | more than 6 years ago | (#22677932)

Well if you could spell his name right I might believe you.
It's spelt "Sisko".

Re:Counterpoint (1)

Enlightenment (1073994) | more than 6 years ago | (#22678950)

What's important is not his name. What's important is that everyone loved Crisco.

Re:Counterpoint (2, Insightful)

Naughty Bob (1004174) | more than 6 years ago | (#22677508)

As a counter-counterpoint, the article has quite misleading pictures-

The Ray-Tracing images are super slick, but are non real-time, highly processed work.

Whereas the comparison Rasterized images are real-time, game-generated examples. If you were to allow the pro-rasterization side the same time to produce a single picture, it would be super fancy.

Re:Counterpoint (4, Interesting)

Applekid (993327) | more than 6 years ago | (#22678504)

We're not talking about the current technology, we're talking about the future. As in whether Ray Tracing is the Future of Games.

Graphics hardware has evolved into huge parallel general-purpose stream processors capable of obscene numbers of FLOPS... yet we're still clinging tenaciously to the old safety blanket of a mesh of tessellated triangles and projecting textures onto them.

And it makes sense: the industry is really, really good at pushing them around. Sort of like how internal combustion engines are pretty much the only game in town until alternatives save themselves from the vapor.

Nvidia, either by being wise or shortsighted, is discounting ray-tracing. ATI is trailing right now so they'd probably do well to hedge their bets on the polar opposite of where Nvidia is going.

3D modelling starts out in abstractions anyway with deformations and curves and all sorts of things that are relatively easy to describe with pure mathematics. Converting it all to mesh approximations of what was sculpted was, and still is, pretty much just a hack to get things to run at acceptable real-time speeds.

Re:Counterpoint (1)

Naughty Bob (1004174) | more than 6 years ago | (#22678688)

Wait.... Applekid lecturing me about the Future of Games?

Snark aside, I think that the true future is a combination of both methods, with ray-tracing being used for light effects over the top of rasterized 3d models.

After all, that's (pretty much) how it works in real life....

Re:Counterpoint (3, Interesting)

SanityInAnarchy (655584) | more than 6 years ago | (#22680638)

From what I read, I think part of this is nVidia assuming GPUs still stay relevant. If we ever do get to a point where raytracing -- done on a CPU -- beats out rasterization, done on a GPU, then nVidia's business model falls apart, whereas Intel suddenly becomes much more relevant (as their GPUs tend to suck).

Personally, I would much prefer Intel winning this. It's hit or miss, but Intel often does provide open specs and/or open drivers. nVidia provides drivers that mostly work, except when they don't, and then you're on your own...

Steve Jobs also uses this trick (4, Insightful)

OrangeTide (124937) | more than 6 years ago | (#22677266)

Saying something sucks when he's already developing a product for it.

Relief Texture Mapping (5, Informative)

DeKO (671377) | more than 6 years ago | (#22677292)

A good way to mix both techniques is Relief Texture Mapping [ufrgs.br] . It's a good way to get smooth surfaces thanks to the texture interpolation hardware, with no extra polygons.
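For readers who haven't seen the papers: the core of relief mapping is a per-fragment ray march through a heightfield stored in a texture. Here is a minimal sketch of that lookup in Python, purely as an illustration (the real thing runs in a fragment shader; the function and parameter names here are hypothetical):

```python
def relief_intersect(heightmap, start_uv, ray, steps=32, refine=8):
    """Linear search along the view ray through texture space for the
    heightfield crossing, followed by binary refinement.
    `heightmap` maps (u, v) -> depth in [0, 1]; `ray` is the total
    (du, dv, ddepth) the ray travels while crossing the relief layer."""
    u, v = start_uv
    depth = 0.0
    du, dv, dd = (c / steps for c in ray)
    for _ in range(steps):
        if depth >= heightmap(u, v):
            break  # first sample at or below the surface
        u, v, depth = u + du, v + dv, depth + dd
    for _ in range(refine):  # binary search around the crossing point
        du, dv, dd = du / 2, dv / 2, dd / 2
        if depth >= heightmap(u, v):
            u, v, depth = u - du, v - dv, depth - dd
        else:
            u, v, depth = u + du, v + dv, depth + dd
    return u, v, depth
```

The (u, v) that comes back is where the texture is actually sampled, which is what gives the illusion of depth without extra polygons.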

Re:Relief Texture Mapping (0)

Anonymous Coward | more than 6 years ago | (#22677634)

Relief Texture Mapping is already implemented today.

This article is about the future. Unless you can argue there's plenty of future research and advancement left in relief texture mapping?

Re:Relief Texture Mapping (1)

WilyCoder (736280) | more than 6 years ago | (#22678056)

Except it can't do jack-squat at the edges of geometry...

Re:Relief Texture Mapping (1)

DeKO (671377) | more than 6 years ago | (#22678112)

Yes, it can. See the "Relief Maps with Silhouettes" demo. Notice the shadows too. Try the demos; they even work under Wine, with a cheap FX 5200.

Re:Relief Texture Mapping (0)

Anonymous Coward | more than 6 years ago | (#22679406)

You should have paid more attention to that page. Take a look at the "Relief Maps with Silhouettes" technical report.

Re:Relief Texture Mapping (1)

LBt1st (709520) | more than 6 years ago | (#22678200)

This technique has nothing to do with ray-tracing.
It's a shader combining a normal map and cube map (both are raster).

Re:Relief Texture Mapping (3, Interesting)

DeKO (671377) | more than 6 years ago | (#22678432)

Ray tracing fires rays from each screen pixel, tests for collisions against the geometry, and figures out the proper color. RTM fires rays from each polygon's "pixel", tests for collision against the "texture geometry", and figures out the proper color. It's just ray tracing in a subset of the screen pixels, against geometry (a heightmap) represented by a texture (or multiple textures) on a polygonal face. Why do you think this is not related to ray tracing?
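The per-pixel loop being described fits in a few lines. A hypothetical sketch in Python with a single hard-coded sphere as the whole scene (not anyone's actual engine code):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance t to the nearest intersection along the ray, or None.
    Assumes `direction` is normalized, so the quadratic's a == 1."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height):
    """Fire one ray through each pixel and shade the nearest hit."""
    center, radius = (0.0, 0.0, -3.0), 1.0  # the entire "scene"
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel onto an image plane at z = -1 and normalize.
            u = (x + 0.5) / width * 2.0 - 1.0
            v = 1.0 - (y + 0.5) / height * 2.0
            n = math.sqrt(u * u + v * v + 1.0)
            d = (u / n, v / n, -1.0 / n)
            t = ray_sphere((0.0, 0.0, 0.0), d, center, radius)
            row.append(0.0 if t is None else 1.0 / t)  # crude depth shade
        image.append(row)
    return image
```

A real tracer recurses into reflection and shadow rays at each hit; this only answers "which surface does this pixel see", which is exactly the part RTM reuses per texel.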

Re:Relief Texture Mapping (1)

LBt1st (709520) | more than 6 years ago | (#22679196)

You're right. I didn't bother to watch the videos. That's actually quite impressive!

Like we were expecting something else (5, Insightful)

Btarlinian (922732) | more than 6 years ago | (#22677302)

Seriously though, does anyone expect Nvidia to say, "Yes, we really do think that our products will all be obsolete and outdated in a few years. Thank you for asking." I personally have no idea as to whether or not ray tracing is the future of games, but I really don't think that Nvidia is the right person to ask either (just as Intel isn't).

Re:Like we were expecting something else (1)

Naughty Bob (1004174) | more than 6 years ago | (#22677416)

but I really don't think that Nvidia is the right person to ask either (just as Intel isn't).
True. I believed Intel, when they explained the superiority of Ray-Tracing, and now I believe Nvidia, when they say the opposite.

From what I can tell, Ray-Tracing is closer to 'reality', and so you'd expect the technology eventually to tend in that direction. But the explanation from the Nvidia dude makes it seem like that point is many years away, owing to the excellent results available now with rasterization, and the extreme resolution of reality itself.

Re:Like we were expecting something else (2, Insightful)

Kjella (173770) | more than 6 years ago | (#22677990)

From what I can tell, Ray-Tracing is closer to 'reality', and so you'd expect the technology eventually to tend in that direction.
Not necessarily; humans aren't so picky that the virtual reality must be a perfect reality. For example, I don't ever expect to pick up a laser, find a thin enough slit and see the quantum effects on the wall, not the real simulated deal anyway. More natural than modelling it as a perfect beam? Sure, but like what's the point. Graphics cards and Copperfield are in the same business, making the grandest possible illusion with a minimum of resources and maximum immersion. If you get the macroscopic effects close enough, there's no added value to actually doing what you just made appear right with smoke and mirrors.

Re:Like we were expecting something else (1)

Naughty Bob (1004174) | more than 6 years ago | (#22678266)

I don't ever expect to pick up a laser, find a thin enough slit and see the quantum effects on the wall, not the real simulated deal anyway.
I'm assuming you mean within a simulated 'reality'.

On the one hand, quantum interference can easily be exempted from the potential ray-tracing future, should it prove hard to model. On the other, what's so hard about it?

Re:Like we were expecting something else (1)

MenTaLguY (5483) | more than 6 years ago | (#22678656)

Ray tracing is based on tracing the paths of particles, not wavefronts.

Re:Like we were expecting something else (1)

Naughty Bob (1004174) | more than 6 years ago | (#22678876)

But it intends to be able to reproduce reality via a set of starting assumptions, no?

Re:Like we were expecting something else (1)

Bob-taro (996889) | more than 6 years ago | (#22679300)

True. I believed Intel, when they explained the superiority of Ray-Tracing, and now I believe Nvidia, when they say the opposite.

It sounds like ray tracing is better, but slower. If that is the case, a move to ray tracing might be more likely if we see a "leveling off" of scene complexity while hardware performance continues to increase. It might get to where a ray traced game looks better and is fast enough.

Re:Like we were expecting something else (2, Insightful)

Mr. Underbridge (666784) | more than 6 years ago | (#22677492)

Seriously though, does anyone expect Nvidia to say, "Yes, we really do think that our products will all be obsolete and outdated in a few years. Thank you for asking." I personally have no idea as to whether or not ray tracing is the future of games, but I really don't think that Nvidia is the right person to ask either (just as Intel isn't).

One could argue that the Nvidia folks have been well aware of ray-tracing for a long time, and if they thought it was reaching the point of being useful, they would have begun incorporating it into a future generation of chips. So it's not like they're permanently committed against it; they may honestly believe its time is not here.

As for Intel, I do think it's fairly obvious that the inherent parallelization of ray tracing is a big part of what makes it attractive to them right now. That and they have enough cash to just screw around with it without having to market it. But there's no reason Nvidia wouldn't go to multi-core chips if they thought the demand was there.

Re:Like we were expecting something else (1)

Telvin_3d (855514) | more than 6 years ago | (#22677594)

My understanding is that part of the threat to Nvidia and other dedicated graphics card makers is that ray tracing doesn't lend itself as well to dedicated solutions. Or rather, the type of processor needed tends to be the type that is already being used as a CPU with some minor tweaks to optimize performance. So instead of buying a separate chip for graphics, you get the same performance boost from just getting a second CPU or one with more cores. Instead of a graphics card with more RAM, you just add more to your general system RAM.

Re:Like we were expecting something else (3, Interesting)

Znork (31774) | more than 6 years ago | (#22678010)

So instead of buying a separate chip for graphics, you get the same performance boost from just getting a second CPU or one with more cores.

Not only that; you get a performance boost from your server. And from your kids computers. You get a performance boost from your HTPC. And potentially any other computer on your network.

The highly parallelizable nature is the most interesting aspect of raytracing, IMO; with distributed engines one could do some very cool things.
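That "embarrassingly parallel" property is easy to see: every pixel can be traced with no knowledge of its neighbors, so the screen splits trivially across cores (or, as above, machines). A toy sketch with a hypothetical stand-in for the actual per-pixel trace:

```python
from concurrent.futures import ThreadPoolExecutor

def trace_scanline(y, width):
    # Stand-in for a real per-pixel trace; the point is that any row
    # can be computed independently of the others.
    return [((x * y) % 7) / 7.0 for x in range(width)]

def render_parallel(width, height, workers=4):
    # Hand each worker whole scanlines; a distributed engine would hand
    # them out to other machines over the network the same way.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda y: trace_scanline(y, width), range(height)))
```

The result is identical to the serial version regardless of worker count, which is what makes the distributed case attractive.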

Re:Like we were expecting something else (5, Informative)

Abcd1234 (188840) | more than 6 years ago | (#22678338)

Actually, I don't think that's true at all. Raytracing, just like today's rasterizers, can greatly benefit from dedicated hardware for doing vector operations, geometry manipulation, and so forth. This is particularly true as raytracing benefits greatly from parallelization, and it would be far easier to build a dedicated card with a nice fat bus for shunting geometry and texture information between a large number of processing units than it would be to use a stock, general multicore processor which isn't really designed with those specific applications in mind.

Besides, the whole reason to have separate, specialized gear for doing things like audio/visual processing is to free up the main CPU for doing other things. Heck, we're even seeing specialized, third-party hardware for doing things like physics and AI calculations, not to mention accelerators for H.264 decoding, etc. As such, I see no reason to move graphics rendering back to the main CPU(s).

Re:Like we were expecting something else (1)

mikael (484) | more than 6 years ago | (#22677912)

When it comes to ray-tracing complex shapes built from spline patches, the recommended approach is to tessellate the geometry into triangles and then ray-trace the collection of triangles, using octrees for optimization (test a single bounding cube for ray intersection rather than a whole set of triangles).
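The single-cube early-out is just a ray/axis-aligned-box slab test; a short sketch for illustration (an actual octree applies this recursively to child cubes, which is omitted here):

```python
import math

def ray_aabb(origin, direction, box_min, box_max):
    """Slab test: does the ray hit the axis-aligned box at all?
    An octree traversal asks this once per cube before bothering with
    any of the triangles (or child cubes) inside it."""
    t_near, t_far = -math.inf, math.inf
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if o < lo or o > hi:
                return False  # ray parallel to this slab and outside it
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        t_near, t_far = max(t_near, t1), min(t_far, t2)
        if t_near > t_far:
            return False  # slab intervals don't overlap: a miss
    return t_far >= 0  # reject boxes entirely behind the ray
```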

Graphics cards already do something similar with deferred rendering. They sort the projected triangles according to position on the screen, and render groups of triangles into a local cached copy of the framebuffer. Only when all the triangles for that area have been rendered, is it written back to the main framebuffer. Another optimisation is to render the entire scene without texture-mapping just to get the Z-buffer data, then do a second-pass to get the texture data. This saves wasting multiple texture passes on each pixel. Reflection and refraction can already be handled using cube-maps.
So using graphics hardware is as good as a two-ray deep ray-tracer.

It's only when you need inter-object reflections to large depth that ray-tracing really has an advantage (although there are programming techniques to do that with graphics cards too).

Re:Like we were expecting something else (1)

Abcd1234 (188840) | more than 6 years ago | (#22678376)

although there are programming techniques to do that with graphics cards too

Whoops, I think you meant "hacks", there.

This is the same thing that's been going on with rasterization for years. Developers and hardware designers have built hack upon hack in order to implement what raytracing does so easily. The result is a library of tricks that often can't be easily intermixed, and are always a pain to work with.

So, if you can switch to raytracing and get similar performance, and end up with a larger feature set that produces higher quality results, why *wouldn't* you?

Re:Like we were expecting something else (1, Insightful)

Anonymous Coward | more than 6 years ago | (#22679362)

Because those "hacks" are what make it possible to do it in real time, and use it in games. You can't switch to raytracing and get similar performance. If you could then games would already be using it today.

Plus, when people spout the benefits of raytracing they always seem to forget those benefits cost processing time. So yes, with raytracing, things like radiosity and realistic glass and water rendering are possible in a straightforward, simple manner... just you'll be getting 1 frame per hour instead of 60 frames per second.

Of course you can add "hacks" or assumptions to the raytracing engine, and only use the "realistic" engine for certain things.. but hey, that's no different than what is done now in games.. back to square one.

Re:Like we were expecting something else (1)

Abcd1234 (188840) | more than 6 years ago | (#22680040)

You can't switch to raytracing and get similar performance. If you could then games would already be using it today.

No one is saying you can, today, and you bringing it up constitutes a strawman. The point is that we're getting to the point, in terms of available technology, where it *will* be possible.

The rest of your post is based on the same, erroneous presumption, so there seems little point in addressing it.

How would it obsolete their products? (1)

Sycraft-fu (314770) | more than 6 years ago | (#22677980)

Just because their products now are focused on rasterization (their current GPUs can do raytracing as well) doesn't mean their next generation ones have to be. I'm sure they'd be happy to produce raytracing hardware, if there was a demand for it and if they could make it fast.

That is the problem, as the article noted. You get research-oriented things that are very pie-in-the-sky about rendering techniques. They concentrate on what is theoretically possible and so on. nVidia isn't in that position; they are concerned with what is actually faster when implemented in hardware. I'm quite sure nVidia would shift over to making raytracing cards if it was better to do so. In this case better means "produces a better looking image in realtime."

I'm all for research into other ways than what we do now; however, just because something looks good on paper doesn't mean it works well in reality. The question isn't which technique has what big O; the question is, when implemented on real silicon with today's technology, what gives the better image?

Re:How would it obsolete their products? (1)

ivan256 (17499) | more than 6 years ago | (#22678316)

Just because their products now are focused on rasterization...
Ray tracing is a form of rasterization... In other words, it translates a description of a scene into an array of pixels.

Re:Like we were expecting something else (1, Insightful)

podperson (592944) | more than 6 years ago | (#22679396)

I think his response was pretty reasonable and balanced, actually.

1) Ray-tracing isn't going to solve all problems (it doesn't for movie rendering, why would it for real-time?)

2) Existing software still needs to run.

3) A hybrid approach will end up making the most sense (since it has for everything else).

He's not just talking "party line" ... he's talking common sense. Ray-tracing everything is just an inefficient way to get the job done. It produces great mirror-finished objects but ugly shadows and mediocre lighting. (Guess which demos are full of mirror-finish objects?)

Re:Like we were expecting something else (1)

xeoron (639412) | more than 6 years ago | (#22679858)

No I don't think they will, but they might say, "We see this becoming the next trend, which we are committed to supporting."

obligatory.... (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#22677354)

KHAAAN! [khaaan.com]

Hmmm - who to believe, who to believe?? (4, Interesting)

$RANDOMLUSER (804576) | more than 6 years ago | (#22677400)

Intel says to do it with the CPU, and nVidia says to do it with the GPU. What a surprise.

Re:Hmmm - who to believe, who to believe?? (0)

Anonymous Coward | more than 6 years ago | (#22678122)

This reminds me of another player in the graphics world that put their foot in their mouth.

http://www.penny-arcade.com/comic/1999/01/29 [penny-arcade.com]

Yeah. Good luck with that Nvidia.

Re:Hmmm - who to believe, who to believe?? (0)

Anonymous Coward | more than 6 years ago | (#22678854)

Exactly:

NVIDIA Doubts Ray Tracing Is the Future of Games

... and Chevron Doubts Hydrogen Is The Future of Cars

Re:Hmmm - who to believe, who to believe?? (1)

sexconker (1179573) | more than 6 years ago | (#22679386)

and Chevron Doubts Hydrogen Is The Future of Cars
So do I.
Remember kids - Hydrogen isn't free!

Re:Hmmm - who to believe, who to believe?? (0)

Anonymous Coward | more than 6 years ago | (#22679100)

Surprise left the computing industry a long time ago...

Re:Hmmm - who to believe, who to believe?? (1)

not-enough-info (526586) | more than 6 years ago | (#22679320)

Intel says to do it with the CPU, and nVidia says to do it with the GPU.
AMD does it in court!

Obey your thirst... (5, Insightful)

Itninja (937614) | more than 6 years ago | (#22677424)

Personally, I prefer spites to either ray-trace or polygons. I still think Starcraft (the game, not the conversion van) had some of the best graphics. But then I am kind of a fuddy-duddy. I also think River Raid was an awesome game.

Re:Obey your thirst... (1)

Dorkmaster Flek (1013045) | more than 6 years ago | (#22677530)

I've always been a huge fan of 2D sprites. A well-animated and detailed 2D sprite uses far fewer resources and can do things you could never do with a 3D model, even an insanely detailed high-res textured one. Seeing well-detailed 2D graphics makes me smile.

Re:Obey your thirst... (2, Insightful)

waffledoodle (1070284) | more than 6 years ago | (#22678530)

It goes both ways. Sprites are perfect at a small scale or when you don't need too many frames of animation, or when you want to blend and layer animations. However, a super smooth screen-size sprite (a boss, for example) can easily suck up more resources than a model if you're dishing out lots of animation.

Re:Obey your thirst... (0)

Anonymous Coward | more than 6 years ago | (#22677610)

Then you're a fan of ray-tracing. All of Starcraft's sprites were ray-traced ahead of time.

If they could be ray-traced in real time, then Starcraft could support larger monitors simply by scaling up the screen, instead of being stuck at 800x600. Or they could go the lame "superzoom!" thing that newer RTS games are doing which really destroys the entire point of the game. But I'd rather think of 1600x1200 Starcraft.

Re:Obey your thirst... (1)

Wabin (600045) | more than 6 years ago | (#22677708)

Agreed. Doing anything out of spite is a bad idea. Long live Space Invaders!

Re:Obey your thirst... (1)

CubeRootOf (849787) | more than 6 years ago | (#22677736)

Starcraft had some of the most EFFECTIVE graphics.

There isn't a sense that you would ever miss your targets because the graphics display couldn't get on the same page with where the engine thought the graphics should be...

sprites makes sure everyone is on the same page, by being able to say, beyond a doubt, this thing is here and it is this big... no no - don't compute its position, just put it there... no no - I don't care what kind of processor you have, or what OS you are using, put it where I said to put it.

Re:Obey your thirst... (2, Funny)

Belial6 (794905) | more than 6 years ago | (#22677830)

I still remember, as a boy in the early 80's, getting the opportunity to take a cruise on a navy ship. Seeing the targeting equipment on that ship gave me a real appreciation for just how realistic the graphics on Missile Command were. They were darn near indistinguishable from the real thing.

Re:Obey your thirst... (0)

Anonymous Coward | more than 6 years ago | (#22678050)

"--
Can you find what's unusual about my sig? I will posit that you cannot."

Me certainly see something excluded.

Of course, you're also confused, but that's nothing unusual on /. If you weren't confused you would know that there are all kinds of things "unusual" about your .sig: it is yours, for example. What you really want to ask is, "Can you find out THE ONE THING I THINK is unusual about my sig?"

Quite a different proposition, and clearly a purely psychological problem, as most of these things are. Who knows, of the millions of unusual things about your .sig, which particular one you happen to think is the only unusual thing about your .sig?

Re:Obey your thirst... (1)

Itninja (937614) | more than 6 years ago | (#22679628)

No. There is actually something quantifiably unusual about my signature. Not something subjective, but a trait about it that's very unusual, even unique.

Re:Obey your thirst... (1)

daenris (892027) | more than 6 years ago | (#22679864)

I wouldn't call the lack of the letter e 'unique.'

Re:Obey your thirst... (1)

popmaker (570147) | more than 6 years ago | (#22678204)

Very good point! It maybe doesn't have that super-realistic "I shot the guy in the head and he's actually BLEEDING from the head" thing, but it just looks... nice! The same with the old Black Isle games, such as Baldur's Gate (especially II), Fallout, Planescape: Torment, etc... Clean, simple and nice.

And, by the way, I thought I'd laugh my ass off first time I went on a camping trip and saw the Starcraft conversion van. :)

Re:Obey your thirst... (2, Informative)

Stormwatch (703920) | more than 6 years ago | (#22678246)

Talking about sprites, did you see the teaser video for King of Fighters XII? So. Fuckin'. Beautiful. [kotaku.com]

Re:Obey your thirst... (1)

GreggBz (777373) | more than 6 years ago | (#22678514)

If you were talking about pixel detail, I might agree.

But it's all about the lighting and animation in a changing environment (Starcraft was isometric with static lighting). It's hard to light and animate sprites convincingly, unless you have a lot of artists willing to draw a lot of frames.

Re:Obey your thirst... (2, Informative)

ObsessiveMathsFreak (773371) | more than 6 years ago | (#22679680)

It takes a lot of work to get a 3D model to look as good as a 2D sprite. You gain more freedom and, as the number of actions increases, can create new animations with a lot less hassle. But it remains very difficult to get a really good "animated" feel with 3D models, which need to look good from all angles and, nowadays, under all lighting conditions. 2D sprites, while laborious to create, invariably display precisely as the animator intended.

Games like Ratchet and Clank or Jak and Daxter pull this off well. It's not just down to character design allowing a cartoony look. Apparently the games use a Naughty Dog technique whereby the model's "bones", i.e. canonically fixed points, are themselves allowed to warp and distort, meaning that the models do not simply consist of fixed points rotating on joints. Jak and Daxter exemplifies this best, with characters undergoing highly exaggerated warping and distortion both in game and in scripted scenes. Think of a Looney Tunes double take. A game like Viewtiful Joe, while cel-shaded, did not look as good, simply because it did not use this effect. I believe Sly Cooper used a combination of the two styles.

Design is a far, far more important factor than graphics capability in improving a game's overall look. Call of Duty 4, while technically impressive, looks fairly dry. This simply cannot be helped, as you are playing as "realistic" soldiers in ordinary locales. Something like Unreal Tournament 3, which actively uses often exaggerated artistic designs, and where you fight in alien locales, is much more aesthetically pleasing.

Ray tracing "can" make games look better, but only slightly. If you want better looking games, you need better artistic design. I don't see how ray tracing delivers this in a measurably better way over other, less intensive techniques. Unless it's for something like weird water effects, I just can't see the advantage when you could be putting cycles to work on other things like movement in the background, more animated sprites or things like dust and spray.

This just in... (2, Insightful)

Majik Sheff (930627) | more than 6 years ago | (#22677484)

IBM doubts the future of the "personal computer"
Buggy manufacturers poo-poo the new horseless carriage
etc, etc.

Re:This just in... (1)

morgan_greywolf (835522) | more than 6 years ago | (#22677834)

IBM doubts the future of the "personal computer"
Actually, that was Xerox and maybe some factions at IBM back in the day. But various factions at IBM were watching the young 'personal computer' (they were called 'home computers' back then) market to see when the best time to jump in would be.

Re:This just in... (1)

somepunk (720296) | more than 6 years ago | (#22678426)

Nobody remembers the doubts that turned out to be justified. This is a logical fallacy of the same sort as the one that starts out with people laughing at somebody... who could be a misunderstood genius awaiting the recognition of history but, more likely than not, really is just an idiot.

Re:This just in... (0)

Anonymous Coward | more than 6 years ago | (#22679070)

I'd explain that whooshing sound you heard, but I don't really see a point.

Translation (2, Insightful)

snarfies (115214) | more than 6 years ago | (#22677510)

We can't do it as well as Intel yet, therefore it sucks. BUY NVIDIA.

What do the people that make the software say? (4, Interesting)

9mm Censor (705379) | more than 6 years ago | (#22677526)

What about game and engine devs? Where do they see the future going?

Re:What do the people that make the software say? (2, Informative)

Solra Bizna (716281) | more than 6 years ago | (#22677662)

What about game and engine devs? Where do they see the future going?

IAAGD. My current paid project has both a raytracing module and a rasterizing module, and is designed to use them completely interchangeably. Personally, I'm a much bigger fan of raytracing than rasterization, and I'm going to a great deal of effort to make sure that it can be done efficiently with my engine.

-:sigma.SB
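Making two back ends "completely interchangeably" swappable usually comes down to a shared interface. A minimal sketch of that shape in Python; this is purely hypothetical (none of these names come from the poster's engine) and the back ends are stubs:

```python
from typing import Protocol

class Renderer(Protocol):
    def render(self, scene: dict, width: int, height: int) -> list: ...

class Rasterizer:
    def render(self, scene, width, height):
        # Stub standing in for a scanline/triangle pipeline.
        return [["raster"] * width for _ in range(height)]

class Raytracer:
    def render(self, scene, width, height):
        # Stub standing in for a per-pixel ray caster.
        return [["traced"] * width for _ in range(height)]

def draw_frame(renderer: Renderer, scene: dict):
    # The engine never needs to know which back end it was handed.
    return renderer.render(scene, 4, 4)
```

The engine codes against `Renderer` only, so swapping modules is a one-line change at construction time.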

Re:What do the people that make the software say? (0)

Anonymous Coward | more than 6 years ago | (#22678276)

IAAGDM.

Get back to work, biyatch.

Re:What do the people that make the software say? (5, Interesting)

Squapper (787068) | more than 6 years ago | (#22677700)

I am a game-developing 3D artist, and our biggest problem right now is the inability to render complex per-pixel shaders (particularly on the PS3). But this is only a short-term problem, and what would we do with the ability to create more complex shaders? Fake the appearance of the benefits of ray-tracing (more complex scenes and more realistic surfaces), of course!

On the other hand, the film industry could be a good place to look for the future technology of computer games. And as it is right now, it's preferable to avoid the slowness of raytracing and fake the effects instead, even when making big blockbuster movies.

Re:What do the people that make the software say? (1)

p0tat03 (985078) | more than 6 years ago | (#22678240)

I am an amateur game developer, so I probably can't speak for large devs, but I can speak for the community. The biggest problem facing game devs (and graphics people in general) is lighting. Doom 3 created an elegant, unified lighting model (i.e. the environment and characters are lit with the same algorithm, making them consistent), but it had severe limitations. Gone were the days where lights could be soft, and bounce around, and generally look nice, and replacing it was a very harsh lighting system that did not account for radiosity (due to limitations in hardware performance).

We've spent the last few years since Doom 3 trying to fake ways to get nice, soft lighting in our games, but in the end every single method is still just that - a hack. As we see more and more advanced lighting schemes come about to try and do soft lighting in real time, you will notice the systems get more and more convoluted and complex.

The only "elegant" solution to this problem is raytracing. Instead of faking radiosity through texture caches, secondary renders, and a whole slew of messy, fake solutions, we need to be able to execute raytracing in real-time. As soon as we can do it, the complexity of our lighting code will collapse to something incredibly elegant, simple, and universal (i.e. it will work across ALL situations).
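To make that "one elegant, universal algorithm" point concrete, here is a toy Python sketch. The scene format (spheres with made-up `emit`/`albedo` fields) and everything else here is invented for illustration and is nowhere near production code, but it shows how direct light, shadows, and bounced (radiosity-like) light all fall out of a single recursive trace function:

```python
import math
import random

def sphere_hit(center, radius, origin, direction):
    """Distance along a unit-length `direction` to the sphere, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def trace(origin, direction, spheres, depth=0):
    if depth > 2:                      # limit the number of bounces
        return 0.0
    hits = [(t, s) for s in spheres
            if (t := sphere_hit(s["c"], s["r"], origin, direction)) is not None]
    if not hits:
        return 0.0                     # ray escaped the scene
    t, s = min(hits, key=lambda h: h[0])   # nearest intersection
    if s["emit"]:
        return s["emit"]               # hit a light source directly
    p = [o + t * d for o, d in zip(origin, direction)]
    n = [(pi - ci) / s["r"] for pi, ci in zip(p, s["c"])]
    # Diffuse bounce: pick a random direction in the hemisphere around n.
    # Note there is no special-case code for shadows or indirect light --
    # occlusion and bounces emerge from the recursion itself.
    while True:
        d = [random.gauss(0.0, 1.0) for _ in range(3)]
        norm = math.sqrt(sum(x * x for x in d))
        d = [x / norm for x in d]
        cos_nd = sum(di * ni for di, ni in zip(d, n))
        if cos_nd > 0:
            break
    return s["albedo"] * cos_nd * trace(p, d, spheres, depth + 1)
```

Better lighting is then literally just averaging more `trace()` calls per pixel, which is the scalability-with-compute argument in a nutshell.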

Re:What do the people that make the software say? (1)

MenTaLguY (5483) | more than 6 years ago | (#22678872)

Well, sort of. Raytracing isn't a complete solution to the rendering equation -- you'll still need hacks to get nice soft shadows, indirect lighting, and ambient occlusion.

Re:What do the people that make the software say? (1)

p0tat03 (985078) | more than 6 years ago | (#22679394)

A proper raytrace implementation will automatically account for things like the shadow penumbra (soft shadows), indirect lighting (light bounces, aka radiosity), and ambient occlusion. We're definitely not talking about the raytracers of yesteryear, which were very functionally limited.

I suppose the argument isn't even really about raytracing vs. not. It's about whether it's worthwhile to brute force the problem (thereby keeping the solution elegant and simple) with sheer CPU power, or to try and fake your way to good visuals via ever more convoluted "fake" solutions.

As a coder I'm obviously in the first camp - the sheer number of hacks needed to get your "unified" lighting system together is ridiculous, and it gets more complex every day. Ideally we'd just calculate everything for real, in real time, and the bonus is that it scales with computing power. Want better lighting? Crank up the sample count in your raytrace solution... if you have the power :P

Re:What do the people that make the software say? (1)

Creepy (93888) | more than 6 years ago | (#22679966)

I know this isn't what you meant, at least by how you phrased the rest of your comment, but technically light bounces are handled very well by raytracers, as is proper color absorption from nearby reflections, as long as you are referring to specular (shiny) lighting. What isn't handled well is non-point-source diffuse (soft) lighting, which is why many ray tracers bolt on photon mapping or radiosity.

Probably right on this one... (4, Insightful)

Bones3D_mac (324952) | more than 6 years ago | (#22677550)

For the most part, I really don't see ray-tracing adding much to the world of gaming that isn't being handled well enough by current methods. Unless someone was specifically creating games that somehow directly incorporated either the benefits or the added calculations of ray-tracing itself, it would only be a costly and highly inefficient gimmick of an alternative to current techniques.

Sure, ray-tracing has its place in a lot of areas, but real-time gaming would be a terrible misuse of processing horsepower... especially when you could be applying it to other areas of gaming that actually affect gameplay itself. For example, how about more robust AIs for in-game elements, or high-end physics processing that can combine things like fabric/hair/fluid/fire physics with the ability to decimate objects completely as vector-calculated chunks based on the surrounding environment, rather than all this predetermined destruction we currently see in games. (For example, a surface could be eroded incrementally by having a fluid run across it until a hole forms in the shape of the fluid path...)

Re:Probably right on this one... (2, Interesting)

ZenDragon (1205104) | more than 6 years ago | (#22677876)

You mention "predetermined destruction", which I agree is a rather annoying limitation in almost all modern games. Personally, at this point in time I would rather see a more interactive environment than incredible graphics. What good is a beautifully rendered environment if you can't blow holes in it? I want to see realistic bullet holes with light shining through the wall, or arms fall off when I mutilate some guy with a chainsaw. I want to see water splash when I walk through it, or grass and leaves swaying naturally in the wind. And why can't I shoot the vase off the table for target practice? The damn thing seems to be bulletproof!

I think they need to be working more on the physics of the environment than on making it all look pretty. Hardware like the PhysX card is a step in the right direction, and I would like to see that trend continue.

Not an either-or dilemma (1)

yaugin (1026682) | more than 6 years ago | (#22677654)

Why does there have to be One True Rendering System? As long as a company as big as Intel is backing raytracing, chances are good that it will eventually turn up in de facto APIs like OpenGL and Direct3D. And then, if it picks up developer support, it will become a necessary bullet-point feature to sell new graphics cards (or not, if Intel's plan for pushing this with multicore CPUs works). Game developers will have the choice of using one type of rendering system or the other, or some mixture of both, as necessitated by their game design. I see a duality happening, and it is win-win for everyone except NVIDIA, which has a vested interest in the status quo and is afraid of losing its market share to CPU cores. And unlike with AGEIA, they won't have the funds to buy up Intel anytime soon. In fact, this avenue may be one of the few chances for AMD to redeem the ATI merger: "Check out our parallel raytracing/rastering chip!"

Hardly anything new (4, Interesting)

K. S. Kyosuke (729550) | more than 6 years ago | (#22677714)

For years, the movie studios have been using Pixar's PRMan, which is in many ways a high-quality software equivalent of a modern graphics card. It takes a huge and complex scene, divides it into screen-space buckets of equal size, sorts the geometric primitives in some way, and then (pay attention now!) tessellates and dices the primitives into micropolygons about one pixel each in size (OpenGL fragments, anyone?), shades and deforms them using simple (or not-so-simple) algorithms written in the RenderMan shading language (hey, vertex and pixel shaders!), and then rasterizes them using stochastic algorithms that allow for high-quality motion blur, depth of field, and antialiasing.

Now, that last part is slightly easier for a raytracer, which can cast a ray from any place in the scene - of course, it needs this functionality anyway. But a raytracer also needs random access to the scene, which means that you need to keep the whole scene in memory at all times, along with spatial indexing structures. The REYES algorithm of PRMan needs no such thing (it easily handles 2 GB of scene data on a 512 MB machine, along with gigs of textures), and it allows for coherent memory access patterns and coherent computation in general (there is of course some research into coherency in raytracing, but IIRC the results were never that good). This is a big win for graphics cards, as the bandwidth of graphics card RAM has never been spectacular - it's basically the same situation as with your computer's general-purpose CPU and its main memory. And with the 128 or so execution units of a modern graphics card, the problem is even more apparent.
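To illustrate the dicing step, here is a rough Python sketch. The `eval_uv` callback and the one-micropolygon-per-pixel heuristic are simplifications invented for this sketch, not PRMan's actual interface:

```python
def dice(eval_uv, screen_w_px, screen_h_px, target_px=1.0):
    """Dice one parametric patch into a grid of roughly target_px-sized
    micropolygons.  eval_uv(u, v) maps parameters in [0,1]^2 to a 3D point."""
    nu = max(1, round(screen_w_px / target_px))
    nv = max(1, round(screen_h_px / target_px))
    # Shading runs per grid vertex before rasterization, and the whole grid
    # can be discarded once its bucket is finished -- that streaming pattern
    # is the coherent memory access REYES is credited with above.
    return [[eval_uv(u / nu, v / nv) for u in range(nu + 1)]
            for v in range(nv + 1)]

# A flat patch covering ~4x2 pixels on screen dices into a 4x2 grid of
# micropolygons, i.e. a 5x3 grid of shaded vertices:
grid = dice(lambda u, v: (u, v, 0.0), 4, 2)
```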

Unless the Intel engineers stumbled upon some spectacular breakthrough, I fail to see how raytracing is supposed to have any advantage on a modern graphics card. And if I had to choose between vivid forest imagery with millions of leaves flapping in the wind and reflective balls on a checkered floor, I know what I would choose.

Re:Hardly anything new (1)

Aidtopia (667351) | more than 6 years ago | (#22680036)

All true. But, of course, Pixar has converted RenderMan to a full ray tracer because it's now feasible and because they wanted that extra bit of realism that ray tracing elegantly delivers (see Cars and Ratatouille).

Depending on design decisions, ray tracers can represent complex scenes with far less memory than most rasterizers. Rasterizers generally rely on increasing numbers of triangles (or perhaps other polygons) for complexity. Ray tracers can use all sorts of parametric surfaces that require much less storage. And vast amounts of memory are more affordable, so the size of the geometry is becoming less of an issue.

Besides, when games based on rasterizers try to emulate features that a ray tracer gives natively (reflections, shadows, global illumination, etc.), they generally require random access to the model similar to that required by ray tracers.

Radiosity, not raytracing (1)

Animats (122034) | more than 6 years ago | (#22677778)

Radiosity does more for indoor scene quality than does raytracing. Radiosity gives you the visual cue of a dark band at an inside corner, which is subtle and a basic part of the human visual mechanism for resolving depth. Raytracing makes shiny things look cool.

Oh, right, this is for gamers.

Re:Radiosity, not raytracing (0)

Anonymous Coward | more than 6 years ago | (#22679218)

Radiosity is just a precomputed approximation of global lighting. The scene is diced up into tiny polygons, and then you build a giant matrix where each row represents the visibility of the polygons with respect to each other. Then you solve it to determine how much light should appear on each polygon after being reflected off the others. This works well for static scenes with small numbers of polygons (architectural renderings, for example) but does not work for dynamic scenes (the matrix must be rebuilt and re-solved every time the scene changes) or for large polygon counts (a million triangles means a million-by-million matrix to solve). It also really only computes diffuse lighting, as specular or reflective lighting would require the scene to be diced into pixel-sized triangles (because the lighting is approximated across each triangle).

You can do all of the same work with a raytracer and get better results with simpler code; it just takes much longer to compute. A rasterized rendering is just a highly optimized raytrace from the eye with one ray per pixel (modern graphics cards don't scan-convert triangles; they convert them into edge equations and do inside/outside tests - see Pineda). Once you generalize this rasterization, you can rapidly spend more rays per pixel to get better effects. First is antialiasing, which is simple to do by shooting 16 rays per pixel. Then shadows: 1 ray per light per pixel. Then antialiased soft shadows: N rays per pixel per light. Mirror reflections are popular because you get them by shooting 1 reflection ray per pixel, so they are cheap. But you can also do diffuse reflections by sending M rays per bounce, scattered by the material. And of course radiosity is just an approximation of multiple bounces of diffuse reflections.
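To make the matrix formulation in the first paragraph concrete: classic radiosity solves B = E + rho*F*B, i.e. (I - rho*F)B = E, where F holds the form factors (patch-to-patch visibility). A three-patch toy in Python, with form factors and reflectivity invented purely for illustration:

```python
import numpy as np

F = np.array([[0.0, 0.3, 0.2],     # F[i][j]: fraction of light leaving
              [0.3, 0.0, 0.4],     # patch j that reaches patch i
              [0.2, 0.4, 0.0]])    # (made-up values)
rho = 0.5                          # diffuse reflectivity of every patch
E = np.array([1.0, 0.0, 0.0])      # patch 0 is the only emitter

# Solve (I - rho*F) B = E for the final radiosity of each patch
B = np.linalg.solve(np.eye(3) - rho * F, E)
```

The scaling wall is visible here: F alone is n-by-n in the patch count, while a raytracer approaches the same fixed point by sampling diffuse bounces instead of building and solving the system.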

Re:Radiosity, not raytracing (1)

djhindsight (1025317) | more than 6 years ago | (#22680010)

I think he was confusing ambient occlusion with radiosity.

If any alien race ever receives that (1)

Daimanta (1140543) | more than 6 years ago | (#22677802)

They will push back our "first contact" date by at least 500 years.

I don't get these comparisons (0, Redundant)

nuzak (959558) | more than 6 years ago | (#22677934)

I'm not too up on 3D graphics, but what's with these comparisons of ray-tracing vs polygons, or ray-tracing vs rasterizing? Isn't ray tracing just a lighting model?

Re:I don't get these comparisons (1)

The End Of Days (1243248) | more than 6 years ago | (#22678192)

If you want to be that abstract about things, vision is just a lighting model.

uhm what? (1)

JeanBaptiste (537955) | more than 6 years ago | (#22677942)

What does antialiasing have to do with anything? You can antialias just fine with raytracing.

Oh really? (0)

Anonymous Coward | more than 6 years ago | (#22679814)

Sure you can -- by shooting 4 or more rays per pixel, which is really slow. There might be ways to speed it up, but they will make the raytracer much more complicated.

Modern video cards can give you 4x FSAA without missing a beat. They don't have to do much extra work except the extra fragment shading which is totally parallelized already in the hardware we already have.
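For reference, the 4-plus-rays-per-pixel approach is plain stratified (jittered) supersampling. A Python sketch, where the diagonal-edge shader is a stand-in for tracing a real primary ray:

```python
import random

def shade(x, y):
    """Stand-in for tracing one primary ray: a hard diagonal edge,
    white where y > x, black elsewhere."""
    return 1.0 if y > x else 0.0

def pixel_color(px, py, n=4):
    """Average a jittered n*n grid of rays per pixel -- 16 rays for n=4."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            sx = px + (i + random.random()) / n
            sy = py + (j + random.random()) / n
            total += shade(sx, sy)
    return total / (n * n)
```

A pixel the edge passes through averages out to a gray near 0.5 instead of snapping to pure black or white, which is the antialiasing, but at 16x the primary-ray cost, which is the objection.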

graphics company?! (0, Redundant)

rice_burners_suck (243660) | more than 6 years ago | (#22677976)

I doubt NVIDIA really thought this through. How could a leading graphics company say such a thing about ray tracing? Ray tracing provides THE best quality images.

Re:graphics company?! (1)

Charcharodon (611187) | more than 6 years ago | (#22678782)

RTFA - they mostly agreed that ray tracing produces some very nice graphics; it just isn't fast enough for games and isn't currently enough of an improvement to justify the extra price of the hardware over what is already available.

radiosity (1)

Dark-Dx (1190049) | more than 6 years ago | (#22678046)

Of course it's not, radiosity is.

As Warren Buffet says, (0)

Anonymous Coward | more than 6 years ago | (#22678372)

"Don't ask a barber if you need a haircut!"

Kirk contradicts Pixar (0)

Anonymous Coward | more than 6 years ago | (#22678456)

If raytracing techniques cannot resolve antialiasing problems, how do you explain Pixar films? I don't see any antialiasing problems in Pixar's films. There are several techniques that can be applied, most of which deal with random sampling.

Real time Ray Tracing on PS3 (1)

Verunks (1000826) | more than 6 years ago | (#22678886)

IBM did realtime raytracing [youtube.com] using three PS3s running Linux.

RE: Relief mapping (1)

dsarchs (1252582) | more than 6 years ago | (#22679230)

"A good way to mix both techniques is Relief Texture Mapping [ufrgs.br]. It's a good way to get smooth surfaces thanks to the texture interpolation hardware, with no extra polygons."

The problem with a technique like this (and similarly with bump and normal mapping) is that it doesn't affect the actual geometry. That means the edges of meshes, and the mesh in general when viewed from very oblique angles, will appear flat. Displacement mapping, on the other hand, is a good solution, as it actually modifies the geometry involved.

A matter of speed (3, Insightful)

CopaceticOpus (965603) | more than 6 years ago | (#22679616)

"C is much too slow, to get good performance you must use assembly."

"Scripted languages are much too slow, to get good performance you must use compiled languages."

As computers get faster, there is always a move from technologies that are easier for the computer to technologies that are easier for the developer. Since ray tracing involves fewer hacks and is a more direct model of the effects developers want to create, it seems inevitable.

Re:A matter of speed (0)

Anonymous Coward | more than 6 years ago | (#22680512)

I always figured that C was fairly constant, regarding speed.

Blacksmith Doubts Car is the Future of Transport (1)

Haeleth (414428) | more than 6 years ago | (#22679634)

Which isn't to say I'm a raytracing fanboy. The blacksmith might well be right, if the car in question is powered by steam. But the blacksmith's opinion still isn't exactly news...

The Future of Gaming (1)

Khyber (864651) | more than 6 years ago | (#22679726)

Considering how I've seen people going back to '50s style music, clothing, etc., it'd make sense if they suddenly just went back to text-gaming. Retro's the way to go! Stranded on an alien planet without a six pack of beer, baby!

I'll call it the mind-generated graphics system, or MGGS for short.

Patented and Copyrighted and Trademarked!

Strange (1)

scubamage (727538) | more than 6 years ago | (#22680326)

Intel's original studies stated that raytracing currently requires too much of a performance hit to be viable. They're expecting it when 8-16 core processors become available at the commodity level, and that's at least 2-3 years from now. As for antialiasing, I thought raytracing removed the need for it entirely because of how the graphics are drawn?

Voxels (1)

Chaduke (1181285) | more than 6 years ago | (#22680698)

I'd love to see Nvidia or ATI work on hardware that accelerates a pure voxel engine. I personally think there's too much emphasis on reaching a goal of photo-realism. Current polygon-based rendering tends to dictate a lot in terms of gameplay, often without game designers even noticing it because they don't know what's possible with something like a voxel engine.