
Crytek Bashes Intel's Ray Tracing Plans

kdawson posted more than 6 years ago | from the future-is-raysterization dept.


Vigile writes "Despite all good intentions, Intel continues to see a lot of its work on ray tracing countered not only by its competition, as you'd expect, but also by the very developers Intel will depend on for success in the gaming market. The first major developer to speak on the Intel Larrabee and ray tracing debate was id Software's John Carmack, who basically said that Intel's current plans weren't likely to be implemented soon, or ever. This time Cevat Yerli, one of the Crytek developers responsible for the graphically impressive titles Far Cry and Crysis, sees at least 3-5 more years of pure rasterization technology before a move to a hybrid rendering compromise. Intel has previously eschewed the idea of mixed rendering, but with more and more developers chiming in for it, it's likely where gaming will move."


Ray-Tracing Extremely CPU Intensive (4, Insightful)

foxalopex (522681) | more than 6 years ago | (#23035194)

It's no surprise that Intel is being bashed over their idea of real-time CPU ray-tracing. As anyone who has ever ray-traced will realize, it's extremely slow. At times you're talking about HOURS PER FRAME, while realistically you want at least 30 frames per second, and even that isn't considered great by many gamers. It's going to take a HUGE, and I mean HUGE, increase in computation power before that happens. Rasterization techniques are tremendously faster and they look nearly as good as ray-tracing for the most part. Considering that we have yet to reach a point in rasterization where we don't need more processing power (Crysis in high resolution), I don't see us moving away from it yet. The day we declare that we have graphics cards more powerful than we need for rasterization is when we start moving towards ray-tracing. That day isn't anytime soon, unfortunately.

Re:Ray-Tracing Extremely CPU Intensive (1)

Idiomatick (976696) | more than 6 years ago | (#23035344)

Not only that, but why blow that CPU power on ray-tracing? We'd have to first run out of other useful things to spend processing power on. I'm sure everyone on /. can think of a few dozen. Thinking it will be implemented anytime soon (for anything other than proof of concept) is absurd.

Re:Ray-Tracing Extremely CPU Intensive (1)

Dzonatas (984964) | more than 6 years ago | (#23038190)

Start by asking why people spend so much dang money on multi-boxing in WoW:

How much money people will spend: 15 grand
http://forums.worldofwarcraft.com/thread.html?topicId=5784541038&sid=1&pageNo=4#66 [worldofwarcraft.com]

How many people are doing it: lots
http://forums.worldofwarcraft.com/thread.html?topicId=5781098289&sid=1&pageNo=1 [worldofwarcraft.com]

People are clearly going beyond a single-computer budget.

Re:Ray-Tracing Extremely CPU Intensive (4, Informative)

Yetihehe (971185) | more than 6 years ago | (#23035436)

Here we go again. Try to rasterize on the CPU; it will be similarly slow. On the other hand, with good hardware (like raytracing on a GPU (PDF) [uni-sb.de] , on the Cell processor (PDF) [uni-sb.de] , or just on a PS3 cluster [youtube.com] ) it is ALREADY possible. If you could make a custom accelerator for raytracing (PDF) [uni-sb.de] , gamers and graphics people would love it.

Re:Ray-Tracing Extremely CPU Intensive (2)

somersault (912633) | more than 6 years ago | (#23035592)

Very good point. Raytracing is obviously quite parallelisable from what you are saying, so it doesn't take a breakthrough in technology so much as just a whole bunch of appropriately raytracing-oriented graphics cards chained together if you want to play raytraced games :P Rasterised graphics are good enough for me at the moment anyway; if it were a choice between rasterised graphics and paying £2000 for photorealistic graphics, I'm not sure I'd want to pony up the cash.. meh.. who am I kidding, I'd be all over it..

Re:Ray-Tracing Extremely CPU Intensive (4, Insightful)

DrXym (126579) | more than 6 years ago | (#23036570)

Those PS3 tech demos are cool but could more accurately be called ray casting. They bounce a primary and maybe a secondary ray off some fairly simple scenes. I expect if you looked close up there would be jaggies all over the shop, and things like reflections & shadows would be brutal. Proper ray tracing requires sub-pixel sampling with jitter and recursion to look even remotely acceptable.

I don't think anyone denies that ray tracing is lovely etc., but it's a question of whether it is remotely feasible to do it on the current generation of CPUs or GPUs. If it takes a cluster of Cell processors (basically super-fast number shovels) to render a simple scene, you can bet we are some way off from it being viable yet.

Maybe in the meantime it is more suitable for lighting / reflection effects, used in conjunction with traditional techniques.

Re:Ray-Tracing Extremely CPU Intensive (2, Insightful)

-noefordeg- (697342) | more than 6 years ago | (#23035774)

Why would one want 30 frames per second?

If I were to mention a number, I would either want at least ~72 frames per second (where the eye/brain would have a hard time discerning between individual frames) or at least match the sync of an ordinary LCD screen at 60 fps.

Re:Ray-Tracing Extremely CPU Intensive (2, Informative)

Laughing Pigeon (1166013) | more than 6 years ago | (#23037484)

Why would one want 30 frames per second? If I were to mention a number, I would either want at least ~72 frames per second (where the eye/brain would have a hard time discerning between individual frames) or at least match the sync of an ordinary LCD screen at 60 fps.
That is not useful at all. 30 frames per second suffice to make the eye see something as "moving" instead of taking small steps, which is what you describe as "where the eye/brain would have a hard time discerning between individual frames". The reason one sees flickering on a CRT is that the phosphor dots "cool down" after being hit by the electron beam, so the dots have to be hit time after time. To prevent this from giving a flickering screen, the frequency at which the pixels are "activated" has to have a certain minimum value (for many people 72 Hz is enough) (and nobody truly needs more than 640 Hz ;-)). This has nothing to do with the brain discerning individual frames.

Re:Ray-Tracing Extremely CPU Intensive (1)

Kabal` (111455) | more than 6 years ago | (#23037840)

Most people can easily see the difference between 30 and 60 fps, and possibly higher. If you're suggesting 30 is all that anyone needs, you are wrong.

Re:Ray-Tracing Extremely CPU Intensive (1)

PitaBred (632671) | more than 6 years ago | (#23038316)

No, you can't "see" the difference between 30 and 60 fps. That's why movies play at 24 fps. 30 is a great FLOOR. The problem is that you're talking about average FPS rates, and as an average, yes, 30 is too low. It means there are a significant number of times when it's below 30 fps.

Re:Ray-Tracing Extremely CPU Intensive (1)

Oktober Sunset (838224) | more than 6 years ago | (#23037852)

Depends on how much action is going on. At 30 fps your games will look perfectly smooth unless you turn quickly; then the steps between the images will be too large to perceive as smooth motion.

Re:Ray-Tracing Extremely CPU Intensive (2, Interesting)

Floritard (1058660) | more than 6 years ago | (#23035814)

you want at least 30 frames per second and even that isn't considered great by many gamers.
I've always wondered about the need for a solid 60 fps in every game. A lot of games, especially console games of late, are going for that cinematic experience, and as theatrical movies themselves run at 24 fps, maybe all it would take is today's prettiest graphics and a really sophisticated use of motion blur to make a good game running at that mere 24 fps. Maybe for first-person shooters and racing games you want that almost hyper-real 60 fps of unblurred, crystal-clear action, but for those other action/adventure games you could probably get by with less. There was an article recently about how playing sports games isn't so much like simulating you playing the sport as it is simulating you watching a televised sports program. In that case, why would you need more fps than the rate at which your television (NTSC: 29.97 fps, PAL: 25 fps) has traditionally broadcast? It might even look more real with fewer frames.

Re:Ray-Tracing Extremely CPU Intensive (2, Interesting)

Naughty Bob (1004174) | more than 6 years ago | (#23035976)

Dude, FPS for video games is not really comparable with FPS in films/TV etc., for one simple reason-

In video games, the frame rate is also the rate at which everything else (physics, etc.) is calculated.

Re:Ray-Tracing Extremely CPU Intensive (5, Interesting)

andersbergh (884714) | more than 6 years ago | (#23036022)

No it's not; usually games have a separate loop for logic (physics, AI, etc.) running at, say, 30 fps. If the GPU can push more frames than that, then why not.
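
As a rough illustration of that decoupling, here is a minimal fixed-timestep loop sketch in C++. The hook functions update_logic and render, the 30 Hz rate, and the frame cap are all illustrative placeholders, not any particular engine's API:

    #include <chrono>
    #include <cstdio>

    // Placeholder hooks: stand-ins for a real engine's logic and render passes.
    static void update_logic(double dt) { (void)dt; }     // physics, AI, input at a fixed rate
    static void render(double alpha)    { (void)alpha; }  // draw, interpolating between logic states

    int main() {
        using clock = std::chrono::steady_clock;
        const double logic_dt = 1.0 / 30.0;   // logic fixed at 30 Hz, as the parent suggests
        double accumulator = 0.0;
        auto previous = clock::now();

        for (int frame = 0; frame < 1000; ++frame) {   // bounded loop, just for the sketch
            auto now = clock::now();
            accumulator += std::chrono::duration<double>(now - previous).count();
            previous = now;

            // Run as many fixed logic steps as wall-clock time demands...
            while (accumulator >= logic_dt) {
                update_logic(logic_dt);
                accumulator -= logic_dt;
            }

            // ...then render as often as the GPU allows, independently of the logic rate.
            render(accumulator / logic_dt);
        }
        std::puts("done");
    }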

Re:Ray-Tracing Extremely CPU Intensive (0)

Anonymous Coward | more than 6 years ago | (#23036434)

Remember Quake 2? That's why everybody ran at those magic 120-something fps: you could jump further...

Re:Ray-Tracing Extremely CPU Intensive (1)

PitaBred (632671) | more than 6 years ago | (#23038416)

You wish that's how it worked. That's a VERY recent development... most games (C&C: Generals and later, even) typically run everything in a main loop. It was so bad that when a friend of mine was working with OSG and did a separate render thread, he got NO speedup with ATI's drivers (they've since fixed that) because ATI just did a busy-wait if you enabled VSYNC. So on a single CPU his processing thread actually got slower when not blasting images to the screen more times than the actual LCD could display. Completely stupid.

Re:Ray-Tracing Extremely CPU Intensive (0)

Anonymous Coward | more than 6 years ago | (#23036328)

Dude, FPS for video games is not really comparable with FPS in films/TV etc., for one simple reason-

In video games, the frame rate is also the rate at which everything else (physics, etc.) is calculated.
ehhh... since when??
That would make physics dependent on graphics performance; not a wise choice, I think...

I'm in the beta test for a new TrackMania patch, and when the discussion was about car physics, they told us the game does the physics at 400 "fps", independent of how fast the graphics are.

The only real reason game fps would have to be higher is that there's no such thing as motion blur, or the other effects that make a moving film image look so good.

Re:Ray-Tracing Extremely CPU Intensive (4, Informative)

irc.goatse.cx troll (593289) | more than 6 years ago | (#23037066)

Since Quake 1, and everything derived from it in some way.

Yes, it's not a 'wise decision', but not all decisions can be made based on what's most logical... sometimes you need to cut corners based on what will work fastest or easiest.

In quake your movespeed and your ability to move/accelerate in the air is based entirely on your fps. Some trick jumps can't be done without a certain framerate.

In quake3 that changes more into your jump height, but the same end result -- Some jumps require certain fps to become possible.

In any HL based game your ability to slide up a steep wall instead of slide down it is impacted by your fps (and also the servers framerate).

In TFC hwguy assault cannon and a few other weapons would fire more often with higher fps.

In Natural Selection(1.x) how quick your jetpack fuel replenishes is based on your fps. Enough FPS and you could fly forever.

There's more, but the tl;dr version: any game that uses Quake's "player.think()" system to do calculations will fire off more .think()s per second on clients with higher framerates.

Re:Ray-Tracing Extremely CPU Intensive (1)

dctoastman (995251) | more than 6 years ago | (#23036394)

But it is calculated based on time elapsed, not on a count of frames.
So at 60, 30, or 1000 fps, you still move at the same speed.
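
For what the parent describes, a tiny illustrative C++ sketch (the speed value and loop are made up, not taken from any particular game): position advances by speed multiplied by elapsed time, so the result is the same at any frame rate.

    #include <cstdio>

    int main() {
        const double speed = 5.0;                // units per second (illustrative)
        for (double fps : {30.0, 60.0, 1000.0}) {
            double position = 0.0;
            const double dt = 1.0 / fps;         // seconds per simulated frame
            for (double t = 0.0; t < 1.0; t += dt)
                position += speed * dt;          // time-based step, not a fixed per-frame step
            // After one simulated second the distance matches (up to rounding) at every rate.
            std::printf("%7.1f fps -> position %.3f\n", fps, position);
        }
    }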

Re:Ray-Tracing Extremely CPU Intensive (3, Informative)

jfim (1167051) | more than 6 years ago | (#23036954)

It depends on the game. For example, the first releases of Quake 3 had different physics depending on your framerate, due to integer clamping of player positions. They fixed the issue in later patches by adding an option to force everyone to run at 125 Hz, but by default it is off.

This allows a couple of jumps that are not possible UNLESS you are running at 125 Hz, such as the megahealth jump on q3dm13.

This guide has more information: http://ucguides.savagehelp.com/Quake3/FAQFPSJumps.html [savagehelp.com]
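
A toy C++ simulation of the general effect (not Quake 3's actual code; the gravity and jump values, and the rounding of each frame's duration to whole milliseconds, are illustrative assumptions): integrating the same jump at different frame rates gives slightly different apex heights once per-frame quantities get clamped.

    #include <cmath>
    #include <cstdio>

    // Integrate a jump with the per-frame time rounded to whole milliseconds.
    // The accumulated rounding differs with frame rate, so the apex differs too.
    static double apex_height(double fps) {
        const double gravity = 800.0;   // units/s^2 (illustrative)
        const double jump_v  = 270.0;   // initial vertical velocity (illustrative)
        const int msec = static_cast<int>(std::round(1000.0 / fps));
        const double dt = msec / 1000.0;
        double z = 0.0, vz = jump_v, apex = 0.0;
        while (z >= 0.0) {
            vz -= gravity * dt;         // simple per-frame integration
            z  += vz * dt;
            if (z > apex) apex = z;
        }
        return apex;
    }

    int main() {
        for (double fps : {60.0, 85.0, 125.0, 333.0})
            std::printf("%5.0f fps -> apex %.2f units\n", fps, apex_height(fps));
    }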

Re:Ray-Tracing Extremely CPU Intensive (1)

MadnessASAP (1052274) | more than 6 years ago | (#23037838)

That's just plain wrong. There's nothing forcing you to calculate physics or anything else, for that matter, at the same rate as the graphics system; AI, for instance, may only be calculated once or twice a second, and physics may be calculated 3 or 4 times as fast as the graphics. It just so happens that it can simplify the programming to serialize these operations rather than running them in separate threads.

Re:Ray-Tracing Extremely CPU Intensive (3, Interesting)

Colonel Korn (1258968) | more than 6 years ago | (#23035994)

Well-done motion-blurred 24 fps currently would take more power than 60 unblurred fps, but yeah, the notion isn't a bad one.

Re:why are you using CPU to do GPU's job? (0)

Anonymous Coward | more than 6 years ago | (#23036110)

Current CPU technology is moving toward multi-core, and that is just what ray tracing needs. What you need to accelerate RT is to have 1 or 2 cores handle the k-D tree space partitioning, and then reverse ray trace from the display with each ray utilizing a hardware thread. GPU technology has long been utilizing the multi-core method, but currently GPUs don't have the optimized k-D tree space partitioning component. Even so, utilizing the nVidia CUDA package, you can squeeze 40+ FPS out of RT at 1280x1024. RT hardware just needs some time to mature. It is an inevitable trend, since hardware vendors keep fitting more cores onto a single chip and have abandoned their efforts to improve single-core chips.
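
The "each ray on its own hardware thread" idea works because primary rays are independent of one another. Below is a minimal, self-contained C++ sketch of that parallelism only: it splits image rows across CPU threads and tests each primary ray against a single hard-coded sphere (the scene, camera, and ASCII output are illustrative, and there is no k-D tree here):

    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <thread>
    #include <vector>

    struct Vec { double x, y, z; };

    // Boolean ray/sphere test, assuming a unit-length direction d.
    static bool hit_sphere(Vec o, Vec d, Vec c, double r) {
        Vec oc{o.x - c.x, o.y - c.y, o.z - c.z};
        double b = 2.0 * (oc.x * d.x + oc.y * d.y + oc.z * d.z);
        double cc = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - r * r;
        return b * b - 4.0 * cc >= 0.0;
    }

    int main() {
        const int W = 64, H = 32;
        std::vector<char> image(W * H, '.');
        const unsigned threads = std::max(1u, std::thread::hardware_concurrency());

        auto shade_rows = [&](int y0, int y1) {
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < W; ++x) {
                    // Primary ray through the pixel; camera at the origin, looking down -z.
                    Vec d{(x - W / 2) / double(W), (y - H / 2) / double(H), -1.0};
                    double len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
                    d = {d.x / len, d.y / len, d.z / len};
                    if (hit_sphere({0, 0, 0}, d, {0, 0, -3}, 0.8))
                        image[y * W + x] = '#';
                }
        };

        // Each thread shades its own band of rows; no locking needed, rays are independent.
        std::vector<std::thread> pool;
        for (unsigned t = 0; t < threads; ++t)
            pool.emplace_back(shade_rows, int(H * t / threads), int(H * (t + 1) / threads));
        for (auto& t : pool) t.join();

        for (int y = 0; y < H; ++y) { std::fwrite(&image[y * W], 1, W, stdout); std::putchar('\n'); }
    }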

Re:Ray-Tracing Extremely CPU Intensive (0)

Anonymous Coward | more than 6 years ago | (#23036800)

30 frames a second is far too slow for a fast-paced FPS.

Re:Ray-Tracing Extremely CPU Intensive (1, Interesting)

Anonymous Coward | more than 6 years ago | (#23037272)

> It's no surprise that Intel is being bashed over their idea of real-time CPU ray-tracing. As anyone who has ever ray-traced will realize it's extremely slow. At times you're talking about HOURS PER FRAME

Hours per frame if you do accurate global illumination... Which is also very expensive to do using rasterization, and isn't done in any modern game, by the way.

Raytracing a very complex scene with a proper scene partitioning scheme can be done in under a second on a modern single-processor machine. If you add adaptive antialiasing (only done at visible edges), you add maybe 50% more CPU time... If you want soft shadows, make that a few seconds of rendering time... You can add some approximate global illumination on top of this to make it more viable.

Still, this can be done fast using multiple cores, and with specialized hardware, will be feasible in real-time. Someone has already shown very basic raytracing can be done in hardware, in real-time, using an FPGA (look up SaarCOR). If nvidia made a raytracing graphics card, they could absolutely deliver something on par with current rasterization products that runs at real-time framerates.

In the end, it's slower than rasterization, but it looks a lot better.... You get soft shadows, reflections, refraction, etc. all for "free"... I must also mention that with rasterization, implementing realistic effects can be very painful, while with raytracing it's a lot more "natural" and intuitive to program (since it's based on an actual simulation of light, rather than on projecting triangles).

Re:Ray-Tracing Extremely CPU Intensive (4, Insightful)

xouumalperxe (815707) | more than 6 years ago | (#23037548)

Bullshit. Just as with raster graphics, the amount of time you spend per frame on ray-tracing is adjusted to your needs and desires. Take, say, a Pixar film. Those are mostly done with raster graphics, with key effects done with ray-tracing. How much time do you reckon it takes to render each one of those films' frames? (Pixar films are all drawn with Photorealistic RenderMan, which is based on the REYES algorithm, which reads like a fancy raster engine.)

The part about computational power is another fine display of complete misrepresentation of reality. Raster graphics are this fast nowadays for two major reasons. The most obvious is that graphics cards are entire massively parallel processors specialized in drawing raster graphics. It's pretty damn obvious that, given two techniques for the same result, the one for which you use a specialized processor will always be faster, which doesn't prove that one technique is inherently faster than the other. The second, less obvious, is that raster graphics have been the focus of lots of research in recent years, which makes it a much more mature technology than ray-tracing. Once again, a more mature technology translates into better results, even if the core technique has no such advantage. What Intel is supposedly aiming for here is getting the specialized hardware and mindshare going for ray-tracing, which might lead to real-time ray tracing becoming a viable alternative to raster graphics.

Re:Ray-Tracing Extremely CPU Intensive NOT! (1)

Dzonatas (984964) | more than 6 years ago | (#23038108)

If a modern ray tracer spent hours on a frame, it would be so realistic that no current GPU could even match the quality. You, sir, have surely exaggerated. The one advantage ray tracers have over the GPU is fast dynamic scenes. The GPU has been and remains a major memory bottleneck. A quad-core does very nicely rendering fast frame rates with a ray tracer.

If one wants to argue that people won't spend any extra money for it, go look at the tons of people that multi-box in WoW and how much money they spend on 6 or more computers!

Re:Ray-Tracing Extremely CPU Intensive (1)

noname444 (1182107) | more than 6 years ago | (#23038164)

O rly?

http://pouet.net/prod.php?which=9461 [pouet.net]
http://pouet.net/prod.php?which=2228 [pouet.net]
http://pouet.net/prod.php?which=688 [pouet.net] (DOS)
http://pouet.net/prod.php?which=5 [pouet.net]
http://pouet.net/prod.php?which=3845 [pouet.net]

Some more info
http://tog.acm.org/resources/RTNews/demos/overview.htm [acm.org]

Too bad the trend died off, but I think we'll see some more demos in the realtime ray tracing area in the next couple of years.

Stop motion movies (4, Interesting)

BadAnalogyGuy (945258) | more than 6 years ago | (#23035216)

For years some claymation movies were set up by hand and shot frame by frame in a process called stop motion [wikipedia.org] . While adequate, the resulting film was typically unnatural and the movements very stiff compared to live actors.

Enter ILM and go motion [wikipedia.org] . Instead of filming static scenes, the clay was moved slightly during the shot to create a blurry frame. This blurry frame made the scene seem more realistic. The blur is what the eye picks up in the movie frame, so an actor walking in a scene is not a set of pinpoint focus shots but a series of blurs as the man moves.

Ray tracing is great for static scenes. But movement is the key to games that require this much detail, and so each frame should not be a single beautifully rendered framebuffer, but a mix of several framebuffers over the span of one frame. Star Wars did it great. Most computer games, not so much.

Re:Stop motion movies (2, Interesting)

ozmanjusri (601766) | more than 6 years ago | (#23035334)

Ray tracing is great for static scenes.

Where did you get that idea?

Ray tracing can do selective motion blur very inexpensively. You test the ray against a bounding sphere of a triangle's motion span, then interpolate along an approximation of the triangle's path.

That's a very bad analogy you're using...

Re:Stop motion movies (1)

BadAnalogyGuy (945258) | more than 6 years ago | (#23035426)

Ray tracing is great for static scenes.
Where did you get that idea?

Are you saying it's not?

Re:Stop motion movies (1)

ozmanjusri (601766) | more than 6 years ago | (#23035486)

Are you saying it's not?

Radiosity is better.

Re:Stop motion movies (1)

Goaway (82658) | more than 6 years ago | (#23036424)

Those are not mutually exclusive. You more often than not combine ray tracing with some kind of global illumination method like radiosity.

Re:Stop motion movies (1)

alexhard (778254) | more than 6 years ago | (#23036682)

That's a very bad analogy you're using...
Hence, BadAnalogyGuy

Re:Stop motion movies (1)

Yetihehe (971185) | more than 6 years ago | (#23035568)

Actually, programs can use 4D raytracing. It means the program can scatter samples in the time dimension, which gives blurring. It's only one of several techniques for motion blur in raytracing.
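
A tiny C++ sketch of the "scatter samples in time" idea (the moving object, shutter interval, and ASCII output are all made-up illustrations): each pixel averages several samples taken at random instants, so an object that only covers the pixel for part of the exposure comes out partially lit, i.e. blurred.

    #include <cmath>
    #include <cstdio>
    #include <random>

    int main() {
        const int W = 60, samples = 16;
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> shutter(0.0, 1.0);   // shutter open .. close

        for (int x = 0; x < W; ++x) {
            int hits = 0;
            for (int s = 0; s < samples; ++s) {
                double t = shutter(rng);                  // jittered time for this sample
                double center = 15.0 + 20.0 * t;          // object sweeps from x=15 to x=35
                if (std::abs(x - center) < 3.0) ++hits;   // this sample's "ray" hits the object
            }
            // Partial coverage near the ends of the sweep shows up as a soft edge: motion blur.
            std::putchar(" .:-=+*#"[hits * 7 / samples]);
        }
        std::putchar('\n');
    }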

Re:Stop motion movies (1)

nschubach (922175) | more than 6 years ago | (#23035836)

Not to mention, it's not like Anti-Aliasing in ray tracing is impossible.

ftp://ftp.alvyray.com/Acrobat/6_Pixel.pdf [alvyray.com] (from 1995 no less)

Re:Stop motion movies (1)

Have Brain Will Rent (1031664) | more than 6 years ago | (#23038628)

Actually, programs can use 4D raytracing. It means the program can scatter samples in the time dimension, which gives blurring. It's only one of several techniques for motion blur in raytracing.

That idea has been around for quite a while, with the goal of efficiently generating multiple ray-traced frames from a single calculation of the intersection of a ray with a moving object over a period of several frames. See, for example,

"Spatio-Temporal Coherence in Ray Tracing", Chapman, J. et al., Graphics Interface '91 http://books.google.ca/books?id=hbJ20d4NgiUC&pg=PA101&lpg=PA101&dq=%22spatio-temporal%22+coherence&source=web&ots=jpQARDC-US&sig=mEJNEIgfR9YvaUXNTXvO1PsYcPE&hl=en [google.ca]

which discusses solving the ray and moving object intersection problem.

Re:Stop motion movies (1)

EdibleEchidna (468353) | more than 6 years ago | (#23036018)

Aardman Animations (the creators of Wallace and Gromit) may disagree with you that stop motion animation looks "unnatural" and "stiff".

Re:Stop motion movies (1)

Bombula (670389) | more than 6 years ago | (#23037150)

You are absolutely 100% correct, and anyone in the VFX field can tell you this is true. This is why motion blur is such a hugely effective way to improve visual quality in gaming at a very low performance cost. It is silly that 3D engines don't make better use of frame and motion blurring. As you pointed out, the human eye does not see motion as a series of static, focused snapshots; it sees motion as a slurry of blurred imagery. Conveniently, it is much easier to render blurs than pin-point accuracy.

As another example, it took a long time for manufacturers of video cameras to figure out that a series of in-focus frames will look stuttery and fake compared to a series of semi-blurred frames. And so for more than a decade video - even hi-def video - looked like crap compared to film. By a stroke of good luck, celluloid film stock captures blurred imagery the way the human eye does, and so it looks beautiful and realistic, even when it contains far less per-pixel information. (This is also why older film, whether motion picture or still picture, looks wonderful: less information is often better, as it is more pleasing to the eye than microscopic clarity.) This is finally being corrected in high-end video cameras, so that the latest generation of HD video cameras have exposure controls to simulate these effects and make the footage appear more like film.

Re:Stop motion movies (1)

Have Brain Will Rent (1031664) | more than 6 years ago | (#23038488)

Just set your shutter speed and you'll get all the blur you want. Even digital photographers have known that for many years.

Re:Stop motion movies (1)

Dzonatas (984964) | more than 6 years ago | (#23038238)

That blur used to exist automatically on phosphor-backed TV sets. Now we have LCD screens that are pretty quick on the frame, and people want the blur back (go figure). So now there are LCD screens with smart chips inside to smooth the transition between two frames. Surely the renderer does not need to do this if the LCD screen can do it.

you've got it arse about face. (3, Funny)

timmarhy (659436) | more than 6 years ago | (#23035230)

It's customers that drive the market, not developers. Christ, these guys sound like a bunch of OSS developers.

Re:you've got it arse about face. (1)

somersault (912633) | more than 6 years ago | (#23035694)

Nothing wrong with giving the customers more options, and letting them decide. Most customers wouldn't say "hey, can you work on superfast ray tracing please, I want my Monsters Inc game to look just like the real thing!". Most people wouldn't know the difference between ray tracing and rasterisation if it hit them in the face (and bounced off at an angle of reflection equal but opposite to the angle of incidence).

Re:you've got it arse about face. (4, Interesting)

AKAImBatman (238306) | more than 6 years ago | (#23035996)

it's customers that drive the market, not developers.

In the case of a company like Intel that's pushing a new technology, the developers are the customers. It's not Joe Consumer who's going to be buying into Intel's technology. (At least not until there are games that support it.) It's going to be the developers. Developers who will be taking a gamble on a next-generation technology in hopes that they lead the pack. And as history has proven, the first out of the gate often earns the most money. (At least in the short term.)

Of course, history has also proven that new technologies often fail. Thus the risk is commensurate with the reward. There may be a lot to gain, but there is also a lot to lose. A lot of dollar signs, that is.

First out of the gate? (0)

Anonymous Coward | more than 6 years ago | (#23037012)

And as history has proven, the first out of the gate often earns the most money. (At least in the short term.)
I seriously doubt that. In fact usually the pioneers get the least amount of money.

Was MySpace the first social networking site?

Was World of Warcraft the first MMORPG?

Consider Ford versus Toyota/Honda/etc.

And countless other examples, pretty much anything outside of patent stuff.

Re:First out of the gate? (4, Insightful)

AKAImBatman (238306) | more than 6 years ago | (#23037444)

Your comment doesn't make a lick of sense. I mentioned that the early entrants into a new market make the most money in the short-term. You then try to refute my argument with a long-term argument. Logic error. Danger Will Robinson. Danger!

Was MySpace the first social networking site?

No. That dubious distinction belongs to Classmates.com, a site launched in 1995 that did quite well for itself and is still going strong. (Oddly.)

Was World of Warcraft the first MMORPG?

Neverwinter Nights, Ultima Online, and Everquest (nay, Evercrack!) were all highly successful and made their creators a lot of money in the short term.

Consider Ford versus Toyota/Honda/etc.

Consider what? Ford went gangbusters when it released the Model T to the market. In the short term, Ford's assembly-line approach effectively handed them the market. Toyota and Honda weren't competitors for nearly 80 years!

The best person to ask? (5, Insightful)

Don_dumb (927108) | more than 6 years ago | (#23035234)

Cevat Yerli, one of the Crytek developers responsible for the graphically impressive titles Far Cry and Crysis
Is he the same developer who made a game (Crysis) so resource hungry that no gaming platform can handle it? Shouldn't we be asking someone who knows how to make a game look great on current hardware, such as Valve perhaps?

Re:The best person to ask? (0, Flamebait)

Anonymous Coward | more than 6 years ago | (#23035420)

Crysis on medium settings still looks better than HL2 engine maxed out.

Re:The best person to ask? (1)

ZiakII (829432) | more than 6 years ago | (#23035446)

Is he the same developer who made a game (Crysis) so resource hungry that no gaming platform can handle it? Shouldn't we be asking someone who knows how to make a game look great on current hardware, such as Valve perhaps?

I saw a demo of Quake 4 done with ray tracing; even with 4 quad-cores the game was unplayable.. here is the demo. [youtube.com] Going with ray tracing will definitely not make any game less resource-hungry.

Re:The best person to ask? (2, Insightful)

Yetihehe (971185) | more than 6 years ago | (#23035624)

I would really like to see Quake 4 with pixel shading on just the CPU. You people forget that current games use specialized graphics processors which currently go up to 128 shading units working in parallel. What if I had 128 specialized raytracing units? We should see results THEN.

Re:The best person to ask? (1)

nschubach (922175) | more than 6 years ago | (#23035946)

...and you wouldn't leave nVidia out of the loop, because there are always gamers who want that one extra level of realism. You could have an Intel core doing threads of tracing and have the nVidia core work alongside it, giving more depth, more rays, or real-time radiosity down the line.

This is why I don't understand why there is a huge debate on this. It's not like GPUs will suddenly vanish because of raytracing. They just won't be mainstream, which may be the reason. /shrug

Re:The best person to ask? (1)

Yetihehe (971185) | more than 6 years ago | (#23036108)

They WILL be mainstream. They will just have raytracing units alongside normal shading units. This is not about GPU OR RPU (ray processing unit), but GPU manufacturers want everyone to believe that (or maybe they believe it themselves).

Re:The best person to ask? (1)

lorenzo.boccaccia (1263310) | more than 6 years ago | (#23036614)

Ehm... what if those people actually started making complete games? Give me the other half of the Crysis story!! Let me kick those aliens off that island!

Re:The best person to ask? (1)

nschubach (922175) | more than 6 years ago | (#23036746)

Not really, because you wouldn't be REQUIRED to have a GPU/RPU to do any of the rendering. Right now you'd be hard-pressed to play anything without a DirectX 9 card, whereas with ray tracing you could just buy the biggest/best processor and build your system around that. You'd get good enough performance, and only those that need the bleeding edge would actually buy accelerators.

Re:The best person to ask? (0)

Anonymous Coward | more than 6 years ago | (#23035462)

Crysis runs at 20-40 FPS on 1280x960/High (the highest settings that you can get under XP) on my system, which cost me about $600 a year ago - It's not even current hardware anymore, given that most of the components are prev-gen by now.

Do what now?

Re:The best person to ask? (1)

Alwin Henseler (640539) | more than 6 years ago | (#23035636)

Is he the same developer who made a game (Crysis) so resource hungry that no gaming platform can handle it?

Are you kidding? Nobody wants to play the 100th Doom clone other than for replay value. For a 'wow' factor, a game needs something new, something that was never done before, or never done that well. A never-seen-before feeling of immersion, a great, unique storyline, artwork that makes existing stuff look old, and sometimes... unique technical features.

To enable innovative technical features, you often need more processing power, whether from CPU, GPU or elsewhere. And for that reason, any game that pushes the envelope will be coded to run on the latest available hardware. And vice versa, the latest hardware will be beefed up to run the latest games smoothly.

So as much as I'd like game developers to aim for "the most resource-friendly game required for a new experience", they'll continue to aim for "get the most impressive out of the newest hardware". That's just how it is.

And 2, 3 years down the line, no one will care anyway. I've got a machine built around roughly 2-year-old parts, and it (just) meets Crysis' requirements. So systems you carry out of the shop right now will do fine. And that's apart from the fact that you don't have to buy current games or hardware at all.

Re:The best person to ask? (4, Funny)

AioKits (1235070) | more than 6 years ago | (#23035684)

It's not THAT resource hungry! Sure, I mean, I had to steal, err, borrow a few human organs, particularly livers and kidneys, and follow some archaic diagrams I found in my original Doom shareware documentation to create a device powerful enough to run Crysis at full capacity. But it worked, damnit! For about 30 minutes... I think I got something wrong when I built it, because now all it wants to play is Doom 6, and I didn't even realize there was a Doom 6 out yet! Oh, and there's this red 'gash' on the wall behind my desk. It's kinda oozing, but the drip pan takes care of that. One of my cats is missing too.

Re:The best person to ask? (1)

the grace of R'hllor (530051) | more than 6 years ago | (#23035724)

I think someone who pushed systems slightly *over* the edge is excellently positioned to know where the edge currently is.

And yes, slightly. Give it three months and there'll be plenty of systems that can run the game very well. (Alas, mine will not be one of them. I hope to game at least a year more on my current rig)

Re:The best person to ask? (1)

Nullav (1053766) | more than 6 years ago | (#23037218)

I think someone who pushed systems slightly *over* the edge is excellently positioned to know where the edge currently is.
Someone who pushed systems over the edge and didn't bother to step back a bit, mind you. Also, only those with more money than sense will rush out for new hardware every six months.
Then again, we are talking about the 'gamer' crowd, with their window-modded monitors and magic smoke pumps.

Re:The best person to ask? (1)

Torn8-R (1190051) | more than 6 years ago | (#23036082)

Crysis = "Hey, we got this really neat game engine but we can't afford keeping this in QA to work out all the bugs - hey, let's just wrap a shell of a game around it and market it. People will pay to test our game." Management = "Brilliant!" Now that my rant is over: Crysis was the most ridiculous release of any game/software I have ever seen. As a software developer, I was embarrassed by the number of bugs and just dumb stuff that should have been caught. And then the first major patch wrecked the game again by making the save-game files increasingly bigger, to the point where you're waiting 30-45 minutes for a level to load or for something to reset after you died. It got so bad I didn't even finish the game. And when the guy who lent me the game told me that I was at the last scene, I was astonished. As for resources, considering I was running on a dual-core AMD with an 8800GTX, I really wasn't impressed. The game still ran horribly, and I had to end up turning the video settings down so that it was playable.

Re:The best person to ask? (1)

ifrag (984323) | more than 6 years ago | (#23037152)

That's odd, I was running Crysis on a very similar setup (FX-62 and 8800 Ultra) on high settings with almost no problems. The game did lock up one time on me, in the no-gravity area. I went through it with the most recent nVidia beta drivers and Crysis patch, so maybe some of the problems had been addressed. Overall I thought the game looked really good, but I was disappointed by how short the storyline was cut.

Re:The best person to ask? (1)

DirePickle (796986) | more than 6 years ago | (#23037468)

Seriously? Are you this stupid? It's only that resource hungry if you want to have every god-damn feature enabled. Should they have chopped out all of the extra-pretty features so it looked and ran as well as Half-Life 2? Then it would run on four-year-old hardware (like it does now, if you turn stuff off!) but the people that do have fast hardware wouldn't get any benefit. And as you beef up your computer, you'll be able to continue to get extra enjoyment out of the game for years as you dial it up.

Re:The best person to ask? (1)

Don_dumb (927108) | more than 6 years ago | (#23038468)

Seriously? Are you this stupid? It's only that resource hungry if you want to have every god-damn feature enabled. Should they have chopped out all of the extra-pretty features so it looked and ran as well as Half-Life 2? Then it would run on four-year-old hardware (like it does now, if you turn stuff off!) but the people that do have fast hardware wouldn't get any benefit. And as you beef up your computer, you'll be able to continue to get extra enjoyment out of the game for years as you dial it up.
Please refer to another clearly "very stupid" poster who has replied to my post above - The Post [slashdot.org]

And the fact that it can't have every feature running is kind of my point: it isn't about whether he can make really pretty new features, it's that he isn't the best-placed person to say how to optimize them for actual gameplay; there may be more valuable opinions out there. I am not dissing the act of chasing the carrot, I am attacking those who release buggy software that is way too ambitious about the hardware's ability to make up for what seems to be shoddy engineering.

Re:The best person to ask? (2, Insightful)

monoqlith (610041) | more than 6 years ago | (#23037648)

It would seem so at first, yes. But then, I would argue, the person who has made a game that was meant to run on hardware that doesn't exist yet might be more qualified to comment on rendering methods that run on hardware that doesn't exist yet.

Re:The best person to ask? (0)

Anonymous Coward | more than 6 years ago | (#23038040)

What game has Valve ever made that looks decent? Hopefully that was a joke.

Since you don't seem to realize... Crysis was made to demo Crytek's engine, which they then attempt to sell to other developers wishing to make games. It's his job to make the game demonstrate every possibility, not to target what your AMD w/ 3DNow! can push.

Re:The best person to ask? (0)

Anonymous Coward | more than 6 years ago | (#23038076)

Well, try doing a bleeding-edge video game and see what resources it takes up. That's always the price we pay for this kind of new game. And comparing Crysis and Half-Life doesn't make sense, as Crysis looks much better and has more graphical features than any game I've seen, period. Gameplay-wise, and story, well, I'll go with Half-Life 2 on that one.

So not an April fool then? (3, Interesting)

mofag (709856) | more than 6 years ago | (#23035376)

I ignored this story first time around because I assumed it must be an April fool's joke which I think is not unreasonable: Intel leading innovation in the GPU sector ....

Re:So not an April fool then? (1)

Hanners1979 (959741) | more than 6 years ago | (#23035740)

Intel will be re-entering the discrete graphics market at either the end of this year or early 2009 - How well they can compete in the traditional Direct3D/OpenGL graphics market remains to be seen (although Intel are rather bullish about it at the moment), but it appears that they will also be targeting 'Larrabee' (for that is its codename) parts at other possible market sectors such as real-time ray tracing and other general-purpose computing tasks a la NVIDIA's Tesla GPGPU offerings.

Re:So not an April fool then? (0)

Anonymous Coward | more than 6 years ago | (#23035752)

Me too, actually. It sure read like an April Fool's joke.

Re:So not an April fool then? (2, Informative)

Molochi (555357) | more than 6 years ago | (#23037080)

ARP did the article "DX11 to support hardware acceleration of raytracing" (and it was an April Fools prank). However, Intel is "serious" about hardware that does it... or at least serious about owning and promoting patents for the hardware.

Re:So not an April fool then? (1)

eggnoglatte (1047660) | more than 6 years ago | (#23037438)

Mod parent up!!!

There is still a lot of confusion around that DX11 "announcement". Time for somebody to set it right!


Crytek? talking of bad performance? (1)

iamwhoiamtoday (1177507) | more than 6 years ago | (#23035404)

I know that CPU ray-tracing is thrown around a lot here on Slashdot, and yes, it's slow (for current processors). BUT, let's look at the company doing the mudslinging at Intel: Crytek. Their latest release, Crysis, has abysmally poor performance. All of their press releases for the last few years say "oh sure, we support multithreading" / "Get a quad core for our game", and I know quite a few people whose main reason FOR getting a quad core was Crysis. Then launch day came, and we realized that the game is single-threaded, and that our rather expensive processors had been just that, an expensive waste of money. It just seems to me that Crytek should implement advances in technology, rather than just complain about Intel's latest idea.

why bash? (5, Insightful)

damnfuct (861910) | more than 6 years ago | (#23035424)

Yeah, so it's going to take 3-5 years before anything real comes out of it. Do you think the process of using high-k hafnium in the 45 nm microprocessors was developed overnight? I'm sure Intel is used to the R&D cycle, and 3-5 years is not unheard of. Besides, how much longer can you use rasterization "band-aids" to address rendering issues (reflections, shadows, light sources)? Rasterization is just a hack to try to implement features that simply "fall out" of ray tracing. Sure it's going to take computational power, but we're not going to be using Pentium 75s.

Re:why bash? (2, Insightful)

LingNoi (1066278) | more than 6 years ago | (#23035584)

Sure it's going to take computational power
So why waste it on ray tracing which adds little benefit over current techniques when it could be spent on so many other things?

There are other ways of producing global illumination which are much faster than ray tracing. It's pointless because it's like taking a step back just because we can now brute-force simple scenes.

Ray tracing will still be slow at global illumination anyway. The more reflections you have, the longer it takes, so it's not going to look as good either.

Re:why bash? (3, Insightful)

deroby (568773) | more than 6 years ago | (#23035874)

On the other hand, ray-tracing would be much less of a hack. Things simply look great the way they are, not because you niftily put a semi-transparent, super-texturized, shader-magic polygon in that corner of the field of view whenever the light source is like that and the so-called mirror is in that position, etc...

Sure it requires (a lot) more cpu-power, but development wise it should all be much more straight-forward. Build the scene and have it rendered.

Right now I'm under the impression that each time you want wow-factor, things go like this: build scene, render scene, look for awkward stuff caused by incomplete technology, add tweaks to scene, render again... Repeat the process until it all looks good from all corners. If that's not feasible within the given time frame: either prevent the user from walking out of the prepared spaces, drop the idea altogether, or leave it in half-baked and blame it on the drivers.

Re:why bash? (2, Insightful)

Goaway (82658) | more than 6 years ago | (#23036518)

Ray-tracing is nearly just as much of a hack as rasterizing polygons is. It's miles away from anything like a realistic model of lighting.

And it would still require just as much tweaking to make it look good, and make it fast.

Re:why bash? (2, Insightful)

steelfood (895457) | more than 6 years ago | (#23037324)

It's a good first step to true global illumination.

Progress doesn't always come in leaps and bounds. Sometimes, it's about baby steps.

Be smarter, not more forceful! (1)

MessyBlob (1191033) | more than 6 years ago | (#23035510)

We can't afford to render every pixel to infinite depth, so we must be smart. I predict that over the next five years or so, the techniques around ray tracing will develop. That's subtly different from saying, "We'll be using chips powerful enough to ray-trace." Video encoding took the same path: the stream now contains just enough information to make us believe that all the information is there. While I don't believe that games will have every pixel in every frame rendered by ray tracing in the immediate future, I believe there will be a transition, where the likes of Intel and AMD (etc.) are able to do more; but before this happens, we can be smart.

Of course he's going to bash it... (-1, Flamebait)

mdm-adph (1030332) | more than 6 years ago | (#23035558)

<rampant speculation>
...considering Crytek went to great lengths to make sure that Crysis ran better on Nvidia hardware, I'd say they're going to publicly speak out against anything which threatens (a graphics hardware company like Nvidia's) business, since I've always heard that when ray tracing is perfected, discrete graphics hardware advancement will no longer be needed (since, if ray tracing makes stuff look photorealistic, what need is there for further advancement?).
</rampant speculation>

Crytek...? (0)

Anonymous Coward | more than 6 years ago | (#23035652)

As much as I'm open to the criticisms that developers make against ideas like raytracing in realtime, the use of realtime raytracing isn't in the gaming market; rather, it's focused on the markets that don't use GPUs, or at least don't use them for what Crytek uses them for. So their criticisms are valid, but not too relevant to Intel's target market (as they're more interested in the fidelity that raytracing can offer versus raw frame rates). And in reality, only a handful of successful realtime raytracing projects exist, which is why I think Crytek is jumping the gun here with its criticisms, because if it isn't a big company like Intel that will waste the R&D funds on this, then who will? God? The Easter Bunny? Who? Someone, logically and empirically so, must waste their money on the wrong answers to realtime raytracing (and realtime raytracing itself), otherwise no one can ever say they really know the potential of such technology (considering it effectively DOES NOT EXIST). Ultimately, Crytek's devs are trying to play armchair electrical engineer here, and being a fellow code monkey as they are, I'll say this: STFU until you get a degree in electrical engineering, otherwise you're wasting our RSS feeder's time with your banter. Sorry, but that's how I feel about it; mod this comment as you wish. :)

Why he's not into ray tracing... (0, Offtopic)

Alzheimers (467217) | more than 6 years ago | (#23035808)

Cevat Yerli is an INKER! [rochestergoesout.com]

Well... duh! (4, Insightful)

Yvanhoe (564877) | more than 6 years ago | (#23036028)

Carmack didn't really bash it, and neither did Crytek. They just made it clear that you can't have rasterization on day N and raytracing on day N+1. A 3-5 year transition period is very reasonable. Using raytracing optimally requires changing the whole data structure of the virtual world. It would require making new modeling tools, new rendering engines, and integrating new possibilities into the game design.
Keep in mind also that Intel proposes this as a future way of doing rendering. Their hardware is not even here yet. Given this, any prediction below 3 years would be quite surprising.

Re:Well... duh! (2, Interesting)

daveisfera (832409) | more than 6 years ago | (#23036792)

Actually, Carmack did say that he thought it would never fully transition to raytracing. He said that rasterization would always stay a step ahead and could "emulate" or fake a lot of the effects that raytracing can pull off. He did say that a hybrid method showed the most promise, but he also spent the majority of the time talking about how his new idea (has some goofy name that I can't remember right now) would be the best option of all.

Re:Well... duh! (1)

Cthefuture (665326) | more than 6 years ago | (#23037240)

You don't need anything new to use ray tracing. Ray tracing is just a new lighting method. A few minor tweaks to an existing engine and you could use all the models and textures you were already using. The only thing that will change is the shadows, reflections, etc.

At some point when the 3D models get sufficiently complex then ray tracing will become a lot more attractive. With enough complexity you can model the small details that are currently faked with textures. Those small details would be hard to light accurately with rasterization so this is where ray tracing would come in.

Personally, I'm finding that as games become more realistic-looking it's getting really hard to see anything. What we really need are true 3D displays where depth perception and eye focus can do their thing. A 100% realistic scene projected onto a 2D panel requires you to focus on everything at once, which makes it difficult to see clearly.

Re:Well... duh! (3, Informative)

Yvanhoe (564877) | more than 6 years ago | (#23037410)

Yes, you can do raytracing on polygons, but it kind of misses the point. For rendering polygons, Carmack is right, rasterization will probably always stay faster as long as triangles are bigger than 1 pixel.

The point of raytracing is that instead of having a 100,000-polygon cloth animation to rasterize, you could have a smoother result with about 1,000 control points on a mathematical surface.

Today, game makers and modelers have the habit of breaking everything into triangles because of rasterization, but the raytracing approach isn't limited to triangles; it can use any shape for which a collision with a ray can be computed. It is a very powerful approach, but new tools have to be developed to use it to its full extent.
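
To make the "any shape with a computable ray collision" point concrete, here is a small C++ sketch using the simplest analytic primitive, a sphere: the hit point and surface normal come straight from the surface equation, with no tessellation into triangles. (The sphere stands in for the fancier mathematical surfaces the parent mentions; the scene values are illustrative.)

    #include <cmath>
    #include <cstdio>
    #include <optional>

    struct Vec { double x, y, z; };
    static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    struct Hit { double t; Vec point, normal; };

    // Ray origin o, unit direction d, sphere centre c, radius r: closed-form intersection.
    static std::optional<Hit> intersect_sphere(Vec o, Vec d, Vec c, double r) {
        Vec oc = sub(o, c);
        double b = dot(oc, d);
        double disc = b * b - (dot(oc, oc) - r * r);
        if (disc < 0.0) return std::nullopt;       // ray misses the sphere
        double t = -b - std::sqrt(disc);           // nearer root along the ray
        if (t < 0.0) return std::nullopt;          // intersection is behind the origin
        Vec p{o.x + t * d.x, o.y + t * d.y, o.z + t * d.z};
        Vec n = sub(p, c);
        return Hit{t, p, {n.x / r, n.y / r, n.z / r}};   // exact analytic normal
    }

    int main() {
        auto h = intersect_sphere({0, 0, 0}, {0, 0, -1}, {0, 0, -5}, 1.0);
        if (h) std::printf("hit at t=%.2f, normal=(%.1f, %.1f, %.1f)\n",
                           h->t, h->normal.x, h->normal.y, h->normal.z);
    }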

1. Consoles 2. ??? = Ray Tracing! 3. Profit? (4, Interesting)

Colonel Korn (1258968) | more than 6 years ago | (#23036366)

Let's surmise for a minute:

The problem with ray tracing, as Carmack said, is that it will always be much slower than raster-based graphics with a given amount of computing power. He pointed out that there's nothing impressive about Intel's demo of a game from two generations ago running sort of acceptably at moderate resolution on an overpowered Intel demo system. He said that they'll never be able to produce a ray traced engine competitive with the state of the art raster-based games, so the ray tracing, while technically satisfying, will in every case offer poor performance for inferior graphics.

All of this boils down to a time lag. If raster graphics can do something in 2008, ray tracing can do it in 2012, etc. What if raster graphics stopped progressing for four years? Then ray tracing would have a chance to catch up, perhaps leading to new engines and APIs based on ray tracing, which would ensure long term use.

But wait... raster graphics have already been at a standstill for two years, for the first time since their inception. When the 360 came out and then the 8800 line showed up to put it firmly in its technical place, gaming graphics capabilities suddenly stopped advancing. Not only did nVidia have its first unassailable lead over ATI in a long time, but the PC gaming market suddenly showed very strong signs of finally dying. Most of the remaining PC game developers shifted development to consoles, leading to (again, as Carmack pointed out) a stationary graphical hardware target for new games. The overall number of PC gamers managed to stay high, but literally almost all of them were playing World of Warcraft, which has very low graphics card requirements.

Now two years have gone by, and WoW still dominates PC gaming, while only a few games have shown up that really push current hardware, with few people buying them. It's a pity that the most graphically impressive game is also quite mediocre when it comes to gameplay. There's very little market pressure on nVidia outside of the small enthusiast community, and they've managed to milk a 4x hardware lead over consoles for an unprecedented length of time. The graphics card industry used to beat the living crap out of Moore's Law, but now they've managed to eke out a 10% improvement in over two years, which is just sad. The next generation of parts may or may not be coming soon, may or may not bring a large performance boost, and may or may not have any software at all to really justify their purchase.

Going waaaaay back to the beginning, CPU speeds over this same time period have been keeping up with their normal exponential increase in power. At this rate, it would only take two more generations of PC gaming failure for ray tracing on the CPU to catch up with rastering on the GPU, and if that happens, it could end up going to consoles. Hell, it might even be good for PC gaming's health. Currently most console players have a PC, but with its Intel integrated graphics it's only suited to playing games from 6-8 years ago. Already those same PCs can probably match that with ray tracing. If games were only dependent on CPU speed, they'd be a lot more accessible and easily played by a much larger part of the population.

This was settled in a recent demo (1)

gigertron (1066074) | more than 6 years ago | (#23036374)

By the new group 'NDivia': http://pouet.net/prod.php?which=50006 [pouet.net] Note that the reflection on the chrome sphere rolling over the checkerboard during the 'Raytracing sux lol' scene is actually being raytraced in shader code.

Intel pushing ray tracing... is like Exxon ... (1)

syn1kk (1082305) | more than 6 years ago | (#23036488)

promoting a new car engine that everyone wants because it is so "good" (lots of horsepower, torque, etc.)... but the desirable "good" traits come at the price of fewer miles per gallon.

-----

Horrible metaphor aside, I am basically saying that it sounds very much like ray tracing is something Intel wants everyone to use... because by using it everyone will need faster computers... and the need for faster computers means everyone needs to buy more Intel products.

I guess my question is, wouldn't it be better to invest 5 years in current "rasterization" rather than 5 years in "ray tracing" ?

It seems like rasterization will get similar quality but require less processing!?

So why would you use a technology that requires more expensive hardware to do roughly the same thing you can already do with less expensive hardware?

simplicity wins (3, Insightful)

sgt scrub (869860) | more than 6 years ago | (#23036642)

Like all technology races, simplicity wins. If Intel provides tools that make it easier to develop ray tracing games, the GPU will be displaced.

Who needs ray tracing? (0)

Anonymous Coward | more than 6 years ago | (#23036876)

When you can have graphics like this? [gladiatus.us]

Raster engines == big $$ (0)

Anonymous Coward | more than 6 years ago | (#23036880)

Keep in mind that companies selling graphics engines (such as Crytek) have a vested interest in maintaining the complexity that is associated with raster graphics. A move toward ray-traced games would help level the playing field with regard to visual quality and render (no pun intended) the R&D that companies such as Crytek and Epic have put into their massively expensive engines somewhat null and void.

If I was Epic and I could sell an engine for 750k a pop, I certainly would have a vested interest in maintaining the status quo.
 

STOP THE PRESSES! (0)

skia (100784) | more than 6 years ago | (#23036932)

Wait, so developers who have decades of experience in rasterized graphics are speaking out against a technology that would render said experience obsolete? How can this be?!

Perhaps OT (2, Interesting)

jjohnson (62583) | more than 6 years ago | (#23037292)

But how much better do game graphics need to be?

I played the Crysis demo on a recent graphics card, and was suitably impressed for ten minutes. After that, it was the same old boring FPS that I stopped playing five years ago. Graphics seem stuck on the steep slope of the uncanny valley, where incremental improvements in rendering add nothing to the image except to heighten that sense of 'almost there' that signals to the brain that it's *not* photorealism.

This isn't meant to be the same old "it's the gameplay, stupid" rant that we get here. It's simply to question why any real work is being done on rendering engines when we seem to have long since passed the point of diminishing returns.

A word about raytracing purism. (4, Interesting)

SilentBob0727 (974090) | more than 6 years ago | (#23037800)

Personally, I'd love to see realtime raytracing see the light of day because I recognize the math behind it as more "pure" than rasterization. Of course there are several algorithmic hurdles standing in its way, unless you start to hyperparallelize the operations in a dedicated GPU, and even then there are obstacles; in the worst cases, a ray can bounce along an indefinitely long path, splitting into multiple rays as it goes, leading to infinitely branched recursion until some heuristic or another cuts it short. And as we all know, "heuristic" is a fancy word for "cheat".
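To make the "heuristic is a cheat" point concrete, here's a toy sketch (not any real engine's code; the material numbers are made up) of why that branching recursion has to be capped: every hit can spawn both a reflected and a refracted ray, so the ray count roughly doubles per bounce until a depth limit says stop.

#include <cstdio>

// Deliberately tiny stand-in for a scene: one semi-reflective, semi-transparent
// "surface" that every ray hits. The point is not the geometry but the recursion:
// each hit spawns a reflected ray AND a refracted ray, so the call tree branches
// until the depth heuristic cuts it off.
struct Material {
    float reflectivity;
    float transparency;
};

static int raysTraced = 0;

float trace(const Material& m, int depth, int maxDepth) {
    ++raysTraced;
    float local = 0.2f;                        // pretend local (direct) lighting contribution
    if (depth >= maxDepth)
        return local;                          // the "heuristic": stop branching here
    float result = local;
    if (m.reflectivity > 0.0f)                 // branch 1: mirror bounce
        result += m.reflectivity * trace(m, depth + 1, maxDepth);
    if (m.transparency > 0.0f)                 // branch 2: refracted bounce
        result += m.transparency * trace(m, depth + 1, maxDepth);
    return result;
}

int main() {
    Material glassyMirror = {0.5f, 0.4f};
    for (int maxDepth = 1; maxDepth <= 8; ++maxDepth) {
        raysTraced = 0;
        float c = trace(glassyMirror, 0, maxDepth);
        std::printf("maxDepth=%d -> %d rays, brightness %.3f\n", maxDepth, raysTraced, c);
    }
    return 0;
}

By maxDepth=8 this toy already traces 511 rays for a single primary ray, which is the whole argument for cutting the recursion short.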

Further, classic raytracing cannot handle advanced refraction and reflection effects, such as the caustics where the surface of water casts uneven illumination on the bottom of a pool, or the color bleeding where a bright red ball casts a red tint onto a white piece of paper, without preemptive "photon mapping", which is another cheat.

In short, we have not been able to improve upon the original raytracing algorithms without "cheating reality". Modern raytracing that includes photon mapping is a hybrid anyway. So the raytracing purists really have nothing to stand on until there's enough hardware to accurately calculate the paths of quadrillions of photons at high resolution sixty times a second. I'm not saying we won't get there; I'm saying probably not within this decade.

The reality is, the only advantage raytracing has over rasterization is its ability to compute reflection, refraction, and some atmospheric effects (e.g. a spotlight or a laser causing a visible halo in its path) with "physical" accuracy. The capabilities of rasterization have grown by leaps and bounds since the 1960s, roughly in proportion to available hardware.

Purists be damned. A hybrid that uses each technique for what it's good at (raytracing for reflection, refraction, and atmospheric halos; rasterization for drawing the physical objects; "photon mapping" for advanced reflection and refraction effects) is likely the best approach here.
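A rough sketch of what such a hybrid frame could look like (invented types and numbers, purely illustrative): rasterize primary visibility into a G-buffer, then spend the expensive ray work only on the pixels whose material actually needs it.

#include <cstdio>
#include <vector>

// All types and values here are invented for the example.
struct GBufferPixel {
    bool  reflective;   // would normally come from the material system
    float baseColor;    // stand-in for the rasterizer's shaded result
};

// Pretend rasterization pass: fill a tiny 4x4 "frame" with mostly diffuse pixels.
std::vector<GBufferPixel> rasterizePrimary() {
    std::vector<GBufferPixel> gbuf(16, GBufferPixel{false, 0.5f});
    gbuf[5].reflective  = true;   // a couple of chrome/water pixels
    gbuf[10].reflective = true;
    return gbuf;
}

// Pretend secondary-ray pass: only called where the G-buffer says it is needed.
float traceReflection(const GBufferPixel& p) {
    return p.reflective ? 0.3f : 0.0f;   // stand-in for a real recursive trace result
}

int main() {
    auto gbuf = rasterizePrimary();       // cheap, coherent work: the rasterizer's job
    int raysSpawned = 0;
    for (auto& p : gbuf) {
        if (p.reflective) {               // expensive, incoherent work: the raytracer's job
            p.baseColor += traceReflection(p);
            ++raysSpawned;
        }
    }
    std::printf("16 pixels rasterized, %d secondary rays traced\n", raysSpawned);
    return 0;
}

The rasterizer handles the coherent primary work it's good at, and the tracer only gets invoked for the handful of incoherent secondary rays.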

I, for one, welcome our hybrid renderers (0)

Anonymous Coward | more than 6 years ago | (#23038252)

All these rants against one or the other (raytracing/rasterization) are such a total waste of time that many people just ought to get a life. Oh wait.. this is Slashdot.

I wonder when people will start seeing the forest for the trees in this issue. First of all, it's an apples and oranges comparison. Second, most of the articles on game development sites pitting the two against each other don't pass the laugh test. Total garbage. I mean, you really need to fix the application domain and the relevant constraints to be able to make any kind of a reasonably objective judgment. Furthermore, you just need to do your homework on both and at least try to pretend that you're being objective. Even at that point, if you're not careful, you're still probably making an apples and oranges comparison.

Let us unite in the name of rendering. Let us bring together these gems of human intellect:

Rasterization and ray tracing!

Ray tracing meet rasterization. Rasterization meet ray tracing.

Raytracer: Nice to meet you, Mr. Rasterizer.

Rasterizer: Hello, Mr. Raytracer. Pleased to meet you.

Raytracer: Hey, you know what, I've been thinking that it would be a blast if you could trace my primary rays, Mr. Rasterizer? Oh sorry, forgive me.. Rasterize my primary rays.. Umm. Anyway, there's a lot of coherence between those rays there that I think you could really take advantage of. How does that sound?

Rasterizer: Yeah, sure, why not! I'd like to ask you a favor.. Would you be so kind as to trace some secondary rays for me.. You know.. they're really incoherent and I totally suck at that.. Like sampling the lighting integral.. I've heard stories about Final Gathering or some such. Anyway, there are all kinds of nice things that you and I can achieve together.

Raytracer: It's a deal, Mr. Rasterizer. Let us join forces and bring about a new era in uber-cool computer graphics!

-- Jani

That timeline sounds perfect... (1)

divisionbyzero (300681) | more than 6 years ago | (#23038284)

Larrabee won't be ready for primetime till 2010-2012.

Growth of technology. (1)

MaWeiTao (908546) | more than 6 years ago | (#23038328)

I don't see how a particular technology can be criticized based on today's limitations. It would be like someone in 1985 completely discrediting 3D because computers back then couldn't handle it. Why bother with 3D when 2D games provided a suitably entertaining experience?

While some of today's games certainly look impressive, they've still got a long way to go before they can be deemed realistic. Actually, I find photo-realism to be bland. It's kind of like photo-realistic paintings: certainly, the technique is extremely impressive, but ultimately, what's the point if the end result looks no different from a photograph?

I'll concede, however, that realism in gaming is a bit different. There is a big place for it in the future of gaming if for no other reason than to provide a holodeck-like experience.

That said, I don't think console and PC gaming are even on Pixar's level in terms of graphical sophistication. They're very good, but they don't yet come close in terms of animation, detail, textures or lighting.

I don't know all the technical details of ray tracing, but I'd say the big advantage would be how it affects production. Current games require a considerable amount of work to reproduce all kinds of visual effects. With ray tracing a developer merely has to designate a surface as reflective or drop a light somewhere in the scene; the hardware handles all the math and everything automatically comes out looking right.
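As a purely hypothetical illustration of that production difference (no real engine's API implied):

#include <cstdio>

// In a ray-traced pipeline, "make this surface a mirror" is roughly one material
// property, because the renderer traces the bounce for you:
struct RayTracedMaterial {
    float reflectivity;   // 0.0 = matte, 1.0 = perfect mirror
};

// In today's rasterized pipelines, a convincing mirror is hand-built from cheats, e.g.:
//   1. re-render the scene from a mirrored camera (or bake/update a cube map),
//   2. store the result in an off-screen reflection texture,
//   3. project that texture back onto the mirror surface in a shader,
//   4. special-case clipping, mirror-in-mirror recursion, and performance.
// Each effect (water, chrome, glass) tends to need its own variant of this work.

int main() {
    RayTracedMaterial chrome = {0.9f};
    std::printf("ray-traced mirror: one parameter, reflectivity = %.1f\n", chrome.reflectivity);
    return 0;
}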

Hrm... eight cores... in-order only processing... (1)

divisionbyzero (300681) | more than 6 years ago | (#23038348)

This sounds familiar... Oh yeah, it's the Cell. Of course, it won't be the Cell, but I think it competes more with that than with traditional GPUs.

Re:Hrm... eight cores... in-order only processing. (0)

Anonymous Coward | more than 6 years ago | (#23038652)

8 cores kind of sucks. Traditional GPUs already have >128 cores. By the time Intel's stuff comes out, GPUs will probably have >256 cores.