Wolfenstein Gets Ray Traced

Soulskill posted about 4 years ago | from the ach-mein-framen dept.

An anonymous reader writes "After showcasing Quake Wars: Ray Traced a few years ago, Intel is now showing their latest graphics research project using Wolfenstein game content. The new and cool special effects are actually displayed on a laptop using a cloud-based gaming approach with servers that have an Intel Knights Ferry card (many-core) inside. Their blog post has a video and screenshots."

I don't get it (3, Insightful)

Yuioup (452151) | about 4 years ago | (#33570568)

Why build a ray tracer into a fourth game after doing it for Q3, Q4, and ET:QW? Why don't they focus on improving the raytracing code already in the first three games?

I dunno, but it seems like they're keeping themselves busy for the sake of looking busy.

Y

Re:I don't get it (2, Interesting)

pieisgood (841871) | about 4 years ago | (#33570656)

Yeah, this project is simply here to validate itself.

I don't know if that's entirely true, though. Carmack talks of slowly integrating raytracing technology into videogames, and this research into raytracing in games could prove useful later in videogame development. As I understand it, most advancements in videogame visuals today are optimizations of old research. So I wouldn't rain on their parade entirely.

Re:I don't get it (4, Funny)

cupantae (1304123) | about 4 years ago | (#33571846)

I can't understand why they're not giving people what they want: ray traced Nethack.

Graphics cards are obsolete (3, Funny)

Orphis (1356561) | about 4 years ago | (#33570582)

Mom, can I buy a new cloud to play Halo 10?

Re:Graphics cards are obsolete (4, Funny)

Elektroschock (659467) | about 4 years ago | (#33570600)

Mein Leben!

Nothing to see here (4, Insightful)

dsavi (1540343) | about 4 years ago | (#33570602)

It's rendered in the cloud. If they had actually gotten more bang for the buck, i.e. made this run on conventional hardware, then I'd be interested. They're just doing something that has been done before, albeit maybe not in real time (but you never know, seeing these new OpenCL apps), running it on high-end servers, and piping it into a small laptop. I'm not sure how much of an achievement this is; we've all heard of gaming in the cloud before.

Re:Nothing to see here (1)

Dthief (1700318) | about 4 years ago | (#33570610)

Yeah... but that accent... doesn't it make you want to buy the Brooklyn Bridge from him?

Re:Nothing to see here (1)

AHuxley (892839) | about 4 years ago | (#33570726)

Yes, an Intel box with many cores for $x00000 would be neat, but shipping it off to a cloud just seems so 'done'.
Give us anything that moves this tech forward, rather than just another Intel "demo".

Re:Nothing to see here (1)

kramulous (977841) | about 4 years ago | (#33570846)

How long before that kind of power is available on a single die? Assuming programmers program a CPU with the same mindset people apply to the GPU.

Re:Nothing to see here (1)

V!NCENT (1105021) | about 4 years ago | (#33571934)

According to some calculations I have done, based on Intel's roadmap and extrapolating graphs (yes, you may shoot me), a year ago it was about 4-5 years until the full package (ambient occlusion and all that) could run on a single Intel chip.

But looking at more recent data, I'd say 3 years before a very expensive desktop computer can render it. Keep in mind: rendering only, excluding fluid animations and all that!

Re:Nothing to see here (1)

V!NCENT (1105021) | about 4 years ago | (#33572016)

PS: this is also ignoring the new effects that will be introduced, and especially HD gaming. Think about 1024x800 res running at 24-30 fps.

Re:Nothing to see here (1)

Ecuador (740021) | about 4 years ago | (#33571096)

Exactly, using a bunch of servers to run a game on a laptop is neither impressive nor new.
Plus, the game looks nothing like Wolfenstein, which by the way used to run fine on my 386SX (no raytracing there, of course). Where are the narrow grey or blue stone-walled corridors? And what is all that furniture doing in Castle Wolfenstein?

Re:Nothing to see here (1)

xtracto (837672) | about 4 years ago | (#33571250)

Exactly, using a bunch of servers to run a game on a laptop is neither impressive nor new.
Plus, the game looks nothing like Wolfenstein, which by the way used to run fine on my 386SX (no raytracing there, of course). Where are the narrow grey or blue stone-walled corridors? And what is all that furniture doing in Castle Wolfenstein?

Mmm, nope, you are wrong. You are thinking of Wolfenstein 3D [wikipedia.org]; the game they are presenting is indeed called Wolfenstein [wikipedia.org].

Re:Nothing to see here (1)

Ecuador (740021) | about 4 years ago | (#33571290)

That was supposed to be a joke...

Joke? Puhleeze. (3, Funny)

RulerOf (975607) | about 4 years ago | (#33571438)

My 486 ray-traced perfectly. I don't understand why we're using processing power to show glass reflections in ray-traced sniper scopes when all the old monitors showed the reflections of people approaching from behind already!

Stupid matte LCD panels.

You were supposed to woosh him ;)

Re:Joke? Puhleeze. (1)

mehrotra.akash (1539473) | about 4 years ago | (#33571718)

My 486 ray-traced perfectly. I don't understand why we're using processing power to show glass reflections in ray-traced sniper scopes when all the old monitors showed the reflections of people approaching from behind already!

Stupid matte LCD panels.

So that's why most laptops come with glossy screens... :)

Re:Joke? Puhleeze. (1)

V!NCENT (1105021) | about 4 years ago | (#33571972)

Wolf3D was ray casting, not ray tracing! ;)

Other than that: whooooosh!

So many (1)

sych (526355) | about 4 years ago | (#33570658)

So... many... triangles!

Re:So many (2, Funny)

BodeNGE (1664379) | about 4 years ago | (#33570672)

Oh my god, ... it's full of triangles!

Re:So many (1)

ciderbrew (1860166) | about 4 years ago | (#33570960)

A double triangle, what does it mean?

Re:So many (1, Funny)

Anonymous Coward | about 4 years ago | (#33571384)

it means newfags can't triforce.

Two triangles is a Square, not a failed Triforce. (0)

Anonymous Coward | about 4 years ago | (#33571978)

nt.

Re:So many (1)

RulerOf (975607) | about 4 years ago | (#33571450)

A double triangle, what does it mean?

Fucking triangles! How do they work!?!?!?

A bag of triangles?? I've got so many questions!

Re:So many (1)

Deus.1.01 (946808) | about 4 years ago | (#33571018)

That aint normal....OH GOD all the NORMALS!

Re:So many (0)

Anonymous Coward | about 4 years ago | (#33571598)

no, there aren't. that's the whole point of it, duh.

Re:So many (3, Insightful)

Rockoon (1252108) | about 4 years ago | (#33571752)

This is the true advantage of raytracing. A rasterizer would have to deal with each and every triangle in that chandelier.

Rasterizers scale at O(triangles) while raytracers scale at O(pixels * log triangles). I don't remember if it was Microsoft Research or something out of Intel, but 5 or so years ago they did some scalability testing and concluded that about 1 million polygons was the sweet spot where raytracing and rasterization were about equal in efficiency, using the per-iteration constants derived in their testing.

This was based on visible geometry only, so there's no pretending that rasterizers being able to use logarithmic data structures for hidden surface removal makes any bit of difference.

Since then, triangle counts have remained about the same in games (with more per-pixel processing being done to simulate more geometry), but the number of pixels has quadrupled as higher and higher resolution displays have become common. Yet they are reaching the limits of the fakes that can be done with shaders, and resolution is probably not going to go through another quadrupling, so raytracing really is coming... just not quite yet.

When the polygon counts do get high enough, there will be no looking back. Raytracing will be here to stay after that because of the way it scales. At 1 million polygons, a raytracer spends 20 iterations per ray cast using a logarithmic structure; doubling the number of polygons to 2 million only adds 1 more iteration, or about 5% more processing power required, and doubling again only adds another ~4.5%, and so on. Meanwhile, each doubling of polygons on the rasterizer literally doubles the processing power required.
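
A back-of-the-envelope sketch of those numbers (a toy illustration, not anything from Intel's demo; the absolute costs are meaningless, only the growth rates matter):

```python
import math

# Toy cost models for the scaling argument above.
def rasterizer_cost(triangles):
    return triangles                      # O(triangles): every visible triangle is visited

def raytracer_cost(pixels, triangles):
    return pixels * math.log2(triangles)  # O(pixels * log triangles): one tree descent per ray

pixels = 1280 * 720
for tris in (1_000_000, 2_000_000, 1_000_000_000):
    print(f"{tris:>13,} triangles: rasterizer ~{rasterizer_cost(tris):.2e}, "
          f"raytracer ~{raytracer_cost(pixels, tris):.2e} "
          f"(~{math.log2(tris):.0f} iterations/ray)")

# 1M -> 2M triangles doubles the rasterizer's work but adds only ~1 iteration
# per ray (~5%) for the raytracer; 1M -> 1B goes from ~20 to ~30 iterations,
# i.e. a thousand-fold geometry increase costs only ~50% more.
```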

Re:So many (1)

grumbel (592662) | about 4 years ago | (#33572096)

A rasterizer would have to deal with each and every triangle in that chandelier.

Or it could just do LOD with a geometry shader on the GPU.

When the polygon counts do get high enough, there will be no looking back.

The problem is that you don't just want high polygon counts, but high polygon counts for dynamic objects. And ray tracing itself has its strength in static objects; as soon as stuff moves and deforms, ray tracing runs into quite a few issues, not necessarily unsolvable ones, but that demo was rather lacking in that aspect, as the particle system looked like complete garbage compared to today's games.

Or to put it another way: the job of a tech demo is to blow me away, not to present something that needs a server farm to run and looks worse than a $200 Xbox 360.

Re:So many (1)

Rockoon (1252108) | about 4 years ago | (#33572294)

Or it could just do LOD with a geometry shader on the GPU.

LOD doesn't absolve the scaling problem. Let's say you have a system set up for 25% LOD versions of the geometry, but then double the number of polygons in the geometry... well, you have also doubled the polygons in the LOD versions.

Your thinking only applies if "double polygons" = "double-sized world", whereas the study was on the visible geometry itself, which has nothing to do with the size of the world, and where "double polygons" = "double detail".

Game developers no longer worry about the number of polygons "in the world"; there are plenty of log n algorithms for ignoring that issue (BSPs, quadtrees, and a plethora of other common scene graphs). The issue they face is the number of polygons "on screen right now". There are few unexplored tricks left for faking more geometry than is actually there; pixel shaders are more than enough these days to do almost anything you can think of. The techniques are what they are, and each polygon must be visited and sent to the renderer (because it's on screen!), and that's why it's O(triangles) for rasterization.

The problem is that you don't just want high polygon counts, but high polygon counts for dynamic objects. And ray tracing itself has its strength in static objects; as soon as stuff moves and deforms, ray tracing runs into quite a few issues, not necessarily unsolvable ones, but that demo was rather lacking in that aspect, as the particle system looked like complete garbage compared to today's games.

They also didn't spend a hundred million making the game. Sure, raytracing has issues. I think the biggest one you missed is the much higher cost of anti-aliasing: for rasterizers, anti-aliasing techniques pretty much fall right out of the pipeline; raytracers either need to do super-sampling or actually implement a hybrid with cone-tracing for when a ray hits a polygon edge (which has its own issues with regard to reflections and refractions, where the size of the cone can grow to nearly 360 degrees after a few bounces).

In the end it is inevitable though, and it is precisely because of how things scale. For rasterizers, every doubling of visible geometry doubles the computational resources required, but for raytracers, increasing geometry can almost be shrugged off as inconsequential. To put this as clear as day: a raytracer spends 20 iterations per ray for a million polygons, and 30 iterations for a billion polygons. A thousand-fold increase in geometry only requires 50% more computational resources. Once we hit that sweet spot where rasterization and raytracing are about equal in performance, it becomes dumb not to start piling on as much geometry as memory permits.

Sign of the times... (5, Insightful)

rh2600 (530311) | about 4 years ago | (#33570694)

When a laptop packing a multi-GHz 64-bit CPU with gigs of RAM gets called a thin client...

Re:Sign of the times... (1)

aws4y (648874) | about 4 years ago | (#33570816)

It is thinner than the server cluster computing each frame at the back end of this canned demo.

Re:Sign of the times... (1)

RulerOf (975607) | about 4 years ago | (#33571460)

It is thinner than the server cluster computing each frame at the back end of this canned demo.

Not only is it thinner, but it's Intel certified not to squash your nuts when used on your lap, unlike the rackmount server.

...I learned that the hard way.

The rackmount servers do tend to run cooler, though, so if you're not terribly attached to your nuts....

Re:Sign of the times... (1)

L4t3r4lu5 (1216702) | about 4 years ago | (#33570898)

It is a thin client. All it's doing is running the client software to accept the pre-rendered feed. It does nothing but hold a high-speed network connection and display rendered frames.

It's their fault for using such a high-powered bit of kit, but if it's doing no processing of its own it's still just a thin client, albeit an extremely expensive one.

Re:Sign of the times... (1)

drinkypoo (153816) | about 4 years ago | (#33571800)

It's their fault for using such a high-powered bit of kit, but if it's doing no processing of its own it's still just a thin client, albeit an extremely expensive one.

Seems more like a thick client anyway...

I doubt they could have done it without the bandwidth that the newer hardware affords. Intel has traditionally been starved for bandwidth of all types; not so now.

Re:Sign of the times... (1)

Sulphur (1548251) | about 4 years ago | (#33571612)

That is dual-core, you insensitive clod.

Poor ray tracing (2, Interesting)

DarwinSurvivor (1752106) | about 4 years ago | (#33570716)

Their ray tracer has a few issues.
-The player does not appear in the scope reflection (but his shadow does).
-The people's shadows are cast in a different direction than the car's.

Re:Poor ray tracing (3, Informative)

Purity Of Essence (1007601) | about 4 years ago | (#33570778)

1. It's extremely common in FPS games for the player model to be excluded from the player perspective. It really complicates things and usually doesn't look good without a lot of extra work.

2. That's not the car's shadow. The building shadow is the shadow you are seeing. You can't see the car's shadow because the car is mostly (if not entirely) shadowed by the building behind it. The viewing angles were not suited for showing a shadow cast by any directly illuminated portion of the car.

Re:Poor ray tracing (2, Informative)

Anonymous Coward | about 4 years ago | (#33570836)

You are right, the player model is often excluded, but that isn't really necessary. id Software especially has been known to show the player model's shadows and reflections (including mirrors) since Doom 3.
And if you really want a game with not only a visible player model but actually pretty good player animation and physics, you should try Dark Messiah of Might and Magic.

Re:Poor ray tracing (1)

noodler (724788) | about 4 years ago | (#33570914)

Talking about shadows: in the indoor scenes, specifically the one with the nine monitors, there should be loads of shadows underneath the desk, especially because there are several lamps above it. There are, however, no shadows, and the lamps don't seem to be lighting anything.

So I'm pretty curious about their definition of ray tracing.

Re:Poor ray tracing (3, Insightful)

ciderbrew (1860166) | about 4 years ago | (#33571032)

This sounds like a John Lasseter I saw ages ago. Those guys are scientists, not 3D artists. They can't see why it's wrong; it's job done when the maths work. I've no idea why they don't hire in a guy; most of these problems were identified and fixed in the pre-rendered market years ago. Maybe extra lights kill the frame rate too much.

The worst example of 3D I've seen so far would be the "shadows on a mirror" trick - nice.

Re:Poor ray tracing (1)

ciderbrew (1860166) | about 4 years ago | (#33571286)

SORRY - "This sounds like a John Lasseter interview I saw ages ago"

What's the point? (2, Insightful)

pacinpm (631330) | about 4 years ago | (#33570732)

I know they just started, but still... what is the point of this? There are no upsides to this kind of rendering. It's slower (you need 4 servers) and it looks worse (they had no antialiasing, ugly smoke, no complex lighting). You can do some things like reflections, refractions, and portals a bit easier than with other methods, but most of the time you don't need 100% correct reflections/refractions (simplified models work quite nicely), and security cameras were implemented in Duke Nukem 3D on i486 machines without problems.

Other than selling Intel chips, I see no purpose for this project.

Re:What's the point? (4, Insightful)

retroStick (1040570) | about 4 years ago | (#33570790)

As someone who has dabbled with raytracing before, I would have to agree. It's an interesting tech demo of something that's possible, but not really of practical use. For instance, they showed the chandelier with a million polys; that's all well and good, but it's on the ceiling! If the game were actually being played, the player would never get close enough to see those clever refractions. (And even if they did, the demo shows the frame rate would drop to around 17-20 FPS.)

Re:What's the point? (1)

hairyfeet (841228) | about 4 years ago | (#33571956)

Not to mention, who is actually gonna use this thing? Bandwidth ain't cheap for most folks, and uncapped connections are becoming a thing of the past. Finally, you have the fact that a good 90%+ of games are either made for the consoles first or are designed to be "multiplatform", which means the $36 HD4650 I bought over a year ago plays just about every game out there at my LCD's native 1600x900, thanks to the consoles being so behind the curve.

So other than spending a whole bunch of money and man-hours so Intel can go "ooohh, shiny", I fail to see any real-world application for this. Film designers aren't gonna give a shit about real time, only quality; consumers can't afford the bandwidth this hog would suck (it would probably be a slideshow on the average cable or DSL line); and game designers are building for Xbox 360-level graphics, which makes this right out. Am I missing something?

Re:What's the point? (1)

gmthor (1150907) | about 4 years ago | (#33570796)

From what I've heard, raytracing scales better than rasterisation; in other words, O(raytracing) ⊂ O(rasterisation). Obviously, raytracing has really bad constant factors.

Re:What's the point? (1)

MichaelSmith (789609) | about 4 years ago | (#33571202)

It should scale well for multiple clients, particularly where surfaces are not perfect optical reflectors. If every surface scatters, then each client only requires tracing for the last leg of every ray.

Re:What's the point? (3, Informative)

TheRaven64 (641858) | about 4 years ago | (#33571622)

Not quite. The complexity of rasterisation is (very) roughly O(number of polygons * number of lights). The complexity of ray tracing is O(number of rays). The number of primary rays is the number of pixels (sometimes multiplied by 4 or 9). The number of secondary rays depends on the number of lights (you fire a ray into the scene and then a secondary ray from what it hits to each light). This means that increasing the complexity of the scene does not affect the ray tracing time very much, but increasing the resolution does. On the plus side, ray tracing gives you shadows and reflections for free. It also degrades more gracefully - you can get a lower quality scene quickly (just from one primary ray per pixel) and then add the details from secondary rays and extra rays if the user doesn't move. In contrast, rasterisation tends to just lower the frame rate.
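
A quick sketch of that arithmetic (assuming one shadow ray per light per primary hit and no reflection bounces, which is what the parent's model implies):

```python
# Hedged back-of-the-envelope ray counts for the model described above.
def ray_count(width, height, lights, samples_per_pixel=1):
    primary = width * height * samples_per_pixel
    shadow = primary * lights        # one secondary ray per light per hit
    return primary, shadow

for spp in (1, 4, 9):
    p, s = ray_count(1280, 720, lights=4, samples_per_pixel=spp)
    print(f"{spp} samples/pixel: {p:,} primary + {s:,} shadow = {p + s:,} rays/frame")
```

With 4 samples per pixel and 4 lights that is already ~18M rays per frame; note that the scene's polygon count never appears in the formula, which is the point.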

Re:What's the point? (1)

ByteSlicer (735276) | about 4 years ago | (#33572224)

That's all nice in theory. But in practice, the raster version runs on my desktop at 200+ FPS, while the ray-traced version runs at 0.1 (?) FPS.
Probably one day the chipsets will be powerful enough to do this on a single machine, but even then ray-tracing doesn't give a full render solution. You'd need a global illumination implementation (radiosity, photon mapping) to get realistic (secondary) shadows/highlights.

Re:What's the point? (1)

91degrees (207121) | about 4 years ago | (#33570854)

This does strike me as a bit of a problem with ray tracing in general. Purists like it because it is an accurate model of the optical system, but while it does have a number of inherent features that you get basically for free, it's slow, and a lot of the time you can get better results by faking it with rasterisation.

Re:What's the point? (3, Interesting)

leuk_he (194174) | about 4 years ago | (#33570948)

There is no point now. But in 10 years (maybe sooner), CPU speeds will have increased to the point that you don't need a high-performance cluster. It would be nice if, at that moment, you could run a game in full detail without an advanced GPU.

If you only start researching raytracing once the hardware is cheap enough, you are too late.

And as for quality: the fun of a game has little to do with graphics quality. But it has to advance, or else we would still be looking at Pong-like graphics. People buy 1080p TVs at sizes where it is almost impossible to see the difference from 720p, but they still want the best quality.

PS: when they speak of Wolfenstein, I still think of the 1992 prequel to Doom that was playable on a 286.

Project Offset (4, Interesting)

nacturation (646836) | about 4 years ago | (#33570748)

Anybody know what happened to http://www.projectoffset.com/ [projectoffset.com]? They released tons of killer videos showing an amazing game concept and outstanding real-time effects [youtube.com]... then Intel bought them and... nothing!

Re:Project Offset (1)

Nagrom (1233532) | about 4 years ago | (#33570788)

I'm seeing a fairly generic fantasy world and a bunch of nice rendering techniques that by now have pretty much all appeared in released games, on consoles even. Am I missing something?

Re:Project Offset (1)

nacturation (646836) | about 4 years ago | (#33570860)

I'm seeing a fairly generic fantasy world and a bunch of nice rendering techniques that by now have pretty much all appeared in released games, on consoles even. Am I missing something?

Keep in mind that these demo videos were released in 2005 by a three person team working out of an apartment. It was met with pretty much universal acclaim [webcitation.org] back then and still holds up extremely well against any engine today.

Re:Project Offset (1)

Squapper (787068) | about 4 years ago | (#33570938)

Yes, they had some impressive tech and good artists, but just about every experienced game developer in the field (including me) realized that super-ambitious projects started by a handful of indies in a basement rarely make it to the shelves nowadays.

If you want to create a hit as an indie startup, you make something like Braid or Limbo.

Re:Project Offset (1)

Dekker3D (989692) | about 4 years ago | (#33570932)

Absolutely ridiculous triangle counts and shaders that most of us wouldn't even dream of. Of course, they could've made better use of all that, so that people would actually get enthusiastic about it.

Does it run on a Beowulfenstein Cluster? (5, Funny)

Hadlock (143607) | about 4 years ago | (#33570756)

Yeah, you're rendering Wolfenstein on a cluster... but can you get Wolfenstein running on a Beowulf cluster... or, dare I say it... a Beowulfenstein cluster???
 
;)

Re:Does it run on a Beowulfenstein Cluster? (1)

L4t3r4lu5 (1216702) | about 4 years ago | (#33570934)

With enough XML, probably.

Re:Does it run on a Beowulfenstein Cluster? (1)

arndawg (1468629) | about 4 years ago | (#33571034)

Add some SOAP and virtualization to the mix and you've got a synergy that will revolutionize the way you manage your Beowulf raytracing.

Ironic (3, Interesting)

Anonymous Coward | about 4 years ago | (#33570768)

That none of Intel's graphics processors have any hope in hell of real-time ray tracing.

fps counter lying? (2, Interesting)

citizenr (871508) | about 4 years ago | (#33570772)

The chandelier part displays 40 fps in the top right, but you can clearly see on the screen that it's more like 15. Not to mention the unimpressive difference between the RT and normal renderers. I was expecting something more lifelike.

Re:fps counter lying? (1, Informative)

Anonymous Coward | about 4 years ago | (#33571046)

You ARE seeing a slower framerate: YouTube does not play at 40 fps...

Re:fps counter lying? (1)

cheater512 (783349) | about 4 years ago | (#33572136)

That, and I assumed there was lag when manually altering the camera angle, whilst other things moving around were fluid.

Cloud gaming and latency (2, Insightful)

loufoque (1400831) | about 4 years ago | (#33570808)

The very idea of using the cloud to render a FPS is preposterous and will never work in practice, for obvious latency reasons.

Re:Cloud gaming and latency (2, Funny)

Purity Of Essence (1007601) | about 4 years ago | (#33570838)

Is that supposed to be ironic given the runaway success of the OnLive game service? http://www.onlive.com/ [onlive.com]

Re:Cloud gaming and latency (1, Informative)

Anonymous Coward | about 4 years ago | (#33570950)

Heh, you say runaway success and I haven't heard anything about OnLive in months.

Re:Cloud gaming and latency (2, Insightful)

Fross (83754) | about 4 years ago | (#33571342)

Do you have anything to back up that "runaway success" claim? As far as I can tell it's been shunned by hardcore gamers due to >100ms input lag, and I've not seen anything about it having huge take-up.

Re:Cloud gaming and latency (0)

Anonymous Coward | about 4 years ago | (#33571550)

If you use the niche group of hardcore gamers to define success then few things will be called successful.

Re:Cloud gaming and latency (2, Insightful)

Thanshin (1188877) | about 4 years ago | (#33570844)

The very idea of using the cloud to render a FPS is preposterous and will never work in practice, for obvious latency reasons.

How else will you start training for the moment when that computing capacity is in every PC?

You use the cloud, ignore the lag, and build an engine ready for the generation of computers that will come in five or ten years. You'll lose a lot of your research, but anyone who starts studying RT at that point will be years behind you.

Re:Cloud gaming and latency (1)

kramulous (977841) | about 4 years ago | (#33570896)

Forward thinking? That's craziness. Where on earth did you get such a crappy idea?

I wonder whether 5 years out is a little too far before this compute power hits the consumer.

Re:Cloud gaming and latency (2, Insightful)

tibit (1762298) | about 4 years ago | (#33572362)

I don't know that latency is any sort of a problem. You're talking about a LAN connection. This technology is not meant to render stuff somewhere out there on the intertubes; it needs to be in the same building, or on the same campus.

The surveillance station. ... (1)

malakai (136531) | about 4 years ago | (#33570824)

The surveillance station.
At a wall in the game you see twelve screens that each show a different location of the level. This can be used by the player to get a tactical gaming advantage. Have you ever seen something similar in a current game? Again - probably not

Someone doesn't play many games. Many 3D engines, for well over 10 years, have had some means of rendering to a texture and throwing it up on a poly in the game world. I'm going to say that hardware-accelerated means of doing this have been common for at least 6 years.

Re:The surveillance station. ... (4, Informative)

WhitetailKitten (866108) | about 4 years ago | (#33570974)

You wanna know the last game I played that featured this "surveillance camera" business?

Duke Nukem 3D


Ohhhh, snap!
/* OK, it was one monitor at a time, but that's arguably a tactical decision to not let the player see every camera at once */

Re:The surveillance station. ... (1)

ninjacheeseburger (1330559) | about 4 years ago | (#33571084)

Garry's Mod for Half Life 2 allowed you to build your own surveillance station and place/move the cameras where you wanted. You could also change the texture of an object to show the view from the camera.

Re:The surveillance station. ... (1)

WhitetailKitten (866108) | about 4 years ago | (#33571220)

Garry's Mod for Half Life 2 allowed you to build your own surveillance station and place/move the cameras where you wanted. You could also change the texture of an object to show the view from the camera.

Granted. I said the last game I played, not the last game. I'm sure Call of Honor Arena 2009, or whatever the kids are playing these days, has it. (My old geezer imitation isn't that good.)

Re:The surveillance station. ... (1)

daid303 (843777) | about 4 years ago | (#33571284)

You had to 'use' the monitor to view it. I think Unreal (or at least Unreal Tournament) was the first engine that managed to render a 3D view back to a texture and display it in-game. And that's more than 10 years old.

Re:The surveillance station. ... (0)

Anonymous Coward | about 4 years ago | (#33571866)

Not so; actually there was always a low-resolution view through the monitor, rendered in the game world.

Re:The surveillance station. ... (0)

Anonymous Coward | about 4 years ago | (#33571126)

I think the "probably not" is in reference to the twelve screens.

Two things struck me about this: (1)

pancakegeels (673199) | about 4 years ago | (#33570858)

1. There is no reason why a game would need a raytraced chandelier. But how good is raytracing at deformation and breakage? What happens if I shoot that rendered glass plate?

2. The reflections make sense as a tactical advantage, but the screen recursion doesn't even fit with reality. Screens have a resolution and cameras taint an image, so zooming into a screen shouldn't lead to a Portal-like infinity. Is it possible to set restraints/limits and add effects to situations like this?

Anyway, that was fun, if a little shiny. I love this kind of tech, but it's just that little step short of being a game changer for me.

Re:Two things struck me about this: (1)

leonem (700464) | about 4 years ago | (#33571204)

On point 1, a ray-tracer would probably not incur much additional rendering load shattering the glass (physics is another matter), so long as it remains the same number of polygons. There might be some additional calculation as multiple pieces of glass pass in front of each other, but most raytracing engines put an absolute limit on the number of ray interactions. This is generally not perceptible because, as you point out, screens have a resolution ;)
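
A toy illustration of that "absolute limit" (the scene/material API here is entirely hypothetical, just to show the shape of the idea, not any real engine's interface):

```python
# Depth-limited recursive tracing: stacked glass shards cost at most a
# bounded number of extra rays per pixel, because recursion stops here.
MAX_DEPTH = 8

def mix(a, b, t):
    """Linear blend of two RGB tuples."""
    return tuple((1 - t) * x + t * y for x, y in zip(a, b))

def trace(ray, scene, depth=0):
    if depth >= MAX_DEPTH:
        return (0.0, 0.0, 0.0)                 # give up: contribute nothing further
    hit = scene.intersect(ray)                 # hypothetical scene API
    if hit is None:
        return scene.background
    color = hit.material.shade(hit, scene)     # local/direct lighting
    if hit.material.transparency > 0:
        refracted = hit.refract(ray)           # hypothetical helper
        color = mix(color, trace(refracted, scene, depth + 1),
                    hit.material.transparency)
    return color
```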

Re:Two things struck me about this: (0)

Anonymous Coward | about 4 years ago | (#33571306)

The last time I read about ray tracing, they needed to keep the objects in a space-subdivision tree structure like a BSP to get the O(log n) scaling they like to advertise. The problem was that there was no easy way to update a BSP. Is there now a good solution for maintaining an efficient space-subdivision tree for a shattering million-polygon chandelier?

Re:Two things struck me about this: (1)

leonem (700464) | about 4 years ago | (#33571422)

Ah, no, that's a very good point. My experience is of rendering for video, where there is no penalty for breaking up an object. I'd therefore assume it's not using pre-calculated structures to speed up collision testing (and is therefore slower).
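
For what it's worth, one common answer to the grandparent's question (not necessarily what Intel's demo does) is to use a bounding volume hierarchy instead of a BSP and refit its boxes bottom-up each frame; the refit is O(n) and tolerates deforming geometry, at the cost of gradually degrading query quality. A minimal sketch, with the node layout and helpers assumed:

```python
# Hedged sketch: bottom-up AABB refit of a binary BVH whose triangles
# have moved since the tree was built. bounds_of() and union() are
# assumed helpers that compute/merge axis-aligned bounding boxes.
def refit(node):
    if node.is_leaf:
        node.bounds = bounds_of(node.triangles)  # recompute from moved vertices
    else:
        refit(node.left)
        refit(node.right)
        node.bounds = union(node.left.bounds, node.right.bounds)
    return node.bounds
```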

Re:Two things struck me about this: (1)

dbIII (701233) | about 4 years ago | (#33571572)

But how good is raytracing at deformation and breakage

It isn't, but good-looking brittle fracture shouldn't be too hard to model in other ways (simple finite-element arrays with a big mesh) if you don't care about true realism. Simple elastic and plastic deformation on a macro scale (e.g. bending a bar) shouldn't be that hard either, because you'd be dealing with much bigger elements (cubes, cylinders, etc.) than the polygons you have to worry about with lighting.

That's... Lovely. (4, Interesting)

L4t3r4lu5 (1216702) | about 4 years ago | (#33570878)

10fps to be able to see glass refraction on a surface so small it's totally inconsequential.

Yawn. Wake me up when they get refraction working at a playable framerate, like Source had seven years ago. Regarde [youtube.com]

Will we get Raytracing in the next 50 years? (1)

devent (1627873) | about 4 years ago | (#33570968)

Everybody agrees that ray tracing is just awesome, and I at least think it's the future of 3D computer graphics. But there is only one big dedicated 3D hardware vendor left; AMD is more a CPU vendor trying to get into the 3D market because Intel is too big in the CPU market, and Intel only has small on-board graphics chips. Will we see ray-tracing from Nvidia anytime soon?

I sure hope that Intel or AMD try to take over the 3D computer graphics market with their CPU know-how (ray-tracing mostly uses the CPU). But I really don't expect ray-tracing to be mainstream in laptops or desktops within the next 50 years. Nvidia and ATI are all focused on triangles, and ray-tracing is like a new beginning. http://caustic.com/ [caustic.com] is really a step in the right direction with their OpenRL SDK.

Re:Will we get Raytracing in the next 50 years? (1)

grumbel (592662) | about 4 years ago | (#33571962)

Everybody agrees that ray tracing is just awesome

Actually, no, the raytracing shown in that demo isn't awesome; it is rather primitive and ugly. You can render shiny spheres with it and static high-polygon objects, but basically nothing else.

The stuff that you need to make graphics look good is global illumination, and that demo had none of it. Today's games, on the other hand, are starting to get there: you can already find realtime ambient occlusion in some games, you can get soft shadows, and there have been tech demos even showing realtime photon mapping. And of course there is a lot of post-processing trickery, like crepuscular rays, that makes things look good.

At a certain polygon count raytracing may have advantages, and for much of the advanced global illumination stuff you also end up tracing rays in one form or another. But the demo demonstrated none of that; it was basically your grandfather's primitive ray tracing. Not exactly impressive by today's gaming standards, especially since it even failed to get basic stuff right, like the particle effects, which looked ugly by last-gen standards.

And yet the players still don't move like people (1)

VShael (62735) | about 4 years ago | (#33570994)

I want my kills to look hyper-realistic. And soon.

Re:And yet the players still don't move like peopl (1)

Jedi Alec (258881) | about 4 years ago | (#33571158)

Join the armed forces and ship off to Afghanistan? Doesn't get much more realistic than that...

Re:And yet the players still don't move like peopl (2, Funny)

daid303 (843777) | about 4 years ago | (#33571292)

Nah, the respawn time sucks in Afghanistan.

Re:And yet the players still don't move like peopl (1)

MichaelSmith (789609) | about 4 years ago | (#33571302)

Buddhism might be the go.

Submit your Resume (1)

stimpleton (732392) | about 4 years ago | (#33571022)

If the "Future of Graphics Rendering" was a job being advertised and potential candidates were asked to submit their Resume, then Intel's would be very thin.

The job is asking for 5 years experience, with a tertiary qualification, preferably post grad.

In Graphics, Intel has completed High School and done 2 years admin temping.

And yes, I am still bitter about the Intel i740 Graphics Card [wikipedia.org] . Intel are just great at the snowjobs, even suckering John Carmack in a very ancient .plan [floatingorigin.com] update:
"Good throughput, good fillrate, good quality, good features. A very competent chip. I wish intel great success with the 740. I think that it firmly establishes the baseline that other companies (especially the ones that didn’t even make this list) will be forced to come up to."

The reality turned out to be what this story will be - smoke and mirrors.

Hmmm (1)

DrXym (126579) | about 4 years ago | (#33571070)

It's interesting to see what a game looks like with raytracing, but I don't see any practical use for this tech until they can make it happen in a normal GPU.

The problem with ray tracing is that if you have a 1280x720 display then you're going to have to fire off at least 921,600 rays, which must be intersected with objects, and these in turn split into more rays as they reflect/refract around the scene. In a complex scene you may end up firing millions of rays. And I say "at least" because at 1 ray per pixel the picture quality will be awful: a ray might miss an edge completely, so you get weird ragged edges and patterns blinking in and out. The normal way to address ragged edges is to fire more rays per pixel, so you might end up firing 4, 5, or 6 rays per pixel, and you might jitter (randomize) them to minimize weird effects on patterns. Then if you want shadows and such not to look like shit you have to think about diffusion & radiosity. Then you have effects like fog, clouds, smoke, fire, etc. to worry about.

So you've possibly got to be rendering 5,000,000+ rays per frame in a highly complex scene and do so fast enough to deliver at least 30fps.

Done properly it would look awesome, but the calculation required to get acceptable results is enormous.
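
Spelled out as arithmetic (the multipliers below are the parent's rough guesses, not measurements):

```python
# Back-of-the-envelope ray budget for the scenario described above.
pixels            = 1280 * 720   # 921,600 primary rays minimum
samples_per_pixel = 4            # anti-aliasing via jittered supersampling
bounce_factor     = 1.5          # guessed overhead for reflection/refraction rays
fps               = 30

rays_per_frame = pixels * samples_per_pixel * bounce_factor
print(f"{rays_per_frame:,.0f} rays/frame -> {rays_per_frame * fps:,.0f} rays/second")
# ~5.5M rays/frame and ~166M rays/second -- squarely in the "5,000,000+" ballpark.
```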

cyuo Fail it (-1, Offtopic)

Anonymous Coward | about 4 years ago | (#33571118)

Are having troubLe coomitterbase and could save it

Ahh Youth (5, Insightful)

kenp2002 (545495) | about 4 years ago | (#33571532)

"The surveillance station. At a wall in the game you see twelve screens that each show a different location of the level. This can be used by the player to get a tactical gaming advantage. Have you ever seen something similiar in a current game? Again - probably not"

Yes, In Duke Nukem 3D... over 15 years ago. And again in a bout 40 other FPS games that followed including the Unreal series, more then a few Quake maps especially in capture and control maps.

"There is nothing more amusing to watch then some young kid discover something old and think it is new" - That quote in action.

Why don't they run it on GPU's (1)

Neil Boekend (1854906) | about 4 years ago | (#33571544)

Warning: I am not hindered by much knowledge in this area.

It seems to me GPUs, with their massive numbers of stream processors, would be quite suitable for this job. The separate rays do not interfere with each other but have the same operations applied to them: perfect for stream processing.
To get a 1280x720 display with 9 rays per pixel (to get the edges correct) working, you'd need nearly 9,000,000 rays each frame.
With 400 stream processors (a little over EUR 100: HD5670), you'd need every stream processor to process 22,500 rays each frame.
Assume every ray requires 1000 clock cycles.
Assume a frame rate of 40.
The clock speed then needs to be 1000 * 22,500 * 40 = 900,000,000 Hz => 900 MHz.
900 MHz is almost possible; the GPU I chose runs at 775 MHz. It's not an expensive one, and you can insert 2 of them for more stream processors.
With some optimization this number may fall; brute force is usually not the best way.
Where did I guess or think wrong? Would it already be possible for the cost of EUR 200?
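
For what it's worth, the arithmetic holds up; re-running it with the exact pixel count (a sketch using the parent's own assumptions) comes out even a little lower:

```python
# Parent's assumptions: 9 rays/pixel, 400 stream processors,
# 1000 cycles per ray, 40 frames per second.
rays_per_frame = 1280 * 720 * 9          # 8,294,400 (the parent rounds to ~9,000,000)
rays_per_sp    = rays_per_frame / 400    # ~20,736 rays per stream processor per frame
clock_hz       = rays_per_sp * 1000 * 40 # cycles/second needed per stream processor

print(f"{rays_per_frame:,} rays/frame -> {rays_per_sp:,.0f} rays/SP/frame "
      f"-> ~{clock_hz / 1e6:.0f} MHz required")
# ~829 MHz exact (900 MHz with the rounded 9M figure) vs. the HD5670's 775 MHz.
```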

Sigh (1)

ledow (319597) | about 4 years ago | (#33571560)

Multi-million-dollar graphics render farms and we still can't draw convincing fire or trees, or animate a human walking smoothly (even with motion capture you often "see the join" between one action and another).

Moore's Law is NOT a tool (1)

Carrion Creeper (673888) | about 4 years ago | (#33571602)

In the Knights Ferry link, Intel PR references Moore's Law as if it were some method for increasing processing power: "and use Moore's Law to scale to more than 50 Intel cores". Moore's Law is a prediction, not a design method. Sheesh.

Add that to the grammar mistake on the Aubrey Isle image, and you have some pretty bad PR for anyone paying attention.

Re:Moore's Law is NOT a tool (2, Insightful)

ksandom (718283) | about 4 years ago | (#33571658)

Moore's Law has become an expectation, and thus a design method from a marketing point of view. This is particularly visible in hard disks: they release a hard disk that has been designed to scale up but only contains a single platter, then a little over a year and a half later the same hard disk is released with a second platter. The expectation allows them to get ahead, while the previous iteration is slowly allowed to reach its full potential. Then they work on the next thing while the current platform grows.

Re:Moore's Law is NOT a tool (1)

Carrion Creeper (673888) | about 4 years ago | (#33572024)

My problem is more that the press release is putting Moore's Law, which is at best a product life-cycle methodology, on the same level as the 22nm process, which is an actual technology for making better/faster/smaller chips.

They put it there so that people who know nothing else can see something familiar in the introductory paragraph. The problem is that they are making the misinformed even more misinformed by using Moore's Law in the wrong context and confusing the issue. Nobody who knows better wants to have to explain misconceptions about Moore's Law just because Intel is putting out crappy press releases.

Watch the video on the blog (1)

ksandom (718283) | about 4 years ago | (#33571632)

It's awesome!