
How Quake Wars Met the Ray Tracer

timothy posted more than 5 years ago | from the cannot-break-the-laws-of-physics dept.

Graphics 158

An anonymous reader writes "Intel has released the article 'Quake Wars Gets Ray Traced' (PDF), which details the development efforts of the research team that applied a real-time ray tracer to Enemy Territory: Quake Wars. It describes the benefits and challenges of handling transparency textures with this rendering technology, and gives further insight into which special effects are most costly. Examples of glass and a 3D water implementation are shown. The outlook points toward freely programmable many-core processors, like Intel's upcoming Larrabee, that might be able to handle such a workload." We last mentioned the ray-traced Quake Wars in June; the PDF here delves into the implementation details, rather than just showing a demo, and explains which parts of the game cause the most difficulty in going from rasterization to ray tracing.


A Day in the Life of Debbie G1bs0n (0, Troll)

Reikk (534266) | more than 5 years ago | (#26604749)

A Day in the Life of Debbie G1bs0n

A silver tear rolled down Debbie's perfect cheek as she slowly lowered her
sleek young body into the white marble bathtub. When she was younger, a nice
hot bubble bath was all she needed to raise her spirits, but now it seemed that
nothing would calm her troubled soul. Life wasn't easy for the teenage singing
sensation. It seemed that no matter what she did, no one would take her work
seriously.

"Trite," the critics had called her last album. "Trite, cheesy and
sappy." Debbie shuddered and began to weep harder. These were her innermost
feelings they were poking fun at. If "Lost in Your Eyes" and "No More Rhyme"
weren't heartfelt reflections of the depth of the human soul - she didn't know
what was. And surely "Electric Youth" was the most inspirational song about
youthful potential since David Bowie's "Changes." But still her finest works
were ridiculed by those too emotionally and intellectually immature to fully
understand them.

But Debbie's musical career wasn't what was bothering her, and she knew it
all too well. Her real problem was that she could no longer go on ignoring the
feelings that were swelling inside her body. She was blossoming into woman-
hood, but could not realize her fantasies in fear of tarnishing her image as
the fresh, innocent pop starlet. It wasn't so much to preserve her career -
she knew in her heart of hearts that she could make it on her talent alone -
but she felt she owed it to her fans. She wanted to be a role model to young
girls, to tell them that it's cool to just say no to sex and drugs - to follow
their dreams and to be individuals. But at the same time, Debbie was finding
it harder and harder to resist the powerful desires coursing through her veins.

Yes, Debbie was a virgin, but it was more by circumstance than conscious
choice. She was curious, but didn't want to just hop into bed with the first
guy that came along. And since her busy career prevented any kind of real
romance from developing, it seemed that she was doomed to chastity forever. It
had been months since the last time she had been touched in a sexual manner. A
smile crept across her face while her mind replayed once again that delicious
evening.

She washed the tears from her face while her slender toes slipped around
the tiny chain on the rubber stopper in the tub. A gentle tug and the water
began slowly draining away. Debbie began gently caressing her taut young body
as the water lowered, exposing her soft flesh to the cool air. Bubbles
crackled and popped on the delicate surfaces of her small, pert breasts -
sending tingling pleasures from her tiny pink nipples to her moist womanhood.

"Kirk," she whispered to herself. "Oh... Kirk...."

To most people, Kirk Camer0n was just another television star. He played
Michael on the popular sitcom "Growing Pains" - a winsome youth with an
irresistible smile and a keen wit. But he was more than this to Debbie. Much
more.

By now the water had reached the floating curls of her soft blonde pubic
hair. Debbie ran her slender fingers through the tiny locks and remembered
that night at the Emmys.

By mere chance they had been seated next to each other. They talked a
little, mostly about being mobbed by hordes of twelve year old fans whenever
they went out in public. But while they spoke, Debbie could feel Kirk
undressing her with his eyes - tracing her curves and taking obvious glances at
her tight skirt. He had an air of hungry confidence about him, and she felt
desires welling up inside her that she had never felt before. The lights went
down in the room, and the ceremony began. Kirk took Debbie's hand and began
gently stoking it. Then he suddenly let go, and instead put his hand on her
knee. Slowly he began to move it up her leg, stroking and caressing her inner
thigh; making Debbie swoon in shameful anticipation.

Lying in the bathtub, Debbie's mind played over the delicious image of
Kirk gently slipping his fingers underneath her silk panties, his manicured
nails lightly grazing her swollen rosebud - all the while looking into her eyes
and coyly mocking her obvious passion. She pictured that face, those fingers,
penetrating over and over....

And then it boomed over the sound system, "And the winner for best actor
in a Family-Oriented Situation Comedy is... KIRK CAMER0N!"

Kirk removed his hand from Debbie's sopping underwear with admirable
swiftness, only a split second before the roaming cameras would whirl to meet
his ever-charming smile.

Debbie began thrashing about in the bathtub, shuddering violently with
orgasmic tears, but only a second after her muffled cries began to escape her
ruby lips - the wooden door into the room blew into a thousand pieces under the
force of a strategically-placed tactical plastique explosive.

Into the room jumped an unholy trinity of nefarious evildoers. The
central figure was a fully clad ninja warrior - armed with razor sharp
precision weapons and dressed in the black eelskin Shinomo garb that only
outfitted the assassins of kings. The ninja was flanked by a pair of Nazi
frogmen in gray-green wetsuits and flippers - each carrying a deadly speargun
whose purpose was all too obvious. On their chests was the unmistakable emblem
of Adolph Hitler's Third Reich. Without hesitation, the two frogmen advanced
while the figure in black stood back to survey the carnage. Debbie had the
sudden feeling that she might be in trouble.

What only Debbie's adoptive family and a handful of others knew, however,
was that this young nightingale was far from defenseless. When Debbie was only
a few months old, she and her natural family had been in a shipwreck - and
Debbie, the only survivor, washed up on the shores of a small uncharted isle
somewhere between the Fiji and Easter Islands. She was raised by wolves for
the first few years of her life, until she unwittingly came across the only
other human being on the island, an aging Shaulin Martial Arts Master named
Bruce who taught her the ways of man and the art of self defense. After ten
years of rigorous training, Debbie decided to once again rejoin the real world,
and fulfill her destiny as the best-loved pop starlet of all time. On a
makeshift outboard canoe, Debbie sailed to New York, where she was soon adopted
by a nice upper-middle class Protestant family, who introduced her to record
producer Fred Zarr - and the rest was history.

Debbie leapt from the tub in a flying summersault, barely avoiding a
forked spear that fiercely penetrated the four foot luffa only inches from
where her sinewy young form had just been. Even in mid-flight, she was able to
identify the deadly curare poison coating her opponents' barbed projectiles.
They were playing for keeps. She spun to meet the evil duo, and remembered the
words of her master... "The less effort expended, the more powerful the
connection." An indescribably graceful spinning crescent lunge kick underneath
the chin of her first opponent neatly severed his head and sent it flying into
the bidet.

She ducked a slice from the second frogman's nine-inch serrated hunting
knife, and with a deafening cry of "WAX ON!" she plunged her open hand through
the Swastika emblem on his chest - and with a similar yell of "WAX OFF!" she
withdrew his still-beating heart. As the body slumped to the floor, Debbie
whirled to meet the stoic gaze of the remaining figure in black.

"Who are you?" she cried, "And what do you want with me!? I broke a nail
on your lame-ass frogman's collarbone, and I'm really pissed off!"

"You have killed two of my finest warriors," intoned the ninja. "And as
you die, I want you to know who is killing you." The figure pulled off its
sinister hood, and out poured a cascade of fiery red hair.

It was T1ffany. Debbie's arch-rival in the musical netherworld of teenage
pop icons, and the very figure of evil incarnate. Her fans thought of her as a
quiet young girl with modest dreams of stardom, when in reality she was a
brazen harlot who would stop at nothing to have the whole of the music industry
under her wicked thumb.

"T1ffany!" cried Debbie. "I should have guessed!"

"You were expecting maybe Chuck Norris?" quipped back the red haired
vixen. "I mean, Chuck's pretty hard up - but he's got better things to do than
nail a prissy little WASP like you!"

"What are you doing here? What do you want with me?" screamed Debbie,
falling back into a defensive posture.

"You ruined my career! I was on the verge of creating a musical empire...
I'd taken the first few steps to establishing myself as the hottest young thing
around - when all of a sudden you came around singing those insipid little
ballads of yours and stealing my thunder! Next thing I knew, I found myself
classified and categorized as a flash-in-the-pan little tart like you."

"What?" gasped an amazed and unbelieving Debbie. "You honestly thought
you could make it big by covering Beatles' tunes for the rest of your life?
Not!"

"You untalented little blonde tease!"

"You plagiarizing red haired slut!"

"Slicing your throat open is too quick a death for you!" sneered T1ffany,
dropping her weapons' belt to the floor. "I'll crush you with my bare hands!!"
She let loose a double reverse snake punch aimed at Debbie's naked torso.

But Debbie was too fast for her and did a double backwards somersault to
the other end of the room. As T1ffany sped towards her, Debbie crouched down
and threw her lower body upwards for the little known Shaulin upside down
spinning helicopter kick for which there is no known defense - except, of
course, for the even lesser known Japanese flying supersonic blur-hand in which
T1ffany had been expertly schooled. The two clashed together in a tangle of
limbs and flesh, leaving them locked in a strangling embrace - pitting will
against will in a struggle to the death.

But as Debbie's hands closed around her opponent's neck, she found herself
mesmerized by the tender fierceness in her eyes. She suddenly remembered what
it was that she was doing before this rather startling interruption, and the
proximity of such a beautiful, healthy young body pressing against hers sent an
unexpected flash of heat through her loins. This took Debbie completely by
surprise. I mean - she shaved her legs and had long hair and everything - she
never dreamed that she might be a lesbian! But her body cared very little
about her mind's outdated ethics as she pressed her firm young bosom into
T1ffany's.

As she did so, both her and T1ffany's grip loosened, and their snarls of
anger transformed into faint moans of pleasure. Debbie found herself entranced
with the delicate lips of her opponent, and before she could stop herself she
was kissing them. For a moment it occurred to Debbie that T1ffany's acceptance
of this might be a ruse to get the upper hand - but then she felt a soft, warm
tongue slide into her mouth, and she knew she had a willing and eager partner.

"I wanted you so bad," whispered T1ffany between kisses. "So bad I wanted
to destroy you, because I didn't think I could ever have you."

"Mmmmmm..." replied Debbie. "I never thought it could be like this...."

T1ffany's hands roamed freely over Debbie's supple body, as Debbie neatly
removed her black ninja garb. Underneath she wore nothing, and Debbie swooned
as she uncovered a figure not unlike her own - save for a wild growth of fiery
red hair between her legs.

"I never believed you were a real redhead," quipped Debbie tenderly, as
she slowly kissed down her torso.

"That's OK," countered T1ffany, gingerly swinging her partner around into
a sixty-nine. "I never thought you were a real blonde."

Re:A Day in the Life of Debbie G1bs0n (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26604979)

I don't know what's worse... that someone took time to write that, or that I actually took time to read it. Lots of adjectives, not bad writing, for crap, I suppose. You should write some of those sappy novels I see in the grocery stores. I suppose I'll see this particular post about a billion more times over the next month or two on /. The Obama shit eating was getting quite stale. I really don't know why people do this... but whatever.

And I'm not just writing this because I've got a decent buzz ... ok, maybe I am.

Troll has been fed...

Re:A Day in the Life of Debbie G1bs0n (1)

Anonymous Coward | more than 5 years ago | (#26605297)

Hey, don't complain.

That's a classic Cult of the Dead Cow story, right from the start of the internet.

Re:A Day in the Life of Debbie G1bs0n (2, Informative)

Flentil (765056) | more than 5 years ago | (#26605119)

Best off-topic post I've seen today.

Re:A Day in the Life of Debbie G1bs0n (2, Funny)

Isauq (730660) | more than 5 years ago | (#26605311)

Indeed. I sort of actually started paying attention when I caught, "The ninja was flanked by a pair of Nazi frogmen...."

Re:A Day in the Life of Debbie G1bs0n (0)

Anonymous Coward | more than 5 years ago | (#26605335)

I loved the unexpected and unlikely plot twists! Thank you for giving giggles and erection at the same time!

Re:A Day in the Life of Debbie G1bs0n (0)

Anonymous Coward | more than 5 years ago | (#26606187)

Well. That was unexpected.

Astounding... (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26604755)

It looks like raytracing will finally be a viable rendering method for games. It may be problematic because
Yeah, check dis out, this is O.G.L.B, [photobucket.com]
Knowhatimsayin? Im on my little O.G. Warren G -
And he just droppin this to let you B.G.'s know
Whas happen, y'all got to recognize
Cause this is y'know a Long Beach Thang -
21st street, but check this out,
G. gonna go out there, knoamsayin'?
And handle that shit now? yeah.

Would have posted first, but... (4, Funny)

Anonymous Coward | more than 5 years ago | (#26604801)

I was using a raytracer.

Re:Would have posted first, but... (1)

masshuu (1260516) | more than 5 years ago | (#26604933)

Well, it just took me 20 minutes to do a little 800x600 picture. I don't have $45k to spend on a gaming comp atm, so I'll just stick with my World of Warcraft, which can run on 10-year-old comps.

Re:Would have posted first, but... (1)

rhyder128k (1051042) | more than 5 years ago | (#26605141)

Only 20 mins? What accelerator card does your Amiga have?

Re:Would have posted first, but... (1)

larpon (974081) | more than 5 years ago | (#26606377)

Must be the 68060

Re:Would have posted first, but... (1)

aliquis (678370) | more than 5 years ago | (#26606493)

Hey!! Wtf was that!?!?! :D

Don't shit where you live!

If only Essences coder did games =P

http://ada.untergrund.net/showdemo.php?demoid=189 [untergrund.net]
http://ada.untergrund.net/showdemo.php?demoid=437 [untergrund.net]
http://ada.untergrund.net/showdemo.php?demoid=386 [untergrund.net]
http://ada.untergrund.net/showdemo.php?demoid=428 [untergrund.net]

Not that Amiga was ever any good for 3D graphics but whatever :D

animation, bottlenecks, etc... (0, Redundant)

j1m+5n0w (749199) | more than 5 years ago | (#26604819)

Interesting article. A little light on details. (What renderer were they using? OpenRT? Something they wrote themselves? Is it based on Kd-trees? BVH? BIH?) Also, not much mention of animation. Re-sorting the geometry whenever objects move is a hard thing to do efficiently, though there has been a lot of recent research in this area.
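
For readers who haven't run into the acceleration structures named above, here is a minimal, hypothetical C++ sketch of a bounding volume hierarchy (BVH) node together with the "refit" pass many dynamic ray tracers use to cope with moving geometry without a full rebuild. All types and names are illustrative; nothing here is taken from the article or from any particular renderer.

#include <algorithm>
#include <cfloat>
#include <vector>

// Axis-aligned bounding box.
struct AABB {
    float min[3] = { FLT_MAX,  FLT_MAX,  FLT_MAX};
    float max[3] = {-FLT_MAX, -FLT_MAX, -FLT_MAX};
    void grow(const AABB& b) {
        for (int i = 0; i < 3; ++i) {
            min[i] = std::min(min[i], b.min[i]);
            max[i] = std::max(max[i], b.max[i]);
        }
    }
};

// One BVH node: leaves reference a range of triangles, inner nodes two children.
struct BVHNode {
    AABB bounds;
    int  left  = -1;                     // index of left child, -1 for a leaf
    int  right = -1;                     // index of right child
    int  firstTri = 0, triCount = 0;     // triangle range for leaves
};

// "Refit": keep the tree topology but recompute the bounds bottom-up after the
// triangles have moved. Much cheaper than a rebuild, at the price of looser
// bounds (and slower traversal) once objects have moved far from where the
// tree was originally built.
void refit(std::vector<BVHNode>& nodes, int nodeIdx,
           const std::vector<AABB>& triBounds, const std::vector<int>& triIndex) {
    BVHNode& n = nodes[nodeIdx];
    n.bounds = AABB{};
    if (n.left < 0) {                                   // leaf: union of its triangles
        for (int i = 0; i < n.triCount; ++i)
            n.bounds.grow(triBounds[triIndex[n.firstTri + i]]);
    } else {                                            // inner: union of the children
        refit(nodes, n.left,  triBounds, triIndex);
        refit(nodes, n.right, triBounds, triIndex);
        n.bounds.grow(nodes[n.left].bounds);
        n.bounds.grow(nodes[n.right].bounds);
    }
}

Refitting keeps per-ray traversal cheap for mostly rigid motion; heavily deforming geometry eventually forces a rebuild, which is the cost the parent post is alluding to.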

Re:animation, bottlenecks, etc... (5, Informative)

MichaelSmith (789609) | more than 5 years ago | (#26604833)

I interpreted this bit...

For this project, we started rewriting the renderer from ground zero. Because of this, the very first images from the renderer were not of typical ray-tracing caliber, but displayed only the basic parts of the geometry, without any shaders or textures

...to mean that they rolled their own.

Re:animation, bottlenecks, etc... (0, Redundant)

j1m+5n0w (749199) | more than 5 years ago | (#26604879)

Ah, I guess I skimmed the article a little too quickly.

Re:animation, bottlenecks, etc... (2)

XcepticZP (1331217) | more than 5 years ago | (#26605583)

Or not at all.

Re:animation, bottlenecks, etc... (0)

Anonymous Coward | more than 5 years ago | (#26607183)

Let's see. On the very first page, there is a bold heading, "STARTING FROM SCRATCH". The first sentence under that heading is "For this project, we started rewriting the renderer from ground zero."

If you skimmed it so quickly as to miss that, I dare say your skim-fu is very FU.

Re:animation, bottlenecks, etc... (1)

V!NCENT (1105021) | more than 5 years ago | (#26609295)

Well, Crysis (Crytek) has its own renderer, yet it calls Direct3D. Same goes for ET:QW RT; Intel made a ray tracing renderer from the ground up that calls the OpenRT lib.

Hrmm (5, Funny)

acehole (174372) | more than 5 years ago | (#26604821)

So when can I buy the CPU/Vid card that can do raytracing, heat my house, cook food off and pipe extra heat out for a steamhouse?

Re:Hrmm (2, Funny)

atomicthumbs (824207) | more than 5 years ago | (#26604903)

Right here! [nvidia.com]

Re:Hrmm (2, Funny)

Spy Hunter (317220) | more than 5 years ago | (#26604909)

2010. [wikipedia.org]

Re:Hrmm (4, Informative)

j1m+5n0w (749199) | more than 5 years ago | (#26604923)

Quoting wikipedia: "Intel planned to have engineering samples of Larrabee ready by the end of 2008, with a video card featuring Larrabee hitting shelves in late 2009 or early 2010."

Of course, it's always possible that AMD or Nvidia could beat Intel to market with a ray-tracing friendly GPU, but it doesn't seem likely that they'll bet the farm on a technology that isn't well-established.

If you want to play a software ray-traced game right now (or you just want to heat your house for the winter), you might want to look at Outbound or Let There be Light, which are both open-source games (though they run on Windows) built on top of Arauna. Gameplay is not really up to par with commercial games, but as a technology demo they're quite impressive. Framerates are tolerable on reasonably modern CPUs.

raytracing is VERY established (5, Informative)

CarpetShark (865376) | more than 5 years ago | (#26605287)

AMD or Nvidia could beat Intel to market with a ray-tracing friendly GPU, but it doesn't seem likely that they'll bet the farm on a technology that isn't well-established.

What? Not well-established? Raytracing is probably one of the most established graphics technologies. Specifically, it's been "coming to games" for years; it was only a matter of time. In fact, I don't really know why they're making such a big deal out of it here, since I'm pretty sure I read that the original Quake (or was it Doom?) traced a ray or two for some mapping reason, back when the source code was released.

Raytracing has mostly been replaced with other, faster technologies these days, which produce similar results, so it's not the panacea it seemed back when you had 5-bit hand-drawn stuff OR raytracing.

None of which is to belittle the work done on this game, because it does look nice, and improves on the graphics of the games before. But so do most games. Wake me up when town characters have emotions based on that guy you killed last week who rebuilt the clock tower because you suggested it back when you weren't so torn up about your wife dying.

Re:raytracing is VERY established (2, Interesting)

91degrees (207121) | more than 5 years ago | (#26605345)

since I'm pretty sure I read that the original quake (or was it doom?) traced a ray or two for some mapping reason, back when the source code was released.

Not sure about those two, but I'm pretty certain Wolfenstein 3D did. That was for visibility and texture coordinate calculation, rather than light and shadow. Since the map was 2D only a handful of rays were needed.

Re:raytracing is VERY established (1)

CarpetShark (865376) | more than 5 years ago | (#26605363)

Sounds like the same system, yes. I'm pretty sure it was back with Doom 1, now that I think about it more.

Re:raytracing is VERY established (3, Informative)

Narishma (822073) | more than 5 years ago | (#26605703)

That's ray casting, not ray tracing. Two different things.

Re:raytracing is VERY established (1)

V!NCENT (1105021) | more than 5 years ago | (#26609347)

Wolfenstein used ray casting and Doom used affine texture mapping.

Re:raytracing is VERY established (4, Interesting)

TheBracket (307388) | more than 5 years ago | (#26607635)

Wolf3D used raycasting, rather than ray tracing, to give a pseudo-3D rendering of what was basically a 2D grid map.

It's pretty clever how it worked; I remember having a LOT of fun cooking up my own similar renderer back in the day (Turbo Pascal with inline asm was fun!). If I remember rightly:
First, the ceiling and floor were drawn in, covering everything (intersecting in the middle, vertically). Then, they took your location on the map, and cast a ray for each column of pixels (320 of them, I believe). This ray went forward until it intersected a wall - and the distance to the wall was measured. It then did a quick calculation (lookup table) to determine the height of the wall at that distance, subtracted half that height from the center of the screen, and plotted a vertical line in the color of the wall. I seem to remember the wall color was retrieved from a small texture and scaled.
That gives surprisingly good results, albeit with no lighting or shading.
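
Purely as an illustration of the column-per-ray scheme described above, here is a small, self-contained C++ sketch. This is not id Software's code; the tiny 8x8 map, the fixed-step ray march, and the 320x200 Mode 13h-style resolution are assumptions made for the example.

#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

const int SCREEN_W = 320, SCREEN_H = 200;   // assumed Mode 13h-style resolution
const int MAP_W = 8, MAP_H = 8;

// Tiny grid map: non-zero cells are walls, the value doubles as a color index.
static const uint8_t worldMap[MAP_H][MAP_W] = {
    {1,1,1,1,1,1,1,1},
    {1,0,0,0,0,0,0,1},
    {1,0,0,2,0,0,0,1},
    {1,0,0,0,0,0,0,1},
    {1,0,3,0,0,0,0,1},
    {1,0,0,0,0,0,0,1},
    {1,0,0,0,0,0,0,1},
    {1,1,1,1,1,1,1,1},
};

// Stand-in for the real "plot a vertical wall slice" routine.
static void drawColumn(int x, int top, int height, uint8_t color) {
    std::printf("col %3d: top=%4d height=%4d color=%d\n", x, top, height, color);
}

// Cast one ray per screen column and draw a vertical wall slice for it.
void renderFrame(float px, float py, float angle, float fov) {
    for (int x = 0; x < SCREEN_W; ++x) {
        // Ray direction for this column.
        float a = angle - fov * 0.5f + fov * (x + 0.5f) / SCREEN_W;
        float dx = std::cos(a), dy = std::sin(a);

        // March the ray in small steps until it enters a wall cell.
        float t = 0.0f;
        uint8_t color = 0;
        while (t < 16.0f) {
            t += 0.01f;
            int mx = int(px + dx * t), my = int(py + dy * t);
            if (mx < 0 || my < 0 || mx >= MAP_W || my >= MAP_H) break;
            if (worldMap[my][mx]) { color = worldMap[my][mx]; break; }
        }

        // Use the perpendicular distance to avoid the classic fish-eye distortion,
        // then scale the wall slice inversely with distance.
        float dist = std::max(t * std::cos(a - angle), 0.01f);
        int height = std::min(int(SCREEN_H / dist), SCREEN_H);
        drawColumn(x, (SCREEN_H - height) / 2, height, color);
    }
}

int main() { renderFrame(4.5f, 4.5f, 0.0f, 1.0f); }

A real implementation steps the ray cell by cell with a DDA instead of in tiny fixed increments, and uses the exact hit position to pick a texture column, but the shape of the loop is the same: one ray per screen column, wall height inversely proportional to distance.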

Re:raytracing is VERY established (1)

91degrees (207121) | more than 5 years ago | (#26607945)

Yes, quite right. My terminology was incorrect. Raycasting is considerably simpler than ray tracing.

Re:raytracing is VERY established (1)

Quantumstate (1295210) | more than 5 years ago | (#26609117)

I also had fun writing a ray caster. I am slightly puzzled about why you say that they would use a lookup table for the height of the wall. It is a single division that is needed to get the height, and this only needs to be done once per column of pixels per frame, so at 30 fps and 320 columns wide that is only 9,600 divisions per second - surely not worth a lookup table.

ray casting != ray tracing (4, Informative)

Joce640k (829181) | more than 5 years ago | (#26605617)

Wolfenstein did "ray casting" - not the same thing.

Re:ray casting != ray tracing (1)

wastedlife (1319259) | more than 5 years ago | (#26608851)

"Ray tracing" and "ray casting" were used interchangeably when ray casting was popular. Now, that ray tracing has become feasible for advanced real-time rendering, a distinction was made between the two phrases.

Re:raytracing is VERY established (1)

SanityInAnarchy (655584) | more than 5 years ago | (#26609211)

Raytracing has mostly been replaced with other, faster technologies these days, which produce similar results, so it's not the panacea it seemed back when you had 5-bit hand-drawn stuff OR raytracing.

Those technologies are only faster for the moment. Theoretically, at some point in the future, raytracing will be faster again, and already produces better effects.

It's actually hard to tell which will win, just thinking about it. If I'm reading TFA right, they went from a 20-machine cluster to a single machine in some 4-5 years. And raytracing has better theoretical scalability -- it's embarrassingly parallelizable, and has quite a few cases (extremely complex geometry, real curves instead of just triangles, any kind of shaders, hall-of-mirrors effects like Portal) where it outperforms rasterization even on a single machine. Intel's Larrabee is all about exploiting the "embarrassingly parallel" part and simply throwing more CPUs at the problem -- but this is a case where, if you bought two Larrabee cards, you would very likely get double the framerate, just like that.
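
To make the "embarrassingly parallel" point concrete, here is a hypothetical C++ sketch that splits a frame into screen tiles and hands them to worker threads. The renderTile stub stands in for the actual per-pixel ray tracing; none of this is taken from Intel's renderer or from the article.

#include <algorithm>
#include <cstddef>
#include <thread>
#include <utility>
#include <vector>

// Stand-in for the per-tile kernel: shoot one (or more) rays per pixel in the
// rectangle [x0,x1) x [y0,y1) and shade the hits.
static void renderTile(int /*x0*/, int /*y0*/, int /*x1*/, int /*y1*/) {}

// Every tile (ultimately every ray) is independent of every other, so adding
// workers scales throughput almost linearly.
void renderFrameParallel(int width, int height, int tile, unsigned workers) {
    std::vector<std::pair<int, int>> tiles;
    for (int y = 0; y < height; y += tile)
        for (int x = 0; x < width; x += tile)
            tiles.emplace_back(x, y);

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w)
        pool.emplace_back([&, w] {
            for (std::size_t i = w; i < tiles.size(); i += workers) {  // static striping
                int x = tiles[i].first, y = tiles[i].second;
                renderTile(x, y, std::min(x + tile, width), std::min(y + tile, height));
            }
        });
    for (auto& t : pool) t.join();
}

int main() {
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    renderFrameParallel(1280, 720, 32, workers);
}

Because no tile depends on any other, throughput scales close to linearly with core count, which is exactly the property Larrabee-style many-core designs are meant to exploit.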

On the other hand, someone once pointed out that rasterization is still the preferred method in a lot of places in Hollywood, where they point thousand-machine clusters at the problem of rendering a movie. If true, that would tend to suggest either that these guys are smarter than Hollywood (possible), or that it's going to be a long time before raytracing will beat rasterization on similar hardware.

Re:Hrmm (1)

palegray.net (1195047) | more than 5 years ago | (#26604981)

You forgot all about taking care of business in bed and making you a sandwich afterward.

Re:Hrmm (1)

Lumpy (12016) | more than 5 years ago | (#26606877)

Buy a Quad P4 motherboard from 4 years ago and use heatpipes to do all that you ask.

Hell, if you did it right you could get one of the old P4 Xeon Dell servers that would take 8 processors and easily heat your home and water for you.

Re:Hrmm (1)

robthebloke (1308483) | more than 5 years ago | (#26609547)

You already [pixelution.co.uk] can [computerarts.co.uk]

Can anyone tell de difference... (0)

Anonymous Coward | more than 5 years ago | (#26604967)

...between a raytracing demo and a slideshow?

Re:Can anyone tell de difference... (0)

Anonymous Coward | more than 5 years ago | (#26605003)

One runs at 0.3 FPS and looks really pretty, the other runs at around 30 FPS, at a terribly low resolution, and with very few objects onscreen at once?

Re:Can anyone tell de difference... (1)

amn108 (1231606) | more than 5 years ago | (#26607455)

1280x720 is a pretty usable and popular resolution these days; it is used in everything from HD movies to PlayStation 3 games, and nobody seems to be complaining about it being "terribly low".

Re:Can anyone tell de difference... (1)

default luser (529332) | more than 5 years ago | (#26609549)

Sure, but I believe the point is this: if you paid retail for the render farm this project used, you would be VERY disappointed with 1280x720 resolution at 20fps.

When you can get 1920x1080 @ 60fps playing ETQW on a $100 video card, you start to see the raytracing results in a different light.

Thank you Intel (0)

Anonymous Coward | more than 5 years ago | (#26604987)

Thank you Intel for not smooshing this informative summary into a bullet list PowerPoint.

Never ending chase... (5, Insightful)

grumbel (592662) | more than 5 years ago | (#26605043)

Yet another ray tracing article, and yet again all the same problems as before. Doing yesterday's games in ray tracing is all nifty, but also kind of pointless. For one, we already played them, but more importantly, it doesn't actually use the strengths of ray tracing. Rendering a tree built out of texture quads is a nice accomplishment, but wasn't the whole point of ray tracing that one can have a million polygons and no longer need such hacks? So show me a realistic tree instead of trying to replicate the limitations of rasterization.

I am still waiting for a game/demo that actually is built from the ground up with ray tracing in mind, and by that I mean one that actually looks good; just a few shiny spheres might have been impressive back on the Amiga some 20 years ago, but not any more.

Re:Never ending chase... (5, Informative)

j1m+5n0w (749199) | more than 5 years ago | (#26605129)

I am still waiting for a game/demo that actually is built from the ground up with ray tracing in mind, and by that I mean one that actually looks good,

Have you tried Outbound? You can find it here [igad.nhtv.nl] . While it's probably not destined to be a huge hit, it looks nice and runs at a playable framerate on a reasonably fast computer. (If you don't want to try to "beat" the game, there's an option buried in one of the configuration files to disable physics and just fly around and admire the scenery.)

Re:Never ending chase... (1)

aliquis (678370) | more than 5 years ago | (#26607703)

Looks kinda crappy vs. today's games, or the games of 5 years ago; on par with those released 8 years ago, maybe ;)

Re:Never ending chase... (1)

V!NCENT (1105021) | more than 5 years ago | (#26609431)

Art and realism are two different things.

Re:Never ending chase... (3, Informative)

YesIAmAScript (886271) | more than 5 years ago | (#26605259)

The problems haven't changed since the 80s.

I attended Siggraph in 1989 and watched the AT&T Pixel Planes presentation. Things still haven't changed in 20 years.

I have no idea how you can say that ray tracing somehow frees you from quads (or tris). You're still going to have to describe the geometry somehow. Depending on how things are done you might get some freedom from surface normals and such, but you'll still have to figure out how to make that tree from sub-elements so that the ray-tracer can bounce rays off it. When a ray passes through the bounding box of the tree, you're going to have to be able to find out whether the ray truly intersects the tree and, if so, where it hit, at what angle, and what color the tree would appear to be from the angle the ray came from. That's going to require you to describe the tree with geometry elements and the texture/color and spectral changes depending on angle.
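
The bounding-box check the parent mentions is usually the classic "slab" test; here is a minimal C++ sketch of it (illustrative only, not the article's code; production traversal also has to cope with axis-parallel rays and usually returns the entry distance so hits can be sorted).

#include <algorithm>
#include <cfloat>

struct Vec3 { float x, y, z; };

// Does the ray origin + t * direction hit the box [bmin, bmax] for some t >= 0?
// invDir holds 1/direction per axis, so each box costs a few multiplies and compares.
bool hitAABB(const Vec3& origin, const Vec3& invDir, const Vec3& bmin, const Vec3& bmax) {
    float tmin = 0.0f, tmax = FLT_MAX;
    const float ro[3]  = {origin.x, origin.y, origin.z};
    const float rid[3] = {invDir.x, invDir.y, invDir.z};
    const float lo[3]  = {bmin.x, bmin.y, bmin.z};
    const float hi[3]  = {bmax.x, bmax.y, bmax.z};
    for (int axis = 0; axis < 3; ++axis) {
        float t0 = (lo[axis] - ro[axis]) * rid[axis];
        float t1 = (hi[axis] - ro[axis]) * rid[axis];
        if (t0 > t1) std::swap(t0, t1);
        tmin = std::max(tmin, t0);      // latest entry into any slab
        tmax = std::min(tmax, t1);      // earliest exit from any slab
        if (tmax < tmin) return false;  // the slabs do not overlap: miss
    }
    return true;                        // overlap on all three axes: hit
}

Only when this cheap test passes does the tracer go on to the expensive part the parent describes: intersecting the actual leaves, branches, or triangles inside the box.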

Re:Never ending chase... (1)

malakai (136531) | more than 5 years ago | (#26606073)

I think the GP was talking about using transparent quads with textures to fake complex graphics.

I.e., the branches of a tree are really just a texture painted on a quad with bump-mapping. It's 2 polys, not the 1000 it could be.

The article makes a reference to trying to speed up hit testing of transparent quads when shadow-casting.

Games/technology will continue this slow R&D followed by a quick jump in visuals for all games for the next 10 years. At some point, our processing power will be sufficient that the tree will not be a fixed set of polygons, but simply a natural algorithm that gets fed some seed values and outputs a stream of triangle strips. Maybe you can even tell it just which part of the tree you want it to 'grow' and then render those triangles.

This will free us from games having a forced 'fixed' resolution based on whatever the graphic artist was told his poly limit was at the time.

Re:Never ending chase... (1)

tepples (727027) | more than 5 years ago | (#26606503)

I.e., the branches of a tree are really just a texture painted on a quad with bump-mapping. It's 2 polys, not the 1000 it could be.

That's the case in Fighting Force for PS1. It's also the case in Animal Crossing, which was designed for a 1996 GPU and a fixed-angle camera. (The GameCube game was a source port from the N64, the DS has an N64-class GPU, and the Wii game intentionally kept the same art style.) But in plenty of games using a "realistic" art style designed for the PS2 or more powerful hardware, branches are actual geometry.

Re:Never ending chase... (5, Insightful)

Sycraft-fu (314770) | more than 5 years ago | (#26605563)

You aren't going to see that kind of thing in a game for many reasons, which boil down to this: ray tracing isn't ready for realtime. Making a game that used ray tracing would pretty much doom it to failure.

One problem you have is that the graphics hardware out there isn't built for ray tracing; it's built for rasterization. Now, while I'm sure you can write your own ray tracer on the newer hardware that does GPGPU stuff, I'm also sure it wouldn't run as well. The reason is that current graphics cards are purpose-built rasterizers. They are designed to do that as fast as possible. So you are left with writing your own ray tracing engine in software, either on the CPU or GPU. This is not going to be fast, especially since ray tracing is fairly computationally intensive.

Well then you hit the next problem: pixels. Ray tracers do NOT scale well with resolution. Each pixel has to have its own ray cast. If you want to do anti-aliasing, then you have to do more rays for that. This is why ray tracing demos tend towards low resolutions. It is much faster the fewer pixels you have to do. OK, well, that doesn't compare favorably against the rasterizers. They scale extremely well with resolution, and also in terms of anti-aliasing. Many of them can do 4xFSAA with next to no performance penalty, and can do it at full HD resolutions. Not the case with your ray tracer. If it can render 40 FPS at 1920x1200 with no AA, it'll be just 10 FPS with 4x AA since it now has to do 4 rays per pixel.
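
To see where the "4 rays per pixel" cost in the paragraph above comes from, here is a hypothetical sketch of a naive supersampled ray tracing loop; the trace and cameraRay stubs stand in for a real tracer, and nothing here is taken from the article's renderer.

#include <cstddef>
#include <vector>

struct Ray   { float org[3], dir[3]; };
struct Color { float r, g, b; };

// Stand-ins: a real trace() would walk the acceleration structure and shade the hit.
static Color trace(const Ray&)             { return {0.5f, 0.5f, 0.5f}; }
static Ray   cameraRay(float sx, float sy) { return {{0, 0, 0}, {sx, sy, 1.0f}}; }

// Naive supersampling: the work grows linearly with width * height * samples
// per pixel, which is the scaling being argued about here.
void render(std::vector<Color>& image, int width, int height, int sppX, int sppY) {
    image.resize((std::size_t)width * height);
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x) {
            Color sum{0, 0, 0};
            for (int sy = 0; sy < sppY; ++sy)
                for (int sx = 0; sx < sppX; ++sx) {
                    Ray r = cameraRay((x + (sx + 0.5f) / sppX) / width,
                                      (y + (sy + 0.5f) / sppY) / height);
                    Color c = trace(r);
                    sum.r += c.r; sum.g += c.g; sum.b += c.b;
                }
            float inv = 1.0f / (sppX * sppY);
            image[(std::size_t)y * width + x] = {sum.r * inv, sum.g * inv, sum.b * inv};
        }
}

As replies further down point out, real ray tracers rarely pay this full worst case, because not every pixel gets extra samples; they shoot additional rays mainly where neighbouring samples disagree.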

So you aren't going to see it happen any time soon. The net result wouldn't look as good as the equivalent rasterized game. It won't be the sort of thing you see until either there starts to be purpose-built ray tracing hardware (GPUs may start to be made for both) or general-purpose processors are so fast it makes no real difference.

Intel is all up on this because they see GPUs as a threat to their computation market. However, as this demonstrates, there really isn't an advantage at this time. You throw a positively massive system at it and you get poor performance. Even if you redid the game so it used extremely high geometry counts, nobody would give a shit. It would run way too slow on any normal computer.

Re:Never ending chase... (0, Flamebait)

Rockoon (1252108) | more than 5 years ago | (#26605737)

You obviously don't know what you are talking about.

Firstly, in rasterization, 4xAA does not mean 4 samples per pixel. Not all pixels are subsampled or supersampled. Nobody does that shit anymore.
Secondly, in raytracing, 4xAA does not mean 4 samples per pixel. Not all pixels are subsampled or supersampled. Nobody does that shit anymore.

Thirdly, what the fuck do current video cards have to do with *anything* about this? This is called RESEARCH. Ever do any?

Fourthly, rasterization began as a software rendering technique.

Finally, you don't have a fucking clue about why raytracing actually scales *well* and is obviously a superior rendering method for the future. It may not replace rasterization, but something sure as damn well will, and if it's not raytracing, whatever is next will scale at least as well as raytracing.

Rasterization scales very poorly with scene complexity, which is a far more important metric than scaling with resolution. Resolution has doubled 3 times in 25 years, while scene complexity has doubled more than 20 times over the same time period. The chance that resolution will double again in the next 5 years is just about zero, while the chance that scene complexity will double several times in the same period is just about 100%.

If you are a decent, well-learned programmer, essentially an expert in algorithmic complexity, then surely you understand the comparison O(n) vs O(log n) and why you cannot refute it with horseshit.
If you aren't a decent programmer, then realize that you are an ignorant blowhard spouting about something that you do not understand (go read some Donald Knuth, spend a few years living it, then open your mouth).
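
For readers who want to see what the O(n) vs O(log n) claim looks like in raw numbers, here is a trivial, purely illustrative C++ program; it deliberately ignores the large constant factors and the memory effects that later replies argue dominate in practice.

#include <cmath>
#include <cstdio>

// Rough intersection-test counts per ray: brute force tests every triangle,
// so it is O(n); descending a balanced hierarchy (BVH, kd-tree, octree) visits
// roughly log2(n) levels plus a handful of triangles at the leaf.
int main() {
    for (long long n = 1000; n <= 100000000LL; n *= 10) {
        double bruteForce = (double)n;
        double levels     = std::log2((double)n);
        std::printf("%10lld triangles: brute force %12.0f tests, hierarchy ~%5.1f levels\n",
                    n, bruteForce, levels);
    }
}

At a million triangles the brute-force count is a million tests per ray, while a balanced hierarchy descends only about 20 levels; the argument in the rest of this thread is over how much per-level constants, cache behaviour, and the cost of (re)building the hierarchy erode that advantage.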

Re:Never ending chase... (2)

Anonymous Coward | more than 5 years ago | (#26606425)

Interpersonal skills, go read some, spend a few years living it then open your mouth.

It may be worth wiping the rabid spittle off your chin first.

( I'm a random passerby, not the original poster. )

Re:Never ending chase... (2, Insightful)

karnal (22275) | more than 5 years ago | (#26606631)

I know I use my share of the foul words in the English language, but think about this - everyone would take your comment more seriously if you didn't use them, at least not to the excess seen in your post.

Re:Never ending chase... (0)

Anonymous Coward | more than 5 years ago | (#26606785)

I count 5-6 pixel doublings since CGA, or more with anti-aliasing. Are you betting that WQXGA and, say, another level of oversampling, won't become common in the next 5 years? The recession and the portable trend could prevent it, but "just about zero"?

Re:Never ending chase... (5, Informative)

robthebloke (1308483) | more than 5 years ago | (#26606885)

If you are a decent well learned programmer, essentialy an expert in algorithmic complexity, then surely you understand the comparison O(n) vs O(log n) and why you cannot refute it with horseshit.

How about real world experience then?

We have approximately 120 rack units in our renderfarm, each with dual quad core xeons + Dual Quadros. Of the rendering jobs we submit, approximately 0.001% use raytracing exclusively, about 0.5% make use of raytracing extensions. The rest is rasterization because it's a hell of a lot more efficient. Period. (And I'm talking about the real world here - not Big O notation on paper)

The arguments for scene complexity go out of the window very quickly in all fairness, for quite a few reasons.

1. To double the complexity of a model, we typically expect to see the time spent authoring that asset increase by a factor of 6. We currently employ in the region of 200 modellers. A doubling of scene complexity would take that number to 1200 (if you don't count the additional management overhead, etc.). There simply aren't enough skilled people to make that a reality, so there is an absolute cap on how complex a given scene can become.

2. We always have, and always will, continue to separate the rendering into separate passes for the compositor to correctly light at a later stage in the pipeline. A highly skilled compositor can produce higher quality images quicker than a better rendering algorithm can. Because we always split the scene into smaller constituent parts, the scenes never get complex enough to see any ray tracing benefits (and those parts can be rendered separately on different nodes in our RF).

3. We typically use 4k x 4k image sizes, and rasterization is certainly fast enough for those image sizes. Our scene complexity is far higher than that of any game now, or in the next 5 years.

4. Scene complexity is inherently limited by one other major factor that you've completely ignored: memory speed. As your data set increases, rendering performance degrades to the speed at which you can feed the processors - i.e. the speed of the memory. Again, this is another reason why we separate the scene into render layers.

CG has never, and will never, use accurate mathematical models to produce images. If a cheap trick is good enough, it's good enough. Raytracing never really made the in-roads into the FilmFX world that the early 80s/90s evangelists predicted - and I predict that it will never make the in-roads into games that you seem to believe it will.

Thirdly, what the fuck do current video cards have to do with *anything* about this? This is called RESEARCH. Ever do any?

Wow! Ever done any research yourself? If you had, you'd know that the answer is: an awful lot! The only computational resource available that can provide both the memory bandwidth and the computational power required for raytracing is the GPU. Our rendering process has been using GPUs to accelerate raytracing (and rasterization) for a couple of years now; unfortunately, all of the problems I raised above regarding ray-tracing still apply.

Re:Never ending chase... (2, Interesting)

0xABADC0DA (867955) | more than 5 years ago | (#26608511)

4. Scene complexity is inherently limited by 1 other major factor that you've completely ignored. Memory speed. As your data set increases, rendering performance degrades to the speed at which you can feed the processors - i.e. the speed of the memory. Again, this is another reason why we seperate the scene into render layers.

What's neat about raytracing is that the memory access can be divided into millions of separate 'threads' that are not dependent on each other. So, with a processor (such as Tera MTA) where threads run in the order that memory is available you achieve maximum memory bandwidth.

On 'modern' processors, where memory is read in the order that threads run, you get massive pipeline and cache stalls when using a software raytracer. So when you are comparing the 'vs' of rasterization and raytracing, you need to consider that raytracing currently has thousands of hands tied behind its back.

Raytracing never really made the in-roads into the FilmFX world that the early 80's/90's evangelists predicted - And i predict that it will never make the in-roads into Games that you seem to believe.

I think it's exactly the opposite. The 'FilmFX' world isn't using raytracing because it hasn't been used in games. Games drive this tech, and if we get fast hardware for raytracing then movies will use it exclusively.

It's really been a hardware issue... designing an asynchronous CPU with millions of threads AND getting it to interface efficiently with a PC card bus is basically impossible. But move the renderer into the CPU itself and it's far, far easier and faster on all fronts. For that reason I see raytracing soon getting its first real chance to compete head-to-head.

Re:Never ending chase... (1)

robthebloke (1308483) | more than 5 years ago | (#26609397)

I think it's exactly the opposite. 'FilmFX' world isn't using raytracing because it hasn't been used in games. Games drive this tech, and if we get fast hardware for raytracing then movies will use it exclusively.

You'd have a point if it wasn't for the fact that dedicated ray tracing hardware has been around for decades as standalone hardware [pixelution.co.uk] , and dedicated PCI cards [computerarts.co.uk] .

The FilmFX industry already has dedicated hardware-accelerated ray tracing, has had it for some time, and finds no use for it. I still find it laughable that everyone gets so excited about HW ray tracing, because quite frankly, it's a dead end...

Re:Never ending chase... (5, Informative)

daVinci1980 (73174) | more than 5 years ago | (#26608541)

Rockoon, you are mistaken in a lot of your points. Even though you seem a bit angry, please allow me to explain. (I work for nvidia, but I do not speak for them.)

Firstly, in rasterization, 4xAA does mean 4 samples per-pixel. The short version is that 4xAA basically means that we render into buffers that are twice as large in the X and Y direction (so 2*2 is 4), and then resolve the extra pixels with hardware when we go to present the backbuffer into the front buffer.

I can't speak to 4xAA in raytracing, but to be apples-to-apples, it would have to literally be extra rays in the X and Y directions. Note that I'm not claiming there's a 4x performance penalty here, though, because modern ray tracers rely a lot on cache coherency to be performant. Algorithmically, I would agree that there really is a potential for 4x the cost, but algorithmically we don't care about the constants we multiply by, right?

Third, it's important to consider what current cards do because they're the largest install base, and they are what developers will target. It's also important if you believe that hybrid raytracing is the future--almost all modern raytracers use rasterization for the eye rays to try to help with the pixel complexity problem.

Fourth, you are correct. In fact, there are probably relatively few hardware inventions that didn't begin their life as a software implementation--CPUs excepted.

Finally, you are incorrect. Raytracing scales O(pixels) and O(ln(complexity)). Rasterization is relatively constant in the number of pixels, and O(complexity). I agree, scene complexity has gone up considerably (and continues to go up considerably) every generation of new titles. Fortunately, in the same time period rasterization has massively decreased the cost of processing geometry while simultaneously increasing the ability to parallelize those types of workloads. Modern GPUs (like the relatively old 8800 GTX [wikipedia.org] ) can process in the neighborhood of 300M visible triangles per second. That means that if you're trying to redraw your scene at 60Hz, you can have around 5M triangles per scene per frame. The closest I've seen from most modern titles is in the 500K-1M range, so I think we still have some headroom in this regard. Modern techniques, such as soft shadowing and depth-only passes, definitely eat into this count, which is why we're seeing much higher counts than we used to.

Regarding pixel complexity, the number of pixels that matters is more than just the resolution; it's also how many times you'd like to draw those pixels in a given second. Seven years ago, you were lucky to find a CRT that drew 1280x1024 (which is a weird, 5:4 resolution, but I digress) at more than 60 Hz. 85 Hz was reasonably common, but finding a monitor that drew at 1600x1200x85 was pretty rare.

Now, you can find monitors that render at 1920x1200x120 for relatively cheap. And 240 Hz is on the way. [extremetech.com] That's a lot of pixel data to be moving and redrawing. And speaking from experience, I can say that leveraging coherence within a single frame is hard, and leveraging coherence between frames is virtually unheard of.

It's not that raytracing is an impossible dream, it's just that the GP was correct: it's no panacea.

I'd like to reiterate: though I work for nvidia, I do not speak for them.

Re:Never ending chase... (1)

Chris Burke (6130) | more than 5 years ago | (#26608703)

If you are a decent well learned programmer, essentialy an expert in algorithmic complexity, then surely you understand the comparison O(n) vs O(log n) and why you cannot refute it with horseshit.

Meh. The same thing could be said about Z-buffer, which is O(n), vs Painter's Algorithm, which is O(n log n). Until the hardware became fast enough to overcome the much larger multiplicative constants in Z-buffer, Painter's Algorithm won and that's all there is to it. Not only did it take quite a while until hardware was fast enough that Z-buffer was even feasible, it took a while before hardware was fast enough to have scenes with enough complexity that the scaling factor worked in Z-buffer's favor. It wasn't until z-buffer could render a scene of equal complexity as painter's equally fast that the switch was made.

So the thing is, while anyone can look and say that, at least given the existing options, raytracing is the future and rasterization will be a thing of the past, that reality isn't coming soon. Our rasterization algorithms and software are very good and only getting better. Raytracing only has a chance when scene complexity increases to the point where it can outdo rasterization. However, scene complexity is not going to increase faster than the rate that rasterization itself allows.

Re:Never ending chase... (1)

gwappo (612511) | more than 5 years ago | (#26607979)

I'm going to have to disagree a bit here.

The issue with raytracing is memory access patterns; this is not so much an issue with GPUs vs CPUs, but rather that both CPUs and GPUs rely on linear prefetch patterns through memory, which raytracing breaks as you traverse the spatial subdivision structure.

Secondly, ray tracers scale very well with resolution, O(n) where n = number of pixels; we currently still have a relatively high constant cost, but assuming Moore's law keeps up in performance and we find an answer to the memory problem, it is superior.

What makes the move to raytracing somewhat of an inevitability, however, is not raytracing's ability to very straightforwardly do more sophisticated lighting (much of which can be mimicked in awkward ways), but its ability to scale to massive amounts of geometry; e.g. when using octrees, we're looking at O(log(n)) for n primitives - this is better than GPU rasterizing hardware, where it is O(n).

The one critical failure of raytracing - and the reason it is hard to do games - is that to get to O(log(n)) you (currently) have to have static geometry - it cannot animate dynamically in realtime, as you need to rebuild the spatial subdivision structure for the animated geometry on a frame-by-frame basis, and that gets expensive.

Perhaps some kind of mix will ultimately be used, but the geometry benefits of raytracing do still make the technology inevitable.

Having said that, Larrabee will be stillborn for the first reason outlined.

Re:Never ending chase... (1)

Sloppy (14984) | more than 5 years ago | (#26608067)

Near as I can tell, your whole argument boils down to "raytracing is slow[er]." (I added the "er".) The years keep going by, though, and while the clocks are creeping up slower than they used to, we keep getting more 'n' more cores. Raytracing might always be slower but it's only a matter of time until it's fast enough.

When you say 10 FPS, I think "That's amazing!" That means in 10 years we'll all be doing it at 60 FPS on $500 machines.

Re:Never ending chase... (0)

Anonymous Coward | more than 5 years ago | (#26608139)

One of the major ideas that makes realtime raytracing possible is using ray bundles, thus shooting only one or a few rays for many pixels.

With custom hardware (there's a reason why Intel's new SIMD units have much wider registers than previous iterations) you can easily make these bigger and thus scale raytracing to HD resolutions.

Re:Never ending chase... (1)

SanityInAnarchy (655584) | more than 5 years ago | (#26609509)

One problem you have is that the graphics hardware out there isn't built for ray tracing, it's built for rasterization.

This paper is by Intel, who wants to release video hardware specifically designed for raytracing. It's also embarrassingly parallel -- this example originally ran on a cluster of 20 machines.

Ray tracers do NOT scale well with resolution. Each pixel has to have it's own ray cast.

That's linear. As in, throw in twice as many cores, and you can handle twice as many pixels. How does rasterization scale with pixels?

Ok well that doesn't compare favorably against the rasterizers. They scale extremely well with resolution, and also in terms of anti-aliasing. Many of them can do 4xFSAA with next to no performance penalty...

Alright -- but you haven't actually shown what the requirement is. Is it logarithmic? Linear? Constant?

The net result wouldn't look as good as the equivalent rasterized game.

No, it will look better, when it gets there.

As an example: A mirror in a rasterized game will either require a second rendering pass (that is, render an image from the mirror's perspective, then use it as a texture and render that), or it will require you to duplicate the geometry of everything on one side of the mirror to the other.

Now, what if you have two mirrors facing each other? What if you're playing something like Portal?

With a raytraced mirror, you simply send out another ray from the mirror.

Or, suppose you want to show a reflective sphere. That's a tricky shader, and how many polygons does it take? Too many, and you slow things down. Too few, and it becomes obvious that it's not a sphere, but an n-hedron. With a raytracer, you just throw in a perfect sphere, and with high school math, you can make it reflective. Just like the mirror, it can just send out another ray, at a slightly different angle.
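
The "high school math" in question is just the quadratic for where a ray meets a sphere; here is a minimal C++ sketch of it (illustrative names only, not taken from the article or any particular renderer).

#include <cmath>

struct Vec3 {
    float x, y, z;
    Vec3  operator-(const Vec3& b) const { return {x - b.x, y - b.y, z - b.z}; }
    float dot(const Vec3& b) const       { return x * b.x + y * b.y + z * b.z; }
};

// Ray: origin o + t * direction d (d assumed normalized).
// Sphere: center c, radius r. Returns true and the nearest positive t on a hit.
bool intersectSphere(const Vec3& o, const Vec3& d, const Vec3& c, float r, float& t) {
    Vec3 oc = o - c;
    // Solve |o + t*d - c|^2 = r^2, a quadratic in t with leading coefficient d.d = 1.
    float b    = 2.0f * oc.dot(d);
    float cc   = oc.dot(oc) - r * r;
    float disc = b * b - 4.0f * cc;
    if (disc < 0.0f) return false;              // ray misses the sphere entirely
    float s  = std::sqrt(disc);
    float t0 = (-b - s) * 0.5f, t1 = (-b + s) * 0.5f;
    t = (t0 > 0.0f) ? t0 : t1;                  // nearest intersection in front of the origin
    return t > 0.0f;
}

// For the reflection the parent describes, bounce the incoming direction d about the
// surface normal n = (hitPoint - c) / r:  reflected = d - 2 * d.dot(n) * n.

The sphere stays perfectly round at any zoom level because the intersection is solved analytically; there is no tessellation into an n-hedron at all.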

You throw a positively massive system at it and you get poor performance.

From TFA:

With Intel's latest quad-socket systems -- equipped with a 2.66 GHz Dunnington processor in each socket -- we can achieve approximately 20 to 35 fps at a resolution of 1280x720. Nonetheless, this represents a significant improvement over the experiments in 2004 that required 20 machines to render a simpler game more slowly and at a lower resolution.

In other words, what took a positively massive system in 2004, takes a single machine in 2009, if I've read that right. I wonder what it will look like in 2014? Or when Intel comes out with its special-purpose raytracing hardware, that's basically a bunch of CPUs?

Re:Never ending chase... (1)

Jackie_Chan_Fan (730745) | more than 5 years ago | (#26606081)

Actually, your tree example has nothing at all to do with raytracing. Textured quads are used for leaves because of polygon count. The polygon count required to create every individual leaf is extremely huge, and it takes not only more render power but a hell of a lot more setup/translation processing.

The higher polygon count required would put just as much demand on a raytracer as it would on a REYES or scanline renderer. In fact it may put more stress on it, because raytraced scenes tend to require larger amounts of data being available in memory for each ray as it passes through transparent surfaces and bounces off reflective surfaces...

Re:Never ending chase... (2, Informative)

grumbel (592662) | more than 5 years ago | (#26606221)

Textured Quads are used for leaves because of polygon count.

The whole point of realtime ray tracing is that it scales with O(log(n)) instead of O(n) when it comes to polygons. Which means that you can and should model each leaf as actual polygons. That is actually done in quite a few other examples, such as the sunflower scene [openrt.de] or that Boeing model [uni-sb.de] , where every last screw is modeled - and yes, ray tracing can handle those just fine.

Now there is of course a caveat: this scalability only works for static scenes, and things become quite a bit more problematic when stuff is animated. But nevertheless, the whole point of "going ray tracing" is that presumably polygon counts are slowly getting high enough that ray tracing just outruns rasterization.

Re:Never ending chase... (2, Insightful)

robthebloke (1308483) | more than 5 years ago | (#26606943)

There's one other caveat. That scalability fails to apply when you take into account that memory reads are not free.

Re:Never ending chase... (1)

RiotingPacifist (1228016) | more than 5 years ago | (#26609021)

Isn't the point that x*log(n) (where x represents the extra cost of doing raytracing) is still going to be smaller than n (for large enough n)? In fact, the benefit of switching to raytracing (if it can approach its theoretical limit) is that for highly complex scenes it scales incredibly well (almost flat, in fact).

Re:Never ending chase... (1)

slashdotjunker (761391) | more than 5 years ago | (#26608183)

Grr, this always happens. Why do people always compare static rendering with dynamic rendering? A billion dollar industry isn't wrong. Rasterization is better for games. Period.

Dynamic, order-independent rendering is O(n) no matter what rendering system you use. The oft-quoted O(log n) is only valid for static, order-dependent renderers! To turn a dynamic scene into a static one, you have to sort the data into a static acceleration structure. Doing that takes O(n log n), so order-independent rendering will always be better for dynamic scenes.

Re:Never ending chase... (1)

grumbel (592662) | more than 5 years ago | (#26608495)

Large parts of any game level are completely static, and so are many models, but you are absolutely right: dynamic scenes are a problem, and today's games have plenty of them. Which is exactly why I said I would like to see stuff that focuses on the strengths of raytracing instead of trying to replicate current games.

I don't doubt that rasterization will dominate computer games for a long while to come, compatibility on multiple platforms alone pretty much guarantees that, all technical benefits aside. However I also think that it might be time to try other rendering algorithms again, after all, we have been stuck for a decade with 3d hardware and while it can do many pretty things there are also plenty of things it just can't do well. And with multicore CPUs getting common and stuff like PS3 cell it might be time for some creativity in that area. Who knows, maybe some space game with gigantic hyper detailed Star Destroyers or whatever would be good to showcase raytracing.

Anyway, my whole complaint about this raytracing thing is simply that research is the stuff that should blow you away and make you go "wow", not make you go "yeah, nice, I played that last year..."

Was it more fun? (3, Insightful)

samkass (174571) | more than 5 years ago | (#26606109)

So was the ray-traced version of the game more fun? Or am I missing the point of games?

Re:Was it more fun? (2, Insightful)

Cheapy (809643) | more than 5 years ago | (#26608217)

In this case, yeah, you were missing the point. It wasn't meant to make the game more fun; it was meant to show that ray tracing is possible on a fairly modern game. It's like modding a toaster to run BSD, or adding a laser turret to your mailbox: a substantial reason for doing it is to see whether it's possible.

before after pictures (3, Interesting)

robvangelder (472838) | more than 5 years ago | (#26605057)

When looking at the before/after pictures, was anyone else surprised when they read which was the raytraced version?

To me, the ship in the water looks better with the bump map.

Re:before after pictures (4, Informative)

CMKCot (1297039) | more than 5 years ago | (#26605063)

Both screenshots were ray traced; they just show off two ways of simulating the water.

Re:before after pictures (2, Insightful)

Shadow of Eternity (795165) | more than 5 years ago | (#26605111)

It's got more obvious special effects, but the other one looks far more realistic.

Re:before after pictures (3, Insightful)

nbert (785663) | more than 5 years ago | (#26605351)

Maybe the reason for this is that the 2D surface with the bump map resembles the look of water we expect in a game. I also thought it looked better, but it's not really possible to judge this based on a screenshot, because when it comes to water it's all about the movement.

Re:before after pictures (1)

rrossman2 (844318) | more than 5 years ago | (#26606487)

"Maybe the reason for this is that the 2D surface with the bump map resembles the look of water we expect in a game"
Naw, the top is how the water looks on the lake my friends and I take my boat out on. Trust me, we should know: we met it face first quite a few times trying to learn to wakeboard a few years ago.

Bad sample (3, Informative)

argent (18001) | more than 5 years ago | (#26605915)

As someone else noted, both pictures were raytraced.

To really show the difference between 2d and 3d water, you need to show the water interacting with a solid object close enough so that you can see that in one example the waves really go up and down and in another they're just a picture of waves on a mirror.

There's been a LOT of work making 2d water look dramatic, and I've seen people say they prefer 2d water in broad shots like this in other games (not even raytraced ones), but when you're in the game looking over the edge of a dock or looking at a nearby boat with the light behind you, it's pretty clear that spending more time on the physics of the water pays off.

Heck, even with 2d water, paying attention to the wave effects in shallow versus deep water pays off when you interact with it. And that's rarely done because it's not as dramatic.
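
(To make the distinction concrete, here's a toy sketch of my own, not from the thread or the article: the same wave function used two ways, once to actually displace the surface and once to merely bend the shading normal of a flat plane.)

    #include <cmath>

    // Some simple wave field; the exact formula doesn't matter for the point.
    float waveHeight(float x, float z, float t)
    {
        return 0.2f * std::sin(0.8f * x + t) + 0.1f * std::sin(1.3f * z + 1.7f * t);
    }

    // "3D" water: the geometry itself is displaced, so waves genuinely rise and
    // fall against docks and hulls.
    float displacedY(float x, float z, float t)
    {
        return waveHeight(x, z, t);
    }

    // Bump-mapped water: the surface stays at y = 0 and only the shading normal
    // is tilted using the height-field gradient. It looks fine in broad shots
    // but never actually moves up and down against nearby objects.
    void bumpNormal(float x, float z, float t, float n[3])
    {
        const float e = 0.01f;
        float dhdx = (waveHeight(x + e, z, t) - waveHeight(x - e, z, t)) / (2.0f * e);
        float dhdz = (waveHeight(x, z + e, t) - waveHeight(x, z - e, t)) / (2.0f * e);
        n[0] = -dhdx; n[1] = 1.0f; n[2] = -dhdz;   // unnormalized surface normal
    }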

Re:before after pictures (1)

rrossman2 (844318) | more than 5 years ago | (#26606459)

I absolutely agree; this was my first thought as well. Unless there was a surface fog over the water that was reflecting the light onto the boat like that and giving the water that hazy look, there should have been a more drastic difference between the lighting on the side of the boat and on the top, and in the reflections off the water.

Did Intel graphics improve when I wasn't looking? (0)

pecosdave (536896) | more than 5 years ago | (#26605065)

Seriously, as of a year ago I would rather have used an old AGP TNT2 than the latest built-in Intel graphics. I improved the performance of a relative's machine by 35% after putting in a PCI GeForce 5600 instead of using the built-in Intel.

Did something happen over the past year or two that caused Intel to be able to publish papers like this? I mean their graphics are fine for a Windows desktop running Office and a browser, but it stops there unless something recently changed.

Re:Did Intel graphics improve when I wasn't lookin (1)

am 2k (217885) | more than 5 years ago | (#26605081)

There's a huge difference between the things they have in their lab and the things they're selling.

Re:Did Intel graphics improve when I wasn't lookin (1)

Linuss (1305295) | more than 5 years ago | (#26605325)

Hmmm, I WAS going to say that you should get yourself a better PC, but I think I've changed my mind and would recommend common sense instead. No shit Intel's lab work is going to be better than what the end user ends up seeing, usually because the end user is too dumb to set up a PC correctly, but maybe even a little bit because THEY USE STUFF YOU CAN'T BUY? Sheesh, people these days.

Re:Did Intel graphics improve when I wasn't lookin (2, Informative)

EvolutionsPeak (913411) | more than 5 years ago | (#26605165)

All of this stuff is done in software on the CPU, so the graphics hardware really doesn't affect it.

Re:Did Intel graphics improve when I wasn't lookin (1)

Kashgarinn (1036758) | more than 5 years ago | (#26605317)

They're looking into ray tracing "because" their graphics cards are so bad. They can't be bothered to juice up their graphics department and its productivity, so they're trying to make the world change instead of Intel changing their ways.

- They would be a lot better off if they just stopped making graphics processors altogether and let ATI and Nvidia do the integrated graphics; or better yet, motherboard makers should bloody well realize that Intel integrated graphics are crap and stop buying them from Intel.

Re:Did Intel graphics improve when I wasn't lookin (1)

91degrees (207121) | more than 5 years ago | (#26605361)

Intel are perfectly capable of producing high performance graphics hardware. The reason they don't is that the cost of entry to that market is too high. ATI and nVidia have it sewn up between them. Intel make a lot more money selling lots of cheap chips to people who don't need much performance.

I feel sorry for the... (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26605301)

guys who did this work. I played this game enough to tell it wasn't fun to play; it tried to be a Battlefield 2 clone with a broken physics engine and "real-time" shadows that wasted FPS and didn't need to be real-time at all. Static objects could have just been baked into the megatextures like in BF2. It was sad to see ETQW finally show up a year late with suck-ass gameplay. Splash Damage and id should be ashamed of this product and tech.

Now on to the poor saps who had to add ray tracing on top of this poop in a box: you could show me another 500 images and I wouldn't be able to tell the difference.

Sorry fellas, you can put a tuxedo on a turd but it's still a turd. Maybe try again with CryEngine 2, if Crytek will let you have the source to work with; at least you will have something that is state of the art to begin with, not a warmed-over version of Doom 3 / Quake 4 tech that was poorly coded by Splash.

Carmack is still turning in his grave. Oh wait, is he still around? Oh yeah, he is. Ahh, Mr. Carmack... sir, please stop working on console games and blow the world away again. We need you, and being a console whore is one more step toward mediocrity.

Re:I feel sorry for the... (2, Informative)

wild_quinine (998562) | more than 5 years ago | (#26605601)

How did this get modded insightful - by ANYONE?

guys who did this work. I played this game enough to tell it wasn't fun to play; it tried to be a Battlefield 2 clone with a broken physics engine and "real-time" shadows that wasted FPS and didn't need to be real-time at all. Static objects could have just been baked into the megatextures like in BF2. It was sad to see ETQW finally show up a year late with suck-ass gameplay. Splash Damage and id should be ashamed of this product and tech.

QW:ET is one of the best-made, best-balanced team FPS games I have EVER played. If it draws from anything, it draws from the previous Enemy Territory game. I'm sure we've all played a lot of the original ET, seeing as it was free. QW is like a much-refined version of it, with a modern graphics overhaul and a more interesting setting.

a warmed-over version of Doom 3 / Quake 4 tech that was poorly coded by Splash.

I mean, come on. Flamebait if not an outright troll. But insightful? Where's the evidence that this was poorly coded? This game is a masterwork, IMO.

Re:I feel sorry for the... (1)

Jesus_666 (702802) | more than 5 years ago | (#26605931)

Carmack is still turning in his grave. Oh wait, is he still around? Oh yeah, he is. Ahh, Mr. Carmack... sir, please stop working on console games and blow the world away again. We need you, and being a console whore is one more step toward mediocrity.

Don't worry, just wait for his next game. John Carmack is about to make you his bitch.

Re:I feel sorry for the... (1)

jandrese (485) | more than 5 years ago | (#26607463)

Wasn't that John Romero?

Bandwidth & processing, quantum effects? (-1)

mcrbids (148650) | more than 5 years ago | (#26605315)

The problem with simulating reality is that reality is so incredibly BROADBAND. Every piece interacting with every other piece on every level from entangled subatomic particles to heat convection and induction to light reflections and .... wow! It's just an incredible experience, this reality thing!

Reality, by definition, is "dirty". We have dust, we have imperfections in every surface, no matter how carefully machined. Houses are never truly square, roads are never perfectly level, and points in a corner are always rounded. Always.

Computers, by definition, are "clean". Squares are always truly square, roads are as perfectly level as they were designed to be, and corners are always razor sharp, no matter how much you "zoom in".

The problem with modern graphics systems is they are computed to extreme levels of precision. If they incorporated a sort of fundamental randomness, if they were intrinsically uncertain, they just might be able to really approximate reality, which is messy, ugly, and imperfect.

Personally, I think it really would take massive bandwidth, orders of magnitude beyond what we can produce today, plus some pseudo-custom-designed chips to get truly realistic graphics. I'm thinking of FPGAs, which implement circuits at relatively low clock rates but can be highly tuned to the task at hand. If there were a widely recognized spec for putting FPGAs in video cards to handle specific physics or ray-tracing tasks, the problem of specialized processing could disappear tomorrow.

The other issue is bandwidth, and there's nothing to do about this except provide more of it. (HyperTransport, anyone? I have a 4-core Athlon database server that outperforms an 8-core Xeon with 1.5x the "clock speed", simply by having more real, useful throughput.)

Re:Bandwidth & processing, quantum effects? (2, Insightful)

IsThisWorking (883966) | more than 5 years ago | (#26605603)

Reality, by definition, is "dirty". We have dust, we have imperfections in every surface, no matter how carefully machined. Houses are never truly square, roads are never perfectly level, and points in a corner are always rounded. Always.

Computers, by definition, are "clean". Squares are always truly square, roads are as perfectly level as they were designed to be, and corners are always razor sharp, no matter how much you "zoom in".

The problem with modern graphics systems is they are computed to extreme levels of precision. If they incorporated a sort of fundamental randomness, if they were intrinsically uncertain, they just might be able to really approximate reality, which is messy, ugly, and imperfect.

You seem to be confusing texture irregularity with material consistency. A house wall is not perfectly "razor sharp", but no matter how many times you look at it, it does not suffer from "randomness", nor is it in any way "uncertain"; at least not unless you are looking at the subatomic level. Also, the bandwidth would not need to be that high if you take into account that human eyes have fairly limited resolution, so an extreme amount of detail at a distance would be pretty much irrelevant.

Re:Bandwidth & processing, quantum effects? (1)

complete loony (663508) | more than 5 years ago | (#26605729)

If you just added fresh randomness to every frame, it would look like a mess. The tricky part is to draw each frame with the same randomness so it doesn't jump around. Which of course means you aren't drawing it very randomly at all...

Re:Bandwidth & processing, quantum effects? (0)

Anonymous Coward | more than 5 years ago | (#26606107)

So we're agreed then. We need rigidly standardised rules of randomness which precisely define who is what, when, and how.

If only there were a way to create such numbers... a seed, if you will.
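
(A concrete version of what both posters are getting at, sketched by me: derive the "randomness" from a hash of position and a fixed seed instead of calling a fresh random number generator every frame. The value then depends only on where you are looking, not on when, so nothing shimmers. The constants below are just arbitrary mixing constants.)

    #include <cstdint>

    // Stable per-position "randomness" in [0, 1] via a small integer hash.
    float surfaceNoise(int x, int y, int z, std::uint32_t seed)
    {
        std::uint32_t h = seed;
        h ^= static_cast<std::uint32_t>(x) * 0x8da6b343u;   // large odd constants
        h ^= static_cast<std::uint32_t>(y) * 0xd8163841u;   // decorrelate the axes
        h ^= static_cast<std::uint32_t>(z) * 0xcb1ab31fu;
        h ^= h >> 13; h *= 0x5bd1e995u; h ^= h >> 15;       // final avalanche
        return (h & 0xffffffu) / 16777215.0f;
    }

Because the result is a pure function of position and seed, every frame sees exactly the same perturbation, so the surface "dirt" stays put instead of crawling.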

Re:Bandwidth & processing, quantum effects? (1)

robthebloke (1308483) | more than 5 years ago | (#26606975)

Personally, I'm thinking about FPGAs which produce circuits at relatively low bandwidth but that are highly tuned to the task at hand.

Hardware-Accelerated Shaders Using FPGAs [dctsystems.co.uk]

Erm... error? (1)

achenaar (934663) | more than 5 years ago | (#26605365)

"however, if one part of the bundle hits another surface then the other one, it produces some reorganization"

Is it just me or does this widely disseminated professional document contain a then/than error?

Let's hope that wasn't one of the coding challenges they faced.

if (x > y/2) than
{
    'uh oh
}

(This post written assuming "the other one" = "the other part of the ray bundle")

Still taking the wrong approach... (3, Interesting)

argent (18001) | more than 5 years ago | (#26605951)

They're still taking the wrong approach to ray tracing. If Philipp Slusallek was able to get 30 FPS in a ray-traced game in 2005, using a single Pentium 4 behind a ray-tracing accelerator that was roughly equivalent to a Rage Pro in terms of gates and clock speed, it seems silly to me to ignore the possibility of adding an "RPU" to the mix instead of just adding more general-purpose CPU power. Yes, I know that's Intel's thing, but even for Intel... a ray-tracing core would be a tiny speck in an i7.

Re:Still taking the wrong approach... (1)

MobyDisk (75490) | more than 5 years ago | (#26607041)

Intel isn't trying to do ray tracing. Really, their point is to find a way to make GPUs unnecessary since it is a threat to the CPU market. So adding an RPU to the mix would make sense if the goal was fast ray tracing, but it would defeat what they are trying to show.

I wonder if they used any SIMD extensions when they coded this.
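
(They almost certainly did; the article's talk of ray bundles suggests packet tracing, which maps directly onto SSE. Purely as an illustration of mine, not code from the paper: four rays tested against one sphere at once, with ray data laid out structure-of-arrays. For brevity it only checks the discriminant, so it treats each ray as an infinite line and skips the usual t > 0 test; directions are assumed normalized.)

    #include <xmmintrin.h>

    // Returns a 4-bit mask; bit i is set if ray i's line intersects the sphere.
    // Ray origins and directions are passed as four-wide arrays (SoA layout).
    int hitSphere4(const float ox[4], const float oy[4], const float oz[4],
                   const float dx[4], const float dy[4], const float dz[4],
                   float cx, float cy, float cz, float radius)
    {
        __m128 lx = _mm_sub_ps(_mm_loadu_ps(ox), _mm_set1_ps(cx));   // origin - center
        __m128 ly = _mm_sub_ps(_mm_loadu_ps(oy), _mm_set1_ps(cy));
        __m128 lz = _mm_sub_ps(_mm_loadu_ps(oz), _mm_set1_ps(cz));

        __m128 b = _mm_add_ps(_mm_add_ps(_mm_mul_ps(lx, _mm_loadu_ps(dx)),
                                         _mm_mul_ps(ly, _mm_loadu_ps(dy))),
                              _mm_mul_ps(lz, _mm_loadu_ps(dz)));     // dot(l, d)

        __m128 c = _mm_add_ps(_mm_add_ps(_mm_mul_ps(lx, lx), _mm_mul_ps(ly, ly)),
                              _mm_mul_ps(lz, lz));                   // dot(l, l)
        c = _mm_sub_ps(c, _mm_set1_ps(radius * radius));

        __m128 disc = _mm_sub_ps(_mm_mul_ps(b, b), c);               // b^2 - c
        return _mm_movemask_ps(_mm_cmpge_ps(disc, _mm_setzero_ps()));
    }

One instruction stream handles four rays per intersection test, which is exactly the kind of win a CPU-only ray tracer depends on.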

Mod parent up insightful... (3, Insightful)

argent (18001) | more than 5 years ago | (#26608953)

Intel isn't trying to do ray tracing. Really, their point is to find a way to make GPUs unnecessary since it is a threat to the CPU market.

They can call it "ray tracing extensions" to the I7 or I8 CPU. It's not like the x86/x86_64 instruction sets are some kind of blushing virgin whose precious architectural purity would be violated by adding instructions like "RT_LOAD_MESH" and "RT_LOAD_SHADER"...

What bothers me is how nVidia is missing the boat.

Intel promotional text (5, Funny)

bitrex (859228) | more than 5 years ago | (#26606043)

Our raytracing engine is the finest available! For all your sphere-on-chessboard game needs! If your game is going to involve spheres, chessboards, reflective spheres, or possibly spheres floating on water - raytracing is the way to go!