
Real-Time, Movie-Quality CGI For Games

kdawson posted more than 4 years ago | from the we-can-animate-it-for-you-wholesale dept.

Graphics

An anonymous reader writes "An Intel-owned development team can now render CGI-quality graphics in real time. 'Their video clips show artists pulling together 3D elements like a jigsaw puzzle (see for example this video starting at about 3:38), making movie-level CG look as easy as following a recipe.' They hope that the simplicity of 'Project Offset' could ultimately give them the edge in the race to produce real-time graphics engines for games."


184 comments

Wow (4, Funny)

binarylarry (1338699) | more than 4 years ago | (#31238376)

They've discovered the hidden secrets to rendering Academy Award winning films such as "Gears of War" and "Crysis."

Congrats Intel dev team!

Re:Wow (1)

socsoc (1116769) | more than 4 years ago | (#31239204)

I prefer the cutscenes of Super Mario Bros and ExciteBike. When are films gonna achieve that quality? I mean Avatar tried, but didn't quite nail it.

Re:Wow (2, Insightful)

Anonymous Coward | more than 4 years ago | (#31240168)

And then there's South Park, which appears to have been created with PowerPoint.

What this really means is ... (2, Insightful)

WrongSizeGlass (838941) | more than 4 years ago | (#31238388)

... now they can pump out crappy movies that have quality CG faster than ever before?

Re:What this really means is ... (2, Interesting)

Anonymous Coward | more than 4 years ago | (#31238820)

Hating on the current quality of movies/games/music automatically gets you karma points, even if you haven't the least idea of what you're talking about...

Re:What this really means is ... (5, Funny)

biryokumaru (822262) | more than 4 years ago | (#31239550)

I hate it when people hate on people hating on something they hate just to get karma points just to get karma points.

It's almost as bad as people hating on people hating on people hating on something they hate just to get karma points just to get karma points just to get karma points.

Grammar works like nesting things, right?

Re:What this really means is ... (1)

nacturation (646836) | more than 4 years ago | (#31240408)

Grammar works like nesting things, right?

Don't anthropomorphize grammar works... they hate it when you do that.

Re:What this really means is ... (1, Interesting)

Anonymous Coward | more than 4 years ago | (#31238836)

I think it will make the CGI worse.

When someone has to spend time creating the graphics, a little bit of human soul leaks over even if their ideas are uninspired.

Some of the 80's CGI (Tron etc) looks dated, but has a vivacity that is lost when everything is too perfect.

Re:What this really means is ... (2, Interesting)

Korin43 (881732) | more than 4 years ago | (#31239176)

Wouldn't this mean that this is just another step in the direction of letting anyone make movies (without needing a billion dollars worth of computers and another billion dollars worth of actors)?

Re:What this really means is ... (5, Interesting)

biryokumaru (822262) | more than 4 years ago | (#31240098)

I read a really great short story once about a future where all films are made completely on computers, with AI actors. Then one guy starts filming movies with a real girl in them, just with computerized scenery, and doesn't tell anyone. It blows people away just how "real" his films feel compared to normal movies.

Anyone else read that? It was pretty good.

Re:What this really means is ... (2, Insightful)

Pseudonym (62607) | more than 4 years ago | (#31240106)

Anyone can already make movies without a billion dollars worth of computers and a billion dollars worth of actors. The difficulty is finding a million dollars worth of animators and fifty thousand dollars worth of screenwriters.

reducing implementation time is a good thing (1)

snooo53 (663796) | more than 4 years ago | (#31239574)

Here's the thing. Normal people don't want to spend hours and hours creating detailed 3D models in Blender or whatever. They just want the easiest way to turn their ideas into reality. Reducing the implementation time for a high quality end product, and eliminating the tedious tasks is a worthy goal. It's the same reason normal people don't program in assembly anymore. With the exception of some very specific programs, higher levels of abstraction are almost always better, and this is no exception.

Re:reducing implementation time is a good thing (1)

Dalambertian (963810) | more than 4 years ago | (#31239884)

It's been a while since I've heard from these guys. They are following a trend a lot of indie game developers have latched onto: what they lack in budget can be made up for with brains. The simple fact is we no longer need models crammed with millions of polygons in order to make high-quality assets. I shall have to make the obligatory demoscene reference here; consider exhibit A: http://www.demoscene.tv/prod.php?id_prod=13374 [demoscene.tv] Also check out the works of Introversion, Eskil Steenberg, and (more drastically) Will Wright's Spore.
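
As a concrete illustration of the procedural, brains-over-budget approach described above (a generic value-noise recipe, not anything from the linked demo), here is how a few dozen lines of Python can stand in for megabytes of hand-authored terrain data:

```python
import math

def value_noise_2d(x, y, seed=0):
    """Deterministic pseudo-random value in [0, 1) for integer lattice points."""
    h = (x * 374761393 + y * 668265263 + seed * 1442695040888963407) & 0xFFFFFFFF
    h = (h ^ (h >> 13)) * 1274126177 & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 2**32

def smooth_noise(x, y, seed=0):
    """Bilinear interpolation of lattice noise gives a continuous field."""
    xi, yi = int(math.floor(x)), int(math.floor(y))
    tx, ty = x - xi, y - yi
    n00 = value_noise_2d(xi, yi, seed)
    n10 = value_noise_2d(xi + 1, yi, seed)
    n01 = value_noise_2d(xi, yi + 1, seed)
    n11 = value_noise_2d(xi + 1, yi + 1, seed)
    top = n00 + (n10 - n00) * tx
    bot = n01 + (n11 - n01) * tx
    return top + (bot - top) * ty

def heightmap(size=16, octaves=4):
    """Sum octaves of noise: arbitrary terrain detail from ~20 lines of code."""
    grid = []
    for j in range(size):
        row = []
        for i in range(size):
            h, amp, freq = 0.0, 1.0, 1.0 / size
            for o in range(octaves):
                h += amp * smooth_noise(i * freq * 2**o, j * freq * 2**o, seed=o)
                amp *= 0.5
            row.append(h)
        grid.append(row)
    return grid

# Crude ASCII preview; in a real engine this would feed a mesh or texture.
for row in heightmap(8):
    print("".join(" .:-=+*#%"[min(8, int(v * 4.5))] for v in row))
```

The whole "asset" is the handful of constants in the code, which is the demoscene trick in miniature: generate detail instead of storing it.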

Re:reducing implementation time is a good thing (1)

Sam36 (1065410) | more than 4 years ago | (#31240026)

Makes me think of Microsoft PowerPoint. The same crappy blue-shaded background on every slide I see.

Great... (4, Funny)

Beelzebud (1361137) | more than 4 years ago | (#31238390)

Now maybe they can get to work on shipping on-board graphics cards that can actually play games released within the past couple of years...

Re:Great... (0)

Anonymous Coward | more than 4 years ago | (#31239184)

Or ones that can display a CLI properly.

Re:Great... (1)

Nyder (754090) | more than 4 years ago | (#31239232)

Now maybe they can get to work on shipping on-board graphics cards that can actually play games released within the past couple of years...

How about you don't be so cheap and buy a dedicated graphics card?

Re:Great... (1)

contrapunctus (907549) | more than 4 years ago | (#31239408)

I thought the nice thing about Intel cards is that the drivers are open. It's not about being cheap.

Re:Great... (0)

Anonymous Coward | more than 4 years ago | (#31240596)

Then don't complain in the first place.

Re:Great... (0)

Anonymous Coward | more than 4 years ago | (#31240642)

It must suck to work on GPUs at Intel: on the one hand they have the Larrabee BSers, and on the other an executive team that refuses to unleash them on a full-blown GPU design. Intel will never bite the GPU bullet because institutionally they cannot bring themselves to accept that the GPU is as important as the CPU and deserves some goddamned respect, not some shitty free-D corner of a chipset somewhere.

Make a REAL GPU with a rasterizer and dedicated raster ops & early zbuffer architecture that is not some half-assed parallel compute device with too few cores or a tiny area on the chipset to fill the lowest end of the product line.

Build something that's viable NOW, not something that will be viable if the whole friggin' world changes direction and does exactly what Intel's silly wet dream hopes it will.

Take anyone who hypes ray-tracing and fire them, that would be a good start to weed out the incompetent bozos.

As long as Moore's law holds (2, Interesting)

Anonymous Coward | more than 4 years ago | (#31238422)

How can there be any doubt that realtime rendering will approach the quality of today's offline rendering when computing power grows exponentially?

Re:As long as Moore's law holds (5, Insightful)

binarylarry (1338699) | more than 4 years ago | (#31238448)

Unfortunately, the faster processors get, the fancier the rendering features that become possible in the offline space as well.

Realtime rendering will never be on par with offline rendering of the same vintage.

Re:As long as Moore's law holds (0)

Anonymous Coward | more than 4 years ago | (#31238516)

That's kinda why I qualified "offline rendering" there, you see?

Re:As long as Moore's law holds (1)

JT The Geek (1121759) | more than 4 years ago | (#31239500)

Absolutely true, but there is an apex that both aspire to reach: photorealistic rendering. There is a foreseeable point where these two technologies will one day meet, and then eventually move past it into faster-than-realtime rendering, where we could be rendering many different virtual spaces at once. One useful example: content creators such as movie studios could watch a scene they are working on in real time under many different lighting conditions. So yes, realtime rendering will never be "on par" with offline, because eventually offline rendering will be an antiquity.

Re:As long as Moore's law holds (3, Informative)

Pseudonym (62607) | more than 4 years ago | (#31240156)

Absolutely true, but there is an apex that both aspire to reach: photorealistic rendering.

No, because "photorealism" is not a goal that visual effects aspires to. If you can take a photo of something, then it's almost always cheaper and better to do that, even though it usually means spending many thousands of dollars on crew, makeup, sets, and lighting.

CGI is used for things that you can't take a photo of, such as a Na'vi or a talking ant. If the space ship can travel faster than light, or the penguin can dance, then "realistic" is not a goal.

(Disclaimer: I used to work in visual effects.)

Re:As long as Moore's law holds (0)

Anonymous Coward | more than 4 years ago | (#31240578)

True, except for two factors: massive parallel floating-point compute on GPUs, and the law of diminishing returns. Movies will win when it comes to crafting and polishing a few scenes and mixing with live action, but I think that due to the fallibility of our human senses and diminishing visual returns per unit of compute, the gap is closing. Of course, even movies are being rendered on GPUs now, using CUDA for compute acceleration.

Not surprised (1)

SoCalledNotion (1548979) | more than 4 years ago | (#31238430)

While the clips look amazing, I can't say that I'm in the least bit shocked by this. GPUs and CPUs are only getting more powerful, so it would stand to reason that this kind of thing is finally becoming feasible. The real question is whether or not this game

CGI-quality graphics (5, Funny)

Anonymous Coward | more than 4 years ago | (#31238446)

now there we have an accurate statement: "Computer Generated Imagery" quality graphics

"Movie-Quality" (3, Insightful)

nitehawk214 (222219) | more than 4 years ago | (#31238482)

"Movie-Quality" is basically a worthless statement. Which movie? Avatar, Final Fantasy, Toy Story, Tron? The quality of digitally produced movies, and the quality of game graphics power are constantly moving targets.

Re:"Movie-Quality" (2, Funny)

Beelzebud (1361137) | more than 4 years ago | (#31238490)

It could even be a Sci-Fi Channel movie. I have games with better graphics.

Re:"Movie-Quality" (1)

iamhassi (659463) | more than 4 years ago | (#31239140)

True, and while the Meteor [projectoffset.com] video looked impressive, it's nothing we haven't seen before in Crysis, and it's nowhere near real people walking around and speaking. I think they jumped the gun a bit on "Movie Quality"; it needs a few more years.

Re:"Movie-Quality" (3, Insightful)

drinkypoo (153816) | more than 4 years ago | (#31238500)

This is basically what I was going to say. The latest crop of "funny fuzzy animal" movies have graphics about as good as the best video games — the secret to making games look as good as movies is apparently to make movies look shitty. I just can't sit through a movie that doesn't look as good as playing a game. I also can't sit through a movie with a worse plot than nethack, but that's a separate issue. Unfortunately, the aforementioned movies suffer from both of these failings.

Re:"Movie-Quality" (1)

scdeimos (632778) | more than 4 years ago | (#31238848)

You seem to forget that those funny fuzzy animal movies are only marketing tools for the related computer games and McDonalds toys. The games are about the same in terms of look as the movies themselves because they're often based on movie assets.

For example, see an interview with the Avatar game developers [worthplaying.com] where they talk about getting the models from Lightstorm Entertainment (who were responsible for the movie graphics).

Re:"Movie-Quality" (0, Offtopic)

drinkypoo (153816) | more than 4 years ago | (#31239694)

You seem to forget that those funny fuzzy animal movies are only marketing tools for the related computer games and McDonalds toys.

That is totally false. They also make substantial money from the DVD releases.

For example, see an interview with the Avatar game developers where they talk about getting the models from Lightstorm Entertainment

Actually, that interview contains better support for your point; Cameron allegedly included (or attempted to include) a vehicle only used in the game but not actually featured in the movie... in the background of the movie, presumably in order to help sell the game. But there's so many problems with Avatar it barely bears discussing. I'll probably still see it if it's not too late to see it in IMAX 3-D by the time I get home. I saw most of it in bootleg form in a bus in Panama recently, a pretty good cam job that was obviously done with a HD camera. Unfortunately I missed the end because the driver failed to properly utilize the clutch and killed the bus, then he had to start it again and the DVD player reset, and the remote had been lost or stolen, with no controls on the unit itself. Sigh.

Re:"Movie-Quality" (1, Offtopic)

rockNme2349 (1414329) | more than 4 years ago | (#31239380)

It is like the Heisenberg Uncertainty Principal. In order to determine a particles position to a high degree of accuracy you merely need to do a shitty job measuring its velocity.

Re:"Movie-Quality" (0)

Anonymous Coward | more than 4 years ago | (#31240174)

It is like the Heisenberg Uncertainty Principal. In order to determine a particles position to a high degree of accuracy you merely need to do a shitty job measuring its velocity.

Please refran from making highly complicated jokes about extremely complicated scientific knowledge while possesing no spelling skills. It makes you look even dumberer.

Re:"Movie-Quality" (2, Interesting)

MBCook (132727) | more than 4 years ago | (#31238580)

Can anyone tell me how close we are to being able to render Toy Story in real time? Say 1080p?

I know the state of the art keeps moving, Avatar is far better looking than the original Toy Story, but with the limited visual "feature set" used in Toy Story, are we very far from being able to do something close looking in real time?

Can we do it with rasterization, now that we have so many GPU-based effects?

Re:"Movie-Quality" (4, Informative)

Anonymous Coward | more than 4 years ago | (#31238756)

Not sure, but I can tell you that we're nowhere near rendering state-of-the-art movie CGI in real time. Vertex and pixel shaders have enabled a class of effects that were previously impossible in real time, but those are all direct lighting effects or crude approximations of indirect lighting.

Shadows are not really smooth, they're just blurred. Realistic smooth shadows depend on the size of the light source and are computationally prohibitive on current hardware under real-time constraints.

Movie-quality CGI includes a class of light interactions which is currently impossible in real time, for example caustics: a caustic is light which is reflected or refracted onto a surface which reflects diffusely. Light being refracted by the surface of a swimming pool is an effect which can be faked but not simulated in real time. Render farms use an algorithm called photon mapping to simulate this and other complicated light interactions. This algorithm is conceptually related to raytracing but even more computationally intensive, and it does not map well to the hardware currently used in the real-time rendering pipeline.
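
To put a number on the soft-shadow point above, here is a minimal Python sketch of estimating an area light's visibility by casting jittered shadow rays (the scene, with a single sphere occluder, and the sample counts are invented for illustration). A hard shadow costs one ray; a believable penumbra costs dozens to hundreds, per pixel, per light, per frame:

```python
import random

def occluded(point, light_sample):
    """Stand-in for a real ray cast: a sphere occluder at (0, 1, 0), radius 0.5."""
    ox, oy, oz = point
    dx = light_sample[0] - ox
    dy = light_sample[1] - oy
    dz = light_sample[2] - oz
    cx, cy, cz = ox - 0.0, oy - 1.0, oz - 0.0   # origin relative to sphere center
    a = dx*dx + dy*dy + dz*dz
    b = 2 * (cx*dx + cy*dy + cz*dz)
    c = cx*cx + cy*cy + cz*cz - 0.25            # radius^2 = 0.25
    disc = b*b - 4*a*c
    if disc < 0:
        return False
    t = (-b - disc**0.5) / (2*a)
    return 0.0 < t < 1.0                        # occluder sits between point and light

def soft_shadow(point, light_center, light_radius, n_samples):
    """Fraction of the area light visible from `point` (0 = umbra, 1 = fully lit)."""
    visible = 0
    for _ in range(n_samples):
        # Jitter the sample across the light's extent (crude square jitter).
        s = (light_center[0] + random.uniform(-light_radius, light_radius),
             light_center[1] + random.uniform(-light_radius, light_radius),
             light_center[2])
        visible += not occluded(point, s)
    return visible / n_samples

# A point light (n_samples=1) gives a hard 0-or-1 answer; a smooth penumbra
# needs many rays, and that cost repeats for every pixel, light, and frame.
point = (0.2, 0.0, 0.0)
for n in (1, 16, 256):
    print(n, "samples ->", round(soft_shadow(point, (0, 3, 0), 1.0, n), 3))
```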

Re:"Movie-Quality" (1)

Latinhypercube (935707) | more than 4 years ago | (#31239662)

They are getting close, though. Mental Ray, which is the renderer of choice on most feature films, was recently bought by Nvidia, which is about to release iRay. iRay uses Nvidia cards to accelerate photoreal global illumination: utterly accurate, real-world lighting. Some setups show a new form of progressive rendering, where you see the finished render resolve gradually. And the system scales up linearly, so multiple cards could equal movie-quality renders in realtime. Exciting times. Good news for Nvidia.
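
"Progressive rendering" here boils down to displaying a running average of noisy per-pass estimates. A toy sketch of the idea (the noisy_sample() stand-in below is invented for illustration, not iRay's API):

```python
import random

def noisy_sample(true_value=0.6, noise=0.5):
    """One Monte Carlo estimate of a pixel: unbiased but noisy."""
    return true_value + random.uniform(-noise, noise)

accum = 0.0
for n in range(1, 1025):
    accum += noisy_sample()
    running_mean = accum / n          # what the viewport displays after pass n
    if n in (1, 4, 16, 64, 256, 1024):
        err = abs(running_mean - 0.6)
        print(f"pass {n:5d}: displayed value {running_mean:.4f} (error {err:.4f})")

# Error shrinks roughly as 1/sqrt(n): four times the passes for half the
# grain, which is why linear scaling across multiple cards matters so much.
```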

Re:"Movie-Quality" (1, Interesting)

Anonymous Coward | more than 4 years ago | (#31238828)

I imagine you could do it on any decently powered computer nowadays. The thing is, you couldn't just play it back: realtime rendering and that sort of movie rendering use wildly different techniques, so you would have to remake a lot of the film. Secondly, you probably couldn't pan the camera much; back then it was so stressful on their computers that they probably removed most of the unseen faces.

Re:"Movie-Quality" (2, Interesting)

afidel (530433) | more than 4 years ago | (#31238912)

Toy Story isn't particularly difficult to render; even at the time you could render scenes with better quality in a matter of minutes, so with a decade and a half of doubling every 18 months, I'm pretty sure it could be done by your average gaming GPU in realtime. The biggest problem was sufficient memory for texture and model detail, but with 2GB of RAM available on consumer-level video cards, I don't think that's such a big deal these days.

Re:"Movie-Quality" (2, Interesting)

Jonner (189691) | more than 4 years ago | (#31239414)

If Pixar had been able to render scenes with better quality in a matter of minutes, they wouldn't have needed over 100 machines [findarticles.com] in their render farm. In fact, each frame took "from two to 13 hours."

Re:"Movie-Quality" (2, Interesting)

afidel (530433) | more than 4 years ago | (#31239590)

Well, I was talking about a year after the movie came out; obviously the stuff *before* the movie came out, used on the multi-year project, would have been less powerful. Figure 120 minutes per frame; three doublings of CPU power means divide by eight, and you get 15 minutes. Increase RAM and you can use better textures or more complex polys.

Re:"Movie-Quality" (2, Informative)

afidel (530433) | more than 4 years ago | (#31239704)

Just found some numbers: the SPARC CPUs in the SS20s used for Toy Story were capable of 15 MFLOPS peak; an Alpha 21164 at 433 MHz, which came out about six months after the movie, could do over 500 MFLOPS peak, or about 30 times more. Even the PPro 200 could do 150 MFLOPS.

Re:"Movie-Quality" (0)

Anonymous Coward | more than 4 years ago | (#31239918)

Peak-to-peak comparisons are only good for marketers; they do not scale the same way on different architectures. Hell, they probably don't even scale the same way with the same chip in a different model of motherboard.

Re:"Movie-Quality" (1)

BikeHelmet (1437881) | more than 4 years ago | (#31239964)

But GPUs are about 100x faster than CPUs at rendering. Imperfect rendering, but with how much they've advanced, they'd do fine for something like Toy Story.

Factor in the doubling of speed every X months, and a high-end modern GPU could probably render Toy Story in realtime at 1600p, no problem.

The guy below you says those machines had a theoretical peak of 15 MFLOPS each. Pretty soon GPUs will be approaching ~2-3 TFLOPS (theoretical), so estimating low: 1,500,000 MFLOPS / 15 MFLOPS = 100,000 times faster than each of those machines.

If it were a task that didn't split well, I'd say software inefficiency might prevent the GPU from managing it, but rendering is what GPUs excel at.

Re:"Movie-Quality" (2, Interesting)

Pseudonym (62607) | more than 4 years ago | (#31240192)

Blinn's Law states that the amount of time it takes to compute a frame of film remains constant over time, because audience expectation rises at the same speed as computer power.

I think it was Tom Duff who commented that one eyeball in a modern Pixar film requires roughly the same amount of work as a frame of Toy Story.

Re:"Movie-Quality" (3, Interesting)

Zerth (26112) | more than 4 years ago | (#31240022)

According to this [fudzilla.com], the original Toy Story needed about 7 TFLOPS to render in real time, although I've seen higher estimates.

87 dual-processor and 30 quad-processor 100-MHz SPARCstation 20s took 46 days to do ~75 minutes, so you need to be 883.2 times as fast to render in realtime. Anyone overclock a quadcore processor to 8 GHz? I suppose a setup with 4 quadcore CPUs @ 2GHz isn't out of reach.

But then again, the machines might have been IO bound instead of CPU bound, needing to send 7.7 gigabytes per second.
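
The 883.2x figure checks out, and combining it with the 15 MFLOPS/CPU peak quoted elsewhere in the thread lands in the same ballpark as the ~7 TFLOPS estimate (all inputs are the thread's numbers, not independently verified):

```python
# 87 dual- and 30 quad-CPU 100 MHz SPARCstation 20s, 46 days of wall time
# for ~75 minutes of film.
render_minutes = 46 * 24 * 60                  # 66,240 minutes of rendering
film_minutes = 75
speedup = render_minutes / film_minutes
print(f"speedup needed: {speedup:.1f}x")       # 883.2x, matching the comment

# Cross-check against the ~7 TFLOPS claim, using 15 MFLOPS peak per SS20 CPU:
cpus = 87 * 2 + 30 * 4                         # 294 CPUs in the farm
farm_mflops = cpus * 15                        # ~4.4 GFLOPS aggregate peak
realtime_tflops = farm_mflops * speedup / 1e6
print(f"implied realtime budget: ~{realtime_tflops:.1f} TFLOPS")   # ~3.9 TFLOPS
# Same order of magnitude as the quoted ~7 TFLOPS, allowing for peak-vs-
# sustained fuzziness and the "higher estimates" the parent mentions.
```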

If I were a betting man (1)

$RANDOMLUSER (804576) | more than 4 years ago | (#31238582)

I'd wager that their solution is way more CPU-intensive than GPU-intensive. Or maybe I'm just paranoid.

So what if it is? (1)

dreamchaser (49529) | more than 4 years ago | (#31239676)

As long as it gets the job done, it's an interesting innovation. Real-time rendering of game- or modern-movie-quality CGI would be a good thing regardless of how it's implemented.

Re:If I were a betting man (1)

drinkypoo (153816) | more than 4 years ago | (#31239766)

I'd wager that their solution is way more CPU-intensive than GPU-intensive.

I'd bet you're right... and you'll be able to do this stuff in realtime at home as soon as you have thousands of cores [cnet.com]. More seriously, though, a future without GPUs would be a good thing, if we could get the same performance (or better) without them. Why? Because in order to use the full power of a computer with a big GPU, you have to do two kinds of programming. A computer where all the powerful processing elements were identical would be much easier to fully utilize, and that means less wasted money.

Who will write the software for the bird? (1)

ipquickly (1562169) | more than 4 years ago | (#31238608)

At what point will the hardware capabilities exceed the software we can write?

If we have the hardware to simulate 'The Matrix', but no-one has written the software to make it realistic, what do we gain?

Re:Who will write the software for the bird? (1)

mustafap (452510) | more than 4 years ago | (#31238770)

>At what point will the hardware capabilities exceed the software we can write?

With the state that the education system is in, I'd say not far off at all.

Re:Who will write the software for the bird? (0)

Anonymous Coward | more than 4 years ago | (#31239796)

>At what point will the hardware capabilities exceed the software we can write?

With the state that the education system is in, I'd say not far off at all.

Um... you know hardware doesn't get faster on its own, right? It is designed by humans who are subject to the same education system as programmers.

Re:Who will write the software for the bird? (5, Insightful)

PotatoFarmer (1250696) | more than 4 years ago | (#31238832)

At what point will the hardware capabilities exceed the software we can write?

Never. More hardware means programmers can get away with writing less efficient code.

Re:Who will write the software for the bird? (0)

Anonymous Coward | more than 4 years ago | (#31240372)

Never. More hardware means programmers can get away with writing less efficient code.

Wasting resources deliberately or due to ignorance is a bad thing, but taking advantage of ever-increasing CPU and memory capacities in order to boost to a new layer of abstraction is not. Highly efficient code tends to be more complex. This means it is harder to maintain and harder to understand, usually. More importantly, it is harder to demonstrate its correctness.

As both a producer and consumer of open source software, I far prefer a piece of code that I can understand, but which is possibly not as efficient as it could be, to a mystifying jungle of cleverness which takes weeks to understand and hours just to make small modifications. Not to mention the inability to completely test those modifications, because the code is so hard to follow I can't figure out all the ramifications of even simple changes.

Re:Who will write the software for the bird? (1)

frieko (855745) | more than 4 years ago | (#31239432)

Moore's law isn't some sort of natural occurrence, it's economics. If hardware pulls ahead, hardware engineers switch to writing software. If software pulls ahead they switch to writing hardware.

slashdotted (0)

Anonymous Coward | more than 4 years ago | (#31238644)

someone please tag the article as /.'ed

server is already showing a 500

As a former (contract) developer on Project Offset (5, Interesting)

whiplashx (837931) | more than 4 years ago | (#31238686)

4 or 5 years ago, it was basically comparable to Unreal 3. The motion blur was probably the best feature I saw. Fine graphics, but nothing really mind-blowing. Having said that, I have not seen what they've done since Intel bought them, but I'm guessing it's basically support for Intel's research projects.

As a developer of modern console and PC games, My Professional Opinion is that there's nothing new to see here.

Re:As a former (contract) developer on Project Off (0)

Anonymous Coward | more than 4 years ago | (#31238718)

Having said that, I have not seen what they've done since Intel bought them, but I'm guessing its basically support for Intel's research projects.

Why even guess? The article there shows you exactly what they've done since Intel bought them.

Oh wait, this is /. No one reads the articles.

Re:As a former (contract) developer on Project Off (1)

skyride (1436439) | more than 4 years ago | (#31239006)

You will never be able to get a true representation of graphics from a compressed image or video. The only way to truly show it would be something like Lagarith or bitmap, or, you know, to actually see it in person.

linkzzz (0)

Anonymous Coward | more than 4 years ago | (#31239386)

http://www.youtube.com/watch?v=FpdPWVfaBQs&feature=related

Three years ago. Combat animations certainly aren't on par with GTA4 (Euphoria) and this demo at least reeks of Oblivion imo.

http://www.youtube.com/watch?v=jVk1GArKqfo&feature=player_embedded

New demo is... kinda better. Impressive PhysX(R) (TM)

Re:As a former (contract) developer on Project Off (1)

nacturation (646836) | more than 4 years ago | (#31240680)

Exactly. I've been following them for about four years now, and they occasionally throw out a few interesting videos and such, but ultimately I haven't seen anything new from their team in quite some time. It was an interesting choice selling out to Intel of all places... I only hope they don't turn it into another Duke Nukem: Forever.

So... (0)

Anonymous Coward | more than 4 years ago | (#31238704)

will they actually spend more time on the gameplay now, or is this just a new plateau in making barely interactive movies disguised as games?

Not tied to their parallel HW now... (0)

Anonymous Coward | more than 4 years ago | (#31238714)

It's no longer hamstrung by Larrabee, so it should be pretty cool if it ships; they actually purchased this company a year or two ago. I hope they release the engine, because the Unreal Engine UDK is now free to develop with, and you license it when you ship under flexible terms. It'd be nice to get these tools from Intel; until it ships, it's just a science project. There are already games which do all of these rendering effects with real game data and very large paged worlds.

Priorities first! (1)

Stormwatch (703920) | more than 4 years ago | (#31238894)

Attention, developers: graphics are not the most important thing.

For example, the two Sonic Adventure games for the Dreamcast were imperfect but very enjoyable. Now check Sonic The Hedgehog for PS3/X360: it looked far better, but it had craploads of game-breaking glitches, long loading times, and overall poor design, so the reviews were mostly negative. Another example: Doom. Everyone loved the first two games... then came Doom 3, which looked stunning but played more like a survival horror game. How can someone take such a wild, frantic, exhilarating series and make something so boring out of it?

So, first make a game that PLAYS well, then make it look good.

Who are you arguing with? (1)

beakerMeep (716990) | more than 4 years ago | (#31238972)

Who said graphics are the most important thing? Why do people always defensively trot out this argument when advances are made in graphics?

Re:Priorities first! (1)

Tynin (634655) | more than 4 years ago | (#31239264)

Doom. Everyone loved the first two games... then came Doom 3, which looked stunning but played more like a survival horror game. How can someone take such a wild, frantic, exhilarating series and make something so boring out of it?

That really is a sad statement on how far survival horror games have fallen, when someone thinks Doom 3 fits that genre. Doom 3 was just a crappy FPS... walk into dark room, shoot the bad guy that is always positioned in an out-of-the-way corner, rinse and repeat. It was never survival horror... if I told my wife what you said, she would be quaking with fury, and her ranting would be epic. She and I both miss the glory days of survival horror...

That said, the rest of your point still stands and I agree with you. Enjoyable gameplay should always be the focal point that everything else branches off of. Instead, what we get are games that are graphically polished but play more like tech demos showing how pretty the latest and greatest video card can make everything.

Re:Priorities first! (1)

aronschatz (570456) | more than 4 years ago | (#31239818)

Wait, have you played the original Sonic games on the Genesis? If so, how can you say that ANY 3D Sonic game is good?

Hopefully SEGA won't screw up with Sonic 4. After the crap they continually push out in the Sonic realm, one must wonder...

This is actually quite old... (1)

nickdwaters (1452675) | more than 4 years ago | (#31239042)

I've been watching the development of Project Offset for at least 3 years. Sam McGrath and Co. are doing great things. The stuff that was shown was built way before Intel bought into them.

Re:This is actually quite old... (1)

nickdwaters (1452675) | more than 4 years ago | (#31239084)

I would also like to add that Red5 Studios is using the Offset engine for an MMO project they have, I believe, in Korea/Asia.

I must be jaded (1)

Voyager529 (1363959) | more than 4 years ago | (#31239086)

So I went to the link in the summary to see the video, and I MUST be too jaded. It looked *exactly* like a level from Unreal Tournament 3. I love that game, so that's all well and good. I'm sure my laptop could render that youtube clip in realtime without a problem. It still seemed fake to me. The movement of the foliage was too "calculated", as was much of the debris when it fell. The camera motion was "too perfect" and looks exactly like what my camera moves look like in After Effects, which bear very little resemblance to what a camera move looks like for real.

Better than Mass Effect? yeah. Better than Counterstrike/Half-Life/Half-Life 2? you bet. Better than UT3 or Crysis? That, I feel, is debatable. If it's debatable, then I'm not certain that there is a breakthrough here. But that's just me, and I could be completely off-base here.

Re:I must be jaded (1)

gnud (934243) | more than 4 years ago | (#31239350)

The camera path can be set beforehand, and the scene can still be rendered in real time.
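
For anyone wondering what "set beforehand" looks like in practice: the move is authored as a handful of keyframes, and a smooth curve through them is sampled each frame while the scene renders live. A minimal sketch (Catmull-Rom is one common choice; the keyframe values here are made up):

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Interpolate between p1 and p2 (t in [0,1]) using neighbors p0 and p3."""
    return tuple(
        0.5 * (2*b + (-a + c)*t + (2*a - 5*b + 4*c - d)*t*t
               + (-a + 3*b - 3*c + d)*t*t*t)
        for a, b, c, d in zip(p0, p1, p2, p3))

# Hand-authored camera keyframes (x, y, z), one per second of the flythrough.
keys = [(0, 2, 10), (4, 3, 6), (6, 2, 0), (4, 4, -6), (0, 2, -10)]

def camera_position(time_s):
    """Sample the pre-set path at an arbitrary time, once per rendered frame."""
    seg = min(int(time_s), len(keys) - 2)      # which keyframe span we're in
    t = time_s - seg                           # position within that span
    p0 = keys[max(seg - 1, 0)]                 # clamp neighbors at the ends
    p3 = keys[min(seg + 2, len(keys) - 1)]
    return catmull_rom(p0, keys[seg], keys[seg + 1], p3, t)

# Evaluating the spline is trivially cheap; the expensive per-frame work is
# rendering the scene, which is why a canned path can still run in real time.
for second in range(5):
    x, y, z = camera_position(float(second))
    print(f"t={second}s  camera at ({x:.2f}, {y:.2f}, {z:.2f})")
```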

Re:I must be jaded (1)

Voyager529 (1363959) | more than 4 years ago | (#31240122)

Agreed; that seems to be common practice in video games. The point I was getting at is that the paths seem a little "too perfect", and the motion itself seems a bit linear and calculated. I'm not saying that they need to have Michael Bay program the cameras, but for true photorealism, the camerawork needs to be less computationally convenient.

define movie quality (3, Insightful)

poly_pusher (1004145) | more than 4 years ago | (#31239096)

As stated by other posters, "film quality" is misleading. Primarily it refers to resolution; remember that many cameras record at up to 4K, so the ability to render in real time at ultra-high res is definitely sought after.

Currently, the big push in 3d rendering is towards physically based raytrace or pathtrace rendering.
http://en.wikipedia.org/wiki/Path_tracing [wikipedia.org]
http://en.wikipedia.org/wiki/Ray_tracing_(graphics) [wikipedia.org]
Physically based rendering produces a much more accurate representation of how light interacts with and between surfaces. It has always taken a long time to render using physically based techniques, due to the huge number of calculations necessary to produce a grain-free image. This has changed somewhat recently with multi-core systems, and with GPGPU languages such as CUDA and OpenCL we are about to experience a big and sudden increase in performance for these rendering technologies.

While this game looks great, the engine is by no means going to be capable of rendering scenes containing hundreds of millions of polygons, ultra-high-res textures, physically accurate lighting and shaders, and high render resolution. We are still pretty far away from real-time physically based rendering, which is the direction film is currently headed. That would have to be the definition of "Movie-Quality CGI", and this game does not live up to it.
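
To see where the huge calculation counts and the grain come from, here is a toy path-tracing estimator in Python. It uses a "white furnace" test scene, which has a closed-form answer; this is the textbook Monte Carlo idea, not Offset's engine or any film renderer's actual code:

```python
import random

RHO, EMIT = 0.7, 1.0               # surface albedo and emission (made up)
EXACT = EMIT / (1 - RHO)           # closed form: geometric series of bounces

def trace_path():
    """Follow one light path; Russian roulette keeps the estimate unbiased."""
    radiance, throughput = 0.0, 1.0
    while True:
        radiance += throughput * EMIT    # emission gathered at this bounce
        if random.random() >= RHO:       # roulette: absorb with prob 1 - RHO
            return radiance
        # Survive and bounce again. Dividing by the survival probability RHO
        # cancels against the albedo RHO, so throughput stays unchanged.

def render_pixel(spp):
    """Average many independent paths: samples per pixel (spp)."""
    return sum(trace_path() for _ in range(spp)) / spp

for spp in (1, 16, 256, 4096):
    print(f"{spp:5d} spp: {render_pixel(spp):.3f}  (exact {EXACT:.3f})")
```

At low sample counts the per-pixel estimates scatter widely around the exact value, which on screen shows up as grain; getting a clean image means thousands of paths per pixel, and that is the gap between realtime engines and film render farms.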

Not impressed (1)

Charliemopps (1157495) | more than 4 years ago | (#31239158)

I didn't see anything in those videos that I can't find in most modern games. Also, they say it's rendering in real time... so what? You can sit and optimize a scene for weeks before releasing it. It's when you get half a dozen real players running in unpredictable directions and patterns that an engine either shines or fails.

Gameplay (0)

Anonymous Coward | more than 4 years ago | (#31239180)

Hopefully games will soon render so realistically that no one will be able to tell them from real life. That will hopefully end the obsession with shiny graphics at the expense of all else in games; then, finally, developers can start work on elements like depth of gameplay experience. Or maybe they will just move their attention to smell synthesizers, and try to get it so you can smell those burning corpses you fragged with generic first-person gun #23. Seems like the movie industry could also benefit from forgetting about shiny CGI a bit, in favour of things like interesting plot development, character depth, etc.

If CGI is so great, then why wasn't Star Wars Episode I better than Episode V?

Re:Gameplay (1, Interesting)

Anonymous Coward | more than 4 years ago | (#31239498)

Well, the problem is that even if the game engines can render and animate photorealistic graphics in real time, you still need the artists to produce the models and textures at the requisite quality.

It used to be if you wanted a building, you could draw a few walls and slap on a texture. Nowadays, especially if you want destructible physics, you'd practically have to draw out a CAD model.

I think that's one reason why most full-CGI 3D movies are animated/cartoony rather than realistic. It's too much work to make everything perfectly lifelike, and you run the risk of falling into the uncanny valley if you do it wrong.

Pictures? (2, Interesting)

incubbus13 (1631009) | more than 4 years ago | (#31239308)

Okay, so this is slightly off-topic, but something I've always wondered about.

I can take a 12-megapixel picture and reduce it down to a 12k .gif, or 120k, or whatever the compression results are.

At that point, it's just a .gif. (or .jpg or whatever). The computer doesn't know it's any different than a .gif I created in MSPaint, right?

So I could open GameMaker 7 and use that photo as one of the frames in my character's animation; by repetition, I could create a character moving and walking frame by frame.

Right? What's wrong with this?

I understand that on-the-fly rendering is nice, and that the goal is to get a computer to generate a "real" picture. But the difference between a "great" game and an okay one is the graphics. I could (if I could draw) take a pencil and do one of those black-and-white sketches that almost looks like a photo, and scan it in and use it too.

What are the technical hurdles or barriers that prevent someone from just doing this?

K.
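
The short technical answer to the question above: nothing stops you for a fixed camera (that is exactly how 2D sprite animation works), but prerendered frames must cover every view the player might need, and a free camera makes that blow up. A back-of-envelope sketch with invented numbers:

```python
frame_kb = 120                      # one compressed frame, per the parent

# Fixed-view sprite (a GameMaker-style walk cycle): a handful of frames.
walk_frames = 8
print("walk cycle:", walk_frames * frame_kb, "KB")          # ~1 MB, easy

# Free-roaming 3D: the character can be seen from any yaw and pitch, at any
# animation frame, under any lighting condition...
yaw_steps, pitch_steps, anim_frames, lighting_setups = 64, 16, 30, 8
frames_needed = yaw_steps * pitch_steps * anim_frames * lighting_setups
print("free camera:", frames_needed * frame_kb // 1024, "MB per character")
# ~28,800 MB (about 28 GB) per character, which is why engines store one 3D
# model and *render* the needed view each frame instead of looking it up.
```

That combinatorial explosion, not any limitation of .gif files, is the barrier; the replies below make the same point via Myst.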

Re:Pictures? (0)

Anonymous Coward | more than 4 years ago | (#31239390)

You don't play many 3D games, do you?

Re:Pictures? (1)

h4rr4r (612664) | more than 4 years ago | (#31239488)

At one time people did that; see the famous game Myst.

These days people like moving wherever they want.

Re:Pictures? (4, Interesting)

am 2k (217885) | more than 4 years ago | (#31239564)

Actually, that's how the characters in the older Myst games worked (except that they used this great new technology called "video camera" to get moving pictures into them).

This was fine in those games, because the viewpoint was always fixed. That's a restriction you don't want to have in current games.

Boof (2, Interesting)

Windwraith (932426) | more than 4 years ago | (#31239328)

So this means we are going to see games with movie budgets and no gameplay at all... we already do, but the balance will tilt against gameplay even further, given where the manpower goes.

Movie quality? (0)

Anonymous Coward | more than 4 years ago | (#31239492)

nVidia beat you to it nearly ten years ago, Intel... the GeForce 3 [nvidia.com] was used to create "Final Fantasy: The Spirits Within" big-screen movie.

Where is the AI? (0)

Anonymous Coward | more than 4 years ago | (#31240504)

I will be in awe when we have realistic AI. I would prefer a pixelated character with realistic AI over a brain-dead but realistically rendered one.

I think it is easier to push multiplayer (no need for AI) and realistic graphics than realistic artificial intelligence.
