
2000x GPU Performance Needed To Reach Anatomical Graphics Limits For Gaming?

Soulskill posted more than 2 years ago | from the perfectly-photorealistic-rocket-jumps dept.

Graphics 331

Vigile writes "In a talk earlier this year at DICE, Epic Games' Tim Sweeney discussed the state of computing hardware as it relates to gaming. While there is a rising sentiment in the gaming world that the current generation consoles are 'good enough' and that the next generation of consoles might be the last, Sweeney thinks that is way off base. He disputes the claim with some interesting numbers, including the amount of processing and triangle power required to match human anatomical peaks. While we are only a factor of 50x from the necessary level of triangle processing, a 2000x increase is required to reach the 5000 TFLOPS Sweeney thinks will be needed for the 8000x4000 resolution screens of the future. It would seem that the 'good enough' sentiment is still a long way off for developers."

Development costs? (4, Interesting)

oakgrove (845019) | more than 2 years ago | (#39291823)

My question is this: how much more will games have to cost to support the development to this level of detail?

I want this card... (1, Funny)

Anonymous Coward | more than 2 years ago | (#39291923)

...this would be awesome for bitcoin mining.

Re:Development costs? (5, Insightful)

SlightlyMadman (161529) | more than 2 years ago | (#39291943)

It shouldn't make a huge difference, actually. Things like trees and faces are already rendered to a complexity beyond where it's reasonable to create them by hand. That's why there are 3rd-party utilities to render these things easily, with some simple inputs, like plugging a formula into a fractal generator. You don't have to hand-design an NPC's face any more than their parents had to piece their fetus together. You plug in the DNA and the code does the rest.
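A minimal sketch of that idea in Python (all parameter names and ranges here are made up for illustration, not anyone's actual tooling): seed a generator with a single "DNA" value and derive the face deterministically from it, so an entire crowd costs nothing extra to author.

    import random
    from dataclasses import dataclass

    @dataclass
    class Face:
        jaw_width: float     # multipliers applied to a base head mesh
        nose_length: float
        eye_spacing: float
        skin_tone: float     # position along a tone gradient

    def generate_face(dna_seed: int) -> Face:
        """The same seed always yields the same face, so only the seed
        ever needs to be stored or shipped with the game."""
        rng = random.Random(dna_seed)
        return Face(
            jaw_width=rng.uniform(0.8, 1.2),
            nose_length=rng.uniform(0.9, 1.1),
            eye_spacing=rng.uniform(0.95, 1.05),
            skin_tone=rng.uniform(0.0, 1.0),
        )

    # A thousand distinct NPC faces from nothing but integers.
    crowd = [generate_face(seed) for seed in range(1000)]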

Re:Development costs? (2)

Dahamma (304068) | more than 2 years ago | (#39292469)

Yeah, but you are now limited to those algorithms, which will be way inadequate when other objects, textures, etc. are photorealistic. All the new R&D to come up with something better is an extra cost as well.

Not to mention you have to improve everything to match when you improve the poly count and lighting - making perfectly lifelike characters does no good if their animations look robotic... yet more dev costs (probably in algorithms, mo-cap, and hand-animated/tweaked animations/keyframes/etc.).

Added complexity takes added effort, which takes added $$$. If it didn't, today's CGI movies wouldn't have to keep adding more and more TDs, animators, artists, etc.

Re:Development costs? (1)

Hentes (2461350) | more than 2 years ago | (#39291945)

Don't worry; most studios don't spend much on salaries - the big costs are marketing and management.

Re:Development costs? (2)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39291989)

My suspicion would be that the level of detail that is commercially viable will depend largely on the availability of tools to generate it more efficiently (and recycle what you've already generated without being too blatant about it)...

Something like a high-resolution 3D laser scan and motion capture isn't cheap; but if you can take a library of captured actors and then programmatically mix-and-match and slightly randomize certain parameters to generate unlimited NPCs, the start-up cost is high while the incremental cost of adding additional mooks becomes relatively cheap.

Same thing would go for programmatically generated trees, grabbing textures with (relatively) cheap high-resolution digital cameras, and so on.

Re:Development costs? (1)

masternerdguy (2468142) | more than 2 years ago | (#39292505)

Duh. If our eyes can't capture the detail, we should build better ones.

Re:Development costs? (0)

Anonymous Coward | more than 2 years ago | (#39292001)

It really depends on how they cut their corners. They could simply use real actors and acquire the 3D representation at a much finer granularity than they do presently. Full-body mocap and facial animation mocap have already proven to be cheaper and to produce more realistic results than doing it all by hand, so why not capture the texture and geometry of the face/body the same way?

The biggest problem with mocap/imaging technologies today is that they often produce too much data and require artists to post-process and simplify them. But this is the sort of data we'd need for this level of detail, given the amount of processing power.

I'd be curious whether this sort of mapping technology plus an old-school movie makeup artist could produce a better-looking "Orc" than, say, a regular 3D modeler doing it all by hand.

Re:Development costs? (5, Funny)

DahGhostfacedFiddlah (470393) | more than 2 years ago | (#39292005)

Same as any other new technology.

In 2028, James Cameron will spend 3.2 trillion dollars on Avatar: Reloaded. You'll spend $50 to see it once in theatres.

The technology will be ported to games about 5 years after that, costing $60/game (top-tier game prices haven't changed since 1980).

5 years after that, $2 flash games will all include photo-realistic graphics at 200 fps.

Re:Development costs? (5, Funny)

r1348 (2567295) | more than 2 years ago | (#39292243)

The possibility that Flash might still be around in 2038 frightens me to the bone.

Re:Development costs? (4, Funny)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#39292727)

What else will enable rich multimedia experiences on IE6 Legacy Essentials for Positronic Brainstem Implants Enterprise Edition?

Re:Development costs? (1)

AJH16 (940784) | more than 2 years ago | (#39292487)

What is this about games not having changed in cost since the 1980s? AAA games used to be $50, not $60. The $60 price point started relatively recently (within the last 10 years or so) with the advent of heavily subsidized gaming consoles, and stayed there once console makers realized what licensing fees they could get away with charging developers for their hardware. I'd expect one more jump by then (though in fairness, it will be more or less cost-equivalent once you factor in inflation).

Re:Development costs? (2, Informative)

Anonymous Coward | more than 2 years ago | (#39292677)

No.

Many SNES games were 75-80 dollars.

Re:Development costs? (4, Insightful)

mcgrew (92797) | more than 2 years ago | (#39292087)

It's not the cost of the games, but the cost of the hardware. That's one reason I got out of the gaming scene -- to play a new game you had to have the latest, greatest, fastest, most expensive hardware.

Sweeney and company need to get a clue. I'm a nerd, but I'm not Steve Wozniak. I have bills to pay and much better things to do with my time and money than to spend half a C-note on hardware, and take the time to install it, just to play a $50 game I might not even enjoy that much.

I mean, it's a GAME. I don't care that every hair on Duke Nukem's head is perfectly rendered. I just want it to be FUN.

Re:Development costs? (2, Informative)

Anonymous Coward | more than 2 years ago | (#39292307)

I have bills to pay and much better things to do with my time and money than to spend half a C-note on hardware

Fifty dollars on hardware doesn't sound that bad, honestly. Presumably that will last for at least a couple years.

Re:Development costs? (2)

Bengie (1121981) | more than 2 years ago | (#39292477)

IGPs can play Crysis now. GPUs aren't the bottleneck anymore; it's how many commands you can issue to the GPU. That part is limited by IPC*MHz and the number of threads.

Less (2)

HalAtWork (926717) | more than 2 years ago | (#39292253)

Developers used to start off with high resolution models and have to pare down the triangle count and adjust textures to meet memory and processing requirements. In the future, they won't have to do all of that tweaking and will be able to use full resolution models, so it will probably be cheaper.
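One concrete form that paring-down takes is authoring a handful of fixed detail levels and letting the engine pick the cheapest acceptable one at runtime. A minimal sketch (the distance thresholds and names are invented for illustration):

    def pick_lod(distance_m: float, lods: list):
        """Return the cheapest pre-authored mesh that still looks fine.

        `lods` is ordered from highest to lowest detail; the thresholds
        are in metres and purely illustrative.
        """
        thresholds = [10.0, 40.0, 150.0]
        for mesh, limit in zip(lods, thresholds):
            if distance_m <= limit:
                return mesh
        return lods[-1]  # beyond the last threshold, use the coarsest mesh

    # Usage (hypothetical assets): hero_lods = [mesh_20k_tris, mesh_5k_tris, mesh_800_tris]
    # visible_mesh = pick_lod(distance_to_camera, hero_lods)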

Also, not all games aim for realistic depictions, many (most?) are stylized, and won't necessarily need to be highly detailed. The extra processing power could go to effects, deformation, physics, etc.

Re:Development costs? (1)

jd (1658) | more than 2 years ago | (#39292357)

Less than at present. Currently, they have to hard-code data and pre-render bumpmaps. That's expensive. The more realistic you want to make something, the more abstract you want the model*, which means less work for the designers and less computer time spent pre-generating things.

*An abstract model can be rendered under a wider range of conditions and thus look real under them. A pre-generated bitmap only looks realistic under very specific conditions. At best. Letting the computer do the work, rather than the designer/coder, means getting more out for less development time in.

Re:Development costs? (1)

shadowrat (1069614) | more than 2 years ago | (#39292459)

Less, because you'll just be able to scan real-world objects without doing any cleanup on the resulting meshes. When it comes to things that don't exist, like alien monsters and battle armor, you can model them using techniques that waste more triangles without worrying about it.

Tim "the Toolman" Sweeney says... (1)

Anonymous Coward | more than 2 years ago | (#39291825)

It needs more power. OH HO HO!

I'm all for (4, Funny)

Bodhammer (559311) | more than 2 years ago | (#39291839)

better looking "anatomical peaks"!

Re:I'm all for (1)

sexconker (1179573) | more than 2 years ago | (#39292055)

better looking "anatomical peaks"!

I prefer valleys.

Re:I'm all for (1)

Bodhammer (559311) | more than 2 years ago | (#39292165)

Actually, I prefer both if "you know what I mean, nudge, nudge, wink, wink." ;-)

Re:I'm all for (0)

Anonymous Coward | more than 2 years ago | (#39292417)

OH HO HOOOOOOOOOOOOOOOOOOOOO is funny because they mean tits and ass, but are using landscape as a metaphor.

Re:I'm all for (5, Funny)

girlintraining (1395911) | more than 2 years ago | (#39292065)

better looking "anatomical peaks"!

Yeah, me too. To date, I have yet to see a video game character with a realistic-looking male crotch. Those poor, poor bastards. And yet, so many guys look at those video game characters as heroes in spite of their status as eunuchs. I hope that with this latest advancement in technology, men will finally get some anatomical upgrades so they can be, you know... men.

Re:I'm all for (0)

Anonymous Coward | more than 2 years ago | (#39292507)

It will only happen when someone decides to fund a 3D CGI motion capture porn studio and production company.

Re:I'm all for (0)

Anonymous Coward | more than 2 years ago | (#39292509)

Well played!

Re:I'm all for (0)

Anonymous Coward | more than 2 years ago | (#39292399)

I think you've misunderstood TFA; they are talking about better silicon, not silicone.

Cheaper solution! (1)

galanom (1021665) | more than 2 years ago | (#39292593)

It would be cheaper to hire a hooker.

Re:I'm all for (1)

eyenot (102141) | more than 2 years ago | (#39292745)

I said it once before, I'll say it again: pixel boobs should pixel bounce.

640K (1)

dmt0 (1295725) | more than 2 years ago | (#39291865)

640K should be enough for anyone

Re:640K (4, Funny)

Anonymous Coward | more than 2 years ago | (#39291969)

I came here to read this. I leave satisfied.

Re:640K (1)

gtirloni (1531285) | more than 2 years ago | (#39292187)

If it's 640K TFLOPS, I agree.

Rasterization (0)

Anonymous Coward | more than 2 years ago | (#39291869)

And he's only talking about rasterization. Expect a switch to raytracing somewhere in the not so near future.

Re:Rasterization (0)

Anonymous Coward | more than 2 years ago | (#39291925)

lol

Because raytracing uses so much less computing resources? Or, because you don't really know what it is?

Re:Rasterization (3, Funny)

binarylarry (1338699) | more than 2 years ago | (#39291953)

Think of how glorious the reflective spheres and checkerboards will be bro!

Re:Rasterization (4, Insightful)

localman57 (1340533) | more than 2 years ago | (#39292023)

And he's only talking about rasterization. Expect a switch to raytracing somewhere in the not so near future.

But that won't really matter either. The problem at this point isn't the number of pixels, or the number of polygons, or the depth or resolution of the textures. It's the fact that the image is being projected on a rectangle with a strip of plastic around it. In the end, what we are really shooting for is what literature people call "suspension of disbelief". You can only get so far looking into a glowing rectangle. The wrap-around screens of Eyefinity help some, and 3D glasses have the potential to help a little bit.

The reality is that the most immersive gaming experience I've had was in the mid-to-late 90's, when I was hooked up to a real VR system with a helmet and held a gun with approximately Wii-controller input capability. That system, despite rendering capability that was craptacular by today's standards, was far more immersive, because being able to see my entire environment by moving my neck and body mattered more than the quality of the environment itself.

Re:Rasterization (1)

jd (1658) | more than 2 years ago | (#39292259)

CAVE systems (such as the "hamster balls" where you have 360/360 vision) offer the best immersive experience. However, I *still* disagree with this "suspension of disbelief" concept. JRR Tolkien described it as being a failure of the person creating the experience and I'll take the opinion of an expert over and above the opinion of just about anyone else.

The number of pixels and the quality of the textures DO matter at this point - you have to cross the Uncanny Valley completely before the perceived quality goes up. The perceived quality will actually FALL until you reach the other side.

Raytracing will help, provided radiosity is added in (raytracing is lousy at diffuse reflections, which matter if you want true realism). Photon tracing and photon mapping are even better, but the computational cost goes up accordingly. To get audio to match video, you really want to use sound-tracing techniques too. You have to have the sound (echoes included) match expectations or the brain will detect the mismatch and rebel. There's nothing worse than a brain marching up and down the hall making demands.

To get realism to the point where it will be truly "good enough", I would argue that 20,000x current performance is closer to what is required.

Re:Rasterization (0)

Anonymous Coward | more than 2 years ago | (#39292547)

Close. Red Tails was one of the best movies I've seen recently. Granted, the characters were as shallow as real fighter pilots. However, the CG just wasn't good. They put a lot of work into it, and a lot of the aerodynamics and formation flying was just wrong, but what sucked was that my small brain never bought into it.

Re:Rasterization (0)

Anonymous Coward | more than 2 years ago | (#39292273)

Seriously. It's a shame most kids are too young to have tried those 90's-era VR booths. The graphics were really low polygon count, but the experience of being able to adjust your view by moving your head has never been matched by anything else.

Today you could probably even make the hardware affordable for a home system (how much does a Kinect cost?) and with better graphics. Too bad a modern eye laughs at the dated look of the booth.

Your generation is not special, more will follow (4, Insightful)

elrous0 (869638) | more than 2 years ago | (#39291873)

there is a rising sentiment in the gaming world that the current generation consoles are 'good enough' and that the next generation of consoles might be the last

If developers can't find a way to improve games beyond the next generation, it's not because we've reached some peak of gaming possibilities, it's just because those particular developers have reached the peak of their imaginations.

Somewhere right now there is a young guy sitting somewhere who has an idea in the back of his head which will become the next great innovation in gaming. It will require a lot more computing power than the current generation of PCs, much less consoles. If he were to pitch it at EA, he would be laughed at. If he tried to explain it at a Game Developers Conference, he would be greeted by blank stares and derision. He's probably already used to hearing responses like "That can't be done", "Who would want THAT?", "That could never be done on a console", etc. But one day people will look back and say "Wow, how could they *not* have seen that that was the future?" and "How could they have been so arrogant as to think that gaming had peaked with the millionth variation of the FPS?".

What's more, I suspect that even Sweeney is off-base. The next real innovation won't be about improving resolution or framerates to some theoretical max, or making an even prettier FPS. It will be some whole new way of thinking about gaming that is just in the mind of that weird guy right now. Most of us can no more imagine it now than some guy playing Pacman could have foreseen Half-Life 2. But it's coming.

Every generation thinks it's special. But never be so arrogant as to think your generation has somehow reached the pinnacle of achievement in ANY area.

Re:Your generation is not special, more will follo (1)

19thNervousBreakdown (768619) | more than 2 years ago | (#39291971)

Somewhere right now there is a young guy sitting somewhere who has an idea in the back of his head which will become the next great innovation in gaming. It will require a lot more computing power than the current generation of PCs, much less consoles. If he were to pitch it at EA, he would be laughed at. If he tried to explain it at a Game Developers Conference, he would be greeted by blank stares and derision. He's probably already used to hearing responses like "That can't be done", "Who would want THAT?", "That could never be done on a console", etc. But one day people will look back and say "Wow, how could they *not* have seen that that was the future?" and "How could they have been so arrogant as to think that gaming had peaked with the millionth variation of the FPS?".

68% of you won't re-post this, but the 42% of you with VISION will. Our voices will be heard! No fees for gaming, or we'll QUIT VIDEOGAMES!

Re:Your generation is not special, more will follo (0)

Anonymous Coward | more than 2 years ago | (#39292019)

Wow, I didn't realize that 110% of the population were gamers! ;)

Re:Your generation is not special, more will follo (4, Funny)

rgbrenner (317308) | more than 2 years ago | (#39292031)

68% + 42% = 100% eh? Maybe quitting video games would be a good thing for you. It would give you more time to study math.

Re:Your generation is not special, more will follo (0)

Anonymous Coward | more than 2 years ago | (#39292367)

Are you a math nazi?

Re:Your generation is not special, more will follo (1)

localman57 (1340533) | more than 2 years ago | (#39292453)

I don't think so; if you read the first question out loud, he sounds Canadian.

Re:Your generation is not special, more will follo (2)

aevan (903814) | more than 2 years ago | (#39292389)

Nono, totally unrelated: 68% of the people won't repost. 42% who have vision will. So 32% of the people who repost are of the 76% that can see, meaning he considers 24% of the populace to be blind. Apparently there is a high level of head injury in his area resulting in eye trauma.

Re:Your generation is not special, more will follo (1)

19thNervousBreakdown (768619) | more than 2 years ago | (#39292589)

For the humor-impaired, this was an intentional parody of the moronic chain posts on Facebook, complete with terrible math, Ambien-level hyperbolic drama, and random capitalization, inspired by the dumbass Facebook-post-esque quoted paragraph.

You'd think I'd know better than to try a gag like this on Slashdot (or the internet in general) by now.

Re:Your generation is not special, more will follo (1)

king neckbeard (1801738) | more than 2 years ago | (#39292069)

Why is an innovation inherently going to make use of more computing power?

And yes, there are pretty clearly areas where there is no practical room for improvement. For example, we have digital audio quality that can exceed the perception of even the best humans, so for humans, there is no reason to go further. That's not to say that there isn't room for improvement, but rather, for such an improvement to be useful, we'll need a better human.

Re:Your generation is not special, more will follo (1)

jdgeorge (18767) | more than 2 years ago | (#39292283)

Why is an innovation inherently going to make use of more computing power?

Didn't the post you're "responding" to say that the innovation "will be some whole new way of thinking about gaming", rather than just higher resolution, FPS, etc.? And did it mention anything at all about using more computing power?

If you're just looking for a place to post your two cents on a subject, you could at least make it a reply to something vaguely related to what you're talking about.

It's not that I think what you're saying is wrong; it's just a non sequitur in this thread.

Re:Your generation is not special, more will follo (1)

Trepidity (597) | more than 2 years ago | (#39292395)

The post he was replying to said:

Somewhere right now there is a young guy sitting somewhere who has an idea in the back of his head which will become the next great innovation in gaming. It will require a lot more computing power than the current generation of PCs, much less consoles.

That was also my reaction on reading it: why should we assume that the next great innovation in gaming will necessarily involve "a lot more computing power"? It's possible that such innovations exist, but I'm also pretty confident that we haven't exhausted the gaming possibilities of current hardware, or even of last-gen hardware. Heck, going back further, I don't think the SNES era actually explored all the gaming possibilities one could have explored on an SNES; the market moved on, so some avenues never got explored.

Re:Your generation is not special, more will follo (1)

X0563511 (793323) | more than 2 years ago | (#39292501)

Well, there are some small, limited-scope audio baubles that could be improved.

For example, having audio recorded at 192 kHz allows you to do slow-motion effects without the audio turning into bass sludge (you'd get to hear all that neat stuff you normally can't).

Better HRTF and simulation algorithms would allow you to directly generate audio based on geometry interactions, media density, temperature, etc. - instead of using all pre-recorded sounds and pre-defined characteristics (such as room size, or simplified geometry for occlusion).

Think of it this way, in terms of video: we are rasterizing now, but we could ray-trace. We could even photon-map, conceivably.

Note: raytracing traces light away from the camera pixel by pixel, all the way to the source (unless constrained by simulation specifications). Photon mapping is different - it simulates light emissions from sources, tracking their interactions until they either hit the camera or hit some other simulation constraint. [wikipedia.org]

We could do similar stuff for audio. Might be a bit ridiculous, but still. It's probably more efficient space-wise to store/define the characteristics of things and events, and let the system generate the rest.

Well... this got rambly. My point was there are things that can be done to audio that could be perceived by people.
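For what it's worth, the geometry-driven part is cheap to sketch. A toy example (constants and shapes are my own choices, nothing engine-specific): compute the delay and inverse-square falloff of the direct path from a source to the listener instead of baking them into the recording.

    import math

    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

    def direct_path(source, listener):
        """Delay (seconds) and attenuation for the unoccluded straight-line path.

        A real simulation would also trace reflections and occlusion against
        the level geometry; only the direct component is shown here.
        """
        distance = math.dist(source, listener)
        delay = distance / SPEED_OF_SOUND
        attenuation = 1.0 / max(distance, 1.0) ** 2  # clamp to avoid blow-up at the source
        return delay, attenuation

    # A gunshot 50 m away arrives ~0.15 s late at 1/2500 of the close-up power.
    print(direct_path((0.0, 0.0, 0.0), (50.0, 0.0, 0.0)))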

Re:Your generation is not special, more will follo (1)

X0563511 (793323) | more than 2 years ago | (#39292519)

Well, that's what I get for not closing my anchor tag. Slashdot extended that URL to the whole phrase and lopped out some words while it was at it.

I meant to say it "simulates light emissions from sources"

Re:Your generation is not special, more will follo (1)

PeanutButterBreath (1224570) | more than 2 years ago | (#39292325)

Somewhere right now their is a young guy sitting somewhere who has an idea in the back of his head which will become the next great innovation in gaming.

But it will never see the light of day, because it is genuinely innovative rather than a rehash of previous ideas that is easily marketed thanks to technological stats.

People value what they can measure.

Re:Your generation is not special, more will follo (1)

timeOday (582209) | more than 2 years ago | (#39292355)

You're ignoring Sweeney's entire point and arguing that a different proposition - "gaming is as good as it could ever be!" - is false. So what?

Re:Your generation is not special, more will follo (3, Insightful)

vux984 (928602) | more than 2 years ago | (#39292535)

Most of us can no more imagine it now than some guy playing Pacman could have foreseen Half-Life 2. But it's coming.

The guy playing Pac-Man (released in 1980) only had to move a couple of cabinets over to play Battlezone (also released in 1980) to foresee Half-Life 2 and FPSes in general.

Re:Your generation is not special, more will follo (1)

Kamiza Ikioi (893310) | more than 2 years ago | (#39292587)

Somewhere right now there is a young guy sitting somewhere who has an idea in the back of his head which will become the next great innovation in gaming.

And somewhere behind him is a woman throwing all of his stuff out of a bedroom window because he hasn't turned around from his gaming in 7 hours....

Re:Your generation is not special, more will follo (0)

Anonymous Coward | more than 2 years ago | (#39292681)

But never be so arrogant as to think your generation has somehow reached the pinnacle of achievement in ANY area.

So, who is the bright fella who is going to add decimals to TeX's version number?

Re:Your generation is not special, more will follo (1)

AJH16 (940784) | more than 2 years ago | (#39292749)

Nice theory, but in the days of Pac-Man, people COULD and DID envision a future with things like HL2. More realism has always been the goal. The problem is that now that we are getting to a point many people consider "good enough," there is a lot of questioning as to what the future will hold. Most likely, the answer is a combination of incremental upgrades in realism coupled with increased focus on either a) marketing for big titles or b) different ways of thinking about gameplay, though even that concept doesn't really leave a whole lot of room. Most games are simply adding "with a computer" or "on the internet" to things that people have always wanted to do. The ideas of the FPS, for example, are no different from the concepts of any action movie ever made; it just seeks to make the experience more immersive. As the tools mature, costs will come down and more focus can be put on quality storytelling.

At the end of the day, that is what separates a game like HL2 or Mass Effect from a game like Angry Birds. You can make a great game in two ways. One is to make something simply psychologically addictive and mind-numbing, good for mindless amusement; the other is to tell a great story in an immersive way. The most memorable games tend to be the ones with a great story that pulls you in. Saying that the direction of games will change drastically is like saying the direction of books will change drastically. They've been the same for thousands of years. Why? Because they work. They are immersive and tell a story people want to experience. Games are no different. Technology makes the medium look different (e-books, for example), but what makes them work doesn't change, because people don't change.

50x to 2000x in 5 years (0)

Anonymous Coward | more than 2 years ago | (#39291919)

Given a doubling in power every 18 months, I give this roughly 5 years... 8 years tops before it is sitting in my entertainment center running my games.

Re:50x to 2000x in 5 years (1)

rgbrenner (317308) | more than 2 years ago | (#39292109)

In 5 years, assuming doubling every 18 months, it will only be about 10x faster.
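A quick sanity check on both posts, assuming a clean doubling every 18 months (real hardware obviously won't follow the curve exactly):

    def speedup(years: float, doubling_period_years: float = 1.5) -> float:
        """Speedup factor after `years` given a fixed doubling period."""
        return 2 ** (years / doubling_period_years)

    print(speedup(5))   # ~10x after 5 years
    print(speedup(8))   # ~40x after 8 years -- nowhere near 2000x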

Re:50x to 2000x in 5 years (0)

Anonymous Coward | more than 2 years ago | (#39292155)

This is slashdot. Nobody expects actual math, only flashy headlines.

So ... Five years!

Anatomical? (2, Insightful)

girlintraining (1395911) | more than 2 years ago | (#39291931)

When the article's authors have shoehorned a word so obviously not related to the subject matter into the subject line, and then go on to repeat it over and over again, only one of two things can be true:

1. There were no better words in the dictionary, and rather than taking the sensible approach of creating a new one, they opened to a page at random, stuck their finger on it, and started using whatever their finger touched.

2. Author was trying to sound trendy and interesting.

As a footnote, salahamada is a made-up word waiting patiently for its debut. Give it a little love?

Re:Anatomical? (1)

Pope (17780) | more than 2 years ago | (#39292177)

And a salahamada to you as well on this glorious day!

Re:Anatomical? (0)

Anonymous Coward | more than 2 years ago | (#39292189)

As a footnote, salahamada is a made-up word waiting patiently for its debut. Give it a little love?

I think the TF2 pyro has that one trademarked

http://www.google.com/url?sa=t&source=web&cd=3&ved=0CC4QtwIwAg&url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3D9v2SlLIS9eU&ei=mQhZT_jJJZGTtwe30a2FDA&usg=AFQjCNHfkU2De9xACLqvISqlGLNjiYeXhQ [google.com]

Re:Anatomical? (1)

sideslash (1865434) | more than 2 years ago | (#39292423)

I interpreted "human anatomical peaks" in the sense that we have anatomically-caused limits of visual resolution and color that we can perceive. The "peaks" part may also communicate that some of us see better than others (*adjusts eyeglasses*). The overall limit or "peak" is directly related to the scale of our bodies and how our eyes are put together. The phrase is a slightly unusual shorthand in this summary (of course I didn't read the article), but it makes sense to me.

ANATOMICAL GRAPHICS LIMIT (0)

Anonymous Coward | more than 2 years ago | (#39291933)

Again... the Porn industry leads the way...

Cloud rendering (0)

Anonymous Coward | more than 2 years ago | (#39291957)

Should be possible today. Let's see it.

optimization (1)

Anonymous Coward | more than 2 years ago | (#39291975)

I remember when Cell first came out and Sony was starting the hype about its use in the PS3. I can't say whether or not it has reached its potential, but if you want to see just how important optimization is, go find a video comparison between Skyrim on PS3/Xbox/PC and then go watch the new Kara trailer from Quantic Dream. You mean to tell me that Uncharted is the best we can expect from the current-gen consoles and that we are "good enough" now? What a load of crap. If I had a dev on my team run with the "it's good enough" argument, I'd can his ass.

Ha (2, Informative)

Anonymous Coward | more than 2 years ago | (#39291999)

Tim's explanations of first- and second- and third-order approximations are somewhat bizarro. Unreal doesn't use second-bounce in its lighting. All game engines are first-bounce only unless they contain some realtime radiosity simulation, and very very few do. This has been true since Wolf 3D and is true today.

And once you have a system for second-bounce, third- and fourth-bounce can be trivially computed (over multiple frames if need be), and the results are hardly different to second-bounce.

I wish I knew what he meant by these levels of approximation.

Raw resolution is nearly irrelevant (0)

Anonymous Coward | more than 2 years ago | (#39292027)

The real measurement is dots per degree, but that's a very variable measure, as the physical size of one square degree of screen depends on your distance from the screen. So the more design-friendly measure is dots per inch (or centimeter if you can't understand inches), which tends to decrease as screen size increases (since standard resolutions tend to stay constant over a range of actual screen sizes).
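If you know the screen width and viewing distance, the angular measure is a one-liner. A rough sketch with made-up numbers (20/20 vision resolves somewhere around 60 pixels per degree):

    import math

    def pixels_per_degree(h_resolution: int, screen_width_m: float, viewing_distance_m: float) -> float:
        """Average horizontal pixels per degree of visual angle for a flat screen."""
        fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * viewing_distance_m)))
        return h_resolution / fov_deg

    # A 1920-wide panel about 0.6 m wide viewed from 0.6 m: ~36 px/degree,
    # so more resolution is still perceptible at that distance.
    print(pixels_per_degree(1920, 0.6, 0.6))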

As for the graphics being "good enough," most of them are, it's the plot and gameplay that really need help. I know that games weren't actually any better in the past, but at least when it comes to old games and consoles we have the advantage of mostly forgetting about the lame ones. Except E.T. on the Atari, no amount of therapy will suffice if you played that one.

Only for games where one is seated (1)

WillAdams (45638) | more than 2 years ago | (#39292035)

stationary before the screen, which is located close enough to fill one's vision.

You'll need to go even farther to fill a wall, or better still three or four walls, so that a gaming experience like Legend of Zelda: Skyward Sword is fully immersive.

William

Anatomical Peaks (4, Insightful)

Archangel Michael (180766) | more than 2 years ago | (#39292083)

While the detail of layers and textures keeps improving, it is going to run up against what is known as the "uncanny valley". There is a point at which the realism is flawed because it looks too real for the context.

I'm even starting to see the uncanny valley on magazine cover girls after they've been photoshopped until they are almost unrecognizable. There is a point where you stop fixing flaws and start making them.

Re:Anatomical Peaks (0)

Anonymous Coward | more than 2 years ago | (#39292313)

Excellent points. You even said stuff I've thought, but hadn't put together in a coherent statement. That's why I HATE photoshopped women, etc. They cease to be real, and thus I cease to be interested.

Re:Anatomical Peaks (1)

eyenot (102141) | more than 2 years ago | (#39292617)

Hmm, you might be interested to learn that most real-real women, i.e. the kind that walk around, breathe, and aren't impressions on glossy paper, typically aren't photoshopped either!

Re:Anatomical Peaks (2)

billcopc (196330) | more than 2 years ago | (#39292615)

It's not so much about being too perfect; it's more closely related to what I call "uneven reality". If one aspect seems less real than the rest - for example, picture-perfect facial detail but choppy motion - that tends to trigger the uncanny valley response. It's our brain going "I recognize this as human, but something is very, very wrong with them". The same can occur with sound: if a non-realistic image or machine is paired with a human voice, we perceive it as a disembodied human, which can be quite creepy. If you're old enough (or wise enough), think of HAL from 2001: A Space Odyssey, or having a fluent conversation with your toaster. Creepy.

What about AI? (5, Insightful)

i_ate_god (899684) | more than 2 years ago | (#39292105)

Everyone talks about how far we can push graphics.

But what about pushing the AI?
What about procedural generation of the game?
What about vastly improved physics, including a destructible world?

I'd rather see these things pushing hardware development than how many polygons you can crunch in a second.

Re:What about AI? (0)

Anonymous Coward | more than 2 years ago | (#39292577)

MOAR POLYGONS

Re:What about AI? (1)

phorm (591458) | more than 2 years ago | (#39292585)

Physics does seem to be moving toward more realism lately, but it still doesn't get as much focus as graphics.

However, full physics realism in most games wouldn't work well. It takes away from the gameplay a bit if you can just point the MonsterBlaster 10000 at the wall and blow a hole to the nearest exit :-)

Re:What about AI? (1)

c.r.o.c.o (123083) | more than 2 years ago | (#39292607)

I completely agree that there are more important things in gaming than pretty graphics. I'd love to see self-generating environments, NPCs with voice acting and even quests.

Skyrim (and perhaps other games) tried implementing a random element to quests where the quest line remains static but the location of items is dynamic. But this still falls far short, and after visiting the same cave a couple of times it makes no difference if I need to find item X or Y in there.

Moving goalpost... (1)

RightSaidFred99 (874576) | more than 2 years ago | (#39292115)

I don't think anybody believes we're good enough for 8000x4000; people are talking about 1080p when they say "good enough" for consoles. Seems we're pretty close even with current-gen consoles, and if they can quadruple GPU power or more in the next-gen consoles, it should be fine for years until those higher-resolution displays are actually commonplace.

So What? (0)

Anonymous Coward | more than 2 years ago | (#39292195)

By the numbers in the article, we went from 1 GFLOPS in 1998 to 2.5 TFLOPS in 2011. That's more than 2000x in 13 years.
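Taking those figures at face value, the implied doubling period works out to a bit over a year:

    import math

    growth = 2.5e12 / 1e9              # 1 GFLOPS (1998) -> 2.5 TFLOPS (2011) = 2500x
    years = 2011 - 1998
    print(years / math.log2(growth))   # ~1.15 years per doubling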

Mr. Knowitall (1)

Beelzebud (1361137) | more than 2 years ago | (#39292197)

I enjoy reading the responses from armchair know-it-alls who seem to think Sweeney is some sort of lightweight when it comes to knowledge about rendering.

What about human vision? (1)

davidyorke (543505) | more than 2 years ago | (#39292207)

Can anyone speak to how the limits of human vision relate to the need for 8000x4000 resolution? I don't know why we need such high resolution for personal home video gaming, presuming a single player on a 30-45" screen.

Re:What about human vision? (0)

Anonymous Coward | more than 2 years ago | (#39292285)

I know that on my 19" computer screen, I can't tell the difference between 720p and 1080p unless I get too close for any practical use case.

Yes, and 16k is enough for anyone too (4, Interesting)

Twinbee (767046) | more than 2 years ago | (#39292249)

I think 2000x GPU power is very much underestimating the potential for a number of reasons:

1: Raytracing / global illumination. In comparison to games with true global illumination [skytopia.com], current-technology 3D worlds with only direct illumination (or scanline rendering) look crude and unconvincing. Objects appear cookie-cutter-like and colours tend not to gel with the overall 3D landscape.

Toy Story 3 took around 7 hours to render each frame [wired.com]. To render in real time for a video game (say 60 FPS), you would need a processor around a million times faster than what we have today (rough arithmetic after this comment). And AFAIK, that's mostly using Reyes rendering (which incorporates mostly rasterization techniques with only minimal ray tracing).

2: Worlds made of atoms, voxels or points. This makes a world of difference for both the user and the designer. Walls can be broken through realistically, water can flow properly, and explosions will eat away at the scenery.

2000x? Pah, try 2 TRILLION as a starting point.
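Rough arithmetic behind the "million times faster" figure above, using only the render time quoted in the comment:

    hours_per_frame = 7                  # offline render time quoted above
    target_fps = 60                      # real-time budget

    offline_seconds = hours_per_frame * 3600
    realtime_budget_seconds = 1.0 / target_fps
    print(offline_seconds / realtime_budget_seconds)   # ~1.5 million, i.e. on the order of a million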

Not far fetched... or far off (2, Informative)

Anonymous Coward | more than 2 years ago | (#39292451)

Exponential improvement in technology is the historical norm, yet it can still be difficult to fathom.
2000X should be achievable by 2024, at 2x improvement per year; or by 2029 at 2X every 18 months.
Some of us should see 2 trillion-fold improvement in about 40+ years at 2X per year; or by 2075 at 2X every 18 months.
Barring the occurrence of any variety of manmade and natural disasters, of course.

It's already good enough for me (0)

Anonymous Coward | more than 2 years ago | (#39292265)

He's talking about when things are good enough and how that can be objectively measured for everyone. Well, I am using a three-year-old, underclocked computer as my primary gaming rig. I can get away with it because I can turn down the graphics settings on my computer, and I can do that because I can't see for crap, especially as objects get farther away. Rather than more picture quality, I need more color. All the games today look like crap to me, even cranked up all the way, because there is hardly any color to them. Maybe if they changed the palette from gray and brown to actual color, we could talk, but as it stands the current generation is more than adequate for me.

google is my calculator (0)

Anonymous Coward | more than 2 years ago | (#39292379)

(log(2000) / log(2)) * (24 months) = 21.9315686 years

Re:google is my calculator (0)

Anonymous Coward | more than 2 years ago | (#39292643)

Wait. Isn't (log(2000) / log(2)) * (18 months) = 16.4486764 years more accurate? I thought they doubled every 18 months, not every two years.

How uninspiring. GET UP. (1)

eyenot (102141) | more than 2 years ago | (#39292387)

Listen to YOU. "Good enough", you say. Do you think that's how the video game console designers, from days and years and years and months ago thought and talked and acted?

Good enough? It's "good enough", that's why the Nintendo Entertainment System dominated the gaming market and we're not all just trading old Atari 5200 cartridges?

"Good enough", that's why there are fifty games starting with the word "Super" for the Super Nintendo?

"Good enough", that's why the Jaguar has two processors with two separate bus widths, and featured Quarantine AND Cybermorph?

"Good enough", that's why the 3D0 had like three first-person AD&D games, Braindead 13, AND all those interactive sex videos?

"Good enough", that's why the Playstation 2 features roughly three or four MORE clones per cloud of exact clones closing in directly on your fighter's position at any given wave along the rails?

"Good enough", that's why the Playstation 3 didn't come with back-compatibility for the PSX?

"GOOD ENOUGH", is that WHY, the Wii makes old people relevant to the video game scene?

GOOD ENOUGH!? IT'S NEVER GOOD ENOUGH! IT ALWAYS SUCKS! ARE YOU WITH ME! LET'S MAKE ANOTHER SHITTY CONSOLE HURRRGGGGHHHHHHHHHH1!!!!!!!!!

*CROWDS ROARING, MASTURBATING*

Don't worry (0)

Anonymous Coward | more than 2 years ago | (#39292391)

Don't worry about graphics -- if 80x25 was good enough for ZZT, then it should be enough for us!

Sold! (1)

billcopc (196330) | more than 2 years ago | (#39292471)

Okay, I like the sound of this. Get me four of those graphics cards so I can SLI the tits out of my hydro bill.

Sure, there is only so much data my eyeballs can process, but larger displays do serve a purpose. For example, I would love to have a 4k projector shooting at my wall, instead of two 27" monitors. Actually, I'd like two, stacked on top of each other. Why ? Because then my wall becomes a giant display surface. Even right now, I can't really mentally process the entirety of my pixel space at once, but the realities of multitasking and my working habits dictate that I need a bunch of windows in the sidelines, so that I may occasionally glance over to consult some chart, monitor logs in real-time, or juggle a half-dozen IM and email convos without getting signals crossed, or keep WoW open in the background while I wait for a damn raid to assemble.

So, if Tim Sweeney wants 8000x4000, then I want 16000x8000. He can render all the anatomically correct games his heart desires, but I want moar datas!

8000x4000 needed? No. (0)

Anonymous Coward | more than 2 years ago | (#39292475)

Your eye probably wouldn't be able to tell the difference between 4000x2000 and 8000x4000, so why do we need extra processing power that can't be perceived?

Eh.. (4, Interesting)

Quiet_Desperation (858215) | more than 2 years ago | (#39292513)

To me the real problem is focusing on the wrong details. Take Skyrim, for example. Is it really a big deal if they, say, tripled the detail on the existing characters? Do the NPCs need pores or drops of sweat?

Or would it be more interesting to walk into Whiterun and see 100 NPCs walking around, or to assault a fort with the Stormcloaks with 100 other soldiers at your side attacking the 100 Imperials inside, and clouds of arrows raining down [nice knowing ya, shieldless dual wielders :-) ]? It's a "more detailed objects" versus "more objects in the world" sort of argument, I guess. I'd rather see the power applied to "more objects" at this point, IMHO.

There is a lot more to realism rendering (2)

MarkH (8415) | more than 2 years ago | (#39292597)

I think the main challenge is the interaction between player and environment. In something like MW3, that is limited to blowing up the odd chicken, window or set piece designed into the game.

I want to swish my (virtual) hand through a river and see (and feel) the water flow around it.

Any true physics model would require awesome CPU capacity. At one end we have Minecraft (where the atoms are decidedly blocky) and at the other Second Life (where behavioural programs can be attached to objects).

My dream would be a virtual universe with atoms just large enough that simulating them fits within 1000 times current CPU capacity, but which still behaves like a 'real' experience.

By 'atom' I mean the simplest construct in this universe that allows the greatest potential complexity while remaining computationally feasible.

In conclusion, future 'games' will not be about FPS or polygons per second but about model calculations.

I dunno. (0)

Anonymous Coward | more than 2 years ago | (#39292635)

Most gamers I know bring this down to about 2x due to pot.

We dont realy need 8000x4000 with eye-tracking (5, Insightful)

roemcke (612429) | more than 2 years ago | (#39292679)

By using eye tracking, we don't really need to render the whole screen at high resolution.
We only need to render, at high resolution, the part the eyes are looking at.

The eye's ability to perceive high resolution is limited to a very small area, and the brain fakes the rest by moving the eyes around.
By superimposing a small high-DPI image on top of a larger low-DPI image, we get a high-resolution window into the larger image.
If this high-res window follows the eyes around, the brain will perceive a large high-resolution image.

Naturally, for this to work, the smaller image has to be updated to show the same part of the scene that it is replacing.

This can also be used to emulate a high-resolution screen by keeping an area of your screen black and using a projector to project the smaller high-DPI image onto the black area.
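A minimal sketch of the compositing step (the renderer, resolutions and gaze input are all stand-ins; real foveated rendering would do this on the GPU):

    import numpy as np

    def render_foveated(render_fn, gaze_x, gaze_y, full_w=1920, full_h=1080, inset=256):
        """Composite a native-resolution inset around the gaze point onto a
        cheap quarter-resolution render of the whole frame.

        `render_fn(w, h, region)` stands in for the engine's renderer and
        returns an (h, w) array for the requested region of the frame.
        """
        base = render_fn(full_w // 4, full_h // 4, None)
        base = np.repeat(np.repeat(base, 4, axis=0), 4, axis=1)  # naive upscale

        x0 = int(np.clip(gaze_x - inset // 2, 0, full_w - inset))
        y0 = int(np.clip(gaze_y - inset // 2, 0, full_h - inset))
        base[y0:y0 + inset, x0:x0 + inset] = render_fn(inset, inset, (x0, y0))
        return base

    # Toy renderer: noise standing in for a real frame.
    fake = lambda w, h, region: np.random.rand(h, w)
    frame = render_foveated(fake, gaze_x=900, gaze_y=500)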

Oh, and by the way. Remember this post and use it as prior art in case some troll patents "A method of simulating high resolution images by combining multiple images of different scales and resolution"

The future: Voice synthetization and human-like AI (0)

Anonymous Coward | more than 2 years ago | (#39292691)

In my opinion, the future of gaming will be voice synthesis and human-like AI. Voice synthesis will pretty much eliminate the cost of voice recording and bring it even to the indie field. True human-like AI would use voice synthesis to generate custom responses based on user input. Imagine playing Skyrim and actually being able to choose your responses down to the last word.

Imagine a developer being able to generate an NPC just by setting a few parameters in a function, creating a fully-fledged character in seconds. These two technologies, and the tools to use them easily, will make Skyrim's huge game world look like a hobby project in the future.

'Good enough' does not mean perfect (2)

brainzach (2032950) | more than 2 years ago | (#39292771)

Good enough does not mean that you have to match the anatomical limits of human perception. That is asking for perfection.

Unless the increase in graphics performance leads to radical new ways of gaming, the current GPU performance is good enough.

It is not like it was 10 years ago, when 3D graphics opened the door to new types of gameplay, like the creation of Grand Theft Auto III. Now 3D gaming has matured, and there aren't any more frontiers to discover other than better graphics, which are just marginal improvements.

I am thinking innovation will come from input devices like touch screens, Kinects or another technology that no one has thought of yet.
