
Crysis 2 Update a Perfect Case of Wasted Polygons

Soulskill posted more than 2 years ago | from the almost-as-many-as-starfox dept.

AMD 159

crookedvulture writes "Crytek made news earlier this summer by releasing a big DirectX 11 update for the PC version of its latest game, Crysis 2. Among other things, the update added extensive tessellation to render in-game elements with a much higher number of polygons. Unfortunately, it looks like most of those extra polygons have been wasted on flat objects that don't require more detail or on invisible layers of water that are rendered even in scenes made up entirely of dry land. Screenshots showing the tessellated polygon meshes for various items make the issue pretty obvious, and developer tools confirm graphics cards are wasting substantial resources rendering these useless or unseen polygons. Interestingly, Nvidia had a hand in getting the DirectX 11 update rolled out, and its GeForce graphic cards just happen to perform better with heavy tessellation than AMD's competing Radeons."


159 comments


Hmmmm. (4, Insightful)

Moryath (553296) | more than 2 years ago | (#37115058)

So you're saying that a graphics card company just *might* have tried to get a company writing a largely-used benchmark in their favor.

Not that it's ever happened before... *coughintelnvidiacough*...

Re:Hmmmm. (0)

Anonymous Coward | more than 2 years ago | (#37115272)

No, they're saying Crysis 2 was a waste of time.


Re:Hmmmm. (3, Informative)

Luckyo (1726890) | more than 2 years ago | (#37115512)

It's worth noting that most benchmarks use a certain version of popular games. If next version breaks benchmark functionality in a significant way, testers simply continue using old version.

Then again, has Crysis 2 ever been used as a serious benchmark? The game actually looked worse than Crysis (especially Warhead) in terms of graphics, in spite of having higher polygon counts and such, and was designed from the ground up to work on machines that would never be able to run the original Crysis or Warhead (current-gen consoles).

Re:Hmmmm. (2)

scumdamn (82357) | more than 2 years ago | (#37115876)

I'm betting a lot of review sites wanted to use it as a DX11 benchmark but they found out about this crap and put a stop to that. If you do see a review site using it to benchmark DX11 you know they're shady or biased or not terribly thorough.

Re:Hmmmm. (2, Interesting)

im_thatoneguy (819432) | more than 2 years ago | (#37116026)

I'm not convinced. I'll have to talk to my friends in DX development to give me the final nod one way or another but I know this author is clueless about the subject.

There are a lot of times in computer graphics where something is seemingly wasteful--but is the most efficient solution.

For example, the claim that "This is the most detailed parking barrier in cinema or game history" is untrue. Pixar's RenderMan is, at least for now, still probably the most popular renderer in VFX. For every pixel it renders, it automatically dices geometry into many tiny micropolygons. So a rendered 1080p parking barrier would consist of more than 1920x1080 polygons. The wireframe, if you could view one, would just be solid.
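For scale, the micropolygon counts in that REYES-style approach follow from simple arithmetic: at a shading rate of 1, geometry is diced to roughly one micropolygon per pixel, so anything covering a full 1080p frame implies about two million micropolygons. A toy estimate (the shading-rate model here is a simplification, not RenderMan's actual API):

```python
def micropolygons_for_coverage(width, height, shading_rate=1.0):
    """Rough REYES-style dicing estimate: ~1 micropolygon per shaded sample,
    where shading_rate approximates the pixel area covered per micropolygon."""
    return int(width * height / shading_rate)

# An object filling a 1080p frame dices into ~2.07 million micropolygons,
# far more than any game asset's polygon budget.
full_frame = micropolygons_for_coverage(1920, 1080)
```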

I imagine what the Crysis developers discovered was that being "dumb" about tessellation was more efficient than trying to adaptively tessellate the entire scene. GPUs can handle millions upon millions of polygons in rasterization. That's not a problem. What bogs down a modern GPU are shader networks.

If the Nvidia cards have a specialized (and largely unused) hardware tessellation engine that's not being put to use then it can probably tessellate everything within sight with minimal performance cost. What would cost it a lot of performance is evaluating every object on the fly to determine the proper level of tessellation.

Dumb is fast. Smart takes power. If there is a giant tessellated ocean wasting 20k polygons under the ground but it isn't being shaded... it's probably barely harming performance.

I'm sure they'll refine the system in the future and spend a lot of time on the art assets, but why hold back a feature if you can throw in a quick and dirty version now that's completely automatic and makes some of the game look better?

Re:Hmmmm. (5, Insightful)

Anonymous Coward | more than 2 years ago | (#37116634)

I work in games. You, sir, are an idiot. Are you seriously comparing a game engine to RenderMan? We have to render a game's frame in 16 ms; offline, you get minutes per frame. I read the entire article. This was clearly a patch meant to appease PC gamers into thinking it wasn't a shoddy console port.

Re:Hmmmm. (0)

Anonymous Coward | more than 2 years ago | (#37117370)

Yes, he's comparing RenderMan.

What part of "the claim that 'This is the most detailed parking barrier in CINEMA or game history' is untrue" did you not understand?

Re:Hmmmm. (1)

makomk (752139) | more than 2 years ago | (#37116750)

They're just being dumb - or favouring NVidia. DX11 tessellation support is designed to make it pretty much trivial to adapt tessellation levels based on distance. While NVidia cards can cope with ludicrous levels of tessellation and polygons, ATI cards can't - and the penalty NVidia users pay for this capability is that their hardware offers worse price/performance on everything else, which is why NVidia is so keen for all games to use it.
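The distance-based adaptation described above is typically just a per-patch factor computed before the fixed-function tessellator runs. A minimal sketch of that heuristic (constants and names are illustrative, not from CryEngine or any driver; on real hardware a hull shader would compute this per patch):

```python
def tess_factor(distance, near=5.0, far=100.0, max_factor=64, min_factor=1):
    """Linearly reduce the tessellation factor with camera distance.

    Patches closer than `near` get full detail; patches beyond `far`
    get essentially none, so distant flat objects stay cheap."""
    if distance <= near:
        return max_factor
    if distance >= far:
        return min_factor
    t = (distance - near) / (far - near)   # 0..1 across the falloff band
    return max(min_factor, int(max_factor - t * (max_factor - min_factor)))

# A concrete divider 90 units from the camera gets almost no extra polygons,
# while one right in front of the player tessellates fully.
```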

(There have been similarly fishy things before. For example, HAWX, a game NVidia was involved in, created all its terrain via a massive number of tessellation steps from a tiny number of polygons. This was static terrain - which meant it was a total waste of compute power to recompute it via tessellation every frame - and I suspect NVidia was probably cheating and computing it in its drivers once, because doing so every frame would cost it a lot too.)

Why exactly does this have an AMD picture by it? (0)

Anonymous Coward | more than 2 years ago | (#37115072)

This has nothing to do with AMD. Nvidia helped, AMD didn't. So why the AMD tag? You know, other than the obvious reason that a fanboy wants to start a war.

Re:Why exactly does this have an AMD picture by it (1)

monkyyy (1901940) | more than 2 years ago | (#37115090)

YAY fanboy wars are awesome

Re:Why exactly does this have an AMD picture by it (1)

artor3 (1344997) | more than 2 years ago | (#37115186)

*This picture brought to you by Cuil*

Re:Why exactly does this have an AMD picture by it (4, Insightful)

hairyfeet (841228) | more than 2 years ago | (#37115404)

Uhhh... because this is another case of Quack.exe? Look at the facts: you have a highly intensive rendering trick used for no fucking reason on completely stupid shit like concrete dividers. Have you EVER said to yourself "Boy, this game would have totally had me if it had only rendered the concrete dividers in such loving detail I can make out the scuff marks from the boot of the guy who last leaned on it"?

Then you have this SAME technique used to slam the GPU even on stuff that isn't on the screen and will NEVER be seen, such as highly tessellated water rendered underneath the land. This isn't Minecraft; they can't dig their way down to actually see the fricking water!

Then it turns out that this game, which has often been used as a standard benchmark, by being loaded up with worthless crap the user can't even see, will - surprise, surprise - run better only on certain Nvidia GPUs.

I don't think we need to call in Kojak to crack this case, folks. Nvidia used its position to make another Quack.exe, so that benches made using this game will score higher on its GPUs, by loading the game up with invisible crap that slams the GPU in a way its hardware is designed to take better than the competition's. Hell, I wouldn't be surprised if they tessellated the manhole covers just to get the count up! The sad part is that, like with the Intel compiler (which is STILL rigged, BTW), most gamers won't know they're being had unless someone points out the BS going down behind the scenes.

Re:Why exactly does this have an AMD picture by it (1)

bhcompy (1877290) | more than 2 years ago | (#37115758)

Ultimately, this is why tile-based rendering owns. Unfortunately, the Kyro series is dead. 3dfx's last dying breath was also in enhanced occlusion culling to improve performance. Pity the big dogs don't do it.

Re:Why exactly does this have an AMD picture by it (1)

hairyfeet (841228) | more than 2 years ago | (#37117002)

Personally I think Crysis is a big wank fest for those "must have teh benchmarks!" dumbasses with more money than sense.

After bumpgate on the Nvidia side and the compiler and bribery scandals on the Intel side, I put my money where my mouth was and went full-on AMD in my shop and for my customers, and I couldn't be happier. Lately I've been leaning towards the HD48xx series, which gives frankly insane bang for the buck: around $60 for the HD4830 (which you can flash and turn into an HD4850 if you're brave) and $75 for the HD4850, which is just nuts for a 256-bit-wide pipeline.

But I can see why Nvidia stooped to this; their way of designing chips is frankly getting too expensive. The AMD way - design the midrange chip as the main GPU, then go X2 for the high end and flip off some cores in software for the low end - is the smarter way to go IMHO, as it costs less, which can then be passed on to the consumer. Meh, until the next console refresh it won't matter anyway, as those $60 chips like I'm selling crank out the purty on all the latest games at 1080p.

Cranking the tessellation on dividers and under the ground where it can't be seen is just lame though, and you'd think they would have more pride than to pull a Quack.exe in this day and age. Guess not.

News about wasted polygon, really?? (-1)

Anonymous Coward | more than 2 years ago | (#37115108)

Did /. really make news out of this? Is the summer that dull, or am I missing something actually interesting in that article? Because to me it sounds like a "Some Hollywood star broke his nail this morning" piece in some crappy Hollywood magazine.

Re:News about wasted polygon, really?? (0)

Anonymous Coward | more than 2 years ago | (#37115176)

I thought it was quite interesting that such a high-profile game would be so poorly optimized. They render the realistic water even in scenes where it doesn't show up at all, for instance. Seems like a huge waste of resources.

Not surprised (5, Interesting)

0123456 (636235) | more than 2 years ago | (#37115120)

One thing I learned from writing video drivers is that game developers are probably the very last people who should be developing graphics engines. We were constantly amazed by the insanely performance-sucking tricks they used to play which we then had to detect and work around; often their poorly-designed method of rendering something would be 10-100x slower than a sensible implementation.

Valve and id are the most obvious exceptions; I don't think we ever found them doing anything really retarded unlike certain big name developers I could mention.

Re:Not surprised (0)

Anonymous Coward | more than 2 years ago | (#37115160)

If you weren't doing things ass-backwards and developing workarounds in drivers for individual games then they would be forced to do things properly wouldn't they?

Re:Not surprised (2)

thegarbz (1787294) | more than 2 years ago | (#37115336)

Only to lose business to a competing product whose vendor does optimise its drivers? What kind of arse-backwards logic is that?

Re:Not surprised (0)

Anonymous Coward | more than 2 years ago | (#37116674)

The GP is right: what we need are agreed standards between the major card manufacturers. Then all effort/competition can go into raising the bar for all games, not into wasted effort making sure a handful of AAA titles look pretty. It's wasteful for the card developers to work that way, and it's also wasteful for game developers to have to use hacks and workarounds (and for developers moving between companies to have to learn each other's hacks and workarounds) instead of working to defined standards.

just like html (0)

Anonymous Coward | more than 2 years ago | (#37116774)

It's just like HTML. Proper HTML pretty much works everywhere, but shitty HTML sort of works in some browsers... Unfortunately there are a whole lot of game developers writing shitty code that sort of works on some graphics cards, but from the POV of the user this looks like a problem with the graphics card.

Re:Not surprised (1)

0123456 (636235) | more than 2 years ago | (#37115410)

If you weren't doing things ass-backwards and developing workarounds in drivers for individual games then they would be forced to do things properly wouldn't they?

That was always my argument, but then people would stop buying our cards and buy cards where the game ran 'properly'.

Re:Not surprised (0)

Syshak (2427740) | more than 2 years ago | (#37115168)

Try saying that to John Carmack.

Re:Not surprised (3, Insightful)

tepples (727027) | more than 2 years ago | (#37115252)

Valve and id are the most obvious exceptions

Try saying that to John Carmack.

I think that was the point. Mr. Carmack works for Idthesda, and Valve's Source engine is based on GoldSrc, which in turn was forked from the engine of Quake (early Id Tech 2) written by Mr. Carmack.

Re:Not surprised (-1, Offtopic)

Anonymous Coward | more than 2 years ago | (#37115374)

Try reading, fuckface.

Re:Not surprised (0)

Syshak (2427740) | more than 2 years ago | (#37115456)

Apparently none of you smartasses got my point.

Re:Not surprised (-1)

Anonymous Coward | more than 2 years ago | (#37116072)

In case you still don't get it, the OP said:

Valve and id are the most obvious exceptions

See the "id" there? Now what could that be referring to? Could it be: http://en.wikipedia.org/wiki/Id_Software#John_Carmack [wikipedia.org]

Maybe this is the wrong site for you. Try these instead: http://www.starfall.com/ [starfall.com]
http://www.abc-read.com/ [abc-read.com]

Learn to read before calling people names. We may be smart asses, but you're one hell of a dumbass.

Re:Not surprised (1)

Savantissimo (893682) | more than 2 years ago | (#37115556)

Well, I've got to say that Rage (on which Carmack spent the last six years or so implementing a "megatexture" hack that was worth maybe a couple of months) looks like crap compared to Crysis. Everything looks smeary (marketed as "painterly" and "atmospheric").

And while Crysis may waste polygons, Rage doles them out like a miser - the main character's head has visible lumps - it's actually even pointed. His big, round shoulder pads get about half a dozen polygons each - you can see the corners and seams of the mesh. Anything round looks like it was whittled with an ax. Constant 60 fps is not that important if the frames themselves are nothing but giant paintings over meshes that could have come from a game from 10 years ago. Maybe the gameplay makes up for it, but when watching the Rage trailers the suspension of disbelief is constantly being knocked down by lousy details.

OTOH, the Jersey barrier in Crysis that TFA takes such issue with looks so hyper-real that it just makes you go "wow!" The only thing is, it looked just as good before the tessellation. The effort would have been better spent on the leaves in that scene, which only look a little better with the amount of tessellation they used.

Re:Not surprised (0)

Anonymous Coward | more than 2 years ago | (#37115748)

While you're standing around looking at shit and going "Wow!" guys who are getting a solid 60 FPS are going to be blowing your fucking head off.

Re:Not surprised (0)

bhcompy (1877290) | more than 2 years ago | (#37115786)

Well, duh. It has to work on the shitty Xbox.

Anyways, the new Doom was the same thing. Everyone raved about how great it looked, but to me it looked like a bunch of plastic dolls. Might be highly detailed textures, but when they look like waxy plastic models it makes it worse than previous attempts.

Re:Not surprised (-1)

Anonymous Coward | more than 2 years ago | (#37116354)

Constant 60 fps is not that important if the frames themselves are nothing but giant paintings over meshes that could have come from a game from 10 years ago.

Yes it is.

Unless you are playing an interactive movie made for the consoles (hello, all you third-person shooters with auto-aim), that extra framerate seriously adds to playability. This is one of the reasons why Quake 1 and Quake 2 are still both playable and entertaining.

Re:Not surprised (0)

Anonymous Coward | more than 2 years ago | (#37115178)

The workaround may have been due to completely brain-dead video drivers that suck balls at doing something sensible.

I've seen many such inexplicable performance drops in big name graphic chip developers.

Intel is an exception. They just suck balls, period.

Re:Not surprised (1)

Anonymous Coward | more than 2 years ago | (#37115212)

Not that I don't believe you... I have (many) reasons to... but a technical example of bad optimization would be nice. It's boring reading endless amounts of unsubstantiated claims on Slashdot.

Re:Not surprised (0, Offtopic)

vlueboy (1799360) | more than 2 years ago | (#37115376)

Tsk tsk tsk!
The internet isn't as "safe" as all of you gossipers want us to believe. That thing there in the GP is a username, '0123456', and he just gave out two company names, 'id' and 'Valve'. If you know the USA, or have read any lawsuit nightmares here, then you know that giving out any more triangulating info gets too specific.

Anyone who worked with him on that section of code will notice today, or in a Google search a month from now, and is bound to draw eyeballs quoting us here on other forums (while post deletion at the source is not an option). In our crowds or theirs there are bound to be whistle-blowers, old enemies who still work for the GP's then-employer, and lawyers from all three companies - all with very large sticks that point at non-expiring NDAs he signed for the privilege of playing ball while making a living.

If you personally want "anonymity," then invite us to a 4chan thread created by "you". And be prepared for information that sounds more "substantiated," as you ask, but might be 100% lies from someone you'll find isn't even the GP in the first place. That is the point... we really won't know, even if the GP chooses to reply to you, whether all the info given in his follow-up is "true."

The last few days, with Wikipedia and some other stories, Slashdot has started to show voyeurs wanting a good story for nostalgia's sake. But we're not a peeping site. There's an unwritten law in professional circles (even beyond medicine's implicit and legal nondisclosure norms) that says juicy stories should be anonymized enough. After all, we ask the same of Facebook and other free services, so we need to be just as considerate.

Good day to you.

Re:Not surprised (0)

Anonymous Coward | more than 2 years ago | (#37115500)

What the fuck are you talking about?

Did you forget your medication today?

Re:Not surprised (0)

Anonymous Coward | more than 2 years ago | (#37115750)

No, you're just not appreciating that Anonymous Coward title and its safety.

He's a bit wordy, but you can't be that dumb.

Re:Not surprised (0)

Anonymous Coward | more than 2 years ago | (#37116808)

No, you're just not appreciating that Anonymous Coward title and its safety.

He's a bit wordy, but you can't be that dumb.

He's referring to this post

Not surprised (Score:5, Interesting) by 0123456 (636235) Alter Relationship on Tuesday August 16, @11:34PM (#37115120) One thing I learned from writing video drivers is that game developers are probably the very last people who should be developing graphics engines. We were constantly amazed by the insanely performance-sucking tricks they used to play which we then had to detect and work around; often their poorly-designed method of rendering something would be 10-100x slower than a sensible implementation. Valve and id are the most obvious exceptions; I don't think we ever found them doing anything really retarded unlike certain big name developers I could mention.

So, he's not using Anonymous Coward.

Re:Not surprised (1)

Anonymous Coward | more than 2 years ago | (#37117254)

If the GP wants anonymity then they should post as AC or STFU. Their +5 Interesting comment is completely devoid of any valuable information. Apparently some developers wrote inefficient code at some point in time. Whoa, you're blowing my mind.

Valve just does bad hacks elsewhere (0)

Sycraft-fu (314770) | more than 2 years ago | (#37115308)

The one I've noticed the most is sound stuttering. Half-Life 2 had some bad sound stuttering issues back when it came out. Valve swore up and down it was a soundcard issue, not their engine. Well, it still does it today on a completely different (and stupidly powerful) system, as does Team Fortress 2. It isn't horrible, but it is noticeable, and there's no excuse given that other games don't do it and my system is extremely overpowered compared to what the games need.

Re:Valve just does bad hacks elsewhere (1)

Osgeld (1900440) | more than 2 years ago | (#37115948)

Valve has always blown ass in sound. I mean, WTF was that crap in Half-Life 1? It sounded like someone got drunk and made a proof of concept for PC speaker, two decades late, on a Sound Blaster Live.

Re:Not surprised (1)

Seumas (6865) | more than 2 years ago | (#37115310)

Oh, yEAh?

Re:Not surprised (1)

windwalkr (883202) | more than 2 years ago | (#37115332)

One thing I learned from writing video games is that driver developers often don't know much about real-world performance. ;-) Much of the performance advice we have seen given by GPU teams in the past had zero benefit to game performance and took weeks of developer time to implement and maintain. On the other hand, sometimes you come across a real gem.

Short version: good programmers good, bad programmers bad. Sometimes what is good for one case is not good for another case.

Re:Not surprised (1)

0123456 (636235) | more than 2 years ago | (#37115486)

Much of the performance advice we have seen given by GPU teams in the past had zero benefit to game performance and took weeks of developer time to implement and maintain.

One thing worth noting is that a change that makes no difference on the card you're testing with may make the difference between the game being playable or a slideshow on a different card.

One particularly amusing issue I remember was with a new feature in Direct3D where I believe we were the only people who supported it in hardware at that time and everyone else emulated it in software; we got a new game from big name game developer X and it ran vastly slower on our card than on much less powerful systems. The idea was that you'd enable this feature once and then keep using it, but they were turning it on and off hundreds of times in a frame and each time that caused a major pipeline stall in our hardware. So once we figured that out we just detected the game and dropped back to software emulation like everyone else, but if they'd known what they were doing the game would have worked fine on all cards and been faster on ours because they'd actually have been using the hardware instead of the CPU.

(Details kept vague to protect the guilty)
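The toggling pattern described above is exactly what a redundant-state filter guards against: only forward a state change when the value actually differs from what is already set. Note this only helps when the app re-sets the same value; genuine on/off thrashing still stalls the pipeline, which is presumably why the driver fell back to software emulation instead. A hypothetical sketch, not any real driver's code:

```python
class StateCache:
    """Swallow redundant render-state writes before they reach hardware."""
    def __init__(self):
        self._state = {}
        self.hw_writes = 0          # stand-in for expensive pipeline flushes

    def set_state(self, name, value):
        if self._state.get(name) == value:
            return                  # redundant: same value, skip the flush
        self._state[name] = value
        self.hw_writes += 1

cache = StateCache()
for _ in range(500):                # app re-enables the feature every draw call
    cache.set_state("hw_feature", True)
# Only the first write reaches "hardware"; the other 499 are filtered out.
```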

Re:Not surprised (1)

windwalkr (883202) | more than 2 years ago | (#37115672)

One thing worth noting is that a change that makes no difference on the card you're testing with may make the difference between the game being playable or a slideshow on a different card.

Absolutely true. My anecdotes above were in regards to very specific hardware so this comment doesn't really change what I'm saying, but it's an important thing to understand in a general sense.

One particularly amusing issue I remember was with a new feature in Direct3D where I believe we were the only people who supported it in hardware at that time and everyone else emulated it in software; we got a new game from big name game developer X and it ran vastly slower on our card than on much less powerful systems. The idea was that you'd enable this feature once and then keep using it, but they were turning it on and off hundreds of times in a frame and each time that caused a major pipeline stall in our hardware. So once we figured that out we just detected the game and dropped back to software emulation like everyone else, but if they'd known what they were doing the game would have worked fine on all cards and been faster on ours because they'd actually have been using the hardware instead of the CPU.

To be fair, you're accusing the dev in question of not optimising for your card when you admit the card in question was unusual and probably released after the game was developed - otherwise you'd presumably have worked with them to improve their software. It's all well and good to say "they should have known better" (and I've used that line before, so I know how you feel), but if you're the odd man out, it's hard to blame the dev for not being able to predict how performance would change in the future - especially if the dev is using some kind of third-party engine or graphics library (e.g. Cg) where they don't necessarily have fine control over state changes.

I don't know the details of your case so I won't comment further, but it's worth remembering that there are two sides to every story.

Re:Not surprised (1)

Anonymous Coward | more than 2 years ago | (#37116712)

To be fair, you're accusing the dev in question of not optimising for your card when you admit that the card in question was unusual and probably released after the game in question was developed- otherwise you probably would have worked with them to improve their software?

On the other hand, it's pretty well known that issuing unnecessary state changes to 3D APIs is bad and can be costly. So even if they didn't know the extent of the problems it caused on that particular card, enabling and disabling something for no good reason a hundred times per frame is bad.

Re:Not surprised (1)

windwalkr (883202) | more than 2 years ago | (#37116876)

On the other hand, it's pretty well known that issuing unnecessary state changes to 3D APIs is bad and can be costly. So even if they didn't know the extent of the problems it caused on that particular card, enabling and disabling something for no good reason a hundred times per frame is bad.

Agreed- however in the (distant) past we've had to do exactly this because of bugs in the driver state caching. I've also seen Cg hitting state changes fairly hard on some platforms- there was an optimisation to prevent this but it used to cause memory leaks. It can be difficult to know exactly what's going on under the hood there and you can't really blame the application developers for this without knowing the specific circumstances.

Re:Not surprised (0)

Anonymous Coward | more than 2 years ago | (#37116512)

As a former game programmer I wholeheartedly agree. Somehow the people working on rendering engines often had a demo-scene mindset and were so engrossed in micro- and premature optimization that they wrote the most terrible code, and ended up hurting performance anyway due to bad code design/architecture.
I remember that (relatively big-name) MMO I worked on where the rendering engine couldn't handle transparency at first, because sorting polygons was deemed "too slow" by the guy who did the original implementation. An ambitious MMO that boasted realistic graphics - and we had to tell the artists anything transparent wasn't possible.

At the same time, he grouped things to render by shader by first throwing them all randomly into a list and then sorting it (even though by design there were only like 3 or 4 different shaders and thus each could have had its own render list).
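With only three or four shaders in play, the grouping described above needs no global sort at all: append each draw into a per-shader bucket as it is submitted, O(n) instead of O(n log n). A toy illustration (the shader and mesh names are hypothetical):

```python
from collections import defaultdict

def batch_by_shader(draw_calls):
    """Group (shader, mesh) draw calls by shader in a single pass -
    each bucket can then be rendered with one shader bind."""
    buckets = defaultdict(list)
    for shader, mesh in draw_calls:
        buckets[shader].append(mesh)
    return buckets

draws = [("skin", "npc_a"), ("terrain", "hill"),
         ("skin", "npc_b"), ("water", "sea")]
batches = batch_by_shader(draws)
# skin -> [npc_a, npc_b], terrain -> [hill], water -> [sea]
```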

Oh, and it couldn't handle multiple viewports even though it was obvious that a character paper doll would have to be rendered (it was also fun to use this engine in the level editor), because obviously having all the scene management structures statically allocated was necessary for good performance.

I laughed when I played the final version of the game and lighting in the game world affected lighting in the character screen.

Re:Not surprised (0)

Anonymous Coward | more than 2 years ago | (#37117054)

*shudders*
Tell me more! These kinds of horror stories interest me.

Re:Not surprised (1)

Walkingshark (711886) | more than 2 years ago | (#37117426)

Ah yes, Dark Age of Camelot. Good times!

Re:Not surprised (1)

Ecuador (740021) | more than 2 years ago | (#37117374)

Perhaps, though, the reason is what is apparent from this article. It seems the developers had a (cash, obviously) incentive to make one manufacturer's cards look better. While they could optimize for that manufacturer, most sensible optimizations would likely benefit the other manufacturer too, and finding optimizations that work much better only on your preferred manufacturer would be too hard to do.
So, what if you know there is a particular function that is very slow on the manufacturer you want to show in a bad light? The easiest thing to do is to issue millions of useless calls to this function, and you've got it. This is exactly what is going on here: instead of using tessellation sensibly, they just threw in loads of it wherever it was easy (a whole tessellated sea under the city, brilliant!) and problem solved. The AMD driver developers will find it retarded (if they don't outright attribute it to malice) and will try to code around it.
What makes me shudder, though, is that a developer would compromise the user experience to make a hardware manufacturer look good. Yes, AMD users get hit harder, but these dis-optimizations do not come free on nVidia cards either. So all Crysis 2 customers get lower performance on their hardware; for example, it is possible that a GTX 560 user could have had the same experience a GTX 570 user gets now, if it were not for this crap.
I am a developer myself (not games), and there are some things that are familiar:
- Optimize the app, we want to make it as good as possible for our users.
- Forget about optimizing the app, we just have to ship it.
- Cripple the non-paying version of the app.
What is definitely out of my experience is: Cripple the app that people paid good money for.

Never attribute to malice...? ha! (2, Interesting)

Anonymous Coward | more than 2 years ago | (#37115128)

Never attribute to malice that which is adequately explained by stupidity

It's entirely possible that the tessellation is per-node. E.g., in the case of the barrier, only the top seems to benefit, in that the handles jut out (why those handles aren't polygons to begin with is another question, given that it would take only 8 or so for each, hardly making a dent in polygon budgets), but it's the entire thing that gets tessellation applied. Similarly, unseen parts get tessellated (why there is water underneath a city that will never be seen is yet another question).

So while it could be explained by stupidity... when you're working on a high performance game, the problems indicated in that article would have quickly been dealt with. So perhaps malice is in play.

I suspect this will be (partially) fixed in an upcoming patch, as I doubt they'd want to be known as the game that was dropped from benchmarks due to apparent bias.

Exactly. Perhaps a better phrase... (1)

mykos (1627575) | more than 2 years ago | (#37115258)

Perhaps a better phrase would be "Never attribute to stupidity what can be explained with cold hard cash".
A title with the triple-A budget of Crysis 2 wouldn't have developers who never once bothered to view the map in wireframe at some point before release.
Take a look at all the TWIMTBP/Nvidia logos slapped all over the game and you know who is paying the bills.

Re:Exactly. Perhaps a better phrase... (2)

scumdamn (82357) | more than 2 years ago | (#37116094)

This one is just blatant as fuck. Tessellated invisible water running under everything? Really? Nvidia has been touting their better tessellation performance for how long now? And Crysis was the benchmark of choice so they had to go muddying up Crysis 2 to try to get the advantage.

3D ready (4, Interesting)

gmuslera (3436) | more than 2 years ago | (#37115154)

Once more, software is steps ahead of hardware. The game is ready for hologram projectors; if you can't see those layers of water, it's because you are using a 2D display.

Re:3D ready (2)

Opportunist (166417) | more than 2 years ago | (#37115230)

Damn right, in a 4D display you could see them, even if they're underground!

Re:3D ready (1)

Arancaytar (966377) | more than 2 years ago | (#37116884)

In a 4D display the game would have to render the objects that will be there at some future point, too.

Re:3D ready (1)

Osgeld (1900440) | more than 2 years ago | (#37116008)

I know you're joking, but the current state of holographic projection is a convex mirror; you can in fact replicate it with a chrome popcorn bowl and a flashlight.

What a minute (2, Funny)

SilverHatHacker (1381259) | more than 2 years ago | (#37115240)

People actually play Crysis? I thought the whole reason it was made was to be a test for your graphics card? One giant benchmark.

Re:What a minute (1)

scumdamn (82357) | more than 2 years ago | (#37116102)

Crysis 2 is now one giant benchmark that is biased toward Nvidia.

Re:What a minute (1)

TheLink (130905) | more than 2 years ago | (#37116150)

Uh that's why this is a problem. If the report is true, it's no longer a good benchmark, but a skewed one.

Re:What a minute (0)

Anonymous Coward | more than 2 years ago | (#37116834)

It never was a good benchmark - certain GPUs have always handled things better and worse than others so any one particular game is always going to perform better in some areas and worse in others depending on your card. If you're using one specific game as your benchmark you're always going to find other games that under or overperform your expectations. Basing an entire generation of PC gaming benchmarking on a single product was always idiotic - all they've done here is skew it to the absurd.

Re:What a minute (1)

paziek (1329929) | more than 2 years ago | (#37116844)

I got it in a bargain for $15, played the singleplayer, and it wasn't that bad. Didn't play the prequel, but this one played pretty well. For $15 I would say it was a fair price; if it had cost more, I would have passed on it, just as I did at its launch. It's too bad, though, that it ended so quickly. I wanted more...

Re:What a minute (0)

Anonymous Coward | more than 2 years ago | (#37117340)

Odd. I thought that Crysis 2 was a tech demo for CryEngine. The main clue is that the horizontal FoV is about ten degrees. Great for a tech demo (the smaller the proportion of the scene that is rendered, the more detailed you can make it), not so good for a game, particularly one with enemies that use melee attacks.

Not surprised (1)

thegarbz (1787294) | more than 2 years ago | (#37115298)

Crysis's claim to fame was that it gave the GPU a real workout, and it did. They ended up rendering a whole world of extra detail to make a realistic-looking environment. Along comes Crysis 2 and frankly I am not at all impressed. On a computer that has no trouble handling any other game, I had to drop the quality settings to ultra-ugly to make the game playable. I'd prefer less pretty garbage on the screen than having to play a game at a resolution where the pixels are the size of a man's fist.

It just seems to me that Crysis 2 has lost the plot, and it's no longer about making the prettiest game, but rather making the most poorly optimised one.

Re:Not surprised (0)

Anonymous Coward | more than 2 years ago | (#37115392)

Mmmm ...

They basically did away with the (almost free-form) sandbox that allowed you to develop an insane number of strategies and routes to complete mission objectives; the environment is way less chaotic and more structured (I don't much like urban environments for shooters). They say it allows you more vertical freedom (it doesn't), and it feels like you're on console rails as you navigate your way through it. Compared to the original, online sucks, single player sucks... all in all, this puppy has nothing like the immersiveness and flair of the original.

The graphics? I started out in DirectX 9, installed the upgrades, loaded the DirectX 11 variant with hi-res textures... and went back to DirectX 9. Bottom line: the graphics are inferior to the 2007 original, and DirectX 11 detracts from the experience. (And I have a pretty high-end graphics card that runs the puppy with high-res textures at high resolution at better than 80 fps.)

All in all, Crysis 2 has been a bit of a disappointment. We'll see what the engine manages to produce in the way of other games over the next year or so ... but Crytek really screwed the pooch on this one.

Re:Not surprised (1)

will_die (586523) | more than 2 years ago | (#37115966)

Crysis 1's claim to fame was that it was a great game that also gave the GPU a real workout. If the game had sucked, it would not have been used to show off graphics capability; there are other games that provided high-end graphics but not the gameplay.
Crysis 2 does not look bad even without this DX11 patch, but the gameplay does suck.
However, Crysis 2 does kill the claim from people that what they want is gameplay, not graphics.

Look at the screenshots! (-1)

Anonymous Coward | more than 2 years ago | (#37115324)

Take the time to actually look at the pictures... the images are substantially better in the updated version, despite what the article says. Yeah, sure, not as optimized as they could have been... but the improvement is noticeable far beyond the things the article draws attention to.

Re:Look at the screenshots! (1)

scumdamn (82357) | more than 2 years ago | (#37116108)

I guess you didn't notice the water running under the street, then? Saying "not as optimized as they could have been" is like saying the budget deficit is a tad large.

As someone who works at AMD (1)

Anonymous Coward | more than 2 years ago | (#37115348)

and whose views don't represent those of the company in any official capacity: this pisses me off.
I don't believe for a second it was an accident. This is bare-knuckles marketing, pure and simple, and I'm glad it's getting some attention.

Re:As someone who works at AMD (0)

Osgeld (1900440) | more than 2 years ago | (#37115964)

Did not look at the pictures, did you? There is a significant improvement, and please don't be mad because you're designing inferior CPUs and shit video chipsets that haven't had a decent driver since the Mach 64.

Improvement? (0)

Anonymous Coward | more than 2 years ago | (#37116138)

Which is the significant improvement: the invisible water or the single-pixel polygons?

Yes, it looks better, but it would look just as good without the invisible details. This is Crytek throwing a bomb that Nvidia crafted for them, and AMD is left cleaning up the wreckage.

buzz off bozo (1)

unity100 (970058) | more than 2 years ago | (#37117454)

and come back when you've learned what "INVISIBLE UNDERWATER POLYGONS" means.

Whiny (0)

HumanEmulator (1062440) | more than 2 years ago | (#37115418)

So let me get this straight... A free update makes the game look better by using new DirectX 11 features, but the whole article criticizes the game for using a hardware technology (one that's only just starting to appear in game engines) in a way that isn't as optimized as they would like? Are gamers feeling that entitled these days? If you speculatively purchase faster hardware, it's not anybody's obligation to write software that pushes it to the limit, you know.

Re:Whiny (0)

Anonymous Coward | more than 2 years ago | (#37115510)

No, it's more than that. nVidia has tried to pull this sort of thing before, with HAWX 2. They're pushing excessive tessellation because their cards are designed with a greater focus on it, so they look better when the developer puts in stupid amounts of it.

Re:Whiny (1)

Osgeld (1900440) | more than 2 years ago | (#37115972)

so it's nvidia's fault that a cheap one-hit-wonder tech demo company does not know how to make a decent game

right

Have you run the game? (0)

Anonymous Coward | more than 2 years ago | (#37116040)

You see an advertisement every time you start a game of Crysis 2. Guess whose it is? (Hint: it's not AMD).

Heavy, unnecessary tessellation in Crysis 2 was predicted months before release [kitguru.net] . Lo and behold, those nasty rumors have now been proven accurate. Nvidia has become a very predictable abuser of its market position.

"The Way It's Meant to Be Paid"

Re:Whiny (1)

scumdamn (82357) | more than 2 years ago | (#37116124)

Was their hit FarCry, FarCry 2 or Crysis?

Re:Whiny (1)

The Dawn Of Time (2115350) | more than 2 years ago | (#37115854)

What do you mean? Gamers have acted entitled for years. They whine and cry when they don't get every last thing they want in the way they want it, and for free to boot, and that's been the case since the 90s. The only way it could get worse is if they decided they deserve to be paid to deign to play the games.

Re:Whiny (1)

Osgeld (1900440) | more than 2 years ago | (#37115996)

Console gamers whine and cry; PC gamers vote with their money. If the game is good, they spend it. Consolers buy whatever crap fad company X pushes and whine when Kinect is not nanosecond-perfect. PC gamers who want bigger, better graphics buy a fucking video card; consolers, on the other hand, whine for half a decade about not having AA at 720p and then buy another 6 games in 4 months.

Or in other words, you have your story backwards.

Re:Whiny (0)

Anonymous Coward | more than 2 years ago | (#37116234)

You come across as whiny, you know?

Re:Whiny (0)

Anonymous Coward | more than 2 years ago | (#37117306)

PC gamers steal games on an industrial scale and then whine like bitches when their favourite games don't get made as dedicated PC exclusives and come filled with intrusive DRM. Console gamers never start the whole console vs. PC argument - it's always that whiny PC crowd that tries to justify a life spent downloading driver updates and investigating crashes and buying a new graphics card every six months and being totally dependent on Microsoft's OS by insisting that PC games are so much better than console games.

PC gamers are just console gamers with better Internet access. This makes them exponentially more whiny.

hey fucktard (1)

unity100 (970058) | more than 2 years ago | (#37117480)

if a graphics card company and a gaming company conspire together to skew benchmarks and rip me off by deceiving me, i feel entitled to many, many things.

if you do not feel the same when someone attempts to deceive and defraud you, you are a moron of the first order and i have a bridge to sell you.

Re:Whiny (1)

scumdamn (82357) | more than 2 years ago | (#37115890)

No. There's water under the ground taking up valuable GPU time. It's slowing performance everywhere. Just happens to be worse on AMD cards.

Dont act like a moron. (1)

unity100 (970058) | more than 2 years ago | (#37117470)

Yeah. READ the article before talking like you did above and making yourself stand out as a moron.

the problem here is, nvidia used some programming gimmicks to make their cards perform better by creating extra load from polygons that are rendered AS WATER UNDER LAND, and are therefore INVISIBLE.

no benefit to gamers here. no benefit to anyone. NO ONE WILL BE ABLE TO SEE WHAT IS BEING RENDERED.

however, this creates extra, unnecessary load in a way that some nvidia chips can handle better, making competitors look bad.

let me summarize: nvidia is incompetent, unable to beat ati at the REAL game of rendering 3d, and is resorting to underhanded tactics like a son of a whore, DECEIVING GAMERS AND COSTING THEM CASH in the process.

this is basically an assault against my wallet. your wallet. gamers' wallets.

Great way to move (1)

k4f (2433858) | more than 2 years ago | (#37115914)

So all that hidden geometry is there [goo.gl] to make sure anyone with anything less than a top-end GPU basically chokes to death rendering unseen details. A great way to move those premium-class GPUs!

Not really a big deal (1)

kayoshiii (1099149) | more than 2 years ago | (#37116220)

As best I can tell, this essentially boils down to retrofitting DirectX 11 onto an already-designed engine after the fact, and doing so in a limited time frame.
I don't really see it as a big deal because a) the game was originally designed for DirectX 9 hardware, so anybody trying to run it on DirectX 11 hardware will probably do just fine anyway, and b) the way modern graphics cards are designed, this extra geometry generated on the GPU may not even be the bottleneck.

I think the engines that will really take advantage of these technologies are only being built now. At work we use one of the engines that has had this feature the longest (http://www.youtube.com/watch?v=9F6zSgtRnkE). Other companies we know using the same engine aren't using these features because not enough players have the graphics hardware to support them. I think this will mark the beginning, though.

Incidentally, the way tessellation is implemented in the engine we are using is with two maps: the first is a displacement map, and the second specifies the level of tessellation, which significantly lowers the tessellation on flat surfaces.
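
That second map can be sketched in a few lines. This is a hypothetical illustration, not the engine's actual code: the function name and the 0..1 density convention are assumptions; only the upper bound of 64 comes from DirectX 11's maximum tessellation factor.

```cpp
#include <algorithm>

// Hypothetical: `density` is a 0..1 value sampled from the artist-authored
// tessellation map; returns a subdivision factor in [1, 64] (the DX11 range).
// Flat areas painted with density 0 get factor 1, i.e. no extra triangles.
float tessFactor(float density, float maxFactor = 64.0f) {
    density = std::clamp(density, 0.0f, 1.0f);
    return 1.0f + density * (maxFactor - 1.0f);
}
```

A wall authored as flat thus costs nothing extra, while a rocky surface painted near 1.0 gets the full subdivision budget, which is exactly the per-surface control Crysis 2's blanket tessellation lacks.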

Well duh. (1)

mad_minstrel (943049) | more than 2 years ago | (#37116248)

Of course there are going to be a lot of flat surfaces... After all, the artists have been told to make them with a lot of flat surfaces so that they don't need too many polygons on the non-DX11 platforms. If you want to see artwork that uses tessellation well, you have to tell artists to make some.

tessellation of flat surfaces (2)

gl4ss (559668) | more than 2 years ago | (#37117100)

makes the shading on them look different, so it's not all wasted vertices (well, depending on how the shading is calculated). You can easily test this in a modeller: make a cube with each side made of two triangles and observe how it's shaded with basic OpenGL shading; now turn on some tessellation (while keeping the shape as it is) and you can see the difference. You can see highlights on flat surfaces even without applying some fake Phong technique.

This or any graphics upgrade doesn't help with Crysis 2 lacking complexity due to its launch on consoles, though, so who cares; the memory and CPU available for the game logic were dictated by that.
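
A toy model of that effect, assuming simple per-vertex (Gouraud-style) lighting, where the lighting equation is evaluated only at vertices and interpolated across each triangle:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

// Lambert diffuse term evaluated at a vertex of a flat, upward-facing
// surface (normal fixed at (0,1,0)) for a point light at `light`.
// Under Gouraud shading this runs per vertex, then gets interpolated.
float vertexDiffuse(Vec3 vertex, Vec3 light) {
    float dx = light.x - vertex.x;
    float dy = light.y - vertex.y;
    float dz = light.z - vertex.z;
    float len = std::sqrt(dx * dx + dy * dy + dz * dz);
    return std::max(0.0f, dy / len);  // dot(normalize(toLight), (0,1,0))
}
```

With a light at (0,2,0) above a 2x2 quad, the four corner vertices all evaluate to about 0.816 while a vertex at the centre evaluates to 1.0. A two-triangle quad only interpolates between the corner values and never shows the bright spot; a tessellated quad with a vertex near the centre does, which is the shading difference described above.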

Re:tessellation of flat surfaces (0)

Anonymous Coward | more than 2 years ago | (#37117410)

I guess you missed the point of the article where it is shown there really is no difference in the rendered frames.
There is also the more interesting point of the underground sea with millions of polygons corresponding to zero pixels.

ATI is crap (-1, Troll)

loufoque (1400831) | more than 2 years ago | (#37117350)

Since when is this newsworthy?

Re:ATI is crap (1)

unity100 (970058) | more than 2 years ago | (#37117494)

tell that to the great-performing cheap ati cards i've bought over the last 8 years while morons have been shelling out cash to this son-of-a-whore company, which doesn't refrain from deceiving and defrauding them of their money.

but maybe you like getting defrauded. that's your preference and i respect it.

Re:ATI is crap (1)

loufoque (1400831) | more than 2 years ago | (#37117562)

ATI is the one trying to trick you.

Their cards have less computing power, but they make up for it with nifty tricks that only work in certain cases. As soon as you get out of these idealized cases, performance drops dramatically.

Too many polygons, so what? (1)

loufoque (1400831) | more than 2 years ago | (#37117384)

Games use too many polygons; so what? They also use too much RAM, too much disk space, and too much processing power in general.
The important thing in video games is making them work, not making them optimal.

Why is tessellation applied everywhere, even on relatively flat surfaces? Because the development team did not waste time studying each object one by one; the tessellation was computer-generated for everything.

read first, moron. (1)

unity100 (970058) | more than 2 years ago | (#37117500)

the issue here is that, the process is done in water that is UNDER LAND and will not be seen by any son of god on this planet in any way.

basically its a hidden object that favors some nvidia chips, and makes the competitor cards get choked.

its fraud.

What a shocker! (0)

Anonymous Coward | more than 2 years ago | (#37117442)

Subdividing the triangles that constitute a plane gives you more triangles without any improvement in geometric accuracy. And geometries in games mainly consist of planes.
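
That point is easy to verify: midpoint subdivision of a planar triangle only produces vertices that already lie on the same plane, so the shape cannot change unless a displacement map moves the new vertices. A quick sketch (the plane and triangle are arbitrary choices for the demonstration):

```cpp
struct P { double x, y, z; };

// Midpoint of two vertices, i.e. the new vertex one level of
// subdivision adds on each edge.
P midpoint(P a, P b) {
    return {(a.x + b.x) / 2.0, (a.y + b.y) / 2.0, (a.z + b.z) / 2.0};
}

// Signed deviation of a point from the plane x + y + z = 3, the
// (arbitrary) tilted plane containing the demo triangle below.
double planeError(P p) {
    return p.x + p.y + p.z - 3.0;
}
```

For the triangle (3,0,0), (0,3,0), (0,0,3) on that plane, all three edge midpoints have zero plane error: one subdivision level yields four triangles and three new vertices, every one of them coplanar with the original, so no new detail appears.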

Re: read first, moron. (1)

loufoque (1400831) | more than 2 years ago | (#37117570)

I did read TFA.
It's a missed optimization. Are optimizations compulsory now?

If ATI cards can't deal with a higher computation load, it's just because they're not as good, that's all there is to it.

Re: read first, moron. (1)

loufoque (1400831) | more than 2 years ago | (#37117574)

Oops, wrong place in thread. Can someone remind me why slashdot still doesn't allow editing or deleting posts?
