
NVidia Accused of Inflating Benchmarks

michael posted more than 11 years ago | from the pointing-fingers dept.

Graphics 440

Junky191 writes "With the NVidia GeForce FX 5900 recently released, this new high-end card seems to beat out ATI's 9800 pro, yet things are not as they appear. NVidia seems to be cheating on their drivers, inflating benchmark scores by cutting corners and causing scenes to be rendered improperly. Check out the ExtremeTech test results (especially their screenshots of garbled frames)."


FIRST POST (-1, Troll)

anonymous cowfart (576665) | more than 11 years ago | (#5963213)

First post!

Detonators rox!

Inflating (-1, Funny)

smg_mrBlonde (629104) | more than 11 years ago | (#5963226)

I guess size does matter.

They are soooo busted. (1)

BoomerSooner (308737) | more than 11 years ago | (#5963523)

Although does it really matter? Even with errors in a driver they beat the living hell out of ATI (in my opinion).

Giveing them self a bad name (3, Interesting)

SRCR (671581) | more than 11 years ago | (#5963232)

To bad Nvidia has to resort to these things to keep selling there cards.. The used to be great.. but now i have my doubts..

Re:Giveing them self a bad name (0, Flamebait)

Alan Partridge (516639) | more than 11 years ago | (#5963400)

you dick

it's BECAUSE of idiots like you who actually think it matters if card x gives 229fps on Quake III where card y can only manage 227fps that these companies are forced to pull this kind of trick. Companies like AMD, nVidia and ATi are trying to do business in a world where people actually place importance in the blatherings of the self-important know-nothings that run sites like Hard OCP and Toms Hardware.

Re:Giveing them self a bad name (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5963466)

Ph34r M3 4l4n 94r7r1dg3.. Mr. Smarty I Have Won the Fucking Spelling Bee, of that there ain't no question.

PH34R M3!!!!!

Re:Giving themselves a bad name (1)

Alan Partridge (516639) | more than 11 years ago | (#5963479)

+1, Quite Funny

Re:Giving themselves a bad name (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5963553)

d0nu7 m0ck m3, 4l4n 94r7r1dg3!

3y3 w1ll h4x0r j00r b0x3n 4nd s734l 4ll j00r pr0n!!!!!!!!!

Re:Giveing them self a bad name (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5963414)

"great. Now ..." .. is not a proper form of punctuation. The use of the conjunction "but" is redundant.

"my doubts" is also redundant, forcing one to wonder if you're possibly schizophrenic. Who else's doubts might you have?

Too bad our educational system allows people to move on with tripe like this being the best they can (or are motivated to) produce. Your post consists of 25 words and nearly half as many errors. Bravo.

Re:Giveing them self a bad name (1)

Alan Partridge (516639) | more than 11 years ago | (#5963462)

Good, but you missed the parent's title, which I assume was SUPPOSED to read "Giving themselves a bad name"

Re: seems misleading.. (3, Interesting)

op51n (544058) | more than 11 years ago | (#5963458)

Upon reading the article, it even states that nVidia doesn't have access to this version of 3DMark2003 (they're not on the beta team), so there can be errors between the drivers and 3DMark's code without their knowing. This is the kind of thing that can happen, and will take a driver update to fix, but it does not necessarily mean they are doing anything wrong.
As someone who has always been impressed by nVidia's driver updates and the benefits they bring each time, I am going to wait and see whether it really is something bad they are doing deliberately before changing my opinion of them.

There is, at the moment, no real evidence in anyone's favour.

What's the big news? (5, Insightful)

binaryDigit (557647) | more than 11 years ago | (#5963235)

Isn't this SOP for the entire video card industry? Every few years someone gets caught targeting some aspect of performance to the prevailing benchmarks. I guess that's what happens when people wax on about "my video card does 45300 fps in quake and yours only does 45292, your card sucks, my experience is soooo much better". For a while now it's been the ultimate hype driven market wrt hardware.

Re:What's the big news? (3, Interesting)

diesel_jackass (534880) | more than 11 years ago | (#5963265)

I know, I thought this was common practice across the board in the video card industry. NVidia has always had the shadiest marketing (remember what the 256 stood for in the GeForce 256?) so I don't really think anyone would be surprised by this.

Re:What's the big news? (2, Interesting)

TopShelf (92521) | more than 11 years ago | (#5963401)

In a way, it's a symptom of the importance that these benchmarks have assumed in reviews. Now, cards are tweaked towards improved performance within a particular benchmark, rather than improving overall.

Re:What's the big news? (4, Insightful)

newsdee (629448) | more than 11 years ago | (#5963442)

Now, cards are tweaked towards improved performance within a particular benchmark

This is always the case with any chosen performance measurement. Look at managers asked to bring quarterly profits. They tend to be extremely shortsighted...

Moral of the story: be very wary of how you measure, and always add a qualitative side to your review (e.g. in this case, "driver readiness/completeness").

Problem is the benchmarks themselves (4, Interesting)

Ed Avis (5917) | more than 11 years ago | (#5963550)

Why is it that people are assessing the performance of cards based on running the same narrow set of benchmarks each time? Of _course_ if you do that then performance optimization will be narrowly focused towards those benchmarks. Not just on the level of blatant cheating (recording a particular hardcoded text string or clipping plane) but more subtle things like only optimizing one particular code path because that's the only one the benchmark exercises.

More importantly why is any benchmark rendering the exact same scene each time? Nobody would test an FPU based on how many times per second it could take the square root of seven. You need to generate thousands, millions of different scenes and render them all. Optionally, the benchmark could generate the scenes at random, saving the random seed so the results are reproducible and results can be compared.
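The randomized-scene idea the parent describes can be sketched in a few lines. This is a toy model with invented scene parameters; a real benchmark would render these scenes rather than just generate them:

```python
import random

def generate_scenes(seed, count=1000):
    """Generate pseudo-random scene descriptions from a saved seed.

    Because the seed is recorded, any reviewer can regenerate the exact
    same scene list and reproduce the run, yet a driver cannot hard-code
    clip planes for scenes it has never seen before.
    """
    rng = random.Random(seed)
    scenes = []
    for _ in range(count):
        scenes.append({
            "camera_pos": [rng.uniform(-100, 100) for _ in range(3)],
            "camera_yaw": rng.uniform(0, 360),
            "object_count": rng.randint(50, 500),
        })
    return scenes

# Same seed -> identical scene list -> reproducible benchmark results.
run_a = generate_scenes(seed=42)
run_b = generate_scenes(seed=42)
assert run_a == run_b
```

Publishing the seed alongside the score keeps results comparable between reviews while leaving the actual workload unpredictable to the driver.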

Lies (1, Funny)

Flamesplash (469287) | more than 11 years ago | (#5963437)

Lies, Damn Lies, and Marketing

Re:What's the big news? (3, Insightful)

Surak (18578) | more than 11 years ago | (#5963297)

Goodbye, karma. ;) And, realistically, what does it matter? If two cards are similar in performance, but one is just a little bit faster, in reality it's not going to make *that* much of a difference. You probably wouldn't even notice the difference in performance between the new nVidia card and the ATI 9800, so what all the fuss is about, I have no clue.

Re:What's the big news? (2, Funny)

Cloud 9 (42467) | more than 11 years ago | (#5963430)

You probably wouldn't even notice the difference in performance between the new nVidia card and the ATI 9800, so what all the fuss is about, I have no clue.

Two things, both related to the key demographic:

1) When you're spending $200USD or more on any piece of hardware, you want to know that your purchasing decision was the best one you could make. Given that the majority of the people making these big-buck video card purchasing decisions are males in high school/college, who in general don't have that much money to begin with, the distinction between the cream and the crap can easily come down to the matter of a few hundred 3DMarks.

2) Penis size. When previously mentioned teenage boys buy the biggest, baddest video card there is, they typically like to rub that fact in all their friends' noses.

Re:What's the big news? (1, Funny)

Ed Avis (5917) | more than 11 years ago | (#5963502)

Please run your posts through a mixed-metaphor checker before pressing 'Submit' :-P.

NVIDIA == Thieves and Liars if et is correct (1)

SubtleNuance (184325) | more than 11 years ago | (#5963513)

so what all the fuss is about, I have no clue.

The fuss is about the honesty of nVidia's business practices. I don't know about you, but I do not excuse dishonesty from business people -- they should be held to a very high standard.

If what ExtremeTech is saying is true (that nVidia purposefully wrote their driver to identify a specific benchmark suite, and then inflate only its results), it would be incredibly significant. If so, I would *NEVER* buy another nVidia product again -- and I would make that clear to the (many, unfortunately) people with whom I speak regularly about computer-purchase decisions...

Re:Does this even improve your experience? (5, Funny)

Hellkitty (641842) | more than 11 years ago | (#5963300)

You make an excellent point. I am tired of spending way too much money trying to reach that holy grail of gaming. The slight improvement in hardware isn't going to change the fact that I'm only a mediocre gamer. The best gamers are going to kick my ass regardless of what hardware they use. I don't need to spend $400 every six months to be reminded of that.

Re:Does this even improve your experience? (1, Insightful)

Pulzar (81031) | more than 11 years ago | (#5963421)

Why do you feel obligated to post the "I don't care about the zillion fps in quake"? Do you post a similar message to every story that you don't care about?

This is a big deal to people who care -- it insults the reviewers who spent hours benchmarking their card, and it insults the users who bought/will buy their card. There are people who care, and people who do want the fastest card for a reason, and they are interested to hear from other people who care, and not the people who don't!

Re:Does this even improve your experience? (1)

Joe the Lesser (533425) | more than 11 years ago | (#5963561)

I know man, every time I think I've found the holy grail of gaming, it just turns out to be a beacon!

Re:What's the big news? (1, Insightful)

Anonymous Coward | more than 11 years ago | (#5963324)

They all do it. You just need a proper NT-style WHQL test, which checks each pixel of the output to make sure it's rendered according to spec. Would you believe this isn't done, so all the tests carried out by manufacturers only tell you how quickly `something` was rendered?
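A per-pixel conformance check of the kind described above is easy to sketch. Framebuffers are modeled here as flat lists of RGB tuples; a real WHQL-style test would compare against a software reference rasterizer, and the `tolerance` parameter acknowledges that different cards' filtering techniques legitimately differ slightly:

```python
def pixels_match(reference, output, tolerance=0):
    """Compare a rendered framebuffer against a reference, pixel by pixel.

    Returns (ok, mismatches) where mismatches lists the indices of pixels
    whose per-channel difference exceeds `tolerance`. Speed tests alone
    tell you how fast *something* was drawn; this tells you *what* was drawn.
    """
    if len(reference) != len(output):
        return False, list(range(max(len(reference), len(output))))
    mismatches = [
        i for i, (r, o) in enumerate(zip(reference, output))
        if any(abs(rc - oc) > tolerance for rc, oc in zip(r, o))
    ]
    return not mismatches, mismatches

ref = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
ok, bad = pixels_match(ref, [(255, 0, 0), (0, 255, 0), (0, 0, 255)])
assert ok and not bad
ok, bad = pixels_match(ref, [(255, 0, 0), (0, 254, 0), (0, 0, 255)])
assert not ok and bad == [1]
```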

Re:What's the big news? (1)

Alan Partridge (516639) | more than 11 years ago | (#5963446)

A large part of the technology that goes into current cards is related to image quality (hence the different aniso, AA and colour-handling techniques), so you could not likely have a situation where the output of two cards using different techniques matches so precisely.

Re:What's the big news? (5, Interesting)

Anonymous Coward | more than 11 years ago | (#5963406)

Posting anonymously because I used to work for a graphics card company.

I've seen a video card driver where about half the performance-related source code was put in specifically for benchmarks (WinBench, Quake3, and some CAD-related benchmarks), and that code was ONLY used when the user was running said benchmark. This was one of the MAJOR consumer cards, people.

So many programming hours put into marketing's request to optimize the drivers for a particular benchmark. It makes me sick to think that we could have been improving the driver's OVERALL performance and adding more features! One of the reasons I left......

Re:What's the big news? (2, Interesting)

Anonymous Coward | more than 11 years ago | (#5963559)

Now we know why there is no chance of open-sourcing the NVidia drivers on Linux.

Hmmmm (2, Interesting)

the-dude-man (629634) | more than 11 years ago | (#5963245)

Well they got caught... they obviously aren't too good at it; after all, they did get caught.

I don't know why anyone ever cheats on benchmarks. How could you ever get away with it? Do you really think no one is going to do their own benchmark? Come on. This is probably one of the most retarded things I have ever seen a company do.

Oh well, Nvidia is getting to the point where they are going to have to beat out ATI at some point if they want to survive.

Re:Hmmmm (5, Informative)

drzhivago (310144) | more than 11 years ago | (#5963285)

Do you remember how a year or so ago ATI released a driver set that reduced image quality in Quake 3 to increase frame rate?

Here [] is a link about it in case you forgot or didn't know.

It just goes to show that both companies play that game, and neither with good results.

Re:Hmmmm (1)

goonerw (99408) | more than 11 years ago | (#5963433)

What about software vendors that prohibit you from publishing benchmarks unless they are in the company's favour?

Don't forget, ATI cheated too. (0)

Anonymous Coward | more than 11 years ago | (#5963485)

Remember when the Radeon 8500 drivers fucked with mipmapping to give better benchmarks?

Re:Don't forget, ATI cheated too. (1)

Alan Partridge (516639) | more than 11 years ago | (#5963557)

yeah, what a party that was!

We won't see times as good as those coming back too soon!

Rock & Roll!

Fuck 'em! (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5963246)

The bigger a corporation gets, the higher the chance it will start to suck.

Do something good today, chop a corporation in half.

I don't know (5, Insightful)

tedgyz (515156) | more than 11 years ago | (#5963247)

I have read two reviews on AnandTech [] and [H]ardOCP [] . Neither of them made any such accusations. They both said visual quality was fine.

Targeting benchmarks is just part of the business. When I was on the compiler team at HP, we were always looking to boost our SPECint/fp numbers.

In a performance driven business, you would be silly not to do it.

Re:I don't know (1)

Pike65 (454932) | more than 11 years ago | (#5963332)

Speaking of HardOCP, they saw this kind of thing coming, and have been leading a drive against artificial benchmarks since February. They seem to have been focusing on engine-based testing since then, with the likes of Quake 3 and UT in their reviews.

More information here [] .

You're right, you don't know (3, Informative)

krog (25663) | more than 11 years ago | (#5963335)

The point is that visual quality *was* fine... within the benchmark's prescribed path. "Off the rail" is where the problems started occurring.

This is why all software and hardware should be open-source.

Re:You're right, you don't know (2, Insightful)

Anonymous Coward | more than 11 years ago | (#5963444)

This is why all software and hardware should be open-source.

Right, and why all your bank records should be public (just in case you are stealing and have illegal income). And all your phone records should be public, as well as details of your whereabouts (just in case you're cheating on your wife/skipping class). And of course, why the government should have access to all your electronic transmissions (internet, cell, etc.), just in case you're doing something that they don't like.

Re:You're right, you don't know (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5963465)

makes you wonder why NVidia insists on keeping their drivers closed source, eh?

Re:You're right, you don't know (1)

Alan Partridge (516639) | more than 11 years ago | (#5963533)

I'm sure nVidia would gladly open source their drivers if all their competitors would do likewise, and if they didn't think that some Chinese pirates would just copy every bit of it and destroy their legitimate business overnight.

Re:I don't know (2, Interesting)

eddy (18759) | more than 11 years ago | (#5963346)

Read the article. The cheating does not directly affect quality. Then how is it cheating, I hear you ask? Because it only increases performance in the _specific_ scene and path rendered in the benchmark.

This is similar to claiming to have the world's fastest _calculator_ of decimals of Phi, only to have it revealed that you're simply doing std::cout << phi_string << std::endl;

ATI, Trident [] and now nVidia. I really hoped nVidia would stand above this kind of lying.

Re:I don't know (1)

georgep77 (97111) | more than 11 years ago | (#5963349)

Today I'm sure you will see some additional content at these sites to discuss the issue. If you actually READ the article at extremetech you can see that nvidia has been caught at a blatant cheat. I think the guy/guys that had to sit through 3DMark 2k3 and manually create the hard coded culling frames (how boring would that be) are going to be the most pissed over this. :-) The scary aspect of this is that this same cheat could easily be hardcoded for any of the well known "packaged" in game benchmarks (timedemo on Quake3, Antalus on Unreal2k3 etc). I just wonder if a class action lawsuit will result from this...

Re:I don't know (0)

Anonymous Coward | more than 11 years ago | (#5963357)

It's not like they actually optimized anything that might apply in real life.

They apparently specifically detected when the benchmark was running and then did specific things to make it faster, things that wouldn't work in anything but the benchmark.

If you had read the article.....

Anyway, the benchmark follows the same path every time, so they apparently put code in the driver to tell the card not to render things they knew would be visible in the benchmark. The reviewer has a developers version of the benchmark that Nvidia did not have, one that allows you to move around the scenes.

Moving around the scenes makes it apparent that Nvidia hard coded clipping paths for the scenes in the benchmark.

Re:I don't know (1)

apdt (575306) | more than 11 years ago | (#5963495)

I have to agree. I read a review of the card on Tom's Hardware Guide [] which showed it beating the ATI card in real-world benchmarks, that is, games (including Doom III, incidentally).

The ExtremeTech article seems wildly speculative, without any real evidence. Sure, it's possible, but there are a number of other possible explanations.

Also, if NVidia are cheating on benchmarks, how come it doesn't show up in real-world tests....

Read the Anandtech article again (0)

Anonymous Coward | more than 11 years ago | (#5963496)

They were denied the ability to comment on the image quality as per Nvidia's orders.


whatever (2, Insightful)

JeffSh (71237) | more than 11 years ago | (#5963251)

I looked at the photos, and it seems to me to be just a driver fuckup on the 3dmark benchmarks.

Since when did rendering errors caused by driver problems become "proof" of a vendor inflating benchmarks?

And this story was composed by someone with the qualifications of "Website content creator, who likes video games a lot" -- not a driver writer, not anyone technically inclined beyond the typical geek who plays a lot of video games and writes for a website called "EXTREME tech", because you know, their name makes them extreme!

note: I'm not an Nvidia fanboy; I just bought an ATI Radeon 9500. I am simply skeptical of incredible, idiotic derivations of "fact" when all he has are some screenshots of a driver screwing up the render of a scene.

Re:whatever (1, Informative)

Anonymous Coward | more than 11 years ago | (#5963350)

The issue is that the driver problems don't occur when you run the benchmark normally - They had a special version of the benchmark that let them stop and fly around, which then revealed the graphical errors - of course, all of this is explained if you actually read the article.

Re:whatever (1)

Metaldsa (162825) | more than 11 years ago | (#5963361)

"Our own interpretation of these test results is that nVidia is improperly cutting corners to attempt to inflate its 3DMark2003 scores for the new GeForceFX 5900 Ultra. The company, on the other hand, again believes that the problems can be attributed to a driver bug. Let's explore exactly what we found, and you can draw your own conclusions."

They didn't say "proof"; they said it was their own interpretation. They didn't put out a news flash that says Nvidia is cheating its consumers. They said there might be a problem and they are investigating it.

Either way, it's good that they spotted this. If it was just a "bug", then fixing it will decrease the score. If the bug was on purpose, the fix will decrease the score. Either way the result is the same: the benchmarks before weren't accurate, and nVidia's newcomer didn't kick ATI's ass by so much.

Re:whatever (5, Informative)

Pulzar (81031) | more than 11 years ago | (#5963375)

Instead of only looking at the pictures, read the whole article before making decisions on whether it's a driver "fuckup" or an intentional optimization.

The short of it is that nVidia added hard-coded clipping of the scenes for everything that the benchmark doesn't show in its normal run, which gets exposed as soon as you move the camera away from its regular path.

It's a step in the direction of recording an mpeg on what the benchmark is supposed to show and then playing it back at 200 fps.
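What such a hard-coded clip would look like in principle can be sketched as a toy model. This is not actual driver code; the application name and plane values are invented for illustration:

```python
# Toy model of the alleged cheat: if the running app looks like a known
# benchmark, cull everything outside planes chosen in advance for its
# fixed camera path, instead of clipping against the real view frustum.

KNOWN_BENCHMARK_PLANES = {
    # app name -> (x_min, x_max) band the canned camera path ever sees
    "3dmark03.exe": (-10.0, 10.0),   # invented values
}

def visible_objects(app_name, objects, frustum_test):
    planes = KNOWN_BENCHMARK_PLANES.get(app_name)
    if planes is None:
        # Honest path: clip against the actual view frustum.
        return [o for o in objects if frustum_test(o)]
    lo, hi = planes
    # Cheating path: a cheap precomputed test. Fast, and correct only
    # while the camera stays on the benchmark's canned path.
    return [o for o in objects if lo <= o["x"] <= hi]

objects = [{"x": -50.0}, {"x": 0.0}, {"x": 50.0}]
in_frustum = lambda o: True  # free camera: everything can become visible

# A game gets all three objects; the "detected benchmark" silently drops two.
assert len(visible_objects("game.exe", objects, in_frustum)) == 3
assert len(visible_objects("3dmark03.exe", objects, in_frustum)) == 1
```

This is exactly why the developer version's free camera exposes the trick: objects outside the precomputed band simply vanish once you leave the canned path.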

Re:whatever (0)

Anonymous Coward | more than 11 years ago | (#5963378)

Well, in the article it talks about nVidia hard-coding the clip planes used in those tests into the drivers; in doing so they can reduce the workload on the card to inflate the score while keeping the visual quality the same.

The screenshots, on the other hand, are said to be taken from a developer's version of the benchmark which allows the camera to be moved from its fixed position, thus showing the rendering errors caused by having fixed clip planes; because of that it's suspected to be cheating.

IF you had read the article (0)

7-Vodka (195504) | more than 11 years ago | (#5963412)

You would have understood exactly why those photos were damning evidence.

But wtf am I thinking. This is /. Hundreds of idiots post without reading the article and many get modded up.

Re:whatever (5, Interesting)

GarfBond (565331) | more than 11 years ago | (#5963522)

Because these rendering errors only occur when you go off the timedemo camera track. If you were on the normal track (like you would be if you were just running the standard demo) you would not notice it. Go off the track and the card ceases to render properly. It's an optimization that is too specific and too coincidental for the excuse "driver bug" to work. It's not the first time nvidia has been seen to 'optimize' for 3dmark either (there was a driver set, a 42.xx or 43.xx, can't remember, where it didn't even render things like explosions and smoke in game test 1 for 3DM03)

keyword here is *appears* ... (1)

DataShark (25965) | more than 11 years ago | (#5963252)

Has anyone tried to get a comment from NVidia? ... Of all the guys on earth doing reviews of the latest FXs, only these ones found this ... seems spooky to me ... Let's hope that at least ATI is not involved ... FUD, you know: it also happens with Linux ...

Re:keyword here is *appears* ... (-1)

Anonymous Coward | more than 11 years ago | (#5963556)

Here's a novel idea: read the article, THEN post!

A large part of the article was based on the fact that ExtremeTech asked nVidia about it and was told it was a bug. ET then investigated since, suspiciously enough, it's a very beneficial bug ... if you happen to be nVidia, that is.

And you know what else? "All the guys on earth" might be reviewing the card, but they are pretty much only reviewing the cards because nVidia sends them cards. You could tell when the GeForceFX was announced and heavily reviewed that a lot of the review sites wanted to trash the card all to pieces, but only very few really said anything bad. Review sites that depend on the kindness of others (i.e., the manufacturer) for their review samples ... well, you do the math, bearing in mind that old axiom about not biting the hand that feeds you.

Yeah well... (3, Interesting)

IpsissimusMarr (672940) | more than 11 years ago | (#5963256)

Read this article NVIDIA's Back with NV35 - GeForceFX 5900 Ultra []

3Dmark03 may be inflated but what counts is real world game benching. And FX 5900 wins over ATI in all but Comanche 4.

Interesting ehh?

Re:Yeah well... (2, Insightful)

truenoir (604083) | more than 11 years ago | (#5963290)

Same deal with Tom's Hardware. They did some pretty extensive benchmarking and comparison, and the 5900 did very well in real world games (to include the preview DOOM III benchmark). I'm inclined to believe the driver problem nVidia claims. Especially since it's nVidia and not ATI, they'll likely fix it quickly (not wait 3 months until a new card comes out...not that I'm still bitter about my Rage Fury).

The reason (5, Funny)

S.I.O. (180787) | more than 11 years ago | (#5963257)

They just hired some ATI engineers.

Good, now they're even... (1)

gpinzone (531794) | more than 11 years ago | (#5963260)

Surprisingly, most people didn't flinch when ATI did it. (Remember the Quake.exe vs Quack.exe story?)

Re:Good, now they're even... (2, Insightful)

JDevers (83155) | more than 11 years ago | (#5963367)

Well, to tell you the truth... I LIKE application-specific optimization as long as it is general-purpose enough to be applied across the board to that application. In this case, however, the corners are cut in a benchmark and are targeted SPECIFICALLY at the scene as rendered in the benchmark. If ATI had done the same thing in Quake, the pre-recorded timedemos would be faster, but not actual gameplay... that wasn't the case; the game itself was rendered faster. The only poor choice they made was how they recognized that Quake was what was being run; optimizing a specific rendering path would have been more general-purpose and seemed a lot less like cheating.

This, on the other hand, if true, could be construed as NOTHING BUT cheating. Especially coming from a company that said they didn't support 3DMark 2003 because it was possible for companies to optimize their drivers specifically FOR such benchmarks... well, they proved their point.

Re:Good, now they're even... (0)

Anonymous Coward | more than 11 years ago | (#5963517)

I think that you're missing the fact that NVidia isn't part of FutureMark's beta program, so they didn't have access to the version of 3dmark that ExtremeTech ran. How do you optimize and cut corners for something you don't have?

Re:Good, now they're even... (1)

GarfBond (565331) | more than 11 years ago | (#5963475)

Uhhh, what are you talking about? When ATI did it EVERYONE ridiculed them for such a bug (it was a genuine driver bug; one driver release later the image quality AND expected performance returned). Not to mention when ATI did it, it was nvidia that was giving the information about it to the websites. No evidence of that in this instance (yet). People still bring it up whenever people talk about optimizations and cheating; even you just did.

As the mighty start to fall... (4, Interesting)

mahdi13 (660205) | more than 11 years ago | (#5963271)

nVidia has been one of the more customer-friendly video card makers... ever. They have full support for all platforms, from Windows to Macs to Linux; this makes them, to me, one of the best companies around.
So now they are falling into the power trap of "we need to be better and faster than the others", which is only going to have them end up like 3DFX in the end. Cutting corners is NOT the way to gain consumer support.

As I look at it, it doesn't matter whether you're the fastest or not... it's the wide variety of platform support that has made them the best. ATi does make better hardware, but their software (drivers) is terrible and not very well supported. If ATi would provide the support that nVidia has been giving for the last few years, I would start using ATi, hands down... It's the platform support that I require, not speed.

Re:As the mighty start to fall... (1)

EMN13 (11493) | more than 11 years ago | (#5963438)

As much as I agree with you, I don't think the article gives sufficient grounding for the accusation. First of all, driver optimizations that are specific to a certain type of 3D engine, or even a particular 3D engine, or even a particular application of that 3D engine, aren't per se a bad thing; it's certainly the case that nVidia and ATI take specific account of the Q3 and UT2003 engines in their drivers -- if those account for a large part of their usage, it would be insane not to. As such, a benchmark that isn't optimized for (also on the software side) isn't a very realistic reflection of reality. I believe there was a discussion on Tom's Hardware of the 3DMark2003 benchmark which wasn't very positive -- or rather an analysis of nVidia's accusations and Futuremark's responses. I personally did not find Futuremark very convincing in their own defense. So 3DMark2003 scores really don't interest me a bit. Give me real games any day.

A while back ATI had specifically "optimized" Quake3 at the expense of image quality. Obviously this isn't acceptable, and to a certain extent nVidia did the same -- on all cards their aniso quality was lower than the Radeons' before the Detonator FX. This part of the accusation, that the optimizations caused rendering problems, is more serious. However, the reasons given in the article are pure conjecture, and are just as likely to simply be bugs. Anyhow, it's not unbelievable that optimizations might cause bugs; that's merely bad engineering. The issue is whether nVidia realized it or not. If an optimization works 99% of the time, you've got to realize that you need to use another method for the remaining 1%, and that can be quite hard.

In conclusion: I find the accusation overly broad, and unfounded to boot.


Re:As the mighty start to fall... (0)

glam0006 (471393) | more than 11 years ago | (#5963489)

You don't need speed?

I believe there are some cross-platform Trident boards available on Ebay for $1...

to pay for their mistake (0)

Anonymous Coward | more than 11 years ago | (#5963277)

they should give me a new video card... how the FUCK am i going to play half life 2, doom 3, etc etc.

Ati pro 3d or something.. (1)

telax (653371) | more than 11 years ago | (#5963282)

If I recall right, my ATI with 4MB (or 2MB) of memory beat the nVidia TNT in speed tests... though most of the output was black screens :)

So? (0, Redundant)

Pig Hogger (10379) | more than 11 years ago | (#5963287)

Who doesn't???

Article talks about DEVELOPER version of 3DMark03 (2, Insightful)

Anonymous Coward | more than 11 years ago | (#5963289)

"Because nVidia is not currently a member of FutureMark's beta program, it does not have access to the developer version of 3DMark2003 that we used to uncover these issues."

Wow, some prerelease software is having issues with brand-new drivers? Who would have thought... Why not wait for the official release of the software and the drivers before jumping to hasty conclusions?

In addition, who really cares about 3DMark? Why not use the time that is wasted on the 3DMark benchmark for benchmarking real games? After all, 60fps tells a lot more about performance than 5784 3DMarks.

Re:Article talks about DEVELOPER version of 3DMark (0)

Anonymous Coward | more than 11 years ago | (#5963443)

They are a member of the beta program and therefore have access to the developer version of the benchmark. They are not using a beta of the benchmark (as I understand it), but a developer version. There's a difference. See: Beta. Developer. They're even spelled differently. Your second comment is addressed in the article, which I'm assuming that you read, right?

Re:Article talks about DEVELOPER version of 3DMark (3, Informative)

Pulzar (81031) | more than 11 years ago | (#5963470)

Please try reading the article in more detail.

The developer version is not a pre-release, it's the same version with some extra features that let you debug things, change scenes, etc.

As soon as you move the camera away from its usual benchmark path, you can see that nVidia hard-coded clipping of the benchmark scenes to make the card do less work than it would need to in a real game, where you don't know where the camera will be in advance.

As I mentioned in another post, it's a step in the direction of recording an mpeg of the benchmark and playing it at a high fps rate.
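The hard-coded clipping described above is easy to illustrate with a toy sketch (hypothetical numbers and names, not the actual driver): honest culling tests objects against wherever the camera actually is, while the alleged cheat replays a visibility list precomputed for the benchmark's fixed rail.

```python
def visible(camera_x, obj_x, half_width=10.0):
    # Toy 1-D "frustum": an object is visible if it lies within
    # half_width units of the camera.
    return abs(obj_x - camera_x) <= half_width

def honest_cull(objects, camera_x):
    # Correct approach: cull against the camera's actual position.
    return [o for o in objects if visible(camera_x, o)]

# Visibility precomputed for the benchmark's fixed rail (camera at x=0
# every frame). Only valid as long as the camera never leaves the rail.
RAIL_CAMERA_X = 0.0
OBJECTS = [-25.0, -5.0, 0.0, 7.0, 30.0]
PRECOMPUTED = honest_cull(OBJECTS, RAIL_CAMERA_X)

def cheating_cull(objects, camera_x):
    # The alleged shortcut: ignore where the camera really is.
    return PRECOMPUTED
```

Move the camera off the rail (say to x=20) and `cheating_cull` still returns the rail's visibility set: geometry that should be drawn is missing and vice versa, which is exactly the kind of garbled frame ExtremeTech captured.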

Intel (0, Redundant)

13Echo (209846) | more than 11 years ago | (#5963299)

Anyone remember when Intel did this a few years ago with motherboard chipsets? Programs like HD Tach got insane benchmarks with their chips.

Very old practice. (4, Interesting)

shippo (166521) | more than 11 years ago | (#5963307)

I recall that about 10 years ago one of the video adaptor manufacturers optimised their accelerated Windows 3.1 video drivers to give the best possible performance with the benchmark program Ziff-Davis used for their reviews.

One test involved continuously writing a text string in a particular font to the screen. This text string was encoded directly in the driver for speed. Similarly, one of the polygon-drawing routines was optimised for the particular polygons used in the benchmark.

Sigh... (2, Insightful)

Schezar (249629) | more than 11 years ago | (#5963310)

Back in the day, Voodoo cards were the fastest (non-pro) cards around when they first came out. A significant subset of users became Voodoo fanboys, which was ok, since Voodoo was the best.

Voodoo was beaten squarely by other, better video cards in short order. The fanboys kept buying Voodoo cards, and we all know what happened to them ;^)

GeForce cards appeared. They were the best. They have their fanboys. Radeon cards are slowly becoming the "other, better" cards now.


(I'm not sure what point I was trying to make. I'm not saying that nVidia will suck, or that Radeon cards are the best-o. The moral of this story is: fanboys suck, no matter their orientation.)

Re:Sigh... (1)

mahdi13 (660205) | more than 11 years ago | (#5963434)

3DFX slowly died when the GeForces came out, and then 3DFX was acquired by nVidia.
So, what you are saying is that now that nVidia is slowly dying, they will be acquired by ATi in the next couple of years?

I like that theory... hopefully it doesn't happen to nVidia, but it's a solid theory ;-)

"nvidia engineers are investigating these issues" (1)

Wolfier (94144) | more than 11 years ago | (#5963320)

Investigate what? How to make up excuses like "this is an unexpected irregularity of the driver"? This is ridiculous.

It's clearly a deliberate attempt. But it looks like NV's going to deny responsibility on this one.

Shame on them...

cheater cheater.... (-1)

Anonymous Coward | more than 11 years ago | (#5963322)

pumpkin eater!

Another reason to open-source drivers (4, Insightful)

BenjyD (316700) | more than 11 years ago | (#5963338)

The problem is that people are buying cards based on these silly synthetic benchmarks. When performance in one arbitrary set of tests is so important to sales, naturally you're going to see drivers tailored to improving performance in those tests.

Of course, if Nvidia's drivers were released under the GPL, none of the mud from this would stick as they could just point to the source code and say "look, no tricks". As it is, we just get a nasty combination of the murky world of benchmarks and the murky world of modern 3D graphics.

Re:Another reason to open-source drivers (1)

georgep77 (97111) | more than 11 years ago | (#5963377)

The REAL problem is that companies like DELL use these "silly synthetic benchmarks" for their buying decisions...


So true. (1)

Schezar (249629) | more than 11 years ago | (#5963486)

A geek can use a piece of hardware/software and tell you its strengths and weaknesses pretty easily. Outside of geekdom, marketoids and management types don't have that "magic touch," so they demand numbers that they can put into charts and graphs.

They don't want to hear "Card A is good at foo, but it overheats, and card B is good at bar, but slow at foo..." They want to hear "Card A is 125 foobars better than Card B."

Re:Another reason to open-source drivers (1)

Obiwan Kenobi (32807) | more than 11 years ago | (#5963492)

Of course, if Nvidia's drivers were released under the GPL, none of the mud from this would stick as they could just point to the source code and say "look, no tricks".

Forgive me, but that sounds like one very stupid idea.

Why would you want to expose your hard-earned work to the world? NVidia pays its programmers very well to think of wild and imaginative (out-o'-the-box) programming techniques to get the most from their hardware.

With rogue drivers out there thanks to open-sourcing the code, someone could inadvertently damage their card or render it (god, what a pun) utterly useless.

Not to mention that other video card companies (ATI, anyone?) would love to see how nVidia's rendering pipelines work, and how they might do the same. Remember that until recently Catalyst drivers only covered upper-echelon Radeon cards, and were certainly not "reference" (i.e., covering an entire product line).

Terrible idea, though you had good intentions with it.

Re:Another reason to open-source drivers (0)

Anonymous Coward | more than 11 years ago | (#5963538)

and say "look, no tricks".
Because if they open sourced the drivers right now and said, look, no tricks, it would end up like something out of a simpsons or family guy episode. I bet nvidias drivers are really just the dna of small monkeys encoded into binary so that they can jump around and throw their own feces at each other.

It might not be premeditated (1)

ArmorFiend (151674) | more than 11 years ago | (#5963356)

My suspicion is that the benchmark is giving (incorrect) culling info to the driver. ATI's driver ignores it, and nVidia's honors it.

They have all done it (1)

GauteL (29207) | more than 11 years ago | (#5963358)

Reviews should try to uncover it and find out who is doing it right now, which is the only thing that really matters when buying a product.

The whole Quake / Quack fiasco for ATI was enlightening, but does anyone know if ATI does this currently?

Frame rates are overrated anyway, since the people buying these cards buy new ones before their current ones drop to noticeable frame rates. Features, picture quality and noise are what matter.

ATI still seems to have the upper hand, and at least for ATI cards there are some free Linux drivers that can handle 3D acceleration.

Re:They have all done it (3, Interesting)

pecosdave (536896) | more than 11 years ago | (#5963427)

I bought the first 64MB DDR Radeon right after it came out. I held on to the card for months waiting for ATI to release a driver; it didn't happen. I heard of people having success getting 3D acceleration to work, but I could never duplicate that success.

Finally, after months of waiting, I traded my Radeon to my roommate for a GeForce 2 Pro with 64MB of DDR. It runs beautifully on Linux; I even play UT2K3 with it on an Athlon 850. After having the GeForce2 for about four months, I happened across a site that told me how to make 3D acceleration work on the Radeon. Too late now; I'm happy with my GeForce, and UT2K3 seems to only really want to work with nVidia anyway.

I don't think drivers are the best way to defend ATI, considering they tend to shrug off other OSes while nVidia has committed to supporting alternate OSes.

Statistics (1)

cwernli (18353) | more than 11 years ago | (#5963376)

Benchmarks are nothing else than statistics: In order to get to a (more or less) meaningful benchmark, you repeat the same process over and over, possibly in different environments. Then you analyze the results, resulting in a statistic of whatever you've benchmarked.

Therefore, the old Disraeli saying applies: "There are lies, damn lies, and statistics."

Or, to essentially say the same thing without expletives: Never trust a statistic you haven't faked yourself.

32 fps ... (-1, Redundant)

kruczkowski (160872) | more than 11 years ago | (#5963398)

Correct me if I'm wrong, but I read somewhere that the average eye can only see 32 fps.

Re:32 fps ... (1)

BenjyD (316700) | more than 11 years ago | (#5963468)

An interesting variation on the standard graphics-card article troll. Personally, I can see at 34.152 fps on a good day, but can only manage 28.693 when I'm tired.

Shouldn't it be.. (1)

1000101 (584896) | more than 11 years ago | (#5963404)

They are NFlating benchmarks? ;)

Not a big deal. (4, Informative)

grub (11606) | more than 11 years ago | (#5963416)

One has to take benchmarks with a grain of salt if they come from a party with a financial interest in the product. Win2K Server outperforms Linux, a Mac is 2x the speed of the fastest Wintel box, my daddy can beat up your daddy...

It's not surprising, but it is somewhat disappointing.

shame.. (1)

lengis (563392) | more than 11 years ago | (#5963447)

Ironic. This is the same thing ATI did a year or two ago. ATI used special Quake3 drivers that increased performance *only* in Q3.

It's a shame that companies resort to lying to consumers.

Much worse than ATI's cheating (1)

Freedom Bug (86180) | more than 11 years ago | (#5963450)

ATI was caught optimizing Quake3. In theory, this is a *good* thing. Quake3 is used by a lot of people, and was/is the engine for many of the games that people buy top end video cards for.

I'm sure nVidia does the same thing: new Detonator driver releases have been known to get amazing improvements for specific games.

ATI screwed up by affecting the visual quality. Well, screwing up visual quality would be acceptable if there was a documented setting to turn that particular optimization off, but there wasn't, so public chastisement followed.

In other words, it was an implementation problem. It sucks, but I write software for a living, and I can guarantee that every piece of software I have in the wild has at least one bug.

nVidia was caught optimizing benchmarks. No excuse. A public flaying is in order.


Re:Much worse than ATI's cheating (0)

Anonymous Coward | more than 11 years ago | (#5963498)

Huh? How is optimizing a benchmark worse than optimizing a real game?

Intel and many other CPU vendors create special compilers for SpecINT/FP tests. It's acceptable practice.

This is NOT standard practice. (3, Informative)

mr_luc (413048) | more than 11 years ago | (#5963467)

Targeting performance for benchmarks is one thing.

These drivers were written with specific limits built in that make the drivers COMPLETELY irrelevant to ordinary gaming, as ET demonstrates by moving the camera just a bit from the designated path.

This would be like chopping the top off of a car to make it lighter, reducing the distance it takes to decelerate in a brake test. Or compensating for a crappy time off the starting line by removing the back half of the car and bolting a couple of RATO rockets where the back seats used to be. Or loading the car up with nitro, or something. You think Car and Driver magazine wouldn't say something?

These drivers make the card completely unsuitable for ordinary gaming. They aren't 'more powerful' -- they are a completely altered version of the drivers that are ONLY good at improving one particular set of benchmarks.

Random Rail (1, Insightful)

Anonymous Coward | more than 11 years ago | (#5963477)

How difficult would it be to have a "random rail" generator? This would be fair for review purposes, just generate a "random rail" path for the specific review and run the benchmark with each card. This is essentially what they did to discover the "driver bug" anyway, so why not make that a 3D benchmark feature?
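Not difficult at all. A seeded random-rail generator is a few lines (hypothetical sketch; a real benchmark would interpolate the waypoints into a smooth spline rather than teleport the camera):

```python
import random

def random_rail(seed, n_frames, bounds=((-100.0, 100.0),) * 3):
    """Generate a per-review camera path as a list of (x, y, z) waypoints.

    Seeding makes the path reproducible (a reviewer can publish the seed so
    readers can verify the run), but driver writers can't know the path in
    advance, so per-path hard-coded clipping becomes useless.
    """
    rng = random.Random(seed)
    return [tuple(rng.uniform(lo, hi) for lo, hi in bounds)
            for _ in range(n_frames)]
```

Each card in a review gets benchmarked on the same seeded path, and a different seed next review keeps vendors honest.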

Dont trust the Doom III benchmarks (1, Interesting)

Anonymous Coward | more than 11 years ago | (#5963490)

It would be real funny, if true, as nVidia was slamming 3DMark for not being a real-world indicator of their NV30's performance.

Remember, those great Doom III numbers were obtained on machines that nVidia supplied to reviewers. These numbers should also be suspect. If this is true, they had to know it would not look good. If nVidia did cheat like this, it can only mean the 5900 DOES NOT BEAT the ATI card. Desperate times indeed at nVidia.

Editorial on this issue (2, Informative)

Anonymous Coward | more than 11 years ago | (#5963512) [] has a very well-thought-out editorial on this issue titled "Trust is Earned". It is well worth the read.

Re:Editorial on this issue (0)

Anonymous Coward | more than 11 years ago | (#5963551)

Oops. Even though it showed up in the preview, the direct link to the editorial was lost when posted.

Here it is []

I've seen reviews of the 5200 with bad images (1)

stratjakt (596332) | more than 11 years ago | (#5963549)

Apparently the first run of drivers had major bugs that would screw up the images, deleting objects and textures. I surmise it's a problem with the new z-buffer whizbang.

It's funny that the site I read this 'review' on was very forgiving about these problems, yet its forums were full of rants about 'terrible ATI drivers'. Of course, 99% of those rants have to be user error, as I've never had a serious driver issue with any of the ATI cards I've used. They probably don't follow the installation instructions right.

Anyways, yeah. nVidia invents 'benchmarks' to make themselves look faster. So does AMD (the 2800+ ratings are based on an old T-bird core running at 2800+, not a comparison to Intel). It's not so much fraudulent, since benchmarks really don't mean shit in the first place.

But fanboys swallow it up as gospel.

Anyhow, does anyone have any firsthand experience with the 5200 and/or 5200 Ultra? It seems like a worthy upgrade for $100, but so far I haven't seen any really objective reviews of the card.

Enough of that... (2, Interesting)

Dwedit (232252) | more than 11 years ago | (#5963555)

Show me the Quack 3 Arena benchmarks! Then we'll decide which card is the best!

NVidia sucks XFree86-wise; after all, ATI works (0)

Anonymous Coward | more than 11 years ago | (#5963560)

It's not enough that NVidia distributes binary-only drivers; they also refuse to release the hardware API.

With ATI you get the source, which means you won't get stuck with no way of getting drivers for a new kernel whose binary kernel-module API has changed.

With NVidia, your video card's upgrade future is held hostage.
