Futuremark Replies to Nvidia's Claims

CmdrTaco posted more than 11 years ago | from the what-exactly-are-kid-gloves dept.

Graphics

Nathan writes "Tero Sarkkinen, Executive Vice President of Sales and Marketing at Futuremark, has commented on the claims by Nvidia that 3DMark2003 intentionally puts the GeForce FX in a bad light, after Nvidia declined to become a member of Futuremark's beta program. This issue looks like it will get worse before it gets better." ATI also seems to be guilty of tweaking their drivers to recognize 3DMark.

317 comments

FP (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6048852)

Microsoft Rules you cockjockeys.
Get a job, you know, money is good.

Re:FP (-1, Offtopic)

Jayanef (317674) | more than 11 years ago | (#6048868)

Get a life
Microsoft doesn't rule me

Re:FP (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#6048985)

Why don't you go back to your basement and continue downloading Linus porn while working on a shitty version of a project that Microsoft did 10 years ago.

Slashdot is in trouble? (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6049052)

Look at how many stories are making the front page in a given day... it's ridiculous. The content in most is too weak to be front-page fodder, but the editors are making a common amateur's mistake, thinking that dynamism through quantity will attract eyes more often, thereby providing additional views of the banner ads.

The reality is that such tactics water down the product and drive potential subscriptions and future ad-views away.

F970P (-1)

Guns n' Roses Troll (207208) | more than 11 years ago | (#6048858)

First PPC970 post. All ye Apple faithful shall pray to St Jobs that he deliverith unto them a wicked machine, fast enough to smite the heathen PC users from the face of Creation!

I see that you're unemployed again. Pih-T (-1, Troll)

Anonymous Coward | more than 11 years ago | (#6048934)

Apple's InkWell: Not Just OCR!

Recently, Microsoft announced "Digital Ink," a handwriting-recognition technology that many compare to Apple's InkWell, both respectively set to debut in the next major revisions of Windows and Mac OS X. As whenever similar technologies pop up at Microsoft, Apple Mac zealots ask a few questions: Was Microsoft's Digital Ink developed in-house at Microsoft? Was it bought from a third-party? Grabbed from a sub-licensor?

The answer is that Digital Ink came directly from Apple. But the story behind how Microsoft was able to so simply buy InkWell and rename it for use in Windows is a tale of moral depravity and sordid carnal desperation that few are privy to-- until today! Read on to discover how Microsoft came to own yet another key Apple technology in the most sordid of political maneuverings.

It all began in the late 1970s. Steve Jobs, after a night of smoking marijuana and tripping on "acid" (lysergic acid diethylamide), conceived of a way to interact with computers using only the mind. Well-known at Stanford for his telekinetic abilities, such as making entire fields of grass sway with but a thought, Steve wanted to move the "mouse" and "menu" (bizarre, alien concepts to anyone outside of his clique of 2600 hackers and EE alcoholics) with nothing but the power of his mind. Of course his compatriots, peaceful, bearded Steve Wozniak and the illegally immigrated Avie Tevanian, dismissed the idea as yet another episode of harmless drug-induced rambling.

In 2002, 26 years after his messianic user interface vision, Steve Jobs was hard at work in the deepest part of Apple's labs, personally overseeing secret user interface experimentation. It turns out that Steve had never forgotten about his psychedelic user-interface dream and was tirelessly attempting to realize it 30 miles beneath Cupertino, CA. Down here, in his "dungeon," the attempts to connect silicon to carbon were in full force and without regard to their subjects.

Some men had industrial-grade alligator clamps attached to their nipples and testicles which were randomly jolted with millions of volts of electricity in order to stimulate their brains. Other men had deadly mixtures of cocaine and heroin ("eight-balls") injected into their penises while being forced to watch gay porn. Another group endured horrible procedures in which their own arms, legs, and scrotums were replaced with those of gorillas, chimpanzees, and orangutans. One smaller group were forced to smoke opium eight hours a day alternately being whipped and beaten until they managed to move the cursor a pixel or two with just their poppy-addled minds. The most successes, however, had come from Steve's own bizarre device dubbed "handJobs."

handJobs was a series of wires and electro-sensitive pads placed on the fingertips that allowed one to manipulate elements of the Mac OS GUI with simple motions. Steve Jobs, being telekinetic from years of tripping acid, wielded it more powerfully than anyone else in his R&D dungeon. In fact, so powerful was his mind that he like to hook the wires and pads up to his own penis and controlled his Power Mac by means of pelvic thrusts and lude gyrations of his hairy penis and scrotum.

Bill Gates, on a visit to the Apple Campus, accidentally stumbled onto handJobs in a moment that would change UI in computing forever. Feeling that he simply owned the Apple Campus as he did the rest of the world, Mr. Gates walked into Steve Jobs's private office without knocking. Steve was in the middle of "making love" to thin air, pants in a puddle at his ankles, hands on hips, thrusting his engorged member at the monitor! He had decided to take his latest revision of the device to his office to test out when Mr. Gates had walked in on him! Gates knew what he liked and liked what he saw, and began immediately bargaining with Jobs.

By the end of the day, Jobs had created a new technology agreement with Gates. Apple would begin partnering with Microsoft on alternative input technologies, and by late June MS would announce "Digital Ink" for Windows. In reality Digital Ink was a front, and both it and InkWell for Mac OS were place-holders for what handJobs would eventually become. Until handJobs was ready, however, the masses would be fed OCR capabilities from the operating system. Before the ink on the contract was signed, however, Jobs had finagled Gates into receiving a "technology preview" of handJobs, with Jobs attempting to control Gate's breathing with nothing but his leathery scrotal sack, using Gates's chin as a "touch pad." Now you know the immoral, homo-erotic history behind InkWell, Digital Ink, and the next generation of OCR and handwriting-recogntion. I hope that Apple Macintosh zealots everywhere think about this before they blindly evangelize their operating system of choice, inadvertently infecting the minds of the masses with years of sweating gay R&D and "bleeding edge" (of anus) techno-faggotry.

I don't care how fast it is... (4, Insightful)

Anonymous Coward | more than 11 years ago | (#6048860)

It won't be fast enough next year.

Re:I don't care how fast it is... (0)

Anonymous Coward | more than 11 years ago | (#6048897)

Anybody remember how excited you got when you first got your hands on a 286 machine?

Re:I don't care how fast it is... (0)

Jayanef (317674) | more than 11 years ago | (#6048945)

yeah...
I cannot play digger!

troll tuesday (-1, Troll)

Anonymous Coward | more than 11 years ago | (#6048869)

Definitely, it's been a downer today. Come on!

cg6 * (-1, Offtopic)

norculf (146473) | more than 11 years ago | (#6048871)

bleh

nVidia vs. ATI (4, Funny)

wowbagger (69688) | more than 11 years ago | (#6048893)

nVidia: "Well, for this program they will never step off the rail, so we can fake it so it looks good from the rail only."

ATI: "Well, this shader program isn't optimally coded - here is a more optimally coded shader that does the exact same thing but more quickly."

nVidia: "Well, you caught us, but we have to cheat because you have it in for us!"

ATI: "Well, you caught us, and although we were doing the exact same thing (only faster), we will remove that code ASAP."

Re:nVidia vs. ATI (5, Funny)

KillerHamster (645942) | more than 11 years ago | (#6048930)

SCO: Hey, we have a patent on cheating, pay us money!

Re:nVidia vs. ATI (2, Funny)

Anonymous Coward | more than 11 years ago | (#6049184)

"SCO: Hey, we have a patent on cheating, pay us money!"

Wouldn't that be:

SCO: Hey, we have a patent on SOMETHING you did, pay us money!

Re:nVidia vs. ATI (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6048961)

this is troll tuesday. take your lame-ass posts elsewhere.

Re:nVidia vs. ATI (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6049022)

Just because it's not funny to you doesn't mean it's a troll.

Re:nVidia vs. ATI (5, Interesting)

asdkrht (655186) | more than 11 years ago | (#6049017)

From what I've heard, Nvidia totally replaced the shader programs with ones that they wrote. All ATI did was reorder some of the instructions in the shaders to "optimize" them. The optimized and the original shader programs were functionally equivalent -- sort of what happens when a compiler optimizes code. The same can't be said for what Nvidia did.
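
[Editor's note: a minimal sketch of that kind of reordering, in plain C purely for illustration (real drivers reschedule GPU shader instructions, not C source): interleaving independent operations hides pipeline latency while the computed result stays identical.]

    #include <stdio.h>

    /* Naive ordering: each add consumes a value produced on the
       previous line, so a pipelined unit stalls waiting for it. */
    float shade_naive(float a, float b, float c, float d)
    {
        float t0 = a * b;
        float r0 = t0 + c;   /* must wait for t0 */
        float t1 = a * d;
        float r1 = t1 + b;   /* must wait for t1 */
        return r0 + r1;
    }

    /* Reordered: both multiplies issue first, so each add's input is
       ready when it executes. Mathematically the same computation. */
    float shade_reordered(float a, float b, float c, float d)
    {
        float t0 = a * b;
        float t1 = a * d;    /* independent work fills the latency gap */
        float r0 = t0 + c;
        float r1 = t1 + b;
        return r0 + r1;
    }

    int main(void)
    {
        /* Both versions print the same number: 11.000000 */
        printf("%f %f\n", shade_naive(1, 2, 3, 4),
                          shade_reordered(1, 2, 3, 4));
        return 0;
    }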

Re:nVidia vs. ATI (5, Insightful)

stratjakt (596332) | more than 11 years ago | (#6049284)

If it were a generic optimization (and it probably should have been), there'd be no issue.

ATI recognized the 3DMark executable and special-cased for it, which is misleading and wrong: the performance is enhanced for 3DMark and 3DMark alone.
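
[Editor's note: a purely hypothetical sketch of such special-casing; the file check and function names below are invented for illustration and come from no actual ATI or NVIDIA driver. It also shows why renaming the benchmark's executable makes the "optimization" (and the extra score) disappear.]

    #include <stdio.h>
    #include <string.h>

    /* Invented check: match on the benchmark's executable name. */
    static int is_3dmark(const char *exe_path)
    {
        return strstr(exe_path, "3DMark03.exe") != NULL;
    }

    static void select_shader_path(const char *exe_path)
    {
        if (is_3dmark(exe_path))
            puts("hand-tuned shader replacements");  /* benchmark only */
        else
            puts("generic compiler path");           /* every real game */
    }

    int main(void)
    {
        select_shader_path("C:\\apps\\3DMark03.exe"); /* special-cased */
        select_shader_path("C:\\apps\\renamed.exe");  /* normal path   */
        return 0;
    }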

Fine Trolling My Friend (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6049023)

Moderation +1
40% Troll
30% Insightful
30% Interesting
Extra 'Troll' Modifier 0 (Edit)
Karma-Bonus Modifier +1 (Edit)
Total Score: 3

Mod Point Eating Powerhouse
All Hail Troll Tuesday!

*cough* QUACK 3 ISSUE! (0, Informative)

Anonymous Coward | more than 11 years ago | (#6049229)

Quake 3 vs Quack 3 [gamers.com] Go troll elsewhere. They're both guilty from time to time of doing the same bullshit.

Re:nVidia vs. ATI (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6049349)

1. Collect underpants
2. ???
3. Profit!

Hey, on the upside (5, Funny)

Frederique Coq-Bloqu (628621) | more than 11 years ago | (#6048895)

3DMark will look totally sweet because it's *optimised* for both cards.

Re:Hey, on the upside (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#6048981)

actually, the upside is that you are a lamer.

Re:Hey, on the upside (1)

Fembot (442827) | more than 11 years ago | (#6049362)

When will they start optimising for glxgears, damn it?

The real reason this is important. (5, Interesting)

Cannelbrae (157237) | more than 11 years ago | (#6048906)

It's about the OEMs as much as or more than the consumer market. They watch the benchmarks closely -- and make decisions based on results.

This is where the money really is, and what is worth fighting for.

Re:The real reason this is important. (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6049005)

do you practice being lame, or are you just a natural?

Re:The real reason this is important. (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#6049083)

and if your nVidia card had a dick, you'd suck it.

You have to expect it (3, Informative)

leeroybrown (624767) | more than 11 years ago | (#6048908)

I suppose you have to expect some poor practices, considering that the top 3DMark card will be considered by many gamers to be the best to buy. It's a massive temptation for such a big industry. I find ATI's decision to remove code which it claims boosts overall performance quite funny.

Re:You have to expect it (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#6049040)

You know what I find funny? The fact that you're such a lamer.


Oops, I meant "pathetic", not "funny". My bad.

ATi wasn't so bad (5, Funny)

Loie (603717) | more than 11 years ago | (#6048913)

ATi's tweak yields a 1.9% gain by rearranging the instructions 3DMark issues to its hardware. Anyone familiar with assembly language knows that properly arranging your instructions prevents stalls; the end result, however, is exactly as intended. It sounds to me like this is what ATi did. nVidia, on the other hand... 40% gains with very obvious visual errors is... well, wrong.

Re:ATi wasn't so bad (2, Insightful)

homer_ca (144738) | more than 11 years ago | (#6049133)

In one sense ATI's tweak is not as bad because they're still rendering the scene with full image quality, where NVIDIA is rendering with reduced quality. However, it's still deceptive because it's optimizing for the special case of a benchmark, and real games (or a renamed 3dmark executable) will run slower.

Re:ATi wasn't so bad (1)

Ashran (107876) | more than 11 years ago | (#6049266)

No, renaming the executable won't help, as they use a different method of detecting 3DMark.
I guess they learned a bit from ATI and their quack.exe debacle... not enough, though. :/

Re:ATi wasn't so bad (1)

MindStalker (22827) | more than 11 years ago | (#6049298)

and real games (or a renamed 3dmark executable) will run slower

Not entirely true, as ATI and nVidia both work closely with big-name game studios to make sure that optimizations such as these are in the game. Obviously the benchmarks didn't use the optimizations they asked for, so they took it into their own hands. Sneaky, yes, but it is reflective of performance in real games (at least big-name ones -snicker-)

why tweak for the better? (1)

_newwave_ (265061) | more than 11 years ago | (#6048918)

if nvidia wants to really screw 3dmark, why don't they just make their benchmarker stop working with their chips altogether...

Re:why tweak for the better? (1)

gerf (532474) | more than 11 years ago | (#6048947)

Because they're not trying to 'screw 3DMark' but rather ATI. They want higher scores, as does ATI. Thus, they make their drivers a little different, to take advantage of the program. It's not that hard to understand, really: making people think your performance is better than it really is. In reality, would you even notice the few extra percentage points they claim? Yeah, me neither.

Slashdot replies to Goatse's claims (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6048919)

Suck my nuts!

If it weren't for standards ...... (2, Interesting)

oliverthered (187439) | more than 11 years ago | (#6048920)

"3DMark03 was developed strictly according to DirectX9 standard in very close cooperation with Microsoft and other BETA members. If hardware performs well 3DMark03, it performs well in all applications that use DirectX 9. Note that since 3DMark is designed to be an objective evaluation tool, it does _not_ include manufacturer-specific optimizations. This is why it is exceptionally well suitable for objective performance measurement. "

Does this guy [slashdot.org] work for NVidia?

Actually it's a pretty poor DX9 benchmark. (4, Informative)

aliens (90441) | more than 11 years ago | (#6049014)

Hardocp [hardocp.com]

They do a good job of dissecting the benchmark, and I'd have to agree that as a DX9 benchmark it fails.

Whatever, it's still just a synthetic mark and nothing more.

Re:Actually it's a pretty poor DX9 benchmark. (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6049096)

The parent post only received a 2? Come on mods, this is very informative. The link proves 3DMark03 is a piece of shit in the first place.

Re:If it weren't for standards ...... (4, Informative)

Anonymous Coward | more than 11 years ago | (#6049066)

This quote is misleading. "DirectX9" alone means nothing.

We need to look at the new shader features offered by DirectX9; these are:
- Pixel and Vertex shaders 2.0 (supported by ATI R3xx line and GeForceFX)
- extended Pixel and Vertex shaders 2.0 (supported only by GeForceFX)
- Pixel and Vertex shaders 3.0 (no support until R400/NV40)

Now let's look at the features which are used by 3DMark03:
- Game 1: no shaders at all, only static T&L
- Game 2: vertex shader 1.1 and pixel shader 1.4 (which isn't natively supported by NVIDIA cards)
- Game 3: vertex shader 1.1 and pixel shader 1.4 (which isn't natively supported by NVIDIA cards)
- Game 4: vertex shader 2.0 and pixel shader 1.4+2.0

This means that:
- DirectX9 offers three different new shader models.
- Three of the four 3DMark03 demos don't use the new DirectX9 shaders at all.
- Three of the four 3DMark03 demos use Pixel Shader 1.4, which was introduced with DirectX8.1 and isn't natively supported by NVIDIA cards.
- Only one set of new DirectX9 shaders is partially used, in one 3DMark03 demo.

Thus 3DMark03 shouldn't be called a "DirectX9" benchmark. The quote "If hardware performs well 3DMark03, it performs well in all applications that use DirectX 9" should be changed to: "If hardware performs well 3DMark03, it performs well in all applications that use Pixel Shader 1.4."
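
[Editor's note: for anyone who wants to check what their own card reports, here is an untested sketch (assuming the DirectX 9 SDK headers and d3d9.lib, Win32 only) that queries the shader versions the runtime exposes -- the same PS 1.4 vs. PS 2.0 distinction drawn above.]

    #include <stdio.h>
    #include <d3d9.h>

    int main(void)
    {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        D3DCAPS9 caps;

        if (!d3d)
            return 1;
        IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                 D3DDEVTYPE_HAL, &caps);

        /* Version DWORDs pack major/minor; compare against the macros. */
        printf("PS 1.4: %s\n",
               caps.PixelShaderVersion >= D3DPS_VERSION(1, 4) ? "yes" : "no");
        printf("PS 2.0: %s\n",
               caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) ? "yes" : "no");
        printf("VS 2.0: %s\n",
               caps.VertexShaderVersion >= D3DVS_VERSION(2, 0) ? "yes" : "no");

        IDirect3D9_Release(d3d);
        return 0;
    }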

Re:If it weren't for standards ...... (5, Insightful)

sjelkjd (541324) | more than 11 years ago | (#6049122)

3dMark isn't a standard. It's a business that makes money by charging hardware developers LOTS of money to be included in its "BETA" program. In real life(TM), manufacturer-specific optimizations matter. Many games will look better and run faster if they use vendor-specific OpenGL extensions, for instance. A gamer looking to buy the fastest card to run his favorite game should look for benchmarks on that game. FutureMark is trying to make a business out of predicting the behavior of games that aren't out. Well, either the game you want to play is out or it isn't. If it's out, buy your card based on benchmarks for it. If it's not, wait until it's out before you spend your money. There is no guarantee that 3dMark is a good predictor of DirectX 9 performance.

open source (5, Insightful)

phre4k (627961) | more than 11 years ago | (#6048923)

People keep begging nvidia to release their drivers under an open license. Well, I guess we now know why they don't. /Esben

Tero Sarkkinen (-1, Offtopic)

ih8apple (607271) | more than 11 years ago | (#6048931)

If anyone is interested in publicly communicating with Tero Sarkkinen, he has been known to monitor this forum [beyond3d.com] . (He's also posted there...)

Evaluating the evaluation. (5, Insightful)

Poofat (675020) | more than 11 years ago | (#6048933)

This is why you need many forms of evaluation to properly test something. Just running one program to show you pretty pictures is not going to give any meaningful result. You need to stress test the card in other ways.

And, since one of the main reasons people buy these cards is to play flashy and pretty games, ignoring the performance in those games is ridiculous.

Same old story.... (0, Troll)

zoobaby (583075) | more than 11 years ago | (#6048942)

As long as there are benchmarks, companies will write drivers to get the best score. I would say about 95% of consumers buying high-end cards purchase based on benchmark scores. Of course you would write a driver that gives the best score to increase your sales. nVidia is just very good, and way better than ATI, at writing drivers that exploit many of the benchmarks now in use.

Is it a bad thing? Not from nVidia's view, and ATI is jealous that their code monkeys are falling behind (or are understaffed).

Does it make the benchmark invalid? Yes. But it does not matter, since 3DMARK has become a "standard." Both card companies will try to exploit the benchies to get the best score and take the "performance crown" and the sales that come with it.

Re:Same old story.... (5, Insightful)

stratjakt (596332) | more than 11 years ago | (#6049148)

All this does is make 3DMARK look worthless as a benchmarking app. All it has now is some value as a pretty looping demo or stress-testing application. I run it to make sure the card works and the drivers are installed properly (as in, it runs all tests) and that's it. The little number it spits up at the end is worthless.

I don't even bother with 3DMark scores when I read reviews; I skip straight to the tested games and get a look at the FPS at various levels of detail.

Then it's easy to realize that card A gives 201 FPS, card B gives 199 FPS, and the answer is: buy whichever is cheaper.

This gives me much more useful information that relates to what I want the card for -- playing games.

Re:Same old story.... (0)

Anonymous Coward | more than 11 years ago | (#6049208)

Yeah, you in particular don't bother with 3DMark scores because you realize they're meaningless, and so would many technical people. But to the average buyer, these "awards" and scores and bar graphs on the back of the box showing card A beating card B in 3DMark are a marketing tool, and they sell cards.

3DMark may be failing as a technical tool, but it's succeeding in the selling department, and that's what matters to the card manufacturers.

State of nvidia development team (5, Interesting)

cmburns69 (169686) | more than 11 years ago | (#6048946)

Back when nvidia acquired 3dfx, they began to merge the development teams. The FX is the first card by nvidia to be developed by engineers from both the nvidia and 3dfx groups.

Of course it will work better when you do it their way; that was 3dfx's strength in the beginning, and its downfall in the end.

But I believe that their current development team has yet to hit its stride, and future offerings will see the trophy going back to nvidia... ... But who knows! I'm no fortune teller ...

Biased Reporting (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6048979)

Why has NVidia become the whipping boy around here lately?

Slashdot Headlines:

Monday: NVIDIA CHEATS!
Tuesday: NVIDIA IS THE DEVIL!
Wednesday: HOLY FUCK IS NVIDIA SCUM
Thursday: NVIDIA CARDS WILL MAKE YOU DEAF AND BLIND
Friday: NVIDIA RAPES YOUR CHILDREN!!
Saturday: NVIDIA IS PRICE GOUGING!
Sunday: NVIDIA STILL CHEATS (oh yeah & maybe ati)

Re:Biased Reporting (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6049091)

Biased reporting?
You must be new here...

driver tweaking (5, Informative)

erikdotla (609033) | more than 11 years ago | (#6048987)

ATI also is crafty at tweaking their drivers to suck. They should be working on decent drivers instead of cheating on stupid benchmarks to get +1.9%.

I bought a Radeon 9700 Pro. The driver issues almost make it not worth the increased FPS over my Ti4400.

The refresh rate problem in XP is annoying as hell. ATI handles it even worse than NVidia, where you set your "maximum" refresh rate and your "highest" resolution, and it assumes that all lower resolutions can only handle that refresh rate.

There's no way to tell your ATI card, "My monitor can do 1280x1024 @ 85hz, but 1600x1200 @ only 75hz." You either get 75hz in everything if you want 1600x1200, or you get 85hz up to 1280x1024, and have to avoid 1600x1200 use lest ye risk getting "frequency over range".

NV handles it better with the driver, allowing you to set maximum refresh rates for every resolution individually.

These refresh rate tweaking programs don't help either, since half the games out there choke when you use them. PC 3d is in a bit of a sorry state right now, and I'm tired of it.
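
[Editor's note: for what it's worth, the Win32 mode-switch API does accept an explicit resolution/refresh pair per request, so a game using it directly could ask for exactly the combination the monitor supports. An untested sketch with standard windows.h calls, error handling trimmed:]

    #include <stdio.h>
    #include <string.h>
    #include <windows.h>

    static void request_mode(DWORD w, DWORD h, DWORD hz)
    {
        DEVMODE dm;
        memset(&dm, 0, sizeof dm);
        dm.dmSize             = sizeof dm;
        dm.dmPelsWidth        = w;
        dm.dmPelsHeight       = h;
        dm.dmDisplayFrequency = hz;
        dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

        /* CDS_TEST asks the driver whether the combination is legal
           without actually switching modes. */
        LONG r = ChangeDisplaySettings(&dm, CDS_TEST);
        printf("%lux%lu@%luHz: %s\n", w, h, hz,
               r == DISP_CHANGE_SUCCESSFUL ? "accepted" : "rejected");
    }

    int main(void)
    {
        request_mode(1280, 1024, 85);  /* the pairs the parent wants */
        request_mode(1600, 1200, 75);
        return 0;
    }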

Re:driver tweaking (3, Informative)

JDevers (83155) | more than 11 years ago | (#6049097)

You forget that both of those paths are workarounds for a problem with the operating system. If you want to complain about the problem, complain about Microsoft.

I have a 9700 and don't have ANY driver problems; what sort of issues are you having?

Re:driver tweaking (0)

Anonymous Coward | more than 11 years ago | (#6049147)

This is what DDC is for: the video card asks the monitor what resolutions it supports.

Re:driver tweaking (4, Informative)

UberLord (631313) | more than 11 years ago | (#6049166)

Try using Refresh Force [pagehosting.co.uk], which directly alters the Windows monitor information in the registry and removes every mode but the one you specify for each resolution. This allows games to run at any screen size at the requested refresh rate without choking or crashing. It's worked fine on all my cards so far (GeForces, Radeons, Kyros).

Besides, this is more of a Windows quirk than a driver thing, as MS requires the driver to behave like this to pass its WHQL tests.

No problems here...PEBKAM (1)

Genjurosan (601032) | more than 11 years ago | (#6049193)

Problem Exists Between Keyboard and Monitor *grin*

My 9700 Pro works without a hitch. Not a single problem.

Re:No problems here...PEBKAM (1)

Doom Ihl' Varia (315338) | more than 11 years ago | (#6049398)

You mean my pens, my sunglasses, and my wallet can cause problems? Will it help if I move them out from between my monitor and keyboard?

Re:driver tweaking (0)

Anonymous Coward | more than 11 years ago | (#6049243)

That is odd. I have an older Radeon card and I can set the refresh rate separately from the resolution. Is there anybody who can support this guy's claims?

What a mess! (5, Informative)

georgep77 (97111) | more than 11 years ago | (#6048996)

This whole episode has turned into a big mess, and NVDA seems to be the bad guy in all of it. Their DX9 product was delayed, and their existing products were only DX 8.0*. The benchmark heavily favours DX9 parts, and NVDA's existing lineup was/is getting smoked in the benchmark by its main (only) competitor. So they decided to go on the offensive and try to kill off this benchmark. The 30-person company that produces 3DMark has stood its ground against the multi-billion-dollar NVDA, which, instead of admitting that its pixel shader is quite slow when running against the 2.0 spec, tries to deceive and FUD its way out of it. Looks like they got more than just some patents when they purchased 3DFX... Now they have painted themselves into a corner, and how this will turn out is anyone's guess.

*DX8.1 has PS 1.4, which is actually much closer (AFAIK) to PS 2.0 than PS 1.3 (DX8) is.

Preposterous! (5, Funny)

LegendOfLink (574790) | more than 11 years ago | (#6048998)

What!? Two giant corporations actually doing something MS-like to make themselves more appealing?! That's unheard of! Why, one might think this is a ploy to increase marketshare! Corporations are our friends, they would never manipulate the people. Damn the man!

Re:Preposterous! (0, Troll)

NanoGator (522640) | more than 11 years ago | (#6049190)

"What!? Two giant corporations actually doing something MS-like to make themselves more appealing?!"

I know! Let's buy a bunch of video cards that both ATI and NVidia make that they take a loss on, then try to circumvent their protection mechanisms so we can install Linux on them. Won't it piss them off that they lost money AND they got Linux installed on it instead of using it as a graphics processor!

Re:Preposterous! (0)

ovit (246181) | more than 11 years ago | (#6049291)

Notice how we are here, getting angry at NVDA for this. They will pay a cost for this. This is how the "man's" system works. You may screw the people, but they will eventually get the last laugh.

Once Again A Call For Open Source Benchmarks (4, Insightful)

EXTomar (78739) | more than 11 years ago | (#6049000)

The problem isn't that benchmarks lie. We all know they do. The problem is we don't know how they lie. Creating open source benchmark applications would show exactly how the driver is exercised, so anyone who wants to could learn where cards and drivers are strong and weak. Everyone is on the level if everyone can look at the code that came up with the numbers. Not to mention there are things to learn from benchmark code that exercises the fringe elements of graphics cards and drivers.

The alternative is what we have now: hand-waving voodoo. Not only do we have to take the vendor's word that they aren't monkeying with the driver to match the execution of the benchmark, but now we have to question the allegiance of the benchmark makers too.

Re:Once Again A Call For Open Source Benchmarks (5, Insightful)

UberLord (631313) | more than 11 years ago | (#6049129)

Open source benchmarks are a bad idea because closed source drivers can still be used, and those may or may not contain these "cheats".

Better to have open source drivers, so we can inspect the driver for cheating/optimisations.

In fact, open source them all if the hard numbers are that important!

Re:Once Again A Call For Open Source Benchmarks (0)

Anonymous Coward | more than 11 years ago | (#6049161)

Someone correct me if I'm wrong, but wasn't the Dhrystone system benchmark an open benchmark? At least in the sense that you could get your hands on the source code of the benchmark program.

That didn't stop people from cheating on it; in fact, I seem to remember that companies used to write compilers that would optimise the Dhrystone code solely for the purpose of producing better benchmark scores.

Open source is not a solution for everything, and I can't see how it would improve anything here.
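
[Editor's note: a sketch of the classic trick, assuming nothing about any specific compiler: if an optimizer can see that a benchmark loop's result is never used, it may legally delete the loop entirely, and the "score" then measures nothing.]

    #include <stdio.h>
    #include <time.h>

    #define N 100000000u

    int main(void)
    {
        clock_t t0 = clock();
        volatile unsigned keep;  /* volatile forces the result to be kept */
        unsigned sum = 0;

        /* Without the 'keep = sum' below, a compiler may prove this
           loop has no observable effect and remove it entirely. */
        for (unsigned i = 0; i < N; i++)
            sum += i * 3u;
        keep = sum;

        printf("elapsed: %.2fs\n",
               (double)(clock() - t0) / CLOCKS_PER_SEC);
        return 0;
    }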

nvidia - ati cheat difference (5, Informative)

jtilak (596402) | more than 11 years ago | (#6049010)

AFAIK, ATI displays the graphics on screen properly; the drivers are just optimized for the benchmark. One could still consider this cheating. NVIDIA, however, does not display the graphics properly -- it really does cut corners (literally) to get higher scores. ATI got an extra 3% from cheating. NVIDIA got a whopping 24% higher scores from cheating! Take a look at the ExtremeTech screenshots:

http://www.extremetech.com/print_article/0,3998,a=41574,00.asp

Get over it....just look at it how YOU will use it (5, Interesting)

rimcrazy (146022) | more than 11 years ago | (#6049021)

I've worked in the PC industry more years than I care to think about. All graphics card vendors tweak their drivers and BIOS to make their cards look better. If people didn't put so much emphasis on benchmarks for buying decisions, there would not be much reason to tweak things, but the reality of the world is that they do.

On a side note, my team and I, many, many years ago, designed what was at the time one of the fastest chip sets for the blinding new 12 MHz 386 PC. We had discovered that the Norton SI program that everyone was using to benchmark PCs based most of its performance score on a small 64-byte (yes, that is not a typo: 64 BYTE) loop. We considered putting a 64-byte cache in our memory controller chip, but our ethos won at the end of the day, as clearly what we had done would have been discovered and the benchmark rewritten. Had we done it, however, for our 15 minutes of fame our benchmark numbers would have been something crazy like 10x or 100x better than anything out there.
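
[Editor's note: a rough illustration of why a 64-byte cache would have gamed that benchmark. Timings are machine-dependent; this only contrasts a working set that fits entirely in cache with one that doesn't.]

    #include <stdio.h>
    #include <time.h>

    #define ITERS 50000000L

    /* Sum repeatedly over a buffer; if the whole buffer fits in a tiny
       cache, the loop measures the cache, not the memory system. */
    static double time_loop(const int *buf, long len)
    {
        clock_t t0 = clock();
        volatile int sink;
        int acc = 0;
        for (long i = 0; i < ITERS; i++)
            acc += buf[i % len];
        sink = acc;        /* keep the work observable */
        return (double)(clock() - t0) / CLOCKS_PER_SEC;
    }

    int main(void)
    {
        static int tiny[16];        /* 64 bytes: fits anywhere      */
        static int big[1 << 22];    /* 16 MB: spills to main memory */
        printf("64-byte working set: %.2fs\n", time_loop(tiny, 16));
        printf("16 MB working set:   %.2fs\n", time_loop(big, 1 << 22));
        return 0;
    }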

What exactly are kid gloves? (1, Informative)

Zach Garner (74342) | more than 11 years ago | (#6049026)

from the what-exactly-are-kid-gloves dept.

Get the answer at Straight Dope [straightdope.com]

Just because the Register says so? (3, Informative)

Performer Guy (69820) | more than 11 years ago | (#6049053)

Look, there is a clear difference between what NVIDIA and ATI have done here. ATI is not cheating: they look at a sequence of instructions and reorder them if they fit a pattern, but the reordered code does exactly the same thing as before. This is central to the kinds of things optimizing compilers and CPUs do. Maybe you think it's too narrow a path, but it's a minor infraction at best compared to the blatant cheats of NVIDIA, who not only rewrote shaders but did several other really heinous things, like disabling screen clears and adding their own hidden clip planes.

It's a real shame that The Register obscured the truth here with an article that attacks ATI for conservatively removing optimizations while the real miscreant gets a free pass. ATI should leave their optimizations in, IMHO; maybe you disagree because their mathematically equivalent optimization is not general enough -- it's a close call -- but they don't deserve the distorted treatment The Register gave them.
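
[Editor's note: to make the clip-plane cheat concrete, a hypothetical sketch with an invented scene and numbers, not from any driver: if the benchmark camera never leaves its rail, a driver can hard-code a volume covering everything the rail sees and silently skip the rest -- which falls apart the moment the camera, or the benchmark, changes.]

    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    /* Invented, hard-coded volume covering everything visible from the
       benchmark's fixed camera path. */
    static int visible_from_rail(Vec3 p)
    {
        return p.x > -10.f && p.x < 10.f && p.z > 0.f && p.z < 50.f;
    }

    static void submit_scene(const Vec3 *objs, int n, int cheat)
    {
        int drawn = 0;
        for (int i = 0; i < n; i++)
            if (!cheat || visible_from_rail(objs[i]))
                drawn++;            /* stand-in for the real draw call */
        printf("draw calls issued: %d of %d\n", drawn, n);
    }

    int main(void)
    {
        Vec3 scene[] = { {0,0,5}, {3,1,40}, {200,0,5}, {0,0,-30} };
        submit_scene(scene, 4, 0);  /* honest driver: draws all 4    */
        submit_scene(scene, 4, 1);  /* rail cheat: quietly drops two */
        return 0;
    }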

Big deal, ATI cheated WAY worse before - Quack.exe (0)

brunes69 (86786) | more than 11 years ago | (#6049408)

Why has everyone forgotten the Quack.exe cheat [hardocp.com] (http://www.hardocp.com/article.html?art=MTEx) that ATI pulled with the first Radeons? That was (IMO) far worse than what NVidia is doing, since it affected a popular GAME, not some lame benchmark that means nothing.

Confused (2, Insightful)

Otter (3800) | more than 11 years ago | (#6049060)

ATI came a cropper the same way. Futuremark saw an eight per cent decrease in the score of one benchmark, Game Test 4, when it conducted the test with a renamed executable rather than correctly titled code. ATI's fix, said Futuremark, contributed to an improvement of just under two per cent in the overall 3DMark 03 score.

I'm confused about what this means. Is the 1.9% difference in ATI performance between Game Test 4 with correct and modified names, or between the current driver and an older version?

Most people here seem to think it's the latter, and I'd agree that they did nothing wrong if that's the case. But it's not obvious to me that they're not accused of the same thing as NVIDIA.

Why the hatred for Nvidia all of a sudden (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#6049065)

Cripes, all of a sudden they suck ass. Why? because they are successful, and slashdot hates any company that does well. For the same nonsense reasons you filthy fucking hippies hate Microsoft - because they are good at what they do and because they make money.

"Make money" is what the rest of the world does when they get a "job" because they have "skills." Do a google search if you don't understand any of this. Hint: Linux isn't involved in any of it.

Do you filthy fucks hate Ford? Wal-Mart? I bet you do, as you drive your shitty broken-down Kias to some Mom & Pop store where you pay twice as much, which is really stupid since you don't have any money. Other than the cash you make sucking dick in the alley before you log on for the day and spend the rest of your time spanking off looking at picture of Linus Dickvolis.

The more things change... (2, Interesting)

jridley (9305) | more than 11 years ago | (#6049069)

I remember video card companies cheating on benchmarks 10 or more years ago. This was when PCI was the latest thing on the block and was competing with VESA local bus. They wrote drivers specifically to detect when they were being called by the PC Magazine benchmark program, and they'd do things like just returning from every other call, since the program was calling the same thing 10,000 times.

Slashdot may be in trouble... (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#6049093)

Look at how many stories are making the front page in a given day... it's bordering on ridiculous. The content in most is too weak to be front-page fodder, but the editors are making a common amateur's mistake, thinking that dynamism through quantity will attract eyes more often, thereby providing additional views of the banner ads.

The reality is that such tactics water down the product and drive potential subscriptions and future ad-views away.

Re:Slashdot may be in trouble... (-1, Troll)

Anonymous Coward | more than 11 years ago | (#6049185)

But it's a troll paradise, more opportunities for first posts, and less "interesting and insightful" comments to vie for attention.

NVIDIA DID NOT CHEAT (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#6049104)

I am an employee at Nvidia, and you have my word: We are NOT cheating on 3dmark2003. What we ARE doing is performing some simple optimizations to make the benchmark run as intended on our cards. This is in no way cheating since developers do this all the time. If you want to call what we are doing cheating then everyone who has ever used the "-O" flag in gcc is also a cheater.

Why is ATI complaining so much? The fact is that our cards are just better than theirs. ATI's cards are built on a backwards architecture that is impossible to optimize for. This is why you don't see ATI performing the same tweaks as us. An Nvidia developer can spend an hour to get a 40% performance increase, but it takes ATI weeks to get less than a 2% performance increase. You do the math.

Re:NVIDIA DID NOT CHEAT (0)

ovit (246181) | more than 11 years ago | (#6049345)

If it only takes an hour to get a 40% performance increase, why didn't you take that hour before the FX came out? Then you could have actually beaten the Radeon 9800.

Or better yet, spend a couple of hours and figure out a way to get rid of the locomotive-like cooling system built into the FX....

Respect (1, Insightful)

Anonymous Coward | more than 11 years ago | (#6049106)

I lose respect for companies when I hear stuff like this. They should try to reorganize their best-practice protocols and rework their ethics. Then they should read more Scott Adams.

Quake III at 300 FPS (5, Insightful)

Genjurosan (601032) | more than 11 years ago | (#6049127)

When Quake III runs at 300 FPS on my system under my 9700 Pro with 4x AA, I could care less about 3DMark and what ATI or Nvidia tweak. If the games run smooth and they look good, then go with it. Truth is, the ATI looks better than the Nvidia card under QIII, WCII, JKII, and pretty much everything else I've been playing.

The issue with low FPS is a game problem 9 out of 10 times. The faster the video card, the less the game development houses work to streamline and improve their framerate.

Cheating??? (3, Insightful)

JDevers (83155) | more than 11 years ago | (#6049139)

In what way is ATi cheating, really? If you think about it, virtually every modern processor does some minor instruction rescheduling, right? Basically, ATi is doing this in the driver and not on-chip; that's the only difference. I'm sure in the next few generations of GPUs we'll see the introduction of hardware features like this, once the pixel/vertex shaders get ironed out pretty well and a lot of people use them. Right now very few games really make use of them, and cards spend most of their time emulating hardcoded T&L, which is again a part of the driver.

Nvidia is cheating and acting like a child, er, large corporation... but that isn't at all what ATi is doing.

Re:Cheating??? (3, Insightful)

Anonymous Coward | more than 11 years ago | (#6049188)

There is a difference between out-of-order execution as found on modern CPUs and what ATI is doing. The rearranging of CPU instructions is done on-chip, and is done no matter what application is being executed. What ATI did was hard-code a rearrangement of instructions into their driver -- something like if (app == 3dmark) { swap(instruction1, instruction2); swap(instruction3, instruction4); } ... If the app being run isn't 3DMark03, then no performance gain will be had. Now, if ATI came up with a general algorithm to rearrange instructions FOR EVERY APPLICATION and implemented it either in the driver or in hardware, that would not be cheating.

but this doesnt seem fishy? (3, Insightful)

222 (551054) | more than 11 years ago | (#6049176)

Hey, I want to know the truth just as much as the next guy, but seriously... doesn't it seem odd that this happens right when Nvidia opts out of this hundred-thousand-dollar beta program?
I've read 5-6 reviews of the FX 5900 and everyone seems to think it's great, and rightly gives Nvidia the 3D crown. (Especially concerning Doom ]|[ :)
If you read the interview, it's even brought up that the 5900 seems to do just fine in all other benchmarks; only Futuremark seems to give it a hard time, and I'm not buying that crap about Doom 3 benchmarks not being readily available.
If I remember, Tom's had a good review of that....

Re:but this doesnt seem fishy? (1, Interesting)

Anonymous Coward | more than 11 years ago | (#6049200)

If i remember, Toms had a good review of that....

Ah yes... Tom's Hardware...
The guys who compared a dual Opteron system with 2 GB of RAM with a dual Xeon system with only 512 MB of RAM.
A great source for unbiased reviews and comparisons...

Re:but this doesnt seem fishy? (2, Insightful)

Anonymous Coward | more than 11 years ago | (#6049244)

You forgot to mention that the benchmarks run weren't memory intensive, therefore no swapping occurred on either machine, so the Opteron might as well have had 512 MB; it made no difference.

Re:but this doesnt seem fishy? (1, Interesting)

Anonymous Coward | more than 11 years ago | (#6049287)

I have no idea how memory intensive the benchmarks were, since I can't actually see everything they ran, just what they said and the results.

I'll be perfectly blunt, though: I just plain don't trust Tom's Hardware.
If it would have made no difference had it been 512 MB instead of 2 GB, then they should have run it with that; you want the test field to at least appear even.

Re:but this doesnt seem fishy? (1)

222 (551054) | more than 11 years ago | (#6049402)

I think I made a mistake in mentioning Tom's Hardware. It's a website I read and personally trust.

HardOCP also had a review of the 5900 running Doom ]|[, and if you don't trust them, I'm sure there are a dozen more hardware sites out there. (Note: the HardOCP team had trouble with the current set of ATI drivers; from what I understand this is/has been/is being worked on.)

Re:but this doesnt seem fishy? (1)

0123456 (636235) | more than 11 years ago | (#6049290)

"If you read the interview, its even brought up that the 5900 seems to do just fine in all other benchmarks, only futuremark seems to give it a hard time, and im not buying that crap about Doom 3 benchmarks not being readily available."

It seems fairly clear that the FX does well in DX8 benchmarks, and 3DMark03 may be the only widely available DX9 benchmark around (I can't think of any others). So rather than whine about the benchmark, we might conclude from this that the FX is a good DX8 chip and a slow DX9 chip... after the cheating fix, the numbers I've seen on the web show the FX running the 3DMark03 pixel shader test at about half the speed of the 9800 Pro, for example.

As for Doom3, I'll wait for it to be released before I worry about benchmark numbers.

Does anyone think it's coincidence (4, Funny)

bhsx (458600) | more than 11 years ago | (#6049196)

Is it coincidence, or some sort of nVidia inside joke, that changing the name of the Dawn executable (fairy.exe, IIRC; http://www.nvidia.com/view.asp?IO=demo_dawn) to quake3.exe removes those pesky leaves, revealing her subtle nature, and that renaming it to 3dmark2003.exe removes the leaves and her wings? Is the inside joke that they leave "certain things out" of Quake 3 and 3DMark? Does the government know of the existence of aliens and wormhole portals to other worlds?

Coincidence? (3, Insightful)

Throatwarbler Mangro (584565) | more than 11 years ago | (#6049203)

What fortune should I happen to get at the bottom of the comments page?

What upsets me is not that you lied to me, but that from now on I can no longer believe you. -- Nietzsche

Classic.

This is a good thing (2, Insightful)

onyxruby (118189) | more than 11 years ago | (#6049214)

The less cooperation between the testing companies and the tested companies, the better. The last thing this industry needs is to become like so many other industries where the test standards lose all merit because the testers and testees are in bed together. Test results are only of merit if they are produced completely independently of the manufacturer during the entire test process.

Think of it this way: when's the last time you saw PC World roast a product that truly deserved it? How many review sites gave WinMe a thumbs up when it's widely viewed in the industry as MS's worst OS to date? We (the public) simply aren't being served if the test companies are cooperating with the companies they're testing. Look, if a testing company, review site, or whatever other lab doesn't occasionally come out and just say "this sucks", then you know they aren't credible. There's too much out there that sucks, and too few reviewers willing to let the public know before they waste their money.

It's the same reasoning that dictates why Consumer Reports will buy their cars anonymously from dealers using third parties instead of getting "special" delivery directly from the manufacturer. What we should really see, given the behaviour we're observing so far, is an impetus to develop an open source benchmark application. By doing this we would ensure that the results can't be bought, just as has become common practice in so many other industries.

End users don't need 3D Mark2003 results (1)

C_Kode (102755) | more than 11 years ago | (#6049220)

I decide what games I'm going to play, and I see how they look on each card. People who buy high-end cards generally know others who do too, so I just compare their cards on the games I wish to play, pick the best of the lot, and buy it. Currently I have a GeForce4 Ti 4400, and it does great for the games that I play. I'm sure I will need a new card when Doom 3 is released; I will check Doom 3 out on every card I consider, then buy the one I think looks best.

who gives a flying fuck?!?!? (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#6049232)

they're just stupid video cards. NOBODY will ever notice any speed or feature difference between two cards from the same generation. Get a life morons.

Video card manufacturers have ALWAYS rigged their drivers to cut corners on the known benchmarks and games, because people are so stupid that they buy one card based on some silly made-up benchmark number printed on the box rather than based on which one is cheaper or has a better set of outputs.

I'm shocked. (1)

blair1q (305137) | more than 11 years ago | (#6049259)


SHOCKED! to find that there is optimization going on here.

(Alphonse enters.)

Your SPECmarks, sir.

Thank you.

Unintentional Consequence? (1)

VernonNemitz (581327) | more than 11 years ago | (#6049297)

Some time ago I read something to the effect that Nvidia was using "genetic algorithm" generation procedures to improve its drivers. If so, then it may be possible that the driver cheats just happened as a result of that process, and not as a result of deliberate design.

Re:Unintentional Consequence? (1)

0123456 (636235) | more than 11 years ago | (#6049306)

So you mean that nvidia drivers just randomly replace shaders with different versions which produce different results but run at twice the speed?

Teach the subject, not the test (5, Informative)

LostCluster (625375) | more than 11 years ago | (#6049300)

This happens so often in grade school that I'm surprised the computer industry hasn't caught on to it yet. If you give students a copy of the exam the night before the exam, the only material they are going to bother to study is the question-answer pairs on that exam, and they may just remember what the answer to #6 is rather than even try to understand the question.

In order for a driver benchmark to be useful at all, it needs to be kept absolutely secret from the chip manufacturers before the test, and then, once it is used and revealed, that benchmark needs to be retired, because the next generation of testing should concentrate on the new features that the graphics card developers are expected to put in their next generation of cards, which will be used in the next generation of games.

In short, the best benchmark will always be based on "that sure-to-hit game that's just about to come out."

Take a step back and look at the big picture. (1)

carlcmc (322350) | more than 11 years ago | (#6049365)

In one test, NVDA's performance sucked, and they cheated in their drivers to make it look similar to their performance on other tests.

In Doom ]|[, the most advanced graphics game out there currently, NVDA smokes ATI -- on test equipment for which NVDA provided only the card.

Am I going to "play" 3DMark, or am I going to play Doom 3? For all of you who will be playing 3DMark more than Doom, go ahead and get the ATI card. I'll make my decision based upon the stellar Doom 3 performance.

So what? Who cares? (2, Insightful)

tomstdenis (446163) | more than 11 years ago | (#6049390)

I never put much stock into benchmarks. One test says one thing, another something else.

All this proves is that a benchmark is a highly isolated incident of observable performance.

For example, most charts I see rate the P4 as "faster than the Athlon" at the same clock rate. Yet when I benchmarked my bignum math code, I found that the Athlon completely kicked the P4's ass.

[K7]
http://iahu.ca:8080/ltm_log/k7/index.html

[P4]
http://iahu.ca:8080/ltm_log/p4/index.html

Does this single test prove the Athlon is faster than the P4? Hell no. It proves that using portable ISO C source code to do bignum math is faster on an Athlon. If I used SSE2, the P4 would probably smoke the Athlon, etc...

Can we stop putting stock into this BS?

For the record, I have a Ti200. It's decently fast [50 fps at 1280x1024 in UT2] and there are no visible artifacts or other "cheats". It works nicely. So if nVIDIA cheated to make their 3DMark score better, all the power to them. Screwing around with meaningless benchmarks is a good way to discredit those benchmarks.

Tom

Of course... (0)

Trent Polack (622919) | more than 11 years ago | (#6049400)

Of course both companies are guilty of optimizing their drivers to get better performance on the benchmark. The fact that ATI is also doing it (though not to the extreme that nVidia did) should come as no surprise to anyone.

This is business. nVidia and ATI are competing companies, both doing everything in their power to one-up the other, so I'm sure that if anyone looked closely enough at various benchmarks, some instances of "foul play" would turn up.

Personally, I don't think the optimizations made by either company are anything to be "ashamed" of. It's not like they're producing fake results; they're simply pushing their own cards to their full extent for the benchmark, and that's nothing "unfair."

What about game demo benchmarks? (0)

Anonymous Coward | more than 11 years ago | (#6049404)

To what extent does this also cast doubt upon video card benchmarks that use game demos in which the camera moves along a fixed path?