More on Futuremark and nVidia

AzrealAO writes "Futuremark and nVidia have released statements regarding the controversy over nVidia driver optimizations and the FutureMark 2003 Benchmark. "Futuremark now has a deeper understanding of the situation and NVIDIA's optimization strategy. In the light of this, Futuremark now states that NVIDIA's driver design is an application specific optimization and not a cheat."" So nVidia's drivers are optimized specifically to run 3DMark2003... and that's not a cheat.
  • by Samari711 ( 521187 ) on Tuesday June 03, 2003 @02:18PM (#6107392)
    and I didn't use a cheat sheet, I used a memory priming sheet.
    • Re:riiiiight... (Score:5, Insightful)

      by PhxBlue ( 562201 ) on Tuesday June 03, 2003 @02:22PM (#6107434) Homepage Journal

      That is the way it sounds, isn't it?

      "Application-specific optimization". . . In other words, "We're not cheating, we're just adding code to our driver to make sure our card works really well with benchmarking software." Of course, if it works better with benchmarking software than it does with real-world applications, that is cheating, isn't it?

      It actually reminded me of the axiom, "That's not a bug, it's a feature!"

      • by YetAnotherAnonymousC ( 594097 ) on Tuesday June 03, 2003 @02:32PM (#6107559)
        I think the money quote is:

        However, recent developments in the graphics industry and game development suggest that a different approach for game performance benchmarking might be needed, where manufacturer-specific code path optimization is directly in the code source. Futuremark will consider whether this approach is needed in its future benchmarks.

        I can sort of see the argument here, but it basically ruins the point of having a standard interface like DirectX. It's also like telling your math teacher, "no, it would be easier for my equations if you made 1+1=3. Now do it because I'm your star student."
        • by TopShelf ( 92521 ) on Tuesday June 03, 2003 @02:54PM (#6107811) Homepage Journal
          This is a pretty disturbing point, in that it makes it even more difficult for independent game developers to make headway. Basically, the video card manufacturers offer to assist in optimizing games to work with their hardware. But of course that assistance will vary with the size and clout of the developer, leaving smaller outfits with the task of trying to optimize for various cards on their own, or suboptimizing the features in their products... either way it's a mess.
          • by daVinci1980 ( 73174 ) on Tuesday June 03, 2003 @04:54PM (#6109142) Homepage
            I would mod this down, but I'd rather argue (so much more fun! ;-) ). As someone who has worked on a very large game (team of >50, sold 500K copies so far), someone who works at a small studio now (20 people), and someone who develops at home on the side (whew), I can say that size and clout have little to do with how much attention video card manufacturers are willing to give you. All that really matters is that they see an interesting prospect, and a way for their card to look "better."
        • by Zathrus ( 232140 ) on Tuesday June 03, 2003 @03:25PM (#6108089) Homepage
          I can sort of see the argument here, but it basically ruins the point of having a standard interface like DirectX

          Shrug... welcome to reality. DirectX, OpenGL, etc. don't properly model the hardware in some cases, leading to much worse performance than should be available.

          It's not like saying 1+1 = 3. It's more like saying what's 7+7+7+7+7+7? Well, it's the same as 7*6, but guess which one is faster to calculate?

          And it's not quite like that either, I know, because the bit that FutureMark is tentatively agreeing with is Nvidia changing the shader precision, which can lead to a loss of quality (so maybe it is 1+1 = 1.999999999998).

          ATI did pretty much the same thing with their drivers, leading to a much slimmer 1.9% improvement. Of course, it's unclear how much of Nvidia's improvement was from the shader changes (which FM is considering) versus from the other modifications they made (like clipping issues). The latter points are not in question by FM - they are cheating.
          • by be-fan ( 61476 ) on Tuesday June 03, 2003 @03:41PM (#6108276)
            One key point, though: NVIDIA's shader precision is much higher than ATI's at its highest settings. NVIDIA's middle precision, however, is lower than ATI's maximum. This means that comparing the performance of the two is kind of a crapshoot, because you can't configure the two to use the same precision.
          • by Happy Monkey ( 183927 ) on Tuesday June 03, 2003 @03:50PM (#6108378) Homepage
            It's not like saying 1+1 = 3. It's more like saying what's 7+7+7+7+7+7? Well, it's the same as 7*6, but guess which one is faster to calculate?

            It's more like saying "What's 7+7+7+x+7+7?" For the benchmark program, x happens to be 7, leading to 7*6, but the general case is actually 7*5+x. No attempt is made to check an arbitrary game to see if x is 7. It only applies the optimization if the executable is a particular benchmark.
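            To make the parent's analogy concrete, here's a hypothetical sketch (invented names, nobody's actual driver code) of the difference between a general optimization and a benchmark-only shortcut:

                // General-case rewrite: algebraically identical for ANY x.
                int sumWithX(int x) {
                    return 7 * 5 + x;   // same answer as 7+7+7+x+7+7, fewer operations
                }

                // Benchmark-only "optimization": correct only when x happens to be 7,
                // i.e., only on the one input the benchmark ever supplies.
                int sumWithX_cheat(int /*x*/) {
                    return 42;          // 7 * 6 -- right for the benchmark, wrong in general
                }

            The first function is a legitimate optimization; the second only looks like one as long as nobody feeds it anything but the benchmark.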
        • Am I the only one wondering whether the discussion in question was of the "we'll sue you out of existence" variety?

          It's odd they would make their benchmark useless, and that's what it has become now IMHO - it doesn't tell you how fast a random game is likely to run, it tells you how good their hackers are at hand-tuning a meaningless benchmark.
      • by confused philosopher ( 666299 ) on Tuesday June 03, 2003 @02:37PM (#6107625) Homepage Journal
        This just means that game developers will have to design their games to mimic benchmarking software.

        Come to think of it, why doesn't nVidia just optimize their software for games instead of benchmarking software...?
      • Re:riiiiight... (Score:3, Informative)

        by jetmarc ( 592741 )
        > Of course, if it works better with benchmarking software than it does
        > with real-world applications, that is cheating, isn't it?

        I like it when card manufacturers optimize their driver to achieve high
        Quake III Arena frame rates, because coincidentally Quake III Arena is my
        favourite game (and actually the ONLY game I play). I don't care if the
        drivers are good by thoughtful design, or by re-engineering the Q3A code
        path and then constructing a driver that is an exact fit (possibly with
        penalties for other
    • And I was only optimizing my tax returns based on potential future wages!
    • by Jack William Bell ( 84469 ) on Tuesday June 03, 2003 @02:37PM (#6107617) Homepage Journal
      How many of these can we do?

      "Officer, I wasn't speeding. I was driving in a manner consistant with the road, conditions and the huge motor in my car!"

      "It was creative accounting sir! Not an attempt to 'cook the books'."

      "We are only writing software with the features our users want. This isn't code bloat, and I never made that remark about 640k being enough for anyone!"

      More?
    • by p3d0 ( 42270 ) on Tuesday June 03, 2003 @02:41PM (#6107664)
      No, not a cheat-sheet, but an examination-specific memory optimization.
  • Fine With Me (Score:4, Interesting)

    by HeelToe ( 615905 ) on Tuesday June 03, 2003 @02:19PM (#6107395) Homepage
    Though I do prefer that they make application-specific optimizations that mean better gameplay.

    It's just another piece of information to keep in mind when selecting a new card.
    • Futuremark should (Score:3, Interesting)

      by Achoi77 ( 669484 )
      ...pump out the most ugly-coded, cycle-wasting benchmark. I don't mean one that showcases all the newest rendering techniques, but rather one that strains to put out a simple rotating triangle. Then just have Nvidia pull out all the stops to try to make their card work faster.

      It's sort of akin to walking around with a backpack full of cinderblocks. That way, when you put down those cinderblocks (i.e., the benchmark), you'll notice how much stronger you got.

      Perhaps they should use .NET for their next benchmark. Or

  • Cheat? (Score:3, Funny)

    by Davak ( 526912 ) on Tuesday June 03, 2003 @02:19PM (#6107396) Homepage
    So nVidia's drivers are optimized specifically to run 3DMark2003... and that's not a cheat

    Errrr... that seems like a cheat to me!

    Davak
    • Re:Cheat? (Score:5, Interesting)

      by Davak ( 526912 ) on Tuesday June 03, 2003 @02:26PM (#6107477) Homepage
      Maybe I should elaborate...

      Specifically designing your product to work better in a test than in real life should be considered cheating.

      This could be avoided if Futuremark released different methods of testing the video cards each year... or if one could download updates for 3DMark2003 that would block any driver-specific optimizations.

      I usually look at the latest and greatest fps benchmark for the latest and greatest game anyway.

      Well, actually... my current Nvidia video card laughs at my little CPU anyway. Until I can find some more CPU to drive my screaming video card, I am not going to find any performance increase.

      Davak
    • Re:Cheat? (Score:5, Funny)

      by malia8888 ( 646496 ) on Tuesday June 03, 2003 @02:31PM (#6107548)
      So nVidia's drivers are optimized specifically to run 3DMark2003... and that's not a cheat

      That is right, this is not a cheat... we are just redesigning the arrow and repainting the target so they match. ;)

  • by MerryGoByeBye ( 447358 ) on Tuesday June 03, 2003 @02:19PM (#6107399) Journal
    But then, I'm Bill Gates.
  • whee (Score:5, Funny)

    by Anonymous Coward on Tuesday June 03, 2003 @02:19PM (#6107405)
    Oh, it seems some careless individual has left this big pile of money on the table! Well, we'll just leave for a few moments and maybe when we come back it will have gone away.
  • Yeah, right (Score:3, Insightful)

    by jabbadabbadoo ( 599681 ) on Tuesday June 03, 2003 @02:19PM (#6107408)
    Time to update Webster's. Cheat just got new semantics.
    • Time to update Webster's. Cheat just got new semantics.

      optimize

      verb. optimized, optimizing, optimizes

      See cheat.

      1. Jimmy optimized his test score when the teacher wasn't looking.

  • by Nogami_Saeko ( 466595 ) on Tuesday June 03, 2003 @02:20PM (#6107414)
    I think [H]ardOCP stated it best as "Futuremark didn't want to get sued by Nvidia". Nvidia has the legal and financial resources to totally ruin Futuremark and they know it.

    And now Futuremark has totally invalidated their own benchmark software by declaring it "open season" for hardware manufacturers to distort the "tests" in any way shape or form they desire to make the numbers higher.

    N.
    • Exactly. The whole point of a benchmark is to help determine real-world performance, not to get some artificially generated 'performance number' so you can be higher than your friends. 3DMark has some great eye candy, but after this I wouldn't use them as a benchmark any more, just for recreation. :)
    • Well, Futuremark was in trouble when magazines ran ads for PC manufacturers who used Futuremark scores.

      Now the last 2 months of PC ads are worthless if based on Futuremark scores.

      Hey, 0-60 mph in 2 hours...
  • Futuremark scared? (Score:3, Insightful)

    by steveit_is ( 650459 ) on Tuesday June 03, 2003 @02:21PM (#6107422) Homepage
    Looks like someone is scared of somebody else's lawyers. Yuck! This is obviously Futuremark trying to appease Nvidia.
  • by L. VeGas ( 580015 ) on Tuesday June 03, 2003 @02:22PM (#6107426) Homepage Journal
    It's not cheating if you don't get caught.

    Oh, I did get caught?

    No, I didn't. Let's move on, shall we?
  • Bullshit (Score:5, Interesting)

    by Obiwan Kenobi ( 32807 ) * <evan@misterFORTR ... m minus language> on Tuesday June 03, 2003 @02:22PM (#6107429) Homepage
    This is politics at its worst, and I'm calling bullshit.

    There was no need for this nicey-nice statement other than NVidia threatening lawsuits and Futuremark wanting to protect what assets they have.

    Futuremark had every right to call NVidia on their selfish claim and unbelievable hacks. To say that they weren't liable for their own blunder is to say that Futuremark's reputation has been replaced by corporatespeak and an almost unparalleled lack of respect.

    What's worse is that I really thought "Yeah, this time the bad guy gets his due" and that NVidia should've known better.

    But of course, a few weeks later we've got to put on the nice face again for the public at large.

    What a complete waste of time. I know there isn't much respect left in corporate America, but hell, if you can't call a spade a spade, why even bother with the benchmarks when someone can just rewrite an ENTIRE SHADER and only keep a picture clear while the demo is on rails?
    • This is politics at its worst, and I'm calling bullshit.

      You're calling bullshit? Okay, you can have it. I call the last piece of pie.

      Why would anyone call bullshit? It tastes like, well you know.
    • Re:Bullshit (Score:3, Funny)

      by PhxBlue ( 562201 )

      This is politics at its worst, and I'm calling bullshit.

      You know, maybe that was the idea? Imagine the scenario: "Okay, folks, we need to rephrase the statement that 'NVidia cheated' so that they won't sue our pants off our asses. What can we come up with?"

      'I know! Let's call it an application-specific enhancement! Their lawyers will stare blankly, but any geek shopping for a video card will read right through it!'

      Who knows? Coulda happened that way. :)

    • Re:Bullshit (Score:3, Funny)

      by Loki_1929 ( 550940 ) *
      "I'm calling bullshit."

      I'm sorry, Bullshit isn't here to answer your call.
      Please leave a message after the tone.

      *BEEP*

  • Stack Creep (Score:2, Insightful)

    by netolder ( 655766 )
    This sort of outcome is inevitable as drivers move "up the stack" into the application layer. To get better and better optimizations, the drivers need to know more and more about the application that is requesting the services - thus, we end up violating the strict separation between application and driver.
  • Great! (Score:5, Funny)

    by blitzoid ( 618964 ) on Tuesday June 03, 2003 @02:23PM (#6107445) Homepage
    Well, that's excellent... now they can put 'Designed to run 3Dmark2003' on Nvidia product boxes!
    • Re:Great! (Score:3, Insightful)

      by DickBreath ( 207180 )
      now they can put 'Designed to run 3Dmark2003' on Nvidia product boxes!

      Having such a logo labeling program might be a revenue opportunity for FutureMark.

      Another revenue center for FutureMark might be to sell benchmark result coefficients. Each different card's results are biased by some coefficient, whose value can be purchased according to a tiered pricing model. This ensures uniformly fair bias according to what each video card vendor is willing to spend.

      On the video card side of the fence, cou
  • by Viewsonic ( 584922 ) on Tuesday June 03, 2003 @02:23PM (#6107448)
    Cripes already. No one even BOTHERS with 3DMark anymore, and after this fiasco no one is ever going to bother with them again. Gamers will use REAL EVERYDAY GAMES to see what runs the fastest again. Looking at some goofy simulation app coming up with scores, and people buying into the company, and people tricking drivers for particular tests is just crappy and makes 3DMark 100% invalid to any of my concerns in the future. I will only trust reviews that benchmark the latest and greatest games that I will be buying these cards for; whoever can run them fastest at that particular time IS WHAT I'M GOING TO BUY. Period. Enough of this 3DMark BS.
    • Posted in response to this [slashdot.org] initially, but this is such a popular misconception. ;-)

      I think the idea was to test new technologies that haven't been implemented yet in Quake 3 or Unreal Tournament 2003 (like in the upcoming DOOM III).

      A quote of a quote in their 10/26/98 press release:

      "3DMark sets a long awaited standard for testing actual game performance for titles like Unreal as well as the future technologies. I support it one hundred percent."

      -- Tim Sweeney, Unreal Programmer, Epic MegaGames

    • Actually, WE care. (Score:3, Insightful)

      by Canis ( 9360 )
      Actually, we game developers care. We want to make use of new features of graphics cards to increase performance and/or visual quality. But also, we want our games to run on all (relatively-recent) cards, without having to write complex hacks to work around the bugs of each one.

      It's no use benchmarking on the latest and greatest games -- because, as developers, we try and avoid releasing games that run horribly (slowly or with obvious bugs) on certain cards. Sometimes we can persuade the manufacturers to f
  • ... FutureMark and NVidia stated that they were proud to announce that former President Bill Clinton had joined their boards and had assumed management responsibilities.

    • by Anonymous Coward
      I can hear it now:
      "Our driver did not have inappropriate clipping optimizations with that application."
  • by MongooseCN ( 139203 ) on Tuesday June 03, 2003 @02:25PM (#6107472) Homepage
    Run Quake3 with the video card and check out the frame rate and image quality. Run it under UT also, and every other 3D game you can. Then compare the framerates and image quality. Who cares what its 3DMark score is? Did you buy the card specifically to run 3DMark? Most people buy these cards for playing games, so comparing how it runs the actual game to other cards is the only meaningful measurement.
  • by Zone5 ( 179243 ) on Tuesday June 03, 2003 @02:26PM (#6107478)
    All your benchmarks are belong to us!
  • by MyNameIsFred ( 543994 ) on Tuesday June 03, 2003 @02:26PM (#6107483)
    The press release is short on details. But I think it raises two points. First, Futuremark is no longer calling it cheating. Second, Futuremark is considering changes to the way it benchmarks cards.

    So the question in my mind is: did Futuremark learn something from the discussions? Is there something it was ignoring in its tests?

    I'm trying not to be a cynic who assumes a big fat envelope was passed under the table - trying to believe that what Nvidia did was legitimate.

    • by OmniGeek ( 72743 ) on Tuesday June 03, 2003 @02:50PM (#6107761)
      NVidia did things that were clearly NOT legitimate, and FutureMark caught them at it. There's a PDF report [futuremark.com] on FutureMark's Web site (assuming it hasn't met with an "accident" by now) detailing the dirty deeds.

      Chief among them, IMHO, was a trick where the driver was supposed to draw and update positions of stars in a night sky (involving clearing the background) as one moved along a 3D path; if one stays on the exact preprogrammed track of the demo, it looks OK. BUT... if you turn around (possible in the beta mode of the benchmark) you see that the driver SKIPPED clearing the background; the stars smear like mad. There is NO POSSIBLE WAY their driver was behaving legitimately. (Especially since changing the benchmark's fingerprint oh-so-slightly caused all these quirks to vanish; they were detecting the demo and screwing with things if it was being run...)

      The rest is just fear-of-pissing-off-the-800-pound-gorilla. A FutureMark developer admitted as much in a newsgroup posting. Sigh...
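      Purely as a hypothetical illustration of the shortcut described above (invented types and names, not actual driver code), the alleged trick amounts to something like this:

          struct Device { void clear(unsigned flags); };   // minimal stand-in types
          struct Scene  { void render(Device& dev) const; };
          enum : unsigned { COLOR_BUFFER = 1, DEPTH_BUFFER = 2 };

          // An honest frame loop pays for the clear on every frame,
          // no matter where the camera points.
          void drawFrameHonest(Device& dev, const Scene& scene) {
              dev.clear(COLOR_BUFFER | DEPTH_BUFFER);   // stale pixels never survive
              scene.render(dev);
          }

          // The alleged cheat: skip the clear when a known benchmark is detected,
          // betting that its fixed camera path never exposes the stale background.
          void drawFrameDetected(Device& dev, const Scene& scene, bool isKnownBenchmark) {
              if (!isKnownBenchmark)
                  dev.clear(COLOR_BUFFER | DEPTH_BUFFER);
              scene.render(dev);   // turn the camera around and old frames smear
          }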
  • Big quality loss (Score:4, Insightful)

    by 1001011010110101 ( 305349 ) on Tuesday June 03, 2003 @02:27PM (#6107491)
    Those following this should check the pictures in the previous article. The quality of the nvidia "optimized" version sucked (it showed big artifacts). That's no optimization; a real optimization has no image quality loss.
    "This card is optimized for Quake as long as you follow the left trail; the right trail will just look like crap, but nobody follows it anyway."
  • by pjwhite ( 18503 ) on Tuesday June 03, 2003 @02:28PM (#6107504) Homepage
    If a benchmark doesn't measure performance related to real-world applications, what's the point? If a driver is optimized to run a benchmark faster, that SHOULD mean that the real world apps should run faster, too. If not, the benchmark is useless.
  • Looking closer... (Score:2, Interesting)

    by Infernon ( 460398 )
    While nVidia has made great cards for some time now (I still use and love my GeForce 3), could it be possible that they're not able to keep up? A lot of the reviews that I've been reading tend to favor the Radeon cards over anything that nVidia has put out lately. While I doubt that nVidia will become another 3Dfx, because they're involved in other markets and I've read about them having US government contracts for this project or that, I would propose that they will not be the huge players that they
  • by Kjella ( 173770 ) on Tuesday June 03, 2003 @02:29PM (#6107523) Homepage
    If you want to, you can prerender the whole fucking test, stick it in your driver and just play it back instead of actually rendering when Futuremark is running; that would be an "application-specific optimization" too.

    The benchmark is meant to reflect performance in the actual game; the reason it takes the same path is merely to make the results comparable. What ATI did was something the game *could* have achieved in-game, if the operations were properly sequenced. What Nvidia did is fake a performance it can't actually deliver if a person had followed the exact same path in the game. That is cheating.

    It is pathetic by Nvidia, and it's pathetic by Futuremark to present this press statement. Get some backbone and integrity.

    Kjella
    • The question is to what extent it was "optimized." Nvidia also optimizes for Q3 and UT, so is it not valid to ask how these optimized performances compare to the optimized benchmark?

      What if they optimized not just for Q3, but for Q3 Level 1 while you're using player model X on a sunny day with a BFG and 13 bots? Would you cry foul? Are they cheating to make Q3 run faster? Isn't that their job? Where is the line? And, if 90% of the best selling games today run at that same "optimized" speed, how good

  • by corebreech ( 469871 ) on Tuesday June 03, 2003 @02:30PM (#6107531) Journal
    Now that all the latest games have benchmarking modes, what do we need FutureMark for?

    If NVidia wants to do application-specific optimizations that make UT2003 go faster, then that would be great. That's what they should be doing. Those are optimizations that genuinely benefit the user.
    • by TrekkieGod ( 627867 ) on Tuesday June 03, 2003 @03:47PM (#6108354) Homepage Journal
      If NVidia wants to do application-specific optimizations that make UT2003 go faster, then that would be great. That's what they should be doing. Those are optimizations that genuinely benefit the user.

      Problem is, NVIDIA didn't just optimize. Their application-specific "optimization" made the images look worse. And when you couldn't notice it, it was because they were clipping outside the camera angle (because they knew exactly where the camera was, something they can't do when you're playing UT2003).

      Like the original statement by Futuremark said, optimization is great. But when you change the image intended by the software designer in order to make it go faster, that's not an optimization. For god's sake, I can turn all the details to low on UT2003 and get it to go faster, but that's beside the point.

      The reason game-specific benchmarks don't fly for me (although now that Futuremark has issued this statement, I'm sure ATI will start cheating as well, making the whole thing useless) is that synthetic benchmarks can test features new to the cards that games may not have implemented yet. So I have an idea how the card will perform with future games.

  • So nVidia's drivers are optimized specifically to run 3DMark2003... and that's not a cheat.

    Editorial comments like this are wonderful. They make the best shine out in all of us, leaving us profoundly enlightened. Michael, I thank you for taking the time to summarize two press releases by stating the completely obvious; if not for you I might just have been forced to click those god-awful annoying hyperlinks.

    Since you've obviously missed the "concern" over this whole issue let me help you out i

  • "...the physical relationship with the President included oral sex, but not sexual intercourse."
  • It's just an unfortunate choice on nVidia's part.

    I don't know about you, but I'd rather have my nVidia card optimized for, say, Quake3 or Battlefield 1942. Someone call ATI and tell them they've got their new marketing campaign.
  • by CaffeineAddict2001 ( 518485 ) on Tuesday June 03, 2003 @02:36PM (#6107601)
    "Cheating? Our developers, The Franklin Family, would disagree. I think a meeting can be arranged with them, if you wish. Isn't that right Mr. Franklin?"

    *shakes hundred dollar bill side to side, speaks in high tone out of side of mouth*

    "Sure is, Boss!"
  • When I read the statement, it seemed to indicate that Futuremark was unsure how to write a comparative benchmark now that you can "cheat" at the benchmarks.

    I can understand that if there are things in a GPU that can be optimized for an application then you should go ahead and do it, but of course that raises the question: how do you truly, evenly compare the performance of one piece of hardware (now with tons of customizable software) versus another?

    Futuremark has a tough time ahead now getting people to believe they add
  • goons...hired goons (Score:3, Interesting)

    by Ubergrendle ( 531719 ) on Tuesday June 03, 2003 @02:36PM (#6107613) Journal
    Futuremark: "NVidia is cheating! Not as much as ATI, but they're cheating!"

    Nvidia: Knock knock

    Futuremark: "Who's there?"

    Nvidia: "Goons...hired goons."

    Futuremark: "Oh...haha...um...Nvidia is actually in the business of application optimisation! Our mistake. Won't happen again."

    Seriously folks, this is Nvidia using big bad lawyers to scare Futuremark into capitulating. They might have held their ground, until ATI was proven to be doing the same thing, albeit to a much lesser degree.

    Unfortunately, the only person who loses in this scenario is the consumer.
    • A few months ago, in a dark warehouse in Brooklyn...

      *knock* *knock* *knock*
      nVidia: Who is it?
      *Futuremark goons enter stage left*
      Futuremark: You's late wit you's beta program "membership dues". You know what happens wit da peoples dat don't pay they's dues, right?
      nVidia: Piss off! We're not paying this year.
      Futuremark: Dat's a pretty benchmark score you got there. Be a shame if somethin' BAD happened to it...
      *Futuremark pushes nVidia's benchmark trophy over, shattering it*
      Futuremark: oops. Looks like
  • This has been discussed before. Other companies have tried to modify their drivers to produce better results for certain benchmarks. They've always been thrown out as invalid before. I wonder why Futuremark seems to be considering allowing NVidia's enhancement to stand.
    There's a line from the story:
    "...However, recent developments in the graphics industry and game development suggest that a different approach for game performance benchmarking might be needed, where manufacturer-specific code path optimization is directly in the code source..."
  • The numbers... (Score:5, Insightful)

    by BrookHarty ( 9119 ) on Tuesday June 03, 2003 @02:38PM (#6107630) Journal
    Well, after seeing the Futuremark scores, it looked to me as if Nvidia's FX chips were blowing away ATI and Nvidia's own GF4 line.

    But I found a really nice German benchmark site, 3dcenter.org [3dcenter.org], that has to be the best set of benchmarks I've ever seen; they actually run the games on each card and list the FPS.

    Looks like the FX 5200/GF4 4200 and FX 5600 (non-Ultra)/GF4 4600 pairs are about the same. And the ATI 9700 Pro/9800 are faster than the 5800 Ultra.

    After reading these benchmarks, you can really tell Nvidia tweaked the SHIT out of its drivers for Futuremark...
  • by TellarHK ( 159748 ) <tellarhk@NOSPam.hotmail.com> on Tuesday June 03, 2003 @02:38PM (#6107632) Homepage Journal
    (blunt)The problem with a lot of the reasoning I see here with people saying they want the card that plays the game they're interested in quickly, is that it's completely stupid. (/blunt)

    When you're looking for a video card, you -should- rely on a capable and untainted/unoptimized benchmark for comparison, simply because you can't predict what the software companies that make the actual games are going to do. Will they support -your- chosen card, or will some other GPU maker offer a better bribe to the developer? You may know that kind of info about games shipping RSN, or already on the shelves, but what about next year's?

    Getting the card based simply on one or two games instead of looking at some kind of objective benchmark does no good whatsoever. It's just a way to rope yourself into upgrading the card faster.
  • by Anonymous Coward on Tuesday June 03, 2003 @02:41PM (#6107673)
    Total BS, 3DMark2003 is already geared toward specific vendors. I've personally analyzed the data from the driver (since I'm writing one), and they totally favor ATI with the heavy use of PS 1.4 shaders. In fact, the data changes completely if PS 1.4 support isn't claimed. (3x more geometry is sent)
    Also, PS 1.4 shaders don't always translate 'up' to PS 2.0 hardware very well, which is why (IMHO) Nvidia started all this hubbub in the first place.

    The only vendor that natively supports PS 1.4 is ATI.

    They should have created PS 1.1 shaders for the masses, and then if 2.0 hardware is detected, had 2.0 shaders for everything.

    And their "DX9-only" test is a piece of crap too. They use one or two new instructions in the VS, and PS2.0 is only used for the sky. Big whoop. No branching in the VS, two-sided stencil, or anything cool.

    It's sad the OEMs put a lot of stock in 3DMark; they don't seem to realize that gamers play games all day, not benchmarks.

    • Bogus (Score:4, Interesting)

      by 0123456 ( 636235 ) on Tuesday June 03, 2003 @03:45PM (#6108328)
      "The only vendor that natively supports PS 1.4 is ATI."

      Sorry, but that's garbage, pure and simple. Or are you not aware that PS 1.4 support is _required_ for DX9 cards with PS2.0 support? Your complaint may be valid when comparing a GF4 against a Radeon 8500, but is totally bogus when comparing two DX9 cards.

      "And their "DX9" onyl test is a piece of crap too. They use one or two new instructions in the VS, and PS2.0 is only used for the sky."

      Gee, one minute you're complaining that they use PS1.4 instructions, and now you're complaining that they don't use PS2.0 instructions. PS1.4 instructions _are_ effectively DX9 instructions since other than ATI, no other DX8 chips use them: you need a DX9 chip to run PS1.4 shaders.

      And it would appear to be real lucky for nvidia that they don't use many PS2.0 instructions since from the results of their shader test once the nvidia "optimization" of throwing away the shader and running a completely different shader was fixed, shows them running PS2.0 shaders at about half the speed of a Radeon 9800. The low performance of PS2.0 shaders on the FX card seems to be the reason why nvidia hated 3DMark03 so much; there was no way to get a good score without redesigning the chip or "optimizing".
    • MOD PARENT DOWN (Score:3, Interesting)

      by mczak ( 575986 )
      Please mod that troll down - it's just nvidia PR.

      I've personally analyzed the data from the driver (since I'm writing one), and they totally favor ATI with the heavy use of PS 1.4 shaders. In fact, the data changes completely if PS 1.4 support isn't claimed. (3x more geometry is sent)

      You don't need to "analyze the data" to figure this out. Futuremark themselves stated how much the PS 1.4/2.0 shaders are used. And that 3x more geometry figure if you don't support PS1.4 is correct - guess why? If you do a

    • You are pretty much on target. However, the real reason that 3DMark is a bad benchmark [videogamestumpers.com] is that the 3DMark score is an aggregate number. You lose all the detail, which lets the hardware makers hide any "cheats" they want. It's the reason that whenever people compare hardware in the real world (cars, televisions, computers) they don't use some made-up figure. They measure using metrics like horsepower, torque, size, weight and speed.

      What Futuremark is currently doing is akin to taking a Dodge Ram pickup

  • by Geek of Tech ( 678002 ) on Tuesday June 03, 2003 @02:47PM (#6107744) Homepage Journal
    Right, right, right...

    And I suppose people who cheat at online MMORPGs are just using undocumented program calls and extended functionality. It's all so clear now...

    What would be great...
    If someone were to reverse-engineer the drivers, remove the "Optimisation", recompile, and compare results. See what percentage the "Optimisations" fudged the results.

    • by micq ( 266015 ) on Tuesday June 03, 2003 @02:56PM (#6107835)
      What would be great...
      If someone was to reverse engineer the drivers, remove the "Optimisation", recompile and compare results. See what percent the "Optimisations" fudged the results.


      Don't have to. In the previous story, they stated how they simply removed the condition that the driver used to switch on this optimisation and, as you said, saw what percent the optimisation fudged the result.
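      For flavor, the kind of switch being described might look like this hypothetical fragment (invented names; real drivers can fingerprint applications in several ways):

          #include <cstring>

          // Hypothetical detection predicate: match the running executable's name.
          // (Real fingerprints might also hash shader code, check window titles, etc.)
          bool isKnownBenchmark(const char* exeName) {
              return std::strcmp(exeName, "3DMark03.exe") == 0;   // invented name
          }

      What the testers effectively did was force that predicate to false (by altering the benchmark's fingerprint) and measure the score again; the gap between the two runs is how much the "optimisation" fudged the result.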
  • Argh (Score:4, Insightful)

    by retro128 ( 318602 ) on Tuesday June 03, 2003 @02:52PM (#6107793)
    What a load of crap. This is one of those things where, if you think about it too much, a bunch of false lines of logic get drawn and you come up with a nonsensical answer. Either that, or Futuremark is trying to avoid a lawsuit from nVidia, which no doubt has been threatened.

    The point of a benchmark is to test dissimilar systems against common references to get an idea of how they perform against each other in such a way that you have an apples to apples comparison.

    If 3DMark writes their program in a way that allows optimization paths for a specific GPU, then it is no longer a benchmark.

    You now no longer have an idea of how fast the card REALLY runs, as there is no guarantee that game writers will use GPU-specific optimizations. It's the same thing as MMX... nobody sees the benefits if it's not hardcoded into the program, so what's the point of being uberfast in a benchmark if you won't necessarily see the same results in the real world?
  • by Thagg ( 9904 ) <thadbeier@gmail.com> on Tuesday June 03, 2003 @03:26PM (#6108094) Journal
    I believe the correct interpretation of what FutureMark is saying is that the game writers are building their games differently for the different boards that are out there. That's what they mean by "manufacturer-specific code path optimization is directly in the code source." The source code they are referring to is that of games like UT and Q3.

    They are saying that the boards have become different enough that game writers are coding differently for them. Not too surprising, really. That's the way it's always been.

    This makes writing a synthetic benchmark extraordinarily difficult, needless to say. I don't know if it's even possible in this case. Perhaps rather than try to come up with one number that specifies how fast a board is, you can come up with a series of metrics for each capability.

    While I'm sure that FutureMark has had some pressure applied to them to make this statement, it's not an unreasonable statement on its face. It's just the path they took to get there that is questionable.
  • by default luser ( 529332 ) on Tuesday June 03, 2003 @03:42PM (#6108285) Journal
    Don't get me wrong, there is nothing wrong with application-specific optimizations.

    But this misses the whole point of 3dmark 2003. Different developers stress the pixel and triangle pipelines in different ways to produce a whole boatload of effects. While major games and engines often get manufacturer optimizations, there is no guarantee that ATI or Nvidia will sit down and optimize for the game you just bought.

    That said, 3dmark 2003 should be considered a relational tool for generic performance. Consider it a good bet that if two cards perform similarly and acceptably, the two cards should be able to run almost any DX8/DX9 game off the shelf acceptably.

    The fact that Nvidia's unoptimized drivers perform significantly behind ATI's unoptimized drivers in 3dmark 2003 raises a significant question:

    We all know how well the 5900 does in Quake III, Serious Sam 2, UT2003, etc., but how does it do in *insert random DX8 game here*?

    I want to know that if I take *insert random DX8 game here* home to play, IT WILL PERFORM WELL. That is the entire point of having a benchmark like 3dmark. To do application-specific optimizations for it is to nullify the entire point of the benchmark.
  • by Woodie ( 8139 ) on Tuesday June 03, 2003 @04:01PM (#6108495) Homepage
    OK -

    first off, for those of you wondering what the big deal with 3DMark 2003 is - and why you might use it in place of "real games" to benchmark 3D performance - here you go:

    3DMark is a test application to benchmark next-generation performance, so that you can get an idea how your video card might handle games that will be out this time _next_ year. Specifically, some aspects of 3DMark are geared toward testing DX9 functionality and its pixel and vertex shaders. No game currently on the market uses these features (at least not that I am aware of).

    Secondly, the difference between a cheat and an optimization is a fine one. If a given function continually produces the same output for the same inputs, and it takes 1 second to do so, and another function can produce the same results given the same inputs, but only takes half a second, it can be said to be functionally equivalent. However, it has been optimized. It's entirely possible, even desirable, to replace pixel shaders and vertex shaders with routines which are optimized for your hardware. In much the same way that compilers schedule instructions optimally for the underlying CPU architecture, so too can instructions be re-ordered in a pixel shader routine... It's an optimization.
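    For instance, a functionally equivalent rewrite might look like this made-up example (not from any actual driver):

        // Original: one expensive divide per element.
        void scaleOriginal(float* v, int n, float d) {
            for (int i = 0; i < n; ++i)
                v[i] = v[i] / d;
        }

        // Optimized: hoist the divide out of the loop and multiply by the
        // reciprocal instead. Same results (up to floating-point rounding),
        // far fewer cycles per element -- an optimization, not a cheat,
        // precisely because it holds for every input, not just a benchmark's.
        void scaleOptimized(float* v, int n, float d) {
            const float inv = 1.0f / d;
            for (int i = 0; i < n; ++i)
                v[i] = v[i] * inv;
        }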

    Cheating occurs when people start making approximations (analogies to bringing a cheat-sheet to a test are not valid), or when they fail to render (in the case of video cards) the same visual fidelity and detail that was intended. For example:

    A> Reducing texture bit-depth.
    B> Reducing geometry detail (merging 2 or more polygons).

    This is only cheating if it's not the intent of the original application developer (not driver developer).

    A driver developer could make the following optimizations, since they don't affect the intent of the application developer:

    A> Pre-calc tables. A classic demo optimization would be to precalc a SIN function table to some level of precision, since looking up a value was faster than calculating it on the fly (a sketch follows this list).
    B> Replacing various pixel/vertex shader routines with functionally equivalent, but faster ones.
    C> Reordering data and textures (keeping detail and fidelity) into more optimal chunks for your hardware architecture.
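    Here's a minimal sketch of the precalculated sine table from (A), assuming a power-of-two table size and non-negative input angles:

        #include <cmath>

        static const int    TABLE_SIZE = 1024;            // must be a power of two
        static const double TWO_PI     = 6.283185307179586;
        static float sinTable[TABLE_SIZE];

        // Precompute the table once at startup.
        void initSinTable() {
            for (int i = 0; i < TABLE_SIZE; ++i)
                sinTable[i] = (float)std::sin(TWO_PI * i / TABLE_SIZE);
        }

        // Table lookup instead of a sin() call; precision is bounded by
        // TABLE_SIZE -- exactly the speed-for-accuracy trade described above.
        inline float fastSin(float radians) {
            int index = (int)(radians * (TABLE_SIZE / TWO_PI));
            return sinTable[index & (TABLE_SIZE - 1)];    // wrap around 2*pi
        }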

    Those aren't cheats - they are optimizations. Of course, the only way you can tell this is if you have an objective standard to gauge against. 3DMark 2003 doesn't seem to provide this. In order to do so they would need the following:

    A> A software renderer for their demo.
    B> Timed snapshots of the demo saving uncompressed images from the software renderer to disk.
    C> The ability to re-run the demo using a hardware renderer (3D Card and drivers).
    D> The ability to take the same snapshots and save them, uncompressed to disk.
    E> The ability to do a histogram, per-pixel comparison against the software renders (a rough sketch follows at the end of this comment)...

    This would enable you to arrive at some objective comparison of visual fidelity, instead of the occasionally subjective "I think screenshot X looks better than screenshot Y." Without the intent of the 3DMark developers being known, we really can't know how true the hardware vendors and their drivers are to the original vision.

    Anything less than 3% difference is highly likely to be indistinguishable from the intent of the developers in this case. 5% to 10% may be visible, but acceptable (i.e. tweaks for speed in place of quality). Over 10% and you're playing with fire.
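    As a rough sketch of step E above (invented image format; the tolerance is a free parameter), the per-pixel comparison could look like:

        #include <cstdint>
        #include <cstdlib>
        #include <vector>

        struct Image {
            int width, height;
            std::vector<uint8_t> rgb;   // 3 bytes per pixel, row-major
        };

        // Fraction of pixels whose largest channel difference exceeds `tolerance`,
        // comparing a software-rendered reference frame against the hardware frame.
        double fractionDiffering(const Image& ref, const Image& hw, int tolerance) {
            long total = (long)ref.width * ref.height;
            long differing = 0;
            for (long p = 0; p < total; ++p) {
                int worst = 0;
                for (int c = 0; c < 3; ++c) {
                    int delta = std::abs((int)ref.rgb[p * 3 + c] - (int)hw.rgb[p * 3 + c]);
                    if (delta > worst) worst = delta;
                }
                if (worst > tolerance) ++differing;
            }
            return total ? (double)differing / (double)total : 0.0;
        }

    A harness could then flag a driver when, say, more than 3% of pixels differ - the "indistinguishable" cutoff suggested above.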

"If it ain't broke, don't fix it." - Bert Lantz

Working...