
Hands On With Nvidia's New GTX 280 Card

CmdrTaco posted more than 6 years ago | from the that-time-of-year-again dept.


notdagreatbrain writes "Maximum PC magazine has early benchmarks on Nvidia's newest GPU architecture — the GTX 200 series. Benchmarks on the smokin' fast processor reveal a graphics card that can finally tame Crysis at 1900x1200. 'The GTX 280 delivered real-world benchmark numbers nearly 50 percent faster than a single GeForce 9800 GTX running on Windows XP, and it was 23 percent faster than that card running on Vista. In fact, it looks as though a single GTX 280 will be comparable to — and in some cases beat — two 9800 GTX cards running in SLI, a fact that explains why Nvidia expects the 9800 GX2 to fade from the scene rather quickly.'"


Yeah but... (2, Funny)

Anonymous Coward | more than 6 years ago | (#23811159)

Can it play Duke Nukem forever?

Re:Yeah but... (2, Insightful)

Anonymous Coward | more than 6 years ago | (#23811181)

Can Linux run it?

Same answer as all cool new hardware: NO!

Re:Yeah but... (0)

the_humeister (922869) | more than 6 years ago | (#23811433)

If it has branch instructions, perhaps it could.

Re:Yeah but... (2, Insightful)

Devin Jeanpierre (1243322) | more than 6 years ago | (#23812519)

Same answer as all cool new hardware: NO!
Easy counter-example would be any new CPU architecture, which is generally adopted by Linux faster than the competition (especially Windows, which is probably what you're comparing Linux to, given the context). AMD64 (and Itanium 2, for that matter) is an example. While Linux can be slow to get support for some things, that's certainly not true for all cool new hardware. What about the PS3? Pandora? Heck, some cool hardware Linux supports would be impossible for a great deal of other OSes, especially Windows (especially if I mean Anything/Linux, and not GNU/Linux).

Don't worry.... (2, Funny)

Totenglocke (1291680) | more than 6 years ago | (#23812125)

They still have 10 more years to develop video cards before Duke Nukem Forever comes out!

This is incredible!! (2, Funny)

Zosden (1303873) | more than 6 years ago | (#23811193)

Before, the only computer that could do this at even medium graphics was this one [computerworld.com]

Anandtech and TechReport reviews (1, Informative)

Anonymous Coward | more than 6 years ago | (#23811277)

http://anandtech.com/video/showdoc.aspx?i=3334 [anandtech.com]

http://techreport.com/articles.x/14934 [techreport.com]

Conclusion: 9800GX2 is faster and cheaper

Re:Anandtech and TechReport reviews (3, Informative)

TheLinuxSRC (683475) | more than 6 years ago | (#23811549)

The 9800GX2 may be cheaper, but it most certainly is not faster, even considering your links. From Anandtech [techreport.com], the charts show a significant speed increase with the new hardware.

In fact, from the article:
The GTX 280 delivered real-world benchmark numbers nearly 50 percent faster than a single GeForce 9800 GTX running on Windows XP, and it was 23-percent faster than that card running on Vista. In fact, it looks as though a single GTX 280 will be comparable to--and in some cases beat--two 9800 GTX cards running in SLI, a fact that explains why Nvidia expects the 9800 GX2 to fade from the scene rather quickly.

Which leads me to the question, are you trolling?

Re:Anandtech and TechReport reviews (0)

Gnavpot (708731) | more than 6 years ago | (#23811779)

From your quote:
"will be comparable to--and in some cases beat--"


Is that what you call "a significant speed increase with the new hardware" (comparing a GTX 280 to a 9800 GX2)?

Re:Anandtech and TechReport reviews (2, Informative)

BronsCon (927697) | more than 6 years ago | (#23812111)

Actually, they're comparing the GTX 280 to TWO 9800s in SLI configuration. RTFS FFS

Re:Anandtech and TechReport reviews (4, Informative)

Kamidari (866694) | more than 6 years ago | (#23812431)

Yes, they're comparing it to two 9800GTXs, which is what a 9800GX2 is: Two 9800GTXs on one board. RTFS indeed. It seems like a case of give and take. The GTX280 is more expensive, but is a single GPU solution, which tends to be more stable. The 9800GX2 is cheaper, runs about as fast, but is a dual GPU unit, so you might have a few more "issues" to deal with in your game playing adventures.

Re:Anandtech and TechReport reviews (1)

blueZ3 (744446) | more than 6 years ago | (#23812119)

I think you missed the last half, which says that a SINGLE 280 is comparable to, and in some cases beats, TWO 9800s

Re:Anandtech and TechReport reviews (1)

mordenkhai (1167617) | more than 6 years ago | (#23812151)

The chart you linked to is 3DMark, in which the new GTX 280 does in fact top the list. However, if you slide a few pages to the games section it's a very different picture. http://techreport.com/articles.x/14934/10 [techreport.com] Call of Duty is topped by the 9800GX2.
http://techreport.com/articles.x/14934/11 [techreport.com] Half Life 2 is a lot closer, and let's face it, both cards should be rolling through HL2 without any issues; however, the GX2 is again on top, by a small to tiny margin.
http://techreport.com/articles.x/14934/12 [techreport.com] ET:QW is finally a real split: the GX2 tops out at 1920x1200 and below, but the GTX 280 takes the crown at the high-end 2560x1600.
http://techreport.com/articles.x/14934/13 [techreport.com] This chart shows Crysis and Assassin's Creed numbers, both of which the 9800GX2 tops on average by a 10-20% margin.
So it isn't as clear-cut as you make it sound at all. Everyone has their own opinion on which is more important; for me, I'll take game performance over number crunching, though I do use 3DMark as well.

Re:Anandtech and TechReport reviews (1)

rgviza (1303161) | more than 6 years ago | (#23812629)

Right, that's why at this link:
(Posted 05/1/08 at 04:20:57PM | by Michael Brown)

http://www.maximumpc.com/article/gigabyte_geforce_9800_x2 [maximumpc.com]

They show Crysis getting 41.4FPS on a Gigabyte GeForce 9800 X2

However from the linked article in the op (which is the same freaking magazine):
Crysis GeForce 9800 GTX: 11.7 FPS, 2x Geforce 9800 GTX: 12.8?

WTF? Do they think we were born yesterday? They either lied in the first article or they are lying now.

I smell payola and Maximum PC is not to be trusted.

They are either:
A. Falsely claiming that the 9800 GTX is the fastest current Nvidia processor (what about the GX2?), or
B. Comparing the GTX 280 to a lemon.

This benchmark/comparison is worth almost as much as that Brooklyn Bridge deed I have in my wallet... Wow I just sneezed and it sounded a little like "HBULLSHIT!"

This article is suspect at best. I don't own a 9800GTX (I have a GTS 640 from last year), but if I did, I'd take this article with a grain of salt the size of a standard Rubik's Cube.

-Viz

Re:Anandtech and TechReport reviews (0)

Anonymous Coward | more than 6 years ago | (#23811761)

Conclusion: You can copy and paste! Congratulations, way to not provide any of your own information.

Not fitting the narrative!!! (1, Funny)

Anonymous Coward | more than 6 years ago | (#23811877)

The GTX 280 delivered real-world benchmark numbers nearly 50 percent faster than a single GeForce 9800 GTX running on Windows XP, and it was 23-percent faster than that card running on Vista.


Please: We here at Slashdot are trying to catapult the propaganda that Vista is a failure. Your bringing in "facts" and "benchmarks" isn't helping.

Will nobody think of the FUD?

Re:Not fitting the narrative!!! (1)

clampolo (1159617) | more than 6 years ago | (#23812457)

It's not saying anything about Vista being good or bad. It is saying that the GTX280 on Vista will run 23% faster than a GeForce 9800 GTX running on Vista.

Re:Anandtech and TechReport reviews (1)

MrDiablerie (533142) | more than 6 years ago | (#23812687)

Read the articles over again. GTX 280 blows away the 9800 series in performance.

Power vs Intel (4, Interesting)

SolidAltar (1268608) | more than 6 years ago | (#23811299)

Let me say I do not know anything about chip design but I have a question -

How is Nvidia able to year after year make these amazing advances in power while Intel makes (although great) only modest advances?

As I said I do not know anything about chip design so please correct me on any points.

Re:Power vs Intel (5, Informative)

the_humeister (922869) | more than 6 years ago | (#23811363)

Because graphics operations are embarrassingly parallel, whereas regular programs aren't.

Re:Power vs Intel (1, Insightful)

Deltaspectre (796409) | more than 6 years ago | (#23811403)

It's not exactly raw power. I seem to remember an article explaining the differences between the 7x00 gen and the 8x00 gen, where they fundamentally changed how the graphics card dealt with "stuff"; can't remember what magazine it was in though.

I don't think Intel has quite that flexibility because of their commitment to backwards compatibility, while nVidia can just push out new drivers.

Re:Power vs Intel (1)

gnick (1211984) | more than 6 years ago | (#23811411)

How is Nvidia able to year after year make these amazing advances in power while Intel makes (although great) only modest advances?
There is more room for improvement in the graphics card/GPU arena than in the CPU arena. Since the market is so much larger surrounding CPUs, more research has been done and the chips are closer to "perfectly" using available technology and continually expanding the realm of what technology is available.

And I'll echo the_humeister's statement that graphics operations are much more easily done in parallel than generic computing. You can throw processors/cores at the problem pretty easily and continue to see improvement.

Re:Power vs Intel (2, Informative)

corsec67 (627446) | more than 6 years ago | (#23811443)

I think one huge thing is that graphics is a hugely parallelizable task. The operations aren't very complex, so they can just keep cramming more and more processing units onto the chip.

Intel and AMD are having issues getting over 4 cores per die right now, while this card "... packs 240 tiny processing cores into this space, plus 32 raster-operation processors".
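
To make the "embarrassingly parallel" point concrete, here's a minimal sketch in Python (purely illustrative; a real GPU pipeline obviously doesn't run through a multiprocessing pool): each output pixel depends only on its own inputs, so you can hand the frame to as many workers, or shader cores, as you have.

```python
from multiprocessing import Pool

def shade(pixel):
    # Toy "pixel shader": the colour of each pixel depends only on its own
    # coordinates, never on its neighbours, so pixels can be computed in any
    # order, on any number of workers, with no coordination between them.
    x, y = pixel
    return (x * 31 + y * 17) % 256

if __name__ == "__main__":
    width, height = 320, 200  # kept small so the toy example runs quickly
    pixels = [(x, y) for y in range(height) for x in range(width)]
    with Pool() as pool:                 # a CPU offers a handful of workers;
        frame = pool.map(shade, pixels)  # a GT200 has 240 such lanes in silicon
    print(len(frame), "pixels shaded independently")
```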

Re:Power vs Intel (4, Interesting)

cliffski (65094) | more than 6 years ago | (#23812699)

As a game dev, and from what I see, I'm assuming it's a stability thing.
Intel's chips have to WORK, and I mean WORK ALL THE TIME. Getting a single calculation wrong is mega, mega hell. Remember the Pentium calculation bug?
People will calculate invoices and bank statements with that intel chip. It might control airplanes or god knows what. It needs to be foolproof and highly reliable.

Graphics chips draw pretty pictures on the screen.

It's a different ballgame. As a game dev, my 100% priority for any new chips is that they ship with stable, tested drivers that are backwards compatible, not just great with DirectX 10 and 11.
If someone wrote code that adhered correctly to the DirectX spec on version 5 or even 2, the new cards should render that code faithfully. Generally, they don't, and we have to explain to gamers why their spangly new video card is actually part of the problem in some situations :(

Power Consumption (4, Interesting)

squoozer (730327) | more than 6 years ago | (#23811305)

Something that has always concerned me (more as I play games less often now) is how much power these cards draw when they aren't pumping out a zillion triangles a second playing DNF.

Most of the time (90%+ probably) I'm just doing very simple desktop-type things. While it's obvious from the heat output that these cards aren't running flat out when redrawing a desktop, surely they must be using significantly more power than a simple graphics card that could perform the same role. Does anyone have any figures showing how much power is being wasted?

Perhaps we should have two graphics cards in the system now: one that just does desktop-type things and one for when real power is required. I would have thought it would be fairly simple to design a motherboard such that it had an internal-only slot to accept the latest and greatest 3D accelerator card that supplemented an onboard dumb-as-a-brick graphics card.

Re:Power Consumption (1)

SolidAltar (1268608) | more than 6 years ago | (#23811359)

From the article:

"However, Nvidia claims idle power draw for the GT200 of only 25W, down from 64W in the G80."

Re:Power Consumption (4, Informative)

SolidAltar (1268608) | more than 6 years ago | (#23811423)

More detail (sorry):

You may be wondering, with a chip this large, about power consumption, as in: Will the lights flicker when I fire up Call of Duty 4? The chip's max thermal design power, or TDP, is 236W, which is considerable. However, Nvidia claims idle power draw for the GT200 of only 25W, down from 64W in the G80. They even say GT200's idle power draw is similar to AMD's righteously frugal RV670 GPU. We shall see about that, but how did they accomplish such a thing? GeForce GPUs have many clock domains, as evidenced by the fact that the GPU core and shader clock speeds diverge. Tamasi said Nvidia implemented dynamic power and frequency scaling throughout the chip, with multiple units able to scale independently. He characterized G80 as an "on or off" affair, whereas GT200's power use scales more linearly with demand. Even in a 3D game or application, he hinted, the GT200 might use much less power than its TDP maximum. Much like a CPU, GT200 has multiple power states with algorithmic determination of the proper state, and those P-states include a new, presumably relatively low-power state for video decoding and playback. Also, GT200-based cards will be compatible with Nvidia's HybridPower scheme, so they can be deactivated entirely in favor of a chipset-based GPU when they're not needed.
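
The behaviour described in that quote boils down to picking a clock/power state algorithmically from the current workload. Here's a rough Python sketch of that idea; the state names, the video-state wattage, and the thresholds are invented for illustration (only the 25W idle and 236W TDP figures come from the quote), so don't read it as Nvidia's actual driver logic.

```python
# Hypothetical P-state table; only the 25W idle and 236W TDP numbers come from
# the article, everything else (names, clocks, thresholds) is made up.
P_STATES = {
    "idle":  {"core_mhz": 300, "approx_watts": 25},    # 2D desktop
    "video": {"core_mhz": 400, "approx_watts": 45},    # Blu-ray/DVD playback
    "3d":    {"core_mhz": 600, "approx_watts": 236},   # full 3D, up to TDP
}

def pick_p_state(gpu_utilization: float, decoding_video: bool) -> str:
    """Algorithmically choose a power state from the current workload."""
    if decoding_video and gpu_utilization < 0.5:
        return "video"
    if gpu_utilization < 0.1:
        return "idle"
    return "3d"

for load, video in [(0.02, False), (0.20, True), (0.95, False)]:
    state = pick_p_state(load, video)
    print(f"load={load:.0%}, video={video} -> {state} (~{P_STATES[state]['approx_watts']}W)")
```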

Re:Power Consumption (4, Informative)

CastrTroy (595695) | more than 6 years ago | (#23811493)

This is how graphics cards used to work. You would plug a VGA cable from your standard 2D graphics card to your, for example, Voodoo II card, and the Voodoo II card would go out to the monitor. You could just have the 3D card working in passthrough mode when not doing 3D stuff. Something like this could work on a single board though. There's no reason you couldn't power down entire sections of the graphics card that you aren't using. Most video cards support changing the clock speed on the card. I'm wondering if this is a problem at all, with any real effects, or whether it's just speculation based on the poster assuming what might happen. Anybody have any real numbers for wattage drained based on idle/full workload for these large cards?

Re:Power Consumption (1)

willy_me (212994) | more than 6 years ago | (#23812373)

But all modern operating systems support 3D accelerated displays. MacOS has their Quartz Extreme, Windows has Aero (I think), and even Linux has one (although the name escapes me.) Your solution would have worked well 5 years ago but times have changed. The 3D component is now always on.

Re:Power Consumption (2, Informative)

CastrTroy (595695) | more than 6 years ago | (#23812479)

However, not all that power is needed for running the 3D desktops. I can run Compiz (linux 3D desktop) on my Intel GMA 950 without a single slowdown. With all the standard 3D eyecandy turned on. So you wouldn't need to run an nVidia 8800 at full clock speed to render the desktop effects. Also, Windows Vista and Linux both support turning off the 3D effects and running in full 2D mode. I'm sure Mac OS supports the same, although I've never looked into it, so it's hard to say for sure. Especially since MacOS has such a limited number of computers that it is supported on.

Re:Power Consumption (1)

Barny (103770) | more than 6 years ago | (#23811561)

This is why NV came up with their new trick: build an integrated video adapter into all boards, let the high-end cards use the PCI-E 2.0 bus to move the framebuffer over to it when playing games, and then, when just doing normal Windows tasks, use the SM bus to turn off these electric heaters.

Works in Vista only, though, and of course that OS is still showing signs of flop in the games area, despite DX10 and SP1.

Re:Power Consumption (1)

Xelios (822510) | more than 6 years ago | (#23811585)

ATI's last generation of cards had a feature called PowerPlay, which gears the card down when it's not being heavily used. The 4800 series will have the same feature, and judging from TFA, Nvidia's doing something similar with the GT200.

Re:Power Consumption (1)

wagnerrp (1305589) | more than 6 years ago | (#23812035)

My old 6800GS already had this feature. It idles at 350MHz on the desktop, but ramps up to 485MHz when you open a 3D application.

RTFA (2, Informative)

ruiner13 (527499) | more than 6 years ago | (#23811703)

If you'd RTFA, you'd note this part:

Power Considerations

Nvidia has made great strides in reducing its GPUs' power consumption, and the GeForce 200 series promises to be no exception. In addition to supporting Hybrid Power (a feature that can shut down a relatively power-thirsty add-in GPU when a more economical integrated GPU can handle the workload instead), these new chips will have performance modes optimized for times when Vista is idle or the host PC is running a 2D application, when the user is watching a movie on Blu-ray or DVD, and when full 3D performance is called for. Nvidia promises the GeForce device driver will switch between these modes based on GPU utilization in a fashion that's entirely transparent to the user.
So, yes, they hear you, and are making improvements in this area.

Re:Power Consumption (1)

kipman725 (1248126) | more than 6 years ago | (#23812037)

What, like the original Voodoo accelerator boards? (Completely amazing bits of hardware, btw.) It results in problems with programs that only expect one graphics card and don't allow selecting which one to do the acceleration on.

What about Aero graphics? (1)

Joce640k (829181) | more than 6 years ago | (#23812117)

Aero graphics must surely be bad for the environment - it prevents most of the GPU from powering down.

Re:Power Consumption (1)

eharvill (991859) | more than 6 years ago | (#23812137)

Perhaps we should have two graphics cards in the system now: one that just does desktop-type things and one for when real power is required. I would have thought it would be fairly simple to design a motherboard such that it had an internal-only slot to accept the latest and greatest 3D accelerator card that supplemented an onboard dumb-as-a-brick graphics card.

Heck, I wouldn't mind having a "turbo" button, a la the XTs and 286s, to handle what you just described...

Re:Power Consumption (0)

Anonymous Coward | more than 6 years ago | (#23812719)

Nvidia thought of this already, they came out with something called "Hybrid Power" which is supported on newer motherboards. It does exactly what you have stated above. In some cases, it can even use BOTH the on-board chip and the discrete card for high performance. Too bad you didn't patent your idea earlier :)

Re:Power Consumption (1, Funny)

Anonymous Coward | more than 6 years ago | (#23812789)

Sony does this with a lot of their VAIO laptops. But you have to install 17 rootkits for it to work.

MY Space Heater! (1)

Zymergy (803632) | more than 6 years ago | (#23811309)

I, for one, now plan on purchasing a new space heater for my box (NOTE: the Nvidia GT200 has a TDP of 236W!)... so long as I can FINALLY have Crysis playable at resolution!

-Another good article on the GTX280 (GT200 GPU) at TR: http://www.techreport.com/articles.x/14934 [techreport.com]

Re:MY Space Heater! (2, Funny)

Firehed (942385) | more than 6 years ago | (#23812271)

236W? I doubt you'll have any problems kicking ass on the frozen levels.

did i read the same review? (1, Interesting)

Anonymous Coward | more than 6 years ago | (#23811315)

The article talks about the new card smoking ATI and showing a 50% improvement, yet the benchmark chart at the end of the article shows only a couple of games got 50% fps boosts, and ATI still outperformed it in Crysis with a card that is available today.

9800GX2 != 2x9800GTX (-1)

SamP2 (1097897) | more than 6 years ago | (#23811319)

The 9800GX2 card is made up of two 9600GT cards in quasi-SLI (as in dual everything, but not requiring 2 slots). Individually, each of those is inferior to a single 9800GTX, but the 9800GX2, which puts 2 of those together, (usually) beats a single 9800GTX.

If the GTX280 card outperforms two 9800GTXs, it'll be vastly superior to the GX2.

And about time, may I say. The last major increase in performance was the 8800 series. The 9800s were just marginally better, and while they support DX10, who needs that when games under the wonderful OS Vista run twice as slow as they do on XP?

Re:9800GX2 != 2x9800GTX (1, Informative)

pdusen (1146399) | more than 6 years ago | (#23811377)

and while they support DX10, who needs that when games under the wonderful OS Vista run twice as slow than they do on XP?
That would be a good point, if it were true.

Only you can prevent FUD!

Vista cuts performance... (0, Troll)

corsec67 (627446) | more than 6 years ago | (#23811331)

The GTX 280 delivered real-world benchmark numbers nearly 50 percent faster than a single GeForce 9800 GTX running on Windows XP, and it was 23-percent faster than that card running on Vista.


Why would Vista make the performance gains so much less? I could see XP running say 20% better with both cards, but why does Vista penalize the new card so much?

Digital Restrictions Management strikes again, I guess...

Vista: where do we want you to go today?

Re:Vista cuts performance... (0)

Anonymous Coward | more than 6 years ago | (#23811389)

where do we allow you to go today

Re:Vista cuts performance... (2, Insightful)

pdusen (1146399) | more than 6 years ago | (#23811399)

Actually, it's likely just a less developed Vista driver, like most performance problems people report with Vista (and by report, I mean actually experience and document, not the random anti-Vista FUD it's so popular to spout these days.)

Re:Vista cuts performance... (1)

Firehed (942385) | more than 6 years ago | (#23812357)

Vista's been out, what, 18 months? You'd think they'd get on that driver thing, especially as it's nVidia's shitty drivers that are causing so much trouble with Vista in the first place. I'm genuinely surprised that they haven't been sued by Microsoft at this point, seeing how much of the Vista-hate is their fault.

Re:Vista cuts performance... (5, Interesting)

Colonel Korn (1258968) | more than 6 years ago | (#23811463)


Why would Vista make the performance gains so much less? I could see XP running say 20% better with both cards, but why does Vista penalize the new card so much?

Digital Restrictions Management strikes again, I guess...

Vista: where do we want you to go today?
TFA has some very weird numbers compared to Anandtech and Tomshardware and all the other real review sites that actually tell you all the details of their testing. The 280 looks more like it's 50-75% faster than the 9800GTX in most reviews, and most of those are done in Vista. Framerate in XP vs. Vista is completely even on a 9800 GTX with current drivers (the Vista slowdown went away a long time ago), except on Oblivion where Vista is about 20% faster for no apparent reason, but maybe the drivers Maximum PC used weren't the same as those used by the serious review sites, or maybe they have something wrong with their Vista install.

DirectX 10 is the reason (4, Informative)

DrYak (748999) | more than 6 years ago | (#23811599)

I could see XP running say 20% better with both cards, but why does Vista penalize the new card so much?
Crysis is a DirectX 10 game.
When run under Vista, it features tons of additional effects. That's why the speed improvements in Crysis aren't as impressive under Vista.

PS: And for the record, the Radeon HD3870X2 uses the exact same GDDR3, not GDDR4 as TFA's review says. ATI chose to go for GDDR3 to cut the costs of the dual-GPU setup. (Only a few non-standard boards by 3rd-party manufacturers use GDDR4 and a PCI Express 2.0 bridge.)

Re:DirectX 10 is the reason (1)

rgviza (1303161) | more than 6 years ago | (#23812275)

Crysis is a DirectX 10 game. When run under Vista, it features tons of additional effects. That's why the speed improvements in Crysis aren't as impressive under Vista.

QFT. There's a lot of additional rendering happening with DX10 vs. DX9, even using the same game, if you have the DX10 features enabled. A more interesting comparison could have been done using a game which supports DX9 or DX10 on Vista, and running it in DX9 mode to compare to XP.

-Viz

Re:DirectX 10 is the reason (4, Informative)

Anonymous Coward | more than 6 years ago | (#23812461)

When run under Vista, it features tons of additional effects.
Stupidly enough, a 20% loss in framerate is all you get by running the game in DX10 mode. All of the 'DX10' effects can be enabled in DX9 (on XP as well) with simple modifications to the game's configuration files. When you do it, it looks exactly the same as DX10 without the deleterious effect on framerate.

I suppose if the reviewers were stupid, they may not have run the game in DX9 mode on both XP and Vista, which would account for the difference even if the graphical options were set to the same levels. But running the game in DX10 mode doesn't make it look any better; the only difference is that it's slower.

For those who aren't familiar with this: in XP/DX9 mode the highest-level graphical options are grayed out. This is an entirely artificial limitation; the configuration changes I mentioned simply replace graphical options with higher ones (they're just integers in a plaintext file), so for example 'High' (DX9) becomes 'Very High' (the 'DX10' effects) in practice.
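
The edit being described is just bumping integer quality levels in a plaintext config file. Here's a hedged sketch of that kind of tweak in Python; the file contents and key names below are placeholders, not Crysis's real settings, so treat it as an illustration of the idea rather than a working mod.

```python
import re

def bump_quality_levels(cfg_text: str, new_level: int = 4) -> str:
    # Raise every "<Something>Quality = N" style entry to new_level.
    # The key names are hypothetical; this only illustrates the kind of
    # plaintext integer edit the parent post describes.
    return re.sub(r"(?m)^(\s*\w*Quality\s*=\s*)\d+", rf"\g<1>{new_level}", cfg_text)

example_cfg = """\
ObjectQuality = 3
ShadowQuality = 3
WaterQuality = 3
"""
print(bump_quality_levels(example_cfg))  # every level 3 ("High") becomes 4 ("Very High")
```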

Re:Vista cuts performance... (2, Interesting)

Z34107 (925136) | more than 6 years ago | (#23811775)

There is no "Vista DRM." That little copy protection stuff lets you play Blu-Ray discs.

You can rip CDs, DVDs, and pirate t3h internetz if you want. I do so on a daily basis on my Vista x64 machine.

Now, if OS support for DRM bothers you, take it up with the studios that require it. Not playing DVDs is not an option.

Re:Vista cuts performance... (1)

cduffy (652) | more than 6 years ago | (#23812161)

Smaller companies can get away with saying "the studios require it". Microsoft can't.

Re:Vista cuts performance... (1)

drsmithy (35869) | more than 6 years ago | (#23812305)

Smaller companies can get away with saying "the studios require it". Microsoft can't.

Why? Microsoft are insignificant players in the content creation and delivery marketplace.

Re:Vista cuts performance... (1)

bigstrat2003 (1058574) | more than 6 years ago | (#23812675)

Exactly! People refuse to get this through their heads for some reason. Microsoft has a lot of pull in some circles, but that doesn't include Hollywood. When it comes to content delivery, the studios don't need Microsoft, Microsoft needs them. It's no different in this case than it is for smaller companies.

Re:Vista cuts performance... (0, Offtopic)

XnavxeMiyyep (782119) | more than 6 years ago | (#23812685)

Offtopic, but

You could change "One day I'm going to get" to "One day I'll get" and you'd have room to spell out "last" in your sig.

GX2 Cheaper and Faster (4, Informative)

Colonel Korn (1258968) | more than 6 years ago | (#23811361)

In most reviews, the 9800GX2 is faster, and it's also $200 cheaper. As a multi-GPU card it has some problems with scaling, and micro-stutter makes it very jumpy, like all existing SLI setups.

I'm not well versed in the cause of micro-stutter, but the result is that frames aren't spaced evenly from each other. In a 30 fps situation, a single card will give you a frame at 0 ms, 33 ms, 67 ms, 100 ms, etc. Add a new SLI card and let's say you have 100% scaling, which is overly optimistic. Frames now render at 0 ms, 8 ms, 33 ms, 41 ms, 67 ms, 75 ms, 100 ms, and 108 ms. You get twice the frames per second, but they're not evenly spaced. In this case, which uses realistic numbers, you're getting 60 fps, but you might say that the output looks about the same as 40 fps, since the delay between every other frame is 25 ms.

It would probably look a bit better than 40 fps, since between each 25 ms delay you get an 8 ms delay, but beyond the reduced effective fps there are other complications as well. For instance, the jitter is very distracting to some people. Also, most LCD monitors, even those rated at 2-5 ms response times, will have issues showing the 33 ms frame completely free of ghosting from the 8 ms frame before the 41 ms frame shows up.

Most people only look at fps, though, which makes the 9800 GX2 a very attractive choice. Because I'm aware of micro-stutter, I won't buy a multi-GPU card or SLI setup unless it's more than 50% faster than a single-GPU card, and that's still ignoring price. That said, I'm sort of surprised to find myself now looking mostly to AMD's 4870 release next week instead of going to Newegg for a GTX280, since the 280 results, while not bad, weren't quite what I was hoping for in a $650 card.
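
To put numbers on the frame-time example above, here's a quick throwaway calculation in Python (not a benchmarking tool, just the arithmetic from the parent post):

```python
# Frame timestamps (ms) from the example: an SLI pair in alternate-frame
# rendering delivers frames in clumps of two instead of evenly spaced.
frames = [0, 8, 33, 41, 67, 75, 100, 108]
gaps = [b - a for a, b in zip(frames, frames[1:])]   # [8, 25, 8, 25, 8, 25, 8]

nominal_fps = 2 * 1000 / 33        # two frames every 33 ms: the "60 fps" on the counter
perceived_fps = 1000 / max(gaps)   # your eye waits out the worst gap: 40 fps

print("gaps between frames (ms):", gaps)
print(f"nominal ~{nominal_fps:.1f} fps, but effectively ~{perceived_fps:.0f} fps")
```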

Re:GX2 Cheaper and Faster (1)

The Living Fractal (162153) | more than 6 years ago | (#23811787)

You don't have SLI... I do, and micro stutter is barely noticeable at worst. And I can play at resolutions and anti-aliasing that no single card could've made playable.

Short answer: it's fine. If you have the money, and want to play at extreme resolutions, get SLI.

It almost seems like micro stutter is some kind of viral ATI anti-marketing bs.

Re:GX2 Cheaper and Faster (4, Interesting)

Colonel Korn (1258968) | more than 6 years ago | (#23812243)

You don't have SLI... I do, and micro stutter is barely noticeable at worst. And I can play at resolutions and anti-aliasing that no sigle card could've made playable.
I've had three SLI setups (an ancient 3dfx X2 and two nVidia pairs). I liked my first SLI rig, but I wasn't too satisfied with the feel of the last two compared to a single card, and now that I've learned about this issue I know why. Lots of people say that microstutter is barely noticeable, but lots of people also insist that a $300 HDMI cable gives "crisper" video over a 6 foot connection than a $10 HDMI cable.

The micro-stutter effect that you can barely notice is the inconsistency in frame delay, which I mentioned ("For instance, the jitter is very distracting to some people."), but beyond that, there's the problem I described in the bulk of my comment. It's not just a question of whether you can tell that frames are coming in in clumps. It's a question of whether you can tell the difference between 60 fps delays, which is what you paid for, and 40 fps delays, which is what you get. SLI definitely improves performance, and for those of us who don't mind the jitter (I never did, actually), it is an upgrade over a single card, but even with 100% scaling of fps, the benefit is more like a 33% increase in effective fps.

It almost seems like micro stutter is some kind of viral ATI anti-marketing bs.
Definitely not, and definitely not BS, but speaking of ATI, rumor has it that the 4870x2 may adapt the delay on the second frame based on the framerate, eliminating this problem. If it's true, then it will be the best dual-GPU card relative to its own generation of single card ever, by a very large margin. But of course, the rumor may just be some kind of viral ATI marketing bs. ;-) I hope it's true.

Thank goodness! (1)

DRAGONWEEZEL (125809) | more than 6 years ago | (#23811365)

I was just about to go buy a new video card! Now I'll hold out!

Noise level (4, Informative)

eebra82 (907996) | more than 6 years ago | (#23811369)

Looks like NVIDIA went back to the vacuum cleaner solution. Blatantly taken from Tom's Hardware:

During Windows startup, the GT200 fan was quiet (running at 516 rpm, or 30% of its maximum rate). Then, once a game was started, it suddenly turned into a washing machine, reaching a noise level that was frankly unbearable - especially the GTX 280.
Frankly, reviews indicate that this card is too f*cking noisy and extremely expensive ($650).

no surprise (2, Informative)

unity100 (970058) | more than 6 years ago | (#23811455)

The 8000-series GTS cards were much louder than their 3870 counterparts too.

I don't get why people fall for that: push a chip to its limits, put a noisy fan on it, and sell it as a high-performance card.

At least with the ATI 3870 you can decide whether you're going to overclock the card and endure the noise or not.

Re:Noise leveb (2, Informative)

clampolo (1159617) | more than 6 years ago | (#23812663)

and extremely expensive ($650)

Not at all surprising. Did you see the size of that chip die? You can fit 6 Penryns on it!! I used to work for a semiconductor company, and the larger the chip, the more expensive it gets. This is because the larger the die is, the less likely it is to be defect-free when it comes out of the fab.
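
The die-size-to-cost relationship described here is usually sketched with a simple yield model: the probability that a die comes out defect-free drops roughly exponentially with its area. A back-of-the-envelope illustration in Python; the defect density and the smaller die area are assumed numbers, not foundry data (the ~576 mm^2 figure is the commonly reported GT200 die size):

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    # Fraction of dies expected to have zero defects under a simple
    # Poisson defect model: yield = exp(-D0 * A).
    return math.exp(-defects_per_mm2 * die_area_mm2)

D0 = 0.004  # assumed defect density (defects per mm^2), purely illustrative

for name, area_mm2 in [("small CPU die (~110 mm^2, assumed)", 110),
                       ("GT200-class die (~576 mm^2)", 576)]:
    print(f"{name}: ~{poisson_yield(area_mm2, D0):.0%} defect-free")
```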

can't... resist... (0, Troll)

tripmine (1160123) | more than 6 years ago | (#23811391)

But does it run Crysis on full? I mean, can it REALLY?

I call bull on those conclusions. (2, Insightful)

JohnnyBigodes (609498) | more than 6 years ago | (#23811427)

and in some cases beat -- two 9800 GTX cards running in SLI, a fact that explains why Nvidia expects the 9800 GX2 to fade from the scene rather quickly.

Bullshit. The 9800GX2 is consistently quite a bit faster (TechReport's very detailed review here [techreport.com]), and it costs around $450, while the GTX 280 costs $650 (with the younger brother, the 260, at $400), with the only drawbacks being more power drawn and higher noise. Even then, I think it's a no-brainer.

Don't get me wrong, these are impressive single-GPU cards, but their price points are TOTALLY wrong. ATI's 4870 and 4850 cards are coming up at $450 and $200 respectively, and I think they'll eat these for lunch, at least in the value angle.

Re:I call bull on those conclusions. (1)

Xelios (822510) | more than 6 years ago | (#23811681)

The 4870 will be $350 [fudzilla.com], not $450. And at that price Nvidia is going to have a hard time convincing me to buy a GTX 280, even if it does turn out to be marginally faster.

Let's see how the reviews of the 4800 series pan out.

Re:I call bull on those conclusions. (1)

CastrTroy (595695) | more than 6 years ago | (#23811801)

ATI's 4870 and 4850 cards are coming up at $450 and $200 respectively, and I think they'll eat these for lunch, at least in the value angle.
People buying $400 video cards aren't looking for value. Around $200, I could see the price being a factor. However, once you've decided to spend $400 on a video card, price isn't even something you are considering.

Re:I call bull on those conclusions. (1)

JohnnyBigodes (609498) | more than 6 years ago | (#23811911)

Not everyone who makes a relatively large investment is "mindless", so to speak. An expensive graphics card is what I call an investment. And even the big ones must pay off somehow, and these new cards don't.

Re:I call bull on those conclusions. (1)

CastrTroy (595695) | more than 6 years ago | (#23812139)

I find it really hard to follow the logic that an object that will be worth 50% of its current value in a year (and in each consecutive year) is an investment. It would be hard to argue that a new production-model car is an investment. If you kept it in the garage, 25 years later it might be worth 3 or 4 times the original price. However, if you just took the original money you spent on the car and invested it for 25 years, you would end up way ahead. With a car, at least you could argue that eventually it will be a collector's item. I don't see a graphics card ever being a collector's item.

Re:I call bull on those conclusions. (1)

JohnnyBigodes (609498) | more than 6 years ago | (#23812235)

I didn't say it was meant to be a collector's item, that would be quite ridiculous :) What I mean is that, for me, a proper investment is to stay on top of what I intend to have in terms of resolution/image quality (1680 @ 4xAA) for a reasonable timeframe without upgrades. My 8800GTX has served me extremely well in this regard, as I can now still sell it for almost half its original value and pay for roughly half or more of a new card, give or take.

Re:I call bull on those conclusions. (1)

TheLink (130905) | more than 6 years ago | (#23812573)

Well if he can play his desired game NOW in its full "maximum quality" glory, and he typically spends USD400 a night on entertainment, then it could actually help him save money.

Basically he spends USD400, plays computer games for a few nights, and actually ends up with more money than he would otherwise (I actually know someone who did save some money in a similar way). In contrast if it were USD4000 for a vid card, the calculation could be different - he could get bored of the various games and go back to spending more.

The other way it could be an investment is if he's a professional gamer. Higher fps is better for many games.

So, practically who bought 9800 (2, Interesting)

unity100 (970058) | more than 6 years ago | (#23811441)

are royally screwed? It was a 'new' card and all.

Well done, Nvidia. Very Microsoft of you.

Re:So, practically who bought 9800 (4, Insightful)

Jellybob (597204) | more than 6 years ago | (#23811573)

AMD and NVidia are always going to release new cards. It's just the way of the industry.

If you buy a graphics card in the hope that it's going to be the top-of-the-line card for longer than a few months, then you're very much mistaken.

Buy a card that will do what you need it to, and then just stick with that until it stops being powerful enough for you. Anyone hoping their computer will be "future proof" is heading towards disappointment very fast.

Re:So, practically who bought 9800 (1)

Richard_at_work (517087) | more than 6 years ago | (#23811613)

So tell me, which idiots believe the world is going to stand still just because they paid out money for something?

Re:So, practically who bought 9800 (1)

rjstanford (69735) | more than 6 years ago | (#23812367)

Everyone who bought an iPhone at launch, apparently. Hell, I did - and didn't feel ripped off when the price dropped later, but apparently I was the only one...

Re:So, practically who bought 9800 (1)

Barny (103770) | more than 6 years ago | (#23811643)

You must be new to the whole "computer part" thing...

Re:So, practically who bought 9800 (1)

CastrTroy (595695) | more than 6 years ago | (#23811889)

Does the 9800 stop working, or in some way slow down, because NVidia came out with something new? Did they say they were going to cut off driver support for the 9800? You got what you paid for, and you still have what you paid for. New stuff will always come along that's cheaper and more powerful than what already exists.

ATI's Response? (2, Interesting)

mandark1967 (630856) | more than 6 years ago | (#23811513)

Anyone know when the new ATI card will be released?

Based on the information I've seen on it, it will be pretty comparable in terms of performance, but at a far cheaper price.

I'm hoping that the new ATI card performs within 10% - 15% or so of the GTX280 because I'm getting a bit tired of the issues I have with my current nVidia 8800GTS cards. (SLI)

I cannot set the fanspeed in a manner that will "stay" after a reboot.

My game of choice actually has some moderate-to-severe issues with nVidia cards and crashes at least a couple of times a week due to some issue with nvcpl, which I have bugged for 10 different versions of drivers and they never fix.

My last ATI Card was their 9700Pro. I switched to nVidia because, while their drivers are closed source, the installation in Linux is far easier and their performance was pretty top-notch. Now I'm considering switching back to ATI if they can deliver a decent competitor.

Re:ATI's Response? (1)

UncleFluffy (164860) | more than 6 years ago | (#23811967)

Looks like it's being released right now. Best wait a week or two until the reviews are out and you can compare the two before wasting your valuable beer tokens.

Inquirer camping outside the NDA session. [theinquirer.net]

Performance crown my butt (-1, Troll)

unity100 (970058) | more than 6 years ago | (#23811541)

Assassin's Creed was running better on ATI cards on DirectX 10.1, due to ATI drivers being more mature.

Nvidia moaned.

Ubisoft removed 10.1 support with a patch. Now Assassin's Creed runs worse on ATI cards.

That's how they get their 'performance'.

In case you are wondering whether I am a fanboi or not: I dislike being a fanboi of anything. But when I see some party pulling bullshit, I side with the other.

Research before you spread FUD. (3, Informative)

ericvids (227598) | more than 6 years ago | (#23811991)

From the Tech Report [techreport.com] :

TR: Does the removal of this "render pass during post-effect" in the DX10.1 have an impact on image quality in the game?

Beauchemin: With DirectX 10.1, we are able to re-use an existing buffer to render the post-effects instead of having to render it again with different attributes. However, with the implementation of the retail version, we found a problem that caused the post-effects to fail to render properly.

TR: What specific factors led to DX10.1 support's removal in patch 1?

Beauchemin: Our DX10.1 implementation was not properly done and we didn't want the users with Vista SP1 and DX10.1-enabled cards to have a bad gaming experience.

Re:Performance crown my butt (1)

Applekid (993327) | more than 6 years ago | (#23812639)

In case you are wondering whether I am a fanboi or not: I dislike being a fanboi of anything. But when I see some party pulling bullshit, I side with the other.
I don't know much about 10.1, but if what you say is true, your beef is probably with Ubisoft for actually doing it, not NVidia for requesting it.

Tame Crysis at 1900x1200? (2, Insightful)

L4t3r4lu5 (1216702) | more than 6 years ago | (#23811587)

I run Crysis, all maxed out, on an 8800GTX, and only get lower than 30 fps in the end battle.

If I want more speed, I'll get another 8800. That card is phenomenal, and about to get a lot cheaper.

Re:Tame Crysis at 1900x1200? (1)

ady1 (873490) | more than 6 years ago | (#23812525)

Exactly what I was thinking. I am able to run Crysis at 1900x1200 with 8800GT SLI at >40 fps (all settings on Very High).

I understand using Crysis as a benchmark, but pretending that there wasn't any setup capable of running Crysis at 1900x1200 is exaggerating.

Great news - not that I want to buy the thing... (1)

Enleth (947766) | more than 6 years ago | (#23811635)

But every time Nvidia releases "THE new, big thing," the prices of the previous and, especially, the second-previous generation cards drop by a significant amount, making them worth the buck for an occasional gamer who doesn't want to spend a fortune to play games and is happy with his games running on the low to medium detail settings.

Re:Great news - not that I want to buy the thing.. (1)

cptnapalm (120276) | more than 6 years ago | (#23812057)

Not to mention that the games that are still played a few years after release will look absolutely fantastic.

Seriously (1, Funny)

Anonymous Coward | more than 6 years ago | (#23811683)

Did you really read the "MAXIMUM PC" article?
Did you read

The 8800 GTX has 24 ROPs and the 9800 GTX has 16, but if the resulting pixels need to be blended as they're written to the frame buffer, those two GPUs require two clock cycles to complete the operation. The 9800 GTX, therefore, is capable of blending only eight pixels per clock cycle. The GTX 280 not only has 32 ROPs but is also capable of blending pixels at full speed--so its 32 ROPs can blend 32 pixels per clock cycle. The GTX 260, which is also capable of full-speed blending, is outfitted with 28 ROPs.
without your eyelids drooping and your head nodding onto your chest?
You did?
You may enjoy this [barbwiremuseum.com] or this [feargod.net] then.
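
For anyone who did stay awake through that quote, the blending figures it gives reduce to ROPs divided by clocks per blended pixel. A tiny Python sketch of that arithmetic (the numbers come straight from the quote; the 8800 GTX result isn't stated there but follows from the same rule):

```python
def blended_pixels_per_clock(rops: int, clocks_per_blend: int) -> float:
    # Pixels that can be blended into the frame buffer each clock cycle.
    return rops / clocks_per_blend

cards = {
    "GeForce 8800 GTX": (24, 2),  # 24 ROPs, blending costs two clocks
    "GeForce 9800 GTX": (16, 2),  # 16 ROPs, two clocks -> 8 pixels/clock
    "GeForce GTX 260":  (28, 1),  # 28 ROPs, full-speed blending
    "GeForce GTX 280":  (32, 1),  # 32 ROPs, full-speed -> 32 pixels/clock
}

for name, (rops, clocks) in cards.items():
    print(f"{name}: {blended_pixels_per_clock(rops, clocks):.0f} blended pixels per clock")
```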

Noise... (2, Funny)

Guanine (883175) | more than 6 years ago | (#23811687)

Yes, but will you be able to hear your games over the roar of the fans on this thing?

The scene has changed. (5, Interesting)

wild_quinine (998562) | more than 6 years ago | (#23811731)

I used to be near the front of the queue for each new line of graphics cards. I used to wait just long enough for the first price drops, and then stump up. Cost me a couple of hundred quid a year, after the sale of whatever my old card was, to stay top of the line. Compared to owning and running a car, which I don't, owning and running a super-rig was positively cheap (in the UK). Some might call it a waste of money, and I have sympathy for that argument, but it was my hobby, and it was worth it, to me.

This year I put my disposable income towards getting in on all three next generation consoles, and the PC will languish for a long time yet.

I don't think I've changed, I think the market has changed.

They're getting bigger and hotter, and no longer feel like cutting edge kit. They feel like an attempt to squeeze more life out of old technology.

DirectX 10 as a selling point is a joke; with the accompanying baggage that is Vista, all it does is slow games down, and none of them look any better for it yet. In any case, there are only five or six of them. You can pick up an 8800GT 512 for less than 150 dollars these days, and it's a powerhouse unless you're gaming in full 1080p. There is no motivation to put one of those power-hungry bricks in my rig. Nothing gets any prettier these days, and FPS is well taken care of at 1680x1050 or below.

Game over, graphics cards.

I wonder what will happen if everyone figures this out? Imagine a world in which the next gen of consoles is no longer subsidised, or driven, by PC enthusiasts...

I have to agree (1)

default luser (529332) | more than 6 years ago | (#23812791)

I've been holding out this generation, holding onto my 7900GT. I like the GT because it delivers solid performance for only 60w, which is half the power ATI's x1900 series was offering at the time. I've also been able to stall because games like Team Fortress 2 and Quakewars ET still look great on my current card.

Only now with the release of the 8800GT and 9600GT is the power consumption/performance ratio getting reasonable (and yes, the ATI 3870 has similar power consumption to the 8800GT, but cannot match it in performance). I'm actually enticed by the 55nm 9800GT (due out in July), which should cut the power consumption of the 8800GT to the same as my 7900GT :)

The only other card I'm considering is the 4850. Only time will tell if it delivers better performance than the 8800GT without breaking my power budget.

 

Does it sound like a jet engine? (2, Funny)

alen (225700) | more than 6 years ago | (#23811749)

How loud is it, and does it need the Hoover Dam to power it up?

The way things are going, you will need 2 power supplies in a PC: one for the video card and one for everything else.

Well, there goes my upgrade plan (4, Insightful)

kiehlster (844523) | more than 6 years ago | (#23811933)

No wonder people say Console killed the PC game star -- "Alright, got my hardware list done. Time to order. Oh, look what just came out, guess I'll wait for prices to drop. Alright, they dropped! No wait, a new processor is out, think I'll wait. Sweet, think I can order now. No, nevermind, Crysis just came out, I'll have to wait until I can afford the current bleeding edge. Awesome, I can afford it now! No, a new GPU just came out that runs the game better. Oh, SATA 600 is coming out. Ah, forget this, I'm buying an Xbox."

The Voice of Reason (1)

Einstein_101 (966708) | more than 6 years ago | (#23812095)

First of all, I would like to remind everyone that benchmarks are subjective. I know this may come as a shock to you, but just because someone types with perfect grammar and has a pretty banner and a nice website layout, that doesn't mean they're immune to bias or hyperbole. Even further, sometimes their intentions are honest, but their results just aren't typical.

A perfect example of this was the ongoing debate about the 9600GT vs 8800GT vs HD 3870. Go through about 5-6 different reviews and you'll realize that there clearly isn't a winner in the fight, yet some websites won't hesitate to call the HD 3870 inferior, or try to pigeonhole the Radeon card as being a "better value" - overlooking the fact that it outperformed the NVidia card in some games, while using less power and running cooler. The fact of the matter is all 3 of the cards perform better than the other 2 in at least 1-2 situations, but some people are just plain reluctant to claim an ATi card is better than its NVidia counterpart (due to it having almost the same performance with lower power consumption).

It's all subjective, people. Benchmarks have never been an exact science anyway.

(ps: You all aren't kids anymore. Google this stuff for yourself and save me some time. I'm at work too you know :oP)

Re:The Voice of Reason (1)

DragonTHC (208439) | more than 6 years ago | (#23812237)

Of course they're subjective.

But there are a great many of us who can translate benchmark scores into real-world performance.

All it takes is a comparison of those scores to your own scores to know what type of performance to expect.

I was considering upgrading to a 9800GX2, but given these scores, I think I will wait. My 8800GTS 640MB is performing fairly well.

Re:The Voice of Reason (1)

Einstein_101 (966708) | more than 6 years ago | (#23812419)

Then I wasn't talking to you.

Granted, there are a great many of you who can translate benchmarks into real-world performance, but you and I both know that your kind is in the minority. Like 10%-of-all-gamers minority. The rest will either live and die by reviews and benchmarks (with no independent thought of their own), or acknowledge that there is more to gameplay and performance than benchmarks - and just leave things at the acknowledgement.

I fall into that second category, and even I am in the minority.

Crysis? (1, Flamebait)

steveaustin1971 (1094329) | more than 6 years ago | (#23812391)

From the look of the benchmarks, neither the SLI 9800s nor the new card tames Crysis at all. You can't make a poorly coded game fluid with more power, I guess.

And it uses more power than your A/C (1)

mario_grgic (515333) | more than 6 years ago | (#23812451)

and it's probably noisier too :D.