
NVIDIA GeForce GTX 690 Benchmarked

samzenpus posted more than 2 years ago | from the running-the-numbers dept.

Graphics 119

MojoKid writes "NVIDIA has lifted the embargo on benchmarks and additional details of their GeForce GTX 690 card today. According to a few folks at NVIDIA, company CEO Jen-Hsun Huang told the team to spare no expense and build the best graphics card they possibly could, using all of the tools at their disposal. As a result, in addition to a pair of NVIDIA GK104 GPUs and 4GB of GDDR5 RAM, the GeForce GTX 690 features laser-etched lighting, a magnesium fan housing, a plated aluminum frame, and a dual vapor chamber cooler with ducted airflow channels and a tuned axial fan. The sum of these design enhancements results in not only NVIDIA's fastest graphics card to date, but also one of its quietest. In the performance benchmarks, NVIDIA's new dual-GPU powerhouse is easily the fastest graphics card money can buy right now, but of course it's also the most expensive." The GeForce GTX 690 has been reviewed in lots of different places today, Tom's Hardware and AnandTech to name a few.


Finally (5, Funny)

busyqth (2566075) | more than 2 years ago | (#39885321)

Finally I can play minecraft the way it was meant to be played!

Re:Finally (3, Insightful)

MrEricSir (398214) | more than 2 years ago | (#39885353)

Not to mention Minesweeper!

This baby is s-l-o-w !! (-1, Troll)

Taco Cowboy (5327) | more than 2 years ago | (#39887081)

According to the chart at http://www.anandtech.com/show/5805/nvidia-geforce-gtx-690-review-ultra-expensive-ultra-rare-ultra-fast/15 [anandtech.com]

AMD Radeon HD 7970 CF wipes the floor with this baby from Nvidia

Re:This baby is s-l-o-w !! (2)

SpinningCone (1278698) | more than 2 years ago | (#39888957)

Looking at all the other charts, for a single card it is indeed the fastest. Interesting that for Skyrim the CrossFire setup places almost at the bottom, even below the non-CrossFire solution.

For the compute section, this is likely an architectural difference, demonstrating the same sort of calculations that make AMD rock at Bitcoin mining too.

Re:This baby is s-l-o-w !! (0)

Anonymous Coward | more than 2 years ago | (#39891133)

And yet it beats the crap out of ATI where it really counts: actual gaming. Good on you linking straight to page 15 of a technical review to find a single graph with which to troll, though. I'm sure you're very proud of yourself.

Re:Finally (0)

Lanteran (1883836) | more than 2 years ago | (#39886449)

You kid, but the shaders mod rapes framerate. Even my 9600 GT can only run it at 10fps with decent settings.

Re:Finally (0)

Anonymous Coward | more than 2 years ago | (#39888059)

I hate to break this to you, but your 9600 GT is a really old, obsolete GPU. Like five generations old. You should expect to get terrible performance on all but the lowest settings for any game.

Slashvertisement (-1, Troll)

Charliemopps (1157495) | more than 2 years ago | (#39885345)

Seriously... who thought this was a good article? The "I have an $800 video card, I'm awesome!" fad died at least 5 years ago...

Re:Slashvertisement (5, Insightful)

poly_pusher (1004145) | more than 2 years ago | (#39885423)

Actually, high performance computing has created more demand. Nvidia GPUs are being used in massive supercomputers using OpenCL and CUDA. (AMD GPUs support OpenCL too.) There are many more people who are interested in the latest and greatest GPU than you may think, specifically on a news-for-nerds site. So yeah, sweat article and thanks for the heads up about the new benches MojoKid.

Re:Slashvertisement (1)

poly_pusher (1004145) | more than 2 years ago | (#39885437)

argh... *Sweet

My bad.

Re:Slashvertisement (0)

Anonymous Coward | more than 2 years ago | (#39885531)

I'm looking forward to seeing a few Blender benchmarks with this little puppy. I mean, wow, a guy could probably run a high-end render farm on a couple of domestic electric circuits, now.

Re:Slashvertisement (4, Insightful)

billcopc (196330) | more than 2 years ago | (#39885781)

Yes, except the GTX 6xx series is slower at CUDA processing than its predecessors. This is a gaming product. Nvidia did this on purpose, sacrificing some compute speed in favour of rendering performance and power efficiency. They also artificially limit double-precision FP speed on consumer boards, to steer professional users toward the Quadro.

As a result of this hobbling, GPU computing hobbyists tend to gravitate toward the Radeon, which has outperformed the GeForce in OpenCL for a few years now, in both performance-per-dollar and performance-per-watt.
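
One quick way to see what a given card actually exposes: the double-precision limiting shows up in the OpenCL extension string. A minimal sketch using the third-party pyopencl bindings (assuming they and a vendor OpenCL runtime are installed; the check covers the standard cl_khr_fp64 plus AMD's older cl_amd_fp64 variant):

    import pyopencl as cl  # third-party bindings; assumes a vendor OpenCL runtime is present

    for platform in cl.get_platforms():
        for device in platform.get_devices():
            if not (device.type & cl.device_type.GPU):
                continue
            exts = device.extensions.split()
            # cl_khr_fp64 is the standard FP64 extension; cl_amd_fp64 is AMD's older variant.
            fp64 = "cl_khr_fp64" in exts or "cl_amd_fp64" in exts
            print("%s: FP64 %s" % (device.name.strip(), "exposed" if fp64 else "not exposed"))

Whether the exposed FP64 runs at full rate or a fraction of it is a separate question the extension string won't answer; that part you have to dig out of the vendor spec sheets.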

Re:Slashvertisement (3, Interesting)

TheRealMindChild (743925) | more than 2 years ago | (#39886223)

You mean Bitcoin mining, don't you? That's the only thing you can consider "faster" with OpenCL while actually using it for a real purpose.

Re:Slashvertisement (1)

Surt (22457) | more than 2 years ago | (#39885987)

But if you're into HPC, this is not the card for you. You want a 685/695, which won't be released until the first quarter of next year.

Re:Slashvertisement (0)

Anonymous Coward | more than 2 years ago | (#39886935)

Incorrect.

If you're into HPC, you want an Intel MIC. Or if you're REALLY into HPC, you already have a few. All the love is currently leaving CUDA for the Knights Ferry/Bridge MIC cards. They kick the hell out of the CUDA and OpenCL cards.
Though they only barely give the Tilera cards a run for their money on a few apps. (Then again, with as few apps out there that can run without floating point, who cares?)

Re:Slashvertisement (0)

Anonymous Coward | more than 2 years ago | (#39888471)

Wake me up when you can actually buy one of those things. For a product that's been five years in the making, it will surely be absolutely awesome. Like DNF was ;>

Re:Slashvertisement (0)

Anonymous Coward | more than 2 years ago | (#39887085)

Have you tried OpenCL on AMD GPUs recently? It does not work. The drivers are buggy as hell.

AMD pays lip service to GPGPU.

Re:Slashvertisement (0)

Anonymous Coward | more than 2 years ago | (#39887769)

I hit exactly *one* actual driver/APP SDK bug in two years. Every other time it was really a bug in my application or compute kernel. And we're testing against APP SDK 2.1 through 2.6, on Catalyst 10.3 up to 12.4, on cards ranging from a 4770 to a 7950.
So unless you provide something a bit more substantial than "waah", I call bullshit.

Re:Slashvertisement (1)

Botia (855350) | more than 2 years ago | (#39888349)

Actually, high performance computing has created more demand. Nvidia GPUs are being used in massive supercomputers using OpenCL and CUDA. (AMD GPUs support OpenCL too.) There are many more people who are interested in the latest and greatest GPU than you may think, specifically on a news-for-nerds site. So yeah, sweat article and thanks for the heads up about the new benches MojoKid.

They did not focus on the general purpose computing circuitry [anandtech.com], so that power and heat could be reduced. The main focus was on gaming. Separate cards will be created specifically for computing.

Re:Slashvertisement (2)

hairyfeet (841228) | more than 2 years ago | (#39888449)

I'm sorry but this is NOT insightful, because NOBODY is gonna use GeForce cards in supercomputers, as their FP performance is just too shitty. If you had been talking about AMD, since that company is pushing its lines toward a single design (they are even switching their GPUs from VLIW to vector to get more GPGPU performance), then yes, you would be correct. But this is Nvidia we are talking about here, who have been consistently separating their GPGPU line (Tesla) and their gamer line (GeForce), and the two lines are NOT identical, nowhere even close.

So I'm sorry friend, but this is NOT for supercomputers; this is for showing you have the largest ePeen on the benchmark boards, or for triple-monitored Skyrim. If you wanted to argue they are testing designs on the gamers, kinda like how RHEL uses Fedora as a testbed? That might be possible, but nobody is gonna buy GeForce cards (whose FP64 rate has gone down, not up, with these latest cards) to slap into supercomputers.

Re:Slashvertisement (2)

whoop (194) | more than 2 years ago | (#39885463)

I got an AMD 6870 over a year ago ($150), and it's played everything I've thrown at it just fine with maxed graphics. Skyrim, Witcher 2, etc. play without any stutter and look wonderful. All on an AMD 965 (3.4 GHz X4) CPU from the year prior.

I'm just trying to figure out what I'm missing by not spending 5x that price.

Probably not much (4, Insightful)

NotSoHeavyD3 (1400425) | more than 2 years ago | (#39885547)

I always harp about this but in a couple of years there will probably be a game that requires that much power. However by that time there will be a $150 card that can run it.

Re:Probably not much (2)

stewartjm (608296) | more than 2 years ago | (#39886695)

I always harp about this but in a couple of years there will probably be a game that requires that much power. However by that time there will be a $150 card that can run it.

That's only true if you're running a 60Hz low-to-mid-res display, say 1920x1200 (~2.3 megapixels) or less. Though even then, the actual retail price of such a card will, most of the time, probably be closer to $250 than $150.

If you want to run 120Hz, or run 2560x(1600|1440)(~3.7-4.1 megapixels), or run 3+ monitors in an eyefinity configuration(~4-24.5 megapixels), then you need all the power you can get. And as the games progress, you'll continue to need to upgrade to the higher end GPUs, at least for the foreseeable future.

The 690 can mostly max out a single 120hz 1080p display in the more demanding games, as well as a single 60hz 2560x(1600|1440) display. And it's no slouch in the 6-7mp eyefinity category.

But if you want to do 120Hz 1080p eyefinity, or 60Hz 12+ MP eyefinity, then you'll need even more power than this monster provides. Again that's only in the more demanding games. There are plenty of console ports that'll do alright in these higher end monitor configurations with a single 7970 or 680.
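
The fill-rate arithmetic behind those megapixel figures is easy to sanity-check yourself. A rough sketch (configs transcribed from above; this is raw pixel throughput only, not a performance model):

    # Pixels per second the GPU has to fill for the configs discussed above.
    configs = [
        ("1920x1200 @ 60Hz",             1920, 1200, 60),
        ("1080p @ 120Hz",                1920, 1080, 120),
        ("2560x1600 @ 60Hz",             2560, 1600, 60),
        ("3x1080p Eyefinity @ 60Hz",     5760, 1080, 60),
        ("3x2560x1600 Eyefinity @ 60Hz", 7680, 1600, 60),
    ]
    for name, w, h, hz in configs:
        mp = w * h / 1e6   # megapixels per frame
        print("%-31s %5.1f MP/frame, %6.0f MP/s" % (name, mp, mp * hz))

Double the refresh rate or triple the panels and the shading work scales linearly, which is why the high-refresh and Eyefinity cases burn through GPU headroom so fast.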

Re:Probably not much (4, Insightful)

StikyPad (445176) | more than 2 years ago | (#39887101)

That's only true if you're running a 60Hz low-to-mid-res display, say 1920x1200 (~2.3 megapixels) or less. Though even then, the actual retail price of such a card will, most of the time, probably be closer to $250 than $150.

If you want to run 120Hz, or run 2560x(1600|1440)(~3.7-4.1 megapixels), or run 3+ monitors in an eyefinity configuration(~4-24.5 megapixels), then you need all the power you can get.

Yes, but the GGP was addressing someone complaining about the cost of the card, not someone who's running a fucking surround-video in their replica Cessna cockpit. Anyone who's dishing out for high end displays isn't going to (justifiably) complain about the price of the card(s) needed to drive them. For everyone else, like the OP, a $150 GPU will play almost any game on their standard 1920x1080 60Hz display with decent performance settings just fine.

Re:Slashvertisement (2)

sandytaru (1158959) | more than 2 years ago | (#39885619)

The only game my Radeon couldn't handle at top graphics was FFXIV. And the only person I've ever met that got XIV to benchmark at max graphics was running on a brand new $2000 rig with SLI. 95% of us won't need a card this powerful, but I bet more than 5% of us sort of want one anyway.

Re:Slashvertisement (0)

Anonymous Coward | more than 2 years ago | (#39885895)

To be fair, FFXIV runs horribly on every system because it's CPU bound. It doesn't use multiple cores efficiently, and the game itself is pretty terrible.

Other than FFXIV, most MMORPGs are designed around 10-year-old hardware, because the more people that can play it, the more people they can fleece. An exception is the new Star Wars MMORPG, which requires more CPU power than typical games, though you can get away with a 5-year-old graphics card.

The pattern often seen in games, at least since the advent of Intel "onboard video", is to make it "work" on the oldest thing that many people have:
http://store.steampowered.com/stats/
http://store.steampowered.com/hwsurvey/videocard/

Only in the last two months have Intel "GPUs" even been on the board, and even then, the 3000 series only brings them up to around 10%. So if onboard video is starting to be "good enough", you'll start seeing dedicated "budget" GPUs disappear. Only a handful of games actually work "well" on these GPUs; the vast majority of games still only work on onboard GPUs with all the settings set to the most crippled mode.

Re:Slashvertisement (1)

sandytaru (1158959) | more than 2 years ago | (#39890783)

I've poked around in XIV a bit lately and the patches in preparation for the "end of the world" and reboot to version 2.0 next fall have actually made the game a lot more bearable and, dare I say, fun than it was when it first came out. My best friend played it much more extensively, and she had a custom built rig from Square Enix she won in a contest a few years ago that choked and sputtered on XIV until we replaced the motherboard, processor, and video card. When even their own house built gaming systems won't run it without extensive upgrades....

Re:Slashvertisement (1)

silas_moeckel (234313) | more than 2 years ago | (#39886457)

It's all about the monitor. Are you running at least 1080p? HDTV has done a horrid thing by effectively keeping displays at or close to HD res; there are very few monitors that will do more than 1920x1080, though some pack in a few more vertical pixels at 1920x1200 with a 16:10 ratio. Anyway, it's rather easy for a few-year-old card to run maxed out if you're only running an average display. 1366x768 was the most predominant resolution per http://www.w3schools.com/browsers/browsers_resolution_higher.asp [w3schools.com] and that's really not a lot of pixels.

Re:Slashvertisement (1)

Nadaka (224565) | more than 2 years ago | (#39886645)

It sucks. Now that we have graphics cards capable of effectively pushing >3 million pixels, we don't have the high end CRT monitors with that kind of resolution any more.

Re:Slashvertisement (2)

Smauler (915644) | more than 2 years ago | (#39888077)

This [ebay.com] is a 2560x1440 monitor for $320. The early ones had higher quality internals and could actually run at 100Hz at that resolution. They're shipped direct from Korea.

People saying they're running on maximum settings without mentioning the pixel count are being disingenuous. The above monitor pushes over 3.5 million pixels; 1366x768 is about 1 million.

Re:Slashvertisement (1)

2.7182 (819680) | more than 2 years ago | (#39885469)

Yeah, but the "I can simulate the 14000-body problem in real time" fad is alive and kicking.

Re:Slashvertisement (0)

Anonymous Coward | more than 2 years ago | (#39888105)

That's funny, because Newegg shows all of their GeForce 680 and 690 stock as being sold out [newegg.com]. It seems that they are selling pretty well for something that nobody wants.

Re:Slashvertisement (1)

Traciatim (1856872) | more than 2 years ago | (#39889825)

More like they only had three of them in the first place so now it looks like they are selling out everywhere when really there is just a huge supply problem.

WTF (5, Interesting)

Billly Gates (198444) | more than 2 years ago | (#39885369)

Tom's Hardware is showing the GTX beating ATI by 50-200% in every benchmark. AnandTech shows the opposite, with ATI still winning in the same games. Anyone else notice this?

Does Tom's Hardware or AnandTech get paybacks from either company for biased remarks?

Re:WTF (3, Interesting)

deweyhewson (1323623) | more than 2 years ago | (#39885495)

I've seen it rumored in more than a few places that Tom's Hardware is very Intel and Nvidia, shall we say, "friendly". Obviously anecdotal evidence is nothing to base a hard opinion on, but the thought does come into my head whenever I see review discrepancies like this pop up.

Part of it depends on what you choose to bench (5, Interesting)

Sycraft-fu (314770) | more than 2 years ago | (#39886081)

I don't care for Anand's benches much because they seem to like synthetic compute benchmarks. That is really all kinds of not useful information for a gaming card. I want to see in-game benchmarks. If any compute stuff is going to be benchmarked, let's have it be an actual program doing something useful (like Sony Vegas, which uses GPUs to accelerate a lot of what it does).

Personally I'm a HardOCP fan when it comes to benchmarks. Not only are they all about game benchmarks, but they are big on actual gameplay benchmarks. As in they go and play the game, they don't run a canned benchmark file. This does mean that it isn't a perfect, "each card sees the precisely equal frames" situation, but it is far more realistic to the task they are actually asked to do, and it all averages out over a play session. I find that their claims match up well with what I experience when I buy a card.

http://hardocp.com/article/2012/05/03/nvidia_geforce_gtx_690_dual_gpu_video_card_review [hardocp.com] is their 690 benchmark. It's a selection of newer games, generally played in triple-head (the game displayed across three monitors at once) on a 690, two 680s SLI'd, and two 7970s CF'd.

Re:Part of it depends on what you choose to bench (0)

Anonymous Coward | more than 2 years ago | (#39886495)

Personally I'm a HardOCP fan when it comes to benchmarks. Not only are they all about game benchmarks, but they are big on actual gameplay benchmarks. As in they go and play the game, they don't run a canned benchmark file. This does mean that it isn't a perfect, "each card sees the precisely equal frames" situation, but it is far more realistic to the task they are actually asked to do, and it all averages out over a play session. I find that their claims match up well with what I experience when I buy a card.

Yeah right. These are the fuckers that got caught with their hand in the cookie jar.

Plus Kyle is an epic douche.

Re:Part of it depends on what you choose to bench (1)

noname444 (1182107) | more than 2 years ago | (#39890083)

I don't care for Anand's benches much because they seem to like synthetic compute benchmarks. That is really all kinds of not useful information for a gaming card. I want to see in-game benchmarks. If any compute stuff is going to be benchmarked, let's have it be an actual program doing something useful (like Sony Vegas, which uses GPUs to accelerate a lot of what it does).

But... they tested with 10 games, 1 raytracer and 0 synthetic benchmarks. I don't know what they usually do, but this article was very focused on real-world performance.

Re:WTF (1)

Brad1138 (590148) | more than 2 years ago | (#39886521)

I was kind of bummed when AMD bought ATI. I have always been an AMD & NVidia fan.

Re:WTF (2)

hairyfeet (841228) | more than 2 years ago | (#39888313)

Why? Once Intel gutted Nvidia's chipset business (I still haven't figured out why they didn't get an antitrust investigation for that), Nvidia's exit from that business was inevitable. And it seems kind of stupid to have to load two sets of drivers, one for the motherboard and another for the GPU, so if one went with AMD (which is what all my family PCs are, and we're quite happy), it makes sense to pair it with an AMD GPU, as one driver update takes care of everything.

As for TFA, if you want to blow a grand on an ePeen? I'm happy for you; everybody has to have a hobby, I suppose. I just don't personally see the point, as all the games now are built to be released on both the consoles (which are positively ancient) and the PC, with the exception of a few tech demos pretending to be games. So with the exceptions of chasing the top score on some leaderboard or a few GPGPU applications, frankly one won't really notice much difference between this and even a two-year-old card in standard games at standard resolutions. Hell, I'm running a seriously old HD4850 and it still doesn't drop below 30FPS in any of the newer games at the native 1600x900 my 22 inch monitor displays, and I like to play shooters.

Maybe when the new consoles come out we'll see a jump, but if the rumors about the PS4 and Wii U are true, frankly the next gen consoles are gonna be about as powerful as last year's midrange PCs, so I'm just not seeing anything on the gaming front to get really jazzed about. One place I will give Nvidia props though is Tegra; they are squeezing some crazy graphics power out of those chips within the teeny tiny battery limits of smartphones, and that IS pretty impressive. This? Might get you on top of the leaderboard for a few months until the next superwank card comes out.

Re:WTF (5, Informative)

Gadget_Guy (627405) | more than 2 years ago | (#39886559)

I've seen it rumored in more than a few places that Tom's Hardware is very Intel and Nvidia, shall we say, "friendly".

That would explain why in their most recent Best Graphics Cards For The Money [tomshardware.com] AMD's cards only won 5 categories compared with Nvidia's massive win in 1 category (plus a tie in another and 3 categories with no winners). Basically, if you ignore all the times that they say good things about AMD, then it is obvious that they favour Intel and Nvidia.

As for the original poster claiming big differences in the rankings, I just don't see it. If you filter out the cards that are not tested on both sites you get the following rankings:

Battlefield 3
Toms: 680GTX-SLI, 690GTX, 7970CF, 6990, 590GTX, 680GTX, 7970, 580GTX
Anan: 680GTX-SLI, 690GTX, 7970CF, 6990, 590GTX, 680GTX, 7970, 580GTX

Skyrim
Toms: 680GTX-SLI, 690GTX, 7970CF, 590GTX, 6990, 680GTX, 7970, 580GTX
Anan: 680GTX-SLI, 690GTX, 590GTX, 680GTX, 7970, 580GTX, 6990, 7970CF

DiRT 3
Toms: 680GTX-SLI, 690GTX, 7970CF, 680GTX, 6990, 590GTX, 7970, 580GTX
Anan: 680GTX-SLI, 690GTX, 7970CF, 590GTX, 680GTX, 6990, 7970, 580GTX

Metro 2033
Toms: 7970CF, 680GTX-SLI, 690GTX, 6990, 590GTX, 7970, 680GTX, 580GTX
Anan: 7970CF, 680GTX-SLI, 690GTX, 6990, 590GTX, 7970, 680GTX, 580GTX

Only Skyrim seems to show any major differences, and that was probably due to driver issues, game versions, or different testing methods.
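
If you'd rather quantify the agreement than eyeball it, a rank correlation over those orderings makes the same point. A quick sketch (orderings transcribed from the lists above; assumes SciPy is available):

    from scipy.stats import kendalltau  # assumes SciPy is installed

    toms = {
        "Battlefield 3": ["680GTX-SLI", "690GTX", "7970CF", "6990", "590GTX", "680GTX", "7970", "580GTX"],
        "Skyrim":        ["680GTX-SLI", "690GTX", "7970CF", "590GTX", "6990", "680GTX", "7970", "580GTX"],
        "DiRT 3":        ["680GTX-SLI", "690GTX", "7970CF", "680GTX", "6990", "590GTX", "7970", "580GTX"],
        "Metro 2033":    ["7970CF", "680GTX-SLI", "690GTX", "6990", "590GTX", "7970", "680GTX", "580GTX"],
    }
    anand = {
        "Battlefield 3": ["680GTX-SLI", "690GTX", "7970CF", "6990", "590GTX", "680GTX", "7970", "580GTX"],
        "Skyrim":        ["680GTX-SLI", "690GTX", "590GTX", "680GTX", "7970", "580GTX", "6990", "7970CF"],
        "DiRT 3":        ["680GTX-SLI", "690GTX", "7970CF", "590GTX", "680GTX", "6990", "7970", "580GTX"],
        "Metro 2033":    ["7970CF", "680GTX-SLI", "690GTX", "6990", "590GTX", "7970", "680GTX", "580GTX"],
    }

    for game, cards in toms.items():
        # Find where each card in Tom's ordering lands in Anand's ordering,
        # then correlate that against the identity ranking 0..7.
        anand_pos = [anand[game].index(c) for c in cards]
        tau, _ = kendalltau(range(len(cards)), anand_pos)
        print("%-13s tau = %+.2f" % (game, tau))

Battlefield 3 and Metro 2033 come out at a perfect +1.00, DiRT 3 lands around +0.86, and Skyrim drops to roughly +0.43: exactly the one outlier the raw lists suggest.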

Re:WTF (1)

symbolset (646467) | more than 2 years ago | (#39886759)

And this is why you're the gadget guy.

Re:WTF (1)

Gadget_Guy (627405) | more than 2 years ago | (#39891601)

And this is why you're the gadget guy.

He he. You made me think back to the days when I first came up with the Gadget moniker. I was probably using a TNT2 Ultra video card then. It was a far cry from the monster cards we are looking at today! Even so, my card had great TV input/output and included LCD 3D glasses. It seems the actual feature set of graphics cards hasn't improved a lot over the years.

Re:WTF (0)

Anonymous Coward | more than 2 years ago | (#39889939)

+1 Informative

Re:WTF (0)

Anonymous Coward | more than 2 years ago | (#39891311)

I wouldn't say they've been bought and paid for by any one specific entity, but they do tend to come up with such completely and obviously false stats on occasion that it is difficult to believe the outlet is completely unbiased. I would lay down good money that they routinely take "donations" from some of the very companies whose products they are benchmarking.

Re:WTF--- compute on linux... (1)

johnjones (14274) | more than 2 years ago | (#39885509)

Well, it depends on what you want it for.

Basically I don't really see much difference on the graphics OpenGL/DX11 side of things, but this was very interesting to me:

http://www.anandtech.com/show/5805/nvidia-geforce-gtx-690-review-ultra-expensive-ultra-rare-ultra-fast/15 [anandtech.com]

regards

John Jones

Re:WTF (1)

Anonymous Coward | more than 2 years ago | (#39885743)

I simply don't trust Tom's Hardware anymore. I bought an A7V mobo (1st gen Athlon) many years ago based on TH's glowing reviews. The board was buggy as all hell, and it came out months later that Asus had paid for that review, either directly or indirectly through advertising. I now take a fairly sceptical view of all reviews I read based on that experience.

Re:WTF (0)

Anonymous Coward | more than 2 years ago | (#39886109)

Oh god, the A7V... the VIA 686 southbridge with the data corruption bug, and the VRMs had a tendency to blow up FNAR.

Re:WTF (1)

dbIII (701233) | more than 2 years ago | (#39886401)

I think I've still got a couple of desktop machines around with that board, but it does have an "SE" in the name. It's funny: give people a bigger disk, more memory and a second monitor, and they are happy with an old PC for years.

Re:WTF (1)

nickersonm (1646933) | more than 2 years ago | (#39886187)

They seem to have merely omitted the games which favor AMD more strongly. Compare, for example, the Metro 2033 [anandtech.com] benchmarks [tomshardware.com] (or BF3, or Skyrim) and you can see that they are relatively similar. THG did not test Crysis or Total War: Shogun 2, which the AMD cards perform better on.

Re:WTF (1)

Anne_Nonymous (313852) | more than 2 years ago | (#39886343)

Tom's Whoreware? No, they've never taken a sweetener.

Re:WTF (1)

sexconker (1179573) | more than 2 years ago | (#39886869)

Tom's Hardware is a joke. They're moneyhatted by Nvidia and Intel all the time.

Re:WTF (1)

westyvw (653833) | more than 2 years ago | (#39891569)

I don't know about that, but I do wonder why Tom's Hardware won't let Archive.org's Wayback Machine show their old pages...

Big Fermi is still on the horizon... (3, Interesting)

poly_pusher (1004145) | more than 2 years ago | (#39885381)

The GTX 680 and 690 have turned out to be pretty spectacular. The most impressive aspect is the relatively low power consumption for a high performance card.

I'm still waiting for the GK110-based "Big Fermi" due out in Q3. Considering how well the 680 and 690 have performed, the GK110 will be a monster, probably power hungry but still a monster. Nvidia really hit gold with their latest generation; it is speculated that the current 680 was intended to be the 660 until it outperformed AMD's top offering. Can't wait to get my hands on a 4GB GK110.

Re:Big Fermi is still on the horizon... (-1)

Anonymous Coward | more than 2 years ago | (#39885391)

But still inferior to ATI cards.

Re:Big Fermi is still on the horizon... (3, Informative)

poly_pusher (1004145) | more than 2 years ago | (#39885505)

It's not ATI anymore... it's AMD. I'm a longtime fan of their hardware in general. The AMD X2 blew my mind when it came out, and currently their Vision and Fusion products are pretty awesome. Unfortunately, Nvidia has been beating them pretty consistently in the GPU world for the past several generations in all-around performance and stability. AMD always stays pretty close "on a lower R&D budget" most of the time and edges them out for a while, but Nvidia always comes back quickly with a punch. Even the mess that was the GTX 480 delay brought a card that smoked AMD in DX11 technology like tessellation. Nvidia always seems to choose the right place to focus their development.

Re:Big Fermi is still on the horizon... (1)

hairyfeet (841228) | more than 2 years ago | (#39888383)

Citation please? Because I have been using Radeons pretty much exclusively at the shop, and frankly I haven't seen any stability problems since the 9.x drivers. I have to give AMD credit where credit is due: before the purchase it was common knowledge that while ATI hardware kicked some ass, their drivers were often shit, and one would have to wait 6 months or more before the major bugs were worked out. But since the AMD buyout the drivers are actually pretty nice. If there were stability problems you'd have thought I'd have run into them, since I've been selling everything from APUs (Bobcat-based netbooks and STBs, Llano laptops and desktops) to, on the card side, everything from the low rent 4650s and 5450s to the more high-mid 68xx and 78xx (not much call for the ePeen cards in these parts), and frankly their drivers have been just fine. In fact I can only think of a single time I had a problem with an AMD driver, and that was caused by MSFT having a bad .NET patch that didn't take; once I cleaned that mess out it was all roses.

Re:Big Fermi is still on the horizon... (1)

Anonymous Coward | more than 2 years ago | (#39885557)

Nvidia really hit gold with their latest generation,

But only if you do single-precision FP workloads. If you do integer workloads, the 680 can't even beat Nvidia's own 580. Pass.

Re:Big Fermi is still on the horizon... (4, Insightful)

billcopc (196330) | more than 2 years ago | (#39885839)

If you're doing serious GPGPU stuff, you shouldn't be relying on fickle consumer boards in the first place. This is a gaming card marketed to extreme gamers. I've fooled around with CUDA stuff like raytracing and H.264 encoding, mostly as a curiosity, but the reason I bought this quad-SLI setup years ago was for games and real-time 3D rendering. I couldn't care less about FP performance, and neither does Nvidia's target market for this product line.

GPGPU on consumer cards is still a novelty at this point. We're getting close to the tipping point, but for most users, as long as it plays their game and can handle 1080p video, they're content. If and when that balance tips in favour of OpenCL and CUDA, both GPU manufacturers will adjust their performance targets accordingly. Their #1 priority is still 3D gaming for now.

Re:Big Fermi is still on the horizon... (0)

Khyber (864651) | more than 2 years ago | (#39887617)

"This is a gaming card marketed to extreme gamers."

And since games are probably the most resource-intensive fucking thing, you should expect your GAMING CARD to kick major ass at everything else if it has the capability.

This is why nVidia is losing in the general-purpose GPU arena. AMD just keeps trucking along, upgrading EVERYTHING. NVidia? Gimps your shit.

Re:Big Fermi is still on the horizon... (1)

FyRE666 (263011) | more than 2 years ago | (#39891839)

"This is a gaming card marketed to extreme gamers."

"And since games are probably the most resource-intensive fucking thing, you should expect your GAMING CARD to kick major ass at everything else if it has the capability."

Did you not understand what he said?! It's a GAMING CARD; it's designed to kick ass when rendering games. Everything else is secondary. It's not a general purpose card, and it's not marketed as anything other than a high end gaming card. If it happens to kick ass as a more general purpose GPU, then that's just a bonus; it has nothing to do with its gaming performance. Nvidia sell other cards that excel in general purpose GPU work. I really don't think they care whether this card, or any of their gaming cards, performs better at cracking encryption keys than something from ATI. Nor do the vast majority of people who buy it.

Personally I'm getting a pair of 680s instead, as it seems a better buy for slightly better performance, but this card does exactly what it's designed to do: be the fastest graphics accelerator for the gaming market.

Hmmm, and what uses FP32 workloads? (5, Insightful)

Sycraft-fu (314770) | more than 2 years ago | (#39885861)

Oh that's right: Video games. You know, the thing it was made for.

The GTX series are nVidia's gaming cards. They are made for high performance when you wanna play 3D games. They aren't made for compute performance. That is not to say they cannot handle compute stuff, just that it isn't what they are primarily designed for. So the kind of compute stuff they are best at will be more related to what games want.

Their compute products will be the Teslas. They are made for heavy hitting compute performance of all kinds. If you are after purely GPGPU stuff, they are what you want.

nVidia seems to be separating their designs for the two to an extent. Still a common overall design, but concentrating on making the desktop GPUs more efficient, at the expense of high end compute features (like integer and FP64 power), and making the workstation/compute cards good at everything, even if they need beefier power and are louder.

I'm ok with that. I buy a GeForce to play games, not to do high end GPGPU stuff. We buy Teslas at work for that.

Also, there's a shitload of other things out there GPGPU-wise that are FP32, and the 680 really is killer at that. It does a great job accelerating video encoding and the like.

Re:Hmmm, and what uses FP32 workloads? (1)

tyrione (134248) | more than 2 years ago | (#39887165)

If I'm after GPGPU and OpenCL, I choose AMD, not Nvidia, whose OpenCL implementation is weak and who has been kicking and screaming because CUDA isn't what the industry has adopted.

Re:Hmmm, and what uses FP32 workloads? (0)

Anonymous Coward | more than 2 years ago | (#39888549)

Ok, honest question for you. Not trolling, or anything like that. Six years ago I bought a top of the line Dell XPS 410, with an Nvidia GeForce 8600 GTS, Vista (which I will ditch post haste), 4 GB of memory, and a dual core 2.13 GHz Intel processor. I know I have waited too long to upgrade, but I am planning on gutting my box this year and replacing everything: motherboard, chip, graphics card, memory, power supply, everything. Up til now it has been a money issue, since what I had worked fine for what I use it for, mainly Netflix, the occasional game once the price drops, SETI@home, etc. Now I want to make it top of the line again. Got a better job; still not a millionaire, but better off.

My Nvidia card crapped out on me a year after I bought it. It was still under warranty from Dell, so it was replaced free, but now I'm having trouble with the software. Apparently Windows data protection decided that all Nvidia software is using memory incorrectly, and now it won't work. I've allowed it in Data Execution Prevention, but it still crashes every time I open it. I'm stuck using Windows personalization to change my settings, which sucks on dual monitors with different native resolutions (one in my living room and one in my bedroom, again for Netflix). Since I had trouble with drivers for years, and went through two cards already, this has led me to believe that it's an Nvidia problem, and I was considering going with a different brand next time. Are all Nvidia cards as glitchy as mine, or did I just happen to get crap both times? Or is it a Vista problem? For four years after I bought the box I couldn't even upgrade the driver without having to do a restore. Finally gave up and was using the Windows default driver. Finally got a driver upgrade to work; then, after the last update, Data Execution Prevention won't allow it to run.

I admit I'm not as tech savvy as I should be, but I'm a smart dude, just behind the times, and I want to learn so as to avoid the same problems when I rebuild and there is no warranty to save my ass. I've built computers before, just iffy on the software side. Side note: is it worth reusing the Dell case? Or should I just buy a new one? I know they aren't that expensive, but I want to reuse what I have if possible. What use would I have for an old case except to keep the whole PC intact and eliminate the 50 ft video cable run to my bedroom? Hey, that's an idea. May have answered that question myself. Any help would be appreciated, as I want to get this done before GTA5 comes out. I know no one reads AC posts, but I've been lurking for years, just never found a compelling enough reason to create an account.

Re:Hmmm, and what uses FP32 workloads? (0)

Anonymous Coward | more than 2 years ago | (#39889167)

Ebay your old stuff and use it to offset the cost of your new stuff.

Re:Hmmm, and what uses FP32 workloads? (1)

Kelbear (870538) | more than 2 years ago | (#39890385)

The reason for making an account is so that you're notified when you receive a reply. Further, when someone is considering whether or not to respond to you, they'll have some level of confidence that you'll actually see their response. Why go through the trouble of writing something useful to an AC who won't read what you're saying?

As for your issue: just get a new computer. All you need to know is how much money you have; then follow one of these:
$650: http://www.tomshardware.com/reviews/build-gaming-pc-overclock,3159.html [tomshardware.com]

$1200: http://www.tomshardware.com/reviews/build-a-pc-budget-overclock,3160.html [tomshardware.com]

$2650: http://www.tomshardware.com/reviews/core-i7-3930k-overclock-radeon-hd-7970,3158.html [tomshardware.com]

Most of your problems are likely just Vista. Win7 has ironed out most of Vista's problems. Buying a PC is not as complicated as it once was, and is also far, FAR cheaper than it has been in the past. Don't waste your time researching parts for trivial performance gains; just buy items off these guides according to your budget and call it a day.

I think people need to stop being so hyped up (4, Interesting)

Sycraft-fu (314770) | more than 2 years ago | (#39885815)

There is zero actual evidence that there is going to be a "GK110" this year, or that if there is it will be a high end part (bigger numbers in their internal code names don't always mean higher end parts).

I see people all in a lather about the supposed amazin' graphic card that is up and coming, and lots of furious rumors, but nothing in the way of any proof. I also can see some fairly good arguments as to why nVidia would NOT be releasing a higher end card later on (excluding things like Teslas and Quadros, which are higher end in a manner of speaking).

Speaking of Teslas and Quadros, that may be all that it is: a version of the hardware with a redesigned shader setup to give higher FP64 speed. As it stands, the card is quite slow at FP64 calculations compared to FP32. It could be 50% of the speed, in theory, but is more like 1/16th. Basically it seems to be missing the logic necessary to link the 32-bit shaders together to do 64-bit calculations for all but a fraction of the shaders. Maybe to protect their high end market, maybe to keep size and heat down (since it does take additional logic). Whatever the case, a Tesla/Quadro version with that in place would have much improved FP64 speed, and thus compute performance for certain things, but be no increase for gaming at all.

So I think maybe people need to settle down a bit and stop getting so excited about a product that may not even exist or be what they think, and may not launch when they think even if it is. Chill out, see what happens. Don't get this idea that nVidia has something way MOAR BETTAR that is Coming Soon(tm). You don't know that, and may be setting yourself up for a big disappointment.
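
For a feel of what those shader ratios mean in raw numbers, the peak-throughput arithmetic is simple. A rough sketch (shader count and clock are the GTX 680's published base figures; the divisors are just the ones floated in this thread, so treat the output as theoretical peaks, not benchmarks):

    # Theoretical peak FLOPS: shaders x clock x 2 ops/clock (fused multiply-add).
    shaders = 1536        # GTX 680 CUDA cores
    clock_hz = 1.006e9    # published base clock
    fp32_tflops = shaders * clock_hz * 2 / 1e12

    print("FP32 peak: %.2f TFLOPS" % fp32_tflops)
    for divisor in (2, 16, 24):  # "could be 50%", plus the two ratios quoted in this thread
        print("FP64 at 1/%d rate: %.2f TFLOPS" % (divisor, fp32_tflops / divisor))

Whatever the exact divisor on GK104 turns out to be, the gap between roughly 3 TFLOPS single precision and little more than a tenth of that in double precision is the whole argument for a separate compute part.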

Power consumption matters (2)

MrEricSir (398214) | more than 2 years ago | (#39886567)

I've been watching my UPS power load meter since I upgraded from a GTX 560 to a GTX 680. I'd estimate the 680 uses a bit less than half the power of the 560 when idle. At peak usage the 680 uses more, but only by a hair.

I was never happy with the 560 in general. The 3D performance was surprisingly glitchy at 1080p. Even though I wasn't too keen on trying NVIDIA again after that, I gotta admit they won me back with the 680.

Re:Big Fermi is still on the horizon... (0)

Anonymous Coward | more than 2 years ago | (#39887093)

GK110 is next year, 2013. Only the compute version will ship in 2012.

Re:Big Fermi is still on the horizon... (1)

cbope (130292) | more than 2 years ago | (#39887493)

Sorry, but NVIDIA have already hinted strongly that there is no "big Fermi" gaming card coming this year. At least nothing that will eclipse the 690. I'm seriously starting to wonder what happened to big Fermi, if it was all just a rumor or perhaps they are saving it for Quadro/Tesla.

Still, I can't wait for 680 prices to drop a bit so that I can replace my overclocked 570. AMD hardware is pretty decent these days, but every time I touch their drivers... I go back to NVIDIA. Hate to say it, but NVIDIA's drivers are still more stable and bug free, and they don't flash "gamey" looking crap in your face every time you open the driver control panel.

Slashvertisement (0)

Anonymous Coward | more than 2 years ago | (#39885457)

Whatever happened to the Slashvertisement tag?

Unfortunately you won't be able to get one (3, Informative)

edxwelch (600979) | more than 2 years ago | (#39885465)

According to SemiAccurate, there's a mask design flaw in the GK104 which has caused poor yields. Fewer than 10,000 GTX 680s have shipped worldwide, even though it was released a month ago.
http://semiaccurate.com/2012/05/01/why-cant-nvidia-supply-keplergk104gtx680/ [semiaccurate.com]

Re:Unfortunately you won't be able to get one (4, Interesting)

Sycraft-fu (314770) | more than 2 years ago | (#39885975)

I would encourage people to look at the site's name before taking anything they say seriously. And then I'd encourage them to look in the archives (if they keep true and accurate archives of their past stuff; I've never checked) to see all the shit they get wrong (and there is a lot of it). Then maybe you'll understand that, like most rumour sites, you don't want to take it too seriously.

For some overall perspective, consider that Charlie Demerjian, the guy who runs it, was given the boot from The Inquirer, which is not precisely what one would call a bastion of journalistic excellence.

As an example of one major made-up story from them: in February they claimed that the GTX 680 would have a "PhysX block", basically either dedicated hardware to speed up PhysX or special instructions/optimizations for it at the expense of other things. They said that the supposed edge in benchmarks was only because of that, and that the 7970 would outdo it in most games.

That is not at all the case, it turns out. The GTX 680 has nothing particularly special for PhysX, other than a shit-ton of shaders, and it in fact outperforms the 7970 by a bit in nearly all games, including ones without PhysX. HardOCP (http://hardocp.com/article/2012/03/22/nvidia_kepler_gpu_geforce_gtx_680_video_card_review/) has them both tested with real gameplay, as usual.

So really, don't take anything that site says seriously. It is a blatant rumours site that just makes shit up.

Re:Unfortunately you won't be able to get one (1)

edxwelch (600979) | more than 2 years ago | (#39890421)

I do follow that site and most of the stuff is spot on. It's true a few stories are wild speculation, but this story rings true.

who cares (1, Interesting)

epyT-R (613989) | more than 2 years ago | (#39885669)

Who is going to pay $1000 for a piece of hardware with a half-life of maybe one year? This card is really worth about $400 at most, and the 680 should be $200. What games actually take advantage of this? There are hardly any PC games worth playing nowadays :\. It's too bad too, because I LIKE new graphics hardware; it's always fun to play with, but at $1000 I can't justify it.

Re:who cares (0)

Anonymous Coward | more than 2 years ago | (#39885793)

Half-life of one year?

I have a GTX 295 (which was the premium card in 2008/09) and it's still doing great in games.

The top cards last a very long time.

Re:who cares (1)

Anonymous Coward | more than 2 years ago | (#39886655)

I think what he means is that the performance of these cards will be the best, or close to it, for only about a year. In a year you can get a card this good for $150-200 instead of whatever it is right now. Combining that with what you wrote, we get the question: why should anyone buy these cards right now, when the old card still works and you can get these whizz-bang cards for cheap in a year?

Re:who cares (1)

Carlos Laviola (127699) | more than 2 years ago | (#39887871)

There's no way in hell this card will have dropped down to $150-200 in a year. I own a 560 Ti and that's still going for about $250, as a reference.

Re:who cares (0)

Anonymous Coward | more than 2 years ago | (#39888007)

In most games where crossfire actually works a 5970 that was $599 at launch 2.5 years ago (!) easily beats 580, 6970 and 7870 and still holds up pretty well against 7970 and 680... are those $150-200?

Re:who cares (-1)

Anonymous Coward | more than 2 years ago | (#39885843)

Maybe you should get a job then?

Re:who cares (5, Insightful)

billcopc (196330) | more than 2 years ago | (#39885973)

Normally I'd have preordered two of these already, but it's too rich for my blood right now. This card is for us nutjobs who want quad-SLI and panoramic "3D Surround", with our custom-built driving cockpits and 3 large monitors, or the equally obsessive flight sim crowd. In my case, these displays run at 2560x1440 and that requires a ton of memory bandwidth on each card, just to push all those bits around.

For almost everyone else, a single $300 GPU is enough to run just about any game at 1920x1080 with very respectable settings.

As for your suggested prices, you're just talking out of your ass. If you're going to lowball the latest and greatest GPU on the market, maybe you should set games aside for a while and look at your income. Even though I agree the price is a bit high, spending $1000 on a hobby is nothing. You save up for that shit, and it lasts a very long time. My current cards are over 3 years old, so it works out to just over a dollar a day for kickass gaming graphics. Even if I played for just a few hours a week, it's still cheaper than any other form of modern entertainment. Cheaper than renting a movie, cheaper than a single pint at the pub, cheaper than basic cable TV, cheaper than bus fare to get to and from a free goddamned concert. For what I get out of it, having the highest end gaming hardware ends up being a sweet deal.

Re:who cares (0)

Anonymous Coward | more than 2 years ago | (#39887127)

That would be true if all you needed to play games was a video card. You have to buy the games, and other hardware. I'm pretty sure you don't sit in the dark holding your GPU card and imagining the graphics....then again this is slashdot.

Re:who cares (2)

Kjella (173770) | more than 2 years ago | (#39887961)

Meh, if there was a reasonable (no, a $36000 Eizo doesn't count) 4K/QFHD monitor I'd consider it. I don't like triple screen setups with their bezels and odd aspect ratio with stretching and whatnot, I want it all on one screen. IMO the problem is not the price of the graphics card, it's having something useful to show it on. Even at 2560x1440 I'd have to pay more for a single monitor than for a 680 GTX, which is why I'm still on a good 1920x1200 IPS monitor. Of course it helps that I'm not a FPS junkie but I'd easily want Skyrim in 4K.

Re:who cares (1)

Smauler (915644) | more than 2 years ago | (#39888139)

Even at 2560x1440 I'd have to pay more for a single monitor than for a 680 GTX

No you wouldn't. I mentioned this monitor [ebay.com] earlier in the discussion... I'm not trying to sell them, by the way; I was surprised at how cheap they are. Most of the consumer feedback on them has been good, too.

Re:who cares (1)

Cederic (9623) | more than 2 years ago | (#39891575)

Ah, nod. A brand I've never heard of, on a site I don't trust, with no UK suppliers providing it.

Most of the astroturfing has been good though, I agree :)

Given the resolution you can get on the iPad2 it's reasonable to expect 2560x1440 for $400, especially without the free iPad2 thrown in too. It has taken monitor providers several years to really push into that market though.

Even the one you're linking to... 27 inches? Why not 22? Hell, I'll happily use a 15" 1920x1080 screen, so why not a 19" screen at 2560? It's ludicrous that monitor manufacturers can't compete with handheld computers for screen quality.

So What? (-1)

Anonymous Coward | more than 2 years ago | (#39885691)

How long again before it burns out and Nvidia plays stupid?

FP64 (1)

sanosuke001 (640243) | more than 2 years ago | (#39886099)

So FP64 performance went from 1/8th of FP32 performance in the 500 series to 1/24th in the 600 series? I, for one, would love it if using doubles in OpenGL 4.x didn't suck so much. I write visualization software with planet-sized ranges, and having that extra precision on the card would be quite nice.
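
The precision problem is easy to demonstrate: at planet-sized magnitudes, single-precision floats can't even resolve millimetres, which is exactly what pushes visualization code toward doubles (or the usual two-float emulation tricks). A small sketch with NumPy:

    import numpy as np

    earth_radius = np.float32(6371000.0)  # metres, roughly Earth's radius
    one_mm = np.float32(0.001)

    # Near 6.4e6, adjacent float32 values are about 0.5 m apart,
    # so adding a millimetre changes nothing at all.
    print(earth_radius + one_mm == earth_radius)  # True
    print(np.spacing(earth_radius))               # ~0.5

    # float64 carries ~15-16 significant digits and resolves it fine.
    print(np.float64(6371000.0) + 0.001 == np.float64(6371000.0))  # False

That half-metre quantization is what shows up on screen as vertex jitter when a float32 camera flies around a full-size planet, and it's why slow FP64 stings for this kind of work.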

I miss the days (1)

rsilvergun (571051) | more than 2 years ago | (#39886557)

when getting a new graphics card every few years was like a new console launch to me. I get that they're being used by statisticians, but seriously, what do gamers do with these? The only game that comes close to taxing a $150 graphics card is Crysis 2, and even that's not doing much... Maybe Nvidia could put more effort into making these easy to program for, so we'd get better games cheaper? I think we've hit the limit on graphics quality, if only because it's too much work to do the art assets...

Re:I miss the days (3, Insightful)

Anonymous Coward | more than 2 years ago | (#39886657)

Mostly multihead gaming. While a $150 card is plenty at 1080p, at 5400x1920 or 4320x2560 it's a different story.

Enthusiast (0)

Anonymous Coward | more than 2 years ago | (#39886637)

I know some people are very passionate about their hobbies, but even if I were a billionaire I just couldn't justify spending around $1000 for a few more frames per second. It just seems very strange.

Re:Enthusiast (0)

Anonymous Coward | more than 2 years ago | (#39886891)

Surround gaming. Pushing decent framerates at 10MP+ takes a *lot* of GPU.

new space heater? (0)

Anonymous Coward | more than 2 years ago | (#39886883)

no thanks, summer's coming.

Awesome (1)

Lord Lode (1290856) | more than 2 years ago | (#39887643)

One question though: if I can play Skyrim with all settings at max at 1920x1200 with a GTX 560, what is SLI of two GTX 690s needed for?

Re:Awesome (1)

Smauler (915644) | more than 2 years ago | (#39888187)

According to this page [techspot.com], a GTX 560 _averages_ 25fps at 1920x1200. That's not that good.

Re:Awesome (0)

Anonymous Coward | more than 2 years ago | (#39888293)

Check again: the 560 averages 44FPS at max settings at 1920x1200 in that test; you looked at 2560x1600.
But I'd consider that nearly unplayable: a 44FPS average in Skyrim usually means about 10FPS in complex scenes, aka a stuttering slideshow.

Re:Awesome (1)

Lord Lode (1290856) | more than 2 years ago | (#39888583)

Hmm, plus I didn't use FSAA or other types of AA, so maybe it was not *all* settings at max :)

Re:Awesome (1)

Traciatim (1856872) | more than 2 years ago | (#39890075)

Who needs anti-aliasing anyway? Just have a couple of shots of rum before you play and disable all the anti-aliasing. You get all the blurry screen with all the performance.