Nvidia's GF100 Turns Into GeForce GTX 480 and 470

timothy posted more than 4 years ago | from the that-nda-must-have-been-awfully-heavy dept.

Graphics

crazipper writes "After months of talking architecture and functionality, Nvidia is finally going public with the performance of its $500 GeForce GTX 480 and $350 GeForce GTX 470 graphics cards, both derived from the company's first DirectX 11-capable GPU, GF100. Tom's Hardware just posted a comprehensive look at the new cards, including their power requirements and performance attributes. Two GTX 480s in SLI seem to scale impressively well — providing you have $1,000 for graphics, a beefy power supply, and a case with lots of airflow."

132 comments

I'm really not impressed. (0)

Anonymous Coward | more than 4 years ago | (#31636512)

They caught up with ATI, but with a more expensive, hotter and more power-hungry card.

Re:I'm really not impressed. (0)

Anonymous Coward | more than 4 years ago | (#31637536)

Am I missing something? The GTX 480 got owned in all of those benchmarks, sometimes by "lower" model Nvidia GPUs. They didn't catch up to ATI at all.

Re:I'm really not impressed. (4, Informative)

yoyhed (651244) | more than 4 years ago | (#31638544)

Exactly; it's barely on par with the months-old HD 5870, and it gets taken to school by the 5970. I love it when AMD wins.

Re:I'm really not impressed. (2, Interesting)

beleriand (22608) | more than 4 years ago | (#31638722)

The previous-gen NV cards don't do DX11, and thus were running this bench in DX10 mode, which is kind of a misleading thing for THG to do. While they explained it in the text, they should have made separate charts for those two modes to make it clear at first glance. There is a comparison of image quality in the Unigine benchmark (DX10 vs. DX11) out there somewhere, and the difference is night and day.

Re:I'm really not impressed. (0)

Anonymous Coward | more than 4 years ago | (#31639130)

Are the enhancements in DX11 that significant? From what Wikipedia lists as the main features, it doesn't even seem worth it.

Tessellation -- We've already been doing this for years
Multithreaded rendering -- This too
Compute shaders -- CUDA and OpenCL

Re:I'm really not impressed. (2, Informative)

ooshna (1654125) | more than 4 years ago | (#31640044)

I'm not sure, but I've seen pics first in DX10 and then in DX11, and it added some nice visuals: a lot more detail in the textures of things like brick stairs, with some of the bricks misaligned instead of being perfect rectangles, and a lot of detail on the dragon statue. In fact, here's a DX9 vs. DX10 vs. DX11 comparison [overclock.net]

Will it run Linux? (0)

Anonymous Coward | more than 4 years ago | (#31636518)

Seriously, that whole thing about drivers earlier makes me wonder if it's worth it to buy this beef without any way to make it sizzle.

Re:Will it run Linux? (0)

Anonymous Coward | more than 4 years ago | (#31636644)

Will it run Linux?

Of course it will - what else would you run on your cluster? You'll have to install the provided drivers, though.

... if it's worth it to buy this beef without any way to make it sizzle.

I'm sorry, but what?

Re:Will it run Linux? (1)

Skarecrow77 (1714214) | more than 4 years ago | (#31636938)

Annoyingly enough, I've been waiting for these specifically -because- I run both windows and linux, and ati drivers in linux are poop if you want to accelerate anything that's actually on store shelves.

Mostly it's annoying because I happily would have bought 5850s for myself and my wife already for $50 cheaper than the GTX 470s are going to cost for roughly the same performance.

Do I need to upgrade? (1)

Taco Cowboy (5327) | more than 4 years ago | (#31638142)

My PC has an HD4670 1GB installed.

Do I need to upgrade?

If I do, which card should I upgrade to? ATI or Nvidia?

BTW, my PSU is rated 1,500W. Silverstone.

Re:Do I need to upgrade? (1)

Taco Cowboy (5327) | more than 4 years ago | (#31638154)

My PC has an HD4670 1GB installed.

My sincerest apology.

The card is HD 4760 with 1GB of GDDR3 RAM.

Re:Do I need to upgrade? (1)

anss123 (985305) | more than 4 years ago | (#31639184)

With a PSU like that?

GTX480 SLI is your only hope at making good use of it.

Re:Will it run Linux? (0)

Anonymous Coward | more than 4 years ago | (#31639756)

Why buy a beast of a card if the drivers hinder it from achieving full performance?

Poop (-1, Flamebait)

Anonymous Coward | more than 4 years ago | (#31636524)

POOP FIRST!

Loop (0, Flamebait)

eatspoop (1604225) | more than 4 years ago | (#31636538)

Poo poo poo I'm stuck in a poo loop

Re:Loop (0)

Anonymous Coward | more than 4 years ago | (#31638326)

Möbius toilet?

Where time becomes a poop? Becomes a poop? Becomes a poop?

$1000 for graphics (5, Funny)

tpstigers (1075021) | more than 4 years ago | (#31636556)

Come on - is that all? There HAS to be a way I can spend 5 times that to play a video game.

Re:$1000 for graphics (1)

gman003 (1693318) | more than 4 years ago | (#31636764)

Just wait for the dual-PCB cards. Quad-SLI, anybody?

Re:$1000 for graphics (1)

lightrush (1471807) | more than 4 years ago | (#31636916)

Sadly, I think the PCIe power specification limits any PCIe device to 300W, so you can be sure that dual-PCB is a no-no. Can't spend 5 times that. :/

Re:$1000 for graphics (1)

zill (1690130) | more than 4 years ago | (#31639718)

There's always the Nvidia Quadro Plex, where the cards sit in an external 1U case.

Re:$1000 for graphics (1)

Z34107 (925136) | more than 4 years ago | (#31636888)

Come on - is that all? There HAS to be a way I can spend 5 times that to play a video game.

TFA suggested purchasing two of these $500 cards, three $400 120Hz monitors, and a $200 NVIDIA stereoscopic vision kit. That'll let you game in 3D across three 1080p monitors.

So, you can spend $1400 in accessories to match your $1000 cards. And then, you know, buy the rest of the computer. Not quite five times more, but I'm still salivating over getting my hands on such a setup some day...

Re:$1000 for graphics (1)

The Archon V2.0 (782634) | more than 4 years ago | (#31637826)

TFA suggested purchasing two of these $500 cards,

What? Only TWO? Where's my quad SLI? Where's my more-money-than-sense option? Where's my burn-people-to-death-with-my-air-outtake option?

Re:$1000 for graphics (1)

FreonTrip (694097) | more than 4 years ago | (#31638402)

Your dreams of incandescence for wayward travelers in your home or office would have been nicely facilitated by a quad CPU Tejas [wikipedia.org] workstation. You might have needed three-phase power, ear protection, and an asbestos suit, but that's just the price to pay to be a True Hardcore Gamer.

Re:$1000 for graphics (0)

Anonymous Coward | more than 4 years ago | (#31637492)

Would you like a $1130 CPU [newegg.com] with that?

Re:$1000 for graphics (2, Funny)

thebes (663586) | more than 4 years ago | (#31638594)

And they charge $0.99 for shipping

Re:$1000 for graphics (1)

crossmr (957846) | more than 4 years ago | (#31638188)

Sure, I'll sit in front of your machine and stop you from playing it until you pay me $4000.

Re:$1000 for graphics (1)

Kjella (173770) | more than 4 years ago | (#31638374)

Well, you could simply declare that multiple monitors are for losers and get a QuadHD (3840x2160) LCD instead, like, say, this one [westinghousedigital.com]. It's only supposed to set you back $50,000 [engadget.com] or so. A 2160p cinema projector can easily set you back a few hundred thousand if that's not enough. There are always options if you have enough money...

Re:$1000 for graphics (1)

timeOday (582209) | more than 4 years ago | (#31639442)

Wow, a $50K monitor with "an ultra-wide viewing angel" [westinghousedigital.com] [sic].

Westinghouse doesn't fab LCD panels do they?

Re:$1000 for graphics (0)

Anonymous Coward | more than 4 years ago | (#31639110)

In a time of economic hardship and a drive for less power hungry devices, Nvidia presents us with a $1000 grill. And the games you play on it will be just as entertaining as those of ten years ago, just looking slightly better. I guess it's progress of a sort.

Expensive, power hungry? (5, Funny)

Megahard (1053072) | more than 4 years ago | (#31636572)

Sounds like the GF100 turned into the MRS100.

Re:Expensive, power hungry? (1)

theralfinator (1087355) | more than 4 years ago | (#31636588)

Hahahahahaha. Seriously, that's a good one.

Re:Expensive, power hungry? (1)

eric-x (1348097) | more than 4 years ago | (#31638356)

I smiled

Fermi needs a refresh or v2 (5, Informative)

Artem Tashkinov (764309) | more than 4 years ago | (#31636580)

To summarize the Fermi paper launch:
  • Fermi is a damn hot and noisy beast
  • Fermi is more expensive and only slightly faster than the respective ATI Radeon cards, thus DAAMIT will not cut prices for Radeons in the near future
  • Punters will have to wait at least two weeks for general availability
  • Fermi desperately needs a reboot/refresh/whatever to attract the masses

It seems like NVIDIA has fallen into the same trap as with the GeForce 5XXX generation launch.

Re:Fermi needs a refresh or v2 (1)

Dragoniz3r (992309) | more than 4 years ago | (#31636752)

Point #3 is misleading. "Paper launches" are common in the industry; it's not an Nvidia- or Fermi-specific thing.
Point #4 is just bogus. There will be plenty of people who'll buy these chips.
Also, prices are likely to fall on these chips, which will cause Radeon prices to fall as well. And it's not going to take that long.
I suggest everyone go check out HardOCP's GF100 review for a real-world analysis, rather than 4 trollish bullet points.

Re:Fermi needs a refresh or v2 (0, Troll)

WhatAmIDoingHere (742870) | more than 4 years ago | (#31636782)

Point 2 is inaccurate. The 480 is cheaper than the 5970 (by almost $200) and the 480 beats the 5970 in multiple benchmarks.

Re:Fermi needs a refresh or v2 (0)

Anonymous Coward | more than 4 years ago | (#31636828)

What benchmarks did you see? In most of the benchmarks I saw, the 5970 still whipped the 480, with a few minor exceptions. I won't be buying a 480, especially with Nvidia's QA/QC. With all the hype that surrounded the GF100 (from that loser of a CEO Jen-Hsun Huang), it really is a fail. I would have been more impressed if they had come out 6 months ago.

Re:Fermi needs a refresh or v2 (1)

lightrush (1471807) | more than 4 years ago | (#31636842)

You mean 1 benchmark and it is the same one at which HD5870 is about 5-10% slower than GTX480?

Re:Fermi needs a refresh or v2 (2, Informative)

WhatAmIDoingHere (742870) | more than 4 years ago | (#31639570)

I mean the real world results posted on hardocp.

Re:Fermi needs a refresh or v2 (1)

TheKidWho (705796) | more than 4 years ago | (#31637002)

No, I wouldn't compare this to the 5xxx at all. Especially considering the Nvidia cards whoop the Radeons in tessellation and geometry operations.

Not to mention the overwhelming lead Nvidia has with GPGPU currently.

Re:Fermi needs a refresh or v2 (0, Flamebait)

binarylarry (1338699) | more than 4 years ago | (#31637150)

Nvidia also has decent drivers.

ATI's drivers are horrifyingly bad.

Re:Fermi needs a refresh or v2 (2, Insightful)

JuniorJack (737202) | more than 4 years ago | (#31637952)

Not to mention the overwhelming lead Nvidia has with GPGPU currently.

We are using GPUs for number-crunching tasks - integer operations. Currently one 5970 (air-cooled) outperforms a computer with 4 x GTX 295, water-cooled and overclocked to 725 MHz each.

NVIDIA really has to do much better with those new cards to win us back.

Re:Fermi needs a refresh or v2 (1)

Fred_A (10934) | more than 4 years ago | (#31638184)

Not to mention the overwhelming lead Nvidia has with GPGPU currently.

We are using GPUs for number-crunching tasks - integer operations.

And the CPU is busy computing the OpenGL screensaver graphics? :)

I know the GPUs have now moved into a different realm altogether but I still find it strange at times.
I still see my graphics card as a glorified Tseng ET 4000 despite it probably having more processing power than most of my previous machines combined...

Re:Fermi needs a refresh or v2 (2, Interesting)

im_thatoneguy (819432) | more than 4 years ago | (#31637642)

Maybe. But that assumes that your GPU is just being used to render DX or OpenGL games.

I think Nvidia made a very wise business decision with Fermi. Right now there is NO DEMAND for a video card on Fermi's level. All of the popular games run at full quality in full HD with AA. There is no "Crysis" which nobody can run at a decent framerate. We've sort of plateaued at "good enough", since most games are cross-developed for consoles (which are running aging video cards) and PC. Both AMD and Nvidia have released gaming cards that are overkill. So Nvidia has decided to take a different tack. They've managed to release a gaming card that is competitive with the very best video card for gaming, and also redesigned their cores to be fast GPGPUs.

In the AnandTech review the GTX400 is 2x-10x faster than the GTX 285 or Radeon 5870.

That might not do much for Modern Warfare 2, but Modern Warfare 2 already runs great. It will offer huge performance improvements in things like video encoding, Photoshop or any other CUDA-ready application.

As OpenCL gets used more in games for things like hair and cloth simulation or ray-traced reflections, Nvidia will have an architecture ready to deliver that as well. At some point AMD is going to need to go through a large re-architecture as well. But the longer they wait, the more likely they'll be trying to push out a competing product while the competition is fierce. If there is a time to deliver an average product and suffer huge delays, it's during an economic downturn and a period where there is little reason to upgrade.

Re:Fermi needs a refresh or v2 (4, Informative)

MartinSchou (1360093) | more than 4 years ago | (#31638728)

In the AnandTech review the GTX400 is 2x-10x faster than the GTX 285 or Radeon 5870.

That's overstating it WAY too much.

In certain benchmarks the GTX480 is quite a bit faster than the 5870, but what you're saying is that it is across the board, which is just not true. From the conclusion of the AnandTech review:

To wrap things up, let's start with the obvious: NVIDIA has reclaimed their crown - they have the fastest single-GPU card. The GTX 480 is between 10 and 15% faster than the Radeon 5870 depending on the resolution, giving it a comfortable lead over AMD's best single-GPU card.

There is a massive difference between "10 to 15%" and "2x-10x faster".
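
For a sense of scale, here is the same comparison in frame rates, sketched in Python with an assumed, purely illustrative 60 fps baseline for the 5870:

    # Assumed 60 fps baseline for the Radeon 5870 -- an illustration, not a benchmark.
    baseline = 60
    print(f"10-15% faster: {baseline * 1.10:.0f} to {baseline * 1.15:.0f} fps")   # 66 to 69 fps
    print(f"2x-10x faster: {baseline * 2} to {baseline * 10} fps")                # 120 to 600 fps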

Re:Fermi needs a refresh or v2 (1)

Kjella (173770) | more than 4 years ago | (#31639062)

Well, honestly I don't quite get this card. It's either a much cheaper Quadro or an overpriced, hot gaming card. It's six months after the Radeon 5xxx series launched; I wouldn't be very surprised if, between this paper launch and actual availability, AMD has binned up and announces the HD5890 to go head-to-head with nVidia for the title of fastest single gaming card again. Also, AMD has managed to roll out a full series of 40nm cards top-to-bottom already, while I think nVidia will take a lot longer to trickle down.

Yes, there are those that need CUDA, but I don't think the intersection between gamers and CUDA users is that big - and if gamers don't buy this card and the CUDA users buy this instead of the Quadros, then it's a lose-lose proposition for nVidia. But I guess I see the outlines of the "new nVidia"; they're just preparing to leave the "regular" GPU market sooner than I expected. With Intel including graphics on Atoms/Core i3/Core i5 and AMD heading for AMD Fusion, nVidia is being shut out of the market for integrated graphics. But there'll still be a decent market for discrete chips between that and the CUDA-focused cards.

This card looks like a misfit, like putting a truck engine in a sports car because they're both big, powerful engines. If AMD just focuses on the "average" discrete market and does that much better, they've got plenty to live off even if nVidia takes the CUDA market. I'm not so sure nVidia can sustain themselves just on being a niche high-end company. Maybe they can, but most companies that have tried have been steamrolled by the huge volume and investment happening in the mainstream. This card, to me at least, looks like it's serving customers with one hand and packing its bags with the other.

Re:Fermi needs a refresh or v2 (1)

pnewhook (788591) | more than 4 years ago | (#31638532)

And what about OpenGL? It's completely useless without OpenGL support.

Re:Fermi needs a refresh or v2 (0)

Anonymous Coward | more than 4 years ago | (#31639404)

It seems they still have to work on the OpenGL drivers according to this benchmark:
http://hothardware.com/Articles/NVIDIA-GeForce-GTX-480-GF100-Has-Landed/?page=10

Nvidia can only hope... (0, Troll)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#31636604)

That there are a lot of lunatic performance enthusiasts and deep-pocketed GPU computing users out there. $500, 250 watts, only modestly faster than the competitor's cheaper, cooler card that has been out for some months now, and has variants and cut-downs spanning more or less the entire price/performance spectrum from sub-$100 to mid $400s...

One cannot deny that they are, in fact, the fastest; but in all other respects they just got owned. More power draw than a CPU from the bad old days of Prescott (and Prescott was 90nm, this sucker is 40nm), a gigantic die that must cost a small fortune just to manufacture, hideously audible fan noise just to keep the thing from melting down. They'll have to cut the power draw by a factor of five to land any laptop design wins at all, a factor of ten for anything that isn't a 2.5-inch-thick gamer box of a laptop.

Unless there is a large enough market of crazy gamers who just must have the fastest, or GPU computing people who don't care how expensive or noisy these cards are because they are in the datacenter doing some sort of algorithmic trading, Nvidia has a real loser on their hands...

Re:Nvidia can only hope... (5, Informative)

RzUpAnmsCwrds (262647) | more than 4 years ago | (#31636734)

More power draw than a CPU from the bad old days of Prescott

Prescott at its hottest (Pentium 4 HT 571) was only 115W, which is about the same or (in some cases) vastly less than nearly every mid-range to high-end GPU today.

Radeon 5830 is 175W
Radeon 5850 is 151W
Radeon 5770 is 108W

Prescott at its hottest actually used less power than some of the current high-end Core i7 CPUs (i7-920 is 130W), although of course that's comparing a 1-core CPU to a vastly faster 4-core CPU.

What's happened is that CPU coolers have gotten much better (thanks in part to heatpipes and larger fins/fans), power supplies have gotten more efficient and larger, and cases are better ventilated. The result is that today a 130W CPU is no big deal, whereas with the Prescott it caused all kinds of thermal nightmares for people building their own PCs (professionally engineered commercial PCs generally fared OK with Prescott).

Still, 250W on a GPU is stupid. Even with modern efficient air cooling, it's hard to keep such a GPU cool without making a ton of noise. Add the crazy power supply requirements (most people are recommending 550W or more, which means $100+ if you want a quality PSU), and it's a pretty big burden. The real problem is that the ATI card is almost as fast, cheaper, and 80 watts cooler. And it's been on the market for 8 months.
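
To see roughly where recommendations like "550W or more" come from, you can just add up worst-case component draws and leave some headroom. A minimal Python sketch, with all wattages being illustrative assumptions rather than measured figures:

    # Rough PSU-sizing sketch; every wattage here is an illustrative assumption.
    parts = {
        "GPU (GTX 480 class, TDP)": 250,
        "CPU (quad-core, TDP)": 130,
        "Board, RAM, drives, fans": 60,
    }

    load = sum(parts.values())      # worst-case DC load in watts
    headroom = 1.25                 # keep the PSU comfortably below full load
    print(f"Estimated load: {load} W, suggested PSU: {load * headroom:.0f} W or more")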

Re:Nvidia can only hope... (4, Informative)

afidel (530433) | more than 4 years ago | (#31637652)

You're missing the best card on a performance-per-watt basis, the HD5750. The PowerColor Go! Green edition pulls 62W max, 52W in normal gaming. It's so efficient it doesn't even need a PCIe power connector. It will get you 95% of the performance of the HD5770, which pulls twice as much power. Oh, and for HTPCs that are on 24x7, the 14W idle is nice too =) Now if only they would come down from $160 and Newegg would get them back in stock...

Crap Hardware vs. Crap Drivers? Is that it atm? (1, Insightful)

Anonymous Coward | more than 4 years ago | (#31636610)

So I can choose between nice 20W idle with ATI, but shit windows and goddamn awful linux drivers with only outdated X.org / kernel support for the cards.

Or this power hungry overpriced heater (yay, summer is coming), which at least has decent drivers.

The Free Market has failed us! Damn commies!

Re:Crap Hardware vs. Crap Drivers? Is that it atm? (1)

lightrush (1471807) | more than 4 years ago | (#31636952)

Uhm, you have a point but the radeonhd driver is advancing very quickly. OpenGL 2.0 support on HD3xxx, HD4xxx series is complete I believe. :)

Re:Crap Hardware vs. Crap Drivers? Is that it atm? (1)

Narishma (822073) | more than 4 years ago | (#31639296)

You say that as if it's a good thing. Wasn't OpenGL 4.0 just announced a few weeks ago?

Re:Crap Hardware vs. Crap Drivers? Is that it atm? (1)

Skarecrow77 (1714214) | more than 4 years ago | (#31636970)

I hear the ATI proprietary drivers are great as long as you don't wanna run anything newer than the 2xxx series...

And there are of course the nouveau open source drivers, whose 3d acceleration is best paraphrased as "maybe someday. stop asking dammit."

Yeah, it's pretty much Nvidia or something from someone else circa 2007, take your pick.

Re:Crap Hardware vs. Crap Drivers? Is that it atm? (1)

lightrush (1471807) | more than 4 years ago | (#31636994)

Re:Crap Hardware vs. Crap Drivers? Is that it atm? (1)

Skarecrow77 (1714214) | more than 4 years ago | (#31637184)

OK, so I exaggerated; they run the 4xxx series. Let me know when they run the 5850 and 5870.

Re:Crap Hardware vs. Crap Drivers? Is that it atm? (1)

Yaa 101 (664725) | more than 4 years ago | (#31637088)

Sorry, but the ATI prop drivers are unstable, thus unusable for the most part.

Re:Crap Hardware vs. Crap Drivers? Is that it atm? (2, Funny)

Fred_A (10934) | more than 4 years ago | (#31638206)

So I can choose between nice 20W idle with ATI, but shit windows and goddamn awful linux drivers with only outdated X.org / kernel support for the cards.

Or this power hungry overpriced heater (yay, summer is coming), which at least has decent drivers.

I think I read somewhere (I'd have to look it up) that both ATI and nVidia make other models. Maybe you could find one that's more to your liking among those?

OTOH, with summer comes the season of open-case barbecues, so nVidia has at least something going for it!
(is the GF100 dishwasher safe?)

So... (4, Funny)

RyanFenton (230700) | more than 4 years ago | (#31636616)

Most unique, perhaps, is that the surface of the card is actually part of the heatsink, above the fin array. Normally, this would be a part of the card you could grab onto when pulling it out of a system. But when I burnt my hand on it, I thought a temperature reading would be interesting. Turns out that, during normal game play (running Crysis, not something like FurMark), the exposed metal exceeds 71 degrees C (or about 160 degrees F).

...So, are any third party manufacturers planning on making an easy-bake oven attachment for this thing? At least have that thing creating some gaming snacks with some of that extra heat.

Ryan Fenton

Re:So... (0)

Anonymous Coward | more than 4 years ago | (#31637056)

I wonder how much I will be able to make in the lawsuit! (I hope /. doesn't reveal my IP)

Re:So... (1)

DigiShaman (671371) | more than 4 years ago | (#31637286)

Cookies!!! And they'll pop out like a sideways toaster. Shweet

Re:So... (1)

fragMasterFlash (989911) | more than 4 years ago | (#31637804)

If this thing can be rigged to cook bacon then even I might buy one.

Modern Engineering (2, Funny)

lightrush (1471807) | more than 4 years ago | (#31636788)

What do you know, heaters for PCIe come with moderately fast GPUs onboard these days!

Anand Tech Review (4, Informative)

alvinrod (889928) | more than 4 years ago | (#31636802)

There's also an AnandTech [anandtech.com] review which is pretty good and has plenty of different benchmarks. It has the added benefit of testing a 480 SLI configuration, which produces some interesting results. It also presents some benchmarks that help show off nVidia's GPGPU performance, which is something they've been using to hype these new cards.

In my own opinion, ATI still has a competitive advantage, especially considering that they can always drop their prices if they feel threatened. nVidia is lucky to have ION and Tegra to fall back on, because it seems as though they don't have a pot to piss in right now in terms of high-end desktop graphics offerings. The 480 seems to be about equal to similarly priced ATI offerings and doesn't give them the edge in performance that they're accustomed to having.

Re:Anand Tech Review (1)

Nemyst (1383049) | more than 4 years ago | (#31639614)

Uh... Why is the page you linked considered dangerous by Firefox?

Bleeding edge isn't usually worth it (4, Interesting)

NotSoHeavyD3 (1400425) | more than 4 years ago | (#31636820)

I mean, look at it like this. You can probably get a card for $120-$150 now that will probably run every current game well right now. (Well, except for Crysis.) So there is no point in buying it for current games. You could get that $500 card hoping that it will run future games well, but it never seems to happen that way. (They're slow no matter what old card you have.) Instead you can just buy another $120-$150 card in a few years and that one will run it well. (This way you end up spending less money and actually get better performance.) So my experience is just buy a decent card ($120-$150) and in a few years buy another one and do whatever with the old one. (Sell it, give it to a family member whatever.)
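
A back-of-the-envelope version of that argument, with purely illustrative prices and an assumed six-year window, in Python:

    # Illustrative GPU-buying comparison; prices and the time window are assumptions.
    years = 6
    flagship_once  = 500          # one $500 card kept for the whole period
    midrange_cycle = 3 * 135      # a ~$135 card every two years

    print(f"Flagship once:   ${flagship_once}")
    print(f"Mid-range cycle: ${midrange_cycle}")   # $405, and the last card is the newest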

Re:Bleeding edge isn't usually worth it (1)

TheKidWho (705796) | more than 4 years ago | (#31636942)

For most people, yes. Some people like to be on the bleeding edge however.

Re:Bleeding edge isn't usually worth it (0)

Anonymous Coward | more than 4 years ago | (#31637328)

Not true if you want to run new games in DX10/DX11 with the latest eyecandy.

Right now that's... Just Cause 2, Metro 2033, Shattered Horizon and a couple of others.

Within the lifetime of a GPU you'd buy today, there will be tons more. So unless you are happy to keep playing in DX9 mode (possible in some cases, but not always), I'd say Radeon HD 5850 is easily justifiable today ($300 or so) assuming you will use your GPU for 2-3 years.

$400+ GPUs are pretty much pointless, I agree there. The only point would be to run massive triple-screen setups and I think at that point the price of the GPU is the lesser issue and the price of three good quality monitors is the issue :D

Re:Bleeding edge isn't usually worth it (4, Interesting)

tirefire (724526) | more than 4 years ago | (#31637442)

I mean, look at it like this. You can probably get a card for $120-$150 now that will probably run every current game well right now. (Well, except for Crysis.)

Crysis came out in Q3 2007. It's not really a current game anymore. Its use as a benchmark for video card performance is frustrating because it's an incredibly inefficient game engine. Don't get me wrong, it looks beautiful... but so do games that will run at twice the frame rate on the same system.

So my experience is just buy a decent card ($120-$150) and in a few years buy another one and do whatever with the old one. (Sell it, give it to a family member whatever.)

Right on. This is what I used to do until spring of 2007, when I bought an nVidia 8800 GTS 320 MB to play STALKER. That card continues to serve me well with any game I throw at it. I was expecting to need to upgrade it in 2009, but I never did... new games kept running great on it. I've had that card for almost exactly THREE YEARS now and it still amazes me. I've never had any piece of computing hardware that did that.

Changes in graphics card features and speed were really taking place at a white-hot pace between about 2003 and 2007. Those years saw the introduction of cards like the Radeon 9800, the GeForce 6800, and the GF 8800. All of those cards totally smashed their predecessors (from both nVidia AND ATI) in benchmarks. It was even more amazing than the CPU world from 1999 to 2004, when clock rates were shooting through the roof and when AMD embarrassed Intel with the introduction of the 64-bit Hammer core (Athlon 64).

GTX285 (0)

Anonymous Coward | more than 4 years ago | (#31636864)

I got me one of these back in October for $400. (Eh, pricey, but worth every cent.) It's a nice 2GB video card.

New hardware is good (2, Insightful)

Sarten-X (1102295) | more than 4 years ago | (#31636918)

I love seeing new generations of hardware come out. It means that the perfectly adequate cards from two years ago will be even cheaper.

hello Nvidia GrillForce (4, Funny)

distantbody (852269) | more than 4 years ago | (#31636948)

It's about time a product acknowledged my desktop grilling needs.

Better perf than this shows? (1)

Vigile (99919) | more than 4 years ago | (#31636980)

Another review here points to slightly more of a performance edge to the GTX 480 and 470:
http://www.pcper.com/article.php?aid=888 [pcper.com]

I don't understand. (0)

Anonymous Coward | more than 4 years ago | (#31637094)

Why is Slashdot reporting on this? Did you tell me back in April 2000 that the Geforce 2 was the only card I'd ever need? This betrayal hurts, it really does.

This is why we need the on-live service to succeed (1)

Flentil (765056) | more than 4 years ago | (#31637102)

These new cards, as usual, are way too expensive. I had the best video card when Doom 3 came out. Since then I've upgraded once, and I need a whole new motherboard, CPU, and RAM before I can upgrade to a newer card. This is why people turn to consoles. This is what's killing PC gaming. I really hope OnLive works out, as I see it as the ultimate solution to this problem without having to resort to an Xbox/PS3.

Re:This is why we need the on-live service to succ (1)

linzeal (197905) | more than 4 years ago | (#31637208)

Doom 3 was released 6 years ago; are you telling me that the PS3 and Xbox 360 came out 6 years ago?

I don't know what your priorities are for computing needs, but you are on Slashdot and you're telling me you do a refresh every 6 years or so?

I'm paid to do CAD for a living, so I need a beast at home to do work on, but I can't imagine a 6-year-old machine (what, an original Athlon or P3?) could even handle 1080p streaming content, let alone any hardcore programming environment for compiling or a modern parametric modeler.

Re:This is why we need the on-live service to succ (1)

anss123 (985305) | more than 4 years ago | (#31637236)

You don't have to get the absolute best, ya know? OnLive - a YouTube-like gaming service - is unlikely to give you a better gaming experience than a $70 graphics card. If you've got to have the absolute best graphics out there, then the PS360 is already getting long in the tooth, and MS/Sony are fretting more about their Wii-inspired controllers than graphics these days.

Re:This is why we need the on-live service to succ (1)

Namarrgon (105036) | more than 4 years ago | (#31637276)

Well, what on earth are you buying the new cards for? Last year's mid-range cards are far cheaper and perfectly adequate for any game around (especially if you run at console-standard 720p). Also, if you last upgraded in 2004, you'd be needing a new console by now anyways. Not many games released for the PS2 or Xbox lately.

OnLive will be OK for slower-paced games (latency kills any FPS playing), but you'll need a fairly beefy connection if you want even console-level resolutions, let alone PC-level. Plus, the larger the frame size, the greater the transmit time and the larger the latency.
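
To put a rough number on that last point, the sketch below (Python) estimates how long one compressed frame takes just to cross the wire; the stream bitrate and link speed are assumptions, and encode, decode and network round-trip time come on top of it:

    # Per-frame transmit-time estimate; bitrate and link speed are assumptions.
    stream_mbps = 5.0    # assumed compressed 720p stream
    fps         = 30
    link_mbps   = 8.0    # assumed home downlink

    bits_per_frame = stream_mbps * 1e6 / fps
    transmit_ms    = bits_per_frame / (link_mbps * 1e6) * 1000

    print(f"~{bits_per_frame / 8 / 1024:.0f} KiB per frame, "
          f"~{transmit_ms:.1f} ms just to transmit it")
    # Raise the resolution or the quality and this term grows with it.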

Re:This is why we need the on-live service to succ (2, Insightful)

Totenglocke (1291680) | more than 4 years ago | (#31637338)

And you know who buys the top of the line super expensive cards? Pretty much no one. Everyone else either buys a mid-range card or last year's top of the line. Both of those will last you a few years and the all-around computer cost is less than a console.

Don't believe me that consoles are more expensive? I'm a PC gamer (who occasionally plays console games) and a friend of mine is a console gamer (who occasionally plays PC games). He tries to use your argument about "it's expensive with upgrading your computer", yet he ignores the fact that 1) console games virtually never go down in price, whereas PC games drop in price very quickly after the first few months and 2) consoles nickel and dime you to death. We actually sat down and did the math one time and for his Wii, 360, PS3 and enough controllers for 4 players on each, it came out to over $2,500 for just the console hardware. You can easily buy two very good gaming systems for less money over the course of the lifespan of a console generation.

So no, people don't turn to consoles because they're cheaper, people turn to consoles because they can't do basic math.

Re:This is why we need the on-live service to succ (2, Interesting)

bertok (226922) | more than 4 years ago | (#31637482)

And you know who buys the top of the line super expensive cards? Pretty much no one. Everyone else either buys a mid-range card or last year's top of the line. Both of those will last you a few years and the all-around computer cost is less than a console.

Don't believe me that consoles are more expensive? I'm a PC gamer (who occasionally plays console games) and a friend of mine is a console gamer (who occasionally plays PC games). He tries to use your argument about "it's expensive with upgrading your computer", yet he ignores the fact that 1) console games virtually never go down in price, whereas PC games drop in price very quickly after the first few months and 2) consoles nickel and dime you to death. We actually sat down and did the math one time and for his Wii, 360, PS3 and enough controllers for 4 players on each, it came out to over $2,500 for just the console hardware. You can easily buy two very good gaming systems for less money over the course of the lifespan of a console generation.

So no, people don't turn to consoles because they're cheaper, people turn to consoles because they can't do basic math.

Actually, people do buy the super expensive cards, and it's often not a bad deal.

I got myself an NVIDIA 8800 GTX when it first came out. It ran super hot and cost me quite a bit, but it was the fastest single-card/single-chip 3D accelerator on the market for something like a year, and even when faster cards came out, the difference was something like 10% for a long time.

In the end however, it was cheaper for me to buy a very good card once and keep it for a couple of years, than to repeatedly buy older model cards at a lower price to be able to play the latest games.

I could play Crysis just fine at 1920x1200 when it was first available, which was pretty much only possible on that card or an SLI system, unless you enjoyed playing the "Crysis slideshow". If I had an older model card, I'd have been forced to upgrade.

Re:This is why we need the on-live service to succ (1)

chazwurth (664949) | more than 4 years ago | (#31637508)

And you know who buys the top of the line super expensive cards? Pretty much no one.

Then why can't supply satisfy demand? Prices on all the enthusiast-oriented cards have been going up for months, and if you really want a top-end card (5970 for example), it's really hard to find one.

Monitors are getting bigger and cheaper, and a lot of people want to play at (minimally) 1920x1200 at high settings. For newer games, that takes expensive cards.

Re:This is why we need the on-live service to succ (2, Interesting)

PhunkySchtuff (208108) | more than 4 years ago | (#31637546)

We actually sat down and did the math one time and for his Wii, 360, PS3 and enough controllers for 4 players on each, it came out to over $2,500 for just the console hardware. You can easily buy two very good gaming systems for less money over the course of the lifespan of a console generation.

So you can buy two PCs (that can have one, or at most two people playing at once) or you can buy three consoles and enough peripheral hardware to have four people playing at once on each console and... consoles are more expensive?

Consoles are also more convenient. Turn it on. Put in a disc, or load a game off the hard drive. Play. Turn it off. Easy.

Re:This is why we need the on-live service to succ (1)

aussie_a (778472) | more than 4 years ago | (#31638708)

Consoles are also more convenient. Turn it on. Put in a disc, or load a game off the hard drive. Play. Turn it off. Easy.

Then what's up with all this install patches business for specific games?

Re:This is why we need the on-live service to succ (1)

sa1lnr (669048) | more than 4 years ago | (#31637978)

And don't forget the price of the games.

Here in the UK, PC games are a standard £30-35 sterling and have been for years. A lot of the console games that I see in the stores are up to and above 50% more expensive.

I want (2, Interesting)

Anonymous Coward | more than 4 years ago | (#31637214)

a 40nm 9800GT with an 80W TDP. The 9800 is fast enough for my needs and has been for 2 years now. Less heat. Less power. Less noise. A 150W video card has absolutely no appeal to me.

Re:I want (1)

TheRaven64 (641858) | more than 4 years ago | (#31638576)

I'm surprised that there's even a market for 80W GPUs. Most computer sales are now laptops, with handhelds catching up quickly. In a laptop, you have maybe 10W, 20W if you're on mains, in a handheld you've got under 1W for the CPU and GPU. Given the kind of performance that we're seeing from the current generation of GPUs on ARM SoCs, which use about 500mW, 80W seems extravagant.

Re:I want (1)

fostware (551290) | more than 4 years ago | (#31639126)

80W seems extravagant.

The Dell XPS 1730 (SLi 8800GTX) comes with a 230W power supply. :S

There will always be someone who wants to pay for the privilege of the best graphics, and a lot of mates are now forgoing the Shuttle cases and buying a decent GPU laptop.

That said, I have CURRENT i7 desktops with 230W power supplies.

Crippled double precision, bleh. (1, Interesting)

Anonymous Coward | more than 4 years ago | (#31637322)

The only real reason I wanted to get a Fermi / GTX 480 card was to experiment with GPGPU and finally be able to work at reasonable performance using double-precision algorithms, which my 8800GT won't do. Now I find that they've crippled the double-precision performance to something like 1/4th of the hardware's actual capability, just to price-gouge the developers who want that capability as opposed to just playing video games. So as it stands, the AMD 5870 is about 2/3rds the price or so, has 4x the double-precision performance, runs a fair bit cooler, and has been available for many months as well. I think I'll have to pass on the Fermi / GTX 4xx series cards until they come to their senses and make a product that is fully competitive with the much older and much cheaper AMD 58xx series products in this regard.

I don't really see why NVIDIA would think it is reasonable to cripple DP performance for market-segmentation reasons, as if somehow DP weren't a mainstream necessity for consumer and small-business computing; every CPU out there has had a DP FPU for decades now (and wouldn't have if it weren't useful for absolutely ordinary tasks), and OpenCL / DirectCompute / HDR / etc. are all technologies that very much benefit from DP and are being pushed heavily for mainstream multimedia, image processing, and ordinary PC application performance enhancement.

It is hardly esoteric HPC-level stuff these days. Actually, the real question is why it has taken so long to get quad precision / long double / whatever standardized into the computer languages / compilers (C, C++, CLR) and CPUs / GPUs; it would've been a logical progression around the time things went to 64-bit, or earlier (for different but analogous reasons).

Now if only AMD's drivers and OpenCL implementations weren't quite so bad...
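
For a rough sense of the numbers being argued about, here is a peak-throughput sketch in Python; the core counts, clocks and precision ratios are recalled from contemporary spec sheets and should be treated as ballpark figures, not authoritative data:

    # Ballpark peak-FLOPS sketch; specs and ratios are approximate, from memory.
    def gflops(cores, clock_ghz, flop_per_clock=2):   # 2 = fused multiply-add
        return cores * clock_ghz * flop_per_clock

    gtx480_sp      = gflops(480, 1.401)       # shader cores at shader clock
    gtx480_dp_full = gtx480_sp / 2            # what the Fermi silicon can do
    gtx480_dp_cap  = gtx480_dp_full / 4       # the GeForce cap complained about above

    hd5870_sp = gflops(1600, 0.850)
    hd5870_dp = hd5870_sp / 5                 # Cypress runs DP at 1/5 the SP rate

    print(f"GTX 480: ~{gtx480_sp:.0f} SP, ~{gtx480_dp_cap:.0f} DP (capped) GFLOPS")
    print(f"HD 5870: ~{hd5870_sp:.0f} SP, ~{hd5870_dp:.0f} DP GFLOPS")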

Re:Crippled double precision, bleh. (1)

RCL (891376) | more than 4 years ago | (#31638388)

I'm also eager to run my GPGPU code on that, but since it uses single-precision floats only, I'm OK with NVIDIA's decision.

People still think about cards in terms of "traditional" graphics APIs. DirectX 11, bleh... These days, you can again render pixels your own way, and rasterization and the polygon-based graphics APIs can be completely bypassed... without losing performance!

Re:Crippled double precision, bleh. (1)

evilbessie (873633) | more than 4 years ago | (#31639194)

I'm guessing that they want to sell you the $2000 top-of-the-line Quadro FX card to do your number crunching, or a Tesla.

Minimum Framerates and Graphics Lag spikes (1)

moozoo (1308855) | more than 4 years ago | (#31637588)

I think Fermi can be summed up with the comments near the bottom of the Crysis Warhead benchmark in the review done by AnandTech: "The GTX 400 series completely tramples the 5000 series when it comes to minimum framerates, far more than we would have expected." Fermi is a Mack truck that ploughs through the tougher scenes. There is nothing worse than having smoke, explosions, waterfalls, etc. cause graphics lag spikes.

Basically an Epic Fail (0)

gweihir (88907) | more than 4 years ago | (#31637604)

All their boasting cannot obscure that. Nvidia has nothing this market round. Maybe they will be back in the game next round, but only if they can moderate their arrogance and stop lying to their customers. Otherwise it looks like Nvidia may be history with regard to the consumer market.

I certainly will not buy from them again after 2 failed GFX cards (the bump problem) and 1 failed mainboard (much too much heat), all from shoddy engineering on their part. It also seems that they have lost their edge on the driver side. I have now had several instances where Nvidia GFX crashed while AMD GFX did not. It may also be that Nvidia hardware is now so bad that the drivers cannot compensate anymore.

good card for playing with GPGPU? (1)

FuckingNickName (1362625) | more than 4 years ago | (#31637940)

Per subject, what would be a reasonable card for playing with GPGPU tech (under Win7)? I have been thinking about the GT220 or GT240, and while I am bombarded with reviews by Top Elite gamer sites indicating that these are low to mid range cards, as far as I can tell they basically do what the higher range cards do, but with fewer cores/less memory/slower clock. And the only significant thing I might be missing out on is double precision arithmetic.

Of course, I am likely to be wrong... what else would I not be able to play with GPGPU-wise by considering a $80ish card rather than a $1000 one?

Re:good card for playing with GPGPU? (1)

Ogi_UnixNut (916982) | more than 4 years ago | (#31639582)

When CUDA came out I went and bought the cheapest GPU I could find (a 30 euro GeForce 8400 GS) and started learning. If you've never done GPU programming before (like me) and you just want to give it a go, it's better to get the cheapest card to start with. You can always sell it and buy a better card once you get proficient at it (or just keep them both, which is what I did, for multiple GPU jobs at once). I use Linux, but the same should apply to Win7 (assuming Nvidia has drivers for the card, which I assume they do).

This is assuming you just want it for GPU work, rather than as a gaming/graphics card. Wikipedia has a list of CUDA cards of varying price/performance. [wikipedia.org]
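
If you want to see what that first experiment looks like, the usual starting point is an element-wise kernel. Here is a minimal sketch using PyCUDA (an assumption on my part: that you're happy to drive CUDA from Python and have the pycuda package and the toolkit installed); the same thing in plain C for CUDA is only a little longer.

    # Minimal CUDA experiment: add two vectors on the GPU (PyCUDA sketch).
    import numpy as np
    import pycuda.autoinit                 # creates a context on the default device
    import pycuda.driver as drv
    from pycuda.compiler import SourceModule

    mod = SourceModule("""
    __global__ void add(float *out, const float *a, const float *b)
    {
        int i = threadIdx.x + blockIdx.x * blockDim.x;
        out[i] = a[i] + b[i];
    }
    """)
    add = mod.get_function("add")

    n = 1024                               # grid below covers exactly n elements
    a = np.random.randn(n).astype(np.float32)
    b = np.random.randn(n).astype(np.float32)
    out = np.empty_like(a)

    add(drv.Out(out), drv.In(a), drv.In(b), block=(256, 1, 1), grid=(n // 256, 1))
    assert np.allclose(out, a + b)
    print("GPU result matches the CPU result")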

No source engine benchmarks? (2, Interesting)

Hadlock (143607) | more than 4 years ago | (#31638838)

I got halfway through the first paragraph before I started looking for the link to the L4D2 benchmarks, which are a pretty good indicator of how well your computer is going to run L4D2, TF2, and, very importantly, Portal 2. None detected, even though it's one of their primary tests in all of their video card shootouts. Another failure for the guys at Tom's Hardware.

CS5 ? (1)

yakumo.unr (833476) | more than 4 years ago | (#31639234)

Will the Adobe CS5 Mercury Playback Engine run on this, or are they really locking it JUST to Quadros?

90 degrees C, at Idle!! (4, Informative)

guidryp (702488) | more than 4 years ago | (#31639464)

http://www.legitreviews.com/article/1258/15/ [legitreviews.com]

I discovered that the GeForce GTX 480 video card was sitting at 90C in an idle state because I had two monitors installed on my system. I talked with some of the NVIDIA engineers about this 'issue' I was having and found that it wasn't really an issue per se, as they do it to prevent screen flickering. This is what NVIDIA said in response to our questions:

"We are currently keeping memory clock high to avoid some screen flicker when changing power states, so for now we are running higher idle power in dual-screen setups. Not sure when/if this will be changed. Also note we're trading off temps for acoustic quality at idle. We could ratchet down the temp, but need to turn up the fan to do so. Our fan control is set to not start increasing fan until we're up near the 80's, so the higher temp is actually by design to keep the acoustics lower." - NVIDIA PR

Regardless of what the reasons behind this are, running a two-monitor setup will cause your system to literally bake.

Yikes!

I already wasn't impressed, but after reading this it looks more like a fiasco than just a mild disappointment.
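
For what it's worth, the policy described in that quote amounts to a fan curve with a wide quiet band; the sketch below (Python) is purely illustrative of the shape of that trade-off and is not NVIDIA's actual control algorithm.

    # Illustrative fan curve with a quiet band, as described in the quote above.
    def fan_duty(temp_c, quiet_until=80, max_temp=105, idle_duty=40, max_duty=100):
        """Hold a low duty cycle until 'quiet_until', then ramp linearly to max."""
        if temp_c <= quiet_until:
            return idle_duty
        frac = min(1.0, (temp_c - quiet_until) / (max_temp - quiet_until))
        return idle_duty + frac * (max_duty - idle_duty)

    for t in (50, 70, 90, 100):
        print(f"{t:>3} C -> {fan_duty(t):.0f}% fan")   # stays at 40% until past 80 C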

Nice. Too bad nobody makes PC Games anymore... (1)

rtrifts (61627) | more than 4 years ago | (#31639784)

Without putting too fine a point on it, hardware like this used to be pretty cool. I have had several GTX 260s and an Asus 4870 for the past 1.5 years. I've even got two M1710 laptops with SLI. Truth is, I've yet to really flex the muscles on *any* of this hardware since I've owned it.

There just aren't many Triple-A PC titles being made these days, let alone any that benefit much from hardware like this.

It would be very cool if there *were* such titles. But there aren't. Worse, there are not many coming into focus on the horizon, either. I suppose we can all hope the system requirements and eye candy in Star Wars: The Old Republic and in Diablo III will shine with this hardware.

But I wouldn't bet on it. So we buy one of these to play the Witcher II? Then what?

Hardware like this is a solution looking for a problem. And that IS the problem.
