
NVIDIA GeForce GTX 780 Offers 2,304 Cores For $650

timothy posted about a year ago | from the that's-a-lot-of-grande-lattes dept.

Graphics 160

Vigile writes "When NVIDIA released the GTX Titan in February, it was the first consumer graphics card to use NVIDIA's GK110 GPU, with 2,688 CUDA cores / shaders and an impressive 6GB of GDDR5 frame buffer. However, it also had a $1000 price tag, which was the limiting specification for most gamers. With today's release of the GeForce GTX 780, NVIDIA is hoping to use more of the GK110 silicon it gets from TSMC while offering a lower-cost version with performance within spitting distance. The GTX 780 uses the same chip but disables a handful more compute units, bringing the shader count down to 2,304, still an impressive bump over the 1,536 of the GTX 680. The 384-bit memory bus remains, though the frame buffer is cut in half to 3GB. Overall, the performance of the new card sits squarely between the GTX Titan ($1000) and AMD's Radeon HD 7970 GHz Edition ($439), just like its price. The question is, are PC gamers willing to shell out $220+ more than the HD 7970 for somewhere in the range of 15-25% more performance?" As you might guess, there's similarly spec-laden coverage at lots of other sites, including Tom's, ExtremeTech, TechReport, and HotHardware.
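
For a rough sense of the price/performance question the summary poses, here is a back-of-the-envelope sketch in Python using only the figures quoted above (list prices and the 15-25% range); street prices and actual benchmark gaps will vary.

<ecode>
# Price premium of the GTX 780 over the HD 7970 GHz Edition vs. the quoted performance gap
gtx780_price, hd7970_price = 650.0, 439.0

price_premium = gtx780_price / hd7970_price - 1.0   # ~0.48, i.e. about 48% more money
perf_gain_low, perf_gain_high = 0.15, 0.25           # 15-25% more performance, per the summary

print(f"Price premium: {price_premium:.0%}")
print(f"Performance gain: {perf_gain_low:.0%} to {perf_gain_high:.0%}")
</ecode>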

160 comments

Still slower than AMD (-1)

Anonymous Coward | about a year ago | (#43802987)

nVidia's compute architecture still sucks compared to AMD/Stream, and more cores won't fix it.

There's a reason nobody builds bitcoin miners with nVidia/CUDA. There's a reason nobody who builds serious gaming rigs uses nVidia/CUDA. There's a reason nobody who is serious about computational problems still uses it.

CUDA is dead. Has been for a while now.

Re:Still slower than AMD (3, Insightful)

Anonymous Coward | about a year ago | (#43803155)

Implementation trumps architecture. There's a reason nobody who's interested in power efficiency, noise and/or heat uses AMD products.

Re:Still slower than AMD (2)

wooferhound (546132) | about a year ago | (#43803287)

They seem to be using the terms "cores" and "shaders" interchangeably. Is a shader a core?

Re:Still slower than AMD (3, Informative)

Vigile (99919) | about a year ago | (#43803391)

In GPU terms, yes. The shaders and cores are very different between AMD and NVIDIA (that's why NVIDIA can have 1,536 and compete with an HD 7970 that has 2,048 shaders).

Re: Still slower than AMD (1)

iamhassi (659463) | about a year ago | (#43803575)

AMD/Nvidia/inbreadkitty comes out with a faster, more expensive video card; news at 10. This happens yearly, so why is it on /.? I understand it being on the hardware websites, but not here.

Re: Still slower than AMD (4, Insightful)

bdwebb (985489) | about a year ago | (#43804859)

That pretty much seems like the entire point of this site. News for nerds... did we as nerds suddenly stop caring about hardware? I was not included in that memo.

Re:Still slower than AMD (1)

parlancex (1322105) | about a year ago | (#43804209)

Well, kind of... or not really... not in the traditional sense. I can't speak for AMD, but in Nvidia architectures those 2000+ "cores" are clumped into very wide (32+) groups which all share a single instruction decoder and all the parts that go around that. Are they really individual "cores" if they all have to execute the exact same instruction in lockstep?
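
A minimal Python sketch of the lockstep behaviour described above (purely illustrative; real hardware handles divergence with predication and reconvergence stacks rather than Python loops): a 32-wide group follows one instruction stream, and a branch is handled by running each side with a per-lane mask, so lanes on the "wrong" side simply idle.

<ecode>
WIDTH = 32  # one warp-sized group of lanes sharing an instruction stream

def simt_group(data):
    out = [0] * WIDTH
    # The whole group evaluates the branch condition for every lane.
    mask = [data[i] % 2 == 0 for i in range(WIDTH)]

    # Taken path: issued to all lanes, but only masked-on lanes keep results.
    for i in range(WIDTH):
        if mask[i]:
            out[i] = data[i] * 2      # other lanes are idle during this pass

    # Not-taken path: issued afterwards with the inverted mask.
    for i in range(WIDTH):
        if not mask[i]:
            out[i] = data[i] + 1      # the first set of lanes idles here

    return out

print(simt_group(list(range(WIDTH))))
</ecode>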

Re:Still slower than AMD (1)

Xrikcus (207545) | about a year ago | (#43804583)

GPU manufacturers have a tendency to use the word "core" to mean "one ALU in the middle of a vector unit". It's not really very different in principle from saying that an AVX unit is 8 cores, though, so you have to be careful with comparisons.

If you look at the AMD architecture for each compute unit, it's not so different from the cores you see on the CPU side, so it's much more fair to call the 7970 a 32 core chip. The way that a work item in OpenCL, say, or a shader instance in OpenGL maps down to one of those lanes is as much an artifact of the toolchain as of the architecture.
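
To make that bookkeeping concrete, the headline "core" counts are just the number of compute units (or SMX blocks) multiplied by the ALU lanes per unit; the figures below come from the public spec sheets and are for illustration only.

<ecode>
# Marketing "cores" = compute units (or SMX blocks) x ALU lanes per unit
hd_7970   = 32 * 64    # 32 GCN compute units x 64 stream processors = 2048 "shaders"
gtx_680   = 8 * 192    # 8 Kepler SMX x 192 CUDA cores               = 1536
gtx_780   = 12 * 192   # 12 enabled SMX on GK110                     = 2304
gtx_titan = 14 * 192   # 14 enabled SMX on GK110                     = 2688
print(hd_7970, gtx_680, gtx_780, gtx_titan)
</ecode>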

Re:Still slower than AMD (2)

IRWolfie- (1148617) | about a year ago | (#43803359)

Noise? Wouldn't that just depend on which manufacturer packages the AMD GPU?

Re:Still slower than AMD (0)

Anonymous Coward | about a year ago | (#43803365)

Implementation trumps architecture. There's a reason nobody who's interested in power efficiency, noise and/or heat uses AMD products.

And obviously I meant Nvidia.

Re:Still slower than AMD (-1)

Anonymous Coward | about a year ago | (#43803513)

There's a reason nobody who's interested in linux support uses NVIDIA.
Linus said: "NVIDIA is the WORST company regarding linux support. FUCK U NVIDIA!!".

AMD has bad drivers. NVIDIA has the WORST!

Re:Still slower than AMD (1)

TheRealMindChild (743925) | about a year ago | (#43804075)

I'm sure you meant "except their APU platform"

Re:Still slower than AMD (-1)

Anonymous Coward | about a year ago | (#43804511)

nVidia's compute architecture still sucks compared to AMD/Stream, and more cores won't fix it.

There's a reason nobody builds bitcoin miners with nVidia/CUDA. There's a reason nobody who builds serious gaming rigs uses nVidia/CUDA. There's a reason nobody who is serious about computational problems still uses it.

CUDA is dead. Has been for a while now.

Yeah, sure, until you run Blender. You have heard of Blender? Then there is simply no comparison. All you open source fanbois rant and rave about AMD, blah blah blah. There are other uses for high-end video cards. Buying an AMD is not even a consideration for me...

Oh, and CUDA isn't dead; see Blender in your open source catalog...

Still? (0)

Anonymous Coward | about a year ago | (#43803077)

Is anyone else getting real tired of companies purposely crippling their high end products in order to sell them for less money? It's like openly broadcasting that their cards cost way too much to begin with.

Re:Still? (5, Informative)

MetalliQaZ (539913) | about a year ago | (#43803153)

Is anyone else getting real tired of companies purposely crippling their high end products in order to sell them for less money? It's like openly broadcasting that their cards cost way too much to begin with.

It's a question of tolerances. The chips that come out of the fab are not 100% perfect. The designs are amazingly complex, and manufacturing usually introduces some defects. If a chip doesn't meet the high-end specs, they can disable the broken cores, relabel it as a mid-range part, and sell it for less money. That raises the effective yield and lowers the price for ALL of the products.
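
A hedged sketch of the binning idea in Python; the thresholds below are made up purely for illustration (real binning also weighs clock speed, leakage, and demand for each SKU).

<ecode>
def bin_gk110(working_smx):
    """Route a GK110 die to a SKU by how many of its SMX blocks passed test.
    Hypothetical thresholds, for illustration only."""
    if working_smx >= 14:
        return "GTX Titan bin (14 SMX enabled)"
    if working_smx >= 12:
        return "GTX 780 bin (12 SMX enabled)"
    return "set aside (scrap, or a future lower-end SKU)"

for good in (15, 14, 13, 12, 11):
    print(good, "working SMX ->", bin_gk110(good))
</ecode>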

Re:Still? (3, Informative)

bouldin (828821) | about a year ago | (#43803753)

As long as Nvidia keeps crippling double-precision performance on their (non-Tesla) cards, I'll keep buying AMD.

One of the highlights of the GTX Titan was that the card ran double-precision floating point at GK110's full rate (1/3 of single precision), just like one of Nvidia's Tesla products. That's no longer the case here: the GTX 780 performs double precision at 1/24 of the single-precision rate, just like a standard desktop GPU.
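
For a rough sense of scale, a Python sketch of the theoretical peaks, assuming the commonly cited base clocks (about 837 MHz for the Titan and 863 MHz for the GTX 780) and 2 single-precision FLOPs per core per clock; these are paper numbers, not benchmarks.

<ecode>
def peak_tflops(cores, clock_ghz, dp_ratio):
    sp = cores * 2 * clock_ghz / 1000.0   # peak single-precision TFLOPS
    return sp, sp * dp_ratio              # (single precision, double precision)

titan_sp,  titan_dp  = peak_tflops(2688, 0.837, 1.0 / 3)    # Titan: DP at 1/3 of SP
gtx780_sp, gtx780_dp = peak_tflops(2304, 0.863, 1.0 / 24)   # GTX 780: DP throttled to 1/24

print(f"GTX Titan: {titan_sp:.1f} TFLOPS SP, {titan_dp:.2f} TFLOPS DP")
print(f"GTX 780  : {gtx780_sp:.1f} TFLOPS SP, {gtx780_dp:.2f} TFLOPS DP")
</ecode>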

Re:Still? (2, Insightful)

Anonymous Coward | about a year ago | (#43805079)

Alternate description: "Nvidia lowers the cost of standard desktop GPUs by not including features for high-speed high-accuracy functions that serve no purpose in gaming".

Re:Still? (1)

eclectro (227083) | about a year ago | (#43804237)

The chips that come out of the fab are not 100% perfect.

While this may be true for these graphics cores, I don't think it's necessarily true for Intel's CPU chips. I think they have their design so refined that their yield is close to 100% for all but the highest density cores.

Otherwise they simply would not be able to offer multi core chips. Maybe someone in the know could comment on this.

Re:Still? (1)

adisakp (705706) | about a year ago | (#43804253)

FWIW, they did the same thing on the PS3, disabling one of the SPU cores to get higher yields. Even on machines where all the SPUs passed, they still had to disable one to "standardize" performance... so sometimes functional chips really are crippled to meet demand for lower specs, but more often it's done to attain higher yield. Intel did the same "crippling" on functional Pentiums at first to meet Celeron demand (before actually making a new die for Celerons). It's a little bit of both, to be honest.

Re:Still? (1)

s1lverl0rd (1382241) | about a year ago | (#43804925)

AMD sold tri-core processors for a while; most if not all of those were just quad-cores with one core either non-functional or intentionally disabled. Pretty smart move.

Re:Still? (1)

Bill of Death (777643) | about a year ago | (#43803169)

They aren't necessarily purposefully crippling, though that sometimes happens. This allows them to sell chips that have some manufacturing defects; they turn off the bad shaders. The more chips from each wafer they can sell, the lower the price of each chip.

Re:Still? (2)

blackC0pter (1013737) | about a year ago | (#43803201)

While this does happen, it is also a way for them to increase the yield of the chips. These are huge chips they are building, and the chance of bad cores on a die is rather high. So instead of junking chips with bad cores, since they cannot be sold as the ultimate high end, they create a cheaper product with those cores disabled. Not all disabled cores will be bad, but this does help them improve manufacturing efficiency. They may also have serious manufacturing issues producing these huge chips, so it might cost them an arm and a leg to build them right now. In that case they set the price abnormally high to control demand until they can iron out the manufacturing issues and improve the yield.

Re:Still? (0)

Anonymous Coward | about a year ago | (#43804127)

Shut up and go buy the expensive, full featured one then.

The Question on Everyone's Minds .... (-1)

Anonymous Coward | about a year ago | (#43803083)

Will it play WoW on high settings?

Re:The Question on Everyone's Minds .... (1)

Anonymous Coward | about a year ago | (#43803219)

Since WoW is CPU-bound the answer is a clear, resounding "it depends".

Graphics cards (1)

snidely (1589397) | about a year ago | (#43803097)

I'm a little surprised at people sniveling over $1,000.00 video cards. I was in printing for 10 years back in "the day," and a SuperMac Thunder card (with on-board JPEG acceleration) was $2,500.00.

Re:Graphics cards (1)

Anonymous Coward | about a year ago | (#43803249)

You can still buy $2500+ video cards. The difference is they are targeted to the professional crowd who need warranty and support on professional applications like CAD software, Photoshop, and the sort. The FirePro (AMD) and Quadro (NVidia) lines aren't exciting as far as gaming and hardware goes, but you're not paying for gaming, you're paying for the support and certification that it will work for various professional applications.

Re:Graphics cards (2)

TWX (665546) | about a year ago | (#43803671)

I am so glad that I grew up in the 3dFX Voodoo days for my gaming. The cards were relatively cheap, and spending $200 on one seemed like a huge sum of money. Like, this-is-your-only-Christmas-present money.

Upwards of $1000 for a consumer-grade video card? I've spent less on road-worthy vehicles.

Re:Graphics cards (1)

0123456 (636235) | about a year ago | (#43804217)

Meanwhile, my $200 graphics card runs all the games I own at max or high settings at 1920x1080.

This is a card for fanatics who want to run six monitors at > 1920x1080, not Joe Sixpack who bought a PC from Walmart.

Re:Graphics cards (1)

jma05 (897351) | about a year ago | (#43803383)

Nobody snivels at hardware at any price, as long as they are using it to make enough money to offset its cost. There are workstation GPUs just as pricey even now that people use for work.

Re:Graphics cards (1)

dywolf (2673597) | about a year ago | (#43804569)

What happened to 6 shader cores being a big deal? Now we're in quadruple digits? Holy cow.

Seriously? (1)

bogie (31020) | about a year ago | (#43805083)

Where have you been since 1997? Since then you've been able to consistently spend roughly $200 per video card and play the latest games at acceptable settings. That's why people sneer at a $1,000 card, let alone a $650 card, when that much money by itself can build a really fast computer.

More Coors is always better (4, Funny)

TWiTfan (2887093) | about a year ago | (#43803105)

More Coors takes the pain of remembering away.

Re:More Coors is always better (5, Funny)

fibonacci8 (260615) | about a year ago | (#43803213)

But then you have to drink something to forget you were drinking Coors.

Re:More Coors is always better (1)

smallfries (601545) | about a year ago | (#43803617)

Dude I think he covered that: Moore Coors.

Re:More Coors is always better (0)

RoboRay (735839) | about a year ago | (#43803709)

Moore Coors?

Re:More Coors is always better (1)

Anonymous Coward | about a year ago | (#43804297)

Moore Coors?

Yes, the amount of Coors doubles every 2 years.

Re:More Coors is always better (0)

Anonymous Coward | about a year ago | (#43804119)

If you're drinking Coors it's already easy to forget you're drinking anything at all.

I can get an entire laptop for that cost (5, Insightful)

MetalliQaZ (539913) | about a year ago | (#43803115)

I must have crossed the border into adulthood somewhere back there, because I would never pay that much for a performance uptick in a video game. I can get myself a nice new laptop for that cash, and it would still be proficient at 90% of today's games.

Re:I can get an entire laptop for that cost (0)

Anonymous Coward | about a year ago | (#43803217)

I must have crossed the border into adulthood somewhere back there, because I would never pay that much for a performance uptick in a video game. I can get myself a nice new laptop for that cash, and it would still be proficient at 90% of today's games.

EXCEPT, why be merely PROFICIENT???

Re:I can get an entire laptop for that cost (5, Insightful)

MachineShedFred (621896) | about a year ago | (#43803239)

The good news is that all the "I've got to have the latest and best to make all my friends and e-buddies drool" crowd will start unloading their barely-used last generation cards on eBay, and those of us that want good performance at a good price will benefit.

Re:I can get an entire laptop for that cost (0)

Anonymous Coward | about a year ago | (#43804085)

If you're in the market for buying a $650 video card on essentially impulse, are you really that concerned about selling your [probably] year old card on ebay for $150?

Re:I can get an entire laptop for that cost (0)

Anonymous Coward | about a year ago | (#43803273)

I am sorry for your loss.

Re:I can get an entire laptop for that cost (0)

Anonymous Coward | about a year ago | (#43803333)

Most people do not buy these cards, or GTX in general. They buy in a lower price range. This is what you buy if you want the very top end.

Re:I can get an entire laptop for that cost (0)

Anonymous Coward | about a year ago | (#43803401)

From this statement I can tell you aren't running a three or six monitor Surround array, which is the target market for this kind of GPU.

Also, it is not related to adulthood - I'm a 30 something, and I have a nice Surround array. And yes, I am considering an upgrade like this.

Re:I can get an entire laptop for that cost (0)

Anonymous Coward | about a year ago | (#43803869)

I think OP was referring to adulthood as a state of being, not an age. Maybe you just haven't hit it yet.

Re:I can get an entire laptop for that cost (0)

Anonymous Coward | about a year ago | (#43803663)

I want that rig. However, I want portability more with some semblance of a battery life. So that puts me into the same realm as you in buying laptops.

I used to drop 3-5k for a rig. Would still do it. But these days I want portable.

We will be seeing this sort of tech in laptops in about 3 years once they get the heat and power usage under control.

Re:I can get an entire laptop for that cost (0)

Anonymous Coward | about a year ago | (#43803675)

Probably in the minority but the CUDA/OpenCL ability of those cards was kinda drool worthy. So, one could use it as an excuse to buy a top end card for video ga...Err..I mean..to render texture maps faster.

Re:I can get an entire laptop for that cost (1)

parlancex (1322105) | about a year ago | (#43804243)

If by 90% of today's games you mean 90% of today's Facebook games, then yes, I suppose ;). But seriously, are games less technically demanding than they were 10 years ago? Yes. Are you really going to have that much fun trying to play actual modern 3D games on medium-low detail settings, no AA, low resolution, at ~30 fps? You might; obviously a lot of people wouldn't, myself included.

Re:I can get an entire laptop for that cost (2)

dywolf (2673597) | about a year ago | (#43804545)

1920x1080
High settings
1x AA (and honestly I don't even need that when running at this rez or higher)
60+ FPS in nearly all games, including your precious "modern 3D games"... because honestly they don't push any harder now than they did a while ago.

The card that does this for me? A 4-year-old GTS 250.

Either your standards are too high, or you're deluding yourself as to the actual worth of running at 400000x rez with 20xAA and 50 bajillion gigawatts of memory.
Probably both, as you've obviously fallen for the graphics = gameplay fallacy.

Re:I can get an entire laptop for that cost (0)

Anonymous Coward | about a year ago | (#43804617)

... or your standards are too low. Try running Far Cry 3 on that or something. GTX480 keeps up with it "ok", but the framerate drops do show up here and there. I'm sure a GTS250 would do great for Thief 3 or Quake 3 Arena, but that's moot.

Re:I can get an entire laptop for that cost (2)

gsnedders (928327) | about a year ago | (#43804671)

Note that a 2560x1600 panel has almost double (1.98x) the number of pixels of a 1920x1080 one, and given how ugly scaling tends to be, it can be entirely worthwhile to have a high end graphics card.

On the other hand, I still have a GTX 580 (and when I bought it, the mid-range card couldn't get a smooth framerate above 1920x1200), and I don't have any impetus to upgrade yet, as the difference isn't that great.
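
The 1.98x figure is just the pixel-count ratio, which is easy to check:

<ecode>
print((2560 * 1600) / (1920 * 1080))   # ~1.975, i.e. nearly twice the pixels to render per frame
</ecode>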

Re:I can get an entire laptop for that cost (1)

maroberts (15852) | about a year ago | (#43804895)

You should thank God (in whatever form you worship his Noodly Appendages) that people do buy high end devices such as graphics cards. These spendthrifts create the market and provide the initial downward push on prices so that the rest of us can afford them.

I still remember paying £2500 ($4000) for a 486DX-33 with 8MB of RAM and a 200MB hard drive. Those were the days.....

Well, some people like to spend money on hobbies (1)

Sycraft-fu (314770) | about a year ago | (#43804993)

Seriously, for some people, gaming is their hobby and that kind of money is not that much when you talk what people spend on hobbies. My coworker just bought himself like a $2000 turbo for his car, to replace (or augment, I'm not sure) the one that's already there. He has no need for it, but he likes playing with his car.

That you, and most others, don't want to spend that kind of money is understandable and not a problem. There's a reason why companies have a lineup of stuff and why the high-end stuff is just for those with plenty of money. It also doesn't scale linearly, since the higher end something is, the fewer units get sold, and so the more the fixed costs influence the unit cost.

However don't hate on it. That you don't wish to spend that kind of money doesn't mean that nobody should. Also you should be glad people do: The expensive parts fund the cheap parts. They can recover more R&D costs on these units, letting them sell lower end parts for less, since lower end parts are the same tech, just less of it.

Re:I can get an entire laptop for that cost (0)

Anonymous Coward | about a year ago | (#43805063)

Where is this $650 laptop which is proficient at 90% of games? How much time passes before it can no longer play even 50% of the new games? My friend got a $650 laptop recently and couldn't even buy BGEE because Intel Integrated Graphics are not supported.

2,304 cores = 1 line of 2K HD (2)

Sla$hPot (1189603) | about a year ago | (#43803123)

Ten years more and it will be one core per pixel. That's insane.

Are you insane? (0)

Anonymous Coward | about a year ago | (#43803173)

People complain that Macs cost too much and then you get news like this about video cards that cost as much as the entry-level Apple laptop.

Hell, I could pay both my rent and electricity bill for two fucking months for the price of one GTX Titan card.

Re:Are you insane? (1)

kannibal_klown (531544) | about a year ago | (#43803243)

Sounds more like a regional thing.

Here in upper/middle NJ, $1000 gets you a pretty bare-bones apartment in most towns. Maybe a 2-bedroom in a worse area.

Unless you're splitting rent with someone, $1000 doesn't carry much weight here.

Re:Are you insane? (0)

Anonymous Coward | about a year ago | (#43803403)

$385 per month for rent, plus an average of $50 for electricity. A three-bedroom apartment with a living room, in Canada.

Re:Are you insane? (1)

kannibal_klown (531544) | about a year ago | (#43803483)

Great, yet another reason I wish I was Canadian.

NJ has a high cost of living. Though it varies: if you want to live in western NJ near the PA border it's a little cheaper, and if you move way down to south NJ it's a little cheaper still. Though I haven't seen a nice apartment for $500.

But unfortunately there are a lot more jobs in northern NJ, and it's not worth driving 1+ hours each way, let alone 2.

Re:Are you insane? (0)

Anonymous Coward | about a year ago | (#43803717)

That should come with a "results not typical" disclaimer. In almost every major centre in Canada, rents are $1000+ for a three-bedroom, if you can find a place to live at all (many places have very low vacancy rates). The only way you can get a rent like that is by living in an extremely remote location like Flin Flon, or somewhere with low job prospects like Charlottetown.

Re:Are you insane? (0)

Anonymous Coward | about a year ago | (#43803741)

>Great, yet another reason I wish I was Canadian.
FYI: Goods, groceries in Canada cost more.

http://www.citynews.ca/2011/09/11/why-consumer-goods-cost-more-in-canada-than-in-the-u-s/
>“It depends on the equipment, but it can vary from 20 per cent to 45 per cent cheaper in the U.S.,” said one parent. “When you get into the high-quality products, the price difference is higher.”

Re:Are you insane? (0)

Anonymous Coward | about a year ago | (#43804595)

>Great, yet another reason I wish I was Canadian.
FYI: Goods, groceries in Canada cost more.

http://www.citynews.ca/2011/09/11/why-consumer-goods-cost-more-in-canada-than-in-the-u-s/
>“It depends on the equipment, but it can vary from 20 per cent to 45 per cent cheaper in the U.S.,” said one parent. “When you get into the high-quality products, the price difference is higher.”

Live in a border town and do your shopping in the US, Canadians make up about 50% of our business where I'm at.

Re: Are you insane? (1)

iamhassi (659463) | about a year ago | (#43803535)

Same in most major US cities. But people have to have a place to live; they don't have to play video games. If it's too much, don't buy it.

Re: Are you insane? (1)

jma05 (897351) | about a year ago | (#43803711)

Yes, but people living in higher cost of living centers generally have higher relative incomes as well. So it is less of a proportional hit on the budget than elsewhere.

Re:Are you insane? (0)

Anonymous Coward | about a year ago | (#43803351)

Of course, you could run a dozen instances of Apple's OS on the video card alone and still have resources left over to slam the shit out of any game on the market, but let's not mention that. Obvious Apple shill is obvious.

Re:Are you insane? (0)

Anonymous Coward | about a year ago | (#43803745)

Expensive video cards are magically able to run x86 code?

Re:Are you insane? (0)

Anonymous Coward | about a year ago | (#43803373)

Well, if you had that video card, your electricity bill would be higher...

Re:Are you insane? (1)

0123456 (636235) | about a year ago | (#43804155)

Hell, I could pay both my rent and electricity bill for two fucking months for the price of one GTX Titan card.

Then you're not the target market.

Some people buy $250,000 cars, some people buy $1,000 video cards, some people pay $2500 for Mac with the same hardware as a $1000 Windows PC. To most of us, they're just a curiosity.

2304 cores ought to be enough for anybody (0)

Anonymous Coward | about a year ago | (#43803197)

Well, by tradition I don't pay more than ~$250 for a graphics card, so it's interesting but still MUCH too expensive for me. But give it a year or so and it might be in that price range, at which point I'm probably going to say 2304 cores is plenty.

no linux driver no nvidia (-1)

Anonymous Coward | about a year ago | (#43803309)

NVIDIA doesn't care at all about linux users. As Linus Torvalds said: "Nvidia is the worst company regarding linux support! FUCK U NVIDIA!!".
There are a lot of bugs in the Linux drivers that have existed since 2011, and NVIDIA doesn't care, doesn't acknowledge that they exist, and doesn't even provide a workaround. The latest driver from NVIDIA is a JOKE for any laptop user running Linux, nothing more. As NVIDIA is the worst, any bad driver from AMD is still better. So who cares about NVIDIA?

Re:no linux driver no nvidia (2)

doublebackslash (702979) | about a year ago | (#43804187)

Does AMD support VDPAU these days? Because VA-API support is mighty poor in my experience. I broke down and bought a card, when I had a perfectly fine integrated one for my TV box, because I couldn't get VA-API to work with mplayer. There is a source version that supports it, though I couldn't get it to compile cleanly.

I also had a ton of trouble with the "legacy" vs new ATI drivers (the computer was low end, but only a few months old when this nonsense happened). Not sure what caused that split, but it was hell to get working on Ubuntu. Left a bad taste in my mouth all around. Think they resolved it, but it was just yet another barrier between me and what is normally a flawless experience.

Of course they'll pay... (-1, Flamebait)

chris200x9 (2591231) | about a year ago | (#43803325)

...hardcore gamers are idiots.

Re:Of course they'll pay... (0)

Anonymous Coward | about a year ago | (#43803375)

No I'm not, and I will not be getting this card. 1) I am an AMD guy for my graphics cards, and 2) That is a shit ton of money and I am just a poor college student.

Well, that's lack of competition for you... (2)

Gordo_1 (256312) | about a year ago | (#43803347)

If this were a previous generation, where AMD was actually still competitive, Titan would have been the high-end part, and it would have cost $500 instead of $1000. The part known as the GTX 780 would have been a slightly depopulated part capable of 90% of the performance for a 20% savings or so, and the rest of the line would have fallen under those two. Since AMD is no longer really a threat in the high-end GPU space, Nvidia can simply maintain the MSRPs of the old parts, as if the new parts were merely higher-performing extensions of the previous generation, without any downward pricing pressure on anything.

Re:Well, that's lack of competition for you... (0)

Anonymous Coward | about a year ago | (#43803555)

AMD is still competitive for anyone who needs a Linux driver. AMD drivers are bad; NVIDIA has the WORST drivers, according to Linus Torvalds. So, in his words, "FUCK U NVIDIA!"

Re:Well, that's lack of competition for you... (0)

Anonymous Coward | about a year ago | (#43804659)

That's great! I'm sure you can run DX11 games on Linux with that AMD too. Let's get real: it is still true that most any high-end game is Windows-only so far.

Yowsa (1)

Anonymous Coward | about a year ago | (#43803367)

Imagine a Beowulf cluster of these babies rendering images of Natalie Portman covered in hot grits!

Re:Yowsa (0)

Anonymous Coward | about a year ago | (#43803573)

Oh my EYES! Why did you have to say that, it's going to take a long time to get that image out of my head.... I need to barf now...

$500 is already close to insanity (2)

Alejux (2800513) | about a year ago | (#43803463)

Anything more than $250 is overkill; you're unlikely to find a game out there that can't be played well with a $250 card.

Re:$500 is already close to insanity (2)

ctrlshift (2616337) | about a year ago | (#43803689)

Definitely, don't ever shell out the money it costs to have the #1 best fastest hottest GPU (I wouldn't recommend the #2 either). It's basically digital viagra. Lasts about as long too.

Re:$500 is already close to insanity (0)

Anonymous Coward | about a year ago | (#43804165)

Lasts about as long too.

Ah, wouldn't know...

Re:$500 is already close to insanity (1)

cdrudge (68377) | about a year ago | (#43804225)

Doesn't that advice apply to just about everything? TVs, cars, medicine...The ultra high end or bleeding edge technology usually isn't "worth it". Yet that edge always seems to move along and what is today's best, most expensive tech is tomorrow's everyday tech.

Re:$500 is already close to insanity (0)

Anonymous Coward | about a year ago | (#43804027)

Not right now, because games are stripped down to what the consoles can handle. With the new consoles, we'll be back to seeing benefit from better video cards.

Re:$500 is already close to insanity (1)

0123456 (636235) | about a year ago | (#43804181)

With the new consoles, we'll be back to seeing benefit from better video cards.

Except the new consoles' graphics will probably be just about on par with next year's integrated GPUs.

Re:$500 is already close to insanity (0)

Anonymous Coward | about a year ago | (#43804197)

These sorts of cards really are beneficial if you want to go above 1080p. Personally, I have a 2560x1440 monitor but run games at 1080p so my $200 graphics card won't choke.

Bullshit VS Console (-1)

Anonymous Coward | about a year ago | (#43803581)

This is where I get to laugh when I hear all the PC junkie boys running to go spend another 800 bucks while I spend 400 on a gaming console that everybody makes fun of. Meanwhile, I get 8-10 years of use out of that console while you pc junkies spend the equivalent of a small used car on graphic upgrades that make no sense from a gaming perspective and definitely not monetarily.

Re:Bullshit VS Console (1)

UnknowingFool (672806) | about a year ago | (#43803665)

This is where you don't understand that maybe 1% of PC gamers will buy this card. Of course, these are the same people who will drop lots of money on bling like LED lights. Most gamers will spend much less and still have a PC that beats your 8-year-old console, which hasn't been (and can't be) upgraded when it comes to hardware.

Re:Bullshit VS Console (0)

Anonymous Coward | about a year ago | (#43804021)

Beats it by what? Are we going to count framerates, or how well/fun the game plays? I don't ever see dropped frames or lag on a console unless it's internet-related. I can't say the same for the shit that plays on my expensive PC.

Bitcoin / Litecoin mining? (1)

King_TJ (85913) | about a year ago | (#43803661)

Until I was asked to write a few tech. articles on bitcoin and other virtual currencies last year, I didn't really pay a lot of attention to them. But I've learned that high end ATI video cards are pretty much the "engines" required for any respectable bitcoin/litecoin mining rig to work successfully.

(As a rule, nVidia cards have been dismissed as not being as good performers as ATI for this specific use -- though I wonder how this GTX 780 would do?)

People building these mining rigs generally cram 3 - 4 of the cards on one motherboard, and run several identically configured machines at a time -- meaning a pretty hefty investment in video boards. It makes me wonder if this isn't really a significant reason for the sales of the more costly models, as opposed to the audience you'd assume was buying them -- 3D gamers?

Re:Bitcoin / Litecoin mining? (1)

NeutronCowboy (896098) | about a year ago | (#43803967)

ASIC machines are coming on the market for bitcoin mining. They're blowing any GPU rig out of the water in terms of BTC mined/watt-hour. GPU-mining is officially dead.

Re:Bitcoin / Litecoin mining? (0)

Anonymous Coward | about a year ago | (#43804621)

ASIC machines are coming on the market for bitcoin mining. They're blowing any GPU rig out of the water in terms of BTC mined/watt-hour. GPU-mining is officially dead.

They actually have to come out first.

Re:Bitcoin / Litecoin mining? (1)

Monkey (16966) | about a year ago | (#43804687)

Not for scrypt based coins like litecoin. GPUs are still the best option.

Re:Bitcoin / Litecoin mining? (0)

Anonymous Coward | about a year ago | (#43804931)

ASIC machines are coming on the market for bitcoin mining. They're blowing any GPU rig out of the water in terms of BTC mined/watt-hour. GPU-mining is officially dead.

If only someone were selling ASICs. Avalons are off the market; they aren't making a batch 4. The only other company is Butterfly Labs, and the verdict is still out on whether they are a long-con scam. If I could buy an Avalon for $10k I would do so today... alas.

In any case, I'm doing 1 GH/s of mining with 2 video cards (a 7870 and a 7950), making $6 a day (~$180 a month, with electric costs of $18). Once my bitcoin dollars build up again, I suspect I'll buy another GPU; at the rate the difficulty is climbing, it should still be profitable for a while longer. Plus, high-end video cards retain a lot of value, so I can resell them down the road. GPU mining isn't dead yet.
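
A back-of-the-envelope sketch in Python using the figures from this comment (daily revenue and monthly power cost are taken as stated; everything else is derived, and difficulty growth is ignored).

<ecode>
daily_revenue_usd = 6.00    # stated: ~$6/day at 1 GH/s across the two GPUs
monthly_power_usd = 18.00   # stated: ~$18/month in electricity
days_per_month    = 30

gross = daily_revenue_usd * days_per_month   # ~$180/month, matching the comment
net   = gross - monthly_power_usd            # ~$162/month before difficulty climbs further
print(f"gross ${gross:.0f}/mo, net ${net:.0f}/mo")
</ecode>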

Re:Bitcoin / Litecoin mining? (1)

Fweeky (41046) | about a year ago | (#43804899)

It'll perform a bit worse than a GTX Titan, which gets in the region of 330 Mhash/sec [bitcointalk.org]. For comparison, an AMD HD 5870 from 2009 managed about 400 Mhash/sec.

90% will buy AMD (0)

Anonymous Coward | about a year ago | (#43804145)

With AMD's A-series and the upcoming Kaveri architecture, 90% of desktop users will find what they require right there, at an amazing price. Only 10% or less require a massive, dedicated graphics card and/or a massive CPU.

I really think AMD is doing the most innovative work at the moment, and that they're on the right path with APUs and HSA.

Can somebody put this into non-gaming terms? (1)

Khopesh (112447) | about a year ago | (#43804291)

We do love our big numbers, but there are limits to what our eyes can perceive in FPS. What does this mean for real world applications like video encoding and password cracking? How long do we anticipate having to wait for tech like this to get affordable? Also, how does this compare to the nVidia Tesla, the current gold standard in password cracking?

I saw only one reference to nVidia Tesla (and no references to password cracking or video encoding) in those reviews (@Tech Report [techreport.com] ), and it might be damning:

Speaking of things that don't matter much, Nvidia has decided to scale back the GTX 780's capacity for double-precision floating-point math. Double-precision support is built into the GK110 GPU because of the chip's compute-focused role aboard Nvidia's Tesla products. Real-time graphics basically don't require that level of precision.

new cards are finally coming (0)

Anonymous Coward | about a year ago | (#43804447)

kinda off topic, but I hope Nvidia releases the successors to the GeForce 630 and 640 later this year. I need a new low-end card for my PCIe 1.1 slot.

yuo 7ail it. (-1)

Anonymous Coward | about a year ago | (#43804711)
