
NVIDIA Launches GeForce GTX 560 Ti 448-Core GPU

Unknown Lamer posted more than 2 years ago | from the more-cores-more-powers dept.

Graphics 127

MojoKid writes "NVIDIA has just launched the GeForce GTX 560 Ti with 448 cores. Though perhaps a bit unimaginative in terms of branding, the new GeForce GTX 560 Ti with 448 cores is outfitted with the same GF110 GPU powering the high-end GeForce GTX 570 and GTX 580 cards, but with a couple of its streaming multiprocessors fused off. The card has 448 CUDA cores arranged in 14 SMs, with 56 texture units and 40 ROPs. Reference specifications call for a 732MHz core clock with 1464MHz CUDA cores. 1.25GB of GDDR5 memory is linked to the GPU via a 320-bit bus, and the memory is clocked at an effective 3800MHz data rate. Performance-wise, the new GPU proved to be about 10 to 15 percent faster than the original GeForce GTX 560 Ti and a few percentage points slower than the GeForce GTX 570."
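For reference, the peak memory bandwidth implied by those numbers falls out of a one-line calculation; a minimal sketch, with the bus width and data rate taken from the summary above:

    // bandwidth.cpp -- peak memory bandwidth from bus width and data rate.
    #include <cstdio>

    int main() {
        const double bus_bits = 320.0;   // memory bus width, from the summary
        const double rate_mhz = 3800.0;  // effective GDDR5 data rate, from the summary
        // bytes per transfer times transfers per second
        double gbps = (bus_bits / 8.0) * rate_mhz * 1e6 / 1e9;
        printf("peak bandwidth: %.0f GB/s\n", gbps);  // 152 GB/s
        return 0;
    }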


Do they accept trade-ins? (3, Funny)

Anonymous Coward | more than 2 years ago | (#38216094)

I bought a 560 Ti just a month ago and now this? FFFFFFfffffffffff...

Re:Do they accept trade-ins? (5, Informative)

ebombme (1092605) | more than 2 years ago | (#38216466)

If it is an EVGA-brand card, I believe they offer a trade-in program within a certain window for situations just like this: you send in your old card, pay the difference, and they send you the updated card of your choice.

Re:Do they accept trade-ins? (2)

spacepimp (664856) | more than 2 years ago | (#38216818)

If it is an EVGA-brand card, I believe they offer a trade-in program within a certain window for situations just like this: you send in your old card, pay the difference, and they send you the updated card of your choice.

I just bought the 560 TI 2GB EVGA. Your post just made my day.

Re:Do they accept trade-ins? (0)

Anonymous Coward | more than 2 years ago | (#38216718)

In all probability, the applications you use don't achieve the theoretical peak performance of the hardware, and there's probably some dynamic clock-speed adjustment to manage core temperatures.

Re:Do they accept trade-ins? (4, Funny)

flimflammer (956759) | more than 2 years ago | (#38217478)

Probably not, but it doesn't hurt when you're thinking of future-proofing your investment in high end graphics card technology, especially considering how fast it moves.

Re:Do they accept trade-ins? (4, Insightful)

Yvan256 (722131) | more than 2 years ago | (#38217942)

future-proofing your investment in high end graphics card technology

Funniest post of the day.

Re:Do they accept trade-ins? (2)

hairyfeet (841228) | more than 2 years ago | (#38218864)

I have an even better question... does someone have a translator, please? Like, what's the difference between a stream processor and a core? My old HD4850 has 800 stream processors, but I have NO clue how to translate that into cores.

Frankly the whole CPU and GPU business is starting to give me a bit of a headache. Remember when all you needed to know was MHz? Now there's in-order and out-of-order, there are modules and full cores and hyperthreading, and it's all getting to the point where one really needs a lookup table handy to translate this stuff into numbers we can actually compare. While I doubt I'll be needing anything in this price range for me or my boys (they too are on HD4850s and quite happy, as am I), I do still have a few gamer customers as well as some engineering and graphics guys, and being able to translate this to English would be all to the good.

BTW, sorry to go a little OT, but does anyone know how well Nvidia cards work with SolidWorks? I have an older engineer who uses the program (he gets it from the college he volunteers for; Jesus, is that program expensive!) and the HD4650 I sold him seems to be doing okay with it, but they keep having the 210s on sale and I wonder if anyone has run the program on one of those. Just adding the HD4650 to his main PC gave it a hell of a speed boost, with no more graphics hangs when he rotates large models, but I'd hate to tell the guy to buy a 210 if it turns out Nvidia cards aren't so good with it. All the website says is to use the more expensive CAD cards, but frankly he doesn't need FirePro or Quadro levels of precision; he just needs the models to render quick and clean. He is a hell of a nice guy (and a former NASA engineer; getting to hold some of the actual shuttle plans? Sweeet!) and I'd really like to steer him right.

Re:Do they accept trade-ins? (0)

Anonymous Coward | more than 2 years ago | (#38219816)

Frankly the whole CPU and GPU business is starting to give me a bit of a headache.

One of the reasons I switched to a Mac. Tired of the Microsoft bullshit and tired of the upgrade-treadmill bullshit. Now all I need to decide is whether I want something portable or not, built-in display or not. Then I have two or three models to choose from. Once you apply your budget to your choices, you might not even have a choice to make, since you can only afford the low-end model. It's that easy!

Re:Do they accept trade-ins? (1)

mikael (484) | more than 2 years ago | (#38220548)

I believe a stream processor is a processor dedicated to processing pipelines of data: more optimized for sequential memory access and arithmetic/trigonometric calculations than for branching and conditional instructions.

A core is just a basic CPU, with or without an FPU, but with support for multiple threads (shared code/data space but separate execution points). Multiple cores will have specially adapted cache memory to allow data sharing.

Unfortunately there's not a good translation (1)

Sycraft-fu (314770) | more than 2 years ago | (#38220714)

nVidia and AMD do it differently. nVidia counts each processing sub-unit as a "core". They aren't quite cores as you'd think of them on a CPU, but similar. AMD counts each execution sub-unit within their "cores" as a "stream processor". So, roughly speaking, for the 4000 and 5000 series there are 5 SPs for each core.

That doesn't quite tell the whole story, though, as what each core can do differs between the vendors. To the extent the number is useful information at all, it is only for comparing products within one vendor's lineup.

Also there are other things that matter but aren't advertised as much, like ROPs (Render Output Units) and TMUs (Texture Mapping Units). Those deal with rasterizing pixels and textures respectively, so they relate to rendering speed as well. Those specs used to be all that graphics cards talked about, before cards had the processing section that makes them a GPU.

Really the thing to do is this: decide how much you can spend, decide what you wish to do, then go read reviews that show how the cards actually work with that software. For me as a gamer, I read HardOCP. They take the cards and actually play games with them, and you can see how well they fare.

SolidWorks plays fine with nVidia cards. Probably better than with an ATi card, since nVidia's OpenGL drivers are top notch (just as fast and fully featured as their DX drivers) and ATi's are not quite as optimized. As for an nVidia 210, those are very low end, lower than a 4650, so I don't know how much it'd help. It'd still probably be head and shoulders above integrated graphics, though.
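To make the nVidia-side counting concrete: the SM count is queryable at runtime, and the cores-per-SM multiplier is fixed per architecture. A minimal host-side sketch using the CUDA runtime API (assumes a CUDA toolkit is installed; the cores-per-SM table is hard-coded from nVidia's published Fermi specs, so treat it as illustrative):

    // query_cores.cu -- print SM count and estimated CUDA core count.
    // Build with: nvcc query_cores.cu -o query_cores
    #include <cstdio>
    #include <cuda_runtime.h>

    // CUDA cores per SM, by compute capability (Fermi-era values only).
    static int coresPerSM(int major, int minor) {
        if (major == 2 && minor == 0) return 32; // GF100/GF110: GTX 570/580, 560 Ti 448
        if (major == 2 && minor == 1) return 48; // GF104/GF114: original GTX 560 Ti
        return -1;                               // unknown architecture
    }

    int main() {
        cudaDeviceProp prop;
        if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
            fprintf(stderr, "no CUDA device found\n");
            return 1;
        }
        int per = coresPerSM(prop.major, prop.minor);
        printf("%s: %d SMs\n", prop.name, prop.multiProcessorCount);
        if (per > 0) // e.g. the 560 Ti 448: 14 SMs x 32 = 448 cores
            printf("estimated CUDA cores: %d\n", prop.multiProcessorCount * per);
        return 0;
    }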

Re:Do they accept trade-ins? (-1)

Anonymous Coward | more than 2 years ago | (#38221026)

Which is faster: a 4MHz Z80 or a 2MHz 6502?

frist (-1)

Anonymous Coward | more than 2 years ago | (#38216104)

frsty pisss

Re:frist (-1)

Anonymous Coward | more than 2 years ago | (#38218042)

nope.avi

And the name is the same? (1)

Anonymous Coward | more than 2 years ago | (#38216128)

They couldn't call it a 561 Ti, or a 560 Pt? Or Au, or Ir, or whatever other element is "better" than Titanium.

Re:And the name is the same? (3, Funny)

Opportunist (166417) | more than 2 years ago | (#38216416)

One up would be Vanadium, V. And why not, it has a pretty cool ring to it, as would 560V, don't you think? Sounds like it has a really high voltage. :)

I mean, you have to leave some room for the future; if you call it the 560pt, you're soon at Gold, Mercury, Thallium and... well, Lead. And I dunno if Lead is really what you want attached to your graphics card's name; it doesn't really sound "fast"...

Re:And the name is the same? (0)

Anonymous Coward | more than 2 years ago | (#38216620)

It's been a while since I've looked at a periodic table; I was just going with a name theme, not so much a location on the chart.

And I just looked at one, and damn, Titanium is element 22? I guess I never paid much attention to it. No surprise there, mind you; not like we used it much in chemistry class.

Re:And the name is the same? (1)

Shark (78448) | more than 2 years ago | (#38217390)

I know of some lead objects that move really fast; they come out the business end of a gun ;) We could call it the 560 Bullet when we get to that.

Re:And the name is the same? (0)

Anonymous Coward | more than 2 years ago | (#38216634)

Yea, NVIDIA fails eternally at numbering conventions. ATI's numbering might be a little confusing, but it is actually meaningful if you know how to read it; NVIDIA's numbering just seems arbitrary. The problem here, though, is that it's not actually an "upgraded" 560, it's a downgraded 570, a situation that I don't really think fits anyone's numbering convention. That said, they still should have called it a 565 Ti, which would have at least sucked slightly less.

fused off? Really?! (1)

G3ckoG33k (647276) | more than 2 years ago | (#38216184)

"but with a couple of its streaming multiprocessors fused off."

fused off? Really?!

I don't know how they would do that, other than by just not connecting them in the blueprints.

Or are they just "defective" 570s and 580s relabeled?

Re:fused off? Really?! (5, Informative)

rahvin112 (446269) | more than 2 years ago | (#38216324)

They use lasers to cut traces on the processor, and firmware to disable what they can't cut. The chips are designed with this ability so they can bin and disable features to differentiate models and to use parts with defects. All the different model numbers are just binned parts with the bad sections disabled.

Re:fused off? Really?! (1, Flamebait)

mr1911 (1942298) | more than 2 years ago | (#38216442)

Or they use fuzes, which leads to the "fuzed off" language. The fuzes are generally only accessible via the IC tester and the pads are not bonded out when packaged.

Re:fused off? Really?! (0)

Anonymous Coward | more than 2 years ago | (#38216588)

Fuze? Is that some type of trademarked name for the concept of a fuse?

Re:fused off? Really?! (4, Funny)

mr1911 (1942298) | more than 2 years ago | (#38216880)

Fuze? Is that some type of trademarked name for the concept of a fuse?

I get my feeling hurt when no one makes snotty comments on my posts. It was troll bait. And it worked.

Thank you.

Re:fused off? Really?! (2)

jones_supa (887896) | more than 2 years ago | (#38217686)

Well played, sir.

Re:fused off? Really?! (2)

Ellis D. Tripp (755736) | more than 2 years ago | (#38218248)

Actually, "fuze" is a proper spelling--assuming you are talking about a timer/impact sensor/radio type device that sets off a bomb or artillery shell. The string or cord that you light with a match to set off a simpler type of explosive or firework is spelled with an "s", however.

When referring to an electrical overcurrent device, it is also spelled with an "s".

Re:fused off? Really?! (0)

Anonymous Coward | more than 2 years ago | (#38216394)

While I did not RTFA, I did read some other stories on the new cards that said this is a limited run of cards based on 580s that have defective cores which are disabled.

Re:fused off? Really?! (4, Informative)

mprinkey (1434) | more than 2 years ago | (#38216420)

Probably referring to efuses that can be burned out on the die. These are common and allow CPUs/GPUs to have unit-specific information (like serial numbers, crypto keys, etc.) coded into otherwise identical parts from the fab. Video game systems like the 360 use them as an anti-hacking measure, disallowing older versions of firmware from running on systems that have certain efuses "blown." Likely, there is an efuse for each core or group of cores. Those can be burned out if the cores are found to be defective, or simply to cripple a portion of the part for down-binning. That is a practice at least as old as the Pentium II.

yes, really (4, Informative)

OrangeTide (124937) | more than 2 years ago | (#38216458)

All parts have defects, although sometimes companies don't test thoroughly enough to find them, and usually the defects don't impact normal operation. But when the potential for problems exists, they are forced to either scrap the part or bin [wikipedia.org] it as a lower-spec part. Binning improves yield and helps keep prices down on the higher-end parts that do pass tests.

But here's the problem with binning from a marketing standpoint: "and a few percentage points slower than the GeForce GTX 570". This binned 570 is $60+ cheaper and will likely slide down to the old 560 Ti (naming is confusing!) prices. So now they've created a cheaper version with almost the same performance, and run the risk that customers will choose the cheaper product over the more expensive (and, I assume, higher-margin) product.

Re:yes, really (0)

AdrianKemp (1988748) | more than 2 years ago | (#38216598)

Oh I wouldn't jump to the conclusion that the more expensive card is higher margin...

My information is all third-hand and quite possibly bogus, but everything I've heard and learned over the years about graphics cards is that the highest-performance cards are low margin. They make them because they have to, more than anything. Since you're already doing all the R&D to make faster cards, you might as well sell some. Meanwhile, the high-performance cards from three years (or more) ago that now cost three bucks to make are sold at 80% or more profit.

However, I will freely admit that I don't see how that situation works with Intel doing on-chip graphics for a lot of the workstation/mid-range market. So that information could be out of date, or could never have been right in the first place.

Re:yes, really (0)

Anonymous Coward | more than 2 years ago | (#38216954)

Not in the case of a binned part. Production goes all the way to the testing phase as if it were a high spec part. All costs up to this point are now sunk. For those that pass testing, they package and sell at a high price point. For those that don't pass, further processes happen, which results in more cost and a lower margin, and the final product is sold at a lower price point, further reducing the margin.

Re:yes, really (0)

Anonymous Coward | more than 2 years ago | (#38217036)

The high end cards are almost always high margin per unit sold. They just contribute the least to profits because they don't sell that many. The mid-range or "sweet spot" cards are lower margin per unit, but bring in much more money due to volume sold.

Re:yes, really (1)

Anonymous Coward | more than 2 years ago | (#38216688)

They have probably been filling the bin since the 570s started rolling out of the factory. I'm sure they waited until they had stopped producing regular 560s and the remaining supply was waning before announcing this "new" 560. They probably have a person or team of people dedicated to figuring out the perfect time to stop production of one card and release another. This isn't a small, inexperienced start-up we're talking about here.

Re:yes, really (5, Informative)

billcopc (196330) | more than 2 years ago | (#38217120)

Strictly speaking, it costs the same to make this new 560 Ti chip as a balls-out 580 chip. They're identical from the fab's perspective. In practice, the 560 Ti is a way to maximize yield by salvaging defective 580s. This is very much like Celerons being Pentiums with the defective parts lasered or fused off.

If it weren't for this binning, they would have to toss these chips in the garbage. In theory, salvaging imperfect chips allows them to price things more aggressively across the product line, since the sunk cost of manufacturing is averaged out over a much greater volume.

Here's an example; note these numbers are purely arbitrary, as I know nothing about fab economics:

Suppose they had to throw 4 out of every 5 GTX chips away, and each one cost $100 to make; then each good GTX would cost $500 on average. The yield is thus 20%.

If, instead, they can sell those 4 bad chips as lower-spec products, each chip costs $100. The yield is now 100%.

So yes, the higher binned cards theoretically cost the same to build as the cheap ones (assuming similar memory/outputs/PCB). They are thus higher-margin, but also scarce due to manufacturing limitations. The smaller the pitch, the harder it is to produce a perfect chip. This scarcity is what leads to the increased cost. After all, if a 580 cost the same as a 560, everyone would want the faster one and no one would buy the rejects. OEMs partially compensate by bundling a bunch of stuff with the high-end cards, like pack-in games, DP converters and assorted swag - cheap stuff with a higher perceived value. Put it this way: a recent title like Battlefield 3 might sell for $69 in stores, but you can be sure the OEM is paying much less to bundle it with their product. That goes a long way toward buttering up prospective buyers.

I'm sure NVIDIA would rather be stuck with a handful of "true" 570s than a shit ton of useless defective chips. People will choose the cheaper product, that's the point! If it weren't for binning, these chips would be worth zero dollars, destined for the incinerator. Pricing doesn't really matter that much, this late in the generation. The GTX 6xx (Kepler) is due out in March 2012, and will probably be available only as a high-end part at first. Bleeding-edge nutjobs like myself will be able to blow $1500 on a pair of the latest and greatest, while the sane people buy out the remainder of 5xx inventory at clearance prices, and only then will the new low-end cards be launched. They've got it down to a science.
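Putting the arbitrary numbers from a few paragraphs up into code, just to make the yield arithmetic explicit (a sketch; the dollar figures are the same invented ones, not real fab economics):

    // binning_math.cpp -- effective cost per sellable chip, with and without binning.
    #include <cstdio>

    int main() {
        const double chips_made = 5.0;   // chips fabbed per batch (invented)
        const double full_spec  = 1.0;   // chips passing the full-spec (580) test
        const double cost_each  = 100.0; // manufacturing cost per chip (invented)

        // No binning: the whole batch's cost lands on the one good chip.
        double no_binning = chips_made * cost_each / full_spec;    // $500

        // Binning: every chip sells in some tier, so each carries its own cost.
        double with_binning = chips_made * cost_each / chips_made; // $100

        printf("cost per sellable chip: $%.0f without binning, $%.0f with\n",
               no_binning, with_binning);
        return 0;
    }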

Re:yes, really (1)

drinkypoo (153816) | more than 2 years ago | (#38218792)

Bleeding-edge nutjobs like myself will be able to blow $1500 on a pair of the latest and greatest, while the sane people buy out the remainder of 5xx inventory at clearance prices, and only then will the new low-end cards be launched. They've got it down to a science.

...and some of us will get them years later, used, and they'll still run everything we want to run. And on behalf of this group, I want to thank you for spending the big bucks so that they're motivated to keep cranking out newer, bigger, and faster cards, which I enjoy but do not want to pay full price for.

Re:yes, really (2)

isorox (205688) | more than 2 years ago | (#38218808)

Suppose they had to throw 4 out of every 5 GTX chip away, and each one cost $100 to make, then each good GTX would cost $500 on average. The yield is thus 20%.

If instead, they can sell those 4 bad chips as lower-spec products, each chip costs $100. The yield is now 100%.

But what if the availability of those $100 chips reduces demand for the $500 ones? You could end up selling the four $100 chips but be unable to shift the $500 one.

Look at airlines. They fly a plane from London to New York, it has 200 economy seats, 50 business seats.

Economy sell for $500. Business for $2000.

After everyone's on board, the cabin crew notice that business class has only 30 people in it (economy has 150). They hold an auction to upgrade economy travellers, and the remaining 20 seats go for $200 each.

On the face of it, the cabin crew have just raised an extra $4,000 for their airline.

however

On the next flight, business class is empty and economy has 180 people. The cabin crew do the same auction, and this time the seats go for $250 each. Wonderful, they've now made $12,500, at least on the surface.

The problem is, the people who would normally spend $2,000 on a business ticket will now buy economy for $500 and pay for the upgrade ($250).

The first flight would have made 30*2000 + 150*500 = $135,000
The upgrades would have bumped that up to $139,000

However, the effect of the upgrades was that the second flight sold no full-fare business seats. The flight makes 180*500 + 50*250 in upgrades, or $102,500.

Sometimes it's worth throwing away stock, or seats, to protect your market.
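The parent's arithmetic, checked in code (same invented numbers: 200 economy seats at $500, 50 business at $2,000):

    // upgrade_cannibalization.cpp -- the airline example, worked through.
    #include <cstdio>

    int main() {
        // Flight 1: business demand intact, 20 spare seats auctioned at $200.
        double flight1 = 30 * 2000.0 + 150 * 500.0 + 20 * 200.0; // $139,000

        // Flight 2: former business passengers buy economy + a $250 upgrade.
        double flight2 = 180 * 500.0 + 50 * 250.0;               // $102,500

        printf("flight 1: $%.0f\n", flight1);
        printf("flight 2: $%.0f\n", flight2);
        printf("cost of training customers to wait for upgrades: $%.0f\n",
               flight1 - flight2);
        return 0;
    }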

Correct and what people need to understand (1)

Sycraft-fu (314770) | more than 2 years ago | (#38220768)

Is that you pay by the wafer when making chips. A wafer costs a certain amount to make depending on the process, the size, the fab and so on; that is how the company making the GPUs is charged. So the more chips that come off the wafer, the more the cost of that wafer can be spread out. That means not only having smaller chips, which of course fit more per wafer, but having fewer defective chips.

Hence binning based on which units work (or don't). As you say, it brings up yields and thus brings down unit cost.

This is particularly useful for large chips. Defects are the kind of thing that occur at a certain frequency per wafer. So if you have a wafer that has, say, 4 defects roughly evenly spaced around it, you probably have 4 chips with failures. If your chips are small and you fab 2000 of them per wafer, no big deal, just toss them. However, if your chips are huge and you fab 20 of them per wafer, that is a massive failure rate. If you can instead simply shut down the damaged section and bin it as a lesser chip, you can bring yield back up.

That's also why chips that are designed smaller can cost less. That is more or less what lower-end GPUs are: they just design a smaller chip, with fewer of the units from the bigger one. You can then fit more of them on a wafer, and thus they cost less.
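That defects-per-wafer intuition has a standard textbook form: with a defect density D and die area A, the simple Poisson yield model says the fraction of perfect dies is Y = e^(-D*A), so yield collapses exponentially as dies get bigger. A quick sketch (the defect density is invented; the big-die area is roughly GF110-class, around 520 mm^2):

    // poisson_yield.cpp -- why big dies hurt: Y = exp(-D * A).
    #include <cstdio>
    #include <cmath>

    int main() {
        const double D = 0.5;         // defects per cm^2 (invented for illustration)
        const double small_die = 1.0; // cm^2, a low-end GPU
        const double big_die   = 5.2; // cm^2, roughly a GF110

        printf("small die: %.0f%% perfect\n", 100.0 * std::exp(-D * small_die)); // ~61%
        printf("big die:   %.0f%% perfect\n", 100.0 * std::exp(-D * big_die));   // ~7%
        // Salvage binning recovers much of that loss: a big die with one
        // defect in a redundant block can still ship as a lesser part.
        return 0;
    }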

Re:yes, really (1)

Kjella (173770) | more than 2 years ago | (#38217320)

It's half and half; obviously quite a few are binned because of real defects. This, I suspect, is also why Intel has so many confusing variations with various virtualization features. But it also happens that the mix isn't what the market demands: for example, say they're producing too many fully functional 2600Ks while the market wants 2500Ks. The people who do the math say those customers can't be sold up to the 2600K because they're cash-limited, and slashing 2600K prices would reduce total revenue from all the other sales, making it unprofitable. Basically, you need more of a limited chip to fill a certain price point. Then it happens that they fuse off fully functional parts to do it, and in the past there have been hacks for some products to unlock these parts. Sometimes it's been ignored because so few do it; sometimes they've resorted to more drastic means to make sure it's physically impossible to activate them again.

Some people get annoyed when they learn that their hardware has been intentionally crippled like that, but you got what you paid for. That it's actually cheaper to make one product and sell multiple variations is more of an internal matter, just like most of the time the different versions of a piece of software are just a few #ifdefs in the code. If you bought Photoshop Express, you got that, and it's a bit odd to complain that you only got a crippled Photoshop CS at that price. Also, it's not like they must instantly bin everything to a current product; sometimes they save particularly good or bad chips to introduce a low-volume better/worse model. It's complicated, but at the end of the generation they like to have sold as many of the chips as possible for the most they could. A thrown-away chip is, after all, a wasted chip you got nothing for.

Re:yes, really (1)

Gavin Scott (15916) | more than 2 years ago | (#38218030)

It has been suggested that this will be a limited-production chip/card.

They may have simply accumulated enough Fermi chips with a particular defect profile that it made sense to introduce this "new" version, letting them clear out that accumulated inventory.

G.

Re:fused off? Really?! (0)

Anonymous Coward | more than 2 years ago | (#38216560)

Or are they just "defect" 570 and 580 relabeled.

Basically this.

The GPU is designed with various chunks that can be fuzed off if necessary. Manufacturing defects might result in a nearly-perfect chip... so they just fuze off the bad chunk and sell the rest of it at a slightly lower price.

Features! (0)

Anonymous Coward | more than 2 years ago | (#38216240)

And it makes toast!

Cha-ching BitCoin! (3, Funny)

stevegee58 (1179505) | more than 2 years ago | (#38216250)

Yay gotta get me one o' those!

Re:Cha-ching BitCoin! (0)

Anonymous Coward | more than 2 years ago | (#38216388)

Yeah, because NVIDIA cards are so awesome for Bitcoin... /sarcasm

Re:Cha-ching BitCoin! (1)

Mashiki (184564) | more than 2 years ago | (#38216502)

Last time I looked, it cost more in hydro to make a bitcoin than its actual value.

Re:Cha-ching BitCoin! (1)

omglolbah (731566) | more than 2 years ago | (#38216784)

Not if you rent with power included in your bill :p

(Not that I'd bother with something like that, but I know people who would... nutters)

Re:Cha-ching BitCoin! (1)

stevegee58 (1179505) | more than 2 years ago | (#38216832)

What's hydro? Some kind of Canadian thingy?

Re:Cha-ching BitCoin! (0)

Anonymous Coward | more than 2 years ago | (#38216522)

The only thing more useless than BitCoin are the people who think it's actually good for something. Get a job you lazy cunt.

The summary doesn't make it clear... (5, Funny)

Anonymous Coward | more than 2 years ago | (#38216280)

The summary doesn't make it clear, but... how many cores does this new video card have?

Re:The summary doesn't make it clear... (2, Funny)

ebombme (1092605) | more than 2 years ago | (#38216406)

The summary doesn't make it clear, but... how many cores does this new video card have?

448 cores

Re:The summary doesn't make it clear... (1)

Anonymous Coward | more than 2 years ago | (#38216568)

Whoosh? The AC was making a joke, numbnuts. The 448 core thing is mentioned in the title and the first three sentences of the summary and they were mocking the redundancy of it all.

Re:The summary doesn't make it clear... (-1)

Anonymous Coward | more than 2 years ago | (#38216642)

*whoosh* subtle humor is evidently too subtle for you... to repeat it again is part of the joke.

Re:The summary doesn't make it clear... (2)

jones_supa (887896) | more than 2 years ago | (#38217062)

On the same note, even the "The AC was making a joke, numbnuts." message could in theory have been humor too. :)

Re:The summary doesn't make it clear... (2)

Tarsir (1175373) | more than 2 years ago | (#38218990)

It's jokes all the way down!

Re:The summary doesn't make it clear... (1)

Surt (22457) | more than 2 years ago | (#38218132)

Worse, the 448 core thing is actually in the product title, so it isn't even the fault of the summary.

Re:The summary doesn't make it clear... (-1)

Anonymous Coward | more than 2 years ago | (#38216624)

Whoosh? :-)

Re:The summary doesn't make it clear... (-1)

Anonymous Coward | more than 2 years ago | (#38216978)

448 I 448 think 448 it 448 is 448 cores 448.

P.S. 448

Re:The summary doesn't make it clear... (1)

timeOday (582209) | more than 2 years ago | (#38217580)

More importantly, what does "core" really mean in this context?

Re:The summary doesn't make it clear... (0)

Anonymous Coward | more than 2 years ago | (#38218376)

Who effing cares? This one has 448! Way more than eleven.

Re:The summary doesn't make it clear... (1)

bberens (965711) | more than 2 years ago | (#38218492)

I think it has something to do with the Marines.

Crazy (4, Funny)

Guppy (12314) | more than 2 years ago | (#38218842)

Meanwhile, at AMD/ATI Headquarters:

"Well, fuck it. We're going to 449 cores."

Nah, far too few (1)

Sycraft-fu (314770) | more than 2 years ago | (#38220636)

ATi redefined what they call a "core" some time ago to basically mean each sub-processing unit of their cores, so they say their Radeon 6970s have "1536 stream processors". I'm sure at some point nVidia will redefine theirs to mean each bit of an operator or something, and we'll have cards with millions of "cores" before long.

Marketing. (1)

hollywoodb (809541) | more than 2 years ago | (#38216290)

What is with the branding scheme on these things? I see a summary with lots of letters and numbers and almost no useful information as to what the hell good they all are.

Re:Marketing. (0)

Anonymous Coward | more than 2 years ago | (#38216596)

Moore's Law-style performance, at Moore's Law x 5 prices. For Moore's Law x 1 prices, see the cheaper models, which are Moore's Law x 0 relative to performance from about 5 years back.

Re:Marketing. (1)

E IS mC(Square) (721736) | more than 2 years ago | (#38216972)

To give Slashdot some credit, I do see a few words in between, such as "high-end", "performance" and "faster".

Fun movie for the whole family! (-1)

Anonymous Coward | more than 2 years ago | (#38216340)

I really loved that movie. http://www.imdb.com/title/tt0436339/

what else to say (1)

xorbe (249648) | more than 2 years ago | (#38216350)

They already implied that these are salvaged 570 parts for a limited holiday production run. If you snag one with a free game you want, not a bad deal. Think carefully if you might go SLI in the future, since they'll be hard to find later.

Re:what else to say (1)

billcopc (196330) | more than 2 years ago | (#38217188)

Not so hard to find, if you check ebay or craigslist.

The ones that are truly hard to find are the super-high-end dual-GPU cards. And by "hard to find", I mean you usually have to pay some goddamned scalper a ton of money because the retail inventory sold out in mere weeks.

But Zotec? (0)

Anonymous Coward | more than 2 years ago | (#38216430)

Jeez, I don't like them.

Re:But Zotec? (0)

Anonymous Coward | more than 2 years ago | (#38216590)

That's because it's easier to get cards from them for "free" for a review site than ones like EVGA. They are a good manufacturer nonetheless.

Those numbers (-1)

Anonymous Coward | more than 2 years ago | (#38216492)

...give me a hardon

Impressive specs (3, Insightful)

ifiwereasculptor (1870574) | more than 2 years ago | (#38216570)

As a whole, it's impressive that we can build such a thing. It's equally impressive that the number one reason for such an advanced piece of technology is so people can virtually shoot the currently unfashionable Eastern Europeans using more polygons.

Re:Impressive specs (1)

Anonymous Coward | more than 2 years ago | (#38216920)

It's equally impressive that the number one reason for such an advanced piece of technology is so people can virtually shoot the currently unfashionable Eastern Europeans using more polygons.

Speak for yourself. Some of us use these things in HPC projects and couldn't care less about their ability to render 3D shit.

Re:Impressive specs (0)

Anonymous Coward | more than 2 years ago | (#38218454)

It's equally impressive that the number one reason for such an advanced piece of technology is so people can virtually shoot the currently unfashionable Eastern Europeans using more polygons.

I don't get what point you are trying to make. That an entertainment device was created to be sold to an entertainment market instead of curing cancer?

Video cards render graphics, that's what they are for. How about complaining about how the extremely high-quality (DPI) screens on iPhones were invented for people to play angry birds and make phone calls instead of for use in medical imagers?

Expensive much? (3, Informative)

RobinEggs (1453925) | more than 2 years ago | (#38216772)

I still can't fathom spending $300 on a video card... and feeling like I got a slammin' deal in the process.

What happened to the red-hot competition of 2008, when I built my first modern system and got a newly released Radeon 4850 for $150? That card was maybe the fourth most powerful you could get; there was no serious improvement to be had without adding more dies, via either X2 cards or CrossFire.

Today the 560 Ti and the 6950 occupy the same relative position in the hierarchy of GPUs that my 4850 held in 2008... yet rather than being brand new and $150, those two cards are almost a year old and $250-$300.

Ouch.

Re:Expensive much? (1)

modecx (130548) | more than 2 years ago | (#38217056)

I remember when a megabyte of RAM cost about $150, so I don't feel too bad about what a $150 video card will do these days. That's about where the sweet spot is, unless you really need to push very high resolution displays.

Re:Expensive much? (3, Interesting)

Calibax (151875) | more than 2 years ago | (#38218862)

I remember when a megabyte of RAM dropped to under $1,000,000, when we switched from core to semiconductor technology circa 1972. And we thought it a great advance.

Now get off my lawn, noob.

Re:Expensive much? (1)

Raenex (947668) | more than 2 years ago | (#38220550)

Considering the price of computers these days and the diminishing returns on graphic cards, I'd say $50 is the sweet spot.

Seriously, the games were already looking pretty amazing not long after 2000. Just how much better does a game look with this year's model versus a card from three years ago?

Re:Expensive much? (0)

Anonymous Coward | more than 2 years ago | (#38220598)

Seriously, the games were already looking pretty amazing not long after 2000. Just how much better does a game look with this year's model versus a card from three years ago?

For most of the last three years PC game graphics have been limited by the need to also run on consoles with antique GPUs. It's only recently that game developers seem to have decided that they might benefit from building games that actually take advantage of PC hardware.

Re:Expensive much? (2)

mikael (484) | more than 2 years ago | (#38217804)

My first PC in 1988 was around £2000 or $3000.

Now even smartphones and USB sticks are getting GPUs to do texture-mapping at HD resolutions.

To think that it used to cost $150,000+ just to get a basic 24-bit color framebuffer and a basic graphics API.

Re:Expensive much? (0)

Anonymous Coward | more than 2 years ago | (#38218098)

You're off by a bit there. The 460 is equivalent to what your 4850 was in 2008. 460s can be had nowadays for less than $100 when they go on sale. It's still a very good card, even for modern games.

Re:Expensive much? (1)

Osgeld (1900440) | more than 2 years ago | (#38221436)

A 9600GT is still a pretty good card for modern games if you're around 720p (I use a 1280x1024 screen), given that all modern games are made to run on consoles, which had stale video cards in them at launch.

Re:Expensive much? (1)

UnknownSoldier (67820) | more than 2 years ago | (#38218368)

> I still can't fathom spending $300 on a video card....and feeling like I got a slammin deal in the process.

Did you miss the $300 Radeon 5970 on the NewEgg Black Friday sale too? =)
http://www.newegg.com/Product/Product.aspx?Item=N82E16814103195 [newegg.com]

While I agree it's hard to justify that price point, that kind of bang/buck is phenomenal!
http://www.tomshardware.com/reviews/battlefield-3-graphics-performance,3063.html [tomshardware.com]

Specifically ...
http://www.tomshardware.com/reviews/battlefield-3-graphics-performance,3063-8.html [tomshardware.com]

Re:Expensive much? (1)

karnal (22275) | more than 2 years ago | (#38221132)

I bought a new 8800GT at $300 and felt I got my money's worth out of it. Of course, now I've got a gently used GTX 280 (for free) and haven't played games for about a year... heh.

Great, but how many cores does it have? (1)

Radical Moderate (563286) | more than 2 years ago | (#38217000)

Nice summary, more redundancy please!

Re:Great, but how many cores does it have? (1)

Surt (22457) | more than 2 years ago | (#38218150)

It's not the summary's fault. 448 cores is actually in the product title.

Very nice, but (1)

TheGoodNamesWereGone (1844118) | more than 2 years ago | (#38217190)

That's pretty impressive.... It takes the current crown for fastest video card... all so I can play a dumbed-down Skyrim with its dumbed-down console interface and low-res console textures. What is it *for*?

Not worth it (0)

Anonymous Coward | more than 2 years ago | (#38217204)

If you read all the information available on these GPUs and cards, they will only be available for a month or two while supplies of the binned GPUs last; one article I read even referred to them as a limited Christmas edition card. Which is totally going to suck when you decide you'd like to SLI it in a few months.

Who buys discrete graphics anymore? (0)

Anonymous Coward | more than 2 years ago | (#38217464)

I'm sure all 37 hardcore gamers in the world will be ecstatic, but who else is going to buy these high-end cards?

Re:Who buys discrete graphics anymore? (1)

aztracker1 (702135) | more than 2 years ago | (#38217528)

Bitcoins, anyone? Seriously though, a lot of people won't quite settle for the onboard GPU, especially if you like to game at a very high resolution or with multiple monitors. I wouldn't even consider myself a gamer, but I run discrete graphics in my desktop; though it's a now-aging ATI Radeon HD 5770, it still works well enough for my needs. My biggest reason is good support for multiple displays. Most onboard graphics don't have multiple digital connections, so one display ends up on VGA, and invariably the colors won't match between two of the same monitor, which is far more annoying than anything to me; calibration was still too far off for my tastes.

The point is, far more people than that are willing to buy these. I happen to favor mid-range cards for myself, and I'll usually start with onboard graphics for anyone who doesn't play 3D games.

Re:Who buys discrete graphics anymore? (1)

Gaygirlie (1657131) | more than 2 years ago | (#38218412)

Seriously though, A lot of people don't quite settle for the onboard GPU, especially if you like to game with a very high resolution, or multiple monitors.

I sure as hell wouldn't be able to settle for an onboard GPU; they're simply too slow. I already cringe whenever I'm away from home for longer periods and have to kill some time trying to play on my laptop, and these days I'm starting to feel that the overclocked GeForce 460 in my desktop is getting slow and will need a replacement soon.

Re:Who buys discrete graphics anymore? (1)

bberens (965711) | more than 2 years ago | (#38218550)

I'm in the same boat. I run a discrete graphics card on my desktop because for about $50 I got a video card that runs all of the games I play (I only buy video games when they hit the $10 bin at [big box store]), while the on-board graphics card won't handle it. Of course, my desktop is about 8-10 years old now so that might be a bad example.

Re:Who buys discrete graphics anymore? (1)

cashman73 (855518) | more than 2 years ago | (#38217748)

There is most certainly a use for high-end nVidia GPU cards. Like in supercomputing applications [cnet.com]...

Re:Who buys discrete graphics anymore? (1)

kramulous (977841) | more than 2 years ago | (#38220294)

For a very few specialist problems. Just like FPGAs.

On the whole, they're useless for most applications requiring performance. A lot of people bought into the hype.

heat (0)

Anonymous Coward | more than 2 years ago | (#38217648)

Haha, seeing as it's an Nvidia chip, it'll probably produce enough heat to turn a steam turbine.

I wonder? (1)

cashman73 (855518) | more than 2 years ago | (#38217706)

Impressive! Imagine what a Beowulf Cluster of these things could... wait a minute! This chip IS its own Beowulf Cluster! =)

How fast for Angry Birds? (2)

bryan1945 (301828) | more than 2 years ago | (#38217980)

Does it make my speedy bird go really really REALLY fast?

Re:How fast for Angry Birds? (1)

Surt (22457) | more than 2 years ago | (#38218174)

There actually is an easter egg in there for owners of high-end graphics cards. If you have a sufficiently good one, your speedy bird will cross the light-speed barrier, creating a warp wake that unleashes a wave of destruction. Sorry if you're missing out with your wimpy card, though.

Opinion (0)

Anonymous Coward | more than 2 years ago | (#38218350)

"Performance-wise, the new GPU proved to be about 10 to 15 percent faster." Source-wise, they are 100 percent proprietary, and documentation-wise they are 0 percent competitive. Knights Corner seems to be the way.

Let's go backwards. (1)

garyoa1 (2067072) | more than 2 years ago | (#38218586)

Still trying to figure out the method to the madness of going backwards. They had a 9000 series, then a 200 series, and now a 500 series. I'm waiting until they come out with the zero series. And it's not just Nvidia; a lot of companies seem to be doing it.

Insane power consumption (3, Interesting)

Lawrence_Bird (67278) | more than 2 years ago | (#38218734)

It's pretty difficult to get precise power figures on graphics cards; reviews always rate against "total system" power but never give us, for reference, the system's power use without the card (or with an onboard video solution). In any event, all modern cards are total power pigs. At a time when Intel and AMD are trying very hard to reduce CPU power consumption, graphics cards are using up many multiples of those savings. I'm not sure how "it has a wall outlet plug" gave these card and GPU producers license to subsidize the power companies.
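For what it's worth, when a review does list the same system's draw without the card (or on onboard video), you can back out a rough number for the card itself; the wall meter reads AC, so you have to scale by PSU efficiency. A sketch with invented readings:

    // card_power_estimate.cpp -- rough card draw from wall-socket measurements.
    #include <cstdio>

    int main() {
        const double wall_with_card = 420.0; // W at the outlet, gaming load (invented)
        const double wall_without   = 190.0; // same system on onboard video (invented)
        const double psu_efficiency = 0.85;  // typical 80 PLUS-era supply (assumed)

        // The card consumes DC; the meter sees AC, hence the efficiency factor.
        double card_watts = (wall_with_card - wall_without) * psu_efficiency;
        printf("estimated card draw: ~%.0f W DC\n", card_watts); // ~196 W here
        // Caveat: CPU load usually rises too once a faster GPU stops being the
        // bottleneck, so this somewhat overestimates the card alone.
        return 0;
    }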
