
NVIDIA Launches GTX 750 Ti With New Maxwell Architecture

Soulskill posted about a month ago | from the new-hardware-for-the-graphics-arms-race dept.

Vigile writes "NVIDIA is launching the GeForce GTX 750 Ti today, which would normally merit just a passing mention for a new $150 mainstream graphics card. But the company is using it as the starting point for its Maxwell architecture, which is actually pretty interesting. With a new GPU design that reorganizes the compute structure into smaller blocks, Maxwell is able to provide 66% more CUDA cores on a die that is only 25% bigger than the previous generation, all while continuing to use the same 28nm process technology we have today. Power and area efficiency were the target design points for Maxwell, as it will eventually be integrated into NVIDIA's Tegra line, too. As a result, the GeForce GTX 750 Ti is able to outperform AMD's Radeon R7 260X by 5-10% while using 35 watts less power."
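For a rough check of how those two percentages combine, here is a quick back-of-the-envelope calculation; the die sizes and CUDA core counts (118 mm^2 / 384 cores for Kepler's GK107 versus 148 mm^2 / 640 cores for Maxwell's GM107) are taken as assumptions from the linked review, not restated specs:

    # Back-of-the-envelope check of the Maxwell density claim.
    # Assumed figures: GK107 = 384 CUDA cores / 118 mm^2, GM107 = 640 cores / 148 mm^2.
    gk107_cores, gk107_area = 384, 118.0   # Kepler GK107 (e.g. GTX 650)
    gm107_cores, gm107_area = 640, 148.0   # Maxwell GM107 (GTX 750 Ti)

    core_gain = gm107_cores / gk107_cores - 1                       # ~0.67 -> "66% more cores"
    area_gain = gm107_area / gk107_area - 1                         # ~0.25 -> "25% bigger die"
    density_gain = (gm107_cores / gm107_area) / (gk107_cores / gk107_area) - 1

    print(f"{core_gain:.0%} more cores, {area_gain:.0%} larger die, "
          f"{density_gain:.0%} more cores per mm^2")                # ~33% better density

So the headline claim works out to roughly a third more CUDA cores per square millimeter on the same 28nm process.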

believe it when I see it (0, Informative)

Anonymous Coward | about a month ago | (#46278815)

It's always easy for nvidia to say their graphics cards outperform AMD cards in computation, but difficult to make it happen. Nvidia is great at selling hype, nothing more.

Re:believe it when I see it (2)

Billly Gates (198444) | about a month ago | (#46278899)

It's always easy for nvidia to say their graphics cards outperform AMD cards in computation, but difficult to make it happen. Nvidia is great at selling hype, nothing more.

Thanks to the Litecoin and Bitcoin miners, Nvidia cards are the only ones left on store shelves that don't carry a damn 150% to 200% markup over MSRP.

If Maxwell can also mine coins, don't expect any reasonably priced GPUs for years to come.

Re:believe it when I see it (3, Informative)

TyFoN (12980) | about a month ago | (#46279113)

What else would you use an amd card for?
If you want to game and/or use Linux with any performance and stability, you need an Nvidia card.

Intel integrated GPUs work fine for Linux, though, if you don't need the performance.

Re:believe it when I see it (1)

K. S. Kyosuke (729550) | about a month ago | (#46279319)

What else would you use an amd card for?

For anything you want? They even come with HW documentation these days. Also, Linus. ;-)

Re:believe it when I see it (1)

Anubis350 (772791) | about a month ago | (#46280921)

What else would you use an amd card for?

For anything you want? They even come with HW documentation these days. Also, Linus. ;-)

It comes with Linus? Is there a super efficient NVIDIA invented process that bundles a small piece of him on die in place of a core in every chip?

Re:believe it when I see it (1)

Billly Gates (198444) | about a month ago | (#46279419)

ATI's driver is free, open source, and spec'd, unlike the Nvidia one, which is a binary blob.

Re:believe it when I see it (0)

Anonymous Coward | about a month ago | (#46279703)

It may be free, open source, and have documentation, but what good is that if it doesn't work properly?

Re:believe it when I see it (0)

Anonymous Coward | about a month ago | (#46279995)

It may work properly, but what good is that if it isn't free or open source and doesn't have documentation?

Re:believe it when I see it (0)

Anonymous Coward | about 2 months ago | (#46281645)

The Nvidia troll is full of excess heat. 5-10% faster while using 35 watts less power, for 400% more money, naturally.

Re:believe it when I see it (1)

Mr_Wisenheimer (3534031) | about 2 months ago | (#46281781)

We'll see if AMD retakes the lead in efficiency. In the meantime, they still beat Nvidia on price and system integration. Obviously, system integration is not really a concern for people looking for aftermarket cards, but it is one of the major reasons that all three gaming consoles went with AMD. Still, I use Nvidia cards, although I am very disappointed with them purposefully restricting double-precision speed.

Re:believe it when I see it (1)

future assassin (639396) | about 2 months ago | (#46281783)

I got an AMD A10 5600K chip and was having some issues with FPS in Xonotic. I wasn't sure WTF was going on and blamed it on AMD. Picked up a used Nvidia GTX 280. Well, the graphics seemed much smoother for about 5 minutes, then the slowdowns started. WTF? Anyway, to make a long story short, I got a big fat CPU cooler, and out of the blue the A10 chip with the open source Radeon drivers can play the game maxed out with no issues.

So here I was blaming AMD for crappy drivers when the CPU was overheating with the stock cooler. Oh, and what did the Nvidia drivers do? Well, they filled up my logs with GBs of junk for some bug (I'm using Linux Mint 16) in the kernel/drivers. Eventually I got the "Running low on / space" error messages.

Re:believe it when I see it (1)

edxwelch (600979) | about a month ago | (#46279277)

Maybe AMD should think about making more. Just an idea.

Re:believe it when I see it (0)

Anonymous Coward | about a month ago | (#46279511)

That takes fab time, which is bought months in advance.

It's not a matter of simply saying they need more chips.

Re:believe it when I see it (1)

edxwelch (600979) | about a month ago | (#46280307)

Yes, but it is "months in advance". The cryptocoin thing started in November. Besides that, TSMC is only at 80% capacity.

FUCK BETA (3, Interesting)

synapse7 (1075571) | about a month ago | (#46279547)

Hype or not, games on my gtx 760 look amazing. Looks like they are testing the waters for the next flagship.

Re:believe it when I see it (1)

Somebody Is Using My (985418) | about 2 months ago | (#46281277)

Honestly, my three-year-old GTX 580 makes games look amazing, and it is still surprisingly capable with modern games (really, only Crysis 3 on Ultra made it wheeze). This is thanks to how anemic the GPUs in the last-gen game consoles were, but I'd wager it still holds up well with the launch titles released for the XBOne or PS4. I suppose I'll get a new GPU in a year or two, but after that I think I'll be fine until the PS5/XBox-NextWhatever is released.

My days of upgrading video-cards on a yearly basis seem long gone; so much of the power we have available at our fingertips goes unused these days that it seems an unnecessary expenditure. I'd feel sorry for AMD and nvidia but, well, they did sort of bring it on themselves. ;-)

Re:believe it when I see it (0)

Anonymous Coward | about 2 months ago | (#46282197)

My GTX 480 runs recent games at 1440p just fine. It's loud and power hungry, but I haven't found a real reason to upgrade yet.

Re:FUCK BETA (0)

Anonymous Coward | about 2 months ago | (#46282851)

Just remember that the games you find graphically amazing now will appear really dull in less than 10 years. Never, ever get too infatuated with graphics, particularly if the gameplay isn't suitably worthy as well. Making that a conscious decision lets you enjoy older games which might have better gameplay, rather than the typical rehash that we get nowadays.

Always keep your standards just on the precipice. It'll ensure you can be surprised by anything new, while still enjoying anything old.

Re:believe it when I see it (2)

viperidaenz (2515578) | about a month ago | (#46280159)

Don't worry. The 260X they're comparing it to is still 2x faster at bitcoin mining. Still faster per watt as well.

Re:believe it when I see it (1)

blackraven14250 (902843) | about 2 months ago | (#46281605)

Not quite, and totally not. source [cryptomining-blog.com]

Re:believe it when I see it (1)

viperidaenz (2515578) | about 2 months ago | (#46281673)

What does scrypt performance have to do with bitcoin?

Re (1)

Blaskowicz (634489) | about 2 months ago | (#46282291)

Won't you spend more $$$ on electricity for bitcoin mining than what you get out of it, unless you steal the electricity?
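Whether that's true comes down to simple break-even arithmetic; a minimal sketch, in which every number (hash rate, power draw, electricity price, coin yield, and coin price) is a made-up placeholder rather than a real figure:

    # Rough GPU-mining profitability sketch. Every number below is a placeholder
    # assumption, not a real quote; plug in your own hash rate, difficulty, etc.
    hash_rate_mhs     = 0.6      # scrypt hash rate in MH/s (hypothetical GPU)
    power_watts       = 250.0    # wall power drawn while mining
    elec_price_kwh    = 0.12     # $ per kWh
    coins_per_mhs_day = 0.05     # coins earned per MH/s per day (depends on difficulty)
    coin_price_usd    = 20.0     # exchange price per coin

    revenue_per_day     = hash_rate_mhs * coins_per_mhs_day * coin_price_usd
    electricity_per_day = power_watts / 1000.0 * 24 * elec_price_kwh

    print(f"revenue  ${revenue_per_day:.2f}/day")
    print(f"power    ${electricity_per_day:.2f}/day")
    print(f"net      ${revenue_per_day - electricity_per_day:.2f}/day")

With these particular placeholder numbers the net comes out slightly negative, which is exactly the situation the parent is describing.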

Re:Re (1)

viperidaenz (2515578) | about 2 months ago | (#46283095)

Apparently you're supposed to make money off transaction fees.

That's beside the point. blackraven14250 attempted to rebut a statement with something completely irrelevant.

Believe it! (1, Insightful)

Anonymous Coward | about a month ago | (#46278909)

It's all about what you're trying to do. Nvidia usually has an edge in the reliability/gaming sector, while AMD has an edge in the mining/hashing sector. To say that Nvidia is pure hype is, ironically, hyperbole.

Re:Believe it! (1)

Billly Gates (198444) | about a month ago | (#46278943)

It's all about what you're trying to do. Nvidia usually has an edge in the reliability/gaming sector, while AMD has an edge in the mining/hashing sector. To say that Nvidia is pure hype is, ironically, hyperbole.

This chip has more hash performance and shaders with less wattage than an ATI card.

This one might be the preferred one for miners: while it is 3x slower, it uses 1/5th the power of a full dual 290X setup. Expect $600 and limited to no availability once the miners discover it :-)
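Taking the parent's ratios at face value (one third the hash rate at one fifth the power), the hashes-per-watt comparison works out like this; the baseline numbers are arbitrary, only the ratios matter:

    # Hash-per-watt comparison using the parent's ratios (3x slower, 1/5 the power).
    # The baseline values are arbitrary normalized units; only the ratios matter.
    dual_290x_hash, dual_290x_watts = 3.0, 1.0
    maxwell_hash, maxwell_watts = dual_290x_hash / 3, dual_290x_watts / 5

    ratio = (maxwell_hash / maxwell_watts) / (dual_290x_hash / dual_290x_watts)
    print(f"Maxwell hashes/watt vs dual 290X: {ratio:.2f}x")   # ~1.67x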

Re:Believe it! (3, Insightful)

LordLimecat (1103839) | about a month ago | (#46278975)

I had understood that anyone with half a brain was on ASICs now.

Then again, anyone with half a brain wouldn't be joining the pyramid scheme so late in the game.

Re:Believe it! (2, Interesting)

Billly Gates (198444) | about a month ago | (#46279033)

I had understood that anyone with half a brain was on ASICs now.

Then again, anyone with half a brain wouldn't be joining the pyramid scheme so late in the game.

Butterfly Labs, which makes the ASICs, has been busted taking as long as 8 months to fill orders.

Basically you plunk down $2,000 for the units, they keep them for 7 months and mine the coins with your device, then sell it to you once the cost of the coins has gone up 3000%, so you lose out on your investment and Butterfly keeps the interest made ... similar to banks withholding cash for 48 hours, etc.

But at least banks give the money back after 72 hours, after they short stocks and keep the interest on your cash first. These guys are sharks.

Litecoins are cheaper, and so are Dogecoins, so it is not too late for these; with the value falling it is a great time to invest again before it goes back up.

Re:Believe it! (2, Informative)

bloodhawk (813939) | about a month ago | (#46279983)

I am sure that as the Zimbabwe dollar fell, people were also saying it was a great time to invest before it went back up again. At some stage, if enough confidence is lost, there is no recovery.

Re:Believe it! (1)

LordLimecat (1103839) | about a month ago | (#46281153)

I had kind of wondered right off the bat why a company advertising devices which print money wouldn't use them to print money.

Possibilities:
1) They're phenomenally dumb, but still capable of making ASICs
2) They used the ASICs ahead of time, then sold the ASICs once the value proposition of said ASICs had plummeted
3) They realized that there was no more value to BitCoin than to roulette

None of those sound terribly appetizing. Sure, you can make money there, just like you can with horse racing and house flipping.

Re:Believe it! (0)

Anonymous Coward | about 2 months ago | (#46282811)

They sell you an ASIC now at a predictable profit, and in return they give up the possibility of making more money from much higher-risk future bitcoins. That's completely different from roulette, where the house always wins long-term. They also may not have the infrastructure to run all of the ASICs that they can sell. It could be a terrible deal, yes, but that's not something you can determine without looking at the specifics first.

Re:Believe it! (1)

LordLimecat (1103839) | about 2 months ago | (#46283181)

If you're the first into the ASIC world, you would be absolutely crazy not to run your ASICs for a few weeks before sending shipments off. It's like playing the stock market or blackjack with a 30% discount: statistically, the advantage is so great that you can't lose.

Except in this case they're not even risking money -- they're just pushing the ship date back a few weeks.

Re:Believe it! (2, Informative)

JDG1980 (2438906) | about a month ago | (#46279493)

I had understood that anyone with half a brain was on ASICs now.

That's true for Bitcoin, which uses SHA-256 as its hashing algorithm. But for Litecoin, Dogecoin, and a bunch of other knock-off "altcoins", the proof-of-work function is Scrypt, and that is difficult to support on ASICs because of the memory requirements.

There are some Scrypt ASICs currently being tested, but hash rates are quite modest and they focus more on saving power than on outgunning the top AMD video cards.
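The memory-hardness comes straight from Scrypt's parameters: each hash needs roughly 128 * r * N bytes of scratch space. A small illustration using Python's hashlib.scrypt with assumed Litecoin-style parameters (N=1024, r=1, p=1); the password and salt inputs here are placeholders, not a real block header:

    import hashlib  # hashlib.scrypt needs Python 3.6+ built against OpenSSL 1.1+

    # scrypt's working set scales as ~128 * r * N bytes per hash instance,
    # which is what makes cheap, massively parallel ASICs harder than for SHA-256.
    N, r, p = 1024, 1, 1                      # Litecoin-style parameters (assumed)
    mem_bytes = 128 * r * N                   # ~128 KiB of scratch per hash
    print(f"scratch memory per scrypt hash: {mem_bytes // 1024} KiB")

    # Placeholder inputs, purely illustrative (a real miner hashes the block header).
    digest = hashlib.scrypt(b"placeholder header", salt=b"placeholder",
                            n=N, r=r, p=p, dklen=32)
    print(digest.hex())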

Re:Believe it! (0)

Anonymous Coward | about a month ago | (#46280933)

Really? 300 watts for 5,000 kH/s is what the current ASICs being tested for Scrypt mining claim.

But, sure, I'd love to know which AMD video card is getting 5 megahashes at Scrypt mining.

(PS: that's the claim of the ASIC devs, god knows if it's real - but they have stopped taking preorders and are still talking. No preorders means they aren't trying to fatten up before running with the money, which gives them some chance of being real!)

Re:Believe it! (1)

LordLimecat (1103839) | about a month ago | (#46281173)

Anyone want to guess how the company making those devices would know the performance stats of the ASICs if they arent ready to ship?

Heres a hint: Theyre probably not waiting on manufacturing-- theyre waiting for someone to get around to shutting the devices off.

Re:Believe it! (0)

Anonymous Coward | about 2 months ago | (#46281741)

Here's a guess:
Netlist-level estimation (available pretty early, but even if you know what you're doing you only get ballpark estimates).
Post-route parameter-extraction-based simulation (gives you rather accurate results if you're using a well-characterized process, but it's slow and only possible in the last design stages).

Here's a hint: Coming up with conspiracy theories that sound plausible to an audience with no experience in that field is trivial. Oh, and "Heres" is not a word. Neither is "theyre".

Re:Believe it! (0)

Anonymous Coward | about 2 months ago | (#46282135)

If you don't bother to do any basic performance estimates before the chips are manufactured, then you're taking a hell of a gamble that you did everything right, will get the right ballpark of clock speeds, and didn't make any timing mistake that is bottlenecking the whole chip's performance for something easy to fix in the design.

Re:Believe it! (0)

Anonymous Coward | about a month ago | (#46279185)

It's all about what you're trying to do. Nvidia usually has an edge in the reliability/gaming sector, while AMD has an edge in the mining/hashing sector. To say that Nvidia is pure hype is, ironically, hyperbole.

Clearly when you're using words like "mining/hashing" to describe a company formerly known for manufacturing graphics cards, the end use has changed considerably. I wouldn't even call AMD a graphics card company anymore with statements like that, and it also tends to make one think that we're comparing apples to oranges here.

Perhaps we should start calling mining cards what they are, rather than what they are not.

Re: Believe it! (0)

Anonymous Coward | about a month ago | (#46280451)

They're very good graphics cards that just happen to be good at hashing as well. NVIDIA started the GPGPU craze with CUDA, but AMD currently outperforms them in many cases. Doesn't mean it's not a GPU.

Re:believe it when I see it (1)

K. S. Kyosuke (729550) | about a month ago | (#46279055)

However, double-precision math is further pared back to 1/32 the rate of FP32; that was 1/24 in the mainstream Kepler-based GPUs.

Apparently, they achieved it at least partially by further carving up FP64 capabilities - even the cheapest AMD stuff has 1:16, as lousy as it is for some applications. Oh, what the hell. They're just gaming it. ;-)
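To put those ratios in perspective, here is what they imply for peak FP64 throughput if you hold the single-precision rate fixed; the ~1300 GFLOPS figure is an assumed ballpark for a 750 Ti-class card, used only to show the effect of the divisor:

    # Peak FP64 throughput implied by the FP32:FP64 ratios discussed above.
    # The 1300 GFLOPS single-precision figure is an assumed ballpark, not a spec value,
    # and is held constant across the three cases purely to isolate the ratio.
    sp_gflops = 1300.0

    for name, ratio in [("Maxwell consumer (1/32)", 32),
                        ("Kepler mainstream (1/24)", 24),
                        ("cheap AMD parts (1/16)", 16)]:
        print(f"{name}: ~{sp_gflops / ratio:.0f} GFLOPS FP64")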

Re:believe it when I see it (1)

JDG1980 (2438906) | about a month ago | (#46279513)

It makes sense to cut down on die space and power usage by removing capabilities that almost no one uses. Why should 99% of gamers have to carry the burden for 1% of HPC users? Presumably Nvidia will create a successor card to Tesla that will include full FP64 capability on the Maxwell platform. It won't come cheap, though.

Re:believe it when I see it (1)

K. S. Kyosuke (729550) | about a month ago | (#46279649)

... capabilities that almost no one uses. Why should 99% of gamers have to carry the burden for 1% of HPC users?

"Almost no one?" There's also less demanding engineering workstations and even content creation scenarios where better DP support would come in handy. But, having said that, AMD's APUs make probably much more sense for these than a gaming card with limited memory.

Re:believe it when I see it (1)

0123456 (636235) | about a month ago | (#46279949)

"Almost no one?" There's also less demanding engineering workstations and even content creation scenarios where better DP support would come in handy.

Yeah, exactly. That's "almost no one" when compared to the size of the gaming market.

Re: believe it when I see it (0)

Anonymous Coward | about a month ago | (#46280473)

That's why NVIDIA offers a Quadro series in addition to GeForce.

Only requires... (2)

thevirtualcat (1071504) | about a month ago | (#46278871)

... five expansion slots to fit the fans, this time!

Re:Only requires... (0)

Anonymous Coward | about a month ago | (#46278959)

Looking at the pics in the links (heresy, I know), this specific model has one end-plate, but the fan does look a little too tall. On top of that, it is a basic fan/fin setup instead of a piped arrangement with a dedicated exhaust port, so placing a card in the next open slot could be problematic.

Looks like 5 is an overestimate, but I'd still suggest filling your PCI slots from the bottom up and putting this in the express slot with the most clearance (if you have multiple).

Re:Only requires... (2)

thevirtualcat (1071504) | about a month ago | (#46279079)

Yeah, I saw that. I wasn't surprised. That seems to be the common configuration these days. (AMD is guilty of it, too.)

Fortunately, the days of packing my computers with expansion cards are long gone, anyway.

Won't stop me from making a cheap joke about it, though.

Cryptocurrency (0)

Anonymous Coward | about a month ago | (#46278929)

How well can it mine cryptocurrency?

Heat and noise.... (3, Interesting)

Kenja (541830) | about a month ago | (#46278939)

Am I the only one annoyed that average operating temperature and noise output are not standard graphics card benchmarks?

Re:Heat and noise.... (5, Informative)

ustolemyname (1301665) | about a month ago | (#46279057)

The benchmarks on Phoronix [phoronix.com] did temperature, and commented on (though didn't measure) noise. It was actually a fairly comprehensive, well-done benchmark; the only thing missing was frame latency measurements.

Re:Heat and noise.... (5, Funny)

lordofthechia (598872) | about a month ago | (#46280135)

I opened the link and scrolled through it, only to get all excited that my card (the R9 290) was trouncing everything else in one of the charts!

Then I realized I was looking at the temperature charts....

Re:Heat and noise.... (0)

Anonymous Coward | about a month ago | (#46279107)

I agree. Comparing Nvidia vs ATI, etc.: Radeon GPUs tend to run hotter, and in laptops the heat-conducting paste that transfers heat to the heat sink tends to evaporate faster, leading to a shorter life (per the repair tech).

Having a benchmark of noise and heat, and more importantly the heat generated by the GPU itself versus the card as a whole, would help consumers judge the longevity of a card. The less heat, the better.

Re:Heat and noise.... (1)

Charliemopps (1157495) | about a month ago | (#46279125)

They are not, because both of those numbers are highly sensitive not only to the device's use, but also to its enclosure. Do you have the card in SLI mode, inside a rack with 100 other cards all running bitcoin miners? Well then, noise and heat will be through the roof. Do you spend most of your time in a terminal emulator, and your case is water cooled? Well then, they're going to be pretty low. "Average" is totally subjective and I think it best to leave those measurements up to an external review.

Re:Heat and noise.... (3, Insightful)

Rhywden (1940872) | about a month ago | (#46279469)

By your standard, almost anything would be subjective. Let's go through your line of thinking:

The tester chose an enclosure you probably don't have at home. As such, the card will not demonstrate the same values in your enclosure at home. As a result the tests are "subjective".

Power consumption? Well, you've probably got a different PSU. Subjective.

FPS? You've probably got a different CPU, different OS configuration, motherboard, harddisc... Subjective!

In summary: if the tester uses the same enclosure for every card they test, I don't see how it's subjective. Sones or dB as units of loudness are measurable, as is temperature. Or do you want to tell us that, say, the distance to Betelgeuse is subjective just because you don't happen to have the proper equipment to measure it?

Re:Heat and noise.... (0)

Anonymous Coward | about a month ago | (#46280359)

The variety of things like CPUs and memory to choose from kind of fall on a single spectrum and don't get in the way of the GPU test (unless you choose a bad test that has a CPU bottleneck). The PSU doing its job is not going to affect the power consumption of the card. How airflow hits the card within different cases, and how much sound each direction reflects or absorbs, etc., will cause a much larger variation in heat and sound emissions.

Re:Heat and noise.... (2)

geekoid (135745) | about a month ago | (#46279721)

Forget enclosures. Power up the device with no enclosure, give me the numbers at 1 meter distance.

Now we have a COMMON bar to use to judge.

Re:Heat and noise.... (1)

Immerman (2627577) | about a month ago | (#46280335)

I doubt miners care all that much about sound, ops per watt being far more relevant to their usage. If you've got a bank of dozens or hundreds of cards being hammered 24/7, it's going to be loud, period.

For everyone else, I would think that the relevant questions are how loud it is while being hammered by a graphically demanding game, while watching a movie, and while using a word processor. And these days it sounds like the last two usage cases tend to be comparable. As for the influence of case, etc., I'd say that's irrelevant as long as measurements are made relative to some reference. Ideally, remove the case, silence the CPU and power supply, and test just the volume of the card from whichever direction is loudest. Or perhaps leave it in the case with the side off and do idle and demanding measurements of it and several other cards from a few feet away, including at least one passively cooled one, for a "0" reference.

Re:Heat and noise.... (1)

geekoid (135745) | about a month ago | (#46279697)

No, I've been saying they should do that for any computer part with a fan.
What's the noise at 1 meter at mid power and max power?
TYVM

Re:Heat and noise.... (2)

triffid_98 (899609) | about a month ago | (#46279785)

Yes

Operating temperature and noise output would only be valid measurements for the reference card. Once it starts getting manufactured by PNY or Diamond or eVGA or whoever, they'll be using their own coolers and their own variations of the NVIDIA board.

Re:Heat and noise.... (1)

Khashishi (775369) | about a month ago | (#46280961)

Power consumption is heat generation. If you decrease power consumption, this should also reduce noise since a slower fan can be used.

66%? big deal. (1)

funwithBSD (245349) | about a month ago | (#46278965)

going from 1 unit to 1.25 unit size is 56% bigger in area, so they gained 10%?

Re:66%? big deal. (3, Insightful)

amorsen (7485) | about a month ago | (#46279105)

If you had read the article, you would have known that they went from 118 mm2 to 148mm2, i.e. a 25% increase in area.

If Slashdot entered the 21st century, it would be able to render superscript.

Re:66%? big deal. (4, Funny)

Ash Vince (602485) | about a month ago | (#46279417)

If Slashdot entered the 21st century, it would be able to render superscript.

Maybe the beta supports it :)

Re:66%? big deal. (4, Informative)

mrchaotica (681592) | about a month ago | (#46280013)

Maybe the beta supports [superscript] :)

I just checked; it does not. (I tried both ways: using unicode character entities and using the <sup> tag.)

Re:66%? big deal. (0)

Anonymous Coward | about a month ago | (#46279119)

No, the 25% is the die size (area, the mm^2), so that's some 30-ish percent improvement in per-area efficiency (at lower wattage without changing the process, which is the kicker here).

Re:66%? big deal. (0)

Anonymous Coward | about a month ago | (#46279181)

The 25% increase in die size is already in terms of area. It's 148 square mm instead of 118 square mm.

If you want to criticize the math, you might instead point out that the phrasing is misleading, seeming to imply a 66/25 = 2.6x improvement, when it's really 1.66x the cores in 1.25x the area, i.e. a 1.66/1.25 = 1.33x, or 33%, improvement. (That said, it still seems like a decent improvement....)

Wow (1)

viperidaenz (2515578) | about a month ago | (#46279045)

5-10% better than a cheaper rival card that came out 5 months ago.
Go nvidia, go!

Re:Wow (2)

sexconker (1179573) | about a month ago | (#46279283)

5-10% better than a cheaper rival card that came out 5 months ago.
Go nvidia, go!

I'm no nVidiot, but 5-10% improvement at a substantial power savings in the same price bracket is indeed an impressive feat.
This being a brand new architecture means that later cards can also reap these benefits.

AMD and their OEMs are still slowly trotting out 290X cards with decent cooling at inflated prices. The sooner nVidia gets their next architecture out there the sooner we'll see new products / price drops on the AMD side. The sooner that happens, the sooner we see new products / price drops on the nVidia side.

This competition thing has its benefits, you see. Meanwhile, in CPU land we've been stuck for years with Intel charging $BUTT for marginally better (and sometimes worse) shit that, as usual, requires a new mobo+chipset to fully utilize.

Re:Wow (3, Insightful)

fsck-beta (3539217) | about a month ago | (#46279567)

Meanwhile, in CPU land we've been stuck for years with Intel charging $BUTT for marginally better

If you think Haswell, Ivy Bridge or Sandy Bridge were 'marginally better' you aren't paying attention.

Re:Wow (1)

Bengie (1121981) | about a month ago | (#46280089)

I forget which synthetic benchmark it was, but it was non-SIMD and had Intel 2x faster single-threaded and about 50% faster multi-threaded than AMD's offering, while consuming about 25% less power at idle and 50% less power at load. AMD's very recent APU is actually about on par with Haswell clock-for-clock, but at the expense of 100% more power.

Re:Wow (0)

Anonymous Coward | about 2 months ago | (#46281231)

I forget which synthetic benchmark it was, but it was non-SIMD and had Intel 2x faster single-threaded and about 50% faster multi-threaded than AMD's offering, while consuming about 25% less power at idle and 50% less power at load. AMD's very recent APU is actually about on par with Haswell clock-for-clock, but at the expense of 100% more power.

It's kind of funny in a really sad way.

nVidia decided to chase scientific computing and completely retooled their GPUs as pure-SIMD processors, which resulted in really shit graphics performance at massively inflated power budgets compared to AMD's incremental improvements, which were cheaper, faster, and used half the power. nVidia is finally making that boneheaded decision work after about 6 generational revisions.

In the CPU space, AMD pulled an nVidia, completely retooling their CPUs around having an integrated GPU (APU) with Bulldozer. This resulted in AMD's highest-end chips having higher power usage and lower performance than even Intel's bottom-end CPUs. AMD still has not managed to make their CPUs work; maybe in a few more revisions they might finally complete their "let's copy what did not work for nVidia with our CPU line" strategy and actually produce something worth comparing to Intel's line, assuming they don't go bankrupt first (they are apparently not doing so well).

One of those "they who do not learn from history are doomed to repeat it" things.

Re:Wow (3, Insightful)

viperidaenz (2515578) | about a month ago | (#46280069)

You say the big advance is in power, then mention the 290X, which has a single precision GFLOPS/W figure of 19.4, between the new GTX 750's 19.0 and the GTX 750 Ti's 21.8.

The 290X has a double precision GFLOPS/W of 2.6; the GTX 750 Ti gets 0.68. Compared to the 65W TDP Radeon 250's double precision figure of 0.74, it's a loser.

This is just hype and selective benchmarks for a new architecture that was supposed to be 20nm. They couldn't get it built on 20nm, so they've had to stick with 28nm.
If it were 20nm, it probably would be better all round.
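Those GFLOPS/W figures are just peak single-precision throughput divided by board TDP; a quick reconstruction using assumed spec-sheet shader counts, clocks, and TDPs, which lands close to the numbers quoted above:

    # GFLOPS-per-watt is peak single-precision throughput over board TDP.
    # Shader counts, clocks and TDPs below are assumed spec-sheet values used
    # only to reproduce the figures quoted in this thread.
    cards = {
        "R9 290X":    (2816, 1000, 290),   # shaders, clock MHz, TDP W
        "GTX 750":    (512,  1020,  55),
        "GTX 750 Ti": (640,  1020,  60),
    }

    for name, (shaders, mhz, tdp) in cards.items():
        gflops = shaders * 2 * mhz / 1000.0   # 2 FLOPs per shader per clock (FMA)
        print(f"{name}: {gflops:.0f} GFLOPS SP, {gflops / tdp:.1f} GFLOPS/W")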

Re:Wow (1)

sexconker (1179573) | about a month ago | (#46281047)

You say the big advance is in power, then mention the 290X, which has a single precision GFLOPS/W figure of 19.4, between the new GTX 750's 19.0 and the GTX 750 Ti's 21.8.

The 290X has a double precision GFLOPS/W of 2.6; the GTX 750 Ti gets 0.68. Compared to the 65W TDP Radeon 250's double precision figure of 0.74, it's a loser.

This is just hype and selective benchmarks for a new architecture that was supposed to be 20nm. They couldn't get it built on 20nm, so they've had to stick with 28nm.
If it were 20nm, it probably would be better all round.

You can't compare the flagship 290x to the low-end 750, lol.

Beyond that, nVidia doesn't give a shit about double precision compute - never have, never will. Most other people don't, either, even people doing computing on GPUs.
nVidia is still very much gamer-focused, despite CUDA still having a huge market advantage over ATI Stream / OpenCL / DirectCompute / AMD APP Acceleration / etc. nVidia's new architecture is promising, and this card's launch is giving us a taste.

I currently have only AMD cards in my PCs.

Re:Wow (0)

Anonymous Coward | about 2 months ago | (#46281271)

*he* didn't bring up the 290X, GGP did.
If I were to bring up a card, I'd probably point out the HD7790.
21.1 GFLOPS/W single precision, 1.5 GFLOPS/W double precision.
$149 MSRP.
Released 11 months ago.

Re:Wow (1)

JDG1980 (2438906) | about a month ago | (#46279531)

5-10% better than a cheaper rival card that came out 5 months ago.

This is a mobile-first design. Look at the power consumption figures to see why it is a major advance.

Nvidia's L2 Cache Jump (1)

MatthiasF (1853064) | about a month ago | (#46279859)

I think the most drastic thing about this new chip is the fact that Nvidia bumped the L2 cache up to 2 MB.

The Radeon R7 260X it is being compared against has only 768 KB, and Kepler parts had 256-320 KB.

The performance improvement could simply be the L2 being larger, which means it is paging out to its memory less.

Re:Nvidia's L2 Cache Jump (2)

viperidaenz (2515578) | about a month ago | (#46280133)

It's also only measuring single precision performance. The AMD GPUs are more power efficient at double precision.

Re:Nvidia's L2 Cache Jump (1)

Nemyst (1383049) | about 2 months ago | (#46282881)

The amusing thing is that it's actually a choice on NVIDIA's part. They're crippling their DP performance on consumer-level cards to push their workstation cards, which have full DP performance but cost in the many thousands of dollars. I have to wonder if it actually works out well for them, though I suppose gamers must be thankful they can at least get one vendor's cards at reasonable prices with the ludicrous mining craze.

Let me know (0)

Anonymous Coward | about a month ago | (#46281081)

when I can find it in an APU package. Until then, I will continue to use my AMD A-10.

7xx or 8xx? (1)

Twinbee (767046) | about 2 months ago | (#46281257)

I thought Maxwell was going to be under the 800 series [wikipedia.org]. For sure, the 700 series [wikipedia.org] already exists and has used the older Kepler architecture. Why confuse customers with ambiguous product naming?

Re:7xx or 8xx? (0)

Anonymous Coward | about 2 months ago | (#46281435)

the 700 series already exists and has used the older Kepler architecture

You state that like it's some sort of law, yet the 600 series was a mix of Fermi and Kepler.

Re:7xx or 8xx? (1)

Twinbee (767046) | about 2 months ago | (#46281539)

Like that excuses them. It shows they were already inconsistent.

Re:7xx or 8xx? (0)

Anonymous Coward | about 2 months ago | (#46282097)

I didn't say it wasn't confusing, just that both sides have been pulling this shit for quite a while.
One of the funnier/dumber events I remember is shortly after AMD introduced the HD6000 series you could still find remaining HD5000 series cards on clearance.
Meaning, you'd see a $139 HD6770 next to a $129 HD5770.
If you by now guessed that the former is a straight re-badge of the latter, you win one internets.

Re:7xx or 8xx? (1)

Nemyst (1383049) | about 2 months ago | (#46282869)

NVIDIA's done it a lot in the past: they'll introduce a tweaked or completely new architecture as a midrange card to test the waters and tweak things. This most likely lets them get actual production cards out, which they can then test for problems and use as a foundation to build their top-end cards. I guess it also lets them gauge reviewer and consumer reactions.

Re:7xx or 8xx? (0)

Anonymous Coward | about 2 months ago | (#46283001)

AMD also does it.
The HD4750/4770 were released a year after all others in the series and were based on a 40nm process.
The HD7790 was also released about a year after the bulk of the series; turns out it's the only other GCN1.2 GPU besides Hawaii...

Re:7xx or 8xx? (1)

Twinbee (767046) | about 2 months ago | (#46283249)

Interesting, I might wait for the 8xx cards instead. Is x50 always 'reserved' for 'beta' versions of new architectures, and if so is it just x50 or are others like say x30 or x70 also reserved?

Unfortunately, Wikipedia doesn't even bother to list the architecture in its giant table [wikipedia.org] - a very important aspect of a graphics card.

Please tell me I'm dreaming! (0)

wdhowellsr (530924) | about 2 months ago | (#46281321)

Please tell me the browser cache is screwing with me. Please tell me that my wife wants to have sex more often (OK, that isn't going to happen; I have a 12- and a 15-year-old). Do we really have Slashdot.org back?

"Mainstream" graphics card? (1)

wvmarle (1070040) | about 2 months ago | (#46281887)

I wonder who they want to sell to, when it comes to "mainstream" cards.

When it comes to graphics I consider myself mainstream. Watching video, running the OS, an occasional photo edit - that's about the heaviest it goes. I rarely play games (and those are not graphics intensive, just online games), I don't do CAD or anything else that's graphically intensive.

Motherboards come with graphics built in, and that works just fine for those not into hardcore gaming or hardcore graphic design work. Both are relatively small markets, and neither is exactly a "mainstream" market.

So what is this "mainstream" market for graphics cards, nowadays?

Outperform AMD's Radeon R7 260X by 5-10%? (0)

Anonymous Coward | about 2 months ago | (#46281931)

That is a $119 MSRP card (Gigabyte on Amazon, XFX on Newegg right now in stock) - or $115 after a $10 rebate (Sapphire on Newegg), compared to the 750 Ti at $149. And even so, that holds only as long as, for example, you don't care about OpenCL, where the Radeon demolishes the new Maxwell. Or FP64, since they reduced performance even further, to 1/32 from 1/24. The GTX 750 Ti is great only for efficiency, which is I guess very important for some people (hard-to-cool server farms, silent HTPCs, etc.), but it is still a silly summary; nVidia is still trying to get away with a high price.
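Putting the parent's prices together with the summary's 5-10% figure (both taken at face value), the performance-per-dollar comparison looks like this:

    # Performance per dollar, using the prices quoted above and the summary's
    # 5-10% performance edge for the 750 Ti (both taken at face value).
    r7_260x_price, gtx_750ti_price = 119.0, 149.0   # USD

    for speedup in (1.05, 1.10):                     # 750 Ti 5% and 10% faster
        relative_value = (speedup / gtx_750ti_price) / (1.0 / r7_260x_price)
        print(f"750 Ti at {speedup:.2f}x the 260X's speed delivers "
              f"{relative_value:.0%} of the 260X's performance per dollar")

Under those assumptions the 750 Ti delivers roughly 84-88% of the 260X's performance per dollar, which is the commenter's point about the price.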

What difference would cores make if they're crippled? (1)

BrendaEM (871664) | about 2 months ago | (#46283149)

As an owner of several nVidia products, I'm appalled at what nVidia does to the non-Quadro cards!

So you can choose a crippled gaming card that can't do math well, or a workstation card that can't cool itself and doesn't really know what to do with shaders.

Tell your marketing department that a loyal customer will seriously give AMD/ATI a close look next time around.
