
AMD's Radeon R9 290 Delivers 290X Performance For $150 Less

Soulskill posted about 5 months ago | from the superfluous-frames-per-second-battle-continues dept.

crookedvulture writes "The back and forth battle for PC graphics supremacy is quite a thing to behold. Last week, Nvidia cut GeForce prices in response to the arrival of AMD's latest Radeons. That move caused AMD to rejigger its plans for the new Radeon R9 290, which debuted today with a higher default fan speed and faster performance than originally planned. This $400 card offers almost identical performance to AMD's flagship R9 290X for $150 less. Indeed, it's often faster than Nvidia's $1000 GeForce Titan. But the 290 also consumes a lot more power, and its fan spins up to 49 decibels under load. Fortunately, the acoustic profile isn't too grating. Radeon R9 290 isn't the only new graphics card due this week, either. Nvidia is scheduled to unveil its GeForce GTX 780 Ti on November 7, and that card could further upset the balance at the high end of the GPU market. As AMD and Nvidia trade blows, PC gamers seem to be the ones who benefit." Additional reviews available from AnandTech, PC Perspective, Hot Hardware, and Tom's Hardware.

Hi how is everyone? (-1)

Anonymous Coward | about 5 months ago | (#45341553)

This looks good

Re:Hi how is everyone? (-1)

Anonymous Coward | about 5 months ago | (#45341575)

Very high, thanks. If you ask the right question, I'll answer. Yes, exactly, I smoked crack cocaine, eh?

Are PC gamers benefiting ? (1, Redundant)

Taco Cowboy (5327) | about 5 months ago | (#45341693)

As AMD and Nvidia trade blows, PC gamers seem to be the ones who benefit

TFA claimed that PC gamers would benefit from the blows traded between AMD and Nvidia.

But are they?

Back in the days when the marketplace was served by a lot more vendors, PC gamers truly benefited from the fierce competition.

Sadly, that is not happening anymore.

Going from a marketplace once served by 6 competing vendors to a duopoly served by only 2, the pace of innovation has slowed to a crawl.

The performance increment with every new generation of hardware is getting smaller and smaller.

Re:Are PC gamers benefiting ? (0)

Anonymous Coward | about 5 months ago | (#45341977)

Demand has slowed, margins have fallen. Good luck getting a 3rd party in this mess.

Re:Are PC gamers benefiting ? (1)

viperidaenz (2515578) | about 5 months ago | (#45341985)

If the two players in the duopoly don't keep their game up, they'll have a 3rd player move in on their turf by the name of Intel.

Re:Are PC gamers benefiting ? (1)

SJHillman (1966756) | about 5 months ago | (#45342253)

I've been pretty impressed with how well Intel's integrated graphics have been doing... with the HD 3000, I can play many modern games on intermediate settings at 1920x1080 with no problem, which I imagine is good enough for a majority of users.

Re:Are PC gamers benefiting ? (1)

Anonymous Coward | about 5 months ago | (#45342321)

Bullshit. Maybe with the HD 5200 Iris Pro, or if you consider Duke Nukem 3D a modern game.

Re:Are PC gamers benefiting ? (1)

bzipitidoo (647217) | about 5 months ago | (#45342933)

I find the Intel HD 4000 quite capable of handling modern games. It does a reasonable job with 3D-accelerated graphics; I find it about equivalent to, or maybe a bit faster than, older low-end parts like the Radeon HD 5450. Games won't have the fastest frame rates, and you'll want to tune the graphics options to the least demanding settings, but they work. And the drivers may be buggy with DirectX 11, but DirectX 9 works.

The point of a chipset like Intel's HD line is low power usage, not high performance. A system with a power-sipping CPU like the i5-3317U and HD 4000 graphics needs only 30 watts to run the most demanding 3D-accelerated graphics it can handle. Playing videos on YouTube takes only 20 watts, and just running an office program in a GUI takes a mere 10 watts. If you want more performance, you'll have to burn more power.

Re:Are PC gamers benefiting ? (0)

Anonymous Coward | about 5 months ago | (#45343215)

Bad example. Duke Nukem 3D doesn't use acceleration at all.

Re:Are PC gamers benefiting ? (2)

gl4ss (559668) | about 5 months ago | (#45342093)

s3? shit cards, and everyone who bought them got burned.
powervr? their desktop cards were shite, and everyone who bought them got burned.
sis? everyone who bought their desktop 3d cards got burnt.
matrox? their 3d gaming stuff was shite.

getting the point? the problem was that all the competition lied about their cards even more than the two that remain.

Re:Are PC gamers benefiting ? (0)

Anonymous Coward | about 5 months ago | (#45342661)

Not quite the same thing with Intel.

Re:Are PC gamers benefiting ? (4, Informative)

Smauler (915644) | about 5 months ago | (#45342235)

From a marketplace that used to be served by 6 competing vendors into a duopoly marketplace that is currently served by only 2 vendors --- the pace of innovation has slowed to a crawl.

We're most definitely not in a duopoly marketplace at the moment. There are currently only 2 companies offering high-performance, consumer-priced 3D cards, but there are other companies in the graphics business. The most popular graphics card among Steam users is the Intel HD Graphics 3000 [steampowered.com], for example. Matrox is still about, too, but not competing in consumer 3D.

To be honest, I can't really remember a time in which there were more than 3 (possibly 4) major players in the high-end consumer 3D market. Matrox dabbled, but never got close to a cost-efficient gaming card, really, IMO... the closest they came was the G400, IIRC. That was the era when you could possibly claim there were 4 competing vendors. Soon after, Matrox left the market to concentrate on 2D, and 3dfx disappeared up their own arse. I'm not sure who the other 2 you are alluding to are... SiS, VIA?

Re:Are PC gamers benefiting ? (1)

tibman (623933) | about 5 months ago | (#45342601)

I'm always curious about that survey. Wouldn't it count two graphics devices in most systems? The integrated Intel chip and then the discrete NVIDIA/AMD card? I always buy without any integrated video because it has bitten me in the past, but it can be hard to find the perfect mobo sometimes.

Re:Are PC gamers benefiting ? (0)

Anonymous Coward | about 5 months ago | (#45343213)

Let's not forget that AMD and Nvidia tend not to release exclusive cards; they license out the technology, and corporations around the world release their own versions of each card. For example, I use an ASUS interpretation of an AMD card. ATI doesn't develop these technologies exclusively, and the same goes for Nvidia, even though the cards are sold under those brand names. In business, many companies work together on a product to make it the greatest. Even if it just ended at AMD vs. Nvidia, it still wouldn't be a duopoly on technicalities.

Re:Are PC gamers benefiting ? (1)

PixetaledPikachu (1007305) | about 5 months ago | (#45343591)

Matrox dabbled, but never got close to a cost efficient gaming card, really IMO... the closest they came was the G400 IIRC. That was the era when you could possibly claim there were 4 competing vendors. Soon after, Matrox left the market to concentrate on 2D, and 3dfx dissapeared up their own arse. I'm not sure who the other 2 you are alluding to are.... SiS, VIA?

3DLabs? PowerVR? Rendition? Granted, most of them only released 2-3 generations' worth of graphics chips, but they did give us options back then. I remember how PowerVR delivered competition when they released the Kyro after the success of the PowerVR2 in the Dreamcast.

Re:Are PC gamers benefiting ? (3, Insightful)

Moryath (553296) | about 5 months ago | (#45342495)

Better question: what game actually requires this?

Seriously now. Unless you're trying to just throw money away on some 6-screen rig or something, a single screen at 1920x1080 will run almost all of today's games fine on 3-year-old cards. "Bleeding edge" is a matter of throwing your money away on diminishing returns.

Re:Are PC gamers benefiting ? (2)

epyT-R (613989) | about 5 months ago | (#45342593)

Depends on where those diminishing returns start for you. Having a high framerate is nice, and so is having max texture and shader detail turned on at the same time. You don't need the million-dollar sports car to get you to work, but that doesn't mean it isn't nice.

Re:Are PC gamers benefiting ? (1)

CAIMLAS (41445) | about 5 months ago | (#45342953)

Crysis 2. Metro 2033. Metro: Last Light. Supreme Commander 1 and 2.

Just off the top of my head - all those games will benefit greatly from a faster card (or CPU, in the case of Supreme Commander).

Re:Are PC gamers benefiting ? (0)

Anonymous Coward | about 5 months ago | (#45343513)

My flight simulator (in combination with an insanely fast CPU)

290X (5, Informative)

grub (11606) | about 5 months ago | (#45341583)

I read the headline as this new card delivering 290 times the performance of something else.

Re:290X (0)

Anonymous Coward | about 5 months ago | (#45341605)

I read it that way, as well.

Anandtech Fucked Up (3, Interesting)

Khyber (864651) | about 5 months ago | (#45341615)

They used a shitty case with an absolutely horrible acoustic profile to measure the card's noise and got a whopping 57 dB.

Had they bothered to use a real case, they'd have measured it at almost half that loudness (looks like everyone else managed to stay under 50 dB).

It's like Anandtech never heard of Delta fans, either.

Re:Anandtech Fucked Up (1)

rsmith-mac (639075) | about 5 months ago | (#45341703)

Okay, I'll bite. What's wrong with the Phantom 630 case that Anandtech used? It has reviewed reasonably well, as far as I can tell.

Re:Anandtech Fucked Up (1)

Khyber (864651) | about 5 months ago | (#45341741)

The acoustic profile is absolutely horrible compared to, say, a HAF 922 or Fractal Define R4, hence Anandtech's nearly-double loudness measurement versus everyone else with the brains to choose a proper computer case.

Well, it could've been worse. They could've gone with an old SGI tower.

Typo? (0)

Anonymous Coward | about 5 months ago | (#45341897)

The HAF 922 has huge open mesh panels. Not the same sort of thing as the R4 at all. Perhaps you meant a different box.

Re:Anandtech Fucked Up (2)

aliquis (678370) | about 5 months ago | (#45342313)

Then again they are measuring the noise of the graphics card, not the noise dampening of a case.

They could measure it with no case at all...

Re:Anandtech Fucked Up (1)

iinlane (948356) | about 5 months ago | (#45342775)

The acoustic profile is absolutely horrible compared to say a HAF 922 or Fractal Define R4

I was actually considering the card but my HAF-X case has really poor noise dampening. It's practically a net with fans on it. For me the noise is definitely an issue.

Re:Anandtech Fucked Up (1)

bloodhawk (813939) | about 5 months ago | (#45342649)

As long as they are consistent, it doesn't matter too much. BUT someone who cares about the noise level of a card would find the Anandtech review useless, since they would not be using such a case; people who can afford these cards and care about noise nearly always go for proper cases as well. So really, Anandtech should either use a better case or simply not review noise levels.

Re:Anandtech Fucked Up (2, Informative)

Anonymous Coward | about 5 months ago | (#45341779)

It's probably more to do with taking measurements at a 12-inch distance rather than something reasonable, or even standard, like 3 feet. While they're not perfect, I find that TechPowerUp [techpowerup.com] has the best measurements regarding noise and the largest sample size of different cards.

Re:Anandtech Fucked Up (4, Insightful)

GeorgieBoy (6120) | about 5 months ago | (#45341791)

If they tested all the cards in the same case, then they did nothing wrong in their testing. Maybe it wouldn't be 57dB for the 290 in another PC case, but it would be lower for all the other cards too. Perhaps it wouldn't necessarily be a linear drop across all the cards, but you can't simply say their choice of case invalidates their findings that this card is REALLY loud compared to other cards. Plenty of people will own cases with "horrible acoustic profile[s]".

Re:Anandtech Fucked Up (-1, Flamebait)

rahvin112 (446269) | about 5 months ago | (#45342017)

You are missing the fanboi logic. Unless presented in the most favorable light possible, the review is WRONG. It doesn't matter that the VAST majority of gamers are going to have cases with shitty sound profiles; it just means they aren't important and shouldn't be considered.

Remember, it's all about fanboi logic, or the lack thereof.

Re:Anandtech Fucked Up (1)

Khyber (864651) | about 5 months ago | (#45342897)

Fanboi of what, moron?

Show me the fanboyism. I think you'll find I can and do bash anything at any opportunity, including your dumb ass.

I've got cheap-ass $10 Chinese cases quieter than that crappy thing Anandtech chose.

Also, if Anandtech complains about noise, it's obvious they've never had a Delta fan or a GeForce FX 5800.

Re:Anandtech Fucked Up (1)

kangsterizer (1698322) | about 5 months ago | (#45341981)

According to many other tests, the 290 and 290X do in fact run hotter, and thus have beefier fans, than their GeForce counterparts (the 780 and its cousin the Titan) - and that by a rather large factor.
If you don't mind a little noise though, the AMD is a pretty good value right now. My 780 TF is nearly silent under load. I chose based on the noise - but if I had a really good case, I'd be tempted.

Re:Anandtech Fucked Up (0)

Anonymous Coward | about 5 months ago | (#45342481)

they'd have had it almost half as loud (looks like everyone else managed to stay under 50 dB.)

So what magic did the rest do to get it not just half as loud (54 dB) but a fifth as loud or more, at below 50 dB?

Re:Anandtech Fucked Up (1)

Khyber (864651) | about 5 months ago | (#45342901)

Decibels are a logarithmic scale: every +/-10 dB doubles or halves the loudness. What nonsense are you speaking?

Re:Anandtech Fucked Up (0)

Anonymous Coward | about 5 months ago | (#45343267)

No, each 10 dB is 10 times the loudness [wikipedia.org]. Double or half the loudness is approximately +/- 3 dB.
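
Both rules of thumb refer to real quantities, just different ones: a +10 dB step is ten times the sound intensity (power) but, by the usual psychoacoustic approximation, only about twice the perceived loudness, while +3 dB doubles the intensity. A quick sketch of the arithmetic (plain Python; the dB deltas are illustrative, not the review's measurements):

<ecode>
def intensity_ratio(db_diff):
    # Sound intensity (power) ratio for a given dB difference.
    return 10 ** (db_diff / 10)

def perceived_loudness_ratio(db_diff):
    # Psychoacoustic rule of thumb: +10 dB sounds about twice as loud.
    return 2 ** (db_diff / 10)

for diff in (3, 7, 10):  # e.g. 57 dB vs 50 dB is a 7 dB delta
    print("+%d dB -> %.1fx intensity, ~%.2fx perceived loudness"
          % (diff, intensity_ratio(diff), perceived_loudness_ratio(diff)))
</ecode>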

Re:Anandtech Fucked Up (4, Insightful)

adolf (21054) | about 5 months ago | (#45342787)

Noise measurements (all noise measurements, not just those related to PC hardware) are always suspect:

What is the ambient noise level?

What is the test environment? (Is it a well-isolated anechoic chamber, a common desk with a computer near the corner of the room, or is it on the deck of a boat, or on the back of a llama? It makes a huge difference.)

What is the distance between the rig under test and the measurement rig with the microphone?

Is this test rig calibrated? (To what standard?)

What are the properties of the noise? (If it is 57 dBA at 1.5kHz, it is very annoying to me. If it's 57 dBA only at 25kHz, it is annoying only to my dog.)

Is the noise different in differing directions?

How do you know?

Did you measure it?

It's all important, lest the resultant number be absolutely unimportant.

Also: Meh. "This blue car sounds better than that other blue car!" is roughly as accurate as a nondescript "noise measurement" of computer hardware.

Re:Anandtech Fucked Up (0)

Anonymous Coward | about 5 months ago | (#45342991)

You can say such-and-such dBA doesn't matter, and for the reasons listed, a measured dBA figure isn't proof that one card will sound louder than another. But in the real world, GPUs with higher dBA readings usually do sound louder than those with lower ones.

You could measure two GPUs, one at 1 dBA and another at 9 trillion dBA, and say that the measurement doesn't tell the whole story. And I'd agree. But I'd also agree that the 1 dBA GPU would probably sound completely silent and the 9 trillion dBA GPU would probably blow my head off.

Re:Anandtech Fucked Up (1)

dunkelfalke (91624) | about 5 months ago | (#45343261)

Oh yes, I still remember the Delta fan on that particular GlobalWin copper CPU cooler. I built the PC for an almost-deaf gramps, and he actually complained about how loud it was. Vacuum cleaners pale in comparison.

Fucking product numbers, how do they work? (0)

Anonymous Coward | about 5 months ago | (#45341651)

Who are the jackasses in charge of marketing at these companies?

The numbers AMD is flinging around as product names are bloody confusing, and I used to build computers for a living (though, admittedly, that was back in the pre-Intel-"Core" days, when anyone knew that a Pentium 4 was better than a Pentium 3). They look like revisions of the silicon design or something, and the "X" at the end isn't helping either (it makes me think it's a multiplier or something). It all kind of reminds me of how AMD used to name their CPUs, using some bullshit number they pulled out of their asses rather than an actual megahertz rating like everyone else.

It amazes me that people can't come up with a basic, simple, scalable numbering scheme that makes it obvious something is newer than something else, and more or less powerful at the same time (because not everything new is better). I would think this would be a priority so people can figure out what the fuck they want to buy. But then again, maybe that's the point: confuse customers so much they end up purchasing what you want them to buy, rather than what they actually need.

Re: Fucking product numbers, how do they work? (0)

Anonymous Coward | about 5 months ago | (#45341715)

They did not use clock speeds because clock speed was an unfair comparison against Intel's failed NetBurst architecture.

They used those numbers for their CPUs because consumers were the morons who equated MHz with performance.

Re:Fucking product numbers, how do they work? (2)

viperidaenz (2515578) | about 5 months ago | (#45342025)

A 1.4GHz P3 is a hell of a lot faster than a 1.4GHz P4.
Sort of like how a 1.6GHz Pentium M offers around the same performance as a 2.4GHz P4...

Re:Fucking product numbers, how do they work? (2)

AK Marc (707885) | about 5 months ago | (#45342079)

If it's factual (clock speed or such) then it isn't as easily trademarked. Intel lost the trademark on 486, didn't they?

Better headline: AMD's Radeon R9 290 Slashvertised (4, Insightful)

Gravis Zero (934156) | about 5 months ago | (#45341667)

seriously, it could only have been worse if there was "ON SALE NOW!" in the summary. then again, there is "Nvidia cut GeForce prices" so meh.

Re:Better headline: AMD's Radeon R9 290 Slashverti (2)

aliquis (678370) | about 5 months ago | (#45342289)

Not really. It's running harder to reach performance similar to a higher-tier card, with high energy consumption, lots of noise, and a GTX 780 Ti coming soon.

It may sell to some, not to others.

Re:Better headline: AMD's Radeon R9 290 Slashverti (1)

Mashiki (184564) | about 5 months ago | (#45342305)

Considering how shitty Nvidia's drivers have been since ~292.xx? They'd have to pay me to buy one of their cards at this point. It seems they've done a great flip, as has happened in the past, and outsourced their driver development to 3 cats and a dog.

Re:Better headline: AMD's Radeon R9 290 Slashverti (3, Interesting)

Rockoon (1252108) | about 5 months ago | (#45342803)

I was always (post-3dfx) an Nvidia GPU user until this year, and it was the drivers and their negative effects that prevented me from choosing Nvidia this time around.

It wasn't always that way. For the most part you could just use the latest drivers and everything would be OK, but about 2 years ago I started having issues where one game wouldn't work with a given driver while another game wouldn't work with the drivers the first game worked with... which bothered me, but didn't push me over the edge. Then came the reports in June of the newest drivers killing cards and rendering horrible artifacts in many games...

It's a shame, because I was really eyeballing that vanilla GTX 650 that runs on 64 watts...

In the interim I picked up an A10-6800K with its integrated HD 8670D, which I am extremely impressed with (low expectations shattered), and now I am eyeballing the HD 7790 that runs on 85 watts.

Re:Better headline: AMD's Radeon R9 290 Slashverti (0)

Anonymous Coward | about 5 months ago | (#45342817)

Considering how shitty nvidia drivers have been since ~292.xx? They'd have to pay me to buy one of their cards at this point, seems that they've done a great flip as has happened in the past, and they outsourced their driver development to 3 cats and a dog.

As a user of AMD cards, I can assure you that the AMD drivers still suck as well.

Newest bug: starting a fullscreen game on a dual-monitor system causes the second monitor to black out. Not turn off, mind you, just go 100% black whilst still being powered on. This did not happen on 13.4, but they broke it somehow in 13.9. It's like they spend all their time finding ways to make the drivers progressively worse[*]. Either that, or the things are horrid messes of spooky-action-at-a-distance spaghetti code... probably the latter.

Oh, also random mouse cursor corruption. The arrow cursor mutates into a white dotted vertical bar until you refresh it.

[*] The only bright side seems to be that they finally fixed the frame-rate drop bug on the Windows desktop with Aero. I'm not 100% sure on this though; I'll have to see if it starts happening again. (This bug has existed for at least 2 years at this point.)

Re:Better headline: AMD's Radeon R9 290 Slashverti (0)

Anonymous Coward | about 5 months ago | (#45342797)

Better headline: AMD's Radeon R9 290 Slashvertised

I thought IT people liked gadgets and cool consumer hardware?

Funnily enough, consumer hardware is made by companies that sell it for money; we don't have a solution to 'fix' that yet.

So what? (-1)

Anonymous Coward | about 5 months ago | (#45341669)

If you spend north of $150 for a video card, you're a sucker. I don't care what kind of "gamer" you are.

Title should focus on AMD vs Nvidia (4, Interesting)

Lawmeister (201552) | about 5 months ago | (#45341677)

The real story is that a $400 AMD card can perform as well as or better than a $1000 Nvidia one...

Re:Title should focus on AMD vs Nvidia (0)

Anonymous Coward | about 5 months ago | (#45342035)

Honestly, it is business as usual. AMD has been superior per dollar for years, but Intel/Nvidia have been winning with advertising and partnerships.

Re:Title should focus on AMD vs Nvidia (1)

artor3 (1344997) | about 5 months ago | (#45342063)

The FPS per dollar scatter plot on page 9 of the linked article (here [techreport.com]) is really telling. There's a surprisingly tight correlation between dollars and FPS for almost all of the cards, and then the GTX Titan is way off in no man's land. Nvidia's going to have to drop the price, unless it's just there to soak up money from people with more dollars than sense.
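
The metric itself is simple: average frame rate divided by street price. A minimal sketch of the calculation (the prices and frame rates below are rough placeholders for illustration, not figures from the article):

<ecode>
# Hypothetical (card, price in USD, average FPS) tuples -- not real data.
cards = [
    ("Radeon R9 290",     399, 60.0),
    ("Radeon R9 290X",    549, 62.0),
    ("GeForce GTX 780",   499, 58.0),
    ("GeForce GTX Titan", 999, 63.0),
]

# Rank by value; the Titan predictably lands at the bottom.
for name, price, fps in sorted(cards, key=lambda c: c[2] / c[1], reverse=True):
    print("%-18s %5.1f FPS per $100" % (name, fps / price * 100))
</ecode>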

Re:Title should focus on AMD vs Nvidia (4, Informative)

gman003 (1693318) | about 5 months ago | (#45342501)

The Titan isn't positioned as a high-end gaming card as much as it is a low-end scientific computing card. It's the cheapest GPU with reasonable double-precision floating-point performance. For whatever reason, most Kepler cards run DP operations at 1/24th the speed of single-precision, but the Titan and most of the Tesla cards run them at 1/3rd the speed. There, the Titan costs thousands less than the comparable Tesla cards (the K20 is listed on Newegg for $3500, and the K20X is on Amazon for $7700).

The fact that the Titan also gets some buyers among gamers with way too much money is just a side bonus. Ever since the 780 came out, it's been extremely wasteful to get a Titan for gaming, and Nvidia's own 780 Ti is likely to outperform the Titan in games for $300 less. Really, I think the only reason they ever marketed it as a gaming card was as a publicity stunt - they held the title of "fastest card ever" for quite a while, and they held it by an impressive lead.
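
To see what those ratios mean in practice, here's a back-of-the-envelope sketch (the single-precision figure is a made-up stand-in, not a quoted spec):

<ecode>
def dp_gflops(sp_gflops, dp_sp_ratio):
    # Theoretical double-precision throughput from the single-precision rate.
    return sp_gflops * dp_sp_ratio

sp = 4500.0  # hypothetical single-precision GFLOPS for a Kepler-class card
print("consumer Kepler (1/24): %6.0f DP GFLOPS" % dp_gflops(sp, 1.0 / 24))
print("Titan / Tesla    (1/3): %6.0f DP GFLOPS" % dp_gflops(sp, 1.0 / 3))
</ecode>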

Re:Title should focus on AMD vs Nvidia (5, Interesting)

Entropius (188861) | about 5 months ago | (#45342709)

This.

Yesterday and today I installed 20 Titans in a compute cluster at work, replacing the crappy GTX 480s that crash constantly. The cost of these ($20K) would buy us TWO nodes on the local K20 cluster.

We don't even care that much about the float performance; much of our code is memory-bandwidth bound, and much of the rest runs iterative sparse-matrix solvers that can be run in "mixed precision", where you iterate a hundred times in single precision, do one update in double precision, a hundred more in single... So we could use the cheaper gamer cards, but the Titan's a price/performance sweet spot that we can't beat. It's even faster than the K20, and compared to the other gamer cards, the 6GB of memory gives us a huge amount of flexibility.
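
The mixed-precision pattern described here is essentially iterative refinement: do the cheap work in single precision and correct the answer in double precision now and then. A minimal dense NumPy sketch of the idea (the commenter's real solvers are sparse and GPU-resident; this only shows the shape of the algorithm):

<ecode>
import numpy as np

def mixed_precision_solve(A, b, outer_iters=20, tol=1e-12):
    # Iterative refinement: float32 inner solves, float64 residual updates.
    A32 = A.astype(np.float32)
    x = np.zeros_like(b)
    for _ in range(outer_iters):
        r = b - A @ x                       # residual in double precision
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        d = np.linalg.solve(A32, r.astype(np.float32))  # cheap correction
        x = x + d.astype(np.float64)        # accumulate in double precision
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200)) + 200 * np.eye(200)  # well-conditioned
b = rng.standard_normal(200)
x = mixed_precision_solve(A, b)
print(np.linalg.norm(A @ x - b))  # residual near double-precision accuracy
</ecode>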

Re:Title should focus on AMD vs Nvidia (0)

Anonymous Coward | about 5 months ago | (#45343527)

Out of curiosity, what do you run on these cards?

Re:Title should focus on AMD vs Nvidia (5, Informative)

Nemyst (1383049) | about 5 months ago | (#45342115)

I was modded down the last time I talked about this, so let me be even clearer: if you buy a Titan for gaming, you're either stupid or have a lot of money to waste. The sole reason the Titan is priced where it is is that it has full double-precision speed, similar to the Quadro cards, which retail for many thousands of dollars (well, that and the fact that NVIDIA had zero competition at such a high range for the better part of a year). They're effectively semi-pro cards for number crunching. NVIDIA thinks that this is enough to warrant the price, and to be honest I'd probably take one over a Quadro (which can run up to something like $5,000!).

But again, for gaming, it's entirely unnecessary. Heck, it's extremely likely that the 780 Ti, which should be revealed in a few days, will basically be a Titan with higher clocks, slower double-precision operations, and less VRAM (whereas the 780 also has a few fewer cores).

Re:Title should focus on AMD vs Nvidia (0)

Anonymous Coward | about 5 months ago | (#45343313)

Then why is it that so many folks are still so attracted to the $1000 one? Because the tech is stagnating, and it offers different features and is designed for different people, who aren't all about gaming. Let's face it: the Titan has been Nvidia capitalizing on AMD being 8 months behind, and we all know what notoriously poor investments our precious computer hardware makes over time.

Also, I own a 290X, and I still don't give a damn about heat or power, just as I didn't back in the days of the GTX 480. That said, I'm pretty sure the 290 is a poorer-binned 290X clocked as high as its increased fan profile allows. It looks like good value on paper, but when thermal envelopes are maxed out, the GeForces look pretty much on par performance-wise while still drawing less power.

Be Warned, Anandtech was paid off (-1, Troll)

Anonymous Coward | about 5 months ago | (#45341739)

Nvidia has been holding VERY profitable meetings with every possible technical site, explaining in detail just how they should trash the new AMD cards in their forthcoming reviews. Sites that follow Nvidia's guidelines and use Nvidia's specific language to denigrate the new AMD GPUs receive very handsome payments.

AMD's new Hawaii/290 chip is MUCH smaller than the Nvidia part it thrashes at enthusiast resolutions (only a fool would use these cards for 1080p or less). The consequence of going for a smaller die with a larger memory bus is that AMD's part consumes more power than the 'Titan' equivalent from Nvidia, an issue enthusiasts absolutely do NOT care about (and when idling, e.g. doing desktop stuff, even these high-end cards from Nvidia and AMD use only very modest amounts of power).

AMD can, and will, sell its 290 at much lower prices than Nvidia. This despite the fact that only AMD has the TrueAudio sound-acceleration DSP block, and only AMD supports the new Mantle API that exactly apes the low-level 'to-the-metal' coding possible on the new consoles from Sony and MS (both of which exclusively use AMD tech for their CPU and GPU functions).

Nvidia is in maximum FUD mode, devoting tens of millions of its PR budget to paying shills to bad-mouth AMD. In truth, until Nvidia can launch new parts based on a process shrink to 20nm late next year, Nvidia really has no option but to rely on disinformation.

It gets WORSE. Now that the new consoles are here, each with 8GB of memory, the average amount of GPU memory needed for 1080p or above is about to rise above 2GB for the first time. Nvidia's high mid-range cards (the 770/680 and below) only come with a non-future-proof 2GB, unless you invest in a mega-expensive 4GB version of the 770. AMD, on the other hand, has been selling future-proof high mid-range cards with 3GB at a reasonable price for years now: the 3GB 7950/7970 (and now the 3GB 280X). AMD's 7950 is less than HALF the cost of Nvidia's 4GB 770. Nvidia simply does not sell affordable 3GB+ cards.

Nvidia is fully aware that 2GB will be completely inadequate for games released at the end of 2014, but considers this "built-in obsolescence", forcing their foolish fanboys to upgrade more often than they might otherwise wish. However, there is ZERO chance of Nvidia selling any card with 3GB+ at a reasonable cost in the foreseeable future.

BTW, when your graphics card has too little memory, it doesn't cease working, but unless you lower the quality settings, running out of local RAM tends to HALVE the framerate. You can see this effect in benchmarks comparing roughly equivalent cards, say the 1GB 6870 vs the 2GB 7790 (260X) at 1080p. Next year, 3GB AMD cards will be getting vastly higher framerates than 2GB Nvidia cards with otherwise similar GPU power. And that will be BEFORE the advantages of a 'Mantle' version are taken into account.

Current Nvidia products have never been less future-proof, and Nvidia has an uphill battle to reverse this fact next year.

Re:Be Warned, Anandtech was paid off (1)

thesupraman (179040) | about 5 months ago | (#45341895)

Astroturf much?
Just out of interest, since I could do with a bit of free cash, how much is AMD paying for shilling these days?

Re:Be Warned, Anandtech was paid off (0)

Anonymous Coward | about 5 months ago | (#45342145)

Nvidia does more shilling on Slashdot than almost any hardware company in existence.

Re:Be Warned, Anandtech was paid off (1)

Smauler (915644) | about 5 months ago | (#45342327)

It gets WORSE. Now the new consoles are here, each with 8GB of memory, the average amount of GPU memory needed for 1080P or above is about to rise above 2GB for the first time.

Erm... this doesn't even make sense. At all. Are you claiming that to display 1080P you need a graphics card with 2GB onboard? Are you claiming that the new consoles have 8GB GPU RAM? What?

Besides, the amount of onboard RAM has long been an utterly useless metric for determining graphics performance. Since RAM is so cheap, nVidia and ATI often just drop lots of it on a crappy architecture and advertise the RAM.

um (2)

Charliemopps (1157495) | about 5 months ago | (#45341777)

Who the hell spends $400+ on a video card anymore? How many games will come out in the next year that will get any benefit from a card over $200? 2? Maybe 3? And don't forget, a year from now the $200 mid-range cards will outperform this card anyway.

Moreover... (0)

Anonymous Coward | about 5 months ago | (#45341849)

Who wants to spend $400 on a card that doesn't even support double-precision floating point at 1/4 of its SP processing capability?

Seriously, my 4-year-old HD 4770 STILL has better DP ratings than the majority of currently released cards up to around the $250 price point (MSRP).

Re:Moreover... (0)

Anonymous Coward | about 5 months ago | (#45342029)

GPUs are still primarily for graphics. Not sure I really need to pay more for faster DP performance so I can have 192-bit color; SP is already 32 bits per channel, with 3 color channels + alpha.

Re:um (0)

Anonymous Coward | about 5 months ago | (#45341869)

People who buy $400-$1000 cards just don't want to hear it; I've tried.

In a way it is a good thing: they subsidise the R&D for the excellent $200 cards.

Re:um (4, Insightful)

Anrego (830717) | about 5 months ago | (#45341949)

That's kinda how all consumer (and even most non-consumer) stuff works.

You have the enthusiasts who for whatever reason have a stronger interest in the technology and are willing to spend significantly more for slightly better. They fund the R&D until it makes it down to the cheaper mass consumer pricing.

Personally I don't see anything wrong with this. I for one was an early adopter of SSDs. I bought one (then another) when 30G was still a big deal. I knew in a few years you'd get way more capacity for way cheaper.. but I didn't care, it was something I wanted to play around with.

If someone has the money to spend and is going to get enjoyment out of paying $1000 for a card where a $200 or so card would probably do, so what... their money, their hobby.

Re:um (0)

Anonymous Coward | about 5 months ago | (#45342019)

There is a big difference between buying an early SSD, which had huge measurable benefits, and buying a top-end video card, which does... very little, really. Many people I know who buy these don't really have 'a stronger interest in the technology'; they just want to show off how big their numbers in X are.

In a blind test, I'm certain most of them wouldn't even know the difference if you gave them a $200 card.

Re:um (3, Informative)

BlueBlade (123303) | about 5 months ago | (#45342883)

I think you're underestimating how much GPU power games need these days. I bought a Dell 30" monitor 5 years ago, which I'm still using for gaming. The native resolution is 2560x1600, so not even close to the new 4K ones. At this resolution, my 3-year-old Radeon 5870 was struggling to get smooth framerates in several games. So I bought the new GTX 780 when it came out, for $600. The new card is fantastic; I can finally play The Witcher 2 at full resolution with high settings, same with BioShock Infinite, etc. Keep in mind, the new 4K resolutions will demand even more from GPUs, so GPU demand isn't likely to drop much yet.

Sure, if you're a gamer who fires up a 1080p console port once in a while, a cheap GPU will do. If you're an avid gamer who needs more than 1080p, you still need to buy the $400+ cards to keep up.

Re:um (1)

AHuxley (892839) | about 5 months ago | (#45341907)

http://www.techspot.com/review/734-battlefield-4-benchmarks/ [techspot.com]
That's a year of gaming on top settings, or emerging 4K resolutions, to consider. We have this generation of games, SSDs, Windows 8.1, CPUs, bandwidth, RAM, and LCDs at roughly usable levels.
The GPU, as one card or several, is the interesting part to get right, given drivers and ongoing issues.
Drop the resolution and quality, and today's mid-range cards are good, but where is the fun in that :)
How the brands write their code, deal with the heat, and work across 2 or more cards is always fun to read about.

Re:um (1)

kangsterizer (1698322) | about 5 months ago | (#45341989)

Since dual top-of-the-line cards (SLI/CF) from AMD/Nvidia barely put out 30fps at 4K in recent games, I'm guessing we need better video cards, not worse.

Or better eyes (1)

rsilvergun (571051) | about 5 months ago | (#45342371)

My 22" monitor's picture doesn't get better past 1080p. Heck, a 40" TV's doesn't... 4K is a novelty for anyone with less than a 70" screen, except maybe die-hard flight-sim and racing fans.

Re:Or better eyes (0)

Anonymous Coward | about 5 months ago | (#45342459)

Bullshit.

Re:Or better eyes (1)

aliquis (678370) | about 5 months ago | (#45342461)

I don't get why people talk about inches. Put the screen at whatever distance and look at it.

You don't put a 7" screen at the same distance as a 120" one.

As for resolution, I guess it can get "better"; the eye only delivers good sharpness in a very small spot, but wherever you happen to look, it may matter.

Beyond progress simply taking time, I guess the reasons we haven't seen more 4K so far are that HDMI wasn't up to it (at decent frame rates, at least) and the lack of media or distribution content for it.

Some tablets already do QHD. Maybe you think that is useless at a distance where you can see the whole screen, but what if you show various widgets on it and you're just looking at one of them? Like one for your mail inbox or whatever?

Though of course we're likely close to the point where it doesn't matter for people with normal eyesight.

I wouldn't mind more sharpness over what I'm seeing now, and I guess some of the smoothness of what I'm seeing isn't coming from the resolution but rather from FreeType's (?) sub-pixel tricks.

Re:um (5, Interesting)

FlyHelicopters (1540845) | about 5 months ago | (#45342125)

People who play at a higher resolution than 1080p. I currently game using three Dell 30" monitors, so I have a total of about 12 million pixels to push, which is roughly 50% more than 4K. I could use a pair of 290X cards in CrossFire, once they get aftermarket coolers. Yes, I know I'm not a typical user, but you did ask "who the hell spends $400+ on a video card". The answer would be me and people like me. Considering that I have $3,000 worth of monitors on my desk, $1,000 for a pair of 290X cards in CrossFire is not really all that crazy.
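
The pixel math checks out; a two-line sanity check:

<ecode>
triple_30in = 3 * 2560 * 1600  # three 30" panels: 12,288,000 pixels
uhd_4k = 3840 * 2160           # consumer "4K" UHD:  8,294,400 pixels
print(triple_30in, uhd_4k, triple_30in / uhd_4k)  # ratio ~1.48, i.e. ~48% more
</ecode>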

Re:um (1)

TheLink (130905) | about 5 months ago | (#45343145)

What sort of games do you play? Flight sim?

The larger monitors tend to have higher latencies, so they're not so good for games where higher lag would make a difference. Should be fine for flight sims I guess.

http://www.displaylag.com/display-database/ [displaylag.com]
There aren't many big monitors with 16ms lag (16ms = 1 frame at 60Hz), except maybe some Sonys? For some reason the lag tends to get crappier the bigger the screen gets. Despite what the database says, I don't consider 30ms lag to be great when it comes to playing games.
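
The frames figure is just lag divided by the refresh period; a quick sketch of the conversion (the lag values are arbitrary examples):

<ecode>
def lag_in_frames(lag_ms, refresh_hz=60):
    # Express display lag as a count of refresh-rate frames.
    return lag_ms / (1000.0 / refresh_hz)

for lag in (16, 30, 50):  # arbitrary example latencies in milliseconds
    print("%d ms ~= %.1f frames at 60 Hz" % (lag, lag_in_frames(lag)))
</ecode>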

Re:um (1)

L4t3r4lu5 (1216702) | about 5 months ago | (#45343369)

Flight- and driving-sim enthusiasts, the ones who spend £500 on full replica flight controls for an A-10, or mount a Recaro bucket seat and pro-grade pedal / wheel combo in a dedicated frame in front of their PC for the full rally experience. Where previously they'd need to run SLI / CrossFire cards, they can now do it with one card.

Also, 14 year olds who have daddy's credit card number and want super-realistic explosions while playing CoD / Battlefield online.

Re:um (2)

asmkm22 (1902712) | about 5 months ago | (#45342147)

I do, but it's not as bad as you think. I started buying the $1k cards about 10 years ago, and I sell them after a year for roughly $700-800. There always seem to be people looking for "older" cards to SLI with their current setup. So although I initially did pay $1k to buy into the game, so to speak, I rarely spend more than $200-300 to upgrade to the latest and greatest at any given time. It's not like I'm dropping $1k a year.

Do I need it? Definitely not, since the popularity of consoles has gimped the advancement of graphics in games. Is it worth a couple hundred a year to stay on top? It is for me.

Re:um (1)

aliquis (678370) | about 5 months ago | (#45342323)

Yeah, fantastic! So by getting a $200 card one year from now, I can spend the same (yeah, you can sell the used card...) and just get the performance later!

As for games that benefit, I'd say lots.

Re:um (1)

tomofumi (831434) | about 5 months ago | (#45342453)

Time is money, and you can enjoy the latest tech earlier than most others (rather than waiting a year...). If money is not a problem for you, why not?

AMD "Quality" Strikes Again (0)

Anonymous Coward | about 5 months ago | (#45341965)

http://arstechnica.com/gadgets/2013/11/amd-stomps-nvidia-with-r9-290-at-least-in-reviews/

If you read the comments, it looks like Catalyst is, by default, set to spin the fans too slowly.

Seriously, $400-$1000 for a video card? (-1)

Anonymous Coward | about 5 months ago | (#45341983)

Look on CL - you can get a jon boat, motor, and trailer for that.

Really, it's time to put the mouse away.

Frag-muhn-teh-shun! (0)

Anonymous Coward | about 5 months ago | (#45342135)

Stop it! We have too many PC graphics cards already. How will the developers cope with all the different hardware?!

Nvidia GeForce (1)

Anonymous Coward | about 5 months ago | (#45342477)

I don't mean to nitpick, but "Last week, Nvidia cut GeForce prices..." should read "Last week, Nvidia cut GeForce prices on the GTX 770 and 780." I checked the price of an Nvidia GeForce 650, and it was unchanged.

Expensive, noisy, power hungry (0)

Anonymous Coward | about 5 months ago | (#45343325)

400W, competes with vacuum cleaners on noise, and costs a lot.
Why does no one do a process shrink on last generation's chips and put them out as low-noise, low-power "good enough" options?

To those who ask, who needs such powerful cards? (0)

Anonymous Coward | about 5 months ago | (#45343433)

After you have tried a 4K monitor, get back to me and this thread.

Obligatory notification: ATI's Linux drivers are pathetic.

Greekgeek
