
Retail Radeon R9 290X Graphics Cards Slower Than AMD's Press Samples

Soulskill posted about 4 months ago | from the if-you-can't-trust-marketing-departments,-who-can-you-trust dept.


crookedvulture writes "AMD's recently introduced Radeon R9 290X is one of the fastest graphics cards around. However, the cards sent to reviewers differ somewhat from the retail units available for purchase. The press samples run at higher clock speeds and deliver better performance as a result. There's some variance in clock speeds between different press and retail cards, too. Part of the problem appears to be AMD's PowerTune mechanism, which dynamically adjusts GPU frequencies in response to temperature and power limits. AMD doesn't guarantee a base clock speed, saying only that the 290X runs at 'up to 1GHz.' Real-world clock speeds are a fair bit lower than that, and the retail cards suffer more than the press samples. Cooling seems to be a contributing factor. AMD issued a driver update that raises fan speeds, and that helps the performance of some retail cards. Retail units remain slower than the cards seeded to the press, though. Flashing retail cards with the press firmware raises clock speeds slightly, but it doesn't entirely close the gap, either. AMD hasn't explained why the retail cards are slower than expected, and it's possible the company cherry-picked the samples sent to the press. At the very least, it's clear that the 290X exhibits more card-to-card variance than we're used to seeing in a PC graphics product."
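To illustrate what "up to 1GHz" means in practice, here is a rough sketch of how a PowerTune-style governor behaves. It is a toy model in Python: the thermal model, step size, and cooler-effectiveness numbers are invented for illustration, and only the 1GHz ceiling and the 95C target come from the discussion below; this is not AMD's actual algorithm.

    # Toy sketch of PowerTune-style clock management. The thermal model and all
    # tuning numbers are invented; only the 1GHz cap and 95C target are from the story.
    MAX_CLOCK = 1000.0   # MHz, the advertised "up to 1GHz"
    TEMP_LIMIT = 95.0    # degrees C, the card's thermal target
    STEP = 13.0          # MHz per adjustment step (hypothetical)

    def average_clock(cooler_effectiveness, ambient=25.0, steps=2000):
        """Average sustained clock in a toy model where temperature rises with
        clock speed and falls with cooler effectiveness (both hypothetical)."""
        clock, total = 500.0, 0.0
        for _ in range(steps):
            temp = ambient + clock * (1.0 - cooler_effectiveness) * 0.1  # toy thermal model
            clock += -STEP if temp > TEMP_LIMIT else STEP                # throttle or boost
            clock = max(300.0, min(MAX_CLOCK, clock))
            total += clock
        return total / steps

    print(f"well-cooled card:   {average_clock(0.35):.0f} MHz average")
    print(f"poorly cooled card: {average_clock(0.20):.0f} MHz average")

Because the governor only guarantees a ceiling, two cards with identical GPUs but different cooling settle at different average clocks, which is exactly the card-to-card variance described above.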

111 comments

I never got the fascination with AMD/ATI (-1)

Anonymous Coward | about 4 months ago | (#45600039)

I mean, it's one thing to go against the crowd, but it's another thing to cripple your computer with shitty parts because you want to be the special snowflake that doesn't go with the tried and true Intel/Nvidia combo.

Seriously. You're like the people that use BSD instead of Linux or Dvorak instead of Qwerty.

Re:I never got the fascination with AMD/ATI (1)

Anonymous Coward | about 4 months ago | (#45600071)

AMD cards are faster for GPGPU. They crunch numbers faster than nVIDIA cards, at the cost of more power and flakier drivers. Also AMD has better free software drivers.

Re: I never got the fascination with AMD/ATI (1)

O('_')O_Bush (1162487) | about 4 months ago | (#45603103)

For some values of GPGPU. They are great for MHash (Bitcoin, last year) but poor for some BOINC projects. CUDA (prior to some specific changes made to bring AMD up to par) crunched 5-10x faster than its AMD equivalent.

Re:I never got the fascination with AMD/ATI (1)

DragonTHC (208439) | about 4 months ago | (#45603361)

Perhaps, but this is typical of AMD and ATI products ranging all the way back to my experiences with Athlons and the ATI Rage 128 graphics card.

Their hardware just isn't as stable as Intel or Nvidia. And that's my experience with 4 different AMD systems over the years. When you put an Intel computer on the same power supply, stability just happens.

Re:I never got the fascination with AMD/ATI (1)

GigaplexNZ (1233886) | about 4 months ago | (#45605413)

Hardware wise, I've had way more issues with NVIDIA. Software/driver wise, they've both had their fair share of issues. Don't go near AMD Enduro systems...

Re:I never got the fascination with AMD/ATI (1)

substance2003 (665358) | about 4 months ago | (#45604157)

Maybe AMD's open-source drivers are better, but Nvidia's drivers are better suited for rendering in Blender because of their CUDA support.
While few will buy a graphics card specifically for Blender 3D, it does show that you have to consider what you plan to use the card for when making a purchase, so you get the performance you need out of it.

Re:I never got the fascination with AMD/ATI (1)

Monsuco (998964) | about 4 months ago | (#45604841)

Also AMD has better free software drivers.

Yeah, but their proprietary Linux drivers are simply atrocious. Performance isn't so bad but stability is awful.

Re: I never got the fascination with AMD/ATI (1)

Anonymous Coward | about 4 months ago | (#45600111)

And you are like the people that always eat the same pizza, while complaining that there's no variation.

Re:I never got the fascination with AMD/ATI (2, Insightful)

Anonymous Coward | about 4 months ago | (#45600627)

I use Intel/Nvidia with FreeBSD because they have better driver support than AMD/ATI.

Re:I never got the fascination with AMD/ATI (2)

tom17 (659054) | about 4 months ago | (#45601271)

I used to be an Nvidia guy until last week. Just hopped onto the crypto coin mining bandwagon and my new 7950 card should pay for itself in ~2-3 weeks.

Nvidia are junk for this, unfortunately.

And it shows: try to find a new or second-hand 79xx card anywhere for a reasonable price. They have all been snapped up by the miners :(

Re:I never got the fascination with AMD/ATI (2, Informative)

arbiter1 (1204146) | about 4 months ago | (#45601463)

Um, yeah, with coin mining nowadays you will barely break even versus the cost of the card and the electricity needed to mine. You missed the bandwagon by about 1-2 years.
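Break-even is simple arithmetic: daily coin revenue versus daily electricity cost, plus the card's purchase price for the payback time. A back-of-the-envelope sketch in Python; every figure below is a hypothetical placeholder, not a real price, payout, or hash rate.

    # Back-of-the-envelope GPU mining profitability check. Every number below is
    # a hypothetical placeholder, not a real price, payout, or hash rate.
    card_cost_usd = 300.0            # purchase price of the card (hypothetical)
    hashrate_kh_s = 600.0            # scrypt hash rate in kH/s (hypothetical)
    revenue_usd_per_kh_day = 0.02    # payout per kH/s per day at current difficulty (hypothetical)
    card_power_w = 300.0             # power draw under mining load (hypothetical)
    electricity_usd_per_kwh = 0.12   # local electricity price (hypothetical)

    revenue_per_day = hashrate_kh_s * revenue_usd_per_kh_day
    power_cost_per_day = card_power_w / 1000.0 * 24.0 * electricity_usd_per_kwh
    profit_per_day = revenue_per_day - power_cost_per_day

    print(f"revenue per day:    ${revenue_per_day:.2f}")
    print(f"power cost per day: ${power_cost_per_day:.2f}")
    print(f"profit per day:     ${profit_per_day:.2f}")
    if profit_per_day > 0:
        print(f"payback time:       {card_cost_usd / profit_per_day:.0f} days")
    else:
        print("never pays for itself at these numbers")

Plug in your own card's hash rate, the current difficulty-adjusted payout, and your electricity tariff; depending on the coin and the month, the same card can land anywhere from comfortably profitable to a net loss, which is why two miners can reach opposite conclusions.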

Lots of coins (4, Interesting)

DrYak (748999) | about 4 months ago | (#45601545)

There are lots of alt-coins in addition to Bitcoin. Some are easier to mine on a GPU than others.

You need to pick a coin that plays well with your hardware.

Re:Lots of coins (1)

gl4ss (559668) | about 4 months ago | (#45605219)

Who buys those altcoins? Speculators? Do they have valid exchanges?

Re:Lots of coins (0)

Anonymous Coward | about 4 months ago | (#45605923)

Several altcoins are available on BTC-e [btc-e.com] (currently Litecoin, Namecoin, Novacoin, Peercoin, Feathercoin, Primecoin and Terracoin).

I would call BTC-e a "valid exchange" because 16% of Bitcoin trades happen there (http://bitcoincharts.com/charts/volumepie/ [bitcoincharts.com]). Bitstamp has 19%, and MtGox has 22%.

Indeed (1)

DrYak (748999) | about 4 months ago | (#45606821)

Yeah, and that's the main exchange where I trade mine.

A few other things:
- Much smaller scale, but Cryptsy [cryptsy.com] has almost the whole zoo.
- Lots of traders are playing around and speculating with minor coins.
- That means there's a lot of exchange traffic between major and minor alt-coins.
- That means it's easier to exchange whatever you mined for whatever is more useful for your transactions.

- LTC is starting to grow big enough to gain some independence from BTC.
- ...thus it's starting to get more acceptance (more payment processors are accepting it too).
- ...its fluctuation vs. fiat isn't as tightly coupled to BTC as before.

Re:I never got the fascination with AMD/ATI (1)

DragonTHC (208439) | about 4 months ago | (#45603377)

bwahahaha. If you were a serious miner, you'd have an ASIC rig.

GPU miners don't cover the cost of their electrical bills.

Re: I never got the fascination with AMD/ATI (0)

Anonymous Coward | about 4 months ago | (#45604625)

Ignorance means more money for me =)

Re:I never got the fascination with AMD/ATI (0)

Anonymous Coward | about 4 months ago | (#45605943)

He didn't say he was mining Bitcoin.

He could be mining Litecoin or other scrypt-based coins.

Re:I never got the fascination with AMD/ATI (0)

Anonymous Coward | about 4 months ago | (#45601905)

Intel/Nvidia are more expensive; AMD has a better price/performance ratio. Budget-to-midrange is cheaper with AMD parts.

Re:I never got the fascination with AMD/ATI (1)

Billly Gates (198444) | about 4 months ago | (#45602395)

My cheap AMD/ATI system is the most stable computer I ever owned.

The only BSOD was from an improper shutdown and file corruption that I caused.

the cards run at higher temps by default (4, Informative)

spacepimp (664856) | about 4 months ago | (#45600059)

This has been discussed in many places, like Tom's Hardware. Essentially they found that the card's voltage is determined in the BIOS and that the fan speeds can be altered. Change the BIOS (many updated versions have been released) and the undervolting that occurs at lower temperatures is solved. Sapphire has already released a new BIOS for the card that makes these changes to keep performance consistent while still keeping the cards from going above 95 degrees.

Re:the cards run at higher temps by default (1)

Sir_Sri (199544) | about 4 months ago | (#45600225)

There seem to be some problems with the cooling paste installed on a lot of cards as well, causing them to overheat and ramp themselves down.

liq N2 option (4, Funny)

harvey the nerd (582806) | about 4 months ago | (#45600503)

Someone obviously didn't buy the turbo liquid nitrogen supply option.

Re:liq N2 option (0)

Anonymous Coward | about 4 months ago | (#45601977)

Or they don't live in Canada, with its cold climate, where the ATI division is located.
I could really use a 95C GPU warmer now.

Re:the cards run at higher temps by default (4, Informative)

Nemyst (1383049) | about 4 months ago | (#45600625)

If you bothered to RTFA (I know!), you'd see that they indeed checked this out. They flashed the BIOS of their sample card onto their worst performing retail card. There was a small difference, but far from enough to make up for the gap between that card and the sample unit they received from AMD. It also made the retail card crash because the voltage was too low. The sample card managed to give better performance at lower fan speeds and voltages.

At this point it's reasonable to assume that AMD cherry-picked the cards they sent reviewers to make sure they were as good as they could be.
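If deliberate selection did happen, its statistical effect is easy to illustrate: drawing review samples from the top of a bin shifts the press average above the retail average even though every card meets the bin's floor. A toy simulation in Python with an invented clock distribution (not measured 290X data):

    # Toy binning/cherry-picking simulation; the clock distribution is invented.
    import random
    from statistics import mean

    random.seed(1)
    BIN_FLOOR = 850.0   # hypothetical minimum sustained clock the bin guarantees (MHz)

    def make_bin(n=10000, mu=900.0, sigma=25.0):
        """Chips that passed binning: normally distributed, truncated at the floor."""
        return [c for c in (random.gauss(mu, sigma) for _ in range(n)) if c >= BIN_FLOOR]

    chips = make_bin()
    retail_cards = random.sample(chips, 50)         # what buyers get: a random draw
    press_cards = sorted(chips, reverse=True)[:5]   # hand-picked best few chips

    print(f"retail average sustained clock: {mean(retail_cards):.0f} MHz")
    print(f"press  average sustained clock: {mean(press_cards):.0f} MHz")

Both groups are real chips from the same bin; the gap comes entirely from how the press units were chosen, which is also why a sample of one or two retail cards cannot settle the question on its own.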

Re:the cards run at higher temps by default (1)

Hamsterdan (815291) | about 4 months ago | (#45600747)

I wouldn't be surprised if nVidia/Intel/anyone else is doing the same.

Kinda like the picture on the menu and what you get on your plate.

Re:the cards run at higher temps by default (1)

ericloewe (2129490) | about 4 months ago | (#45600913)

It's surely common in any industry, but the performance difference in this case (assuming no weird stuff is going on) is more than what is generally considered reasonable.

The problem is probably the lack of a specific lower threshold to which the cards are held (which would also help explain the aggressive pricing).

Re:the cards run at higher temps by default (2)

Sable Drakon (831800) | about 4 months ago | (#45601031)

Of course they are. But they're also not delivering one product for review and a completely different product to sell. What AMD's done here is little more than a bait-and-switch that's coming back to bite them in the ass. Of course there's going to be some variance from card to card, but what's happening here goes far beyond what's expected.

Re:the cards run at higher temps by default (0)

Anonymous Coward | about 4 months ago | (#45604603)

Of course they are. But they're also not delivering one product for review and a completely different product to sell. What AMD's done here is little more than a bait-and-switch that's coming back to bite them in the ass. Of course there's going to be some variance from card to card, but what's happening here goes far beyond what's expected.

It ought to bite the review sites in the ass as well. I don't want to see performance results from a manufacturer selected piece of hardware. I want them to march their happy asses over to the goddamn retail store and pick one up off the shelf like anyone else does.

Re:the cards run at higher temps by default (1)

makomk (752139) | about 4 months ago | (#45606583)

Actually, Nvidia have been doing the same thing for a couple of generations of GPUs, as far as anyone's been able to tell; the press are just a lot less willing to kick up a fuss about anything they do than about AMD. (And I mean literally the same thing: designing their cards so the actual clock speeds and performance they run at out of the box vary from card to card, then cherry-picking the best ones to send to reviewers.)

Re:the cards run at higher temps by default (1)

arbiter1 (1204146) | about 4 months ago | (#45601525)

Nvidia does it too, but since they HAVE a base clock on their cards, whatever boost a card can do is on top of that. You know that all the cards, running at base clock, will be within a few % of each other, and not the 15-20% that some sites have seen with AMD cards. Press Nvidia cards seem to overclock like beasts, which does show what you could do if you are lucky enough to get one of those. Sadly, the "up to" terms AMD used make them sound like an ISP.

Not that FIX / Thermal performance (4, Informative)

DrYak (748999) | about 4 months ago | (#45601777)

If you bothered to RTFA (I know!), you'd see that they indeed checked this out. They flashed the BIOS of their sample card onto their worst performing retail card. There was a small difference, but far from enough to make up for the gap between that card and the sample unit they received from AMD

Not that BIOS. As others have pointed out in the thread, the variation in performance is more or less linked to variance in thermal management.
Not all PWM fans behave the same. There's a *newer* BIOS version (not as in "use the one that came with the sample" but as in "download the latest version that was made available on the manufacturer's website between when you bought it and now").
This version of the BIOS is better at computing what signal it should send to the fan to get better cooling.
And once the cooling is improved, the card will automatically scale up its speed.

Also, there can be differences in thermal grease, etc.

At this point it's reasonable to assume that AMD cherry-picked the cards they sent reviewers to make sure they were as good as they could be.

Or, instead of cherry-picking, maybe there's some build-quality difference between the first engineering samples sent by AMD and the mass-produced cards from a NONAME Asian manufacturer? (Or even mass-produced cards from very popular brands that have to fulfill lots of orders?)

Differences in the quality of the fans (NONAME will pick whatever is currently cheapest, and even with popular big names there's going to be some variance, depending on where the current batch was sourced).
Differences in the quality of thermal conduction at the interface. Differences in the quality of the thermal grease (NONAME will pick the cheapest; a big name might have variation between batches, especially if it sources batches from several manufacturers to keep up with the pace). Differences in the quality of the work (NONAME might even do a sloppy job of applying the thermal medium to the heatsink).

You end up with exactly the same chip produced by AMD but vastly different thermal conditions, all with firmware and a driver that aren't yet great at fan throttling, and you end up with some measurable difference in output.

...BUT...

Pick up Nvidia cards and you're going to see exactly the same effect: either cards that vary in their performance, or cards with big variations in temperature, depending on how the current firmware throttles the card.

Re:Not that FIX / Thermal performance (1)

drinkypoo (153816) | about 4 months ago | (#45606625)

Not all PWM fans behave the same

That doesn't matter if you are competent, because they have a tachometer lead. You don't just send a PWM signal and then trust that the fan is going at the speed you want.

Exactly (1)

DrYak (748999) | about 4 months ago | (#45606837)

And according to Tom's, that's exactly what the last BIOS update was all about: taking the tach feedback into better account.

Re:Exactly (1)

drinkypoo (153816) | about 4 months ago | (#45607507)

And according to Tom's, that's exactly what the last BIOS update was all about: taking the tach feedback into better account.

But this isn't exactly ATI's first time around the bases... how did they forget how PWM control works?

Re:the cards run at higher temps by default (0)

Anonymous Coward | about 4 months ago | (#45602227)

This was a bug: AMD assumed that fans on reference coolers would be roughly the same and that, e.g., 50% power (PWM) = 50% speed. OEMs sourced different fan motors that would spin slower at a given PWM duty cycle, causing their cards to throttle more often.

AMD has already released a fix [tomshardware.com] that measures fan RPM instead of blindly setting the speed. Retail card fans now spin at the correct (faster) speed and get better performance.
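The bug and the fix amount to open-loop versus closed-loop fan control. Below is a minimal sketch of the closed-loop idea in Python; the target RPM, gain, and fan characteristics are made up, and this is an illustration of the principle rather than AMD's firmware.

    # Closed-loop fan control sketch: a proportional controller on the tach reading
    # converges on the needed RPM even when different OEM fan motors reach different
    # speeds at the same PWM duty cycle. All numbers are hypothetical.
    TARGET_RPM = 3000.0   # speed the cooler actually needs (hypothetical)
    KP = 0.005            # proportional gain, % duty per RPM of error (hypothetical)

    def settle(rpm_per_duty_pct, steps=200):
        """rpm_per_duty_pct models how fast a particular fan motor spins per % duty."""
        duty = 40.0                            # the old open-loop assumption: "40% is enough"
        for _ in range(steps):
            rpm = rpm_per_duty_pct * duty      # stand-in for reading the tachometer line
            duty += KP * (TARGET_RPM - rpm)    # nudge the duty cycle toward the target speed
            duty = max(0.0, min(100.0, duty))
        return duty, rpm_per_duty_pct * duty

    for fan in (75.0, 55.0):                   # reference-like fan vs. a slower OEM motor
        duty, rpm = settle(fan)
        print(f"fan at {fan:.0f} RPM per % duty settles at {duty:.1f}% duty, {rpm:.0f} RPM")

Open-loop control (just setting 40% duty) only delivers the intended airflow if every fan motor happens to hit the same RPM at that duty cycle; the feedback loop converges on the RPM the cooler needs regardless of which fan is fitted.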

Re:the cards run at higher temps by default (0)

Anonymous Coward | about 4 months ago | (#45603993)

At this point it's reasonable to assume that AMD cherry-picked the cards they sent reviewers to make sure they were as good as they could be.

Yes. Clearly AMD cherry-picked one review sample to be ~5% faster than the retail cards, and the other review sample to be ~1% slower than the retail cards.

Oh, wait. That makes no bloody sense.

Re:the cards run at higher temps by default (1)

fuzzywig (208937) | about 4 months ago | (#45606581)

And this is why I'm going to be watercooling mine :)
(Assuming the waterblock gets any closer than Neufahrn, which is where it's currently languishing on its journey)

Card-to-card variance or install-to-install (0)

Anonymous Coward | about 4 months ago | (#45600103)

Because they aren't the same thing when dynamic clocking is enabled, and if you're serious enough to care you disable it. Duh.

AMD does what everyone else does. (-1)

Anonymous Coward | about 4 months ago | (#45600109)

And cherry picks the best units to send out to reviewers.
Can't seem to care too hard. Unless this wasn't an industry standard way of doing things... But it is.
Not just in computer related things either. The stuff sent to be reviewed is always the absolute best and pre-tested before the reviewers get it.

What I do care about, though: AMD cards still get me the best for my dollar.

So far my 9750 has had zero problems with any game. From DirectX to OpenGL. From old-ass abandonware DOSBox games to every single latest game that's come out since I bought it.

And I don't have to do the Nvidia driver shuffle to get everything from ancient games to the latest and greatest to work. That shit got old when I was an Nvidia user. This game needs this version. That game needs that driver version. This other game only works right with the default driver that came in the box with the card. I even had a little chart just to keep track of that shit. Looking through my archive CDs/DVDs recently I found dozens of Nvidia driver versions I kept around for the driver shuffle.

But on AMD? So far I'm using the driver that was new when I got Win7. Everything has worked flawlessly. Good 'nuff. FPS is high. Games run perfectly.

When I find a problem I'll bitch loudly. But so far... nope. Good 'nuff and a good price. I have no loyalty to any product, so long as what I buy does what I want with no problems.
And for this generation of hardware that turned out to be AMD/ATI. Mobo chipset and video.

Re:AMD does what everyone else does. (1)

rudy_wayne (414635) | about 4 months ago | (#45600985)

And cherry picks the best units to send out to reviewers.
Can't seem to care too hard. Unless this wasn't an industry standard way of doing things... But it is.
Not just in computer related things either. The stuff sent to be reviewed is always the absolute best and pre-tested before the reviewers get it.

Video card manufacturers have been cheating for more than 20 years, sending hardware and drivers to reviewers that have been tweaked and optimized far beyond what is available at retail.

Mendacity or incompetence? (4, Interesting)

fuzzyfuzzyfungus (1223518) | about 4 months ago | (#45600135)

Cherry-picking would be a bad thing; but if it turns out that the junior thermal paste application technicians get less attentive once the first production batch is finished and the people who've been babying the project leave, that wouldn't be a total surprise.

Re:Mendacity or incompetence? (0)

Anonymous Coward | about 4 months ago | (#45601283)

Cherry-picking would be a bad thing

That's common in every industry. You don't send the worst samples to demo to customers/clients/reviewers; you send the best. After all, manufacturing processes have variance by nature. Trying to make cherry-picking the controversy takes away from the main point that the performance differences are bigger than expected, implying a possible bait-and-switch.

...add variation in source. (1)

DrYak (748999) | about 4 months ago | (#45601821)

but if it turns out that the junior thermal paste application technicians get less attentive once the first production batch is finished and the people who've been babying the project leave, that wouldn't be a total surprise.

Now add to the mix that some parts, like fans, might be sourced from several different manufacturers (and according to sources, the BIOS wasn't very good at driving them until the latest update), add also that there might be variation in quality between the different batches of thermal paste (which very probably got sourced from several producers), and the output variation is clearly expected.

But it's also fixable (newest BIOS to compensate for the fans with better throttling, manually replacing the thermal paste). Now the cards work as well as the benchmarking samples, or even better.

Re:...add variation in source. (1)

fuzzyfuzzyfungus (1223518) | about 4 months ago | (#45602025)

There are a variety of possible sources of variation. And, since the part's (not illogical, if somewhat unhelpful for making firm predictions) thermal management strategy seems to be 'run as fast as possible; but don't die', those variations would show up in performance.

What I'd want to know (but am unlikely to find out) is whether AMD was actively misleading reviewers by sending them hand-picked, especially good cards, or whether review cards come from the initial run, probably the one where AMD's people are most tightly and nervously observing the process, rather than the potentially more variable just-another-day-slapping-chips-on-cards production that follows.

Merely variation is only inconvenient, and may well mean that the usual 3rd-party overkill heatsinks actually help significantly. Actively misleading reviewers? Not So Good.

Re:...add variation in source. (1)

Billly Gates (198444) | about 4 months ago | (#45602381)

The issue is the thermal paste [tomshardware.com], as well as the fact that chip manufacturers lower clock speeds until yields improve.

It is a standard process for all chip makers. What AMD did was pick the best of the best, where yields would not make enough defect-free chips at that speed for the demo.

As chip makers increase production and yield quality improves, the speed goes up as well. Notice that Tom's got close to, if not matching, the press demo with these tricks.

So these were not botched demos at all! However, you do need to void the warranty and apply some elbow grease, or water-cool them, and you will gain the same performance.

Hanlon's razor / Napoleon Bonaparte (1)

DrYak (748999) | about 4 months ago | (#45602443)

or whether review cards come from the initial run, probably the one where AMD's people are most tightly and nervously observing the process, rather than the potentially more variable just-another-day-slapping-chips-on-cards production that follows.

I would indeed agree with your first post and this part. To me, a big conspiracy to manipulate results is far less likely than simple sloppiness in a mass-produced good, where speed of production counts, in order to quickly meet demand.

To quote a variant of Hanlon's Razor (often attributed without sources to Napoleon Bonaparte):
"Never ascribe to malice what can adequately be explained by incompetence."

Merely variation is only inconvenient, and may well mean that the usual 3rd-party overkill heatsinks actually help significantly.

Yup, very probably. Especially with modern cards that try to go as fast as they can while staying within the target thermal envelope.

Typical (3, Insightful)

bit trollent (824666) | about 4 months ago | (#45600175)

This is becoming a habit for AMD. You can't even trust their FRAPS-measured frame rates. Seriously. I will never ever own another AMD card. Their graphics cards are nothing but empty promises and bad drivers.

Re:Typical (0)

Anonymous Coward | about 4 months ago | (#45600321)

I was a happy user of ATI cards. I do not currently have any problems with my post-buyout cards, but I will not buy any more.

Re:Typical (-1)

Anonymous Coward | about 4 months ago | (#45600497)

What a believably pathetic attempt at trolling. No wonder you were so easily swindled.

Re:Typical (0)

Anonymous Coward | about 4 months ago | (#45601159)

My first GPU was an ATi 9700 PRO, then I went Nvidia for a while with a 6600 GT and an 8800 GTS 640MB. Now I'm back using an AMD (ATi) 6850, and I must say that I will be returning to Nvidia and never looking back. The card (6850) runs extremely hot for its wattage and for some reason was designed so that it runs at about half the texel rate of equivalently priced cards -- and I didn't realize until I played Batman: Arkham City that PhysX support could actually have a large effect on how a game looks, to the point of large environmental details being left out. The fact that they do shady things like purposefully over-report performance to FRAPS and give better-performing cards to reviewers is the nail in the coffin.

Re:Typical (0)

Anonymous Coward | about 4 months ago | (#45601207)

I'm happy with my latest AMD card; I got to buy the 70 and upgrade it to a 90 just with firmware, and the card has never given me any trouble (made a lot of cryptocurrency along the way). It might be a bit noisier than Nvidia; manufacturers should make an AMD card that is water-cooled out of the box.

Re:Nvidia has worse drivers today (2)

Billly Gates (198444) | about 4 months ago | (#45602303)

The latest ones crash a lot, if you read maximumpc.com or tomshardware.com, or ask the Nvidia fanboys I raid with.

AMD has better-quality hardware with less flaky voltage regulators. I went through two Nvidia cards that failed over 8 years and switched to all AMD/ATI in 2010, with a Phenom II (better than Steamroller per clock tick, sadly) and an ATI 5750.

Had one bizarre issue with black edges on the screen when switching to HDMI. That problem went away after I went into the drivers and configured my screen to not do underscanning.

NO issues for 2 years! Just upgraded to an ATI 7850 and it runs GREAT. No black screens, no overheating, no weird driver issues at all. Things were different in 2002 with the ill-fated ATI Rage 128 Pro TV adapter; the Mac version was the only one that worked OK.

But damn, that was 11 years ago!

I admit I do not like Steamroller, but overall I got a nice $600 system that is 6-core and runs VMware accelerated, back in 2010, when I would have needed a $700 Intel Extreme Edition chip alone to match it!

What they do not tell you is that Intel cripples the BIOS and turns off Hyper-Threading and virtualization support to force you into the more expensive units. AMD chipsets never do this.

If AMD gets their act together and stops creating underperforming space heaters, I will consider AMD again for a CPU purchase. But what I have works great, and the Phenom II was a pretty competitive chip for its time.

Radeons are great GPUs and are very competitive with Nvidia. Drivers are worse today with Nvidia, unless they clean things up.

FPS TOO LOW!! (3, Funny)

SpaceManFlip (2720507) | about 4 months ago | (#45600277)

Oh gosh I hope this doesn't result in some poor sap attempting to play Battlefield 4.7 and while thinking they should achieve a pure 115fps they only hit a measly 92fps and their lives are ruined forever. The consequences will never be the same.

Re:FPS TOO LOW!! (1)

DigitAl56K (805623) | about 4 months ago | (#45600631)

When you pay for discrete graphics you're usually making purchase decisions based on performance/$. When the reviews all say you'll achieve a certain trade-off, but with the retail product you don't, then the market has been deceived - perhaps you would have been better off buying something else.

No comment as to what's actually going on in this case, if anything, since I haven't been following it...

Re:FPS TOO LOW!! (0)

ericloewe (2129490) | about 4 months ago | (#45600799)

Oh gosh I hope this doesn't result in some poor sap attempting to drive his car and while thinking they should achieve a pure 40mpg they only hit a measly 20mpg and their lives are ruined forever. The consequences will never be the same.

See how stupid you sound? Please redeem yourself by thinking carefully about the situation (potentially mislabeled product) and ideally apologizing for an utterly useless comment.

Re:FPS TOO LOW!! (1)

SpaceManFlip (2720507) | about 4 months ago | (#45601067)

92/115 = 0.8 ratio != 20/40 = 0.5 ratio. See how stupid you sound? Also, trolling an obvious joke with insults....

Nobody even brought up the monitor refresh rate.

Re:FPS TOO LOW!! (1)

JohnnyBigodes (609498) | about 4 months ago | (#45600817)

I get your point, but nobody's interested in playing Quake 1 any more :). Generally speaking, with the recent/upcoming graphics-intensive games, when you're set to a reasonably high resolution and the pretties up, you'll have a hard time maintaining a steady 50+ FPS in the middle of an intense firefight. Which is exactly the moment when you need it the most, even if anywhere else you get 100 FPS. The lows are what you want to avoid.

Besides, the whole point of burning $550 on a top-of-the-line graphics card is the expectation of a performance level of X, and, well, you're going to feel ripped off when you find out that your store-bought $550 card, unlike the review sample, performs just slightly above the $400-ish one.

Re:FPS TOO LOW!! (1)

Bigbutt (65939) | about 4 months ago | (#45601913)

I am. I liked Quake 1, 2, and even Arena. I also liked Doom and Doom 2. And Duke Nukem 3d. And Command and Conquer. And Starcraft/Brood War. Hell, I'd love to be able to play Carmageddon again. What's wrong with having fun with a game vs being able to count the chest hairs on Duke as they wave in the gentle breeze?

And get off my lawn. It's dying under the snow there.

[John]

Re:FPS TOO LOW!! (1)

L4t3r4lu5 (1216702) | about 4 months ago | (#45606405)

If I buy a Ford Fiesta with a 1.2l engine, and $RoadTestReview gets 45MPG over a week, three tank refills, and a good mixture of motorway and city driving, I don't expect my (supposedly) identical 1.2l Ford Fiesta to get 36MPG in the same period under the same conditions.

Not uncommon... (1)

Last_Available_Usern (756093) | about 4 months ago | (#45600327)

This probably happens more often than we think. Frankly, it makes sense to validate that a card is going to run solid for someone before you send it to them if they're going to be blabbering all over the interwebs about it. It's just that in this case they got burned (and justifiably so) due to the fact that the software/driver adjusts the frequency dynamically instead of holding a static clock speed (something they should have disclosed to the reviewers).

It's known. (1)

DrYak (748999) | about 4 months ago | (#45601867)

due to the fact that the software/driver adjusts the frequency dynamically instead of holding a static clock speed (something they should have disclosed to the reviewers).

It's well known that these cards operate at a fixed temperature target and push the clock and voltage as high as they can within that thermal limit.
It's so well known among professionals that some, like Tom's, are giving advice about BIOS replacement (newer ones have better and more consistent throttling across all the varied fans) or thermal paste replacement (to improve cooling and thus performance).

Re:Not uncommon... (1)

mjwx (966435) | about 4 months ago | (#45604261)

This probably happens more often than we think.

For a long time we've known that demonstrations lie.

Any demo indistinguishable from magic is insufficiently rigged.

I fail to see why anyone is surprised by this. Nvidia do it, Intel do it, Apple do it; everyone lies with demonstrators, and all demonstration machines are rigged up to their eyeballs. IBM spends money making sure the lights on their demonstration machines blink in unison, for crying out loud.

Which is why anyone with half a brain does not base purchasing decisions on a vendor's product demonstration.

Yeah (1)

yoshi_mon (172895) | about 4 months ago | (#45600413)

I'm a huge fan of market competition and AMD. If this is not a slashvertisement then I'm a noob.

Re:Yeah (1)

game kid (805301) | about 4 months ago | (#45601641)

I'm a huge fan of market competition, which is why I can't wait for a capable third manufacturer to compete against the big two.

Re:Yeah (1)

SScorpio (595836) | about 4 months ago | (#45602979)

Intel can compete just fine in the low-end market, and is starting to poke its nose into the mid-range. You can play Battlefield 4 with the integrated graphics of an i5/i7 desktop Haswell chip at 720p and low settings at 30fps.

Desktop processors running Iris Pro graphics may let Intel start stealing the spotlight from AMD's APUs. I wouldn't count on Intel diving into the high-end market, so hopefully Nvidia won't be killed off. I also believe Nvidia has foreseen the end of being anything but a high-end option and has innovated with PhysX, 3D Vision, and the new G-Sync.

Same old story (1)

riis138 (3020505) | about 4 months ago | (#45600481)

It seems like this is a recurring problem for AMD. I recall a similar issue with the Bulldozer CPUs.

Re:Same old story (1)

ericloewe (2129490) | about 4 months ago | (#45600829)

As far as I remember, those were plainly slower than the ones they were to replace, let alone Intel's products, even running at significantly higher clocks.
They performed to spec, but the spec wasn't what AMD had originally hoped for.

Throttling + OEM fan speed and grease variance (4, Informative)

Anonymous Coward | about 4 months ago | (#45600485)

Tom's Hardware covered this pretty extensively [tomshardware.com] a month ago.

The short story is that AMD is throttling clock speeds to hold within a temperature limit. They learned the hard way that 40% PWM does not equal 40% fan speed, especially across all the fans the OEMs used. There's now a driver fix that measures fan speed and adjusts accordingly when in quiet mode, which eliminates most of this performance discrepancy (retail cards can now see higher performance in line with review samples).

Remaining differences between cards may be due to different heatsink grease, which was also examined by replacing the grease on a retail card [tomshardware.com] for a significant performance gain.

Re:Throttling + OEM fan speed and grease variance (2)

jandrese (485) | about 4 months ago | (#45600825)

I've always found it shocking that, after reading about how important it is to apply paste properly and taking the time to do it right when building my own machine, every time I open up an OEM box I discover the paste just globbed on there willy-nilly with a caulking gun. Nvidia had a huge problem with this back with the 8xxx series GPUs in laptops.

I know some guy making $0.30/day in China isn't going to take a credit card and ensure a perfectly smooth and even coating of thermal paste before carefully applying the cooler, but there must be a better way than blorting it on like a 3-year-old with a tube of oil paint.

Re:Throttling + OEM fan speed and grease variance (2)

TubeSteak (669689) | about 4 months ago | (#45600923)

How come applying thermal grease is still such a big problem in the semiconductor industry?
They've been doing it for decades, but still haven't figured out how to get it right every time.

Even Apple, who are renowned for their design and manufacturing prowess, keeps hiring companies that screw it up.

Re:Throttling + OEM fan speed and grease variance (0)

Anonymous Coward | about 4 months ago | (#45601193)

It's especially weird because it seems like such a low-tech, cheap thing that has fairly significant performance implications. A 10% increase in GPU performance normally represents months of work and millions of dollars in R&D. Fixing this small piece of the production process seems like a no-brainer to me. I wonder why none of the big semiconductor companies have been able to deal with it?

Re:Throttling + OEM fan speed and grease variance (1)

tlhIngan (30335) | about 4 months ago | (#45602371)

How come applying thermal grease is still such a big problem in the semiconductor industry?
They've been doing it for decades, but still haven't figured out how to get it right every time.

Even Apple, who are renowned for their design and manufacturing prowess, keeps hiring companies that screw it up.

I think there's too much variation in the way heatsinks attach to the chip. Ideally it would be something like how Intel heatsinks do it (there may be others, but Intel is what I have experience with): you put on a small dab (which can come from a gun that gives a pre-measured amount, like the condiment guns at fast food joints), and then the act of putting the heatsink on spreads it.

But that's only if the heatsink applies pressure; other times it's just stuck on with glue or the mechanism has too much give for it to work well.

It's sort of why the industry moved to pads and the like: putting on the paste and smoothing it out took too much manufacturing time and is too prone to mistakes, so they use a pad and eliminate the problem head-on.

Re:Throttling + OEM fan speed and grease variance (1)

Macman408 (1308925) | about 4 months ago | (#45603827)

Probably because there's no test for it. If they put the CPU on the board backwards, they'll notice when they try to turn the system on. Too much paste (or too little), and things will work just fine as far as any test is concerned.

When I did computer repair, I once encountered a PowerMac where the heatsink had a manufacturing defect; one of the posts that fit in a hole on the CPU card had a large extra blob of aluminum on it. It was impossible to seat the heatsink on the CPU, though it could still be strapped on (it's just that there was enough extra clearance caused by the malformed post that the heatsink didn't touch the CPU). As far as the customer was concerned, the computer worked just fine for close to 3 years (though I imagine it probably ran fairly slowly sometimes). Then, the CPU finally died and the computer wouldn't boot any more. Luckily, Apple agreed with me that it was totally their fault, even though he was well beyond the warranty period, and they covered replacing both the CPU and the heatsink. But it's not something that any manufacturing test would've caught, unless the CPU got hot enough to literally fry itself. (It was one of the slower models of the day, with not even a heatsink fan, so it couldn't have fried itself immediately.)

Re:Throttling + OEM fan speed and grease variance (0)

Anonymous Coward | about 4 months ago | (#45601089)

You apparently didn't read the article before posting.

Re:Throttling + OEM fan speed and grease variance (0)

Anonymous Coward | about 4 months ago | (#45601469)

The reviewer samples are cherry picked and hand tuned for sure. If you believe otherwise then you're pretty gullible.

The issue consumers are seeing indeed does seem to be poor/shoddy assembly that makes the cooler ineffective. (There are also plenty of criticisms aimed at the cooler itself.) The GPU runs fine, and is designed to run as fast as it can inside its admittedly generous thermal budget (95C!). Get rid of the heat, and it performs better.

Many are eager to get hold of cards from other vendors that have custom coolers that are better designed and will have better quality control. Right now all you can buy are units from AMD with preassembled coolers (which are literally resold with a vendor's sticker slapped on).

Thermal management is a big issue with modern high end GPUs. It's one of the biggest performance bottlenecks. There are a few really interesting articles about Nvidia's experience designing an adequate cooler for the Titan (Which at launch had the biggest thermal budget of any card by a huge margin). It was a significant investment on Nvidia's part.

Re:Throttling + OEM fan speed and grease variance (0)

Anonymous Coward | about 4 months ago | (#45606717)

They learned the hard way that 40% PWM does not equal 40% fan speed, especially across all the fans the OEMs used

and 40% fan speed likely does not equal 40% of the maximum fan airflow either.

They're just rebranded last year's model cards.... (0)

King_TJ (85913) | about 4 months ago | (#45600607)

I believe the entire R9 series of cards are little more than rebranded versions of older cards AMD just discontinued. The R9 280X for example? It's just a 7970 card. Sure, it may have a few BIOS tweaks and such, but you can even peel the label off the circuit board of many of them and find 7970 etched on the board underneath the R9 280X label.

Personally, I think AMD should have waited until it had legitimate improvements to offer before releasing anything, rather than trying to fool people with the R9 series.

Re:They're just rebranded last year's model cards. (2)

ericloewe (2129490) | about 4 months ago | (#45600971)

Nope. The 290/290X is a much larger chip - similar architecture, but bigger (and mildly improved).

Re:They're just rebranded last year's model cards. (0)

Anonymous Coward | about 4 months ago | (#45601393)

The 280X is just the 7970 and it's not really a secret. The reviews I read pointed this out (I also just bought a 280X two weeks ago). The 280X even shows up in your device manager as 7970.

More or less (1)

DrYak (748999) | about 4 months ago | (#45602037)

Personally, I think AMD should have waited until it had legitimate improvements to offer before releasing anything, rather than trying to fool people with the R9 series.

The problem is that AMD got too busy doing legitimate improvements under contract for the coming generation of consoles (the Xbox One, the PlayStation 4, and some of the upcoming Steam Boxes all run the same combo of AMD CPUs and GPUs).
With that work, there was going to be some delay for their PC Radeon HD 8xxx series.

So it was either:
- have absolutely nothing to sell.
- do some small upgrades on the older boards (the R9 270/280 are simply older boards slightly upgraded) and older chips (the R9 290(X) are GCN 1.1 chips, slightly newer than GCN 1.0) and have something to sell until the real new generation comes, while still taking advantage of the time to add some improvements.

Note that this was pretty much announced that way, and is well known to people in the field.

Now look at the bright side of things:
even if they are coming with some delay, that means that next year you're finally going to see the newer GCN 2.0 chip-based cards, which have taken advantage of all the R&D done for consoles to improve the performance and quality of Radeons.

A little bit later, but with a little bit more R&D money poured into the steps leading to them (especially the latest step).

Re:More or less (1)

SScorpio (595836) | about 4 months ago | (#45603301)

Will gamers see that much of an improvement? The PS4 and Xbone being x86 hardware is nice, as it removes the excuses for why a port to PC can't happen, but both consoles are pretty sad when compared to current mid-range PCs, let alone a high-end rig. The Xbone is struggling to hit 1080p while the PS4 is hitting it, but at 30fps. That matches or falls below the performance of a current mid-range PC, and the performance gap will only widen.

It's good that AMD was able to get the contracts for income, as they have been struggling lately, but the Radeon 7000 series was a bit of a disappointment, and there were rumors of Nvidia branding the GTX 660 as the GTX 680 when the Titan was supposed to be the original GTX 680. AMD's new series is all rebranded cards save for the R9 290 and 290X. The cards are cheaper compared to Nvidia's offerings, but they run hot and loud. Plus this article is discussing performance issues, so maybe they aren't as good as the original look at them led everyone to believe.

AMD really needs to buckle down and hit one out of the park. Bulldozer was a dud, so the processor side of the business is also without a hit. I thought we might see the beginnings of a PC architecture redesign once you look at the new consoles, but I don't believe that anymore. I'm not hopeful that Kaveri is going to be the strike against Intel that AMD needs, like the glory days of the Athlon 64 vs. the Pentium 4, nor that AMD will release a GPU in the coming year that will force Nvidia to stop coasting along and release a high-end card with 80% of Titan SLI performance on a single card.

Scale! (1)

DrYak (748999) | about 4 months ago | (#45606785)

Will gamers see that much of an improvement? The PS4 and Xbone being x86 hardware is nice, as it removes the excuses for why a port to PC can't happen, but both consoles are pretty sad when compared to current mid-range PCs, let alone a high-end rig. The Xbone is struggling to hit 1080p while the PS4 is hitting it, but at 30fps. That matches or falls below the performance of a current mid-range PC, and the performance gap will only widen.

...and all of this is done with an integrated GPU. That's the key point. This performance is achieved using just an embedded chip that pulls a minimal amount of power.

Now scale this up, move one generation ahead (to GCN 2.x), and the discrete cards we'll be getting next year from AMD are going to be quite interesting.

Re:More or less (0)

Anonymous Coward | about 4 months ago | (#45603739)

What are you talking about? AMD already launched their Radeon HD 8000 series. They're basically rebranded 7000 series cards.

The Radeon HD 9000 series was rebranded to R9.

Architecture (1)

DrYak (748999) | about 4 months ago | (#45606761)

I'm talking about all of the above being GCN 1.x chips.
The GCN 2.x chips (which were initially what was going to be inside the HD 8000 series, before they delayed everything) will be here early next year.

This is news how? (0)

Anonymous Coward | about 4 months ago | (#45600683)

I would expect that retail Nvidia cards also vary considerably from the press review cards. Each vendor has their own tweaks and adjustments.

Nvidia's cheque just cleared (-1)

Anonymous Coward | about 4 months ago | (#45601143)

Slashdot is a PAY-FOR-PLAY site disseminating either pro-war propaganda for Team Obama and Israel, or simple pro-company PR 'dressed' as technical analysis.

The rubbish about so-called journalist "golden samples" being used for review versions of AMD's 290 family was discussed weeks ago, and dismissed as Nvidia-paid-for FUD at sites like Tom's.

The TRUTH is that all modern high end cards from Nvidia and AMD use thermal and power throttling, and that the 'reference' cooling solution on AMD 290 products is sub-standard when attempting to push the GPU to maximum performance. Some cards have variance in fan cooling ability (manufacturing variance) and by default allow the card to clock up less well under load.

The problem has been partially addressed with updated drivers, and is best handled by user selected fan spin rates. However, the whole world is waiting for 290 cards with cooling solutions from OEMs, NOT cheapskate, badly designed rubbish from AMD's own people.

Meanwhile, Nvidia is meeting major tech sites once a week with concocted FUD campaigns against their major competitor, and the promise of large monetary rewards if the bullet points of the FUD documents are repeated in so-called 'independent' editorial content on the site. Anandtech took a massive pay-off, for instance, to declare the cooling solution of the current 290s was so LOUD, the cards were not worth considering under any circumstance.

Remember, AMD technology (CPU, GPU, northbridge, southbridge, sound, etc.) is in ALL THREE of the major consoles: PS4, Xbox One, Wii U. Nvidia and Intel are nowhere: complete and utter losers, and neither Intel nor Nvidia were in the running for the console contracts because neither company could offer even a 50% solution to the hardware needs. If AMD had a target on its back before, now Nvidia and Intel are gunning for AMD with a FUD PR spend that runs into billions of dollars yearly in negative press.

Nvidia sells the lousy 4GB 770 for what you can expect to pay for a current, vastly more powerful, 290 card from AMD. Below 450 dollars, all Nvidia sells are TWO GIGABYTE CARDS: completely useless for 2014 AAA gaming at 1080p or above. Meanwhile, AMD has been selling 3GB cards at the same prices as Nvidia's 2GB cards. Xmas 2013 (now) sees current games hovering around the 2GB mark for 1080p with reasonable graphics options selected.

However, no technical site is warning its readers that buying 2GB Nvidia cards is horribly short-sighted, and will lead to terrible performance issues (unless graphics options are cranked down to "medium") with next year's AAA games (remember, both new consoles have in excess of 5GB available to their GPUs). Nvidia simply pays such sites to not even mention the issue when recommending which current expensive card to buy.

Large Epsilon (error tolerance factor) (0)

Anonymous Coward | about 4 months ago | (#45601349)

Benchmarking: If you want it done right, you have to do it yourself.

Yet another ill-considered article from TechReport (0)

Anonymous Coward | about 4 months ago | (#45601363)

What a surprise that Wasson couldn't be bothered to figure out what's actually going on this time, either.

Silicon varies. A lot, really. Some chips run faster and cooler than others, simply by chance. This is why they're sold at different speeds, core counts, etc. The process of separating the chips into these categories is called 'binning', but it's not perfect in that there will still be some variation within a bin. That variation won't go lower than the bin's minimum standards, but it will go higher. With the number of review cards AMD - or any other chipmaker - ships out, it would follow that several of them would be of this better quality. The only way to avoid that, in fact, would be to specifically select the minimally-performing cards, which would be silly.

The reason that we haven't seen these chips in retail boards - and, to be fair, you wouldn't expect to find one in a sample of two, more like one in ten or twenty or even fifty, and Nvidia would surely know this - is that the board manufacturers that buy the chips from AMD hold those best chips back for their special, overclocked cards that have fancy cooling solutions. Rumors point to those coming out sometime in the range from Christmas to January for the R9 290/290X.

More specifically to the 290-series, this chip variance will show up more readily in that better chips will throttle a little less - but it's usually not going to be as dramatic a difference as, say, the variation in fan speed that they had to solve (whether this was AMD's mistake in not considering variation in fan performance or the manufacturers using cheap, crappy fans is uncertain). Or using crappy thermal paste. The neat thing about this method of dynamic frequency is that, aside from the limitation that a chip can never draw more power than its maximum safe limit, performance is entirely controlled by cooling power. Want to never throttle? Well, nothing can be done about Furmark - it draws too much power and you'd burn your chip out. But for every game out there, just turning up your fan speed will suffice, and the cards even have a built-in switch to do that if you don't want to simply adjust a slider in the control software. And when the custom cards with those nice, top-grade chips turn up, they'll run at full speed and be nearly whisper-quiet, too. Except full speed could be between 1050 and 1100 megahertz. Or you could use water cooling - running at 1300 megahertz on water isn't unusual from what I've heard, though you would also need to overclock to do that.

In the end, this is just Nvidia using a shoddy journalist to do a little bit of mud-slinging.

price check (0)

Anonymous Coward | about 4 months ago | (#45602211)

OK, so I checked the price of the Radeon R9 290X. It is about $550. Wow. Not trying to start an argument, but do people really need a $500 video card? Just asking. I thought $200 would be enough for most users, including heavy gamers. I remember when gamers bought $200 AGP cards and were satisfied.

Re:price check (0)

Anonymous Coward | about 4 months ago | (#45604265)

The R9 290X is a high-end solution (you know, the kind that always used to cost $800+), so the price for top tech (on the AMD side at least) has dropped.

The R9 290 is $400, and even that is a high-end card.
If you want the equivalent of your $200 card there is the R9 270: it's only $179, and it's what is called a mid-range card, one for ordinary mortals like yourself, and it's only half as fast as the fastest one (R9 290X).

You don't buy high-end GPUs because you need them, you buy them because you want them. Or, in a car analogy, a Ferrari is not for everyone; some people are better served by a Honda or Toyota :)

How about R9 290 then? (1)

fa2k (881632) | about 4 months ago | (#45607027)

I've just ordered an R9 290X, before I saw this (oh no!). If this effect is reducing 290X performance, and the 290X is just a higher-clocked 290, could I expect similar performance from a 290 as from the retail 290Xs? If so, I just want to return the 290X and recoup 25% of the cost... Advice greatly appreciated.
