
AMD's New Radeons Revisit Old Silicon, Enable Dormant Features

Soulskill posted about 10 months ago | from the you-mean-we-can-actually-use-the-whole-thing? dept.

Graphics

crookedvulture writes "The first reviews of AMD's Radeon R7 and R9 graphics cards have hit the web, revealing cards based on the same GPU technology used in the existing HD 7000 series. The R9 280X is basically a tweaked variant of the Radeon HD 7970 GHz priced at $300 instead of $400, while the R9 270X is a revised version of the Radeon HD 7870 for $200. Thanks largely to lower prices, the R9 models compare favorably to rival GeForce offerings, even if there's nothing exciting going on at the chip level. There's more intrigue with the Radeon R7 260X, which shares the same GPU silicon as the HD 7790 for only $140. Turns out that graphics chip has some secret functionality that's been exposed by the R7 260X, including advanced shaders, simplified multimonitor support, and a TrueAudio DSP block dedicated to audio processing. AMD's current drivers support the shaders and multimonitor mojo in the 7790 right now, and a future update promises to unlock the DSP. The R7 260X isn't nearly as appealing as the R9 cards, though. It's slower overall than not only GeForce GTX 650 Ti Boost cards from Nvidia, but also AMD's own Radeon HD 7850 1GB. We're still waiting on the Radeon R9 290X, which will be the first graphics card based on AMD's next-gen Hawaii GPU." More reviews available from AnandTech, Hexus, Hot Hardware, and PC Perspective.


So does that mean... (1)

crafty.munchkin (1220528) | about 10 months ago | (#45076097)

... that updating the BIOS on my 7870 might unlock these features?

Re:So does that mean... (0)

Anonymous Coward | about 10 months ago | (#45076129)

You didn't even read the summary. The 7790 has the locked features. The 7870 moves on as is.

Re:So does that mean... (0)

Anonymous Coward | about 10 months ago | (#45076135)

Nope, will mean bricking it.

Re:So does that mean... (1)

Anonymous Coward | about 10 months ago | (#45076209)

Summary for those who missed it and got right to commenting: go ahead and try it, let us know how it goes!

Re:So does that mean... (0)

Anonymous Coward | about 10 months ago | (#45076269)

No, chances are your card has some bad silicon in those areas, but since those features were never advertised, you don't see it. Once AMD figured out how to improve their yields, they enabled these features in the newer cards.

Re:So does that mean... (0)

Anonymous Coward | about 10 months ago | (#45076907)

They probably haven't even tested those features in production, or even in design verification, due to schedule/marketing pressure to push the product out ASAP.
So YMMV if/when someone magically turns on the right bit sequence for the existing cards.

Re: So does that mean... (0)

Anonymous Coward | about 10 months ago | (#45080479)

I bet it is more a case that the driver team didn't have time to devote to this DSP block, so it sits idle on these older cards. May have gone through a metal spin to fix something once software got on board, but it is almost certainly mostly functional on all the silicon that has it.

Re:So does that mean... (0)

Anonymous Coward | about 10 months ago | (#45078763)

Yes, you should do it.

Do the kids still chase the newest video card? (1)

swb (14022) | about 10 months ago | (#45076245)

Or have we reached a diminishing return point and/or a point where money is being spent elsewhere (consoles, mobile, tablets, etc)?

Re:Do the kids still chase the newest video card? (4, Informative)

0123456 (636235) | about 10 months ago | (#45076297)

Or have we reached a diminishing return point and/or a point where money is being spent elsewhere (consoles, mobile, tablets, etc)?

The problem is that PC games have been crippled for years by being developed on consoles and ported to PCs. Some do take advantage of the extra power of PC GPUs, but the majority will run fine on a GPU that's several years old, because it's more powerful than the crap in the consoles.

Re:Do the kids still chase the newest video card? (1)

Lucky75 (1265142) | about 10 months ago | (#45076349)

Just wait until games are being made for the PS4 and Xbox One. I'm looking forward to the optimizations and the fact that they'll be the same architecture as discrete cards. Hopefully that means game developers will allow their games to scale more, since it shouldn't really be much work and they don't need to port them.

Re:Do the kids still chase the newest video card? (2)

0123456 (636235) | about 10 months ago | (#45076491)

It will certainly be an improvement, but from what I've read they're only comparable to current mid-range PC GPUs. By the time many games are out, a high-end gaming PC will still be several times as powerful.

Re:Do the kids still chase the newest video card? (-1)

Anonymous Coward | about 10 months ago | (#45076657)

It will certainly be an improvement, but from what I've read they're only comparable to current mid-range PC GPUs. By the time many games are out, a high-end gaming PC will still be several times as powerful.

You're still just running unoptimized crap a little faster. Look at FarCry 3... makes my system sound like a hairdryer, and for what...

Re:Do the kids still chase the newest video card? (-1, Troll)

AcidPenguin9873 (911493) | about 10 months ago | (#45077231)

You mean a machine that costs 4-5x what the console costs will be more powerful than the console? Shocking!

Re:Do the kids still chase the newest video card? (1)

0123456 (636235) | about 10 months ago | (#45077253)

You mean a machine that costs 4-5x what the console costs will be more powerful than the console? Shocking!

You mean you can't read and comprehend the thread before replying to it? Shocking!

Re:Do the kids still chase the newest video card? (1)

AcidPenguin9873 (911493) | about 10 months ago | (#45077345)

I've read and reviewed the entire subthread again and can't find anything that I didn't comprehend. Please, enlighten me!

Re:Do the kids still chase the newest video card? (2)

chihowa (366380) | about 10 months ago | (#45078043)

The gripe is not that consoles are less powerful than PCs. The gripe is that many games are designed around the limitations of consoles and don't take advantage of all of the power in a PC. Back in the days of yore, new games would be able to take advantage of cutting edge GPUs. Now they (often) don't.

I'm just restating the OP [slashdot.org], who said it very clearly himself:

The problem is that PC games have been crippled for years by being developed on consoles and ported to PCs. Some do take advantage of the extra power of PC GPUs, but the majority will run fine on a GPU that's several years old, because it's more powerful than the crap in the consoles.

So yes, learn to read.

Re:Do the kids still chase the newest video card? (1)

AcidPenguin9873 (911493) | about 10 months ago | (#45078229)

The gripe is not that consoles are less powerful than PCs. The gripe is that many games are designed around the limitations of consoles and don't take advantage of all of the power in a PC.

If you would read the very next post in the subthread (here it is [slashdot.org]), you'd see it has a reasonable response to that gripe. Here, I'll quote it for you:

I'm looking forward to the optimizations and the fact that they'll be the same architecture as discrete cards. Hopefully that means game developers will allow their games to scale more since it shouldn't really be much work and they don't need to port them.

If that's hard to understand, I'll explain it. The console GPU architecture is basically PC GPU architecture, even though it's not quite as powerful as the best PC graphics cards. So the effort required by the game developers to use better PC hardware is hopefully low since it should be a pretty natural extension of what they're already doing for the consoles, as opposed to something totally different like it was for the last-gen consoles.

The OP then acknowledged this point, but then complained again about the lack of GPU horsepower in the consoles and compared it to high-end PC GPUs. I didn't really understand this to be anything other than complaining about the consoles being underpowered, since it basically ignored the response. What else is there to do at that point besides acknowledging the OP's complaint at face value and offering an explanation for it?

Does that make sense now or do you want to have another try?

Re:Do the kids still chase the newest video card? (1)

SuperAlgae (953330) | about 10 months ago | (#45078813)

I didn't really understand this to be anything other than complaining about the consoles being underpowered...

Yes, you didn't really understand. The OP did not acknowledge that having console architecture closer to PC architecture solves the problem. He acknowledged that it is an improvement, not a solution. As an example, the Xbox has always had hardware architecture pretty similar to a PC, but that does not mean games ported from Xbox to PC take good advantage of high-end PC hardware. It still takes more work for the dev team to create higher-quality assets (textures, models, etc.) and make use of advanced HW features. A company focused on the console release probably won't bother.

It could be argued that there is too much emphasis on graphics anyway, but it's understandable if someone who spends 4-5x the cost of a console wants games to take advantage of that superior hardware.

Re:Do the kids still chase the newest video card? (1)

AcidPenguin9873 (911493) | about 10 months ago | (#45078859)

He acknowledged that it is an improvement, not a solution.

And then instead of saying any of the things that you said as to why it's not a complete solution (which are reasonable rebuttals), he just complained about underpowered console GPUs and compared them to high-end PC GPUs.

What he really did was read the first line of the response, ignore the rest of it, and assume that the argument the response was making was that games written for the new consoles would be better only because they would be targeting a GPU more powerful than the last-gen consoles'. That was not the respondent's argument.

I certainly agree that there is effort involved in making a game originally written for a console look better on more powerful PC hardware. The respondent's point (which is probably wishful thinking, but still a point nonetheless) is that instead of spending developer effort and budget on simply porting the game to work on PC CPU+GPU instruction sets, that effort can be spent on all of those things that you named, since the instruction sets of the consoles already match what you'd find in a PC.

Re:Do the kids still chase the newest video card? (4, Insightful)

SuperAlgae (953330) | about 10 months ago | (#45079143)

It's true that the OP's comment did not give much explanation, but it at least had a constructive tone to it. Your response, however, was sarcastic and insulting. You have some good insight. Your comment history shows a lot of intelligence, but so much of your energy seems to go into belittling others. If you take a more constructive approach, you'll reach a lot more people. Occasionally a sarcastic remark can be an effective way to make a point, but it usually just turns people away and makes your effort go to waste.

Re:Do the kids still chase the newest video card? (0)

Anonymous Coward | about 10 months ago | (#45080977)

It's true that the OP's comment did not give much explanation, but it at least had a constructive tone to it. Your response, however, was sarcastic and insulting. You have some good insight. Your comment history shows a lot of intelligence, but so much of your energy seems to go into belittling others. If you take a more constructive approach, you'll reach a lot more people. Occasionally a sarcastic remark can be an effective way to make a point, but it usually just turns people away and makes your effort go to waste.

I feel like I've suddenly stepped back in time, to a much, much earlier incarnation of Slashdot. Quick, somebody mention "global warming" or nuclear power, to return me to the 21st century!

Re:Do the kids still chase the newest video card? (1)

wisty (1335733) | about 10 months ago | (#45078867)

> from what I've read they're only comparable to current mid-range PC GPUs.

Yeah. But that's still shit-loads better than a 10 year old high-end PC GPU.

Re:Do the kids still chase the newest video card? (0)

Anonymous Coward | about 10 months ago | (#45076637)

Or have we reached a diminishing return point and/or a point where money is being spent elsewhere (consoles, mobile, tablets, etc)?

The problem is that PC games have been crippled for years by being developed on consoles and ported to PCs. Some do take advantage of the extra power of PC GPUs, but the majority will run fine on a GPU that's several years old, because it's more powerful than the crap in the consoles.

My ass. PC games are notoriously unoptimized because you can throw more hardware at the problem.

Re:Do the kids still chase the newest video card? (0)

0123456 (636235) | about 10 months ago | (#45077243)

My ass. PC games are notoriously unoptimized because you can throw more hardware at the problem.

Graphics APIs these days are basically just a way of getting shaders into the GPU. Odds are, pretty much the same shaders are running on the PC as the console, so there's no room to 'unoptimize' them.

And, on the CPU side, I rarely see mine more than 20% used when playing games. So they're not 'unoptimized' there, either.
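
That first claim is easy to see in code: with a modern API, most of the work is literally handing shader source to the driver. Here's a minimal sketch using the moderngl Python bindings - purely illustrative, and it assumes a machine where a windowless GL context can be created:

    import moderngl

    # Create a GL context without opening a window (needs a working GL/EGL setup).
    ctx = moderngl.create_standalone_context()

    # The "API" part: hand GLSL source straight to the driver to compile.
    prog = ctx.program(
        vertex_shader="""
            #version 330
            in vec2 in_pos;
            void main() { gl_Position = vec4(in_pos, 0.0, 1.0); }
        """,
        fragment_shader="""
            #version 330
            out vec4 color;
            void main() { color = vec4(1.0, 0.0, 0.0, 1.0); }
        """,
    )
    print(prog)  # the same GLSL would compile and run on any GL-capable GPU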

Re:Do the kids still chase the newest video card? (0, Informative)

Anonymous Coward | about 10 months ago | (#45077885)

My ass. PC games are notoriously unoptimized because you can throw more hardware at the problem.

Graphics APIs these days are basically just a way of getting shaders into the GPU. Odds are, pretty much the same shaders are running on the PC as the console, so there's no room to 'unoptimize' them.

And, on the CPU side, I rarely see mine more than 20% used when playing games. So they're not 'unoptimized' there, either.

Plenty of games can be CPU bound; Planetside 2 is a free download.
Far Cry 3 runs like ass with my GTX 690, and for what... I have to dial it down to sub-Skyrim, sub-BF3 graphics to hear the game over my system fans.
RAM is cheap, so you don't run into system memory limitations much anymore, but that doesn't mean games make tremendously efficient use of what's available.
How about this: PC software - in general - tends to make poor use of available resources of ALL kinds.

Here's an example I just found.

I spin up Big Picture mode in Steam on my old iMac, and it defaults to 720p resolution - because...
When I select 1080p, it warns that my video card 'only' has 512MB of video memory (but does let me continue).

Why? Who in their right mind decided they need THAT much memory to render a UI in 1080p?
Have you seen Big Picture mode? How many textures need to be on screen at once? I know what they did: "all video cards made after 20XX have more than XXX MB of memory, so target that". That's FINE, but if you DOUBLED the PS3's or 360's graphics memory to 512MB, you would see FAR better use of it than Big Picture mode's UI demonstrates.

My main point is that higher PC specs do not get you a 1:1 increase in features over console specs, because it's not worth the effort to make efficient use of PC resources when slightly faster ones come out every month.

Re:Do the kids still chase the newest video card? (-1)

Anonymous Coward | about 10 months ago | (#45076759)

Still blaming consoles? You PC guys never change.

Re:Do the kids still chase the newest video card? (2)

ducomputergeek (595742) | about 10 months ago | (#45077191)

And then there's Star Citizen. Honestly, I'm looking at purchasing my first PC for gaming in over a decade. The last PC I bought primarily for gaming purposes was around 2001. Then the studios stopped producing the flight and space combat sims and FPSs like the original Rainbow 6 and Ghost Recon games I liked to play.

I've been looking around. I have a 3-year-old desktop here, and I'm thinking $150 for a new PSU and a 7xxx AMD card will get me through the beta for Star Citizen.

So I've just started looking around. I'm glad to hear these new cards use even less power, making the $50 for a new PSU maybe less of a need, so I can put it towards a better graphics card...

Re:Do the kids still chase the newest video card? (1)

Ash Vince (602485) | about 10 months ago | (#45080451)

Or have we reached a diminishing return point and/or a point where money is being spent elsewhere (consoles, mobile, tablets, etc)?

The problem is that PC games have been crippled for years by being developed on consoles and ported to PCs. Some do take advantage of the extra power of PC GPUs, but the majority will run fine on a GPU that's several years old, because it's more powerful than the crap in the consoles.

But most of the nice visual effects like antialiasing are done in the drivers without the game needing to know too much about it, so this is not necessarily true. Also, there are plenty of companies that develop for PC then port to consoles. Compare Skyrim on the PC on an Nvidia 680 or 780 to running on the Xbox 360 to see the difference. Another example of a game that looked far better on PC than on consoles is BF3.

Maybe you should have caveated your post by saying that a lot of crap studios release crippled games on PC that are designed for consoles, but there are plenty of companies out there who view the PC as important to their strategy and so put the work into making their product on PC take advantage of the good stuff modern graphics cards provide. This is especially true when it is just a case of making very minor changes to leverage improvements in DirectX 10 and 11 not available on the Xbox 360.

Re:Do the kids still chase the newest video card? (2)

un1nsp1red (2503532) | about 10 months ago | (#45076323)

I think we've hit a temporary lull, but you'll see renewed interest once newer, larger monitors start to enter the market. i.e., I'm fine with my rig so long as I can play any game with the settings maxed. Rules will have to change once 4k monitors become the new norm.

Re:Do the kids still chase the newest video card? (-1)

Anonymous Coward | about 10 months ago | (#45076421)

Do the kids still chase the newest video card?

I'm fine with my rig

No, but the frat boys still say "my rig" to refer to their overclocked shitboxes.

Re:Do the kids still chase the newest video card? (0)

Anonymous Coward | about 10 months ago | (#45076611)

They certainly do, beyond all sense. If you have a 1920x1080 monitor there is only so much GPU power you need for all current games at max detail. That doesn't stop people spending far too much.

Re:Do the kids still chase the newest video card? (1)

Ash Vince (602485) | about 10 months ago | (#45080489)

They certainly do, beyond all sense. If you have a 1920x1080 monitor there is only so much GPU power you need for all current games at max detail. That doesn't stop people spending far too much.

As someone who just went from an Nvidia 480 to an Nvidia 780, I noticed an improvement. Firstly, I could turn on full antialiasing, which made a big difference in things like Skyrim and Black Ops 2. I imagine it will make an even bigger difference in Black Ops Ghosts, which comes out next month - that's what I really bought it for.

Re:Do the kids still chase the newest video card? (1)

ArchieBunker (132337) | about 10 months ago | (#45077313)

Once in a while I check Tom's Hardware for the video card roundups, and they all seem priced accordingly. There is no longer a "best bang for your buck" card. A $70 Nvidia card performs as well as a $70 AMD card.

Re:Do the kids still chase the newest video card? (1)

PixetaledPikachu (1007305) | about 10 months ago | (#45077991)

Once in a while I check Tom's Hardware for the video card roundups, and they all seem priced accordingly. There is no longer a "best bang for your buck" card. A $70 Nvidia card performs as well as a $70 AMD card.

The last bang-for-buck video card that I had was the GeForce4 Ti 4200 64MB. It lasted roughly 3 years before I migrated to a 6600.

Re:Do the kids still chase the newest video card? (0)

tibman (623933) | about 10 months ago | (#45077589)

I bought an HD 6970 (used, from eBay) just two weeks ago. Really enjoying it so far. The new cards need PCIe 3.0 and this old mobo can't do that :/ It seems like a GPU upgrade every two years is good enough. CPU upgrades are super easy too if the socket is long-lived. Just wait until a CPU model goes into the bargain bin, which doesn't take very long at all.

Re:Do the kids still chase the newest video card? (0)

Anonymous Coward | about 10 months ago | (#45077837)

Err, what?
Why wouldn't a 280X work in a PCIe gen1 slot when a 7970 happily does?

Re:Do the kids still chase the newest video card? (0)

tibman (623933) | about 10 months ago | (#45078333)

280x requires PCI Express x16 3.0 : /

Re:Do the kids still chase the newest video card? (1)

jakobX (132504) | about 10 months ago | (#45080175)

Supports and requires are not the same thing.

Re:Do the kids still chase the newest video card? (1)

citizenr (871508) | about 10 months ago | (#45078201)

The new cards need PCIe 3.0

no

Re:Do the kids still chase the newest video card? (0)

tibman (623933) | about 10 months ago | (#45078383)

All of the 7xxx series required PCIe 3.0: http://www.amd.com/US/PRODUCTS/DESKTOP/GRAPHICS/7000/7730/Pages/radeon-7730.aspx#2 [amd.com]
The new cards are rebranded 7xxx.

Re:Do the kids still chase the newest video card? (3, Informative)

citizenr (871508) | about 10 months ago | (#45078989)

no, they support, not require

Re: Do the kids still chase the newest video card? (0)

Anonymous Coward | about 10 months ago | (#45079009)

Except that's what it supports, not what it requires.

Marketing Numbers (4, Insightful)

ScottCooperDotNet (929575) | about 10 months ago | (#45076253)

Why didn't AMD's Marketing team name these 8000 series cards? Do they keep changing the naming scheme to be intentionally confusing?

Re:Marketing Numbers (1)

mythosaz (572040) | about 10 months ago | (#45076377)

At least Dell fixed this recently with *most* of their enterprise laptops.

A 6430, for example, is a series-6 laptop with a "4"-teen-inch screen in its 3rd revision.

I have no clue what a 7970 is, or how it compares to an R7-260.

Re:Marketing Numbers (2)

armanox (826486) | about 10 months ago | (#45078281)

ATI's numbering was sane for quite a while. In the Radeon X and HD series, numbers were four digits (ABCD), such as the Radeon HD 5770:

A: Generation name. A 7xxx card is newer than a 5xxx card
B: Chip series. All chips in a generation with the same B number (x9xx) were based on the same GPU
C: Performance level. A lower number was clocked slower than a higher one (so a 7750 was slower than a 7770). Exception: the x990 was a dual-GPU card
D: Always 0

So, to compare ATI cards: an x770 was slower than an (x+1)770, which was the newer gen; an x870 was faster than the (x+1)770 because it was a better chip to start with (a first-gen i5 is better than a second-gen i3); and a 7770 was clocked faster than the 7750.
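
To make the scheme concrete, here's a tiny decoder sketch in Python (my own illustration, not anything AMD publishes; it assumes a well-formed four-digit model and ignores suffixes and the dual-GPU x990 exception):

    def decode_radeon_hd(model):
        # ABCD: A = generation, B = chip series, C = performance level, D = always 0
        a, b, c, d = model // 1000, (model // 100) % 10, (model // 10) % 10, model % 10
        assert d == 0, "D is always 0 in this scheme"
        return {"generation": a, "chip_series": b, "performance": c}

    print(decode_radeon_hd(5770))
    # -> {'generation': 5, 'chip_series': 7, 'performance': 7}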

Re:Marketing Numbers (5, Informative)

gman003 (1693318) | about 10 months ago | (#45078567)

ATI/AMD has actually been consistent for several years now - they're literally just now changing their scheme.

The old system was a four-digit number. First digit is generation - a 7950 is newer than a 6870, and way newer than a 4830 or a 2600. The next two digits are how powerful it is within the generation - roughly, the second digit is the market segment, and the third is which model within that segment, but that's rough. They did tend to inflate numbers over time - the top-end single-GPU cards of each generation were the 2900 XT, the 3870, the 4890, the 5870, the 6970, and the 7970GE. Put simply, if you sort by the middle two digits within a generation, you also order by both power and price.

The fourth digit is always a zero. Always. I don't know why they bother.

Sometimes there's a suffix. "X2" used to mean it's a dual-GPU card, cramming two processors onto one board, but now those get a separate model number (they also only do that for the top GPU now, because they've found it's not worth it to use two weaker processors). "GE" or "Gigahertz Edition" was used on some 7xxx models, because Nvidia beat them pretty heavily with their 6xx series release so AMD had to rush out some cards that were essentially overclocked high enough to beat them. "Eyefinity Edition" used to be a thing, mainly it just meant it had a shitload of mini-DP outputs so you could do 3x2 six-monitor surround setups, which AMD was (and is) trying to push. And there were some "Pro" or "XT" models early on, but those were not significant.

Now forget all that, because they're throwing a new one out.

It's now a two-part thing, rather like what Intel does with their CPUs. "R9" is their "Enthusiast" series, for people with too much money. Within that, you have six models: the 270, 270X, 280, 280X, 290 and 290X. They haven't fully clarified things, but it seems that the X models are the "full" chip, while the non-X model has some cores binned off and slightly lower clocks. Other than that, it's a fairly straightforward list - the 290 beats the 280X beats the 280 beats the 270X and so on. Under those are the "R7" "gamer" series, which so far has the 240 through 260X, and an R5 230 model is listed on Wikipedia even though I've not seen it mentioned elsewhere.

Sadly, it's still a bit more complicated. See, some of the "new" ones are just the old ones relabeled. They're all the same fundamental "Graphics Core Next" architecture, but some of them have the new audio DSP stuff people are excited about. And it's not even a simple "everything under this is an old one lacking new features" - the 290X and 260X have the new stuff, but the 280X and 270X do not. And it gets worse still, because the 260X actually is a rebadge, it's just that they're enabling some hardware functionality now (the 290X actually is a genuine new chip as far as anyone can tell). So far, everything is 2__, so I would assume the first digit in this case is still the generation.

Oh, and there actually are some 8xxx series cards. There were some mobile models released (forgot to mention - an M suffix means mobile, and you can't directly compare numbers between them. A 7870 and 7870M are not the same.), and it looks like some OEM-only models on the desktop.

But yeah, it is a bit daunting at first, especially since they're transitioning to a new schema very abruptly (people were expecting an 8xxx and 9xxx series before a new schema). But not much has really changed - you just need to figure out which number is the generation, and which is the market segment, and you're good.
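
For illustration, the new two-part names can be pulled apart the same way. This is just a sketch based on the pattern described above (segment prefix, a leading digit that looks like the generation, a two-digit tier, and an optional X for the "full" chip) - not an official decoding:

    import re

    def decode_new_radeon(name):
        # e.g. "R9 280X" -> segment R9, generation 2, tier 80, full chip
        m = re.fullmatch(r"R(\d)\s*(\d)(\d\d)(X?)", name.strip())
        if not m:
            raise ValueError("unrecognized model: " + name)
        seg, gen, tier, x = m.groups()
        return {"segment": "R" + seg, "generation": int(gen),
                "tier": int(tier), "full_chip": bool(x)}

    print(decode_new_radeon("R9 280X"))
    # -> {'segment': 'R9', 'generation': 2, 'tier': 80, 'full_chip': True}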

Re:Marketing Numbers (1)

bjb (3050) | about 10 months ago | (#45090077)

I'll take a 280ZX. With T-tops, preferably.

Re:Marketing Numbers (0)

Anonymous Coward | about 10 months ago | (#45076381)

I hate all marketing schema that don't make functional sense, which means I hate almost all marketing schema.

Re:Marketing Numbers (5, Informative)

edxwelch (600979) | about 10 months ago | (#45076729)

Because there already is an 8000 series, which is a rebadge of the 7000 series. They rebadged so much that they ran out of numbers.

Re:Marketing Numbers (2)

GrandCow (229565) | about 10 months ago | (#45077409)

Why didn't AMD's Marketing team name these 8000 series cards? Do they keep changing the naming scheme to be intentionally confusing?

Because there's a psychological barrier to naming a card 10,000 or higher, and as you approach that, the effect starts to show. It diminishes the numbers in your mind and makes them "pop" less. In certain people's minds, going from a 7000 series to an 8000 series means more than going from a 10,000 series to an 11,000 series. The other option was to start using "k", but then how do you differentiate cards in the 10k series? 10k1? 10k2? Now you're in a different area, where people don't want to pay $50 or $100 more to bump up a notch and "only" get 1 number higher.

Now most /. readers will laugh and say "that's stupid", but that's not the group of people they're renaming the cards for. It's your novices, or the guys who have the disposable income but don't care about doing research; they fall victim to things like model numbers and "more expensive = better".

TL;DR: It's for the rich idiots.

Re:Marketing Numbers (1)

toddestan (632714) | about 10 months ago | (#45097881)

Actually, the problem was that they caught up to the first Radeons. Those actually started in the 7000's, but I didn't see as many of those around as the later 8000's and 9000's. It would have been way too confusing to have different Radeon 9600's around, even if the old one was a decade-old AGP part, so they went to a new scheme.

Incidentally, after the old 9000 series they went to "X" for 10, such as the X600, then later the "X1" which I guess meant 11, like the X1400. Then they decided it was just silly and dropped the X for the 2000-series.

Re:Marketing Numbers - A Brief History (1)

Cassini2 (956052) | about 10 months ago | (#45077695)

Marketing loves dealing with superlatives. ATI started with the Graphics Wonder card. After a while, new cards came out, and more superlatives were required. Combinations of superlatives became the new convention, e.g. the VGA Wonder Plus and the Graphics Ultra Pro. After the 3D Pro Turbo Plus card, no one tried using superlatives again.

ATI then proceeded to start naming Radeon cards 7000, 8000 and 9000 series. After MIPS 10k, no one wanted numbers larger than 10,000. As such, ATI tried the Radeon 300 series, and eventually made it to the X850 series, before trying 4 digit numbers again (ATI X1200 through AMD HD 8990).

Now ATI is copying the Intel i7-3220 convention and using dashed three-digit numbers. Hence R9-260X. It is getting difficult for ATI/AMD to number the new cards differently from the old cards. Anyone want the 3D Pro Turbo Plus convention back again?

7790 gets no love (5, Interesting)

Anonymous Coward | about 10 months ago | (#45076279)

The HD 7790 never seems to get any love in reviews -- it is always pointed out that it's slower than such and such, or more expensive than such and such... missing the point entirely.

The HD 7790 is only 85 watts. It is often compared against the GTX 650 Ti, which is 110 watts and only marginally better than the 7790 in some benchmarks (the regular GTX 650, however, is actually very competitive in power consumption, but is notably slower than the 7790 in most benchmarks).

Now we see this new R7 260X getting dumped on in the summary for essentially the same ignorant reasons. The R7 260X is supposed to use slightly less power than the 7790, but here it is being compared to cards that use 50%+ more power... essentially cards in a completely different market segment.

Reviewers are fucking retards.

Re:7790 gets no love (0)

Anonymous Coward | about 10 months ago | (#45076469)

Because most users don't care about power - they care about cost and performance. The reviewers are comparing to cards of similar cost.

Re:7790 gets no love (0)

Anonymous Coward | about 10 months ago | (#45079653)

Except for noise (if they don't install water cooling on it).

Re:7790 gets no love (1)

Rich0 (548339) | about 10 months ago | (#45099465)

Because most users don't care about power - they care about cost and performance. The reviewers are comparing to cards of similar cost.

The cost is only comparable if you don't factor in having to buy a new power supply or whatever.

I had a video card die, and a decent portion of the costs to replace it went into a power supply because the best bang-for-buck GPU was just not going to work on the power supply I had in the system (which could only supply a single PCI-E connector - and didn't really have much headroom to use an adapter). I had half-considered downgrading just for that reason.

By all means make the comparisons, but pointing out the power angle would be useful in a review.

Re:7790 gets no love (0)

Anonymous Coward | about 10 months ago | (#45076543)

I think you might have missed the point - power consumption generally isn't very relevant for desktop cards. Price and performance, however, do matter... so comparing the performance of cards that cost the same is perfectly legitimate, regardless of one using more or less power.

Sure, there are some use cases that do call for lower power - small cases, quiet PCs, etc. - but these are niches, not the mainstream gaming GPU market.

Re:7790 gets no love (0)

Anonymous Coward | about 10 months ago | (#45076583)

> The R7 260X is supposed to use slightly less power than the 7790

Really? Everything I have read says it is the exact same die with more features enabled, more memory, and a higher clock. They also suggest the power rating has jumped to 115W, which completely kills your argument.

Re:7790 gets no love (0)

Anonymous Coward | about 10 months ago | (#45079739)

You won't play new games with a 7790, at least not on high settings.

It's great that it's power efficient, but if you don't need a decent amount of power you're probably better off with an integrated GPU, which is even more power efficient.

Re:7790 gets no love (0)

Anonymous Coward | about 10 months ago | (#45081717)

You won't play new games with a 7790, at least not on high settings.

Why not? Everyone else is.

Far Cry 3, 1920x1080, Ultra settings, 30+ FPS.

Sleeping Dogs, 1920x1080, Ultra settings, 45+ FPS.

Stop being an ignorant twat.

Re:7790 gets no love (0)

Anonymous Coward | about 10 months ago | (#45082659)

Yes you will. If a 5-year-old GTS 250 can handle it, a 7790 certainly can.

Reward Clubs (-1, Offtopic)

jaredthegeek (2023500) | about 10 months ago | (#45076533)

At some of the "finer" hotels, you can join their rewards program and gain free internet access.

Re:Reward Clubs (0)

Anonymous Coward | about 10 months ago | (#45079855)

I'm going to hazard a guess and suggest that you've posted in the wrong story.

wow (1)

buddyglass (925859) | about 10 months ago | (#45076885)

Talk about alphabet soup! Or, in this case, alphanumeric soup.

Dizzy (1)

Nethemas the Great (909900) | about 10 months ago | (#45077057)

Is it just that I'm tired from a long day at work, or is the summary really that incoherent, disorganized and lacking even a rudimentary grasp of proper sentence structure? I hate to be one of those "Nazis" but I shouldn't have to read and re-read the bloody summary to tease meaning out of it.

AMD/Radeon is dead (0)

Charliemopps (1157495) | about 10 months ago | (#45077103)

AMD/Radeon is dead. I was a big AMD/ATI guy for nearly a decade, but their drivers and compatibility issues just kept getting worse and worse and worse. Their multi-monitor support is terrible. Their support for hardware-accelerated video decoding took far too long to get straightened out. Their Linux drivers dropped support for the majority of their older cards, which is silly, as the majority of Linux installs go on older computers. I have had ATI cards literally set 3 different motherboards on FIRE in the past few years due to compatibility issues... thank god for Newegg's return policy. Finally I gave up and switched to Intel and Nvidia... no problems for 2 years now, and that's on several dozen computer builds (I build computers for basically everyone I know, so much so that I have small business accounts with most retailers I buy from). Though Intel's recent switch to no OEM fan I found irritating.

All that, combined with the all-too-obvious move by most people from PCs to tablets, and I don't see AMD surviving the next decade. A friend of mine who works there says he sees the writing on the wall as well. I once spent $800 on an ATI card. I still have it; it's a work of art. But those days are long gone.

Re:AMD/Radeon is dead (0)

Anonymous Coward | about 10 months ago | (#45077595)

Nvidia is dead. I was a big Nvidia guy for nearly a decade, but their drivers and compatibility issues just kept getting worse and worse and worse. Their multi-monitor support is terrible. Their support for hardware-accelerated video decoding took far too long to get straightened out. Their Linux drivers silently dropped support for more than 3 monitors. They released defective chips and drivers that set cards on fire, then blamed the users.

Re:AMD/Radeon is dead (4, Insightful)

realityimpaired (1668397) | about 10 months ago | (#45077749)

*shrugs* Everybody has their own experiences. I have a Core i5 2500k system with 16GB of RAM, and a Radeon HD 6970, and have never had a problem despite its age. It still runs all of my current games library without breaking a sweat (and that includes recent AAA titles on Steam running under WINE), and I've never had any of the issues you claim happened to yours.

In fact, I'm at a loss to explain how it's even possible for a video card to set your system on fire. You could blow some capacitors, I suppose, if you have a cheap motherboard with cheap caps; you could crater a chipset by sending too much voltage; you could even wreck a cold solder joint. But the flash point of the plastic they use to make motherboards is high enough that the system would have shut down for critical heat *long* before it ever got hot enough to set the silicon on fire....

All of the above would be solved by not having a crap motherboard, btw... I've seen all of the symptoms I've listed in computers, but every single one of them was either a cheap motherboard or a cheap power supply, and not really anything the CPU vendor could have controlled... (I've seen them all in Intel systems as well as AMD)

Re:AMD/Radeon is dead (0)

Anonymous Coward | about 10 months ago | (#45077939)

Yeah, I'm at a loss too, unless you're talking about the late days of AGP.
A 1.5V-only AGP 2.0 or 0.8/1.5V AGP 3.0 card in a 3.3V AGP 1.0 slot could cause the northbridge to "vent with flame".

Another time I've seen graphics cards properly smoke a mainboard was 4 5870s on a cheap-ish board (I think it was an MSI). You could clearly see that the 12V traces from the ATX 24-pin to the PCIe x16 slots got hot enough to carbonize the board...

Re:AMD/Radeon is dead (-1)

Anonymous Coward | about 10 months ago | (#45081325)

Bullshit. If you're running "recent AAA titles on Steam under WINE" you're running a 10 year old graphics engine AT BEST. The term "recent" obviously doesn't mean much to you. STFU.

Re:AMD/Radeon is dead (0)

Anonymous Coward | about 10 months ago | (#45083575)

I have close to the same setup as you do, with the exception of having two 6970s instead of just one, and most games cannot break 60% load on both cards. So far the only two that have were an unoptimized version of the Arma 3 alpha and, during certain fights, Planetside 2 on ultra. The 6970s can be a little loud and may get warm if you have two of them, but that's nothing an aftermarket cooler can't solve. I love these cards.

Interesting, I just went from NVidia to ATI (1)

dutchwhizzman (817898) | about 10 months ago | (#45079087)

Your story is interesting to read. I recently bought an AMD 7870 card for my main desktop system. The main reason for me to switch was the OpenCL support AMD has in their proprietary drivers. True, I have had trouble with multi-monitor support and stability that was only fixed (for me at least) very recently, and I contemplated switching back. However, with the latest drivers I have had no trouble so far, and the OpenCL performance I get out of the card is way better than a similarly priced Nvidia board would give me.

Would I buy another Nvidia, or would I go for AMD for my next purchase? I guess it depends on Nvidia getting their support for OpenCL instructions sorted out and fixing the performance. I regularly have to do brute-force password hashing for my work, and Nvidia is miles behind AMD in that, because they have a few OpenCL instructions not properly implemented in their hardware. If their next silicon fixes that, I wouldn't know which I'd choose. If by the time I want another card they are still behind, I would likely go for an AMD card.
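
For anyone curious what this kind of workload looks like, here's a minimal pyopencl sketch. It's my own toy example - the kernel is an illustrative bit-mixing function (borrowing murmur3's finalizer constants), not a real password hash - and it assumes pyopencl and a working OpenCL runtime are installed:

    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()   # picks an available OpenCL device
    queue = cl.CommandQueue(ctx)

    candidates = np.arange(1 << 20, dtype=np.uint32)   # 1M candidate values
    out = np.empty_like(candidates)

    prog = cl.Program(ctx, """
    __kernel void toy_hash(__global const uint *in, __global uint *out) {
        int gid = get_global_id(0);
        uint h = in[gid];
        h ^= h >> 16; h *= 0x85ebca6bu;
        h ^= h >> 13; h *= 0xc2b2ae35u;
        out[gid] = h ^ (h >> 16);
    }
    """).build()

    mf = cl.mem_flags
    in_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=candidates)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, out.nbytes)

    # One work-item per candidate; the GPU chews through all of them in parallel.
    prog.toy_hash(queue, candidates.shape, None, in_buf, out_buf)
    cl.enqueue_copy(queue, out, out_buf)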

Re:Interesting, I just went from NVidia to ATI (0)

Anonymous Coward | about 10 months ago | (#45079665)

Your story is interesting to read. I recently bought an AMD 7870 card for my main desktop system. The main reason for me to switch was the OpenCL support AMD has in their proprietary drivers. True, I have had trouble with multi-monitor support and stability that was only fixed (for me at least) very recently, and I contemplated switching back. However, with the latest drivers I have had no trouble so far, and the OpenCL performance I get out of the card is way better than a similarly priced Nvidia board would give me.

Best of luck using OpenCL with AMD's drivers.

I've managed to BSOD a Windows 7 system by trying to run OpenCL-accelerated programs. That's no small feat, considering the video drivers run in user space on Win7 (the driver crashes, then comes back up, only to crash again almost immediately, which causes a "repeating video driver failure" kernel panic after it crashes 3 times in a row).

7970 been a great investment (0)

Anonymous Coward | about 10 months ago | (#45080493)

Bought one of the 7970 launch cards a month after launch, once the price evened out. At that point top-end binning did not exist, so you had a much better chance of getting good silicon; all cards were identical in design and cooling. I won that silicon lottery: a 1.3GHz 7970 on stock air which, when last tested, cranked Cry2 at ultra, butter smooth at 1440p, stable with last year's drivers. Seems many can do this if you use the right BIOS and methods, a little bit old-school style. That's pretty much GTX 780-beating performance. Water-cooled models promising similar speeds were on the horizon, but I have not seen them yet (haven't really bothered to look, either). Aftermarket air will make it a little quieter, but I don't thrash it too often, so not a biggie.

Whole point of this tirade: guys with more extreme setups have reached 1.4-1.5GHz or more... these chips have heaps of headroom in them. They're extremely efficient already, so these new silicon revisions will clock hard if they have the power circuitry to do it - that's the main thing holding them back. In other words, they're fast enough to beat a GTX 780 with nearly 2-year-old hardware already...

AMD's Mantle with GCN will speed up new-console-to-PC ports and enable much easier tweaking of graphics (or use of the original design details...) for seamless transitions between platforms.

In very varied, trying situations - e.g. re-timed/re-powered 80m+20m cable runs, unusual resolutions - I found AMD/ATI output and drivers to be superior in most cases. But driving dual 1440p screens last year was a tear fest whatever I tried... go figure.

One further advantage was 10-bit support; the 7970 is a great all-rounder card, especially for compute users who can take advantage of it. Premiere Pro now supports it too.

Microstutter... stick to one fast card at high res, or go SLI for now.
