
AMD Radeon HD 7970 GHz Edition: Taking Back the Crown

Soulskill posted about 2 years ago | from the their-turn-at-leapfrog dept.


An anonymous reader writes "The benchmarks are in for the Radeon HD 7970 GHz Edition. Starting at $500, AMD's new single-GPU flagship boosts the original 7970's clock speed from 925 MHz to 1 GHz (1050 MHz with Boost). The GHz Edition also sports 3 GB of faster 1500 MHz GDDR5 memory, pushing 288 GB/s as opposed to 264 GB/s. While the AMD reference board runs hot and loud, retail boards will use different cooling solutions. A simple test of aftermarket GPU coolers shows that any other option will shave degrees and slash decibels. But it's the Catalyst 12.7 beta driver that really steals the show for AMD, pushing FPS scores into overdrive. With the new Catalyst, Nvidia's GeForce GTX 670 can no longer beat the original Radeon HD 7970, and the GHz Edition outmaneuvers the GeForce GTX 680 in most cases. However, when factoring price and possible overclocking into the equation, the original Radeon HD 7970 and GeForce GTX 670 remain the high-end graphics cards to beat."
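The memory bandwidth figures in the summary are internally consistent. A quick sketch, assuming the 7970's 384-bit memory bus (GDDR5 performs four transfers per command clock):

```python
def gddr5_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak GDDR5 bandwidth: clock (MHz) x 4 transfers/clock x bus width in bytes."""
    transfers_per_sec = mem_clock_mhz * 1e6 * 4  # GDDR5 is quad data rate
    return transfers_per_sec * (bus_width_bits / 8) / 1e9  # bytes/s -> GB/s

# Original HD 7970: 1375 MHz memory on a 384-bit bus
print(gddr5_bandwidth_gb_s(1375, 384))  # 264.0
# GHz Edition: 1500 MHz memory
print(gddr5_bandwidth_gb_s(1500, 384))  # 288.0
```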


132 comments

X2 (1)

lightknight (213164) | about 2 years ago | (#40416627)

Where's my X2 edition?

Re:X2 (3, Interesting)

steelfood (895457) | about 2 years ago | (#40416863)

Forget X2. I want an All-In-Wonder version.

Completely unrelated /. trivia, but I just noticed the captcha isn't required at all for posting if logging in at the same time.

Re:X2 (1)

lightknight (213164) | about 2 years ago | (#40416897)

X2 All-In-Wonder edition.

Yes, I think that will do it.

Re:X2 (-1)

Anonymous Coward | about 2 years ago | (#40417087)

Forget stupid video cards. I'll spend my $500 on something that will actually do something - like that MS Surface or the Google Nexus Tablet. The video card thing is only for people doing things like geological mapping or petroleum reservoir simulations or some serious rendering work. Oh, and twitchers playing FPS. Can't forget the twitchers.

Re:X2 (0)

Anonymous Coward | about 2 years ago | (#40417301)

Suck at Quake, eh? You must be one of those rage quitters.

Re:X2 (0)

Anonymous Coward | about 2 years ago | (#40417591)

Honestly, as someone who owns Doom through Quake 3 Arena (Wolf3D too, but only because of RTCW), I have to ask: Doom 3 onward? Who the fuck cares? None of them brought anything new to the table; there were better games, both visually and gameplay-wise, released earlier (I'm thinking Far Cry 1/2 for the Doom 3 era, and Borderlands, Fallout, all the Modern Warfare games, etc. for Rage). Honestly, for straight-up FPSes there's a stack of open source games, both single- and multiplayer, that beat the pants off either, usually with far more players or far lower system requirements to start playing.

But hey maybe there's something the id fanboys know that I don't.

Rage (Or was that Doom3?): Carmack's Daikatana

Re:X2 (0)

Anonymous Coward | about 2 years ago | (#40417615)

PowerColor and HIS have both shown 7970 X2s (6GB RAM). Remains to be seen how long until they ship.

Is ATI coming back? (0, Flamebait)

YodasEvilTwin (2014446) | about 2 years ago | (#40416655)

So, are ATI* cards competitive** again, or is this simply reframing the debate in their favor? I don't really have any idea.

* Please don't talk to me about AMD; they suck, and the graphics division is still basically independent.
** Both in terms of top performance and cost for a given performance level.

Re:Is ATI coming back? (1)

Tough Love (215404) | about 2 years ago | (#40416961)

I don't know about ATI, but AMD's Radeon cards have been competitive for a long time.

Re:Is ATI coming back? (1)

hairyfeet (841228) | about 2 years ago | (#40419873)

Their bang for the buck has been pretty damned awesome, and while I'm not gonna spend $500 on a card, this does make me happy, as all those $200-$300 cards will be dropping that much faster. I'm sure by Xmas my HD 4850 will be long enough in the tooth that it'll be time to trade up.

I have to say, what I've been liking lately about AMD is how many formats they're supporting for decode and transcode. With the new XCode codecs, my hexacore barely registers no matter what definition I'm playing, because so much is being dumped to the GPU, and even on my little E350 netbook I'm getting great battery life playing 720p video thanks to everything being offloaded.

So I have to say: yay, AMD. Having more competition is always better for the consumer. Now if only they could come up with something better than Bulldozer/Piledriver on the CPU front, I'd be a really happy camper.

Re:Is ATI coming back? (2)

Sir_Sri (199544) | about 2 years ago | (#40417271)

Competitive is an odd word in the hardware business. If you want to spend $1,200 on a CPU, does AMD have a competitive offering? How about $500? What's the difference in performance between a $500 part and a $1,200 one?

With GPUs, AMD and nVIDIA are pretty close in rendering performance; for specialized tasks (GPU computing), particular hardware may favour one vendor over the other. But if 90% of the market is in GPU parts that cost less than $400, whether or not you hold the top spot at the $500 price point doesn't really matter. It's more of an advertising problem than a technical one.

Some of us (myself included) have things like GTX 680s and 5970s; those can be $600-$800 parts. They aren't cheap, but they also aren't normal. If you wanted to buy a $350 GPU from Newegg or equivalent yesterday, both nVIDIA and AMD had competitive parts in that price bracket. Even at the top end, the 7970 was slower, but not much slower overall, than the competitively priced nVIDIA part, and we're talking about theoretical performance anyway; experience with a specific game and driver support matter a lot.

And while, yes, the graphics division of AMD is still based in Markham, as time goes on we're seeing more integrated CPU-GPU products. That's not really the segment we're talking about here, but they're very much becoming an integrated outfit.

which OTHER ati card should I buy? (2)

ClintJCL (264898) | about 2 years ago | (#40416711)

My x1950 sucks for 1080p video (since, y'know, the HD line was the NEXT card after that), but it's fine for QuakeLive, which is the only game I really care about that much. But I know they have cards that "game good, video bad" as well as "game bad, video good". It's frustrating trying to figure out which product I should buy. My card was $200 in 2007. If I don't really want that much more than what I got then, why would I want to pay $500?

Now, I'm an ATI man who's been using TV out since 1995, non-stop. But I'm not willing to throw them so much money, especially when I have to change my entire operating system to accommodate their abandonment of "old" OSes like XP. Man, that jump to 64bit required updating so many scripts, and replacing so many utilities. Don't force change on me and I might give you more money, ATI.

I like stories.

Re:which OTHER ati card should I buy? (2)

YodasEvilTwin (2014446) | about 2 years ago | (#40416817)

Got a couple 6950's recently for ~$250 each and they've been great for games and videos. One was an older rev that crashed the driver, but Diamond replaced it with no questions asked and the new one works fine. For games that support Crossfire I get basically the performance of a 6990 for 2/3rds the price, and alternative frame rendering isn't too bad for other games either (I can play SWTOR on full settings at 1080p with no issues).

Must be your cpu (1)

ArchieBunker (132337) | about 2 years ago | (#40416843)

I have an ancient X1650 card with a Q6600 CPU running Windows 7, and 1080p video uses minimal CPU. Your card is not the bottleneck. I'd like a newer card, but when you look at the numbers, the lowest-end Fermi card is still slower than the old 9800GT series.

Re:Must be your cpu (1)

SJHillman (1966756) | about 2 years ago | (#40417035)

I also had an x1650 with an E6300 (until recently) - ran video at every resolution fine on both WinXP and Win7 x64. However, I'd recommend spending the $20 and upgrading to a Radeon HD 5450... much better performance in Win7 and only draws 18W.

Re:Must be your cpu (1)

ClintJCL (264898) | about 2 years ago | (#40417383)

Card's definitely the bottleneck in my case. Just because you have a lower card doesn't mean much; our systems, software, and hardware are not identical. The CPU is nowhere near maxed out. Also, I play a lot of stuff using only software acceleration (not 1080p, though; that still gets all kinds of tearing in VLC and Media Player Classic), because I have an Ambilight system on my computer that needs to be able to see the pixels to match the light color. The card also ran better when I bought it than it does now. It's 5 years old.

Re:Must be your cpu (1)

ClintJCL (264898) | about 2 years ago | (#40417387)

It also depends on the bitrate for me: if it's a 2-hour movie encoded to a 3 GB 1080p MKV, I'm going to be fine. If it's a 2-hour movie encoded to an 8 GB 1080p MKV, I'm not. Still not the CPU maxing out, though.
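The poster's cutoff is easy to put rough numbers on, since average bitrate is just file size over runtime. A sketch (decimal gigabytes, container overhead ignored):

```python
def avg_bitrate_mbps(file_size_gb: float, duration_s: float) -> float:
    """Average bitrate in Mbit/s for a file of the given size and runtime."""
    return file_size_gb * 8e9 / duration_s / 1e6

two_hours = 2 * 3600  # 7200 seconds
print(round(avg_bitrate_mbps(3, two_hours), 1))  # 3.3 Mbit/s: plays fine
print(round(avg_bitrate_mbps(8, two_hours), 1))  # 8.9 Mbit/s: struggles
```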

Re:which OTHER ati card should I buy? (2)

Aranykai (1053846) | about 2 years ago | (#40417267)

If you aren't gaming, just grab a mid-range 5xxx-series card and you will be fine. You should have very little trouble finding one for less than $80.

If you want more than that, get an HD 6770. You can have them for about $115, and it should keep up with any HD video and moderate gaming (as long as you don't expect maxed settings), and it should last you another 3-4 years before it's too outdated.

There is no reason to spend more than $200 on a video card unless you are doing hardcore gaming on multiple or high-resolution monitors.

Re:which OTHER ati card should I buy? (0)

Anonymous Coward | about 2 years ago | (#40417339)

An HD 6770 is an HD 5770; same card.
You can find HD 7770s new for ~$110 after rebate, performance-wise in the same league as a 5830 or 6850, but with a lot less power draw and noise...

Re:which OTHER ati card should I buy? (1)

ClintJCL (264898) | about 2 years ago | (#40417415)

I'm gaming, just... I don't need much more gaming power than what an X1950 offers. I want to play QuakeLive in 1080p at 60 FPS, but I'm on an HDTV anyway, usually at 30 Hz, so framerate's not TOO pressing of an issue.

Re:which OTHER ati card should I buy? (1)

Aggrajag (716041) | about 2 years ago | (#40419871)

I would recommend the HD 7770, as it consumes a lot less power compared to the HD 5770/6770, and the cheapest ones cost around $110-$120.

Re:which OTHER ati card should I buy? (1)

LordLimecat (1103839) | about 2 years ago | (#40417699)

Have you thought about putting that money towards something like one of the new integrated graphics options, like Ivy Bridge or the AMD stuff? I can say at the least that Sandy Bridge HD 2000 was sweet, and I can't imagine how sweet IB must be.

It won't match a mid-to-high-end GPU, but it may do better than what you had. Just something to consider when checking out benchmarks.

Re:which OTHER ati card should I buy? (1)

ClintJCL (264898) | about 2 years ago | (#40417723)

I don't plan on upgrading my motherboard anytime soon, though; I'm super-picky about motherboards. Quite traumatized by Abit going out of business.

I'm done with spendy, top of the line cards (2)

ackthpt (218170) | about 2 years ago | (#40416733)

All of my expensive, fancy video cards have died, usually right after any kind of warranty, and now I'm squeaking by on some horribly low-res, limited-palette card with no hardware acceleration for graphics. But at least it's reliable!

Re:I'm done with spendy, top of the line cards (4, Informative)

Ogi_UnixNut (916982) | about 2 years ago | (#40416853)

Ditto! I kept buying top-end Nvidia cards for CUDA work, only to have them die just after the warranty, usually a year or so in. I dug out an old Nvidia Quadro 285 card from the early 2000s and am using it again. Also, the 8400GS I got works just peachy for simple CUDA stuff.

It is like they engineer their top-end cards to fail after a year or so, no matter what. My GTX 280 never went beyond 50 degrees and was underclocked to boot (I didn't need all the power). Yet it died after a year or so, about as long as the 8800 GTX I had before it.

The Quadro has been in use in some form for more than half a decade, and it still does 99% of what I need (apart from the CUDA stuff; otherwise it would be perfect). Their older stuff seems more solid.

Re:I'm done with spendy, top of the line cards (2)

ackthpt (218170) | about 2 years ago | (#40417213)

Ditto! I kept buying top-end Nvidia cards for CUDA work, only to have them die just after the warranty, usually a year or so in. I dug out an old Nvidia Quadro 285 card from the early 2000s and am using it again. Also, the 8400GS I got works just peachy for simple CUDA stuff.

It is like they engineer their top-end cards to fail after a year or so, no matter what. My GTX 280 never went beyond 50 degrees and was underclocked to boot (I didn't need all the power). Yet it died after a year or so, about as long as the 8800 GTX I had before it.

The Quadro has been in use in some form for more than half a decade, and it still does 99% of what I need (apart from the CUDA stuff; otherwise it would be perfect). Their older stuff seems more solid.

My suspicion, after looking at a few cards under a loupe, is that the technology is exceeding what the board itself can host: such densely packed, current-hungry, heat-producing electronics. To be able to sell and profit from these units, they are produced rapidly by a robotic assembly line. If they slowed that line down a bit, the failure rate would decline, but they would rather operate under an acceptable rate of failure (early or late), as the assembly line will be retooled for something else after the run.

Our older cards place less physical and environmental demand upon the 7-or-more-layer circuit boards, and that's why they still hold up. Possibly they came through the production line at a slightly slower pace, too.

Re:I'm done with spendy, top of the line cards (2)

F34nor (321515) | about 2 years ago | (#40418777)

You might try heating them in the oven to reflow the solder. It worked for my D820 laptop motherboard.

Re:I'm done with spendy, top of the line cards (3, Insightful)

Anonymous Coward | about 2 years ago | (#40417469)

There are differences between the professional lines (Quadro, Tesla, FireGL, FireStream) and the consumer lines (GeForce and Radeon). The professional lines are built for the GPU manufacturers to controlled specs and designed for longer life. The consumer lines are built by OEMs from a reference design, with incentives to push clock speeds and component specs to the limit.

Your experience likely has more to do with the old card being a Quadro than it does with newer cards being more fragile.

Re:I'm done with spendy, top of the line cards (1)

Rockoon (1252108) | about 2 years ago | (#40419559)

The consumer lines are built by OEMs from a reference design with incentives to push clock speeds and component specs to the limit.

Indeed. And with that in mind, it's not at all silly to buy a card with lower clock rates but the same GPU reference design. You probably wouldn't notice the framerate difference, but you WILL notice the temperature difference, as the highest-clocked cards are always maxing out their fans... even on menu screens.

Re:I'm done with spendy, top of the line cards (1)

Jimbob The Mighty (1282418) | about 2 years ago | (#40417743)

Really? I've never had any issues with NVIDIA cards. Would be interested to know which brand of card, mobo and PSU you were using.

Re:I'm done with spendy, top of the line cards (3, Informative)

JonySuede (1908576) | about 2 years ago | (#40418083)

This is the detailed article explaining why things are the way they are: http://www.geeks3d.com/20100504/tutorial-graphics-cards-voltage-regulator-modules-vrm-explained/2/ [geeks3d.com]

Re:I'm done with spendy, top of the line cards (3, Informative)

JonySuede (1908576) | about 2 years ago | (#40418105)

I forgot to add: read this one after the page I linked above: http://www.geeks3d.com/20091209/geforce-gtx-275-vrm-damaged-by-furmark/ [geeks3d.com]

Re:I'm done with spendy, top of the line cards (1)

DigiShaman (671371) | about 2 years ago | (#40419189)

Which is why I never buy OCed editions of hardware or manually OC my own equipment. I don't care how well you can cool the components. They will die fast and young!

Re:I'm done with spendy, top of the line cards (0)

Anonymous Coward | about 2 years ago | (#40418301)

BS; buy from quality OEMs. My $700 overclocked BFG 8800 GTX died 6 months ago, and it could take 260s in the benchmarks and hang with the 460s at low res any day of the week. I expect my Asus replacement 6770, with 6 DisplayPorts, to last 7 years too.

Re:I'm done with spendy, top of the line cards (1)

malloc (30902) | about 2 years ago | (#40417305)

What card are you running that has a limited palette? I haven't had that since I gave up my ISA Trident 9000 card. (For the sake of argument, I'm considering a 24-bit RGB signal as "unlimited". Consumers aren't going to worry about a 10-bit LUT in their hardware.)

Re:I'm done with spendy, top of the line cards (0)

Anonymous Coward | about 2 years ago | (#40417355)

A 2MB PCI S3 couldn't do > 800x600 at 24bpp.

Re:I'm done with spendy, top of the line cards (0)

Anonymous Coward | about 2 years ago | (#40418921)

Why not? 800x600x32 bits is less than 2MB.
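Both posts are arithmetically right: a single 800x600 buffer at 32bpp fits in 2 MB, while stepping up to 1024x768 at 24bpp does not. A quick check:

```python
def framebuffer_bytes(width: int, height: int, bpp: int) -> int:
    """Bytes needed for a single framebuffer at the given bit depth."""
    return width * height * bpp // 8

TWO_MB = 2 * 1024 * 1024  # 2,097,152 bytes

print(framebuffer_bytes(800, 600, 32) < TWO_MB)   # True: 1,920,000 bytes
print(framebuffer_bytes(1024, 768, 24) < TWO_MB)  # False: 2,359,296 bytes
```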

Well if that happens often (1)

Sycraft-fu (314770) | about 2 years ago | (#40417569)

You are fucking something up, most likely your cooling or power. Or I suppose you could just be really unlucky. Regardless, just get a card that has a lifetime warranty. eVGA will sell you one.

Not buying ultra high end cards because they cost too much is a good reason not to buy them. You can get near their performance for much less money. Not buying them because you can't be bothered to build a system with proper power and cooling and do a bit of research to get a longer warranty is a silly reason not to buy them.

Re:Well if that happens often (1)

locopuyo (1433631) | about 2 years ago | (#40418479)

He probably bought the highest end card but then cheaped out and got the one with a bad cooling system.
I have made that mistake before but now I spend the extra $20 to get the version that runs the coolest and quietest. That way it doesn't die after a year of use and doesn't sound like a vacuum.

One word, "BITCOIN" (1)

maitas (98290) | about 2 years ago | (#40416763)

They need to add a benchmark for Bitcoin, since mining makes up a lot of the market of high-end graphics card buyers, and AMD is way faster per watt than Nvidia.

Re:One word, "BITCOIN" (1, Insightful)

Anonymous Coward | about 2 years ago | (#40416789)

For the 3 people that care about shitcoin?

Re:One word, "BITCOIN" (2)

h4rr4r (612664) | about 2 years ago | (#40416825)

What is this "a lot of the market"?
You really think more than a tiny percentage of folks use these cards for bitcoin?

Most people prefer to play games with them, instead of entering into pyramid schemes. Cash out while you still can.

Re:One word, "BITCOIN" (0)

Anonymous Coward | about 2 years ago | (#40417317)

No, but that tiny percentage of people will buy multiple cards. I've heard of some people having upwards of 20 different high end graphics cards between all of their machines. Even if they're a small percentage of the market, they still buy a large number of cards.

However, unless the results are really good, I don't see too many bitcoin miners going for new cards. They usually buy up the older generation cards which are nearly as good, but considerably cheaper. Unless the performance per watt on these newer cards is significantly better, I doubt that too many will bother to invest in them.

No kidding (1)

Sycraft-fu (314770) | about 2 years ago | (#40417627)

I think many of these miners need to L2math. One of these cards will run you $500. Running it full bore will take around 250 watts, so 1 kWh for every 4 hours it runs. You also have to factor in cooling, if you live in a warmer area, and computer power (and cooling for that) if the machine would normally be off during that time.

Well you need to run the numbers for your own power costs and so on, but that is a lot of mining you have to do to break even depending on what price you can get per bitcoin at a particular time.

So if the idea is to buy these to "make money" or something, you probably should throw some math at it first to see what it'll take to make anything, all presuming the price on bitcoins holds. "Easy money" usually isn't.
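The break-even math described above can be sketched as follows. Every input here (electricity price, daily yield, exchange rate) is a hypothetical placeholder, not a figure from the article; substitute your own numbers:

```python
def breakeven_days(card_cost_usd: float, watts: float, usd_per_kwh: float,
                   btc_per_day: float, usd_per_btc: float) -> float:
    """Days of 24/7 mining needed to pay off the card, or inf if never."""
    power_cost_per_day = watts / 1000 * 24 * usd_per_kwh  # kWh/day * price
    profit_per_day = btc_per_day * usd_per_btc - power_cost_per_day
    return card_cost_usd / profit_per_day if profit_per_day > 0 else float("inf")

# Hypothetical: $500 card drawing 250 W, $0.12/kWh, 0.5 BTC/day at $6/BTC
print(round(breakeven_days(500, 250, 0.12, 0.5, 6.0)))  # 219 days
# If power costs eat the revenue, you never break even:
print(breakeven_days(500, 250, 0.50, 0.1, 6.0))  # inf
```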

Re:One word, "BITCOIN" (2)

deego (587575) | about 2 years ago | (#40416845)

>> They need to add a benchmark for Bitcoin.

The bitcoin world is now looking at FPGAs and ASICs.

AMD Linux support sucks (4, Informative)

rtkluttz (244325) | about 2 years ago | (#40416779)

Regardless of Torvalds recently getting his feathers ruffled with Nvidia... in most cases, Nvidia just works on Linux. I swore off AMD/ATI loooong ago because JUST about the time they finally get decent proprietary Linux driver support for one of their chipsets, it drops off the back side of support. I DESPISE forced upgrades and won't get caught in that trap again. All of our perfectly working AMD-video laptops still work great, but there's no proprietary driver support anymore, and the open source driver is waaaay worse. Nvidia's proprietary drivers still support VERY old chipsets.

Re:AMD Linux support sucks (1)

Anonymous Coward | about 2 years ago | (#40416885)

Torvalds couldn't go off at AMD because they at least opened the source on part of their drivers, which now has a group working to produce an open source version. It still means the official AMD drivers are a pile of sh*t, but it at least means they can "try" to do something better.

Re:AMD Linux support sucks (-1)

Anonymous Coward | about 2 years ago | (#40416975)

That's funny. I run all my Linux boxes with ATI video cards and I don't have any problem with them. 3D Unity works just fine on all of them and I can mine Bitcoins with four different ATI cards (one 5770, two 5830s and one 5850) at 200+Gh/sec. If you want an nVidia Quadro FX1800, I have two of them I can sell you cheap, but you can't connect them with Crossfire because, well, they're nVidia cards and they just don't play that way.

Re:AMD Linux support sucks (1)

LingNoi (1066278) | about 2 years ago | (#40419195)

Because 3D Unity really taxes a video card... Come back when it works with a multitude of video games, as well as Wine, "without problems"; then you'd be on to something. Running the desktop is the lowest bar there is.

Re:AMD Linux support sucks (0)

Anonymous Coward | about 2 years ago | (#40419699)

If you really wanted to use your video card for games, you wouldn't be running Linux.

Re:AMD Linux support sucks (2)

Tough Love (215404) | about 2 years ago | (#40416985)

The open source Radeon driver works just fine, I'm using it for heavy 3D work right now. Not the case with NVidia. Linus had every reason to flip NVidia the bird, especially considering NVidia's ambition to win bags of gold selling Android chipsets.

Re:AMD Linux support sucks (1)

Anonymous Coward | about 2 years ago | (#40417675)

The open source Radeon driver works just fine

So not supporting OpenGL features which the hardware is capable of and running with abysmal performance is "just fine"? The performance of the open source Radeon drivers is utter crap and something that should actually shame AMD. But they don't give a shit.

And guess what, Android systems won't be running any open source OpenGL drivers anytime soon, regardless of NVidia.
It's galling that the only way to get good 3D performance is to run NVidia/Linux instead of GNU/Linux, with a proprietary black blob three times the size of the kernel running your computer.

Re:AMD Linux support sucks (0)

Anonymous Coward | about 2 years ago | (#40418153)

So not supporting OpenGL features which the hardware is capable of and running with abysmal performance is "just fine"? The performance of the open source Radeon drivers is utter crap and something that should actually shame AMD.

You just effectively knocked on AMD for the work being done by Xorg.

Can't tell if you are a troll or just a complete moron. But that's beside the point. The bigger point is: STOP LYING.

Re:AMD Linux support sucks (0)

Anonymous Coward | about 2 years ago | (#40418583)

You just effectively knocked on AMD for the work being done by Xorg.

There has been work done by Xorg, but no serious work done by AMD for 3D drivers, and THAT is why they should be ashamed. Almost no AMD employee works to improve the open source drivers. AMD only makes sure that 2D works fine.

I am someone who actually tries to play OpenGL games in Linux with a Radeon card (which I bought to celebrate their "support" of open source). But if you don't believe my word for it that the performance sucks, just go look at the benchmarks...

And by the way, OpenGL 4.2 is out, but only OpenGL 2.1 is supported by open source drivers.

Can't tell if you are a troll or just a complete moron. But that's beside the point. The bigger point is: STOP LYING.

Right back at you, but exchange "troll" with "blind AMD fanboy". On second thought, just go fuck yourself.

Re:AMD Linux support sucks (1)

Tough Love (215404) | about 2 years ago | (#40418313)

I'm getting 75 million Phong shaded triangles/second at 1920x1200 out of a 6450 running the Xorg Radeon driver. What's not good about that? Note: that's a fanless $50 card.

Re:AMD Linux support sucks (0)

Anonymous Coward | about 2 years ago | (#40418727)

I'm getting 75 million Phong shaded triangles/second at 1920x1200 out of a 6450 running the Xorg Radeon driver. What's not good about that? Note: that's a fanless $50 card.

What is good about a Phong shaded triangle? Nothing, absolutely nothing. Can you say "all my 3D games run smoothly"? But if you're on Linux with open source Radeon drivers, the chance you even play a 3D game while caring about its performance is very small. We all know Radeon graphics cards have great potential performance, but it's a shame that it's not utilized by the open source drivers.

(Phong shaded triangles do not use textures. I suspect that despite the big number of them you get per second, your actual framerate is fairly low).

Re:AMD Linux support sucks (1)

Tough Love (215404) | about 2 years ago | (#40418957)

What is good about facts? Oh, never mind, I see you don't let any such thing stand in the way of a perfectly good rant.

Re:AMD Linux support sucks (1)

Tough Love (215404) | about 2 years ago | (#40418973)

(Phong shaded triangles do not use textures. I suspect that despite the big number of them you get per second, your actual framerate is fairly low).

For your information, Phong shading is quite demanding because of the exponentials involved. No, my framerate is not low; it is capped at 60 FPS. Very impressive for this class of card. And at least a single layer of textures does not seem to bother it. See, I'm not throwing crap at the card; I'm using it the way it was meant to be used, and that makes quite a difference. Now please crawl back into your hole, fudster, and don't come out again until you are armed with some facts.
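For what it's worth, the two figures in this exchange are mutually consistent: 75 million triangles per second under a 60 FPS cap works out to 1.25 million triangles per frame. The arithmetic:

```python
triangles_per_second = 75_000_000  # the poster's claimed throughput
fps_cap = 60                       # vsync-capped framerate

triangles_per_frame = triangles_per_second // fps_cap
print(triangles_per_frame)  # 1250000
```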

Re:AMD Linux support sucks (1)

ifiwereasculptor (1870574) | about 2 years ago | (#40418697)

So not supporting OpenGL features which the hardware is capable of and running with abysmal performance is "just fine"? The performance of the open source Radeon drivers is utter crap and something that should actually shame AMD. But they don't give a shit.

The open source driver is not meant to be better than the blob in terms of performance. It's meant to evolve much slower, but to support older cards much longer and to work more closely with open source projects. Fglrx is often slow to react to newer Xorg ABIs, for example, and in such cases radeon is there to pick up the slack. It isn't perfect when running any modern card, but it gives you a desktop, good video playback and all. Gaming is still far behind, but it's very possible to play a lot of older titles.

The alternative, Nvidia, is worse, in my opinion. The blob is good when it's running, but getting it to run (and keep it running, especially if you use their crappy installer) can be quite a bit more troublesome. And nouveau sucks, so you pretty much have to rely on the blob. When they stop supporting your chip, chances are nouveau isn't going to support it either, because they have limited manpower and coding blindly is way harder. Look at the state of NV3x and NV4x with GTK3. Nvidia doesn't work. Neither does nouveau. So I simply can't run it unless I use the soon-to-be-deprecated fallback mode (granted, this wasn't a big deal until Cinnamon was released). So we have two drivers, but I can't render a desktop on a card that can still play Portal 2.

Re:AMD Linux support sucks (0)

Anonymous Coward | about 2 years ago | (#40418805)

The open source driver is not meant to be better than the blob in terms of performance.

With this attitude, it's doomed not to be...

Since open source is better by being open source, comparable performance could be the aim. And it is, for a very small number of developers, but their efforts alone are not enough to get there before the GPUs themselves are obsolete.

At this rate Intel integrated graphics with their open source drivers will be perfect soon for everyone who does not care about the newest games but only about legacy (and with comparable performance to Radeon with its open source drivers). Of course, Intel does not really care about OpenGL and yet they are the only ones doing it the correct way.

So 3D on Linux makes an unhappy penguin. :-(

Re:AMD Linux support sucks (1)

drinkypoo (153816) | about 2 years ago | (#40418729)

Piss on your open source radeon driver. It has never supported the X1250 graphics in my R690M chipset correctly in spite of the core being ancient. The graphics corruption I've encountered with every try to date is actually worse than before in more recent versions — just tried it today with Precise. And of course, this core is too "old" to be supported by fglrx, which was true the day I bought the system brand new at the store.

AMD linux support is ass unless you happen to have one of the few cards that is well-supported, and that's true whether you're talking about the radeon driver or fglrx.

Re:AMD Linux support sucks (4, Informative)

GrumpyOldMan (140072) | about 2 years ago | (#40417119)

+1 I had an ATI in my last Linux desktop. Never again.

The proprietary fglrx drivers tend to have weird bugs and as you say, they drop chips that are old enough to have decent support. On the flip-side, the open-source radeon drivers tend to require various bleeding edge bits and pieces to work correctly, so they are nearly impossible to run on stable distros, like an Ubuntu LTS or a RHEL.

Nvidia's proprietary drivers just work, once you finally figure out how to blacklist nouveau hard enough that it doesn't get loaded via the initrd. Plus, they support VDPAU for projects like MythTV and XBMC.

Re:AMD Linux support sucks (0)

Anonymous Coward | about 2 years ago | (#40417405)

+1 I had an ATI in my last Linux desktop. Never again.

The proprietary fglrx drivers tend to have weird bugs and as you say, they drop chips that are old enough to have decent support. On the flip-side, the open-source radeon drivers tend to require various bleeding edge bits and pieces to work correctly, so they are nearly impossible to run on stable distros, like an Ubuntu LTS or a RHEL.

Nvidia's proprietary drivers just work, once you finally figure out how to blacklist nouveau hard enough that it doesn't get loaded via the initrd. Plus they support VDPAU for projects like MythTV and XBMC.

And this is why regular users prefer Windows

Re:AMD Linux support sucks (0)

Anonymous Coward | about 2 years ago | (#40417529)

And windows is why regular people prefer iPads.

Re:AMD Linux support sucks (0)

Anonymous Coward | about 2 years ago | (#40418851)

Hey man, right on. I'm going to run out and buy an Nvidia card right now -- thanks for the anecdotal evidence.

Your in-depth analysis of the problem and clear technical synopsis ("bleeding edge bits", "weird bugs" and "drivers just work") really spoke to me.

Re:AMD Linux support sucks (1)

WombleGoneBad (2591287) | about 2 years ago | (#40417551)

Goodbye Nvidia, I won't miss you. For years I stuck with Nvidia because everyone said they were the best choice for Linux, but I've hated every minute of it. They were a constant annoyance for the following reasons:
  1. Every time you build a kernel you have to download the latest Nvidia driver and install it separately.
  2. Want to try a release candidate kernel? Nope, forget it. Just to get the damn thing to install you have to reverse engineer their installation scripts, and even after you get that working, the driver is usually unstable (not surprisingly).
  3. It overwrites libGL with its own version that won't work with anything else ... WTF??
  4. I had several problems, such as an unstoppable crackle coming out of my TV/monitor over HDMI (never mind actually getting the HDMI audio to work!), or having to write my own modeline to stop overscan.
  5. The installer is a real PAIN IN THE ASS if you swap between kernel builds a lot. It complains about kernel versions and refuses to install, or refuses to uninstall. Grrrr.
  6. nouveau simply did not work for me, and even if it did I wasn't sure the 3D would be good enough for my needs. I could never get X running with it on my monitor; just screen noise. The nouveau guys are kind of heroic, but why isn't nouveau backed by Nvidia? And why is nouveau helping sustain a product whose manufacturer won't play ball? Might they not be better employed helping improve the open ATI drivers?

Recently I finally decided to give radeon a try, and bought a cheap Radeon HD 6450. OK, I admit I still had to write my own modeline to get rid of an overscan issue (similar to my experience with Nvidia), but after that it JUST WORKED, with seemingly any kernel I build, without having to shoe-horn in proprietary drivers every time I do a build.
The 3D performance seems perfectly adequate for my needs, and being OpenGL 4.1 I can build and run OpenGL ES 2.0-style code against it happily. I really don't know why everyone slags off ATI/Radeon support and gives Nvidia a free ride. I like life on the ATI side of the fence, and I'm not going back to Nvidia anytime soon.
For some of us, having a card with a half-decent open source driver in the kernel tree is not an ideological battle, but simply a practical necessity.
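(In case anyone wonders what "writing my own modeline" involves, it's roughly the sketch below. The timing numbers come from `cvt`, and the output name `HDMI-0` is an assumption; check `xrandr` for your actual output name.)

```shell
# Sketch: generate CVT timings for 1920x1080@60 and add the mode at runtime.
cvt 1920 1080 60   # prints a "Modeline ..." line with the timings used below
xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
xrandr --addmode HDMI-0 "1920x1080_60.00"
xrandr --output HDMI-0 --mode "1920x1080_60.00"
# To make it permanent, put the same Modeline in a Monitor section of xorg.conf.
```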

Re:AMD Linux support sucks (0)

Anonymous Coward | about 2 years ago | (#40419089)

Every time you build a kernel you have to download the latest Nvidia driver and install it separately.
Want to try a release candidate kernel? Nope, forget it. Just to get the damn thing to install you have to reverse engineer their installation scripts, and even after you get that working, the driver is usually unstable (not surprisingly).

I hope you don't mess around with X updates as much as you mess with kernel updates. fglrx trails behind for months whenever the X ABI changes.

Re:AMD Linux support sucks (2)

ianare (1132971) | about 2 years ago | (#40417693)

Funny you should mention that, right when AMD wins a huge order [phoronix.com] of graphics chips precisely because they have open source drivers.

And anecdotally, I've never had a problem with AMD hardware; generally, by the time the proprietary driver loses support, the open source one matches its performance.

Nvidia has some strong products. (1)

Anonymous Coward | about 2 years ago | (#40416811)

This cycle, the latest Nvidia GPUs have a lot going for them. The Kepler series really is impressive, and generally has much lower power consumption than the AMD parts. (This was flipped last gen, funnily enough; those first Fermi cards were heat machines.)

I picked up a factory OC'd version of the 670, and it's shocking how fast it is (and how quiet and cool). Remember that moment your system became good enough to run Oblivion at fully maxed settings with really high framerates? Or Morrowind? Yeah, that just happened with Skyrim.

Re:Nvidia has some strong products. (1)

asm2750 (1124425) | about 2 years ago | (#40418189)

The GTX 670 is probably one of the best cards I have bought to date, and I'm happy I got it instead of the GTX 680. Granted, everyone who got a 680 was kicking themselves when the 670 turned out to be only slightly less powerful at a much lower price.

Half of its functionality STILL won't work! (1)

mindmaster064 (690036) | about 2 years ago | (#40416815)

Seriously, I can't even directly compare Radeon to GeForce cards, because they're functionally crippled: game developers are ignoring them on Windows, and the binary driver is completely borked on Linux, so no kudos THERE either. Then there is the issue that every single time I buy one of these cards, it spontaneously flames out. Speed isn't everything, and they've been stomped on drivers and support by Nvidia for years. Radeons still fail to render graphics properly even to this day, and it's especially noticeable in side-by-side comparisons. They have some nerve charging $500 for what is likely as much a piece of turd as the rest of their offerings.

Re:Half of its functionality STILL won't work! (1)

Mashiki (184564) | about 2 years ago | (#40416969)

Might have something to do with developers, you know, having a bug up their ass and still developing for consoles. You know, 10-year-old hardware. Nah, couldn't be...

Re:Half of its functionality STILL won't work! (0)

Anonymous Coward | about 2 years ago | (#40417147)

Really? I have a dual 4870 setup that runs beautifully: smooth, no crashes, looks wonderful (all I do is game with it). It does run loud and hot, though; damn XFX and their small, noisy fans.
I was thinking of switching to dual 7970 Black Editions.

Psoto!!! (-1)

Anonymous Coward | about 2 years ago | (#40416823)

We are the slashdot trolls, my (just made, surprise butt fuck) friend.
And we keep on trollin' till the end.

We post goatse [goatse.ru] links.
Natalie portman and hot grits.

No time for lusers (Linux users),
Because FreeGayOS (FreeBSD) rules!

Re:Psoto!!! (0)

Anonymous Coward | about 2 years ago | (#40416861)

WHY DON't you GO fuck YOUSELF, Steve.

And of course the HD 7970Ghz edition ... (1)

King_TJ (85913) | about 2 years ago | (#40416903)

... will work just fine in my Apple Mac Pro! Oh wait.....

Seriously, this is the kind of boost Apple *should* have been after, since they're now stretching out the Mac Pro upgrade until some time in 2013. They could at least update OS X with an incremental release and start offering this card for the now two-year-old Mac Pro they gave a slight CPU speed bump and called "updated", so there'd be SOME sane reason for people to order one.

Re:And of course the HD 7970Ghz edition ... (1)

Greyfox (87712) | about 2 years ago | (#40417367)

That's one of the main reasons I've stopped buying Apple hardware. Having the choice between one crappy three-generation-old ATI video card and a much crappier four-generation-old Nvidia card on a $5000 desktop machine wasn't really much of a choice at all. One of the supposedly nice things about the platform was that you could find at least SOME games for it, kind of like the situation we were in back when Loki Games was still supplying games to the Linux community. Except the video cards they provided could barely get the pixels to that lovely big-ass flat panel; the video options on the Mac Pro desktop were like attaching a garden hose to a fire truck. And as near as I can tell, the card in my old MacBook Pro isn't supported in Linux by any driver, open source or otherwise. 2D acceleration works fine, at least, but 3D, not so much.

7970Ghz?! (0)

Anonymous Coward | about 2 years ago | (#40416935)

My GOD, I could run a Matrix in that!

Re:7970Ghz?! (0)

Anonymous Coward | about 2 years ago | (#40418239)

A MATROX MISTAEK!??!1

5% more shiniez? I *must* have one! (4, Insightful)

Leo Sasquatch (977162) | about 2 years ago | (#40417125)

OMG, does this mean I can now run Crysis at 80 fps instead of 75? F**k me sideways with a spastic badger, my life is now complete.

Why does anyone care that the two major card makers are still in their dick-waving war? Is it just to keep the review sites in business? Hey, look, another new top-of-the-range GFX card, not totally dissimilar to the one we reviewed last month, only we got it for free, and you'll have to part with some serious wedge if you want the same toys as the cool kids!

There have been no real, serious differences between any of the last dozen iterations of hardware. Anything made in the last couple of years should run any game on the market at full shiniez at a decent resolution. It won't, sadly, make the gameplay any better.

Re:5% more shiniez? I *must* have one! (1)

lightknight (213164) | about 2 years ago | (#40417273)

Because some of us want to see what it's like to play a game with maximum bling enabled?

My machine still chokes on Alice: Madness Returns, and that's with a 6970.

Re:5% more shiniez? I *must* have one! (1)

blackraven14250 (902843) | about 2 years ago | (#40417335)

Considering that a 680 outclasses the living hell out of a 280, to the point where a single 680 can turn graphics up higher than a pair of 280s in SLI, your blanket comparison (involving anything, including cards down at the 640 level, which is still significantly weaker than a single 280) seems unfounded.

Re:5% more shiniez? I *must* have one! (1)

SplashMyBandit (1543257) | about 2 years ago | (#40417589)

> Anything made in the last couple of years should run any game on the market at full shiniez at decent resolution.
Sigh. Clearly your video game diet is fairly bland. Try one of the DCS combat simulators (A-10C, Ka-50, P-51D), or even Armed Assault II, and you'll quickly notice the difference between a high-end video card and a run-of-the-mill one. Just because the 'mainstream' is built around twitch games fought on maps the size of postage stamps doesn't mean all games and simulations are like this. I guarantee that if you look beyond BF3, MW3, Halo, Gears of War, Diablo 3 and other such stuff, you will find games less widely known to Joe Sixpack with far more interesting aspects (e.g. buddy-lasing coordinated laser-guided-bomb drops) than run, gun, collect new hat/badge/unlock.

Re:5% more shiniez? I *must* have one! (1)

TornCityVenz (1123185) | about 2 years ago | (#40417815)

I agree, and $500? Wow. It is nice to run games at a decent frame rate; unfortunately for Radeon users, their driver support seems to be falling well behind their hardware. My Nvidia 560 Ti runs my favorite game, Tribes: Ascend, with full bling, while Radeon users with higher-end cards report all sorts of issues.

BTW, the new Tribes is free to play. Join me at https://account.hirezstudios.com/tribesascend/?referral=1207516&utm_campaign=email [hirezstudios.com] . For anyone who remembers Tribes 1 and Tribes 2, you know why you should give it a shot; for anyone who has never tried a Tribes game... well, you are missing out.

Re:5% more shiniez? I *must* have one! (1)

IorDMUX (870522) | about 2 years ago | (#40418801)

Why does anyone care that the two major card makers are still in their dick-waving war?

Because people buy the stuff they make (I dunno who, but their top-line stuff makes money somehow), and because, once they've replaced it with something newer and better, the price drops fast so the rest of us can build a nice, inexpensive gaming machine.

Re:5% more shiniez? I *must* have one! (1)

Jesus_666 (702802) | about 2 years ago | (#40419719)

Mostly, but not entirely. I currently run an old hand-me-down GeForce 8800 GTS, and while its processor is certainly powerful enough to calculate the scene at my native resolution, it has an entirely different problem that makes newer games (say, Far Cry 2) run horribly on higher settings: 320 megabytes of RAM. You can have all the power you want in your GPU core, but it amounts to nothing if anything but the lowest settings induce noticeable pumping, with the core spending most of its time moving data around because the working set just doesn't fit in memory.

If they sold an 8800 with one or even two gigs of RAM, I'm fairly certain I could play just about anything on high settings without any problems.
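(A back-of-envelope sketch, with made-up but plausible numbers, shows how fast 320 MB disappears: a triple-buffered 32-bit framebuffer plus a couple hundred mipmapped one-megapixel textures already blows well past it.)

```shell
# Illustrative arithmetic only; the texture count and sizes are assumptions.
fb=$(( 1920 * 1200 * 4 * 3 ))             # triple-buffered 32-bit framebuffer
tex=$(( 200 * 1024 * 1024 * 4 * 4 / 3 ))  # 200 mipmapped 1024x1024 RGBA textures
                                          # (the mip chain adds ~1/3 over base)
echo "$(( (fb + tex) / 1024 / 1024 )) MB" # roughly 1093 MB, well past 320 MB
```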

Hot and loud? (1)

xyourfacekillerx (939258) | about 2 years ago | (#40417447)

Why would the "reference board [run] hot and loud" ? AMD has the engineers to develop this stunning achievement, but not the engineers or the time or money to indicate a recommended cooling solution?

Re:Hot and loud? (1)

Rockoon (1252108) | about 2 years ago | (#40419633)

...because with the latest top-end GPUs, "proper" cooling is very expensive; even the mid-range cards are using multiple fans these days.

From what I'm reading, the original 7970 drew 40 amps (210 W TDP) at reference clocks and upwards of 100 amps when OC'd, and this new 7970 is basically OC'd...

Obligatory (0)

Anonymous Coward | about 2 years ago | (#40417923)

Yes, but does it run Linux?

I didn't think so.

Semi-Accurate comments on this (1)

steveha (103154) | about 2 years ago | (#40417973)

I enjoy reading the articles posted on SemiAccurate.com [semiaccurate.com] about AMD, nVidia, Intel, etc. Most of the articles are by two writers, and the most entertainingly acerbic ones are by Charlie Demerjian (I'll call him "CD").

Five months ago, CD thought nVidia was going to crush AMD on the high end:

http://semiaccurate.com/2012/01/19/nvidia-kepler-vs-amd-gcn-has-a-clear-winner/ [semiaccurate.com]

However, nVidia seemingly can't produce their high-end chips in any useful quantity. So, CD snipes at nVidia about that in his comments about the new Radeon HD 7970:

http://semiaccurate.com/2012/06/21/amd-launches-tahiti-2-aka-hd7970ghz-edition/ [semiaccurate.com]

Executive summary (aka TL;DR): AMD has production of high-end chips running smoothly and they are now able to produce, in quantity, chips that are reliable at higher clock rates. AMD is actually shipping a graphics card that performs better than what nVidia is actually shipping.

steveha

Please stop the "don't use ATI/AMD for Linux" FUD (1)

Anonymous Coward | about 2 years ago | (#40418091)

It's embarrassing to see comments about how you should never use an ATI-badged video card in a Linux box, only to go home and watch my creaky old 4550 not only run just fine, but play 1080p video and render 3D while driving two monitors. Let me rephrase that: YOU should be embarrassed to make comments about how you should never use ATI-badged video cards, yada yada yada...

Ditto for the "but I need proprietary drivers for Nvidia" crap. The nouveau drivers are getting just as good. Have a POS AGP FX5200? Notice how everyone in the room winced when I mentioned that model? But guess what: it runs great when not using Nvidia's crappy drivers.

Stop feeding AMD/ATI and Nvidia with "we want your drivers", and start to realize that the real solution is to pressure both of them to release the prior generation's hardware spec under NDA to the Xorg crew.
