
AMD's New Radeon HD 7950 Tested

Soulskill posted more than 2 years ago | from the showing-those-pixels-who's-boss dept.


MojoKid writes "When AMD announced the high-end Radeon HD 7970, a lower cost Radeon HD 7950 based on the same GPU was planned to arrive a few weeks later. The GPU, which is based on AMD's new architecture dubbed Graphics Core Next, is manufactured using TSMC's 28nm process and features a whopping 4.31 billion transistors. In its full configuration, found on the Radeon HD 7970, the Tahiti GPU sports 2,048 stream processors with 128 texture units and 32 ROPs. On the Radeon HD 7950, however, a few segments of the GPU have been disabled, resulting in a total of 1,792 active stream processors, with 112 texture units and 32 ROPs. The Radeon HD 7950 is also clocked somewhat lower at 800MHz, although AMD has claimed the cards are highly overclockable. Performance-wise, though the card isn't AMD's fastest, pricing is more palatable and the new card actually beats NVIDIA's high-end GeForce GTX 580 by just a hair."
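The figures in the summary allow a quick back-of-the-envelope throughput estimate. A minimal sketch, assuming the usual convention that each stream processor retires one fused multiply-add (two FLOPs) per cycle when quoting peak single-precision throughput:

```python
# Peak single-precision throughput from the figures quoted above:
# 1,792 active stream processors at an 800 MHz core clock.
# Assumes one fused multiply-add (2 FLOPs) per stream processor per cycle.
stream_processors = 1792
core_clock_hz = 800e6

peak_flops = 2 * stream_processors * core_clock_hz
print(f"{peak_flops / 1e12:.2f} TFLOPS")  # 2.87 TFLOPS
```

The fully enabled 7970, with 2,048 stream processors and a higher clock, lands proportionally higher by the same arithmetic.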


Faster video card, huh? (-1)

Anonymous Coward | more than 2 years ago | (#38885731)

Welcome to 2005. Yawn.

Re:Faster video card, huh? (0)

baka_toroi (1194359) | more than 2 years ago | (#38885819)

Oh, a new cellphone? Welcome to 1986. Yawn.

Re:Faster video card, huh? (1)

Ethanol-fueled (1125189) | more than 2 years ago | (#38885963)

That's me 40 years ago, but you wouldn't know it today, eh? I was in high school, I was a football star. All the girls wanted to dance with me. And I had a Diamond Viper. It was the fastest on the block.

Re:Faster video card, huh? (1)

Anonymous Coward | more than 2 years ago | (#38886221)

(not the GP) You wouldn't like it. Seriously. Speaking truthfully, my senior year in high school (25 years ago) I -was- a football star at a large Texas school, and drove an older Corvette with a big-block. I -also- was a grade-A, unadulterated, shallow, ass-blossom douchebag with absolutely no values worth speaking well of. It's an overrated experience that set me back years in growing up and becoming an actual person. If I had to do it over again and was given a choice, I'd rather lose an arm.

Re:Faster video card, huh? (4, Interesting)

hairyfeet (841228) | more than 2 years ago | (#38886979)

You should probably still be cheering, because that means the last-gen stuff will drop like crazy! Hell, the HD4850 I've got in here now retailed for $240 at release; know how much I paid for it a year and a half ago? $60. And frankly it still cranks out the purty on my 1600x900 monitor.

Of course that gets to the heart of the matter and why they are having to push 3D and GPGPU and Eyefinity: simply because games don't keep up anymore. With the exception of a few games I call "benchmark bait," like Crysis, frankly most games are console ports, and all that extra power is sitting there twiddling its thumbs.

So while I'm hoping this will mean I'll find a steal on a 5850 or 6850, just because they crank out less heat, honestly I doubt I really NEED it for any of the games I'm playing. What you'd actually use this card for, except winning benches and showing you have the biggest epeen, is beyond me. Is there even a game that would stress this bitch?

Re:Faster video card, huh? (3, Interesting)

SplashMyBandit (1543257) | more than 2 years ago | (#38887181)

Try a flight simulator like DCS:Black Shark 2 or DCS:A-10C. They will work out any video card pretty hard. So while you may play first person shooters with 300 meter horizons that don't stress your card out, when you get up in the air and have a 20 km horizon your card will be working its guts out.

Re:Faster video card, huh? (3, Informative)

Luckyo (1726890) | more than 2 years ago | (#38888455)

There are console first person shooters, and then there are PC first person shooters.

Try running BF3 on high/ultra in high resolution. My reasonably overclocked GTX 560Ti can just barely handle high in 1080p, ultra utterly murders it with clear jerkiness present in many situations. On the other hand, it eats MW3 for breakfast in pretty much any resolution/quality I could throw at it. You don't need to crank out a "20 km horizon" to overload a modern card.

And frankly, if a game makes your card render 20km of ground in level of detail that actually affects it, of which you will literally see only a few hundred meters, it's doing it wrong. Badly wrong.

Re:Faster video card, huh? (2)

hairyfeet (841228) | more than 2 years ago | (#38888489)

You want distance, pal? Try Just Cause 2. You can climb on top of a mountain and... wow, the view is just stunning, with the snow whipping, and you can then jump and free-fall alllllll the way through the different climate zones, all the way down to the jungle floor. I actually tied a bike to the back of a 707, let the 707 pull me up to about 25,000 feet, then cut the line and did the wickedest free-fall bike stunt you'd ever seen... played just fine on my HD4850, BTW.

Now, as for the poster thinking the next consoles will be 7000 series? Sorry to burst your bubble, but from reports they've already been in development for over a year, more likely a year and a half, so you are looking at a mid-5xxx-series card MAX. See, that is the problem: the turnaround takes so long, because these things are so complex now, that there are probably 4 generations of GPU out by the time the thing hits the market. The ONLY reason it will look better initially is heavier optimization; with only one CPU and GPU to write for, they optimize the hell out of it, whereas I've noticed PC games tend to be of the "Meh, we'll throw more cycles at it" mindset. Frankly, if they did half the optimization on the PC that they did on the consoles, we'd have smoked the X360 a year after it came out.

In the end I can't complain TOO much about the consoles; after all, they have helped usher in what I call a "golden age" of PC gaming. Why is that? Simple: because I can build a $550 PC for a customer that hooks right into his 1080p set and gives him sweet graphics, and he'll be able to play for fricking years. Later on he can slap in a $100 GPU upgrade and get even more years out of that same system. When I first started PC gaming, you were damned lucky (and had probably bought state of the art) if you got a year and a half out of the system; then you had to chuck it and start all over. The PC I'm typing this on cost $800 if you count the upgrades, but the actual cost would be more like $600, since I was able to use the quad and motherboard in a new Xmas PC for my GF. This PC has 6 cores, 8GB of RAM, a 3TB HDD, and an HD4850. When the price drops I can just swap that HD4850 for a new 5xxx or 6xxx and get another 2 or 3 years out of that GPU, and frankly I'd be amazed if I need to build another PC for the rest of the decade, as I don't see games hitting 6 cores for several years yet.

So I say enjoy it. My kids have been gaming on HD4850s and Pentium Ds, and only now am I needing to upgrade their systems; I'll slap in an AMD quad and 4GB of RAM and they'll get another 2 years out of those HD4850s I paid a whole $60 for, just you watch. In the old days you were lucky if a card even lasted a year before there were games that wouldn't run; now a $100 card can play damned near everything without chugging. Good times, friends, good times.

Re:Faster video card, huh? (0)

Anonymous Coward | more than 2 years ago | (#38889359)

You mean the horrible, disgusting view, because they locked off a certain view distance and applied a nauseating blur, despite the fact my machine should decimate the graphics.

All in the name of consoles... the 360 was crap before it came out.

I'm tired of having my experience dumbed down and made uglier than the previous generation of games, all so they can pander to idiots like you. Not just equal to the previous two generations, but actually worse. They are taking away options from the graphics screens and giving PCs the console texture packs.

Re:Faster video card, huh? (1)

Sir_Sri (199544) | more than 2 years ago | (#38887255)

That's what happens, though. Expect that the Xbox 3 and PS4 will have something on par with a 7000-series Radeon or a 600-series (not yet commercial) NVIDIA card, at which point, to keep up with a console, you'll need something new. (I don't have any insider information here, but that would be consistent with the projected timelines and everything that has happened in the past.)

That's how the market has worked for a long time. The consoles come in and reach performance parity with PCs by being sold at a loss for a while; then the price of parts drops, and the PC moves on technology-wise, but only a few titles that are really invested in the PC take advantage of it. Eventually enough stuff comes out on the PC that makes the tech in consoles show its limitations (and this is where GPGPU comes in handy; for things like cloth and smoke, it's much easier to do GPGPU than to do Cell work), and then the console guys have to play catch-up for a while.

Off the top of my head, nothing mainstream jumps out at me that you just could not do on a console right now (performance-wise). Some of the more complex strategy games, and obviously the controls for MMOs, that kind of thing; but I could pretty easily spit out a demo on a 7000-series GPU that just plain could not be done on consoles. AMD actually has a professional-quality demo, called "leo.mov" (which I seem to have lost the link for), that's about 400 MB and shows off stuff you can't do on a console, but it's more of a marketing tool for the 7000 series than a great tech demo. The problem is, that wouldn't work on just about anything else on the market either.

Steam's hardware survey is illustrative of the market. About 48% of gaming PCs have dual cores, another 43% quad cores, and then 'other' (the 6- and 8-core machines are probably largely developers, or people with rigs built for something else that happen to also be used for gaming, and the 5% of the market on single core are likely laptops or people with old and broken PCs). So we're a long way off from the 'mainstream' PC market being way better than a console. Which is why all you get are console ports.

Well, that, and AMD, NVIDIA, and Intel have been working on more cores, not radically new tech, lately. Sure, you could, for not a lot of money, easily get a card 4x the performance of your 4850 (a decent 6000 series) or better, but there's only so much visual fidelity to be added when you're driving your little display versus a 2560x1600 (or triple-monitor or the like) setup.

Stuff that might NOT run on current consoles (1)

Lonewolf666 (259450) | more than 2 years ago | (#38890443)

Take any shooter or MMO with really large maps and corresponding memory requirements.

For instance, "All Points Bulletin" comes to mind. After a few minutes, it always brought my PC (AMD dual core, 2GByte RAM, NVidia 8600 GT) to its knees due to requiring 2GByte of memory or more for itself.
CPU and GPU seemed to have no problem, as the game ran fine until the memory limitation kicked in. So I guess the CPU and GPU in current-gen consoles might be able to handle the load as well. But memory-wise, they would run into problems even sooner.

Re:Stuff that might NOT run on current consoles (2)

Sir_Sri (199544) | more than 2 years ago | (#38892085)

Memory on consoles is a completely different beast than on PC. On a console you know exactly how quickly you can pull data in from the optical drive, and have a good idea about the hard drive. On the PC you figure most people have a couple of gigs of RAM, so you may as well use it, and you have no control over what else is using those resources on the system, so you're better off using RAM than relying on disk access. You also have very different memory-space requirements with the GPU (you might be mirroring your data between GPU and CPU memory, that sort of thing).

And yes, memory can matter a lot; that's usually textures. I always figured the easiest thing Sony could have done to make the PS3 better was to swap the notebook hard drive for a desktop drive, and use the cost difference to put in 1 gig of memory rather than 512 MB, which would have given developers something they could point to as much easier to manage than the Xbox.

But when you're actually developing on a console, you absolutely have to keep track of everything going into memory, and you are damn sure why it's there; but you are also 100% sure what every PS3 or Xbox will have for memory. On PC you have so much memory, and you might have been a nub and not set the right compiler options, etc., but you just use the memory because it's there.
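That fixed-budget discipline can be sketched as a trivial tracker; the total and the asset sizes below are invented for illustration, not real console figures:

```python
# Toy console-style memory budget tracker: on a console the total is fixed
# and known up front (512 MB here, an invented illustration), so every
# allocation is accounted for explicitly instead of leaning on virtual memory.
class MemoryBudget:
    def __init__(self, total_bytes: int):
        self.total = total_bytes
        self.used = 0

    def alloc(self, tag: str, size: int) -> None:
        """Reserve `size` bytes for `tag`, failing loudly if over budget."""
        if self.used + size > self.total:
            raise MemoryError(f"over budget loading {tag!r}")
        self.used += size

budget = MemoryBudget(512 * 1024 * 1024)
budget.alloc("level geometry", 96 * 1024 * 1024)
budget.alloc("textures", 256 * 1024 * 1024)
print(budget.total - budget.used)  # bytes left for audio, AI, etc.
```

On PC the equivalent code usually just allocates and lets the OS page; that is the difference the comment is describing.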

Re:Faster video card, huh? (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#38887289)

With the plummeting prices on monitors(at least for those of us blessed enough to have undemanding taste for the finer details of color reproduction and perfectly uniform luminosity, I can't speak for the poor fellows who have to buy the good stuff), I am appreciating the increasing number of video outputs that some of AMD's newer cards offer. When you can get a 1920 x 1080 panel in the 21ish inch range for ~$120, more video outputs means more sweet, sweet, screen area without the hassle of accommodating multiple video cards or the substantial expense and lousy performance of specialty Matrox gear.

My graphical demands go about as far as Oblivion(mid 2006, nothing exciting) at 1920x1080, draw distances maxed; but the fact that I can buy enough monitors to cover central and peripheral vision, and a card to drive them, for what a single top-of-range card would cost, makes day to day computing very much more pleasant(on the minus side, it has utterly spoiled me for mobile computing, as I've become used to having a massive work area; but so it goes...)

Re:Faster video card, huh? (0)

Anonymous Coward | more than 2 years ago | (#38892061)

When the 5xxx series came out, I thought about getting the 4xxx series at a discount. But looking at the IDLE POWER, the price difference (over a couple of years' expected life) made the 5xxx series cheaper.

Having said that, the 5xxx series is when they started tackling the idle power issues a bit better.

How is it at mining BitCoins? (5, Funny)

Anonymous Coward | more than 2 years ago | (#38885751)

What's the calculations-per-watt figure? Will I be able to put them in a CrossFire frankenbox to make my fortune?
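Tongue-in-cheek or not, the efficiency question is just a ratio. A minimal sketch with made-up placeholder numbers (neither the hash rates nor the wattages below are measured figures for any real card):

```python
# Mining efficiency is hash rate divided by board power.
# All numbers here are invented placeholders, not benchmarks.
def mhash_per_watt(mhash_per_s: float, watts: float) -> float:
    return mhash_per_s / watts

card_a = mhash_per_watt(400.0, 200.0)  # hypothetical card A: 2.0 MH/s per W
card_b = mhash_per_watt(300.0, 120.0)  # hypothetical card B: 2.5 MH/s per W

# the card with the lower raw hash rate can still be the better miner per watt
print(card_b > card_a)  # True
```

Which is why miners comparing cards care about this ratio, not peak throughput alone.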

Re:How is it at mining BitCoins? (0)

Anonymous Coward | more than 2 years ago | (#38885841)

If the 7970 is any indication, it's actually slower than the high-end 5000 series.

Re:How is it at mining BitCoins? (0)

Baloroth (2370816) | more than 2 years ago | (#38885961)

IDK what you're smoking, because you are just plain wrong. The benchmarks show the 7950 as faster than the 6970, the fastest last-gen AMD card (except for the dual-GPU monstrosity that is the 6990, of course). Unless you actually meant the dual-GPU card, of course, which is not in any way shape or form a fair comparison.

Re:How is it at mining BitCoins? (1)

mattventura (1408229) | more than 2 years ago | (#38886009)

The 5000 series was better in some regards than the 6000 series, so it is entirely possible that it is worse than the 5000 series yet better than the 6000 series.

Re:How is it at mining BitCoins? (1)

Baloroth (2370816) | more than 2 years ago | (#38886783)

The 6000 series mostly was the 5000 series. The high end may be a bit different, but the upper-midrange (6770, 6870 stuff) was literally the same chips with some minor stuff tacked on. 3D and some more advanced video support mainly, IIRC.

Re:How is it at mining BitCoins? (1)

BlackSnake112 (912158) | more than 2 years ago | (#38892709)

More RAM, too. There were not too many 2GB models in the 5000 series; there are a bunch of 2GB or 3GB models in the 6000 series. NVIDIA did the same thing with the 8000 and 9000 series: the 9000 series was the 8000 series with a few more features. I have an 8800GT 512MB card and a 9800GT 1GB card. The 9800GT does not take any extra power, while the 8800GT needed a 6-pin power plug to function correctly. The difference? Nothing that I can see. The 8800GT is actually 'faster' according to tests, but with the larger amount of memory, the 9800GT plays games the same or better. Since the games I play have huge maps, the larger memory helps a lot. I am looking at a 6950 2GB model; the price has yet to drop with the release of the 7000 series, so I am just waiting.

Re:How is it at mining BitCoins? (3, Insightful)

Squiddie (1942230) | more than 2 years ago | (#38886037)

He meant at mining bitcoins, you dolt. These new cards just don't perform well in that area.

So when will there be affordable cards (2)

afidel (530433) | more than 2 years ago | (#38885773)

So when will there be cards affordable by normal people? Also for me the biggest thing to come out of the new design is that we should be able to get a passively cooled card with more performance than the HD5750.

Re:So when will there be affordable cards (4, Informative)

Xanny (2500844) | more than 2 years ago | (#38885865)

When Kepler comes out, expect all these cards to drop significantly in price.

GCN was a huge cost on AMD's part, and Kepler will be a refinement of Fermi, so NVIDIA will aggressively price the 600 series (especially since they won't launch for another 2 months) and make a profit on them. And expect AMD to take a loss on the investment, but not on the returns from fabrication of the 7900 series (assuming they fab the 7800 and lower cards on their old VLIW architecture, like the roadmap from last year said they would).

So when Kepler comes out, it will probably be aggressively priced, and AMD will drop prices to match. For now they are the only maker of "next-gen" GPUs after 2010's 500 and 6000 series, and Kepler is 2 months away, so AMD is milking it.

Re:So when will there be affordable cards (2)

Kjella (173770) | more than 2 years ago | (#38886067)

According to MSI via Fudzilla, the 77xx series will launch in two weeks at $139/$149, and the 78xx series in March at $249/$299. After that the ball is in NVIDIA's court, but the current guess is they're not ready until April, sometime around Intel's Ivy Bridge processors. I think it's working: I've looked at the 7950s and am tempted, but will probably wait until then and see if they bring better performance or lower prices, if nothing else to get a better price from AMD. Currently the 7950 costs about double (in NOK, at least) what I paid for my 5850 in 2009, which is rather disappointing; then again, that was a steal I got before the MSRP hike.

Re:So when will there be affordable cards (1)

afidel (530433) | more than 2 years ago | (#38886223)

Bah, the 7700 series is only going to have ~10% more memory bandwidth than the 30 month old HD5700 series, this is supposed to be progress?

That aside I'll be looking for benchmarks since it might have a bit more DX11 oomph in the same ~85W max TDP envelope.

Re:So when will there be affordable cards (1)

nzac (1822298) | more than 2 years ago | (#38887871)

I thought they added 100 to the version numbers for the 6000 series.
You should compare to the HD5600s.

Re:So when will there be affordable cards (1)

Lonewolf666 (259450) | more than 2 years ago | (#38890109)

The 7700 series will definitely be interesting if you want to build a quiet computer that can still handle most games (albeit not at the highest graphics settings).
My latest PC upgrade a few months ago used a 6770, and so far it has handled everything I've thrown at it.

Re:So when will there be affordable cards (1)

ericartman (955413) | more than 2 years ago | (#38890763)

Quiet computer? Well, I bought a 4870, which just burned up, thank god. I got tired of leaving my computer on all night because I couldn't go through boot-up without it sounding like a jet plane taking off and waking the whole house. At least I could control the fan in Windows for gaming, but in Linux it just sat there at 5k RPM. Well, at my old age I went back to school and gave up gaming; no time or money for a video card that costs as much as a console. BTW, playing even WoW or Rift at ultimate settings caused the card to run at 100°C, which scared me and probably explains the burned-up board. So it's win-win: I get to play with Linux again, and I have my day to myself. No more ATI for me; that's the second expensive card I've bought from them, and neither worked out so well.

Re:So when will there be affordable cards (0)

Anonymous Coward | more than 2 years ago | (#38891729)

AMD started getting really aggressive about power consumption/temps and noise after the 4xxx series. The 7970 cards have some of the lowest "idle" power draw of any discrete graphics card on the market these days, even compared to low-end, low-power cards from previous generations.

Re:So when will there be affordable cards (1)

Mashiki (184564) | more than 2 years ago | (#38886091)

So when will there be cards affordable by normal people?

Well, the sweet spot is usually about 8-9 months after the release of a new card. That gets all the major bugs out of the manufacturing and all the driver issues hammered out. And by then the prices have pretty much bottomed out, too.

Re:So when will there be affordable cards (1)

Squiddie (1942230) | more than 2 years ago | (#38886117)

Yes, but at that point these cards will be old and busted and the new hotness will have been announced. It does backfire sometimes, though, you know Fermi and all.

Re:So when will there be affordable cards (1)

Mashiki (184564) | more than 2 years ago | (#38887191)

Wait. New hardware is old and busted? Okay. I mean, it's not like the new stuff, based on the old stuff, doesn't support current-generation tech or anything. Like it did last time.

Re:So when will there be affordable cards (1)

Squiddie (1942230) | more than 2 years ago | (#38887845)

I was mostly making a joke, but it is true that eight months from now some will start to wait for next gen rather than buy current gen at a good price. The way I do it is to just buy whenever I think I need it. That way there is no remorse.

Re:So when will there be affordable cards (1)

dadioflex (854298) | more than 2 years ago | (#38888097)

People waiting for the latest next gen card are lucky. They never have to buy anything. Just wait.

Re:So when will there be affordable cards (0)

Anonymous Coward | more than 2 years ago | (#38886105)

we should be able to get a passively cooled card with more performance than the HD5750

But, but, the number is so cool that I'm gonna come!

HD!!! And 75O!!!!!!!!!!!! Fuckin' eh !!!!!!!!!!!!!!!!!!

ATI? I'd rather give up computers.

Re:So when will there be affordable cards (0)

Anonymous Coward | more than 2 years ago | (#38886213)

Wait till the entry-level 7570, 7650, and 7670 come out. Or if you don't want to wait, get one of the lower-end cards from the 6000 series. There is an entire range of cards affordable by normal people. If you want to game on a three-monitor setup at max resolution with FXAA and want 60 fps, then you are not normal and shouldn't expect affordable cards.

I consider myself normal, but I want to run all games at 1080p on a single monitor with maxed-out settings and great FPS. So I went ahead and ordered the 7970. Was it overkill? Most probably. Would a 7950 have sufficed? Yes. Is the 7970 more future-proof than the 7950?

Re:So when will there be affordable cards (1)

afidel (530433) | more than 2 years ago | (#38886289)

No, I want to game at 1080p without needing a fan and for less than $150. Basically I'm hoping for a 28nm version of the HD5750 where the process shrink can gain me a bit more DX11 performance.

Re:So when will there be affordable cards (1)

Sir_Sri (199544) | more than 2 years ago | (#38887339)

Game what at 1080p? 4-year-old games? 1080p is a resolution (1920x1080) and a refresh mode; it says nothing about the quality of the image being drawn, it just clearly defines the size and refresh of the image. You could run the original X-COM at 1080p on a $25 passively cooled card, but that's not what you mean, is it?
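The point that resolution alone says little about load can be made concrete. A rough sketch using only the figures from the article summary (32 ROPs at an 800 MHz clock), and assuming the usual convention of one pixel written per ROP per cycle for the theoretical fill rate:

```python
# Raw pixel throughput of 1080p at 60 Hz, versus the 7950's theoretical
# fill rate (32 ROPs x 800 MHz, assuming one pixel per ROP per cycle).
# Pushing pixels is cheap; what varies wildly per game is shading cost.
pixels_per_frame = 1920 * 1080             # 2,073,600
pixels_per_second = pixels_per_frame * 60  # ~124 million
fill_rate = 32 * 800e6                     # 25.6 Gpixels/s

print(round(fill_rate / pixels_per_second))  # 206x headroom on raw fill
```

That 200x gap is why an old game at 1080p is trivial while a modern title at the same resolution, drawing each pixel many times with expensive shaders, can bring the same card to its knees.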

If you want to play Arkham City with something close to max settings at 1080p, you need a better card than the 5750. If you're willing to take crappy settings, then it *might* be possible for a passively cooled 7000-series card to do that, barely. But don't expect a passively cooled card anytime soon.

The way this works is that every chip starts as a 7970, and then the not-perfect ones become 7950s, the OK ones 7800s, the crappy ones 7500s, and the horrible ones 7200s (passively cooled), and they won't ship those in volume any time soon; even then, you're not really looking at the right product. Passively cooled cards are for really, really basic machines, and they are only sold to avoid junking those parts; they aren't intended for anything 3D, let alone gaming.

Re:So when will there be affordable cards (1)

Luckyo (1726890) | more than 2 years ago | (#38888477)

Not very likely to happen. Most modern games put a lot of stress on the GPU, which means that you either forgo quality or fps, or you install a proper active cooling solution.

The market for functional "silent" solutions is generally an expensive one, as it uses either expensive fans with high-end bearings and bigger blades (allowing slower rotation speeds for the same airflow) or, at the high end, liquid cooling. You're not going to enter it with a sub-$150 card with passive cooling; such cards are notorious both for atrocious performance and for actually crashing under heavy load due to overheating (as they are usually put into equally "silent" cases with no proper airflow).

Re:So when will there be affordable cards (1)

bemymonkey (1244086) | more than 2 years ago | (#38888075)

Well, the 6870/6850 was pretty much the bang-for-your-buck card in the last gen, with the 6770/6670/6570 being really affordable for most any aspiring gamer - so I'd assume you'll need to wait for a 7870/7770/7670... shouldn't be all too long now. I'm waiting for the 7770 (or the 7 series equivalent of the 6770) myself - should be a nice reduction in power consumption and noise, coming from an 8800GT.

Re:So when will there be affordable cards (1)

dunkelfalke (91624) | more than 2 years ago | (#38888777)

You could have had one for a while now: the Gigabyte HD5770 Silent Cell.

Re:So when will there be affordable cards (1)

afidel (530433) | more than 2 years ago | (#38888899)

Yeah, the 5770 is what, a whole 100MHz faster on the core clock, and maybe a smidge more on the memory clock, than my 5750? Certainly not worth spending another $150 to "upgrade".

Re:So when will there be affordable cards (1)

dunkelfalke (91624) | more than 2 years ago | (#38889175)

I was just nitpicking.

Disabled? (4, Interesting)

schitso (2541028) | more than 2 years ago | (#38885785)

"a few segments of the GPU have been disabled"

As in, can be re-enabled with a custom BIOS or something?

Re:Disabled? (0)

Anonymous Coward | more than 2 years ago | (#38886053)

Same question here.
I noticed there is a switch on the reference design, so we can hope an "unlocked" BIOS will be released to re-enable those inactive segments.

Re:Disabled? (1)

Moheeheeko (1682914) | more than 2 years ago | (#38886079)

Yes. I bought a first-run 6950, and with a BIOS flash I now have a 6970 for $80 less.

Re:Disabled? (1)

Squiddie (1942230) | more than 2 years ago | (#38886137)

I sure hope so. I have a 6950 that I flashed. I wanted to get another, but I could not find any more. Still, I might just get the normal 7970, because it overclocks so well. Still waiting on NVIDIA so that we can get some price drops, though.

Re:Disabled? (0)

Anonymous Coward | more than 2 years ago | (#38887295)

No. AMD has stated that the extra compute units (shader units in the case of the 6950->6970 unlock) have been laser-cut.

Additionally, as there are some indications that the 7970 itself has extra, cut-off compute units, it's likely that the cut parts don't work properly this time, and were disabled like buggy processor cores are, in order to improve yields. BIOS flashing thus far has only seemed to give improvements from better memory timings (and clock speeds, obviously).

Re:Disabled? (1)

serviscope_minor (664417) | more than 2 years ago | (#38889065)

As in, can be re-enabled with a custom BIOS or something?

Probably. Though since the cards have a very uniform architecture, with many repeats of the same thing, my guess is that they bin the chips according to the number of stream processors that are defective. This lets them fab nothing but top-end cards and still get good yields of slightly-off-the-top-end cards.
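That binning strategy can be sketched numerically. A toy model, assuming a die of 32 clusters of 64 stream processors (which matches the article's 2,048/1,792 split, since 2,048 − 1,792 = 4 × 64) and an invented per-cluster defect probability:

```python
import random

# Toy yield-binning model. The defect probability is an invented
# illustration, not a real fab figure. 32 clusters x 64 SPs = 2048
# (full 7970 config); disabling up to 4 defective clusters still
# leaves >= 1792 working SPs (the 7950 config).
CLUSTERS = 32
P_DEFECT = 0.02

def bin_die(rng: random.Random) -> str:
    defective = sum(rng.random() < P_DEFECT for _ in range(CLUSTERS))
    if defective == 0:
        return "7970"   # fully working die
    if defective <= 4:
        return "7950"   # salvageable by disabling clusters
    return "scrap"

rng = random.Random(0)
dies = [bin_die(rng) for _ in range(10_000)]
counts = {name: dies.count(name) for name in ("7970", "7950", "scrap")}
print(counts)  # roughly half bin as 7970, nearly all the rest as 7950
```

Under these assumed numbers almost nothing is scrapped, which is the economic point of selling a cut-down part instead of discarding imperfect dies.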

GPU manufacturers certainly used to disable non-"pro"-level features in cheaper cards (which could be re-enabled by various hacks), though the cards usually weren't quite as fast (though 90% of the speed for half the price was a great deal).

Also, mods: WTF? Why is this post marked redundant?

Re:Disabled? (1)

schitso (2541028) | more than 2 years ago | (#38890641)

Also, mods: WTF? Why is this post marked redundant?

I was curious about that myself.

But... (3, Informative)

goldaryn (834427) | more than 2 years ago | (#38885827)

But does it run Linux?

No, seriously... the last time I tried to install Ubuntu with an ATI card (a few months ago), I couldn't get dual monitors to work correctly.

The restricted drivers exist, but are unstable, awkward, and painful. Linux and NVIDIA are a bit better, in my experience.

Re:But... (1)

Anonymous Coward | more than 2 years ago | (#38885935)

I've been running a dual-monitor setup since Ubuntu 8.04 with a radeon 4850 + proprietary drivers without any troubles.

Re:But... (2)

O('_')O_Bush (1162487) | more than 2 years ago | (#38885969)

Really? Because getting the nv drivers to work correctly with a 1440x900 monitor was like pulling teeth, which is why I abandoned my brand of choice for an ATI card this latest go 'round.

Re:But... (3, Informative)

Anonymous Coward | more than 2 years ago | (#38886609)

That's why the driver is called 'nvidia', not 'nv'. 'nv' is the incomplete OSS driver; 'nvidia' is the driver supported by NVIDIA. At its core, it's the same driver as on Windows.

Re:But... (0)

Anonymous Coward | more than 2 years ago | (#38887507)

Beat me to the post. The NVIDIA blob has been a beautiful, painless experience for me for ten years: full 3D acceleration on every single card I've run, every time, and VDPAU playback on anything that supported it. ATI drivers blew long before AMD snarfed them down, and I had held my breath waiting for AMD to fix that problem. I exhaled long ago. It's unfortunate, because there are a couple of E350/450-6310/20 boards with the kind of specs I could use, but I'll be damned if I ever waste another hour of my time trying to make ATI's shit work. They should just toss cash at the open-source community to maintain their drivers/modules.
No, the VESA driver doesn't cut it, in this case.

Re:But... (1)

hairyfeet (841228) | more than 2 years ago | (#38888599)

Hate to break the news to ya, but the E350 has been supported OOTB by Ubuntu since 10.04, so I'm sure the others are up and running as well. You seem to forget AMD actually hired coders for the free FOSS drivers, as well as handed out the specs; go look that chip up on Phoronix, they say it works just fine.

Re:But... (1)

JonJ (907502) | more than 2 years ago | (#38888773)

However, the open source drivers give generally poorer performance because there are some bits and pieces AMD/ATi won't tell you, because it's "not their technology".

Re:But... (3, Interesting)

hairyfeet (841228) | more than 2 years ago | (#38890319)

It has nothing to do with AMD, and frankly you will NEVER get those bits, because it would be illegal to give them to you. AMD has already said there is nothing they can do about HDCP and protected path, as that technology is owned by the HDMI consortium, and giving out that information would break the DMCA as well as get every AMD card blacklisted. If you want those bits you can use the blob, which again Phoronix ran full tests on [phoronix.com] and found runs just fine on Ubuntu 10.04 (Ubuntu 11 runs OOTB); it also smokes Atom + ION on their benches. For a board that costs just $142 for the barebone kit [newegg.com], complete with PSU and case, that makes it a hell of a cheap Linux box, especially when you figure in the fact you are getting dual core plus Radeon plus the ability to run 8GB of RAM.

But FOSS users are simply gonna have to accept the fact that unless you wanna do like RMS and hop on Chinamart for some funky-ass Loongson MIPS netbook, there are NO machines that you are gonna have complete access to, because if it has even slightly modern video output it'll have protected path, and if it has wireless it'll have non-FOSS firmware. Hell, even the Raspberry Pi has Broadcom binary blobs; welcome to reality. In the end what should matter is "does it work," and as Phoronix shows, yes it does, and it beats Atom + ION while having better graphics and often a lower price. Seems like a win/win to me, but if you really have your heart set on Nvidia, they have a PCIe slot, and there is an open-box GT210 on Newegg for less than $20; knock yourself out. Even with the discrete card it'll still be cheaper than an Atom + ION board.

HDCP and protected path is only one aspect (1)

Lonewolf666 (259450) | more than 2 years ago | (#38890675)

So far, 3D acceleration is also significantly slower than in the closed source Catalyst driver. Some of that technology may also be owned by 3rd parties, but it is not as clear-cut as in the case of HDCP.

I suspect AMD's reasons for not releasing that stuff are part legal and part not wanting to give away the latest know-how.
But the latter seems a bit silly, as NVidia drivers already have the better reputation and probably the better code. AMD's advantage seems to be on the hardware side, with their chips cranking out more (theoretical) GFlops/watt.

Re:HDCP and protected path is only one aspect (1)

hairyfeet (841228) | more than 2 years ago | (#38892665)

But it's not like you have an either/or choice here, friend: if you want Nvidia graphics you can slap in a $17 Nvidia discrete card and it'll still stomp Atom + ION, because the Zacate chip is simply the better CPU. But considering how fast the open team is catching up, and the fact that AMD is paying extra coders to help them, I'd say the safe bet is to use the closed driver now and the open one in a year, maybe less. After all, with each release they are closing the gap, and now that even Nvidia is going OpenCL, which AMD fully supports, pretty much everything will end up being OpenCL accelerated.

I can tell you that as someone who owns an E350 EEE, I wouldn't have any problem with putting the E350 at the center of a low-cost PC, be it Linux or Windows. It's quiet, low power, has great performance, and if Asus can make it fly like an eagle under Expressgate, which is a custom Android-like Linux, frankly I don't see why the distro guys would have any worse time of it. Frankly I don't know why the community isn't pushing Expressgate/Splashtop, because it's bloody brilliant: 6-second cold starts, a really nice slick task-based UI, an app store; it's just a really nice experience. Oh, and on mobile it'll give me an extra hour and a half on the battery. What's not to like?

But considering you can often find E350 boards starting at $80, I'd say it's a no-brainer for a low-cost Linux box: dirt cheap to buy, crazy cheap on power, nice performance, quiet as a church mouse. Hell, you can even go full fanless, and with an SSD have a completely silent PC that'll do any basic office or web task you can dream up. I've actually used a couple of E350 boards as cheap upgrades for a local office building and they just love the things; you can't even hear the machine and it doesn't put out any heat. It's just a really nice little unit and you can't beat a dual core that cheap.

Re:But... (2)

chromas (1085949) | more than 2 years ago | (#38885985)

In my experience (OpenSuse, though, not Ubuntu), install first, add extra monitors later, especially if they run at different resolutions. If you use the official/proprietary drivers, be sure the open drivers are completely removed from your system or you'll have a conflict.
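If it helps anyone, here's a concrete sketch of the cleanup the parent describes, as I'd do it on an Ubuntu-era system (the file path, module name, and package names below are my assumptions for the open radeon driver; adjust for your distro):

```
# /etc/modprobe.d/blacklist-radeon.conf
# Keep the open 'radeon' kernel module from loading at boot so it
# can't grab the card before the proprietary fglrx driver does.
blacklist radeon
```

After dropping that file in, also uninstall the open X driver package (xserver-xorg-video-radeon on Debian/Ubuntu, xf86-video-ati on most other distros) and rebuild the initramfs so the blacklist takes effect on the next boot.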

Re:But... what? (0)

Anonymous Coward | more than 2 years ago | (#38886007)

I have a low-end AMD card running 3 monitors right in front of me (Radeon HD 3450) . Must've been user error. ;)

I kid I kid... =P It was actually a huge PITA to set up 3 monitors, but before that I had been running two monitors very nicely for a very long time; a dual-monitor setup is incredibly easy, in my opinion.

This is the card I used for the 3 monitors: http://www.jaton.com/VGA/graphics_card_detail.php?pid=138

Re:But... (1)

Kjella (173770) | more than 2 years ago | (#38886015)

The Catalyst drivers just landed on the 27th of January, I think; before that there was a hotfix release for real enthusiasts. Open source support is, as far as I know, still missing, but basic support should not be far away. They've consistently come closer to release date with each release; last time it took 2.5 months and I expect less this time. If you want it the moment it's released, expect to compile your own kernel/xorg/driver though. Don't expect any miracles from the OSS drivers either; as I understand it there's some major rework going on because of the architecture changes.

Re:But... (0)

Anonymous Coward | more than 2 years ago | (#38886025)

I added "Virtual" in "Display" subsection to my xorg.conf, and used xrandr.
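For anyone wanting to try the same, a minimal sketch of what the parent describes (the identifier, output names, and geometry are illustrative; size Virtual to hold both monitors side by side):

```
Section "Screen"
    Identifier "Screen0"
    SubSection "Display"
        # Large enough for two 1680x1050 panels side by side
        Virtual 3360 1050
    EndSubSection
EndSection
```

Then, after restarting X, something like `xrandr --output VGA-0 --auto --right-of DVI-0` positions the second head.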

Re:But... (1)

secretsquirel (805445) | more than 2 years ago | (#38886233)

like duh

<3 linux

Re:But... (0)

Anonymous Coward | more than 2 years ago | (#38886573)

I run 3 monitors on a HD5450 with OSS drivers. It works, but the card's buffer is too small to run them all at full resolution; it is a hardware limitation I should have checked before buying. The software runs fine (but of course, this card is crap, so I can't get games to run on it, which is why I have an NVIDIA 8800GT as a backup GPU to crunch the 3D stuff).

Re:But... (0)

Anonymous Coward | more than 2 years ago | (#38886759)

I haven't had any trouble on Ubuntu with my HD6850 and two displays, running the latest version of Catalyst. I had some initial issues with lingering parts of the open, built-in drivers conflicting with those of ATI, but it took less than half an hour to resolve everything.

Re:But... (1)

PixetaledPikachu (1007305) | more than 2 years ago | (#38886837)

But does it run Linux? No, seriously... last time I tried to install Ubuntu with an ATI card (a few months ago), I couldn't get dual monitors to work correctly. The restricted drivers exist, but are unstable, awkward and painful. Linux and Nvidia - a bit better in my experience..

I have been doing dual monitor with ATI/AMD X300 (Benq Joybook 5200G), HD3470 (Toshiba Satellite M300), and HD5650 (Sony Vaio VPCEA36FG). The only time dual monitor failed me was when I was using Ubuntu 8.10. Currently I'm using 10.10, with a Samsung 43" LCD TV as secondary monitor via HDMI. Mirror and splitscreen both work.

Re:But... (0)

Anonymous Coward | more than 2 years ago | (#38887189)

I've got a 5570, and performance is noticeably lower under linux. I've tried open and closed source drivers. I find the closed to be slightly higher performing, but have frequent artifacting when pushed.

Re:But... (1)

Osgeld (1900440) | more than 2 years ago | (#38887501)

Well, I have an older ATI card in a Linux box in the other room; if I load the proprietary drivers, as soon as X loads the screen shuts off. So I agree, ATI+Linux = worthless.

always has been, probably always will be

Re:But... (1)

webheaded (997188) | more than 2 years ago | (#38890669)

Actually the OSS ATI drivers aren't too bad on Linux. I hadn't really messed with any of that stuff before in KDE so when I did my new Arch install, I was surprised by how easy it was to configure all that. I was kind of irritated that hitting apply didn't save my settings and it took me quite some time to figure out there was a separate "save" button somewhere in the display dialogs, but other than that...it's not bad. The only thing that's kind of annoying is the power control. You have to manually set that and the fan up because whatever it has built in is worthless.

Is the price really that horrible? (5, Interesting)

Anonymous Coward | more than 2 years ago | (#38885833)

When Nvidia puts out a $500 card, it's attractively priced [hothardware.com] .

When AMD puts out a faster card for 10% less, it draws complaints about the price from the same reviewer. What gives?

Re:Is the price really that horrible? (4, Interesting)

Dyinobal (1427207) | more than 2 years ago | (#38885977)

Nvidia pays better, and sends better swag I'd guess.

Re:Is the price really that horrible? (1)

Anonymous Coward | more than 2 years ago | (#38889397)

You might not be joking. [phoronix.com]

Re:Is the price really that horrible? (2)

Baloroth (2370816) | more than 2 years ago | (#38885981)

People expect AMD to be cheaper, even when they are competitive from a performance standpoint. AMD usually aims more for the mid-range market, so I expect seeing a top-end card from them (at top-end prices) is a little surprising.

Re:Is the price really that horrible? (3, Insightful)

goldaryn (834427) | more than 2 years ago | (#38885987)

When Nvidia puts out a $500 card, it's attractively priced [hothardware.com] . When AMD puts out a faster card for 10% less, it draws complaints about the price from the same reviewer. What gives?

To be fair, that review you linked is from November 2010. Perhaps second-hand 580s are better value or something.

But also to be fair... (0)

Anonymous Coward | more than 2 years ago | (#38886941)

In the interest of fairness, I'd also like to point out that the 580 still costs more than the faster 7950. And yet the reviewer still gripes.

Re:But also to be fair... (1)

Sir_Sri (199544) | more than 2 years ago | (#38887395)

The market is changing, and the reviewer is reflecting that. People don't want to spend 600 dollars on a top end card, even if 5 years ago the 'top end' cost 800 dollars (or whatever it was).

The perception is (rightly or wrongly) that all of these things should be getting faster and cheaper at the same time. That's not entirely wrong, but it's not entirely right either. A die shrink should mean lower cost for the chip itself, depending on yields but has nothing to do with any of the other parts on the PCB, market pressures or the like. (And the reviewer probably knows AMD has the price jacked up a bit because they probably haven't moved more than 10k units of the top end card right now and are trying to get as much money as they can from them before launching other things at a lower price).

Re:Is the price really that horrible? (1)

Kjella (173770) | more than 2 years ago | (#38886171)

Maybe that the first of the 28nm process generation costs about the same as the last of the 40nm process generation released a year and a half ago? Currently the effect on the price/performance ratio has been almost nothing, they've offered higher performance at a higher price. Yes, the 7950 is now beating the GTX 580 in most ways but it's not exactly a massively better deal. Hopefully nVidia will be a bit more aggressive but if they're both held back by TSMC's delivery capacity the duel can get a bit lame.

Re:Is the price really that horrible? (0)

Anonymous Coward | more than 2 years ago | (#38886551)

It's because Nvidia is higher quality. The user experience with Nvidia is always much smoother and better.

Re:Is the price really that horrible? (0)

Anonymous Coward | more than 2 years ago | (#38886717)

Pay no mind. This AMD bashing is just the never-ending media onslaught against the underdog.

If AMD put out a brand new card next week whose performance smashes every other card in existence, and at half the price, it'd still get panned by every blogger, and known tech site.

At this point, it's just expected. They'll catch on at some point, but by then everyone will be ignoring them.

Re:Is the price really that horrible? (0)

Anonymous Coward | more than 2 years ago | (#38887047)

Because you can actually use an nVidia card. Unlike an ATI/AMD whose drivers suck so bad you'll be lucky if you can even boot your OS.

Ever try to write an OpenGL app for ATI? Fun, fun stuff.

Ever wonder why so many reviews with benchmarks have empty spots for the ATI card because they couldn't get it to work?

I believe they have superior hardware to nVidia but without drivers there just isn't any way to take advantage of it. They need to stop letting engineers design and write their drivers. Get some software people in there you morons!

Re:Is the price really that horrible? (0)

Anonymous Coward | more than 2 years ago | (#38892679)

Because AMD just released a card 1 year behind nVidia, offering about the same performance as a 1 year old card, and charging about the same price... does that sound like a good value to you?

I wasn't an ATI/AMD fan until... (3, Interesting)

Tastecicles (1153671) | more than 2 years ago | (#38886087)

...well, let's clear things up: I was always an AMD fan. Their CPUs rocked. I had a seriously great time overclocking my SS7 gear until it boiled.

The graphics cards sucked though. I'm talking about the old Radeon AGP cards. Put down your paddles, lads, 2006 was the last time I bought an ATI branded card (an X1800) and IMHO it sucked monkey balls. I couldn't even get it to perform at low resolution on Unreal 2002. That's why I went straight back to the store and swapped it for an NVidia 7600GT. Oh, yeah, life was sweet after that.

A couple weeks ago I bought a secondhand Sapphire HD3650 with 512MB DDR2. OK, it's a bloody old and very low-spec card by tech standards, but it blows my GF 7600GT right out of the water, even on a slower, single-core 64-bit processor running a 32-bit platform. That made me a fan of ATI/AMD graphics right there. The old machine (Core Duo) with the NVidia is now collecting dust.

Re:I wasn't an ATI/AMD fan until... (2)

SplashMyBandit (1543257) | more than 2 years ago | (#38887237)

Lol. You replaced one old outdated card with another :) My personal experience has been that NVidia has excellent drivers. ATI/AMD have better hardware and better visual quality (NVidia often has strange visual artifacts). The downside of ATI is that their drivers are dodgy: it is always a risk upgrading an ATI driver, and sometimes a new driver can break your favourite game until a hotfix comes out (usually takes a fortnight or so). So whether you go NVidia or AMD depends on what you want: NVidia (ease of use) or ATI (more power, better value).

Re:I wasn't an ATI/AMD fan until... (1)

webheaded (997188) | more than 2 years ago | (#38890779)

ATi drivers aren't just dodgy...they are awful. I've had a 4870x2 for a while now and I've seen issues ranging from buggy games, to crashing video drivers playing flash, and green video for flash. I did a completely clean install for the last release and got about 2 days of being able to use Youtube before it started green videoing again. It is truly incredible that they can make a video driver that can't properly play fucking YOUTUBE VIDEOS with hardware acceleration while at the same time being able to play Rage properly (same release...broke Flash and fixed Rage...wtf?).

I'm never buying ATi again. The hardware is great but the drivers are atrocious. I've never experienced the same level of bugginess from an nVidia card. ATi drivers ruin some really beautifully priced hardware and that's really sad. About the only thing they completely kick nVidia's ass at is overscan. I sorely miss having an ATi card on my media center because nVidia's overscan options are fucked. ATi? Drag a slider and the screen moves bigger or smaller...done. NVidia...drag a slider left, right, up, down, etc...sets a custom resolution. Sets normal resolutions to actually go to the custom one. Looks blurry as hell. I end up having to use a resolution that my card considers technically 1080 when I want to play a game on it because that card sure as hell can't handle games at 1080 but if I go to ACTUAL 720 (1280x720)...the screen is stretched off to where I can't see it. Awesome.

This certainly isn't going to make people want to PC game dealing with this kind of bullshit.

Re:I wasn't an ATI/AMD fan until... (0)

Anonymous Coward | more than 2 years ago | (#38892443)

This right here is pretty much what has kept me on the Nvidia train since Voodoo/Diamond/Monster died off. Nvidia apparently has a tendency to send out larger teams to dev houses working on titles, and the drivers are usually pretty solid. Whenever I hear of a driver issue with a game, it is almost always related to ATI/AMD drivers...

Re:I wasn't an ATI/AMD fan until... (1)

Osgeld (1900440) | more than 2 years ago | (#38887465)

Yea, my 9600GT kicked my 7600GT right square in the nuts; actually, just about any card after the 7600GT would have rocked it. You're comparing a sports car to a Yugo. The 7600GT was the absolute worst waste of money I have ever spent on a video card, as my 6600GT actually performed just as well.

Re:I wasn't an ATI/AMD fan until... (1)

TheDarkMaster (1292526) | more than 2 years ago | (#38889369)

ATI has better hardware, but their drivers are pure, total crap. It's like buying a Ferrari and putting a mediocre driver behind the wheel. After many problems with the drivers I gave up on buying a new Radeon, and now I use a GTX 580.

Re:I wasn't an ATI/AMD fan until... (0)

Anonymous Coward | more than 2 years ago | (#38889779)

I had a seriously great time overclocking my SS7 gear ...

I think you need to get out more
(yes I deserve to burn in moderation hell for this, but I really couldn't resist.)

X950s are overclockable like hell (1)

unity100 (970058) | more than 2 years ago | (#38886647)

I'm on a 6950; it's clocked at 810MHz, but it can do 910MHz just by using the ATI Catalyst slider, no fancy stuff. If you go into serious overclocking, you can approach 1000MHz easily if you play with the voltages and stuff.

Moreover, X950s are generally unlockable. For example, I unlocked the 6950 I'm sitting on, enabling 24-30 or so shaders, basically making it a 6970. I could also flash a 6970 BIOS and make it a full 6970, but that's totally unnecessary, since I can get more than that by overclocking.

A good deal.

Really? (1)

TheRedShirt (1767228) | more than 2 years ago | (#38886817)

"Beats NVIDIA's high-end GeForce GTX 580 by just a hair."

You don't say. Must not have factored in Nvidia's history of selling and shipping GPUs that were known to be defective and then conspiring with the purchasers to hide this fact from the users until after their warranties ran out.

If they had, this new GPU would outperform Nvidia's by huge leaps and bounds.

6150 Go. Look it up.

Who gives a shit? (1)

Luke727 (547923) | more than 2 years ago | (#38887123)

My dick won't get hard until video cards take up 4 expansion slots and require 2 kW power supplies. Shiiiiiiiiiiiit [youtube.com] .

um yea that shit better come with an Asian hooker (1, Insightful)

Osgeld (1900440) | more than 2 years ago | (#38887439)

Really, what is the point of this anymore? 90+% of your games are optimised for consoles first, giving you at best a GeForce 8800GT; computer monitors are not getting any higher resolution, and they still have not come up with a cooling system that doesn't clog with dust in a month!

Never mind the absolute shit drivers ATI ships.

AMD/ATI still have scheissy Linux support (1)

TheGoodNamesWereGone (1844118) | more than 2 years ago | (#38887927)

I won't buy an ATI card until the Linux driver situation is fixed.

Re:AMD/ATI still have scheissy Linux support (1)

Anonymous Coward | more than 2 years ago | (#38888229)

My 5770 works fine on my current KDE desktop, multiple monitors with different resolutions and refresh rates. The gpu acceleration in Blender's new rendering engine is a lot of fun to mess about with.

Re:AMD/ATI still have scheissy Linux support (1)

JonJ (907502) | more than 2 years ago | (#38888797)

Lucky you. On my box, regardless of distro, kernel, and catalyst drivers, VLC always segfaults when trying to play accelerated video. It works fine with nvidia, so I have to conclude that the drivers for the AMD card are worthless.

Never again, ATI (0)

Sniper061 (2564827) | more than 2 years ago | (#38888033)

The last ATI card I bought was an HD5970 shortly after it was released. The card worked... fairly well... Performance on some games was pretty good but others were full of artifacts and/or crashed outright. Various driver releases alleviated some problems but not nearly all of them. Most problems required some "creative" solutions on my part to get my programs to work correctly. After finally getting fed up with the whole situation, I finally caved in and bought an NVidia card. Performance increased and I never had any problems with it. As long as both card companies stay roughly on par, I'll stick with NVidia.

Re:Never again, ATI (1)

unity100 (970058) | more than 2 years ago | (#38890661)

You should have bought a card from a proper manufacturer; 5970s are still monster cards.
