
AMD Radeon HD 7970 Launched, Fastest GPU Tested

timothy posted more than 2 years ago | from the that's-rent-in-some-towns dept.


MojoKid writes "Rumors of AMD's Southern Islands family of graphics processors have circulated for some time, but today AMD is officially announcing its latest flagship single-GPU graphics card, the Radeon HD 7970. AMD's new Tahiti GPU features AMD's Graphics Core Next architecture and is outfitted with 2,048 stream processors at a 925MHz engine clock, paired with 3GB of GDDR5 memory connected over a 384-bit memory bus. And yes, it's crazy fast, as you'd expect, and it supports DX11.1 rendering. In the benchmarks, the new Radeon HD 7970 bests NVIDIA's fastest single-GPU card, the GeForce GTX 580, by a comfortable margin of 15 to 20 percent, and can even approach some dual-GPU configurations in certain tests." PC Perspective has a similarly positive writeup. There are people who will pay $549 for a video card, and others who are just glad that the technology drags along the low-end offerings, too.
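For anyone who wants to sanity-check the spec sheet, here's a minimal back-of-the-envelope sketch in Python. The 2-FLOPs-per-clock (fused multiply-add) figure, the 1/4-rate double precision, and the 5.5 Gbps effective GDDR5 data rate are assumptions added for illustration; they are not stated in the summary.

    # Rough peak numbers from the specs quoted above, plus a few assumptions.
    stream_processors = 2048
    engine_clock_ghz = 0.925        # 925 MHz
    bus_width_bits = 384
    mem_data_rate_gbps = 5.5        # assumed effective GDDR5 rate per pin

    # Assume each stream processor issues one fused multiply-add (2 FLOPs) per clock.
    peak_sp_gflops = stream_processors * engine_clock_ghz * 2
    peak_dp_gflops = peak_sp_gflops / 4          # assumed 1/4-rate double precision
    peak_bandwidth_gbs = bus_width_bits * mem_data_rate_gbps / 8

    print(f"Peak single precision: {peak_sp_gflops:.0f} GFLOPS")    # ~3789
    print(f"Peak double precision: {peak_dp_gflops:.0f} GFLOPS")    # ~947
    print(f"Peak memory bandwidth: {peak_bandwidth_gbs:.0f} GB/s")  # 264

Under those assumptions the card works out to roughly 3.8 TFLOPS single precision and 264 GB/s of memory bandwidth, which is the ballpark the launch reviews quote.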

281 comments

This would be really cool... (2, Insightful)

Anonymous Coward | more than 2 years ago | (#38461024)

...if most PC games weren't just shitty console ports these days. If you spend over $150 on a graphics card, you're an idiot.

Re:This would be really cool... (5, Funny)

Hatta (162192) | more than 2 years ago | (#38461080)

Hush. Those idiots finance the advance of technology.

Re:This would be really cool... (1)

AngryDeuce (2205124) | more than 2 years ago | (#38461160)

Hopefully the prices on the 5000 and 6000 series start dropping after Christmas. My 4670s are starting to show their age...

Re:This would be really cool... (1)

Baloroth (2370816) | more than 2 years ago | (#38461418)

I was actually hoping they'd come out with a 7850 or similar soon. My 5770 is still pretty strong, but I wouldn't mind an upgrade soon, and I like to have all the newest features (I can do without the top speed).

Re:This would be really cool... (1)

theantipop (803016) | more than 2 years ago | (#38461494)

You've been able to get a 6850 for darn near $100 for some time now. It's a pretty good deal unless you only buy high-end GPUs, which honestly seems like a waste anymore. My 4 year old 8800GT is just now starting to feel inadequate. I'm looking for the second coming of the 8800GT to emerge from this generation so I can hold on to it for another 4 years.

Re:This would be really cool... (0)

Anonymous Coward | more than 2 years ago | (#38461886)

Stuck with the 8800GT for a long time, but it falls short on newer games that are really shader heavy, unfortunately.

Went to a Radeon 6870. Amazing value.

Re:This would be really cool... (1)

Anonymous Coward | more than 2 years ago | (#38461166)

You'll need a bit more than your average 150-dollar card to max out teh pretty on most "consolized" games.

I'll admit, though, that your average GPU nowadays has a much longer life than they used to.

Re:This would be really cool... (0)

Anonymous Coward | more than 2 years ago | (#38461302)

You'll need a bit more than your average 150-dollar card to max out teh pretty on most "consolized" games.

Uh... nope. I seem to be doing just fine with a video card under $100 (or, at least, it can max out Skyrim and various other games).

Re:This would be really cool... (5, Insightful)

Lumpy (12016) | more than 2 years ago | (#38461170)

Says the idiot that only uses a PC for gaming.

Adobe After Effects will use the GPU for rendering and image processing.

Re:This would be really cool... (2)

LordLimecat (1103839) | more than 2 years ago | (#38461294)

Aren't you way better off with a workstation card for most workstation loads? From what I've read, a GTX or ATI HD makes for a poor CAD or Adobe machine.

Re:This would be really cool... (4, Interesting)

Lumpy (12016) | more than 2 years ago | (#38461390)

Nope. Bang for buck, this new card kicks the workstation cards' butts, hard.

Re:This would be really cool... (4, Informative)

billcopc (196330) | more than 2 years ago | (#38461744)

Depends on the type of processing. GTX and Radeon cards artificially limit their double-precision performance to 1/4 of their capabilities, to protect the high-margin workstation SKUs. If all you're doing is single-precision math, you're fine with a gaming card.
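To make the parent's point concrete, here's a tiny sketch of how an artificial cap changes effective throughput. The single-precision peak and both ratios below are illustrative assumptions, not measured figures for any particular card.

    def dp_gflops(sp_peak_gflops, dp_ratio):
        # Effective double-precision peak, given a single-precision peak and the
        # fraction of it that the product allows for FP64.
        return sp_peak_gflops * dp_ratio

    sp_peak = 3800.0  # assumed single-precision peak for a high-end gaming card
    print("capped gaming card (assumed 1/4):", dp_gflops(sp_peak, 1 / 4), "GFLOPS")
    print("workstation part (assumed 1/2):  ", dp_gflops(sp_peak, 1 / 2), "GFLOPS")

If your kernels are all single precision, the cap never bites, which is the point being made here.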

Re:This would be really cool... (-1)

Anonymous Coward | more than 2 years ago | (#38461216)

If you spend less than $150 on a graphics card, you won't be able to play Battlefield 3 or Witcher 2 comfortably (60+ fps constantly). I want to play games with my PC, so I guess I am an idiot then.

Re:This would be really cool... (2)

Endo13 (1000782) | more than 2 years ago | (#38461360)

Not true. You just have to find the sweet spot of performance/$. My current card (I think it's a 6870 but I'd have to double-check to be sure) cost less than $150 a couple months ago and runs Witcher 2 quite smoothly with high settings. Haven't tried BF3.

Re:This would be really cool... (-1, Flamebait)

heinousjay (683506) | more than 2 years ago | (#38461264)

What I hate are the console games ruined by trying to cater to PC nerds. Since you guys hate everything and pirate it all anyway I wish they'd just give up on you and concentrate on the grateful gamers.

Re:This would be really cool... (1)

Anonymous Coward | more than 2 years ago | (#38461430)

What I hate are the console fanbois who ruin the gaming experience for everyone by refusing to play on anything except consoles, thus leading to most big games being developed for severely inferior systems.

(P.S.: This post is a troll. The real PC gamers should know what I mean.)

Re:This would be really cool... (2)

P-niiice (1703362) | more than 2 years ago | (#38461478)

I'm a console gamer, but I would prefer more balance; it's too console-leaning right now and we haven't gotten any real advancement in gaming for a long time. Consoles keep the developers and publishers afloat, and great PC games would temper devs and force them to add more depth to games....although that didn't help Oblivion and Skyrim (I love those games, but they lost some of the nerd-appeal of Morrowind).

Re:This would be really cool... (0)

Anonymous Coward | more than 2 years ago | (#38461794)

Make it so I can lend/swap my PC games again, and so I don't need a separate copy for my daughter (even if there will never be any co-op play), and we'll talk.

Yeah, Steam sales are cheap; swapping Xbox 360 discs with my friends is cheaper. I spend a shit-ton on games, but graphics aren't the be-all and end-all. I'm typing this on a machine with CrossFired XFX 6870s, so it's not like I don't like PC gaming, but PC gaming sucks purely due to PC gaming sucking, not "consolized" titles (seriously, if you think Deus Ex HR is any more fun on PC than on console, this argument is not only probably lost on you, you're probably a moron).

If you want to keep selling me PC games for 3 bucks in Steam and GOG sales, well then go ahead, but that's going to be most of my purchasing.

Incidentally, many of the games that have made the PC great don't actually use a lot of horsepower (and didn't even when they launched). Go take a look at the GOG library if you don't believe me. Also think about Minecraft, Terraria, etc. in recent years.

Re:This would be really cool... (2)

jellomizer (103300) | more than 2 years ago | (#38461272)

I think the issue is that most game graphics have reached a peak with the current rendering technology, where you need exponential work (in manpower) to get a linear improvement.

Black and white text... All fine and good until we need a graph.
Black and white graphics... Now if only it could do color.
CGA... What bad colors.
EGA... Looking a lot better; if only we could get some shading and skin tones.
VGA... Enough visible colors to make realistic pictures, but a higher resolution would make it better.
SVGA... (The first good peak.) OK, static images are looking good and we can watch movies, but game animation is limited and 3D is getting popular.
3D cards... If only we could get more polygons per second... more textures... alpha channels... smoothing...
That is where we are now: we have reached the point where we can display what we want to display. However, the next steps will require new rendering methods that make graphics easier to create.

Re:This would be really cool... (1)

Anonymous Coward | more than 2 years ago | (#38461348)

Some of us use graphics cards for non-gaming purposes. For example, I develop CAD-CAM modelling software, and my programs rely on these math co-processors we call graphics cards to do thermal/structural analysis. Being able to rely on 250€ worth of kit to do a job that would take about 500€ of AMD processors or 2,500€ of Intel processors, all while using only a fraction of the energy and occupying a fraction of the volume, is an excellent thing to have.
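For readers wondering what a graphics card has to do with thermal analysis: the workloads described above are typically grids of cells that all update independently each time step, which is exactly the shape of problem a few thousand stream processors chew through. Below is a minimal sketch of one explicit step of the 2D heat equation, written with NumPy on the CPU purely for illustration (the grid size, diffusion coefficient, and hot spot are arbitrary); a GPU version would run the same update as a single kernel over the whole grid.

    import numpy as np

    def heat_step(T, alpha=0.1):
        # One explicit finite-difference time step; every interior cell is
        # updated independently from its four neighbours.
        Tn = T.copy()
        Tn[1:-1, 1:-1] = T[1:-1, 1:-1] + alpha * (
            T[2:, 1:-1] + T[:-2, 1:-1] + T[1:-1, 2:] + T[1:-1, :-2]
            - 4.0 * T[1:-1, 1:-1]
        )
        return Tn

    T = np.zeros((512, 512))
    T[256, 256] = 100.0            # a hot spot in the middle of the plate
    for _ in range(100):
        T = heat_step(T)
    print(T[254:259, 254:259].round(3))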

Re:This would be really cool... (1)

Anonymous Coward | more than 2 years ago | (#38461442)

You're an idiot! I could/would/should use this GPU for "GPGPU" rendering using MATLAB. And then I would *still* benefit from using two or three of them in the same setup. Just because you're a gamer doesn't mean that everyone else is!

Re:This would be really cool... (2)

durrr (1316311) | more than 2 years ago | (#38461490)

It's the fastest GPU in the known universe! Surely it has to be worth something!

Re:This would be really cool... (0)

billcopc (196330) | more than 2 years ago | (#38461630)

Hey now! The last guy who called me an idiot got shit on by a transgendered midget.

Depends on your resolution, but yes. Those of us with beefy GPU setups tend to be doing 3D, multi-display, or bitchy resolutions like 2560x1440. Or in my case, CUDA processing and 3D raytracing. A lot of people forget that GPUs can do a lot more than just games. For some of us, those non-gaming uses are our day jobs.

Re:This would be really cool... (0)

Anonymous Coward | more than 2 years ago | (#38461702)

You should play Skyrim at 2560x1600 and ultra settings. Definitely not a "console port".

Only once have I splurged like that (1)

halivar (535827) | more than 2 years ago | (#38461060)

I rebuild my machines every two years. My previous rig couldn't do Crysis at max settings, so my latest system has dual 5870s that I got for $400 apiece. I'll never splurge like that on video cards again. Then again, 2 years later, I still max out the sliders on every game I get. It's great to have that kind of computing power... but maybe I should have waited 6 months? Those cards are going for $150 today.

Re:Only once have I splurged like that (1)

Moheeheeko (1682914) | more than 2 years ago | (#38461118)

Pretty sure most systems can't run Crysis perfectly at max settings, simply because Crysis is one of the worst-optimized games ever developed.

Overpowerful. (4, Interesting)

unity100 (970058) | more than 2 years ago | (#38461068)

Due to console gaming retarding PCs.

I'm on a single Radeon 6950 (unlocked to a 6970 by BIOS flash), and I'm running 5040x1050 (3-monitor Eyefinity) on SWTOR (The Old Republic), all settings maxed, at 30-40 fps on average and 25+ fps on Coruscant (Coruscant is waaaaaaay too big).

Same for Skyrim. I even have extra graphics mods on Skyrim: FXAA injector (injected bloom into the game), this and that.

So the top GPU of the existing generation (before anyone jumps in to say the 6990 is ATI's top offering: the 6990 is two 6970s in CrossFire, and the 6950 GPU is just the 6970 GPU with 38 or so shaders locked down via BIOS and underclocked; they are ALL the same chip) is not only able to play the newest graphics-heavy games at max settings BUT can also do it at 3-monitor Eyefinity resolution.

One word: consoles. Optional word: retarding.
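To put the three-monitor setup above in perspective, here's a quick pixel-count comparison (Python; the 60 fps figure is just an illustrative target). Eyefinity at 5040x1050 pushes roughly two and a half times as many pixels per frame as a single 1920x1080 display.

    resolutions = {
        "single 1680x1050": 1680 * 1050,
        "single 1920x1080": 1920 * 1080,
        "Eyefinity 5040x1050": 5040 * 1050,
    }
    for name, pixels in resolutions.items():
        # megapixels per frame, and per second at an illustrative 60 fps target
        print(f"{name}: {pixels / 1e6:.2f} Mpixels/frame, "
              f"{pixels * 60 / 1e6:.0f} Mpixels/s at 60 fps")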

Re:Overpowerful. (4, Insightful)

parlancex (1322105) | more than 2 years ago | (#38461254)

... 30-40 fps on average, and 25+ fps on Coruscant (Coruscant is waaaaaaay too big). Same for Skyrim...

Looks like PCs aren't the only thing gaming consoles have been retarding. Most PC gamers would have considered 25 fps nearly unplayable, and 30-40 fps highly undesirable, before the proliferation of poor frame rates in modern console games. There are still many of us who are unsatisfied with that level of performance but unwilling to compromise on graphics quality.
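Since the whole sub-thread below argues about frame rates, it helps to translate them into frame times. A minimal sketch; the "up to one frame of extra delay" figure is a deliberate simplification that ignores engine, driver, and display buffering.

    # Frame time at the rates being argued about, and the naive worst case for
    # how long an input can wait before the next frame shows its effect.
    for fps in (24, 30, 40, 60, 120):
        frame_ms = 1000.0 / fps
        print(f"{fps:3d} fps -> {frame_ms:5.1f} ms per frame "
              f"(up to {frame_ms:.1f} ms before an input is reflected on screen)")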

Re:Overpowerful. (2)

rodrigoandrade (713371) | more than 2 years ago | (#38461414)

Yes, it's really a shame that 30 fps became an acceptable framerate for games nowadays, thanks to crappy underpowered consoles.

Funny, however, is that back in 1999 (Dreamcast days) any console game that didn't run at a solid 60 fps was considered a potential flop.

This framerate crap is one of the many reasons I'll never go back to console gaming.

Times change, no?

Re:Overpowerful. (-1, Troll)

unity100 (970058) | more than 2 years ago | (#38461482)

There is no difference between 25, 30, or 40 fps to the human eye playing a game, other than the counter Fraps shows. There never was any difference between these with respect to human physiology and the eye-brain connection either. We still perceive 24 fps as smooth, and anything over 30 fps as very smooth. That's why the HDMI standard used to mandate 24 fps.

Re:Overpowerful. (-1)

Anonymous Coward | more than 2 years ago | (#38461648)

Wrong. I'll agree that over 60 doesn't really matter but as a former game developer I can tell you there is a HUGE difference between 30 and 60.

Here troll! Troll! Troll! Dinner time!

Re:Overpowerful. (0)

unity100 (970058) | more than 2 years ago | (#38461682)

Wrong. I'll agree that over 60 doesn't really matter but as a former game developer I can tell you there is a HUGE difference between 30 and 60.

Yes, that must be why the HDMI standard requires 24 fps as the absolute minimum: because a game developer somewhere, and some percentage of gamers who equate the fps counter or the hardware in their machines with their egos, think they perceive differences between 30 and 60 fps.

No, you don't, unless you genetically modified yourself. Any argument otherwise is bullshit until proper research is referenced.

Re:Overpowerful. (1)

anonymov (1768712) | more than 2 years ago | (#38461686)

Not really.

First, different people have different perception, so 20 may be enough for some and 60 just right for others.

Second, fast motion on low framerate requires some amount of motion blur to be perceived as "smooth". When shooting a film, this blur is already there thanks to the nature of filming. When rendering, developers have to care about it, and as it can be costly it's often dropped.

Re:Overpowerful. (1)

unity100 (970058) | more than 2 years ago | (#38461730)

Then bring me ONE study showing that 'people with different perception' can perceive a difference between 30 and 60 fps. One is enough.

Re:Overpowerful. (4, Interesting)

anonymov (1768712) | more than 2 years ago | (#38461926)

You keep talking about "research"; maybe _you_ would care to provide research that shows "24 fps should be enough for everyone"? (Hint: it isn't, and that's the reason for the current push toward 50p/60p/72p film and television.)

Why, you can just go here http://frames-per-second.appspot.com/ [appspot.com] and tell us "I don't see any difference". And then we'll just tell you to visit your eye doctor.

Re:Overpowerful. (1)

Charliemopps (1157495) | more than 2 years ago | (#38461762)

There is a difference, a very obvious difference. But your television most likely runs at 30 fps, and your monitor at 60, if you're lucky. Get an old CRT capable of 200 Hz, run some old 3D game like Descent that a modern card would make easy work of so you can hit 200 fps, and marvel at the difference. It really does make a huge difference.

Re:Overpowerful. (0)

Anonymous Coward | more than 2 years ago | (#38461814)

I think you should check your facts on this one. You are wrong.

Re:Overpowerful. (0)

Anonymous Coward | more than 2 years ago | (#38461820)

That's why the HDMI standard used to mandate 24 fps.

Movies are about 24 fps. But the frames in movies are SMOOTHED OUT. Each frame isn't a perfectly clear and sharp image of a single instant like with computer games.

Re:Overpowerful. (1)

unity100 (970058) | more than 2 years ago | (#38461928)

Yes, and therefore people must have the biology to perceive the difference between 30, 40, and 60 fps,

just because a percentage of gamers materialized who think they do.

Re:Overpowerful. (0)

Anonymous Coward | more than 2 years ago | (#38461692)

As I always said: If you can win the game at below 30 FPS, the game is shit!

I remember the two most important changes that improved my skill:
1. Play 30 minutes of Quake 3 pro mode (CPMA) before the actual game. = Level 10,000 on the burst-top twitch scale.
2. Make sure you play at 60 fps. Not 30. Period.
(Having a good cabled mouse of the right type [laser vs. optical] deserves third place, and having a screen with no lag is fourth, although I still play on CRTs because they offer free resolution scaling and have neither lag nor smearing, so I wouldn't know.)

Re:Overpowerful. (1)

LordLimecat (1103839) | more than 2 years ago | (#38461318)

Consoles support 5040x1050? Color me surprised.

Re:Overpowerful. (1)

Junta (36770) | more than 2 years ago | (#38461842)

His point being that game developers are conservative about pushing graphical complexity such that they don't even produce a workload that remotely challenges modern top-end cards. He attributes this to developers targeting weaker consoles. I think it's just because they want to perhaps have a slightly larger market than those willing to shell out $600 a year in graphics cards *alone*, regardless of game consoles. Now to push framerates down to a point where it looks like things matter, they have to turn up the complexity settings to max *and* break out three monitors at 1920x1080 a head *and* turn up things like AA to ludicrous values to show a meaningful difference for game playing.

I personally look forward to a 'midrange' Southern Islands card that can pull off 1920x1080 with basically all the settings cranked up, without AA or multihead. I know I could get one already with either Northern Islands or Fermi, but I'm holding out one more generation for just that much less power draw. My 8800GT just can't keep up with some of the new releases without turning a lot of settings down, despite the GP post's opinion.

Re:Overpowerful. (2)

Warma (1220342) | more than 2 years ago | (#38461328)

This is very much a matter of preference. I feel that 25 fps is just flat-out unplayable and anything under 60 is distracting and annoying. I believe that most gamers would agree. I always cut detail and effects to get 60 fps, even if this means the game will look like shit. Framerate is life.

So no, based on your description, the top of the previous generation is NOT able to play those games in the environment you described. You would need around twice the GPU power for that. The benchmarks suggest the 7970 won't cut it either.

Re:Overpowerful. (1)

Jeremi (14640) | more than 2 years ago | (#38461428)

This is very much a matter of preference. I feel that 25 fps is just flat-out unplayable and anything under 60 is distracting and annoying.

It's partially a matter of what you are accustomed to. I remember playing Arcticfox on a 386/EGA at about 4FPS and thinking it was awesome, because compared to the other games available at the time it was.

Re:Overpowerful. (0)

Anonymous Coward | more than 2 years ago | (#38461484)

Agreed. It's ridiculous to state that current PC hardware is overpowered for games if you can't reach 60fps. Nintendo got that right with Metroid Prime and Mario Galaxy. Plus iPhone is definitely 60 fps. That's why the rest suck so bad.

yes. 60 fps. (1)

unity100 (970058) | more than 2 years ago | (#38461584)

Because there is at least ONE study that shows human physiology and the senses are able to perceive frame rates above 24 fps as smoother?

No. The question is rhetorical. There isn't one single study showing humans are able to perceive a difference between 40 fps and 60 fps. It's total bullshit.

The HDMI specification requires 24 fps, not 60 fps, because 24 fps is scientifically backed, whereas the only thing backing 'I can perceive 60 fps' is self-propagated bullshit from gamers. Nothing else.

Re:yes. 60 fps. (0)

Anonymous Coward | more than 2 years ago | (#38461750)

Um, with high-speed motion in video, you WILL be able to tell the difference between 24 and 60 fps. Perhaps you should read some research papers and spend some time in the display industry.

Re:yes. 60 fps. (1)

unity100 (970058) | more than 2 years ago | (#38461880)

Yeah, please link me one of the papers that says people can distinguish between 30, 40, and 60 fps, and that therefore 60 is the 'necessary norm' for PC gaming. I'm waiting.

Re:yes. 60 fps. (0)

Anonymous Coward | more than 2 years ago | (#38461824)

Are you stupid or something? The 24 fps spec is because of film, dumbass. And films have motion blur built in because of how they are shot. I guarantee that you would be able to distinguish between 24 fps and 60 fps of crisply rendered frames. If not, see an eye doctor right away, because your vision needs help.

Re:yes. 60 fps. (1)

unity100 (970058) | more than 2 years ago | (#38461912)

I'm going to just slap down the same response:

Yeah, please link me one of the papers that says people can distinguish between 30, 40, and 60 fps, and that therefore 60 is the 'necessary norm' for PC gaming. I'm waiting.

Re:Overpowerful. (-1, Troll)

unity100 (970058) | more than 2 years ago | (#38461520)

I feel that 25 fps is just flat-out unplayable and anything under 60 is distracting and annoying. I believe that most gamers would agree.

And that's the problem: you FEEL that. All scientific research puts 'smooth' at 24 fps for the human eye (hence the HDMI minimum mandatory frame rate), and anything over 30 fps as smooth even if you try to trick your eye by looking in a direction other than the display.

You FEEL it. It's a totally subjective, unscientific, unbacked opinion.

And it's bullshit. Excuse me, but there is no other way of putting this, after thinking like you for close to a decade and trying to beef up my PC to meet totally unnecessary frame rate counts, only to eventually notice that it's a whole load of subjective bullcrap.

We, as people, do not notice differences from 30 fps on, EVEN if we ourselves 'feel' that we do.

Re:Overpowerful. (2, Insightful)

Warma (1220342) | more than 2 years ago | (#38461684)

No need to use strong language and unbacked opinions. You are simply incorrect.

Put two FPS players of similar skill in front of a computer. Configure the machines so that one shows 30 fps and the other shows 60 fps (easy to do in Quake and most other shooters). No matter what you may hope the truth to be, the player with the 60 fps display will have an enormous advantage.

It is extremely easy to test this yourself: just go play a game and record your performance by some metric while alternating the frame rates. I am willing to test it with you, if you wish.

Re:Overpowerful. (0)

unity100 (970058) | more than 2 years ago | (#38461718)

No need to use strong language and unbacked opinions. You are simply incorrect.

UNbacked opinions?

Link me ONE study that shows humans are able to perceive differences between 30 fps and 60 fps. ONE study. Something other than a percentage of 'fps gamers' thinking that they can.

And then move on to explain why the HDMI standard mandates 24 fps, and why that has been the minimum requirement for 'smooth' since forever in anything display-related.

No need to use strong language? Why, yes there is. There is a percentage of people who think their subjective 'feeling' precedes SCIENCE.

I'm waiting for proper research showing that people can perceive 30 vs. 60 fps AND rebutting the HDMI specification.

Re:Overpowerful. (0)

Anonymous Coward | more than 2 years ago | (#38461866)

You do realize that perceive and feel mean the same thing, right?

Just because your conscious mind can't go faster than 24 fps doesn't mean that gamers can't perceive the difference, especially having trained themselves to make split-second decisions.

This is the difference between studying the mind and actually using it in practice, and it's why so many of us disagree with you.

Plus, I find it highly dubious for any scientific person to say without a shadow of a doubt that no one can "see" anything higher than 24 fps. You are making such a blanket statement it's not funny, especially to those of us who can definitely perceive a monitor's refresh rate and the flicker of fluorescent tubes.

Re:Overpowerful. (1)

Kjella (173770) | more than 2 years ago | (#38461936)

Well, I'm sorry, you're an idiot. You're somewhat right: the eye only needs 18-20 fps to feel smooth if the scene is motion blurred. Reality doesn't have frames; during that 1/20th of a second everything moves. A rendered frame is not motion blurred and will seem to stutter badly. Yes, perhaps if you rendered at 60 fps and averaged down to 24 fps you wouldn't notice the difference, but having a graphics card that can only render at 24 fps is clearly insufficient. You should go see an optician if you don't notice it.
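Kjella's point about motion blur can be shown with a toy example: averaging several sub-frames into one output frame smears a moving object the way a camera shutter does, while a single sharp sample per frame does not. A minimal sketch with a 1D "scene"; the sizes are arbitrary and NumPy on the CPU stands in for a renderer.

    import numpy as np

    width, subsamples = 32, 5       # average 5 sub-frames into each output frame

    def render(position):
        # One instantaneous, perfectly sharp sample of a bright pixel.
        frame = np.zeros(width)
        frame[int(position) % width] = 1.0
        return frame

    def output_frame(t, blur):
        if not blur:
            return render(t * subsamples)            # game-style: one sharp sample
        # Camera-style: average the sub-frames covered by this output frame.
        subs = [render(t * subsamples + s) for s in range(subsamples)]
        return np.mean(subs, axis=0)

    print("sharp  :", output_frame(3, blur=False).round(2))
    print("blurred:", output_frame(3, blur=True).round(2))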

Re:Overpowerful. (0)

Anonymous Coward | more than 2 years ago | (#38461400)

...and with 30-40 fps on average, and 25 fps+...

Haha... HAHAHAHA.

I'll wait till it's $50 (1)

na1led (1030470) | more than 2 years ago | (#38461074)

This is good news for the many who are looking for bargain video cards. I recently purchased a GT240 on Newegg for $40, which performs as well as a GeForce 8800 that sold for $600 a few years ago.

Re:I'll wait till it's $50 (1)

drinkypoo (153816) | more than 2 years ago | (#38461578)

I paid $100 for one of those (with 1GB) when I built my system almost two years ago. It's 75% of the performance of a GT250 at 50% of the power and (at the time) just over 50% of the money. It's a great card, or family thereof.

Does GMA still stand for Graphics My ___? (1)

tepples (727027) | more than 2 years ago | (#38461082)

and others who are just glad that the technology drags along the low-end offerings, too

Has the advance of high-end NV and AMD GPUs dragged along the Intel IGP in any way, shape, or form?

Re:Does GMA still stand for Graphics My ___? (0)

Anonymous Coward | more than 2 years ago | (#38461154)

Thankfully, yes. At the very least, the current Intel on-chip graphics do a nice job of video encoding/decoding on chip, and they offer modest DX11 acceleration.

Re:Does GMA still stand for Graphics My ___? (1)

yincrash (854885) | more than 2 years ago | (#38461190)

Not by much, but it does mean that $50 for an AMD or NV card will get you more power.

Re:Does GMA still stand for Graphics My ___? (1)

na1led (1030470) | more than 2 years ago | (#38461232)

The Intel IGP has the fastest encoding of any GPU if you have software that supports Quick Sync. Intel has come a long way, but it's mostly geared toward HD video, not gaming.

Re:Does GMA still stand for Graphics My ___? (1)

LordLimecat (1103839) | more than 2 years ago | (#38461336)

Intel Sandy Bridge graphics are enough for most things, and Ivy Bridge is supposed to increase performance by another 20%.

Re:Does GMA still stand for Graphics My ___? (1)

jandrese (485) | more than 2 years ago | (#38461658)

Assuming you're willing to run at lower resolutions and framerates, yes.

Bitcoin (5, Funny)

Anonymous Coward | more than 2 years ago | (#38461122)

Yes yes.. Rendering yada yada. How many Mhash/s does it average when bitcoin mining? And what is the Mhash/Joule ratio?

Re:Bitcoin (0)

Anonymous Coward | more than 2 years ago | (#38461364)

If you got the card for free and you tried to pay for the increase of your power bill with mined bitcoins, it wouldn't cover half. But for some people, power is free when you mine from work. (In the late 80's, They Might Be Giants had an answering machine in Brooklyn that would play you something new every day. They just had a regular NY phone number, considered "long distance" at the time for most Americans, but they helpfully added the tip that the call is "free when you call from work". I just realized that I'm old and my nerdy references now need explanations. Sorry.)
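For anyone who actually wants the Mhash/Joule answer the grandparent asked for, the arithmetic is short. Every number below is an assumed, illustrative figure rather than a measurement for this card; plug in a real hashrate, board power, and electricity price (and the network difficulty and block reward, which this ignores) to make it meaningful.

    mhash_per_s = 600.0      # assumed hashrate
    card_watts = 250.0       # assumed power draw under load
    price_per_kwh = 0.12     # assumed electricity price, $/kWh

    mhash_per_joule = mhash_per_s / card_watts
    kwh_per_day = card_watts * 24 / 1000.0
    power_cost_per_day = kwh_per_day * price_per_kwh

    print(f"{mhash_per_joule:.2f} Mhash/J")                      # 2.40
    print(f"${power_cost_per_day:.2f} of electricity per day")   # $0.72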

Re:Bitcoin (1)

Guppy (12314) | more than 2 years ago | (#38461436)

I've never been interested in Bitcoin mining, but as it becomes less worthwhile, I'm hoping it will depress prices on the used graphics card market, as former miners liquidate their rigs.

Re:Bitcoin (1)

Statecraftsman (718862) | more than 2 years ago | (#38461854)

Not to speculate too much but we've probably passed the peak of mining hardware being listed for sale. Due to difficulty decreases and recently increasing prices, you may see increasing prices on GPUs in the next month or two.

Linux Driver State? (5, Insightful)

chill (34294) | more than 2 years ago | (#38461134)

What is the state of Linux drivers for AMD graphics cards? I haven't checked in a few years, since the closed-source nVidia ones provide excellent 3D performance and I'm happy with that.

But I'm in the market for a new graphics card and wonder if I can look at AMD/ATI again.

No, I'm not willing to install Windows for the one or two games I play. For something like Enemy Territory: Quake Wars (modified Quake 3 engine), how does AMD stack up on Linux?

Re:Linux Driver State? (0)

Anonymous Coward | more than 2 years ago | (#38461304)

How about the open source AMD/ATI drivers specifically?

Re:Linux Driver State? (1)

Anonymous Coward | more than 2 years ago | (#38461320)

I find the proprietary AMD graphics driver more than satisfactory. I run a compositing desktop, one 3D game and sometimes a bit of CAD or modelling software, on a Radeon HD 5700. Very smooth, utterly reliable, no problems at all, except that because the driver apparently conflicts with the GPL my distro vendor no longer provides it, so I have to rely on another repository. But that's okay.

Frankly, I'm willing to use non-GPL-friendly drivers for a video card. Yes, it would be nice if they'd show more respect for the GPL and FSF, but one has to be pragmatic.

Re:Linux Driver State? (3, Insightful)

Anonymous Coward | more than 2 years ago | (#38461350)

They suck just like they always have. But don't feel left out, they suck on Windows as well.

ATI/AMD may at times make the fastest hardware, but their Achilles' heel has been, and apparently always will be, their sucky drivers. The hardware is no good if you can't use it.

They need to stop letting hardware engineers write their drivers and get some people in there who know what they are doing. They need solid drivers for Windows and Linux, and a good OpenGL implementation. Until then they can never be taken seriously with their broke-ass software.

Re:Linux Driver State? (1)

LordLimecat (1103839) | more than 2 years ago | (#38461378)

Quake 3 probably doesn't need a top-of-the-line graphics card. Go with NVIDIA; for years that has been the best move if you think you may use Linux at some point.

$30 should get you a card that maxes out anything Quake 3.

Re:Linux Driver State? (1)

chill (34294) | more than 2 years ago | (#38461540)

Bah! My mistake. It is a heavily modified id Tech 4 engine [modwiki.net], which is Quake 4/Doom 3 -- not Quake 3. No $30 card will max that out.

My fault.

Re:Linux Driver State? (1)

simcop2387 (703011) | more than 2 years ago | (#38461508)

I've personally had stability issues with the AMD drivers on my laptop. But this seems to be because they only try to support Ubuntu and the versions it uses; for anything else it's hard to get a fix until Ubuntu updates to your version or you find a workaround in the community. The open-source drivers aren't as performant, but they are much more stable for me.

Re:Linux Driver State? (2)

drinkypoo (153816) | more than 2 years ago | (#38461554)

The almost-but-not-quite-latest card is generally fairly well supported by fglrx. If your card is old enough to be supported by the open-source ati driver, it may work, but it probably won't support all of its features. You're far better off with NVIDIA if you want to do gaming.

Every third card or so I try another AMD card, and immediately wish I hadn't. Save yourself.

Re:Linux Driver State? (0)

Anonymous Coward | more than 2 years ago | (#38461606)

I know that people's experiences with ATI drivers and Linux are quite varied, but mine have been terrible.
Using both their closed-source driver and the open-source Xorg drivers, I have had nothing but stability issues with ATI's graphics cards: Xorg crashing, kernel dumps, system freezes, you name it and I've experienced it. These same graphics cards work flawlessly under Windows.

So, these days, when I've got a Linux box with ATI graphics, I just use the VESA framebuffer drivers. If I need more than that, I order an NVIDIA card.

Please bear in mind that I'm not saying ATI and Linux is always bad; I'm just relating my experiences.

Re:Linux Driver State? (0)

Anonymous Coward | more than 2 years ago | (#38461644)

Good question. Which modern graphics card will increase your fps from 200 to 500 for Quake 3?

I don't get it. (1)

pablo_max (626328) | more than 2 years ago | (#38461874)

How is this modded Insightful?
What do Linux drivers have to do with this card? How are Linux users in any way the target market for a high-end enthusiast GAMING graphics card?

Perhaps once you can purchase BF3 or the like for Linux, ATI and NV will spend more time writing Linux drivers.

I cannot imagine that anything more than an older HD48xx series will help you in any way.

Don't care (-1)

Anonymous Coward | more than 2 years ago | (#38461136)

After my experience with an ATI X1650, and those crap Catalyst drivers, I'm never buying ATI Radeon again.

Yeah but compatibility... (0)

Anonymous Coward | more than 2 years ago | (#38461158)

That would be fine, but my general experience is that NVIDIA cards are more compatible with OpenGL (which is frequently used for indie games), whatever the specs say. Also, NVIDIA cards have been more reliable and compatible on the whole when used in workstations, in my experience.

Granted, I rarely have the budget to try the newest or best models... maybe it's different at the top end of the spectrum.

Re:Yeah but compatibility... (1)

Tr3vin (1220548) | more than 2 years ago | (#38461268)

The only difficulty I have had with my 5870 was during the first week that Rage was out. It was a terrible mess. Once the drivers and game were patched, everything worked fine. I have never had issues with any indie games, though. "Compatibility" is typically measured in the number of patches made to the drivers to work around the various developers' interpretations of OpenGL or DirectX.

Re:Yeah but compatibility... (0)

Anonymous Coward | more than 2 years ago | (#38461460)

I think it's the reverse. If the drivers have to be rewritten with optimizations and tweaks for every game, it makes me think the game developers are adhering to the spec and the driver companies aren't. One driver should work fine with any game.

It's too bad... (0, Troll)

Reverand Dave (1959652) | more than 2 years ago | (#38461198)

...that AMD can't put out a video driver that isn't total garbage.

That FERMI fire... (1)

Anonymous Coward | more than 2 years ago | (#38461246)

..keeping you nice and warm in the winter?

Re:It's too bad... (1)

jiriw (444695) | more than 2 years ago | (#38461448)

Well... PC Perspective had to benchmark this card with some sort of drivers. Guess what: those were probably written by AMD personnel. It's already faster than the competitor's offering, and if it had any major defects they surely would have mentioned it in the article. So if that's total garbage, it can only improve, no?
I'm not that afraid the cards will be usable only as badly designed space heaters, because apparently that's something they do badly, having a thermal envelope similar to the previous-gen cards. The developers of high-power PSUs will be the least pleased with this new product :P

Re:It's too bad... (1)

Reverand Dave (1959652) | more than 2 years ago | (#38461602)

I'm speaking as a technician who works in a shop that uses almost exclusively ATI cards and who has experienced many, many problems caused DIRECTLY by shitty ATI drivers.

Re:It's too bad... (4, Insightful)

Dr. Spork (142693) | more than 2 years ago | (#38461506)

Which version are you having trouble with? Are you sure that you're not just mindlessly repeating a 7 year old meme? Are you also one of the people who switched to Chrome because "Firefox uses too much memory" when simple tests show that Chrome uses more? I know it feels like you're a part of the club when you repeat what you hear from the other club members. But don't confuse groupthink with truth - especially when it comes to the quickly-changing world of tech.

Re:It's too bad... (1)

drinkypoo (153816) | more than 2 years ago | (#38461594)

When I brought home my R690M-chipset-based netbook, ATI had already abandoned the X1250 graphics in it and dropped them from fglrx (assuming they were ever in there). They apparently haven't given the folks making the open-source ati driver enough information to support it properly, despite their claimed commitment to open source (IME Intel has made good on this in more cases than AMD), so it craps all over my system if I try to run Linux, even with RenderAccel disabled.

ATI Rage Pro stuff is the only ATI hardware that seems to work flawlessly under Linux for me... but how old is that? And how bad were the Windows drivers for those cards when they were new?

Re:It's too bad... (1)

Dr. Spork (142693) | more than 2 years ago | (#38461784)

Yeah, but you know very well that the policies back then (many years ago) don't apply to ATI's practices with new hardware, which is what this article is about. For starters, it's a different company now. Also, their relationship with the OSS community has changed a lot. Maybe they're not going back and fixing 7-year-old problems, but that doesn't mean their new stuff has the same problems. I'm not saying everything is peachy now with the drivers, but I see too many people base their conclusions on reasons they know to be outdated.

Re:It's too bad... (1)

Reverand Dave (1959652) | more than 2 years ago | (#38461872)

Read below before you go all fanboy. I routinely have problems with that shitty Catalyst software on a wide number of computers in the shop where I work. It's experience talking. So I guess you can count your trolling as unsuccessful.

What bugs me most (3, Interesting)

Psicopatico (1005433) | more than 2 years ago | (#38461466)

Why do card manufacturers use (rightfully) new manufacturing processes (28nm transistors) only to push higher performance?

Why the hell don't they re-issue a, say, 8800GT with the newer technology, getting a fraction of the original power consumption and heat dissipation?
*That* would be a card I'd buy in a snap.
Until then, I'm happy with my faithful 2006's card.

Re:What bugs me most (4, Insightful)

jandrese (485) | more than 2 years ago | (#38461724)

The problem is that speed is only one part of the equation. That 8800GT only supports DX10.0. DX10.1 games may run, but you'll find them crashing after a while unless the developer was very careful (they were not). DX11 games won't work at all.

You're much better off with a modern card that just has fewer execution units if you want to save money. They won't be out right away (the first release is always near the top end), but they will eventually show up. Since you're worried about saving money/power, you don't want to be an early adopter anyway. Oftentimes the very first releases will have worse power/performance ratios than respins of the same board a few months down the road.

But can it run Unity without lag? (2)

captrb (1298149) | more than 2 years ago | (#38461468)

But can it run Unity on two screens without lag? I suspect that whatever video card I buy, the modern Linux dualhead display will feel slower than it did in 2005 :-/

switched to radeon, not thrilled. (0)

doug141 (863552) | more than 2 years ago | (#38461500)

I used to use NVIDIA cards, but always assumed video card brand made no difference, since I'd never heard anything to that effect. I recently got my first Radeon card, a minor upgrade that was twice as fast as my old card but used the same wattage (cool, right?). About once an hour, the new card freezes for a full 2 seconds, and maybe 30 seconds later freezes again for half a second. Sounds like no big deal, unless you are playing Modern Warfare 2, because then you are dead and your kill-streak is reset. Sometimes I come out of the 2-second freeze to see that my avatar has run off a ledge and is falling to his death. Kinda funny, but not when it borks a kill-streak. Getting killed by your video card is just wrong. I googled around for a solution and just found forums full of people lamenting the bad quality of the drivers for those cards. It's fine for most games, and it's fine 99% of the time, but that other 1% can be a cruel bitch. Now you know.

Re:switched to radeon, not thrilled. (2)

jandrese (485) | more than 2 years ago | (#38461830)

Those freezes are probably the driver crashing and resetting itself. It used to be that a driver crash brought down your whole system, but now they can do it in the background silently and all you'll notice is some stuttering (or a short freeze).

I suspect that ATI and nVidia have been able to use the silent-restart feature to sell more defective cards. If your system totally locks up every half hour when playing a game, you're going to return the card. If it freezes but then resumes silently, you may be annoyed but not pin the blame on the card, and not be annoyed enough to actually take action. It could be the disk, motherboard, or something else too; you don't get an indication that it was a video card problem. The very first revisions of this feature used to pop up a box telling you what happened, but that doesn't seem to be the case anymore.

My NVIDIA driver died when I hit the comments page (1)

kaychoro (1340087) | more than 2 years ago | (#38461782)

Ironically, when I started reading people's comments, my driver failed and had to be reloaded in Windows 7 for my year-old NVIDIA card.