
AMD Unveils New Family of GPUs: Radeon R5, R7, R9 With BF 4 Preorder Bundle

samzenpus posted about a year ago | from the latest-and-greatest dept.


MojoKid writes "AMD has just announced a full suite of new GPUs based on its Graphics Core Next (GCN) architecture. The Radeon R5, R7, and R9 families are the new product lines aimed at mainstream, performance, and high-end gaming, respectively. Specs on the new cards are still limited, but we know that the highest-end R9 290X is a six-billion-transistor GPU with more than 300GB/s of memory bandwidth and prominent support for 4K gaming. The R5 series will start at $89 with 1GB of RAM. The R7 260X will hit $139 with 2GB of RAM; the R9 270X and 280X appear to replace the current Radeon 7950 and 7970 at price points of $199 and $299 with 2GB and 3GB of RAM; and the R9 290X, at an as-yet-unannounced price, will carry 4GB of RAM. AMD is also offering a limited preorder pack that bundles a Battlefield 4 license with the graphics cards, which should go on sale in the very near future. Finally, AMD is also debuting a new positional and 3D spatial audio engine, developed in conjunction with GenAudio and dubbed 'AstoundSound,' but they're only making it available on the R9 290X, R9 280X, and the R9 270X."


188 comments


A new indecipherable numbering system, yay! (1)

Anonymous Coward | about a year ago | (#44954981)

Looks like the numbers were getting too close to the big ten thousand again, gotta reshuffle the whole system.

Fuck it, I'm buying some sort of PlayBoxStation One-Four. I give up.

Re:A new indecipherable numbering system, yay! (0)

Anonymous Coward | about a year ago | (#44955007)

And stop saying 4K resolution. Say 4000x2017 or whatfckingever resolution.

Re:A new indecipherable numbering system, yay! (1)

viperidaenz (2515578) | about a year ago | (#44955075)

But 3840x2160 isn't as cool as "4k"

Re:A new indecipherable numbering system, yay! (2, Funny)

Anonymous Coward | about a year ago | (#44956355)

Wonder how long until HD manufacturers adopt 1k == 960 to "avoid customer confusion"

Ignore numbers (3, Interesting)

rsilvergun (571051) | about a year ago | (#44955543)

just look at the width of the memory bus. Video card manufacturers use the memory bus width to limit card performance so that their low end doesn't cannibalize their mid range and high end (a la 3DFX).

128-bit is low end.

192-bit is your mid range card.

256-bit is your high end.

You don't need to pay attention to anything else until 256 bit. After that just sort by price on newegg and check the release date. Newer is better :)

Re:Ignore numbers (0)

Anonymous Coward | about a year ago | (#44955691)

You forgot 64-bit for 'spreadsheet end and grandmas'

Re:Ignore numbers (3, Interesting)

Anonymous Coward | about a year ago | (#44955857)

False. Perhaps this was true in the past, but currently memory bandwidth is tailored to the GPU's processing power - that is, it's the bandwidth the core needs, usually defined by the most bandwidth-hungry scenario.

Bandwidth is not determined by bus width alone, but by bus width and clock speed - a 128-bit interface at 2 GHz is just as good as a 256-bit interface at 1 GHz. Usually the wider bus is less power-hungry at the same bandwidth, and is therefore preferred.

Also, bus widths of 384 and 512 bits exist.
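
For concreteness, peak memory bandwidth is just bus width times effective memory clock. A minimal sketch of that arithmetic in Python (the card configurations are illustrative assumptions, not figures from the article, though a 512-bit bus at an effective 5 GHz does land in the ">300GB/s" class mentioned in the summary):

    # Rough peak-memory-bandwidth arithmetic: bus width (bits) times effective
    # memory clock (transfers per second), converted to GB/s. Illustrative only.

    def peak_bandwidth_gbps(bus_width_bits: int, effective_clock_mhz: float) -> float:
        """Return theoretical peak memory bandwidth in GB/s."""
        bytes_per_transfer = bus_width_bits / 8          # bits -> bytes per transfer
        transfers_per_sec = effective_clock_mhz * 1e6    # MHz -> transfers/s
        return bytes_per_transfer * transfers_per_sec / 1e9

    # The parent's example: a 128-bit bus at 2 GHz equals a 256-bit bus at 1 GHz.
    print(peak_bandwidth_gbps(128, 2000))   # 32.0 GB/s
    print(peak_bandwidth_gbps(256, 1000))   # 32.0 GB/s
    print(peak_bandwidth_gbps(512, 5000))   # 320.0 GB/s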

Re:Ignore numbers (0)

Anonymous Coward | about a year ago | (#44955871)

You forgot 384-bit and 512-bit (this new Radeon card with 4GB of GDDR5 seems to be 512-bit), and "dual" cards with two GPUs and twice the RAM (2×384-bit and 2×512-bit, probably).

Now I feel like a fool (1)

Billly Gates (198444) | about a year ago | (#44954999)

For paying $225 a month ago for an ATI 7850 with just 1 gig of ram.

Re:Now I feel like a fool (2)

alvinrod (889928) | about a year ago | (#44955105)

Where did you buy it from that you paid that much? I bought a 2GB 7850 over a year ago for around that price.

Re:Now I feel like a fool (-1)

Anonymous Coward | about a year ago | (#44955223)

TigerDirect

Re:Now I feel like a fool (-1)

Anonymous Coward | about a year ago | (#44955319)

should have done newegg

Re:Now I feel like a fool (-1)

Anonymous Coward | about a year ago | (#44955407)

yea then you could have paid more + sales tax

Re:Now I feel like a fool (0, Funny)

Anonymous Coward | about a year ago | (#44955167)

I paid like 220 for a 7950 with 3 gigs 6 months ago, but there's always some retard willing to flush their money away on the first handful of beans they see

dumbass

Re:Now I feel like a fool (1)

pspahn (1175617) | about a year ago | (#44955359)

Maybe one day we will be allowed to upgrade the RAM on these cards, so that going from 1GB to 2GB doesn't cost you another $60.

Re:Now I feel like a fool (1)

Hamsterdan (815291) | about a year ago | (#44955489)

Like we could do with ISA, VLB, and PCI cards around 20 years ago (even onboard graphics and some sound cards).

Re:Now I feel like a fool (1)

Anonymous Coward | about a year ago | (#44955531)

At GDDR5 speeds the memory has to be soldered to the GPU board; you could have removable memory, but only at DDR3 speeds, which are several times slower.

On the other hand, if you're good with soldiering and have a few million dollars in soldiering equipment, you can just soldier additional RAM onto the board yourself, so expanding is still possible for some.

Re:Now I feel like a fool (2)

haruchai (17472) | about a year ago | (#44955953)

Keep soldiering on, AC

Re:Now I feel like a fool (1)

Anonymous Coward | about a year ago | (#44956035)

If they are not BGA, then they are pretty easy to solder with at most a bit of copper braid and maybe a flux pen. BGA requires more equipment, like a toaster oven and some solder paste or flux that might cost you a little more than the generic stuff. The hardest part will be getting the RAM chips in small quantities and if something has to be done besides just putting the chips on the board. The soldering of even tiny surface mount stuff is easily in the reach of most geeks with any reasonable level of hand tool use.

Re:Now I feel like a fool (1)

LordLimecat (1103839) | about a year ago | (#44955585)

That would not be possible, considering no one sells graphics RAM for consumer installation.

Re:Now I feel like a fool (1)

armanox (826486) | about a year ago | (#44955657)

Wow, I was looking at picking up a Radeon HD 7870 for $150 at Microcenter....

Re:Now I feel like a fool (0)

Anonymous Coward | about a year ago | (#44956595)

You paid almost $100 too much. I bought one for $139. You can even buy a 7950 for $205...

Schlameel, Schlamazel (1)

slick7 (1703596) | about a year ago | (#44955001)

What happened to the Radeon R4, R6, R8?

Re:Schlameel, Schlamazel (1)

cheater512 (783349) | about a year ago | (#44955269)

Or the R1 for that matter?

Re:Schlameel, Schlamazel (5, Funny)

SJHillman (1966756) | about a year ago | (#44955325)

The same thing that happened to the Intel i2, i4 and i6 processors.

Re:Schlameel, Schlamazel (1)

Ralph Ostrander (2846785) | about a year ago | (#44956407)

Suckers I am holding out for the i10.

Maybe time for an upgrade? (2)

Fwipp (1473271) | about a year ago | (#44955035)

Wow, that $89 R5 actually looks surprisingly attractive. If the benchmarks hold up, I might think about replacing the old power-hungry card I've got in my main desktop machine right now - I'd probably save energy and get better performance to boot.

Re:Maybe time for an upgrade? (0)

Anonymous Coward | about a year ago | (#44956543)

I did look at it, as I'd like to replace my 5-year-old Radeon 4850, but judging by the specs of the R5 M200, my 5-year-old card annihilates it!

Radeon 4850 vs R5 M200:
800 vs 384 shading units
40 vs 24 TMUs
16 vs 8 ROPs
10 vs 5.2 GPixel/s
25 vs 15.6 GTexel/s
1000 vs 499.2 GFlops
63.6 vs 14.4 GB/s memory bandwidth (256-bit vs 64-bit) ...
Sure, it's probably very low power but unless your machine is *very* outdated it's gonna be a big downgrade.

The $139 R7 260X seems like something worth upgrading to, but then again, it's a matter of price:performance ratio compared to existing offerings like the Radeon 7790 (both fall a little short of 4000 points in 3DMark Fire Strike), which you can buy today for $120, or the Radeon 7850, which seems a little faster still (a hair over 4000 points) and which can also be had for $139 today. The $199 R9 270X scores about 5500, whereas the existing 7950 (which scores about 6000) can be had for $205 today... Meh. Nothing to see here...
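
Taking the rough Fire Strike scores and street prices quoted above at face value, the points-per-dollar comparison is easy to sanity-check. A small sketch, assuming the parent's estimates (these are not independently measured benchmarks):

    # Points-per-dollar using the rough 3DMark Fire Strike scores and prices
    # quoted in the parent comment (reader estimates, not measured results).

    cards = {
        "Radeon 7790": (3900, 120),   # "a little short of 4000 points", $120
        "R7 260X":     (3900, 139),   # also "a little short of 4000", $139
        "Radeon 7850": (4100, 139),   # "a hair over 4000 points", $139
        "R9 270X":     (5500, 199),
        "Radeon 7950": (6000, 205),
    }

    for name, (score, price) in sorted(cards.items(),
                                       key=lambda kv: kv[1][0] / kv[1][1],
                                       reverse=True):
        print(f"{name:12s} {score / price:5.1f} points per dollar")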

And then, $40! (0)

LordMyren (15499) | about a year ago | (#44955049)

And once one has started playing BF4 ($60 value), one can either pay for DLC individually or spend $40 more for Premium.

I'm pretty displeased BF4 is a $100 game.

Re:And then, $40! (1)

Anonymous Coward | about a year ago | (#44955109)

Or, you know... not buy the DLC if you don't think it is worth the money.

That's what I do. Not buy things I don't think are worth buying. Always worked out well so far.

without decent drivers (2, Insightful)

epyT-R (613989) | about a year ago | (#44955053)

that work reliably for more than the current crop of just-released games, I don't care how much faster these chips are. I've had too many glitches with Radeon drivers over the years to consider them again. Their OpenGL is horrible, and CCC is a bloated POS.

Re:without decent drivers (3, Informative)

LesFerg (452838) | about a year ago | (#44955145)

Yeah I feel the same way about their driver support, couldn't trust them with too much of my limited gaming hardware budget.
Also, would it be really really difficult for them to hire some decent programmers and produce a new version of Catalyst control center that doesn't have to run on .Net?
Whatever happened to C++ and fast reliable software?

Re:without decent drivers (1, Informative)

epyT-R (613989) | about a year ago | (#44955833)

What happened? Point-and-stick software 'development.' Visual Basic on steroids (.NET), and huge interpreted runtimes (Python/PHP/Ruby/.NET/ad nauseam) being used to write programs that could be done in a few dozen lines of C or shellcode.

This disease is everywhere. Basic system software should have as few dependencies as possible. GNU land suffers from this too. Honestly if CCC was the only problem, I could live with it.

Re:without decent drivers (1)

Nemyst (1383049) | about a year ago | (#44956421)

Interfaces in a few dozen lines of C? Hahahaha.

Re:without decent drivers (5, Insightful)

MojoMagic (669271) | about a year ago | (#44956819)

A couple of problems with this statement:
- .Net is not a programming language. Your comparison is just silly.
- In case you meant to refer to C#, no part of this development process is "point-and-click". In this regard, it is no different to C++ (I develop in both).
- It is not interpreted. Nor has it ever been.
- I think you'll find that the simple programs of "a few dozen lines" that you mention would likely be smaller (in lines of code) in C# than in C++. But, again, this is a silly metric and shouldn't be used in any reasonable comparison. If things like this are a problem, you are just using the wrong libraries; in most cases it has little to do with the language directly.

Re:without decent drivers (1)

Anonymous Coward | about a year ago | (#44956907)

The idea that these platforms and languages (Python/PHP/Ruby/.NET/Java) provide "point-and-stick software development" is fucking retarded; anyone who knows even the slightest bit about them knows immediately that such a statement is objectively false. But that very thing is often said by the sort of people who have no understanding of them, through choosing to ignore them or genuine inability to comprehend them.

Frankly the runtimes are so lightweight and efficient that if you can't manage to write a GUI control panel in, say, .NET that performs as well as native, then you are just a shit programmer. But of course people who think the programming world begins and ends with C, and have no understanding of anything else, will immediately blame the language or the tools rather than admit they need to actually learn more.

Re:without decent drivers (1)

sharklasers (3047537) | about a year ago | (#44956811)

Whatever happened to C++ and fast reliable software?

The path of least resistance. Given the amount of computing power available these days, there's little incentive to write clean, efficient, snappy code requiring little in the way of dependencies if it results in longer development times. Heck, what are people gonna do, not buy an AMD/NVIDIA card just because of its control panel? That doesn't happen - people grit their teeth, install the .NET dependencies and live with it, because no-one's gonna boycott cards because of it - and the companies know this. So there's no motivation to code better.

I have no problem with modern toolkits and IDEs making it easier to produce something quickly compared to traditional programming tools. But it also makes it much easier to write rubbish - quickly-developed rubbish that does the essentials with a lot of overhead. Such is the nature of this industry.

Re:without decent drivers (0)

Anonymous Coward | about a year ago | (#44955191)

I have been using them for the last 3-4 cards with no issues whatsoever, but then again I run an OS that isn't spearheaded by a manchild and a bunch of basement dwellers.

Re:without decent drivers (1)

epyT-R (613989) | about a year ago | (#44955791)

Well, my experience with Radeon cards is in Windows, for the most part. Really, it doesn't matter, because the fglrx drivers for X11 aren't any better. So unless Windows was developed by basement dwellers, your assumption is incorrect.

Re:without decent drivers (1)

happymellon (927696) | about a year ago | (#44956775)

I think you are both saying the same thing.

Ballmer == Manchild.

Re:without decent drivers (1)

DigiShaman (671371) | about a year ago | (#44955225)

Wish I had mod points for you. This is absolutely true. Going as far back as 2003, the CCC was bloated, buggy, and aggravating to install. Particularly annoying was that .NET was required. That meant having to install updates and .NET in default VGA mode first before you could load the drivers on a fresh format/reinstall of XP. Not sure if that's still the case though. Yuck!

You can have the best GPU in the world. It doesn't do much good without quality drivers however.

Re:without decent drivers (2)

Billly Gates (198444) | about a year ago | (#44955295)

NVidia drivers tend to be worse nowadays.

I hear them on here saying how crappy Windows 7 is because Aero brings their GTX 680s to a crawl. Funny, my parents' Intel GMA 950 integrated 2007-era graphics run Aero fine. Again, driver issues.

I've only had one bug with my ATI drivers, and if you base your data on 10 years ago then it is obsolete.

Re:without decent drivers (2)

epyT-R (613989) | about a year ago | (#44955771)

I based it starting 10 years ago... actually it starts with the ATI Rage 128, which came out in '98(?), through to the Radeon 5000 series. That Rage 128 used to BSOD Windows on a regular basis with OpenGL applications (e.g. Quake 2). Years later, a litany of broken scenes, kernel panics due to unhandled exceptions (HANDLE YOUR DAMNED EXCEPTIONS!), tearing in video playback, completely broken support in non-game accelerated applications, etc., have kept me far far away. There's a reason Adobe, Autodesk et al. (and even the demoscene) stick with Nvidia even though they want to move to OpenCL. It's not due to kickbacks, it's because their drivers work (for the most part).

I've had to service a lot of machines over the years (don't do it anymore thankfully), and when the owner complained of video problems, more often than not, there was a radeon in the machine.

I realize this is anecdotal, but my 770 (which is a 680 with higher clocks and faster ram) works fine with the 326.80 and 327.xx drivers. I do know some were having some issues with post 314.x drivers, but I didn't run into any.

Re:without decent drivers (0)

Anonymous Coward | about a year ago | (#44956069)

With any new generation of products from hardware companies, there is enough of a chance to drop the ball and screw things up, or to get lucky and do better (or just not screw up as bad as the competitor). Anecdotes and grudges from 10 years ago are rarely relevant.

Re:without decent drivers (1)

epyT-R (613989) | about a year ago | (#44956121)

You lack reading comprehension. Please reread.

Re:without decent drivers (2)

Mashiki (184564) | about a year ago | (#44956917)

I do know some were having some issues with post 314.x drivers, but I didn't run into any.

Some? Anything post 290.x has been complete crap on 400-600 series cards. At best they might be stable; at worst you're going to see amazing hardlocks which require a complete powerdown to fix. The last time I looked on the Nvidia forums there was a thread on this issue with nearly 140k views. Funnily enough, it was the bad drivers that broke me; I dumped my 560 Ti for a 7950 and have no complaints about doing so.

Bitcoin? (0)

Anonymous Coward | about a year ago | (#44955073)

The real question is, how many bitcoins a day can these things mine?

Re:Bitcoin? (-1)

Anonymous Coward | about a year ago | (#44955091)

Enough to buy a night with your whore mom.

Re:Bitcoin? (1)

Anonymous Coward | about a year ago | (#44955205)

GPU mining is over.

Re:Bitcoin? (2)

Agent ME (1411269) | about a year ago | (#44955231)

Mining bitcoins with GPUs is no longer profitable for the most part. Most profitable miners today use hardware designed specifically for bitcoin mining.
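
The economics behind that shift are straightforward to sketch: daily revenue scales with hash rate, while electricity cost scales with power draw. All of the figures below (hash rates, revenue per GH/s, power draw, electricity price) are made-up placeholders to show the shape of the calculation, not real rates:

    # Rough daily mining profit: revenue from hashing minus electricity cost.
    # Every number here is an illustrative placeholder, not a real market rate.

    def daily_profit_usd(hashrate_ghs: float, usd_per_ghs_day: float,
                         power_watts: float, usd_per_kwh: float) -> float:
        revenue = hashrate_ghs * usd_per_ghs_day
        electricity = power_watts / 1000.0 * 24.0 * usd_per_kwh
        return revenue - electricity

    # A GPU (well under 1 GH/s at ~200 W) vs. an ASIC (tens of GH/s):
    print(daily_profit_usd(0.7, 0.05, 200, 0.12))    # GPU: comes out negative
    print(daily_profit_usd(60.0, 0.05, 350, 0.12))   # ASIC: comfortably positive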

Re:Bitcoin? (0)

Anonymous Coward | about a year ago | (#44955893)

Not sure, but I think the odds of finding bitcoins have dropped (the difficulty keeps going up), so GPUs are no longer cost-effective.

Mantle API (5, Interesting)

LordMyren (15499) | about a year ago | (#44955079)

Personally I would've gone for a mention of Mantle, the proprietary API they are introducing that sidesteps OpenGL and DirectX. I don't really know what it does yet - I haven't found good coverage - but DICE's Battlefield 4 is mentioned as using it, and the description I've read said it enables a much higher rate of draw calls.

http://www.xbitlabs.com/news/graphics/display/20130924210043_AMD_Unveils_Next_Generation_Radeon_R9_290X_Graphics_Card.html [xbitlabs.com]

Re:Mantle API (1)

Anonymous Coward | about a year ago | (#44955147)

That is really what we don't want though - a return to the bad old days of game developers having to write code specifically for each vendor's cards. As much as some people love to hate on them, standards such as OpenGL (and for Windows DirectX) make it so that the game developer doesn't have to keep writing modules to support video cards and other video APIs.

Re:Mantle API (1)

blahplusplus (757119) | about a year ago | (#44956263)

"That is really what we don't want though - a return to the bad old days of game developers having to write code specifically for each vendor's cards"

It doesn't really matter since there are only two video card vendors now, and the pace of graphics card innovation has slowed to a crawl. Not only that, a vendor API would not prevent anyone from shipping DirectX and OpenGL executables, the same way many games have DX9 and DX10 exes.

More importantly, EA is big enough to afford to do native API + OGL + DirectX if they wanted to. I don't see the problem.

Re:Mantle API (2)

tibman (623933) | about a year ago | (#44955367)

Windows only (for now), blah. Still really exciting though! I remember Glide being pretty awesome back in the day. It's funny that NVIDIA bought 3dfx and got Glide, but it is AMD that built a new low-level API. NVIDIA's NVAPI doesn't seem like an OpenGL or DirectX replacement but a helper of sorts for managing all kinds of stuff on the card.

Curious about stability (5, Interesting)

PhrostyMcByte (589271) | about a year ago | (#44955667)

The idea is that operating systems introduce a huge amount of overhead in the name of security. Being general purpose, they view their primary role as protecting all the other apps from your unstable app. And, let's face it, even AAA games these days are plagued with issues -- I'm really not sure I want games to have low-level access to my system. Going back to the days of Windows 98's frequent bluescreens isn't on my must-have list of features.

John Carmack has been complaining about this for years, saying this puts PCs at such a tremendous disadvantage that consoles were able to run circles around PCs when it came to raw draw calls until eventually they simply brute-forced their way past the problem.

Graphics APIs have largely gone a route that encourages keeping data and processing out of the OS. That's definitely the right call, but there are always things you'll need to touch the CPU for. I'm curious exactly how much of a benefit we'll see in modern games.
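
As a back-of-envelope illustration of why per-draw-call CPU overhead matters, here is a rough sketch of how much of a 60 fps frame budget gets eaten just by issuing draw calls. The per-call costs are assumed round numbers, not measurements of any particular API:

    # Share of a 60 fps frame budget spent issuing draw calls, given an
    # assumed CPU cost per call. The overhead figures are illustrative only.

    FRAME_BUDGET_MS = 1000.0 / 60.0   # ~16.7 ms per frame at 60 fps

    def draw_call_cost_ms(num_calls: int, overhead_us_per_call: float) -> float:
        return num_calls * overhead_us_per_call / 1000.0

    for api, overhead_us in [("high-overhead API (assumed)", 10.0),
                             ("thin API (assumed)", 1.0)]:
        for calls in (2000, 10000):
            cost = draw_call_cost_ms(calls, overhead_us)
            share = 100.0 * cost / FRAME_BUDGET_MS
            print(f"{api}: {calls} calls -> {cost:.1f} ms "
                  f"({share:.0f}% of a 60 fps frame)")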

Re:Curious about stability (5, Interesting)

epyT-R (613989) | about a year ago | (#44956021)

1. Today's consoles also run protected-mode (or architecture-specific equivalent) operating systems. The userland-to-kernel-to-hardware latencies are still present.

2. You're complaining about games? Today's operating systems are hardly any better off. There is no way the vendors can vouch for the security of 10GB worth of libraries and executables in Windows 7 or OS X. The same is true for OSS. Best practice is to just assume every application and system you're using is compromised or compromisable and mitigate accordingly.

3. IIRC that particular Carmack commentary was done to hype up the new-gen systems. It's largely bogus. I'm sure the latencies between the Intel on-die HD 5000 GPU and CPU are lower, but that doesn't mean it's going to perform better overall. The same goes for the AMD Fusion chips used in the new consoles. They're powerful for their size and power draw, but they will not outperform current gaming PC rigs.

Quick situation check (0)

Anonymous Coward | about a year ago | (#44955093)

At last check (not that long ago)

- Catalyst control centre still crashed periodically
- Linux drivers were slow (I don't give a phuck about open source or not, I want my driver to work, like nvidia's do).

I'd like to consider ATI for my next card, but I've been consistently deterred from doing so by crappy drivers. Nvidia's "just work".

What is the situation like now?

Re:Quick situation check (1)

armanox (826486) | about a year ago | (#44955679)

I can play DOTA2 and L4D2 on Linux with a Radeon HD 7750 with about the same FPS the card would give me in Windows, if that would help. Only other ATI card I've had recently is a Radeon HD 5770, and that does about the same as the 7750 (slightly better or worse depending on the application - OpenCL performance seems better on the 7750, and the 7750 also supports DP FP).

Re:Quick situation check (0)

Anonymous Coward | about a year ago | (#44956147)

gallium drivers are getting better, but still have issues.. fglrx drivers are still shit.. CCC is still garbage.

Re:Quick situation check (1)

dltaylor (7510) | about a year ago | (#44956281)

My 6850 also "just works", CCC and all, in Debian 7 (amd64).

Haven't checked the supported cards list, lately, to see what newer card works.

How does it perform hash-wise? (1)

terbeaux (2579575) | about a year ago | (#44955125)

I can't find any information on the scrypt hash rate of these cards. Does anybody have any info? Thanks

Bigger question is what's going on with Nvidia (1)

Anonymous Coward | about a year ago | (#44955131)

Nvidia has let two product generations slip. Their latest cards are re-badges of the previous gen, with prices shifted down one SKU. Their top end became the 2nd-fastest card, and a stripped-down version of the Titan became the top end. (The Titan isn't a mainstream product. It's a re-marked version of their workstation/server product sold in the consumer space, much like the relationship between Intel's Xeon and the "Extreme Edition" CPUs.) ... And that was last gen. This current gen, the one that AMD is releasing now, has slipped. They aren't releasing anything outside maybe a few SKU tweaks/re-badges. Probably some price cuts too.

Re:Bigger question is what's going on with Nvidia (1)

epyT-R (613989) | about a year ago | (#44956157)

They haven't needed to. The 800 series is out next year iirc. Internet rumor mills say it is a real architecture change.

'MANTLE' was the game-changing announcement (2, Interesting)

Anonymous Coward | about a year ago | (#44955177)

AMD has totally ruined the future of Nvidia and Intel in the AAA/console-port gaming space. Working with partners at EA/DICE, AMD has created a 'to-the-metal' API for GPU programming on the PS4, Xbox One, and any PC (Linux or Windows) with AMD's GCN technology. GCN is the AMD architecture in 7000 series cards, 8000 series, and the coming new discrete GPU cards later this year and onwards into the foreseeable future. It is also the GPU design in all future AMD CPUs with integrated graphics.

GCN is *not* the architecture of any Intel or Nvidia products, either now or in the future. Nvidia and Intel will be stuck with only the OpenGL or DirectX versions of games, and these versions will be much slower/feature-incomplete compared to 'Mantle' versions ported from the consoles.

OpenGL and DirectX are OBSOLETE methods of controlling rendering for future AAA games. Both of these APIs/drivers have massive state overheads, and can never be made efficient for the mixed rendering/GPGPU methods required for the new games engines of late 2013 and later.

While some nerds with a better memory than brainpower will dribble about 'Glide' (the proprietary API from now defunct 3DFX), the correct comparison is x86 vs 68000 (the Motorola CPU design). GCN is actually an ISA (instruction set architecture) like the x86 ISA. Intel and Nvidia are like TI and Motorola at the time of emerging competing 16-bit CPU designs that finally led to the dominance of x64. Nvidia is Motorola. Intel is TI. TI's 16-bit CPU designs were no-hopers. Motorola was widely seen as superior to Intel at the time. When Intel was chosen for the first PC, it was game-over for Motorola.

Using OpenGL or DirectX to 'program' a modern GPU is like using Fortran to program the CPU. Using 'Mantle' on the other hand is like using 'C'. However, because 'Mantle' closely connects to the GCN 'metal', it is almost impossible to envisage a version of Mantle for competing GPU architectures.

Of course, ATI customers with 6000 series cards or earlier (or Zacate, Llano, or Richland APUs) are as out-of-luck as Intel and Nvidia GPU users. AMD is only supporting GCN, because older GPU designs from AMD use a different GPU ISA.

With the rise of Mantle, many console games developers are going to desire that the PC market rapidly changes to AMD only, so the ported games need have only one version- the good one. Other developers, whose games do NOT need strong GPU performance, will wish to use only OpenGL on the PC, for maximum compatibility with games in the ARM space (where OpenGL ES rules).

Any PC gamer interested in high-performance would be INSANE to buy any Nvidia product from now on. 99%+ of all AAA games will originate on the new consoles released later this year- which are 100% AMD. Most casual gamers might as well choose AMD for maximum future compatibility. Intel was never really in the game. Nvidia, on the other hand, will be cursing themselves for ever starting this war (Nvidia previously paid AAA games developers to cripple AMD performance, and attempted to leverage the proprietary PhysX physics engine).

Re:'MANTLE' was the game-changing announcement (4, Insightful)

Lawrence_Bird (67278) | about a year ago | (#44955311)

Using OpenGL or DirectX to 'program' a modern GPU is like using Fortran to program the CPU

Are you saying that OpenGL and DirectX are the fastest? Because Fortran code sure is.

Re:'MANTLE' was the game-changing announcement (4, Interesting)

Guspaz (556486) | about a year ago | (#44955485)

So, you're convinced that the slight improvement in performance brought about by a reduction of software overhead is going to completely cripple nVidia? Yeah, sure.

Even if Mantle does produce faster performance (and there's no reason to doubt that it will), the advantages will be relatively small, and about all they might cause nVidia to do is adjust their pricing slightly. There won't be anything that you'll be able to accomplish with Mantle that wasn't possible without it; such is the nature of fully programmable graphics processors.

Game publishers, for their part, will hesitate to ignore the 53% of nVidia owners in favour of the 34% AMD owners. It's highly unlikely that this will cause a repeat of the situation caused by the Radeon 9700, which scooped a big win by essentially having DirectX 9 standardized around it. In that case, ATI managed to capture significant marketshare, but more because nVidia had no competitive products on the market for a year or two after. This time around, both companies have very comparable performance, and minor differences in performance usually just result in price adjustments.

Re:'MANTLE' was the game-changing announcement (1)

Anonymous Coward | about a year ago | (#44956313)

The difference in performance will be MASSIVE when the rendering features made viable by Mantle are enabled. If these features are disabled, and the game is played in '2009' mode, Mantle will only give a small boost at best. So, if you want your game to look like crap, but play at some insane resolution with an ancient form of AA forced, you are right- the Nvidia solution will be fine.

Game publishers have a 95%+ AAA market that is AMD GCN exclusive- it is called the next-gen console market (and yes, these figures won't be true until a few years time, but this is the new unstoppable trend that all serious publishers must account for). Against the Xbox One and PS4, the PC market is a sad joke. It becomes a far less sad joke if AMD GCN gets properly established on EVERY gaming PC.

And what does Mantle solve? It solves the biggest problems with AAA graphics on the PC - namely memory management, multi-CPU-core processing of GPU-intensive tasks, and massive numbers of GPU state changes per frame. DirectX and OpenGL are horrid jokes, designed for single-threaded CPU work driving simple workloads on a GPU. As these ancient driver models/APIs are forced to work with large numbers of CPU cores and GPUs that need to be in intimate control of their own memory resources and instruction flows, inefficiencies build faster than improvements in CPU/GPU resources.

Mantle allows CPU and GPU operations that are just NOT worth attempting under DirectX/OpenGL. Here we are talking about performance factors that may be 10-100 times faster. A Mantle method could thus easily have a game running at unplayable frame rates if emulated under DirectX.

With existing engines like Unreal or those from Crytek or DICE, Mantle would, as I said, allow for features that could be enabled for the Mantle version and disabled for the 'Nvidia' version (much as Nvidia does with PhysX today). BUT the next-gen engines ported from the consoles to the PC would suffer vastly greater degradation when run in 'Nvidia' mode on the PC. This problem will be solved for the next 3 years by having the 'Nvidia' version (or DirectX version, if you will) be some form of the LAST-generation console game - in other words, the Nvidia version of the PC game would be running in 'Xbox 360' mode. The GCN version will be running in PS4 mode.

PC owners are in terrible denial about the massive architectural improvements in the PS4- they focus entirely on comparing the GPU of the console with current PC graphics cards, and completely ignore the advanced architectural overhaul the PS4 has over the ancient desktop PC. AMD wants to bring the same modern HSA/Huma design to the PC, liberating the full power of the PC hardware for the first time. Mantle is an essential part of this strategy. Without Mantle, PC owners can play games with the same crappy visuals, just at greater resolutions and framerates in the future. This is Nvidia's strategy. With Mantle (and the improvement of the PC architecture), the PC will have games that actually render ever more impressive scenes.

And here's a warning. If, for some pathetic reason, Mantle fails in the PC space, fewer and fewer AAA games will be ported to the PC, because as engines improve on the new consoles, their rendering codebases will be less and less compatible with the capabilities of the desktop PC. It will literally not be worth the effort of the publishers of games on the PS4 and Xbox One to move their most complex games to the PC. Nvidia will have killed AAA gaming on the PC dead.

The Mantle project is ESSENTIAL to ensure the best games have the best possible home on our PCs. The GCN ISA is about to become the coding language of AAA gaming algorithms, just as the x86 rose to prominence. Nvidia fanboys will whine and whine and whine and whine, but nothing they scream and dribble about will change the fact that 95%+ of future AAA games will use GCN close-to-the-metal, and therefore the tiny AAA PC gaming market must follow the same path. Nvidia fans can rest assured that casual games will be near 100% OpenGL in the future, and that some AAA games will always have vastly inferior DirectX modes of operation to allow S3 err, I mean PowerVR err, no I mean Nvidia owners to 'enjoy' the game in some form or other.

Seriously, AAA engine companies no longer want to play the Nvidia/Intel game. AAA engine companies are NOT AMD fanboys- they are fans of producing the best possible experience the hardware theoretically allows. They want their code to be unleashed. They don't want excuses. Windows is a horrible OS if you rely on Microsoft's own APIs, drivers or memory management. But Windows allows the programmer to bypass everything crappy with their own code, PROVIDING bad driver models can be eliminated.

And don't forget, Mantle is just as good on Linux as well. Mantle means the end of crappy Linux performance in games.

Re:'MANTLE' was the game-changing announcement (1)

willy_me (212994) | about a year ago | (#44956667)

Game publishers have a 95%+ AAA market that is AMD GCN exclusive- it is called the next-gen console market (and yes, these figures won't be true until a few years time, but this is the new unstoppable trend that all serious publishers must account for). Against the Xbox One and PS4, the PC market is a sad joke. It becomes a far less sad joke if AMD GCN gets properly established on EVERY gaming PC.

The new AMD flat memory model for GPU / CPU is very interesting. I assume this is what will be used in the upcoming consoles. But you have to realize that the console and PC markets are currently being crushed by the newly created portable market (Android / iOS). The larger market of iOS users, combined with the App Store model that limits piracy, has game developers making more money with their stupid little apps than they do with their much more impressive console games. The Android market is also growing and, despite being limited by piracy, will eventually catch up with regards to developer profits.

AMD looks like it's making some impressive products when it comes to GPU / CPU integration. If they could catch up to Intel fabs (now at 14nm) they would be doing great in the market. They might also bring their flat memory model to mobile ARM CPU/GPU chips which would be quite interesting. But the reality is that AMD is not going to dominate the market anytime soon. Those who think otherwise have been drinking too much AMD Koolaid. These changes take time and we don't even know how Intel / Nvidia plan on responding - and they will respond.

Re:'MANTLE' was the game-changing announcement (0)

Anonymous Coward | about a year ago | (#44956911)

Silence dirty console peasant! /r/pcmasterrace

Re:'MANTLE' was the game-changing announcement (2)

Narcocide (102829) | about a year ago | (#44955487)

I have a completely different prediction: you don't know what the fuck you're talking about. Nothing can kill OpenGL, if DirectX couldn't do it, certainly not this proprietary shit.

Re:'MANTLE' was the game-changing announcement (0)

Anonymous Coward | about a year ago | (#44955713)

Actually he is probably right. Both 3D APIs, DirectX and OpenGL, are bulky, have big overhead, and don't bring much to the table except limiting the developer. Most future application/game UIs will be made by programming directly in OpenCL, CUDA, or Mantle, with a DirectX/OpenGL "compatibility" layer loaded for slow old games. And if I were choosing an API out of those, I would choose Mantle because of the level of control and performance compared to the other two.

Re:'MANTLE' was the game-changing announcement (1)

epyT-R (613989) | about a year ago | (#44956081)

Not if you care about your game running on the majority of hardware out there. You'd support a mantle path, but not exclusively.

Re:'MANTLE' was the game-changing announcement (0)

Anonymous Coward | about a year ago | (#44956609)

If you want compatibility there is always OpenCL, which runs on AMD, NVIDIA and Intel, but it's slower than CUDA/Mantle.

Re:'MANTLE' was the game-changing announcement (2)

Khyber (864651) | about a year ago | (#44956785)

" limiting developer"

Not OpenGL. Extension not there? Just add it in. Can't do that with Direct3D.

Re:'MANTLE' was the game-changing announcement (0)

Anonymous Coward | about a year ago | (#44955505)

You've just described the death of PC gaming, I think. Extrapolate into the future where all PC gaming is done with AMD cards because there is no other option. What is the motivation for AMD to innovate? Sure, they might do so on the console, because they'll be selling huge volume to the console makers. So you'll have consoles being pretty much the same as PC and PC cards not being really worked on... Why not just buy a console and be done with it?

Re:'MANTLE' was the game-changing announcement (1)

Anonymous Coward | about a year ago | (#44955529)

Hey Charlie, still hold a grudge against Nvidia and Intel?

Re:'MANTLE' was the game-changing announcement (1)

TejWC (758299) | about a year ago | (#44955611)

Isn't nVidia's Cg very close to the metal as well? Also, you seem to be implying that making a GPL driver for Mantle should be easy, since it will just send the compiled code directly to the card. Could Linux finally be able to get "release day" open source drivers for AMD cards?

Re:'MANTLE' was the game-changing announcement (0)

Anonymous Coward | about a year ago | (#44955929)

Isn't nVidia's Cg very close to metal as well?

It is, but this is even closer. Cg is made to be compatible with many generations; this one is made without any compatibility whatsoever, and compatibility is expensive.

Also, you seem to be implying that making an GPL driver for Mantle should be easy since it will just send the compiled code directly to the card. Could Linux finally be able to get "release day" open source drivers for AMD cards?

Release day, probably not, but it would make the time difference much smaller, since most code would be cross-platform.

Re:'MANTLE' was the game-changing announcement (3, Insightful)

DarkTempes (822722) | about a year ago | (#44955643)

Mantle does sound like good news but they also said it is an open API and so I wouldn't be too worried about Nvidia...they'll just implement it themselves if it's so good.

And Nvidia has been crushing AMD/ATI in the PC market for a while (the Steam hardware survey shows 52.38% Nvidia to 33.08% AMD/ATI with 14% Intel).
Hopefully this will even things out some but I don't see it making OpenGL or DirectX obsolete.
OpenGL and DirectX have so much momentum and market share that game devs are going to have to target and support them for a while yet.

Also, until we get more solid details about Mantle we won't know how good it really is. I am cautiously optimistic but at most this will cause me to delay my next video card purchase until things shake out.

Re:'MANTLE' was the game-changing announcement (1)

epyT-R (613989) | about a year ago | (#44956049)

So, how much is AMD paying you? I'd like to supplement my income... OpenGL and D3D aren't going anywhere for the immediate future. We went down this vendor-API route with Glide, and while it did run well, it created support issues for the consumer that fragmented the market and made it difficult to make money selling GPUs. It would be nice, however, to see better support parity between the vendors' shader compilers.

Re:'MANTLE' was the game-changing announcement (0)

Anonymous Coward | about a year ago | (#44956333)

The main benefit they touted was lowering the CPU load, which is interesting because the CPU is rarely the bottleneck. Also notice that AMD's CPUs are far behind Intel's, and they actually have to benchmark the flagship GPUs with Intel CPUs because otherwise the AMD CPU would be the bottleneck.

3dmark firestrike performance? (0)

Anonymous Coward | about a year ago | (#44955199)

this firestrike? [hardware.info]

New Family, My Ass (4, Informative)

Khyber (864651) | about a year ago | (#44955241)

Re:New Family, My Ass (1)

Pinhedd (1661735) | about a year ago | (#44955431)

Low-end models in each family are almost always a rebadge of the high-end models from the previous family. This has been the case for a very long time. It allows the manufacturer to better move inventory that would otherwise be unsold.

Re:New Family, My Ass (1)

AbRASiON (589899) | about a year ago | (#44955609)

When you say "a very long time" I assume you only mean a generation or two? Because this tacky shit wasn't done by any of them a while back. Each model number in a 3xxx series was based on the 3xxx tech, or the 2xxx, or whatever. Now, as you state, the top-end part in a series is new while the mid-range parts in the new series are old.

It's deceptive.

Re:New Family, My Ass (3, Insightful)

armanox (826486) | about a year ago | (#44955731)

nVidia is famous for rebadging. I'll give an example: the Geforce 8800GTX became the 9800 GTX, and then the GTS 250.

ATI on the other hand, has followed a different pattern. All cards of a series (HD 2xxx, 3xxx, 4xxx, etc) are based on the same tech. The 6xxx series cards were tuned versions of the 5xxx cards, and I think what's happening is the new R-series cards are tuned versions of the 7xxx series. nVidia does this with their cards now too - the Fermi family (4xx and 5xx) and Kepler family (6xx and 7xx) introduce a chip in the first gen, and refine that chip in the second.

Re:New Family, My Ass (0)

Anonymous Coward | about a year ago | (#44956231)

Ever looked at their Mobile GPU range?

AMD is full of rebadges.

Re:New Family, My Ass (0)

Anonymous Coward | about a year ago | (#44956575)

It also happened 10 years ago, not just " a generation or two" ago. In 2003 the Radeon 8500LE was rebadged as the Radeon 9100.

Re:New Family, My Ass (0)

Anonymous Coward | about a year ago | (#44956107)

Low-end models in each family are almost always a rebadge of the high-end models from the previous family

Try the same card in some cases. And I mean literally the same card. e.g. the 5770 and 6770. The only thing that changed was the first numeral.

Re:New Family, My Ass (1)

TubeSteak (669689) | about a year ago | (#44955889)

They also seem to be saying that the flagship R9 290X is going to be based on the new technology.

a change is gonna come (1)

enzo1 (931050) | about a year ago | (#44955483)

I am starting to distrust the Radeon brand. Look at their decrepit website—clearly AMD is under poor management. They are a near–penny stock and haven't even thought to improve their marketing image. I've always bought Radeon cards, but an idiot could see that the GPU market is ready for a landscape change.

Re:a change is gonna come (1)

PixetaledPikachu (1007305) | about a year ago | (#44956479)

Did you by chance open radeon.com? Go to amd.com instead.

At the risk of being labeled troll (1)

rsilvergun (571051) | about a year ago | (#44955645)

I'll ask what /.ers think of the stability of low end ATI hardware. I've heard once you get into the $250 range it's fine, but everything I've tried below $130 has crashed hard on everything except the biggest titles :(. I miss my super stable 1650...

Re:At the risk of being labeled troll (1)

Khyber (864651) | about a year ago | (#44956189)

I never had a problem with my crap of the line HD4200. Sure, it's not going to run the latest and greatest at any respectable frame rate, but hey, it worked and didn't die.

Re:At the risk of being labeled troll (1)

bemymonkey (1244086) | about a year ago | (#44956449)

I'm currently running a 7750 (my criteria were basically SC2 and CS:GO at a solid fps-capped 128FPS, so anything more would have been overkill) and haven't had any issues whatsoever. Rock-solid cheap card...

Re:At the risk of being labeled troll (1)

PixetaledPikachu (1007305) | about a year ago | (#44956459)

The most recent AMD card that I have is the Radeon 5650, embedded in my Vaio E-Series. It certainly won't win any speed contest, but it can comfortably run MOH, BF3, Skyrim, Deus Ex: Human Revolution, and Dragon Age 2 on Windows; TF2, Dota 2, and Strike Suit Zero from Steam's Linux library; and finally Street Fighter x Tekken under Wine on Ubuntu. No crashes whatsoever, at least nothing GPU-related. I started with the ATI Rage Pro and went Nvidia for several generations (Riva TNT2, Ti 4200, and 6xxx). I returned to ATI with the X300, just in time for ATI's first Linux binary driver. I'm not really an avid gamer and any graphics chip will suit my daily usage scenario, so I decided to support them; my next two cards were AMD's 3470 (in a Toshiba M300) and the aforementioned 5650. I have run GTA III, Need for Speed (Most Wanted, Carbon), Fallout 3, Super Street Fighter IV, and many other things, and the AMD cards have been the least of my problems.

Today I learned (0)

Anonymous Coward | about a year ago | (#44956573)

that AMD still exists.

Re:Today I learned (3, Insightful)

Khyber (864651) | about a year ago | (#44956795)

You mean today you just crawled out of your hole, considering AMD has all three consoles, and they're about to bring a brand-new graphics architecture to the table.

AMD = case study in good engineering, bad biz dev (1)

hyperfl0w (2429120) | about a year ago | (#44956875)

Recall when AMD first had 64-bit support and Intel was still doing *EMULATED* 32-bit. AMD was hands down the best option. Either AMD biz dev was bad, or Intel biz dev was good (illegal?), or both. I suspect it was both. Having one CPU maker is bad for everyone. Long live AMD, I guess.