
AMD Unveils Preliminary Radeon HD 8000M Series Mobile GPU Details

Unknown Lamer posted about 2 years ago | from the faster-better-includes-integrated-egg-cooker dept.

AMD 51

MojoKid writes "AMD has just released some preliminary information regarding the company's upcoming Radeon HD 8000M series of mobile GPUs. Based on the naming convention alone, it may be obvious that the Radeon HD 8000M series is AMD's second generation of products featuring the GCN (Graphics Core Next) architecture, which debuted in the Radeon HD 7000 series. Like its predecessors, the Radeon HD 8000M series targets gamers with full DirectX 11.1 support and improved gaming performance over the previous generation, but the architecture also lends itself to GPU compute applications. The Radeon HD 8500M sports 384 Stream Processors with an Engine Clock of up to 650MHz. Memory clocks will vary based on the use of GDDR3 or GDDR5 memory. The Radeon HD 8600M is essentially the same, but with a slightly higher Engine Clock of up to 775MHz. The Radeon HD 8700M is also based on the same GPU, but will be clocked at up to 850MHz for a further increase in performance over the 8600M. The Radeon HD 8800M series, however, is based on a larger, more powerful chip and will sport 640 Stream Processors with an Engine Clock of up to 700MHz. GDDR5 memory will be used exclusively with the 8800M, at speeds up to 1125MHz. It will be interesting to see how these new GPUs stack up versus NVIDIA's latest GeForce 600M series of mobile chips."
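For a rough sense of scale, GCN's theoretical single-precision throughput is usually quoted as 2 FLOPs (one fused multiply-add) per stream processor per clock; a back-of-envelope sketch using the clocks and stream-processor counts in the summary (the helper name is just illustrative):

```python
# Back-of-envelope theoretical single-precision throughput for the parts
# in the summary, assuming GCN's usual 2 FLOPs (one fused multiply-add)
# per stream processor per clock.
def gflops(stream_processors, clock_mhz):
    return stream_processors * 2 * clock_mhz / 1000.0

parts = [
    ("HD 8500M", 384, 650),
    ("HD 8600M", 384, 775),
    ("HD 8700M", 384, 850),
    ("HD 8800M", 640, 700),
]
for name, sps, mhz in parts:
    print(f"{name}: ~{gflops(sps, mhz):.0f} GFLOPS")
```

Real-world gaming performance will of course also depend on memory bandwidth and drivers, not just raw ALU throughput.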


Onewholeinternets (1)

Anonymous Coward | about 2 years ago | (#42321589)

I wonder if these new cards are gonna be any better mining those bitcoins even after the reward halving...


Re:Onewholeinternets (1)

Anonymous Coward | about 2 years ago | (#42321625)

Fool's game.

Re:Onewholeinternets (-1)

Anonymous Coward | about 2 years ago | (#42322167)

you're a fools game

Re:Onewholeinternets (1)

GarretSidzaka (1417217) | about 2 years ago | (#42322595)

I burned out a graphics card mining. Literally. There was a tiny pool of copper in the middle of a scorch mark on the back of the card, where it was facing up. An entire lil logic chip exploded into fire and liquid copper. Roasted the PSU and the mobo too. Suffice it to say I summarily informed my wife that I would be ceasing bitcoin mining, immediately :P

Re:Onewholeinternets (3, Funny)

Black LED (1957016) | about 2 years ago | (#42322723)

That's what you get for wiring explosives to your PC.

Re:Onewholeinternets (1)

Pinhedd (1661735) | about 2 years ago | (#42323869)

That was solder, not copper. Copper melts at over 1000 degrees centigrade.

Re:Onewholeinternets (0)

Anonymous Coward | about 2 years ago | (#42328553)

It had a distinct orangish hue to the metal puddle. That's why I think it's copper. When this thing burned, it filled my entire house with smoke too.

Re:Onewholeinternets (0)

Anonymous Coward | about 2 years ago | (#42330259)

That was probably from the rosin.

Re:Onewholeinternets (0)

Anonymous Coward | about 2 years ago | (#42322567)

Not enough better to beat out ASICs.

Wait... what? (1)

Dexter Herbivore (1322345) | about 2 years ago | (#42322923)

Did I just read a press release?

Re:Onewholeinternets (1)

slashmydots (2189826) | about 2 years ago | (#42325019)

Nope! All GPUs are dead and buried the second ASICs come out. A 220-watt 5830 can do 330MH/s overclocked to the max. The new Jalapeno ASIC, for about $149, can do 3,500MH/s (over 10x faster) at 2.5 watts.
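Taking those quoted numbers at face value, the efficiency gap is easy to sketch (figures are the ones from the comment above; the helper name is just illustrative):

```python
# Efficiency comparison using the quoted figures: 330 MH/s at 220 W for
# an overclocked HD 5830 vs. a claimed 3,500 MH/s at 2.5 W for the
# Jalapeno ASIC.
def mhash_per_watt(mhs, watts):
    return mhs / watts

gpu = mhash_per_watt(330, 220)
asic = mhash_per_watt(3500, 2.5)
print(f"GPU:  {gpu:.1f} MH/s per watt")
print(f"ASIC: {asic:.1f} MH/s per watt")
print(f"ASIC is roughly {asic / gpu:.0f}x more efficient per watt")
```

Raw speed aside, it is the per-watt gap that makes GPU mining uneconomical once ASICs ship, since electricity is the dominant running cost.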

Re:Onewholeinternets (1)

Lennie (16154) | about 2 years ago | (#42332365)

And the value of bitcoins will drop like a brick when these devices get on the market? And continue dropping until the price of running such a device is higher than the bitcoins it creates?

First Post (-1)

Anonymous Coward | about 2 years ago | (#42321649)


Re:First Post (2, Funny)

Anonymous Coward | about 2 years ago | (#42321659)

Sorry, you have failed. HAND.

Wait, why am I replying to myself?

Re:First Post (-1)

Anonymous Coward | about 2 years ago | (#42322881)

Never mind, I cancel that comment.

That's nothing (4, Funny)

(311775) | about 2 years ago | (#42321881)

I have 350 heads on a 305 engine and a Nikon D3200 with an 18–55 with a new DX-format CMOS that can do 3.4 FPS while I chat over IAX using G.729.

Re:That's nothing (1)

noelhenson (691861) | about 2 years ago | (#42324049)

I wish that I'd thought of that. LMAO!

Linux drivers? (0, Insightful)

Anonymous Coward | about 2 years ago | (#42321889)

I dumped them after they obsoleted a perfectly good card by deciding they would no longer support it for Linux. ATI -- kiss my ass.

Re:Linux drivers? (1)

Anonymous Coward | about 2 years ago | (#42322035)

They only dropped old cards - ones which probably have better support from the open drivers anyway. Seriously, did you try them? You may have binned a perfectly good card for no reason.

Re:Linux drivers? (5, Informative)

corychristison (951993) | about 2 years ago | (#42323009)

As the other AC has pointed out, AMD only dropped support for older cards. Honestly, the open source drivers are great.

My HTPC is a few years old; it has an onboard Radeon HD 3200 (ITX board). Using the open source Radeon drivers, it works excellently. Direct Blu-ray rips play flawlessly on a 1080p display, with audio over HDMI.

I recently received a new laptop as a gift. It has a Radeon HD 7500M in it. Using the open source drivers, I have not had any problems. I don't play video games, so I really can't tell you what to expect there, though.

It's been a rough road, but KMS is starting to mature and "Just Works" in most situations... I don't think the ATI Catalyst drivers support KMS yet.

Re:Linux drivers? (1)

Anonymous Coward | about 2 years ago | (#42324247)

The open source drivers are good, but not great. My Radeon 5770 runs most Linux games fine, but Team Fortress 2 is so slow that it's not playable. Worse, the power management is next to useless, forcing me to manually (or rather, I use a script at boot) set the power profile to 'low' (with low performance as a result) if I don't want noise and high temperatures. The closed driver has excellent power management, but often crashes when resuming from suspend (which the open driver almost never does). IME, Radeon is great, but the Linux drivers come with so many tradeoffs that you're bound to be unhappy with them.
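The boot-time workaround described above is typically done through the radeon driver's sysfs interface; a minimal sketch of such a script (the card index and the "profile" power method are assumptions about the poster's setup):

```shell
#!/bin/sh
# Hypothetical boot script: force the open-source radeon driver into its
# low-power profile via sysfs. Requires root; paths assume card0 and the
# "profile" power method rather than dynpm.
CARD=/sys/class/drm/card0/device
echo profile > "$CARD/power_method"
echo low > "$CARD/power_profile"
```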

Re:Linux drivers? (0)

Renraku (518261) | about 2 years ago | (#42326647)

Goddamn, I hate this. Back when I bought my last laptop I got the one with the best possible graphics card. Had something like 4GB RAM, a decent processor, and a decent video card. The shit can't run 5+ year old games at any playable resolution. The original Half Life is barely playable, I get like 15 stuttery FPS on it. It could barely run SNES emulators.

Re:Linux drivers? (1)

Khyber (864651) | about 2 years ago | (#42327063)

And what laptop would that be? My Pentium 3 laptop still runs SNES and PSX emulators JUST FINE.

Even my broken DV7 with an ATI 4200HD mobile runs these emulators, Killing Floor, Unreal Tournament 2K4, etc. No issues.

Re:Linux drivers? (1)

Lonewolf666 (259450) | about 2 years ago | (#42325303)

They have recently dropped support for anything up to the Radeon HD 4xxx, which is not that old.

On the other hand, according to various articles on [], the open source Radeon drivers are making great progress. One drawback is that you may have to tinker with your Linux installation to get all of that:
you will need the very latest source code, which probably means recompiling Mesa and maybe the kernel ;-)

Re:Linux drivers? (1)

corychristison (951993) | about 2 years ago | (#42330943)

They have recently dropped support for anything up to the Radeon HD 4xxx, which is not that old.

On the other hand, according to various articles on [], the open source Radeon drivers are making great progress. One drawback is that you may have to tinker with your Linux installation to get all of that:
you will need the very latest source code, which probably means recompiling Mesa and maybe the kernel ;-)

I absolutely agree; it is not for the faint of heart or for new users. I am sure that as soon as popular distros like Ubuntu, Mint and Arch shift into faster development cycles to keep up with the kernel's new pace, we will see things get easier for everyone.

I personally use Gentoo on my workstation, since I built it and keep it current. I just upgraded my kernel to 3.7.1 with the Gentoo patchset last night.

I am no stranger to compiling software and kernels.
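A rough outline of a Gentoo kernel upgrade like the one described (package and tool names follow the usual Gentoo conventions and are not taken from the comment):

```shell
# Sketch of upgrading to a new gentoo-sources kernel; run as root.
emerge --ask sys-kernel/gentoo-sources
cd /usr/src/linux
make oldconfig                    # carry the previous .config forward
make -j"$(nproc)" && make modules_install && make install
grub-mkconfig -o /boot/grub/grub.cfg   # or your bootloader's equivalent
```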

Re:Linux drivers? (1)

corychristison (951993) | about 2 years ago | (#42330961)

Should also mention my HTPC and Laptop are also using Gentoo/Funtoo.

Power specs? (1, Interesting)

Anonymous Coward | about 2 years ago | (#42321981)

Honestly, that's sort of 50% of why the 600 series is such a badass in the mobile market: it has a pretty damned low power draw and still manages to deliver a lot of performance. I'm seeing them show up in ultrabooks, for god's sake. Can AMD really bring the heat on this?

Re:Power specs? (1)

jakobX (132504) | about 2 years ago | (#42324857)

If they fix Enduro, then yes. They started fixing the damn thing recently, so I guess the answer will be yes.

8500, 8600, huh? (-1, Troll)

Anonymous Coward | about 2 years ago | (#42322109)

I've had one of those for five years. Kinda late to the party. What next, they announce the release of their 386 processor with turbo and a separate floating point unit?

Power consumption (1, Insightful)

Psicopatico (1005433) | about 2 years ago | (#42322487)

While it's all fine and good emphasizing computing capabilities and bragging about MHz, GFLOPS and the like, any good slashdotter knows we're already well beyond the "good enough" threshold.
In the meantime, only a few vague words are spent on improved power efficiency.

I personally don't feel the need for a graphics card that goes twice as fast and draws 80% more power.
Give me a graphics card that performs the same as my current one but consumes 40% less, thank you.

Re:Power consumption (1)

jones_supa (887896) | about 2 years ago | (#42322935)

For things like games, nothing will be "good enough" for a long time. For general computing tasks, you are correct though.

Anyway, both AMD and NVIDIA also release cut-down versions of their architectures, which should accomplish what you want. You get a chip that consumes less power than your previous one but has roughly the same performance as what you had.

Re:Power consumption (1)

Rockoon (1252108) | about 2 years ago | (#42323591)

Indeed, mid-range (~$200) video cards from two generations ago, such as the 8800GT, were drawing 105 watts under load. An equally performing (in a gaming context) card now draws ~65 watts, costs under $100, and has twice as much memory.

But with regards to raw compute prowess (DirectCompute/CUDA/OpenCL), these mobile GPUs spank that 8800GT I mentioned above pretty hard. Computationally they are significantly faster, but because they lack comparable memory bandwidth they still cannot fall into the "good enough" category. This is the way things will stay, because memory bandwidth is expensive, and that runs contrary to what makes mobile GPUs successful in the market.
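To put rough numbers on the bandwidth point, using the 8800M figures from the summary and assuming a 128-bit bus (typical for mobile parts, but not stated in the article) with GDDR5's 4 data transfers per clock:

```python
# Bytes available per FLOP for the 8800M: compute outpaces bandwidth.
# Bus width is an assumption; GDDR5 moves 4 bits per pin per clock.
def bandwidth_gbs(mem_clock_mhz, bus_bits):
    return mem_clock_mhz * 4 * (bus_bits / 8) / 1000.0

bw = bandwidth_gbs(1125, 128)     # memory clock from the summary
flops = 640 * 2 * 700 / 1000.0    # theoretical GFLOPS for the 8800M
print(f"~{bw:.0f} GB/s against ~{flops:.0f} GFLOPS "
      f"= {bw / flops:.3f} bytes per FLOP")
```

Well under one byte per FLOP means many compute kernels will be bandwidth-bound long before the ALUs are saturated, which is the tradeoff the comment describes.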

Re:Power consumption (3, Interesting)

somersault (912633) | about 2 years ago | (#42324467)

Game graphics are already "good enough" though. Good enough doesn't mean "photorealistic" in all cases - photorealism ruins the feel of certain types of games, especially fantasy games. For example, I preferred the style of GTA III to GTA IV. I think Saints Row 3 strikes a nice balance between the two, keeping slightly cartoonish characters while making the environment more realistic.

For racing games and other vehicle simulators we already basically have photorealism in the models and lighting. It's little details like camera shake that help make things feel more realistic, and I'd say they're more important than getting the last few percentage points towards perfect reflections and particle physics (though those are nice too).

It's people who buy games based on graphics who are letting gaming become such a formulaic experience these days. What isn't "good enough" these days is how publishers take a good idea/series and plow the shit out of it. The Guitar Hero/Rock Band games were great fun, but they oversaturated the market. Assassin's Creed 1 was okay; 2 was spectacular, and Brotherhood even better. But then they brought out Revelations, which didn't add much at all or have a particularly involving story, and even AC3 felt a bit rushed IMO. The graphics in all of these games are definitely good enough to get the job done. It's the gameplay that is important.

Sigh. Rant over..

Re:Power consumption (1)

jones_supa (887896) | about 2 years ago | (#42324757)

I think Saints Row 3 strikes a nice balance between the two, keeping slightly cartoonish characters while making the environment more realistic.

Agreed. Saints Row 3 is a really nice package, both technically and artistically.

Re: "good enough" (1)

Lonewolf666 (259450) | about 2 years ago | (#42325393)

Matter of taste.

For me, "good enough" means Half Life 2 graphics quality, which even an older card can handle. For instance, the Nvidia 8600 GT in my older, secondary PC from 2007 :-)

More importantly (-1, Troll)

Anonymous Coward | about 2 years ago | (#42322577)

Will the next gen have half-decent drivers that don't cause constant blue screens? Or a working implementation of power management (Enduro) that doesn't require constant driver updates with each new game release to avoid massive under-utilisation in games?

The 7990M has the potential to be top of its class at a significantly lower cost than the nVidia equivalent (680M). Due to months of dicking around and really poor quality drivers, nVidia can continue to charge $250-300 MORE for their 680M in a laptop - and people like me will still buy them over a 7990M, because they work.

Not interesting... (0)

Anonymous Coward | about 2 years ago | (#42323201)

... unless they compare like with like, i.e. nVidia's 700 series (or whatever they'll call it).

Comparing next-gen AMD with current-gen nVidia just says the 7000 series lost this round.

Re:Not interesting... (1)

jakobX (132504) | about 2 years ago | (#42324767)

When it comes to mobile, they certainly lost. Enduro is still a work in progress, while nVidia's Optimus works (unless you are using Linux :) ).

When it comes to desktops, I think they provide better value than nVidia's current offerings.

Naming convention (1)

Anonymous Coward | about 2 years ago | (#42323655)

I already have an 8600 ...
oh wait, never mind.

The most important question... (0)

thejynxed (831517) | about 2 years ago | (#42323715)

Will these new versions still have that frame-buffer relay delay that causes micro-stuttering in pretty much any game? Certain nVidia models also had this issue, but the entire 7xxx line from AMD has it.

And yes, there's nothing end-users can do to fix it. It was either a driver issue (no shock there) or a hardware issue. I'm thinking the latter.

There's a reason AMD pushed Newegg and other retailers to dump their entire current gen 7xxx series by using the carrot of 2 free games with purchase...

Re:The most important question... (3, Informative)

jakobX (132504) | about 2 years ago | (#42324833)

You are referring to the TechReport articles, right? There might be a problem, or there might not be. I certainly haven't seen any micro-stuttering with my HD 7850. The type of tests they do are veeeeeery easy to manipulate: you just have to select a different short scene to render if you don't like the results, and voila, the other card wins. Average framerates might not tell you everything, but at least they are harder to manipulate. (Though TechReport still has a tendency to measure much lower average FPS for AMD than competing review sites.)

Re:The most important question... (-1)

Anonymous Coward | about 2 years ago | (#42325215)

It is more likely that you do have the issue and are trying to defend the product due to a sense of buyer's remorse. Your HD7850 is a piece of shit, deal with it.

Re:The most important question... (1)

thejynxed (831517) | about 2 years ago | (#42326347)

SWTOR, for one, is a game people experience this issue with. Dead Space 2 is another example, along with Saints Row: The Third.

Those are just some of the games, but yes, it has to do with this problem.

Average framerates tell you nothing, because when being monitored, the framerates never drop. There's just stuttering in the rendering of certain objects.

In the case of SWTOR: the animation when people mount/dismount speeders, certain casting animations and particle effects, and shadowed text enabled - all examples of when you will see this happen, and this is with the framerate holding at a steady 60 FPS (V-Sync enabled).

Naming Conventions (1)

StoneyMahoney (1488261) | about 2 years ago | (#42324289)

I can see a bit of a problem with the numbering systems that nVidia and ATI use. nVidia had a range of mobile graphics processors that used the name 8x00M. Cue lawsuit and consumer confusion in 5...4...3...

I like AMD (1)

AdmV0rl0n (98366) | about 2 years ago | (#42324677)

and always have, but the 7970M has been out for an age, was a good price, and had the potential to be the gaming card for mobile rigs - and it came out with broken drivers. AMD seemingly had extreme disinterest in fixing these drivers, or was unable to fix the problem. Someone inside AMD has to get a grip and make sure issues like this get solved.

And I do not know how the 7970M got stellar reviews - because it later became widely known that it has issues. I hope the 8000M series is better.

Based on the Naming Convention (1)

SchMoops (2019810) | about 2 years ago | (#42324801)

Well, based on the naming convention, it may obvious... or it may not.

This is a re-badge, not a real 8000 series (1)

edxwelch (600979) | about 2 years ago | (#42324977)

These cards are based on the same Southern Islands core as the 7000 series. So why is AMD calling them the 8000 series? Because AMD has run out of money, causing the real 8000 series (Sea Islands) to be delayed: []

anyone know? (1)

slashmydots (2189826) | about 2 years ago | (#42325041)

"It will be interesting to see how these new GPUs stack up versus NVIDIA's latest GeForce 600M series of mobile chips."
Not really. I care about how they stack up against the Trinity A8 and A10's onboard graphics chips. Those things kick ass. One got a 6.6 in WEI, if I remember correctly, on a semi-gaming laptop I got for someone, and it ran Fallout New Vegas at 60 FPS at medium-high settings at 1440x900. Anyone know if these are any faster? I would assume they are, but for all I know they run in coordination with that modified CrossFire feature that pairs a GPU with an APU.

Of dubious value (1)

Eravnrekaree (467752) | about 2 years ago | (#42325829)

I really do not plan on playing games on a *shudders* laptop. For someone looking for the best value, the most performance for the least cost, the desktop will always be where the cutting edge is on performance. You can't beat the desktop, which does not have to worry about the space, cooling and power constraints that lead to lower-powered devices that cost more.

Also, shouldn't AMD be producing x86 cell phone systems-on-chip and working on getting them sold to manufacturers, as it should have years ago? You still cannot purchase an x86 cell phone in the US despite them being available in other countries. Such chips are perfectly suitable for a cell phone and provide the added benefit of compatibility with PC software.

Re:Of dubious value (0)

EmperorArthur (1113223) | about 2 years ago | (#42329363)

I shudder at the thought of x86 used on cell phones.
First, what's the point? x86 is only useful for running legacy proprietary software, mostly for Windows.
Those old apps have minimum screen sizes, and almost none of them are touch compatible.

Second, there's battery life, or the lack thereof. x86 is what happens when you add 30 years of bloat to an old variable-length CISC instruction set.
The idea of saving memory and gaining speed by having the processor do more is a good one. Unfortunately, it means the processor is an order of magnitude more complex, with all the manufacturing and power-draw disadvantages that come with that.

If there is anything we should take away from x86, it is that a small, fast instruction set with separate modules outside of the CPU is far better than a monolithic instruction set that does everything.

The sole exception to the above is larger tablets that can be used as laptops. While x86 is still horrid, you need it for Windows compatibility.

Driver documentation (1)

hobarrera (2008506) | about 2 years ago | (#42328329)

Wasn't AMD supposed to have its 8000/9000 series release coincide with documentation for the OSS drivers?

This would be a good opportunity for AMD to release driver documentation, so devs can start developing FLOSS drivers for *nix.
