AMD Releases Fastest Mobile GPU

Unknown Lamer posted more than 3 years ago | from the most-amazing-15-minutes-of-battery-life dept.

Stoobalou writes "AMD has scored another point over its graphics rival Nvidia with what it claims is the world's fastest single-GPU mobile graphics processor, the Radeon HD 6990M. While the red team is unlikely to hold the crown for long in the fast-moving world of discrete graphics, the company's latest chip is certainly impressive enough. Based on the TeraScale 2 unified processor architecture and the Barts GPU core, the Radeon HD 6990M — a mobile equivalent to the company's high-end Radeon HD 6990 PCI Express graphics card design — features 1,120 stream processing units, 56 texture units, 128 Z/stencil ROP units, and 32 color ROP units."

HOW FAST DOES IT GO ?? (1)

Anonymous Coward | more than 3 years ago | (#36739216)

Faster than before !! Pay up !! And don't complain like it's Windows 8 MOTHERFUCKERS !!

Re:HOW FAST DOES IT GO ?? (0)

Anonymous Coward | more than 3 years ago | (#36739378)

They've gone plaid!

Can you imagine... (0)

Anonymous Coward | more than 3 years ago | (#36739220)

A beowulf cluster of these?

Re:Can you imagine... (0)

Anonymous Coward | more than 3 years ago | (#36739252)

Yes but can it run linux?

Is it cooled with hot gritz?

Re:Can you imagine... (1)

Moryath (553296) | more than 3 years ago | (#36739514)

The better question is... what games really need it? Most of the popular games on the market are still programmed to run well on 7 year old hardware.

Re:Can you imagine... (3, Insightful)

PC and Sony Fanboy (1248258) | more than 3 years ago | (#36739674)

The better question is... what games really need it? Most of the popular games on the market are still programmed to run well on 7 year old hardware.

Well sure, some games still run on old hardware. Whether or not they run well is a completely subjective opinion.

Now I'm not saying you need to buy new, top of the line hardware every year to enjoy your games... but the exponential speed/power increase means that you need to be getting mid-range hardware every few years to play modern titles at high resolution with decent detail.

Again, you're not wrong. But I like to play at FHD with 4x AA and very high details. If you're okay with VGA @ low, then you can keep your dual 9800gtx setup for another few years. See, the low/entry-level point for graphics performance is ... low. VERY low. Even modern IGP solutions will play many games at 20fps. But I like to see this sort of high-end tech come out, because it puts downward pressure on mid-range cards, which perform quite well considering their low price. I won't buy a top-of-the-line card every year ... but I'm glad that someone will, because those purchases drive the innovation.

You may also notice that most popular games on the market are console ports, designed to run on consoles. But not all of them ... so if you're still playing MW2 or some other brainless console port, sure. You're right. Pat yourself on the back for that earth-shattering revelation.

Re:Can you imagine... (1)

DaVince21 (1342819) | more than 3 years ago | (#36746540)

But I want to be able to run Duke Nukem Forever!

I can imagine it.... (0)

Anonymous Coward | more than 3 years ago | (#36739776)

...IN MY PANTS!!!!!

At last! Portable bitcoin miner! (1)

Lead Butthead (321013) | more than 3 years ago | (#36739240)

Oh wait...

the interesting questions (0)

Anonymous Coward | more than 3 years ago | (#36739310)

How does it perform compared to modern desktop GPUs? How much power does it need? Can I get a passively cooled one for my desktop?

Re:the interesting questions (1)

John Napkintosh (140126) | more than 3 years ago | (#36739520)

It'll be a bit before relative performance numbers are out, but I can tell you that, unless you've got the world's only MXM desktop motherboard, you're not going to be putting this mobile GPU into your desktop.

Re:the interesting questions (1)

fuzzyfuzzyfungus (1223518) | more than 3 years ago | (#36741664)

While there are... other obstacles... I'm pretty sure that recent-vintage iMacs have MXM slots, and are arguably desktops. Good luck finding an MXM card that plays nice with Apple's peculiar EFI implementation, and with the thermal and mechanical limits of a chassis designed with aesthetics at the forefront; but it should at least work mechanically.

Re:the interesting questions (0)

Anonymous Coward | more than 3 years ago | (#36745866)

Like this [liantec.com]?

Re:the interesting questions (0)

Anonymous Coward | more than 3 years ago | (#36746004)

Completely missed the point.

It's not about putting one of these in a desktop, but seeing how they fare against their desktop counterparts, especially now that we live in an era where product numbers have become meaningless (seriously, which is better out of a 5830, a 6770, and the other oddly numbered Radeon cards?)

Re:the interesting questions (1)

marcosdumay (620877) | more than 3 years ago | (#36743858)

Yeah, the subject parses as "AMD releases fastest slow GPU". None of the important data is shown.

Can it run Crysis? (1)

Hsien-Ko (1090623) | more than 3 years ago | (#36739344)

Yes, but it can't run old LucasArts 3D games. No table fog support in Radeons since forever. What a shame!

Re:Can it run Crysis? (0)

Anonymous Coward | more than 3 years ago | (#36739524)

Seriously, who cares? Gamers are a small % of users. If that is all that matters to you, then take my advice and get a life.

What is going to be more important is how much of the GPU's power we can use for compute-intensive tasks, and how easy it is to use. That is important.
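
For what it's worth, OpenCL is the obvious route on AMD hardware. Here's a minimal sketch of step one, enumerating the GPUs the driver exposes; nothing here is 6990M-specific, and it assumes the OpenCL headers and a vendor runtime are installed:

    /* Minimal sketch: enumerate OpenCL platforms and their GPU devices.
     * Build (example): gcc cl_devices.c -lOpenCL -o cl_devices */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;
        if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
            fprintf(stderr, "no OpenCL platforms found\n");
            return 1;
        }
        for (cl_uint p = 0; p < nplat; p++) {
            char name[256];
            clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof name, name, NULL);
            printf("platform: %s\n", name);

            cl_device_id devices[8];
            cl_uint ndev = 0;
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices, &ndev) != CL_SUCCESS)
                continue;  /* this platform exposes no GPU devices */
            for (cl_uint d = 0; d < ndev; d++) {
                char dname[256];
                cl_uint cus = 0;
                clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof dname, dname, NULL);
                clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS, sizeof cus, &cus, NULL);
                printf("  GPU: %s (%u compute units)\n", dname, cus);
            }
        }
        return 0;
    }

If the card shows up here with its compute units listed, the rest is writing kernels; how pleasant that part is remains exactly the open question.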

Re:Can it run Crysis? (0)

Anonymous Coward | more than 3 years ago | (#36741852)

Shouldn't that be emulatable in software by now? I mean, just write a small GPGPU "adapter" kind of thing on top of the OpenGL driver, and it's done.

But hey, I bet you could run those engines entirely in software by now. ^^

Also, I'm a fan of new engines, texture and model packs for old games. Like that XreaL thing for Quake 3 Arena.

battery (0)

Anonymous Coward | more than 3 years ago | (#36739432)

and your battery will last a whole half hour!

Re:battery (1)

Hsien-Ko (1090623) | more than 3 years ago | (#36740454)

and the knee burning will be totally worth it for my important prime95s on a plane!!! yea!

Of course it's faster... (1)

mandark1967 (630856) | more than 3 years ago | (#36739458)

6990 - 590 = 6400

It's a whole 6400 faster!!1

apple should put this in the mini but we will like (1)

Joe_Dragon (2206452) | more than 3 years ago | (#36739522)

Apple should put this in the mini, but we will likely get the POS i3 CPU with onboard video at $700.

Re:apple should put this in the mini but we will l (0)

Anonymous Coward | more than 3 years ago | (#36739640)

Because shoving a high-powered graphics card into what amounts to a set-top box is a brilliant use of resources and wouldn't cause heating issues in the slightest. I mean, c'mon, this is a brand new card just released, I bet it's even cheaper than the rest!

Re:apple should put this in the mini but we will l (0)

Anonymous Coward | more than 3 years ago | (#36739670)

I wouldn't mind the i3 so much (as long as it's at least a bit faster than the Core 2 Duo for the same power requirements), it's the crap Intel GPU I'm worried about.

Re:apple should put this in the mini but we will l (1)

LordLimecat (1103839) | more than 3 years ago | (#36741910)

Running Sandy Bridge here (2310); the onboard GPU is impressive. Don't knock it till you've tried it.

Re:apple should put this in the mini but we will l (1)

interkin3tic (1469267) | more than 3 years ago | (#36739754)

I think that comes from the fact that the target demographic for the apple mini thinks that "processor" is a fancy word for "blender." I didn't think they advertised components of apple products. You buy an apple laptop, it comes in either "normal" or "fancy metal 'pro.'" My limited experience has been that if you ask a mac user for details on their computer, they'll say "Well... it's white?" Not exactly a lot of demand for better processors.

Re:apple should put this in the mini but we will l (1)

samkass (174571) | more than 3 years ago | (#36739894)

we will likely get...

If you buy it without this GPU, then Apple was right not to spend the money on it. Don't buy it if it's not up to your needs and Apple will learn to set the product/price appropriate to the market.

Personally, I don't care much about desktop/laptop GPU power anymore as much as I care about what Apple can cram into the iPad 3...

open 3D acceleration for Linux? (2)

bzipitidoo (647217) | more than 3 years ago | (#36739528)

Doesn't matter how fast it is if the driver can't use it. Where are the Linux drivers? I thought back when they were still ATI that they'd pledged to open up their hardware. As far as I know, in Linux we can get 2D acceleration only in a good open driver, or we can get 3D acceleration in a closed driver that is otherwise not so good.

Re:open 3D acceleration for Linux? (0)

Anonymous Coward | more than 3 years ago | (#36739970)

Even if we had a Linux driver it would probably suck.

It has always been this way with ATI and it's the reason I haven't bought an ATI card in well over 10 years.

Drivers man, drivers. If their shit worked it might be great. In fact ATI shows pretty damn good performance at OpenCL stuff (versus nVidia) but if their hardware doesn't work right because of driver issues then I'm not going to use it.

The key features I need in a video driver are: Full Linux support, stability, performance, media acceleration, and a fully working OpenGL implementation. ATI has consistently failed on all of those. Their OpenGL support has to be one of the worst in the market (and OpenGL is the only cross-platform 3D API).

Re:open 3D acceleration for Linux? (4, Informative)

PeterKraus (1244558) | more than 3 years ago | (#36740002)

You know wrong. In the FOSS drivers, the newest hardware always lags 1-2 months behind in 2D accel support and maybe up to a year in 3D support (that was the case with the R700 just after the openness pledge), but it always gets there. Here is a handy matrix as it currently stands; this being a Barts-based GPU, you're looking under Northern Islands:

http://www.x.org/wiki/RadeonFeature [x.org]

The 3D performance is not as good as under fglrx, though.

Re:open 3D acceleration for Linux? (1)

Tom9729 (1134127) | more than 3 years ago | (#36740146)

You can get information on the status of the open radeon driver here: http://www.x.org/wiki/RadeonFeature [x.org]

Re:open 3D acceleration for Linux? (0)

Anonymous Coward | more than 3 years ago | (#36740222)

There are zero open 3D drivers for mobile GPUs. Not even the one in the OpenMoko FreeRunner (due to NDAs). That said, I heard a rumour there would be an announcement about PowerVR later this year.

iirc the third dimension has been patented (0)

decora (1710862) | more than 3 years ago | (#36741076)

i heard that a consortium of microsoft, apple, and intel decided to patent the third dimension, since nobody had ever thought of it before about 2009.

Re:open 3D acceleration for Linux? (0)

Anonymous Coward | more than 3 years ago | (#36741866)

I'm pretty sure this is "mobile" in the sense of laptop GPU, in which segment there are open drivers for at least AMD's offerings, as long as it's nothing too recent. Dunno about nvidia's; last time I had an nvidia card, their closed drivers were pretty solid, and the nouveau project (open source driver) looked so scarily incomplete I didn't even try them, but I hear they've improved a lot.

Also, damn the rumors about PowerVR drivers. I've been hearing them with more or less credibility from when I bought my N800 way back when (2008, I guess), until I got tired of dashed hopes and bitter tears and stopped minding rumors...

Re:open 3D acceleration for Linux? (0)

Anonymous Coward | more than 3 years ago | (#36743406)

Did you start reverse engineering the binary blobs on your N800?? That would have been more useful than bitter tears and dashed hopes.

Re:open 3D acceleration for Linux? (0)

Anonymous Coward | more than 3 years ago | (#36743440)

Yeah, I don't really count laptop GPUs as mobile, I'm thinking of the ARM space when I hear "mobile". In the x86 (mobile) space there are open drivers for Intel, AMD, nVidia and probably any others that ship on x86 laptops.

Re:open 3D acceleration for Linux? (1)

DaVince21 (1342819) | more than 3 years ago | (#36746572)

Actually, I have 3D support on the open-source Gallium driver for my r700 series ATI card, and they're still heavily working on that driver so support may become a lot better. The open-source graphics driver world has been doing a lot better ever since Gallium came along.

I can't play Minecraft yet, though. But any typical Linux OpenGL game will work.

Graphics News: Phoronix (1)

DrYak (748999) | more than 3 years ago | (#36757692)

In addition to the driver-specific website pointed to by others in this thread, Phoronix is also a nice source of information:
- they regularly feature benchmarks pitting closed- versus open-source drivers.
- they post news about support for recent hardware being added to the kernel, Mesa, etc.

Re:Graphics News: Phoronix (1)

bzipitidoo (647217) | more than 3 years ago | (#36758858)

Yet good hardware accelerated 3D graphics is still not available in an open source driver for Linux. Phoronix's benchmarks show this quite clearly. The proprietary Catalyst driver is not just a little faster, it's 10 times faster than the best open source driver.

I have two computers with Radeon cards: an X1500 and an HD5450. I have a couple of simple OpenGL programs I wrote, and while they run on those cards, they're extremely slow. Probably the open source drivers are emulating the 3D in software. Those programs are much, much faster on the Nvidia card with the proprietary driver. I've tried games, and they are unplayably slow on the Radeons. Some run acceptably if I set an environment variable, LIBGL_ALWAYS_INDIRECT=1.
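
A quick way to test that suspicion is to read GL_RENDERER back from a live context; the open drivers name the chip (Gallium ones say "Gallium"), while "Software Rasterizer", "softpipe", or "llvmpipe" means Mesa has fallen back to the CPU. A minimal sketch, assuming the X11 and Mesa development headers:

    /* Print the active OpenGL driver's vendor/renderer strings.
     * Build: gcc glcheck.c -lGL -lX11 -o glcheck */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open X display\n"); return 1; }

        int attribs[] = { GLX_RGBA, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        if (!vi) { fprintf(stderr, "no suitable GLX visual\n"); return 1; }

        /* glGetString needs a current context; bind one to a tiny unmapped window. */
        XSetWindowAttributes swa = {0};
        swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                       vi->visual, AllocNone);
        Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0, 16, 16,
                                   0, vi->depth, InputOutput, vi->visual,
                                   CWColormap, &swa);
        GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
        glXMakeCurrent(dpy, win, ctx);

        printf("direct rendering: %s\n", glXIsDirect(dpy, ctx) ? "yes" : "no");
        printf("GL_VENDOR:   %s\n", glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));

        glXMakeCurrent(dpy, None, NULL);
        glXDestroyContext(dpy, ctx);
        XCloseDisplay(dpy);
        return 0;
    }

(glxinfo reports the same strings, if you have it installed.)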

If it's just a matter of giving the devs time, then surely the X1500 is old enough now? I run Arch Linux in part to get the most recent kernels and X servers.

I'm disappointed that it's been years, and we still don't have fully capable open source drivers. When I heard ATI had seen the light, I was ready to drop Nvidia on the spot. Frustrating. Maybe the best hope is that some new graphics hardware company comes along and kicks both their rears with next generation hardware (massively parallel ray tracing, anyone?) and open drivers for Linux. And virtualization. I'd love to be able to dump the dual boot setup for Xen, but not at the expense of losing the graphics acceleration.

3D Acceleration - just in today (1)

DrYak (748999) | more than 3 years ago | (#36761298)

Yet good hardware accelerated 3D graphics is still not available in an open source driver for Linux. Phoronix's benchmarks show this quite clearly.

Yet it's slowly coming in.

This just in [phoronix.com]. Apparently, on the HD6000 side of things, the performance gap is shrinking. We're still talking about a 2:1 gap, but it's better than the 10:1 reported earlier.

It takes time. Time for the OSS developers to get used to writing good drivers for it. Time for good consolidated architectures to emerge and gain momentum (like Gallium3D, with its clean separation between API front-ends and hardware back-ends). Time for the hardware manufacturer to integrate OSS into their development pipeline: it took an enormously long time before the first documentation could be green-lighted for publication by their legal department; currently they lag only a few weeks to a few months behind. And AMD has promised that by the HD8000 generation (or whatever it will be called at that point in time), OSS will be integrated into the development process from the beginning.

Contrast it with Intel: they've been in the OSS game much longer. Currently, as (non-PowerVR-based) hardware rolls into stores, driver support is already available (okay, there are still hiccups: the initial Sandy Bridge driver was buggy, wasn't available in mainstream distros, and required pulling the latest development version). But that's still support released almost simultaneously. And benchmarks show almost identical performance between the Linux (open source only) and Windows (blob) drivers.

I have 2 computers with Radeon cards, an X1500, and an HD5450. [...] Probably the open source drivers are emulating the 3D with software.

You're doing something wrong... You should check on your distro's forum whether you missed something somewhere.
Especially with the HD5450:
- it's a mid-range card (the biggest performance gap happens on the highest range of cards)
- it's a previous-to-latest generation card (by now the bugs must have been ironed out).
It should perform decently.

Are you sure you're getting the latest up-to-date drivers from your distro's repositories? (Some distros use additional repositories for the latest versions; otherwise you only get bug and security fixes for whatever version came with the stock distro.)
Are you sure you're running the *Gallium3D* variant of the drivers (the "r600g" driver)? (Some distros still use the older "r600" variant by default, and Gallium3D has been making gigantic leaps forward in recent months.)

(Also, there's a bug affecting some AMD hardware users: you might need to add "irqpoll" to the boot parameters. Read your /var/log/messages log. If it complains about "irq nn: nobody cared (try booting with the "irqpoll" option)" and then "Disabling IRQ #nn", you might be affected by a bug which slowly brings 3D acceleration to a crawl.)

One of my machines has an AGP variant of the HD4500. I run the latest openSUSE plus the official SUSE repository for the latest X11 version. (I still have the stock kernel, so I don't benefit from some advantages of the latest kernel modules.) I've got an up-to-date Xorg and Mesa. The performance isn't stellar, but it's decent enough.

I think overall it's a sign of the whole graphics situation. Sufficient and decent OSS solutions have started to appear, but sadly it's not always clearly documented and made easy. For performance, people often need the latest version, newer than the one which came with the stock distribution.
- But this latest version doesn't always exist (in the Sandy Bridge case, early enthusiasts needed to pull the source and compile it themselves). It would be better if more collaboration happened between distributions, developers, and manufacturers, so that we see more "official additional repositories" (the repository on openSUSE's build service for the latest X11 and Mesa is a nice starting point).
- And when nice repositories with ready-to-use packages do exist, they're not clearly advertised to end users. I've seen that Ubuntu can detect some hardware and automatically show a pop-up proposing to add an extra repository for the closed source binary drivers that couldn't be shipped with the distro. Something similar would be nice for getting the up-to-date open source drivers.

I'm disappointed that it's been years, and we still don't have fully capable open source drivers. When I heard ATI had seen the light, I was ready to drop Nvidia on the spot.

As I said above: be patient, it takes time. We're slowly starting to get there, although today it still requires jumping through some hoops to get the best performance.
You can still show support for the companies which are actually pushing for OSS drivers and vote with your wallet by buying AMD and Intel GPUs (even if you go back to the closed source drivers for Radeons).
But in the years to come, things are going to get better at an accelerating pace:
- Intel already has good integration of OSS in its development process, and AMD has promised to progressively do the same until parity is reached by the HD8000 generation.
- The whole Sandy Bridge semi-success/semi-fiasco (drivers available but hard to get working) has raised awareness among distros, and the next few years will probably see some work to make the latest drivers more easily available.
- Add to that the fact that other software giants with lots of brains and resources, like Google, have started tackling the problems too. (Google has started to help port Intel drivers to the newer Gallium3D architecture because they need it for ChromeOS, and they have Google Summer of Code (GSoC) projects to improve support for OpenGL 3.x, OpenCL, and other modern APIs.)
You won't beat performance records with an OSS setup tomorrow or next month. But probably in 3 years.

Maybe the best hope is that some new graphics hardware company comes along and kicks both their rears with next generation hardware (massively parallel ray tracing, anyone?) and open drivers for Linux.

Sorry to disappoint you, but Intel dumped the Larrabee project :-P (Though it had the potential to be exactly that: Intel heavily advertised it for ray tracing, and they are known to have OSS integrated into their pipeline.)

But a completely different company would have a hard time jumping into the game:
- They would need extensive know-how in driver development (frankly, lots of OSS drivers sucked when they began, even where documentation was available).
- They would need dramatically fast and revolutionary hardware (that's hard too; just look at the companies that have tried to enter the 3D market, like S3 and VIA: none managed to reach the performance level of the main players).
- They would *definitely* need production capability (just look at the sheer number of cool open-source-friendly or open-hardware projects: mobile phones, handheld consoles, netbooks. Now try to buy one: always the same problem, massive production delays). If you're not Intel, AMD or Nvidia, chances are that between the time you get a nice idea and the time it ships to customers, a lot of time will pass, during which the big players will bring several generations to market, each outperforming the previous one.
- They would also need decent Windows support (that's where the main 3D market currently is; otherwise the hardware would be produced in volumes too small to be affordable). (At least, with the recent emergence of a DirectX 10/11 state tracker for Gallium3D, some code could be shared between Linux and Windows drivers.)

And virtualization. I'd love to be able to dump the dual boot setup for Xen, but not at the expense of losing the graphics acceleration.

Supporting virtualisation and shared use of acceleration would be one possibility (thanks to things like the IOMMU, PCI sharing, and the like).

A different approach would be VMware's and VirtualBox's: making the host's acceleration available to the guest by forwarding the requests. This is all the more interesting as VMware is currently the main Gallium3D developer (they bought Tungsten Graphics). The Gallium3D architecture is the most widely used for recent OSS drivers (Nouveau, recent Radeons, Google's Intel work), and it has support for the most APIs, including a DX10/11 state tracker. So it should be possible to have a Gallium driver running inside the guest Windows which listens for OpenGL 3.0 and DX10/11 calls, with a back-end that pipes them to the host's Gallium driver and its OSS back-end.

Re:3D Acceleration - just in today (1)

bzipitidoo (647217) | more than 3 years ago | (#36765818)

That's good to hear. You know so many details about all this I wonder if you're one of the developers.

I checked the machine with the HD5450, and I am running r600, not r600g. Confusingly, the package info for ati-dri says "Mesa DRI radeon/r200 + Gallium3D r300,r600 drivers". It's not easy to install the actual r600g, as it's not yet in Arch Linux's mainline repositories. I mention all this as an example of the difficulties users face, not to bog this thread down in the details of my specific hardware.

As I don't particularly feel like spending hours thrashing through the details of adding repositories to the distro and running down all the packages and changes needed to compile and install the better driver, it looks like more waiting is easiest. Perhaps one more month.

Hope you got a spare... (2)

AngryDeuce (2205124) | more than 3 years ago | (#36739534)

Hope you got a spare battery. Or three.

Re:Hope you got a spare... (0)

Anonymous Coward | more than 3 years ago | (#36739668)

I have a really long extension cord.

Re:Hope you got a spare... (1)

lucian1900 (1698922) | more than 3 years ago | (#36746824)

Actually, if they're anything like the E-350, battery life won't be a problem at all. For the performance, the E-350 has great battery life. Even more so for the C-50, which sacrifices some performance for insane battery life (12h playing video on some netbooks).

What definition of mobile? (1)

FunkyELF (609131) | more than 3 years ago | (#36739542)

... seriously?

Re:What definition of mobile? (1)

Billly Gates (198444) | more than 3 years ago | (#36741648)

Nice for those who want to play games without being burdened by shitty integrated graphics with 2002-era performance. These are integrated, but they will fly with video and HTML5 acceleration under IE 9/10/Windows 8, and can do Flash 1080p HD full screen at 30 fps easily. All this for $700 is impressive. MacBook Pros have their own dedicated video cards but cost like $1600. That is a ton of money.

Bitcoin craze! (0)

DriedClexler (814907) | more than 3 years ago | (#36739554)

Hey, now you can have a Bitcoin miner on the go!

Re:Bitcoin craze! (0)

Anonymous Coward | more than 3 years ago | (#36742340)

Hey, now you can have a Bitcoin miner on the go!

It'll probably cost you more in electricity than the coins are worth at this point (let alone the additional cost of the fancy GPU), but I doubt it'll stop people trying.
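
Whether the electricity actually outruns the coins depends on numbers that keep moving. A back-of-the-envelope sketch; every figure below is an assumed placeholder, not a measured 6990M spec:

    /* Back-of-the-envelope mining economics; all inputs are assumptions. */
    #include <stdio.h>

    int main(void) {
        const double hashrate     = 300e6;        /* hashes/sec, assumed */
        const double difficulty   = 1.5e6;        /* network difficulty, assumed */
        const double block_reward = 50.0;         /* BTC per block (2011 era) */
        const double btc_price    = 14.0;         /* USD per BTC, assumed */
        const double gpu_watts    = 100.0;        /* extra draw under load, assumed */
        const double kwh_price    = 0.12;         /* USD per kWh, assumed */
        const double two32        = 4294967296.0; /* 2^32 hashes per difficulty-1 unit */

        /* Expected BTC/day = hashrate / (difficulty * 2^32) * 86400 * reward */
        double btc_per_day = hashrate / (difficulty * two32) * 86400.0 * block_reward;

        printf("expected BTC/day:     %.5f\n", btc_per_day);
        printf("revenue, USD/day:     %.2f\n", btc_per_day * btc_price);
        printf("electricity, USD/day: %.2f\n", gpu_watts / 1000.0 * 24.0 * kwh_price);
        return 0;
    }

Swap in the real hashrate, the current difficulty, and your own power price before drawing any conclusion.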

Re:Bitcoin craze! (0)

Anonymous Coward | more than 3 years ago | (#36745474)

I don't know about him, but I heat with electricity. Bring it on.

yuo Fail It!? (-1, Troll)

Anonymous Coward | more than 3 years ago | (#36739588)

is busy infighti8g

But can it run Gnome 3? (1)

CiarnOS (1325141) | more than 3 years ago | (#36739724)

I mean really, who gives a fuddly about Crysis if it can't even do a few windows correctly. Pfft!

They'd only just recently made it possible to play video correctly, then whamo, no Gnome 3.

Who cares if they can make better hardware if the software can't talk to it?

not a HD 6990 (0)

Anonymous Coward | more than 3 years ago | (#36739862)

"the Radeon HD 6990M — a mobile equivalent to the company's high-end Radeon HD 6990 PCI Express graphics card design"

No, it isn't; it's more or less (actually, less) a Radeon HD 6870: http://www.anandtech.com/show/4494/amd-raises-the-mobile-performance-bar-with-radeon-hd-6990m

Still a nice GPU, but the number/name is misleading.

Incorrect summary (1)

Meeni (1815694) | more than 3 years ago | (#36740358)

a mobile equivalent to the company's high-end Radeon HD 6990 PCI Express graphics card design — features 1,120 stream processing units, 56 texture units, 128 Z/stencil ROP units, and 32 color ROP units.

Except that the PCI-E version has more than twice as many stream processors (on the order of 3,500), 64 color ROPs, and 70% higher memory frequency. This might be the best mobile unit, but the 6990M is not, in any way except its name, comparable to the HD 6990 PCI-E. In every specification it is at most half of what the non-mobile version is. Marketing... Marketing...

Re:Incorrect summary (1)

coxymla (1372369) | more than 3 years ago | (#36743246)

Not to mention that the desktop 6990 is a dual-GPU card... a better comparison would be the 6970, which is the high-end single-GPU Radeon.

Re:Incorrect summary (1)

That Guy From Mrktng (2274712) | more than 3 years ago | (#36743322)

Sir? ... Yes, we're the ones to blame. Engineering tried to shove the HD6990 into some laptops, which eventually burned down a testing room. I was there; the laptop ran its fans so hard that it started to hover across the table until it blew up spectacularly next to a big pile of failed packaging demos. So management blamed us [1] and we had to come up with something that would lead the blame onto us if a smart consumer figured it out.

[1] We had a lot of failed packages in there because we had to outsource the design to the same guys we outsource the coding to. They promised they would hire designers, but ended up shoving JavaScript monkeys in to hammer out the design in SVG by code! The designs actually created buffer overruns in the CTP machine, and we have lost contact with production since then.

You think it's funny to blame marketing for everything. Sure, everything is fine and dandy until Stuxnet owns your printing department and starts printing weird Chinese propaganda. Did you know they went to the moon first? [news.com.au]

Bit Coin mining on the go! (1)

madhatter256 (443326) | more than 3 years ago | (#36740738)

Finally, a GPU to do some Bitcoin mining while on the go!

The heat will melt the plastic in the laptop lol

SfFrist psot (-1)

Anonymous Coward | more than 3 years ago | (#36740814)

Oh yeah! (1)

danomac (1032160) | more than 3 years ago | (#36741114)

I need all that power. To open up a new VT and use vim!

but still no support for CUDA. (0)

Anonymous Coward | more than 3 years ago | (#36741506)

Yeah, but does it help Photoshop CS5 run better than the fastest mobile GPU from Nvidia?

Re:but still no support for CUDA. (1)

GigaplexNZ (1233886) | more than 3 years ago | (#36744128)

It has support for both OpenCL and DirectCompute, both of which Adobe could have used instead of CUDA. While it's true that AMD GPUs won't help with Photoshop, that's not AMD's fault.