
AMD's Fusion CPU + GPU Will Ship This Year

kdawson posted more than 3 years ago | from the never-say-die dept.


mr_sifter writes "Intel might have beaten AMD to the punch with a CPU featuring a built-in GPU, but it relied on a relatively crude process of simply packaging two separate dies together. AMD's long-discussed Fusion product integrates the two key components into one die, and the company is confident it will be out this year — earlier than had been expected."


first post? (-1)

Anonymous Coward | more than 3 years ago | (#32224370)

In three years, hopefully I get this in a laptop?

Sup dawg (3, Funny)

Anonymous Coward | more than 3 years ago | (#32224438)

Sup dawg. We herd you like processing units, so we put a processing unit in yo' processing unit so you can computer while you compute!


OK, they're integrated "properly", but... (3, Insightful)

Dragoniz3r (992309) | more than 3 years ago | (#32224476)

It doesn't really matter, any more than AMD's "proper" quad core mattered compared to Intel pasting two dual-core dies together. This is really just AMD getting beaten to the punch again and having to spin it in some positive way. It's great news that it will be out earlier than expected, but I think they would have been better off taking the less "beautiful" route and just throwing discrete dies into a single package, particularly as it has yet to be seen how big the market for this sort of thing is. More exciting to me is that AMD is ahead of schedule with this, so hopefully they'll be similarly ahead with their next architecture. I'm yearning for the day when AMD is back to being competitive with Intel on a clock-for-clock basis.

Tablet Processor? (0, Offtopic)

sanman2 (928866) | more than 3 years ago | (#32224520)

I'd really love it if Fusion could give us ultra-powered iPad-killing tablet PCs, complete with multi-tasking/multi-window functionality, as well as 3D acceleration. But will it be low-powered enough?

Re:Tablet Processor? (1)

xianthax (963773) | more than 3 years ago | (#32225740)

You mean the Tegra 250?

http://www.nvidia.com/object/tegra_250.html [nvidia.com]

Re:Tablet Processor? (1)

ThePhilips (752041) | more than 3 years ago | (#32226522)

Tegra can't run Windows. AMD Fusion can.

Re:Tablet Processor? (1)

V!NCENT (1105021) | more than 3 years ago | (#32226842)

You mean Windows can't run on a Tegra? It is an OS you know, designed to Operate a System, yo... 'n stuff...

Re:OK, they're integrated "properly", but... (5, Insightful)

Anonymous Coward | more than 3 years ago | (#32224534)

Sure Intel got there first and sure Intel has been beating AMD on the CPU side, but...

Intel graphics are shit. Absolute shit. AMD graphics are top notch on a discrete card and still much better than Intel on the low end.

Maybe you should compare the component being integrated instead of the one that already gives most users more than they need.

Re:OK, they're integrated "properly", but... (3, Informative)

sayfawa (1099071) | more than 3 years ago | (#32224684)

Intel graphics are only shit for gamers who want maximum settings for recent games. For everything else, and even for casual gamers, they are fine. At this very moment I'm just taking a quick break from playing HL-2 on an i3's graphics. Resolution is fine, fps is fine, cowbell is maxed out. Go look at some YouTube videos to see how well the GMA 4500 (precursor to the current gen) does with Crysis.

Re:OK, they're integrated "properly", but... (1)

angelwolf71885 (1181671) | more than 3 years ago | (#32224736)

Compared to a GeForce 6500, the GMA 4500 barely gets playable frame rates on medium settings, while the 6500 gets rather respectable frame rates on medium.

Re:OK, they're integrated "properly", but... (1)

Shikaku (1129753) | more than 3 years ago | (#32225556)

For Intel, GMA stands for "Graphics My Ass!"

Re:OK, they're integrated "properly", but... (2, Insightful)

Joce640k (829181) | more than 3 years ago | (#32226714)

How come Intel sells more GPUs than ATI and NVIDIA combined?

Because they sell them to people who've moved out of their parents' basement...

Re:OK, they're integrated "properly", but... (2, Insightful)

haruchai (17472) | more than 3 years ago | (#32227182)

and, therefore, can't afford anything better than the graphics equivalent of Mac'n'Cheese now
that Mom and Dad are no longer paying the bills.

Re:OK, they're integrated "properly", but... (3, Insightful)

cyssero (1554429) | more than 3 years ago | (#32224776)

Should it be any accomplishment that a game released in November 2004 works on a latest-gen system? For that matter, my Radeon 9100 IGP (integrated) ran HL-2 'fine' back in 2004.

Re:OK, they're integrated "properly", but... (1)

sayfawa (1099071) | more than 3 years ago | (#32224990)

And was your Radeon 9100 free? 'Cause I was just looking for a processor. But in addition I got graphics that are good enough for every pc game I have. Which includes Portal, so 2007. And no, I'm not saying that that is awesome. But it's certainly not shit, either. And, as your post demonstrates, only gamers give a fuck.

Re:OK, they're integrated "properly", but... (2, Informative)

sznupi (719324) | more than 3 years ago | (#32225332)

Is your CPU + motherboard combo cheaper than a typical combo from some other manufacturer with notably higher performance and compatibility?

With greater usage of GPUs for general computation, the point is that not only gamers "give a fuck" nowadays.

PS. If something runs HL2, it can run Portal. My old Radeon 8500 did, so the parent poster's integrated 9100 certainly can too.

Re:OK, they're integrated "properly", but... (1)

not flu (1169973) | more than 3 years ago | (#32227164)

GPUs for general computation are a thing of the future, not a thing of the present. I bought a discrete graphics card recently for DVI output, 3D performance was not even a consideration. h.264 acceleration, silence, price and linux/windows/OS X compatibility were. Only gamers give a fuck, every other use for GPUs is (at present) extremely niche.

Re:OK, they're integrated "properly", but... (1)

sznupi (719324) | more than 3 years ago | (#32224818)

Intel GFX is shit for many games, especially older ones (considering the state of their drivers); they've had problems with old 2D Direct(something...2D?) games since the Vista drivers, FFS.

At least they manage to run one of the most popular FPS games ever properly, lucky you...

Re:OK, they're integrated "properly", but... (1)

Shikaku (1129753) | more than 3 years ago | (#32225576)

DirectDraw. Microsoft is going to release Direct2D with IE9 for faster rendering, and DirectDraw is deprecated...

Re:OK, they're integrated "properly", but... (1)

sznupi (719324) | more than 3 years ago | (#32225646)

Deprecation doesn't mean much if one simply wants to run many of the older games, using tech which should work in the OS and drivers right now. Older games, for which Intel gfx was supposed to be "fine"...

Re:OK, they're integrated "properly", but... (2, Insightful)

TubeSteak (669689) | more than 3 years ago | (#32225064)

Intel graphics are only shit for gamers who want maximum settings for recent games.

Having the "best" integrated graphics is like having the "best" lame horse.
Yea, it's an achievement, but you still have a lame horse and everyone else has a car.

Re:OK, they're integrated "properly", but... (1, Insightful)

sznupi (719324) | more than 3 years ago | (#32225418)

What are you talking about? On good current integrated graphics many recent games work quite well; mostly "flagship", bling-oriented titles have issues.

"Lean car -> SUV" probably rings closer to home...

Re:OK, they're integrated "properly", but... (0)

drsmithy (35869) | more than 3 years ago | (#32225954)

Having the "best" integrated graphics is like having the "best" lame horse.

In an mpg race...

Re:OK, they're integrated "properly", but... (0, Troll)

Loomismeister (1589505) | more than 3 years ago | (#32225032)

Intel doesn't make graphics, so how could you say that the non-existent graphics are shit? Games have graphics that are rendered by processors, and AMD and Intel render the same graphics as each other for any game.

I feel bad for feeding this obvious troll, but I find it surprising that this was modded up as insightful.

Re:OK, they're integrated "properly", but... (5, Informative)

lowlymarine (1172723) | more than 3 years ago | (#32226064)

Sigh, I know *I'm* the one actually feeding the troll here, but: http://www.intel.com/consumer/products/technology/graphics.htm [intel.com]

The page for the GMA 950 [intel.com] even has this hilarious tidbit:
"With a powerful 400MHz core and DirectX* 9 3D hardware acceleration, Intel® GMA 950 graphics provides performance on par with mainstream graphics card solutions that would typically cost significantly more."
Whoever wrote that line must have been borrowing Steve's Reality Distortion Field.

Re:OK, they're integrated "properly", but... (0)

Anonymous Coward | more than 3 years ago | (#32226412)

GeForce FX 5200 would like to have a few words with you.

Re:OK, they're integrated "properly", but... (1)

meow27 (1526173) | more than 3 years ago | (#32226804)

Intel's Linux drivers suck dead, half-rotten, year-old horseballs.

ATI's graphics drivers are at least tolerable... heck, I can't do dual monitors without PERMANENTLY BREAKING my Intel graphics driver! Yes, this has happened to me while using Ubuntu Jaunty, and it's why I'm never getting a laptop without an Nvidia or ATI card.

(Yes, a re-install of Ubuntu worked.)

Re:OK, they're integrated "properly", but... (1, Insightful)

bemymonkey (1244086) | more than 3 years ago | (#32226136)

Why are Intel graphics shit? They run cool, use very little power and have sufficient grunt for anything a typical non-gamer (maybe CAD and GPU-accelerated Photoshop aside) will throw at them...

Not being able to run games does not make an integrated GPU shit...

Re:OK, they're integrated "properly", but... (1)

Atmchicago (555403) | more than 3 years ago | (#32227150)

That depends on what you plan on using it for. I can run a composited desktop, Torchlight, and Civ 4 on a Core i3 (1900x1200). It supports h.264 decoding. It's low power. And if it gets too slow in a few years I can buy a $50 card to upgrade. So for me it's fine.

Re:OK, they're integrated "properly", but... (2, Insightful)

markass530 (870112) | more than 3 years ago | (#32224546)

I would say it will matter, or at least it might; you can't really write it off until you've seen it in the wild. AMD's more elegant initial dual-core solution was infinitely better than Intel's "let's slap two space heaters together and hope for the best".

Re:OK, they're integrated "properly", but... (1)

blankinthefill (665181) | more than 3 years ago | (#32224550)

I agree, this really strikes me as the same thing that happened with the memory controllers/FSB a few years ago. They move all of it on die, then claim it's this great huge thing... but in the end it really doesn't make all that much of a difference in the grand scheme of things. Obviously it's a good move, and one that Intel is going to want to make... EVENTUALLY. But what's the real-world benefit of this over the Intel solution? IS there any benefit that the average user buying these chips is ever going to notice (these are integrated graphics... probably not many power users rushing to get these)? I love real competition between AMD and Intel, as it's been shown many times over to be good for us, the consumers, so I too hope this may lead to better competition (early releases do tend to be good for that), but that's really the only bright spot I can see in this release for AMD.

Re:OK, they're integrated "properly", but... (5, Informative)

Aranykai (1053846) | more than 3 years ago | (#32224624)

I recently went from an older AMD dual core to a Phenom II. With the exact same board and hardware, my memory performance increased by about 20% thanks to the independent memory controllers.

AMD also makes strikingly capable on-board graphics, so this will likely rule out the need for on-board or discrete video in the average person's computer. Cheaper/simpler motherboards and hopefully better integration of GPGPU functionality for massively parallel computational tasks.

Re:OK, they're integrated "properly", but... (4, Insightful)

BikeHelmet (1437881) | more than 3 years ago | (#32225522)

Lower power consumption, making AMD chips more competitive in notebooks - perhaps even netbooks.

Re:OK, they're integrated "properly", but... (1)

bhtooefr (649901) | more than 3 years ago | (#32226258)

Although Atoms already have the GPU on-die nowadays...

Re:OK, they're integrated "properly", but... (2, Interesting)

crazycheetah (1416001) | more than 3 years ago | (#32225210)

While this is more for gamers (and other more GPU-intensive tasks; if GPGPU use keeps increasing--if it is increasing?--it could become more of a factor for more people), AMD has hinted at the ability to use the integrated GPU in the CPU alongside a dedicated graphics card, using whatever the hell they call that (I know nVidia's is SLI, only because I just peeked at the box for my current card). So, it's something power users could actually be quite happy to get their hands on, if it works well. And as for non-power users, we can get this and not worry about graphics cards on the mobo or dedicated. Sounds like a good deal to me. And that beats anything Intel has to offer with this same idea (not that Intel doesn't win in other areas).

Re:OK, they're integrated "properly", but... (0)

Anonymous Coward | more than 3 years ago | (#32225324)

ATI's is called CrossFire, but it (and SLI) require identical chipsets, which you won't be getting. nVidia's PhysX would benefit from an additional nVidia chipset in the system, but ATI doesn't have anything like that. Basically, if you're a gamer, this is complete non-news, and depending on the price increase over a regular CPU, it's not even useful for regular people, since you can get a cheap-ass card for $50 or less.

Re:OK, they're integrated "properly", but... (5, Interesting)

sznupi (719324) | more than 3 years ago | (#32224552)

Actually, the situation might be reversed this time; sure, the fact that Intel's quad cores weren't "real" didn't matter much, because their underlying architecture was very good.

In contrast, Intel GFX is shit compared to AMD's. The former can usually do all the "daily" things (at least for now; who knows if it will keep up with more and more general usage of GPUs...); the latter, even in integrated form, is surprisingly sensible even for most games, excluding some of the latest ones.

Plus, if AMD puts this GPU on one die, it will probably be manufactured at GlobalFoundries, which probably means a smaller process and much more speed.

Re:OK, they're integrated "properly", but... (2, Insightful)

evilviper (135110) | more than 3 years ago | (#32225072)

This is really just AMD getting beaten to the punch again, and having to try to spin it in some positive way.

I'll have to call you an idiot for falling for Intel's marketing and believing that, just because they can legally call it by the same name, it remotely resembles what AMD is doing.

Re:OK, they're integrated "properly", but... (4, Insightful)

MemoryDragon (544441) | more than 3 years ago | (#32226178)

Except that Intel has yet to deliver an integrated graphics solution that deserves the name. AMD has the advantage that they can bundle an ATI core into their CPUs, which finally means decent graphics.

Re:OK, they're integrated "properly", but... (2)

Anonymous Coward | more than 3 years ago | (#32226320)

If AMD are simply tossing a GPU and a CPU on the same die because they can... then agreed.

If AMD are taking advantage of the much reduced distance between cpu and gpu units to harness some kind of interoperability for increased performance or reduced power usage over the Intel "glue em together" approach... then maybe this could be a different thing altogether.

Re:OK, they're integrated "properly", but... (2, Insightful)

Hurricane78 (562437) | more than 3 years ago | (#32226848)

I’m sorry, but I still support everyone who does things properly instead of “quick and dirty”.
What Intel did, is the hardware equivalent of spaghetti coding.
They might be “first”, but it will bite them in the ass later.
Reminds one of those “FIRST” trolls, doesn’t it?

The Diff (3, Insightful)

fast turtle (1118037) | more than 3 years ago | (#32227008)

There are two sides to this coin, and Intel's is pretty neat. By not having the GPU integrated into the CPU die, Intel can improve the CPU or GPU without having to redesign the entire chip. For example, any power management improvements can be moved into the design as soon as they're ready. Another advantage for them is the fact that the CPU and GPU dies are actually independent and can be manufactured using whatever process makes the most sense to them.

AMD's design offers a major boost to overall CPU performance simply because the integration is far deeper than Intel's. From what I've read, Fusion ties the stream processors (FPU) directly to the CPU and should offer a major boost in all math ops, and I expect it will finally compete with Intel's latest CPUs with regard to FPU operations.

Re:OK, they're integrated "properly", but... (1, Interesting)

Anonymous Coward | more than 3 years ago | (#32227052)

Intel has always had worse engineering and better execution than AMD.

Instead of Intel "getting there first" perhaps it's Intel executing first on AMD's better engineering; like AMD64. Huh?

You incessant Intel posters are the Fox News of Slashdot.

This Is Good For IE 9 (1, Interesting)

WrongSizeGlass (838941) | more than 3 years ago | (#32224490)

With IE 9 headed toward GPU-assisted acceleration, these types of "hybrid" chips will make things even faster. Since AMD's main end user is a Windows user, and IE 9 will probably be shipping later this year, these two may be made for each other.

Of course every other aspect of the system will speed up as well, but I wonder how this type of CPU/GPU package will work with aftermarket video cards. If you want a better video card for gaming, will the siamese-twin GPU bow to the additional video card?

Re:This Is Good For IE 9 (1)

GigaplexNZ (1233886) | more than 3 years ago | (#32224532)

With IE 9 headed toward GPU assisted acceleration, these types of "hybrid" chips will make things even faster.

Even faster than current generation discrete GPUs? I think not.

Re:This Is Good For IE 9 (4, Insightful)

WrongSizeGlass (838941) | more than 3 years ago | (#32224590)

Even faster than current generation discrete GPUs? I think not.

They'll move data inside the chip instead of having to send it off to the internal bus, they'll have access to L2 cache (and maybe even L1 cache), they'll be running in lock-step with the CPU, etc, etc. These have distinct advantages over video cards.
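
To put rough numbers on the on-die vs. bus argument, here's a minimal sketch in C; the bandwidth and latency figures are illustrative assumptions, not measurements of any real part:

<ecode>
/* Hypothetical comparison of moving a 4 MB buffer over a PCIe-style
 * link vs. an on-die path. All numbers are assumed round figures,
 * for illustration only. */
#include <stdio.h>

int main(void) {
    const double bytes    = 4.0 * 1024 * 1024;  /* 4 MB payload            */
    const double pcie_bw  = 8e9;                /* ~8 GB/s link, assumed   */
    const double pcie_lat = 1e-6;               /* ~1 us setup latency     */
    const double die_bw   = 100e9;              /* shared-cache path       */
    const double die_lat  = 50e-9;              /* tens of ns              */

    printf("PCIe-ish transfer: %.1f us\n", (pcie_lat + bytes / pcie_bw) * 1e6);
    printf("On-die transfer:   %.1f us\n", (die_lat  + bytes / die_bw)  * 1e6);
    return 0;
}
</ecode>

With these made-up figures the on-die path comes out roughly an order of magnitude faster for the same payload, which is the gist of the argument above.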

Re:This Is Good For IE 9 (0)

Anonymous Coward | more than 3 years ago | (#32224642)

I don't think it will match the fastest discrete GPUs, but for general use and light gaming AMD's motherboard chipsets with integrated graphics have been very compelling.

Re:This Is Good For IE 9 (2, Interesting)

GigaplexNZ (1233886) | more than 3 years ago | (#32224842)

That'll certainly increase bandwidth which will help outperform current integrated graphics and really low end discrete chips, but I severely doubt it will be enough to compensate for the raw number of transistors in the mid to high end discrete chips. An ATI 5670 graphics chip has just about as many transistors as a quad core Intel Core i7.

Re:This Is Good For IE 9 (2, Insightful)

alvinrod (889928) | more than 3 years ago | (#32227090)

And if Moore's law continues to hold, within the next four years it won't be an issue to put both of those chips on the same die. Hell, that may even be the budget option.

Re:This Is Good For IE 9 (0)

Anonymous Coward | more than 3 years ago | (#32225760)

It'll be like the floating-point co-processor deal back in the 486SX/DX days: the general-purpose processor gets a specialized computing unit bolted right next to it on the same die. It probably makes MMX/SSE/AltiVec look bad in comparison... and might lure Apple to use AMD CPUs. They might like the possibilities it gives; imagine: "PowerMac 9000 has four Fusion GPUs: each 100x faster than AltiVec!"

Re:This Is Good For everyone (1)

ufoolme (1111815) | more than 3 years ago | (#32225852)

One distinct disadvantage... HEAT! Even with all the die shrinks.
No. 1 advantage: forcing Intel to produce decent graphics.

Re:This Is Good For everyone (3, Informative)

Hal_Porter (817932) | more than 3 years ago | (#32226218)

Actually Intel had a radical way to handle this - Larrabee. It was going to be 48 in-order processors on a die with the Larrabee New Instructions. There was a SIGGRAPH paper with very impressive scalability figures [intel.com] for a bunch of games running DirectX in software - they captured the DirectX calls from a machine with a conventional CPU and GPU and injected them into a Larrabee simulator.

This was going to be a very interesting machine - you'd have a machine with good but not great gaming performance and killer server performance - servers are naturally "embarrassingly parallel" because you can have one thread per client. A sort of x86 take on Sun's Niagara.

Of course there are problems with this sort of approach. Most current games are not very well threaded - they have a small number of threads that will run poorly on an in order CPU. So if the only chip you had was a Larrabee and it was both a CPU and a GPU the GPU part would be well balanced across multiple cores. The CPU part would likely not. You have to wonder about memory bandwidth too.

Larrabee was switched to be a GPU only and then canned.

Of course as a pure GPU it is a bit of a poor design. Real GPUs don't drag in x86 compatibility - they can implement whatever instruction set is best and nothing else. The instruction set is not publicly exposed and can change from generation to generation. You can cram a lot more than 48 cores onto a GPU and the peak performance is higher. Power consumption is lower too.

Still, a modern gaming GPU is huge - there's no way you're going to cram it and a modern CPU onto one die and get something affordable. Then again, CPU+GPU chips are probably not aimed at gamers - there's an argument for having a CPU and a stripped-down integrated GPU on one chip for netbooks, like the latest Atoms do.

You could cram in a chipset too to reduce the price on netbooks.

Re:This Is Good For everyone (2, Informative)

Rockoon (1252108) | more than 3 years ago | (#32227156)

Of course there are problems with this sort of approach. Most current games are not very well threaded - they have a small number of threads that will run poorly on an in order CPU. So if the only chip you had was a Larrabee and it was both a CPU and a GPU the GPU part would be well balanced across multiple cores. The CPU part would likely not. You have to wonder about memory bandwidth too.

I believe that it was in fact memory bandwidth which killed Larrabee. A GPU's memory controller is nothing like a CPU's memory controller, so trying to make a many-core CPU behave like a GPU while still also behaving like a CPU just doesn't work very well.

Modern, well-performing GPUs require the memory controller to be specifically tailored to filling large cache blocks. Latency isn't that big of an issue. The GPU is likely to need the entire cache line, so latency is sacrificed for more bandwidth. The latency is amortized over many, many operations.

CPUs, on the other hand, require the memory controller to be tailored to filling small cache blocks. Latency is a big issue. The CPU may only want or need 4 bytes from that cache line, so latency can't be sacrificed for bandwidth. The latency may not be amortized over many operations.
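
A quick numeric sketch of that amortization argument (the latency, line size, and bytes-used figures are assumptions for illustration only):

<ecode>
/* Useful throughput when a fixed per-request latency is amortized over
 * a whole cache line (GPU-style streaming) vs. only a few bytes of the
 * line actually being used (CPU-style pointer chasing). Real GPUs also
 * hide the latency by keeping many requests in flight; this only shows
 * the per-request arithmetic with made-up numbers. */
#include <stdio.h>

int main(void) {
    const double latency  = 100e-9;  /* assumed 100 ns per memory request */
    const double line     = 64.0;    /* 64-byte cache line                */
    const double used_cpu = 4.0;     /* CPU only wants 4 bytes of it      */

    printf("Whole line used:   %.2f GB/s useful data\n", line / latency / 1e9);
    printf("4 bytes used:      %.2f GB/s useful data\n", used_cpu / latency / 1e9);
    return 0;
}
</ecode>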

Re:This Is Good For IE 9 (1)

forkazoo (138186) | more than 3 years ago | (#32225322)

Even faster than current generation discrete GPUs? I think not.

Not for most things, but for some specific GPGPU-type stuff where you want to shuffle data between the CPU and the GPU, yes. Much, much faster. For exactly the same reasons that we no longer have off-chip FPUs. A modern separately-socketed FPU could have massive performance. It could have its own cooling system, so you could use a ton of power just on the FPU. OTOH, you would still need to get data back and forth to the main CPU, so it makes more sense to have a slightly more modest FPU right on the chip for most things.

Re:This Is Good For IE 9 (4, Interesting)

bhtooefr (649901) | more than 3 years ago | (#32226284)

Arguably, the "off-chip FPU" nowadays IS a GPU - hence all the GPGPU stuff.

Re:This Is Good For IE 9 (1)

Aranykai (1053846) | more than 3 years ago | (#32224638)

Seeing as AMD is in both markets, I'm sure they will have no issue working alongside discrete graphics.

Re:This Is Good For IE 9 (1)

hairyfeet (841228) | more than 3 years ago | (#32224864)

IIRC (sorry, too f'n tired to Google the link) it is supposed to work similarly to Hybrid CrossFire: if you pair it with an AMD discrete card, it lets the card drop into low-power mode when running the desktop or basic video, and when you need to kick it up for gaming it hands things over to the discrete card, unless it is a low-end card, in which case they split the load in hybrid mode.

So it should be pretty good for keeping power usage down when you don't need a full-bore card. I have to say I'm really liking the way AMD is going right now. Yeah, I know Intel has the biggest ePeen ATM, but unless you are seriously pushing the machine, who cares? After seeing how nice the new duals ran I went and built one for myself, and when the price dropped I was able to drop in a 925 quad. Frankly this CPU takes everything I can throw at it, runs less than 112F under load and 83F idle, and you really can't beat building a quad core with 8GB of RAM and W7 HP x64 for less than $700.

If these chips turn out to be good I have a feeling most if not all of my customers will be getting them in new builds. And yes, Intel having shitty onboards DOES make a difference, even if you don't game. One word: video. With everyone having HD widescreens, hardware-accelerated video makes a BIG difference, especially with dual-core CPUs. The low-end dual-core Intel PCs I built for customers just didn't seem as smooth or easy with video as the new AMDs, even with the lower-end Athlon IIs. And with the economy still sucking, price matters, and my customers can simply get a much higher-spec'd rig all around by going with all AMD. Having the biggest ePeen is nice, but bang for the buck counts pretty highly in my book, and AMD has the crown for that IMHO.

Bad for the GNOME project (0)

Anonymous Coward | more than 3 years ago | (#32225822)

The upcoming GNOME 3.0 will not run unless you have a working driver for your graphics card. If this new chip hits the streets at the same time as GNOME 3, then a lot of people will end up in trouble.

This post is Die HArd! (-1, Offtopic)

BlackBloq (702158) | more than 3 years ago | (#32224528)

Today is a good day to die!
Must of rolled a 20 with the die!
This one is not diet !
This is too loaded with diatribes!
O'k I'm done the torture but hey it was fun...
PS: this is not a diurnal post!

Seems a bit rich to call it crude (2, Interesting)

Gadget_Guy (627405) | more than 3 years ago | (#32224568)

Calling Intel's offerings crude sounds like it is quoting from AMD's press release. It may be crude, but it works and was quick and cheap to implement. But does it have any disadvantages? Certainly the quote from the article doesn't seem terribly confident that the integrated offering is going to be any better:

We hope so. We've just got the silicon in and we're going through the paces right now - the engineers are taking a look at it. But it should have power and performance advantages.

Dissing a product for some technical reason that may not have any real performance penalties? That's FUD!

Re:Seems a bit rich to call it crude (2, Interesting)

sznupi (719324) | more than 3 years ago | (#32224780)

...But does it have any disadvantages?...

With Intel's offerings the thing is that they don't really have any advantages (except perhaps making 3rd-party chipsets less attractive for OEMs, but that's a plus only for Intel). They didn't end up cheaper in any way (OK, a bit too soon to tell... but do you really have some hope?). They are certainly a bit faster - but still too slow; and anyway it doesn't matter much given the state of Intel drivers.

AMD integrated GFX already has very clear advantages. This new variant, integrated with the CPU, while certainly simpler than standalone parts, might make up for it with a much higher clock and a wide data bus. Ending up quite attractive.

Re:Seems a bit rich to call it crude (1)

Lemming Mark (849014) | more than 3 years ago | (#32226552)

I thought integration in the same package allowed (presumably for electrical reasons - very small physical distance, not limited by number of pins you can fit) a faster interconnect between the two dies, so there actually is (potentially) some advantage to doing it, even though it's not proper dual core.

Re:Seems a bit rich to call it crude (1)

Gadget_Guy (627405) | more than 3 years ago | (#32227154)

With Intel's offerings the thing is that they don't really have any advantages

What about the large reduction in power requirements for their supporting chipset? This was always the weakest link for Intel. Their CPUs are quite low-powered, but their chipsets ruin any power savings. The all-in-one CPUs now allow for substantial overall power savings, meaning Intel is the king when it comes to performance per watt.

Re:Seems a bit rich to call it crude (1)

drinkypoo (153816) | more than 3 years ago | (#32226772)

Calling Intel's offerings crude sounds like it is quoting from AMD's press release. It may be crude, but it works and was quick and cheap to implement. But does it have any disadvantages?

Of course it does. Having an additional interconnect between CPU and GPU means not only that cost is higher, but that performance is decreased. You have to have an interface that can be carried through such an interconnect, which is just another opportunity for noise; this interface will likely be slower than various core internals. With both on one die, you can integrate the two systems much more tightly, and cheaper too.

Re:Seems a bit rich to call it crude (1)

Gadget_Guy (627405) | more than 3 years ago | (#32227136)

Actually, Intel's CPUs with built-in GPU are infinitely faster than AMD's in that you can buy one of the Intel chips now. Coming up with technical quibbles is meaningless without any real benchmarks to show the differences, which even AMD can't provide.

CPUGPU (0)

pegasustonans (589396) | more than 3 years ago | (#32224606)

I, for one, welcome our new small furry not-yet-house-trained overlords!

CPUGPU, just step around it...

Re:CPUGPU (1)

kvezach (1199717) | more than 3 years ago | (#32226296)

And so the wheel of reincarnation [cap-lore.com] turns another notch...

Could this be AMD's next Athlon? (2, Insightful)

rastoboy29 (807168) | more than 3 years ago | (#32224716)

I hope so, Intel is far too dominant right now.

Re:Could this be AMD's next Athlon? (-1, Flamebait)

Anonymous Coward | more than 3 years ago | (#32225034)

Wow, really? Me too!

Man, wouldn't that be cool if AMD's new chip totally blew Intel's out of the water? Gads, it would be so cool, man! AMD would be like Linux -- slow to adapt and initially inferior, but eventually becoming the most powerful and efficient alternative to Intel's chips.

I had an intel Core2 Quad that was like 500 bucks and ran OK, but then I got an AMD for $100 and it was totally better! I mean, really. My games(and pr0ns, tee-hee) ran much more smoothly and I felt cool because I had rooted for the underdog and won! Man, oh man. My laptop has a big AMD sticker on it because Its so badass at the coffee shops. Even the Apple guys look at my bitchin' lappy and get all jealous because they know my shit is twice as fast at half the price.

AMD processors even run all of the software that Intel processors do! That means Crysis, Word, Photoshop, Flash...even Linux!!!

Fuck, man. You just made my night with that. I'm so happy. FUCK YEAH! I'm gonna go snort up a bunch of crystal meth and strangle a hooker.

How do you cool this thing? (1)

Ruvim (889012) | more than 3 years ago | (#32224764)

With the bulk of processing power for both CPU and Graphics being concentrated in a single die, I can only imagine how hot it's going to get!

AMD Fusion is about GPGPU (3, Informative)

Boycott BMG (1147385) | more than 3 years ago | (#32224766)

AMD Fusion was meant to compete with Larrabee, which never shipped. The Intel package with two separate dies is not interesting. The point of these products is to give the programmer access to the vast FP power of a graphics chip, so they can do, for instance, a large-scale FFT and IFFT faster than a normal CPU. If this proves more powerful than Nvidia's latest Fermi (the GTX 480, I believe), then expect a lot of shops to switch. Right now my workplace has an Nvidia Fermi on backorder, so it looks like this is a big market.
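
For the FFT case mentioned above, a plain CPU-side baseline might look like the FFTW sketch below; FFTW is a common CPU library, and the GPU-accelerated equivalent would go through a vendor toolkit instead, which isn't shown here. The transform size is an arbitrary example; compile with something like gcc fft.c -lfftw3 -lm.

<ecode>
/* CPU reference path for a large 1D complex FFT + inverse FFT using FFTW.
 * A GPGPU version would dispatch the same transform to the graphics part;
 * this only shows the CPU baseline. */
#include <complex.h>
#include <fftw3.h>

int main(void) {
    const int n = 1 << 20;                              /* ~1M-point transform */
    fftw_complex *in  = fftw_malloc(sizeof(fftw_complex) * n);
    fftw_complex *out = fftw_malloc(sizeof(fftw_complex) * n);

    for (int i = 0; i < n; i++)
        in[i] = (i % 7) + 0.0 * I;                      /* dummy input signal  */

    fftw_plan fwd = fftw_plan_dft_1d(n, in, out, FFTW_FORWARD,  FFTW_ESTIMATE);
    fftw_plan inv = fftw_plan_dft_1d(n, out, in, FFTW_BACKWARD, FFTW_ESTIMATE);

    fftw_execute(fwd);   /* forward FFT                         */
    fftw_execute(inv);   /* inverse FFT (unnormalized in FFTW)  */

    fftw_destroy_plan(fwd);
    fftw_destroy_plan(inv);
    fftw_free(in);
    fftw_free(out);
    return 0;
}
</ecode>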

Re:AMD Fusion is about GPGPU (1)

bhtooefr (649901) | more than 3 years ago | (#32226302)

More like, NV can't get the yields up, I suspect.

Re:AMD Fusion is about GPGPU (1)

baka_toroi (1194359) | more than 3 years ago | (#32226506)

Well, it's hard getting them up from 1.7%

Bigger GPU Than CPU, Please (1)

Doc Ruby (173196) | more than 3 years ago | (#32224808)

I want my CPU to be mostly GPU. Just enough CPU to run the apps. They don't need a lot of general purpose computation, but the graphics should be really fast. And a lot of IO among devices, especially among network, RAM and display.

Re:Bigger GPU Than CPU, Please (1)

keeboo (724305) | more than 3 years ago | (#32224876)

Yeah, that surely matters a lot for corporate users.

Re:Bigger GPU Than CPU, Please (1)

Doc Ruby (173196) | more than 3 years ago | (#32226936)

1. I don't care. And there are many millions, billions of people like me.

2. Most corporate computing also uses "netbook" type functionality that doesn't use a big CPU, but needs a bigger GPU. That's why there are CPUs like the Atom.

3. Sarcasm is just obnoxious when you're wrong.

Advanced features (4, Interesting)

wirelessbuzzers (552513) | more than 3 years ago | (#32224868)

In addition to the CPGPU or whatever they're calling it, Fusion should finally catch up to (and exceed) Intel in terms of niftilicious vector instructions. For example, it should have crypto and binary-polynomial acceleration, bit-fiddling (XOP), FMA and AVX instructions. As an implementor, I'm looking forward to having new toys to play with.
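
As a small illustration of the kind of fused multiply-add being talked about, here is a hedged sketch using FMA intrinsics (the function and array names are made up for the example; it assumes an FMA-capable part and compiler flags like -mavx2 -mfma):

<ecode>
/* Fused multiply-add over packed floats: d[i] = a[i] * b[i] + c[i].
 * Illustrative only; assumes n is a multiple of 8. */
#include <immintrin.h>

void fma_madd(const float *a, const float *b, const float *c, float *d, int n) {
    for (int i = 0; i < n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);             /* load 8 floats      */
        __m256 vb = _mm256_loadu_ps(b + i);
        __m256 vc = _mm256_loadu_ps(c + i);
        _mm256_storeu_ps(d + i, _mm256_fmadd_ps(va, vb, vc));  /* a*b+c fused */
    }
}
</ecode>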

Re:Advanced features (1)

fragMasterFlash (989911) | more than 3 years ago | (#32224980)

CPGPU

Actually they are calling it an APU (accelerated processing unit), FWIW.

CUDA? (1)

Gothmolly (148874) | more than 3 years ago | (#32224894)

Does this mean CUDA support in every AMD "CPU" ?

Re:CUDA? (2, Informative)

Anonymous Coward | more than 3 years ago | (#32225544)

No.
CUDA is Nvidia.
ATI has Stream.

Not marketed toward me (1)

DeadboltX (751907) | more than 3 years ago | (#32224946)

Call me when they can fit 9 inches of graphics card into one of these CPUs.

Re:Not marketed toward me (4, Insightful)

BiggerIsBetter (682164) | more than 3 years ago | (#32225208)

Call me when they can fit 9 inches of graphics card into one of these cpu.

Size isn't everything!

Re:Not marketed toward me (3, Funny)

kaizokuace (1082079) | more than 3 years ago | (#32226376)

Thats what the guys tell themselves.

Re:Not marketed toward me (1)

CODiNE (27417) | more than 3 years ago | (#32226454)

Clearly you have a small graphics card yourself.

Re:Not marketed toward me (0)

Anonymous Coward | more than 3 years ago | (#32226488)

Come on, man! Not even your mom can handle 9 inches.

Re:Not marketed toward me (0)

Anonymous Coward | more than 3 years ago | (#32227078)

The HD 4200 IGP in my 785G chipset has far superior performance to the GeForce FX 5800 (according to some sources, it even rivals the GeForce 6600). Currently, they can fit 9 inches of 2003/2004 graphics card into a chip on my motherboard. I'm fairly certain they can put that onto a CPU.

future upgrading? (4, Interesting)

LBt1st (709520) | more than 3 years ago | (#32225118)

This is great for mobile devices and laptops but I don't think I want my CPU and GPU combined in my gaming rig. I generally upgrade my video card twice as often as my CPU. If this becomes the norm then eventually I'll either get bottlenecked or have to waste money on something I don't really need. Being forced to buy two things when I only need one is not my idea of a good thing.

Re:future upgrading? (2, Insightful)

FishTankX (1539069) | more than 3 years ago | (#32225462)

The graphics core will likely be small, add an inconsequential number of transistors, be disable-able, and/or be CrossFire-able with the main graphics card.

However, the place I see this getting HUGE gains is if the on-board GPU is capable of doing physics calculations. Having a basic physics co-processor on every AMD CPU flooding out the gates will do massive good for the implementation of physics in games, and it can probably offload a lot of other calculations in the OS too. On-board video encode acceleration, anyone?

Just having a dedicated, super-wide, parallel-optimized floating-point monster on the die for relatively little price penalty seems like an excellent idea to me.

Re:future upgrading? (1)

fuzzyfuzzyfungus (1223518) | more than 3 years ago | (#32225496)

Depending on how the economics shake out, it can pretty easily be a good thing, or at least an indifferent one.

Generally, it is cheaper to make a given feature standard than it is to make it optional (obviously, making it unavailable is cheaper still). Standard means that you can solder it right onto the mainboard, or include it on the die; it means no separate inventory channel to track, and greater economies of scale. For these reasons, once a given feature achieves some level of popularity, it shifts from being optional to being standard (remember the days when motherboards didn't include at least one NIC and some sort of audio, or the days when an FPU was an optional part, or the brief period when your 3D card was a separate board that went next to your 2D card?).

This does mean that, for users sufficiently outside the common profile, redundant components get shipped. Audiophiles end up with unused onboard sound. Gamers end up with unused onboard video (though, with the rise of demand-based graphics switching, this is becoming less serious, since having a dinky integrated GPU to paint your desktop and only having the fire-breathing discrete board power up for games is a good thing). For server stuff, you disable the onboard Realtek crap and get a real NIC. However, given the gigantic economies of scale in the electronics business, it'd likely cost you more to get "only what you need" than it would to get "what Joe Average needs + my chosen option cards", even though the latter represents slightly more silicon and connectors and stuff.

Re:future upgrading? (1)

CAIMLAS (41445) | more than 3 years ago | (#32225684)

I generally upgrade my video card twice as often as my CPU. If this becomes the norm then eventually I'll either get bottlenecked or have to waste money on something I don't really need.

That depends: do you buy Intel or AMD processors, currently?

Because if you buy Intel processors, I can see your point (and the reason behind not frequently upgrading your CPU): CPU upgrades are costly if the socket changes with every upgrade, requiring a new board at the same time. With AMD processors, however, they've retained the same basic socket for quite some time (with negligible performance detriment and the ability to upgrade components largely independently). This is Good Design on their part.

If they continue to do this paired with the GPCPU, it'll arguably not be that different, and you might even save some money by getting a small incremental CPU upgrade when you upgrade the chip for increased graphics processing. Your board will (may) be cheaper due to not needing a PCI bus for extra cards at all (as a graphics card is often the only add-on most people put in their systems these days). And on and on...

It will Be Fast! (1)

b4upoo (166390) | more than 3 years ago | (#32225212)

But will quad-core and six-core chips also carry a graphics chip? And how long before dual quad-core motherboards hit the streets?
Frankly, we are on the edge of a serious improvement in computers.

Re:It will Be Fast! (1)

hedwards (940851) | more than 3 years ago | (#32226566)

Indeed, but do most people really need more than 4 cores at this point? Software for the home user still hasn't really caught up with quad core at this point, it'd be a bit silly to put out a dual quad core board for that market. OTOH that'd be just dandy for the server market.

Re:It will Be Fast! (0)

Anonymous Coward | more than 3 years ago | (#32227120)

Those motherboards already exist. You just have to look at the server motherboards.

eg: http://www.newegg.ca/Product/ProductList.aspx?Submit=ENE&N=2010200302%201071346469&name=Dual%20LGA%201366.


Apple angle (2, Interesting)

Per Cederberg (680752) | more than 3 years ago | (#32226126)

Worth noting is that Apple has invested rather heavily in technology to allow programmer use of the GPU in MacOS X. And were recently rumored to have met with high ranking persons from AMD. Seems only logical that this type of chip could find its way into some of the Apple gear.

Question is of course if it would be powerefficient enough for laptops, where space is an issue...

more like x86-64+GPU instructions combined.. (2, Interesting)

strstr (539330) | more than 3 years ago | (#32226224)

I look forward to seeing what AMD's new architecture brings. It's not really interesting to think of it as integrating a GPU into the same space as a CPU; it's about creating one chip that can do more exotic types of calculations than either chip could alone and making that available in every system. I'm also envisioning "GPU" instructions being executed where normally CPU instructions would run when those units are otherwise idle, and vice versa, basically so everything available could be put to use.

intel is all about marketing (1, Troll)

toastliscio (1729734) | more than 3 years ago | (#32226834)

AMD had a better architecture in the days of the Athlon, while Intel made the "NetBurst" architecture - a name that makes users believe it bursts internet surfing. It was a 30-stage pipeline, because that let the clock speed go up, and so it was good for making users think "more MHz = better CPU" (like when people buy stereos: more watts = better sound. Yuck.) AMD was the first to release dual-core desktop processors, but Intel preceded AMD with the dual-core Pentium 4: two single-core dies on one package. AMD was the first to release quad-core desktop processors, but Intel preceded AMD with quad cores: two dual-core dies on one package. Now it is the same story with CPU+GPU. The bad thing here is that all this is done with the complicity of magazines and hardware review websites around the world.

Socket compatibility (0)

Anonymous Coward | more than 3 years ago | (#32226912)

If AMD's tradition of using the same socket type continues here, then this should be a much bigger deal than Intel's entry. It'll mean that for a very small amount of money, I can add some GPU action to the little boxes I have around where the video card I scrounged is so old it's pathetic.

The same sort of situation appears in servers. I have a bunch of AMD servers which I certainly don't want to put a graphics card in, but I might have some algorithms which would benefit from GPUs. Instant upgrade.

Now, that's not the general model for success CPU companies really need, but it's a start and would provide a capital boost while AMD goes forward on making most of their chips Fusion chips, pushing a lot of business computing into GPU dependency and therefore AMD dependency. Not bad, from AMD's perspective.
