AMD's Fusion Processor Combines CPU and GPU

timothy posted more than 4 years ago | from the hence-the-name dept.


ElectricSteve writes "At Computex 2010 AMD gave the first public demonstration of its Fusion processor, which combines the Central Processing Unit (CPU) and Graphics Processing Unit (GPU) on a single chip. The AMD Fusion family of Accelerated Processing Units not only adds another acronym to the computer lexicon, but ushers in what AMD says is a significant shift in processor architecture and capabilities. Many of the improvements stem from eliminating the chip-to-chip linkage that adds latency to memory operations and consumes power — moving electrons across a chip takes less energy than moving these same electrons between two chips. The co-location of all key elements on one chip also allows a holistic approach to power management of the APU. Various parts of the chip can be powered up or down depending on workloads."


The simpsons (2, Funny)

Anonymous Coward | more than 4 years ago | (#32455918)

Heh. They should use Apu from the Simpsons in their advertising...

Re:The simpsons (0)

Anonymous Coward | more than 4 years ago | (#32456480)

I think they just wanted to get "FAP" in it.

"This is my FAP-Unit."
"I see..."
"It's very compact, good graphics man."

Re:The simpsons (0)

Anonymous Coward | more than 4 years ago | (#32457102)

The s3 "savage" graphics cards did use a "lily savage" lookalike to advertise their card at a games show. I guess they couldn't afford the real one :-)

Re:The simpsons (1)

zepo1a (958353) | more than 4 years ago | (#32457744)

Yo Dawg, I herd you like to GPU while you CPU so I put a GPU in your CPU so you can GPU while you CPU.

vs Larrabee (1)

jabjoe (1042100) | more than 4 years ago | (#32455946)

This could be interesting. Intel might have to change plans for Larrabee again.

Re:vs Larrabee (1, Insightful)

Anonymous Coward | more than 4 years ago | (#32456094)

Change plans?

No, they need to get their rasterising software written and make the chip quite a bit more efficient... as per their existing plans.

Re:vs Larrabee (0)

Anonymous Coward | more than 4 years ago | (#32456118)

Intel have already announced their own on-chip GPU in a deliberate attempt to stifle Nvidia's Ion platform. http://www.theinquirer.net/inquirer/feature/1651933/intel-confident-sandy-bridge-integrated-gpu

Also, FYI, Larrabee has been announced under a different name as a competitor to Nvidia's HPC Tesla cards. http://www.theregister.co.uk/2010/06/01/intel_knights_co_processor/

AMD's product is just a desperate attempt at trying to be relevant. They need to show they have a product competing with the big boys in all the right channels.

Re:vs Larrabee (5, Insightful)

Rockoon (1252108) | more than 4 years ago | (#32456326)

AMD's product is just a desperate attempt at trying to be relevant. They need to show they have a product competing with the big boys in all the right channels.

AMD is plenty relevant. It is Intel that scrambled to put out a 6-core desktop processor, which was so poorly planned that the cheap version is $1000. Meanwhile nVidia is desperately trying to get people locked into their CUDA API because their video cards just don't bang the performance drum like they used to.

AMD and Intel have different visions. AMD is clearly focusing on getting more cores on chip for more raw parallel performance (12-core CPUs in 4-chip configs are owning the top-end server market... brought to you by AMD), while Intel is clearly trying to maximize memory bandwidth to peak out raw single-threaded performance (triple-channel RAM and larger caches are owning the software rendering and gaming markets).

Normal people are in the $50 to $200 CPU range, and at those price points, solutions from both camps perform about the same. On the video card front, you just can't beat AMD right now. Best price/performance ratio on top of best performance, period.

Re:vs Larrabee (4, Interesting)

Calinous (985536) | more than 4 years ago | (#32456370)

The 6-core Intel processor is the Extreme Edition (which has always been introduced at $1000), and frankly it smokes every other desktop processor out there.
AMD is the value choice - they're cheaper at the same performance point, but they don't really compete in the over-$250 desktop arena.
On the server front, Intel's introduction of Core 2-based Xeons allowed it to compete again, and right now AMD leads in only some cases of server performance (some are draws, but most, I think, go to Intel). Too bad, as server processors were producing a lot of money for AMD.
Intel is also the leader in performance/watt, due to a complex power delivery architecture and better processor production facilities.
Meanwhile, AMD competes where it can on the processor front (but ruled the previous 6 months on the performance graphics front).

Re:vs Larrabee (3, Insightful)

sznupi (719324) | more than 4 years ago | (#32456566)

Intel is also the leader in performance/watt, due to a complex power delivery architecture and better processor production facilities.

As long as you look only at raw CPU performance and power usage. Add GFX performance into consideration and...

(Plus, that would be a fairly recent development for Intel; their power consumption numbers weren't that great by themselves once you also counted the previous-generation chipsets.)

Re:vs Larrabee (2, Insightful)

Rockoon (1252108) | more than 4 years ago | (#32456578)

The 6-core Intel processor is the Extreme Edition (which has always been introduced at $1000)

If ((not realistic for server market) && (can't sell for less than $1000 without undercutting our other offerings))
{
setlabel("Extreme Edition");
}

Where is Intel's budget 6-core design? Is it because they refuse to make budget 6-core CPUs, or is it because they can't make budget 6-core CPUs?

Either way, the proof is in the pudding. They are not targeting the highly parallel market, either by choice (the "ignoring that market" scenario) or by mistake (the "caught with pants down" scenario).

Re:vs Larrabee (1)

Joce640k (829181) | more than 4 years ago | (#32457570)

they don't really compete in the over-$250 desktop arena.

Maybe they don't really want to.

Under $250 is by far the biggest market. Competing with the high-end Intel chips would probably lose money.

Re:vs Larrabee (1)

mdm-adph (1030332) | more than 4 years ago | (#32457656)

Meanwhile, AMD competes where it can on the processor front (but ruled the previous 6 months on the performance graphics front).

Ahem -- still rules. http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-5000/hd-5970/pages/ati-radeon-hd-5970-overview.aspx [amd.com]

(And don't say it cheats because it has two processors; Nvidia has been doing the same thing for their last two model lines as well.)

Re:vs Larrabee (1)

LWATCDR (28044) | more than 4 years ago | (#32457732)

Well, right now the fastest supercomputer is an AMD machine. Next year maybe it will be Intel.
I think you are correct in many ways. One, AMD is the choice for desktops that use CPUs under the $250 mark, which is probably close to 90% by volume.
AMD also has, or had, better performance in virtualization, which is a huge market in servers.
Frankly, the need for big, fast CPUs is dropping. I work at a software company. Our flagship product was a monster. It really could use just about all the CPU, memory, and IO you could throw at it. Graphics really doesn't matter, but everything else does.
It runs well on an Atom if you don't have a bunch of other crap running at the same time.
Why do you think virtualization is so popular? Most servers these days are running at well under 10% utilization. You might as well run six servers on one box and save power, money, and space.

Fusion could be a huge win for AMD. The laptop and netbook markets are the big ones today. If they can create the best-value mobile x86 solution, people will buy a ton of them.
It will even help in the desktop market, because more companies will use them in small form factor PCs.

Re:vs Larrabee (5, Informative)

purpledinoz (573045) | more than 4 years ago | (#32456538)

Also, think of what this means for laptops. First, you save a huge amount of space by not having to have a separate GPU chip on the board. Have you seen how crammed the mainboard is on the MacBook? And with the significant improvements in power consumption, it's a win-win for the laptop market.

Re:vs Larrabee (2, Insightful)

TheGryphon (1826062) | more than 4 years ago | (#32456712)

Hopefully this has good effects for cooling, also. Maybe geniuses will stop designing boards with 2 hot components separated by 4-6" on a board cooled by 1 copper pipe/fan assembly... cleverly heating everything along the whole length of the pipe.

Re:vs Larrabee (2, Informative)

Soul-Burn666 (574119) | more than 4 years ago | (#32456898)

It should be plenty good for space and power consumption. Just look at Intel's US15W chipset, which includes the GMA500 IGP.
It's tiny and consumes 2W, compared to previous-gen chipset + GPU setups (GMA950) that consume 15W, lengthening battery life by a huge margin.
The chip itself has good performance, hindered only by terrible outsourced drivers (Tungsten, I'm looking at you), currently only optimized for video decoding (how about two smooth 1080p streams at less than 100% CPU usage using EVR in MPC?).

Combining the CPU and GPU can probably give a comparable reduction in power consumption and size, backed by an AMD/ATI graphics core instead of a PowerVR core plus terrible Tungsten drivers.

Re:Relevant? (4, Informative)

Joce640k (829181) | more than 4 years ago | (#32456644)

AMD designed/implemented the 64-bit instruction set that will be running our desktop PCs for decades to come.

Intel was the one scrambling to catch up [wikipedia.org] on that.

Re:Relevant? (2, Insightful)

Hal_Porter (817932) | more than 4 years ago | (#32457742)

Back in the days of the Athlon 64 vs. the Pentium 4 and Itanium, AMD were ahead. Still, since Core 2 I'd say Intel are doing better. That being said, Larrabee seems to be dead and I still think the idea has legs. Hopefully AMD's take will be to Larrabee what AMD64 was to IA-64 - i.e. a more pragmatic version of the idea that ends up working better.

Re:vs Larrabee (0)

Anonymous Coward | more than 4 years ago | (#32457016)

Well, best performance period thus far. I expect the next performance period to be even better.

Re:vs Larrabee (-1, Flamebait)

Anonymous Coward | more than 4 years ago | (#32457298)

ITT: fanbois ignoring facts, modded up by other fanbois ignoring facts. Enjoy your emotional, lower-performing build ;)

Re:vs Larrabee (1)

Rockoon (1252108) | more than 4 years ago | (#32457372)

I can build *two* AMD 6-core systems (that includes CPU, motherboard, RAM, case, and power supply...) for the price of that Extreme Edition Intel CPU leading the performance charts. That's right, two complete systems for the price of that chip.

I'll enjoy my TWO systems, with cash to spare, that together trivially outperform your one system.

Re:vs Larrabee (3, Informative)

Cyberax (705495) | more than 4 years ago | (#32456368)

How so?

AMD's offering is real, and it uses a real, performant GPU, not a GMA joke. Larrabee is still vapourware, and it will be for a long time.

Re:vs Larrabee (1)

Calinous (985536) | more than 4 years ago | (#32456474)

Intel's on-processor graphics in the i5-661 (the fastest on-board graphics from Intel) is trading blows with AMD's last generation (and current generation, as AMD didn't improve the performance of its graphics core over its last generation).
If you're referring to an add-on card from AMD/ATI, then by all means the Intel IGP is crushed (just like AMD's IGP or Nvidia's IGP).

Re:vs Larrabee (1)

sznupi (719324) | more than 4 years ago | (#32456588)

But when you actually look at the state of drivers...

Re:vs Larrabee (1)

hattig (47930) | more than 4 years ago | (#32456648)

"Trading blows" with a two generation old integrated graphics core that is going to get replaced early next year with one around 5x - 10x faster.

This is like a wannabe thug trying to beat up an elderly gentleman.

And failing.

Re:vs Larrabee (1)

Calinous (985536) | more than 4 years ago | (#32456694)

AMD's current (and latest) IGP isn't better than the IGP they launched two years ago. As for promises, I've had plenty from NVidia; I'll wait until Fusion is in stores.

Re:vs Larrabee (2, Informative)

hattig (47930) | more than 4 years ago | (#32456630)

Intel's on-chip GPU is just that - a GPU, and a primitive one at that. It can't even do OpenCL. It's certainly not a competitor to anything that AMD will release. Never mind Intel's appalling graphics drivers (and consistent history of poor driver releases) and benchmark cheating (so that they look competitive in reviews).

Re:vs Larrabee (4, Insightful)

mcelrath (8027) | more than 4 years ago | (#32457190)

AMD and Intel need to have a contest in the shittiest-driver category. I have one of each. Each revision of xserver-xorg-video-intel bricks my laptop in a new and exciting way. And AMD's fglrx is a steaming pile of rendering errors, inconsistent performance, and crashes.

On the other hand, both Intel [intellinuxgraphics.org] and AMD [x.org] have released specs and participate in open source development. So in the long run, either one is a better choice than NVidia. So I'll continue to complain about them and submit bug reports. It's the open source way.

Re:vs Larrabee (1)

Lord Ender (156273) | more than 4 years ago | (#32457658)

It's not so interesting. I rarely wait on my CPU. It's my I/O and my GPU that hit the limits. When will NVIDIA make a GPU with a CPU core? That could be a real game-changer.

Enough with hyping eye candy (4, Insightful)

sco08y (615665) | more than 4 years ago | (#32455954)

“Hundreds of millions of us now create, interact with, and share intensely visual digital content,” said Rick Bergman, senior vice president and general manager, AMD Product Group. “This explosion in multimedia requires new applications and new ways to manage and manipulate data."

So people watch video and play video games, and it's still kinda pokey at times. We're way past diminishing marginal returns on improving graphical interfaces.

I bring it up, because if you're trying to promote a technology that actually uses a computer to compute, you know, work with actual data, you are perpetually sidetracked by trying to make it look pretty to get any attention.

Case in point: on a project to track trends in financial data, there were several contractors competing. One had software that tried to glom everything into a node-and-vector graph, which looked really pretty but didn't actually do anything to analyze the data.

But all the managers see is that those guys have pretty graphs in their demos, and all we had was our research into the actual data... all those boring details.

Re:Enough with hyping eye candy (5, Insightful)

Deliveranc3 (629997) | more than 4 years ago | (#32456112)

|"Hundreds of millions of us now create, interact with, and share intensely visual digital content," said Rick
|Bergman, senior vice president and general manager, AMD Product Group. "This explosion in multimedia requires
|new applications and new ways to manage and manipulate data."

So people watch video and play video games, and it's still kinda pokey at times. We're way past diminishing marginal returns on improving graphical interfaces.


Well sure YOU DO, but your Gran still has a 5200 with "Turbo memory" (actually that's only 3 years old, she probably has worse). This will be the equivalent of putting audio on the motherboard: a low baseline of quality, but done at no cost.

I bring it up, because if you're trying to promote a technology that actually uses a computer to compute, you know, work with actual data, you are perpetually sidetracked by trying to make it look pretty to get any attention.

Bloat is indeed a big problem; programs are exploding into GIGABYTE sizes, which is insane. OTOH, Linux's reuse of libraries seems not to have worked out. There is too little abstraction of the data, so each coder writes their own linked list, red-black tree, or whatever algorithm instead of just using the methods from the OS.

Case in point: on a project to track trends in financial data, there were several contractors competing. One had software that tried to glom everything into a node-and-vector graph, which looked really pretty but didn't actually do anything to analyze the data.

Sounds like a case of "not wanting to throw the baby out with the bathwater." If they have someone of moderate intelligence on staff, that person can find a way to pull useful information out of junk data. He/she will resist removing seemingly useless data, because they occasionally use it and routinely ignore it. A pretty presentation can also be very important in terms of usability; remember, you have to look at the underlying code, but the user has to look at the GUI, often for hours a day.

But all the managers see is that those guys have pretty graphs in their demos, and all we had was our research into the actual data... all those boring details.

I can't comment on the quality of your management, but once again, don't underestimate ease of use, or even perceived ease of use (consider how long you will keep trying to learn a new tool if frustrated; the perception that something is as easy as possible is a huge boon... think iCrap).
Anyway, back to Fusion: this is EXACTLY what Dell wants - a bit lower power, less heat, a significantly lower price, and a baseline for their users to be able to run Vista/7 (7 review: better than Vista, don't switch from XP). So while it's true that this chip won't be dominant under ANY metric, and would therefore seem to have no customer base, its attractiveness to retail is such that they will shove it down consumers' throats and AMD will reap the rewards.

I'm curious about these things in small form factors; now that SD/microSD cards have given us nano-sized storage, we can get back to finger-sized computers that attach to a TV.

SFF Fusion for me!

Re:Enough with hyping eye candy (5, Interesting)

Anonymous Coward | more than 4 years ago | (#32456444)

| This will be the equivalent of putting audio on the motherboard: a low baseline of quality, but done at no cost.

I don't think you are viewing this correctly. I wish they didn't call it a GPU because your thought on the matter is what people are going to think of first. Instead think of it as the fusion of a normal threaded CPU and a massively parallel processing unit. This thing is going to smoke current CPUs in things like physics operations without the need of anything like CUDA and without the performance limit of the PCIe bus. The biggest problem with discrete cards is pulling data off the cards, because the PCIe bus is only fast in one direction (data into the card). This thing is going to be clocked much higher than discrete cards, in addition to having direct access to the memory controller.

I don't think many have even scratched the surface of what a PPU (Parallel Processing Unit) can do or how it can improve the quality of just about any application ... I think this is going to be Hott.
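
As a rough illustration of what "massively parallel" buys you here, below is a toy physics step in plain C with OpenMP; the names and numbers are made up for the example and have nothing to do with any actual Fusion API. Every particle's update is independent of every other's, so the loop maps naturally onto hundreds of GPU-style lanes instead of a handful of CPU cores.

#include <stdio.h>
#include <stdlib.h>

#define N 1000000   /* one million particles */

/* One Euler integration step: each particle is updated independently,
   so the loop is embarrassingly parallel. On a GPU-class unit every
   iteration could become its own hardware thread; here OpenMP just
   spreads it over CPU cores. */
static void step(float *x, float *v, const float *a, float dt, int n)
{
    #pragma omp parallel for
    for (int i = 0; i < n; i++) {
        v[i] += a[i] * dt;
        x[i] += v[i] * dt;
    }
}

int main(void)
{
    float *x = malloc(N * sizeof *x);
    float *v = malloc(N * sizeof *v);
    float *a = malloc(N * sizeof *a);
    for (int i = 0; i < N; i++) { x[i] = 0.0f; v[i] = 0.0f; a[i] = -9.81f; }

    for (int t = 0; t < 100; t++)
        step(x, v, a, 0.001f, N);

    printf("x[0] = %f\n", x[0]);
    free(x); free(v); free(a);
    return 0;
}

(Compile with gcc -O2 -fopenmp; without -fopenmp the pragma is simply ignored and the loop runs serially.)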

Re:Enough with hyping eye candy (0)

Anonymous Coward | more than 4 years ago | (#32456462)

I wish they didn't call it a GPU because your thought on the matter is what people are going to think of first. Instead think of it as the fusion of a normal threaded CPU and a massively parallel processing unit

Just call it a 'PU'

Re:Enough with hyping eye candy (4, Interesting)

hedwards (940851) | more than 4 years ago | (#32456494)

It'll also be interesting to see how they manage to use this in tandem with a discrete card, as in preprocessing the data and assisting the discrete card to be more efficient.

What about memory? (2, Insightful)

ElusiveJoe (1716808) | more than 4 years ago | (#32456978)

This thing is going to smoke current CPUs in things like physics operations without the need of anything like CUDA and without the performance limit of the PCIe bus.

Ummm, but a video card has its own super-fast memory (and a lot of it), and it has direct access to system RAM, while this little thing will have to share memory access and caches with the CPU.

without the need of anything like CUDA

I dare say that this is totally false.

Re:What about memory? (1)

Courageous (228506) | more than 4 years ago | (#32457560)

I agree with you. Something like CUDA will be required.

C//

Re:Enough with hyping eye candy (0)

Anonymous Coward | more than 4 years ago | (#32457288)

Eh, PCIe has more read bandwidth than AGP in both directions combined.

Re:Enough with hyping eye candy (2, Informative)

TheThiefMaster (992038) | more than 4 years ago | (#32456476)

Well sure YOU DO, but your Gran still has a 5200 with "Turbo memory" (actually that's only 3 years old, she probably has worse).

What year are you living in?
1: TurboCache didn't exist until the 6100.
2: The 5200 is seven years old.
3: You can apparently still buy them: eBuyer Link [ebuyer.com]

Re:Enough with hyping eye candy (2, Funny)

FreonTrip (694097) | more than 4 years ago | (#32456996)

If you think that's bad, you can still buy Radeon 7000 cards in lots of places, and they're fully a decade old. Look around a bit more and you can find Rage 128 Pro and - yes, your nightmares have come back to haunt you - 8 MB Rage Pro cards at your local CompUSA store. For the right market, old display technologies can still easily be good enough - mach64 support in X.org is good 'n' mature at this point, and if you're running a command-line server with framebuffer support it's more than adequate.

Re:Enough with hyping eye candy (1)

sznupi (719324) | more than 4 years ago | (#32456690)

They specifically talked about "creating, interacting, sharing ... managing, manipulating" - and you just dismissed that part and criticised "consuming"?

There is one quite popular usage scenario that is nowhere near diminishing marginal returns - video editing. The architecture of Fusion seems perfect for that. It will also help in image editing; even if it's not so desperately needed there, it will come in handy with what's enabling the video boom - reasonably cheap digicams that already shoot fabulous 720p.

Yeah, sure, go ahead and call it "crap"... but that sea of crap will give us many great videographers. Especially if a large portion of them can finally afford a quite sensible camera and editing rig (remember, the world encompasses more than just developed countries).

Re:Enough with hyping eye candy (1)

sco08y (615665) | more than 4 years ago | (#32456830)

Yeah, sure, go ahead and call it "crap"... but that sea of crap will give us many great videographers. Especially if a large portion of them can finally afford a quite sensible camera and editing rig (remember, the world encompasses more than just developed countries).

I'm not saying it's crap. Other comments pointed out how this is far more than simply glomming a GPU onto a CPU, and I don't doubt that. I'm complaining about the eye-candy-oriented hype, and I'm stupefied as to how, even in third-world countries, there's supposedly this desperate shortage of video. Do you really think that their problems would be solved if only they could set up their own cable news networks?

Re:Enough with hyping eye candy (1)

sznupi (719324) | more than 4 years ago | (#32456954)

...

If not "developed" then it's suddenly "third world"? For that matter, why does first world (numbering designation is a bit obsolete btw) accept the existance of indy videographers? Aren't they useless?

Re:Enough with hyping eye candy (1)

FreonTrip (694097) | more than 4 years ago | (#32457012)

Actually, if they could use video to democratize the availability of information, you could really be on to something...

How well does it handle virtualization? (1)

MikeFM (12491) | more than 4 years ago | (#32455960)

I wonder how well it would work in a virtualization environment such as VMware, Xen, KVM, etc. I could really see a point to a server that could easily offload GPU work from thin clients that are running virtual desktops, without needing to manage a huge box full of GPU cards.

Re:How well does it handle virtualization? (2, Interesting)

Anonymous Coward | more than 4 years ago | (#32456132)

It doesn't bring anything to the table yet. Firstly, IOMMUs need to be more prevalent in hardware; secondly, there needs to be support for using them in your favourite flavour of virtualisation (Xen will be there first).

That said, we'll get ugly vendor-dependent software wrapping of GPU resources - under the guise of better sharing of GPUs between VMs, but really so you're locked in.

This has nothing to do with virtualisation. (1)

leuk_he (194174) | more than 4 years ago | (#32456204)

No, you cannot offload CPU work to the GPU with a virtualisation solution. Even if it were possible, the network bottleneck would outweigh most of the advantages gained.

And second, the integrated GPU has the same kind of processing power as current integrated (on the motherboard) solutions. You can offload a little bit, but since there are power limits, you cannot expect very high gains. The gains that exist will be used for power-efficient laptops/notebooks or cheap desktops.

If you really have large amounts of processing work that fits a GPU well, you are better off investing in a GPU card with a high power envelope. There are not many applications for this (relative to the number of PC boxes), so this will be a niche market (but even a small % of all PC sales is a big market...).

That all said, distributed computing projects like BOINC [berkeley.edu] will only benefit from more OpenCL-capable GPUs in the field.

Re:This has nothing to do with virtualisation. (4, Insightful)

MikeFM (12491) | more than 4 years ago | (#32456428)

Sure you can offload GPU work, so long as the entire process is handled by the server and just the result is streamed to the client. I've seen this done over the Internet as well as on a LAN. It'd be different if it were trying to use the client CPU and memory to drive the GPU.

They were specifically pointing out the benefit of having the GPU and CPU on the same chip, which is quite a bit different from a mobo-integrated solution. It probably isn't as powerful as a Xeon quad-core processor and a $500 video card, but the question is how well it is set up to handle many different GPU tasks. I'd at least assume it's quite a bit faster for these types of tasks than a standard CPU, and I wonder how well they can scale the technology for a better CPU and GPU.

I'm not sure I agree it's a niche market. I'd say it's more of a market poised to explode when the right products make it attainable. For virtualization it's more important that it can handle several unrelated tasks at a reasonable speed than that it can handle a single task at a high speed. If each CPU core also had a paired GPU, it'd open up possibilities. Bulk, power consumption, and heat are often as big of issues for server farms as for laptops, which is another reason why an integrated GPU might be of interest.

Grid computing goes hand in hand with virtualization, again coming down to how well these can work in parallel. Being able to fit a number of CPU and GPU cores on a single physical chip could be very beneficial, I think.

How I'm supposed to program apps for these? (1)

kumma (1077987) | more than 4 years ago | (#32455968)

That's hot! Now I'll need only one big cooler instead of many small ones. Is it possible to use these GPUs easily in my own apps? What kinds of compilers do we have for these? Will this start an era of functional programming?

Re:How I'm supposed to program apps for these? (0)

Anonymous Coward | more than 4 years ago | (#32456382)

Business as usual, since it's just a CPU core with a couple of GPU execution units on die. Your system treats it as a GPU.

Eventually, here's betting on it leading to GPU extensions in the same way we use SSE. Even then, vectorising instructions doesn't lead to functional programming.
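
As a rough sketch of what "GPU extensions in the same way we use SSE" might feel like to a programmer, here is today's ordinary SSE in C (real, existing intrinsics; the GPU-flavoured equivalent is pure speculation and would presumably just be the same idea over much wider vectors):

#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics */

int main(void)
{
    float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
    float c[4];

    /* Load four floats into 128-bit registers, add all four lanes
       with a single instruction, and store the result back. */
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    __m128 vc = _mm_add_ps(va, vb);
    _mm_storeu_ps(c, vc);

    printf("%f %f %f %f\n", c[0], c[1], c[2], c[3]);
    return 0;
}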

heat (-1, Troll)

timmarhy (659436) | more than 4 years ago | (#32455974)

Heat is going to destroy this thing. In theory I can see this making faster gaming systems, but in practice I'm betting it's terrible...

Re:heat (5, Funny)

Anonymous Coward | more than 4 years ago | (#32456064)

Perhaps you should email your insights to the CEO of AMD. I'm sure he'll be grateful for the heads-up from some retarded cunt on Slashdot that his huge array of engineers and scientists have been building a chip that doesn't work for the past 5 years.

Re:heat (0)

Anonymous Coward | more than 4 years ago | (#32456416)

Jesus, why is _this_ guy, who is right on the money, marked -1: Troll? The troll is the guy he is responding to :\

Re:heat (1, Insightful)

Anonymous Coward | more than 4 years ago | (#32456774)

Because I said some swear words. I tried to resist, but it's just too satisfying.

Re:heat (1)

lysdexia (897) | more than 4 years ago | (#32457464)

I must admit, I was disappointed that you didn't go for the more gerund-y "cunting retard".

Re:heat (1)

somersault (912633) | more than 4 years ago | (#32456188)

This is not going to make "faster gaming systems". It's just another, cheaper and lower power form of onboard graphics. Gamers are definitely still going to be using discrete graphics for now.

Re:heat (1)

StoneOldman79 (1497187) | more than 4 years ago | (#32456232)

Heat will only be a problem if they aim to replace the video cards in (hardcore) gaming systems...
Probably not.
I guess they are aiming for the largest market:
cheap but "good enough" graphics at the lowest possible price point.
Maybe it will also be used for physics acceleration and similar compute-heavy tasks, but only time will tell.

Re:heat (3, Interesting)

Rockoon (1252108) | more than 4 years ago | (#32456390)

Indeed.

This GPU-on-the-CPU is targeting the mobile/lightweight market.

Think about how the other solutions work. That GPU chip sits next to the CPU chip, and they both must be connected to the system bus in order to access RAM. With AMD's solution here, you remove that GPU chip and therefore also remove the external bus connection it required. This is a very big win for manufacturers, who would even pay a premium for the chip because of the lower production costs. But knowing AMD, they won't be charging a premium for it. Instead they will try to push Atoms out of the market.

Re:heat (4, Insightful)

sznupi (719324) | more than 4 years ago | (#32456406)

AMD chipsets with integrated GFX were quite good at power consumption already, using a dozen or so watts. Considering AMD puts out quad-cores with sub-100W TDPs, Fusion shouldn't be that big a problem.

Re:heat (1)

Calinous (985536) | more than 4 years ago | (#32456502)

Yes, heat is going to destroy this thing if they want to make it a Fermi plus an overclocked Extreme Edition processor.
However, if the graphics performance is limited by available (main) memory bandwidth (from which the main processor also takes a chunk), they don't need more than a quarter of a 5850, and starting with a 65W TDP processor they're within a 125W TDP (where they have released plenty of processors).
If they drop graphics performance even further, they could get a desktop processor and a normal desktop graphics card under a 95W TDP (and even mATX boards support/supported 95W AMD processors).

Re:heat (2, Informative)

hattig (47930) | more than 4 years ago | (#32456658)

This demo was of Ontario - AMD's low-power solution for netbooks and low-end notebooks. This will be using the low-power Bobcat cores and probably something similar to an HD 5450 graphics-wise.

I seriously doubt heat is going to be an issue.

Moving electrons (3, Informative)

jibjibjib (889679) | more than 4 years ago | (#32455992)

"Moving electrons between two chips" isn't entirely accurate. What moves is a wave of electric potential; the electrons themselves don't actually move very far.

Re:Moving electrons (1)

drinkypoo (153816) | more than 4 years ago | (#32456592)

Last I checked nobody was quite sure if all electrons moved or some electrons moved. Has this actually been ironed out? Do any of these chips actually switch fast enough that your statement is correct regardless?

Re:Moving electrons (1)

stewbee (1019450) | more than 4 years ago | (#32457078)

Electrons will drift (look up 'electron mobility' on Wikipedia), but the GP is right that it is the wave motion of the potentials that primarily carries the information. At the same time, he is just being a bit nitpicky. I think what the person in the article is trying to say is that to go from one chip to another, you usually need to provide a buffer (i.e. an amplifier) on the output interface. This gives you better noise margins on the receiving chip. This is your classical communication-theory dilemma: if I sent a one, what is the probability that a one was received? Adding a buffer effectively increases the SNR, so that the probability of receiving the voltage level that was sent is pretty near 1 (values of 0.99999 are minimally acceptable). Adding buffers increases the power consumption of the chip.
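
To put a number on that "probability a one was received" point, a minimal sketch using the standard binary-signalling result (illustrative only): with two voltage levels separated by V, Gaussian noise of standard deviation sigma, and a mid-point decision threshold, the per-bit error probability is

P(error) = Q( V / (2 * sigma) )

where Q is the Gaussian tail function. Doubling the voltage swing (or halving the noise) via a beefier output driver pushes that tail probability down extremely fast, which is exactly why chip-to-chip links spend extra power on their output buffers - and why keeping the link on-die lets you skip most of that cost.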

Re:Moving electrons (1)

drinkypoo (153816) | more than 4 years ago | (#32457794)

Adding buffers will increase the power of the chip.

Just to be clear, you mean power consumption, right? And to be still more clear, adding buffers increases latency because you have to wait for more transistors to switch? And of course more power means more opportunity for noise in other systems...

Re:Moving electrons (0)

Anonymous Coward | more than 4 years ago | (#32457622)

>> Has this actually been ironed out?

Negative, but I'm positive you'll get a charge out of the potential results.

Yeah! (5, Interesting)

olau (314197) | more than 4 years ago | (#32456056)

I'm hoping moving things into the CPU will make it easier to take advantage of the huge parallel architecture of modern GPUs.

For what, you ask?

I'm personally interested in sound synthesis. I play the piano, and while you can get huge sample libraries (> 10 GB), they're not realistic enough when it comes to the dynamics.

Instead, people have been researching physical models of the piano. So you simulate a piano in software, or at least its main components, and extract the sound from that. Nowadays there are even commercial offerings, like Pianoteq (www.pianoteq.com) and Roland's V-Piano. The problem is that while this improves the dynamics dramatically, the models aren't accurate enough yet to produce a fully convincing tone.

I think that's partly because nobody understands how to model the piano fully yet, at least judging from the research literature I've read, but also very much because even a modern CPU simply can't deliver enough FLOPS.
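
To make the FLOPS point concrete, here is a toy finite-difference string model in C - a drastically simplified sketch, nowhere near what Pianoteq or the V-Piano actually do. Even this one-dimensional string costs a handful of floating-point operations per grid point per audio sample; a believable piano multiplies that by dozens of strings, hammer and soundboard coupling, and sympathetic resonance, all at 44.1 kHz or better, which is where a massively parallel unit starts to look attractive.

#include <stdio.h>

#define NPOINTS 200          /* spatial grid points along the string */
#define NSAMPLES 44100       /* one second of audio at 44.1 kHz */

int main(void)
{
    static double y_prev[NPOINTS], y_cur[NPOINTS], y_next[NPOINTS];
    const double c2 = 0.45;        /* (c*dt/dx)^2, kept below 1 for stability */
    const double damping = 0.9999; /* crude per-sample energy loss */

    /* "Pluck": a triangular initial displacement near one end. */
    for (int i = 0; i < NPOINTS; i++) {
        double pluck = (i < 20) ? i / 20.0 : (NPOINTS - i) / (double)(NPOINTS - 20);
        y_cur[i] = y_prev[i] = 0.01 * pluck;
    }

    for (int t = 0; t < NSAMPLES; t++) {
        /* Explicit finite-difference update of the 1D wave equation:
           a few multiply-adds per point, per audio sample. */
        for (int i = 1; i < NPOINTS - 1; i++)
            y_next[i] = damping * (2.0 * y_cur[i] - y_prev[i]
                        + c2 * (y_cur[i+1] - 2.0 * y_cur[i] + y_cur[i-1]));
        y_next[0] = y_next[NPOINTS-1] = 0.0;   /* fixed string ends */

        for (int i = 0; i < NPOINTS; i++) { y_prev[i] = y_cur[i]; y_cur[i] = y_next[i]; }

        if (t % 4410 == 0)
            printf("sample %5d: pickup output %+.6f\n", t, y_cur[NPOINTS-2]);
    }
    return 0;
}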

Re:Yeah! (1)

kumma (1077987) | more than 4 years ago | (#32456194)

Just about all the AI techniques I know of involve dot products over long vectors, which could be accelerated on GPUs.
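
For reference, the kernel in question really is that simple - a plain C sketch, with an OpenMP reduction standing in for what a GPU would spread across its stream processors (the split into work-groups and the final combine are the GPU runtime's job, not shown here):

#include <stdio.h>
#include <stdlib.h>

/* Dot product of two long vectors: one multiply-add per element and
   no dependency between iterations, so it parallelises trivially. */
static double dot(const float *a, const float *b, long n)
{
    double sum = 0.0;
    #pragma omp parallel for reduction(+:sum)
    for (long i = 0; i < n; i++)
        sum += (double)a[i] * b[i];
    return sum;
}

int main(void)
{
    long n = 10 * 1000 * 1000;
    float *a = malloc(n * sizeof *a);
    float *b = malloc(n * sizeof *b);
    for (long i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    printf("dot = %.0f\n", dot(a, b, n));   /* expect 20000000 */
    free(a);
    free(b);
    return 0;
}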

just like my Core i3, then (1, Insightful)

FuckingNickName (1362625) | more than 4 years ago | (#32456076)

Just like my Core i3 sitting about 20 inches to the left, then. Yes, I know they're incorporating a better GPU, but they're touting too much as new.

Re:just like my Core i3, then (4, Informative)

odie_q (130040) | more than 4 years ago | (#32456140)

The technical difference is that while your Core i3 has its GPU as a separate die in the same packaging, AMD Fusion has the GPU(s) on the same die as the CPU(s). The Intel approach makes for shorter and faster interconnects, the AMD approach completely removes the interconnects. The main advantage is probably (as is alluded to in the summary) related to power consumption.

Re:just like my Core i3, then (4, Interesting)

ceeam (39911) | more than 4 years ago | (#32456224)

I also heard that they *share* FP math units between CPU and GPU.

Re:just like my Core i3, then (1)

FuckingNickName (1362625) | more than 4 years ago | (#32456238)

You're entirely correct, sorry! Inferring wrongly from a high-level flow diagram, I thought the second die was used for PCIe/memory, with graphics on the CPU die.

Re:just like my Core i3, then (1)

triplepoint217 (876727) | more than 4 years ago | (#32456926)

I wonder if a future step will be to mix the GPU-ish vector-op logic in with the logic of the regular CPU. It could further shorten interconnects (though at the cost of lengthening some others), and I would think it might have heat dissipation advantages: if you are doing a graphics-heavy task, the heat is spread over the whole chip instead of all being generated in the one GPU sector. I am sure there is an "oh god, the complexity" factor here as far as actually doing that mixing, and it might make selectively shutting down the GPU when not in use harder. Anyone who has actual chip design experience want to comment?

Re:just like my Core i3, then (3, Interesting)

sznupi (719324) | more than 4 years ago | (#32456430)

Well, "incorporating a better GPU" makes quite a bit of difference, considering i3/i5 solution isn't much of an improvement almost anywhere (speed - not really, cost - yeah, I can see Intel willingly passing the savings...anyway, cpu + mobo combo hasn't got cheaper at all, power consumption is one but mostly due to how Intel chipsets were not great at this); and seemed to be almost a fast "first" solution, announced quite a bit after the Fusion.

Whoa, graphics on the CPU? (3, Insightful)

Rogerborg (306625) | more than 4 years ago | (#32456168)

Let's party like it's 1995! Again! [wikipedia.org]

Slightly less cynically, isn't this (in like-for-like terms) trading a general purpose CPU core for a specialised GPU one? It's not like we'll get more bang for our buck, we'll just get more floating point bangs, and fewer integer ones.

Re:Whoa, graphics on the CPU? (1)

drinkypoo (153816) | more than 4 years ago | (#32456534)

Slightly less cynically, isn't this (in like-for-like terms) trading a general purpose CPU core for a specialised GPU one? It's not like we'll get more bang for our buck, we'll just get more floating point bangs, and fewer integer ones.

Until it takes less than four Intel CPU cores (this is a pseudorandom number; it's what I recall from some Intel demo) to do the job of a halfway decent GPU, this approach will be rational for any users who care about 3D graphics. Intel would like us to have "thousands" of CPU cores (I assume that means dozens in the near term) and to ditch our GPUs, and the change cannot come fast enough for me... but it's not here yet.

It's that time again... (1)

bradley13 (1118935) | more than 4 years ago | (#32456818)

Very few people need more than a dual-core - the cores just sit there twiddling their bits. Sacrifice a core or two for a good GPU, and you have massively simplified the design of the system, saved power, and saved space.

Sure, it's not a new idea - in IT we seem to progress in spirals. It's time this idea came around again...

holistic is (1)

jimmydevice (699057) | more than 4 years ago | (#32456228)

The new paradigm. Snort...

Re:holistic is (1)

Aeros (668253) | more than 4 years ago | (#32457338)

I have been looking for a 'holistic approach to my power management'.

Open Source drivers? (4, Interesting)

erroneus (253617) | more than 4 years ago | (#32456386)

Will the drivers for the graphics be open source or will we be crawling out of this proprietary driver hole we have been trying to climb out of for over a decade?

Re:Open Source drivers? (-1, Flamebait)

Anonymous Coward | more than 4 years ago | (#32456404)

Who gives a fuck? Linux is for losers. If you want to keep being a loser then you pay the price. Keep sucking on that open source teat and get used to getting second-rate software.

Re:Open Source drivers? (-1, Troll)

Anonymous Coward | more than 4 years ago | (#32456516)

eat poop, oss.

Re:Open Source drivers? (1)

Calinous (985536) | more than 4 years ago | (#32456520)

The more things change, the more they stay the same.
Don't expect high-quality, high-performance open source drivers, as people high up in the company will think the information revealed in those drivers would help the competition.

Re:Open Source drivers? (0)

Anonymous Coward | more than 4 years ago | (#32456616)

I sense trouble. The next version of GNOME will not run on a laptop that doesn't have working 3D drivers.

Re:Open Source drivers? (4, Informative)

Skowronek (795408) | more than 4 years ago | (#32456944)

The documentation needed to write 3D graphics drivers has been consistently released by ATI/AMD since the R5xx. In fact, yesterday I was setting up a new system with an RV730 graphics card, which was both correctly detected and correctly used by the open source drivers. Ever since AMD started supporting the open source DRI project with money, specifications, and access to hardware developers, things have improved vastly. I know some of the developers personally; they are smart, and I believe that given this support they will produce an excellent driver.

It's sad to see that with Poulsbo Intel did quite an about-face, and stopped supporting open source drivers altogether. The less said about nVidia the better.

In conclusion, seeing who is making this Fusion chip, I would have high hopes for open source on it.

APU (2, Funny)

Capt James McCarthy (860294) | more than 4 years ago | (#32456568)

Looks like the Kwik-E-Mart has a lawsuit on their hands.

Meh. (2, Insightful)

argStyopa (232550) | more than 4 years ago | (#32456822)

Sounds like a non-advancement to me.

"Look, we can build a VCR *into* the TV, so they're in one unit!"

Yeah, so when either breaks, neither is usable.
Putting more points of failure into a device just doesn't sound like a great idea.

In the last 4 computers I've built/had, they've gone through at least 6-7 graphics cards and 5 processors. I can't remember a single one where they both failed simultaneously.

Now, if this tech will somehow reduce the likelihood of CPU/GPU failures (which, IMO, are generally due to heat or, less frequently, power issues), then great. But I have a gut reaction against taking two really hot, power-intensive components and jamming them into even closer proximity.

Finally, I'm probably in the minority, but I prefer being able to take my components à la carte. There were many times in the past 25 years that I couldn't afford the best of all components TODAY, so I built a system with a very high-end mobo and CPU but used my old sound board, RAM, etc. until I could afford to replace those components individually with peer-quality stuff.

Re:Meh. (3, Insightful)

mcelrath (8027) | more than 4 years ago | (#32457100)

Sounds like you need a new power supply, or a surge suppressor, or a power conditioner, or an air conditioner.

You shouldn't see that many failures. Are you overclocking like mad? Silicon should last essentially forever compared to other components in the system, as long as you keep it properly cooled and don't spike the voltage. Removing mechanical connectors by putting things on one die should mean fewer failure modes. A fanless system on a chip using a RAM disk should last essentially forever.

A single chip with N transistors does not have N failure modes. It's essentially tested and will not develop a failure by the time you receive it. A system with N mechanically connected components has a failure rate of N*(probability of failure of one component), and it's always the connectors or the cheap components like power supplies that fail.
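
(For small per-component failure probabilities that is just the usual series-system approximation: P(system fails) = 1 - (1 - p)^N, which is roughly N * p when p is small - so every extra connector or cheap part adds about its own p to the total, while merging two dies into one removes a whole set of connectors from the sum.)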

Re:Meh. (1)

ElectricTurtle (1171201) | more than 4 years ago | (#32457540)

I have built as many systems and have had no GPU failures and only one CPU failure (and that was because it was a first-generation Socket 462, and the brick-and-mortar shop I was buying the parts from didn't have any HSFs for 462... he said 'oh, just use this really beefy Socket 7 HSF, that will be enough!' Pff, well, it wasn't. At least the bastard replaced the chip when it died, at which point he did have 462 HSFs).

I'm willing to bet that the GP buys cheap crap like ASRock and generic PSUs that couldn't perform at a fraction of their claimed specs and then wonders why his shit dies all the time. Meanwhile, I have basically every CPU, GPU, and motherboard I've built over more than a decade, all in working order.

MSI + Enermax + AMD + Thermalright = FTW4ever.

Re:Meh. (1)

argStyopa (232550) | more than 4 years ago | (#32457724)

Then you'd be wrong.

Firstly, I've been building PCs since perhaps 1984. (I wouldn't include the early computers that I built from kits in '80-'81.) So we're talking about a long, long span of time.

I learned early on that you get what you pay for - shit components = shit performance.
Thus PRECISELY the point I was making: when you're sinking a lot into individual components because you're not buying cheap crap, it's useful to be able to purchase incrementally.

Now, I'll answer all the other commenters: first, recognize that this is over 25 years. I've built dozens of systems for other people, and their systems run far more reliably than mine BECAUSE I personally live in an extremely challenging computing environment: a 110-year-old farmhouse, in a rural community, without A/C, in MN - even with a window unit, the ambient temps in summer in my computer room can reach a humid 90+ F (32 C). Even with a good UPS/line scrubber, we still seem to get power spikes, brownouts, drops, and very 'dirty' power, to the point that I've even considered lobbying to be the first local tester for an in-home fuel cell system.

But all this is peripheral to my main point: to me, putting multiple functions on a single die seems to simply add points of failure. Only one comment even responded to THAT.

Re:Meh. (1)

brxndxn (461473) | more than 4 years ago | (#32457188)

Wow, obviously you are a consumer. I cannot imagine anyone worth their pay in the business world replacing individual components in a computer. Usually it is just tossed... or handed back to the OEM to get fixed.

Second, when is the last time you had a processor fail?

Re:Meh. (1)

MikeFM (12491) | more than 4 years ago | (#32457242)

With that kind of failure rate you're probably seeing the reason not to be cheap and not to try to keep reusing old parts. In my experience the technology moves fast enough that every three years I want to replace my systems anyway, and the only thing worth saving is the hard disk, which can be dumped as yet another drive into your backup unit's RAID. I kept trying to save the last really expensive graphics card I purchased for my new systems, until I realized that the new $25 cards were more powerful - not worth the hassle.

Except for servers and power gamers, building/upgrading is probably not worth it. I just grab another MacBook and iMac every couple of years and call it good. My last Dell lasted less than a year under light wear and tear, so I hope the new one does better; if not, I will probably just stop keeping any PCs around.

Re:Meh. (1)

Quantumstate (1295210) | more than 4 years ago | (#32457784)

One year is definitely unusual for the life of a PC. My parents still have a nice Dell from 2001 (with the RAM tripled from the original and an extra hard drive) which is working fine. The floppy drive on it died, but nothing else has had a fault. But ignoring the anecdotes, the data shows that the average life is far more than one year. Here in the EU it is not legal to offer less than a two-year warranty, so they need a decent survival rate just to stay in business.

Re:Meh. (0)

Anonymous Coward | more than 4 years ago | (#32457376)

In the last 4 computers I've built/had, they've gone through at least 6-7 graphics cards and 5 processors. I can't remember a single one where they both failed simultaneously.

Holy hell, you're doing it wrong.

In the ~20 years I have been building computers I have never had a CPU fail, and I have had exactly two graphics cards fail. The graphics cards were an old ISA Cirrus Logic card from the early '90s (failed within a few days of buying it; the replacement is still working) and an AGP GeForce 440 that just failed recently after nearly 10 years of use.

Re:Meh. (1)

cpicon92 (1157705) | more than 4 years ago | (#32457584)

I don't think this chip is aimed at you as a home desktop PC builder. It seems more like it's aimed at the netbook or very compact/expensive laptop field. Think about it: what takes up more space, a TV with a VCR on top, or a TV with a VCR built in?

Re:Meh. (1)

MartinSchou (1360093) | more than 4 years ago | (#32457710)

You may have missed the memo - Intel is the largest supplier of graphics units for PCs and Macs. And no - none of their graphics units are discrete. They're all mounted on the motherboard. Just like the audio controller. And the USB controller. And the SATA controller. And the NIC.

And for some strange reason, there's still a market for discrete controllers.

What this is doing isn't taking away your choices. It's giving you more choices. Though, to be realistic, it's probably aimed more at OEMs and businesses who have no use for discrete units.

I thought that they learned (1)

Rallias Ubernerd (1760460) | more than 4 years ago | (#32457164)

I thought they had learned that it is more efficient to have a separate CPU and GPU.

Re:I thought that they learned (1)

rrhal (88665) | more than 4 years ago | (#32457526)

The GPU is still separate from the CPU - it's just located on the same chip. The idea is that you save power by having the interconnect at the chip level rather than a high-speed backplane on the circuit board. This also reduces board complexity. This is a win for small ultraportable devices more than for desktop computers. I could also see a use for micro-ATX home media boxen.

Future Multi CPU + GPU Combos (1)

inhuman_4 (1294516) | more than 4 years ago | (#32457280)

Packing the GPU into the CPU makes a lot of sense but also raises some questions.

Does this mean that in the future we can have chips that contain not only a multi-core CPU but also a multi-core GPU? For example, could AMD pump out a frag-tastic 6-CPU + 4-GPU chip for hardcore gamers and scientists?

How is this going to affect cooling for the chip? If I fire up Crysis, will my computer melt? (Assuming a GPU is packed in with enough power to play Crysis.)

Also, how is this going to affect memory bandwidth? Most graphics cards come with some pretty high-throughput memory to make everything work. Once everyone is on 64-bit, having the extra RAM for the GPU is not a problem, but what about the bandwidth?

With all of this multicore processor stuff, I get the feeling that we are going to hit another memory bandwidth limit very soon.
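
A rough back-of-the-envelope comparison (illustrative figures, not anything AMD has announced) shows why this worry is fair:

dual-channel DDR3-1333 system RAM:        ~21 GB/s
2010 high-end discrete card GDDR5:        ~100-150 GB/s
scanning out a 1920x1080, 32-bit frame at 60 Hz:
    1920 * 1080 * 4 bytes * 60/s  ≈ 0.5 GB/s (before any texturing or shading)

So an on-die GPU sharing the CPU's memory controller starts with a fraction of the bandwidth a discrete card enjoys, and every byte it reads is a byte the CPU cores don't get - which is exactly why integrated parts target the low and mid range rather than the enthusiast market.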
