The Outlook On AMD's Fusion Plans
PreacherTom writes "Now that AMD's acquisition of ATI is complete, what do the cards hold for the parent company? According to most experts, the outlook for AMD is promising. One of the brightest stars in AMD's future could be the Fusion program, which will 'fuse' AMD's CPUs with ATI's GPUs (graphics processing units) in a single, unified processor. The product is expected to debut in late 2007 or early 2008. Fusion brings hopes of energy efficiency, with the CPU and GPU residing on a single chip. Fusion chips could also ease the impact on users who plan to run Windows Vista with Aero, an advanced interface that will only run on computers that can handle a heavy graphics load. Lastly, the tight architecture provided by Fusion could lead to a new set of small, compelling devices that can handle rich media."
Stock tip ... (Score:5, Funny)
Re: (Score:2)
Re:Stock tip ... (Score:4, Insightful)
Frankly, I'm betting this is going to turn out more like the next generation of integrated video. Basically, the only "fusion" chips you'll see will be ones designed for small/cheap boxes that people never upgrade the components on. I'm betting the graphics in general will be slow and the processor will be average. Super fast processors and fast graphics won't get the fusion treatment because the people who buy them tend to want to keep them separate (for upgrading later), not to mention the difficulty you'd have powering and cooling a chip that complex.
Re: (Score:3, Interesting)
"Righteous" = "big"?
Intel was making 130W CPUs until AMD got better performance with 60W (although Intel has now overtaken AMD on this). I've got a 40W GPU which is as powerful as a 100W GPU of a couple of years ago.
A state-of-the-art CPU plus a mid-to-high-range GPU today could come in at around 130W. The 130W CPU heat-sink problem is solved (for noisy values of "solved").
Also, it is much easier to deal with a big heatsi
Re: (Score:2)
I think people are confused about the nature of Fusion--it is intended for general computer users, not the high-end geeks who want to load up on the latest in everything inside the computer.
It depends on the integration (Score:2)
It could be, as you assume, a cheap graphics card, something similar to the Intel chipsets that integrate some low-end 3D graphics.
It could also be a 3D geometry unit: a specialised vector engine that could be used for geometry, physics, or general-purpose GPU programming. That would be more like AMD's answer to the Cell processor.
In which case a Fusion chip would still be used together with specialised GFX cards. But vector calculat
Re: (Score:1)
Heat sinks don't cover the whole motherboard, just the hottest parts, so putting two devices that each normally need heat sinks into one small area implies to me that the OP is right. There will have to be some kick-arse cooling to stop it from melting.
Especially when you take into account that by the time they get anything out to market, the CPU that wil
Airport fun (Score:2, Funny)
"But, but its an AMD processor, built in Germany or Russia or somewhere"
"Teh internet told me it was more powerful than anything else out there."
"It would literally blow me away!"
Re: (Score:1)
I hate ATI, and when AMD bought them, they absorbed ATI's taint.
This solution is really only good for servers anyway, where people don't care about video.
And, since server solutions can get away with uber-low-end graphics chips, why bother?
Enthusiasts will be wanting to upgrade separately... defeating the purpose.
Re: (Score:3, Interesting)
Wow. And here I was thinking there was this vast market for things called "workstations" where businesses didn't need high-end video cards and home systems where users didn't require the best 3D accelerators on the market. Shows what I know.
Even most enthusiasts only replace their video cards every 12-18 months. If a CPU/GPU combo was in the same price range as a current video card (not farfetched) then there'd be no reason not to use a combo chip.
B
Re: (Score:2)
Re: (Score:2)
Most desktop machines in the workplace do not need high-end video cards. The onboard one works fine.
"Even most enthusiasts only replace their video cards every 12-18 months. If a CPU/GPU combo was in the same price range as a current video card (not farfetched) then there'd be no reason not to use a combo chip."
So if you bought a machine 6 months ago, and now you have to upgrade the entire thing to use an application, that's fine with you?
Feel free to waste those hundreds of dollars.
En
Re:Airport fun (Score:4, Insightful)
Right now I'm buying a $200 vidcard every 18-24 months. I'm looking at probably getting my next one in the middle of next year, around the same time I replace my motherboard, CPU, and RAM. My current system is struggling with the Supreme Commander beta, and upcoming games like Crysis should be equally taxing on it. In the past six years, I've bought three CPU upgrades. If AMD could market a $300 chip that gave me a CPU and GPU upgrade with similar performance and stayed on the same socket for 3-4 years, I'd be breaking even.
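To put rough numbers on that break-even claim, here's a quick back-of-the-envelope in C (the $150 CPU-upgrade price is an assumption; the other figures come from the comment above):

#include <stdio.h>

int main(void) {
    /* Figures from the comment above; the CPU upgrade price is assumed. */
    double vidcard_price = 200.0, vidcard_years = 1.75;    /* every 18-24 months */
    double cpu_upgrades  = 3.0,   cpu_price = 150.0, span_years = 6.0;
    double fusion_price  = 300.0, fusion_years = 3.5;      /* same socket 3-4 years */

    double separate_per_year = vidcard_price / vidcard_years
                             + (cpu_upgrades * cpu_price) / span_years;
    double fusion_per_year   = fusion_price / fusion_years;

    printf("Separate CPU + GPU upgrades: ~$%.0f/year\n", separate_per_year);  /* ~189 */
    printf("Hypothetical $300 Fusion chip: ~$%.0f/year\n", fusion_per_year);  /* ~86  */
    return 0;
}

Under those assumptions the combo chip actually comes out well ahead of break-even.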
Re: (Score:3, Interesting)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Call centers, cubicle farms for accounting and the like, home users who don't do any gaming, even dual monitor users--a Radeon 7000 can use dual monitors quite well for productivity apps--far outnumber stock traders, programmers, and graphic designers.
As for gamers: anyone chewing through 2-5 video cards a year is reckless and spends way too much money on hardware. ATI and nVidia are both on a six-month cycle, so there is ab
Servers use video cards? (Score:3, Insightful)
Come to think of it, the way we have things set up, the console is inaccessible from the lab - but accessible via terminal concentrators - over the LAN.
Re: (Score:2)
Re: (Score:1)
A better way to spend the money would be on PR... (Score:3, Insightful)
Re:A better way to spend the money would be on PR (Score:2, Interesting)
Re: (Score:2)
I just finished explaining to a friend why the software he just purchased for his business will run fine on the laptop I suggested from a Fry's ad. The specs for the software list "Intel Processor," and he assumed the AMD chip in my recommendation wouldn't work, because he has no idea what a processor even does. I would even hazard a guess that whoever wrote those specs doesn't know either; this little software vendor isn't getting paid to push Intel hardware.
If they could just get word
Re: (Score:2)
Also, I think the Intel C compiler only enables MMX/SSE/SSE2 etc. support on Intel CPUs, thus making the programs slower on AMD unless you patch the binary to bypass the CPU check.
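For reference, dispatch code like that usually keys off the CPUID vendor string rather than the actual feature bits. A minimal sketch of reading it (GCC/Clang on x86 only; this is just an illustration, not Intel's actual compiler code):

#include <stdio.h>
#include <string.h>
#include <cpuid.h>   /* GCC/Clang wrapper around the x86 CPUID instruction */

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13];

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;                       /* CPUID not supported */

    /* Leaf 0 returns the vendor string in EBX, EDX, ECX order. */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';

    /* "GenuineIntel" vs. "AuthenticAMD": a check like this is what the
       binary patches bypass so the SSE code paths also run on AMD. */
    printf("CPU vendor: %s\n", vendor);
    return 0;
}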
Re: (Score:2)
Re: (Score:1)
You'd want to make sure this move wouldn't make people equate AMD with Celeron performance. I wouldn't want AMD to be seen as a Celeron replacement.
Re:A better way to spend the money would be on PR (Score:1)
That they are doing so well without any form of mass attention tells me they really could deal a major blow to Intel if they advertised.
Then again, the shock value of telling people I use AMD is like telling them about my Fiero. "Wtf is AMD/Fiero?". ^_^
Re: (Score:2, Informative)
power efficiency?? (Score:2, Interesting)
Re: (Score:1)
Re:power efficiency?? (Score:5, Informative)
Re: (Score:2)
Bad idea for upgrades (Score:5, Insightful)
Combining the CPU and GPU may make sense for embedded systems or as a replacement for integrated graphics, but I cannot see it working for those who prefer to have specific components based on other factors.
Re: (Score:1, Informative)
Unless combining the two increases the performance of the system as a whole enough that the AMD CPU/GPU combination keeps up with or beats the latest and greatest video card...
It's for laptops and budget systems (Score:5, Insightful)
The advantages of a combined CPU/GPU in this space are:
1) Fewer chips means a cheaper board.
2) The GPU is connected directly to the memory interface, so UMA solutions will not suck nearly as hard.
3) No HT hop to get to the GPU, so power is saved on the interface and CPU-GPU communication will be very low latency.
I highly doubt AMD is planning on using combined CPU/GPU solutions on their mainstream desktop parts, and they are absolutely not going to do so for server parts. I think in those spaces they'd much rather have four cores on the CPU, and let you slap in the latest-greatest (ATI I'm sure they hope, but if NVidia gives them the best benchmark score vs Intel chips then so be it) graphics card.
AMD has already distinguished their server, mobile, desktop, and value lines. They are not going to suddenly become retarded and forget that these markets have different needs and force an ATI GPU on all of them.
Re: (Score:2)
Re: (Score:3, Insightful)
Re: (Score:1)
If you're running a server (let's say a web server), aren't you only going to put in a video card that barely has anything on it (I'm thinking ATi Rage stuff, where all you need is 1024x768 or something)?
Re: (Score:2)
Re:It's for laptops and budget systems (Score:5, Interesting)
I think they are, and I think it's the right choice. The GPU that will be integrated will not be today's GPU, but a much more general processor. Look at NVidia's G80 for the beginning of this trend; they're adding non-graphics-oriented features like integer math, bitwise operations, and soon double-precision floating point. G80 has 128 (!) fully general-purpose SISD (not SIMD) cores, and soon with their CUDA API you will be able to run C code on them directly instead of hacking it up through DirectX or OpenGL.
AMD's Fusion will likely look a lot more like a Cell processor than, say, Opteron + X1900 on the same die. ATI is very serious about doing more than graphics: look at their CTM initiative (now in closed beta); they are doing the previously unthinkable and publishing the *machine language* for their shader engines! They want businesses to adopt this in a big way. And it makes a lot of sense: with a GPU this close to the CPU, you can start accelerating tons of things, from scientific calculations to SQL queries. Basically *anything* that is parallelizable can benefit.
I see this as nothing less than the future of desktop processors. One or two x86 cores for legacy code, and literally hundreds of simpler cores for sheer calculation power. Forget about games, this is much bigger than that. These chips will do things that are simply impossible for today's processors. AMD and Intel should both be jumping to implement this new paradigm, because it sets the stage for a whole new round of increasing performance and hardware upgrades. The next few years will be an exciting time for the processor business.
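To make the "anything parallelizable" point concrete, here's a plain-C sketch (my own illustration, not anything from AMD or NVidia) of the kind of data-parallel loop that maps onto those cores; under a stream API like CUDA or CTM, each iteration would become an independent thread on one of the GPU's simple cores instead of running sequentially on the CPU:

#include <stdio.h>

#define N 1000000

/* The "kernel": a tiny, independent computation per element.  No iteration
   depends on any other, which is exactly what lets a GPU spread the work
   across hundreds of simple cores. */
static float saxpy_element(float a, float x, float y) {
    return a * x + y;
}

int main(void) {
    static float x[N], y[N], out[N];
    int i;

    for (i = 0; i < N; ++i) { x[i] = (float)i; y[i] = 1.0f; }

    /* On a CPU this is one long sequential loop; on a GPU the same kernel
       would be launched once per element, all of them in flight at once. */
    for (i = 0; i < N; ++i)
        out[i] = saxpy_element(2.0f, x[i], y[i]);

    printf("out[10] = %f\n", out[10]);   /* 2*10 + 1 = 21 */
    return 0;
}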
Re: (Score:2)
This could be of huge benefit to F/OSS - if it's possible to write a decent GPL driver for an AMD GPU, there's suddenly a huge lever to persuade nVidia to open their GPU machine language too. Yay for AMD (again...)
Re: (Score:2)
The average desktop user is at the point now where they buy a new PC entirely if the old one is too slow in any area. It's not about RAM upgrades or video cards, it's about a PC to them, and they buy a whole new one.
On the same note, integrated video is more than enough for most server configurations, and only high-end CAD/visualization workstations and gaming rigs need independent graphics capabilities.
Re: (Score:2)
Well, yes and no. They're becoming more programmable, but they are still very highly specialized towards doing floating-point vector calculations.
But you're right, in that AMD will probably target the HPC market with a combin
Re: (Score:2, Interesting)
Re: (Score:2)
Probably AMD will continue to make GPU-less chips for headless servers and specialized applications where no GPU is needed, just as (for a while, at least) Intel made 486SX chips, which were 486s without the FPU, when FPUs were first built into CPUs. Although with the emergence of ideas to leverage GPUs for non-display applications, I wouldn't b
Re: (Score:2)
So... (Score:3, Funny)
Project named Fusion...
Please tell me Pons and Fleischmann [wikipedia.org] aren't behind this?
Re: (Score:1)
We also got a gas-guzzling car and a razor with numerous blades. I say that if it doesn't net fusion energy, there should be a law against calling it fusion!
Heat??? (Score:3, Insightful)
With a decent single-GPU gaming rig drawing over 200W just between the CPU and GPU, do they plan to start selling water cooling kits as the stock boxed cooler?
Re:Heat??? (Score:5, Interesting)
You're talking about the high-end "do everything you can" GPUs... ATI is dominating the (discrete) mobile GPU industry because their mobile GPUs use so little power. Integrating (well) one of those into a CPU should still result in a low-power chip.
Yes but (Score:3, Insightful)
Re: (Score:1, Funny)
Upgrades ? (Score:1, Insightful)
Re: (Score:2)
Re: (Score:2, Funny)
Disaster for Linux and OSS (Score:5, Insightful)
This one is untouchable until they open up the graphics drivers - or goodbye AMD/ATI.
jh
Re: (Score:2, Funny)
Goodbye already. (Score:2)
Show your support. Buy one too.
Re: (Score:1)
but... (Score:2, Interesting)
Re: (Score:1)
ATI (before we were AMD) released CTM http://www.atitech.com/companyinfo/researcher/documents.html [atitech.com], which is the hardware specification for the pixel shaders on our graphics cards. The pixel shaders are probably the most complicated part of our chips and we released this because the GPGPU community wanted it. While I don't speak for AMD, I would not be surprised at all if a group serious about writing an open source AMD driv
Remember math coprocessors? (Score:2)
Re: (Score:1)
Network processor
Sound
Video input processor
USB (or whatever equivalent but newer technology)
Disk controller
Memory
Re: (Score:1, Interesting)
Re: (Score:3, Insightful)
Re: (Score:2)
Re: (Score:2)
But seriously, memory would seem to be the next thing. They already have L1 and L2 caches, so why not move all of the memory on board? As long as CPUs keep shrinking, there is no reason not to do this. Sure, you can still have a memory bus to external memory if you want to upgrade.
cool (Score:1)
Maybe... (Score:5, Interesting)
Re: (Score:2)
Linux Drivers (Score:3, Interesting)
I've been an nVidia advocate since 1999 when I bought a TNT2 Ultra for playing Quake III Arena under Linux on my (then) K6-2 400.
I'm on my 4th nVidia graphics card, and I have 6 machines, all running Linux. One is a 10-year-old UltraSPARC, one has an ATI card.
Despite slashbot rantings about the closed-source nVidia drivers, and despite my motley collection of Frankenstein hardware, I've never had a problem with the nVidia stuff. The ATI stuff is junk. The drivers are pathetic (open source), the display is snowy, and the performance is rubbish.
I hope AMD do something about the Linux driver situation.
My next machine will be another AMD, this time with dual dual-core processors and I'll be doing my own Slackware port, but I'll be buying an nVidia graphics card.
Re: (Score:3, Informative)
Well if you do 3D gaming on Linux, you're used to closed source drivers, since there hasn't really been another choice since the 3dfx Voodoo -- who won me over by supporting Linux, if not the Free Software philosophy beh
Re: (Score:2)
I'll be sticking with nVidia for the foreseeable future though; ATi is just not worth the risk on any OS.
Re: (Score:1)
I currently have an ATI Radeon 9200. The reason I went with it rather than a faster card is the open source driver for it. The games I play are emulated SNES, GTA III, GTA Vice City, and Enemy Territory. I haven't had any problems with it. Whenever I install Linux, the card works accelerated out of the box.
Re: (Score:1)
I've got a K6-III/450 with 128MB RAM and a TNT2 M64, running Slackware.
Cyrix MediaGX (Score:1, Funny)
this will fail (Score:2)
Do you really want to have to replace an entire system when you upgrade? You buy a Dell, a new game comes out 6 months later, and your system can't play it reasonably well.
So then you either
a) buy a new system
or
b) put in a video card and don't use the one on the proc.
When processors begin to peak, and each upgrade is basically a few ticks, then developers will have to create things for the systems that are out, not a system that will be out i
Re: (Score:2)
Do you know about HyperTransport? Do you know how important multi-CPU AMD motherboards are about to become?
While Intel's multi-core processors are choking on a single front-side bus, with an AMD system you just plug in another CPU, GPU, physics processor, vector processor, or whatever, and get more (not less) memory bandwidth per processor and a linear increase in processing power.
By 2008, I expect 4-socket AMD motherboards will be commonplace amongst consumers, never mind enthusiasts.
Intel will ha
Re: (Score:3, Interesting)
Integrating the GPU with the CPU will be about driving down cost and power consumption, not something that is usually a high priority for folks who want to run the latest, greatest games and get all the shiniest graphics. So, I'd be very surpri
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
I don't upgrade so this would be nice, yes.
You are a gamer and you are special. Most of the world isn't special and would like a cheaper machine to browse the web. You will buy a different system with an upgradable GPU. Or this new setup will be almost exactly like integrated graphics today which allows you to add your own killer card as a replacement.
At the risk of being modded redundant (Score:5, Insightful)
That's great and all, but does it run Linux?
I'm not kidding, either. Is AMD going to force ATI to open up its specs and its drivers so that we can FINALLY get stable and FULLY functional drivers for Linux, or are they still going to be partially-implemented limited-function binary blobs where support for older-yet-still-in-distribution-channels products will be phased out in order to "encourage" (read: force) customers to upgrade to new hardware, discarding still-current computers?
That is why I do not buy ATI products any more. They provide ZERO VIVO support in Linux. They phase out chip support in drivers even while the products are still actively distributed. They do not maintain compatibility of older drivers to ensure they can be linked against the latest kernels.
This is why I went with a Core 2 Duo for my new system and do not run AMD: their merger with ATI. My fear is that if ATI rubs off on AMD, then support for AMD processors and chipsets will only get worse, not better.
Re: (Score:3, Insightful)
It is pretty typical in a buyout like this for the larger company's culture to dominate the smaller one. While in many cases this is a bad thing as the smaller company has the more open culture, in this case it is the larger company, AMD, that is more open.
It is ridiculous to think that support for AMD
Re: (Score:3, Informative)
So you use SiS chipsets then? They're the only manufacturer I can think of who still provide specs for their video chips (or do Intel still do that too?).
Unfortunately we're currently stuck with a range of equally sucky choices. I tend to buy (older) ATI cards because at least they get reverse-engineered drivers, eventually.
Re: (Score:2)
I do not use SiS products as the failure rates I've seen for SiS are somewhere between horrible and abysmal.
Re: (Score:2)
I think I have figured out what you're saying, but it's much more fun to pretend that you're writing really bad Japanglish ads for hardcore pr0n.
GPU or GPGPU? (Score:2, Interesting)
I remember programming assembly graphics code in BASIC back in the day. You would set the VGA card to mode 13h and then write to...what was it now...0xa00? That's probably wrong. Anyway, whatev
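(For the curious: mode 13h's framebuffer sits at segment 0xA000, linear address 0xA0000, 320x200 with one byte per pixel. A minimal sketch, assuming a 16-bit DOS compiler such as Turbo C; it won't build with a modern toolchain:)

#include <dos.h>    /* int86(), MK_FP() -- Borland/Turbo C specific */
#include <conio.h>  /* getch() */

int main(void) {
    union REGS regs;
    unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
    int x, y;

    regs.x.ax = 0x0013;            /* INT 10h, AX=0013h: 320x200x256 mode */
    int86(0x10, &regs, &regs);

    for (y = 0; y < 200; y++)      /* one byte per pixel, row-major */
        for (x = 0; x < 320; x++)
            vga[y * 320 + x] = (unsigned char)(x ^ y);   /* XOR test pattern */

    getch();                       /* wait for a keypress */

    regs.x.ax = 0x0003;            /* restore 80x25 text mode */
    int86(0x10, &regs, &regs);
    return 0;
}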
Re: (Score:1, Interesting)
Not what you think (Score:2, Interesting)
Re: (Score:2, Funny)
A step between on-board video, and full graphics (Score:3, Interesting)
With Vista coming out soon, PC makers are going to want a low-cost 3D-accelerated solution that can run some (or maybe all) of the eye candy that comes with Vista.
I'll buy it if they provide free drivers (Score:1, Insightful)
For the sake of competition... (Score:3, Funny)
How many blades? (Score:2, Funny)
Fusion or Design By Committee (aka "Convergence") (Score:2)
I'm not saying it won't work; I'm saying that fusing development teams with expertise is a lot different than fusing different components onto the same board. And that,
Dual and Quad socket! (Score:2)
My fear is... (Score:1)
Matrix Operations? (Score:2)
VIDEO cards on the HT bus (Score:1)
AMD 4x4 systems may be able to have 4 video cards + 2 CPUs with graphics processing in them.