
The Outlook On AMD's Fusion Plans

Zonk posted more than 7 years ago | from the playing-oblivion-on-a-handheld dept.

PreacherTom writes "Now that AMD's acquisition of ATI is complete, what do the cards hold for the parent company? According to most experts, it's a promising outlook for AMD. One of the brightest stars in AMD's future could be the Fusion program, which will 'fuse' AMD's CPUs with ATI's GPUs (graphics processing units) in a single, unified processor. The product is expected to debut in late 2007 or early 2008. Fusion brings hopes of energy efficiency, with the CPU and GPU residing on a single chip. Fusion chips could also ease the impact on users who plan to use Windows Vista with Aero, an advanced interface that will only run on computers that can handle a heavy graphics load. Lastly, the tight architecture provided by Fusion could lead to a new set of small, compelling devices that can handle rich media."

122 comments


Stock tip ... (5, Funny)

guysmilee (720583) | more than 7 years ago | (#16875388)

Invest in heat sinks! :-)

Re:Stock tip ... (1)

MtViewGuy (197597) | more than 7 years ago | (#16875544)

Not really. Because the Fusion concept integrates so much into one chip, it also means the potential to substantially cut the size of the motherboard itself, which means lower overall power consumption. This means we could dramatically reduce the size of the CPU box to something not much bigger than the current Mac Mini but still have tremendous processing power. It would certainly make it easier for designers of small barebones computer systems like Shuttle's XPC boxes.

Re:Stock tip ... (3, Insightful)

jandrese (485) | more than 7 years ago | (#16876194)

Yeah, but the heatsink for the processor/graphics card combo system will be righteous.

Frankly, I'm betting this is going to turn out more like the next generation of integrated video. Basically, the only "fusion" chips you'll see will be ones designed for small/cheap boxes that people never upgrade the components on. I'm betting the graphics in general will be slow and the processor will be average. Super fast processors and fast graphics won't get the fusion treatment because the people who buy them tend to want to keep them separate (for upgrading later), not to mention the difficulty you'd have powering and cooling a chip that complex.

Re:Stock tip ... (2, Interesting)

Michael Woodhams (112247) | more than 7 years ago | (#16876582)

Yeah, but the heatsink for the processor/graphics card combo system will be righteous.

"Righteous" = "big"?

Intel was making 130W CPUs until AMD got better performance with 60W (although Intel have now overtaken AMD on this.) I've got a 40W GPU which is as powerful as a 100W GPU of a couple of years ago.

A state-of-the-art CPU plus a mid-to-high range GPU today could come in at around 130W. The 130W CPU heat-sink problem is solved (for noisy values of "solved".)

Also, it is much easier to deal with a big heatsink on the motherboard than on a video card - the size and weight are much less restricted.

Hm, perhaps if AMD starts making 100+W Fusion chips, they'll start supporting Intel's BTX form factors (which were largely designed to improve cooling). As a silent computing nut, I think this would be a Good Thing.

Re:Stock tip ... (1)

MtViewGuy (197597) | more than 7 years ago | (#16877242)

Actually, I think an XPC-sized box with heat-pipe cooling for the Fusion chip will actually work pretty well.

I think people are confused about the nature of Fusion--it is intended for general computer users, not the high-end geeks who want to load up on the latest in everything inside the computer.

Re:Stock tip ... (1)

Bob Gelumph (715872) | more than 7 years ago | (#16876608)

There might be less overall energy use, but there will be a much higher energy use in the small part of the motherboard in which the processor resides.
Heat sinks don't cover the whole motherboard, just the hottest parts, so putting two devices that each normally need heat sinks into one small area implies to me that the OP is right. There will have to be some kick-arse cooling to stop it from melting.
Especially when you take into account that by the time they get anything out to market, the CPU that will be part of the package will have at least 4 cores of its own.

Airport fun (2, Funny)

LiquidCoooled (634315) | more than 7 years ago | (#16875434)

How will Homeland Security like you bringing home a multi-core Fusion through the gates?

"But, but its an AMD processor, built in Germany or Russia or somewhere"

"Teh internet told me it was more powerful than anything else out there."

"It would literally blow me away!"

Re:Airport fun (1)

Chicken04GTO (957041) | more than 7 years ago | (#16875514)

Won't buy it.
I hate ATI, and when AMD bought them, they absorbed ATI's taint.
This solution is really only good for servers anyway, where people don't care about video.
And since server solutions can get away with uber-low-end graphics chips, why bother?

Enthusiasts will want to upgrade separately... defeating the purpose.

Re:Airport fun (2, Interesting)

PFI_Optix (936301) | more than 7 years ago | (#16875670)

So...there are only servers and enthusiasts in the market?

Wow. And here I was thinking there was this vast market for things called "workstations" where businesses didn't need high-end video cards and home systems where users didn't require the best 3D accelerators on the market. Shows what I know.

Even most enthusiasts only replace their video cards every 12-18 months. If a CPU/GPU combo was in the same price range as a current video card (not farfetched) then there'd be no reason not to use a combo chip.

But hey, feel free to waste hundreds of dollars just because you think you know how things will work. Don't let the facts get in the way of your fanaticism.

Re:Airport fun (1)

grommit (97148) | more than 7 years ago | (#16875866)

The GP may be on to something though. Technically, all of those business and home desktops *are* servers since they've got a myriad of trojans, rootkits and bots busily serving data to script kiddies around the globe.

Re:Airport fun (1)

geekoid (135745) | more than 7 years ago | (#16876202)

but his point is valid.

Most desktop machines in the workplace do not need high-end video cards. The onboard one works fine.

"Even most enthusiasts only replace their video cards every 12-18 months. If a CPU/GPU combo was in the same price range as a current video card (not farfetched) then there'd be no reason not to use a combo chip."
So if you bought a machine 6 months ago, and now you have to upgrade the entire thing to use an application, that's fine with you?

feel free to waste those hundreds of dollars.

Enough with the snarky.

My CPU is 3 years old and runs everything fine. I use it for photography and games, mostly WoW.
The only upgrade has been the video card, twice, both times about 80 bucks.

Much cheaper than having to buy a complete combo two times.

Re:Airport fun (3, Insightful)

PFI_Optix (936301) | more than 7 years ago | (#16876650)

You assume that this would do away with video cards; there's not a chance of that happening any time soon. As I said in another thread, it'd be quite simple for AMD to disable the on-chip video in favor of a detected add-in card.

Right now I'm buying a $200 vidcard every 18-24 months. I'm looking at probably getting my next one middle of next year, around the same time I replace my motherboard, CPU, and RAM. My current system is struggling with the Supreme Commander beta and upcoming games like Crysis should be equally taxing on it. In the past six years, I've bought three CPU upgrades. If AMD could market a $300 chip that gave me a CPU and GPU upgrade with similar performance and stay on the same socket for 3-4 years, I'd be breaking even.

Re:Airport fun (2, Interesting)

BigFootApe (264256) | more than 7 years ago | (#16879508)

I believe they would simply re-purpose the onboard shaders for general computing.

Re:Airport fun (1)

BlackSnake112 (912158) | more than 7 years ago | (#16876830)

Most people I have seen want a workstation for either high-end programming or rendering something (3D video/pictures, etc.). The high-end programmers run dual or more (usually quad) monitors, and the rendering people need the graphics processing power. Not the gamer cards but the rendering cards: Quadro from Nvidia and Fire from ATI (I forget ATI's exact name for them). Neither of those types of cards is low end. The plain number-crunching business people, yes, they don't need a high-end graphics solution. But many business people are going multi-monitor. And if those business people are stock traders, I have seen a few with eight (yes, 8!!) monitors. Those people had multiple video cards (one AGP and three PCI, all hooked to two monitors each).

Regular home people who read email, surf and do regular things, yes, this looks good for them. But for most of the other people this is not going to be common.

I need to see these CPU/GPU combos in action before I'd say anything about the gamer crowd. I have seen gamers go through 2-5 video cards a year. Remember, they overclock a lot. This can burn out the chips faster if kept at the upper limit all the time. Most of the overclocks to the CPU don't burn out the chip, but the video overclocking I have seen is where the damage comes in. They buy a high-end video card, change the fan (or put a water block on it) and max out the GPU and memory speeds on the video card. Sometimes all is fine, other times the card gets burned out.

Just what I have seen, and like I said, I want to see these in action before any real decision is made.

Servers use video cards? (2, Insightful)

milgr (726027) | more than 7 years ago | (#16875864)

Gee, most of the servers I use don't have a video card. Some of the servers have serial ports. Others talk over a proprietary fabric - and pretend to have a serial connection (and maybe even VGA). I don't need to walk into the lab to get to the server's virtual consoles.

Come to think of it, the way we have things set up, the console is inaccessible from the lab - but accessible via terminal concentrators - over the LAN.

Re:Airport fun (1)

Zonk (troll) (1026140) | more than 7 years ago | (#16877740)

How can you use a server without Aero? The command line is too scary...

A better way to spend the money would be on PR... (2, Insightful)

Channard (693317) | more than 7 years ago | (#16875470)

.. or advertising on TV. I work in a computer shop and it seems loads of people have no idea who the hell AMD are. I've explained to a lot of customers that they're simply a competitor to Intel, but still the customers go 'No, I've been told to get an Intel.' I can't recall ever having seen an AMD ad on telly at all.

Re:A better way to spend the money would be on PR (2, Interesting)

scuba_steve_1 (849912) | more than 7 years ago | (#16875536)

...which may explain how AMD has managed to keep their costs low over the years. Word of mouth is compelling...even to the point that many folks that I know are now biased against Intel...even though we are at a unique point where AMD's advantage has eroded...at least for the moment.

Re:A better way to spend the money would be on PR (1)

Hijacked Public (999535) | more than 7 years ago | (#16875724)

To some extent, yes.

I just finished explaining to a friend why the software he just purchased for his business will run fine on the laptop I suggested from a Fry's ad. The specs for the software list "Intel Processor" and he assumed the AMD chip in my recommendation wouldn't work, because he has no idea what a processor even does. I would even hazard a guess that whoever wrote those specs doesn't know either; this little software vendor isn't getting paid to push Intel hardware.

If they could just get word out enough to get the general public to realize that they are making an x86 alternative to Intel they might do themselves some good.

Re:A better way to spend the money would be on PR (1)

businessnerd (1009815) | more than 7 years ago | (#16875938)

Leave the advertising to the computer manufacturers like Dell and HP. Now that Dell is going to carry AMD, they could start putting AMD logos, with whatever "chime" AMD has, at the end of every one of their commercials. On their website, they can even say, "Now we bring you more savings by offering you the option of an AMD processor," followed by an explanation that AMD is on par with Intel when it comes to speed and is fully compatible with all software designed for Intel chips. If I were in charge, I'd probably drop the Celeron as the low-cost alternative and just push AMD's chips. In general the AMD costs as much as, if not less than, a Celeron and outperforms it like gangbusters.

Re:A better way to spend the money would be on PR (1)

AP2k (991160) | more than 7 years ago | (#16876142)

I saw one of Dell's laptop commercials, and the only processor options it listed were AMD's single- and dual-core Turions. They are getting a little PR at least.

That they are doing so well without any form of mass attention tells me they really could deal a major blow to Intel if they advertised.

Then again, the shock value of telling people I use AMD is like telling them about my Fiero. "Wtf is AMD/Fiero?" ^_^

Re:A better way to spend the money would be on PR (2, Informative)

jernejk (984031) | more than 7 years ago | (#16876826)

power efficiency?? (2, Interesting)

Klintus Fang (988910) | more than 7 years ago | (#16875496)

Yeah. I'm also wondering how putting the two hottest components on the motherboard (the GPU and CPU) into the same package is a power savings... :-/ Maybe on the low end of the market where the performance of the GPU is irrelevant, but for those who actually care about GPU performance, putting the two most power-hungry and memory-bandwidth-hungry components together doesn't seem like a good idea.

Re:power efficiency?? (1)

Yfrwlf (998822) | more than 7 years ago | (#16875610)

There will be a decrease in net heat output and power consumption, but you're right, it will make them hotter than just the CPU or GPU by itself. Maybe water cooling will be really needed at that point for the fastest CPU/GPU combos.

Re:power efficiency?? (0)

Anonymous Coward | more than 7 years ago | (#16875660)

Reducing the number of functional units required in a two-chip solution by moving to a unified solution will result in an overall power reduction, while being able to make use of AMD's (and, in the future, Intel's) processes will reduce the cost and power requirements of the fabless GPU manufacturers, who tend to get stuck a process generation behind for their silicon. It will also reduce the overlap with the functional units that would be on the CPU anyway for SIMD operations, because GPUs are already arrays of SIMD units.

This is obviously where this was going to head: the only surprising aspect is the timeline.

Re:power efficiency?? (5, Informative)

RuleBritannia (458617) | more than 7 years ago | (#16876528)

Any kind of integration tends to improve power efficiency just because of the high capacitance of the PCB traces. This makes it difficult to route a PCB for high-speed inter-chip communications never mind getting multiple 2.5Gb/s (PCIe) signal traces through a connector. All this requires large driver cells to drive off-chip communication and these use a great deal of power (and moderate area) on chip. Reducing the noise floor of your signals (by keeping them on chip) also gives you more headroom for voltage reductions in your digital hardware. All in all it makes it a much better picture overall for power efficiency. But dissipating power from these new chips will still be a headache for CPU package designers and systems guys alike.
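As a rough rule of thumb (standard CMOS arithmetic, not a figure for any particular Fusion part): dynamic switching power scales roughly as P_dyn ≈ α · C · V² · f, where α is the switching activity, C is the capacitance being driven, V is the supply voltage and f is the clock frequency. Moving a signal from a PCB trace or connector onto the die cuts C by orders of magnitude, and the lower noise floor lets V come down as well, which enters squared.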

Bad idea for upgrades (5, Insightful)

rjmars97 (946970) | more than 7 years ago | (#16875568)

Although I can see the potential efficiency increases, combining the GPU and CPU into one chip means that you will be forced to upgrade one when you only want to upgrade the other. To me, this seems like a bad idea in that AMD would have to make dozens of GPU/CPU combinations. Say I want one of AMD's chips in my headless server, am I going to have to buy a more expensive processor because it has a high powered GPU that I don't want or need? What if I want to build a system with a good processor to start, but due to budget reasons want to hold off on buying a good video card?

Combining the CPU and GPU may make sense for embedded systems or as a replacement for integrated graphics, but I cannot see it working for those who prefer to have specific components based on other factors.

Re:Bad idea for upgrades (1, Informative)

Anonymous Coward | more than 7 years ago | (#16875730)

Combining the CPU and GPU may make sense for embedded systems or as a replacement for integrated graphics, but I cannot see it working for those who prefer to have specific components based on other factors.

Unless combining the two increases the performance of the system as a whole enough that the AMD CPU/GPU combination keeps up with or beats the latest and greatest video card... ...and is nearly as cheap to upgrade as buying a high-end video card... ...and both of these seem entirely possible to me.

It's for laptops and budget systems (4, Insightful)

Chris Burke (6130) | more than 7 years ago | (#16875946)

Especially the former, where you can't really upgrade anyway and you typically have a GPU soldered to the board.

The advantages of a combined CPU/GPU in this space are:
1) Fewer chips means a cheaper board.
2) The GPU is connected directly to the memory interface, so UMA solutions will not suck nearly as hard.
3) No HT hop to get to the GPU, so power is saved on the interface and CPU-GPU communication will be very low latency.

I highly doubt AMD is planning on using combined CPU/GPU solutions on their mainstream desktop parts, and they are absolutely not going to do so for server parts. I think in those spaces they'd much rather have four cores on the CPU, and let you slap in the latest-greatest (ATI I'm sure they hope, but if NVidia gives them the best benchmark score vs Intel chips then so be it) graphics card.

AMD has already distinguished their server, mobile, desktop, and value lines. They are not going to suddenly become retarded and forget that these markets have different needs and force an ATI GPU on all of them.

Re:It's for laptops and budget systems (1)

jandrese (485) | more than 7 years ago | (#16876278)

You know, there could be a short-term marketing coup to be had with a UMA-style setup. I've noticed that a lot of people compare cards based entirely on how much memory they have (you have a 128MB card? Bah, my 256MB card is twice as good!), and will even mention it in game requirements ("requires a 64MB video card"). With UMA you could theoretically count the entire system memory as your card memory: "this new system has 1GB of video on it!" People will think it's the hottest new thing until they get home and discover that the amount of memory on a video card doesn't matter as much as many other aspects of the card.

Re:It's for laptops and budget systems (0)

Anonymous Coward | more than 7 years ago | (#16876566)

They'll of course do this for server parts, only the units will be used for more general numeric processing. There is nothing magical about what GPUs do, they are large vector processors, and they are becoming more general-purpose in nature as the demands of shader programs increase. This is why you see all of the interest in GPGPU stuff, and why when you aren't using these units to play Quake 6 you'll be able to use them for simulations of fluid dynamics. This will overlap with the market of Cell or POWER6 more than the market of the T1.

Re:It's for laptops and budget systems (1)

Chris Burke (6130) | more than 7 years ago | (#16876684)

They'll of course do this for server parts, only the units will be used for more general numeric processing. There is nothing magical about what GPUs do, they are large vector processors, and they are becoming more general-purpose in nature as the demands in shader programs increases.

Well, yes and no. They're becoming more programmable, but they are still very highly specialized towards doing floating point vector calculations.

But you're right, in that AMD will probably target the HPC market with a combined CPU/GPU where the GPU is destined to be used as a math coprocessor. I think they had a press release recently about "Stream Computing", which basically means offloading vector computation to the GPU. However a GPU will still be a poor match for the server space, so I'm fairly confident that if AMD goes this route they'll split the Opteron line into a server and HPC line.

Re:It's for laptops and budget systems (2, Insightful)

racerx509 (204322) | more than 7 years ago | (#16878168)

This product will most likely find its way into the mobile industry. Imagine a laptop with a *decent* 3D accelerator that uses low power and can actually run a 3D game at a good frame rate, without weighing a ton or singeing your knees. They may be onto something here.

Re:It's for laptops and budget systems (1)

CCFreak2K (930973) | more than 7 years ago | (#16878174)

I think in those spaces they'd much rather have four cores on the CPU, and let you slap in the latest-greatest...graphics card.

If you're running a server (let's say a web server), aren't you only going to put in a video card that barely has anything on it (I'm thinking ATi Rage stuff, where all you need is 1024x768 or something)?

Re:It's for laptops and budget systems (1)

Chris Burke (6130) | more than 7 years ago | (#16878408)

Yes, the "let you" was to indicate that doing so would be the customer's choice. Obviously few server customers would. I was mostly referring to the desktop market with that comment.

Re:It's for laptops and budget systems (4, Interesting)

modeless (978411) | more than 7 years ago | (#16879024)

I highly doubt AMD is planning on using combined CPU/GPU solutions on their mainstream desktop parts, and they are absolutely not going to do so for server parts

I think they are, and I think it's the right choice. The GPU that will be integrated will not be today's GPU, but a much more general processor. Look at NVidia's G80 for the beginning of this trend; they're adding non-graphics-oriented features like integer math, bitwise operations, and soon double-precision floating point. G80 has 128 (!) fully general-purpose SISD (not SIMD) cores, and soon with their CUDA API you will be able to run C code on them directly instead of hacking it up through DirectX or OpenGL.

AMD's Fusion will likely look a lot more like a Cell processor than, say, Opteron + X1900 on the same die. ATI is very serious about doing more than graphics: look at their CTM initiative (now in closed beta); they are doing the previously unthinkable and publishing the *machine language* for their shader engines! They want businesses to adopt this in a big way. And it makes a lot of sense: with a GPU this close to the CPU, you can start accelerating tons of things, from scientific calculations to SQL queries. Basically *anything* that is parallelizable can benefit.

I see this as nothing less than the future of desktop processors. One or two x86 cores for legacy code, and literally hundreds of simpler cores for sheer calculation power. Forget about games, this is much bigger than that. These chips will do things that are simply impossible for today's processors. AMD and Intel should both be jumping to implement this new paradigm, because it sets the stage for a whole new round of increasing performance and hardware upgrades. The next few years will be an exciting time for the processor business.
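To make the "run C code on them directly" idea concrete, here is a minimal sketch of the kind of data-parallel kernel that maps onto such an array of cores, written in CUDA-style C (the kernel, array sizes, and names here are illustrative assumptions, not anything AMD or NVidia has published):

    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    // Each GPU thread handles one element: y[i] = a*x[i] + y[i].
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;                       // one million floats
        std::vector<float> hx(n, 1.0f), hy(n, 2.0f); // host data
        float *dx, *dy;
        cudaMalloc((void **)&dx, n * sizeof(float));
        cudaMalloc((void **)&dy, n * sizeof(float));
        cudaMemcpy(dx, hx.data(), n * sizeof(float), cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy.data(), n * sizeof(float), cudaMemcpyHostToDevice);

        // Launch ~4096 blocks of 256 threads: one thread per array element.
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

        cudaMemcpy(hy.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);
        std::printf("y[0] = %f\n", hy[0]);           // expect 4.0
        cudaFree(dx);
        cudaFree(dy);
        return 0;
    }

Every iteration of what would be a serial loop on the CPU becomes an independent GPU thread, which is exactly the kind of work those hundreds of simpler cores are built for.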

Re:It's for laptops and budget systems (1)

MikeBabcock (65886) | more than 7 years ago | (#16879710)

I could see this on servers and desktops equally, just not on gaming rigs.

The average desktop user is at the point now where they buy a new PC entirely if the old one is too slow in any area. It's not about RAM upgrades or video cards; it's about a PC to them, and they buy a whole new one.

On the same note, integrated video is more than enough for most server configurations, and only high-end CAD/visualization workstations and gaming rigs need independent graphics capabilities.

Re:Bad idea for upgrades (2, Interesting)

hairpinblue (1026846) | more than 7 years ago | (#16876378)

I can appreciate that an integrated CPU/GPU combination may have advantages in many arenas. It feels like a Bad Idea, though, in the same way that televisions with integrated VHS players were a bad idea, and all-in-one stereo systems didn't become a Good Idea until they came down both in price and physical size. In general I'm not comfortable with someone else bundling my technology for me. I'll be more than happy to accept the cost of keeping up to date with researching the individual components, and accept the small performance drawback of the data bus between processor, memory, and the video card. In some ways it feels like a cog in the wheel of advancing TC and DRM. In other ways it's really inevitable, since video display is such an enormously processor-intensive task.

The computer, for the majority of the population, has become an entertainment device similar to what the television and radio were in progressively earlier generations. Even with the push to F/OSS taking off and catching the attention of more and more consumers, the end tasks are solidifying and standardizing for the vast majority of the population. Logically speaking, why wouldn't the industry begin to solidify and standardize more and more of the components within the product? Look for the reintroduction of integrated audio chipsets, and maybe even their integration into the processor core, for a single unified network entertainment box (SuneB) rather than a real computer.

Then where will the F/OSS movement go? By the time the SuneB hits we'll be back to OS on a chip (much like the Amiga had 20 years ago, or TV set-top network boxes, which the Amiga became in Escom's and later QVD's hands, have, or DVD players have). Technology really seems more and more cyclical every time I see it evolving and progressing.

As a hobbyist, though, this sort of move makes me uncomfortable and maybe even a little bit sad. I've always liked the puzzles that computers bring: programming, building, troubleshooting, compiling, security monitoring, maintaining, and even the jargon and zealotry that comes with being a computer enthusiast. When computers have become a standard black box commodity what will be the next hobby puzzle to hold my interest?

Oh. And yes. I'd like to claim intellectual property on the SuneB. Sure, the industry will call it something else and all the patents will have a different name, but at least, 10 years from now when a SuneB clone company is the driving force on the stock market, I can sit back and think to myself, "Somewhere on Slashdot there's a post proving that I should be a billionaire rather than a corporate wage-slave."

Re:Bad idea for upgrades (1)

DragonWriter (970822) | more than 7 years ago | (#16876678)

Say I want one of AMD's chips in my headless server, am I going to have to buy a more expensive processor because it has a high powered GPU that I don't want or need?


Probably AMD will continue to make GPU-less chips for headless servers and specialized applications where no GPU is needed, just as (for a while, at least) Intel made 486SX chips, which were 486s without the FPU, when FPUs were first built into CPUs. Although with the emergence of ideas to leverage GPUs for non-display applications, I wouldn't be surprised if such a model were, like the 486SX, merely a short-term transition to a future where GPUs are part of the processor in most general purpose computers.

What if I want to build a system with a good processor to start, but due to budget reasons want to hold off on buying a good video card?


You'll be in the same place that people wanting a good general purpose processor who wanted to save money by not buying an FPU right away were once FPUs became standard features: SOL. OTOH, it's quite possible that CPU+GPU combo units will be enough less expensive than a CPU and GPU bought separately that they'll justify the extra expense for most purchasers.

Combining the CPU and GPU may make sense for embedded systems or as a replacement for integrated graphics, but I cannot see it working for those who prefer to have specific components based on other factors.


Probably, but the former are the overwhelming majority of the market, and the latter a small minority. Is it worth it to manufacturers to not serve the first well in order to cater to the latter? Perhaps not.

Re:Bad idea for upgrades (1)

drinkypoo (153816) | more than 7 years ago | (#16876762)

Presumably AMD will have a limited set of these processors and most of them will be targeted at budget systems, the rest at laptops. Neither kind of machine is typically upgraded - most laptops you can't upgrade, and most budget systems just never need to be upgraded because the kind of people who buy them want to surf the web, read email, and use office. And maybe a simple paint program, and download pictures off their digital camera. Any crap computer today can do all that; might as well get more integration, which will lower cost. And EVERYONE will want a decent 3D solution in their PC soon enough, because we're going to see more and more use of it as time goes by. Aqua and Aero are both just baby steps.

So... (2, Funny)

FlyByPC (841016) | more than 7 years ago | (#16875614)

Energy efficiency...
Project named Fusion...
...
Please tell me Pons and Fleischmann [wikipedia.org] aren't behind this?

Re:So... (1)

eclectro (227083) | more than 7 years ago | (#16875808)

Project named Fusion...

We also got a gas guzzling car and razor with numerous blades. I say that if it doesn't net fusion energy, there should be a law against calling it fusion!

Re:So... (0)

Anonymous Coward | more than 7 years ago | (#16876636)

> We also got a gas guzzling car and razor with numerous blades


you forgot to mention this remarkable piece of tech: Alesis Fusion Synth [alesis.com]

Heat??? (2, Insightful)

pla (258480) | more than 7 years ago | (#16875616)

Although CPUs have gotten better in the past year, GPUs (particularly ATI's) still keep outdoing each other in just how much power they can suck.

With a decent single-GPU gaming rig drawing over 200W just between the CPU and GPU, do they plan to start selling water cooling kits as the stock boxed cooler?

Re:Heat??? (4, Interesting)

Pulzar (81031) | more than 7 years ago | (#16878096)

Although CPUs have gotten better in the past year, GPUs (particularly ATI's) still keep outdoing each other in just how much power they can suck.


You're talking about the high-end "do everything you can" GPUs... ATI is dominating the (discrete) mobile GPU industry because their mobile GPUs use so little power. Integrating (well) one of those into a CPU should still result in a low-power chip.

AWEsOME FP? (-1, Redundant)

Anonymous Coward | more than 7 years ago | (#16875630)

486/66 3ith 8 accountVs for less the problems

Re:AWEsOME FP? (0)

Anonymous Coward | more than 7 years ago | (#16876656)

post wha1 is? nutteR bunny funyon

Yes but (2, Insightful)

Ant P. (974313) | more than 7 years ago | (#16875652)

Will it run Linux less than half a year after it's obsoleted by the next version?

Re:Yes but (1, Funny)

Anonymous Coward | more than 7 years ago | (#16880504)

Yes, in all the VESA glory

Upgrades ? (1, Insightful)

Anonymous Coward | more than 7 years ago | (#16875680)

Let's say I buy Fusion. Later on, NVIDIA brings a cool graphics card to market. Will I be able to use an NVIDIA graphics card with Fusion?

Re:Upgrades ? (1)

PFI_Optix (936301) | more than 7 years ago | (#16875764)

I don't see why not. A lot of modern systems have integrated graphics that automatically switch off or become secondary if an add-in video card is detected. No reason AMD couldn't do this in their own chips.

Re:Upgrades ? (2, Funny)

adsl (595429) | more than 7 years ago | (#16877084)

Or you could replace the whole Fusion chipset with Nvidia's projected chipset (also due late 2007), which attaches a CPU onto their graphics chipset.

Re:Upgrades ? (0)

Anonymous Coward | more than 7 years ago | (#16879506)

Let's say I buy Fusion. Later on, NVIDIA brings a cool graphics card to market. Will I be able to use an NVIDIA graphics card with Fusion?

If they don't, many will not buy it. I bought ATI for years, then switched to NVIDIA and didn't look back. Better alternative-OS drivers were the real draw, plus they seemed cheaper.

AMD made a mistake buying ATI; they just haven't figured it out yet. My ATI Video USB 2.0 Blunder is about to hit the waste basket. Flaky and supported by nothing.

Disaster for Linux and OSS (4, Insightful)

hirschma (187820) | more than 7 years ago | (#16875732)

Whoa. You're going to need a closed-source kernel driver to use your CPU now? They can eat me. The graphics driver situation is bad enough.

This one is untouchable until they open up the graphics drivers - or goodbye AMD/ATI.

jh

Re:Disaster for Linux and OSS (2, Funny)

Kookus (653170) | more than 7 years ago | (#16875962)

No, you just won't have the extra functionality of the graphics portion of the chip... Which, hey!, isn't any different than today!

Goodbye already. (1)

Kludge (13653) | more than 7 years ago | (#16879890)

I just bought a CPU/mobo combo with open source support for 2D/3D acceleration under Linux out of the box. It has an Intel G965 chipset.

Show your support. Buy one too.

but... (2, Interesting)

Hangin10 (704729) | more than 7 years ago | (#16875768)

Does this mean ATI will be opening up its GPU programming specs, or merely what is being stated (that the graphics chip and CPU will share a die)?

Re:but... (1)

Glacial Wanderer (962045) | more than 7 years ago | (#16877214)

Is ATI going to open up their specs so people can write open source drivers?

ATI (before we were AMD) released CTM http://www.atitech.com/companyinfo/researcher/documents.html [atitech.com] , which is the hardware specification for the pixel shaders on our graphics cards. The pixel shaders are probably the most complicated part of our chips and we released this because the GPGPU community wanted it. While I don't speak for AMD, I would not be surprised at all if a group serious about writing an open source AMD driver could get the rest of the chip specification released by asking for it. It doesn't hurt to ask, especially with this CTM project as proof that ATI/AMD is now serious about releasing specs.

Remember math coprocessors? (1)

maddogsparky (202296) | more than 7 years ago | (#16875778)

Prior to 486s, they used to have the floating point functions on a separate chip from the processor. If the GPU is moving to the processor now, what will be the next thing to get sucked in?

Re:Remember math coprocessors? (0)

milgr (726027) | more than 7 years ago | (#16875970)

There are currently not that many other components to get sucked in. Here is a list off the top of my head:
  Network processor
  Sound
  Video input processor
  USB (or whatever equivalent but newer technology)
  Disk controller
  Memory

Re:Remember math coprocessors? (1, Interesting)

Anonymous Coward | more than 7 years ago | (#16876108)

RAM. Look at the Xenos GPU (Xbox 360) or the PS3 GPU. Both have the RAM soldered directly over the GPU package. They can't be in the same chip because of the different fabrication processes, but they can be glued together for higher overall speed. But upgrades will suck...

Re:Remember math coprocessors? (2, Insightful)

rjstanford (69735) | more than 7 years ago | (#16877060)

One thing to consider is that right now it's getting pretty easy to have "enough" RAM for 99% of all users. I mean, if you get a new machine today that has 1.5-2.0 GB in it, the odds of even wanting to upgrade would be slim to none. The fact is that most people live quite reasonably with 256-512 MB right now, and will never upgrade. Note: most /. readers != most people. For modern machines, if you're not running anything more brutal than Office, having a gig permanently attached would probably make sense for most people who would be using an integrated-graphics type of system.

cool (1)

forrestf (1028150) | more than 7 years ago | (#16875798)

This processor will just be an add-on on top of the video card you would have, like 3DNow!; it's just an add-on. Maybe we can shift to 128-bit processors!

Maybe... (4, Interesting)

MobyDisk (75490) | more than 7 years ago | (#16875892)

The article says that this might be attractive to businesses: I can see that since most businesses don't care about graphics. This is similar to businesses buying computers with cheap on-board video cards. But that means they will be profiting on the low-end. It seems like this is more of a boon for laptops and consoles: Currently, laptops with decent video cards are expensive and power-hungry. Same with consoles. But for mid-range and high-end systems, there must be a modular bus connecting these two parts since they are likely to evolve at different rates, and likely to be swapped-out individually.

Re:Maybe... (1)

MikeFM (12491) | more than 7 years ago | (#16876700)

I don't know if that's really true anymore. When I upgrade either my CPU or GPU these days I usually end up upgrading both along with mobo and RAM. Unless you're making a very minor upgrade of either CPU or GPU, and who can afford that unless you're buying crappy outdated stuff anyway, you'll likely have a hard time not needing to upgrade your mobo in the process and if you do that you usually have to upgrade everything else. Just package everything together and make it really powerful so I won't have to upgrade more than every other year to play the coolest new games.

It does make me wonder about the multi-core multi-gpu future though. Can they manage to pack 16 cores and 4 gpus into a single package that won't melt a hole through the mobo? If they can then I'll be sure to buy one. They better make the GPU specs as open as the CPU specs though because I use Linux and won't buy it if it doesn't have opensource drivers available.

Linux Drivers (2, Interesting)

turgid (580780) | more than 7 years ago | (#16875944)

I've been an nVidia advocate since 1999 when I bought a TNT2 Ultra for playing Quake III Arena under Linux on my (then) K6-2 400.

I'm on my 4th nVidia graphics card, and I have 6 machines, all running Linux. One is a 10-year-old UltraSPARC, one has an ATI card.

Despite slashbot rantings about the closed-source nVidia drivers, and despite my motley collection of Frankenstein hardware, I've never had a problem with the nVidia stuff. The ATI stuff is junk. The drivers are pathetic (open source) and the display is snowy, and the performance is rubbish.

I hope AMD do something about the Linux driver situation.

My next machine will be another AMD, this time with dual dual-core processors and I'll be doing my own Slackware port, but I'll be buying an nVidia graphics card.

Re:Linux Drivers (2, Informative)

Chris Burke (6130) | more than 7 years ago | (#16876024)

Despite slashbot rantings about the closed-source nVidia drivers, and despite my motley collection of Frankenstein hardware, I've never had a problem with the nVidia stuff. The ATI stuff is junk. The drivers are pathetic (open source) and the display is snowy, and the performance is rubbish.

Well, if you do 3D gaming on Linux, you're used to closed source drivers, since there hasn't really been another choice since the 3dfx Voodoo -- who won me over by supporting Linux, if not the Free Software philosophy behind it. NVidia similarly works. The ATI drivers are terrible, and I'm not talking about the open source ones.

I hope AMD do something about the Linux driver situation.

Me too, because I'm sick of having only one practical choice for graphics cards. Not that I really have any complaints with NVidia, but it would be nice to be able to pick the best card, not the one that I can count on to work.

I'm hopeful, just because AMD has been a big supporter of Linux and gcc, particularly in getting them to support AMD64. I guess we'll see.

Re:Linux Drivers (1)

Ant P. (974313) | more than 7 years ago | (#16876602)

I do 3D gaming just fine with a 9250 and the open driver. It's not noticeably worse than the FX5200 I had before it burned out.
I'll be sticking with nVidia for the foreseeable future though; ATi is just not worth the risk on any OS.

Re:Linux Drivers (0)

Anonymous Coward | more than 7 years ago | (#16876962)

K6-2 400

Ha! I'm running a K6-3 410. Voodoo 3 graphics card.

Re:Linux Drivers (1)

turgid (580780) | more than 7 years ago | (#16877138)

I've got a K6-III/450 with 128MB RAM and a TNT2 M64, running Slackware.

Re:Linux Drivers (0)

Anonymous Coward | more than 7 years ago | (#16877478)

Yeah, I'd say I'm in the same boat. I guess I qualify as an AMD fanboy, but not to the point of idiocy. If it ever gets to the point where AMD doesn't support nVidia, I'll just drop their processors. Honestly I think this will be an exercise in R&D. AMD has something up their sleeves that will maintain their relationship with nVidia - and I'd say it's some sort of coprocessor with an interface that nVidia is free to use. If they don't, they're idiots, and they're going to lose a lot of customers.

Re:Linux Drivers (1)

Zonk (troll) (1026140) | more than 7 years ago | (#16877936)

The ATI stuff is junk. The drivers are pathetic (open source) and the display is snowy, and the performance is rubbish.


I currently have an ATI Radeon 9200. The reason I went with it rather than a faster card is the open source driver for it. The games I play are emulated SNES, GTA III, GTA Vice City, and Enemy Territory. I haven't had any problems with it. Whenever I install Linux, the card works accelerated out of the box.

Cyrix MediaGX (2, Funny)

vision864 (712184) | more than 7 years ago | (#16876034)

Cool, AMD is about to MediaGX themselves.

this will fail (1)

geekoid (135745) | more than 7 years ago | (#16876036)

the processor market still changes too rapidly for this kind of bonding.

Do you really want to have to replace an entire system when you upgrade? You buy a Dell, a new game comes out 6 months later, and your system can't play it reasonably well.

So then you either
a) buy a new system
or
b) put in a video card and not use the one on the proc.

When processors begin to peak, and each upgrade is basically a few ticks, developers will have to create things for the systems that are out, not systems that will be out in a year.
When this happens* (and it will), software will enter a golden age.

Of course, someone could come up with completely different technology and make current procs irrelevant.

*
There are many factors coming to light that are already slowing down proc development. Die limitations, noise limitations, bus size limitations, to name just a few. Don't confuse practicality with theory.

Re:this will fail (1)

turgid (580780) | more than 7 years ago | (#16876220)

Do you know about Hypertransport? Do you know how important multi-CPU AMD motherboards are about to become?

While Intel's multi-core processors are choking on a single frontside bus, with an AMD system you just need to plug in another CPU, GPU, physics processor, vector processor or whatever and get more (not less) memory bandwidth per processor and a linear increase in processing power.

By 2008, I expect 4-socket AMD motherboards will be commonplace amongst consumers, never mind enthusiasts.

Intel will have hot, slow, high frequency 8-core space heaters choking in a single socket.

Re:this will fail (3, Interesting)

NSIM (953498) | more than 7 years ago | (#16876280)

Do you really want to have to replace an entire system when you upgrade? You buy a Dell, a new game comes out 6 months later and your system can't play it reasonably well. So then you either a) buy a new system or b) put in a video card and not use the one on the proc.

Integrating the GPU with the CPU will be about driving down cost and power consumption, not something that is usually a high-priority for folks that want to run the latest greatest games and get all the shiniest graphics. So, I'd be very surprised if this is intended to hit that part of the market, more likely it's designed to address the same market segment that Intel hits with graphics embedded in the CPU's supporting chipset.

That said, having the CPU & GPU combined (from the point of view of register and memory access, etc.) might open up some interesting new possibilities for using the power of the GPU for certain non-graphics functions.

Back in the day at Intergraph we had a graphics processor that could be combined with a very expensive (and for the time powerful) dedicated floating point array processor. To demonstrate the power of that add-on somebody handcoded an implementation of the Mandelbrot Fractal algorithm on the add-on and it was blistering fast. I can imagine similar highly-parallelized algorithms doing very well on a GPU/CPU combo.
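For a sense of why that kind of algorithm maps so well onto a wide array of ALUs, here is a minimal, hypothetical sketch of a Mandelbrot kernel in CUDA-style C (the names, block sizes, and shading are illustrative assumptions; this is not the Intergraph code):

    #include <cuda_runtime.h>

    // Each GPU thread computes one pixel, completely independently of the others --
    // the same embarrassingly parallel structure as the array-processor demo.
    __global__ void mandelbrot(unsigned char *out, int w, int h, int max_iter) {
        int px = blockIdx.x * blockDim.x + threadIdx.x;
        int py = blockIdx.y * blockDim.y + threadIdx.y;
        if (px >= w || py >= h) return;

        // Map the pixel into the region [-2,1] x [-1.5,1.5] of the complex plane.
        float cx = -2.0f + 3.0f * px / w;
        float cy = -1.5f + 3.0f * py / h;

        float x = 0.0f, y = 0.0f;
        int i = 0;
        while (x * x + y * y < 4.0f && i < max_iter) {
            float xt = x * x - y * y + cx;
            y = 2.0f * x * y + cy;
            x = xt;
            ++i;
        }
        out[py * w + px] = (unsigned char)((255 * i) / max_iter); // simple escape-time shade
    }

    // Host side: d_out must already be a device allocation of w*h bytes.
    void render(unsigned char *d_out, int w, int h) {
        dim3 block(16, 16);
        dim3 grid((w + block.x - 1) / block.x, (h + block.y - 1) / block.y);
        mandelbrot<<<grid, block>>>(d_out, w, h, 256);
        cudaDeviceSynchronize();
    }

Since every pixel is independent, the work scales almost linearly with the number of shader/stream units available, which is exactly the case where a CPU/GPU combo earns its keep.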

Re:this will fail (1)

adsl (595429) | more than 7 years ago | (#16877148)

Graphics chipsets are often more complicated than the CPU. I agree about the lowering of cost and reducing power consumption. Nvidia themselves have a similar project with a hoped-for release in late 2007, and VIA are powering the "Walmart" laptop with remarkably low power consumption and reduced cost. Thus the market for a combined CPU/graphics chipset is likely to be in low-end desktops AND, most importantly, in LAPTOPS. It will be interesting to see what Intel have up their sleeves also.

Re:this will fail (1)

zenslug (542549) | more than 7 years ago | (#16879042)

Do you really want to have to replace an entire system when you upgrade?

I don't upgrade so this would be nice, yes.

You are a gamer and you are special. Most of the world isn't special and would like a cheaper machine to browse the web. You will buy a different system with an upgradable GPU. Or this new setup will be almost exactly like integrated graphics today which allows you to add your own killer card as a replacement.

At the risk of being modded reundant (4, Insightful)

kimvette (919543) | more than 7 years ago | (#16876326)

I'm going to ask:

That's great and all, but does it run Linux?

I'm not kidding, either. Is AMD going to force ATI to open up its specs and its drivers so that we can FINALLY get stable and FULLY functional drivers for Linux, or are they still going to be partially-implemented limited-function binary blobs where support for older-yet-still-in-distribution-channels products will be phased out in order to "encourage" (read: force) customers to upgrade to new hardware, discarding still-current computers?

That is why I do not buy ATI products any more. They provide ZERO VIVO support in Linux. They phase out chip support in drivers even while the chips are still actively distributed. They do not maintain compatibility of older drivers to ensure they can be linked against the latest kernels.

This is why I went Core 2 Duo for my new system and do not run AMD: their merger with ATI. My fear is that if ATI rubs off on AMD, then support for AMD processors and chipsets will only get worse, not better.

Re:At the risk of being modded reundant (0, Flamebait)

EvilRyry (1025309) | more than 7 years ago | (#16876936)

Now that AMD and ATI are one company, I've extended my purchase boycott to include AMD, like you have. Intel provides me with an open source solution; why don't you, AMD?

Re:At the risk of being modded reundant (0)

Anonymous Coward | more than 7 years ago | (#16877850)

Ass kissers who apply double standards at Slashdot are in no short coming.

Re:At the risk of being modded reundant (1)

gardyloo (512791) | more than 7 years ago | (#16878960)

Ass kissers who apply double standards at Slashdot are in no short coming.

      I think I have figured out what you're saying, but it's much more fun to pretend that you're writing really bad Japanglish ads for hardcore pr0n.
 

Re:At the risk of being modded reundant (2, Insightful)

Chris Burke (6130) | more than 7 years ago | (#16877306)

This is why I went Core 2 Duo for my new system and do not run AMD - their merger with ATI. My fear is that if ATI rubs off on AMD then support for AMD processors and chipsets will only get worse, not better.

It is pretty typical in a buyout like this for the larger company's culture to dominate the smaller one. While in many cases this is a bad thing as the smaller company has the more open culture, in this case it is the larger company, AMD, that is more open.

It is ridiculous to think that support for AMD chipsets and processors will get worse since AMD has utterly depended on Linux to jump start the 64-bit x86 market. Oh, and a processor is nothing if it doesn't expose its interfaces, because they count on programmers to use those new instructions or modes or whatever to optimize their programs and make the processor look good. There is no DirectX or OpenGL equivalent that processors hide behind.

Re:At the risk of being modded reundant (2, Informative)

asuffield (111848) | more than 7 years ago | (#16878178)

That is why I do not buy ATI products any more.


So you use SiS chipsets then? They're the only manufacturer I can think of who still provide specs for their video chips (or do Intel still do that too?).

Unfortunately we're currently stuck with a range of equally sucky choices. I tend to buy (older) ATI cards because at least they get reverse-engineered drivers, eventually.

GPU or GPGPU? (2, Interesting)

tbcpp (797625) | more than 7 years ago | (#16876426)

From what I understand (and I could be wrong), AMD/ATI is aiming more at the GPGPU market. So we're talking more of a souped-up AltiVec processor in the CPU instead of a full-blown GPU. It sounds like they're simply adding a 64-pipeline vector processor to the existing x86-64 core. I'm not sure if this is a bad idea.

I remember programming assembly graphics code in BASIC back in the day. You would set the VGA card to mode 13h and then write to... what was it now... 0xa00? That's probably wrong. Anyway, whatever you wrote to that portion of memory would go to the screen.

If you had a huge SIMD co-processor, would it not be possible to rival modern GPUs with this model? Not to mention being able to do some cool stuff like having a video input card dump data directly into that portion of memory. So you could have video in with the CPU at complete idle.

Re:GPU or GPGPU? (1, Interesting)

Anonymous Coward | more than 7 years ago | (#16879430)

Exactly. This could be the move that makes physics acceleration ubiquitous. Sure, el cheapo systems can be built that utilize the on-package or on-die GPU capabilities for low-end graphics. But a higher-end (and gamer-relevant) use would be to have the integrated "GPU" doing physics calculations while an add-in board continues handling the traditional GPU tasks. This would be *far* superior to the current add-in board approach, because the tight CPU-physics integration would allow for some damned sweet gameplay enhancements (as opposed to the current make-it-prettier effects).

Not what you think (2, Interesting)

Anonymous Coward | more than 7 years ago | (#16876468)

Many people are reading this as an integrated GPU and CPU. I don't see it that way. I see it as adding a generic vector processor to the CPU, similar to the Cell processor and similar to future plans Intel has described. Vector processors are similar to SSE, 3DNow!, etc. They are SIMD processors that can execute very regular mathematical computations (video and audio encoding/decoding) VERY quickly, but aren't much good for generic algorithms.

A step between on-board video, and full graphics (2, Interesting)

Vellmont (569020) | more than 7 years ago | (#16876544)

The people claiming this will fail all seem to miss the market this is aimed at. It's obviously not intended to compete with the high-end, or even middle-of-the-road, graphics processor. Those boards require gobs of VERY fast video memory. My guess is this thing is aimed at a space between on-board video (which is really just a 2D chip) and the full 3D graphics card. Anyone buying this has no intention of buying a super-duper graphics card anyway.

With Vista coming out soon, PC makers are going to want a low-cost 3D-accelerated solution to be able to run some (or maybe all) of the eye candy that comes with Vista.

integrated memory? (0)

Anonymous Coward | more than 7 years ago | (#16876558)

I hope they integrate enough memory. The GPU of 2012 will need quick, guaranteed (not shared) access to about 4GB of texture memory to render the environments and character avatars for the games of the future (i.e., about 8 times the amount the Xbox 360 shares for graphics and computing).

I'll buy it if they provide free drivers (1, Insightful)

Anonymous Coward | more than 7 years ago | (#16876690)

I'll buy this if they provide free drivers; I won't buy it if they don't. Vista's piggish graphics will surely push all GPUs to new performance levels. I don't care about on-chip integration nearly as much as I care about avoiding the need to use binary blobs in my free OS.

For the sake of competition... (2, Funny)

Wilson_6500 (896824) | more than 7 years ago | (#16876768)

Let's hope this fusion doesn't bomb.

How many blades? (1, Funny)

fahrbot-bot (874524) | more than 7 years ago | (#16877040)

AMD will be making razors and shave gel? Sweet! How many blades, 4, 5 or scalable on demand?

Fusion or Design By Committee (aka "Convergence") (1)

DingerX (847589) | more than 7 years ago | (#16877694)

Great idea on paper. It boils down to personnel though. You're talking about fusing development teams with experience. Will they work together well? Or will the elevator assets go work for someone else, leaving the understudies to bicker about with an ignoramus boss unable to figure out which engineers are clever and which are just suckups?

I'm not saying it won't work; I'm saying that fusing development teams with expertise is a lot different than fusing different components onto the same board. And that, in turn, is a lot different than a multi-option fuze.

Re:Fusion or Design By Committee (aka "Convergence (0)

Anonymous Coward | more than 7 years ago | (#16879782)

Ironically the big merger-of-companies graphics project between GigaPixel and 3Dfx was called Fusion. Would have been a kick-ass chip too, had 3DFX had the guts to cancel Rampage and throw everything behind the new project. Ah well, most of us are nvidians now...

AMD invents... (0)

Anonymous Coward | more than 7 years ago | (#16878538)

...the CELL processor. Whoulda thunk it?

Dual and Quad socket! (1)

gfody (514448) | more than 7 years ago | (#16878748)

I know the trend is single-socket multi-core, but with the GPU embedded, dual and quad sockets instead of SLI!

My fear is... (1)

Artana Niveus Corvum (460604) | more than 7 years ago | (#16879306)

This will lead to a whole new world of disgustingly bad graphics chips eating system RAM and claiming to have "256MB" or whatever but really having little or none and just munching on (slow) system memory as needed... and that never works as well as it should.

Matrix Operations? (1)

NitsujTPU (19263) | more than 7 years ago | (#16879744)

How about getting those lightning-fast matrix operations onto the CPU? I always hear about people building application-specific tweaks by reprogramming their algorithm into a shader language. I imagine that there is far more fertile soil for innovation here than some lame combined CPU/GPU.

VIDEO cards on the HT bus (1)

Joe The Dragon (967727) | more than 7 years ago | (#16880120)

They are also looking at this as well. Maybe even some kind of super CrossFire/SLI, with built-in CPU video graphics processing plus a video card in the slot with its own graphics processing and RAM; you may also be able to have two video cards linked as well.

AMD 4x4 systems may be able to have 4 video cards + 2 CPUs with graphics processing in them.