ATI and AMD Seek Approval for Merger?
Krugerlive writes "The rumor of an ATI/AMD merger/buyout has been out now for some time. However, this morning an Inquirer article said that a merger deal has been struck and that the two companies will seek shareholder approval on Monday of next week. In the market, AMD is down as a result of lackluster earnings announced last evening, and ATI is up about 4% on unusually high volume." This is nothing but a rumour at the moment, a point that C|Net makes in examining the issue. From the article: "AMD has always boasted that it only wants to make processors, leaving networking and chipsets to others. AMD does produce some chipsets, but mostly just to get the market started. Neutrality has helped the company garner strong allies."
Does that mean.... (Score:4, Interesting)
were holding back progress in 3D graphics performance).
Depends. (Score:5, Insightful)
You need to bear in mind that the GPU is the critical component in most systems, but makes almost no money for the vendor and has a relatively low volume. There is precisely no reason whatsoever for AMD to want to merge with ATi or to buy them up. That would be expensive and earn them little. In fact, given how much they've made from their component-neutrality, sacrificing that might mean they'd actually lose money overall.
On the other hand, CPUs are high volume, high profit, and AMD is gaining market share. That makes AMD an ideal target for a buy-out, particularly as ATi can't be doing that well in the GPU market. Buying AMD would be like buying a money-printing machine, as far as ATi is concerned. Better still, AMD is a key player in bus specifications such as HyperTransport, which means that if ATi owned AMD, ATi could heavily influence the buses to suit graphics in general and their chips in particular.
(Mergers are never equal, as you don't have two CEOs, two CFOs, etc. One of them will be ultimately in charge of the other.)
If the rumour is correct, then don't assume AMD is the one instigating things - they have the most to lose and the least to gain - and don't assume either of them will be around when the mergers and buyouts finish.
Re:Depends. (Score:3, Interesting)
What about (I hate that I am going to type this word) synergies? Maybe AMD thinks that they have enough in common with ATI that they could reduce redundancies after the merger (i.e., fire people and possibly sell off a fab plant) and make both companies more profitable. Just a thought.
Re:Depends. (Score:3, Informative)
Re:Depends. (Score:2)
What makes you think the CEO gets to choose the board? You clearly don't know shit about corporations.
The shareholders elect the board. The board chooses the CEO.
Re:Depends. (Score:3, Interesting)
Not at the moment. But with a little more miniaturisation and time, both CPU and GPU will be merged onto the one chip package. This is a situation where the combined company would have more than a small edge over its rivals. By avoiding the (relatively) long transmission wires needed to communicate across the motherboard bus, speeds will increase beyond anything currently possible.
Re:Depends. (Score:5, Insightful)
Once, CPUs didn't do vector computations. They were either converted to scalar operations, or performed on a dedicated (expensive) coprocessor. Now, lots of CPUs have vector units.
Once, CPUs didn't do stream processing. Now, a few CPUs (mainly in the embedded space) have on-die stream processors.
A GPU is not much more than an n-way superscalar streaming vector processor. I wouldn't be surprised if AMD wants to create almost-general coprocessors with similar characteristics that connect to the same HT bus as the CPU; plug them directly into a CPU slot and perform all of the graphics operations there. Relegate the graphics hardware to, once more, being little more than a frame buffer. This would be popular in HPC circles, since it would be a general-purpose streaming vector processor with an OpenGL / DirectX implementation running on it, rather than a graphics processor that you could shoehorn general-purpose tasks onto. The next step would be to put the core on the same die as the CPU cores.
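For readers who haven't run into vector units: the scalar-vs-vector difference the parent describes looks roughly like this in C with SSE intrinsics. This is a minimal sketch, assuming the array length is a multiple of four and the pointers are 16-byte aligned:

    #include <xmmintrin.h>  /* SSE intrinsics */

    /* Scalar version: one float addition per iteration. */
    void add_scalar(const float *a, const float *b, float *out, int n)
    {
        for (int i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }

    /* Vector version: four float additions per iteration on the CPU's
     * SIMD unit. Assumes n is a multiple of 4 and 16-byte alignment. */
    void add_vector(const float *a, const float *b, float *out, int n)
    {
        for (int i = 0; i < n; i += 4) {
            __m128 va = _mm_load_ps(a + i);
            __m128 vb = _mm_load_ps(b + i);
            _mm_store_ps(out + i, _mm_add_ps(va, vb));
        }
    }

A GPU does essentially the vector version, but spread across many such units at once, over much wider streams of data.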
The CPU industry knows that they can keep doubling the number of transistors on a die every 18 months for 10-15 years. They think they can do it for even longer than this. They also know that in a much smaller amount of time, they are going to run out of sensible things to do with those transistors. Is a 128-core x86 CPU really useful? Not to many people. There are still problems that could use that much processing power, but most of them benefit more from specialised silicon.
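A quick back-of-the-envelope with Amdahl's law shows why. If even 10% of a workload stays serial (the 10% is just an assumed figure for illustration), piling on general-purpose cores hits a hard ceiling:

    #include <stdio.h>

    /* Amdahl's law: with fraction p of a workload parallelisable,
     * n cores give a speedup of 1 / ((1 - p) + p / n). */
    double amdahl(double p, int n)
    {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main(void)
    {
        int cores[] = { 2, 8, 32, 128 };
        for (int i = 0; i < 4; i++)
            printf("90%% parallel, %3d cores: %.1fx\n",
                   cores[i], amdahl(0.90, cores[i]));
        return 0;
    }

The 128-core case comes out at only about 9.3x, barely ahead of 32 cores; specialised silicon spends those transistors far more effectively.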
Within the next decade, I think we will start to see a shift towards heterogeneous cores. The Cell is the first step along this path.
Re:Depends. (Score:2)
Re:Depends. (Score:2)
I beg to differ. GPUs have higher volumes than CPUs, assuming you count GPUs embedded in chipsets, along with the discrete GPUs. Just think about how often people upgrade their CPUs as opposed to their GPUs.
As for profit margins, you have a point there, although for the wrong reasons, I
Re:Depends. (Score:2)
Every computer sold has at least 1 CPU, but may not have a GPU at all (what use does a server have for a GPU if it's sitting in a rack and never has a screen attached?), or it might have one integrated into the chipset.
The most a single system will have is 2 GPUs, whereas high-end machines could have many CPUs, and are unlikely to need a GPU at all.
Re:Depends. (Score:2)
Re:Depends. (Score:2)
Re:Depends. (Score:2)
Where it really matters these days is in the laptop space. Laptop sales are set to pass desktops in the next year or two. They did for Apple last year, and they're at about 50% of desktop sales for the rest of the industry. While desktop GPU sales grew by about 25%, laptop GPU sales grew by over 30%; particularly noteworthy since most laptops d
Re:Depends. (Score:2)
Well, just in case... (Score:2)
Re:Well, just in case... (Score:2)
Re:Well, just in case... (Score:2)
Re:Depends. (Score:2)
Re:Depends. (Score:2)
For a while, I tried maintaining a timeline of all the different 3D chip vendors.
Re:Depends. (Score:2)
The original corporate entity S3 ended up changing its name to SonicBlue, filing for bankruptcy, and selling most of its assets to Denon and Best Data. The graphics division, however, was acquired by VIA and is now operated as S3 Graphics, serving as the basis for VIA's IGP solutions. So they're still around, sort of.
Re:Does that mean.... (Score:2, Insightful)
Re:Does that mean.... (Score:2)
Re:Does that mean.... (Score:2, Informative)
GPU = Graphics Processing Unit.
AFAIK, they've been in the processor business since they launched their first graphics card.
Re:Does that mean.... (Score:2)
GPU = Graphics Processing Unit.
AFAIK, they've been in the processor business since they launched their first graphics card.
I could be wrong but I thought their first card to have a GPU was the Geforce 3.
Re:Does that mean.... (Score:2)
I was close. It seems the first one was the GeForce 256, which did not do well because of cost and lack of performance in non-gaming applications. It seems like the idea took off once they introduced it again with the GeForce 2. If you want to read more about it, some information can be found here. [wikipedia.org]
Re:Does that mean.... (Score:2, Informative)
Hardly. CPU and GPU design are very different tasks at so many levels.
At the highest level, the architectures are radically different - a GPU is basically a bunch of the instantiations of a minimally-programmable, cus
Re:Does that mean.... (Score:2, Informative)
Interesting... Incidentally, I happen to work for a GPU company (one mentioned in this article, even), and we have a large number of engineers doing full-custom circuit-design work. They may not be working on custom adders (we don't need them), which is perhaps the point you were trying to make, but they are often doing some very complicated circuits nonetheless...
Re:Does that mean.... (Score:2, Informative)
(disclaimer: I work for one of the two big GPU companies)
Man, your information is very outdated. I would estimate that at least 25% of a current GPU is laid out by hand. CPUs definitely have more custom parts, but not more than 50-60% of the chip. The rest is synthesized u
Why ATI... Go NVidia (Score:4, Insightful)
Re:Why ATI... Go NVidia (Score:5, Insightful)
If you meant ATi's own drivers, yeah, they suck. But really, if ATi just made docs available, the much better X.org drivers would be able to support far more of their hardware.
If the rumour is true, I hope AMD cares about open drivers.
Re:Why ATI... Go NVidia (Score:2)
Well, corporations are schizophrenic, but: http://www.amd-jobs.de/de/einstieg/freiestellen_o
Re:Why ATI... Go NVidia (Score:2)
Those are mostly kernel-related jobs - for server chips, having the OS kernel take full advantage of processors is a big deal, and Linux is a pretty big OS in the server arena.
When it comes to ATI/nvidia, however, Linux users make up a much smaller part of their target market (gamers).
Re:Why ATI... Go NVidia (Score:2)
-matthew
Re:Why ATI... Go NVidia (Score:4, Informative)
Re:Why ATI... Go NVidia (Score:5, Funny)
Funny way to define recent. You don't happen to be a Debian developer, do you?
I just threw away an R300-series card (ATi 9800 XT) for an nVidia SLI setup. I bought the ATi back in mid '05; it had sat on store shelves for half a year before I picked it up for the "free" Half-Life 2 and the then-"stable" accelerated proprietary drivers.
I game under Linux. But with an ATi card, nothing worked well or for very long. Wine, the commercial Cedega, even native games would kill the driver. I had to install nVidia dependencies for my team's 3D software, software which in the end wouldn't work without the nVidia drivers.
If you meant ATi's own drivers, yeah, they suck. But really, if ATi just made docs available, the much better X.org drivers would be able to support far more of their hardware.
I don't see that improving quickly unless somebody with a big itch to scratch builds a community like the one around nVidia. A lot of people doing games in Linux only develop and test with nVidia hardware. Not everyone can afford two $600-800 rigs with recent cards.
Once I switched to nVidia 3D, a ton of games that only worked on Windows now install and play as well as, if not better than, native on Windows. Older 3D games like Diablo 2, Warcraft 3 and Startopia fly at high frames per second (>60-100). Current-generation games like Tron 2.0, Guild Wars and Half-Life 2 get respectable fps (~30) where the ATi drivers would struggle to get 2-3 fps and often crash if anything changed the drawing state.
I hope AMD cares about open drivers.
This assumes that AMD comes out on top, or that the ATi proprietary mindset doesn't infect AMD. On one side you have two companies that are basic chip fabbers, spewing out GPUs, CPUs and chipset engineering specs as fast and cheaply as possible. On the other you have ATi, buried deep in a race with nVidia, and AMD, who won the last round of CPU wars with x86_64. As has been mentioned by others, mergers are little more than one company eating another. I for one would not be surprised if, after any such ATi/AMD merger, the next (last?) nVidia motherboard chipsets for AMD are at least 6 months to a year behind the next ATi releases.
At best, it would be interesting to see a dual-core CPU with one core a GPU and a metric ton of cache. It'd be almost like the old 486SX vs. 486DX days.
completely agree (Score:4, Insightful)
Re:completely agree (Score:3, Interesting)
AMD, after buying out ATI, opens up the architecture or supports Linux as a first-tier platform.
I bet if ATI were putting out first-rate drivers, it might influence quite a few purchases in that direction.
Re:completely agree (Score:5, Insightful)
Sigh. This detrimentally short-sighted acceptance of binary-only drivers by users like you is precisely why there are no good drivers for recent ATi hardware, or for most other recent graphics hardware besides Intel's. And until users like yourself start demanding that vendors provide documentation, not binary blobs, graphics support will continue to suck.
Binary drivers kill kittens (thanks airlied for that one). They don't help if you run other free Unixen, they don't help if you use a non-mainstream platform (e.g. PPC, AMD64 up until recently, it doesn't help the Radeon in the Alpha I have here).
Demand DOCUMENTATION - even if it's gibberish to you personally, it will benefit you far more than binary blobs eventually...
Re:completely agree (Score:2)
YOU demand documentation for your other free *nixen and your non-mainstream platforms.
*I* will use the binary drivers Nvidia provides because they fulfill the most important requirement AFAIC: They make my stuff WORK.
Re:completely agree (Score:2)
Yeah, AIGLX and Xegl work real well with my Geforce FX.
NVidia does provide somewhat decent drivers, but they just fulfill the necessary requirements for being useful. Proper documentation would go all the way to "sufficient".
Re:completely agree (Score:2)
Right - 'cause they don't make any difference to your chosen platform, do they? Except that 99% of graphics work on Unix platforms is done in userspace, in Mesa and Xorg code, so work done by FreeBSD, Sun, etc. engineers also tends to apply to your Linux machines (and vice versa).
You're simply an ignoramus: you're using a system (only parts of which are either Linux or Linux specific) which tens of thousands of people have don
Re: (Score:2)
So let me get this straight (Score:2, Insightful)
I think that's the consensus here; certainly the Linux drivers are apparently awful.
My AMD64 desktop machine has an NVidia graphics card which works much better than the ATI rubbish built into the motherboard. But I'm not using that machine to write this. In fact, other than for occasional gaming, that machine rarely gets switched on.
I tend to use my laptop. Which has a Centrino chipset.
You know - that one that Intel brought out for laptops? The one
Re:Why ATI... Go NVidia (Score:2)
Re:Why ATI... Go NVidia (Score:3, Insightful)
Re:Why ATI... Go NVidia (Score:2)
I think, then, what you're looking for could come from this merger. AMD being the less expensive of the major CPU producers is a first choice for the free Unix group, and they know it. Maybe joining with ATI will cause the joined company to beco
Re:Why ATI... Go NVidia (Score:2)
First of all, this arrangement benefits NVDA as much as AMD. How? It eliminates their main compet
AMD + ATi vs. Intel + nVidia (Score:5, Insightful)
And if AMD and ATi merge... it sort of seems like a punch in the face to nVidia. Leaving them wanting to talk to Intel. Leading to... what?
For a long time there have been two beasts in the CPU market and two beauties in the GPU market: AMD and Intel in CPUs, and ATi and nVidia in GPUs. If they marry respectively, the offspring might have the good qualities of neither and the bad qualities of both. I think the consumer would more than likely lose out overall.
So, I really kind of hope this is just a rumor.
TLF
Re:AMD + ATi vs. Intel + nVidia (Score:2)
The only concern nVidia should have is if the AMD Process line started c
Re:AMD + ATi vs. Intel + nVidia (Score:2)
Re:AMD + ATi vs. Intel + nVidia (Score:2)
Re:AMD + ATi vs. Intel + nVidia (Score:2)
People say that, but I have to wonder what Intel has to gain. I mean, they're already the biggest player [reghardware.co.uk] in the graphics industry when it comes to market-share, so they clearly have the know-how to build graphics chips. Sure, they don't currently go after the enthusiast market, but there might be reasons for that:
1. Lower margins - Nvidia's gross margin [yahoo.com] is under 40%, and Intel's [yahoo.com] is cl
New Logo (Score:5, Funny)
T
I
Re:New Logo (Score:5, Funny)
Re:New Logo (Score:5, Funny)
And for the inevitable legal troubles down the road, ADMIT.
Re:New Logo (Score:2)
Talk about a dumb move (Score:4, Interesting)
The market's view of this is visible from the fact that ATI is up and AMD is waaaay down.
Re:Talk about a dumb move (Score:2)
Re:Talk about a dumb move (Score:3, Interesting)
The market's view of this is visible from the fact that ATI is up and AMD is waaaay down.
Wrong. The company doing the takeover (AMD) almost always declines -- rather noticeably, too -- and the company being taken over almost always increases, usually because the takeover bid is at a higher stock price.
AMD is just reporting bad earnings news in a volatile, short-heavy, news-sensitive market. With companies reporting good earnings still trading downward, it's no surprise that reporting bad earnings will
Conflict - nForce? (Score:3, Insightful)
I also worry that chipsets for AMD based motherboards will not work so well with my nVidia video card. Not an ATI fan at all.
I'm going to be watching these guys very closely. This would sway me away from AMD.
Re:Conflict - nForce? (Score:2)
I'm a big AMD fan, but after dealing with nForce platform drivers, I won't be at all upset to lose the nForce line of chipsets.
Re:Conflict - nForce? (Score:2)
Forget even Linux; nForce is seriously crippled under Windows XP also.
It was very painful trying to get a new system built with nForce4, a RAID 1 SATA array, and a SATA optical drive (the system is AMD also, but that wasn't a problem).
Finally I traced the problem to an incompatibility in the nForce chipset. It can EITHER support a SATA RAID array, or it can support a SATA optical drive. Doing both unfortunately causes the system to bluescreen.
(and yes, I know SATA on the optical doesn't buy you much)
Re:Conflict - nForce? (Score:2)
Poor Choice For AMD (Score:3, Insightful)
The Xbox 360 is the first console ever to have PCs outperform it before the console has hit store shelves. In the past, consoles have had at least a year or so before PCs could touch them.
What the hell is AMD thinking?
AMD needs to come up with its own bogus SPEC score generating compiler to grow in the market, not a fucked up GPU maker.
Population+ For A Particular Group of Engineers (Score:2)
It's definitely going to be one of those positive situations where software is tuned to perform particularly well when the two companies' parts are used in combination.
GPU in socket? (Score:3, Interesting)
That'd be hella cool.
Tom
Re:GPU in socket? (Score:2)
Though yeah I guess the memory would still be slower.
Tom
Re:GPU in socket? (Score:2)
Re:GPU in socket? (Score:2)
I hope the merger is rejected (Score:2)
Either this would vastly improve ATI or it could drag AMD down into mediocrity. If the merger does happen, I truly hope that it is the former (ATI cleaning up its act across the board), but all too often with these sorts of mergers it's the latter that happens. ATI has a lot of great technology with fast GPUs, but when the drivers suck, customer service and support are nonexistent, and they absolutely re
Re:I hope the merger is rejected (Score:2)
although the end result is a crappy customer experience.
Minor correction (Score:2)
Please, no. (Score:2)
Damn. This worries me
Dilbert, anyone? (Score:4, Interesting)
Another possible hint that this is more than rumor (Score:2)
Not that far fetched. (Score:5, Interesting)
IMHO, AMD would be well advised to start shipping its own chipsets, just like Intel. It just makes things easier for their most important customers, the big OEMs: they have one less vendor to worry about. There's less testing required, since presumably AMD would test the CPU and chipset together. And it's less risky for both customers and AMD, since AMD has a very strong incentive to make sure that chipsets will be available for their platform on time, whereas third parties have different priorities.
Then there's the whole GPU angle. Why shouldn't GPUs be produced in company owned, i.e. tweaked for performance, fabs? They're every bit as complex and big and expensive as CPUs. Bringing that in house should give a nice bump to performance. And what is a GPU going to be in five years anyway? On the AMD platform, all the tools are in place to allow the GPU to work much more like a cheap DSP/co-processor than we've ever seen before. If the Opteron wasn't an Itanium killer, maybe a couple Opterons and a couple "GPU-DSPs" will do the trick. Even for regular workstations, imagine just plugging a GPU into a free socket on the MB? That would fit very nicely in the middle of the graphics market... way better than integrated, but way cheaper than an add-on card.
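To make the "cheap DSP/co-processor" idea concrete, here is a sketch of the programming model such a socketed GPU might expose: small kernels applied independently to every element of a data stream. All the names here are hypothetical, and the loop is simulated serially on the host; real hardware would spread it across many parallel execution units:

    #include <stdio.h>
    #include <stddef.h>

    /* A stream kernel: one output element per input element. */
    typedef float (*stream_kernel)(float);

    /* Apply the kernel across the whole stream -- the access
     * pattern GPU-style hardware is built for. */
    void stream_map(stream_kernel k, const float *in, float *out, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            out[i] = k(in[i]);
    }

    /* Example kernel: a DSP-style gain stage. */
    float gain(float sample) { return sample * 0.5f; }

    int main(void)
    {
        float in[4] = { 1.0f, 2.0f, 3.0f, 4.0f }, out[4];
        stream_map(gain, in, out, 4);
        printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }

Because every element is independent, the same kernel can run on one core or a thousand without changing the program, which is exactly what makes this model attractive for a socketed coprocessor.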
Lastly, AMD needs a way to use last-generation fab equipment a little longer. Making chipsets would let them use the fab equipment for an extra few years. They lost that cost efficiency when they spun off the flash business. Fab gear is expensive, so it's kind of a waste for them to be yanking it out every time the minimum for a marketable CPU moves higher.
Five years ago AMD needed partners and an ecosystem to support their own platform and survive as a company. The next five years are about turning the CPU market into a duopoly.
I have a few shares of AMD. And I'd like to see this deal happen, but only at a decent price (from AMD's point of view). Hmm... this post turned rather long...
Re:Not that far fetched. (Score:2)
Contrast with ATI and nVidia chipsets (now that VIA, SiS and ULi are pretty much out of the market) - drivers are always binary blobs. True, you can generally run Linux on an nVidia chipset with open source drivers, even up to the reverse engin
Intel D101GGC (Score:2)
Check out the D101GGC: http://www.intel.com/products/motherboard/d101ggc
I find it odd for Intel to use a third-party's chipset in their mobos, but it would be double-weird if that third-party was AMD.
The real reason (Score:2)
Let's face it. CPUs are commodities. You buy price/performance.
Recently, Intel has been using the platform to differentiate itself.
Centrino is one example in the notebook world.
You can see other examples with "advanced I/O" in the newer server platforms.
Intel dictates the platform and can define it to suit their needs.
AMD has no platform strategy. It's at the mercy of various 3rd party chipset makers.
This is why this makes strategic
100% Going To Happen (Score:4, Interesting)
This is just FUD (Score:2)
Good news (Score:3, Interesting)
The AGP slot has been getting faster and faster. The GPU has been getting bigger and has been doing more. There is an obvious need for a physics core and multicore CPUs. Clearly this is leading to adding the GPU to the CPU on the same chip, or at least very close to it, like the L2 cache on the Slot 1 Intel CPUs. After a certain bus speed, the AGP or PCIe slot will become less feasible, and it will be important to put the GPU as close to the CPU as possible.
Now think of the PS3. It's a revolution. It's not here yet, and its release is not being managed very well, but the ball on multicore CPUs (not just dual-core) has gotten rolling. The UltraSPARC T1 has shown the world that multicores can be real and actually work. Not to mention the fact that most computers bought today have at least a mediocre GPU somewhere in them. This means AMD needs a GPU to add to its multicore CPUs as another core. They've already added the northbridge to it, haven't they? And that has saved us money, hasn't it?
Intel has one-upped AMD recently with its Core chips, and AMD sounds like it's really gonna one-up Intel with chips that should take the market away.
But why should ATI do this? (Score:2)
But why should ATI be interested in a merger? They would probably lose all their Intel chipset business and a lot of the enthusiast graphics-card business on Intel platforms.
Re:approval for what? (Score:2)
I would find it a bit odd if it happens.
Re:approval for what? (Score:2)
The way the world works... AMD should take notice (Score:2)
It's been very clear for a long time that ATI are rubbish outside the fanboy wars and that you get the best bang for your buck using AMD + nForce + nVidia GPUs. That is the combination I've bought for the last few years now, and I've never regretted any of those purchases. If that were to change, I guess my grassroots support for AMD may have to be realigned, although very painfully, to
Re:Fuck no. (Score:2)
For me, working in Windows is painful, and it's only made usable with a dozen tools like Cygwin, Adobe, etc...
In Linux I can "just work". Booting Windows to play games isn't an option, as multiple people use this box. Whereas in Linux I can totally dominate one of my four cores with a video game, if I boot Windows they get 0 of the 4 cores to use.
Tom
How about this one? (Score:2)
The drivers are horrible.
ATI lies to their customers.
The GPU may be solid, but that's only half the battle.
Re:Fuck no. (Score:2)
Re:Fuck no. (Score:2)
Even an old GF4 Ti works with the latest drivers, on all 3 operating systems, with full 3-D acceleration. When a new card comes out, within a couple of months, it has support for all OSes. When new Linux kernel config options bre
Re:Fuck no. (Score:2)
I've seen Windows users literally throw away a working 9600XT just because of the WinME-quality drivers. The card's on my shelf right now, in fact. (my MX400 is fast enough)
Re:Fuck no. (Score:2)
Re:Linux + ATI WORKS! (Score:2)
As you see, you didn't need to go further; we already know you use the X.Org drivers and not fglrx. Yes, those drivers are nice. Not fully featured, but open and nice. I prefer open too.
Now try using an X1800 and tell us how you like that for a difference (no magic without the crappiest driver ever, named fglrx). Then pop in an nVidia card and their drivers and tell us how that feels. Believe me, using nVidia drivers will suddenly
Re:Linux + ATI WORKS! (Score:2)
I also wonder if the ATI sales boost is partly due to Apple's laptop sales we heard about yesterday.
Re:Linux + ATI WORKS! (Score:2)
Re:ATI + AMD = ? (Score:2)
Intel + nVidia = EVIL DNA IN IT / DIVE IN AT NIL
AMD + nVidia = DNA VIA DIM
ATI + Intel = NIL ATE IT
Re:Integrated graphics (Score:2)
Buyers of integrated graphics aren't the informed type. In the current market, nobody of a technical inclination would buy integrated graphics based on principle (even if it was halfway decent), and the uninformed people wouldn't care if it was any good.
Re:Integrated graphics (Score:2)
Re:Integrated graphics (Score:2)
Here's a thought for the ever-whiney video game industry: I'd be much more inclined to go plop down $50 on a new game if I could run it at high resolution with all the detail on with my *AVERAGE* PC. That would impress me a hell of a lot more than awesome new graphics that only people with brand new hardware
Re:Integrated graphics (Score:2)
I had a GeForce4 MX 440 that was BETTER than even the highest-spec integrated Intel graphics chipset you can get today.
If Intel wanted to, they could easily make (or license from someone else) a better chipset for even the lowest-end systems, with hardware T&L and other 21st-century graphics card features, all without affecting the functionality of the chipsets or motherboards or making them cost significantly more.