
ATI and AMD Seek Approval for Merger?

Zonk posted more than 7 years ago | from the curiouser dept.


Krugerlive writes "The rumor of an ATI/AMD merger/buyout has been out now for some time. However, this morning an Inquirer article said that a merger deal has been struck and that the two companies will seek shareholder approval on Monday of next week. In the market, AMD is down as a result of lackluster earnings announced last evening, and ATI is up about 4% on unusually high volume." This is nothing but a rumour at the moment, a point that C|Net makes in examining the issue. From the article: "AMD has always boasted that it only wants to make processors, leaving networking and chipsets to others. AMD does produce some chipsets, but mostly just to get the market started. Neutrality has helped the company garner strong allies."

229 comments

Does that mean.... (3, Interesting)

mikael (484) | more than 7 years ago | (#15760587)

NVidia would seek a partnership with Intel? (Although some news articles reported that they felt Intel was holding back progress in 3D graphics performance.)

Re:Does that mean.... (0)

Anonymous Coward | more than 7 years ago | (#15760598)

Why would Nvidia want to do that? They already make a great motherboard chipset.

Re:Does that mean.... (2, Insightful)

jhfry (829244) | more than 7 years ago | (#15760616)

I think NVidia needs to get into the processor market themselves. Maybe not for general computing, but I bet their designers have some great ideas for a processor that would be at home in a console! With GPUs being so powerful these days, I can't imagine that they lack the expertise to do it.

Re:Does that mean.... (1)

snuf23 (182335) | more than 7 years ago | (#15760968)

Well they do have the GPU going into the Playstation 3 - but IBM seems to have a lock on at least this series of next gen consoles. All 3 CPUs (Xbox 360, PS3 and Wii) have been developed in conjunction with IBM.

Re:Does that mean.... (2, Informative)

PhoenixPath (895891) | more than 7 years ago | (#15761025)

I think NVidia needs to get into the processor market themselves.

GPU = Graphics Processing Unit.

AFAIK, they've been in the processor business since they launched their first graphics card. :p

Re:Does that mean.... (2, Informative)

Anonymous Coward | more than 7 years ago | (#15761034)

I think NVidia needs to get into the processor market themselves. Maybe not for general computing, but I bet their designers have some great ideas for a processor that would be at home in a console! With GPUs being so powerful these days, I can't imagine that they lack the expertise to do it.

Hardly. CPU and GPU design are very different tasks at so many levels.

At the highest level, the architectures are radically different - a GPU is basically a bunch of instantiations of a minimally-programmable, customized, low-speed DSP pipeline on a core, whereas CPUs are highly programmable, general purpose, extremely aggressive designs. Saying NVidia has the know-how is like saying that someone who designed a system of 100 rowboats to troll in a lake has the know-how to design racing speedboats.

At lower levels, GPUs are designed using synthesis and place&route, while CPUs tend to be semi-custom with some full-custom blocks. Circuit design is not something GPU companies do - they're given a library of gates from the fab company they use, and use those gates. In CPUs, lots of blocks are designed using fancier circuit techniques (for example, the Itanium's adder uses dynamic logic and complex pass-gate logic) and many things are laid out by hand (i.e. an engineer draws the physical shapes that will be used, after some processing, to make the masks).
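
A rough illustrative sketch of that contrast (hypothetical C, not either company's actual code): GPU-style work is a short, branch-free operation repeated over millions of independent pixels, while CPU-style work is dominated by data-dependent branches and pointer chasing, which is where the aggressive custom circuit work pays off.

<ecode>
#include <stddef.h>
#include <stdint.h>

/* GPU-style work: the same short, fixed sequence of operations applied
 * independently to every pixel. Easy to replicate across many simple,
 * low-speed pipelines running in parallel. */
void shade_pixels(uint32_t *dst, const uint32_t *src, size_t n, uint32_t tint)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = (src[i] & 0x00FFFFFFu) ^ tint;   /* no data-dependent branches */
}

/* CPU-style work: unpredictable branches and pointer chasing, which is why
 * CPUs spend transistors on caches, branch predictors and out-of-order
 * machinery rather than on more ALUs. */
struct node { int key; struct node *next; };

int count_matches(const struct node *list, int key)
{
    int hits = 0;
    for (const struct node *p = list; p != NULL; p = p->next)
        if (p->key == key)                        /* data-dependent branch */
            hits++;
    return hits;
}
</ecode>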

Re:Does that mean.... (2, Informative)

Anonymous Coward | more than 7 years ago | (#15761312)

"Circuit design is not something GPU companies do..."

interesting... incidentally, i happen to work for a gpu company (one mentioned in this article even...), and we have a large number of engineers doing full-custom circuit-design work. they may not be working on custom adders (we don't need them), which is perhaps the point you were trying to make, but they are often doing some very complicated circuits nonetheless...

Depends. (5, Insightful)

jd (1658) | more than 7 years ago | (#15760996)

If it's ATi trying to buy out AMD (which is perfectly possible), then they might not have enough money left to stop nVidia doing a hostile takeover of them both. That would eliminate one of nVidia's competitors -and- give them control over the CPU that looks set to take over.


You need to bear in mind that the GPU is the critical component in most systems, but makes almost no money for the vendor and has a relatively low volume. There is precisely no reason whatsoever for AMD to want to merge with ATi or to buy them up. That would be expensive and earn them little. In fact, given how much they've made from their component-neutrality, sacrificing that might mean they'd actually lose money overall.


On the other hand, CPUs are high volume, high profit, and AMD is gaining market-share. It is an ideal target for a buy-out, particularly as ATi can't be doing that well in the GPU market. Buying AMD would be like buying a money-printing-machine, as far as ATi were concerned. Better still, AMD is a key player in bus specifications such as HyperTransport, which means that if ATi owned AMD, ATi could heavily influence the busses to suit graphics in general and their chips in particular.


(Mergers are never equal, as you don't have two CEOs, two CFOs, etc. One of them will be ultimately in charge of the other.)


If the rumour is correct, then don't assume AMD is the one instigating things - they have the most to lose and the least to gain - and don't assume either of them will be around when the mergers and buyouts finish.

Re:Depends. (1)

PhoenixPath (895891) | more than 7 years ago | (#15761040)

...nVidia doing a hostile takeover of them both.

Gotta wonder if the SEC would allow the merger of the two top GPU providers on the market.

Can you name (off the top of your head) 3 other GPU manufacturers? (Not just card, but the actual GPU).

It's not an easy task...

Re:Depends. (1)

John Courtland (585609) | more than 7 years ago | (#15761073)

Matrox, Intel, Trident? It is pretty hard, I don't even know if Trident still exists in that capacity any more...

Re:Depends. (0)

Anonymous Coward | more than 7 years ago | (#15761220)

SIS, S3, and XGI?

It is kind of scary considering how few players there are in the field nowadays.

Well, just in case... (1)

jd (1658) | more than 7 years ago | (#15761386)

I'll add two more, so the list is at least 4: SiS and VIA both make GPUs. Does S3 still exist? Also, the MediaGX (rebadged to the Geode) has a GPU core built into the CPU, so that sort-of counts.

Re:Depends. (0)

Anonymous Coward | more than 7 years ago | (#15761267)

Except they aren't the top two GPU providers on the market. Intel is the No. 1 GPU maker by a large margin.

Also, how exactly could nVidia do a hostile takeover of AMD and ATI? Don't both companies dwarf nVidia?

Re:Depends. (0)

Anonymous Coward | more than 7 years ago | (#15761336)

You have absolutely no idea what you're talking about.

ATI is worth less than half of AMD; how do you buy a dollar with 50 cents?

sigh.

Why ATI... Go NVidia (4, Insightful)

jhfry (829244) | more than 7 years ago | (#15760593)

I always thought that AMD and Nvidia were the better combo. Besides, the ATI drivers suck for Linux, where a large percent of the enthusiast market's interests lie. Isn't AMD still more of an enthusiasts' processor until it can get into one of the top vendors' machines?

Re:Why ATI... Go NVidia (0)

Anonymous Coward | more than 7 years ago | (#15760630)

I don't know about that. The hardcore gamer crew are notoriously enthusiastic and windows-friendly.

Re:Why ATI... Go NVidia (4, Insightful)

Paul Jakma (2677) | more than 7 years ago | (#15760653)

Actually, the X.org drivers for ATis are probably the best out there. The problem is they lack support for recent ATi hardware (lacking good 3D support for vaguely recent cards, e.g. R300 and up, though it's getting there apparently, and completely lacking any support, 2D or 3D, for the most recent R500 hardware), as ATi haven't made documentation available in a *long* time.

If you meant ATi's own drivers, yeah, they suck. But really, if ATi just made docs available, the much better X.org drivers would be able to support far more of their hardware.

If the rumour is true, I hope AMD care about open drivers.

Re:Why ATI... Go NVidia (1)

Sweetshark (696449) | more than 7 years ago | (#15760702)

If the rumour is true, I hope AMD care about open drivers.

well, corporations are schizophrenic, but: http://www.amd-jobs.de/de/einstieg/freiestellen_osrc.php [amd-jobs.de]

Re:Why ATI... Go NVidia (1)

CTho9305 (264265) | more than 7 years ago | (#15761064)

well, corporations are schizophrenic, but: http://www.amd-jobs.de/de/einstieg/freiestellen_osrc.php [amd-jobs.de]

Those are mostly kernel-related jobs - for server chips, having the OS kernel take full advantage of processors is a big deal, and Linux is a pretty big OS in the server arena.

When it comes to ATI/nvidia, however, Linux users make up a much smaller part of their target market (gamers).

Re:Why ATI... Go NVidia (1)

misleb (129952) | more than 7 years ago | (#15760873)

I would consider lack of 3D support "sucking." So if ATI's own drivers are WORSE than not having 3D support, wow!

-matthew

Re:Why ATI... Go NVidia (3, Informative)

Paul Jakma (2677) | more than 7 years ago | (#15761023)

The X.org drivers do support 3D, and quite well, on the older R100 and R200 cards. R300/400 are also supported for 3D, but those have needed extensive reverse engineering, and hence are not quite as mature (though getting there, apparently); also, they have only really reverse engineered the equivalent of the R200 feature set, so they're not getting the most out of the cards - all thanks to ATi's silly attitude about supplying documentation.

Re:Why ATI... Go NVidia (1)

LoveTheIRS (726310) | more than 7 years ago | (#15761085)

ATI drivers the best? How about almost zero, or slow, 3D support. If you want to actually use any of your hardware's capability you need to use ATI's own Linux drivers. ATI's Linux drivers lack any software suspend/suspend-to-disk functionality, which is imperative on laptops. Which is unfortunate, seeing that ATI has almost the complete laptop graphics market share. Another imperative function, hot-switchable dual monitors (used on laptops to send a video signal to a projector), is missing from the ATI drivers. Not to mention the ATI drivers crash and tend to be tied to a single X.org version. ATI Linux drivers are crap. I hope their merger with AMD will mean better drivers for all systems, but especially for Linux.

Re:Why ATI... Go NVidia (1)

axlr8or (889713) | more than 7 years ago | (#15760678)

Oh man, this is bringing a tear to my eye (I have more than one, don't worry). I don't like Intel or Windoze, but you are right. ATI doesn't do Linux very well. But it sure is nice on Windoze. I like the cards' performance, and with the exception of a Diamond Stealth I had years ago, I've always had ATI on my own machines. I thought it would be really cool to have ATI and AMD together but you're probably right. The next machine I build will have Nv and AMD.

Re:Why ATI... Go NVidia (1)

Captain Jack Taylor (976465) | more than 7 years ago | (#15760713)

I've used ATI cards on Linux and Windoze, and let me just say...they suck on BOTH now. I used to love them, but everything past about a 9600 is pure and utter CRAP. It's not the power of the card, though, but the shitty drivers, regardless of OS, holding them back.

Perhaps the consumer space (0)

Anonymous Coward | more than 7 years ago | (#15760703)

ATI seems ahead in TVs, phones, and gaming boxes. Perhaps that's where AMD is going. In fact, consumer is the only thing that has been saving ATI's bacon, as their margins in the PC space are far behind nVidia's. nVidia has made some recent acquisitions to broaden beyond the high-end PC, but ATI is way ahead there. Perhaps AMD sees the broad ATI product offerings and nice tech since R300, and can run those divisions with better margins than Orton seems to have been able to do.

completely agree (3, Insightful)

RelliK (4466) | more than 7 years ago | (#15760743)

Nvidia makes the best chipsets for AMD. Why would they want to merge with a second-rate vendor? I hope AMD doesn't become as unstable as ATI drivers.

Re:completely agree (2, Interesting)

powerlord (28156) | more than 7 years ago | (#15760980)

Look at the other possibility:

AMD, after buying out ATI, opens up the architecture or supports Linux as a first-tier platform.

I bet if ATI was putting out first rate drivers it might influence quite a few purchases in that direction ... of course it might also push nVidia to do the same ... arms races can be fun for the spectators (and consumers :) )

Re:completely agree (4, Insightful)

Paul Jakma (2677) | more than 7 years ago | (#15761071)

I bet if ATI was putting out first rate drivers it might influence quite a few purchases in that direction

Sigh. This detrimentally short-sighted acceptance of binary-only drivers that users like you have is precisely why there are no good drivers for recent ATi hardware, or for most recent graphics hardware besides Intel's. And until users like yourself start demanding that vendors provide documentation, not binary blobs, graphics support will continue to suck.

Binary drivers kill kittens (thanks, airlied, for that one). They don't help if you run other free Unixen; they don't help if you use a non-mainstream platform (e.g. PPC, or AMD64 up until recently; they don't help the Radeon in the Alpha I have here).

Demand DOCUMENTATION - even if it's gibberish to you personally, it will benefit you far more than binary blobs eventually...

Re:Why ATI... Go NVidia (1)

rockchops (866057) | more than 7 years ago | (#15760806)

I agree. I have a strong bias towards AMD and nVidia. But besides that, I've always thought it beneficial to have the options of Intel/AMD + nVidia/ATI. I hope this doesn't mean that ATI chips will get married to AMD or vice versa (such as degraded performance with non-partner chips)!

So let me get this straight (2, Insightful)

uptoeleven (845032) | more than 7 years ago | (#15761002)

ATI and AMD shouldn't merge because ATI's drivers suck.

I think that's the consensus on here; certainly the Linux drivers are apparently awful.

My AMD64 desktop machine has an NVidia graphics card which works much better than the ATI rubbish built into the motherboard. But I'm not using that machine to write this. In fact, other than for occasional gaming, that machine rarely gets switched on.

I tend to use my laptop. Which has a Centrino chipset.

You know - that one that Intel brought out for laptops? The one that's hugely, massively successful in one of the main growth areas of hardware sales? Everyone wants a laptop... or a home media centre based on a PC but that doesn't run like one... Everyone is buying Intel. Why? Because to all intents and purposes all the laptops come with Intel Centrino sets. It's dead easy - they're dead easy to support, all the bits work together, no conflicts. AMD? Sure, nice chips, but who makes Turion laptops? Acer... Asus... and... um... some other companies... Perhaps Alienware? HP make a couple, Fujitsu Siemens make a couple, but these aren't their high-end desirable laptops. It's like "well, if I spend money I get a Centrino, otherwise it's a toss-up between Celeron - the cacheless wonder - and a chip that sounds like a sticky nut treat..."

Who makes Centrino laptops? Dell, Sony, Toshiba, Fujitsu Siemens, Samsung, Panasonic, whatever IBM are calling themselves now - oh and Acer and Asus and Alienware too but - oh yes, and one really important company who basically stuck 2 fingers up to AMD - Apple. I'll bet Apple choosing Intel hurt. But everyone's buying laptops with Centrino chipsets in... No-one's really buying AMD... because AMD don't provide a chipset and an easy way for manufacturers to just kind of put their machines together in a lego-style fashion.

Does it make business sense for AMD to tie up with the chipset and motherboard manufacturer that also happens to make graphics cards? Hell yes. Does it make sense for AMD to try to get into the laptop market in a meaningful way? Probably. Will their driver support get any better? We can hope...

AMD + ATi vs. Intel + nVidia (5, Insightful)

The Living Fractal (162153) | more than 7 years ago | (#15760602)

As much as I like AMD, I have to say that if Intel and nVidia teamed up they would probably beat the crap out of AMD + ATi.

And if AMD and ATi merge.. It sort of seems like a punch in the face to nVidia. Leaving them wanting to talk to Intel. Leading to... what?

For a long time there have been two beasts in the CPU market and two beauties in the GPU market. AMD and Intel in CPUs, and ATi and nVidia in GPUs. If they marry respectively, the offspring might have the good qualities of neither and the bad qualities of both. I think overall the consumer would probably (more than likely) lose out.

So, I really kind of hope this is just a rumor.

TLF

Re:AMD + ATi vs. Intel + nVidia (1)

sporadic (110921) | more than 7 years ago | (#15760646)

If this happens, a lot of interesting things will happen. Intel's chipsets do not support SLI, only CrossFire; maybe Intel and NVIDIA would start talking about cross-licensing if CrossFire becomes an AMD+ATI property?

I for one would love a Core 2 Duo SLI solution, fully supported by both Intel and NVIDIA.

But like you, I'd rather see these four companies remain separate.

Sporadic

The timing of it... (0, Flamebait)

SanityInAnarchy (655584) | more than 7 years ago | (#15760747)

Right now, just as Intel's jumping ahead of ATI, too.

Maybe it's a sign. Maybe Intel jumped ahead of ATI because ATI sucks so much that just the anticipation of such a merger was enough to cause problems for AMD? Maybe Intel is so awesome because they already are talking to nvidia?

Of course, I've got a better combo in mind already: nVidia, period. They always talk about how they hate the x86 architecture, and wish you could just do everything on their hardware. Well, maybe they should try that... It wouldn't be the first time; after all, the nForce chipsets often required severe OS hacks to get anything working on them.

Re:AMD + ATi vs. Intel + nVidia (1)

ScottLindner (954299) | more than 7 years ago | (#15760769)

nVidia does just as well with both Intel and AMD processors. Even if AMD and ATI merged, it would be in nVidia's best interest to stay on their own unaligned. It's not like the ATI + AMD combo would actually make something better than nVidia could for chipsets. And assuming they could, so what? nVidia would just turn it up and prove that they can compete. nVidia can always crank up the heat when they need to. They're good at that.

The only concern nVidia should have is if the AMD processor line started closing its architecture to play favorites with its ATI side. If that happened, nVidia would have to align with Intel. If any of this happened, it would really suck for a lot of us unless AMD + ATI could allow nVidia to continue unthreatened.

Other thoughts on this?

Re:AMD + ATi vs. Intel + nVidia (1)

nine-times (778537) | more than 7 years ago | (#15760832)

Even if there is some sort of a merger, it's not like that means AMD will make their processors only work with ATi cards, or make ATi cards only work with AMD processors. Well, I guess they could do that, but I'm not sure what the point would be.

Re:AMD + ATi vs. Intel + nVidia (1)

CTho9305 (264265) | more than 7 years ago | (#15761106)

As much as I like AMD, I have to say that if Intel and nVidia teamed up they would probably beat the crap out of AMD + ATi.

People say that, but I have to wonder what Intel has to gain. I mean, they're already the biggest player [reghardware.co.uk] in the graphics industry when it comes to market-share, so they clearly have the know-how to build graphics chips. Sure, they don't currently go after the enthusiast market, but there might be reasons for that:
1. Lower margins - Nvidia's gross margin [yahoo.com] is under 40%, and Intel's [yahoo.com] is close to 60%.
2. Huge time-to-market pressures. Right now, Intel and AMD ship products every few years, and make sure they work pretty well before shipping them. On the other hand, anyone who's bought bleeding-edge video cards knows that they're buggy as hell until you get drivers that work around the problems, because the companies don't really care whether the product works any longer than required to survive through the standard benchmarks.

Re:AMD + ATi vs. Intel + nVidia (0)

Anonymous Coward | more than 7 years ago | (#15761235)

Talk about a dumb move (3, Interesting)

overshoot (39700) | more than 7 years ago | (#15760621)

Well, as an AMD stockholder I'll certainly vote against it (not that I have enough shares to matter.)

The market's view of this is visible from the fact that ATI is up and AMD is waaaay down.

Re:Talk about a dumb move (1, Interesting)

Anonymous Coward | more than 7 years ago | (#15760723)

Did you actually have a point? Or were you just observing the obvious effect that happens immediately before all mergers? (Namely: that the price skyrockets for the one that's going to get swallowed, and the price tanks for the one that's going to be footing the bill.)

The stock market is just a Ponzi scheme / elaborate game of hot potato anyway, so don't try to take any deep meaning from stock prices on anything less than a year average.

Re:Talk about a dumb move (1)

CTho9305 (264265) | more than 7 years ago | (#15760957)

That's probably because AMD missed earnings estimates - most of the drop happened between closing yesterday and opening today, not after The Inq's story.

Re:Talk about a dumb move (2, Interesting)

rfunches (800928) | more than 7 years ago | (#15761221)

The market's view of this is visible from the fact that ATI is up and AMD is waaaay down.

Wrong. The company doing the takeover (AMD) almost always declines -- rather noticeably, too -- and the company being taken over almost always increases -- usually because the takeover bid is at a higher stock price.

AMD is just reporting bad earnings news in a volatile, short-heavy, news-sensitive market. With companies reporting good earnings still trading downward, it's no surprise that reporting bad earnings will earn a company a sound beating on its stock (case in point: Dell). Rumors of the AMD bid weren't even reported by Dow Jones until well after today's close. An analyst quoted in the DJ story mentioned that AMD would have to issue more stock (and dilute current shareholders' stock, a Bad Thing) in order to complete the deal, with ATYT valued at $5.6b, both companies with only about $3b of combined cash, and AMD with $500m of debt.

I don't see how this makes any financial sense for AMD. The stock is at 52-week lows, there's disappointing earnings for the most recent quarter, the phasing out of one of their chip lines is confusing consumers, Intel's Conroe seems to have better prospects, and AMD is spending a lot of money for a new plant that won't be ready for years. They don't seem to have any good news.

I am not a professional investor or analyst, and I don't hold AMD or INTC stock.

God damn it. (1, Interesting)

Anonymous Coward | more than 7 years ago | (#15760623)

Say goodbye to nForce chipsets for AMD.

Conflict - nForce? (2, Insightful)

Coplan (13643) | more than 7 years ago | (#15760631)

I'm a big AMD fan. But I'd be really upset to lose the nForce line of chipsets. In my opinion, it's a must for any AMD user. And I think it would be very difficult to come up with a good replacement.

I also worry that chipsets for AMD based motherboards will not work so well with my nVidia video card. Not an ATI fan at all.

I'm going to be watching these guys very closely. This would sway me away from AMD.

Re:Conflict - nForce? (1)

drinkypoo (153816) | more than 7 years ago | (#15760742)

I'm a big AMD fan. But I'd be really upset to lose the nForce line of chipsets.

I'm a big AMD fan, but after dealing with nForce platform drivers, I'll be really upset not to lose the nForce line of chipsets.

Re:Conflict - nForce? (1)

PenGun (794213) | more than 7 years ago | (#15760866)

I find the nforce4 mobos to work well with the linux drivers. I agree the nvidia stuff seems kinda borked.

      PenGun
    Do What Now ??? ... Standards and Practices !

Re:Conflict - nForce? (0)

Anonymous Coward | more than 7 years ago | (#15761058)

I find the nforce4 mobos to work well with the linux drivers.
No thanks to Nvidia for that, though.

Re:Conflict - nForce? (1)

powerlord (28156) | more than 7 years ago | (#15761000)

Amen!

Forget even Linux; nForce is seriously crippled under Windows XP also.

It was very painful trying to get a new system built with nForce4, a RAID 1 SATA array, and a SATA optical drive (the system is AMD also, but that wasn't a problem :) )

Finally I traced the problem to an incompatibility in the nForce chipset. It can EITHER support a SATA RAID array, or it can support a SATA optical drive. Doing both unfortunately causes the system to bluescreen.

(and yes, I know SATA on the optical doesn't buy you much ... except MUCH better airflow for a system based around a microATX board)

Re:Conflict - nForce? (1)

lostguru (987112) | more than 7 years ago | (#15761271)

thats cause WinblowsXP

of course i'd be interested to see what happens when vista rolls out (maybe next decade if we're lucky)

i just pray that amd, intel, nvidia, ati, and all the other hardware guys can come to some agreement so we don't have a schizophrenic micro$oft come out with a horrible os that won't run well on any of them or worse only run on one of them

Re:Conflict - nForce? (1)

nine-times (778537) | more than 7 years ago | (#15760906)

I'm wondering, why are people jumping to these kinds of assumptions? Intel makes its own motherboards, chipsets, graphic chipsets, etc., but that doesn't prevent them from functioning with other manufacturers' parts. What business sense would there be in AMD making their processors incompatible with nVidia chipsets? If either were Microsoft, then maybe they could get away with it, but generally hardware/software benefits from compatibility.

Poor Choice For AMD (3, Insightful)

Anonymous Coward | more than 7 years ago | (#15760642)

As anyone familiar with the botched ATI graphics system in the Xbox 360 knows, whatever competence ATI may have had in the past is long gone.

The Xbox 360 is the first console ever to have PCs outperform it before the console has hit store shelves. In the past, consoles have had at least a year or so before PCs could touch them.

What the hell is AMD thinking?

AMD needs to come up with its own bogus SPEC score generating compiler to grow in the market, not a fucked up GPU maker.

Re:Poor Choice For AMD (1, Informative)

Anonymous Coward | more than 7 years ago | (#15761074)

This is fairly obviously a troll, but I'll respond anyway. Have you considered that it's also the first console ever that wasn't rendering at 640x480, 60 fields a second, versus 1024x768, 1280x1024 or even higher for a PC? At the second res that's more than 4 times the resolution of what most Xbox/PS2 games were, 8 times if the game wasn't progressive scan, which the Xbox was the first to do for most games. The Xbox 360 renders at 1280x720 at the lowest, which is much closer to a normal PC res.
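
A quick sanity check of those ratios (using the figures above): 1280x1024 is 1,310,720 pixels versus 640x480's 307,200, roughly 4.3x; against an interlaced 640x240 field (153,600 pixels) it's roughly 8.5x. The Xbox 360's 1280x720 minimum works out to 921,600 pixels, so it is indeed much closer to typical PC resolutions.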

Re:Poor Choice For AMD (0)

Anonymous Coward | more than 7 years ago | (#15761143)

The blame for the 360 should not fall at ATI's feet.

ATI was able to build the graphics system that Nintendo designed for the GameCube, which was cheap and efficient and had the same overall power as both the PS2 and Xbox (some areas lower, some higher, obviously).

NVidia wasn't really to blame for the Xbox fiasco. They slapped a desktop GPU in a console just like Microsoft asked them to do.

And ATI really isn't to blame when Microsoft tried to do the same thing with the Xbox 360 - obviously having learned nothing from the 5 billion dollar first Xbox disaster.

There is one and only one party to blame and that is Microsoft. After two failures in the console market it is unlikely they will be around long enough to have a third.

Integrated graphics (0)

Anonymous Coward | more than 7 years ago | (#15760657)

What actually comes to me hearing about this is how incredibly much everyone hates Intel Integrated Graphics. I'm told that ATI and NVidia both have low-end cards that don't really cost any more than Integrated Graphics, but get used less often just because they're not what the system comes with. Mark Rein of Epic [tgdaily.com] seems to think Integrated Graphics is slowly killing PC gaming.

I wonder: with AMD and ATI working together, will they be able to present an alternative which meets Intel on price while beating them by far on performance? And if they do, will Intel have to improve their offerings to stay competitive?

Hooray capitalism

Population+ For A Particular Group of Engineers (1)

70Bang (805280) | more than 7 years ago | (#15760690)



It's definitely going to be one of those positive situations where software is doctored to perform particularly well [when combinations are involved].

;)

GPU in socket? (2, Interesting)

tomstdenis (446163) | more than 7 years ago | (#15760694)

There is a company out there that has an FPGA in a 940 pin socket. What about putting a GPU in it? Dual channel memory, HT link to the main processor, HT link to a DAC from the GPU [make mobos with fixed DACs on the board].

That'd be hella cool.

Tom

Re:GPU in socket? (0)

Anonymous Coward | more than 7 years ago | (#15760774)

It'd probably be slower actually, as the GPU usually uses memory one or two generations above what computers use, and the connections are probably very specialized. There is a reason why having a GPU share system memory results in slow performance.

Re:GPU in socket? (1)

tomstdenis (446163) | more than 7 years ago | (#15760861)

Um, you realize that in a 2P config the extra socket would not be "sharing" system memory. Though you're right, they do tend to already use GDDR2/GDDR3 [iirc similar to DDR2].

Though yeah, I guess the memory would still be slower.

Tom

Fuck no. (0, Troll)

A beautiful mind (821714) | more than 7 years ago | (#15760701)

Here I come, Intel, if this is true. Just because ATI drivers are horsecrap on Linux. I'm not going to support that company. Especially since Intel looks quite good with that Conroe stuff...

Re:Fuck no. (1, Flamebait)

abshnasko (981657) | more than 7 years ago | (#15760829)

Why must all anti-ATI comments end with "on Linux" ?

All you linux bedwetters need to step into the real world. ATi, as far as my experience goes, has made solid GPUs. You represent a minority of ~5%; you do not rule the tech industry. Get over it.

Re:Fuck no. (1)

tomstdenis (446163) | more than 7 years ago | (#15760958)

Because they don't listen to their customers?

For me, working in Windows is painful, and it's only made usable with a dozen tools like cygwin, adobe, etc...

In Linux I can "just work". Booting Windows to play games isn't an option, as multiple people use this box. Whereas in Linux I can totally dominate one of my four cores with a video game; if I boot Windows, they get 0 of the 4 cores to use.
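
(For what it's worth, a minimal sketch of explicitly dedicating one core to a process on Linux; a hypothetical illustration using sched_setaffinity, not anything specific to this setup:)

<ecode>
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

int main(void)
{
    cpu_set_t set;

    CPU_ZERO(&set);
    CPU_SET(3, &set);                 /* core index 3 on a 4-core box */

    /* pid 0 means "the calling process"; a game exec'd from here
     * inherits the mask and stays off the other three cores */
    if (sched_setaffinity(0, sizeof(set), &set) != 0) {
        perror("sched_setaffinity");
        return 1;
    }

    printf("restricted to core 3\n");
    /* ... exec the game here ... */
    return 0;
}
</ecode>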

Tom

How about this one? (1)

geekoid (135745) | more than 7 years ago | (#15761108)

ATI sucks on Windows.
The drivers are horrible.
ATI lies to their customers.

The GPU may be solid, but that's only half the battle.

Re:Fuck no. (1)

be-fan (61476) | more than 7 years ago | (#15761315)

Linux users are perhaps a few percent of the general gamer market, but on the other hand, they make up a substantial percentage of the professional market. If you're using your GPU to do 3D modeling, scientific visualization, etc., there is a not-insubstantial chance you're on Linux.

Re:Fuck no. (0)

Anonymous Coward | more than 7 years ago | (#15761060)

Nvidia drivers are crap too. What's the difference between broken, buggy, binary only crap drivers from ati and broken, buggy, binary only crap drivers from nvidia?

Re:Fuck no. (1)

BobPaul (710574) | more than 7 years ago | (#15761245)

the broken, buggy, binary only crap drivers from nvidia actually work. That's a big difference.

Re:Fuck no. (0)

Anonymous Coward | more than 7 years ago | (#15761289)

They work in the same way the ati ones do. They work sometimes, if you are lucky, and are using very common hardware and haven't had to customize your kernel or X in any significant way. I had to get an ati card for our brand new amd matlab sim machine, because the nvidia binary drivers caused it to lock up hard as soon as X started, and the open source nv driver doesn't support multiple monitors. The ati drivers worked fine; it took 3 minutes to install them and reboot, and everything was fine.

If true... (0)

Anonymous Coward | more than 7 years ago | (#15760716)

this would not be the first time ATi and AMD shot themselves in their respective feet.

Huge Opportunity for Free Software Drivers (1, Insightful)

Anonymous Coward | more than 7 years ago | (#15760726)

AMD, like Intel, could be convinced to open up the specifications to their graphics hardware in order to sell more of their complement product, processors. The difference is that ATI Graphics Processing Units (GPUs) don't suck like Intel GPUs. AMD could have almost 100% of the Linux notebook market within a year and my guess is HP would be the big winner because they already have a business line of AMD notebooks with ATI GPUs: HP Compaq nx6125 Notebook for Business (New Zealand link since this Anonymous Coward is from NZ) [hp.com]

Article Text (1)

cunina (986893) | more than 7 years ago | (#15760781)

AMD and ATI to ask shareholders for merger approval

By Gary Niger: Friday 21 July 2006, 13:53

ACCORDING to an extraordinarily reliable source, AMD and ATI will on Monday pitch their shareholders with the proposition that the two companies merge.

It's an interesting idea - AMD doesn't quite have the shekels in the bank to buy ATI outright. The deal, subject to shareholder approval, may still founder.

If the deal goes through, Nvidia and its SNAP partnership with AMD will definitely be reconsidered and Nvidia will all of a sudden become a super underdog compared to the new juggernaut. It may also stop the endless bickering between ATI and Nvidia that's entertained the world+dog for some years now.

AMD will be glad to get its hands on ATI's very profitable handheld division. The firm needs good chipsets and will also benefit from a great consumer digital chip segment. It will also like the integrated graphics business and will now get a piece of this action.

approval for what? (1)

postmortem (906676) | more than 7 years ago | (#15760782)

They are not making the same product, and neither will have a relative monopoly even after a merger.

AMD's market share is at best 20%, ATi's at best 1/3 (the other thirds being Intel and nVidia), and again, even with bundling their respective products they don't make any impact; it is not like they couldn't bundle ATi shi*sets without a merger.

As has already been said, the majority of nForce users will swing away. They are making it easier for us to avoid them, as AMD+ATi will always come in a combo, so if you don't like one, you don't get the other; it also makes it easier to avoid both while buying laptops (many Intel-based laptops have ATi graphics).

Re:approval for what? (1)

LWATCDR (28044) | more than 7 years ago | (#15760851)

Intel does have video chips and chipsets, yet nVidia supports them; I don't see why this would have to be the end of the nForce.
I would find it a bit odd if it happens.

Re:approval for what? (1)

ichigo 2.0 (900288) | more than 7 years ago | (#15760936)

They need approval from the owners, a.k.a. the shareholders; market share doesn't matter in this case.

I hope the merger is rejected (1)

kimvette (919543) | more than 7 years ago | (#15760785)

I think this is bad for AMD because ATI has crappy support, crappy customer experience, and crappy drivers.

Either this would vastly improve ATI or it could drag AMD down into mediocrity. If the merger does happen I truly hope that it is the former (ATI cleaning up its act across the board), but all too often with these sorts of mergers it's the latter that happens. ATI has a lot of great technology with fast GPUs, but when the drivers suck, customer service and support are nonexistent, and they absolutely refuse to document registers for folks (third-party developers) willing to develop drivers for FREE, I have absolutely, positively NO reason to buy ATI products, even if they do offer superior products (like their All in Wonder series). I used to be an ATI fanatic (most of my machines - personal and business - are STILL ATI-equipped/crippled in the hopes they'll clean up their act) but on all new machines I've been choosing NVidia, and I recommend NVidia to clients whenever it's possible.

Re:I hope the merger is rejected (1)

kimvette (919543) | more than 7 years ago | (#15760864)

s/crappy customer experience/service/

although, the end result is a crappy customer experience.

No!! (1)

archcommus (971287) | more than 7 years ago | (#15760792)

This would terribly upset me if it were to go through. I could see nVidia then teaming up with Intel, and you'd basically be forced to buy either an AMD+ATI combo or an Intel+nVidia combo. Nooo thanks.

Please, no. (1)

WidescreenFreak (830043) | more than 7 years ago | (#15760846)

I am a hard-core AMD and nVidia fan. I don't have any Intel PCs in my house except those that I got as freebies, and I've never had good luck with *any* ATI card. I cringe in fear at what would (or at least could) happen to my gaming systems of the future if ATI and AMD merge. Yes, I can see some type of exclusivity where ATI cards are going to somehow be more advantageous than nVidia when it comes to gaming hardware for reasons other than plain, old competition.

Damn. This worries me

Dilbert, anyone? (3, Interesting)

ivoras (455934) | more than 7 years ago | (#15760870)

Doesn't this story look like a Dilbert-ish situation - the companies themselves aren't even considering merging, but because "the word is out" and "everybody knows they'll do it" it somehow becomes a reality?

Re:Dilbert, anyone? (0)

Anonymous Coward | more than 7 years ago | (#15761093)

How do we know the companies aren't considering it? Companies do a lot of things in secret, even from their own employees. In general, The Inquirer is disturbingly accurate.

as an employee (0)

Anonymous Coward | more than 7 years ago | (#15760989)

as an employee of one of these two companies (does it matter which one...?), i can say the office was buzzing today with talk about this. the consensus is that it's probably just a rumor, but if it is true, none of us would be very happy about it...

So much nonsense (1)

bruno.fatia (989391) | more than 7 years ago | (#15761107)

You think nVidia will stop manufacturing nForce for AMD chips? Unless they are dumb, they won't. It's money for them!

Also, what makes you think that AMD + ATI means Intel + nVidia? Nothing so far other than speculation. Nobody has given any real evidence for it.

ATI + AMD = ? (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#15761174)

ATI + AMD = DAAMIT

Re:ATI + AMD = ? (1)

ZeroExistenZ (721849) | more than 7 years ago | (#15761346)

Intel + AMD = LATE MIND
Intel + nVidia = EVIL DNA IN IT / DIVE IN AT NIL
AMD + nVidia = DNA VIA DIM
ATI + Intel = NIL ATE IT

don't look at intel (1)

MADnificent (982991) | more than 7 years ago | (#15761181)

Could it be that this is just a way to keep our attention away from the new Intel?

AMD has to do something to keep the enthusiasts' attention. On Monday the price cuts will come. So to me it seems like we are getting a whole bunch of 'events' trying to keep AMD in the spotlight...

---
Today I am a bunny. A very very drunk bunny. >>>> this is not really a sig

Linux + ATI WORKS! (1, Troll)

Urza9814 (883915) | more than 7 years ago | (#15761183)

I don't understand all the comments saying ATI's Linux drivers suck. I've got a Radeon 9200, and I've never had a problem with 'em. I love the thing. I HATE NVidia. They're expensive as hell. I always recommend AMD and ATI. Of course, I don't think there's anyone here that would say AMD isn't good...heh. I've got my Athlon XP 2200+ overclocked from 1.35GHz to 2.09GHz, and I've had it higher, but not all of my RAM is fast enough.
I can't wait for this merger if it's true.

Re:Linux + ATI WORKS! (1)

justsomebody (525308) | more than 7 years ago | (#15761280)

I don't understand all the comments saying ATI's Linux drivers suck. I've got a Radeon 9200

As you see, you didn't need to go further; we already know you use the X.Org drivers and not fglrx. Yes, those drivers are nice. Not fully featured, but open and nice. I prefer open too.

Now try using some X1800 and tell us how you like that for a difference (no magic without the crappiest driver ever, named fglrx). Then pop in an nvidia card and their drivers and tell us how that feels. Believe me, using nvidia drivers will suddenly seem like the best fuck you've ever had (if you know what you're talking about; /. people and fuck are not really connected topics, mostly by wish only).

Re:Linux + ATI WORKS! (1)

Urza9814 (883915) | more than 7 years ago | (#15761327)

I really don't know what I'm using. I install a distro (tried it on Mandriva, Slackware and Libranet) then install the drivers from ATI's website, and everything works fine. Don't know more than that about my graphics card, because I don't need to. It works.

Re:Linux + ATI WORKS! (1)

geniusj (140174) | more than 7 years ago | (#15761282)

The whole Intel vs AMD thing is like a swinging pendulum. Intel is back in the tech lead now in most markets, and it's soon to be all. We'll see what AMD does to swing the pendulum back their way.

I also wonder if the ATI sales boost is partly due to Apple's laptop sales we heard about yesterday.

Re:Linux + ATI WORKS! (1)

pornflakes (945228) | more than 7 years ago | (#15761348)

Hm.
Support for r2x0 (8500-9250) cards is actually broken in recent Ati Linux drivers.
Ati cards are, at least, just as expensive as NVIDIA cards. Midrange-ish NVIDIA cards (7600GT, 6600GT...) outperform their Ati counterparts and are generally cheaper.

Re:Linux + ATI WORKS! (1)

Urza9814 (883915) | more than 7 years ago | (#15761378)

Meh...all I know is I've got two computers, one with an ATI, one with a NVidia. When I looked up their prices, the NVidia was about twice as much...but I have a buncha games that won't run on the NVidia (biggest problem being 3D pixel shader or something) that run great on my Radeon. Dunno the actual model number for the NVidia though.

Another possible hint that this is more than rumor (1)

Solr_Flare (844465) | more than 7 years ago | (#15761200)

The recent announcement by Apple that they are going to be partnering with Nvidia for future iPod use could be the first step in them getting ready to switch over to Nvidia for their graphics processors, since they use Intel chipsets and ATI graphics cards currently. I'm sure AMD is bitter over Intel being picked instead of AMD for the new "Mactels" too, so I could easily see them withdrawing ATI support if the merger takes place.

Not that far fetched. (4, Interesting)

WoTG (610710) | more than 7 years ago | (#15761218)

At first glance, this is a stupid idea for AMD, but upon reflection, it isn't that bad. We've got to look at the 5 year picture for a deal of this size. What will AMD need to do to be more successful in 5 years than they are today? Well, despite what the teenage gamers will say, it actually doesn't mean having the highest FPS in Quake 5. The stable, highest volume, and generally profitable sales are in corporate servers and workstations. That's Dell, HP, and to a lesser extent Gateway, Lenovo, et al. So, what do they need from AMD or Intel? They want cheap, fast, reliable supply, few defects, and ease of integrating into the individual computers. After several years of the Athlon and Opteron, AMD is only now starting to get a toe hold in workstations and a reasonable share of server CPUs.

IMHO, AMD would be well advised to start shipping its own chipsets, just like Intel. It just makes things easier for their most important customers, the big OEMs. They have one less vendor to worry about. There's less testing required, since presumably AMD would test the CPU and chipset together. And it's less risky for both customers and AMD, since AMD has a very strong incentive to make sure that chipsets will be available for their platform on time, whereas third parties have different priorities.

Then there's the whole GPU angle. Why shouldn't GPUs be produced in company-owned (i.e. tweaked-for-performance) fabs? They're every bit as complex and big and expensive as CPUs. Bringing that in-house should give a nice bump to performance. And what is a GPU going to be in five years anyway? On the AMD platform, all the tools are in place to allow the GPU to work much more like a cheap DSP/co-processor than we've ever seen before. If the Opteron wasn't an Itanium killer, maybe a couple of Opterons and a couple of "GPU-DSPs" will do the trick. Even for regular workstations, imagine just plugging a GPU into a free socket on the MB. That would fit very nicely in the middle of the graphics market... way better than integrated, but way cheaper than an add-on card.

Lastly, AMD needs a way to use the last-generation fab equipment a little longer. Making chipsets would let them use the fab equipment for an extra few years. They lost that cost efficiency when they spun off the flash business. Fab gear is expensive, so it's kind of a waste for them to be yanking it out every time the minimum for a marketable CPU moves higher.

Five years ago AMD needed partners and an ecosystem to support their own platform and survive as a company. The next five years are about turning the CPU market into a duopoly.

I have a few shares of AMD. And I'd like to see this deal happen, but only at a decent price (from AMD's point of view). Hmm... this post turned rather long...

The real reason (1)

akuma(x86) (224898) | more than 7 years ago | (#15761326)

AMD has Centrino envy. More specifically, they need a platform strategy.

Let's face it. CPUs are commodities. You buy price/performance.
Recently, Intel has been using the platform to differentiate itself.
Centrino is one example in the notebook world.
You can see other examples with "advanced I/O" in the newer server platforms.
Intel dictates the platform and can define it to suit their needs.

AMD has no platform strategy. It's at the mercy of various 3rd party chipset makers.

This is why this makes strategic sense.
AMD wants to control a platform and use that to differentiate itself from Intel.

Maybe Intel and Nvidia can now merge (0)

Anonymous Coward | more than 7 years ago | (#15761375)

And blow the awful pieces of shit ATI/AMD as well as Mudorola out of the fscking water. W007!
After that, Intel/NVidia can merge with Dell and Microsoft to blow Linusx and opensores out of the water. Then the faggot couple Fucktard Taco and BrokebackNeil can slit their god-damned wrists from depression and no more SHITDOT! W007! W007!

GO AHEAD, FUCKING FLAME AWAY OR WASTE YOUR G0D DAMNED MOD POINTS FUCKTARDED SHITDOT SHEEPLE!!!!!!!!!!