
It's Official — AMD Will Retire the ATI Brand

timothy posted more than 3 years ago | from the rose-by-any-other-name dept.


J. Dzhugashvili writes "A little over four years have passed since AMD purchased ATI. In May of last year, AMD took the remains of the Canadian graphics company and melded them into a monolithic products group, which combined processors, graphics, and platforms. Now, AMD is about to take the next step: kill the ATI brand altogether. The company has officially announced the move, saying it plans to label its next generation of graphics cards 'AMD Radeon' and 'AMD FirePro,' with new logos to match. The move has a lot to do with the impending arrival of products like Ontario and Llano, which will combine AMD processing and graphics in single slabs of silicon."


324 comments

J. Dzhugashvili (!) writes ... (-1, Offtopic)

Anonymous Coward | more than 3 years ago | (#33413654)

Fates worse than death, eh?

ATI - burn in hotel echo lima lima (-1, Troll)

Anonymous Coward | more than 3 years ago | (#33413660)

good riddance

Re:ATI - burn in hotel echo lima lima (-1, Troll)

Anonymous Coward | more than 3 years ago | (#33413716)

And nothing of value was lost.

Re:ATI - burn in hotel echo lima lima (-1, Troll)

DarkKnightRadick (268025) | more than 3 years ago | (#33413800)

beat me to it.

Radeon outlasts ATI (3, Funny)

Hadlock (143607) | more than 3 years ago | (#33414204)

It's interesting that the Radeon brand, or series at least, has outlived its creator. Who will be there to give away Radeon to its new life partner?

Something old (AMD), Something new (Radeon), Something borrowed (x86 architecture), Something blue (Intel?)

Great news (4, Interesting)

mangu (126918) | more than 3 years ago | (#33413680)

"The move has a lot to do with the incoming arrival of products like Ontario and Llano, which will combine AMD processing and graphics in single slabs of silicon."

Good. Getting rid of the PCI-e bus between CPU and GPU is one important step in getting massive parallelism to work well.

Since we hit the 3 GHz barrier, where the speed of light itself becomes a limit, putting the processing elements physically closer is essential to get better performance. Now let's see them put 4 GB or so of fast RAM on the same chip.

Re:Great news (1)

conares (1045290) | more than 3 years ago | (#33413688)

Since we hit the 3 GHz barrier, where the speed of light itself becomes a limit

Seriously? wow

Re:Great news (5, Informative)

mangu (126918) | more than 3 years ago | (#33413738)

With a 3 GHz clock, a signal at the speed of light travels 10 cm during one clock cycle. This means that if a chip needs data from another and there's a distance of five centimeters or more between both chips the data will not arrive in the same clock cycle.
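
A quick back-of-the-envelope check of that figure (a rough sketch in Python, assuming the signal propagates at the vacuum speed of light, which is an upper bound; real signals on copper or silicon are slower):

<ecode>
# How far can a signal travel in one clock cycle at 3 GHz?
# Assumes propagation at the vacuum speed of light (an upper bound).
c = 299_792_458          # speed of light, m/s
f = 3e9                  # 3 GHz clock
distance = c / f         # metres per clock cycle
print(f"{distance * 100:.1f} cm per cycle")   # ~10.0 cm
</ecode>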

Re:Great news (4, Interesting)

data2 (1382587) | more than 3 years ago | (#33413770)

So with current die sizes of about 146 mm^2, assuming it's really square, we have a maximum length of about 1.7 cm. Sounds like we can go up to 9 GHz, at least if we are just using the speed of light in vacuum.
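
A rough sketch of where those numbers seem to come from (assuming a square die and vacuum-speed propagation; for the ~9 GHz figure I've assumed the signal has to cross the die diagonal and come back within one cycle, a guess made to match the number above, not something stated in the comment):

<ecode>
# Rough reconstruction of the die-size estimate above.
# Assumptions: square die, vacuum-speed propagation, and a round trip
# across the diagonal per clock cycle for the frequency estimate.
import math

c = 299_792_458                  # m/s
die_area = 146e-6                # 146 mm^2 expressed in m^2
side = math.sqrt(die_area)       # ~12.1 mm
diagonal = side * math.sqrt(2)   # ~17.1 mm, the "about 1.7 cm" above
f_max = c / (2 * diagonal)       # ~8.8 GHz for a there-and-back trip
print(f"diagonal = {diagonal * 100:.2f} cm, f_max = {f_max / 1e9:.1f} GHz")
</ecode>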

Re:Great news (4, Informative)

mangu (126918) | more than 3 years ago | (#33413884)

we have a maximum length of about 1.7 cm. Sounds like we can go up to 9 GHz, at least if we are just using the speed of light in vacuum.

Assuming the signals travel in a straight line. If you look at current motherboards and video cards, you'll notice that many of the copper traces are "wiggly", not straight. That is done in order to get bits in parallel buses to arrive at the same time, and conductor traces on the chips must be designed similarly; it's the longest distance that any of the bits must travel that limits the others.

Besides, there are capacitance and inductance effects to be considered. Transitions from one to zero and vice-versa aren't instantaneous and that must be taken into account.

One could say that 9 GHz would be the absolute physical limit for a 1.7 cm chip and the technical limit is somewhat lower than that.

For a set of chips on a board, the absolute physical limit is much lower, and that's the reason why on-chip cache memory has become so important lately.

Re:Great news (2, Informative)

BusterB (10791) | more than 3 years ago | (#33414008)

An XFI-SFI interconnect runs up to 10.3 Gbps on a single serial link. It is double-pumped (a bit on each edge of the clock), so the clock rate is half that. This is the connection that links a 10 Gbps PHY to the transceiver module. You do have to keep the interconnects pretty short, though.

http://www.altera.com/technology/high_speed/protocols/10gb-ethernet-xfi-sfi/pro-xfi-sfi.html [altera.com]

XDR ram can transmit 8 bits per clock on a serial line: http://en.wikipedia.org/wiki/XDR_DRAM [wikipedia.org]

Re:Great news (1)

durrr (1316311) | more than 3 years ago | (#33413954)

Also, we'd have to design a processor that either uses wireless wiring between all gates or has only straight, shortest-possible-path connections between everything.

It would also need instant quantum-FTL communication between hard drives, RAM, and the GPU to keep it from being a massively bottlenecked waste of performance.

Re:Great news (1)

Randle_Revar (229304) | more than 3 years ago | (#33414006)

exactly

Re:Great news (1)

Hylandr (813770) | more than 3 years ago | (#33414194)

So are we coming full circle, where the computer is really a base station, and the CPU and related hardware found on a motherboard are included in the chip and plugged in like so many Atari 2600 cartridges?

That would be neat. Upgrades would only be dependent on having the right docking station then.

- Dan.

Re:Great news (-1, Flamebait)

cbiltcliffe (186293) | more than 3 years ago | (#33413828)

With a 3 GHz clock, a signal at the speed of light travels 10 cm during one clock cycle. This means that if a chip needs data from another and there's a distance of five centimeters or more between both chips the data will not arrive in the same clock cycle.

Really? So when did we all get to using optical interconnects?

Electricity doesn't travel at the speed of light.

And even if it did, for your random, uninformed postulation to be true, we would need evidence that chips could not practically run faster than 3GHz. Unfortunately for you, that is not the case [intel.com].

Re:Great news (0)

Anonymous Coward | more than 3 years ago | (#33413844)

I think you're proving your opponent's point there: the Pentium 4 was a ridiculous piece of crap; it did a better job as a space heater than as a processor.

It's not light speed (3, Informative)

NotSoHeavyD3 (1400425) | more than 3 years ago | (#33413864)

But propagation speed is a significant fraction of c (66 to 96 percent: http://en.wikipedia.org/wiki/Speed_of_electricity [wikipedia.org]). Admittedly you've got a point, they've already gotten past 3 GHz. (I'm just wondering how much faster they can get before signal speed is actually the limiting factor.)

Re:It's not light speed (1)

simcop2387 (703011) | more than 3 years ago | (#33414042)

At that point you do need to start designing things more like a network than as a bus attached to the processor, since you'd have latencies of several clock cycles involved. The only real option is to increase cache sizes and move them ever closer to the processor so that the latencies can be avoided as much as possible. But as far as things like a graphics card go, when you've got billions of clock cycles per second and a latency of, say, 100 cycles, the user is never going to notice.

Re:It's not light speed (1)

rwa2 (4391) | more than 3 years ago | (#33414164)

It wasn't the signal speed that became a limiting factor above 3 GHz, but transistor power leakage current, which sort of goes "to hell in a handbasket" past that point.

http://arstechnica.com/old/content/2004/06/prescott.ars/2 [arstechnica.com]

But that sort of explains why Moore's law went all multi-core after Intel gave up trying to make 4 GHz CPUs that didn't leak power all out the wazoo.

Re:Great news (1)

AvitarX (172628) | more than 3 years ago | (#33414034)

I wonder how many things on the chip took more than a clock cycle, specifically things that in a good design would take one.

I wonder if that's why the Pentium 4 had terrible performance per clock.

Re:Great news (4, Insightful)

dylan_- (1661) | more than 3 years ago | (#33414158)

Really? So when did we all get to using optical interconnects? Electricity doesn't travel at the speed of light.

We're not, but even if we were, that's the fundamental limit. Electricity traveling slower than this makes the problem worse.

And even if it did, for your random, uninformed postulation to be true

You've clearly misunderstood his post, so adding insults just makes you look foolish.

we would need evidence that chips could not practically run faster than 3GHz. Unfortunately for you, that is not the case.

No we wouldn't. If it can't be done in one clock cycle, it'll be done in two (or more). Who said anything about this limiting clock speed?

Anyway, at a higher clock speed, the problem becomes even more pronounced. With a 3.8 GHz clock, a signal at the speed of light only travels 7.9 cm during one clock cycle (but let's estimate about 6.5 cm for electricity).
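
For what it's worth, a tiny sketch of that arithmetic; the ~6.5 cm estimate for electricity corresponds to an implied velocity factor of roughly 0.82, which is a back-calculation from the numbers above, not a figure given in the comment:

<ecode>
# Distance covered per cycle at 3.8 GHz, in vacuum and "in copper".
c = 299_792_458                # m/s
f = 3.8e9                      # 3.8 GHz clock
vacuum_cm = c / f * 100        # ~7.9 cm per cycle
implied_vf = 6.5 / vacuum_cm   # ~0.82, back-calculated from the 6.5 cm estimate
print(f"vacuum: {vacuum_cm:.1f} cm, implied velocity factor: {implied_vf:.2f}")
</ecode>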

Re:Great news (1)

Vectormatic (1759674) | more than 3 years ago | (#33413984)

That doesn't really matter all that much; clock/data propagation stages in a data channel are nothing new. Intel's Pentium 4 had pipeline stages which only served to propagate the data/instruction one step further down the pipeline.

Sure, the added latency isn't all that nice, but 1 clock cycle to anywhere isn't needed anyway.

Re:Great news (0, Redundant)

MichaelSmith (789609) | more than 3 years ago | (#33413778)

Looks like a photon travels 10 cm [wolframalpha.com] in a clock pulse at 3GHz.

Re:Great news (0, Flamebait)

cbiltcliffe (186293) | more than 3 years ago | (#33413830)

The electricity that runs through my processor is not made of photons, making this comment, although interesting, 100% irrelevant.

Re:Great news (1, Insightful)

Anonymous Coward | more than 3 years ago | (#33413976)

It's very relevant, because it means that the electricity, which is moving even slower than photons (about 2/3 c), travels less than 10 cm in one cycle.

Re:Great news (1)

u17 (1730558) | more than 3 years ago | (#33414152)

Ah, but to arrive at such a conclusion you need to make a non-trivial logical deduction based on arcane prior knowledge involving abstract relationships and mathematical concepts such as suprema, a feat that not many can accomplish!

Re:Great news (0)

Anonymous Coward | more than 3 years ago | (#33413728)

Funny, because I've seen many overclocked systems reach well above 3GHz.

Re:Great news (1)

Arbition (1728870) | more than 3 years ago | (#33413972)

That just means increasing the number of waits the CPU needs to do, making branch prediction ever more important.

Re:Great news (2, Informative)

sanosuke001 (640243) | more than 3 years ago | (#33413754)

Yeah, 3 GHz doesn't come close to the light speed barrier. I think the issue is more from heat dissipation and electron bleed...

Moving the GPU on-die will fix the latency associated with the PCIe bus, but it's not because of the reasons you seem to believe.

Re:Great news (5, Informative)

bertok (226922) | more than 3 years ago | (#33413868)

Yeah, 3 GHz doesn't come close to the light speed barrier. I think the issue is more from heat dissipation and electron bleed...

Moving the GPU on-die will fix the latency associated with the PCIe bus, but it's not because of the reasons you seem to believe.

Want to bet?

At 3 GHz, light moves just 7.2 cm [wolframalpha.com], given a typical upper range for the velocity factor of copper of 0.72. Silicon and fibre optics are usually worse, with a VF between 0.4 and 0.6, or between 4 and 6 cm per clock. That's barely enough to traverse a CPU die, let alone the motherboard. Moving parts physically closer together has a lot to do with the speed of light!
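
The same calculation with a velocity factor folded in (a minimal sketch; the 0.72, 0.4, and 0.6 figures are simply the ones quoted above, not measured values):

<ecode>
# Distance per clock cycle at 3 GHz for a given velocity factor (fraction of c).
c = 299_792_458    # m/s
f = 3e9            # 3 GHz clock

def cm_per_cycle(velocity_factor):
    return velocity_factor * c / f * 100

for name, vf in [("copper (upper range)", 0.72),
                 ("silicon/fibre (low)", 0.4),
                 ("silicon/fibre (high)", 0.6)]:
    print(f"{name}: {cm_per_cycle(vf):.1f} cm per cycle")
# copper: ~7.2 cm; silicon/fibre: ~4.0 to ~6.0 cm
</ecode>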

Re:Great news (4, Informative)

IndustrialComplex (975015) | more than 3 years ago | (#33414174)

Want to bet?

At 3 GHz, light moves just 7.2 cm, given a typical upper range for the velocity factor of copper of 0.72. Silicon and fibre optics are usually worse, with a VF between 0.4 and 0.6, or between 4 and 6 cm per clock. That's barely enough to traverse a CPU die, let alone the motherboard. Moving parts physically closer together has a lot to do with the speed of light!

I really would mod this informative, since I was about to make a similar point. I think a lot of the confusion is that people hear things like the speed of light in terms of kilometers per second, and it gets filed away by the brain as inconsequential for scales which are measured in centimeters and MUCH smaller.

But when you realize that a speed measured in hundreds of millions of meters per second is being divided into segments that are fractions of a billionth of a second, the speed of light manifests itself on a much more physically understandable scale.

Re:Great news (2, Informative)

Rockoon (1252108) | more than 3 years ago | (#33413960)

Overclockers have gone above 6 GHz here [tomshardware.com], above 7 GHz here [geek24.com], and don't forget over 8 GHz here [softpedia.com].

In each case, it's always about the heat.

Pretty much all CPUs sold today (even "2.x GHz" chips) can go over 4 GHz with proper air cooling. The reason they don't sell 4 GHz+ chips is that chips have warranties and require a proper cooling setup in order not to fail at those speeds. Most important, of course, are the heat sink and CPU fan, which Intel and AMD do have some control over; also of considerable importance are case fans and case ventilation, which they do not have control over.

Just moving my case fan from the stock front position (intake) to the back (exhaust) gave me 10 degrees C more headroom at load, allowing my AMD 1055T to go from 2.8 GHz to 4.1 GHz (before moving the case fan, I was only stable up to 3.36 GHz).

Re:Great news (1, Insightful)

Anonymous Coward | more than 3 years ago | (#33414068)

More cycles doesn't have anything to do with the original poster's point. He's not saying you can't go above 3 GHz; he's saying it's a point of diminishing returns, as there are more empty cycles while the chip waits for data from external sources that can no longer arrive in a single clock cycle.

Re:Great news (1)

shoehornjob (1632387) | more than 3 years ago | (#33414056)

I don't think this is such a good idea. We'll have to see what comes of it, but I like being able to choose what processor and GPU I want in my computer. I'm currently running a quad-core 3.2 GHz Phenom CPU and an Nvidia GTX 250. Since that one crappy ATI card I bought several years ago, I have always bought Nvidia GPUs and they have never let me down. With this change I'll have to start buying Intel chips, which are more expensive. I wouldn't be surprised if Intel secures some kind of deal with Nvidia to start doing the same. I'm curious to see if AMD will come under some kind of regulatory scrutiny from the government. You know, if this were an insurance company (Travelers) and a financial services company (Citibank), this deal would send up red flags because it effectively limits the consumer's choice (i.e., if you want that GPU, you have to buy a CPU that you may not have wanted). It will be interesting to see how this plays out.

Re:Great news (2, Insightful)

mangu (126918) | more than 3 years ago | (#33414184)

The same thing happened with math coprocessors. I once had an AMD Am386 chip with an Intel 80387 floating-point chip. With the 486 CPU series, Intel fully integrated the floating-point functions into the same chip as the CPU.

I don't think the market for separate graphics chips will last much longer. The only way to get more performance out of CPUs now is by adding cores, and it makes sense to let the CPU use the GPU cores. Integrating graphics functions in the CPU seems inevitable by now.

Re:Great news (1)

StayFrosty (1521445) | more than 3 years ago | (#33414198)

I'm sure you will still be able to buy CPUs without a GPU on die. Based on your logic, you won't be able to buy Intel CPUs either because some of Intel's chipsets include crappy onboard Intel GPUs.

Will this affect open source drivers (4, Interesting)

La Gris (531858) | more than 3 years ago | (#33413684)

Are there any deeper changes to come behind the re-brand? ATI has been involved in producing open source drivers and specs for their GPUs. Will this name change carry some bad news about the current openness?

Re:Will this affect open source drivers (0)

Anonymous Coward | more than 3 years ago | (#33413698)

Considering that they only started doing that after the AMD buyout, I kinda doubt that anything bad will happen on that front now.

Re:Will this affect open source drivers (5, Informative)

Ironhandx (1762146) | more than 3 years ago | (#33413712)

ATI really only started doing that after they were acquired by AMD so I wouldn't worry too much.

Re:Will this affect open source drivers (1)

ProppaT (557551) | more than 3 years ago | (#33413730)

I think that the "deeper changes" are that AMD's prepping for their integrated CPU/GPU launch. It only makes sense. If they're gonna start merging chips, it would be awfully awkward to have two brand names AND a product name attached to a chip.

I would imagine that better Linux drivers might come down the pipeline, though. These integrated approaches lend themselves nicely to Linux workstations, and they'd definitely lose out on a potential market if they completely ignored the issue.

Opportunity knocking for AMD here... (4, Insightful)

Qubit (100461) | more than 3 years ago | (#33413914)

...AMD's prepping for their integrated CPU/GPU launch. ...
I would imagine that better Linux drivers might come down the pipeline, though...they'd definitely lose out on a potential market if they completely ignored the issue.

I'd go one step further and say that I think that AMD has an opportunity to highlight their hardware here.

Intel's CPUs and integrated graphics have long had great support in the Linux kernel. Because Intel controls the tech, they can actually provide the correct and full source for the graphics drivers. The problem is that Intel integrated graphics aren't ever anything special.

If AMD is seriously working on integrating their graphics cards and processors -- perhaps even onto the same die -- then they have an opportunity to provide a much more powerful, integrated hardware platform with fully-open drivers. Intel can't compete with that kind of setup, especially as NVIDIA appears to have an aversion to opening the source to their graphics card drivers.

Re:Opportunity knocking for AMD here... (2, Informative)

hedwards (940851) | more than 3 years ago | (#33414074)

Intel's CPUs and integrated graphics have long had great support in the Linux kernel. Because Intel controls the tech, they can actually provide the correct and full source for the graphics drivers. The problem is that Intel integrated graphics aren't ever anything special.

Methinks you might be being a bit generous with Intel. I went with an Intel integrated chipset a number of years back because the alternatives weren't very well supported on FreeBSD, but the graphics weren't just not special, they were bad. Sufficiently bad that I've stayed away from them ever since. Which for Intel is just dumb; I have a very hard time believing that Intel couldn't do any better than what they've been doing. Hopefully AMD owning ATI will kick a bit of sand in Intel's collective face so that they actually do something about it.

Re:Will this affect open source drivers (2, Informative)

MostAwesomeDude (980382) | more than 3 years ago | (#33414212)

Other way around; AMD has always released specs and started releasing ATI specs after ATI was acquired. You may notice that http://www.x.org/docs/AMD/ [x.org] is lacking docs for the r200 and earlier; that's because AMD made the acquisition during the r400 era, and the docs for older chipsets were more or less lost forever at that point.

Right now, the open-source drivers are called radeon, r300, r600, etc.; one developer committed his code as "amd" instead at one point. (It got changed to avoid end-user confusion.)

Let The Confusion Begin (1, Insightful)

noc007 (633443) | more than 3 years ago | (#33413694)

I assume there's going to be an AMD Radeon sticker next to the Intel Inside sticker. I can't wait to sort out the confused people around me thinking there are two physical CPUs, one from each manufacturer, in that computer. In addition to consolidating its brand presence, I suppose they think this will reduce confusion, when IMHO it will create more confusion for a while.

Re:Let The Confusion Begin (3, Funny)

Anonymous Coward | more than 3 years ago | (#33413718)

What confusion?

As you said, there are two physical CPUs, one from each manufacturer, in that computer. Where's the confusion?

Re:Let The Confusion Begin (0, Redundant)

Tukz (664339) | more than 3 years ago | (#33413786)

What?
There AREN'T 2 physical CPUs in the computer.
The CPU is from Intel, the GPU is from AMD.

The confusion is that most regular people know of AMD and Intel as the CPU, ATI and nVidia as the GPU.

As OP said, if there are 2 stickers on the PC, saying AMD Radeon and Intel Inside, they might think the computer has both, or at the very least be slightly confused.

I pretty much repeated what OP said, in a slightly different wording.

Though I don't think it'll be a problem, I do see the point.

Re:Let The Confusion Begin (4, Insightful)

cbiltcliffe (186293) | more than 3 years ago | (#33413870)

The confusion is that most regular people are only marginally aware of an AMD/Intel distinction, don't know what it means, and don't know ATI or nVidia at all.

Fixed that for you.

Re:Let The Confusion Begin (0)

Anonymous Coward | more than 3 years ago | (#33413910)

^THIS.

Re:Let The Confusion Begin (0)

Anonymous Coward | more than 3 years ago | (#33413726)

http://www.heise.de/newsticker/meldung/AMD-loest-sich-von-Markenbezeichnung-ATI-1069117.html?view=zoom;zoom=1

Re:Let The Confusion Begin (0)

Anonymous Coward | more than 3 years ago | (#33413734)

Seems like confusion at the expense of Intel, so that would be a bonus for AMD.

Re:Let The Confusion Begin (3, Insightful)

CubicleView (910143) | more than 3 years ago | (#33413736)

I can't wait to sort out the confused people around me thinking there are two physical CPUs

I'd imagine that the only people who care to hear about the internals of your computer (if any) will be able to figure it out for themselves.

Re:Let The Confusion Begin (0)

Anonymous Coward | more than 3 years ago | (#33413742)

RTFA shithead.

The badges you see above will be used for systems with discrete Radeon and FirePro graphics cards. The lower row omits the AMD logo, so PC makers shipping Intel-based systems will be able to avoid the oil-and-water combo of Intel and AMD branding, if they wish.

Re:Let The Confusion Begin (1)

somersault (912633) | more than 3 years ago | (#33413744)

GPUs are being used more and more for non-graphics co-processing these days, so people thinking of the machine as having two CPUs aren't that far off...

Re:Let The Confusion Begin (0)

Anonymous Coward | more than 3 years ago | (#33414084)

You are an idiot. GPUs are used only for vectorized operations. They are HARD to program for. I write OpenCL, and it SUCKS. They will *NOT* be seen as two processors or a processor and coprocessor. You might want to do some reading.

Captcha: dinosaur

Re:Let The Confusion Begin (3, Informative)

Lliam33 (1881990) | more than 3 years ago | (#33413764)

No, there are two logos, as seen in the article. One with an "AMD Radeon" logo for discrete cards and one with just "Radeon Graphics" for PC makers building Intel-based systems.

Classic example of not reading the article... (5, Informative)

maweki (999634) | more than 3 years ago | (#33413816)

because it states "The badges you see above will be used for systems with discrete Radeon and FirePro graphics cards. The lower row omits the AMD logo, so PC makers shipping Intel-based systems will be able to avoid the oil-and-water combo of Intel and AMD branding, if they wish."

Re:Classic example of not reading the article... (0, Troll)

whoop (194) | more than 3 years ago | (#33413876)

You could at least provide a link to this alleged quote of yours. How are we to know that you didn't just make all that up?!?

Re:Classic example of not reading the article... (0)

Anonymous Coward | more than 3 years ago | (#33414060)

If you RTFA you would know he didn't make it up...

Re:Classic example of not reading the article... (1)

Barefoot Monkey (1657313) | more than 3 years ago | (#33414072)

You could at least provide a link to this alleged quote of yours. How are we to know that you didn't just make all that up?!?

It's right there in TFA [techreport.com]. The link is in the summary. Read the paragraph directly below the pictures of the new badges.

Re:Classic example of not reading the article... (0)

Anonymous Coward | more than 3 years ago | (#33414172)

It's right here: http://techreport.com/discussions.x/19547 [techreport.com]

You can also find it at the top of the page... twice, in fact.

Yes, I'm well aware that that's the joke.

Re:Let The Confusion Begin (1)

kevingolding2001 (590321) | more than 3 years ago | (#33413848)

You could make that assumption and then type up a post discussing the confusion that would result.
Or you could RTFA (I know, new here etc...) and see the bit that says...
"The lower row omits the AMD logo, so PC makers shipping Intel-based systems will be able to avoid the oil-and-water combo of Intel and AMD branding, if they wish."
... and save yourself all that trouble.

Engadget had good commentary: (0)

Anonymous Coward | more than 3 years ago | (#33413714)

Engadget.com Article [engadget.com]

Regarding killing off the brand,

Great, but did anyone consider the fact that the graphics wars will now be fought between two teams wearing green jerseys?

fglrx (3, Insightful)

leathered (780018) | more than 3 years ago | (#33413746)

..can they retire that too? please?

Re:fglrx (1)

TheCycoONE (913189) | more than 3 years ago | (#33414062)

The name? Sure they can. To please you it will (continue to) be known as Catalyst. (http://support.amd.com/us/gpudownload/linux/Pages/radeon_linux.aspx)

That's fine and dandy (0)

Anonymous Coward | more than 3 years ago | (#33413748)

but that doesn't stop their drivers from sucking.

Re:That's fine and dandy (1, Informative)

Anonymous Coward | more than 3 years ago | (#33413896)

nVidia's drivers suck pretty bad too. The real problem is that the high-end graphics card companies will prioritize "getting a couple of extra FPS in a benchmark" over "not crashing all the goddamn time."

At least the Linux open-source drivers tend to be stable, when a card finally gets supported (a generation late, at least).

Not terribly surprising (1)

samael (12612) | more than 3 years ago | (#33413752)

They can give the AMD brand a big boost by associating it directly with the graphics cards - and it will probably mean that people buying an AMD graphics card will be more likely to buy an AMD processor to go with it.

Re:Not terribly surprising (1)

shoehornjob (1632387) | more than 3 years ago | (#33414108)

As I read it, they now have no choice. If they want to buy an AMD CPU or an ATI GPU, they are locked into both. I'm still unsure if this is a big win for consumers.

Re:Not terribly surprising (2, Informative)

samael (12612) | more than 3 years ago | (#33414200)

ATI graphics cards work just fine with Intel processors. I don't believe there's any move to stop them doing so when they rebrand.

Does nVidia support HD and HD audio over HDMI? (-1, Offtopic)

Anonymous Coward | more than 3 years ago | (#33413756)

Because to all those currently praying for some reason for the demise of ATI, they do. I can hook up my HD 5770 to my receiver and get sound and video in one. As of a few months ago no nVidia cards offered this.

Re:Does nVidia support HD and HD audio over HDMI? (-1, Offtopic)

Anonymous Coward | more than 3 years ago | (#33413788)

FIRST POAST!!!111!!one

Re:Does nVidia support HD and HD audio over HDMI? (1)

Grimbleton (1034446) | more than 3 years ago | (#33413964)

My laptop with a GeForce 310M does it just fine. Pushes out 1080p, too.

Re:Does nVidia support HD and HD audio over HDMI? (0)

Anonymous Coward | more than 3 years ago | (#33414178)

Laptop != desktop.

stronger brand (0, Offtopic)

omidaladini (940882) | more than 3 years ago | (#33413766)

I think ATI was a more reputable brand than AMD, which has had to carry the Defeated-by-Intel badge for years.

Re:stronger brand (1)

Ironhandx (1762146) | more than 3 years ago | (#33413880)

Funny how fast Intel seemed to have dumped that "Completely annihilated by AMD using less than half of our R&D budget" badge that they were wearing for a couple of years after the Athlon64 was released.

WTF how is this offtopic? (1)

drinkypoo (153816) | more than 3 years ago | (#33413930)

I think ATI was a more reputable brand than AMD, which has had to carry the Defeated-by-Intel badge for years.

I think that ATI is one of the least reputable brands in PC hardware; every single ATI 3D accelerator I've ever owned has caused me some kind of problem. Retiring the ATI brand won't fool any geeks, but it will fool the people they told to never buy ATI.

Re:WTF how is this offtopic? (2, Insightful)

Randle_Revar (229304) | more than 3 years ago | (#33414046)

>it will fool the people they told to never buy ATI.
who would be so irresponsible as to tell someone that?

Re:WTF how is this offtopic? (2, Insightful)

drinkypoo (153816) | more than 3 years ago | (#33414160)

who would be so irresponsible as to tell someone that?

Friends don't let friends buy ATI. I will no longer attempt to help friends with ATI driver problems, because usually the answer is "you're fucked" or "become a driver developer", which is the same thing. I can't remember the last time I had an ATI graphics solution with which I've had zero problems, because that has never happened, and I have used hardware from almost every generation of ATI graphics chips. Wait, that's not true; there was one combination I had no graphics problems with: Mach32 on NT 3.51. But with Mach64 came a driver complex enough to prove that ATI couldn't write drivers, and the rest is history... a painful chapter of history I'd like to burn.

Re:stronger brand (1)

Rockoon (1252108) | more than 3 years ago | (#33414098)

Intel has been running a smaller 32 nm process/die size in order to beat AMD in performance, and only a few of those 32 nm chip designs have achieved price/performance parity while the rest are grossly below the curve.

AMD is about to put its own 32 nm process into production chips, so at the very least the very top end will not be Intel-only land anymore. The only question is whether or not AMD's new chips will continue the long-standing trend of spanking Intel on the price/performance metric ("defeated-by-intel" indeed... shut up fanboy).

That's retarded. (0)

Anonymous Coward | more than 3 years ago | (#33413792)

ATI is the oldest surviving video card brand. :(

Little bit of hate? (5, Funny)

Anonymous Coward | more than 3 years ago | (#33413810)

In May of last year, AMD took the remains of the Canadian graphics company and melded them into a monolithic products group, which combined processors, graphics, and platforms. Now, AMD is about to take the next step: kill the ATI brand altogether.

Oh, please, J. Dzhugashvili, don't hold back. Tell us how you REALLY feel. What'd the rejected original form of this summary look like?

In May of last year, the poor, innocent Canadian angels of technology, ATI, had their very remains tortured and raped by the evil, evil AMD, cruelly melded into a hideous abomination of a monolithic products group, creating an unholy, soulless combination of processors, graphics, and platforms. Now, the faceless anti-christ forces of AMD plan to take the next step in their plans to destroy all that is good in the world: Slaughter the angelic ATI brand altogether, laughing with sadistic glee as it begs for mercy in a futile appeal to the quickly-evaporating last shreds of AMD's humanity and compassion, ATI never having harmed a fly in its too-short, sad, sad life.

Justice Department on vacation since 1980 (1, Troll)

PopeRatzo (965947) | more than 3 years ago | (#33413934)

This is hardly their worst offense, but how did the Bush Justice Department ever let AMD buy ATI to begin with? Are we really OK when there are only two major manufacturers of processors and graphics hardware?

I guess the answer is "for the same reason they're about to let United Airlines and Continental Airlines merge".

Don't they realize that every time one of these mergers happens, the end result is that Goldman Sachs makes a ton of cash, a handful of execs make a ton of cash, and a whole lot of manufacturing workers are thrown off the back of the train? Then they act like they don't understand why there are market "inefficiencies" and manufacturing is fleeing the US (and Canada). And ultimately consumers suffer, too.

Oh, and yes, the Justice Department does have jurisdiction when a US company buys a Canadian company (or vice versa).

Re:Justice Department on vacation since 1980 (4, Interesting)

Murdoch5 (1563847) | more than 3 years ago | (#33414000)

This was a great merger. This merger led to the first decent ATI drivers being created on the Linux side. If this hadn't happened, how much longer would ATI have survived? They basically said FU to Linux and ignored it. Great merger.

Re:Justice Department on vacation since 1980 (2, Informative)

JamesP (688957) | more than 3 years ago | (#33414168)

Which manufacturing workers exactly?!

ATI does not have a plant. It's all TSMC and the other one, I forget what it's called.

Re:Justice Department on vacation since 1980 (1)

barzok (26681) | more than 3 years ago | (#33414202)

This is hardly their worst offense, but how did the Bush Justice Department ever let AMD buy ATI to begin with? Are we really OK when there are only two major manufacturers of processors and graphics hardware?

They were too busy dragging their heels on the Sirius/XM merger at the behest of the NAB to notice.

Re:Justice Department on vacation since 1980 (2, Insightful)

Ecuador (740021) | more than 3 years ago | (#33414206)

I don't understand. There were two major manufacturers of CPUs and two major manufacturers of GPUs before the merger, exactly the same number as after the merger. Where is your problem exactly?

I personally see problems elsewhere. One example is eBay, the online auction monopoly, being allowed to not only buy PayPal, but also disallow any other payment system...

Re:Justice Department on vacation since 1980 (3, Insightful)

Chaos Incarnate (772793) | more than 3 years ago | (#33414214)

We went from there being two manufacturers of processors & two manufacturers of usable graphics hardware... to there being two manufacturers of processors & two manufacturers of usable graphics hardware. Not sure what you're thinking there was for the Justice Department to stop.

Retired ati a long time ago.. (0, Troll)

rec9140 (732463) | more than 3 years ago | (#33413974)

ATI was retired a long time ago...

They got put on semi retirement after dealing with their "all in wonder" cards and drivers on winidiot.

They officially got retired after their boneheaded move to drop support for cards 1-2 years old in the Linux driver tree.

So no loss really.. I am not purchasing AMD graphics, period.

AMD processors, YES. That is all I spec and want, but they can keep the graphics.

AMD, you SHOULD HAVE PURCHASED NVidia! That's the only video card allowed now.
