
AMD Confirms Commitment To x86

Unknown Lamer posted more than 2 years ago | from the amd-versus-world dept.


MrSeb writes with an excerpt from an ExtremeTech story on the recent wild speculation about AMD abandoning x86: "Recent subpar CPU launches and product cancellations have left AMD in an ugly position, but reports that the company is preparing to jettison its x86 business are greatly exaggerated and wildly off base. Yesterday, Mercury News ran a report on AMD's struggles to reinvent itself and included this quote from company spokesperson Mike Silverman: 'We're at an inflection point. We will all need to let go of the old 'AMD versus Intel' mind-set, because it won't be about that anymore.' When we contacted Silverman, he confirmed that the original statement had been taken somewhat out of context and provided additional clarification. 'AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud.' The larger truth behind Silverman's statement is that no matter what AMD does, it's not going to be 'AMD versus Intel' anymore — it's going to be AMD vs. Qualcomm, TI, Nvidia, and Intel."


163 comments

Considering Bulldozer ... (4, Interesting)

ackthpt (218170) | more than 2 years ago | (#38215118)

The larger truth behind Silverman's statement is that no matter what AMD does, it's not going to be 'AMD versus Intel' anymore — it's going to be AMD vs. Qualcomm, TI, Nvidia, and Intel."

Considering the execution of Bulldozer, you could possibly add AMD to the vs. list.

Re:Considering Bulldozer ... (2)

lightknight (213164) | more than 2 years ago | (#38216046)

Granted, Bulldozer is...painful to look at. However, I am willing to give AMD the benefit of the doubt, and allow them one upgrade cycle to fix the bugs in their design before considering the competition. They claim that this design will ramp up better than the previous stuff, and others have claimed that a few software patches are needed for various OSs like Windows to take advantage of the change in architecture.

Mind you, it does kind of feel like Intel with the Itanium (the Itanic), but thankfully this design is still in the x86 world.

Re:Considering Bulldozer ... (3, Insightful)

AmiMoJo (196126) | more than 2 years ago | (#38216826)

The problems with Bulldozer are more than can be fixed by a few revisions or software patches, I'm afraid.

I honestly can't figure out what AMD was thinking. Every demanding desktop task these days makes heavy use of floating point maths. In fact that has been the case for a decade or more, going back to the P4 and Athlon eras, when they were adding more FPUs to single cores. So what does AMD do? Let's have more cores and fewer FPUs!

I can only assume they were hoping that more of the heavy floating point computation would be handled by the GPU. Meanwhile, Intel's current generation has added new instructions that outperform GPUs in tasks like video transcoding. It breaks my heart because I was really looking forward to Bulldozer, as I have always favoured AMD. Their sockets last much longer than Intel's, who seem to dream up a new one for every CPU revision, and you get all the features that Intel charges extra for, like ECC RAM support.

I think the best thing they can do now is revise the design and release the next generation as early as possible because this one is going nowhere.

Re:Considering Bulldozer ... (1)

serviscope_minor (664417) | more than 2 years ago | (#38217042)

I honestly can't figure out what AMD was thinking. Every demanding desktop task these days makes heavy use of floating point maths. In fact that has been the case for a decade or more, going back to the P4 and Athlon eras, when they were adding more FPUs to single cores. So what does AMD do? Let's have more cores and fewer FPUs!

I thought that the standard 128-bit FPUs were independent between the modules as before. The only sharing that happens is when an AVX instruction is issued and they get merged into a single 256-bit unit. Since AVX is very new, this presumably won't happen with much software right now.
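To make the merge concrete, here is a minimal sketch (an editorial illustration, not from the thread) of a 256-bit AVX add, the kind of instruction that makes a Bulldozer module gang its two 128-bit FMAC pipes together. Build with gcc -mavx; the variable names are made up for the example.

    #include <immintrin.h>
    #include <stdio.h>

    int main(void) {
        float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
        float c[8];

        __m256 va = _mm256_loadu_ps(a);        /* one 256-bit load */
        __m256 vb = _mm256_loadu_ps(b);
        __m256 vc = _mm256_add_ps(va, vb);     /* a single 256-bit AVX add */
        _mm256_storeu_ps(c, vc);

        for (int i = 0; i < 8; i++)
            printf("%g ", c[i]);               /* prints 9 eight times */
        printf("\n");
        return 0;
    }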

Am I the only one... (0)

rsilvergun (571051) | more than 2 years ago | (#38215144)

that had to google "inflection point"? From a marketing standpoint it might be good to have a CEO who isn't an engineer :P.

Re:Am I the only one... (4, Informative)

masternerdguy (2468142) | more than 2 years ago | (#38215198)

An inflection point is where a curve changes concavity, e.g. from concave down to concave up; for a smooth curve, the second derivative is zero there. I'm assuming the guy was talking about a profit-over-time curve or something.
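A textbook example, in LaTeX notation (an editorial illustration, not part of the original comment): the cubic f(x) = x^3 has its inflection point at the origin.

    f(x) = x^3, \qquad f'(x) = 3x^2, \qquad f''(x) = 6x
    f''(0) = 0, \quad f''(x) < 0 \text{ for } x < 0 \ (\text{concave down}),
    \quad f''(x) > 0 \text{ for } x > 0 \ (\text{concave up})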

Re:Am I the only one... (0)

Anonymous Coward | more than 2 years ago | (#38215468)

The curve seems to be a time curve plotted against events (AMD vs. Intel?), which is obviously highly subjective. Basically he's using the word "inflection" to say it's the end of an era (cultural?) of just AMD vs. Intel. His wording, then, is more symbolic in meaning.

Re:Am I the only one... (2, Interesting)

Anonymous Coward | more than 2 years ago | (#38215772)

He just read "The Innovator's Dilemma" by Clayton Christensen. He sees new innovation in x86 chipmaking as having diminishing returns, making the entire architecture susceptible to other architectures and competitors where new innovation still provides increasing returns.

Re:Am I the only one... (1)

nine-times (778537) | more than 2 years ago | (#38216204)

"Inflection point" is sometimes used more broadly and metaphorically to mean something like, "a time of drastic change (or potential drastic change)." He's not using it in a technical sense.

Re:Am I the only one... (2)

gman003 (1693318) | more than 2 years ago | (#38216428)

Precisely. It meant "the point where AMD goes from a desktop chip maker that also makes mobile chips, to a mobile chip maker that also makes desktop chips".

Re:Am I the only one... (-1)

Anonymous Coward | more than 2 years ago | (#38215206)

Am I the only one that had to google "inflection point"?

Probably not, but then again, there's a lot of stupid people out there.

Re:Am I the only one... (4, Insightful)

ackthpt (218170) | more than 2 years ago | (#38215330)

that had to google "inflection point"? From a marketing standpoint it might be good to have a CEO who isn't an engineer :P.

or a CEO who picks up a word or phrase from an engineer and thinks, 'Hey, that sounds good, I'll use it in my next meeting or press statement!'

Re:Am I the only one... (2, Insightful)

arth1 (260657) | more than 2 years ago | (#38215500)

Indeed. Which is why words and phrases like "pushing the envelope" and "quantum leap" are so often used wrong, marking the CEO (who reflects on the company) as a dummy.

Re:Am I the only one... (1)

iblum (894775) | more than 2 years ago | (#38217172)

Indeed. Which is why words and phrases like "pushing the envelope" and "quantum leap" are so often used wrong, marking the CEO (who reflects on the company) as a dummy.

Wasn't Quantum Leap that TV show starring Jonathan Archer?

Re:Am I the only one... (2)

Hatta (162192) | more than 2 years ago | (#38215374)

This is why it's good to have a background in math, even if you're not employed in a STEM field. All sorts of processes can be described in mathematical terms, and knowing what those terms mean helps you understand the world better. People often say, "Calculus? I'll never use that after high school!" But the truth is, I use my calculus education every single day without ever touching an integral or derivative.

Re:Am I the only one... (1)

ackthpt (218170) | more than 2 years ago | (#38216778)

This is why it's good to have a background in math, even if you're not employed in a STEM field. All sorts of processes can be described in mathematical terms, and knowing what those terms mean helps you understand the world better. People often say, "Calculus? I'll never use that after high school!" But the truth is, I use my calculus education every single day without ever touching an integral or derivative.

Why baffle people with BS when you can use real language :)

The new line will factor into integral processes where derivatives will end-product quality!

Sounds like something I'd read in a Dilbert strip...

Re:Am I the only one... (1)

iblum (894775) | more than 2 years ago | (#38217206)

This is why it's good to have a background in math, even if you're not employed in a STEM field. All sorts of processes can be described in mathematical terms, and knowing what those terms mean helps you understand the world better. People often say, "Calculus? I'll never use that after high school!" But the truth is, I use my calculus education every single day without ever touching an integral or derivative.

I was thinking about setting up a Reggie Pole for my wife to dance on.

Corporate double-speak (4, Insightful)

LordNimon (85072) | more than 2 years ago | (#38215282)

AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud.

This is a completely meaningless statement. You could say the same exact thing about any microprocessor company. For example:

"Freescale is a leader in PowerPC microprocessor design, and we remain committed to the PowerPC market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."

This statement is true even though AMD and Freescale aren't competitors.

This is the kind of garbage that makes employees think that their managers are clueless and don't know how to fix the company.

Re:Corporate double-speak (2, Funny)

Anonymous Coward | more than 2 years ago | (#38215318)

"PRUNEJUICE is a leader in DRINK YO PRUNE JUICE design, and we remain committed to the DRINK YO PRUNE JUICE market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."

Re:Corporate double-speak (4, Insightful)

Skarecrow77 (1714214) | more than 2 years ago | (#38215336)

Meaningless marketing spin is the only kind of public statement that:
A. doesn't cause controversy and anger among the investors/stockholders, and
B. you aren't forced to go back on 12 months down the line when you find out you were too optimistic and/or out of touch.

Re:Corporate double-speak (2, Insightful)

Hatta (162192) | more than 2 years ago | (#38215988)

Meaningless marketing spin should cause controversy and anger among the stockholders. If I'm investing my money in a company, I want to know they have real plans, not just platitudes. Buzzwords are a sign that they have no idea what they're doing. Take your money and run.

Re:Corporate double-speak (1)

epine (68316) | more than 2 years ago | (#38216750)

Take your money and run.

Stores have figured out how to dump the unprofitable customers (long live data mining). Now they've figured out how to dump the skittish investors. You weren't wanted in the first place. Actually, the game was played this way all along.

Reason is a short leash. The receiving side will take blind faith any time they can get it.

Re:Corporate double-speak (1)

Shatrat (855151) | more than 2 years ago | (#38217524)

Reporters deserve meaningless drivel, because if you give them any real information they'll just butcher it, twisting it into a cure for cancer or the latest iPhone exploit to grab headlines.
I'd look for firm dates and roadmaps provided to customers, partners, and investor relations as a sign that the company knows what it's doing.

Re:Corporate double-speak (4, Funny)

arth1 (260657) | more than 2 years ago | (#38215602)

This is a completely meaningless statement. You could say the same exact thing about any microprocessor company. For example:

"Freescale is a leader in PowerPC microprocessor design, and we remain committed to the PowerPC market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."

This statement is true even though AMD and Freescale aren't competitors.

Freescale commits to the cloud? That is BIG news. Time to run out and adjust my stock portfolio!

Re:Corporate double-speak (2)

nine-times (778537) | more than 2 years ago | (#38216262)

It's not completely meaningless, but you have to know how to read this kind of business-speak. He's saying, "We're either not planning on dropping x86 or we're not prepared to announce that we're dropping it. We do have hopes of moving into additional markets, including mobile, low-end, and scalable architectures, but we don't have any specific plans that we'd like to announce about that either."

Out with the old... (1)

Anonymous Coward | more than 2 years ago | (#38215304)

'We're at an inflection point. We will all need to let go of the old 'AMD versus Intel' mind-set, because it won't be about that anymore.'

I actually find that a little saddening. I miss the good old Intel vs. AMD days, when there was a legitimate choice to be made by performance enthusiasts. Mind you, those days are long gone, but I will still hold a special place in my heart for my AMD Athlon 64 3400 :)

Do their chips [still] overheat? (1)

bogaboga (793279) | more than 2 years ago | (#38215322)

In the past, I always advocated for, and employed, AMD-chipped systems. I was once burned by my advocacy when I lost several AMD mobos after they all got fried!

This contributed to my getting fired, though a poorly written application was partly responsible. My employer would not listen, because other AMD systems survived. They survived only because they were slated to run the application next.

What is the experience of slashdotters using these systems? Do they still consume lots of power or overheat?

Re:Do their chips [still] overheat? (1)

stanlyb (1839382) | more than 2 years ago | (#38215542)

Yes they do. Some 2-3 years ago they introduced 125W CPUs. Nowadays, if you bother to check the motherboard's specification, it always explicitly states some upper limit, say 95W for example. Did anyone confess that their CPUs are burning your system? Literally. Even my power supply was burned by this CPU, and it was supposed to have a lifetime guarantee; go figure.

Re:Do their chips [still] overheat? (2)

Jeng (926980) | more than 2 years ago | (#38217550)

Yes, you do have to do your research and make sure you purchase a compatible motherboard; that is true whether you are dealing with Intel, AMD, or VIA.

125W is still the highest of the desktop chips from AMD, but many AMD motherboards are rated at 140W. You can buy a motherboard that only supports up to 95W, but why would you do that?

Intel has some 130W chips.

Re:Do their chips [still] overheat? (2)

0123456 (636235) | more than 2 years ago | (#38215568)

AMD chips seem to consume more power than comparable Intel chips, but I'm pretty sure they have thermal throttling these days.

I was impressed with one of the old P4 systems in my previous job because the fan was just lying on top of the heat-sink and every once in a while someone would knock it off and the CPU would just throttle down until someone got around to putting it back (yeah, I don't know why we never spent a few dollars to buy a fan that could actually be screwed into place). In those days an AMD CPU would probably have burned out.

Re:Do their chips [still] overheat? (0)

Anonymous Coward | more than 2 years ago | (#38216278)

As someone who bought an Athlon XP during that era, and in fact did exactly that (the fan failed, possibly due to dust, possibly just bad bearings) and then ran it without the fan, I can attest that they didn't have thermal throttling, and would in fact fail in spectacular fashion.

Mind you, during the same era I managed to blow up a PSU and mobo with no noticeable external threats other than a brand-new 3 GHz hyperthreaded P4, which, while still working, I believe to have been the culprit, possibly due to a less-than-ideal heatsink (the original was aluminum-core, swapped with a copper-cored one off my 2.0 GHz Celeron, bought to compare $100 Intel mobo/CPU combos against the equivalent AMD offering. Hint: the Athlon won.)
Current gen I'd go AM3 over Intel (mostly due to cheap

This was typed, BTW, on a Sempron 140 clocked to 3.2 GHz, which has been my primary gaming system since my prior G31/E5400 setup died due to a bad FireWire card. I go with whoever offers the features I want at the price point I need, and honestly, the farther into the future we go, the less reason either AMD or Intel is giving me to buy their future products. If AMD, for example, were to drop overclocking from their non-Black parts, then Intel might start to look more appealing again. If Intel were to stop dicking around with software limitations on their current chips and allow overclocking on their non-K chips, I would seriously consider them, even if it did limit me to two x8-bandwidth video card slots.

Re:Do their chips [still] overheat? (1)

confused one (671304) | more than 2 years ago | (#38215790)

That problem only affected earlier chips with inadequate cooling. The stock cooler was inadequate because it made certain assumptions about "typical" usage. They fixed this long ago (has it been a decade?) by adding temp sensors and automatic clock throttling.
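On a modern Linux box you can watch that throttling machinery directly. A minimal sketch (an editorial illustration; the sysfs paths are the standard thermal/cpufreq ones, but zone and CPU numbers vary by machine):

    #include <stdio.h>

    /* Read a single integer from a sysfs file; returns -1 on failure. */
    static long read_long(const char *path) {
        FILE *f = fopen(path, "r");
        long v = -1;
        if (f) {
            if (fscanf(f, "%ld", &v) != 1)
                v = -1;
            fclose(f);
        }
        return v;
    }

    int main(void) {
        /* temperature is reported in millidegrees C, frequency in kHz */
        long temp = read_long("/sys/class/thermal/thermal_zone0/temp");
        long freq = read_long("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq");
        printf("CPU temp: %.1f C, current clock: %.0f MHz\n",
               temp / 1000.0, freq / 1000.0);
        return 0;
    }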

What's he gonna say? (5, Insightful)

93 Escort Wagon (326346) | more than 2 years ago | (#38215408)

Let's say AMD is planning - or thinking about, at least - stopping the manufacture of x86 processors. What's a responsible company spokesperson going to say? "Yes, we're working on an exit strategy and are hoping to be out of the business by 2014" - does anyone believe that would be stated? If it were, their x86 business would tank immediately, and all employees working on x86 now would update their resumes and get while the getting is good.

Several years ago, we had an important faculty member accept a deanship at another university. The lead time was going to be a bit more than a year. In the meantime, this faculty member still had research projects going full bore. So what did he do? He told his staff that the research projects were going to continue, and would remain at our university for the foreseeable future. Guess what happened a year later? Yup - the "foreseeable future" he spoke of 12 months before turned out to be almost exactly 12 months long.

Re:What's he gonna say? (2)

lightknight (213164) | more than 2 years ago | (#38216066)

Or it could just as easily be someone floating a trial balloon -> a rumor is reported through various sources, and AMD gets a preview of how the market might react. Depending on the reaction, they might go one way or the other.

Re:What's he gonna say? (4, Interesting)

tlhIngan (30335) | more than 2 years ago | (#38216800)

Let's say AMD is planning - or thinking about, at least - stopping the manufacture of x86 processors. What's a responsible company spokesperson going to say? "Yes, we're working on an exit strategy and are hoping to be out of the business by 2014" - does anyone believe that would be stated? If it were, their x86 business would tank immediately, and all employees working on x86 now would update their resumes and get while the getting is good.

I'd be willing to bet that one of AMD's investors is Intel, and while AMD may want to get rid of the x86 business, Intel won't let it.

Intel needs AMD. And AMD's weakened state is ideal for Intel. However, if AMD dies, Intel also suffers (think anti-trust). But with AMD alive, Intel's scrutiny is lowered and they can sell more chips easily.

Heck, I'm willing to bet Intel has next-gen chips ready, but they want to keep AMD viable and are holding off the release. There's no benefit to Intel beyond a few percent of market share if AMD dies, and there's a huge downside: EU regulators, US regulators, and very close scrutiny.

Translation (4, Interesting)

DarthVain (724186) | more than 2 years ago | (#38215442)

"Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."

We will continue to make chips for servers and low-end crap. We can't compete with Intel for the consumer market in the short to medium term; however, we are still relevant in business circles.

Consumers, prepare to be gouged by Intel as soon as it figures this out. Also, other than to just "say it", this has been the truth for some time, years in fact. I don't know if it is AMD stumbling or Intel just continuing to hit home runs, but there hasn't exactly been a whole lot of competition since the days of ye olde Athlon 64 series of processors. Ever since Intel came out with the Core 2 Duo, AMD has been unable to come up with an answer. Perhaps it had something to do with diversifying by buying up ATI, diverting capital or focus away from the core business. Ironically, the AMD/ATI brand of video cards has a better reputation than the AMD CPU division, though that's just my opinion...

Re:Translation (2)

JDG1980 (2438906) | more than 2 years ago | (#38215578)

We will continue to make chips for servers and low-end crap. We can't compete with Intel for the consumer market in the short to medium term; however, we are still relevant in business circles.

Consumers, prepare to be gouged by Intel as soon as it figures this out.

Intel really can't gouge customers too hard, or it will hasten the transition away from x86 that they fear. ARM will be a much more serious competitor once Windows 8 is released with support for it. Yes, it requires everything to be recompiled, but I don't think Intel wants to be locked into a "legacy only" niche, which is what would happen if they pushed too hard on pricing. I keep hearing that if AMD doesn't perform, Intel is going to gouge the hell out of us, but the truth is that Intel has been beating the pants off of AMD since the release of Core 2 Duo (5+ years) and the gouging still hasn't happened. The Sandy Bridge 2500K is still the best bang for the buck of any CPU anywhere. People act as if the Netburst era was somehow emblematic of Intel, when it was in fact a bizarre aberration.

Re:Translation (1)

0123456 (636235) | more than 2 years ago | (#38215664)

Indeed. Intel's real competition hasn't been AMD for a few years now, it's been ARM.

Re:Translation (1)

lightknight (213164) | more than 2 years ago | (#38216112)

Let Intel turn its full attention to ARM for a few cycles, and see if AMD doesn't punish them.

Re:Translation (1)

0123456 (636235) | more than 2 years ago | (#38216298)

Let Intel turn its full attention to ARM for a few cycles, and see if AMD doesn't punish them.

Intel doesn't need to, because they're big enough to have different teams doing both. The problem is that no one can really push the x86 architecture down to ARM-level power consumption, because it's such a complex beast in comparison.

Re:Translation (1)

Andy Dodd (701) | more than 2 years ago | (#38216556)

I wouldn't be surprised if Intel is REALLY regretting selling off XScale to Marvell - Intel had an ARM business for a while, but it just didn't do particularly well, so they sold it.

Probably 1-2 years later, the ARM market started exploding.

I would not be surprised if Intel is quietly working on getting back into the ARM business.

Because AMD still resembles a threat. (1)

AdamJS (2466928) | more than 2 years ago | (#38215712)

Intel is pushing forward because it's beneficial to them at the moment not to rest on their laurels.
AMD is underperforming, yes, but not so much that Intel is given any real leeway to slack off. That is to say, if the i5/i7 lines had been only a 5% increase over C2D performance for a 1/3 higher price, AMD would have destroyed them. So while AMD hasn't been "real" competition for Intel for quite some time now, they've been good enough to keep the industry trudging along.

If AMD outright left the market, there would be absolutely no incentive or real immediate threat necessitating actual improvements until desktop ARM chips actually started getting established.

Re:Because AMD still resembles a threat. (1)

Shatrat (855151) | more than 2 years ago | (#38216678)

There would still be an incentive to keep performance/price increasing enough to keep the upgrade cycle going, it just wouldn't be as strong as if they had someone else pushing that same metric in competition. They could also control exactly what rate it grows at, and hold back their outstanding new designs until demand tapered off, then release them in order to obsolete the last generation and start the next wave of upgrades.

Re:Translation (0)

lexman098 (1983842) | more than 2 years ago | (#38215814)

The Sandy Bridge 2500K is still the best bang for the buck of any CPU anywhere.

A 200 dollar CPU is the best bang for the buck? I think this statement encompasses a lot of why Intel is kicking AMD's ass.

Re:Translation (1)

edxwelch (600979) | more than 2 years ago | (#38216532)

ARM will only compete against Intel in cases where power consumption is more important than performance, i.e. netbooks and low-power servers (read: a small market).
Don't forget that a lot of apps use SSE and other x86 extensions and will have to be rewritten to run on ARM.
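As a rough illustration of that porting cost (an editorial sketch, not from the thread): the same four-float add written once against SSE and once against NEON. The wrapper macros are invented for the example; real SSE-heavy code has far more of this to translate.

    #include <stdio.h>

    #if defined(__SSE__)
    #include <xmmintrin.h>
    typedef __m128 vec4;
    #define vload(p)     _mm_loadu_ps(p)
    #define vadd(a, b)   _mm_add_ps((a), (b))
    #define vstore(p, v) _mm_storeu_ps((p), (v))
    #elif defined(__ARM_NEON)
    #include <arm_neon.h>
    typedef float32x4_t vec4;
    #define vload(p)     vld1q_f32(p)
    #define vadd(a, b)   vaddq_f32((a), (b))
    #define vstore(p, v) vst1q_f32((p), (v))
    #endif

    int main(void) {
        float a[4] = {1, 2, 3, 4}, b[4] = {5, 6, 7, 8}, c[4];
        vstore(c, vadd(vload(a), vload(b)));        /* {6, 8, 10, 12} */
        printf("%g %g %g %g\n", c[0], c[1], c[2], c[3]);
        return 0;
    }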

Re:Translation (1)

0123456 (636235) | more than 2 years ago | (#38216724)

ARM will only compete against Intel in cases where power consumption is more important than performance, i.e. netbooks and low-power servers (read: a small market).

Or an increasing number of desktop/laptop uses. At least 90% of the time my laptop and desktop systems are clocked down to the minimum clock speed because there's really nothing for the CPU to do when browsing the web or writing documents.

An awful lot of current x86 CPUs could be replaced with ARM and users would barely notice.

Re:Translation (1)

edxwelch (600979) | more than 2 years ago | (#38216906)

> users would barely notice.
They would notice once they tried to play any games or run any heavy apps like Photoshop.

Re:Translation (2)

0123456 (636235) | more than 2 years ago | (#38217140)

They would notice once they tried to play any games or run any heavy apps like Photoshop.

Which most people don't do. The most processor-intensive thing a large fraction of PC users do on a typical day is play HD video on Youtube.

Re:Translation (1)

TheRaven64 (641858) | more than 2 years ago | (#38217432)

ARM will only compete against Intel in cases where power consumption is more important than performance

And in places where ARM's performance is 'good enough'. I have a little machine with an 800MHz ARM Cortex A8. For light web browsing and word processing, it's just about good enough, but for anything heavier it isn't. I have another machine with a dual-core 1.5GHz Snapdragon (heavily tweaked A8), and it's fine for Flash-heavy web browsing and most other things including playing back streaming video. A quad-core 2GHz Cortex A9 is far more power than a large proportion of computer users currently need.

Re:Translation (1)

Andy Dodd (701) | more than 2 years ago | (#38216578)

This is pretty much what happened in the P4 days. Intel got complacent and started gouging customers, and that allowed AMD to gain HUGE amounts of market share.

Re:Translation (2, Informative)

macromorgan (2020426) | more than 2 years ago | (#38217004)

AMD SHOULD have gained huge amounts of market share, but thanks to some anti-competitive behavior by Intel that wasn't the case.

i wonder if they have thought of this: (0)

Anonymous Coward | more than 2 years ago | (#38215464)

"Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."

Maybe they could also leverage their synergies.

Radeon may save them... (1)

broken_chaos (1188549) | more than 2 years ago | (#38215488)

My understanding is that Radeon cards are still competing neck-and-neck with Nvidia's offerings these days, especially per-dollar. I may be mistaken, though, as my video card is still an 18-month-old ATI Radeon 5850 (from back before Nvidia even had a DirectX 11 card on the market, and before AMD retired the ATI brand), which can still play everything I've thrown at it on full settings at 1920x1080.

Even if their CPUs are lackluster (even at the lower price points, it would seem, where they used to be quite competitive), they may be able to survive mostly on the GPU market without too many troubles. For a while, at least.

Re:Radeon may save them... (1)

masternerdguy (2468142) | more than 2 years ago | (#38215532)

Until ATI has good Linux support I'll only be buying NVidia. Actually I've generally bought AMD CPUs and NVidia GPUs, but if AMD starts slipping then I'll be forced to go Intel.

Re:Radeon may save them... (2, Interesting)

lightknight (213164) | more than 2 years ago | (#38216142)

Linux has had better support for ATI than Nvidia cards for at least a generation now.

Re:Radeon may save them... (4, Insightful)

TheRaven64 (641858) | more than 2 years ago | (#38217448)

Not really. AMD has really crap proprietary drivers, nVidia has slightly crap proprietary drivers. AMD's open source drivers are poor, nVidia's are nonexistent. If you're willing to run a blob, nVidia's support is better. If you aren't, they both suck.

Re:Radeon may save them... (2)

Charliemopps (1157495) | more than 2 years ago | (#38215908)

I'm a Radeon guy too... always buy AMD/Radeon, but even I have to admit... Radeon cards have a lot of problems that NVidia cards just don't have. You go to nearly every major game release's support forums, and what's stickied at the top? "Radeon owners' issues click here"

Add to that Nvidia's clearly superior support for hardware-accelerated HD decoding, and really, my favorite card has some catching up to do. I spent months trying to get a Radeon card to work in my HTPC, and I think I got the hardware decoding to engage on one movie... and it was a hardware decoding demo. I ordered a $30 Nvidia card that was two years old and threw it in... bam, it just worked. I didn't even have to mess with settings. Hell, when I did mess with the settings, I had a hard time getting it NOT to engage.

Re:Radeon may save them... (1)

jandrese (485) | more than 2 years ago | (#38216064)

It has always been the case that AMD/ATI has generally had the better hardware (although it does shift whenever new chips drop), but nVidia has the better drivers. People will occasionally tell you that "yeah, that was true in the past, but ATI drivers are pretty good now", but then some new game comes out and they're crap again.

Re:Radeon may save them... (1)

Andy Dodd (701) | more than 2 years ago | (#38216638)

Yup, I have yet to ever see evidence that ATI has learned the concept of regression testing.

It seems like on a regular basis, Game X needs driver revision M or lower, and Game Y needs driver revision N or higher with ATI cards. So you're screwed if you want to play both games.

Every time I have had the misfortune of dealing with an ATI video chipset, it's been utter driver hell. NVidia does a much better job of regression testing, and they also do a MUCH better job of long term support of older chipsets. I still remember the hell I went through trying to update drivers for a friend's Dell 600M with an ATI chipset - compared to it, my own Dell laptops were ANCIENT but it was FAR easier to find up-to-date drivers for their NVidia chipsets that worked, as opposed to hours of trial and error with the ATI drivers to find one that would provide proper 3D acceleration AND drive her external monitor at a non-wacky resolution.

Re:Radeon may save them... (1)

lightknight (213164) | more than 2 years ago | (#38216200)

Hmm. AMD has been trying to topple Intel by merging the CPU and the GPU into a single unit.

Intel tends to be better in single-threaded CPU performance, while AMD has been better at offering more cores. What changes with ATI in the mix is that Intel's graphics options are something of a terrible joke (played on corporate and value customers), while ATI's video cards are sought after just as much as Nvidia's.

If AMD can offer a single chip that does both, and does it well (key factor here), with compilers that take advantage of the additional power a GPU can offer when it's right next to the CPU, it could hurt Intel in dangerous ways. On another note, the one thing ATI is known for over Nvidia is the number of stream processors it offers -> Nvidia's are more programmable, ATI has more of them. AMD chose the hardware route, which, depending on their ability to combine the two designs, may or may not pay off.

Re:Radeon may save them... (3, Insightful)

0123456 (636235) | more than 2 years ago | (#38216264)

If AMD can offer a single chip that does both, and does it well (key factor here)

You can't put a 300W GPU and a 125W CPU on the same die. At least not if you're sane.

The only uses for graphics integrated on the CPU are cheap low-end systems, or extra performance if you can offload some processing to the GPU cores. Putting a high-performance GPU there makes no sense, because you need insane cooling to get the heat out and it will be crippled by the slow, shared memory interface anyway.

Plus, of course, you can't just upgrade the GPU in two years when the CPU is still fast enough for current games but the GPU isn't; you have to replace both. CPU manufacturers might love that, but users won't.

Re:Radeon may save them... (2)

Andy Dodd (701) | more than 2 years ago | (#38216660)

It doesn't help that Intel integrated graphics made GREAT leaps forward in Sandy Bridge.

Am I missing something here? (0)

Synerg1y (2169962) | more than 2 years ago | (#38215566)

The x86 market? WTF still runs on x86? x64 was meant as the replacement, not a side-by-side architecture. It's superior in every way; shame people can't dev for it properly, but that's not the architecture's problem, and enough works on x64 that it's 99.9% impossible for me to justify x86 for anything. A few years ago I was building PCs for clients (my hobby, not profitable at all) with AMD processors for the price-point performance. More recently it's been harder and harder to justify AMD over Intel, even on eco builds. I don't know what's going on with AMD, but I would appreciate a new competitor entering the consumer market that isn't Intel or AMD and that can compete with the former on some level. Stating they are going to go the x86 route sounds like they are throwing in the towel; I don't see this as a sustainable business strategy, since every consumer desktop on display at Best Buy has been x64 for the past few years.

Re:Am I missing something here? (4, Informative)

JDG1980 (2438906) | more than 2 years ago | (#38215606)

When people say "x86" they usually include the 64-bit extensions in this category. Every x86 CPU made today, whether from Intel, AMD, or even Via, supports the AMD64 extensions.
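You can ask the CPU itself whether it has those extensions. A minimal sketch (an editorial illustration, using GCC/Clang's <cpuid.h>; the Long Mode flag is bit 29 of EDX in extended leaf 0x80000001):

    #include <cpuid.h>
    #include <stdio.h>

    int main(void) {
        unsigned eax, ebx, ecx, edx;
        /* Extended leaf 0x80000001 holds the 64-bit feature bits. */
        if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
            puts("extended CPUID leaf not available");
            return 1;
        }
        puts((edx & (1u << 29)) ? "AMD64/x86-64 long mode: supported"
                                : "32-bit only");
        return 0;
    }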

Re:Am I missing something here? (1)

0123456 (636235) | more than 2 years ago | (#38215616)

Every x86 CPU made today, whether from Intel, AMD, or even Via, supports the AMD64 extensions.

Some of the netbook Atoms didn't a year or two back; isn't that still the case today?

Re:Am I missing something here? (1)

JDG1980 (2438906) | more than 2 years ago | (#38216120)

Some of the netbook Atoms didn't a year or two back; isn't that still the case today?

As far as I can tell, the N270 (Diamondville series) was the last Atom that didn't support 64-bit. A quick Google search indicates that Intel hasn't officially discontinued it, but it seems to be almost impossible to find any new products that contain one. Newegg doesn't have any netbooks using this, though they do sell a 10-pack of Intel Atom N270 motherboards [newegg.com]. Since they don't sell individual units, I assume these may be surplus stock (and probably intended for use in embedded systems). Even if the Diamondville series is not officially discontinued, it's definitely on its way out.

Re:Am I missing something here? (1)

jones_supa (887896) | more than 2 years ago | (#38216336)

All netbook Atoms do support it now. The early N270 and N280 were the ones that didn't. Then the miniaturized Atom lines (Z, E) apparently lack it.

Re:Am I missing something here? (2)

Archangel Michael (180766) | more than 2 years ago | (#38215672)

x64 is an extension of x86. What we need is a whole new class of computers designed and built for 64-bit architectures. But that calls for a complete redesign of the most popular OS, and probably of the mobo architecture as well.

The problem is, who would want to do that?

What about ARM64? (1)

IYagami (136831) | more than 2 years ago | (#38216196)

It will be deployed in 2014:

http://www.eetimes.com/electronics-news/4230160/ARM-unveils-64-bit-architecture [eetimes.com]

"Indeed, the first processors based on ARMv8 will only be announced sometime in 2012, with actual server prototypes running on the new architecture expected in 2014."

Re:What about ARM64? (1)

TheRaven64 (641858) | more than 2 years ago | (#38217476)

I'm not sure where they got the 2014 number from. nVidia aims to have A64 parts shipping in 2012. Possibly the 2014 number is for ARMv8 cores designed by ARM, which are expected to be a bit later.

Re:Am I missing something here? (1)

lightknight (213164) | more than 2 years ago | (#38216246)

We could just breathe life back into the Alpha architecture.

As a matter of fact, I believe MS supported it right up until Windows 2000 (multiple RCs, no release).

I'd love to have an EV12 processor in my next machine. ^_^

Re:Am I missing something here? (1)

AdamJS (2466928) | more than 2 years ago | (#38215734)

"x86" in this context means desktop x86 chips, x86_64 chips and AMD64 chips.

Re:Am I missing something here? (1)

Shatrat (855151) | more than 2 years ago | (#38215738)

x86 is a subset of x86_64; or, said another way, x86_64 is an extension of x86, just like SSE or MMX but with a lot more new instructions and hardware requirements.
It sounds like everything you do is x86. The alternative to that architecture isn't x86-64; it's IA-64 (Itanium) or ARM or any of the various big-iron RISC chips.

Re:Am I missing something here? (1)

Anonymous Coward | more than 2 years ago | (#38215794)

I've been a /. reader for the past 10 years, and your ramblings are the most useless post I've ever encountered here. 99.9999% (all except you) of the readers here understand that AMD will not abandon the 64-bit arch. BTW, the correct term is x86-64.

One thing AMD has over Intel (1)

C_Kode (102755) | more than 2 years ago | (#38215592)

I just upgraded my PC from an Intel E6600 to an AMD Phenom II X6 1100T. I chose AMD for one reason: how the heat sink/fan attaches to the motherboard.

I have dogs and kids, and my PC doesn't reside protected under a desk. It gets bumped all the time from them playing, and those stupid plastic plug brackets that Intel uses to attach the heat sink and fan to the motherboard were absolute garbage. Someone would bump my PC, the heat sink would hang off, and the CPU would overheat. Not to mention that after re-attaching it a few times, the knobs break and you have to buy a new heat sink and fan.

At least AMD still uses the clamp style that clamps to the socket. There is no way that is going to come off unless you intend to take it off. I won't buy Intel again until they re-design how the heat sink attaches.

Re:One thing AMD has over Intel (2)

JDG1980 (2438906) | more than 2 years ago | (#38215638)

Why not just buy a third-party heatsink that comes with its own backplate? You don't have to use the default mounting hardware.

Re:One thing AMD has over Intel (1)

Anonymous Coward | more than 2 years ago | (#38215674)

For the last three (?) generations of Intel, the heat sink mounting mechanism has bolted through the motherboard (Pentium 4 and up). That is significantly stronger than a clamp on the CPU socket.

Translation (2)

Vyse of Arcadia (1220278) | more than 2 years ago | (#38215620)

When we contacted Silverman, he confirmed that the original statement has been taken somewhat out of context and provided additional clarification. 'AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to buzzwords buzzwords buzzwords buzzwords buzzwords buzzwords.'

Dear AMD.... (1)

Lumpy (12016) | more than 2 years ago | (#38215684)

I love you guys, but recently I have only been buying Intel i5 and i7, because your math coprocessor still stinks badly compared to Intel's. For video compression and really heavy maths I really wanted to use your 6-core processors, but they were slower than the 4-core i7 I bought instead.

Give me a 6-core that runs like a raped ape and has a really good math coprocessor, and I'll be back. Give me an 8-core that can also do multi-chip on the same motherboard, so I can build a 16-core for a cheap price, and I'll be back with a whole lot of friends.

There are lots of us who actually do real computing with really heavy math. I know you guys can do this.

Re:Dear AMD.... (4, Informative)

0123456 (636235) | more than 2 years ago | (#38215708)

There are lots of us who actually do real computing with really heavy math.

So shouldn't you be using SSE instructions rather than x87?
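What that looks like in practice, as a minimal sketch (an editorial illustration; build with gcc -msse; the function name is invented for the example): a sum-of-squares kernel processing four floats per instruction instead of going through the x87 stack.

    #include <xmmintrin.h>
    #include <stdio.h>

    /* Sum of squares of n floats (n divisible by 4), four lanes at a time. */
    static float sum_sq(const float *x, int n) {
        __m128 acc = _mm_setzero_ps();
        for (int i = 0; i < n; i += 4) {
            __m128 v = _mm_loadu_ps(x + i);
            acc = _mm_add_ps(acc, _mm_mul_ps(v, v));
        }
        float lanes[4];
        _mm_storeu_ps(lanes, acc);
        return lanes[0] + lanes[1] + lanes[2] + lanes[3];
    }

    int main(void) {
        float x[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        printf("%g\n", sum_sq(x, 8));   /* prints 204 */
        return 0;
    }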

Re:Dear AMD.... (0)

Anonymous Coward | more than 2 years ago | (#38215976)

Errr, look into Opterons maybe? 12-core processors going 4-way are over a year old. Now you have 16-core processors, 4-way.

http://www.newegg.com/Product/Product.aspx?Item=N82E16819113038 [newegg.com]

So that's a 64-core box, if you are looking at math on x86. I can't find anything in a similar price range on the Intel side that offers 64 cores in a box, or even 32.

Of course, you could just be trolling and only actually want 1-thread performance anyway.

x86 (0)

uigrad_2000 (398500) | more than 2 years ago | (#38215786)

Why do we still use this terminology (x86)?

I thought the last x86 processor produced was the Pentium Pro [wikipedia.org]

When the Pentium 4 came out, it was frequently called the "7th generation", but it was never called the 786 or 80786, either formally or informally. Sure, it's just a naming convention, but that's exactly what "x86" is: a trend in naming conventions.

My new hobby will be referring to processors as having x87 [wikipedia.org] architecture, as a distinction to indicate they support floating point instructions.

Re:x86 (0)

Anonymous Coward | more than 2 years ago | (#38216020)

Because these chips are still compatible with the original 8086?

Re:x86 (4, Insightful)

TheRaven64 (641858) | more than 2 years ago | (#38216114)

When the Pentium 4 came out, it was frequently called the "7th generation", but it was never called the 786 or 80786, either formally or informally

But they are all x86 compatible, because they can all run code compiled for 8086, 80186, 80286, 80386 and 486 processors.

My new hobby will be referring to processors as having x87 architecture, as a distinction to indicate they support floating point instructions.

People do refer to x87 when talking about the FPU on x86 chips. It's commonly used when differentiating it from SSE: modern compilers will emit SSE instructions instead of x87 ones unless you specify a backwards-compatible target architecture (PII or earlier).
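A small demonstration of that compiler behaviour (an editorial sketch; the flag spellings are GCC's, and the file name is invented). Compile the same function twice and compare the assembly:

    /* fp.c -- plain scalar floating point, no intrinsics.
     *
     *   gcc -m32 -mfpmath=387 -S fp.c        # emits x87 code (fmul/fadd)
     *   gcc -m32 -msse2 -mfpmath=sse -S fp.c # emits SSE2 code (mulsd/addsd)
     *
     * On x86-64 targets, SSE is already the default for scalar FP. */
    double mul_add(double a, double b, double c) {
        return a * b + c;
    }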

Re:x86 (0)

Anonymous Coward | more than 2 years ago | (#38216228)

Errr... because they all run the x86 instruction set? That is the most significant part of the chip with respect to software compatibility. It defines the basic instructions, memory management/address spaces, etc. (with modifications and refinements over the decades). The chips are backwards compatible and can run legacy x86 code to this day (with caveats, like 16-bit code can't run in 64-bit mode).

x87 is a relatively minor subset of the chip. No modern OS or software runs purely on x87; its origin is a separate math co-processor. You might as well call chips "SSE" because they support SSE instructions.

Shame... (3, Insightful)

Oswald McWeany (2428506) | more than 2 years ago | (#38216056)

Shame. I usually support the underdog, and always wanted AMD to be able to run neck-and-neck with Intel.

Nowadays, though, AMD seems to stand for A Mediocre Design.

I hope they can recapture their mojo and challenge Intel again, if for no other reason than to give Intel an incentive to lower prices.

Re:Shame... (1)

Shatrat (855151) | more than 2 years ago | (#38216794)

I think the problem is that they are competing with Intel to beat them in massively parallel and IO-limited applications. That doesn't look good on your average review site benchmarking the processor in Crysis or some PC-oriented synthetic benchmarks.
I think they're still going to be very popular for servers and supercomputers for a long time.

There's also the problem that (1)

AdamJS (2466928) | more than 2 years ago | (#38217362)

They don't beat the comparable Intel chips at said tasks anywhere near well enough to justify the heat and cost tradeoffs.

Inflexion point? (0)

Anonymous Coward | more than 2 years ago | (#38216610)

Frankly, I am amazed they even have a website...

Offer it to the Chinese (1)

couchslug (175151) | more than 2 years ago | (#38216732)

China needs a processor company, and even without AMD being leading-edge, if their products are sold inexpensively there is a huge potential market worldwide.

i like AMD! (0)

Anonymous Coward | more than 2 years ago | (#38217328)

Maybe I'm just slow... but I LUV my EeePC that has an AMD C-50 (dual-core) APU inside.
It sits on the table (on) and is happy with no power cord for 6+ hours.
With some Google searching and Linux voodoo it can even output (via HDMI) 1080p movies to my LCD TV.
It has 2x USB 3.0(!) ports (which is like 80% as fast as new-fangled COPPER Thunderbolt?).
It's basically my old AMD Athlon 2700+ (from 2003) plus a modern mid/low-range GPU,
which was in a Lian Li alu full-tower case... just shrunk to 10 inches. WOW!
-
Maybe AMD is betting that we HAVE the good software already (like XP?), that we don't need MORE software and hence MORE CPU grunt, but
that we want smaller computers (that use LESS electricity) to run that good (old) software? *shrug*

what intel needs to do (0)

Anonymous Coward | more than 2 years ago | (#38217400)

If Intel weren't so stubborn about x86, it could license the ARM instruction set and do what it always does: out-manufacture everyone else, this time in the ARM space, squishing TI, Qualcomm, and anyone else who's currently in the ARM mix. And when they start to dominate ARM processors, they can "take over" the ARM instruction set and bend it to their will.

I know it's evil, but it's probably the best way for Intel to leverage what it does best to stay on top of the market.
