
Slashdot: News for Nerds


AMD Preparing To Give Intel a Run For Its Money

Soulskill posted about 2 months ago | from the saddle-up dept.


jfruh writes: "AMD has never been able to match Intel for profits or scale, but a decade ago it was in front on innovation — the first to 1GHz, the first to 64-bit, the first to dual core. A lack of capital has kept the company barely holding on with cheap mid-range chips since then. But now AMD is flush with cash from its profitable business with gaming consoles, and it's preparing an ambitious new architecture for 2016, one distinct from the x86/ARM hybrid already announced."


345 comments

Only the great Master of Paper can save AMD (-1, Flamebait)

VTBlue (600055) | about 2 months ago | (#47020051)

With Papermaster running the show, I expect nothing short of excellence...on paper.

Re:Only the great Master of Paper can save AMD (4, Insightful)

binarylarry (1338699) | about 2 months ago | (#47020085)

I stick to Intel because they're the best CPU you can buy right now.

But I'd love to see AMD back in the game. I bought the first X2 Athlon series, what a beast that was.

Sadly that was also the last AMD CPU I've purchased.

Re:Only the great Master of Paper can save AMD (5, Informative)

drinkypoo (153816) | about 2 months ago | (#47020107)

I bought the first X2 Athlon series, what a beast that was.

Sadly that was also the last AMD CPU I've purchased.

The Phenom II X3 was also an absolute monster for the price, as was the Phenom II X6.

Re: Only the great Master of Paper can save AMD (2, Informative)

Anonymous Coward | about 2 months ago | (#47020261)

Got a Phenom II x4 as it was the best bang for the watts and I was building an always on multi-purpose rig.

Re:Only the great Master of Paper can save AMD (2)

haruchai (17472) | about 2 months ago | (#47020537)

AMD made it easy to upgrade incrementally; not sure if the same would have been true of Intel as I've not had an Intel desktop in over 10 yrs.

Bought an Athlon X2 with an nForce-based mainboard & DDR2 RAM in 2006.
Maxed out the RAM & upgraded to an Athlon II X4 in 2009 while keeping the same mainboard.
In 2011, bought a new 990FX-based board to get SATA 3 / USB 3 & DDR3 RAM but kept the same 4-core CPU.
Just last week, got an 8320 Black Edition 8-core at a good price and might soon get my first "AMD" videocard as an upgrade for my GeForce 9600GT.

Re:Only the great Master of Paper can save AMD (4, Informative)

drinkypoo (153816) | about 2 months ago | (#47020731)

AMD made it easy to upgrade incrementally; not sure if the same would have been true of Intel as I've not had an Intel desktop in over 10 yrs.

No, Intel changed sockets more than AMD did in that period. I got an AM3+ board, so I went from a Phenom II X3 720 to a Phenom II X6 1045T, which I still have. If you're not expecting massive single-thread performance, it is still a fairly beefy CPU. I mean, sure, half as much as an Intel chip, but I paid a hundred bucks for this (and for my original CPU) and you'd have to spend $200 to get an Intel chip with this much horsepower today. AMD-chipset motherboards are cheaper than Intel-chipset motherboards as well, so the total savings was at least $200 if not more. Today, I still have more than enough CPU for anything I want to do; it's the 240GT that's holding me back now. Been thinking about a modest upgrade to a newer Nvidia card pulled from a Dell on eBay for $60.

Re:Only the great Master of Paper can save AMD (3, Informative)

Nemyst (1383049) | about 2 months ago | (#47020707)

AMD arguably still is pretty competitive... for the price. Having to append those words to anything about AMD is what they hate, though. They can compete in the low-mid end by undercutting Intel (and even then their CPUs are usually more power hungry), but as soon as you get closer to the high end AMD's just left in the dust except for those rather rare use cases which don't need much FPU performance but can run in massively parallel systems, where their 8-core CPUs shine.

Re:Only the great Master of Paper can save AMD (1)

ganjadude (952775) | about 2 months ago | (#47020157)

I was a strong AMD guy from the K6 days up until the Phenom days. Up to that point I found better bang for my buck with AMD. It's only with the Haswell processors* that I have made the switch back to Intel.

* just happened to be the generation I was on when I was ready to upgrade my 1st-gen Phenom

Just like Bulldozer? (-1, Flamebait)

jandrese (485) | about 2 months ago | (#47020083)

Put up or shut up AMD. I don't care how awesome your chips will be in 2 years, I care how good they are now, and right now they aren't so good. 2 years is a long time in the CPU world, I hope your gains aren't completely eaten away before you launch the chip, again.

This is really a low quality article. "In the distant future we plan to release faster chips!" What a scoop!

Re:Just like Bulldozer? (4, Informative)

thaylin (555395) | about 2 months ago | (#47020135)

The story is more about the comeback of their great designer, who has done a lot, and how they are betting on him.

RTFA (3, Insightful)

Anonymous Coward | about 2 months ago | (#47020139)

Did you RTFA?

His task will be a new microarchitecture to overcome some of the shortcomings in AMD's current generation microarchitecture, called Bulldozer. Bulldozer adopted clustered multi-thread (CMT) designs[...] Bulldozer is inherently less efficient than Intel's chips[...]What Keller will do, no one knows. And AMD would be nuts to tip its hand. The most logical move for Keller would be to dump the CMT design in favor of a design with simultaneous multi-threading (SMT)

More of the same? Probably not.

Re:RTFA (0, Flamebait)

jandrese (485) | about 2 months ago | (#47020463)

"The current design is bad, are you going to use a better design?"
"We won't say, it's a secret!"

What a compelling article.

Re:Just like Bulldozer? (1)

Anonymous Coward | about 2 months ago | (#47020155)

All the article is saying is that now they have the capital for R&D so they expect to be able to push new territory that they otherwise wouldn't have. Intel will just match this new capital expenditure level, and everyone will be happy.

Re:Just like Bulldozer? (0)

Anonymous Coward | about 2 months ago | (#47020203)

All the article is saying is that now they have the capital for R&D so they expect to be able to push new territory that they otherwise wouldn't have.

Citation needed. Last I looked, Intel's R&D budget was larger than AMD's revenue.

Re:Just like Bulldozer? (4, Interesting)

TheRaven64 (641858) | about 2 months ago | (#47020227)

Last I looked, Intel's R&D budget was larger than AMD's revenue

That certainly was true (probably still is), but it's misleading. AMD no longer owns fabs, and the majority of Intel's R&D spending is on process technology. By spinning off GlobalFoundries, AMD is able to share that R&D cost with other SoC makers and go to other companies if they happen to be able to do it better at a specific time.

Re:Just like Bulldozer? (0)

Anonymous Coward | about 2 months ago | (#47020311)

Good point. Would be interesting to know how their CPU R&D budgets compare, though AMD will still struggle if Intel continues to be at least one generation ahead of the fabs AMD are using.

Re:Just like Bulldozer? (1)

rogoshen1 (2922505) | about 2 months ago | (#47020359)

how is spinning off your fabrication capability 'good' in the long run? (not trying to be flippant, it's a serious question)

Re:Just like Bulldozer? (5, Interesting)

AdamHaun (43173) | about 2 months ago | (#47020589)

how is spinning off your fabrication capability 'good' in the long run?

I don't work at AMD, but I do work at another company that relies partly on foundries.

Basically, it's economies of scale and competition. Semiconductor fabrication processes keep getting more expensive. Foundries specialize in process development and spread the R&D across many, many customers. Unless you're willing to spend a fortune keeping up (as Intel is), have special requirements, or need a ton of volume, you have little to gain and a lot to lose from rolling your own process. Remember, you don't just have to make transistors; you also have to have good enough yield to turn a profit and good enough reliability to keep your customers. If you fail, you have to spend even more money to fix the fab on top of the money you're losing on the stuff you manufacture. Meanwhile, TSMC is cheerfully cranking out wafers for your competitors.

Re:Just like Bulldozer? (1)

haruchai (17472) | about 2 months ago | (#47020609)

That's a good question. After all, Intel was able to use their vastly superior fab capabilities to fend off AMD's enhanced tech for years until they released their Nehalem architecture to definitively take back the desktop CPU performance crown.

They're not called ChipZilla for nothing.

Re:Just like Bulldozer? (1)

drinkypoo (153816) | about 2 months ago | (#47020751)

Intel does that by spending massive scads of money on process technology. They are able to spend that because they massively overcharge for their product if you take only the design, manufacturing, and distribution costs into account; you also have to count what they pay for their process development.

AMD let a bunch of designers go; it's not clear whether Intel would have eaten their lunch so aggressively otherwise. Guess we'll have some inkling here soon.

Re:Just like Bulldozer? (0)

Anonymous Coward | about 2 months ago | (#47020619)

It isn't, but they needed to survive and the fabs were draining too much cash.

Re:Just like Bulldozer? (1)

bhcompy (1877290) | about 2 months ago | (#47020451)

Last year I had an R&D budget of $1. This year I have an R&D budget of $100. It doesn't matter what my competitor has, it matters that I can now do a lot more research and development.

Re:Just like Bulldozer? (0)

Anonymous Coward | about 2 months ago | (#47020471)

But Intel has been spending on research. AMD gave up a while ago, and settled for making money on mobile devices.

Re:Just like Bulldozer? (1)

Anonymous Coward | about 2 months ago | (#47020221)

Yup, and the BS about them being first to 64-bit...maybe in the consumer sector, but Intel, IBM and DEC all had 64-bit chips before the Athlon was even designed let alone shipped.

Re:Just like Bulldozer? (4, Insightful)

Rich0 (548339) | about 2 months ago | (#47020271)

Yup, and the BS about them being first to 64-bit...maybe in the consumer sector, but Intel, IBM and DEC all had 64-bit chips before the Athlon was even designed let alone shipped.

They invented the architecture that you probably typed your post on. That was the point. Heck, on my Linux distro it is still called amd64...

Re:Just like Bulldozer? (0)

Anonymous Coward | about 2 months ago | (#47020539)

Oh! The mighty Linux says? Wow. This must be fact!

Re:Just like Bulldozer? (1)

Anonymous Coward | about 2 months ago | (#47020653)

Not the architecture; that belongs to Intel. AMD extended it to support 64 bits. Intel actually had a 64-bit extension in the closet; they didn't want to cannibalize the Itanium. When the AMD extensions became mainstream, the de facto standard, Intel licensed it. But the extension is still tied to the Intel architecture and Intel is still in control.

Re:Just like Bulldozer? (5, Informative)

drinkypoo (153816) | about 2 months ago | (#47020761)

Not the architecture, that belongs to Intel, AMD extended it to support 64 bits.

What are you on about? amd64 is not an architecture, nor is x86. They are instruction sets. The underlying architecture may be informed by the instruction set, but it's also only loosely coupled in modern CPUs.

Re:Just like Bulldozer? (1)

bhcompy (1877290) | about 2 months ago | (#47020469)

Yes, and Itanium is dead. Alpha is gone. And I don't even know what IBM did. Meanwhile, x86-64 is here to stay.

Re:Just like Bulldozer? (0)

bobbied (2522392) | about 2 months ago | (#47020807)

Yes, and Itanium is dead. Alpha is gone. And I don't even know what IBM did. Meanwhile, x86-64 is here to stay.

And we are all better off? Um, NOPE!

The ONLY thing that kept x86-64 going was that it ran x86 instructions. As a processor family, x86 is really horrid from both the electrical and programming sides. Lucky for the motherboard makers, most of the sticky "design" is abstracted away by the chipsets, but for the programmer, the architecture was not designed to be expandable and certainly didn't port well into 64 bits.

If you ask me, Alpha and other processors were of better design. My personal favorite was the Motorola 68000 (aka power PC) stuff. That was an engineer's dream to work with from both the hardware and software sides of things. Certainly the instruction set and programming model were a whole lot better than what we generally see today...

So why did x86 win out? You can thank both IBM and Bill Gates for that. (I suppose you could blame Motorola for not accepting the license terms too.) It is the processor in the PC, which has been sold by the millions, making production costs lower. Apple Mac stuff was initially built on Motorola CPUs, but finally even Apple had to ditch them for cost. x86 is in use for business and historical reasons, not because it was the right choice technically.

Re:Just like Bulldozer? (0)

Anonymous Coward | about 2 months ago | (#47020891)

Motorola 68000 and PowerPC have nothing whatever to do with each other except that Apple used both in its products and Motorola was involved in the design of both processors.

Re:Just like Bulldozer? (3, Interesting)

kasperd (592156) | about 2 months ago | (#47020641)

Yup, and the BS about them being first to 64-bit...maybe in the consumer sector, but Intel, IBM and DEC all had 64-bit chips before the Athlon was even designed let alone shipped.

That is true. However, AMD was the first to make a 64-bit architecture that was x86-compatible, and it was also the first 64-bit CPU in a price range acceptable to the average consumer. But most importantly, AMD designed an architecture so successful that Intel decided to make its own AMD-compatible CPU. Today Intel probably earns most of its money on CPUs using AMD's 64-bit design.

But if AMD now wants to go and build an entirely new design that is nothing like x86, it may very well be repeating the exact same mistake Intel made that let AMD64 take the lead.

By now it might be safe to ditch all 8, 16, and 32 bit backwards compatibility with the x86 family. But AMD64 compatibility is too important to ignore.
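As an aside (not from the article): on Linux, AMD64 capability shows up as the "lm" (long mode) flag in /proc/cpuinfo. A minimal sketch of checking for it, using a made-up cpuinfo excerpt rather than a live machine (the sample text and helper name are illustrative):

```python
# Detecting AMD64 ("long mode") support from x86 CPU flags.
# The sample below is a hypothetical /proc/cpuinfo excerpt.

SAMPLE_CPUINFO = """\
model name : AMD Phenom(tm) II X6 1045T Processor
flags      : fpu vme de pse tsc msr pae mce cx8 sse sse2 ht syscall nx lm sse4a
"""

def supports_amd64(cpuinfo_text: str) -> bool:
    """True if the 'lm' (long mode) flag appears as a token in the flags line."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # Split into individual flag tokens so 'lm' can't match inside
            # another flag name.
            return "lm" in line.split(":", 1)[1].split()
    return False

print(supports_amd64(SAMPLE_CPUINFO))  # prints True for a 64-bit-capable part
```

On a real system you would read the text from /proc/cpuinfo instead of a string constant.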

Re:Just like Bulldozer? (0)

Anonymous Coward | about 2 months ago | (#47020809)

Who was the first search engine? No one cares, and most don't know. Being first is nice; staying first keeps you in business.

Re:Just like Bulldozer? (5, Insightful)

oh_my_080980980 (773867) | about 2 months ago | (#47020969)

I think the point was that even with Intel's massive cash and infrastructure they couldn't bring 64-bit to the desktop - hell, they couldn't do it on the server end either; the Itanium chips were huge flops. And what killed Itanium was AMD's chip!

" Itanium failed to make significant inroads against IA-32 or RISC, and then suffered from the successful introduction of x86-64 based systems into the high-end server market, systems which were more compatible with the older x86 applications." http://en.wikipedia.org/wiki/I... [wikipedia.org]

So the point is that AMD was more than capable of producing a chip to beat Intel.

Re:Just like Bulldozer? (1)

Anonymous Coward | about 2 months ago | (#47020263)

Where are the same demands for realism when it's a 3D printing story? There it's always "in 50 years we'll 3D print houses on Mars!" and everyone sucks each other's dicks!

No kidding (4, Insightful)

Sycraft-fu (314770) | about 2 months ago | (#47020275)

I would -love- to see AMD truly competitive with Intel on every level because it is only good for us consumers. It would be great if both companies made chips so fast, efficient, stable, and capable that you didn't buy AMD or Intel based on anything but who had the better deal that week.

However I'm not interested in hype and bullshit. As you say, "put up or shut up." I get tired of hearing about how great your shit will be in the future. Guess what? Intel's shit will be great in the future too, probably. It is great right now.

So less with the hype, more with the making a good CPU.

Just like Bulldozer? (0)

Anonymous Coward | about 2 months ago | (#47020421)

Don't like news about what tech will be like in 2 years? Then why do you read slashdot?

Re:Just like Bulldozer? (1)

jandrese (485) | about 2 months ago | (#47020477)

Because I want the news from last week of course.

Re:Just like Bulldozer? (1)

oh_my_080980980 (773867) | about 2 months ago | (#47020985)

You mean like Windows phones.

Re:Just like Bulldozer? (4, Insightful)

timeOday (582209) | about 2 months ago | (#47020901)

2 years is a long time in the CPU world

Well, not so long as it used to be. I recently got a Macbook Pro and under "About This Mac / Processor" it says "2.3 GHz Intel Core i7" - the same thing it says on a Macbook Pro I got 3 years ago. The CPU is not actually identical of course - it has much-improved battery life, which is good. But the performance increase, if any, is not noticeable. Times really have changed.

First to 64-bit (0, Informative)

Anonymous Coward | about 2 months ago | (#47020119)

You mean first to x86-64. Intel had a 64-bit processor before that (Itanium). 13 years later, Itanium is dead and x86 is holding us back, so much so that servers are turning towards ARMv8 (an inferior design to Itanium, but with tons of momentum from mobile/embedded).

Re:First to 64-bit (1)

danomac (1032160) | about 2 months ago | (#47020439)

First in the sense that Apple made the first tablet. ;-)

Re:First to 64-bit (1)

toejam13 (958243) | about 2 months ago | (#47020517)

While the Itanium ISA may be dying, a lot of the redundancy, error detection and error handling from Itanium has made its way into the Xeon line.

And x86-64 isn't holding all of us back. Different processor architectures excel in different places. ARM is good for mobile and clusters. x86-64 is good for desktop and big iron.

Re:First to 64-bit (3, Informative)

jon3k (691256) | about 2 months ago | (#47020723)

ARM servers excel at some workloads. The only people looking at ARM are people running massive web server farms. Xeon still crushes ARM in performance per watt and absolute performance for almost every server workload. It's definitely an interesting area and somewhere I expect ARM to really grow but right now it's not much of a contest.

Re:First to 64-bit (4, Insightful)

bobbied (2522392) | about 2 months ago | (#47020897)

You mean first to x86-64. Intel had a 64-bit processor before that (Itanium). 13 years later, Itanium is dead and x86 is holding us back, so much that servers are turning towards ARMv8 (inferior design to Itanium, but tons of momentum from mobile/embedded).

You do realize that this run towards ARM is not a full stampede; it's driven by price and operating costs, and only useful for Unix/Linux systems, as Windows Server isn't really interested in supporting ARM yet. This is more like a trickle of some large specialized systems onto Red Hat (or similar) systems, where one can afford to just change processors and recompile everything in an effort to save a bit on operating power and hardware costs. But you have to be looking at enough servers to make it worth the labor cost.

So, while I don't care for the x86 family and would love everybody to switch to ARM, I know it's not going to happen in my career without that "killer" app that pushes everybody off of Windows. Right now, with "Office" being the "killer, got to have" application of all time, and it generally only running on Windows, guess what? x86 is here to stay.

Intel Inside, innovation outside. (1, Insightful)

Anonymous Coward | about 2 months ago | (#47020147)

It was never technology or the lack of it that kept AMD out and Intel in. It was clever marketing, FUD, and plain customer ignorance. The "Intel Inside" ads and the "what if something is not compatible with AMD" feeling that the marketing gurus created kept Intel on top, outselling even superior AMD chips. The real Intel killers are the ARM processors and mobile computing that are giving Intel a run for its money. This is what happens if you refuse to innovate!

Re:Intel Inside, innovation outside. (1)

Anonymous Coward | about 2 months ago | (#47020299)

There were also the contracts PC makers signed with Intel, keeping the chips out of the wildly selling (at the time) "Dude, you're getting a Dell" market.

Re:Intel Inside, innovation outside. (2)

Defenestrar (1773808) | about 2 months ago | (#47020379)

Yeah? Well, the original Athlon 64 dual core isn't compatible with modern Windows operating systems. [neowin.net] The required instruction was around at the time; AMD just didn't put it in.

Buh? (4, Interesting)

drinkypoo (153816) | about 2 months ago | (#47020151)

But the real fight of a decade ago, when AMD was first to 1GHz, the first to 64-bit, the first to dual core, seemed missing. It's not surprising since the company was facing a real threat to its survival. But with a gravy train from the gaming consoles, it looks like the company is ready for a fresh battle, with a familiar face at the helm.

Uh, wait. No. It was surprising when AMD was the performance leader. It was surprising because they were broke. It's not surprising to see AMD pushing out a new architecture now that they have money. It takes a lot of money to do that. So we start out completely ass-backwards here.

Much elided, then

The most logical move for Keller would be to dump the CMT design in favor of a design with simultaneous multi-threading (SMT), which is what Intel does (and IBM's Power and Oracle's Sparc line).

Wait, what? Why? Why wouldn't it make more sense to just fix the lack of FP performance, perhaps by adding more FP units? Why would it make more sense for them to go to a completely different design? It might well, but there is no supporting evidence for that in the article.

Re:Buh? (1)

dagamer34 (1012833) | about 2 months ago | (#47020245)

It wasn't that surprising that AMD was king around 2003-2004, the problem was that Intel was playing very dirty, signing deals with OEMs like Dell to specifically NOT use AMD chips. The fines Intel got from the EU are never going to do as much to help AMD as actually gaining more profits during that period (and who knows, they may not have sold their mobile Radeon group to Qualcomm in an effort to raise cash). It's the domino effect of unknowns that hurts the most.

Re:Buh? (3, Insightful)

jandrese (485) | about 2 months ago | (#47020363)

AMD was dominant while Intel was chasing dead ends (Netburst and Itanium). Once Intel woke up and started working on sane chip designs again AMD's goose was cooked. They just can't compete with Intel's R&D budget. Plus, AMD made some boneheaded decisions of their own, like firing a bunch of their R&D staff in the belief that computer automated chip layout would prove superior to human designed layouts.

Re:Buh? (1)

drinkypoo (153816) | about 2 months ago | (#47020503)

AMD was dominant while Intel was chasing dead ends (Netburst and Itanium). Once Intel woke up and started working on sane chip designs again AMD's goose was cooked. They just can't compete with Intel's R&D budget.

Well, that was my point: AMD can afford to have an R&D budget right now. But you're right, Intel spent a lot of time dicking around with nonsensical architectures, time they might well have spent crushing AMD sooner. On the flip side of that, though, is the question of whether they could have actually been more effective. Too many cooks, and all that. Spending more money doesn't necessarily result in getting where you want to go sooner. You tend to go somewhere, but not necessarily in your chosen direction. Itanic illustrates that better than anything else mentioned so far in this discussion.

Plus, AMD made some boneheaded decisions of their own, like firing a bunch of their R&D staff in the belief that computer automated chip layout would prove superior to human designed layouts.

Yeah, I think that will eventually happen, but not now. Right now, human insight is still king. On the other hand, maybe they spent their staff budget on R&D :)

I don't think Intel is immune to competition. AMD with some cash is still dangerous to them. I won't hold my breath, but I would like to see them remain relevant.

I would also like to see them figure out graphics drivers, so I can buy their integrated processors...

I'm Still Rooting for AMD (5, Interesting)

Jaborandy (96182) | about 2 months ago | (#47020175)

I was so proud of them when they kicked IA64's ass with their amd64 architecture, beating Intel at their own game by choosing to be x86-compatible when even Intel didn't go that way. Then I was sad when amd64 started getting called x64, since it stripped AMD of the credit they deserved. Go AMD! A world without strong competition for Intel would be very bad for consumers.

Re:I'm Still Rooting for AMD (1)

jandrese (485) | about 2 months ago | (#47020391)

I have to admit, I found the AMD64 moniker a little confusing when I first read it. I had to Google around to make sure my Core2 chip would support it and that it wasn't using some AMD proprietary extension on top of the 64-bit extension. x86_64 is less confusing, even if it is more awkward to type out.

Re:I'm Still Rooting for AMD (0)

maz2331 (1104901) | about 2 months ago | (#47020429)

AMD has historically made huge gains in a stair-step fashion where they are flat or declining for a long period, then leapfrog way ahead of Intel, who then catch up and pass them in a more linear fashion.

Re:I'm Still Rooting for AMD (2, Informative)

Anonymous Coward | about 2 months ago | (#47020535)

As others have pointed out, AMD have historically beaten Intel when Intel fscks up. Intel needed the P4 to keep ramping up clock speed because it had sucky IPC, and that hit a brick wall, so AMD beat them because they had significantly better IPC at similar clock speeds. Intel wanted everyone to switch to Itanium, so that was their 64-bit push, while AMD pushed 64-bit into x86.

As soon as Intel realized they needed good IPC on a 64-bit x86 you couldn't fry eggs on, AMD was back to second place, and has been stuck there since.
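The IPC-versus-clock tradeoff described above can be sketched with a back-of-the-envelope model: throughput scales roughly as IPC times clock. The IPC and clock figures below are made-up round numbers for illustration, not measured values from any real P4 or Athlon:

```python
# Why higher IPC beat raw clock speed in the P4 era (illustrative model).

def relative_performance(ipc: float, clock_ghz: float) -> float:
    """Notional throughput: instructions per cycle times cycles per second."""
    return ipc * clock_ghz

p4     = relative_performance(ipc=1.0, clock_ghz=3.0)  # deep pipeline, low IPC
athlon = relative_performance(ipc=1.5, clock_ghz=2.2)  # shorter pipeline, higher IPC

print(f"{p4:.1f} vs {athlon:.1f}")  # prints "3.0 vs 3.3": the lower-clocked chip wins
```

The same arithmetic explains the brick wall: once clock speed stops scaling, the only lever left is IPC.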

AMD Atanium (0)

Anonymous Coward | about 2 months ago | (#47020177)

Come on, you know you want AMD to produce something not-backwards-compatible. The drama, it would be so entertaining.

Re:AMD Atanium (2)

viperidaenz (2515578) | about 2 months ago | (#47020555)

Liquidation isn't entertaining...

Drivers? (5, Insightful)

Bigbutt (65939) | about 2 months ago | (#47020179)

Honestly they need a better team writing the drivers. You can have the best CPU/GPU in the industry but if the drivers suck, no one will want to buy them.

[John]

Re:Drivers? (1)

Jeff Flanagan (2981883) | about 2 months ago | (#47020239)

I've had plenty of issues with AMD video drivers, though those problems seem to be behind them. But is there really a problem with their CPU/chipset drivers?

Re:Drivers? (0)

Anonymous Coward | about 2 months ago | (#47020293)

If they keep dumping code into the open source driver, and linux gaming takes off, it's moot what they do.

They're fools (2)

Baldrson (78598) | about 2 months ago | (#47020181)

This was their opportunity to dominate the CPU market with the Mill CPU architecture [millcomputing.com] and they blew it.

Re:They're fools (0)

Anonymous Coward | about 2 months ago | (#47020303)

Actually, they'd be fools if they tried to be the most dominant processor manufacturer instead of being the best processor manufacturer... but that kind of logic doesn't work in most people's minds.

Re:They're fools (0)

Anonymous Coward | about 2 months ago | (#47020491)

This was their opportunity to dominate the CPU market with the MIll CPU architecture [millcomputing.com] and they blew it.

So... you pitched AMD on your unknown, untested architecture that has never been implemented and only exists on paper... and they went with ARM instead? The fools!

I mean, it's not like ARM has mature compilers (multiple), OS support, strong developer mindshare, and tons of performance data from real silicon to drive CPU design tradeoffs, like Mill already does... oh... wait a minute...

2 years is a long time to wait (1)

edxwelch (600979) | about 2 months ago | (#47020229)

I think profits will come from consoles, GPUs, and low-end APUs for the time being.

wrong (4, Interesting)

Charliemopps (1157495) | about 2 months ago | (#47020255)

Sorry AMD, you're heading in the completely wrong direction. CPUs are already plenty fast. They have been for years. 3D gaming is starting to look like just another "Gold plated speaker wire" guy hobby as everyone moves to mobile devices.

The real winners in the future are going to be the very cheap, very efficient chips. Do you want one very powerful computer to run everything in your house? Or do you want everything in your house to have its own dedicated, highly efficient CPU that does just what that device needs?

Re:wrong (1)

Bruinwar (1034968) | about 2 months ago | (#47020369)

Why do I still wait if my CPU is "plenty fast" enough? My dream is to some day work on a machine that waits for my input rather than me always waiting for it. This means faster everything, including the CPU.

Re:wrong (-1)

Anonymous Coward | about 2 months ago | (#47020977)

I'm not sure about that. You come off seeming pretty slow. Maybe a touch of Down syndrome or a heavy diet of paint chips. Do your eyes point in the same direction? Just curious.

Re:wrong (1)

mlts (1038732) | about 2 months ago | (#47020397)

What AMD should consider are FPGAs and different power cores on the same die. This isn't anything new, but done right, it can go a long way in the server room.

The FPGAs can be used for almost anything. Need a virtual CPU for AES array shifting? Got it. Need something specialized for FFT work? Easily done. Different power-utilization cores would be ideal for a server room where most of the hosts see peak load during the day, then end up idle after quitting time.

Re:wrong (1)

viperidaenz (2515578) | about 2 months ago | (#47020631)

Anything with an FPGA is always going to be in a niche market.
It won't let you multi-task, since you can't reprogram it whenever you context switch.

Users don't want messages saying they need to wait for X to finish before they start Y.

FPGAs are also very expensive because they use a lot of silicon, and they consume a lot of power too.

Re:wrong (0)

Anonymous Coward | about 2 months ago | (#47020425)

I heard a saying that the most reliable way of making money in a gold rush is to sell picks and shovels. The 21st-century equivalent is probably that in a tech bubble the most reliable way to make money is selling servers and network hardware. I would guess AMD is chasing the server-farm market.

Re:wrong (0)

Anonymous Coward | about 2 months ago | (#47020485)

The real winners in the future are going to be the very cheap, very efficient chips.

Maybe they should hire you as CEO? Or maybe that's why they are working with ARM as mentioned in the summary?

Re:wrong (1)

Anonymous Coward | about 2 months ago | (#47020711)

ladies and gentlemen - an idiot (or just poorly informed with a penchant to draw premature conclusions... so I suppose... an idiot).

"CPUs are already plenty fast"

feel like ive heard this before... then this stuff called software comes out and makes CPUs feel not so fast...

"3D gaming is starting to look like just another "Gold plated speaker wire" guy hobby as everyone moves to mobile devices"

lol ok well gold plating has a specific anti-corrosive usage, so... not the best comparison. and uhhh.. angry birds, for a lot of people, isn't quite on par with say, mass effect. a bit like saying 'why watch a movie when I have all these digital shorts?!' and while luddites (or old people) might still cling to the near-sighted notion that 720p is "good enough", 4k and 8k have objective benefits, and require tons of vram bandwidth to run relative to games...

"The real winners in the future are going to be the very cheap, very efficient chips"

cheap and fast is best? YOU DONT SAY! the real winners will be whoever can adequately run the in-vogue software the cheapest. derp.

Re:wrong (1)

Loki_1929 (550940) | about 2 months ago | (#47020911)

CPUs are already plenty fast. They have been for years.

Incorrect. CPUs are plenty fast, and have been for years, for many common tasks. The fact is that they aren't nearly fast enough (particularly for single-threaded work) and almost certainly won't be for another decade or more. There's a limit to what and how much you can multi-thread, and even then you're still bounded by single-thread performance × number of threads.

So yes, for grandma playing Blackjack on Yahoo, today's CPUs are plenty fast. For me and many others? The fastest stuff available is 100x slower than "fast enough".

Do you want one very powerful computer to run everything in your house? Or do you want everything in your house to have its own dedicated, highly efficient CPU that does just what that device needs?

I want computers (and servers, especially) which are able to perform their particular function without me having to wait on them. Ever. I want usable speech recognition feeding into a responsive AI that behaves as expected without delay (and God help you if you answer "Siri" to this). I want Eve Online to be able to stick 50,000 ships in one fight with full collision and damage physics modeling with zero lag. I want to be able to transcode, store, tag, and index 20 hours of home movies and a year's worth of pictures without waiting. I want to run realtime and faster-than-realtime simulations of complex systems.

Are these common, everyday needs? More so than you might think. A lot of back-end servers struggle to keep up with workloads that expand or change over time. While much of what's right in front of your eyes seems pretty happy with the CPU that's there today, there's a lot of stuff happening behind the scenes that isn't. This forces server admins and developers to spend inordinate amounts of time, money, and cranial energy figuring out how to make it all functional, given the limited computing power available.

A lot of things need very little power, and they should have very little computers with very little CPUs to make them go. Some things - things you don't think about - need tons of power, either serially or just overall. I'd pay good money if Intel and AMD would stick with 4-12 cores and concentrate on making those cores enormously powerful. As it is, they're risking going the route of SPARC, and obviously that isn't working out well for SPARC. Interestingly enough, Oracle's trying to make SPARC more like x86 even as Intel and AMD are trying to make x86 more like SPARC.
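The single-thread ceiling described above is essentially Amdahl's law. As a purely illustrative back-of-the-envelope sketch (the 90% figure is a hypothetical workload, not anything from the thread):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: if a fraction p of the work parallelizes perfectly,
    the best possible speedup on n cores is 1 / ((1 - p) + p / n)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even with 90% of a workload parallelized, the serial 10% caps the
# speedup at 10x no matter how many cores you throw at it:
for cores in (4, 12, 64, 1_000_000):
    print(cores, round(amdahl_speedup(0.9, cores), 2))
```

Hence the argument for fewer, enormously powerful cores: once the serial fraction dominates, adding cores stops helping and only faster single-thread performance moves the needle.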

Best low-cost CPU with half-decent GPU? (1)

ArcadeMan (2766669) | about 2 months ago | (#47020257)

I'm looking at the new Intel G3240 with Intel HD 4000, and I was wondering if something around the same price range ($70 CAD) from AMD had an equivalent CPU with a better GPU.

Re:Best low-cost CPU with half-decent GPU? (1)

ArcadeMan (2766669) | about 2 months ago | (#47020401)

I'm not even sure the G3240 comes with the HD 4000 because Intel makes it near impossible to know which GPU is used inside a lot of their CPUs, listing only "Intel HD".

Re:Best low-cost CPU with half-decent GPU? (1)

jandrese (485) | about 2 months ago | (#47020437)

I don't know about your case specifically, but the rule of thumb is that for low to mid range stuff you can get an AMD solution for about the same price that is going to have a slower CPU and faster GPU. It's pretty easy to beat a HD4000 GPU in any case. Of course this shoehorns you in a bit. If you went with the Intel solution, then you could drop a discrete GPU in later. If you go with a CPU that is too slow you often have to change more of the base system to upgrade (memory and motherboard).

Re:Best low-cost CPU with half-decent GPU? (1)

ArcadeMan (2766669) | about 2 months ago | (#47020645)

(reply for both jandrese and washu_k)

Thank you for your comments. I guess I'll go with the G3240 since it's a better CPU, endure the Intel HD GPU for now and add a GTX 750TI later.

Re:Best low-cost CPU with half-decent GPU? (0)

Anonymous Coward | about 2 months ago | (#47020933)

It would also depend on whether you felt it was OK to overclock. For roughly the same price, you can get an unlocked FM2 processor and overclock the X86 cores. Since the G3240 isn't hyperthreaded and the CPU clock is locked, you get the benefit of being able to run 4 threads at a higher clock speed.

To be completely honest, the performance difference at this level is minimal. Depending on the work you put the computer through, you are barely going to notice the difference if all you plan on doing is surfing the web, checking email, working on a couple of office documents, or writing a couple of scripts. And the money you didn't spend on an Intel CPU can go into more memory, storage, or even a better discrete GPU.

Re:Best low-cost CPU with half-decent GPU? (1)

washu_k (1628007) | about 2 months ago | (#47020493)

No, there really isn't an equivalent. Which is more important, CPU power or GPU power?

The closest AMD part in price with a GPU is the A6-6400K. It would be quite a bit better in the GPU department, but MASSIVELY worse in the CPU department; not even close in CPU power. To get something that won't cripple you on CPU you would need to go up to the A8-6600K, but that is over $110 at the Canadian stores I checked, and it would still be way worse in single-threaded CPU performance.

There are also the new Kabini CPUs, and the top end of those, the Athlon 5350, is around $70. It would save you money on the motherboard (AM1 boards are cheap), but it would be even worse than the A6-6400K in CPU and might not even match the G3240 in GPU.

Re:Best low-cost CPU with half-decent GPU? (1)

ChrisSlicks (2727947) | about 2 months ago | (#47020703)

AMD A6-5400K. 3.6GHz (3.8 Turbo) and Radeon HD 7540D. $65

As others have said, it is a slower processor than the Intel but with faster graphics. The AMD only gets a 2100 CPU Mark (PassMark software), which is about the same as an old Phenom II X2 or a few-year-old Intel i3 mobile chip.

I assembled it as a low-end system for a parent that basically does email and web surfing along with some basic image editing and cheesy games.

Hmm, 1 2? (0)

Anonymous Coward | about 2 months ago | (#47020259)

With OpenPOWER systems coming from IBM with such great performance numbers, AMD jumping back hard into the arena at this time might be good timing. ARM/mobile is growing; everything else is shrinking at this point. It's going to be an interesting couple of years with the Tegra chips in there.

Compaq was afraid to use AMD chips FOR FREE (4, Interesting)

Kartu (1490911) | about 2 months ago | (#47020267)

Compaq was afraid to use AMD chips given out for free, because Intel would "retaliate", ok?
What kept AMD's market share low was not "clever marketing" of its competitor, it's crime.

Back in P4 Prescott times, Intel's more expensive, more power-hungry, yet slower chips outsold AMD's 3 or 4 to 1.
Not being able to profit even with superior products: it's really astonishing to see AMD still afloat.

Free? Keep in mind they'd lose Intel Payola (2)

rsborg (111459) | about 2 months ago | (#47020965)

Compaq was afraid to use AMD chips given out for free, because Intel would "retaliate", ok?
What kept AMD's market share low was not "clever marketing" of its competitor, it's crime.

Back in P4 Prescott times, Intel's more expensive, more power-hungry, yet slower chips outsold AMD's 3 or 4 to 1.
Not being able to profit even with superior products: it's really astonishing to see AMD still afloat.

Intel's Payola [1] (which basically kept Dell profitable for several quarters of the past decade) is something you have to factor in when looking at these "deals". I'm just sad that Intel didn't pay a bigger price for their purely anticompetitive corrupt practices.

[1] http://www.theatlantic.com/tec... [theatlantic.com]

target foot acquired! (0)

slashmydots (2189826) | about 2 months ago | (#47020279)

Way to shoot yourself in the foot, AMD. I don't want or need a new architecture. I want x86 (and x64) for my PC and laptop, the end. In 2016, you'll have some stupid new chip that won't work in any PC or laptop in the world, just in time for people to realize that tablets are netbooks with no keyboard and a higher failure rate, and to stop buying them. I don't think they will make a 1/8-watt cell phone chip when they're this far behind, so they're making a completely useless product that nobody will use in anything.

Re:target foot acquired! (1)

NotInHere (3654617) | about 2 months ago | (#47020431)

Intel tried to obsolete x86 with its IA-64 architecture, and it failed. Old architectures are like old software: lots of backwards compatibility, and a huge mess. And x86 /is/ old; it still carries features from the 1974 8080 chip. It is time for an architecture to succeed x86 and remove that clutter. However, with a new architecture AMD takes a high risk: the PC success of their architecture depends on whether Microsoft wants to support three architectures or not.

Re:target foot acquired! (1)

viperidaenz (2515578) | about 2 months ago | (#47020673)

Windows NT used to run on DEC Alpha.

AMD lost money in 2013 and 2014Q1 (1, Informative)

Anonymous Coward | about 2 months ago | (#47020333)

now AMD is flush with cash from its profitable business with gaming consoles

And that cash is going to other parts of the business to offset their losses. AMD lost $83M last year and $20M last quarter. At this point, the only part of the business that is viable is the graphics division; they just need better drivers. AMD is grasping at straws at this point. Personally, I think they should ditch their x86 products; those are dragging them down.

Re:AMD lost money in 2013 and 2014Q1 (1)

viperidaenz (2515578) | about 2 months ago | (#47020687)

Why do they need to make better drivers? Their graphics division makes all its money selling GPUs to Microsoft and Sony. No driver issues there.

ha ha (1)

p51d007 (656414) | about 2 months ago | (#47020383)

I remember when the AMD K6 came out. "AMD is preparing to give Intel's pentium a run for its money with the new K6".

Re:ha ha (0)

Anonymous Coward | about 2 months ago | (#47020593)

Except the K6 was competing with (and in some benchmarks beating) the Pentium II, not the Pentium.

And with the K7 (Athlon), AMD truly wiped the floor with the Pentium III. Intel didn't manage to get back on top until the Core2.

Re:ha ha (1)

viperidaenz (2515578) | about 2 months ago | (#47020693)

K6, the Celeron killer!

Re:ha ha (1)

drinkypoo (153816) | about 2 months ago | (#47020825)

The original K6 was a bit of a turd, because of its 24-bit FPU. That was a horrible, terrible mistake.

The K6/2 was a peach of a processor; clock for clock it would beat a Pentium II at many operations, not least because the low-end P2s it was competing with had crippled cache. Sadly, most of the motherboards it was coupled with were pure shit.

By the time that was fixed and the K6/3 came out with onboard L2 cache (and the motherboard cache, if any, became L3) it was too little, too late. The K6 was known as a failure.

Then the K7 came out and it beat the living crap out of the P3 dollar for dollar, watt for watt, clock for clock, any way you wanted to measure. And then it went on to beat the P4 on all the same bases. I won't count AMD out yet.

Re:ha ha (1)

Red_Chaos1 (95148) | about 2 months ago | (#47020979)

I've been in the AMD camp for ages, but I have to admit that even the K6-2 was not all that. It was decent, but it still suffered from a weak FPU, which became apparent when MP3 hit it big.

LOL (0)

Anonymous Coward | about 2 months ago | (#47020561)

Intel has so much tech on the roadmap, anything AMD were to come out with, Intel would just push stuff to come out sooner that would immediately be better.

AMD should just stop bothering with processors...

Yay! (1)

buggsdummy (2698599) | about 2 months ago | (#47020713)

Can't wait for a new day to dawn for AMD!

SecureBoot (0)

Anonymous Coward | about 2 months ago | (#47020745)

It better have a configurable secureboot feature, because ARM has the real nightmare of TCPA/TC through the use of a non-configurable secureboot.

2016 (4, Funny)

Anonymous Coward | about 2 months ago | (#47020785)

By 2016 AMD will have a CPU that beats the sh*t out of Intel's 2014 best offerings.

Run (0)

Anonymous Coward | about 2 months ago | (#47020819)

The only running Intel is gonna do is to the bank with armfuls of money.

Let there be an FX-8550 (0)

Anonymous Coward | about 2 months ago | (#47020871)

I hope they release a Steamroller-based FX series before then; many would like to get their hands on those improvements. No one really expects them to be present in the high-end market, but we sure want them to remain strong in the mid-range segment.
http://semiaccurate.com/2014/01/23/kaveri-versus-richland-performance-per-clock-comparison/
http://www.extremetech.com/computing/177099-secrets-of-steamroller-digging-deep-into-amds-next-gen-core
