
Why AMD Could Win The Coming Visual Computing Battle

Zonk posted more than 6 years ago | from the all-about-the-eyeballs dept.

AMD

Vigile writes "The past week has been rampant with discussion on the new war that is brewing between NVIDIA and Intel, but there was one big player left out of the story: AMD. It would seem that both sides have written this competitor off, but PC Perspective thinks quite the opposite. The company is having financial difficulties, but AMD already has the technologies that both NVIDIA and Intel are striving to build or acquire: mainstream CPU, competitive GPU, high quality IGP solutions and technology for hybrid processing. This article postulates that both Intel and NVIDIA are overlooking a still-competitive opponent, which could turn out to be a drastic mistake."


eat my shorts slashdot !! (-1, Troll)

Anonymous Coward | more than 6 years ago | (#23107992)


Eat my shorts slashdot !!

whateves. (-1)

Anonymous Coward | more than 6 years ago | (#23107994)

whateves.

... vested interest. (1, Flamebait)

konputer (568044) | more than 6 years ago | (#23107998)

This was written by an AMD shareholder, of course. Guilty as charged as well, here.

Re:... vested interest. (5, Funny)

Anonymous Coward | more than 6 years ago | (#23108162)

I'm sure those AMD shares will come in handy some day... I, for instance, am out of paper towels.

Re:... vested interest. (5, Insightful)

konputer (568044) | more than 6 years ago | (#23108250)

I'm still rooting for AMD. I think that they can pull themselves out of the mess they made. Why? No sane reason. But whenever the U.S. economy decides to come back up, so will AMD.

A-Team (4, Funny)

afxgrin (208686) | more than 6 years ago | (#23109234)

This is definitely a win for the A-Team. I'm sure Mr. T feels pity for the fools...

Re:A-Team (1)

rydan (1275020) | more than 6 years ago | (#23110806)

AMD puts the - in A-Team.

Re:... vested interest. (1)

PeterKraus (1244558) | more than 6 years ago | (#23109854)

Same for me. I hope that AMD will rise; I am a big fan of that company (even though I run a Core 2 Duo at the moment, since it was cheap and powerful when I bought it).

Re:... vested interest. (5, Interesting)

hey! (33014) | more than 6 years ago | (#23108700)

There's always an element of drawing the bullseye around the bullet hole in business planning. Your position is never quite what you'd want it to be (with rare exceptions), so your job, in part, is to imagine a bright future that, through an incredible stroke of luck, starts right where you're standing right now.

The thing is, while that is all necessary and good as part of business planning, individual investors really ought not to make investment decisions based on this kind of planning, unless they have their own teams of researchers and analysts and their own sources of information.

If you know nothing about the technology, you can't really examine something like this critically. If you know a great deal about it, you are even less qualified to make prognostications, because your opinion about what is good technology messes with your opinion about what makes good business sense.

Mark Twain was a very intelligent man, who lost his entire fortune investing in a revolutionary typesetting system. The things that made him a great writer made him a lousy investor: imagination, contrariness, a willingness to buck convention. Of course, exactly the same qualities describe a superb investor. The thing that really did him in was overestimating his knowledge of a market he was peripherally involved in.

It was natural for Twain to be interested in the process of printing books and periodicals, and to be familiar enough with the process of typesetting in general to see the potential, but not quite intimately enough to see the pitfalls. He would have been better off investing in something he had absolutely no interest or prior experience in.

Re:... vested interest. (1)

moderatorrater (1095745) | more than 6 years ago | (#23109306)

There are two critical differences between Mark Twain's investing in a typesetting machine and me investing in AMD: I'm the target customer for AMD, and I'm not going to invest principally in this one company. I would assume from your description of what happened to Mark Twain that he wasn't heavily involved with the people who used typesetting machines and therefore made the purchasing decisions for them. On the other hand, I'm intimately familiar with all the reasons that people choose one company over another in computer parts.

So, to sum up my comment in an analogy, Mark Twain investing in typesetting technology would be akin to me investing in a company that's trying to sell a new kind of fabrication technology. My investing in AMD would be more akin to Mark Twain investing in another author, or perhaps a particular publishing company.

Of course, since I haven't been in the stock market long enough to have any sort of track record, this is all armchair investing anyway.

Re:... vested interest. (2, Interesting)

hey! (33014) | more than 6 years ago | (#23109442)

Oh, certainly. I wasn't making a specific point about you.

If you've ever been on the product management end of the stick, though, the biggest danger is overestimating the number of people who think as you do or visualize their needs as you do. That's why it's dangerous for people with lots of technical knowledge to use it to guide their investments. You can overcome this, but it's a serious trap.

That's why I don't invest in tech companies at all; whenever I have it hasn't worked out.

I did pretty well in the financial services sector for some time, although I'll admit I had more than my fair share of luck. I simply chose that as one of my investments because money bores me. I'm mostly out now, but I'm thinking of getting back in now that a disaster is making people scared of these stocks. That's the ticket: if you balance your portfolio, every time an industry goes down, you end up buying. Every time it goes up, you end up selling. Which is just another way of buying low and selling high.

It doesn't pay to get excited by a single company's brilliant potential. I'm actually contemplating getting out of stocks altogether because it's too tempting to be clever rather than patient. It's really a lot of trouble for an amateur to try to outthink the market; it's better to outwait it.

Re:... vested interest. (3, Funny)

ZeebaNeighba (913093) | more than 6 years ago | (#23109770)

That's why I only invest in alcohol and gambling stocks.

Re:... vested interest. (1, Informative)

Vigile (99919) | more than 6 years ago | (#23108806)

This was written by an AMD shareholder, of course.

Guilty as charged as well, here.
I am absolutely not an AMD shareholder. Nor do I own anything in Intel or NVIDIA.

Re:... vested interest. (1)

Mr_eX9 (800448) | more than 6 years ago | (#23108844)

Circumstantial ad hominem...nothing to see here, move along.

Sorry, you overlooked the obvious (3, Informative)

BadAnalogyGuy (945258) | more than 6 years ago | (#23108016)

Year-over-year growth has ceased, and this past quarter shows a 0.2% decline in revenues.

Re:Sorry, you overlooked the obvious (4, Insightful)

brunes69 (86786) | more than 6 years ago | (#23108152)

Only a 0.2% decline in revenues in the midst of what many consider an already-begun recession ain't too bad.

Re:Sorry, you overlooked the obvious (5, Informative)

moderatorrater (1095745) | more than 6 years ago | (#23108268)

Apparently you didn't RTFA, because they describe the problems that AMD is having and then go on to say why those problems may be surmounted. In other words, you're overlooking the obvious position that AMD's in. nVidia doesn't have a CPU line at all, let alone one of the top CPUs on the market in performance. Intel doesn't have a GPU that's competitive in performance. With the market moving towards greater integration and interaction between the CPU and the GPU, there's only one company that can deliver both.

So it's going to come down to whether or not AMD has the ability right now to keep pushing their product lines and innovating fast enough to beat Intel and nVidia to the punch. Their financial situation hurts their chances, but it doesn't negate them completely.

Re:Sorry, you overlooked the obvious (0)

Anonymous Coward | more than 6 years ago | (#23109426)

"With the market moving towards greater integration and interaction between the CPU and the GPU, there's only one company that can deliver both." Why only one? Aren't you contradicting yourself?

Re:Sorry, you overlooked the obvious (2, Informative)

BobPaul (710574) | more than 6 years ago | (#23109958)

Intel has a CPU but their graphics are severely lacking. nVidia has a GPU, but no CPU at all (unless they pair with VIA or someone else). AMD is the only one of the 3 that has both. How is that statement self-contradicting?

Dons an asbestos suit... (1)

knavel (1155875) | more than 6 years ago | (#23108094)

Flamewar in 3...2...1...

Catch (-1, Offtopic)

jmpeax (936370) | more than 6 years ago | (#23108168)

I was excited, until I read this:

But the concept is not an official aim of Esa, and one of the agency's senior officials has dismissed the idea as "science fiction".

Re:Catch (1)

Xordan (943619) | more than 6 years ago | (#23108366)

Wrong story? ;)

Re:Catch (0)

jmpeax (936370) | more than 6 years ago | (#23108594)

Argh! Evidently!

Silly me.

Catch & Release... (5, Insightful)

Deadfyre_Deadsoul (1193759) | more than 6 years ago | (#23108232)

AMD was supposed to have been dead and written off how many times in past years? ATI as well?

It's nice to know that they still maintain an edge, even though they have nowhere near the capital on hand that nVidia and Intel do.

I for one always liked Underdogs... :)

sh1t (-1)

Anonymous Coward | more than 6 years ago | (#23108296)

all 44rties it's

More like zombie visual computing (2, Funny)

danhm (762237) | more than 6 years ago | (#23108302)

I thought AMD was dead [slashdot.org] !

Re:More like zombie visual computing (4, Funny)

Culture20 (968837) | more than 6 years ago | (#23110442)

That's only when the submitter(s) want to sell short. Now that it's low and they've bought AMD stock again, it's time to raise the stock price.

AMD bought out ATI? (-1, Redundant)

Kingrames (858416) | more than 6 years ago | (#23108356)

wow, that explains so much. I used to get AMD and ATI confused, and that just got worse when trying to find drivers for my new $150 1 GB Radeon HD 3650.

Which I still can't get working quite right in Linux.

In any event, I seriously doubt that AMD is out of the competition, what with deals like that going round.

speaking of which: I recently saw a $30 8 GB usb flash drive - that seems far more newsworthy than this, especially since it renders DVD writers obsolete for anything but creating illegal copies of dvd movies. which is easier with avi's anyways...

Re:AMD bought out ATI? (0)

Anonymous Coward | more than 6 years ago | (#23108494)

That card is never going to use 1GB of VRAM.

Re:AMD bought out ATI? (1)

What Would NPH Do (1274934) | more than 6 years ago | (#23108586)

Is that before or after you start playing Crysis?

Re:AMD bought out ATI? (1, Offtopic)

Slippery Pete (941650) | more than 6 years ago | (#23108500)

$30 for an 8GB USB flash drive? That doesn't really compare to the 10 CENTS I pay for a blank DVD. I still need to share gigs of data with people where placing that data online isn't an option. If I had to spend $30 a pop to send these out, it would cost a small fortune.

And no, the information I'm sharing isn't illegal.

Re:AMD bought out ATI? (1)

Kingrames (858416) | more than 6 years ago | (#23109278)

I was referring to [cost of drive] + [cost of countless cd's].

Re:AMD bought out ATI? (1)

What Would NPH Do (1274934) | more than 6 years ago | (#23109406)

For 36 dollars you can buy a CD burner and a 100 pack spindle of CD-Rs from newegg. For 6 dollars more you can burn almost twice the content of one of those drives. Plus after the initial 18 dollars for the CD drive you'll save yourself about 12 bucks buying the 100 pack spindle and getting over twice the storage.

Re:AMD bought out ATI? (2, Informative)

What Would NPH Do (1274934) | more than 6 years ago | (#23109528)

And to add to my previous comment for 50 dollars I can buy a DVD burner and a 100 pack of DVD-Rs and I can have enough storage for almost 500 gigs of data. You on the other hand would have to buy almost 59 of those USB drives to match that at a cost of almost 1800 dollars. Have fun with that.

Re:AMD bought out ATI? (3, Informative)

Anpheus (908711) | more than 6 years ago | (#23110996)

I just bought a portable hard drive, it's got better read-write speed, portability and it's easier to back up data too.

Re:AMD bought out ATI? (1)

PitaBred (632671) | more than 6 years ago | (#23110998)

Persistent storage, yeah. No question DVDs are ahead. What if you aren't only archiving stuff, though? Moving data from one computer to another, and back? Not to mention that optical drives use a LOT more power and make a lot more noise than flash drives, which is quite important in laptops.

They both have their places. Hammers, screwdrivers, all that jazz.

Re:AMD bought out ATI? (1)

BlackSnake112 (912158) | more than 6 years ago | (#23109778)

I hope you meant 3700 series since that is the higher end version.

Cash Crunch (5, Interesting)

Guppy (12314) | more than 6 years ago | (#23108538)

I used to know an engineer who worked for AMD, and one of the things he would tell me about were the problems with the merger with ATI. There were a lot of manufacturing and engineering differences between the two companies that made it difficult to combine designs from the two. In addition, the poor financial situation of AMD meant they didn't have enough time and money to complete the "Fusion" CPU/GPU combo -- one of the main drivers behind the merger in the first place.

He said that the company will still bring something out, and that something will still go by the codename "Fusion", but it will not be the product originally envisioned at the time the companies decided to merge. He speculated maybe some kind of Multi-Chip Module -- essentially just a separate CPU and a separate GPU die mounted into the same packaging.

Re:Cash Crunch (1)

gnuman99 (746007) | more than 6 years ago | (#23108834)

Like Intel's quad-core offering (two dual-core chips)? It may not be so bad after all.

I used to work at Intel... (3, Interesting)

vivin (671928) | more than 6 years ago | (#23109218)

... and I recall during company meetings we would be told that Intel was "keeping an eye on nVidia". AMD, not so much. Intel sees nVidia as a new and strong threat.

Re:Cash Crunch (5, Funny)

afxgrin (208686) | more than 6 years ago | (#23109726)

Yeah - they should just stack the dies in some half-ass manner at first. Go back to some kind of Slut configuration, put the dies ass to ass, sandwich the bitch with heat sinks, put high speed interconnects through the ass-to-ass layer, and -tada- you've got the first generation of GPU and CPU in one. It's dirty, it's hot, but it works.

They can code name it "Finger cuffs" or something equally dumb.

Yeaah - they'll be using more wafer area than they would like, since the GPU and CPU would still be manufactured separately, but they can probably make up for the additional cost by charging some premium price.

Hell, it would still be cool if you could remove the GPU and drop in another one, or do the same with the CPU. At least by keeping the CPU and GPU so close they could keep the interconnect lengths short, and deviate from the AGP/PCI-E standard. That would have to yield some sort of substantial performance gain.

But then again - I have no experience in the manufacturing of these things, so my 2 cents is worth exactly that. I may have a chance getting in as a wafer handling monkey. As long as I don't have to manually stir wafers in HF, I really don't like that ...

Re:Cash Crunch (1, Interesting)

Anonymous Coward | more than 6 years ago | (#23110204)

I'll be quite disenfranchised if they (AMD/Intel/Whoever) start manufacturing CPU/GPU combo hardware that can't be upgraded independently at my discretion. I'm a very frugal (read: cheap) builder of my own custom rigs and I pride myself on getting the best bang for the best buck through wise choices of cheaper hardware.

I've had gaming rigs run for 5 years on the same mobo and chip, while only selectively upgrading the graphics card & RAM once or twice along the way.

The day they force me to buy some static cpu/gpu combo module for $500+ every year to keep up with the software is the day my PC becomes a media/web browsing machine and my gaming consoles take over that facet of my entertainment.

I rather like my ability to upgrade a single cheap piece of hardware to keep my rig running nicely every year or two. That's the whole beauty of PCs over consoles. It'll be a shame if they funk that up.

Re:Cash Crunch (1)

moosesocks (264553) | more than 6 years ago | (#23110826)

Remember that Intel spent billions developing the Itanium, whilst one of their cash-strapped subsidiaries in Israel came up with the Core Duo.

Just because Fusion might not be the glorious flagship envisioned by AMD doesn't mean that it'll flop.

Re:Cash Crunch (0)

Anonymous Coward | more than 6 years ago | (#23110918)

This is pure BS. I *am* an engineer working at AMD (hence the anon post), and there are no "manufacturing and engineering differences" that remain unsolved in Fusion whatsoever. I can't say that the two parts of the company have been totally seamlessly integrated, but many of the design methodology kinks have been ironed out. Fusion processors *will* be single-die chips, not MCMs like the post above suggests. The first Fusion project (Fusion is a concept/family, comprising several entirely different designs) is well underway, and others will start soon. I wouldn't write AMD off just yet..

Apple's role in AMD-Intel war (4, Interesting)

Ilyon (1150115) | more than 6 years ago | (#23108666)

I respect AMD and had faith in their ability to make a comeback in the past, but there's a new wrinkle this time: Apple.

Apple computer sales are growing at 2.5 times the industry rate, and they use Intel CPUs. With all the growth in the PC market going to Intel CPUs, is there much room for an AMD comeback?

I can see two ways for AMD to make a comeback. If Apple's agreement to use Intel CPUs expires and AMD can win some business with Apple, AMD can latch on to Apple's growth. But Apple chose Intel for its ability to ramp up production. Will AMD be able to provide the same? Will AMD be willing to give up other customers to meet Apple's demand?

If Apple chooses this route, how big of an architecture change will this be? I've no doubt Apple can provide developer tools to aid the migration, but will Core 2 optimizations easily translate to AMD optimizations?

Will Apple take the risk of supporting both architectures? They are very active in LLVM development, which allows dynamic optimization of code. If LLVM works as well as many hope, Apple could deliver software in a common binary format that automatically adapts to any architecture using LLVM. This would be quite novel. Apple would benefit from ongoing competition between Intel and AMD while giving AMD a fighting chance in a market increasingly dominated by Apple.

The other potential AMD savior is Linux. Can the open source community deliver software that can take advantage of AMD's CPU-GPU architecture spectacularly enough to give AMD the sales it needs?

If Apple weren't in Intel's camp, I would invest in AMD with confidence in a turnaround, but I think the fate of AMD lies largely with adoption by Apple or Linux.

What do you think?

Re:Apple's role in AMD-Intel war (5, Informative)

dreamchaser (49529) | more than 6 years ago | (#23109312)

OS X runs just fine on SSE3-equipped AMD CPUs as it stands. It's not supported, of course, but it runs just fine. They (Apple) could easily support AMD even if they didn't optimize for them quite as much as they do for the Core 2 architecture. Frankly, I'm not sure the difference would be all that noticeable compared to the already noticeable delta in performance between the two main x86 architectures.
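For reference, the SSE3 support mentioned above can be checked from CPUID leaf 1, where ECX bit 0 is the SSE3 flag. A minimal C sketch, assuming a GCC or Clang toolchain on x86; it only illustrates the feature check, nothing Apple-specific:

    #include <cpuid.h>
    #include <stdio.h>

    int main(void) {
        unsigned int eax, ebx, ecx, edx;
        /* CPUID leaf 1 returns feature flags; ECX bit 0 indicates SSE3. */
        if (__get_cpuid(1, &eax, &ebx, &ecx, &edx) && (ecx & 1u))
            puts("SSE3 supported");
        else
            puts("SSE3 not reported");
        return 0;
    }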

Re:Apple's role in AMD-Intel war (1)

cesclaveria (1267268) | more than 6 years ago | (#23109386)

I know Apple is growing, but I sincerely doubt that it will have such a great impact. What is Apple's current market share? 5%?

Re:Apple's role in AMD-Intel war (1)

WhiteWolf666 (145211) | more than 6 years ago | (#23109556)

IIRC it's near 10% now. Nearly 20% of laptop sales, too.

I know it's 25% of laptop sales by revenue, but I'm not sure on the unit counts.

Re:Apple's role in AMD-Intel war (0)

Anonymous Coward | more than 6 years ago | (#23110728)

"RETAIL SALES- ignoring the rather large volume of sales by direct vendors, and resellers."

Apple is irrelevant in terms of computer sales.

Re:Apple's role in AMD-Intel war (1)

CODiNE (27417) | more than 6 years ago | (#23109688)

Yeah they just passed Toshiba last quarter, next up is Acer, then HP and finally Dell. We'll see where they currently stand with the next report from Apple coming soon.

Re:Apple's role in AMD-Intel war (0)

Anonymous Coward | more than 6 years ago | (#23110532)

I'm pretty sure that is just retail US sales, which is a tiny fraction of total computer sales. It ignores all of HP and Dell's direct and reseller sales.

I don't know why people want so badly for Apple to have better sales. Apple is awful to deal with pretty much from any angle. The only thing they have is that they aren't Microsoft.

Never one penny of my money with my knowledge.

Re:Apple's role in AMD-Intel war (4, Insightful)

moderatorrater (1095745) | more than 6 years ago | (#23109448)

What do you think?
That Linux is a dominant player in the server market and that Apple is pretty much negligible in either market. With how similar Intel and AMD chips tend to be, I don't know that there's anything stopping Apple from switching to AMD at any time. Either way, it's a relatively small chunk of the desktop market.

The other potential AMD savior is Linux. Can the open source community deliver software that can take advantage of AMD's CPU-GPU architecture spectacularly enough to give AMD the sales it needs?
This is an interesting question. When AMD comes out with their chips, if they really want to impress people with their abilities, they would do well to get some of the coders behind Folding@Home working on their new chips. It was impressive to see what ATI cards could do with the code, and it would be a great way to showcase the chips' abilities for computationally heavy programs that run on servers (thereby breaking into that market).

On the desktop end they would have to get something working to showcase the performance in games. Unfortunately, open source doesn't have a lot of 3d games floating around.

Whatever happens, I think they're going to have to show something that works well with Windows or else they're going to flop. If it works well enough with Windows and they can show substantial performance improvements, then get manufacturing capacity up, they might be able to land an Apple contract. It would be huge for publicity and as a single contract, but for the overall market, it's not going to make or break them.

Re:Apple's role in AMD-Intel war (1)

Colonel Korn (1258968) | more than 6 years ago | (#23109532)

AMD never sold CPUs to Apple, and based on your Apple Growth = 2.5*Total PC Growth with Apple's still very small market share, the non-Apple PC market is growing too (which we already know, anyway). So AMD's potential market is expanding. I think that's probably not a bad thing for them.

As to your later comments, Intel and AMD CPUs still follow the x86 architecture that make them play nice with the same software. I imagine Mac software would work just fine on an AMD chip, and I seem to recall reading about hacked OSX doing just fine on AMD. AMD's CPU marketing these days is about price/performance, though, which might not appeal to those who want the Apple experience.

Re:Apple's role in AMD-Intel war (1)

TJamieson (218336) | more than 6 years ago | (#23110524)

Indeed, AMDs can run OS X. The main thing Apple did to prevent this was to put CPUID calls into their binaries. Patch those out and you're running on AMD. (There's more to it than just this, of course, but that's the biggest issue).
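To illustrate the kind of check being patched out: CPUID leaf 0 returns the CPU vendor string ("GenuineIntel" vs. "AuthenticAMD") in EBX, EDX, ECX. A minimal C sketch of such a vendor check, assuming GCC/Clang on x86 - shown only as an illustration, not Apple's actual code:

    #include <cpuid.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        unsigned int eax, ebx, ecx, edx;
        char vendor[13] = {0};

        /* CPUID leaf 0: the vendor string comes back in EBX, EDX, ECX (in that order). */
        if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
            return 1;
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);

        /* Prints "GenuineIntel" on Intel parts and "AuthenticAMD" on AMD parts. */
        printf("CPU vendor: %s\n", vendor);
        return 0;
    }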

Re: "The other potential savior is Linux" (1)

rickst29 (553930) | more than 6 years ago | (#23110972)

I agree. In the past, (as well as right now), ATI-written GPU "Catalyst" drivers for Linux have usually been bug-riddled, feature-poor, and a problem waiting to happen. And so, the vast majority of Linux "heavyweights" have been building and buying computers with NVidia. (Both discrete and integrated.) BUT, AMD/ATI has recently offered the actual specifications, under FREE license terms, for Linux folks to write driver code under the GPL. (ATI personnel can also take part, as long as they offer their code under GPL terms.) This could be huge. And unlike Apple systems, many Linux boxes are built as low-end machines, naturally targeting integrated graphics anyway. My own PC cost less than $250, and adding a graphics card + passive cooling was a significant cost. It was, of course, an NVidia card... and I expect that ATI graphics will really become viable soon, due to their having released the full specifications required to write FLOSS drivers.

Monkey See, Monkey Do (4, Interesting)

MOBE2001 (263700) | more than 6 years ago | (#23108668)

Nvidia has a better chance to compete successfully against Intel because their executives do not think like Intel. AMD, OTOH, is a monkey-see-monkey-do company. Many of their executives (e.g., Dirk Meyer) and lead engineers came from Intel and they only see the world through Intel glasses. Having said that, this business of mixing coarse-grain MIMD and fine-grain SIMD cores on a single die to create a heterogeneous processor is a match made in hell. Anybody with a lick of sense can tell you that universality should be the primary goal of multicore research and that incompatible processing models should not be encouraged let alone slapped together. Programming those hybrid processors will be more painful than pulling teeth with a crowbar. Heck, breaking programs down into threads is a pain in the ass. Why would anybody want to make things worse?

The best strategy, IMO, is to work on a universal processor that combines the strengths of both MIMD and SIMD models while eliminating their obvious weaknesses. AMD needs somebody with the huevos to say, "fooey with this Intel crap! Let's carve our own market and create a completely new technology for a completely new paradigm, parallel processing". Is Hector Ruiz up to the task? Only time will tell. For a different take on the multicore and CPU/GPU issue, read Nightmare on Core Street [blogspot.com] .
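For context on "breaking programs down into threads": a minimal pthread sketch of the kind of manual decomposition being complained about, with illustrative names and sizes (an embarrassingly parallel sum; real workloads rarely split this cleanly):

    #include <pthread.h>
    #include <stdio.h>

    #define N_THREADS 4
    #define N_ITEMS   1000

    static int  data[N_ITEMS];
    static long partial[N_THREADS];

    /* Each worker sums its own slice of the array. */
    static void *worker(void *arg)
    {
        long id = (long)arg;
        long chunk = N_ITEMS / N_THREADS;
        for (long i = id * chunk; i < (id + 1) * chunk; i++)
            partial[id] += data[i];
        return NULL;
    }

    int main(void) {
        pthread_t t[N_THREADS];
        long total = 0;

        for (int i = 0; i < N_ITEMS; i++)
            data[i] = 1;
        for (long i = 0; i < N_THREADS; i++)
            pthread_create(&t[i], NULL, worker, (void *)i);
        for (int i = 0; i < N_THREADS; i++) {
            pthread_join(t[i], NULL);
            total += partial[i];
        }
        printf("total = %ld\n", total);   /* expect 1000 */
        return 0;
    }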

Re:Monkey See, Monkey Do (2, Insightful)

What Would NPH Do (1274934) | more than 6 years ago | (#23108762)

Let's carve our own market and create a completely new technology for a completely new paradigm, parallel processing".
Parallel processing is a new paradigm? Since when? The 1960s called, they want you to stop stealing their ideas.

Re:Monkey See, Monkey Do (1, Informative)

MOBE2001 (263700) | more than 6 years ago | (#23108884)

Parallel processing is a new paradigm? Since when? The 1960s called, they want you to stop stealing their ideas.

What's being passed as parallel processing is not really parallel processing. In fact this is the reason that parallel programming is so hard: it's not what it's claimed to be. Switch to a true parallel programming model and the problem will disappear. Read Why Parallel Programming Is So Hard [blogspot.com] to find out why multithreading is really a fake parallelism.

Re:Monkey See, Monkey Do (1)

What Would NPH Do (1274934) | more than 6 years ago | (#23108998)

And yet that doesn't change the fact that neither parallel programming nor parallel processing (yes, the exact same meaning of the words as you are using them) is new. All of it is based on work from the 1960s.

Re:Monkey See, Monkey Do (2, Funny)

MOBE2001 (263700) | more than 6 years ago | (#23109282)

Look, parallel processing is new for the mainstream. This is the point that Bill McColl made on his blog recently. Read Microsoft Agrees, Parallelism IS The New New Thing! [computingatscale.com]

Re:Monkey See, Monkey Do (2, Funny)

What Would NPH Do (1274934) | more than 6 years ago | (#23109358)

When Microsoft finally jumps on the bandwagon for something and starts calling it a "new thing" you're sure to know it's at least a decade or more old.

Re:Monkey See, Monkey Do (0)

Anonymous Coward | more than 6 years ago | (#23108886)

Wa? How did they get our number? ??

Re:Monkey See, Monkey Do (1)

turing_m (1030530) | more than 6 years ago | (#23111054)

They asked the operator.

Re:Monkey See, Monkey Do (0)

Anonymous Coward | more than 6 years ago | (#23110186)

Well the jerk store called and they're running out of you!

Re:Monkey See, Monkey Do (0)

Anonymous Coward | more than 6 years ago | (#23108794)

And someone will surely say,

  Does it run Linux?

And the answer will probably be yes. Then someone else will ask, Does it run Windows?

  And the answer will probably be No.

And the whole thing will be ignored except for a few server aficionados.
A shame really but there you go.

Re:Monkey See, Monkey Do (1)

Vigile (99919) | more than 6 years ago | (#23108868)

I don't agree with this: it was AMD that led Intel into the world of on-die memory controllers as well as removing the front side bus from PC architecture.

Re:Monkey See, Monkey Do (1)

MOBE2001 (263700) | more than 6 years ago | (#23109108)

I don't agree with this: it was AMD that led Intel into the world of on-die memory controllers as well as removing the front side bus from PC architecture.

These things are just evolutionary improvements to existing technology, and it goes to prove that AMD does have excellent engineering talent, which was never really in doubt. What is needed now is a completely new technology to solve a nasty problem that threatens to bring the industry to its knees. AMD can do it. The question is, do Ruiz and the big money people behind AMD have the huevos and the vision to step up to the plate?

Re:Monkey See, Monkey Do (5, Insightful)

samkass (174571) | more than 6 years ago | (#23109136)

From the introduction of the Athlon by AMD (the first really "modern" x86 CPU that finally eliminated most of the CISC disadvantages), through on-die memory controllers and dragging Intel kicking and screaming into the 64-bit world, right up until AMD's lack of a solid response to Core, I'd say AMD led Intel's thinking. Now they're the followers again.

Re:Monkey See, Monkey Do (1)

Vigile (99919) | more than 6 years ago | (#23109228)

Yeah, I forgot about 64-bit - another instance in which Intel was lacking.

And in truth, AMD APPEARS to be ahead in the move to fusing a CPU and GPU architecture into something new.

Re:Monkey See, Monkey Do (3, Insightful)

moosesocks (264553) | more than 6 years ago | (#23110988)

Intel got lucky with Core. It was never on their roadmap as a flagship desktop chip.

It's effectively a multicore version of a laptop-adapted Pentium III with a bunch of modern features tacked on.

Nobody ever envisioned that this would work as well as it did, and Intel only started paying attention to the idea once their lab in Israel was producing low-power mobile chips that were faster than their flagship Pentium 4 desktop chips.

AMD didn't have an answer to Core, because Intel themselves were largely ignorant of the fact that the P6 architecture that they had previously deemed obsolete was adaptable to more modern systems. AMD saw Itanium and Pentium 4 in Intel's roadmaps, and knew that it had nothing to fear, as the products they had developed were vastly superior to both.

Re:Monkey See, Monkey Do (0)

idiotnot (302133) | more than 6 years ago | (#23109224)

And introducing x64... some of whose design mistakes will come back to bite us in another fifteen years, as it just extends the agony of the 8086 legacy even longer. IA64 may be overreaching, but what AMD did wasn't the answer, either.

In any event, Intel still seems far better positioned to move forward than either AMD or nVidia. The abrupt course correction from NetBurst to Core well illustrated that.

And, to be fair, the integrated graphics are a *lot* better than they used to be, and certainly better than some of the real cheap kit (Via, SiS, etc.). I've been pretty pleased with the couple of boards I bought recently w/ 945G chipsets. One happily serves NFS and Samba shares, and rarely has a monitor connected. The other plays videos off that NFS server, with nary a problem running 720p video. It also runs 10C cooler since I pulled the 6600GT in favor of another capture card.

Re:Monkey See, Monkey Do (1)

nxtw (866177) | more than 6 years ago | (#23111024)

But looking back, what practical advantage does an on-die memory controller have for the end-user? HyperTransport has less latency and more bandwidth, but Intel Core CPUs remain competitive performance-wise without these features.

It was never the monumental change many made it out to be for desktop systems; it's another incremental improvement in performance.

Re:Monkey See, Monkey Do (2, Interesting)

ThePhilips (752041) | more than 6 years ago | (#23109172)

That reminds me of AMD before the Opteron release. Their CPUs sucked because they were always catching up with Intel. Their finances sucked. Luckily for them, Intel made a strategic mistake (called Itanic [wikipedia.org]), thus giving AMD the opportunity and enough time to release a completely new architecture - AMD64.

I wonder if AMD will get lucky a second time - in the repeated "nothing to lose" situation.

/me crossing fingers.

Re:Monkey See, Monkey Do (1)

samkass (174571) | more than 6 years ago | (#23109260)

I don't think you're obviously correct. You may turn out to be correct, but there is a lot to be said for heterogeneous processors on a single die.

Some thoughts:
1. A very-low-power, slow core tied to a super heavy-duty number cruncher on the same die, both using the same instruction set. One could imagine an OS shutting off the big core when all it has to do is blink the cursor to save power, but firing it back up when you click "Compute". Done right, it seems like this could give you a laptop with a day or more of typical battery power.
2. GPUs and CPUs are fundamentally different beasts, and many of the slowdowns are due to either shuffling textures between RAM and GPU, or coordinating CPU and GPU. nVidia is taking the tack of moving more computation to the GPU (physics, etc). It may be equally valid to move the GPU onto the CPU die so the CPU can do the collision/hit detection and the GPU the rendering with very fast coordination.
3. SIMD instructions have been grafted onto MIMD processors ever since the original MMX, and before that DSPs lived on the motherboards of some Macs and NeXT workstations. They're really two fundamentally different kinds of computing, and I'm not ready to say that it wouldn't make sense to separate them out into separate cores. (A short intrinsics sketch follows after this list.)
4. Some embedded CPUs can already run Java bytecode natively as well as some other ABI like ARM. I could imagine a world where some of the dynamic natures of modern languages is combined with hardware support to give us some pretty powerful little machines.
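On point 3, a minimal sketch of what those grafted-on SIMD instructions look like from C, assuming SSE intrinsics on an x86 compiler; four additions run in a single instruction instead of a scalar loop:

    #include <xmmintrin.h>  /* SSE intrinsics */
    #include <stdio.h>

    int main(void) {
        float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
        float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
        float out[4];

        /* One SIMD instruction adds all four lanes at once. */
        __m128 va = _mm_loadu_ps(a);
        __m128 vb = _mm_loadu_ps(b);
        _mm_storeu_ps(out, _mm_add_ps(va, vb));

        printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
        return 0;
    }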

Re:Monkey See, Monkey Do (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23109284)

...work on a universal processor that combines the strengths of both MIMD and SIMD models while eliminating their obvious weaknesses.
And just what, pray tell, is that model? Really, I'm interested. How do you get the best of both with the weaknesses of neither?

Examples, please (1)

raftpeople (844215) | more than 6 years ago | (#23110088)

Your blogs have many words but no concrete examples of "true" parallel processing applied to problems currently viewed as best solved by sequential algorithms. If your truly parallel processors are passing messages to achieve proper sequencing, how is that better than implicit sequencing for the class of problems that require sequential processing?

If you are saying that all problems can be parallelized with a net gain in elapsed time to solve, please provide the math proof that supports that position. Don't forget to include setup time for each parallel processor and its internal environment to get it to solve the problem at hand.
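The trade-off being asked about is roughly Amdahl's law with a per-worker setup cost added. A toy C sketch with made-up numbers, just to show how fixed overhead can erase the gain for small or mostly-sequential problems:

    #include <stdio.h>

    /* Estimated elapsed time when a fraction `parallel` of the work can be
     * spread over `n` workers, pessimistically assuming the per-worker setup
     * happens serially. All numbers are illustrative, not measurements. */
    static double elapsed(double work, double parallel, int n, double setup_per_worker)
    {
        double serial_part   = work * (1.0 - parallel);
        double parallel_part = work * parallel / n;
        return n * setup_per_worker + serial_part + parallel_part;
    }

    int main(void) {
        double work  = 100.0;  /* arbitrary units of single-threaded work */
        double setup = 1.0;    /* cost of setting up each worker */
        for (int n = 1; n <= 16; n *= 2)
            printf("%2d workers, 90%% parallel: %6.2f units\n",
                   n, elapsed(work, 0.90, n, setup));
        return 0;
    }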

Hope (1)

Nom du Keyboard (633989) | more than 6 years ago | (#23108712)

We as consumers can only hope that this will be true.

they have not "written them off" (4, Interesting)

EjectButton (618561) | more than 6 years ago | (#23108812)

Nvidia and Intel are well aware of AMD and have not "written this competitor off". The only one ignoring AMD is the technology press because they are generally too stupid to focus on more than two things at a time. Most articles are presented in a context of "x is going to overtake y" "technology x is a y-killer". Conflict sells and overly simplistic conflict sells to a wider audience.

AMD has some financial problems and their stock may sink for a while, but they are not about to go bankrupt. If anyone should be worried about their long-term prospects, it's Nvidia. Intel and AMD both have complete "platforms", in that they can build a motherboard with their own chipset, their own GPU, and stick their own CPU in it. Nvidia has a GPU and not a whole lot more; their motherboard chipsets are at an obvious disadvantage if they need to design chipsets exclusively for processors whose design is controlled by their direct competitors.

Nvidia's strength has been that on the high end they blow away Intel GPUs in terms of speed and features. Intel has been slowly catching up, and their next iteration will be offered both onboard and as a discrete card and will have hardware-assisted h.264 decoding.

Nvidia's advantage over ATI has been that ATI has generally had inferior drivers regardless of what platform you were using. Since AMD took over, ATI has been improving their driver situation significantly, both with respect to their proprietary drivers and their recent release of specs for the open source version. Meanwhile, Nvidia seems to have been doing everything they can to trash the reputation of their drivers over the last year, both with their awful Vista drivers and the buggy/sloppy control panel that they have forced on everyone.

The consensus lately is that we are looking at a future where you will have a machine with lots of processor cores and CPU/GPU/physics/etc. functions will be tightly coupled. This is a future that does not bode well for Nvidia, since the job of making competitive chipsets for their opponents will get tougher while they are at the same time the farthest from having their own platform to sell.

Re:they have not "written them off" (1)

EMeta (860558) | more than 6 years ago | (#23108960)

Speaking of which, does anybody know of anything I can do to get rid of or minimize how much I have to see Nvidia's control panel?

Re:they have not "written them off" (1, Funny)

Anonymous Coward | more than 6 years ago | (#23109056)

minimize how much I have to see Nvidia's control panel?
Stop opening it?

Re:they have not "written them off" (0)

Anonymous Coward | more than 6 years ago | (#23110354)

Funny and Correct :}

I've been using nvidia gaming cards for 7 or 8 years now (4 different cards). I don't even know what control panel he's talking about.

There is the little icon at the bottom of the screen; I normally just remove that from msconfig, and it does not hinder the operation of the card or drivers at all.

Re:they have not "written them off" (2, Funny)

GregPK (991973) | more than 6 years ago | (#23108966)

I think it's just a move with the CUDA engine that needs refinement. As it grows mature, driver issues will subside.

AMD is making a break for the open source arena. I gave Hector that advice a while ago. Apparently, he was listening in his anonymous drunken stupor on the financial forums. AMD is poised to make a stand in the next 2 to 3 years.

Re:they have not "written them off" (1)

RiotingPacifist (1228016) | more than 6 years ago | (#23109984)

Nvidia's advantage over ATI has been that ati has generally had inferior drivers regardless of what platform you were using, since AMD took over ATI has been improving their driver situation significantly both with respect to their proprietary drivers and their recent release of specs for the open source version. Meanwhile Nvidia seems to have been doing everything they can to trash the reputation of their drivers over the last year both with their awful Vista drivers and their buggy/sloppy control panel that they have forced on everyone.
While this is good for Linux, are they making similar improvements in the Windows arena? I may be biased, but I always preferred nvidia drivers to ati ones.
If ati sorts out their drivers, then they will be able to cash in on the market that needs these chips, which also happens to be the one where the money is: laptops.

My only problem with AMD is that their CPUs don't seem to scale as low as Intel ones; my current 2.0GHz only drops to 800MHz, but my Intel one would drop from 1.6GHz to 200MHz. Not sure how this reflects in power usage, though. Thinking of power usage, is anybody working on moving wireless onto CPUs, or saving power on wifi in other ways? Because although nice graphics are welcome, it's battery life that laptop users really want. If getting AMD or Intel on a laptop is going to result in having an extra hour, I don't really care which has the better graphics chip.

Re:they have not "written them off" (1)

nxtw (866177) | more than 6 years ago | (#23111122)

My only problem with AMD is that their CPUs don't seem to scale as low as Intel ones; my current 2.0GHz only drops to 800MHz, but my Intel one would drop from 1.6GHz to 200MHz. Not sure how this reflects in power usage, though.


Your Intel CPU would drop from 1600 MHz to 200 MHz? Are you sure?

My 1200MHz Core 2 Duo ULV only drops to 800 MHz. My 4 year old Pentium M system dropped from 1400 MHz to 600 MHz.
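For what it's worth, on Linux the advertised scaling range is easy to check from the cpufreq sysfs files; a minimal C sketch, assuming a kernel with cpufreq enabled (values are reported in kHz):

    #include <stdio.h>

    /* Print one cpufreq sysfs value for cpu0. */
    static void show(const char *name)
    {
        char path[128], buf[64];
        snprintf(path, sizeof path,
                 "/sys/devices/system/cpu/cpu0/cpufreq/%s", name);
        FILE *f = fopen(path, "r");
        if (f && fgets(buf, sizeof buf, f))
            printf("%-18s %s", name, buf);   /* buf already ends with '\n' */
        if (f)
            fclose(f);
    }

    int main(void) {
        show("scaling_min_freq");
        show("scaling_max_freq");
        show("scaling_cur_freq");
        return 0;
    }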

Thinking of power usage, is anybody working on moving wireless onto CPUs, or saving power on wifi in other ways?

That makes no sense. Wifi is going to use power no matter what; it takes power to transmit.

Because although nice graphics are welcome, it's battery life that laptop users really want. If getting AMD or Intel on a laptop is going to result in having an extra hour, I don't really care which has the better graphics chip.

The CPU and screen still use more power than your wireless transmitter, so get a laptop with a ULV CPU, an LED-backlit screen, and a big battery.

Re:they have not "written them off" (1)

toejam13 (958243) | more than 6 years ago | (#23110840)

Nvidia could get around their lack of a complete platform through the purchase of VIA. Their C3 line could be ramped in such a way as to compete with AMD's and Intel's processor lines. Furthermore, VIA's processor division does have some experience with blended CPU+GPU designs from the MediaGX days. The major problem, though, is that the C3 line is more of an embedded processor design than a mainstream one. But that may not be bad for integrated systems for devices like laptops.

AMD not a contender (0)

Anonymous Coward | more than 6 years ago | (#23109150)

All this might be true if AMD didn't continuously shoot themselves in the foot. Socket 472/754/939/940/AM2/AM2+/AM3 anyone?

Re:AMD not a contender (1)

Vigile (99919) | more than 6 years ago | (#23109472)

AMD has traditionally had a much more stable platform than Intel...

S939 was around a long time.

AM2, AM2+ are compatible.

Intel has had just LGA775 for a long time - but with each new processor release, new power requirements forced new motherboards to be purchased/manufactured anyway.

For someone that goes through a LOT of motherboards, I still give AMD the edge here.

Why AMD + ATI should win, plus why they won't (5, Insightful)

Alzheimers (467217) | more than 6 years ago | (#23109156)

Why AMD + ATI should win: HyperTransport. Putting the GPU on the same bus as the CPU should theoretically eliminate whatever roadblocks the PCI bus created. Plus, allowing for die-to-die communication and treating the GPU as a true co-processor instead of a peripheral should open up huge possibilities for performance boosts.

Why AMD + ATI won't win: AMD won't risk alienating their OEM partners who also manufacture Intel motherboards and NVidia boards. Also, it's AMD.

Re:Why AMD + ATI should win, plus why they won't (1)

Kelz (611260) | more than 6 years ago | (#23109432)

If you combined a current CPU with a current GPU:

It'd overheat. Like crazy. Current GPUs usually run 40c hotter than most CPUs.

Re: Heat (3, Insightful)

IdeaMan (216340) | more than 6 years ago | (#23110370)

Ok let's talk about heat.

Putting both GPU and CPU in close proximity to each other should help, not hinder. I think you mistook the GP for saying they'd be on the same die, but he said bus, not die.
It may be that they need to be separated a couple of inches from each other to allow room for fanout of the CPU signals to the rest of the board rather than having them in the same socket. If they weren't separated, and the chip packaging was the same height, they could design one heat sink over both chips. This reduces the parts count for the fan and heatsink and therefore increases reliability.

Having something on a plug in card with such an extreme cooling requirement just doesn't make sense. You aren't allowed much space for heat sink design between it and the next slot. Having the GPU on the motherboard gives case/motherboard designers more room for the heatsink design.

AMD has some great solutions (3, Insightful)

LWATCDR (28044) | more than 6 years ago | (#23109404)

Right now AMD has some great CPUs on the low end and the best integrated graphics solution.
A huge number of PCs never play a game more graphically intensive than Tetris and are never used to transcode video!
Right now on newegg you can pick up an Athlon 64 X2 4000+ for $53.99.
The cheapest Core 2 Duo is $124.99. Yes, it is faster, but will you notice? Most people probably will not.

I've given up on ATI/AMD (0, Offtopic)

FireXtol (1262832) | more than 6 years ago | (#23109502)

I have an AMD Athlon processor, but my next system will likely be Intel/Nvidia.

I upgraded from a Radeon 9000 to a GF7600GT recently. Whoa boy! Several times increase in performance, and paid essentially the same price (5-6 years later). It has a nice price tag ($100 after rebate), and better driver support and better performance than comparable ATI (~HD2600) cards. That's enough reason for me to switch. I didn't need anything 'uber' in my AGP 2.0 compliant slot that maxes at a mere 4x, just a little something to keep this system 'in the game' for another few years. Biggest motivation was DX9 and Shader Model 3. I have no plans to run Vista, even though my system can now!

I like AMD... (1)

YetAnotherProgrammer (1075287) | more than 6 years ago | (#23109504)

because we don't want just one brand on the market. I remember when we had 33MHz jumps for hundreds of dollars. Intel could charge what they wanted because they were the only player in the game. They had no reason to come out with a faster CPU. Without AMD we probably would still be waiting for P3 500s to come out. I don't play many games on my computer or do anything with 3D modeling, so I can deal with on-board graphics or a low-end GPU. This makes AMD an attractive CPU-GPU bundle.

Intel could swallow AMD or Nvida for that matter. (1)

centosfan (1274982) | more than 6 years ago | (#23109614)

Currently the market capitalization for AMD is $3.7B, Intel's is $127.7B. I really believe, other than a possible government anti-trust problem, Intel could/would easily buyout AMD for competitive reasons if it felt so inclined. In fact, I am somewhat surprised given the current stock valuations that they haven't. I suppose they do not view them as a near and present threat to their market.

not sure I agree (1)

axiome (782782) | more than 6 years ago | (#23109672)

Very good article, but while I think AMD will still matter, it's not in the best position of the three. The article mostly goes over AMD's technology and strategy advantage but glosses over financials.

AMD is highly leveraged, with mountains of debt. The article glossed over it, but the fact of the matter is AMD has $5 billion of debt and only $1 billion of cash. AMD's unsecured debt bonds are now rated CCC - junk turf. And the other whammy is that with capital liquidity so damn low in the current credit crisis, it's hard to see how AMD can secure any sort of money. It's also poised to report a 238 million dollar loss for this quarter and just cut quite a bit of staff.

Intel is kicking some butt right now. Take a look at this very recent report:

Intel beat reassures tech sector. Tech bellwether Intel (INTC) posted EPS of $0.28 ($0.04 better than $0.25 consensus) on revenue of $9.67B (in line), and issued an upbeat Q2 outlook that sent its shares soaring +7.3% in extended trading. Unlike many other firms, Intel said strong sales (+9.3%) were driven by North American demand, particularly for servers. CEO Paul Otellini said Intel is not seeing any effects from U.S. economic weakness, a sharp contrast to recent remarks by rival AMD (AMD) which told of widely weaker-than-forecast Q1 sales. For Q2, Intel sees revenue of $9-9.6B (vs. $9.26B consensus) and gross margins of 56% (vs. 53.6% this quarter). NAND flash memory chips revenue continued to flag.

AMD is also losing a ton of the market share it gained from the Athlon 64/Opteron days in the server market. I believe it's down to about 23% now.

Now from a strategic standpoint, I'll take a slightly different view than the PC Perspective article. I believe it's a case right now of Sun Tzu's maxim: "If he sends reinforcements everywhere, he will everywhere be weak." In other words, AMD has an alright GPU solution but has been second fiddle to Nvidia except for a brief upturn during the old Radeon 9700/9500/9600 series days. And ever since Core 2, Intel has proven it didn't need a serial interconnect like HyperTransport (yet) or an onboard memory controller to be faster than the Athlons of those days. With Nehalem, it will get both and will be a force to be reckoned with.

I believe that AMD sat on its laurels too long during the Athlon 64 generation. It introduced the A64 series in mid-2003 and basically took over the lead with its performance and power efficiency. It even had the early lead with dual cores (remember the X2 3800+?). But it got bogged down with the ATI buyout, and Intel basically landed a surprise attack with Core 2. AMD got caught with its pants down until early 2008 with the release of Phenom, which for all intents and purposes only matches Conroe and is surpassed by Penryn in both speed and wattage use.

Even Nvidia, which as the article said has a strategic positioning problem, still has a mound of cash reserves, as does Intel. Granted, the shrinking market share of PC gaming will hurt Nvidia, but it still has some maneuvering room. From my perspective, Intel is in for smooth sailing for at least another year.

Price per Performance keeps AMD alive (5, Insightful)

Eldragon (163969) | more than 6 years ago | (#23109724)

What is overlooked by most of the PC enthusiast press is that AMD still offers an excellent price/performance ratio that Intel does not match.

We have AMD to thank for the fact that high-end CPUs from Intel cost $300 instead of $1000 right now.

Intel hasn't forgotten AMD (0)

Anonymous Coward | more than 6 years ago | (#23109834)

What was the title of Andy Grove's book? "Only the Paranoid Survive". OTOH, I wouldn't write off AMD, as a previous poster did, just because Apple chose Intel. Apple would have no problem unchoosing Intel if they can get a better product that meets their needs.

Re:Intel hasn't forgotten AMD (1)

ThePhilips (752041) | more than 6 years ago | (#23110392)

The crucial difference is that Intel is a one-stop CPU shop: they produce everything in house. AMD is much, much smaller than Intel and often has to resort to outsourcing.

What is important to Apple is that Intel delivers a complete solution in several markets. AMD has lots of holes in its offerings which need to be filled by 3rd-party components to make a product out of them. For Apple it is important to keep things secret until the product reaches market, to reap additional benefit from the wow-effect; dealing with many partners doesn't help.

Also, Intel can guarantee delivery - it owns 10+ fabs. AMD? Only two, and only one of them is using a modern process - only one fab to compete against Intel.

I'd say that the whole discussion is praise for AMD, because the (relatively) small company is standing up against the giant that Intel is. Thanks to the open thinking of AMD (as opposed to the corporate politicking of Intel), we now have affordable 64-bit computing on our desks.

KISS (1, Funny)

faragon (789704) | more than 6 years ago | (#23110028)

Intel has to buy Nvidia or AMD to be competitive in the long run. There is no try: if AMD survives, it could badly hurt Intel with CPU+IGP solutions.