
Crossroads for Intel

CowboyNeal posted more than 9 years ago | from the road-ahead dept.


pillageplunder writes "Businessweek offers a pretty balanced read on what challenges Intel faces in the upcoming year. Rivals Samsung and AMD are making inroads on Intel's core businesses, an expected cyclical industry downturn looms next year, and several critical delays in new (for Intel) markets put its strategy at risk. A neat read."


123 comments


oh yeah? (-1, Troll)

Anonymous Coward | more than 9 years ago | (#10468546)

strif stop

I call BS (-1, Troll)

Anonymous Coward | more than 9 years ago | (#10468549)

and FP

YOU FAILED IT! (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#10468565)

Everything in the subject.

Brit's crossroads (0, Offtopic)

cwebb1977 (650175) | more than 9 years ago | (#10468550)

I think Britney's Crossroads was better; you didn't even have to read to enjoy it.

Hmmm (-1, Offtopic)

Moo Moo The Cow (810132) | more than 9 years ago | (#10468556)

Hmmm, interesting. I don't understand the whole idea, but I get the gist.

Cyclical downturn? (4, Interesting)

tdvaughan (582870) | more than 9 years ago | (#10468564)

Could someone explain why the semiconductor industry is 'cyclical'? What is it that makes a downturn predictable, or is it a self-fulfilling thing (lack of investment during predicted downturns causing an otherwise unnecessary slump)?

Re:Cyclical downturn? (1, Interesting)

Anonymous Coward | more than 9 years ago | (#10468593)

Funny thing is a headline I just saw: "IBM chief sees global tech spending rise."

Re:Cyclical downturn? (5, Interesting)

erick99 (743982) | more than 9 years ago | (#10468595)

I've been on the marketing & sales side of computers for 21 years (started in Oct '83) and the only cycle I've ever been able to reliably predict is a slowdown in corporate purchasing during the month of February. That does not necessarily relate at all to the semiconductor cycle, if there is one. I used to meet regularly with Intel marketing reps and they never mentioned a cycle. There may well be a longer-term cycle, such as a four- or five-year cycle where so many machines are bought at the beginning of a major product cycle, such as the intro of the P4, for example. In that case, a lot of machines would come out of service starting at two years (leases) and out to five years (fully depreciated). This is all to be taken with a grain of salt, of course - there are just too many variables (intervening and contravening).

What do you hear.... (2, Insightful)

zogger (617870) | more than 9 years ago | (#10468916)

...from customers when they're shopping for new computers? Are they primarily adding new boxes to what they are already running, or are they upgrading what they have? If it's upgrading, why are they upgrading?

I'm asking the latter because it seems like computers got "good enough" for most business purposes already. But I don't *know* that; it just seems so. Is it really just because of the way business taxes are structured?

Re:Cyclical downturn? (5, Informative)

Anonymous Coward | more than 9 years ago | (#10468607)

Most businesses are considered "cyclical". Basically, this means they do better when the economy is better, worse when the economy is worse, i.e. they do better when people and businesses have more money to spend. People hold off buying when they don't have the money, because these are discretionary expenses.

This is in contrast to some businesses which are fairly "noncyclical"--demand is relatively constant over time, regardless of ability to pay. Medical care is a classic example here--people don't decide to hold off on having that heart attack until they have a better job.

Then there are some "countercyclical" industries--ones that do BETTER when people have less money to spend. Examples here are businesses that make "inferior" goods--cheaper substitutes for more expensive products. They do better when people have less money, because in bad economic times their cheaper products are more attractive.

To an extent there's an aspect of being "cyclical" to most businesses, but some are more tied to economic cycles than others. Intel makes a good case for being a very cyclical business--computing upgrades are a fairly discretionary expense, and delaying upgrades is a fairly common response to bad business climates. On the other hand, when the economy picks up and people have money to spend, getting those computer upgrades they've been meaning to get for a while becomes more attractive.

Re:Cyclical downturn? (1)

mefus (34481) | more than 9 years ago | (#10469767)

Most businesses are considered "cyclical". Basically, this means they do better when the economy is better, worse when the economy is worse.

If that is true, I have to wonder about the people who hold this definition. It is completely orthogonal to what I've always understood "cyclical" to mean.

I suppose "reactive" or some other term implying direct proportionality to the amount of money being spent would scare the investment marketeers.

"Cycle" very strongly implies some periodic nature to the swings of the economy, which are common enough to have earned a descriptive term. What do you call that, then?

Re:Cyclical downturn? (1, Funny)

pedestrian crossing (802349) | more than 9 years ago | (#10468629)

It follows the "cycle" of Windows "upgrades".

Re:Cyclical downturn? (1)

swb (14022) | more than 9 years ago | (#10468633)

I was wondering that too. I'm like "Geeze, I thought we might have another stab at the economy recovering {more, some, at all} next year."

I kind of wonder what a Kerry election might do for all this. Not practically, as in Kerry policies, but psychologically to markets and the business community.

Re:Cyclical downturn? (0, Flamebait)

sydres (656690) | more than 9 years ago | (#10468995)

Probably kill it, since I doubt he will be too friendly with corporations; putting the bite on their profits through heavy taxation, all while taking their handouts and soft money in the name of his party, of course. And since Dems traditionally believe in taxation to fund their programs, we the people will get squeezed - or at least those of us who work for a living.

Re:Cyclical downturn? (1)

mefus (34481) | more than 9 years ago | (#10469850)

Probably kill it, since I doubt he will be too friendly with corporations

Sure, if by "corporations" you mean Bechtel and Halliburton. More money fled the economy after Bush won than ever. 9/11 is a tiny blip on the landslide fall [yahoo.com] of the economy after Bush started looking likely in the polls in 2000.

I think Bush scares the f**k out of business.

Re:Cyclical downturn? (1)

swb (14022) | more than 9 years ago | (#10470409)

I don't know about that. Many financial leaders (as opposed to the get-rich-quick crowd typically fronted as "business leadership") are leery of Bush's policies at this juncture. They like the idea of reduced taxes, but Bush's government spending has been profligate and he's amassed deficits that the real money guys are scared of.

His foreign policy hasn't made Americans popular and has a long-term price tag associated with its wars and occupations, which doesn't make their view of the deficit any rosier.

My personal opinion is that a Kerry election would likely be seen by many as a welcome change (if only for the distraction of a change) and might lead to just enough optimism to push the economic engine hard enough to create a sustainable economic upsurge.

Any kind of "bad democratic liberalism" which might anger markets and economic leaders would take years to show any real effect.

One word.. Inventory (4, Interesting)

tanveer1979 (530624) | more than 9 years ago | (#10468744)

The semiconductor industry's inventory stats are the ones that give CEOs nightmares. Semiconductor companies mostly make chips which are sold to big buyers like Siemens (modems), ENI (modems), Dell, Nokia, etc. These companies tell the semiconductor guys, "I need so many chips for so many cell phones/cameras," and they over-order by a small margin. Next year new tech arrives, old models are sold for scrap, and again inventories rise. In many cases over-ordering reaches uncomfortable levels. You can't throw away 20% of your cell phones at cost price, can you? So they don't order. But semiconductor companies have huge fabs running, and when such cases arise you have fabs operating at half capacity or even lower, and this leads to big losses. Another problem is that a new chip goes from design to fab about six months before production begins, and if problems turn up in the chip it may not actually reach the vendor for a year. So what do we have here? Based on demand this year, we plan for next year, and if inventories pile up, it's bad luck.

If you wonder why semiconductor companies can't simply reduce production, the reason is that when we design a chip, there is a minimum number of units required for the chip to be profitable. This number is in the range of 500,000+ units, and such volumes are hard to predict. In the case of DSL/cable modem chips, design and conception start one and a half years before release to fab, and six months after that full-blown production starts. So we have to know two years in advance what people will want. It's two years of R&D by over 100 engineers which leads to a chip. And look at the infrastructure investment: farms of hundreds of 64-bit 2GHz+ machines, UltraSPARCs, etc., running for 1.5 years simulating, testing, designing.

Get the idea? Chip design is a costly business, and if market analysts got more accurate instead of being stupidly bullish, this cyclic downturn could be much softer.
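To make that break-even figure concrete, here is a rough back-of-the-envelope sketch in Python. Only the ~100-engineer team and the roughly two-year R&D span come from the comment above; the per-engineer cost, infrastructure cost, and per-chip margin are hypothetical placeholders, so treat the output as an illustration of the arithmetic, not real numbers.

    # Back-of-the-envelope break-even estimate for a chip design.
    # Team size and R&D span come from the comment above; every
    # dollar figure is a hypothetical placeholder.
    ENGINEERS = 100                     # design team size
    YEARS_OF_RND = 2.0                  # concept-to-production span
    COST_PER_ENGINEER_YEAR = 200_000    # assumed fully loaded cost, USD
    INFRASTRUCTURE = 10_000_000         # assumed compute farm / tools, USD
    MARGIN_PER_CHIP = 60.0              # assumed gross margin per chip, USD

    nre = ENGINEERS * YEARS_OF_RND * COST_PER_ENGINEER_YEAR + INFRASTRUCTURE
    break_even_units = nre / MARGIN_PER_CHIP

    print(f"One-time design cost (NRE): ${nre:,.0f}")
    print(f"Units needed to recover it: {break_even_units:,.0f}")

With these made-up inputs the answer lands in the high hundreds of thousands of units, the same order of magnitude as the 500,000+ figure above: the fixed design cost has to be amortized over an enormous volume that must be guessed two years ahead.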

Re:Cyclical downturn? (2, Interesting)

gadget junkie (618542) | more than 9 years ago | (#10468814)

"Could someone explain why the semiconductor industry is 'cyclical'? What is it which makes a downturn predictable, or is it a self-fulfilling thing (lack of investment during predicted downturns causes otherwise unnecessary lack of performance)?"

It is relatively simple: any new plant, or major refurbishment of an existing plant, adds so much capacity that demand takes a while to catch up.
If demand grows even slightly less than forecast, capacity utilization falls, and the company ends up running the plant for cash, i.e. pricing the product down to move inventory and recoup part of the building cost, making huge losses in the process.

This is particularly relevant at this point in time for Intel and AMD, since the forecasts for corporate computer demand have been way off the mark these last three years.
Why they were so high really escaped me at the time: for a generic office computer, any Duron is really overkill, and corporations can refuse to install the latest and greatest MS operating system and get by with more of what they have now.

Re:Cyclical downturn? (3, Informative)

nelsonal (549144) | more than 9 years ago | (#10469237)

Normally a cyclical industry is one that involves large capacity additions which make up both a significant amount of the productive capacity and a significant portion of the cost of each unit. Semiconductors and tankers are both classically cyclical industries. In real terms, let us imagine that you and I have built a brand new fab: we perhaps raised $500 million in equity, got grants of $500 million, and borrowed $2 billion. Perhaps we decide to compete in memory manufacture. Prior to our fab going online, let us say that capacity was 100k 8-inch wafers per month; following our fab, capacity is 140k 8-inch wafers per month. Each month, to pay for our fab, we need to generate sales of about $1,600 per wafer to cover the fab's cost. If we did our market research well, we should be able to cover that cost. If we didn't (perhaps price declines significantly at 145k wafers, and one of our competitors increased their capacity by 20k wafers while we built our fab), then everyone is producing to shut someone down, or waiting for demand to grow so that 160k wafers can be produced profitably (which could take several years). Upturns are caused by all competitors waiting longer to build; downturns are caused by capacity being added more quickly than demand; and the most money is made by being the only one to build before an upturn. Under those three rules (with no collusion) you will always have big upturns, then overbuilding, then downturns, then upturns - a nice sine-wave cycle. The overall trend of the sine wave is up, but the sine wave has a big amplitude.
In processors and other high-value chips this is less true, as engineering and design talent reduces the number of competitors you are dealing with, allowing more normal capacity increases to take place. Over the last thirty years only three big semiconductor companies have reliably generated more cash than is required to build for the next cycle (Intel, Analog Devices, and Linear Technology); all three compete with very few competitors, and their products require a considerable amount of design before manufacture can take place.
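To see roughly where a per-wafer figure like the $1,600 above could come from, here is a minimal Python sketch using the comment's capital numbers and the 40k-wafer capacity addition. The write-off period is an assumption (fab equipment is typically depreciated over a few years), and interest and operating costs are ignored, so this shows the shape of the math rather than a real cost model.

    # Per-wafer cost needed to recover the fab's capital.
    # Capital amounts and added capacity come from the comment above;
    # the 4-year write-off period is assumed, and interest/operating
    # costs are ignored to keep the sketch minimal.
    EQUITY = 500_000_000       # USD
    GRANTS = 500_000_000       # USD
    DEBT = 2_000_000_000       # USD
    ADDED_CAPACITY = 40_000    # 8-inch wafers per month from our fab
    WRITE_OFF_YEARS = 4        # assumed recovery period

    total_capital = EQUITY + GRANTS + DEBT
    wafers_over_period = ADDED_CAPACITY * 12 * WRITE_OFF_YEARS
    cost_per_wafer = total_capital / wafers_over_period

    print(f"Capital to recover per wafer: ${cost_per_wafer:,.0f}")

That works out to roughly $1,560 per wafer, in line with the ~$1,600 figure in the comment. If the market price drops below that, the fab keeps running anyway as long as price covers variable cost ("running the plant for cash"), which is exactly the overcapacity dynamic described above.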

Re:Cyclical downturn? (3, Interesting)

supergiovane (606385) | more than 9 years ago | (#10469557)

You're Mr. Intel. You have to explain to your shareholders that AMD got it right, that they are now driving innovation (e.g. x86-64), and that you're the one who has to catch up. What would you say?
  1. We're in deep shit, boys! You'd better invest your bucks in another company.
  2. It's the cyclical behaviour of the semiconductor industry. Now we're getting hit, but next year we'll kick their asses and grind them into dust. So don't worry and give us your money.

Maybe I'll do my part next year... (4, Interesting)

jmcmunn (307798) | more than 9 years ago | (#10468573)


And buy a new processor to upgrade the 300MHz PII I am running here at home. Nahh... it still loads Slashdot just fine. I'll wait till the next generation comes out and then buy one of the current chips. (I have been saying that for 4 years now.)

Re:Maybe I'll do my part next year... (2, Insightful)

Timesprout (579035) | more than 9 years ago | (#10468668)

I would like them to only release chips on a regularly defined cycle of, say, 500MHz speed increases, rather than releasing every improvement they can squeeze out of the chip. I think people would find it easier to plan and commit to a purchase this way. I think processors are fast enough now to handle the needs of the vast majority, and there's not a great deal to be gained by flooding the market with different processor speeds and people _always_ waiting to maybe purchase the next small incremental release.

Re:Maybe I'll do my part next year... (3, Insightful)

joib (70841) | more than 9 years ago | (#10468766)


I would like them to only release chips on a regularly defined cycle of, say, 500MHz speed increases, rather than releasing every improvement they can squeeze out of the chip. I think people would find it easier to plan and commit to a purchase this way. I think processors are fast enough now to handle the needs of the vast majority, and there's not a great deal to be gained by flooding the market with different processor speeds and people _always_ waiting to maybe purchase the next small incremental release.


Umm, no. That wouldn't be a very good idea. The reason, in short, is price discrimination [wikipedia.org]. By having a wide variety of products, they can better milk the customers. And the customers win too, since they can choose which product best matches their requirements. It's a win-win situation, so to speak.
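A toy example of the price-discrimination argument, in Python. The willingness-to-pay numbers are invented purely to show the mechanism: one price leaves money on the table, while a ladder of speed-binned SKUs lets each segment pay close to what it is willing to.

    # Toy illustration of price discrimination with speed-binned SKUs.
    # The willingness-to-pay figures are invented for illustration only.
    willingness_to_pay = [120, 180, 250, 400]   # four buyer segments, USD

    def revenue_at_single_price(price, wtp):
        # Everyone whose willingness to pay is >= price buys one unit.
        return price * sum(1 for w in wtp if w >= price)

    best_single = max(revenue_at_single_price(p, willingness_to_pay)
                      for p in willingness_to_pay)

    # With one SKU per segment, each buyer pays roughly their own limit.
    segmented = sum(willingness_to_pay)

    print(f"Best single-price revenue: {best_single}")   # 540 (one price of 180)
    print(f"Segmented SKU revenue:     {segmented}")     # 950

The segmented lineup also lets budget buyers get a chip at all, which is the "customers win too" half of the comment.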

Re:Maybe I'll do my part next year... (1)

gadget junkie (618542) | more than 9 years ago | (#10468843)

....You know, I sold my semis stocks when I suddenly realized that I hadn't been pestering the Sysadmin for a new machine for my cubicle for a whole year.

Re:Maybe I'll do my part next year... (0)

Anonymous Coward | more than 9 years ago | (#10468879)

Well, you seem to be in a good position then, because 300MHz is well below what is considered "useful" for a computer. You can probably find people throwing out computers in the Pentium 450-500MHz range - maybe upgrade some RAM too, depending upon how much you have.

Re:Maybe I'll do my part next year... (1)

MtViewGuy (197597) | more than 9 years ago | (#10469084)

Actually, if you're willing to invest a few hundred dollars, you can bring your computer box up to date.

For example, if your computer case supports the ATX form factor you could get an Abit VA-10 motherboard, an AMD Sempron 2400+ boxed CPU (e.g., CPU with CPU fan already installed), and 512 MB of DDR333 SDRAM for the few hundred dollars I mentioned. The result would be a dramatic increase in performance--just the CPU performance will probably be 8-10 times what you have now. :-)

Re:Maybe I'll do my part next year... (1)

Blakey Rat (99501) | more than 9 years ago | (#10470927)

Moderated as "Interesting?" How is that "Interesting?" Man, if there was an option for it, I'd moderate it as "inane."

Is it a neat read... (3, Insightful)

Ghoser777 (113623) | more than 9 years ago | (#10468577)

because it paints a major decline in the Intel empire, or because it actually has insightful commentary and information?

And yes, I didn't RTFA.

Matt Fahrenbacher

Re:Is it a neat read... (5, Funny)

steevo.com (312621) | more than 9 years ago | (#10468666)

It is a neat read because I don't work at Intel anymore.

Re:Is it a neat read... (0)

Anonymous Coward | more than 9 years ago | (#10469270)

Over the past two years, AMD has become known as the innovator in computer processors, thanks to its 64-bit chips, which allow applications to run more efficiently.

You call this insightful?

Not to mention the unexplained "cyclism" of the industry -- no such thing. Ask anybody who's been there a decade or two.

And Samsung painted as a rival (alongside AMD and TI) -- but not a single mention that Samsung doesn't make CPUs. (It makes big boatloads of memory chips, cellphone processors, and similar, fairly simple stuff.) Sure, it's a huge competitor in the entire semiconductor industry, but in this case the particular market segments should have been clarified.

Re:Is it a neat read... (1, Funny)

Anonymous Coward | more than 9 years ago | (#10469410)

Sorry, I'm the AC above, and due to a brainfart I didn't realise you were asking, not claiming.

Please change the opening to "Would you call this insightful? :)"...

Competition is good (3, Interesting)

ancice (817863) | more than 9 years ago | (#10468578)

Competition is good. At worst, if it doesn't accelerate the progress of better products, it will at least act as a check on the dominant players.

Although Intel is lacking on the 64-bit side.

Hmmmm (5, Insightful)

antifoidulus (807088) | more than 9 years ago | (#10468589)

Problems at Intel, problems at Microsoft. Could it be that some companies just get too big for their own good?
It happened to other companies; just look at US Steel. In 1918 they represented 3% of the GDP of the US, but they got too big, and eventually competitors, both at home and abroad, ate up most of that.

Re:Hmmmm (5, Insightful)

Anonymous Coward | more than 9 years ago | (#10468631)

Not really comparable. Steel is a generic product where there's little brand loyalty and essentially no product differentiation. Microsoft and Intel, on the other hand, make products that no one else can duplicate directly--you can compete with similar products, but some shop in Taiwan can't turn out Pentium 4s.

Re:Hmmmm (1)

ceeam (39911) | more than 9 years ago | (#10468681)

How much of the profit (and turnover) of Intel comes from the P4? (Just curious.)

BTW, IMHO, Intel is long overdue for a good PC CPU. I'd say they have not made one since the socketed P3s. When I immediately try to talk anyone out of buying a P4, it's not because I hate Intel; the CPU is just crap.

Pentium-M (0)

Anonymous Coward | more than 9 years ago | (#10469333)

A good Intel CPU:

Pentium-M (the Banias derivatives). It's a socketed Pentium 3 on steroids but consuming way less power.

Don't be at all surprised when^W if Intel dumps the P4 (after a decent face-saving period) and rolls out an all P-M desktop CPU selection. Certain to appear in smallish and blade servers, too.

Remains to be seen how far Itanium floats. POWER5 gives some hope that the big iron market hasn't disappeared anywhere.

Re:Hmmmm (0)

Anonymous Coward | more than 9 years ago | (#10469393)

Good question!

Another one: how long do they themselves *plan* to profit from it?

The switch to model numbers instead of Almighty Gigahertz says loud and clear that they *know* they can't crank the P4's clock speed up much longer. Either it stops scaling up, or the power & heat issue becomes intolerable. Maybe they'll get some (model number) bumps from a higher FSB and more cache, but for how long?

IA-64 for the desktop (mwahahaha!) or Pentium-M EMT64?

Re:Hmmmm (2, Interesting)

jkrise (535370) | more than 9 years ago | (#10468712)

Excellent point! I think we're very lucky that mfrs. of these items do not have a near-monopoly situation:

1. Memory.
2. CD, DVD drives and media.
3. Network cards.
4. Hard disk drives.

If one or more of these items were controlled by patents / monopolies, the situation could be alarming... Just wondering - can Intel patent its chip pin-outs / signal levels (not the internal design) in such a way that other manufacturers cannot replicate the function?

-

Re:Hmmmm (1)

mmkkbb (816035) | more than 9 years ago | (#10468873)

Just wondering - can Intel patent its chip pin-outs / signal levels (not the internal design) in such a way that other manufacturers cannot replicate the function?

I think it would be tough to patent signal levels, but didn't Intel try patenting the socket with the Pentium II?

Re:Hmmmm (1)

goldmeer (65554) | more than 9 years ago | (#10469727)

Intel has a patent on the P4 bus. Intel sued VIA when it released some chipsets for the P4. The two companies settled on a cross-license agreement.
So, yes, Intel can and has patented the pinout and signaling for the P4 processor.
Google is your friend. [justfuckinggoogleit.com] :)

I agree (1)

cslarson (625649) | more than 9 years ago | (#10468896)

I agree. I have a harder time convincing myself to invest in those huge companies. For a company to increase shareholder value it has to grow (growth vastly outweighs factors such as trimming costs). I believe huge companies can reach a critical mass at some point where it becomes increasingly difficult to add market share (especially if you already have ~90%), meaning that the company has to make gambles in new markets it may not be as well versed in, while also protecting its core products.

Re:I agree (1)

Dravik (699631) | more than 9 years ago | (#10470936)

With the most recent tax cuts couldn't you invest for dividends instead of growth? No company can grow forever.

Re:Hmmmm (3, Interesting)

cyngus (753668) | more than 9 years ago | (#10469023)

I think that's pretty much the story of corporations through all time. I think that extends even beyond corporations, to countries.

Let's look at Intel and Microsoft. Both rose to dominance because they had a good product at the right time, with good marketing. (I'm a Mac fan, I think Windows is sh*t, but there's no denying that Microsoft has made computers more accessible to a wider audience, although Apple has always made the better product.) Now both are having some problems. Why? Three main reasons:
1) Everybody targets the leader - if you're the leader in an industry, everyone can see your weaknesses and target them to take you down. You're the guy to beat and people are going to try to do that.
2) The leader is big, and knows it - the leader of an industry is typically big, with big sales and big profits. They spend accordingly and build out accordingly; adjusting to lower profits is harder when you're used to them.
3) The leader is typically slower - 3 follows from 2, in that if you're a bigger company it's harder for you to change course and take advantage of new ideas and trends. Firstly, your organization is larger and therefore harder to manage. Secondly, your customers tend to hold you more accountable for servicing them; the underdog gets more leeway, because he's the underdog.

So companies tend to start out small, grow, become too big to adjust quickly to a changing environment, and then die or break up. Some companies (IBM is a good example) manage to just fall into decline for a while and then emerge as a power player again, but this is hard to do for several reasons, such as regaining customer confidence, having enough money to engineer the turnaround, and the difficulty of changing the corporate culture to fit the reinvented company.

Re:Hmmmm (1)

HuguesT (84078) | more than 9 years ago | (#10470165)

There was a time, between the release of Windows NT 4.0 and the advent of Mac OS X 10.1, when Microsoft arguably had the better OS on every front you care to mention: security, usability, adherence to standards, stability, running on better hardware (Alpha!), etc.

I'll always remember the John Carmack .plan entry from when he started working on Mac OS 8 to port Quake 3: "you have to be at peace with rebooting". It was so true it infuriated all the Mac fans.

Now of course WinXP has lost its way. It only runs on 32-bit Intel, cannot be put on the Internet for more than 20 minutes in an unpatched state before being hacked, and makes no sense on headless servers, whereas Mac OS is finally getting 64-bit support on reasonably beefy G5s, looks gorgeous, and runs beautifully.

For a moment, though, Apple looked perilously close to the brink.

Still... (4, Insightful)

StevenHenderson (806391) | more than 9 years ago | (#10468623)


As long as Dell is almost exclusively Intel, then they ought to be just fine. It is Intel's exclusivity agreements that will sustain them in times like these, I'd wager. (Yes, I know Intel's problems aren't just in the desktop market, but I like to over-generalize).

Re:Still... (1)

Gr8Apes (679165) | more than 9 years ago | (#10469115)

Dell has started making some of the same mistakes that Gateway made in the 90s, namely low-bidding. While this is not as huge a mistake now as it was back in the 90s, the quality of Dell computers is falling because of it.

Re:Still... (1)

StevenHenderson (806391) | more than 9 years ago | (#10469143)

Oh, I agree that the quality of Dell computers has been falling for quite some time. However, the ignorance of the consumer (i.e. suckers for marketing) is holding steady, so it works out well for them.

Re:Still... (1)

Gr8Apes (679165) | more than 9 years ago | (#10469710)

True - I haven't bought a Dell nor recommended one since around 2000. They were relatively good machines back then. For personal machines, I've built my own ever since, getting a much better hardware platform for about half the price, or even less.

Re:Still... (1)

HuguesT (84078) | more than 9 years ago | (#10470455)

The corporate Dell lines aren't so bad. Sure, they are unupgradable, but they are easy to maintain (no screws to open them, everything pulls apart very simply, neat plastic rails for disks, very accessible motherboards). They are designed to run Office, though; don't try to play games on them (it will work, but not very well), and they simply don't overclock. The BIOS will not give you any option in that regard.

One other thing they have going for them is that they are extremely quiet. I have what I consider a very quiet *underclocked* SFF PC that I purpose-built myself with large low-speed fans sitting next to a Dell, and the Dell is quieter. The SFF PC only makes a kind of whoosh of displaced air (no fan noise whatsoever), but as for the Dell, not only do its fans run quietly, they have better *turbulence* control. I know of water-cooled PCs that are noisier.

The consumer-level Dells (Dimension) are complete and utter crap, however.

invites (-1, Flamebait)

Anonymous Coward | more than 9 years ago | (#10468640)

Re:invites (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#10468655)

...Hey everybody, I'm posting dirty links!

Maybe Intel will rethink their partnership with MS (-1, Troll)

Anonymous Coward | more than 9 years ago | (#10468648)

And start catering to the consumer instead. I would love to see Intel come out with silicon made especially for open source.

Magnificent.... (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#10468663)

This is a textbook example of Troll-whoring. You sir, are a credit to your kind.

Xscale (4, Interesting)

mirko (198274) | more than 9 years ago | (#10468652)

The i86 architecture is dying and Intel could not release a decent 64-bit proc in time, so they'll have to rely on the XScale processors, which are, after all, ARM compatible.
As ARM has had the biggest sales in the world in recent years (not on the desktop, but everywhere else), this would just imply that Intel will keep its dominance, but outside the PC market.

Re:Xscale (1)

Junks Jerzey (54586) | more than 9 years ago | (#10468907)

The i86 architecture is dying and Intel could not release a decent 64-bit proc in time

But in reality, only the smallest fraction of the market is looking for 64-bit processors. Consider that Dell still sells 256MB PCs, PCs with 1GB in them are still a small minority, and PCs with more than 1GB are rare indeed. (I know, I know, certain people in certain fields will chime in and talk about how 2GB is a requirement, but they're still in the minority.)

Re:Xscale (1)

mirko (198274) | more than 9 years ago | (#10468960)

Two words: Doom 3.
More words: Doom, then Quake, made the public upgrade.
I have no doubt Doom 3 will also influence hardware obsolescence.

Re:Xscale (0)

Anonymous Coward | more than 9 years ago | (#10469553)

Doom 3 makes some people upgrade from 256 to 512MB. Big frigging deal...

Don't expect the same impact. Doom 3 didn't emerge into the same kind of 3D-game vacuum that Doom did back in the day. Ever since, there have been enough games to make gamers keep their rigs top notch, and the majority of users either have consoles or don't bother with yet another first-person shooter.

What Doom 3 is more likely to influence (among gamers) is video card upgrades. Even half-serious gamers have sufficient CPUs and RAM to play the game at decent settings. While it's fairly intensive for the entire system (thanks to the shadow volume calculations), it's pronouncedly a pixel fill-rate hog (thanks to the stencil-buffer shadowing passes).

The landscape has changed. Doom 3 was much anticipated, but landed kinda just between Far Cry and Half-Life 2... and before Duke Nukem 4Ever ;-)

Re:Xscale (1)

Junks Jerzey (54586) | more than 9 years ago | (#10469609)

Two words: Doom 3.
More words: Doom, then Quake, made the public upgrade.
I have no doubt Doom 3 will also influence hardware obsolescence.

Doom 3 runs perfectly on a 3GHz PC with 512MB and a good video card. *Perfectly*. This is exactly the setup I've been using, and the game is smooth as silk. Please note that the "ultra" quality setting doesn't count, because it trades an imperceptible increase in video quality for a *massive* increase in bandwidth (it doesn't use lossy compression on certain types of multi-pass textures, like normal maps, so they get *huge*). Personally, I can't tell the difference between the top three quality settings when I'm playing the game. I can only tell if I walk around pausing the game, specifically looking for artifacts (and even then I have to really put some effort into it).

Re:Xscale (1)

Gr8Apes (679165) | more than 9 years ago | (#10469153)

Dell's bottom-end PCs aside, no decent PC should come with less than 512MB of RAM. That's just so that you can truly multitask (why shut down your email and browser when loading Doom 3, for instance?).

Also, 1GB is chump change even in the land of 32-bit computing. 4GB is the max, whether MS can support it or not. 64-bit processing is important for things other than maximum memory access: photo/sound processing comes to mind, not to mention video editing. That's just for starters.

I'll end by saying that 99% of everyday PC tasks don't need any of the above. Heck, the only reason they need anything above the original Pentium or 486 is that MS's ubiquitous bloatware office apps create 100KB two-line documents.

Re:Xscale (1)

Junks Jerzey (54586) | more than 9 years ago | (#10469555)

Also, 1GB is chump change even in the land of 32-bit computing. 4GB is the max, whether MS can support it or not. 64-bit processing is important for things other than maximum memory access: photo/sound processing comes to mind, not to mention video editing. That's just for starters.

You realize, of course, that the x86 line of processors has *always* supported 64-bit floating-point operations (in reality, all operations are 80 bits internally). This goes all the way back to the original 8087 coprocessor. And there's always been a 64-bit bus on Pentiums, even the very first model. So what 64-bit operations are you wanting? Pure 64-bit integer math?

Re:Xscale (1)

HuguesT (84078) | more than 9 years ago | (#10470518)

Memory moves are more efficient in 64-bit mode. Memory bandwidth on the Opteron/Athlon 64 is much better than with Intel, and that is even without the additional registers that come with these processors.

x86 architecture still alive thanks to AMD. (4, Insightful)

MtViewGuy (197597) | more than 9 years ago | (#10469021)

Actually, the x86 CPU architecture is still alive thanks to a company called AMD. :-)

AMD's groundbreaking Athlon CPU core is far superior to what Intel has, and the Athlon XP showed that you don't need ridiculous clock speeds to get superior overall CPU performance, thanks to the combination of the very efficient Athlon core and generous on-die L1/L2 caches. AMD's decision to put the memory controller onto the CPU die with the Opteron/Athlon 64 also demonstrates how to get superior CPU performance without running the high clock speeds Intel needs with the Pentium 4.

Re:x86 architecture still alive thanks to AMD. (0)

Coolpup (796096) | more than 9 years ago | (#10469088)

Parent refers to i86, not x86. i86 refers to the Itanium platform, which many programmers hate. It is very doubtful that x86 will die anytime soon.

Re:x86 architecture still alive thanks to AMD. (0)

Anonymous Coward | more than 9 years ago | (#10469276)

IA-32 = x86
IA-64 = IPF (Itanium Processor Family)

How could i86, whatever it means, refer to Itanium?

Re:x86 architecture still alive thanks to AMD. (1)

mirko (198274) | more than 9 years ago | (#10469378)

For me, i86 specifically refers to Intel IA-32 procs.

Re:x86 architecture still alive thanks to AMD. (1)

genneth (649285) | more than 9 years ago | (#10469715)

And Intel still has an R&D budget the size of AMD's annual turnover... OK, that's complete bullshit, but you get the point: Intel still has a lot of money left in its pocket, and no lack of talented people to pay with that money. Take the Pentium M. I'll pick a PM over an Opteron any time I don't want/need SMP.

Re:x86 architecture still alive thanks to AMD. (1)

MtViewGuy (197597) | more than 9 years ago | (#10469789)

Actually, the main reason the Pentium M CPU was considered a success was that Intel stuffed a massive amount of cache memory onto the CPU die (I believe the Pentium M die has 1024 KB or more of cache). Small wonder that future Pentium CPUs for desktop computers will be derived from the Pentium M design.

Re:x86 architecture still alive thanks to AMD. (2)

fitten (521191) | more than 9 years ago | (#10470405)

Actually, the main reason the Pentium-M is considered a success is that it offers very respectable performance at very low power requirements. It was designed for laptops and blades.

The Pentium-M has a number of very interesting technologies. For example, in lower power states, it can actually turn off parts of its caches to save power (effectively meaning it has smaller caches at lower power).

The only thing the Pentium-M isn't strong in (strong being measured by computational power per clock) is FPU performance (again, a power-saving compromise). Still... my Pentium-M 1.4GHz laptop can equal a Pentium 4-M 2.2GHz (actually, it's just a smidgen faster) doing things like MP3 encoding with LAME. At integer work, I think it is even faster.

A French site overclocked a Pentium-M a while back and showed that, clock for clock, it is as fast as or faster than an Athlon 64 on most integer-intensive things but slower at FPU (32-bit software, of course), and that is with a slower bus to memory (they jumped it up to 533MHz QDR with DDR266 memory, compared to the A64's DDR400) and no on-die memory controller, while consuming less than half the power.

I'd love to see Pentium-M EM64T CPUs for the desktop (with a beefed-up FPU subsystem).

Re:x86 architecture still alive thanks to AMD. (0)

Anonymous Coward | more than 9 years ago | (#10470247)

"Take the Pentium M. I'll pick a PM over an Opteron any time I don't want/need SMP"

You must like paying MORE and getting LESS! Yep, you're a genius!

Intel aren't doing that badly in other areas (5, Funny)

lukestuts (731515) | more than 9 years ago | (#10468657)

I hear the pipeline on the P5 is going to be so long that Halliburton want to license it for reconstruction work in Iraq.

Re:Intel aren't doing that badly in other areas (2, Informative)

Fulcrum of Evil (560260) | more than 9 years ago | (#10468835)

I hear the pipeline on the P5 is going to be so long that Halliburton want to license it for reconstruction work in Iraq.

Well, you must have heard wrong. The Pentium [arstechnica.com] only has 5 stages. Or did you mean whatever comes after the P4 [amd.com] ?

Re:Intel aren't doing that badly in other areas (1)

GMFTatsujin (239569) | more than 9 years ago | (#10469944)

Does it even make sense to have a chip called the Pentium 5? Shouldn't that be, like, Pentium Squared, or something?

AMD RECEIVES THIS YEAR'S "BEST SUPPLIER AWARD" (0, Redundant)

tecman84 (815301) | more than 9 years ago | (#10468676)

SUNNYVALE, CA-NOVEMBER 20, 2000-AMD today announced that Samsung Electronics Co., Ltd. has named AMD the recipient of its 2000 "Best Supplier Award." Samsung's prestigious award was given to AMD in recognition of its excellence in supporting the Wireless Terminal Division of Samsung Electronics, with AMD's Flash memory solutions.

What do we all think of this?

The article can be found in its full text below:

URL: http://www.pcstats.com/releaseview.cfm?releaseID=416

Lack of vision (3, Insightful)

Anonymous Coward | more than 9 years ago | (#10468736)

Intel has made some pretty big mistakes over the recent years, in some cases going against common sense:

RDRAM, Itanium, dismissing 64-bit extensions for x86, frequency as the sole measure of performance, ...

It should be no surprise that now Intel's future is clouded. They have no one to blame but themselves.

Re:Lack of vision (3, Interesting)

MtViewGuy (197597) | more than 9 years ago | (#10468954)

The biggest fiasco for Intel was the Itanium project, which, while technically excellent as a CPU, exposed the big problem of a lack of software to support it.

Meanwhile, AMD brought new life to the x86 architecture with a modern, developed-from-scratch CPU design in the Athlon core. Note that AMD's CPUs have truly impressive performance per clock cycle, and AMD's decision to move the memory controller onto the CPU die with the Opteron/Athlon 64 allows AMD to match the performance of the latest Intel Pentium 4 CPUs without Intel's need to run very high clock speeds.

Re:Lack of vision (2, Insightful)

joib (70841) | more than 9 years ago | (#10469117)


The biggest fiasco for Intel was the Itanium project, which, while technically excellent as a CPU, exposed the big problem of a lack of software to support it.


I wouldn't say excellent. Itanium is a somewhat competitive CPU in the high-end market, but it's far from the original goal of running circles around the competition. Not to mention that Sun and IBM are currently selling dual-core CPUs, which Intel isn't.

As I see it, Itanium was a very interesting experiment in CPU architecture that, in the end, just wasn't enough better than the status quo.


Meanwhile, AMD brought new life to the x86 architecture with a modern, developed-from-scratch CPU design in the Athlon core.


AMD is not alone. Ever since the original Pentium, Intel's own x86 CPUs have essentially been RISC processors inside, just like the Athlon.


Note that AMD's CPUs have truly impressive performance per clock cycle, and AMD's decision to move the memory controller onto the CPU die with the Opteron/Athlon 64 allows AMD to match the performance of the latest Intel Pentium 4 CPUs without Intel's need to run very high clock speeds.


I agree, I think Intel made a big mistake by focusing on maximizing clock speed for the P4. Apparently they didn't foresee the rising importance of power consumption (and associated cooling).

Re:Lack of vision (2)

MtViewGuy (197597) | more than 9 years ago | (#10469540)

Ever since the original Pentium, Intel's own x86 CPUs have essentially been RISC processors inside, just like the Athlon.

That is true, but Intel's x86 core is still heavily derived from the core pioneered in the Pentium Pro of the mid-1990s. Indeed, the Pentium II/III CPUs were essentially refinements of the PPro core.

Meanwhile, AMD's Athlon core was pretty much developed from the ground up (thanks to their acquisition of NexGen), and because it is close to a clean-sheet design, even the earliest Athlons were already out-performing the equivalent Pentium III CPUs on a clock-for-clock basis. Because the Athlon core sported such high processing efficiency, AMD's CPUs could still compete against the Pentium 4 when the Athlon XP came out; for example, the Athlon XP 2400+ matched up surprisingly well against the Pentium 4 2.4 GHz even though the AMD CPU's clock speed was lower.

Re:Lack of vision (0)

Anonymous Coward | more than 9 years ago | (#10469778)

Man, thanks for the laugh of the day! :)

Your elegant, shall I say italic style completely hides the fact that you are posting the same Captain Obvious stuff all over, with slightly but sufficiently altered wordings.

In a post above, in this thread, you explain your "from scratch" theory. It is, of course, as we both know, pulled out of an arse.

The NexGen acquisition led to the K6 (and K6-2, K6-III). AMD recycled a lot of that excellent design into the K7 Athlon. The main differences were the significant beefing-up of the decoder/dispatcher, the parallel FMUL and FMAC units, and the Alpha EV6 front-side bus.

In case you 'meant' the K8 Athlon 64, it is a direct derivative of the K7; obvious differences are the on-chip memory controller, the 64-bit registers and ALUs, and the HT chip-to-chip connections.

But your elegance made me smile. It would be too harsh to call you a troll, but I can't help myself:

"Trolling, trolling, trolling... Raaawhiiiide! RAWHIDE!"

And I mean that sincerely as a compliment! :)

Re:Lack of vision (0)

Anonymous Coward | more than 9 years ago | (#10469854)

Aw crap, should really have previewed before submitting that. Apologies for the horrendous typos. Below = above, AMD = Athlon, where = were, and so forth...

Intel is still making money (2, Informative)

Anonymous Coward | more than 9 years ago | (#10468743)

Despite Intel's problems, they reported record revenue and profit this year. They still know how to make money, and some reports say their yields are far better than others', which may be a sign of this.

Re:Intel is still making money (1)

gadget junkie (618542) | more than 9 years ago | (#10468971)

"Despite Intels problems, they made a record revenue and profit report this year. They still know how to make money and some reports says that their yields are far better than others and this may be a sign of this."


That's not how financial markets value companies, otherwise those who made the most money should have the highest Price/earning ratios and vice versa.

In reality, the stock market views company in a dynamic mode; is the company becoming moree productive, or less? does it depend on intellectual property or technological advancement? Is it strong in lucrative markets?


if you look at Intel [yahoo.com] versus AMD, [yahoo.com] , you'll see that AMD has done better. Now, AMD doesn't make the kind of money that intel does, but by getting out a products that's appealing for geeks folding like mad or *ahem* trying to be productive [pacific-fighters.com] , they garnered more profit out of a unit of production than Intel.
Remember that AMD outsell Intel in retail desktop, [theregister.co.uk] , and that's where a good part of the dough is.

AMD dualcore Opteron (3, Informative)

geeveees (690232) | more than 9 years ago | (#10468770)

In other news, there are some benchmarks of AMD's dual-core Opteron: http://episteme.arstechnica.com/eve/ubb.x?a=dl&s=50009562&f=174096756&x_id=1097194717&x_subject=Opteron+dual-core+details+emerge&x_link=http://arstechnica.com&x_ddp=Y/ [arstechnica.com]

It appears AMD designed the Opteron from the ground up to be dual-core.

Re:AMD dualcore Opteron (1)

Gr8Apes (679165) | more than 9 years ago | (#10469191)

I believe they designed it to be multi-core. I believe the initial papers talked about dual-core being released in 2004/2005 (don't recall exact dates) and said quad- and 8-way cores were on the horizon. How much of that was vapor, who knows.

Intel Compilers (3, Informative)

should_be_linear (779431) | more than 9 years ago | (#10468779)

I am predicting that Intel's compiler department will be trashed soon. According to the latest Coyote benchmarks, GCC is catching up in performance. Moreover, you cannot improve the performance of a C++ compiler beyond a certain limit, and it seems both Intel and GCC (also MS) are close to that point. So nobody will buy ICC to gain 5% on one app and lose 3% on another. The times when Intel had 30-40% over GCC will never come back.

Re:Intel Compilers (2, Informative)

Krondor (306666) | more than 9 years ago | (#10468972)

I am predicting that Intel's compiler department will be trashed soon. According to the latest Coyote benchmarks, GCC is catching up in performance. Moreover, you cannot improve the performance of a C++ compiler beyond a certain limit, and it seems both Intel and GCC (also MS) are close to that point. So nobody will buy ICC to gain 5% on one app and lose 3% on another. The times when Intel had 30-40% over GCC will never come back.

This will only hold true if and when Intel and HP scrap all Itanium plans. Itanium processors do not reorder instructions in hardware; HP and Intel decided it was the compiler's job to organize instructions for execution (a point I agree with). However, the vast majority of processors do reorder instructions themselves, and GCC takes the view that this is the processor's job. As a result, Itanium's GCC performance is absolutely pitiful. There is also no easy fix: teaching GCC to do that kind of aggressive instruction scheduling just for Itanium is no small task, and obviously changing Itanium to reorder instructions itself is no small task either. ICC, however, does this scheduling at the compiler level, and thus gets much, much better performance on Itanium than GCC. For Intel and HP to continue to market Itanium, they will need an adequate compiler. However, I don't see them selling Itanium much longer anyway, so most likely neither will ICC. It would be nice if they would open-source it, though, as decent competition to GCC couldn't hurt :).

Aren't Intel's compilers simply a "spin off" prod? (0)

Anonymous Coward | more than 9 years ago | (#10469388)

Are Intel's compilers really a full business in and of themselves, or are they some auxiliary "side benefit" of the processor business? At Microsoft, for example, the developer tools division's primary focus is providing good development tools for Microsoft developers, as well as for Microsoft itself. The fact that they make ~$20 million on the side is pretty much a tertiary benefit.

Similarly, isn't Intel's compiler business little more than a byproduct of the oodles of CPU research they do, and just a nice way to make a penny or two on the side? Maybe if they were starting from scratch they would do everything on GCC, but the fact that they already have all of this talent in house implies to me that it is probably not worth it for them to simply throw it away.

Re:Aren't Intel's compilers simply a "spin off" pr (1)

mefus (34481) | more than 9 years ago | (#10470450)

Are Intel's compilers really a full business in and of themselves, or are they some auxiliary "side benefit" of the processor business?

I think it's just a demo to convince companies to buy the real product: access to Intel's algorithms for a compiler developed to optimize instruction scheduling historically left to the CPU.

Somewhere within Compaq is a whitepaper describing optimizations available on the Alpha platform that need to be either hand-coded or written in as compiler optimizations on the Itanium.

I thought this requirement of the compiler would be met when HP decided to use Linux on the Itanium, but it didn't happen. In fact, what happened is that you may compile Linux using ICC.

(could be wrong, no links to review and too lazy to find them!)

Visual C++ .Net (1)

puz (222978) | more than 9 years ago | (#10470719)

>is pretty much a tertiary benefit.

That probably has been true in the past, but it looks like Micro$oft is now trying to make money from their compilers. I say that because programmers can no longer purchase the Pro version of Visual C++ on its own. Instead, they make you buy the more expensive Visual STUDIO .NET Pro, which also includes Basic and C#.

Re:Visual C++ .Net (0)

Anonymous Coward | more than 9 years ago | (#10470876)

If they were simply trying to get more money out of developers, they would not be considering the 'Express' lines of tools. I think that their reason for your particular issue is something else; more like trying to steer their developers into C#/VB.NET.

Intel (1)

F7F7NoYes (740722) | more than 9 years ago | (#10468918)

I forgot, are we supposed to hate Intel?

(Near) future threat to Intel (2, Interesting)

News for nerds (448130) | more than 9 years ago | (#10468975)

in the high-performance processor market is IBM. Currently its PowerPC chips power the Macintosh and a Nintendo console. For the Xbox 2 console, IBM succeeded Intel in the deal with Microsoft [theinquirer.net]. IBM's Power architecture is going to be embedded in massive volume in both the Nintendo and Microsoft consoles. Then another architecture developed with Sony and Toshiba, STI's Cell [ibm.com], will power the PS3 console and other servers/workstations. IBM's fabs will help produce AMD processors in forthcoming generations, too.

I think IBM will cause more trouble than AMD (2, Informative)

cheros (223479) | more than 9 years ago | (#10469003)

Their new Power5 chip is a seriously good piece of engineering which will make a rather savage dent in the Intel market when people realise how good it actually is.

Has anyone realised that it has an MTBF of well over half a century? More computing with less power: if you're running lots of blade servers this chip also solves your other big problem: heat.

The moment IBM comes out with pricing that approaches Intel's (and, frankly, I would be surprised if that isn't coming), anyone competent enough to work out the real TCO (get the REAL facts ;-) will not even have to think twice.

IMHO, in comparison, Sun and AMD don't even feature as a threat.

Re:I think IBM will cause more trouble than AMD (0)

Anonymous Coward | more than 9 years ago | (#10469179)

The moment IBM comes out with pricing that approaches Intel's (and, frankly, I would be surprised if that isn't coming), anyone competent enough to work out the real TCO (get the REAL facts ;-) will not even have to think twice.
Let's not go crazy here. If all you care about is Linux, Apache, and MySQL, then maybe that's true. But if you want to run Oracle or any other closed-source apps - and most companies do - you have to wait for your vendor to port the product to Power5 before you can think even once about switching. And naturally, because none of their customers are switching, Oracle et al. will see no point in making the huge investment of porting their app to a new architecture.

I'm not saying it can't (or even won't) happen, but it is not as simple or easy as you're making it sound.

Rivals Samsung (0, Offtopic)

k4_pacific (736911) | more than 9 years ago | (#10469005)

I work with a guy named Rivals Samsung. His father is Korean and his mother is from India.

Doesn't Intel own a (large) share in AMD? (1)

gtrubetskoy (734033) | more than 9 years ago | (#10469277)

I remember reading a long time ago in the "Inside Intel" book that if it weren't for an investment from Intel in the early days, AMD would never have gotten off the ground. I wonder if Intel is still invested in AMD?

Re:Doesn't Intel own a (large) share in AMD? (1)

hooqqa (805765) | more than 9 years ago | (#10470344)

I have an AMD 8088 CPU around here somewhere... Maybe Intel helped them, maybe Intel just needed another chipmaker to meet a spike in demand. The only thing keeping AMD from making SX Opterons is Intel, and vice versa - they're both big companies.

Making inroads? AMD Defeated Intel. HOW? (2, Insightful)

Anonymous Coward | more than 9 years ago | (#10469302)

Remember when Intel followed AMD's 5-to-10-year-old chip-naming practice of using a number to identify the chip rather than the raw MHz speed? Yes, that was this year. Yes, that was Intel realizing that it ain't about how big it is, it's how you use it.

And AMD has been my choice, as well as my company's choice, since '95. For almost 10 years AMD has been the cheaper, faster alternative, duplicating everything the Pentium has done and recently defeating it in most speed tests, forcing Intel to panic by releasing "Super Extreme Hyper-Upper-Cut" editions of their Pentium 4 just to MEET the already-released AMD 64-bit chips running 32-bit code.

Thus, it isn't that AMD (and Samsung??) are making inroads - they're in the lead, and Intel has been panicking for years.

Re:Making inroads? AMD Defeated Intel. HOW? (0)

Anonymous Coward | more than 9 years ago | (#10469313)

OH and uh, here's the proof:

http://weblog.siliconvalley.com/column/dangillmor/archives/001820.shtml

Cyclical huh, it's so simple now I see the light. (1)

GuyFawkes (729054) | more than 9 years ago | (#10469803)



If it's an even year, eg 2004 / 2006 / 2008, the semiconductor industry is waxing, if it's an odd year it must be waning.

Clearly since next year is an odd year, 2005, we can expect a semiconductor slump.

God these stock market types really are clever.

Intel is hard to feel sorry for... (1, Informative)

dtjohnson (102237) | more than 9 years ago | (#10471038)

In recent years, Intel has come out with the PIII with the built-in ID number, the Itanium which ran existing 32-bit software very slowly, and the P4 which has probably boosted electricity consumption worldwide to meet its voracious appetite while increasing room temperatures and air-conditioning demand. Intel has allied itself with most of the major computer makers through all sorts of sleazy schemes to the point that most of the computers for sale are 'Intel Inside' machines. The largest maker, Dell, does not offer a single machine using a non-Intel chip. Millions of users have actually had to build their own machines from components if they wanted to use non-Intel parts, such as those by AMD. If General Motors dominated the car business the way Intel dominates the computer business, we would have to buy a whole box of Chrysler or Toyota parts and put them together ourselves if we wanted to drive something besides a GM.

So yes, Intel is at a 'crossroads' of sorts, but it is of their own making. Actually it is more like a fork in the road and Intel took the wrong turn back about the time of the Pentium.