
Speculation on Real Reasons Behind Apple Switch

Hemos posted more than 8 years ago | from the the-true-switch-campaign dept.

Technology (Apple) 659

/ASCII writes "There is an article over at Ars Technica with some insider information about the reasons behind Apple's x86 switch, given that the new IBM processors seem to be a perfect fit for Apple. The article claims that Apple hopes to power its entire line, from servers to desktops to iPods and other gadgets, with Intel CPUs, and that by doing so it will gain the same kinds of discounts that Dell gets."


659 comments

Apple v. Dell? (5, Insightful)

PlancksCnst (877593) | more than 8 years ago | (#13032210)

Does Apple really sell as much (volume-wise) as Dell does?

Re:Apple v. Dell? (4, Insightful)

Dagny Taggert (785517) | more than 8 years ago | (#13032216)

Good point. I would assume that sales volume would have to be very, very high to receive a Dell-like discount. I don't think Apple qualifies. Then again, Intel might give them a good discount to keep them onboard. Apple could always re-marry IBM.

Re:Apple v. Dell? (4, Insightful)

CaymanIslandCarpedie (868408) | more than 8 years ago | (#13032302)

Then again, Intel might give them a good discount to keep them onboard. Apple could always re-marry IBM.

Or just use the oldest trick in the book ("We are looking at using some chips from AMD."), and then see what "discounts" you qualify for ;-)

Re:Apple v. Dell? (2, Insightful)

weg (196564) | more than 8 years ago | (#13032360)

Apple could always re-marry IBM.

Of course. Apple users have already become used to this kind of change...

Re:Apple v. Dell? (1)

PlancksCnst (877593) | more than 8 years ago | (#13032413)

Yes, but think of all the schools/govt agencies/large businesses that have all Dells on their desktops.

Re:Apple v. Dell? (2)

Knome_fan (898727) | more than 8 years ago | (#13032226)

If you consider consumer products like the iPod (and in fact the article speculates that Apple will be focusing more and more on these kinds of products), they are selling quite an impressive volume, I would guess. Whether it's as big as Dell's, I don't know, but it's probably big enough to be awarded discounts.

interesting take on ipod centric-business planning (5, Interesting)

J Barnes (838165) | more than 8 years ago | (#13032214)

This is a really interesting take on the switch that I hadn't considered before. This move to Intel makes all the sense in the world if Apple is trying to cram an Intel processor inside the iPod, and for pure volume discounts alone, this could really help Apple's overall profit margin.

I'd worry about putting all my eggs in one basket, but I suppose as far as baskets go, Intel is a relatively safe bet overall.

Re:interesting take on ipod centric-business plann (5, Interesting)

Ed Avis (5917) | more than 8 years ago | (#13032225)

And why does Apple need to switch from plain-Jane ARM processors to Intel's greased-lightning XScale? What do they need that extra power for? Why, to bring back the Newton, of course!

Re:interesting take on ipod centric-business plann (5, Funny)

/ASCII (86998) | more than 8 years ago | (#13032289)

The second coming of Newton, a video iPod or perhaps a PSP killer. Or all of the above, but with an integrated cellphone. And a pony!

Whatever the plan, we need new terms. (4, Funny)

doublem (118724) | more than 8 years ago | (#13032420)

First, we had the Wintel monopoly.

Then some competitors came along and the non-Intel processors running Windows carved out a large enough market share to justify splitting the terms off into ChipZilla and Wintendo.

Now, we have the Mac going Intel. What the HELL do we call this?

MacTel?

Intelmac?

Apptel?

Intenapple?

What terms can we use now???

Re:Whatever the plan, we need new terms. (1)

/ASCII (86998) | more than 8 years ago | (#13032488)

I'd guess they'll be called Macs, iBleh, etc. Apple has never been very big on emphasizing the names of their contractors.

Re:interesting take on ipod centric-business plann (5, Funny)

ceeam (39911) | more than 8 years ago | (#13032324)

Thanks - the image of Isaac Newton with two bolts to his head (and stitches) growling "I live again" will haunt me today.

Re:interesting take on ipod centric-business plann (3, Funny)

utexaspunk (527541) | more than 8 years ago | (#13032482)

I suppose Einstein, with his hairdo, will make a fitting bride of Frankenstein... :)

Re:interesting take on ipod centric-business plann (1)

lisaparratt (752068) | more than 8 years ago | (#13032338)

Hmm - ISTR ARM9 and above are faster, clock for clock, than the XScale.

Re:interesting take on ipod centric-business plann (2, Informative)

SilentSheep (705509) | more than 8 years ago | (#13032406)

IIRC, I read that the new iPod will move to two ARM9 processors instead of the current setup (two ARM7s and an ARM9). ARM9 processors are generally 'quicker' than XScale processors: not necessarily more MHz, but like for like, ARM performs better! (Trust me, I work for ARM!)

Re:interesting take on ipod centric-business plann (3, Funny)

Zediker (885207) | more than 8 years ago | (#13032461)

mmmm Throughput.... Tastes like... vectory....

Elements (2, Interesting)

Anonymous Coward | more than 8 years ago | (#13032287)

Intel have been working on something big. It was previously rumored that this something was the Pentium V and that Microsoft would be releasing a special version of Windows [theinquirer.net] specifically for the processor.

"Windows Elements"?

What the hell is that? I'm thinking that the Pentium V has something so revolutionary that it prompted:

1) Microsoft to release a special version of Windows, specifically for the processor and,
2) Apple to change sides.

I also think that Intel expected to be much further along on the Pentium V at this point. It seemed like they were expecting to use it to counter AMD's 64-bit lead and, when the design was set back, they scrambled to come up with EM64T as a stopgap solution.

So just what is this Pentium V and the "stackable" design, anyway? IMHO, it will be a unified processor and NVRAM (not flash, something new). There will probably be at least a few gigs of very fast NVRAM right on the processor. This NVRAM will be as fast as or faster than SRAM, so there will be no need for a cache or external system memory - the operating system will be installed right in the processor. The stackable design is for expansion.

Intel's NVRAM page [intel.com]. Nothing to indicate that any of this is true but some interesting reading, nonetheless. This could also explain MontaVista's PRAMFS [sourceforge.net].

If the backing-store RAM is comparable in access speed to system memory, there's really no point in caching the file I/O data in the page cache. Better to move file data directly between the user buffers and the backing store RAM, i.e. use direct I/O.
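For anyone who wants to see what that looks like from user space, here is a minimal, Linux-only sketch in Python (the file path is hypothetical, and it assumes a filesystem that supports O_DIRECT; the buffer, offset, and transfer size generally must be aligned to the device's block size, which is why the buffer comes from an anonymous mmap):

    import mmap
    import os

    ALIGN = 4096  # page size on typical Linux systems; satisfies common O_DIRECT alignment rules

    # O_DIRECT bypasses the kernel page cache, so data moves straight between
    # the device and our buffer -- the "direct I/O" the quote above refers to.
    fd = os.open("/tmp/sample.bin", os.O_RDONLY | os.O_DIRECT)
    try:
        buf = mmap.mmap(-1, ALIGN)       # anonymous mmap => page-aligned, writable
        nread = os.preadv(fd, [buf], 0)  # read one aligned block at offset 0
        data = buf[:nread]
    finally:
        os.close(fd)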

Re:Elements (5, Funny)

ceeam (39911) | more than 8 years ago | (#13032481)

Wow! Do you think perchance that this Pentium V will also have something to do with "code morphing"? I heard this technology should really be revolutionary!

If they'd gone with AMD... (4, Funny)

doublem (118724) | more than 8 years ago | (#13032336)

It's too bad they didn't go with AMD processors instead. Then the iPod could have doubled as a hot plate / coffee warmer. That would be a useful technology fusion if you ask me. Far better than a crappy cell phone in an iPod!

Re:If they'd gone with AMD... (4, Funny)

FidelCatsro (861135) | more than 8 years ago | (#13032361)

But now they could shove a Pentium 4 in it and have it double as a portable grill.

iSmores, iCoffee, iFrenchPress (1, Redundant)

doublem (118724) | more than 8 years ago | (#13032445)

Ohhhhh.

The perfect camping hardware.

It's a grill, it's a music player.

What do you call Smores made with an iPod?

iSmore?

iSugar?

If they can keep decent battery life, I see this as a perfect chunk of camping hardware. It would still be cool in the office though. You could get a decent coffee maker attachment for your iPod!

Of course the filters for the iFrenchPress would be three times as expensive as filters for a regular coffee maker, even though the only difference is the Apple logo, but because it's an iProduct, people will eat it up.

Well, drink it up.

Full Article Text (TM) (-1)

Anonymous Coward | more than 8 years ago | (#13032215)

Inside the big switch: the iPod and the future of Apple Computer
By Jon "Hannibal" Stokes, Sunday, July 10, 2005

Re:Full Article Text (TM) (1)

afd8856 (700296) | more than 8 years ago | (#13032233)

My god, that's crappy. Don't preview, but at least make sure the selection box is on Plain Old Text next time.

Re:Full Article Text (TM) (0)

Anonymous Coward | more than 8 years ago | (#13032318)

WhatareyoutalkingaboutthegppostwasperfectlyeasytoreadbutwhatIlikedbestwasthathemanagedtousethebtagbutnotbr.

It's also about marketing (5, Interesting)

alexhmit01 (104757) | more than 8 years ago | (#13032219)

Right now, Apple has to market Apple machines against Windows machines, and they are hard to compare. Even when the PPC is better, people don't believe it; Apple always seems behind in performance, or in MHz/GHz, or something.

This lets a comparison with Dell/HP be VERY clear.

If the Apple hardware is $100-$200 more than a Dell, it becomes a straightforward question: is it worth this premium to get OS X? In addition, if Apple's manufacturing gets better (and they grow their share from the #8 player in the PC space to #3/#4, which is probably around a 10% market share), then they can price equally to PC players and STILL make good margins, because they don't have to pay MS their fee.

Forget JUST the processor difference; they can really enter a straight competition with a minor price premium for a superior system... Plus, if Microsoft stumbles and looks vulnerable, they can compete in the OS market.

Also, think about government/corporate contracts. Someone can write an RFP: runs Linux plus random software that is x86-only... or runs Office XP... Since Apple hardware now can, Apple can compete for that contract.

Lots of good things for Apple, and some minor fears for those of us suffering the transition. (I have in-house Cocoa apps that will now need to be QA'd on two platforms, even if development is "click a button.")

Alex

Re:It's also about marketing (0)

Anonymous Coward | more than 8 years ago | (#13032269)

Either that, or now that we can directly compare, we might see that OS X performs worse than Windows on an otherwise equivalent machine :-)

It could swing both ways.

Re:It's also about marketing (1, Insightful)

TobyWong (168498) | more than 8 years ago | (#13032323)

"If the Apple hardware is $100-$200 more than a Dell..."

You're joking right? $200 difference? In your (and my) dreams maybe.

Funny - IBM is to Apple as Intel is to Dell... (4, Interesting)

Chordonblue (585047) | more than 8 years ago | (#13032363)

You know Apple's not the only PC manufacturer that's been pushy. Dell has been dropping hints about using AMD for some time now, and you can believe that every time they do, Intel gets to shell out for another advertising campaign or something. I mean, how much 'testing' does Dell have to do to magically realize (like everyone else has) that AMD has the upper hand in most performance areas? I say that Dell merely does this to get more concessions from Intel.

But look at it this way. Intel knows that Dell secretly fears Apple in its space. What this is REALLY all about is Intel getting more leverage. I can just hear it...

INTEL: "Oh? What's this Dell? You want to use AMD? Ok, then I guess you won't need this advertising spiff more than Apple will..."

Intel is the real winner in this scenario, not Apple, although I have no doubt that Apple will thrive regardless.

Options? (4, Insightful)

Steinfiend (700505) | more than 8 years ago | (#13032220)

Doesn't the choice to change processor basically give Apple and their users more options? If Apple releases hardware that can run not only their own much-loved OS X operating system, but also Windows, Linux, and *BSD, it removes one of the major arguments against getting an Apple, namely: "I can't run XXX piece of software, it doesn't support Apple." As long as a dual or even triple boot is possible, I can't see any reason not to get an Apple.

Ultimately, look at it this way: if Mohammed won't come to the mountain, get a big crane and get ready to do some heavy lifting.

Re:Options? (1)

ubera (107426) | more than 8 years ago | (#13032248)

Except that a change of processor to Intel does not mean x86 architecture all over. There is still likely to be a lot of custom stuff in the Apple hardware, making it just as far from Wintel as it has been up until now.

And an *Intel Inside* badge does not a compatible system make.

Snappy (4, Funny)

Anonymous Coward | more than 8 years ago | (#13032224)

When Apple compiles OS X on the 970, they use -Os. That's right: they optimize for size, not for performance. So even though Apple talked a lot of smack about having a first-class 64-bit RISC workstation chip under the hood of their towers, in the end they were more concerned about OS X's bulging memory requirements than they were about The Snappy(TM).

misspelled Teh ...

Wow... brilliant insight.... (-1, Redundant)

islandrain (888578) | more than 8 years ago | (#13032228)

Who woulda ever thought that by switching over to Intel chips Apple could reduce the cost of its overpriced systems?

Re:Wow... brilliant insight.... (1)

/ASCII (86998) | more than 8 years ago | (#13032322)

You know, if you haven't RTFA, you should avoid trying to summarize it. That is in no way, shape or form what the article says.

"hope" has nothing to do with it (3, Insightful)

frankie (91710) | more than 8 years ago | (#13032235)

Lord Steve may seem insane, but if so, one of his disorders is obsessive-compulsive. He would not pull such a major change as switching to Intel unless he had a thick contract in hand with every i dotted and t crossed.

If this theory is in fact the plan (for large values of "if"), then it's not just hope; it would be written in stone.

Re:"hope" has nothing to do with it (5, Funny)

no_barcode (840948) | more than 8 years ago | (#13032391)

Every iDotted? Of course! He got an iContract with iNtel. iThink you've nailed iT. iNeed coffee.

Re:"hope" has nothing to do with it (2, Insightful)

saider (177166) | more than 8 years ago | (#13032421)

The article seems to indicate that Steve's contracts are short-sighted because Apple does not want to risk taking delivery of components for products which might not sell. IBM would not produce excess chips for Apple because the chips Apple ordered were not in demand from IBM's other customers.

If Apple ever ordered chips from Intel, Intel could overproduce and then if Apple never followed up, Intel could sell the excess to another vendor.

Basically, Intel is designing products for personal computer makers. IBM is designing products for embedded or big-iron makers.

If this had been an apple fan-site (2, Funny)

aussie_a (778472) | more than 8 years ago | (#13032236)

There is an article over at Ars Technica with some insider information about the reasons behind Apples x86 switch

Ars Technica is damn lucky it's not an Apple fansite. Otherwise it would have been sued by Apple.

Article Text (1, Informative)

Anonymous Coward | more than 8 years ago | (#13032238)

Inside the big switch: the iPod and the future of Apple Computer
By Jon "Hannibal" Stokes

Sunday, July 10, 2005

If you've been following the Apple-to-Intel transition, you're going to want to read this whole article. Why? Because I'm going to do something that I almost never do: spill insider information from unnamed sources that I can confirm are in a position to know the score. Note that this isn't the start of some kind of new trend for me. It's just that all this information that I've been sitting on is about to become dated, so it's time to get it out there.

As I said in my previous post on the 970MP and FX unveiling, the new PowerPC processor announcements from IBM raise a number of questions about timing, like, when will these parts be available? how long has IBM been sitting on them? why the apparently sudden leap in performance per watt on the same process after a year with so little improvement?

The announcements also raise serious questions about why, if these great parts were just around the bend, did Apple really jump ship for Intel? Was it performance, or performance per watt, as Jobs claimed in his keynote speech, or were there other, unmentioned factors at work?

I have some answers to those questions, and I'll pass them along below. However, those answers come complete with their own vested interests, so feel free to interpret them as you will.

First, let's talk about the broken 3GHz promise. It's apparent in hindsight that 3GHz on the 970 was never going to happen on a 90nm process without lengthening the 970's pipeline, which is a fairly significant change. Who knows why IBM promised Jobs 3GHz? All we know is that IBM tried to hit that target without the needed pipeline change, and missed it.

The laptop G5, which is the long-rumored and now-announced 970FX, has supposedly been ready to go into an Apple laptop since at least early last month. And for what it's worth, yes, Apple was offered the Cell and other game console-derived chips. In fact, IBM routinely discloses its entire PowerPC road map to Apple, so pretty much anything PPC that IBM puts out is not only not a surprise to Apple, but it's potentially available for Apple's use.

So why didn't Apple take any of these offers? Was it performance, as Jobs claimed in his keynote? Here's something that may blow your mind. When Apple compiles OS X on the 970, they use -Os. That's right: they optimize for size, not for performance. So even though Apple talked a lot of smack about having a first-class 64-bit RISC workstation chip under the hood of their towers, in the end they were more concerned about OS X's bulging memory requirements than they were about The Snappy(TM).
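If you're curious, this claim is easy to poke at yourself. Below is a rough sketch in Python (assuming gcc is on your PATH; the little C workload is invented purely for illustration) that compiles the same source with -Os and with -O2 and compares the resulting binary sizes:

    import os
    import subprocess
    import tempfile

    # A stand-in workload; any nontrivial C file will do.
    C_SRC = r"""
    #include <stdio.h>
    int main(void) {
        long sum = 0;
        for (long i = 0; i < 1000000L; i++)
            sum += i * i;
        printf("%ld\n", sum);
        return 0;
    }
    """

    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "bench.c")
        with open(src, "w") as f:
            f.write(C_SRC)
        for flag in ("-Os", "-O2"):
            exe = os.path.join(tmp, "bench" + flag)
            subprocess.run(["gcc", flag, src, "-o", exe], check=True)
            print(flag, os.path.getsize(exe), "bytes")

The -Os binary typically comes out smaller; whether it also runs slower depends heavily on cache behaviour, which is exactly the argument that comes up further down the thread.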

One of the major factors in the switch was something that's often been discussed here at Ars and elsewhere: Apple's mercurial and high-handed relationship with its chip suppliers. I've been told that the following user post on Groklaw is a fairly accurate reflection of the bind that Apple put itself in with IBM:

I've worked with Apple
Authored by: overshoot on Sunday, June 12 2005 @ 08:56 PM EDT

and I can tell you, there's a very good chance that they outsmarted themselves into a "no bid" response from IBM.

Part of Apple's longstanding complaint against IBM was that Apple would announce a new computer with a new IBM processor, sales would skyrocket, and IBM wouldn't have adequate supply. We've all heard the story. Here's my take:

Apple negotiates for a new processor chip. Being Apple, they want "most favored customer" treatment, with fab-fill margins for the vendor. What's more, they want this for what amounts to a custom processor chip, so any oversupply will just sit on the shelf until Apple decides they want them, and sometimes Apple will let them sit a while to see if they can get a price break -- it always pays to remind the world that one is, after all, the Steve Jobs.

With terms like that, custom chip vendors only start as many lots as the customer contracts to accept right off the line. Apple, not exactly rolling in cash, isn't going to highball that estimate. In fact, they play it conservative and only order a small startup batch. The rest follows, of course: the product sells, Apple orders more to cover the demand, and IBM tells them that processors have a 6-month lead time.

Apple complains publicly about IBM (does this sound like anyone we know?). IBM, being grown-ups, doesn't say anything that might be perceived as negative about a customer.

Lather, rinse, repeat.

Well, time goes by and IBM has other customers who actually pay up front for custom designs and who don't insist on having IBM tailor their product roadmap around a few million units a year. Apple again demands that IBM dedicate their CPU design teams to making an Apple special that will never generate much revenue. If IBM won't play, Apple will go to Intel.

IBM does a Rhett Butler, and the rest is history. Note that you aren't hearing one way or the other from IBM on this story.

Class bunch, IBM.

Apple has been pulling these stunts for a long time, as anyone who followed the company's relationship with Motorola knows. Compare the quote above to the following selection from a five-year-old Paul DeMone article describing Apple's dysfunctional relationship with Motorola and the reasons for Motorola's long clockspeed stagnation:

In many ways Apple is the author of its own misfortune. Years of work and billions of dollars of investments are required to design, manufacture and maintain the competitiveness of a family of microprocessors for the desktop computer market. Time and time again Apple has changed business strategies abruptly, only to reverse itself again a short while later in ineffective attempts to stem its gradual but consistent losses in market share. The PowerPC semiconductor partners, Motorola in particular, have written off hundreds of millions of dollars in losses caused directly by the erratic actions of Apple Computer, such as encouraging and later crushing a nascent market for Macintosh clones. The mercurial nature of its primary customer, combined with its minuscule and generally diminishing share of the desktop computer market, have meant that at least the last two generations of PowerPC processors have been designed primarily with embedded control, and more recently, digital signal processing applications in mind. This has left Apple in the position of only being able to differentiate itself on the basis of curved system form factors and translucent plastic.

So this behavior has been going on for years and has spanned multiple CPU suppliers. The only thing that's different now is that the Mac is no longer the foundation for Apple's future growth.

For the real reason behind the switch, you have to look to the fact that it's the iPod and iTMS--not the Mac--that are now driving Apple's revenues and stock price. As I stated in my previous article on the switch, Apple is more concerned with scoring Intel's famous volume discounts on the Pentium (with its attendant feature-rich chipsets) and XScale lines than it is about the performance, or even the performance per Watt, of the Mac.

It's critical to understanding the switch that you not underestimate the importance of Intel's XScale to Apple's decision to leave IBM. The current iPods use an ARM chip from Texas Instruments, but we can expect to see Intel inside future versions of the iPod line. So because Apple is going to become an all-Intel shop like Dell, with Intel providing the processors that power both the Mac and the iPod, Apple will get the same kinds of steep volume discounts across its entire product line that keep Dell from even glancing AMD's way.

If you think XScale is too powerful for the iPod--it's used in powerful color PDAs--then you're not taking the device seriously enough as a portable media platform. The XScale is plenty powerful enough to do video playback, and I have reason to believe that Apple is currently working on a video iPod to counter the Sony PSP. (My guess is that we might even see it in time for Christmas.) When the video iPod hits the streets, Apple will have an iPod product that plays each of the media formats (music, pictures, video) represented in its iLife suite.

The cold, hard reality here is that the Mac is Apple's past and the iPod is Apple's future. It's a shame that Steve Jobs can't be upfront with his user base about that fact, because, frankly, I think the Mac community would understand. The iPod and what it represents--an elegant, intuitively useful, and widely appealing expression of everything that Moore's Curves promise but so rarely deliver--is the Macintosh of the new millennium. There was no need to put on a dog and pony show about how IBM has dropped the performance ball, when what Jobs is really doing is shifting the focus of Apple from a PC-era "performance" paradigm to a post-PC-era "features and functionality" paradigm.

UPDATE: If you read all the way to the bottom of this article, and you think that my basic thesis is that "the Mac is doomed and Apple is planning to quit selling personal computers," or some other such fatuous nonsense, then you need serious remedial help with reading comprehension. I already made my point about Apple's shift in focus from the desktop PC (as exemplified by the Mac) to the post-PC gadget (as exemplified by the iPod) in a previous article, where it didn't occasion nearly this much contention and idiotic, defensive ranting. I have no idea why stating the exact same case a bit more strongly should induce such spasms and seizures of rage in the Mac Faithful, but there you go. (Ok, actually, I do have an idea. It's because I published the blasphemous, vulgar, and generally irreligious suggestion that The Holy Steve (may peace be upon him) was high-handed and arrogant in his dealings with chip suppliers, and that he was less than forthright about the reasons behind the switch. For such impiety I'm sure I'll be duly punished... assuming that The Steve even really exists, of course, and isn't just a figment of our collective human imaginations...)

Re:Article Text (0)

Anonymous Coward | more than 8 years ago | (#13032301)

Would you knock that shit off, please??? It's Ars Technica, for Chrissakes... I don't think it's going down anytime soon!

Re:Article Text (2, Funny)

Linus Torvaalds (876626) | more than 8 years ago | (#13032470)

they optimize for size, not for performance

What kind of moronic statement is this? Optimising for size is optimising for performance. What, does he think that the Apple guys are optimising for size to cut down on shipping costs?

what about AMD? (5, Interesting)

utopicillusion (843168) | more than 8 years ago | (#13032239)

If such a move was made, does this make AMD's anti-trust case against Intel more convincing?

Maybe now (because of the lawsuit), Intel will not provide such deals to Apple. Is Apple then in deep shit?

Yes!

Re:what about AMD? (1)

Knome_fan (898727) | more than 8 years ago | (#13032314)

IANAL, but I guess that giving discounts for large volumes isn't in and of itself illegal, while giving discounts only to those who don't use a competing product is.

Re:what about AMD? (1)

ceeam (39911) | more than 8 years ago | (#13032371)

Why? I cannot imagine a situation in which Intel is banned from providing "such deals" to Apple but still allowed to provide them to other makers. Can you? And if not, why would Apple be in "deep shit"?

Re:what about AMD? (1)

servo335 (853111) | more than 8 years ago | (#13032392)

Better yet: will this remove the fog from Apple's eyes and guide them to make OS X for AMD as well?

A perfect Fit (0, Troll)

Atlantic Wall (847508) | more than 8 years ago | (#13032241)

Are you kidding? These G5 CPUs would melt the plastic cases they came in within seconds. Yeah, they may be good chips, but to compete with the Windows market this had to happen. Not so perfect a fit after all.

Article is crap, I know the real reason! (3, Funny)

Anonymous Coward | more than 8 years ago | (#13032243)

Apple is planning to do for hot air what they've done for music with iHeat, the world's first ergonomically-designed space heater. iHeat will be the first space heater to use Apple's exclusive scroll wheel technology for setting the temperature, and only Intel has what it takes to get the job done.

LINUX did SGI (0)

Anonymous Coward | more than 8 years ago | (#13032255)

If they get their act together they will do Apple too.

Wait a second... (1)

joey_knisch (804995) | more than 8 years ago | (#13032264)

Aren't these the same discounts that Intel is being sued for under anti-monopoly laws?

Re:Wait a second... (3, Informative)

/ASCII (86998) | more than 8 years ago | (#13032369)

No. Volume discounts are not illegal. Only offering volume discounts to customers who stay away from your competitors' products might be, but that is not the same thing.

If bulk discounts were illegal, Wal-Mart would be out of business and everyone would have to shop at 7-11.

This is my experience with Apple MACs (-1, Troll)

Ta Pere * (882182) | more than 8 years ago | (#13032270)

I had been using Apple MACs for quite a long time and a few months ago I suggested to my boss that we replace a few of our client machines with them just as an evaluation. After some persuasion he finally agreed to the idea - mainly due to the fact that our IT manager had been using an Apple MAC for quite some time.

It all was really good to start with. OSX was better than our expectations and my boss was happy with the switchover - I had even been nominated for a promotion at the end of the year. Unfortunately we encountered one problem. One employee lost a whole project due to not being able to right click on the program he was using at the time.

My boss wasn't happy with me at all. I was called into his office and given a stern reprimand. A few days later I was looking for another job from home. A word of advice: don't use Apple MACs as all they seem to do is cause problems.

Re:This is my experience with Apple MACs (1)

SlamMan (221834) | more than 8 years ago | (#13032310)

This is one of the dumbest posts I've seen in some time, and that's saying something. This doesn't even qualify for "Troll" status.

Re:This is my experience with Apple MACs (0)

Anonymous Coward | more than 8 years ago | (#13032340)

NO YUO

Re:This is my experience with Apple MACs (1)

fracai (796392) | more than 8 years ago | (#13032327)

Yep, that's why I always put a router between my Apple MAC and the Internet.

Re:This is my experience with Apple MACs (1)

justforaday (560408) | more than 8 years ago | (#13032477)

I tried using the MAC cloning feature of my router, but websites still kept on identifying my machine as running Windows...

Re:This is my experience with Apple MACs (2)

Fluk3 (742259) | more than 8 years ago | (#13032355)

You are a liar and/or a troll.

Obviously you want people to flame you for spelling Mac in all caps when everyone knows MAC is something different.

And, you want people to flame you because you can right-click with a two-button mouse on a Mac, or control-click with a single button, and of course you can't lose data from 'not being able to' use a contextual menu.

Please...

Re:This is my experience with Apple MACs (1)

jleq (766550) | more than 8 years ago | (#13032446)

-1 Troll

These posts are seen on Slashdot all the time, and they're still incredibly annoying.

It's New Coke/Old Coke (0)

Anonymous Coward | more than 8 years ago | (#13032271)

This isn't a switch; Apple will continue to push both proc lines as it sees fit. Xcode should make that possible.

The real reasons are obvious (3, Insightful)

ryanvm (247662) | more than 8 years ago | (#13032278)

Is it just me, or aren't the real reasons for Apple's switch obvious?
  • Cheaper processors due to economies of scale. Also cheaper because they will constantly be fought over by both Intel and AMD.
  • Running Windows apps in Mac OS X becomes much more feasible, since they can now do virtualization instead of emulation. Dual booting between Mac OS X and Windows will now be a possibility as well.

Re:The real reasons are obvious (1)

p0ppe (246551) | more than 8 years ago | (#13032442)

People have been saying otherwise ever since the switch was made public. A G4/G5 is apparently a whole lot cheaper than the corresponding Intel Pentium M.

I'm not too sure about Intel and AMD fighting over Apple, either. AMD currently lacks the fabs to guarantee a steady supply of processors for Apple.

Re:The real reasons are obvious (2, Insightful)

Fahrvergnuugen (700293) | more than 8 years ago | (#13032458)

If you honestly think that one of the two reasons Apple switched to x86 was so that "dual booting between Mac OS X and Windows" would be possible, you're smoking crack.

If that were true then it would mean that Jobs secretly thinks that windows is somehow better than OSX and that it needs to be supported by their hardware in order for Apple to survive.

I don't understand the advantage... (1)

pointbeing (701902) | more than 8 years ago | (#13032280)

Not completely OT, but I don't understand Apple's attraction to RISC processors anyway.

I guess I can understand the advantage in maybe a mid-tier box or mainframe or larger, but it seems to me that any processor instructions not supported by the RISC chip would have to be emulated in software - and on a multimedia desktop PC I don't understand the advantage of RISC over CISC.

Can someone enlighten me?

Re:I don't understand the advantage... (5, Insightful)

timster (32400) | more than 8 years ago | (#13032393)

You misunderstand RISC. As do most people these days, it seems.

Back in the olden days, when chips were still designed by small teams on reasonable budgets, somebody noticed that hand-written assembly was rapidly becoming passé. When the assembly is being written by a compiler, it makes sense to design the chip with that in mind, and make an instruction set that is efficient at the kind of simple instructions that compilers like to write.

This led to a simpler design that could be made somewhat faster than a complex one. This led to many predicting the demise of so-called CISC chips. This prediction, like the "Internet in danger of collapse" and "Apple to go bankrupt" predictions, is no closer to actually happening than it was when it was first made.

The surprise was that Intel wanted a chip that had the speed advantages of RISC but used the same interface as their older chips, so they built one: the Pentium, which translated CISC instructions into RISC-like ones internally. Since this translation is essentially O(n), they got good performance, and they've continued that basic design to the present day.

So to answer your question, it's already true that any operations that are not simple are emulated in software -- it's just that in x86 processors the emulation is on the CPU. Today there is no important difference between CISC and RISC, whether we are speaking of mainframes or desktops.
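To make the parent's point concrete, here is a toy sketch of that decode step (the instruction names are invented, not actual x86 mnemonics): a CISC-style instruction that touches memory gets expanded into RISC-like micro-ops that separate the load, the ALU work, and the store.

    # Toy decoder: one memory-touching CISC-style instruction becomes a short
    # sequence of load/ALU/store micro-ops, roughly what a Pentium-class front
    # end does before handing work to the RISC-like core.
    def decode(instr):
        op, dst, src = instr
        if op == "add_mem":                 # CISC flavour: mem[dst] += reg[src]
            return [
                ("load",  "tmp0", dst),     # tmp0 <- mem[dst]
                ("add",   "tmp0", src),     # tmp0 <- tmp0 + reg[src]
                ("store", dst,    "tmp0"),  # mem[dst] <- tmp0
            ]
        return [instr]                      # register-only ops pass through as-is

    for uop in decode(("add_mem", "0x1000", "eax")):
        print(uop)

Each instruction produces a fixed, short micro-op sequence, so the translation is linear in the number of instructions, which is the O(n) claim above.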

Re:I don't understand the advantage... (1)

pointbeing (701902) | more than 8 years ago | (#13032439)

Thank you for the explanation ;-)

But isn't software emulation considerably slower than hardware emulation? I promise not to ask any more questions today, honest ;-)

thanks again -

Re:I don't understand the advantage... (0)

Anonymous Coward | more than 8 years ago | (#13032416)

just2good [just2good.co.uk]
Optimisation Principles

CISC vs RISC Architecture
Where does CISC Stop and RISC Begin?
Defining Performance
The Instruction Cycle : Clock Cycle ratio
General Approaches

CISC vs RISC Architecture

The performance of a CPU is highly dependent on its internal architecture. The goal is to do as much 'work' in a given amount of time as possible. This can be achieved using several different approaches. One approach is to increase the 'power' of each individual instruction in the instruction set. This results in CISC architecture. Another approach is to attempt to reduce the clock cycle time, thereby increasing the clock speed of the CPU. To do this, the complexity of individual instructions must be reduced, leading to RISC architecture.

The x86 instruction set is a predominantly CISC instruction set. CISC stands for Complex Instruction Set Computing. Traditional theory states that CPUs can be made quicker by adding more and more complexity into the instructions of the instruction set. The aim was to perform as much work in a single instruction as possible. Hence instruction sets grew larger and more complicated. Early Intel CPUs such as the 8086 and 286 are considered pure CISC processors.

RISC theory arrived a little later. The Reduced Instruction Set Computing approach states that the best performance can be achieved by reducing the time taken to execute any given instruction. Rather than have complex instructions that require many clock cycles (more on this later) to complete, RISC chips use very simple instructions that could be performed in fewer clock cycles. Performance can then be improved by making the cycles shorter - i.e. by implementing a faster clock.

Thus, with RISC architecture, we increase CPU speed by attempting to keep each instruction simple enough so that it can be performed in a single clock cycle.

How do we make instructions more simple? Well, one approach is to limit the number of memory addressing modes available to instructions. CISC instructions can usually address memory in many different ways. This builds complexity into the instruction and also means that a given instruction op-code can be of variable size. RISC instructions, on the other hand, are usually limited to a single memory addressing mode. In fact, in a full RISC implementation, most instructions can not access memory at all! Instead, a special set of instructions (called load and store instructions) are designed to read and write from memory, transferring data to and from registers as required. The rest of the instruction set can only operate on register data or immediate data (i.e. data specified as part of the instruction itself).

The result of this so called Load-Store RISC architecture is that instructions are less complicated and, as a bonus, tend to be of fixed size. This makes performance-optimising strategies, such as pipelining, easier to implement.

In summary, the result of the RISC approach is that clock speeds are increased. As an added benefit, RISC-based chips require fewer transistors than CISC-based chips of similar performance, and are therefore cheaper to build. Also, since the CPU core die area is smaller for RISC chips, more die area can be devoted to performance enhancing features, such as extra registers and larger caches. (To see comparisons of the number of transistors used in CPUs, check out the CPU History section.)

Where does CISC Stop and RISC Begin?

It should be noted that CISC and RISC are not clearly defined classes, but should rather be regarded as a set of CPU design principles. Thus, modern CPUs often exhibit traits of both architectures and can not be categorised as either purely CISC or purely RISC.

Intel have been increasing their use of RISC technology since the Pentium chip. However, these chips must also be able to perform traditional x86 instructions. In this case, the x86 instructions are handled a little differently to the older CISC-chips which execute x86 instructions natively.

During the decode step, one or more decode units in the RISC chip take the x86 instructions and convert them into simple fixed-length RISC-like micro-instructions, often called micro-ops. These micro-ops can then be handled just like any other RISC instruction; the CPU architecture can therefore utilise standard RISC optimising strategies (these will be described below) to improve performance and increase clockspeed.

Defining Performance

Remember that we can not define the performance of a processor by clock speed alone. CPU manufacturers certainly exploit the ignorance of the consumer by conditioning us to believe that clockspeed is the be all and end all of everything.

Instead, we should think of performance as the time it takes to perform a task. Let's define the time taken to perform some arbitrary task (such as a complex program) as T. Now, if the number of instructions required to perform this task is N, the average number of clock cycles per instruction is C and the clock speed of the CPU is S, then we can arrive at the following (though simplistic) formula:

T = N * C / S

Clearly then, the goal is to make T as small as possible. We can do this by minimising N and C, while maximising S. (For those of you who are not mathematically minded, please do not be put off by this formula. This really is simple stuff!)

RISC-based processors tend to have a smaller value of C, since instructions are simpler than in CISC designs and therefore require fewer cycles to complete. However, at the expense of this, RISC chips tend to have a higher value of N (compared with CISC), since more instructions are required to perform the same task.
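Plugging made-up but plausible numbers into T = N * C / S makes the trade-off concrete: the RISC design below needs 50% more instructions, yet still finishes first because each one takes a single cycle and the clock is faster.

    # Illustrative numbers only; real workloads and real chips vary widely.
    def run_time(n, c, s):
        return n * c / s  # T = N * C / S

    t_cisc = run_time(1_000_000, 4, 1.0e9)  # N = 1.0M instructions, C = 4, S = 1.0 GHz
    t_risc = run_time(1_500_000, 1, 1.2e9)  # N = 1.5M instructions, C = 1, S = 1.2 GHz
    print(f"CISC: {t_cisc * 1e3:.2f} ms")   # CISC: 4.00 ms
    print(f"RISC: {t_risc * 1e3:.2f} ms")   # RISC: 1.25 ms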

However, certain principles can be applied which can effectively reduce C to 1, such as pipelining which will be discussed very soon.

Although this discussion is very simplistic, it does strongly hint at how a CPU can be optimised. It would seem that if we can employ strategies to minimise C (such as pipelining), then CISC architecture will yield the highest performance. However, it turns out that in practice, many performance-optimising principles (such as pipelining, once again) are more easily implemented using RISC architecture.

The Instruction:Clock Cycle ratio

In the early days, each step of the instruction process (i.e. fetch, decode, address generate, execute and write back) would need one or more clock cycles. Therefore a single instruction could require over 5 cycles to complete. Hence the instructions:clock cycle ratio for such a CPU was only 0.2 or less! It doesn't take a genius to work out that if you can improve this ratio, you will get a faster CPU.

General Approaches

As mentioned earlier on, there are a few general approaches to CPU optimisation. These include: increasing the power of instructions (the CISC approach), decreasing the cycle time (predominantly RISC), and, perhaps most importantly, decreasing the number of cycles per instruction. In practice, a good balance of these approaches proves most cost effective.

To begin with, we must move away from our single internal CPU bus view (see Instruction Process) that I've been referring to so far. Clearly, with a single bus, only one data transfer is possible at any given time. Furthermore, this bus design requires the use of temporary storage registers. By implementing a multi-CPU bus design, we can now transfer more than one piece of data within the CPU at any given time. This design also circumvents the need for temporary registers.

Secondly, as already mentioned, accessing main memory is slow compared to all other phases of the instruction process. Thus we need to minimise the time taken for this step, or at least limit its presence as a bottleneck.

One way to do this is to overlap the fetch phase with instruction execution. The CPU can be made to fetch instructions from memory while it is executing instructions that have already been fetched. This is a primitive form of pipelining, which will be explained in much more detail later.

This approach is called an instruction prefetch and is typically achieved by way of an instruction unit. The job of this unit is to determine (or predict) which instruction will be executed next, fetch that instruction and queue it up (into the instruction queue), ready for when the execution unit (whether integer unit or FPU) is ready for it.

This process can be optimised by storing instructions and data in a cache. This will be explained in detail in the next section. But for now, it is sufficient to think of the cache as a small, local store of instructions and data which can be accessed very rapidly. By storing data here, the CPU can avoid main memory interactions.

Before we go on to specific optimisation strategies, I will finally mention the use of multiple integer units and FPUs. By having more than one integer unit in combination with one or more floating point units, we can effectively run more than one instruction at once. And when I say 'more than one instruction at once', I'm not simply referring to the overlapping of instruction phases that we see in pipelining, but in fact complete separation of execution processes. This approach is called superscalar architecture....
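The payoff of this kind of overlapping is easy to quantify with idealised cycle counts. A sketch (ignoring the stalls, hazards, and memory latency that real pipelines must cope with):

    # With k stages and n instructions, an unpipelined machine needs n * k
    # cycles; a full pipeline needs k + (n - 1), because once the pipeline is
    # full, one instruction completes every cycle.
    def unpipelined(n, k):
        return n * k

    def pipelined(n, k):
        return k + (n - 1)

    n, k = 1000, 5  # 1000 instructions, 5 stages (fetch/decode/address/execute/write back)
    print("unpipelined:", unpipelined(n, k))  # 5000 cycles -> 0.2 instructions per cycle
    print("pipelined:  ", pipelined(n, k))    # 1004 cycles -> ~1 instruction per cycle

This is exactly the instructions:clock cycle ratio from the earlier section climbing from 0.2 towards 1.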

Re:I don't understand the advantage... (1)

SilentSheep (705509) | more than 8 years ago | (#13032472)

RISC processors are a lot more efficient, in both speed and power, than CISC processors. A RISC processor can deliver performance similar to a CISC processor's at a much lower clock speed, and therefore with less power consumption. This makes them particularly good for mobile systems (e.g. the iPod).

Re:I don't understand the advantage... (3, Informative)

TheRaven64 (641858) | more than 8 years ago | (#13032490)

but it seems to me that any processor instructions not supported by the RISC chip would have to be emulated in software

I'm not sure what you mean by this. You should read up on the Church-Turing thesis. Basically, it can be proven that a very simple instruction set (I think the minimum is 3 instructions[1]) can run any algorithm. The question then becomes, what instructions should be implemented as a single instruction on the chip, and which ones should be implemented as a combination of instructions. Generally, it turns out, it is a good idea if all of your instructions take the same length of time to execute - this makes interleaving different instructions much easier. It therefore makes sense to have a relatively simple instruction set.

The trend towards CISC ended with things like the VAX. Back when people used to program in assembly, it made sense to have complex instruction sets to make things easier for the programmer. The VAX included things like an evaluate-polynomial instruction, for example. Of course, this was quite unwieldy, and so a lot of the instructions were implemented as microcode: they were automatically translated to a set of simpler instructions.

With the development of high-level languages, it emerged that compiler writers were not using these complex instructions, they were implementing them directly in simpler instruction. It then made more sense to focus on making a small set of instructions run quickly (which, it turns out, is easier and therefore cheaper).

Note that `CISC' chips are not really CISC anymore. They do the same `emulation' that RISC chips do. When you run x86 code on a Pentium each instruction is broken down into simpler instructions and then these are executed on the RISC core. The Pentium 4 (and, I believe, the Pentium M) cache these micro-instructions, so they don't have to do the translation twice.

[1] Zero, Increment, and Conditional Jump, for example. Try it: addition is simple, multiplication is repeated addition, and you can build more complex algorithms from there.
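To make the footnote concrete, here's a toy interpreter in C for such a three-instruction machine. The particular flavour of conditional jump (jump if two registers are equal, which also gives you an unconditional jump as JEQ r,r,addr) is a choice made for this sketch, not something the comment specifies. The sample program computes A + B by incrementing A exactly B times.

/* A toy machine with only the three instructions from the footnote:
 * zero a register, increment a register, and a conditional jump
 * (read here as "jump if two registers are equal"). HALT just marks
 * the end of the program. The sample program computes A + B. */
#include <stdio.h>

enum op { ZERO, INC, JEQ, HALT };
struct instr { enum op op; int r, s, addr; };

int main(void)
{
    int reg[3] = {3, 4, 0};          /* A = 3, B = 4, C = scratch */
    struct instr prog[] = {
        {ZERO, 2},                   /* 0: C = 0                  */
        {JEQ,  2, 1, 5},             /* 1: if C == B goto 5       */
        {INC,  0},                   /* 2: A++                    */
        {INC,  2},                   /* 3: C++                    */
        {JEQ,  2, 2, 1},             /* 4: goto 1 (always true)   */
        {HALT},                      /* 5: done                   */
    };

    int pc = 0;
    while (prog[pc].op != HALT) {
        struct instr in = prog[pc];
        switch (in.op) {
        case ZERO: reg[in.r] = 0; pc++; break;
        case INC:  reg[in.r]++;   pc++; break;
        case JEQ:  pc = (reg[in.r] == reg[in.s]) ? in.addr : pc + 1; break;
        default:   break;
        }
    }
    printf("A = %d\n", reg[0]);      /* prints A = 7 */
    return 0;
}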

High handed or not (5, Interesting)

stefanb (21140) | more than 8 years ago | (#13032283)

Irrespective of whether The Steve dealt properly with IBM, the reality is and has been for many years that developing their own CPU (or having it developed for them) was just too expensive for Apple.

The original idea of the Apple-IBM-Motorola coalition was that they would be able to compete with Intel by combining forces: CPUs for servers, workstations, and embedded systems; and by creating a third-party systems market to drive demand for these CPUs (PReP). This never really took off, so IBM and Motorola were stuck with having to compete with Intel for price/performance for a single customer that would only buy a fraction of what Intel and AMD would churn out. I have no idea how much it costs to keep up a competitive CPU architecture, but it must be in the hundreds of millions, if not billions per year.

Cell might be cheap, but it doesn't allow Apple to compete with PCs on a price/performance or performance/watt level. And paying IBM to continue to develop the 970 architecture was just too expensive: people might be willing to pay a bit more for Apple systems, but only so much.

Just look at all other contenders in the high performance CPU market: there's nobody left except for Sun and Fujitsu/Siemens, and they announced last year that they will cooperate on SPARC. From a pure market standpoint, Apple had little choice.

The real reason... IBM can't get 90nm together (3, Interesting)

Anonymous Coward | more than 8 years ago | (#13032294)

I have to post this anonymously... You'll see why below. The real reason Apple switched from IBM is that IBM just hasn't gotten their shit together at 90nm. I know this because I recently left a job at a large semiconductor manufacturer that used IBM for our digital fab. IBM repeatedly promised, "we'll fix the problems in our process", for YEARS, and just couldn't get their act together. With run after run of silicon, IBM couldn't manufacture the parts correctly (or other customers' parts). Finally, my company became fed up and bit the bullet to switch to another manufacturer. It was a sunk cost of four engineer-years (to update some of the design), and the design worked out of the chute (and at pretty good yields). You heard it here first... IBM just doesn't have their shit together at 90nm.

(Mac == past) && (iPod == future) ??? (0, Flamebait)

sczimme (603413) | more than 8 years ago | (#13032304)


The cold, hard reality here is that the Mac is Apple's past and the iPod is Apple's future.

I find it hard to believe that anyone who purports to know the inner goings-on at Apple could come up with such a ridiculous assertion. The author seems to believe that those products cannot exist in parallel...

Perhaps he's just trying to encourage debate, in which case IHBT.

Re:(Mac == past) && (iPod == future) ??? (2, Informative)

Knome_fan (898727) | more than 8 years ago | (#13032385)

The author seems to believe that those products cannot exist in parallel...

No he doesn't. In fact he explicitly states that in the article:

"If you read all the way to the bottom of this article, and you think that my basic thesis is that "the Mac is doomed and Apple is planning to quit selling personal computers," or some other such fatuous nonsense, then you need serious remedial help with reading comprehension. I already made my point about Apple's shift in focus from the desktop PC (as exemplefied by the Mac) to the post-PC gadget (as exemplefied by the iPod) in a previous article, where it didn't occasion nearly this much contention and idiotic, defensive ranting."

Yes, I saw that (1)

sczimme (603413) | more than 8 years ago | (#13032479)


No he doesn't. In fact he explicitly states that in the article:

Yes, I saw that. However, he states unequivocally that (A==B) in the middle of the article and then states just as firmly that (A!=B) at the end; this is a poor example of communication. The author should not expect those two statements to carry equal weight - which does he actually believe? The disclaimer-like language in the last paragraph struck me as an effort to weasel out of the strong point he tried to make earlier.

As an aside, I don't read this guy's column and was not inclined to look for the "previous article" he mentioned.

Re:(Mac == past) && (iPod == future) ??? (1)

NickHydroxide (870424) | more than 8 years ago | (#13032388)

I don't really think that he was saying Macs == teh doomed, and to all hail our new iPod-bearing overlords.

More to the point, and I think most sane people would agree: the iPod has been a revolution for Apple. It is exactly the bandwagon they need to bring their other products (i.e. Macintosh computers) to a wide consumer audience.

From TFA:

"If you read all the way to the bottom of this article, and you think that my basic thesis is that "the Mac is doomed and Apple is planning to quit selling personal computers," or some other such fatuous nonsense, then you need serious remedial help with reading comprehension."

Re:(Mac == past) && (iPod == future) ??? (1)

/ASCII (86998) | more than 8 years ago | (#13032403)

If you read the _whole_ article, it ends with a clarification that Mac desktop computers will be a part of Apple's roadmap for a long, long time, but that their main revenue stream is _already_ coming from iPods, iTMS, etc.

Re:(Mac == past) && (iPod == future) ??? (0)

Anonymous Coward | more than 8 years ago | (#13032409)

I think the problem is that you don't seem to see the parallel in other markets.

I see this Apple situation as the "Nintendo complex".

Apple has a portable solution that is a huge hit. Its main computer offerings aren't all that, but its eyes got real big and bright once it realized that there's serious market share to be had in the handheld gadget market.

Can the iPod coexist with the Mac? Yes. Can the Mac ever be as popular as the iPod? Never.

Re:(Mac == past) && (iPod == future) ??? (1)

BoomerSooner (308737) | more than 8 years ago | (#13032423)

Did you actually read the entire article? He was saying the PC platform is a commodity, not that they were going to quit selling Macs. If you look at where Apple's revenue comes from, you'll see that branching into devices is significantly more profitable than the PC market, and as such that is the future. Making crazy margins off Macs is the past (hopefully).

Re:(Mac == past) && (iPod == future) ??? (2, Insightful)

Iriel (810009) | more than 8 years ago | (#13032450)

I believe that the author is thinking in the material world too much. It's not just the iPod and portability that have been helping Apple lately, it's the customization: having your own custom playlists to carry with you wherever you go, your own set of widgets on the desktop, and your own group of RSS feeds, all of it housed in a smooth, sleek package. It's not just the hardware that propels Apple lately. Has anyone ever told this author about something called Tiger?

AMD (2, Insightful)

rerunn (181278) | more than 8 years ago | (#13032313)

"Intel CPUs, and that by doing so, they will gain the same kinds of discounts that Dell get. If price of cpu's were really such a big factor, AMD might have been alot more willing to offer discounts than Intel.

Compile flags (5, Informative)

pp (4753) | more than 8 years ago | (#13032315)

They claim -Os is there to remove bloat, not to increase performance :-) Thing is, for kernel-type code the resulting code is actually _faster_ than with gcc -O2, since there is a lot less cache pressure.

The Fedora kernel people have benchmarked this quite a bit (and now compile kernels with -Os too); the difference is quite measurable, around 5% in some benchmarks.
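For anyone who wants to reproduce the code-size half of this claim, here is a minimal sketch (the runtime side needs a real benchmark). gcc's -O2/-Os flags and the size utility are real; the file name and the functions are invented for the example, and the exact numbers will vary by compiler version and target.

/* flags_demo.c -- loop-heavy code of the sort where -O2's unrolling
 * and alignment padding grow the text segment. Compile both ways
 * and compare:
 *
 *   gcc -O2 -c flags_demo.c -o o2.o && size o2.o
 *   gcc -Os -c flags_demo.c -o os.o && size os.o
 */
#include <stddef.h>

void saxpy(float *y, const float *x, float a, size_t n)
{
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

void scale_and_clamp(int *v, size_t n, int k, int lo, int hi)
{
    for (size_t i = 0; i < n; i++) {
        int t = v[i] * k;
        v[i] = t < lo ? lo : (t > hi ? hi : t);
    }
}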

Re:Compile flags (4, Interesting)

Wdomburg (141264) | more than 8 years ago | (#13032379)

Especially on the G5, with a relatively small L2 cache (especially for a 64-bit CPU) and exceedingly high memory latency.

the "real reason" any business does anything? (1, Flamebait)

Khyron (8855) | more than 8 years ago | (#13032341)

Woah, you mean Apple is a for-profit business and its leaders have made a crucial decision to ensure future profitability?

Why, as an Apple user or an Apple stockholder, would I not expect this? What about the WWDC keynote didn't already make this clear?

The reasons given in the media, by Apple, have included supply line issues with IBM based on performance, efficiency, heat, cost, and availability. What's so shocking here?

The "real" reason? Please. The posted needs to ask FOXNEWS for a job.

iPod is Apples Future? (4, Insightful)

brockbr (640130) | more than 8 years ago | (#13032347)

First off, I RTFA... It implies that the iPod & iTMS, not the Mac, could drive Apple's future. What is Apple without the Mac? What is Apple without OS X? If the simple answer is a "portable media player company with ties to the RIAA & MPAA", then so be it. But that answer is shortsighted, as can be seen in Microsoft's foray into this arena (witness Windows Media and Media Center PCs), along with Linux's abilities (Myth) in this same subset of the market. It's the media, stupid! The media is *not* the player. If Apple, as the article supposes, is out to drive the hand-held player market with its technology, then it may very well succeed, in hand-helds that is. If it ignores the Mac as the center of *their* digital world, they may end up with a cute player and nothing more.

What Apple said all along (0)

Anonymous Coward | more than 8 years ago | (#13032357)

What's so insightful about this? The public statements Apple has made about this all explicitly refer to contract negotiations. "Insider information"? ... give me a break.

-Os (4, Informative)

wowbagger (69688) | more than 8 years ago | (#13032366)

Compiling with "-Os" (optimize for smaller code size) is not always at odds with speed, as is implied in the article.

While -O4 may generate faster code for some trivial benchmark, for real-world applications keeping your code in cache is worth more than loop unrolling, so in practice -Os often beats -O[2345].

Proprietary PC (5, Insightful)

fbonnet (756003) | more than 8 years ago | (#13032382)

With its switch to Intel, Apple is going to succeed where MS couldn't: build a "proprietary" PC that doesn't rely on anything legacy such as the BIOS.

Nearly everything except the BIOS will be standard on the Mactel platform. Seems to me like the perfect occasion to introduce a "trusted", DRM-enabled platform from the ground up.

Now Apple can tell the RIAA & MPAA: on our platforms, your stuff will be secure.

He's right about the Mac being "the past" (5, Insightful)

alexhmit01 (104757) | more than 8 years ago | (#13032384)

The desktop wars are over. Commodity IBM PC-compatibles with Microsoft OSes and Intel chips won. Sure, the market is HUGE, and niche markets remain extremely profitable if done right (even #1 player Dell doesn't dominate the market; it owns the niche for moderately supported business machines with semi-custom orders), but Intel and Microsoft have extracted most of the profits. Even highly innovative AMD can only capture 20% of the market.

In 1996 Fortune interviewed Steve Jobs and asked him what he would do if still running Apple. He responded that he would "milk the Mac for all it is worth and move on to the next big thing."

This doesn't mean that Apple is going to abandon those of us with an investment in Apple hardware (or, riskier still, custom Cocoa software like we have)....

They are going to milk it for all it is worth.

With OS X, we have a NeXTSTEP/Mac fusion that Steve likes, and Apple will keep profitably pushing out software updates that they sell, but that isn't Apple's growth.

Their growth is in software: when Steve rejoined, they had recently gone from free OS upgrades to selling upgrades (OS 7 and OS 7.5, IIRC; maybe 6 was sold as well).

Now, Apple sells a new OS version every 1-2 years. They put out an iLife upgrade annually. They will probably put out iWork annually. And they replaced their free iTools service with a nicely growing .Mac system, where the cost of the storage is going to zero but the annual subscriber base keeps growing.

The average Mac customer pre-Jobs bought a Mac and used it for 6 years.
The average Mac customer post-Jobs buys a Mac and uses it for 3-4 years, with 2 OS upgrades, 1 or 2 software purchases, and 20% of a .Mac subscription (or some similar number). That means that Apple can sell a low-margin system like the Mini, pocket $100 on the system, and hope to grab another $200-$300 in software sales over the system's lifetime... So a $500 Mac Mini sale is as good for Apple as a $2000 PowerMac with 40% margins was 5 years ago.

Apple will keep innovating the Mac to milk the cash cow... They will NOT enter price wars or otherwise fight with MS or Dell or HP for market share. They will milk the cash cow and try to execute and expand markets, but they are NOT interested in growing to 10% of the market with the SAME profits as now by cutting their margins by 75%, however happy that would make the software developers.

It isn't a zero-sum game; they are selling the iMac or Mac Mini as a digital-life system. Sure, you have a Windows machine for whatever... but add a Mac Mini and a KVM (and annual OS X + iLife upgrades) to easily put your kid's soccer games on DVD and send them to his grandparents. That is their "growth" strategy.

It isn't a bad strategy, but selling easy-to-use digital toys is how Apple is a growth company, and Microsoft is becoming a mature company that will steadily increase its annual dividend.

Good for Steve Jobs, good for Apple shareholders, and hopefully good for its customers, as long as Apple keeps putting out new products that we want to buy. We are the cash cow to be milked, though, and we aren't going to see the price cuts of a price war, because market share and PC growth just don't interest Apple...

That said, I'm sure at some level Apple sees Linux entering the market for office networks, and realizes that with the best (and easiest to use) desktop Unix, it can enter that market too. If Apple gets the BEST WINELIB performance, the BEST Qt performance, the best Gtk performance, and has KDELIB and GNOMELIB ported... well, how hard would it be for Apple to compete with Linux for SOME share of the corporate desktop market?

Apple is in a position to make SOME gains in PC market share, but growing back to 10-20% over 10 years isn't giant tech growth... the iPod and OTHER SIMILAR projects are.

It's a smart business move, and Apple has set themselves up to grow profits steadily in their core markets, and then swing for the fences with new products like the iPod, iTMS, etc.

Alex

I thought I was kidding (4, Funny)

doublem (118724) | more than 8 years ago | (#13032396)

And all this time I thought putting the "Intel Inside" stickers on the Macs when I was in college was a cruel gag done to piss off the Mac users. Now I'm a bloomin' prophet.

The Microsoft Factor (2, Funny)

chiph (523845) | more than 8 years ago | (#13032402)

My personal theory of what's going to happen over the long term is that Microsoft will discover the benefits of running on a closed hardware platform (no more pesky driver problems from marginal hardware makers!) and will port Windows to the Intel Mac, where it becomes a best seller.

Chip H.

Bah (5, Insightful)

Maury Markowitz (452832) | more than 8 years ago | (#13032426)

Is it just me, or are the "insiders" who can't even spell just a tad less than credible?

Why can't anyone take the announcement at face value? Clearly IBM (and Moto/Freescale) don't want to develop new top-end chips for a small market. Who can blame them?

But Intel is going to build their next generation anyway. Apple's small market share is meaningless in this context; Intel is in a race with AMD for a huge market no matter what else happens.

Let's remember that Intel has been courting Apple for well over a decade now. They're also clearly unhappy with the crappy boxes being offered by their existing vendors. Having Apple onboard making cool products with their systems must be a dream come true -- "See, THIS is what an Intel machine can do".

But no, not enough of a conspiracy in that I suppose.

Yeh, Surprise... yay! (2, Interesting)

jvd (874741) | more than 8 years ago | (#13032437)

Why does this surprise us at all? It was a bit obvious that the switch had more to do with money (since IBM didn't want to lower their prices for Apple) than with any ideology about which processor Apple feels is better. Hello, wake up.

Comparisons can hurt (4, Insightful)

jav1231 (539129) | more than 8 years ago | (#13032471)

My Powerbook boots faster than my new Thinkpad, so this could go the other way: Apple fans could find that the Apple hardware may behave considerably differently. Even if not, Apple now has to compete with high-end gamer boxes when trying to be the fastest. Perhaps they won't try to be the fastest, just faster than Dell/HP. It's going to get very interesting, to say the least.

Here's what is so surprising (1)

blair1q (305137) | more than 8 years ago | (#13032474)

What is so surprising is that so many people don't understand that it was illogical for Apple to use higher-priced parts for 20 years just to be different.

I expect further incursions of logic into Apple's business practices: the use of aftermarket motherboards and fungible accessories, for example.

Its days as an iconoclast are over.

Which means its days as a boutique development shop are over.

So if you're a hardware designer working for Apple, you'd better either start sucking up hard, hoping you're one of the few who are kept, or start buffing your résumé, and maybe learning Chinese.

This is no troll. This is business, and Apple finally joined up.

Jobs ego factor and 360? (4, Interesting)

nobodyman (90587) | more than 8 years ago | (#13032478)

I'm wondering if the 360 has something to do with this, or if it at the very least nudged Jobs over the edge.

Hear me out. Most people have heard about Jobs' pathological reaction when he loses face, and everyone knows that he *hates* Bill Gates, right?

So a while back Jobs predicted 3GHz G5s in 2005 (which I guess became the "3GHz Promise"). IBM failed to deliver. Then Microsoft announced shortly before E3 that the 360 will use a 3.2GHz triple-core G5. I can only imagine that Jobs was pissed on some level that Bill Gates was trumping him in Apple territory.

Of course, there have been a few reports that the 360's G5 is essentially crippled, and that the chip will effectively be only twice as fast as the original Xbox's CPU. Even if that's true, I don't think it changes anything. Jobs may have figured (and I'd be inclined to agree) that even if the 360 chip is not really as powerful as it seems, it represents time and effort that IBM was dedicating elsewhere instead of working on improving its offerings to Apple.

In fact, when you consider that IBM is working with Sony and Nintendo on other customized PowerPC chips, it seems pretty clear where Apple stands in terms of priority. Not that I blame IBM -- why the hell would you care about the rantings of Steve Jobs when you are going to be selling your product to 3 out of the 3 biggest players in the console market, with each one amounting to way more sales than you'd ever get with Apple?

Not sure if it's the case, but it sounds plausible enough. At least he kept the promise though, right? ;-)