Slashdot: News for Nerds



Apple vs. PC in Adobe After Effects

pudge posted more than 12 years ago | from the ouch dept.

Technology (Apple) 84

An anonymous user wrote, "Digital Video Editing ran some tests to compare the Dual G4 with the Athlon MP in After Effects. They didn't use the fastest Athlons, but the results are pretty clear anyway. This is especially interesting after Apple announced that they would be killing Shake for x86 platforms. If Apple really wants to position the Mac as an alternative to x86 on the film / video effects market, they are going to need to improve their hardware, especially with AMD's 64-bit CPU just around the corner. From the article: 'Not one of the objective tests we conducted using After Effects bore out Apple's claim of Mac superiority. In fact, in most of the tests, the Mac was left lagging far behind.'"



first pr0st! (-1, Troll)

Anonymous Coward | more than 12 years ago | (#3484393)

FIrst pr0st baby!!!

Re:first pr0st! (0)

Anonymous Coward | more than 12 years ago | (#3494982)

This article was interesting.
I understand that OS X is slow.
I also suspect that After Effects does not utilize the AltiVec engine, although I am more than happy to be corrected on this point.

I would like to see a review of the product running under OS 9. Also, I am curious to know if any software out there other than Photoshop has been designed to use AltiVec.

Question (1)

leviramsey (248057) | more than 12 years ago | (#3484395)

I couldn't find it in the article, but is After Effects AltiVec and/or 3DNow! optimized?

Re:Question (0)

Anonymous Coward | more than 12 years ago | (#3484549)

Also, since it is Carbonized, is it going to be any slower, or less responsive?

Re:Question (1)

codesmith (61876) | more than 12 years ago | (#3484581)

According to Adobe's After Effects 5.5 New Features document, "After Effects 5.5 introduces important performance improvements, including Pentium 4 optimizations and faster parsing of expressions." Nowhere do they mention AltiVec in the 5.5 documents (at least in what's available on the web). However, AltiVec support has been present since version 4.x, and one would assume that it has been maintained since then...
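For context, AltiVec (like the Pentium 4's SSE2) is a SIMD unit: one instruction operates on several values at once, which is exactly the shape of the per-pixel loops an app like After Effects spends its time in. A rough Python sketch of the idea follows; the 4-wide grouping mimics a 128-bit vector register, and the function names are purely illustrative, not Adobe's API:

```python
# Illustrative sketch of SIMD vectorization (not real AltiVec code).
# A vector unit applies the same operation to a whole "lane" of values
# in one instruction, instead of one value per instruction.

def scalar_brighten(pixels, delta):
    """Plain scalar loop: one pixel per step."""
    return [min(p + delta, 255) for p in pixels]

def simd4_brighten(pixels, delta):
    """Same result, but processed in 4-wide chunks, the way a
    128-bit AltiVec/SSE register would hold four 32-bit values."""
    out = []
    for i in range(0, len(pixels), 4):
        lane = pixels[i:i + 4]  # one simulated vector register
        # On real hardware this whole lane is handled by one instruction.
        out.extend(min(p + delta, 255) for p in lane)
    return out

pixels = [10, 250, 100, 200, 30, 40, 50, 60]
assert scalar_brighten(pixels, 10) == simd4_brighten(pixels, 10)
```

On real hardware the four lane operations happen in a single vector instruction, which is where the speedup comes from - but only if the application was actually compiled to use the unit, which is exactly the commenters' question about After Effects.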

Re:Question (1)

Alan Partridge (516639) | more than 12 years ago | (#3484770)

I don't think it's really all that important - this test shows quite nicely that the G4 is currently pretty hamstrung by its 133MHz memory bus. I do video compression professionally using both Mac and PC workstations, and the Mac is currently one generation behind the PC in terms of performance owing to its memory bus. We're betting that Apple will address this during the summer, and we'll add a new Mac workstation at THAT point. Whether it's a G4 or G5 that'll be powering it doesn't really matter that much - right now the Mac just needs more bandwidth. Set against all that is the simple fact that our existing G4 workstations have both 64-bit PCI AND Gig Ethernet, so in some ways the Mac is still pretty quick. But, no doubt, our fastest machine is our Athlon XP 2100+ - it's a screamer. Our 933 Mac gives around the same performance (generally) as our older 1.4GHz T'bird. Still f**king fast by any reasonable measure, and amazing when doing it AltiVec-style (you should see the Apple MPEG-2 encoder go!).

OS Overhead? (1)

leviramsey (248057) | more than 12 years ago | (#3484425)

[I'm assuming that the systems were running OS X and XP, respectively]

How much could OS overhead play a part in the results? Does XP eat up an equivalent number of CPU cycles to OS X?

Re:OS Overhead? (1)

gospodin_david (577508) | more than 12 years ago | (#3484486)

Really now...

It's our beloved Aqua and Quartz that use up the processor, not Mach (the kernel). Mind you, Mach doesn't help the situation, even though it is an elegant solution. In other words, I'd think that unless the reviewer was playing with the Dock, or window resizing while running his tests, it's probably just a case of cheaper, faster hardware.

Of course I'm so taken with OS X, that I doubt that I'll ever go back to Windows (X11 is another matter).

slower is slower (0)

Anonymous Coward | more than 12 years ago | (#3484529)

Who cares where the slowdown is coming from? Slower is slower. Apple has to speed up their machine/OS combination, and fast.

Finally (5, Insightful)

elliotj (519297) | more than 12 years ago | (#3484446)

I really like the Mac. Honestly.

But I'm glad to see some independent testing on this front. I think those contrived Photoshop bakeoffs are an embarrassment.

I personally don't think Apples are as fast as PCs. I think most people agree. That's really not the point. There are many good reasons to buy a Mac. But a Mac running OS X is slow and everyone knows it.

Re:Finally (2)

azosx (568180) | more than 12 years ago | (#3484927)

I'd say Apples in OS 9 are a lot faster than PCs. There's just something about OS 9: everything is very snappy and crisp, and apps just open without that familiar grinding of your HD you get in Windows. Unfortunately, this isn't a very good argument considering OS 9 is officially dead - but so is Windows 98, soon, and I'm sure people will continue to run that for years to come as well. As for OS X, yes, if you're going to try and run it on your Tangerine iBook, sure, it'll be slow - but so would Windows XP on that Celeron 333 box you have lying around. I run OS X on a PowerBook G4 550 and it's pretty quick. I get a higher FPS rate in Quake III with the 16MB Mobility Radeon than I do in Windows 2000 AS with a 64MB GeForce2 MX 400 - go figure. Is it as fast in everyday use as Windows 2000 AS on my dual 1GHz P3 box? Hell no, but it's a lot faster than it would be on that P3 500 I have lying around. OS X, just like 2000 or XP, requires high-end hardware.

Re:Finally (1)

clarkgoble (241742) | more than 12 years ago | (#3492805)

The "snappiness" of OS 9 vs. OS X is mostly about interface elements. Purportedly Jaguar fixes a lot of this. The things discussed in this article about After Effects are much more tied to Adobe's coding and the speed of the processor, and thus are quite separate from "snappiness."

I think most informed people have always known Apple's hardware has speed problems. It isn't exactly parallel to MHz, of course. But it does show the problems with Motorola. I think Apple has to fix this or all their wonderful OS X fixes will be of little help.

Having said that, most computers today have far more power than people need. Network speed is almost always the problem. So unless you are doing computationally intensive work (e.g. heavy graphics) it is all moot. Any computer today is powerful enough. (Except perhaps a weak G3 with OS X, due to the UI elements.)

Re:Finally (0)

Anonymous Coward | more than 12 years ago | (#3485392)

It depends on your system. If you have a 733MHz PowerMac or 800MHz iMac, OS X won't run "slow" on them.

Re:Finally (1)

longbottle (537395) | more than 12 years ago | (#3503968)

I wholeheartedly agree. I haven't been a Mac user for long (bought a G3 600MHz iMac seven months ago) but since I got it, it's become my main machine. However, it's one of the most unresponsive machines I own... and I have a 233MHz machine running at 180MHz (don't ask), a Performa 475 (a 68k machine!) and a 486... yes, the iMac feels slower than its 68k cousin. I'm able to make a side-by-side comparison between my iMac and my Windows 2000 Pro machine... it scares me. Win2k box: PII 450MHz, 128MB RAM. iMac: G3 600MHz, 256MB RAM. I love my Mac, I really do... but 20 seconds to switch windows (in the same app, no less!) is absolutely intolerable. Apple's attempt to resolve this? Leave users like me out in the cold... Quartz Extreme (the first real speedup of OS X since 10.1) requires a graphics chipset not in my machine... and since it's an iMac (I wanted a G4, but I couldn't afford it) I can't upgrade the graphics. So the speedups in 10.2's QE will help me not at all.

Just my two cents. But, you should know, slow as a 68k machine or not, I still use it as my main machine... out of all nine of my boxes.

Re:Finally (1)

pi radians (170660) | more than 12 years ago | (#3510581)

but 20 seconds to switch windows (in the same app, no less!) is absolutely intolerable.

Okay, what are some people doing to their computers? I just don't get these claims. I own a G3 300 which is running Mac OS X 10.1.4 and there is not that much lag at all. I am using a G4 450 at work and while it is more responsive, my G3 isn't quite the dead horse a lot of users are claiming such computers to be.

I don't know what it is, but it seems either there are a lot of trolls out there spreading FUD about OS X or my computers (one that is 4 years old) have some kind of super-hidden ability.

Re:Finally (1)

longbottle (537395) | more than 12 years ago | (#3511281)

Just so you know... I wasn't trolling, just blowing off some steam. And let me toss a few examples your way... I type a one-line IM message... in the time it takes to appear and be ready to send and the rainbow disk cursor goes away, I could have gone and made a sandwich. Shitty IM client, you say? More than likely. How about something closer to home..., for one. I like it as an email client, but its HTML rendering when images are embedded... painfully slow.

If you don't have these kinds of speed problems, I'd be delighted to find out the tricks you've used to kick OS X in the ass and make it responsive enough to use as heavily as it's capable of. If you haven't tweaked anything, I'd say yes, your machines must have a super-hidden ability, because mine's simply not that fast. Well, unless I reboot into OS 9... but that doesn't happen very often, fast or not.

How is that exactly equal? (2, Interesting)

azosx (568180) | more than 12 years ago | (#3484684)

How are dual 1.533GHz Athlon processors anywhere near equal to dual 1GHz G4 processors? The combined processing power of the Athlons is over 1GHz greater than the combined processing power of the G4s. Again, I ask, how is this equal? Also, we know the amount of RAM in each system, but what type of RAM was it? The G4 had PC133, but the Athlon? It was likely using 266MHz PC2100 DDR RAM, far superior. What about the hard drives? Apple's was probably a DMA 66 5400 or 7200 RPM drive, while the PC was likely supplied with a DMA 100 7200 RPM drive. Processors aside, there's a lot more to consider when comparing apples to oranges.
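For what it's worth, the poster's clock-speed arithmetic checks out, with the usual caveat that raw GHz is only a crude proxy for real throughput (especially across architectures):

```python
# Back-of-envelope check of the combined clock-speed gap described above.
# Clock speed alone says little about real After Effects performance,
# but the stated gap is correct.
athlon_total = 2 * 1.533   # GHz, dual Athlon MP
g4_total     = 2 * 1.0     # GHz, dual PowerMac G4
gap = athlon_total - g4_total
assert gap > 1.0           # "over 1GHz greater" combined, as claimed
```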

Re:How is that exactly equal? (4, Insightful)

tps12 (105590) | more than 12 years ago | (#3484738)

Nice defense. You basically argued that it isn't fair to compare the two because Apple's hardware can't compete, and I would agree.

All this trial does is throw some long-deserved doubt on Steve Jobs' repeated claims that hardware specs are meaningless and that performance must be intuited emotionally rather than objectively measured.

This strikes me as a predictable outcome after years of focussing more on pretty cases and bouncing icons than on what's inside. Being 2 years behind the cutting edge in hardware just isn't going to pass muster.

Re:How is that exactly equal? (2, Interesting)

Alan Partridge (516639) | more than 12 years ago | (#3484975)

That's not really fair; the Mac isn't REALLY 2 years behind in any realistic sense - the performance of that dual 1GHz G4 would have been off the charts 2 years ago. Certainly, the PowerMac is showing its age (particularly that of its mobo chipset), but I think that we're rapidly approaching a new mobo from Apple that will meet or beat the best PC configs. If you look back, you'll see that Apple's performance hikes tend to come in bigger steps, less often, whereas the PC world gets a newer and faster chip every month or so. The PowerMac definitely isn't out of contention yet, but the overhaul is due. FWIW, the article bears out my experiences quite closely as someone working in a cross-platform environment.

Re:How is that exactly equal? (0)

Anonymous Coward | more than 12 years ago | (#3542449)

Get a clue -- what they don't tell you is the endless headaches you'll have running AE in XP. Like the day of constant rebooting XP we just did trying to edit a 10-minute spot -- something I could have done at home on my dual 800 Mac in a fraction of the time... and besides -- OS X hasn't even been optimized with the new compiler -- 10.2 will bring major performance increases, and it's fairly well known in the processor community that the G5 is poised to crush the 64-bit chips from Intel and AMD -- AND BE BACKWARDS COMPATIBLE AT THE SAME TIME. Think, think, think people!!! WHEEEE!!!!

Re:How is that exactly equal? (2)

jchristopher (198929) | more than 12 years ago | (#3484846)

How are dual 1.533GHz Athlon processors anywhere near equal to dual 1GHz G4 processors? The combined processing power of the Athlons is over 1GHz greater than the combined processing power of the G4s. Again, I ask, how is this equal? Also, we know the amount of RAM in each system, but what type of RAM was it? The G4 had PC133, but the Athlon? It was likely using 266MHz PC2100 DDR RAM, far superior. What about the hard drives? Apple's was probably a DMA 66 5400 or 7200 RPM drive, while the PC was likely supplied with a DMA 100 7200 RPM drive. Processors aside, there's a lot more to consider when comparing apples to oranges.

Uh, I think that's the point! Look at what you get for your money, see how much faster the PC is? It would be nice if Apple would include faster RAM, bus, and drives. Unfortunately they have chosen not to sell computers with those features.

The only comparison that anyone can make has to use what they sell. The Mac system compared in the article uses what they sell - in fact, the FASTEST computer currently available from Apple. Note that the Athlon system they compared it to is not nearly the fastest available Wintel, yet it wins the test anyway, and costs less than the Mac.

Re:How is that exactly equal? (2)

crisco (4669) | more than 12 years ago | (#3484939)

I would imagine that the equality is in the prices of the respective systems. Bang for the buck, I think my dad called it.

The Polywell [] starts at $3000, as does the dual G4.

The equality is also in the long standing claims of superior performance from some Apple tr^H^Henthusiasts.

Re:How is that exactly equal? (0)

Anonymous Coward | more than 12 years ago | (#3485511)

"I would imagine that the equality is in the prices of the respective systems. Bang for the buck, I think my dad called it."

Duh! Then why did they _remove_ 512 much-needed MB of RAM from the Dual 1GHz G4 if they compared two similarly-priced systems?

This is a major flaw in their test, since Mac OS X and applications do better the more RAM is available.

Nik D

Re:How is that exactly equal? (1)

Kilted_Ghost (559000) | more than 12 years ago | (#3487165)

Actually they removed the memory from the Athlon machine and not the G4.

Re:How is that exactly equal? (1)

iamroot (319400) | more than 12 years ago | (#3484949)

The point is that they're not equal. They picked the fastest available processors for each architecture. Mac processors are RISC, and PC processors are not, so dual 1GHz G4s wouldn't be equivalent to dual 1GHz Athlons either: a 1GHz RISC processor has an advantage over a 1GHz processor that doesn't use RISC. You can't really compare them easily. But that's not what really matters anyway.

Mac processors have almost always been rated lower in MHz than PC processors. When PCs were 1GHz, Apple was still making (about) 500MHz Macs. Apple said that because they were RISC, they were equivalent to the PC processors. The test approaches things from a practical point of view. In reality a platform is represented by its current (i.e. latest) hardware, at least for PCs and Macs. The point is that the latest PC hardware beats the latest Mac hardware in this test. Combined with the cost factors, this means that if you are someone who wants to get an After Effects workstation, you have two choices. You can get a PC for under $1000 which will be much faster than the Mac, or a slower Mac for over twice the price. This will seriously reduce the number of people who would go with the second option.

Re:How is that exactly equal? (1)

azosx (568180) | more than 12 years ago | (#3485055)

The whole point of the article was that they didn't pick the fastest available Athlon to date. They picked one that was supposedly most equal to the Apple. Apple claims dual 1GHz G4s are roughly the speed of a 2.4GHz P4. There you have 400MHz of play between the two. In this comparison, there is 1GHz of play between the two setups. It's not even comparable to Apple's claims; it far exceeds them. The article didn't prove that an equally equipped Athlon setup could outperform an Apple. The price of the two systems in this article's comparison is not relevant.

Re:How is that exactly equal? (1)

joedames (130760) | more than 12 years ago | (#3485313)

The price of the two systems in this article's comparison is not relevant.

Says you! I'd like to have an extra $400 as well as the extra time to spend it. This test is fairly conclusive... the fastest Mac is not as fast as the fastest PC; moreover, it's not as fast as the fastest PC from 6 months ago. I like Macs, but the PC is obviously faster at After Effects rendering.

Re:How is that exactly equal? (1)

p7 (245321) | more than 12 years ago | (#3485492)

I think the last sentence tells us why they ran this test.

"But if Mac users are under the impression that their machines can render After Effects composites faster than any Windows-based workstation, our tests do not support that conclusion."

There's even more (0)

Anonymous Coward | more than 12 years ago | (#3488151)

People see these comparos, complain about how slow the system is and how Apple needs to make it faster. Apple doesn't develop the processor or motherboard architecture that they use. That's Motorola. Apple manufactures the stuff, but doesn't create it. And when Motorola is hurting financially, Apple hurts and has to wait longer for processors. Look at the G3 and how long it took for us to get the G4! The x86 has two _competing_ companies who make sacrifices to get faster products out the door (i.e. they cut profit margins). Motorola doesn't have that pressure. Moreover, look at Intel and AMD: they make only processors. Moto makes all kinds of different devices, including my cell phone next to me and the G4 running in my TiBook. So if these computer shootouts are ever going to be even, Apple does need faster products, but if it can't have access to the faster technology to make faster computers, everyone's just going to have to wait until it becomes available.

Re:There's even more (0)

Anonymous Coward | more than 12 years ago | (#3491148)

AMD and Intel don't "Just make processors". You want to know why Motorola is hurting financially? Because Intel and AMD have been making embedded CPUs for network appliances and mobile devices. This is direct competition with Motorola's main business. But really now, AMD and Intel have as much variety in their products as does Motorola.

You are right about the fact that there's more effort driving development on x86 hardware. But the reason for this, in my opinion, is due to the fact that we're living in a Windows dominated world. If Windows ran on Mac hardware, we would see more effort in developing PPC hardware.

Do or die? Do or flounder? (1)

tomdarch (225937) | more than 12 years ago | (#3484710)

Let me state the obvious - Apple needs to release G5-based systems soon, or it will be in deep doo-doo. It needs to be cutting edge - DDR266 RAM! Dual processors well over 1GHz! Oh wait, I meant "cutting edge as of a year ago". It should be said, though, that the dual G4/1GHz system they tested did OK considering the system and processor speeds. It has me drooling for one of those dual 1.6GHz G5 systems! MacWorld NY will tell...

Re:Do or die? Do or flounder? (1)

tupps (43964) | more than 12 years ago | (#3488463)

Having processor speeds over 1GHz doesn't mean anything by itself.

A dual 400MHz Sun server comes in at $18,000, and I am sure it can blow away your PC with 2.4GHz chips.

What I want to know is how long it takes me to:

Start up from sleep
Read my email,
Surf some web pages,
Open Dreamweaver
Make edits to web sites
Open Fireworks
Edit some web images
Open QuickTime VR Authoring App
Stitch some QTVRs
Open Movie Editing Software
Download and edit some movie files
Render out movie for TV, CD and web
Upload web sites
Surf the web (web site testing)
Read some more emails
Put the machine back to sleep.

In 90% of this work (apart from stitching the QTVRs and outputting movies) I am not going to be waiting for the machine; the machine is going to be waiting for me.

That is why the user experience on the Mac is so important. If you can get around quickly, moving data easily, then the Mac will outperform the PC any day. For me this is what makes the Mac head and shoulders above any PC.

The simple fact that my Mac has stayed up for a couple of weeks now without rebooting (since the last update) is invaluable. It cuts many minutes out of my day.

Everyone already knows this (0, Troll)

jchristopher (198929) | more than 12 years ago | (#3484789)

This is not news... it is obvious to anyone who has used both Macs and PCs that a $1500 PC destroys a $1500 Mac with regard to speed.

I suspect that those people who are still using Macs either:

1. Don't have speed as their first priority, (maybe they are more interested in tightly integrated hardware/software, for example)

2. Simply don't realize how much faster Wintel has gotten because they haven't used one lately, due to bias and/or choice.

Re:Everyone already knows this (1)

whee (36911) | more than 12 years ago | (#3485266)

I'd like to give you my perspective on Macs right now. I have a 1.4GHz Palomino, lots of RAM, the works. It was very fast when I built it for $1500, and it's still fast now. Despite this, I recently bought a $2800 800MHz PowerBook; it'll be slower, but there's no way you can duplicate the experience of OS X.

Find me a laptop with the features of the PowerBook (gigabit ethernet, DVD/CD-RW drive, the slim size) for a comparable price, and an advertised battery life of five hours (they all inflate; it's reasonable to compare the inflated values), and I might consider an x86 laptop. Sure, the PowerBook may be slower in some respects, but it makes up for that with features.

Combine that with Cocoa and a user-friendly UNIX, and you'll see why I (and many others) are willing to pay more for a Mac. I'm willing to pay more just for a stable development environment; no fuss about what GUI toolkit to use, what language, or things like that. The tools they provide (Interface Builder, Project Builder, gcc3) are all I'll ever need --- and they're free.

Looking at the upcoming features of Jaguar [], I'm even more pleased with my purchase. Where else can I get an OpenGL accelerated GUI, ZeroConf, and a tool like Sherlock 3 just by using this consumer-level OS? Speed should always come second to usability and the ability to get work done.

That's why I'm becoming a Mac user, and I suspect it is the same reason for many others. We realize how much faster x86 is --- but it doesn't matter. Experience is key.

Re:Everyone already knows this (1)

joedames (130760) | more than 12 years ago | (#3485330)

take it easy... nobody is saying macs don't have great value. This was a speed test. The mac lost.

Nobody insulted your choice of computer, so you don't have to defend your choice of OS. I too use OS X, but it does need improvements in speed. I wish that Apple had found the true medium between form and function on this OS release; maybe Jaguar will do it. The OS is just too dang pretty considering how slow it runs.

Speed it up, then make it pretty.

Re:Everyone already knows this (0)

Anonymous Coward | more than 12 years ago | (#3486151)

>Speed it up, then make it pretty.

Strangely, this is exactly the opposite of what they've been saying on the OpenOffice discussion (yesterday, I think). There are many posts saying, "I'll never use it until it looks good - regardless of functionality."

It seems to be two sides of a coin, and Apple chose to have "style over substance" for now. Style is a HUGE part of the Mac Mystique. I admit, the look of Aqua has had me hooked since the first day they posted a Beta Screenshot on

Plus, it will encourage hardware upgrades for Apple, which accomplishes two important things: ONE, it generates cash and market share to help improve their products and draw developers, and TWO, it eliminates the need to support so much old hardware, which is a bloody huge millstone around their neck. Supporting old hardware and software drains valuable resources into essentially non-productive (read: non-profitable) areas.

I also noticed that game-makers do the same thing - design their games to run well on next year's hardware! John Carmack of id Software confirmed it in an article here on Slashdot. If Apple's optimizing their OS for next year's hardware, well, it kind of sucks speed-wise now without RAM upgrades and such, but it will *rule* for the people who are migrating from x86 or upgrading. And that's who they're trying to impress - the people who will pay for their new stuff.

Hardware is really designed to run the software of TODAY, not tomorrow. My PB520 runs everything fast, fun, and stable - as long as I run the versions of Netscape and Office that I got at the time of purchase! We lust after computers like we lust after Ferraris or Porsches, Beemers, or women. There's always one better on the way, "Just over the next hill". Be happy with what you have, or pony up the cash for the new one if you're too hooked. That's life. We never seem to be satisfied. I hate to say it, but I think now that Apple is on track, they may keep coming out with "must-have" hardware and software FOREVER!!! We've got to be satisfied sometime, or we'll go broke and insane. I'm pretty close (just ask my dead grandmother who told me to buy the NEW 10 Gig iPod!).

P.S. I'm already saving up to buy a G5 PB in (I'm hoping) May of next year. Think different, think ahead. That's how you get more new Mac stuff.


Re:Everyone already knows this (1)

jchristopher (198929) | more than 12 years ago | (#3485625)

Speed should always come second to usability and the ability to get work done.

A big part of usability is the system responding in an instant when the user clicks on something. If this doesn't happen, the user starts thinking "did the computer realize that I clicked?".

Maybe they click again, or they assume they're "doing it wrong" when the only problem is that the system is slow to respond to their mouseclick/keystroke.

The ability of the system to quickly respond to user requests is a giant part of usability. I don't want to GUESS where to click because a window is still redrawing, or wonder whether I can start typing in a text field yet.

Is the system ready for my input? With OS X, I don't know and it's very frustrating. Speed has EVERYTHING to do with usability. Windows does not suffer from this problem.

Re:Everyone already knows this (2, Interesting)

drsmithy (35869) | more than 12 years ago | (#3488699)

You are confusing speed with responsiveness. OS X lacks the latter as well, so the gist of your gripe is still applicable. However, the problem is *responsiveness*, not speed. An interface needs to respond immediately with some sort of feedback that the thing you have just done is getting some sort of reaction. That's why Windows 95 introduced the little hourglass+arrow pointer and MacOS Classic has the little zooming rectangles whenever you double click something.

It is quite possible for a machine/OS/interface to be slow but still remain responsive. Unfortunately OS X fits squarely into the "unresponsive" category, even on quite fast machines like my PB667 (and a G4/933 isn't much better). X and its associated window managers/GUIs/whatevers tend to suffer the same problem. NT-based versions of Windows (particularly later ones like 2k and XP) remain quite responsive even on slower hardware, and the king of all in terms of responsive GUIs, I'm led to believe, was the Amiga.
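The speed-vs-responsiveness distinction can be made concrete: a system stays responsive when slow work is kept off the loop that handles input, so every "click" gets immediate feedback even while a long job grinds away. A minimal Python sketch of that idea (illustrative only, not how any of these OSes are actually implemented):

```python
# Illustrative sketch: responsiveness vs. raw speed.
# The slow job runs on a worker thread; the "UI loop" keeps
# acknowledging input immediately instead of blocking on the job.
import threading
import time

def slow_render(done):
    time.sleep(0.2)   # stand-in for a long render or disk grind
    done.set()

done = threading.Event()
threading.Thread(target=slow_render, args=(done,), daemon=True).start()

acknowledged = []
start = time.time()
while not done.is_set():
    acknowledged.append("click handled")  # feedback within one loop tick
    time.sleep(0.01)                      # simulated 10ms event-loop cycle
elapsed = time.time() - start

# The render still took ~0.2s (the machine is no faster), but every
# simulated click was acknowledged within ~10ms -- the UI felt responsive.
```

The total work takes exactly as long either way; what changes is that feedback arrives immediately, which is the quality the commenters say Windows 95's busy pointer and Classic's zooming rectangles provided and early OS X lacked.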

Re:Everyone already knows this (1)

jchristopher (198929) | more than 12 years ago | (#3488967)

That is exactly correct. Example: start up an MP3 player in the background, then fire up a browser. Notice that even as the browser lags as you click on stuff, the MP3 will continue to play just fine without stuttering. Fire up some more programs, and your MP3 will continue to play just fine, even as you suck up memory. The "machine" is fast, as far as running tasks, but the poor responsiveness makes it seem slow.

Right now, OS X has a SERIOUS responsiveness problem. If you think it's bad on a fast G4, be glad you don't have an iBook. Great little machines, but I couldn't believe how unresponsive it was. Sold it to some zealot that can't see the truth even when it's staring them in the face. Seriously, if it works for folks, more power to them. But I don't see how anyone could use the iBook with OS X on a daily basis - I couldn't. I was ready to throw it out the window!

Re:Everyone already knows this (2)

0x0d0a (568518) | more than 12 years ago | (#3488700)

Windows does not suffer from this problem

It's all relative. I still happily use my PII/266 as my main box. I just use Linux, Sawfish, and a suite of fast apps (dillo and rxvt are the two primary ones).

Re:Everyone already knows this (0)

Anonymous Coward | more than 12 years ago | (#3488800)

With the exception of my iDisk (soon remedied by using Goliath) I have never had any problem with delayed response when clicking in OSX. Window resizing is slower than OS 9, but it has never provoked an outbreak of frenzied clicking on my part.

My computer definitely wouldn't fare very well in a bakeoff with a state-of-the-art Athlon. It's a two-and-a-half-year-old 350MHz G4. But in my experience OS X usability is fine on this machine. I regret I actually delayed the switch to OS X because of reading posts like this. The moral: the tales of trolls are best treated with scepticism.

Re:Everyone already knows this (0)

Anonymous Coward | more than 12 years ago | (#3485549)

3. Have a huge investment in Mac software and are stuck using it even if they desired to change.

Shake for x86 is NOT being killed. (5, Informative)

Doktor Memory (237313) | more than 12 years ago | (#3484851)

You can run Shake on x86 to your heart's content, as long as you run it on Linux. This being slashdot, you'd think the story editors would be clued up on this sort of thing...

Re:Shake for x86 is NOT being killed. (1)

hublan (197388) | more than 12 years ago | (#3485978)

You can run Shake on x86 to your heart's content, as long as you run it on Linux.

But after mid-2003 it will be biting the dust as well, along with the IRIX version. Yes, Steve is very generous.

I know of a few long-format productions that had geared their pipelines towards using Shake as their comping tool. When the time comes to deploy, they can no longer get licenses, and what we're left with are some seriously pissed-off people out there.

Effectively, Steve didn't just kill Shake, but the entire client base as well. Good job.

Re:Shake for x86 is NOT being killed. (0)

Anonymous Coward | more than 12 years ago | (#3511626)

Uh... No one knows what will happen to IRIX and Linux shake after 2003. Apple just said it will develop it until at least 2003. They may kill it then, they may not. They're probably going to wait and see what the customers want.

Er, no. (2)

Doktor Memory (237313) | more than 12 years ago | (#3515073)

But after mid-2003 it will be biting the dust as well along with the IRIX version.

Er, no. Nobody from Steve on down has said a damn thing about the fate of Linux/IRIX Shake after 2003 other than that they'll evaluate it at the time.

Apple has been pretty consistent about being willing to publish non-MacOS versions of their top-end software (e.g. WebObjects) when they know that there's a demand for it. I strongly suspect that if current Shake customers make their needs known, they will be tended to.

*Apple is dying (-1)

Chinese Karma Whore (560174) | more than 12 years ago | (#3484856)

Fate, chance, karma, whatever you wanna call it -- when Miss Fortune spreads her legs for you, you're already in over your head. Believe me, I know.

Bunny LaFever looked like a dame with more curves and venom than Reggie Peeler's Land O' Snakes. But she wasn't a real dame. She was a she-devil. That golden bush of hers was nothing but a welcome mat to hell.

But now I'm getting way ahead of myself. Bunny had a way of doing that to jerks like me. She twisted us inside out and turned our heads around so we couldn't think straight anymore. So lemme begin at the beginning ...

Carnies got a word for a crooked game operator like me. They call me "Flattie" cuz I'll flat-out rob you and make you like it.

My name's Randy Everhard and I've got a million ways to take your money. One of my personal favorites is the "hopper shot." It's tossing softballs into toilet seats, which you've seen on every midway in your life. I could gaff the joint to make it impossible to win.

But where's the fun in that? I work it so any chucklehead can win all night long. Cuz once I've hooked a live one into thinking he can take me for a ride, that's when I nail him with the "build-up." Caught up in the excitement of winning game after game, the rube's built up to play twenty games at two bucks a pop. And the only prize he's going home with is a teddy bear that cost me three shekels per, wholesale. You do the math, Einstein.

The problem with selling three-dollar plush for forty scoots is that the build-up only pays off if you've got a steady string of suckers. And that night was turning out to be a real larry. The Laff Riot carnival was a flattie's wet dream. The grab joints and flashy rides were a front for the real action: flat stories, alibi and percentage joints, crap tables, slot machines, fortune wheels.

The show was running wide open. Everybody crooked and every joint gaffed and nobody doing a damn thing to stop it. I figured the cops were greased slicker 'n Liberace's asshole. It should've been like shooting trout in a barrel. Too bad nobody was taking my bait. I was up shit creek without a paddle to piss on.

My first goddamn night with the show, and already I was itchy for a new angle.

I can't remember which one of them I saw first: the blonde come-on dressed like she had an exhibitionist streak a mile wide or the square in the coke bottle glasses who was eyeballing her like she was nothing but something to look at. Of course, that Coppertone beauty really was something to look at. She was turning heads and raising dicks all over the place. But I didn't like him getting his eyes all over this piece of 100 percent corn-fed cocktease.

She was stacked like a double-decker Ferris wheel with nipples that could cut glass. The red double-O's stenciled on her football jersey were stretched over humongous hooters. She looked like a shooting gallery, bursting at the seams. You couldn't miss those twin titty targets. I'm talking knockers so big you could still see them when she turned around. And believe you me, she was one woman who looked as good going as she did coming.

She wore a pair of daring Daisy Dukes that were so short and tight her crotch sucked them in. The denim over her ass was thread-bare, blown out like a retread. And if that wasn't enough, she was doing a number on a grape Popsicle to make your peter wish it was frozen on a stick. That girl was one carnival ride I wanted to jump on quick, and I didn't care how many tickets it cost.

In my racket, though, business comes before pleasure. And this looked like a golden opportunity to work the key scam. It's the oldest con in the carny book.

I jumped the counter and made my way over to the chump with the steamed-up glasses. I was like, "Hot enough for ya? And I ain't talking about the weather, fella." At first he didn't buy it when I told him I was the "manager" of this fine talent. He just stood there mopping his brow with a hanky.

"I don't fuck chickens and I don't shit feathers," I said, "and I wouldn't lie about a piece of ass like that, neither." I gave myself a hard-on feeding him the fast talk: screwing her would make a man think he died and gone to heaven, where the streets are paved with solid gold snatch.

"She's a sight for sore eyes, ain't she? And if you think I'm giving you lip, you oughta see her go to town on a dick. Life-transforming, friend. Life-transforming." I pulled out an old key I kept for just such an occasion. Dangling it before his bug eyes, I spieled how it was the key to her room at some motel outside of town. "I'm talking once-in-a-lifetime opportunity, pal. She's the reason hard-ons were made."

He swallowed it all -- hook, line and sinker.

Chuckling over what he was going to tell his wife when he came home minus his paycheck, I made my way over to the sultry sex kitten. She was throwing heat like a furnace. Melting chocolate bars at twenty paces. It was too hot to fuck, but next to her, that scorcher felt like a cool, seaside breeze.

"I just made you twenty bucks, and all you had to do was stand here looking gorgeous, Gorgeous." She didn't say anything, just looked me up and down and blinked those big baby blues. The sheen of sweat on her face glowed under the neon lights. She'd sucked all the flavor out of the end of the Popsicle, so the tip was white.

I fished out a crisp, new bill and passed it over. She let it rest in the palm of her hand as she stared at it, confused. She tried giving it back to me, but I stopped her. "See that guy over there?" I asked, stepping aside to give her a glimpse. "He just paid me a lot of money to sleep with you."

He what?" she goes, insulted. She threw down what was left of her Popsicle and took a step closer. Her eyes burned like a butane flame. Like most women, she looked better when she was steamed. But I didn't want her making a scene. She was liable to blow the act.

"Don't get yer panties in a bunch," I said, shutting her cakehole with my hand. I told her about the con and then nervously took my hand away. I was sure she was gonna blow up again. But she kept quiet. I told her we had to scram and didn't give her a chance to say no. I just put my arm around her waist and steered her toward the exit gates. I gave Pops a back-handed wave as we booked outta there double-time.

My dick is long and my cons are short. Cop and blow, that's my motto -- take the money and run. Otherwise things got a way of getting ugly.

Two minutes later, we were hauling ass down the highway in my supercharged Chevy Menace. It was an acid green two-door with cheetah seat covers, four on the floor and dual exhaust. Twin cams and 440 horses under the hood.

"Say," I said, "what's your name, anyway?"

I was hoping to get to know every inch of her better. She smelled like coconut oil. Her tanned skin gave off heat like asphalt that'd been baking in the sun all day.

"Bunny," she goes. "Bunny LaFever." She was a real piece, too. I couldn't wait to do all sorts of dirty things to her. "How much you take him for?" she asked. "Two-fifty." In actuality I scored three-fifty. But if there's one thing I know about women, it's never tell them exactly how much money you've got.

Back at my room at the God bless America Truckstop Motel, she showed me that that sweet and innocent show was just a put-on. I was glad, though. I prefer a girl with some experience under her belt.

Before I knew it, she was all over me like stink on shit. Purple from the Popsicle, her tongue sprung to the back of my throat and then snaked all over the inside of my mouth like she was mining the gold fillings out of my teeth. Despite all the tongue wrasslin,' her hands were nowhere near where I wanted them to be.

My dick had been so hard for so long I thought it would blast off like a rocket, but she kept her distance. The teasing was cute at first but enough was enough. I grabbed her hands and planted them on the tent pole in my pants.

She pulled away and took a few steps back.

"You trying to insult me? You think you can have this body for free?" Bunny squeezed her 'lopes together, serving them up for my hungry eyes: "These tits alone cost five bucks to look at."

I chuckled nervously. "C'mon," I go, "quit screwing around."

"I'm totally serious. Five bucks or I'm gone."

I started laughing for real, digging the little swindler. What else could I do but pay up? She had me right were she wanted me.

This was one of those times in a man's life when he knows his dick's doing the brainwork but he doesn't care. Whatever the dick wants, the dick gets. That right there's the whole story of my life.

I plucked a five-spot from my wallet and waved it like a flag of surrender. She just looked at it. "I don't want your money now," she goes. "Pay me later."

"Whatever you say." And I just eased back on the bed to enjoy the show.

She peeled off her T-shirt and out bounced those giant, all-natural juggs. She had razor sharp tan lines from the sling of a skimpy bikini top. You could tell from her nips that the air-conditioning was on full-blast.

Bunny danced around the room, wiggling and shaking everything her momma gave her. I looked her up and down until I could've guessed her weight. She had all the right parts in all the right places and then some.

She neared the bed and leaned over me to let those massive, all-American melons swing inches above my face. "Wanna taste them?" she goes. As if she had to ask.

I lifted my head to suck the tantalizing titties into my mouth, but she snatched them away.

"Five bucks," she goes.

"All right, five bucks."

"Five bucks each, big spender."

"You got it."

"Pay me later," she cooed, and moved closer to bury me beneath her treasure chest. "Mmm," she purred, "you suck real good."

"Damn straight," I mumbled. "You're getting my money's worth."

She only laughed as her fingers spider-walked down to my crotch and unzipped my fly. "You'd like a tit-fuck, wouldn't you?"

It wasn't a question. It was a statement of fact. Some girls are mind readers, but Bunny LaFever was the first dick reader I ever had the pleasure to meet.

"Twenty bucks," she barked.

I was like, "A bargain at twice the price. Pay you later?"

"That's right, bright boy."

We switched places on the bed so that she was on her back. I kicked off my shoes and pulled down my pants and underwear. This dick of mine's got its own zip code and time zone.

When she gripped the shaft, her fingers didn't reach all the way around. She was like, "Lucky for you I'm still in my size-is-everything phase."

"Me, too," I said, dropping to my knees to straddle her. My hard-on slipped between her cleavage like a hot dog in its steamed bun. She pressed them together to make the sandwich good and tight as I began my strokes.

I humped her hooters harder to push my dick closer to her succulent mouth. She stuck out her pink tongue and tickled the tip. Back and forth it fluttered over the head.

"There's a freebie," she giggled. "But I won't take one in the mouth for less than twenty."

"How much to swallow?"

She had to think that one over. "Thirty," she answered. "And that's only cuz I like you."

I dismounted and stood beside the bed. She sat on the edge of the mattress to let her mouth get better acquainted with my cock. Her tongue twirled over my shaft until it looked like a monument of polished marble.

She blew me good and slow, repeatedly bringing me to the edge of orgasm and then stopping until the urge melted away.

The build-up felt so good it hurt. I never begged anyone for anything before. But tortured by her talented tongue, I was actually begging for mercy.

After some more tongue lashing, she finally let me fill her mouth. She swallowed, too, and it felt like my whole body was sliding down with it.

Apple Has FCP (1, Insightful)

Anonymous Coward | more than 12 years ago | (#3484887)

It's really silly to think that Apple has anything other than its own video editing software in mind when it makes claims about itself in that market.

Apple makes sure that Final Cut Pro works at least as well on their hardware as any comparable editing solution on any other platform. Lots of professional editors have moved to it.

Apple acquired the software that would later become FCP from Macromedia. Look at what happened when Apple acquired Zayante. Now look at how they used them.

The same thing will eventually happen to Shake too; it will be Applefied -- a new skin and some new features added and, most importantly, its useful pieces integrated in other Apple products where it will increase product value.

That's Apple's thing:
Apple Hardware + Apple Software (original, or acquired and retooled) = better overall product/user experience. At least, that is what it looks like they are doing to me, and until you get a benchmark to measure that, I can't take trollbait like this too seriously.

Re:Apple Has FCP (0)

Anonymous Coward | more than 12 years ago | (#3487643)

This is simply not true. I'm at WWDC right this very minute, and everyone here -- all three thousand of us -- is being shown all the details of how to write fast code on OS X.

If After Effects is slow, it is because Adobe is not competing and not writing fast code -- especially given that processors 4-6 times as fast as the competition per hertz come out slow in this test.

It gets really annoying coming to Slashdot and seeing such bogus, unobjective "test" results trotted out. I take it as a concession that everyone knows the PowerPC is much faster but everyone (in the Slashdot community, anyway) is in denial.

ANYONE can write AltiVec-enhanced code, and anyone who does will kick their x86 competition to the ground, time and again.
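As a rough model of what "AltiVec-enhanced" buys you (AltiVec executes four single-precision operations per instruction), here is the same dot product written element-at-a-time and as one whole-array operation. This is an analogy, not actual AltiVec code: NumPy's vectorized call stands in for the SIMD unit, and the gap illustrates why hand-vectorized inner loops dominate benchmarks.

```python
import time
import numpy as np

n = 200_000
rng = np.random.default_rng(0)
x = rng.random(n, dtype=np.float32)
y = rng.random(n, dtype=np.float32)

# Scalar style: one multiply-add per step, like an unvectorized C loop.
start = time.perf_counter()
acc = 0.0
for i in range(n):
    acc += x[i] * y[i]
scalar_t = time.perf_counter() - start

# Vector style: one whole-array call, like a hand-vectorized SIMD loop.
start = time.perf_counter()
acc_vec = float(x @ y)
vector_t = time.perf_counter() - start

print(f"scalar {scalar_t:.3f}s, vector {vector_t:.5f}s")
```

On any machine the vector version wins by orders of magnitude, which is the shape (if not the mechanism) of the AltiVec argument.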

Adobe was giving the Mac a poor showing in video editing, so Apple started to compete, deciding it was a core multimedia market they wanted to do well in. So Apple's putting effort into it, and their products are better than the competition. Simple competitiveness. Only on Slashdot could that fact be claimed to show the PowerPC processor is slow.

After effects sucks. I wonder why Apple bought Nothing Real?


killing shake for x86?! (1)

tarzan353 (246515) | more than 12 years ago | (#3484994)

after Apple announced that they would be killing Shake for x86 platforms

No, Apple announced that they were discontinuing the Windows version. They are continuing to support other x86 operating systems until at least 2003, at which point they will re-evaluate the market. This is not killing it, just leaving their options open.

Another benchmark (2)

dh003i (203189) | more than 12 years ago | (#3485191)

These are some benchmarks the scientific community will be interested in:

(1) See how long it takes each machine to completely align a large gene (e.g., 500+ nucleotides) across a large number of isolates (e.g., 30+).

(2) See how long it takes each machine to complete a maximum likelihood heuristic search using a large gene and a large number of isolates, to determine the phylogeny of the isolates.

(3) Etc.
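As a stand-in for item (1), a CPU-bound alignment kernel is easy to time. Below is a minimal Needleman-Wunsch scorer in Python; this is a hypothetical benchmark of my own, not a tool from the thread, it's pure Python (so it mostly measures the interpreter, not the chip), and it's scaled down to 5 sequences of 200 nt so it finishes in seconds. Scale the counts up to match item (1) for a real test.

```python
import random
import time

def nw_score(a: str, b: str, match=1, mismatch=-1, gap=-2) -> int:
    """Needleman-Wunsch global alignment score, O(len(a)*len(b)) DP."""
    prev = [j * gap for j in range(len(b) + 1)]
    for i, ca in enumerate(a, 1):
        cur = [i * gap]
        for j, cb in enumerate(b, 1):
            diag = prev[j - 1] + (match if ca == cb else mismatch)
            cur.append(max(diag, prev[j] + gap, cur[j - 1] + gap))
        prev = cur
    return prev[-1]

random.seed(0)
seqs = ["".join(random.choice("ACGT") for _ in range(200)) for _ in range(5)]

start = time.perf_counter()
total = sum(nw_score(seqs[i], seqs[j])
            for i in range(len(seqs)) for j in range(i + 1, len(seqs)))
print(f"all pairwise scores in {time.perf_counter() - start:.2f} s")
```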

Re:Another benchmark (2)

ScumBiker (64143) | more than 12 years ago | (#3485350)

Ok, I'll bite. Which side wins your proposed gene counting, Mac or PC? As a recent purchaser of a new dual-1GHz Mac, I thought I was getting a reasonably fast machine. I'm satisfied with what I've got, and I'm enjoying the hell out of learning this beautiful new OS, along with grinning from ear to ear as I cat foo | grep oof in the

Re:Another benchmark (2)

0x0d0a (568518) | more than 12 years ago | (#3488711)

For a market like this, parallelizable scientific computing will, at least for several years more, stick with x86. Why? It's cheap. No one cares about a slick desktop or neat UI features if you just want another headless box in the cluster chewing away at numbers.

Of course, none of these people are going to be using Windows, either...

Re:Another benchmark (1)

PaulBu (473180) | more than 12 years ago | (#3489142)

Hey! "parallelizable scientific computing"??? ASCI (used to be world fastest until a week ago) computers more or less alternate between fastest Intel and fastest IBM/Motorola/Apple processors. How comes they are 'stick with x86"?

Paul B.

P.S. check out for specs...

Re:Another benchmark (0)

Anonymous Coward | more than 12 years ago | (#3501084)

No, they stick to the fastest IBM processors. This has absolutely nothing whatsoever to do with the CPUs shipping in Apple machines.

There is not ONE SINGLE machine with Motorola PowerPC CPUs (the ones you get in a Mac) in the top 500.

The IBM processors aren't really PowerPC CPUs, but descendants of their PowerParallel chip. It's got two floating-point units and no AltiVec unit - basically the exact opposite of the current Mac processors. This CPU will never ship in a Mac - it needs huge amounts of cooling and costs $10,000 per unit. The IBM CPU tops the current SPEC benchmarks, while the best Motorola chip delivers less than *half* the performance of an Athlon.

Re:Another benchmark (1)

DavidRavenMoon (515513) | more than 12 years ago | (#3494205)

AfterEffects runs like a dog in OS X. Here's some more benchmarks:

From: Computing using Mac OS X []

"For people who would like such a comparision .. this code (after AltiVec and dual processor optimization) runs almost 10 times faster on a dual 800 MHz G4, as compared to a 1 GHz Pentium III (g77)!! If you compare it to a 1 GHz Pentium III with a commercial compiler (Intel Fortran: ifc) .. hey, its only fair :-) .. the dual G4 800 MHz is still more than 4 times faster!"

"Not convinced? Then read this study [] done by NASA on the G4's scientific computing potential. In my view, it is probably the most detailed and extensive study done in this regard."

In the NASA document:

"While AltiVec compiler support is not available for general F77 computations, Absoft has implemented AltiVec in a limited number of F90 vector functions and BLAS routines in their the Mac OS v6.2 compiler. These operations are accelerated under AltiVec by providing vectorization and 4-way parallel processing of single precision floating point computations [Reference 6]. To test this feature, a benchmark code was developed using the F90 "matmul" function, which multiplies matrices in array form. In this test, 200x200 matrices A and B were multiplied to form the 200x200 matrix C, and the computation was repeated 100 iterations for more accurate timing."

Table 4: "Summary of F90 'matmul' Benchmarks" shows that when code is written to use Altivec a 500MHz G4, running Mac OS 9 completed the test in 1.5 seconds and scored 1067 MFLOPS. A 800MHz Pentium III running Red Hat Linux completed the test in 10.3 seconds and scored 155 MFOPS. The next fastest after the G4 was a 500Mhz Alpha 21264 running Red Hat. It did 286 MFLOPS in 5.6 seconds.

"The MFLOPS benchmark was obtained by dividing the time benchmark into the number of floating point operations (FLOP) needed to perform the matrix multiplication with traditional scalar computations (involving nested DO loops). Using an operation count of 1 FLOP for each scalar multiply and add, the multiplication of NxN matrices requires approximately 2N3 FLOP [Reference 9]. For 200x200 matrices, repeated 100 times, this results in a total of 1596000000 FLOP, or 1.596 GFLOP." []

Re:Another benchmark (0)

Anonymous Coward | more than 12 years ago | (#3501107)

Frankly, although I like Macs, it isn't fair to compare manually AltiVec-optimized code with automatically generated code on x86. Either you compare the automatic code on both platforms, or you compare manually optimized AltiVec with manually optimized SSE.

And comparing a dual G4 with a single-CPU Pentium of the last generation is ridiculous. If you want to claim it's faster, compare it to a dual Xeon at 2x2.4 GHz.

Finally, the NASA document says quite explicitly that this only applies to single precision... which is used by about 1% of scientific programs. The remaining 99% is double precision, which isn't AltiVec-optimized.

We're working with Molecular Dynamics - check out for some benchmarks comparing the G4 with x86 in single precision. The G4 uses altivec and the x86 SSE, but the x86 is about twice as fast as the G4.

I'd be willing to pay you $1000 if you can show us how to make Mac inner loops as fast as x86 (just download the code, it's free software)!

Video Production, Hardware, and you (1)

Dr. Bent (533421) | more than 12 years ago | (#3485338)

These guys [] use OpenGL to speed up video compositing (to the point where it's real time!). It's interesting to see what is essentially gaming hardware used for professional video production.

Re:Video Production, Hardware, and you (0)

Anonymous Coward | more than 12 years ago | (#3485460)

OpenGL [] has always been about getting work done. SGI kicked butt in the past by taking advantage of OpenGL hardware. Gamers just wanted to take advantage of the high-end hardware. Now high-end hardware is "gaming" hardware.

Not surprised (5, Interesting)

Aram Fingal (576822) | more than 12 years ago | (#3485462)

Apple has always been careful to compare the G4 to the Pentium 4 and not the Athlon. The tests I have seen comparing all three (even by MacAddict) tend to more than validate AMD's claim that the Athlon is faster MHz for MHz than the Pentium.

Apple has tended to fulfill Moore's Law in fits and starts rather than the smooth curve you see with the x86. They pulled well ahead about 3 years ago and then hardly moved until just recently. We'll see how far the current surge takes us.

Speaking of 64-bit processors, I suspect that the more portable UNIX core of Mac OS X will allow Apple to support a 64-bit machine at the consumer level before Windows can.

Re:Not surprised (1)

hublan (197388) | more than 12 years ago | (#3487645)

Speaking of 64-bit processors, I suspect that the more portable UNIX core of Mac OS X will allow Apple to support a 64-bit machine at the consumer level before Windows can.

Really? []

Re:Not surprised (1)

drsmithy (35869) | more than 12 years ago | (#3488734)

The "core" of both OS X and Windows (NT|2k|XP) has been available/ported to multiple platforms previously. What makes you think OS X is any more portable than Windows ?

Where's the BIG picture ... (4, Insightful)

ultraslide (267976) | more than 12 years ago | (#3485816)

Sure enough, even the Mac lovers can agree that for the same cash a PC is gonna be faster than a Mac. Intel and AMD have big incentives to keep those clock speeds as high as possible.
But ... where are the studies about the entire workflow? Just because the machine is faster at grinding through certain processes doesn't mean the same job will get done quicker. What's the time to import/export files? What about saving those big files off to another disk? What about the learning curve for new apps (or OSes, for that matter)? What about downtime for repairs and upgrades? What about end-user training? These all "cost" in the end. I'm not saying that Apple would win this kind of study, but I know from personal experience I do "get more done" on my Mac than on my PC.

Macs ALWAYS cheaper. (1)

BitGeek (19506) | more than 12 years ago | (#3494244)

For the same cash, you ALWAYS get a better Mac than PC. Every time I've done a comparison, if you look at a Dell, Gateway, IBM, or other non-fly-by-night manufacturer, you spend about twice as much as you would for a Mac with comparable specifications (I'm talking about hardware). When you factor in the fact that the Mac is about 4-6 times as fast at speed-intensive things, you find that the Mac is a much better deal on a price/performance scale.

Given that the largest installed base of open source software is on the Mac, and that the non-open-source stuff kicks every other OS out there -- better video than Real, better graphics than any desktop (OpenGL implementation), faster application development, a vastly superior UI, etc. -- I find it shocking that so many Slashdot readers, ostensibly people who support open source, continue to repeat the myths and outright lies spread by the evil empire.

Get a Mac. Run Linux on it if you want, dual boot with darwin and OSX if you want. But get one and see what it is that you're missing.

Only by avoiding actually using one, and staying uninformed about the technology involved, can you continue to hold the worldview you represent here on Slashdot (and get moderated up for... hmmm).

Why is Gosling, Joy, and every other big-name Unix guy I know who's not intimately involved with Linux development going to the Mac? The Titanium PowerBook, and other great hardware.

As I heard Gosling say yesterday "Mac OS X is unix with quality control and taste."


Re:Macs ALWAYS cheaper. (0)

Anonymous Coward | more than 12 years ago | (#3502192)

Clap, clap, clap!

That was beautiful. I've never heard anybody switch to the Mac and complain about it. Nobody complains about Apple's software direction since the Apple NeXT merger. What is the deal with the anti-mac sentiment? My G4 iMac is hands down the fastest and best-looking unix workstation I have ever owned... and I own quite a collection. So what are you gonna do, get a Sun? Run Linux on a franken-box that breaks down all the time? Enjoy... but I'm happy with my Mac, thank you.

It's NOT about processor speed! (0)

Anonymous Coward | more than 12 years ago | (#3486140)

It's NOT about processor speed!

This [] sums it all up nicely. This guy was looking at getting a PC, then decided to get a Mac despite his benchmarks.

Re:It's NOT about processor speed! (0)

Anonymous Coward | more than 12 years ago | (#3486159)

That was the part where he decides. This [] is the part where he talks about why.

Re:It's NOT about processor speed! (0)

Anonymous Coward | more than 12 years ago | (#3486386)

That's the biggest load of pro-Apple propaganda I've ever seen.

Pure bullshit.

Re:It's NOT about processor speed! (1)

drsmithy (35869) | more than 12 years ago | (#3488804)

I have to disagree with him on a couple of points:

The Mac definitely wins where file management is concerned.

I find the Finder incredibly difficult and clunky to work with when complex, deep directory hierarchies are involved. One of the things I miss most from Windows when I'm using OS X (apart from the better responsiveness) is Explorer's directory-tree-plus-file-listing layout.

The Mac nicely eliminates unnecessary clutter when changing foreground applications.

Dunno what he means here. I find the taskbar much nicer to use (and more usable) than the Dock.

The Mac powers up and down in the time it takes Windows to POST.

I find that hard to believe. There's no way my PB667 boots any quicker than any of my PCs (except for the ones with heaps of hard disks and SCSI devices attached). I suspect he's comparing wake-from-sleep time to boot-up time. But, in any event, who cares how long a machine takes to boot when you hardly ever do it?

Faster hardware? not yet. (0)

The Mainframe (573877) | more than 12 years ago | (#3486991)

I don't know about the Mac being slower. You've got to look at the x86 processor vs. the G4 and how they differ in fundamental design. The x86 architecture tries to squeeze out more processor cycles per second. The G4 architecture tries to squeeze out more processing per cycle.
If you're looking to do a large quantity of small things (like playing games, for instance, where you need to process a large number of processor-unintensive tasks), sure, the x86 is going to be better. However, that's not what I'm doing. I want to process fairly CPU-intensive data efficiently, and the G4 does that for me. So I buy a dual-CPU system and eat through full-motion video because I can process at nearly 30 frames per second.
One thing I'll say in Apple's favor: their hardware is good. I buy a Mac, I know I'm going to pick up reliable hardware that will take a beating. True, my UI may not be as snappy, but that's not what matters to me. I want a kick-ass processing behemoth, and that's what I got.
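The trade-off described above reduces to throughput = clock rate x average work per cycle; a toy illustration in Python (the figures are invented for the arithmetic, not measured from real chips):

```python
def throughput_mops(clock_mhz: float, ops_per_cycle: float) -> float:
    """Effective million-ops/sec: clock rate times average work per cycle."""
    return clock_mhz * ops_per_cycle

# Invented figures: a higher-clocked chip doing less per cycle can lose to
# a lower-clocked chip doing more per cycle (e.g. via 4-wide SIMD).
fast_clock = throughput_mops(1600, 1.0)   # 1600 Mops/s
wide_chip  = throughput_mops(800, 2.5)    # 2000 Mops/s

print(fast_clock, wide_chip)
```

Which chip "wins" depends entirely on how much of the workload sustains the higher ops-per-cycle figure.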

Re:Faster hardware? not yet. (1)

twiztidlojik (522383) | more than 12 years ago | (#3487248)

The duals are even better in a cluster. I don't have any details, but I recently read an article about the grandness of Macintosh clusters. Apparently, like the man said, the Macintosh is aimed more at the "lots of processing per cycle" view, not the "lots of cycles" view. This outlook of Motorola's, combined with the built-in gigabit Ethernet (giving a fatter pipe to communicate with the controller), gives the G4 tower some serious clusterability, especially for things like rendering huge 3D scenes at 4096x3072 resolution. Those clusters could probably finish off a SETI@home work unit in about an hour! =D

Re:Faster hardware? not yet. (0)

Anonymous Coward | more than 12 years ago | (#3494368)

You are rationalizing to justify your expenditure. Otherwise you'd be suffering from buyers' remorse.

Has After Effects been Cocoa'ed? (0)

Anonymous Coward | more than 12 years ago | (#3487518)

Or were they running it in Classic mode?
That may take a hit or two...

Does After Effects handle MP machines out of the box?

Unequivalent Compression Codec Comparison (0)

Anonymous Coward | more than 12 years ago | (#3487521)

"Each file was output to its platform's native format with no compression added -- on the Windows platform, they were uncompressed Video for Windows files (.avi) and on the Mac OS X platform, they were rendered to the QuickTime animation codec (.mov)."

This marks a big difference. The Animation codec is NOT uncompressed, and it does add processing time. I would like to have seen them run the tests under equal conditions. They say they also ran it on both platforms rendering to the QT Animation codec, but as presented, it is not an equal comparison.

My personal experience with the Animation codec was that it didn't save much space, but it sure ate a lot of cycles.

Re:Unequivalent Compression Codec Comparison (1)

andrewski (113600) | more than 12 years ago | (#3487682)

Carbon sux.

You can't easily Cocoa-ize an existing application. Writing a proper Cocoa app takes a different mindset than a traditional C/C++ program does. Until we see benchmarks of programs coded NATIVELY in Cocoa, we won't see much that is impressive.

That being said, Carbon is a crutch for developers. Nothing more.

Re:Unequivalent Compression Codec Comparision (2, Interesting)

Decimal Dave (411182) | more than 12 years ago | (#3488154)

This AC is right; the Animation codec (one of the oldest codecs in the QuickTime package) uses compression. It's not even very fast compression - I've found that standard Motion JPEG is faster. If they really wanted a fair comparison they should have used uncompressed video, or possibly just standard DV spatial compression. Accessing uncompressed video isn't very taxing on the CPU (because it's not compressed), but it is very disk-intensive because the files tend to be *huge*.
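The "huge files" point is easy to put numbers on; a back-of-the-envelope sketch (the 720x486 24-bit NTSC frame format is an assumption for illustration):

```python
# Uncompressed 720x486 NTSC video, 24-bit RGB, ~29.97 frames per second.
width, height, bytes_per_pixel, fps = 720, 486, 3, 29.97

frame_bytes = width * height * bytes_per_pixel      # ~1.05 MB per frame
rate_mb_s = frame_bytes * fps / 1e6                 # ~31.5 MB/s sustained

print(f"{frame_bytes / 1e6:.2f} MB/frame, {rate_mb_s:.1f} MB/s, "
      f"{rate_mb_s * 60 / 1e3:.2f} GB per minute of footage")
```

At roughly 31 MB/s sustained, the bottleneck is the disk subsystem, not the CPU, which is exactly the trade-off the parent describes.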

Objective Charlie White (1)

tobyglyn (455427) | more than 12 years ago | (#3488367)

Charlie White published the following after MacWorld NY last year:

"Worse, he (Jobs) engages in downright fraud. Consider the tired old "smoking Pentiums" routine. Funny that when Jobs compares the new G4 with the Pentium, he picks Cleaner, an application that runs significantly faster on a Pentium 4 than a Pentium 3. But lo and behold -- it's a single-processor Pentium 3 that's compared to the mighty G4. What would happen if a dual-processor Xeon 1.7 GHz machine (based on the P4 chip) were tested against the G4? Guess. Another odd occurrence: Where was the AMD Athlon chip, another "Pentium Smoker," in this carnival? I say, next time, Jobs, get a copy of LightWave up there and render a few frames with that G4 against the fastest PC and we'll see who gets smoked.

Charlie White, Senior Producer, Digital Media Net"

Of course, what Charlie didn't say was that he had not watched the Macworld P4 vs. G4 shootout himself, which is why he missed the big P4 signs and the apology from Jobs that, although they knew 1.8 GHz P4 models were shipping, Apple was only able to obtain a 1.7 GHz model for the NY Macworld. He also missed the explanation that both the Photoshop and Cleaner apps (optimised for the P4 and G4 respectively) were running a series of processes common to completing a real-world job, and that the P4 and G4 machines had equivalent RAM and hard drives.
It seems that Charlie White and Digital Media Net never let the facts get in the way of a good headline.

Tis the season.... (1)

marktwain (523893) | more than 12 years ago | (#3488484)

for Trolls.


After Effects doesn't use 2 processors (0)

Anonymous Coward | more than 12 years ago | (#3489133)

I guess none of you are aware that there is currently a bug in After Effects' network rendering libraries, which it relies on even to use a second local processor. This probably affected the results. Look for it to be fixed soon.

What about other software? (0)

Anonymous Coward | more than 12 years ago | (#3490591)

I don't think testing one piece of software really shows the power of the processors. It's just as ignorant as Apple doing the Photoshop bake-offs. A true test would be a battery of software on machines with similar hard disks and video cards. G4s can be fitted with third-party this and that as well. Out of the box isn't always what professionals use anyway.

Mac PPC days are numbered (0)

Anonymous Coward | more than 12 years ago | (#3494359)

Whether it's next year or the year after, Apple will be forced to drop the PPC processors and switch to AMD or Intel processors. When it comes to speed in commodity processors, nobody can catch up with the x86 family. So much money is being poured into x86 R&D and fabrication technology that PPC is doomed to lag behind forever.

That is what Darwin on x86 is all about. When the time is right, Jobs will pull the plug on the PPC Macs. That does not mean Apple will make IBM PC compatibles; they will probably use a proprietary design based on x86 chips. Don't expect to be able to boot Windows on one of the new Macs. Nonetheless, sooner or later, the Mac will be aboard the x86 express.

Re:Mac PPC days are numbered (1)

ReblMonkey (579089) | more than 12 years ago | (#3513307)

You obviously have no idea what you're talking about.
IBM's PowerPC roadmap, which is completely devoid of any reference to the AltiVec acceleration unit, calls for chip architecture to exceed the 2 GHz barrier in the coming year.

While keeping mum on specifics, the company did say that its upcoming chips will be multi-core, meaning that several processor cores can be arranged on a single chip. This technology allows the possibility of four-processor or even eight-processor configurations.

Motorola also has said it plans to exceed the 2 GHz barrier in the coming year and is calling for the same I/O improvements and pipeline upgrades offered by IBM.


Anyway, G4/G5 chips are looking to pick up in MHz very steadily, and I can't wait for the day in the next few months (and with hope it'll be at MWNY) when my dual 1.6 is tearing everything apart. And you can just imagine when you'll be able to buy a 4-processor G5 running at 2+ GHz each.

Carbon vs Cocoa (1)

Bishop923 (109840) | more than 12 years ago | (#3497958)

I'm assuming that since the recent crop of Adobe apps is written to run in both OS 9 and OS X, they use Carbon. Has anyone benchmarked Carbon-based app performance in OS 9 and X versus Cocoa-based apps in X?

I only wonder because I've noticed a little sluggishness in Illustrator 10 under both X AND 9 that I didn't experience with various Cocoa apps (some quite large) under X.

Could the ease of platform transition that Carbon provides have a significant impact on performance?

Re:Carbon vs Cocoa (1)

ReblMonkey (579089) | more than 12 years ago | (#3513247)

Cocoa apps run much more smoothly and much faster on new G4 OS X systems than Carbon apps do. They are really not even directly comparable when you get down to it, but being the owner of a great old B&W G3/450, I can tell you I would jump on anything Cocoa over anything Carbon any day of the week.