
IBM Releases XL compilers for Mac OS X

pudge posted more than 10 years ago | from the faster-you-say dept.

IBM 84

Visigothe writes "IBM released their XL Fortran Compiler and XL C/C++ Compiler for OS X. The compiler is binary compatible with GCC 3.3, and has multiple levels of optimization, creating binaries that are much faster than their GCC-compiled counterparts." No prices are noted, and the planned availability date is January 16.




FP (-1, Flamebait)

Anonymous Coward | more than 10 years ago | (#7977047)

Suck it down you whores.

Oh yeah, the CLiT owns you and your cat.

Re:FP (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#7977083)

To be in the clit, don't you need to be logged in when you first post?

Re:FP (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#7977187)

Doesn't matter. I got first. You didn't. ROTFLMAO


Re:FP (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#7977350)

Only the most amateurish wannabe would get excited about getting first post in a story which took half an hour to get just 6 posts.

A real first poster wouldn't waste his time here.


Benchmarks (4, Interesting)

melquiades (314628) | more than 10 years ago | (#7977049)

Has anybody seen any useful benchmarks of compiler output comparing XL and GCC on PowerPC?

That would be interesting to see.

Re:Benchmarks (4, Informative)

integral-fellow (576851) | more than 10 years ago | (#7978552)

for performance comparisons, see this page: Blainey-SciComp7_compiler_update.pdf

Re:Benchmarks (2, Informative)

York the Mysterious (556824) | more than 10 years ago | (#7980062)

ainey-SciComp7_compiler_update.pdf Here's the correct address. The space in there screwed it up.

Re:Benchmarks (4, Informative)

klui (457783) | more than 10 years ago | (#7980480)

Slashdot mucks up long lines. You need to use a link like this [] . Basically, on a POWER4 system (unknown re: G4/G5), specint2000 shows around a 30% improvement and specfp2000 around a 50% improvement. (Just eyeing the results.)

Re:Benchmarks (1)

integral-fellow (576851) | more than 10 years ago | (#7981063)

Thanks for that! I was trying to find the tags to insert the link, but I couldn't find them. How does one embed the link?
In fact, I could not find any directions for using HTML other than the spartan 'Allowed HTML:' list just below the Comment window. Where are the instructions?

Re:Benchmarks (1)

farnz (625056) | more than 10 years ago | (#7982441)

It's standard HTML tags; find any decent HTML reference, and it should explain how to work the available tags.

To make a link, enter it as something like:
<a href="">Slashdot</a>
which appears as:
Slashdot []
Note that in the href, the URL in quotes needs the http:// bit.

Re:Benchmarks (-1, Flamebait)

Anonymous Coward | more than 10 years ago | (#7981115)

No fair. Platform-optimized compilers have been off limits for benchmarking against PPCs, so GCC was the only fair entry. Oh wait, this one will HELP the PPC scores? FIRE IT UP, BABY! OH YEAH! Let's see XL/PPC Photoshop recompiled compared to a P-II 450. One filter only, just the best PPC performer.

Supports Objective-C! (4, Insightful)

jazuki (70860) | more than 10 years ago | (#7977139)

Very cool! Looks like the C/C++ compiler also has support for Objective-C now. Even if it's in the form of a "technology preview" and probably preliminary.

This means that this could well be usable as a replacement for GCC in developing Cocoa-based apps. It's good to finally have some options. Can't wait to see how well it works!

Supports G4 and G5, but not G3 (3, Interesting)

Laplace (143876) | more than 10 years ago | (#7977246)

It's great that IBM's compiler produces faster code that is compatible with gcc; however, it appears that it won't generate code that runs on G3 machines. This means that if you want to build apps with it, you either need to write code that builds with two compilers or not support any G3 machines.

As a very happy G3 user I will be sad when I'm forced to upgrade.

Re:Supports G4 and G5, but not G3 (1)

Synesthesiatic (679680) | more than 10 years ago | (#7977446)

Isn't that what fat binaries are for?

Re:Supports G4 and G5, but not G3 (2, Informative)

Gogo Dodo (129808) | more than 10 years ago | (#7978043)

If I remember right (I'm not a Mac programmer), fat binaries are for mixing 68K and PPC code, not different PPC code.

Re:Supports G4 and G5, but not G3 (2, Insightful)

Visigothe (3176) | more than 10 years ago | (#7978251)

The original "Fat Binaries" on the Mac were for 68k/PPC versions; however, the idea of the fat binary is not limited by CPU architecture.

You could easily produce a "Fat Binary" that runs on G3/G4-G5/Alpha/68K/SPARC. That said, it would be one big binary before you stripped it =)

Re:Supports G4 and G5, but not G3 (3, Informative)

Anonymous Coward | more than 10 years ago | (#7980746)

NeXT did "Fat Binaries" before Apple did, and they are still possible in OS X's .app bundles. NeXT added Fat Binary support on Black Tuesday [] in 1993. Apple's fat binaries were introduced with the Power Macintosh line in 1994. NeXT's fat binaries could be built to run on 68K, x86, SPARC, and PA-RISC.

Of course, right now the search algorithm isn't designed for a fallback mechanism. The system can consider itself either a "MacOS" or a "MacOSClassic". Both are assumed to be generic PPC code and one doesn't fall back to the other.

Re:Supports G4 and G5, but not G3 (1)

Pius II. (525191) | more than 10 years ago | (#8006419)

> The system can consider itself either a "MacOS" or a "MacOSClassic".

That's not the "fat" part about the fat binaries. Fat binaries are for several architectures in one file.
You build and link a binary for each architecture as normal, then use lipo to merge them.
Darwin 7 is built completely fat, with even the kernel being a fat binary.
Here's the output of lipo -detailed-info on "yes":
Fat header in: yes
fat_magic 0xcafebabe
nfat_arch 2
architecture ppc
offset 4096
size 9516
align 2^12 (4096)
architecture i386
cputype CPU_TYPE_I386
cpusubtype CPU_SUBTYPE_I386_ALL
offset 16384
size 9516
align 2^12 (4096)
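The fields lipo prints above come straight from the fat header at the front of the file. Here is a minimal sketch of parsing that header with Python's struct module, assuming the layout documented in mach-o/fat.h: a big-endian magic and arch count, then one 20-byte record (cputype, cpusubtype, offset, size, log2 align) per architecture. The cputype values (18 = PowerPC, 7 = i386) match the constants in that header; the synthetic bytes built below are an illustration, not read from a real binary.

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # the fat_magic value lipo printed above

def parse_fat_header(data):
    """Parse a Mach-O fat (universal) header: big-endian magic and
    arch count, followed by one 20-byte record per architecture."""
    magic, nfat_arch = struct.unpack_from(">II", data, 0)
    assert magic == FAT_MAGIC, "not a fat binary"
    archs = []
    for i in range(nfat_arch):
        cputype, cpusubtype, offset, size, align = struct.unpack_from(
            ">iiIII", data, 8 + 20 * i)
        archs.append({"cputype": cputype, "offset": offset,
                      "size": size, "align": 1 << align})
    return archs

# Build a synthetic header matching the lipo output above:
# two slices (ppc = cputype 18, i386 = cputype 7), 4096-byte aligned.
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">iiIII", 18, 0, 4096, 9516, 12)   # ppc slice
header += struct.pack(">iiIII", 7, 3, 16384, 9516, 12)   # i386 slice

for arch in parse_fat_header(header):
    print(arch)
```

The kernel's loader does essentially this scan at exec time, picking the slice whose cputype matches the host and jumping to its offset.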

Don't forget x86! (2, Interesting)

MarcQuadra (129430) | more than 10 years ago | (#7982110)

Don't forget x86; there's nothing STOPPING the *.app files from holding code for any architecture. If Apple ever does have to jump ship to x86, I'm sure there'll be a lot of apps distributed with PPC and x86 (and probably x86-64) executables inside them.

One Application Icon to Rule Them All.

Re:Don't forget x86! (1)

ModernGeek (601932) | more than 10 years ago | (#8011271)

Thinking of that, is there a way to have a FAT binary that's cross-platform, like Windows and Mac OS X?

Re:Don't forget x86! (1)

MarcQuadra (129430) | more than 10 years ago | (#8013883)

Not really. The magic is that the application icon in OS X is really just a folder with folders inside it, including what would normally be installed as separate files in Windows. See, in Windows you have a binary .exe file and a whole slew of other files installed all over the system (dlls, jpgs, gifs, inis, etc.). In OS X you have one icon that's a folder with the 'tidbits' in it and separate folders for binaries for different machines. To get this working under Windows you would need a different binary loader, and probably a different binary format, or a very intricate 'wrapper' script. I wouldn't be surprised to see something like this come out with Longhorn though, considering that we will be dealing with several different architectures at that point (x86, x86-64, ia-64) and it would leave 'room to grow' for the new Windows platform.

Re:Supports G4 and G5, but not G3 (1)

Steveftoth (78419) | more than 10 years ago | (#7977709)

I'm fairly sure that it will produce code that will run fine on a G3, especially because the G4 is basically a G3 with Altivec.

These compilers will produce better code than GCC for the PPC chips in Macs, so maybe next year we will have an OS compiled with XLC and not GCC (another speed-up). But I don't know how reliant Apple is on GCC for compiling (extensions and whatnot).

Re:Supports G4 and G5, but not G3 (2, Insightful)

LordAlexander (140694) | more than 10 years ago | (#7978078)

Actually, the G3 is a 603-based CPU and the G4 is a 604-based CPU.

It was rumored a while back that Apple might use IBM G3+Altivec chips in PowerBooks, further confusing the issue.

Re:Supports G4 and G5, but not G3 (0)

Anonymous Coward | more than 10 years ago | (#7978266)

The G4 is a G3-based CPU, with the only major change being the addition of altivec.

Re:Supports G4 and G5, but not G3 (1)

Erect Horsecock (655858) | more than 10 years ago | (#7982065)

No, this is wrong. The G3 was based on the 603, whereas the G4 was based on the 604 with the addition of Altivec.

Re:Supports G4 and G5, but not G3 (2, Informative)

Toraz Chryx (467835) | more than 10 years ago | (#7985941)

Actually, you're both wrong, kinda.
The original G4 is a G3 + Altivec + MERSI + the 604e's FPU hardware.

The modern G4s bear very little resemblance to the 7400, however, having a pipeline three stages longer and a more elaborate Altivec implementation.

Re:Supports G4 and G5, but not G3 (0)

Anonymous Coward | more than 10 years ago | (#7977767)

A bit ironic, considering that IBM used to make the G3 chips in the later G3 iBooks.

Re:Supports G4 and G5, but not G3 (0)

Anonymous Coward | more than 10 years ago | (#7979559)

Not really. XLC comes from AIX, and I don't think AIX ever ran on G3s.

Re:Supports G4 and G5, but not G3 (1)

squiggleslash (241428) | more than 10 years ago | (#7979510)

So presumably it's useless to compile code intended for non-970 based IBM machines too?

It will not RUN! on a G3 but code is ok for G3 (1)

CottonEyedJoe (177704) | more than 10 years ago | (#7998151)

The system requirements say it needs a G4 or G5, but the code it generates will work on a G3.

Fortran? (-1)

scumbucket (680352) | more than 10 years ago | (#7977312)

I want my COBOL compiler for OS X!!

Pricing (5, Informative)

Erect Horsecock (655858) | more than 10 years ago | (#7977356)

According to an Ars thread the XLC compiler will be $499 for a single seat license. WAY below the cost for the AIX versions.

Linkage []

Re:Pricing (1)

topham (32406) | more than 10 years ago | (#7980765)

$500 is a good deal, particularly since you should be able to take a project developed in gcc and recompile it with XLC if the need arises. I'm considering starting work on a particular project; if it works out and I take it far enough, it could be in my financial interest to spend the $500. On the other hand, it could end up like other projects of mine, destined to be filed away and long forgotten... so I'll keep the $500 for now, and spend it later if it warrants.

Current compiler? (2, Insightful)

Robowally (649265) | more than 10 years ago | (#7977459)

What is MacOS X currently compiled with? Is it GCC? If so, the new IBM compiler would presumably speed up the entire OS somewhat if it were recompiled via IBM's compiler?

Re:Current compiler? (5, Informative)

Erect Horsecock (655858) | more than 10 years ago | (#7977605)

What is MacOS X currently compiled with?

I'm not 100% sure, but I seem to remember Jobs saying in the WWDC keynote that it was built with gcc 3.3, the version that ships with Xcode.

If so, the new IBM compiler would presumably speed up the entire OS somewhat if it were recompiled via IBM's compiler?

Yeah, probably. I was surprised that they implemented vector support so quickly. XLC really shines on floating-point code, but I'm really curious to see how well it handles vector code. Even if the whole operating system isn't compiled with xlc, as long as the core libs and things like codecs for QT and other multimedia apps are, the speedup would be impressive.

Re:Current compiler? (3, Informative)

Selecter (677480) | more than 10 years ago | (#7979143)

Yes, Panther was built with what some people are calling GCC 3.3 "+". The OS would speed up quite dramatically *if* Apple were to release G3/G4/G5-specific versions, as the optimizations for the G5, if taken to the highest level, would actually slow down or break things on a G4 or, more likely, a G3.

More likely they will break it down into chip-specific libs and kibbles and bits and have the installer detect and choose. If they code for the middle ground, I fear they will give up the best chance of having huge speed gains. They need those gains on the G5 to keep their momentum going.

Re:Current compiler? (1)

ZackSchil (560462) | more than 10 years ago | (#7989632)

They can't really do that because one of Apple's trademarks is system independence. You can remove your Mac's hard drive and place it in a completely different machine, any machine capable of booting its OS, and have it work perfectly. I'd say that the boot loader, etc. would choose, not the installer.

Re:Current compiler? (1)

gerardrj (207690) | more than 10 years ago | (#8001537)

But there is precedent for the other view. I don't know if you were around in the Quadra/Performa days, but you could install system software for your particular machine, or for all machines.
If you chose "your machine", you could not move the system folder to another machine and have it boot if the other machine were substantially different.
Granted it was a bit of a mess, but it did provide for a very lean system folder, and increased performance.

Re:Current compiler? (4, Informative)

TALlama (462873) | more than 10 years ago | (#7979979)

The only problem being that XLC doesn't support G3 Machines, so it would be impossible to create a binary that runs on any Mac, as is currently supported by GCC. In theory they could make separate binaries and install the 'correct' one, but that poses problems for systems booting off of external hard drives (which binary do you put on the drive?) and booting over the network. The ability to carry around a copy of OSX on your iPod is a powerful thing, and not one most people would give up lightly.

Of course, this could have been gotten around by using Bundles, which is a folder that acts like a double-clickable application. The structure is:
SomeApplication.app <-- The application
Contents/MacOS/SomeApplication <-- The OSX binary
Contents/MacOSClassic/SomeApplication <-- The OS9 binary

It could be:
Contents/MacOS/SomeApplication <-- The generic binary
Contents/MacOS-G3/SomeApplication <-- The G3-optimized binary
Contents/MacOS-G4/SomeApplication <-- The G4-optimized binary
Contents/MacOS-G5/SomeApplication <-- The G5-optimized binary

When you double-click, it uses whatever binary is appropriate for the system. Unfortunately, this doesn't work for Frameworks, which lack the notion of platform.
A rant to beat the lameness filter: this bundle format should be adopted everywhere. It allows a folder to be used as an application, and to contain all the resources it needs to be used and run. Moving the folder moves the application, and the folder doesn't use any voodoo to keep the data together, as pretty much any HD format understands folders and files.
In addition, the multiple-binaries trick (as shown above working with OS9/X and proposed for processors) would allow the same bundle to work on multiple platforms, so I could email you a zipped version of Office from my Mac that could work on your Wintel, no Java required.
The support is in the Finder/Explorer/Browser, which needs to understand that 'double click on bundle' == 'find correct binary and launch it'.
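The launch rule proposed above ('find correct binary and launch it') can be sketched in a few lines. Note this is purely hypothetical: the Contents/MacOS-G4-style subdirectories are the commenter's proposal, not a real OS X convention, and the function names here are invented for illustration.

```python
def pick_binary(bundle, app_name, host_cpu, present):
    """Return the path a launcher would pick from a hypothetical
    per-CPU bundle layout: the CPU-specific binary if the bundle
    ships one, otherwise the generic fallback."""
    candidates = [
        f"{bundle}/Contents/MacOS-{host_cpu}/{app_name}",  # optimized build
        f"{bundle}/Contents/MacOS/{app_name}",             # generic build
    ]
    for path in candidates:
        if path in present:  # stand-in for os.path.exists() on a real disk
            return path
    raise FileNotFoundError(f"no usable binary in {bundle}")

# A bundle shipping a generic binary plus a G5-optimized one:
shipped = {
    "SomeApplication.app/Contents/MacOS/SomeApplication",
    "SomeApplication.app/Contents/MacOS-G5/SomeApplication",
}
print(pick_binary("SomeApplication.app", "SomeApplication", "G5", shipped))
print(pick_binary("SomeApplication.app", "SomeApplication", "G3", shipped))
```

A G5 host gets the optimized slice; a G3 host silently falls back to the generic one, which is exactly the fallback mechanism the thread says the real search algorithm lacks.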

Re:Current compiler? (2, Informative)

Graff (532189) | more than 10 years ago | (#7982326)

Of course, this could have been gotten around by using Bundles, which is a folder that acts like a double-clickable application...When you double-click, it uses whatever binary is appropriate for the system. Unfortunately, this doesn't work for Frameworks, which lack the notion of platform.

Ahh, but that is what fat binaries [] are for. A fat binary allows you to package up several different versions of a program optimized for different runtime architectures.

Macintoshes have been able to use fat binaries for some time, it was one of the things that allowed the fairly seamless transition from 68k processors to PowerPc processors.

Here is more information on optimizing for the G5, if you look towards the bottom there is notes on packaging an app to run on different processors.

Re:Current compiler? (1)

Graff (532189) | more than 10 years ago | (#7983468)

Here is more information on optimizing for the G5, if you look towards the bottom there is notes on packaging an app to run on different processors.

Whups, somehow my link was lost. Here is the link [] to optimizing the G5. And yes, I know it should be PowerPC not PowerPc! :-)

Re:Current compiler? (1)

otuz (85014) | more than 10 years ago | (#7983824)

Sorry, this is not what fat binaries are for.
Fat binaries were an ugly quick&dirty hack that ruined the beautiful, clean design of resource forks. A better way would have been to use, for example, "PPCD" resources for PPC machine code and "CODE" for existing 68k binary code.
The NeXT approach used by Mac OS X is, however, much more scalable. It works really well in OpenStep bundles.

Setting up a karma whore... (2, Interesting)

Troy (3118) | more than 10 years ago | (#7977519)

Pardon my ignorance, I was 31337 for only 37 seconds in 1997.

What do they mean when they say that two compilers are "binary compatible"? Does it mean that XL produces identical machine code? Does it take identical switches so makefiles don't have to be rewritten? Does it simply mean that XL has the same foibles as gcc, so code written to gcc's foibles doesn't need tinkering? Use of the term doesn't quite fit with my current understanding of compilers.


Re:Setting up a karma whore... (1, Informative)

Anonymous Coward | more than 10 years ago | (#7977622)

I would imagine that it means a library compiled by one compiler could be linked by an app compiled with the other.

Re:Setting up a karma whore... (4, Informative)

dr2chase (653338) | more than 10 years ago | (#7977638)

Binary compatible means same data layouts, same parameter-passing conventions, same conventions for shared libraries and position-independent code. However, between those interfaces, the generated code is probably different.

Think of it like nuts and bolts -- a nut and bolt are compatible if they have the same diameter and threads per inch, but they may be made of carbon steel, steel, bronze, nylon, titanium, whatever.
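The "same data layouts" half of that definition can be made concrete: both compilers must place every struct field at the same offset, padding included, or code from one cannot read a struct produced by the other. A minimal sketch using Python's ctypes, which mirrors the platform C ABI; the offsets shown assume a typical ABI with 4-byte int alignment (true on common desktop platforms, but not guaranteed everywhere).

```python
import ctypes

# If two compilers disagree on where "count" lives or how big the
# struct is, their object files cannot safely share this type.
class Sample(ctypes.Structure):
    _fields_ = [("tag", ctypes.c_char),   # 1 byte at offset 0
                ("count", ctypes.c_int)]  # padded up to a 4-byte boundary

print(Sample.tag.offset, Sample.count.offset, ctypes.sizeof(Sample))
```

Binary compatibility is the promise that gcc and xlc answer these layout questions (plus calling conventions and shared-library conventions) identically, even though the instructions between those interfaces differ.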

Re:Setting up a karma whore... (3, Informative)

sartin (238198) | more than 10 years ago | (#7979364)

For C++, binary compatible goes the extra distance of requiring that the dynamic dispatch and run-time type information layouts be the same. When a C++ class gets created, there are hidden bits of data that are used to handle virtual functions and the like. Many binary interface standards don't specify how these work, but to be truly compatible, the two compilers have to match here. See Stroustrup's Technical FAQ [] for a little more information.

Re:Setting up a karma whore... (1)

lordpixel (22352) | more than 10 years ago | (#7977641)

Usually it would mean that if you had a (shared) library that was compiled with XLC, you could use it in a program compiled with gcc, or vice versa. Perhaps there's more to it than that, but that would be the minimum requirement, I think.

The extreme case would be that Apple could compile OS X with XLC ($) but still allow people to run applications written with gcc (free).

Re:Setting up a karma whore... (0)

Anonymous Coward | more than 10 years ago | (#7977647)

My guess is that it is object-file compatible, so you can link in libraries created with GCC with no problems (and maybe some optimizations). No, it doesn't produce the same machine code; that would be almost impossible to do and offer no advantage over gcc (except you get to pay more). It also is command-line compatible so you don't have to alter the switches.

# Binary compatibility and coexistence with gcc V3.3
# GCC command line compatibility

Re:Setting up a karma whore... (2, Informative)

davechen (247143) | more than 10 years ago | (#7977692)

Looking at IBM's announcement [] it looks like they use the same headers and run-time libraries as gcc 3.3. They say you can combine xlc- and gcc-compiled files, i.e., you can link their object files together.

Re:Setting up a karma whore... (4, Informative)

MBCook (132727) | more than 10 years ago | (#7977760)

I believe that it refers to the interfaces between code. What I mean is that, for example, it lists the functions in object files the same way as GCC, and they are called with the same machine code sequence as GCC (the way arguments are put on the stack, etc.). This is good for a few reasons. For one thing, it means that code written with this compiler can link to code written with GCC, or vice versa. Ordinarily you can't take an object file from VC++, one from OpenWatcom, one from GCC, and one from ICC and link them together. But if all the compilers were binary compatible, you would be able to. It has nothing to do with the internal code generated; if both compilers generated identical sequences of machine code, one couldn't be faster than the other. I think the main benefit is that, for example, you could use a static library that was compiled with this compiler with your code that uses GCC without a problem.

As for commandline switches and such, I would assume that they would be the same (or that there would be a simple option like --gcc that would turn on "gcc mode" so that it took the same command line stuff).

PS: If I'm wrong would someone please reply and correct me, and not just mod me wrong?

Re:Setting up a karma whore... (4, Informative)

g_lightyear (695241) | more than 10 years ago | (#7979827)

"Binary" refers, indeed, to the binary compatibility of object files; in GCC terms, when there's an "ABI" change, you have to re-compile all applications, as new stuff compiled with the new Application Binary Interface can't access stuff compiled with the old ABI.

What it REALLY means:

1) You can compile the majority of your application in GCC, and selectively compile in IBM's XLC.

2) You can compile one library in XLC, and link it in to your GCC application.

3) You can compile a library in GCC, and link it in to your XLC application.

Etc. You get the point. Essentially, while the code they generate is very, very different in terms of optimization and performance, they are, in fact, completely interchangeable in terms of the things they produce as output.

XLC is, in fact, a very different beast than GCC. The number of optimizations it provides goes well beyond what GCC currently provides, and does include auto-vectorization and support for OpenMP - things which don't suck on parallel systems.

So XLC is a good thing for commercial software developers, and at minimum, the compatibility of the systems means that we as developers have no excuse not to compile, at bare minimum, the most *important* functions (and if we're doing it this way, it might as well be specific functions) with XLC, and link the parallelized and optimized object file into our existing project.

As for commandline switches... nope. Almost never compatible. No hope. Basic stuff is mildly similar, but the guts you'd use once optimizing are very different.

But at a high level, yes, you just say xlc -O3 instead of gcc -O3, only you might say xlc -O5.

Re:Setting up a karma whore... (1)

Tjp($)pjT (266360) | more than 10 years ago | (#8039841)

And just to underline the point: names are mangled the same way between GCC and XLC, so one linker knows what the string of trash characters (the mangled name of a C++ function, which accounts for overloading) means.

FASTER OS X? (5, Interesting)

zulux (112259) | more than 10 years ago | (#7977538)

What does Apple use to compile OS X - and if IBM gets the Objective-C sections working properly, could Apple use the IBM compiler to make OS X run faster?


MBCook (132727) | more than 10 years ago | (#7977809)

I would assume that they already use IBM's compilers, since IBM designed the chips and would have the fastest/most mature compilers. What other option is there? GCC? I thought that GCC didn't produce very optimized code for the G3/4/5 (optimized well by using AltiVec, etc.).

Just like I'd assume that Microsoft would use Intel's compilers if a) Microsoft didn't make compilers and b) AMD (and other clones) weren't around.

So I'm just speculating. Anyone know the real answer?

Re:FASTER OS X? (3, Interesting)

addaon (41825) | more than 10 years ago | (#7978079)

OS X and all Apple software for OS X that I know of (maybe not including some high-end stuff, I just don't know for sure) is indeed built with GCC.

Porting an operating system to a different compiler is a pain in the neck, and most OSes use compiler-specific tricks to deal with low-level details. Also, most of Apple's higher-level software is in Objective-C, and as of now only GCC really supports Obj-C well on the Mac.


zsazsa (141679) | more than 10 years ago | (#7978249)

I've always heard that Apple uses GCC to compile OS X. Apple has always touted the GCC name on its website, and I remember hearing that a lot of the speed increases in recent releases are due to new optimizations in GCC. Also, the much maligned/celebrated SPEC benchmarks Apple touted for the G5 were obtained with GCC builds. Seeing that Objective C is only in "technical preview" for the IBM XL compilers, you can be certain that it wasn't used for any current version of OSX.

Re:FASTER OS X? (3, Funny)

DAldredge (2353) | more than 10 years ago | (#7978559)

Have you thought about applying for a /. editors job?

Re:FASTER OS X? (3, Interesting)

Visigothe (3176) | more than 10 years ago | (#7978352)

Apple currently uses GCC to build Panther. As XL is much faster in 95% of the situations, I would imagine that Apple would transition to XL once the Obj-C portions of the compiler were a bit more mature. [The public beta of XLC didn't have any Obj-C support]

Re:FASTER OS X? (5, Interesting)

rmlane (589573) | more than 10 years ago | (#7980180)

As mentioned by others, the majority of OS X is compiled by GCC.

The exception is QuickTime, which uses (and has used since well before OS X) an older, custom version of the IBM compilers. I believe, but am not 100% sure, that QuickTime has always used the IBM compilers on PowerPC CPUs.

This is very good news for Apple's science users; one of the real problems in pushing Mac boxes into some markets has been the lack of a really good Fortran compiler. The performance boost for C/C++ code will also be appreciated.

As for a wholesale transition of OS X to the IBM compilers: next to no chance. QA of the transition would take far too long and absorb resources that could be better used on other improvements. It would also cause problems with the Open Source versions of Darwin, so expect the vast majority of OS X to remain GCC compiled.

That being said, I would expect that certain chunks will be transitioned where it makes sense. The output of the IBM compilers is binary compatible with GCC, so you can recompile (and re-QA) chunks of the OS where you'll get a major improvement.

Quartz Extreme, CoreFoundation and AppKit spring to mind, but don't expect this to happen in 10.3 or 10.4, more like 10.5 or 11.0.

Re:FASTER OS X? (3, Informative)

g_lightyear (695241) | more than 10 years ago | (#7983734)

As there's work going into XCode to ensure that any project can specify which compiler it uses on a target-by-target basis, I fully expect we'll see several core projects in the Darwin codebase switch over to using both compilers (where XLC will be used to compile specific branches) in 10.4.

OpenMP, at app-level, is pretty much guaranteed to get some use, and Apple will very likely spend some time in the vec/math libs fully OpenMPing that code to get parallel use of both CPUs.

CoreGraphics would probably get some small, critical sections built with it, but it's much more difficult to figure out how to get good benefit out of OpenMP code in those circumstances.

Anything else introduces problems; OpenMP will spawn off threads to do work, and if those threads start accessing code that isn't actually MT-safe and/or fully reentrant at that level, it's going to get zero benefit.

Re:FASTER OS X? (2, Interesting)

gerardrj (207690) | more than 10 years ago | (#8001446)

You mean it would consume more resources than a complete re-write of the OS from Pascal to C as happened with System 6 to System 7 transition?

I expect that once IBM completes the ObjC compiler that most of the OS will be migrated to that compiler, as will much of the high performance commercial software out there. The developer tools will still come with the free GCC compiler, and Apple will still maintain it, but without changes to the core of GCC (which are being resisted by the maintainers), it will never make the G5 shine.

I have yet to see pricing from IBM on these compilers, so I don't know yet how viable they will be for smaller developers. It might be that developers offer two versions of their software: standard for the regular cost, and the 15%-40% better performance version from the IBM compiler for like 20% higher cost, just so they can recoup their added costs.


lethe1001 (606836) | more than 10 years ago | (#8004914)

What is the status of GCC optimizations for the Macintosh platform? Why is the GCC team resisting? And can't Apple just fork GCC if they need to make heavy modifications for the G5?

Re:FASTER OS X? (2, Informative)

gerardrj (207690) | more than 10 years ago | (#8005806)

Apple has contributed some very major improvements to the gcc code base. I don't recall exactly, but I think Apple managed to improve compile times by something like 25%, and code performance by something like 15%. I don't remember where I read those numbers, or how close I am to what I read.

The reason (again, from what I've read) that the gcc maintainers are resisting the changes that Apple would really like to make for the G5 is that the changes would fundamentally break many if not most of the other platforms' compilers. I admit I don't understand compiler optimization, but apparently the logic that gcc uses for modeling the processor just can't keep up with the number of execution units, registers, and/or the number of instructions in flight. I think the way the G5 "groups" instructions into bundles of 5 for scheduling also throws things off a bit. I've read a few discussions saying it's time for GCC to go through some sort of fundamental change that would allow for vastly different architectures to be supported via plug-ins or modules or something.

Apple could indeed fork gcc, but given their commitment to "giving back" to the open source community, that's probably not going to happen. Hopefully someone more "in the know" can either confirm or correct some of this. I personally think also that forking something as fundamental as the compiler that's used for almost all open-source software would be a "bad thing".


CreateWindowEx (630955) | more than 10 years ago | (#7995377)

I would guess that only a relatively tiny portion of OS X would benefit from things like autovectorization--probably most of the code is passing messages around, blocking on semaphores, writing out display lists to the video card, managing queues, etc, etc. The code that tends to benefit from vector optimizations are the really tight inner loops of 2D graphics routines and audio/video codecs; these are small enough that it may be easier to just vectorize them by hand or with GCC's intrinsics and probably beat the IBM compiler vectorizing "naive" code; and most G4s and above will do most of the pixel-pushing in graphics hardware, anyways. I haven't done much with altivec, but my experience with vector units on a certain other platform has been that if the code wasn't written with vector operations in mind, you can be *slower* overall by vectorizing it, because of the extra overhead of packing things onto 16-byte boundaries, moving between FPU and VU regs and so forth.

It's rare for most other code to be limited by floating-point performance; usually the CPU will spend most of its time blocking on pipeline stalls and cache misses, not on the ALU. Throw in the risk of having to manage and support separate G3/G4/G5 versions of operating system components, plus finding and fixing all the problems encountered when switching compilers, and it doesn't look like a very good plan...

Autovectorisation? (4, Interesting)

Jesrad (716567) | more than 10 years ago | (#7978174)

Weren't these compilers supposed to bring automatic conversion of multiple 32-bit arithmetic operations into AltiVec-accelerated code?

Re:Autovectorisation? (0)

Anonymous Coward | more than 10 years ago | (#7978340)

They're working on it, but it's not done yet. IBM thinks it'll be a little while, iirc. I've also heard the GCC people are working on it.

Re:Autovectorisation? (1)

Paradox (13555) | more than 10 years ago | (#7979168)

Actually, reading around, it seems like a limited version of this will indeed be available. Now, will it be as good as we'd like?

Maybe, maybe not. :) I'd like to see a really aggressive auto-vectorization scheme that goes for every possible chance to parallelize code. Since the PowerPC spec calls for so dang many registers, it seems like it'd be much easier (and provide more benefits) to store up several ops in registers and then chain them.

Integer ops are also subject to this. :)

Fortran Motives (3, Interesting)

fm6 (162816) | more than 10 years ago | (#7979282)

From what I've heard, software companies hate selling Fortran compilers. You'll notice that Microsoft no longer has one. Not enough people use the language to make it worth the development and support costs.

So why are you still able to buy Fortran compilers? Because the people who use the language tend to be engineers (the physical kind) and scientists, and thus spend a lot of money on high-end computers. No Fortran compiler, no fat contracts for your Starfire [] and Origin [] boxes. Which is why Sun and SGI both sell Fortran. And who's the leading vendor of Fortran for the Itanium? Good guess [] .

So is IBM trying to help Apple sell more Macs? Probably not. They'd make a little money from the extra CPU sales, but not enough to justify something like this. More likely they have this compiler to help them sell more high-performance PPC systems [] . And as long as they have it, it's not that much extra effort to port it to the Mac.

Re:Fortran Motives (2, Interesting)

jabberjaw (683624) | more than 10 years ago | (#7979607)

Yes indeed, Fortran is still alive and kicking [] . Although I have heard that some of the physics libraries are being converted to C/C++.
As an aside, has anyone else noticed the lack of Fortran texts in brick & mortar bookstores? I know Numerical Recipes in Fortran [] is online; anyone care to mention a good intro text for a "n00b" like me?

Re:Fortran Motives (2, Interesting)

fm6 (162816) | more than 10 years ago | (#7979695)

I used to work with a guy who had just finished a Physics PhD. He hated Fortran, and had insisted on using C++ all through school. But he admitted to being the only person in his program who did so. Proof that programming languages are as much about the community they serve as about the technology they encompass.

Re:Fortran Motives (0)

Anonymous Coward | more than 10 years ago | (#7980617)

The book I had for my college intro to numerical Fortran class was Numerical Methods and Software by Kahaner, Moler, and Nash. I don't know if it's still in print these days; it came with a 5.25" disk in the back at the time :)

I would also highly recommend not ever reading the Numerical Recipes series of books. Many of the routines they present are poor quality, both in terms of accuracy and speed.

Re:Fortran Motives (1)

squant0 (553256) | more than 10 years ago | (#7981112)

Try "Classical Fortran: Programming for Engineering and Scientific Applications" by Michael Kupferschmid, ISBN 0824708024, around $70.

It's a well-written book that assumes the reader knows nothing of Fortran. He teaches Fortran at RPI [] .

Fortran 95, not Fortran 77! (1)

wispoftow (653759) | more than 10 years ago | (#7985464)

Nooooo!!!! Not Fortran 77! We need to be moving people AWAY from Fortran 77 and toward the more modern Fortran 90/95. A while ago on Slashdot, there was a piece about what was killing Fortran. Indeed, in my opinion, g77 is killing Fortran, because the only free Fortran compiler is (was) based on Fortran 77. Please, everyone, look at Fortran 95. Intel makes an awesome Fortran 77/90/95 compiler (ifc), which is free for non-commercial use under Linux and is link-compatible with gcc. I have been doing scientific programming for ten years now, and I have yet to come across a problem where I had to write a piece of C or C++ code! Fortran is for scientific computing; C (and its ilk) are for systems programming and GUIs and stuff. What allowed C to gain any ground whatsoever was the fact that Fortran 77 lacked dynamic memory allocation and had a rigid code layout. Not anymore, baby. Look at Fortran 95. It beats the pants off of C and C++ for scientific and mathematical computing!
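For anyone who last saw Fortran 77, a tiny sketch of the two features in question, dynamic allocation and free-form layout (should build with any Fortran 90/95 compiler such as ifc or xlf; the program name is made up):

```fortran
! Free-form source: no column-6 continuation, no column-72 limit.
program f95_demo
  implicit none
  real, allocatable :: x(:)   ! size decided at run time
  integer :: n

  n = 1000
  allocate(x(n))              ! dynamic allocation, not possible in standard F77
  x = 1.0                     ! whole-array assignment
  print *, sum(x)             ! intrinsic array reduction
  deallocate(x)
end program f95_demo
```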

Re:Fortran Motives (1)

valkraider (611225) | more than 10 years ago | (#7987321)

No lack of texts in Portland [] . These are all on the shelf at Powell's Technical Books at like NW 8th (maybe 9th) and Burnside. Or you can buy them online too, if you don't live in Stumptown...

Re:Fortran Motives (1, Interesting)

Anonymous Coward | more than 10 years ago | (#7982586)

Fortran is part of the SPEC benchmark, so every vendor will continue to produce Fortran compilers and write them off as marketing cost.

objective-c support (2, Interesting)

sethdelackner (110929) | more than 10 years ago | (#7981080)

So has anyone got better pointers on the state of their Objective-C support? I know they say it's there as a technical preview with no guarantees until they finish, but does it basically work and is just slow, or is it unable to compile even modestly complex stuff?

Re:objective-c support (0)

Anonymous Coward | more than 10 years ago | (#7981507)

Skimming the IBM press release, I see "Objective-C" but not "Objective-C++", which is Apple's extension for letting C++ and Objective-C coexist in the same file.

Did IBM really make a pure objective-c compiler? Who would write anything big enough (we're talking Cocoa GUI apps here) that the extra speed would justify the cost & nuisance of XLC if they were limited to that?

Objective-C is not a bad language and I actually like using it, but adding the power & syntax extensions of C++ makes it HUGELY more practical.

We use mostly CodeWarrior for various reasons, one of which is the extreme speed & code quality compared to GCC. XLC+XCode is in the same price range: is it worth it?

Re:objective-c support (1)

g_lightyear (695241) | more than 10 years ago | (#7984740)

XLC's code generation makes it worthwhile for two features. First, auto-vectorisation: it will attempt to discover vectorizable code and use AltiVec to process it when/where possible. Second, OpenMP support, which lets you tell the compiler that a given section of code *can* be run in parallel, and how to divide the task among CPUs; the compiler takes care of setting up threads on the CPUs and doing the actual work in parallel, while your code stays nice and tidy. The long answer is yes. The short answer is yes. :) XCode as an IDE is a long way from perfect, but the compilers are rock-solid dependable, and XLC's a great code generator.

0 is more than 500 (1)

rixstep (611236) | more than 10 years ago | (#7981762)

Big Blue might find it hard to compete with the free ADC tools, no matter the quality of their XLC.

Re:0 is more than 500 (2, Informative)

bash_jeremy (703211) | more than 10 years ago | (#7982604)

I don't think IBM is trying to compete with the IDE. IBM made a somewhat superior compiler to gcc (which is what Apple currently uses to compile OS X and all their apps).

How does this affect the built-in GCC? (1)

Sonic McTails (700139) | more than 10 years ago | (#7993160)

Does this symlink itself over GCC, or does it add itself to gcc_select so you could do sudo gcc_select xl and get the new compiler? I know that, using variables, you can get Project Builder/XCode to use a different compiler or send different flags. I think there is now a bit of competition in PPC compilers; maybe GCC will get faster.

Also available for Linux PPC32/64! (1, Informative)

Anonymous Coward | more than 10 years ago | (#7994067)

Just to note (this is Slashdot, after all): these compilers have also been released for Linux on PowerPC! And there, they support both 32-bit and 64-bit ABIs. On OS X, you're limited to the 32-bit ABI.

xlc's binaries run on G3s (0)

Anonymous Coward | more than 10 years ago | (#8012116)

xlc's and xlf's binaries run fine on G3s (depending on the settings).

Recompiling OSX Core (0)

Anonymous Coward | more than 9 years ago | (#8043631)

Darwin is the lower level of OS X; it's open source and freely available to recompile and play with.

A recompiled Darwin can then be inserted back into your copy of OS X. OK, it's a little more complicated replacing the various files (kernel, libs, etc.), but the principle holds.

This all works fine using GCC. Could someone a little more clued up than me try this with the IBM compilers and report on progress?

If you manage this you will be a god to the Mac community: imagine speeding up everyone's copy of OS X. Lots of good karma.