
Porting Open Source to Minor Platforms is Harmful

samzenpus posted more than 9 years ago | from the do-one-thing-at-a-time dept.

Operating Systems 709

Tamerlan writes "Ulrich Drepper posted a blog entry titled "Dictatorship of Minorities". He argues that open source projects' attempts to support non-mainstream (read "non-Linux") operating systems slow down development and testing. While Ulrich may be biased (he is a Red Hat employee), he has a point: if you have ever read the mailing list of any large open source project, you know that a significant share of the traffic is about platform-specific bugs or a new release being broken on some exotic platform."

Adverse Effect For Me (3, Interesting)

geomon (78680) | more than 9 years ago | (#12680165)

But I agree with the opinion if, as a developer and/or integrator, you are attempting to reach a mass market. Red Hat, and then SuSE and others, abandoned the Sparc platform a while ago. I bought an old Ultra 1 and am now limited to a few distros to keep my Sparc machine up to date.

But it was my understanding that the idea behind open source was to roll your own if your machine was not covered. If I want to keep the Sparc chugging along, I guess I'd better learn to port it myself.

Re:Adverse Effect For Me (0, Offtopic)

andreyw (798182) | more than 9 years ago | (#12680220)

Any reason why you're not running Solaris 10 on the Ultra?

Re:Adverse Effect For Me (1)

0racle (667029) | more than 9 years ago | (#12680243)

Could it be because Solaris 10 won't run on an Ultra 1?

Re:Adverse Effect For Me (1)

darkjedi521 (744526) | more than 9 years ago | (#12680247)

You need at least a U2 to run Solaris 10. U1 isn't supported.

Re:Adverse Effect For Me (1)

raddan (519638) | more than 9 years ago | (#12680393)

Why not try a BSD? If you already use a UNIX-like system, making the switch won't be hard. OpenBSD has great Sparc support, mostly because some key developers love the platform.

Re:Adverse Effect For Me (1)

geomon (78680) | more than 9 years ago | (#12680428)

OpenBSD has great Sparc support, mostly because some key developers love the platform.

I picked up a copy from their ftp site last week. I will be installing it this week.

Have you run OpenBSD on Sparc?

Any snags to watch out for?

Of course (2, Insightful)

LBArrettAnderson (655246) | more than 9 years ago | (#12680168)

Of course it is... And by that logic, developing software at all is harmful - takes time, money, and all the same stuff it takes to port it.

http://imcommunity.net/cgi-bin/u.cgi?u=38 [imcommunity.net]

Re:Of course (1)

Liquidrage (640463) | more than 9 years ago | (#12680294)

I don't think logic leads to your statement.

You might not agree with the blog entry, but it certainly doesn't posit logic that would lead to your statement. Please let me know if you need a further explanation. But I'm pretty sure that after you've taken a moment of reflection you'll understand why your initial response was not accurate.

Let's extrapolate, shall we? (-1, Redundant)

chriswaclawik (859112) | more than 9 years ago | (#12680177)

If we take this to its logical conclusion, porting for ANY platform takes longer than not creating any software at all. So let's just give up now.

Zing. I am UNSTOPPABLE.

Re:Let's extrapolate, shall we? (1)

superpulpsicle (533373) | more than 9 years ago | (#12680212)

In the mid '90s Red Hat was a minority platform, whereas Slackware was the more mainstream Linux. That was until $$$ changed things up.

Question (2, Insightful)

cyberfunk2 (656339) | more than 9 years ago | (#12680188)

Are they referring to Mac OS here? I highly value the open source ports made to Mac OS X, such as Firefox.

Furthermore, at least on OS X, the Fink project makes many programs buildable on OS X, but puts the maintenance onus mostly on the Fink people, not the original authors. Of course this can have its own problems.

Re:Question (2, Insightful)

danielk1982 (868580) | more than 9 years ago | (#12680228)


Are they referring to Mac OS here?


Mac OS, Windows, Solaris, BSD: pretty much anything that isn't Linux, or more specifically Red Hat Linux.


I highly value the open source ports made to Mac OS X, such as Firefox.


As do I, considering I am obliged to work under Windows 2000 at work.

Bottom line is that he's right. Ports do cost time and effort and probably slow development, but they exist because there is a need. People want to run OpenOffice on Windows, Firefox on Linux and Apache on FreeBSD.

Re:Question (2, Informative)

LnxAddct (679316) | more than 9 years ago | (#12680292)

I don't think they are talking about the Mac. Hell, the next release of Fedora on June 6th will be the first to support the Mac (most importantly the Mac mini). I think this is referring to things like: why should SuSE or Fedora run on ARM processors? If these distros were targeting them specifically it'd be different, but Fedora and SuSE target mainly x86 and x86_64 as far as I know. Why make programmers focus attention on platforms that will rarely be used?

I myself am a programmer and find that after the top 3 or 4 targeted platforms, your efforts on other platforms get stretched to the point of diminishing returns. It eventually cuts into your regular work, and I think it really does affect code quality. I look at it kind of like code optimization: 10% of a program in many cases will be used 90% of the time, and supporting platforms with little use or popularity usually winds up consuming a large percentage of your time in testing and debugging.

Please note that I'm not saying platform diversity is bad; it is indeed a good thing and very important, and that is why it's nice that some open source projects such as NetBSD target every platform under the sun. For best code quality, though, it's *usually* best to stick to as few platforms as possible. (Note: NetBSD does a surprisingly good job of keeping acceptable code quality while retaining support for many platforms.)
Regards,
Steve

Java? (3, Funny)

Lingur (881943) | more than 9 years ago | (#12680200)

Wasn't Java supposed to solve this problem? I was under the impression that you could run Java apps on any platform (albeit slowly) without worrying about compatibility.

Re:Java? (5, Funny)

Anonymous Coward | more than 9 years ago | (#12680320)

Yeah - Sorry about that, our bad.

-- Sun Microsystems

Re:Java? (5, Funny)

Anonymous Coward | more than 9 years ago | (#12680360)

Obligatory Zawinski paraphrase:
Some people, when confronted with a problem, think "I know, I'll use Java." Now they have two problems.

Debian (2, Insightful)

3770 (560838) | more than 9 years ago | (#12680201)


I'm a fan of Debian, but I think that Debian's effort to support the myriad of architectures out there is hurting it.

It does a great service to the rest of the Linux community though, because it helps keep things portable.

But having a requirement that something work on a large number of platforms slows down the release cycle.

The OpenBSD project doesn't seem to agree. (5, Insightful)

EbNo (174259) | more than 9 years ago | (#12680203)

There are many instances where OpenBSD developers have indicated that a bug found in one port led to the discovery of problems that affected several other platforms. It seems in this case that multiplatform support is beneficial, and the larger the number of platforms, the greater the likelihood that such bugs will be found and fixed.

Re:The OpenBSD project doesn't seem to agree. (0)

Anonymous Coward | more than 9 years ago | (#12680279)

Amen. Ulrich Drepper is insane, for a number of reasons. But this article really highlights it. "Dictatorship of the Minorities" my ass. Just because Debian can't get its shit together...

Re:The OpenBSD project doesn't seem to agree. (4, Insightful)

quelrods (521005) | more than 9 years ago | (#12680310)

I was about to make a similar remark, but thank you for stating it! While some code may "work" in one place, that by no means makes it bug free. There are many instances of bad code working by sheer luck and only under a specific arch/platform. By ensuring code works under multiple architectures you will help eradicate bugs that may be exploitable. For example, when a program seg faults repeatedly under OpenBSD I know that the program in question is not managing memory correctly. (OpenBSD with its memory protection refuses to allow reads/writes to illegal addresses that on other platforms could have resulted in exploitable holes.) While I have written many a fix for such programs, it is nice to easily identify which programs/developers have a clue and which do not.
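
[Editor's illustration] To make the point concrete, here is a minimal, hypothetical C sketch (not taken from any project in this thread) of code that "works by luck" on a forgiving allocator but is still broken, and that a stricter platform such as OpenBSD tends to expose immediately:

    /* The loop below writes one byte past the end of buf.  On many systems
     * the extra byte lands in allocator slack and nothing visibly fails;
     * under a stricter malloc (e.g. OpenBSD's, with guard pages) the same
     * code can crash at once, exposing the real bug. */
    #include <stdlib.h>
    #include <string.h>
    #include <stdio.h>

    int main(void)
    {
        const char *msg = "hello";
        size_t len = strlen(msg);      /* 5 */
        char *buf = malloc(len);       /* bug: no room for the '\0' */
        if (buf == NULL)
            return 1;
        memcpy(buf, msg, len + 1);     /* writes 6 bytes into a 5-byte block */
        printf("%s\n", buf);
        free(buf);
        return 0;
    }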

Porting helps rid software of bugs. (5, Insightful)

John Harrison (223649) | more than 9 years ago | (#12680420)

As you say, there are examples where porting has helped a project. I know that porting one of my games to four platforms (Classic Mac OS -> Windows -> Linux -> OS X) has helped eliminate bugs that I never knew were there. Also, I learned things that have made my later projects easier to port, since I am better able to write them "correctly" to begin with. By avoiding platform-specific libraries and techniques I write better code.

By that logic (1, Insightful)

Anonymous Coward | more than 9 years ago | (#12680208)

Software developers should only develop for Windows, because supporting Linux/Mac/BSD is diverting resources away from the (vast) majority of users.

PS: These graphic word things are nearly unreadable!

Re:By that logic (0)

Anonymous Coward | more than 9 years ago | (#12680367)

PPS: What the hell are you talking about? I post regularly and haven't seen one yet.

Sounds like how Taco feels about the blind! (-1, Troll)

Anonymous Coward | more than 9 years ago | (#12680211)

The captcha will keep them out unless they have help from someone that can see. I agree with the decision to keep the blind out. They're useless people, and Taco is right to exclude them from this site.

They have one thing going for them... (0)

Anonymous Coward | more than 9 years ago | (#12680396)

...they are immune to goatse, tubgirl, and lemon party.

To the moderators (-1, Offtopic)

Anonymous Coward | more than 9 years ago | (#12680433)

How would you feel if your favorite web site suddenly decided that you were no longer good enough to participate because you have a disability? That's what Taco has done. Think about that before marking a post as a troll. Complaining about heavy-handed hate tactics is not trolling.

OpenOffice is a Gateway Drug... (4, Insightful)

Chordonblue (585047) | more than 9 years ago | (#12680217)

I'm not sure I totally agree with this article - at least as far as Windows porting is concerned. Programs like OOo are gaining acceptance in the Windows world and that foothold has led my own organization to 'embrace and extend' that success. For instance, for the first time we will be purchasing Apples - running NeoOffice of course - and we already have a few Linux terminals here for public use.

I like to think of OSS/GPL stuff as a 'gateway drug' - to use an analogy. Using it may not automatically make people go to Linux, but it certainly makes it an increasing possibility.

Re:OpenOffice is a Gateway Drug... (3, Insightful)

quelrods (521005) | more than 9 years ago | (#12680329)

I completely agree that OSS propagates through the gateway drug phenomenon. Originally everyone tried the RMS 100% free approach, but that led to no acceptance outside of the geeks. As programs like Firefox, Gaim, OpenOffice, etc. become popular on Windows, we erode the closed-source base until familiarity with OSS apps makes the switch of the underlying OS trivial and unnoticeable.

Re:OpenOffice is a Gateway Drug... (1)

digidave (259925) | more than 9 years ago | (#12680352)

Not exactly a "minor" platform there now, is it?

Re:OpenOffice is a Gateway Drug... (1)

Metzli (184903) | more than 9 years ago | (#12680440)

I fully agree. I've convinced a number of people at the office (Windows server admins) to try Firefox and, as they've discovered that FOSS isn't an evil, some have started tinkering with Linux on spare desktops. I understand why people don't want to support every architecture, but it still seems like a good idea to support more than one or two.

Personally, I also find it extremely handy that my tools at home can be used in my Windows-only workplace. It's immeasurably helpful that I can run Firefox, OpenOffice, Ethereal, TCPDump/WinDump, etc. on my varied home boxes (Solaris, OS X, Linux, *BSD, etc.) and my work XP box.

Both sides are right, I think. (3, Insightful)

Dancin_Santa (265275) | more than 9 years ago | (#12680219)

First off, everyone will complain about this. The thinking goes that Open Source == Freedom, so the more choice in platforms you have, the more Freedom you have and thusly you actually help Open Source more.

I think this is incorrect.

First off, Open Source, despite its close engagement with Freedom, ought also to stand for what is best in the Software Engineering world. This means clean, lightweight, portable code. For better or worse, there is a standard: POSIX. Linux, to the extent that it uses the GNU system, is basically POSIX-compliant. Open Source projects ought to target POSIX and keep themselves free of proprietary entanglements.

This can be achieved by focusing efforts on programming for Linux, the premier Open Source operating system. Only by keeping the code clean can a project be easily ported, but a project that isn't even near completion ought not be ported at all. Such non-mainline work results in incompatibilities and divergences from the main trunk of code that cannot be easily fixed down the road.

A very good example is the Symbian/Nokia gcc compiler which has many special extensions and cannot be used to compile for any other targets or operating systems. Well, they are doing away with their special version of the compiler and finally going back to the main line gcc tree. Unfortunately, all that work to specialize gcc for their platform is tossed out the window now. Work to no avail, essentially.

The key here is not to focus on Linux specifically. Rather, it is to focus on a standard and program to that. Since Linux is one of the best of the standard bearers, it makes sense to complete programming there first rather than start porting to esoteric platforms right away.
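
[Editor's illustration] As a hypothetical sketch of what "program to the standard" looks like in practice, here is a tiny filter that uses only POSIX interfaces and therefore builds unchanged on Linux, the BSDs, Solaris, AIX, or OS X (written for this discussion, not taken from any project mentioned here):

    /* Copies stdin to stdout using only POSIX read/write from <unistd.h>. */
    #include <unistd.h>

    int main(void)
    {
        char buf[4096];
        ssize_t n;

        while ((n = read(STDIN_FILENO, buf, sizeof buf)) > 0) {
            ssize_t off = 0;
            while (off < n) {
                ssize_t w = write(STDOUT_FILENO, buf + off, n - off);
                if (w < 0)
                    return 1;          /* give up on write error */
                off += w;
            }
        }
        return n < 0 ? 1 : 0;
    }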

NetBSD? (4, Interesting)

Bananatree3 (872975) | more than 9 years ago | (#12680224)

The porting of NetBSD to over 40 different platforms has brought a lot of good development to many platforms that would otherwise have been dominated by a single operating system. A good instance is the handheld device platforms (HPC, Palm PC, etc.), which would otherwise be dominated by Windows CE (except for the few Linux handhelds, but the majority are WinCE). NetBSD on the majority of its platforms has reached great maturity, and it is still developing well.

Re:NetBSD? (0)

Anonymous Coward | more than 9 years ago | (#12680291)

most of those ports are barely functional...

Re:NetBSD? (1)

smitty_one_each (243267) | more than 9 years ago | (#12680345)

What's with the anti-functional bias? If a platform has emacs, does it really need anything else?

Try a VM (1, Insightful)

alext (29323) | more than 9 years ago | (#12680227)

A properly engineered solution to this problem was developed some time ago: a VM.

There are 1000s of Java projects on SourceForge that will never have this problem.

yay! (1, Insightful)

Anonymous Coward | more than 9 years ago | (#12680269)

Let's all get things done a hundred times slower!

Progress!

Java as fast as C++ (1)

Dancin_Santa (265275) | more than 9 years ago | (#12680281)

A hundred times slower than what? Assembly?

This was in 1998 [javaworld.com]

Please update your trolls.

JavaWorld Proclaims: Java as fast as C++ (0)

Anonymous Coward | more than 9 years ago | (#12680311)

Yay unbiased sources!

In other news, Ben and Jerry report that ice cream is just as healthy as broccoli.

IBM Proclaims: Java as fast as C++ (1)

Dancin_Santa (265275) | more than 9 years ago | (#12680344)

As does IBM [ibm.com]

the results show that Java server performance is competitive with legacy environments

Java servlets outperform CGI, and the Java platform is competitive with C or C++

Jvm optimizations such as efficient monitor and object allocation implementations have led to significant performance improvements, including better scalability that is vital for server workloads

How much money has IBM put into Java? (0)

Anonymous Coward | more than 9 years ago | (#12680376)

Can you find a source that doesn't have something to gain by proclaiming Java to be as fast as C++?

Sure. (1)

Dancin_Santa (265275) | more than 9 years ago | (#12680401)

Academics looking at the question [idiom.com]

Java is now nearly equal to (or faster than) C++ on low-level and numeric benchmarks. This should not be surprising: Java is a compiled language (albeit JIT compiled).

Re:IBM Proclaims: Java as fast as C++ (0)

Anonymous Coward | more than 9 years ago | (#12680411)

The problem with CGI is architectural. AppleSoft BASIC would be faster than CGI if it was integrated into the web server properly. For fun, imagine a CGI that invokes a Java class file.

Re:JavaWorld Proclaims: Java as fast as C++ (0)

Anonymous Coward | more than 9 years ago | (#12680356)

While Java is in many ways not quite there compared to C in every instance, for many uses it is just as fast. Java must run on the same computers and use many of the same algorithms everyone else must use. While its memory footprint might be larger, there are many supporting benchmarks that show it performs well compared to C++ in ordinary applications.

Re:yay! (1)

name773 (696972) | more than 9 years ago | (#12680323)

but is it resource intensive?

Re:Try a VM (0)

Anonymous Coward | more than 9 years ago | (#12680304)

Until you try to run your application on a system that doesn't have a Java VM. Not to mention the Swing incompatibilities even between Windows, Solaris, Linux and OS X (which, by the way, are the only systems with full Java compliance). And only x86, Sparc, and PPC hardware for that matter.

This is about way more than that. He's talking about stuff like ARM, Alpha, MIPS, VAX; OpenVMS, Acorn, AmigaOS, SkyOS. You know, the stuff they don't sell at Best Buy or even Newegg.

Re:Try a VM (4, Insightful)

mellon (7048) | more than 9 years ago | (#12680349)

Spoken like someone who's probably never tried running their JVM-based GUI application on a new platform. Java cross-platform compatibility is a nice idea in theory, but in practice you wind up having to test everywhere and tweak your code when you run into differences in GUI implementations, so it's definitely not write-once, run everywhere. A more complete API specification would help here, but if wishes were horses, there'd be a lot of poo on the road.

Re:Try a VM (2, Insightful)

quelrods (521005) | more than 9 years ago | (#12680350)

Except that Java is one of the most non-portable solutions ever. Sun Java runs on exactly: Windows x86, Windows IA64, Linux x86, Linux IA64, Solaris SPARC32, and Solaris SPARC64. Some decently written C code easily runs on more systems than that while only requiring a compile. Java is only portable in theory. Unless Sun opens the JVM it will never be fully portable.

Re:Try a VM (2, Informative)

Rich0 (548339) | more than 9 years ago | (#12680423)

Clearly you aren't running on amd64. I've given up on running just about anything other than helloworld.java on this platform, using any JDK I can get my hands on (both Blackdown and Sun, stable and beta versions).

A VM is just another architecture. In theory we could just write everything for x86 and then run emulators on every other platform, and it would be about the same thing.

The problem is that Java works great as long as you only run it on an x86, or maybe a sparc or a mac. And java apps have their downsides as well.

I call bad c code (4, Insightful)

quelrods (521005) | more than 9 years ago | (#12680229)

Overall the argument is mostly bogus. For example, many Linux developers have trouble writing code that even compiles under any of the *BSDs. That is just sloppy coding. If everyone got in the habit of at least writing code that doesn't use system-specific includes (Linux developers seem the worst at this) and compiled with gcc -Wall -pedantic or something similar, it wouldn't be much of an issue. While I can see that a request to make something work on OpenBSD/VAX might be better ignored, I fail to see how supporting at the very least Linux/*BSD (Open, Net, Free) on ppc, sparc, sparc64, and x86 is supporting a minority. Overall OSS users/developers ARE a minority, and to argue over which minority beats whom is silly. Also, to only bother to support Linux is no better than only bothering to support Windows!
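
[Editor's illustration] A small, hypothetical example of the "system-specific includes" complaint: the commented-out include below only exists on Linux because it reaches into kernel headers, while the standard <limits.h> provides the same constant (optionally, hence the #ifdef) and compiles cleanly with gcc -Wall -Wextra -pedantic on the BSDs as well.

    /* Non-portable: pulls a constant out of Linux kernel headers. */
    /* #include <linux/limits.h> */

    /* Portable: the same constant is available through the standard header. */
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
    #ifdef PATH_MAX
        printf("PATH_MAX = %d\n", PATH_MAX);
    #else
        printf("PATH_MAX not defined here; query it at runtime instead\n");
    #endif
        return 0;
    }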

Re:I call bad c code (1)

geomon (78680) | more than 9 years ago | (#12680257)

Also, to only bother to support Linux is no better than only bothering to support Windows!

Too true.

Re:I call bad c code (0)

Anonymous Coward | more than 9 years ago | (#12680366)

Keep in mind that Drepper is talking about low-level stuff such as glibc. It may be a necessity to write "bad" C code in order to make the platform work. He writes the software that abstracts away the differences between platforms.

Minorities make life so ... complicated ... (5, Funny)

dist_morph (692571) | more than 9 years ago | (#12680232)

Let's just eradicate them once and for all. A homogeneous Linux monoculture will be easier to maintain and be to the benefit of all of us.

Apple (1)

mnemonic_ (164550) | more than 9 years ago | (#12680259)

You know, that's what Apple has now. And in four years, Apple developed an OS that's stable, fast and incredibly easy to use, along with a software suite that integrates beautifully. Apple's already done this, while the Linux community is still trying after 20 years.

Re:Minorities make life so ... complicated ... (1)

Bananatree3 (872975) | more than 9 years ago | (#12680263)

[sarcasm]and meanwhile, let's throw out all but AMD processors, everything but Western Digital hard drives, and have a single Linux distro so we can make everyone happy.[/sarcasm]

But the point is ... (1)

TheGavster (774657) | more than 9 years ago | (#12680234)

The point of open source is that the features that get added and the platforms that are supported are the ones that people put the time in to code. If a project supports platform x, it stands to reason that someone, somewhere, uses platform x and the given application, and has the skill to make them work together. This isn't a commercial project, where you have a marketroid telling you 'someone, somewhere wants feature x and for the application to work on platform y'.

No... (2, Insightful)

AAeyers (857625) | more than 9 years ago | (#12680249)

Porting Open Source to Minor Platforms is Harmful

I would say that an article about someone's blog entry on the front page is harmful.

Re:No... (1)

smitty_one_each (243267) | more than 9 years ago | (#12680406)

This is the closest thing to "Stuff that matters" I've seen on the front page in weeks. UD is nobody's corporate shill, and the question of 'what is a reasonable platform range' is another facet of configuration management that really bears an extended hash-out.
s/harmful/helpful/

Standard operating procedure from Ulrich (1, Interesting)

Anonymous Coward | more than 9 years ago | (#12680252)

This post was surely inspired by this message [redhat.com] to libc-alpha.

He's curt to the point of being rude, and I'm surprised anyone wants to develop on anything he's involved with. I wonder if the more social glibc developers like Roland agree with his position?

Sounds like... (1)

danielk1982 (868580) | more than 9 years ago | (#12680258)


Which are the OS targets which should be supported? Support for proprietary OSes should be dropped. Free software should only support free OSes and even among those the group needs to be trimmed significantly (ideally to one).


Sounds like someone wants to take his ball and go home.

platform-specific bugs? Doubtful (5, Insightful)

pedantic bore (740196) | more than 9 years ago | (#12680262)

Most of the bugs that I've seen that are "platform-specific" are not actually due to bugs in that platform -- they're just ordinary bugs that were there all along, unnoticed due to poor assumptions. (Back in my day, we called this the "all the world's a VAX" assumption -- now it's "all the world's an x86".) Finding these bugs and removing them makes the code better.

The bugs due to platform bugs -- well, knowing about them helps improve the platform.

If you think fixing these bugs is a pain in the neck, fine. If you think it's a waste of time, however, think again.
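
[Editor's illustration] A classic instance of the "all the world's an x86" assumption, as a hypothetical C sketch (not from any project in this thread): getchar() returns an int so that EOF can be distinguished from every byte value, and stuffing the result into a plain char only appears to work because char happens to be signed on x86.

    #include <stdio.h>

    int main(void)
    {
        char c;                   /* bug: should be int */
        /* On platforms where char is unsigned (e.g. typical PowerPC and ARM
         * ABIs), the comparison with EOF (-1) is never true and this loop
         * never terminates; on signed-char x86 it merely mishandles 0xFF. */
        while ((c = getchar()) != EOF)
            putchar(c);
        return 0;
    }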

If they only affect exotic platforms it is a waste (1)

glrotate (300695) | more than 9 years ago | (#12680392)

Sure, there are examples of "Gee, this would have been a major security problem if not for the VAX port". But most of what you see on these lists is just, "Waah, it doesn't compile on my Amiga." Well, who cares? Assumptions were made, assumptions that cover 90% of the users. Go buy a PC and load Linux already.

Re:platform-specific bugs? Doubtful (5, Insightful)

runningduck (810975) | more than 9 years ago | (#12680408)

Porting software to different platforms has two distinct benefits:

1) identifying subtle bugs

2) preparing software for future platforms

- Subtle Bugs -
As stated in the parent post, porting software to various platforms helps uncover bugs that may not surface during routine testing in a monoculture.

- Future Platforms -
Making software portable prepares software for the future. As computer technology advances, software that has been developed to be portable will be the first code running on the new hardware. I could go on and on about this, but I think the recent articles regarding Intel's Itanium already make this point loud and clear.

I can see where he is coming from.... (1)

refactored (260886) | more than 9 years ago | (#12680268)

I suspect of all the people in the world, Uli has been one of those bitten hardest and deepest by this problem, mostly because of his vast contributions to so many of the core items in the GNU/Open Source software collection. So I can feel for his radical tone.

However, there is some good PR value in some of these ports.

He is right though, a bit of strategic thinking would be Good. E.g. think about when dropping a port would merely force some lazy PHB to take the final bold step into the wonderful world of GNU/Linux, and when keeping a port is winning valuable support and new recruits. E.g. dropping support for SCO was Good in several ways.

No need to look too far (1)

shutdown -p now (807394) | more than 9 years ago | (#12680271)

Free software should only support free OSes and even among those the group needs to be trimmed significantly (ideally to one).

I don't care whether your code is "Free" or not in this case. If you want the whole playing field for yourself, you want too much. Competition is just as good in the OSS world.

testing is good (1)

ScottSpeaks! (707844) | more than 9 years ago | (#12680273)

And here I thought that getting rid of bugs was a good thing. Coding for portability makes for better code, so code that doesn't port easily is deficient. (Any high school freshling with Visual Studio can write code that works dependably on a single platform.) Ergo, testing on secondary and tertiary platforms makes for better apps and is simply another aspect of QA.

Fine until some future bug bites you in the ass... (5, Insightful)

forkazoo (138186) | more than 9 years ago | (#12680275)

Keeping your code portable helps eliminate stupid assumptions, which make your software useless when the dominant platform changes. Once, all the world was a VAX, and people did stupid things. Then, the world changed. They kept doing stupid things.

Think, for example, about 64-bit cleanliness. A piece of software which supported Alpha, UltraSPARC64, SGI's MIPS64, and so on, would have been fairly trivial to port to IA64, AMD64, and PPC64 when they started to become significant. OTOH, code which assumed it was running on a 386 would have been a pain in the ass to port to even just AMD64.

Also, by supporting a broad spectrum of compilers, you will probably be able to understand what is going wrong when your compiler of choice changes. Witness code breakage on gcc 3. Devs who had already ported their software to a variety of compilers were better able to respond to any issues, and fix their code.

Many monoculturalists make stupid endianness assumptions. Now, Mac OS X is becoming a significant market. If you have stupid endianness assumptions, then you may wind up having to basically rewrite in order to gain access to those millions of potential customers/users.

Imagine if OpenGL only supported SGI and 386. Or libtiff only worked on i386. People just wouldn't use them. Things like that get used because they are ubiquitous, and you can build them anywhere.
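
[Editor's illustration] A hypothetical sketch of the 64-bit cleanliness problem described above: code written when "all the world was a 386" often stashes pointers in ints, which silently truncates them on LP64 platforms (Alpha, UltraSPARC, AMD64, ...). Building on even one 64-bit target catches it immediately.

    #include <stdio.h>
    #include <stdint.h>

    static void broken(void *p)
    {
        int cookie = (int)(intptr_t)p;        /* truncates high bits on LP64 */
        printf("broken cookie:   %d\n", cookie);
    }

    static void portable(void *p)
    {
        uintptr_t cookie = (uintptr_t)p;      /* integer type sized for a pointer */
        printf("portable cookie: %lu\n", (unsigned long)cookie);
    }

    int main(void)
    {
        int x = 42;
        broken(&x);
        portable(&x);
        return 0;
    }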

Sounds like a Microsoft-ism (2, Interesting)

amigabill (146897) | more than 9 years ago | (#12680276)

Doesn't that sound like something MS would say? Don't waste your resources developing for small-time operating systems like Linux or Mac when only the large market platform, i.e. MS, is feasible for reaching your feature goals in a timely manner?

Come on. While this is an excuse for a proprietary for-profit product with a small team, I don't see this flying as well in open source land. If some guy wants to port Firefox or OpenOffice to something off the wall like AROS or some other nearly unknown platform, let him. He wouldn't be working on the Windows or Linux ports regardless, so why prevent him from doing something so crazy with his own time and money?

Portable code is robust code (4, Insightful)

Sinner (3398) | more than 9 years ago | (#12680283)

Porting to minor platforms exposes bugs, real bugs, that might not have been found otherwise. It enforces good software engineering practices.

Of course, you can overdo it. Take a look at InfoZip for example. No, seriously, take a look at it. It works on every platform you can think of, but the price is that the code is almost unreadable. The biggest problem is all the cruft needed to maintain 16-bit compatibility. It desperately needs updating to handle non-ASCII filenames intelligently, but the last thing that code needs is another layer of #ifdef's.

There comes a time when you just have to say "fuck the Amiga".

Re:Portable code is robust code (1)

Omnifarious (11933) | more than 9 years ago | (#12680322)

IMHO, #ifdef's are a poor way to handle portability, and the code needs some serious refactoring to either find a single method that works properly on all platforms, or move the platform-dependent pieces out to separate modules.

Re:Portable code is robust code (2, Insightful)

bani (467531) | more than 9 years ago | (#12680405)

#ifdefs are a tool like any other. Use them when they make sense, when they are the best tool for the job. They are not always the best tool, but "never" (as you imply) is the wrong answer also.

Writing 5 different copies of the same function is often more counterproductive than one with 5 little #ifdefs in it.

Moving code out into separate modules can introduce other problems, such as the increased effort to keep all the individual architectures coherent -- which increases the risk of bugs creeping in. This approach is often counterproductive.
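
[Editor's illustration] A hypothetical sketch of the "one function, one small #ifdef" style described above, as opposed to keeping a separate copy of the function per platform; only the truly platform-specific line differs.

    #include <stdio.h>

    #if defined(_WIN32)
    #  include <windows.h>
    #else
    #  include <unistd.h>
    #endif

    /* Sleep for the given number of milliseconds on either platform family. */
    static void sleep_ms(unsigned ms)
    {
    #if defined(_WIN32)
        Sleep(ms);                        /* Win32 API, takes milliseconds */
    #else
        usleep((useconds_t)ms * 1000);    /* POSIX, takes microseconds */
    #endif
    }

    int main(void)
    {
        puts("sleeping 100 ms...");
        sleep_ms(100);
        puts("done");
        return 0;
    }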

Re:Portable code is robust code (3, Funny)

NoMoreNicksLeft (516230) | more than 9 years ago | (#12680351)

Tramiel, is that you?

Ummmm . . . (4, Insightful)

erikharrison (633719) | more than 9 years ago | (#12680284)

Sometimes, we use hyperbole to make a point.

Unfortunately, I don't think that Ulrich is doing that.

AIX is not a minority platform. What The Fuck. Okay, so the AIX guys are asshats in the way they treat GCC, fine. But GCC's claim to fame is that it is the cross-compiling, multiplatform compiler du jour. I think Ulrich loses a lot of credibility to say that GCC needs to not support AIX because it's a minority platform.

*nix applications which run primarily in userspace should port to the various BSDs and Linux easily, and if they don't then 99% of the time it's a bug. And in many cases, it's a bug that will affect the working platforms eventually (relying on nonstandard behavior of system calls, linker oddities, assumptions about file placement, etc). And if a closed Unix platform has paid developers to assist in the porting, then it should run on that platform too. And if the paid devs are dickbrains, then a good project leader should say so. Behave, or fork and get your whining ass out of my tree.

These AIX GCC guys shouldn't be saying "This patch breaks AIX, kill it", they should be saying "This patch fixes *blank* on AIX", at least most of the time.

Apparently ... (1)

Fookin (652988) | more than 9 years ago | (#12680285)

Their server is hosted on a minor platform.

Ouch.

Not just OSS (0)

Anonymous Coward | more than 9 years ago | (#12680297)

In the large software project I work on, I can attest that this is most definitely true.

Our product supports 13 platforms (usually both 32-bit and 64-bit for each of those) - several Windows versions, Unices, Linuces... We try to run our entire test suite on every platform for every release and every fixpak. In general it takes about a month to run all the tests for each platform. Obviously, we run the suites concurrently on each platform.

Once we get close to the end of the release cycle, and the code gets frozen, every last minute fix has to be run on all these different platforms... what a hassle. It consumes large amounts of both hardware and human resources to perform all this testing.

Not to mention that in the normal development process, when checking in a fix you can't test your change on every possible platform... you'd never get any work done. So we test on the major ones and cross our fingers that it doesn't break anything else. Sometimes there are compile errors on the rare platforms (HPIA64 is a beast for this) and other times there are runtime problems that don't get discovered until much later.

So, I can attest that in the real world, even in a non OSS environment, supporting all those rare platforms is a huge PIA and definitely slows down development and reduces quality.

Re:Not just OSS (0)

Anonymous Coward | more than 9 years ago | (#12680305)

What is the implementation language?

How do you keep the code base from splitting too far along platform lines?

Re:Not just OSS (0)

Anonymous Coward | more than 9 years ago | (#12680354)

It's about 66% C, 17% C++, 17% Java.

The build levels are the same across the platforms. The less-used platforms just come out with fewer builds. But, for example, the 050530 build will have the same source on all platforms.

Re:Not just OSS (0)

Anonymous Coward | more than 9 years ago | (#12680385)

It makes sense not to have separate source trees, but do you just use #ifdefs to handle tricky things like byte ordering?

One approach that I've seen work many times over is to have separate platform-specific header files and a compiler directive to pull in the correct headers. What approach do you use?
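
[Editor's illustration] A hypothetical sketch of that dispatch-header approach: one umbrella header selects a per-platform header, and the rest of the code includes only the umbrella. The file names here are invented for illustration.

    /* ---- platform.h (header fragment) ---- */
    #ifndef PLATFORM_H
    #define PLATFORM_H

    #if defined(__linux__)
    #  include "platform_linux.h"
    #elif defined(__FreeBSD__) || defined(__OpenBSD__) || defined(__NetBSD__)
    #  include "platform_bsd.h"
    #elif defined(_WIN32)
    #  include "platform_win32.h"
    #else
    #  error "unsupported platform: add a platform_*.h for it"
    #endif

    #endif /* PLATFORM_H */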

He's crazy. (2, Insightful)

mellon (7048) | more than 9 years ago | (#12680303)

Porting reveals bugs. It also forces you to rethink short-sighted decisions. Furthermore, most of the problems I run into with porting have to do with cross-version incompatibility on Linux - the BSDs actually have comparatively stable APIs.

This line of thinking is a lot like how I presume Microsoft thinks of things: if we just port to this one API, it doesn't matter how bletcherous it is. But as Microsoft has discovered, this kind of thinking actually turns into a straitjacket, which prevents them from being responsive when they need to be.

get some perspective (4, Insightful)

ArbitraryConstant (763964) | more than 9 years ago | (#12680307)

"IMO the most notorious case is how the gcc development is held hostage by Edelsohn and maybe IBM as a whole by requesting that everything always works perfectly well on AIX. How often has one seen "this patch breaks AIX, back it out". It cannot reasonably be expected that everybody tests on AIX. It is an proprietary OS running on proprietary and expensive hardware which not many people have access to. The overall development speed could be significantly improved by dropping the AIX requirement which, in some form or another, has been agreed upon by the steering committee. AIX is irrelevant in general today, i.e., the situation changed. And the people in the steering committee are too nice to just tell the very small minority causing the problem to take a hike."

GCC is the de facto standard because it runs on more platforms than anybody else.

If it ceases to run on all these platforms, one of three things will happen:
a) the project will be forked to keep supporting them,
b) another compiler will take its place as the de facto standard, or
c) people will be forced to use whatever the default cc is on their OS.

In any of these cases, the portability concerns will get an order of magnitude worse.

what's he calling non-mainstream? (1)

iggymanz (596061) | more than 9 years ago | (#12680309)

A couple million web sites are running an open source OS that is something other than Linux, and those that do run Linux are bailing out of Red Hat to other things: SuSE, Ubuntu, etc.

Java/C# (1)

zyridium (676524) | more than 9 years ago | (#12680312)

This is a reason we should be using languages other than C/C++.

Then we only have one (significant, sure) platform-dependent application.

Dealing with platform oddities on every program is always going to be a far more difficult problem.

Be One (1)

Atomic Frog (28268) | more than 9 years ago | (#12680325)

Yeah, of course one operating system is easier. While you're at it, it's also counterproductive to support a myriad of hardware platforms too.

From now on, you will be allowed to use 1 specific CPU, with 1 chipset and one motherboard...
Heck, and why even let other people at the source code. They'll just come back and tweak things and report new bugs. Close up the source, I say!

Sheesh! The whole beauty of open source is that anyone can and probably will port stuff to their own favourite platform. You may find some platform-specific bugs, but they may be helpful. You may learn that something that fails disastrously on one platform is actually showing up as a subtle bug on another (e.g. a security issue).

You can't pick and choose about open-source. Either it's open to everyone, or not.

I categorically disagree. (5, Insightful)

bani (467531) | more than 9 years ago | (#12680339)

Porting to other platforms/architectures often reveals bugs in your primary target platform. It is often worth the effort to port to other platforms on this basis alone.

Also, if it takes you a lot of effort to stay architecture-nimble, there is something fundamentally wrong with your design. This in itself should be a warning.

But there is no benefit at all in supporting something like PA Risc, m68k, cris, etc as configurations of the generic code.

Ulrich obviously has no clue whatsoever about embedded systems, and should therefore stfu on this point. One of the most popular embedded platforms is a 68k variant (ColdFire) -- it's probably second behind ARM. By dumping support for 68k you castrate Linux in the embedded marketplace. There's much more to 68k Linux than Sun3 and Atari/Amiga.

His rant against MinGW as "undeserving" is stupid. MinGW is an enabler -- it means people can develop for Win32 without having to pay Microsoft $$$$ for the privilege of doing so.

His 'dictatorship of the minorities' argument is actually self-defeating on this point, because Microsoft users are in the majority. By his own arguments, we should be concentrating on supporting Win32 as the primary target for gcc and the primary architecture for Linux.

Utterly ridiculous.

Bug-eyed rants like his just serve to reinforce the stereotype that all open source advocates are completely unhinged. It is not helpful in the least.

BeOS (1)

ayersrj (701333) | more than 9 years ago | (#12680343)

It's the wave of the future. How come OpenOffice hasn't been released for it yet?!

OS X is one (1)

stm2 (141831) | more than 9 years ago | (#12680357)

What I see on mailing lists is a lot of questions regarding how to port or how to install OSS on Mac OS X, so please don't port it to OSS. There are also several Linuces, so let's settle on only one to rule them all. (I am being sarcastic here!)

Re:OS X is one (1)

stm2 (141831) | more than 9 years ago | (#12680378)

I correct myself: Where it says "don't port it to OSS" should read "don't port it to OSX"

OpenSSH (1)

ArbitraryConstant (763964) | more than 9 years ago | (#12680361)

I suppose the OpenBSD crowd should stop supporting those "other" platforms like Linux with OpenSSH.

Re:OpenSSH (0)

Anonymous Coward | more than 9 years ago | (#12680432)

RedHat would probably support that, as it would give them an excuse for introducing a fork which they control.

I don't agree either (2, Insightful)

toby (759) | more than 9 years ago | (#12680369)

In my experience, porting is like the water of a river washing over river stones. Over time, every port makes the stone smoother. This applies whether it's a new architecture, O/S, compiler, or even just the unfamiliar box of some other user.

There are bugs that just don't get flushed out until you port to: non-x86; 64-bit; big-endian; Win32; OS X; etc, etc, etc. Drepper should know better: All the world's not a VAX, etc. (though a VAX port is a fine start :-)

Also, every port makes the process of porting itself easier. It's no coincidence that the most reliable and defect-free software is typically the most-ported software. This has always been true: TeX and METAFONT (where the monetary bug bounty [tug.org] doubled for every bug report, so assured was Knuth of its quality); Apache; Linux itself; NetBSD; GCC and friends; etc.

That's nonsense (1)

raddan (519638) | more than 9 years ago | (#12680371)

I've come across many cases of code that relies on the endianness of the x86 and breaks on PPC. Sure, it'll run fine on x86 now, but what if there's a shift in architecture in the future? Look at all the once-big architectures that are dead now: VAX, Alpha, Motorola 680x0... Why write the same code again?

Porting reveals bugs. Shouldn't we spend time finding bugs? It's not like the BSDs are lagging behind in features...
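
[Editor's illustration] A hypothetical sketch of the endianness trap mentioned above: casting a byte buffer to a wider type bakes the host's byte order into the data format, so files written on x86 read back wrong on PPC. Assembling the value byte by byte gives the same answer on every architecture.

    #include <stdio.h>
    #include <stdint.h>

    /* Fragile: result depends on the host's byte order (and alignment). */
    static uint32_t read_u32_host(const unsigned char *p)
    {
        return *(const uint32_t *)p;
    }

    /* Portable: explicitly little-endian, regardless of the host. */
    static uint32_t read_u32_le(const unsigned char *p)
    {
        return (uint32_t)p[0]
             | ((uint32_t)p[1] << 8)
             | ((uint32_t)p[2] << 16)
             | ((uint32_t)p[3] << 24);
    }

    int main(void)
    {
        const unsigned char buf[4] = { 0x78, 0x56, 0x34, 0x12 };
        printf("host order:    0x%08x\n", (unsigned)read_u32_host(buf)); /* varies */
        printf("little-endian: 0x%08x\n", (unsigned)read_u32_le(buf));   /* always 0x12345678 */
        return 0;
    }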

Hypocritical (4, Interesting)

Temporal (96070) | more than 9 years ago | (#12680383)

If you write your code to be portable in the first place, fixing platform-specific issues should be quick and easy.

And, of course, you write your code to be portable because you make sure it runs on the big three: Windows, Mac OS X, and Linux.

Right?

Actually, I think a much larger problem is just that: Many OSS developers don't even try to support Windows. Yes, I know you hate the OS and don't want to support Microsoft, etc., etc.. But, how can you complain about major software not supporting Linux when you're writing your own software that doesn't support Windows? Isn't that entirely hypocritical?

My take: Port your software to every platform you can, especially Windows. This gives freedom of OS to your users. And if you're a Linux user yourself, you should understand just how valuable and important this freedom is.

You say that like its a bad thing... (2, Insightful)

kimanaw (795600) | more than 9 years ago | (#12680390)

As someone who has supported multiplatform s/w that's hosted on

  • Win32
  • Linux (various, incl. PPC)
  • Solaris
  • AIX
  • HPUX
  • OS X
  • FreeBSD
  • MVS
  • OS/400
  • multiple other "minor" Un*x platforms
  • a Zaurus
  • a PocketPC
  • some routers running a proprietary kernel

...I call BULLSHIT!

The bugs one finds on "minor" platforms usually end up being bugs on the "major" platforms that you just haven't found before. Of course, for those of you still intent on (or forced into) writing code in C/C++, you're likely getting your just deserts.

Core OS API standards are the key to better software portability (1)

Eravnrekaree (467752) | more than 9 years ago | (#12680394)

I believe most of the problem of having to spend large amounts of time porting an app to different OSes could be fixed by each OS supporting the same basic API calls, command-line tools and compile tools, as defined in things such as POSIX and the Single UNIX Specification, in order to assure source compatibility: the ability to recompile an app on any OS with no modifications. It seems one of the biggest offenders in making portability difficult is Windows, and a lot of time has to be wasted to port to that platform.

Many OSes also support each other's binary formats. FreeBSD can run compiled Linux applications, which is possible since Linux's libraries, etc., are open source and can be used to support such binary compatibility on other OSes. FreeBSD's Linux support is pretty fast too, with little or no performance hit.

Flame boys begone (1)

Moby Cock (771358) | more than 9 years ago | (#12680397)

I think it's useful to point out that MS tries to be all things to everyone (with varying degrees of success) and it has not hurt them (yet). Furthermore, tinkering is what OSS is built on. To say that tinkering with obscure hardware/software is harming OSS is really rather foolish.

He is wrong on all counts. (4, Insightful)

bluGill (862) | more than 9 years ago | (#12680402)

It is rare that I can say someone is wrong on all counts, but I have not found one defensible statement in there. (Though I guess one could be hidden and I missed it)

His first mistake is thinking GNU is everything. Maybe for him it is, but most people use what works. When the boss sits me down at an AIX machine I want it to work - I'm not allowed to install Linux (though I'd install *BSD if I could wipe the OS); I'm supposed to get work done.

Minorities are useful despite the cost of working with them. Bugs that are one in a million may happen every time on AIX. One-in-a-million bugs are very hard to find. I've spent days looking at a partial crash trace wondering why it broke, and if it will happen again. With no known way to duplicate the bug it is really hard to fix, and hard to be sure the 'fix' works. When it fails every time, the bug is easy to fix.

Good programmers should have no problem writing cross platform code. When your code breaks on AIX, it is a sign of bad code - even if the breakage is because AIX doesn't have a function you expect.

Cross-platform compilers (gcc) are much easier for me to work with. Because gcc is cross-platform I can compile my stuff at home and debug it, then bring it to work, compile it, and assume it works. Particularly with gcc 2.95, the support for C++ was so bad that you could not count on code written for it to work on a better compiler.

Speaking of gcc 2.95, other vendors have had better compilers for years, while gcc is only now arriving. Even today, gcc isn't a great C++ compiler (though 4.x is much better). There is no point in throwing stones at other vendors - their compilers may have been expensive, but they at least worked close to right.

The upper/lower-case differences with Windows are a non-factor. You should never have two names that differ only by case - it leads to many bugs if you do.

The API differences on Windows are mostly handled by Cygwin and MinGW. The areas that are different are places where you should have your code modular anyway. Mostly we are talking about device and networking code. IPv6 is on the way (has been for 10 years now...); you need some different code to support that. There is no standard for device code - what works on OpenBSD won't work on Linux or FreeBSD.

True, almost nobody cares about VAX - but it is interesting anyway. If your code is modular like it should be, then supporting those weird things isn't a big deal - you write your code, and let those who care about it do the testing.

A short summary of his position: there should be only one OS that anyone runs: Red Hat Enterprise Linux on x86 (not x86-64). Not Fedora Core, much less Gentoo or those other non-Red Hat distributions. You FreeBSD people can go to hell.

He wants to take his ball and go home, I don't care, we are better off without people like him in the open source world.

If it's the same person supporting both platforms (0)

Anonymous Coward | more than 9 years ago | (#12680409)

...I can see where that would be a problem, but if the project is diverse enough to include people more skilled in Windows, BSD, Macs, or whatever than in Linux, I'm not sure you would gain much by forcing them to stick just with Tux instead of the Borg, the daemon, or the fruit.

Severely OT and trollish (1)

homerj79 (58075) | more than 9 years ago | (#12680414)

Is it just me, or did I first read his name as Ulrich Drpepper?

Geebus, RTFA'ing was like reading Lenin! (1)

idontgno (624372) | more than 9 years ago | (#12680419)

Was it me, or did Ulrich sound distinctly like he was arguing for the Dictatorship of the Proletariat [wikipedia.org] , at least until the perfection of New Free-Software Man and the inevitable rise of International Socialism^h^h^h^h^h^h^h^h^h Software Freeism?

I philosophically support Free Software. (And no, I don't have to show you my Party Credentials.) But loose talk about purges of counterrevolutionary elements and collectivization of software development really freaks me out.

News flash: Communism rightly failed. Take Free Software down its path only if you want it to fail too.

Well, some neo-Stalinist Free Software chekist will probably downmod me. Is this how Leon Trotsky felt? (I mean, before he got the ice axe in the head.)

Keep Free Software FREE! Once you begin rounding up and interning the minorities, you are taking the first steps toward a new totalitarianism!

He does have a point (0, Offtopic)

ShatteredDream (636520) | more than 9 years ago | (#12680421)

The problem with all of this "choice" is that to be able to compete with Windows, Mac OS X or even Zeta in the long run, Linux distributions have to have all of their components just as tightly integrated. That means that ideally the KDE, X.Org and Linux developers need to be all on the same sheet of music to make sure that their components work very, very well together. Most people cannot even tell you what GDI is on Windows, or Quartz is on Mac OS X, so why should they have to know what X.Org is and why they need to care about it?

This is the painful reality for these developers. The average buyer doesn't want a distribution, they want a complete operating system. KDE + X.Org + Linux is a cobbled-together setup; Windows, Mac OS X, Syllable, BeOS, etc. were and are not. That's what they expect, and it may mean that some of the smaller projects have to take on a lot of work. So be it. If you want desktop Linux to work well, and be a true replacement for Windows, then it may mean that the KDE and/or GNOME guys have to go Linux-only, or that another project has to be started that creates a complete and pure Linux operating system that is a "total experience and environment" rather than a collection of packages.

The difference is fundamental, not semantic. It means that the projects must be coordinated together with one vision, one plan and a goal of one end result.