
Alternatives to Autoconf?

Cliff posted more than 10 years ago | from the dodging-the-complexities dept.

Programming

Despairing Developer queries: "Once, autoconf was a great way to make widely portable programs. But now, when you have to spend more time sorting out incompatibilities between autoconf versions, breaking up battles between autoconf compatibility wrappers and configure.in compatibility functions that try to outsmart each other, and on top of that watch the list of dependencies grow (it's not that fun to compile Perl on UNICOS) and performance dive rapidly, what is your escape plan? Is there a drop-in replacement for autoconf? Is there something else out there that is as portable as autoconf to use instead?"


QMake (3, Interesting)

rufus0815 (651685) | more than 10 years ago | (#9213392)

QMake (from Trolltech) is a possible replacement, though not free (according to the FSF)!

Re:QMake (2, Informative)

blackcoot (124938) | more than 10 years ago | (#9213449)

not entirely sure why this got marked as a troll, but i guess that's what the meta-mod system is for. two things: firstly, it's available under some crazy dual license scheme which allows you the option of accepting either the QPL [trolltech.com] or the GPL (see here [trolltech.com] ). secondly, i don't think that qmake comes even close to covering the ground auto(conf|make) does. i guess the closest relation is to automake, except crippled. as far as i know, it doesn't have any of the detection / configuration stuff that autoconf does. of course, it's been a while since i last used it, so that may have changed in newer versions.

Re:QMake (3, Interesting)

rufus0815 (651685) | more than 10 years ago | (#9213498)

AFAIK it still hasn't got a lot of features that automake has (especially the detection/configuration stuff, as you said correctly).
Nevertheless, it's easy for beginners, and it's cross-platform!
So for POSIX developers it might be insufficient; for Windoze developers (M$VC) it would be a great improvement (the way M$VC handles project settings is horrific)!

Re:QMake (1, Informative)

Anonymous Coward | more than 10 years ago | (#9213667)

not entirely sure why this got marked as a troll

Because QMake is free. I couldn't find any comment by the FSF about QMake, and it's dual licensed under exactly the same terms as Qt itself. From the FSF website:

We recommend that you avoid using the QPL for anything that you write, and use QPL-covered software packages only when absolutely necessary. However, this avoidance no longer applies to Qt itself, since Qt is now also released under the GNU GPL.

Simply suggesting QMake wouldn't have been a troll, of course, but I assumed someone else would repost without the misinformation.

Re:QMake (1)

orangesquid (79734) | more than 10 years ago | (#9220895)

No, it's because it's put out by trolltech

<grin>

Re:QMake (1)

Brandybuck (704397) | more than 10 years ago | (#9217859)

QMake is free according to the FSF. Actually, it always has been free according to the FSF. Even its predecessor, tmake, was always free according to the FSF. What have you been smoking?

mod article up! (5, Insightful)

blackcoot (124938) | more than 10 years ago | (#9213438)

i wish there was a way to moderate articles up, because you've hit on one of my major (*major*) pet psychotic hatreds regarding developing software. auto(conf|make) sucks badly. it's bearable if you're developing from scratch (not depending on other libraries) or require that your bundled versions of libraries be used. but when your software depends on, say, 123098123871237 other packages (i.e. you're writing for gnome or kde), you're boned.

unfortunately, there are no reasonable replacements that i know of, which is probably a testament to the nastiness inherent in solving this problem. a pity, really -- auto(conf|make) and company are a really good idea (in theory). unfortunately, there seems to be some really bad crack smoke involved in designing these tools. first (and probably foremost) in my mind: why isn't there a database of some sort which would at least allow the option of keeping track of which versions of what applications have been configured how and installed where?

Re:mod article up! (5, Informative)

noselasd (594905) | more than 10 years ago | (#9213502)

Try pkg-config --list-all
pkg-config provides you with compiler/linker/preprocessor flags for compiling a program that uses various libraries.
Now, if only all libraries provided a .pc file..
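For reference, a .pc file is just a few lines of metadata that pkg-config reads; a hypothetical libfoo.pc might look like this (name, paths, and versions made up for illustration):

```
# /usr/lib/pkgconfig/libfoo.pc (hypothetical example)
prefix=/usr
libdir=${prefix}/lib
includedir=${prefix}/include

Name: libfoo
Description: Example library (illustration only)
Version: 2.1.3
Requires: glib-2.0 >= 2.0
Cflags: -I${includedir}/foo
Libs: -L${libdir} -lfoo
```

With that installed, `pkg-config --cflags --libs libfoo` expands to the right compiler and linker flags, and `pkg-config --modversion libfoo` answers the version question.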

Mod parent up! (4, Interesting)

sofar (317980) | more than 10 years ago | (#9213523)

pkg-config single-handedly is creating a sea of calm in autoconf land while still keeping the strengths of autoconf. Take Xfce4, for instance. Within a relatively short timeframe this small group (4-5 devs) has written an entire framework that compiles on tens of platforms by using PKG_CONFIG extensively together with autoconf.

Re:Mod parent up! (2, Insightful)

Jahf (21968) | more than 10 years ago | (#9215099)

Maybe, but as a user I often find, when trying to compile a package that has a pkg-config .pc requirement on another package, that the other package forgot to leave its .pc file around.

For instance, when compiling Evo from fresh source on a version of SuSE awhile back, I installed the various -devel RPMs that were required and yet those -devel RPMs left out their .pc files, meaning that even though I had the development libraries, Evo couldn't find them.

Not saying pkg-config isn't useful, but it needs to be standardized on, like autoconf used to be, before it becomes truly useful to the power user.

Personally, I don't care which is used (pkg-config or a fixed autoconf) so long as it works well with others and is consistent.

Re:Mod parent up! (2, Informative)

sofar (317980) | more than 10 years ago | (#9215838)


well that's actually your distro's fault. And yes, binary distros suck at providing headers and the other -config and .pc stuff you need.

source distros like Lunar-linux [lunar-linux.org] provide a wonderfully rich platform with all of this -dev stuff installed by default, the way the developer intended it. These source distros are excellent for developers.

Re:Mod parent up! (2, Informative)

irix (22687) | more than 10 years ago | (#9217631)

As someone who works on an open-source project that requires evolution-devel to compile, let me say that I am well aware of that problem. When GNOME went over to pkg-config, many distros took a while to build their -devel packages with the .pc files.

That being said, I agree with the parent post - pkgconfig goes a long way toward solving problems with automake and autoconf. Everything in GNOME now uses it, so setting up the build environment for anything that depends on GNOME libraries is much, much easier. Hopefully other projects that provide libraries will follow in GNOME's footsteps.

Re:Mod parent up! (2, Interesting)

KewlPC (245768) | more than 10 years ago | (#9223312)

Until a project decides not to put its .pc files in the standard place. For instance, when installing Clanlib on my Gentoo system (using Portage, no less), the .pc files didn't get put in /usr/lib/pkgconfig (which is where pkg-config expects them to be on Gentoo).

Instead they got put in /usr/lib/clanlib-0.7.7, and when I tried to build a program that used pkg-config to find clanlib, the build broke. If I were a Linux newbie, I'd probably have just given up rather than try to find the .pc files that I knew were installed.

I really don't want to get into some kind of PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/some/directory:/some/other/directory:/yet/another/directory mess.
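For the record, the variable that extends the .pc search path is PKG_CONFIG_PATH (PKG_CONFIG itself names the pkg-config binary), so the workaround for a stray install location is a one-line environment fragment (hypothetical path and module name):

```
# prepend the nonstandard directory to pkg-config's .pc search path
export PKG_CONFIG_PATH=/usr/lib/clanlib-0.7.7:$PKG_CONFIG_PATH
```

After that, pkg-config finds the stray .pc files alongside the ones in the default location.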

Re:Mod parent up! (2, Interesting)

elFarto the 2nd (709099) | more than 10 years ago | (#9223987)

This is the one thing that bugs me about pkgconfig. I mean, how hard can it be to add a --install option, letting pkgconfig choose where to place the .pc file?

Maybe I'll go make a patch...

Regards
elFarto

Re:mod article up! (2, Interesting)

Crayon Kid (700279) | more than 10 years ago | (#9215568)

Which brings me to another issue: why isn't the output from 'configure --help' available in machine-readable form? Say, XML? This would help a lot with the creation of graphical configuration tools for source packages. AFAIK there are a couple of helper apps out there that do this, but they have to go through horrible hoops parsing the output from 'configure --help'.
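Those hoops look roughly like this. A minimal sketch (hypothetical helper, assuming the conventional two-column `./configure --help` layout) of the scraping such tools have to do:

```python
import re

def parse_configure_help(text):
    """Scrape option/description pairs from conventional
    './configure --help' output: options are indented and start
    with '--'; wrapped descriptions are indented even further."""
    options = {}
    current = None
    for line in text.splitlines():
        m = re.match(r' +(--[\w-]+(?:\[?=[^ \]]+\]?)?) +(.*)', line)
        if m:
            current = m.group(1)
            options[current] = m.group(2).strip()
        elif current is not None and line.startswith(' ' * 10):
            # continuation of the previous option's description
            options[current] += ' ' + line.strip()
    return options

sample = """Optional Features:
  --disable-nls           do not use Native Language Support
  --enable-shared[=PKGS]  build shared libraries [default=yes]
"""
opts = parse_configure_help(sample)
```

A structured (say, XML or key/value) dump straight from configure would make this guesswork unnecessary, which is exactly the parent's point.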

Re:mod article up! (1)

devphil (51341) | more than 10 years ago | (#9217661)


The place to extract that information is not 'configure --help'; it's the configure.{in,ac} file directly. The basic --help output (the standard options like --prefix, etc.) is known already, so just get the arguments to AC_ARG_ENABLE and AC_ARG_WITH.
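In that spirit, a rough sketch (hypothetical helper; only the common single-line macro forms are handled) of pulling option names straight out of a configure.ac:

```python
import re

def extract_options(configure_ac):
    """List the --enable-* / --with-* switches declared in a
    configure.ac by grabbing the first argument of each
    AC_ARG_ENABLE / AC_ARG_WITH macro call."""
    opts = []
    for macro, prefix in (("AC_ARG_ENABLE", "--enable-"),
                          ("AC_ARG_WITH", "--with-")):
        # first argument, optionally quoted with m4 [brackets]
        for name in re.findall(macro + r'\(\s*\[?([\w-]+)', configure_ac):
            opts.append(prefix + name)
    return opts

sample = """\
AC_ARG_ENABLE([debug], [AS_HELP_STRING([--enable-debug], [debug build])])
AC_ARG_WITH(x, [  --with-x                use the X Window System])
"""
found = extract_options(sample)
```

This sidesteps --help parsing entirely, though a real tool would also need to cope with multi-line macro arguments and m4 quoting tricks.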

Re:mod article up! (2, Funny)

Chris Z. Wintrowski (442269) | more than 10 years ago | (#9224357)

...This would help a lot with the creation of graphical configuration tools for source packages. ...


How is a graphical configuration tool going to help anything? You want to pile another layer of crap onto an already burdened autoconf just so you can play with your mouse? What the hell is wrong with you? Don't be such a pussy: type './configure --help', figure out the options on the command line, and stop recommending open source programmers waste their time developing mere gimmicks. Do you work for Microsoft or something?

Re:mod article up! (0)

Anonymous Coward | more than 10 years ago | (#9223948)

Did you take a look at pkg-config's source code ?

Re:mod article up! (1)

c (8461) | more than 10 years ago | (#9213553)

why isn't there a database of some sort which would at least allow the option of keeping track of which versions of what applications have been configured how and installed where.

You mean why doesn't ./configure keep such a database or why isn't there one on all Unix boxes?

The latter is obvious... there is no standard package management system.

I can't imagine a mode of operation for configure where you're re-running it so frequently as to be a real time-waster. Unless you've got a source base so screwed up that it needs to ./configure every time you type 'make'. Things built using xgettext can come awfully close to that situation, but that's the only thing I've ever seen in the ballpark.

c.

Re:mod article up! (1)

brendan_orr (648182) | more than 10 years ago | (#9214117)

Though you must remember, not all OSes/distros utilize a packaging system, nor should they have to change their ways. For libraries on POSIX systems, looking in the library directories is sufficient for version information; for non-POSIX it might be a little harder (perhaps on the order of "stamping" a standard string format in ASCII within the binary). For user-space programs it's very easy (if the programmers are smart)... "-v" ...My $0.02

Re:mod article up! (1)

Mr. Piddle (567882) | more than 10 years ago | (#9223283)

auto(conf|make) sucks badly

Worst understatement ever!

Re:mod article up! (1)

blackcoot (124938) | more than 10 years ago | (#9223462)

it sucks so badly that i couldn't find a better way to express just how abhorrent the experience of dealing with auto* is. i was going to say something about how legions of puppies and kittens and other cute little furry animals die torturous deaths at the hands of commie nazi pedophile terrorist hippies. urm. yes. you see the problems i was having ;-P

Two suggestions: (2, Informative)

warrax_666 (144623) | more than 10 years ago | (#9213459)


These are both sort of combined configuration and build systems (which is the way it should be, IMHO). SCons requires Python (>=1.5.2, IIRC), so it is "only" as portable as Python itself (which is to say "very"), while CMake doesn't require anything except a C++ compiler. The actual "language" in SCons is just regular Python, while CMake uses a hideous custom language.
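For a feel of the difference, an SConstruct file is ordinary Python; a minimal sketch (hypothetical file names; Environment and Program are builders supplied by SCons itself, so this only runs under scons):

```python
# SConstruct -- minimal sketch; 'scons' executes this file as Python
env = Environment(CCFLAGS='-O2')       # build settings live in an Environment
env.Append(CPPPATH=['include'])        # plain Python calls, no custom language
env.Program('hello', ['src/hello.c', 'src/util.c'])
```

Since it is just Python, conditionals, loops, and helper functions come for free instead of needing a macro language.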

Two awful suggestions (4, Informative)

sofar (317980) | more than 10 years ago | (#9213479)


they may be a drop-in replacement for developers, but for packagers and people trying to track changes and new versions, both cmake and scons (blender!) are horrible. They cost us (a group of 10 people working on a distro) an enormous amount of extra time (blender's upgrade to .33 took me a whole day to figure out, whereas before it only took me 20 minutes to fully test a new blender version).

all in all, autoconf may be a problem for developers, but for packagers it is still *by* *far* the best.

Re:Two awful suggestions (0)

Anonymous Coward | more than 10 years ago | (#9213839)

blender's upgrade to .33 took me a whole day to figure out, whereas before it only takes me 20 minutes to fully test a new blender version

AHAHAHAHAHAH! Fully testing a new blender version in 20 minutes! Yeah, right.

Besides: If people are muckin' around with the build scripts and whatnot you should EXPECT to have to work more on packaging. The same goes if they had been mucking around with an automake/autoconf'd build system. Tough shit.

Re:Two awful suggestions (1)

Eivind Eklund (5161) | more than 10 years ago | (#9236997)

As a non-Linux packager, I want autoconf to die. It makes building a software package into a random event - either it works (by magic), or debug hell is afore you. Somewhat like installing Windows software.

I actually found it easier porting software back before anybody made any attempt at making it fully automatic. autoconf's use of sh as a back-end language for a compiler for an auto-detect language often makes it necessary to muck about in the "object files" (sh files), and reverse engineering these is harder than reverse engineering normal object files.

My opinion is that autoconf must die. Its model ("each package shall run a 10,000+ lines shell script to attempt to guess what is installed") is wrong, and the release engineering done by the autoconf team does not seem to be particularly good, either (there are a bunch of incompatibilities between minor versions).

The classic portability systems had one configuration file per system type, and had the user select one of these. This was much easier to deal with. If autoconf had been used only to write out a sample configuration file that should match the user's system, it would IMO be reasonably OK. As it is, it often is an utter pain unless it works on first try.

Eivind.

Re:Two suggestions: (0)

Anonymous Coward | more than 10 years ago | (#9213591)

I actually use Ruby scripts to configure the software. Ruby is very cross-platform... so just require it. If your target is too old or incompatible to have a Ruby interpreter, then boo hoo. Most computers (even my Zaurus!) can handle Ruby. It works great. If you want make-like functionality, check out Rake. Usually, though, a custom script will do.

AND it is not too hard to pick up a project and rewrite its build logic in a Ruby script from some build system... it is usually worth the time. SCREW autotools. These are the worst thing to come to the *n*x world... well, almost (SCO beats that out).

Re:Two suggestions: (3, Insightful)

jrumney (197329) | more than 10 years ago | (#9217177)

Using a Ruby-based configuration makes sense if your project uses Ruby anyway, just as using Ant makes sense for Java projects, but for building standalone C/C++ projects, non-standard build tools are just another dependency that will ensure that no one will bother to use your project because you set the barriers too high. The benefit of autoconf/automake is that only the developers need to use it. End users run a pre-built configure script, which runs in /bin/sh.
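For illustration, the developer-side input is tiny compared to what it generates; a minimal configure.ac (hypothetical project name) might be:

```
AC_INIT([hello], [1.0])
AC_PROG_CC
AC_CHECK_HEADERS([unistd.h])
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

The developer runs autoconf once to turn this into the portable configure script; the end user only ever runs ./configure && make, with nothing but /bin/sh and make required.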

Re:Two suggestions: (1)

Mr. Piddle (567882) | more than 10 years ago | (#9223296)

The benefit of autoconf/automake is that only the developers need to use it.

Developers use it, users suffer under it. I've spent so much time fighting with autoconf on non-vanilla-GNU/Linux systems that it would have saved time to just edit the fucking source code to get it to compile.

Re:Two suggestions: (0)

Anonymous Coward | more than 10 years ago | (#9224112)

Funny, that was the same thinking that made me start the PMK project :)

Re:Two suggestions: (0)

Anonymous Coward | more than 10 years ago | (#9224139)

So to have an autoconf replacement you use SCons. But SCons uses Python. And Python needs autoconf to build.

Isn't that a bit complicated for a replacement?

I'm maybe biased, but I think that for such a tool a minimum of dependencies is required.
The idea behind autoconf was nice: use a shell script so it's portable and needs zero dependencies, since every Unix system has a shell. It failed because these scripts are not so portable (just look at the BSD ports trees if you don't believe me). Also, a shell script can easily be trojaned (some scripts are somewhat *heavy*) and can then become a threat (remember what happened not so long ago...).

Damien

problem inevitable (5, Insightful)

fraccy (780466) | more than 10 years ago | (#9213462)

I feel your pain, but this isn't just autoconf. It's a general theme of the way we compute: version nightmares. I think the problem is unavoidable because of the way we currently compute: 1) competition and the enormous diversity today will always lead to heterogeneous systems, no matter how good the intentions initially; 2) the semantics of software and its environment are not embedded in the data, which of course means that when a version changes, something somewhere breaks, and someone somewhere has to fix it. There are only two solutions: either decreasing diversity through standardisation (oh heck, the version of that keeps changing too), or real autonomic computing operating at a higher level of abstraction. Roll on autonomic computing, real self-configuring and self-adapting systems. Until then, we can only attempt to minimise the problems, and can only ever solve them in a limited scope for a limited period of time.

Re:problem inevitable (-1, Troll)

Anonymous Coward | more than 10 years ago | (#9213573)

I think the problem is unavoidable because of the way we currently compute: 1) Competition and the enormous diversity today will always leads to heterogenous systems, no matter how good their intentions initially

This is why I truly and honestly believe that Microsoft Windows is the best way to go. They only release a new version every few years and it is generally very good with backward compatibility. Plus they frown on compiling stuff from scratch unless you're a commercial company anyway so no nasty autoconf needed.

Re:problem inevitable (1)

fraccy (780466) | more than 10 years ago | (#9217756)

Yes, I hear you, in a sense: there is a bonus to be had from an entire environment that is engineered to work together. This, however, is an example of the "limited scope" and "standardisation" I referred to, and not everyone, as we see, uses Windows. Also, you would be mistaken to think the Windows environment is free of the problem. There are different versions of Windows, each with their own incompatibilities, and furthermore it would be a stretch of the imagination to say third-party Windows software (or Microsoft's own) does not suffer from the same problems :) Heterogeneity has a positive role as well as a negative; thank god we're not all slaves to Bill Gates, what a world that would be.

Re:problem inevitable (0)

Anonymous Coward | more than 10 years ago | (#9227390)

"This is why I truly and honestly believe that Microsoft Windows is the best way to go."

Are you smoking crack????
Have you ever developed for windows???
Have you never heard the phrase "DLL Hell"???

I develop on Windows (I barely use Linux), and even in a closed-shop environment, working out the dependencies and conflicts is hell. And don't even bring up Win2000's new "solution" to the problem. It creates just as many problems as it solves.

Re:problem inevitable (1)

RAMMS+EIN (578166) | more than 9 years ago | (#9231021)

Various Unices realized the problem of incompatible libraries long ago and came up with a solution: they append the version number to the filename.

So, when your app needs a feature introduced in libfoo version 2.1, it would (depending on the naming scheme :-( ) link against libfoo2.so.1. Whether that is actually libfoo2.so.1.3 or libfoo2.so.2.4 does not matter; the idea with version numbers is usually that different major numbers indicate incompatible versions, minor number changes indicate feature additions, and micro version numbers indicate bugfixes.

On Windows, the same library would typically be named foo.dll, or foo2.dll if you're lucky, but, as you can see, the version control is much coarser as far as the filename is concerned, making the problem worse on Windows.
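On disk, that scheme usually shows up as a chain of symlinks (hypothetical version numbers; exact names vary with the naming scheme, as the parent notes):

```
libfoo.so        -> libfoo.so.2        # dev link: what '-lfoo' finds at link time
libfoo.so.2      -> libfoo.so.2.1.3    # soname link: resolved at run time
libfoo.so.2.1.3                        # the real file; minor/micro updates
                                       #   replace it without relinking apps
```

Only the major number is baked into what applications load, so compatible upgrades slot in transparently.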

scons? (1)

ville (29367) | more than 10 years ago | (#9213473)

How about scons [scons.org] ? I must admit I hardly used all of autotools' potential as they were quite complicated, so I can't say if scons offers anything even close to what autotools had.

// ville

Re:scons? (1)

noselasd (594905) | more than 10 years ago | (#9213490)

I second scons. It's perhaps more a replacement for make, but I don't think I'll ever write a makefile again after discovering scons, and the autotools are just way too complicated to bother with.

You have been 's CONNED (4, Interesting)

sofar (317980) | more than 10 years ago | (#9213504)

SCons isn't the solution either. SCons relies way too heavily on Python and doesn't make better distribution tarballs. Most developers using SCons roll out horrible tarballs that cannot even detect the proper Python version! (blender!!!)

SCons makes you lazy; do the work, and your application builds BETTER on MORE platforms with autoconf.

Re:You have been 's CONNED (0, Flamebait)

Anonymous Coward | more than 10 years ago | (#9213684)

STFU, just because you're too stoopit to figure out how scons works doesn't mean it's no good! Fuck packagers, I, as a developer, want an easy to use build system.

Re:You have been 's CONNED (1)

be-fan (61476) | more than 10 years ago | (#9221934)

Ugh. Autoconf is such an ugly piece of shit. It's nearly impossible, for example, to create a KDE application without copying the autoconf/automake mess of an existing setup. And you're hosed if you need to do anything moderately unusual in your build procedure. In the end, 'make' is really just a one-off, hackish, single-purpose programming language. It's much better to replace it with a real, general programming language.

Try PMK, for example (5, Informative)

wsapplegate (210233) | more than 10 years ago | (#9213507)

You can find it at pmk.sourceforge.net [sourceforge.net]

Or else, you can have a look at A-A-P [a-a-p.org] , by nobody else than Bram Moolenaar, the author of the One True Editor, a.k.a. ViM :-)

There is also Package-framework [freshmeat.net] , by Tom Lord, the author of the infamous Arch [gnuarch.org] SCM.

I was about to mention SCons, too, but other people already did (it always pays to check other comments just before posting, especially on /. :-)

To sum it up: there is no shortage of alternatives to the incredibly hairy Autoconf/Automake nightmare. The problem is, people are still using them for the very same reason they use CVS instead of Arch/Subversion, or Sendmail instead of Postfix/Exim: because they're considered ``standard'' tools, and people feel more comfortable with software they know to be used by plenty of other people (millions of programmers can't all be wrong. Can they?). I really hope they'll stop making this kind of mistake soon, so I won't need to curse them every time I have to debug some Autoconf breakage...

Re:Try PMK, for example (0)

Anonymous Coward | more than 10 years ago | (#9219114)

pmk++ It rocks. Autofuck and friends need to be replaced by something that doesn't suck. pmk, as far as I can tell, is that replacement. Switched a year ago and haven't had a problem. qmake works too: I dig it, but pmk is probably the open source solution that the entire BSD community should drift and congeal around. Linux, on the other hand, can stick with its autofuck nightmare until sufficient pain has been induced and people start saying, "I will not use stupid/shitty software," repeating the phrase every two minutes until they've converted all of their bits to pmk.

Re:Try PMK, for example (2, Insightful)

cgreuter (82182) | more than 10 years ago | (#9221292)

I took a quick look at pmk a while back. I got as far as this FAQ entry:

7. Why not supporting cygwin?

Because cygwin is not handling executable files as it should. It is absurd to have to take care about a trailing '.exe' under a Unix-like environment.

Absurd it may be, but Windows is the most popular platform out there and refusing to support it because it's too icky is just plain dumb. They've refused to make pmk useful enough to actually be valuable to me, so I haven't bothered using it for anything.

Re:Try PMK, for example (0)

Anonymous Coward | more than 10 years ago | (#9224024)

What you missed is that PMK is aimed at POSIX systems.

If someday someone is interested in making a PMK port for Windows, then he just has to contact me or another member of the team. We'll be happy to help him use the common base for a specific port (and also to include him in the team, of course).

Speaking of Cygwin, we will support it when they do things nicely. I see nowhere that Unix binary filenames must have a .exe extension. When Cygwin is able to manage this transparently (just as it should be), then we'll see...

Oh, BTW, I have no problem boycotting the "most popular" game console OS. Yes, I mean Windows :)

Damien

Re:Try PMK, for example (0)

Anonymous Coward | more than 10 years ago | (#9227755)

And you missed the part of the original question about being "widely portable."

Re:Try PMK, for example (0)

Anonymous Coward | more than 10 years ago | (#9228929)

I've not missed anything, as I was replying to criticism of my project.

If you're not able to write constructive comments, you should just shut up.

Damien

Re:Try PMK, for example (1)

cgreuter (82182) | more than 9 years ago | (#9232798)

What you missed is that PMK is aimed to POSIX systems.

The bit I saw was that it was supposed to be a replacement for GNU autoconf. Autoconf is designed to automate away the differences between platforms, and if PMK is restricted to POSIX, that makes it insufficiently flexible for my purposes.

(Autoconf is designed to work on non-POSIX systems. That's why the shell code is so hacky--it's got to work around every Unix's own shell weirdness.)

If someday someone has interests to make a PMK port for windows then he just have to contact me or another member of the team. We'll be happy to help him using the common base for a specific port (and also to include him in the team of course).

Ah. Well, I seem to have misunderstood the FAQ item then. I read it as meaning that you would never support Cygwin (or Windows, by extension) because of the '.EXE' thing. It might be worth clarifying your stance on that.

In any case, you've written a program designed to automatically take care of all of the quirky differences between platforms, then refused to support one of the more common platforms because it's got all of these quirky differences. Do you see why I'm not all that enthusiastic about it?

Oh,BTW i have no problem to boycott the "most popular" game console OS. Yes i meant windows :)

You are, of course, entitled to do this--it's your project after all. But it's sort of like wolves boycotting sheep. Microsoft benefits from lock-in. The easier it is to port applications to (and from) Windows, the easier it will be for your average person to migrate away from Windows.

Re:Try PMK, for example (0)

Anonymous Coward | more than 10 years ago | (#9235884)

The bit I saw was that it was supposed to be a replacement for GNU autoconf. Autoconf is designed to automate away the differences between platforms and if PMK is restricted to POSIX, that makes it insufficiently flexible for my purposes.

PMK is an alternative that is targeted to POSIX systems. It never aimed to be a clone of autoconf.

(Autoconf is designed to work on non-POSIX systems. That's why the shell code is so hacky--it's got to work around every Unix's own shell wierdness.)

You're right; that's also why a lot of people don't like autoconf. But if you are building from sources on platforms other than Linux, then you must know that there are many cases where configure scripts need to be modified to run correctly. So the goal of portability is not really reached.

Ah. Well, I seem to have misunderstood the FAQ item then. I read it as meaning that you would never support Cygwin (or Windows, by extension) because of the '.EXE' thing. It might be worth clarifying your stance on that.

We will not support Cygwin until they make it run as it should. This means until they fix the .exe problem, which should not exist.
I insist on the fact that Windows and Cygwin are not the same platform in my mind. Cygwin is a Unix environment that should work transparently, but it does not.

See the previous statement in the FAQ:
# Do you plan to make it running under windows ?
No because we want to keep this project as simple as possible. But if some people want to port pmk to windows they can contact us.

Damien

Re:Try PMK, for example (1)

RAMMS+EIN (578166) | more than 9 years ago | (#9231910)

``Absurd it may be, but Windows is the most popular platform out there and refusing to support it because it's too icky is just plain dumb.''

Yay, a flamewar! I'll join! I see this issue the other way: it's not portable software not supporting Windows, it's Windows not supporting portable software.

Before Windows was created, there was the POSIX API. Though definitely inspired by UNIX, POSIX is an API that any operating system could support, and there are indeed non-UNIX systems that are POSIX-compliant. When the fine folks at MicroSoft designed Windows, they knew about POSIX. They thought about making their system compatible, and didn't.

I say that developers writing software that works on both POSIX and win32 systems are supporting a noble cause. However, people who choose to use Windows are opting out of lots of great software. In return, they get other software that POSIX systems do not support.

Asserting that developers are dumb because they do not support a system that was designed to be incompatible is, I would say, misguided. Of course, it's your choice to ignore their project because it doesn't support your favored platform, but please don't call them dumb (you might even consider apologizing).

And by the way, I challenge your statement that Windows is the most popular platform. It may be the most used platform on desktops, but it's not like every user willingly chose to use it. Many are forced to, and others don't know any better. Outside desktop circles, Windows is often frowned upon.

Re:Try PMK, for example (1)

cgreuter (82182) | more than 9 years ago | (#9233271)

Yay, a flamewar! I'll join!

I'm not usually very good at flamewars, so just pretend I wrote a lot of incoherent abuse here.

Asserting that developers are dumb because they do not support a system that was designed to be incompatible is, I would say, misguided.

If you go back and read my post, you will see that I'm referring to the behaviour as dumb, not the people. That's a big distinction. These folks have written a working non-trivial software package and that requires some smarts. I'm talking about one particular decision of theirs.

But anyway, let me spell out my reasoning:

  • Autoconf works around all kinds of different platform-specific weirdnesses (many of which predate POSIX, BTW) so that my programs will be widely portable.
  • PMK tries to be a replacement for autoconf.
  • Therefore, PMK should also do the same sort of platform-specific workarounds for me.
  • PMK does not do this for Cygwin while autoconf does.
  • Therefore, autoconf solves the portability problem more effectively than PMK does.
  • Therefore, PMK is less useful than autoconf.
  • The PMK team disapproves of that particular part of the problem (Cygwin has a quirk they don't like) and so refuses to try to solve it.
  • Therefore, the PMK team is deliberately refusing to solve part of the problem they claim they are trying to solve (i.e. replacing autoconf).
  • This is dumb.

I mean, not supporting Windows is all well and good (and the sort of thing I do on occasion) but when your stated goal is to make software more compatible across platforms, you can't do that without making yourself a liar.

I could understand it if they'd said, "we're looking for someone to do a Cygwin port because we can't be bothered." But what they said was, "Cygwin is broken and we're not going to work around the problem because they should fix it." This for a tool whose very purpose is to work around those sorts of problems.

(Note: see my followup to Damien one thread over. I may have partially misunderstood their position on this, in which case I could be wrong in my claims of dumb behaviour. But this is how I see things and my conclusions are derived from that.)

And by the way, I challenge your statement that Windows is the most popular platform. It may be the most used platform on desktops, but it's not like every user willingly chose to use it.

Whether or not people like it is irrelevant to my argument. Windows is the operating system of most of the desktop computers in the world. If you want to write a program and have it work on the biggest number of computers possible, you have to make sure it works under Windows. As a developer, the best I can do is develop under Unix and use cross-platform tools so that I don't have to touch Windows very much. This way, my customers have a choice, but a lot of them are still going to choose Gatesware and I need to provide that.

(My own take is that the vast majority of PC users really don't care what operating system they can run, as long as they can go to the store, buy a piece of software and just have it work. People who care about OSs have mostly already switched to something else. But that's beside the point.)

Re:Try PMK, for example (1)

RAMMS+EIN (578166) | more than 9 years ago | (#9233403)

Sh.t, that wasn't a flamewar, that was healthy discussion! You spoiled it. ;-)

Anyway, I must go to bed now, or I'll be writing incoherent abuse because I'm incapable of doing anything else. Thanks for your explanation!

Re:Try PMK, for example (0)

Anonymous Coward | more than 10 years ago | (#9236068)


If you go back and read my post, you will see that I'm referring to the behaviour as dumb, not the people. That's a big distinction. These folks have written a working non-trivial software package and that requires some smarts. I'm talking about one particular decision of theirs.

But anyway, let me spell out my reasoning:


And that's where I also give my own reasoning.


* Autoconf works around all kinds of different platform-specific weirdnesses (many of which predate POSIX, BTW) so that my programs will be widely portable.
* PMK tries to be a replacement for autoconf.


Both are right. But PMK does not target the same platforms as autoconf (this means we don't try to rule the world). Briefly, we provide an alternative for those who are not happy with autoconf (and in the first place: us :).


* Therefore, PMK should also do the same sort of platform-specific workarounds for me.


It seems that PMK does not fit your needs.


* PMK does not do this for Cygwin while autoconf does.


What you don't understand is that PMK would work as well if Cygwin had the behavior it should have.


* Therefore, autoconf solves the portability problem more effectively than PMK does.


I totally agree, but as previously stated we have different goals.


* Therefore, PMK is less useful than autoconf.


You forgot to append "for me" at the end. For people who only care about POSIX systems (as I do), PMK is just fine.


* The PMK team disapproves of that particular part of the problem (Cygwin has a quirk they don't like) and so refuses to try to solve it.


Wrong! Cygwin has a BUG, and it's not up to PMK to solve it. Its support for binaries is not complete; that's the point you missed.


* Therefore, the PMK team is deliberately refusing to solve part of the problem they claim they are trying to solve (i.e. replacing autoconf).


This proves that you totally misunderstood the concept of PMK.


* This is dumb.


No, this is a different point of view.


I mean, not supporting Windows is all well and good (and the sort of thing I do on occasion) but when your stated goal is to make software more compatible across platforms, you can't do that without making yourself a liar.


It depends on what you mean by platforms. Again, our platform target is only POSIX systems.


I could understand it if they'd said, "we're looking for someone to do a Cygwin port because we can't be bothered." But what they said was, "Cygwin is broken and we're not going to work around the problem because they should fix it." This for a tool whose very purpose is to work around those sorts of problems.


I can't understand your point here. The Cygwin goal is to give a Linux-like environment on Windows. So the expected behavior would be for things to work under Cygwin just as they do under Linux. Under Linux, do you look for pkg-config or pkgconfig.exe? For me this is a Cygwin bug.

That said, you should really look at our goals. We don't try to clone autoconf but to provide an alternative. This means that there are things we don't like in autoconf and that we try to fix in PMK. One of these is the abundance of tweaks for non-POSIX platforms. These platforms could be supported by specific ports, but at present I have neither the time nor the need to do that.

I hope the situation is clearer now.

Damien

Language Standards (-1, Flamebait)

Anonymous Coward | more than 10 years ago | (#9213799)

This'll be taken as a troll I'm sure, but I always found that sticking to the language as it is defined negates the need for autoconf.

I've written more portable, non-trivial C code than I care to remember and never needed to use autoconf. If you need platform specific code (eg. nontrivial file handling) then you code appropriately, which you have to do even if you use autoconf because autoconf doesn't handle that kind of thing.

It's a completely unnecessary and worthless project catering for people who don't fully understand the language they're coding in.

Re:Language Standards (-1, Flamebait)

Anonymous Coward | more than 10 years ago | (#9214747)

Moderated "-1 flamebait" by someone who's so insecure he can't stand to be told that he doesn't understand the language well enough to do without crutches like autoconf.

And the number of people who complain about buggy code! I'll tell you what causes buggy code: people who don't understand the language they're coding in. Now, if all the kiddies fuck off and leave the coding to the big boys, we can let projects like autoconf die and get to work on producing quality code.

Re:Language Standards (0)

Anonymous Coward | more than 10 years ago | (#9221352)

Another "-1 flamebait" by an idiot moderator who fails to understand how a computer works. Continue like this and all the world will be able to see the worthlessness of the Linux culture. Twats who think they can code but can't.

Consider this, numbnuts: the BSD world manages without autoconf, but the kiddie, wannabe Linux coders need hopeless tools to massage their egos.

Yes. Autoconf. The only GNU tool that is completely superfluous in a world of grown up programmers.

Solution to the Problem (2, Interesting)

aminorex (141494) | more than 10 years ago | (#9213943)

> Is there something else out there that is as
> portable as autoconf to use instead?

Yeah. It's called GNU Make.

Seriously, if you write your makefiles and your
code in a responsibly portable manner, there's
absolutely no reason for autoconf or automake.
And it's not hard. I've done it repeatedly.
The auto* tools are an antipattern virus.

Re:Solution to the Problem (1)

wsapplegate (210233) | more than 10 years ago | (#9214242)

> Seriously, if you write your makefiles and your
> code in a responsibly portable manner, there's
> absolutely no reason for autoconf or automake.

Well, that's right, except sometimes you can't avoid some #ifdef quirkiness (because a function has to be invoked with different parameters in some foreign C library you target, or because Windows uses '\' instead of '/' as a directory delimiter, etc.). In these cases, the simplest way to go is to write a Makefile.Linux, a Makefile.FreeBSD, a Makefile.Win32, and so on, and instruct the user to cp Makefile.{your_arch} Makefile && make. Which works, of course, except it's more work for you (and more chances you'll make a mistake in one of the Makefiles) and a little annoyance for the user. Build configuration systems are supposed to do the tedious work for you, and if they _really_ did, that would be great. The real problem is that, instead of easing the build process, they (at least the evil Autoconf) frequently add a layer of complexity and misleading errors...

Re:Solution to the Problem (0)

Anonymous Coward | more than 10 years ago | (#9214934)

Windows accepts forward slashes internally (e.g. for things like fopen()). Also, backslashes require escape sequences, which look ugly. Just use forward ones, and remember to watch your case sensitivity even though Windows won't pick it up.

Re:Solution to the Problem (5, Insightful)

grubba (89) | more than 10 years ago | (#9215922)

So how do you portably detect which taste of system calls you have on the OS without autoconf, and without an explicit database OS <=> feature?

eg:

  • Is your getservbyname_r OSF/1- or Solaris-style?
  • Does your getspnam_r take 4 or 5 arguments?
  • Does your struct hostent have the field h_addr_list?
  • Are you on a Linux system with a broken <sched.h>?

All of the above are easily detectable with autoconf.

I however agree with you that there's absolutely no need for automake.

Re:Solution to the Problem (2, Informative)

aminorex (141494) | more than 10 years ago | (#9219039)

That's what ifdef is for.

But more importantly, if you're writing application
code using a system call layer, you've already lost
the game.

Re:Solution to the Problem (3, Informative)

Anonymous Coward | more than 10 years ago | (#9219489)

I had to answer you anonymously so I could mod the parent up.

Of course that's what ifdef is for, but the ifdef isn't dangling; it's always:

#ifdef SOMETHING

and where do you think the SOMETHING comes from for cases like:

* Is your getservbyname_r OSF/1- or Solaris-style?
* Does your getspnam_r take 4 or 5 arguments?
* Does your struct hostent have the field h_addr_list?
* Are you on a Linux system with a broken <sched.h>?

I'll tell you: autoconf does some checks, possibly test-compiles, basically discovers the local landscape and sets a LOAD OF PREPROCESSOR SYMBOLS (OK, and custom makefiles, blah blah; what do you think holds the #defines anyway), and then compiles the code full of #ifdefs.

Sam

Re:Solution to the Problem (1)

natersoz (239301) | more than 10 years ago | (#9226331)

Absolutely true. The fact is that Linux is the future and the distros are pretty standard.

Once the Makefile is written, it's easy to modify for cross-compiling - something that's ridiculously hard under auto-crap.

GNU MAKE forever!

Use the same versions (-1, Troll)

Anonymous Coward | more than 10 years ago | (#9214179)

Generally, when I have worked on projects in the past where the GNU Autotools were used, all the developers were required to use the same versions of these tools. Since end users don't need autoconf, etc., this wasn't really a problem. I agree that this isn't a perfect solution, but the benefits of using the Autotools outweighed the negatives of all developers needing the same version.

Re:Use the same versions (0)

Anonymous Coward | more than 10 years ago | (#9221647)

Look dumbass, your situation has nothing to do with what we are talking about.

If you have control over the group enough to mandate a specific version of autotools, why not just mandate a specific development platform, and then you can just use Makefiles or shell scripts or whatever? Chances are, you guys DID mandate a specific development setup. So what was the point of autotools?

PARENT UNFAIRLY MODDED (1)

Rob Riggs (6418) | more than 10 years ago | (#9228177)

Parent is not a Troll. This is actually a very sound statement. Build dependencies are just that: build dependencies. "You want to build my cool new software package? OK, you'll need an ANSI C compiler; Bourne shell; libs X, Y and Z; and automake version A, autoconf version B and libtool version C."

Frequently you'll see statements like this in the INSTALL documentation of an application: "I've tested this on the following OSes: Linux (Fedora Core 1), Solaris 9 (SPARC, not x86), OpenBSD, and Plan 9. I've had reports that it works on HP-UX and AIX 5L. If you get it working on UNICOS (and can let me know about it) drop me a line with your patches."

This is perfectly sensible. Heck, it's even quite common to have multiple versions of autotools installed to deal with all of the build dependencies one has. (I have 3 automake versions installed on the workstation I now write from.) Nothing wrong with that, really.

I think the story submitter may think it's a problem to have multiple autotool versions installed, but I don't think he knows what the problem really is. Perl doesn't use the autotools, and if Cray wants to help support its users, it could provide a UNICOS build system for OSS developers to test on like IBM does/did with AIX 5L.

Re:PARENT UNFAIRLY MODDED (0)

Anonymous Coward | more than 10 years ago | (#9228944)

I think the story submitter may think it's a problem to have multiple autotool versions installed, but I don't think he knows what the problem really is.

I'm sorry, but I'm not sure you know what the problem is. In a perfect world you would only need the configure script in the package, and it should build without having autoconf installed on the end-user's computer [gnu.org] .
In the real world, that is not the case...

Damien

Re:PARENT UNFAIRLY MODDED (1)

Rob Riggs (6418) | more than 9 years ago | (#9232777)

I'm sorry, but I'm not sure you know what the problem is. In a perfect world you would only need the configure script in the package, and it should build without having autoconf installed on the end-user's computer. In the real world, that is not the case...

Oh, I completely agree. Until a package is actually built on a platform, one may not know what sort of things really need to be checked for or changed by the configure script. It's also quite likely that not all of the autotools macros have been tested on the less widely available OSes. That's where the folks wishing to use open source software on less popular platforms bear some responsibility.

I don't for a minute think that it is easy, but it is what we've got. And the only way to make things easier is through standardization, and not with a new configuration tool.

Well,if you've got to build software for Crays ... (3, Interesting)

Anonymous Coward | more than 10 years ago | (#9214704)

Chances are the other packages people here are talking about haven't ever been built on a Cray either. A Cray is not exactly commonplace.

Usually, the simplest way to deal with broken scripts/automake/autoconf tests is to use a better shell. Take bash, tcsh, or whichever shell works on Unicos, and run your autoconf tests through that. If you think you've found a problem in autoconf itself, run the regression test suite and submit a bug report.

A quick Google search over the autoconf mailing list archive shows that there were 0 posts on Unicos in the last two years. So chances are that any Unicos-specific bugs in autoconf from the last two years simply haven't been reported, or that it has no bugs on Unicos ;) I'd think that the autoconf developers would appreciate a good bug report, just like any other project would.

If your problem is autoconf's performance, then a) use a faster shell, b) use --cache-file, c) use ccache to run those myriads of little C tests faster. Or fix the tests in question.

A classic on slow platforms is the maximum command-line length test. That one comes from libtool, not from autoconf. It's been vastly improved in libtool 1.5.6, for example. If your upstream doesn't use a recent version of libtool, convince them to do so. :)

Damn (1)

jcorgan (30025) | more than 10 years ago | (#9214852)

I initially read the headline as, "Alternatives to Ashcroft." Damn, I wonder what's on my mind these days?

Ant (-1, Flamebait)

time4tea (193125) | more than 10 years ago | (#9214882)

What kind of apps are you writing that need C++ anyway?

% ant

Builds anywhere.

Translation (was Re:Ant) (3, Funny)

happyfrogcow (708359) | more than 10 years ago | (#9217537)

What kind of ${CRAP} are you writing that need ${YOURLANGSUCKS} anyway?

${MYLANGROXORS}

${FSCKYOU}

Re:Translation (was Re:Ant) (0)

Anonymous Coward | more than 9 years ago | (#9231878)

Ah yes ha ha.

However, the point, not well made you might argue, is valid.

If the OP had asked "how do I write a 60fps FPS in Perl", you might direct him/her to C++.

As they asked "how do I mangle some C tool to help me write software that runs on multiple platforms", perhaps a different language might help. It's why there is more than one language, after all.

Just 'cos you're a leet c0der in C doesn't mean it's the most appropriate tool for the job.

One can argue that C (and C++) is almost never the right tool for the job now. Any exception you might point out simply proves the rule.

Perl? (0, Flamebait)

Neon Spiral Injector (21234) | more than 10 years ago | (#9214936)

Perl doesn't use autoconf. It does have a script that sort of mimics autoconf, but isn't autoconf. Also, most Perl extensions use `perl Makefile.PL` to configure themselves.

Re:Perl? (1)

hummassa (157160) | more than 10 years ago | (#9215007)

perl itself is autoconf'ed. perl Makefile.PL is just for XS modules

Re:Perl? (1)

Neon Spiral Injector (21234) | more than 10 years ago | (#9215106)

Nope, like I said, Perl has a script that works like autoconf, but isn't.

From Perl's configure.gnu:
#! /bin/sh
#
# $Id: configure,v 3.0.1.1 1995/07/25 14:16:21 ram Exp $
#
# GNU configure-like front end to metaconfig's Configure.
#

I have a replacement (4, Interesting)

Chemisor (97276) | more than 10 years ago | (#9215211)

I was pretty fed up with autoconf myself, and wrote a little C app to emulate it. Initial ./configure time dropped to a second and reconfigure is instantaneous. I would recommend it for your simpler projects: bsconf.c [sourceforge.net] and bsconf.h [sourceforge.net] , the latter being the configuration file.

how about java? (0, Flamebait)

demian031 (466963) | more than 10 years ago | (#9216502)

no flame-bait i swear.

Even though 'write once, debug everywhere' makes a great bumper sticker, that doesn't mean it's true. Seriously, Java is a good answer here.

Re:how about java? (0)

Anonymous Coward | more than 10 years ago | (#9221266)

Yeah. Because Java rocks on UNICOS.

Re:how about java? (2, Interesting)

tradervik (462791) | more than 10 years ago | (#9223399)

Indeed. We develop server-side code on Windows (2k and XP), and deploy on Linux, Windows, and Solaris. With three major products representing millions of lines of code developed over a period of approximately 6 years, we have never had a single instance of platform incompatibility.

Hate to say it, but ... (4, Insightful)

Russ Steffen (263) | more than 10 years ago | (#9217167)

... Those who do not understand autoconf are doomed to reinvent it. Poorly.

Re:Hate to say it, but ... (1)

Mr. Piddle (567882) | more than 10 years ago | (#9223325)


And I thought autoconf was just a poor reinvention of portable makefiles because programmers are too lazy to give a damn about real portability and high-quality software. It seems that the most basic abstraction in software is beyond most programmers' grasp (yeah, let's write a precompiler precompiler...that's simply genius (famous last words)).

Re:Hate to say it, but ... (1)

RAMMS+EIN (578166) | more than 9 years ago | (#9231796)

How do true standards sound to you? No autoconf needed anymore!

Autotools are a nightmare (2)

spitzak (4019) | more than 10 years ago | (#9217667)

Case in point: I wanted to compile cairo [cairographics.org] on my Mandrake 9.1 system. I couldn't until I edited the autoconf file to remove "new commands" and added phony files to make pkgconfig happy. Then it compiled just fine and worked. I tried to compile the demos and was completely frustrated and eventually hand-wrote a trivial makefile and they all compiled just fine and worked (except for the GTK one...). I am now trying to compile the "Glitz" OpenGL backend, and I am running into the same troubles: I can't prove it yet, but I strongly suspect it will compile just fine on my machine, if I can just get around the mysteries and complaints of autoconf/automake/pkgconfig, probably by wasting a great deal of time and divining the basic, and probably simple, Makefile that would compile it.

The incredibly frustrating, and dare I say stupid thing is that the only thing I need to "update" on my machine is the damn autoconf tools! I actually have all the libraries and headers these things need. That is completely backwards! In fact due to autoconf they have pretty much said "this only compiles on the very newest experimental Linux system". Well in that case, you have eliminated the need for "autoconf" and you could send out Makefiles that work only on a new Linux system. That would probably be easier for somebody like me to edit and get working on an older system.

When you do try to fix this, you run into the horrifyingly bad syntax and rules of M4 and GMake. Supposedly this is because they want to be backward-compatible and run on older systems. But they lie, since they freely add new "autoconf" commands so that the newest version is needed. Why not scrap the whole thing and try a modern syntax?

My proposed solution: make ONE compiled program that does "configure" and "make" and "make depend" (and "install" and "uninstall" and "clean" and "dist" and all the other special-cased targets...). This program can use the existing automake/conf stuff so it can be compiled for multiple platforms.

The program then reads one file, in editable text (no XML, and it should be trivial to add/remove a source file by adding/deleting a line). This file should be parsed in a procedural way, with "if" and "for" statements and functions (i.e. it is Perl or something), and the result should be the dependency tree; the program can then run the commands (the result is extremely static and has actual filenames and commands, not templates or "rules").

Make it really easy to add and remove switches to the compiler. Make it save user answers to questions in a file so it can provide those answers again, and make a GUI program that provides panels and checkboxes to change those answers. Make it automatically check for dependencies in any C-style source files by looking for "#include" without running the compiler, and save the results in a binary database with date stamps so it can run this instantly as needed. Any package dependencies should be checked with "if" statements in the file; the program would have enough commands to do typical file system things like look for files or grep a file for something, and it is then trivial to write a file that checks for something in multiple places by using "if-else" statements.

Re:Autotools are a nightmare (1)

Rob Riggs (6418) | more than 10 years ago | (#9228058)

I wanted to compile cairo on my Mandrake 9.1 system. I couldn't until I edited the autoconf file to remove "new commands" and added phony files to make pkgconfig happy.

Unless I misunderstand the problem, autotools is not to blame for your woes. The way I interpret your complaint is that you did not have a build environment that satisfies the build dependencies of cairo.

Now, you might have a valid complaint that cairo does not adequately document its build requirements, or that it has build requirements that most users can not reasonably be expected to meet. (Both are common complaints of mine when dealing with bleeding-edge open source projects.) But to blame autotools for the problem is not quite fair.

Fix on wrong level (1)

Tobias Luetke (707936) | more than 10 years ago | (#9217715)

I think you are trying to fix the problem on the wrong level.

There is nothing wrong with autoconf, and there is nothing wrong with new releases of software depending on the latest {autoconf|make|libc|lib|.*} whatsoever.

Take Gentoo for example. After installing a minimal OS I can type emerge gnome and it will pull roughly 200 packages and compile them all cleanly. I honestly don't have any idea which of those 200 unrelated projects require which version of autoconf or whatever. Gentoo knows, and deals with it by figuring out which is the latest autoconf required and getting it early on. As soon as you have infrastructure that does the dependency resolution for you, the problem evaporates.

That being said, my Gentoo servers pull a new autoconf version about once a month for some random unrelated update. That's really a bit excessive.

Re:Fix on wrong level (1)

Rob Riggs (6418) | more than 10 years ago | (#9228118)

That being said, my Gentoo servers pull a new autoconf version about once a month for some random unrelated update. That's really a bit excessive.

I am starting to think the same thing. I completely understand why it is happening though. Every time behavior changes in a commonly used library or other common dependency (commercial or open source) managed by the autotools, those version changes must be reflected in the autotools.

The thing is, if you want the latest and greatest packages built from source, your common build dependencies are going to be the packages that change the most. And the tools that manage the complexity of build environments, the autotools, are the ones that will need the most frequent updating.

The only real way out... (1)

ameoba (173803) | more than 10 years ago | (#9217903)

...is to use a language that's not going to get bogged down in the mess of portability. Java's a good example in theory, but in practice it's known to be somewhat brittle. Python/Perl/Mozilla seem to do much better in practice.

This, of course, limits you to systems on which the language has been implemented but it allows you to push the burden of portability onto somebody else.

Re:The only real way out... (1)

bcrowell (177657) | more than 10 years ago | (#9219033)

Yup. C was designed as a systems programming language, which would let you get close to the hardware. That's why, for instance, an int isn't defined to be any particular number of bits. C was really created as an alternative to coding in assembly language. It was never conceived of as a language that would be used for writing huge, cross-platform end-user applications. C++ is likewise a fine language for certain purposes, but by designing it to be C-compatible, they gave it some of the same limitations and problems as C. The mere fact that people perceived a need for something like autoconf just indicates that there were serious problems with C/C++.

No, I'm not saying that C/C++ is useless. I'm just saying that different languages work well for different things. Every language represents a set of engineering trade-offs. A VW bug is a great car for what it's intended for. A pickup truck is just what you need for some other purposes.

Languages do, however, reach the point where they've outlived their usefulness. Nobody today would design a language exactly the way FORTRAN was designed. They might instead design a language that duplicated some of FORTRAN's successes within its limited domain (e.g., simple loop structures that are easy for the compiler to optimize), without repeating its mistakes (e.g., COMMON blocks). In the decades since C was created, one thing that's changed is that we really have garbage collection down to a science. No longer do you wait a long time while your application does garbage collection. Sure, real-time applications still shouldn't use it, but for typical end-user applications, primitive C-style memory allocation is just as much of an anachronism as FORTRAN's COMMON blocks. Yes, you can use other styles of memory management with C/C++, but if it's that simple, why do we still keep on finding out about new buffer-overflow exploits in sendmail, etc.?

Seems like most people's rationalizations for using C/C++ for big end-user apps today are either (a) it's fast, or (b) it's universally available. Well, performance isn't a good justification, because you can get the same performance by coding 99% of your code in Perl or Python, and only doing the most time-critical 1% in a language that compiles to machine code. Availability was an issue 15-20 years ago, but at this point, Perl and Python are available pretty much wherever you go.

Re:The only real way out... (1, Insightful)

Anonymous Coward | more than 10 years ago | (#9221525)

C code is more portable than code in any other language.

I speak from experience, having taken many odd projects and tried to get them to work on some even odder platforms. If you are writing code that needs to be widely used, say it implements a protocol or parses a file format you hope to be standard, C is your only choice.

If you are writing in Perl or Python, you are doing so because it is easier for you as the developer, not because it's easier or better for the person running it. In that case, one of two things must be true: you are very, very important, and your convenience can't be impinged upon at the expense of others; or the code is very unimportant, so spending effort to allow other people to use it is a waste.

Guess what the case is for your code ?

Re:The only real way out... (1)

skids (119237) | more than 9 years ago | (#9230129)

> ...is to use a language that's not going to get
> bogged down in the mess of portability.

This is a solution people should seriously consider -- a lot of project code doesn't need to be in C because it isn't performance-sensitive on the level that C addresses. Using a language that has a high degree of portability and the ability to extend into C for the places where you need to explicitly optimize may be a valid alternative to a robust configuration system. Now if only there weren't such acrimony between the camps of different languages...

However, there are other alternatives.

One is to make use of, and participate in the development of, portability abstraction libraries like GLIB, LibGG, etc. Projects like these hold a lot of promise for confining whole families of portability issues into a discrete package, and can have runtime advantages by increasing shared object use between unrelated applications. However, some have grown too extensive (like GLIB) and really need to be broken down into more useful pieces. Such libraries tend to evolve from larger projects, and unfortunately carry paradigms and conventions from those projects along with them. This threatens their overall acceptance -- it is much easier to find something objectionable about an encyclopedia than it is about a one-page pamphlet -- and as such they get rejected.

A concerted, full-flank push by the Free Software community to develop and apply very simple, lowest-common-denominator portability-encapsulation headers and libraries beyond libc is badly needed... the problem is in rallying a goodly portion of the community to such a cause, and finding solutions that are small enough and commonsense enough to appeal to a broad base of developers. Only a high-profile democratic design process could achieve this in a short timeframe; otherwise it can only evolve slowly, if at all, as some of these libs become more popular and others fade or crumble under their own weight.

P.S. Hell, it's 2004 and we still don't have a universally recognized header which we can include to get typedefs for different explicit integer sizes; we still have to hunt for them and hope that if we define them on our own they weren't hiding someplace waiting to spit "redefinition" warnings back at us. With simple issues like this still unresolved, it makes me think that we need a new class of Free Software developer -- the crusader -- who attempts to take very simple things like this and apply them across as many projects as he can get to go along with it. Sort of like a code janitor, but with more than spellfixes.

P.P.S. Though I think autoconf has its head up its ass as much as the next guy, I would warn that fledgling projects claiming to replace it only look better because they are new and have not yet tackled all the issues autoconf has. It is easy for a project to be faster and simpler than autoconf when it doesn't do nearly as much. Rest assured, they will get just as byzantine as folks tack on their needed functionality.

Smake? (1)

UnknownSoldier (67820) | more than 10 years ago | (#9218521)

Surprised that no one has mentioned 'smake'

Does anyone have a summary of the differences of all the various *makes?

more general question (1)

chanceH (197827) | more than 10 years ago | (#9218664)

to those who have some decent autoconf kung-fu:

after you get the hang of it, is it comfortable enough to use with all your building? or is something you add on at the end when app/project is ready for export?

I'm good enough at makefiles to make them pretty portable (i.e. it'll work on the 2-3 platforms I really care about at the time), and just have never bit the bullet and learned to use autoconf.

still, if I download something, and install procedure is anything besides "./configure; make; make install", I consider it a significant (though not 100% reliable) warning sign that I just downloaded crap.

Re:more general question (2, Informative)

grubba (89) | more than 10 years ago | (#9219507)

autoconf has two purposes:

  • Retargeting the installation and configuration options (eg --prefix, --with-*).
  • Detection of the build environment (eg compiler, system calls, etc).

Converting from a plain Makefile to configure.in+Makefile.in is straightforward if your Makefile already uses variables for binaries and directories.

The main reason to use autoconf is the second point above: when you write code that uses system calls (eg read(2), write(2)) rather than libc calls (eg fread(3C), fwrite(3C)), behavior differs between operating systems (eg BSD vs SysV vs POSIX), and you need to detect which flavor you have.

To answer your questions: autoconf is something I hook in when:

  • the project is starting to get large,
  • when I've been bit by operating system incompatibilities,
  • or if I intend to release the code.
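To sketch the first point: if your Makefile already uses variables, the conversion is mostly a matter of letting configure substitute them. Something like this minimal pair (file names and targets are illustrative, not from any real project):

```m4
dnl configure.in -- retargeting only, no feature detection yet
AC_INIT(main.c)
AC_PROG_CC
AC_PROG_INSTALL
AC_OUTPUT(Makefile)
```

```make
# Makefile.in -- configure fills in the @...@ values
CC      = @CC@
CFLAGS  = @CFLAGS@
prefix  = @prefix@
bindir  = $(prefix)/bin

foo: main.o
	$(CC) $(CFLAGS) -o foo main.o

install: foo
	@INSTALL@ foo $(bindir)/foo
```

From there, --prefix and friends work for free, and detection checks can be added to configure.in as the need arises.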

Re:more general question (1)

HishamMuhammad (553916) | more than 10 years ago | (#9226209)

after you get the hang of it, is it comfortable enough to use with all your building? or is something you add on at the end when app/project is ready for export?


I have to say yes. I just converted my latest project, htop [sf.net] , to autoconf and automake, and don't regret a thing. I just followed the tutorial from the info pages (yes, not the manpages, the other ones, that nobody reads :) ), wrote two files (configure.ac and Makefile.am), and everything worked.

There is also a very cool program called autoscan which scans your .c files and creates a sample configure.ac with the configure checks your program needs.

There are also bonus features you get by switching to autoconf, like make dist: it packs a clean htop-0.3.tar.gz file, only including the needed files (ie, I can keep other files in my development directory and they won't be shipped by accident).
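For a sense of scale, the two files for a small curses program really can be about this short. This is a generic sketch (package, source and library names are illustrative, not htop's actual build files):

```m4
dnl configure.ac
AC_INIT(myprog, 0.3)
AM_INIT_AUTOMAKE
AC_PROG_CC
AC_CHECK_LIB(ncurses, initscr)
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

```make
# Makefile.am -- automake generates the whole Makefile.in from this
bin_PROGRAMS = myprog
myprog_SOURCES = main.c screen.c screen.h
```

Everything else -- make dist, make install, dependency tracking -- comes from automake for free.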

still, if I download something, and install procedure is anything besides "./configure; make; make install", I consider it a significant (though not 100% reliable) warning sign that I just downloaded crap.


Oh yes, I can relate to that sentiment. Having an autoconf-generated configure script makes your project look more mature. :)

Write portable code. (3, Interesting)

V. Mole (9567) | more than 10 years ago | (#9219147)

It's really not that hard. The autotools are a crutch for people who can't be bothered to actually learn the C language and library, or the difference between POSIX and what their current environment provides. Go read #ifdef Considered Harmful [literateprogramming.com] (PDF) by Henry Spencer, about the right way to deal with portability problems, and notice that the whole approach of autoconf is wrong.

"checking for stdlib.h..." indeed. If you don't have an ISO-C environment, you're so screwed that there's nothing autoconf can do to save you: you don't know how the language works, so whether or not you're missing a few library functions isn't going to make any difference.

I once worked on a fairly large process control system, of which less than 1% was portability code. It ran on Solaris, AIX, HP-UX (this was pre-Linux). It also ran on VMS and WinNT. Most of the portability issues dealt with the entirely different models for process management and memory-mapped files, not with pipsqueak stuff like autoconf.

It's real simple: you read the docs. You determine what the standard actually requires, not what your development system happens to do. And you program to that, then test.

As for the autotools' proclaimed ability to port to systems you weren't planning for, that's so much BS. If the system is sufficiently different that ANSI C and POSIX aren't sufficient, then you're going to have to update your code anyway.

Replacing autoconf with some other crutch isn't going to help. Just ditch it entirely.

Re:Write portable code. (2, Insightful)

Rob Riggs (6418) | more than 10 years ago | (#9228008)

The autotools are a crutch for people who can't be bothered to actually learn the C language and library, or the difference between POSIX and what their current environment provides.

Yours is a rather myopic and provincial view of the problem, IMHO. The autotools are a necessary evil when one has more dependencies than just ISO C. What versions of libraries are installed? What odd bugs must I work around? Where are the headers for package foo located on this system? What compiler options should I pass?

It's real simple: you read the docs. You determine what the standard actually requires, not what your development system happens to do. And you program to that, then test.

What do the standards say about which version of libjpeg is installed and where its headers are? What about the version of Gnome? What if I want to support both Gnome and KDE if they are available? How do I determine that with Make?

I don't think you fully grasp the problems that the autotools actually do solve for developers.
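For the curious, the kind of check being talked about here looks roughly like this in a configure script. This is a sketch, not from any particular project; it assumes the GNOME libs advertise themselves via pkg-config:

```m4
dnl Find libjpeg wherever it lives, or fail with a clear message.
AC_CHECK_HEADER(jpeglib.h, , AC_MSG_ERROR([jpeglib.h not found]))
AC_CHECK_LIB(jpeg, jpeg_start_decompress)

dnl Optional GNOME support: compile it in only if the libs are present.
PKG_CHECK_MODULES(GNOME, libgnomeui-2.0 >= 2.0,
    [AC_DEFINE(HAVE_GNOME, 1, [Build GNOME support])],
    [AC_MSG_NOTICE([GNOME not found, building without it])])
```

Plain Make has no answer to any of those questions; that is the ground the autotools actually cover.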

I propose... (2, Funny)

k4_pacific (736911) | more than 10 years ago | (#9219868)

A new program: autoconfconf

I am struck by the recursive nature of the situation.

Shortsighted design (1)

Maljin Jolt (746064) | more than 10 years ago | (#9228728)

Autoconf is OK, but it should have been designed with its own versioning schema in mind, to avoid nasty incompatible hacks in the future. That future is now. But in software everything is fixable; someone just has to care about it. Anybody volunteering, instead of Slashdot ranting? Yes, I can hear you: you just don't like this m4 stuff, sorry.

Autoconf is used too much (0)

Anonymous Coward | more than 10 years ago | (#9233958)

One word: standards.

What is needed is a *standard* build environment. If software is written to compile in these standard environments then the configuration problem gets pushed down to the construction of the build environment.

The problem is that lots of developers use autoconf as their build environment. Rather, developers should use tools like pkg-config, gcc, gnu make, and ocamlfind. Then only these tools need to be ported to weird operating systems, and everything else gets ported for free.

Most software projects don't do more than the following:
- compile programs into libraries and executables
- install executables, libraries, configuration and data files

For the first part you need to demand a build environment. Time should be spent porting the build environment, not individual packages. For the second you need to build a suitable package spec --- be that Debian, RPM, GODI or whatever.
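A sketch of what demanding a build environment means in practice: a GNU-make-only Makefile that delegates all the discovery questions to pkg-config. Package and target names here are illustrative, and it assumes the libraries ship .pc files:

```make
# No configure step: pkg-config answers the "where are the headers,
# which flags do I pass" questions an autoconf script would re-derive.
PKGS    := gtk+-2.0
CFLAGS  += $(shell pkg-config --cflags $(PKGS))
LDLIBS  += $(shell pkg-config --libs $(PKGS))

viewer: viewer.o
	$(CC) $(LDFLAGS) -o $@ $^ $(LDLIBS)
```

Port pkg-config and GNU make once per weird OS, and every package written this way follows along.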