
Autotools

samzenpus posted more than 3 years ago | from the read-all-about-it dept.

Books 148

Muad writes "John Calcote is a senior software engineer in Novell's Linux business, who after slogging up the steep learning curve the Autotools triad poses to those packaging software according to the portable GNU conventions for the first time, very kindly decided to make the experience easier to newcomers by sharing his years of experience and carefully crafted bag of tricks. His book is a welcome update to a field that has not seen a new entry in a full ten years; it has been that long since GNU Autoconf, Automake, and Libtool by Gary V. Vaughan, Ben Elliston, Tom Tromey, and Ian Lance Taylor hit the shelves. Unfortunately, the publishing industry is driven by the need to turn a profit to fund its endeavors, and specialist items like this book are not obvious candidates for volume selling - which is a credit to No Starch Press' willingness to venture down this path." Keep reading for the rest of Federico's review.

The book opens with John's experiences in adopting the Autotools, and quickly offers what is in my view a very important word of caution that is often lacking in the few tutorials I have seen on the Net: the Autotools are not simply a set of tools but foremost the encoded embodiment of a set of practices and expectations in the way software should be packaged the GNU way. While it is acceptable for beginners not to know what these expectations are, the right frame of mind to approach the Autotools is to focus on learning how the Autotools operate, what they are trying to accomplish, and why. Attempting to use the Autotools without understanding the bigger picture will lead to a great deal of pain, as they are among the toolsets most difficult to adapt for use apart from the policies they represent, so strongly are these conventions embedded in their fabric. With this understanding, it becomes possible to generate extensive configurations with a few lines of Autoconf or Automake - without this understanding, it very quickly becomes a battle to force a round peg into a square tool.

John's style is more expansive and takes a longer path to the "technical meat" of the problem than the 10-year-old alternative, but in this reader's opinion it flows significantly better, as there is an underlying story, a thread that connects the bits of what is otherwise a pretty arid subject. For masters of shell-fu, this book is a page-turner, while for mere mortals it is a good, approachable path into a difficult skill.

The book is structured around the packaging of two different projects, the first being a simplified "Hello, World" project that provides a digestible introduction to the processes and technology of the Autotools, the second being the full-blown packaging of a complex, real-world project (the FLAIM high-performance database). This is a very good approach, breaking the theory into many practical examples and providing many ready-made bits from which the rest of us can start our own build configuration files. The result is a first half providing a gentler, streamlined introduction to the subject matter, before the full jump into the gory details of the most complex possibilities the toolset offers. While it must be noted that John attempts to keep away from those fine details which "may be subject to change" between minor releases of the tooling, which is doubtlessly good for both our scripts' and the book's shelf life, it must be observed that he does not shy away from very dense (and otherwise utterly undocumented) material, such as the use of M4 macros in Autoconf, something a colleague of mine once described to me as "one more reason I'd rather chew broken glass than deal with Autotools".

Assuming you have the requisite knowledge of Make, shell scripting (particularly Bash), and GCC that is essential to a developer, packager, maintainer or buildmaster of a Linux, BSD or *NIX project, or that you are on your way to acquiring those skills, this is a book that belongs on your shelf, right next to the RPM documentation. This is material for experts or experts in the making, but in my opinion you will find no better introduction to this complex subject. I had it on my wish list well before it was ever printed, and its presence on my desk caused several other developers in my office to order their copies pretty much on the spot upon finding out about its existence. Either as a learning tool for a skill you are trying to attain, or as a reference to turn to when faced with the complexities of this unique set of tools, this book is well worth its price tag.

I certainly hope this is not the last publication we see on the Autotools in this decade, but either way, it is a good start indeed - and my hope is that the publisher will refresh the title when an update is warranted, without waiting ten years!

Federico Lucifredi is the maintainer of man(1) and a Product Manager for the SUSE Linux Enterprise and openSUSE distributions.

You can purchase Autotools: A practitioner's guide to GNU Autoconf, Automake, and Libtool from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.


Wow... (-1, Troll)

Anonymous Coward | more than 3 years ago | (#33715092)

A book review that isn't some shill review for a book from Packt Publishing? What is Slashdot coming to?!?!?!? Does this mean CmdrTaco has finally garnered enough funds to book his micropenis enlargement surgery?

Re:Wow... (-1, Troll)

Anonymous Coward | more than 3 years ago | (#33715252)

nope... quite the opposite, in fact. Seems our boy found a top who preferred his small cock. Made him shave (everything -- balls, legs, chest, back, ass, etc) and dress up like a little schoolboy.

Mod up (1)

CarpetShark (865376) | more than 3 years ago | (#33717594)

Packt are way too fond of publishing any old crap just to get an early book out on a niche topic. They have been known (by me) to randomly spam people who blog on their books' topics, asking them to review the book. Once they get you interested in the free book they start hounding you for a review on your blog. I wouldn't have minded TOO much if their books had anything worthwhile to write about.

Autotools do not need a book (5, Insightful)

koinu (472851) | more than 3 years ago | (#33715174)

... they should be replaced by something else.

Re:Autotools do not need a book (1, Troll)

Dr. Sp0ng (24354) | more than 3 years ago | (#33715186)

Here you go [cmake.org].

Re:Autotools do not need a book (5, Interesting)

Urban Garlic (447282) | more than 3 years ago | (#33715692)

I often need to install software in an environment that's different from where it's going to be run, e.g. I install it on a file server, where the target directory is /export/apps/stow/, and then I use "stow" to put it in /export/apps, which clients mount as /usr/site, so they see "/usr/site/bin/", and are set up to look in /usr/site/lib for libraries, and so forth.

I don't know if this is intrinsic to newer build schemes or not, but my recent experience has been that "old-style" (autotools-based) packages work just fine, they interoperate well with stow, accept the "--prefix" argument to configure, and work just fine for the clients. Cmake-based packages tend to hard-code path names into start-up scripts, which then break on the clients, which view the app in a different hierarchy -- they don't have /export, in particular.

Now, it may well be that these are badly-written cmake scripts, and cmake is perfectly capable of doing it right, I honestly don't know. But it seems to me that cmake (and Python's easy-install, and every other new build scheme I've run across in the past few years) are all part of a new generation of tools which really want the build-time and run-time environments to be the same, because they're built around the "single-user isolated workstation" model.

But it's not true. Lots of us still have centralized file servers that use NFS exports to make centrally-managed Linux applications available to many clients. The new tools make some things easier, but this, they make harder.

Also, uphill through the snow both ways, and we liked it, get off my lawn, kids today don't know nothin', no respect I tell you.

Re:Autotools do not need a book (3, Informative)

GooberToo (74388) | more than 3 years ago | (#33715828)

I've run into the same problem with cmake. I don't really have that problem with python tools as python's virtual environment tools [python.org] seem to handle things nicely. Tools such as pip [python.org] natively handle virtual environments, automatically installing into it when one is active.

Also, there are lots of nice wrappers [python.org] to work with python's tools, for developers, such as gogo [bitbucket.org].
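
For readers unfamiliar with those tools, the basic flow looks roughly like this (a minimal sketch; "myapp" is a stand-in package name):

    # create an isolated per-app environment and install into it
    virtualenv myapp-env
    . myapp-env/bin/activate       # pip now targets this environment
    pip install myapp              # lands in myapp-env, not the system site-packages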

Re:Autotools do not need a book (3, Interesting)

Anonymous Coward | more than 3 years ago | (#33717410)

"I don't really have that problem with python tools as python's virtual environment tools seem to handle things nicely. Tools such as pip natively handle virtual environments, automatically installing into it when one is active."

Which is another aspect of the very same problem. So their solution to ignoring that they should segregate feature development from bug fixing, and that they should treat API stability as an almost sacred cow, is "reinventing" the "statically linked environment"?

Try to install two disparate tools relying on the same shared environment and welcome to the fun of app A asking for 'foo 1.2.3' and app B needing 'foo 1.2.4', so you end up producing a virtual environment for each and every app you install. Good luck then with tracking security advisories for "foo" (and all the other dependencies of each installed app). Good luck with finding an upgrade path that will just fix known security issues without breaking functionality for any of your installed apps.

But, but, but... you should use the very bleeding edge, the developer of App A will say, without paying attention to the fact that you manage 2000 machines with 150 different main apps from different developers and dates, disregarding even that each of those developers will define "bleeding edge" as "whatever happens to be running on my development box, whether or not it's really needed, stable, or supported by my dependencies' developers".

Ahhh, youngsters... now, you get off my lawn!

Re:Autotools do not need a book (1)

Abcd1234 (188840) | more than 3 years ago | (#33715892)

I don't know if this is intrinsic to newer build schemes or not, but my recent experience has been that "old-style" (autotools-based) packages work just fine, they interoperate well with stow, accept the "--prefix" argument to configure, and work just fine for the clients.

Not true. Not true at all.

I used stow for many many years. And it's true, a lot of packages just worked. You'd do a "configure --prefix /run/time/target", then "make install prefix=/install/time/target", and it would work. But this is *hardly* universal. Many many apps fail due to install-time relinking, which means you have to hack libtool and the build a bit in order to get them to install. It's doable, certainly, but it's hardly automatic.
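
For reference, the two-step being described looks like this (a sketch using the grandparent's stow layout; "foo-1.0" is a stand-in):

    # compile against the path the clients will see at run time...
    ./configure --prefix=/usr/site
    make
    # ...but copy the files into the stow tree at install time
    make install prefix=/export/apps/stow/foo-1.0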

Now, I haven't had to deal with cmake-based systems recently, so it may be that they're even more broken than this. But it's never been completely trivial to build for one runtime path, while installing to another.

'course, as a user on an isolated workstation, I finally just moved away from stow, and manage my custom-built packages using checkinstall. But that's obviously no solution when you're managing applications on an NFS share.

Other build-package problems (1)

coats (1068) | more than 3 years ago | (#33716688)

And there's worse -- supporting multiple simultaneous build targets. Most of my stuff I build {optimized, debug, optimized-profiling} x {gfortran/gcc, g95/gcc, sunf95/suncc, ifort/icc} for a set of 12 simultaneous build-targets. Conventional build systems do not support multiple simultaneous build-targets well.
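
Autoconf's out-of-tree (VPATH) builds can at least approximate this: one shared source tree, one build directory per compiler/flag combination. A sketch, assuming a project whose configure script honors the usual FC/CC and flags variables:

    mkdir build-gfortran-opt && (cd build-gfortran-opt && ../configure FC=gfortran CC=gcc FCFLAGS=-O2 CFLAGS=-O2)
    mkdir build-ifort-debug && (cd build-ifort-debug && ../configure FC=ifort CC=icc FCFLAGS=-g CFLAGS=-g)
    make -C build-gfortran-opt && make -C build-ifort-debug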

Re:Autotools do not need a book (1)

Kaz Kylheku (1484) | more than 3 years ago | (#33717172)

Hi Urban Garlic,

The most important thing to know is that the --prefix argument (for a correctly designed configure script which follows the conventions) indicates the run-time installation directory of the program. I.e. the path given to --prefix may actually be compiled into the program and then used by that program at run-time.

The average free software user compiles the program on the same machine where he will run it. And so the prefix is also the place where the program is copied during "make install".

The default value for --prefix is often /usr/local, for this specific use case.

If you're building a distro or toolchain, you may have to override --prefix to /usr. (Assuming that the programs you are making will go into /usr/bin, /usr/lib, etc. on the target system.)

The point is that the prefix has to be correct for the ultimate destination where the software lives.

Of course, when you're preparing a package for another system (perhaps even a cross-compiled package) and the prefix is /usr, you can't just do "make install", because that will try to copy into /usr on the local build system! You must install into some temporary directory (or a "sysroot" directory where you are building up the filesystem image for a target). The way this is done varies from package to package. Some packages accept an extra configure parameter, like --install_dir. Many packages accept a make variable called DESTDIR which is specified on the "make install" command line; the value of DESTDIR is prepended to the prefix. In some cases you can override the prefix variable itself at install time: "make install prefix=...".
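
Concretely, the DESTDIR flow looks like this (a sketch for a package whose run-time home is /usr):

    ./configure --prefix=/usr          # prefix = where the program lives at run time
    make
    make install DESTDIR=/tmp/stage    # files land under /tmp/stage/usr/... for packaging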

Re:Autotools do not need a book (1)

chrb (1083577) | more than 3 years ago | (#33717400)

You must install into some temporary directory

I'm sure he knows this since he is already using Stow [gnu.org]. Stow works pretty well for having multiple versions of different software packages built from source and installed simultaneously, and for having a proper package management system for it all. Though these days I use Checkinstall [asic-linux.com.mx] - having the final package as a .deb or .rpm makes it a bit easier to distribute the built packages.

Re:Autotools do not need a book (2, Insightful)

Anonymous Coward | more than 3 years ago | (#33717298)

"Now, it may well be that these are badly-written cmake scripts, and cmake is perfectly capable of doing it right"

The review of the book makes a point of noting that "the Autotools are not simply a set of tools but foremost the encoded embodiment of a set of practices and expectations in the way software should be packaged the GNU way".

It's not that cmake can or can't be properly used to provide a platform-independent prebuild environment; it's that it lacks the greybeard experience. As the old motto says, "those who don't understand UNIX are condemned to reinvent it, poorly". Unix-zen, once "the cool wave", is about 40 years old now. New generations feel they know better than the old farts (and many times, they are right) and that they can do it better and simpler... just because they are not (yet) aware of the whole landscape and its corner-cases. So, yes, they might re-implement the old farts' tools better and simpler (after all, they have the advantage of building on top of the shoulders of giants, so to say) but, at the same time, they are condemned to stumble on the same stepping stones their elders managed to work around, just because they chose to ignore them, and then add their own package of failures.

"But it seems to me that cmake (and Python's easy-install, and every other new build scheme I've run across in the past few years) are all part of a new generation of tools which really want the build-time and run-time environments to be the same, because they're built around the "single-user isolated workstation" model."

The young are always proud and think they know better than anybody else. They just produce tools for what they know and choose to ignore everything else. No wonder they fail on things that would seem trivial to older people who already had to fight them years ago.

This is the generation of 'reinventing the wheel': they do better in some aspects and fail heavily on things that should be known by now.

Re:Autotools do not need a book (2, Funny)

Anonymous Coward | more than 3 years ago | (#33717516)

because they are not (yet) aware of the whole landscape and its corner-cases

Youngsters can undervalue knowledge of "the whole landscape and its corner-cases", but old farts can also overvalue it. The beauty of new tools like CMake is that they can leave the past behind, and stop worrying about corner cases on obsolete platforms. At one time I was an absolute expert on MSDOS and Windows 3.1. I leave that off my resume now because it no longer has any practical value, and it just makes me look old :-(

Re:Autotools do not need a book (3, Funny)

TheRaven64 (641858) | more than 3 years ago | (#33717110)

I've looked at cmake, and it seems like a really nice solution. When I work out what the corresponding problem is, I have no doubt I will use it.

Re:Autotools do not need a book (1)

oiron (697563) | more than 3 years ago | (#33719856)

Actually building on multiple platforms without maintaining separate build files for each is the problem...

CMake was created to build Kitware's other products, most notably VTK and ITK. To date, I've built both, and other things built on top of them on three platforms, with several variations: GCC on Linux, both 32bit and 64bit, MinGW and Visual C on Windows. I don't need to install anything else apart from CMake and the compiler (and associated Make package) on each of those platforms, run it once, and then run make. On the other hand, just you try running autotools on Windows...

The other option, which I see a lot of projects using, is to have multiple build systems - an autotools one for Unices, and a .sln/.vcproj set for Windows. I don't think I need to point out the fun in maintaining that...

Re:Autotools do not need a book (1)

StripedCow (776465) | more than 3 years ago | (#33715238)

Indeed, but a book can make it easier to develop such a thing.

Re:Autotools do not need a book (0)

Anonymous Coward | more than 3 years ago | (#33715304)

They already exist. CMake is a fine replacement, sane and very pragmatic.

Re:Autotools do not need a book (1, Interesting)

Anonymous Coward | more than 3 years ago | (#33715262)

They have been.

Cmake, among others, has effectively replaced autotools. It's FAR easier to deal with, cross platform, fast, will generate makefiles, Visual Studio solutions, and Xcode projects, and supports testing and other things.

There are some other ones around too like Scons, but the point is, anyone starting a new project now with autotools is a dolt or a masochist or both.

Autotools is dead. Let's let it be buried in peace, please.

Re:Autotools do not need a book (2, Insightful)

samjam (256347) | more than 3 years ago | (#33716052)

If that were true you wouldn't have needed to say it

Re:Autotools do not need a book (1)

Vintermann (400722) | more than 3 years ago | (#33719884)

The efficient markets hypothesis as applied to software. If it was better, we'd already use it!

Re:Autotools do not need a book (4, Interesting)

JanneM (7445) | more than 3 years ago | (#33717382)

I've just recently been in the situation of selecting a build system for a project with an existing codebase. I looked at the obvious alternatives, including cmake.

In the end, I chose autotools.

When you're doing a non-trivial project, cmake isn't any less complicated than autoconf and automake - if your build is complex, you have to deal with that complexity somewhere, after all. And there are a lot more and better resources around for using autotools than for cmake, for figuring out odd corner cases. If you have a somewhat odd build requirement, chances are somebody else has already solved it using autotools.

From my experience so far, most of what people dislike about using autotools comes from Automake. But Automake is completely optional, and Autoconf - which provides most of the benefits - was made to be standalone. If you have a system with existing makefiles, it makes a lot of sense to simply use Autoconf to configure the app and the makefiles and leave Automake alone.

This is a lengthy but really illuminating document on using the autotools that specifically goes through using autoconf alone and adapting an existing project: http://www.freesoftwaremagazine.com/books/agaal/brief_introduction_to_gnu_autotools/ [freesoftwaremagazine.com]
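
A minimal sketch of that Autoconf-only setup (the project name and the single check are placeholders):

    # configure.ac: just enough to probe the toolchain and fill in the makefile
    cat > configure.ac <<'EOF'
    AC_INIT([myapp], [1.0])
    AC_PROG_CC
    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT
    EOF
    # Makefile.in keeps your existing rules, with @CC@ and @CFLAGS@ substituted in
    autoconf && ./configure && make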

Re:Autotools do not need a book (0)

Anonymous Coward | more than 3 years ago | (#33719100)

In case you didn't notice, the article you linked to was written by the same guy who wrote the book being reviewed, John Calcote.

Even better would be this (1)

bogaboga (793279) | more than 3 years ago | (#33715560)

decided to make the experience easier to newcomers by sharing his years of experience and carefully crafted bag of tricks.

Even better would be reading that this gentleman had gotten behind efforts to make working with the tools easier. Simply teaching me tricks, though welcome, is not good enough. Working with the tool(s) is still difficult.

Re:Autotools do not need a book (0)

Anonymous Coward | more than 3 years ago | (#33716152)

Perhaps. The thing is, everything else since then has sucked. Big time. Autotools was designed when cross-platform development was more prevalent. Nowadays new coders think cross platform means Ubuntu and RedHat. It just ain't so.

Re:Autotools do not need a book (-1)

Anonymous Coward | more than 3 years ago | (#33717014)

Autotools is definitely NOT cross-platform. You can't ignore the single largest platform in the world (Windows) and claim to be "cross-platform".

Re:Autotools do not need a book (1, Informative)

Anonymous Coward | more than 3 years ago | (#33716324)

Oh but I might still buy a copy. Just to wipe my @ss with. I can't begin to think of the hours I've wasted debugging build failures of this heap of cr@p.

mk-configure (1)

avgapon (1851536) | more than 3 years ago | (#33716570)

Why invent Makefile writing scripts or even programs when make and Makefiles can easily do all that is required for cross-platform (and cross-target) compilation? http://sourceforge.net/projects/mk-configure/ [sourceforge.net]

How about we just throw them away? (0)

Anonymous Coward | more than 3 years ago | (#33715178)

Seriously, the whole autotools thing is the worse design EVAR.

I suppose it's nice (4, Insightful)

Vintermann (400722) | more than 3 years ago | (#33715230)

I suppose it's nice that someone writes a book like this, since a lot of existing projects use autotools (or more commonly, try to by means of copy/paste and cargo-cult based build scripting).

But autotools should really be phased out. It solves a lot of problems that aren't problems anymore, and makes a helluva lot of new ones in the process. There are a lot of up and coming build systems to challenge it, and then there's CMake which is an OK compromise between those and practicality.

Re:I suppose it's nice (1, Interesting)

Anonymous Coward | more than 3 years ago | (#33716186)

Care to point out what new problems autotools creates? From my experience, autotools projects tend to work flawlessly while cmake ones tend to throw a lot of obscure errors that no one is able to figure out.

Re:I suppose it's nice (4, Interesting)

Entrope (68843) | more than 3 years ago | (#33716388)

The autotools suite requires that software developers keep revising things that worked before, because autotools has some new paradigm for some aspect of its operation every year or so.

For example, one of my open source projects lets the user specify which extra modules should be compiled into the binary. (It doesn't use loadable modules because that was even more painful when we started out.) Over a span of about two years -- and I think three "minor" (1.x) releases of automake -- the approved way of conditionally linking object files changed twice. The changes were not documented, and nobody bothered to describe any way that would work across the several versions of automake that were in common use at the time. In contrast, doing the same thing with autoconf alone or with some non-autotools script is dead simple.
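
The autoconf-alone approach he alludes to can be as simple as substituting an object list (a sketch; the option and file names are hypothetical):

    # configure.ac fragment: select optional objects without automake conditionals
    AC_ARG_ENABLE([foo],
      [AS_HELP_STRING([--enable-foo], [build the foo module])],
      [MODULE_OBJS="$MODULE_OBJS foo.o"])
    AC_SUBST([MODULE_OBJS])
    # Makefile.in then links with: myprog: main.o @MODULE_OBJS@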

autoconf has also gotten progressively fussier. For example, simple m4 macro invocations like FOO(bar) used to work fine. Now they often generate a warning that the argument is not quoted properly. To pick one example of prior art, C solved that particular precedence problem about 35 years ago. There is no good reason for autoconf to make the software maintainer throw in all the []s that it wants.
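
For those who haven't run into it, the complaint is about the bracket-quoting convention (a tiny sketch of configure.ac fragments):

    AC_CHECK_FUNCS(strndup)        # old style: often still works, but may draw a warning
    AC_CHECK_FUNCS([strndup])      # the quoted style that current autoconf expects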

Those are just a few examples from my experience; others will have more.

Re:I suppose it's nice (3, Insightful)

DiegoBravo (324012) | more than 3 years ago | (#33716562)

> Care to point out what new problems autotools creates?

Builds that take half an hour just to "configure", checking for the existence of things like strcpy(), but fail anyway at compile or link time because of a missing symbol in an upgraded dynamic library?

Re:I suppose it's nice (1)

onionman (975962) | more than 3 years ago | (#33717222)

Care to point out what new problems autotools creates?

The fact that I can't use them on Windows. Seriously, I can't see how we call a build system "cross platform" when it doesn't cooperate with the most widely deployed build environment.

It seems to me that it might be possible for a Python-based build system to have enough platform independence that it could build projects on Linux, Mac, and Windows. SCons looks like it's coming along. I'd really like to see more developers get behind it, but on the open source projects that I've worked on many of the developers are antagonistic toward any OS other than Linux.

Re:I suppose it's nice (5, Informative)

djm (126641) | more than 3 years ago | (#33718370)

It's true, pretty much. We developed configure scripts and ways to generate them in the days of 28.8kbps modems and they had to work on Unix System III and Xenix and HP-UX. We couldn't assume anything like Perl or Python was available. Linux distros were only just appearing, and there were no package management systems. Windows was still a 16-bit DOS shell. It was a different world. I'm amazed this stuff has endured as long as it has with so few changes. By the time Automake was written, several years after Autoconf, we at least felt we could assume the presence of Perl.

Want to know why it's called "Autoconf", which I think is a bit ugly of a name? I wanted to call it "Autoconfig", but when you add a version number and ".tgz" to that, you exceed the 14-character file name limit of some of the Unix variants it had to be downloaded and installed on!

Dave MacKenzie
Autoconf's main developer

Good grief, those run-on sentences (0, Offtopic)

hkz (1266066) | more than 3 years ago | (#33715274)

John Calcote is a senior software engineer in Novell's Linux business, who after slogging up the steep learning curve the Autotools triad poses to those packaging software according to the portable GNU conventions for the first time, very kindly decided to make the experience easier to newcomers by sharing his years of experience and carefully crafted bag of tricks.

The book opens with John's experiences in adopting the Autotools, and quickly offers what is in my view a very important word of caution that is often lacking in the few tutorials I have seen on the Net: the Autotools are not simply a set of tools but foremost the encoded embodiment of a set of practices and expectations in the way software should be packaged the GNU way.

Good heavens, I'm all for sentences with body, but this is terrible. I actually stopped reading the article after the second one. You know what this site could use? Editors.

Re:Good grief, those run-on sentences (1)

Abcd1234 (188840) | more than 3 years ago | (#33715632)

To be fair, I wouldn't consider those run-on sentences (they are, by and large, grammatically correct, less a missing comma here or there). They're just... excessive.

Re:Good grief, those run-on sentences (1)

Mikkeles (698461) | more than 3 years ago | (#33717078)

Jeez, I realise that attention spans are diminishing these days, but this is ridiculous!

"the GNU way" == Garbage (-1, Troll)

Anonymous Coward | more than 3 years ago | (#33715320)

Just like everything that is prefixed by the word 'GNU', Autotools are a stinking pile of fail.

Re:"the GNU way" == Garbage (1)

CarpetShark (865376) | more than 3 years ago | (#33717652)

Yeah, that whole GNU Compiler Collection thing is just a waste of time. Can't compile a binary out of anything. Thankfully Microsoft were forced to give away a free version of Visual C++ for some reason, having killed all the commercial competition. Also, Intel have released a compiler for linux, which you can actually run in a shell that just happens to be pre-compiled somehow. So, thankfully there are alternatives to GCC.

Let me get this straight: (5, Informative)

larry bagina (561269) | more than 3 years ago | (#33715380)

A guy from novell/suse is reviewing a book by another guy from novell/suse. When his novell/coworkers see the book at his novell/suse desk, they immediately buy a copy.

Did I miss any novell/suses?

Re:Let me get this straight: (-1, Offtopic)

Anonymous Coward | more than 3 years ago | (#33715438)

I novell/sused in my pants over your novell/susing of all the novell/suses in the article.

Crappy Approach (2, Insightful)

firewrought (36952) | more than 3 years ago | (#33715436)

I understand that adoption/marketing/historical factors may have justified this particular approach to cross-platform builds of C/unix apps, but is this such a big problem that it requires 5-6 languages to solve (counting the syntax of C, sh, configure.ac, Makefile.am, makefile and possibly other intermediate formats)? Sheesh...

Re:Crappy Approach (1)

loonycyborg (1262242) | more than 3 years ago | (#33715592)

Probably they just wanted to make the build process not require anything other than sh and make to be installed on the end user's system, so they used the m4 macro processor to generate shell scripts and Makefiles which can simply be included in the tarball.
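
That split shows up in the workflow, roughly like this (a sketch):

    # maintainer's machine (autotools required): expand the m4 sources
    autoreconf --install    # configure.ac -> configure, Makefile.am -> Makefile.in
    # end user's machine (only sh, make, and a compiler required)
    ./configure
    make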

RE: A not so Crappy Approach (2, Insightful)

ebuck (585470) | more than 3 years ago | (#33715762)

Those who don't understand Automake are doomed to repeat the mistakes of build systems that are not designed like Automake.

The one language that actually drives all of Automake is ML. Funny it is the one language you didn't list, but you listed a bunch of the high level macro files that get expanded with ML. If you don't know sh, then you shouldn't be programming on the command line (stick to an IDE that does the compilation for you). C isn't required for any component of Automake. If you write a makefile with automake, you made a big mistake, as Automake writes the makefiles for you.

There are other systems out there which are easier to use, but there's only a handful that does things in a manner that is highly reliable and portable on many platforms. Those that do strive for such goals end up operating like Automake, but often they do so without allowing such easy access to the internal guts of what is really going on.

Yes, I like cmake too, but bashing Automake just because you don't understand it is just the computer equivalent of name-calling.

Re: A not so Crappy Approach (4, Insightful)

Waffle Iron (339739) | more than 3 years ago | (#33715928)

Hmmm... are you sure that you didn't mean the macro language M4? I thought that ML was the pure functional ancestor to languages like Haskell.

Anyway, I played around with M4 a little bit because I thought it looked handy for a few things. It has a deceptively simple specification that only takes a few pages, but it's one of the most extreme examples of "emergent behavior" I've encountered. Even simple tasks rapidly become mind boggling due to the deceptively tricky nature of recursive text substitutions and quoting. It's a real brain teaser of a language. It does seem like it would be a nifty tool if I spent enough time to really figure it out.
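
A small taste of that emergent behavior, using GNU m4: arguments are expanded while they are collected, and expansions are rescanned.

    $ m4 <<'EOF'
    define(`double', `$1$1')dnl
    double(`ab')
    double(double(`ab'))
    EOF
    abab
    abababab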

Re: A not so Crappy Approach (0)

Anonymous Coward | more than 3 years ago | (#33716858)

Take time to figure it out.

I've been an occasional, uninformed user of m4, but then necessity and a two-hour train journey were enough to let me write a very flexible m4-based provisioning system for our Linksys phones. Moves, adds, and changes are now trivial :-)

strange brew that's also good for you (0)

Anonymous Coward | more than 3 years ago | (#33715482)

That would be home made Kombucha.

I would rather carve a Makefile into my arm (3, Insightful)

Bitch-Face Jones (588723) | more than 3 years ago | (#33715580)

with a nail than use autotools

Re:I would rather carve a Makefile into my arm (2, Funny)

shutdown -p now (807394) | more than 3 years ago | (#33716032)

I would rather carve a Makefile into my arm with a nail

I thought that's precisely what autotools is for?? ~

Re:I would rather carve a Makefile into my arm (1)

TheRaven64 (641858) | more than 3 years ago | (#33717210)

I'm fairly sure there's a macro for that in autoconf. And it's enabled by default.

The first rule of the autotools club is... (0)

Anonymous Coward | more than 3 years ago | (#33715714)

...that you should never use autotools.

The second rule is that it is better to use something else, like scons.

Shallow Learning Curve (2, Funny)

the eric conspiracy (20178) | more than 3 years ago | (#33715738)

Shouldn't that be shallow learning curve? Steep would imply quick effortless progress towards expertise (at least if you put the independent variable on the abscissa and dependent on the ordinate as is customary). A shallow learning curve now would imply slow progress...

Re:Shallow Learning Curve (4, Insightful)

Abcd1234 (188840) | more than 3 years ago | (#33715936)

Oh, for the love of jesus, not this again...

Yes, "learning curve" is a colloquialism that is not, literally, logically consistent. Move on already. Seriously.

Re:Shallow Learning Curve (0)

Anonymous Coward | more than 3 years ago | (#33716710)

Oh, for the love of Jesus, not this again... "Jesus" is a proper noun and should be capitalized. Move on already. Seriously.

Re:Shallow Learning Curve (3, Informative)

camperdave (969942) | more than 3 years ago | (#33716492)

No. When people talk of steep learning curves, the vertical axis is effort, the horizontal axis is expertise. Something with a steep learning curve requires a lot of effort to gain a little expertise. Something with a shallow learning curve requires almost no effort to gain knowledge.

Re:Shallow Learning Curve (1)

AdamHaun (43173) | more than 3 years ago | (#33718280)

I thought it was how much you have to learn (Y axis) vs. how long you use the software (X axis). A shallow learning curve takes less learning (= effort) in the short term.

Print on Demand for Obscure Topics (1)

perpenso (1613749) | more than 3 years ago | (#33715908)

Unfortunately, the publishing industry is driven by the need to turn a profit to fund its endeavors, and specialist items like this book are not obvious candidates for volume selling - which is a credit to No Starch Press' willingness to venture down this path.

The situation is not as dire as this post seems to suggest. Print on demand [wikipedia.org] is an option for a book such as this. Getting a publisher like No Starch is great, since they will provide traditional editing, review, and marketing services; however, the publisher is not necessarily making any great investment, since they too can take advantage of a print-on-demand approach. There is no longer a need to print a large number of books up front.

NIGGA (-1, Troll)

Anonymous Coward | more than 3 years ago | (#33716038)

Host wh4t tHe house Future. The hand

Re:NIGGA (1)

QuantumBeep (748940) | more than 3 years ago | (#33719172)

I have to know. What the hell is the point of this shit? I mean, if it had a link to something, or a shout-out to GNAA, or just anything, I would understand.

CMake (4, Informative)

paugq (443696) | more than 3 years ago | (#33716140)

Obligatory link to a good autotools alternative: CMake [cmake.org]. And my CMake tutorial, Learning CMake [elpauer.org].

Re:CMake (0, Insightful)

Anonymous Coward | more than 3 years ago | (#33716758)

Why replace the GNU autotools with something even worse? CMake is the perfect build system for projects like pulseaudio and systemd, by which I mean software I wouldn't want to touch with a bargepole. CMake is a piece of fucking shit, or rather, -DCMAKE_IS_A_PIECE_OF_FUCKING_SHIT:BOOL:ON.

Re:CMake (0)

Anonymous Coward | more than 3 years ago | (#33716888)

It's surprising pulseaudio doesn't use cmake considering the coprophiliac nature of its developer.

Re:CMake (1, Insightful)

Anonymous Coward | more than 3 years ago | (#33717182)

I think it's a testament to how bloody awful Autotools is that people think CMake is an improvement!

Re:CMake (1)

paugq (443696) | more than 3 years ago | (#33717386)

So what's your proposal for multiplatform software development? Oh, and I want to be able to use Incredibuild, the debugger and the profiler with Visual C++ and have line information, breakpoints in the IDE, etc. Same for gcc on Linux: I want my distcc when I'm on a distributed build environment. Same for XCode on Mac. Scons and all the others which compile & link directly are not able to use that because they do not generate .vcproj / Makefiles / whateverprojectfilemyIDEuses

Re:CMake (1)

mathfeel (937008) | more than 3 years ago | (#33717434)

Thanks for the link. Time to read up on CMake. When debugging a broken build script in Gentoo, my personal experience is that I usually figure out what's wrong more quickly with Autotools than with CMake. The main reason is probably that I am more familiar with the former from reading this autotools online tutorial/reference: Autotools Mythbuster [flameeyes.eu]

Expat? (0)

Anonymous Coward | more than 3 years ago | (#33719758)

Why does it depend on expat?

The problem with CMake (1)

achurch (201270) | more than 3 years ago | (#33719798)

is that it's got ugly syntax, effectively no cross-compiling support, and less-than-helpful documentation. And its generated Makefiles sometimes miss changes in header files, forcing you to "make clean".

But yeah, it's still a good alternative to autotools.

Print it (0)

Anonymous Coward | more than 3 years ago | (#33716158)

http://www.mcs.anl.gov/~rgupta/calcote_autotools_guide.pdf

Just say no to autotools. (0)

Anonymous Coward | more than 3 years ago | (#33716338)

Friends don't let friends use autotools.

For the love of insert concept of deity recognised

Cross-platform, but not cross-compiling (1)

wowbagger (69688) | more than 3 years ago | (#33716574)

I'll throw this bit of fuel on the flame-fest, in the form of a question:

Does anybody else find that Autotools based projects, while being very cross-platform, are almost impossible to actually cross-compile?

I do embedded systems work, and the embedded universe is moving to Linux as the kernel, and quite frequently a sub-set of the Gnu environment for the runtime. So you get things like BitBake, OpenEmbedded, and Angstrom, which attempt to enable you to build a complete system from sources. However, what all these environments do is run QEMU to emulate the target CPU in order to do the builds.

Now, that's stupid IMHO: I have this VERY FAST multi-core workstation, and I am
1) Throwing away all but one core, and
2) Hobbling that core emulating a completely different architecture (e.g. emulating an ARM to build for the OMAP).
Much of the - I shall use the term "work" although a more bovine-scatological term springs unbidden to mind - involves writing "recipes" for BitBake to work around issues in Autotools.

Much of Autotools seems to me to assume that the machine building the code will be the machine running the code - or at least, a machine of the same type as the machine running the code. So all the Autotools "magic" to deduce structure layouts, word sizes, byte orders, and such all assume that you can
a) compile the code using the host's C compiler,
and
b) Run the resulting probe programs on the host.
Both of which are totally FALSE for cross-compiling.

I used to say that you could design a new CPU that nobody had ever seen, and once you ported Binutils, GCC, and Linux to it, you could build an entire functioning distro for it, just by iterating over the various source packages and cross-compiling. I know better now: most of the source packages out there DO NOT cross compile worth a damn! They might BUILD NATIVELY on a wide range of architectures, but not cross-compile.

Re:Cross-platform, but not cross-compiling (1)

Kaz Kylheku (1484) | more than 3 years ago | (#33717106)

Indeed, requiring that part of the build takes place on the target machine, or having to emulate it, is an incredibly lame-assed copout.

At Zeugma Systems, I produced an embedded GNU/Linux distro known as Zeugma Linux. The rule of the project was that everything cross-compiles.

No MIPS instruction was emulated during the build.

I took the lessons that I learned and incorporated them into the small configure scripts that I write by hand, which take far less effort over the life of a project than dealing with Autotools.

Re:Cross-platform, but not cross-compiling (0)

Anonymous Coward | more than 3 years ago | (#33717574)

Does anybody else find that Autotools based projects, while being very cross-platform...

I've found that Autotools don't work for shit unless I'm using bash as my shell and I'm using the entire GCC toolchain. Time and again I've had autotools throw fucked-up errors when using tcsh or the Sun compiler.

Re:Cross-platform, but not cross-compiling (1)

bongey (974911) | more than 3 years ago | (#33718146)

What are you talking about? http://tbingmann.com/2010/apidocs/autoconf-2.65.zip/autoconf_14.html [tbingmann.com]
As someone who set up the build for a production system that compiles on Linux with Windows as the target platform, I am a little confused.
`--build=build-type'
`--host=host-type'
`--target=target-type'
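
In practice a cross build is configured along these lines (a sketch; the triplet and staging path are examples):

    # build on x86_64 Linux, produce binaries for an ARM target
    ./configure --build=x86_64-pc-linux-gnu --host=arm-linux-gnueabi --prefix=/usr
    make
    make install DESTDIR=$HOME/arm-rootfs   # stage into a sysroot-style tree

(--target is only meaningful for packages that themselves emit code, like compilers.)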

Re:Cross-platform, but not cross-compiling (2, Insightful)

wowbagger (69688) | more than 3 years ago | (#33718816)

Now, configure something like CORBA targeting a PPC, but configuring on an X86, for compilation on an X86 using a cross compiler.

Oops - all your structure padding code for the CORBA marshalling is broken, because the autoconf scripts all assume they can find out the padding of the structures by emitting a program, building it, running it (whoops! wrong arch!) and getting the output.

It's all well and good if the program is trivial enough that it does no serious probing of the system, or if the configure host is the same CPU type as the target. But break that assumption, and unless the project's configure script is set up to correctly detect a cross compile, to provide a set of command-line parameters for the variables that would normally be inferred by probing, AND to have the wit to refuse to run without those parameters when it detects you are cross compiling, you will NOT get a running program by default, NOR will you be able to get one without major surgery on the configuration data by hand.

Re:Cross-platform, but not cross-compiling (1)

bongey (974911) | more than 3 years ago | (#33719038)

CORBA? Your original comment was talking about C and an ARM system.
You have committed a logical fallacy: http://en.wikipedia.org/wiki/Perfect_solution_fallacy [wikipedia.org]

Sticking to the original post: you can cross compile from the x86 arch to ARM without an emulator.
It has been a while since I played around with ARM, but here is an example: http://www.ailis.de/~k/archives/19-ARM-cross-compiling-howto.html [ailis.de]
On a side note, I don't care much for autotools, because I am surrounded by windoz users at work, and they love visual studio.

Re:Cross-platform, but not cross-compiling (1)

achurch (201270) | more than 3 years ago | (#33719842)

Interesting you should mention this; I've had the same problems you describe trying to get CMake to cross-compile, but with autotools, "--target=other-cpu" has generally worked fine in my experience (making it just about the only redeeming feature in that spaghetti mess of shell and m4 code). Admittedly I haven't tried building an entire Linux distribution, so maybe I just happened to choose packages that don't rely on running test programs, but IIRC autotools will explicitly disable the standard runtime tests when cross-compiling.

Re:Cross-platform, but not cross-compiling (1)

Vintermann (400722) | more than 3 years ago | (#33719934)

I know better now: most of the source packages out there DO NOT cross compile worth a damn! They might BUILD NATIVELY on a wide range of architectures, but not cross-compile.

I think this is the problem with autotools. It gives the impression of supporting lots of things, but the majority of scripts out there break if you even try to build in a separate tree from the source code. All those checks for the behavior of strcpy() and so on impress newbies into thinking that their program would compile on Xenix and Sys5 and what have you, while in practice it's completely pointless.

FOR ALL AUTOTOOLS "REPLACEMENTS" (2, Insightful)

eexaa (1252378) | more than 3 years ago | (#33716806)

please note that all current 'replacements' are totally wrong and actually work only as puny build systems, not supporting any of the great portability benefits that autotools give. Scons, cmake, and whatever else depend on a working installation of themselves on the _build_ machine. This is wrong, only working shell+make+gcc should be needed to actually build software.

So...

Is there ANY good "replacement", preferably lightweight, with this great virtue? As far as I know, there's none.

Re:FOR ALL AUTOTOOLS "REPLACEMENTS" (0)

Anonymous Coward | more than 3 years ago | (#33716986)

I do not think your 2MB+ configure shell script file trumps my 2KB .pro file that can do the same thing and more.

qmake and cmake work just fine while autotools drives me crazy.

Re:FOR ALL AUTOTOOLS "REPLACEMENTS" (1, Interesting)

Anonymous Coward | more than 3 years ago | (#33717328)

only working shell+make+gcc should be needed to actually build software.

You've identified the key dividing line between Autotools fans and detractors. If you absolutely must limit your dependencies to "shell+make+gcc", Autotools may be the best tool for the job. I just don't understand why anybody chooses to limit themselves that way. It seems like all pain and no gain.

Re:FOR ALL AUTOTOOLS "REPLACEMENTS" (2, Insightful)

paugq (443696) | more than 3 years ago | (#33717432)

What about Windows (Visual Studio)? No shell. Incompatible makefiles. What is the answer then?

Re:FOR ALL AUTOTOOLS "REPLACEMENTS" (0)

Anonymous Coward | more than 3 years ago | (#33717768)

What about Windows (Visual Studio)? No shell. Incompatible makefiles. What is the answer then?

CMake, or something like it.

good lightweight autotools replacement (0)

Anonymous Coward | more than 3 years ago | (#33717498)

quagmire [google.com] got the concept right, but seems to be pretty dead by now ...

No amount of documentation ... (5, Interesting)

Kaz Kylheku (1484) | more than 3 years ago | (#33717076)

can compensate for the low quality of this garbage, not to mention its poor performance (some builds spend more time running configure than actually compiling, installing, and tarring up the resulting run-time and devel packages!)

These tools have to be redesigned from the ground up by someone who understands that free software has made large inroads into the embedded world where it needs to be, doh, cross-compiled.

For my own projects, I develop a carefully-crafted configure shell script by hand and recommend everyone do the same.

The script has carefully developed and tested support for (1) cross-compiling, (2) configuring/building in a separate directory, and (3) installation into a temporary package directory.

Furthermore, this script should ONLY test the features that are actually needed by the program. (The program should assume a reasonably modern system; don't bother testing for obscure bugs in System V release 3, okay?) The script should make sure that it tests only header files and compiling with the toolchain that it is given. The configure test programs should never accidentally include something from the build machine's /usr/include, or link something from /usr/lib!

Configure scripts should never run any program that they compile because it may be the wrong architecture, and they should not have an "if cross compiling" check which disables their features when cross-compiling and substitutes dumb defaults! Quite simply, implement the tests so that cross-compiling is not needed.

For instance, instead of making a C program which outputs values with printf, make it so that the values are used as initializers for static data. Then use the cross-compiling toolchain's "nm" utility to extract the values from the compiled object file. You don't have to run a program to know how wide a (void *) is. Just do "static char size_of_void_star[sizeof (void *)]". Compile it, and then interrogate the .o to discover the size of the data region named by size_of_void_star!
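
A sketch of that probe, assuming a GNU binutils/gcc cross toolchain prefixed arm-linux-gnueabi- and an ELF target:

    # never executed, only compiled: the array's size encodes the answer
    cat > probe.c <<'EOF'
    static char size_of_void_star[sizeof (void *)];
    EOF
    arm-linux-gnueabi-gcc -c probe.c -o probe.o
    # read the symbol's size back out of the object file (second column)
    arm-linux-gnueabi-nm --print-size probe.o | grep size_of_void_star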

Look at the stupid configure script for GNU bash. When crossing, it assumes totally pessimistic values for all checks that can't be done when cross-compiling. Unless you explicitly override the ac_* variables yourself, you get a shell which has no job control! The bash build assumes that your kernel has a tty system dating back to the middle 1980's.

Nobody should dig through a configure script to find out what broken tests they have to override by setting the variables manually.

Shell code should never be generated by m4 macros, let alone ones which frequently change.

Look at the stupidity of this. Suppose you want to produce a minimal patch for some open source project that uses autoconf. Suppose that as part of your patch you have to enhance the configure script with some new options. To do that, you have to modify configure.in. But projects ship with the generated configure script. So you want to regenerate that, right? Oops, your version of autoconf is different, and so you get a HUGE diff. Worse yet, the generated configure script breaks!

So people end up with ten different versions of autoconf installed, so they can always run the right one which matches what the configure script was produced with that is in the tarball.

Build configuration from scratch is not difficult. It is easy. Autotools do not simplify anything. They monstrously complicate something which is already simple and doesn't require simplification or automation. Using the Autotools is a false economy. You may think you are getting something for free and saving time, but in the end you will spend more time over the life of your project wrestling with this garbage (and also waste the time of countless other people you don't even know) than if you just carefully wrote a small configure script by hand.

Amen brother - TELL IT! (1)

wowbagger (69688) | more than 3 years ago | (#33718852)

I whole-heartedly agree with you - and I'll make another point. Much of what auto* was designed to do was to allow porting the Gnu toolchain over to a target which did not have it - e.g. porting binutils and GCC to some Unix box that did not have them yet, thus they had to assume as little about the target system as possible to allow bootstrapping. However, the idea was that once you HAD binutils, GCC, and libc, you could use them, and have a predictable standard environment.

Nowadays, that is almost a gimme - any new system will almost certainly have GCC as a part of the bring-up. So why not move on to a pkg-config style system, where there is an executable that can be run for the target platform, that can answer questions like "How big is a pointer? An int? A float? Do you have these library routines? What do I do to dynamically link a program?" pkg-config does a wonderful job for the programs it knows about, so why not extend it to answer all the myriad of standard questions that auto* seems to ask every time it's run?

Sure, you'd have to generate that program for a new target - but guess what? most of the "questions" are ones GCC already knows the answer to, so why not just make GCC be the "pkg-config" for basic CPU arch type questions? Then, even if I am cross-compiling for a Floobydust300 rev B processor, all I have to do is 'floobydust-unknown-linux-gcc --whatis "sizeof(int)"' and I have my answer. (alternatively, dump an XML file, or a key=value file, or header, or any number of other approaches).
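
pkg-config already answers that kind of question for the libraries it knows about, for example:

    # ask how to compile and link against libpng
    pkg-config --cflags --libs libpng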

Re:Amen brother - TELL IT! (1)

oiron (697563) | more than 3 years ago | (#33719892)

Interesting solution, but it doesn't answer what happens when I don't use GCC - suppose I'm on Windows, using MSVC, or for that matter, on Linux using ICC or clang? Those are still important use cases for cross-platform applications and libraries.

So, let's have a program that knows about all these compilers and platforms, and can generate the appropriate build scripts.

Hello, CMake... ;-)

Re:No amount of documentation ... (1)

mfwitten (1906728) | more than 3 years ago | (#33719846)

For instance, instead of making a C program which outputs values with printf, make it so that the values are used as initializers for static data. Then use the cross-compiling toolchain's "nm" utility to extract the values from the compiled object file. You don't have to run a program to know how wide a (void *) is. Just do "static char size_of_void_star[sizeof (void *)]". Compile it, and then interrogate the .o to discover what is the content of the size of the data region named by size_of_void_star!

<limits.h>

portability with security is hard (0)

Anonymous Coward | more than 3 years ago | (#33718556)

that's why there are 2 styles of packaging:
many guys blame relative paths for security problems
while many devs need them


Not SUSE but Novell (0)

Anonymous Coward | more than 3 years ago | (#33719484)

"John Calcote is a senior software engineer in Novell's Linux business" He does not. He works for identity business, not SUSE.