Jon "maddog" Hall is the head honcho at Linux International. Monday we asked him questions about the future of Linux, his beer preferences, and other burning issues. Today, as our premier interview guest in the last year of the 20th century, maddog answers with his usual wit and candor.
1) What NEW directions do you see Linux going in?
There have been lots of articles on the future of the current Linux projects... What do you see as the NEW, non-current directions that Linux will embark on in the near future/next century?
Holy cow!! Linux is already being used in ubiquitous computing, embedded computing, turnkey systems, desktop systems, server systems, and Beowulf Supercomputers...how many more directions do you think it should turn?
True, I have been vocal in telling people not to use Linux as a "highly available" system, due to its current lack of things like a log-based file system, hot-swap hardware, better hardware error detection, and other features that a highly available system would need for 366-day-a-year uptime.
And true, many people have noted a lack of NUMA memory capabilities, which would allow expansion of processors past the cost-versus-performance limitations of a single ultra-high-speed memory bus.
And a lot of different countries would LOVE to have true internationalization and localization done, so that just by changing (or adding to) a message catalog, an operating system or application could be localized for a particular culture.
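That message-catalog approach is the model behind GNU gettext; here is a minimal, hypothetical sketch in Python (the "Disk full" message and the missing catalog are illustrative only; a real application would load a compiled .mo catalog for the user's locale):

```python
import gettext

# A program routes every user-visible string through a catalog lookup.
# Localizing it then means shipping a new catalog, not changing code.
# With no catalog installed, gettext falls back to the original string.
catalog = gettext.NullTranslations()   # stands in for a locale's .mo file
_ = catalog.gettext

print(_("Disk full"))   # untranslated fallback: prints "Disk full"
```

Swapping in a French catalog would change only what `_()` returns, not the program itself.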
And then there is the perennial lack of applications, device drivers, etc to make Linux a truly viable desktop.
All of these are bad news, but the good news is that each of them is being addressed at "Linux speed".
In the past, as recently as ten years ago, these were all seen as lacking in the commercial Unix operating system space. I heard these same issues given as reasons to use systems such as MPE, MVS and VMS, and was told that Unix systems "could never" do these things. Now most of these needs can be met by those same commercial Unix systems. The Linux community can now take the best of these solutions and designs and re-implement them to meet the needs of the Linux community.
Since envisioning a solution is the hardest thing, and architecting one is the second hardest, if we pick the best existing implementation and re-implement it, we should have solutions to these (and other) issues very quickly.
2) Linux vs. HURD
Do you think GNU/HURD might one day take over Linux's place? It certainly has a more modern design, although it is currently still in the works. Do you think it's a plausible alternative to Linux when it is ready for general consumption?
Or does Linux have a drive in the Open Source community that HURD doesn't? Linux seems to have generated a lot of enthusiasm, fandom, (and zealotry?). Could it be this drive that made Linux so successful and the lack thereof make HURD take such a long time to get developed?
(Disclaimer: I am NOT trying to start a flamewar between Linux and HURD supporters.)
I think that if the technical goodness of a kernel was the defining issue, then one of the *BSDs would certainly be where Linux is today.
It is hard to say why Linux caught on, and the *BSDs did not, but I definitely think it has to do with the community spirit that the Linux community has managed to garner.
I have known of the FSF for years. I fought to have free software delivered with Digital's Unix systems since 1984. I have tried to donate equipment to the FSF. Unfortunately for reasons dealing with Richard Stallman's ideals of Free Software, and Digital's inability to create a licensing agreement (in those days) that he could sign, I was limited in what I could do. After a while, it was JUST TOO HARD.
With the Linux community I have never yet run into an issue that was JUST TOO HARD. Most of the time people agree on what has to be done, and their goals and ideals are much the same as mine.
I will also say that I am REALLY GLAD that Richard Stallman is the way he is, and has the ideals he has, because he continues to show me the path that I SHOULD be going, even though I may only get halfway there. Without his lead, I would have no ideal to reach for.
Now, will HURD take over from where Linux is when HURD is ready? Perhaps. If it really is a better-performing kernel than Linux. Remember that a "more modern design" does not guarantee better performance, or even easier maintenance.
I have not looked at the HURD's design, but I understand that it is based on a microkernel, and from this you may get the idea that it is a "more modern design". But OSF's code was based on a microkernel, and so was MkLinux and a variety of other systems that have come out. None of these have shown a performance improvement over what can be done with a well-structured monolithic kernel with loadable device drivers and loadable kernel modules.
Since my customers sort of sit around with stopwatches measuring performance in SPECmarks, SPECrates, TPC numbers, etc., it is hard to give up performance for other issues.
Still, if the HURD comes out, and if it is system-call compatible with Linux, there would be a good case for substituting the HURD kernel for the Linux kernel. I make the one stipulation because I think that ISVs are TIRED of porting applications from one system to another, and really want to see ONE binary interface from the Linux/HURD community for each hardware architecture. This is why I think that one of the most important aspects of the Linux community is the Linux Standard Base (LSB) project.
3) Chasing the taillights?
by Wiktor Kochanowski
Linux, and in general the Open Source development model, has been accused in the past of "chasing the taillights" -- of always catching up to features that other commercial programs have, because they are results of vision rather than of a creeping evolution.
Myself, I think there may be something in this view, when I look e.g. at the emerging UI input methods like voice recognition and pen input/handwriting recognition on the client side, and various goodies on the server side.
Do you agree with this? If so, is Linux condemned to always be a few steps behind the current state of the art of OS design, at least as far as features go?
If not, what examples of vision and features unique to Linux would you provide as examples?
You are talking to someone who has been in the computer field for thirty years, so for the most part all I have seen is "chasing the taillights". Sure, there have been some innovations in networking, but most operating systems have taken a lot of their "innovation" from systems such as the Whirlwind, SAIL and Xerox STAR and "re-implemented" it.
[I am sure I pissed off a lot of people with that last statement, but I purposely made it that way to get people to really take a look at what these older systems have done, and to marvel at what they did so long ago.]
I think that a lot of the features of an "operating system" you are talking about (i.e. input/handwriting, voice recognition, etc.) are things that were developed as layered product projects, and "integrated" into a certain operating system by a certain company we all know as part of "THAT COMPANY's innovations" (DON'T GET ME STARTED ON THAT TOPIC).
As the Linux worldwide market grows, I think you will find that more and more of this innovation will happen on Linux, due to both the Open Source concept and the worldwide virtual groups of minds who will work on it. The difference will be that the Open Source model will show where the innovation actually came from, and not where it was bought from.
4) Free community certification?
Regarding the recent community Linux certification efforts, can we expect to see LI take part in them? Are we going to get free community certification for Linux? Especially since all PHBs now seem to want certification...
It was members of LI that started the LSB effort, and a lot of our members are very active in the pursuit of this standard.
LI members encouraged both Sair and the LPI in their standards efforts, feeling that (particularly in the early stages) two or more open certification efforts would be useful, since the community would decide (in the end) which of the certification efforts was best. The voting on this would be by how many people signed up for that certification, and which certification was judged best by the hiring managers of the certified people.
In the case of hardware certification, LI has been encouraging an emerging distribution- and manufacturer-neutral certification group whose goal is to determine what the steps for certification should be, rather than to set any rules of certification itself.
I do not believe that there will be a "Free Certification", simply because there is a lot of boring, mundane work in marking answer forms, administering certificates, etc., but it can be made as inexpensive as possible by making it open, with openly published standards that have to be met.
5) Linux feature growth
As I mentioned in a recent article thread, the Linux kernel is braving new waters in several areas which UNIX has traditionally shunned in the kernel (graphics support, an HTTP server, game support, network management, etc.). These features raise the eyebrows of many people, but is this the way you see operating system design moving in the future? Are we so bound by the dreaded user-mode context switch that we have to plow every service as deep into the kernel as it will go?
Mind you, I'm all for the khttpd idea as a single example, but it seems like the beginning of a trend that will end up making the original Linux kernel look like a wristwatch driver, and leave a lot of low-end users in a bind....
I still remember the time we placed the X window server inside the Ultrix kernel. This created a few problems. Not the least of them was that a coding mistake that, with a user-space X server, would normally only cause the X server to crash, the person to be forced off the system, and the login prompt to reappear, would NOW crash the whole system with some type of "kernel panic". We also noted that the X server (which managed its own memory) would grow without bound, using all of the available kernel memory in a few hours under specific graphics loads. In a user-space X server this was (at least) tolerable, but in a kernel-based X server it caused the whole system to hang.
All of this was to save a few microseconds of context-switch time and to give a better "feel" to the X server. Then an engineer got almost exactly the same "feel" with the user-space X server by raising the scheduling priority of both the X server and the window manager.
And I might point out that since that time kernel-based scheduling of lightweight threads has made this type of issue even less of an argument.
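That scheduling-priority trick survives today as the ordinary POSIX "niceness" mechanism; a minimal sketch in Python (note that on a stock system only root may *lower* a niceness value, so this example raises its own instead):

```python
import os

# Each process has a niceness from -20 (highest priority) to 19 (lowest).
# Raising a server's priority, as with that X server, means lowering its
# niceness; an unprivileged process may only move in the other direction.
before = os.getpriority(os.PRIO_PROCESS, 0)   # 0 means "this process"
after = os.nice(5)                            # add 5: politely yield the CPU
print(before, "->", after)
```

The `renice` command does the same thing from the shell, which is essentially what that engineer did for the X server and window manager.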
My personal belief is that there are certain things that an operating system kernel should do, which is schedule resources among hardware and processes, including memory and CPU time. All other things should be put out in user space. But the last time I wrote kernel code was twenty-five years ago for a PDP-8....
[and speaking of time....as I typed this last part of the answer, the clock on my Linux system turned to:
[maddog@localhost maddog]$ date
Sat Jan 1 00:00:01 EST 2000
...so you can see that Linux is Y2K compliant]
6) How can you afford development?
I don't understand how Linux can compete in the upper-end server market, especially against competitors like Microsoft and Sun.
Microsoft is about to release Windows 2000 Datacenter, which will allow up to 64 GB of RAM and 32 processors. How can any one company afford that kind of equipment for the development of Linux?
Do you have any plans to recruit companies like Compaq and Dell so that they are major players in the development efforts of Linux? It seems to me that it would be beneficial to have companies like this helping to direct the future development of Linux in terms of large-scale applications. I realize that these companies are developing drivers and such, but that isn't really what I'm talking about..
Apache running on Linux on a machine with 32 processors and 64gig of ram, able to out perform anything MS can throw at it. That is what I'm talking about...
First of all, let me point out that Sun has two major divisions, SunSoft and Sun Microsystems. While SunSoft MAY see Linux as a competitor, Sun Microsystems sees it as another operating system to help it sell the SPARCs and Intel PCs which Sun makes. Even SunSoft can look at Linux as helping to maintain the Unix marketplace, and perhaps re-creating the Unix desktop. This will, in the long run, benefit SunSoft.
As to the other large vendors such as SGI, IBM, HP and Compaq, each of these companies has engineers working on internal projects to help Linux expand onto the larger types of hardware platforms. Unfortunately, as you get into these very large systems, there are several basic differences that can occur just in the support of NUMA memory alone. Different internal bus structures and architectures might make it very hard for one kernel to be delivered across all these platforms.
At the last Comdex show, in his keynote speech, Linus acknowledged the fact that these larger systems might have a radically different kernel (or kernels) developed for them, but that the kernel programming interfaces would probably remain the same.
Ergo, when you buy your distribution of Linux on a CD-ROM, for certain machines you may have to get your "boot diskettes" from the people who ship you the machine. Or perhaps certain kernel functions might be included in the boot ROMs that come with the machine, and linked into the Linux kernel as it boots.
I know a lot of you think of Compaq as a "Microsoft Shop", but they also sell about 1 Billion USD of Unixware every year, and support 11 different operating systems on their Intel platforms. As long as Linux helps them sell a significant number of hardware platforms, they will make the investment in supporting Linux on them.
7) How about the software no one wants to write?
How about the software no one wants to write? By this, I mean the software that most programmers would consider "boring", yet is truly essential to the further growth of Linux as a desktop and server OS. It's great that we have so many window managers, office suites, browsers, etc. both existing and coming down the pike, but what about the other stuff that's just not as exciting? The stuff you really have to pay people to write? Maybe third party vendors with paid employees are the answer, but will all of those companies want to make their software truly Open Source?
There are several projects underway which are looking at the funding of "boring" Open Source tasks. Some of them are quite interesting in their approach. One might be to fund scholarships for college students based on how much documentation they write, or have written. Or perhaps making it a co-op assignment. On the other hand, perhaps we have to be more prudent in how we make these "boring" tasks attractive. We all like to listen to the person who wrote the code, but what about listening to the person who wrote the documentation? Perhaps more people would write good documentation, if they were the ones invited to the many conferences and trade events occurring around the world, freeing up the developers to spend more time at home writing code (or even at workshops writing code).
8) Favorite beer?
by Mike Hall
I have had the chance to meet you at several LinuxWorld Expos, USENIX conferences, etc. At each of these events, you were always present at the parties with a large glass of beer.
My question: What is your favorite beer? and why?
I am not a great fan of the darker beers such as porter or stout, although this is not a hard-and-fast rule. For instance, I do like Guinness draught, particularly when it comes from the Temple Bar in Dublin.
As to the lighter beers and ales, I admit to being a beer snob, and I like few of the "national brands". Anchor Steam was my first "micro-brew", and still retains its unique flavor. Pete's Wicked Ale was a long-time favorite, but I feel my tastes have drifted away from it (or vice versa).
There are a lot of micro-brews that I like, and a lot that I really hate. Do not put any fruit in it, or strange spices, or "non-beer" things (such as hot Mexican peppers). And PLEASE don't hand me a beer that you feel has to be improved by sticking a lemon or lime into it. Save it for watering the plants.
9) How did you get the nickname?
How did you ever get the nickname "maddog"? Must be a pretty interesting story behind that, eh?
(SIGH) I have told this story so many times.....but perhaps this will curtail telling it a few thousand times more....
Once upon a time I taught at a small two-year technical college. The Dean of Instruction was a fine gentleman, but we did not see eye-to-eye on teaching students. Often we would have arguments, and often the arguments would get heated. During these arguments, the entire school would often hear both the Dean's and my opinions on many topics. And sometimes these arguments would get REALLY heated (like the time the Dean hired and fired me four times on the same day). It was during these times that the arguments were too hot for maddogs and Englishmen. The Dean was British....
Finally, my own question and answer:
Q: Why do you like Linux so much? Why do you spend time evangelizing it?
The computer industry has been very good to me over the past thirty years. I have seen computers go from room-sized monsters to things that would fit on your wrist, or at least in a small pouch on your side. Yet I feel that there are still a lot of answers that have to be found and even tougher problems that have to be solved by users.
I am fond of talking about the applications that I have seen running on Linux as I have traveled around the world. People working on understanding how the Universe works in places like Fermilab or Brookhaven National Labs, people working to find new paints, new sources of energy, and other research projects using Beowulf systems. I am interested in seeing people reduce the cost of embedded systems projects by using a well-written operating system that is scalable and free.
And finally I really enjoy seeing people working in the Health-care space, trying to disseminate information that can help cure diseases such as AIDS or cancer.
As I saw Linux spreading over the planet, and being used in places like Sao Paulo, Brazil, or Korea, or China, I knew that the planet earth had to take every chance to find the next great mind in computer science, and that it was less likely to find this mind in a closed-source environment that had all of computer development funneled through Redmond, Washington. We had to have a mechanism for finding the next "Albert Einstein" of computer science, and I see Linux as a magnifying glass, waiting to help us locate that person.
And so to you, Mr. (or Ms.) Einstein, wherever you are... become involved with Open Source projects, and give the world a hand. It desperately needs you.
md - at the beginning of the new century
Monday: Steve Wozniak. Tuesday: two special "surprise" interview guests.