
Does Open Source Fail the Acid Test?

sengan posted more than 15 years ago | from the people-that-can't-spell-Torvalds dept.

Linux 226

Norman Lorrain writes "This month's Computer magazine is running an article by Ted Lewis in which he predicts Open Source software will hit a brick wall. "


Sengan's comment -- have lost my password again :( (0)

Anonymous Coward | more than 15 years ago | (#2018435)

I disagree with Ted Lewis because:

1. Although some Free Software Projects are trying to put everything and the kitchen sink into one integrated package, others aren't. I expect those that do this will die of bloat as Ted says.
2. But many projects won't do this. Instead they'll use other programs in the proud Unix tradition of "make it do one task, but make it do that task well." This decoupling of code bases and reuse is a driving strength of Free Software.
3. Free software that "makes it" is usually well-coded in comparison to commercial software. This makes it easier to debug even if you're not too familiar with it, making the statement "any bug is shallow to someone" correct.
4. Since the main cost of code is maintenance, companies whose main product is not software but hardware or services would like to reduce this cost. Parallel debugging is one way of doing so. We may therefore see a lot more open source from companies in the near future.

That said, the Linux kernel could be considered to be tending towards point 1. A possible way of dealing with Linus-overload is to move a lot of driver code out of the kernel and keep it separate -- like sound used to be, and PCMCIA is. This would have the added benefit of cross-OS portability (Hurd, BeOS, Solaris for x86, etc. -- the PCMCIA and GGI projects have made some steps in that direction). By spreading driver development over multiple OSes you increase the number of developers and reduce the overall cost to each OS.

Does it matter? (0)

Anonymous Coward | more than 15 years ago | (#2018436)

Open sourced programs are the result of all the different flavours of UNIX and the developer not having access to all of these flavours.

GPL is the result of RMS being unable to fix bugs in software he was using.

If it takes off or not doesn't matter. It's not going to go away.

People have made do thus far without support from companies, and if everyone decided to back out, they'd leave many, many lines of useful code in their wake.

- Idcmp

people-that-can't-spell-Torvalds dept (0)

Anonymous Coward | more than 15 years ago | (#2018437)

people-who-can't-spell-Torvalds dept

if you're going to nitpick, be wary of nitpicking...

short sightedness (0)

Anonymous Coward | more than 15 years ago | (#2018438)

A good counter-argument would be that in practice it works the opposite of what he describes. Companies like to sell their products based on how many features they provide; there isn't much to be gained from slimming them down. Open source types, on the other hand, tend to make the job easier on themselves by reducing bloat.
Maybe Teddy hasn't noticed that...

I sort of agree. (0)

Anonymous Coward | more than 15 years ago | (#2018439)

Open Source has served Linux well in the past because of the tradition of componentization in the traditional Unix userland. As VinodV observed in the Halloween document, Unix-style componentization leads to more robust components (because the component authors don't know what inputs their components will be getting) and enables Unix authors to "party" without stepping on each others' toes.

This form of componentization hasn't applied to certain monolithic bloatware applications (sendmail, emacs, Perl). I personally avoid emacs (using vi or JOVE instead), use qmail instead of sendmail where possible, and am learning awk/sed and Python to avoid the bloat of Perl. Nevertheless, the forces pushing developers and administrators towards this kind of bloatware are subtle and powerful.

GUI applications are a whole 'nother tarpit. Basically, I don't believe Open Source will produce polished GUI applications; instead, you'll get more retrofitted bloatware (GtkEmacs, anyone?) and lots of pointless cloning (57 different IRC clients, anyone?).

Just remember, OSS authors: Source code is a LIABILITY, not an asset. We want you to solve problems using *less* source code, not more.

picking nits (0)

Anonymous Coward | more than 15 years ago | (#2018440)

His "what linux is and isn't" chart seems to go counter to his arguments... but let's just look at the isn'ts:

1) lacks "Video card support"
I suppose the video signal is getting from my box to the monitor by magic. This smells of FUD. True, there are many cards out there that Linux does not support, but to say it lacks "video card support" implies that Linux has no video capabilities at all.

2) lacks "Wireless LAN support"
I can't comment on whether this is true or not, mainly because I don't care if it's true or not. If wireless support = mainstream viability, then I must be living on a different planet.

3) a good selection of productivity software
I suppose Applixware, Wordperfect, Star Office, etc. aren't good enough. I think I smell a "no M$ Office=worthless" FUD. It's got perl, it's got vi, what else does it need? ;-) I know, I know, people are idiots. Vi would never fly in the "real world". That's why we have pico ;-)

where'd my cookie go?
-adam a

I agree, Open Source is doomed in mass public (0)

Anonymous Coward | more than 15 years ago | (#2018441)

Disclaimer: I haven't been able to read the article and I'm fairly new to Linux.

Open Source seems to be lucky. The potential for planting viruses in something like Linux seems great. If open source spreads like it is meant to, sooner or later the checking process and honesty will fail. If this begins to happen and makes it into the mass media, open source would be permanently limited to elite users only.

OSS (0)

Anonymous Coward | more than 15 years ago | (#2018442)

As long as there are people who refuse to accept substandard software, there will continue to be development of OSS software created in a collaborative effort among them. I can't imagine what life was like before OSS, and I shudder thinking what it would be like should it fail to exist.

Why compete with MS for users? They have their niche, we have ours. I happen to think MS is good for their market, and OSS is good for the type of people who read /.

It is a sad day when there is no choice but microsoft software.

I personally believe both sides are good, not to say I love Microsoft... but strictly OSS for all users isn't a good idea either, as most of them are relatively stupid and couldn't care less what goes into their systems.

I just don't like the idea of trying to annihilate each other. Each serves a purpose; each has its weaknesses.

Flame away,


Mainstream Acceptance (0)

Anonymous Coward | more than 15 years ago | (#2018443)

If the threshold of functionality required for mainstream acceptance of some software exceeds the threshold of functionality required to satisfy the desires of the members of the free software developer community, there is a good chance that software will never have mainstream acceptance, unless there is some supplemental motivation to enhance the software, like money.

Is this a problem?

This smacks of FUD (0)

Anonymous Coward | more than 15 years ago | (#2018444)

This smells exactly like the BS that they used to run about the AS/400, except insert "AIX" everywhere you see Open Source. Fuck Microsoft, and fuck columnists who think like this. I work for a telecommunications provider. We have one goal: 99.999(999999999)% uptime. I won't ever see that with Windows NT, while I've come close with AIX, Solaris, and Linux. I'm leaning towards Linux. There's a big difference between having the source and not. I guess he's making the mistake of thinking the average moron NEEDS to run Linux. Fuck 'em, let them run OS/2. I need Linux to run my Telco, and screw everyone else. You DO want dial-tone, right? You DO want Voice-over-IP, right? You DO want ADSL, right?

I dunno, forget the article.

Non-flaming counter-argument (0)

Anonymous Coward | more than 15 years ago | (#2018445)

Yes, but I'm just some unknown nobody, even in the Linux universe, let alone the "corporate" world. I could go on about the circular logic used to come to the conclusions the original author made. I could point out the fact that, as we have seen quite strongly, Open Source is increasing the programmer population, not decreasing it, and that creeping featuritis can't really live in the Open Source model. Good examples of this, from both points of view, are the GNOME and KDE projects. While a real waste of programming talent (XFCE makes these two look nearly as bad as an MS product), they are actually bringing more programmers into the projects, keeping the code clean while allowing mega-features in without bloating the programs.

Time will show this to be the case, just as it showed all those people who, back at SD '95 East, said that there would NEVER be any commercial-grade DBMS products ported to Linux. I quote a representative from Computer Associates who, when asked if Ingres would ever be ported to Linux, said "Not only no but HELL NO!!"

Joe

False assumptions (0)

Anonymous Coward | more than 15 years ago | (#2018446)

First, the article states that the code in Linux is increasing exponentially and will eventually suffer from "code bloat". While it is true there are new features added to Linux all the time, you certainly don't have to use all of them. That is the beauty of having the source code and compiling your own kernel -- you only put in what you want or need and leave everything else out.

Second, he states that there is a scarcity of programmers that can (or are willing to) contribute to Open Source. It seems to me that there are many programmers willing to work on Open Source projects just to see how it is done; i.e. there seem to be quite a few CS students out there just working on this stuff to see how it all works. Many initial developers seem to move on, but so far there have been people to replace them. (The GIMP is an excellent example of this) He also cites that Apache development is now only supported by 20 core programmers, compared to the development team of "thousands". It seems to me that once the main project is done, you don't need thousands of programmers just to maintain a code base and make minor bug fixes and improvements.

I think these two things alone make most of the conclusions of this article way off base.

counter-argument (0)

Anonymous Coward | more than 15 years ago | (#2018447)

With commercial software, your speed of development and maintenance is only as good as the number and quality of programmers you can hire. For each good programmer that you hire, another product loses one. Open source does not suffer from this problem.

One can stay with an open source project while working for a succession of different private projects. The private projects have the benefit of your expertise only as long as you are employed there. The open source project may enjoy the benefit of your expertise for as long as you desire to contribute.

The 100 very best programmers on earth could all work on separate projects at work and yet collaborate on a single project in their spare time. Open source has much more potential because it can attract very good talent.

Open source does not have the recruiting problem that private companies have, so long as talented programmers are willing to contribute to existing open source projects rather than try to reinvent wheels.

False. (0)

Anonymous Coward | more than 15 years ago | (#2018448)

This is the sort of FUD M$ would like you to believe, but it's simply not true. Free software projects usually have a single principal maintainer, or a small group of them, who guard the 'official' release code very well. Certainly no code makes it into release versions without their checking it over -- and they have a strong interest in not allowing trojans or other defects into their software, since their reputations rely on it. A lot of these people read code like you read English (or maybe better). Even if a trojan somehow made it through in one release, it wouldn't make it into the second. While it is probably true that only a small percentage of users actually check out the code, if the software is useful and popular enough, that is all that is needed. The defect will be found, people will be alerted, and the code will be corrected.

I agree, Open Source is doomed in mass public (0)

Anonymous Coward | more than 15 years ago | (#2018449)

Don't be daft. The most likely way that viruses will inundate Linux is if WINE gets Word running well enough that it can propagate the macro viruses. I'd venture to say that Linux, of all the open source projects, is extremely unlikely to ever have a virus implanted in it, considering how well reviewed code submitted to it is.

I don't understand (0)

Anonymous Coward | more than 15 years ago | (#2018450)

How is the potential for planting viruses (or virii, depending on who you ask) great if you have the source? It seems to me that there are thousands of people going through the code every day, and someone is sure to notice. However, a binary can contain a virus and easily go undetected with no peer review. I could be wrong, but it seems like it sort of relates to the "security through obscurity" encryption style.

using his arguemnet,the oss keeps things unbloated (0)

Anonymous Coward | more than 15 years ago | (#2018451)

I admit Linux still has problems leaving its tech and Unix roots, and it needs to grow its own way instead of just copying other Unixes, but if Linux ever becomes like Windows, I can picture thousands upon thousands of angry users fixing the problem.

An example is the introduction of modules. A lot of Linux users were extremely picky about which things they wanted to go in the kernel; they wanted the fastest performance on the planet and did not want extra features. The result was modules that you can add and remove. Another problem was ease of use. A couple of Linux users decided to write installation programs for their distributions for easier installation. And in another incident, when the Unix guys were angry about the Windows lovers praising desktop integration and having things work together as one (MS went nuts with this with DLLs, OLE, etc.), the Unix and Linux guys wrote desktop environments for their window managers (KDE and GNOME).

I for one want to make Linux the easiest OS for all (including the Mac). I intend to someday write a remake of X Windows that has desktop integration inside it, and I also want to make it easier to associate files with icons without having to add code to the programs themselves. Most of today's Linux users don't want this, but I want to do it to keep Linux growing. Linus himself said that he thinks SMP is great but he really wants to see Linux come in at the low end.

Face it guys, the high end is already competitive and done being capitalized on; it's the low end where Microsoft has a strong stranglehold, and we need lower-cost PCs (like the days of the TRS-80 and Atari). I remember when the Apple II was the rich guy's computer (when it cost only 400). Now these same people buy 3,000-dollar PCs. Microsoft is expecting to charge OEMs 120 or 100 bucks for Windows 2000. That's too much for many Europeans. We Americans don't realise that we're very rich; average people can't afford 3,000-dollar PCs. If PCs go down in price some more, more of these people will buy them and the market will continue to grow. Today, all the analysts expect the PC market to stand still around 2003 and stop growing. We need to make these things cheaper! The OSS will free them. If the code gets bloated, people will change it. With other OSes, if the code gets bloated, it stays bloated and people buy other OSes. If the code were closed, then Linux could possibly die of this problem.

wireless LAN (0)

Anonymous Coward | more than 15 years ago | (#2018452)

I don't know what kind of wireless lan this guy is trying to run, but this box seems to be handling my NCR card just fine...

Bugs in *nixes (0)

Anonymous Coward | more than 15 years ago | (#2018453)


obviously Linux has/had its share of bugs, but
all of them have been fixed on short notice. If
you want to have a look at the number of bugs in
a commercial *nix, check out:
--> http://sunsolve.Sun.COM/pub-cgi/us/

All of these have taken weeks, sometimes months
to fix because the source wasn't available. More
often than not, a patch fixes one thing and breaks
three others.

If you think that's bad, keep in mind that Solaris
is probably the best of the current batch of
non-OSS *nixes.

They miss the point (0)

Anonymous Coward | more than 15 years ago | (#2018454)

It's impossible to argue when they make the wrong assumptions.

"The business case for Linux and its open source cousins is based on two fundamentals: the software is free ... the availability of source code attracts debuggers"

This is bunk... As a business user of Linux I feel the more important issues are:

On the server end Linux handles all of these things much better than any Microsoft product.

They also misquote the Halloween Document (attributing things said by Microsoft to Eric Raymond). And they flat out misstate a lot of things. Where do they get that crap about failure rates? If a Linux utility failed on me 9% of the time I would throw it out.

Dennis Baker

Defect Density (0)

Anonymous Coward | more than 15 years ago | (#2018455)

The measure of defect density, while in many cases useful for gauging a programmer's accuracy, is an outdated way of measuring bugs in a product. To the end user, 1 serious bug in 10,000 lines of code is identical to 1 serious bug in 20,000 lines of code, provided the two programs accomplish the same task. Linux's smaller size in comparison to commercial Unices is an asset, not a liability, and it is ridiculous to suggest that Linux will necessarily bloat to the same size as the others.
As for the liabilities cited by the author:
1. Video card support. (Get a commercial X-server, if you love paying so much)
2. Wireless LAN. (Check out Linux Journal's December issue for the article on "Linux and wireless networking in Africa").
3. Commercial productivity applications. (What Unix version has more productivity apps than Linux?)
The author's good points are lost among his muddle and unreasonable assumptions. The comments about Microsoft seem pretty fair -- unless they get a fine on the order of $15 bil, they've got enough resources that they don't really have to worry about Linux, especially as new markets replace the desktop and back room server as primary revenue streams.
I see no merit, however, in his argument that open source will not influence the way that software is built in the future. Aside from the most obvious examples (Mozilla, sendmail, perl), we could also point out that the upstart (proprietary) BeOS discovered GCC to be dramatically better than commercial compilers. . .

The Cathedral sinks into quicksand of... (0)

Anonymous Coward | more than 15 years ago | (#2018456)

... international economic and political reality.

Why bother with these pathetic technical diversions?

Linux could be much less than the technical
excellence it is, and the obviousness of its
ascent would still hold.

Does anyone think that China is going to suddenly
start paying for the 95% of the software that is
pirated in that country?

Can someone please explain to me how countries
will soon start exporting major portions of their
GNPs to the US as they computerize?

Windows, Apple, and the lot could be a hell
of a lot better than they are and the obviousness
of the commoditization of OSes would still hold true.

So we have:

(1) World Computerizes and Pays major parts of
earnings to US companies

(2) World Computerizes and continues to pirate
and steal commercial software while ignoring
increasing easy to use and cheap OSS.

(3) World Computerizes and accepts a free,
open source model for OSes as the only ethical
and economic choice.

Which of the above 3 will happen?

Will Windows 2003 having better PnP and DirectX,
and being more stable, somehow change the
underlying economic and political realities?

Good article, right issues, but wrong conclusions! (0)

Anonymous Coward | more than 15 years ago | (#2018457)

This article should be read by the advocates
of the open-source movement, because it points
out some issues that the OSS movement
has, and thus we can solve the problems by
being aware of them.

The author is quite wrong in his conclusions,
though. He does not foresee that the OSS movement
will evolve and adapt itself to the new demands
of the future. Market forces and the commercial
sector will be fully integrated into the OSS
world. Without this adaptability, OSS would not
have come this far. OSS is sustainable.

OSS (open source software) is not basically about
cost-free software. It's about openness and
freedom. Here the API infrastructure is controlled
by the user community rather than by a
profit-driven, manipulative corporation.

Unsubstantiated claims (0)

Anonymous Coward | more than 15 years ago | (#2018458)

This article makes a number of unsubstantiated claims that on the surface appear to be logical arguments, but which fall flat upon examination.

Take the quote from the article in the summary above: "Success leads to features, and feature creep leads to bloated software." Here, the reader is misled to equate the addition of features with "feature creep". The improvement of an application with necessary, relevant, and reasonable features IS possible without "feature creep". Furthermore, UNIX itself is built along the lines of modularity... which explains such assertions as "Linux has no support for video cards" (well duh, of course it doesn't... X does that job)... and this modularity tends to reduce feature creep to a certain degree.

This is not the only misleading statement made either. There are a number of broad assertions made without any supporting evidence at all, such as: "Already, Apache is losing the performance battle against Microsoft's IIS." Hmm. News to me!

I believe that the educated reader can sit down and rip apart most of this article by identifying both the unsupported statements and the skips in logic, similar to the examples above.

GUI is possible (0)

Anonymous Coward | more than 15 years ago | (#2018459)

AC wrote:
> GUI applications are a whole 'nother tarpit. Basically, I don't believe Open
> Source will produce polished GUI applications; instead, you'll get more
> retrofitted bloatware (GtkEmacs, anyone?) and lots of pointless cloning (57
> different IRC clients, anyone?).

I disagree with this statement. The problem with
bloated GUIs is not in the principle.
Command line applications may be even more
bloated -- remember JCL in OS/360?

It is simply that one genius came along at the
end of the 1960s with the concept of the shell and
redirection, which made it possible to split
command-line apps into independent programs.

Now we just need another genius who would
do the same for GUI programming. Tcl/Tk comes
close, but it is not enough.
CORBA is not the solution either, because it lacks
something as simple as the shell concept of
"let other programs read the output of this program".
Let's think about it, and we will surely find a
solution. There are a lot of us to think.
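The shell concept the comment describes is easy to make concrete. Here is a minimal sketch (my own illustration, not from the original post) of how one program's output becomes another's input -- the same composition a shell performs with `ls | wc -l`, wired up by hand with Python's subprocess module:

```python
import subprocess

# First stage: list the current directory, capturing stdout as a pipe.
ls = subprocess.Popen(["ls"], stdout=subprocess.PIPE)

# Second stage: count lines, reading directly from the first stage's pipe.
# Neither program knows about the other; the redirection is the glue.
wc = subprocess.Popen(["wc", "-l"], stdin=ls.stdout, stdout=subprocess.PIPE)
ls.stdout.close()  # so `ls` receives SIGPIPE if `wc` exits early

count = int(wc.communicate()[0].decode().strip())
print(count)
```

The point of the comment is that nothing equivalent to this two-line glue exists for composing GUI programs: each stage here is an independent program, and the "genius" part is only the convention that stdout of one can feed stdin of the next.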

lacks "Wireless LAN support" ???? (0)

Anonymous Coward | more than 15 years ago | (#2018460)

About a year ago, we needed to set up a wireless router to a remote building. We decided to slap outdoor antennas on Lucent WaveLAN cards (2Mbit wireless LAN) and roll our own. I shuddered to think about 2 more 24x7 network Windows boxes to maintain. I had never seen or touched Linux, but we heard there were Lucent WaveLAN drivers available. A programmer who had been playing with Linux at home suggested we use a spare 8MB 486SLC2/50 (Imagine a 486/33 with the flu).

You can imagine the rest...that little box also became our ftp server, ran network diagnostics, a name server, company mail, and more. Now we run three Linux boxes (not counting dual boot systems), and half the engineers also run it at home.

OSS = more trojans inside ? (0)

Anonymous Coward | more than 15 years ago | (#2018461)

I think that the risk of finding a trojan in software is directly related to the size or complexity of the source code, and NOT to whether it is OSS or not.

The fact is that in free projects like KDE, Mozilla, or Apache, anybody can modify the source and include any modifications they want, but anybody can review the source and check it for trojans or backdoors too.

And on the opposite side, if there is a backdoor in the NT TCP/IP stack, it will be very difficult to find...

Response to Ted Lewis (0)

Anonymous Coward | more than 15 years ago | (#2018462)

Below is what I mailed to the author of the article.

Thanks for an interesting critique on Open Source and Linux. You bring up good points that are certainly worth thinking about. I'd like to point out a couple of mistakes in the article, though, if you don't mind:

Linux 2.0 does not have a multithreaded kernel, although Linux 2.2 does. 2.2 also does include video card support for a number of the more common consumer/hobbyist level cards. If you mean graphics cards instead of video, XFree86, usually used with Linux, has fairly good support for most common chipsets, and of course there's also the possibility of using the (commercial) Metro-X or Accelerated-X from Metrolink and XiGraphics, respectively.

Re failure rates / failure densities: While the Linux kernel itself is in the 1.5 MLOC range, the entire tool set, which includes the GNU utilities and much other code under GPL or BSD-like licenses, is much bigger. Not having all the source code installed, I can't give you an exact figure on that. That would affect your failure density estimate for the entire operating system installation considerably. Of course, at the reliability levels we're both interested in, hardware failure is a big factor, and although Linux supports RAID storage, it doesn't yet offer any kind of general redundancy clustering. That, not the "lack of a good selection of productivity software", is a feature sorely missing in Linux compared to commercial UNIX vendors like Sun and HP.

It's clear from the article that you're aware of many of the lead programmers in the various important free software projects (Linux and Apache being two that you mention) having been employed by companies directly or closely related to the open source market. However, you seem to have drawn a strange conclusion from this fact. It is not at all obvious that their employment in these companies means that they're not available for furthering the development of the projects any longer. To the contrary, in fact. Most of these people are on payroll with the specific task of developing and maintaining the free software project they "came from", and the companies in question readily accept that the code these people produce will immediately be released under the original free license. Why? Either it makes the total product of the company better (as is the case for Whistle Communications, whose file and web server appliances run the Samba file server - Whistle employs Jeremy Allison to develop Samba), or the company's internal use of a free product is large-scale enough to make it cost-effective to have one or more full-time employees developing it (which is how the Apache web server was born).

In other words, commercial distributors of free software don't take away momentum from the development of the software, nor does it mean that the software is no longer free. There is no threat to the free availability of Linux, even if Red Hat or someone else did become a controlling force on the commercial Linux market. The 1000 copyright holders of the Linux kernel ensure that - all of them would have to agree to commercialize the OS under the same commercial entity first. Nor could Microsoft "absorb" Linux, as you put it, for the same reason. They could embrace it, but that would only make it stronger, and all of Microsoft's contributions would also become free.

I'm also curious what your market share figures were based on. You claim that the Apache web server is losing market to Microsoft IIS. The most reliable survey I know of in this market, the Netcraft web server survey, has indicated for several months that Apache is not only the biggest, but also the fastest-growing web server product on the Internet, with a 54% market share this month. IIS market share has actually been fairly flat for the past few months at 23-24%, while Apache is constantly stealing away market from Netscape and other smaller products. Intranet deployments naturally cannot be reliably counted, since there are no "sales figures" for Apache or other free servers.

You also suggested that the Linux market share is 3.6%, which even if it were true (impossible to verify with no concrete data on actual Linux installations) is quite misleading, since a majority of Linux installations are low- to mid-range server systems. In this market segment, UNIX is not quite as weak as your article would lead one to believe. Dataquest's server market survey in October last year showed UNIX market share growing, dominating over NT more than two-fold. It's difficult to find reliable data on the market share of Linux or the various free BSD derivatives, but it would appear to be somewhere around 14% of the server market, concentrated in web and file servers. According to Datapro, Gartner Group and IDC, Linux is the fastest growing non-Microsoft OS; in fact, according to some reports it may be the ONLY non-Microsoft OS growing market share.

I can't help but think that your initial position might have been somewhat biased. Admittedly, so is mine - I like working on this platform. I guess it remains to be seen whether your acid test proposition holds or not.

FUD fud Fud FUD fud Fud FUD (0)

Anonymous Coward | more than 15 years ago | (#2018463)

This article is FUD, pure and simple, probably paid for by a "very large software house".

It has already been shown that GNU software has a significantly lower failure rate than equivalent commercial software.

What college do you go to? (0)

Anonymous Coward | more than 15 years ago | (#2018464)

I'm currently enrolled at UNLV and I'm having a tough time finding
anyone else that uses Linux... *sheesh* All the programmers have this idea they're going to work
for Microsoft and make lots of money. *sheesh*
Anyway... I've got to start a Linux group on campus,
as I know there are others there... I just don't know where they're hiding... :/

Defect density is a bad metric (0)

Anonymous Coward | more than 15 years ago | (#2018465)

The argument in the article to disregard defect rates and instead use defect density is a completely bogus argument.

Consider the following two programs:

Program A contains 10 bugs in 100 lines of code.
Defect rate = 10
Defect density = 10%
Program B contains 10 bugs in 1000 lines of code.
Defect rate = 10
Defect density = 1%

Each program contains the same number of bugs, but the bloat of program B covers up this fact under Mr. Lewis' metric.

Secondly, neither of the measures mentioned in the article concerns itself with the likelihood of tripping over a defect, or with its consequences.

If my program contains 2 defects, both are weighted equally. But a defect that spontaneously destroys your hard drive is much deadlier than one that simply fails to save my .signature file.

Further, no weight is given to the possible relevance of the bug. Consider a bug that makes it impossible to even launch an application (one that EVERY user would trip over) and a bug that is only tripped when you select 12 different items, click three buttons, pause for 10 seconds and then hit cancel (which is likely to only ever be seen by the developer).

Without having a decent metric upon which to determine code quality, Mr. Lewis' arguments about the reliability of Open Source are incredibly weak.

I won't even get started on his attempt to extrapolate userland defects into kernel space...

$enegan @ $lashdot (0)

Anonymous Coward | more than 15 years ago | (#2018466)

It's useless responding directly to this kind of FUD - but I can respond to Sengan's eagerness to publish such stories. His lack of editorial wisdom seems incurable, unless one considers possible motivations for such apparent cluelessness.

If this continues, soon there will be little but stuffed penguin ads and articles planted to generate controversy (translate: "hits" for $lashdot). It seems that if the author of this story is correct about anything, he's correct when applying his analysis to $lashdot, which seems lacking in quality control these days.

Well, one of the good things about the OSS community is when a website (like a system or an application) is no longer serving its purpose somebody can spin off another which does. We aren't beholden to $lashdot or anybody else.

Although I intended not to honor this story by responding to it, I will say that all the author's criticisms apply to commercial software, not OSS. $lashdot is becoming like commercial software, driven by "hit count", advertising, and the budding journalism careers of its editors.

It may be time to move on. $lashdot is having problems keeping its database uncorrupted these days, anyway. The next step will be selling all our names to junk mailers.

Oh, but the faithful will say: these people are making such great sacrifices for "us", putting in long hours for no pay to bring us the latest and greatest. And I'd like to sell you an NT system to manage your nuclear power plant.

Trash the Slash

Look around buddy, (0)

Anonymous Coward | more than 15 years ago | (#2018467)

If everybody is working on the GUI, why does everyone still work at the command prompt?
Are Apache, BIND, Sendmail, Qmail, the Linux kernel, the GNU toolkit, the egcs compiler, Perl and Python accessible without a GUI?
The "dirty work" that you talk about is the "fun" part for a true systems programmer.

Inside the Tornado? (0)

Anonymous Coward | more than 15 years ago | (#2018468)

I think Linux is starting to cross the chasm btw.

Quite a few of Lewis' comments seem to be straight out of the marketing book with this title. The key point is that "crossing the chasm" is hard to do, and the way to do it is to focus on one market exclusively and refuse to be drawn outside of that market until you have conquered it completely. Then find another realistic market to take over. Watch MS: they are very good at this, and they started with DOS, their strongest card.

I would suggest that this market for Linux is the Apache/Linux web server. Apache holds 54% of sites; that is not 54% of pages served, however, and that is important to remember. If the Linux/Apache combo can take this market over, then expansion into more areas will follow. I think the OSS community would be well advised to focus carefully on dominating one market segment. I would suggest that Apache might look to an OSS SQL database for serving web pages, and extend into that market next. PHP or similar, maybe.

Interesting thought: western government is built on democracy and openness, and the same could be true of software. Human nature doesn't thrive in a secretive, power-hungry structure; it usually destroys itself through its own arrogance.

However, I don't expect that to happen, because free people are contrary; they don't like the focus of hierarchical structures, and so Linux's strength is its diversity, as shown by the range of platforms it runs on.

I do believe, though, that Linux will be successful in low-cost appliances, because a US$40-50 software component in your $400 product is a big deal; if you can use Linux instead, that is compelling when you roll out 1M units. So embedded systems may pick up on Linux. The Netwinder, for example; but the Netwinder must be able to be a client for Citrix, and I am not sure if it can.

Other niches will be successful too; the thing is, though, Linux is written for programmers, by programmers.

Pay attention to his facts, though: they're not very accurate. Looked at the mail and DNS systems of the Internet recently?

I agree, Open Source is doomed in mass public (0)

Anonymous Coward | more than 15 years ago | (#2018469)

another AC wrote:

>Have you ever seen the little doom clone that the Microsoft programmer put into excel

No, but I'd love to.

Any instructions for how to get at it ?

factoid:fact ratio unbelievable (0)

Anonymous Coward | more than 15 years ago | (#2018470)

Classic marketing ploy by someone with a conclusion in search of a collection of words to justify it.
"No Wireless LAN support", slow Linux release cycle, Apache starved for innovation...

This article is the worst one I have seen in a long time.

Anthony "forgotten password" David

Wrong arguments. (0)

Anonymous Coward | more than 15 years ago | (#2018471)

Ok, the argument about code density stands on WEAK feet:

  • First, it's difficult to know how many lines of code other Unices have, as you usually do not have the source.
  • A Linux system doesn't have 1.6 million source lines; the Linux 2.2.1 kernel alone has 1.6 million.
  • As he compares complete systems, at least the following packages should be added to the Linux count:

    [andreas@heaven]$ ls
    XFree86-3.3.3     ghostview-1.5     make-3.77
    bash-2.02         glibc-2.1         ncurses-4.2
    bc-1.05           grep-2.2          rcs-5.7
    binutils-2.9.1    groff-1.11        readline-2.2
    emacs-20.3        gs5.10            sh-utils-1.16
    fileutils-4.0     gzip-1.2.4        textutils-1.22
    findutils-4.1     inetutils-1.3.2   uucp-1.06.1
    gcc-2.8.1         ispell-3.1        xboard-4.0.0.tar.gz
    gdb-4.17          less-332          gettext-0.10
    m4-1.4
    [andreas@heaven]$ find . -name \*.[ch] | xargs wc -l | awk ' /insgesamt/ { sum=sum+$1 } END { print sum } '
    4917423

    If you add the 1.6 million lines from kernel 2.2.1, you get about 6.5 million, and I obviously missed many packages.

Andreas

half and half (0)

Anonymous Coward | more than 15 years ago | (#2018472)

Linux will succeed as a tool to tie together networks of Dos, Windows, and Mac computers. It shines in terms of serving printers, modems, group email, and files, and it's convenient to use, once installed. The file system, if it crashes, is fairly easy to recover, much easier than Windows, but not as easy as Dos.

It probably won't work on the end-user desktop. It seems to be too difficult to write apps for it. It doesn't seem like there will ever be any programmable databases like dBase, Nutshell, or Q&A. Without these you can just forget it.

For languages, businesses have their choice of C++ and C++, both curses upon productive society. The rest - Perl, Tcl/Tk, Python - all have their own serious omissions in critical areas that make them unusable for serious work like customer lists.

You L1n3ks h4krz need to pick up some of those old DOS apps and see firsthand how to build a full-featured application. Paradox 4.5 for DOS (not Windows) will lead you in the right direction. It'll teach you more about ergonomics and feature-set design than ten years of listening to some dopey college professor.

$enegan @ $lashdot (0)

Anonymous Coward | more than 15 years ago | (#2018473)

>It's useless responding directly to this kind of FUD - but I can respond to Senegan's eagerness to publish such stories. His lack of editorial wisdom seems incurable, unless one considers possible motivations for such apparent cluessness.

Maybe it's because we readers want stories like this? Why don't you go to ZDNet and shut up.

>If this continues soon there will be little but stuffed penguin adds and such articles planted to generate controversy (translate "hits" for $lashdot).

You're just pissed because you don't get as many hits. If you don't want to give /. more, then leave. Now.

> Well, one of the good things about the OSS community is when a website (like a system or an application) is no longer serving its purpose somebody can spin off another which does. We aren't beholden to $lashdot or anybody else.

Well, go set one up then, and stop complaining. If it's really any good, I'm sure even /. will mention it.

> Although I intended not to honor this story by responding to it, I will say that all the author's criticisms apply to commercial software, not OSS.
$lashdot is becoming like commercial software, driven by "hit count", advertising, and the budding journalism careers of its editors.

Perhaps, but so is zdnet and all the others. Why don't you just sell your computer and not bother.

> It may be time to move on. $lashdot is having problems keeping its database uncorrupted these days, anyway. The next step will be selling all our names to junk mailers.

Idiot. Slashdot is a web site, not a person. Since when has a bunch of HTML been able to manage a database? I'm sure it'll be fixed shortly. This is just a retrofit of facts to suit your argument. Badly.

Personally I couldn't care less if /. sold YOUR email. They don't have mine.

> Oh, but the faithful will say. These people are making such great sacrifices for "us", putting in long hours for no pay to bring us the latest and greatest.

Nope, they just have lots of stories I and others like to read. Vote with your feet, you bore us all.


Bug density fix... (0)

Anonymous Coward | more than 15 years ago | (#2018474)

1. Take code below and copy to file ''
2. Do 'chmod 0777'
3. Do './'
4. Notice how bugs/lines of code ratio has changed

-----[cut here]----------------
#!/bin/sh
# Adds 30,000,002 lines of bug free(tm) code to Linux
# for that 'extra low bug density'(tm) feeling
file=bug_density.h
echo "/* include/bug_density.h */" > $file
echo "/* Bug Density Patch 11/02/1999 */" >> $file
echo "/* Anonymous Slashdot Reader */" >> $file
echo >> $file
echo "void bugdensity(void)" >> $file
echo "{" >> $file
echo " if(0){" >> $file
for i in 1 2 3
do for j in 1 2 3 4 5 6 7 8 9 0
do for k in 1 2 3 4 5 6 7 8 9 0
do for l in 1 2 3 4 5 6 7 8 9 0
do for m in 1 2 3 4 5 6 7 8 9 0
do for n in 1 2 3 4 5 6 7 8 9 0
do for o in 1 2 3 4 5 6 7 8 9 0
do for p in 1 2 3 4 5 6 7 8 9 0
do echo ' fprintf(stderr,"And another line of code");' >> $file
done; done; done; done; done; done; done; done
echo " };" >> $file
echo "}" >> $file
echo "/* end of bug_density.h */" >> $file


WOw, this guy reads books (0)

Anonymous Coward | more than 15 years ago | (#2018475)

Wow, this guy has read the 'Crossing the Chasm' book.

That book is good, but it's meant only for COMMERCIAL
software, moron! Get a life. So has the Porsche lost it?
I mean, it's not mainstream at $50,000, is it?
Simplistic views breed narrow minds!
Who gives a rat's ass if Linux is mainstream or not? It works well, all the techos love it, and MS earns less $$$.
Anyone pro-NT, anti-Linux must own MS shares, so don't listen to them.

Dominant Paradigm, Resistance is Futile, 7 of 9ish (0)

Anonymous Coward | more than 15 years ago | (#2018476)

I feel that the author of this wondrous article, who hides behind a secured server, misses the point of the article he quotes ("...The Bazaar...") where they talk about the differences in culture between a potlatch (give-away) culture and the more "corporate" (give-me) culture.

Language like "absorb and extend" reeks of Borg mentality and homogeneous culture.

I think what attracts me to all the flavours of un*x is the personalization of the operating environment. It puts the choice of interface design back into the hands of users.
Look at all the various window managers for XFree86. Sure, some of them may be buggy; I think the wonder and attraction is that, as a user, one can communicate with the developers to make feature requests, offer bug fixes, and give portability clues for other OS's...

The author seems to have a love affair with M$. True, I do enough of my own bashing of M$. If it wasn't for them, I wouldn't be making so much $$ in supporting WindozeXX. Thanks Billy!

I also think the author's calculations about code size, blah blah blah, are flawed. Here he is assuming a traditional economic model of continued growth. How many times have apps been rebuilt from the ground up as major design paradigms changed the way developers thought out the APIs? (And some apps keep traditional API structures to boot!)

Here's a clue... "Even if Linux proves to be the better client, Microsoft can always use tying - the creation of MANDATORY DEPENDENCY among applications - to maintain its monopoly."

sic: "..And the ploys of a WEAKER entry level player trying to upsurp the STRONGER players". Count how many times the author uses "weak" in this diatribe of ivory-tower mind tweak.

Another response points to the author's predominant focus, given his collection of books like "How to make money off your personal computer".

I tend to apply a permaculture model to computing. It is like any other tool. There is an old Peruvian saying, "a man is known by the tools he uses".

Does it really matter what OS is best, as long as there is some benefit from the practice of use? What do we use computers for anyway, other than to structure reality into coherent forms, make connections, build alliances, create art, and express the mundane, the commonplace, and the bizarre bazaar?

I use NT right now to post this. I do my development on Solaris. I like Linux because I can see the next trend in the Gnome/Gtk apps. I support Macs because the redhead who needs help is a cutie. It's all good.

Then again, I tend to favour the give-away side of the model, to use tcom products to create better designs.

If the designers @ Microsoft want to figure out how to get rich by directing where I want to go today,... yeah right,... not there!

And as far as usability and productivity go, any flavour of un*x seems to be more productive, regardless of flavour (RedHat, SuSE, OpenBSD, Slackware, Debian, Solaris). I just downloaded Corel WordPerfect for my RedHat installation. What else do people use computers for, other than writing or making digital art?

Oh yeah, no mention of Gimp? It rocks!

Oh, poor Microsoft, we bash on you, oh boo-hoo-hoo. A good example of a give-me culture, where M Gates has more money (a resource) than all the poorest nations in the world put together. I think there is something wrong with that. Oh yeah, he just donated 100 million to, about 2.17e-6 % of his actual wealth.

It comes down to haves and have-nots. However, the have-nots can breathe life into those old 386 machines and have world-class operating systems (speaking of Linux).

Oh wow, I am out in galactic central! Worf! Plot a course back to Earth! Engage!

So when I think about what goes on in the world: we still have problems with overcrowding, smog, diminishing natural resources, and the lure of better, faster machines with better, faster, more intuitive interfaces. How many times have I tried to walk my Dad through the control panel on WinXX? Or explain that "this program has performed an illegal function" was NOT because AOL was breaking the law.

So, Microsoft this and Microsoft that. Hey, if it works for you, great! If you have half a clue about making the most out of your computing experience, then maybe you want to check out some flavor of Linux.

Productivity, hmm: the author mentions sendmail, bind, ftp, and GNU gcc (egcs?). I think he is tripping when he says that there is poor video support (maybe read the HCL; it is the same for WinXX and any other system, or get support from the European markets... thinking of Diamond cards in particular. My poor Fire GL Pro 1000 is just gathering dust; nice card, doesn't play well with others...).

Back to my original thought of design paradigms. One thing I like is all the true innovation of the oss developers creating tools such as gcc, python, perl, tcl, all the libs, mysql, postgresql, apache. Roll your own app! Donate some help to the alpha code warriors/warrioresses!

Waiting for the fifth generation programming languages. Voice interface, somewhere in the 24th century...

OK, must sleep, end of rant...

Closing clouds, open source tweaks the corporate model, kant we all just get along? Whatever works best, maximum leverage of available resources to mutual benefit. Plant a tree or two..

Anonymous Coward

"I am an alien from outer space,
Come to save the human race,
Living in my briefcase,
Living in my suitcase,
I am an alien I say,
I am the word professor,
I am a word processor"

Lee Scratch Perry

My thoughts are the result of extreme exposure to multiprocessing systems
running multithreaded environments.

no data entry on client side (0)

Anonymous Coward | more than 15 years ago | (#2018477)

There is no way to do serious data entry on a Linux workstation.

Don't give me a song and dance about XForms or Netscape forms. There's no way to validate input as it's being done.

Set up a web database, and enter your phone bill 100 times. Can you do it in less than 45 minutes? I didn't think so. That's because the client has to send it over the network to the server, the server checks it, rejects it, and sends it back over the network to you. Can't get anything done that way.

The way it's done is with FIELDS, with a real lookup window. The data is checked as it's being typed in.

DOS has apps that do this, Windows tries and often succeeds, but nowhere in Linux can this be found.

And Linux won't succeed until someone like me has an app like this.
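The FIELDS-style checking described above, where each keystroke is validated locally against the field's format, can be sketched in a few lines. A minimal, hypothetical example in Python (the field names and masks are invented for illustration; a real entry screen would hook this into its keyboard handler):

```python
# Hypothetical field masks: '9' means "a digit goes here",
# anything else is a literal character the user must type.
FIELD_MASKS = {
    "phone":  "999-9999",
    "amount": "999.99",
}

def accept_keystroke(field, text_so_far):
    """Check input as it is being typed: every character must fit
    its mask position, so a bad keystroke is rejected instantly on
    the client, with no round trip to a server."""
    mask = FIELD_MASKS[field]
    if len(text_so_far) > len(mask):
        return False  # field is already full
    for ch, m in zip(text_so_far, mask):
        if m == "9":
            if not ch.isdigit():
                return False
        elif ch != m:  # literal character in the mask
            return False
    return True
```

Because the check is pure local computation, the rejection happens the instant the key is pressed, which is exactly what the round-trip-to-the-server approach cannot do.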

I agree, Open Source is doomed in mass public (0)

Anonymous Coward | more than 15 years ago | (#2018478)

Bullshit... anybody who knows the source code and the operating system well enough to be able to stick a virus in there just WOULDN'T DO IT. Have you ever heard anybody who's gotten past the initial "gee, this is kinda hard to use" aspect of Linux and actually used it for a significant period of time complain about it, other than say what features they want in it? Personally, I've never heard of anybody running Linux for a significant amount of time and realizing that Windows was better and switching back (well... except maybe a long time ago because of hardware support.) Anybody who uses Linux for any amount of time gains a respect for the operating system and wouldn't code a virus into it. And they also wouldn't want a virus in code that they run themselves...

Matt Spong [mailto]

virus-free nirvana (0)

Anonymous Coward | more than 15 years ago | (#2018479)

It's understandable that OSS products are challenged on the possibility of virus vulnerability, bugginess, and code bloat. Understandable, because these are real, legitimate concerns that we face all the time when we use commercial software.

It's also understandable that OSS users respond to this challenge with "huh?" Simply because these issues are never notable in open source software - they represent the strengths of the code that open source developers put so much pride in.

The Line counts are wrong (0)

Anonymous Coward | more than 15 years ago | (#2018480)

Mr. Lewis claims that
"Unix has more than 10 million lines of code, while linux has only 1.5 million".

This is inaccurate.

While the 2.0.36 kernel has (by my count) 918,835 lines of code, the 2.2 kernel has 1,603,773, which is nearly twice as much, and the kernel is not the only vital part of the system. If you include X, the compiler, and the C library as well, you're up to about 7 million.

And then there's BIND, the shell, all the shell utilities used by the startup scripts... In fact, if you go the whole hog and include everything on the distribution CD, it is much more. Counting all the lines of C, C++ and shell script code on the Red Hat 5.2 source CD-ROM, I get a total of 18,804,260 lines (in 432 packages). That's over 18 million lines of code. Naturally, not all of this is vital for everybody.
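A count like this is easy to reproduce. A rough sketch (the extension list is an assumption, and the original count would also have required unpacking the source packages first):

```python
import os

# Extensions counted as C, C++ or shell source, per the post above.
SOURCE_EXTS = {".c", ".h", ".cc", ".cpp", ".sh"}

def count_source_lines(root):
    """Walk a source tree and total the lines in recognized files."""
    total = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if os.path.splitext(name)[1] in SOURCE_EXTS:
                path = os.path.join(dirpath, name)
                with open(path, errors="replace") as f:
                    total += sum(1 for _line in f)
    return total
```

Pointed at an unpacked Red Hat 5.2 source tree, something like this would produce a figure comparable to the 18-million-line total quoted above.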

I think Mr. Lewis has been reading Darrell Huff.

- James Youngman
(who can't remember his Slashdot login).

Ya gotta compare apples & apples.... (0)

Anonymous Coward | more than 15 years ago | (#2018481)

How many people are involved in developing the Linux "core"? Don't compare that to how many people are developing GUIs... there is much choice in GUIs, and that is an advantage of Linux. Compare it to how many people are working on the "core" of other OSes. My guess is that Linux has far more!

Where are the Zeus figures on the MS benchmark!! (0)

Anonymous Coward | more than 15 years ago | (#2018482)

They probably didn't publish them because they were frightened!!!

History (1)

Zoloft (218) | more than 15 years ago | (#2018542)

as well as the currently snowballing way of things, has already proven him sadly mistaken.

End of story.

Lies, damn Lies, and statistics. (1)

jandrese (485) | more than 15 years ago | (#2018544)

You want a good Windows Text editor, try Vim. :)

It's also a good Unix/Be/etc... editor that everyone should use. Of course this is all IMHO. :)


gavinhall (33) | more than 15 years ago | (#2018548)

Posted by The Orge Captain:

As I understand the industry at this time, I find the comments made in this article to be fairly uneducated. I believe there is a fad going on, but it is not the open source movement; I believe that will be a trend in the future, as software gets closer to the user, becoming more custom for its intended purpose rather than the one-size-fits-all of today. The fad I see is people who place Linux on their desktop machines, and some members of my age group who install Linux thinking they will become some sort of hacker by installing it. Linux is already too complicated for the average user to run, because it was never designed to be run by the average user; the people constructing it were building it for their own use, and so there was little attempt to make things easy, or even just straightforward. There is a tremendous amount of upkeep involved in keeping any unix system running, particularly in securing it (unlike Microsoft, the unix community actually fixes vulnerabilities in a timely fashion). I see Linux as NT's nemesis: educated users running the operating system on higher-performance workstations and departmental servers. What I believe is needed in the corporate environment is a less complicated operating system that runs only what corporate desktop users need: a browser, word processing, enterprise ware. This would reduce the problems created by Windows in a corporate environment (horrendous upkeep and abuse). Linux is an excellent operating system, and I'm glad that I have it.

Lies, damn Lies, and statistics. (1)

pb (1020) | more than 15 years ago | (#2018554)

If you don't like Emacs, use vi (I like pico).

If you don't like sendmail, use qmail.

It isn't like you don't have any choices, unless you install windows. (let me know when windows comes with multiple, independent editors and e-mail servers, and everything else. Then let me know when you find a good *text* editor for it, and a good e-mail server, etc, etc. Then let me know if you had to port UNIX to do it. :)

As for the article, it was completely wrong. Testing programs for errors has nothing to do with how long their source code is. Seeing as how Linux is currently *the* *dominant* *unix* in the market, I don't see how the other players could do any better. Linux may not scale quite as well on other people's specific enterprise hardware configurations, but it can't do worse than NT. Therefore, Microsoft also fails the acid test.

What was the test again?

I agree, Open Source is doomed in mass public (1)

C.Lee (1190) | more than 15 years ago | (#2018556)

And the macro viruses still wouldn't work on the Linux side of things, because they are dependent on Word, which isn't available for Linux; a macro virus based on MS Word wouldn't have the slightest idea of how to access the Linux filesystem. Also, since WINE would be handling the actual disk I/O, and not an MS program under it, it shouldn't be too difficult to have WINE look for and redirect this sort of activity to /dev/null instead of to a hard disk, if it does in fact prove to be a problem.

IIS vs. Apache? Code bloat? Eh? (1)

ninjaz (1202) | more than 15 years ago | (#2018558)

This guy is obviously twisting the truth here in pursuit of an agenda. The last time I saw someone test Apache vs. IIS (ZDNet, no less), Apache beat it quite handily in the speed department. And Apache also serves to answer another of his arguments, namely code bloat: Apache handles this nicely through the use of modules (as does Perl). Don't want that piece of bloat? OK! It doesn't get loaded at runtime. The GIMP is a nice illustration of that, also.

Re: the Solaris vs. Linux code size/defect ratio: the last time I installed Debian, it had *lots* more to it. Of course, in typical Free Software fashion, it's all modularized, so you only get what you ask for. And I've seen Solaris boxes get wedged in X somewhat frequently. kfm even brought a Solaris x86 box here to its knees: it was causing things to go in extreme slow motion, wouldn't die with kill -9, and even survived a switch to single-user mode... and NFS was *not* involved. I've never seen that type of thing happen with Linux.

Untested updates? (1)

Chemical Serenity (1324) | more than 15 years ago | (#2018560)

Agreed, there are a lot of disowned, semi-finished packages out there... personally, I don't think that's such a bad deal. It's a shame there was the wasted effort, but in most cases it appears that these abandoned packages are a result of:

  • another package which is better/farther along;
  • real-life concerns (usually revolving around real job or school workloads);
  • a project leader who doesn't know how to attract people to his banner, or how to set reasonable goals;
  • a boring project which doesn't capture the interest of the parties involved for the long haul, or never attracts enough people to be completed.

In the case of the first, that's just simple software darwinism... either the coder helps them out or simply looks for something else to do. Saves us from wasted effort in duplication.

The second, well, not much can be done about that... ya gotta eat, but I've noticed that a lot of people will go and do that stuff and come back, or at least hand control over to a competent fellow developer.

The third can be summed up in pretty much one word: Freedows. ;)

The last is the only one that worries me, and has been identified by others as a 'trouble spot' for OSS software development under the so-named bazaar model... people often shy away from the uncool/boring projects for obvious reasons. I personally see this as an excellent opportunity for profit-motivated coders to fill the gaps in with commercial offerings... YAPMFL (Yet Another Profit Margin From Linux).

I question this 'untested update' thing you mention though... if you're looking for code stability, chances are good you've already latched onto a decent distro (rh, deb, suse et al) where a centralized body compiles, tests and packages changes prior to release. I've personally found redhat pretty good about supplying stuff which just works out of the .rpm, and seems quick on the draw to fix buggy things (although it took me 2 days to finally DL the wu.ftpd exploit-proofed RPM... they REALLY need to work on that site mirroring from

-- (remove the SPAM-B-GONE bit)

Error Metric (1)

wayne (1579) | more than 15 years ago | (#2018563)

Because it is based on wrong data. Ted Lewis wrote: "Unix has more than 10 million lines of code, while Linux has only 1.5 million." Now, Linus has just recently said: "Right now its in the amount of 15,000,000 now" (lines of kernel code) (see the transcript of the IRC session with Linus; it was here yesterday).

Linus didn't type that number; someone else was transcribing his phone conversation to IRC, and that person mistyped. The source to the Linux kernel is about 1.5 MLOC.

However, the article is still wrong, because the error test wasn't comparing kernels but utilities, and the GNU utilities are not dramatically different in size from the commercial utilities, in terms of KLOC.

Bloated open source.. (1)

Daniel (1678) | more than 15 years ago | (#2018566)

..I suppose that proprietary software (for example, Win2000 or Netscape 4.x) is less bloated?


Lies, damn Lies, and statistics. (1)

Daniel (1678) | more than 15 years ago | (#2018567)

IIRC, Emacs is just the core LISP environment and editor..all those other things are separate programs that happen to run inside Emacs.


Already Replied To It (1)

Robert Crawford (1742) | more than 15 years ago | (#2018568)

I sent a response to Computer the day the
issue arrived in the mail. I've already received word that it will be published, if they have space.

I have a copy of my letter on my home machine; maybe I can dig it out and post it.

OSS (1)

bain (1910) | more than 15 years ago | (#2018573)

Bravo. Spoken like a true prodigy.

Who would like to volunteer to support a 200-user office moving from M$ to Linux?


Lies, damn Lies, and statistics. (1)

luminiferous (2096) | more than 15 years ago | (#2018574)

For one thing, this article only looks at the 2.0 Linux kernel; for another, it does not compare number of programmers against defect density, or try to figure out the defect density of the various Microsoft operating systems.

Also, if a high-end *nix can have 60% defects and still remain on par with Linux's defect density, then what does that say about bloat? Considering Linux has virtually all of the functionality of the high-end *nix at the core level.

As for the argument that defect density and bloat will increase with Linux: that *could* be true, if everyone who coded open-source apps lacked the unix mindset of making one thing do what it's supposed to do, and do it well, without integrating everything into it.

As long as there is a will to do something, something will be done.

Copy? (1)

GreenPickles (2275) | more than 15 years ago | (#2018576)

Anyone have a copy in HTML or something... I hate PDF.

Too much talk (1)

diakka (2281) | more than 15 years ago | (#2018577)

This guy is ridiculous. He talks about Linux's reliability being suspect? I don't care what his reasons or arguments are: the proof is in the pudding. Linux development/quality is not slowing down, and it's not going to. OTOH, I can think of one megacorp whose software is gonna be way late :)

Market Stratification (1)

Helmholtz (2715) | more than 15 years ago | (#2018579)

The vast majority of the OSS/Linux talking heads seem to neglect the INTENSE stratification that exists in the computer business. I think Microsoft downplayed this very factor when they decided to feed NT to the common user, and are now paying very heavily for it . . . what's this there's going to be another consumer-grade OS based on DOS . . why? because NT is not compatible with cutting edge gaming at its core and will never be. I sure hope that the moron that came up with that strategy has been ripped a new one by his boss.

I know very little about the requirements of the server market, but I suspect that if reliability and stability is a strong factor then this great rise in NT usage might start to fall away with the advent of W2k, that is unless W2k is initially released in a very stable form heh heh heh. The next viable solution is a Unix-type platform, and it seems to me that this is where Linux will start to make some serious inroads against NT & the bix Unixes.

Then there's the consumer/business market. I initially clump them together because both of these major markets tend to use similar hardware, whereas the server market uses distinctly more advanced hardware. On the business end, Windows is pretty strong because it's fast. What I mean is that you can order a hundred or more PCs with Windows pre-installed and with a minimum of effort, all these PCs can be hooked up to a central network, and employees can be plunking numbers into spreadsheets and emailing jokes within a very short time. While Linux might offer advantages in maintenance, configurability, and performance, it will be hard to convince the business folks to go for it because it doesn't come preinstalled (although this might be changing), and when something doesn't work right there isn't a centralized someone to call and bitch at. Never mind that Microsoft charges for support; for a business that isn't a real concern. Add to this the additional cost of training virtually all the employees to use an OS that most have never seen and some have never heard of.

As far as the consumer market goes, there exist many sublevels. At the top (money-wise) there are the gamers. These are the people who plunk down many thousands of dollars on a predictably regular basis just so they can prance around as cyber-warriors (I have nothing against gamers, but sometimes I wonder if the tons of money might not be better spent). I think this is where Microsoft fumbled badly. The gamers are constantly looking for ways to get their machines to run faster, and with the talk of QuakeIII being multithreaded so it can use multiple processors, and the growing realization that for 32-bit programs that consume a lot of memory, NT offers a large performance advantage over 95/98, it should be obvious that Microsoft saw a way to feed an OS to a segment of people that are willing to spend a lot of money on a very regular basis. Unfortunately the very things that make NT run better on high-end machines also make it a nightmare for games that require direct hardware access. That Microsoft thought (and possibly still thinks) that they could have their cake and eat it too still amazes me. But I have digressed. This is another market that Linux will not have an influence in. The emergence of cutting-edge hardware along with its accompanying cutting-edge drivers (for 95/98) happens at such a rapid pace that there's no way Linux could keep up. Couple this with an already severe lack of game support (there are a lot of mainstream games other than Quake and Heretic), and the stage is set for 95/98 to rule this market segment.

Now for the consumers that are knowledgeable about their machines, and are eager to learn more. Generally these tend to be students at all levels, aspiring young programmers who are trying to learn as much as possible as quickly as possible and don't have to worry about things like a 9-to-5 job. Here is where Linux already lives, and is probably growing rapidly. Personally, this is where I fall. I've only been messing with computers for a couple of years now, but I've already worn out Windows 95 and just the other day I hosed my NT installation so that I might be "all Linux". I maintain a small 98 installation just for games. I think this segment of the consumer market is approximately the same size as the gamer segment, and possibly just a little larger. The only problem is that within this segment are also the people who use computers for what they are intended, and hence they don't need the latest new hardware, so their dollars tend to stay close to home instead of being scattered into the market. While this is a good personal philosophy, it really sucks for the market segment, as it lessens the importance of the segment. Linux also amplifies this sentiment. I have two machines at my house; one is a dual PPro that is my primary machine (and even though it is considered antiquated by many, it is still blazingly fast for everything that I ever need to do), and the other is a 486/66. When I was given the 486 it was slogging away with win95, and I was very impressed at the performance increase that was gained just by changing the OS to Linux. So the people who never felt like they needed a huge machine to begin with now have yet another reason not to spend their money.

Then there is the last and largest subgroup of the consumer market. I call this group the email-drones. The people who belong to this group are the vast multitudes that are buying PCs because Betty down the street has one and if Jenny gets one then she can send email to Betty who can then reply to her email, all without picking up the telephone. And then there's the chat and the ICQ and before you know it the beauty salon of the 50s has moved into the den and onto the computer screen. This market group is the single reason for the existence of both the iMac and the $500 PC. Everyone was shocked when the iMac didn't come with a floppy drive . . well, when I mentioned this in passing conversation to my mother, she said right away "What do you need one of those for?". I rest my case. I'm also not trying to be in the least sexist by seemingly populating this large market segment with stereotypical images of the "gossiping housewife". I use this image as a behavior descriptor only, as it applies to an equal number of males as females. Anyway, back to the point, here is yet another market where Linux will never make any serious inroads. My parents are still using an antiquated P75 machine, and every time my father grumbles about things taking a long time to load and/or programs crashing I think about setting up a Linux installation on their machine, as I know Linux could easily fulfill every one of their computing needs and would do it faster and better. But I already get too many calls from them whenever something happens on their machine that they don't understand, and I shudder to think about trying to have them add a directory to their PATH or some such over the phone. Once again, Windows wins because it is easy, not because it is the best.

So what does this have to do with the article? Well, I think that Linux will continue to grow in both market size and stature, and that it will become a real competitor to Microsoft in many areas, especially in the high and low end. But unfortunately the real meat is in the upper and lower middle of the market, and I suspect that the ability for Linux to penetrate this area is slim to none, but not for any of the reasons described in the article, rather for the simple reason that the demands of those market segments do not mesh with the attributes and philosophy of the Linux OS. Personally I think this is a good thing. There's a reason I have a 98 installation, it plays Age of Empires REAL well, and it gives me a place to go to "relax" at the computer. I like having the "play" separated from the "work" in this manner, and while I remain a strong admirer and advocate of the Linux OS, I think the first question that must always be asked is "What do you plan on doing with your PC?". And I'm glad that Linux is not the solution to the reply "Umm . . I don't know . . games and stuff . . "


CORE mechanics (was: Its True) (1)

Robert Bowles (2733) | more than 15 years ago | (#2018580)

Personally, I've always found the core mechanics the most fascinating, (sometimes to the exclusion of all else). The nuts and bolts are what makes Linux powerful and elegant.

As far as fame goes, Linus receives the most by dealing with core kernel code, the core of the core as it were. Alan Cox, working on core-kernel and driver code (mainly), currently receives second billing. RMS's contributions to make, glibc, (long list) get far less attention than they deserve. And for the developers working on the "fancy" projects such as X11, XFree86, Gnome, KDE, StarOffice and Gimp, great as their contribution is, there is far less name-recognition.

Weak argument (1)

Andy (2990) | more than 15 years ago | (#2018581)

The author states that code bloat and featurism will become a problem in the long term. Well, free software has been around for a long time and we've seen no real evidence of it yet. Most free software projects go through an early phase of rapid development and then settle down to a quiet middle age. Remember how often GCC and Emacs were released in the early 90's? Now minor releases are made at about 1-year intervals.

What really happens when a project matures is that the original programmers strike off into new areas (like GUI desktops) and leave their previous work to those interested in maintaining it.

Full of misfacts (1)

dj.delorie (3368) | more than 15 years ago | (#2018583)

This article was so full of factual errors and misconceptions that I think the author didn't do any research at all, and is completely ignorant of the topic he writes about.

The business case for Linux and its open source software cousins is based on two fundamentals: the software is free or nearly so, and the availability of the source code attracts a large number of debuggers - again for free or nearly so.

No, the business case for Linux and OSS is based on these two fundamentals: it performs better than its proprietary counterparts, and it fails less often. The author goes to great lengths to describe how total cost of ownership is a key factor in buying decisions, yet fails to realize that TCO is influenced most by unit failure rate, and studies have shown that the unit failure rate for Linux is far less than for Windows. Thus, TCO should be far less for Linux simply because you don't need support as often.

... the failure rate for the utilities on the freely distributed Linux version of Unix was ... 9 percent ... Unix has more than 10 million lines of code, while Linux has only 1.5 million. So the Unix defect rate could have been as high as 60 percent and still paralleled that of Linux.

This is a simple math bug. The failure rate for Linux is 9 percent. If the size of Linux were ten times its current size, the failure rate would probably still be 9 percent. That's what percents are - ratios. His "defect density" sounds to me like exactly the statistics he is refuting (thus, the defect rate of Linux is 9%, compared to 15-43% for other Unix), since density is a ratio. 60% of 10 million sounds like a lot more than 9% of 1.5 million.
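The rescaling being objected to here is easy to check: the article's hypothetical 60% is exactly the Linux 9% multiplied by the code-size ratio. A quick sketch using only the figures quoted above (the line counts and rates are the article's own numbers, not independent measurements):

```python
# Figures quoted in the article: utility failure rates from fuzz testing,
# plus the article's own line-count estimates.
linux_loc = 1_500_000
unix_loc = 10_000_000
linux_failure_rate = 0.09        # 9% of Linux utilities failed
unix_hypothetical_rate = 0.60    # the article's "could have been as high as 60 percent"

size_ratio = unix_loc / linux_loc                          # ~6.67x more code
rate_ratio = unix_hypothetical_rate / linux_failure_rate   # also ~6.67

# The 60% is just the 9% rescaled by code size -- which treats the failure
# *rate* (already a ratio) as if it were an absolute count of defects.
print(size_ratio, rate_ratio)
```

In other words, a 60% utility failure rate would be unusable no matter how many lines of code produced it.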

"Linux, once again, has had over 1,000 people submit patches, bug fixes, etc. and has had over 200 individuals directly contribute code to the kernel. ... Microsoft's core development team consists of 400 full-time program-mers and 250 testers... When compared to the size of the Windows NT effort, Linux is woefully undersupported.

Microsoft has only two times the core programmers, but they have 6 times as many lines of code to work on, and Linux is the one that's undersupported? Sounds like Linux has three times as many programmers per line of code than Microsoft.
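That back-of-the-envelope ratio checks out against the figures quoted above (the NT line count of roughly 10 million is an assumption inferred from the "6 times as many lines" remark, not a number from the article):

```python
linux_contributors = 200   # direct kernel code contributors, per the quote
ms_programmers = 400       # Microsoft's core development team, per the quote
linux_loc = 1_500_000      # the article's Linux line count
nt_loc = 10_000_000        # assumed, based on the "6 times as many lines" claim

linux_per_line = linux_contributors / linux_loc
ms_per_line = ms_programmers / nt_loc
print(linux_per_line / ms_per_line)  # ~3.3: about three times as many
                                     # programmers per line of code
```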

Already, Apache is losing the performance battle against Microsoft's IIS.

Didn't he see the recent article by ZDNet showing that Apache outperformed IIS by a wide margin?

Even the opensource community admits to this weak-ness: "The biggest roadblock for open source software projects is dealing with exponential growth and management costs as a project is scaled up in terms of innovation and size" (

He quotes a Microsoft source saying Linux is in trouble, then attributes it to the Linux community. Of course Microsoft is going to put Linux down. When taken in the correct context with the correct attribution, this statement loses its purpose.

So we can rule out any scenario in which Microsoft takes over Linux.

The GPL would prevent this anyway. Even if Microsoft tried to "absorb and extend", they'd never be able to do it without violating copyright laws, because any extending they did would have to be released as soon as they shipped it to any customer. Even if they try to tie applications to "their" Linux, any work they'd done to make it "their" Linux would have to be released under the GPL, and could be replicated in other Linux variants.

A third scenario is most likely: Linux will turn commercial

Linux is commercial. What do you think Cygnus, Red Hat, Caldera, SuSE, and Pacific HiTech are doing with it? They certainly aren't charities - they're out for profit.

To make a lasting impression, software developers still must cope with absorb-and-extend and other techniques of the strong. To do so, they will have to retain a certain amount of proprietary code in their products and charge for their intellectual property, one way or the other.

First off, as stated before, absorb and extend just won't work with GPL'd code. It can't. Any work Microsoft puts into enhancing Linux (sans applications, which are effectively independent of Linux itself) becomes part of Linux, which only improves it for everyone else too. Any support Microsoft puts into Linux would only help the Linux community. Second, Linux's "intellectual property" isn't the code, it's the people behind it. Companies like Cygnus, Red Hat, and Caldera make lots of money off OSS by having the right people and services, not by having proprietary code. Microsoft can't duplicate that without a significant investment in people and time.

who? guys who enjoy such a work (1)

hany (3601) | more than 15 years ago | (#2018585)

who? guys who enjoy such a work

Perhaps he prematurely concluded it failed (1)

Guy Harris (3803) | more than 15 years ago | (#2018586)

"To qualify as a world-class success and not just a fad, each new product or method must pass the acid test of 'crossing the chasm' that separates early adoption from mainstream acceptance. Linux, and open source in general, fails this acid test."

Or, more accurately, perhaps hasn't yet passed this acid test. It can only be said to have failed that test if, after some "reasonable" period of time, it hasn't passed it; otherwise, simply by discussing a product early enough in its life cycle, one could dismiss almost any new product as "failing this acid test".

Perhaps it will fail the test, and perhaps it won't. His article should be treated as a prediction to be tested against future reality, not as a firm description of what will happen. (A phrase that's more and more in my mind these days is "Just because somebody says something, that doesn't necessarily mean it's true.")

Nice FUD. (1)

DannyC (4040) | more than 15 years ago | (#2018587)

I was about to try to answer without flaming but I find myself unable to do that. Here are some glaring stupidities:

"... The concept of free software is a well-known and frequently practised strategy of the weak"

He then draws a parallel between AT&T's licensing of System V and Linux's openness. AT&T's strategy was a market-driven, commercial venture; Linux's openness isn't.

"When HP and Sun acknowledge Linux as a viable challenger, we will see its rapid deconstruction as these competitors first embrace and the extend its advantages. Linux is on the verge of being squashed be the strong and will probably not survive in its present form."

Here's the question of code forking again, and I must admit that it's worried me before. Thankfully, the open licenses under which the open projects operate do not permit it, as Linus said in the MSNBC chat.

About the failure rate of Unix utilities: "... it should not be confused with defect density, a more reliable metric... Unix has more than 10 million lines of code, while Linux has only 1.5 million." His conclusion is that Linux's defect density (failures per line of code) is much higher.

Wrong! The correct ratio is failures per unit of functionality. If a product does what it should do in fewer lines of code, that's actually a benefit, because it's the mark of better, cleaner code that's much easier to debug.

"Thus, Linux's reliability is suspect. In fact, we can expect Linux defect densities to get worse..."

The point is that I've run Linux at home for the last 7 years on four different computers, and at work for the last 5 years on many, many different computers, and in that time have had four unscheduled crashes. Yup, can count them on ONE hand. I have also used Windows for the last two months and in that time have had more than 20 crashes and have had to re-install 3 times.

Linux gives me more functionality than Windows, for a better price and keeps my personal data (the most important part of my computer) safe.

He later goes on to compare the number of full-time programmers at work on Linux and Microsoft, and the number of beta-testers pointing out that Microsoft has so much more.

Well, that doesn't reflect very well on them, now does it? Considering that they're being challenged anyway, and are therefore put on the same footing as the Open people.

"... Support diminishes still further when the hype wears off an open source application... Mozilla's mailing list declined by 58 percent... Apache..was developed by a cast of thousands is now supported by fewer than 20 core members. Already, Apache is losing the performance battle against Microsoft's IIS."

Mozilla's mailing list: Expected. That's called stabilization. Apache's development team: That's all that's needed, baby! Apache's performance: One wonders where he gets his numbers.

Page 2.

"... Unfortunately, these exceptionally talented programmers (Open source) are in limited supply and, as any open source program becomes widely distributed, this talent will become increasingly scarce."

Limited supply? As far as I know the human race hasn't become infertile all of a sudden. Anybody can become an Open Source Programmer; in fact, zillions of kids now in school are getting in on the act. That's your supply.

And this is where I quit. The next bits made me angry enough that the only thing I could reply was "F**k off and die, you idiot!" I guess that qualifies as flaming. Oh well.


Quality logic, good foundation! (1)

Odinson (4523) | more than 15 years ago | (#2018590)

I saw a great deal of legitimate gripes with this article. One thing not yet mentioned is the growth of the internet and its partial responsibility for the OSS revolution. I have to wonder if that was a strategic omission. The net definitely makes it much more feasible for developers to spontaneously coordinate with little to no overhead. That flexibility renders any figures that don't weigh the recent popularity of IPv4 in fair proportion slanted and antique in nature.

With the solid statistics just mentioned backing it up, I knew the material in this article had originated from somewhere else. After a little searching I turned up a group who had done similar research and turned up similar results.

That group's transcript...

Did you dress her up like this?

No, no... no ... yes. Yes, yes, a bit, a bit. She has got a wart.
She Turned me into a newt.

A Newt?

I got better

what do you do with witches?

Burn! Burn, burn them up!

And what do you burn apart from witches?

More witches! Wood! So, why do witches burn? [pause]

B--... 'cause they're made of wood...?


We have found a witch, may we burn her?
What also floats in water?



Uh, very small rocks!
I'm not a witch, I'm not a witch!!

They dressed me up like this.

No we didn't!

No! No!

And this isn't my nose, it's a false one!

So, how do we tell whether she is made of wood?

Build a bridge out of her!!
A duck! Exactly! So, logically... If she weighs the same as a duck, she's made of wood? And therefore?

A witch! A witch! A witch! A witch!
Right, remove the supports!
[whop] [creak]

A witch! A witch!
It's a fair cop.
Burn her! Burn her!

Mr. Lewis: You misunderstand yourself... (1)

Lettuce B. Qrious (4630) | more than 15 years ago | (#2018592)

I started out reading this thing in the hope that I would see a well-done study of the possible limitations of the OS development model. Instead I find the usual crappy, unsubstantiated semi-arguments that we see whenever somebody claims to have given the OS movement a serious, critical look.

I found Mr. Lewis to be full of himself, and this finding was substantiated when I went to the website [] that promotes his book. In the section called "about the book", he goes on and on about how science and technology joined the economy in the industrial revolution, and how the powers that be/were didn't understand what was going on. Let me point to a couple of phrases that I found particularly amusing, given his stance on OS's role in the current "revolution":

- "Drucker's Law still applies: the people in the midst of the revolution don't know what is hitting them. And they won't know until after the IPO1 is over. Like passengers in a speeding boat, spectators in the software age know the river's current is swift, but they don't know where the raging falls lie."

Too true. But isn't it rather pathetic to rant about how people failed to understand the industrial revolution, and then commit an article which thoroughly demonstrates that he hasn't properly understood the networking revolution he claims to be preaching? Mr. Lewis has no idea what's hitting him, and thus serves as a poor guide for others.

- "It may be too fast for royalty [drawing a parallell to the powermongers of the Industrial revolution, here], but the software economy is on its way. It may be a mystery to the establishment, but it is well understood by the Netheads2 in Wired World3 . It may violate the doctrine handed down by classical economists, but it does follow a set of laws. It may be just in time."

A mystery indeed, Lewis, and one that you don't grasp the way you claim to. Oh, and by the way, we apologize for violating your doctrine. The final sentence in the previous paragraph points to the next:

- "The late 20th century marks the beginning of the end. Within the next 20 years we will discover the new laws of the software economy. We have early warning signs - Netscape Communications Corp. parleyed 16 months of software development work into an IPO valuation of $3 billion. Companies like Microsoft and Adobe Systems which were unknown a decade ago are now the darlings of the stock market, and the nouveau riche telecommunications industries like 3COM, Cisco Systems, and Bay Networks have turned from small-cap industries into powerhouses of the new century."

Okay, people: This guy is a friggin Ph.D., he holds degrees in Mathematics and Computer Sciences, so let's not write him off as an idiot (even though he claims Microsoft was unknown a decade ago). His angle is just skewed. He has published numerous books, and the one that probably sheds the most light on his skewed vision on computing, is the one from 1976 called "How To Profit From Your Personal Computer".

From this, we may postulate that Mr. Lewis is probably hooked on the income side of computing. Don't get me wrong, though, I have no problem with anybody making money by facilitating my work. However, this places Lewis among the current day's power-mongers (I may be inflating his ego here, but the guy is Chairman of Computer Science at the Naval Postgraduate School in Monterey, California, which may or may not be a big deal - kind of hard for me to know from where I sit), and if we were to allow a parallel from his own account of the industrial revolution, he should thus be one of the last to know what hits him in this revolution. According to his recent writings, he appears dead on schedule...

Far from being discouraged by Mr. Lewis's futile attempts to discredit the OS movement (some of his points would be valid, had they not drowned in the rest of his rubbish), we should see this as a confirmation that OS is on the right path. This path will need some adjustment from time to time, and some of the necessary adjustments may stem from people with no better understanding of the cause than Mr. Lewis's.

Those who don't know their history are doomed to repeat it...

This guy fails the free software test. (1)

Extremist (4666) | more than 15 years ago | (#2018593)

I don't think Mr. Lewis has a firm grasp on the whole reason Free Software (FS) exists.

First off, Linux and FS in general was not started to squish MS. I still don't think it exists to squish MS. That's not the point. This is a community center that all are welcome in. These community volunteers are not here to build a non-profit organization to put the Embassy Suites out of business. Just a place for people who want a good, down to earth, quality place to go. One that they, too, are free to contribute to. A place that can exist after the founders leave, for whatever reason.

I don't know about the majority here, but I started using linux for a few core reasons.

1) I wanted UNIX. Period. UNIX has power that NT just can't achieve. It's flexible, stable, logical, and versatile.

2) I wanted to learn to code. Having source code to everything is a big plus there. I can peruse the kernel, KDE, Afterstep, sendmail, whatever I want. That makes for some great learning material. Plus, I can reuse this code freely.

3) I am never held captive to a bug again. Without even claiming to be a programmer (I am not) I can say that I have fixed a bug for myself. A NIC driver. Not some glamorous GUI office suite. Just a NIC driver for a card I needed to get working right at that very time. Stores closed, no money, choice between 2 NICs, one broken, and one with a driver that would not compile. Read the error messages from the compiler, fixed what logically seemed to be the problem, and Shazam... a working network.

None of this has anything to do with MS other than you can't do it with MS :)

The biggest reason I try not to use Windows (of any version) is that I have lost work. A lot of work. Windows freezes, and occasionally takes a partition with it. Not often, but how often does it need to?

I do disagree with MS in their business practices, but that still is not the reason I use linux and GNU software. I use it because it's just better. I can get things done. And I can learn, which is very important to me.

Do I care if corporate (insert country here) adopts linux? Only to the extent that I may get the one application I really need that isn't on linux yet, and that hardware developers will start contributing open sourced drivers. I think I'll get the drivers and apps with or without this, so in the end, it really doesn't matter.

As to the remarks about bloat and featurism: I see checks and balances here. The programmers ARE the users. If RedHat decided to start playing "embrace and extend" tomorrow, what would happen? Well, I think the first thing that would happen is the keepers of the code (us... the users and programmers, by grant of the GPL) would just revert the code. And stop using RedHat. RedHat would sink like a rock, and they must know that. Aside from being able to fix and use software on your own terms, this is the most important aspect of Free Software. Without us, they don't exist. And since anybody can start a company to take their place, using the very package they put out, they hold on by honor and reputation alone. Anything they add that isn't liked by the rest of the FS programming world will be an orphaned bastard child left on the side of the road. There are plenty of other distros to take their current place at the top (they are top in the US, anyway, for the time being).

Those of us who really want to use linux/*BSD/HURD/etc... are in no danger of losing it. We will still wake up tomorrow and have all the code. Nothing lost, nothing to worry about. And there will always be someone that wants to make it do something THEY need. And we will all benefit. New day, same way.

This is linux. You will NOT be assimilated, but you are always welcome to join in. We're having fun over here :)

Error Metric (1)

dvdeug (5033) | more than 15 years ago | (#2018594)

> The Unix defect rate could have been as high as
> 60% and still paralleled that of Linux.

That's absurd. Think about it. In practical use, what matters is how often it fails for you, and 60% is unusable. Linux's bugs are far from making it unusable. While his comparison based on lines of code may make sense to him, it doesn't make any sense to me.

feature creep (1)

unitron (5733) | more than 15 years ago | (#2018596)

I fear he may be right about feature creep leading to bugs. Look no further than Microsoft for a perfect example.

some video cards lack support for linux... (1)

unitron (5733) | more than 15 years ago | (#2018597)

you can look at it either way. The manufacturers need to feel a greater demand for it than they do now. Wouldn't it be easier for the people who know exactly how the hardware works to write a driver for an OS for which they can get the source than the other way around?

Only the OS need by open source... (1)

doog (5889) | more than 15 years ago | (#2018598)

His argument seems to hinge on the fact that there are only so many programmers with the will and talent to produce quality open source software. The problem with this logic is that open source software itself actually CREATES more talented programmers, because would-be coders and programming students can learn by reading actual CODE! While I don't believe that ALL code should be open source, I do believe the operating system and core components should be; otherwise you end up with microserfs using the advantages that only they as the OS provider have to squash everyone else. An OS owned by the people is the only way to prevent that.

About bloat empirical data. (1)

Apuleius (6901) | more than 15 years ago | (#2018602)

Compare Netscape Communicator to Gecko.

'nuff said. Lewis's claim of bloat killing OSS software is bunk, because the link between feature creep and bloat isn't there.

Feature creep in OSS means somebody offering new command utilities, maybe a GUI here and there. If I don't put it on my hard drive, it isn't bloat.

Embrace and extend against Linux?? (1)

Apuleius (6901) | more than 15 years ago | (#2018603)

So according to this guy, if I write a killer app, compile it for Linux, and sell it binaries-only, that will somehow make Linus wither like the wicked witch of the west once watered?

Worst case RMS won't invite me to his Superbowl party.

Then there's his statement "but Linux is still a Unix, and Unix is still losing market share". That's not FUD. It's fudging.

Why does Slashdot cite pay publications? (1)

Felix von Leitner (6965) | more than 15 years ago | (#2018605)

The Slashdot effect is great if it helps free projects, but if you quote from pay or registration sources, you just dignify their behavior and bring new customers to them.

I find it really annoying when I click on a link on Slashdot just to find that I have to register or pay some company. No, thanks.

And I find it preposterous that some people here actually comment on a short quote from a larger article that they could not read in total because you have to pay for it. Read it completely and THEN you can comment on it.

A response (1)

Glith (7368) | more than 15 years ago | (#2018606)

Bloatware is the result of one thing: a company iterating through version numbers to get more revenue. Open-source projects are immune to this trend, because open-source projects aren't driven by corporations, but by necessity.

Programs in the open source model exist to solve a problem, not to get people to buy it.

As far as 'not main-stream acceptance', what about Apache? sendmail? bind? XFree86?
Granted, not everyone is using those programs, but if he's trying to predict what Linux (any *nix really) will be as far as a desktop operating system in several years... well... he's nuts.

Also, he fails to grasp that new talented programmers are created daily. Linux is quite popular on campuses. I can't cross campus without hearing about it somewhere. Linux can easily move in to capture future talented programmers in a way that Apple and Microsoft's astroturf campaigns cannot possibly attain.

Real computer science majors won't touch Microsoft or Apple products. That's why the business building has the shrine to Microsoft, and not any of the many computer buildings. (Macs are for the liberal arts people).

Argument doesn't seem that sound... (1)

couchslayer (7594) | more than 15 years ago | (#2018609)

...because, while the author questions other people's statistics, he doesn't provide any inkling as to where he got most of his. So when he says that Linux has a higher 'defect density', he doesn't quantify this -- most likely because he cannot.
But this article is a great argument if one assumes that the industry will stay as it is, which has always been risky. IBM and others thought that mainframes would always rule the earth, Microsoft was once unknown too. Hell, I remember not very long ago when everybody did the BBS thing and life was good. But things change.
This article also seems to assume that CIOs will continue to have huge supplies of cash to throw at technologies and at blaming others (i.e. look at the support industry at present for what it really boils down to), and just as the computer has whet the corporate appetite for more profits, there has to be another target after all is eked out from these machines. And guess who'll be in line? That's right, the ones who keep buying things for them.

But the largest problem, and the problem that the Linux community has to deal with right now, is that we're in a position where we just don't have to care about being commercial, or being big, or so on. This article treats Linux as a business, which it never intended to become. But more and more now I see people who are treating it as if it should be profit-motivated, and it's a shame that the chase after money has caught so many of us. Yes, Linus can be considered a bottleneck, but only if one feels that a certain schedule of releases must be kept up, which just isn't the case. Too often, we seem to be sacrificing our ideals to win approval of business, and by-and-by most of what business cares about here is making money, not keeping freedom.

Who cares what he thinks? (1)

GodEater (7709) | more than 15 years ago | (#2018610)

He may or may not have some valid comments there about why OSS software will succeed or fail.

Who cares?

I don't give a rat's fart whether OSS succeeds or fails in the commercial market. I use it because I like it. Not because everyone else says I should use it.

If businesses decide not to go with OSS software it's their choice, not mine. I'll still stick with Linux.

So what? (1)

kampi (7720) | more than 15 years ago | (#2018611)

Maybe some of the OSS projects will go down, probably others will arise....
Stability is a question of moving, not unlimited growth. Take a look at evolution...


Ratios (1)

Espressoman (8032) | more than 15 years ago | (#2018613)

Hi all,

There is something I think we in the Linux community aren't being completely honest about. Sure, the number of new Linux software applications in development is astonishing, and yes, these applications are more than a match for the offerings of conventional software houses. But I think we are perhaps in too much of a hurry to show off these new developments, because they are for the most part in development - a great number in very early stages of development - and people out there in the world, including the media, are mistakenly beginning to think that these applications are what is on offer from the Linux community.

The result is that tens of thousands of people who are new to the Linux community - including the media - download the latest development releases of things like Gnome, Enlightenment, etc., and then get frustrated with stability issues, poorly implemented features, or just installing the thing!

Part of the problem is that so much has been happening over the past year that most Linux applications are in development. There are so many unfinished applications on offer that it is difficult for people to even find stable applications that they can use.

We really need to emphasise to new users and the media that Linux _is_ super stable and a promising alternative to other OS's, but _only_ if really stable, and usually fairly mundane, applications are used.

People used to the Windows world are accustomed to downloading every update they can find. This is not a safe policy in the Linux world because most of the time these 'updates' are untested.

We and the media have created a situation where new Linux users are hungry for every new and glossy app we can develop. The hype, in my opinion, is getting a little out of hand, and people are being misled.

Why don't we encourage RedHat or one of the other Linux distributors to counterbalance their RawHide distribution with a RockSolid distribution? Hence we could always point the media, business users, and new Linux users to the RockSolid distribution, and keep them enthused about Linux by keeping them informed of what applications will soon be declared 'rock solid'.

"Greed is the greatest motivator of all" (1)

korpiq (8532) | more than 15 years ago | (#2018615)

In a quick look, I loved this one the most, even more than the growth rate assumption above it. Now there's a scientifically proven premise to use as the basis for an explanation of how the OSS movement will "implode".

Laugh of the week, if only it didn't reach a load of less critical readers. I hate the power of clueless media.

Market Stratification (1)

malkavian (9512) | more than 15 years ago | (#2018620)

Thanks for a beautifully written article.. It's always good to see something like that on Slashdot..
There are a few things that maybe we'll have to agree to disagree on though... You say that Linux won't make inroads on the market of the 'average home user' (read gamer/email drone).. I've found that quite a few people who fit into this category have actually installed Linux, and been happy with it.. A small learning curve (maybe the same as going from win 3.x to win 9.x) and they're there..
I've also had a _lot_ of Win users wandering past my workstation at work (where I set up a Linux box to handle the department's webserver, fileserver etc.), looking in awe at my basic Windowmaker screen with the clip.
Everyone wants one, and now, most of the people on the floor who own PCs are running Linux at home, to check out this OS that seems to do so much more than Windows.
I'd agree with your points, if you stated that there's a long way to go before Linux makes inroads into those markets...
I believe it will.. Not to the dominance of the market that MS has.. But I believe Linux will maintain a reasonable presence.
Never is a long long time.
I still have a nice long list of lots of names back in '94-'95 who were telling me that Linux would never make it mainstream, that it'd never be a commercial viability and you'd never have non-guru users.
Some of them I still phone up to laugh at before I ask them out for a beer.
They're now the ones asking for my advice on how Linux can be used to increase the reliability of their company's information systems.
All that said, it's merely my experience of the situation, so what I write is coloured by my own experience and bias.
It'll still be interesting to see what, despite all the FUD that's spread, happens.
These are, indeed, interesting times..


What I don't understand... (1)

banky (9941) | more than 15 years ago | (#2018621)

... is what people gain by slagging on OSS. Assume, then, that he is not part of an elaborate plot to FUD the OSS world to death (hatched by MS, of course). What does anyone gain by making the world safe for corporate America? Do middle managers sit in their offices thinking, "Damn, I have to adopt a policy that makes more criminal, white-collar, lowlife a$$h*le$ rich; let's get our publication company to slag on that product that isn't technically owned by anyone"? I see why ZD is a FUD machine: no one there is smart enough to type ls and then interpret the results. But the rest of the world? It just confuses me. You're picking on the little guy.

Rebuttal (1)

DuaneGriffin (10845) | more than 15 years ago | (#2018622)

"To qualify as a world-class success and not just a fad, each new product or method must pass the acid test of 'crossing the chasm' that separates early adoption from mainstream acceptance. Linux, and open source in general, fails this acid test."

I strongly disagree with this. The key question here is, what is the mainstream market? It is interesting to note that the OSS that has been around for a while has been very widely adopted among its intended market: software developers. Would anyone say that Emacs, vi, grep, sed, gcc, gzip, tar, to name just a very few, do not have mainstream acceptance amongst developers? And what about the server world: BIND, Sendmail, Apache, not to mention the Berkeley TCP/IP stack?

Now, it is true that there is not widespread adoption of GNU/Linux on the desktop. No one is arguing that, and that is not the immediate goal. GNU/Linux is currently being (re)positioned as a server OS. Its original 'market' was simply hobbyist OS hackers. A perfect example of its new market is ISPs. What proportion of hobbyist OS hackers and ISPs run GNU/Linux or *BSD systems? More than enough, I am sure, to qualify as 'mainstream acceptance'.

Pronouncements that OSS is just a 'fad', and that it 'fails this acid test' are certainly premature. IMHO, the evidence points to the exact opposite conclusion. In the markets it was designed for GNU/Linux and OSS already have mainstream acceptance. The question is whether they will be able to gain such acceptance in other markets, such as the workstation and desktop markets. This has yet to be seen, and to state that OSS has failed the acid test is like stating that Microsoft has failed the acid test because the majority of toasters do not run WinCE.

"The more the open source paradigm succeeds, the more untenable it becomes because maintenance and support grow more complex, and costs increase due to a scarcity of talented programmers. Success leads to features, and feature creep leads to bloated software."

Here the author seems to get confused between individual OSS projects, and the 'OSS paradigm' itself. Ignoring such distinctions the argument seems to go:
1) Maintenance and support for OSS grows more complex as that software succeeds.
2) [Implied] There is a limited pool of talented programmers willing to work on OSS projects.
3) Costs increase because this pool is exhausted due to 1.

Firstly, the statement that success breeds features is not necessarily sound. Success does not have to equate to feature bloat, and it does not have to mean increased complexity. 'ls' is a very successful utility. Is it bloated? I think not. One key design principle of UNIX, and therefore of GNU/Linux, is that of modularity. This directly counters feature creep. For non-utility apps this principle is not as strong, however it is still important. X is a classic example: it works with all the different WMs, which (should) work with all the different X apps. (Please! No GUI flames! Maybe I shouldn't even mention this as an example, oh well...)

Also, if a project is succeeding that implies that it is becoming more reliable, getting better documentation, and gaining a larger user base. Now, it is true that as the user base grows the testing becomes more exacting, more bugs will be found, and more features/enhancements requested, but:

One key aspect of OSS, notably espoused by ESR, is that as the user base grows, so does the support base. This counters the third point. As an app becomes more widely used, more people will want to hack at it, fix bugs, write docs, etc. It could be argued that the proportion of developers in the user base is too small to alleviate the problem. This is simplistic, however. Programmers are not the only ones who contribute to a project. Writing documentation, sending in bug reports, sharing ideas, providing feedback: these are all very important contributions that anyone can make.

Furthermore, it is perfectly possible for companies to pay people to work on OSS. If the cost of paying for the maintenance of existing OSS software is less than the cost of buying or developing and maintaining alternate versions of the software, then the argument is irrelevant. Note that in some cases it is impossible to maintain proprietary software at any price (e.g. when the owner goes out of business).

If I have the time I might try to address the rest of the points in the article, but for now these will do.

Misses a big point (1)

Fyndo (11748) | more than 15 years ago | (#2018624)

The article misses one big point. (aside from many inaccuracies and invalid comparisons)

It is possible for opensource software to be supported by commercial entities w/o becoming proprietary.

e.g. in [] several models are introduced. I think that with IBM, Dell, and Compaq now (or soon) offering hardware with Linux installed, one bears repeating:

Widget Frosting

In this model, a hardware company (for which software is a necessary adjunct but strictly a cost rather than profit center) goes open-source in order to get better drivers and interface tools cheaper.
The open-source culture's exemplars of commercial success have, so far, been service sellers or loss leaders. Nevertheless, there is good reason to believe that the clearest near-term gains in open-source will be in widget frosting.

If IBM and Compaq switched just one tenth of their AIX and Digital Unix teams to Linux programming, they could contribute immensely....

Commercialisation will save OSS (1)

Blackers (12213) | more than 15 years ago | (#2018626)

The author is correct to a certain extent in believing that the romantic ideal of volunteer development of very large open-source applications will become increasingly strained in the future.

But the development of Linux is not simply going to collapse. Because it has become crucial to the businesses of so many companies, it is in their interests to spend a lot of money maintaining and improving it. It's simply a matter of re-adjusting the idea of 'an infinite number of hackers' to 'an infinite number of software / hardware companies etc'.

It's True (1)

Doodhwala (13342) | more than 15 years ago | (#2018634)

Okay... before anyone flames me, let me clarify: I love Linux.

Now, I feel that with large pieces of code (especially OSS), when there is no responsibility except a moral one, there will be instances of bugs coming through. Also, people who work for recognition will work on the fancier aspects of things, for which they get attention. Who wants to pay attention to the nuts and bolts and the really CORE mechanics, which are dull to say the least?? Let me know!

I Still Say It's True (1)

Doodhwala (13342) | more than 15 years ago | (#2018635)

You admitted it yourself. I admit the low level is there, but then most of the people are working on the GUI. At the end of it all, it's what's deep within that matters. It exists now.. BUT who is working on improving it????

WTF is he talking about? (1)

Your own stupidity (14554) | more than 15 years ago | (#2018645)

The error density in the article is unacceptably high...

"However, Linux 2.0 lacks the following features:

"* video card support" ???
"* Wireless LAN support" Plus 2.2 most definitely has wireless LAN support.
"* good selection of productivity software" Depends on your definitions here, but that will be a hard position to take in a year.

"UNIX has more than 10 million lines of code, while Linux has only 1.5 million," so Linux has a higher error density (errors per line of code). Where is he getting these numbers? Is he just counting the kernel? If so, he is probably undercounting severely. No way is a typical UNIX kernel that much larger. If he is counting utilities, then he is way underestimating. Either way, it's a completely bogus comparison. In the text, he is supposedly comparing utilities.

Okay, by my count (using find, wc, and python) on /usr/src/linux and *.[ch], Linux 2.2.1 has 1604504 lines of code. Throw .S files in there and the total swells to 1676913 lines of code.

Now the RedHat 5.2 CD #2 with all the SRPMS is a couple hundred megs. I'd pop it in and find out the exact number, but I don't have it handy. It's got to be at least 300 MB, at a guess. Taking into account roughly the same compression rate as the 2.2 kernel src (about one line per 7 bytes), that's 42 million lines of code for RedHat. It's a guesstimate, but at worst I figure +/- 10 million lines, but easily as much as UNIX (whatever UNIX he's talking about where you can count lines of code). SuSE is substantially larger.
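For what it's worth, the count and the estimate above are easy to reproduce. Here's a rough sketch; the kernel-tree path and the ~7-bytes-per-line compression ratio are the commenter's assumptions, not measured constants:

```shell
# Count lines in all *.c and *.h files under a source tree,
# as done above for /usr/src/linux (pass the tree as $1).
count_loc() {
  find "$1" -name '*.[ch]' -exec cat {} + | wc -l | tr -d ' '
}

# Estimate lines of code from a compressed archive size in bytes,
# using the assumed ratio of roughly 7 bytes per line of source.
estimate_loc() {
  echo $(( $1 / 7 ))
}

estimate_loc 300000000   # 300 MB of SRPMS -> prints 42857142
```

Point count_loc at your own kernel tree to check the 1.6-million-line figure; the estimate is only as trustworthy as the bytes-per-line ratio fed into it.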

He also assumes that the Linux (apparently kernel) code base will continue to grow exponentially in terms of lines of code and will be as bloated as NT (and where will NT be?). A lot of those lines of code go into supporting different platforms and drivers. I suspect maybe 25% of it or less is actually used on any given platform. How many platforms does NT support? (Two, if you count token Alpha support.)

There's more, but basically the bottom line is: This guy is totally smoking crack or something. Oh wait, I get it now! He's dropping acid! That's why Linux doesn't pass the "acid test": He popped a couple microdots and Linux didn't sing a pizza into his nose or something. Bad trip, man!

No Subject Given (1)

Fizgig (16368) | more than 15 years ago | (#2018650)

Don't Apache and Sendmail count as "wide acceptance"? And I'll laugh if he doesn't think Gecko will be widely accepted. These aren't just niche groups using these products.

Ted Lewis & Linux Support (1)

ordord00 (17347) | more than 15 years ago | (#2018652)

Lewis' argument that support will be unmanageable after the user base increases beyond a certain point is dumb. Do you think Microsoft handles all the support for all of its products? No. Then who do people turn to when they have Windows problems? They go to their computer vendor, computer guru friends, and local computer stores. In other words, more than just Microsoft people support Microsoft products. The same will happen with Linux. For support, people will call VA Research, Red Hat, Caldera, the HOWTOs, the newsgroups, IRC, their next door neighbor. Not everyone is going to run to Linus to solve their Linux problems.

On another topic, as available software increases, more developers (i.e. Windows developers) will switch to Linux. This will mean more hackers will tinker with Linux and produce more patches and kernel code. This will make up for the "supposed" lack of developers for Linux (Lewis said there were about 200 developers for the Linux kernel) compared to MS (400 kernel devs).

Hasn't been accepted? I think not. (1)

Targon (17348) | more than 15 years ago | (#2018653)

Let's see, we have virtually every major database being ported to Linux. We have games beginning to be written for Linux now (as part of the initial release, not an afterthought or port) as well. There are who knows how many millions of systems running Linux. So, it's gone a bit beyond "fad" stage when commercial products are showing up for it regularly. As for the problem of bloat, you may have noticed that many distributions have an FTP site where additional packages can be downloaded. This allows the bloat to be removed from the initial release, and keeps things to a reasonable level. So, bloat isn't an issue either, as long as the packages are organized in a way that can be managed by the users. This can be dealt with by offering different "levels" of install for users. I know that Debian GNU/Linux began to implement a "function" question to help make installation easier for beginners with the 2.0 release. I suspect that when 2.1 is released, it will build upon this feature to make the initial install easier. Red Hat and SuSE have used a GUI install to aid users through the install. This shows that it will not take much longer before Linux becomes "accessible" for many end users to move to.

Perhaps not (1)

Daffy Duck (17350) | more than 15 years ago | (#2018654)

Vulnerability to viruses is more a consequence of the distribution method than of the software itself. After all, it's not unheard of for even shrink-wrapped software to be infected. What's really at fault is the implicit instruction to "just trust me and run this code".

So what we need is someone to trust. This could be a shrink-wrap packager like Red Hat, or a certificate authority. Automatically checking the signature on new software shouldn't be any more complicated than running a virus scanner - something that many (most?) mainstream users already do.
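As a sketch of that workflow: the real thing would use the packager's public key (PGP or a certificate), but even a plain checksum list shows how mechanical the check is (the filenames here are hypothetical):

```shell
# Stand-in for a downloaded package (hypothetical file).
echo 'pretend this is a tarball' > foo-1.0.tar.gz

# Packager side: publish a checksum list alongside the software.
sha256sum foo-1.0.tar.gz > CHECKSUMS

# User side: verify the download against the trusted list before
# installing; exits nonzero (FAILED) if the file was tampered with.
sha256sum -c CHECKSUMS
```

Of course, a checksum only helps if the CHECKSUMS file itself comes from someone you trust, which is exactly the role a shrink-wrap packager or certificate authority would play.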

Lewis is, well, an idiot (1)

bluestar (17362) | more than 15 years ago | (#2018655)

> To qualify as a world-class success and not just a fad, each new product
> or method must pass the acid test of "crossing the chasm" that separates
> early adoption from mainstream acceptance.

Besides this not being the only, or even best, criterion of mainstream
acceptance, he fails to define mainstream acceptance. With Linux' user
base in the same ballpark as Mac OS, I'd say it has passed this test.

> This is because the strategy of the weak no longer works when a product
> becomes a serious threat to competitors. Microsoft demonstrated this in
> its battle with Netscape. Once Netscape seriously challenged Microsoft by
> gaining the dominant share of browser users, Microsoft skillfully applied
> an absorb-and-extend strategy, a proven technique of the strong.

Huh? Netscape was not open source. Netscape didn't "challenge" MS, they
created a market that MS dismissed at the time. MS got market share
through price dumping, product tying and exclusionary contracts, not
through a better product.

> Unix has more than 10 million lines of code, while Linux has only 1.5
> million. So the Unix defect rate could have been as high as 60 percent
> and still paralleled that of Linux.

Program A has 100 Klocs and 10 bugs. Program B, with identical
functionality, has 50 Klocs and 6 bugs. Program A is more reliable?!

> Greed is the greatest motivator of all.

Spoken like a true American. He simply doesn't understand the OSS
movement.

> First, Microsoft has many more options than does the open source
> movement. Because it holds the strong position, Microsoft can simply
> absorb and extend Linux.

He also hasn't read the GPL.

> A third scenario is most likely: Linux will turn commercial and Caldera,
> Red Hat, or some other traditional software publisher will push it into
> the mainstream.

He definitely hasn't read the GPL. And he doesn't realize that Caldera
and Red Hat went commercial a couple years ago.

> In their drive to simplify support, they will always choose Windows,
> Solaris, or HP-UX over yet another Unix. After all, total cost of
> ownership is driven by support, not capital costs.

I see this argument a lot and I hate it because it's so stupid. It
implies that no OS will ever replace another. If the cost of adding or
switching an OS was so great, we'd ALL have 3270s on our desks.

> Even if Linux proves
> the better product, Microsoft can always use tying - the creation of a
> mandatory dependency among applications - to maintain its monopoly.

Where can I get the drugs he's taking? The DOJ disagrees with him.

Free binaries vs OSS (1)

Robert Frazier (17363) | more than 15 years ago | (#2018656)

Lewis seems to think that giving away code (OSS) is analogous to, and an extension of, the marketing strategy of giving away a product (binaries, in this case) in order to increase market share. This analogy is dubious at best. The correct analogy would be a company giving away the manufacturing plans and the tools needed to create the object locally.