Why Linux Has Failed on the Desktop

SlinkySausage writes "Linux is burdened with 'enterprise crap' that makes it run poorly on desktop PCs, says kernel developer Con Kolivas. Kolivas recently walked away from years of work on the kernel in despair. APCmag.com has a lengthy interview with Kolivas, who explains what he sees as wrong with Linux from a performance perspective and how Microsoft has succeeded in crushing innovation in personal computers."
This discussion has been archived. No new comments can be posted.


  • Don't think so (Score:4, Insightful)

    by jasonmicron ( 807603 ) on Tuesday July 24, 2007 @11:21AM (#19970097)
    how Microsoft has succeeded in crushing innovation in personal computers.

    I found that rather funny. Blaming Microsoft for your own lack of creativity and ingenuity.

    Besides, Steve Jobs would very much disagree.
    • Re:Don't think so (Score:5, Insightful)

      by jellomizer ( 103300 ) * on Tuesday July 24, 2007 @11:29AM (#19970219)
      Creativity is very rarely an out-of-the-blue thing. It is about looking at many alternatives, taking what you like about them, and making it your own, with perhaps something extra to get them to work correctly together.

      With many different OSes and computers around, we would be much better off seeing what works, what doesn't, why, and how to improve on it. Back in the '80s, if I had been asked what a desktop system would look like in 2007, I would have given a much different answer (in my mind a 2007 desktop would look more like Plan 9 and less like Windows). But during the '80s the only GUI I had experience with was GEM Desktop, and I didn't particularly care for it. I expected graphics in 2007 to be a bit better than they are now, but the OS in my mind would have frames, not windows.
    • Re:Don't think so (Score:5, Insightful)

      by ImaLamer ( 260199 ) <john.lamar@gma[ ]com ['il.' in gap]> on Tuesday July 24, 2007 @11:47AM (#19970531) Homepage Journal
      The whole thing is dead wrong. All that enterprise crap is what keeps the platform solid and almost crash free.

      Sure, some extra code may slow things down, but since Linux, Windows, and even MacOS are now all based on server-grade kernels (Linux's own, the VMS-lineage NT kernel for Windows 2000 and later, *BSD for MacOS), they don't crash much. YOU may have problems with XP or 2000, but you shouldn't. I've had an XP install going for more than four years, and Windows 2000 running for months. (If you can't do this, you should not be using it, 'nuff said.)

      Code doesn't care how many employees you have. Maybe this guy belongs at Ubuntu, where things are moving towards the 'desktop'. Just ask my new Ubuntu installation on my laptop - it's running like a desktop just fine. I just finished 5 hours of World of Warcraft on it!
      • Re:Don't think so (Score:5, Informative)

        by ardor ( 673957 ) on Tuesday July 24, 2007 @12:43PM (#19971459)
        His point is that the kernels are optimized for servers. That is, they focus on throughput, but not on latency or responsiveness. A desktop has the latter two as priorities, while sacrificing the former. As an example, it doesn't matter if that MPEG-4 video I/O eats a little more CPU, as long as other tasks don't interrupt its playback.
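
        (A minimal sketch of biasing that tradeoff by hand with standard tools; the player and the bulk job here are placeholders:)

            # Deprioritize a throughput-heavy batch job...
            nice -n 19 tar czf /tmp/backup.tar.gz /home &
            # ...and bump the already-running, latency-sensitive player
            # (negative nice values need root).
            sudo renice -n -5 -p "$(pidof mplayer)"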
        • Re:Don't think so (Score:5, Interesting)

          by lordtoran ( 1063300 ) on Tuesday July 24, 2007 @01:09PM (#19971819) Homepage
          This is why Linux distributors supply custom built kernels in different flavors. In desktop distributions like Kubuntu or Mandriva, the standard kernel is in fact configured to be responsive for desktop use.
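
          (You can check how a given kernel was tuned -- on a desktop-flavored kernel you would expect something like the following; exact option names vary by kernel version, so treat this as a sketch:)

              $ grep -E 'CONFIG_PREEMPT=|CONFIG_HZ' /boot/config-$(uname -r)
              CONFIG_PREEMPT=y      # full kernel preemption for low latency
              CONFIG_HZ_1000=y      # 1000 Hz timer tick; server builds often use 100 or 250
              CONFIG_HZ=1000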
        • Re:Don't think so (Score:5, Interesting)

          by HermMunster ( 972336 ) on Tuesday July 24, 2007 @01:37PM (#19972321)
          He's getting the response he is because of the claim that Linux is a failure, which only feeds the Windows fanboys. Linux is in no way a failure on the desktop. It just isn't as widely accepted as a viable desktop because so many people don't know anything about it as a desktop OS, or that it even exists. Focusing on that--getting the word out--is what will ensure Linux on the desktop.

          The good thing is that Linux, GNU, and Open Source development are moving along at a faster pace than Windows, and sooner or later Linux will begin to surpass other OSes and GUIs in features, stability, flexibility, future potential, etc. (if it hasn't already). There are weak spots, as all products have them. I think Open Source will respond to those weaknesses faster than a monolithic monopoly ever could. Not to mention there are huge numbers of potential developers who will be creating prior art and even IP that companies such as Microsoft can only steal if they want to move ahead. That's a tremendous boon.

          What also troubles me is that Linux, GNU, and Open Source tend to react to technologies instead of really developing new technological ideas. We see that feature such-and-such has been created, and it is often reproduced, though maybe in a superior way. What I'd like to see are more unique ideas coming from the Linux community itself, thus ensuring that some key new technological concepts come from Open Source. It is rather like when John Warnock founded Adobe and created PostScript for the Apple Mac and the laser printer. It was a technology like that which propelled Apple to the front of certain markets, and it is what made John Warnock the rich man he is today. I can just see some killer app being developed for Linux which draws people into the industry created and supported by so many of us.

          Convincing companies such as Adobe to port their applications to Linux will also help change the landscape. The issue is: why would a company develop for such a small market? Well, as we have seen in the past couple of years, Ubuntu has approximately 20 million users worldwide, and with all the other distributions combined we come near 100 million users worldwide. That's a huge market versus what Adobe had when it was working on PostScript and the laser printer with Apple--certainly a much greater potential market, even for some of the smaller technologies. Personally, I don't care if software costs money. And I know software can be developed for the Open Source operating systems without forcing the developers to use Open Source code. So the potential is there for a huge market to make some people very rich selling software to Linux users.

          I don't recall the guy's name, his exact quote, or the precise context, but I do recall what he was getting at when he said something like "in our fight for racial equality we should have put more emphasis on buying land/property and been less strict about fighting for equality, as equality is bound to happen in a free society." What he meant was that if they had bought land, they'd have it as a valuable resource--something to ensure the future. They should have focused on that as much as they did on getting equal rights, because equal rights were bound to happen; maybe it would have taken longer, but it was bound to happen. What I'm getting at with this story is that Linux (as in every participant, every volunteer, every developer) should be focusing on building up the IP and prior art to keep companies such as Microsoft from getting patents on them. We'll get parity sooner or later on the desktop. Let's own the land upon which the IP is based, so that the monolithic monopoly doesn't lock Open Source out of some key advances. I'd rather see Open Source lock out the commercial entities than have the freedoms I desire held hostage to the extortion attempts we've seen Microsoft use in the past.
          • Re:Don't think so (Score:5, Insightful)

            by sxeraverx ( 962068 ) on Tuesday July 24, 2007 @02:15PM (#19972971)
            Linux CANNOT have a killer app, because it contradicts what Linux stands for: Freedom, Openness, Choice, to name a few. If the Linux community creates something, it's damn well going to be F/OSS, and therefore, portable to just about any other platform. The fact that something is proprietary is the essence of what makes it "killer," and that just might be why Linux hasn't been able to dominate.
          • Re:Don't think so (Score:5, Insightful)

            by electroniceric ( 468976 ) on Tuesday July 24, 2007 @03:08PM (#19973705)

            The good thing is that Linux, GNU, and Open Source development are moving along at a faster pace than Windows, and sooner or later Linux will begin to surpass other OSes and GUIs in features, stability, flexibility, future potential, etc. (if it hasn't already). There are weak spots, as all products have them. I think Open Source will respond to those weaknesses faster than a monolithic monopoly ever could. Not to mention there are huge numbers of potential developers who will be creating prior art and even IP that companies such as Microsoft can only steal if they want to move ahead. That's a tremendous boon.
            Wow, those are some big shoes to fill, and filling them rests on some pretty big ifs.

            Read The Mythical Man-Month. One of the most cogent things Brooks has to say is about project coherency, best exemplified in the desktop world by Apple. What Macs give you above all, their primary value proposition, is coherency of design.

            Coherency tends to be one of the weakest suits for many or most Open Source projects, especially those without a central entity to define the direction. The exceptions tend to be server or kernel-side: Apache, Linux Kernel, databases, etc, and I'd claim this is because there's a well-defined set of CS problems being solved there. KDE, which I use daily, has absolutely no coherency of design. That's why it does well as a testbed for new features, approaches, etc, but very poorly at consistency of experience.

            Brooks' argument, which is pretty credible, is that coherency comes from having one or a few project architects and consistently returning to their vision. They absolutely need to spend a lot of their time listening to users and developers and reacting to their feedback, but ultimately someone's vision is what makes a codebase hang together. Con is saying that the architects of Linux are basically not that interested in the desktop experience on vanilla hardware, because they're most interested in more traditional CS questions that tend to play themselves out much more in the enterprise space than in the desktop space. As a non-CS guy in the software development world, this really strikes a chord with me. The Linux desktop is built on very similar components to the Mac desktop, yet is worlds away in usability. And that's basically because a) nobody is defining, shepherding, and advocating usability requirements at the OS level, and b) the desktop projects don't have an architect/requirements definer at all.

            The rest of the article, and particularly the extravagant claims about success and failure, are pretty much what you'd expect from a smart, non-CS, hardworking, disgruntled community member who has not been taken as seriously as he ought to have been. The same dynamic pretty clearly played itself out in the climate change debate over the "hockey stick," where Mann et al. were too dismissive of the smart, hardworking, somewhat contrarian, non-climate-science authors of counterclaims, McKitrick and McIntyre (M&M). Mann's work has withstood M&M's criticism well, and frankly M&M dropped the ball on some key items (like not properly modeling how various quantities vary with latitude--a big blooper in climate science), but the whole debate would have had less drama (and therefore been less ripe for political cherry-picking) had M&M not been seen to be marginalized by the climate science community. To me the lesson is not the technical merits of Con's solutions, but the lack of serious attention to his points about where the focus is in kernel development. That's the interesting part of this story, and one that Linus should really take to heart.
            • Re:Don't think so (Score:4, Insightful)

              by synthespian ( 563437 ) on Tuesday July 24, 2007 @06:01PM (#19976099)
              The Linux desktop is built on very similar components to the Mac desktop, yet is worlds away in usability. And that's basically because a) nobody is defining, shepherding, and advocating usability requirements at the OS level, and b) the desktop projects don't have an architect/requirements definer at all.

              And whose fault is this? How many usability studies has GNOME conducted? Novell, IIRC, has done only a handful, many years later. And KDE set up a usability group only a year or two ago (and I've yet to read any paper from it). Not only that, GNOME has adopted the practice of not even paying attention to bug reports (look up Eugenia Loli-Queru's argument with the GNOME project on this).

              Almost all the free software GUIs are not innovating *at all* on usability. They are all about little cosmetic changes. Mac OS X and Vista have left them behind the curve (and don't mention Beryl... what's the point of a spinning cube?! How does that increase usability? Or wobbly windows?!). Sometimes they innovate a little, but in the opposite direction, like Ion.

              And frankly, when someone tries something new, nobody pays attention. Like OpenCroquet. Like some experimental Java desktops. You can't really expect anything else from developers hellbent on C programming... What can you expect from GNOME? All I expect from a C project of that size is that it's going to fall further and further behind the curve... We can't even expect anything from the likes of Novell: their Mono is not really being developed as a multiplatform tool, is it? (So, no FOSS desktop like GNOME or KDE.)

              The real shame is having companies that are basically full of non-creative individuals injecting money into FOSS.

              By the way, "Linux" is not the only Unix-like OS that uses GNOME and KDE.
          • Re:Don't think so (Score:4, Insightful)

            by kklein ( 900361 ) on Tuesday July 24, 2007 @09:06PM (#19978153)

            Okay, I'll be as clear as I can be here: Linux will never take over the desktop. Ever. Ever. Why? Because it's a pain in the arse.

            Never, in all my years of working on the Mac and Windows, have I been required to type something like "sudo vim /etc/X11/xorg.conf" and then try to tell my computer to display something over 640x480 resolution--and even then not having it work, even after following 3 different, progressively complex, methods of getting an nVidia driver to work.
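
            (For what it's worth, the sanctioned fix on Ubuntu of that era was to regenerate the X config rather than hand-edit it -- a hedged pointer, since the exact tooling varies by release and driver:)

                sudo dpkg-reconfigure xserver-xorg   # re-run the X configuration questions
                sudo nvidia-xconfig                  # or let NVIDIA's own tool rewrite xorg.conf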

            Every year or so, I try to set up a Linux machine with whatever the new darling distro is. Only once have I gotten one to work acceptably, and there were still issues I wasn't happy with. And that took about a week of reading poorly written manpages. Just the other day I gave Ubuntu 7.04 a shot. I gave up after 2 hours of fiddling to get working video.

            That is after having to futz with my CMOS to boot it--a step most people wouldn't know to do.

            Linux people are, and I'm going to be brutally honest here, morons. Not computer morons, obviously, because they have the skills and general knowledge required to get Linux to at least boot and display video properly, but morons because they lack even a basic understanding of what other people want from computers. Linux people are, and this will be news to precisely no one, geeks. As such, their opinions on computers are absolutely irrelevant to anyone other than fellow geeks.

            People do not want to fuss. They want to buy a computer, turn it on, and start putting in software they bought at Wal-Mart without ever even thinking about what is going on below the UI. Hell, as far as most of them know, there ISN'T anything below the GUI. That's what it has taken to get the computer into every home in every developed country in the world: compatibility and ease-of-use.

            Linux offers neither of these things.

            Ultimately, the FOSS model is fundamentally flawed. People write things they find fun or that they really need--motivations we in the education business refer to as intrinsic, which is the best kind of motivation there is. The problem is that no one finds things like video drivers fun. There's no huge drive to make sure all the features of the video card are supported, because you won't need them anyway. So, without some kind of extrinsic motivation, like profit, certain jobs just never get done--or at best, get done half-assedly.

            This problem is exacerbated by the fact that the people doing the developing are uber-geeks (we know this for certain because they are evidently coding for fun), and therefore don't sweat having to tweak a text file here and there. They pat themselves on the back for getting it to run at all (as they should--it's quite the accomplishment, and something to be marveled at!) and get so excited that they mistake this small success to be proof that everybody can and should be running Linux just like them. But they shouldn't, because (polishing off my old Slashdot chestnut)...

            Linux is a toy.

            It is a hobby OS. People have gotten this claptrap toy to do some pretty great things, and it's a no-brainer for any kind of application where the computer isn't expected to do anything very exciting (games, iTunes, iMovie/Windows Movie Maker, hook up any random scanner you buy--Only geeks are "excited" by hosting webpages and/or directing network traffic) or where you need a really small footprint (embedded). But that does not a desktop OS make. Not for the unquantifiably vast majority of computer users, anyway.

            Look, everyone hates Microsoft. Apple has their own hassles to deal with. But both are so astonishingly better at serving the customer's needs and desires than the Linux distros will ever be that the fact that some people even need that pointed out to them simply demonstrates, clearly and unequivocally, that those people are, as I have already stated above, morons.

            I'm sorry, but it's true.

        • Re:Don't think so (Score:4, Insightful)

          by Daniel Phillips ( 238627 ) on Tuesday July 24, 2007 @03:16PM (#19973831)

          His point is that the kernels are optimized for servers. That is, focus on throughput, performance, but not latency or responsiveness.
          Actually, the poor interactive performance of the Linux scheduler was due to a combination of a server-oriented performance hack (O(1) scheduler) and an ineffective attempt to propagate the notion of "interactivity" between processes. So in this case, both a server hack and a desktop hack contributed to the problem.

          Thankfully fixed now, due to Con figuring out how to satisfy both efficiency and latency objectives with a single scheduler, and Ingo rudely but efficiently pushing his own interpretation of Con's work into mainline. Moral of the story: sometimes the process is bumpy and feelings get hurt, but the code doesn't care, it just keeps getting better.
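
          (On a kernel with the merged CFS you can inspect the relevant knobs directly -- paths and names are from later 2.6 kernels and may differ on yours, so treat this as a sketch:)

              $ cat /proc/sys/kernel/sched_latency_ns          # target period for cycling through all runnable tasks
              20000000
              $ cat /proc/sys/kernel/sched_min_granularity_ns  # floor on any one task's slice
              4000000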
      • Re:Don't think so (Score:5, Informative)

        by BlueStraggler ( 765543 ) on Tuesday July 24, 2007 @01:34PM (#19972279)

        All that enterprise crap is what keeps the platform solid and almost crash free.

        I want to agree with you, I really do. But my SuSE 10.1 desktop regularly has fits where it becomes completely unusable--if I can manage to get a shell, I find that the load has spiked to 5-10 (on a single-core system) when the system was doing *nothing*. Just this morning, I woke up, poured a bowl of cereal, walked over to it to read some Slashdot over my Cheerios, and found the system thrashing and refusing to come out of the screensaver because the load was so high. This happened while I was sleeping. I had to ssh in from my Powerbook and kill off any processes that appeared to be using CPU before the system would respond to the mouse.

        Meanwhile at work, we just tossed an Ubuntu server that should have been reasonably swift, but was regularly DOS'ing itself by spiking to loads of 40 or more several times a day under normal use. A load of 40-60, on a single-core machine! We "fixed" it by spending thousands of dollars replacing it with a pair of multicore beasts with scads of memory and fast disks, which seems to overpower the problem.

        Then there's that server belonging to a client, a RHES 4 system. When I ssh in through a tunnel to update it, it insists on running the update program as an X client for crissakes. Then it tells me to register the system at a URL, but the URL cannot be selected or copied to the clipboard. This is "enterprise" quality software?

        Back at work, the dev server is still a RedHat 7.3 clunker. It has a half dozen developers fine-tuning their infinite loops, fork bombs, broken joins, buffer overruns, and spaghetti code, all day long. It simply never crashes or hangs, never gets slow, and never complains about the abuse it receives. It's a rock-solid dream. Except that it's a damn nuisance to update, since it's so old. And it's only hobbyist-quality software, after all, built before RedHat went all enterprise-centric.

        Posted, with regrets, from my Powerbook. I'm starting to think that software built for the home user is a safer bet than the "enterprise" shite I'm dealing with every day.

    • by timrichardson ( 450256 ) * on Tuesday July 24, 2007 @01:27PM (#19972167) Homepage
      Halfway through the interview I slammed on the mental brakes. Linux is so famous for getting more from old hardware. My Debian distribution boots much more quickly than Windows. And waiting for me in apt-get is an upgrade to a new kernel with a new "fair" scheduler. After slamming the brakes, I didn't get off the bus, though. Con is a great guy, looking for 120% activity in his life. His insights are more to do with kernel development than Linux on the desktop. Con: success with your further endeavors, and for sure you will find something related to computers quite soon. An Amiga user never gets that out of their system.
    • Re:Don't think so (Score:4, Insightful)

      by Qwavel ( 733416 ) on Tuesday July 24, 2007 @01:58PM (#19972699)

      Yes, Apple has succeeded to take market share from MS on the desktop, while Linux has failed, but you are leaving out an extremely important piece: MS was working very hard to make sure that Apple displaced Linux. Tons of evidence has surfaced in e-mails and in interviews with ex-MS people that MS saw Linux as a real threat, whereas they saw Apple as safe and even as useful (in relation to the Justice Department).

      Now, MS never intended that Apple should take over digital media, marginalize WMA, etc. They miscalculated on that, but on the desktop, Apple has managed to make MS look much better. Not only can they claim that there is competition on the desktop, but now it is harder to blame them for charging too much or for promoting lock-in.

      So, I think the statement you are repudiating could be modified to "how Microsoft has succeeded in crushing Linux in personal computers," because it is Linux and open source that they wanted to crush, not innovation.
  • Some of us find it quite up to the task. The choice of desktop OS is up to the consumer, and their individual needs. Some people need Windows, some people need Mac. Some of us need Linux because Windows and Mac have failed on OUR desktops.
    • by slickwillie ( 34689 ) on Tuesday July 24, 2007 @11:33AM (#19970301)
      It's been working fine on my desktop since Slackware '96.
    • by LWATCDR ( 28044 ) on Tuesday July 24, 2007 @11:42AM (#19970453) Homepage Journal
      Wow, you really didn't bother to read the link, did you?
      The author was speaking about how poorly Linux performs ON the desktop--things like audio skipping and the desktop feeling slow. He was talking about how the kernel is so slanted toward big iron and the server market that it has ignored desktop performance. He was also talking about how hard it is to create benchmarks that show interactive responsiveness.
      He also talked about how hard it is for "normal" users to communicate problems to Kernel developers.

      What he is talking about is how Linux has failed to perform as well as a desktop as it does as a server.

      What most people have failed to notice, or care about, is that this is a person who actually tried to fix the problems by writing code! He was truly working under the FOSS ideal, and he has given up.

      Too bad so many people are dismissing what he has to say.
      • Re: (Score:3, Insightful)

        by cyphercell ( 843398 )
        the title of the article is flamebait, of course people are going to respond to it.
      • by jedidiah ( 1196 ) on Tuesday July 24, 2007 @12:08PM (#19970907) Homepage
        Utter rubbish.

        I use Linux as a PVR and it's more than up to the task. It can maintain adequate performance and responsiveness even when doing heavy number crunching. My MythTV boxes are quite often running at 100% cpu and a load average of 5 or 10.

        Forget "audio skipping".

        Let's try realtime video capture + realtime video decoding + 3 video transcoding jobs all going at the same time.

        I can even still use my mythbackend as a desktop with very respectable responsiveness while all of this is going on.

        "most people" are at a loss to see what his problem is.
  • It hasn't (Score:4, Informative)

    by jshriverWVU ( 810740 ) on Tuesday July 24, 2007 @11:22AM (#19970125)
    Been using it as a desktop since 96, and have several friends who've been using it as a desktop for more than 5 years. Even my girlfriend uses it as a desktop now, and had only 1 day to "convert" to the usage, and she's not that computer savvy.

    Now it's all in the marketing and politics, but on the software side it's there.

    • Let's just nip this little tangent in the bud, shall we? He's saying the Linux kernel is so bloated with enterprise-level crap, and so optimized for the server role, that it performs poorly on the desktop.
    • and she's not that computer savvy.

      Perhaps that was the cause of her lack of problems. My guess is that she didn't need much from the computer anyway, apart from writing stuff for homework.

      An OS is like a car. The more you know it, the more you tune it, and you optimize and squeeze every little bit of performance from it to fit your needs. You install new seats, a more comfy steering wheel, cup holders, etc etc.

      The problem comes with the change. One day you're sitting in your perfectly tuned American FR car with a V8 engine, and then you switch to a European car. And suppose it's a 4WD or FF with a rotary engine. For starters, the steering wheel is on what you knew as the passenger's side. You have to shift gears with the left hand instead of the right. You have to look to the right instead of the left.
      It's much worse when you realize that the knowledge and tools that helped you tune your old car don't work with the new car (how the heck do you fix a rotary?). It's a completely different monster, and you have to RELEARN EVERYTHING FROM SCRATCH. Lots of knowledge lost.

      For example, to quickly search for a file in Windows, I open a command line and type dir *mask* /s /b. In Linux it's a different command (find -name), and if you're not logged in as root, you get all these "access denied" warnings (where the heck did I put that web server root directory?).

      To get help, you don't type "command /?". You type "man command", and then you have to scroll through pages of explanations that you don't fully understand. (And don't get me started on the config.)
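
      (A hedged tip for anyone making the same switch -- neither command is exotic, and together they remove most of the pain just described; the search pattern is a placeholder:)

          find / -name '*mask*' 2>/dev/null   # discard the "access denied" chatter
          man -k keyword                      # apropos: search man page summaries instead of scrolling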

      Back to the cars analogy. If you're just LEARNING to drive, "ah, it has a steering wheel and pedals." It's easy. Of course it's easy! Because you don't know ANYTHING.

      The real problem with switching to Linux is having to UNLEARN every bit of knowledge you've gained about windows with the years. It's much more painful when you're a Windows power user.
    • by zettabyte ( 165173 ) on Tuesday July 24, 2007 @12:39PM (#19971395) Homepage

      Even my girlfriend uses it as a desktop now, and had only 1 day to "convert" to the usage, and she's not that computer savvy.

      That has to be the most brilliant use of vendor lock-in to date!

      I'm going to go start a migration plan for my girlfriend to switch to Linux. Then when she gets tired of my geeky arse she'll have to think twice about dumping me!

      :-D

  • Applications (Score:4, Insightful)

    by grasshoppa ( 657393 ) on Tuesday July 24, 2007 @11:23AM (#19970127) Homepage
    The answer is simple: application support. That's why desktop Linux has failed. Never mind the rest of the chatter; I can tell you that had I had the applications needed, I would have switched two organizations over to Linux desktops by now, possibly more.

    And it's not a problem of performance; it's a question of politics. We have to convince enough software vendors to start coding in a cross-platform language/way.
    • Re:Applications (Score:4, Insightful)

      by slackmaster2000 ( 820067 ) on Tuesday July 24, 2007 @11:45AM (#19970501)
      Agreed.

      There are lots of applications out there, but perhaps too many. So many applications have everything but that one critical feature that an organization has come to rely on. "But it's open source and you can implement it yourself!" True, but that costs real money too. Quite a bit actually.

      And then there are the vertical applications that we can't move away from because we've got years and years worth of data stuck in the swamp. Yes, we could migrate but at what cost? Business doesn't care about operating systems or information philosophy, it cares about getting the job done to make money. It would take a considerable cost advantage to move an organization of medium size or larger from a Windows environment to a Linux environment.

      I spent years on and off trying to figure out how to move my company to Linux both on the desktop and the server. It's just too much, even still. Our business is manufacturing Widgets, and we get along just fine in our Windows world. If we were starting over from scratch today with the 5 or so employees we had when I first started a decade ago, I would make different choices. I despise 3/4 of Microsoft server products, and I hate the cost of MS Office.
    • Re:Applications (Score:5, Insightful)

      by Asphalt ( 529464 ) on Tuesday July 24, 2007 @12:07PM (#19970877)
      I don't think Linux on the Desktop has failed. I have been running it in one form or another since the mid-'90s, and have an Ubuntu machine now. As a matter of fact, every machine here but this one is running Linux on the desktop.

      But ... I am currently using Windows Vista as I am typing this.

      Why?

      Because I got a new machine. Quad Core with 2x8800GTX cards powering 3 monitors and an X-Fi Sound Card. It looks and sounds great.

      But when I tried to install Ubuntu 7.04 on it over the pre-installed Vista, I got a blank screen. Apparently the 10-month-old 8800GTX drivers are not included on the Ubuntu install disk. Yes, there are some workarounds--using a text install, installing Envy from a shell, and some other tips that may or may not work (results have been mixed)--but it's a leap of faith. People who are running the 8800 cards in Ubuntu have been generally disappointed with their performance, from the reading I have done on the Ubuntu forums, finding them slower than 7xxx-series cards, and even slower than 6800s. What a waste of expensive graphics cards. And there does not even exist a driver to power my X-Fi soundcard. So I would not get sound. Sweet, a computer with no sound. All that music I downl ... I mean BOUGHT ... would never get heard.

      And that may be a bit of a problem for the Linux desktop. It is hard to start out with a Linux desktop if you have pseudo-cutting-edge hardware. Many people buying new machines have to wait some time for stable, easily installable drivers to appear for their hardware, and by the time they appear, those people are already fully entrenched in Windows, have their file structure laid out, etc.

      I am sure I will eventually have Linux installed on this machine, but it will be long after it has ceased to be a high-end machine. I am not going to waste good hardware on drivers that don't work, or that work sub-optimally.

      But this is not the fault of Linux. The folks who release the drivers just don't care too much about Linux. That is the problem.

      In two years, this will be a killer Linux workstation. Today, it would make a shitty Linux workstation.

      So, Vista it is for the time being. I have already gotten a BSOD. The OS is nuttier than a squirrel's turd and is a general pain in the ass. But my applications run, everything installs the moment I plug it in (joystick, pocket PC, Bluetooth adapter, SD cards, etc.) ... I can see what I am doing, the audio sounds great, and I can get things done.

      Would I rather run Linux? Yes. Vista thrashes the disk around like crazy the whole time the machine is on, and it can only see 2.5 gigs of the 4 gigs of RAM I have installed. I suppose I could shell out a few hundred for 64-bit Vista, but who knows what drivers will and won't work in that.

      But at least I have audio and video on the OS I have now. It's an imperfect world.

      The Achilles heel for Linux desktops has been and always will be fast and easy driver support, IMHO.

      Linux works great on slightly older hardware, but by the time the hardware is slightly older, it is more difficult to get converts. People tend to dance with who brought them, and on most machines, that is Windows.

  • by Agent Green ( 231202 ) on Tuesday July 24, 2007 @11:23AM (#19970141)
    And that enterprise crap in Linux saves companies an incredible shitload of money. Enterprise users also have the muscle to keep their systems up to date. The back-office stuff is the more important arena to win, IMHO.

    Desktop users are fickle ... and that's why Linux has failed on the desktop. However, Ubuntu has made incredible progress on this front.
    • Re: (Score:3, Insightful)

      by spun ( 1352 )
      Note too that much of the work being done in open source these days comes from companies like IBM, Red Hat, and Novell, not from Joe Q. Randomhacker. These companies see the server market as the largest, most profitable Linux market. That's where they're throwing their development dollars. Hey, here's an idea: why not make a desktop distribution without all that enterprise crap in the kernel?
  • by ArcherB ( 796902 ) * on Tuesday July 24, 2007 @11:25AM (#19970169) Journal
    One of the strengths of Linux is also its biggest weakness. If someone has a computer and for some strange reason needs to install an OS, which Linux distro do they choose? I've run Linux for years and I still can't name all the available distros. I doubt ANYONE can.

    Another problem is the MS dominance of the OS market. It's hard to buy a computer without Windows and even harder to purchase one with Linux preinstalled. Your average computer user is not going to purchase a computer that won't run (because it has no OS), and even if they did, when they go to the store to pick up an OS, all they see is Windows.

    Linux users need to stick to a distro that works, is easy, is well known, and comes as an option to be preinstalled on computers from the majority of manufacturers, even if it is alongside Windows or as a bootable DVD thrown into the box.
    • by gardyloo ( 512791 ) on Tuesday July 24, 2007 @11:35AM (#19970335)
      I've run Linux for years and I still can't name all the available distros. I doubt ANYONE can.


      I can't name them, either. But I also can't name all the available versions of Windows. So what?
      • Re: (Score:3, Insightful)

        by ArcherB ( 796902 ) *

        I've run Linux for years and I still can't name all the available distros. I doubt ANYONE can.

        I can't name them, either. But I also can't name all the available versions of Windows. So what?

        Let's see how I do:
        Available?
        XP Home, Server, and Enterprise
        Server 2k3
        Vista Home, Professional, and Ultimate.

        Let's see how I did by comparing to this [wikipedia.org] Wiki page. I missed:
        Windows Vista Starter
        Windows Vista Enterprise
        and the Windows 2k3 server editions. There are six, which I lumped into one.

        Of course, I skipped the embedded editions. Also, many of those are Enterprise editions of 2003 that people probably won't confuse for a desktop OS (well, they might until they see the price tag!). Also, I skip

    • Re: (Score:3, Interesting)

      Your average computer user doesn't go to the store to pick up Windows; Windows comes on the computer they just bought, and they don't know that they have any other options. The problem is not the number of distros; the problem is the lack of distros pre-installed on OEM computers.

      Plus, if you're not happy with a particular distro, you can try another one, for free, and with a minimum of effort. I've gone through 3 or 4 over the years before sticking with Kubuntu.
    • by vivaoporto ( 1064484 ) on Tuesday July 24, 2007 @12:07PM (#19970883)
      You know, the same thing happens to me--but instead of Linux distributions, it happens with cars, televisions, and other goods.

      If someone has a faraway job and for some strange reason needs to own a personal vehicle, which car do they choose? I've driven cars for years and I still can't name all the available models. I doubt ANYONE can.

      Car drivers need to stick to a model that works, is easy, is well known, and comes as an option to be sold at every auto shop and used-car fair, even if it is alongside the models from the brand that particular auto shop represents.

      I cannot imagine a reasoning more beaten up and less relevant than this one. While it is true that people prefer pre-packaged goods, too much choice was never a problem in other markets. There is a multitude of car brands, TV brands, and beer brands, all of them differing in one way or another, but every one of them catering to its target audience. And we do not see people fighting to get this or that car (beer, TV set) brand to dominate the market because of some supposed technical superiority, better taste, features, etc.

      That is because the best alternative for one person may not be the best for another, because people's tastes differ and people's needs differ. The only difference between that and computer operating systems is that the collaborative culture brought by the microcomputer "revolution" makes people expect a level of interoperability and interchangeability between these differently branded machines that they don't expect in other goods, like cars, for instance.

      And for the lack of interoperability we have nobody to blame but certain proprietary software companies (there are too many of them to enumerate by name, but you know the ones I'm talking about), which could have agreed on standards that would promote interoperability (imagine where the industry would be if it hadn't agreed on ASCII, for instance), but instead put their short-term gains first and helped push the whole industry back a couple of decades.

      To summarize: too much choice happens everywhere, and it is a good thing, including in computing, as long as there is interoperability among the choices. Linux (the kernel) and most of its userland are open source with open specs, so the lack of interoperability can't be blamed on them.
    • Re: (Score:3, Insightful)

      One of the strengths of Linux is also its biggest weakness. If someone has a computer and for some strange reason needs to install an OS, which Linux distro do they choose? I've run Linux for years and I still can't name all the available distros. I doubt ANYONE can.

      So? I can't name all the different kinds of laundry detergent (or even of Tide laundry detergent), but it's not hard for me to find one that works well enough and use it.

      Why would I need to be able to name all of the Linux distros in order to use on

  • Again??? (Score:4, Insightful)

    by Pedrito ( 94783 ) on Tuesday July 24, 2007 @11:31AM (#19970269)
    Everyone has a take on it. Haven't we had this discussion a hundred times on Slashdot?

    My personal opinion, after having used Linux quite a bit, is simply that Linux isn't ready for the desktop. While many apps have easy to install packages, a lot of apps don't. Particularly smaller, single-developer shareware kind of apps. Many of these require getting source and compiling, something my mother or grandmother won't be able to do.

    Speaking of my mother and grandmother, the other thing they already find confusing enough is the Windows directory layout. Linux is FAR more complicated in that department. They'd find organizing their documents much more difficult.

    Finally, frankly, I don't find the UIs all that intuitive to use. I've used Gnome and KDE. I prefer KDE, but I have issues with both. It took me a while to figure out how to drag and drop gzip compressed files from KDE. I can't even remember how it works off the top of my head, I'd have to go do it again. But it definitely wasn't as intuitive as drag and drop from say WinZip to a folder in Windows.

    The fact is, Linux just isn't ready for the desktop. Don't get me wrong, huge strides have been made over the past few years in usability and I suspect it'll get there eventually, but it's not there.

    Another issue is the community, which in many places is hostile to newbies. I've been insulted on more Linux support forums for asking questions than I've ever been on Windows support forums. There are places to get good support for Linux, but there are a lot of really hostile ones too. Windows may have some hostile ones, but I just run into it far less frequently.

    This is just my personal opinion, based on my experiences with it. Other people may have had different experiences. I still love Linux for certain things and I run a Linux box as a file server, firewall, database server and for video editing. I'd never trust connecting a Windows box directly to the internet, but I've always trusted Linux for that. But as a desktop environment, it just doesn't work for me.
  • by Compholio ( 770966 ) on Tuesday July 24, 2007 @11:32AM (#19970279)
    The article really focuses on how quickly the desktop responds to user operations. I haven't personally found this to be a problem on the 2.6 kernels; however, to say that work is not being done in this area is unfair. Kernel Trap has had several articles on people working on CPU schedulers to address this problem, recently the Completely Fair Scheduler was merged to potentially solve this problem: http://kerneltrap.org/node/11773 [kerneltrap.org].
    • Re: (Score:3, Insightful)

      by Anonymous Coward
      recently the Completely Fair Scheduler was merged

      That _might_ be one of the reasons he's pissed off, you know ...

      • by PCM2 ( 4486 ) on Tuesday July 24, 2007 @03:07PM (#19973681) Homepage
        I'm surprised (but I guess not shocked) that there hasn't been more discussion on /. as to the technical matters behind what he's saying. I, for one, do not follow Linux kernel development closely enough to be up on any of this stuff. If you make it that far in TFA, though, you'll find that his main gripe was the incredible resistance he got to his desire to include a "fair" CPU scheduler in the kernel. He even went so far as to develop a pluggable architecture that would allow you to pick which scheduler you wanted at boot time, but this was also met with resistance. Then you get this:

        Then one day presumably Ingo decided it [fair scheduling] was a good idea and the way forward and... wrote his own fair scheduling interactive design with a modular almost pluggable CPU scheduling framework... and had help with the code from the person who refused to accept fair behaviour in my flamewar.
        Presumably this is not the whole story, but I'd expect /. to talk at least a little bit about this aspect of the story, rather than all these "Linux on the desktop" comments we get. How does Ingo's new CFS compare to the code Kolivas wrote? Which design is superior? Does Ingo's design actually borrow from Con's code, or does it just do more or less the same thing? And what about Con's implied accusation that the kernel development process is impenetrable, both to end users and even key developers when they reach an impasse with one of the "elite" -- is this a fair criticism? Like I said, there's no way for me to answer these questions for myself with my current knowledge.
  • Wrong problem (Score:5, Interesting)

    by pubjames ( 468013 ) on Tuesday July 24, 2007 @11:33AM (#19970303)
    Linux on the desktop has been gradually improving, and is now at a point when it is probably pretty much equal to Windows. It may even surpass it in the medium term.

    But how good it is isn't really the issue. The fact is, Microsoft has an incredible lock-in, and it is going to take many years to chip away at that. But Firefox has demonstrated that it is possible to win market share from Microsoft. The two essential ingredients are persistence and time. If Microsoft continue to stumble - as they have with Vista - then Linux on the desktop will happen more quickly.
  • by hey! ( 33014 ) on Tuesday July 24, 2007 @11:38AM (#19970381) Homepage Journal
    What is "Linux" for that matter?

    If by "failed" you mean "failed to achieve X market share", I should think the answer is obvious: normal people don't give a flying fuck what kernel their operating system uses. And since their computers come with Windows preinstalled, they are not going to swap operating systems to get a better kernel -- or a better license. Even MacOS wouldn't be where it is, if it was developed and sold as a purely OS product, instead of being bundled with Apple computers.

    On the server end, people are concerned about capacity, performance, and licensing restrictions, so it's a different ball game.

    People have only two problems with the Linux kernel, and neither of them is due to the existence of enterprise features: (1) the USB doodad they just bought doesn't work automatically and (2) the specific application doesn't support any version of Linux. As to why this is so, it all comes back to the fact they don't care what kernel they have and they already have Windows, so people in the business of catering to them don't bother to do anything to fix these problems. If they did, user apathy means it wouldn't make a big difference in Linux desktop adoption.

    In the end, this is a situation that only Microsoft can change, and only by screwing up. Maybe they have with Vista, but I think not. Vista will be like the old 640K DOS memory limit. Industry (other than MS) will move heaven and earth to accommodate it should it become the status quo, which given user indifference will probably happen.
  • by spaceyhackerlady ( 462530 ) on Tuesday July 24, 2007 @11:41AM (#19970427)

    I'm typing this on a Linux desktop. It's a pretty hefty system (dual-core, 2.8 GHz, 4 GB RAM), but it earns its living, I assure you. It's Slackware, with a custom kernel. As I've mentioned before, my view is that the distro kernel is solely there for bootstrapping the system until you can build a custom kernel to match your hardware and your needs. It's open source. We can do that, you know.
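
    (The classic custom-kernel loop, roughly -- details differ per distro, and Slackware users will have their own habits, so treat this as a sketch:)

        cd /usr/src/linux
        make menuconfig        # select only the drivers your hardware actually needs
        make && su -c 'make modules_install && make install'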

    My biggest frustration with Linux is the notion that Linux systems must emulate Windows to be acceptable (e.g. Mono), and that the Unix interface is a priori incomprehensible, for no other reason than that it doesn't look and feel like Windows. I like the concept of lightweight desktop-oriented distros like Puppy, but I do not like the way they so desperately emulate Windows. Right down to the icons.

    Is that all there is? We have an open-source OS here, with open source applications. If we don't like how they work, we can roll our own. Mindlessly aping whatever Microsoft are dumping into Vista this week is dumb.

    What next, DRM?

    ...laura

  • Coral Cache (Score:3, Informative)

    by Frankie70 ( 803801 ) on Tuesday July 24, 2007 @11:44AM (#19970475)
    Here [nyud.net]
  • Exchange, bitches! (Score:5, Insightful)

    by LibertineR ( 591918 ) on Tuesday July 24, 2007 @11:50AM (#19970611)
    Competing OS are not keeping Linux off the desktop in big numbers, EXCHANGE is, period.

    Anyone familiar with how Microsoft locks in customers will tell you the same thing.

    We have reached a point where neither the desktop OS nor the server OS matters as much as the apps they run. Exchange is the one app that is almost a must-have. Anyone can list the non-proprietary stuff that runs 80% of Exchange's functionality, or 50% but does it better, and so on and so on.

    Give it up, and start building something that takes Exchange on directly, feature for feature, with better recovery, and message pushing to handheld devices.

    Or, maybe just shut up? This has been obvious for years. Microsoft keeps improving Exchange, and enterprises keep buying it, and everything else that goes with it.

    Linux cannot exist on its own with a bunch of 50-to-80% solutions, expecting to fill the gap by the temporary pleasure of giving Microsoft the finger from time to time.

    Either compete or change the game. Only Google and Apple seem to get this.

    And can we stop asking this question over and over again?

  • by delire ( 809063 ) on Tuesday July 24, 2007 @12:03PM (#19970811)
    .. but it is failing on the desktop very unsuccessfully. More and more people are using it, and now even major hardware vendors are reporting great sales of Linux on laptops. Some even say it's the fastest-growing desktop operating system today. Linux is so crap it can't even fail properly!

    My advice? Install it now and help it be even worse at failing.
  • by rs232 ( 849320 ) on Tuesday July 24, 2007 @12:30PM (#19971259)
    "At that time the IBM personal computer and compatibles were still clunky, expensive, glorified word processing DOS machines"

    "Enter the dark era. The hardware driven computer developments failed due to poor marketing, development and a whole host of other problems. This is when the software became king, and instead of competing, all hardware was slowly being designed to yield to the software and operating system design"

    "However, the desktop PC is crap. It's rubbish. The experience is so bloated and slowed down in all the things that matter to us. We all own computers today that were considered supercomputers 10 years ago .. So why on earth is everything so slow?"

    "I watched the development and to be honest... I was horrified. The names of all the kernel hackers I had come to respect and observe were all frantically working away on this new and improved kernel and pretty much everyone was working on all this enterprise crap that a desktop cares not about"

    "Or click on a window and drag it across the screen and it would spit and stutter in starts and bursts. Or write one large file to disk and find that the mouse cursor would move and everything else on the desktop would be dead without refreshing for a minute"

    --

    Why Linux Has Failed on the Desktop

    "Linux is burdened with 'enterprise crap' that makes it run poorly on desktop PCs", Zonk quoting SlinkySausage.

    Quoting him out of context and making him say something he didn't say ... for shame Zonk ... the headline is also misleading.
  • Rubbish (Score:3, Insightful)

    by soccerisgod ( 585710 ) on Tuesday July 24, 2007 @12:31PM (#19971275)

    I can agree with some of the things this guy says, but all in all, it's rubbish. Sure, response times are one thing, and I think they've been addressed very well by preemption features and a configurable scheduler frequency, but to blame a slow desktop experience on the kernel is just stupid. Really stupid. If you wonder where all your megahurzes go, try looking at your KDEs and Gnomes first, your animated gizmos, your 3D desktop gimmicks and applets, and your Java crap.

  • by Animats ( 122034 ) on Tuesday July 24, 2007 @01:40PM (#19972385) Homepage

    First, the Slashdot article is terrible. The article isn't about "why Linux is failing on the desktop", it's about why a kernel developer who was trying to improve scheduling performance quit.

    The scheduling issue is interesting. I used to work on mainframe schedulers, I've done real-time work, and I'm familiar with the issue in game implementation, so I know how hard this is. We could do better than what we have now, but not by some magic fix to the scheduler. We have to look at interactivity as a real time problem.

    It is, too. Alan Kay used to say that there is no more excuse for a delay between pressing a key on a computer and having something happen than there is on a piano. We haven't been faithful to that, and it subtly drives users nuts.

    One useful idea from the real time world is explicit "sporadic scheduling". Some real time operating systems have this. A process can explicitly request that it wants, say, 10ms of CPU time every 100ms. The scheduler must reject that request if the system is overbooked. If it does accept the request, the scheduler has committed that much resource to the process. If the process overruns its time slot, it loses priority and an overrun is tallied.

    This is what an audio or video player should be using. This is how you get audio and video that don't pause or skip. For this to work, the player must be able to calculate, for each system it runs on, exactly what resources are needed to play the current content. This may take more analysis and benchmarking than many programmers are used to doing. It's worthwhile to make overruns visible to tools outside the application, so that users can detect broken applications. To a real time programmer, overrunning your time slot means "broken". You have to think that way.
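
     (Linux did eventually grow exactly this kind of admission-controlled reservation as SCHED_DEADLINE, years after this was written. A hedged sketch using util-linux chrt -- the player is a placeholder, root privileges are needed, and the kernel refuses the request if the system is overbooked:)

         # Reserve 10ms of CPU out of every 100ms for the player (times in ns).
         chrt --deadline --sched-runtime 10000000 \
              --sched-period 100000000 0 mplayer video.mp4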

    On the interactivity front, it's useful for a thread to be able to request a high priority for a short period after an event, with a priority drop to follow quickly if it keeps the CPU too long. That's how you get the mouse cursor to track reliably. Of course, the thread that handles mouse events has to pass off all the real work to other threads, not stall the thread handling fast events.

     It's also probably time to end paging to disk. When it works, paging at best doubles the effective RAM. But paging inherently results in long, unexpected delays. If you want interactivity, don't page. Real-time systems don't. Neither do game consoles. RAM is so cheap that it's not worth it. (1GB starts at US$56 today at Crucial.) Paging devices have been maxed out around 10,000 RPM since the 1960s and haven't improved much since. Give it up. Today, paging is in practice mostly a means of dealing with memory-hogging apps. (Hint: open "about:config" in Firefox and turn off "browser.cache.memory.enable" so it doesn't cache rendered pages for faster tab switching.) It's probably time for Linux not to page interactive processes by default.
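
     (You can already bias the kernel against paging at runtime -- a hedged sketch; the value is illustrative and the default is usually 60:)

         sudo sysctl vm.swappiness=10   # prefer dropping page cache over swapping out process memory
         sysctl vm.swappiness           # verify the setting took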

    This implies an operating system that says "no" when you put on too much load, instead of cramming it in and doing it badly. Open too many windows of video, and at some point the player won't open another one. There's nothing wrong with that, but most Linux/Unix apps don't handle resource rejections from the operating system well.

  • by ckolivas ( 1132603 ) on Tuesday July 24, 2007 @06:41PM (#19976629)
    The chance of being modded up is minuscule, but anyway: I'm Con Kolivas. There is only one thing I'd like to point out about the whole interview. Ashton (the interviewer) chose the title saying Linux has failed on the desktop without consulting me. If you actually read the interview, I never once say that Linux has failed on the desktop.
    • by ckolivas ( 1132603 ) on Tuesday July 24, 2007 @09:52PM (#19978507)
      It seems they were sensitive to my complaint and have changed the title of the story at apcmag now. The slashdot title for the interview and their misquoting was... unfortunate.
    • by sl3xd ( 111641 ) * on Wednesday July 25, 2007 @12:01AM (#19979381) Journal
      Ashton (the interviewer) chose the title saying Linux has failed on the desktop without consulting me. If you actually read the interview, I never once say that Linux has failed on the desktop.

      Well, now you have a personal understanding of why a lot of people are turning from "mainstream" journalism to alternative sources. The journalistic process isn't exactly honest or honorable, is it?

      I did think it odd that after arguing against fair scheduling for quite a while, Ingo et al. decided to implement it (and how rapidly it was dropped into the kernel). I've read a few articles about the sudden change of heart. I'm sorry things worked out that way; I can definitely see how disappointing it must be that you didn't get any credit for championing fair scheduling, nor were you given any involvement in implementing the CFS.

      On the other hand, I also recall reading a paper that was given at OLS 2006 that was more or less stating that "Userspace Sucks"; there's a lot of work to be done there.
