Why Linux Has Failed on the Desktop 995
SlinkySausage writes "Linux is burdened with 'enterprise crap' that makes it run poorly on desktop PCs, says kernel developer Con Kolivas. Kolivas recently walked away from years of work on the kernel in despair. APCmag.com has a lengthy interview with Kolivas, who explains what he sees as wrong with Linux from a performance perspective and how Microsoft has succeeded in crushing innovation in personal computers."
Don't think so (Score:4, Insightful)
I found that rather funny. Blaming Microsoft for your own lack of creativity and ingenuity.
Besides, Steve Jobs would very much disagree.
Re:Don't think so (Score:5, Insightful)
Having many different OSes and computers around, we would be much better off seeing what works, what doesn't, why it does, and how to improve on it. Back in the '80s, if I were asked how a desktop system would look in 2007, I would have given a much different answer (in my mind a 2007 desktop would look more like Plan 9 and less like Windows). But during the '80s the only GUI I had experience with was GEM Desktop, and I didn't particularly care for it. I expected graphics in 2007 to be a bit better than they are now, but the OS in my mind would have frames, not windows.
Re:Don't think so (Score:5, Funny)
Funny, that's exactly the same answer I'd give in 2007.
(The eye candy -- make it stop!)
Re:Independent creativity launches many things... (Score:5, Informative)
And yes, I know it isn't "either the desktop or the server," but I can see his POV as being somewhat right.
Re:Don't think so (Score:5, Insightful)
Sure, some extra code may slow things down, but since Linux, Windows, and even Mac OS X now are all based on server kernels (Linux's own, VMS/WNT for anything newer than Windows 2000, *BSD), they don't crash too much. YOU may have problems with XP or 2000, but you shouldn't. I've had an XP install going for more than four years, and Windows 2000 running for months. (If you can't do this, you should not be using it, 'nuff said.)
Code doesn't care how many employees you have. Maybe this guy belongs at Ubuntu, where things are moving towards the 'desktop'. Just ask my new Ubuntu installation on my laptop - it's running like a desktop just fine. I just finished 5 hours of World of Warcraft on it!
Re:Don't think so (Score:5, Informative)
Re:Don't think so (Score:5, Interesting)
Re:Don't think so (Score:5, Interesting)
The good thing is that Linux, GNU, and Open Source development are moving along at a faster pace than Windows is, and sooner or later it will begin to surpass other OSes and GUIs in features, stability, flexibility, future potential, etc. (if it hasn't already). There are weak spots, as all products have them. I think Open Source will respond better to enhancing those features faster than a monolithic monopoly ever could. Not to mention there are huge numbers of potential developers who will be creating prior art and even IP that companies such as Microsoft can only steal if they want to move ahead. That's a tremendous boon.
What also troubles me is that Linux, GNU, and Open Source tend to react to technologies instead of really developing new technological ideas. We see that feature such and such has been created, and it is often reproduced, though maybe in a superior way. What I'd like to see are more unique ideas coming from the Linux community itself, thus ensuring that some key new technological concepts come from Open Source. It is sort of like when John Warnock founded Adobe and created PostScript for the Apple Mac and the laser printer. It was a technology like that which propelled Apple to the front of certain markets, and it is that which made John Warnock the rich man he is today. I can just see some killer app being developed for Linux which draws people into the industry created and supported by so many of us.

Also, convincing companies such as Adobe to adapt their applications to Linux will help change the landscape. The issue is: why would a company develop for such a small market? Well, as we have seen in the past couple of years, with Ubuntu having approximately 20 million users worldwide, and with all the other distributions combined, we come near 100 million users worldwide. That's a huge market vs. what Adobe had when it was working on PostScript and the laser printer with Apple. Certainly a much greater potential market for even some of the smaller technologies. Personally, I don't care if software costs money. And I know software can be developed for the Open Source operating systems without being forced to use Open Source code. So the potential is there for a huge market to make some people very rich selling software to Linux users.
I don't recall the guy's name, nor his exact quote, nor the precise context of the quote, but I do recall what he was getting at when he said something like "in our fight for racial equality we should have put more emphasis on buying land/property and been less strict about fighting for equality, as equality is bound to happen in a free society." What he meant was that if they had bought land, they'd have it as a valuable resource--something to ensure the future. They should have focused on that as much as they did on just getting equal rights, as equal rights were bound to happen. Maybe it would have taken longer, but it was bound to happen. This is what I perceived he meant. What I'm getting at with this story is that Linux should be focusing on building up (as in every participant, every volunteer, every developer) the IP and prior art to keep companies such as Microsoft from getting patents on them. We'll get parity sooner or later on the desktop. Let's own the land upon which the IP is based so that the monolithic monopoly doesn't lock Open Source out of some key advances. I'd rather see Open Source lock out the commercial entities than have the freedoms that I desire held hostage to the extortion attempts we've seen Microsoft use in the past.
Re:Don't think so (Score:5, Insightful)
Re:Don't think so (Score:5, Insightful)
Read The Mythical Man-Month. One of the most cogent things Brooks has to say is about project coherency, best exemplified in the desktop world by Apple. What Macs give you above all, their primary value proposition, is coherency of design.
Coherency tends to be one of the weakest suits for many or most Open Source projects, especially those without a central entity to define the direction. The exceptions tend to be server or kernel-side: Apache, Linux Kernel, databases, etc, and I'd claim this is because there's a well-defined set of CS problems being solved there. KDE, which I use daily, has absolutely no coherency of design. That's why it does well as a testbed for new features, approaches, etc, but very poorly at consistency of experience.
Brooks' argument, which is pretty credible, is that coherency comes from having one or a few project architects and consistently returning to their vision. They absolutely need to spend a lot of their time listening to users and developers and reacting to their feedback, but ultimately someone's vision is what makes a codebase hang together. Con is saying that the architects of Linux are basically not that interested in the desktop experience on vanilla hardware, because they're most interested in more traditional CS questions that tend to play themselves out much more in the enterprise space than the desktop space. As a non-CS guy in the software development world, this really strikes a chord with me. The Linux desktop is built on very similar components to the Mac desktop, yet is worlds away in usability. And that's basically because a) nobody is defining, shepherding and advocating usability requirements at the OS level, and b) the desktop projects don't have an architect/requirements definer at all.
The rest of the article, and particularly the extravagant claims about success and failure, are pretty much what you'd expect from a smart, non-CS, hardworking, disgruntled community member who has not been taken as seriously as he ought. The same dynamic pretty clearly played itself out in the climate change debate over the "hockey stick", where Mann et al. were too dismissive of smart, hardworking, somewhat contrarian, non-climate-science authors of counterclaims, McKitrick and McIntyre (M&M). Mann's work has withstood M&M's criticism well, and frankly M&M dropped the ball on some key items (like not properly modeling how various quantities vary with latitude - a big blooper in climate science), but the whole debate would have had less drama (and therefore been less ripe for political cherry-picking) had M&M not been seen to be marginalized by the climate science community. To me the lesson is not the technical merits of Con's solutions, but the lack of serious attention to his points about where the focus is in kernel development. That's the interesting part of this story, and one that Linus should really take to heart.
Re:Don't think so (Score:4, Insightful)
And whose fault is this? How many usability studies has GNOME conducted? Novell, IIRC, has done only a handful, many years later. And KDE set up a usability group only one or two years ago (and I've yet to read any paper from it). Not only that, GNOME has adopted the practice of not even paying attention to bug reports (look up Eugenia Loli-Queru's argument with the GNOME project on this).
Almost all the free software GUIs are not innovating *at all* on usability. They are all about little cosmetic changes. Mac OS X and Vista have left them behind the curve (and don't mention Beryl... what's the point of a spinning cube?! How does that increase usability? Or wobbly windows?!) Sometimes they innovate a little, but in the opposite direction, like Ion.
And frankly, when someone tries something new, nobody pays attention. Like OpenCroquet. Like some experimental Java desktops. You can't really expect anything else from developers hellbent on C programming... What can you expect from GNOME? All I expect from a C project of that size is that it's going to fall further and further behind the curve... We can't even expect anything from the likes of Novell: their Mono is not really being developed as a multiplatform tool, is it? (So, no FOSS desktop like GNOME or KDE.)
The real shame is having companies that are basically full of non-creative individuals injecting money into FOSS.
By the way, "Linux" is not the only Unix-like OS that uses GNOME and KDE.
Re:Don't think so (Score:4, Insightful)
Okay, I'll be as clear as I can be here: Linux will never take over the desktop. Ever. Ever. Why? Because it's a pain in the arse.
Never, in all my years of working on the Mac and Windows, have I been required to type something like "sudo vim /etc/X11/xorg.conf" and then try to tell my computer to display something over 640x480 resolution--and even then not have it work, even after following 3 different, progressively complex methods of getting an nVidia driver to work.
Every year or so, I try to set up a Linux machine with whatever the new darling distro is. Only once have I gotten one to work acceptably, but there were still issues I wasn't happy with. And that took about a week of reading poorly written manpages. Just the other day I gave Ubuntu 7.04 a shot. I gave up after 2 hours of fiddling to get working video.
That is after having to futz with my CMOS to boot it--a step most people wouldn't know to do.
Linux people are, and I'm going to be brutally honest here, morons. Not computer morons, obviously, because they have the skills and general knowledge required to get Linux to at least boot and display video properly, but morons because they lack even a basic understanding of what other people want from computers. Linux people are, and this will be news to precisely no one, geeks. As such, their opinions on computers are absolutely irrelevant to anyone other than fellow geeks.
People do not want to fuss. They want to buy a computer, turn it on, and start putting in software they bought at Wal-Mart without ever even thinking about what is going on below the UI. Hell, as far as most of them know, there ISN'T anything below the GUI. That's what it has taken to get the computer into every home in every developed country in the world: compatibility and ease-of-use.
Linux offers neither of these things.
Ultimately, the FOSS model is fundamentally flawed. People write things they find fun or that they really need--motivations we in the education business refer to as intrinsic, which is the best kind of motivation there is. The problem is that no one finds things like video drivers fun. There's no huge drive to make sure all the features of the video card are supported, because you won't need them anyway. So, without some kind of extrinsic motivation, like profit, certain jobs just never get done--or at best, get done half-assedly.
This problem is exacerbated by the fact that the people doing the developing are uber-geeks (we know this for certain because they are evidently coding for fun), and therefore don't sweat having to tweak a text file here and there. They pat themselves on the back for getting it to run at all (as they should--it's quite the accomplishment, and something to be marveled at!) and get so excited that they mistake this small success to be proof that everybody can and should be running Linux just like them. But they shouldn't, because (polishing off my old Slashdot chestnut)...
Linux is a toy.
It is a hobby OS. People have gotten this claptrap toy to do some pretty great things, and it's a no-brainer for any kind of application where the computer isn't expected to do anything very exciting (games, iTunes, iMovie/Windows Movie Maker, hook up any random scanner you buy--Only geeks are "excited" by hosting webpages and/or directing network traffic) or where you need a really small footprint (embedded). But that does not a desktop OS make. Not for the unquantifiably vast majority of computer users, anyway.
Look, everyone hates Microsoft. Apple has their own hassles to deal with. But both are so astonishingly better at serving the customer's needs and desires than the Linux distros will ever be that the fact that some people even need that pointed out to them simply demonstrates, clearly and unequivocally, that those people are, as I have already stated above, morons.
I'm sorry, but it's true.
Re:Don't think so (Score:4, Insightful)
Thankfully fixed now, due to Con figuring out how to satisfy both efficiency and latency objectives with a single scheduler, and Ingo rudely but efficiently pushing his own interpretation of Con's work into mainline. Moral of the story: sometimes the process is bumpy and feelings get hurt, but the code doesn't care, it just keeps getting better.
Re:Solution (Score:5, Insightful)
That's part of the reason why Linux will never really hit it big on the desktop.
Re:Solution (Score:4, Insightful)
A tweak to the start module in the kernel should be able to set the nice level of any program when it starts - giving latency-sensitive software more priority & dropping those that are insensitive. Skype cares about latency - terminal doesn't care all that much. The whole point of the comment was that a process already exists to deal with the majority of the latency issues described, and either a daemon or a tweak to the start module should be able to use that process & adjust usage based on the program without user intervention.
The administrator creates a file /etc/nicety that ranks programs as needed:
[programs]
/usr/local/bin/Skype 15
/usr/local/bin/gterm -4
/usr/local/bin/totem 15
...

[user]
bob 45
alice 18
...
With the user section, you could even prevent a slob from killing the system in a multi user environment by limiting his total niceness. Anything over nice(max) results in everything being trimmed back proportionally. If you boot into single user mode, that section is ignored.
ulimit does provide some of these constraints, but it works on the whole userspace for its memory & process quotas & per process for the nice limit.
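The /etc/nicety scheme above could be prototyped in userspace without touching the kernel at all. Here is a minimal sketch of the renicing daemon's core loop, assuming the hypothetical config format from the parent comment; note that real nice values only range from -20 to 19, so the larger numbers in the example file would have to be clamped:

```python
#!/usr/bin/env python3
"""Sketch of a userspace 'nicety' renicer: read a per-program table from
/etc/nicety (a hypothetical file invented in the parent comment) and apply
the configured nice value to any matching running process."""

import os

CONFIG = "/etc/nicety"

def load_table(path=CONFIG):
    """Parse the [programs] section into {executable_path: nice_value}."""
    table, section = {}, None
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line == "...":
                continue
            if line.startswith("["):
                section = line.strip("[]")
                continue
            if section == "programs":
                prog, value = line.rsplit(None, 1)
                # Clamp to the range the kernel actually accepts (-20..19).
                table[prog] = max(-20, min(19, int(value)))
    return table

def renice_running(table):
    """Scan /proc and apply the configured nice value to matching processes."""
    for entry in os.listdir("/proc"):
        if not entry.isdigit():
            continue
        pid = int(entry)
        try:
            exe = os.readlink(f"/proc/{pid}/exe")
        except OSError:
            continue  # kernel thread, or a process we can't inspect
        if exe in table:
            try:
                os.setpriority(os.PRIO_PROCESS, pid, table[exe])
            except PermissionError:
                pass  # lowering nice below current needs root

if __name__ == "__main__" and os.path.exists(CONFIG):
    renice_running(load_table())
```

A real daemon would rerun this on a timer (or watch process creation events) rather than once, but the point stands: the mechanism already exists, and only the policy file is missing.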
Re:Solution (Score:5, Insightful)
And how exactly does the average home Windows user set the parameters on the software they install - oh yes, it's all done automagically. Why, I do believe that installation software for Linux currently adds things to the SELinux contexts when it installs. But I suppose that would be impossible to do to a simple file when you're creating a module to do it.
Here how about rather than toss an idea off my head, I spell it out in a step by step process:
or
A developer [not the desktop user] modifies the module in the kernel that starts processes to set the nice level of the process according to a set record already defined in a file [in
There, did that include enough detail to dispel your straw men? The core protocols exist to minimize the problem - nice. A patch to set the nice level on starting shouldn't be all that hard to do, and a daemon to reset them after starting even easier. Your argument that a home user needs an 'admin' who understands daemons & patches etc. to do this is only valid if you also feel the average Windows box needs an 'admin' to install AIM.
As several people pointed out, Con's dissatisfaction with the kernel dev team is that they wouldn't change the way nice works to better implement its stated purpose - splitting CPU time based on the interactiveness of a program. However, even in its current state, a system to automate nicing the processes would resolve most of the issues people are seeing with desktop responsiveness. In combination with any of the new schedulers, it should make just about anyone happy.
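The other half of the proposal, setting the nice level at launch rather than renicing afterwards, also needs no kernel change. A minimal sketch using only the standard library (the program path and value are placeholders, and note that os.nice() is relative, adding to the child's inherited niceness):

```python
"""Launch a program with a nice value already applied, Unix only."""

import os
import subprocess

def launch_niced(cmd, niceness):
    # preexec_fn runs in the child between fork and exec, so os.nice()
    # adjusts only the new process, never the parent.
    return subprocess.Popen(cmd, preexec_fn=lambda: os.nice(niceness))

# e.g. launch_niced(["/usr/local/bin/skype"], 15)  # hypothetical path
```

A launcher wrapper or desktop-file hook calling something like this would give every program its configured priority from its first timeslice.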
Re:Solution (Score:4, Insightful)
Re:Solution (Score:5, Insightful)
He's saying that someone afraid of their computer can't do it. And until Linux can be used by people afraid of their computer, it won't appeal to the majority of the desktop PC market.
flamebait (Score:4, Insightful)
Right, and flamebait is insulting people just because you're an asshole. And I don't mean "you" in the general sort of sense.
No Unix or Unix-like distro has been and they've been around a heck of a lot longer than windows.
I know several people on Ubuntu who struggle with Windows. Plus, saying Unix has been around a long time and it's going nowhere is like saying, in 1994, that the internet has been around a long time, and it's going nowhere. The initiative towards a widespread Linux desktop has been around a few years, max. OSS movements take a while to rev up, and this one's doing quite nicely.
Be quiet, be schooled, thank the nice man as he leaves.
Re:Solution (Score:4, Insightful)
Unix and Unix-like systems have been around for a very long time (a lot longer than windows) and have yet to hit big on the desktop.
Really? Like OSX?
Granted, it doesn't have as much market on the desktop, but it's still second. And it is second exactly because of the reasons the GP pointed out.
Why this solution won't work: (Score:5, Insightful)
>some given level - better yet tweak the module that starts programs to nice them as they start. Works
>better than blocking the background tasks by bumping everything that's happening under a users uid, while
>still providing the lower latency issue.
Here is what they average computer user will think of your solution:
1) What's a daemon?
2) What does "renices" mean?
3) What are priorities?
4) What is a pre-set program?
5) What is a module?
6) What does it mean to block a task?
7) What is a background task?
8) What is a UID?
9) What is latency?
Re:Why this solution won't work: (Score:5, Interesting)
Re:Why this solution won't work: (Score:4, Insightful)
A couple of years ago, Thomas Hesse, president of Sony BMG, managed to say "Most people, I think, don't even know what a rootkit is, so why should they care about it?" and this was rightfully frowned upon as an invalid argument. I fail to see how yours is different; am I missing something?
Re:Don't think so (Score:5, Insightful)
Mr. Kolivas, in the article, hit the nail on the head. Take Linux windows. Drag one around. It chops into various little segments and such as it moves. Drag an icon. Select stuff. Reposition a toolbar (or buttons on it). There are these fraction-of-a-second delays. It's almost like walking on stilts. You're on the floor, and you feel when your feet hit the floor, but there is a feeling of some layer in between where you're not REALLY touching the floor. The same applies to Linux and its GUI (or at least the most common collection of tools that we call its GUI).
Now personally, I'm not so sure the problem is in the kernel; I've always been more apt to blame XFree86 (and now X.org) instead, but the fact remains that it just doesn't feel right.
Mac OS X, on the other hand, has a MUCH better flow to it. BeOS's approach to such things was practically perfect (my 450MHz K6-II with 128MB of RAM running BeOS feels faster than an Athlon XP 2100 with 1GB of RAM running Gentoo Linux). Even Windows, despite its many other problems, feels more responsive on the desktop than Linux.
What the problem is for sure, I don't know, but I'd certainly like to see it fixed. Windows is, well, Windows (boring and evil). Macs work too well for their own good for a tinkerer (they work, work well, and not as many people feel like fiddling with them, so the development community is much smaller). BeOS is dead. Other operating systems like SkyOS and Syllable are just too obscure. Linux is where it's at for programming enthusiasts. It would sure be nice to be able to use it for more than that, though.
Re:Don't think so (Score:5, Informative)
Two words: Direct Rendering
The issues you're describing have almost nothing to do with Linux and everything to do with your graphics card driver (or lack thereof). If you've ever run Windows XP on a system without your graphics card driver, you will have experienced the same thing. In fact, in my experience it's quite a bit worse.
There certainly are some things that could be optimized in Linux, but I think those are relatively insignificant.
choppiness (Score:4, Informative)
While the difference isn't nearly what it used to be, FreeBSD has always had far less of that on the same hardware and the same version of X.
Back on a 486 (and even my K6, IIRC), Linux could freeze for seconds under loads under 4, while on FreeBSD at least the mouse kept working at loads of 20 and up.
The last time I compared on the same hardware (a couple of years ago), Linux was merely "annoying" under load, rather than the older "unusable".
hawk
Re:Don't think so (Score:5, Informative)
I want to agree with you, I really do. But my SuSE 10.1 desktop regularly has fits where it becomes completely unusable - if I can manage to get a shell, I find that the load has spiked to 5-10 (on a single-core system) when the system was doing *nothing*. Just this morning, I woke up, poured a bowl of cereal, walked over to it to read some Slashdot over my Cheerios, and found the system thrashing and refusing to come out of screensaver because the load was so high. This happened while I was sleeping. I had to ssh in from my Powerbook to kill off any processes that appeared to be using CPU before the system would respond to the mouse.
Meanwhile at work, we just tossed an Ubuntu server that should have been reasonably swift, but was regularly DOSing itself by spiking to loads of 40 or more several times a day under normal use. A load of 40-60, on a single-core machine! We "fixed" it by spending thousands of dollars replacing it with a pair of multicore beasts with scads of memory and fast disks, which seems to overpower the problem.
Then there's that server belonging to a client, a RHES 4 system. When I ssh in through a tunnel to update it, it insists on running the update program as an X client for crissakes. Then it tells me to register the system at a URL, but the URL cannot be selected or copied to the clipboard. This is "enterprise" quality software?
Back at work, the dev server is still a RedHat 7.3 clunker. It has a half dozen developers fine-tuning their infinite loops, fork bombs, broken joins, buffer overruns, and spaghetti code, all day long. It simply never crashes or hangs, never gets slow, and never complains about the abuse it receives. It's a rock-solid dream. Except that it's a damn nuisance to update, since it's so old. And it's only hobbyist-quality software, after all, built before RedHat went all enterprise-centric.
Posted, with regrets, from my Powerbook. I'm starting to think that software built for the home user is a safer bet than the "enterprise" shite I'm dealing with every day.
Re:Don't think so (Score:4, Insightful)
Tell me, do you compile your shit natively or do you install binaries (such as using the RHEL bootable CD, which in general installs binaries)? Because I've never had any problems if I've taken a day to uninstall everything, download the newest SOURCE and recompile natively on my box with my library versions and my compiler, optimized for my memory controller and my CPU. After recompiling my Kernel image with same and rebooting. If you expect Open Source, in most cases amateur, developers to make their software automatically detect and work with older library versions, compile portable enough binaries to run on your hacked together system, you are sorely mistaken. Do it right, trust me. Binaries ARE NOT PORTABLE. They sort of work, sometimes. C source is PORTABLE. USE THE SOURCE.
I think you've just perfectly summarized why Linux is not popular as a desktop platform.
But Linux is so famous for low hardware needs (Score:4, Insightful)
Re:Don't think so (Score:4, Insightful)
Yes, Apple has succeeded in taking market share from MS on the desktop while Linux has failed, but you are leaving out an extremely important piece: MS was working very hard to make sure that Apple displaced Linux. Tons of evidence has surfaced in e-mails and in interviews with ex-MS people that MS saw Linux as a real threat, whereas they saw Apple as safe and even as useful (in relation to the Justice Department).
Now, MS never intended that Apple should take over digital media, marginalize WMA, etc. They miscalculated on that, but on the desktop, Apple has managed to make MS look much better. Not only can they claim that there is competition on the desktop, but now it is harder to blame them for charging too much or for promoting lock-in.
So, I think the statement that you are repudiating could be modified to "how Microsoft has succeeded in crushing Linux in personal computers," because it is Linux and open source that they wanted to crush, not innovation.
Re:Don't think so (Score:5, Interesting)
There's a difference.
Re:Don't think so (Score:5, Funny)
I say Solanum lycopersicum. Which fully demonstrates my superiority, I think.
Re: (Score:3, Insightful)
Re:Don't think so (Score:5, Informative)
But, I suppose, "why linux has failed on the desktop" sounds catchier than "a well known kernel hacker muses on the relationship between software and hardware in PC innovation and discusses the problems he sees in the way the mainline kernel developers address desktop user needs."
Re:Don't think so (Score:5, Funny)
No way can you consider 100 million of anything a failure.
How about: 100 million dead?
Re:Don't think so (Score:5, Insightful)
Re:Don't think so (Score:4, Insightful)
Things I would like to see in Linux:
Standardized single sign-on/authentication solution. Yes, I know there's Kerberos, but someone needs to build an easy-to-use API over Kerberos which allows you to make a simple call like "bool isTrusted()" to handle security throughout the app. ONE SIGN ON. ONE KEY staying with the user session, whether they open a shell, click on an app in KDE or GNOME, or SSH or NFS to another machine or disk. One sign on. Please. This is one thing that Windows does so simply and elegantly. And yes, I know they crippled Kerberos and stuff. But it works. It really does. One of the most impressive things about Windows to me, with no real Linux analogue. To get the same thing in Linux, you have to know what you're doing. In Windows, you check the "Trusted for Delegation" box and make sure the computer has an account in LDAP.
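The "bool isTrusted()" wrapper wished for above can at least be faked today. A minimal sketch: MIT Kerberos's klist -s exits 0 when the credential cache holds valid, non-expired tickets, so a one-call check is just a thin wrapper over that. The function name and the idea of wrapping it this way are the commenter's wish, not an existing API:

```python
"""One-call Kerberos session check, wrapping MIT Kerberos's `klist -s`
(which exits 0 iff the credential cache holds valid tickets)."""

import subprocess

def is_trusted() -> bool:
    """True if the current session holds a valid Kerberos credential."""
    try:
        result = subprocess.run(
            ["klist", "-s"],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        return result.returncode == 0
    except FileNotFoundError:
        return False  # no Kerberos client tools installed at all
```

Of course, the hard part the commenter is really asking for - every shell, desktop app, SSH, and NFS mount transparently sharing that one credential - is system integration work, not an API call.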
That's about it. I have about 4 Linux boxes, 1 Mac OS X box, and several Win2k3 servers. I enjoy working with Linux the most because I have a lot of control. But when it comes to getting something "good enough" set up from scratch to live, Windows beats Linux hands down. Thus, CIOs and CEOs buy it. If it were possible to have a nice standardized teaching method to teach nice standardized Linux installs, and get enough people through there to make a difference, it would be possible to stage a serious invasion of MS shops. The reason is that they have "good enough" already, but they are starting to get new ideas that the Microsoft stuff is not capable of doing quickly, and MS themselves have become too big and bloated as a company to get anything done in a timely fashion, whereas a small consortium is much more nimble.

The problem is there's NO LEADERSHIP. It's a bunch of nerds leading each other around, arguing about the correct text editor to use and/or what window manager is best. When there emerges a clear leader - not a technology leader, but someone with the vision of truthful computing who can get us all thinking the right way - then we can make a push. This leader will not be in it for the money, although he/she may already have a lot of it. This person will DEFINITELY not be from the academic sector, CS or otherwise. Perhaps a politician, but more likely a businessman. Above all, a great leader with the vision to provide something better than good enough, and the army to build it.
Correction: Why Linux has failed on YOUR desktop (Score:4, Insightful)
Re:Correction: Why Linux has failed on YOUR deskto (Score:5, Informative)
Re:Correction: Why Linux has failed on YOUR deskto (Score:5, Funny)
Which reminds me, I really ought to return them.
Re:Correction: Why Linux has failed on YOUR deskto (Score:5, Insightful)
The author was speaking about how poorly Linux performs ON the desktop. Things like audio skipping and the desktop feeling slow. He was talking about how the kernel was so slanted toward big iron and the server market that it has ignored desktop performance. He was also talking about how hard it is to create benchmarks that show interactive responsiveness.
He also talked about how hard it is for "normal" users to communicate problems to Kernel developers.
What he is talking about is how Linux has failed to perform as well as a desktop as it does as a server.
What most people have failed to notice or care about is that this is a person who actually tried to fix problems by writing code! He was truly working under the FOSS ideal, and he has given up.
Too bad so many people are dismissing what he has to say.
Re: (Score:3, Insightful)
Re:Correction: Why Linux has failed on YOUR deskto (Score:5, Insightful)
He wasn't saying that Linux was a worse desktop than Windows. In fact, he said that Windows had many of the same problems! He wants Linux to be the best that it can be. Not just as good as Windows, or just better than Windows, but the best that it could be.
Re:Correction: Why Linux has failed on YOUR deskto (Score:4, Informative)
I use Linux as a PVR and it's more than up to the task. It can maintain adequate performance and responsiveness even when doing heavy number crunching. My MythTV boxes are quite often running at 100% cpu and a load average of 5 or 10.
Forget "audio skipping".
Let's try realtime video capture + realtime video decoding + 3 video transcoding jobs all going at the same time.
I can even still use my mythbackend as a desktop with very respectable responsiveness while all of this is going on.
"most people" are at a loss to see what his problem is.
Re:PVR != Desktop (Score:5, Informative)
And I would avoid correcting people when you don't know what you're talking about.
MythTV is not an embedded application, it's a software application that runs on a general purpose PC. I, like the GP, have a desktop computer that runs MythTV. It can record two channels at once while flagging commercials or transcoding a third TV show while I use it as a desktop or watch a fourth TV show. The audio doesn't skip nor does the desktop feel slow (as the GGP suggested) until I'm functioning at 100% CPU, which is fairly rare.
MS does this, why not copy them? (Score:5, Informative)
Of course not; Microsoft does it for the customer so they don't need to learn how to do it themselves. Would it be so hard for a Linux distro to do so as well when it is doing a "workstation" rather than a "server" install? Some distros already ask for this info regarding intended use.
I think you are exemplifying the "by nerds for nerds" attitude that the author of the article would probably argue is holding back Linux adoption.
Re: (Score:3, Informative)
Virtual Desktops
Bash (not sure what shells OS X comes with)
Beagle (not sure how Spotlight compares)
Apt
Beryl (ok, not really a need, but a definite want)
Evolution
Re:Correction: Why Linux has failed on YOUR deskto (Score:5, Interesting)
As far as software goes, Ubuntu allows me to easily install whatever I want with just a few clicks. Windows requires me to search the web for software, then (if I'm lucky) download a free or shareware version of the software, or purchase the software. I live in a pretty remote area, and there are no software stores around (except for a WalMart and Staples that are over an hour away), so it takes me at least a few hours to get the software, or up to a week if I need to buy it online. With Ubuntu, I have it within a few minutes. Also, Ubuntu keeps all of the software on my system up to date on its own, something that Windows has no way of doing.
Don't get me wrong, I'm not a rabid Linux fan boy. I make my living as a Windows developer, so I spend the vast majority of my time on a Windows XP box. My personal computers all run Ubuntu though, as it's shown me that it is far easier to use and maintain.
Re:Correction: Why Linux has failed on YOUR deskto (Score:5, Insightful)
On my desktop, Windows recognized the monitor and was able to use the full resolution with its generic drivers. Performance was terrible, but once I installed the specific drivers, it worked fine.
With Ubuntu, it simply reported "sync out of range" and there was nothing that could be done. Safe mode generated the same error, and with no UI to interact with, that's the end of it.
Likewise, when I tried Ubuntu on a laptop, it recognized the wireless card and then refused to use it. (It just doesn't work - trying to set the WEP key does nothing, it just says "activating device" and then returns to not working.)
Windows on both machines just work. Granted drivers had to be installed, but once they were installed, it just worked. No additional effort. No "sync out of range".
Now this experience obviously isn't typical either, but it demonstrates the main problem with Ubuntu: when it fails, there's no way to get help. Your options are basically to whine on forums, and then get completely useless advice like editing configuration files on a read-only CD with an OS that doesn't display a UI.
With Windows, there's a support number you can call, or you can take it to a local computer store, or ask for help among the massive number of Windows users - in short, you're not stuck with snobs on forums who think you should be able to hand-edit configuration files without being able to see anything on the screen.
Re:Correction: Why Linux has failed on YOUR deskto (Score:5, Interesting)
I have 4 computer shops within 45 minutes of me that build Linux boxes. All of them are quite capable of restoring one that didn't install properly. Also, that support number you can call for Windows is usually a waste of time and money. Every time I've called it's been a 20 - 45 minute wait followed by:
I think once they actually gave me a MS Knowledgebase number to resolve my problem.
As for asking for help among the massive number of Windows users - I almost pissed myself when I read that. I am almost certain that the number of people who can & will tell you how to hand-configure your /etc/fstab to register a HD that the system didn't recognize on install is greater than the number of people who can tell you how to go into the registry & reset it to do the same.
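For the record, "registering" that drive really is one line in /etc/fstab (the device name and mount point below are made-up examples):

```
# /etc/fstab: device  mount-point  type  options  dump  fsck-order
/dev/sdb1   /mnt/data   ext3   defaults   0   2
```

Once the line is saved, `mount /mnt/data` (as root) picks it up, and the filesystem is mounted automatically at every boot after that.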
As for snobs on the forums, the few times I've gone to ask questions, I have seen people asking for additional information - often with very specific requests & exactly how to get that information - only to be rounded on by the original poster claiming nobody is willing to help them. If expecting you to be able to follow directions to provide the detailed information needed to solve your problem is snobbery, then I guess there are a lot of snobs on the boards.
Unfortunately I guess there just aren't as many people gellering on the Linux boards as there are on the Windows boards. Oh wait, on the Windows boards they tell you to check the MS knowledgebase & if the solution's not there - reinstall.
Re: (Score:3, Interesting)
One of my hobbies is making interesting software environments which boot from removable media or the network. While some Windows tools exist which can facilitate this, some powerful nix-only concepts like mount -o loop just don't have Windows analogs.
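The `mount -o loop` trick the parent mentions looks roughly like this (paths and sizes are made-up examples, and the mount step itself needs root):

```shell
# Build a filesystem inside an ordinary file, then mount it as if it were
# a block device -- that is what the loop option does.
dd if=/dev/zero of=disk.img bs=1M count=16 2>/dev/null   # 16 MB empty image
mkfs.ext2 -F -q disk.img      # a filesystem is happy to live in a plain file
mount -o loop disk.img /mnt   # (as root) /mnt is now backed by that file
```

The same idea lets you open up a bootable system image, edit its contents, and repack it without ever touching real media, which is exactly what building removable-media environments calls for.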
My fav
Re: (Score:3, Informative)
It works. I had more trouble getting my current printer to work with XP than I did in Ubuntu.
I prefer the Gnome interface. I have a few panels with different purposes, and each one has a hide button (but no arrows). I keep them collapsed on the left side of my screen. It's become instinctual for me to click in certain places for shortcuts, the menu, virtual desktops, etc.
T
Re: (Score:3, Informative)
Re:Correction: Why Linux has failed on YOUR deskto (Score:4, Interesting)
Many times a day I find myself wanting to look at one window while typing into another. Either I'm working on some data analysis and want to plot things, or I'm writing and need to look back closely at something in an online paper, or I'm using a CAD program and feeding it numbers from an email or scratch paper, or I'm thumbing through photographs and want to jot down notes in a scratch terminal at the same time.
Sure, if both objects happen to be text one can do the same in screen, emacs, or your multiplexor of choice (and I do, when appropriate.) And, if you're going to be doing it a lot with the same objects you can resize your windows and tile things. But, in practice, it's always a one-off minute long task involving random graphics for which resizing windows would be a pain.
When it comes down to it, UI configurability is among the biggest drivers in my OS choice. If you ask me why I like linux, I'll give you a long, meandering, philosophically charged answer that won't convince anyone. If you ask me why I throw a fit whenever I'm forced to use a non unix-like system, the answer is a lot more pedestrian: X can be easily configured to fit my needs, and every task can be accomplished from within a well designed shell.
What do I personally need in a UI?
- multiple virtual desktops
- focus follows mouse
- no raise on focus
- per-user key remapping
- fully functional, fast keyboard control over window placement/size
There are plenty of other little window manager tweaks that I like a lot, but that's the minimum I need in order to not hate integrating with a desktop. In windows, some of it kinda sorta works if you install lots of random third party software. (Although I've yet to find a no-raise-on-focus or a per-user key remapping option. Would love to hear about one if it exists.)
In X, it takes a minute of setup time and works on every machine, everywhere, and it doesn't screw up the UIs of all the other users.
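The per-user key remapping, for instance, is a few lines in an ~/.Xmodmap file (the Caps Lock-to-Control swap below is just an example mapping):

```
! ~/.Xmodmap -- loaded per user with: xmodmap ~/.Xmodmap
! Turn Caps Lock into an extra Control key:
remove Lock = Caps_Lock
keysym Caps_Lock = Control_L
add Control = Control_L
```

Because the file lives in the user's home directory, the remap follows that user from login to login and never touches anyone else's keyboard layout.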
Re: (Score:3, Insightful)
You can set up a Mac to dual boot in OS X and Linux and you're too dumb to figure out how to install Firefox on it?
It hasn't (Score:4, Informative)
Now it's all in the marketing and politics, but on the software side it's there.
Performance, not ease of use (Score:3, Informative)
The other side of the coin (Score:5, Insightful)
Perhaps that was the cause of her lack of problems. My guess is that she didn't need to do much with the computer anyway, apart from writing stuff for homework.
An OS is like a car. The more you know it, the more you tune it, and you optimize and squeeze every little bit of performance from it to fit your needs. You install new seats, a more comfy steering wheel, cup holders, etc etc.
The problem comes with the change. One day you're sitting in your perfectly-tuned American FR car with a V8 engine, and then you switch to a European car. And suppose it's a 4WD or FF with a rotary engine. For starters, the steering wheel is on what you knew was the passenger's side. You have to change gears with the left hand instead of the right hand. You have to look to the right instead of the left.
It's much worse when you realize that the knowledge and tools that helped you to tune your old car don't work with the new car (how the heck do you fix a rotary?). It's a completely different monster, and you have to RELEARN EVERYTHING FROM SCRATCH. Lots of knowledge lost.
For example, to quickly search for a file in Windows, I open a commandline, and type dir *mask*
To get help, you don't type "command
Back to the cars analogy. If you're just LEARNING to drive, "ah, it has a steering wheel and pedals." It's easy. Of course it's easy! Because you don't know ANYTHING.
The real problem with switching to Linux is having to UNLEARN every bit of knowledge you've gained about windows with the years. It's much more painful when you're a Windows power user.
Re:warning: ot (Score:5, Insightful)
The first can apply to any OS: the user has learned the specific, repetitive motions that get done for them what they need to have done on their specific system (usually this would be windows, but could be any OS). Their "skills" are easily translated to another system of the same type and maybe to a similar but not too distantly related system. An XP user could probably move down to win98 or up to say vista without too much difficulty, though they'd find certain things don't work the way they'd expect. An XFCE user could move fairly easily to gnome, not so much to KDE, but still okay. But if you ask this user to move to a completely different system (from any win to any linux, or from XFCE to say fluxbox or even worse to wmii (I love wmii, BTW)) then there's trouble. They don't actually have "skills", they have a set of rote responses that look like "skills" but aren't.
The second set have not necessarily learned any particular set of actions to get desired results. Instead they have learned about how computers work in general and have developed a set of "skills" for _determining_ which actions get desired results. The difference is subtle but important. Someone in this set can sit down at a computer they've never seen or even heard of before and figure out how to get something done in relatively short order. They will likely never be as productive on any one particular machine as the folks from the other set, but they will be proficient at _any_ machine in short order.
To determine which set you belong to, try something radically different from what you normally use for a good period of time and see what happens. If you give up within hours, then you're probably from the first set. If you find that you've forgotten exactly when you changed, then you're likely from the second set and are probably already looking for something even more different to try.
So, after "Preview" (see what a good boy I am!), I have to add that there is probably a third set: those who probably belong in the second set, but have never fully developed that set of skills choosing instead to dive deep into the particular system of their choosing and learning it intimately. These folks can do _anything_ on their machine, but have given up their potential to learn breadth in exchange for depth.
Re:It hasn't (Score:5, Funny)
That has to be the most brilliant use of vendor lock-in to date!
I'm going to go start a migration plan for my girlfriend to switch to Linux. Then when she gets tired of my geeky arse she'll have to think twice about dumping me!
Re: (Score:3, Interesting)
In your case, Ubuntu fails to properly handle a case where hardware is moved between boots. In my case, Windows fails to handle hotplug on an interface specifically designed for hotplug. Nyahh, nyaah.
The plural of "anecdote"
Applications (Score:4, Insightful)
And it's not a problem of performance; It's a question of politics. We have to convince enough software vendors to start coding in a cross-platform language/way.
Re:Applications (Score:4, Insightful)
There are lots of applications out there, but perhaps too many. So many applications have everything but that one critical feature that an organization has come to rely on. "But it's open source and you can implement it yourself!" True, but that costs real money too. Quite a bit actually.
And then there are the vertical applications that we can't move away from because we've got years and years worth of data stuck in the swamp. Yes, we could migrate but at what cost? Business doesn't care about operating systems or information philosophy, it cares about getting the job done to make money. It would take a considerable cost advantage to move an organization of medium size or larger from a Windows environment to a Linux environment.
I spent years on and off trying to figure out how to move my company to Linux both on the desktop and the server. It's just too much, even still. Our business is manufacturing Widgets, and we get along just fine in our Windows world. If we were starting over from scratch today with the 5 or so employees we had when I first started a decade ago, I would make different choices. I despise 3/4 of Microsoft server products, and I hate the cost of MS Office.
Re:Applications (Score:5, Insightful)
But ... I am currently using Windows Vista as I am typing this.
Why?
Because I got a new machine. Quad Core with 2x8800GTX cards powering 3 monitors and an X-Fi Sound Card. It looks and sounds great.
But when I tried to install Ubuntu 7.04 on it over the pre-installed Vista, I got a blank screen. Apparently the 10 month old 8800GTX drivers are not included on the Ubuntu install disk. Yes, there are some workarounds, using a text install, installing envy from a shell, and some other tips that may or may not work (results have been mixed), but it's a leap of faith. People that are running the 8800 cards in Ubuntu have been generally disappointed at their performance from the reading I have done on the Ubuntu forums, finding them slower than 7xxx series cards, and even slower than 6800s. What a waste of expensive graphics cards. And there does not even exist a driver to power my X-Fi soundcard. So I would not get sound. Sweet, a computer with no sound. All that music I downl ... I mean BOUGHT ... would never get heard.
And that may be a bit of a problem for the Linux Desktop. It is hard to start out with a Linux desktop if you have pseudo-cutting-edge hardware. Many people buying new machines have to wait some time for stable and easily installable drivers to appear for their hardware, and by the time they appear, they are already fully entrenched in Windows, have their file structure laid out, etc.
I am sure I will eventually have Linux installed on this machine, but it will be long after it is a high-end machine. I am not going to waste good hardware on drivers that don't work, or work sub-optimally.
But this is not the fault of Linux. The folks who release the drivers just don't care too much about Linux. That is the problem.
In two years, this will be a killer Linux workstation. Today, it would make a shitty Linux workstation.
So, Vista it is for the time being. I have already gotten a BSOD. The OS is nuttier than a squirrel's turd and is a general pain in the ass. But my applications run, everything installs the moment I plug it in (joystick, pocket PC, Bluetooth adapter, SD cards, etc) ... I can see what I am doing, the audio sounds great and I can get things done.
Would I rather run Linux? Yes. Vista thrashes the disk around like crazy the whole time the machine is on, and it can only see 2.5 gigs of the 4 gigs of RAM I have installed. I suppose I could shell out a few hundred for 64-bit Vista, but who knows what drivers will and won't work in that.
But at least I have audio and video on the OS I have now. It's an imperfect world.
The Achilles heel for Linux desktops has been and always will be fast and easy driver support, IMHO.
Linux works great on slightly older hardware, but by the time the hardware is slightly older, it is more difficult to get converts. People tend to dance with who brought them, and on most machines, that is Windows.
Re:Applications (Score:5, Insightful)
Re: (Score:3, Insightful)
Of course most companies just see quality (of any sort, not just software) as a needless cost.
Enterprises want enterprise crap. (Score:5, Interesting)
Desktop users are fickle
Re: (Score:3, Insightful)
Re: (Score:3, Interesting)
Too much choice and yet none at all (Score:3, Insightful)
Another problem is the MS dominance over the OS market. It's hard to buy a computer without Windows and even harder to purchase one with Linux preinstalled. Your average computer user is not going to purchase a computer that won't run (because it has no OS), and even if they did, when they go to the store to pick up an OS, all they see is Windows.
Linux users need to stick to a Distro that works, is easy, is well known, and comes as an option to be preinstalled on computers from the majority of manufacturers, even if it is along side Windows or as a bootable DVD thrown into the box.
Re:Too much choice and yet none at all (Score:5, Funny)
I can't name them, either. But I also can't name all the available versions of Windows. So what?
Re: (Score:3, Insightful)
I've run Linux for years and I still can't name all the available distros. I doubt ANYONE can.
I can't name them, either. But I also can't name all the available versions of Windows. So what?
Let's see how I do:
Available?
XP Home, Server, and Enterprise
Server 2k3
Vista Home, Professional, and Ultimate.
Let's see how I did by comparing to this [wikipedia.org] Wiki page. I missed:
Windows Vista Starter
Windows Vista Enterprise
and the Windows 2k3 server editions. There are six, which I lumped into one.
Of course, I skipped the embedded editions. Also, many of those are Enterprise editions of 2003 that you probably won't find people confusing for a desktop OS (well, they might until they see the price tag!). Also, I skip
Re: (Score:3, Interesting)
Plus, if you're not happy with a particular distro, you can try another one, for free, and with a minimum of effort. I've gone through 3 or 4 over the years before sticking with Kubuntu.
Re:Too much choice and yet none at all (Score:5, Insightful)
If someone has a job far away and for some strange reason needs to own a personal vehicle, which car do they choose? I've driven cars for years and I still can't name all the available models. I doubt ANYONE can.
Car drivers need to stick to a model that works, is easy, is well known, and comes as an option to be sold at every auto shop and used car fair, even if it is along side the models from the brand that particular auto shop represents.
I cannot imagine a line of reasoning more beaten up and less relevant than this one. While it is true that people prefer pre-packaged goods, too much choice was never a problem in other markets. There is a multitude of car brands, TV brands, beer brands, all of them differing in one way or another but each of them catering to its target audience. And we do not see people fighting to get this or that car (beer, TV set) brand to dominate the market because of some supposed technical superiority, better taste, features, etc.
That is because what is the best alternative for one may not be the best for another, because people's tastes differ and people's needs differ. The only difference between that and computer operating systems is that the collaborative culture brought by the microcomputer "revolution" makes people expect a level of interoperability and interchangeability between these differently branded machines that they don't expect in other markets, like cars, for instance.
And for the lack of interoperability we have nobody to blame but certain proprietary software companies (there are too many of them for me to enumerate by name, but you know the ones I'm talking about), which could agree on standards that would drive interoperability (imagine what the industry would be if they hadn't agreed on ASCII, for instance), but instead put their short-term gains over it and help push the whole industry back a couple of decades.
To summarize: too much choice happens everywhere, and it is a good thing, including in computing, as long as there is interoperability among the choices. Linux (the kernel) and most of its userland are open source and open spec, so the lack of interoperability can't be blamed on them.
Re: (Score:3, Insightful)
So? I can't name all the different kinds of laundry detergent (or even Tide laundry detergent), but it's not hard for me to find one that works well enough and use it.
Why would I need to be able to name all of the Linux distros in order to use on
Again??? (Score:4, Insightful)
My personal opinion, after having used Linux quite a bit, is simply that Linux isn't ready for the desktop. While many apps have easy to install packages, a lot of apps don't. Particularly smaller, single-developer shareware kind of apps. Many of these require getting source and compiling, something my mother or grandmother won't be able to do.
Speaking of my mother and grandmother, the other thing they already find confusing enough is the Windows directory layout. Linux is FAR more complicated in that department. They'd find organizing their documents much more difficult.
Finally, frankly, I don't find the UIs all that intuitive to use. I've used Gnome and KDE. I prefer KDE, but I have issues with both. It took me a while to figure out how to drag and drop gzip compressed files from KDE. I can't even remember how it works off the top of my head, I'd have to go do it again. But it definitely wasn't as intuitive as drag and drop from say WinZip to a folder in Windows.
The fact is, Linux just isn't ready for the desktop. Don't get me wrong, huge strides have been made over the past few years in usability and I suspect it'll get there eventually, but it's not there.
Another issue is the community, which in many places is hostile to newbies. I've been insulted on more Linux support forums for asking questions than I've ever been on Windows support forums. There are places to get good support for Linux, but there are a lot of really hostile ones too. Windows may have some hostile ones, but I just run into it far less frequently.
This is just my personal opinion, based on my experiences with it. Other people may have had different experiences. I still love Linux for certain things and I run a Linux box as a file server, firewall, database server and for video editing. I'd never trust connecting a Windows box directly to the internet, but I've always trusted Linux for that. But as a desktop environment, it just doesn't work for me.
Desktop Responsiveness (Score:5, Informative)
Re: (Score:3, Insightful)
That _might_ be one of the reasons he's pissed off, you know.
Re:Desktop Responsiveness (Score:5, Interesting)
Wrong problem (Score:5, Interesting)
But how good it is isn't really the issue. The fact is, Microsoft has an incredible lock-in, and it is going to take many years to chip away at that. But Firefox has demonstrated that it is possible to win market share from Microsoft. The two essential ingredients are persistence and time. If Microsoft continue to stumble - as they have with Vista - then Linux on the desktop will happen more quickly.
Failed? What counts as failed? (Score:3, Insightful)
If by "failed" you mean "failed to achieve X market share", I should think the answer is obvious: normal people don't give a flying fuck what kernel their operating system uses. And since their computers come with Windows preinstalled, they are not going to swap operating systems to get a better kernel -- or a better license. Even MacOS wouldn't be where it is, if it was developed and sold as a purely OS product, instead of being bundled with Apple computers.
On the server end, people are concerned about capacity, performance, and licensing restrictions, so it's a different ball game.
People have only two problems with the Linux kernel, and neither of them is due to the existence of enterprise features: (1) the USB doodad they just bought doesn't work automatically and (2) the specific application doesn't support any version of Linux. As to why this is so, it all comes back to the fact they don't care what kernel they have and they already have Windows, so people in the business of catering to them don't bother to do anything to fix these problems. If they did, user apathy means it wouldn't make a big difference in Linux desktop adoption.
In the end, this is a situation that only Microsoft can change, and that by screwing up. Maybe they have with Vista, but I think not. Vista will be like the old 640K DOS memory limit. Industry (other than MS) will move heaven and earth to accommodate it, should it become the status quo, which given user indifference will probably happen.
Typing on a Linux desktop (Score:5, Interesting)
I'm typing this on a Linux desktop. It's a pretty hefty system (dual-core, 2.8 GHz, 4 GB RAM), but it earns its living, I assure you. It's Slackware, with a custom kernel. As I've mentioned before, my view is that the distro kernel is solely there for bootstrapping the system until you can build a custom kernel to match your hardware and your needs. It's open source. We can do that, you know.
My biggest frustration with Linux is the notion that Linux systems must emulate Windows to be acceptable (e.g. Mono), and that the Unix interface is a priori incomprehensible, for no other reason than that it doesn't look and feel like Windows. I like the concept of lightweight desktop-oriented distros like Puppy, but do not like they way they so desperately emulate Windows. Right down to the icons.
Is that all there is? We have an open-source OS here, with open source applications. If we don't like how they work, we can roll our own. Mindlessly aping whatever Microsoft are dumping in to Vista this week is dumb.
What next, DRM?
...laura
Coral Cache (Score:3, Informative)
Exchange, bitches! (Score:5, Insightful)
Anyone familiar with how Microsoft locks in customers will tell you the same thing.
We have reached a point where neither the desktop OS nor the server OS matters as much as the apps they run. Exchange is the one app that is almost a must-have. Anyone can list all the non-proprietary stuff that runs 80% of Exchange's functionality, or 50% but does it better, and so on and so on.
Give it up, and start building something that takes Exchange on directly, feature for feature, with better recovery, and message pushing to handheld devices.
Or, maybe just shut up? This has been obvious for years. Microsoft keeps improving Exchange, enterprises keep buying it, and everything else that goes with it.
Linux cannot exist on its own with a bunch of 50-to-80% solutions, expecting to fill the gap by the temporary pleasure of giving Microsoft the finger from time to time.
Either compete or change the game. Only Google and Apple seem to get this.
And can we stop asking this question over and over again?
Re:Exchange, bitches! (Score:4, Insightful)
Linux is not only failing on the desktop.. (Score:5, Funny)
My advice? Install it now and help it be even worse at failing.
the desktop PC is crap .. (Score:4, Informative)
"Enter the dark era. The hardware driven computer developments failed due to poor marketing, development and a whole host of other problems. This is when the software became king, and instead of competing, all hardware was slowly being designed to yield to the software and operating system design"
"However, the desktop PC is crap. It's rubbish. The experience is so bloated and slowed down in all the things that matter to us. We all own computers today that were considered supercomputers 10 years ago
"I watched the development and to be honest... I was horrified. The names of all the kernel hackers I had come to respect and observe were all frantically working away on this new and improved kernel and pretty much everyone was working on all this enterprise crap that a desktop cares not about"
"Or click on a window and drag it across the screen and it would spit and stutter in starts and bursts. Or write one large file to disk and find that the mouse cursor would move and everything else on the desktop would be dead without refreshing for a minute"
--
Why Linux Has Failed on the Desktop
"Linux is burdened with 'enterprise crap' that makes it run poorly on desktop PCs", Zonk quoting SlinkySausage.
Quoting him out of context and making him say something he didn't say
Rubbish (Score:3, Insightful)
I can agree to some of the things said by this guy, but all in all, it's rubbish. Sure, response times are one thing and I think they've been addressed very well by preemption features and configurable scheduler frequency, but to blame a slow desktop experience on the kernel is just stupid. Really stupid. If you wonder where all your megahurzes go, try looking at your KDEs and Gnomes first, your animated gizmos, your 3d desktop gimmicks and applets and your java crap.
Slashdot got it wrong, but it's a real issue. (Score:5, Informative)
First, the Slashdot article is terrible. The article isn't about "why Linux is failing on the desktop", it's about why a kernel developer who was trying to improve scheduling performance quit.
The scheduling issue is interesting. I used to work on mainframe schedulers, I've done real-time work, and I'm familiar with the issue in game implementation, so I know how hard this is. We could do better than what we have now, but not by some magic fix to the scheduler. We have to look at interactivity as a real time problem.
It is, too. Alan Kay used to say that there is no more excuse for a delay between pressing a key on a computer and having something happen than there is on a piano. We haven't been faithful to that, and it subtly drives users nuts.
One useful idea from the real time world is explicit "sporadic scheduling". Some real time operating systems have this. A process can explicitly request that it wants, say, 10ms of CPU time every 100ms. The scheduler must reject that request if the system is overbooked. If it does accept the request, the scheduler has committed that much resource to the process. If the process overruns its time slot, it loses priority and an overrun is tallied.
This is what an audio or video player should be using. This is how you get audio and video that don't pause or skip. For this to work, the player must be able to calculate, for each system it runs on, exactly what resources are needed to play the current content. This may take more analysis and benchmarking than many programmers are used to doing. It's worthwhile to make overruns visible to tools outside the application, so that users can detect broken applications. To a real time programmer, overrunning your time slot means "broken". You have to think that way.
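The admission-control step can be sketched as a toy utilization check (a hypothetical sketch only; stock Linux exposes no such interface, though POSIX defines a SCHED_SPORADIC policy and some real-time operating systems implement it):

```shell
# Toy admission control for sporadic reservations. Each request asks for
# BUDGET ms of CPU every PERIOD ms; the "scheduler" accepts it only while
# total utilization stays at or below 100%. All numbers are made up.
total=0
admit() {  # admit BUDGET_MS PERIOD_MS
    u=$((100 * $1 / $2))                 # utilization in whole percent
    if [ $((total + u)) -le 100 ]; then
        total=$((total + u))
        echo "accepted: ${1}ms/${2}ms (load now ${total}%)"
    else
        echo "rejected: ${1}ms/${2}ms (would overbook the CPU)"
    fi
}
admit 10 100   # audio player  -> accepted (load 10%)
admit 40 100   # video decode  -> accepted (load 50%)
admit 60 100   # third request -> rejected, system would be overbooked
```

A real implementation would additionally demote and tally any process that overruns its accepted slot, as described above.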
On the interactivity front, it's useful for a thread to be able to request a high priority for a short period after an event, with a priority drop to follow quickly if it keeps the CPU too long. That's how you get the mouse cursor to track reliably. Of course, the thread that handles mouse events has to pass off all the real work to other threads, not stall the thread handling fast events.
It's also probably time to end paging to disk. When it works, paging at best doubles the effective RAM. But paging inherently results in long unexpected delays. If you want interactivity, don't page. Real-time systems don't. Neither do game consoles. RAM is so cheap that it's not worth it. (1GB starts at US$56 today at Crucial.) Paging devices have been maxed out around 10,000 RPM since the 1960s, and haven't improved much. Give it up. Today, paging is in practice mostly a means for dealing with memory-hogging apps. (Hint: open "about:config" in Firefox and turn off "browser.cache.memory.enable" so it doesn't save screen dumps of each page for faster tab switching.) It's probably time for Linux to not page interactive processes by default.
This implies an operating system that says "no" when you put too much load on it, instead of cramming it in and doing it badly. Open too many video windows, and at some point the player won't open another one. There's nothing wrong with that, but most Linux/Unix apps don't handle resource rejections from the operating system well.
That title was not chosen by me (Score:5, Interesting)
Re:That title was not chosen by me (Score:5, Informative)
Re:That title was not chosen by me (Score:4, Insightful)
Well, now you have a personal understanding of why a lot of people are turning from "mainstream" journalism to alternative sources. The journalistic process isn't exactly honest or honorable, is it?
I did think it odd that after arguing against fair scheduling for quite a while, Ingo et al. decided to implement it (and how rapidly it was dropped into the kernel). I've read a few articles about the sudden change of heart. I'm sorry things worked out that way; I can imagine how disappointing it was that you didn't get any credit for championing fair scheduling, nor any involvement in implementing the CFS.
On the other hand, I also recall reading a paper that was given at OLS 2006 that was more or less stating that "Userspace Sucks"; there's a lot of work to be done there.
Re: (Score:3, Insightful)
Re: (Score:3, Interesting)
Re:Linux Hasn't Failed on My Desktop (Score:5, Insightful)
Stop. Reread what you just posted. First you say it's easy to use. Then you say that you configured your grandmother's machine with four buttons she can use to access the things she uses most.
If it's so easy, why did you have to configure those buttons? Why couldn't your grandmother do it herself?
I'm not saying whatever version of Linux your grandmother is using isn't easy to use. What I am saying is that well-meaning folks like you who support Linux on the desktop always use an example such as the one you gave to show how easy Linux is to use, yet by your own admission you had to do the setup. You had to do the configuration.
This isn't to say that configuring Windows is necessarily easy or even intuitive. However, either through force of repetition or blind luck, the average person is able to configure a Windows environment more easily than a Linux environment.
I don't personally use Linux, though I have fiddled with Slack 10 and Debian, so maybe my perceptions are off. But the overall point is that those who support Linux and say how easy it is to use ALWAYS say they got a family member/SO/whomever to use it AFTER they configured it for them, and therefore it must be easy to use. That's looking at it from the wrong angle.
I wrote in a post a while back about documentation and how the biggest problem with it is that it isn't detailed enough for the average person. People, despite the innate intelligence we are supposedly born with, like to be handheld the first few times when doing something. Particularly if they have never done it before.
You and I may be able to program our VCRs and DVD players (well, not me yet; see my journal for why) without reading the manual, but that is only because we have been exposed to the general process for so long that we can draw on past experience to get us through the configuration. Joe Average can't (or won't, depending on how militant they are).
I don't know what the answer is, because installing an OS, even as streamlined as Microsoft, Apple and the various Linux distributions have made it, is still not easy. There are still questions that need to be answered to configure it that I'm certain your grandmother couldn't answer without your guidance.
Yes, once the OS is installed and configured things will just work, but as has been said a billionteen times before, people don't want to have to go through a long configuration process. They want to be able to put in a floppy/CD/glass block/whatever and, other than double-clicking on an icon, have the software installed and ready to go.
I realize this is somewhat of a rant but those of you who work with Linux on a daily basis think that using your distro is simple and easy. Which it is but ONLY because you've been working with it for X months/years/decades/eons and know it pretty much inside and out. Take someone off the street and have them do an install of the OS or a piece of software on Linux and I can guarantee you they will tell you to do things to yourself which are not possible (except if you're a master contortionist).
Easy is a relative term. What is easy for you or me is not easy for our parents or grandparents. Those who produce Linux distros need to understand this and have it plastered all over their work spaces, so that every time they do something they ask themselves, "Is this something that Joe Average can do?" not, "Well shoot, this is simple. All one has to do is rm -f *%!@, then grep for dlist -t to be sure it was disjoined, at which point they can do an apt get something and finally a make something. I can do that in my sleep!" (and yes, I know what I wrote makes no sense. That is exactly what the outside world hears when you folks talk about doing something)
Re: (Score:3, Interesting)
I see this whole thing as a huge ego trip by Ingo and Linus. If they were halfway decent people, they would be able to admit "Hey this new guy had a good idea, and lots of people are using it, and it works, lets bring him in". Instead Ingo was hung up on "My way is the b
Re:Does this guy know what he's talking about? (Score:5, Informative)
Yeah, actually his patches were pretty good. He taught himself C, grokked the kernel coding style, and was a presence on the kernel mailing list. He maintained the -ck set of patches for quite a while, and wrote a couple of new schedulers (staircase, staircase deadline, rotating staircase deadline) based around the concept of fairness.
After quite a bit of discussion, Ingo Molnar produced the CFS (completely fair scheduler), which just recently got merged. The bulk of the new scheduler was written in 62 hours, then fine-tuned over many weeks on the kernel mailing list. He gave credit to Con for proving the fair scheduler design concept, and for some of the tuning.
A number of people were disappointed by the perceived nepotism: it appeared that Ingo's scheduler got merged because he was in the "in" crowd. I expect this is part of what triggered Con's decision to leave. On the other hand, the two schedulers are very different, and one may really be technically better than the other; I haven't compared the two in detail.
Re:Linux on the Desktop? (Score:5, Funny)
Re:Oh ye, it's the performance, duh (Score:5, Insightful)
In short, you don't know what you're talking about. He's not talking about *throughput*, which Linux does very well at. He's talking about latency and interactive performance: a system where the desktop is snappy and responsive, where the CPU wastes cycles if need be just to make sure the mouse doesn't lag and that windows are redrawn in a prompt, synchronized way. A kernel optimized for desktop performance (have you ever *used* Quartz on OS X?) will sacrifice overall throughput and raw total performance for low-latency servicing of the things a user actually looks at on the screen. It's this perception of performance that matters on the desktop. If a user sees a fraction of a second of delay or stuttering in the UI, it is perceived as "slow!"
For example, my Fedora Core 6 box, running on an older AMD Athlon XP 2800+, is plenty fast at lots of things. I can compile large programs fairly quickly and do all kinds of things. But dragging a window across the screen is not only slow, it can also cause my audio to skip.
On the same processor (even under VMware!) Windows XP is smooth and the UI responsive. Of course, under the hood Windows doesn't fare so well: I can't compile with as much raw speed, and although the UI is responsive, the code behind it may not be executing quickly, causing me to wait for the computer. But the important part is that the windows draw smoothly and fast. Resizing or moving a window is silky smooth.
Even Vista, though it is ultimately slower than XP and Linux, has a UI that appears super fast and slick, much faster than any Linux desktop (remember, perception *is* reality). Just try to use it sometime.
Now his patches, combined with, say, Compiz, go a long way toward giving Linux the responsiveness that desktop users require. The apparent schizophrenia on the part of Linux developers in relation to the desktop has frustrated him and driven him away. This is a tragedy.
Re:Maybe the GP is one of those 20%? (Score:4, Informative)