Slashdot: News for Nerds


Linux Distributors Work Towards Desktop Standards

Zonk posted more than 8 years ago | from the lets-get-together-and-feel-alright dept.

WebHostingGuy wrote to mention an MSNBC article discussing a move by several Linux distributors to standardize on a set of components for desktop versions of the operating system. From the article: "The standard created by the Free Standards Group should make it easier for developers to write applications that will work on Linux versions from different distributors. Linux has a firm foothold as an operating system for servers -- it's popular for hosting Web sites, for instance -- but has only a few percent of the desktop market."

247 comments

Yea like they will ever agree with anything (4, Insightful)

jellomizer (103300) | more than 8 years ago | (#15179878)

After the talks there will be two major factions. One may win, but the second will say "screw you" and make its own design in spite of the talks. That is the problem with ego-driven software vs. profit-driven software. Both have their advantages and disadvantages: ego-driven software may be of better code quality, but its developers have a much harder time agreeing with other people, while profit-driven software tends to be more consistent but of somewhat lower quality.

Re:Yea like they will ever agree with anything (3, Insightful)

mugenjou (912908) | more than 8 years ago | (#15179893)

And Microsoft is a combination of both? They have the low quality of profit-driven software as well as the egoistic recreation/bastardization of standards, done just to be as incompatible as possible with the rest of the world.

Re:Yea like they will ever agree with anything (1)

Pneuma ROCKS (906002) | more than 8 years ago | (#15180025)

egoistic recreation/bastardization of standards just to be as incompatible as possible with the rest of the world

That's not egoistic. They break several standards to maintain a strong grip on the market, and also because it's sometimes very costly to live up to them to the letter, so it's cheaper just to take the easy way. This is all maximization of profit. I think the GP is quite right.

Re:Yea like they will ever agree with anything (1)

inode_buddha (576844) | more than 8 years ago | (#15180224)

I would argue that their ego is directly tied to their market share. Bummer.

Re:Yea like they will ever agree with anything (1)

Jugalator (259273) | more than 8 years ago | (#15180050)

Egoistic as a company, yes, but not egoistic per individual or group, which can be a problem in open-source organizations.

Re:Yea like they will ever agree with anything (1)

tsa (15680) | more than 8 years ago | (#15179976)

That is exactly what I thought when I read this. This sort of thing has been tried before, but so far it has never worked for Linux. Let's hope this time people will lower their testosterone levels a bit for the greater cause, because having three major desktop operating systems is way better than the two we have now!

uhm what? (0)

Anonymous Coward | more than 8 years ago | (#15180048)

Lots of profit-driven software companies are full of problems caused by egos; after all, what drives profit?

The only way to have "no ego" software is to have anonymous contributors.

Re:Yea like they will ever agree with anything (4, Insightful)

asuffield (111848) | more than 8 years ago | (#15180077)

You're half right. The bit you got wrong is that the profit motive does not inspire people to produce consistent software. Most commercial software is just inconsistent, with everything around it and sometimes even with itself. This happens because each piece of software has a different project leader, and nobody in management above them understands enough to impose a single vision on the whole system. Given a choice, an individual project team will usually attempt to differentiate their project from all the others, in the hope of getting more money and/or recognition.

So the conclusion is probably that different software created by different people is usually going to be different. That's probably a good thing and you should just get used to it. Nobody can invent a single way to do things that is right for every piece of software you might want to use in the future.

Re:Yea like they will ever agree with anything (0)

Anonymous Coward | more than 8 years ago | (#15180298)

I don't think the same applies to everything; just look at the profit-driven US mobile phone networks versus the "ego"-driven European networks.

It's the closest analogy I can think of.

Novell, DBase, Lotus, WP, DR.DOS, soon AOL/Intuit (1)

WindBourne (631190) | more than 8 years ago | (#15180344)

What did/do they have in common? All were, or will be, killed off by a company that is profit-motivated and uses illegal tactics. The ego stuff was done by these companies, and then one low-quality, profit-driven company has managed to kill all but two; those two are just a matter of time (Intuit is doing OK for the moment, but TurboTax is slowly being gutted, and Microsoft will soon target its markets with its own products; Intuit will begin a slow decline).


reasons why (4, Interesting)

fl!ptop (902193) | more than 8 years ago | (#15179883)

Interesting that MSNBC bills the move as "making the operating system compete better with Windows" instead of "making it easier for developers to write applications that work on different flavors."

I would think the former is a result of the latter, rather than the other way around.

I disagree (-1)

Anonymous Coward | more than 8 years ago | (#15179885)

period.

ps: congrats on first post

I don't know what they are on about (3, Informative)

MichaelSmith (789609) | more than 8 years ago | (#15179888)

I can run KDE applications under fvwm and GNOME, as long as the runtime libraries are there. I don't see why it is hard to have Qt and GTK libraries on each system.

The only remaining issue is cut and paste with rich content, but the article doesn't talk about that.

Re:I don't know what they are on about (4, Insightful)

jellomizer (103300) | more than 8 years ago | (#15179904)

It is an issue of consistency. If I am running GNOME, I know when I am running a KDE app because it looks and feels slightly off. The same goes when I am using a straight X11 app. Linux on the desktop is not about window managers. It is about giving developers tools to make their apps desktop-friendly, and the ability to make sure Linux apps look good no matter what WM you are using.

Re:I don't know what they are on about (2, Insightful)

moro_666 (414422) | more than 8 years ago | (#15179932)

I wouldn't start my KDE if it looked and behaved like GNOME. ;)

Now, about the `issue` itself: Red Hat is dragging along a bunch of people to push some kind of one-standard-for-all (cough-cough-bs-cough-cough-profit-cough). They want to unify some things (the article didn't really elaborate what...), and therefore make all the stuff more the same.

I don't know about you, but if I wanted everything to look the same, I could just as well choose OS X or winblows (nah, not really Windows, it's not...). I chose Linux many years ago because I wanted it to behave like I want it to and to look like I want it to. I don't want my desktop's look-and-feel to suit a Red Hat salesman.

I understand that this will help push Linux into the streets, blah blah, but is this really what we all want? Or is this the beginning of the end of Linux as we know it?

We have seen many items out on the `market` that were supposed to unify and standardize Linux (various package managers, etc.), and none of them have succeeded. For the broader good I hope it succeeds; for my own sake, I hope they fail terribly and give up.

Standards don't remove freedom (3, Insightful)

MarkByers (770551) | more than 8 years ago | (#15179988)

i understand that this will help to push linux into the streets blabla, but is this really what we all want ? or is this the beginning of the end of linux as we know it ?

No. There will always be distributions that do things their own way despite what any standards organisation says. You will always be free to use those distributions. No one can force standards onto Free Software (if you try, people can fork), but you can make the standards so good that distributions (and their users) want them. If people don't want them, they won't be successful.

Re:Standards don't remove freedom (1)

wysiwia (932559) | more than 8 years ago | (#15180091)

...but you can make the standards so good that distributions (and their users) want them.

That's exactly what I had in mind when I created wyoGuide (http://wyoguide.sf.net/ [sf.net]), albeit others have to decide for themselves whether I've come close enough. The fact that I get almost no complaints might be a hint, but I don't know for sure.

O. Wyss

Re:I don't know what they are on about (1)

Kangburra (911213) | more than 8 years ago | (#15180049)

i wanted it to behave like i want it to


I agree with this; but having chosen my way, I don't want some wizard putting it back, or an update changing a setting I explicitly chose.

Linux is not perfect but it's not the worst option.

Re:I don't know what they are on about (1)

Spit (23158) | more than 8 years ago | (#15180250)

Apple doesn't have this: some apps use the white look, some the brushed metal. Windows apps often look quite different. Why should X developers be any different? The app works, doesn't it?

Re:I don't know what they are on about (1)

Sweetshark (696449) | more than 8 years ago | (#15179905)

I don't see why it is hard to have QT and GTK libraries on each system.
Because:
- it's an ugly design
- it involves lots of code duplication
- it sucks on lean platforms (for example Maemo)
- it doubles your chances of being hit by a security flaw
- it produces a lot of unmaintained basic infrastructure code (like VFS) where the implementation is the spec
- standards are a Good Thing

Re:I don't know what they are on about (2, Interesting)

Homology (639438) | more than 8 years ago | (#15179961)

I don't see why it is hard to have QT and GTK libraries on each system.

Because: - its ugly design

It's not an issue of bad design, but of necessity. As long as different applications use different libraries, you end up installing those libraries.

- it involves lots of code duplication

There is no code duplication involved; however, there is some overlap of functionality.

- it sucks on lean platforms (for example Maemo)

If your computer is very limited, then you don't want to run either KDE or GNOME.

- it doubles your chances of being hit by a security flaw

Care to elaborate on that?

- it produces a lot of unmaintained basic infrastucture code (like VFS) where the implementation is the spec.

Just because some random Linux distro is offering both KDE and GNOME does not imply that KDE or GNOME stops maintaining their own code.

- standards are a Good Thing

Yeah, yeah, sure. Except when the "standards" suck.

Re:I don't know what they are on about (2, Insightful)

IamTheRealMike (537420) | more than 8 years ago | (#15179984)

The standards in question are things like "how do I install a menu item in a way that works across distributions" and "how do I distribute C++ apps in ways that don't randomly crash" and "what libraries can I expect a Linux system to have".

The whole "as long as the runtime libraries are there" catch is what it's all about. It's not reasonable to expect people to deal with dependencies.
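The "how do I install a menu item in a way that works across distributions" question is what the freedesktop.org Desktop Entry specification addresses. A minimal sketch (the application name and path here are made up for illustration):

```shell
# Write a minimal freedesktop.org .desktop entry (app name and Exec path
# are hypothetical) and check it carries the keys a menu needs.
desktop_file=myapp.desktop
cat > "$desktop_file" <<'EOF'
[Desktop Entry]
Type=Application
Name=MyApp
Exec=/usr/bin/myapp
Categories=Utility;
EOF

# A menu implementation needs at least Type, Name and Exec.
for key in Type Name Exec; do
    grep -q "^$key=" "$desktop_file" || echo "missing $key"
done
echo "ok"
```

Drop a file like this into the standard applications directory and any spec-compliant desktop (KDE, GNOME, or otherwise) can show the menu entry; that is exactly the kind of cross-distribution contract the standard is meant to pin down.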

Re:I don't know what they are on about (2, Insightful)

Homology (639438) | more than 8 years ago | (#15180024)

The whole "as long as the runtime libraries are there" catch is what it's all about. It's not reasonable to expect people to deal with dependencies.

It's the package maintainer's job to deal with library dependencies. If a Linux distro is unwilling to do this, why should I use it in the first place, since it is obviously of low quality?

its an impossible job (0)

Anonymous Coward | more than 8 years ago | (#15180072)

The best "maintainer" in the world cannot keep up with the hundreds of sublibraries, and the different versions of those sublibraries, required by the various GUI toolkits on Linux.

Programmers have an even harder time, because they have to get the header files and the development and/or debugging libraries for all this stuff. If two of their sublibraries use a third sublibrary but each requires a different version... oy vey. They usually just copy all the source code into their own project, which bloats it and locks out improvements, bug fixes, and security patches made to that third sublibrary.

And that's without even mentioning the quality-assurance issues. If you really want to test a program to professional standards, you have to test it (as in actually run it) on the different versions of the libraries out there. Red Hat, Mandrake, SUSE, Ubuntu, Kubuntu, and so forth each have different libraries, different locations for configuration files, and different settings for basic things like Apache or MySQL or X11 or KDE or GNOME. So you have to test against all of that. And different versions of those OSes use different versions of the libraries, so you have to test against those as well.

The less labor involved for the distro maintainers, the less labor required for programming, and the less labor required for testing, the more time people can spend writing useful programs instead of making their program compatible with five different flavors of the libraries shipped on different Linux versions, and instead of learning the quirks of, and testing, the three or four different ways of packaging programs.

If you want to make crap programs and leave it up to everyone else to install them, it's not a big deal.

If you want to make programs that people can install with one or two clicks, as Windows and the Macintosh have had for years, then you need more standardization.

apt-get install blah blah: yeah, great. But it works differently on SUSE, Red Hat, etc., and besides, a lot of the time apt-get just won't work, or packages will be available in several confusingly different versions.

That's because it's hard to be a distro maintainer/packager, and it's hard to get everything perfect.

Having more consistency would make everyone's job a little easier, and that would reverberate throughout the whole system.

It has been getting more consistent over the years.
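The per-distro fragmentation described above can be sketched as a simple lookup. The mapping below is purely illustrative (the function name is my own, and the commands are just the era's defaults for each family):

```shell
# Map a distro family to its native package-install command.
# Illustrative only: real packaging differs in far more than the verb.
pkg_install_cmd() {
    distro=$1
    pkg=$2
    case "$distro" in
        debian|ubuntu)  echo "apt-get install $pkg" ;;
        redhat|fedora)  echo "yum install $pkg" ;;
        suse)           echo "yast -i $pkg" ;;
        gentoo)         echo "emerge $pkg" ;;
        *)              echo "unknown distro: $distro" >&2; return 1 ;;
    esac
}

pkg_install_cmd ubuntu firefox   # prints "apt-get install firefox"
pkg_install_cmd gentoo firefox   # prints "emerge firefox"
```

The point of the sketch is the complaint itself: an ISV shipping one application has to know which row of this table the user's machine is on, which is exactly what a common standard would make unnecessary.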

Re:I don't know what they are on about (1)

IamTheRealMike (537420) | more than 8 years ago | (#15180130)

World of Warcraft is not in most distros, is it low quality? Compared to say, bzFlag?

Don't get me wrong, I love a good game of bzFlag, but saying "If a Linux distro is unwilling to do this, why should I use it in the first place, since it is obviously of low quality?" isn't going to fly with most people...

Re:I don't know what they are on about (1)

AusIV (950840) | more than 8 years ago | (#15180247)

Different distros have different ideas of how to handle package distribution and dependencies. I'm a relative noob to Linux, and I already know that Debian uses one kind of package, Gentoo another, Fedora another, and there are several other methods of distribution. I think part of the problem is that each of these distribution techniques has its own advantages. You're not going to get people who use Gentoo to download and use binaries (at least not for everything), and you're not going to get people who use Ubuntu to compile everything they want to install.

Not wanting to meet someone else's standard may mean a distro can't maintain dependencies for everything, but that doesn't mean its method of distribution is necessarily low quality.

Re:I don't know what they are on about (1)

DrXym (126579) | more than 8 years ago | (#15180038)

I guess the issue isn't what widget set you use, but the fact that you have two stacks underpinning them with little sharing of libraries or settings. The day KDE and GNOME learn to play nice and share the same settings, desktop files, theme engine, and behaviour is the day Linux moves on from pointless widget wars (which no end user cares about) and focuses on delivering a high-quality desktop.
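Sharing settings across desktops is what freedesktop.org's XDG Base Directory specification tackles: both stacks resolve the same per-user config root instead of inventing their own dotfiles. A minimal sketch of the lookup rule:

```shell
# Resolve the per-user config directory the way the XDG Base Directory
# spec defines it: $XDG_CONFIG_HOME if set, otherwise ~/.config.
xdg_config_home() {
    echo "${XDG_CONFIG_HOME:-$HOME/.config}"
}

XDG_CONFIG_HOME=/tmp/conf xdg_config_home   # prints /tmp/conf
```

Any toolkit that follows the same rule will find the same desktop files and themes, which is precisely the "share the same settings" behaviour the parent is asking for.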

I have to ask... (3, Interesting)

MobileTatsu-NJG (946591) | more than 8 years ago | (#15179900)

This question is going to seem rude, and I apologize for this, but why didn't this happen years ago? I'm asking out of curiosity, not as a jab at the community. It seems to me that this sort of standard would have been quite valuable as soon as GUIs became prevalent with Linux.

Re:I have to ask... (1, Redundant)

wysiwia (932559) | more than 8 years ago | (#15180016)

Simply because nobody realized the real reasons until OSDL published its survey in Dec 2005 (http://www.osdl.org/dtl/DTL_Survey_Report_Nov2005.pdf [osdl.org]). Even today nobody wants to take the necessary steps, as outlined here: http://linux.slashdot.org/comments.pl?sid=183801&cid=15179906 [slashdot.org].

O. Wyss

Re:I have to ask... (4, Insightful)

Pneuma ROCKS (906002) | more than 8 years ago | (#15180037)

I think GUIs, despite being prevalent for quite some time, have been very, very low on the priority list of Linux developers. The community has focused more on the low-level kernel and architecture areas, and the rest has suffered for it. IMO, GUIs on Linux have always been an afterthought, and that's the reason they suck so much (again, IMO).

This sheds light on a key problem with open-source software: developers will work on what they want to work on, not necessarily on what needs to be done.

Yeah, mod me down, see if I care.

This is nice, but... (1, Insightful)

PenguinBoyDave (806137) | more than 8 years ago | (#15179902)

Dell, HP, Toshiba, etc. STILL package Windows with every new PC that leaves the shop. I have seen no indication that they plan on changing that any time soon. Sure, Dell might say he likes Ubuntu, but I'll believe it when the first Dell ships with Ubuntu and an Ubuntu sticker on the front where the Windows sticker used to be; you know, "This PC specifically designed for Ubuntu Linux."

I don't know too many people (other than geeks) who are going to go out, buy a white-box PC, and load Linux when for about $400.00 you can get a fully rigged-out Dell with Windows and all the goodies.

Re:This is nice, but... (0)

Anonymous Coward | more than 8 years ago | (#15180030)

"a fully rigged-out Dell with Windows and all the goodies."

With all the OEM-installed crapware that slows the system to a crawl, and "value-add" products such as demo software to keep the adware/spyware/viruses/worms/malware at bay. Indeed, who would turn down such an offer?

"One big things that's difficult is consistency" (2, Informative)

wysiwia (932559) | more than 8 years ago | (#15179906)

Yes, and consistency can only be achieved by standardizing. Unfortunately this doesn't only hold true for the desktop; it's equally or even more important for applications. So far Jim Zemlin, executive director of the Free Standards Group, doesn't seem to realize this, or else the FSG would already have standardized on a single set of application guidelines, as outlined in wyoGuide (http://wyoguide.sf.net/ [sf.net]). Since this isn't the case, we still have to wait for the breakthrough of the Linux desktop.

Anybody who is interested in a Linux desktop and doesn't want to wait much longer should persuade the FSG to come to terms and at least evaluate wyoGuide.

See also http://lxer.com/module/newswire/view/54009/index.html [lxer.com]

O. Wyss

A key step to the $150 laptop (2, Interesting)

rqqrtnb (753156) | more than 8 years ago | (#15179909)

Once Linux distributors get their act together, it won't be long before semi-disposable laptop computers become available. Nick Negroponte's hand-cranked, third-world computer will spawn a commercial version.

With Linux as the OS, OpenOffice, Mozilla, and a few other key apps, with no "Microsoft tax," no headache of installing Linux on a used Wintel machine, plus a few "styling options," these machines will sell like hotcakes!

Then, of course, the virus writers will shift to more fertile grounds, and all the bad that goes with the good...

Finally! (3, Interesting)

yootje (770109) | more than 8 years ago | (#15179911)

What Linux needs is standardization. Having 921034 options to choose from is sometimes a good thing, but sometimes you get the feeling: why don't they all just work on one fantastic piece of software?

Re:Finally! (5, Insightful)

Coryoth (254751) | more than 8 years ago | (#15179953)

Having 921034 options to choose from is sometimes a good thing, but sometimes you get the feeling: why don't they all just work on one fantastic piece of software?

Because the world's open-source developers are not a giant slave pool designed to do your bidding.

Open source will always be chaotic and involve a great deal of duplication, because that's the nature of the beast. The gain you get from that cost is much more open software that's developed rapidly and tends to work as a free market for ideas: the better ideas eventually win out (though that may take some time). If you want something different, then you want Apple or Microsoft, with their rigid top-down control structure which ensures that everyone is working toward a single unified goal (as much as is possible) and all the work is directed. The upside is consistency and a unified vision; the downsides are that the whole thing is more locked up, an often slower development cycle, and a tendency to get hit with the same stupid mistakes release after release just because they appeal to the guy at the top.

It's a choice, and you can pick the software ecology that suits your needs. Just don't go expecting one to behave like the other on a whim; there are deep, fundamental philosophical divisions about how to develop software (let it evolve from the bottom up, or direct it from the top down) that are largely irreconcilable.

Jedidiah.

Re:Finally! (1)

wysiwia (932559) | more than 8 years ago | (#15180159)

Open source will always be chaotic ...

There's nothing wrong with chaotic, but does it also have to be bad? I'm not against anybody doing anything in open source, but when it comes to standards there's no room for bad. So far the FSG has mostly created sound standards, but unfortunately that can't be said of Freedesktop.org, and even less of the desktop architects at OSDL. In particular, the Portland initiative of the OSDL, which was created for exactly this task and should know better, does not even try to tackle the top inhibitor of desktop Linux adoption (http://www.osdl.org/dtl/DTL_Survey_Report_Nov2005.pdf [osdl.org]), not even after I've told them. So my hope is that the FSG will take over this problem and do a better job.

O. Wyss

Re:Finally! (1)

Cereal Box (4286) | more than 8 years ago | (#15180308)

It kills me how, on the one hand, you guys go absolutely nuts about web standards, document standards, etc.: "just code to the standard and it'll magically work!" is the mantra around here. But as soon as someone says that the Linux desktop or Linux distributions need to standardize on this or that, the tired old "but that would stifle choice!" line gets trotted out.

Hmm, maybe that's what Microsoft thinks. They break standards because standards would just limit their choice...

Re:Finally! (1, Troll)

tomstdenis (446163) | more than 8 years ago | (#15179967)

First off, Linux is a kernel.

As for the distros, yes, there is redundancy. It's annoying. I tried to tell Red Hat and SUSE to merge, but they refused. For the most part, outwardly they're all the same: you get some un-optimized, heavily modified kernel that you can't trace back to vanilla, and a plethora of pre-built tools with wacky --enable-* flags set. It's annoying and highly unproductive.

As for the options, keep in mind that unlike, say, Windows, a Linux-based distro can target a variety of real-world work scenarios. This is why there are many projects out there that can "confuse" the landscape.

As for it being complicated to choose between them: use Gentoo. It handles 99% of all dependency problems while letting you use the latest and greatest, built with the options you want enabled. How many packages do I have installed? Around 400 to 500. Can I name half of them? Not even close. Can I easily add a new package or remove an old one? Yes.

And to those who claim bloat: I'm using ~3.1GB of space for what I consider a fairly well-equipped workstation (many tools, such as the GNU compiler chain, GDB, various memory checkers, teTeX, X11, GNOME, OpenOffice, etc.).

I can get a basic workstation (with devel tools, X11, OpenOffice, etc.) in around 2GB, which is much better than Windows, which on its own is 2GB. Then you have to install 6GB of MSVC, another GB of Office, another GB of MiKTeX [to get real work done], etc., etc. A complete Windows workstation takes ~15GB of disk or so.

Part of the reason Linux distros [especially Gentoo] can be so small is the use of shared libs. Which is odd, because Windows is largely based on DLLs.

Tom

Re:Finally! (3, Insightful)

JanneM (7445) | more than 8 years ago | (#15179974)

why don't they just work all on 1 fantastic piece of software?

Because there is no one answer to what makes a piece of software fantastic.

When intelligent people can reasonably disagree on it, don't be surprised, or dismayed, when the end result is several divergent designs. That is truly a case where any one of the designs is good and, importantly, better than a compromise between them.

Re:Finally! (5, Insightful)

TERdON (862570) | more than 8 years ago | (#15179986)

why don't they just work all on 1 fantastic piece of software?

Because there couldn't be such a thing: it's an oxymoron.

Basically, the requirements for the piece of software would be heavily contradictory: dead easy to use, but still incredibly powerful. Few such programs exist, because they are virtually impossible to make.

Example: file managers. On the one hand, you have Explorer, Finder, Nautilus et al., which are all at least relatively easy to use, even for a newbie. Many find them far too limited, especially on /., where the favourite is probably raw /bin/bash, which is far more powerful but also really hard to learn.

The same principle holds for most other software. Either you make an easily usable version or a powerful one. The powerful version will, by definition, require a lot of learning on the part of the users, and thus can't be easily usable.

When you try to unite these two conflicting requirements, the most likely outcome is one of:

1) A cluttered interface, which intimidates the newcomer
2) A clean interface, but with all the powerful features hidden away from sight, so the advanced user has to hunt for them
3) Millions of settings in an unmanageable settings dialog, toggling the different features on and off

Conclusion: one piece of software normally can't be the great software for every single user. The shifting requirements different individuals have will without doubt make them prefer different software, and that isn't really a bad thing. If everybody ran the same software, there wouldn't be as much incentive to develop new, powerful features!


On the desktop and haven't looked back... (3, Informative)

i_want_you_to_throw_ (559379) | more than 8 years ago | (#15179917)

I have tried using Linux on the desktop MANY MANY times and always found myself stymied by getting printers to work and so forth. I have always been adamant about using it for servers, where it's very much worth the time to figure out Linux to get its benefits as a server product (bulletproof security, etc.). As a desktop product, though, I wasn't about to spend all day dicking around trying to get it to work. That was then... this is now...

I have been using Linux as a desktop for several months now, and it has flawlessly detected all my peripherals. I have now been able to spend more time doing development, which is what I get paid to do.

Linux is getting better in this area, and it is going to start making inroads. Slowly but surely...

Re:On the desktop and haven't looked back... (1)

tomstdenis (446163) | more than 8 years ago | (#15179945)

Again, this is an example of faulty logic.

You're saying that because your printer manufacturer hasn't followed the 25-year-old PostScript standard, Linux is broken? Why not buy "Linux-ready" printers [some Samsung laser printers, for instance]?

After that, driver installation is easy, and you basically print through CUPS.

But again, this is totally a manufacturer problem, not a Linux one. It isn't as if Linus can force manufacturers to include Linux drivers for their non-standard proprietary shit.

Tom

Re:On the desktop and haven't looked back... (1)

Kangburra (911213) | more than 8 years ago | (#15180057)

You're saying that because your printer manufacturer hasn't followed the 25-year-old PS standard, Linux is broken?


No, but most people will think this. If you know enough to even know what PS stands for, you'll have bought a printer that is capable of it.

For the general public, things like printers may need the CD that came with them, but they shouldn't have to chase help forums to get a test page. Until this changes, Linux is not ready for the desktop. So we have to complain to the hardware manufacturers until it does work out of the box.

Re:On the desktop and haven't looked back... (1)

tomstdenis (446163) | more than 8 years ago | (#15180094)

Again: wrong, wrong, wrong.

Just because your printer handles PS doesn't mean you can just dump a PS file to it and have it print. Many printers use proprietary encodings of the PS data as it's sent to the printer.

As for installation...

emerge -uD cups
cd samsung-driver
./install.sh

[I think it was called install or setup...]

The installation is GUI-driven: you pick your printer and how it's hooked up, and it installs the driver for CUPS.

From opening the box to printing [locally], you're looking at about 5 minutes of work at most. Getting remote printers working is a bit tricky [mostly figuring out CUPS], but still not that hard.

My old job had two Samsung LPs hooked up to CUPS in round-robin fashion. Not exactly super hard.

Just because HP, Canon and the others can't figure shit out doesn't mean "Linux ain't ready." The technology support in Linux is there, and has been for a long time.

This is the same as the game debate. Linux has long supported video drivers such as those from ATI and Nvidia. It also supports sound, keyboards and mice.

The only reason there are not a lot of good FPS games for Linux is that nobody is writing OpenGL engines. They're all too gung-ho on DX.

Let's see... SDL + OpenGL == a portable game engine that handles keyboard, mouse, timers, graphics, network, fonts, etc., and would be portable [it would even work in Windows].
Tom

Re:On the desktop and haven't looked back... (1)

Trelane (16124) | more than 8 years ago | (#15180234)

Just because HP, Canon and the others can't figure shit out doesn't mean "Linux ain't ready".
Actually, HP has done a terrific job of supporting [sf.net] Linux [sf.net]

These are at least Open, if not Free Software packages, and included in your distro (I've not found one yet that doesn't have them, what with them being FOSS and all.) To use them:

emerge [hpijs|hpoj|hplip]
and then the drivers will show up in your printer listing in CUPS (you may have to restart CUPS; I don't remember. I use the web interface; use whatever you're comfortable with). If appropriate, you can select the scanner in SANE; it just appears.

It is precisely because of this great support that I will be buying an HP Office Jet in the near future. It also makes sense on their part--the software is a way to sell printers, which is a way to sell ink. ;)

Thanks, HP!

Re:On the desktop and haven't looked back... (1)

neildiamond (610251) | more than 8 years ago | (#15180073)

In response to the comment about why not use printer X just because it supports Linux...

First off, I used to buy only HP printers. A year ago I bought an Officejet 7110 to have the ability to print AND scan in Linux. (It didn't fax in Linux, but 2 out of 3 ain't bad.) Scanning from Gimp in Linux was fantastic for bulk scanning projects. I really miss that. However, the printer was never quite right (as in flaky). The print quality (when working properly) was identical to the HP printer we had 4 years earlier. Finally the whole thing crapped out right after the one year mark. I did some research and found out that Canon had the best all-around multifunction in my price range (Pixma 780). I can't scan in Linux, which pisses me right off. Canon won't provide drivers for printing. I think that some CUPS drivers supposedly sort of work, but I downloaded proprietary drivers from Germany that only let me print in draft mode (though they're fine as a shared printing device using Windows drivers from another machine).

Just noticed today that there are hacked scanner drivers that I will have to test out... http://pixma.schewe.com/ [schewe.com]

Anyway, I based my purchase decision on...
Canon
Lower cost of printer,
Lower cost of ink
Better print quality
Fax really works all the time!
scanning in Windows only (for now)
Printer is rock solid

vs. HP
Most functions supported in Linux
Lesser print quality
Fax half-worked (has this annoying power save feature that has it crap out after about 8 hours of idle on a UPS that cannot be disabled)
More expensive ink
Printer was flaky
more expensive purchase price for printer

I think most sane people (not necessarily on Slashdot) would agree that full Linux support isn't everything if the product still kind of sucks.

Re:On the desktop and haven't looked back... (1)

LocoMan (744414) | more than 8 years ago | (#15180152)

It's making inroads, but it still needs some more work done, IMHO. I'm trying Linux for the first time now on a second computer I have around (Ubuntu, to be more specific), and while I was pleasantly surprised at how usable it was right off the bat (even the internet worked), it has disappointed me so far in how many times I have had to resort to the command line from the beginning (to get MP3 working), and how hard it was to get something I would have thought would be as simple as installing a video driver to get decent 3D.

At least in my experience trying to install them, following the instructions on wiki.ubuntu.com got me a BSOD (well, it was a screen telling me X couldn't be loaded... but it was blue.. :) ) and sent me to command line only. I managed to find some more instructions to uninstall them from there and could get back to the GUI, then downloaded the drivers from nvidia.com and tried to install them... and failed because I didn't have what I needed for them. After 2 HOWTOs (about 10 pages worth of instructions) I finally managed to install them after 7 failed attempts (it wanted gcc... then it wanted the RIGHT version of gcc, then it wanted some system variables, plus several other things I did from the HOWTOs that I have no idea what they did).

That was my first experience with Linux. My computer doesn't have any weird stuff, and the video card is an Nvidia 5500, which I would consider fairly mainstream, yet it took me 2 nights dicking around to make it work (in 3D, at least). And that comes from someone who's fairly computer literate and not foreign to command lines (I grew up in the time when DOS was king and we had to edit autoexec.bat and config.sys files all day to make our games work). I completely agree it's getting better, but at least in that area it needs some more work...
3D and MP3 playback at least (I haven't been using Linux much yet and those are the only roadblocks I've found so far) are things that are very mainstream by now, and they should be a lot easier to get working... I understand they can't be included working by default because they're not "free", but maybe include a wizard to make them work without having to resort to searching for HOWTOs and command lines. That said, though, I'm still pleasantly surprised with it, and while I probably wouldn't use it myself on my main work computer (I'm a video editor and graphic designer; all the better tools for that are Windows or OSX only), given more time with it, and once MP3 playback is working, I'd probably recommend it to someone who only needs a computer for simple browsing and email.

Re:On the desktop and haven't looked back... (1)

c0d3h4x0r (604141) | more than 8 years ago | (#15180314)

The parent poster's GNU/Linux experience mirrors my own. Even Ubuntu is unnecessarily difficult to configure.

Installing programs (that aren't yet specifically built as a package for your distro, which is rarely the case when you're trying to install, say, the most recent released version of an app) is a nightmare.

Installing device drivers and configuring X for 3D graphics cards is a nightmare.

Installing printer drivers and having them actually work in all applications is a nightmare.

Hell, just trying to find device drivers for many common devices is a nightmare. Half the time they are available only as source code you have to merge into the kernel, or only as binaries packaged for Red Hat, and you can't get them to work harmoniously with your particular distribution.

Trying to get something like video file playback working is a total mess, because there are umpteen different competing video subsystems or libraries with different applications relying on different ones.

HOWTOs aren't an answer, because often the information you find in them isn't quite right for your given distribution, and it just fouls things up worse than before you started following the instructions, and of course the HOWTO provides no insight about how to roll back the changes if you encounter failure.

All the Linux zealots keep cheering the improved usability of the Linux desktops. It's bullshit. Users still can't easily install the right programs or drivers, get devices working fully and reliably, make sense of asinine library dependencies that make "DLL Hell" look like a walk in the park, or figure out how to get basic scenarios (such as video file playback) working on their desktop.

Usability is about more than pretty point-and-click eye candy. It's about making tasks easy to accomplish, not just for newbies, but for expert users, too. It's about saving people time and hassle. It's about simplifying and making consistent the inner workings and structure of the system so that there isn't so much to have to learn and figure out. It's about making things painfully obvious to users so that they don't have to form their own inaccurate guesses through expensive and risky trial and error.

I'm sick of Linux and FOSS zealots claiming that something is easy to use just because mountains of documentation are available for it. Something is by definition not easy to use if you have to learn a lot of unintuitive stuff about it before using it to perform basic tasks.

If someone has to go buy a "Linux for dummies" book, or has to read a lengthy technical HOWTO document, or has to go post questions in user forums, to understand how the system works and to accomplish basic tasks, then the system is by definition unusable.

GNU/Linux systems in general are a complete disastrous mess of chaos. If the grand experiment that is GNU/Linux has proven anything, it's that top-down design is absolutely necessary when it comes to something as technically complex as software.

Standards wont make a difference (3, Insightful)

gimpimp (218741) | more than 8 years ago | (#15179926)

We've had standards bodies for a long time. LSB, Freedesktop, etc - none of them will help increase market share. Sure, they make life easier for developers, i.e. a Gnome icon theme will soon work on a KDE desktop. But the single major problem on Linux is dependency hell. I have nightmares about this.

Repository based installation is NOT the way to go. Autopackage is just a pretty frontend around the same problem. Until we can install and remove applications as easily as OSX users can, we don't stand a chance.

If you were a new user to unix, what would you prefer:
A) open synaptic, search the thousands of packages, hope you find what you're after, install it.
B) download an app folder, drag it to your applications folder. go.

Without this ease of use, there's no chance. I still laugh at people who say linux is ready, whilst at the same time they can't install the latest firefox on their box because it depends on the latest gtk which depends on the latest glib, which depends on....
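The chain described here is ordinary transitive dependency resolution. A toy sketch (the package names and version numbers are made up for illustration; this is not any real package manager's code, and the version comparison is deliberately naive) shows why one outdated library surfaces a whole chain of requirements at once:

```python
# Toy model of transitive dependency resolution. Package names and
# versions are hypothetical; real tools (apt, portage, yum) do the same
# walk against a live repository index with proper version comparison.

# versions currently installed on the system
installed = {"gtk": "2.4", "glib": "2.4"}

# each package maps to the minimum versions it requires
requires = {
    "firefox": {"gtk": "2.8"},
    "gtk": {"glib": "2.8"},
    "glib": {},
}

def missing_deps(pkg, seen=None):
    """Return every (dep, needed) pair the installed system fails."""
    seen = set() if seen is None else seen
    out = []
    for dep, needed in requires.get(pkg, {}).items():
        if dep in seen:
            continue
        seen.add(dep)
        # naive string compare; fine for these single-digit X.Y versions
        if installed.get(dep, "0") < needed:
            out.append((dep, needed))
        out.extend(missing_deps(dep, seen))
    return out

print(missing_deps("firefox"))  # the whole chain surfaces at once
```

A repository tool runs this walk against its index and offers to upgrade the whole chain in one transaction; doing the same walk by hand, bottom-up, is the "hell" being described.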

Re:Standards wont make a difference (0, Troll)

tomstdenis (446163) | more than 8 years ago | (#15179936)

What are these dependency problems of which you speak ... .... uses gentoo ...

Biggest problem with some commercial apps is they insist on using C++ and all the bleeding edge features of GCC 4.x.y. That's why they have "portability" issues. Especially on platforms where the C++ internal symbols have slightly different names (re: Redhat; try running Synopsis tools in Gentoo).

It's true there are a few too many OSS libs out there, where some could be joined into one larger lib (or even just one package with various shared objects), but that's also the flexibility of the system.

But in general there are standards for development, as you've said. So really, any problem working cross-platform is usually of their own doing.

You can easily "install/remove" apps in Linux. Gentoo is already there with the end user application. It just needs rollback capabilities and a shiny GUI wrapper.

Tom

Re:Standards wont make a difference (1)

gimpimp (218741) | more than 8 years ago | (#15179957)

i've used gentoo, and though it's great for developers and people who like to get their hands dirty - it's not for everyday folk who just like to check their email with the latest Evolution, and browse the web with the latest Firefox.

Until linux has real drag and drop package management, not this searching through a 10000 package repo garbage, there's absolutely no chance in hell. And I won't recommend it to anybody either.

I love Linux, use it every day on my PCs, but its package management is trash. Take Mac users, for example. If the latest beta of their chat client comes out, they can very easily download the package, open it, and run the app inside. It doesn't clash with anything on their existing system; it just works. On the other hand, I can't install the latest Gaim beta because it's source only. Then I have to worry about its deps. So depressing.

Re:Standards wont make a difference (2, Insightful)

tomstdenis (446163) | more than 8 years ago | (#15179982)

I dunno. I don't subscribe to the dumbification of humanity. I agree that things shouldn't be harder than they need to be. But it isn't like "emerge -uD firefox" is so fucking hard to type. I mean, look at how many people can hardly use Windows as it is. I think the trend should move towards "let's document our system and stick to standards".

Remember the days of the 200-page MS-DOS 5.0 user manual showing off all the commands with examples? What happened to that? For $300 [full XP Pro] you'd think they could include a 100-page primer on using Windows. I mean, it isn't like the CD cost them $300, and when they fully admit they're making money hand over fist, you'd think customers have a right to demand more.

It isn't like there are no Linux books, though. So if the user has to learn how to use their computer, is that really so bad? It means they get better use of it and are not at someone else's mercy as to what they can run and how. I think that's a good thing. I could be wrong...

Tom

Re:Standards wont make a difference (1)

baadger (764884) | more than 8 years ago | (#15180010)

But it isn't like "emerge -uD firefox" is so fucking hard to type.

Wouldn't it be easier to go to Help -> Check for Updates in Firefox, like on Windows? Oh wait, it's greyed out on Linux. What does the "-uD" bit mean? Man page, manual page, what's that? Why should I have to read this to understand it? Why is it called emerge? Why isn't this easier?

I'm a Gentoo user too, but this isn't trivial to the n00b. The days of the 200-page MS-DOS 5.0 user manual are long gone; in those days you didn't have the same Joe Bloggs user on the computer trying to get their job done.

The fact is you shouldn't need a multi-hundred page manual for anything with a GUI.

Re:Standards wont make a difference (-1, Flamebait)

tomstdenis (446163) | more than 8 years ago | (#15180029)

STFUN. [stfu + newb]

The installation guide for Gentoo explains "emerge" and why you use it. As for the rest of the options "man emerge".

Oh, you're incapable of reading a 30 page document... oh ok. Glad to know you're a TOTAL FUCKING RETARD.

I mean even if Gentoo bound their user manual in gold and gave it out for free they'd still have people like you saying "where is this guide you are speaking of?".

I'm a gentoo user too, but this isn't trivial to the n00b.

It's a lot better. I moved to Gentoo ~2003 or so when the user manuals were still like "it works, trust us". You can't tell me you're not better off now with Gentoo though. So the user has to actually learn a little about how the software on their computer works. I think that is a good thing.

As someone who has seen a lot of computer users [OSS or Windows] stuck on the trivialest of issues because they can't use a shell to patch a shared object or startup script or something and have to wait for "official support" I say knowing something is a good idea.

The fact is you shouldn't need a multi-hundred page manual for anything with a GUI.

Um, with screenshots covering all aspects of the OS, from how to run apps, install devices, set up networks, wifi and printers, to installing and removing software, I can easily see that being ~100 pages.

In fact what do you get with a legit copy of Windows? A little pamphlet trying to sell MS addons...

Tom

Re:Standards wont make a difference (1)

Homology (639438) | more than 8 years ago | (#15180060)

[rant]

It is seldom that I see anyone recommending Linux users read man pages. I used to use Linux (SuSE, a few years ago), but quality issues and poor documentation made me move away from it. In general, the Linux man pages are of low quality (out-of-date, incomplete and buggy), if there are any man pages at all to read.

New OpenBSD [openbsd.org] users with a Linux background are unused to actually reading documentation, and just post on a mailing list without doing any research first. Considering the quality of Linux documentation, that is understandable behaviour. However, on OpenBSD, the man pages and other documentation are high quality and are expected to be read.

[/rant]

Re:Standards wont make a difference (0, Troll)

tomstdenis (446163) | more than 8 years ago | (#15180075)

Dunno about you. The man pages I read are usually POSIX or glibc functions and they are just fine. As for the various other random commands it depends. Most of the coreutils are well documented [e.g. "cp", "ls", etc].

The thing that is least documented would have to be /etc/conf.d/ entries. But mostly a quick google is all you need.

You have to keep in mind the "man-pages" package is actually a separate project on its own. It's not strictly part of the Linux realm.

Tom

Re:Standards wont make a difference (1)

Homology (639438) | more than 8 years ago | (#15180128)

The thing that is least documented would have to be /etc/conf.d/ entries. But mostly a quick google is all you need.

On OpenBSD it's seldom that I have to google for something that is part of the base install (and that covers a lot). Most, if not all, config files are documented in man pages or other available documentation (like the excellent FAQ [openbsd.org] ).

You have to keep in mind the "man-pages" package is actually a separate project on its own. It's not strictly part of the Linux realm.

This seems to be part of the problem.

Re:Standards wont make a difference (1)

init100 (915886) | more than 8 years ago | (#15180096)

Wouldn't it be easy to goto help -> check for updates in Firefox like on Windows? oh wait it's greyed out on Linux.

Read my previous comment [slashdot.org] about this for an explanation.

Re:Standards wont make a difference (1)

drsmithy (35869) | more than 8 years ago | (#15180155)

Remember the days of the 200-page MS-DOS 5.0 user manual showing off all the commands with examples? What happened to that?

It's now an online, searchable, cross-referenced, interlinked resource.

Books look nice on the shelf and are handy for helping get to sleep at night, but a good online help system is infinitely more functional.

Re:Standards wont make a difference (1)

IamTheRealMike (537420) | more than 8 years ago | (#15179973)

Biggest problem with some commercial apps is they insist on using C++ and all the bleeding edge features of GCC 4.x.y.

GCC 4.x has been out for hardly a year; almost no commercial apps "depend" on it. C++ on Linux is generally riddled with problems anyway, but it's not the fault of the people who use C++; it's the fault of the people behind the compilers and low-level binary formats/tools ...

Re:Standards wont make a difference (1)

tomstdenis (446163) | more than 8 years ago | (#15179992)

Where I work onsite [let's just say they like chess] they are using the latest and greatest C++ features of a "blue" compiler. Turns out porting to GNU and other toolchains is doable, but not without some minor pains here and there. Do they need these features? Maybe, maybe not.

My point wasn't to blame C++ but to say that using the bleeding edge of compilers is not always smart. Similarly with other tools and libs that are brand new.

You're right most OSS tools target GCC 3.2 or 3.4 in terms of "expected functionality" which is nice. But various commercial apps [like verilog tools] often use C++ and are built on platforms like Redhat where there are hidden renamed symbols. There is no reason why a userspace program like those of Synopsis can't run in Gentoo [or Debian or ...]. They don't because the C++ stdlibs are incompatible.

Tom

Re:Standards wont make a difference (1)

IamTheRealMike (537420) | more than 8 years ago | (#15179995)

Hidden renamed symbols is a new one to me, do you have any more details about this specific issue?

Re:Standards wont make a difference (1)

tomstdenis (446163) | more than 8 years ago | (#15180015)

I can't find the material right now. But if you try to run [iirc] Synopsis tools on Gentoo they'll crash. The trick was one of the __ functions was renamed. Apparently renaming it back in the ELF file is enough to get the program to work again.

If I had access to a Redhat box I could tell you.

I don't think it's a huge problem but it is problematic.

So are the kernel patches [even Gentoo does them]. I run vanilla kernels [always have] and I update when appropriate [or given the time]. But I mean, what exactly is gentoo-sources-2.6.16-r3 in terms of a Redhat kernel anyway?

In essence all the right tools for good standardization are there. Just people abuse them to get "an edge".

Tom

Re:Standards wont make a difference (1)

grumbel (592662) | more than 8 years ago | (#15180105)

### Biggest problem with some commercial apps is they insist on using C++ and all the bleeding edge features of GCC 4.x.y. That's why they have "portability" issues.

I have to disagree, while C++ incompatibilities between GCC versions are an issue, they are mostly an issue for distributors who try to link everything dynamically, for stand-alone packages that is hoverever not a problem, link statically and the problem is solved. The real throuble are mainly the Linux kernel itself and glibc, since those you can't solve by static linking or by including them in your package, and neither of the maintainer of them seems to care much about binary backward compatibility, so that the problem won't solve itself anytime soon in the future. And there is also the throuble with GCCs usability, which makes dynamic linking easy and static linking a nightmare to get right, which is why hardly anybody is using it in the first place, most simply don't know how.

Re:Standards wont make a difference (4, Insightful)

IamTheRealMike (537420) | more than 8 years ago | (#15179963)

Repository based installation is NOT the way to go. Autopackage is just a pretty frontend around the same problem.

Well, autopackage was designed to deal with many of the problems repository-based distribution has, so I would strongly disagree with the notion that it's just "a pretty frontend around the same problem". We've put many, many times more effort into things like reliable/easy installs than into making it pretty (though there is still much to do).

Without this ease of use, there's no chance. I still laugh at people who say linux is ready, whilst at the same time they can't install the latest firefox on their box because it depends on the latest gtk which depends on the latest glib, which depends on....

This problem affects any OS. You can't install Safari on MacOS X 10.1 either, if I remember correctly. It's true that Linux suffers this problem worst of all, though, because there's no unified platform, and because there's no profit motive there's little incentive for developers to go "the extra mile" to reduce system requirements. But it's a separate (though related) problem from how you install software.

Re:Standards wont make a difference (3, Interesting)

Florian (2471) | more than 8 years ago | (#15179980)

download an app folder, drag it to your applications folder. go.
Unfortunately not. OS X programs often spread their files all over the file system, with a mess of binary configuration files, possible netinfo entries (akin to the Windows registry...), etc. There is no standard method in OS X to cleanly remove them - just deleting the application won't do the trick in most cases. Even Windows is superior in that respect.

Besides, downloading binary code somewhere from the Internet and installing it in your system is a security nightmare and practice that should be abandoned ASAP. I find the Linux/BSD model of providing all software in distribution-provided repositories blessed by the distribution's maintainers vastly superior to OS X, with unmatched clean and safe installation, removal and upgrading of software. (How, for example, do you upgrade all your Mac OS X software with one command or click?) I use both Debian and Mac OS X and find Debian vastly superior in this respect.

Re:Standards wont make a difference (1)

jeffehobbs (419930) | more than 8 years ago | (#15180098)


OS X programs often spread their files all over the file system, with a mess of binary configuration files, possible netinfo entries

Wha...? What you say may hold true for server software .pkg installs, but these days those are few and far between, and they by definition include a bill of materials that tells you what files are being installed and where (select "Show Files" during the install to see). Most OS X application software these days is a .app drag-and-drop install, and it will "spread its files" 99% of the time in three places:

  1. ~/Library/Preferences/
  2. ~/Library/Application Support/
  3. ~/Library/Caches/


...and I have *never* had an app modify my NetInfo settings in the five (!) years I've been running OS X.

~jeff

Re:Standards wont make a difference (1)

moro_666 (414422) | more than 8 years ago | (#15179996)

So basically the thing you want is full support: someone who would test all the dependencies and stuff when every package is released. And you want it for free.

good luck, i hope you get a ferrari for free too.

Re:Standards wont make a difference (1)

Kangburra (911213) | more than 8 years ago | (#15180066)

so basically the thing you want, is full support. someone who would test all the dependancies and stuff when every package is released. and you want it for free.


This could be a good way to earn money in the OSS arena though.

Re:Standards wont make a difference (3, Informative)

i_should_be_working (720372) | more than 8 years ago | (#15179998)

If you were a new user to unix, what would you prefer:
A) open synaptic, search the thousands of packages, hope you find what you're after, install it.
B) download an app folder, drag it to your appliactions folder. go.


You forgot the part in B) where you search the internet for the home page of the application. Then you read the home page trying to find out how to download it. Once you see the "download" link you go through a couple of pages asking you what version you want and what mirror you want to use. Then, after waiting for the download, you finally start the actual installation.

Whereas with A) it's more like: Open Synaptic, use the search field to find the app faster than you would on the net, install it.

I prefer option A. It's more convenient for me, and the repository-based system has other benefits I'd rather not do without. I can see where you are coming from, but different people prefer different things. I'm just glad the distros agree with me (or rather, I agree with them).

And for the record, it's not the distribution or Linux devs who are stopping app folders from coming to GNU/Linux. They already exist. Nothing stops someone from bundling everything a program needs in a self-contained folder. That's how most of the proprietary apps I use are packaged. Open source devs could do this with their programs too, but it would be more effort without much benefit when the distros are going to package it anyway.

Re:Standards wont make a difference (4, Insightful)

asuffield (111848) | more than 8 years ago | (#15180114)

Nothing stops someone from bundling everything a program needs in a self-contained folder. That's how most of the proprietary apps I use are packaged. Open source devs could do this with their programs too, but it would be more effort without much benefit when the distros are going to package it anyway.

Actually, it's not because it's more effort. It's because it is fundamentally a bad idea.

If you bundle everything you need into one blob for each application, then suddenly your system has installed several hundred copies of gtk, all at different versions. Obviously this is quite wasteful of space, but even that is not the real problem. This is:

A security advisory was just released for all copies of gtk before a given version.

What exactly do you do now? You don't know which of your hundreds of applications has got that code included in it. Even if you could figure it out, you now have to either rebuild all of those by hand (if you can), or go to each individual upstream developer and download an updated version from them. If you're a desktop user then you probably aren't going to get this done, so you'll be running with known security holes in some applications. If you're a sysadmin then you're probably going to find a new job.

I would say that the ability to install security updates in a reasonably painless and secure manner is one of the most fundamental tests of any distribution method. Applications-as-self-contained-blobs fails it badly.
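The audit problem above can be made concrete with a small sketch (the application names, library versions, and the "fixed-in" version are all made up; the naive string comparison is only valid for versions of this shape). With one shared copy the advisory is one decision; with bundles it becomes a per-application hunt:

```python
# Contrast of the two models using hypothetical apps and versions.
FIXED_IN = "2.8.3"  # made-up gtk version that closes the advisory

# shared-library model: the distro tracks exactly one copy
shared_gtk = "2.8.1"

# bundle model: every app ships its own copy, and the versions drift
bundles = {
    "chat-app": {"gtk": "2.6.10"},
    "editor":   {"gtk": "2.8.3"},
    "player":   {"gtk": "2.8.0"},
}

def vulnerable_bundles(lib, fixed_in):
    """Every bundled app still carrying a pre-fix copy of `lib`."""
    return sorted(app for app, libs in bundles.items()
                  if lib in libs and libs[lib] < fixed_in)

# shared model: one comparison, one upgrade fixes every application
print(shared_gtk < FIXED_IN)
# bundle model: each hit needs its own rebuild or upstream re-download
print(vulnerable_bundles("gtk", FIXED_IN))
```

The sketch also shows the failure mode the comment warns about: any app missing from the inventory (or shipping the library under another name) is simply never found.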

Re:Standards wont make a difference (1, Insightful)

baadger (764884) | more than 8 years ago | (#15180147)

For the most part, most Windows apps bundle all the DLLs they need to run with the installer. This is inefficient when it comes to bandwidth, but it isn't necessarily the mess you suggest it is, if the developer has done their job correctly.

For a start, these DLLs should be installed into a shared location (Common Files, or the System folder). Secondly, most installers now warn and ask when you are about to overwrite a file that is newer than the version being installed, and all is well.

I don't see how you can claim Linux distributions are any better, with the likes of "library.so.5" and "library.so.6" and the related symlink mess that often entails. I can't even count how many times I've seen people have library issues on Linux community forums and the like, where the solution was to move a library here or add a symlink there.
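For what it's worth, the symlink chains follow a deliberate convention (the shared-library soname scheme), not random mess. A sketch that recreates one chain in a temp directory, using a made-up library name purely for illustration:

```python
# Recreate the usual soname symlink chain for a hypothetical libfoo.
import os
import tempfile

d = tempfile.mkdtemp()

# the actual file carries the full version
real = os.path.join(d, "libfoo.so.6.0.2")
open(real, "w").close()

# soname link: what programs built against major version 6 load at runtime
os.symlink("libfoo.so.6.0.2", os.path.join(d, "libfoo.so.6"))

# dev link: what the linker resolves -lfoo against at build time
os.symlink("libfoo.so.6", os.path.join(d, "libfoo.so"))

print(os.readlink(os.path.join(d, "libfoo.so")))    # libfoo.so.6
print(os.readlink(os.path.join(d, "libfoo.so.6")))  # libfoo.so.6.0.2
```

The point of the indirection is that a libfoo.so.5 can sit alongside libfoo.so.6, so old binaries keep working while new ones pick up the new major version; the forum "fixes" usually amount to repairing a chain like this by hand.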

Re:Standards wont make a difference (2, Informative)

baadger (764884) | more than 8 years ago | (#15179999)

I agree.

You know, sometimes I wish I could just go to Help -> Check for Updates in Firefox on Linux as easily as I can on MS Windows. It's laughable that the most well known of open source software doesn't function as seamlessly on an open source operating system as it does on a proprietary Microsoft one.

Hell, if my repository doesn't have the latest version of Opera (it doesn't), I say sod it and get it from the source, run Opera's 'install.sh' and I'm happy if it works (it does). Yet there's no safe way to uninstall or manage that installation thereafter.

Microsoft's registry and filesystem arrangement isn't as pure as us geeks would like: everything thrown in a single 'Program Files' folder, the start menu and registry practically pissed upon, user documents stored in a subfolder of the user profile and settings folder on the same partition as the operating system, etcetera, etcetera.

The fact is, though, on Linux you're forced to engage with the community to get what you want in the repositories, rely on using the distro flavour of the month to get the best choice, or get down and dirty with configure, make and the filesystem yourself. Some people never want to have to do *any* of that, and they shouldn't have to. How anyone can claim Linux will ever make it to the average mom's desktop without constant nannying by a geek (and yes, lots of Windows users struggle by without one, and the spyware awareness situation is improving), unless they address these issues, is just funny.

Re:Standards wont make a difference (1)

NorbrookC (674063) | more than 8 years ago | (#15180044)

You know sometimes I wish I could just goto Help -> Check for Updates in Firefox on Linux as easily as I can on MS Windows.

It's not just checking for updates. I've found it's often much faster and simpler to use the Windows version of some OSS projects for evaluation. Yeah, yeah, yeah, I know, people are going to tell me how "simple" it is to do this with Linux and give me a set of procedures for their distro. What I want is to be able to download it, install it, and go. I don't want to be downloading distro-specific libraries, etc., etc., etc., particularly when all I'm trying to do is see if application X has the promised features/capabilities I need, and then have to reverse the procedure if it doesn't.

Re:Standards wont make a difference (1)

init100 (915886) | more than 8 years ago | (#15180053)

You know sometimes I wish I could just goto Help -> Check for Updates in Firefox on Linux as easily as I can on MS Windows. It's laughable that the most well known of open source software doesn't function as seemlessly on an open source operating system as it does on a proprietary Microsoft one.

This is because the "Check for updates" in Firefox relies on a well-known security flaw in Windows, namely that every user is running as an administrator, and thus has the authority to modify system files. In Linux, since you are not browsing the web as root, it is perfectly understandable that you cannot update the systemwide installation of Firefox from the browser window. By the way, package management systems would be screwed up if files they "own" were suddenly replaced by other files of (to them) unknown origin. What are they supposed to do if the user tries to uninstall the original package?
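The ownership conflict in the last point can be sketched in a few lines. The file path, package name, and file contents below are hypothetical; real package managers (rpm, dpkg) keep a genuine file-to-package database with recorded checksums that a verify pass compares against the disk:

```python
# Toy package database: each installed file maps to its owning package
# and the checksum recorded at install time.
import hashlib

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# what the package manager installed (contents stand in for a real binary)
disk = {"/usr/bin/firefox": b"official build 1.5"}
db = {"/usr/bin/firefox": ("firefox", checksum(disk["/usr/bin/firefox"]))}

# an in-app updater now replaces the file behind the manager's back
disk["/usr/bin/firefox"] = b"self-updated build 1.5.0.1"

def verify(path):
    """Return (owning package, whether the on-disk file still matches)."""
    pkg, recorded = db[path]
    return pkg, checksum(disk[path]) == recorded

print(verify("/usr/bin/firefox"))  # the manager owns a file it didn't write
```

At uninstall or upgrade time the manager hits exactly this mismatch and has no good option: silently delete the user's self-updated file, or leave an orphan behind.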

Re:Standards wont make a difference (1)

baadger (764884) | more than 8 years ago | (#15180265)

The administrator privileges dilemma is a good point. I'm absolutely infuriated by the way Windows forces me to have TWO administrator accounts on my machine, because you need another one in addition to the 'hidden' default 'Administrator' account. You cannot downgrade your user account to Limited User if it is the only account bar 'Administrator'.

There are advantages and disadvantages to both the repository and proprietary-installer methods. With repositories I know that at least I will get updates eventually during a sync. On Windows I always know I can have up-to-date versions NOW, straight from the vendor, as long as I remember to check for updates myself... or use the application's built-in update checker (which doesn't require administrator privileges).

Perhaps Microsoft should implement a 'check for updates' API that proprietary vendors can use to deliver update *notifications*, as a sort of plugin to Add/Remove Programs and Windows Update. Call it Application Update or something. That would certainly solve a lot of problems with updating on Windows.

Re:Standards wont make a difference (2, Informative)

zerblat (785) | more than 8 years ago | (#15180131)

You know sometimes I wish I could just goto Help -> Check for Updates in Firefox on Linux as easily as I can on MS Windows.
If you're using Ubuntu, Update Manager [osnews.com] will take care of the updating for you. You don't even have to ask it to check for updates, it does that automatically and notifies you [ubuntu.com] if there are any updates. Plus, it works the same for all of your software, not just one application.

Other distros have similar things.

Re:Standards wont make a difference (3, Informative)

javanree (962432) | more than 8 years ago | (#15180022)

This might have been an issue years ago, but these days there isn't any serious "dependency hell" anymore. Tools like yum sort that out. As long as you pick a sane combination of repositories, things will "just work".
For Fedora (the only one I'm familiar with), there's freshrpms, Dag and a few others that work great. For the distro I use (CentOS) I maintain my own repository, so all other users just have to click to get what they need.
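The dependency resolution that tools like yum perform can be sketched as a walk of the dependency graph. This toy resolver (package names invented; it ignores version constraints and circular dependencies, which real tools must handle) computes an install order where every dependency lands before its dependents:

```python
def install_order(package, deps):
    """Return packages in an order where every dependency precedes its
    dependents, the core of what a resolver like yum or apt computes."""
    order, seen = [], set()

    def visit(name):
        if name in seen:
            return
        seen.add(name)
        for dep in deps.get(name, ()):
            visit(dep)          # install dependencies first
        order.append(name)

    visit(package)
    return order

# Hypothetical dependency graph (invented names, not real repo metadata):
deps = {
    "mplayer": ["libmad", "gtk"],
    "gtk": ["glib"],
    "libmad": [],
    "glib": [],
}

print(install_order("mplayer", deps))  # -> ['libmad', 'glib', 'gtk', 'mplayer']
```

The user only ever asks for "mplayer"; the depth-first walk is what turns that one click into the full download list.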

And if you want one-click install, have a look at Klik, which is already available for many distros. Although I personally prefer RPMs (since they're easier to clean up/upgrade), it's a good idea for novice users.

Things like LSB and freedesktop ARE making a difference, although some of it might not (yet?) be visible on the surface.

Re:Standards wont make a difference (3, Informative)

cozziewozzie (344246) | more than 8 years ago | (#15180076)

Repository based installation is NOT the way to go. Autopackage is just a pretty frontend around the same problem. Until we can install and remove applications as easily as OSX users can, we don't stand a chance.

We can do this already: Klik [atekon.de]

The problem is that you end up with 200 versions of the same libraries, and the resulting memory and disk space overhead.

That's why this sort of installation is generally used for easy testing of things rather than as a sane installation procedure.

Linux bundle files (1)

wysiwia (932559) | more than 8 years ago | (#15180219)

B) download an app folder, drag it to your applications folder. go.

I don't remember when I suggested on the Linux kernel mailing list that bundle-file support be added to the kernel, but it must be at least 4-5 years ago. Since nobody recognized its value then, I haven't insisted.

I hope I don't have to repeat the same thing in a similar fashion in another few years about how to make the Linux desktop successful.

See http://lxer.com/module/newswire/view/54009/index.html [lxer.com]

Re:Standards wont make a difference (1)

swillden (191260) | more than 8 years ago | (#15180229)

If you were a new user to unix, what would you prefer:
A) open synaptic, search the thousands of packages, hope you find what you're after, install it.
B) download an app folder, drag it to your applications folder. go.

I have both Macs and Debian Linux boxes, and (A) wins, hands down, almost every time. It is *much* easier for me to find and install software on Linux than it is on a Mac. Uninstallation is a breeze, too. The "drag-to-install" idea is nice, but to make it really nice Apple needs to add a central repository of software that I can easily search and then install from (drag 'n' drop or otherwise).

OS X needs synaptic.

Re: I'll take A! - Repositories are The Answer (2, Interesting)

anon mouse-cow-aard (443646) | more than 8 years ago | (#15180232)

If you were a new user to unix, what would you prefer: A) open synaptic, search the thousands of packages, hope you find what you're after, install it. B) download an app folder, drag it to your applications folder. go.

Download from where? How did you find the place to download from? You did a Google search to find it first; on what term? The name of the software? The purpose? If the first step is instead a search in synaptic (btw, Adept in Ubuntu is nicer), which looks only at packages for your distro, version, and architecture, it is far simpler. And from then on, all your security updates are applied automatically, so the user does not have to worry about chasing the latest and greatest.
  • A) is more:
    1. Start up a software repository browser. (Click'n'run, Adept, Synaptic, whatever)
    2. Do a reasonable search, to find the package, read among a few descriptions.
    3. click on the install button. (which downloads and installs.)
  • B) is more like:
    1. google search to find the web site of the software.
    2. figure out where the download link is.
    3. Try to figure out which package is correct for the computer (This step completely defeats most ordinary folks.)
    4. Download it somewhere... (OK, it's on my desktop, now what do I do with this package thing?)
    5. drag it into the applications folder (That assumes that you know what an applications folder is... again, inexperienced people will not know or worry.)

Folks hear about downloading, and expect to download; application developers find packaging a pain, a barrier to distribution. But once people look at it critically, it is really about what people are used to, not about what works better. Downloading random packages off the net is a bad idea on any OS. Getting supported packages from a repository that tracks your OS is the right idea. Vendors of proprietary software should simply provide repositories for distros that allow this kind of automatic updating (and the good news is that many already do).

People who say that repositories are not up to date are being unreasonable. Most people want software that has undergone some testing, want software to update itself automatically once it is installed, and want the correct version for their system chosen automatically (asking people to be able to answer the question "on glibc 2.1 based distributions..." is too much). If the software provider cannot find the time to do proper packaging, and will not arrange for updates to be easy when there are security fixes or improvements available, then you should not install the software unless you are prepared to do that sort of support on your own. That is a choice most people do not think about.

Making repositories easier to deal with is the thing to concentrate on. For example, a missing piece right now is an XML "download selector": a file containing a list of repositories for various distros, which frontends for apt/yum/whatever could download and use to automatically select the appropriate repository for a given distribution. ISVs would just create the XML file (and the requisite repositories behind it), and the whole manual download/install process would disappear. That would be a big end-user improvement with only a small change to existing tools.
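A sketch of how such a selector file and the frontend logic might look (the XML schema, attribute names, and URLs here are all invented for illustration; no such standard exists):

```python
import xml.etree.ElementTree as ET

# Hypothetical selector file an ISV might publish alongside its repositories:
SELECTOR = """
<selector app="someapp">
  <repo distro="fedora" version="5" url="http://example.org/yum/fc5"/>
  <repo distro="ubuntu" version="6.06" url="http://example.org/apt/dapper"/>
  <repo distro="debian" version="3.1" url="http://example.org/apt/sarge"/>
</selector>
"""

def pick_repo(xml_text, distro, version):
    """Return the repository URL matching this distro/version, or None."""
    root = ET.fromstring(xml_text)
    for repo in root.findall("repo"):
        if repo.get("distro") == distro and repo.get("version") == version:
            return repo.get("url")
    return None

# The frontend detects the running distro and picks the right entry:
print(pick_repo(SELECTOR, "ubuntu", "6.06"))  # -> http://example.org/apt/dapper
```

The user would hand their package frontend one URL; the frontend would fetch the selector, add the matching repository, and install and update the application through the normal mechanism from then on.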

Why doth the rumours continue? (4, Insightful)

Psychotria (953670) | more than 8 years ago | (#15179979)

Unfortunately, those added software libraries differ among Linux distributors, making it hard to know if an application like a word processor will function on a particular Linux computer.

What a load of rubbish...

When I read a comment like this, I have to question a) the qualifications of the article author; and/or b) their motives. Any assertions made in the article need to be critically examined and their validity questioned after such false hoohah.

Re:Why doth the rumours continue? (1)

Shivetya (243324) | more than 8 years ago | (#15180240)

Don't take the word "function" too literally. This could be as innocuous as "does it look right?". Akin to running a Windows 3.1 version of software on XP: it might work fine, but it certainly doesn't look or "feel" the same.

The motive is to better the chances of Linux getting on the desktop. This means that you're going to have some badly worded observations that say one thing to one group and something else entirely to another.

It is the casual user who controls the market. They buy what is sold to them, and what is sold to them is the simplest software at the lowest price point. Windows fills this bill because, for the most part, it's insanely easy to use for the common user. Most will have a passing familiarity with it from work or school. Throw them on OS X and I bet a few would be confused initially but would adapt pretty quickly. Throw them on Linux and they should adapt as well, until they sit down at another Linux system using a different desktop, and then they will be confused as to what Linux is.

If standards like these are not in place it will forever be "vendor's Linux" or some marketing-gimmicky name. Eventually people may restrict their development to a particular vendor and their extensions. It's best to set the ground level and keep developers and users focused there. Otherwise you will continue to be fractured and just meander in the low percentages of users.

ONLY a few percent? (5, Insightful)

penguin-collective (932038) | more than 8 years ago | (#15180021)

A few percent desktop marketshare is what Macintosh has. Seems to me that the "fractured" Linux desktop is doing pretty well already.

What is this article talking about? (0)

Anonymous Coward | more than 8 years ago | (#15180041)

Even on MS Windows, apps break on library updates (SP2, anyone?). I'm conflicted: I refuse to install PAM and other such garbage, and the good thing about Linux is that I don't have to. OTOH, while I can't stand design by committee, this may be worthwhile if it stops KDE/GNOME apps from requiring so many libs and daemons to run under usable DEs.

I see more flying chairs... (1)

HangingChad (677530) | more than 8 years ago | (#15180061)

...in Redmond tonight.

As a devout Linux desktop user... (1)

STDOUBT (913577) | more than 8 years ago | (#15180070)

I have to say I don't care.

I don't care about all my apps having
a unified 'look 'n feel' (boring!)
I don't care there's no standard 'base-system'.
I like the variety!
Little things like cut/paste, I can work around.

If the home user wants to run Linux, let them
take the time to learn what it's all about.
Efforts like Ubuntu, which appear to aim at dumbing it all down for Joe Sixpack, are IMHO rather insulting. Fine, make your easy-peasy, unified system. I'll keep the chaos/versatility, thank you very much.

Discussion.. (1)

kahrytan (913147) | more than 8 years ago | (#15180085)

I've got two questions for all of you Slashdotters.

What will it take for Linux to become a mainstream desktop operating system? It's the billion dollar question.

To help with that question, ask yourself this.

What do 97% of all computer users do on their computers?

1. They research information for school.
2. They talk to their friends via AIM, Yahoo, Google Talk, Trillian, etc.
3. They send emails to their friends and coworkers.
4. They use it to play games.
5. Watch or listen to broadband content (movies, music, TV).

Linux needs an open-source standard for dealing with ALL graphics cards, plus a basic way of rendering graphics.

Linux applications need a good commercial look. GAIM looks too ugly for a mainstream desktop.

Re:Discussion.. (1)

DementialDuck (970026) | more than 8 years ago | (#15180157)

First of all, I'm from Chile and my English may be a little poor, so sorry if it sucks.

Yes, they need very good support for all the IM protocols, including all their features. I agree with you about GAIM. I'm an MSN Messenger user, because it's popular here in Chile. aMSN sucks: it's easily three times slower than MSN Messenger on Windows, its emoticons suck, and its webcam support is very poor.

I think Linux needs a very good HAL, and if possible a HAL that supports Windows drivers; that way I wouldn't have to worry about whether I'm on Windows or Linux. Many people may think "but this is imitating Windows", but Microsoft imitated Apple and Apple imitated Xerox. Good ideas deserve to be copied. The GUI models from Windows and Mac OS X, and the associated software, are a very good base to start from.

About standards, yes, that's a good idea. That way we can attract hardware developers by supplying an API that is much easier to use, which would bring great savings in operational costs. It's the same for the desktop: a good standard for applications saves time in strategic decisions. When a manager decides to make software for Linux, he loses time deciding which graphical API to choose, GTK or Qt? That involves a research cost just to help the manager choose. Microsoft has only one API, one desktop, one IDE, one suite. Hardware vendors choose Microsoft because for them it's a de facto standard.

Applications (1)

walterbyrd (182728) | more than 8 years ago | (#15180168)

Nobody runs an OS just to run the OS. It's all about the apps.

It doesn't take much, just one missing killer app, to sink Linux as a desktop candidate. That app could very well be a game.

As to your list of "What do 97% of all computer users do on their computers?": you seem to refer only to home users. Business users are a huge part of the desktop market. IMO, Linux fails even worse in the business sector. I know about OpenOffice, but there is *much* more to it than that. There are thousands of third-party apps that just don't run on Linux.

Re:Discussion.. (1)

STDOUBT (913577) | more than 8 years ago | (#15180174)

What will it take for Linux to become a mainstream desktop operating system? It's the billion dollar question.

It will take an Ubuntu-like distro with a name that isn't goofy, that comes loaded with all the multimedia codecs ready to go, along with a corporation to provide hand-holding... Oh, and a nice fat price tag so sheeple will think they're actually exchanging value for value. Personally, I want Linux to flourish in *business*. I use Linux 100% at home, and I honestly don't care if it ever goes "mainstream".

Linux needs an Open Source Standard in dealing with ALL graphics cards. Plus, a basic way in rendering graphics.

It's called VESA.

Linux applications need a good commercial look. GAIM looks to ugly for mainstream desktop.

If you say so. But speaking strictly for myself, "commercial" is not a quality that I long for in my Personal Computer.

Re:Discussion.. (0)

DementialDuck (970026) | more than 8 years ago | (#15180198)

The discussion is about how to make Linux take a bigger piece of the desktop market pie. If you don't care about that, good for you.

VESA sucks; it doesn't support 3D features.

Your comment doesn't offer an answer.

If we want to make Linux popular on the desktop, it's a very good idea to first study what people like. In all software design, the requirements come first.

Re:Discussion.. (1)

broeman (638571) | more than 8 years ago | (#15180258)

Go ask that of Novell, Red Hat and the other commercial Linux distributions, but what has Linux (or slashdot.org) to do with this?

How is it difficult to do tasks 1, 2, 3, 4 and 5 in a Linux distribution?

I guess Linux has a standard for graphics cards, or it would be hard for Nvidia or others to make drivers for the kernel.

Why don't you help make graphics for GAIM and others if they don't look good?

If some distributions want to have standards, fine with me, but this has nothing to do with hackers (apparently some Slashdotters too) who like Linux for its freedom. Call me an ideologist if you want, but this is why I chose Linux.

Others choose Linux because they want a gratis alternative to Windows. This is not what most of the community, including the developers, focus on, so of course those people are annoyed and even frustrated. Somebody must have told them that they can get a gratis OS with free support: "Just download it from their site, ask questions on IRC/forums and you save a lot of money!", "They do it for free! Use them! They don't care", "Just annoy the hell out of them and they will answer all your questions!" (which I read in comments some days ago here on Slashdot).

I have seen many people leaving communities because they got abused again and again by freeloaders. It is really sad :(

My main problem with Linux (1, Troll)

Bin Naden (910327) | more than 8 years ago | (#15180267)

Linux is a great operating system. The main problem with its adoption in the mainstream is all the small things that make its use difficult out of the box, and how difficult most installation on Linux is. Here is a list of problems I have encountered with Linux, pretty much in the order I encountered them when I started dicking around with it:

1. Oops, you have no hard drive! I had a SATA hard drive and the distribution I was attempting to install (Mandrake Linux) had no support for it. After trying a few distributions, only Fedora at the time seemed to support that hardware.
2. Hanging as the installation starts, caused by my newish graphics card. Therefore, I had to go through the text installation mode. Hooray! Linux is installed!!!! WOOOOTTT!
3. Horror! My nice graphical interface hangs at startup! That will teach me to have an ATI graphics card. After dicking around with the ATI drivers for a few days, I finally managed to make it work.
4. Oh noes, now my USB optical mouse doesn't work, and I have to find out how to get it working. After much googling, I find out that I must make some changes to the xorg.conf file, and I therefore happily go twiddle with the settings in there.
5. Yippee, my mouse works and I have a nice graphical user interface, now let's listen to some MP3s. First, let's mount the NTFS drive that contains all my MP3s... NOOOOO!!! NTFS is not recognized :(. At this point, I got fed up with Fedora Core and didn't touch it for a few months.
6. On the recommendation of my friend, I install Gentoo Linux, follow all the instructions with a custom kernel, and reboot.
7. Oh noes! Horror, I have no hard drive. After much googling, I find out that SATA hard drives are under the "SCSI" category in the kernel options.
8. No networking now! I must rebuild my kernel with the reverse-engineered nForce 4 driver!
9. Now, do an "emerge kde-meta" and wait.
10. 10 hours later...
11. Yeah, I can actually mount an NTFS hard drive now. And I can play MP3s too. Now let's go check out those video clips on launch.com...
12. Try to get the Totem plug-in to work in Firefox. It finally sort of works, but it's ugly and launch.com doesn't work anyway. But at least I can now watch my porn. Phew...
13. After a session of "relaxation", let's now try to install Half-Life 2 on this machine... I won't even go into how much effort I put into this; I never got it to work anyway. Following that, I put Linux aside for a couple of months and decided to try again some other time.
14. I got sick of ATI being so buggy in Linux and causing my system to crash after every logout, and shelled out the cash for an NVIDIA graphics card.

So the biggest problems for the adoption of Linux are bad driver support (which Ubuntu has fixed to a big degree, though it still never got my graphics card right) and bad multimedia and gaming integration with the OS. The driver situation is fixable eventually if you tweak enough, but it is inexcusable that an operating system gives you so much trouble when trying to play multimedia from the internet. I want to be able to watch my video clips on launch.com, dammit! The issue of gaming is being somewhat worked out with Cedega and Wine, but both are very buggy and iffy solutions. I got Deus Ex to work on my computer with Wine, but all newer games refuse to install or hit bizarre errors. Of course, if there is already a way to get all this to work in Linux, let me know.