
A New Kind of OS

ScuttleMonkey posted about 8 years ago | from the dreams-and-things dept.


trader writes "OSWeekly.com discusses the possibilities of futuristic OSes, both negative and positive. From the article: 'Imagine if you will, a world where your ideas and perhaps, even your own creative works became part of the OS of tomorrow. Consider the obvious advantages to an operating system that actually morphed and adapted to the needs of the users instead of the other way around. Not only is there no such OS like this, the very idea goes against much of what we are currently seeing in the current OS options in the market.'"


393 comments


It's like nothing we've seen .. since Linux (5, Insightful)

ackthpt (218170) | about 8 years ago | (#15997813)

Consider the obvious advantages to an operating system that actually morphed and adapted to the needs of the users instead of the other way around. Not only is there no such OS like this, the very idea goes against much of what we are currently seeing in the current OS options in the market.

I don't know about the parent, but when I build a kernel I don't just default to everything. I build for what I'll need. If that changes significantly then I'll do another with different options and settings.

While it may seem novel to "morph" to what's currently needed, it's not really so revolutionary an idea. It once was that operating systems cleared unused libraries out of memory (rather unlike the way Windows behaves, loading 385 MB of junk you just might need during a session) and dynamically adjusted the amount of processor priority and time (Priority and Run Burst) each task was assigned, depending upon system load, etc. Some things appear to have gone backward as we've grown more dependent on ooh, shiny! features, bells and whistles.

Maybe, like NASA digging up records of how they once did the Apollo Moon missions in order to relearn them, it's time for some of the people who build operating systems today to look back at how we did things 20-30 years ago.

Re:It's like nothing we've seen .. since Linux (3, Insightful)

Sinryc (834433) | about 8 years ago | (#15997850)

Normal people can't do that. I can't program worth shit, and I don't even know how to mess with the Kernal. They mean an OS that changes with you, without you having to do it with coding. If Linux could do that, it would be MUCH better.

Re:It's like nothing we've seen .. since Linux (3, Funny)

Anonymous Coward | about 8 years ago | (#15997937)

I can't program worth shit, and I don't even know how to mess with the Kernal[sic].
Not only do you not know how to "mess with it," you don't even know how to spell it.

Re:It's like nothing we've seen .. since Linux (5, Funny)

Sinryc (834433) | about 8 years ago | (#15997946)

See? It needs to be made easier?

Re:It's like nothing we've seen .. since Linux (5, Insightful)

radarsat1 (786772) | about 8 years ago | (#15997959)

Normal people can't do that. I can't program worth shit, and I don't even know how to mess with the Kernal. They mean an OS that changes with you, without you having to do it with coding. If Linux could do that, it would be MUCH better.


You know, as a programmer, I get really tired of people suggesting ways to program computers "without doing any coding". That's where BAD things come from. That's where "dynamically hiding menu items" comes from, so you never know where things are. That's where "visual programming" comes from, so you're staring at a screen full of boxes and lines with little to no organizational structure.

No. If you're gonna program a computer, learn how to program. The CS field as a whole apologizes for the fact that computers are hard. They are complex machines. Unfortunately it is not always easy to get them to work the way they should, or the way you want them to. But that's life. If you're not willing to learn how to program, you should be willing to learn how to use what other people have programmed, or learn how to write specs and make intelligent suggestions to the community. But this bullshit about "intelligently adapting the OS to a user's needs" is just asking for trouble. It's asking for "programming" without actually asking for any "design" or "specifications". It will end up being crap.

The fact is, making something "user friendly" means making the front-end simpler -- and thus making the back-end more complicated. But this complexity always compounds and compounds until the end user can't understand what's happening and gets confused. In the end, we learn that computers are easier to use if you understand the back-end, and that can only happen if you use a minimum of metaphor. That is, a straightforward system that is obvious and transparent.

The mistake that Windows and many GUI systems have made is in trying to HIDE the system in metaphor. It always backfires, because although a transparent system may be harder to learn, it is far, far easier to deal with once the learning curve has been climbed. And since we've discovered that even the simplest metaphoric GUI requires "training", well.. you may as well train the end user how it actually WORKS instead of trying to hide it from them in a bubble of "interface".

Of course, that's just MHO. Though I believe Neal Stephenson [cryptonomicon.com] agrees with me.

(My apologies to the parent. My comments aren't really directed at you, per se, I just get tired of people suggesting that computer programming should be effortless. Computer using should be easy, but programming is programming, if you know what I mean.)

What hogwash (5, Insightful)

ClosedSource (238333) | about 8 years ago | (#15998006)

A CLI is no more "the system" than a GUI; it's just another abstraction. Most black-and-white movies were made that way because it was the best that could be done, not because the filmmaker thought it was more artistic. In a like manner, most OSes of the '70s used a CLI not because it was a "minimum metaphor", but because it was the only practical option at the time.

Re:What hogwash (4, Insightful)

radarsat1 (786772) | about 8 years ago | (#15998093)

I'm sorry, what? Where did I mention anything about the CLI?
I certainly did not mean to imply that there's anything wrong with a GUI. But there IS something wrong with dynamically hiding parts of a GUI based on some unspecified learning algorithm.
Do you understand what I mean?
Computers should be transparent and obvious, THAT is what makes them easier to use, not artificially messing with the interface to pretend the "hard parts" don't exist. But that doesn't mean we shouldn't be able to use the mouse to interact with them. It just has to be designed well -- meaning everything accessible in a logical manner, whether it is through the keyboard or the mouse.

Re:What hogwash (2, Interesting)

jpardey (569633) | about 8 years ago | (#15998095)

Actually, it is more of the system than a GUI, in most cases. It is closer to the lowest common denominator than a GUI is. Making a flexible CLI program is easier than making a flexible GUI program, simply because the GUI gets exponentially more complex the more you try to do with it. So technically they are the same, but practically the CLI will win.

Re:It's like nothing we've seen .. since Linux (1)

19thNervousBreakdown (768619) | about 8 years ago | (#15998014)

No mod points, I refuse 'em, but right now I wish I had them all so I could mod this up to the front page. You've summed up exactly the way I feel about learning, using, and programming computers, and I only hope more people get it like you. My hat's off to ya.

Re:It's like nothing we've seen .. since Linux (0)

msloan (945203) | about 8 years ago | (#15998020)

I think programming with boxes and lines is just another way to program. You're right, it doesn't really make it much easier. However, it might be better for more visual people. There is a game called Mindrover where you wire together logic components to make a robot's brain. Pretty cool stuff, really. Box-type data pipelines are ideal for parallel processing, and visual programming gives a much better representation of this than text does.

I think that for some difficult programming tasks (namely those where you can automatically ascribe a numerical fitness to a solution), new paradigms will make them near effortless in comparison. I really like your description of the problems of GUIs. I think a GUI can and will be vastly superior to a command line in nearly all tasks; we just haven't done the right things yet. If the backend is intuitive, the frontend can be elegant. Also, as far as the Neal Stephenson thing goes, Apple had a GUI well before Microsoft.
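The box-and-wire pipelines mentioned above map naturally onto a dataflow model: each box is a function, each wire an edge. A toy evaluator, with box names and the wiring format invented purely for illustration:

```python
# Toy dataflow evaluator: run each "box" once all of its inputs are ready.
# Boxes with no dependency between them (here "double" and "inc") could
# in principle be dispatched in parallel -- the point the parent makes.
def run_pipeline(boxes, wires, inputs):
    """boxes: name -> function; wires: (src, dst) edges; inputs: name -> value."""
    values = dict(inputs)
    resolved = set(inputs)
    while len(resolved) < len(boxes) + len(inputs):
        for name, fn in boxes.items():
            if name in resolved:
                continue
            deps = [src for src, dst in wires if dst == name]
            if all(d in resolved for d in deps):
                values[name] = fn(*(values[d] for d in deps))
                resolved.add(name)
    return values

boxes = {"double": lambda x: x * 2, "inc": lambda x: x + 1, "add": lambda a, b: a + b}
wires = [("in", "double"), ("in", "inc"), ("double", "add"), ("inc", "add")]
result = run_pipeline(boxes, wires, {"in": 5})   # → {"in": 5, "double": 10, "inc": 6, "add": 16}
```

A visual environment would draw the same graph on screen; the execution model underneath is no different.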

Re:It's like nothing we've seen .. since Linux (2, Insightful)

soft_guy (534437) | about 8 years ago | (#15998028)

If it was really effortless, I guess we would be out of a job.

I have seen some of these things come and go. I remember when VB6 came out and there was a lot of talk about how it would be the end of C++. After all, why ever write an actual Win32-based application when it is easier to just crank something out in VB in less time?

At the time, I remember some Windows C++ guys who I worked with being all like, "I guess I will have to find another career because I really don't want to be a VB programmer".

Well, it didn't happen.

As for this kind of statement - that there will be some revolutionary thing whereby computers do new things they didn't do before, without having to be programmed - if you can really do it, then more power to you, but my guess is that it just won't be possible.

Re:It's like nothing we've seen .. since Linux (5, Insightful)

ScuzzMonkey (208981) | about 8 years ago | (#15998038)

I don't disagree with you entirely, but you can certainly understand that the line between using and programming has become blurred over the years, and not always with such negative outcomes. After all, in the beginning, everything was programming. Your argument could have been applied to someone just wanting a simple word processor back in the punch-card and teletype days.

Things have obviously changed quite a bit; you don't have to be a programmer to get WYSIWYG editing and print output anymore. It may not seem like it from here, but there are probably a lot of functions that most people consider "programming" that will fall into the same category at some unspecified point in the future. All programming does is interface with the machine at a slightly more complex level than the average user does. We're just talking about improving the interface to the point where some things, which now require "programming", will simply be "using" instead... and programmers will move on to more complicated arenas.

Macros or mail filters or Netflix's recommendation system are all ways that average users basically program computers today without any hardcore CS education. Ten or twenty years ago, they would have required such a background to accomplish the same tasks, but no one really considers it "programming" today; there is no reason that many other functions that we currently think of as programming won't become similarly easy or transparent.
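The mail-filter point above can be made concrete with a toy rule engine of the kind most mail clients hide behind a dialog box; the rule format and field names here are invented, not any real client's:

```python
# Toy mail filter: each rule is (field, substring, folder); first match wins.
# This is exactly the "programming" an average user does through a dialog.
def route(message, rules, default="Inbox"):
    for field, substring, folder in rules:
        if substring.lower() in message.get(field, "").lower():
            return folder
    return default

rules = [
    ("from", "boss@example.com", "Urgent"),
    ("subject", "newsletter", "Reading"),
]
msg = {"from": "noreply@list.example.com", "subject": "Weekly Newsletter"}
folder = route(msg, rules)   # → "Reading"
```

Nobody who drags those rules together in a preferences pane thinks of it as writing a program, which is the parent's point about the moving definition.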

There will always be the wizards responsible for writing the code that puts those things into place, and so that's where I agree with you--if you want to be a coder, go learn to code. In that sense, programming will always be programming, but I think the common definition of the word is a necessarily moving target.

Re:It's like nothing we've seen .. since Linux (1)

timeOday (582209) | about 8 years ago | (#15998036)

They mean an OS that changes with you, without you having to do it with coding. If Linux could do that, it would be MUCH better.
Put some useful meat on that suggestion and you may become a millionaire. "The computer should adapt to the user, not the other way 'round" is not new; the problem is that it's a vague aspiration which has proven difficult to nail down in any useful way. Microsoft's latest products automatically hide menu items unless you use them often, and frankly I hate it.
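The menu-hiding behavior described above reduces to little more than a usage-count threshold; a minimal sketch, with the threshold and item names invented:

```python
from collections import Counter

# Sketch of "personalized menus": show an item only once it has been
# used at least min_uses times; everything else hides behind a "More"
# chevron -- which is precisely why rarely-used items become unfindable.
def visible_items(usage: Counter, all_items, min_uses=2):
    return [item for item in all_items if usage[item] >= min_uses]

usage = Counter({"Save": 9, "Print": 3, "Mail Merge": 0})
menu = ["Save", "Print", "Mail Merge"]
shown = visible_items(usage, menu)   # → ["Save", "Print"]
```

The complaint in the thread follows directly: the items you use least are exactly the ones you can't remember the location of, and this scheme hides them.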

Re:It's like nothing we've seen .. since Linux (1)

Korin43 (881732) | about 8 years ago | (#15998066)

Easier != better. The "automorphing" OS may be decent at guessing what you want, but when you code it yourself, you get exactly what you want.

Re:It's like nothing we've seen .. since Linux (1)

applix7 (998238) | about 8 years ago | (#15997875)

I like the idea of stripping down an OS so that it is just a multiplexer, providing apps with a way of sharing the hardware. In such an OS, this "collaboration" is an application- or library-level phenomenon, not a part of the "OS". And I don't see how a set of office apps, for instance, could be retrofitted (easily) with such adaptive technology. They'd need a rewrite, IMHO.

Also: the author may be onto something, but I'm not sure even he knows what it is. Half the trouble I find with using technology is relearning what goes where. Example: I hear a phone ringing. Was it my landline, my cell, the movie I'm watching, or Skype? Which do I deal with after hearing the sound? A computer won't fix that problem. Technology *could* make things easier, e.g. a soft female voice saying "call on your cell phone"... but no one has thought of such simple solutions yet.

Re:It's like nothing we've seen .. since Linux (5, Funny)

Fordiman (689627) | about 8 years ago | (#15997917)

"It looks like you're trying to write a slashdot post..."

Re:It's like nothing we've seen .. since Linux (5, Funny)

Dhalka226 (559740) | about 8 years ago | (#15997945)

I presume the "Read the article" option would be permanently grayed out?

Re:It's like nothing we've seen .. since Linux (1)

Fordiman (689627) | about 8 years ago | (#15997970)

Probably, but the 'Write Automated Troll' button would be in 32pt Arial Black

Re:It's like nothing we've seen .. since Linux (1)

Kamiza Ikioi (893310) | about 8 years ago | (#15997984)

Virtualization is the key here. You start off with the absolute minimum and work your way up. If everyone did what you did, they would be amazed how powerful their home desktop really is. I use Xen to run highly optimized OSs under my top OS (which is stripped down to just the essential applications I run daily... I run a parent-heavy setup, and use the child OSs like temporary servants who are not allowed to bother "me"). I have one that's specifically a file server (using a TrueCrypt module compiled just for the minimal kernel), one that acts as my firewall/proxy/DNS, and another that acts as a development web server. I have others for general purposes (compiling, backing up, etc.)

The idea of all-in-one, be-everything-to-everybody operating systems died the moment everyone ran into what most of us now recognize: the human bottleneck. The computer I'm on is a several-years-old Celeron, and it can still outpace me, even with my XGL turned up to max and several apps running on the base OS. The key is that it only does so when you tell it what you want to do with it. If you tell it to do everything, of course it will work slowly.

The problem is that ease of use has not caught up with the pace of our new toys for truly managing our system resources. We are not the average user, who is still screaming and kicking because their brand new machine locks up when trying to read email. I think OSs would do so much better to come compartmentalized out of the box. When something is needed, it runs in its own box, and when it is not, it disappears completely. The average Windows user doesn't even realize all of the unnecessary daemons running, or that all those little icons in the bottom corner are sucking up system resources, slowing boot times, and opening them up to crashes, trojans, and headaches.

The day users can boot in, and be asked "What/Where do you REALLY want to do/go today?" is when they'll find a more pleasurable experience. As it is now, the average home user's computer is a car trying to drive in all directions at once, and getting nowhere. But, I guess that's what sells new computers.

Where's the beef? (5, Informative)

AKAImBatman (238306) | about 8 years ago | (#15997814)

Must be a slow news day. I read through the entire article and I didn't find anything substantial. He spends 6 paragraphs on the first "page" explaining how cool (and "weird") it would be to attach adaptive intelligence to our workflow. (His example is, what if the computer knew when NOT to bother you with email?)

He then goes on for another 5 paragraphs just to tell us that Evil Corporations(TM) could misuse the data about our personal preferences against us. (Shocker, isn't it?) So we might as well forget the whole idea, because the Bad Guys(TM) have it in for us.

*Sigh*

I suppose I could plug my own Linux Desktop Distribution of the Future article to fill space and provide something substantive, but then I'd be accused of shameless self-promotion. So instead, I'm going to bed. 'Night all! :)

Re:Where's the beef? (1)

Osty (16825) | about 8 years ago | (#15997844)

Must be a slow news day. I read through the entire article and I didn't find anything substantial. He spends 6 paragraphs on the first "page" explaining how cool (and "weird") it would be to attach adaptive intelligence to our workflow. (His example is, what if the computer knew when NOT to bother you with email?)

You think that's great? You should read the "Opera 8.0 vs. Pocket IE" [osweekly.com] review. Three-quarters of the first page is spent explaining what a web browser and a web server are (in horribly bad terms, no less).

Re:Where's the beef? (1)

gettingbraver (987276) | about 8 years ago | (#15997865)

I read through the entire article and I didn't find anything substantial.
So it wasn't just me. Nite.

Re:Where's the beef? (1)

ig88 (19976) | about 8 years ago | (#15997935)

This article made my brain hurt with its breathtaking inanity.


Other users? (3, Interesting)

PacketCollision (633042) | about 8 years ago | (#15997817)

My major concern with such a system (besides the obvious privacy ones touched on in the article) is what happens when some other user sits at my computer and uses it for a while. Would the "adaptive engine" or whatever be smart enough to figure out that there was someone else there, or would I have to reset my settings and have it relearn everything?

Another interesting aspect would be as a constant check to make sure the allowed user is the one at the keyboard. Different enough input stats, and the password box pops up.

here ya go (0)

Anonymous Coward | about 8 years ago | (#15997876)

Fingerprint reader on the keyboard, tied to the machine with a hash, combined with a cadence reader. Everyone types at their own cadence. I imagine a large enough sample would give you a profile. Profile doesn't match? No input. Wrong keyboard and no fingerprint match? No input. It might even lock the machine automagically, send a text message to you on your 5G cell, and activate the room's halon sprayers...

I like my typing cadence idea, actually - first dibs, copyleft, right and center, prior art, TM, pat. pending and so on. Give me candy! Candy License EULA.

Re:here ya go (1)

unitron (5733) | about 8 years ago | (#15998063)

"fingerprint reader on the keyboard, that is tied to the machine with a hash, combined with a cadence reader. Everyone types at their own cadence...I like my typing cadence idea actually-first dibs, copyleft, right and center, prior art, TM, pat pending and so on. Give me candy! Candy License EULA."

Even assuming that a copyright claimed by an AC would be provable and enforceable, you still have to have a system sophisticated enough not to give false positives when the user is, uh, "practicing one-handed typing".

Re:Other users? (1)

TubeSteak (669689) | about 8 years ago | (#15998043)

what happens when some other user sits at my computer and uses it for a while.
I thought it was fairly simple to "identify" a user by their typing patterns (measuring delays between keystrokes, etc.).

I'm not so sure about mouse usage, but IIRC, you can definitely tell users apart by their typing.

As an aside, I don't think your idea "Different enough input stats and the password box pops up" is terribly feasible. Unless you're going to bug the normal user constantly, anyone could pop in a CD/diskette and escalate their privileges or cause some other mischief without typing a keystroke.
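Keystroke-dynamics identification of the sort discussed above is usually done by comparing inter-key delay statistics against a stored per-user profile; a toy sketch, with the tolerance chosen arbitrarily:

```python
from statistics import mean, stdev

# Toy keystroke-dynamics check: compare the mean inter-key delay of a
# fresh sample against a stored per-user profile.
def build_profile(delays_ms):
    return {"mean": mean(delays_ms), "stdev": stdev(delays_ms)}

def matches(profile, sample_ms, tolerance=2.0):
    # Accept if the sample mean is within `tolerance` standard
    # deviations of the profile mean.
    return abs(mean(sample_ms) - profile["mean"]) <= tolerance * profile["stdev"]

alice = build_profile([120, 140, 110, 130, 125])   # her usual cadence
assert matches(alice, [118, 135, 128])             # similar rhythm
assert not matches(alice, [40, 55, 48])            # much faster typist
```

Real systems use per-digraph timings and far more samples; even so, the mouse-only and no-keystroke attacks raised in the thread remain outside what this can catch.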

It's been done (sort of) (4, Insightful)

TimmyDee (713324) | about 8 years ago | (#15997826)

This sort of "adaptive learning" for applications has already been done, albeit in a limited and utterly frustrating way courtesy of MS Office and their magical hiding menus.

As a Mac user who has to interact with PCs quite often at work, I find this not only not helpful, but completely obnoxious. I realize this is probably due to MS's fairly awful learning algorithm, but I think the lesson here is that it's going to take a long, long, long time before anything like this can make its way to the desktop without pissing off 50% of the users. Or more.

Hate them! Hate them! Hate them! (4, Interesting)

khasim (1285) | about 8 years ago | (#15997852)

This sort of "adaptive learning" for applications has already been done, albeit in a limited and utterly frustrating way courtesy of MS Office and their magical hiding menus.
Yes! And I am somewhat annoyed with them.

One of the FIRST things I do is go and turn off "Use personalized menus".

Hunting for the widget the FIRST time was annoying enough. Why would I want to hunt for it a SECOND time? I have already learned where it is the first time.

Not to mention that I'm usually doing at least 3 different tasks at once.

If you want to improve the OS "of the future", then START with a reduced set of commands and allow the user to choose what level s/he is comfortable with. Do NOT move items once they've been learned.

Morphing menu idiocy. (1)

AReilly (9339) | about 8 years ago | (#15998068)

Hunting for the widget the FIRST time was annoying enough. Why would I want to hunt for it a SECOND time? I have already learned where it is the first time.


It's worse than that. As an infrequent user of MS Office, I've noticed that the time it takes me to correlate the label on a menu item that my mouse is hovering over with what I want to do is just fractionally longer than the time that Office decides is the right time to expand the menu into full glory mode. Bang! The menu item that I then click on is *not* the one that I was hovering over, and something inexplicable happens to my document. (Of course, the menu goes away as soon as you release the mouse button, so you can't even see what option it was that was clicked.)

Bah, humbug. Gimme VIM any day.

Re:It's been done (sort of) (2, Informative)

Jeff DeMaagd (2015) | about 8 years ago | (#15997929)

I think Quicksilver does a pretty good job of learning. It doesn't rule anything out, but allows you to get to programs, address book entries, and some data files with fewer keystrokes. I like how I can type a few characters of someone's name, it figures out who that is, and it opens Address Book to that person's profile once I press Enter to confirm it's the right person. And if you don't like the idea, it's completely optional: you just don't install it.
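Learning of the kind Quicksilver does can be approximated by ranking substring matches by how often each candidate was chosen before; a deliberate simplification, not Quicksilver's actual algorithm:

```python
# Simplified launcher ranking: keep candidates whose name contains the
# typed characters, ordered by how often the user picked them before.
def rank(query, items, pick_counts):
    q = query.lower()
    hits = [i for i in items if q in i.lower()]
    return sorted(hits, key=lambda i: -pick_counts.get(i, 0))

items = ["Address Book", "Adobe Reader", "Activity Monitor"]
picks = {"Address Book": 12, "Adobe Reader": 2}
top = rank("ad", items, picks)   # → "Address Book" ranked first
```

Note the contrast with hiding menus: nothing is ever removed from the result list; the learning only reorders it, which is why this style of adaptation annoys nobody.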

Tedious... (5, Insightful)

applix7 (998238) | about 8 years ago | (#15997836)

The OS is just a hardware multiplexer. Anything above that level is called an application.

the precious (0)

Anonymous Coward | about 8 years ago | (#15997911)

Yes, I agree. "Give it to me raw and wriggling."

Then let me recompile it optimally and write makefiles for all my most popular commands.

Re:Tedious... (1)

Fordiman (689627) | about 8 years ago | (#15997932)

Quite true.

Well, with a couple of major exceptions.

The GUI server and WM are not applications; they're APIs.

Re:Tedious... (1)

LordOfTheNoobs (949080) | about 8 years ago | (#15997983)

Your statement would be better founded if it were so in reality. An OS is an /environment/, not merely the kernel upon which it resides. It is the whole of the standard set of kernel services, standard libraries, filesystem layout and other aspects which when combined create a unique structure upon which applications can run.

Red Hat, Debian, Slackware, Gentoo, etc. are different operating systems. Admittedly, they are all variations on a theme, but one cannot pull an app off of any one of them and expect it to behave perfectly on the next just because they utilize the same kernel H/W multiplexer under the hood.

Files are stored differently, permissions can be done in different ways ( with ACL extensions for example ). Different daemons are standard.

Each of these is unique enough that, without the special attention of package maintainers and translators, software would be hell to run across them. And before you say "compile from source", watch the myriad of things your autoconf checks next time it runs through. All but the simplest C programs are rife with preprocessor hacks to change code segments that need to run certain ways in certain environments. And if you're not seeing these differences in the programs, they are simply hidden in the libraries.

In Windows, explorer.exe, Windows Media Player, and even the oft-maligned Internet Explorer with its ActiveX are all parts of the OS. Standard objects upon which any program can depend. Parts of the environment.

Your soundbite sounds informative, but it's wrong.

Re:Tedious... (1)

mrraven (129238) | about 8 years ago | (#15998007)

Yep, mod parent up. The last thing I want is more complexity in the OS itself that would make it more liable to crash and take down the system as a whole. Keep the crashy stuff in userland, thanks.

Playing with your OS. (0)

Anonymous Coward | about 8 years ago | (#15997842)

Basically your future OS is like a game that learns about you.

A future OS is only going to be as good as the hardware it runs on.*

*Guess what I'm working on? :>

I'm not sure 'bout that (3, Insightful)

coolhelperguy (698466) | about 8 years ago | (#15997845)

From experience, it's a whole lot easier to have a standard interface to things (especially things like the control panel) than to have it rearranged for each user.

Trying to fix someone's computer with an adapted OS would be a real pain, and asking for help via email would be next to impossible, because your options could be in a different place.

Even today's OS adaptability can be unnerving. I get used to using something from the top N programs on the Start Menu (sorry, no Linux on the work computer), but when it gets bumped off because Windows thinks I used something else more often, I'm confused for a few seconds, just long enough to be annoyed.

So my guess is that this "new kind of OS" won't succeed because of support hassles and confusing the user. But it'd be darn cool if those problems could be fixed.

Re:I'm not sure 'bout that (1)

zurtle (785688) | about 8 years ago | (#15997918)

but when it gets bumped off because Windows thinks I used something else more often, I'm confused for a few seconds, just enough to be annoyed.


That's when you right-click... Pin to Start Menu. Windows does it well, IMHO; you just need to learn how to do it, which is perhaps part of the problem.

Re:I'm not sure 'bout that (1)

Fordiman (689627) | about 8 years ago | (#15997949)

Relax. The article is just spin for the 'ooh, shiny' crowd.

Yes, I'm looking at you, mac fanboys.

Re:I'm not sure 'bout that (1)

soft_guy (534437) | about 8 years ago | (#15998059)

I'll assume you are either a Microsoft or Linux fanboy.

If you are a Microsoft fanboy, I'll mention the irony that your platform is the one that actually implemented an idea as asinine as this article's (menu items that go away and re-arrange themselves).

If you are a Linux fanboy, I'll just assume you think usability is a bug, not a feature.

Re:I'm not sure 'bout that (1)

Bastian (66383) | about 8 years ago | (#15997999)

Agreed. I hope the future of OSes is that they will become *more* predictable, not less.

I had a helpdesk job when Windows XP and Office XP first came out. Adaptive menus, the ability to decide whether or not you want the sidebar, and stuff like that make providing tech support an absolute nightmare. I remember it taking me a week to figure out the best order of places to look when trying to get a caller to the network settings window.

In comparison, providing support to the Win98 users was a dream; it was easier for me to keep track of what was going on and talk people through it, and it was certainly a far sight less baffling to the more computer-illiterate callers I got.

Re:I'm not sure 'bout that (1)

agent_no.82 (935754) | about 8 years ago | (#15998024)

Bah, for some reason I always feel the need to modify my interfaces myself, not by having them "learn." (Ergo, KDE.) A proper setup would allow users to carry their customization data with them.

Re:I'm not sure 'bout that (1)

soft_guy (534437) | about 8 years ago | (#15998069)

What you could do would be to create an application that presents a user interface into someone else's computer except using YOUR customized UI metaphors. That way, you could perform an action using your UI and the person you are helping would see that action happening on their computer in their own UI.

So you are right, it wouldn't be email support, but it would not be impossible to provide remote technical support; it would just have to be designed into the system from the ground up in order to really be workable.
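The per-user UI translation proposed above amounts to both sides sharing a set of abstract actions, each rendered through that user's own mapping; all action names and UI strings here are invented:

```python
# Sketch: helper and helped share abstract actions; each client renders
# them in its own UI metaphor, so each person sees their own look.
ABSTRACT_ACTIONS = {"open_network_settings", "restart_service"}

def render(action, ui_map):
    if action not in ABSTRACT_ACTIONS:
        raise ValueError(f"unknown action: {action}")
    return ui_map[action]

helper_ui = {"open_network_settings": "Menu > System > Network",
             "restart_service": "right-click service > Restart"}
novice_ui = {"open_network_settings": "Control Panel icon: 'Internet'",
             "restart_service": "Wizard: 'Fix my connection'"}

# The same abstract action shows up differently on each screen.
assert render("open_network_settings", helper_ui) != render("open_network_settings", novice_ui)
```

The catch, as the parent notes, is that this indirection layer has to exist from the ground up; it can't be bolted onto a UI whose actions were never named abstractly.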

Futuristic OS? (4, Funny)

eclectro (227083) | about 8 years ago | (#15997847)


I have seen it, and it's called LCARS [lcarscom.net]

Re:Futuristic OS? (1)

thephotoman (791574) | about 8 years ago | (#15998051)

I must agree, I'm waiting for LCARS myself. Unfortunately, that kind of voice recognition is going to have to wait for the 23rd Century, at the earliest.

But while I'm waiting, I think I'm going to invent warp drive. After all, what's a computer running LCARS if it's not on the fastest starships in the galaxy?

This is what I want in a future OS (4, Interesting)

Travoltus (110240) | about 8 years ago | (#15997848)

More control of my computer by me, instead of by someone else.

I keep hearing about stuff like "all your base are belong to thin clients and remote servers" whenever someone mentions the future of OSes, and that deeply disturbs me, especially the part about remote storage of data and subscription-based access to remotely hosted apps. Forget morphing; I would prefer changing my OS settings as I please. In fact, give me the option to save my settings to a profile and then load up a profile to fit what I'm doing.

I'll pay more for having everything on my hard drive, under my control, without any need to phone home to authorize further usage of my media, software, or OS. Unfortunately, we the sheeple are being herded towards the digital corporate nanny state, where the corporations decide what we'll get, and the little heuristic tricks the OS of tomorrow will do for us will give us the illusion that we have control.

Funny how it is that to get the kind of extra value I desire, I need to actually pay [redhat.com] less [debian.org] . Ok, so I'll purchase a support contract; does that count as "paying more"?

Re:This is what I want in a future OS (1)

kfg (145172) | about 8 years ago | (#15997889)

More control of my computer by me, instead of by someone else.

From the article:

"I was all but convinced that having an operating system that could do much of my thinking for me was the way to go. . ."

If you lack a brain. The core issue isn't Big Brother, but the simple ability to think, to control and to even know if the output of the computer makes any sense whatsoever.

The sense of the latter is already visibly in decline, even among the educated "elite."

KFG

Re:This is what I want in a future OS (1)

eno2001 (527078) | about 8 years ago | (#15997934)

You realize how vastly inefficient it is to have all that power sitting there on the desktop? You probably want to own a stretch SUV as well? Personally I think the answer is in on demand clustering with VMs. A cluster of small CPU/RAM combos with centralized storage. But instead of dedicating a CPU to each person, the CPUs migrate to whoever needs the most power on the grid at the moment (within reason). So let's say you're doing some REAL work and editing a huge audio file or rendering video... you get up to 25 CPU/RAM combos allocated to you. On the other hand, you're doing something less important, say... working on a piddly little spreadsheet... you only get one CPU/RAM combo because that's enough. This is all transparent to you. Why is this better? Because it's more efficient use of resources. Everyone has the power of up to a 25 node cluster and if you want more, then you pay for ops/sec rates beyond your base price. Who doesn't want the power of a freight train in their living room sometimes?
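The allocation scheme described above (a shared pool of CPU/RAM nodes that migrate to whoever needs the most power, capped at 25 per user) could be sketched roughly like this. All names and numbers below are illustrative, not any real scheduler's API:

```python
# Toy sketch of the "CPUs migrate to whoever needs them most" idea.
# Assumes the pool can cover at least one baseline node per user.

MAX_NODES_PER_USER = 25

def allocate(pool_size, demands):
    """Greedily hand out nodes from a shared pool, capping each user
    at MAX_NODES_PER_USER and serving the heaviest workloads first."""
    allocation = {user: 1 for user in demands}   # everyone gets a baseline node
    remaining = pool_size - len(demands)
    # Heaviest demand first, so the video render outranks the spreadsheet
    for user in sorted(demands, key=demands.get, reverse=True):
        want = min(demands[user], MAX_NODES_PER_USER) - 1
        grant = min(want, remaining)
        allocation[user] += grant
        remaining -= grant
    return allocation

print(allocate(30, {"render": 40, "spreadsheet": 1}))
```

The video editor asking for 40 nodes is capped at 25, while the spreadsheet keeps its single baseline node; whatever is left stays in the pool for the next burst of demand.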

Re:This is what I want in a future OS (1)

cibyr (898667) | about 8 years ago | (#15998025)

Overheads and latency screw this idea over. I'll just use my own CPU, RAM, GPU and disk, thank you very much. And then when I crash or bog down my system doing something stupid, I don't have a bunch of other users screaming at me.

Re:This is what I want in a future OS (1)

agent_no.82 (935754) | about 8 years ago | (#15998040)

Somebody who wants security, reliability, and privacy; someone who does not trust companies to run his/her computer.
Seriously now, this is what we have sleep and other such functions for. Ever monitored the power usage of a computer? It varies based on resource usage.

I think we'll see more specialized OSs (4, Interesting)

PIPBoy3000 (619296) | about 8 years ago | (#15997853)

For example, users will see flavors of the OS that are secure, fast, web-based, all-inclusive, or geared towards some specialized function such as controlling a robot or doing scientific calculations. Already you see Linux forks all over the place, just for this reason. I think the trend will continue down that path - an OS for every need.

Re:I think we'll see more specialized OSs (3, Informative)

andrel (85594) | about 8 years ago | (#15997962)

I agree we're going to see a lot more customized forks in the world of GNU/Linux, but I disagree that most of them are going to be full operating systems. Instead we're seeing a common core with customized faceplates on it. For example what the Ubuntu folks are doing with Ubuntu/Kubuntu/Xubuntu/Edubuntu. Behind the scenes it is all one OS, but with different faceplates changing how it appears to the user. Debian are doing the same thing [debian.org] .

Imagine... (5, Funny)

Venik (915777) | about 8 years ago | (#15997862)

...an operating system that actually morphed and adapted to the needs of the users...

Users? Aren't those the guys who always need their passwords reset and profiles restored? It already morphed and adapted and became Windows. We have only ourselves to blame. In Soviet Russia OS does not adapt to users; users adapt to... Oh, wait.

The Scary OS? (2, Funny)

Bombcar (16057) | about 8 years ago | (#15997869)

You're entering the sector of a filesystem adjacent to a partition, the kind of place where there might be a bootloader or some kind of weird Linux. These are just examples. It could also be something much better. Prepare to enter... The Scary OS.

Good ideas (4, Insightful)

ThousandStars (556222) | about 8 years ago | (#15997878)

The problem with articles like this is that they're full of highfalutin, banal platitudes but short on nitty-gritty details about how one could actually construct the OS of the future. Look, I'd like "an operating system that actually morphed and adapted to the needs of the users instead of the other way around," but what the hell does that mean, exactly? And, once you've decided what it means, how are you going to implement it?

If those questions had answers, someone would already be writing the "OS of the future." Sadly, at least in present and near-future technological terms, those questions don't have answers, and so they'll remain in the world of hand waving prognostications about some techno-utopia.

Re:Good ideas (3, Funny)

kfg (145172) | about 8 years ago | (#15997912)

I'd like "an operating system that actually morphed and adapted to the needs of the users instead of the other way around," but what the hell does that mean, exactly?

"I'm too lazy to customize my toolbar."

KFG

Re:Good ideas (1)

ThousandStars (556222) | about 8 years ago | (#15997998)

Or, alternately: "I don't know how to use virtual desktops."

Re:Good ideas (3, Insightful)

Coryoth (254751) | about 8 years ago | (#15998070)

All this "adapt to the user" talk is, as you say, fine and well, but no-one has the faintest idea how to do that. What little there is of that technology is pitifully bad, in a large part because it adapts to what the user does, as opposed to what the user wants. That just generally results in a lot of time spent with the user going "no, I didn't mean that!", "no, I don't want you to do that now!" etc.

You may as well talk about the OS of the future which just has a single button in the middle of the screen that says "do what I want". The gap between intention and action is bad enough, trying to model future intention based on past action is just asking for trouble.

Nothing to see here, move along. (4, Insightful)

4D6963 (933028) | about 8 years ago | (#15997880)

This article sounds like articles from 1990 about the house of 2015, you know, the ones talking about how saying "light" will turn the lights on, and how you will check and reply to your video e-mails from your living room big-screen TV... well, you know, just like Back to the Future Part II.

My point is, I don't think you'll really see or even want a self-deciding or self-modifying OS, even if the idea sounds cool. Mod me down for this if you want, but I think this whole article is just some nearly worthless futuristic rambling; even if it's got some interesting ideas, don't pay attention.

Re:Nothing to see here, move along. (1)

5plicer (886415) | about 8 years ago | (#15997953)

I fully agree.

Adaptation algorithm = boon for Spy agencies (2, Insightful)

trelayne (930715) | about 8 years ago | (#15997884)

In recent news, it was found that the Pentagon is looking for ways to gather meaningful data from social networking sites.

Adaptive OSes would be one step better, since breaking into your specific "morphing" would reveal more intimate data about the way you think, and the importance you place on specific topics based on the way you prioritize your email message accesses, etc. To some degree this is possible by cross-referencing cookie data between big corporate sites who just love one another. But adaptation potentially makes it much easier.

I'll bet that people will be clamoring to include morphing (if it ever exists) in Web 2.0-type applications. I don't really understand this excitement. Your data is only as secure as the trust you place in the system admins of your site. No contract ever really guarantees they won't give in to law enforcement agencies who want to know what color underwear you like.

Interface, not OS (3, Insightful)

topham (32406) | about 8 years ago | (#15997886)

I wish people would stop confusing Interface with OS.

Sure, when people talk about OS X they are often referring to its interface (Aqua), but an interface does NOT have to be integral to the OS.
Linux / X-Windows are the obvious example on Slashdot.

^BumP (2, Informative)

TubeSteak (669689) | about 8 years ago | (#15998056)

Parent makes an excellent point.

You can find dozens of good (and many bad) shells for Windows or *nix.

GUI != Operating System

Not too exciting. (3, Interesting)

aardvarkjoe (156801) | about 8 years ago | (#15997893)

For the lazy, here's the description from the article about how the futuristic OS is going to work:

Here's an example for you: imagine you are sitting there working away on a video project. After stopping for a break, your OS pops up with a small alert box asking you if you'd like the PC to roll into adaptive mode. You select yes and the OS begins to learn, as you work, what your needs are.

You go to open your video project again after lunch and almost immediately, you find that the program feels more in tune and responsive to your needs. On the second monitor, you discover a virtual palette of all the editing tools you use the most. No longer are you being forced to locate the editing tools you need from some arcane menu. No, instead your PC has done the work for you with no interaction on your part whatsoever. Sounds interesting? Just wait, it gets weirder...

During the course of your editing work, your PC has already learned from previous experiences that you do not like to be bothered with e-mail alerts when working on specific projects. It's not so much the software being used mind you, rather the type of "work" being done at the time.

An important e-mail from your client comes rolling in along with a number of less important messages. Thanks to Brand X OS' new probability engine, the only e-mail you are alerted to is the one the OS knows will be critical. Even though the other less important e-mails are coming from the same person, your OS understands how to handle this just the way you prefer.

Now, I don't know about anybody else, but I would kind of expect the video editing program to make the tools easily accessible the first time I use it, rather than waiting until I've spent a couple of hours hunting through menus before doing so. And my e-mail program already has an option controlling whether it notifies me of new messages or not.

In a general sense, the idea of an adaptive OS sounds nice, but the author sure didn't come up with any examples that sound particularly compelling -- or even interesting -- to me. The hard part of coming up with a next-generation OS isn't in programming new features; it's actually inventing or designing something that people will find useful.

Re:Not too exciting. (1)

Mistshadow2k4 (748958) | about 8 years ago | (#15998023)

Thanks, but I'm too lazy to read that.

phb shifting around paradigms again... (1)

Goalie_Ca (584234) | about 8 years ago | (#15997895)

I don't see what this has to do with an operating system at all. This article shows the level of understanding of a middle-aged soccer mom. I don't even think LCARS would fit that definition.

A next-gen OS will probably be a virtualized-modular-scalable-secure-networking-indexed piece of software with a modular and stable-yet-clean API. Just look at the past and at what servers are doing/have done. It's not hard to see the trends. What this means for end users is more capable software, more reliable software, and hopefully an end to constant upgrading.

Take Mac OS X, for instance. Add ZFS and more intelligent/complete programs for mail, music, and videos. Add a Web 3.0-capable browser and drivers for every device out there. Give it a kernel upgrade (L4 rumours are about!) and perhaps Cocoa interfaces for a dozen managed and unmanaged languages. Of course it would be nice to change the user-seen file system to a tagged database rather than a tree-based structure with symlinks and hardlinks. Further develop Core Video, Audio, Animation, etc. to completion, add a few other niceties, and then you'll have the OS of the next decade. The decade after that, computers had better lose the keyboard-and-mouse interface... and hopefully they will be unrecognizable machines entirely! The OS of the future will be something users are completely unaware of. It will silently do its job of running software and managing resources.

You've Got to Be Kidding Me (1)

eno2001 (527078) | about 8 years ago | (#15997898)

On page one of this "article" the author posits a wonderful OS that intuits what you want out of it and arranges itself in such a way that your workflow is made easier. Then page two comes to a screaming halt and slams into reverse with some lame caveat about how any OS that would do all the thinking for you is a tempting lure to "evildoers". There wasn't much substance in that. No. Not much at all.

In fact we've all thought about how you would craft the perfect OS. I had an idea a while back for a document-type "DNS" system. It would tell ANY OS that used it which applications for that platform would open which documents, and provide download links if you didn't have the software, or just launch it if you did. This Doctype DNS could be local on your network and also global, just like real DNS, with the root controlled by a non-profit organization so there is no profit motive behind it. Great idea (I think) on paper. But it's not really workable. And that's just one idea. I think just a few of us posters could probably come up with better ideas for an OS of the future than the author of the linked piece did. For a minute I thought I was on Digg when I read that article.

Besides, my current kick is the Xen hypervisor. I just love having such flexibility and power, as yet unheard of on any other x86 system (and no, VMware Workstation doesn't do what Xen does).
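For what it's worth, the Doctype DNS idea above sketches out roughly like this: a lookup that checks a local (network) registry first and falls back to a global root, returning a handler for the platform plus a download link if needed. Everything below, names, types and URLs included, is hypothetical:

```python
# Hypothetical sketch of the "Doctype DNS" idea: resolve a document
# type to a handler for a given platform, checking a local registry
# first and falling back to a global root, much like DNS resolution.

LOCAL_REGISTRY = {
    # Local network registry: app is installed, no download needed
    ("application/x-blend", "linux"): {"app": "blender", "url": None},
}

GLOBAL_ROOT = {
    # Global root (run by the hypothetical non-profit): always carries
    # a download link in case the client lacks the software
    ("application/x-blend", "linux"): {"app": "blender",
                                       "url": "http://example.org/get/blender"},
    ("application/x-blend", "windows"): {"app": "blender.exe",
                                         "url": "http://example.org/get/blender"},
}

def resolve(doctype, platform):
    """Return a handler record for (doctype, platform), or None."""
    key = (doctype, platform)
    if key in LOCAL_REGISTRY:     # local answer wins, like a caching resolver
        return LOCAL_REGISTRY[key]
    return GLOBAL_ROOT.get(key)   # otherwise ask the global root

print(resolve("application/x-blend", "windows"))
```

A Windows client with nothing installed gets the handler name and a download link from the root; a Linux box on a network that already registered the type gets the local answer without touching the root at all.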

Not an OS (1)

Doc Ruby (173196) | about 8 years ago | (#15997899)

The features that column describes are not OS features. They're app features. There's nothing stopping an app developer from including those features in an app running on any OS, or even a cross-platform app running on Windows/Linux/OSX.

At best their "popular palette" system across apps is a windowing toolkit, only marginally part of the OS, and also possible in any current desktop OS, or in a shared app library.

What's such a dumb article, so wrong about what an OS is, doing in _OS Weekly_?

It'll never work (1)

pestilence669 (823950) | about 8 years ago | (#15997900)

In a society that hates fair use, you simply cannot have an OS that assimilates ideas that might be redistributed elsewhere. Lawyers would force DRM on every I/O & messaging call. Of course, this article doesn't even try to get that deep.

The sci-fi bent is more along the lines of A.I., which disturbs me. Not because I don't want my computer to take over the world, but because the feeble-minded author seemed more excited about the prospect of needing to do less than about being able to do more.

It's called 'open source'. (1)

Celarnor (835542) | about 8 years ago | (#15997901)

Imagine if you will, a world where your ideas and perhaps, even your own creative works became part of the OS of tomorrow.
Really? One's own creative works becoming part of an operating system? This writer is more than a couple of years late to get that movement started. I'm sure someone's going to bring up the point that distributions have become complex enough for the user that they could be considered standardized, but your average user isn't going to be contributing much of his 'creative works' to an operating system anyway.

In order for AI to do anything like the writer is describing in TFA, it would have to build some massive catalog of a user's habits. A) That would probably become incredibly huge over time; B) having so much information in a couple of files is like hanging a big sign in front of your computer saying "free data within a guaranteed path here"; C) the AI itself would have to be coded like a rock, and given that there are even a few viruses out there for OS X, I wouldn't trust anything that high-level to run on my PC constantly. I don't know about other /.ers, but I'd rather suffer doing the extra 3-5% of work that my computer could potentially do for me that doesn't require my input.

Re:It's called 'open source'. (0)

Anonymous Coward | about 8 years ago | (#15998022)

"...and given that there are even a few viruses out there for OSX..."

Really? How about a source or link?

I've heard of a trojan, it requires you to enter your admin password, but no self-replicating viruses.

Turning the computer inside out (4, Interesting)

caseih (160668) | about 8 years ago | (#15997903)

Almost since the inception of computers, and later of modern OS design, we've been trapped in a paradigm that, although mirroring some aspects of the real world (the desktop, tools, etc.), is quite backwards in others. I think it is time we ditched some of these decades-old concepts. For one, the concept of an "application" has to go. It's outdated; it locks us down and restricts what we can do. See, it's not about the applications; it's about the data. The data is the most important thing, and it should not be imprisoned in an application or even a series of compatible applications.

Rather than the application being the focus of our OS and UIs, we should make the data, or the "document", the focus. Instead of applications we have smaller, simpler tools that can be applied to the documents (data objects, or whatever). Common tools work equally well on like data objects no matter where they reside. A spell checker would spell-check anything that is text. A pen could draw on anything that is drawable (a surface of some kind). If you needed a better pen, you'd buy a better pen that works on the same surfaces as the old one (but in a better way, perhaps). Everything would be document-centric, with the concept of, perhaps, tool palettes or something, but it would be very modular and loosely coupled.

The irony of loose coupling is that it could lead to the integration of widely differing sets of tools. For years Microsoft has taught us that to have good integration between the various tasks (word processing, spreadsheets, etc.) we need a tightly integrated application. This is false. We really need just open document objects that can support a variety of types of data, and the tools to work on them. The OS becomes the app and *everything* is then integrated, but in a more open and extensible way. Of course this dramatic shift would lead to the demise of many major software houses until they learn to adapt to the new way of doing things.
But in the end the OS gets out of the way and lets us *work*.

If some of these concepts sound familiar, it is because they are not new. Apple and IBM once explored this in their Taligent (http://en.wikipedia.org/wiki/Taligent) project, which died. Unfortunately, while we talk about technologies like OOP, they haven't really moved much beyond languages. OSes are modular and even object-oriented to a degree, but they haven't quite arrived at the things I describe yet. Having the KDE libraries be object-oriented and manipulable over RPC and DCOP is a step towards a possible document-centric future.
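The tools-on-documents idea maps naturally onto duck typing: one tool works on any data object that exposes the right capability, regardless of which "application" produced it. A minimal sketch, with all names invented for illustration:

```python
# Minimal sketch of document-centric tools: a tool works on any
# object exposing the right capability, regardless of "application".

class TextNote:
    def __init__(self, text):
        self.text = text   # anything with .text is spell-checkable

class Spreadsheet:
    def __init__(self, cells):
        self.cells = cells
        # the spreadsheet also exposes its contents as text
        self.text = " ".join(str(c) for c in cells)

DICTIONARY = {"hello", "world", "42"}

def spell_check(document):
    """One spell checker for every text-bearing document in the system."""
    return [w for w in document.text.split() if w not in DICTIONARY]

print(spell_check(TextNote("hello wrold")))     # same tool...
print(spell_check(Spreadsheet(["hello", 42])))  # ...different document types
```

The spell checker never asks what kind of document it was handed; a better spell checker could replace this one and would still work on every surface the old one did, which is the whole point of the loose coupling described above.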

Re:Turning the computer inside out (1, Insightful)

Anonymous Coward | about 8 years ago | (#15997950)

Document-centric environments have been around as long as GlobalView (i.e. it's older than Windows). They invariably have a flaw in that they become only as good as the object model they're based on. All objects have to pick up behavior, but they eventually develop silos, and you get applications. And that's just emergent behavior -- do you think Adobe is interested in designing their canvases, brushes, transforms, plugins, and so on for maximum reuse across the system? It's not just proprietary fences put up deliberately, it's also that it would be more effort than actual payback. In the end, photo editing is about tasks, not documents.

Basically, you have to have a system that can do both documents and applications well. Of course I believe that applications should still run on an orthogonally persistent fine-grained object database underneath, provided and managed by the system, but there is still the notion of a task that runs and completes.

Re:Turning the computer inside out (1)

agent_no.82 (935754) | about 8 years ago | (#15998065)

Interesting thoughts. I'm using a KDE environment myself and appreciate the spellchecker kpart.

meh.. (1)

dexomn (147950) | about 8 years ago | (#15997906)

I adapt the OS to me with bubblegum, duct tape, and shellcode. I ain't sharing that shit. /me *spits on the floor*

Re:meh.. (1)

dexomn (147950) | about 8 years ago | (#15997919)

Not to mention admitting to not commenting code when you've been drinking and your boss happens to read slashd... oh shit..

what a visionary (1)

spir0 (319821) | about 8 years ago | (#15997916)

Not really an article, so much as a quick pointer of what we may get in the future.

It would have been better if the writer had actually analysed some of the things he spoke about and discussed them. I can sum up the article in one sentence using as much depth, research and intelligence as he did:

"In the future computers may be able to predict your work habits, but some people will use it for bad and stuff."

Good on ya mate.

Style over substance? (1)

Gavin Rogers (301715) | about 8 years ago | (#15997924)

I'm not concerned about how pretty OSes will be in the future, or how clever they'll be at filing away my stuff to make it easier to find again later, since the history of OSes shows that they get better at these tasks as time goes on.

My OS of the near future will be secure and stable, the likes of which we can only dream about now. It will recover gracefully from hardware errors; it will use high-level APIs to talk to all hardware, making drivers a thing of the past; it will put large parts of itself inside read-only flash RAM on the PC where no application program (read: viruses and spyware) can touch it. It will back itself up automatically and specify a level of hardware redundancy as standard.

It will be designed so well that it will rarely need maintenance and re-installing will seem as quaint then as loading a computer program with a stack of punched cards seems to us now.

That would be a nice OS to use. It might also find me looking for a new career path though as sys admins would find it hard to get work :-)

So basically (1)

XnavxeMiyyep (782119) | about 8 years ago | (#15997928)

The best OS would be a magical OS that could do everything by mystically knowing everything!

The Future OS isn't an OS, really... (4, Interesting)

rickb928 (945187) | about 8 years ago | (#15997936)

It's a Hypervisor.

Your applications provide (or are provided with) enough OS foundation to function in the limited virtual machine they live in.

The Hypervisor manages the hardware, inter-application communication, networking for each, and of course picking up the trash and keeping everything polite.

Apps only see the shared resources the Hypervisor permits.

But most important, two features:

- Each app gets the OS features it needs. My word processor may not need the same things the database needs, nor the e-mail app, nor the music player. So the OS for each app is lighter and nimbler.

- Each app is restricted in how it interacts with other apps. No more OLE, DDE, much less opportunity for the backdoor/under the hood shenanigans we call worms, viruses, trojans, and 'badware' (ick, stupid name).

I saw an article describing this and promptly lost any way to find the FRAKKING ARTICLE! Did anyone else, and where the heck is it? I thought it was *here*, on /.

Grrrrr....

But I love the idea. It ain't really new, but it's clever.

rick

Smart OS = unpredictable by user (1)

noidentity (188756) | about 8 years ago | (#15997941)

After stopping for a break, your OS pops up with a small alert box asking you if you'd like the PC to roll into adaptive mode. You select yes and the OS begins to learn, as you work, what your needs are.

You go to open your video project again after lunch and almost immediately, you find that the program feels more in tune and responsive to your needs.

This is basically referring to prediction: the OS predicts what you'll be doing and alters its behavior so that the predicted action will occur more efficiently. The problem is that no matter how good this prediction is, it's not going to be perfect, so the user will still need to work around the cases where it's wrong, as the user always must. But adding prediction means that the user will have a much harder time predicting how the computer will respond, which will require that the user work more slowly and verify the response of each action one step at a time. This is a step backwards. The best computer is one that has predictable responses to user input, allowing the user to work ahead of the machine and do much of the interaction at an unconscious level.

On the second monitor, you discover a virtual palette of all the editing tools you use the most. No longer are you being forced to locate the editing tools you need from some arcane menu. No, instead your PC has done the work for you with no interaction on your part whatsoever. Sounds interesting? Just wait, it gets weirder...

Either the OS already had been programmed to show this recent actions palette and was just holding out, or the author is suggesting that it actually think up the concept of a recent actions palette and then program it itself!

An important e-mail from your client comes rolling in along with a number of less important messages. Thanks to Brand X OS' new probability engine, the only e-mail you are alerted to is the one the OS knows will be critical. Even though the other less important e-mails are coming from the same person, your OS understands how to handle this just the way you prefer.

The problem with the OS determining that in situation X you want behavior Y is that it probably will under-specify or over-specify what situation X is. For example, it might define situation X as the front window being at coordinates 239, 23.

Next

Man, I'd like an OS that figured out that I want to read the entire article on one page, without any visual distractions.

Systems that try to be smart are a pet-peeve of mine, and it never seems like people have thought the idea through. I remember some like this for the Mac many years ago, though I never tried them.

It's Called Emacs (0, Informative)

Anonymous Coward | about 8 years ago | (#15997948)

If you want the closest thing to an adaptable operating system, learn Emacs inside and out. With elisp only a buffer away, it doesn't take long to adapt it to the way you want to work. No restart required!

And prior to Emacs there were Lisp Machines. From what I know, every function available on one was redefinable all the way to the hardware without needing a restart. Now that's adaptable!

And it was done in the early 1980s, too. Microsoft should have stolen from MIT and Symbolics instead of stealing from Apple. Come to think of it, the Xerox UI that Apple stole may have been done in Lisp. Please correct me if I'm wrong.

What I would like out of an OS... (2, Interesting)

NerveGas (168686) | about 8 years ago | (#15997952)

1. Non-intrusive.
2. Stable.
3. Efficient.
4. Intuitive.

    Some time ago, I worked on a friend's computer that was running Windows 95 on a Pentium 166. I was astounded at how fast and responsive it was. Windows XP on an A64/P4 barely keeps up, yet offers very little more to me in terms of usefulness. Neither Windows, MacOS, nor XWindows particularly fits #4, at least not for me.

    I will say, in terms of scalability, XWindows is a *real* screamer on a quad-Opteron with 8 gigs of RAM and a nice, fast SCSI array.

steve

Re:What I would like out of an OS... (1, Interesting)

Anonymous Coward | about 8 years ago | (#15998076)

The X Window System is (a) not an OS, and (b) not called XWindows.

Virtual Machines (1)

Mr_Tulip (639140) | about 8 years ago | (#15997972)

I would point out that such an OS would need to be task-oriented, rather than user-oriented.
If I start working on a novel task, one that I've never done before, I'd hate to have to 'teach' my OS how to behave.
Many people I know already customize their OS for the task they are doing.
The easiest way is to just create several user accounts or desktops, each of which runs different 'background' applications. My gaming logon in Windows runs very few services, keeping the system as lean as possible, whereas my 'day-to-day' login has virus scanners, screensavers, desktop backgrounds etc. configured.
The next iteration of this, for me at least, is the virtual appliance [vmware.com] . While popular on servers for a while now, they are really beginning to take off for home/desktop use.
I am beginning to use these quite frequently, creating my own installations that contain the tools necessary for a given task.
The biggest advantages include cross-platform portability, wide availability, and great uptime (as long as you back up your images).

A new kind of Girlfriend (4, Funny)

Heir Of The Mess (939658) | about 8 years ago | (#15997991)

Imagine if you are sitting there playing Halo. After stopping to go grab a coke, your girlfriend walks into the room asking you if you'd like her to roll into adaptive mode. You say yes and she begins to learn, as you play, what your needs are.

You go to resume your game after the coke and almost immediately, you find that your girlfriend seems more quiet and responsive to your needs. Out in the kitchen, she is preparing a virtual smorgasbord of all the food and drink you need the most. No longer are you being forced to locate old cheese snacks from some resealable container. No, instead your girlfriend has done the work for you with no interaction on your part whatsoever. Sounds interesting? Just wait, it gets weirder...

During the course of your gaming, your girlfriend has already learned from previous experiences that you do not like to be bothered with requests for attention when working on specific missions. It's not so much the game being played, mind you, rather the type of "work" being done at the time.

An important SMS from your brother with his score comes in along with a number of less important family messages. Thanks to Brandy X's new attitude, the only SMS you are alerted to is the one your girlfriend knows will be critical. Even though the other less important messages are coming from the same person, your girlfriend understands how to respond for you, just the way you prefer.
....

The Dumbest Article In Recent Memory (0)

Anonymous Coward | about 8 years ago | (#15997994)

That was the dumbest article I've read in a long time. Inarticulate, nonsensical rubbish. digg--... oh wait... :-)

The Matrix has you... (1)

LuNa7ic (991615) | about 8 years ago | (#15997995)

From TFA...
Here's an example for you: imagine you are sitting there working away on a video project. After stopping for a break, your OS pops up with a small alert box asking you if you'd like the PC to roll into adaptive mode. You select yes and the OS begins to learn, as you work, what your needs are. You go to open your video project again after lunch and almost immediately, you find that the program feels more in tune and responsive to your needs. On the second monitor, you discover a virtual palette of all the editing tools you use the most. No longer are you being forced to locate the editing tools you need from some arcane menu. No, instead your PC has done the work for you with no interaction on your part whatsoever. Sounds interesting? Just wait, it gets weirder...
Am I the only one who objects to technology following this path? Unfortunately, I have to use MS Office on a semi-regular basis and I absolutely hate the so-called 'user friendly adaptive technology' included in it, and the way it automatically capitalises words, 'corrects' spelling, and misaligns objects.

Why? Because I want to do my OWN thing. I want to customise and fine-tune my work the way I want it. I don't want to have to conform to the standards most commonly observed in customer feedback reports. So what if I'm using the software in a way not foreseen by Billy Gates, Steve Ballmer and an army of overpaid consumer research suits.

I also have some questions:
*At what point does the PC take over and do everything for you?
*When does the user become obsolete?
*Where should the line be drawn where we force people to learn about the computer instead of vice versa?
*When do the MIBs take me away and neuralise me for stumbling upon their secret plot?

Kai Krause tried this once (3, Interesting)

Animats (122034) | about 8 years ago | (#15998003)

Kai Krause [wikipedia.org] tried something like that once, in "Kai's Power Tools". The interface started out simple, and as you used it, when the software decided you were good enough, you advanced to the next level and more tools appeared. This was one of the first programs to have really cool functional widgets, like draggable on-screen trackballs and joysticks.

Users hated it. The cool user interface just got in the way of getting work done. At one point, a rumor started that Kai was going to redesign Photoshop's interface, and there were organized protests to Adobe.

But his programs looked so cool.

Part of the problem was that Kai was addressing a very hard problem - the user interface for a drawing program. The MacOS X toolbar looks like a Kai interface. But that tool bar is really just a menu. Serious drawing programs, from AutoCAD to Maya, have to offer so many different yet interacting capabilities to the user that they're terrifyingly hard. A full-scale 3D animation program is about as hard as an interface gets. There before you is the ability to create a synthetic world. Animation programs struggle to provide all the needed tools without overwhelming the user.

There's also the issue, in that world, that working artists want quite a different set of capabilities than amateurs do. Artists seldom edit freehand-drawn lines. They delete them and sketch new ones; they don't drag spline control points. An experienced animator creating a human head in a 3D animation system won't build it up one polygon at a time, or start by pulling on an ellipsoid. They may draw a series of cross-sections and skin them. I've seen this done in less than a minute. So the needed tools may be quite different from what a programmer would imagine.

Dave? (1)

Konster (252488) | about 8 years ago | (#15998010)

Daisy, Daisy, give me your answer do. I'm half crazy all for the love of you. It won't be a stylish marriage, I can't afford a carriage. But you'll look sweet upon the seat of a bicycle built for two.

Won't work for me (1)

IlliniECE (970260) | about 8 years ago | (#15998021)

I think I'm way too random in my PC usage habits for an OS like this to help. Does anyone else feel this way?

Another fatuous article assuming strong AI (1)

OnanTheBarbarian (245959) | about 8 years ago | (#15998042)

This article, like so many others, assumes that AI will make some sort of miraculous progress in the very short term. Many of the features mentioned are fairly banal and not really OS functions at all (application customization), and many others would be wildly frustrating unless the OS really did solve major unsolved problems in AI (the email agent that just miraculously knows whether I'll find an e-mail important and whether I can be interrupted right now).

A related problem is that few OSes and applications do a particularly good job of setting out their static functionality, much less building good menu structures on the fly in response to my behavior (the trick of grouping frequently-used operations is such low-hanging fruit, and already done in so many places, that it's hard to imagine it being particularly impressive).

Given all the real problems with current OS's (general usability problems, backward compatibility, stability, bloat, taking advantage of increasingly complicated hardware - GPUs and multicore spring to mind) it's hard to imagine some sort of spray-on AI layer being the answer to all of our prayers.
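For what it's worth, the frequency-grouping trick really is low-hanging fruit. A minimal sketch of that kind of adaptivity (the class and menu items here are invented for illustration, not from any real OS) is just a usage counter plus a sort:

```python
from collections import Counter

class AdaptiveMenu:
    """Reorders menu items by usage frequency -- the 'low-hanging
    fruit' kind of adaptivity that needs no AI at all."""

    def __init__(self, items):
        self.items = list(items)
        self.uses = Counter()

    def invoke(self, item):
        # Record one use of a menu item.
        if item not in self.items:
            raise KeyError(item)
        self.uses[item] += 1

    def ordered(self):
        # Most-used first; ties keep the original menu order.
        rank = {item: i for i, item in enumerate(self.items)}
        return sorted(self.items, key=lambda it: (-self.uses[it], rank[it]))

menu = AdaptiveMenu(["New", "Open", "Export", "Print"])
for _ in range(3):
    menu.invoke("Export")
menu.invoke("Open")
print(menu.ordered())  # ['Export', 'Open', 'New', 'Print']
```

Of course, this is also exactly the mechanism users tend to hate in practice: items that reorder themselves move out from under your muscle memory, which is the parent's point about it not being impressive.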

The email example is doable today (1)

patio11 (857072) | about 8 years ago | (#15998096)

Bayes filter + plugin in Thunderbird + option to pop up a window when "messages like this" show up = good enough "AI" for mail priority filtering. I don't actually pop up a window, but I have PopFile separate my mail into a variety of folders, including one marked "Urgent Action"*, and while it's not quite as effective as having a human screener, it's a whole lot cheaper, and PopFile will never accuse me of sexual harassment for exposing it to the contents of my spam folder.

But, as you say, there is no reason this is (or should be) an OS function. Dynamically optimizing the actual OS-level stuff for you ("We're going to alter the functionality of malloc() on the fly so that your applications are more responsive!") is probably a much more difficult problem. Some of the real pie-in-the-sky stuff in the article actually would require strong AI.

* I like digging under the hood. My Urgent Action bucket seems to work primarily off of header information (mails from my boss are more likely to be urgent than those from my mother; mails from @hotmail.com are vanishingly likely to be urgent), with some spice coming from keywords (the filter "knows" what words relate to an illness a certain family member is suffering, what words are about my pet project at work, etc.).
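To make the parent's point concrete: a toy word-frequency naive-Bayes bucket sorter in the spirit of (but far simpler than) PopFile fits in a few dozen lines. The labels and training mails below are invented for illustration; a real filter would also weigh header features, as the parent describes:

```python
import math
from collections import Counter, defaultdict

class NaiveBayesPriority:
    """Toy naive-Bayes text classifier: sorts messages into buckets
    by word frequency, with add-one smoothing."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word tallies
        self.label_counts = Counter()            # label -> message count

    def train(self, label, text):
        self.label_counts[label] += 1
        self.word_counts[label].update(text.lower().split())

    def classify(self, text):
        words = text.lower().split()
        total = sum(self.label_counts.values())
        vocab = len(set().union(*self.word_counts.values()))
        best, best_score = None, float("-inf")
        for label in self.label_counts:
            # log prior + smoothed log likelihood of each word
            score = math.log(self.label_counts[label] / total)
            n = sum(self.word_counts[label].values())
            for w in words:
                score += math.log((self.word_counts[label][w] + 1) / (n + vocab))
            if score > best_score:
                best, best_score = label, score
        return best

nb = NaiveBayesPriority()
nb.train("urgent", "server down call me now")
nb.train("routine", "lunch friday cat pictures")
print(nb.classify("the server is down"))  # urgent
```

The same machinery scales from two buckets to PopFile-style many-bucket sorting; it's all cheap counting, well short of the strong AI the article seems to assume.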

My OS will learn from past mistakes (5, Funny)

Jacer (574383) | about 8 years ago | (#15998047)

User, from your usage history, it seems to me that you like bloated software, spyware, trojans, viruses, worms, and other malware. I've taken the liberty of installing all of these with the latest features. I've also removed all productivity software, as my records indicate you were failing out of school anyway. Regards, Your new-age OS.

Users? Learning OS? (1)

agent_no.82 (935754) | about 8 years ago | (#15998050)

Give them many (usable) options, let them customize the damn OS themselves, I say!

BSD, here and now (1)

NuShrike (561140) | about 8 years ago | (#15998052)

It starts out as just something on a disk; you push it in, and it grows on your HD from what was shipped.

Then it's a-la-carte land, where you get to select which apps you want, which options of those apps you don't want, which multiple versions of apps you want at the same time, flip kernel options around, and update it all from CVS at any time while keeping your own modified sources of something.

Build and install without worrying about dependencies. No frozen precompiled distros, no fumbling around in RPM hell, no keeping the apps in sync with the OS, no forced upgrades of the apps because the distro decided so, no tightly coupled interdependencies between the OS and third-party apps. It grows with you as you want it, the way you want it.

You know, Windows is great for this too. Some of the Linux out there (Red Hat/Fedora) needs to wake up.

Blah blah ... (1)

Kaz Kylheku (1484) | about 8 years ago | (#15998058)

More pseudo-visionary claptrap from people who can't bang two bits together.

Hey, I'm Slashdot user 1484. I have a waiver that lets me be a curmudgeon whenever I please, which is pretty much all the time.

voice recognition (1)

llZENll (545605) | about 8 years ago | (#15998062)

Why would you ever want your creative ideas to become _part_ of the OS? IMO the ideal OS should be 100% transparent; you should never know of its existence. It would facilitate the seamless transport of information, ideas, and media. Beyond perfect natural voice recognition, I don't see any potential evolution of the OS as significant.