
The State of the Internet Operating System

CmdrTaco posted about 4 years ago | from the i-think-it-needs-a-security-patch dept.

Operating Systems 74

macslocum writes "Tim O'Reilly: 'I've been talking for years about "the internet operating system," but I realized I've never written an extended post to define what I think it is, where it is going, and the choices we face. This is that missing post. Here you will see the underlying beliefs about the future that are guiding my publishing program as well as the rationale behind conferences I organize.'"

74 comments

The Internet has an operating system??? (0)

Anonymous Coward | about 4 years ago | (#31669490)

I did not know that.

Meh (2, Informative)

WrongSizeGlass (838941) | about 4 years ago | (#31669492)

That article isn't exactly cromulent. Is there a daily prize for obviousness?

Re:Meh (1, Funny)

Anonymous Coward | about 4 years ago | (#31669662)

Is there a daily prize for obviousness?

No, there isn't.

Re:Meh (0)

Anonymous Coward | about 4 years ago | (#31669684)

Obviously there should be.

Re:Meh (0)

Anonymous Coward | about 4 years ago | (#31669770)

yeah! nor does the article embiggen our knowledge on the subject!

Re:Meh (0)

Anonymous Coward | about 4 years ago | (#31669870)

You silver-tongued pirate.

He's channelling Stallman is why it sounds familiar (2, Interesting)

xzvf (924443) | about 4 years ago | (#31669824)

O'Reilly is pointing out the same dangers of the Cloud as Stallman, but in a reasonable voice. The question is how to preserve the DIY environment when hardware is sealed (see iPad) and software is run on corporate computers. Will innovation be constrained, or will the cloud be open enough to allow people to change vendors easily without total reworks?

Re:He's channelling Stallman is why it sounds famil (2, Informative)

tadghin (2229) | about 4 years ago | (#31674440)

You've got to be kidding that I'm channelling Stallman. He's finally waking up to an issue that I put in front of him all the way back in 1999. At the time, he said "It didn't matter." See for yourself in the transcript of our interchange at the 1999 Wizards of OS conference in Berlin. The relevant remarks are a fair way through the PDF of the transcript, so read on down: http://tim.oreilly.com/archives/mikro_discussion.pdf [oreilly.com]

At the time I was talking about "infoware" rather than "Web 2.0" but the concepts I was working with were in the same direction.

But in case you don't want to go through all that, here's the relevant bit:

Richard Stallman:

I came up to the mike again because I wanted to address
the topic that Tim O'Reilly raised. Some of you might know about our major
disagreements on other issues, but that's not what he spoke about. And I think that
this distinction between hardware and software and infoware is an interesting one
and that you addressed it very well from the open source point of view. That being
a matter of looking for a development methodology of making things that work and
judging success to a large extent in the same concept of market share or number of
users that is used as a criterion by the proprietary software developers. Now,
looking at that same concept, that same situation from the Free Software point of
view, I bring to this a different idea of goals and a different idea of a criterion.
The goal in the Free Software movement is to extend our freedom. 'Ours' meaning
that of whoever wants freedom to work together so that freedom spreads over a
wider range of activities. And so our criterion isn't really about market share, ever
and it's only secondarily about 'Do we have good technology, does the program
work reliably?' Obviously if it works badly enough it won't be useful, but otherwise
we can fix it, so that's just a side issue. The important thing is: How many activities
can we do without giving up our freedom? What is the range of things that we can
do on a computer which has just free software on it, where we don't have to
compromise our freedom to do any of those things?

Now when you apply this criterion to things like web servers that answer certain
kinds of questions for you, that communicate with you, you find an interesting
thing: a proprietary program on a web server that somebody else is running limits
his freedom perhaps, but it doesn't limit your freedom or my freedom. We don't
have that program on our computers at all, and in fact the issue of free software
versus proprietary arises for software that we're going to have on our computers and
run on our computers. We're gonna have copies and the question is, what are we
allowed to do with those copies? Are we just allowed to run them or are we allowed
to do the other useful things that you can do with a program? If the program is
running on somebody else's computer, the issue doesn't arise. Am I allowed to copy
the program that Amazon has on its computer? Well, I can't, I don't have that
program at all, so it doesn't put me in a morally compromised position, the way I
would be if I were supposed to have a program on my computer and the law says I
can't give you a copy when you come visit me. That really puts me on the spot
morally. If a proprietary program is on Amazon's computer, that's Amazon's
conscience. Now I would like them to have freedom too. I hope they will want
freedom, and they will work with me so that we all get freedom, but it's not directly
an attack on you and me if Amazon has a proprietary program on their computer.
It's not crucially important to you and me whether Amazon uses a free operating
system like GNU plus Linux, or a free web server like Apache. I mean I hope they
will, I hope free software will be popular, but if they give up their freedom, that's
just a shame; it's not a danger to us who want freedom.

What matters with infoware and freedom is the freedom as applied to the
information we get. If we get a web page, are we free to mirror it? If there is an
encyclopedia online somewhere, can everybody access it, are we free to mirror it,
can we add new articles to it? If there is courseware, textbooks, the web-equivalent
of textbooks, you read it and you study a subject and you learn, are these free? Can
you make modified versions of it and re-distribute them to other people? So we --
humanity -- have a gigantic job ahead of us to spread freedom into the area of
infoware and that's what the free software community has to do with infoware.

Tim O'Reilly:

I agree, I think that there are a lot of interesting issues. Think about,
for example, maps.yahoo.com: what if it gives the wrong directions? Can we fix it?
Many of us who deal with websites, for example, that have data up there, have a
real problem if we get the wrong data. And there is a very analogous situation to
software, where for example, we provide data to Amazon, they get it wrong, and we
can't fix it. You know we've got to send them mail, and they say 'Oh we'll get
around to it later,' and maybe it never gets fixed.

But there are further issues, we keep branching out. There are types of information
that you don't want to have modified. This gets into the whole issue of identity
online and digital signing, those kinds of things, because for example, if I make a
statement of opinion, as opposed to a statement of fact, I sure as heck don't want
somebody to modify it and 'improve' it and pass it off as what I said. So there are
some very very interesting social issues that we are going to get into. I think that
the Internet is going to change everything. We think we've seen a lot of change in
the last few years and I think we haven't seen anything yet.

I really don't think Richard ever grokked the Internet.

Re:Meh (1)

FyRE666 (263011) | about 4 years ago | (#31670546)

The guy also seems to have a problem differentiating between an operating system and a network infrastructure.

Re:Meh (1)

Tanktalus (794810) | about 4 years ago | (#31671922)

The internet has an operating system just as much as a colony of ants has a hive mind. They don't, but they sure act like they do.

Metaphors. Learn them. Use them. Love them. But don't anthropomorphise them. They hate it when you do that.

Re:Meh (1)

AP31R0N (723649) | about 4 years ago | (#31672760)

How cromulent is it? We know it's not exactly cromulent, but you've left its actual cromulence vague.

Dumb terminals and smart people don't mix (5, Interesting)

elrous0 (869638) | about 4 years ago | (#31669508)

This whole "Internet OS" thing reminds me of the periodic resurgences of the dumb terminal/thin client idea that goes back to the mainframe days. It seems like every ten years or so, everyone is talking about thin clients in every office, with the OS and apps running on some offsite server somewhere (now with the added twist of multiple servers over the internet). Ostensibly this is seen as a good way to save IT money and overhead. But in every actual deployment I've seen, it only causes hassles, additional expense, and headaches.

Back in the 90's we tried this at my old university. We networked all our computers and put all our apps on a central server. Even though this was all done on a local network (much more reliable in those days than the internet), it was still a complete disaster. Every time there was a glitch in the network, every student, professor, and staff member at the university lost the ability to do anything on their computer--they couldn't so much as type a Word document. Now, with little network downtime, you would think this wouldn't be so much of a problem--but when you're talking about thousands of people who live and die by the written word, and who are often working on class deadlines, you can imagine that even 30 minutes of downtime was a nightmare. I was skeptical of this system from the get-go, but got overruled by some "visionaries" who had bought into the whole thin client argument with a religious fervor. Of course, long story short, we ended up scrapping the system after a year and going back to the old system (with a significant cost to the state and university for our folly).

Re:Dumb terminals and smart people don't mix (3, Insightful)

Em Emalb (452530) | about 4 years ago | (#31669574)

Actually, it's all just one big cycle. When I first broke into the IT world, PCs were a bit of a novelty in most businesses. Then, the PC explosion caused things to move towards a "client-side" setup, with faster desktops and laptops and not as much horsepower required on the server side. Then, in an effort to save money, tied in with servers/CPUs/memory becoming cheaper, and with security concerns, companies started (or have started) to slowly pull things back from the client side and put more emphasis on the server side of things.

That said, I'm sure it won't be long before we go full-circle again.

One final thought: I do not want any "OS" that's supposed to run on my computer to be running on the internet. Corporate networks, in my experience, typically have much more solid uptime than the internet. Plus, if something goes down on my network, I don't have to depend on someone else to fix it.

Re:Dumb terminals and smart people don't mix (1)

diegocg (1680514) | about 4 years ago | (#31669906)

I don't think we are going to go back to the old days of client-side apps anymore. There's a big difference today: the growing ubiquity of network access. Decades ago we didn't have the internet (or it was crappy, slow and too expensive), and every once in a while a new computer generation focused on client-side software because networks didn't really matter that much. With the ubiquity of the internet I don't think we'll see that again. We are starting to see MB/s of internet bandwidth; it won't be too long until we have bandwidth comparable to what local IDE DMA disks could offer in 2000.

The balance shifts due to psychology, not raw perf (0)

Anonymous Coward | about 4 years ago | (#31670430)

Don't be fooled by absolute measures like MB/s. The ratios of CPU/RAM/IO/network speeds are more informative, and they keep expanding. I could just as easily argue that independent app hosts are the obvious future, because cell phones and iPods now have more powerful processors, larger RAM, and larger tertiary storage than what many PCs had in the 1990s. Why use costly or unreliable networks when you can process locally? Neither argument actually holds water, as the choice depends on the application requirements and human factors.

We've already seen many iterations of this cycle. Teletypes replaced with video terminals, remote tty-based mainframe apps replaced with early PCs with their own video terminals (CP/M era), video terminals replaced with onboard video cards (IBM AT and Apple era), minicomputers replaced with Unix workstations, workstations replaced with X Terminals, X Terminals replaced with Unix PCs, etc. In most of these cycles, the "thin" terminals have been embedded appliances which boasted processing and memory almost equivalent to the standalone computers of the same era. My CP/M desktop and its video terminal both had about the same type of CPU, similar amounts of RAM, and similar serial port IO capability.

What I expect is that the market speeds up and keeps getting more turbulent, so this psychologically driven cycle or market pulse will break down. At any one point in time, we'll just see a broader mixture of differently tuned device designs all proceeding in the market together with sufficient consumers to fund development. Their constituent parts are the real driving commodities: CPU, RAM, HDD, SDD, GPU, LCDs, wireless chipsets, etc. The printed circuitboard, power supply, and case form factor are more easily adapted for low-to-medium markets, so different combinations can be brought to market without too much overhead, unlike the individual integrated circuits coming from the chip fabs.

Re:Client-side / Server Side (1)

TaoPhoenix (980487) | about 4 years ago | (#31671678)

I like a hybrid approach.

Our enterprise accounting system is on the server, but office apps are local. Daily workflow seems to produce a lot of "debris", which conveniently forms little digital compost heaps on people's local machines. With a little nudging, if there's a document that's usefully finalized, that version gets posted to the server folder.

MS Office Basic is "essentially almost-free" for OEM hardware purchases, so why put Word and Excel on a server?

Re:Dumb terminals and smart people don't mix (1)

PopeRatzo (965947) | about 4 years ago | (#31669930)

Corporate networks, in my experience, typically have much more solid uptime than the internet.

Does the internet go down a lot? I haven't noticed.

Re:Dumb terminals and smart people don't mix (1, Funny)

Anonymous Coward | about 4 years ago | (#31670728)

"Does the internet go down a lot? I haven't noticed."
Then you are watching the wrong videos. They almost always start with going down.

Re:Dumb terminals and smart people don't mix (1)

Sir_Lewk (967686) | about 4 years ago | (#31671674)

I get phonecalls from my mother all the time telling me that the internet has gone down once again. Her home phone, which she uses when she calls me about this, is VoIP.

Re:Dumb terminals and smart people don't mix (1)

blahplusplus (757119) | about 4 years ago | (#31672492)

"Actually, it's all just one big cycle."

Not since the internet. The problem with thin clients is that they start to create single points of failure. The great thing about the net is redundancy; even if that comes at a cost, it gives you extreme amounts of flexibility.

The great thing about the net is redundancy: is a site down? Find it in the caches of the net.

Re:Dumb terminals and smart people don't mix (1, Funny)

Anonymous Coward | about 4 years ago | (#31669584)

Dumb peeple and dumb terminalz dont mix either.

Re:Dumb terminals and smart people don't mix (1)

commodore64_love (1445365) | about 4 years ago | (#31669590)

It's also dumb. Even if you bought a low-end Intel Atom machine, why would you want to waste that CPU letting it be a dumb terminal? Put that CPU to work by enabling it to do tasks independently even if the network connection fails.

Re:Dumb terminals and smart people don't mix (2, Funny)

oldspewey (1303305) | about 4 years ago | (#31669636)

Exactly - keep your CPU busy running SETI@Home while all your apps sit on a server somewhere.

Re:Dumb terminals and smart people don't mix (3, Insightful)

drinkypoo (153816) | about 4 years ago | (#31669738)

It's also dumb. Even if you bought a low-end Intel Atom machine, why would you want to waste that CPU letting it be a dumb terminal? Put that CPU to work by enabling it to do tasks independently even if the network connection fails.

I weep for OpenMOSIX. I was hoping that the project would continue and ere long we'd be motivated to buy all one architecture in our house simply because all the machines would form a cluster almost without our involvement and just accelerate each others' tasks. A terminal cluster where the terminals also make the entire system faster is kind of an ideal dream.

Re:Dumb terminals and smart people don't mix (1)

L4t3r4lu5 (1216702) | about 4 years ago | (#31670172)

You're not suggesting... A world wide beowulf cluster?!

Re:Dumb terminals and smart people don't mix (1)

drinkypoo (153816) | about 4 years ago | (#31670540)

You're not suggesting... A world wide beowulf cluster?!

That would be nice too, but there are many issues to be worked out first. Let Amazon &c work them out before we start building intentional cloud botnets. This would only provide you a single system image cluster in your house, and because Unix works on a process model, MOSIX works on process relocation. But when combined with LTSP and a bunch of machines of the same architecture (you could treat anything from Pentium on up, in x86 land, as i586 for example) then it would eliminate the need for local storage and provide a cluster without appreciable management, benefiting any home user with multiple machines. And anyone who sat down and netbooted their netbook from your network, gaining access to your applications and files (as per permissions) would improve the processing power of your cluster, while also being able to use it. Modern LTSP allows access to local resources, so they'd still have access to their own data in this context, while you don't have to worry about malware on their machine running on your network. This also presents a safe way to handle road warrior configurations while also keeping costs low; when laptops are booted in the office, they netboot from the server and join the cluster. When they are booted at home, they self-boot.

The latest LTSP supposedly allows you to configure the system to run some applications from the clients, so there are good reasons for home users with multiple machines to embrace LTSP as it is... but not nearly as good as if OpenMOSIX were a going concern.

Re:Dumb terminals and smart people don't mix (4, Interesting)

david.given (6740) | about 4 years ago | (#31670472)

I weep for OpenMOSIX. I was hoping that the project would continue and ere long we'd be motivated to buy all one architecture in our house simply because all the machines would form a cluster almost without our involvement and just accelerate each others' tasks. A terminal cluster where the terminals also make the entire system faster is kind of an ideal dream.

What happened to OpenMOSIX, anyway? I used it very successfully to turn groups of workstations into build servers; they all ran OpenMOSIX, and then make -j8 on any of the workstations would farm out the build to all the workstations. And it all Just Worked, and there was bugger all maintenance involved, etc. I was really looking forward to it getting mainlined into the kernel and then it just all kind of vanished.

There's no indication of what happened on the mailing list --- it just stops. There's a new project called LinuxPMI [linuxpmi.org] that claims to be a continuation but there's no mailing list traffic...

Re:Dumb terminals and smart people don't mix (3, Informative)

snarfies (115214) | about 4 years ago | (#31672298)

According to Wikipedia, "On July 15, 2007, Bar announced that the openMOSIX project would reach its end of life on March 1, 2008, due to the decreasing need for SSI clustering as low-cost multi-core processors increase in availability."

Re:Dumb terminals and smart people don't mix (1)

Ogi_UnixNut (916982) | about 4 years ago | (#31673342)

Yeah, in fact I built just that for my school many years ago. 10 computers (PIII's), set up as an openmosix terminal cluster. It worked really well. If all terminals were in use people had the power of one PIII just like normal, and if fewer people used it, then there would be more power for everyone. This was far more efficient, especially as the computers would be on anyway, and scaled really well, as we didn't need to invest in really beefy servers to host all the apps on. It really was cost effective, and I thought at the time it would be the way forward.

Unfortunately, back then nobody (outside the IT department) had heard of Linux, and they refused to use it, so the system was eventually reverted to plain Windows boxes.

Still, it was a great experience and I learnt a lot building the system; it is a shame openmosix is no longer developed. I think that even today it would be an awesome system, primarily because of its cost-effectiveness and efficient use of resources.

Re:Dumb terminals and smart people don't mix (1)

Drethon (1445051) | about 4 years ago | (#31669872)

Because having someone else do the crunching decentralizes the storage of the data you are using? With it decentralized you no longer have to upgrade your hard drive capacity, you can have a power outage and the data will still be processed, and multiple people can process it at the same time without interference?

This is assuming a perfect system; the server has to upgrade appropriately and have proper data, power and network backups to prevent the same issues. But how often does slashdot go down these days?

Re:Dumb terminals and smart people don't mix (1)

commodore64_love (1445365) | about 4 years ago | (#31670112)

Well coming from an era when we had dumb terminals, I have no desire to go back to that. I like being able to use my computer even when it's not connected to the net. Like last night, I was watching videos without a connection. I couldn't do that with one of those so-called "cloud" computers, because neither the movie nor the player software would be on my machine.

And if you're really concerned about backing-up your data, there are services you can use NOW to upload your HDD to the net, so if your house burns down you'll still have your data stored safely. You don't need to turn your 1500 Megahertz Atom computer into a dumb client.

Re:Dumb terminals and smart people don't mix (1)

Drethon (1445051) | about 4 years ago | (#31670258)

This isn't really something that works perfectly now, but it is something that could work in the near future (if everything goes perfectly).

You can already connect your computer to the cell phone network for internet access; improve the bandwidth and reliability and there is no reason a computer cannot always be connected to the internet.

Additionally, having the OS on the internet instead of your device allows you to be working on a document on your desktop, move it over to your iPhone to continue work on the bus (PLEASE NOT in the CAR!), and when you get to work move it off the iPhone to your work computer seamlessly. I see more power in being able to connect to the internet OS with multiple devices, possibly at the same time, so they don't operate completely independently.

Not saying this will be perfect but this is a possibility.

Re:Dumb terminals and smart people don't mix (0)

Anonymous Coward | about 4 years ago | (#31671932)

Additionally, having the OS on the internet instead of your device allows you to be working on a document on your desktop, move it over to your iPhone to continue work on the bus (PLEASE NOT in the CAR!), and when you get to work move it off the iPhone to your work computer seamlessly. I see more power in being able to connect to the internet OS with multiple devices, possibly at the same time, so they don't operate completely independently.

What kind of people need to be doing all this work, all the time, from everywhere? Moreover, who wants to be able to do this? Screw that, when I'm at work I'm at work, and when I'm not I'm not.

Re:Dumb terminals and smart people don't mix (1)

Drethon (1445051) | about 4 years ago | (#31672118)

You don't want to for work, but what about MMOs? Yes, this will probably lead to whole new problems, but think about the ability to run a game continuously across multiple devices. There is a part of me that is annoyed with the exploitation of MMO addiction, but I know a company will try this.

I'm sure there are other examples that would work too, e.g. GPS maps or grocery lists...

Re:Dumb terminals and smart people don't mix (2, Insightful)

jc42 (318812) | about 4 years ago | (#31670244)

... but how often does slashdot go down these days?

Actually, that's a good way to phrase it. That is, it may be true that slashdot itself is almost always up and running. But from my viewpoint, out here on an internet "leaf" node, slashdot quite often seems to be "down". It's fairly common that when I do a refresh, it can take a minute or more to complete. Sometimes when the "Done" appears at the bottom left of the window, the window is mostly blank, and it takes another refresh to get the summaries back on the screen.

The basic problem with the cloud-computing model is the same as with the thin-client+server model and the terminal-cluster+mainframe model: Your computing is done on one or more remote machines, over which you have no control, and even when that's working, the results you see on your screen depend on a comm network. That network might work well when first installed with short links. But if it's successful, it'll quickly become overloaded, and it'll be upgraded by a team of managers and workers who mostly don't have a clue about the technical details of the system.

The bean counters can explain all they like about how much cheaper centrally-controlled computing systems are. But if you actually want to get your work done, you'll once again discover that you need a computer that can do the work locally. If you don't have control over the machine, it won't do your work the way you want it done, and the people who do control it won't have a strong motive to help you with problems that they don't see or understand.

Re:Dumb terminals and smart people don't mix (2, Interesting)

Drethon (1445051) | about 4 years ago | (#31670492)

Where I'm at on the other hand I almost never have delay loading a slashdot page. Look at where the internet is now compared to where it was ten years ago. We have an explosion of broadband access compared to back then. If the internet continues growing that may no longer be a problem in ten years.

On the other hand an internet OS will use a lot of that bandwidth, likely leading to increased lag even as bandwidth increases (see hardware requirements of Win 95 vs Win 7...).

Unfortunately the only sure way to know how well it will work or not is to try it and see what happens. Then if it doesn't, see what fails and if it can be improved. The internet OS has enough potential to make it worth it but yes it has as many potential downfalls...

Re:Dumb terminals and smart people don't mix (1)

jc42 (318812) | about 4 years ago | (#31677682)

Unfortunately the only sure way to know how well it will work or not is to try it and see what happens.

Probably, and of course the open nature of the Internet means that people are free to experiment with a network OS. Actually, I've done that myself. Some 25 years ago, I demoed a "distributed POSIX" library that allowed me to do things like type "make" on one machine, and watch as it spun off subprocesses that compiled source on N other machines, linked them with libraries on other machines, and installed the object files into directories on yet other machines. My "netclib.a" included handling clock skew, so that the timestamps on different machines were adjusted to make them all UTC, so that make's standard scheme worked correctly. I even included a few VMS systems as some sort of show of virtuosity.
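
For illustration, here is a minimal Python sketch of the clock-skew idea (not the original netclib.a, which used rsh-era transports; the host names, ssh, and GNU stat are stand-ins): estimate each remote host's offset from the local clock, then express remote mtimes in the local timebase so make-style staleness checks stay correct.

    import subprocess
    import time

    def clock_offset(host: str) -> float:
        """Estimate remote_clock - local_clock, splitting the round trip down the middle."""
        t0 = time.time()
        out = subprocess.run(["ssh", host, "date", "+%s"],
                             capture_output=True, text=True, check=True)
        t1 = time.time()
        remote = float(out.stdout.strip())
        return remote - (t0 + t1) / 2.0

    def normalized_mtime(host: str, path: str, offset: float) -> float:
        """Remote file mtime expressed in the local timebase."""
        out = subprocess.run(["ssh", host, "stat", "-c", "%Y", path],
                             capture_output=True, text=True, check=True)
        return float(out.stdout.strip()) - offset

    if __name__ == "__main__":
        off = clock_offset("buildhost1")                      # hypothetical host
        src = normalized_mtime("buildhost1", "/src/foo.c", off)
        obj = normalized_mtime("buildhost1", "/obj/foo.o", off)
        print("rebuild needed" if src > obj else "up to date")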

Meanwhile a guy down the hall from me was doing his own distributed unix system, and we compared notes a lot. He borrowed my clock-correction code; I borrowed several chunks of his code. Of course, the Newcastle Connection folks were there several years before us. And none of these projects has had any influence that I can detect on the current "Internet OS" movement. The new distributed OS probably won't be based on any lessons learned by those who did it decades ago. ;-)

One of the predictions I'd make now might be a bit worrying. If you remember the kerfuffle a few years back, when msn.com was caught red-handed using images from customers' email and web sites in ads, you'll know where I'm headed. The first defense of msn.com was to point to the fine print of their contract, where it stated that any files stored on their computers became the legal property of msn.com. They backed off after a bit of publicity, as the parents of the kids in those ads started picking up torches and pitchforks. But the issue is still there lurking in the background. Many ISPs and web "hosting" sites still have such language in their fine print. This is going to bite a lot of people who move their stuff onto "the cloud".

Last weekend, my wife and I visited a friend who was one of the artist vendors at the Boston Flower Show. Her (the friend's) specialty is high-quality botanical drawings. Part of her income comes from making technical illustrations for botanical publications. She also sells prints of her drawings in various forms (cards, bookmarks, whatever), which she was selling at the show. The topic of putting her stuff online came up, so I described the msn.com story, and mentioned that this was a growing problem for people like musicians who want to distribute via the Internet rather than the ripoff that is the commercial music industry. A number of bands have found that their MP3s were now owned by their ISP that so graciously provided web space to customers. I made the point that if your web site is on a machine owned by any company, you may have legally handed your copyrights over to that company. Unless you're an expert "IP" lawyer, you probably won't know until it's too late. You'll find that the "hosting" company is selling your stuff, and there's nothing you can do about it.

She was visibly shocked by this. My advice was to learn to run her own web site. She uses Macs, so it's actually fairly easy. I may spend some time teaching her the basics. But she's obviously also worried about the complexity of all this. So maybe I'll end up volunteering to host her site on my machine, and promise that she keeps all the copyrights for her stuff. I've done this for a few friends who have heard the horror stories, done a bit of research, found out that I wasn't kidding, and asked for help.

Personal control was what caused the desktop-computer explosion in the 1980s, to end the control that company MIS/IT/whatever departments had over what people were allowed to do on their computers. Personal control over "IP" rights may well be what blocks widespread adoption of "cloud computing" or "Internet OS" or whatever the next buzz phrase will be. Once people realize that it's (in part) a ruse to take control over other people's files, the more knowledgeable people won't want to have much to do with it.

(I do use gmail, but I'm careful to not put anything there that contains anything that could be considered private by any of the people involved. This especially includes any identifying information for any accounts anywhere, but also material that might be copyrightable. I don't want to be the one responsible for accidentally transferring a copyright to google.com. ;-)

Re:Dumb terminals and smart people don't mix (1)

Drethon (1445051) | about 4 years ago | (#31680108)

My preference on this would be to only use sites that allow encrypted files. I've noticed a lot of clouds don't allow that, wonder why :)
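
A minimal sketch of that approach, encrypting on the client before anything is uploaded, using the third-party Python `cryptography` package (Fernet); the file names and key handling here are illustrative only, and a real setup needs proper key storage and backup.

    from cryptography.fernet import Fernet

    def encrypt_for_upload(plain_path, enc_path, key):
        """Encrypt locally; only the ciphertext ever reaches the hosting service."""
        with open(plain_path, "rb") as f:
            token = Fernet(key).encrypt(f.read())
        with open(enc_path, "wb") as f:
            f.write(token)

    def decrypt_after_download(enc_path, plain_path, key):
        with open(enc_path, "rb") as f:
            data = Fernet(key).decrypt(f.read())
        with open(plain_path, "wb") as f:
            f.write(data)

    if __name__ == "__main__":
        key = Fernet.generate_key()          # keep this key local; never upload it
        encrypt_for_upload("notes.txt", "notes.txt.enc", key)
        # hand only notes.txt.enc to the cloud provider
        decrypt_after_download("notes.txt.enc", "notes_copy.txt", key)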

Re:Dumb terminals and smart people don't mix (1)

cynyr (703126) | about 4 years ago | (#31670298)

Running things locally would work great if >90% of what people do these days didn't need files from some sort of network drive/server/export/etc., requiring network access anyway. Lots of commercial software won't run if it can't get a license from the network, and Outlook is just about worthless without a network connection. So really you need that connection anyway. Why do you seem to think that the loss of network access would need to immediately kill anything you were doing at the time? Wait for the network to come back up, and resume work as if nothing happened.

Why would you buy an Atom for this? An ARM SoC (Nvidia Tegra 2 comes to mind) would fit in the monitor and be able to do all of this. Since the client CPU isn't running any apps, its type isn't important on the client end. Since the apps are executed on the server, if it has x86-compatible CPUs all your software should work fine. Also, this is really only a concern for proprietary apps; lots of the major open source apps work on ARM/PPC/Blackfin just fine with little to no modification. MPlayer maybe not, but why does your client need to be able to play videos? Flash, again, isn't on ARM yet, but that would seem to be a good thing on a corporate network: YouTube stops working, Hulu, lots of other time wasters.

On Linux with X, you can already do this: GDM has support for logging into a remote machine; worst case, you need to run GDM on the real hardware and make the X session an SSH+key+exec of gnome/kde/etc. The only catch here is that removable devices don't work as well without a few more games, but in a corporate environment that may not be a bad thing. In fact, this should all work fine for off-network use (slower but usable) by just checking to see if you are at home, and if not, VPNing home and then doing the login.
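
A rough sketch of that SSH+key+exec setup as a small Python wrapper; the server name and the choice of gnome-session are placeholders, and a real setup would add the "VPN home first" step for off-network use.

    import subprocess

    APP_SERVER = "apps.example.internal"   # placeholder application server

    def remote_desktop_session(server=APP_SERVER):
        # -X asks for X11 forwarding, so programs started in the remote session
        # draw on the local display; SSH keys handle the authentication step.
        return subprocess.call(["ssh", "-X", server, "exec", "gnome-session"])

    if __name__ == "__main__":
        raise SystemExit(remote_desktop_session())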

Of course, I have no idea how to do any of this if you need Windows, and it would not be a good idea for CPU/GPU-intensive loads: Photoshop/CAD/3D animation. Windows doesn't really support showing only a few windows as windowing calls rather than pushing a pile of pixels. Yes, I know RDP handles only updating small parts of the screen at once, but that doesn't handle encrypting the stream, the key presses, mouse movements, etc. To be honest, Windows seems to really care about which machine something is being run on. Also, I'm not sure how licenses would work for, say, Excel: it's only installed on one or a handful of machines; yes, it's being run by 30 users at a time, but it's only installed in one place. Do you need two seats of Excel if people share a computer?

Re:Dumb terminals and smart people don't mix (1)

Drethon (1445051) | about 4 years ago | (#31669832)

Dumb terminals have the capability to eliminate nearly all hardware requirements for the client except for the ability to process the connection. On the other hand, they require extreme levels of backup on the server side, which has the potential to be cost-prohibitive.

We may be at the point where things are stable enough (how often do you lose your gmail? Yes, it went down for me the other day, but it's the first time in at least a couple of years). The risks are much higher than the gains, but they can be overcome if enough care is spent (not saying it will be, but...).

Re:Dumb terminals and smart people don't mix (1)

Trails (629752) | about 4 years ago | (#31670030)

So your implementation didn't handle faults well, therefore we should throw out the idea?

There are certainly criticisms to be made of the centralized model, but your anecdote isn't one of them. If the product you bought and/or stuff you built wasn't fault tolerant, then you bought and/or built the wrong solution.

Re:Dumb terminals and smart people don't mix (0)

Anonymous Coward | about 4 years ago | (#31670500)

Back in the 90's we tried this at my old university. We networked all our computers and put all our apps on a central server. Even though this was all done on a local network (much more reliable in those days than the internet), it was still a complete disaster.

Funny, we did the same thing at my Uni (EE) as well via NFS mounts (and a lot of diskless clients as well). Things worked splendidly.

Even things like lowly SparcStation 5s and 10s were used, and on those machines we had a wrapper script for the heavy-duty applications (e.g., Matlab) that would do an 'rsh' onto beefier machines.
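
A hypothetical reconstruction of that kind of wrapper in Python (ssh standing in for rsh; the host pool, the /proc/loadavg load check, and the shared NFS home directory are all assumptions):

    import shlex
    import subprocess
    import sys

    BIG_HOSTS = ["compute1", "compute2"]          # assumed pool of beefier machines

    def load_of(host):
        """1-minute load average of a remote host, or infinity if unreachable."""
        try:
            out = subprocess.run(["ssh", host, "cat", "/proc/loadavg"],
                                 capture_output=True, text=True, check=True, timeout=5)
            return float(out.stdout.split()[0])
        except Exception:
            return float("inf")

    def run_remotely(argv):
        host = min(BIG_HOSTS, key=load_of)        # pick the least-loaded machine
        cmd = " ".join(shlex.quote(a) for a in argv)
        # assumes a shared (e.g. NFS) home directory, as in the setup described above
        return subprocess.call(["ssh", "-t", host, cmd])

    if __name__ == "__main__":
        sys.exit(run_remotely(sys.argv[1:] or ["matlab"]))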

The idea is sound for many situations (not all, of course); perhaps it was your implementation that sucked?

Re:Dumb terminals and smart people don't mix (1)

tlhIngan (30335) | about 4 years ago | (#31671220)

This whole "Internet OS" thing reminds me of the periodic resurgences of the dumb terminal/thin client idea that goes back to the mainframe days. It seems like every ten years or so, everyone is talking about thin clients in every office, with the OS and apps running on some offsite server somewhere (now with the added twist of multiple servers over the internet). Ostensibly this is seen as a good way to save IT money and overhead. But in every actual deployment I've seen, it only causes hassles, additional expense, and headaches.

It's already happening though. Today's term for it is "cloud computing" - but it's the same idea, and people are embracing it to a huge extent. Services like Gmail and Hotmail for e-mail, Google Docs for office stuff, Google Apps - all being taken up rather quickly. Hell, Facebook is probably the top cloud-computing platform out there, offering messaging, gaming and many other services to millions of users.

So it may be a fad, but it's one that's catching on again.

Now, the forces pulling us back to the smart client model might very well be the iPhone/iPad, for its inability to run Flash means a lot of these cloud-computing apps don't work. Instead, users create local versions of the same apps. Hell, cellphones that require "the cloud" are also popular (Android, WebOS), and much to Palm's dismay, switching from WebOS to Android is quite simple since Android just grabs your data "off the cloud" (like WebOS does), meaning you don't have to go through lengthy transfer procedures.

A lot of stuff in computing is cyclical - hell, we seem to repeat history continually. The same goes for CPUs and offboard controllers - stuff done by offboard controllers is migrated into the CPU (e.g., FPU, memory controllers), and stuff that used to be done by CPUs is migrated to offboard controllers (e.g., GPU). And then CPUs will start spawning GPU-like things to move the GPU back onboard, while pushing other onboard tasks offboard...

Re:Dumb terminals and smart people don't mix (0)

Anonymous Coward | about 4 years ago | (#31671574)

Except that this is not at all what the Article is talking about. AT ALL. (I know it's slashdot...)

In particular, the article talks about the remote services that a local application would use as part of the platform. There is no mention of getting rid of local resources.

Re:Dumb terminals and smart people don't mix (1)

nine-times (778537) | about 4 years ago | (#31672472)

Yeah, I feel like there are a few problems with the vision of running a terminal/mainframe model, first and most obvious being, as you said, it introduces a central point of failure for everyone. If the server goes down, everyone on that server is suddenly unable to work. People will counter by saying, "well you just distribute it across a bunch of servers so there's no more single point of failure." It's harder than it sounds. If you distribute across servers, how do you manage that distribution? What happens when your method for managing that distribution goes down?

I don't think it's insoluble, but for the time being it sounds like more trouble than it's worth. One of the things it's important to keep in mind is how cheap computing has gotten. We have more computing power in our cell phones than existed in the biggest computers a few decades ago, and we're putting hundreds of gigabytes into USB thumb drives. It's ultimately not going to save you much money to forgo internal storage and computing power for a thin client, so people are usually going to get a thicker client anyway. Once you have that internal storage and processing power, you may as well use it.

Honestly, I think someone needs to invest in making a really smart syncing technology (possibly involving filesystem improvements and making applications more aware of file changes) to make it so that we can work locally and "in the cloud" at the same time. Imagine you could store all your data online so that it was encrypted and not even your host could access your data, but you could always access it seamlessly. Edit a Word document on your laptop, and it syncs to the server and back to your desktop automatically (similar to Dropbox). If you're working on someone else's PC it's also available via an online editor akin to Google Docs, and you could see each other editing similar to SubEthaEdit. Take a picture on your camera, and it's automatically uploaded to a service like Picasa, which then automatically syncs it to your "Pictures" folder on your laptop and desktop. It's also available to whatever social networking you're doing, bla bla bla. Your online storage is version controlled similar to Apple's Time Machine. Everything is stored online, but it's also cached on your local drives for quick and easy access.

We have an awful lot of the components lying around to build something like this, but not integrated together. And besides bandwidth/storage issues, whatever syncing software you use has to be smart enough to keep multiple devices in sync (files could be changed on one of any number of devices and the changes show up on all the rest). It also has to be smart enough to only sync the files that have been changed, or rather only the portions of files which have been changed, and to do it immediately upon write.
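
As a minimal sketch of the "only sync the portions that changed" piece, here is a fixed-size chunk-hashing approach in Python; real systems (rsync-style tools, Dropbox clients) use rolling hashes and content-defined chunking, and the file names here are hypothetical.

    import hashlib
    import json
    from pathlib import Path

    CHUNK_SIZE = 64 * 1024     # arbitrary illustrative chunk size (64 KiB)

    def chunk_hashes(path):
        hashes = []
        with path.open("rb") as f:
            while chunk := f.read(CHUNK_SIZE):
                hashes.append(hashlib.sha256(chunk).hexdigest())
        return hashes

    def changed_chunks(path, manifest_path):
        """Indices of chunks that differ from the manifest recorded at the last sync."""
        current = chunk_hashes(path)
        previous = json.loads(manifest_path.read_text()) if manifest_path.exists() else []
        changed = [i for i, h in enumerate(current)
                   if i >= len(previous) or previous[i] != h]
        manifest_path.write_text(json.dumps(current))
        return changed

    if __name__ == "__main__":
        doc = Path("report.doc")                       # hypothetical local file
        dirty = changed_chunks(doc, Path("report.doc.manifest"))
        print(len(dirty), "chunk(s) would be uploaded:", dirty)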

Re:Dumb terminals and smart people don't mix (1)

dkleinsc (563838) | about 4 years ago | (#31672898)

I was skeptical of this system from the get-go, but got overruled by some "visionaries" who had bought into the whole thin client argument with a religious fervor.

Or alternately, those "visionaries" were expecting to profit personally from the thin client manufacturer.

What we really want is the best of both worlds (1)

sean.peters (568334) | about 4 years ago | (#31673120)

There's no reason why we can't have both - data backed up/synchronized to the "cloud", and applications that can continue to run on locally cached data when the network is unavailable for whatever reason. There are still some cases where this is problematic - e.g. my iPhone Google Maps application really doesn't work in the hinterlands, as the phone won't have the maps locally stored - but this is really just a problem of caches not being big enough or smart enough to do what we need. The problem will be partly solved by brute force - it looks like flash memory will continue to get more dense for a while - and partly by increased intelligence from the applications themselves. In the case of the maps application, it's easy to envision a more evolved version of Google maps realizing that I'm about to leave a cell phone coverage area, and in the background, downloading maps I'm likely to need before it's too late to get them.
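
A toy sketch of that prefetch heuristic (not any real Google Maps API; the coverage test, tile mapping and fetch function are placeholders): walk the planned route and cache tiles for the stretches with no expected connectivity while a connection is still available.

    def tiles_to_prefetch(route, has_coverage, tile_of):
        """Tiles covering the stretches of the route with no expected connectivity."""
        return {tile_of(point) for point in route if not has_coverage(point)}

    def prefetch(route, has_coverage, tile_of, fetch_tile):
        for tile in tiles_to_prefetch(route, has_coverage, tile_of):
            fetch_tile(tile)          # store locally before the signal drops

    if __name__ == "__main__":
        # toy route heading west; pretend coverage ends past longitude -100
        route = [(45.0, -95.0 - i * 0.5) for i in range(20)]
        covered = lambda p: p[1] > -100.0
        tile = lambda p: (int(p[0] * 10), int(p[1] * 10))
        prefetch(route, covered, tile, lambda t: print("would cache tile", t))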

I think this is really what TFA is trying to point out, but now I'm probably in contention for the Captain Obvious prize myself.

Re:Dumb terminals and smart people don't mix (2, Insightful)

Tubal-Cain (1289912) | about 4 years ago | (#31673214)

Every time there was a glitch in the network, every student, professor, and staff member at the university lost the ability to do anything on their computer--they couldn't so much as type a Word document.

Meh. That's true for my workplace despite our thick clients. Network folders, Internet connection, Active Directory... If anything goes down the office just sort of grinds to a halt.

Re:Dumb terminals and smart people don't mix (1)

GWBasic (900357) | about 4 years ago | (#31675680)

Back in the 90's we tried this at my old university. We networked all our computers and put all our apps on a central server.

That's the point of local storage in HTML 5. Applications that make good use of it can run without a network connection, or when the server suffers a 30-minute "glitch."

Breathe deep then read this (-1, Redundant)

Anonymous Coward | about 4 years ago | (#31669560)

A combination of SEO (search engine optimization) tricks, product duplication, fake companies, and bogus product reviews have muddied the water for software. No wonder app stores have begun taking over--it's almost impossible to find good software on the Internet. It's hard to find anything beyond big names like Microsoft Office and Adobe Photoshop, because little sleazeball companies have taken over the space.

You see them all the time. If you're looking for any sort of utility on Google, you'll come across a ton of crappy and often useless (if not outright fraudulent) garbage that has risen to the top, thanks to what I like to call "active" SEO trickery. The normal SEO ideas used to improve search results are being actively modified on what seems like a day-by-day basis.

Try searching "free video converter." Say you want to convert an .avi file to .mp4. Which one should you use? [Note: None of them are actually free.] When you start digging through the list, you'll find a number of anomalies. First of all, the cream doesn't rise to the top anymore. That's where the empty plastic bottles and flotsam are. The cream has dissolved, and you can't use Google to find it.

To find the cream, you have to go to a forum and ask around. Of course it's nearly impossible to find a subject that everyone agrees on. If you have enough followers, you can use Twitter or Facebook for crowd-sourcing, but you have to have a lot of tech savvy followers for this to work.

In the days when computer magazines ruled, these things were done by staffs within a rigid review framework. That's impractical nowadays. You'd be hard pressed to find any print computer magazine that will tell you which is the best video utility out there. And if they did, few of today's users would know to source from the magazine. The situation is grim.

Sourcing directly from Google is problematic. Sellers market crap by repacking it on numerous similar Websites, re-branding it so it competes with itself. Various SEO tricks help bring the product to the top of the search results. These are one or two person operations with sometimes slick packaging and products that seldom work as advertised. The companies protect themselves with onerous EULAs that say the product sucks, so it's all good and legal. A few offer bogus trial periods or shareware-like deals that are out and out frauds, with nearly all of the important features disabled. Often times the products just don't work at all.

People credit Apple's creation of the iTunes App Store as part of the company's control freak nature. That may be part of its impetus, but we can't overlook the fact that the store manages to rein in the kind of crap we see in the online world, where scammers dominate.

There are a number of reviewing initiatives that hope to compete with the App Store for mindshare. I suspect that they will eventually have some impact. For now, however, there are simply too many sub-categories and too much confusion. Which are the best font cataloging and organizing tools? What is the best software for moving a DVD recording to my laptop? What is the best software for converting an old XYwrite document to HTML? What software best converts hard CRs to soft CRs?

It's rare to find what you're looking for if there's any complexity involved. You get junk instead. Even when a trusted source like PCMag or a dedicated review site actually offers a listing of the particular utility you're seeking, it's a miracle if it shows up in the Google results. Why? Because these sites aren't doing active SEO on every single page of content they're publishing.

I've complained in the past about SEO and its impact on the Web. All I get is flak from SEO consultants. But that hasn't stopped me yet. My favorite search example is "best cell phone plan." Good luck with that one.

The app store of the future will have the 10 utilities that might work with star ratings telling readers what other users think. Within the app store structure, if anyone tries to rig the rating, their company and products get permanently ejected.

Yes, public reviews are often corrupted. This is another good reason to turn to a trusted source. But where are they? They're buried, too. I would include myself in that group, but I don't do much reviewing anymore. Most companies don't want their products reviewed because they're not very good--and they know it. They'd rather just market it on the Internet and make bogus claims about it.

I'm actually amazed when I find a great product that does everything it says. The system is hopeless.

Re:Breathe deep then read this (1, Interesting)

Anonymous Coward | about 4 years ago | (#31669746)

I think that's where old-school software download sites shine again. They are basically app stores for free/shareware apps; and they've been around for decades.

With the advent of Google-level search engines, they became a lot less relevant. Now that Google & co are spammed to death, they regain part of their old glory.

It's not all black and white though. App-stores suffer from fraudulent entries that try to game the system, too. I've followed the reports of various Apple App Store developers for a while and even though Apple is tough, a lot of dubious crap falls through the cracks. On the other hand, Google tries to combat sites that try to game their ranking algorithm (and fails miserably).

P or NP (3, Insightful)

daveime (1253762) | about 4 years ago | (#31669596)

It seems the hardest and most time-consuming problem with Internet operating systems is figuring out how to work offline.

And the easiest solution, which seems to escape almost everybody, is "don't work online in the first place".

Re:P or NP (0)

Anonymous Coward | about 4 years ago | (#31669932)

The only winning solution is to never play...?

Re:P or NP (3, Interesting)

starfishsystems (834319) | about 4 years ago | (#31670116)

Not really. Your situation of working offline is a particular case of working online. It just happens to have high latency. So the easiest solution, for the user, is one which generalizes to encompass high latency.

The converse is not true. Of course you can retain the capabilities of an offline environment even after you add a wire to it, but those capabilities do not generalize to managing the resources on the other end of the wire.

The easiest solution to implement is a pencil and a piece of paper. Oh, you want capabilities too? Well, that's different.

Re:P or NP (2, Interesting)

Late Adopter (1492849) | about 4 years ago | (#31673964)

And the easiest solution, which seems to escape almost everybody, is "don't work offline in the first place".

FTFY. Having my data available on any online computer or device that I happen to be at *increases* its availability to me, even in the presence of occasional outages. There's down-sides, such as privacy, but availability isn't one of them: it's a net positive.

Re:P or NP (0)

Anonymous Coward | about 4 years ago | (#31687480)

As a lawyer explained to me, they have to have all files and applications available locally when the mainframe is down, because Federal Judges don't care why you missed a filing.

Internet as a living entity (2, Interesting)

gmuslera (3436) | about 4 years ago | (#31669640)

If it were a living thing, it would have cancer, several kinds of it, spread all around the body. Botnets, zombie armies, spam, malware sites... a good percentage of it is just badly sick. It has several brains too, some of them working against the health of the whole body by not letting the "blood" flow freely all around, as some governments censor it for political or lobbying reasons.

It has its strengths too: it is maturing (hopefully), it has a good defense system so the sickness spread around doesn't infect everything, and it evolves fast (even if limited by laws, patents, trolls, etc.), getting more personal and localized.

With a bit of luck, people, institutions and governments will start to worry about its health and the ecosystem that it is, and start working on preserving it as much as the planet we live on.

Re:Internet as a living entity (0)

Anonymous Coward | about 4 years ago | (#31676234)

I think a more apt analogy would be a government-sponsored forest in which one type of tree is planted after the initially diverse ecosystem is leveled. Consequently, once one tree becomes infected, the rest are infected... :(

Quoting the article (0, Flamebait)

vikingpower (768921) | about 4 years ago | (#31669722)

"always-on future", frequent use of the word "massive", "the internet operating system is an information operating system" etc. etc. etc. Besides the article a big colored blotch blares something about some "web2.0 expo" - whatever that may be. Brief: are there people actually listening to / reading this guy and his baked air ? What a bunch of meaningless cr*p !

Quote (1)

globalsnake (1345027) | about 4 years ago | (#31669820)

So where is the quote of the hour coming from? Sheep herders?

Re:Quote (0)

Anonymous Coward | about 4 years ago | (#31670040)

As I listened, I heard the sheep herder herding the herd of sheep.

Re:Quote (1)

globalsnake (1345027) | about 4 years ago | (#31670234)

Well tell me something I did not know. So obviously sec sos, so are you going away from hand held radiation devices to implants, easier to send torture signal?

Plan 9 Anyone? (1, Funny)

Anonymous Coward | about 4 years ago | (#31669862)

It does sound like everything Plan 9 was trying to solve, and did solve to a certain extent. The trouble is Plan 9 was too early for its time, and it still is. There is a larger problem too: ownership. It is clear who owns and is responsible for individual machines, but who owns the mystical "between the machines" space? Google? Government? United Nations? Can't pick which is worse.

Seriously? Microsoft? (0)

Anonymous Coward | about 4 years ago | (#31670026)

"..Along came Microsoft with an offer that was difficult to refuse: We'll manage the drivers; all application developers have to do is write software that uses the Win32 APIs, and all of the complexity will be abstracted away. "

In which universe did Microsoft first come up with the concept of driver management and a standardized API?

Sounds more like he's summarizing the most popular services of the World Wide Web today, and calling all that the Information Operating System. We've heard allegations of the WWW being an OS before.

Re:Seriously? Microsoft? (1, Funny)

Anonymous Coward | about 4 years ago | (#31670214)

Sounds more like he's summarizing the most popular services of the World Wide Web today, and calling all that the Information Operating System..

He missed porn.

internet os = ubisoft drm (0)

Anonymous Coward | about 4 years ago | (#31672544)

After the mess with Ubisoft's god-awful DRM, why do people think having a cloud-based operating system is a good idea? And if it can survive a disconnection and thus doesn't need the net at that point, what's the point in having it in the cloud at all?
There are too many potential problems with cloud-based OSes, such as a company owning your data instead of you, the potential for their servers to be hacked, net disconnection, etc., all for the advantage of slightly cheaper hardware for you. I want to own the data on my hard drive while it's not connected to the net.

Crappy trade off in my opinion.

Mobile code, redundant data (1)

ka9dgx (72702) | about 4 years ago | (#31673974)

I think a better version of the future is to secure the PC using sandboxing and capabilities to limit the side effects of applications. This then allows you to download and run apps on your PC, without the need to trust them. You could then have redundant copies of your stuff spread across your various devices. Your stuff includes photos, videos, documents, and the code to manipulate them.

The focus on services is a result of the distortions caused by the lack of a good security model on the PC. Once that gets fixed, a lot of things work better.

Re:Mobile code, redundant data (1)

atrimtab (247656) | about 4 years ago | (#31675060)

Agreed. The idea of cloud computing is a power play to make users feel more secure given the inherent problems of (primarily) Microsoft Windows usage on the Internet.

The pitch is: "We'll do everything for you in the cloud and then it won't matter what you are running on your internet access device."

The problem with that model is that everything gets controlled by someone else. But the majority of non-technical customers do not understand how much they are giving away with that service model. They feel safer with a large corporate entity telling them what to do than with local in-house technicians and service providers.

I think that the future model should be more like local clients synced across the internet, as the Dropbox service provides. Everything works whether you are connected or not, and everything is re-synced whenever a client connects to the net, but no processing or closed applications are run "for you" in the cloud.

Most consumers see technology as "magic" and don't realize what they give away or lock themselves into until it's way too late. We are seeing that more and more each day with Facebook, Apple, Google, etc., whose business models depend on naive users accepting free services or supposedly "safe" hardware in exchange for lock-in.

Google at least seems to be offering the most open data formats on its services, so the user lock-in is not nearly as complete as with Apple products.

Apple seems to want to re-invent television via a completely locked-in, patent-controlled, proprietary walled garden on which they can charge tolls to everyone who uses it. Good for Apple and their shareholders, and bad for almost everyone who buys into it. But most buyers won't realize that until they've invested time, $$$ and their data in a product that's more like pretty handcuffs than a good tool.
