
New York Times Says Thin Clients Are Making a Comeback

timothy posted about 6 years ago | from the dialectic-materialism dept.

Networking 206

One of the seemingly eternal questions in managing personal computers within organizations is whether to centralize computing power (making it easy to upgrade or secure The One True Computer, and its data), or to push the power out toward the edges, where an individual user isn't crippled when a server on the other side of the network is down or when the network itself is unreliable. Despite the ever-increasing power of personal computers, the New York Times reports that the concept of making individual users' screens (smart) portals to bigger iron elsewhere on the network is making a comeback.


How cool! (4, Funny)

Tink2000 (524407) | about 6 years ago | (#25351769)

Now the terminals that my work has had since 2003 are back in vogue. Awesome.

Re:How cool! (2, Insightful)

religious freak (1005821) | about 6 years ago | (#25352137)

Heh, I'm in financial services, try 1963. Nothing like using a state of the art thick client to emulate a 60's era dumb terminal... your fees at work!

Re:How cool! (0, Offtopic)

Ash-Fox (726320) | about 6 years ago | (#25352829)

If you can read this... 01110101 01110010 00100000 01100001 00100000 01100111 01100101 01100101 01101011

01101110 01101111 00100000 01110101 00100001

Re:How cool! (0)

Anonymous Coward | about 6 years ago | (#25352983)

01101110 01100101 01100101 01100100 00100000 01100111 01101001 01110010 01101100 01110011
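For anyone who doesn't read binary at a glance: the posts above are just space-separated 8-bit ASCII. A minimal Python sketch to decode them (the sample string is the first one above; decode_binary is a hypothetical helper, not anything from the thread):

    # Decode space-separated 8-bit binary into text, assuming plain ASCII.
    def decode_binary(message):
        return "".join(chr(int(octet, 2)) for octet in message.split())

    print(decode_binary(
        "01110101 01110010 00100000 01100001 00100000 01100111 01100101 01100101 01101011"
    ))  # -> 'ur a geek'

The remaining strings decode the same way ("no u!" and "need girls", respectively).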

Re:How cool! (2, Funny)

Anonymous Coward | about 6 years ago | (#25352373)

2003? Bring back 1973, the last year when American clients were thin.

GOATSE!!! (-1, Troll)

Anonymous Coward | about 6 years ago | (#25353157)

http://goatse.cz/ [goatse.cz]
You nerds love it!

Re:How cool! (1)

monghe (1384125) | about 6 years ago | (#25352947)

Thin they may be, but they all run P2P applications.

Could have told you that was coming (4, Insightful)

falcon5768 (629591) | about 6 years ago | (#25351789)

It's all in the upkeep. It is cheaper and easier to maintain a bunch of servers, with a bunch of lightweight computers hooked into them, than to maintain an individual machine for EVERY person. While there will always be things that having an individual machine is better suited for, for those people who only need internet, database access, and word processing, it makes little sense not to just maintain that stuff on a secure server and farm it out to everyone else. I have been pushing this for years in our school district; it's only now that the people who get to make the decisions are finally listening.

Re:Could have told you that was coming (3, Interesting)

Bill, Shooter of Bul (629286) | about 6 years ago | (#25351875)

Don't forget the cost of maintaining the network. In a school district setting that would probably mean a WAN connecting all of the schools and district offices together. If the network goes down... everyone has to stop working. I'm sure you are very talented and it might work for your particular district. In my area, I know the level of network engineers they have, and I'm convinced the whole thing would blow up.

Re:Could have told you that was coming (5, Informative)

wintermute000 (928348) | about 6 years ago | (#25351961)

Which is why it's not a great idea putting mission-critical thin clients across a WAN.

Though having worked for several years in large corporate environments (and their associated love for Citrix farms), I would observe:

- WAN accelerators work. A Riverbed (mind you, at ~$50,000 AUD a pop it ain't exactly cheap) will make a 2M link seem like LAN speeds for the protocols it's optimised for. Depending on cost of bandwidth....

- Consolidation does not have to go overboard. If there are at least a few hundred users, it can be cost-efficient to run a local server. Most network problems that are not the result of a bungled change / cabling stuffup are WAN.

- Government network? Good luck with that, buddy!

- The bean counters find it very easy to quantify the cost 'savings' and push their agenda as such; not so your potential losses due to downtime caused by network outages.... Heck, the Fortune 500 company I am contracted to presently doesn't even have a method for estimating the dollar cost of downtime, let alone a method for estimating the amount of downtime likely to occur (needless to say, they also chose the cheapest carrier, which has a ridiculous inability to meet SLAs, and then consolidated like mad to place even more reliance on the WAN).

Like most things in IT there is no silver bullet or magic formula; each case needs to be judged on its own merits.

And on a side note, given how much hardware costs have dropped and the fact that user requirements have remained relatively static (i.e. most generic office workers are still using the same software as 4 years ago), how hard can it be to embed the email client (with a local cache so they can at least view emails they already downloaded) and office suite on the thin client itself, so at least they can keep working on documents?

Re:Could have told you that was coming (1)

Hognoxious (631665) | about 6 years ago | (#25352591)

A riverbed (mind you at ~$50,000AUD a pop it ain't exactly cheap) will make a 2M link seem like LAN speeds

When it stops working, do you refer to it as a billabong?

Re:Could have told you that was coming (1)

wintermute000 (928348) | about 6 years ago | (#25353147)

When the alternative is that amount per year if you want to double the bandwidth - it doesn't seem so bad does it?

Re:Could have told you that was coming (3, Interesting)

peragrin (659227) | about 6 years ago | (#25353139)

Here is the kicker: you can't easily run Citrix and Windows apps across a WAN. They need too much bandwidth and are lag-sensitive.

My company runs an AIX server with SSH access. Each user literally SSHs into the server, which loads up access to the point-of-sale/inventory database. Everything important is tightly controlled, but the fact that you can run it over a dial-up 33.6k modem effectively means that even if the internet is choking you can still work.

Re:Could have told you that was coming (1)

wintermute000 (928348) | about 6 years ago | (#25353159)

Citrix isn't too bad, but I wouldn't like to use email or office apps over it. Database / web or web-like frontends are tolerable

Re:Could have told you that was coming (1)

An dochasac (591582) | about 6 years ago | (#25353389)

Have you tried Sun Ray? Some Citrix marketing material uses cute statistics such as "average bandwidth used over 24 HOURS", which might make it seem a better low-bandwidth solution than it is. On the other hand, I find Sun Ray very usable over a WAN even though my home network connection barely exceeds 1Mb. As to the fact that thin clients are useless when the network goes down... so are PCs, for all practical purposes. The fact that Sun Rays use 1/40th the power of a typical desktop PC would be awesome by itself, but the fact that I can upgrade thousands of desktop clients in the time it takes to upgrade 1 PC probably saves more money (headcount) in a typical corporate environment.

Re:Could have told you that was coming (0)

Anonymous Coward | about 6 years ago | (#25353419)

Sure if you've got the option, but I work in security and we're only allowed to use a couple of work related programs. All of which are run over a secured connection to an off site server.

We wouldn't be able to get away with doing it on site because of backups and the fact that many sites across America have to be using the same pool of applications. Sure it sucks when the apps go down, but we do have a backup software piece which can be used when the server goes down.

Re:Could have told you that was coming (0)

Anonymous Coward | about 6 years ago | (#25351987)

Don't forget the cost of maintaining the network. In a school district setting that would probably mean a WAN connecting all of the schools and district offices together. If the network goes down... everyone has to stop working. I'm sure you are very talented and it might work for your particular district. In my area, I know the level of network engineers they have, and I'm convinced the whole thing would blow up.

Well, you're not the only one on this. The network at my college has to be supervised (yeah, I said it) by two different people. And God, do they complain when we - the students - use the library computers for anything else besides research? Yes, they seem like complaining schoolgirls in comparison to us.

Re:Could have told you that was coming (1)

Jellybob (597204) | about 6 years ago | (#25352909)

What's the "anything else besides"?

If you're using it to do word processing instead of Internet access, fair enough, but if I were running your *library* network, I'd be pissed at people deciding to sit around playing Flash games as well.

Re:Could have told you that was coming (0)

Anonymous Coward | about 6 years ago | (#25353277)

And I'd be pissed that the people I'm paying for service are whining little bitches about it.

Re:Could have told you that was coming (2, Informative)

lostguru (987112) | about 6 years ago | (#25352047)

Our district already has it: each school has two T1s direct to the district office, and VoIP and web all go through there. Works fine; the only problem is we can't get the morons at the district to remove things from the content filters.

Re:Could have told you that was coming (1)

Antique Geekmeister (740220) | about 6 years ago | (#25352275)

But now that is considered a normal, critical part of services like electricity and fresh water.

Re:Could have told you that was coming (0)

Anonymous Coward | about 6 years ago | (#25352361)

No, you have servers on site - the gist of it is you maintain 1 box instead of 20.

Re:Could have told you that was coming (0)

Anonymous Coward | about 6 years ago | (#25353469)

If you can get away with that, you're obviously not working in something critical like security. The first copy is generally kept off site, which requires this sort of off-site solution. Having only one copy stored on site is not good in the long-shot case where the building is destroyed.

We do have a back up, but it's not used very often, probably less than once a year.

Re:Could have told you that was coming (3, Insightful)

ObsessiveMathsFreak (773371) | about 6 years ago | (#25352465)

If the network goes down.... every one has to stop working.

At this point, if the network goes down then all clients, thin or thick, will effectively stop working anyway.

Re:Could have told you that was coming (2, Insightful)

Hognoxious (631665) | about 6 years ago | (#25352611)

I need the network to work on a report or spreadsheet that's stored on my local hard drive?

Re:Could have told you that was coming (1, Informative)

Anonymous Coward | about 6 years ago | (#25352703)

If you're storing business documents on your local hard drive you're doing it wrong: it's likely your hard drive is not backed up on a regular basis. Every place I've worked has had policies against doing this for exactly this reason. Almost all of them mapped "My Documents" (and/or $HOME) to a network share (or NFS filesystem) anyway, so users would have to go out of their way to store things locally.

Re:Could have told you that was coming (1)

TheSovereign (1317091) | about 6 years ago | (#25351895)

Virtualization is leading the way on this front. We even virtualize desktops at work; mobile units must literally RDP into a VM in order to work. We don't allow information out of our system.

Re:Could have told you that was coming (2, Informative)

wolf_bluejay (1369783) | about 6 years ago | (#25351979)

Of course, the idea of running a server and a bunch of lightweight clients is so much easier to tend to. I work for a school district and we run our own version of thin/diskless clients. We have a few thousand running now, and convert about a thousand more every year. After 3 years of great improvements all around, we are never going back to individual stations. I do find it comical that old ideas seem to keep coming back, and it just might be because they are good ideas. Of course, we run fat/diskless for most of it, so that kills most of the downsides. And yes, we run a little over 700 machines from one server.
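A back-of-the-envelope comparison shows why a single server can feed hundreds of diskless fat clients but only a handful of full terminal-server sessions; the figures below are illustrative assumptions, not this district's actual numbers:

    # Rough sizing sketch; every figure here is an assumption.
    server_ram_gb = 32
    os_overhead_gb = 4

    ts_session_mb = 400        # assumed RAM for one full terminal-server desktop session
    diskless_client_mb = 15    # assumed server-side RAM per diskless client (NFS/TFTP state, cache)

    usable_mb = (server_ram_gb - os_overhead_gb) * 1024
    print("full sessions per server:", usable_mb // ts_session_mb)          # ~71
    print("diskless clients per server:", usable_mb // diskless_client_mb)  # ~1911

With fat/diskless clients the applications run on each client's own CPU and RAM, so the server mostly just serves boot images and home directories, which is how numbers like 700 clients per server become plausible.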

Re:Could have told you that was coming (1)

poetmatt (793785) | about 6 years ago | (#25352011)

You sidestepped the major issues and questions surrounding thin clients and related setups:

The first question is: for your supposed all-around solution, what exactly is it intended to be used for?

The second is:
Why couldn't said problem be solved by people just having computers in the first place, albeit cheap ones, if the workload is so minimal it can be done on a thin client?

Re:Could have told you that was coming (4, Interesting)

Bazman (4849) | about 6 years ago | (#25352027)

Seven hundred?!?! Microsoft had a web page where you could put in your client requirements and they would tell you how many Win 2003 TS machines you would need to support these clients. I don't think we ever got it down to fewer than 10 users per server - how did you manage 700?

Currently we have four servers for about forty seats in our labs. They don't get much usage, and people don't seem to notice they're sharing a machine with the other 10 people on that row of the lab.

I'd give thin clients to everyone, but then someone in an office of their own will tell us they really need Skype, and they really need a web camera... I suppose these things could be connected to a thin client and forwarded over USB, but it's not something we've tried...

  The other show-stopper is where users need admin rights for particular software. It does still seem to happen, mostly with big important pieces of software like our finance system or student records management. It may just be it needs to write to the C: drive so we could bodge it with access rights, but we don't want to screw up the installation so the user gets admin rights. Now, could we do that on a shared Windows 2003 TS box? I don't think so. With VM tech we could give them a VM of their own to play with though...

  VM tech has also helped us deploy Linux and Windows to our labs. Previously we had say four servers running Linux and four running Windows, and if the lab session needed Windows then there were four Linux servers sitting idle, and the users crammed onto the four Windows servers. With VMs, we stick a Windows and a Linux VM on each server, then the users are more spread onto the eight servers. Win.

Re:Could have told you that was coming (1)

markdavis (642305) | about 6 years ago | (#25353265)

We use Linux servers, Linux apps, and Linux thin clients. Our ratio is two servers for 160 clients/users. But we also have support from the CEO, so when we say "no" to users wanting animation, sound, etc, it means "no". Users have access to everything they need to perform their jobs and costs are very low (compared to thin OR fat MS environments).

Re:Could have told you that was coming (1)

ILuvRamen (1026668) | about 6 years ago | (#25352013)

Oh yeah, so what happens when I want to plug in a USB drive or put in a CD or DVD? No matter what you answer, it's either going to involve a lot of walking to the server room (yeah right), or a huge delay while the entire contents of the storage device are transferred over the network to open the file(s) on it, OR, even worse, I simply can't use a disc or USB drive. It's idiotic in today's world for any school or work environment. Sure, put it in for public access terminals instead of super locking down XP; otherwise skip it.

You should be asking "Why?" (4, Interesting)

Suzuran (163234) | about 6 years ago | (#25352483)

Easy, same way I handle it at our office with our terminal server: "You can't do that."

Employees have no business copying CDs worth of data to (or worse, from) the office. In the eight years since the implementation of our terminal server environment, I have had exactly zero cases where there was a legitimate need to copy large amounts of data from the terminal server.

Your computer at work is for working, not playing games when you think nobody is watching. Almost all of the complaints I get from employees wanting a "real PC" instead of a thin client revolve around their desire to screw around on the clock without being detected.

In 100% of the cases where the employee was granted a PC instead of a terminal, later investigation revealed unauthorized usage within one month, ranging from forging call sheets in order to play Flash games, to a salesman using over 75% of the company's total internet transfer in one month on MySpace.

Re:Could have told you that was coming (1)

WTF Chuck (1369665) | about 6 years ago | (#25352791)

Why do you need to plug in a USB drive? Are you trying to steal data? Are you trying to load a trojan onto the system? Are you trying to load pirated software so you can then call the BSA? Are you trying to load up MP3's and P2P software so that the RIAA will send nasty grams?

If you really think that the IT department is going to let you anywhere close to their servers without having you dragged off by security, you need to seek professional help.

Welcome to the 21st century, where everyone is either litigation crazy, seriously covering their asses to keep them from being sued off, or just plain stupid.

I do hope that you just forgot to include the proper <sarcasm /> tags

Eh (4, Insightful)

Sycraft-fu (314770) | about 6 years ago | (#25352055)

There are plenty of downsides too. While it might be easier to maintain, it is also easier to fuck up. Someone does something that breaks a piece of software, now the whole department/company/whatever doesn't have it rather than just that person. A network outage is now a complete work stopping event rather than an inconvenience. Special software installs for special tasks are hard since that software has to be tested to make sure it doesn't hose the server.

I could keep going, if I wished. Now, that isn't to say the thin client model is bad. In fact we are hoping to do it for our instructional labs at some point. What I'd really like (and there are VM solutions that do this) is that not only would we have thin clients, but a student could also use a laptop as a thin client and load our image from home or wherever.

However, the idea that they are just cheaper/better is a false one. They can be cheaper in some cases; in others you can easily spend more. Likewise they can simplify some things and make others more complex.

There isn't a "right" answer between large central infrastructure and small distributed infrastructure. It really depends on the situation.

All I will say is if you are looking at doing this at your work as you suggest be very, very careful. Make sure you've really done your homework on it, and make sure you've done extensive testing. I don't think it's a bad idea, but be sure you know what you are getting in to. Just remember that while people get whiny when, say, an e-mail server goes down, if the terminal server goes down and NOTHING works, well then people go from whiny to furious in a second.

It's the same kind of deal with virtualization. It is wonderful being able to stack a bunch of logical servers on to one physical server. However if that one physical server dies you can be way more fucked. You have to spend a good deal more time and money in making sure there is proper redundancy and backups and such. So while packing 10 servers on 1 using VMWare Server (free) might be nice and cheap, you also might be creating a ticking time bomb. You then might discover that putting those 10 servers on a small cluster with a fibre channel disk array and VMWare Virtual Infrastructure (not free) solves the reliability problem nicely, but isn't quite as cheap as you thought.

Just something to be careful with. At work we have both sorts of things. We've got individual desktops, and we've got thin clients (though we actually got rid of most of those). We've got individual servers, we've got virtual servers, and so on. All methods have advantages and disadvantages. I am not a zealot either way, just warning that a change from a decentralized to a heavily centralized infrastructure isn't something to be done lightly. You solve various problems, but introduce a host of new ones.

In particular hardware reliability is something you want to keep in mind. You for sure want an "N+1" situation with your terminal servers, perhaps even more than that. You can't count on the hardware being reliable. Hopefully it is, but I've seen even the real expensive, redundant shit (like a Sun v880) fail with no warning. When it's the be all, end all and all work stops when it is down, that just can't happen.
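The N+1 point can be put in numbers with a simple (and admittedly idealised) availability calculation; the uptime figure is an assumption, and real failures are rarely independent:

    # Idealised availability maths; assumes independent failures, which is optimistic.
    hours_per_year = 8760
    host_availability = 0.995    # assumed: one host is up 99.5% of the time

    # One consolidated host: every session/VM dies with it.
    single_host_downtime = (1 - host_availability) * hours_per_year
    print(f"single host: ~{single_host_downtime:.0f} hours of total outage per year")  # ~44

    # An N+1 pair: service is lost only if both hosts are down at the same time.
    pair_downtime = (1 - host_availability) ** 2 * hours_per_year
    print(f"N+1 pair: ~{pair_downtime * 60:.0f} minutes per year")                     # ~13

Which is exactly why the redundant hardware and shared storage needed to get there tend to eat into the savings that made consolidation attractive in the first place.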

Actually, it's probably a PR story (5, Insightful)

Moraelin (679338) | about 6 years ago | (#25352253)

1. Actually, regardless of whether they are making a comeback or not, or what their advantages and disadvantages may be, this is probably just a PR story. Just like the "The Suit Is Back!" that got traced back to a PR agency a couple of years ago.

PR loves to masquerade as news because it bypasses your BS filter. An ad for Men's Wearhouse suits gets looked over; a piece of news that you won't get hired unless you wear a suit tries to replace your premises with theirs and lets you take a leap to the "I must buy a suit" conclusion. Or better yet, to the even better conclusion, "I must only hire people in suits 'cause everyone else is doing it." There are a lot of sheeple out there who only need a "The Herd Is That Way -->" sign to willingly enter someone's pen and be sheared like "everyone else".

For anyone who's not sheeple, this is a non-story. Whether _you_ need a server instead of PCs or not, depends on what _your_ needs are and what _your_ employees are doing. Use your own head.

The only ones who need an "everyone else is doing X" story are those who have to follow a herd to feel secure.

Hence, the love PR has for this kind of story.

2. Over-simplifications like "all they need is internet, database access, and word processing" were false when arguing why grandma should only need an old 486, and tend to be just as false for a company. So you'll have to do some analysis to see whether, for a particular company, that is indeed true, or whether it's just glossing over what's really going on. (Or even wishful thinking by some IT guy who feels his job would sound more important if he were overseeing a server.)

E.g., a lot of companies have salesmen who go with a laptop to various customers to give a presentation and try to win a contract. Are you ready for the case when the guy you're trying to sell insurance to doesn't have internet for connecting to your server via VPN? Are you sure that those server-side apps' files can be converted flawlessly to MS Office or whatever those sales guys have on their laptops?

It's just one example where going, "bah, they only use database apps and word processing" is glossing over a more complex problem.

3. The argument for saving costs is a good one, and far from me to advise wasting money. But you have to be sure that you're actually _saving_ money across the organisation, not just saving $1000 in the narrow slice you see, at the cost of causing $1,000,000 to be lost in workarounds and lost productivity somewhere else. Entirely too much "cost cutting" lately is the latter kind of bullshit theatre.

E.g., if someone costs you $100,000 per year -- and I don't mean just wage, but also electricity costs, building rent, etc. -- saving $1000 is nullified if it drops their productivity by as little as 1%. Saving a few hours per year of an IT guy's work can be a very bad trade-off if it costs that guy as little as 5 minutes total per 8h work day to put up with the quirks and delays of the centralized system. (480 minutes a day, times 1%, is 4.8 minutes.) It can add up to that very easily. It only takes wasting 1 second per form in some web app, instead of letting that guy massage the data locally in Excel or Access(*), to add up to more than that in a day. A "close enough" approximation of the old workflow can very easily be approximate enough to actually turn the whole thing into a loss.

(*) ... or whatever F/OSS equivalents you prefer. This is not MS advocacy, so fill in the blanks with whatever you prefer.
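The break-even arithmetic above is easy to check; a quick sketch using the same assumed figures:

    # Figures from the example above; all of them are assumptions, not measurements.
    fully_loaded_cost_per_year = 100_000   # wage plus electricity, rent, etc. for one employee
    hardware_saving = 1_000                # assumed saving from the centralized setup

    productivity_loss = 0.01               # a 1% drop in that employee's output
    print(fully_loaded_cost_per_year * productivity_loss)   # 1000.0 -- already wipes out the saving

    # And 1% of an 8-hour day really is only a few minutes of friction:
    minutes_per_day = 8 * 60
    print(minutes_per_day * productivity_loss)              # 4.8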

And as you move higher up the totem pole, things get even funkier. If a salesman is landing contracts worth millions of dollars with those presentations, you had better save a _lot_ with that centralized solution, because it only takes one lost contract (e.g., because he couldn't connect) to put a big minus in the equation. E.g., if you're going to pay a CEO tens of millions per year, and actually believe that his work is worth every cent (heh, I know, but let's keep pretending), then... again, you had better be damned sure that you don't drop _his_ productivity by 1%, 'cause that nullifies a lot of savings in one fell swoop.

And make sure you also consider the other related costs there. E.g., if "there will always be things that having an individual machine is better suited for", do you have a coherent plan to keep synchronizing the data between those and the rest of the folks on terminals? Did you calculate the costs for that?

Note that I'm not saying a flat out "it's never worth it" there. I'm just saying you should do the maths first. If it ends up still looking ok, fine, do it. But don't base such stuff just on broad generalizations.

Re:Actually, it's probably a PR story (1)

Kjella (173770) | about 6 years ago | (#25353229)

The only ones who need an "everyone else is doing X" story are those who have to follow a herd to feel secure.

With all due respect, there are many companies that do things in their own OMGWTF way that really badly need to be whacked over the head with a clue-by-four that says "Everybody is doing it this way; it's simple, cheap, reliable, flexible and in every way better than what you've hacked together. Please put that abomination out of its misery and let us show you a standard, sane and modern solution." Of course many are thinly disguised marketing attempts too, but there's definitely a need for real information out there as well.

Are you lying then, falcon? (0)

Anonymous Coward | about 6 years ago | (#25352333)

Could have told you that was coming (Score:4, Insightful)
by falcon5768 (629591)

Your sig:
"Slashdot, where telling the truth is overrated but lying is insightful."

Re:Could have told you that was coming (1)

Lumpy (12016) | about 6 years ago | (#25353497)

Actually, it's not. WE tried that: instead of following the IT department's findings and the recommendation based on them, the idiot CTO went and bought all NCD terminals and spent a huge long dollar on a Citrix farm.

We ended up spending 6X what buying typical Dell PCs would have cost on the whole setup. The entire time it was a mess and never worked right, because the people in power bought what some sales rep told them was the best and ignored the experts on staff who had researched the whole damn thing.

Thin client can work in the right environment, when management supports it and the IT department designs it.

First Post! (4, Funny)

Harmonious Botch (921977) | about 6 years ago | (#25351795)

...or, well, it would have been first if I wasn't on a thin client waiting 15 ^%*^&# seconds for a keystroke echo.

Thin clients? (1)

TheLink (130905) | about 6 years ago | (#25352231)

You should (would) have seen the posts from the "I'm using Vista you insensitive clod" bunch.

They're still waiting for the cancel/allow box to show up ;).

Another cycle in the industry (4, Insightful)

4D6963 (933028) | about 6 years ago | (#25351803)

Yay! People rediscover the advantages of thin clients! How long until they rediscover the downsides...

Re:Another cycle in the industry (1)

thermian (1267986) | about 6 years ago | (#25352089)

Yay! People rediscover the advantages of thin clients! How long until they rediscover the downsides...

That would be when the vendors have made their cut and hand over to the consultants for their turn at the trough.

Re:Another cycle in the industry (1)

Breakfast Pants (323698) | about 6 years ago | (#25352149)

This isn't really an industry cycle, it looks more like a plug for a bunch of current products, ala: http://www.paulgraham.com/submarine.html [paulgraham.com]

Re:Another cycle in the industry (2, Insightful)

lysergic.acid (845423) | about 6 years ago | (#25352163)

it's not about rediscovering the advantages/disadvantages of thin clients. AFAIK thin clients were never fully abandoned. it's simply about finding the right niche for thin clients.

for instance, if you're setting up some computers at a public library that only need to search through the library catalog and nothing else, then thin clients are the clear way to go. if you're running a school network where thousands of students will be sharing a few hundred computers, but they'll need word processing, desktop publishing, web access, etc. then you don't want dumb terminals obviously, but you may still want to just set up a bunch of diskless nodes network booting from a central server instead of having to manage a network of standalone workstations.

while processor power has increased significantly, the computing demands of the casual user haven't increased that much since the days of Windows 95. a secretary/accountant/manager/student/etc. does not need to do anything beyond running an office suite, checking their e-mail, and browsing the web. a thin client by today's standards can still do all of these things. heck, a sub-laptop can do all of these things. so why waste the time & resources to manage a bunch of standalone workstations when a thin client will do?

reserve the fat clients for people who actually need it: engineers, programmers, designers, researchers, etc. and by giving everyone else thin clients, you'll give them less chance to screw up their system, thus giving them more uptime and more reliability, which users will appreciate.

Re:Another cycle in the industry (3, Interesting)

4D6963 (933028) | about 6 years ago | (#25352277)

the computing demands of the casual user haven't increased that much since the days of Windows 95

Right, just try watching YouTube on Firefox with a Pentium 133.

by giving everyone else thin clients, you'll give them less chance to screw up their system, thus giving them more uptime and more reliability, which users will appreciate.

Uh huh. You can solve the "chance to screw up their system" problem by keeping the thick client but virtualising the OS, and as for more uptime and reliability, it will only be as reliable and "uptimely" as your network and servers, which in most contexts is probably not any better. Plus you have to deal with general downtimes, and this way people are going to end up with all their eggs in the same basket, which, although avoidable, could bring huge IT catastrophes. Relying entirely on a centralised network is absolute madness: a single network administrator's mistake, a lack of redundancy combined with a hardware failure, a bad decision or incompetence could paralyse an entire infrastructure. Centralising everything only looks nice on paper.

Re:Another cycle in the industry (1)

lysergic.acid (845423) | about 6 years ago | (#25352635)

hasn't increased that much != hasn't increased at all. a modern 700 MHz cpu is perfectly capable of surfing the web and handling most office computing work.

and what does using thin clients have to do with lack of redundancy? no one said you had to use a single file server for the entire network. using fat clients will not make up for a lack of common sense. and if you can't manage to keep a dozen servers up, you're certainly not going to be able to handle maintaining a couple hundred fat clients.

so, yea, if you're not a competent network administrator, then obviously you shouldn't be network booting anything. but assuming you can keep your network up, then running thin clients can simplify your maintenance work. not everyone needs a workstation with the latest quad-core CPU virtualizing Windows Vista, all just so they can check their e-mail, search the web, and open Word/Excel/PowerPoint. aside from eliminating the unnecessary overhead, using thin clients in such situations would greatly reduce power consumption.

Re:Another cycle in the industry (1)

Ash-Fox (726320) | about 6 years ago | (#25352771)

Right, just try watching YouTube on Firefox with a Pentium 133.

I have actually done this using a Pentium 133 as a thin client.

What about other downsides? (1, Redundant)

DesScorp (410532) | about 6 years ago | (#25352409)

Yay! People rediscover the advantages of thin clients! How long until they rediscover the downsides...

I first got the computer bug seriously when I was in college, and took some courses requiring the use of dumb terminals in our computer center... they were running off a DEC minicomputer running Unix, and I was hooked. I learned to do a lot using those old green and orange screen terminals, and to this day, I wonder if most businesses wouldn't be incredibly more productive if they went back to simple no-GUI dumb terminals... with text email and Lynx browsers.

Think about it. How many employees now blow off hours at a time during the workday by playing solitaire, going to MySpace, releasing the latest trojan into their LAN via email attachments...

Even with a GUI terminal, if it was stripped down and wasn't Windows based (and had drastically limited Internet access), I think a lot more would get done around offices.

Re:What about other downsides? (1)

4D6963 (933028) | about 6 years ago | (#25352467)

Where are you going to find people who will a) accept being deprived of the computer 'perks' they take for granted and b) be qualified to work without a GUI? Also, what sort of business could run with just that nowadays?

Re:What about other downsides? (1)

dangitman (862676) | about 6 years ago | (#25352601)

Think about it. How many employees now blow off hours at a time during the workday by playing solitaire, going to MySpace, releasing the latest trojan into their LAN via email attachments...

And how much of your workforce are you going to be left with once everybody quits because of your GUI-less, diversion-free system?

Re:What about other downsides? (1)

rich_r (655226) | about 6 years ago | (#25353435)

There's plenty of data entry work that requires just that, and I've done enough of it! If I've got no internet access anyway, I'd rather just have a well designed text-based system that is fast, lag free and supplied with a decent keyboard.

I fail to understand why people moved away from systems that just worked and replaced them with boxes that can do so much more, but are used for exactly the same tasks and do them just that little bit worse.

Re:What about other downsides? (2, Insightful)

markdavis (642305) | about 6 years ago | (#25353305)

>Even with a GUI terminal, if it was stripped down and wasn't Windows based
>(and had drastically limited Internet access), I think a lot more would get done around offices.

Bingo! That is exactly what we have- Linux server, Linux apps, Linux thin clients (160). Everything is locked down tight. We have everything users need in order to be productive and nothing else (accounting apps, OpenOffice, Firefox, Sylpheed, IceWM, some utils). Internet access is only through a white list of approved sites. But this ONLY works because the CEO supports the concept and allows us to say "no" to users/departments who think they are "special". And yes, the CEO uses a Linux thin client also (although he and Directors can browse outside the whitelist; but still no Flash, Java, nor sound).

Re:Another cycle in the industry (1)

antirelic (1030688) | about 6 years ago | (#25353557)

Hrm. Aren't desktops virtually becoming thin clients with the advent of "cloud computing"? As far as I can tell (working in a hugenormeous corporate environment, 20,000+ users), any application worth its salt is being run on big iron servers, where the clients only run browsers written in Java (or some other platform-neutral client) in which they can perform trivial functions not worth running on the main servers. I can't really think of any software outside of development and video games that is processor-intensive and designed to run primarily on a desktop anyway.

So now we have "thin clients" connecting to the "cloud" in order to do the real work. In this scenario you're going to have to make up in networking components what you save in client components, for the sake of redundancy and availability.

That "true computer"... (4, Interesting)

The Master Control P (655590) | about 6 years ago | (#25351821)

Is now just a bunch of generic PCs in smaller form factors. So in essence you're sticking a network layer between the rest of the computer and its video card. So instead of network outages (which are inevitable) crippling just network operations, they now cripple everything, including your ability to keep typing your office documents or looking at the email you've already got.

It's annoying as hell, but if my network craps itself I still have a working computer in front of me and I can still do a subset of what I was doing before. Not so with thin clients.

<tinfoil mode>
Of course they want to take the actual computer away from you, they want to have control over you. If they could, your "computer" would be a mindless terminal to a Big Brother Approved mainframe that spied on everything you did.
</tinfoil mode>

Re:That "true computer"... (4, Interesting)

inKubus (199753) | about 6 years ago | (#25352295)

The Linux Terminal Server Project [ltsp.org] is actually pretty good. And useful for a variety of things beyond just saving dough on the desktop end. Remote access is one that comes to mind. Sure, you could have a bunch of X terms, but this will work with ANY box with a PXE (hell even Netboot) NIC. You don't need virtualization or any of that garbage. UNIX was designed as a "multi-luser" operating system ;), back when mainframes were last in vogue. Xwindows is really quite good over a slow network and has been for DECADES.

Now, I want to stress that I am a proponent of terminals in only certain areas. A public library computer bank. A factory environment, where you want your server safe and securely away from sparks and heat. A customer service environment where the employee is only doing one or two things. My business ops people would have real computers for the reasons you mentioned. I want them to be accounting and developing even if the server is down.

Re:That "true computer"... (1)

inKubus (199753) | about 6 years ago | (#25352305)

Correction, you can also boot using Floppy, CD, or USB boot image.

Re:That "true computer"... (1)

Xouba (456926) | about 6 years ago | (#25353055)

<tinfoil mode> Of course they want to take the actual computer away from you, they want to have control over you. If they could, your "computer" would be a mindless terminal to a Big Brother Approved mainframe that spied on everything you did. </tinfoil mode>

You're not using GMail or any of Google's other services, are you?

Re:That "true computer"... (1)

markdavis (642305) | about 6 years ago | (#25353331)

In a modern business system environment, if the network goes down, productivity pretty much stops. Period. It doesn't matter if the clients are fat or thin.

Thin clients ... (1)

Jacques Chester (151652) | about 6 years ago | (#25351823)

Dumb clients, fat clients, thin servers, retarded paywalls.

It's simple business: (5, Insightful)

Fluffeh (1273756) | about 6 years ago | (#25351833)

When you have customers with thick clients, sell em thin ones cause they are "better-er".

When you have flogged a thin client off to all of your customers, the new thing is a "better-er-er" thick client.

Whole thing sounds like very simple 101 style marketing. Why try to sell someone something they have? Convince them what you have is better. Total no-brainer imo.

Necessary (1)

chrome (3506) | about 6 years ago | (#25351845)

We're gonna need them, what with the economy cratering!

Middle ground? (3, Insightful)

Max Romantschuk (132276) | about 6 years ago | (#25351879)

How about a netbook-style device which could offer limited functionality on its own for email, web, and basic office apps (say, a boot image updated from the central server when connected), and be used as a thin client at the office, plugged into a docking station with proper display(s) and keyboard+mouse? Best of both worlds?

Re:Middle ground? (2, Insightful)

poetmatt (793785) | about 6 years ago | (#25352019)

Hmm, you mean, like one of those laptop things? /snicker

Re:Middle ground? (3, Funny)

RuBLed (995686) | about 6 years ago | (#25352117)

and loaded with Vista Enterprise Edition.

note: you have to turn aero on for a complete thin client experience

Re:Middle ground? (3, Insightful)

nacturation (646836) | about 6 years ago | (#25352401)

How about a netbook-style device which could offer limited functionality on its own for email, web, and basic office apps (say, a boot image updated from the central server when connected), and be used as a thin client at the office, plugged into a docking station with proper display(s) and keyboard+mouse? Best of both worlds?

Why, all you'd need is some kind of Window System that could display X, where X could be any number of applications.

Happy coincidence, Thin Client & Virtualization (2, Interesting)

Anonymous Coward | about 6 years ago | (#25351887)

We have recently adopted a phased approach of deploying new thin clients as our estate of traditional desktops hits retirement. After having seen several false dawns and uncomfortably proprietary solutions in the last 15 years, it is only now that we have been happy enough with the whole solution (thin client HW, network connectivity, back-end virtualization SW) to take the plunge.

There are now a range of HW clients (we use ChipPC [chippc.com] ).
There are a couple of viable virtualization systems (we use Citrix Xen [xensource.com] , without the presentation server "tax").
We've chosen a dedicated virtualization hardware appliance on the back-end from 360is [360is.com] .

Struggling economy, phooey! (2, Funny)

Centurix (249778) | about 6 years ago | (#25351905)

Finally I can sell all the Wyse 120 terminals I have in the garage! If you want me I'll be high-rolling at the casino for a couple of weeks...

My clients are fat? (5, Funny)

Plantain (1207762) | about 6 years ago | (#25351907)

My clients are all obese, and show no intentions of slimming down; what am I doing wrong?

Re:My clients are fat? (4, Funny)

ilikejam (762039) | about 6 years ago | (#25352713)

You're working in America.

2009: Thin client v Linux on the Desktop (2, Funny)

MosesJones (55544) | about 6 years ago | (#25351937)

Oh yes, it's back, the battle that everyone has been waiting for, the Rumble on the Desktop, the fight of the century. The challenger is the undisputed next-year champion, fighting out of California by way of Finland, it is the Penguin himself, Tux "Next Year" Linux.

And now the champion: dominating in the 70s, losing form in the 80s, disappearing as a recluse in the 90s and the start of the century, but now he is back to claim his crown. In the black trunks with green trim, it's Thin "Latency is a Bitch" Client.

Lets have a good clean fight to finally decide who will be declared the Desktop champion of 2009.

This fight is sanctioned by the ODC (Optimistic Desktop Council) and will be fought under rules of low data, huge assumptions and a complete lack of understanding on the total size of the market.

Re:2009: Thin client v Linux on the Desktop (2, Interesting)

4D6963 (933028) | about 6 years ago | (#25352021)

This isn't boxing, more like wrestling. So don't be surprised if you see VM "you trashed your OS here have this backup virtual image" ware jump up on the ring, headbutt in all directions and virtualise the shit out of your thick clients.

Am I the only one who believes that the future is not in thin clients but in desktop supervisors who make all your OSes run transparently virtualised? I'm talking about 10-15 years.

Re:2009: Thin client v Linux on the Desktop (1)

gbjbaanb (229885) | about 6 years ago | (#25352393)

Google would like to disagree with you.. :)

maybe not in 10-15 years, but I see the future as more mobile devices. If we can fix battery life (possibly) and displays (probable with in-eye HUD type affairs) then processing and capacity will increase. We'll probably see a lot of 'download what you need as you need it' combined with local processing. Your desktop and big monitors will probably go the way of the abacus.

Re:2009: Thin client v Linux on the Desktop (1)

4D6963 (933028) | about 6 years ago | (#25352527)

I'm not sure what you mean regarding the comment about Google, but Google has anywhere to go but up. It's one of those companies that start with a huge momentum because they have a tremendous edge over the competition, but at some point (about now), they're just another big company with huge power and market share but little momentum because, as I like to think of it, they have reached their orbit, i.e. they did everything they could hope to do in the domain they started in; now the challenge for them is to keep the competition at a safe distance and try to thrive on less familiar ground.

I disagree with your comment about mobile devices. Of course, over the years, as UMPCs and later mobile phones catch up in usability with desktop PCs (you can argue that it's already mission accomplished for UMPCs, considering they can run Ubuntu with GNOME), you'll realise that while you can do a lot on such devices you still need a desktop computer. I strongly doubt that the small-screen issue will be resolved within that time frame, even with stabilised embedded micro projectors (I mean, you still need a suitable surface to project onto), and neither will the controls issue. I'm talking to you by comfortably typing with both my hands on a large keyboard. You'll never get that comfort with a handheld device, and never within the foreseeable future will you be able to get things done as comfortably and quickly as with a desktop/laptop computer. As for downloading apps and such, it's very hard to predict because things move pretty fast in that area, but even today a lot can be done in a web-based fashion, so I assume things will get better there, to the point where all you'll feel you need is a web browser to do anything.

But that's all off-topic anyways, it's silly to think that mobile devices will actually replace desktop machines, be it at home or even more at work.

Re:2009: Thin client v Linux on the Desktop (1)

gbjbaanb (229885) | about 6 years ago | (#25353123)

I did say that I wasn't sure about the 10 year timeframe... but on the other hand, you never know.

Mobile phones have more power today than desktops of 10 years ago. Modern desktops are reaching a plateau of processing power - not necessarily because we can't squeeze more from them, but because they have enough juice for most people. It's not like years ago when you had to buy a new PC every year or two.. today a 3-year-old PC will happily run almost everything as fast as you want. (Sure, you can stick Vista on it and need more RAM, which is why lots are sticking with XP.)

So, I see a trend for more power from mobiles, especially as people use them more for surfing and emailing... and then want things like Word and Powerpoint on them. That will drive usage, plus the fact that you can get more revenue from these things that are connected to the web - ISPs will want you to use them, and companies will think they can grab market share and become the next Microsoft. The new touch-screen displays are showing people what they can do with mobiles, expect more advances to come in this area, and more growth in their usage.

The small screen can be fixed already (well, maybe not) with projection spectacles, but lots of people are using their tiny-screen blackberries more than their desktops.

This doesn't mean there aren't issues to resolve - better input using either voice or gesture-style 'typing', better projected displays, more CPU and RAM and battery. But these can surely be fixed in 10 years!

One day, if the above comes true, then the small device will replace the desktop. Probably not in 10 years (unless you're a salesman/journalist/etc) unless the technology takes a leap forward.. which could happen as easily as the eee pc revived the notebook market!

Do I have to say it? (0)

Anonymous Coward | about 6 years ago | (#25352071)

Thin is in.

I support the return of thin client entirely (2, Insightful)

Martian_Kyo (1161137) | about 6 years ago | (#25352139)

Too many dumb users (OK, I am being too harsh here - too many uneducated users) these days. Thin clients = less freedom, which in the case of most users means they'll make fewer mess-ups.

This means less boring maintenance work for IT people, in large companies especially.

Re:I support the return of thin client entirely (1)

samos69 (977266) | about 6 years ago | (#25352515)

We've been using thin clients for years at my company (well, most people except us in IT), and it's true: the time spent managing desktops is much less, and they are very cheap and easy to replace if one dies (we've only had three out of 500+ die in the last couple of years).

Flash Kills Thin Clients (1)

TheMiddleRoad (1153113) | about 6 years ago | (#25352175)

Unless you've got a lot of bandwidth to spare, Flash will kill performance.

Re:Flash Kills Thin Clients (1)

markdavis (642305) | about 6 years ago | (#25353359)

That is why on our large, Linux, thin-client network, we do not install nor allow Flash. Yes, you are correct... animation is a HUGE enemy of thin clients. But we also have an enforced site whitelist.

On those few sites that are so broken as to require Flash to do anything productive, employees are welcome to come to the training room and use a specially configured station with a local Firefox + Flash + Java.

Yes, i observed that (1)

drolli (522659) | about 6 years ago | (#25352187)

There was a short period, bridging the VT100 terminals and the Sun Rays, from 1997 to 2000, when the university library installed personal computers for accessing its network.

No, seriously. This is non-news.

The transition to personal computers stopped long ago. I cannot remember seeing an institution switch to a PC-based infrastructure in the last five years, but since approximately 2001 I have seen a rise of thin clients in large organizations. The size of organization for which this pays off will get smaller and smaller with time, and in a few years we will have gotten rid of the infrastructural maintenance and support hell the PC still presents.

Engineering is already there (1, Informative)

Anonymous Coward | about 6 years ago | (#25352189)

I gave up my workstation 3 years ago and have been running on a remote X-server (Redhat) over NX. All of my design software runs off the computational servers anyways, and the NX server is just for running virtual desktops for 10 people at a time. My tasks are not graphic-intensive, and even if I had a local workstation I would want my jobs running on the fastest available machine.

My PC runs office and a NX client, and feels like a thin-client.

I believe most engineers run like this these days. It makes working from home easier too.

But... (1)

Chrisq (894406) | about 6 years ago | (#25352211)

New York Times Says Thin Clients Are Making a Comeback

But in Texas they're as fat as ever.

About 7 years ago I heard the same story (2, Funny)

zullnero (833754) | about 6 years ago | (#25352317)

And later in the year, when the corporation I worked for lost 10 million because one of their customers went bankrupt, I, by chance, got to sit in on a bigwigs meeting.

After announcing the loss and accompanying layoffs, he actually followed it by saying "And I don't think suggesting thin clients will help us out of this one."

Man, it was so hard to keep from laughing...next time I hear that, and it sounds like I will hear that again, I think I'll just risk my job and have a big belly laugh.

it follows from physics (3, Insightful)

azgard (461476) | about 6 years ago | (#25352443)

From physics, it's obvious that centralized computing is more energy-efficient than distributed computing. The longer the distance you have to move the energy (that encodes the information) to compute the results, the more energy you need. Also, centralization allows for better resource sharing.

The only issue is who pays for the costs. Mass production of computers allowed their costs to drop to the point that distributed systems were cheaper than centralized ones. However, as the demand for computing power grows, the energy spent on computing itself enters the equation, and the times will change again.

Re:it follows from physics (1)

MSDos-486 (779223) | about 6 years ago | (#25352887)

Ironically enough, I tried to make that argument to a physics professor who was borrowing a computer cluster of mine. He was kind of a greybeard, so he was envisioning a world of VAXen vs Commodores.

The new dress-code that will also be deployed... (1)

K3ba (1012075) | about 6 years ago | (#25352543)

'And to start using your new terminal, you will now have to wear flares'.

Some concepts should be revisited - terminals (unlike flares) are indeed one of them.

They have their place (1)

Piranhaa (672441) | about 6 years ago | (#25352613)

My company uses a mix of fat and thin clients. IT gives a few choices between desktops, which cost the department money, and then there are 'free' (to the department) thin clients. In actuality, the thin clients cost more to purchase. Don't ask me why Wyse charges $800 for a mini-ITX system with 1 GB of RAM and 1 GB of flash, running a VIA C7 1.2 GHz processor, when the device runs just a Citrix client anyway. I personally priced a comparable mini-ITX system, with a tiny case, at ~60% of what Wyse charges (with more memory, CPU, and storage). The boxes are still technically fat clients, but they've just been crippled down so that only people smart enough can run applications natively on the device. The boxes are also set to write-protect so that no changes can be committed by anyone except local admins (or remote maintenance, of course).

Most applications work okay through this method, but there are some that still can't work over thin clients. Even running big Excel or Word documents introduces a lot of lag. Also, the video cards are so old (S3 ViRGE, anyone?) that when trying to output to dual displays, it looks like total crap on one of them. Some issues also arise when deploying specific applications for a handful of individuals, by request.

All in all, they are great if you can get away with them. There is a tremendous power saving: the box doesn't use much memory running the Citrix client, the processor doesn't get taxed since it's mostly idle, and the whole power consumption isn't much more than a single 10-15 watt hard drive. I'd say the display is the most power-hungry item in this case, but it's set to standby when not in use.
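The power saving is easy to put rough numbers on; everything in this sketch (wattages, hours, electricity price) is an assumption rather than a measured figure, and whatever the back-end Citrix servers draw has to come out of the total:

    # Rough fleet-level energy comparison; every figure is an assumption.
    seats = 500
    hours_per_year = 2000          # roughly one shift, five days a week
    price_per_kwh = 0.15           # assumed electricity price in dollars

    desktop_watts = 120            # assumed average draw of a conventional desktop (display excluded)
    thin_client_watts = 15         # assumed draw of a thin client (display excluded)

    def annual_cost(watts):
        return watts / 1000 * hours_per_year * seats * price_per_kwh

    savings = annual_cost(desktop_watts) - annual_cost(thin_client_watts)
    print(f"~${savings:,.0f} per year")    # ~$15,750 on these assumptions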

No surprise (1)

UnixUnix (1149659) | about 6 years ago | (#25352671)

Considering the current state of the economy, ALL clients are bound to be thin.

Of course they are (1)

tyler.willard (944724) | about 6 years ago | (#25352743)

It's been about 10 years since the last time they were hyped. Besides, all the yammering about "computing in the cloud" has the thin-client folks excited.

We've come full circle but... (1)

pbrennan (1384057) | about 6 years ago | (#25352779)

Thin clients don't have to mean less power available at the desktop, infact when coupled with VMware or other virtualisation technology thin clients can be as powerful as any on-the-desk solution.

The problems I hear mentioned here about network outages causing company wide problems in terms of disruption are just silly, any decent enterprise these days has a resilient network, I can't remember the last time we had a system wide network outage, they just don't exist anymore, any outage is limited to possibly a single department or other localised area of the network.

I'm currently in the middle of rolling out Sun Ray ultra-thin clients for the enterprise I'm employed with: over 800 desktops at one site alone, and for ease of management UTC can't be matched. Combined with NetApp, if a user screws up his desktop it's a simple case of restoring the machine from a snap clone; it takes minutes, and the support techies don't even need to leave their desks.

Then there is the green thing: why pay for CPUs, memory and hard disks at each and every user's desk, the majority of which are less than half utilised? Thin-client solutions are almost always more efficient in terms of energy use, and with the rising price of energy these days an enterprise can save a fortune on its annual energy consumption.
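
To put rough numbers on that claim, here is a back-of-the-envelope sketch in Python. The wattages, hours, and electricity price are illustrative assumptions, not figures from the deployment described above.

    # Hypothetical figures: ~80 W for an average desktop PC, ~15 W for a thin
    # client, 8 hours a day, 250 working days a year, $0.12 per kWh.
    SEATS = 800
    DESKTOP_WATTS, THIN_WATTS = 80.0, 15.0
    HOURS_PER_YEAR = 8 * 250
    PRICE_PER_KWH = 0.12

    saved_kwh = SEATS * (DESKTOP_WATTS - THIN_WATTS) / 1000.0 * HOURS_PER_YEAR
    print("saved ~%.0f kWh/year, roughly $%.0f" % (saved_kwh, saved_kwh * PRICE_PER_KWH))

    # Note this ignores the extra load on the central servers, which eats into
    # the saving but is shared across many users.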

So the thin client is about cost and ease of management, not just about restricting end users' abilities. Hell, we even have a software development team using UTC, and if they felt restricted I can assure you they'd moan!

One PC, One User, too many problems (1)

MSDos-486 (779223) | about 6 years ago | (#25352869)

One thing that always annoys me about some of the non-technical people at work (I work for a product testing lab) is that they have this strange urge to save stuff to their desktops and My Documents folders on their local machines. Because we're a smaller org, we have a hodgepodge of computers that are not all set up in a uniform fashion, so it's difficult to make sure everyone is on the domain and doing it The Right Way. So I see some combination of thin clients and virtualization as the solution to this. I still hate how Windows apps save 75% of user-specific data in 25 random places that may or may not link to their user folder, making roaming a pain. I cannot think of one major Linux app that doesn't save its user settings to the user's home directory.
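
For what it's worth, the convention the parent describes is easy to illustrate. Here is a minimal Python sketch of where a per-user config file typically ends up on each platform; the application name "exampleapp" is a hypothetical placeholder.

    import os
    import sys

    APP = "exampleapp"  # hypothetical application name

    if sys.platform.startswith("win"):
        # Windows: per-user data that should roam belongs under %APPDATA%,
        # but many apps scatter it across the registry and other folders.
        base = os.environ.get("APPDATA", os.path.expanduser("~"))
    else:
        # Unix/Linux convention: a dotfile or XDG config dir under $HOME,
        # which is exactly what makes roaming home directories painless.
        base = os.environ.get("XDG_CONFIG_HOME", os.path.expanduser("~/.config"))

    print(os.path.join(base, APP, "settings.ini"))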

Total disaster (0)

Anonymous Coward | about 6 years ago | (#25353011)

I'm a principal in a firm of about 150. We've just gone thin client. Our consultants told us that it was the way to go. I had reservations but I have no qualifications (just a general interest in IT stuff) so who was I to argue?

It has been, to date, an unmitigated disaster. The system is slow beyond belief, and a couple of months down the track, despite promises and changes and upgrades, it hasn't got any faster. Despite a huge spend, we are now no faster than on our old, totally outdated networked-PC system. It is taking forever to get all the software we used to use up and running on the new system.

We had outage after outage in the first couple of months, not just crashing the odd PC but taking down the whole firm. The downtime doesn't bear thinking about. Hopefully things are settling down now.

On the old system I might (as a keen amateur) have been able to fix something myself. Now we are totally reliant upon our central IT people to fix any little thing that goes wrong.

For me personally, the customisations and additional software I used to run are no longer available. I can be more productive on my home PC, connected to the firm's servers remotely, than I can be at work. At home, at least, I can flick between a remote connection window and my own PC's OS, with all the software and productivity tools I want, set up how I want them. I know IT people don't like personalisations because they introduce unknown variables and so on. However, from a power user's point of view, I am now straitjacketed in an unproductive way.

Personally, I hate the idea (0, Troll)

Blue_Wombat (737891) | about 6 years ago | (#25353041)

Worked for a place that effectively tried to thin-client us (although they didn't call it that). It was horrible, and it was one of the main reasons I left. Most of the guys who did the work and had earned the place its stellar reputation (and earned well into six figures) left as well. Most of the team left in a three-month period, and the firm's reputation (and revenue) cratered. Even if they had been able to replace the team (they tried and failed), I doubt the system cost saving would have covered half the recruitment cost. Still, I guess they saved a couple of hundred per seat on hardware and support costs!

Guess what, guys: I studied hard for my quals, my market value is getting pretty good with experience, and I know how to do my job. I do it the way that works best for me, and I set up my tools to work for me. If some pimply IT support guy thinks he knows better than me what is needed to do my job, he is welcome to try and do it; if he can't, he should just piss off. Their job is to give me what I need to do my job and bring in revenue; it ain't my job to work around them. I want a nice, powerful machine that is fully customisable by me for my needs. If I want non-standard software, or a bit of non-standard hardware, the correct response is (1) purchase and (2) install, not to try and standardise me on what works for someone else.

Cloud Computing .... (1)

CalcuttaWala (765227) | about 6 years ago | (#25353093)

The pendulum keeps swinging between centralised systems (a "mainframe" with "dumb" terminals) and client-server (with smart or fat clients). For standard applications like email and office productivity, products from Google and Zoho offer good support ... in the centralised mode ... through what is now known as cloud computing. However, for specialist applications, one might still have to go for desktop-based applications.

Comeback? (0)

Anonymous Coward | about 6 years ago | (#25353141)

Isn't it necessary to have been successful in the past to make a "comeback"?

nbmf (1)

scientus (1357317) | about 6 years ago | (#25353205)

One user can execute a fork bomb and wipe everybody out. Also, the thin clients I have used are way underpowered and completely die when a group of people get on them; the huge memory needs of these machines make it unfeasible. Why is this always done with small commodity systems?
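
The fork-bomb risk is real on any shared multi-user server, but it is normally contained with per-user resource limits (on Linux, typically /etc/security/limits.conf) rather than by abandoning the model. Below is a minimal, Linux-only Python sketch of the same idea; the cap of 512 processes is an illustrative assumption, not a recommendation.

    import resource

    # Cap how many processes this session (and its children) may create, so a
    # runaway fork loop exhausts its own quota instead of starving other users.
    soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)
    print("current per-user process limit: soft=%s hard=%s" % (soft, hard))

    new_soft = 512 if hard == resource.RLIM_INFINITY else min(512, hard)
    resource.setrlimit(resource.RLIMIT_NPROC, (new_soft, hard))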

It's not one or the other... (0)

Anonymous Coward | about 6 years ago | (#25353269)

They're not mutually exclusive... anyone who uses the web _is_ using their desktop as a thin client. Anyone who's using is using distributed computing.
My cloudy crystal ball says that what we'll see is a mix. Where appropriate and cost-effective, apps will be on the desktop. Where appropriate and cost-effective, on servers... and the 'thin client' software of choice will be

Windows makes a good terminal (0)

Anonymous Coward | about 6 years ago | (#25353325)

Whether you put a fat client on a PC or give 'em a thin client is irrelevant; it's still just a terminal when all is said and done, 19 times out of 20 connecting to a UNIX backend. (Yes, that's right, UNIX, not Linux.)

And you will not convince me that you have some critical stuff going on that requires any sort of PC. If you do, pack it up; you'll be fired soon.

Let Windows do what it's good at, on a thin client.

MS Terminal Server (1)

Danzigism (881294) | about 6 years ago | (#25353401)

I work for a company that deploys MS terminal servers for large-scale networks. I completely understand the benefits of a terminal server, and the two major parts of our sales pitch are centralized data and the low cost compared to actual PC ownership. But then I think about the fact that a good thin client still costs around $400+, and that MS charges terminal server licensing fees per client! What's even funnier is that you need volume licensing for Office 2007 when it's installed on a terminal server; you used to be able to run Office 2003 there with only one license. So ultimately, the claim that it saves you money is over. The only benefit left is that the user hardly has a chance of getting infected with a virus. But honestly, the Linux Terminal Server Project seems like a much better solution.
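
A rough per-seat sketch of that arithmetic in Python. The $400 thin-client figure comes from the comment above; the CAL, server-share, and PC prices are illustrative assumptions, not real quotes.

    # Illustrative, assumed figures (only the $400 thin client is from the comment).
    THIN_CLIENT_HW = 400
    TS_CAL = 100          # assumed per-device Terminal Services CAL
    SERVER_SHARE = 150    # assumed share of the terminal server hardware per seat
    PC_HW = 600           # assumed commodity desktop PC

    thin_seat = THIN_CLIENT_HW + TS_CAL + SERVER_SHARE
    print("thin-client seat: $%d vs. PC seat: $%d" % (thin_seat, PC_HW))

    # Once licensing and the server are counted, the purchase-price saving
    # largely disappears; the remaining argument is manageability.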