
IT Snake Oil — Six Tech Cure-Alls That Went Bunk

ScuttleMonkey posted more than 4 years ago | from the salesman-ejection-seat dept.


snydeq writes "InfoWorld's Dan Tynan surveys six 'transformational' tech-panacea sales pitches that have left egg on at least some IT department faces. Each of the six technologies — five old, one new — earned the dubious distinction of being the hype king of its respective era, then fell far short of its legendary promises. Consultant greed, analyst oversight, dirty vendor tricks — 'the one thing you can count on in the land of IT is a slick vendor presentation and a whole lot of hype. Eras shift, technologies change, but the sales pitch always sounds eerily familiar. In virtually every decade there's at least one transformational technology that promises to revolutionize the enterprise, slash operational costs, reduce capital expenditures, align your IT initiatives with your core business practices, boost employee productivity, and leave your breath clean and minty fresh.' Today, cloud computing, virtualization, and tablet PCs are vying for the hype crown." What other horrible hype stories do some of our seasoned vets have?


In Defense of Artificial Intelligence (5, Insightful)

eldavojohn (898314) | more than 4 years ago | (#29952666)

The bad news is that artificial intelligence has yet to fully deliver on its promises.

Only idiots, marketers, businessmen and outsiders ever thought we would be completely replaced by artificially intelligent machines. The people actually putting artificial intelligence into practice knew that AI, like so many other things, would benefit us in small steps. Many forms of automation are technically artificial intelligence, just very simple artificial intelligence. You might want to argue that the things we benefit from are heuristics, statistics and messes of if/then decision trees, but successful AI is nothing more than that. Everyone reading this enjoys the benefits of AI, probably without knowing it. For instance, your handwritten mail is most likely read by a machine that uses optical character recognition to decide where it goes, with a pretty good success rate and a confidence factor for failing over to humans. Recommendation systems are often based on AI algorithms. I mean, the article even says this:

The ability of your bank's financial software to detect potentially fraudulent activity on your accounts or alter your credit score when you miss a mortgage payment are just two of many common examples of AI at work, says Mow. Speech and handwriting recognition, business process management, data mining, and medical diagnostics -- they all owe a debt to AI.

Having taken several courses on AI, I never found a contributor to the field that promised it to be the silver bullet -- or even remotely comparable to the human mind. I don't ever recall reading anything other than fiction claiming that humans would soon be replaced completely by thinking machines.

In short, I don't think it's fair to put it in this list as it has had success. It's easy to dismiss AI if the only person you hear talking about it is the cult-like Ray Kurzweil but I assure you the field is a valid one [arxiv.org] (unlike CASE or ERP). In short, AI will never die because the list of applications -- though small -- slowly but surely grows. It has not gone 'bunk' (whatever the hell that means [wiktionary.org] ). You can say expert systems have failed to keep their promises but not AI on the whole. The only thing that's left a sour taste in your mouth is salesmen and businessmen promising you something they simply cannot deliver on. And that's nothing new nor anything specific to AI.
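
To make the parent's point concrete, here is a minimal sketch of the "confidence failover" pattern described above for OCR mail sorting. Everything in it (the names, the threshold, the fake classifier) is a hypothetical stand-in:

    import random

    CONFIDENCE_THRESHOLD = 0.90  # assumed cutoff; real systems tune this empirically

    def classify_zip_code(image):
        """Stand-in for a real OCR model: returns (guess, confidence)."""
        return "90210", random.uniform(0.5, 1.0)

    def route_mail(image):
        guess, confidence = classify_zip_code(image)
        if confidence >= CONFIDENCE_THRESHOLD:
            return ("machine", guess)   # high confidence: the machine handles it
        return ("human", None)          # low confidence: fail over to a person

    print(route_mail(b"raster scan of an envelope"))

Nothing here resembles a mind; the engineering value is in knowing when not to trust the model.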

Re:In Defense of Artificial Intelligence (1)

John Hasler (414242) | more than 4 years ago | (#29952772)

The same defense applies to pretty much all of these (except maybe CASE).

Re:In Defense of Artificial Intelligence (5, Interesting)

John Whitley (6067) | more than 4 years ago | (#29952850)

The people actually putting artificial intelligence into practice knew that AI, like so many other things, would benefit us in small steps.

Actually, there was a period very early on ('50s) when it was naively thought that "we'll have thinking machines within five years!" That's a paraphrase from a now-hilarious film reel interview with an MIT prof from the early 1950s. A film reel which was shown as the first thing in my graduate level AI class, I might add. Sadly, I no longer have the reference to this clip.

One major lesson was that there's an error in thinking "surely solving hard problem X must mean we've achieved artificial intelligence." As each of these problems fell (a computer passing the freshman calc exam at MIT, a computer beating a chess grandmaster, and many others), we realized that the solutions were simply due to understanding the problem and designing appropriate algorithms and/or hardware.

The other lesson from that first day of AI class was that the above properties made AI into the incredible shrinking discipline: its successes weren't recognized as "intelligence", but they often spawned entire new disciplines of powerful problem solving that are used everywhere today. So "AI" research gets no credit, even though its researchers have made great strides for computing in general.

Re:In Defense of Artificial Intelligence (3, Interesting)

Chris Burke (6130) | more than 4 years ago | (#29953388)

A film reel which was shown as the first thing in my graduate level AI class, I might add. Sadly, I no longer have the reference to this clip.

Heh. Day 1 of my AI class, the lecture was titled: "It's 2001 -- where's HAL?"

The other lesson from that first day of AI class was that the above properties made AI into the incredible shrinking discipline: its successes weren't recognized as "intelligence", but they often spawned entire new disciplines of powerful problem solving that are used everywhere today. So "AI" research gets no credit, even though its researchers have made great strides for computing in general.

Yeah that's when the prof introduced the concept of "Strong AI" (HAL) and "Weak AI" (expert systems, computer learning, chess algorithms etc). "Strong" AI hasn't achieved its goals, but "Weak" AI has been amazingly successful, often due to the efforts of those trying to invent HAL.

Of course the rest of the semester was devoted to "Weak AI". But it's quite useful stuff!

Re:In Defense of Artificial Intelligence (1)

RAMMS+EIN (578166) | more than 4 years ago | (#29952988)

``Having taken several courses on AI, I never found a contributor to the field that promised it to be the silver bullet -- or even remotely comparable to the human mind.''

The problem is that, if it isn't that, then what is "artificial intelligence", rather than flashy marketing speak for just another bunch of algorithms?

Re:In Defense of Artificial Intelligence (0)

Anonymous Coward | more than 4 years ago | (#29953316)

The problem is that, if it isn't that, then what is "artificial intelligence", rather than flashy marketing speak for just another bunch of algorithms?

Everything a computer (or arguably, a human) ever does is (or at least can be expressed as) simply another bunch of algorithms that together hopefully do something useful.

You can't draw a line between artificial intelligence and a bunch of algorithms; this has always been well known. Artificial intelligence is just a bunch of algorithms eloquent enough to resemble human intelligence.

If you have a robot that can sort your mail, or a computer that can tell your actual emails from the spam (usually with some kind of learning algorithm), or a robot that can vacuum your house by remembering where it has already been, avoiding furniture, etc., it would be a far stretch to claim that those don't have AI.

Re:In Defense of Artificial Intelligence (2, Funny)

FlyingBishop (1293238) | more than 4 years ago | (#29953376)

Artificial intelligence is trying to make computers do things that are currently very hard for a computer to do, but very easy for a human to do. Once there are ubiquitous algorithms / hardware to do something as fast as a human can, we remove it from the category of "things computers will never be able to do as well as people."

Re:In Defense of Artificial Intelligence (5, Insightful)

Wonko the Sane (25252) | more than 4 years ago | (#29953010)

ERP could work if the vendors would realistically deal with GIGO.

Unless you lock down the permissions so tightly that the system is unusable, your users will enter bad data. They'll add new entries for objects that already exist, they'll misspell the name of an object and then create a new object instead of editing the one they just created. They'll make every possible data entry error you can imagine, and plenty that you can't.

We'd see a lot more progress in business software if all vendors followed two rules (a sketch of rule 2 follows below):

  1. Every piece of data that comes from the user must be editable in the future.
  2. Any interface that allows a user to create a new database entry must provide a method to merge duplicate entries.
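
A minimal sketch of rule 2 in Python follows; the schema, threshold, and names are hypothetical, and a real ERP would also have to repoint every foreign key that referenced the dropped record:

    from difflib import SequenceMatcher

    def similarity(a, b):
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def find_likely_duplicates(entries, threshold=0.85):
        """entries: list of (id, name) pairs. Returns candidate pairs for review."""
        pairs = []
        for i, (id_a, name_a) in enumerate(entries):
            for id_b, name_b in entries[i + 1:]:
                if similarity(name_a, name_b) >= threshold:
                    pairs.append((id_a, id_b))
        return pairs

    def merge(entries, keep_id, drop_id):
        """Fold drop_id into keep_id; the surviving record keeps its id."""
        return [(eid, name) for (eid, name) in entries if eid != drop_id]

    vendors = [(1, "Acme Corp."), (2, "ACME Corp"), (3, "Globex")]
    print(find_likely_duplicates(vendors))  # [(1, 2)] -- candidates, not verdicts
    print(merge(vendors, keep_id=1, drop_id=2))

Note that the pairs are only candidates for human review; auto-merging on string similarity alone is exactly the kind of over-promise being complained about.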

Re:In Defense of Artificial Intelligence (1)

boristdog (133725) | more than 4 years ago | (#29953536)

Unless you lock down the permissions so tightly that the system is unusable, your users will enter bad data. They'll add new entries for objects that already exist, they'll misspell the name of an object and then create a new object instead of editing the one they just created. They'll make every possible data entry error you can imagine, and plenty that you can't.

GET OUT OF MY MIND!

-seriously, are you me?

GUIs, games, compilers used to be called AI (1)

peter303 (12292) | more than 4 years ago | (#29953098)

It's just that when some aspect of symbolic computing becomes successful, it's not really considered AI anymore and the goal changes. Alternatively, any computing technology that emerged from an AI laboratory was considered AI-ish.

Some researchers divided this into "soft" and "hard" AI. The latter would be a conversational, human-like mentality; the former is any software technology along the way.

Re:In Defense of Artificial Intelligence (1)

Dachannien (617929) | more than 4 years ago | (#29953116)

Having taken several courses on AI, I never found a contributor to the field that promised it to be the silver bullet -- or even remotely comparable to the human mind.

Douglas Lenat [wikipedia.org] , perhaps?

Re:In Defense of Artificial Intelligence (4, Interesting)

Animats (122034) | more than 4 years ago | (#29953204)

Having taken several courses on AI, I never found a contributor to the field that promised it to be the silver bullet -- or even remotely comparable to the human mind.

Not today, after the "AI Winter". But when I went through Stanford CS in the 1980s, there were indeed faculty members proclaiming in print that strong AI was going to result from expert systems Real Soon Now. Feigenbaum was probably the worst offender. His 1984 book, The Fifth Generation [amazon.com] (available for $0.01 through Amazon.com) is particularly embarrassing. Expert systems don't really do all that much. They're basically a way to encode troubleshooting books in a machine-processable way. What you put in is what you get out.

Machine learning, though, has made progress in recent years. There's now some decent theory underneath. Neural nets, simulated annealing, and similar ad-hoc algorithms have been subsumed into machine learning algorithms with solid statistics underneath. Strong AI remains a long way off.

Compute power doesn't seem to be the problem. Moravec's classic chart [cmu.edu] indicates that today, enough compute power to do a brain should only cost about $1 million. There are plenty of server farms with more compute power and far more storage than the human brain. A terabyte drive is now only $199, after all.

Re:In Defense of Artificial Intelligence (1)

jitterman (987991) | more than 4 years ago | (#29953238)

Only idiots, marketers, ...

Why did you repeat yourself? :)

Re:In Defense of Artificial Intelligence (3, Informative)

ceoyoyo (59147) | more than 4 years ago | (#29953256)

"CASE" isn't entirely bunk either. CASE as CASE might be, but computer aided software design isn't. Perhaps most here are now too young to remember when, if you wanted a GUI, you had to design it by hand, positioning all the elements manually in code and then linking things up manually, in code.

Now almost nobody designs a GUI without a RAD tool of some kind. You drop your GUI elements on the window and the tool generates code stubs for the interaction. That's way, way nicer, and way, way faster than, for example, setting up transfer records for a Windows 3.1 form.
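
For readers who never saw the by-hand era, below is roughly the kind of stub such a tool generates, sketched here with Python's Tkinter; the class and handler names are invented for illustration:

    import tkinter as tk

    class MainWindow(tk.Frame):
        """What a RAD tool might emit after you drop one button on a form."""
        def __init__(self, master=None):
            super().__init__(master)
            self.pack()
            self.ok_button = tk.Button(self, text="OK", command=self.on_ok_clicked)
            self.ok_button.pack(padx=20, pady=10)

        def on_ok_clicked(self):
            print("OK clicked")  # generated stub: application logic goes here

    if __name__ == "__main__":
        root = tk.Tk()
        MainWindow(root).mainloop()

The tool owns the layout boilerplate; the programmer only fills in the handler bodies.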

Re:In Defense of Artificial Intelligence (2, Funny)

stinerman (812158) | more than 4 years ago | (#29953568)

Only idiots, marketers, businessmen

You repeat yourself, Mr. eldavojohn.

Of Course: (0, Troll)

Anonymous Coward | more than 4 years ago | (#29952684)

Microslop Craporation [microsoft.com] .

Virtualization has worked (5, Insightful)

mveloso (325617) | more than 4 years ago | (#29952708)

Not sure why virtualization made it into the potential snake-oil of the future. It's demonstrating real benefits today...practically all of the companies I deal with have virtualized big chunks of their infrastructure.

I'd vote for cloud computing, previously known as utility computing. It's a lot more work than expected to offload processing outside your organization.

Re:Virtualization has worked (4, Insightful)

i.r.id10t (595143) | more than 4 years ago | (#29952804)

Yup, even for "just" development, virtualization has been a great gift. With one or two beefy machines, each developer can have an exact mirror of a production environment, and not cause issues on the production side or even for other developers while testing code and such.

Re:Virtualization has worked (4, Insightful)

Anonymous Coward | more than 4 years ago | (#29952808)

because virtualization only works for large companies with many, many servers, yet contractors and vendors sell it to any company with a couple of servers. You should virtualize your email server ($2,000 by itself, give or take a little), your web server ($2,000 by itself, give or take a little), your source control ($1,000 by itself, give or take a little), and a couple of others. So you have maybe $10,000 in the 5 to 6 servers needed to run a small to mid-size company, and you spend tens of thousands to put them on one super-server running a complex setup of virtualized servers... oh no, the motherboard died and the entire biz is offline.

Virtualization has its place, but only at the larger companies.

Re:Virtualization has worked (2, Insightful)

jedidiah (1196) | more than 4 years ago | (#29952920)

You could replace "virtualization" with "mainframe" or "big unix server" and still have the same issue.

You would also end up with similar approaches to the problem. With some of these (mainframe), virtualization has been mundane/commonplace for decades.

Re:Virtualization has worked (5, Insightful)

VoidEngineer (633446) | more than 4 years ago | (#29952960)

Having been involved in a business start-up for a year or so now, I'd have to disagree. Virtualization is indispensable for QA testing. Being able to run a virtual network on a personal PC lets me design, debug, and do proofs of concept without requiring an investment in actual equipment. Virtualization isn't just about hardware consolidation: it's also about application portability. Small companies have just as much need for QA testing, hardware recycling, and application portability as the large ones.

Re:Virtualization has worked (0, Troll)

pthreadunixman (1370403) | more than 4 years ago | (#29953476)

Ummm... application portability is what operating systems are for.

Re:Virtualization has worked (0)

Anonymous Coward | more than 4 years ago | (#29953050)

Bullshit.

I run a small company doing hosting and development. I have *one* server (quad-core). I use virtualization to test my code and before system updates.

Virtualization has proven itself useful on pretty much every scale - from the smallest single-server machine right up to Google's datacentre.

Re:Virtualization has worked (0)

Anonymous Coward | more than 4 years ago | (#29953190)

uhm, you build a cluster with cheap servers and you will be fine. shoot, even virtualizing and keeping it as a single-use server is still a good idea, as the image can go just about anywhere and you won't have to futz with images for different machines (one email server image that will go on an opteron or a xeon or windows or linux or whatever)

Re:Virtualization has worked (3, Insightful)

javelinco (652113) | more than 4 years ago | (#29953296)

Spoken like someone who investigated the technology five years ago and hasn't updated their information since.

1. If a small business is running more than two servers, then it's likely it'll be cheaper, over the next five years, to virtualize those servers.
2. If a small business needs any sort of guaranteed uptime, it's cheaper to virtualize - two machines and high availability with VMWare, and you are good to go.
3. Setting up VMWare, for example, is relatively simple, and actually makes remote management easier, since I have CONSOLE access from remote sites to my machine. Need to change the network connection or segment for a machine remotely? You can't do it safely without virtualization.

There is more, but I recommend you check this out again, before continuing to spout this stuff. It's just not true anymore.

Re:Virtualization has worked (0)

Anonymous Coward | more than 4 years ago | (#29953386)

Virtualization has its place, but only at the larger companies.

Even if that's true (and other responders have already explained why it's not), that doesn't make it "snake oil", as the article suggests. "Snake oil" is something that simply doesn't do what it has been claimed to do, period. "Snake oil" is not something being marketed to the wrong customers.

Re:Virtualization has worked (3, Interesting)

Just Some Guy (3352) | more than 4 years ago | (#29953508)

because virtualization only works for large companies with many, many servers

You're full of crap. At my company, a coworker and I are the only ones handling the virtualization for a single rackful of servers. He virtualizes Windows stuff because of stupid limitations in so much of the software. For example, we still use a lot of legacy FoxPro databases. Did you know that MS's own FoxPro client libraries are single-threaded and may only be loaded once per instance, so that a Windows box is only capable of executing one single query at a time? We got around that by deploying several virtualized instances and querying them round-robin. It's not perfect, but it works as well as anything could given that FoxPro is involved in the formula. None of those instances needs more than about 256MB of RAM or any CPU to speak of, but we need several of them. While that's an extreme example, it makes the point: sometimes with Windows you really want a specific application to be the only thing running on the machine, and virtualization gives that to us.

I do the same thing on the Unix side. Suppose we're rolling out a new Internet-facing service. I don't really want to install it on the same system as other critical services, but I don't want to ask my boss for a new 1U rackmount that will sit with a load average of 0.01 for the next 5 years. Since we use FreeBSD, I find a lightly-loaded server and fire up a new jail instance. Since each jail only requires the disk space to hold software that's not part of the base system, I can do things like deploying a Jabber server in its own virtualized environment in only 100MB.

I don't think our $2,000 Dell rackmounts count as "super-servers" by any definition. If we have a machine sitting there mostly idle, and we can virtualize a new OS instance with damn near zero resource waste to solve a very real business or security need, then why on earth not, other than that it doesn't appeal to the warped tastes of certain purists?
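
The round-robin dispatch described in this comment is simple enough to sketch; the hostnames and the client call below are hypothetical stand-ins:

    import itertools
    import threading

    INSTANCES = ["foxpro-vm-1", "foxpro-vm-2", "foxpro-vm-3"]  # invented names
    _cycle = itertools.cycle(INSTANCES)
    _lock = threading.Lock()

    def run_query(host, sql):
        """Stand-in for the real client call; one query per instance at a time."""
        return "%s -> %s" % (host, sql)

    def query(sql):
        with _lock:  # cycle() needs a lock when callers are multi-threaded
            host = next(_cycle)
        return run_query(host, sql)

    for n in range(4):
        print(query("SELECT * FROM orders WHERE id = %d" % n))

With three instances, three queries can be in flight at once even though each instance still executes only one at a time.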

Re:Virtualization has worked (0)

Anonymous Coward | more than 4 years ago | (#29953526)

Bullshit. VMWare Server (free, as in beer) on Debian doesn't cost a dime. I have been running 6 virtual machines (guests) on two low-spec boxes (3 GHz P4, 4 GB, IDE disks) since early 2007 and they've been rock solid.

Re:Virtualization has worked (1)

nine-times (778537) | more than 4 years ago | (#29952866)

Yeah, I don't think this stuff can simply be called "snake oil". ERP systems are in use. They're not a cure-all, but failing to fix every problem doesn't make a thing useless. The current usefulness of "artificial intelligence" depends on how you define it. There are some fairly complex statistical analysis systems that are already pretty useful. Full on AI just doesn't exist yet, and we can't even quite agree on what it would be, but it would likely have some use if we ever made it.

Virtualization is useful and has its place, as does "cloud computing" (which seems to mean different things to different people, but regardless it has its uses).

I guess a lot of these things are over-hyped, and the ideas have been sold to people as being better and more trouble-free than they really are. But then, so is everything. For example, Windows 7, Karmic Koala, and Snow Leopard have all failed to solve all of my computing problems.

Re:Virtualization has worked (2, Interesting)

Monkeedude1212 (1560403) | more than 4 years ago | (#29952868)

Today, cloud computing, virtualization, and tablet PCs are vying for the hype crown. At this point it's impossible to tell which claims will bear fruit, and which will fall to the earth and rot.

I agree with your post (not the article): these technologies have all had success in the fields where they've been applied, but ESPECIALLY virtualization, which is way past experimenting and is becoming so big in the workplace that I've started using it at home. No need to set up a dual boot through the BIOS (for those who are scared to venture there), and the risk of losing data is virtually removed (pun intended), because any time the virtual machine gets infected you just overwrite it with yesterday's backup.

I have yet to find an application of Virtualization that has failed to do what it promised.

Re:Virtualization has worked (1)

RAMMS+EIN (578166) | more than 4 years ago | (#29953016)

``Not sure why virtualization made it into the potential snake-oil of the future. It's demonstrating real benefits today...practically all of the companies I deal with have virtualized big chunks of their infrastructure.''

I am sure they have, but does it actually benefit them? In many cases, it seems to me, it's just people trying their best to come up with problems, just so they can apply virtualization as a solution.

Re:Virtualization has worked (0)

Anonymous Coward | more than 4 years ago | (#29953144)

It saves our tiny company ~$260 a month in reduced energy consumption by consolidating a bunch of lightly loaded (different-OS) servers into one virtual box.

Re:Virtualization has worked (2, Interesting)

afidel (530433) | more than 4 years ago | (#29953616)

It saved us from having to do a $1M datacenter upgrade so yeah, I'd say it benefited us.

Re:Virtualization has worked (2, Insightful)

MightyMartian (840721) | more than 4 years ago | (#29953168)

I think the issue I have with both virtualization and cloud computing is a lack of concrete assessment. They are touted as wunder-technologies, and while they have their place and their uses, a lot of folks are leaping into them with little thought as to how they integrate with existing technologies and the kind of overhead (hardware, software, wetware) that goes with them.

Virtualization certainly has some great uses, but I've seen an increasing number of organizations thinking they can turf their server rooms and big chunks of their IT staff by believing the hype that everything will become smaller and easier to manage. The technology is real, has some excellent uses and in a well-planned infrastructure upgrade can indeed deliver real results. But the sales pitch seems to be "replace 10 servers with 1, fire most of your IT department and away you go!"

As to cloud computing, well, it's nothing more than a new iteration of a distributed computing model that dates back forty years or more. In the olden days (back when I was just a stripling) we called it the client-server model. Again, it's a technology with potentially excellent uses, but it, even more so than virtualization, has been hyped beyond all reason. There are profound security and data integrity issues that go along with cloud computing that seem to be swept under the rug. Again, it's the "put your data on the cloud, fire most of your IT department and away you go!"

I'm fortunate in that I have a lot of say in how my budget is spent, but I've heard of guys who are basically having management shove this sort of stuff down their throats, and, of course, win or lose, it's the IT department that wears it when the bloom comes off the rose.

Quite frankly, I despise marketers. I think they are one of the greatest evils ever created: a whole legion of professional bullshitters whose job is basically to lie and distort the truth to shove out products that are either not ready for prime time or don't (and never will) deliver on their promises.

Re:Virtualization has worked (5, Interesting)

digitalhermit (113459) | more than 4 years ago | (#29953442)

I administer hundreds of virtual machines and virtualization has solved a few different problems while introducing others.

Virtualization is often sold as a means to completely utilize servers. Rather than having two or three applications on two or three servers, virtualization would allow condensing those environments onto one large server, saving power and data center floor space, plus all the other benefits (virtual console, ease of backup, ease of recovery, etc.).

In one sense it did solve the under-utilization problem. Well, actually it worked around the problem. The actual problem was often that certain applications were buggy and did not play well with other applications. If the application crashed it could bring down the entire system. I'm not picking on Windows here, but in the past the Windows systems were notorious for this. Also, PCs were notoriously unreliable (but they were cheap, so we weighed the cost/reliability). To "solve" the problem, applications were segregated to separate servers. We used RAID, HA, clusters, etc., all to get around the problem of unreliability.

Fast forward a few years and PCs are a lot more reliable (and more powerful) but we still have this mentality that we need to segregate applications. So rather than fixing the OS we work around it by virtualizing. The problem is that virtualization can have significant overhead. On Power/AIX systems, the hypervisor and management required can eat up 10% or more of RAM and processing power. Terabytes of disk space across each virtual machine is eaten up in multiple copies of the OS, swap space, etc.. Even with dynamic CPU and memory allocation, systems have significant wasted resources. It's getting better, but still only partially addresses the problem of under-utilization.

So what's the solution? Maybe a big, highly reliable box with multiple applications running? Sound familiar?

Re:Virtualization has worked (3, Interesting)

pthreadunixman (1370403) | more than 4 years ago | (#29953346)

Yes, it helps, but it really only helps with under-utilized hardware (and this is really only a problem in Microsoft shops). It doesn't help at all with OS creep; in fact, it makes it worse by making the upfront cost of allocating new "machines" very low. However, it has been and continues to be marketed as a cure-all, which is where the snake oil comes in. VMware's solution to OS creep: run tiny stripped-down VMs with an RPC-like management interface (that will naturally only work with vSphere) so that the VM instances essentially become just really heavyweight processes. We are basically coming full circle, back to ESX being yet another general-purpose operating system with applications written specifically for it, which defeats the entire purpose of using "virtualization" in the first place.

disappointing... (5, Funny)

Known Nutter (988758) | more than 4 years ago | (#29952770)

very disappointed that the word "synergy" did not appear in either linked article or the summary.

Re:disappointing... (1)

jockeys (753885) | more than 4 years ago | (#29952872)

that's not really a "tech" commodity, more of a touchy-feely HR bullshit commodity.

I don't see anything wrong with this list... (5, Funny)

Jazz-Masta (240659) | more than 4 years ago | (#29952926)

We need to bring about a paradigm shift, to think outside the box, and produce a clear synergy between cloud computing and virtualization.

Re:I don't see anything wrong with this list... (0)

Anonymous Coward | more than 4 years ago | (#29953182)

This person should either be given a prize for the most authentic response or executed for crimes against humanity.

Re:I don't see anything wrong with this list... (1)

Jazz-Masta (240659) | more than 4 years ago | (#29953438)

Oh come on now, let's be nice...this is slashdot...one of the most *tolerant* communities on the Internet!

Re:disappointing... (1)

OG (15008) | more than 4 years ago | (#29953058)

Having not read the article, I figured they discussed Jem's hologram-inducing supercomputer in the AI section.

Re:disappointing... (0)

Anonymous Coward | more than 4 years ago | (#29953158)

Neither did Leverage, I think.

Re:disappointing... (1)

jellomizer (103300) | more than 4 years ago | (#29953584)

Synergy does exist. However, most people don't know what it means, and most of the people who are talking out of their butts use it too much. Synergy is when a team of people collaborating together produces output greater than the sum of each individual alone.

However, most people think it is about being excited about your job, which isn't what the word means.

My Meta-assessment (4, Interesting)

Anonymous Coward | more than 4 years ago | (#29952790)

IT snake oil: Six tech cure-alls that went bunk
By Dan Tynan
Created 2009-11-02 03:00AM

Today, cloud computing [4], virtualization [5], and tablet PCs [6] are vying for the hype crown. At this point it's impossible to tell which claims will bear fruit, and which will fall to the earth and rot.

[...]

1. Artificial intelligence
2. Computer-aided software engineering (CASE)
3. Thin clients
4. ERP systems
5. B-to-b marketplaces
6. Enterprise social media

1. AI: Has to have existed before it can be "bunk".
2. CASE: According to Wikipedia [wikipedia.org], it seems to be alive and kicking.
3. Thin clients: Tell the guys over at TiVo that thin-client set-top boxes are bunk.
4. ERP systems: For low-complexity companies, I don't see why ERP software wouldn't work.
5. Web B2B: He is right about this one.
6. Social media: Big companies like IBM have been doing "social media" within their organizations for quite some time. It's just a new name for an old practice.

And as far as his first comment,

"Today, cloud computing [4], virtualization [5], and tablet PCs [6] are vying for the hype crown. At this point it's impossible to tell which claims will bear fruit, and which will fall to the earth and rot."

[4] Google.
[5] Data Servers.
[6] eBooks and medical applications.

Re:My Meta-assessment (5, Insightful)

Tablizer (95088) | more than 4 years ago | (#29952936)

There's a pattern here. Many of the hyped technologies eventually find a nice little niche. It's good to experiment with new things to find out where they might fit in or teach us new options. The problem comes when they are touted as a general solution to most IT ills. Treat them like the religious dudes who knock on your door: go ahead and talk to them for a while on the porch, but don't let them into the house.
     

Re:My Meta-assessment (0)

Anonymous Coward | more than 4 years ago | (#29953370)

Treat them like the religious dudes who knock on your door: answer the door covered in goat's blood and invite them in for an orgy, family style.

Re:My Meta-assessment (1)

jedidiah (1196) | more than 4 years ago | (#29952968)

> 3. Thin clients: Tell the guys over at TiVo that thin-client set-top boxes are bunk.

Nevermind the Tivo. Web based "thin client computing" has been on the rise in corporate computing for over 10 years now. There are a lot of corporate Windows users that use what is essentially a Windows based dumb terminal. Larger companies even go out of their way to make sure that changing the setup on your desktop office PC is about as hard as doing the same to a Tivo.

Client based computing (java or .net) is in fact "all the rage".

Re:My Meta-assessment (1)

jimicus (737525) | more than 4 years ago | (#29953254)

> 3. Thin clients: Tell the guys over at TiVo that thin-client set-top boxes are bunk.

Nevermind the Tivo. Web based "thin client computing" has been on the rise in corporate computing for over 10 years now. There are a lot of corporate Windows users that use what is essentially a Windows based dumb terminal. Larger companies even go out of their way to make sure that changing the setup on your desktop office PC is about as hard as doing the same to a Tivo.

Client based computing (java or .net) is in fact "all the rage".

They've been doing that for years. Strangely, even when your desktop PCs are locked down so tight they may as well be dumb terminals, a lot of people will still scream blue murder if it really is a dumb terminal being put on their desk.

Re:My Meta-assessment (2, Insightful)

NotBornYesterday (1093817) | more than 4 years ago | (#29953430)

So don't tell them it's a dumb terminal. Put a thin client on their desk and tell them they're getting a 6 ghz octocore with 32 gigs of ram and a petabyte hard drive. They'll never know. Most of them, anyway.

Re:My Meta-assessment (2, Interesting)

Mike Buddha (10734) | more than 4 years ago | (#29953060)

2. CASE: Regarding Wikipedia [wikipedia.org] , it seems to be alive and kicking.

As a programmer, CASE sounds pretty neat. I think it probably won't obviate the need for programmers any time soon, but it has the potential to automate some of the more tedious aspects of programming. I'd personally rather spend more of my time designing applications and less time hammering out the plumbing. It's interesting to note that I'm familiar with a lot of the CASE tools in that Wikipedia article, although they were never referred to as CASE tools when I was learning how to use them. I think the CASE concept may have been too broad and gotten a bad name, even though some of the parts were/are useful.

Object Oriented Programming (1, Interesting)

Anonymous Coward | more than 4 years ago | (#29952794)

OOP was hyped as a cure-all, but it only turned out to help in a few portions of apps, and it triggered a philosophical holy war between set fans (relational) and graph fans (OOP). As a new tool to add to the toolbox, fine. As a cure-all, NOT.

The Cloud (5, Funny)

Anonymous Coward | more than 4 years ago | (#29952798)

It has vaporware all over it.

Re:The Cloud (2, Funny)

Shikaku (1129753) | more than 4 years ago | (#29952990)

Clouds are actually water vapors. So it literally is vaporware.

Re:The Cloud (1)

syousef (465911) | more than 4 years ago | (#29953152)

Clouds are actually water vapors. So it literally is vaporware.

...and since it's water vapour, it's no surprise that letting it anywhere near your computer hardware will instantly make that hardware go on the fritz.

Re:The Cloud (1)

Jack9 (11421) | more than 4 years ago | (#29953450)

I was under the impression that water vapor wasn't visible. Clouds are microscopic crystalline suspensions.

Re:The Cloud (1)

maxwell demon (590494) | more than 4 years ago | (#29953384)

However, some of them have an impressive flash interface.

There is just one Myth. (4, Insightful)

cybergrue (696844) | more than 4 years ago | (#29952830)

It arises when the salesman tells the clueless management, "This product will solve all your problems!"

Bonus points if the salesman admits that he doesn't need to know your problems before selling it to you.

Machine translation replacing human translation (3, Insightful)

WormholeFiend (674934) | more than 4 years ago | (#29952846)

Let's just say the technology is not quite there yet.

Re:Machine translation replacing human translation (2, Funny)

circletimessquare (444983) | more than 4 years ago | (#29953434)

"Let's just say the technology is not quite there yet"

aka

"Pertaining to the acceptability, us, speaking of the mechanical acumen almost has arrived, still"

ERP? (1)

Swanktastic (109747) | more than 4 years ago | (#29952848)

I was surprised to find ERP on this list. Sure, it's a huge effort and always oversold, but there's hardly a large manufacturing company out there that could survive without some sort of basic ERP implementation.

Re:ERP? (0)

Anonymous Coward | more than 4 years ago | (#29952904)

how did they survive before ERP systems?

Re:ERP? (0)

Anonymous Coward | more than 4 years ago | (#29953124)

how did they survive before ERP systems?

By being too inefficient to still survive in today's marketplace.

Re:ERP? (2, Insightful)

cryfreedomlove (929828) | more than 4 years ago | (#29952958)

The fundamental problem with ERP systems is that they are integrated and implemented by the second tier of folks in the engineering pecking order. Couple that fact with an aggressive sales force that would sell ice to eskimos and you've got a straight road to expensive failure.

Re:ERP? (3, Interesting)

smooth wombat (796938) | more than 4 years ago | (#29953242)

you've got a straight road to expensive failure.

Sing it brother (or sister)! As one who is currently helping to support an Oracle-based ERP project, expensive doesn't begin to describe how much it's costing us. Original estimated cost: $20 million. Last known official number I heard for current cost: $46 million. I'm sure that number is over $50 million by now.

But wait, there's more. We bought an off-the-shelf portion of their product and of course have to shoe-horn it to do what we want. There are portions of our home-grown process that aren't yet implemented and probably won't be implemented for several more months even though those portions are a critical part of our operations.

But hey, the people who are "managing" the project get to put it on their résumé and act like they know what they're doing, which is all that matters.

an aggressive sales force that would sell ice to eskimos

I see you've read my column [earthlink.net] .

Re:ERP? (0)

Anonymous Coward | more than 4 years ago | (#29953112)

Yes and no on ERP.

Clearly you are correct that any large company, and certainly any manufacturer, is using some ERP (or MRP/DRP). In specific verticals, and with realistic expectations, many roll-outs have been successful, with demonstrable returns.

But the joke was the idea of ERP from a monolithic vendor: an all-pervasive solution for all your business lines and processes, with an implementation that was going to reinvent your company and change all those corporate structures that had, until now, resisted change.

ERP as just another software solution with constraints is a success.

ERP as new-age religion is a failure.

Microsoft silverlight (4, Insightful)

assemblerex (1275164) | more than 4 years ago | (#29952894)

That went over real well once they saw user visits drop by almost half...

No substitute (1)

UnixUnix (1149659) | more than 4 years ago | (#29952902)

Listening to the Willy Lomans of the world is no substitute for insight and understanding. As Plato might have put it, either the managers had better understand technology or the techies get to manage.

Expert systems (3, Insightful)

michael_cain (66650) | more than 4 years ago | (#29952928)

Within limits, expert systems seem to work reasonably well. Properly trained software that examines X-ray images has been reported to be more accurate than humans at diagnosing specific problems. The literature seems to suggest that expert systems for medical case diagnosis are more accurate than doctors and nurses, especially tired doctors and nurses. OTOH, patients intensely dislike such systems, particularly the diagnosis software, since it can seem like an arbitrary game of "20 Questions". Of course, these are tools that help the experts do their jobs better, not replacements for the expert people themselves.
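
A toy version of the rule-based approach makes the "20 Questions" feel easy to see; the rules below are invented for illustration, not medical advice:

    RULES = [
        ({"fever", "cough", "aches"}, "influenza (suspected)"),
        ({"sneezing", "runny nose"}, "common cold (suspected)"),
        ({"fever", "stiff neck"}, "refer to a physician immediately"),
    ]

    def diagnose(symptoms):
        """Return conclusions whose conditions are fully matched by the symptoms."""
        findings = [conclusion for conditions, conclusion in RULES
                    if conditions <= symptoms]
        return findings or ["no rule matched -- refer to a human expert"]

    print(diagnose({"fever", "cough", "aches", "headache"}))

What you put in is what you get out: the system is exactly as good as its rule book, which is why it assists experts rather than replacing them.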

House MD Expert System (1)

Dareth (47614) | more than 4 years ago | (#29953624)

House Rule for AI Medical Expert Systems:

Patients lie.

Thin Clients? (1)

bertoelcon (1557907) | more than 4 years ago | (#29952930)

By the description in here the cloud didn't work because:

Worse, users resented giving up control over their machines, adds Mike Slavin, partner and managing director responsible for leading TPI's Innovation Center. "The technology underestimated the value users place upon having their own 'personal' computer, rather than a device analogous -- stretching to make a point here -- to the days of dumb terminals," he says.

So why does it look good now? Oh, right: different people heard the pitch, and a new generation gets suckered by it.

Re:Thin Clients? (2, Insightful)

abigor (540274) | more than 4 years ago | (#29953156)

Because cloud computing doesn't require a thin client? The two things aren't related at all. Offloading processing and data makes perfect sense for many applications.

Thanks for linking to the print version (5, Interesting)

harmonise (1484057) | more than 4 years ago | (#29952940)

This is a bit OT but I wanted to say that snydeq deserves a cookie for linking to the print version. I can only imagine that the regular version is at least seven pages. I hope slashdot finds a way to reward considerate contributors such as him or her for making things easy for the rest of us.

Re:Thanks for linking to the print version (0)

Anonymous Coward | more than 4 years ago | (#29953018)

There are ONLY six pages you insensitive clod.

Virtualization is not bunk. (3, Interesting)

E. Edward Grey (815075) | more than 4 years ago | (#29952976)

I don't know of a single IT department that hasn't been helped by virtualization of servers. It makes more efficient use of purchased hardware, keeps businesses from some of the manipulations to which their hardware and OS vendors can subject them, and is (in the long term) cheaper to operate than a traditional datacenter. IT departments have wondered for a long time: "if I have all this processing power, memory, and storage, why can't I use all of it?" Virtualization answers that question, and does it in an elegant way, so I don't consider it snake oil.

Fake quote in technology #1 (0)

Anonymous Coward | more than 4 years ago | (#29953006)

The quote "Some day we will build a thinking machine..." which TFA attributes to Thinking Machines corp is bogus, I think. Google turns up only a handful of hits, and the happy times I had with C*/PRISM on a CM-5 left me with the distinct impression that the people at Thinking Machines definitely had their heads screwed on and switched on.

Did TFA just make this quote up?

The crazy hottie (5, Interesting)

GPLDAN (732269) | more than 4 years ago | (#29953024)

I kind of miss the crazy hotties that used to pervade the network sales arena. I won't even name the worst offenders, although the worst started with the word cable. They would go to job fairs and hire the hottest birds, put them in the shortest skirts and low-cut blouses, usually white with black push-up bras, and send them in to sell you switches.

It was like watching the cast of a porn film come visit. Complete with the sleazebag regional manager, some of them even had gold chains on. Pimps up, big daddy!

They would laugh at whatever the customer said wildly, even if it wasn't really funny. The girls would bat their eyelashes and drop pencils. It was so ridiculous it was funny, it was like a real life comedy show skit.

I wonder how much skimming went on in those days. Bogus purchase orders, fake invoices. Slap and tickle. The WORST was if your company had no money to afford any of the infrastructure and the networking company would get their "capital finance" team involved. Some really seedy, slimy stuff went down in the dot-com boom. And not just down pantlegs, either.

Re:The crazy hottie (2, Funny)

chappel (1069900) | more than 4 years ago | (#29953264)

I still remember a visit from a PC sales rep that was hired straight off the Dallas Cowboys Cheerleading squad. OMG I bet she could sell computers.

It's always the hype problem. (4, Insightful)

loftwyr (36717) | more than 4 years ago | (#29953054)

Most of the technologies in the article were overhyped but almost all have had real value in the marketplace.

For example, AI works and is a very strong technology, but only the SF authors and idiots expect their computer to have a conversation with them. Expert systems (a better name) or technologies that are part of them are in place in thousands of back-office systems.

But, if you're looking for HAL, you have another 2001 years to wait. Nobody is seriously working toward that, except as a dream goal. Everybody wants a better prediction model for the stock market first.

Re:It's always the hype problem. (1)

JoeCool1986 (1320479) | more than 4 years ago | (#29953134)

But, if you're looking for HAL, you have another 2001 years to wait. Nobody is seriously working toward that, except as a dream goal. Everybody wants a better prediction model for the stock market first.

We at Qualia Labs [qualialabs.com] are working towards that :). Though we still see it in the context of a very long term goal.

Overhyped, but note quite snake oil (2, Informative)

Jyms (598745) | more than 4 years ago | (#29953074)

I got interested in AI in the early '90s, and even then the statements made in the article were considered outrageous by people who actually knew what was going on. I use AI on a daily basis, from OCR to speech and gesture recognition. Even my washing machine claims to use it. Not quite thinking for us and taking over the world, but give it some time :).

Same with thin clients. Just today I put together a proposal for three 100-seat thin-client (Sunray) labs. VDI allows us to use Solaris, multiple Linux flavors, Minix, Windows, pretty much any OS we wish at the click of a mouse. The biggest problem is guessing what is going to happen now that Oracle is taking over, not the technology/architecture. Yes, Windows (CE) "thin clients" suck and are not very thin, but real thin clients are quite handy.

A lot of these technologies were/are hopelessly over-hyped, but that is a problem with the idiots doing the hyping, not a fault of the technology.

TFA (0)

Anonymous Coward | more than 4 years ago | (#29953094)

If nothing else, thanks for linking to the print version of the article...

Tech cure-all missing option: emacs (4, Funny)

turing_m (1030530) | more than 4 years ago | (#29953136)

Apparently it cures everything but RSI.

Re:Tech cure-all missing option: emacs (0)

Anonymous Coward | more than 4 years ago | (#29953368)

Yes, Emacs was clearly marketed as a technology that promises to revolutionize the enterprise, slash operational costs, reduce capital expenditures, align your IT initiatives with your core business practices, boost employee productivity, and leave your breath clean and minty fresh. I'm surprised they missed it.

Those aren't all (5, Insightful)

HangingChad (677530) | more than 4 years ago | (#29953210)

We used to play buzzword bingo when vendors would come in for a show. Some of my personal favorites:

IT Best Practices - Has anyone seen my big book of best practices? I seem to have misplaced it. But that never stopped vendors from pretending there was an IT bible out there that spelled out the procedures for running an IT shop. And always it was their product at the core of IT best practices.

Agile Computing - I never did figure that one out. This is your PC, this is your PC in spin class.

Lean IT - Cut half your staff and spend 3x what you were paying them to pay us for doing the exact same thing only with worse service.

Web 2.0 - Javascript by any other name is still var rose.

SOA - What a gold mine that one was. Calling it "web services" didn't command a very high premium. But tack on a great acronym like SOA and you can charge lots more!

All those are just ways for vendors and contractors to make management feel stupid and out of touch. Many management teams don't need any help in that arena, most of them are already out of touch before the vendor walks in. Exactly why they're not running back to their internal IT people to inquire why installing Siebel is a really BAD idea. You can't fix bad business practices with technology. Fix your business practices first, then find the solution that best fits what you're already doing.

And whoever has my IT Best Practices book, please bring it back. Thanks.

Re:Those aren't all (0)

Anonymous Coward | more than 4 years ago | (#29953280)

Business social networking?

That would be basecamp. Apparently lots of people use it. It seems a lot slower than mantis though.

Anybody remember "Push" technology? (1)

scorp1us (235526) | more than 4 years ago | (#29953258)

And all that brouhaha surrounding it? We were supposed to sit back and have all that junk crammed down our throats, but we'd want it all, because the database would have our marketing preferences.

What about Linux? (On the desktop) (Sorry, I couldn't resist!)

Incredible labor saving devices (1)

Culture20 (968837) | more than 4 years ago | (#29953276)

Incredible labor-saving devices of the future! Vacuum cleaner salesmen would always say they were selling labor-saving devices. They are actually _more_ work than sweeping with a broom, but the end result is cleaner (brooms just move dust around). Of course, telling a PHB that the virtual environment will cost more in hardware and manpower but will be 3x as good doesn't win points. The PHB only wants a reduction in cost.

Hello, IT department. (1)

theinvisibleguy (982464) | more than 4 years ago | (#29953302)

Have you tried turning it off and turning it on again?

Bad specifications... (1)

osu-neko (2604) | more than 4 years ago | (#29953320)

"The idea of CASE was to produce better code faster by having a computer do it," says McLean. "Just feed your specifications into the front end, and it'll spit out flawless code. The vendors counted on customers who did not realize that the biggest problem in these projects is bad specifications, and they found a lot of those customers. So, people fed bad specs in one end and got bad code out of the other."

So, they never asked a single professional programmer? XD Seriously, has ANYONE EVER gotten a spec that wasn't ridiculously underspecified, internally contradictory, and full of numerous very, very bad ideas? Anyone who's done any professional coding knows that the best way to make a product that doesn't look good, doesn't do anything useful, and doesn't even work right for the things it does do is to give the customer exactly what they asked for. Although I've used that as a tactic before, as a starting point: first, implement exactly what they asked for, then, rather than trying to explain to them why it won't work, show them. But I use that as a last resort...

I call BS on this story (2, Insightful)

FranTaylor (164577) | more than 4 years ago | (#29953330)

"Artificial intelligence" - what's keeping the spam out of YOUR inbox? How does Netflix decide what to recommend to you? Ever gotten directions from Google Maps?

"Computer-aided software engineering" - tools like valgrind, findbugs, fuzzing tools for finding security problems.

"Thin clients" - ever heard of a "Web Browser"?

"Enterprise social media" - That really describes most of the Internet

As soon as I saw an opinion from "Ron Enderle" I knew this story would be BS.

A failure of terminology, not of technology (1)

camionbleu (1633937) | more than 4 years ago | (#29953348)

In the early 90s (pre-web), I worked on two of the technologies mentioned in the article. First, thin clients. I worked on a system where we had a thin user interface layer (two versions: Windows and OS/2 Presentation Manager) talking to a powerful server backend that was doing database lookups and heavy number crunching. It worked well, but this type of architecture morphed into web applications in the mid-90s, and people stopped talking about "thin clients". However, a browser talking to an e-commerce backend or pretty much any other type of web app is precisely that: a thin client. The article is quite foolish to say that thin clients are "making a bit of a comeback". In fact, they have quietly taken over.

Secondly, AI. I worked on an expert system in the late '80s and early '90s that worked very well (and had modest commercial success). To this day, there are plenty of rule-based systems and neural networks in use in real-world situations such as decision support. But the term "artificial intelligence" encouraged non-technical people to overestimate what was possible at the time. The set of technologies commonly referred to as AI has not generally failed; only the term itself, "artificial intelligence", has fallen out of favour.

Cloud Computing? (1)

Toreo asesino (951231) | more than 4 years ago | (#29953350)

That's only hype if you don't understand why you'd use it.

You're building a website, for example; you think it *might* become highly popular and high-bandwidth. Normally you'd have two options: 1, invest in a tonne of infrastructure just in case, and risk hugely over-investing in nothing; 2, don't bother, and risk collapsing under the strain of hugely underestimated traffic demands.

Well, cloud computing takes that worry off your shoulders. If your app needs more "cloud", you can give it extra juice in minutes without any interruption to service. In Azure, anyway, it's just an XML file change: an "instance count" that adds or removes VMs from your collection. Insane amounts of processing horsepower if you want; if you have the cash to match, you can have thousands of servers at your command without even restarting the app. That's value, and it doesn't even take much to "cloudify" an app.

Yes, it costs more to feed it more processing, but you only ever pay for what you need at the time.

So no, cloud computing isn't perfect or for everyone, but it certainly has its place.

ERP is snake oil? (2, Funny)

bazorg (911295) | more than 4 years ago | (#29953480)

Funny, the bit about ERP software. Essentially they say that ERP is not as good as people expected, but once you apply some Business Intelligence solutions you'll be sorted.

I doubt validity of TFA (2, Insightful)

S3D (745318) | more than 4 years ago | (#29953484)

From TFA, philippic against social media:

That's too much information. Before they know it, their scientists are talking to the competition and trade secrets are leaking out."

I don't think the author has a clue. Secrets that could be accidentally spilled in conversation are not worth keeping; if a secret is that short, it's bound to be trivial, while really essential results are megabytes and megabytes of data or code or know-how. Treat your researchers as prisoners and you get prison science in return.

Speech Recognition Software? (0)

Anonymous Coward | more than 4 years ago | (#29953492)

What about speech-recognition software? In the mid 90s Microsoft and IBM thought it was going to be huge, but it remains limited to niche markets.

Java (2, Insightful)

Marrow (195242) | more than 4 years ago | (#29953546)

It was not too long ago that Java was going to:

  1. Give us applets to do what browsers could never do: bring animated and reactive interfaces to the web browsing experience!
  2. Take over the desktop: write once, run anywhere, and render the dominance of Intel/MS moot by creating a neutral development platform!

Yes, perhaps it's found a niche somewhere. But it's fair to say it fell short of the hype.

One statement in support of CASE (1)

jd.schmidt (919212) | more than 4 years ago | (#29953554)

There is a kind of paradox about things like CASE: sometimes the better they work, the MORE programmers you need, because people always want MORE, and now that producing the final product is easier, you get more for your money.

So consider what computers did for us in 1980 vs. what they do today. Improvements in programming have made it possible and cost-effective to do what we do now. So CASE may have been a total success from the point of view of someone in the '80s, but the net result is increased demand, not reduced.

Why Artificial Intelligence may never exist (5, Insightful)

jc42 (318812) | more than 4 years ago | (#29953630)

The most obvious counterexample to the "AI" nonsense is to consider that, back around 1800 or any time earlier, it was obvious to anyone that the ability to count and do arithmetic was a sign of intelligence. Not even smart animals like dogs or monkeys could add or subtract; only we smart humans could do that. Then those engineer types invented the adding machine. Were people amazed by the advent of intelligent machines? No; they simply reclassified adding and subtracting as "mechanical" actions that required no intelligence at all.

Fast forward to the computer age, and you see the same process over and over. As soon as something becomes routinely doable by a computer, it is no longer considered a sign of intelligence; it's a mere mechanical activity. Back in the 1960s, when the widely-used programming languages were Fortran and Cobol, the AI researchers were developing languages like LISP that could actually process free-form, variable-length lists. This promised to be the start of truly intelligent computers. By the early 1970s, however, list processing was taught in low-level programming courses and had become a routine part of software developers' toolkits. So it was just a "software engineering" tool, a mechanical activity that didn't require any machine intelligence.

Meanwhile, the AI researchers were developing more sophisticated "intelligent" data structures, such as tables that could associate arbitrary strings with each other. Did these lead to the development of intelligent software? Well, now some of our common programming languages (perl, prolog, etc.) include such tables as basic data types, and programmers use them routinely. But nobody considers the resulting software "intelligent"; it's merely more complex computer software, basically still just as mechanical and unintelligent as the first adding machines.
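
A concrete way to see the point: the associative tables that were once AI-lab research output are now a one-line builtin, as in this Python snippet:

    # Associating arbitrary strings with each other, once an AI research topic,
    # is now a basic data type in most languages.
    capitals = {"France": "Paris", "Japan": "Tokyo"}
    capitals["Kenya"] = "Nairobi"
    print(capitals.get("Japan"))  # Tokyo -- routine, "merely mechanical"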

So my prediction is that we'll never have Artificial Intelligence. Every new advance in that direction will always be reclassified from "intelligent" to "merely mechanical". When we have computer software composing best-selling music and writing best-selling novels or creating entire computer-generated movies from scratch, it will be obvious that such things are merely mechanical activities, requiring no actual intelligence.

Whether there will still be things that humans are intelligent enough to do, I can't predict.
