
The Biggest Roadblocks To Information Technology Development

Zonk posted more than 6 years ago | from the bumps-in-the-road dept.

Technology 280

ZDOne writes "ZDNet UK has put together a list of some of the biggest obstacles preventing information technology from achieving its true potential, in terms of development and progress. Microsoft's stranglehold on the desktop makes the list, as does the chip-makers' obsession with speed. 'There is more to computing than processor speed -- a point which can be easily proven by comparing a two-year-old PC running Linux with a new PC buckling under the weight of Vista. Shrinking the manufacturing process to enable greater speed has proven essential, but it's running out of magic ... What about smarter ways of tagging data? The semantic web initiative runs along these sorts of lines, so where is the hardware-based equivalent?'"


Paraphrase? (1)

foobsr (693224) | more than 6 years ago | (#21492147)

0. Lack of (artificial) intelligence (still)

More specifically, lack of ability of applications (or lack of applications able) to adapt to the needs of the individual user automagically (top of my wishlist: a memory crutch).

/satire This will be fixed once evil&co realize that such a 'profiler' is a well performing surveillance tool while at the same time realizing that 'progress' that is purely driven by the technologically feasible does not cut it.

CC.

Re:Paraphrase? (1)

morgan_greywolf (835522) | more than 6 years ago | (#21492631)

More specifically, lack of ability of applications (or lack of applications able) to adapt to the needs of the individual user automagically (top of my wishlist: a memory crutch).
Who says you need artificial intelligence to do that? Applications are already able to customize menu picks and available buttons based on context. Memory crutch? Have you used something like Visual Studio or Eclipse, with autocomplete and function-reference-in-a-bubble? None of this requires artificial intelligence, just plain ol' vanilla pattern recognition.
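A minimal sketch of that kind of frequency-based adaptation in Python (the class and menu items are invented for illustration; real toolkits do something similar with plain bookkeeping, no AI required):

```python
from collections import Counter

class AdaptiveMenu:
    """Reorders menu items by how often they are picked.

    No machine learning involved: just frequency counting,
    the kind of plain pattern tracking described above.
    """

    def __init__(self, items):
        self.items = list(items)
        self.clicks = Counter()

    def pick(self, item):
        # Record one use of the item.
        self.clicks[item] += 1

    def ordered(self):
        # Most-used items first; ties keep the original order
        # because Python's sort is stable.
        return sorted(self.items, key=lambda i: -self.clicks[i])

menu = AdaptiveMenu(["Open", "Save", "Export", "Print"])
for _ in range(3):
    menu.pick("Export")
menu.pick("Save")
# menu.ordered() now puts "Export" first, then "Save"
```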

Re:Paraphrase? (1)

advocate_one (662832) | more than 6 years ago | (#21492705)

None of this requires artificial intelligence, just plain ol' vanilla pattern recognition.

"pattern recognition" IS part of artificial intelligence..., /pedant mode = "off"

Re:Paraphrase? (1)

morgan_greywolf (835522) | more than 6 years ago | (#21492827)

"pattern recognition" IS part of artificial intelligence..., /pedant mode = "off"
So you're telling me that grep, sed, awk, vi and emacs are examples of artificial intelligence? (Hint: regular expressions are a form of pattern recognition)
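A couple of lines of Python show how mechanical this is: a regex engine simply tests text against a fixed description, with no learning or inference involved (the pattern and log line below are made up for illustration):

```python
import re

# Pattern *matching*: does the text fit this fixed description?
log_line = "error: connection refused on port 8080"
match = re.search(r"port (\d+)", log_line)
port = int(match.group(1)) if match else None
# port is 8080 -- extracted by rule, not "recognized" by any model
```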

Two Different Uses of the Word (4, Informative)

StCredZero (169093) | more than 6 years ago | (#21493043)

You're conflating two different uses of the word "pattern" from two different computer science/programming contexts and think this constitutes cleverness. BZZZZZT! Wrong! No cigar!

They're not even the same phrases. You're thinking of pattern recognition [wikipedia.org] and pattern matching [wikipedia.org]. Read the 2nd article. They are definitely not the same thing!

We need another RISC revolution, but in support of what we really need as programmers: better support for VMs for high-level languages. VMs in the sense of Xen will also be useful, but we are already making significant progress there.

Re:Paraphrase? (1)

CastrTroy (595695) | more than 6 years ago | (#21493069)

Am I the only one who finds this feature completely annoying? I like things to be the way I left them. You don't come back to your keyboard to find the letters rearranged (or even missing) each day based on the keys you use most often. Why should menu items do this? It's much easier to just have things stay where they are, so you can find them when you need them.

Re:Paraphrase? (2, Funny)

truthsearch (249536) | more than 6 years ago | (#21493245)

You don't come back to your keyboard to find the letters rearranged (or even missing) each day

Depends on if your office is next to mine. :)

But I agree with you. I find it ridiculously annoying. Anything the vast majority of users wouldn't use should be moved to a more obscure location. Anything I never use should just sit there. I can handle not clicking it all by myself.

Re:Paraphrase? (1)

crgrace (220738) | more than 6 years ago | (#21493163)

There will never be any "artificial intelligence" because once a computer achieves something that was considered "artificial intelligence", the phrase is redefined to exclude said achievement. So, by definition, artificial intelligence is impossible. In fact, once something that was considered to require intelligence is achieved, it is then relegated to rote "pattern matching".

Examples:
Adaptive filtering
Chess
Music composition
Real-time target acquisition and tracking

Biggest roadblock? (1, Insightful)

Anonymous Coward | more than 6 years ago | (#21492155)

IT workers and their know-it-all attitudes.

Re:Biggest roadblock? (3, Funny)

cayenne8 (626475) | more than 6 years ago | (#21492261)

"IT workers and their know-it-all attitudes."

Nah....#1 Answer: PHB's [wikipedia.org] !!

Re:Biggest roadblock? (0)

Anonymous Coward | more than 6 years ago | (#21492915)

End users' inability to figure things out.

Re:Biggest roadblock? (1)

jo42 (227475) | more than 6 years ago | (#21493107)

Let me fix that for you:


Dumb users who think they know it all.

:-p

Re:Biggest roadblock? (0)

Anonymous Coward | more than 6 years ago | (#21493203)

Thanks for proving the GP's point.

Americans (-1, Flamebait)

DeeQ (1194763) | more than 6 years ago | (#21492167)

I think it mostly has to do with a lot of the development being done in America. Bigger = better, apparently. If only people would start taking the logical route.

Nothing is stopping you. (1)

AltGrendel (175092) | more than 6 years ago | (#21492495)

You can always fork an open source project. Or better yet, start one of your own.

This isn't meant to be flamebait or a troll. This is the beauty of open source. You can DIY if you want to. You don't have to if you don't want to. You can contribute time, money, etc., to your favorite project. Or nothing at all. But open source allows you to be the solution to the problem that you have noticed.

Go ahead, scratch that itch.

Re:Americans (1)

zymurgyboy (532799) | more than 6 years ago | (#21492545)

You smell of troll, but I'll bite anyway.

If Americans aren't doing it the way you want it, why not grab the ball and run with it yourself? What is stopping you? And what is the "logical route"? Care to elaborate on that?

Re:Americans (1)

morgan_greywolf (835522) | more than 6 years ago | (#21492759)

Ridiculous. There are plenty of open source projects with American developers that think smaller and more modular is better. Take XFCE, for example. Small, lightweight.

On the other side of the pond, there are plenty of European developers that think bigger is better. Take KDE for example. Much of the development team is German.

I'm not saying one approach is better than the other, but the whole point of open source is to give you some choices. You want big and full featured? You know where to find it. You want small and lightweight? You know where to find it.

Horrible (5, Insightful)

moogied (1175879) | more than 6 years ago | (#21492175)

The author clearly has no idea what they are saying.

We haven't come far. Qwerty is 130 years old, and windows, icons, mice and pointers are 35. Both come from before the age of portable computing. So why are we reliant on these tired old methods for all our new form factors?

We are reliant on them because they work damn well. It's not like they were the simplest of ideas; they were just the ones that stuck because they worked.

Better not tell him about the wheel or fire (4, Insightful)

SmallFurryCreature (593017) | more than 6 years ago | (#21492333)

Just because something is old does NOT mean it is obsolete; more and more I see this as an absolute truth. Advancing (oh okay, runaway) age has nothing to do with it.

Some things just work and don't really need to be replaced. Change for change's sake is bad. NOW GET OFF MY LAWN!

Re:Better not tell him about the wheel or fire (1)

CastrTroy (595695) | more than 6 years ago | (#21492607)

Exactly. We could have replaced paper books by now with a small e-reader device, but really, books work a lot better than computer screens for reading in a lot of ways. Even with everyone carrying around a laptop, you'll still see people reading paper books, because it's the best way to do it.

Re:Better not tell him about the wheel or fire (1)

Firethorn (177587) | more than 6 years ago | (#21492971)

Personally I'd read a lot more ebooks if more ebooks were available without the restrictive DRM and hardcover prices as compared to paperbacks.

I'm not going to spend $400 [amazon.com] and $20/book*.

Though I'll admit to considering it as long as I can transfer my webscription [webscription.net] ebooks to it.

*Yes, they advertise "New York Times® Best Sellers and all New Releases $9.99, unless marked otherwise." The whole 'unless marked otherwise' is really reassuring. Besides, I don't normally read best sellers, and I pay less than $10/book.

Re:Horrible (1)

quilombodigital (1076565) | more than 6 years ago | (#21493207)

I agree... give him three shells to clean this mess! :) []s, gandhi

People missing the point (2, Insightful)

suso (153703) | more than 6 years ago | (#21492181)

I'll say it but it isn't going to do any good anyways.

One of the big roadblocks is users not seeing the big picture or not caring. Over the years, I've seen so many programs (especially open source) get off track of their goals because of a large number of vocal users that don't get the point of the program and expect it to do something else.

Or how about the biggest misconception of all time "Computers are going to make your life easier and they are going to be easy to use".

Re:People missing the point (1)

aadvancedGIR (959466) | more than 6 years ago | (#21492281)

"Computers are going to make your life easier and they are going to be easy to use"

You forgot the "Within 10 years, everything will have been programmed and CS will be an extinct profession".

Re:People missing the point (2, Interesting)

suso (153703) | more than 6 years ago | (#21492415)

You forgot the "Within 10 years, everything will have been programmed and CS will be an extinct profession".

Wrong. If you've been paying attention, the computer industry re-invents itself whenever a new medium comes along and all the software gets written all over again.

  • 1970s - Hey computers, lets make a spreadsheet program.
  • 1980s - Hey personal computers, lets make a spreadsheet program for home use.
  • 1990s - Hey windows, lets make a spreadsheet program that crashes.
  • 2000s - Wow, the internet, let's make a spreadsheet program that works from the browser.
  • 2010s - Whoa, virtual reality, those guys are going to need spreadsheet programs in their virtual offices.
  • 2020s - Hey man, there is a computer in my head, I'm going to need a spreadsheet program.
  • 2030s - Oh no, AI, they are going to need spreadsheet programs too. Oh wait, they wrote it themselves.
  • 2040s - Retire


So people who are in school now, still have some time left.

Re:People missing the point (1)

aadvancedGIR (959466) | more than 6 years ago | (#21492645)

Did you notice we were talking about big misconceptions (or wet dreams), such as working AI in our lifetime?
CS has always been, and probably always will be, a self-sustaining industry; the tools and products evolve, but the work doesn't: we are continuously improving things or adding new ones on top of them.

Re:People missing the point (1)

suso (153703) | more than 6 years ago | (#21492811)

CS has always been, and probably always will be, a self-sustaining industry; the tools and products evolve, but the work doesn't: we are continuously improving things or adding new ones on top of them.

Yes, of course, but I was commenting on your statement that in ten years everything will have been written. Of course there are programmers who improve software during its lifetime, add new features, make it mature. But it seems like every time a generation of software becomes mature, a new medium comes along and people say "Hey, let's start from scratch," and then a new generation of software starts.

Here's One More (5, Insightful)

puppetluva (46903) | more than 6 years ago | (#21492185)

The insistence to present everything as a video instead of an article or good analytical summary is holding back technology information sharing (much like this video).

I wish these outlets would stop trying to turn the internet into TV. We left TV because it was lousy.

You got that right! (0)

Anonymous Coward | more than 6 years ago | (#21492539)

The insistence to present everything as a video instead of an article or good analytical summary is holding back technology information sharing (much like this video).
Testify, brother! I can read far faster than some talking head can talk. Why slow down information transfer to the speed of the stupidest illiterate in the audience?

I wish these outlets would stop trying to turn the internet into TV. We left TV because it was lousy.
I think it's mostly narcissism. Look at Boing Boing TV, for example. I love BoingBoing.net, and Xeni is very easy on the eyes, but I'm not interested in watching her play Cronkite... just give me the text and a link, please.

Moron (1)

aadvancedGIR (959466) | more than 6 years ago | (#21492227)

The success of the PC is that it is a quite universal tool. Changing its hardware to deal with some kind of data in a particular way is OK for some niches, but not mainstream. Who would want one PC to go on the web, one for word processing, one for mail...

The number one problem (5, Insightful)

beavis88 (25983) | more than 6 years ago | (#21492231)

The number one problem is all the idiots who are too stubborn/stupid to learn how to use their tools. If these people knew as little about hammers as they do about computers, there wouldn't be a round thumb left in the whole goddamn world. Just because it's a computer doesn't mean you have to turn off your brain.

Re:The number one problem (1)

pthor1231 (885423) | more than 6 years ago | (#21492509)

So true, I wish I had mod points.

Agreed (4, Informative)

Ultra64 (318705) | more than 6 years ago | (#21492593)

"It says click OK to continue... what should I do?"

This is the kind of question I get to deal with at work.

Re:The number one problem (1)

Entropius (188861) | more than 6 years ago | (#21492953)

I think we need to change error messages to things that are technically accurate, with hyperlinks to wikipedia.

Instead of Windows saying "This network has limited or no connectivity" and leaving the user to puzzle out exactly what the hell that means, it should just say "Unable to obtain an [[IP address]] from the [[DHCP]] server: operation timed out."

Those of us who already know what that stuff means will know that they need to go futz with their router; those of us who don't might learn something (from, of course, a local mirror of the relevant articles, since their connection is down).
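A sketch of the idea in Python: map a low-level error code to a message that keeps the technical terms instead of hiding them (the wording and server address are invented, not taken from any real product):

```python
import errno
import os

def explain(err: int, server: str) -> str:
    """Turn a low-level error into a message that preserves the
    technical detail, rather than a vague 'limited connectivity'."""
    if err == errno.ETIMEDOUT:
        return (f"Unable to obtain an IP address from the DHCP server "
                f"{server}: operation timed out")
    if err == errno.ECONNREFUSED:
        return f"The DHCP server {server} refused the connection"
    # Fall back to the OS's own description of the error code.
    return f"Network error talking to {server}: {os.strerror(err)}"

msg = explain(errno.ETIMEDOUT, "192.168.1.1")
# msg names the DHCP server and the timeout explicitly
```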

This is why Linux is a better desktop OS for those of us who know a little: when something breaks it's generally clear what it was. Fixing it might be hard, but at least it's *possible*.

Re:The number one problem (4, Insightful)

Raul654 (453029) | more than 6 years ago | (#21493113)

The hallmark of good design is that people don't have to know how it works under the hood. How many people who drive cars on a daily basis can describe the basics of what is going on in the engine? (And, I should point out - cars are much more mature technology than computers - simpler and generally better understood)

That attitude, which is effectively equivalent to the RTFM attitude many people in the open source community take towards operating system interface design, is IMO the single biggest obstacle to widespread Linux adoption. Also (at the risk of starting an OS evangelism flamewar), it is the reason Ubuntu has become so very popular so recently. Ubuntu gets the design principles right, starting with a well-thought-out package manager (admittedly copied from Debian).

Re:The number one problem (2, Insightful)

Nerdposeur (910128) | more than 6 years ago | (#21493147)

The number one problem is all the idiots who are too stubborn/stupid to learn how to use their tools.

While this is true in some cases, I think it's mostly snobbery. Well-designed tools can be used intuitively.

Most people learn exactly as much as they see a need to learn. How much do you know about how your car works? Your plumbing? Your washing machine? Just the basics, I'd guess - enough to use it. Thankfully, your car's manufacturer has kept things simple for you.

The "idiots" you refer to may have advanced degrees in their field; they just don't happen to be IT people. Don't expect them to waste their time learning everything you know. If you need a lawyer, you'll hire one; if a lawyer needs an IT person, he'll hire one. But in ordinary circumstances neither the law nor technology should intrude in your normal activities.

Re:The number one problem (2, Insightful)

CastrTroy (595695) | more than 6 years ago | (#21493229)

You are right. I've seen many people who are smart in most situations become inexplicably dumb when sitting in front of a computer. People seem to think that the computer should just do everything for them, and therefore their brain shuts off. I'm not sure if that's the exact reason, but it does seem like that is what's happening. Also, I wouldn't expect to be able to walk up to a bunch of woodworking tools and a pile of wood and build a set of furniture for my bedroom without having to learn anything. I don't know why people have this attitude toward computers, where they should be able to use them without any knowledge.

Re:The number one problem (1)

Nerdposeur (910128) | more than 6 years ago | (#21493249)

If these people knew as little about hammers and they do about computers, there wouldn't be a round thumb left in the whole goddamn world.

If hammers needed constant maintenance to function normally, people would stop using hammers.

Ignorance is the biggest obstacle (3, Interesting)

explosivejared (1186049) | more than 6 years ago | (#21492245)

The simple fact that most people don't have a basic understanding of even the most simple IT tasks. Most people look at a computer and see it as just a box that hums and hisses and produces magical pictures. As long as most people have a largely uneducated view of IT it won't "live up to its potential", whatever that may be. Seriously, think about how much more productive an IT worker could be without having to do the constant virus cleanup and such things which can be, for the most part, easily avoided with just a basic understanding of security. Ignorance is the biggest obstacle

Re:Ignorance is the biggest obstacle (5, Interesting)

BeBoxer (14448) | more than 6 years ago | (#21492395)

Think how much more productive an IT worker could be if the software tools didn't require them to learn a bunch of skills which are irrelevant to their job. Back when cars had chokes and manually adjusted spark advance, you would have been claiming how important it was for drivers to get 'basic understanding' of these things. But of course the real answer was to completely hide these details from drivers so that today they have no idea what it even means to choke an engine or advance a spark. Yes, ignorance is a problem. But it's not the users who are ignorant. It's those of us who develop and maintain the IT systems who are ignorantly blaming the users for our own failings.

Re:Ignorance is the biggest obstacle (1)

CodeBuster (516420) | more than 6 years ago | (#21492619)

Not having to worry about something because it is taken care of automatically and not caring, even at a basic level, about what is being done automatically on your behalf are really two different things. To use the tired car example (why must engineers always use the car analogy?), it is good for even the average driver to understand basically how his engine is working even if the details remain unknown. If the driver has some basic level of understanding then he will be better able to judge for himself when the car needs to be serviced by a professional and when he can avoid paying several hundred dollars and a trip to the dealership by dropping into the local auto parts store and doing some basic maintenance himself. Total ignorance is expensive and over time it can really add up. It is better to understand your choices, at least at some level, so that they are not completely uninformed.

Re:Ignorance is the biggest obstacle (3, Interesting)

CastrTroy (595695) | more than 6 years ago | (#21492685)

However, having a computer that doesn't bother its user and just takes care of itself goes against the main directive of computers. Computers are supposed to do whatever the user tells them to do. If the user instructs it to run a virus, it will run the virus. If the user instructs it to go to a phishing site and submit their banking credentials to the server, then the computer will do that. In many instances we've set up a lot of programs to ask the user when they try to do something stupid, but often they click yes, even if the computer advises against it. Maybe what we really need is AI, so that the computer will be able to tell the user "I can't let you do that, Dave", and then all our problems will be gone.

Re:Ignorance is the biggest obstacle (1)

Nerdposeur (910128) | more than 6 years ago | (#21493213)

Mod parent up!

My 50-something parents shouldn't have to learn about virus scans and disk defragmenting and registry maintenance in order to surf the web and send email. They have already spent their careers learning their own specialties.

Why should our tools need babysitting all the time?

biggest roadblocks? (5, Insightful)

Anonymous Coward | more than 6 years ago | (#21492263)

Management.

Contradictory and otherwise trite... (5, Insightful)

lstellar (1047264) | more than 6 years ago | (#21492269)

I personally believe Microsoft's dominance, and its recent antitrust troubles, have helped spur underground and indie programming. Nothing motivates youth like an evil world corporation, no? Granted, they operated a walled garden (or prison?) for many years, but you cannot tell me that a portion of the world's elite *nix programmers aren't motivated by the success of M$.

And different forms of input? How do you release that article today, in the age of the Wii, the smart table, etc.? I think, carpal tunnel aside, that ye olde keyboard is simply the most efficient.

Other than that (and some other sophomoric entries like "war"), this article focuses on true hindrances, in my opinion. I believe lock-out, gaps in education, and copyright laws infringe upon innovation the most. People will always have a desire to make something great, even if it is in the presence of a war, or Microsoft, etc. But people cannot innovate if it means punishment or imprisonment.

Re:Contradictory and otherwise trite... (1)

ducomputergeek (595742) | more than 6 years ago | (#21493235)

A lot of the Unix programmers out there are Unix programmers because the platforms that drive big business were developed and deployed on Big Iron & Unix before Microsoft was even founded. It has more to do with "Our original (INSERT ACCOUNTING/HR/ERP) package was developed for Sun/IBM/DEC back in the 1970's/1980's. Since then we've deployed newer versions on newer hardware, but it remains Unix Based." than with M$ being an evil corporation. Generally these folks are also well paid. Helps with motivation.

Unix had a 15-year head start on business mainframes before Microsoft jumped in the game. And that game hasn't changed much, other than that instead of proprietary Unix, data centers are now choosing Linux. Linux killed proprietary Unix. It still doesn't pose much of a threat to Windows.

Windows (3, Insightful)

wardk (3037) | more than 6 years ago | (#21492279)

I suppose there are those people who will think this a troll.

it's not, and it's the right answer.

Windows is the single biggest stifler of progress in every IT shop I've been in. yes, there are other challenges, but those are, for the most part, workable.

you cannot work around this steaming pile of operating system. it rides on your ass all day, every day, like a yoke a slave might wear as he spends his 14-hour day rowing. every now and then the whip comes down.

remove windows from the IT shop and watch it THRIVE

Re:Windows (1)

Toreo asesino (951231) | more than 6 years ago | (#21492447)

You're right, it sounds like a troll.

I'm sure I speak for most IT professionals when I say when something comes along that's better for the particular job than Windows is, we'll switch eventually. This isn't religion, just practical and professional common-sense.

Until that day, I don't think Windows is that bad to be honest. Having said that, I'd add that competition is healthy and so is diversity, but removing Windows won't achieve anything.

Re:Windows (1)

zymurgyboy (532799) | more than 6 years ago | (#21492663)

Troll or otherwise, you're just wrong. Windows is not the problem. It's the lazy and/or stupid bureaucrats you find in every IT department (top to bottom, much of the time) who admin it that are the problem. The tools they're given to work with would make no difference in their mindset and approach.

Re:Windows (1)

naetuir (970044) | more than 6 years ago | (#21492767)

There isn't a competitor out there that is that much better than Windows. Remember that the system is only as good as the person operating it.

Apple is great - It's much easier to upkeep (from a technical standpoint) than Windows. But you still have user issues, and that's a (the?) major time sink. Sure, you don't have the time expenditure in doing defrags and virus scans, but then again.. If you're in an IT shop and not using some form of centralized management software (MOM, SMS, Antivirus Server, ...), you're doing things wrong.

Same concept for Linux, but *much* more time spent trying to educate the normally uninformed user.

If you're talking about the server market, Windows isn't as dominant as you might think. But even there, any computer (including servers) is only as good as the person that is operating it.

Silos (0)

Anonymous Coward | more than 6 years ago | (#21492283)

Gosh, it's easier to blame Microsoft or the fact that we still use mice for a lack of creativity, isn't it?

The operating system matters less than what we run on it. Operating systems can be modified. Better mouse drivers can be written.

This article is the usual blaming the tool not the market forces that reward crap products. It's easier to get a Zwinky(tm) than find a practical use for the semantic web, but this idiot wants us to worry about whether we're still using a mouse or not.

FUD! (1)

Anonymous Coward | more than 6 years ago | (#21492295)

comparing a two-year-old PC running Linux with a new PC buckling under the weight of Vista
I have a two-year-old PC running Vista and it runs just fine, thank you very much.

We're out of solutions (3, Funny)

Gizzmonic (412910) | more than 6 years ago | (#21492299)

All technological breakthroughs have happened already. The fax machine was the pinnacle of human achievement. Just give up.

Smarter not Faster (3, Interesting)

downix (84795) | more than 6 years ago | (#21492315)

I've said much the same as he did in regards to system speeds. If I optimize my system, I can outperform the latest and greatest my friends have. But I can only optimize so far due to the hardware design. I long for the old Amiga days, where the core of the system was integrated around the CPU while still giving the user a completely flexible design. Heck, you can find decades-old machines running very modern hardware, due to their innovative design. Ever tried to run a modern video card, sound card or NIC in a PC from 1989? I've seen Amigas do it. And they did it through being smarter, not faster.

What, no MySQL? (0)

Anonymous Coward | more than 6 years ago | (#21492319)

It ought to be there...

lack of ability to understand (3, Insightful)

yagu (721525) | more than 6 years ago | (#21492321)

Perhaps the biggest roadblock is the general inability of the masses to grasp technology, combined with technology's allure and ubiquity. Unlike other nuanced sciences (rocket science, brain surgery, etc.), computer technology is trotted out as "easy enough for the masses".

That "easy enough" has trickled down from the anointed few to the general population, both in the work place and in homes.

Now, decisions and directions for technology are driven more by uninformed golf-course conversations than by true understanding of needs and the ability to match technology to solutions correctly. Heck, I experienced an entire abandonment of one technology at management's whim to implement a newer and better solution. This, while the existing solution worked fine and the new solution was unproven. (Coda to that story: five years later, that team is busily re-converting the "new" back to the "old".)

Time and again I see people doing bizarre things with technology... in the workplace, with hubris, unwilling to ask others what is most appropriate, and in the home, where ignorance, while benign in intent, rules. I don't know how many times I've encountered things like people with multiple virus checkers running on their machine because they figure more is better.

At the same time, I remember a salesman trying to steer me away from a PC that wasn't their "hot" item because it had a video card with FOUR megabytes of memory (this was a LONG time ago)... his reasoning? Who in their right mind would ever USE four megabytes of memory for video??? Yeah, this salesman was senior. Yeah, I got it, he was an idiot. But these are the drivers of technology... people not in the know.

And, while I only have limited direct anecdotal experience of this in well-known companies, I would expect it to be more widespread than many might realize.

Re:lack of ability to understand (2, Insightful)

foobsr (693224) | more than 6 years ago | (#21492679)

Perhaps the biggest roadblock is the general inability of the masses to grasp technology

Eventually more like: "Perhaps the biggest roadblock is the general inability of humanity to navigate a complex system beyond an arbitrarily negotiated collection of mostly unrelated local optima".

For short one may name it "collective stupidity".

CC.

Re:lack of ability to understand (1)

Otter (3800) | more than 6 years ago | (#21492733)

Unlike other nuanced sciences (rocket science, brain surgery, etc), computer technology is trotted out as "easy enough for the masses".

On the other hand, rockets and neurosurgery gear provide employment for a tiny number of really smart people, while IT creates jobs for any halfwit who knows how to find the ';' key. For all the sneering about "the masses", I don't think you guys would be happy if they really did stop using computers.

I don't know how many times I've encountered things like people with multiple virus checkers running on their machine because they figure more is better.

I'm not even sure that's wrong, let alone obviously wrong.

Baby steps. (0)

Anonymous Coward | more than 6 years ago | (#21492339)

The biggest obstacle is the relative immaturity of the field compared to other fields. Just look at all the literature on how to improve the process.

dumb users (1)

wwmedia (950346) | more than 6 years ago | (#21492341)

I think the biggest obstacle is dumb users,

you know, the ones that open spam emails, install all sorts of crapware (then end up with their computers becoming part of a botnet) and fuel the whole online advertising industry

Re:dumb users (0)

Anonymous Coward | more than 6 years ago | (#21492517)

And marketing people! Don't forget those godforsaken marketing/project management types who try to manage while being unable to turn on a computer.

Bullshit (3, Informative)

everphilski (877346) | more than 6 years ago | (#21492345)

There is more to computing than processor speed

As someone who does scientific computing, I say bunk! My primary bottleneck is still the processor. FTA:

Too much R&D time and money goes into processor speed when other issues remain under-addressed. For example, could data not be handled a bit better? What about smarter ways of tagging data? The semantic web initiative runs along these sorts of lines, so where is the hardware-based equivalent?

Sure, tagging and controlling data is important, but far from difficult, and with well-written programs a good suite of visualization tools is relatively easy. Give me some speed, dammit! Why should I have to wait for my slot on the cluster when I could have the power right here under my desk?

 

Re:Bullshit (1)

andre.ramaciotti (1053592) | more than 6 years ago | (#21492603)

Sure, tagging and controlling data is important, but far from difficult, and with well-written programs a good suite of visualization tools is relatively easy.
Indeed. Intel and AMD work with CPUs; it's not their fault that no one has created a better way to organize files. And, probably, this new way of tagging files will need more CPU time.

Re:Bullshit (2, Insightful)

Chris Burke (6130) | more than 6 years ago | (#21493009)

Sure, tagging and controlling data is important, but far from difficult, and with well-written programs a good suite of visualization tools is relatively easy. Give me some speed, dammit! Why should I have to wait for my slot on the cluster when I could have the power right here under my desk?

Not to mention that unless he's talking about more efficient data paths (i.e. more IPC instead of clock frequency, but still more overall execution speed), that kind of 'data tagging' is completely inappropriate for a general purpose CPU. That kind of complexity should be added in software, with hardware merely giving it the necessary 'oomph'. As soon as you start putting high-level data storage constructs into a CPU, it becomes an ASIC -- Application Specific Integrated Circuit. Which should imply "limited usefulness and lifespan" because as soon as you want to change how you tag your data, that hardware becomes useless. Sure, after coming up with a good software-based data storage scheme, if you calculate that the performance of the scheme is worth the large cost, then create an ASIC for it. But to admonish the CPU makers in general for not creating such a thing? That's just backwards.
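A minimal sketch (Python; all names such as TagIndex, tag, and find are hypothetical) of the point above: a tag-and-query scheme lives comfortably in ordinary software, where changing the scheme is a code edit rather than a new chip, with the CPU just supplying raw speed:

```python
from collections import defaultdict

class TagIndex:
    """Inverted index mapping tags to item ids -- plain software, no special hardware."""
    def __init__(self):
        self._by_tag = defaultdict(set)

    def tag(self, item_id, *tags):
        # Record that item_id carries each of the given tags.
        for t in tags:
            self._by_tag[t].add(item_id)

    def find(self, *tags):
        """Return the set of items carrying ALL of the given tags (set intersection)."""
        sets = [self._by_tag[t] for t in tags]
        return set.intersection(*sets) if sets else set()

idx = TagIndex()
idx.tag("report.doc", "work", "2007")
idx.tag("photo.jpg", "home", "2007")
print(idx.find("2007"))           # both items
print(idx.find("work", "2007"))   # just the report
```

If you later decide tags should be hierarchical or weighted, you rewrite find(); hardware baked around the old scheme would be scrap.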

Re:Bullshit (0)

Anonymous Coward | more than 6 years ago | (#21493247)

No, he's saying that VMS should have won. And you know, 30 years later, I'm starting to think that he might have been right. VMS should have won. Unfortunately, the UI still sucks.

Open Source?? (1)

Marvin01 (909379) | more than 6 years ago | (#21492375)

What, Open Source didn't make the list?!? On the other hand, neither did Software Patents. Where is a good shill when you need one?

Re:Open Source?? (1)

Marvin01 (909379) | more than 6 years ago | (#21492403)

Oh wait, there it is on page 2. Figures.

Roadblocks may be a good thing (3, Insightful)

jellomizer (103300) | more than 6 years ago | (#21492377)

Perhaps it's because I am a Mac user and am kind of used to the "best of both worlds"
(or worst of both worlds, depending on your priorities) of Windows and Linux. But using all 3 OSs,
I have seen significant progress in the past 8 years. While there hasn't been too much new innovation
per se, no killer apps that will change the world and how we think and do things,
society has greatly changed and technology has improved...

Windows. Love it or loathe it, Windows has greatly improved over the past 8 years. With XP
alone, it got the population off the DOS-based OSs (DOS, Windows 3 through Windows ME) and onto the more stable
NT kernel. As a result, major PC problems have been reduced even as the dangers they face
keep increasing. Take a 98 box and do some web browsing and see how long before it becomes unusable. No, it is
not perfect by any means and there is a lot of suckage to it, and Vista doesn't seem much better,
but there has been a huge stabilization of Windows; even Vista is more solid than 98 or ME.

Linux. It is no longer considered a fad OS or just a baby Unix clone. It
is taken seriously and used widely in the server environment. Desktop Linux never really hit full force,
mostly because of the rebirth of Apple, but there have been a lot of huge improvements in the OS user interface,
and it is comparable to current versions of Windows.

Internet use. During the 90s people used the internet mostly as a fad, but now it is part of their
lives. Just imagine doing things 10 years ago. Most things you needed to go to the store to buy. For information
you needed to trek to the library, and doing papers required a huge amount of time dedicated to finding sources.
There were a lot of things we wanted to know but didn't, because there wasn't any speedy way of looking them up.
Finding people, getting directions: things are much different now than they used to be.

While there hasn't been great innovation, there has been great stabilization and culture change around technology,
which will help spur on the next wave of innovation in the future. We as a culture need time to let massive changes
sink in so we can fully understand what the problems are with technology that need to be fixed.

Semantic web, from ZDNet (2, Insightful)

SmallFurryCreature (593017) | more than 6 years ago | (#21492463)

Right, look at their page, filled with words that have NOTHING to do with the actual contents but that still get noticed by search engines.

All the big sites work like that, designed to show up at no matter what you search for. Games sites are especially bad/good at this, no matter what game you look for IGN will show up as the definitive source for info on it.

If you want the semantic web, dear ZDNet, stop this crap NOW. Start it yourselves and clean up your site so that your pages are only indexed for the actual article, not all the crap around it.

Oh but you don't wanna do that do you, because that ain't economical and will put you at a disadvantage.

Well, that is the same reason behind all your other points. Don't ask Intel to give up the speed race if you are unwilling to give up the keyword race.

Semantic web? Wikipedia is my new search engine, because Wikipedia is one of the only sites that wants to return accurate results rather than spamming keywords like mad.

The semantic web can't happen until you get rid of people who spam keywords. You can't make smarter PCs as long as reviewers and customers obsess about clock speeds.

The first to change might win, but they will be taking a huge risk, and none of the established players will do that. Remember, it took an upstart like Google to change the search market; now that it is big, do you really think Google would dare blacklist IGN from results because they have too many empty pages? Of course not. Maybe the next search company will try that, but not Google.

Change your own site first ZDNet, then talk about how the rest of the industry should change.

Re:Semantic web, from ZDNet (1)

Rogerborg (306625) | more than 6 years ago | (#21492585)

In other news, 95% of drivers agree with the proposition that the guy in front of them should have taken the bus.

The biggest road block is linguistic (1)

crovira (10242) | more than 6 years ago | (#21493267)

in that we never say what we mean.

Try transliterating most expressions, especially curses, across linguistic barriers and you immediately see the problem.

How is a computer supposed to 'understand' you when you can't even understand yourself without years of intimately shared experience?

Google, with its extremely sophisticated pattern matching, is part of the solution, but they can only do so much.

Yahoo, with its human moderated search spaces, is also part of the solution, but they can only do so much.

Deep contextual dependency, a.k.a. the semantic web, is something that is hard to achieve, even in humans.

We will NEVER achieve perfect solutions; language is always evolving. But the solutions will improve over time (they'll require less of it.)

hardware and physical components (0)

Anonymous Coward | more than 6 years ago | (#21492485)

how about hardware? It's the year 2007: we have quad-core CPUs and graphics that are out of this world, but we are still using those hard-to-pull Molex connectors, and what about getting RAM into and out of some Dell cases? Maybe we should focus a LITTLE attention on the physical factors, not a lot, just a little bit.

They missed government regulation (0, Offtopic)

Kohath (38547) | more than 6 years ago | (#21492493)

Government regulation is going to be the main thing holding back technology in the next 20 years. These regulations are spawned by people wanting to substitute their choices for yours and mine. And greed. Examples:

- Restrictions on talking on the phone in your car
- Restrictions on talking on the phone in airplanes
- Electrical rate-hikes and forced conservation to combat Global Warming
- Sarbanes-Oxley and other laws that make business finance riskier (so there are fewer tech startups)
- Internet taxes
- Other taxes that take money away from folks who could buy tech and put it in the hands of governments

There are more examples, but I'm out of time.

Re:They missed government regulation (5, Insightful)

johneee (626549) | more than 6 years ago | (#21492951)

Bull. (Mostly)

Now, I'm Canadian, so I can't comment authoritatively on what it's like in the U.S, but your points make no sense whatsoever. Can it be argued that government gets in the way? Perhaps, but not with the examples you've given.

Phones in cars: If it was just your life you were putting in danger, then who am I to stand in your way? However, this affects everyone around you. You become statistically more dangerous to everyone around you when you're talking on the phone while driving, and you should not have the right to do that. Governments who do this do it because more people are concerned about not getting run over by dorks who can't wait ten minutes to make their bowling plans than there are dorks.

Restrictions on talking on the phone in airplanes: There were (valid?) concerns about cell phones interfering with airplane electronics. Now that these issues are more well understood, the restrictions are going away. Personally, I'd rather them be more safe than sorry.

Electrical rate-hikes and forced conservation to combat Global Warming: Yup. Again, your right to run ten computers at artificially low rates that don't take into account the total cost of the power it takes (including the environmental cost) doesn't trump my right to not have my house under water in 50 years. You're using power, pay the full cost of it.

Sarbanes-Oxley and other laws that make business finance riskier (so there are fewer tech startups): It has been proven over and over again that businesses cannot be trusted to monitor themselves, so the public says things like "they shouldn't be allowed to do that, someone should do something about it so my retirement fund doesn't disappear!". Well, guess what? The "someone" tends to be the government, and the "something" is S-OX. Got a better way to make sure "they" can't do "that"? I'm all ears, but if you say the invisible hand of the market I'm going to flick your ear.

And taxes, well, it costs money to do the business of government. I'd like it to be lower myself, but to say that internet shopping should be tax-free just because it's online is just arrogant and dumb. There may be other good reasons for it being tax-free, but if you want your iPod and you buy it online, you should be paying taxes just like the rest of us chumps. We can make a case for lowering taxes overall, but that's a completely different argument.

Biggest roadblock = artificial limits (1, Insightful)

B5_geek (638928) | more than 6 years ago | (#21492505)

I see the biggest limiting factor that prevents us from experiencing computing nirvana (a la Star Trek; "computer do this..") is artificial limits placed on us by corporations trying to gouge us for more profit.

Cell phone companies: Imagine how much more pervasive internet access would be if data access didn't cost more than a mortgage payment. I can accept a certain degree of slowness based on technical limitations.

ISPs: Offer the moon, and then restrict your access if you try to leave the driveway. "UNLIMITED INTERNET FOR $20/MONTH*" *If you exceed whatever usage we deem too expensive for us, we will charge you hundreds of dollars and give you a bad credit rating.

Media Companies & DRM: Wake up and drink the kool-aid. Your business model has changed and it all started with the VCR. People do not like being forced to jump through hoops. There are multiple options that are available that will allow you to thrive in this digital age but like buggy-whip manufacturers you refuse to adapt.

The problems that come when innovation (0)

crovira (10242) | more than 6 years ago | (#21492947)

and profit (which is by definition "anti-innovation") are forced to survive in the same for-profit economy: the need for true competition to coerce progress from the forces of stagnation.

ARPA, which became DARPA, was a 'not for profit', 'damn the cost', 'pedal to the metal' engine of innovation which tackled the glacial pace of change that existed before then.

It created the environment that made the modern world (the world since 1950) possible and that world in turn has created enormous engines of wealth.

In dealing with climate change, we could have another ARPA, if people are bright enough to see the possibilities as well as the need.

Apollo (0)

Anonymous Coward | more than 6 years ago | (#21492513)

You don't necessarily need war to advance technology... at least not a hot war. The Apollo program spurred numerous technologies. Of course, it was a product of the Cold War.

The number one thing IMHO (1)

hey! (33014) | more than 6 years ago | (#21492541)

is the skills of the people practicing IT. The root of the problem is the skills of the people hiring the people who practice IT, who prefer to hire more, cheaper people rather than fewer good ones.

I can think of three things. (3, Insightful)

LWATCDR (28044) | more than 6 years ago | (#21492605)

The x86, MS-DOS/Windows, and Unix/POSIX.

Yes, the x86 is fast and cheap, but we have it only because it ran MS-DOS and then Windows. I have to wonder just how good an ARM core made with the latest process would be. How cheap would it be at a tiny fraction of the die size of an x86? How little power would it take?
How many of them could you put on a die the size of the latest Intel or AMD CPU? Maybe 16 or 32?
It will not run Windows though...
Take a look at the T2 from Sun.
And then we get to Unix. Yes, I use Linux every day. I love it and I want to keep it. The problem is that I think we could do better. Linux and the other Unix and Unix-like OSes are eating up a huge amount of development resources.

bad mushrooms. (1)

roman_mir (125474) | more than 6 years ago | (#21492627)

A BlackBerry keyboard is a wonder of miniaturisation; shame the same's not true of most BlackBerry users.
- the author is on drugs. BTW, I don't like BBs, but many people can't live without them; the small keyboards are their cocaine, and I am pretty sure those are not Smurfs we are talking about.

The current lack of global wars and/or disasters
- there are plenty of wars going on at any point in time. Let's bomb the author of this POS article; maybe that will help to improve the tech.

The author is an ass.

Idiot clients... (1, Insightful)

Dracos (107777) | more than 6 years ago | (#21492643)

That are too obsessed with what they want, and ignore the developers who know what they need and how to mesh want and need together.

The site I launched last week (prematurely, at the client's insistence) had no content, but it did have the oh-so-necessary splash page with a 5 meg flash video (with sound!) embedded in it that to the casual observer looks like a trailer for a new Batman movie. All the issues I'd brought up since the project began suddenly became important after the site went live (except the lack of content).

Do people go to the dentist and demand that their fillings are candy flavored lead? No. But when that person wants a website, they demand every poison they can think of (splash page, ambush the user with audio, flash navigation that search engines can't follow, giant flash ads for themselves on every page, no content) no matter what the "doctor" recommends.

The best clients don't assume they know the web, and will explain their business model, then ask the developers what should be done.

The biggest obstacle is peace (0)

Anonymous Coward | more than 6 years ago | (#21492661)

War spurs development! Most technological advances have been made in research for the military. R&D only gets proper funding during wartime. Funny but true.

Re:The biggest obstacle is peace (1)

b1ufox (987621) | more than 6 years ago | (#21492869)

Though funny, I consider this argument insipidly idiotic. Why?
Well, war in the way the article mentions was never good, is not good, and will not be good.

As an analogy, I have seen some really talented artists (music artists) who are drug-addicted. This does not mean that one should do drugs to be a good musician.

Hmm (1)

Spykk (823586) | more than 6 years ago | (#21492721)

'There is more to computing than processor speed -- a point which can be easily proven by comparing a two-year-old PC running Linux with a new PC buckling under the weight of Vista.' Are they suggesting Intel and AMD should be developing software instead of improving on their processors?

web 2.0 = skype? (0)

Anonymous Coward | more than 6 years ago | (#21492723)

Why would he mention eBay purchasing Skype in comparison to Microsoft investing in facebook under the heading of Web 2.0? That was a huh moment in the article.

The other huh moment is the idea that global war is going to produce technical innovation at the consumer level. Bunker-busting bombs do nothing for our desktops. And already possessing technical military dominance that is way over the top, the US has no real competition to advance against. The technical advances of World War II were a result of a relatively low state of technology. We're no longer at the same point in development, and to think that war would do anything for us except consume money on expensive weapons is complete and utter nonsense.

In a sense, I guess if we have global war, we'll be nuked back to the days of everyone farming their land and struggling to survive, and yes, then maybe we'll innovate from that point and find ways to get by... like converting Vista DVDs into sparkling scarecrows that keep the birds away from our crops.

It's Really Two Things... (1)

Gallenod (84385) | more than 6 years ago | (#21492743)

...the speed at which humans work and the Graphical User Interface.

We are the main limiting factor in any system. Computers are theoretically designed to meet human expectations of response times. However, how much overall variation in response have we noticed between a 386 running MS-DOS and Windows 3.1 15 years ago and a 2 GHz Pentium running XP today? Maybe compilers run faster, but everyday tasks like word processing or email seem to run at about the same speed from a user perspective. If we designed systems to respond faster, we'd likely confuse or annoy most of the current computer-using populace.

From a GUI perspective, the "modern" GUI is full of speed bumps. Very little has changed since the original Macintosh design (or the original Xerox design, for that matter) except to add more menus, buttons and widgets that make it even harder to develop a reflexive capability in navigating the desktop. Hotkeys are still the fastest way to activate functions, but most people I know still use the mouse and menus even for simple things like cutting and pasting text. You'd think someone would have developed a way to use screen corners plus keys to activate functions, but other than activating or disabling a Mac's screensaver, no GUI designer has used the only four points on the monitor anyone can find while blindfolded to automate functions. Every time we take our eyes off what we're working on we lose "visual attention," which causes a loss of concentration. How many of us can hit any button in a window in any interface without looking? The way GUIs have stagnated under the guise of providing "familiarity" simply adds to our limitations.

We are the weakest link.

Misguided view of computer architecture (2, Interesting)

aneviltrend (1153431) | more than 6 years ago | (#21492749)

... as does the chip-makers' obsession with speed. 'There is more to computing than processor speed -- a point which can be easily proven by comparing a two-year-old PC running Linux with a new PC buckling under the weight of Vista. Shrinking the manufacturing process to enable greater speed has proven essential, but it's running out of magic ...

Yes, there is more to a processor than raw clock speed. But the article misses a great discussion here and suggests "a better way of tagging data." WTF?

AMD and Intel have already realized that faster clock speeds no longer equate to free performance. The newest processors have cache sizes that were unheard of four years ago. Whether consumers realize it or not, multicore superscalar desktop processors have become, and will remain, the norm. These processors can take advantage of parallelism in programs, which is what the article should have addressed: the slow adoption of desktop processor technologies by large software companies.

While some software, such as 3D renderers and other CPU-intensive applications, takes advantage of multiple cores on the same CPU, the vast majority of desktop software is still compiled for a single-issue, single-core CPU. Until compilers are able to exploit parallelism in source code, and programming languages that emphasize run-time parallelism and multi-threading become the norm, desktop performance is going to progress pretty slowly.
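As a rough illustration of the explicit parallelism the post says software must adopt, here is a hand-parallelized sketch (Python; function names are hypothetical). The split across workers is done by hand precisely because, as argued above, today's tools rarely do it for you:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    # Sum of squares over the half-open range [lo, hi).
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_squares(n, workers=4):
    """Split [0, n) into one chunk per worker and sum the partial results."""
    step = n // workers
    chunks = [(k * step, n if k == workers - 1 else (k + 1) * step)
              for k in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Matches the sequential computation, but the work runs on several cores.
    assert parallel_sum_squares(100_000) == sum(i * i for i in range(100_000))
```

The chunking, the worker pool, and the final reduction are all the programmer's burden; that overhead is exactly why most desktop software still runs single-threaded.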

Blame Microsoft and Chip developers? (1)

HerculesMO (693085) | more than 6 years ago | (#21492769)

This is about the stupidest article I've read in a while.

What "holds back tech" is the lack of talent. Plain and simple. If you want to beat Microsoft, you have to out-innovate them. Yes, they have a stranglehold on the desktop, but why? Because they have an open OS that is easy to program for, with low development costs and quick development times. .NET, while far from perfect, is a pretty good building ground. Want to know why Mac gaming, or Linux gaming, never took off? Ask John Carmack, who has praised Microsoft as a very good platform on which to build his engines. And he can do it easily, and cheaply too.

And then you blame chip makers for focusing on chip speeds? Of course, THAT'S the problem! Too fast chips!

Every market, whether it be automobiles, fast food, or anything else responds to what the market will bear and want. It's why consumers buy more Japanese cars than American cars nowadays, simply because they last longer, are cheaper to maintain, and overall, have higher build quality and give better gas mileage. If Ford finally put out a car that wasn't a total piece of shit, don't you think people would buy it? But no.. the tried and true Honda Accord and Toyota Camry are amongst the best selling cars in the entire world. Because they make sure to hit the points that customers will need, and have been shown to want.

The IT world is NO DIFFERENT. You want to "break the stranglehold" on Microsoft? Then go make something better. Make an OS with software around it where each piece complements the others. Right now there is no equivalent to the Office and Windows combination. You can create a document in Windows, in Office. You put that on SharePoint. Then you add .NET code to it. You can share it, and collaborate on it. All through Microsoft software. All reasonably seamlessly. And cheaply to boot. When there are BETTER ALTERNATIVES to the "whole package", then Microsoft will lose its foothold. But ask any financial institution what they need more than anything in their office, and the word back will be resoundingly "Excel." Breaking the stranglehold means offering something BETTER than Microsoft can offer. And right now, there is no better office suite than MS Office, not by a LONG SHOT. And the new version (2007) is actually VERY good, regardless of what the naysayers may have you believe.

I'm far from a Microsoft lover. I hate a lot of things about them, I think Vista is largely pointless, but not BAD. I think that it's too expensive. But in the end, my games play on Windows, not on a Mac, and not on Linux. I could go through the trouble of setting up Wine and getting games to work... but why would I bother? It works on Windows, and it's easy to do. When it's easy to do on Linux or a Mac, then you will see the paradigm shift. But not before then.

Re:Blame Microsoft and Chip developers? (1)

naetuir (970044) | more than 6 years ago | (#21492995)

Y'know, you're saying all the right things, but without the realization that there are tons of options out there. It has absolutely nothing to do with Microsoft being the greatest innovator of our time, as you seem to think.

Office is exactly one type of application on the personal computer. That doesn't account for the hundreds of others out there. Office exists on the Mac. You can create a document on a Mac and put it on a SharePoint server, too. For that matter, there are plenty of other document management systems out there that are not proprietary to Microsoft. I don't know their names off the top of my head, but then again, that's not something I deal with on a day-to-day basis in my IT position. Personally, I find the solutions on Mac OS to be far superior to those on Windows. Windows applications haven't really attempted much innovation in a long time. It's just a bunch of rehashes.

Take a look at Pages (a Word AND Publisher replacement), and Keynote (far better than its PowerPoint cousin: much better animation, very easy to use), and Numbers (okay, so it's still not as good as Excel; it definitely looks better though... oh yeah, and Excel is on the Mac too), and Project X (better than Microsoft Project in many, many ways) or Merlin (almost an exact lookalike for Project, except prettier), and OmniGraffle... and... on and on. Oh! And of course: Entourage = Outlook. Though with OS X Leopard having been released, Mail + Calendar are now basically at the same level as Outlook (which you actually have to purchase with Office).

All those "killer apps" out there for Windows, are on Mac too. The hold over is not coming from Microsoft being better than the rest. Far from it. It's coming from users being familiar with Windows and not wanting their cheese moved.

#1: Pursuit of new shiny things... (4, Insightful)

ErichTheRed (39327) | more than 6 years ago | (#21492781)

I know I'm going to get it for this, but here goes. One of the biggest holdbacks on technology progress is the constant churning of the tech landscape every few months. Before you think I'm crazy, hear me out. How many people work in workplaces that use Technology X where the CIO reads an airline magazine article about Technology Y? The next day, you're ripping out system X, which was actually getting stable and mature, and implementing Y just because it's new. When Y starts causing all sorts of problems, Technology Z will come along and solve everything. Software and hardware vendors love this because it keeps them in business. Most mature IT people can't stand it because they're constantly reinventing the wheel.

There's a reason why core systems at large businesses are never changed...they work, and have had years to stabilize. Along the way, new features are added on top.

I know the thrust of the article was "what's holding up progress in general?" Part of running a good IT organization is balancing the new and shiny with the mature and tested. Bringing in new stuff alongside the mature stuff is definitely the way to go. See what works for you, and keep stuff that works and isn't a huge pain to support.

One other note -- a lot of technology innovation isn't really innovation. It's just repackaging old ideas. SOA and Web 2.0 are the new mainframe/centralized computing environment. Utility computing is just beefed-up timesharing distributed on a massive scale. This is another thing that holds up progress: vendors reinvent the same tech over and over to build "new" products.

It's an industry rag... (1)

Creepy Crawler (680178) | more than 6 years ago | (#21492797)

So there MUST be 10, no more, no less.

Problems with IT development:

1. Proprietary formats: How much effort is lost to "Resend that as a *** file" or "How do I open that file?" We have some decent standards like PostScript, LaTeX, HTML, and OOXML. But everybody is intent on using the newfangled version of MS Office, in which each version is intentionally incompatible with the previous.

2. Proprietary network protocols: We're talking about MS again, this time in terms of SMB file sharing and Kerberos munging. These tactics are purely to sell more copies of MS software, and serve no real good for us users and admins. Even better, we don't know what is actually being transferred, as many times it uses some sort of "hidden encryption".

3. Licensing struggles and legal harassment: As we see with the BSA attacks, proprietary software brings in a segment of liability that GPL (and similar) software does not have. It is impossible to violate the terms of the GPL if you only USE the software. Try saying that about any of the big-box companies. One simply cannot.

4. High speed bandwidth deployment: At least in the USA, we have the telcos and our government to blame. Our country could have a much richer infrastructure and allow IT people remote everything. Instead, many of us haven't the bandwidth to stream an MP3.

The biggest roadblock to development of IT (1)

roman_mir (125474) | more than 6 years ago | (#21492819)

The biggest roadblock is that there are not enough people doing pure computer science research, everything else is secondary.

#7 Skill inequalities (1)

n00kie (986574) | more than 6 years ago | (#21492903)

Technology has traditionally been terrible at attracting anyone but the technically minded. Seen by many as incredibly dull and exclusive, the industry most needs the influence of those who give it the least thought. Even the best technical process could benefit from a little humanity.

About 3 years ago, at a Windows software-packaging contractor, a packager submitted a package for QA review. The QA was a woman in her early twenties who had worked there for half a year before most packagers came aboard. Get this: she FAILED the package because she didn't like how some aspect of the application looked. After being told that the original application's dialogs hadn't been modified at all, she still ordered the packager to change it to something 'prettier', to which he ironically suggested that he should also change the background color and maybe add some flowers. She actually considered it (for about 5 seconds) before she realized he was joking.

Yeah, fuck that! I don't need this kind of 'humanity'.

I/O performance much more important than CPU speed (3, Interesting)

smcdow (114828) | more than 6 years ago | (#21492911)

I'd rather have a machine with a slower CPU but with wide, fast buses and smart, fast I/O subsystems, than a machine with a faster CPU but with crappy I/O. Maybe I'm just weird that way.

Lazy programmers (0)

Anonymous Coward | more than 6 years ago | (#21492921)

Who should we blame? Developers of both operating systems and the applications that run on them. I have a faster processor and more storage than in yesteryear, but the basic requirements of my day-to-day computing experience (web surfing, email, developing some code, listening to music, etc.) have not changed. Yet the operating systems and applications these days are such resource hogs. Before we get more UI candy or the latest 'framework', how about making the code more efficient?

It is becoming an arms race between the lazy programmers and the hardware guys keeping up to produce systems that will support a basic install.


Another missed option (1)

wattrlz (1162603) | more than 6 years ago | (#21492999)

Articles that claim to be IT related, but are really just filling space in hopes of getting advertising revenue.

Ten (1)

ELProphet (909179) | more than 6 years ago | (#21493061)

Print view: http://www.zdnet.co.uk/misc/print/0,1000000169,39291080-39001111c,00.htm [zdnet.co.uk]


I like the war analogy - war against "environmental change, disease and international political and economic upheaval!" ... So, because no one likes change, and everyone puts their own goals and motives first, technological advancement will not meet its full potential. AKA, we need the buggers to attack so we can unite around Ender Wiggin.

The divide (1)

JosefAssad (1138611) | more than 6 years ago | (#21493135)

IMO, the biggest obstacle is the digital divide. The overwhelming majority of people in the world are economically and socially dispossessed, which one can only imagine deprives the rest of us of people who would otherwise have contributed richly to IT development.

Monopoly == Technological Stagnation (4, Interesting)

erroneus (253617) | more than 6 years ago | (#21493153)

This is a pretty well-accepted notion, with numerous examples not only of monopolistic power coinciding with technological stagnation, but of monopolies being busted and things changing shortly thereafter. The most common example is when the phone service monopolies were broken up.

But in most (probably all) states in the US, there is a utility commission that sets the minimum standards for service offerings. Why is this? Clearly, because there is a need to mandate a minimum required level of service. When the utility commissions don't mandate high enough levels of service, we end up with... well, what we see all too often: technological "ghettos" where service providers don't want to invest because the areas yield low returns. If it were up to them, they would rather cherry-pick only the areas that yield premium returns. But even today, there are too many places where DSL isn't available or, more commonly, where fiber service is unavailable.

And all too often we hear about "net neutrality" because the telecoms are complaining that various applications are flooding the internet and threatening to crash it. The answer that they don't want to hear, of course, is that they should be required to scale up their hardware to handle heavier loads. They would rather restrict or impede certain types of service to reduce the bandwidth demand. (Think Comcast)

But beyond communications, when Microsoft or any other company lacks competition, they lose incentive to apply funding to R&D, which directly affects new technologies being developed and released. Microsoft probably doesn't do much R&D. Instead, their strategy seems bent on "buying new things." This makes their R&D budget low and relies on a practice that maintains their monopoly while being parasitic against the rest of the industry. (That is to say when someone comes up with and develops a really good idea, Microsoft is likely to simply buy it... and either suppress it or put their name on it.)

This is a rather "natural" behavior even if it is unhealthy for economies and societies hungry for growth and improvement. Note my assertion that "natural" doesn't mean healthy or good.

In a rut. (3, Insightful)

ZonkerWilliam (953437) | more than 6 years ago | (#21493223)

IMHO, IT is in a rut, just as the article alludes to. What is needed is to rethink the process. Look at providing important information to people where they are. In other words, it shouldn't matter where I am: if I sit down in front of a computer, I should be able to get to my information and applications wherever I am. Information, and not the computer, should become ubiquitous. An RFID card system (with encryption) should allow a person to sit in an office, or cube, and have their phone calls and desktop forwarded to the workstation they're in front of.