
Scientists And Engineers Say "Computers Suck!"

timothy posted more than 13 years ago | from the so-what's-new? dept.

Technology

drhpbaldy writes: "At the latest ACM meeting, scientists and engineers threw mud at computer scientists for not contributing anything useful. They lambasted CS types for developing complex and useless technologies. Some of the fault was placed on VC's for funding only the fanciest and stupidest technologies." Of course, when people say that "design" will save the world, they usually mean their idea of design, which might not jibe with yours or mine.


Sinewave (1)

Anonymous Coward | more than 13 years ago | (#365026)

When you toss coin, call heads tails, only coin land on inside, do you wither? When you stumble all clundyvoo like the stainless wazzjackets that populate oo voof many souls? Do you hander mascronian, or smell tobcoa all fleshy like? Will it be? Or will it soon? Eek leff von fools, all them be brandybums.

The Rambler

Re:Secure Path Login/LogOut (1)

Anonymous Coward | more than 13 years ago | (#365027)

Actually, I'd heard that you can take advantage of the DirectInput part of DirectX to capture that key combination... can anyone confirm this?

Geesh, am I the only one? (1)

Paul Komarek (794) | more than 13 years ago | (#365029)

I almost agree with the premise, with some conditions. Computers aren't doing anything near what they were and are hyped to do. HAL is still a long, long way away. But I don't think that is computer scientists' fault -- I think it is Intel and Microsoft's fault.

These companies told the world that computers were ready to make their lives better. They made a lot of laughable statements that were, unfortunately, easy and desirable to believe. Then these companies mass-marketed their products and made bundles of money. Imagine vulture, er, venture capitalists in 1910 saying "London to New York in 3 hours via plane!" This is what happened in the computer industry, and there has been a lot of disappointment as a result.

Consequently, Intel's research budget grew very fast, evidently much faster than they could improve their designs, by the look of things. However, the companies that were making real advances in processors have been pushed out of business (next week, we'll discuss whether the "efficiency" of capitalism is really the right economic principle to maximize ;-). The same goes for operating systems. It's very interesting to see that the only successful competition for Windows is a piece of volunteer-built public infrastructure that grew on a schedule largely independent of "market forces".

The term Artificial Intelligence (my research, sort of) is horrible, and has probably contributed to the disappointment. I don't think software techniques have matured much. Hardware and hardware processes have become much better -- memory densities, magnetic storage densities, even CRT manufacturing. But I really don't see any improvement in available software. At least with GNU/Linux, there's an attempt to find the right way to do things even if it takes several tries and isn't popular or financially rewarding.

The best thing that has happened, by my estimation, is the interconnection of computers. Networks have proven far more valuable than so many other technologies like speech recognition and vision. Those technologies are very, very interesting, and it's proper for people to study them. But natural language processing has not had an effect on how we get through each day, yet, despite hype from the market.

It's interesting, therefore, to see how Microsoft, Intel, etc. hype the Internet. Watch how they try to spin their products to say that they add "value" to the Internet experience. An Intel P-MCMXXXI doesn't make the Internet any better. The important aspects of the 'net don't depend on Flash or JavaScript, and certainly don't depend on Flash or JavaScript executing on bajillion-megahurts processors. The Internet, the best thing to come to the public from the tech sector (followed by handhelds, I think =-), is useful with or without Intel's latest and greatest. The Internet is even better without Microsoft's latest and greatest Media-Hijacker. =-)

The Internet is valuable for the transmission of information. Computers are valuable for the processing of information in simple ways, really quickly. Neither of these creates new information in any meaningful sense--we still need humans or nature or whatever for that. But none of this sounds exciting enough to sell computers, and as a result Microsoft, Intel, etc. created the hype that has led to a large disappointment. They preached the land of milk and honey but delivered a desert (I'd better watch out for lightning bolts after saying that...).

I like to say that these companies, and the whole PC industry, have been "taxing naive consumers." And now consumers are realizing that these companies have squandered their money. It is ironic, and slightly humorous if you've a strong stomach, that the academics are getting blamed.

-Paul Komarek

Re:Oh please... (1)

PhilosopherKing (7890) | more than 13 years ago | (#365043)

Of course, before the advent of computers, travel was all but impossible. Just look at Columbus's voyage to the New World. If Spain hadn't heavily invested in GPS software and IT infrastructure, how would they have known where they were going? (Of course, had they solved the traveling salesman problem first, they could have gone to the East in only x ln x moves.)

Re:Good design... (1)

DavidTC (10147) | more than 13 years ago | (#365045)

AFAIK, most computer science degrees do require a user interface course.

-David T. C.

Mainstream is mainstream, as with everything. (1)

M@T (10268) | more than 13 years ago | (#365046)

There are millions of bits and pieces of software and hardware out there... not all of it bad, particularly when you start looking at medical implementations and other one-off systems.

As with mainstream TV, mainstream music and any other mass production market... 95% of it sucks... but that other 5%???? Yeah baby!!!

Re:Looks like (1)

damm0 (14229) | more than 13 years ago | (#365047)

There are only a finite number of atoms in any given object that you can lathe away.

Your assertion that a computer cannot do infinite things is quite wrong. A computer may have a finite instruction set, but it has access to an effectively unbounded amount of memory via networking. A computer can, in fact, do an infinite number of things. Even if you try really hard, you cannot write all the programs that can be written.
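A toy C illustration of that point (mine, not the poster's): enumerate every bit string, a stand-in for "every possible program", in length order. The enumeration is perfectly well defined, but for any length you finish, length+1 remains, so no finite effort ever writes them all.

```c
/* Enumerate all bit strings in length order; a sketch of why the
   space of programs over a finite instruction set is still unbounded. */
#include <stdio.h>
#include <limits.h>

int main(void)
{
    unsigned max = CHAR_BIT * sizeof(unsigned long) - 1; /* keep shifts defined */
    for (unsigned len = 1; len <= max; len++) {
        for (unsigned long code = 0; code < (1UL << len); code++) {
            for (int bit = (int)len - 1; bit >= 0; bit--)
                putchar(((code >> bit) & 1UL) ? '1' : '0');
            putchar('\n');
        }
    }
    return 0; /* never reached in practice: ~2^64 lines of output */
}
```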

I guess that's why you want to use Linux (1)

ihxo (16767) | more than 13 years ago | (#365052)

Because you obviously have a serious ego problem. The iMac is more than just the case; like every other Macintosh, it's about ease of use. As I always say, smart people use Linux as a server; suckers use Linux as a desktop (and still think they are really smart).

An engineers take on this. (1)

acomj (20611) | more than 13 years ago | (#365053)

I'm an actual engineer (EIT really, as I didn't get my PE). Having switched careers, gotten a master's in software engineering, and gone to work in software, I can say the field is definitely less structured and disciplined than civil engineering. In civil engineering, plans are looked at by at least 2 people and can only be built if signed and stamped by a PE (Professional Engineer). If something goes wrong, they go see the PE.

Software can be built and deployed by anyone (this is a good thing though.)

Of course, bad software hasn't killed many people yet, while civil engineering disasters have caused fatalities; but as more and more systems depend on computers, more people's lives will depend on code.

I don't think it's as bad as the article made it out to be. The industry seems to be maturing, and more design/engineering is being used. That's because those "CS" types have "discovered" a lot of building blocks and structures that can be put together into an "engineered" system.

Re:Convolusion isn't necessary. Try dialogs. (1)

G-funk (22712) | more than 13 years ago | (#365054)

And if I'm typing away while reading something, I accidentally hit POWER and continue typing, and my sentence has an 'o' and a space in it (as most sentences do), which presses the OK button, and my computer turns off...

Yeah that sounds cool, let's do that!


--Gfunk

Correction: (1)

TheDullBlade (28998) | more than 13 years ago | (#365059)

It should have started like this -

Subject: Apparently unrelated "teaser" title...

Body: ...which becomes a blatantly angry or seemingly thoughtful and even-handed (but actually angry) denial.
---

Tim: Read the article; IT != CS (1)

Skwirl (34391) | more than 13 years ago | (#365064)

"That's the gist of what academics and engineers told IT workers gathered here this week for the three-day Association for Computing Machinery conference."
Get it straight: this article is about the failure of IT professionals to make intuitive computers, NOT computer scientists. Half the people quoted in this article ARE computer scientists. The IT community contains everything from useless MCSEs, to computer scientists, to pointy-haired bosses. Get a clue.

Re:Convolusion isn't necessary. Try dialogs. (1)

Skwirl (34391) | more than 13 years ago | (#365065)

Mmm. I sure do love wading through y/n prompts.

Me fail english... thats unpossible! (1)

_ECC_ (43365) | more than 13 years ago | (#365069)

jibe?.... now you're just makin' words up.

Re:ANGRY DENIAL! (1)

seanw (45548) | more than 13 years ago | (#365071)


You know, if I could, I'd mod you up. Meta-posts like this really save a lot of reading.

sean

Re:If you think about it, most electronic stuff su (1)

Mr. Gus (58458) | more than 13 years ago | (#365075)

> Why do I have to turn 5 knobs and push 4 buttons to make my home theater receiver/TV switch from DSS to PS2?

You do?

> Why do 90% of VCR functions only come on the remote, especially important stuff, like switching to the aux video source?

Because it's cheaper...

> Why does every piece of software come with crappy default settings?

That depends...

A: The author wants the most popular options (regardless of how they interact) enabled and the blandest, least-offensive, most generic defaults possible.

B: The author wants defaults that make them money. If there is an annoying new "feature" that is either a selling point or a lame attempt to get information/money/whatever out of people (somehow), it gets enabled, even if newborn chimps can understand just how doomed the feature is, even after being dropped on their head and cut in fifths.

> Why are we stuck with crappy interoperability between anything? DV vs. D8mm, FireWire vs. whatever, IDE vs. SCSI... you name it...

What exactly are you looking for, there...? VHS-C? :)

> I have a PDA, cell phone, pager, email, etc., but for some reason, getting them all to work together in a synchronous, efficient manner is impossible, unless of course you get all your services from the same company, and who wants that?

I repeat, what are you looking for? I can sort of see the usefulness of getting your cell phone to dial something up from an address book on a PDA once you've found the number, but it's not much of an increase in convenience (actually dialing a number isn't mind-numbingly difficult, to say the least). And you'd think a cell phone should make a decent pager as well (although having both at once seems like the on-the-go version of using your answering machine to screen your calls, or something). What is so wrong with these devices not communicating? I don't own any of them, so, umm, anyway.

> I know I'm generalizing, but these 'engineers and scientists' are the same jerks who've been pushing shitty technology down our throats... if all engineers and scientists contributed to what makes sense and pushed good technology instead of market-speak, we'd have had Beta VCRs, be running Linux, and stupid shit like the DMCA/UCITA wouldn't exist...

Err, not that companies didn't do their darndest to manipulate the markets, but it wasn't a bunch of engineers that decided to buy more Beta VCRs and tapes than VHS VCRs and tapes... And so on.

> my 2 cents ....

I'm penniless. And sigless. I think.

Re:not contributing anything useful? (1)

Bios_Hakr (68586) | more than 13 years ago | (#365076)

Actually, mister grammar nazi, it would be '16-year-old'.

I honestly thought all these guys would have gotten bored years ago. Thanks for the trip down memory lane.

CompScientists not contributing anything? (1)

Trekologer (86619) | more than 13 years ago | (#365081)

That's odd... I guess all the software that makes the hardware that engineers (computer and electrical) design useful is just nothing. I also suppose that hardware would rise up and work on its own if it weren't for those nasty computer scientists...

Oh, wait. That's right... Electronics are just expensive useless boxes without the software to make them do something useful.

A wise person once said, "Engineers think they know everything, but they don't know shit."

As for vanity coming before functionality, remember that all the products that they cited are just that... products. Consumers need a compelling reason to buy them. That's why "the computer for the rest of us" (the Macintosh, if you don't know the reference) used simple graphics as the user interface. A form of *nix (or another command-line system) just wouldn't sell at all. All those in academia have their heads in the clouds when they say that fashionable technology is foolish. Consumers will buy a shiny, "cool"-looking device with half the functionality or intuitiveness of something that looks like it should be in a lab. That's a fact.

Re:Funding only stupid techonologies? (1)

Greyfox (87712) | more than 13 years ago | (#365084)

http://www.fufme.com

You could set up a network of these and fuck yourself in the ass. If I read their product specs correctly.

Re:not contributing anything useful? (1)

nublord (88026) | more than 13 years ago | (#365085)

Heh.

It is not science, it is an art!! (1)

cs668 (89484) | more than 13 years ago | (#365087)

I am so tired of people acting as if systems/software development is a science. It is not. It is an art. Some people are great artists and some are not. Instead of spending time in "Computer Science" classes, they should have "Code Art Appreciation" classes.

I have worked with so many academics who produce the worst implementations. They think implementation is not important, but that is just a cover for the fact that they do not have the eye for elegant design/implementation.

The real problem is... (1)

cworley (96911) | more than 13 years ago | (#365089)

As Intel made faster processors, Microsoft added more functionality, which exponentially grew in complexity given their poor kernel design.

The functionality was inevitable... Microsoft had to catch up to Unix, but the complexity was way beyond the average user.

Now, the average user needs a great deal of support and a sysadmin that they can't afford (or didn't used to need to afford), so it's their brother-in-law, or the guy down the hall who's into computers (even though it's not his job), who's got to fix Microsoft's problems. Microsoft support can't help... they don't have the proper education. Neither does the standard MCSE.

Microsoft has started saying "Software is support", but they don't yet understand this new mantra.

We're all used to hiring professionals to fix our appliances... but not our computers.

Large corporations hire armies of sysadmins. Sun and IBM know how to sell them service and support.

Small businesses and individuals are going to have to get used to hiring professional support to maintain their computers.

uh yeah (1)

PharCyDE (101385) | more than 13 years ago | (#365093)

Whatever... they seem to be more interested in making jokes than actually thinking about what they were saying. If computers haven't changed in the past 20 years, yet most people don't know how to use them, what does that say about most people? And iMacs... that's just a case where style is more than substance. Screw it... personally, I'd rather live in a world where most people don't know how to use a computer; then those of us who do can make more money. A computer that's smarter than a toilet? Is he talking about core dumps???

Re:I guess that's why you want to use Linux (1)

PharCyDE (101385) | more than 13 years ago | (#365094)

I have the ego problem? Linux is cheaper, more powerful, and a lot more stable than what's out there... plus it looks nice on your resume. Ease of use? Learn how to use something properly and there will be no issue of how easy it is to use... guess the command line is too much for someone like you. And if more people knew how to use computers, it would put less of a premium on those skills, which would affect people with programming/engineering skills. Pull your head out of your arse before you make a post. And if you see what the average Mac user says, he swears his G4 is faster than any other computer out there... but unfortunately that's just wishful thinking. Damn shame you guys don't realize style != substance... at least Steve Jobs lets you pick what color your iCrap will be.

Re:LAMEST. ARTICLE. EVER. (1)

Count Spatula (103735) | more than 13 years ago | (#365096)

Not only that, homes, but you also have to consider the fact that the 'corporate public' has been calling for a somewhat standardized computing situation for years now, and now that they have it, they piss and moan that it hasn't changed.

Fsck them. Really.

Yeah, it's gotten better over the years, but it hasn't changed. Good. That means that the user of old can interface with the new platforms. Is that bad?

Re:Good design... (1)

Trepalium (109107) | more than 13 years ago | (#365100)

Or perhaps more "Computer Science" programs should mandate courses in user interface design, because there are just too many BAD DESIGNS out there. The problem isn't that interface design is a difficult concept to understand, but rather that it's not considered a priority.

Re:Funding only stupid techonologies? (1)

jred (111898) | more than 13 years ago | (#365102)

Damn, I wish I could mod this up. My girlfriend was just trying to talk me into doing a new case mod, with one of those mechanical mouths mounted in the front.

jred
www.cautioninc.com [cautioninc.com]

Re:"Too easy" shutdown procedures (1)

tooth (111958) | more than 13 years ago | (#365103)

The early Microbee had a similar reset key, and I'm reminded of it when I hit the power-off switch on my PC accidentally or too soon (uttering the phrase "oh, damn"), and then have to do complicated aerobatics to shut the machine down properly.

At least the Microbee only reset when you released the key, but it was still a pain when you hit it (not just figuratively either: finger cramps from not moving, "come on, hurry up!").

Innovation? Maybe not. . . (1)

larsal (128351) | more than 13 years ago | (#365104)

This reminds me of an interesting question I'd been thinking over recently:

Has there really been much in the way of innovation in computer science since the heady days of the 1970s, or have the last few decades been more of a playing out of the various innovations and ideas produced back then, now merely practical thanks to lower costs?

Considering:

  • Successive products now tend to improve on things by brute force, rather than by finding new alternatives [eg. bloatware, rising transistor counts].
  • Many things heralded almost as new technologies are refinements of existing ones, rather than significantly new processes or methods [eg. .18-.13 micron fabrication, Gigabit/100BaseT Ethernet].

Are there, perhaps, other examples? I don't believe that these "scientists" are necessarily upset with the work done by CS workers. I think they, quite rightly, object to the popular notion, seldom disputed by the industry, that constant innovation and invention are part and parcel of the marketplace.

The analysis of statistics, the backbone of much scientific research work on computers, may be aided at present more by the application of brute processing power than by any inventions since the transistor.

Given that, who's to say they don't have a point. . .

Larsal

A load of hooey. (1)

dcollins (135727) | more than 13 years ago | (#365108)

I'm kind of guessing that this presentation was much more tongue-in-cheek, or a joke, than the article presents it. If not, it's really downright bizarre.

> The PC in many respects has too many functions, and it does them too poorly. So there's all the reason in the world to look at different choices.

So I guess that's either lobbying for an array of specialty digital appliances, or handicapping computers so they can't do so many functions (i.e., be a generalized logical device). I mean, come on, that's the whole magic of computers. Call me a hobbyist or whatever, but you can pry my word-processing, spreadsheet-calculating, internet-surfing, game-playing, software-developing PC out of my cold, dead arms.

It's stuff like this that is making me more and more likely to completely disregard as rubbish any presentation that includes the phrase "computers need to be more intuitive and user-friendly"...

backwards compatibility (1)

Racer X (140445) | more than 13 years ago | (#365111)

I thought the article was interesting, but it seemed to overlook the basic practical necessity of backwards compatibility. It probably wouldn't be too hard at all to start fresh on a new kind of machine for doing the things we (supposedly) have so much trouble getting done with the computers of today--but what about when you need to transport or view standard file formats? Who'll write the new driver you need for the network printer? How will this machine talk to all the other machines we already have? Practical technologies don't exist in an academic vacuum--they have to be designed with real-world constraints in mind. And that means making new devices that work with old devices. And that means it's very difficult to make a clean start.

Say It Ain't So (1)

Rura Penthe (154319) | more than 13 years ago | (#365114)

"Industrial designers poked fun at virtually all facets of computers and other electronic gadgets, and the Apple iMac--displayed in PowerPoint presentations in its groovy new shades--bore the brunt of scorn and jokes about how fashion has superseded functionality. "

Huh? The iMac and other Apple designs (although controversial) consistently win industrial design awards from many quarters... what industrial designers are these? Even those who think it's dumb have to acknowledge that colored computers have been wildly successful for Apple, at least. As for fashion superseding functionality, what about when an engineer puts functionality over usability? :)

"One presenter went so far as to blame the non-intuitive, non-human oriented design of desktop computers for the current economic slowdown that has ravaged the broader technology sector."

He what? Someone please enlighten me as to how these computers, designed by humans for humans and (in general) with extensive human interface group research, are non-human oriented. The evolution of the GUI since it *began* has been toward more and more intuitive and user-friendly interfaces (once again, in general)...

(Flamebait): Why are engineers ragging on CS when they are responsible for more UI idiocy than most? Engineers tend to want to make UIs that favor the knowledgeable and competent person, not make it easy to use for anyone.

Cost of mistakes (1)

Halo- (175936) | more than 13 years ago | (#365122)

The software industry is laden with executives who care more about making the time-to-market window shorter than about producing a quality product. Since the turnover rate is so high, the "guilty parties" have moved on to selling the next vaporware product, leaving over-taxed development to follow through on the previous pipedream.
I'm sure the hardware industry would run the same way if the consequences weren't so great. If XYZ Corp puts out some crappy web-enabled-XML-Java-object-oriented turd, it hits the market, flops, and dies quietly. If XYZ Corp is producing, say, CPUs, the cost of failure could well destroy the company. Plus, there's a lot less "well, this is just the beta" stuff. The chip works (for the most part) or doesn't get released.
I suppose my point is that geeks are geeks, and we all have more bad ideas than good ones. In the software world, these ideas can see the light of day. In hardware, the bad ideas get weeded out early. Quite frankly, I think the management of the technical talent dictates the quality of the idea. And the truth is, most management will take the shortest path to getting paid. For hardware, the path is a little longer...

Re:not contributing anything useful? (1)

tfrayner (186362) | more than 13 years ago | (#365125)

Well, if we're going to be picky about grammar etcetera... I'm fairly sure it's 'jive', not 'jibe', that the original poster means. Although it could go either way, I suppose. Opposite meanings, y'see.

Just remember, I didn't start it :-)

Disclaimer: Two beers down may be too many to be able to communicate in a grammatically correct fashion...

Re:not contributing anything useful? (1)

tfrayner (186362) | more than 13 years ago | (#365126)

Oops. Can't argue with that. I must be confused about something I read about a year ago in Bill Bryson's book "Mother Tongue". I thought he mentioned something about the meaning of these two words having been confused. I would go and check, but my copy of the book is currently 50 miles away at the bottom of a box ready to be shipped to Scotland. And so it goes...

Near-sighted nerds. (1)

billcopc (196330) | more than 13 years ago | (#365131)

Although I can't help but agree with some of the (few) arguments cited in the article, and I have to admit that there's plenty of useless crap that's come out of the CS field lately, these scientists and researchers seem to have ignored the not-so-intellectual side of things: making the damned PC more popular and attractive to common people.

While there may not have been many remarkable technological achievements, marketing and image have been steadily improving and focusing on Joe Random Luser. This will bring more eyeballs to the next great neat project that will be perceived as the greatest thing since sliced bread. In the long run, more people will ultimately benefit from future advances in computing. Now I'd like to see an astrophysicist say the same about their practice.

IT vs. Academic (1)

guinsu (198732) | more than 13 years ago | (#365134)

I think IT and computer companies have done more over the years to make computers easier to use than the academic world has. It's certainly a case of the pot calling the kettle black. After all, industry gave us Windows and the Mac; the academic world saddled us with Unix.

Re:Say It Ain't So (1)

angry old man (211217) | more than 13 years ago | (#365140)

> (Flamebait): Why are engineers ragging on CS when they are responsible for more UI idiocy than most? Engineers tend to want to make UIs that favor the knowledgeable and competent person, not make it easy to use for anyone.
I don't know which engineers you're talking about, but none of the engineers I've ever worked with could even figure out how to archive an Outlook folder. I'm sure any sysadmins will back me up on this. Most engineers think that they are above current computer technology, hence they don't bother to learn it.

Dilbert (1)

rattid (214610) | more than 13 years ago | (#365142)

I can see a Dilbert-like comic with engineers poking fun at computer scientists.

Computers *suck*??!!??!! (1)

LaZZaR (216092) | more than 13 years ago | (#365144)

Yeah right. I'd like to see them do their work without computers.

Re:Computer scientists will rule the world (1)

Maudib (223520) | more than 13 years ago | (#365146)

I don't generally like replying to trolls, but in this case I am concerned. I think you should seek counseling. No, seriously. I mean it. I can only think of a few other examples of cynicism that match yours, possibly Nietzsche. There is more to life than the bottom line. Just about everything, in fact.

Re:Finally some truth at Slashdot! (1)

norwoodites (226775) | more than 13 years ago | (#365149)

A language that combines both Smalltalk and C is probably better than a language based on C alone or one based on Smalltalk alone.

Re:not contributing anything useful? (1)

ConsumedByTV (243497) | more than 13 years ago | (#365153)

Why be the grammar nazi?
Why not the American Secret Service?



Fight censors!

A market full of WHAT? (1)

RetroRichie (259581) | more than 13 years ago | (#365156)

"We have a market of very confused customers and observers."

My butt. What we have is a market full of technophobic, ignorant baby boomers. Just because it's not intuitive to a 50-year-old marketing manager doesn't mean it's not intuitive to a 10-year-old.

Most Gen X people (and ALL generation-whatever-they-are kids) find the Windows GUI, for example, to be very intuitive and very simple. Hell, my 5-year-old cousin flies around my PC. Screw the baby boomers and the engineers who are blowing them. In 10 years, you won't see these arguments anymore.

but so they swallow ? (1)

llamasonic (261549) | more than 13 years ago | (#365158)

Come to think of it, I met my fiancée through this sucky computer/Internet useless-technology thing... I wonder how many people have met their mates online?

Regardless of how unintuitive the technology is, it is bringing people together every day, in droves, people who might not have ever crossed paths before.

StarTrek (1)

travis77 (261826) | more than 13 years ago | (#365159)

When I was a kid, I was amazed by the computers on Star Trek (Next Generation). You didn't have to boot them or really know how to use them; you just used them. Paper was a thing of the past, and a book was something read for pure enjoyment, not to hold the crap found in a textbook. Growing up on Star Trek, I expected computers to be like the ones on Star Trek, and now that I am older I am kind of disappointed that they are not, and that I have to type and click instead of issuing a voice command like I would to my waiter. The problem is simple: this is still a young industry. The first computers were invented only a little over 50 years ago, and the average person has only been interacting with PCs for around 6 years, maybe less. Just give it time.

Travis

I'm a scientist (1)

wroot (264810) | more than 13 years ago | (#365161)

... and the authors of this paper do not represent me.

Contrary to what some may believe, most scientists are quite intelligent, especially in fields like physics and math. Non-analytical or less analytical fields like biology are where you could actually encounter cry-babies complaining about how their computers aren't user-friendly enough (Yes, biologists do need to use computers for things like DNA sequence analysis, spectroscopy, X-ray crystallography, etc.)

Just trying to clear up the matter. The title should say "Some scientists say computers suck." instead.

Wroot

FORTRAN( Re:Where were the "scientists and en) (1)

wroot (264810) | more than 13 years ago | (#365162)

> If anything, most of the scientists I know are perfectly happy to write in FORTRAN (gack) and couldn't care less about the marketability of personal computers.
Because of certain language features, such as the rule that procedure arguments may not alias each other, FORTRAN compilers can optimize code and take advantage of hierarchical memory (i.e. RAM vs. different levels of cache) better than most modern C/C++ compilers. I experimented with it myself and saw FORTRAN to be several times faster for things like matrix multiplication than equivalent C/C++ code.

I still don't write in FORTRAN because g77 (the GNU FORTRAN compiler) is archaic (Fortran 77 doesn't even have dynamic memory allocation) while modern compilers aren't Free/GPL'ed. BTW, it's a shame GCC settles for the 24-year-old standard.
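A small C sketch of the aliasing point above (illustrative, not from the post): in C the compiler must assume the two pointer arguments might overlap, so it generates conservative code, while Fortran forbids such overlap between dummy arguments; C99 later added `restrict` so the programmer can make the same promise explicitly.

```c
/* Without restrict, the compiler must assume x and y may overlap,
   so y[i] stores can invalidate cached x values. */
void axpy(int n, double a, const double *x, double *y)
{
    for (int i = 0; i < n; i++)
        y[i] += a * x[i];   /* possible aliasing: reloads each iteration */
}

/* With restrict (C99), the programmer promises no overlap, which is
   the guarantee Fortran gets for free from its argument rules. */
void axpy_restrict(int n, double a,
                   const double * restrict x, double * restrict y)
{
    for (int i = 0; i < n; i++)
        y[i] += a * x[i];   /* no aliasing: free to keep values in registers
                               and vectorize */
}
```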

Wroot

Where were the "scientists and engineers"?? (1)

dissipative_struct (312023) | more than 13 years ago | (#365171)

I didn't see mention of scientists and engineers in the article's quotes... there was one guy who was a "computer scientist", and another was "Chief Scientist" of a graphics company, but it didn't sound like a wholesale assault from a bunch of EEs and physicists. I wasn't at the ACM meeting, but it sounds to me like this article warped some comments into a "Scientists vs. Programmers" vein to make it more interesting. In any case, when you're talking about things like interfaces and fundamental hardware design, aren't you talking about SEs and EEs and CEs and what have you? Not computer scientists...

Given that this was an ACM meeting, I don't think the views represented here are at all indicative of "scientists and engineers" in general. If anything, most of the scientists I know are perfectly happy to write in FORTRAN (gack) and couldn't care less about the marketability of personal computers. Engineers would be more likely to care about usability, but if anything they would be the ones looking for a full-featured computer, not an "appliance". Sounds to me like a bunch of people hurt by the recent economic problems associated with the stock market are angry and want someone to blame... a few may happen to be engineers and scientists, but I very much doubt this is at all a majority opinion.

Looks like... MOTORS (1)

mtsang (316278) | more than 13 years ago | (#365179)

> Look at the history of tools. First there were separate tools, each one doing a single or small number of jobs. Even a Swiss Army Knife was limited to about as many tasks as it had specialized attachments...
Ok. Let's look at history. In the past, there were home motors you could buy. Big, unwieldy devices, with attachments that allowed them to act as a blender, hairdryer, clothes wringer, etc.

We don't buy 'generalized' home motors anymore. Instead we have small specialized motors embedded in the home: in a hairdryer, in the microwave, in the air conditioner, in the fridge, in your scooter, in your car, in your CPU fan, in your hard drive, and in your automatic camera.

> But the computer is far far different than any other tool that came before. The computer has the ability to be an INFINITE (or at least huge enough that you won't exhaust the possibilities in the lifetime of the universe) number of tools.
Sounds not so far, far different from the motor to me... Our needs are complex, but technology can adapt to our needs, not the other way through the looking-glass.

An automatic camera is intelligent in the domain that a camera needs to be because it is specialized. It can calculate the distance to objects, set the aperture, focus, flash, and advance the film, all within a fraction of a second, with the push of a button -- because it is specialized.

> But the difference is that our specialization is in the software, and the specialization they are proposing is in the hardware. If I want a single purpose tool, I don't need a computer to get that.
You may not need a general-purpose computer at all. The specialization I'm talking about is deeper than what you are talking about. You can't easily get the affordances of pencil and paper through a general-purpose computer. Different forms are conducive to different uses. Specialized hardware and software in conjunction make it easier to do things (see the automatic camera example above). Don't tell me that your mouse/keyboard/monitor combo is equally good for doing spreadsheets/word processing as it is for drawing or painting. Why are secretaries and artists (people who do fundamentally different things with the computer) working with essentially the same system?

It doesn't make much sense.

We definitely will NOT have behemoths like the current home computer as a common household item of the future.

Complaint is not well founded (1)

dkwright (316655) | more than 13 years ago | (#365181)

At least not in the article cited. Complaints that efforts are focused too much on the PC rather than on distributing technology across the household strike me as being motivated by a certain faddishness and "buzzwordiness" rather than by a substantive critique of some deficiency in computer science.

The article sounded to me like some people with a certain technological agenda are unhappy because everyone else in technology does not have that same agenda at the forefront of their minds.

That's hardly a substantive complaint.

not their job (1)

Prisoner 655321 (317527) | more than 13 years ago | (#365182)

It's not necessarily the job of the computer scientists to provide for the scientists and engineers. Most computer scientists go where the money is. Engineers and scientists often aren't motivated to pay the computer scientists a proper wage.

Plus, if I understand this correctly, the computer scientists would be throwing mud at themselves, am I correct? Just a thought.

But think of all the innovations! (1)

strictnein (318940) | more than 13 years ago | (#365183)

How can they even say that? Think about all the amazing innovations computer scientists have come up with! One click shopping! Great Porn sites! PORN SEARCH ENGINES! COMPUTER PORN! PORN!

Well maybe (1)

sagacious_gnostic (319793) | more than 13 years ago | (#365184)

... we should start calling computers a "toolbox" rather than a tool.

Re:I agree with this post (1)

KingAzzy (320268) | more than 13 years ago | (#365185)

If you don't think good programming = good engineering, then I'd advise you to please remove yourself from our industry, because it's hard enough getting good work done in the first place without having to deal with attitudes like that. Either that, or perhaps you'll be promoted to management sometime real soon.

I agree with this post (1)

KingAzzy (320268) | more than 13 years ago | (#365186)

There's an almost infinite supply of stupid ideas out there, and wasted money, and it drives us serious engineers and scientists completely crazy. Here we are giving ourselves brain damage in the name of computer science, battling the political forces of Bullshit MegaCorp so as to keep drawing that necessary evil, the paycheck, while across the hallway a group of complete morons have succeeded in drawing out tens of millions of dollars for a project whose quality is somewhere along the same level as a bowl of runny shit.

GRRRRRRRRRRRRRRRRRRRRRRRRR

Re:I agree with this post (1)

Kevin Mitnick (324809) | more than 13 years ago | (#365187)

And now we've got all the biologists pissed off too! I found this article [goatse.cx] on CNN about how some conservative geneticists totally despise the use of computers to decode the human genome. According to the story, the project is being funded by the Church of Scientology and the Raelian Revolution. Since they absolutely shun computers, they exclusively use an abacus and a couple of jumping jacks to help them in the decoding process. Experts estimate that their grassroots effort will culminate in 2048, by which time humans will be clothing-optional Xentors.

Re:LAMEST. ARTICLE. EVER. (2)

Anonymous Coward | more than 13 years ago | (#365195)

From Jonathan Walls

This is insightful? Does it not strike moderators as pathetic to see a knee-jerk reaction to criticism, laced with bad sarcasm, insults and poor logic, pandering to the tastes of the audience?

Especially in a community that likes to think of itself as intelligent and cutting-edge, you would have thought a bit more open-mindedness could be expected. Anyone with the ability to see another person's point of view would acknowledge that using the Start button to stop, or requiring hardware knowledge to install an OS, and so on, is indicative of a situation that needs improvement. And remember, this is criticism of an industry rather than of individuals, so there's no point cherry-picking to prove a point.

As for "computers are complex because your needs are complex", that sounds like a pissing competition, i.e. "My needs are so complex you could never design a simple [set of] interface[s] to solve them. Gee, you must be pretty simple if you think it's possible for you." Then you get "my complex needs are inconsistent with the needs of others", or in other words, "I am such an individual that no one could ever produce a simple solution that suits me."

Personally, I want things to be simple; I'm not strutting around claiming to be a complex individual with difficult-to-meet needs. For a start, such a person sounds like an arsehole. But more to the point, I have lots of simple needs. Take the example of doing my taxes: I don't want to do them, I want a simple process. After all, all the figures I provide are automated somewhere anyway; I don't want to expend any effort at all, I just want a simple solution. Such a solution would undoubtedly have a complex back-end, take a lot of work if it's possible at all currently, and take some talented people to do it right. If I simply saw a one-page printout at the end of the tax year with a breakdown of income and taxes, I would be very happy (and rather impressed). Simplicity of interface, sheer usability, takes a lot of talent, skill, and creativity.

If the only example of an intelligent device you can think of is a computerized thermometer, I wouldn't hold out much hope of ever getting a good job requiring any of these skills.

Finally some truth at Slashdot! (2)

Anonymous Coward | more than 13 years ago | (#365196)

As a software engineer I always strive to make things as complex as possible. :)

The main problem is the toolkits/frameworks that are used for developing software. Most Unix toolkits really suck! What's even worse is that the languages they are designed in, be it C or C++, make such a mess, because those languages weren't designed for graphical interfaces; they are portable assemblers.

If the world programmed in Smalltalk, life would be much easier. Imagine if everybody had truly reusable classes. Although maybe that would put some programmers out of work. Using a specific language doesn't mean that code reuse will be done well; a lot of it has to do with the programmer.

Maybe one of you has the idea that will push us past the archaic languages that we currently use.

This can easily be solved... (2)

Art Tatum (6890) | more than 13 years ago | (#365204)

...by a good old CS vs. Eng. paintball tourney. Let's get it on!

Bill Buxton (2)

snicker (7648) | more than 13 years ago | (#365205)

I think it's interesting that Prof. Buxton, one of the most innovative researchers into human interfaces, is one of the people cited in this article. He's responsible for some very interesting work... er, that I can't properly cite because I'm not sure where to cite it from.
But he's done very good work in making, say, Alias | Wavefront's software be very usable by artists. Technically minded artists, to be sure, but there is a level of intuitive access to the program that just isn't found in a lot of other packages.

*n

Re:Martketroids, etc. (2)

PD (9577) | more than 13 years ago | (#365207)

>Targets of the critics' scorn included convoluted
> commands such as the common "Alt-Control-Delete"
> sequence used to close a program or perform an
> emergency shutdown

So the engineers are getting all concerned about human factors? I guess I wasn't aware that they had traded in their pocket protectors and slide rules.

Re:Computer scientists will rule the world (2)

WasterDave (20047) | more than 13 years ago | (#365214)

> Programmers might not get the satisfaction of building something useful

Au contraire, Rodney. Exactly the reason I left engineering is that no one in their right mind was going to give me two million quid to make a fast ferry because some hung-over graduate thought it would have fantastic seakeeping. In computing, OTOH, if I think something could be good, I'll sit down and code it. Man, this is way creative.

Dave

DISCLAIMER: Sometimes you are going to have to make software to an engineering quality.

If you think about it, most electronic stuff sucks (2)

reaper20 (23396) | more than 13 years ago | (#365215)

Take a second and look at almost every piece of electronic equipment that you own... most of it sucks, design-wise, and not just the computer stuff...

Why do I have to turn 5 knobs and push 4 buttons to make my home theater receiver/TV switch from DSS to PS2?

Why do 90% of VCR functions only come on the remote, especially important stuff, like switching to the aux video source?

Why does every piece of software come with crappy default settings?

Why are we stuck with crappy interoperability between anything? DV vs. D8mm, FireWire vs. whatever, IDE vs. SCSI... you name it...

I have a PDA, cell phone, pager, email, etc., but for some reason, getting them all to work together in a synchronous, efficient manner is impossible, unless of course you get all your services from the same company, and who wants that?

I know I'm generalizing, but these 'engineers and scientists' are the same jerks who've been pushing shitty technology down our throats... if all engineers and scientists contributed to what makes sense and pushed good technology instead of market-speak, we'd have had Beta VCRs, be running Linux, and stupid shit like the DMCA/UCITA wouldn't exist...

my 2 cents ....

Re:Convolusion isn't necessary. Try dialogs. (2)

jfunk (33224) | more than 13 years ago | (#365220)

What if the system can't bring up a dialog?

Isn't that what ctrl-alt-delete is for?

Usually, in that case, the mouse will still work in Windows or X. In Linux I hit ctrl-alt-esc and my pointer turns into a Jolly Roger; I then click on the misbehaving window. If your mouse won't move, you can either hit that reset switch (I hope your FS does journalling) or, in Linux, hit alt-sysrq-s, alt-sysrq-u, then alt-sysrq-b. That is, in order: sync all filesystems, unmount all filesystems (actually remount read-only), and boot. (A programmatic sketch of the same sequence follows at the end of this comment.)

Either way, modal dialogs will not work in many cases and you'll have to go to lower levels to recover somewhat cleanly.

If there was an LCD and a couple of buttons on the front panel, however, I would fully support a confirmation.
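For reference, the same magic-SysRq actions can also be triggered from a program on Linux by writing the key letter to /proc/sysrq-trigger. A minimal sketch, assuming a kernel built with CONFIG_MAGIC_SYSRQ and root privileges; the file name and the s/u/b letters are the real kernel interface, the rest is illustrative:

```c
#include <stdio.h>

/* Write one magic-SysRq key letter to the kernel trigger file. */
static void sysrq(char key)
{
    FILE *f = fopen("/proc/sysrq-trigger", "w");
    if (f == NULL) {
        perror("/proc/sysrq-trigger");   /* not root, or no CONFIG_MAGIC_SYSRQ */
        return;
    }
    fputc(key, f);
    fclose(f);
}

int main(void)
{
    sysrq('s');   /* sync all filesystems */
    sysrq('u');   /* remount all filesystems read-only */
    sysrq('b');   /* reboot immediately */
    return 0;     /* never reached if 'b' succeeds */
}
```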

Needless complexity bagged the VC buck$? Er, no. (2)

smirkleton (69652) | more than 13 years ago | (#365226)

I'm neither a computer scientist nor an engineer, but I must take issue with at least part of the complaint. The criticism that "Wall Street rewards needless complexity and shuns those who build the most simple, human-centric devices" seems simply ill-informed. The bulk of VC money in the late nineties didn't get tied up in companies developing "needlessly complex" technologies. Consider:

1) theman.com received $20,000,000. Rather than suffering from needless complexity, it suffered from needless simplicity. (A website that advised Gen-X age men on good gift ideas for moms, or free pickup lines for cheap chicks?)

2) boo.com received $130,000,000. Their website suffered from needless complexity, but one could hardly say it was the fault of computer scientists (unless you consider Flash animators and guys who sorta know JavaScript to be computer scientists).

3) DEN received $60,000,000. They made 6-minute-long short films targeting demographics supposedly ignored by television programming (Latino gangland culture, teenage Christian Dungeons & Dragons players, drunken fraternity morons, etc.). Needless stupidity, to be sure. Anything but complex.

4) Eve.com wasted $29,000,000 of VC money to build an ecommerce site for cosmetics and other ephemera for females. (The pitch to the VC, Bill Gross of Idealab, took 90 minutes, and didn't involve any computer scientists)

5) iCast.com cast $90,000,000 at streaming video. They're dead, too.

The list goes on and on. That's over a quarter of a billion dollars above, thrown at companies founded not by computer scientists but by:

A poet and an ex-model, a couple of ex-Hollywood honchos, previously unemployed MBAs, and other non-computer-scientist types.

FWIWIA.

Problem between CS and other sciences (2)

smoondog (85133) | more than 13 years ago | (#365228)

As a CS person/scientist in biotech, I am well familiar with these issues. The problem that the scientists are really hitting on is a perceived lack of communication between the CS field and the rest of science. Don't get me wrong: plenty of CS people do communicate with their collaborators, and not all CS people need collaborators, but some do need them and don't have any. Doing CS in a vacuum when you are developing tools for others to use is really frustrating for those of us who need the tools but find they don't quite do what we need.

-Moondog

Convolusion isn't necessary. Try dialogs. (2)

Ukab the Great (87152) | more than 13 years ago | (#365229)

See, there's these things called modal dialogs that prevent the user from taking any further action until confirmation is received. If there is an action that could be incredibly destructive to the user's data (like shutting down), you pop one of these suckers up, and the user has to either confirm or deny that they made the decision before the dialog will close. When you employ such user interface design conventions, you can do things like put a power-up/power-down key on the keyboard. The user hits the power key on the keyboard to start the computer, and when they want to shut down, they hit the power button again and click on the shutdown dialog button to confirm. It's just that bloody simple.
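A minimal Win32 sketch of that confirmation; MessageBox is modal by default. The handler name on_power_key is hypothetical (wiring it to an actual power-key event is left out):

```c
#include <windows.h>

void on_power_key(void)
{
    /* MB_DEFBUTTON2 makes Cancel the default button, so a stray
       Enter keypress does not power the machine off. */
    int choice = MessageBox(NULL,
                            "Shut down the computer?",
                            "Confirm Shutdown",
                            MB_OKCANCEL | MB_ICONQUESTION | MB_DEFBUTTON2);
    if (choice == IDOK) {
        /* Initiate shutdown here, e.g. ExitWindowsEx(EWX_SHUTDOWN, 0)
           after enabling the SE_SHUTDOWN_NAME privilege. */
    }
}
```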

Computers are Aliens and Abstract Testing Devices (2)

hexx (108181) | more than 13 years ago | (#365232)

"If Rip Van Wrinkle went to sleep in 1982 and woke up today, he'd be able to drive our modern computers with no problem because they're essentially unchanged"

It's easy to criticize modern computers, as their user interface is not modern. Designing a legacy human interface was a calculated decision, however. People are accustomed to the windowed interface (the object, not the MS software), and when things change, people get scared. When people get scared, money stops flowing.

From a human interface standpoint, computers might as well be aliens from another planet. We taught them to speak to us with windows about 20 years ago (don't nitpick time with me :) and now that is the de facto standard. Computers that don't "speak that language" are considered toys in the public eye (see the PS2, Furbies, games on cell phones).

> The essence of the speakers' complaints was that computer engineers have spent the last five decades designing computers around the newest technology--not for the people who use the machines.

I don't think it is appropriate for them to suggest that computer interfaces have become obsolete because no one was paying attention, or because no one cared to advance the interface. On the contrary, there is a great deal of research on the subject; any computer science student has probably taken a human interface course, or parts of one (I did).

I think another big problem is that it's posh to be one of the "tech elite" in the business world. Someone who can handle their computer is generally considered more skillful, and seems to have more potential than one who can't. Logically this is because they are able to learn new things, and have no difficulties with abstraction. That is important in business, and in life.

Anyone agree?

as it's criticize-the-grammer-nazi day (2)

Fred Ferrigno (122319) | more than 13 years ago | (#365233)

> I am now a 16 year old kid.

...and it's '16-year', not '16 year'.

--

Hrm.... (2)

fluxrad (125130) | more than 13 years ago | (#365234)

> For example, he said, computer users must know that to turn off the computer they have to click on "Start"--not an intuitive step to end a computing session.

You know, when I want my computer to shut down, I just type "shutdown."

Maybe I want to reboot the computer... I type "reboot".

I don't think most scientists are wrong for flaming the computer industry, but there is innovation out there... they're just looking in the wrong places ;-)


FluX
After 16 years, MTV has finally completed its deevolution into the shiny things network

Funding only stupid techonologies? (2)

Goldberg's Pants (139800) | more than 13 years ago | (#365235)

So does that mean they're giving Microsoft money?

---

Martketroids, etc. (2)

Alien54 (180860) | more than 13 years ago | (#365237)

The links for the original and related stories are here [cnet.com]. The original story in the news report is here [cnet.com], and is much longer than the Yahoo article.

To a large degree, even though it is not named, the target is clear. For example, there is this bit:

> Targets of the critics' scorn included convoluted commands such as the common "Alt-Control-Delete" sequence used to close a program or perform an emergency shutdown. They also lambasted computer designers who refuse to distribute the machines' intelligence to smaller devices scattered throughout the home, instead insisting on packing a single box with maximum functionality.

Strangely, this sounds rather familiar. Certain large companies will not be named. They do not have to be. The marketroids have strangled the future.

Re:not contributing anything useful? (2)

grammar nazi (197303) | more than 13 years ago | (#365240)

Heh. Yeah, I was on track for a PhD. Unfortunately, I quit when I was in 7th grade and I am now a 16 year old kid. That's just kidding; what I said in my first post is true.

...and DON'T FORGET TO CAPITALIZE YOUR I, MISTER!!!! ...and it's '16-year', not '16 year'.

Re:not contributing anything useful? (2)

grammar nazi (197303) | more than 13 years ago | (#365241)

Since I am the grammar nazi:

jibe: To be in accord; agree: Your figures jibe with mine.

jive:
1. Music. Jazz or swing music. The jargon of jazz musicians and enthusiasts.
2. Slang. Deceptive, nonsensical, or glib talk: "the sexist, locker-room jive of men boasting and bonding" (Trip Gabriel).

I'll let you decide which version our friend timothy meant.

From our friends at dictionary.com [goatse.cx] .

Re:But think of all the innovations! (2)

Vuarnet (207505) | more than 13 years ago | (#365244)

> How can they even say that? Think about all the amazing innovations computer scientists have come up with! One click shopping! Great Porn sites! PORN SEARCH ENGINES! COMPUTER PORN! PORN!
Funny... I always thought porn was invented by lawyers, since they seem so focused on screwing everybody else and recording it for posterity.

But hey, at least the Internet was invented by a non-computer scientist, Al Gore!

Star Trek and Voice Recognition (2)

Vuarnet (207505) | more than 13 years ago | (#365245)

Well, voice-operated computers sound like a neat idea, until you stop to consider the consequences:

- I don't know about you, but I can usually type faster than I can talk.
- Imagine yourself speaking to your computer for 8 hours straight, 5 days a week. Heck, I doubt even professional speakers do that sort of thing.
- A room full of people typing and clicking away is slightly noisy. A room full of people talking to their computers would be quite stressful.

So, all in all, I'm OK with using a keyboard and mouse to work on the computer. Now, what I'd really like to see in reality is a functioning Holodeck. Playing VR-Quake would be sooo cool!

Re:LAMEST. ARTICLE. EVER. (2)

Technician (215283) | more than 13 years ago | (#365249)

I think they have a point in the MS area. Notice how all the hardware becomes "Win" hardware with less smarts of its own? Examples I can think of are AGP video cards, WinPrinters, WinModems, and Win sound cards. There is no real reason your computer should stop playing music, printing, downloading, or whatever because the OS is busy with something else. Put the smarts back into the devices so they can again buffer data and function on their own. A WinModem is a waste of CPU cycles, even if it can voice-answer the phone. My new computer can't even play the startup wave file properly; it stutters 3 or 4 times because the CPU is busy with disk I/O. The cheaper-the-better concept has hurt the quality of the design.
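A toy C sketch of the device-side buffering being asked for, with invented sizes and names: a ring buffer lets the device keep draining data it already holds while the OS is busy elsewhere, and the audible stutter happens exactly when the buffer underruns. The deeper the buffer, the longer the host can stall before the device runs dry.

```c
#include <stddef.h>

#define BUF_SIZE 4096   /* must be a power of two */

struct ring {
    unsigned char data[BUF_SIZE];
    size_t head;   /* next slot the host writes */
    size_t tail;   /* next byte the device consumes */
};

/* Host side: enqueue one byte; returns 0 if the buffer is full. */
static int ring_put(struct ring *r, unsigned char c)
{
    size_t next = (r->head + 1) & (BUF_SIZE - 1);
    if (next == r->tail) return 0;          /* full: host must wait */
    r->data[r->head] = c;
    r->head = next;
    return 1;
}

/* Device side: dequeue one byte; returns -1 on underrun -- this is
   where the "stutter" in the parent comment happens. */
static int ring_get(struct ring *r)
{
    if (r->tail == r->head) return -1;      /* empty: underrun */
    unsigned char c = r->data[r->tail];
    r->tail = (r->tail + 1) & (BUF_SIZE - 1);
    return c;
}
```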

We also have a one-up in the design process. (2)

Gendou (234091) | more than 13 years ago | (#365250)

I think most will remember a certain quote out of Redmond, to the effect that no matter how fast Company X makes processor A, Software B will be able to slow it down.

Now, I'm not accusing anyone. I'm not saying all software developers are out to screw over the hardware people, but look...

Those who write the software are the last stage. Regardless of how well the engineers designed the hardware, the CS people can either make or break those designs with good or bad code, respectively. CS people essentially have engineers at their mercy.

So yes, I certainly agree they're jealous... but in more than one way. They're jealous because CS people, in a way, have more power over the flow of technology.

Re:Convolusion isn't necessary. Try dialogs. (2)

Neumann (240442) | more than 13 years ago | (#365251)

I guess you have great users who always read the dialog boxes, eh? Bet they never open ILOVEYOU.jpg.vbs either.

Taguchi method???? (2)

calidoscope (312571) | more than 13 years ago | (#365252)

Bob Pease of National Semiconductor has written several articles poking holes in the Taguchi Method. One example was a voltage regulator designed by the Taguchi Method: it was very insensitive to part tolerances; however, it didn't regulate worth a damn.

Oh please... (3)

dimator (71399) | more than 13 years ago | (#365256)

There are a lot of technologies out there that suck. Computers have many problems. But "have contributed nothing useful"? How many of these scientists and engineers would be where they are without computers? Indeed, how many of them would have been able to schedule, arrive at, and execute their trip to this meeting?

I don't know why they would say such a stupid thing... I'll assume we all took what they said out of context/too seriously.


Re:CTRL-ALT-DEL (3)

mduell (72367) | more than 13 years ago | (#365257)

And it makes it even more confusing for grandmothers trying to log on to an NT box.
"Control-Alt-Delete? But won't that stop it?"

Mark Duell

Good design... (3)

adubey (82183) | more than 13 years ago | (#365258)

Of course, when people say that "design" will save the world, they usually mean their idea of design, which might not jibe with yours or mine.

No timothy, when they say "design", I believe they are referring to things like usability testing. In other words, taking a software package to groups of users and designing statistically sound experiments to see what users find easy and fast to use. That is, users' ideas of good design - not yours, not mine.

If you're interested, maybe read some [asktog.com] sites [useit.com] on design. [acm.org]

Moreover, I think they are also saying that VCs should at least be aware of what theoreticians are thinking about, so that they make better use of their investors' dollars.
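"Statistically sound experiments" is less exotic than it sounds. Here is a toy sketch in C (the two-layout scenario and every timing number are invented for illustration): time users completing the same task on two versions of a dialog, then check whether the difference is larger than chance with something like Welch's t statistic.

    #include <stdio.h>
    #include <math.h>

    /* Welch's t statistic for two independent samples of task times. */
    double welch_t(const double *a, size_t na, const double *b, size_t nb)
    {
        double ma = 0, mb = 0, va = 0, vb = 0;
        for (size_t i = 0; i < na; i++) ma += a[i];
        for (size_t i = 0; i < nb; i++) mb += b[i];
        ma /= na; mb /= nb;
        for (size_t i = 0; i < na; i++) va += (a[i] - ma) * (a[i] - ma);
        for (size_t i = 0; i < nb; i++) vb += (b[i] - mb) * (b[i] - mb);
        va /= (na - 1); vb /= (nb - 1);  /* sample variances */
        return (ma - mb) / sqrt(va / na + vb / nb);
    }

    int main(void)
    {
        /* Hypothetical seconds-to-complete-task, two dialog layouts. */
        double old_ui[] = { 41, 55, 38, 62, 47, 51, 44, 58 };
        double new_ui[] = { 33, 29, 40, 31, 36, 28, 35, 30 };
        printf("t = %.2f\n", welch_t(old_ui, 8, new_ui, 8));
        return 0;
    }

A |t| much larger than about 2 (at these sample sizes) is evidence the layouts really differ. Real usability labs layer task analysis and think-aloud studies on top, but the arithmetic underneath looks like this.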

Technology as an ends and not as a means (3)

Ukab the Great (87152) | more than 13 years ago | (#365259)

The moment someone designs technology as an end and not as a means, that technology is issued a death sentence. It might be commuted for 15 or 20 years, but it will eventually happen. The PC isn't dying; it's been slowly murdered for the last two decades by many companies (one in Redmond, Washington comes to mind) who have made the PC so ridiculously difficult to use and maintain that people are being driven to use network appliances. For many years, makers of software and hardware have lost touch with the needs of their consumers. The latest buzzword-compliant technology gets higher priority than what could actually help someone use their computer more efficiently and effectively.

The perfect example (from soooo many to choose from) would be the 3.5" magneto-optical disk. It was rewritable, damn reliable, as small as a floppy and, had it been produced in massive quantities, massively cheap. But that didn't meet with the agendas of the technology industry. They backed Zip drives and SuperDisks that were far less reliable and held far less data. When it became absolutely critical to hold data sizes larger than 100+ MB, they came up with another kludge: CD-RW. Technically ungraceful (it has to rewrite the entire disk every time it is written to), it has a file system that requires special software (for Windows and, I think, Mac) to read, and it still has trouble fitting in your pocket. Yet another missed opportunity for the tech industry.

One more example (this time in the present): FireWire. Apple, one of the few companies to move computer technology ahead (despite all of its numerous business/PR flaws), has started putting internal FireWire buses in their computers. Why didn't any other computer/motherboard companies think of this? Don't they understand that FireWire cables are far less of a hassle than ribbon cables, and block airflow far less? Don't they recognize the ease of use of being able to chain FireWire drives together? Don't they understand that external FireWire is probably the easiest way for non-geeks to add new hardware (without the need to buy hubs)? But where is Intel? Where is Western Digital? Where is Seagate, or Asus, or Abit, Tyan, or any of the others? Nowhere, that's where. In fact, they barely put any stock in USB. Rumor has it that when Apple announced that it was killing serial and replacing it with USB, an Intel executive called Steve Jobs to thank him for taking the bold move: "Getting all the others [OEMs] to go to USB was like herding cats."

To capitalize on the obvious pun, technology sucks because too many people are pussies.

Re:Computer scientists will rule the world (3)

Mox-Dragon (87528) | more than 13 years ago | (#365260)

Programmers might not get the satisfaction of building something useful and might not experience the artistic delight of design, but we at least don't have to work as hard. And when it comes to the bottom line, that's all that counts.

What are you talking about? Programming (for me, anyway) is ALL about the satisfaction of building something useful and the artistic delight of design - in programming, you build something from quite literally nothing - you create order from chaos. Programming is speech, but it's much more than that - to be a good programmer, you have to think in abstract ways and be able to think truly dynamically - static thinkers have no place in the art of programming. Anyone who says they are programming for *just* money is NOT an artist. Good code is truly poetry, and good programmers are truly artists.

Hypocritical.. (3)

proxima (165692) | more than 13 years ago | (#365261)

Industrial designers poked fun at virtually all facets of computers and other electronic gadgets, and the Apple iMac--displayed in PowerPoint presentations in its groovy new shades

Funny... computers appear to be useful enough to give PowerPoint presentations, quickly and easily presenting information to a large group. I find it a bit hypocritical that they'd bash computer design and ease of use, and then use PowerPoint instead of some other presentation medium.

that's why I'm changing my major (3)

Megahurts (215296) | more than 13 years ago | (#365263)

I'm a college student who recently decided against continuing a major in computer science, primarily because the code bases I've worked with have been so horribly designed that they're beyond repair. The way I see it, we've (Americans, that is; I know much of the world is quite different) become quite fixated on the miracle of computers. But very few people ever actually learn how they work or how they can be properly and efficiently integrated into our lives. So we get bad designs from hardware and software vendors, who realize that there are a large number of people unwilling to make the investment in knowledge necessary to choose the good from the bad, and who will buy anything they see on a billboard, on the television, and (decreasingly) in magazines, for entirely superficial reasons. Had they known better, they could have avoided the junk or at least returned it for a refund, deselecting the implementers of inferior technology from the economic gene pool.

In explaining such issues to friends not familiar with the industry, I'll often draw parallels to similar situations. With this one, I'd say the computer craze is now at the point the car craze was at in the late 1960s. Hobbyists are still common, but on their way out. More and more people want the inner workings of the technology set aside in favor of its practical purposes. Perhaps this economic turn is analogous to the oil crisis (and quite similar: I've heard that at least some of it is due to the California legislature and the power companies scratching each other's backs to create the energy crisis out here. Personally, it wouldn't surprise me, since I feel absolutely no trust toward the motives of either group).


Re:LAMEST. ARTICLE. EVER. (3)

madcow_ucsb (222054) | more than 13 years ago | (#365264)

Second, they recommend creating "simpler" and "distributed" devices instead of monolithic boxes that do everything. What the hell does this mean? What devices really need more intelligence? All I can think of is one of those computerized thermostats. Whoopee.


Seriously... I just have visions of what would happen if my appliances started communicating with each other...

Fridge: Ok everyone, we know Alex has a final tomorrow at 8am.

All Kitchen Appliances: *evil laughter*

Fridge: Everybody turn on in 3...2...1...NOW!

*All appliances in the house turn on at once*

*Circuit breaker trips*

(At 11am the next morning)
Alex: NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO!


You know it'll happen. One day. When you least expect it. They'll turn on you.

Looks like (4)

PD (9577) | more than 13 years ago | (#365265)

the engineers have bought into the myth of the dying PC. Horsepucky. The PC is here with us forever, and as time goes on more and more things will be integrated into it.

Distributed systems are a nice thing in principle, but some problems can be broken up only so far. And when it comes down to it, nobody wants to buy a specialized piece of a computer when they can get their generalized computer to do the work.

Look at the history of tools. First there were separate tools, each one doing a single job or a small number of jobs. Even a Swiss Army Knife was limited to about as many tasks as it had specialized attachments.

People like to pooh-pooh the computer as being "just another tool". But the computer is far, far different from any other tool that came before. The computer has the ability to be an INFINITE (or at least huge enough that you won't exhaust the possibilities in the lifetime of the universe) number of tools.

The engineers are being engineers. Who can blame them? They like single-purpose tools. Heck, we like single-purpose tools too, and that's why we generally embrace the UNIX philosophy of making a program do one thing, and do it well.

But the difference is that our specialization is in the software, and the specialization they are proposing is in the hardware. If I want a single-purpose tool, I don't need a computer to get that.

ANGRY DENIAL! (4)

TheDullBlade (28998) | more than 13 years ago | (#365266)

Angry denial reiterated.

Supporting claim. Second supporting claim.

Revelation of inconsistencies in the complaints.

Setup for attempt at witty attack on academics.

Punchline of witty attack.


Secure Path Login/LogOut (4)

ka9dgx (72702) | more than 13 years ago | (#365267)

The secure path in NT is Control-Alt-Delete. There is a very sane reason for this: it's not allowed to be intercepted by ANY application running under NT. Thus, you can ALWAYS know that the OS is in control when you press Control-Alt-Delete. This is one of the GOOD features of the operating system, and it helps prevent a trojan horse from taking your password.

It's too bad Microsoft couldn't build applications the same way, safe from Trojan Horses.
--Mike--
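For anyone who wants to see what "not allowed to be intercepted" means in practice, here is a hedged sketch (Win32 C; error handling trimmed, and the claim in the comments reflects my understanding of the NT design rather than anything demonstrated in this thread): a low-level keyboard hook can observe, and even swallow, nearly any keystroke system-wide, yet it cannot keep Ctrl-Alt-Delete from reaching winlogon.

    #include <windows.h>
    #include <stdio.h>

    /* Low-level keyboard hook: sees nearly every keystroke before any
       application does. */
    static LRESULT CALLBACK hook_proc(int code, WPARAM wparam, LPARAM lparam)
    {
        if (code == HC_ACTION) {
            const KBDLLHOOKSTRUCT *k = (const KBDLLHOOKSTRUCT *)lparam;
            printf("vk = 0x%02lx\n", (unsigned long)k->vkCode);
            /* Even "return 1;" here, which swallows the key for every
               application, cannot suppress the secure attention
               sequence: winlogon still gets Ctrl-Alt-Delete (my
               understanding of the NT design, per the comment above). */
        }
        return CallNextHookEx(NULL, code, wparam, lparam);
    }

    int main(void)
    {
        MSG msg;
        HHOOK h = SetWindowsHookEx(WH_KEYBOARD_LL, hook_proc,
                                   GetModuleHandle(NULL), 0);
        if (h == NULL)
            return 1;
        while (GetMessage(&msg, NULL, 0, 0) > 0) {  /* LL hooks need a message loop */
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        UnhookWindowsHookEx(h);
        return 0;
    }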

Computer scientists will rule the world (5)

alewando (854) | more than 13 years ago | (#365268)

When engineers sneer at computer science, I just chuckle to myself. Because I know something they don't know: they're just jealous.

Engineers are jealous of programmers. It's that simple. Programmers have an easy life, after all. I only work a few hours a day, get paid big bucks, and for what? For telling a machine what to do. For the act of mere speech. It's Rumpelstiltskin's tale incarnate: I spin words into gold.

Engineers have too many constraints; the standards are too high. When the Tacoma Narrows bridge fell down, heads rolled. But when a bug in the latest version of NT disables an aircraft carrier, Microsoft doesn't get blamed at all. Bugs are par for the course in our industry, and we have no intention of changing it. It means higher profits from fixes and lower expectations. How are engineers supposed to compete with those sorts of odds?

I admit I considered going into engineering when I started my college days, but I was quickly dissuaded. The courses were too involved, whereas the CS courses were a breeze for anyone who didn't fail calculus. And I don't regret it at all, really.

Programmers might not get the satisfaction of building something useful and might not experience the artistic delight of design, but we at least don't have to work as hard. And when it comes to the bottom line, that's all that counts.

BAH! (5)

hugg (22953) | more than 13 years ago | (#365270)

If it wasn't for us software guys, you scientific types would still be writing programs in Fortran.

Oh that's right, you ARE still writing in Fortran. My bad.

"Too easy" shutdown procedures (5)

cje (33931) | more than 13 years ago | (#365271)

Anybody remember the original Apple II?

The RESET key, located at the top-left corner of the keyboard, triggered a software reset. This had the effect of (depending on the software you were using) terminating the program and dumping you back to a BASIC prompt, erasing whatever unsaved data you had, or doing a hard reboot of the machine. Users quickly found out (the hard way) that this button was way too easy to press by accident. In fact, the problem was so pervasive that magazines such as Creative Computing began running ads for "RESET key protectors"... typically these were pieces of firm foam rubber that you would place underneath the RESET key (you had to pry up the keycap), resulting in a key that was still "pressable", albeit with a bit more effort.

In later versions of the Apple II/II+ (and in subsequent machines such as the IIe, //c, and IIgs), Apple listened to their users' complaints, learned from their mistake, and required a Ctrl-RESET combination in order to actually trigger the reset. That hard-learned lesson carried over to other hardware and software manufacturers, including the choice of Ctrl-Alt-Delete.

CTRL-ALT-DEL (5)

suss (158993) | more than 13 years ago | (#365272)

Targets of the critics' scorn included convoluted commands such as the common "ALT-CONTROL-DELETE" sequence used to close a program or perform an emergency shutdown.

Put it under F1, see if that makes them happy. You know, there's a reason it's such a 'convoluted' command: it keeps people from accidentally executing it!

Re:LAMEST. ARTICLE. EVER. (5)

IvyMike (178408) | more than 13 years ago | (#365273)

I thought my sarcasm was pretty good, thank you very much.

Perhaps my bile was uncalled for, but I'm sick of people implying "good design is easy, why doesn't someone just do it?"

Good design and usability are difficult. Do you think that the industry doesn't know that billions of dollars and instant fame and fortune are at stake here? Do you think that the industry doesn't try really, really hard to get that money?

There's a right way to criticize usability--one author who does it right is Donald Norman (I'm sure there are others, but on this topic I've only read Mr. Norman's books). He manages to carefully consider what is wrong with a design, discusses the alternatives, and points out how usability could be improved.

There's also a wrong way. Say something like "Why can't my computer be as easy to use as a toilet?" God, I'm getting pissed off again. What's the feature list of that toilet? And what's the feature list of your computer; can you even hope to cover that list in any amount of detail? In fact, does your computer actually even have a standing feature list, or do you actively install new software (and thus new features) constantly? Dammit, everybody who uses a computer has complex needs--I have a web browser, an email client, a remote telnet session, an mp3 player, and a "find file" all open RIGHT now, and I suspect that I'm pretty tame compared to the average slashdot reader. I'm going to play an online game with friends in another state shortly. I could spend hours describing what I want EACH ONE of these to do. I happen to think that, all facts considered, the usability is pretty good. (And I might add: Damn, it's cool to live in the year 2001. This stuff rocks.)

Are things perfect? Of course not. One company has a monopoly on the desktop market and has very little incentive to innovate (in spite of their claims to the contrary) and every incentive to keep the status quo. Yes, the "START" button is retarded. Should we strive to improve the state of the art? Of course. Would it be awesome if it were easier to do my taxes? Sure, but are you absolutely sure you want the automated solution you described when it sacrifices transparency (are you sure your taxes were done correctly in that system?) and possibly privacy (who's keeping track of all that information flowing between your income-payers and the government?)? I actually think that TurboTax made my taxes about as easy as I'd like--it asked me a simple set of questions, I answered, and it was done. Any easier, and I'm not sure I'd completely trust it.

I actually don't know why you're arguing, since in at least one respect, you agreed with me. You said:

Simplicity of interface, sheer useability, takes a lot of talent, skill and creativity.

If you think about it, the article in question basically said these are all trivial and require little skill or talent, and it said so with a condescending attitude. It's actually really, really hard. Dismissing the problem is unwarranted and deserves an equally scathing reply.

LAMEST. ARTICLE. EVER. (5)

IvyMike (178408) | more than 13 years ago | (#365274)

Dammit, I hate these fuckers.

First of all, they contradict themselves. "Computers are too hard," they whine, but when a computer interface remains consistent and usable for twenty years, they complain: "If Rip Van Winkle went to sleep in 1982 and woke up today, he'd be able to drive our modern computers with no problem because they're essentially unchanged".

Second, they recommend creating "simpler" and "distributed" devices instead of monolithic boxes that do everything. What the hell does this mean? What devices really need more intelligence? All I can think of is one of those computerized thermostats. Whoopee.

Look. Computers are complex because your needs are complex. Worse yet, my complex needs are inconsistent with the needs of others. Try to download mp3s on your toaster. Try to do your taxes while downloading porn while instant messaging your friend in France while checking the weather on one of their great appliances. Try to use that "more intelligent than a computer" airport toilet to write up your PowerPoint slides, you pompous pricks.

Actually, in this case, that might have worked.

not contributing anything useful? (5)

grammar nazi (197303) | more than 13 years ago | (#365275)

Not contributing anything useful?

I just love it when scientists fling mud and proclaim that the 'real world' isn't science.

In mathematics, we have the very 'real' Taguchi quality control that revolutionized manufacturing processes, but according to my math professors, "It's not real mathematics, just some linear algebra application."

On the topic of manufacturing: metal can now be formed and machined into virtually any shape, and ceramics and metals can be mixed and then fired to form aluminum tools (molds) for injection-molding parts. "That's just a trick of sintering the ceramic," my ceramics professor told me.

My point is that industry types, whether they are applying neural networks to read handwriting or creating thinner flat-panel displays, solve the same complicated types of problems that the more 'scientific' community solves. The scientific community discredits their work with "Theoretically it can be done, so why bother doing it?" It's as though the companies that want to enhance their products by funding research shouldn't fund the research that is most likely to enhance their products!

I'm sorry to sound harsh, but this strikes close to home for me. I was on track for a PhD, but quit, and now I'm having a lot more fun developing optimized neural networks to do handwriting recognition.
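For readers wondering what the Taguchi quality control mentioned above actually says, its heart fits in one line. This is the standard textbook quadratic loss function, not anything specific to the poster's examples:

    L(y) = k\,(y - m)^2

where $y$ is the measured characteristic, $m$ is its target value, and $k$ converts deviation into cost. "Robust design" in the Taguchi sense then means choosing process parameters that minimize the expected loss, $E[L] = k\,\bigl(\sigma^2 + (\mu - m)^2\bigr)$, penalizing variance and off-target bias together instead of just asking whether parts land inside a tolerance band. (Which is also why Bob Pease's regulator anecdote elsewhere in this thread stings: minimizing sensitivity is not the same thing as meeting the spec.)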

Re:Funding only stupid techonologies? (5)

atrowe (209484) | more than 13 years ago | (#365276)

I love my computer enough as it is. If I had a computer that sucked, I'd never leave the house!