
Why Software is Hard

Zonk posted more than 7 years ago | from the comedy-and-software-are-in-the-same-club dept.


GoCanes writes "Salon's Scott Rosenberg explains why even small-scale programming projects can take years to complete, one programmer is often better than two, and the meaning of 'Rosenberg's Law.' After almost 50 years, the state of the art is still pretty darn bad. His point is that as long as you're trying to do something that has already been done, then you have an adequate frame of reference to estimate how long it will take/cost. But if software is at all interesting, it's because no one else has done it before."


409 comments


Hah HAH (-1, Troll)

canUbeleiveIT (787307) | more than 7 years ago | (#17876486)

Tell us, why is it HARD?

My theory (2, Insightful)

The_Abortionist (930834) | more than 7 years ago | (#17876776)

SO many points of failure.

At the lowest level, there are bugs made by the programmer. There can also be misunderstandings, especially if there are cultural differences in the organisation.

When multiple programmers are involved there are even more points of failure relating to the interface. Does the module behave as expected? Is the documentation accurate?

At the highest level, is the design accurate enough? Do we even know exactly what we have to do? Do we have the resources or is there wishful thinking involved?

And then, because it can take such a long time to write software, specs change during development. There is staff turnover, changing customer needs, changes in direction, etc.

The points of failure are incalculable. The thing to do is adhere to best practices and then take the estimated amount of time to develop and multiply by 2 (in some places, experience will show 3).
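The pad-the-estimate rule of thumb above can be sketched as a toy helper (a hypothetical function, not anyone's real tooling; the multiplier is whatever local experience suggests, 2 or 3):

```python
def padded_estimate(raw_days: float, multiplier: float = 2.0) -> float:
    """Apply the rule of thumb above: take the raw development
    estimate and multiply it by an experience-derived factor."""
    return raw_days * multiplier

# A 10-day raw estimate becomes 20 days at x2,
# or 30 days where experience shows the higher factor.
print(padded_estimate(10.0))       # 20.0
print(padded_estimate(10.0, 3.0))  # 30.0
```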

And most importantly, don't lose any sleep because of this. You don't want to die at 60 of a heart attack because of software development.

Re:Hah HAH (-1, Troll)

Anonymous Coward | more than 7 years ago | (#17876974)

Software isn't hard, my cock is hard, software is difficult!

Ahhh, 50 years ago? (0)

Anonymous Coward | more than 7 years ago | (#17876494)

So, 50 years ago programmers were trying to solve the exact same problems we're trying to solve today? What Windows software was around in 1957?

Not to take potshots, but (5, Funny)

Mateo_LeFou (859634) | more than 7 years ago | (#17876502)

...can anyone explain Vista's schedule in light of this discovery?

Re:Not to take potshots, but (5, Funny)

edwardpickman (965122) | more than 7 years ago | (#17876544)

...can anyone explain Vista's schedule in light of this discovery?

Another law explains it, Entropy.

Re:Not to take potshots, but (0)

Anonymous Coward | more than 7 years ago | (#17876604)

Or Duke Nukem's lack of one?

Re:Not to take potshots, but (5, Funny)

creimer (824291) | more than 7 years ago | (#17876838)

All the programmers got better jobs at Google?

Re:Not to take potshots, but (1, Funny)

dreamlax (981973) | more than 7 years ago | (#17877230)

I heard that all of the programmers suffered head injuries from having chairs thrown at them.

Ah! The great unknown... (3, Insightful)

alshithead (981606) | more than 7 years ago | (#17876530)

"But if software is at all interesting, it's because no one else has done it before."

"Interesting" to me means something new and/or unknown...mostly. There are exceptions. Treading new ground always requires greater effort. If I cut my way through virgin jungle then those who follow have a path.

Re:Ah! The great unknown... (1, Funny)

canUbeleiveIT (787307) | more than 7 years ago | (#17876570)

I'm not sure I like the way this is going. Software is HARD. The jungle is VIRGIN. Uh-oh...

Re:Ah! The great unknown... (0, Troll)

kfg (145172) | more than 7 years ago | (#17876642)

Treading new ground always requires greater effort. If I cut my way through virgin jungle then those who follow have a path.

The problem is even thinking in terms of "effort." Ideas are not the product of labor. Time with a chain saw is proportional to length of path cleared.

Ideas may come in a flash, or evade forever.

KFG

Re:Ah! The great unknown... (5, Insightful)

alshithead (981606) | more than 7 years ago | (#17876754)

"Ideas are not the product of labor."

Respectfully, I have to disagree. Some of my best ideas have come from pondering over a problem. Pondering can be effort. It's not like daydreaming. To think about a problem and apply logic to try and come up with a resolution requires effort in many, if not most, cases.

Re:Ah! The great unknown... (0, Troll)

kfg (145172) | more than 7 years ago | (#17876780)

Some of my best ideas have come from pondering over a problem.

What I did not say is that thinking is easy.

KFG

Re:Ah! The great unknown... (2, Insightful)

alshithead (981606) | more than 7 years ago | (#17876930)

"What I did not say is that thinking is easy."

No you didn't. You said, "Ideas are not the product of labor."

Definition of labor according to Merriam-Webster, just the first/primary definition:

Main Entry: 1labor
Pronunciation: 'lA-b&r
Function: noun
Etymology: Middle English, from Anglo-French labur, from Latin labor; perhaps akin to Latin labare to totter, labi to slip -- more at SLEEP
1 a : expenditure of physical or mental effort especially when difficult or compulsory

Please note: "physical or mental effort". I'm not trying to nitpick. Just disagreeing on our definition of the word "labor".

Re:Ah! The great unknown... (0, Troll)

kfg (145172) | more than 7 years ago | (#17876962)

Just disagreeing on our definition of the word "labor".

That's what I was doing; and within the context of a specific provided example.

Think about it. It might take some effort.

KFG

Re:Ah! The great unknown... (3, Insightful)

alshithead (981606) | more than 7 years ago | (#17877150)

"Just disagreeing on our definition of the word "labor".

That's what I was doing; and within the context of a specific provided example.

Think about it. It might take some effort."

Okay troll, right. I've put some effort into it and I'm still clueless. Are you talking about "Ideas may come in a flash, or evade forever."? If so, I consider that a partial truism. Ideas also come about from a slow, plodding, methodical effort. Your generalization is half-assed. If you've got a point to make, please do so. You haven't stated how you disagree with my (and the general use) definition of "labor" and you certainly haven't clearly provided your interpretation of the context involved in the "specific provided example".

Re:Ah! The great unknown... (0, Troll)

kfg (145172) | more than 7 years ago | (#17877174)

Are you talking about "Ideas may come in a flash, or evade forever."?

Which is the clue that I'm not talking about thinking.

If you've got a point to make, please do so.

Think.

KFG

Re:Ah! The great unknown... (1)

alshithead (981606) | more than 7 years ago | (#17877326)

Are you talking about "Ideas may come in a flash, or evade forever."?

"Which is the clue that I'm not talking about thinking."

If you've got a point to make, please do so.

"Think."

Yup, I'm still clueless. Are you talking about a "flash of inspiration"? If so, doesn't some prior thought have to have gone into the problem? No one has a flash of inspiration without having put the thought into identifying a problem or goal. If so, you still haven't stated how that is not labor. I've already put way too much LABOR into trying to decipher your ramblings. I'll reiterate, if you've got a point to make, do so...without obfuscation.

Re:Ah! The great unknown... (0, Troll)

kfg (145172) | more than 7 years ago | (#17877344)

I'll reiterate, if you've got a point to make, do so...without obfuscation.

No.

KFG

Re:Ah! The great unknown... (0)

Anonymous Coward | more than 7 years ago | (#17877000)

Some of my best ideas have been a product of the closest men get to feeling the pains of labor....."dropping a fat kid off at the pool" as it were.

Re:Ah! The great unknown... (4, Insightful)

ComputerSlicer23 (516509) | more than 7 years ago | (#17877200)

I believe his point isn't that you're not doing work, but rather that scheduling pondering is impossible. Otherwise, give me a fairly firm estimate of when you will either prove P = NP or prove P ≠ NP. Logical deduction isn't precisely the same as "resolving the unknown". One doesn't provide a timetable for when the Twin Prime conjecture will be solved. I can apply logical deduction to lots of problems, but I can't necessarily provide a firm estimate of when I'll find the solution to a problem.

Any time you provide an estimate of the time it will take to do anything in "problem solving", you are using statistical conjecture about how long you think it should take given that you've solved other similar issues. How long will it take me to solve a logic puzzle? How long will it take to construct a proof of something? You think logically on those, but you don't provide a schedule. If you tell me you're going to give me 30 different distance-rate-time story problems geared for high school freshmen, I can tell you that I'll be done in about an hour. If you tell me that you'd like me to prove Fermat's Last Theorem without using reference material, I know it's true, and I know that I can't provide a schedule for it; even if I took the rest of my life, it's highly unlikely I could do it. Both require deduction and logical thought. One is on an entirely different scale than the other.

When working in the unknown, you can't provide a schedule. Otherwise, you'd be working either in the known, or very close to the edge of known.

Kirby

Re:Ah! The great unknown... (4, Informative)

SQLGuru (980662) | more than 7 years ago | (#17876844)

I take game programming classes. One of the instructors made some very good points related to innovation. His context was games, but since my background is business application programming, I can easily see how it applies here.

When you innovate in a game, only make one....maybe two innovations. Otherwise, you skew so far away that you usually end up a complete failure. Applying it here: sure, keep things interesting by doing something new, but keep it manageable by keeping the rest of it "boring". You gain predictability while retaining "fun".

Layne

Re:Ah! The great unknown... (1)

smaddox (928261) | more than 7 years ago | (#17877154)

So that's why all the games these days are just rehashes of old games with one or two new features.

Re:Ah! The great unknown... (1)

DDLKermit007 (911046) | more than 7 years ago | (#17877232)

Hah, game programming classes. You won't find one person that's important in the industry who's taken those (grunt work is OK for a lot of them now, I guess). As for your instructor's words, that's the EA method right there. You want to be part of that? Games that fail, fail for a number of reasons that aren't because they did some things that were innovative. Poor implementation, yes, but that's not because they did too many new & interesting things. If you poorly implement an idea you're going to fail (or at least have a VERY hard time) no matter what.

Re:Ah! The great unknown... (4, Interesting)

SQLGuru (980662) | more than 7 years ago | (#17877334)

Actually, every instructor I've had works in the industry. Not *DID WORK*....but *WORKS*. Classes are at night. It's in Austin, so there are plenty of studios to pull from. I've had instructors that have worked on games from all eras and genres. Some of the companies represented: Sony and SOE, Midway, NCSoft, and Microsoft. Plenty who have started their own studios after having worked at bigger ones, too.

http://www.austincc.edu/techcert/Video_Games.html [austincc.edu]

It's not a degree program (yet), but I'm not too worried about that since I already have a CS degree. For me, it's more about having fun, learning some new stuff, and making good contacts for when I'm ready to jump into the industry.

Check out the list of names on the Advisory Board and the list of Instructors. There are some influential names on that list.

Layne

Re:Ah! The great unknown... (3, Insightful)

Evil Pete (73279) | more than 7 years ago | (#17877390)

I read somewhere that in science fiction writing this is called "The Tooth Fairy Principle": don't introduce more than one exotic technology or idea. I immediately realised that it applies even more strongly to software development. New areas represent areas of high risk; adding even a few to a project can change the risk from moderate to very high. I've participated in a few projects that broke this principle ... as usual, commenting on the risk this implied only made me sound like a Cassandra until the prediction eventually bore fruit.

However, the major reasons I see for software projects becoming late are: clients repeatedly wanting to change design after the design phase (in one surreal case we had a client change a fundamental design issue 24 hours before going live!), poor resource allocation (a very large subject), management saying yes to unrealistic deadlines, bleeding edge technology (Tooth Fairy Principle - high buzzword compliance).

Re:Ah! The great unknown... (2, Insightful)

iminplaya (723125) | more than 7 years ago | (#17877178)

If I cut my way through virgin jungle then those who follow have a path.

And copyright puts in the toll booth.

Programmers (5, Insightful)

bendodge (998616) | more than 7 years ago | (#17876532)

One programmer is better than two for the same reason that one woman in the kitchen is better than 2. You have to get on a pretty large scale before you need multiple cooks/programmers.

Software programming in general is hard for 2 reasons:
1. Computers aren't built for interfacing with humans, thus UI is terribly time-consuming.
2. The environments people like to drop an app into can be so bizarre that rock-solid stability is very difficult to achieve.

Re:Programmers (4, Funny)

bennomatic (691188) | more than 7 years ago | (#17876592)

Where do you live? The 50s? You may want to ask some women you know about using that particular illustrative image.

Re:Programmers (5, Funny)

cyborg_zx (893396) | more than 7 years ago | (#17876724)

Indeed.

It is well known that men are superior in the kitchen.

Re:Programmers (1, Funny)

Anonymous Coward | more than 7 years ago | (#17876950)

You may want to ask some women you know
This is slashdot, you insensitive clod! Know women? Are you from MySpace or something?

Re:Programmers (0)

Anonymous Coward | more than 7 years ago | (#17876970)

"for the same reason that one woman in the kitchen is better than 2."

... and yet two women in the bedroom is better than one. It all depends on the context of what you are trying to accomplish. The same thing applies to software projects.

Re:Programmers (1)

Oligonicella (659917) | more than 7 years ago | (#17877086)

By #1 I presume you mean because we don't have telepathic reader devices yet? Voice recognition, typing, mouse, pad, text recognition, visual plotting; what's missing?

Re:Programmers (3, Interesting)

fireboy1919 (257783) | more than 7 years ago | (#17877206)

Most cooking projects don't take more than 10 man-hours, but pretty much every programming project does. And, furthermore, when the chef makes a mistake it's usually obvious to her.

Neither condition holds for programming. It's for this reason that I think that, in general, *two* programmers can program faster than one. At least, my partner and I produce code that's more bug-free together than when we program separate projects, and that makes a difference. If the project is sufficiently large - i.e. takes longer than about 10 hours - the cost of communication between two people is less than the cost of switching. :)
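The communication-cost trade-off above is usually put in terms of pairwise channels, as in Brooks's *The Mythical Man-Month* (not something the parent cites): channels grow as n(n-1)/2, so a pair pays for only one channel while larger teams pay quadratically. A minimal, illustrative sketch:

```python
def comm_channels(n: int) -> int:
    """Pairwise communication channels in an n-person team:
    n * (n - 1) / 2, the usual back-of-envelope from Brooks."""
    return n * (n - 1) // 2

# A pair has a single channel to maintain; a ten-person team has 45.
print(comm_channels(2))   # 1
print(comm_channels(10))  # 45
```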

While we're at it, I think there's another misconception in this interview.

programmers are programmers because they like to code -- given a choice between learning someone else's code and just sitting down and writing their own, they will always do the latter

Two of the five developers at my little software company are programmers because they like to figure things out. So we almost always figure someone else's code out before we do anything ourselves. There are varying degrees of this in a lot of the developers we've got there. I would say that none of us will write anything ourselves unless it saves us a considerable period of time.

But even more, if you had a relative who was always wondering, "What is it that you do all day?" you could hand my book to that relative and say, This is what my work is really like.

No. I couldn't. My experience as a developer is nothing like what he's described. And he didn't talk about the phenomenon of unknowns that I've noticed: for every project I do, if I estimate how long the known things will take, dealing with unknowns will generally take 60% longer (so multiplying time estimates by 3 is generally correct). He didn't talk at all about testing.

Almost everything he talked about consists of things that I thought would be true when I started but that have turned out to be more or less untrue. Disciplined coding makes a difference. Automated unit testing catches most problems, regression testing finds almost all the rest, and not everybody does these things.

duh! (0)

Anonymous Coward | more than 7 years ago | (#17876546)

Not only software; every project is hard if it's the first of its kind. The fact that we have ready-made ingredients available does not mean that we know the right combinations and permutations and design from the start. A good engineer can build a great bridge out of stone, while an idiot wouldn't even if given the best concrete and machinery but no helping brains.

Nine women cannot have one baby in one month (5, Funny)

euice (953774) | more than 7 years ago | (#17876556)

and of course, we are the better programmers, so better fire those other 8.

It needs more professionalism (5, Insightful)

petes_PoV (912422) | more than 7 years ago | (#17876560)

Mostly, programmers are trained in the technical details of languages and the libraries/APIs associated with them. They don't gain skills in knowing what users really want, and are hurried into producing barely-working stuff, fast.

Whatever testing is done often only checks that the product produces the correct answers when fed the proper input; no account is taken of how the program reacts to incorrect or incomplete data. Changes are requested faster than they can be implemented and often are not communicated very well.
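The happy-path-only testing described above can be illustrated with a small sketch (a hypothetical `parse_age` function, invented for illustration): the first assertion is what most test suites stop at, while the loop over junk input is the part that typically gets skipped.

```python
def parse_age(text: str) -> int:
    """Parse a user-supplied age, rejecting junk instead of crashing."""
    try:
        age = int(text.strip())
    except ValueError:
        raise ValueError(f"not a number: {text!r}")
    if not 0 <= age <= 150:
        raise ValueError(f"age out of range: {age}")
    return age

# The "correct answers for proper input" check:
assert parse_age("42") == 42

# The usually-missing checks for incorrect or incomplete data:
for bad in ["", "forty-two", "-5", "9999"]:
    try:
        parse_age(bad)
    except ValueError:
        pass  # expected: bad input is rejected, not silently accepted
    else:
        raise AssertionError(f"accepted bad input: {bad!r}")
```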

In short, there are systemic failures throughout the whole process, from inception through to delivery. There is no single answer to why software is hard, and there won't be until the industry matures and people start to get thrown out of the business for acting unprofessionally.

It's economics. (0)

Anonymous Coward | more than 7 years ago | (#17876912)

The problem with software development doesn't involve professionalism. It involves economics. Put simply, it's a matter of companies and consumers continually expecting better results, but not always being willing to pay what it takes to get those results.

So what we end up with are shorter deadlines than are sensible or possible to meet, all while trying to accomplish increasingly complex tasks. In order to ship a product, something has to give. Often this is only possible by reducing the time and effort spent on testing, or by otherwise reducing the quality and performance of the software being developed.

Programming without cookies (5, Interesting)

Allicorn (175921) | more than 7 years ago | (#17876574)

Programming websites that let you actually view a page without requiring a cookie is obviously hard for the folks at Salon.

Heavy-handed management (2, Interesting)

Tontoman (737489) | more than 7 years ago | (#17876608)

Software becomes hard when heavy-handed management decisions are made to give too much emphasis to a particular software tool or methodology.
  1. An expert programmer (working with human factors experts) can prototype the new system in a cross-platform scripting language (doesn't matter which one), then can identify the objects
  2. a software team can refactor the system once again in an object-oriented language (doesn't matter which one).
  3. Finally, a period of benchmarking can identify the bottlenecks which can be refactored one last time, plus the hardware and Operating System decisions can be made based on the available hardware at the end of the software development cycle.

An approach like this would probably have been helpful in the FBI's failed $100 million debacle, the Virtual Case File system [ieee.org].

Re:Heavy-handed management (2, Insightful)

malraid (592373) | more than 7 years ago | (#17876784)

This is also a problem with some programmers. Most geeks place more emphasis on the tools than on the objectives. Some don't even care about the objectives (basically the needs of the users) and just want to use a shiny new tool. Or they want to do whatever task in the same tool no matter what (there is a saying that a determined Fortran programmer can write Fortran programs in any language).

Re:Heavy-handed management (1)

flyingfsck (986395) | more than 7 years ago | (#17877216)

Yup, it doesn't matter what language I write in, it always ends up looking like C...

Re:Heavy-handed management (1, Insightful)

Anonymous Coward | more than 7 years ago | (#17877272)

Finally, a period of benchmarking can identify the bottlenecks which can be refactored one last time
"one last time"? - I admire your optimism

Because People don't know what they want! (5, Funny)

Herkum01 (592704) | more than 7 years ago | (#17876620)

I would say the reason a lot of projects, even small ones take so much time is that requirements cannot be defined.

Compare building a house to software. Before you build a house

  1. Plans are drawn up.
  2. A step-by-step schedule is created for the construction.
  3. Contractors are brought in to complete the work as needed.

Schedule times can slip but you still know where you are in terms of progression.

If we built this house the way we do software development

  1. Hire all the construction workers
  2. Tell them to build something.
  3. At any point during construction tell them they are not doing it right.
  4. After missing all the deadlines (which were made up from the wants/desires of the customer), hire more workers.
  5. Wonder why they cannot get the job done
  6. Cancel the project after everyone realizes they don't want it anymore.

Re:Because People don't know what they want! (5, Interesting)

cowscows (103644) | more than 7 years ago | (#17876694)

I design buildings for a living, and I've dabbled in programming, and I think architecture and software development have a whole lot in common.

Your step one in "building a house" can go through all 6 of the steps that you have listed for software development. We get hired by clients, sometimes they have a good idea what they want, sometimes they don't. Sometimes what they want is feasible, sometimes it isn't. It's not unusual for even smaller projects to drag on for years, because the client keeps changing his/her mind. Many projects that cross our desks will never be built.

Many projects don't follow the traditional design phase -> building phase sequence. The phases often overlap, and it's pretty messy.

I could go on for paragraphs with the similarities that I see between software design and architecture, but I'll save that for another post.

Re:Because People don't know what they want! (3, Insightful)

Hoi Polloi (522990) | more than 7 years ago | (#17876748)

I think one thing they all have in common is that they are always custom jobs. It isn't like going to a car dealership and asking for model X in dark blue. Software is more like "I'd like a car with extra wheels on top in case it flips and purple stripes and only 1 door...". Standardization is very limited.

Re:Because People don't know what they want! (1)

chris_mahan (256577) | more than 7 years ago | (#17877090)

My sig is better than your sig :)

Re:Because People don't know what they want! (1)

kanweg (771128) | more than 7 years ago | (#17877116)

On the contrary. There is lots of standardisation. The programmer says "I've an API here, and I can give you as many doors as you want, in 10 minutes, but there is no API for the DeLorean door you request and it will take me 2-6 months to program it. Yes, I know, Mr. User, that you just want to drive your car, but I want you to give me the specs for every millimeter of that DeLorean door, or it won't get done". Then he starts complaining, if the customer suggests a Ferrari door, that the customer changes his mind all the time.

Bert

Re:Because People don't know what they want! (1)

dreadclown (842647) | more than 7 years ago | (#17877350)

Let's follow the scenario a little further ...

Mr. User then says you're just bullshitting him with this "API" crap and gets a 1 week quote from a cowboy team.

At which point we have various branching decision paths, all of which result in misery for all concerned.

And unlike programmers, you're held accountable (0)

Anonymous Coward | more than 7 years ago | (#17877066)

That's another reason.

If an architect designs a building that falls down, he's done.

But there's no accountability in writing software. It's going to be interesting watching what happens when accountability starts being applied to programmers. I can hear the whining already as the incompetents get forced into flipping burgers: "But I'm not incompetent, I just can't write code fast."

Re:Because People don't know what they want! (0)

Anonymous Coward | more than 7 years ago | (#17876708)

> If we built this house the way we do software development

Reminds me of this [guardian.co.uk]

Re:Because People don't know what they want! (1)

M. Baranczak (726671) | more than 7 years ago | (#17877224)

The clients who don't know what they want aren't so bad - most of them will accept whatever you give them. The real problem is with the ones who do know what they want, but can't describe it properly. Or the ones who want the impossible (but those are usually easy to spot early on).

Building with atoms... (0)

Anonymous Coward | more than 7 years ago | (#17877268)

And we are building the house with molecules and atoms. Even with code libraries, it isn't as if one can order a window in a standard size, an 8-foot 2x4, or PVC pipe.

Programming is *currently* at a much lower level than construction. It is like saying: "Well, we need some pipe here, so let's find some polyvinyl chloride (or worse: how do we make polyvinyl chloride? let's get a chem E and the chemicals and make it) and the tools needed to heat it and shape it into pipe. And we need some wood, so let's go grow a tree, cut it down and mill it." The tree and the polyvinyl chloride are about the best level of APIs, libraries, etc. And the code in between is at the atom/molecule level.

And then there is the issue of the foundation of the "house" varying by the OS, compiler etc all in very slight ways.

CS/Comp Engineering is making progress, but it is highly complex; it is as if we were creating the construction industry, but starting from the basic building blocks at the chemistry and physics level.

Software is hard AND there's lots of incompetence (5, Insightful)

mrjb (547783) | more than 7 years ago | (#17876624)

Yes, writing software is hard, especially writing good software. The hardest part is to make things simple, even harder is to make things simple AND flexible. The need for a thorough analysis is greatly underappreciated.

Incompetent developers tend to make things more complex than necessary. From that point on, under economic pressure, workarounds are needed to get things done. This in turn makes things even more complex than necessary. THAT is what makes writing software hard. The problem is, it is difficult to be aware of the skills that we lack. As such, a lot of programmers with a huge ego don't deserve one.

I'm not into Extreme Programming per se, but I've noticed that if multiple people look at a piece of software, the chances of problems going undetected get smaller and smaller. Yes, even if you, a master programmer, show your code to a rookie, the chance of bugs going undetected is reduced. In fact, it will inevitably result in more bugs being detected before rolling the software out to customers.
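The claim above can be framed as a toy probability model (illustrative numbers, not from the comment): if each reviewer independently misses a given bug at some rate, the chance the bug escapes everyone is the product of the miss rates, so even a weak extra reviewer lowers it.

```python
def escape_probability(miss_rates):
    """Chance a bug gets past every reviewer, assuming each one
    misses it independently at the given rate."""
    p = 1.0
    for rate in miss_rates:
        p *= rate
    return p

# A master alone who misses 30% of bugs, vs. the same master
# plus a rookie who misses 80% of them:
print(escape_probability([0.3]))                 # 0.3
print(round(escape_probability([0.3, 0.8]), 2))  # 0.24 -- strictly lower
```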

Make it simple (0)

Anonymous Coward | more than 7 years ago | (#17876810)

Surveys / SURVEY: INFORMATION TECHNOLOGY [economist.com]

Make it simple
Oct 28th 2004
From The Economist print edition

The next thing in technology, says Andreas Kluth, is not just big but truly huge: the conquest of complexity


“THE computer knows me as its enemy,” says John Maeda. “Everything I touch doesn’t work.” Take those “plug-and-play” devices, such as printers and digital cameras, that any personal computer (PC) allegedly recognises automatically as soon as they are plugged into an orifice called a USB port at the back of the PC. Whenever Mr Maeda plugs something in, he says, his PC sends a long and incomprehensible error message from Windows, Microsoft’s ubiquitous operating system. But he knows from bitter experience that the gist of it is no.

At first glance, Mr Maeda’s troubles might not seem very noteworthy. Who has not watched Windows crash and reboot without provocation, downloaded endless anti-virus programs to reclaim a moribund hard disc, fiddled with cables and settings to hook up a printer, and sometimes simply given up? Yet Mr Maeda is not just any old technophobic user. He has a master’s degree in computer science and a PhD in interface design, and is currently a professor in computer design at the Massachusetts Institute of Technology (MIT). He is, in short, one of the world’s foremost computer geeks. Mr Maeda concluded that if he, of all people, cannot master the technology needed to use computers effectively, it is time to declare a crisis. So, earlier this year, he launched a new research initiative called “Simplicity” at the MIT Media Lab. Its mission is to look for ways out of today’s mess.

Mr Maeda has plenty of sympathisers. “It is time for us to rise up with a profound demand,” declared the late Michael Dertouzos in his 2001 book, “The Unfinished Revolution”: “Make our computers simpler to use!” Donald Norman, a long-standing advocate of design simplicity, concurs. “Today’s technology is intrusive and overbearing. It leaves us with no moments of silence, with less time to ourselves, with a sense of diminished control over our lives,” he writes in his book, “The Invisible Computer”. “People are analogue, not digital; biological, not mechanical. It is time for human-centred technology, a humane technology.”

The information-technology (IT) industry itself is long past denial. Greg Papadopoulos, chief technologist at Sun Microsystems, a maker of powerful corporate computers, says that IT today is “in a state that we should be ashamed of; it’s embarrassing.” Ray Lane, a venture capitalist at Kleiner Perkins Caufield & Byers, one of the most prominent technology financiers in Silicon Valley, explains: “Complexity is holding our industry back right now. A lot of what is bought and paid for doesn’t get implemented because of complexity. Maybe this is the industry’s biggest challenge.” Even Microsoft, which people like Mr Lane identify as a prime culprit, is apologetic. “So far, most people would say that technology has made life more complex,” concedes Chris Capossela, the boss of Microsoft’s desktop applications.

The economic costs of IT complexity are hard to quantify but probably exorbitant. The Standish Group, a research outfit that tracks corporate IT purchases, has found that 66% of all IT projects either fail outright or take much longer to install than expected because of their complexity. Among very big IT projects—those costing over $10m apiece—98% fall short.

Gartner, another research firm, uses other proxies for complexity. An average firm’s computer networks are down for an unplanned 175 hours a year, calculates Gartner, causing an average loss of over $7m. On top of that, employees waste an average of one week a year struggling with their recalcitrant PCs. And itinerant employees, such as salesmen, incur an extra $4,400 a year in IT costs, says the firm.

Tony Picardi, a boffin at IDC, yet another big research firm, comes up with perhaps the most frightening number. When he polled a sample of firms 15 years ago, they were spending 75% of their IT budget on new hardware and software and 25% on fixing the systems that they already had; now that ratio has been reversed—70-80% of IT spending goes on fixing things rather than buying new systems. According to Mr Picardi, this suggests that this year alone IT complexity will cost firms worldwide some $750 billion. Even this, however, does not account for the burden on consumers, whether measured in the cost of call-centres and help desks, in the amount of gadgets and features never used because they are so byzantine, or in sheer frustration.

Why now?
Complaints about complex technology are, of course, nothing new. Arguably, IT has become more complex in each of the 45 years since the integrated circuit made its debut. But a few things have happened in the past three years that now add a greater sense of urgency.

IMAGE [economist.com]

The most obvious change is the IT bust that followed the dotcom boom of the late 1990s. After a decade of strong growth, the IT industry suddenly started shrinking in 2001 (see chart 1). In early 2000 it accounted for 35% of America’s S&P 500 index; today its share is down to about 15%. “For the past three years, the tech industry’s old formula—build it and they come—has no longer worked,” says Pip Coburn, a technology analyst at UBS, an investment bank. For technology vendors, he thinks, this is the sort of trauma that precedes a paradigm shift. Customers no longer demand “hot” technologies, but instead want “cold” technologies, such as integration software, that help them stitch together and simplify the fancy systems they bought during the boom years.

Steven Milunovich, an analyst at Merrill Lynch, another bank, offers a further reason why simplicity is only now becoming a big issue. He argues that the IT industry progresses in 15-year waves. In the first wave, during the 1970s and early 1980s, companies installed big mainframe computers; in the second wave, they put in PCs that were hooked up to “server” computers in the basement; and in the third wave, which is breaking now, they are beginning to connect every gadget that employees might use, from hand-held computers to mobile phones, to the internet.

The mainframe era, says Mr Milunovich, was dominated by proprietary technology (above all, IBM’s), used mostly to automate the back offices of companies, so the number of people actually working with it was small. In the PC era, de facto standards (ie, Microsoft’s) ruled, and technology was used for word processors and spreadsheets to make companies’ front offices more productive, so the number of people using technology multiplied tenfold. And in the internet era, Mr Milunovich says, de jure standards (those agreed on by industry consortia) are taking over, and every single employee will be expected to use technology, resulting in another tenfold increase in numbers.

Moreover, the boundaries between office, car and home will become increasingly blurred and will eventually disappear altogether. In rich countries, virtually the entire population will be expected to be permanently connected to the internet, both as employees and as consumers. This will at last make IT pervasive and ubiquitous, like electricity or telephones before it, so the emphasis will shift towards making gadgets and networks simple to use.

UBS’s Mr Coburn adds a demographic observation. Today, he says, some 70% of the world’s population are “analogues”, who are “terrified by technology”, and for whom the pain of technology “is not just the time it takes to figure out new gadgets but the pain of feeling stupid at each moment along the way”. Another 15% are “digital immigrants”, typically thirty-somethings who adopted technology as young adults; and the other 15% are “digital natives”, teenagers and young adults who have never known and cannot imagine life without IM (instant messaging, in case you are an analogue). But a decade from now, Mr Coburn says, virtually the entire population will be digital natives or immigrants, as the ageing analogues convert to avoid social isolation. Once again, the needs of these converts point to a hugely increased demand for simplicity.

The question is whether this sort of technology can ever become simple, and if so, how. This survey will analyse the causes of technological complexity both for firms and for consumers, evaluate the main efforts toward simplification by IT and telecom vendors today, and consider what the growing demands for simplicity mean for these industries. A good place to start is in the past.
 
::: yfnET

Re:Make it simply contradictory (1)

Oligonicella (659917) | more than 7 years ago | (#17877130)

Let's see: the idea is that technology is hostile and hard for humans, and the conclusion is that a decade from now everyone will either be digital natives who have no problems or digital immigrants who are learning. Kind'a contradictory.

Re:Make it simply contradictory (0)

Anonymous Coward | more than 7 years ago | (#17877378)

No, dumbass. The idea is that within the next ten years, technology will become much more accessible and integrated into our lives, because companies are already beginning to realize there's a world beyond those of you autistic little shits who demand technology be obtuse and difficult to use.

Too well do I know (1)

ZwJGR (1014973) | more than 7 years ago | (#17876652)

Anybody who tries to program anything thorough, complex or unorthodox knows in great detail what Rosenberg means when he says "software is hard", and that even trivially simple projects can take eras to complete.
You would be astonished how many subtle bugs, memory leaks, logic errors, inconsistencies, typos, misinterpretations of specs, functionality, etc. can quietly mug your simple program. If you add multithreaded functionality as well, you introduce a whole new class of horrendous synchronisation, race-condition, resource-starvation, deadlock, memory-corruption and instability errors.
Once you start with multiple files, header files, ridiculously complex make scripts (these are very popular) and odd dependencies on obscure libraries, you can while away many a dull evening having fun fixing things.
If you have multiple programmers working on the same thing, then you tread on each other's toes, can't understand what other people have written, somehow manage to subtly break someone else's code, can't agree on what to call object X, have to call a committee meeting, time wasting...

I've spent many days trying to figure out why my train goes through the wrong signal, only to discover after a week that I've saved something in one register, then used it again to save something else two lines down, then pushed and popped it for no apparent reason.
Then if you simultaneously click button X while holding key Y, memory in arbitrary position Z is modified and your program starts acting really weird and crashes ten minutes later...
The effort of developing properly robust, indestructible code grows in proportion to an unfortunately high power of program complexity.
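The multithreading errors described above are easy to reproduce. Here is a minimal sketch (in Python for brevity; the same interleaving bites in any language) of a race condition and the lock that cures it:

```python
import threading

COUNT = 100_000
lock = threading.Lock()
unsafe_total = 0
safe_total = 0

def unsafe_increment():
    global unsafe_total
    for _ in range(COUNT):
        unsafe_total += 1          # read-modify-write: threads can interleave here

def safe_increment():
    global safe_total
    for _ in range(COUNT):
        with lock:                 # the lock makes the update atomic
            safe_total += 1

for target in (unsafe_increment, safe_increment):
    threads = [threading.Thread(target=target) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

assert safe_total == 2 * COUNT     # always holds
# unsafe_total may or may not equal 2 * COUNT -- that nondeterminism is
# exactly what makes these bugs so hard to find.
```

The unsafe version can silently lose updates because `+= 1` is three steps (load, add, store) that two threads can interleave; it will often pass a test run anyway, which is why these bugs survive into production.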

First Post! (5, Funny)

Harmonious Botch (921977) | more than 7 years ago | (#17876662)

Two of us typed this. We thought it might be faster.

Read a real book on software engineering (1, Interesting)

Anonymous Coward | more than 7 years ago | (#17876672)

It should be pointed out that the author is not a software engineer and really does not know what he is talking about. Professional developers who want to understand what makes software development difficult should read some of the textbooks on software engineering that he quotes.

Many "software engineering" books are trash. (1, Insightful)

Anonymous Coward | more than 7 years ago | (#17877004)

Many of the "software engineering" textbooks out there, even those written by "respected" authors, are complete trash.

A large number of the modern books suggest UML as a solution. Anyone who has actually employed UML knows that it's virtually nothing but hype. Yes, a UML class diagram may be somewhat useful when demonstrating how existing code is structured, and a sequence diagram may prove helpful in showing the flow of messages between objects. But it's unsuitable when used for the design of a large-scale system. Once you get beyond 10 or so classes, UML diagrams become too complex to work with, and are basically useless.

The same goes for software patterns. Many do make sense on a small scale. But when the designers of systems try too hard to use patterns wherever possible, the application becomes unnecessarily complex, and failure is often the end result.

Fred Brooks is one of the few authors who has made any real contribution. Of course, that's because he actually had years of experience working on some of the largest, most important and most widely used systems of his time. Even years after his writings were first published, they remain among the most useful guides to software project management and development. So while some authors babble on about UML and patterns, we need to listen to what Brooks said: the decisive factor is the people working on the software, specifically their knowledge and experience. Solid developers will produce solid software.

Re:Many "software engineering" books are trash. (2, Interesting)

Oligonicella (659917) | more than 7 years ago | (#17877164)

Utter bullshit. I use UML for not only analysis, but design, programming and working on things in daily life. It's a matter of understanding the techniques. I've designed four cooperative wire transfer subsystems using it myself.

"we need to listen to what Brooks said... more specifically their knowledge and experience."

Basically what I said.

"Solid developers will produce solid software."

Ibid.

It's easy (1, Interesting)

superangrybrit (600375) | more than 7 years ago | (#17876686)

Wrong tools: C and C++ language. Overcomplicated APIs: Win32, HTML, drivers, JAVA, etc... Lack of standards enforcement: HTML disaster. All this makes it easy to derail any project.

Re:It's easy (1)

TapeCutter (624760) | more than 7 years ago | (#17877278)

Wrong for who? They have provided myself and many others a very respectable living for decades. And surely you don't doubt the overall usefulness of their application to society over the same time period?

Comprehensive requirements, good design, elegant code and adequate testing require experience, knowledge, patience, and bags of money. And that's just for version 1.0; wait until the users get hold of it and tell you what you "should" have done (often as a response to you pointing to the requirements and basically saying "we warned you when you asked for it").

Commercial programming is basically a small group of people prescribing a solution for a large group of people, thus the oft-quoted 80/20 rule: you get what you pay for (if you are careful). Time consuming, yes, but there is nothing intrinsically "hard" about programming, other than people.

Re:It's easy (0)

Anonymous Coward | more than 7 years ago | (#17877318)

I program in Ada. It took a lot of time to learn how to use it correctly, but now I won't go back to something like C or C++. I really don't understand why people still insist on using those languages.

Software is Math (1)

Valdrax (32670) | more than 7 years ago | (#17876690)

And as Mattel once pointed out, "Math is hard!"

Re:Software is Math (-1, Troll)

Anonymous Coward | more than 7 years ago | (#17876836)

Math are plural.

Product managers... (4, Informative)

osolemirnix (107029) | more than 7 years ago | (#17876706)

I don't think I can fully agree. I think software development may be hard, but that's never the main reason projects fail. The main reason projects fail in my 10+ years experience is because of product managers, not coders.

Product managers I have seen (and I have seen many) often don't know zilch about technology, but even worse they usually also don't know much about their market, target audience/users, User Interfaces, project management, etc.
Consequently they simply don't know what they want and aren't able to explain it in one coherent paragraph. If they could explain it, the actual coding would be half as bad.

So if this guy complains that their projects back in the day at Salon went bad, I'm not surprised. He's not a coder after all; he was a typical clueless product manager - started out as a journalist and suddenly he was responsible for a type of product he knew nothing about (CMSs), in addition to having no other qualification in software development or a related area (UI design, project management).

So am I surprised this project didn't succeed? LOL, of course not.

You wouldn't let a journalist build a space shuttle or a car now, would you? But software? Sure, software is easy, anyone can do it. In the end, it's probably not harder than building a car, but not easier either. It just takes proper skills for all roles in the team, is all.

Re:Product managers... (3, Insightful)

edittard (805475) | more than 7 years ago | (#17876874)

You wouldn't let a journalist build a space shuttle or a car now would you? But software? Sure, software is easy, anyone can do it.
[PHB] Heck, what's the difference? Journalism, programming, they both look like a load of typing to me! [/PHB]

Re:Product managers... (0)

Anonymous Coward | more than 7 years ago | (#17876908)

Language Trolling:

don't know zilch about technology
Double negatives.

Re:Product managers... (1)

maxume (22995) | more than 7 years ago | (#17877030)

Building a car isn't that huge a deal:

http://www.boydcoddington.com/store/HotrodShop/Default.aspx [boydcoddington.com]

Most mechanics can do it; they might need a bit of reference material. Space shuttles are a different thing, but most countries can't manage to build them, so...

Software is bad because crappy, cheap software that mostly works is apparently acceptable, especially compared to expensive software.

Re:Product managers... (1)

C10H14N2 (640033) | more than 7 years ago | (#17877284)

"Sure, software is easy, anyone can do it." ...which is why we get the perennially insightful thread titles about "why software is hard" followed by a zillion posts saying, essentially, "no shit." It really raises the question: if only one could get management lackeys to simply understand "hey, software IS hard!", perhaps we could get on without having to perpetually explain why. I mean, you don't go to your surgeon and say "hey, Doc, why can't I just get a bunch of community college kids to swap this heart out--and do we really need all this support staff? I mean, you know what you're doing, right? EIGHT HOURS?! Christ, man, it's just ONE piece! I don't have all day. Follow-ups? What, you can't do it right the first time? You must be incompetent. Hand me that scalpel..."

Too many ad-hoc hacks (4, Interesting)

Peaker (72084) | more than 7 years ago | (#17876736)

The software world is in a very poor state indeed.

I think that once someone improves the situation of software architecture and programming languages so that programmers don't have to mess with ad-hoc hacks but instead write the logic that they want to implement, then software will cease to suck.

The main problem is Operating Systems architecture and Programming Languages.
Due to lack of time, I will only list a few of the Operating Systems problems that weren't solved after more than 30 years of OS development:
  1. They don't allocate resources sanely. One program (even worse when it has many threads) that wants more memory and more CPU will grind the entire User Interface to a halt, even though guaranteeing the resources required for a smooth UI is so cheap. (i.e.: instead of guaranteeing 0.5% of the memory/CPU to the UI so it's always smooth, even that 0.5% goes as an extra boost to the program that's already got 99.4%.)
  2. They offer an unnecessarily (historically) complicated model to programs, with multiple spaces of memory (malloc'able/sbrk memory, and file-system space), even though these memory types are actually interchangeable: when you malloc, your RAM may be paged to disk, and when you use a file, it is often cached in RAM. Instead, operating systems should just expose one type of memory that is always non-volatile and persistent, so that programs don't have to worry about serializing back and forth between these memory types.
    This would also get rid of the unnecessary bootup/shutdown sequence all programs currently deal with.
  3. They do not offer a high-level world of network-transparent primitives that allows all method calls to run transparently over a network. If this existed, we would not see the abomination that is web-forms+AJAX and the rest of this ultra-complicated world that still does not work nearly as well as local GUIs. Instead of extending the web to support GUI functionality (poorly), we should have seen GUIs extended to transparently reach over the network. The X protocol is similar, but not good enough, as it transmits too-low-level primitives (pixel data and mouse movements) and is an alternative to, rather than the standard, GUI API that the operating system offers.
  4. The security model, using users and groups assigned to objects, is of very coarse granularity, requires a system administrator to modify the model (users/groups) and does not allow fine-grained control over the access of entities (processes) to objects (i.e.: as a non-administrator, I cannot prevent my mp3 player from accessing the network or deleting the files it can read).
    Instead, a capability-security model should be used (not POSIX capabilities, but EROS/KeyKOS-type ones), which is much simpler to use and verify and much more powerful and fine-grained. This would also facilitate secure movement of components between computers - which could be done automatically by the OS to improve performance. More on that in a later post.
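The object-capability idea in point 4 can be illustrated in a few lines. This is a toy sketch in Python (real systems like EROS enforce this in the kernel; the `ReadCapability` and `play` names are invented for the demo):

```python
import os
import tempfile

class ReadCapability:
    """An unforgeable token granting read-only access to one file.
    Holding the object *is* the authority: no ambient permission check,
    and no write/delete methods exist to be abused."""
    def __init__(self, path):
        self._path = path
    def read(self):
        with open(self._path) as f:
            return f.read()

def play(track):
    # The player receives only a read capability. Deleting the file or
    # opening the network was simply never granted, so it cannot do either.
    return len(track.read())

fd, path = tempfile.mkstemp()
os.write(fd, b"fake mp3 data")
os.close(fd)

song = ReadCapability(path)
assert play(song) == 13             # reads succeed
assert not hasattr(song, "delete")  # the authority to delete was never granted
os.remove(path)
```

The point of the model is that access control becomes a matter of which references you hand out, not which administrator-managed group a process runs as.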


Re:Too many ad-hoc hacks (1)

Eli Gottlieb (917758) | more than 7 years ago | (#17877038)

# Don't allocate resources sanely. One program (even worse when it has many threads) that is wanting more memory and more CPU will get the entire User Interface to a halt, even though guaranteeing the required resources for a smooth UI is so cheap. (i.e: Instead of guaranteeing 0.5% of the memory/cpu to the UI so its always smooth, even this 0.5% goes as an extra 0.5% boost to the program that's already got 99.4%)
Real-time scheduling. Lottery scheduling. Raising the priority of the UI process, even under a normal Priority Round Robin scheduler.
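Lottery scheduling, for the curious, is simple enough to sketch. Python is used purely for illustration here; a real scheduler lives in the kernel:

```python
import random

def lottery_pick(tickets):
    """tickets: {process: ticket_count}. Draw one ticket uniformly at
    random; a process holding 5% of the tickets wins about 5% of the
    draws -- the guaranteed minimum share the parent post wants for
    the UI."""
    total = sum(tickets.values())
    draw = random.randrange(total)
    for proc, n in tickets.items():
        if draw < n:
            return proc
        draw -= n

random.seed(42)                          # deterministic for the demo
tickets = {"ui": 50, "batch_job": 950}   # UI holds 5% of the tickets
wins = {"ui": 0, "batch_job": 0}
for _ in range(10_000):
    wins[lottery_pick(tickets)] += 1
# wins["ui"] lands near 500 (about 5% of 10,000 draws), no matter how
# greedy the batch job is.
```

However starved the system gets, the UI's expected share never drops below its ticket fraction, which is exactly the guarantee the parent post asks for.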

Offer an unnecessarily(historically) complicated model to programs, where there are multiple spaces of memory (malloc'able/sbrk memory, and file system space), even though these memory types are actually interchangable and when you malloc, your RAM is moved to disk, and when you use a file, it often allocated RAM. Instead, operating systems should just expose one type of memory, that is always non-volatile and persistent, so that programs don't have to worry about converting/serializing back and forth between these memory types.
This would also get rid of the unnecessary bootup/shutdown sequence all programs are currently dealing with.
You're talking about "orthogonal persistence". EROS, KeyKOS and Unununium all pursue it.
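For readers who haven't met the term: a toy version of orthogonal persistence, with Python's `shelve` module standing in for an OS that keeps all memory persistent (the file path and `run_program` helper are invented for the demo):

```python
import os
import shelve
import tempfile

# The program manipulates what looks like an ordinary dict; the runtime
# (shelve, standing in for the OS) keeps it on disk. No explicit save/load
# step, no serialization code, no "bootup" logic in the program itself.
path = os.path.join(tempfile.mkdtemp(), "app_state")

def run_program():
    # Each call simulates one run of the program between "reboots".
    with shelve.open(path) as state:
        state["counter"] = state.get("counter", 0) + 1
        return state["counter"]

assert run_program() == 1   # first run
assert run_program() == 2   # state survived the "shutdown" automatically
```

In a truly orthogonally persistent OS the same transparency would apply to every object a program holds, not just one dictionary.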

Does not offer a high-level world of network-transparent primitives, that allows all method calls to transparently run over a network. If this existed, we would not see the abomination that is web-forms+AJAX and the rest of this ultra-complicated world that still does not work nearly as well as local GUI's. Instead of extending the web to support GUI functionality (poorly), we should have seen GUI's be extended to transparently reach over the network. The X protocol is similar, but not good enough as it transmits too low-level primitives (pixel data and mouse movements) and is also an alternative and not a standard GUI API that the operating system offers.
Sun's NeWS did a little better than X Windows. So did Amoeba (and every other distributed operating system).

The security model, using users, groups and assigning those to objects is of very rough granulity, requires a system administrator to modify the model (users/groups) and does not allow fine-grained control over the access of entities (processes) to objects (i.e: As a non-administrator, I cannot prevent my mp3 player from accessing the network or deleting the files it can read).
Instead, a capability-security model should be used (not POSIX capabilities, but EROS/KeyKos type ones), which is much simpler to use, verify and much more powerful and fine-grained. This would also facilitate secure movement of components between computers - which could be done automatically by the OS to improve performance. More on that on a later post.
You said it yourself: Capability security systems. Real Access Control Lists would help, too.

Most of the solutions in the operating system world are already there, but nobody uses them in commodity systems. Why? Backwards compatibility. I'm serious. Too many new operating systems aim for backwards compatibility with POSIX, Windows, Amiga, BeOS, RISC OS, the Lisp Machines, or some other old OS architecture. I'll give you a quote: "A new operating system project should address a real problem that is not currently being addressed; constructing yet another general-purpose POSIX- or Windows32-compliant system that runs standard applications is not a worthwhile goal in and of itself." That's from The Pebble Component-Based Operating System [bell-labs.com] , written in 1999.

And then, of course, we can go and search for "Systems Software Research is Irrelevant". Everyone in the commercial world seems to ignore the great strides forward that operating-systems research has made.

Yes, I am an operating-systems geek. How'd you know?

Too many sweet-nothings. (0)

Anonymous Coward | more than 7 years ago | (#17877312)

"Yes, I am an operating-systems geek. How'd you know?"

The sweet way you said everything.

Re:Too many ad-hoc hacks (1)

Watson Ladd (955755) | more than 7 years ago | (#17877168)

Number 2 really isn't desirable. The overhead would be quite painful for a lot of programs that need realtime performance. Also, clearing RAM is sometimes desirable, such as when a program fails to maintain an invariant and then goes into an infinite loop. If the variable is persistent and non-volatile, we have problems. Also, some data *shouldn't* be persistent, like passwords.

Re:Too many ad-hoc hacks (1)

Oligonicella (659917) | more than 7 years ago | (#17877184)

Um, not everyone programs on and for a Windows machine in C.

Good programming is a boundaries problem (4, Interesting)

argoff (142580) | more than 7 years ago | (#17876764)

One thing I've noticed about companies is that they try to treat programmers like factory workers. Expect each one to be interchangeable and jump in anywhere on the "assembly line" at any place at any time for any piece of code. However, programming takes understanding, and complex programming takes complex understanding. Even a good programmer fixing a bug may need to analyze surrounding code for several hours before changing a single line.
Unlike most engineering projects, which are completed and done, most programming is a living, growing process that is constantly changed, modified and improved.

That implies that there is a need for specialisation and clear boundaries, to assign "ownership" or "territory" over certain parts of the code: a programmer who understands it gets the last say on how it's changed, with clear, non-arbitrary rules for changing that "territory". Like in open source projects: if you want a kernel fix, you submit it to the proper maintainers, or make your own fork, but no corporate bureaucrat comes along and micromanages how the code is merged and managed.

Re:Good programming is a boundaries problem (2, Insightful)

zymurgy_cat (627260) | more than 7 years ago | (#17876820)

Unlike most engineering projects that are completed and done, most programming is a living growing process that is constantly changed modified and improved.

Most engineering projects (and software projects...hell, any project) are like children. They're never completed and done. Once you give birth to one, you're stuck with it for a long time....

To engineer is human... (0)

Anonymous Coward | more than 7 years ago | (#17877146)

"One thing I've noticed about companies is that they try to treat programmers like factory workers. "

Software Factories [amazon.com]

Boo-Fucking-Hoo (-1, Troll)

Anonymous Coward | more than 7 years ago | (#17876774)

Our job is hard. Boo hoo. Welcome to the fucking world you crybabies!

Software is neither "hard" nor "easy" (5, Insightful)

Bright Apollo (988736) | more than 7 years ago | (#17876978)

Implementing a good design is usually half the battle. Creating a good design is usually the other half, but in practice, a solid design is almost always the part that gets skipped. Let me bore you with a brief anecdote.

I have a large, global project underway. User requirements are done and have been done, and we're turning those requirements into things we can code or deliver ("View a workorder", "Print asset detail", "Group revisions into single document"). Of that, we have 150-odd deliverable items, not to mention all the fit/finish work we may have to do, and all of this barely touches on reports, security roles for users, etc.

The reason we're going to make our date, despite the 1280 discrete requirements we need to test, is that we've taken the time to look at the requirements from a few different angles and come up with a solid design plan, before even thinking about implementation. Each piece will build on another, really hard parts are identified early, blockers and such are flagged ASAP. We know things will emerge that we didn't expect, but we've got the biggest chunks identified and working together on paper. We have the flows mapped out, exceptions and variations listed, and a user group that has to sign off on every iteration of the incremental build (we're spiraling out functions and features).

The only thing "hard" about all of this is the incessant thinking about the details, and discipline required to focus on the un-fun part of software construction, i.e. the planning and design walkthroughs. The itch to code something already is growing, but delayed gratification means that when the time comes to actually write something, the design will almost certainly lead to a working, if not optimal, solution. We can refactor as we go, but it needs to work completely before it can work efficiently.

I've been following Chandler off and on, somewhat through Spolsky's references to it and some stray links around the web, and it sounds like design didn't go deep enough into what it'll really take to build some of the pieces.

-BA

Primary problem: programmers (this may mean you) (1)

jvarszegi (758617) | more than 7 years ago | (#17877010)

The replies as to what the problem really is are all over the map. This shows the biggest problem facing software developers today: software developers. There's no standardization that's worth a damn in most shops or on most projects. In addition, there are many people writing code who shouldn't be-- survivors from the flush times, when plenty of secretaries and liberal arts majors made their way into the field for the easy money. Lots of these people stayed on, got undeserved respect from business people, and were viewed as experts while others lost their jobs. I think that big business thinks it's got a good chokehold on IT, finally, but they're wrong. Instead non-savvy managers have managed to promote people who are most like them. In addition, without any pressure to improve or conform to standards, most people who begin with rigorous training let themselves slide, through a lack of direction or sheer laziness. The one-programmer rule only seems to add value because the suckage of multiple lamebrains working on the same bug-riddled code, with different bad concepts, being pulled in multiple directions by cheeseheaded managers, increases exponentially.

Primary problem: Playing by the rules. (1, Informative)

Anonymous Coward | more than 7 years ago | (#17877078)

Here's a book [amazon.com] I recommend anyone contemplating "why is software hard" should read.

Once again... (5, Insightful)

etnu (957152) | more than 7 years ago | (#17877018)

Why doesn't anyone complain about how hard brain surgery is? Why doesn't anyone complain about how hard building space exploration vehicles is? Why doesn't anyone complain about how hard creating a successful marketing campaign is? Software engineering is difficult because it's a complex subject that takes a combination of intelligent people and training to produce good results. Just because businesses are too stupid to realize this doesn't make the problem go away. You can't throw complex projects at untrained, stupid, incompetent people and expect them to produce quality software. You can't just invent some magic formula for software development that will work 100% of the time to maximize efficiency. Software engineering is NOT manufacturing. Accept it and move on for fuck's sake.

Structure. (1)

azav (469988) | more than 7 years ago | (#17877022)

Even dealing with a high-level language like that in Adobe's Director, there are too many unknowns and edge cases, and it is not painfully clear just how everything plugs together. Building a structure for doing all the tasks you need to accomplish is also paramount. As I said, "The structure allows you the luxury to focus on the details." Build the structure.

Cheers

How hard can it be? (-1, Troll)

Anonymous Coward | more than 7 years ago | (#17877024)

How hard can it be, even girls can do it.

Software Engineering is such a young discipline (1, Interesting)

Anonymous Coward | more than 7 years ago | (#17877094)

People love to ask why we can't build software like we build bridges. Well, we've been building bridges for what? Several thousand years now? I bet if we look back at the first 50 years of bridge building (which is about how long we've been building software, give or take) we'd see similar mistakes being made to how we are trying to build software. In a thousand years or so, I'm sure we'll have enough accumulated knowledge of how to build software that these silly questions of why we can't build software easily now will seem childish at best.

Programming != Software Engineering (0)

Anonymous Coward | more than 7 years ago | (#17877132)

There is a reason why software takes so long to complete, and another why the 'state of the art' is so damn bad.

Software is an art, but the development of working software is not; it's an engineering science. Software that works correctly is possible to build, if one follows engineering principles: analysis, careful design, and testing at every step in the process.

Software in itself does not fail, it is the process behind the software that causes the software to fail. Poor testing, poor requirements == poor software. A weak link in any of these, and you have shit instead of software.

There is a reason why software engineering (here in Canada) is being taught: it's to correct the weakest link in the chain, the developer. This is the main difference between a simple programmer (code monkey) and a software engineer. If one applies the correct engineering principles to everything, the quality certainly improves. It took 50 years for engineering to become what it is today, and still structures fail. There is a difference, however, in how those failures are treated, and in what analysis is done before even the first concrete is poured.

It might take another 50 years before the distinction between a simple programmer, and a software engineer is made, but progress marches on.

Artificial intelligence is the grand challenge. (1)

Mentifex (187202) | more than 7 years ago | (#17877140)

Mind for MSIE [sourceforge.net] has recently developed the ability to embark upon meandering chains of thought. Not only must concepts from one idea give rise to a branching idea, but the artificial AI Mind must be able to deal with gaps in its own knowledge base and dead ends where no further data are available.

The Technological Singularity [blogcharm.com] should be here by 2012 at the latest. AI was incredibly hard, but now AI has been solved.

Programming is easy (0)

Anonymous Coward | more than 7 years ago | (#17877148)

But a single lazy or incompetent programmer can make the whole thing impossible.

I'm working on a project now; there are 2 people in a team of 60 who have this problem, and as a result the whole project is designed around what they can't/won't do.

Why is software hard? (1)

iminplaya (723125) | more than 7 years ago | (#17877156)

Licensing.

EOF

One reason (1)

Bluesman (104513) | more than 7 years ago | (#17877270)

I think one possible reason that software isn't getting better is that hardware hasn't caught up to software in terms of abstraction.

Why is there a hardware stack, with special instructions for pushing data onto it and popping data off it? It's not absolutely necessary.

Why are there certain data types like int, float, etc?

Why is there hardware supported virtual memory? Didn't that start as software? Why didn't it stay that way?

It's because these are extremely useful abstractions over bits. The hardware could just move bits around, but we have stacks and data types to support the fast execution of C-style programs and the UNIX process model.

Herein lies the problem. Until hardware supports more abstractions than just C-style data types and functions, we'll always be coding to that level.

Hardware has been getting faster and faster in terms of raw speed, but in terms of features, it's been stagnant for decades.

How nice would it be to have hardware-supported garbage collection? This would solve a huge number of problems in the software world.

Sure, you can use high-level languages to get similar results, but there's no unifying architecture that lets them interoperate easily.

I think that things will get better once manufacturers start to look at other ways than raw speed to differentiate themselves.

Because people forget experience (3, Informative)

Beryllium Sphere(tm) (193358) | more than 7 years ago | (#17877316)

Fred Brooks had much the same material in _The Mythical Man-Month_: communication overhead spirals out of control in large groups, project scope creeps out to infinity without a budget, overconfident people try to do too much and fail, it's impossible to know what the customer wants and (in a new area) even what works until you've built something and watched how it fails, only make change to known-good baselines, etc.

This author had to discover Fred Brooks after he'd started a career of big projects. TMM should have been in his school curriculum.

A Simple Test (0)

Anonymous Coward | more than 7 years ago | (#17877358)

Before I listen to another word from someone who complains about what a mess computers are, I want to know: "Show me your desktop".

Running Windows? There's your problem. NEXT!

premier league (1)

Anne Thwacks (531696) | more than 7 years ago | (#17877360)

Everybody knows some footballers are worth a million dollars and others are not worth a fig, but somehow hardly anyone realises that some programmers are worth a million dollars, while with some others it would be worth a million dollars to get shot of them (see the daily wtf [thedailywtf.com] for details).

Wrong! (1)

Monsieur_F (531564) | more than 7 years ago | (#17877380)

The title is nonsensical.

Hardware is hard, but software is soft!