Slashdot: News for Nerds


Dan Bricklin on Software That Lasts 200 Years

simoniker posted about 10 years ago | from the in-it-for-the-long-term dept.

Software 359

Lansdowne writes "Dan Bricklin, author of VisiCalc, has written a great new essay identifying a need for software that can last for decades or even centuries without replacement. Neither prepackaged nor custom-written software is fully able to meet the need, and he identifies how attributes of open source might help to produce long-lasting 'Societal Infrastructure Software'."

359 comments

hi (-1, Offtopic)

Anonymous Coward | about 10 years ago | (#9705560)

hi. yeah i believe that is impossible. hi. first post.

BENEDICT ARNOLDS OF THE OPEN SOURCE MOVEMENT (-1, Offtopic)

Anonymous Coward | about 10 years ago | (#9705659)

  • Marc Andreessen made 100s of millions of dollars shortly after graduating from UIUC. Today's graduates of the same university face moving back in with their parents. "Fuck that, I got mine!" [nytimes.com]
  • Brian Behlendorf decided he'd rather go to India [salon.com] to recruit software engineers than help out the graduating classes of 2001-2004 here in the US.
  • Robert Malda stood idly by and said NOTHING while his company offshored its flagship product [vasoftware.com] .

Miguel de Icaza, Bruce Perens, Eric Raymond, and Linus Torvalds all got rich off the Open Source Movement. What do you have to look forward to?

Work on the hardware first. (4, Insightful)

dosun88888 (265953) | about 10 years ago | (#9705569)

I think the subject line says it all. You can't worry about your software working for that long until your hardware can last that long.

~D

Re:Work on the hardware first. (4, Funny)

Keruo (771880) | about 10 years ago | (#9705590)

we have the hardware, paper and pen
only problem is that the software, the human,
generally dies of old age around 70-100 years
I haven't yet seen custom-written software from this field, but some re-packaged with silicone enhancements did catch my eye

Re:Work on the hardware first. (1, Funny)

foobsr (693224) | about 10 years ago | (#9705792)

re-packaged with silicone enhancements did catch my eye

Please explain ! [superpam.com]

CC.

Re:Work on the hardware first. (5, Informative)

Deag (250823) | about 10 years ago | (#9705616)

Well Dan Bricklin does point out that software of today can run on different hardware and having software tied to specific hardware is a bad idea.

He says that the system should stay fundamentally the same and components can be replaced and upgraded, rather than having everything replaced completely every five years.

He is not just talking about one specific program that doesn't change, but rather open standards and techniques that mean data stored today will be accessible in 200 years' time.

Re:Work on the hardware first. (3, Insightful)

julesh (229690) | about 10 years ago | (#9705711)

Well Dan Bricklin does point out that software of today can run on different hardware and having software tied to specific hardware is a bad idea

Software of today can run on a variety of different hardware, but there is a degree of similarity between the different types of hardware that probably won't exist between today's computers and those available a hundred years from today, much less two.

He is not just talking about one specific program that doesn't change, but rather open standards and techniques that mean data stored today will be accessible in 200 years' time.

That, on the other hand, I can agree with. Anyone storing information in a format that isn't publicly documented really ought to consider whether they'll still need it in 30 years' time, and start migrating it to an open format now if they will. However, there are very few formats that are completely undocumented. I believe the most commonly used might be Microsoft Access databases. I'm not sure what documentation exists on the formats of various other commercial database systems; I believe Oracle's formats are well documented (?). What about MSSQL? Informix?

Most accounts packages have documentation available on their database formats I believe. Certainly Sage and Pegasus provide such documentation. What about Great Plains, etc.?
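As a counterpoint to undocumented formats, data parked in a publicly documented container stays readable no matter which application wrote it. A minimal sketch using SQLite, whose on-disk file format is openly specified; the table and rows below are invented for illustration:

```python
import sqlite3

# SQLite's file format is publicly documented, so data stored this way
# stays recoverable even if a particular application vanishes.
conn = sqlite3.connect(":memory:")  # use a file path for durable storage
conn.execute("CREATE TABLE accounts (name TEXT, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("Sage import", 120.0), ("Pegasus import", 75.5)])
rows = conn.execute(
    "SELECT name, balance FROM accounts ORDER BY name").fetchall()
print(rows)  # [('Pegasus import', 75.5), ('Sage import', 120.0)]
```

Any future SQL-speaking tool can open the same file, which is exactly the property the undocumented formats above lack.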

Re:Work on the hardware first. (2, Interesting)

warrax_666 (144623) | about 10 years ago | (#9705839)

[...] mean data that is stored today will be accessible in 200 years' time.

and a huge part of this is hardware support and, interestingly enough, storage bandwidth. You see, you have to migrate data from obsolete hardware/media to newer hardware/media, but in the near future the amount of data stored on obsolete hardware/media may become too large to transfer to newer hardware/media before it dies/decays/whatever, simply because the throughput of the transfer mechanism is too low.
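To make that bandwidth worry concrete, here is a back-of-the-envelope sketch; the archive size, throughput, and remaining media lifetime are invented figures, not real media specs:

```python
# Rough migration-time estimate: can an archive be copied to new media
# before the old media is expected to fail?

def migration_days(archive_bytes: float, throughput_bytes_per_sec: float) -> float:
    """Days needed to stream the whole archive at a given sustained rate."""
    return archive_bytes / throughput_bytes_per_sec / 86_400

# Hypothetical archive: 500 TB read back at a sustained 50 MB/s.
archive = 500e12
rate = 50e6
days = migration_days(archive, rate)
print(f"{days:.0f} days")  # ~116 days of continuous copying

# If the old media is only trusted for, say, 90 more days, the data
# cannot be migrated in time over a single drive; you'd need to
# parallelize the copy or accept losses.
assert days > 90
```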

Re:Work on the hardware first. (1, Informative)

Vitus Wagner (5911) | about 10 years ago | (#9705620)

The whole point of Open Source software is that you can compile it for any architecture you can imagine.

As long as you have some compilers ported to your new hardware and a kernel with compatible system calls, you are perfectly safe.

Re:Work on the hardware first. (2, Insightful)

Anonymous Coward | about 10 years ago | (#9705729)

Whole point? Umm.. no.

Side benefit? Yes.

Re:Work on the hardware first. (1, Insightful)

Vitus Wagner (5911) | about 10 years ago | (#9705751)

The whole point is that the software is comprehensible by humans. Anyone can read, fix and improve it.

Porting to a new architecture is an integral part of fixing and improving.

And it is no side effect; it is quite a significant part of RMS's "free as in freedom": independence from any vendor (the hardware vendor in this case).

Re:Work on the hardware first. (0)

Anonymous Coward | about 10 years ago | (#9705703)

Even if the hardware and software lasted that long, society would most likely change so much that it would hardly be applicable.

For example, look at 200-year-old recipes or old laws like the Homestead and mining acts.
Very few people today load up on lard while riding in the caravan to stake their claim in the prairies.

Re:Work on the hardware first. (2, Interesting)

2Bits (167227) | about 10 years ago | (#9705726)

Well, hardware can be worn out, but you can replace it, component by component, as he has suggested.

The problem with software is, as long as there are proprietary formats (or you don't have the source code), you can forget about what he calls "societal infrastructure software". If the government wants to be able to retrieve its data 50 years from now, it had better enforce, right now, that an open data format be used in the application, with clean and precise documentation. That means no MS Word format, or the like.

To extend that, database files would be troublesome too. Does anyone know the internal structure of Oracle database files?

My first job out of school, on my first day on site, was to crack the database files of a system from a dead vendor. And there was no API (this was in the early 90s), because it was some PC software. The company had its customer info and accounting info in that DB, and had to migrate it to another system which was still alive. There were about 300K records of all kinds, and no way to retrieve the data except to display the records one by one and copy them over. The alternative was to print out the whole thing and re-enter it in the new system. The system was critical to the company, and it had some serious bugs, but no one was going to fix them.

What are you going to do with that? Well, the good thing was that I was fresh out of school, had learned my data structures well, and the files were not encrypted. After one day of cracking, I figured out it was some weirdo B+tree plus some kind of trie, plus some more acrobatic tree structures with pointers pointing in all directions (probably for indexing or fast access). It took another day to figure out enough info, by looking at the relationships in the application, to retrieve the data from that dead system.

That was easy enough. But if I had to brute-force retrieve from Oracle, that would be another thing.

Governments can enforce that vendors provide proper documentation of their software data formats before a deal is struck, especially if the system is going to run national infrastructure, such as the IRS, etc. Especially when the system costs in the hundreds of millions (if not billions), why don't they enforce that? I'd be a multibillionaire if I knew the answer.
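For illustration, a hedged sketch of the kind of brute-force record recovery described above, assuming a made-up fixed-width record layout; a real B+tree file like the one in the story would take far more detective work:

```python
import struct

# Hypothetical fixed-width record: 4-byte little-endian id,
# 30-byte null-padded name field, 8-byte balance in cents.
RECORD = struct.Struct("<i30sq")

def read_records(raw: bytes):
    """Yield (id, name, balance) tuples from a blob of packed records."""
    for off in range(0, len(raw) - RECORD.size + 1, RECORD.size):
        rec_id, name, balance = RECORD.unpack_from(raw, off)
        yield rec_id, name.rstrip(b"\x00").decode("ascii"), balance

# Fake "database file" built the same way, standing in for a real dump.
blob = (RECORD.pack(1, b"ACME Corp", 1050_00)
        + RECORD.pack(2, b"Widgets Ltd", 99_99))
for rec in read_records(blob):
    print(rec)
```

The real work, of course, is discovering the layout in the first place; with a documented format this whole exercise would be unnecessary.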

Re:Work on the hardware first. (1)

nmrs (591104) | about 10 years ago | (#9705736)

Not true at all... as MAME and a host of other emulators have so clearly proven, if the software is that stable and that useful, someone will write an emulator to keep it running on contemporary hardware.

200 years??? (2, Funny)

danormsby (529805) | about 10 years ago | (#9705571)

Any presidents set for this to show 200 years is a good target to aim for?

Must have had one hell of a beta test phase.

Re:200 years??? (3, Insightful)

pjt33 (739471) | about 10 years ago | (#9705663)

I've stared at your post for a long time trying to work out what you mean. Please put me out of my misery by telling me whether the second word should read "precedents".

Re:200 years??? (1)

danormsby (529805) | about 10 years ago | (#9705697)

Yes is should. I made a most embosing typo.

Re:200 years??? (3, Funny)

Ours (596171) | about 10 years ago | (#9705860)

Wait until the Secret Service starts asking you questions about shooting presidents ;-).

Re:200 years??? (0)

Anonymous Coward | about 10 years ago | (#9705705)

The physical significance of Kohn-Sham orbitals is very debatable, Mr Snaketown.

Re:200 years??? (5, Funny)

nomadic (141991) | about 10 years ago | (#9705858)

I think 200 years isn't long enough. They just don't make software like they used to. For example, last time I visited Atlantis, and used the Amulet of Chr'Thalis to activate the ancient computers lying dormant beneath the Temple of the Dawn, they just started working perfectly. True, they only speak Ancient Atlantean, but the software's just fine. And we're talking about systems that haven't been maintained since the Temple Wardens vanished sometime during the Fourth Age; that's several hundred years at least since their last debugging. Of course, some of the hardware's a bit run-down (in the case of some of the Temple traps that's a good thing), and the Orb of Kings is still inactive, but the Temple software works perfectly.

Open Source (5, Funny)

Anonymous Coward | about 10 years ago | (#9705575)

It seems like most open source has been below version 1.0 for at least 200 years. But it's all for a quality product, right? Oh, you found a bug? Well, that's because it's pre-1.0!

I'm sad... (1)

Dr. q00p (714993) | about 10 years ago | (#9705737)

...to say that your comment feels more like "5, Insightful" than "3, Funny" to me. Wish I had some mod points to give you.

What about Job Security? (0)

Anonymous Coward | about 10 years ago | (#9705576)

To maintain a product you don't need a lot of developers. How can we survive?

Re:What about Job Security? (1)

kfg (145172) | about 10 years ago | (#9705767)

How can we survive?

Produce something useful, for a start.

KFG

Survival? (2, Funny)

Dr. q00p (714993) | about 10 years ago | (#9705772)

I think there was a /. post some time ago (that I cannot seem to locate right now) that talked about the freeware paradox: The better freeware becomes the less you make on support.

So, in order to survive I guess you have to make shitty software and do lots of marketing to sell your products anyway.

Hmm, sounds familiar in some way...

This is nothing new (-1, Offtopic)

slashman2004 (797128) | about 10 years ago | (#9705583)

Some files never die.. such as http://www.lagnet.co.za

Maybe it's needed, but who will develop it? (4, Insightful)

Biotech9 (704202) | about 10 years ago | (#9705593)

No company in the world will ever try to develop software that never needs (costly) upgrades and add-ons. Take a look at Microsoft's behaviour with MS Office: it's a complete cash cow because they can update it when they want and force people into upgrading with changed document types. Even the open source community will be too interested in improving and adding on to their pet projects to consider leaving them alone.

This article seems pretty flawed.

We need to start thinking about software in a way more like how we think about building bridges, dams, and sewers.

The fundamental difference being that bridges cost more to alter than software does. And the capabilities of hardware allow more freedom in software, for which there is no parallel in bridges.

hmmm, just my 2 euro-cents.

Re:Maybe it's needed, but who will develop it? (-1, Offtopic)

Anonymous Coward | about 10 years ago | (#9705608)

That's all very interesting, but how do you feel about violently sodomising Sir Roger Douglas?

Re:Maybe it's needed, but who will develop it? (5, Insightful)

tessonec (620168) | about 10 years ago | (#9705629)

I think you do not completely understand the point of the article...

The point is that, given the fact that there is a vast amount of information in computer files, you must be aware that if you can't retrieve that information in the future, it will be lost.

You are right, most software gets updated. But it is the interface that understands the format that must last much longer than a couple of software update cycles.

This is exactly another reason to prefer open standards over closed-source formats, as MS in 100 years (if it still exists) will have forgotten what .doc in Windows 2000 looked like.

Re:Maybe it's needed, but who will develop it? (2, Insightful)

Mr_Silver (213637) | about 10 years ago | (#9705634)

Take a look at Microsoft's behaviour with MS Office: it's a complete cash cow because they can update it when they want and force people into upgrading with changed document types.

Maybe before, but the document format hasn't changed since Office 2000.

You can send me a document written in Word 2003 and I can happily open it in Word 2000.

Re:Maybe it's needed, but who will develop it? (3, Informative)

pheede (37918) | about 10 years ago | (#9705684)

In fact, the Word document format hasn't changed since Word 97. So any Word version from 1997 or onwards will do the job.

And changing the settings to save in RTF format by default (enabling Word versions from Word 6.0 through 2003, as well as basically all other word processors, to read the documents) isn't all that hard. Not even in a corporate setting.

Microsoft encourages upgrading of Office installations through a lot of questionable means, but the Word document format isn't one of them.

Word 97-2000 You are Wrong (1, Redundant)

tessonec (620168) | about 10 years ago | (#9705783)

In fact, the Word document format hasn't changed since Word 97. So any Word version from 1997 or onwards will do the job.

And changing the settings to save in RTF format by default (enabling Word versions from Word 6.0 through 2003, as well as basically all other word processors, to read the documents) isn't all that hard. Not even in a corporate setting.

What? Do you think your brand-new Office XP will flawlessly read your 10-year-old Word 2.0 .doc file???

Just googling a little bit shows [www.mcse.ms] that you are not right.

Not to mention .doc changes between different architectures (Mac, x86...)

So...

Microsoft encourages upgrading of Office installations through a lot of questionable means, but the Word document format isn't one of them.

It IS another of them

Re:Maybe it's needed, but who will develop it? (0)

Anonymous Coward | about 10 years ago | (#9705681)

I don't know many bridges that cost more than 100 million upgrade copies of MS Office.

Re:Maybe it's needed, but who will develop it? (1)

MaxwellStreet (148915) | about 10 years ago | (#9705694)

Who will develop it?

There will (according to this article) be cooperatives of gov't agencies and public organizations that raise money to collectively develop systems.

His example was of a parking ticket management system - there's no reason why different municipalities should each use a different one. He advocates using best practices, viewable (if not open) source, and transparent data interchange techniques to collectively create excellent software for public information infrastructure.

If you get past the literal analysis of his metaphor (with respect to bridges, sewers, etc.) - and think of them as public works projects that serve the community - the article is far from flawed.

Quite the opposite - this is a very compelling idea.

It's efficient with taxpayer dollars by spreading the cost over different government groups (say, municipalities); it provides for paid development efforts as opposed to relying on volunteer open source development; it allows for competition; and it would allow for customization at the municipality level for any special features they can afford to add.

Re:Maybe it's needed, but who will develop it? (1)

BinBoy (164798) | about 10 years ago | (#9705714)

Also bridges, dams and sewers need lots of maintenance. Even streets need maintenance. The street in front of my house has been repaired repeatedly, repaved once within the last 10 years, had its lines repainted many times and is cleaned twice a week.

Re:Maybe it's needed, but who will develop it? (2, Interesting)

term8or (576787) | about 10 years ago | (#9705725)

I think that the fundamental reason the construction industry generally succeeds in producing bridges that don't fall down is the existence of building and planning regulations that require products to be of a good standard before they are sold. For example, in the UK, if a bridge falls down and someone's killed, it's corporate manslaughter and the MD is going to jail. Perhaps what we need is more regulation for the software industry ;-) For example, instead of customers paying for software support, maybe it should be a legal requirement that VENDORS pay for software support and insure via a third party to guarantee support even if they go bust.

Bricklin It (0)

neilmoore67 (682829) | about 10 years ago | (#9705595)

With radical new ideas like this the software old-guard must be Bricklin it!

Start using simpler hardware (4, Interesting)

MavEtJu (241979) | about 10 years ago | (#9705600)

I think the trick is to use simpler hardware, which is easy to replace.

Take today's computer: a motherboard with one big black chip with the CPU on it, a network card that is also just one chip, a video card that is impossible to figure out. Due to the integrated design, you can't fix it if it is broken. And in five years you won't be able to replace it one-for-one.

On older hardware (8-bitters), you were able to repair it yourself because you knew how it worked and you were capable of replacing a failing chip. Even if you didn't have exactly the same chip, you could use one from a newer family which did the same thing but could switch much, much faster.

Re:Start using simpler hardware (1)

91degrees (207121) | about 10 years ago | (#9705610)

How often do chips go wrong?

Re:Start using simpler hardware (1)

jrockway (229604) | about 10 years ago | (#9705790)

Pull the heatsink off your CPU and, after you get a new one, come back and tell me that chips don't go wrong :)

No (5, Insightful)

Mr_Silver (213637) | about 10 years ago | (#9705609)

Neither prepackaged nor custom-written software is fully able to meet the need

I disagree. It's got nothing to do with the software but the data.

If the data format is clearly documented, then it doesn't matter whether the application that generated it is open or closed.

True, you could argue that since the code is open the data format is also documented, but personally I'd find it easier if it were written up in a properly structured document.

Otherwise you'd have to resort to learning and then ploughing through an application written in some 200-year-old programming language (by someone who possibly hacked it up with a hangover at the time) to try and understand what they were doing and why.

Re:No (0)

Anonymous Coward | about 10 years ago | (#9705776)

> Otherwise you'd have to resort to learning and
> then plouging through an application written in
> some 200 year old programming language (by
> someone who possibly hacked it up with a
> hangover at the time) to try and understand
> what they were doing and why.

Yea verily. For sooth older languages give pause to even those skilled in many tongues. Methinks one must ask whether 'tis nobler in the mind to suffer the slings and arrows of obscure allusions, or to take arms against a sea of language troubles, and, by standardizing, end them.

Re:No (1)

huge (52607) | about 10 years ago | (#9705810)

I disagree. It's got nothing to do with the software but the data.
Exactly.

The medium used to store the data should be documented as well.

We already have a problem accessing 5-10 year old backups, as it's a pain trying to find compatible devices for reading the data. And no matter how well documented the data format is, it doesn't help if you cannot gain access to the medium.

Hypothetical question... (4, Funny)

BladeMelbourne (518866) | about 10 years ago | (#9705611)

I wonder if there will still be holes/bugs in Microsoft Internet Explorer 6 SP1 in 2204?

Now excuse me while I get back to writing my "Hello World" application that will last two centuries :-)

Re:Hypothetical question... (2, Funny)

OldManAndTheC++ (723450) | about 10 years ago | (#9705661)

Now excuse me while I get back to writing my "Hello World" application that will last two centuries :-)

Unfortunately, in 200 years the language will have evolved, and the words and phrases we use today will have completely different meanings. People of the future will understand "Hello World" to mean "All Your Base Are Belong To Us", and believing your program to be a dire threat, they will fire up their time machine and send back Arnold Schwarzenegger's great-great-great-great-great-great-grandson to eliminate you before you can even write it...

2 letters (4, Funny)

News for nerds (448130) | about 10 years ago | (#9705612)

vi

Re:5 letters (2, Funny)

DungeonCoder (796682) | about 10 years ago | (#9705650)

Emacs is better

(mod +5 funny, remember, this is /. :))

Re:5 letters (0)

Anonymous Coward | about 10 years ago | (#9705829)

You both suck. The correct answer is NotePad. DUH! FOR GREAT JUSTICE!

It's a tool... (2, Insightful)

tgv (254536) | about 10 years ago | (#9705617)

For Christ's sake, computers are mostly used as tools. And who keeps their old tools around for so long? Only neanderthals: [paleodirect.com] ...

tools vs. infrastructure (2, Interesting)

karzan (132637) | about 10 years ago | (#9705708)

there is an important difference between tools and infrastructure. true, much software is used as tools--for accomplishing discrete tasks that evolve as societies and technology evolves. but much software--databases, routers, control devices for physical infrastructure, etc--is used more as infrastructure; that is, as a resource expected to be reliable and predictable by many users and necessary for accomplishing other tasks that ride on top of it, including employing new tools.

infrastructure, because of its multi-user character and the fact that other things are designed to work on top of it, has to have lasting standards--if road lanes suddenly start to become half or double the width, then cars, trucks, traffic flows, etc will all be affected. even if some small technical reason might make it be reasonable to change them, their character as infrastructure means that the long term reliability of how they work is more important than short term technical considerations.

in other words, it would probably be silly at this point to try to design user interfaces, web browsers, etc. that last 200 years, because they are still rapidly evolving. however it makes a great deal of sense to start designing standards for data storage and interface, as well as actual 'infrastructure' software to last a long time because more users (including developers of more 'tool-like' software) benefit from its stability than from its instability.

Not Hopping On (1)

illuminata (668963) | about 10 years ago | (#9705618)

Part of what makes software great is that it's relatively disposable. Software development is one area where you actually get to reinvent the wheel as many times as you want. Why the hell would you want to take away that freedom in any area of software?

No, thanks, I prefer things the way they are now. No point in locking ourselves into this method when we'll probably figure out a better way to complete certain software tasks in no time.

Already there? (2, Interesting)

rudy_wayne (414635) | about 10 years ago | (#9705623)

Remember Y2K? Did anyone notice that the world didn't come crashing down on Jan. 1, 2000?

It seems that all those old mainframes running programs from the 60's weren't in such bad shape after all.

This is an over-simplification of course -- people did have to do some work to make sure there weren't any "Y2K" problems.

Re:Already there? (1)

hcdejong (561314) | about 10 years ago | (#9705774)

Yes, but sooner or later those 1960s mainframes will be 'beyond repair'. Even now this is an issue, with replacement parts becoming rare [slashdot.org].

Re:Already there? (3, Funny)

njcoder (657816) | about 10 years ago | (#9705808)

" Remember Y2K? Did anyone notice that the world didn't come crashing down on Jan. 1, 2000?"

You mean it's safe to come out of my bunker? Thank God! I'm sick of sustaining myself on spam, twinkies and tang.

Not Possible (5, Insightful)

deutschemonte (764566) | about 10 years ago | (#9705627)

Constant standards are what is needed to make software last that long.

Language standards don't even last 200 years, how do we expect something as new as software standards to be more uniform than language standards? Language has been around for thousands of years and we still can't agree on that.

Re:Not Possible (1, Interesting)

Anonymous Coward | about 10 years ago | (#9705828)

Um. Welsh?

True, there are additions. However, it is still possible to read Welsh from 200+ (600+?) years ago and translate it effectively into current Welsh/English/whatever.

Your Word 1.0 doc containing that Welsh, however, is a different matter....

think back 200 years (3, Interesting)

Keruo (771880) | about 10 years ago | (#9705635)

We've had software and computers for ~30 years now.
Going back 200 years, we were only beginning proper industrialization and everything was pretty much running on steam.
I think it's flawed to try to design software that lasts for decades or centuries.
The technology is constantly evolving, and as the hardware changes, so does the software.
If hardware development continues as it does, in 2200 we, or the people then, might be working with hardware running at terahertz speeds on 4096-bit architectures.
Probably that's an underestimate, since growth curves tend to be exponential.
I don't really think they would still need the software someone wrote for Windows 95 200 years earlier.

Standards, not Software (5, Insightful)

amitofu (705703) | about 10 years ago | (#9705641)

Standards are what must be designed to last for decades, not the software that conforms to the standards. Things like XML, RDF and POSIX will be supported for decades, if not centuries. Who cares if it is Linux running your POSIX apps, or FreeBSD, or HURD? I don't think it matters if software uses libxml2 to parse your XML data, or some yet-unconceived API--as long as it understands XML!

If it is stability and reliable infrastructure that is desired, it is standards that must remain constant and software that must evolve to make the standards work with new technology.
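A minimal sketch of that point with Python's standard library: the data below is written against the XML standard rather than against any particular API, so any conforming parser, present or future, can read it (the ledger fields are invented for illustration):

```python
import xml.etree.ElementTree as ET

# Data written to a standard, not to a library: any XML parser works.
doc = """<ledger>
  <entry id="1"><payee>Town Clerk</payee><amount>42.50</amount></entry>
  <entry id="2"><payee>Water Dept</payee><amount>17.00</amount></entry>
</ledger>"""

root = ET.fromstring(doc)
total = sum(float(e.findtext("amount")) for e in root.findall("entry"))
print(f"{len(root)} entries, total {total:.2f}")  # 2 entries, total 59.50
```

Swap in libxml2, or an API not yet conceived, and the document itself needs no change; that is the stability the parent comment argues for.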

Re:Standards, not Software (5, Funny)

B2382F29 (742174) | about 10 years ago | (#9705738)

Aren't you a little bit optimistic about HURD being available in just 200 Years?

now history depending on electricity (2, Insightful)

clsc (730336) | about 10 years ago | (#9705642)

The world is different now than it was even just a decade or two ago. In more and more cases, there are no paper records.

The point that the author makes here is really that without electricity we will lose great parts of recent history.

Re:now history depending on electricity (5, Interesting)

I confirm I'm not a (720413) | about 10 years ago | (#9705674)

The point that the author makes here is really that without electricity we will lose great parts of recent history.

When I was at secondary school, in Britain during the 1980s, we participated in a UK-wide project to create a modern version of the "Domesday Book", the 11th-century record of people and property.

The project we worked on was recorded onto a (state-of-the-art) laserdisc so it would "last through the generations".

Last year I read an article saying that dedicated enthusiasts were desperately trying to assemble a working laserdisc system, in order to archive all the data collected just 20 years earlier.

Moral: it's not just electricity we need to worry about - media and the equipment necessary to access specific media is vital, too.

Re:now history depending on electricity (1)

term8or (576787) | about 10 years ago | (#9705743)

You would have thought that someone would have printed it out;)

That's not really data loss (1)

clsc (730336) | about 10 years ago | (#9705752)

- as that laser disc player could be "reconstructed" from old patents and diagrams as long as they're stored on paper.

Also, they could just have bought one [google.com]

It's merely a case of "Betamax vs. VHS" or "history is written by the victorious party". As long as the artifacts/data are retrievable you will be able to reconstruct, but electricity is still the limit.

Re:now history depending on electricity (2, Funny)

dabigpaybackski (772131) | about 10 years ago | (#9705863)

The project we worked on was recorded onto a (state-of-the-art) laserdisc so it would "last through the generations". Last year I read an article saying that dedicated enthusiasts were desperately trying to assemble a working laserdisc system, in order to archive all the data collected just 20 years earlier.

So what's the problem? As long as they included instructions on the laserdisc as to how to build a laserdisc player, then they could just... oh. Maybe I should sleep.

Software != Civil Engineering (2, Insightful)

Jotham (89116) | about 10 years ago | (#9705651)

I disagree with the common comparison of Software to Civil Engineering and Standards Bodies.

Data Structures would be a better analogy, which Standards Bodies have done a really good job declaring. So in 200 years time you'll still be able to read the DVD data format (assuming the media is still good), even though the software that plays it will likely be different.

Software is more like mechanical engineering, where things do break and improvements keep being found. You wouldn't, for example, use a 1960s car engine in a car today, even though the basic principle is the same. No one asks why they didn't get it right 40 years ago and why we aren't still using the same design.

Unfortunately, what would often be considered an early prototype in engineering, is often released as v1.0 -- the cause of which is a long post all unto itself.

Re:Software != Civil Engineering (1)

hcdejong (561314) | about 10 years ago | (#9705785)

A good job? When exactly none of the most popular computer applications use a standardized data format?

Re:Software != Civil Engineering (1)

mqx (792882) | about 10 years ago | (#9705837)


You can't compare software engineering with either mechanical engineering or civil engineering, in the same way you generally can't compare any of the engineering professions with each other - but what they do all share is a common underlying concern for attributes such as reliability, maintainability, risk management, structural integrity, and so on.

Lasting 200 years (2, Interesting)

banana fiend (611664) | about 10 years ago | (#9705656)

Societal infrastructure is the key part here.

How many democracies are older than 200 years? How many governmental structures have survived 200 years? Bridges may last that long, but 200 years ago, Ireland was a very different place. America was a very different place, England was a very different place (see Ireland and America for why ;) ). As a matter of fact, EVERYWHERE was very, very different.

200 years ago, the Americans loved the French for helping them in the revolutionary war, the English hated the Americans as barbarians and the Irish as "Paddies", and the Irish hated the English. The English hated the French..... Come to think of it - only some things change.

Back to the point - Software, or those parts of it that do qualify as societal infrastructure will have to change, simply to keep up with the rate of societal change and anything that lasts for 200 years is a very fundamental tool indeed.

See also (3, Informative)

CGP314 (672613) | about 10 years ago | (#9705660)

See also The Long Now Foundation. [longnow.org]

I read their book in college and, though it is a bit pie-in-the-sky, I thought it raised some interesting ideas. One of their projects was to build a clock that could last a thousand years. When I moved to London [colingregorypalmer.net] one of the first things I did was go to see the thousand-year clock in the National Science Museum. There it was, in all its broken, non-time-telling glory. About a month ago I checked up on it again. Status: still not fixed : \

Paul Graham said it best. (3, Informative)

fatcow (121528) | about 10 years ago | (#9705666)

Re:Paul Graham said it best. (1)

julesh (229690) | about 10 years ago | (#9705835)

Interesting article. I don't think he's necessarily right in all aspects of that, but he has some good ideas.

It's clear that he was approaching the question with the LISP-user's mindset: simpler is better. There is such a thing as too simple, in my opinion.

long-term.. short-term (2, Funny)

Janek Kozicki (722688) | about 10 years ago | (#9705671)

hey, all this babbling about long-term and short-term reminds me of xterm. Soon xterm will be 200 years old. Or at least sooner than almost anything else. (except for getty ;)

Defies the functioning of the economy (1)

syrinje (781614) | about 10 years ago | (#9705679)

Any technological product that does not contain within itself the seeds of its obsolescence is highly unlikely to reach the market, even if it were possible to design and produce.


What prevents that from happening is less the state of scientific or technological development - rather, it is governed by simple marketplace economics. The call for a "perpetual" product denies the necessity of transactional perpetuation that is indispensable to sustaining the economy. And our daily survival is closely interlinked with this whimsical beast that we all love to loathe.


This article is really a call for a change in both the economic and political models that are in place today. I don't know if the author didn't realise that or deliberately chose to focus on the immediate issue, but it is an expression of dissatisfaction with the way we do business today.


Which is strange - we routinely accept impossible deadlines in our jobs - deadlines that are dictated by transparently artificial business urgencies.


Makes me wonder what would happen if the growth rate of ALL companies in the world were to be scaled back by say 15% in some kind of economic Slo-Time....:)

On one hand, a deliberate, parity-maintained global slow-down might improve everybody's quality of life. On the other hand it might just make things worse and result in a month of Black Tuesdays. And on the gripping hand the offended ghosts of Smith, Keynes and company might curse humankind to be confined to a barter economy for evermore.

Re:Defies the functioning of the economy (1)

hcdejong (561314) | about 10 years ago | (#9705822)

The point is that Bricklin proposes not to leave the creation of this software to the open market, but to commission software for this purpose. Nothing stops the government from specifying "no obsolescence" when they commission a new system. And it'll usually be the government that needs these systems.

This is not a threat to the current economic model. There'll still be lots of data that doesn't require a 200-year lifespan, so commercial software can still be used.

Too young (2, Insightful)

frankthechicken (607647) | about 10 years ago | (#9705685)

The problem with comparing computer practices with civil engineering practices, is the age of the two industries.

Software is such a young industry that best practices, standards etc. have yet to be settled upon and thus will be hard to implement. Most engineering practices have come about after centuries of development; I somehow feel software development will have to mature for a while before we see similar licences and standards bodies.

Legacy COBOL (2, Interesting)

FJ (18034) | about 10 years ago | (#9705686)

There are already legacy COBOL programs that are key pieces of many businesses. Some of those are almost 30 years old. Not really exciting code, but still important to many businesses.

Ask the programmers at Duke Nukem Forever (4, Funny)

davejenkins (99111) | about 10 years ago | (#9705691)

Those Duke Nukem guys should have this problem pegged by now...

Not possible (2, Interesting)

kcbrown (7426) | about 10 years ago | (#9705700)

Software is technology as much as it is art, and as such is subject to the same dependencies that other technologies are subject to.

The nature of technology is to evolve over time. Only the most basic tools we have haven't changed significantly over time: things like the knife, the hammer, etc. Even the screwdriver has seen significant development in the 20th century (Torx screws, for instance).

Only those things for which the underlying rules don't change can remain constant over time. Software is especially vulnerable to change over time because the platforms it depends on, both other pieces of software (like operating systems) and hardware, change significantly over time. 200 years ago, computers weren't even a glimmer in Charles Babbage's eye.

And as much as technology has changed over that period of time, so have the needs of society. And since software is written to fulfill those needs, it's absurd to even ask it to live much longer than a lifetime. About the only kind of software that could possibly live that long would be games, and even then only a select few have that kind of timelessness.

It simply doesn't work.. (1)

sucati (611768) | about 10 years ago | (#9705704)

An interesting paper, however it's completely ideological. Consider the IRS's woes with its modernization effort [washingtontechnology.com]. Also, think about all the mission-critical software running on near-end-of-life VAX equipment. Letting software age without proper maintenance and improvement is a dangerous thing.

Re:It simply doesn't work.. (1)

hcdejong (561314) | about 10 years ago | (#9705841)

No, the problem with the IRS (and others) is that 30 years ago they didn't consider they'd have to move the application and data to another platform eventually, so they tied the application and data to the current platform and/or didn't document what they did.
If they'd done it the "Bricklin way" those problems wouldn't have existed.

Trust me, it's not a problem! (2, Insightful)

Dr. q00p (714993) | about 10 years ago | (#9705718)

Just find me a customer that wants to pay for "robustness, testing, maintainability, ease of replacement, security, and verifiability" and I'll deliver.

The problem is more social than technical. (4, Insightful)

jellomizer (103300) | about 10 years ago | (#9705720)

Sure, it is possible to write a program that is platform independent and could possibly run for 200 years. But the problem is this: how many organizations can last 200 years without changing their policies, or without society changing? Let's compare now with 200 years ago, 1804. How many companies have lasted since 1804? Not too many, and all of them have changed the way they do business since then. How many companies 200 years ago would have had enough foresight to allow policies for IT workers? Maybe one, who was swiftly locked away for his crazy talk. Also, a lot of today's terminology will go away in 200 years. I predict the term "race" will be an outdated word confined to old literature and newspapers, given the steady decline in racial prejudice and the rise of interracial marriage. 200 years ago a businessman would ask your religion in order to decide whether to do business with you; now there would be problems even if he asked as just a personal question. Or say we get visited by space aliens: Sex: M F X A I C. Who knows what new and unheard-of categories will be added, or whether a method of doing things is drastically changed, or even what the company does changes. Heck, the company I work for started out repairing mainframes and now does mostly IT consulting - and that is in 10 years; imagine 200.
So to make a program this customizable, you need to make it a programming language, with everything you need to add, delete, change and alter over time. But even programming languages change: think of Fortran. 30 years ago it was the most popular language out there, and now it is tossed aside for newer languages; even with Fortran compilers for Linux, most people will rewrite their Fortran code in a more modern language rather than just port it, to take advantage of new features such as GUIs, Internet connectivity, color printing, and Web access. Things that seemed useless or impossible 30 years ago are now becoming important. Sure, it is possible to make a program run for 200 years - but is it possible to make it useful for 200 years? Besides, all the extra design time for a program that can run 200 years will cost a lot of money and time. Are the users of the application willing to pay $1,000,000 for a Java program that crunches their numbers? Or will they pay $50,000 for a program that will last them 10 years, and be a lot less bloated and simpler to use?

Paper is a bad analogy (3, Interesting)

Grab (126025) | about 10 years ago | (#9705723)

I love the way that everyone presents written records as a good example of a "perpetual" medium which surpasses digital.

You may note that the author says "you can read 100-year-old newspapers *on* *microfiche*". This point practically jumps up and down to be noticed - even in the world of printing, paper copies are not seen as suitable for long-term storage, due to difficulties of preservation and physical bulk. So these paper copies are transferred to some other medium for long-term storage. This medium relies on readers existing - if all companies making microfiche readers went out of business (which is probably not too many years away) then the microfiches will be unreadable. And the microfiches themselves are fragile - a scratch in the wrong place will make one difficult to read, and it's on plastic which will degrade over time.

Why should digital be any different? If you want ultra-long-term storage of digital data, use punch holes in gold sheets. Otherwise you use a storage medium which gives you a reasonable storage size and reasonable data security.

On reading the data back, suppose microfiche readers went obsolete and you couldn't buy them. The method of reading the data is still known and recorded, and can be reconstructed by someone needing to get the data back. Similarly, the most common bulk storage methods today are the CD-R and the DVD+/-R (tape backups are practically obsolete). Now the standard for data storage on CD and DVD is, well, *standard*. So if in 200 years time someone wants to read one back, they could build a CD player from first principles.

Grab.

Re:Paper is a bad analogy (0)

Anonymous Coward | about 10 years ago | (#9705782)

Tape backups are practically obsolete?
Damn! And I just made a 70Gb backup of one of the servers to tape.

Where is the cost of change ? (2, Interesting)

Alain Williams (2972) | about 10 years ago | (#9705724)

The cost of changing software can be looked at 2 ways:
  1. Move the software to a new (but similar) box since the old one is worn out or not fast enough or ... In practice this is not too difficult since you can either just copy the binaries, buy new ones, or run
    ./configure && make
    This I would not call a real change and is not too expensive.
  2. Move the software to a new (or much changed [the current] version of the same one) operating system. This is expensive as there is a lot of recoding that must be done and then work configuring it on the new platform.
Note that the above is only valid if the software being copied does not really change its functionality, as the customer has not changed the requirement spec.

One of the nice things about Unix (Linux/...) is that you can still run very old software on new boxes with at most minimal changes - I still use code that I first wrote some 20 years ago.

There has been much assumption in this discussion that the whole system (hardware, OS, software) has to live unchanged for many years; I think that is missing the point as the true cost of software change is only big in case (2) above.

Note that some software does need to be regularly changed, e.g. payroll - because governments change the rules every year or two.

Hardware is not the point (1)

BaronGanut (780013) | about 10 years ago | (#9705731)

The point is not to make the software run on the same hardware for 200 years. But to make software that still serves its purpose for a long time.

Take accounting: they still use the system they started with. A telnet (or ssh) client connects to a console- or menu-based server running the database; the server itself is about 1/10th to 1/20th of its original physical size, and the client machines have been upgraded and replaced.
But the old system is still in use because of the speed and usability.

Same with libraries and their large databases of books: they cannot change databases every year or three. It would cost a lot of time and money, and there is no real point in it. It works!

Ink and Paper (3, Insightful)

Quirk (36086) | about 10 years ago | (#9705741)

What's needed is ink and paper - our proven technology for archiving. Microfiche and magnetic storage devices are now more prevalent than ever before, but the book industry, published journals and daily newspapers show no sign of diminishing. And as the article points out, newspapers dating back 200 years are still available in public libraries. Electronic voting protocol is just now hashing out whether a paper trail is prudent. Granted, the article rightly points out the need to develop an archiving industry able to meet the needs for computers to replace paper-based archiving, but as long as hardware development thrives in an open, competitive economy, the market will dictate the timing of implementing the necessary hardware - unless some body like the Library of Congress undertakes financing the necessary hardware and software.

Open Source (0)

Anonymous Coward | about 10 years ago | (#9705775)

From the article: "most new software and hardware can only access the most recent or most popular old data. (...) The companies that built the software and hardware are often long gone and the specifications lost."

That's where OSS makes its difference.

paper books (2, Informative)

spectrokid (660550) | about 10 years ago | (#9705781)

In Belgium, notaries still pay law students to copy important documents by hand into thick books, made from acid-free paper and solidly bound together. Stacked in a basement, you can throw a jerrycan of gasoline over them and set fire to it, and you will lose (almost) nothing. Instead of relying on laser discs (see other post), print everything out and count on OCR.

User defined requirements??? (2, Insightful)

cardpuncher (713057) | about 10 years ago | (#9705793)

requirements for the project must be set by the users

I've yet to meet a client commissioning a project who knew well how his own business operated, still less was able to understand how any knowledge he did have might be usefully turned into a specification. One of the reasons some software projects have a short life is that the intended users fundamentally misunderstood how their business worked, or that its way of working was likely to change.

the world is just not ready for this (1)

big ben bullet (771673) | about 10 years ago | (#9705794)

I totally agree with the author, but I think the world is just not ready for this. I had some really futuristic thoughts while reading the article (one tends to have those when reading about things that have to last 200 years).

Until society is less about profit, i don't see such a thing happening... open source might be an answer for this but it won't be enough. There would have to be _more_ open and globally adopted standards and this won't happen very soon (again... profit)

Take the positive things out of Futurama, 1984, Bill and Ted's Bogus Journey, Total Recall, or any other futuristic story with an 'evolved' society and we might get there...

I recently read a reply invoking the 60's mentality - the idea that technology will have to work for us as it evolves, though nowadays we still have to work for technology too much.

heck... we still have to work! i'd better get back on that before i write a complete article about this subject :-)

The 4-function calculator (1)

dj245 (732906) | about 10 years ago | (#9705801)

Maybe they don't have uptimes of 200 years, but they could probably still use the same software written 40 years ago. In the future calculators might be so cheap that the 4-function model is an antique, but right now their selling point is cheapness (keep one in the car for MPG calculations!). Why write the same software over and over again for the same chip architecture?

Software that lasts forever is the simplest kind.
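dj245's observation - that the simplest software can last indefinitely - is easy to illustrate. The entire logic of a four-function calculator fits in a few lines of portable ANSI C (a sketch; `calc` is a hypothetical name, not any real calculator's firmware):

```c
/* A four-function evaluator in portable ANSI C. Code this small and
 * dependency-free is the kind that outlives its platform: it should
 * compile unchanged on compilers decades apart.
 */

/* Apply op to a and b, storing the result in *out.
 * Returns 0 on success, -1 on an unknown operator or division by zero. */
int calc(double a, char op, double b, double *out)
{
    switch (op) {
    case '+': *out = a + b; return 0;
    case '-': *out = a - b; return 0;
    case '*': *out = a * b; return 0;
    case '/':
        if (b == 0.0)
            return -1;      /* division by zero is the only runtime hazard */
        *out = a / b;
        return 0;
    default:
        return -1;          /* not one of + - * / */
    }
}
```

Nothing here depends on an OS, a library version, or even a particular decade's compiler - which is exactly what gives trivial software its longevity.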

200 years? I'll raise you 2,200 ... (3, Interesting)

Bazzargh (39195) | about 10 years ago | (#9705802)

The idea of software that lasts 200 years reminded me of a discussion on the radio the other day about the origin of a joke: "I've had this broom 50 years; it's had 5 new heads and 3 new handles." The identity issue played with here dates back at least to Plutarch's Ship of Theseus [washington.edu] - if you keep replacing parts of a thing, until no original parts remain, is it still the same thing?

The relevance to software is captured with an example: Is Linux still Linux? How much remains of the kernel originally published by Linus? Would you say that Linux has been around for X years (pick X to suit)?

Most people would agree that it's still Linux. What Linux, the broom, and Theseus' ship have in common is that they could be modified to meet the demands of time, while retaining their identity.

I've always thought that maintainability is the highest virtue software can strive for, above other quality-oriented goals like being bug-free or performant. If it's buggy but maintainable, it can be fixed; if it's slow but maintainable, we can make it faster. I think it could also be argued that software, like Theseus' ship, needs to be maintainable to last 200 years; but the version 200 years from now may not resemble the original in the slightest.

Just my 2c

Baz

Re:200 years? I'll raise you 2,200 ... (1)

argent (18001) | about 10 years ago | (#9705862)

Would you say that Linux has been around for X years (pick X to suit)?

You're looking at the wrong software. Linux is UNIX, for all practical purposes, and UNIX has been around for getting on 35 years, and it's been 25-30 years since the last major set of changes in existing interfaces, and almost that long since the UNIX API was first independently implemented.

Linux is just the latest of those implementations, though BSD is a better example of your 50 year old broom since it started out as UNIX and *has* been replaced module by module until no line of the original code remains. Still, if Linux is a newer broom, it's still fundamentally a broom, not a Microsoft Mop or a VMS Vacuum.

Though I'll note that Microsoft is integrating a kind of "broom compatibility module" in Longhorn.

So... I wouldn't expect any given software package to last 200 years. But APIs and file formats, open systems and standards more than open source, those will survive.

200yrs when we have enough problems with 15yrs... (1)

nih (411096) | about 10 years ago | (#9705818)

See the BBC Domesday project for a good example

BBC Domesday project [bbc.co.uk]

Wait... (1)

millahtime (710421) | about 10 years ago | (#9705824)

Wait... what do I care. In 200 years I'll be dead.

What is the reason Donald E. Knuth wrote TeX? (4, Insightful)

mvw (2916) | about 10 years ago | (#9705831)

Prof. Knuth [stanford.edu] was unhappy with the degrading typographical quality of the printings of his The Art of Computer Programming [stanford.edu] series. So he took 10 years of his research time to develop the TeX [stanford.edu] computer typesetting system. (A stunt hard to pull off, if you are not a professor or rich :-). Now look at how he published the TeX system. There is a set of 5 books [stanford.edu] containing
  • TeX user manual
  • TeX commented source code
  • Metafont user manual
  • Metafont commented source code
  • The Metafont programs to generate the Computer Modern fonts
What is that good for?

If you, say in 500 years, get a copy of these 5 volumes (and if they are printed on good paper, there is a good chance they survive), you just need some kind of computing device and the skill to implement a simple Pascal-like programming language. Then you type in the programs and fonts from these books and voila, you have a working TeX system!

Of course you need to write a .dvi driver for whatever output device you want to use at that time.

If you now find some .tex source of one of Knuth's books, be it in print or some crude hyperflux memory cube, you are then able to reproduce that book in the quality Knuth intended it to have!

Thus TeX was explicitly developed to transfer the typographic quality of Knuth's books into the future, without depending on lots of software vendors establishing lots of data-format converters (e.g. Word 2325 to Word 2326)!

Regards,
Marc
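Marc's point can be made concrete with a tiny example. A plain TeX file like the one below (a hypothetical sketch, not from the books) contains everything a future reader needs: run it through any faithful TeX implementation - including one re-typed from the published source volumes - and it should produce identical typeset output.

```tex
% hello.tex - a minimal plain TeX document
% Processed with:  tex hello.tex   ->  hello.dvi
\font\title=cmr10 at 14pt  % Computer Modern, rebuilt from the Metafont sources
{\title Software That Lasts}
\par
TeX's output is frozen by design: the same input file yields the same
typeset page on any conforming implementation, in any century.
\bye                       % end of job
```

Since .dvi output is device-independent, only the final driver (screen, printer, hyperflux cube...) ever needs rewriting.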

is it just me... (0)

Anonymous Coward | about 10 years ago | (#9705846)


or is everyone else sick and tired of these old computer visionaries coming down from their 42nd floor penthouses to spell d00m upon the current state of computing?

NASTRAN? SPICE? (0)

Anonymous Coward | about 10 years ago | (#9705849)

I am too lazy to log in.

I think this software package, NASTRAN, is possibly one of the few that could last that long. Finite element analysis is incredibly useful. This software, written in Fortran, has been around since the 60's.

Also, SPICE has proven to be incredibly useful. It can even model superconducting Josephson junctions.

These types of software allow us to explore and create new products/ inventions. Truly, applications that help advance knowledge are more likely to stay around.

good example (1)

mqx (792882) | about 10 years ago | (#9705853)


If you want a good example of survivable software and data, just look at the emulators for old 8/16bit computers, e.g. the Commodore 64 emulators. They faithfully reproduce the entire machine, allowing any of the old software to be used without a problem. Even in 200 years time, these old C64 emulators will be around as a curiosity, in the same way that early "flip card" cinema machines are found in museums.