The Hundred-Year Language

dtolton writes "Paul Graham has posted a new article called "The Hundred-Year Language". The article is about the programming languages of the future and what form they may take. He makes some interesting predictions about the rate of change we might expect in programming languages over the next 100 years, and some persuasive points about the possible design and construction of those languages. The article is definitely worth a read for those interested in programming languages."
This discussion has been archived. No new comments can be posted.

  • by grub ( 11606 ) <slashdot@grub.net> on Friday April 11, 2003 @10:33AM (#5710322) Homepage Journal

    I do not know what the language of the year 2000 will look like, but it will be called FORTRAN. [cmbi.kun.nl] -Attributed to many people including Seymour Cray, John Backus
  • by Jack William Bell ( 84469 ) on Friday April 11, 2003 @10:33AM (#5710323) Homepage Journal
    I predict that in 100 years someone, somewhere, will still be running COBOL applications.

    And I will still be refusing to maintain them. Six years in the COBOL mines was six years too long...
    • by $rtbl_this ( 584653 ) on Friday April 11, 2003 @10:45AM (#5710408)

      I predict that in 100 years someone, somewhere, will still be running COBOL applications.

      And I will still be refusing to maintain them.

      Surely that depends on whether you're damned or not. I imagine there's a whole circle of hell devoted to maintaining COBOL apps.

  • by sokkelih ( 632304 ) on Friday April 11, 2003 @10:33AM (#5710326)
I guess that programming languages go in cycles. Ah, COBOL is all coming back to me. This object orientation is way too appreciated; it is time to get back to the days when VAX admins ruled the universe of COBOL :)
  • how long (Score:3, Insightful)

    by xao gypsie ( 641755 ) on Friday April 11, 2003 @10:35AM (#5710336)
.... until programming languages begin to resemble spoken languages very closely? well, at least those languages with power, not BASIC and its friends. or, is it even possible to conceive, at this point, that there will be languages with the power of C but the syntax of English, Spanish...etc....

    xao
    • Re:how long (Score:5, Interesting)

      by GnuVince ( 623231 ) on Friday April 11, 2003 @10:43AM (#5710396)
      Forth can be used a little bit like that (example taken from "Starting Forth", by Leo Brodie):

\ Word definitions
: convicted-of  0 ;   \ To convict someone
: murder        25 + ;
: arson         10 + ;
: robbery        2 + ;
: music-copying 40 + ;
: sentenced-to  . ." years of prison" ;

      And to use it:

      convicted-of music-copying robbery sentenced-to

Output: 42 years of prison

This looks quite like English. Of course, you can do that in many languages, but it feels more natural in Forth, I think.

    • Re:how long (Score:5, Funny)

      by avandesande ( 143899 ) on Friday April 11, 2003 @10:46AM (#5710419) Journal
I hope it never is like spoken languages. I can hardly understand what my wife wants when I talk with her, so why would a computer? Spoken languages are ambiguous.
    • You're not a programmer, are you?
    • Not long... (Score:4, Informative)

      by MosesJones ( 55544 ) on Friday April 11, 2003 @11:02AM (#5710542) Homepage

In fact never. Because while they're okay for humans, human languages have a few problems:

1) Redundancy, far too many ways to say or do one thing

      2) Ambiguity, "1 may be equal to x" "Sometimes allow the user to do this if they aren't doing something else that might conflict"

So what you might get is a restricted language with restricted terms that could help. But even these tend to fall down: the first UML spec was written using such a language, but this was abandoned for the more formal UML notation because the inherent ambiguities of natural language couldn't be overcome.

      So basically you might have some mechanism of translating from formal into informal but the real work will be done in a formal manner, as now, as ever because at the end of the day....

      Who wants to rely on a system that implements "sometimes" ?
      • Re:Not long... (Score:4, Insightful)

        by bmj ( 230572 ) on Friday April 11, 2003 @11:16AM (#5710666) Homepage

In fact never. Because while they're okay for humans, human languages have a few problems

Well, yeah, but doesn't a computer language suffer from the same pitfalls? If that isn't the case, why do languages tend to "evolve" over time? Why are new languages that borrow elements from other languages so prevalent?

1) Redundancy, far too many ways to say or do one thing

Isn't one of the driving principles of Perl "There's more than one way to do it"? Some say this is one of Perl's best features, others say it sucks.

I won't argue with the point of ambiguity. You can remove ambiguity from a "spoken" language by applying rules to it. I do think we're quite far away from being able to "speak" a program, but that's because we as a culture have moved away from a _grammar_ of English. Check the courses in a university and see what first-year English and Linguistics students are taking. It's not Grammar, it's Grammars. Standard written English is a thing of the past. So we won't base a language on how we actually use our language, but we could base a language on certain grammars of the language. And isn't that something else that languages like Perl and Python try to do? They try to create more "readable" programs?

        • by MosesJones ( 55544 ) on Friday April 11, 2003 @11:40AM (#5710879) Homepage

English actually doesn't really have a written Grammar, BTW. English was the language of the poor people, not of the gentry, so it evolved as a loosely ruled language rather than as a language with definite constructs like prescriptive Latin or modern German.

Basically English is the language of plebs; the rich and diplomats spoke French. The idea of a grammar was retrofitted by the Victorians, who applied Latin rules to English which just don't fit.

Let's put it this way: in English you can screw with the language as much as you want, and it continues to change every year. This is fine, as it makes it a rich communication tool.

          What other languages can use one word to make an entire sentence ?

          F*ck's F*ckers F*cking F*cked
          • What other languages can use one word to make an entire sentence ?

            Latin:

            Malo
            malo
            malo
            malo

            The colloquial translation:

            "Oh I would rather be

            In an apple tree
            Than a naughty boy
            In adversity."

And although Latin is inflected while English is only partially inflected, all of the words are identical!

            So HAH. ;-)

            This sort of poetry is common in obfuscated C contests, although the visual lack of distinction between 0 and O and I and 1 is also commonly needed.

            -Billy

    • Re:how long (Score:4, Interesting)

      by yasth ( 203461 ) on Friday April 11, 2003 @11:10AM (#5710620) Homepage Journal
As anyone who has worked on Natural Language Processing can tell you, natural language is a bugger. It is very context driven, and to top it all off has a good deal of redundant syntax (a, the, subject-verb agreement, etc.). Human language is a very nice protocol for transferring ideas (it is in many ways a system designed to transmit through noisy environments by many users, all of whom differ in their individual implementation of the standard). Natural spoken language is less good at commands, and is particularly bad for unsupervised commands.

      For unsupervised commands humans tend to create something not all that different from code. A fixed set of grammar and vocabulary come into play (i.e. little slang, and very normalized style). For example:

Employees will update their status on the In/Out board in the lobby when they will be gone for more than 15 minutes.

which is roughly:

(if (> (expected-completiontime task) 15)
    (update-status out))

      So the need and utility isn't there.
    • Re:how long (Score:3, Interesting)

      by Anonymous Coward
      I thought Chomsky had a lot to say about this.

      Structurally, spoken languages and computer languages are very similar:

      Phonetics: sounds
      Phonology: sounds in relation to one another
      Morphology: words
      Syntax: structure (words in relation to one another)
      Semantics: meaning
      Pragmatics: meaning in context.

Morphology, Syntax and Semantics are shared by human and computer languages. Arguments could be made about phonology, too, but not by me. Some computer languages might even have pragmatics. (Example of pragmati
  • Aliens (Score:3, Funny)

    by GnuVince ( 623231 ) on Friday April 11, 2003 @10:36AM (#5710344)
    I liked the part about aliens:

    Presumably many libraries will be for domains that don't even exist yet. If SETI@home works, for example, we'll need libraries for communicating with aliens. Unless of course they are sufficiently advanced that they already communicate in XML.

Let's hope it's not Microsoft's XML, because that could cause a problem with communication: they might say "We come in peace" and then start shooting at us with lasers and everything!

  • by SystematicPsycho ( 456042 ) on Friday April 11, 2003 @10:36AM (#5710347)
When quantum computers come into the picture, a new type of programming language and a new way of thinking about computers will emerge. Bit shifting will especially be different; it will be called... QBit shifting.
  • Convergence (Score:2, Insightful)

    The evolution of languages differs from the evolution of species because branches can converge...

    For species branches can converge too - it's just kind of weird...
  • dead-end? (Score:2, Insightful)

    by xv4n ( 639231 )
    Java will turn out to be an evolutionary dead-end, like Cobol.

    dead-end? Java has already spawned javascript and C#.
    • Java did not spawn Javascript. That was a Netscape marketing op.
    • Re:dead-end? (Score:5, Insightful)

      by shemnon ( 77367 ) on Friday April 11, 2003 @11:26AM (#5710753) Journal
      Sorry, Wrong and Wrong.

Comparing JavaScript and Java is like comparing a shark to a dolphin: quite different actually, even though both animals live in the sea, and both languages use the letters J, A, and V. Both have cardiovascular systems and both use variables and control structures. But that is basically where the similarities end.

JavaScript actually started life inside of Netscape as LiveScript, and during the Netscape 2.0 time frame it was renamed to JavaScript to ride the Java bandwagon, but there is no relationship at all beyond that. Compile-time type safety? Java yes, JavaScript no. Prototypes? JavaScript yes, Java no. eval() of new programming code? One but not the other. Interface inheritance? Again. First-class methods? Yep, not both. Bones? Sharks no, dolphins yes (teeth don't count).

Now C# and Java: they are at best siblings, but Java did not beget C#. The namespace structure is straight from ANSI C++, and the primitive types include C-isms like signed and unsigned varieties. You don't shed a tail and then grow it back further down the trail. The comparison here is alligators and crocodiles. Very similar, but one did not beget the other; it was a closer common parent than the sharks and dolphins.
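A few of the differences listed above can be shown in a couple of lines of JavaScript (a sketch only; the Java side lives in the comments, since the two languages won't run in one file):

```javascript
// Prototypes: JavaScript lets you attach a method visible to every
// existing instance after construction; a Java class is fixed at compile time.
function Animal(name) { this.name = name; }
const shark = new Animal("shark");
Animal.prototype.describe = function () { return "a " + this.name; };

// First-class methods: a method is just a value you can pass around.
const describeShark = shark.describe.bind(shark);

// eval() of new code at runtime has no equivalent in plain Java.
const sum = eval("1 + 2");

// No compile-time type safety: a bad call like shark.fly() would only
// fail when it runs, not when it compiles.
```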
    • Yep, dead end (Score:4, Informative)

      by Anonymous Brave Guy ( 457657 ) on Friday April 11, 2003 @02:21PM (#5712119)
      dead-end? Java has already spawned javascript and C#.

      ...Neither of which has done anything to advance the state of the art in programming languages, even if your claim were true.

      The one thing I have confidence in about programming in the future is that sooner or later, the tools and techniques with genuine advantages will beat the "useful hacks". Java, C++, VB and their ilk are widely used today because they can get a job done, and there's not much better around that gets the same job done as easily.

      Sure, there are languages that are technically superior, but they're so cumbersome to use that no-one really notices them, and when they do, you don't have the powerful development tools, the established code base of useful libraries, the established user base of developers to hire, etc. When we get to the point that languages with more solid underlying models catch up on ease of use, then we'll relegate the useful hacks to their place in history as just that. Until then, we'll keep using the useful hacks because we have jobs to do, but don't expect the tools of the future to be built on them.

Why did 'Moore's law' have to sneak into this article? A little bit of 'writer's cruft'.
  • AI (Score:2, Interesting)

    by Anonymous Coward
In 100 years, I would expect computers to be writing their own code. And rewriting it again to evolve.
  • by SeanTobin ( 138474 ) <byrdhuntr AT hotmail DOT com> on Friday April 11, 2003 @10:43AM (#5710394)
    Who will design the languages of the future? One of the most exciting trends in the last ten years has been the rise of open-source languages like Perl, Python, and Ruby. Language design is being taken over by hackers. The results so far are messy, but encouraging. There are some stunningly novel ideas in Perl, for example. Many are stunningly bad, but that's always true of ambitious efforts. At its current rate of mutation, God knows what Perl might evolve into in a hundred years.
    • by SeanTobin ( 138474 ) <byrdhuntr AT hotmail DOT com> on Friday April 11, 2003 @11:08AM (#5710601)
      What will perl look like in 100 years?
#!/usr/bin/perl
#
# Hello_world.pl
#

use uberstrict;
use all_warnings;
use diagnostics_and_repair;
use linux::registry;

use language_id qw(language);
use DBI;

my $dbinfo = new linux::registry;
my $dbh = DBI->connect(
    "DBI:"
    . $dbinfo->{database}->{type} . ":"
    . "hello_world"
    . ";host="
    . $dbinfo->{database}->{host} . ";",
    $dbinfo->{database}->{username},
    $dbinfo->{database}->{password}
) or die "Severe configuration error: " . DBI->errstr;

my $lang_query = qq(SELECT `translated_text` FROM `hello_world` WHERE `language`=? LIMIT 1;);
my $query = $dbh->prepare($lang_query);
$query->execute(&language);

my @output = $query->fetchrow_array();

print $output[0];

exit or die "exit failed";
      • by lostboy2 ( 194153 ) on Friday April 11, 2003 @12:00PM (#5711028)
        What will perl look like in 100 years?

        Ha. I always thought it would look more like:
#!/CowboyNeal/bin
use CowboyNeal;

my $CowboyNeal;

foreach $CowboyNeal (@ARGV) {
    $CowboyNeal || die "You insensitive clod!";
    $CowboyNeal =~ s/[^CowboyNeal]/CowboyNeal/gi;
    push @_, $CowboyNeal;
}
print @_;
  • Why Change? (Score:3, Insightful)

    by jetkust ( 596906 ) on Friday April 11, 2003 @10:43AM (#5710398)
    Languages will change when computers change. Languages are driven by machine instructions which are mathematical operations done in sequence. If this doesn't change in 100 years, why would we not use C in 100 years?
  • The horror (Score:4, Funny)

    by Chagatai ( 524580 ) on Friday April 11, 2003 @10:44AM (#5710402) Homepage
    If the children are our future, then they will be designing the future languages. This is horrible. Can you imagine the future code?

    VIOD THING (OMFG!!!1 LOLOLOLOOL!!!)
    INIT HAX0R N00B!!!
    WHIEL STFU DO
    GOTO 10
    DOEN

  • Awareness... (Score:5, Insightful)

    by dmorin ( 25609 ) <dmorin@@@gmail...com> on Friday April 11, 2003 @10:46AM (#5710422) Homepage Journal
    I know that's a scary word because it sounds like "self-aware". But I expect that in 100 years one of the inherent aspects of any computer language will be in detecting and working with other devices in a robust manner. In other words, being aware of what is around the programmed device. Not requiring a mandatory connection of type X. Instead I'm thinking about a device that can run just fine by itself, and then if another device of the same sort happens to come within 10 feet, then maybe they automatically attempt some sort of handshake (with encryption up the wazoo, of course) and then have the option of communicating. This would be useful for automatic transmittal of business cards, appointment schedules, and so on. Or it could be more of a client/server thing, where devices that do not have the power to get a certain job done will just naturally plug into "the grid" and request more power. The device won't have to deal with where the computing power comes from or how it is distributed.

    Imagine cars that, before changing lanes, signal to the surrounding cars' navigation systems and they work out for themselves how to let the car into the lane. A computer can be told to slow down, rather than speed up, when someone wants to change lanes. Or detectors in the dotted yellow lines that sense when you changed lanes without signalling, and alert the traffic authority to bump your points (ala Fifth Element).

    I always liked the idea of my PDA phonebook being more of a recently-used cache of numbers instead of a local store. I just punch up a number. If it's one of my commonly used ones, it comes right up (and dials, of course). But if it's not, then my PDA connects to the phone company, gets the information (and probably pays the phone company a micropayment for the service) and now I have that number locally on my PDA until it gets scrolled off if it's not used much.

Also I expect lots of pseudo-intelligent content-filtering software. You'll get 1000 emails a day and your spam filter will not only remove 99% of them, but it will also identify and prioritize the remaining ones. In order for this to be useful there need to be languages that deal with expression of rules and logic in a meaningful way (far more than just and/or/not). No one 100 years from now will say "if subject ~= /*mom*/" (or however the hell you say it); they will expect to say "Give email from mom a higher priority", or something very close.
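A rule layer like that could be sketched today; here is a minimal JavaScript version, where `prioritize` and `priorityOf` are invented names for illustration, not any real mail API:

```javascript
// Registered rules: each takes a message and returns a priority or null.
const rules = [];

// Reads closer to "give email from mom a higher priority" than a regex does.
function prioritize(sender, priority) {
  rules.push(msg => (msg.from === sender ? priority : null));
}

// Apply the first matching rule; everything else gets normal priority.
function priorityOf(msg) {
  for (const rule of rules) {
    const p = rule(msg);
    if (p !== null) return p;
  }
  return "normal";
}

prioritize("mom@example.com", "high");
```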

  • by Mxyzptlk ( 138505 ) on Friday April 11, 2003 @10:48AM (#5710430) Homepage
    When I say Java won't turn out to be a successful language, I mean something more specific: that Java will turn out to be an evolutionary dead-end, like Cobol.

    Er... I don't think that Cobol is an evolutionary dead-end; in the best world, it would be extinct, but it isn't. What makes a language widely used is something that we can't predict right now - we have to watch it evolve over time, and as it grows and matures look at different aspects.

Take architecture, for example - new buildings are loved for the first five years because of their freshly introduced ideas. After that, all the problems start to appear - mildew problems, asbestos in the walls, and so on. During the next ten years, these teething problems are fixed. It is only a HUNDRED YEARS after the new building (or in our case, the new programming language) appears that it can be properly evaluated. The language/building has then either been replaced, or it has survived.

    So - the only proper way to measure the successfulness of a programming language is to measure its survivability. Sure, we can do guesstimates along the way:

    During introduction: Does the language have a good development environment? Is the language backed/introduced by a market leader?

    Somewhere during the "middle years" (after about ten years): Does the language have a large user base? Does the language have a large code base?

    After twenty/thirty years: ask the programmers if it really is maintainable...

Well - you get the picture! Predicting the survivability of something more than five years into the future is impossible, I'd say.
  • Waste of Time (Score:3, Insightful)

    by tundog ( 445786 ) on Friday April 11, 2003 @10:50AM (#5710449) Homepage
The author starts by describing the effect of Moore's law on computing power (i.e. computers will be wicked fast) and then starts ranting about how today's constructs are so inefficient, then admits that inefficiency won't really matter because computers will be wicked fast (and it takes him half the article to impart this wisdom).

    huh!?!?

    This is the kind of mental constipation that is better left for blog sites.

Somewhere there is a parallel between the logic in this article and the dot.bomb business model.

  • No matter what the form of the language, its name shall be FORTRAN.
    -russ
  • History and Future (Score:5, Interesting)

    by AbdullahHaydar ( 147260 ) on Friday April 11, 2003 @10:51AM (#5710458) Homepage
    This is a really interesting paper [unc.edu] on the history and future of programming languages. (Check out the history chart in the middle....)
  • I think that it would be better to call this article "Where Programming is headed" rather than "The Hundred-Year Language". He tries to justify how he can predict the language 100 years into the future...

    It may seem presumptuous to think anyone can predict what any technology will look like in a hundred years...Looking forward a hundred years is a graspable idea when we consider how slowly languages have evolved in the past fifty.

Hmm...funny, fifty years ago, if I remember my history (since I wasn't alive back then), those relay computers needed rolls and rolls of ticker-tape punch holes to compute math. The language was so low-level...even x86 Assembly would have been a godsend to them. And he considers something like Object-Oriented Programming a slow evolution?

    All he's doing in the article is predicting what languages will be dead in the future, and which languages won't be. For example, he says Java will be dead...

    Cobol, for all its sometime popularity, does not seem to have any intellectual descendants. It is an evolutionary dead-end-- a Neanderthal language...I predict a similar fate for Java.

    I'll not go there, because predicting the demise of Java is opening another can of worms. But let's just say that he really doesn't support his argument with anything other than anecdotal opinion.

    I say read his article in jest, but don't look too deep into it.
    • From http://www.legacyj.com/cobol/cobol_history.html:
In 1952, Grace Murray Hopper began a journey that would eventually lead to the language we know as COBOL.

Fortran dates to 1954.

So, there are 50 years of computer language history.

    • > And he considers something like Object-Oriented Programming a slow evolution?

      When you consider that it is just a metaphor for refinements of pre-existing ideas such as data hiding, which in turn are refinements of pre-existing ideas such as structured programming, which in turn are refinements of pre-existing ideas such as "high level" programming languages, ...yes, it has been a slow evolution.

      See past the hype.

    • by Anonymous Coward on Friday April 11, 2003 @11:51AM (#5710956)
no, you don't remember your history very well at all.

      using round numbers, he is talking about the fifties, although really he probably wants to include the sixties.

So what did we have? Among others: Fortran, which is still around and has influenced many designs. Algol, which begat C, Java, C++, and C#. Lisp, which introduced FP and most (certainly not all) of the interesting ideas that somewhat mainstream languages like Python, Ruby, and Perl are starting to pick up on 30+ years later.

      I know you weren't paying attention, but OO came in the 60's, and was developed *far* beyond anything seen today in mainstream production languages by the early 80's. (smalltalk, New Flavours, CLOS)

      Most of what mainstream programmers think of as the history of language ideas is complete drek, because they make the mistake of thinking that the first time they see a company hyping an idea has any relationship to when the idea was arrived at.

If you had actually read the quoted sentence for comprehension, you would understand that he didn't say that Java would be dead; he said that it was an evolutionary dead end.

      Not the same thing. Java is a fairly direct evolutionary descendant of Algol. Cobol, a contemporary of Algol, has no evolutionary descendants.

      What he said is that the languages of 100 years from now will not *descend* from Java, any more than the languages of today descend from Cobol. I wouldn't be surprised if there were Java programs around in 100 years, but that is the nature of legacy systems, not an interesting insight.
    • by MobyTurbo ( 537363 ) on Friday April 11, 2003 @03:40PM (#5712732)
      It may seem presumptuous to think anyone can predict what any technology will look like in a hundred years...Looking forward a hundred years is a graspable idea when we consider how slowly languages have evolved in the past fifty.
      Hmm...funny, fifty years ago, if I remember my history (since I wasn't alive back then), those relay computers
Actually, relay computers were 1930s. They were using vacuum tubes in the late 40s, and were less than a decade away from transistors fifty years ago; not that they weren't just as primitive.
      needed rolls and rolls of ticker-taped punch holes to compute math.
Punch cards were a limit IBM placed on the technology because IBM thought compatibility with their previous non-computer automated machines that used punch cards would be a big plus in selling to existing clients. Actually IBM's competition, using magtape, had a better form of input/output; IBM set back the computer industry years in doing this.
      The language was so-low-level...even x86 Assembly
One thing you've got to understand about Paul Graham is that he is, for better or for worse, a big fan of LISP; a language that began in 1958 and is still used in Artificial Intelligence and other things (like Orbitz and Paul Graham's own Yahoo! Store) today. Since LISP has a lot of ability to use abstraction, and object-oriented programming is a narrower level of abstraction, it does seem that OO isn't so revolutionary. (Even if you go by OO history alone, without comparing it with LISP, it is over 30 years old - Simula was written in the late 60s.)
  • by dmorin ( 25609 ) <dmorin@@@gmail...com> on Friday April 11, 2003 @11:00AM (#5710530) Homepage Journal
    I think that the question of whether natural language is the "way to go" misses out an important distinction. There will always be users of technology, and creators of new technology, and they must speak different languages. I do not need the same skills to drive a car as I do to build an engine. Being able to type does not make me a novelist. There are two different cultures at work.

    Having said that, I expect that the user language should certainly be natural language -- the "computers should understand people talk, not the other way around" argument. People know what they want out of their machines, for the most part. Whether it is "change my background to blue and put up a new picture of the baby" or "Find me a combination of variables that will result in the company not failing with a probability of greater than 90%", people want to do lots of things. They just need a way to say it. Pretty much every Star Trek reference you'll ever see that involves somebody talking to the computer is an input/output problem, NOT the creation of a new technology.

    It's when you build something entirely new that you need a new, efficient way to say it. Anybody remember APL? Fascinating language, particularly in that it used symbols rather than words to get its ideas across (those ideas primarily being focused on matrix manipulation, if I recall). Very hard for people to communicate about APL because you can't speak it. But the fact is that for what it did, it was a very good language. And I think that will always hold true. In order to make a computer work at its best, speak to it in a language it understands. When you are building a new device, very frequently you should go ahead and create a new language.

  • OOP (Score:3, Insightful)

    by samael ( 12612 ) <Andrew@Ducker.org.uk> on Friday April 11, 2003 @11:05AM (#5710567) Homepage
    I don't predict the demise of object-oriented programming, by the way. Though I don't think it has much to offer good programmers, except in certain specialized domains, it is irresistible to large organizations.


Where OOP comes into its own, in my experience, is with GUIs. The ability to say:

    If ThisScreen.Checkbox.IsTicked
    ThisScreen.OkButton.Disabled = True
    Endif

is immensely useful. Similarly, the ability to change the definition of your master screen template and have all of the other screens take on its new properties is something that OOP is designed to allow you to do.

Similarly, anything where you tend to access things that act like objects in the first place suits it. Being able to say

    CurrentDocument.Paragraph(1).Bold= True

    or

    Errval=MyDatabase.SQL("Select * from mytable where name='Andrew'")
    Print MyDatabase.RecordCount

    has made my life easier on numerous occasions. There are certainly non OO methods of doing the same thing, but I've never found them as flexible.

People who insist on making _everything_ an object, on the other hand, are idealists and should be carefully weeded from production environments and placed somewhere they'll be happier, like research.
  • Notation (Score:4, Insightful)

    by hey! ( 33014 ) on Friday April 11, 2003 @11:06AM (#5710577) Homepage Journal
Lisp was a very early, successful language, because it was close to a mathematical notation and easy to implement on primitive computers. I think the author expects Lisp to remain a vital evolutionary branch because of its mathematical roots.

    I'm not too sure though.

    A programming language is a notation in which we express our ideas through a user interface to a computer, which then interprets it/transforms it according to certain rules. I expect that a lot will depend upon the nature of the interfaces we use to communicate to a computer.

For example, so far as I know people never programmed in Lisp on punch cards; it doesn't fit that interface well. It was used on printing terminals (for you young'uns, these were essentially printers with keyboards). Lisp fit this interface well; Fortran could be programmed either way.

If you look at language development as an evolutionary tree, Python's use of whitespace is an important innovation. However it presupposes having sophisticated syntax-aware editors on glass terminals. It would not have been convenient on printing terminals. Perhaps in 2103 we will have "digital paper" interfaces that understand a combination of symbols and gestures. In that case whitespace sensitivity would be a great liability.

In my mind the biggest question for the future of languages is not how powerful computers will be in one hundred years, but what will be the mechanics of our interaction with them? Most of our languages presume entry through a keyboard, but what if this is not true?

    • Re:Notation (Score:4, Insightful)

      by rabidcow ( 209019 ) on Friday April 11, 2003 @02:30PM (#5712184) Homepage
I think the author expects Lisp to remain a vital evolutionary branch because of its mathematical roots.

      I think he expects it to remain a vital branch because recent languages have been more and more like lisp. If lisp doesn't directly beget new, highly popular languages, lisp's features will be (and have been) absorbed into whatever does become popular.
  • What about ASM? (Score:3, Interesting)

    by siliconwafer ( 446697 ) on Friday April 11, 2003 @11:08AM (#5710596)
    Upper level languages will change. But what about Assembly? What about programming for embedded systems?
    • Re:What about ASM? (Score:3, Interesting)

      by BenjyD ( 316700 )
      In a hundred years' time, the cost of processing power will likely be *much* lower. The price difference between putting a 1 MIPS or a 10 GigaMIPS processor in a toaster will be on the order of a fraction of a penny.

      The few remaining areas for ASM programming - embedded, SSE-like optimisations - are being eroded gradually as processors and compilers get better.
  • My prediction. (Score:4, Interesting)

    by An Onerous Coward ( 222037 ) on Friday April 11, 2003 @11:10AM (#5710619) Homepage
    Try this on for size: in 100 years, computer languages won't exist, or at least won't be used for anything but toy programs. Programs will be created, tested, and debugged through genetic algorithms. Nobody will have programmed them, nobody will be exactly sure how they do what they do, and it will work so well that nobody really cares to find out.

    We're already at the point where it's absurd for a single person to understand the whole of a software project. Things are only going to get worse from here, and the only way out is to let the computers manage the complexity for us. As computers become faster, they'll be able to test an ungodly number of permutations of a program to see which ones perform fastest, or give the best results.

    Just a speculation. I don't wholeheartedly believe what I just said, but I think it's a bit silly to simply assume that programming languages will be around forever.
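    A toy sketch of the idea, purely illustrative: a minimal mutate-and-select loop (a (1+1) hill climber) that "evolves" a string toward a target, with the fitness function standing in for a test suite scoring candidate programs. All names here are made up for the example.

```python
import random

# The "program" is just a string; fitness counts matching positions,
# standing in for a test suite scoring a candidate program.
TARGET = "hello"
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(candidate):
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rng):
    # Change one randomly chosen character to a random letter.
    i = rng.randrange(len(candidate))
    return candidate[:i] + rng.choice(ALPHABET) + candidate[i + 1:]

def evolve(seed=0):
    rng = random.Random(seed)
    best = "".join(rng.choice(ALPHABET) for _ in TARGET)
    # Keep any mutant that is no worse; stop when all positions match.
    while fitness(best) < len(TARGET):
        child = mutate(best, rng)
        if fitness(child) >= fitness(best):
            best = child
    return best

print(evolve())  # prints "hello"
```

    Nobody "wrote" the final string; it fell out of mutation and selection. Real genetic programming evolves program trees the same way, just with a far more expensive fitness function.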
    • Re:My prediction. (Score:3, Insightful)

      by zCyl ( 14362 )
      In 100 years, computer languages won't exist, or at least won't be used for anything but toy programs. Programs will be created, tested, and debugged through genetic algorithms.

      I think this is sort of like saying 50 years ago that programming won't exist in the year 2000 because almost no one will use machine code anymore.

      I fully expect that in 100 years, computers will be able to do much of what we currently consider programming faster than humans can. The act of programming, and thus programming lang
    • If the computer generates the runtime code automatically (I don't necessarily agree with using GAs; in fact, it seems there are lots of search algorithms out there that usually outperform GAs, though GAs do seem to work), then the question becomes one of testing it. Therefore, programming becomes not writing the code, but specifying what the code has to do in some way, and then the computer writes the code to match.

      So say you want a chess program. You feed in the rules of the game in a special language, and it

  • by unfortunateson ( 527551 ) on Friday April 11, 2003 @11:14AM (#5710644) Journal
    The article seems a bit naive about data structures and their evolution into objects.

    Strings aren't lists, they're structures.

    Most string use in programs is a holdover from teletype-style programming, where all you could display was a short (ahem) string of characters. Today's string use is as a label for a data item, a menu item on a menu, a data object in a domain.

    XML -- as clunky as it can seem -- and XUL in particular, are ways of describing a user interface to a system as a tree of objects.

    So I don't want lists of characters, I want associative structures of objects which can be of many different types, used in the manner required by the program (it's a string, it's a number, it's a floor wax, it's a dessert topping).

    I'm trying really hard to avoid saying "object-oriented," but objects will become more complex and more abstract. Computers of the future may not have to worry about pixels in an image, but rather know the object itself, where a bitmap is just an attribute of the thing.

    Perhaps driver- and compiler-writers will still need stripped-down languages for efficient access to hardware, but as an app programmer and end user, I want the computer to handle statements like,

    BUY FLOWERS FOR ANNIVERSARY

    Currently, that would be something like
    event("Anniversary").celebrate.prepare.purchase($flowers)

    That's not nearly abstract enough.

  • by MickLinux ( 579158 ) on Friday April 11, 2003 @11:19AM (#5710696) Journal
    The evolution of languages differs from the evolution of species because branches can converge. The Fortran branch, for example, seems to be merging with the descendants of Algol. In theory this is possible for species too, but it's so unlikely that it has probably never happened.

    Ummm... how about lichen? Our mitochondria? What about the parasitic relationships that become mutually beneficial, such as the numerous bacteria in our gut and on our skin, which eventually become necessary for life?

    Merging actually does happen -- it just doesn't happen in the way he was thinking, with DNA becoming identical and cross-species fertility occurring. Rather, the two organisms live closer and closer together, until they merge.

    Come to think of it, although it isn't on the species level, the concept of merging species isn't too different from sexual reproduction.

  • by Saige ( 53303 ) <evil.angelaNO@SPAMgmail.com> on Friday April 11, 2003 @11:22AM (#5710730) Journal
    Interesting article, but I think it had a serious flaw - by assuming that programming languages in the future are just going to extend the current model even further.

    Some of us working in the telecommunications industry are already familiar with SDL (Specification and Description Language) [sdl-forum.org] as a tool for designing and auto-coding software. Yes, auto-coding. The SDL design software lets us design a system graphically, breaking it up into sub-components, specifying message flows between those components, and defining state machines for handling those messages.

    Developing software in this manner usually requires very little coding, as the design tool will turn the design into code. Coding may be required for interfacing with the OS or other entities, though that's improving also.

    I'm starting to think that as such tools mature, they're going to be the next step up, the way programming languages were the step up from coding in assembly. They are less efficient, just as BASIC or C is less efficient than pure assembler, but they allow greater focus on a solid and robust design and less need to focus on repetitive details.

    Imagine being able to take out the step of having to go from a design to code - focus on the design, and you're done.
  • by mattsucks ( 541950 ) on Friday April 11, 2003 @11:32AM (#5710807) Homepage
    In the world 100 years from now, you don't program the computer ... the computer programs YOU!
  • by Jagasian ( 129329 ) on Friday April 11, 2003 @11:39AM (#5710865)
    I think it's important not just that the axioms be well chosen, but that there be few of them. Mathematicians have always felt this way about axioms-- the fewer, the better-- and I think they're onto something.


    Anyone who has studied theoretical computer science and/or programming languages knows that such reductionism is a fallacy. "...the fewer, the better..."

    It turns out that it's better to strike a balance: you make the formal mathematical system (which is what a programming language is, after all) as simple as possible, until you reach the point where making it simpler makes it more complicated. Or in other words, where making it simpler would cloud the mathematical structures that you are describing.

    Here are some examples of reductionism gone too far: the Sheffer stroke, X = \z.zKSK, one-instruction assemblers, etc...

    The only logical connective you need is the Sheffer stroke... but that's of no use to us, as it is easier to use more connectives such as conjunction, disjunction, implication, and negation.
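    The Sheffer stroke point, made concrete in a rough Python sketch: every other connective really is definable from NAND alone, but the encodings are noticeably clumsier than just having AND/OR/NOT built in.

```python
# NAND (the Sheffer stroke) as the sole primitive connective.
def nand(a, b):
    return not (a and b)

# Everything else encoded in terms of it.
def not_(a):       return nand(a, a)
def and_(a, b):    return nand(nand(a, b), nand(a, b))
def or_(a, b):     return nand(nand(a, a), nand(b, b))
def implies(a, b): return nand(a, nand(b, b))

# Check the encodings against the built-in connectives.
for a in (False, True):
    for b in (False, True):
        assert not_(a) == (not a)
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert implies(a, b) == ((not a) or b)
```

    Minimality works, but readability suffers -- which is exactly the balance being argued for.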

    The only combinator you need is X, and you can compute anything... but making use of other combinators -- or better yet, the lambda-calculus -- is more useful.

    Point is that we need more powerful tools that we can actually use, and there is no simple description of what makes one tool better than another. Applying reductionism can result in nothing special.

    The true places to look for what the future brings with regards to programming languages are the following:

    1. Mobile-Calculi: pi-calculus, etc...
    2. Substructural Logics: linear-logic, etc...
    3. Category Theory: It is big on structure, which is useful to computer scientists.

    • The point he was trying to make was that the building blocks used to build up the language must be as simple as possible.

      Thus, in the core of the language you don't need to build in the ability to do multiplication if you have built in the ability to do addition. Multiplication is just a special case.

      However, you then add another layer to this simple core. In that layer you provide functionality for multiplication, subtraction etc.

      The key here being that the layer will have been written in the lang
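    A minimal sketch of the layering described above, assuming a hypothetical core whose only primitive is addition: the library layer defines multiplication (here just for non-negative integer multipliers) in terms of it.

```python
# Stand-in for the core language's single primitive operator.
def add(a, b):
    return a + b

# Library-layer multiplication: n repeated additions of a.
def multiply(a, n):
    total = 0
    for _ in range(n):
        total = add(total, a)
    return total

print(multiply(6, 7))  # 42
```

    The core stays tiny; the convenience lives in a layer written in the language itself.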
  • strings (Score:3, Interesting)

    by roskakori ( 447739 ) on Friday April 11, 2003 @11:44AM (#5710911)
    from the article:
    Semantically, strings are more or less a subset of lists in which the elements are characters. So why do you need a separate data type? You don't, really. Strings only exist for efficiency.

    i think strings mainly exist because of usability considerations - from the developers point of view. they provide a compact notation for "list of characters". furthermore, most languages come with string routines/classes/operators that are a lot more powerful and flexible than their list-equivalent.

    efficiency definitely is a consideration, but not the main one.
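    A small illustration of that point: a string really can be treated as a list of characters, but the dedicated type is what carries the compact notation and the richer operations.

```python
# Semantically, a string is just a list of characters...
chars = list("hundred-year")
assert "".join(chars) == "hundred-year"

# ...but the dedicated type carries the usability: compact literals
# plus operations the generic list type doesn't offer.
s = "hundred-year"
print(s.upper())            # HUNDRED-YEAR
print(s.split("-"))         # ['hundred', 'year']
print(s.replace("-", " "))  # hundred year
```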

  • LISP in 100 years (Score:4, Interesting)

    by axxackall ( 579006 ) on Friday April 11, 2003 @11:55AM (#5710991) Homepage Journal
    in 100 years, LISPers will finally agree on different shapes of brackets. In fact they will accept that something like (defbracket {} metafunctor ...) will make possible something like this: (abc {x y z})

    No need to mention that they will agree on operators: (defop + a b (+ a b))

    That was a joke, and you can do similar things even today. Seriously, I very much agree with these three quotes:

    • "Lisp is a programmable programming language." - John Foderaro, CACM, September 1991;
    • "Lisp isn't a language, it's a building material." - Alan Kay;
    • "Greenspun's Tenth Rule of Programming: any sufficiently complicated C or Fortran program contains an ad hoc informally-specified bug-ridden slow implementation of half of Common Lisp." - Phil Greenspun;
    Thus, I think that if the underlying language for most OS components were something like LISP, then the whole concept of programming would be different. It could not happen before now, limited as it was by available hardware performance and the quality of LISP implementations. But the same thing was true of Java.

    So, if there is a commercial effort to push LISP to the market again as an underlying metalanguage then, if not in 100 years then in about 2, we may see all programming languages being "LISP-derived". Add that LISP syntax is semantically much better than XML, but still just as parser-unified. The only problem with LISP today is that it's not as "distributed" as Erlang. Fix that and you'll get the language of the nearest future.

    ---

    I don't know the future. I barely remember the past. I see the present very blur. Time doesn't exist. The reason is irrational. The space is irrelevant. There is no me.

    • So, what you are saying is that in 100 years, we will still be using Emacs for everything but typing source code and word processing.
  • by billtom ( 126004 ) on Friday April 11, 2003 @11:56AM (#5710998)
    Hey, I actually read the article and there's a key point that Graham makes that I don't agree with.

    He makes the point to separate the details of a language into "fundamental operators" and "all the rest" then goes on to say that languages which last and have influence on future languages are the ones that minimize the number of fundamental operators. And then gives examples of things that are fundamental operators in many languages that he feels we don't need (e.g. strings, arrays, maybe numbers).

    He doesn't have much to say about "all the rest". Presumably he would move strings into "all the rest", since we would still want our languages to have functions to manipulate strings (if you think I'm ever going to write a string tokenizer function again, you've got another thing coming).

    But I think that the basic concept of splitting a language into these two parts is fundamentally flawed. The line between the core of the language and all the accompanying libraries of code has broken down completely. It was already falling apart in C (does anyone program C without assuming that the standard I/O library is available?). But with Java and C# the distinction is almost completely gone. Programming languages have become complete environments where you can assume that tons of libraries are naturally going to be available. And separating out a language's "fundamental operators" from "all the rest" is an artificial division that doesn't really work.
  • Java bad? (Score:3, Interesting)

    by MrBlue VT ( 245806 ) on Friday April 11, 2003 @12:38PM (#5711319) Homepage
    I found it interesting that right at the outset he dismissed Java as an "evolutionary dead-end" with no explanation of that comment in the whole article.

    The points he makes about what the good languages are seem to show that Java is indeed a good language. Specifically it has an additional layer that allows for abstraction from the hardware/operating system for portability. It takes care of mundane details for the programmer (garbage collection, no need to worry about dealing with memory directly, etc).

    Basically the article seemed to repeat itself a lot and show that Java does indeed have a lot of good qualities that he thinks will be in future languages. He also dismisses object-oriented programming as the cause of "spaghetti code" without giving any justification for that statement. Finally, he slips in a nice ad hominem attack by saying any "reasonably competent" programmer knows that object-oriented code sucks.

    I think the author's own biases hurt his argument greatly.

  • by Sanity ( 1431 ) on Friday April 11, 2003 @12:39PM (#5711332) Homepage Journal
    His claim that Java will be an evolutionary dead-end is already wrong, since Microsoft's C# is clearly heavily inspired by Java.
  • by MrBandersnatch ( 544818 ) on Friday April 11, 2003 @12:55PM (#5711461)
    However, it's interesting waffle and does have some good points. I have to disagree that if we had the languages of 100 years hence, programmers would be able to use them. I still remember just how hard C++ was for those used to C, and I am amazed at just how hard peeps find XSLT to be, due to its lack of modifiable variables. Recursion just doesn't seem to come naturally to many.

    Basically the problem isn't going to be with the languages -- the problem will be with the concepts that created those language features.

  • Article Summary (Score:3, Insightful)

    by Mannerism ( 188292 ) <keith-slashdotNO@SPAMspotsoftware.com> on Friday April 11, 2003 @01:23PM (#5711725)
    Here's what I got out of it:

    Nobody really has a clue what programming languages will be like in a hundred years, but if all the Perl and Python weenies would learn LISP then maybe we could get somewhere within the next decade.
  • by AxelTorvalds ( 544851 ) on Friday April 11, 2003 @01:38PM (#5711818)
    These "language discussions" are always flawed. There are zealots who like one hammer for every nail, and then there are these other zealots so far from the actual problems that their ideas are kind of hokey.

    Hierarchy will continue to exist. It's the only tool the human brain has for dealing with complexity; call it what you will, but you classify and associate things into hierarchies whether you're aware of it or not. I see no reason to believe that processors will have more advanced instructions than they currently do; they may be very different (like optimistic registers that know values before they have been calculated, or something), but they will be of the same order of complexity. The atomic operations will probably remain at the same order of complexity in biological, quantum, or Si/GaAs/whatever transistor-based processors. I don't see how sorting a list will be done without some sort of operations to look at its elements, compare them, and then change their ordering. Even with quantum computers you have to set up those operations to happen and produce results. That being said, there will always be an assembly language.

    On top of that there will always be a C-like language, if not C itself, that serves as a portable assembly language. Then there will be "application" languages built at a higher level still. That won't change, for good reasons: it's just too complex to push the protection and error checking and everything down a level. I'll give examples if you want them. The easiest one that comes to mind is Java garbage collection: programmers assume it has mystical powers and are shocked when they fire up a profiler and see leftovers sitting around. It's a very complex piece of software, and you expect it to go down to a lower level? The lower levels have their own problems keeping up with Dr. Moore.

    I think the other biggest area is that reliability needs to go up by several orders of magnitude. Linux, BSD, Win2000 and WinXP are pretty reliable, but they aren't amazing. I've seen all of them crash at one point or another; I may have had a hand in making it happen, and so might the hardware, but either way it did. To really start to solve the issues and problems of humanity better, we need to have more trust in our computers, and that requires more reliable computers, which require different methods of engineering. The biggest thing going on in programming languages now to deal with that is functional programming. In 50 years I could see some kind of algorithm broker that has the 1700+ "core algorithms" (Knuth suspects there are about 1700 core algorithms in CS) implemented in an ML- or Haskell-like language, proven correct, in a proven runtime environment, used in conjunction with some kind of easy-to-use scripting glue. And critical low-level programming will be proven automatically by an interpreter at compile time; they are already making automatic provers for ML.

  • by Clod9 ( 665325 ) on Friday April 11, 2003 @02:28PM (#5712165) Journal
    Language structure is determined by two things:
    1. the target machine architecture
    2. the range of expression required by the programmer and/or workgroup

    Java is "successful" but it really looks a lot like Algol and Pascal,
    as does C++. The range of expression is greater in the newer languages
    (object-orientation in Java and C++) but the forte is still that of
    expressing algorithms in written form to be used on a stored-program
    digital computer.

    WILL WE STILL BE PROGRAMMING?
    Take one example -- genetic programming. If you had a programming system
    where the basic algorithm could learn, and all you had to do is set up
    the learning environment, then you'd be teaching rather than programming.
    In fact I believe THIS is what most "programmers" will be doing in 100 years. The challenge
    will be defining the problem domain, the inputs, the desired outputs; the
    algorithm and the architecture won't change, or won't change much, and the
    vast majority of people won't fiddle with it.
    But if HAL doesn't appear and we aren't all retrained as Dr. Chandra,
    I believe we'll still be handling a lot of text on flat screens.
    I don't think we'll be using sound, and I don't think we'll be using pictures.
    (see below)

    So predicting what languages will be like in 100 years is predicated
    on knowing what computers and peripherals will be like. I think progress
    will be slow, for the most part -- that is, I don't think it will be all
    that much different from how it is now.

    HOW WILL OUR RANGE OF EXPRESSION CHANGE?
    If we relied primarily on voice input, languages would be a lot more
    like spoken natural languages; there would be far less ambiguity than
    most natural languages (so they'd be more like Russian than like English,
    for example) but there wouldn't be nearly as much punctuation as there
    is in Java and C++.

    If we rely primarily on thought-transfer, they'll be something else
    entirely. But I don't think this will come in 100 years.

    How is a 24x80 IDE window different from punched cards and printers?
    Much more efficient but remarkably similar, really. It would not surprise
    me if we still use a lot of TEXT in the future. Speech is slow --
    a program body stored as audio would be hard to scan through quickly.
    Eyes are faster than ears so the program body will always be stored as
    either text or pictures.

    Pictures - well, pictorial languages assume too much of the work has
    already been done underneath. "Programming isn't hard because of all
    the typing; programming is hard because of all the thinking." (Who
    wrote that in Byte a couple of decades ago?). I don't think we'll be
    using pictures. When we get to the point that we can use hand-waving
    to describe to the computer what we want it to do, again we'll be
    teaching, not programming.

    HOW WILL THE ARCHITECTURE CHANGE?
    If the target architecture isn't Von Neumann, but something else,
    then we may not be describing "algorithms" as we know them today.
    Not being up to speed on quantum computing, I can't speak to that
    example... but there are lots of other variations. Analog computers?
    Decimal instead of Binary digital machines? Hardware-implemented
    neural networks? Again, I don't see much progress away from binary
    digital stored-program machine in 40 years, and I think (barring
    a magical breakthrough) this may continue to be the cheapest, most
    available hardware for the next 50-100 years.

    SO WHAT DO I THINK?
    I think IDE's and runtime libraries will evolve tremendously, but
    I don't think basic language design will change much. As long as
    we continue to use physical devices at all, I think the low-level
    programming languages will be very similar to present day ones:
    Based on lines of text with regular grammars and punctuation,
    describing algorithms. I predict COBOL will be gone, FORTRAN will
    still be a dinosaur, and Java and C/C++ will also be dinosaurs.
    But compilers for all 4 wi
  • by mugnyte ( 203225 ) on Friday April 11, 2003 @03:44PM (#5712763) Journal
    The tongue-in-cheek references to XML, aliens, and other topics are merely amusing; there's no content there. The metaphor to evolution is lost.

    Languages are built on top of many changes in technology: connectivity, speed, machine type, concurrency, adoption.

    Plus, a language is just one codification of a problem solution. The solution can be pushed towards any one of several goals: security, speed, size, reuse, readability, etc.

    Different languages have sprung up for just these two statements above. What metric is this guy using to measure a language's popularity? LOC still running (COBOL?), steadfastness of code (C?), or CTO business choices (a zoo)...? There are so many ways to look at this; just picking a single point of view is misguided reductionism.

    We will continue to have a multitude of tools available for getting work done. Cross-breeding of concepts for languages is great, and does happen, but unless you trace decisions of specific prime movers you really can't say where a language comes from.

    Anyone can put together a new language, and even get it adopted by some audience. But what gets mindshare is languages that satisfy goals that are popular at the moment. Exploring what those goals will be is impossible, to my mind. What will be popular?

    Speech recognition? Image recognition? Concurrency? Interoperability? Auto-programming? Compactness? Mindshare?

    These questions are based on our human senses, our environment, etc. Any sci-fi reader could tell you of the concept of a "race based on musical communication", for example, that would base programming on a completely different set of goals. And so on.

    mug
  • my grade (Score:4, Insightful)

    by bob dobalina ( 40544 ) on Friday April 11, 2003 @03:45PM (#5712766)
    The signal-to-noise ratio in this piece is low. There are lots of metaphors and similes to explain his otherwise very facile points.

    He also seems to be contradicting himself. " Semantically, strings are more or less a subset of lists in which the elements are characters. So why do you need a separate data type? You don't, really. Strings only exist for efficiency. ", he says at one point, then a few paragraphs later says "What's gross is a language that makes programmers do needless work. Wasting programmer time is the true inefficiency, not wasting machine time.". The efficiency in implementing strings in programming languages is for the programmer, who doesn't have to use said "compiler advice" and carefully separate his strings from his other, non-string list instances and keep the two distinct in his programming model. Apparently it's "lame" to simplify text manipulation for programmers, but at the same time the efforts of programming language design should be towards making the programmer's life easier. Which is it? I know strings and string libraries have made my life a whole lot easier.

    Nevertheless, I'm willing to accept the notion that eliminating strings and other complex, native datatypes and structures serves to make a programmer's use of time more efficient. But how does it do it? Graham doesn't say, he just waxes nostalgic about lisp and simpler times and languages.

    I don't think the slashdot crowd needs it explained why data manipulation by the computer needn't be simplified; it already is, as machine code is binary in the common paradigm. What ought to be simplified is data manipulation by humans, and on this point Graham nominally agrees (I think). This has been the thrust of the evolution of programming from machine code to assembler to high-level languages. Simplifying high-level languages into more and more basic statements -- getting closer to the "axioms" that Graham calls tokens and grammars -- simply reverses that evolution. It makes it easier and more elegant to compile programs, but it does absolutely zero to make the programmer's life more efficient, or easy. The whole reason high-level languages were developed was precisely to get away from this enormously simple, yet completely tedious, way of programming.

    The overarching fallacy in this article is Graham's reliance on what is known about computation theory now to determine what programming languages would (and should) look like then. And while it's interesting to prognosticate on what the future would be like 100 years from now based on what we have today, it's not a reliable guide. Like Metropolis, A Trip to the Moon, and other sci-fi stories from the distant past, they're entertaining and no doubt prescient to the people of the time, but when we reach the date in question, the predictions are largely off the mark. It's somewhat laughable to think that despite our flying cars and soaring skyscrapers, we use steam engines to power our cities and make robots with eyes and mouths. Likewise, I don't think an honest, intelligent prediction or forecast of (high level) programming languages 100 years hence can occur without a firm basis, or even idea, of what assembly code would look like then. This, in turn, relies on a firm idea of what computer architecture will look like. Who knows if five (or fifty) years from now a coprocessor is designed that makes string functionality as easy to implement as arithmetic. Such an advance would completely invalidate Graham's point about strings and advanced datatypes, and in fact possibly stand modern lexical analysis on its head. Or if an entirely new model of computation comes to the fore. Even Graham himself admits that foresight is foreshortened: " Languages today assume infrastructure that didn't exist in 1960.", but he doesn't let that stop him from making pronouncements on the future of computing.

    Graham seems to be spending too much time optimizing his lisp code and not enough on his writing. This piece of code could have been optimized had he used a simile-reductor and strict idea explanations. But it's definitely a thesis worth considering, if for no other reason than mild entertainment. C-
  • by gelfling ( 6534 ) on Friday April 11, 2003 @04:01PM (#5712868) Homepage Journal
    Languages are tools used to organize syntactic rules that are then converted into machine-usable representations serving the same general purpose.

    Languages as you understand them will be as dead as the steam-powered loom in 50 years. We will have tools not based on typed letters and symbols, much as the DVD has 'replaced' live theater as the only way to 'reproduce' entertainment.
  • 25-year language? (Score:3, Insightful)

    by karlm ( 158591 ) on Saturday April 12, 2003 @04:06AM (#5715423) Homepage
    We really have no idea what computing environments will be like in 100 years, nor what the core uses of computers will be. I think hoping to use languages "along that evolutionary path" is putting the cart before the horse. I think 25 years is about as far ahead as you can reasonably hope to predict. It's like a 1-week forecast: nobody asks the weather channel for a 1-month forecast, as the margin of error becomes astronomical.

    One poster claimed quantum computing will make current languages useless. This is false. Any reasonably flexible language has space for new data types and operators. You would have to be careful not to prematurely branch based on a quantum value, as this may force you to observe its value, destroying the superposition. However, I don't doubt Fortran will be one of the first languages used in quantum computing once it gets past the assembly-language stage.

    What will languages be like in 25 years (and maybe 50 years)?

    Well, there will be a Fortran and a Lisp and a C (maybe a C++). Lisp has always had automatic garbage collection. The Fortran and the C will have optional garbage collectors. Fortran, Lisp, and C are all decent attempts at languages that are pretty easy to grasp and have huge legacy backing.

    Hopefully all of the main languages will be less machine-dependent. Fixnums, ints, longs, floats, and doubles will be the same size across platforms, wasting a few cycles if this doesn't fit the underlying hardware.

    In terms of new, novel languages, I see languages simultaneously going three ways. I foresee languages that resemble formal proofs and/or formal specifications, for use where reliability is critical. I foresee languages specialized for scientific/engineering disciplines. (Maybe Fortran, Excel, and Matlab cover all of the bases well enough, but I hope there is enough room left for improvement to drive innovation and adoption. Having a CS background, I didn't appreciate LabView's "G" language until I had an opportunity to see the ugly, ugly code scientists and engineers tend to write in Fortran.) (I can also imagine efforts to use syntaxes that better express parallelism and other features for optimizers/compilers, so we finally have a widely used scientific/engineering language that is faster than Fortran 77. I can also see more languages like the Bell Labs Aleph, designed for parallel/cluster environments.) The third direction I see languages going is scripting/prototyping-like languages that will look more like natural language. (It's too bad there isn't a cross-platform open-source AppleScript-like language yet.)

    What do I think languages should have? Languages should have garbage collection and bounds checking enabled by default (of course, optionally turned off if you really must have the performance).

    Languages should have very clean and consistent APIs. Having few orthogonal types helps make a language clean. Languages should merge character arrays and strings (arguably the Algol languages have had this for a while). If a language wants immutable strings, it should provide a way to declare an immutable variant of each fundamental type. (This is actually very useful in writing less buggy code.) Languages should strictly define the size of fundamental numeric types. (I really like Python, but it seems a huge mistake that an integer is "at least 32 bits". Allowing variation in the size of fundamental numeric types adds cross-platform bugs. If I wrote a language, the types would look like "int32", "int64", "float32", "float64", "complex32" and "complex64". We got rid of 9- and 12-bit bytes; we should further get rid of these headaches.) Having worked with lots of engineers and scientists, I would love to see complex numbers as basic numeric types that all the normal operators work on. Wrapping two doubles in an object adds function-call overhead to each numeric operation. Performance with complex numbers (and numbers in general) is one big reason a lot of the code I see is written in F
