
Peter Naur Wins 2005 Turing Award

An anonymous reader writes "The Association for Computing Machinery (ACM) has named Peter Naur the winner of the 2005 A.M. Turing Award. The award is for Dr. Naur's fundamental contributions to programming language design and the definition of Algol 60, to compiler design, and to the art and practice of computer programming. The Turing Award is considered to be the Nobel Prize of computing, and a well-deserved recognition of Dr. Naur's pioneering contributions to the field."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Saturday March 04, 2006 @04:49PM (#14850931)
    Comment removed based on user account deletion
  • Yes but... (Score:1, Funny)

    by Anonymous Coward
    Can he pass the Turing test himself?
  • by geoff lane ( 93738 ) on Saturday March 04, 2006 @04:52PM (#14850935)
    ..."Algol 60 is a great improvement on all its successors"

    Nice to see Peter getting some recognition.
    • Algol 60 is a great improvement on all its successors

      I had a look [wikipedia.org] at it and was left wondering what we have been doing with programming languages for the last 50 years. Since then we seem to have invented automatic garbage collection, standardised APIs and protocols, and OO.

      It's a shame. Is the idea of a "language" the problem? Perhaps it's time we moved on to something totally new. Don't ask me for examples, though.

      • by AuMatar ( 183847 ) on Saturday March 04, 2006 @06:16PM (#14851172)
        It isn't the idea of a language that's the problem; the idea that language matters is the problem. Any problem can be solved in any Turing-complete language. There's little to no difference between them. You're not going to write code an order of magnitude faster because you change languages. Short of having an API class you can leverage in one language and not the other, you'll be hard put to program even 10% faster, if you know the syntax of both languages equally well.

        The real problem is code reuse. 95% of what we do on a daily basis is to reinvent features available elsewhere. What we need are well designed, easy to use libraries that we can leverage and have most of the work done for us. Closed source programs are killing us, as we can't leverage off each other's work. It's like going back to the days of Newton and Leibniz and requiring all mathematicians to prove the same ideas without reference to one another's work before moving on. It's ridiculous, and it's the reason for our problems.
        • by belmolis ( 702863 ) <billposerNO@SPAMalum.mit.edu> on Saturday March 04, 2006 @08:02PM (#14851608) Homepage

          It is much more difficult to master and retain the syntax of some languages than of others, so a lot of the time you aren't going to know them equally well. In any case, I think you're just wrong about language not making a difference. It is much slower to write in a low-level language than in a high-level language. Sure, you may have mastered the syntax, but you still have to spend time and mental energy keeping track of what goes where if you don't have data structures like structs and arrays, and just adding automatic storage allocation and garbage collection saves a lot of time and bugs.

          • by AuMatar ( 183847 ) on Saturday March 04, 2006 @08:13PM (#14851642)
            I disagree. Arrays and structs can be made with quick macros, even in assembly. Think of it like an accessor function. It takes a small amount of time up front to write, not a significant effort. I write asm at about 90% of the speed I write C++.

            Garbage collection is a whole other rant; that's a complete strawman. Memory management takes a minor amount of time (almost 0), and making sure you properly null out dangling references in Java takes about as much. I find the problem to be totally different: there's a subset of programmers who just don't understand memory management. These people suck as programmers, because everything you do in programming is resource management. Memory: alloc, use, free. Files: open, use, close. Networking: connect, use, close. Having people who don't understand that pattern on your team causes work to slow down by large amounts because of their incompetence, not because of the language.
            • If you've got macros in your assembler that make structs and arrays easy, you're not writing real assembler but one of those new-fangled intermediate languages. That's a step up right there.

              Anyhow, it's the storage allocation that is the big thing. I just don't agree that it makes such a small difference. It isn't just the need to free up what you use - that's relatively easy. It's the constant checking of whether you've got enough or need to reallocate, and the sometimes complicated and error-pro

              • Why do you have to worry about increasing your string length? That should be taken care of by your string library. Let's say you want to concatenate 2 strings. You don't just use strcat() do you? The correct way to do it is to write a function that goes something like this


                string *strcat(string *str1, string *str2) {
                    if (str1->length + str2->length > str1->size &&
                        reallocstr(str1, str1->length + str2->length) == ALLOC_ERROR)
                        return NULL;
                    memcpy(str1->data + str1->length, str2->data, str2->length);
                    str1->length += str2->length;
                    return str1;
                }
                • But instead of using all these horrid C libraries, you could just write in a language which actually supports some high-level idioms.

                  And there's a lot of very useful stuff (e.g. functional programming) that you just can't do in C.

            • and making sure you properly null out dangling references in Java takes about as much.

              i always hate it when my references dangle. (so embarrassing. :)
        • Any problem can be solved in any Turing complete language. There's little to no difference between them. You're not going to write code an order of magnitude faster because you change language.

          You've got to be kidding me. There are plenty of cases where you will write code a magnitude faster if a language is changed. Can you write a web application supporting complex business logic in C? Yeah you can. But it absolutely doesn't compare to Rails, Struts, or asp.NET. A Perl program might take a few lines while
          • You've got to be kidding me. There are plenty of cases where you will write code a magnitude faster if a language is changed. Can you write a web application supporting complex business logic in C? Yeah you can. But it absolutely doesn't compare to Rails, Struts, or asp.NET

            I disagree. A quick download of a few libraries to help out (a database access library, a regex library, a better string library, maybe one or two others) and I'm ready to go. Rails is a particularly poor example- yeah, it autogenerate

        • "It isn't the idea of a language thats the problem, the idea that language matters is the problem. Any problem can be solved in any Turing complete language. There's little to no difference between them."

          I have a very nice OS kernel project for you to develop in TeX, ok?

      • I had a look at it and was left wondering what we have been doing with programming languages for the last 50 years.

        Rather than actually developing programming languages we've been doing 873 virtually identical variations on C...

        Pathetic, isn't it?

  • by Eightyford ( 893696 ) on Saturday March 04, 2006 @04:52PM (#14850938) Homepage
    I didn't think humans could win this award.
  • Me, like many readers of slashdot, also hope to pass the Turing test one day, so I congratulate him on this achievement.

    Meanwhile, in Soviet Russia, the Turing test passes you.
  • Look, Naur is mentioned right in the code!

    $ diff -Naur inftrees.c ../zlib-1.2.2.orig/
    --- inftrees.c 2005-07-10 13:38:37.000000000 +0100
    +++ ../zlib-1.2.2.orig/inftrees.c 2004-09-15 15:30:06.000000000 +0100
    @@ -134,7 +134,7 @@
         left -= count[len];
         if (left < 0) return -1;  /* over-subscribed */
     }
    - if (left > 0 && (type == CODES || max != 1))
    + if (left > 0 && (type == CODES || (codes - count[0] != 1)))
         return -1;  /* incomplete set */

    Not much of a criterion for a Turing Award, though...

  • What about Backus-Naur form?
    • What about Backus-Naur form?

      RTFA. They mention BNF as well -- though they certainly don't give it as much time and space as it deserves. ALGOL was a tremendous accomplishment, but IMO BNF was far greater still. Then again, it's open to argument that John Backus really deserves most of the credit for BNF. At one time, BNF was an abbreviation for "Backus Normal Form", and only later was Peter Naur's name added in.

      Interestingly, Naur didn't seem quite as impressed with the success of Algol 60 as many peopl

    • Re:Just Algol-60? (Score:4, Informative)

      by weg ( 196564 ) on Saturday March 04, 2006 @05:37PM (#14851086)
      BNF originally stood for "Backus Normal Form", and the name Backus Naur Form was introduced by Donald Knuth:

      @article{365140,
        author    = {Donald E. Knuth},
        title     = {Backus Normal Form vs. Backus Naur Form},
        journal   = {Commun. ACM},
        volume    = {7},
        number    = {12},
        year      = {1964},
        issn      = {0001-0782},
        pages     = {735--736},
        doi       = {10.1145/355588.365140},
        publisher = {ACM Press},
        address   = {New York, NY, USA},
      }
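      For readers who have never seen the notation in question, here is BNF essentially as the Algol 60 Report uses it, defining signed integers (angle brackets name syntactic categories, ::= reads "is defined as", and | separates alternatives):

```bnf
<digit>            ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
<unsigned integer> ::= <digit> | <unsigned integer><digit>
<integer>          ::= <unsigned integer> | +<unsigned integer> | -<unsigned integer>
```

      The recursion in the second rule is what lets three short productions describe integers of any length.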
  • For those of you who, like me, have never worked with this language, some sample code is here [monash.edu.au]

    I think I would have been driven nuts trying to find the unmatched ' in my code.
    • I think I would have been driven nuts trying to find the unmatched ' in my code.

      And you call yourself a programmer? Build a macro or some kind of simple code to check FOR you!
    • For those of you like me and have never worked with this language, some sample code is here [monash.edu.au]

      Scotty: Captain, we din' can reference it!
      Kirk: Analysis, Mr. Spock?
      Spock: Captain, it doesn't appear in the symbol table.
      Kirk: Then it's of external origin?
      Spock: Affirmative.
      Kirk: Mr. Sulu, go to pass two.
      Sulu: Aye aye, sir, going to pass two.

      No more wordy than COBOL. Seems like a cool language

  • Sir,

    I thank you for helping define structured computer programming languages. Programs were the dreams of the wireheads half a century ago. Now, if you can type, we can only hope you never see the dreaded :

    SYNTAX ERROR : GOSUB WITHOUT RETURN
    LINE 380

    Guess what language I learned to program first? :-P

    Visualize Whirled P.'s
  • Datalogy (Score:5, Interesting)

    by Peter_Pork ( 627313 ) on Saturday March 04, 2006 @05:28PM (#14851050)
    Peter Naur is an interesting character. For example, he dislikes the term "Computer Science", and prefers "Datalogy". He also gives Backus the whole credit for inventing BNF, which he calls the Backus Normal Form. I'm sure he has a better name for Algol-60...
    • Naur is hardly alone in not liking "computer science". The department where I studied CS was called "Information Science" because (as the chairman put it) computers are just instruments, not the thing being studied; do you call astronomers "telescopists"?

      But that's actually wrong. Computers are instruments, but they're not just instruments. Their existence drives the whole discipline. Leave "computer" out of the terminology and most people won't know what you're talking about. When I told people I was maj

      • Re:Datalogy (Score:2, Informative)

        by Krakhan ( 784021 )
        It's most likely for that reason that Dijkstra preferred the term "Computing Science" himself.
      • Re:Datalogy (Score:1, Informative)

        by Anonymous Coward
        Well, in Danish we actually use the term "Datalogi". And Danish people I meet seem to know what I'm studying when I say Datalogy. That is, they know it has something to do with computers.
        But that's just the recurring problem we as computer scientists meet. Ordinary people think everyone working with computers is doing the same work as Joe from the IT dept who helps them connect their Palm handheld.
        • That's cool - in Germany CS is called "Informatik". I haven't seen any students of "Computerwissenschaft" or something similar here.

          Computer science is no more about computers than astronomy is about telescopes. [Edsger Dijkstra]
          • Same in Dutch ("Informatica") and in French ("Informatique").

            I still translate my degree (which is in Dutch) to "Computer Science" because most non-German/Dutch/French people have no idea what I'm talking about if I say "Informatics" :-)

    • Re:Datalogy (Score:2, Interesting)

      by Anonymous Coward
      He's a very funny character indeed. I study at DIKU where Naur used to work, and one of the funny things he invented was "the Naur frame".

      The Naur frame is an A4 sized piece of cardboard with a hole in it like a picture frame. When he would read and grade a report he would place this frame over the report when reading it. So if your margin was too narrow the frame would cover some of your text. He would then give you a horrible grade because your report didn't make any sense.

      A rather hard way to force peopl
      • Re:Datalogy (Score:2, Interesting)

        by paradigm82 ( 959074 )
        Here are some other stories (I study at DIKU too):

        * For most reports handed in at DIKU (as at most other universities, I guess) there's an upper limit on the number of pages you can hand in. Peter Naur had the policy that if he received a report with more than that number of pages, he would simply refuse to read the excess pages.

        * For every day a report was handed in too late he would subtract one grade point. This led to complaints from the administration so the policy is now that overdue hand-ins
  • There is a famous controversy, described here:

    http://spirit.sourceforge.net/dl_docs/bnf.html [sourceforge.net]

    Some accuse this guy of bogarting the credit.
    • From TFA: John Backus, another former Turing Award winner, acknowledged Naur as the driving intellectual force behind the definition of Algol 60. He commented that Naur's editing of the Algol report and his comprehensive preparation for the January 1960 meeting in which Algol was presented "was the stuff that really made Algol 60 the language that it is, and it wouldn't have even come about, had he not done that."

      And from your own reference, Naur says "I don't know where BNF came from in the first place."
        No it doesn't. But then if you read this, it is clear that his contribution sounds so minor as to be unworthy of further discussion -- he just wanted to change some unprintable characters to printable ones.

        In a later appendix, F. L. Bauer responds to Naur's statements:

        "It is amusing to see how Peter Naur looks at the use of the Backus notation from his personal point of view. Among [other members of the committee] there was no question that we would like... a form similar to the
        • While you (and the text on the page you link to) seem to agree that Naur didn't try to steal any credit, you both use a rather harsh tone, like "contribution sounds so minor as to be unworthy of further discussion...". Nowhere did Naur claim to have invented BNF notation. He said he had done some slight modifications to it, namely changing some unprintable characters to printable characters. And as you note, those changes are slight. But what's the problem? It isn't the changes he made to BNF that earned him
  • Algol 60 Group (Score:2, Interesting)

    by JehCt ( 879940 ) *
    It's interesting that Peter Naur is being recognized 40 years later, when another Algol team member, Alan Perlis, received the first Turing Award in 1966. Here's a photo of Perlis, Naur and the other Algol 1960 conference participants. [tugurium.com]
  • by Marc Rochkind ( 775756 ) on Saturday March 04, 2006 @05:39PM (#14851091) Homepage
    1. The Report on the language used a formal syntax specification, one of the first, if not the first, to do so. Semantics were specified with prose, however.
    2. There was a distinction between the publication language and the implementation language (those probably aren't the right terms). Among other things, it got around differences such as whether to use decimal points or commas in numeric constants.
    3. Designed by a committee, rather than a private company or government agency.
    4. Archetype of the so-called "Algol-like languages," examples of which are (were?) Pascal, PL/I, Algol68, Ada, C, and Java. (The term Algol-like languages is hardly used any more, since we have few examples of contemporary non-Algol-like languages.)

    However, as someone who actually programmed in it (on a Univac 1108 in 1972 or 1973), I can say that Algol60 was extremely difficult to use for anything real, since it lacked string processing, data structures, adequate control flow constructs, and separate compilation. (Or so I recall... it's been a while since I've read the Report.)
    • > 1. The Report on the language used a formal syntax specification, one of the first, if not the first, to do so. Semantics were specified with prose, however.

      Unfortunately this is the case with all programming languages (with the exception of Standard ML).
      • 1. The Report on the language used a formal syntax specification, one of the first, if not the first, to do so. Semantics were specified with prose, however.

        Unfortunately this is the case with all programming languages (with the exception of Standard ML).

        Not all others -- for example, LISP 1.5 originally had its semantics defined in terms of actions taken by a LISP interpreter (written in LISP, of course).

        There have been a few more with formally defined semantics as well. SPARK and the current Scheme

  • I have been using his work for years. Congrats to him
    and his fantastic career.

    Hedley
  • Danes everywhere... (Score:4, Interesting)

    by weg ( 196564 ) on Saturday March 04, 2006 @05:53PM (#14851125)
    Amazing how many programming languages were actually invented by Danish computer scientists. Peter Naur (ALGOL), Bjarne Stroustrup (C++), Anders Hejlsberg (C#), and Mads Tofte contributed a good deal to SML.
  • The Turing Award is considered to be the Nobel Prize of computing, and a well-deserved recognition of Dr. Naur's pioneering contributions to the field."

    Why the heck don't the Nobel managers make a fricken Computer category? They created an Economics category even though Mr. Nobel hadn't originally set that one up.
    • Re:Nobel Games (Score:2, Interesting)

      by AuMatar ( 183847 )
      The Nobel committee has not made any awards that Alfred Nobel himself did not decide to set up. Economics is not a real Nobel prize; its official name is "The Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel" - in other words, it's a rip-off of the name. Quite fitting, given macroeconomics is a pseudo-science, that it be given a pseudo-award.
      • Quite fitting, given macroeconomics is a pseudo-science, that it be given a pseudo-award.

        As you're probably aware, more than macroeconomists receive the "The Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel". In fact, the 2005 prize went to two game theorists (and thus microeconomists). The debate about whether economics, or any subset of it, is a real "science" is not a very fruitful one (since one can define science in any number of ways, both including and excluding econ). I find
          Why I separate micro from macro: a lot of micro stuff can be tested according to principles of the scientific method. Little to none of the macro stuff can. Not to mention the large swaths of macroeconomics that point to opposite results, depending on the political leanings of the economist who designed them. Trying to get two economists to agree on a macro issue is almost impossible. If it was a real science, at least the basics would be known, tested, and proven by now.

          The existence or lack of a Nobel p

          • Re:Nobel Games (Score:3, Insightful)

            by proxima ( 165692 )
            Trying to get two economists to agree on a macro issue is almost impossible. If it was a real science, at least the basics would be known, tested, and proven by now.

            I'll grant that there are many conflicting models in macro. Many of them stem from the assumptions. For example, assuming a closed economy or an open economy. In the real world and throughout history, various countries are somewhere in between, but often closer to one or the other. Thus, choosing appropriate assumptions for the question you'
  • I just read the WikiPedia article on Alan Turing:

    In 1952, Turing was convicted of acts of gross indecency after admitting to a sexual relationship with a man in Manchester. He was placed on probation and required to undergo hormone therapy. When Alan Turing died in 1954, an inquest found that he had committed suicide by eating an apple laced with cyanide.

    Then the article mentions an urban legend:

    In the book, Zeroes and Ones, author Sadie Plant speculates that the rainbow Apple logo with a bite take
  • Wow, for a moment there, I thought that they introduced Programming as an Olympic Event in the 2006 Winter Olympics in Torino. :o
