
Peter Naur Wins 2005 Turing Award

ScuttleMonkey posted more than 8 years ago | from the celebrating-people-who-deserve-it dept.


An anonymous reader writes "The Association for Computing Machinery (ACM) has named Peter Naur the winner of the 2005 A.M. Turing Award. The award is for Dr. Naur's fundamental contributions to programming language design and the definition of Algol 60, to compiler design, and to the art and practice of computer programming. The Turing Award is considered to be the Nobel Prize of computing, and a well-deserved recognition of Dr. Naur's pioneering contributions to the field."


135 comments


Took a while, didn't it? (4, Insightful)

jcr (53032) | more than 8 years ago | (#14850931)

The designer of Algol-60 is only getting this recognition in 2006? What?

-jcr

Re:Took a while, didn't it? (4, Informative)

LiquidCoooled (634315) | more than 8 years ago | (#14850963)

This may help to explain his importance even to this day:

The Backus-Naur form (BNF) (also known as the Backus-Naur formalism, Backus normal form or Panini-Backus Form) is a metasyntax used to express context-free grammars: that is, a formal way to describe formal languages.

Taken from the Wikipedia page [wikipedia.org].
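To make that definition concrete, here is a small hedged sketch (not taken from the Algol 60 report itself) of what a grammar in BNF looks like, defining signed integers:

```bnf
<digit>   ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
<integer> ::= <digit> | <integer> <digit>
<number>  ::= <integer> | + <integer> | - <integer>
```

Each rule defines a nonterminal in angle brackets; `|` separates alternatives, and the recursive reference to `<integer>` on its own right-hand side is how BNF expresses repetition. This is exactly the notation the Algol 60 report used to define the whole language.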

Re:Took a while, didn't it? (4, Informative)

foonf (447461) | more than 8 years ago | (#14851015)

John Backus won the award in 1977 [acm.org] though, so it is quite legitimate to ask, as the original poster did, why they didn't recognize Naur sooner.

Re:Took a while, didn't it? (2, Interesting)

hopopee (859193) | more than 8 years ago | (#14852636)

Maybe it has something to do with Backus leading the project that created the FORTRAN language. You know, one of the most used languages ever, and one that is still used for scientific computing.

Re:Took a while, didn't it? (2, Interesting)

Anonymous Coward | more than 8 years ago | (#14853007)

Speaking of years, a near equivalent of Backus-Naur Form was used in the 5th century BCE (yes, some 2,500 years ago) by Panini [answers.com] to describe the grammar of the Sanskrit language.

Re:Took a while, didn't it? (0)

Anonymous Coward | more than 8 years ago | (#14850965)

They had to make sure everybody dropped the language before recognizing it. An upswing in popularity would probably have kept the language alive. Would you be the one programming in Algol? I thought so.

Re:Took a while, didn't it? (5, Funny)

0xC0FFEE (763100) | more than 8 years ago | (#14850970)

I hear the Turing committee actually has an infinite red tape.

Re:Took a while, didn't it? (1)

eclectro (227083) | more than 8 years ago | (#14851005)

I hear the Turing committee actually has an infinite red tape.

Yes they do. But they are busy beavers.

Re:Took a while, didn't it? (1)

ktwombley (682915) | more than 8 years ago | (#14851096)

Mod parent +6 Funny.

Re:Took a while, didn't it? (0)

Anonymous Coward | more than 8 years ago | (#14851169)

Parent is best comment ever

Re:Took a while, didn't it? (4, Funny)

MichaelSmith (789609) | more than 8 years ago | (#14850980)

The designer of Algol-60 is only getting this recognition in 2006?

Must be why they compare it with the Nobel.

Re:Took a while, didn't it? (0)

Anonymous Coward | more than 8 years ago | (#14851031)

I think the Turing Award is a lifetime achievement kind of thing.

Re:Took a while, didn't it? (1, Funny)

Anonymous Coward | more than 8 years ago | (#14851555)

Does this mean he's indistinguishable from a sentient being?

Re:Took a while, didn't it? (1)

Jerry Coffin (824726) | more than 8 years ago | (#14851077)

The designer of Algol-60 is only getting this recognition in 2006? What?

This is to help it fit into the history of the language in general. Even though it's almost always referred to as Algol 60, the ISO standard wasn't approved until 1984!

Re:Took a while, didn't it? (2, Informative)

Anonymous Coward | more than 8 years ago | (#14851116)

It took a while because no one nominated him until now. As a matter of fact, I am a senior Computer Engineering student at Syracuse University taking a compiler class taught by the distinguished Dr. Per Brinch Hansen, who nominated Peter Naur for this award. (I forget the exact number, but the ACM committee responsible for selecting the winners received many recommendations for Naur from previous Turing Award winners because of his credentials and Hansen's nomination.) So congratulations to Naur on his prestigious award, and as the old saying goes, "better late than never".

Politics of Prizes & Other Thoughts (1, Informative)

reporter (666905) | more than 8 years ago | (#14851758)

The prize was quite belated, but fortunately, Peter Naur received it before he died. The reason for the unreasonable delay is that prizes are political. The person DEF who receives the "right" support and the "right" letter of commendation from the "right" people has a much better chance of receiving a prize (from IEEE, ACM, Seymour Cray Engineering Award Committee, etc.) than the person GHI who receives no such support even though person GHI's achievement is more scientifically amazing than person DEF's achievement.

Take the example of John Hennessy. What exactly did he accomplish apart from what his graduate students developed? Yet, through politics, he was able to transform his students' work into his own success. He received the Seymour Cray Engineering Award and was inducted into the National Academy of Engineering.

Returning to ALGOL 60 [wikipedia.org], its syntax served as the de facto standard for describing computer algorithms from 1960 to 1990. ALGOL was the inspiration behind Pascal. Further, ALGOL was the first computer language to be designed by actual computer scientists instead of hackers.

I am glad that justice prevailed even though it was belated. Peter earned a prize that was truly well deserved. His ALGOL 60 was a key milestone in the development of computer science.

Unfortunately, Gary Kildall did not receive the prize that he deserved while he was alive. William Gates buried him -- figuratively and literally. The Software Publishers Association gave Kildall an award after he died [wikipedia.org] .

Re:Politics of Prizes & Other Thoughts (1)

jcr (53032) | more than 8 years ago | (#14852480)

Take the example of John Hennessy. What exactly did he accomplish apart from what his graduate students developed? Yet, through politics, he was able to transform his students' work into his own success.

On what, exactly, do you base this charge?

-jcr

Re:Took a while, didn't it? (1)

rickumali (756010) | more than 8 years ago | (#14852242)

At this rate, Larry Wall might get a Turing in 2040. :-)

Yes but... (1, Funny)

Anonymous Coward | more than 8 years ago | (#14850932)

Can he pass the Turing test himself?

There is a saying... (3, Funny)

geoff lane (93738) | more than 8 years ago | (#14850935)

..."Algol 60 is a great improvement on all its successors"

Nice to see Peter getting some recognition.

Re:There is a saying... (1)

MichaelSmith (789609) | more than 8 years ago | (#14851004)

Algol 60 is a great improvement on all its successors

I had a look [wikipedia.org] at it and was left wondering what we have been doing with programming languages for the last 50 years. Since then we seem to have invented automatic garbage collection, standardised APIs and protocols, and OO.

It's a shame. Is the idea of a "language" the problem? Perhaps it's time we moved on to something totally new. Don't ask me for examples, though.

Re:There is a saying... (1)

cortana (588495) | more than 8 years ago | (#14851123)

Who knows, indeed? [paulgraham.com] ;)

Re:There is a saying... (5, Insightful)

AuMatar (183847) | more than 8 years ago | (#14851172)

It isn't the idea of a language that's the problem; the idea that language matters is the problem. Any problem can be solved in any Turing-complete language. There's little to no difference between them. You're not going to write code an order of magnitude faster because you change language. Short of having an API class you can leverage in one language and not the other, you'll be hard put to program even 10% faster, if you know the syntax of both languages equally well.

The real problem is code reuse. 95% of what we do on a daily basis is to reinvent features available elsewhere. What we need are well designed, easy to use libraries that we can leverage and have most of the work done for us. Closed source programs are killing us, as we can't leverage off each other. It's like going back to the days of Newton and Leibniz and requiring all mathematicians to prove the same ideas without reference to one another's work before moving on. It's ridiculous, and it's the reason for our problems.

Re:There is a saying... (3, Insightful)

belmolis (702863) | more than 8 years ago | (#14851608)

It is much more difficult to master and retain the syntax of some languages than of others, so a lot of the time you aren't going to know them equally well. In any case, I think you're just wrong about language not making a difference. It is much slower to write in a low-level language than in a high-level language. Sure, you may have mastered the syntax, but you still have to spend time and mental energy keeping track of what goes where if you don't have data structures like structs and arrays, and just adding automatic storage allocation and garbage collection saves a lot of time and bugs.

Re:There is a saying... (5, Insightful)

AuMatar (183847) | more than 8 years ago | (#14851642)

I disagree- arrays and structs are made by quick macros, even in assembly. Think of it like an accessor function. It takes a small time up front to write- not a significant effort. I'm about 90% of the speed writing in asm as I am in C++.

Garbage collection is a whole other rant- that's a complete strawman. Memory management takes a minor amount of time (almost 0), and making sure you properly null out dangling references in Java takes about as much. I find the problem to be totally different- there's a subset of programmers who just don't understand memory management. These people suck as programmers- everything you do in programming is resource management. Memory: alloc, use, free. Files: open, use, close. Networking: connect, use, close. Having people who don't understand that pattern on your team causes work to slow down by large amounts because of their incompetence, not because of the language.
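The acquire/use/release pattern described above can be sketched in C. This is a hypothetical example (the function and its name are not from the thread), showing the same shape that applies to memory, files, and sockets:

```c
#include <stdlib.h>
#include <string.h>

/* Acquire a buffer, use it, and hand ownership to the caller.
 * On failure nothing is leaked; on success the caller must free(*out). */
int copy_upper(const char *src, char **out)
{
    size_t n = strlen(src);
    char *buf = malloc(n + 1);                      /* acquire */
    if (buf == NULL)
        return -1;                                  /* acquisition failed */
    for (size_t i = 0; i < n; i++)                  /* use */
        buf[i] = (src[i] >= 'a' && src[i] <= 'z') ? src[i] - 32 : src[i];
    buf[n] = '\0';
    *out = buf;                                     /* release deferred to caller */
    return 0;
}
```

The point of the pattern is that every resource has exactly one owner at any time, so the matching `free()` is never in doubt.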

Re:There is a saying... (2, Insightful)

belmolis (702863) | more than 8 years ago | (#14851886)

If you've got macros in assembler that make structs and arrays easy, you're not writing real assembler but one of those new-fangled intermediate languages. That's a step up right there.

Anyhow, it's the storage allocation that is the big thing. I just don't agree that it makes such a small difference. It isn't just the need to free up what you use - that's relatively easy. It's the constant checking of whether you've got enough or need to reallocate, and the sometimes complicated and error-prone calculations of how much you need.

I'm a very experienced C programmer (24 years) and the storage allocation idioms reside in my fingertips, yet when dealing with a lot of variable-length strings, for example, I know that it is much faster to write in a high-level language like Tcl than in C. And studies of programmers seem to show this quantitatively.

Another feature that I suspect is helpful, if not in making things go faster, in reducing the expenditure of mental energy, is the use of iterators like foreach. Being able to iterate over a list without having to worry about what the first index is and what the last index is etc. as with a C-style for makes it much easier.
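A foreach-style loop of the kind described above can be approximated even in C with a small macro. This is a hedged sketch (the macro and function names are hypothetical, not from the thread):

```c
/* Walk a pointer over every element of a fixed-size array,
 * without spelling out the first and last index by hand. */
#define ARRAY_FOREACH(p, arr) \
    for ((p) = (arr); (p) < (arr) + sizeof(arr) / sizeof((arr)[0]); (p)++)

int sum_all(void)
{
    int xs[] = {1, 2, 3, 4};
    int *it, total = 0;
    ARRAY_FOREACH(it, xs)       /* no explicit bounds at the call site */
        total += *it;
    return total;
}
```

The bounds calculation happens once, inside the macro, so the call site reads like a foreach even though it compiles to a plain C for loop.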

Re:There is a saying... (2, Interesting)

AuMatar (183847) | more than 8 years ago | (#14852707)

Why do you have to worry about increasing your string length? That should be taken care of by your string library. Let's say you want to concatenate 2 strings. You don't just use strcat(), do you? The correct way to do it is to write a function that goes something like this


string* strcat(string *str1, string *str2){
    if(str2->length + str1->length > str1->size){
        if(reallocstr(str1, str2->length + str1->length) == ALLOCERROR){
            return NULL;
        }
    }
    memcpy(str1->buffer + str1->length, str2->buffer, str2->length);
    str1->length += str2->length;
    return str1;
}


Where string is a struct defined by the library. If you're using the plain C string library, that in itself is a problem- you're right, using it is slow and requires you to keep track of too much stuff. So don't-- use a better string library. There are a few dozen in C you can download off the web.

Another feature that I suspect is helpful, if not in making things go faster, in reducing the expenditure of mental energy, is the use of iterators like foreach. Being able to iterate over a list without having to worry about what the first index is and what the last index is etc. as with a C-style for makes it much easier.


So why don't you write a list library that takes a function pointer and calls it on each member of the list? Or one that at least has functions so your loop can look like this (assume a list of ints for the following example):


for(int iter=begin(list); iter!=end(list); iter=getnext(list,iter)){
    int val=getval(list,iter); //do work
}


Your problem doesn't seem to be C, it's using C poorly. If you're not doing stuff like this, you're working at the wrong level of abstraction. That leads to slow-to-write, buggy code in any language. And it's equally likely in any language.

Re:There is a saying... (0)

Anonymous Coward | more than 8 years ago | (#14852506)

You're not going to write code an order of magnitude faster because you change language.

That's a swell statement on paper and in theory, but in the real world, to borrow a phrase, it is "nonsense".

Much like political ideals being limited only by barriers of diminished human integrity, so it is with your idea. As humans, we study, think, and furthermore write programs in a way that "feels" natural to us. We always come to put things in a perspective native to us, and we will always be most productive when we are most comfortable with the tools we use; this is the case in art, craftsmanship, and even programming. Heck, as an example, the hordes here on Slashdot will mostly swear by one text editor or another for programming -- I'm sure even an irrelevant "environmental factor" such as a developer's editor of choice would result in magnitudes of difference with respect to productivity.

In a perfect world, you are, of course, correct. In a world where all developers have no preferred syntax or programming structure predefined in their minds, no vision or "feel" for code, these perfect-minded developers would be indifferent in their efficiency with varying languages. However, we do not live in that alternate universe (the one where you actually have a valid argument and not the pretentious babbling of a complete smart-ass). Surely there are programmers who are, by all means, superb with a language such as C, but purely functional languages would just be out of any possible train of thought they might board any time in their development career, so to speak.

Summatively: you are absolutely wrong, so suck it down. We're not all Lt. Cmdr. Data.

Re:There is a saying... (2, Insightful)

AuMatar (183847) | more than 8 years ago | (#14852733)

Much like political ideals being limited only by barriers of diminished human integrity, so it is with your idea.


Other than disagreement, I'm not sure WTF you're trying to say here. Try to stick to one rant at a time, and you'll make much more sense.

We always come to put things in a perspective native to us, and we will always be most productive when we are most comfortable with the tools we use; this is the case in art, craftsmanship, and even programming. Heck, as an example, the hordes here on Slashdot will mostly swear by one text editor or another for programming -- I'm sure even an irrelevant "environmental factor" such as a developer's editor of choice would result in magnitudes of difference with respect to productivity.


True, to some extent. Point a gun at my head and tell me to program Lisp and I'll do so more slowly than C. This is not due to flaws in Lisp, but due to the fact that the syntax is not familiar to me. As I said, my point is that when EQUALLY experienced with the syntax of 2 languages there is no difference. It's not Lisp or C or Java or ASM that causes us to be slower programmers, it's our unfamiliarity with it. My point is that it's nothing inherent in the language that slows us down, and nothing inherent in some other language will speed us up. It's 100% a matter of familiarity. Looking for some new language which will magically make you more productive is a wild goose chase.

Re:There is a saying... (1)

deander2 (26173) | more than 8 years ago | (#14852556)

and making sure you properly null out dangling references in Java takes about as much.

i always hate it when my references dangle. (so embarrassing. :)

How is volume IV coming along ? (0)

Anonymous Coward | more than 8 years ago | (#14851709)

> I'm about 90% of the speed writing in asm than I am in C++.

Hi, Don !

How is volume IV coming along ?

Re:There is a saying... (2)

danielk1982 (868580) | more than 8 years ago | (#14852392)

Any problem can be solved in any Turing complete language. There's little to no difference between them. You're not going to write code an order of magnitude faster because you change language.

You've got to be kidding me. There are plenty of cases where you will write code a magnitude faster if a language is changed. Can you write a web application supporting complex business logic in C? Yeah, you can. But it absolutely doesn't compare to Rails, Struts, or asp.NET. A Perl program might take a few lines while the equivalent Java code might take tens of lines. Or how about this: how many lines of C code would it take to generate the user interface of Slashdot, and how much conceptually harder would it be to understand than the HTML/CSS it's specified with now? It's all bits underneath, but a language can make all the difference.

Honestly, by your logic we shouldn't have gone further than assembler.

95% of what we do on a daily basis is to reinvent features available elsewhere.

I think you're overstating this. There are already plenty of libraries used by developers, but the simple fact is you'll never have a library for everything (not even close). You will need custom code, if only to weave provided libraries together.

Re:There is a saying... (2, Insightful)

AuMatar (183847) | more than 8 years ago | (#14852666)

You've got to be kidding me. There are plenty of cases where you will write code a magnitude faster if a language is changed. Can you write a web application supporting complex business logic in C? Yeah you can. But it absolutely doesn't compare to Rails, Struts, or asp.NET


I disagree. A quick download of a few libraries to help out (a database access library, a regex library, a better string library, maybe one or two others) and I'm ready to go. Rails is a particularly poor example- yeah, it autogenerates a lot of code, but if you want to go even slightly out of lockstep with it, you have a lot of fighting against it to do.

There are already plenty of libraries used by developers, but the simple fact is you'll never have a library for everything (not even close). You will need custom code, if only to weave provided libraries together.


Of course. But I don't think I overstated by much. Much of what we do daily is reinvent the wheel. Usually poorly (if we were inventing a better wheel I wouldn't complain). New languages aren't going to boost our productivity, not compared to the boost we'd get from not having to write the damn code in the first place.

I didn't think (4, Funny)

Eightyford (893696) | more than 8 years ago | (#14850938)

I didn't think humans could win this award.

Me, like many readers of slashdot (2, Funny)

RedLaggedTeut (216304) | more than 8 years ago | (#14850939)

Me, like many readers of slashdot, also hope to pass the Turing test one day, so I congratulate him on this achievement.

Meanwhile, in Soviet Russia, the Turing test passes you.

Re:Me, like many readers of slashdot (3, Funny)

MyLongNickName (822545) | more than 8 years ago | (#14850958)

Me, like many readers of slashdot, also hope to pass the Turing test one day, so I congratulate him on this achievement.

You passed the test. No computer would mangle the pronoun usage like this! ;)

Babel Fish (1)

tepples (727027) | more than 8 years ago | (#14851047)

No computer would mangle the pronoun usage like this! ;)

O rly? [altavista.com]

Re:Me, like many readers of slashdot (5, Funny)

weg (196564) | more than 8 years ago | (#14851101)

According to the Hitchhiker's Guide to the Galaxy [bbc.co.uk] this will be hard if you are a Computer Scientist:

(copied from http://www.h2g2.com/ [h2g2.com] )

Dave? Are you there Dave?

A test for artificial intelligence suggested by the mathematician and computer scientist Alan Turing. The gist of it is that a computer can be considered intelligent when it can hold a sustained conversation with a computer scientist without him being able to distinguish that he is talking with a computer rather than a human being.

Some critics suggest this is unreasonably difficult since most human beings are incapable of holding a sustained conversation with a computer scientist.

After a moment's thought they usually add that most computer scientists aren't capable of distinguishing humans from computers anyway.

One of Peter Naur's Contributions (2, Funny)

Wayne_Knight (958917) | more than 8 years ago | (#14850948)

Look, Naur is mentioned right in the code!
$ diff -Naur inftrees.c ../zlib-1.2.2.orig/
--- inftrees.c 2005-07-10 13:38:37.000000000 +0100
+++ ../zlib-1.2.2.orig/inftrees.c 2004-09-15 15:30:06.000000000 +0100
@@ -134,7 +134,7 @@
         left -= count[len];
         if (left < 0) return -1;        /* over-subscribed */
     }
-    if (left > 0 && (type == CODES || max != 1))
+    if (left > 0 && (type == CODES || (codes - count[0] != 1)))
         return -1;                      /* incomplete set */
Not much of a criterion for a Turing Award, though...

Obligatory Typo Joke (0)

Anonymous Coward | more than 8 years ago | (#14850952)

"the winner of the 2005 A.M. Turing Award."

Stay tuned for the 2005 P.M. Turing Award - commonly considered to be the Nobel Prize of the Online Porn industry.

Re:Obligatory Typo Joke (1)

mlow82 (889294) | more than 8 years ago | (#14851054)

There's no typo there. Turing's full name is Alan Mathison Turing [wikipedia.org] .

Re:Obligatory Typo Joke (0)

Anonymous Coward | more than 8 years ago | (#14851163)

That's a joke, get over it.
Whooosh!

Just Algol-60? (1)

endrue (927487) | more than 8 years ago | (#14850955)

What about Backus-Naur form?

Re:Just Algol-60? (1)

Jerry Coffin (824726) | more than 8 years ago | (#14851043)

What about Backus-Naur form?

RTFA. They mention BNF as well -- though they certainly don't give it as much time and space as it deserves. ALGOL was a tremendous accomplishment, but IMO, BNF is far greater still. Then again, it's open to argument that John Backus really deserves most of the credit for BNF. At one time, BNF was an abbreviation for "Backus Normal Form", and only later was Peter Naur's name added.

Interestingly, Naur didn't seem quite as impressed with the success of Algol 60 as many people currently seem to be. In a comment on a draft for the Algol 68 report, Naur said: "...nothing seems to have been learned from the outstanding failure of the Algol 60 report..."

Re:Just Algol-60? (4, Informative)

weg (196564) | more than 8 years ago | (#14851086)

BNF originally stood for "Backus Normal Form", and the name Backus Naur Form was introduced by Donald Knuth:

@article{365140,
  author = {Donald E. Knuth},
  title = {Backus Normal Form vs. Backus Naur Form},
  journal = {Commun. ACM},
  volume = {7},
  number = {12},
  year = {1964},
  issn = {0001-0782},
  pages = {735--736},
  doi = {10.1145/355588.365140},
  publisher = {ACM Press},
  address = {New York, NY, USA},
}

Sample code (1)

GoofyBoy (44399) | more than 8 years ago | (#14850984)

For those of you who, like me, have never worked with this language, some sample code is here [monash.edu.au]

I think I would have been driven nuts trying to find the unmatched ' in my code.

Re:Sample code (1)

Heembo (916647) | more than 8 years ago | (#14851166)

I think I would have been driven nuts trying to find the unmatched ' in my code.

And you call yourself a programmer? Build a macro or some kind of simple code to check FOR you!

Re:Sample code (4, Funny)

GoofyBoy (44399) | more than 8 years ago | (#14851285)

Build a macro or some kind of simple code to check FOR you!

I did one in LISP; I'm still trying to find an unmatched (.

Re:Sample code (3, Funny)

musakko (739094) | more than 8 years ago | (#14851398)

For those of you who, like me, have never worked with this language, some sample code is here [monash.edu.au]

Scotty: Captain, we din' can reference it!
Kirk: Analysis, Mr. Spock?
Spock: Captain, it doesn't appear in the symbol table.
Kirk: Then it's of external origin?
Spock: Affirmative.
Kirk: Mr. Sulu, go to pass two.
Sulu: Aye aye, sir, going to pass two.

No more wordy than COBOL. Seems like a cool language.

ACM must die (-1, Flamebait)

Anonymous Coward | more than 8 years ago | (#14851007)

The ACM are just another bunch of postal spamming bastards

Re:ACM must die (0, Offtopic)

justthinkit (954982) | more than 8 years ago | (#14851037)

Greatest ACM article: http://www.acm.org/cacm/AUG96/antimac.htm [acm.org], demonstrating what I had thought from the beginning -- that the Mac is the opposite of how things should be done.


From : A grateful computer user (1)

KennyP (724304) | more than 8 years ago | (#14851027)


Sir,

I thank you for helping define structured computer programming languages. Programs were the dreams of the wireheads half a century ago. Now, if you can type, we can only hope you never see the dreaded:

SYNTAX ERROR : GOSUB WITHOUT RETURN
LINE 380

Guess what language I learned to program first? :-P

Visualize Whirled P.'s

What about VALGOL? (-1, Offtopic)

slickwillie (34689) | more than 8 years ago | (#14851046)

http://en.wikipedia.org/wiki/VALGOL_programming_language [wikipedia.org]

Here's a sample:

14 LIKE, Y$KNOW (I MEAN) START
%% IF
PI A =LIKE BITCHEN AND
01 B =LIKE TUBULAR AND
9 C =LIKE GRODY**MAX
4K (FERSURE)**2
18 THEN
4I FOR I=LIKE 1 TO OH MAYBE 100
86 DO WAH + (DITTY**2)
9 BARF(I) =TOTALLY GROSS(OUT)
-17 SURE
1F LIKE BAG THIS PROGRAM
? REALLY
$$ LIKE TOTALLY (Y*KNOW)

Datalogy (4, Interesting)

Peter_Pork (627313) | more than 8 years ago | (#14851050)

Peter Naur is an interesting character. For example, he dislikes the term "Computer Science", and prefers "Datalogy". He also gives Backus the whole credit for inventing BNF, which he calls the Backus Normal Form. I'm sure he has a better name for Algol-60...

Re:Datalogy (0)

Anonymous Coward | more than 8 years ago | (#14851065)

I'm sure he has a better name for Algol-60...

Unfortunately, there is already a programming language called brainfuck

The Algols were good (2, Interesting)

Anonymous Coward | more than 8 years ago | (#14851486)

Although you were making a joke, it didn't actually reflect reality at all. Algol 60 was quite seminal, and Algol 68 was almost the "Perl" of its time, really powerful.

In almost 40 years since the Algol family of languages was defined, we haven't really moved things along all that much. Quite a lot of the "improvements" in modern languages are not fundamental but largely aesthetic. Pretty pathetic really.

Nearly 4 decades ago, we programmed in Algol 68 and we walked on the moon. It's curious how the pace of progress in both realms slackened off quite suddenly, to put it generously.

Re:The Algols were good (1, Offtopic)

zippthorne (748122) | more than 8 years ago | (#14851632)

In what year did the concept of "risk management" as an organisational paradigm/department become popular?

Re:Datalogy (1)

fm6 (162816) | more than 8 years ago | (#14851180)

Naur is hardly alone in not liking "computer science". The department where I studied CS was called "Information Science" because (as the chairman put it) computers are just instruments, not the thing being studied; do you call astronomers "telescopists"?

But that's actually wrong. Computers are instruments, but they're not just instruments. Their existence drives the whole discipline. Leave "computer" out of the terminology and most people won't know what you're talking about. When I told people I was majoring in "Information Science", they thought I was studying to be a librarian! "Datalogy" is even worse.

Re:Datalogy (2, Informative)

Krakhan (784021) | more than 8 years ago | (#14852237)

It's most likely for that reason that Dijkstra preferred the term "Computing Science" himself.

Re:Datalogy (1, Informative)

Anonymous Coward | more than 8 years ago | (#14852944)

Well, in Danish we actually use the term "Datalogi". And Danish people I meet seem to know what I'm studying when I say Datalogy. That is, they know it has something to do with computers.
But that's just the recurring problem we as computer scientists meet. Ordinary people think everyone working with computers is doing the same work as Joe from the IT dept who helps them connect their Palm handheld.

Re:Datalogy (1)

javax (598925) | more than 8 years ago | (#14853019)

That's cool - in Germany CS is called "Informatik". I haven't seen any students of "Computerwissenschaft" or anything similar here.

Computer science is no more about computers than astronomy is about telescopes. [Edsger Dijkstra]

Re:Datalogy (1)

jawtheshark (198669) | more than 8 years ago | (#14853065)

Same in Dutch ("Informatica") and in French ("Informatique").

I still translate my degree (which is in Dutch) to "Computer Science" because most non-German/Dutch/French people have no idea what I'm talking about if I say "Informatics" :-)

Re:Datalogy (1)

dkasak (907430) | more than 8 years ago | (#14853317)

Interesting. It's also called 'Informatika' in Croatian.

Re:Datalogy (2, Interesting)

Anonymous Coward | more than 8 years ago | (#14853214)

He's a very funny character indeed. I study at DIKU where Naur used to work, and one of the funny things he invented was "the Naur frame".

The Naur frame is an A4-sized piece of cardboard with a hole in it, like a picture frame. When he read and graded a report, he would place this frame over it; if your margin was too narrow, the frame would cover some of your text. He would then give you a horrible grade because your report didn't make any sense.

A rather hard way to force people into having large margins, but also quite funny.

FAMOUS CONTROVERSY (1)

putko (753330) | more than 8 years ago | (#14851051)

There is a famous controversy, described here:

http://spirit.sourceforge.net/dl_docs/bnf.html [sourceforge.net]

Some accuse this guy of bogarting the credit.

Re:FAMOUS CONTROVERSY (1)

Voltageaav (798022) | more than 8 years ago | (#14851222)

From TFA: John Backus, another former Turing Award winner, acknowledged Naur as the driving intellectual force behind the definition of Algol 60. He commented that Naur's editing of the Algol report and his comprehensive preparation for the January 1960 meeting in which Algol was presented "was the stuff that really made Algol 60 the language that it is, and it wouldn't have even come about, had he not done that."

And from your own reference, Naur says: "I don't know where BNF came from in the first place. I don't know -- surely BNF originally meant Backus Normal Form. I don't know who suggested it."

Now, tell me, does that sound like he's trying to steal the credit?

Re:FAMOUS CONTROVERSY (1)

putko (753330) | more than 8 years ago | (#14851711)

No it doesn't. But then if you read this, it is clear that his contribution sounds so minor as to be unworthy of further discussion -- he just wanted to change some unprintable characters to printable ones.

In a later appendix, F. L. Bauer responds to Naur's statements:

        "It is amusing to see how Peter Naur looks at the use of the Backus notation from his personal point of view. Among [other members of the committee] there was no question that we would like... a form similar to the one Backus had used for its ICIP paper... If Peter Naur had seen this a result of his "plan" to make an appeal to the members of the ALGOL committee concerning the style of the language description, he was running into open doors."

        "... Peter Naur speaks of 'my slightly revised form of Backus's notation' and 'my slightly modified form of Backus's notation.' I think the minor notation difference is not worth mentioning. If some people speak of Backus- Naur form instead of the original Backus Normal Form, then they indicate that Peter Naur, as the editor of the ALGOL 60 report, brought this notation to a wide attention. Backus-ALGOL Form would be more appropriate anyhow."

hrrm, odd they should mention that (-1, Troll)

sedyn (880034) | more than 8 years ago | (#14851058)

FTA:
"Dr. Naur's ALGOL 60 embodied the notion of elegant simplicity for algorithmic expression," said Justin Rattner, Intel senior fellow and Chief Technology Officer. "Over the years, programming languages have become bloated with features and functions that have made them more difficult to learn and less effective.
An Intel guy criticising feature bloat? Has he ever used Intel x86's assembly language?

Some contributions of Algol60 (5, Informative)

Marc Rochkind (775756) | more than 8 years ago | (#14851091)

1. The Report on the language used a formal syntax specification, one of the first, if not the first, to do so. Semantics were specified with prose, however.
2. There was a distinction between the publication language and the implementation language (those probably aren't the right terms). Among other things, it got around differences such as whether to use decimal points or commas in numeric constants.
3. Designed by a committee, rather than a private company or government agency.
4. Archetype of the so-called "Algol-like languages," examples of which are (were?) Pascal, PL/I, Algol68, Ada, C, and Java. (The term Algol-like languages is hardly used any more, since we have few examples of contemporary non-Algol-like languages.)

However, as someone who actually programmed in it (on a Univac 1108 in 1972 or 1973), I can say that Algol60 was extremely difficult to use for anything real, since it lacked string processing, data structures, adequate control flow constructs, and separate compilation. (Or so I recall... it's been a while since I've read the Report.)
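The formal syntax specification mentioned in point 1 is what became known as Backus-Naur Form: each nonterminal gets a production rule listing its alternatives. As a sketch of why that notation mattered for compiler writers, here is a tiny invented grammar (not taken from the Algol 60 Report) and the recursive-descent recognizer it maps onto almost mechanically, with one function per nonterminal and one branch per alternative:

```python
# Toy BNF grammar, invented for illustration:
#   <expr> ::= <term> | <term> "+" <expr>
#   <term> ::= <digit> | "(" <expr> ")"
# Each parse_* function returns the index just past what it matched,
# or -1 if the input does not match that nonterminal at position i.

def parse_expr(s: str, i: int = 0) -> int:
    i = parse_term(s, i)
    if i == -1:
        return -1
    # Second alternative: <term> "+" <expr>
    if i < len(s) and s[i] == "+":
        return parse_expr(s, i + 1)
    return i

def parse_term(s: str, i: int) -> int:
    if i < len(s) and s[i].isdigit():
        return i + 1
    if i < len(s) and s[i] == "(":
        i = parse_expr(s, i + 1)
        if i != -1 and i < len(s) and s[i] == ")":
            return i + 1
    return -1

def matches(s: str) -> bool:
    """True if the whole string derives from <expr>."""
    return parse_expr(s) == len(s)
```

For example, `matches("(1+2)+3")` succeeds while `matches("1+")` fails, because the trailing "+" has no `<expr>` after it.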

life::= birth education career gods_waiting_room ; (2, Insightful)

hedley (8715) | more than 8 years ago | (#14851093)

I have been using his work for years. Congrats to him
and his fantastic career.

Hedley

Re:life::= birth education career gods_waiting_roo (0)

Anonymous Coward | more than 8 years ago | (#14852754)

BNF was not his work.

Nobel prize for peace[of mind] (0)

packetmill (955023) | more than 8 years ago | (#14851120)

should go to the guy who invented Java.

Re:Nobel prize for peace[of mind] (1)

m50d (797211) | more than 8 years ago | (#14851160)

Given how much java makes me want to kill people, I don't think that's fair.

Re:Nobel prize for peace[of mind] (1)

NoMercy (105420) | more than 8 years ago | (#14851532)

Java alone isn't a great invention, it's a C style language with garbage collection, objects and compiles to a bytecode... all that's been done before.

Now maybe the inventor of Java's reflection system...

Re:Nobel prize for peace[of mind] (1)

AuMatar (183847) | more than 8 years ago | (#14851607)

s/Java/C/;

Fixed your post.

Danes everywhere... (4, Interesting)

weg (196564) | more than 8 years ago | (#14851125)

Amazing how many programming languages were actually invented by Danish computer scientists. Peter Naur (ALGOL), Bjarne Stroustrup (C++), Anders Hejlsberg (C#), and Mads Tofte contributed a good deal to SML.

Re:Danes everywhere... (1)

tomjen (839882) | more than 8 years ago | (#14851141)

As i recall it, Anders Hejlsberg was part of the team behind Delphi too.

Re:Danes everywhere... (4, Informative)

sidetracked (958931) | more than 8 years ago | (#14851179)

Anders Hejlsberg made Turbo Pascal as well. Also, to name a few other Danes behind popular languages and frameworks: Rasmus Lerdorf (PHP) and David Heinemeier Hansson (the Ruby on Rails framework).

Homer... (0)

Anonymous Coward | more than 8 years ago | (#14852153)

mmmmm. ...Danish

Re:Danes everywhere... (1)

Krakhan (784021) | more than 8 years ago | (#14852251)

Don't forget about Dijkstra!

Re:Danes everywhere... (1)

jlar (584848) | more than 8 years ago | (#14853207)

Not to mention Rasmus Lerdorf the Danish-Canadian author of PHP.

Re:Danes everywhere... (1)

olau (314197) | more than 8 years ago | (#14853211)

It's the Danish conspiracy. Eventually all computers will be programmed in a Danish programming language - then WORLD DOMINATION! Just wait till your defense systems start responding to the code that can be implicitly read between the lines...

Nobel Games (1)

Tablizer (95088) | more than 8 years ago | (#14851414)

The Turing Award is considered to be the Nobel Prize of computing, and a well-deserved recognition of Dr. Naur's pioneering contributions to the field."

Why the heck don't the Nobel managers make a fricken Computer category? They created an Economics category even though Mr. Nobel hadn't originally set that one up.
       

Re:Nobel Games (0)

Anonymous Coward | more than 8 years ago | (#14851554)

I can assure you that real scientists regard the "Nobel Prize" in economics awarded by some Swedish bank, not the Nobel foundation, as a travesty.

Kind of like economics is a travesty compared to physics or chemistry or even literature!

Re:Nobel Games (2, Interesting)

AuMatar (183847) | more than 8 years ago | (#14851618)

The Nobel committee has not made any awards that Alfred Nobel himself did not decide to set up. Economics is not a real Nobel prize - its official name is "The Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel" - in other words, it's a rip-off of the name. Quite fitting, given that macroeconomics is a pseudo-science, that it be given a pseudo-award.

Re:Nobel Games (1)

proxima (165692) | more than 8 years ago | (#14851956)

Quite fitting, given macroeconomics is a pseudo-science, that it be given a pseudo-award.

As you're probably aware, more than macroeconomists receive the "The Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel". In fact, the 2005 prize went to two game theorists (and thus microeconomists). The debate about whether economics, or any subset of it, is a real "science" is not a very fruitful one (since one can define science in any number of ways, both including and excluding econ). I find it interesting (though not terribly surprising, since I've heard it before) that you separate macro from micro, rather than claiming all of econ is a pseudo-science (disclaimer: I am a PhD student in econ who will have macro as one field).

The existence or lack of a Nobel prize is not really appropriate for use as some indicator of how important, or reliable, or [insert adjective here] a field is. After all, there is no Nobel prize in mathematics, and if some dramatically new field emerges that doesn't fit well within currently-established categories, a new "Bank of Sweden" prize might be created. Unless, of course, you believe that the original categories will always incorporate the only worthwhile pursuits of humanity.

Economics is a much younger field than physical or biological sciences, and most of the field we study today was developed since Nobel's time. I'm no historian or biographer, so I couldn't say with any degree of authority whether Alfred Nobel, were he choosing the awards today, would choose economics or anything else as a category.

 

Re:Nobel Games (0)

Anonymous Coward | more than 8 years ago | (#14852354)

The pretense of economists that it is a verifiable, repeatable, quantifiable discipline like physics is the bullshit causing the sort of comment that AuMatar posted. Come on, as admirable a field of study as it is, it's a social study, and no amount of statistics and brain-dead algebraic models (I've done my bit doing academic work in that area) is gonna make it into one of the science disciplines they envy, i.e. those fields that use the scientific method, because they can't use the method.

Although, it's not French Literature, I give you that. :-)

Re:Nobel Games (1)

proxima (165692) | more than 8 years ago | (#14852535)

The pretense of economists that it is a verifiable, repeatable, quantifiable discipline like physics is the bullshit causing the sort of comment that AuMatar posted.

I certainly agree that economists should be aware of the inherent drawbacks to the methods. It's true, most of economics does not involve experiments in the traditional sense (experimental economics is the exception, but there are certainly arguments about how the results there apply to the world at large). However, many other disciplines, some considered more scientific than others, are also unable to conduct experiments. In general, fields like astronomy, meteorology and paleontology are not based on experiments. Some of the methods used can be verified in the lab, just like statistical methods and "brain-dead algebraic models" used by economists can be proved with math. Of course, assumptions of some sort are required for the models (and even for statistics itself). The data, however, are not typically generated by experiment in these fields.

I really have no problem with those who consider economics not a science. As I mentioned in my previous post, it depends largely on how you define science in the first place. For this reason I'm glad it's not referred to as "economic science", but that doesn't seem to stop this debate from coming up again and again. I also welcome informed criticisms as to the methods and data sources used by economists. This is how fields improve.

Regardless of whether modern economic theory is doing a good job, individuals, businesses, and governments make economic choices constantly. Thus it seems worthwhile to pursue a better understanding of economic issues in the hope of improving those decisions (or, at the very least, better understanding why things happened in the past). Economics is useful in the sense that it can provide actual falsifiable predictions about quantitative phenomena. Whether this is sufficient to classify economics as a science really doesn't change anything, in my view.

Re:Nobel Games (1)

Tablizer (95088) | more than 8 years ago | (#14852749)

Agreed. A lot of "soft sciences" are important and necessary. Just because something is difficult to test with the scientific process does not mean that one should not try. Those who come up with good ideas, models, or techniques in the field should be awarded something prestigious, even if they turn out to be wrong. Sometimes much is learned from knowing why a model is wrong. You cannot tweak an economic model to match reality unless you first have something to tweak.
       

Re:Nobel Games (1)

AuMatar (183847) | more than 8 years ago | (#14852640)

Why I separate micro from macro- a lot of micro stuff can be tested according to principles of the scientific method. Little to none of the macro stuff can. Not to mention the large swaths of macroeconomics that point to opposite results, depending on the political leanings of the economist who designed it. Trying to get two economists to agree on a macro issue is almost impossible. If it was a real science, at least the basics would be known, tested, and proven by now.

The existence or lack of a Nobel prize is not really appropriate for use as some indicator of how important, or reliable, or [insert adjective here] a field is.


Agreed. I have high respect for both the Turing Award and the Fields Medal. Note that neither of them call themselves the Nobel Prize. The economics prize does - it's a blatant attempt to horn in on the good name of the real Nobel prize. Although I do find it funny when economists try and rationalize that.

Re:Nobel Games (2, Insightful)

proxima (165692) | more than 8 years ago | (#14852696)

Trying to get two economists to agree on a macro issue is almost impossible. If it was a real science, at least the basics would be known, tested, and proven by now.

I'll grant that there are many conflicting models in macro. Many of them stem from the assumptions. For example, assuming a closed economy or an open economy. In the real world and throughout history, various countries are somewhere in between, but often closer to one or the other. Thus, choosing appropriate assumptions for the question you're asking is very important.

Beyond that, macro is very, very new. Physics had centuries from Aristotle to Newton to Einstein. The point is we can gather data and test these models for their effectiveness. Some of them work, and they tend to persist, and some of them don't (but again, choosing which model is appropriate for which data is very important). Besides, if you're a big fan of micro, there's a huge trend in macro to have the models based on micro foundations. Regarding economics as a science, see my other post [slashdot.org] in this thread.

Even if you don't like certain aggregate variables (GDP, etc), some macroeconomic variables are decided in the real world regardless (how much money to print, what interest rate the Fed sets). Macroeconomics will always exist in that sense. There's certainly plenty of room for improvement in the collection of data (especially of non-OECD countries), but that's true of micro as well.

Agreed. I have high respect for both the Turing Award and the Fields Medal. Note that neither of them call themselves the Nobel Prize.

Neither of them is decided by the Royal Swedish Academy of Sciences [wikipedia.org] . I wouldn't mind if everyone called it the Nobel Memorial Prize [wikipedia.org] or something to that effect. If it didn't exist, there would probably be some "top prize" similar to the Fields Medal or Turing Award, and that'd be fine too (there probably was before 1968).

Honest question from curious geek- (2, Interesting)

Josh teh Jenius (940261) | more than 8 years ago | (#14852166)

I just read the WikiPedia article on Alan Turing:

In 1952, Turing was convicted of acts of gross indecency after admitting to a sexual relationship with a man in Manchester. He was placed on probation and required to undergo hormone therapy. When Alan Turing died in 1954, an inquest found that he had committed suicide by eating an apple laced with cyanide.

Then the article mentions an urban legend:

In the book, Zeroes and Ones, author Sadie Plant speculates that the rainbow Apple logo with a bite taken out of it was an homage to Turing. This seems to be an urban legend as the Apple logo was designed in 1976, two years before Gilbert Baker's rainbow pride flag.

Urban Legend? Anyone have any more info on this?

In case you haven't seen it in a while, here is the classic Apple logo:
http://www.jeb.be/images/Apple/apple_logo_(640x480).jpg [www.jeb.be] .

Re:Honest question from curious geek- (1)

bensch128 (563853) | more than 8 years ago | (#14853232)

That's amazing...

Sounds more like an urban legend than fact though...
Probably only Woz and Steve Jobs know for sure though

Ben

Re:Honest question from curious geek- (2, Informative)

bj8rn (583532) | more than 8 years ago | (#14853296)

Google provides the answer [syncmag.com] : "For inspiration, the first thing I did was go to the supermarket, buy a bag of apples and slice them up. I just stared at the wedges for hours," recalls Janoff. The fruit of his labor: a simple 2-D monochromatic apple, with a healthy bite taken from the right side. Jobs loved the conceit - only he suggested it be more colorful. Janoff's boss disagreed, insisting the logo be made all black to save on printing costs. "But Jobs was resolute, arguing that color was the key to humanizing the company," says Janoff. "So I just put colors where I thought they should be, not even thinking about a prism."

From TFA (0)

Anonymous Coward | more than 8 years ago | (#14852294)

"...Before publication of the Algol 60 Report, computer languages were informally defined by their prose manuals and the compiler code itself..."

The definition of PERL. :-) Have a good weekend.

Re:From TFA (0)

Anonymous Coward | more than 8 years ago | (#14852775)

don't forget ruby and python