

HCL Principles (-1, Offtopic)

the_mighty_$ (726261) | more than 8 years ago | (#10833057)

More about HCL Principles can be found here [rice.edu]

HCL, HCl, or HCi? (2, Informative)

tepples (727027) | more than 8 years ago | (#10833137)

HCL (Hilbert Class Library) has little if anything to do with HCi (Human Computer Interaction) or HCl (hydrochloric acid). The article is about HCi.

Blatant! (0)

Anonymous Coward | more than 8 years ago | (#10833410)

That has got to be one of the most blatant karma-whoring attempts I've ever seen.

1) Search google for the wrong term, because you don't even understand the summary. (RTFA? Not a chance.)

2) Post the first result not obviously a business

3) Karma!

NOT.

Proposal (5, Funny)

Anonymous Coward | more than 8 years ago | (#10833060)

If
Natural Language is not making its way into Programming
Then
Programming should make its way into Natural Language
Else
Continue

on a bumper sticker (5, Funny)

devphil (51341) | more than 8 years ago | (#10833296)


YOU FORTH LOVE IF HONK THEN

And here's some filler text to compensate for /.'s sucktacular lameness filter. Blah blah blah. "It won't be any more frightening than the time I climbed up an elevator shaft with my teeth," said Sunny.

Re:on a bumper sticker (0)

Anonymous Coward | more than 8 years ago | (#10833452)

And here's some filler text to compensate for /.'s sucktacular lameness filter.

address injure effluvia zionism hibernate florentine pollock ribose stephanie airdrop crook dabble jitterbug alai babyhood binomial spaghetti pushout curtail judge sling tenacity constructor garibaldi visage cancelling

(more spam poetry for you!)

Is this a good idea? (4, Insightful)

hugesmile (587771) | more than 8 years ago | (#10833428)

It seems that the effort here is to allow end users to state their "problem" in natural language, and then a program gets generated to solve their problem.

Right now that happens - only the program gets generated by programmers (sometimes outsourced to India!)

Unfortunately, what the user says they want, and what they really want are usually very different things. Natural Language Programming really doesn't solve that problem.

The critical piece is the Designer, who sits between the end user and the programmer, and asks the tough questions: "Do you really want that? Let me explain the implications of what you just asked for." "How critical is that piece of functionality that you just added on a whim, but it just added 3 years to the project plan?" "You're asking for the data to be selected this way, but really there's no use for that - have you considered selecting the data this other way?" etc.

Other languages... (1)

10100 (767421) | more than 8 years ago | (#10833075)

Maybe I didn't RTFA closely enough, but what about languages like Python, C++, or Perl? It seems that most developers are more likely to use these (at least the ones I know).

Re:Other languages... (0)

Anonymous Coward | more than 8 years ago | (#10833297)

You're saying C++ and Perl are examples of natural language programming??

Maybe if you speak in 13375|*43| all the time.

Re:Other languages... (0)

Anonymous Coward | more than 8 years ago | (#10833473)

Or maybe you haven't been exposed to natural language enough to understand what he's saying.

Doesn't seem to say much. (2, Funny)

BigZaphod (12942) | more than 8 years ago | (#10833082)

The article seems like some kind of summary. Unless I missed something important, like, a second page or something. But basically, it seems to suggest that, even after all these years, we still aren't any closer to having a natural way to program. Huh.

ROFL! (1)

BigZaphod (12942) | more than 8 years ago | (#10833143)

Ha! There is a page 2. Heck, even a page 3.

*sigh*

Well I feel dumb. This is why I need a sign above my computer that says, "Absolutely no Internet posting before caffeine intake."

Re:ROFL! (1)

TwistedSquare (650445) | more than 8 years ago | (#10833466)

Actually, I still agree with your point having read all three pages. Not enough specific examples of problems in languages and what the better way would have been.

Re:Doesn't seem to say much. (2, Interesting)

Camel Pilot (78781) | more than 8 years ago | (#10833150)

That was my assessment also.

However, I did visit alice.org. I clicked on the gallery and found no way to navigate back!

Seems kind of odd that a site dedicated to "natural programming" concepts would not take the time to employ "natural navigation". Hmmmmm.

Re:Doesn't seem to say much. (1)

MORB (793798) | more than 8 years ago | (#10833439)

Well, it *is* natural navigation. In real life there's no back button that takes you back in time.

Programming in english sucks anyway (4, Insightful)

Qzukk (229616) | more than 8 years ago | (#10833090)

Inevitably you end up with an artificially rigid language structure that sounds like something that nobody would EVER say. Perfectly easy to read, after all, who wouldn't understand what "ADD VAR1 TO VAR2 GIVING VARX" means, but who the hell would use the word "giving" in such a way? It's a nightmare to learn or write, at least for English speakers, who have to constantly fight years of learning to speak real English to make up for the fake English in the language.

Re:Programming in english sucks anyway (4, Insightful)

Anonymous Coward | more than 8 years ago | (#10833145)

Witness why everyone hates legalese in EULAs. Achieving unambiguous precision with English is HARD.

Re:Programming in english sucks anyway (2, Interesting)

ensignyu (417022) | more than 8 years ago | (#10833405)

It might be a little rigid, but not necessarily that bad. HyperTalk goes:
put the sum of var1 and var2 into varX

or:
set varX to the sum of var1 and var2

or:
add var1 and var2
put the result into varX

Re:Programming in english sucks anyway (1)

prell (584580) | more than 8 years ago | (#10833519)

Once they can make a translator that works, I'm thinking that they could make an NL compiler fairly easily. It would probably require proper grammar, but I think that you could easily rearrange words, split infinitives, etc. It seems to me that the Japanese language would be the easiest to process for tokens: it seems to be a pretty rigidly-structured language.

I don't know that an NL compiler would be necessarily useful, though. Programming languages are meant to bridge the gap between the human and the computer. I'd imagine that there would be plenty of errors that either resemble classical programming errors, or are completely new; I don't think using NL would necessarily make programming easier. After all, we all misspeak, and unless the compiler is staggeringly intelligent, it won't be able to correct errors for you. NL libraries would be useful though, I imagine, for end-user applications such as speech interpretation. NL for searches may be useful, but I have a very high success rate with Google as it is.

I'd rather see "Error: undefined symbol near 'fuck you gcc.'" than "I'm doing my best :'(" Poor gcc.

Is it any wonder? (5, Funny)

Tackhead (54550) | more than 8 years ago | (#10833093)

> They point out that well understood HCI principles aren't finding their way into relatively new languages like Java and C#.

Well, duh! That's because if, according to the article...

> The goal is to make it possible for people to express their ideas in the same way they think about them.

...most ideas just don't work that way.

#include <dwim.h> // Do What I Mean

thingy main (thingy list) { Sort thingy
No, like this
With the guy's name on the right
No, I guess the middle initial deserves its own column. No, I didn't think of that.
But don't print the middle initial.
No, not like that.
Eew, that font sucks.
Yeah, like that.
No, like it was before.
Yeah, no--wait. I gotta talk to my boss.
He said to do it like this. // wave hands
But he didn't like it.
Fuck this, I'll pay some guy in India to do it.
}

OT: Thank You! (1)

JediTrainer (314273) | more than 8 years ago | (#10833240)

Was having a tough day. Just wanted to say thank you for bringing a smile to my face with that post. It was beautiful!

Closest I've found to Natural Language... (2, Insightful)

PortHaven (242123) | more than 8 years ago | (#10833129)

Is Macromedia's ColdFusion syntax. As it continues to become less tied to HTML it will be interesting to see where this goes.

But natural language requires more typing than, say, C syntax.

A EQUALS B
A = B

But does the thought process get sped up? If so, one needs to know how the gains and losses affect overall development.

Good point! (4, Insightful)

goldspider (445116) | more than 8 years ago | (#10833219)

One thing that programming languages force upon you (the programmer) is the discipline of getting what you want using the least possible resources.

Natural language, while easier for beginners, would make for horribly inefficient code and would be undesirable for any sizeable application.

Re:Good point! (1)

LnxAddct (679316) | more than 8 years ago | (#10833446)

Yea, could you imagine being given a large application and being told to maintain it if it were written in natural language? It'd be akin to having to read an encyclopedia or something. Absolutely ridiculous. If nothing else, the current state of programming languages allows one to easily skim through a few pages of source and pretty much get the idea of what's going on by reading a few comments and by seeing familiar structures in the code. If nothing else, you're guaranteed that one line won't exceed 80 characters (at least in theory :-] ), but in natural language, it might take 80 characters to declare a damn variable and initialize it.
Regards,
Steve

Re:Good point! (1, Insightful)

Anonymous Coward | more than 8 years ago | (#10833472)

I don't see any reason why a compiler can't be smart enough to build efficient code from natural language, nor do I see any reason why you can't express an efficient algorithm in natural language. The issue is how hard it is to do those things, and whether it's worth it. Creating a compiler with a good enough understanding of English, or any other widely used language, would require decades of development or an army of developers; take your pick.

The next issue is who would want to use it. How often have you read some technical instructions for anything relatively complex and not been entirely sure exactly what to do? Even stuff written by professionals is not always exactly clear. Ever been confused reading a textbook? I have. That won't fly with the computer. Any code one writes would have to be super verbose and probably written and rewritten several times to be suitably clear. Not that the compiler couldn't give intelligent help like, "I don't understand what you mean by 'then go to the next node'; did you want the left or the right?", but still, while it might be easy for beginners, it would be a pain for large projects; I doubt it could really ever be useful.

Which brings us back to the only thing you can do: compromise. Limit the vocabulary and select a particular definition for each keyword (or one for each of a limited set of contexts). As soon as you do this, though, the language becomes less "natural" and you get Cobol; I doubt many want to go back there.

Re:Good point! (1)

ensignyu (417022) | more than 8 years ago | (#10833492)

I don't think it would be that bad. It'd be akin to using a higher-level language like Python, where you don't care how something is sorted when you call sorted(list), just that it's sorted. Some people think that Python is horribly inefficient, but it turns out that even 100x slower than C isn't too bad for many applications.

Re:Closest I've found to Natural Language... (1)

ebyrob (165903) | more than 8 years ago | (#10833426)

But does the thought process get sped up?

You mean like: Hey, that's a nice piece of code, maybe it'll help me out here <clickety>. Or something more complex, like envisioning the inner workings of a whole system in your head and trying to turn that into something useful based on recalled behaviour of snippets of symbols...

I just don't think natural language meshes well with solution space. At best it might be useful for high-level specification.

The natural debugger (0)

Anonymous Coward | more than 8 years ago | (#10833133)

- Hypothesizing what runtime actions caused failure;
- Observing data about a program's runtime state;
- Restructuring data into different representations;
- Exploring restructured runtime data;
- Diagnosing what code caused faulty runtime actions; and
- Repairing erroneous code to prevent such actions

If I had all that in a debugger, why would I need programmers? It'd be cheaper to pay the debugger.

I don't buy it. (5, Insightful)

vontrotsky (667853) | more than 8 years ago | (#10833136)

I disagree with the article's assumption that interesting programming errors are due to people being unable to express themselves "naturally" in code. Rather, I find that almost all errors worthy of debugging come from not understanding the problem domain correctly.

jeff

Re:I don't buy it. (4, Interesting)

Bastian (66383) | more than 8 years ago | (#10833374)

One thing that I have noticed about any debate about what programming facilities will most help programmers to write more bug-free code or spend less time debugging is that the debate is based entirely around anecdote.

I would love to see some numbers on the frequency and nature of bugs in software, and I want to see these numbers broken up by language as well as by application domain. I suspect that a comprehensive collection of such statistics doesn't exist, since I haven't seen any empirical data enter into the various debates to which they would apply.

Until someone spends some more time researching this information, I doubt that the development of programming technology will advance in a fashion any more directed or smooth than science and technology did back in the fourteenth century.

Hmmmm (3, Insightful)

Profane MuthaFucka (574406) | more than 8 years ago | (#10833151)

The article wasn't really that clear on exactly what NLP is, but they pointed at something called Alice.

On that site, there's http://www.alice.org/whatIsAlice.htm [alice.org] which says
Rather than having to correctly type commands according to obscure rules of syntax, students drag-and-drop words in a direct manipulation interface. This user interface ensures that programs are always well-formed. In addition, Alice reifies object-based programming by providing animated, on-screen 3D virtual objects.

So, this is just like Visual Basic. I know that can't be true, or else Microsoft would be marketing VB as NLP. So what am I missing?

Oh NO! Not Again! (3, Insightful)

kaalamaadan (639250) | more than 8 years ago | (#10833152)

Cobol, anyone?

Multiply x by y to get something or the other ...

Re:Oh NO! Not Again! (2, Insightful)

Tablizer (95088) | more than 8 years ago | (#10833433)

Cobol, anyone?

(Or SQL for that matter. We managed to finally obsolete COBOL more or less, but SQL is still with us.)

Anyhow, the idea behind the COBOL natural-language push was to get rid of the need for programmers, or at least greatly reduce their training. However, it was found that somebody with some training could translate business logic into something the computer could understand far better than amateurs could. In other words, a trained programmer's productivity was much higher than that of some manager trying to be precise enough.

Further, regular human speech is vague. Learning not to be vague takes some training itself. And many have discovered that English is not geared well toward being precise. Rather than bend English to be precise, it appears better to replace English with something meant for precision. In other words, there is a bigger payback in efficiency in bending the human to be more like the computer (or at least use better language structure) than the other way around.

Further, non-programmers often have a horrible time with "normalization". They tend to duplicate stuff in ways that come back to haunt the design down the road. In the real world, making copies of papers is the norm, for example. If you do this in the computer, then you have to remember to change all the copies and know where they are.

Thus, if doing it right requires training on issues beyond just language, then it is worth it to integrate a more precise language into that training rather than futz with English.

Otherwise, it is like putting training wheels on motorcycles. It pays to get the prerequisites right first.

Maybe on a small scale, natural language will be okay. However, scaling this to production systems would probably be a mess. Even with languages more precise than English, we still make some ugly bugs. The problems would probably jump an order or two of magnitude if a hacky amateur system based on the naturally vague English language were applied.

Natural language programming. (5, Insightful)

Adhemar (679794) | more than 8 years ago | (#10833157)

The virtue of formal texts is that their manipulations, in order to be legitimate, need to satisfy only a few simple rules; they are, when you come to think of it, an amazingly effective tool for ruling out all sorts of nonsense that, when we use our native tongues, are almost impossible to avoid.
- Edsger Wybe Dijkstra, "On the foolishness of natural language programming" [utexas.edu].
An interesting read.

Re:Natural language programming. (2, Funny)

Moofie (22272) | more than 8 years ago | (#10833512)

I don't think that anybody who spells their name like that gets to talk about "natural language".

(it's a joke, people!)

Write a Natural Language Compiler (5, Interesting)

TheUnFounded (731123) | more than 8 years ago | (#10833164)

Write a Natural Language Compiler and you'll find that programmers can't write in a Natural Language. Can you imagine what would happen when you have to understand, not the flow of the code, not the overall process of the application(s), but HOW the writer was THINKING when they wrote the code? I've worked on a couple interesting projects where the programmers originally were involved in the physical business process, and eventually ended up coding (don't ask). When I had to edit their code, there was NO way of understanding it unless you actually talked to them and realized how they were thinking about the problem. It's not that the code was so poor, but they wrote code based on how they'd seen the business operate, and that just didn't translate nicely into straightforward code.

Personally, I don't see how creating a language that encourages this behaviour can be a good thing. Isn't this the point of learned programmers? The ability to translate real world situations into easy to understand processes? Then again, I'm no language development guru. :)

People think in their languages. (3, Insightful)

Mr.Spaz (468833) | more than 8 years ago | (#10833175)

One of the big problems this approach will have to overcome (in my opinion) is that people generally tend to order their thoughts in a manner specific to their native language. A development environment that seems intuitive and easy to use to a native English speaker might be backwards or obtuse to a person who natively speaks another language. To clarify; I'm not speaking strictly of grammatical structure of language, but of a seemingly inherent difference in the way people learn things based on what language is used in the teaching. For this reason it has always seemed better to me for programmers to learn a new, common language (that of the higher-level compiler they are interested in) so that when they work with others, everyone is on the same page (similar to scientists and doctors using Latin nomenclature).

I'd imagine that a "natural language" system could be developed with different approaches based on the native tongue of the programmer, but I would think this would damage the benefits of commonality that other languages now enjoy.

Re:People think in their languages. (1)

Chundra (189402) | more than 8 years ago | (#10833382)

"For this reason it has always seemed better to me for programmers to learn a new, common language (that of the higher-level compiler they are interested in) so that when they work with others, everyone is on the same page"

Yeah, like maybe the programming language itself? Maybe it's just me, but I think in whatever programming language I'm using. It's so much harder to try to convert data structures and algorithms to a natural language, or vice versa. The best we can hope for (in my opinion) is people using good, terse, descriptive function/method and variable names. Of course, that'll never happen. :)

Re:People think in their languages. (1)

Mr.Spaz (468833) | more than 8 years ago | (#10833447)

Yeah, like maybe the programming language itself?

Actually, that's precisely what I was referring to (see the following sentence in parentheses).

Maybe my natural language wasn't clear enough. ;)

I didn't RTFA... (3, Insightful)

GillBates0 (664202) | more than 8 years ago | (#10833181)

The goal is to make it possible for people to express their ideas in the same way they think about them.

That's about as far as I got. I guess he didn't really express his ideas in the same way that I wanted to think about them.

Which nicely illustrates the point that there's always a "semantic gap" associated with natural languages, which builds up because people have different ways of thinking. The semantic gap is even wider when one of the entities being communicated to happens to be a machine. There's a reason why traditional programming languages are precise and exact...it's so that the gap is reduced - the machine will do exactly what you tell it to do...even then we have a disconnect between what the programmer's thinking, and the code that he's writing.

Real Men ... (-1, Troll)

Anonymous Coward | more than 8 years ago | (#10833191)

... Don't Use French.

Natural Language isn't for Serious Programming (3, Insightful)

I_Love_Pocky! (751171) | more than 8 years ago | (#10833203)

Natural language isn't precise enough for serious programming. I personally wouldn't enjoy typing so much for no added benefit. It seems like this sort of thing only has value amongst people who are learning to program. Why would a mainstream language like Java or C# cater to this bunch?

Re:Natural Language isn't for Serious Programming (1)

alw53 (702722) | more than 8 years ago | (#10833423)

The article didn't seem to suggest that people try to program in English. The only example I saw was that of adding up a bunch of numbers in a set, and I thought it was spot-on about that issue. Everything in Java and C happens at the element level. But Lisp has a MAP function, and APL has reduction, and even BASIC can add A+B, where A and B are arrays. The higher-level operations that eliminate most of the explicit looping structures have been available for years, but nobody ever builds them into new languages. In fact, language design seems to be moving backward, as all three of those languages were designed in the 1960s.
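For instance, here is the element-level version next to the aggregate version, as a quick Python sketch (mine, not anything from TFA):

from functools import reduce

numbers = [3, 1, 4, 1, 5, 9]

# Element-level, Java/C style: the loop mechanics are spelled out.
total = 0
for n in numbers:
    total += n

# Aggregate, closer to "add up the numbers in the set":
total = sum(numbers)

# In the spirit of Lisp's MAP and APL's reduction:
doubled = list(map(lambda n: n * 2, numbers))
total = reduce(lambda a, b: a + b, numbers)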

Re:Natural Language isn't for Serious Programming (1, Insightful)

Anonymous Coward | more than 8 years ago | (#10833540)

Natural language isn't precise enough for serious programming.

What do you mean by "serious programming"? This strikes me as someone who learned assembly mocking those who know C, who in turn mock those who use Java.

Ah, I see the problem... (2)

Nijika (525558) | more than 8 years ago | (#10833209)

"The goal is to make it possible for people to express their ideas in the same way they think about them." There's your problem right there :) I think they're probably not being adopted because in the world of programming convention is the key to interoperability. Human thought and language aren't so strictly tied to convention.

Remember Apple Script (2)

brw215 (601732) | more than 8 years ago | (#10833210)

It seems to me that the steps in the Natural Programming approach are not at all novel, and certainly not as useful as they appear. The authors seem to have forgotten the train wreck that was AppleScript. The authors state that the syntax of programming languages is too complex. I would argue that the syntax of a programming language needs to be more complex than the syntax of a natural language. The sad fact is that English (and other natural languages) were not designed with enough precision for things like programming languages. For example, if in some natural programming language someone were to state "if x or y do z", does this statement mean that x and y need different values, or can they both be true? One can't tell from looking at the statement.
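In a formal language the two readings are distinct operators. A minimal Python sketch of the ambiguity (mine; do_z is just a placeholder):

def do_z():
    print("doing z")

x, y = True, True

# Reading 1, inclusive or: z happens when either or both hold.
if x or y:
    do_z()

# Reading 2, exclusive or: z happens only when exactly one holds.
if x != y:
    do_z()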

Re:Remember Apple Script (2, Insightful)

Anonymous Coward | more than 8 years ago | (#10833432)

> The authors state that the syntax of programming languages is too complex. I would argue that the syntax of a programming language needs to be more complex than the syntax of a natural language.

I think you really mean the opposite of what you said. The syntax of natural language is bogglingly complex. You can express the syntax of even Perl with a few kilobytes of EBNF. Noam Chomsky tried to come up with formal syntax rules for spoken languages and utterly failed (though his work is what led to BNF and company).

Re:Remember Apple Script (0)

Anonymous Coward | more than 8 years ago | (#10833518)

That's what and/or is for :)

HyperTalk (0)

Anonymous Coward | more than 8 years ago | (#10833229)

The language of HyperCard has to be the one most resembling the English language.

"Programming for the rest of us"

Interesting Paper (2)

110010001000 (697113) | more than 8 years ago | (#10833233)

The paper is very unfocused. It is less about natural language programming and more about debugging environments and event driven programming. These guys must have received a grant of some type or must be working on a Ph.D. because they created yet another "natural language" environment and language and tested it on children (is that the target audience for your research?).
The reason procedural and "linear" programming works in the real world is that it mimics the operation of the digital computer itself (fetch, execute), and that concept and its tools are well understood. Until that underlying structure is changed, there is no reason to switch away.

Some Natural language programming examples: (0)

Anonymous Coward | more than 8 years ago | (#10833249)

It was the best of for loops and the worst of for loops.

Letz get some spreadsheetz up in this hizzy.

Now is the time for all good men to come to the aid of this malloc.

I don't quite see the point (1, Interesting)

Anonymous Coward | more than 8 years ago | (#10833262)

I know that, being trained and experienced in traditional programming languages I am somewhat biased, but I don't quite see the point. We don't use natural language in other technical disciplines. There's no natural language math, physics, law, or biology.

What is the PURPOSE of natural language? (3, Insightful)

Anonymous Coward | more than 8 years ago | (#10833264)

IMO it's nothing more than a better way to introduce *newbies* into programming.

Why would any programmer want to code in English? To me this:

myvar++

makes more sense than:

increase the variable myvar by one please

Do we really want people who can't understand something as simple as "myvar++" to be programming in the first place? Seems to me we NEED a barrier to entry. There're enough lousy programmers out there already.

The problem is (2, Insightful)

Anonymous Coward | more than 8 years ago | (#10833273)

It isn't that no languages following these principles are coming out; lots of them are. It's just that the only languages that have become popular ignore these principles.

The fact is that people don't care what's academically sound, or what people have "proven" is the best way to do things. In fact, the things people do care about are directly contradictory with what's academically "best". It isn't some kind of head-slapping coincidence that the new popular languages ignore "natural programming". It's the market speaking, and it's saying "we don't want natural programming languages".

How about this... (1)

HexaByte (817350) | more than 8 years ago | (#10833286)

We develop a computer with voice interaction designed to write user programs. It will of course require superior programmers who will have to be paid millions, since they are working themselves out of a job. But then again, aren't we all working ourselves - or someone else - out of a job by helping the automation revolution?

How many typing-pool people have we put out of work with the word processor? The list goes on....

Wow. (4, Insightful)

cbiffle (211614) | more than 8 years ago | (#10833316)

Well, I'm not sure if it's that nobody read the article, or if nobody actually understood it, but.

We've had a lot of posts about "OH NO! COBOL!" Yes, yes, I agree with you -- pretending to be English usually results in awkward and unnatural syntaxes. One of the advantages of a formal syntax like most programming languages is that it clicks the brain into a different mode. (How many of you can read sigs like 2b||~2b? I thought so.)

But that's not really the paper's main aim. It makes a couple of notes that all of us, particularly those of us in language design, could benefit from.

1. People tend to deal with collections in the aggregate far more often than they step through them an item at a time. The example given was "set the nectar of all the flowers to 0." Look past the syntax for a moment and look at how simple that is.

2. Debugging the traditional way sucks. Did anyone actually read that bit at the end about the 'Why?' questions, and look at the screenshots? Holy crap. That's really impressive.

Of course, I may be biased, because the points made in the article are basically the same that underlie a language I'm currently designing. :-) (And no, I'm not using Englishy COBOL syntax.)
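To make point 1 above concrete, a small Python sketch (mine; Flower and nectar are just stand-ins from the paper's example, not a real API):

class Flower:
    def __init__(self, nectar=0):
        self.nectar = nectar

flowers = [Flower(5), Flower(3), Flower(8)]

# Item-at-a-time, the way most languages make you write it:
for flower in flowers:
    flower.nectar = 0

# Aggregate reads are already close to the paper's phrasing,
# "the nectar of all the flowers":
total_nectar = sum(flower.nectar for flower in flowers)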

Do any languages support Natural Language? (1)

Bill, Shooter of Bul (629286) | more than 8 years ago | (#10833321)

Usually new concepts spawn new programming languages. Then, after the concept has been proven, the features are hacked into other existing languages. Like objects and Smalltalk, and the like. There are plenty of experimental languages out there. If it really is a good idea, create a new language around it. If it's such a fundamental change, then it couldn't be dropped into a production-quality language anyway. No one would have the skills to use it, and everyone would switch to a competing language product, especially risk-averse managers.

set the nectar of all flowers to 0??? (1)

dmorin (25609) | more than 8 years ago | (#10833332)

In my experience the problem has always been that no one can really agree on limits to what a natural language should be able to do. Take the example in the subject, from the document -- "set the nectar of all flowers to 0". Fine, it's "more natural" than something like
for (int i = 0; i < flowers.length; i++) { flowers[i].setNectar(0); } or something like that.

But wouldn't the most natural way be to say something like "no flowers have nectar"? This gets into a completely different level of parsing. In that statement you need to understand the logical set that is established (none of the set of all flowers) and that, although there is an attribute "nectar" that all flowers have, it is a quantity that starts with a zero value. You might even have started by saying "all flowers can have nectar" to set up the example.


Positive Example(s)? (1)

BaldingByMicrosoft (585534) | more than 8 years ago | (#10833348)

The article seems to focus on what current, popular languages are -not- doing. What are some examples of languages that use this research?

Deterministic vs. nondeterministic (2, Insightful)

Just Some Guy (3352) | more than 8 years ago | (#10833350)

The current crop of computers is based on series of logical operations, and there is little in common between discrete logic and the Real World. Most programming languages bear a strong resemblance to mathematical notation, and that's not just coincidental.

The real problem is a lack of strong domain models for most real world situations. That is, if you're starting a project to emulate something happening outside of a computer, then there's a very large likelihood that you're going to have to build your own object model to describe the situation to the desired level of accuracy. Once you have that model, it's easy enough to say "do this until that happens", but there's a world of difference between that point and staring at a blank screen at the beginning of a project.
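As a toy illustration of that point (all names invented here): once the domain model exists, "do this until that happens" really is the easy part. A minimal Python sketch:

class Reactor:
    """A trivial stand-in for a real-world domain model."""
    def __init__(self, temperature=20.0):
        self.temperature = temperature

    def heat(self, degrees):
        self.temperature += degrees

    def at_target(self, target):
        return self.temperature >= target

reactor = Reactor()

# "Do this until that happens" is trivial once the model above exists:
while not reactor.at_target(100.0):
    reactor.heat(5.0)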

There's been some progress (depending on who you ask) to make this easier for those who aren't full-time programmers, such as UML and related design tools, but even these are mainly limited to building a high-level template of the final result so that a human can manually implement all of the details.

This may or may not be avoidable. Vernor Vinge (author and CompSci professor) refers to the "Age of Failed Dreams", where humans eventually concede that some things just aren't possible. Expecting a current deterministic Turing device to be programmable at the level where people interact with each other may very likely be one of those areas.

Brad Cox spoke at RubyConf 2004... (1)

tcopeland (32225) | more than 8 years ago | (#10833359)

...a torrent of all the presentations is here [rubyforge.org].

Anyway, he had some interesting things to say about micropayments; a summary of his talk is at the bottom of the page in Jim Weirich's blog here [onestepback.org].

You mean people write code? (1)

RandoX (828285) | more than 8 years ago | (#10833398)

I thought it was some kind of Code Gnome(tm) that only came out at night.

We used to leave a case of Mountain Dew out for them in the computer labs in college.

Non-natural-language definitely has its place (0)

Anonymous Coward | more than 8 years ago | (#10833399)

The most prominent example of non-natural language is mathematics. In the old days, mathematicians actually wrote math out in words, and the result was very long manuscripts. But today it's virtually impossible to do anything without invoking notation and terminology that often seem cryptic to outsiders.

Natural language might be useful if you want, say, your user to be able to script your application to perform simple tasks. But when precision and domain-specific functionality are needed, it's best to use a notation designed for that purpose.

I have a paper on this (1)

CrazyJim1 (809850) | more than 8 years ago | (#10833406)

Basically you get a 3d imagination space in the computer, and a camera system that can assimilate objects from reality into imagination space.

Then suddenly, you can go all Zork in describing items. I don't see it happening for at least 15 years, though. Computer vision recognition is really in its infancy.
More AI stuff:
www.geocities.com/James_Sager2

God spoke to me:
www.geocities.com/James_Sager_PA

Natural Language?!?! (1)

Phixxr (794883) | more than 8 years ago | (#10833425)

Natural language (especially English) isn't structured enough to handle the rigors of team programming and code reuse.

In my estimation, the closer programming languages get to math, the better off we'll be.

It doesn't need to be easy to write, it needs to be easy to read later.

-Phixxr

AppleScript, anyone? (0, Redundant)

Sathamoth (649713) | more than 8 years ago | (#10833438)

Has anyone tried AppleScript? It's a very close-to-natural scripting language, and personally, I think it's awful. For me it's much easier to write things in PHP, Perl, or Java than in "human speakable" AppleScript.

English and Computer Languages don't Mix Well. (3, Interesting)

jellomizer (103300) | more than 8 years ago | (#10833451)

It would be nice to send out the specs for the program, run them through the parser, and get the program you want, but the truth is that normal human language wasn't designed for problem solving, especially with some of the details that programming requires. Things like nested lists: (1,2,3,(2,4,3,2),5,2,(2,3,5,6)). These are easy to learn to program and use, but much harder in natural language.

Make a list with the values 1, 2, 3; then this is a list of 2, 4, 3, 2; now we are back in the first list with some more values of 5 then 2; now we get another list inside this list as 2, 3, 5, 6; now we finish both lists.

As you can see, in English this is clumsy. I am sure someone with a better mastery of English may be able to make it a little more precise, but just giving up and using the parentheses makes it a lot easier to see and understand than using a bunch of words.
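The same contrast in running code, as a quick Python sketch (mine, not the poster's):

# The nested list as code: one line, structure visible at a glance.
values = [1, 2, 3, [2, 4, 3, 2], 5, 2, [2, 3, 5, 6]]

# The same structure, narrated the way the paragraph above tries to:
outer = [1, 2, 3]           # "make a list with the values 1, 2, 3"
outer.append([2, 4, 3, 2])  # "then this is a list of 2, 4, 3, 2"
outer += [5, 2]             # "back in the first list with 5 then 2"
outer.append([2, 3, 5, 6])  # "another list inside this list"
assert outer == values      # both descriptions build the same thing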

Most human languages were made thousands of years ago, and came from languages tens of thousands of years old, if not older. They were not designed for the micro-processing of information; they were built for common-sense reasoning, which we humans often fail at a lot ourselves. Imagine how poor a computer would be at common sense.

Think? (0)

Anonymous Coward | more than 8 years ago | (#10833461)

we have been working to create programming languages and environments that are more natural, or closer to the way people think about their tasks.

Um, this mistakenly assumes that most workers think.

What is natural language? (2, Insightful)

drgonzo59 (747139) | more than 8 years ago | (#10833468)

It is a matter of habit and training. I am used to thinking in terms of objects, so any object-oriented language is a "natural language" for me. When I solve a problem I think of objects, methods, properties, and how they work together. I don't have to translate from some abstract "natural" concepts to OO concepts. I am sure someone who is using Lisp will see lists and functions in the same problem where I see objects and methods.
I understand that the goal is to have the user just tell the computer what to do in English. The problem is that English is not precise and is too ambiguous. I don't know if I would want to fly on an airplane if I knew the computer on board was programmed in English.

They're just not thinking hard enough? (1)

t_allardyce (48447) | more than 8 years ago | (#10833481)

OOP was meant to let people program like they think, but people never really bothered with it. With object inheritance you can layer everything right down to the simplest easy-to-program interface without a performance hit. The problem is people think messily, and that leads to bad code. Even worse, people don't even think tidily enough to modularise/class/layer anything, so you have bad code with no structure (good structure with bad code is OK because you can just swap out bad modules/functions for good ones). I think PHP has come far as a natural language, and it's managed to not be too slow and bloated. C++ with C is also good because you can skip the bullshit for most things (e.g. string = string + string + number), and when you have a tight loop or bottleneck you can use C to optimise it with some hackery. Do you really need anything else?

"Goal oriented" computing (0)

Anonymous Coward | more than 8 years ago | (#10833486)

Having RTFA, I see the point that a program that's easier to follow is nice. It appears to be a call for more "goal oriented" programming: rather than writing "Set y=0. For each x, compare x to the stored value y. If x>y, then y=x; otherwise move on to the next x", you'd instead write "find greatest x". Simpler, and it avoids bogging down in the mechanics of HOW to find the greatest value; focus on WHAT needs to happen, and presumably the compiler takes care of the rest.
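Roughly, the contrast in Python (a sketch, not anything from TFA):

xs = [3, 17, 5, 12]

# HOW: spell out the mechanics of finding the greatest x.
y = xs[0]
for x in xs[1:]:
    if x > y:
        y = x

# WHAT: state the goal; the implementation handles the mechanics.
y = max(xs)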

The only problem I see is that, if you leave the mechanics in the background, a few things happen.

First, you'll produce a generation of programmers who lack the nuts-and-bolts fundamentals of how to do things. And please, no "yeah, so when's the last time YOU wrote assembly" flames; I mean simply that papering over the mechanics will lead to a loss of understanding of them.

Second, you'll lose the ability to optimize. All the values I will ever have in my domain are short integers. How would the "find greatest x" function know this? I'd expect it would have to consider that maybe some of my x's are long double floating points, and allocate space based on that.

Third, you lose control over conditions. "When event happens, do action" would be great, but who defines "event"? Today, this is obvious: we use flags and variable compares to determine it. If we move away from that, we'll get even MORE subtle bugs that will be even HARDER to debug. Now everything LOOKS right, but the problem is with how the compiler interprets "when event happens", which is invisible to the programmer...

Not "Natural Language Programming" (1)

The Pim (140414) | more than 8 years ago | (#10833494)

Argh! The slashdot title completely mischaracterizes the article. The authors never use the term "natural language" at all! They call what they're talking about "natural programming", and if you read the article I hope you'll agree that it is something we should all be longing for: the ability to express ourselves in code that is close to the problem domain.

IMO, the best direction for natural programming is embedded domain-specific languages [yale.edu]. The best direction for natural debugging is a harder problem. It's well known that many expert programmers still find "printf" debugging the best option, which suggests to me that tracing systems [haskell.org] are promising. Of course, powerful type systems eliminate many possible run-time bugs, but then you need a type debugger [nus.edu.sg]....
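As a flavor of the tracing idea, a toy Python sketch (this trace decorator is made up here, not one of the linked systems):

import functools

def trace(fn):
    """Print every call and return value: printf debugging, packaged."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        print(f"call {fn.__name__}{args}")
        result = fn(*args, **kwargs)
        print(f"  {fn.__name__} returned {result!r}")
        return result
    return wrapper

@trace
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(4)  # prints the whole call tree as it runs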

NOT natural language programming (1)

useruser (638080) | more than 8 years ago | (#10833499)

Interesting how everyone thinks the article is about Natural Language programming. The researchers say nothing about "Natural Language": they only use the word "Natural" to mean "pertaining to the constitution of a thing."

Slashdot responds: (0, Redundant)

Control Group (105494) | more than 8 years ago | (#10833503)

"This is stupid. If you need natural language as a crutch to program, you shouldn't be programming."

"This will open the door to everyone programming, and spell the end of Micro$oft's monopoly."

"This will suck because performance will be crap. Just like Java. Real coders write ASM."

"I remember when they designed FORTRAN and COBOL, so I am qualified to say that this will never work, just like those didn't."

"This will be cool as soon as someone writes a GPL'd version of it."

"IANASECSLOIAOWQTGAEO (I Am Not A Software Engineer, Computer Scientist, Linguist Or In Any Other Way Qualified To Give An Educated Opinion), but..."

"Can you imagine a Beowulf cluster of natural language in Soviet Russia jokes?"

(Me? Post flamebait? That's unpossible!)

standard flaw in research like this (2, Insightful)

egomaniac (105476) | more than 8 years ago | (#10833506)

Why does all research like this seem to revolve around "toy" problems? They study non-programmers or, when they include real programmers, focus only on small tasks that can be completed in an hour or so.

Great, I accept that a new language can make toy problems easier.

However, I think the situation is very different when you have a real programmer working on a real program. Writing a real application, like a word processor or a web browser, is difficult no matter what language you do it in -- and I would argue that the difficulty doesn't vary much between languages. In fact, I would further argue that many of these research languages, while making toy problems easier, would actually make "real" programming substantially harder, because the semantics of the language are not as formalized and are thus more difficult to remember and deal with.

I'm certainly not opposed to advances in language theory and design -- our modern-day large applications would be essentially impossible to write if all we had to work with was machine language. But to be a major advance, a new language should focus on making real problems easier for real programmers, not making toy problems easier for non-programmers.