
Stephen Wolfram Developing New Programming Language

samzenpus posted about 10 months ago | from the lets-try-this dept.

Programming

Nerval's Lobster writes "Stephen Wolfram, the chief designer of the Mathematica software platform and the Wolfram Alpha 'computational knowledge engine,' has another massive project in the works—although he's remaining somewhat vague about the details for the time being. In simplest terms, the project is a new programming language—dubbed the 'Wolfram Language'—which will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every single type of hardware from PCs and smartphones all the way up to datacenters and embedded systems. The Language will leverage automation to cut out much of the nitpicking complexity that dominates current programming. 'The Wolfram Language does things automatically whenever you want it to,' he wrote in a recent blog post. 'Whether it's selecting an optimal algorithm for something. Or picking the most aesthetic layout. Or parallelizing a computation efficiently. Or figuring out the semantic meaning of a piece of data. Or, for that matter, predicting what you might want to do next. Or understanding input you've given in natural language.' In other words, he's proposing a general-purpose programming language with a mind-boggling number of functions built right in. At this year's SXSW, Wolfram alluded to his decades of work coming together in 'a very nice way,' and this is clearly what he meant. And while it's tempting to dismiss anyone who makes sweeping statements about radically changing the existing paradigm, he does have a record of launching very big projects (Wolfram Alpha contains more than 10 trillion pieces of data cultivated from primary sources, along with tens of thousands of algorithms and equations) that function reliably. At many points over the past few years, he's also expressed a belief that simple equations and programming can converge to create and support enormously complicated systems. Combine all those factors, and it's clear that Wolfram's pronouncements—no matter how grandiose—can't simply be dismissed. But it remains to be seen how much of an impact he actually has on programming as an art and science."


Well... (5, Interesting)

Adam Colley (3026155) | about 10 months ago | (#45431867)

Hrm, another programming language...

Attempts have been made in the past to automate programming; it's never worked very well (or at all, in some cases).

Still, I look forward to seeing it. Perhaps I'll be pleasantly surprised.

Re:Well... (4, Informative)

Nerdfest (867930) | about 10 months ago | (#45431887)

Perhaps, but I can't help thinking that making assumptions will lead to unpredictable and inconsistent behaviour. Convention over configuration and type inference are one thing, but assumptions are another thing entirely. It's like the dangers in lower-level languages where a programmer assumes memory will be zeroed ... and _usually_ it is. It leads to obscure errors. There's a lot to be said for being explicit where possible.

Re:Well... (5, Insightful)

plover (150551) | about 10 months ago | (#45432443)

People seem to think that the problems with programming come from the languages. They're too weakly typed, too strongly typed, they use funny symbols, they don't have enough parentheses, they use significant whitespace.

The biggest problems aren't coming from the languages. The problems come from managing the dependencies.

Everything needs to change state to do useful work. But each state has all these dependencies on prior states, and is itself often setting up to perform yet another task. Non-programmers even have a cute phrase for it: "getting your ducks in a row" means that if you get everything taken care of in advance, your task will be successful.

Ever notice that on a poorly done task it's so much easier to throw away the prior work and start over? That's because you've already solved the hard part: you learned through experience which things need to happen in which order, which was the root of the hard problem in the first place. When you redo it, you naturally organize the dependencies in their proper order, and the task becomes easy.

What a good language has to do is encapsulate and manage these relationships between dependencies. It might be something like a cross between a PERT chart, a sequence diagram, a state chart, and a timeline. Better yet, the environment should understand the dependencies of every component to the maximum degree possible, and prevent you from assembling them in an unsuccessful order.
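
To make that concrete, here is a minimal sketch in Python of the "line up the ducks" part, using a hypothetical task graph (the task names and edges are invented for illustration). The standard-library graphlib does the ordering and refuses an impossible arrangement outright:

    from graphlib import TopologicalSorter  # Python 3.9+

    # Hypothetical build tasks: each task maps to the tasks it depends on.
    deps = {
        "deploy": {"test"},
        "test": {"build"},
        "build": {"fetch_deps", "generate_code"},
        "generate_code": {"fetch_deps"},
        "fetch_deps": set(),
    }

    # static_order() yields tasks so that every dependency comes first;
    # it raises CycleError if no successful order exists.
    for task in TopologicalSorter(deps).static_order():
        print(task)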

Get the language to that level, and we won't even need the awkward syntax of "computer, tea, Earl Grey, hot."

Re:Well... (4, Insightful)

Greyfox (87712) | about 10 months ago | (#45432761)

It's very rare that I see a dev team throw something away, unless it's an entire project. Once something's written, people seem to treat it as carved in stone, and they never look at it again. I was looking at some code a while back that output a file full of numbers in a particular format. Over the years the format had changed a few times. Their solution for addressing that was to write a new piece of code that took the original output file, and reformatted it to the new output version. The next time a format change came along, they did the same thing again using the file their reformatter had output! There was a lot of other nonsense in there that was apparently written so that they'd never have to go back and change anything that had already been written. And that kind of mentality seems to be pervasive in the industry (though usually not to THAT extreme.)

So people bitch about that, or about the business process, and I tell them, "Well, if it's not working for you, FIX it! It doesn't HAVE to be this way; we could just do things differently!" And they look at me as if I'd just suggested the Earth is flat.

Re:Well... (3, Interesting)

LongearedBat (1665481) | about 10 months ago | (#45433315)

I do, frequently. And my code is better 'cos of it. In my experience, when people are too afraid to start a module afresh, it's because they're afraid that they don't/can't/won't understand the problem well enough to a) write a solution that works, or b) understand the insufficiencies/faults of the existing code well enough to do a better job next time around.

Node based GUI programming may be the way (0)

Anonymous Coward | about 10 months ago | (#45433661)

May as well program visually with a node tree. (Which is much like a flow chart in determining logical relationships. Works great in things like CGI software where complicated behaviors or relationships need to be modeled, so why not program elsewhere that way?) Then you could show the status of things like dependencies with the color of the nodes or node sockets. Sockets and the noodles used to connect sockets could also be color-coded to show the data types they accept or carry. Functions can be built by grouping nodes, which can be collapsed down into a logic block that appears as a single node. Library stuff that may be handy for later use could be kept in a browsable sidebar where you access things via drag-and-drop. Tooltips can be available to explain node behavior in more detail if its name isn't obvious enough. Comment boxes can be sticky-pasted around or attached to certain nodes. If somebody makes spaghetti code, it literally appears as spaghetti code. Also, you eliminate almost all the syntax problems by doing it graphically. Knowing correct syntax is one of the major headaches for people new to programming, or for people who switch between languages that do things in arbitrarily different ways. Without that kind of overhead, you worry more about the actual process of what you want done instead of how to describe what you want done.

In theory, if a node-based graphical programming editor is created right, it should be able to output code in the syntax of any programming language you choose. Diagram what you want, click the button for C++/JavaScript/Python/etc., and you get the desired source saved out for use in your compiler or interpreter of choice. So it really wouldn't have to be a language, but more like a meta-language.
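
As a toy sketch of that meta-language idea (everything here is invented for illustration), the node graph can be plain data, and "choosing a language" is just choosing which emitter walks the graph:

    # Hypothetical node graph: each node is an operation plus input sockets
    # that reference other nodes -- the "noodles" of the visual editor.
    nodes = {
        "a": {"op": "const", "value": 2},
        "b": {"op": "const", "value": 3},
        "sum": {"op": "add", "inputs": ["a", "b"]},
    }

    def emit_python(node_id):
        """Walk the graph and emit a Python expression for one node."""
        node = nodes[node_id]
        if node["op"] == "const":
            return str(node["value"])
        if node["op"] == "add":
            return "(" + " + ".join(emit_python(i) for i in node["inputs"]) + ")"
        raise ValueError("unknown op: " + node["op"])

    print(emit_python("sum"))  # -> (2 + 3)

Emitting C or JavaScript instead would just mean swapping in a different emitter over the same graph.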

Re:Well... (4, Insightful)

fuzzyfuzzyfungus (1223518) | about 10 months ago | (#45432875)

Perhaps, but I can't help thinking that making assumptions will lead to unpredictable and inconsistent behaviour. Convention over configuration and type inference are one thing, but assumptions are another thing entirely. It's like the dangers in lower-level languages where a programmer assumes memory will be zeroed ... and _usually_ it is. It leads to obscure errors. There's a lot to be said for being explicit where possible.

This is Stephen "A New Kind of Science" Wolfram. The guy who cataloged some cellular automata (and had his uncredited research peons catalog a bunch more) and then decided that he'd answered all the profound questions of metaphysics. I'm not sure that banal matters of 'software engineering' are his problem anymore.

A very sharp guy. However, like many sharp guys, he seems to have entered his obsessive pseudoscience and grandiosity phase. (Same basic trajectory as Kurzweil, whose achievements are not to be underestimated, but who now basically evangelizes for nerd cyber-jesus full time.)

Re:Well... (4, Interesting)

physicsphairy (720718) | about 10 months ago | (#45432895)

Being explicit is precisely what makes programming laborious and tedious. It is entirely true that without such tediousness, you do not enjoy a full range of functionality. But the vast majority of the time you do not need a full range of functionality.

Speaking as someone in a scientific major, Wolfram|Alpha has quickly become the go-to resource for everyone looking to do quick, more-than-arithmetical calculation. It does a fantastic job of anticipating what you need and providing the appropriate options. If I need a differential equation integrated or the root of some expression I *can* ask for it explicitly, but usually I just type in the expression and everything I need will be generated by Wolfram automatically. For involved projects I do set up my problems in Python, but 99% of the time Wolfram|Alpha does just what I need for a hundredth of the effort. The fact that my peers are using it the same way is notable because, while before Wolfram I might use Python or Maple or Mathematica, most everyone else would do these things by hand -- learning to use the available tools was something they considered too intimidating or not worth the effort.

If Stephen Wolfram can do something vaguely like Wolfram|Alpha with more ability to customize and automate what is happening, it's going to transform academics, maybe even down to the high school level. Imagine being able to easily develop a science fair project which requires solving some complicated ODEs, without having to take 3 years of college math first.
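
For comparison, here is roughly what the explicit route looks like today -- a short SciPy sketch (the particular equation and root problem are arbitrary examples) doing the two tasks mentioned above:

    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import brentq

    # Integrate the ODE y' = -2*t*y with y(0) = 1 (exact answer: exp(-t^2)).
    sol = solve_ivp(lambda t, y: -2 * t * y, (0.0, 2.0), [1.0], dense_output=True)
    print(sol.sol(1.0)[0], np.exp(-1.0))  # both ~0.3679

    # Find the root of x^3 - x - 2 on the interval [1, 2].
    print(brentq(lambda x: x**3 - x - 2, 1.0, 2.0))  # ~1.5214

In Wolfram|Alpha, each of those is a single line of plain-ish text.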

Re:Well... (2, Interesting)

Anonymous Coward | about 10 months ago | (#45431949)

Lisp worked well. So much so that most of the languages since C basically go, "Here's our idea: we're going to be like C in base thinking, but extra. What extra? Well, we're going to add Lisp-like features, usually in a half-baked manner." The only really major variation is languages targeting multicore operation, but they tend to be functional, like Lisp.

Problem with C is that it's a high-level assembly. Great for computers as they were in the 1970s and 1980s.

Problem back then was that Lisp was too heavy. Problem now is that Lisp is too fragmented.

I'm waiting to see if Wolfram does more "C + some of Lisp," or if it will be anything novel.

Re:Well... (0)

Anonymous Coward | about 10 months ago | (#45432221)

That comment reminds me of this essay:
http://www.winestockwebdesign.com/Essays/Lisp_Curse.html [winestockwebdesign.com]

The Lisp Curse. Too powerful for its own good. Attracts brilliant hackers but not enough of the type of people who shove documented things out the door for us mere mental peons.

Re:Well... (2)

fuzzyfuzzyfungus (1223518) | about 10 months ago | (#45432899)

"Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp."

-Greenspun's Tenth Rule.

Re:Well... (1)

K. S. Kyosuke (729550) | about 10 months ago | (#45433313)

... which will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every single type of hardware from PCs and smartphones all the way up to datacenters and embedded systems. The Language will leverage automation to cut out much of the nitpicking complexity that dominates current programming. 'The Wolfram Language does things automatically whenever you want it to,' he wrote in a recent blog post. 'Whether it's selecting an optimal algorithm for something. Or picking the most aesthetic layout. Or parallelizing a computation efficiently. Or figuring out the semantic meaning of a piece of data. Or, for that matter, predicting what you might want to do next. Or understanding input you've given in natural language.' In other words, he's proposing a general-purpose programming language with a mind-boggling number of functions built right in.

Well, that's pretty much a description of Common Lisp in the hands of a capable lisper. ;-)

Re:Well... (3, Insightful)

rudy_wayne (414635) | about 10 months ago | (#45431953)

Even really smart people come up with stupid ideas.

Anything that is capable of doing complex things is complex itself. It's unavoidable. Even if every function by itself is extremely simple -- just press the green button -- what happens when there are a thousand buttons, and any one of them can interact with any other button?

Re:Well... (0)

Anonymous Coward | about 10 months ago | (#45432205)

Kick it? Seems to work for almost everything.

Re: Well... (0)

Anonymous Coward | about 10 months ago | (#45432397)

Even stupid ideas can succeed; just choose the right mascot and a sufficiently hyped-up preacher-evangelist.

Re:Well... (4, Funny)

rudy_wayne (414635) | about 10 months ago | (#45431995)

Hrm, another programming language...

Attempts have been made in the past to automate programming; it's never worked very well (or at all, in some cases).

Too many people think that programming is "just a lot of typing". Which leads people to believe that they should create a "new programming language" where you can just type "Make a new better version of Facebook" and be done with it.

Which leads to a lot of crap with "Visual" in its name. Hey look, you don't have to type. Just drag this widget from here to here. And we've seen how well that turned out.

Re:Well... (1)

jythie (914043) | about 10 months ago | (#45432483)

I think part of the problem is that making new languages is fun and sexy, so people keep doing it rather than building frameworks, libraries, or editors on top of existing ones. So we end up with dozens of half-baked languages that do not work together and are missing a great deal of functionality... with more on the way, trying to fix the problem with the same solution that got us into the mess in the first place.

Re:Well... (1)

skids (119237) | about 10 months ago | (#45433369)

Attempts have been made in the past to automate programming; it's never worked very well (or at all, in some cases).

The places where it does work, you don't notice. Compilers/optimizers/JIT engines are automated programming. You tell the system what you want to do, and behind the scenes it figures out all the stuff you did not tell it. Like not actually checking X again if you checked it earlier and X could not have changed, even if you told it to check X again because it was easier for you to write it that way.
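
A contrived Python sketch of that kind of rewrite (CPython itself won't do this; an optimizing compiler for a static language effectively would, since x cannot change inside the loop):

    def as_written(items, x):
        total = 0
        for item in items:
            if x > 0:  # re-checked every iteration, as the programmer wrote it
                total += item * x
        return total

    def as_optimized(items, x):
        # Loop-invariant check hoisted out: tested once, not len(items) times.
        total = 0
        if x > 0:
            for item in items:
                total += item * x
        return total

Both return the same result; the system "figured out the stuff you did not tell it."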

That said, we have words for this in Perl5/Perl6: DWIM (Do What I Mean) and WAT (acronym open to conjecture and often followed by ?!?!), and the Perl6 folks are working under the hypothesis that every DWIM you add will cause an equal and opposing WAT, so they're careful not to DWIM just for the sake of it.

TFA sounds like an uber-DWIMmy initiative. The corresponding WAT behavior is almost guaranteed to be hilarious.

Meh... (2)

DrPBacon (3044515) | about 10 months ago | (#45431869)

I'll stick with C

Re:Meh... (2)

npridgeon (784063) | about 10 months ago | (#45431895)

Anything with this guy's name on it makes me want to distance myself from it. Alpha was a tracking disaster and I still receive junk mail from this clown.

Re:Meh... (1)

justthinkit (954982) | about 10 months ago | (#45432741)

Don't forget this [wikipedia.org]

Re:Meh... (0)

Anonymous Coward | about 10 months ago | (#45432477)

This was pretty much the goal of Ada. Useful for embedded systems, all the way up to data processing. Though really only good for embedded systems.

Re:Meh... (2)

Sponge Bath (413667) | about 10 months ago | (#45432515)

Say brother, can you spare a pointer?

Re:Meh... (1)

BlackHawk-666 (560896) | about 10 months ago | (#45432621)

I have one going cheap here. It's just a copy of a pointer to a char which I am using globally in a multithreaded program with no semaphores or mutexes. It will probably work as long as you use it quick, and only read it's contents.

Re:Meh... (1)

Guignol (159087) | about 10 months ago | (#45433031)

:D excellent :D (too bad its ruined by it's ;))

Re:Meh... (1)

DrPBacon (3044515) | about 10 months ago | (#45433103)

I'll trade you two of my latest dumps.

Re:Meh... (1)

DrPBacon (3044515) | about 10 months ago | (#45433043)

Don't be like me, kid. It's hard living in real-time. I grew up believing I could never accomplish anything because someone else will do it first and own the patent. And yet here I am, 15 years later... My best friend is still that same damn penguin...

Re:Meh... (1)

the_arrow (171557) | about 10 months ago | (#45433143)

0x3a28213a
0x6339392c
0x7363682e

Re:Meh... (1)

jythie (914043) | about 10 months ago | (#45432525)

I think that around the time C matured we had as many actual languages as we needed. But for some reason people keep coming up with new ones, with syntax that is different enough to be incompatible with each other but similar enough that one wonders why they created a whole new language rather than a library or framework to link an existing language to a new domain.

Re:Meh... (1)

DrPBacon (3044515) | about 10 months ago | (#45432839)

I've made two for my own purposes, but I'm not entirely sure where the boundary between markup and language is... Has anyone created a famous real-time language yet? Or is that still a pipe dream?

Wolfram's impact (0)

Anonymous Coward | about 10 months ago | (#45431871)

But it remains to be seen how much of an impact he actually has on programming as an art and science."

Or, for that matter, on search.

Re:Wolfram's impact (1)

tgd (2822) | about 10 months ago | (#45432149)

But it remains to be seen how much of an impact he actually has on programming as an art and science."

Or, for that matter, on search.

Good at PR, though.

yet another programming language (0)

Anonymous Coward | about 10 months ago | (#45431873)

It's not like there's any shortage of programming languages out there, or even a shortage of ambitious projects that fail.

Re:yet another programming language (3, Informative)

dotancohen (1015143) | about 10 months ago | (#45431915)

But this one is ostensibly designed by Stephen Wolfram, who knows what scientists and physicists need from a programming language.

Python, C, Java, et al. were all designed by computer programmers for computer programmers. R and Matlab were designed by computer programmers for mathematicians, and thus work a lot better for expressing certain mathematical concepts and working with them (transformations, statistics). But there is much room for improvement, especially when looking at the problem from the scientist's point of view, not from the programmer's point of view.

Re: yet another programming language (0)

Anonymous Coward | about 10 months ago | (#45431947)

Wrong. Python was designed by a mathematician.

Re: yet another programming language (1)

Roger W Moore (538166) | about 10 months ago | (#45432357)

Then why does it use the engineer's 'j' for complex numbers instead of maths' and physics' 'i'?

Re: yet another programming language (1)

minstrelmike (1602771) | about 10 months ago | (#45433129)

Then why does it use the engineer's 'j' for complex numbers instead of maths' and physics' 'i'?

Because 'i' is for index, I imagine.

Re: yet another programming language (1)

dotancohen (1015143) | about 10 months ago | (#45432391)

Guido is also an extremely competent C programmer (see recent Slashdot article) and he did not design Python for scientists, but rather for programmers.

Re: yet another programming language (2)

jythie (914043) | about 10 months ago | (#45432567)

Python is actually a good example of why adding new languages is not the answer. One of the big reasons that Python has been so embraced in scientific computing is the libraries that were built on top of it that are well suited to those types of tasks. The Python community did a reasonably good job of grafting domain-specific functionality in via libraries that were fairly accessible to people who are not primarily programmers, while still having the general-purpose language behind it for people who are, allowing programmers and non-programmers to collaborate easily. Which is why I tend to get annoyed with the whole "let's build a new language for this domain!" thing, since all it really does is increase the barrier between fields and produce yet another custom language that needs to be learned and maintained.
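
A concrete instance of that grafting (a minimal sketch): NumPy brings vectors, matrices, and solvers into ordinary Python as a library, with no new language required:

    import numpy as np

    # Domain vocabulary arrives as library calls, not as new syntax.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([9.0, 8.0])
    x = np.linalg.solve(A, b)  # solve the linear system A @ x = b
    print(x)  # [2. 3.]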

Re: yet another programming language (1)

dotancohen (1015143) | about 10 months ago | (#45432667)

Python is actually a good example of why adding new languages is not the answer. One of the big reasons that Python has been so embraced in scientific computing is the libraries that were built on top of it that are well suited to those types of tasks.

That is very true; however, they still require one to express the problem in terms of lists, sets, dicts, strings, ints, floats, and complex numbers. Not all scientific concepts can be massaged into one of those datatypes.

The Python community did a reasonably good job of grafting domain-specific functionality in via libraries that were fairly accessible to people who are not primarily programmers, while still having the general-purpose language behind it for people who are, allowing programmers and non-programmers to collaborate easily. Which is why I tend to get annoyed with the whole "let's build a new language for this domain!" thing, since all it really does is increase the barrier between fields and produce yet another custom language that needs to be learned and maintained.

The counter argument is that each individual domain needs its own programming language in the same sense that each individual domain needs its own jargon. Each domain has its own unique intricacies, problems, methods, and context. The tools used should reflect that.

Re: yet another programming language (1)

jythie (914043) | about 10 months ago | (#45432941)

Jargon is a good parallel, though I feel it is an example of why each domain doesn't need its own language. In the case of jargon, one is still using the same language as people in other domains, with the addition of some extra shorthand. I would argue that such shorthand is closest to the use of libraries... still the same language, plus domain-specific functionality/sugar.

Re: yet another programming language (1)

dotancohen (1015143) | about 10 months ago | (#45434203)

Libraries only provide new functions and types. Go look at Matlab or (shudder) LabVIEW for some examples of domain-specific datatypes (not simply classes built on the common primitives) and paradigms.

Surely you are not suggesting that the field of particle physics should be using the same tools as the field of psychiatry? That materials engineers should be using the same tools as palaeontologists?

Re:yet another programming language (0)

Anonymous Coward | about 10 months ago | (#45431957)

The summary paints this language as a general solution to computing problems, but I think you are closer to the truth -- that this will be very well suited to a particular subset of problems, much like Alpha. I've played with Alpha on and off since it was launched and, except for a very narrow range of queries, mostly mathematical, I have found it to be a dead loss.

Re:yet another programming language (4, Interesting)

VortexCortex (1117377) | about 10 months ago | (#45432029)

Consider that the answer may be completely the opposite of what you assume. Perhaps we should just teach kids math with programming. Then, just like long division or integration, they won't have a problem explaining their desires to computers.

Hell, I have a _BEST_SORT() macro which, together with my collections library's _GEN_PROFILED_H directive, will select the actual best sort method on the next compile after profiling to PROVE which sort is best for the scale of the problem space, instead of guessing. Predicting what I want to do next? Yep, my brain does that automatically too. All I have to do is explain to the computer what I want to have happen, and it happens. IMO, the problem is the way mathematics is taught. A sigma is a for loop. The latter is more verbose, but if kids were taught "for loop" instead of "sigma" they'd be programmers. It's sort of ridiculous when you think about teaching kids the old way: "I'll never use this in real life," when meanwhile they could use programming in, say, JavaScript, to take better control of every damn device they own right now... Teachers just failed to tell them how.
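
Spelled out, the correspondence is direct -- for example, the sum of i^2 for i from 1 to 10:

    # Sigma with bounds 1..10 over i**2, written as the loop it is.
    total = 0
    for i in range(1, 11):  # the loop counter plays the role of sigma's index
        total += i ** 2
    print(total)  # 385

    # Or the compact version, which reads almost exactly like the sigma:
    print(sum(i ** 2 for i in range(1, 11)))  # 385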

Seriously, I've taught pre-teens how to code as a remedy for flunking out of mathematics; instantly they're able to see the utility of the tool. Humans are tool-using creatures; no wonder they have a hard time learning how to use tools that aren't immediately useful to them. The flunkers are actually being smarter than the teachers.

Re:yet another programming language (1)

dotancohen (1015143) | about 10 months ago | (#45432409)

That is a rather creative idea. I would love to see more practical examples of what you do with it, such as the Sigma example.

Re:yet another programming language (1)

DougMackensie (79440) | about 10 months ago | (#45432453)

Close. A sigma is one type of for loop, one that automatically adds each number in the sequence to a running sum.

Re:yet another programming language (4, Insightful)

wickerprints (1094741) | about 10 months ago | (#45432881)

Being primarily a mathematician and not a computer scientist or engineer, I have used Maple, Mathematica, and R. At one point I knew Pascal and C. I've dabbled in Python.

Of all these programming languages, Mathematica was BY FAR the easiest language for me to learn to use. The way it does certain things makes so much more sense to me than the others--for example, how it handles functions and lists. Unlike C, it's a high-level language if you want it to be, although you aren't forced to use it in that way. Pattern matching is extremely powerful. And the syntax is totally unambiguous; brackets define functions, braces define lists, and parentheses are used only for algebraic grouping of terms.

The major criticism I have of Mathematica is that it is comparatively slow, mainly because of its lack of assumptions regarding the nature of the inputs. Internally, it tries to preserve numerical precision, works with arbitrary-precision arithmetic, and doesn't assume values are machine precision. All this comes at a cost. Also, reading other people's code can be remarkably difficult, even if it's commented. The tendency is to write functions that do a lot of complicated things in one command, so code can be extremely dense.

Most recently, I have had to learn how to use R, due to its abundance of statistical algorithms, many of which have not been implemented in Mathematica. There was a simple example where I tried to calculate a Bayes factor, and the expression was something like (1 - x)/(1 - y), where x and y were very small positive numbers, somewhere around the order of 10^-15. This calculation totally failed in R--the answer given was 1. Mathematica correctly calculated the ratio. Maybe I don't know enough about R to know how to preserve the necessary numerical precision, but it sort of shows that in Mathematica, such issues are handled automatically; moreover, if there is a potential problem, Mathematica warns you.
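
The effect is easy to reproduce in any double-precision system (the values below are invented, since the originals weren't given; they just need to be smaller than machine epsilon), and exact rational arithmetic recovers the lost signal:

    from fractions import Fraction

    x, y = 1e-17, 3e-17  # hypothetical; small enough that 1 - x rounds to 1.0

    # Double precision: numerator and denominator both round to exactly 1.0.
    print((1 - x) / (1 - y))  # 1.0 -- the signal is gone

    # Exact arithmetic keeps it.
    exact = (1 - Fraction(x)) / (1 - Fraction(y))
    print(float(exact - 1))  # ~2e-17: the ratio is not actually 1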

Anyway, this is all just personal opinion, really. The takeaway for me is that I see a lot of evidence that Stephen Wolfram is pretty good at designing computer languages for specific purposes. Yes, he's totally egocentric, but there's no denying that he is brilliant. When Wolfram | Alpha debuted, I remember thinking how totally stupid it was. And now...every single high school and college math student knows about it. It is one of the most ingenious marketing ploys I have ever seen. And the scary thing is, it keeps improving. It's almost Star Trek-like in its ability to parse natural language input. And I think that's the eventual direction that computer programming will evolve towards. Programs will not be written in code, but instead, as broad sentences, parsed by an AI which automatically performs the high-level task.

Re:yet another programming language (1)

canadiannomad (1745008) | about 10 months ago | (#45433587)

It's almost Star Trek-like in its ability to parse natural language input. And I think that's the eventual direction that computer programming will evolve towards. Programs will not be written in code, but instead, as broad sentences, parsed by an AI which automatically performs the high-level task.

That is kinda how I would think of it. You make a request. The computer AI does its best to pick a starting point given what you described and starts running it. Then you explain to the computer what the running program is doing wrong compared to what you actually wanted. It tries again. Rinse and repeat until it has something that does everything you want it to.

From a programming point of view it is like starting with a similar project and using natural language to modify the existing program little by little.
This certainly wouldn't make the most efficient code, but it might be good for people who just want to do a variation on an existing theme.

Obviously this wouldn't be programming from the classical point of view, but it could be extrapolated to handle an enormous variety of things that an average person would want.... Then, once there is a prototype, they could hand it to a software engineer for final tweaking of the parts that need optimization.

The main innovation of course being ... (4, Interesting)

jopet (538074) | about 10 months ago | (#45431883)

that you will have to pay a lot of money to use it?

Re:The main innovation of course being ... (2)

dotancohen (1015143) | about 10 months ago | (#45431921)

that you will have to pay a lot of money to use it?

If the work that needs to be done could be done quicker or simpler (i.e. cheaper) by paying a $1000 license rather than having a $300,000-per-year researcher go learn Python or R, then it is worth it to pay, no? The current options aren't going away.

Re:The main innovation of course being ... (1)

Anonymous Coward | about 10 months ago | (#45432777)

Don't forget that you will have to pay the $1000 fee in perpetuity and it will likely rise in the future, that you cannot fix any bugs in the language yourself, and that you will likely have to pay extra to get bug fixes in a timely fashion. But yeah, to some organizations that don't plan to use their code base for very long, paying $1000 a year might be a bargain.

Re:The main innovation of course being ... (0)

dotancohen (1015143) | about 10 months ago | (#45434271)

Those are cut-and-paste Fosstard arguments. Sorry, sometimes proprietary software is good.

Typed in Foss Firefox on Foss Kubuntu.

Re:The main innovation of course being ... (2)

rasmusbr (2186518) | about 10 months ago | (#45432045)

If the programming language relies on remote servers (basically Wolfram Alpha) in order to function it would make sense that it would cost money. It costs money to hire people to make and improve a system like Wolfram Alpha.

If people got over the idea of having everything on their computers for free the world would have a lot less corporate snooping and a lot less ad spamming. That would be nice.

Re:The main innovation of course being ... (1)

SirGarlon (845873) | about 10 months ago | (#45432077)

If people got over the idea of having everything on their computers for free the world would have a lot less corporate snooping and a lot less ad spamming. That would be nice.

And several of the current tech giants would shrivel up and die, and that would be even nicer. :-)

Re:The main innovation of course being ... (1)

Errol backfiring (1280012) | about 10 months ago | (#45433419)

How cutely naive! If a programming language costs money and relies on remote servers you expect corporate snooping to decrease? I think hell would freeze over first.

Re:The main innovation of course being ... (1)

rasmusbr (2186518) | about 10 months ago | (#45433585)

How cutely naive! If a programming language costs money and relies on remote servers you expect corporate snooping to decrease? I think hell would freeze over first.

I did not say that.

If you have two programming languages that depend on remote servers, one that's free as in gratis and one that has fees, I would expect the one that has fees to value and respect your privacy more than the one that is free.

Re:The main innovation of course being ... (1)

canadiannomad (1745008) | about 10 months ago | (#45433663)

"Add a google Ad box to the upper left corner below the logo"
"Make it fit under the logo nicely."
"Make it blink."

...

Noooooooooo!

Just Call It "Wolf" (3, Funny)

Phrogman (80473) | about 10 months ago | (#45431897)

that way if we make a programming error we can just comment "Bad Wolf" (too much exposure to Dr Who recently) :P

Re:Just Call It "Wolf" (0)

Anonymous Coward | about 10 months ago | (#45433469)

It's Wolfram's. Call the language "EGO".

His next project is interesting (5, Funny)

paiute (550198) | about 10 months ago | (#45431925)

Wolfram announced his latest idea - that there needed to be some kind of pliable material available next to toilets with which to clean one's bum. This material, he said, is going to be really soft, probably a couple of layers thick, and needed to be on some kind of continuous dispenser mechanism which he is developing.

Re:His next project is interesting (0)

Anonymous Coward | about 10 months ago | (#45432063)

I know you're trying to be funny by implying he's reinventing the wheel, but ironically, there's more than one way to clean your ass. In some countries, they use water streams rather than TP. There's not just one unique solution to each problem.

Re:His next project is interesting (3, Funny)

paiute (550198) | about 10 months ago | (#45432283)

I know you're trying to be funny by implying he's reinventing the wheel, but ironically, there's more than one way to clean your ass. In some countries, they use water streams rather than TP. There's not just one unique solution to each problem.

You just made my point. There are already multiple and satisfactory ways to clean one's ass.

Re:His next project is interesting (1)

CCarrot (1562079) | about 10 months ago | (#45433021)

I know you're trying to be funny by implying he's reinventing the wheel, but ironically, there's more than one way to clean your ass. In some countries, they use water streams rather than TP. There's not just one unique solution to each problem.

You just made my point. There are already multiple and satisfactory ways to clean one's ass.

So maybe he's developing the ultrasonic ass-cleaning device? Guaranteed to leave your ass sparkly clean in 1/10th the time it takes for 'traditional' methods, and no chance the dog will drag it all over the house while you're at work. Also takes care of unsightly butt-hair...

Re:His next project is interesting (2)

macklin01 (760841) | about 10 months ago | (#45432127)

Wolfram announced his latest idea - that there needed to be some kind of pliable material available next to toilets with which to clean one's bum. This material, he said, is going to be really soft, probably a couple of layers thick, and needed to be on some kind of continuous dispenser mechanism which he is developing.

And naturally, he'll call it Wolfram paper. :-)

Did anyone else think of this... (1)

Aaron H (2820425) | about 10 months ago | (#45431959)

"The Wolfram Language does things automatically whenever you want it to"

Did that make anyone else think of Zombo.com?

Re:Did anyone else think of this... (0)

Anonymous Coward | about 10 months ago | (#45432099)

No.

Oh boy. (5, Funny)

korbulon (2792438) | about 10 months ago | (#45431961)

First a new kind of SCIENCE, now a new kind of PROGRAMMING.

Can't wait for a new kind of LOVE.

Re:Oh boy. (1)

Registered Coward v2 (447531) | about 10 months ago | (#45432329)

First a new kind of SCIENCE, now a new kind of PROGRAMMING.

Can't wait for a new kind of LOVE.

Given the challenges many face with the old kind I doubt we are ready to face a new kind...

Re:Oh boy. (1)

kencurry (471519) | about 10 months ago | (#45433535)

You beat me to this one. I actually read that whole damn book, thinking it would be worth my time - what a laugh.

Wolfie is an Aspie (-1)

Anonymous Coward | about 10 months ago | (#45431981)

Maybe the new language will teach him niceties like making eye contact, making some sign that he's heard when spoken to. Oh, and things like not shaking hands just after he's been picking his nose.

Alan Perlis said... (0)

Anonymous Coward | about 10 months ago | (#45431997)

When someone says "I want a programming language in which I need only say what I wish done," give him a lollipop.

Phantom Minus Minus (4, Insightful)

korbulon (2792438) | about 10 months ago | (#45432001)

Stephen Wolfram is the George Lucas of scientific computing.

Re:Phantom Minus Minus (0)

Anonymous Coward | about 10 months ago | (#45432245)

The George Lucas of scientific programming? Let's hope he will stop his saga after the historical episode 2: The Non-Programmer Strikes Back. Can you imagine Ewoks walking freely in your code? Or, even worse, Jar Jar Binks taking the lead in syntax or warning messages?

Let me guess: (1)

DF5JT (589002) | about 10 months ago | (#45432019)

He won't publish it under a free software license...

Automatic everything? (1)

Millennium (2451) | about 10 months ago | (#45432037)

So you can do anything you want with Wolfram language? The only limit is your imagination?

Will the first project be the long-awaited 1.0 version of Zombo.com [zombo.com]?

One hell of a language (4, Informative)

Celarent Darii (1561999) | about 10 months ago | (#45432055)

Well, either he's created the mother of all LISP macros, or it's simply vaporware. I'd love to see it when they publish it. Code or it didn't happen.

Here is the obligatory xkcd [xkcd.com], panel two.

That's funny twice, considering... (3, Insightful)

alispguru (72689) | about 10 months ago | (#45432365)

1. Wolfram is a notorious Lisp disser [ymeme.com], and Mathematica is arguably a shining example of Greenspun's tenth rule [wikipedia.org].

2. Lisp has a long history of trying to help programmers, with mixed results. The term DWIM [wikipedia.org] was coined by Warren Teitelman in 1966 as part of a project based on BBN Lisp, the main predecessor of Interlisp; this project of his sounds like DWIM writ large.

Gasp! (0)

Anonymous Coward | about 10 months ago | (#45432083)

He's invented APL! But I notice a lot of skepticism here; I wonder where all that skepticism is in 3D printing stories??

Not a story (2)

umafuckit (2980809) | about 10 months ago | (#45432131)

This isn't even a story. The linked-to blog post is marketing fluff, full of big hazy promises and short on substance. I read some of it and it sounds like some sort of data-centric OO language (it makes me think of the R plotting system ggplot: http://ggplot2.org/ [ggplot2.org]); beyond that, however, who knows what the hell this is?

Re:Not a story (0)

Anonymous Coward | about 10 months ago | (#45432421)

I would be surprised if "Wolfram language" differs from language used in Mathematica in any other way than not being so tightly tied to a specific end-user product.

I must say I love Mathematica the product, but I hate the hubris of Stephen Wolfram the man. Even though Mathematica is probably the most powerful open-ended computing tool I've ever used, it's hard to defend when Mr. Wolfram comes around with his completely-over-the-top blog posts and basically non-applicable rants about cellular automata. He manages to keep a great product getting better, but it'd appear his personality hasn't really improved much during the last couple of... decades.

Can't be dismissed? Watch me! (0)

Anonymous Coward | about 10 months ago | (#45432139)

No, but really, I'll wait until I see results. I've worked in the field long enough to approach anything with a healthy skepticism, especially when the only thing I have to trust is someone's reputation.

Does things automatically whenever you want it to (1)

pr100 (653298) | about 10 months ago | (#45432247)

... so you don't actually have to do any coding at all?

Libraries And Documentation (4, Insightful)

smpoole7 (1467717) | about 10 months ago | (#45432265)

I don't program for a living anymore, and I've always been more of a system-level, hardware driver kind of guy, so C/C++ work fine for me.

But especially coming from that background, my need isn't for another programming language, it's for better documentation of available libraries. For any common task that I want to do, somebody has probably written a great library that I can just strap in and use.

The problem is when I start trying to use it. The documentation has blank "TBD" pages, or really helpful comments like, "init_lib() -- initializes the library. You can specify the # of flickers per bleem ..."

Or ... and this is my 2nd favorite ... the documentation is out of date. "Hey, I tried to do this the way your tutorial said and it didn't work?" "Oh, yeah, that tutorial is out of date; we've changed some stuff ..."

My very most #1 favorite is automatically generated documentation that looks at (for example) a C++ class and then creates an HTML page. I might as well just look at the source code ... hoping, of course, that the people who wrote that source actually inserted more than a few, "does what it says" comments. Or that I don't have to play the Spaghetti Trace(tm) game, bouncing from one .c file to another .h file and back to a third .c (and this is after repeated greps in the "src" directory) to try to figure out what's happening to my poor variable while it's inside a function.

Not criticizing FOSS, per se; I understand that it's written by volunteers (for whom I'm very grateful). But this, rather than needing a new way to "PRINT" or "SORT" in a programming language, is the far bigger problem, in my book.

Re:Libraries And Documentation (1)

scamper_22 (1073470) | about 10 months ago | (#45434145)

I fully agree with this.

Just finding libraries, configuring them, and learning to use them is pretty hard sometimes. .NET/Java make this a bit easier: just import the jar/.dll and away you go.

Some Perl distributions make this easier with a package manager.

I have no idea what Wolfram has, but it would be pretty cool if it managed to do a lot of this. Centralized package management. Maybe it scans your code, sees what you're trying to do, and then chooses an optimal function in some library (and hopefully offers it to you)...

How it would do that, I have no idea... but still it would be pretty good.

Yes! We need more languages! (0)

Anonymous Coward | about 10 months ago | (#45432279)

I was just thinking that the one thing we need is Yet Another Programming Language. As a bonus, Wolfram is not a language designer, so it will probably be some social-science abomination like R with a screwball syntax. Or maybe he could combine R (bizarre irregular syntax) and Erlang (the only serious language with a COME FROM construct where control magically passes to a routine with no obvious link to where it came from) and create a language no one could understand?

Surely all the time and effort that goes into these new languages could be put into making a few key languages the new industry standard? Python for scripting, Java and C for compiled code, and get rid of the hundreds of scripting and intermediate languages. The biggest problem in the industry right now is Balkanization of programming languages. If one language had been an industry standard for the past 15 years, it would have all the features anyone needed by now - lambdas, closures, generics, and everything else. Instead, every company has worked on a different language while scripting languages have been created at an alarming rate, and we have the Tower of Babel.

It's still early (0)

Anonymous Coward | about 10 months ago | (#45432285)

I read that as "paralyzing a computer efficiently".

STOP PRESS! (0)

Anonymous Coward | about 10 months ago | (#45432353)

Wolfram just invented object-oriented programming! Film at 11!

There’s a fundamental idea that’s at the foundation of the Wolfram Language: the idea of symbolic programming, and the idea of representing everything as a symbolic expression.
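
For the unfamiliar: "everything is a symbolic expression" roughly means code and data share one tree shape, Lisp-style. A toy sketch in Python (the mini-evaluator and its heads are invented for illustration):

    # A symbolic expression as a plain tree: (head, *args).
    expr = ("Plus", ("Times", 2, ("Symbol", "x")), 5)  # i.e. 2*x + 5

    def evaluate(e, env):
        """Rewrite the tree until only values remain."""
        if not isinstance(e, tuple):
            return e
        head, *args = e
        if head == "Symbol":
            return env[args[0]]
        vals = [evaluate(a, env) for a in args]
        if head == "Plus":
            return sum(vals)
        if head == "Times":
            out = 1
            for v in vals:
                out *= v
            return out
        return (head, *vals)  # unknown heads simply stay symbolic

    print(evaluate(expr, {"x": 10}))  # 25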

Prediction (0)

Anonymous Coward | about 10 months ago | (#45432371)

This will be a half- or even quarter-baked implementation, and it will basically be software that automagically does what Wolfram thinks makes sense, and will have to be hammered into a different shape to meet other needs and styles.

Probability of it being a success with programmers: less than 10%.

parable (4, Insightful)

Megane (129182) | about 10 months ago | (#45432427)

One day the student came to the master and said "There are too many programming languages! I am tired of having to learn thirty programming languages! I shall write a new programming language to replace them all!"

The master smacked the student upside the head. "Idiot! Then we would all have to learn thirty-one programming languages!" The student was enlightened.

Unfortunately, it was only the one student who was enlightened. Now we all have to learn fifty programming languages.

Language (1)

Insanity Defense (1232008) | about 10 months ago | (#45432431)

He should just name it Language.

It's called C. (1)

xtal (49134) | about 10 months ago | (#45432467)

Seriously, C is that awesome.

If C doesn't work, import python.

"Wolfram Language"? (4, Funny)

DdJ (10790) | about 10 months ago | (#45432897)

This fellow needs to work on his self-esteem.

like he [re]invented von Neumann's 1948 physics? (1)

peter303 (12292) | about 10 months ago | (#45432903)

Both did work with cellular automata, 50 years apart.

The Problem (0)

Greyfox (87712) | about 10 months ago | (#45432943)

The problem is not the language. It's the attitudes of programmers. Lazy programmers don't want to have to think about the problem they're trying to solve, and they think the code is the end goal. Except the code is not the end goal. The code is a means for the company you're working for to achieve higher productivity with the same number of resources. In order to do that, specific problems within their business need to be solved faster. In order to achieve that goal, one must first understand the problems. And most of the terrible, terrible programmers that I've had to clean up after in the last couple decades would obviously go to any extreme to not have to understand the problem, much less figure out a reasonable way to solve it. Hell a few of them didn't even understand how to program, which is the problem you have to solve BEFORE you can solve the problem of helping your company be more efficient with its resources.

You can try to make things more simple all you want, but even if you could tell the computer what to do in plain English, you still have to completely understand what you're telling it to do! And on that side of things, most of the clients I've talked to generally failed to some extent (Sometimes miserably) as well. Most in-house business software and a fair bit of the stuff that's sold commercial is the result of not-very-articulate clients telling not-very-good-programmers what they want, and those not-very-good-programmers crapping out a steaming pile of code that has as much of a chance of harming the business as helping it.

Re:The Problem (1)

minstrelmike (1602771) | about 10 months ago | (#45433071)

Agreed. It sounds like we need a programming language where you don't have to think logically. Then those duds could really get something done.

imo, problem definition comes before problem solution.

Sharp cookies! (1)

Impy the Impiuos Imp (442658) | about 10 months ago | (#45433127)

"Microsoft also announced today they are developing a new language. Though largely unspecified, it will be called W# and will do everything you want in a way that makes you dependent on Windows'."

"cut out much of the nitpicking complexity" (2)

NikeHerc (694644) | about 10 months ago | (#45433429)

I'm not saying Wolfram can't pull this off, but I've been programming for a long, long time and mastering the pesky "nitpicking complexity" is one thing good programmers do very well.

I wish him well, but I remain skeptical. I hope the result doesn't devolve into "click here and here and here."

Sounds like something I have heard of before (1)

jandersen (462034) | about 10 months ago | (#45433657)

... which will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every single type of hardware ...

Isn't that what Fortran does?

Knowledge-based programming (2)

T.E.D. (34228) | about 10 months ago | (#45433833)

The most concrete detail I could find anywhere on his web about it was his repeated characterization of the language as "knowledge-based".

Now, unless he has some whole new meaning in mind, that isn't a totally new concept in languages. We generally call such languages "AI languages" (or, more technically, Inference Engines [wikipedia.org] or Reasoning Engines [wikipedia.org] or whatever).

The general idea is that the programmer's job is to write rules. Then you feed the engine your rules and a set of facts (or an operating environment it can go get "facts" from), and it will follow what rules it needs to. The language/system of this kind that programmers here will probably be most familiar with is make [gnu.org].
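
A tiny forward-chaining sketch of that rules-plus-facts model (the rules and facts are invented, loosely echoing what make does):

    # Each rule: if all premises are established facts, conclude something new.
    rules = [
        ({"source_changed"}, "needs_compile"),
        ({"needs_compile", "have_compiler"}, "object_file"),
        ({"object_file"}, "needs_link"),
        ({"needs_link"}, "executable"),
    ]
    facts = {"source_changed", "have_compiler"}

    # Keep firing rules until nothing new can be concluded.
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True

    print("executable" in facts)  # True -- the engine followed the rules it needed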

It sounds cool, but I think a lot of folks here might find the concept of something like make being the answer to all their programming difficulties a case of the cure being worse than the disease.

Megalomanic behavior (0)

Anonymous Coward | about 10 months ago | (#45434019)

He is a genius, but his megalomania is even stronger. When the new and ultimate programming language is finished, he will be claiming intellectual property rights over everything developed with it.
