
Announcing Ozma: Extending Scala With Oz Concurrency

timothy posted more than 3 years ago | from the whirlwind-tour-of-the-programming-world dept.


setori88 writes "Programming for concurrency makes sense in developing for both large scales (cloud computing) and small (multicore CPUs). Some languages were designed for concurrency and distribution; one of those languages is Oz, which provides advanced primitives for concurrency and distribution. Oz is mostly declarative, a paradigm that encompasses functional and logic programming. Despite its innovative features and expressiveness, Oz never made it into the wider developer community; one reason is its unusual syntax." Read on to learn about an effort to bring Oz's concurrency features to more programmers. setori88 continues: "But first, some background: Martin Odersky, in designing the Scala programming language, recognized the need for community acceptance of the kind that Oz lacked. He designed Scala to be both close to Java (in terms of syntax and concepts) and interoperable with existing Java libraries. Today, Scala seems to be the best hope for making functional programming accessible to programmers.

Although Scala itself has no language features for concurrency, the advanced library Akka, inspired by Erlang, provides Scala programmers with concurrent and distributed programming concepts.

Now comes a project attempting to popularize the concurrency concepts of Oz, called Ozma. Ozma implements the full Scala specification and runs on the Mozart VM. It extends Scala with dataflow variables, declarative concurrency, lazy declarative concurrency, and message-passing concurrency based on ports. Ozma extends the duality of Scala, namely the combination of functional and object styles, to concurrent programming."
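The dataflow variables mentioned here are single-assignment: a reader blocks until some thread binds the variable, so the program's result does not depend on thread scheduling. This is not Ozma syntax, but a rough analogue of the idea can be sketched in plain Scala with the standard library's Promise/Future (the object name and the values are illustrative):

```scala
import scala.concurrent.{Await, Future, Promise}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object DataflowSketch extends App {
  // A dataflow-style variable: bound at most once; readers block until bound.
  val x = Promise[Int]()

  // A concurrent producer binds x. With single-assignment variables the
  // final result is the same under any thread interleaving.
  Future { x.success(6 * 7) }

  // The consumer suspends until x is bound, then proceeds deterministically.
  val result = Await.result(x.future, 5.seconds) + 1
  println(result) // prints 43
}
```

In Ozma proper the binding is transparent: reading an unbound variable simply suspends the thread, with no explicit future or timeout in the source.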


Frustrating name (1)

drinkypoo (153816) | more than 3 years ago | (#36582476)

I have a hard time imagining why, in 2003, you would pick a name that has, since 1987, been the name of a software product and a software company you don't even work for. It would be like naming a programming language "Borland" when you work for Seagate.

Re:Frustrating name (0)

Anonymous Coward | more than 3 years ago | (#36582606)

Or the name of a Rock band from Pasadena, CA.

Re:Frustrating name (1)

drinkypoo (153816) | more than 3 years ago | (#36582680)

Or the name of a Rock band from Pasadena, CA.

Yeah, that's almost as relevant as the name of a knitting needle manufacturer in Saskatchewan. On the other hand, Scala is a digital presentation and signage system, featuring a scripting language called ScalaScript. Years later we get a programming language called Scala. This is inherently confusing.

New solutions emerge (0)

Anonymous Coward | more than 3 years ago | (#36582532)

It seems it's the right moment for new languages to take over the cloud.
I quite doubted it when the usual Gartner analyst said it, but the facts are now here.

Among the recently released solutions, you can also check out Opa.

Re:New solutions emerge (1)

phy_si_kal (729421) | more than 3 years ago | (#36582554)

Opa is only for web applications (or web services). But indeed, it's a very good language for programming them, as a concurrent alternative to Scala + Lift.

Re:New solutions emerge (0)

Anonymous Coward | more than 3 years ago | (#36588238)

Opa looks cool and all, but it looks like it's pay-to-play?

Really? o_O

If I want to use this language, I have to actually buy it?

What year is this again?

fun {Send X} {Port.sendRecv Server.port X} (1)

nicholas22 (1945330) | more than 3 years ago | (#36582614)

I don't like Ozma, it takes the fun out of functions :P

Re:fun {Send X} {Port.sendRecv Server.port X} (0)

Anonymous Coward | more than 3 years ago | (#36582858)

Don't say that or Ozma Bin Laden's zombie will come for you.....

Ozma in my pants (1, Informative)

slashpot (11017) | more than 3 years ago | (#36582844)

I have Ozma extending scala in my pants.

Announcing Not Giving A Shit (-1)

Anonymous Coward | more than 3 years ago | (#36582860)

I took a massive dump and now my bowels are purged, so, alas, I cannot give a shit about why some niche language is now being ass-raped in prison by nignogs, skinheads, and beaners.

Oh good, CONcurrency... (0)

Anonymous Coward | more than 3 years ago | (#36582924)

For a minute I thought this was another post about bitcoins.

advanced primitives (0)

Anonymous Coward | more than 3 years ago | (#36583058)

I love a good pseudo-oxymoron.

[] X#nil then Z=X (3, Informative)

Islemaster (1572519) | more than 3 years ago | (#36583146)

The syntax can be weird (why the double square-bracket? why?), but I wouldn't call that Oz's biggest misstep in gaining wide acceptance. In my experience, the biggest problems while learning were:

1. Almost zero documentation on getting the language to (partially) compile and/or run outside of Eclipse. I'm not sure how I ever figured this out, to be honest.
2. Lack of a decent file I/O library. When was the last time you had to write your own streamreader for a business application?

I am thrilled to see Oz getting more mainstream attention - the whole dataflow concept is very cool, and it's fun to write.

Re:[] X#nil then Z=X (1)

marcosdumay (620877) | more than 3 years ago | (#36583848)

You have quite a good point. If syntax weirdness were a problem, C would never have taken off. Lack of decent I/O is the problem with most languages that are easy to parallelize. It seems that most such languages aren't defined with I/O in mind, and it is easy to imagine why: doing I/O in a parallel environment is very complex.

Re:[] X#nil then Z=X (2)

GargamelSpaceman (992546) | more than 3 years ago | (#36585324)

Eclipse? WTF?

Not sure if it even runs in Eclipse... It's pretty tied to Emacs, though. I think the Browser is OK for seeing concepts, but I doubt I'd care to use it for serious programming. I've been writing my Oz code in vi, compiling it, and running it on the command line.

The documentation needs to be better, though. I did buy the book, but the online docs need some polishing and completing, especially section 12 of the tutorial, since that's where much of the cool stuff would be.

Someday, if I ever get the energy/time/Oz-expertise, I want to try implementing a network protocol using Oz's determinacy-driven execution and definite clause grammars. Wouldn't it be neat to (almost) paste in the specification for HTTP to implement a web browser/server with the same sort of simplicity with which one can almost paste in the BNF for URIs from the RFC to implement a URI parser?

Not an Oz expert though... Oz is still on my list of 'things to do when I get around to it.'

Another annoying thing is that Oz only runs in 32-bit mode so far.

Re:[] X#nil then Z=X (2)

sourcerror (1718066) | more than 3 years ago | (#36585976)

"Another annoying thing is that Oz only runs in 32 bit so far."

When I tried it around 1.5 years ago, the database libraries were quite buggy (they were borrowed from Tcl, IIRC). So I don't think it can be considered a production-ready language.

CSP (3, Informative)

sourcerror (1718066) | more than 3 years ago | (#36585916)

If I remember correctly, "#" is for constraint satisfaction problems (delayed goals), and I think that operator shouldn't be taught to an Oz newbie, as Constraint Logic Programming is a paradigm unto itself (and not just a paradigm in the sense of a thinking pattern: it involves quite a lot of complex algorithms that run in the background (demons), and non-sequential execution).

Last time I installed Oz, it came with Emacs bundled by default.

Re:CSP (1)

Islemaster (1572519) | more than 3 years ago | (#36593766)

Oh yeah, Emacs, not Eclipse. *facepalm* My bad. I just remember it was an IDE that I couldn't get working at the time, and had to dig through docs for literally days to figure out how to compile from the command line.

At least in the subject line of my original comment, # is just an infix representation for a tuple. From the Oz Tutorial:

"A common infix tuple-operator used in Oz is #. So, 1#2 is a tuple of two elements, and observe that 1#2#3 is a single tuple of three elements: '#'(1 2 3) and not the pair 1#(2#3). With the # operator, you cannot directly write an empty or a single-element tuple. Instead, you must fall back on the usual prefix record syntax: the empty tuple must be written '#'() or just '#', and a single-element tuple '#'(X)."

Re:CSP (1)

sourcerror (1718066) | more than 3 years ago | (#36594012)

Well, I haven't used Oz in a while, but my first exposure consisted of some conference slides, and it was quite overwhelming, as it involved a CSP without even properly clarifying the syntax. And in Prolog CSP frameworks, "#" means constraint equality. (CSP got "mainstream" with Prolog first.)

On the other hand I agree that Emacs feels ass-backwards.

Disingenuous (1)

rjstanford (69735) | more than 3 years ago | (#36583282)

Scala's actor framework is indeed built in, and works very well for multi-core scaling. What it doesn't have is a distributed concurrency model that makes it trivial to run a single "program" on multiple weakly-connected systems ("in the cloud.")
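An actor in this in-process sense is essentially a thread draining a mailbox, processing one message at a time. That can be sketched framework-free with only the standard library (the names and messages here are illustrative, not Scala's actual actors API):

```scala
import java.util.concurrent.LinkedBlockingQueue

object MailboxSketch extends App {
  // Mailboxes: one for requests to the "actor", one for its replies.
  val mailbox = new LinkedBlockingQueue[String]()
  val replies = new LinkedBlockingQueue[String]()

  // The "actor": a thread that handles one message at a time,
  // so its internal state never needs locking.
  val worker = new Thread(() => {
    var running = true
    while (running) mailbox.take() match {
      case "stop" => running = false
      case msg    => replies.put(msg.toUpperCase)
    }
  })
  worker.start()

  mailbox.put("hello")
  println(replies.take()) // prints HELLO
  mailbox.put("stop")
  worker.join()
}
```

Real actor libraries add multiplexing of many actors over a few threads, supervision, and (in Akka's case) remoting; that last part is the distributed piece the parent notes is missing.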

Excuse me (1)

hamburgler007 (1420537) | more than 3 years ago | (#36583290)

But "proactive" and "paradigm"? Aren't these just buzzwords that dumb people use to sound important? Not that I'm accusing you of anything like that. I'm fired, aren't I?

Re:Excuse me (1)

marcosdumay (620877) | more than 3 years ago | (#36584094)

No, normally dumb people prefer to cast doubt on perfectly sound arguments when they want to sound important.

Those words have a meaning, a quite clear and well-defined one. Learn those meanings if you want to talk about the subject in question.

Re:Excuse me (1)

hamburgler007 (1420537) | more than 3 years ago | (#36584474)

I'm familiar with the definition, and the only distinct thing I see in there is the bizarre syntax. All of the features touted in this language are things that either already exist or would, for any well-designed application, be a non-issue. As others have noted, "advanced primitives" is an oxymoron. This product screams of something inept managers want to use because they think it is the latest and greatest thing and don't know any better.

Re:Excuse me (1)

marcosdumay (620877) | more than 3 years ago | (#36587008)

Yep, I agree that the language seems pointless, but you'd have a way better reception if you criticised the language (or the "advanced primitives" phrase), instead of arguing against well known and rightly used jargon.

Re:Excuse me (1)

hamburgler007 (1420537) | more than 3 years ago | (#36587688)

Paradigm is certainly well known, but almost never rightly used, jargon in the computer software industry. That particular word is most frequently used to describe concepts which hardly meet the definition of a paradigm; rather, it is used to describe concepts that have existed for a long time (e.g. the cloud). The original post was quoting The Simpsons (which I should have indicated earlier), and I was using it to illustrate a point. Namely, that paradigm is often used by companies to advertise a product and by inept managers to validate their positions by hopping onto the bandwagon without any real understanding of the technology. The people competent enough to see through the smoke and mirrors (usually the developers) often can't push back, especially in this economic climate, because inept managers are also the ones who feel the most threatened, and respond poorly to anyone who questions their judgement.

Re:Excuse me (1)

Short Circuit (52384) | more than 3 years ago | (#36598668)

All of the features touted in this language are things that either already exist or would, for any well-designed application, be a non-issue.

There's more to language design than creating new features. Most languages I've seen introduce few, if any, new or fundamental features, but instead seek different balances in the various tradeoffs involved. Off the top of my head, those tradeoffs include things like raw speed, syntax expressiveness, syntax flexibility, memory consumption, and the number, breadth, and convenience of builtins and libraries.

Most languages even seek to remove some mundane aspects of architecting a "well-designed application"; it's been a long, long time since most programmers had to manually keep track of their memory consumption or object interaction models for trivial applications, but applications which we would consider trivial today would have required an extraordinary amount of programmer effort thirty years ago for such things. Non-issues in a well-designed application eventually become non-issues in any trivial application; even the painstakingly-well-designed application itself may eventually become trivial.

"Advanced primitives" isn't an oxymoron if you evaluate it in the right context; you're missing the implicit "when compared to {other language's primitives}". Granted, it's still a vague term; compared to what? Compared to C primitives? Possibly. Compared to Java primitives? Possibly.

Oz's syntax doesn't look any stranger to me than comparing a C-like syntax with a FORTRAN-like syntax, or either of those with a Lisp-like syntax.

If you want a bizarre syntax, see J. Before you knock it, consider its heritage, and figure out which you'd have an easier time entering on your average keyboard.

No, I don't know Oz. I don't know Scala. I don't know Ozma. I just watch a lot of language advocates and designers compete with each other, and learn a thing or two every now and then--and it irritates me when I see people knock languages and tools because they don't have enough perspective to see that those tools might not be right for them, but might be right for someone else.

Re:Excuse me (0)

Anonymous Coward | more than 3 years ago | (#36592916)

"proactive"? Hell yeah.

But "paradigm"? Depends.
Depends on if they know what it really means and use it in the proper context.
I use "paradigm" to refer to generally accepted base views (what dumb people call "knowledge") on which people build, so they don't have to go down to the beginning of everything and quantum mechanics, just to make a certain argument.
If, for all participants, it logically follows from the paradigm, the argument is accepted. (Unless people have common fallacies.)
Analogous words are "axiom" (mathematics) and "dogma" (religious schizophrenics).

You're HIRED! :)

Re:Excuse me (1)

Short Circuit (52384) | more than 3 years ago | (#36598126)

In programming, "paradigm" usually refers to a mindset or a "way of doing things." You might find this useful.

certain personality (0)

Anonymous Coward | more than 3 years ago | (#36583342)

We have certain "personalities" that use Scala; needless to say, I've never seen anyone finish a project with Scala that wasn't 10x over budget/time and totally over-engineered.

Re:certain personality (1)

sourcerror (1718066) | more than 3 years ago | (#36584690)

I think Scala is mostly used by Lambda the Ultimate fanboys, who just fall in love with anything that has anonymous functions and continuations, and don't really give a shit about productivity, developer training, good documentation, good tool support, or performance.

What happened to Oz and Scala is... (0)

Anonymous Coward | more than 3 years ago | (#36583802)

... When you create languages that have bizarre syntax, you alienate developers who want to easily translate their current skill set AS WELL AS existing libraries! The reality of development is that there exist many libraries on *nix/Windows that you need to use in order to be productive. If you have to re-create the wheel from scratch in a language that doesn't fit the way most people program, you are not going to have a successful language. It needs to have a 'killer app,' and it won't have that without finding a way to join other languages instead of being its own island.

I have had the displeasure of using Oz, and I won't touch that ever again unless it makes me an offer I can't refuse. I congratulate them on their academic achievement and their intellectual exhibition, but no thank you from a real-world programmer.

I still have nightmares with Oz (0)

Anonymous Coward | more than 3 years ago | (#36583926)

I remember I once took a course on alternative programming paradigms. One of the sections of the class was about Oz, and we had to write a small interpreter. Needless to say, it is by far the worst programming language I have used, and I cannot understand how someone could pick up that nightmarish muck. Not even Modula-3 was as bad as Oz.

Re:I still have nightmares with Oz (2)

sourcerror (1718066) | more than 3 years ago | (#36584630)

I bought the MIT book about Oz, and it was really fun. (I've read only 3 chapters.) However, I can't imagine learning it from PowerPoint slides. (Especially if you got shown a constraint satisfaction problem right away.) By the way, it seems less weird to me than Haskell or Common Lisp. (I've just fiddled with them; I don't know them at any usable level.)

I think when you understand the canonical representation of program structures, and how the flow of control goes, it does seem quite sensible.

On the other hand, when I tried Scala, the type system always got in the way. The guys in the forum were really helpful, but they referred to things that weren't in the documentation / were hard to find in the documentation. (It was pretty badly organized.) It took too much time to satisfy the compiler, and I didn't see the gain in code robustness. I started to really appreciate Java's half-baked generics. (Where usually you don't actually need to use generics due to type erasure, but if you do, the IDE will offer better auto-completion.)

"Advanced primitives"? (1)

jfengel (409917) | more than 3 years ago | (#36583932)

So it's got "unusual syntax" AND builtin oxymorons? Sign me up!

Why Ozma? (1)

Anonymous Coward | more than 3 years ago | (#36586306)

It's true that the Oz runtime system, libraries, and development environment are not as good as mainstream systems like Java. But that's not the point!

The point is concurrent programming: Ozma (and Oz) have a concurrency model worlds beyond that of Java, and much better even than Scala or Erlang. Simple, easy to program in, and with very few race conditions (zero if your program uses only dataflow and no ports). If you haven't seen this model, you can't imagine how much better it is than the crud that Java gives you. The purpose of Ozma is to make it easy for developers to learn about this model. That's why we picked Scala as the foundation for Ozma, because Scala is a well-designed and well-implemented language with a fairly large community, and it has some momentum as a possible successor to Java. We hope that Scala programmers will take a look at Ozma and like what they see.
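For readers who haven't seen the model: a port pairs a send operation with a stream that a single reader consumes, so many senders can race on the port while the reader still sees an ordinary list. The following is only a hypothetical sketch of that style; `Port.newPort`, `send`, and the `thread` statement are assumed names based on the summary above, not a verified Ozma API:

```scala
// Hypothetical Ozma-style sketch; this is NOT standard Scala, and the
// Port API names here are assumptions, not verified against Ozma.
val (stream, port) = Port.newPort[Int]

// Any number of threads may send concurrently; the port serializes
// the messages onto the stream.
thread { port.send(1) }
thread { port.send(2) }

// The single reader folds over the dataflow stream; this is the only
// place where the nondeterministic arrival order becomes observable.
stream.take(2).foreach(println)
```

Programs that use only dataflow variables stay deterministic; ports are the one place nondeterminism enters, which is why the dataflow-only subset can be claimed race-free.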
