
The Return of Ada

Zonk posted more than 6 years ago | from the lord-byron-will-be-in-the-sequel dept.

Programming 336

Pickens writes "Today when most people refer to Ada it's usually as a cautionary tale. The Defense Department commissioned the programming language in the late 1970s, but few programmers used Ada, claiming it was difficult to use. Nonetheless many observers believe the basics of Ada are in place for wider use. Ada's stringency causes more work for programmers, but it will also make the code more secure, Ada enthusiasts say. Last fall, contractor Lockheed Martin delivered an update to ERAM, the Federal Aviation Administration's next-generation flight data air traffic control system — ahead of schedule and under budget, which is something you don't often hear about in government circles. Jeff O'Leary, an FAA software development and acquisition manager who oversaw ERAM, attributed at least part of that success to the use of Ada, which was used for about half the code in the system."


I used ada.... (4, Informative)

aldousd666 (640240) | more than 6 years ago | (#23079338)

In school. It wasn't actually any different from very many other languages that have huge class libraries, it's just that they were all 'included' in the language instead of linked in separately. It's more verbose and stuff, but I didn't see any completely foreign concepts in Ada that aren't around in most other languages. Just more typing, from what I remember.

Re:I used ada.... (2, Insightful)

Xipher (868293) | more than 6 years ago | (#23079416)

I had to use Ada in my Data Structures class with John McCormick at UNI http://www.cs.uni.edu/~mccormic/ [uni.edu] . This guy teaches a tough class but you end up learning a lot, and he is very big on Ada. While I haven't used it much since, I did like a lot of features in the language.

Re:I used ada.... (1)

Teese (89081) | more than 6 years ago | (#23079828)

Yeah, but McCormick was the only one who taught using Ada; practically every other class was C++ (well, except for Wallingford). Of course, I was there 10 years ago, so I guess I have no idea what goes on there now.

Re:I used ada.... (1)

iamhigh (1252742) | more than 6 years ago | (#23079468)

Me too... even graduated first in my little AF programming class (Keesler AFB). And I can't code for shit.

Re:I used ada.... (1)

Red Weasel (166333) | more than 6 years ago | (#23079736)

The great part about being a programmer for the Air Force is how easy it was to get a waiver to use a more appropriate language. ADA was used in tech school and that was about it.

Now that I've left I STILL find ADA code running from the 70s. Upgrading it is a major bitch, but if you are going to sling code for the government it would be good to know the basics.

The ability to import Java and C into ADA is a real boon.

Re:I used ada.... (2, Informative)

aldousd666 (640240) | more than 6 years ago | (#23079814)

They used it in real school too; I never had a Java course, and I get along just fine today in 'the real world.' Admittedly I don't use Ada either, but it worked well enough for Data Structures and Algorithms classes.

Re:I used ada.... (0, Troll)

Mad Merlin (837387) | more than 6 years ago | (#23079762)

Well, that and Ada's I/O is pretty terrible. Ever try reading in and storing an arbitrary length string? I'm fairly convinced it's not possible in Ada.

Re:I used ada.... (4, Insightful)

Chris Mattern (191822) | more than 6 years ago | (#23079928)

Ever try reading in and storing an arbitrary length string? I'm fairly convinced it's not possible in Ada.


It's not possible anywhere, unless you have access to an arbitrarily large memory. Ada simply makes you aware of that fact before you put the code into production.

Re:I used ada.... (1)

dpilot (134227) | more than 6 years ago | (#23080576)

Sounds like an incipient buffer overflow, to me.

Re:I used ada.... (2, Informative)

wfstanle (1188751) | more than 6 years ago | (#23080556)

That's the root cause of buffer overflows. If you can't commit such "crimes", security is improved.

Re:I used ada.... (0)

Anonymous Coward | more than 6 years ago | (#23079802)

More typing is good.

Remember the Mars probe that crashed because US contractors used the US system and US scientists used the metric system? It wouldn't have happened in Ada, since the US system would be using SpeedMPH and the metric system would be using SpeedKMPH, both derived types from whatever precision decimal is required, and attempting to use a SpeedMPH decimal as a SpeedKMPH would be a compile time error and would be caught.

(OK, so that example is rather contrived, since as I recall the error was actually during IPC, and if that occurs outside the language constructs there's not much any language can do to determine that the program that's supposed to be giving it km/h is really giving it mph. But if they were both Ada libraries, it would be a compile time error.)

That's strong typing used correctly. Contrast it to Java "strong" typing, which only works with objects and not with "primitive" types despite "primitive" typing being far, far more useful than class typing.
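A minimal sketch of that derived-type idea in Ada (the type names and conversion factor here are invented for illustration):

with Ada.Text_IO; use Ada.Text_IO;

procedure Units_Demo is
   type Speed_MPH  is digits 6;
   type Speed_KMPH is digits 6;

   Ground_Speed : Speed_MPH := 120.0;
   Limit        : Speed_KMPH;
begin
   -- Limit := Ground_Speed;                        -- rejected at compile time: different types
   Limit := Speed_KMPH (Ground_Speed) * 1.609_344;  -- an explicit conversion is required
   Put_Line (Speed_KMPH'Image (Limit));
end Units_Demo;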

Re:I used ada.... (4, Insightful)

TargetBoy (322020) | more than 6 years ago | (#23079860)

Likewise, I also used Ada in college.

I found it very easy to work with, and it's only slightly more verbose than VB or PowerBuilder.

Frankly, a language that forces programmers to do the right thing up front might just be the thing to do. It's always faster to re-type something than to try to find the bug in your code after it is running.

Re:I used ada.... (2, Insightful)

harmonica (29841) | more than 6 years ago | (#23080358)

It's more verbose and stuff, but I didn't see any completely foreign concepts in Ada that aren't around in most other languages.

However, Ada had a lot of those concepts working reliably in 1983 and 1995 (the years the first two major versions were released, if I remember correctly), when most other people were using not-so-sophisticated languages, to put it mildly.

Getting into Ada is rather complicated and time-consuming, though, so it's not surprising that it never took off in a big way.

Skill and not language used? (5, Insightful)

thedak (833551) | more than 6 years ago | (#23079390)

I may just be a whippersnapper (get off my lawn and whatnot) speaking as a Java, C, and C++ coder, but attributing the project's completion under budget and ahead of deadline to Ada itself seems rather misguided to me.

As far as I'm concerned, if a competent team of skilled programmers and developers is hired, then they could get it done under budget and ahead of deadline. (Yes, yes, military intelligence, oxymoron, but it seems to have worked out with this project.)

I think the headline could later read "The Return of C", or any other language in the future, if a team manages to finish a project efficiently thanks to skilled developers.

Praise of the language used isn't necessarily warranted; a congratulatory beer for the team may be advised.

Re:Skill and not language used? (1)

CogDissident (951207) | more than 6 years ago | (#23079510)

Oddly, they're saying a language which is slower for people to write in, and considerably more obscure than most languages, is the reason something was done under budget and quickly? It seems like those traits would make it more secure, but also take much longer to build...

Re:Skill and not language used? (5, Insightful)

Digi-John (692918) | more than 6 years ago | (#23079554)

Perhaps a language which is slower and a bit more difficult to write prevents programmers from dumping so many lines of semi-working crap, requiring them to put a little more thought into the code?

Re:Skill and not language used? (2, Interesting)

hey! (33014) | more than 6 years ago | (#23079806)

I think there is something to be said for this idea. Not too much, mind you, but something.

Let's imagine a language so obscure and difficult, that 90% of working programmers cannot gain sufficient mastery of it to understand what it is saying at first glance. This sounds terrible, until you realize that every programmer at some time in his life has written code in "friendly" languages that 0% of programmers (including his future self) can understand. And maybe selecting a language that only the top 1% of programmers is capable of using might be a good thing for some projects.

Unfortunately, I don't think you can mandate thinking via language restrictions.

If you really, really wanted to improve the quality of thought in code, you wouldn't mandate languages, you'd mandate editors that don't support cut and paste. Then instead of taking a piece of code that works more or less for one purpose and hammering it into the approximate shape you'd need for something else, sooner or later you'd be forced to abstract what was useful about it rather than banging it out over and over again.

Re:Skill and not language used? (1)

lorenzo.boccaccia (1263310) | more than 6 years ago | (#23080484)

If you need really, really good code, you need to mandate that people use their brains; then even C could be abstract and object-oriented.

Re:Skill and not language used? (0)

Anonymous Coward | more than 6 years ago | (#23080542)

All for it!

Re:Skill and not language used? (3, Insightful)

larry bagina (561269) | more than 6 years ago | (#23079572)

Aside from writing code, you also have to test it. That extra security/straightjacket can mean it works right the first time.

Re:Skill and not language used? (2, Insightful)

somersault (912633) | more than 6 years ago | (#23079890)

That's what I was thinking too. Testing should be a major part of any project - and especially one where large pieces of metal hurtling through the air are involved :p

Re:Skill and not language used? (4, Informative)

everphilski (877346) | more than 6 years ago | (#23079616)

Oddly, they're saying a language which is slower for people to write, and considerably more obscure than most languages, is the reason something is done under-budget and quickly? It seems like those traits would make it more secure, but take much longer to make...

You need to make a distinction: they weren't writing new code, they were updating existing code. This is a very important distinction. We are all aware of "code rot" [wikipedia.org] , etc. and how over time documentation gets lost, people have to re-learn a piece of code based purely on the source, etc. However they took an older piece of code and revamped it, right on time and under budget. This is notable, and may be attributable to some of the properties of Ada [wikipedia.org] .

Maybe, maybe not, but there's a good chance it had something to do with Ada.

Re:Skill and not language used? (4, Insightful)

Ephemeriis (315124) | more than 6 years ago | (#23080290)

You need to make a distinction: they weren't writing new code, they were updating existing code. This is a very important distinction. We are all aware of "code rot" [wikipedia.org] , etc. and how over time documentation gets lost, people have to re-learn a piece of code based purely on the source, etc. However they took an older piece of code and revamped it, right on time and under budget. This is notable, and may be attributable to some of the properties of Ada [wikipedia.org] .

  Maybe, maybe not, but there's a good chance it had something to do with Ada.
Ada is almost self-documenting. The syntax is all very verbose and human readable.

If you have to walk in blind and maintain someone else's code, Ada is the language to do it in.

Re:Skill and not language used? (1)

afidel (530433) | more than 6 years ago | (#23079632)

Not necessarily. If you only get the top 10% of programmers because of the job requirements, then it's easy to see how a difficult-to-learn language could lead to better, more consistent code. I know my coding skills suck, which is why I went into sysadmining instead of programming, but I would technically qualify for many programming jobs and could even stumble through making code that would pass general inspection. There really is a TON of difference between a great coder and an average or mediocre coder.

Re:Skill and not language used? (0)

Anonymous Coward | more than 6 years ago | (#23080194)

if you only get the top 10% programmers because of the job requirements

So you're saying the top 10% of programmers prefer Ada?:)

Re:Skill and not language used? (5, Insightful)

Detritus (11846) | more than 6 years ago | (#23079658)

One of the advantages of a language like Ada is that more problems can be detected at compile time and corrected at low cost, as opposed to languages like C that assume that you know what you're doing and are optimized for speed. Ada also has run-time checks that can catch many problems. It's usually more efficient for the project to do the work up-front, rather than to hack together something and debug it.

Re:Skill and not language used? (1, Insightful)

hitmark (640295) | more than 6 years ago | (#23079892)

and debug, debug, debug, reimplement, debug, debug, debug, reimplement, debug, debug, debug, discard/emulate on top of new hacked together system, and the pattern goes on...

Re:Skill and not language used? (0)

Anonymous Coward | more than 6 years ago | (#23080144)

But, but, but ... that's the Agile refactoring cycle!

Re:Skill and not language used? (2, Insightful)

samkass (174571) | more than 6 years ago | (#23079694)

Not at all. Most of the time spent between project kickoff and software delivery is NOT spent actually typing the code. If you program in a language that makes it harder to write bugs, easier to find bugs, easier to express algorithms, easier to read other people's code, and easier to do automated testing and verification you'll save huge amounts of time even if coding takes several times as long.

Interestingly, that's one of the arguments in support of Java. Hardcore C or C++ hackers find it cumbersomely verbose, but it's pretty easy to read any Java coder's source code from anywhere in the world and debug it with relatively little time spent in the code archeology phase. That is, for those bugs that make it past the relatively extensive automated checks that are possible because of the straightforward syntax.

Re:Skill and not language used? (2, Insightful)

Anonymous Coward | more than 6 years ago | (#23079784)

but it's pretty easy to read any Java coder's source code from anywhere in the world
Only if India's on the moon.

Re:Skill and not language used? (3, Insightful)

aldousd666 (640240) | more than 6 years ago | (#23079898)

Haha, in reference to your Java being easy to read by default, you're forgetting about people like this [p-nand-q.com] .

Re:Skill and not language used? (1)

laddiebuck (868690) | more than 6 years ago | (#23079696)

A more reliable language means less debugging -- and that's typically what a C/C++ programmer spends most of his/her time doing.

Re:Skill and not language used? (1)

conspirator57 (1123519) | more than 6 years ago | (#23079818)

two words: type safety

this one feature prevents much debugging and allows better, more automated test coverage.

they coded slower, but took less time in QA.

Re:Skill and not language used? (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23079562)

Ada is a strong programming language and it is harder to make some of the more common stupid mistakes in Ada. But I think you are right and especially so with Ada. I haven't seen a junior Ada programmer in a LONG time. Projects using Ada tend to have senior people who understand their domain and don't make newbie mistakes.

When you combine a language that catches errors at compile time rather than at run time with experienced people, you are going to get good results.

Reword the title (0)

Anonymous Coward | more than 6 years ago | (#23079666)

How about...

>>>Despite use of ADA, ERAM comes in ahead of schedule and under budget<<<

It's all in the spin, see.

Re:Skill and not language used? (1)

plague3106 (71849) | more than 6 years ago | (#23079670)

I agree, given that only half of the code was Ada code. Maybe it was all the non-Ada code that saved the project from Ada.

Re:Skill and not language used? (5, Insightful)

Jason King (909684) | more than 6 years ago | (#23079730)

Yes, 10 uber-coders can finish a project ahead of 10 clueless coders every time. What Ada does is make it harder for the clueless coder to hose the whole system. Because it's persnickety, you don't find buffer overruns (for example) in the wild. You always catch them in test, or sometimes they even generate compiler errors. The earlier in the cycle you get your errors, the easier they are to fix.

Re:Skill and not language used? (5, Insightful)

wfstanle (1188751) | more than 6 years ago | (#23079846)

You are forgetting something... Actually writing the original code takes up a small part of the total time spent on a program over its life cycle. There is debugging, testing and updating to be considered. I have updated programs written in Ada and in other programming languages. Have you actually had to read code written by others? Reading a C or C++ program is not easy. Some say that C (and all of its derivatives) are "write-only languages". At least in Ada, it is easier to make sense of the code that others write.

Stringency==Secure (4, Funny)

Zordak (123132) | more than 6 years ago | (#23079428)

Ada's stringency causes more work for programmers, but it will also make the code more secure, Ada enthusiasts say.
So... you're saying I should ideally program my firewall in INTERCAL?

Re:Stringency==Secure (1)

somersault (912633) | more than 6 years ago | (#23079988)

He said stringency, not lunacy..

Ada (4, Informative)

nwf (25607) | more than 6 years ago | (#23079514)

I took a class in Ada for a previous employer. I found it a lot like Pascal and not all that difficult. The main issue was the cost of compilers, which had to go through an expensive certification process. I did find the language a bit verbose for many things, e.g. here [adapower.com]

The real issue isn't that it's hard to learn, it's that it's a little cumbersome, but more importantly, not many people know it and the typical clueless manager wants to see 10+ years of Ada experience on the resume/cv before hiring someone. Those people are few and far between, but any competent software developer can learn it.

Re:Ada (5, Insightful)

Maaras (1171713) | more than 6 years ago | (#23079738)

You hit the nail on the head. I wrote Ada on a defense project for about 4 years. From a purely technical standpoint, it is the best programming language that I have ever used. However, in the real world, other concerns tend to dominate. Concerns such as IDEs (AdaCore's IDE was exceptionally slow and hard to use, on Solaris at least) and finding developers who know Ada (or are willing to REALLY learn it) counter-balance a lot of Ada's strengths. What good is the best language on Earth if you can't get developers to use it?

Re:Ada (1)

Asuranceturix (1265166) | more than 6 years ago | (#23080390)

Its syntax is fairly easy, and it provides means to avoid common mistakes made in other languages, such as the ability to derive new types from primitive types and strong typing even with numeric types. Thus, I could define types *and* operators which are really significant for the problem at hand, no matter which machine it runs on:

type Distance is digits 4 range 0.0 .. 100000.0;
type Time is digits 4 range 0.0 .. 1000.0;
type Speed is digits 4 range 0.0 .. 200.0; -- Based on physical constraints, for instance

function "/" (Left : Distance; Right : Time) return Speed is
begin
   return Speed (Float (Left) / Float (Right)); -- the original post elided the body; dividing a Distance by a Time yields a Speed
end "/";

This way, there is no way I can mix up arguments; a Time/Distance operation is undefined, for example, and will cause a compile-time error. Besides, I don't have to care what size those variables take; I just specify what I really need, and the compiler and runtime will take care of the details.

On the other hand, advantages such as this one only work if the programmer adapts herself to this way of thinking, instead of using predefined types as she would have done in C.

For a quick review of interesting characteristics of the language which make it useful to avoid mistakes (and, for the purpose of the article, cut debugging time), you can refer to this set of slides (PDF) [adacore.com] .

Re:Ada (1, Informative)

Anonymous Coward | more than 6 years ago | (#23080424)

It's not expensive any more. Ada is included with GCC. It's also known as GNAT.

Ada is lovely (0)

davidwr (791652) | more than 6 years ago | (#23079528)

Lovelace-y [wikipedia.org] to be precise.

Cheesy movie title? (0)

Alzheimers (467217) | more than 6 years ago | (#23079542)

Can "Zombie Babbages From Space" be far behind?

I clicked too early... (1)

abolitiontheory (1138999) | more than 6 years ago | (#23079544)

... i thought this story's title read, "the return of ABBA."

*snaps fingers*, man, i thought this was going to be the year!

Re:I clicked too early... (3, Funny)

eclectro (227083) | more than 6 years ago | (#23080024)

... i thought this story's title read, "the return of ABBA."

ABBA was an early 80's rock band. ADA was an early 80's programming language. ABBA is seeing a resurgence in interest now. ADA is also seeing increased interest now. ABBA consisted of four singers, and ADA consisted of four programming languages.

Coincidence? I too think not. Take a chance on it.

Re:I clicked too early... (1)

skoaldipper (752281) | more than 6 years ago | (#23080468)

Take a chance on it.
Was expecting "take a chance on me" instead, but I got it.

Overall, well done, sir. Well done.

Oh, yeah, and by the way, is it any coincidence that they're both palindromes? And, what's really surprising, if you mathematically add up the letters, ABBA = ADA, where A=1, B=2 [...] Z=26. Coincidence? I think not too.

Re:I clicked too early... (0)

Anonymous Coward | more than 6 years ago | (#23080696)

Coincidence? Well, since the language's name is Ada, not "ADA," I say "yes."

Have Apple or Google do the work (1)

backpackcomputing (1249130) | more than 6 years ago | (#23079552)

Here's a crazy idea: DoD should consider contracting out the work to Apple or Google! http://backpackcomputing.com/ [backpackcomputing.com]

Language Magic Bullets (2, Insightful)

msgmonkey (599753) | more than 6 years ago | (#23079556)

Most projects fail to meet their goals and/or timeframes because of bad project management. Whilst the choice of language is obviously important, I have never heard of the choice of language being cited as a major factor when reporting on contracts that are overdue, over budget, or just plain don't work.

I'm just worried that some PHB will read this and go, "Hmm, Ada, we must use that!"

Btw Ada isn't that bad a language, but it doesn't guarantee success. I remember being told that an Ariane rocket that exploded mid-flight was written in Ada; the cause was an overflowing integer.

Re:Language Magic Bullets (1)

Maaras (1171713) | more than 6 years ago | (#23079662)

I remember being told that an Ariane rocket that exploded mid-flight was written in Ada; the cause was an overflowing integer.
Can you provide a citation for this? I've used Ada before and would be curious to read about how such a thing could have happened.

Re:Language Magic Bullets (5, Informative)

Detritus (11846) | more than 6 years ago | (#23079794)

It was the first Ariane V launch. They had reused software from an earlier model of the Ariane without properly testing it in its new environment. Think of it this way: you take the speedometer module from your Trabant and install it in a Ferrari. The first time that you exceed 100 km/h, the speedometer module fails with an overflow error because the type for speed was defined as 0..100. The problem was that Ariane's management was cutting corners on requirements analysis and testing. The software performed as designed; it just wasn't designed for the Ariane V.

Re:Language Magic Bullets (1)

rtaylor (70602) | more than 6 years ago | (#23079812)

I don't know any of the specifics but an integer overflow is still very plausible.

A remote device might have settings 1 through 10. Send it an 11 and it has problems. ADA is still perfectly capable of overflowing a remote device if a programmer doesn't adhere to a communication protocol, even if the local machine sending commands isn't susceptible to overflows.

Re:Language Magic Bullets (1)

bunratty (545641) | more than 6 years ago | (#23079838)

Wikipedia has an article on the Ariane crash [wikipedia.org] . If you think something in the article is inaccurate, you can always check the sources.

Re:Language Magic Bullets (1)

Maaras (1171713) | more than 6 years ago | (#23079964)

Oh, I never would have argued that what he said wasn't true, I was merely asking out of curiosity. Thanks for the link :)

Re:Language Magic Bullets (1)

Nicolas Roard (96016) | more than 6 years ago | (#23079968)

It's very well known. When I was a freshman in CS (in 96!), most of the teaching was done in Ada (some of the profs were really big on it), and they described the problem. As far as I remember, this is what happened:
- they chose to reuse a previous component made for Ariane 4 (ok...)
- they chose to NOT test it as it was working fine on Ariane 4 after all... in order to "cut costs"
- the flight parameters were quite different between Ariane 4 and 5, and notably, some were out of the range experienced with Ariane 4
- an exception was thus triggered in the component (which had something to do with the navigation system), but there was no code to handle it (iirc there was code, but it had been disabled when compiling the component)
- the default exception mechanism was executed, and decided to reset the full navigation system
- the rocket then changed behaviour (duh!) and started to veer off from the planned flight path, and at that point a ground engineer decided to blow it up.

To sum it up: by deciding to "cut" cost by not testing a component they lost millions of euros. Nothing the language could have done, really.

I never had to program in Ada after my BS, but I did like the language. It really felt like an engineering language you could rely on -- we joked at the time that if you could compile your code, your program was probably right. On the other hand it was quite verbose and the compiler was really strict (but with wonderful error messages, never found something as good since).

Funnily enough, I'm myself very keen on dynamic languages (ObjC/Smalltalk...) and usually don't like static languages. Still, I kinda like Ada.

Re:Language Magic Bullets (4, Informative)

TheRaven64 (641858) | more than 6 years ago | (#23080248)

Actually, it's slightly more depressing than that. The component in question was a gyroscope used for immediately post-launch corrections on IV. It wasn't required at all in V due to improvements in other areas, but it was kept in for extra reliability.

The old 16-bit gyroscope controller was replaced with a 32-bit one but the software was kept the same. The software got an invalid input, diagnosed it as a fault and shut itself down. The backup was brought online, got an invalid input, and shut itself down. At this point, the system determined that the rocket was unsafe and caused it to self destruct. By this point, it was already at a higher altitude than the gyroscope was intended to operate (the V accelerated faster and so got above this threshold much faster than the IV).

Re:Language Magic Bullets (1)

HiThere (15173) | more than 6 years ago | (#23080264)


Ariane V http://www.adapower.com/index.php?Command=Class&ClassID=FAQ&CID=328 [adapower.com]

OTOH, remember the link is to a site promoting Ada. They're telling the truth as they know it, but they're biased.

Re:Language Magic Bullets (1)

dpilot (134227) | more than 6 years ago | (#23080618)

They may be biased, but they're also informed. Would you prefer a summary from a C coder who knew nothing whatsoever about Ada? A knowledgeable person can make efforts to set aside bias. An ignorant person can't set aside ignorance, because by doing so he becomes knowledgeable.

Re:Language Magic Bullets (0)

Anonymous Coward | more than 6 years ago | (#23079668)

I agree 100%, especially since the text does not specify the language used for the other half of ERAM (which is C++, BTW).

I really have no idea how they can attribute the merits of being under budget and ahead of schedule to one language more than the other.

Re:Language Magic Bullets (1)

Ford Prefect (8777) | more than 6 years ago | (#23079728)

Btw Ada isn't that bad a language, but it doesn't guarantee success. I remember being told that an Ariane rocket that exploded mid-flight was written in Ada; the cause was an overflowing integer.

I was wondering if anyone would mention that - the first non-flight of the Ariane 5 booster [wikipedia.org] .

It's possible to program bugs in any real programming language...

Re:Language Magic Bullets (4, Insightful)

mihalis (28146) | more than 6 years ago | (#23079872)

Bug? Phooey - the software in question performed exactly as per spec... the spec for the Ariane 4 rocket, that is.

Re:Language Magic Bullets (1)

Chatterton (228704) | more than 6 years ago | (#23079942)

Especially when you deactivate checks.

Re:Language Magic Bullets (1)

Mikkeles (698461) | more than 6 years ago | (#23079970)

'...Ariane rocket that exploded mid-flight...'

That was the first Ariane V flight. Wikipedia has an article [wikipedia.org] and the accident report is here [mit.edu] .

Basically, the error was an integer overflow in code reused from the Ariane IV, for which the range check had been deliberately eliminated as a performance measure since (in the Ariane IV) it could not overflow. That assumption is not valid for the Ariane V.

To add insult to injury, the functionality provided by the component is not needed at or after liftoff, so it could safely have been disabled at that time. Further, the original reason for the function was to allow, in older Ariane models, the ability to suspend the launch sequence at or after -9 seconds and resume without having to redo some steps.

The lesson is that code cannot save one from a wrong design (although bad code can ruin a good design).

In Soviet Amerika (-1, Flamebait)

Anonymous Coward | more than 6 years ago | (#23079564)



Ada returns you.

You'll have to get Russian programmers because in Soviet Amerika, education system is as bankrupt as country.

P.S. Have fun paying high price for your addiction to oil.

Re:In Soviet Amerika (1)

Anita Coney (648748) | more than 6 years ago | (#23079648)

"P.S. Have fun paying high price for your addiction to oil."

I can't help but wonder what industrialized country you live in that is somehow not addicted to oil.

Re:In Soviet Amerika (0)

Anonymous Coward | more than 6 years ago | (#23080052)

France gets 80% of its electricity from nuclear power.

Re:In Soviet Amerika (0)

Anonymous Coward | more than 6 years ago | (#23080094)

90% of all statistics can be made to say anything 50% of the time. Take your Soviet Amerika and shove it pal.

Re:In Soviet Amerika (1)

Anita Coney (648748) | more than 6 years ago | (#23080560)

We're talking about an alleged "addiction" to oil. In the US only 7 percent [doe.gov] of our electric power is created from petroleum. Therefore, 93% of our electricity is not created by oil. So, what exactly is your point about France? Thanks. I won't be holding my breath.

Re:In Soviet Amerika (1)

maxwell demon (590494) | more than 6 years ago | (#23080686)

According to these statistics [insee.fr] , about 1/3 of the energy in France comes from oil (or at least, came from oil two years ago), with another 1/7 from natural gas. Remember, electricity is only part of total energy consumption. Electric cars are the exception, and so is electric heating of buildings.

Having 33% of your energy production dependent on oil can well be considered "addicted". Were the oil sources to disappear overnight, you'd certainly have much trouble (as would the rest of the industrialized world).

Re:In Soviet Amerika (1)

bobbonomo (997543) | more than 6 years ago | (#23079758)

I just have to answer a post like this (and all the rest like it), not for the content, because it is one opinion among others, but because the poster did not have the courage to attach a name to it.

Do you not believe in what you say? ...and have the guts to stand up for it? What, are you worried the NSA is going to put a death squad after you? Come on!

Or is this just flame bait?

Maybe it's time to eliminate Post Anonymously on /. I tune in and post less frequently because of it.

You got something to say? Say it. I'm listening.

Re:In Soviet Amerika (1)

zienth (890583) | more than 6 years ago | (#23080280)

Please don't feed this troll.

It is not just the language (0)

MarkWatson (189759) | more than 6 years ago | (#23079590)

I think that any strongly typed language with lots of compile time and link time checks would be about as good (e.g., Java).

That said, it has been a very long time since I used Ada, so I might have forgotten what extra things Ada does for you.

Re:It is not just the language (5, Interesting)

dprovine (140134) | more than 6 years ago | (#23079944)

I think that any strongly typed language with lots of compile time and link time checks would be about as good (e.g., Java).

In Ada, you can declare a variable to be an integer in the range 1..100, and if it goes outside that range at any point during its lifetime, an exception is immediately thrown. In most languages, you'd have to check it every time you assign it.

Also, you can declare new types which not only define ranges but wall themselves off from each other. If you declare "MonthType" and "DateType" as types, and then ThisMonth and ThisDate as variables, you can't assign ThisMonth to ThisDate (or vice-versa) without an explicit cast, even if the value stored is within range.
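A small sketch of both ideas in Ada (a hypothetical example; the names just echo the ones above):

with Ada.Text_IO; use Ada.Text_IO;

procedure Range_Demo is
   type Month_Type is range 1 .. 12;
   type Date_Type  is range 1 .. 31;

   This_Month : Month_Type := 12;
   This_Date  : Date_Type  := 1;
begin
   -- This_Date := This_Month;            -- rejected at compile time: different types
   This_Date := Date_Type (This_Month);   -- fine with an explicit conversion

   This_Month := This_Month + 1;          -- 13 is out of range: Constraint_Error at run time
   Put_Line (Month_Type'Image (This_Month));
exception
   when Constraint_Error =>
      Put_Line ("value went out of range");
end Range_Demo;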

I programmed in Ada more-or-less exclusively for a year, with all the warnings possible turned on, and it did change a bit how I think about programming. I always know, instantly, what type any object is and what its limits are, because I got so used to thinking about those things when using Ada.

Not that it's perfect, or the ultimate, or anything. I had a job where I wrote C only for about 2 years, and that definitely changed how I thought about programming too. When writing C++ I have a sense of what the computer is going to have to do to actually run the code.

There's a quote that any language which doesn't change how you think about programming isn't worth knowing. Ada built up my mental macros for making sure my types and values were in order, and for that alone it was worth learning and using for a year.

Re:It is not just the language (3, Insightful)

Alistair Hutton (889794) | more than 6 years ago | (#23080058)

I think that any strongly typed language with lots of compile time and link time checks would be about as good (e.g., Java).

Java: all the verbosity of Ada without any of the benefits. I can't work out how Java managed to make programmers type so many characters without achieving anything. Java's compile-time checking is decent but seriously weak when compared to Ada. I've always liked the Ada compiler pointing out my spelling mistakes :-)

we need it where it matters (3, Interesting)

r00t (33219) | more than 6 years ago | (#23079612)

All that vulnerable client-side code (image libraries, HTML parser, etc.) would be immune to buffer overflows if it were in Ada.

Even better, write it in proof-carrying Ada. (While a fully general theorem prover is impossible, one can get a theorem prover to work in practice via minor tweaks to the input.)
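As a tiny illustration of the bounds checking being referred to (the buffer and index here are made up), an out-of-range access in Ada raises an exception instead of silently scribbling over adjacent memory:

procedure Overflow_Demo is
   Buffer : String (1 .. 8) := (others => ' ');
   Index  : Integer := 9;     -- pretend this came from untrusted input
begin
   Buffer (Index) := 'x';     -- raises Constraint_Error instead of corrupting memory
end Overflow_Demo;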

Re:we need it where it matters (2, Informative)

dpilot (134227) | more than 6 years ago | (#23080640)

I was grabbing someone's .sig and stuffing it into my "quotes" file, and found this relevant tidbit...

Imagine a surgeon who discovers how much money can be saved by purchasing Xacto blades instead of using blades manufactured to more stringent standards. That is exactly the situation we are currently facing when contractors decide to use C or C++ instead of Ada. On the surface one gets the same result. It is only that superficial result that counts for the lowest bidder.

- Richard Riehle

Look at Oracle's PL/SQL - it is ADA (1)

oldwarrior (463580) | more than 6 years ago | (#23079622)

for the database. Oracle must have invested in an ADA parser and then salvaged it for their stored procedure language when they needed one in a hurry. Ada is a beast - but a nice beast. Java now has most of the worst parts of Ada, but it runs slower.

Generating Ada CSDs (4, Informative)

HockeyPuck (141947) | more than 6 years ago | (#23079680)

If anyone is programming in Ada, I highly recommend the program jGRASP http://www.jgrasp.org/ [jgrasp.org] . From the site:

jGRASP is a lightweight development environment, created specifically to provide automatic generation of software visualizations to improve the comprehensibility of software. jGRASP is implemented in Java, and runs on all platforms with a Java Virtual Machine (Java version 1.5 or higher). jGRASP produces Control Structure Diagrams (CSDs) for Java, C, C++, Objective-C, Ada, and VHDL; Complexity Profile Graphs (CPGs) for Java and Ada; UML class diagrams for Java; and has dynamic object viewers that work in conjunction with an integrated debugger and workbench for Java. The viewers include a data structure identifier mechanism which recognizes objects that represent traditional data structures such as stacks, queues, linked lists, binary trees, and hash tables, and then displays them in an intuitive textbook-like presentation view.
Another great product from the academic community.

Anything that forces discipline is good. (3, Interesting)

ErichTheRed (39327) | more than 6 years ago | (#23079732)

I don't think I'm the only one who has had to work with really lousy programming and IT coworkers. One of the good things about the past was that programmers had a much harder time hiding their mistakes. In the days of dual-core processors and tons of RAM, even a mediocre programmer can get Java or any of the .NET languages to produce code that works. Of course, readability, maintainability and speed aren't really a factor.

Is going back to Ada and other similar languages a good idea? Maybe. But I think you could get the same result by just demanding better quality work out of existing languages. People have correctly pointed out that the languages aren't really to blame, because you can write garbage in just about any language.

I sound like an old fogey, but I'd much rather see a smaller IT workforce with a very high skill set than a huge sea of mediocre IT folks. This would help combat outsourcing and the other problems affecting our jobs. Almost everyone I've heard complaining the loudest about outsourcing has been either downright lazy or just not very good at what they do.

I'm primarily a systems engineer/administrator. There are many parallels in my branch of IT to the development branch. We've got the guys who can really pick a system apart and get into the guts of a problem to find the right answer. We also have the ones who search Google for an answer, find one that solves half the problem, and wonder why the system breaks a different way after they deploy it.

Not sure how to solve it, but I think it's a problem that we should work on.

Re:Anything that forces discipline is good. (3, Insightful)

Petaris (771874) | more than 6 years ago | (#23080600)

What's wrong with searching Google?

It's a tool that can lead you to valuable information, just as asking a colleague or consulting a book or other publication can. No one knows everything or has come across every issue, but there is usually a good chance someone has. Just because you have seen someone use it to find information and then do a half-assed job of fixing the issue doesn't mean the tool they used is no good or always lends itself to half-assed fixes.

I'm quite fond of Ada (3, Interesting)

Alistair Hutton (889794) | more than 6 years ago | (#23079742)

I'm actually quite fond of Ada as a language. Yes, it's a very verbose language, but unlike, say, Java or C#, the verbosity gives you a lot of stuff. It gives you good threading. It gives you very good encapsulation. It gives you a very nice parameter system for procedures/functions. That's a point: it separates procedures from functions. It gives very, very, very good typing. Very good typing. It's very good. I like it. It's what I want when I'm doing strong, static typing, rather than the wishy-washy, getting-in-the-way mess that many other mainstream languages provide. When I use a type I want it to mean something. It's a good language to teach students about programming, in my opinion.

shhh! don't go blabbing this all over the place (5, Interesting)

museumpeace (735109) | more than 6 years ago | (#23079790)

I make a nice living rewriting Ada systems into C++. When DoD suspended the "only quote us system development costs based on Ada" requirement, most bidders dropped Ada like a burning bag of poop. Its best advances, such as exception handling, have been picked up by modern system programming languages and even Java. The doctrinaire variable type enforcements have yet to be equaled, but OO it really ain't. Bottom line: plenty of old defense software systems have few living authors who will admit to knowing the code, and upkeep is expensive and talent hard to find. This is ironic since DoD spec'd Ada in the first place because it had a maintenance nightmare of hundreds of deployed languages. So of course the managers think a more popular language with "all the features" of Ada should be a porting target. Eventually even customers demanded modernization and compatibility ports.

I know a few die hard Ada programmers who just love it...but very few. The brilliance of the language can be debated but its moot: no talent pool to speak of.

And besides, Ada is really French. [why did GNU make an ada compiler??????????????]

technology market: you can't separate technical merits from market forces
open source: your market has a small leak and is slowly collapsing.

You scared me for moment there (3, Funny)

BigGar' (411008) | more than 6 years ago | (#23079804)

That was my aunt's name and she passed away many years ago.

I was running out the door with my zombie survival guide & bug out bag heading for my arctic hideout to escape the impending invasion and I noticed out of the corner of my eye a reference to programming.

Thank god, Aunt Ada was a tad weird when she was alive, I really didn't want to meet zombie Ada.

My problems with Ada (1)

HiThere (15173) | more than 6 years ago | (#23079850)

1) Ada is cumbersome, especially when you are dealing with character strings of variable length.

2) Garbage collection is managed by hand.

3) Poor interfacing with C++. (Well, so has most everything but C++.)

4) Programs in Ada tend to be HUGE!!!. That's the big one. I'm talking about source code size here, not binary size, which is reasonable.

5) If you don't know the type of data you're reading in, it's difficult to handle it. This is both a problem and an advantage.

6) Ada's handling of inheritance is very different from that of most other languages. Lisp is probably the language that comes closest to it. You could think of it as noun dominated rather than verb dominated. (I.e., instead of functional it's ??? I don't know the proper term. It works, but it's a very different model. Calling the inheritable types objects isn't really appropriate (and the term isn't used).)

Re:My problems with Ada (0)

Anonymous Coward | more than 6 years ago | (#23080420)

3) Poor interfacing with C++. (Well, so has most everything but C++.)

Actually C++ doesn't even interface well with itself between different compilers. That's one glaring, massive problem in the C++ specification: they didn't specify the ABI. I think that, and the long time it took to get the template system working, were huge contributors to the rise of all the crappy little "mid-level" compiled languages like Java, C#, etc.

To me there are just two good viable language designs: Low-level (C/C++) and scripting languages (Lua, Perl, etc.). The mid-level languages are the worst of both worlds (compiled and slow).

Re:My problems with Ada (3, Interesting)

drxenos (573895) | more than 6 years ago | (#23080458)

When is the last time you used Ada?

1) See: Ada.Strings.Unbounded.

2) Ada leaves that choice up to the programmer (like C++); see pragma Controlled. The next version of Ada will have an STL-like library, which will at least reduce the need for GC.

3) See: pragma Export (C, Foo, "foo") and Convention (C, Foo). Some compilers even support CPP in place of C, with automatic translation between C++ classes and exceptions.

5) You should always know the type of data you are dealing with (unless you are writing generics, which still has some limits for safety).

6) Ada's dispatching is based on the actual call being made. No need to mark members as virtual (C++) or to just make them all virtual (Java).
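As a rough sketch of the import direction of that C interfacing (point 3 above mentions the export direction; the Ada names here are made up, and "abs" is the standard C library function):

with Ada.Text_IO;  use Ada.Text_IO;
with Interfaces.C; use Interfaces.C;

procedure Interop_Demo is
   function C_Abs (N : int) return int;
   pragma Import (C, C_Abs, "abs");     -- bind C_Abs to the C library's abs()
begin
   Put_Line (int'Image (C_Abs (-3)));   -- prints " 3"
end Interop_Demo;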

Re:My problems with Ada (1)

T.E.D. (34228) | more than 6 years ago | (#23080582)

This is actually the best post I've seen on this topic so far. I do have some minor issues with it, though.

1) Ada is cumbersome, especially when you are dealing with character strings of variable length.

String issues are the single biggest bugaboo I've seen Cish programmers have with Ada. Ada's built in strings are fixed-sized and sized perfectly to their contents (no Cish null terminator). You'd think this would be a trivial difference, but it completely changes how (and when) you need to create and handle your strings. People who insist on handling strings the same way they would in C get really frustrated.

Fortunately Ada, like C++, comes with a variable-length string class in its built in class libraries.
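For what it's worth, here's a sketch of reading a line of unknown length with that library (this assumes the Ada 2005 function form of Get_Line; the names are made up):

with Ada.Text_IO;           use Ada.Text_IO;
with Ada.Strings.Unbounded; use Ada.Strings.Unbounded;

procedure Read_Line_Demo is
   Line : Unbounded_String;
begin
   Line := To_Unbounded_String (Get_Line);   -- Get_Line returns a String sized to the input
   Put_Line ("read" & Natural'Image (Length (Line)) & " characters");
end Read_Line_Demo;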

2) Garbage collection is managed by hand.

You have to be kind of careful here, as different people think different things when you say "garbage collection". If you mean you can't allocate memory with abandon and expect the runtime to automatically generate cleanup code for you like Java does, that's true (unless you run on the JVM, then it works just like Java). But this is no different than how C and C++ work.

But there are tricks. For instance, if you declare an access type with a limited scope, there are ways to get all objects of that type to be automatically garbage-collected when that type goes out of scope. Since you can declare functions in other functions (which C doesn't allow), this is more useful than you might at first think.

4) Programs in Ada tend to be HUGE!!!. That's the big one. I'm talking about source code size here, not binary size, which is reasonable.

That's more a reflection of Ada's users than the language. DoD programs tend to be huge. Ada was built to help make huge programs like that manageable. There's really nothing stopping an Ada program from being small, except the developers themselves.

Your other points are bang-on. I think they are working on an update for the language that will allow Java-ish notation for calling class methods, but the current syntax is not typical. A lot of people are syntax-fixated enough that this really puts them off.

Ada was designed for multiple CPUs (1)

nurbles (801091) | more than 6 years ago | (#23079934)

When I was in the USAF (1984-88) I was sent to Ada training. When I finished, we were given an exemption, allowing our system to remain in C. However, one particular feature of Ada seems more suited to current hardware than other languages: it was designed to allow many computers/CPUs to communicate via the rendezvous. I don't believe that C/C++, Pascal/Delphi, or any of the most commonly used languages can say that. (I'm sure someone knows of another language that does, but I suspect it isn't used very much when compared to the others.)
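A toy sketch of the rendezvous idea (the task name and message are invented for illustration): a caller and a task synchronize at an entry, and the accept body runs while both are joined at that point.

with Ada.Text_IO; use Ada.Text_IO;

procedure Rendezvous_Demo is

   task Logger is
      entry Log (Message : String);    -- callers block here until the task accepts
   end Logger;

   task body Logger is
   begin
      loop
         select
            accept Log (Message : String) do
               Put_Line (Message);     -- runs while caller and task are synchronized
            end Log;
         or
            terminate;                 -- end cleanly when no callers remain
         end select;
      end loop;
   end Logger;

begin
   Logger.Log ("hello from the main program");
end Rendezvous_Demo;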

Couple that with the notion that, "if Ada will compile (and link) your application it will do what you asked it to do," and it seems to me that Ada may simply have been 20 years or so ahead of its time. (smile)

Re:Ada was designed for multiple CPUs (2, Insightful)

fitten (521191) | more than 6 years ago | (#23080350)

However, one particular feature of Ada seems more suited to current hardware than other languages: it was designed to allow many computers/CPUs to communicate via the rendezvous.


"I can do it in C" was the line we always used when our Ada teacher said anything "could be done in Ada but nothing else".

"Rendezvous" was built into Ada explicitly but other standards have given it directly or through libraries to everything else. It's just thread synchronization that you can emulate/simulate/do in many other languages with similar code... semaphores, mutexes, events/conditionals, etc. but perhaps with a little more explicitness by the programmer.

The first thing to remember is that no (higher level) language gives magical properties to a computer. If one language can do it, I can guarantee that at least one other can tap into it as well... Machine Language being the easiest example because even magical Ada gets compiled into ML or interpreted by ML at one point or another... if it can be done, it can be done in ML. C is just a step up from Assembly/ML (we used to joke that C was portable Assembly).

You *might* be able to say that Ada was the first to actually have it as a construct defined in the language specification (I dunno the history of it; there may have been a language before Ada with it designed in), but even by Ada's heyday (mid- to late 80s), we could do it in C.

Experience with ADA (1)

MagnumChaos (986635) | more than 6 years ago | (#23080252)

I had ADA in all 4 years of my college schooling, and I just graduated in December. ADA is a complicated, cumbersome, yet extremely powerful language, and more EFFICIENT than many other popular varieties. I have seen this through experience. Programming algorithms and doing numerical analysis may seem difficult at first with ADA, but being able to break the algorithm down into its necessary steps and requirements, create packages that can inherit when needed, and check the packages and program for errors and bugs at compile time is a blessing. It has saved me from doing extra work in all of my classes, and has helped prevent me from making the most horrible and embarrassing mistakes. It's also a great language to learn as your first programming language. The concepts it can help introduce and refine are just great. I'll be honest, too: I'm not a very good programmer, but ADA, in the long run, helped me write my code better than a C/C++ compiler could have.

Re:Experience with ADA (0)

Anonymous Coward | more than 6 years ago | (#23080524)

Maybe ADA is, I don't know. But Ada is a great language.

Ada never went anywhere... (2, Insightful)

nullkill (835502) | more than 6 years ago | (#23080268)

My first three foundation CS courses (back in 2001) were all taught in ADA. This provides the benefit of qualifying me to talk on the subject, with the drawback of providing a heavily biased opinion...

Because of the nature of the language, you HAVE to know what you are doing to write a program in Ada. Getting something to compile in Ada pretty much guarantees you get something that will run reliably.

Schools that train programmers starting with Java or C++ provide the benefit of making their graduates highly employable, but with a greater risk of turning out highly incompetent programmers.

Ada never went anywhere; there has just been a large increase in the number of developers trained on other languages. The reason it's 'returning' is that almost any project that uses Ada provides a success story (as long as you use developers who have been trained on Ada; Googling for source examples is not training). It's a professional language for applications that just have to run. (Examples of great software written in Ada: Air Traffic Control, Flight Software, Hello World, etc...)

As I said, my opinion is biased.

Hurray! (2, Funny)

Zarf (5735) | more than 6 years ago | (#23080342)

I worked in Ada for a few years. I guess I better go and dust off my books.

And, here I spent all that time learning Java. Sheesh.

Political Agenda (2, Insightful)

David Greene (463) | more than 6 years ago | (#23080408)

Look, can we get beyond the "government is always inefficient" meme? It's just not true. Many government projects come in on schedule and on budget. Some projects are late and over budget. Guess what? It happens in the private sector too.

Government is actually more accountable to the people than private corporations are. Numerous cost controls are in place. Public officials are elected. I have not seen the same level of scrutiny in the private sector.

So let's move beyond the ultra conservative and libertarian talking points, ok?

Ada Lovelace Zombie Flick (0)

Anonymous Coward | more than 6 years ago | (#23080460)

I don't think a zombie film featuring Ms Lovelace would be that exciting, unless it was set in Victorian England or something.

Less about language, more about scheduling (1)

galen (24777) | more than 6 years ago | (#23080476)

Speaking from my own experience as a software "engineer" working on defense contracts for the last decade, I would say their success on this project is more likely due to proper scheduling and budgeting than to whatever tools the developers used. More often than not, when estimating schedules and costs for software projects, managers and developers both tend to focus on code implementation time and forget about thorough design, testing, and rework time. Add to that the nearly inevitable requirements changes that occur over the life of a project and you have a disaster waiting to happen.

In my time I've seen several projects succeed and several projects fail. Those that succeeded always began with a robust and flexible design that would allow for growth and changes down the line. Those that failed were often rushed to implementation with just enough design to get the job done, then struggled to keep up with customers' changing requirements. I've seen every major programming language, development environment, operating system, and tool du jour used on both sides of the succeed/fail fence and never once did it make a bit of difference to the success or failure of the project. Personally, I'd be much more interested in seeing this project's WBS than hearing about how Ada may or may not have helped.

(As a parenthetical side note, there is definitely a dis-incentive to succeed in that those contracts that get dragged out year after year struggling to appease the customer continue to get funding while a successful project's funding ends when it's scheduled to end, leaving the development team looking for more work.)

Reminds me of structured programming (2, Interesting)

shoor (33382) | more than 6 years ago | (#23080596)

Back in the 70s there was a big fuss being made about something called "Structured Programming". A lot of people took notice when a big project, an indexing system for the New York Times, was finished with remarkably few errors. Yet that success did not seem to become the norm. (It's mentioned briefly in the Wikipedia article on 'structured programming'.)

COBOL used to be touted as a great language because it was 'self documenting'. Yet a lot of retired COBOL programmers got a last hurrah when they were hired to update obscure code in this 'self-documenting' language to handle dates with the year 2000 in them back at the end of the 90s.

Basically, I think what it boils down to is discipline and talent in the development process. That is far more important than the choice of language. To some extent, I would buy the idea that the fewer lines of code required to write out a program, the better, because there are fewer chances of errors. But even that can be taken to extremes in a language like APL, or if the lines refer back through obscure nests of classes. By few lines of code I mean a few readable lines of code that a programmer can look at and actually know what is supposed to be happening and how.
