
Colossus Cipher Challenge Winner On Ada

timothy posted more than 6 years ago | from the ada-operiert-die-blinkenlights dept.

Encryption 168

An anonymous reader writes "Colossus Cipher Challenge winner Joachim Schueth talks about why he settled on Ada as his language of choice to unravel a code transmitted from the Heinz Nixdorf Museum in Germany on a Lorenz SZ42 cipher machine (used by the German High Command to relay secret messages during World War II). 'Ada allowed me to concisely express the algorithms I wanted to implement.'"


Let the raging tardfight commence (2, Funny)

Malevolent Tester (1201209) | more than 6 years ago | (#23417312)

He should have used a real programming language like Java or VB.Net.

Re:Let the raging tardfight commence (5, Funny)

morgan_greywolf (835522) | more than 6 years ago | (#23417424)

He should have used a real programming language like Java or VB.Net.
Pffft. Real men write programs like this:

$ cat >/bin/myprogram


Wimp using ASCII (2, Funny)

jellomizer (103300) | more than 6 years ago | (#23417574)

01010111 01101001 01101101 01110000 00100000 01110101 01110011 01101001 01101110 01100111 00100000 01000001 01010011 01000011 01001001 01001001

Re:Wimp using ASCII (2, Informative)

91degrees (207121) | more than 6 years ago | (#23417780)

But you're using an ASCII representation of a binary representation of an ASCII string!

You're using ASCII twice so you're twice the wimp!
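
For anyone who doesn't feel like decoding the parent comment by hand, here is a quick sketch (Python, purely for convenience) that turns the binary groups back into text:

```python
bits = ("01010111 01101001 01101101 01110000 00100000 01110101 01110011 "
        "01101001 01101110 01100111 00100000 01000001 01010011 01000011 "
        "01001001 01001001")

# Each 8-bit group is one ASCII code point; int(b, 2) parses base 2.
message = "".join(chr(int(b, 2)) for b in bits.split())
print(message)  # -> Wimp using ASCII
```

Running it confirms the punchline, and incidentally adds a third layer of ASCII.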

Re:Let the raging tardfight commence (3, Funny)

PeterKraus (1244558) | more than 6 years ago | (#23417696)

Real men use magnetic needle and steady hand.
And don't get me started on real programmers and their in-air-light-refraction-affecting butterflies.

Re:Let the raging tardfight commence (1)

compro01 (777531) | more than 6 years ago | (#23417738)

what about the real programmers who set the universal constants at the start of the universe? =P

Re:Let the raging tardfight commence (-1, Troll)

UbuntuLinux (1242150) | more than 6 years ago | (#23417912)

I didn't understand what you meant by your comment, until I saw the smiley ('=P') at the end, and then it all made sense! You were making a joke, haha! That was really funny. I'm not very good at telling jokes, but I am thinking of sending an email to my friend's daughter whose computer I have been fixing, and I wanted to add something to make it funny - do you think if I added a smiley to the end of some sentences in my email it would fool her into thinking I had a sense of humour?

Re:Let the raging tardfight commence (0)

Anonymous Coward | more than 6 years ago | (#23417994)

it's just, that GP was funny, and you're not. EVER. =P

Re:Let the raging tardfight commence (1)

compro01 (777531) | more than 6 years ago | (#23418654)

I just added that for the sake of people who don't read xkcd [] and thus wouldn't get the joke.

Re:Let the raging tardfight commence (2)

megaditto (982598) | more than 6 years ago | (#23418454)

Yeah, but how much choice do they really have in picking those?

I made myself sad just now

Re:Let the raging tardfight commence (1)

chaim79 (898507) | more than 6 years ago | (#23418196)

'Course there's an EMACS command to do that... the butterfly thing that is.

Re:Let the raging tardfight commence (1)

ozmanjusri (601766) | more than 6 years ago | (#23418348)

You may mock, but I wrote my first program with the pointy part of a compass.

Re:Let the raging tardfight commence (4, Insightful)

jellomizer (103300) | more than 6 years ago | (#23417434)

For the most part the language doesn't matter that much. Ada, C, C++, Pascal, BASIC, LISP... almost every language can get the job done. It is just a matter of how well it handles different details. I like Python for its list processing and top-down design. Some people like Visual Basic for its ease in creating good interfaces. Some people like C and C++ because they can control the system at a lower level.
Ada being a government/military-based language, I am not too surprised that it won the competition deciphering a government/military code. (It is more complex than that.)

Re:Let the raging tardfight commence (1)

morgan_greywolf (835522) | more than 6 years ago | (#23417566)

For the most part the language doesn't matter that much. Ada, C, C++, Pascal, BASIC, LISP...

The *language* doesn't matter so much as the *particular implementation* of that language and the platform(s) on which it runs and the libraries available.

C is a fine language, but don't try writing an OS kernel using the Ch [] C interpreter, for instance.

Re:Let the raging tardfight commence (0)

Anonymous Coward | more than 6 years ago | (#23417584)

And, some folks like Borland Delphi (Object Pascal)!

Mainly because it's been shown (& as far back as 1997, in OF ALL PLACES, "Visual Basic Programmer's Journal" Oct. 1997 issue "INSIDE THE VB 5 COMPILER") to produce code that is far F A S T E R than VB, & even MSVC++!

(Especially in math & strings, which EVERY PROGRAM DOES, mind you, & Delphi swept the majority of the tests done, from both of them)...


Re:Let the raging tardfight commence (1)

jellomizer (103300) | more than 6 years ago | (#23417632)

Whoa, hold the phone! Since when have computers been able to compute math? This is a big deal; this could change the world!

Re:Let the raging tardfight commence (0)

Anonymous Coward | more than 6 years ago | (#23419290)

Uhm... your point, please?

Now - If you're trying to be funny, then so be it... sarcasm has its place too!

(However - despite the fact your statement's pretty humorous? It still doesn't change the facts & tests I cited, from a reputable publication for coders in this field, period... & the data + results haven't changed lately either on the account I noted (i.e./e.g.-> Borland Delphi produces faster code than MS' VB & VC++ products, especially in math & strings, which EVERY PROGRAM DOES)).

I wonder, what kind of processing occurs MOST in the type of work being done for this report/article @ /. here? Strings, & math related, by SOME/ANY chance?? Thanks for the answer on that note...


Don't get me wrong either:

I have used MSVB since version 3.0 on Win16 platforms, & MSVC++ since version 2.0 on 16-bit Windows too, & all the way up to current VB.NET/ASP.NET in Visual Studio 2005 for 32-bit development, professionally!

Hey - .NET's decent stuff, that has its merits & places!

(PLUS, .NET WAS DESIGNED/ARCHITECTED LARGELY BY DELPHI's CREATOR ( MS "Distinguished Engineer" title holder, @ MS, in Mr. Anders Hejlsberg, hired away from Borland by Microsoft no less ) but since .NET code's runtime interpreted, it's just slower than Delphi executables are, (no doubt about it, hands-down/period))...


HOWEVER - It's TOUGH to NOT have to use Microsoft's compilers & IDE's, especially today & nowadays for the past 10 yrs. now mostly imo + experience professionally in this field as a developer!

"Nobody ever got fired for buying IBM" is the old saying, & today, might as well replace IBM with MICROSOFT!

See - the business reasoning I have been given time & again, despite my showing & extolling the virtues of Delphi over .NET stuff especially for speed (yes, even though they have "garbage cleanup", a GOOD coder knows how to "de-lint" memory allocations w/ a matching freeing of them, well... VB & VC++ are just slower than Delphi, period)?

Well - since MS has the "BIG BUCKS"? Mgt. OFTEN feels "they will be here today, tomorrow, & years from now - will Borland?" (etc. et al)


Then again, by way of actual "hands-on" experience, "in the trenches", actually doing the job of coding & design ( as far as my 16++ yr. professional career in this field has shown me ( mostly MIS/IS/IT coding ( db work etc. )))?

I have had literally, only 2-3 mgr.'s, in that timeframe, that I could respect as being able to do the job & do it right... largely because THEY COULD ACTUALLY DO THE JOB & HAVE BEFORE PROFESSIONALLY ( most have not, unfortunately, & this IS a problem in this field ( and, others )).

That said? Who says mgt. really knows ANYTHING of worth, where the "rubber meets the road" (in actually having done the job, themselves, hands-on) when it comes to actually knowing AND DOING the job, themselves?? Imo, unfortunately???

Most mgt. are just "stooges/flunkies" for the UPPER mgt. whose only goal is to make money, NOT QUALITY PRODUCTS, as well as being literally nearly incompetent @ doing the jobs of their subordinates, themselves.


P.S.=> Build crappy product, you LOSE monies & get a poor rep... build it RIGHT, w/ the BEST TOOLS (relative term, it depends on which languages lends itself to the job, best, & I'll freely concede that based on time to build/to get to market, ease of use & maintenance, BUT ALSO FOR SPEED/EFFICIENCY as well as security), you get that GOOD REP, & sales out the "wazoo", because it's done right the first time out... apk

Re:Let the raging tardfight commence (2, Insightful)

morgan_greywolf (835522) | more than 6 years ago | (#23417700)

In 1997, producing small, fast executables for desktop and database client applications (Delphi's raison d'être) mattered a whole lot more than it does today. These days, it's generally far more important to use a tool that lets the programmer be very productive and produce nice, maintainable code. Both of which Delphi does very well, mind you.

Programmer not language. (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23418056)

Ada being a government/military-based language, I am not too surprised that it won the competition deciphering a government/military code.

The programmer won the competition, not the language. He just happened to write it in Ada. Languages are nothing but syntax: none is "better" than another at certain operations. Folks here on /. apparently never learned that in school.

Re:Programmer not language. (4, Informative)

jellomizer (103300) | more than 6 years ago | (#23418380)

Well, it is more complex than that. Different language syntax can help or hinder someone's performance for a particular job. For example, old C didn't have much to say in terms of string handling: if you wanted to use a string you needed to declare a char *VarName, and when working with it you had to handle memory management yourself and make sure you didn't create buffer overflows. A big pain: if you didn't write yourself some good string functions, you spent a lot of your programming time just making sure your program didn't blow up. Compare that with newer languages, and the string class in C++, where you can concatenate, take substrings, parse... it makes such jobs much easier and a lot less annoying. Most well-designed languages will not prevent you from getting anything done. However, there is the human element of the equation: if the syntax is too difficult for a particular job, the person will tire out and make more mistakes. The winner of this competition felt that Ada syntax offered him the ability to solve the problem better, and thus helped him to win.

I have written web apps in FORTRAN 77 just to prove that you can. However, I wouldn't advise a client to do the same, as it really isn't the right tool for the job.

Re:Programmer not language. (2, Interesting)

Lodragandraoidh (639696) | more than 6 years ago | (#23420234)

If I could type 90 wpm and never make a mistake, I'd be using Ada or Fortran today.

I can't, so I don't.

For me it is how fast I can produce what the customer/user desires that matters - and more importantly, how fast I can change it - so I use Python - with the option (as yet unneeded) to build C/C++ modules for that language for slow bits. If a bug pops out of my code, I can easily squash it; more difficult with a compiled language.


Re:Let the raging tardfight commence (1)

B3ryllium (571199) | more than 6 years ago | (#23418148)

I still think they should have gone with DOS-based batch files ... or Monad/Windows PowerShell :)

Re:Let the raging tardfight commence (1)

oni (41625) | more than 6 years ago | (#23418256)

Some people like Visual Basic for its ease in creating good interfaces.

Perhaps OT: but I think people like VB because they don't know any other language.

I have never in my life heard anyone say, "I know C, Java, Ruby and VB and I really like VB!" More often it's, "I was working as an office assistant and wanted a promotion so I got a book titled, _Fast-Track Learn VB in 10 Hours for Dummies_ and this is the only language I've ever used - and I like it!"

Other than that, you're right. The language matters less than how you use it (and for performance, the language matters less than how it's implemented). But if you know several languages equally well, then you know the little idiosyncrasies that make each better suited to a particular task, and can make an informed choice.

Re:Let the raging tardfight commence (1)

Strilanc (1077197) | more than 6 years ago | (#23418500)

I know C, Java, VB, and python, and I really like VB(.net).

C is almost always too low-level for what I need. I don't want to worry about garbage collection and tracking the size of arrays.
Python doesn't have static type checking, or really any kind of static checks (because of its interpreted nature). I really, really like compile-time checks.
Java is alright, but it doesn't have operator overloading, doesn't force you to declare methods as overridable, and (mainly) doesn't have an IDE as nice as Visual Studio. Java generics are more flexible, though.

The big downside of VB is the dependency on Windows, and some of the leftover junk from VB6 like On Error Goto.

Re:Let the raging tardfight commence (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23418912)

You only like VB.Net because you don't know C#.

Re:Let the raging tardfight commence (3, Insightful)

patchvonbraun (837509) | more than 6 years ago | (#23418344)

I only found out about the contest a couple of days before it began, and I was away at the time, so I couldn't participate in "real time", but I used the copies of the sent ciphertexts on the Bletchley Park site.

I worked away on a Lorenz breaker for the SZ42 stuff, written in C. I was able to break the ciphertext roughly an order of magnitude faster than Joachim's code. Joachim worked away on his code for several weeks in advance of the contest; I had only a couple of days' notice.

I think two things matter in a competition like this:

                o The *appropriateness* of the language
                o The skills of the coder

Joachim got all the glory on this one, since he was first to announce the breaks. But there are probably a number of others who were "close" when Joachim announced his break.

Re:Let the raging tardfight commence (1)

ArtuRock (932265) | more than 6 years ago | (#23418652)

Some people like Visual Basic for its ease in creating interfaces.
Fixed it for you.

Re:Let the raging tardfight commence (1)

L4t3r4lu5 (1216702) | more than 6 years ago | (#23417444)

C:\>copy con c:\decrypt.exe

Re:Let the raging tardfight commence (1)

Nolde Huruska (1034512) | more than 6 years ago | (#23417578)

It matters not which language is used, just as long as it is edited with vi.

Re:Let the raging tardfight commence (0)

Anonymous Coward | more than 6 years ago | (#23418786)


Vi sucks compared to pico.

(trolling - can't beat it)

Re:Let the raging tardfight commence (1, Funny)

Anonymous Coward | more than 6 years ago | (#23419224)

You misspelled "emacs."

Re:Let the raging tardfight commence (0)

Anonymous Coward | more than 6 years ago | (#23417594)

Let the raging tardfight commence
It seems to have started sooner than you expected.

Re:Let the raging tardfight commence (0)

Anonymous Coward | more than 6 years ago | (#23417672)

You're right! Ada is an engineering language! Why would he use something like that for such a trivial problem as a cipher challenge... Oh, wait! He may have actually wanted it to work properly without spending more time debugging than he spent coding! You can send a man to the moon using a slide rule, but why not use something that will actually do a lot of the work for you (type checking, runtime checking...) and allow you to do the actual engineering required, instead of debugging all the time?

I have always wondered why tools like Ada have not caught on in the larger world... The only thing I can think of is that most apps are not really thought out very well beforehand; they are just slapped together and _made_ to work without much (if any) actual engineering. I guess that's why most apps are crap!


Re:Let the raging tardfight commence (4, Funny)

valentingalea (1039734) | more than 6 years ago | (#23417750)

He should have used Brainfuck [] !


Comparison with Allies cypher machine (3, Interesting)

Anonymous Coward | more than 6 years ago | (#23417488)

I wonder how easy it would be to break the Allies' corresponding machine, the SIGABA. It was stated that during WWII the Lorenz machine was broken, but the SIGABA wasn't. Of course, given 60 years of computer improvements, it might be possible to break the SIGABA now.

Re:Comparison with Allies cypher machine (3, Informative)

RebornData (25811) | more than 6 years ago | (#23418570)

Here's a recent cryptanalysis of SIGABA: []

In normal use, it appears to have had a keyspace of about 48 bits, which is now easy to attack with a modest distributed effort, but was way out of the reach of WWII technology.

However, a variant of the machine used for communication between the US President and the British PM had an effective keyspace of 95+ bits, which (if you have access to some known plaintext) can be reduced to 86 bits; although shorter than key lengths in common use today, that is still out of reach.
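
To put those key sizes in perspective, here is a back-of-the-envelope sketch; the 10^9 keys/second search rate is an assumed figure for illustration, not a measured benchmark:

```python
def years_to_search(bits, keys_per_sec=10**9):
    """Worst-case years to exhaust a keyspace of `bits` bits at the given rate."""
    seconds = 2**bits / keys_per_sec
    return seconds / (3600 * 24 * 365)

# A 48-bit keyspace falls in days; an 86-bit one takes billions of years.
print(f"48-bit: {years_to_search(48) * 365:.1f} days")
print(f"86-bit: {years_to_search(86):.2e} years")
```

The gap between the two numbers is why the ordinary 48-bit mode is within reach of a distributed effort today while the 86-bit variant remains safe.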

I had to learn ADA in college (1)

Anonymous Coward | more than 6 years ago | (#23417512)

It was at the University of Dayton in the late 90s. Ada was the language they taught all their intro computer science classes in. They then switched to C++. I didn't like Ada, but looking back that may have been my own prejudices more than anything wrong with the language. Every computer class I had in high school used a different programming language. I was getting sick of learning new languages when I wanted to be advancing my computer skills.


Re:I had to learn ADA in college (0)

Anonymous Coward | more than 6 years ago | (#23417614)

btw, that was my first exposure to the Free Software Foundation. We used the GNU compiler to compile the Ada programs. I had never heard of the Free Software Foundation before. I remember speaking to the department head about it, and him telling me, "You won't understand this now, but one can make money off of free software." He was right: I didn't understand then, but I do now. He didn't even use the "free speech / free beer" analogy; to my recollection he didn't give any more details at all. But it got my interest. I went online and started looking up the Free Software Foundation. Then I somehow came across Linux. And here I am today, posting as an Anonymous Coward on Slashdot from my Gentoo box.


Re:I had to learn ADA in college (0)

Anonymous Coward | more than 6 years ago | (#23417714)

a late age geek! let me ask, while you are here how was having a girlfriend, back then?

Re:I had to learn ADA in college (3, Funny)

Shinmizu (725298) | more than 6 years ago | (#23417974)

As a cat once told me, "Anonymous Coward, ur doin it wrong."


Re:I had to learn ADA in college (2, Funny)

kellyb9 (954229) | more than 6 years ago | (#23417732)

Actually, most colleges don't want to be typecast as a "C++ school" or an "Ada school". It's more important to learn data structures and theory. If you went to a good school, the language something is written in is trivial; learning the syntax should not be that difficult.

Re:I had to learn ADA in college (0)

Anonymous Coward | more than 6 years ago | (#23418692)

Yep, you do the same operations across the board. The language is equivalent to a car: you get from point A to point B. All that matters is that you use the right vehicle for the road you are on and your activity.

Some change gears for you; others require a little more work but go faster. Reliability depends on the mechanic 8)

Re:I had to learn ADA in college (1)

ajs318 (655362) | more than 6 years ago | (#23419602)

Yes indeed. Programming languages are a lot like cars. Pascal is a driving school car with an extra brake and clutch on the passenger side. Modula-2 is a dodgem car with only one pedal and two gears, and it isn't allowed on the main roads. Ada is a military staff car, and if you want to drive it, you have to submit six different forms in triplicate to eighteen different departments, some of which are offsite, and then deal with at least one situation where you have to cross out your signature and sign the correction, and one where you have to try to get some order forms out of the stationery stores without an order form for the order forms. C is an engineer's car; it's held together with bits of string, there's no synchro so you have to double-declutch, and there's a special technique for starting it which nobody except its rightful owner has ever mastered. Python is a boy racer's car with blacked-out windows, fluffy dice, air horns, MAX POWER stickers and a ridiculously loud stereo -- but its owner still hasn't got a girlfriend. Perl is a Ford Transit that's looked old and beaten-up since it was new, but it still gets you there.

Hey me too! Freshman year. (0)

Anonymous Coward | more than 6 years ago | (#23418322)

They're on to Java now, but I think Ada was great. Sure, not much in the way of market demand, but the way that Ada makes everything so explicit was helpful in making the students understand what was going on.

Type Casting (3, Interesting)

hroo772 (900089) | more than 6 years ago | (#23417522)

For those that know the differences: Ada is a very strongly typed language, which makes it harder for a beginning programmer to pick up. It doesn't allow for type conversion and pretty much enforces strict coding rules. It would make sense that he used it, since he would have complete control over exactly what his code did. This wouldn't be the case with Java or other languages which allow type conversions easily, which is nice for a lot of people but can definitely lead to issues when not accounted for.

Re:Type Casting (3, Insightful)

egilhh (689523) | more than 6 years ago | (#23417660)

What makes you think that Ada does not allow type conversion?
"typename(variable)" is pretty easy in my opinion...
(not much harder than the type cast in other languages: "(typename)variable")

Ada even has a package called Ada.Unchecked_Conversion if you don't care about ensuring the result is within the bounds of the target type...

Re:Type Casting (2, Informative)

Zoxed (676559) | more than 6 years ago | (#23417730)

> It doesn't allow for type conversion

It does (unchecked_conversion), but never (AFAIK) *implicitly*.

Re:Type Casting (2, Informative)

kst (168867) | more than 6 years ago | (#23419036)

>> It doesn't allow for type conversion

> It does (unchecked_conversion), but never (AFAIK) *implicitly*.

Unchecked_Conversion reinterprets the bits of the argument as a value of the specified type.

Ada also allows ordinary value conversions (for example, converting 3.1 to type Integer yields 3) among sufficiently closely related types; for example, a value of any numeric type can be converted to any other numeric type. It requires such conversions to be explicit in more cases than many other languages.

And yes, my experience is that it's more common in Ada than in, say, C, for a program to work correctly the first time you get it to compile successfully.
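
One caveat for readers following along in Python rather than Ada: the two languages disagree on explicit float-to-integer conversion (Ada rounds to the nearest integer, while Python's int() truncates toward zero), so the parent's 3.1 example agrees only by coincidence. A small sketch of the difference:

```python
# Python's int() truncates toward zero; Ada's Integer(X) rounds to nearest.
assert int(3.1) == 3    # Ada's Integer(3.1) also yields 3
assert int(3.6) == 3    # but Ada's Integer(3.6) would yield 4
assert int(-3.6) == -3  # truncation goes toward zero, not down

print("all conversions behaved as stated")
```

So when porting a numeric algorithm between the two, the conversion points are exactly the places to double-check.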

Re:Type Casting (1)

drxenos (573895) | more than 6 years ago | (#23420930)

Unchecked_Conversion reinterprets the bits of the argument as a value of the specified type.

Maybe, maybe not. There are several rules that define what it does, and many that are implementation-specific.

Re:Type Casting (2, Insightful)

morgan_greywolf (835522) | more than 6 years ago | (#23417768)

languages which allow type conversions easily, which is nice for a lot of people, but can definitely lead to issues when not accounted for.
This is, IMHO, both Python's greatest strength and its greatest weakness as a dynamically typed language. Sometimes you can get bizarre results if you're not careful. OTOH, once you get the hang of it, you won't make those mistakes.

You couldn't BE more wrong (4, Insightful)

MosesJones (55544) | more than 6 years ago | (#23418236)

As someone whose first programming language was Ada, and who knows of several universities around the same time who chose Ada as a teaching language, I can say with certainty that you are completely wrong.

First off those strict rules help you because you spend miles less time debugging stuff you don't understand, once it compiles it will tend to run and the compiler gives helpful messages about what you are doing wrong (often including suggestions on how to fix it). With Java, and especially C and C++, let alone scripting languages the beginner spends much more time debugging non-operational code than writing the code in the first place. This tends to mean that these people focus on "getting an executable" rather than "getting it running".

Ada is a brilliant language to teach newbies in (again I've personally done this) as you can explain the abstract concepts and then have the compiler make sure they are doing it right rather than have them say "it compiles but it keeps falling over, why?".

Ada's issues are due to the mentality of lots of (IMO) unprofessional engineers who focus on the number of characters over the operational viability of a system.

And for a final point: take a look at the complex code the guy wrote. If that were in Java, C, C++, Scala, Ruby, Perl, Lisp or whatever, do you think you'd have a chance of understanding it?

Re:You couldn't BE more wrong (0)

Anonymous Coward | more than 6 years ago | (#23419928)

>those strict rules help you because you spend miles less time debugging stuff you don't understand

Well, yes, it's clear why you need this sort of help. Time isn't measured in miles, dear.

Re:Type Casting (0)

Anonymous Coward | more than 6 years ago | (#23418588)

Actually, Ada DOES allow type conversion - but only via "unchecked conversion" which puts all responsibility on the user as to policing what he/she is doing - i.e. giving the user a nice long rope on which to hang himself. :-) It definitely isn't "easy" to do, but the language (Ada 95) does provide this facility.

Re:Type Casting (1, Interesting)

Anonymous Coward | more than 6 years ago | (#23418792)

I disagree with the difficulty statement. Of all the languages I studied over the course of my computer science education, Ada was by far the most straightforward (as compared to Fortran [fucking columns] and C/C++ and all the type casting); the only other language which might be somewhat comparable in ease of picking up would be Lisp (fucking parentheses). Sure, Ada might not lend itself to as much trickery, but I'm not necessarily of the opinion that is a failing. Every Ada program I ever compiled performed perfectly within the normal limits of, I guess, my algebra. C? Oh hahaha. No, not so much.

With Ada I found you either understood the problem at hand well enough to express it in program logic or not. C invites one to make certain kinds of assumptions which might get you there, but might not always be reliable. But that's just my personal experience.

Re:Type Casting (2, Informative)

fitten (521191) | more than 6 years ago | (#23418946)

That was pretty much my experience... when you finally got the Ada program to compile, it worked... it just sometimes took a lot of work to get it to compile ;) Back then, you had to be explicit in everything you wanted to do, particularly type conversion... of course, you could get around that by just making everything the same type, but if you started out typing everything, then once you got your program to compile, you knew that you were always calling the right functions and always doing the typecasts you needed to do.

Re:Type Casting (1)

drxenos (573895) | more than 6 years ago | (#23418964)

What the hell are you talking about? Ada most certainly does allow for type conversions. It just doesn't do potentially unsafe ones implicitly.

Re:Type Casting (0)

Anonymous Coward | more than 6 years ago | (#23421646)

Ada does support type conversion, it just does not do it implicitly. And unchecked_conversion is not necessarily needed, rather "Integer(float_value)" will convert between types. One annoyance is that you cannot mix types in an expression/equation so y:=10.0*x+3 is not valid since 3 is an integer. You'd need to use y:=10.0*x+3.0 or y:=10.0*x+float(3), both of which make the cast explicit to the compiler (and reader) but can muddy the code.
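A compilable sketch of the rule described above (variable names are made up):

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Mixed_Types is
   X : constant Float := 2.0;
   Y : Float;
begin
   --  Y := 10.0 * X + 3;      --  rejected: integer literal mixed into a Float expression
   Y := 10.0 * X + 3.0;        --  real literal: accepted
   Y := 10.0 * X + Float (3);  --  explicit conversion: accepted, and visible to the reader
   Put_Line (Float'Image (Y));
end Mixed_Types;
```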

I like the choice. (4, Funny)

DrWho520 (655973) | more than 6 years ago | (#23417568)

Use a masochistic language to break a German code...groovy.

Ada Boy!! (1)

n1ckml007 (683046) | more than 6 years ago | (#23419046)

Ada Boy!!

ADA Resurgence? (5, Interesting)

Arakageeta (671142) | more than 6 years ago | (#23417604)

Has anyone else started to notice an ADA resurgence? I feel like several years ago the general feeling was "ADA is a backwards language used only on old military projects." Now I read a positive story about ADA every few weeks! Was ADA 2005 that good of a language revision?

Re:ADA Resurgence? (1, Funny)

Anonymous Coward | more than 6 years ago | (#23417710)

Was ADA 2005 that good of a language revision?
If you put lipstick on a pig, it's still a pig.

Re:ADA Resurgence? (5, Funny)

Skeptical1 (823232) | more than 6 years ago | (#23418022)

Ada is not a backward language. Ada is a palindrome.

Re:ADA Resurgence? (1)

kst (168867) | more than 6 years ago | (#23419384)

Ada isn't just a palindrome. It's a hexadecimal palindrome. How many other languages can make that claim?

(Well, six that I can think of: B, C, D, E, F, and my own 99.)

Re:ADA Resurgence? (1)

hercubus (755805) | more than 6 years ago | (#23418150)

Oracle's procedural language (PL/SQL) is quite similar to ADA, and PostgreSQL's procedural language is quite similar to Oracle's

so there's some ADA-influenced code out there running in non-military projects

my managers keep trying to put me on the Java bandwagon -- I nod and smile, humor them, laugh at their jokes, slap them on the back, then when they're not looking I jump off and go back to my horrible, obsolete, verbose, backwards, sickly efficient ADA-like PL/SQL. joy

What are the good ones? (1)

Shivetya (243324) | more than 6 years ago | (#23418174)

I play with programming on both the PC and Mac (at work I am on a mini and there is no ADA there at all) so I am curious...

Which are the good compilers for ADA for Mac and PC. As well as being good, what are the relative costs?

Finally, which sites do ADA supporters consider best?

Re:What are the good ones? (2, Informative)

glop (181086) | more than 6 years ago | (#23418320)


GNAT is based on GCC. It's free and it is damn good.
I was also using Aonix, and they have a free (as in beer) version. I have always preferred GNAT, though.

I am not sure about a website though...

Re:What are the good ones? (3, Informative)

DdJ (10790) | more than 6 years ago | (#23418358)

Can't give you advice on the PC, but on the Mac, the default compiler is the GNU compiler suite. That's where the C, C++, and Objective-C compilers come from.

The GNU compiler suite also has an ADA compiler (GNAT, GNU Ada Translator). Should be possible to get it and plug it in without much trouble, and then it'd integrate with everything else. Heck, should be possible to include ADA modules into an Objective-C Cocoa application, even.

There is also a GNU FORTRAN, worth checking out. Even today, you can't do mathematics as efficiently in C as you can in FORTRAN. (This is because of the language; in Fortran, taking the address of an existing variable isn't normal, so variables don't end up with the possibility of "aliases" that they don't know about, which means a lot more stuff can safely be done all in registers and stuff like that.)

There is also a GNU Pascal, but unlike ADA and FORTRAN, I'm not personally aware of any reason to actually use it.

Re:ADA Resurgence? (3, Interesting)

Anonymous Coward | more than 6 years ago | (#23418378)

Ada was considered too complex. By now C++ is orders of magnitude more complex and still does not do half the things (Ada has had a sane integrated threading model that could be used for message-passing constructs -- actual OO programming techniques -- pretty much from the start).

C++ templates, for example, are just a ripoff of Ada's generics _including_ the Ada angle bracket constraint notation which does not fit at all into C.

Basically it is like the Unix renaissance after Windows tried to offer everything Unix does, except doing it all wrong and contorted and only borderline operative.

Re:ADA Resurgence? (5, Informative)

Barromind (783894) | more than 6 years ago | (#23418804)

Ada 2005 is comparatively minor (although some changes, like interfaces, are not that minor). The real improvement was Ada 95. The 95 revision managed to standardize many things that C++/Java are only now settling.

Ada is not trendy, but it has had built-in portable concurrency and many other killer features for more than a decade. Proper specifications are one of my favs.

Of course there are other factors, like the lack of good and free compilers. Fortunately the GCC toolchain has now put this to rest. Also there are few libraries. Really few. Binding to C is easy, but still a deterrent for the hobbyist.

Its emphasis on making maintenance easy over quick programming really pays in the end -- not even in the middle/long term, but shortly after getting familiar with the language. I find myself much more productive. When something compiles, I'm sure that the only bugs remaining are logical, not some funny pointer or unexpected type conversion or overflow. Nowadays I rarely fire up the debugger more than once a month. My C/C++ has improved, because Ada forbids the things that are considered bad practices in C/C++ but that you still end up doing because "you know better".

I think that Ada is getting more exposure now because, albeit a niche language, AdaCore is pushing hard behind it. Also, its SPARK derivative by Praxis has made some headlines, with large and difficult projects getting flying marks. SPARK has made static analysis a reality for large projects.

I'd say that anyone capable of discipline will enjoy the benefits of Ada. It's not the thing for quick hacking, but it is perfect for anything non-trivial. Software engineers should love it. I have heard it described somewhere as a safe C++, and I concur: feature-wise it is more or less on par, it catches bugs sooner, and it prevents many typical ones.

Have I already said that concurrency is built-in and portable :P? And that inter-thread communication is really well done?
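For readers who haven't seen it: tasks and rendezvous are part of the Ada language itself, not a library. A minimal sketch (the task and entry names are invented for illustration):

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Task_Demo is

   task Printer is
      entry Print (Msg : String);  --  rendezvous entry point
   end Printer;

   task body Printer is
   begin
      loop
         select
            accept Print (Msg : String) do
               Put_Line (Msg);      --  executed during the rendezvous
            end Print;
         or
            terminate;              --  exit when the main program finishes
         end select;
      end loop;
   end Printer;

begin
   Printer.Print ("hello");         --  caller blocks until the task accepts
   Printer.Print ("from a task");
end Task_Demo;
```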

Re:ADA Resurgence? (1)

ChrisA90278 (905188) | more than 6 years ago | (#23419038)

It is good to program in a language designed for the task at hand. Ada was designed to control "things that can't fail": aircraft flight controls, nuclear power plants, guidance systems of "smart bombs", and so on. Most people don't work in this area. Most people write stuff that runs on desktop machines and web servers. In that environment software error is just expected and tolerated. So Ada will be a minority language.

Ada will always be used more by projects where the cost of software error is very high.

Compiler price.. (5, Interesting)

renoX (11677) | more than 6 years ago | (#23417744)

I think that the main reason why Ada 'lost' to C++ is that some time ago, C++ compilers were either cheap or free whereas Ada compilers were expensive.

Too bad, since Ada is 'by default' a language which is more secure than C++.

Re:Compiler price.. (0)

Anonymous Coward | more than 6 years ago | (#23418228)

There has been a free Ada compiler (GNAT) available since 1995, which is currently part of the GNU compiler collection.

Re:Compiler price.. (1)

renoX (11677) | more than 6 years ago | (#23419140)

Yes, I know, but GNU C++ is from 1987.
Eight years is a *very long time* in computer history.

Granted, GCC is not the only compiler, but my memory is that Ada compilers were expensive whereas C++ compilers were less so, which explains (partly) why Ada is much less widespread than C++ nowadays.

IMHO, the DOD should have invested in making a Free Ada compiler to create a community of Ada users, in order to ensure that Ada would become widespread and to secure its investment in Ada code.

Software and recorded audio can be found here (2, Informative)

puddles (147314) | more than 6 years ago | (#23417824)

hmm. (3, Interesting)

apodyopsis (1048476) | more than 6 years ago | (#23418042)

I often wondered at the time whether this was a fair test.

I mean, the German fellow was near the transmitting station, got a very good signal, and started right away.

Bletchley Park, on the other hand, because of the atmospheric conditions, did not get a signal until late in the day and started late. Then again, the German software took only 46 seconds.

I'm not saying that the German fellow should not of won - he did, fair and square - but there seemed to be no mention in much of the news at the time of the receiver issues.

On the plus side, it was excellent publicity for the park and Colossus. If only Churchill had not ordered them scrapped, Britain could have led the technological era.

Re:hmm. (0)

Anonymous Coward | more than 6 years ago | (#23418660)

"should not of won"

What the hell does that mean?

Re:hmm. (0)

Anonymous Coward | more than 6 years ago | (#23419032)

Some people don't seem to know what they're saying. Probably here he meant "Shouldn't've won" which does sound like "Should not of won" but really means "Should not have won".

Then again, it's still a sort of random conjecture on his part.

Re:hmm. (0)

Anonymous Coward | more than 6 years ago | (#23419048)

mod -1 grammar-nazi flamebait

Churchill didn't order it scrapped... (2, Interesting)

Anonymous Coward | more than 6 years ago | (#23419222)

He (and his successor, Attlee) kept it classified. Then, during decolonization, they gave lots of captured Enigma machines to former colonies to allow them to keep their communications secure -- and allow the former colonial power to keep an eye on things :)

Concise??!! (2, Insightful)

Lodragandraoidh (639696) | more than 6 years ago | (#23418084)

I can't imagine using the words concise and Ada in the same sentence.

Constricted - maybe. Painful - most certainly.

Re:Concise??!! (5, Funny)

hey! (33014) | more than 6 years ago | (#23418362)

I can't imagine using the words concise and Ada in the same sentence.

Perhaps you should read what you just wrote.

Re:Concise??!! (1)

Lodragandraoidh (639696) | more than 6 years ago | (#23419724)

Glad you found it mildly amusing. Anything to please the crowd.

Re:Concise??!! (1)

LynnwoodRooster (966895) | more than 6 years ago | (#23418434)

The choice was obvious! What better way to solve a cypher contest than to code in a language that is pretty much a cypher to everyone else?

other factors often dominate language choice (4, Interesting)

_|()|\| (159991) | more than 6 years ago | (#23418122)

Like the author of the article, I have a tendency to dabble with a variety of programming languages. I haven't used Ada seriously, but I am intrigued by it, especially in contrast to the looser languages that are currently popular. A lot of bytes have been spilled on the topic of static and dynamic typing, bondage & discipline vs. unit testing, etc. While these discussions often devolve to religious wars, I do think that language matters. Never mind Sapir-Whorf or Turing, some languages are simply more or less pleasurable or powerful for certain tasks.

That said, often the language itself is not the dominant factor in choosing the language. As nice as (Ada | Erlang | Haskell | Lisp | Ruby) is, it's not going to be my first choice if another language has a readily available library that will make it easier to write the program. I can write web applications in Lisp, but I probably won't. There is probably a parser generator for Ada, but I'd rather use Flex and Bison, or maybe ANTLR. And when it comes to my first choice, independent of problem domain, I'll usually pick Python, in part because of its extensive library.

Re:other factors often dominate language choice (4, Insightful)

hey! (33014) | more than 6 years ago | (#23419108)

Well, libraries. That's a huge part of language choice these days; you really choose frameworks or libraries and live with the language as a consequence. A lot of what we do these days is glue stuff together.

This problem, however, is a completely different kind of programming. It's old-school stuff: building everything you need yourself, to run on really slow hardware. And hardware is always slow relative to crypto problems. Ever try to implement RSA encryption from scratch? I have. There's a reason the public-key stuff is only used for key exchange.

I think the usefulness of Ada on this kind of problem is related to the issue of testing being costly. When I started in this business, compiling and linking a two hundred line program took about fifteen minutes. Something like unit testing would have been utterly impractical. So a strictly typed language was for nearly everyone a good idea.

Over the last couple of years, I've been trying my hand at a number of difficult algorithmic problems. This is not the stuff that 99% of the programmers in the world do professionally, including me.

Working on these problems was like programming in the old days. Not only is it just you and the problem, with no frameworks to come between you, but every output becomes a milestone when it takes a program days to generate. It also means that the style of programming is different. You don't worry so much about language restrictions introducing frictional losses into the code/test/recode cycle. You do worry more about mistakes that make it past the compiler.

Ada's philosophy is that coding should be, if not exactly slower, certainly more deliberate. If you are running something for which your hardware is monumentally slow, then this is a good style to work in.

Re:other factors often dominate language choice (1)

ChrisA90278 (905188) | more than 6 years ago | (#23419166)

"when it comes to my first choice, independent of problem domain,"

Notice the choice of words here. Nothing wrong, but it implies a small one-person project. Maybe even less than that: a one-person, part-time project.

But what if there were 250 software engineers working full time over three to five years? This is what Ada was designed for: large-scale software. Very few companies can do this kind of work. Mostly you are looking at the big aerospace companies, like Northrop, Lockheed and Boeing. The major problem to be solved in large-scale software is getting the various parts to fit together. Ada, it turns out, is very good at specifying interfaces and isolating the effect of a change. This last part is really the most important. You need to have high certainty that a change some place does not cause a bug some place else.

Horses for courses (4, Interesting)

Viol8 (599362) | more than 6 years ago | (#23418686)

It's not really surprising that he found ADA nicer to use than C for this sort of project, because it's not the sort of thing C was created for. People seem to think that C was designed as an all-purpose programming language - it wasn't. It was specifically designed as a systems programming language that could substitute for assembler 99% of the time. Its abilities lie in low-level manipulation of memory and I/O, not in high-level mathematical algorithms (though obviously it can do those too).

Then of course C++ came along, which wanted to have its cake and eat it, and the end result was a nasty mishmash of low- and high-level constructs which is difficult to learn, unintuitive and generally messy to use.

Re:Horses for courses - please explain (1)

mykepredko (40154) | more than 6 years ago | (#23420236)

I don't want to start a philosophical battle here, but I would appreciate it if you could give me a pointer to a reference explaining what are the features of C that make it suitable for "low-level manipulation of memory and I/O"?

I've always found it to be sub-optimal due to its lack of a "bit" data type, the need to explicitly set pointers to address specific regions in memory (which may or may not be in the same address space as I/O - e.g. the x86 architecture), etc. To get around these issues, access functions are typically written in assembler and then called from the C mainline code, or the compiler is extended to provide these functions more natively. For high-performance applications, the programmers may find that they have to use assembler to minimize latency between commands. These factors typically result in code that isn't portable unless the same compiler features and libraries are available on the target.

It might seem that I'm glossing over the point (that I agree with) that C can be used in 99% of the places where assembly language could be used, but what I'm really doing is questioning why it is accepted as the best language, and how it has been optimized, for low-level programming.



HDL (2, Interesting)

Anonymous Coward | more than 6 years ago | (#23418742)

I've used both Verilog (C-based) and VHDL (Ada-based), and the latter wins hands down for being maintainable and easy to debug. Nobody had to write a lint checker for VHDL like they did for Verilog. I totally believe this guy.

At the moment, software developers are like masons (1)

Colin Smith (2679) | more than 6 years ago | (#23418930)

Chipping the code into a specific shape by hand... Give it a few years and software development will be more like civil engineering. Pouring concrete into shapes which have known specifications.


Planck quote (1)

nguy (1207026) | more than 6 years ago | (#23419294)

truth never wins -- its opponents just go extinct

Yes, and the people who promote Ada as a secure and productive programming language have almost died out.

Ada is neither, and fortunately, the market has realized that.

I wonder ... (2, Interesting)

kst (168867) | more than 6 years ago | (#23419414)

I can't help wondering how many of the people making snide comments about Ada (note: not ADA; it's not an acronym) have actually used it.

Sounds like Object Pascal (1)

acheron12 (1268924) | more than 6 years ago | (#23420622)

From TFA:

That any discrete type can be used as an array index type, not just the predefined integer type, is a feature that sets Ada aside from most languages that I have seen so far.
Pascal does that. With the Object Pascal extensions, it does most of the other things he mentioned too.
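The feature quoted from TFA looks like this in Ada; the enumeration below is a made-up stand-in for the Lorenz machine's wheel groups (5 chi, 5 psi, 2 motor wheels):

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Wheel_Table is
   --  Any discrete type can index an array, not just an integer type.
   type Wheel_Group is (Chi, Psi, Mu);  --  hypothetical names
   Count : constant array (Wheel_Group) of Positive :=
     (Chi => 5, Psi => 5, Mu => 2);
begin
   for W in Wheel_Group loop
      Put_Line (Wheel_Group'Image (W) & " =>" & Positive'Image (Count (W)));
   end loop;
end Wheel_Table;
```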

Obvious.. (2, Informative)

jovius (974690) | more than 6 years ago | (#23420686)

He obviously settled on Ada, because Ada allowed him to implement.