
Why Scientists Are Still Using FORTRAN in 2014

timothy posted about 5 months ago | from the why-change dept.

Programming 634

New submitter InfoJunkie777 (1435969) writes "When you go to any place where 'cutting edge' scientific research is going on, strangely the computer language of choice is FORTRAN, the first commonly used computer language, invented in the 1950s. Its name means FORmula TRANslation, and no language since has been able to match its speed. But three new contenders are explored here. Your thoughts?"


Q: Why Are Scientists Still Using FORTRAN in 2014? (5, Insightful)

dtmos (447842) | about 5 months ago | (#46963867)

A: Legacy code.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (5, Informative)

Anonymous Coward | about 5 months ago | (#46963911)

No, not just "legacy code." Fortran (yes, that's how it's spelt now, not "FORTRAN") was designed to be highly optimizable. Because of the way Fortran handles such things as aliasing, its compilers can optimize expressions a lot better than those of other languages.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (0, Flamebait)

Jeremiah Cornelius (137) | about 5 months ago | (#46964029)

Why?

Not to put too fine a point on the matter, Fortran is still used to bust the n00bs.

It's academia, for God's sake! This is where Doctores Philosophiae held forth discourse in Latin as recently as 150 years ago, fully 1600 years after that language fell into conversational extinction.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (5, Insightful)

Bing Tsher E (943915) | about 5 months ago | (#46964097)

Precision is important in scientific discourse. Latin isn't a language with creeping grammar and jargon. It's sorta what Esperanto only wished it could ever be.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (0)

Anonymous Coward | about 5 months ago | (#46964171)

Latin was the one language that all academics shared. To them, it was a convenience. The dominance of English in international communication is a very recent phenomenon...

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (4, Interesting)

K. S. Kyosuke (729550) | about 5 months ago | (#46964217)

APL-style languages should be even more optimizable, since they use higher-order array operators that make the control flow and data flow highly explicit without the need to recover information from loopy code using auto-vectorizers, and easily yield parallel code. By this logic, in our era of cheap vector/GPU hardware, APL-family languages should be even more popular than Fortran!

K.S. Kyosuke gets called out & runs (-1)

Anonymous Coward | about 5 months ago | (#46964257)

He likes tossing names & then ran from a fair challenge http://slashdot.org/comments.p... [slashdot.org]

Saw the post parent to your link (-1)

Anonymous Coward | about 5 months ago | (#46964269)

K.S. Kyosucky used illogical offtopic ad hominem attacks. He's a troll that ran when challenged like a blowhard chickenshit.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (3, Informative)

mbone (558574) | about 5 months ago | (#46964273)

Yeah, I used to hear that argument a lot in 1978...

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (4, Insightful)

Darinbob (1142669) | about 5 months ago | (#46964227)

Also "legacy training". Student learns from prof. Student becomes prof. Cycle repeats.

Also Fortran didn't stagnate in the 60s, it's been evolving over time.

Other languages are highly optimizable too. However, most of the new and "cool" languages I've seen in the last ten years are all basic scripting languages, great for the web or IT work but awful for doing lots of work in a short period of time. It's no mystery why Fortran, C/C++, and Ada are still surviving in areas where no just-in-time wannabe will flourish.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (1)

InfoJunkie777 (1435969) | about 5 months ago | (#46964267)

I am OP: Sorry I had it down as all caps.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (0)

Anonymous Coward | about 5 months ago | (#46963919)

B: CUDA adaptations of legacy code.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (4, Funny)

PPH (736903) | about 5 months ago | (#46963939)

At least Slashdot seems to encourage re-use of commonly used responses when a question is asked.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (5, Interesting)

Nemyst (1383049) | about 5 months ago | (#46963955)

This. I have many friends in the physics dept and the reason they're doing Fortran at all is that they're basing their own stuff off of existing Fortran stuff.

What amused me about the article was actually the Fortran versions they spoke about. F95? F03? F08? Let's be real: just about every Fortran codebase I've heard of is still limited to F77 (with some F90 if you're lucky). It just won't work on later versions, and it's deemed not worth porting over, so the entire codebase is stuck on almost-40-year-old code.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (3, Interesting)

Brett Buck (811747) | about 5 months ago | (#46964237)

F77+extensions, usually DEC extensions. Very very few people ever used strict F77 with no extensions.

        Some of the issues this causes are irritating, bordering on unnerving. This week we discovered that g77 didn't care for treating INTEGER as LOGICAL. There used to be no other way to specify bit operations; now it is precluded. Everybody's code has that, and there's really nothing intrinsically wrong or difficult to understand about it, but it was technically non-standard (although everyone's extensions permitted it) and it won't work on g77, except maybe with the infamous -fugly flag.
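A minimal sketch of the idiom in question (illustrative only): the non-standard extension applied logical operators directly to INTEGERs as bit masks, while standard Fortran (since the MIL-STD-1753 extensions, later folded into Fortran 90) provides the IAND/IOR/IEOR intrinsics instead.

```fortran
C     Sketch: replacing the old INTEGER-as-LOGICAL bit twiddling
C     with standard intrinsics.
      PROGRAM BITS
      INTEGER FLAGS, MASK
      FLAGS = 12
      MASK  = 10
C     Old DEC-style extension (rejected by g77):
C         FLAGS = FLAGS .AND. MASK
C     Standard replacement:
      FLAGS = IAND(FLAGS, MASK)
      PRINT *, FLAGS
      END
```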

 

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (2, Interesting)

Anonymous Coward | about 5 months ago | (#46963967)

Seconded. And the legacy isn't necessarily just the source code. Many of the engineering industries using such codes have a relatively low turnover rate, meaning an older group of engineers and researchers with the most experience stick around for decades. Most of these folks used Fortran since college. It works for them, and they aren't concerned with any "new-fangled" languages that offer more features. Another reason I hear from these folks is that Fortran has powerful array slicing and indexing syntax not found in C, making big data manipulation simpler. Newer programming languages like Python have packages like NumPy which offer similar capabilities, but it's often a nightmare to translate hundreds of thousands of legacy code lines simply to "escape" Fortran. And there are decent bindings to Fortran that can be leveraged for many parallel computing packages (MPI), which means even less incentive to move up.
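The array-slicing point can be sketched in a few lines of Fortran 90 (illustrative only; variable names are made up). None of these expressions has a direct one-line equivalent in plain C:

```fortran
program slicing
  implicit none
  real :: a(100,100), col(100)
  a = 1.0
  col = a(:, 7)                    ! whole 7th column in one expression
  a(1:50, 1) = 2.0 * a(51:100, 2)  ! assignment between array sections
  a = a + 1.0                      ! whole-array arithmetic, no loops
  print *, maxval(a), sum(col)
end program slicing
```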

Newer folks entering the field often work under the tutelage or mentoring of these folks, and Fortran sticks around. Python is gaining usage in the scientific communities, and it's often coupled with mixed-language wrapping code like f2py or SWIG to access any legacy Fortran code for heavy number-crunching work. I've seen this recipe used successfully in parallel computing to detach some of the "administrative" aspects of scientific code into newer languages.
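A minimal sketch of that recipe, assuming a hypothetical file and module name: a small Fortran kernel kept for number crunching, wrapped for Python with f2py.

```fortran
! saxpy.f90 -- a hypothetical kernel. Build a Python module with:
!     f2py -c -m saxpy saxpy.f90
! then call it from Python as:  saxpy.saxpy(n, a, x, y)
subroutine saxpy(n, a, x, y)
  implicit none
  integer, intent(in) :: n
  real, intent(in)    :: a, x(n)
  real, intent(inout) :: y(n)
  y = y + a*x            ! whole-array operation; no explicit loop
end subroutine saxpy
```

The "administrative" work (file handling, plotting, job setup) stays in Python, while the hot loop stays compiled.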

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (5, Informative)

mbkennel (97636) | about 5 months ago | (#46963977)


A: Legacy code, and because Fortran 2003+ is a very good modern language for scientific computation and maps very naturally to problems. As it turns out, the language semantics (both legacy and modern constructs) make it very good to parallelize. And it runs fast: merely equaling C++-level performance is considered a weak showing.

If you haven't seen or used modern Fortran and think it's anything like Fortran 66/77 then you're mistaken. Except for I/O, which still tends to suck.

In addition there are still some seemingly trivial but actually important features which make it better than many alternatives (starting from Fortran 90).

There's some boneheaded clunkers in other languages which Fortran does right: obviously, built-in multi-dimensional arrays, AND, arrays whose indices can start at 0, 1 (or any other value) and of course know their size. Some algorithms are written (on paper) with 0-based indexing and others with 1-based and allowing either one to be expressed naturally lowers chance of bugs.
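The arbitrary-lower-bound point, as a small illustrative sketch:

```fortran
program bounds
  implicit none
  real :: h0(0:9)        ! 0-based, matches many textbook algorithms
  real :: h1(10)         ! default 1-based
  real :: w(-3:3)        ! any lower bound, e.g. a centered stencil
  integer :: i
  do i = -3, 3
     w(i) = real(i)      ! index runs over the declared range
  end do
  print *, lbound(w), ubound(w), size(h0), size(h1)
end program bounds
```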

Another one is that Fortran distinguishes between dynamically allocatable, and pointers/references. The history of C has constrained/brain-damaged people to think that to get the first, you must necessarily take the second. That doesn't happen in Fortran, you have ALLOCATABLE arrays (or other things) for run-time allocation of storage, and if you need a pointer (rarer) you can get that too. And Fortran provides the "TARGET" attribute to indicate that something *may be pointed to/referenced*, and by default this is not allowed. No making pointers/references to things which aren't designed to be referred to multiple times. This also means that the aliasing potential is highly controlled & language semantics constructed to make Fortran able to make very aggressive, and safe, optimization assumptions.
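A sketch of that distinction (illustrative names): ALLOCATABLE gives run-time sizing with no aliasing possible, while a POINTER may only be associated with something explicitly marked TARGET, so the compiler knows exactly which objects can alias.

```fortran
program alloc_vs_ptr
  implicit none
  real, allocatable :: work(:)   ! run-time sized, can never alias
  real, target      :: grid(100) ! explicitly marked as referenceable
  real, pointer     :: view(:)
  allocate(work(1000))           ! allocation without any pointer
  grid = 0.0
  view => grid(1:50)             ! pointers only to TARGETs
  work = 0.0
  view = 1.0                     ! writes through to grid(1:50)
  print *, size(work), sum(grid)
end program alloc_vs_ptr
```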

The more parallel you want to go, the more of these assumptions you need to get fast code, and naturally written Fortran code comes this way out of the box more than code in most other languages.

F77/F95 was what I saw used. (0)

Anonymous Coward | about 5 months ago | (#46964229)

That may be, but most of the code I saw written was Fortran 77, although some was Fortran 95. Some of the code was mocked up in Python first, but there was a HUGE performance difference, so all the actual research code was Fortran.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (5, Insightful)

the eric conspiracy (20178) | about 5 months ago | (#46963979)

Legacy code that has been carefully checked to give correct results under a wide range of conditions.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (0)

Anonymous Coward | about 5 months ago | (#46964043)

This may be especially important if the testing method is ISO accredited. Getting a new test accredited may just not be worth the hassle, so if it ain't broke...

Workers still use shovels in 2014!!!!! (5, Insightful)

TapeCutter (624760) | about 5 months ago | (#46964027)

Prospectors did not stop using shovels when bulldozers were invented. FORTRAN is the scientist's shovel, visualization software is the bulldozer.

A: Legacy code.

AKA battle hardened libraries that work as advertised.

Re:Workers still use shovels in 2014!!!!! (0, Flamebait)

Kaz Kylheku (1484) | about 5 months ago | (#46964209)

Fortran is a plastic shovel with a rolled up newspaper for a handle.

not in the field, eh? (4, Informative)

rubycodez (864176) | about 5 months ago | (#46964079)

no, used because Fortran is the high level language that produces the fastest code for numeric computation, it is by far the most optimizable. Yes, it blows away C.

Re:not in the field, eh? (-1)

angel'o'sphere (80593) | about 5 months ago | (#46964173)

That is nonsense.
Both compile down to the exact same machine code.

Re:not in the field, eh? (3, Insightful)

Cramer (69040) | about 5 months ago | (#46964235)

They both generate machine code. But they get there in different ways and produce very different output. It would be more correct to say FORTRAN (compilers) blows away any C compilers. (esp. gcc)

Re:not in the field, eh? (3, Informative)

InfoJunkie777 (1435969) | about 5 months ago | (#46964341)

OP here. This is what the article said. Compilers are the key. They have been around a long time. Another key is that commercial compilers (like Intel's, for example) further increase the speed, as the vendors know how to optimize the code for the specific CPU at hand.

Re:not in the field, eh? (1)

rubycodez (864176) | about 5 months ago | (#46964287)

you are spouting off about a field in which you do not work, against a well-known fact.

Doing the quite common operations of numeric analysis, machine output of the C compiler will be worse, less optimized, slower than that of Fortran.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (1)

tlambert (566799) | about 5 months ago | (#46964115)

A: Legacy code.

Most people who learned FORTRAN did so in a class intended to teach FORTRAN.

Most people these days aren't learning FORTRAN; they're learning other languages, and they're doing so in a context that teaches them about applying the tool, rather than making them really, really good with the tool itself.

To use an analogy, it's the difference between learning how to use all the tools in the wood shop before being sent in to make a chair (learn the tools) vs. being taught all about "chairness", and then being sent in to make a chair, without a great knowledge of the tools themselves. Someone is going to lose a finger, in the second case.

So it might be "legacy", but the legacy they want is the teaching methods which were in use when most of the FORTRAN programmers came into existence in the first place.

Re:Q: Why Are Scientists Still Using FORTRAN in 20 (0)

Anonymous Coward | about 5 months ago | (#46964191)

The headline is not a question but a statement. The explanation is in TFS where it says, "no language since has been able to match its speed."

It's the right tool for the job (5, Insightful)

Balial (39889) | about 5 months ago | (#46963893)

Scientists work in formulas. Fortran was designed to do things naturally that don't fit into C/C++, Python, whatever.

Re:It's the right tool for the job (5, Insightful)

smoothnorman (1670542) | about 5 months ago | (#46963951)

mod the above up please (i'm fresh out of mod points), because that's it in a nutshell. Fortran was designed for science/engineering work. And here's something that a majority of computer-science mavens never seem to grasp: in academia, at least, the use of a program is often relatively ad hoc, and for the life of the publication. They need to have lots of numerical stuff done by easily referenced libraries, then handed off to their (poor) post-docs/grad-students to study for their own one-off programming purposes. That is, the next vital program will have little to do with the previous one, except for those same well-referenced, peer-reviewed numerical libraries. Does that sound like a perfect use (model) of Clojure or Haskell to you? (yes yes, you in the back, i know you brush your teeth with monads, but you're the rare case). Haskell and friends force you to think a lot up front for gains at the rear end, but with much of academic programming there's no rear end.

In other words... (4, Insightful)

93 Escort Wagon (326346) | about 5 months ago | (#46963985)

If it ain't broke - don't fix it.

Re:It's the right tool for the job (1)

Darinbob (1142669) | about 5 months ago | (#46964279)

I think this is indeed the problem with a lot of languages explicitly designed to be more concurrent or parallel. I looked at SISAL and similar languages ages ago in grad school and they really were more difficult to understand and get the concepts clear. Not so bad for CS students (or CS grad students in the case of Haskell) but do you really want the physics or chemistry academic to learn a style of programming language that the vast majority of programming professionals don't even understand or care about?

Re:It's the right tool for the job (0)

Anonymous Coward | about 5 months ago | (#46964199)

Also because vector machines are coming back, sort of.

Re:It's the right tool for the job (1)

Cramer (69040) | about 5 months ago | (#46964255)

Not so much that it "doesn't fit," but that some things are a pain to do and the compilers suck at handling them. (e.g. array math; even 2 dimensions is a mess in C/C++)

Wrong question (5, Insightful)

Brett Buck (811747) | about 5 months ago | (#46963895)

Why not?

      Actually that is a serious question, for these sorts of applications there seems to be no significant downside.

Re:Wrong question (5, Insightful)

jythie (914043) | about 5 months ago | (#46963921)

That is what tends to bother me about these 'wow, people are not using what we in another field are using!' type questions. FORTRAN does its job well, has libraries relevant to what people are using it for, and experience in it is common within that community. Why shouldn't they use it?

Re:Wrong question (2)

timeOday (582209) | about 5 months ago | (#46963987)

Computer languages turned out to be one of those things that seem very deep and significant, but actually aren't. FORTRAN and Lisp (and BASIC and C, only somewhat later) made programmers about as productive, within a reasonably small constant factor, as anything since. (And before you hit me with "Citation Needed," remember it cuts both ways!)

Re:Wrong question (1)

Shompol (1690084) | about 5 months ago | (#46964157)

FORTRAN and Lisp (and BASIC and C, only somewhat later) made programmers about as productive, within a reasonably small constant factor, as anything since.

I recently switched from C to Python, and my productivity shot up ~100 times. For example, I created a tree of hashes to perform pattern matching on large data sets in linear time. It took me 2 hours from concept to production run. Also factor in a huge library that does everything needed on this planet, ease of maintaining 100x fewer lines of code and 0 memory management hurdles.

The downside is that it runs 100 times slower than C, but since it is the programmer's productivity you are talking about, you are very wrong.

Re:Wrong question (2)

mbone (558574) | about 5 months ago | (#46964321)

The downside is that it runs 100 times slower than C, but since it is the programmer's productivity you are talking about, you are very wrong.

You do realize that CPU limited problems are not uncommon in physics and engineering?

Re:Wrong question (5, Insightful)

Rhys (96510) | about 5 months ago | (#46964091)

There's actually significant upside.

Ever debugged a memory error in C? Ever done it when it is timing dependent? How about on 1024 nodes at once? Good luck opening that many gdb windows.

I TA'd the parallel programming class. I told the students (largely engineers & science, not CS) -- use fortran. Lack of pointers is actually a feature here.

Why not? (3, Insightful)

grub (11606) | about 5 months ago | (#46963899)


At work in the recent past (the 2000s) we were still supporting FORTRAN on the SGI machines we had running. The SGI compilers would optimize the hell out of the code and get it all parallelized, ready to eat up all the CPUs.

Newer isn't always better.

Ten Reasons to use Modern Fortran (5, Informative)

wispoftow (653759) | about 5 months ago | (#46963903)

1) Modern Fortran is not all uppercase
2) Modern Fortran does not have to start in column 7
3) Modern Fortran has dynamic memory allocation
4) Modern Fortran can use the same types as C (maximizing interoperability), hence can be called wherever C might be called
5) Modern Fortran has objects, polymorphism, etc.
6) Modern Fortran has (a limited form of) pointers
7) Modern Fortran has concise array/vector/matrix operations
8) Modern Fortran has dynamically allocatable, multidimensional arrays that can be indexed starting with any integer
9) Modern Fortran supports the complex type without higgery-jiggery
10) Modern Fortran doesn't *need *pointers *in *all *the *places *that &C does; pass by reference is the norm
11) Modern Fortran is blazingly fast and designed for science ...

Some folks still write in Fortran 77, and put up with the tired tales of woe that are bound to come with a language specification that is many decades old.

But, that code/style still works, and who am I to judge how you want to get your work done?
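Several of those points fit in one small free-form sketch (illustrative only): built-in complex arithmetic, dynamically allocatable multidimensional arrays with arbitrary lower bounds, and whole-array operations.

```fortran
program modern
  implicit none
  complex :: z                     ! complex type built in
  real, allocatable :: m(:,:)      ! dynamic, multidimensional
  z = (3.0, 4.0)
  allocate(m(0:4, 0:4))            ! any lower bound you like
  m = abs(z)                       ! whole-array assignment, |3+4i| = 5
  print *, m(0,0), size(m)
end program modern
```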

When (1)

Anonymous Coward | about 5 months ago | (#46963931)

When will they ruin Fortran, like they are ruining everything else?

When will Fortran lose half its functionality?
When will Fortran get a touchscreen interface?
When will Fortran do a forced upgrade to continue being supported?

Re:When (1)

mbkennel (97636) | about 5 months ago | (#46963999)

No, Fortran won't be ruined.

Fortran, the language, has evolved very significantly with little annoying cruft hurting current design on account of legacy compatibility.

The comparison vs C and C++ is instructive.

Re:Ten Reasons to use Modern Fortran (4, Insightful)

rubycodez (864176) | about 5 months ago | (#46964117)

you left out the massive gigabytes of well-tested and respected numeric libraries for all the major fields of science and engineering (that are free for use too)..... oh, and much of that is written in F77, the most optimizable language for numeric computation on planet earth. that's why supercomputer companies always sell Fortran compilers

Strangely? (5, Insightful)

fahrbot-bot (874524) | about 5 months ago | (#46963905)

When you go to any place where 'cutting edge' scientific research is going on, strangely the computer language of choice is FORTRAN, the first computer language commonly used, invented in the 1950s.

Perhaps it's still the best tool for the job. Why is that strange? Old(er) doesn't necessarily mean obsolete -- and new(er) doesn't necessarily mean better.

Re:Strangely? (2)

jythie (914043) | about 5 months ago | (#46963933)

"tool" is the key word. Within the type of research that uses it, they want a tool for getting their actual goals done. CompSci and such tend to see languages and such as points of focus unto themselves.

Re:Strangely? (5, Insightful)

Dutch Gun (899105) | about 5 months ago | (#46964093)

Agreed. My thought at reading the summary was "Do older languages have some sort of expiration date I don't know about?" What's odd about it? Also, it's not like the language has been stagnant. English is an old "legacy" human language with lots of cruft and inconsistent rules, but it works well enough for us that it's not worth jumping ship for Esperanto.

A large part of it is probably the simple inertia of legacy: in code, in systems, and in personnel. However, legacy systems tend to eventually be replaced if a demonstrably superior product can improve performance in some way. Any significant change, even one for the better, causes pain and friction, so the change typically has to be worth the pain involved. Obviously, in the eyes of many science-focused projects, it hasn't been worth switching to a new language. There's also value in having a body of work in an older and very well understood and documented language, as it means new team members are much more likely to already be proficient with it than with a newer and less popular language.

I can also understand not wanting to switch to some "flavor of the month" language when you're not sure how long it will be actively supported. FORTRAN has credibility simply based on its incredible longevity. No, it's not new and sexy, but you can bet it will probably be around for another half-century.

Similar problem to COBOL (0)

Anonymous Coward | about 5 months ago | (#46963907)

It's not a dead language. It still changes with the times. And there's a vast reservoir of code already written in it. Why cast all that away and start from scratch with new languages?

Of course, if you could make the new language recognize the old code and incorporate or interface with it, then you could do a transition rather than a plain break with the past.

It's not a bad language (1)

Anonymous Coward | about 5 months ago | (#46963909)

There are languages with worse syntax and clunkier to use -- I'd rather know why people are still willingly using COBOL. "Too lazy/expensive to upgrade legacy infrastructure" isn't a valid reason, it's just a popular and convenient excuse.

Re:It's not a bad language (2)

John.Banister (1291556) | about 5 months ago | (#46964073)

Someone told me (in 1986, I think) "It's amazing. You just write this documentation, and it runs!"

Fortran is NOT the language of choice (3, Informative)

Rostin (691447) | about 5 months ago | (#46963923)

I have a PhD in engineering, and my dissertation involved writing lots of code. Now I work at a national lab in the US, and I and nearly all of my coworkers work on scientific or engineering codes of some sort. Although there are significant amounts of legacy code written in Fortran lying around (a project I work on uses a Fortran library written in 1973), very little development is done in that language. It's all C++ or Python.

Re:Fortran is NOT the language of choice (0)

Anonymous Coward | about 5 months ago | (#46963975)

Agreed. I work at a university and run on national supercomputers.. Lots of legacy code exists in Fortran, but lots more development happens in Python or C++ now, sometimes with small kernels being written in Fortran.

Also, the speed difference of Fortran over decently-written C/C++ is non-existent. C/C++ lets you write things less efficiently, but it doesn't mean you have to.

Re:Fortran is NOT the language of choice (1)

Anonymous Coward | about 5 months ago | (#46963997)

Is this a recent change? I toured Goddard Space Center with my college in 2011 and we were told that they were -always- on the lookout for anyone with FORTRAN skills.

Re:Fortran is NOT the language of choice (0)

Anonymous Coward | about 5 months ago | (#46964047)

I imagine they need people to maintain their legacy stuff when the current maintainers retire.

Re:Fortran is NOT the language of choice (1)

Cramer (69040) | about 5 months ago | (#46964331)

Indeed. They were sniffing around the College of Textiles (!) in '95 when I was graduating. We were one of the few departments that required FORTRAN (77).

(I had been campaigning since '93 to stop that shit. 'tho it paid well as a Lab Instructor (aka "TA".) They switched to Java (ugg) in '96-97.)

I never thought about engineering and Fortran (4, Insightful)

93 Escort Wagon (326346) | about 5 months ago | (#46964033)

Large scale models handling huge arrays, though - like climate or weather modeling - I think that's where Fortran has always been king of the roost.

The whole point is speed. No one's working in Python if they're interested in speed.

Re:I never thought about engineering and Fortran (1)

Darinbob (1142669) | about 5 months ago | (#46964295)

My guess is that python is being used mostly to handle data rather than crunching numbers. As in read those three files that came in from the satellite, merge them, strip out the irrelevant stuff, shove it back out as a packed binary file ready for processing.

Re:I never thought about engineering and Fortran (1)

Redeye Carci (2932323) | about 5 months ago | (#46964349)

Python is used to glue all the lower level stuff together. If you are iterating over large arrays in Python you are doing something wrong, but if you are using Python to glue C/Fortran modules together then it starts to look much better.

Re:I never thought about engineering and Fortran (1)

majid_aldo (812530) | about 5 months ago | (#46964353)

by carefully constructing Python statements and calling libraries you can get the speed you want from a 'Python' program. if you can't fit your problem in this way, then yeah, it makes sense to go to Fortran or C.

Re:Fortran is NOT the language of choice (1, Insightful)

AchilleTalon (540925) | about 5 months ago | (#46964283)

Fortran is a scientific programming language. You are an engineer, seems clear enough why you are not using Fortran. Any explanations needed?

Memory management (2)

Fotis Georgatos (3006465) | about 5 months ago | (#46963945)

The biggest reason is that it helps non-computer-science scientists write computational codes, neither having to devote excessive amounts of time to memory management nor deviate from the classic imperative programming model. And it is also important for a purely non-technical reason: a generation of domain experts in engineering and scientific domains were trained on FORTRAN codes.

As managers of High Performance Computing platforms, we generally take an areligious approach and deliver to the users all possible permutations of language types that a given community may need. The following is a very common setup, containing both GNU & Intel compilers: https://hpc.uni.lu/users/softw... [hpc.uni.lu]

btw. I'm not defending Fortran in any way; ask any Fortran fan which language his compilers are written in ... there is a reason :)

Re:Memory management (1)

phantomfive (622387) | about 5 months ago | (#46963983)

Good point. Sometimes we forget how hard it is to adapt to the object-oriented programming style, since it comes so naturally to us. But if all you want to do is get the computer to implement your formula, OOP doesn't give you much.

Still a big hit in Vietnam (2, Insightful)

smitty_one_each (243267) | about 5 months ago | (#46963947)

After all, it was "For Tran".

Re:Still a big hit in Vietnam (1)

grub (11606) | about 5 months ago | (#46964013)

That had me laughing.

Re:Still a big hit in Vietnam (0)

Anonymous Coward | about 5 months ago | (#46964083)

I'm a sucker for any gag involving punctuation or whitespace manipulation.

As others have said... why not? (4, Insightful)

Karmashock (2415832) | about 5 months ago | (#46963963)

If the language accomplishes the task efficiently and effectively with no apparent downside then why attempt to switch languages simply for the sake of switching?

Furthermore, an ability to run legacy code should be sustained especially in science where being able to use that code again after many years might save scientists from having to reverse engineer past discoveries.

Weird (0)

Anonymous Coward | about 5 months ago | (#46963971)

They also use integrals and derivatives which first were used in the 17th century. Oh wait, that's OK! Software is mostly fashion these days.

It just works (0)

Anonymous Coward | about 5 months ago | (#46963973)

Fortran is the best at numbers, and not so good at other things (Try downloading a webpage with it, or even bind to Curl without the command prompt). The only language that can approach Fortran's niche is C, which is pretty good at everything, but you have to know a lot of programming tricks to get the same performance with numbers.

Why replace Fortran? I think it's great that a technology is still being used that spans back to vacuum tube machines, and that those programmers' skills have not been abandoned by industry (unlike, say, Visual Basic).

not really (0)

Anonymous Coward | about 5 months ago | (#46963995)

I work in scientific computing and almost all new software is written in C++. Two huge parallel solver libraries, PETSc and Trilinos, are C and C++; ditto for many, many other codes. LAPACK is the one Fortran code that you might find yourself using. There are legacy codes written in Fortran, but it's always a mistake to start something new in Fortran, particularly if you want your developers to be able to get jobs outside of science (when the funding goes south).

yes really (1)

rubycodez (864176) | about 5 months ago | (#46964109)

haha, maybe you better look at what language huge parts of the cores of your PETSc and Trilinos are written in. hint: starts with an F

Re:yes really (1)

smoothnorman (1670542) | about 5 months ago | (#46964145)

yep. along with all the rest of the BLAS, EISPACK, CERNLAB, MINPACK, SOFA, ATLAS, EIGEN, ... and even the comparatively more recent Bioinformatics cores.. BLAST, BLAT, ...

I really don't understand where the "scientific computing .. almost all new software is written in C++" claim comes from. It's all become Python (and Perl before that) calling old libraries at the scientific meetings I've attended. (but I suppose YMMV)

Key Reason (4, Interesting)

stox (131684) | about 5 months ago | (#46964015)

Huge libraries of FORTRAN code have been formally proven. New FORTRAN code can be formally proven. Due to the limitations of the language, it is possible to put the code through formal processes to prove the code is correct. In addition, again as a benefit of those limitations, it is very easy to auto-parallelize FORTRAN code.

I call BS (0)

Anonymous Coward | about 5 months ago | (#46964063)

Sorry folks, I've spent a lot of time in HPC environments and I just don't see Fortran being the GOTO language any more. The differences that used to allow Fortran to be better compiled are fading away; the only thing hiding this fact is that bloated, uber-templated, unintelligible, over-abstracted C++ doesn't necessarily compile well. Take a modern compiler, write code that is numeric-heavy in Fortran and C -- then actually look at what ends up being created, and there isn't a hill of beans difference; most of the instructions are in COMMON. WHILE Fortran used to be easier to compile, that is nearing its END. DO you not think so?

It's all about math, stupid (0)

Anonymous Coward | about 5 months ago | (#46964069)

FORTRAN is all about math. Math functions are trivially easy to define because the language was built to deal with math functions.

Also, there is an enormous library of established, proven, dependable libraries for the most important math functions in most industries. LINPACK, LAPACK, etc..

Doing math in C/C++ or whatever other language - you just never know what results you're going to get.

We're Not (1, Interesting)

friedmud (512466) | about 5 months ago | (#46964121)

I saw this link bait the other day...

We're NOT using Fortran anymore...

Many of us at the National Labs do modern, object-oriented C/C++... Like the project I'm in charge of: http://www.mooseframework.org/ [mooseframework.org]

There are whole labs that have completely expunged Fortran in favor of C++... Like Sandia (http://trilinos.sandia.gov) who actually went through a period in the late 90s and early 2000s where they systematically replaced all of their largest Fortran computational science codes with C++.

Those places that don't use C++ use C like the awesome PETSc library from Argonne ( http://www.mcs.anl.gov/petsc/ [anl.gov] ) which actually employs an object-oriented scheme in C.

The big name modern codes that are getting run on the biggest machines are generally done in C and C++.

I don't see that situation changing anytime soon, as there is simply a massive amount of C and C++ libraries that will continue to provide the engine for tomorrow's codes. The trend I see happening most often is utilizing C and C++ libraries with Python glue for everything that doesn't need raw speed... I think that trend will continue.

Re:We're Not (2, Interesting)

Anonymous Coward | about 5 months ago | (#46964151)

If you're using C++ for scientific math, then you deserve to have whatever credentials you may possess revoked immediately. No language should be used for scientific math that can produce different results based upon the version of library or platform it is compiled against.

You also cannot prove C++ code is good. You just can't. C++ is not deterministic, again, because the outcome depends on platform/library versions, compiler options, time of day, alignment of the planets, and many other factors. There is no way to say for certain that "Yes, this code will produce the correct results under all conditions."

The big name modern codes that are getting run on the biggest machines are generally done in C and C++ and producing incorrect results.

I have PoC code that I have used to prove that C++ can produce incorrect results based on factors other than the code itself, with discrepancies as large as 10^-15. That is a completely unacceptable level of inaccuracy for scientific exploration.

Re:We're Not (5, Insightful)

friedmud (512466) | about 5 months ago | (#46964201)

Firstly... 10^-15 is WAY beyond what most scientific codes care about. Most nonlinear finite-element codes generally shoot for convergence tolerances between 1e-5 and 1e-8. Most of the problems are just too hard (read: incredibly nonlinear) to solve to anything beyond that. Further, 1e-8 is generally WAY beyond the physical engineering parameters for the problem. Beyond that level we either can't measure the inputs, have uncertainty about material properties, can't perfectly represent the geometry, have discretization error etc., etc. Who cares if you can reproduce the exact same numbers down to 1e-15 when your inputs have uncertainty above 1e-3??

Secondly... lots of the best computational scientists in the world would disagree:

http://www.openfoam.org/docs/u... [openfoam.org]
http://libmesh.sourceforge.net... [sourceforge.net]
http://www.dealii.org/ [dealii.org]
http://eigen.tuxfamily.org/ind... [tuxfamily.org]
http://trilinos.sandia.gov/ [sandia.gov]

I could go on... but you're just VERY wrong... and there's no reason to spend more time on you...

Re:We're Not (2)

Dizzer (251533) | about 5 months ago | (#46964259)

If you get hung up on floating point truncation errors, then I have bad news for you: Fortran won't protect you from that. You seem to be under the mistaken impression that this invalidates the results for some reason. This is utter bullshit. One example is molecular dynamics simulations. An MD simulation is a chaotic system. The _exact_ trajectory is not the relevant result. The phase space that is sampled is. Trajectories of systems with identical initial conditions are bound to diverge on different machines due to a change in floating point operation order and the resulting truncation errors. But the phase space that is sampled is _equivalent_ in each run. If for some reason machine precision is important to you, you'd be much better off using an arbitrary-precision library such as GMP (https://gmplib.org/).
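A minimal sketch of the truncation-order effect described above (function names are illustrative): regrouping the same three terms changes the result once the magnitudes differ enough, which is a property of IEEE 754 rounding, not of C, C++, or Fortran.

```c
/* Floating-point addition is not associative.  A compiler or parallel
   reduction that reorders a sum can therefore shift the last bits of the
   result -- the mechanism behind run-to-run trajectory divergence in MD. */
double sum_left(double big, double small)
{
    /* 'small' falls below the rounding granularity of 'big' and is
       discarded at each step. */
    return (big + small) + small;
}

double sum_right(double big, double small)
{
    /* The two 'small' terms combine first, and their sum is large enough
       to survive the final rounding. */
    return big + (small + small);
}
```

With `big = 1.0e16` and `small = 1.0`, the two groupings differ by exactly 2.0 in IEEE 754 double precision.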

Yes you are (1)

dbIII (701233) | about 5 months ago | (#46964207)

Don't need Fortran because of PETSc? Take a look at how it is installed:
http://www.mcs.anl.gov/petsc/documentation/installation.html
Especially note the: ./configure --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack --download-mpich
Is that enough to tell you that the PETSc libraries contain Fortran?

Re:Yes you are (1)

friedmud (512466) | about 5 months ago | (#46964261)

You can install PETSc without a Fortran compiler at all. Change that --download-f-blas-lapack to --download-c-blas-lapack and you're good to go...

In fact... MOOSE works on platforms without a Fortran compiler at all... although we generally recommend that you have one (so that you can still link in any legacy routines you've written in Fortran).

I'm not specifically against Fortran... I was just trying to say that most new computational science development at the National Labs is NOT being done in it. We've moved on...

Re:Yes you are (1)

Dizzer (251533) | about 5 months ago | (#46964265)

Uhm...: http://www.mcs.anl.gov/petsc/documentation/faq.html#why-c

Re:We're Not (0)

Anonymous Coward | about 5 months ago | (#46964263)

the project I'm in charge of: http://www.mooseframework.org/ [mooseframework.org]

Thanks for wasting my tax dollars you unproductive asshole.

Re:We're Not (1)

friedmud (512466) | about 5 months ago | (#46964315)

Interesting... I'm not sure what's unproductive about producing a freely available scientific engineering platform that is directly impacting the energy generation issues in this country.

But, ok :-)

Postscript (1)

sk999 (846068) | about 5 months ago | (#46964127)

Today someone told me about how he once wasn't allowed to disturb a printer - because someone was using it to run a job doing an FFT written in PostScript. Apparently the large amount of memory available in the printer was paramount.

Re:Postscript (1)

50000BTU_barbecue (588132) | about 5 months ago | (#46964243)

Geez, was this Don Lancaster's printer? It makes sense though, PostScript has to do a lot of math.

http://www.tinaja.com/post01.s... [tinaja.com]

Why is anyone still using C++ in 2014? (0)

thisisauniqueid (825395) | about 5 months ago | (#46964133)

Forget Fortran, I want to know why anybody in their right mind is still using the obtuse juggernaut mongrel of a language known as C++ in 2014. (Even the 11 and 14 revisions don't make things any better; they only wallpaper over obtuse features with other obtuse features... very few people alive truly know all the weird quirks of C++ inside and out.)

Re:Why is anyone still using C++ in 2014? (1)

friedmud (512466) | about 5 months ago | (#46964161)

Not everyone needs to know all of the quirks of C++ to use it. My project ( http://mooseframework.org/ [mooseframework.org] ) does all of the nasty C++ stuff under the hood so that we can expose a very straightforward interface to non-computer-scientists.

It's working out well so far.

Object-oriented is still a good paradigm until the functional language people get everything figured out and there are enough computational science libraries written in functional languages. And if you want to do object-oriented and you still want to be fairly close to the metal for performance reasons then C++ is a good choice.

There are people that do object-oriented with C like the PETSc team ( http://www.mcs.anl.gov/petsc/ [anl.gov] )... and they have good reasons for doing so... but the result isn't necessarily less imposing to the uninitiated than C++...

Re:Why is anyone still using C++ in 2014? (1)

DMUTPeregrine (612791) | about 5 months ago | (#46964335)

Well, as the old joke goes, everyone knows 30% of C++.
They just all know a different 30%.

As a Social Science Ph.d. (5, Funny)

robbiedo (553308) | about 5 months ago | (#46964177)

I am sticking with Visual Basic 6

Because C and C++ multidimensional arrays suck (5, Insightful)

Animats (122034) | about 5 months ago | (#46964179)

A big problem is that C and C++ don't have real multidimensional arrays. There are arrays of arrays, and fixed-sized multidimensional arrays, but not general multidimensional arrays.

FORTRAN was designed from the beginning to support multidimensional arrays efficiently. They can be declared, passed to subroutines, and iterated over efficiently along any axis. The compilers know a lot about the properties of arrays, allowing efficient vectorization, parallelization, and subscript optimization.

C people do not get this. There have been a few attempts to bolt multidimensional arrays as parameters or local variables onto C (mostly in C99), but they were incompatible with C++, Microsoft refused to implement them, and they were made optional in the latest revision of C.

Go isn't any better. I spent some time trying to convince the Go crowd to support multidimensional arrays properly. But the idea got talked to death and lost under a pile of little-used nice features.
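For reference, the C99 variably-modified parameters mentioned above look like this (a sketch; the `trace` function is illustrative):

```c
#include <stddef.h>

/* C99 variably-modified parameters are the closest C comes to Fortran's
   assumed-shape arrays: the dimension travels with the declaration, so
   the compiler computes the row stride itself and m[i][j] indexing works
   for a size chosen at run time.  (This feature was made optional in the
   later C standard and MSVC never implemented it -- the portability
   problem described above.) */
double trace(size_t n, double m[n][n])
{
    double t = 0.0;
    for (size_t i = 0; i < n; i++)
        t += m[i][i];  /* stride n is known from the parameter list */
    return t;
}
```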

Re:Because C and C++ multidimensional arrays suck (0)

friedmud (512466) | about 5 months ago | (#46964221)

Easily fixed with libraries like Eigen ( http://eigen.tuxfamily.org/ind... [tuxfamily.org] ) and many others.

Most of the better "frameworks" out there come with their own proxy objects for multidimensional arrays (like http://libmesh.sourceforge.net... [sourceforge.net] )

Multidimensional arrays haven't been an issue (especially in C++) for quite a long time...

Re:Because C and C++ multidimensional arrays suck (1)

Dizzer (251533) | about 5 months ago | (#46964343)

All the built-in array people are essentially obsessing over a micro-optimization. First of all, I would argue that in a scientific research environment development time is a far more important factor than execution time. And having a framework with a clean outward-facing interface for reusers makes a huge difference. Clean, well-designed object-oriented code also encourages contributions and allows your code to flourish, which reduces the pressure for people to invent their own wheels (again saving developer time). Secondly, the more substantial optimizations come from choosing the appropriate algorithms. Why worry about a 5% speed-up when choosing the right preconditioner can give you a 10-fold speed-up? As an aside, why worry about even a 20% slowdown if you have a scalable parallel implementation that you can just throw a few more cores at? Profile before you optimize, and profile _economically_, too!

What difference does it make??? (0)

Anonymous Coward | about 5 months ago | (#46964195)

The business world is still using COBOL. Sometimes you do get it right the first time...

Re:What difference does it make??? (1)

mark-t (151149) | about 5 months ago | (#46964253)

COBOL is a newer language than Fortran.

Took Fortran 77 std. in collegiate academia (0)

Anonymous Coward | about 5 months ago | (#46964203)

It was VERY easy to learn, loaded with libraries out the wazoo for just about anything imaginable in the sciences, & very strongly reminded me of BASIC (from what I recall of it).

* I didn't mind using it @ all...

APK

P.S.=> I never once used it professionally on the job though (though I had just about everything else language-wise I'd learned, which is around a dozen or so over time)... apk

Because it works (1)

BradMajors (995624) | about 5 months ago | (#46964205)

Because it works and it is good enough, even though FORTRAN is not a good language for "scientific research".

The Best (0)

Anonymous Coward | about 5 months ago | (#46964313)

Yes there is a legacy to FORTRAN.

I had my first introduction to FORTRAN, FORTRAN 77, in the Fall Quarter 1978. HOT ! And it was taught in a class from the Electrical Engineering Department ! A Computer Science "Department" came along soon enough in 1980 but the quality and rigor were not there.

Psst.

When I write Bourne shell scripts to do arithmetic and trend surfaces in ANOVA and geophysical inverse, the base functions are FORTRAN 77.

Ha ha
