When I was in school (around ’05), one of my favorite exchanges in class was a professor saying, “Well you can do x in Java, but not in Fortran”, to which a student (who was a full time programmer at the nuclear power plant) replied, “You can do it like x in Fortran 77, like y in Fortran 90, like z in Fortran 95 and I know cause I have to use all of them at work”.
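The student's point maps onto real syntax differences. As a hedged sketch (the actual x/y/z from that exchange are unknown, so scaling an array stands in for them), the same operation idiomatic to each standard looks like this:

```fortran
C     FORTRAN 77: fixed-form source, labeled DO loop
      SUBROUTINE SCL77(A, N)
      INTEGER N, I
      REAL A(N)
      DO 10 I = 1, N
         A(I) = 2.0 * A(I)
   10 CONTINUE
      END

! Fortran 90: free-form source, whole-array syntax
subroutine scl90(a)
  real, intent(inout) :: a(:)
  a = 2.0 * a
end subroutine scl90

! Fortran 95: FORALL for elementwise assignment
subroutine scl95(a)
  real, intent(inout) :: a(:)
  integer :: i
  forall (i = 1:size(a)) a(i) = 2.0 * a(i)
end subroutine scl95
```

(Each fragment assumes its own source file; fixed-form and free-form source can't mix in one file.)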
Some things like Fortran won’t (easily) be displaced since they’re nowhere close to being the long pole in any workflows they are used in. He said they had some stuff written in C++, but the maintenance/speed ratio heavily leaned towards Fortran’s number crunching prowess.
If it can’t be done in Fortran, it’s not worth doing. Also, I can write Fortran programs in any language…
You can think of R as a wrapper around Fortran, in much the same way that Python is a wrapper around C.
Not only does building R itself require a Fortran compiler, common packages do as well. If you look inside your R package tarballs, particularly of solvers, you will see Fortran.
(Scientific computing libraries in Python also wrap Fortran, but CPython doesn’t depend on it. Base R depends on Fortran.)
(repost from HN)
This is where taking time to learn Gentoo is helpful. There is nothing like seeing Fortran scroll across the screen to remind you of all the code that exists.

What a nice pick of a video, even though the context really only unfolds at the very end.
Are there readers here that do Fortran currently? If so, could you tell us a bit more about it? Just as exciting as NASA?
I used to use it at Bloomberg. It was terrible. Just the ultimate in spaghetti code and spaghetti state. Also, we had to use FORTRAN 77.
The thing about Fortran is that there’s nothing especially great about it as a language. It has language-level support for numerical processing stuff (which I never had to use), but it’s not necessary to have that as language-level features.
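For context on what "language-level support for numerical processing" means in practice, here is a small illustrative sketch (not from the parent comment): complex numbers are a built-in type, and matrix math is a standard intrinsic rather than a library call.

```fortran
program numerics
  implicit none
  complex :: z
  real    :: m(2,2), v(2), r(2)
  z = (1.0, 2.0)                    ! complex is a built-in numeric type
  m = reshape([1.0, 0.0, 0.0, 2.0], [2, 2])
  v = [3.0, 4.0]
  r = matmul(m, v)                  ! matrix multiply is a standard intrinsic
  print *, z**2                     ! arithmetic operators extend to complex
  print *, r
end program numerics
```

As the commenter says, none of this strictly needs to live at the language level; C gets the same results through libraries.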
See https://lobste.rs/s/ndqxfv/fortran_is_still_thing#c_bclfdy for more comments in which I concur.
I don’t personally do Fortran, but I do recall there being an interest when I was in HPC up until 2 years ago. Intel still ships a Fortran compiler, for example. Sorry I can’t recall any specific reasons why folks preferred it other than “it’s just what we are used to using”.
I concur, and am also an escapee from HPC (I worked on Allinea/ARM Forge). There are lots of Fortran compilers (flang being a recent, US gov-sponsored addition), debuggers and perf tools, libraries, and so on. You can write a program in Fortran that will send CUDA to an Nvidia board.
Part of it is “what we’re used to/trained in”, part of it is “we trust these implementations of these models”. You don’t just write a new climate model in Scala and say “there you go, if it ever finishes compiling your toolchain will be modern”. There will be a decade of publication, testing and review in the literature before the new model is trusted enough to be used by other researchers, and then they will start adapting it for their MPI/batch system/snowflake build environment. If then, even.
Also the HPC ecosystem has some interesting quirks. Because the purchase prices of the machines are 7-8 digits, often taxpayer funded, and will be used for years at very high utilisation, they like to find products that are supplied by multiple vendors to avoid gouging lock-in costs or disappearing vendors. So one of the interesting effects of gfortran or flang is that they make it easier to buy Intel or PGI’s compiler (in fact IIRC PGI contributed the flang code).
I use Fortran for HPC as well.
Not currently, but in the recent past I worked on a scientific modelling app whose backend was entirely Fortran (and the frontend was Delphi… oh my). It was a bit of a nightmare - think a single 10,000 LoC file that is full of global state manipulation and GOTOs. I’m not sure how much of the terror was because of the language vs it being written mostly by scientists.
About six years ago I worked in a solar physics lab. Two of the scientists worked with CUDA, but the majority of scientists were actively developing Fortran 95 codebases, mostly for image processing. Most of what locked people into this were the libraries that had been authored in the ’90s that were still being shared (via email or HTTP links) around the solar physics community. These were algorithms, image reading/writing, etc.
I’ve used Fortran indirectly a couple times… But most of that was writing Python to wrap existing programs and make systems that can batch lots of program runs off of input that’s nicer than whatever horrible parameter file the programs expected.
I also have a friend who did a lot of similar stuff in R, which apparently has quite nice interfaces to lots of random scientific Fortran libs.
My understanding is that Fortran is used pretty heavily in highly parallelizable workflows instead of C (mainly because of the array ops, I think?). In a research lab I stayed in for a bit, there were people in the high performance computing corner doing Fortran stuff… sometimes. I don’t know the full extent of this effect though.
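A hedged sketch of the array-ops point: a whole-array statement has no loop index and, unlike a C pointer loop, the operands cannot alias, so the compiler is free to vectorize or parallelize it without further analysis.

```fortran
program arrays
  implicit none
  real :: a(1000), b(1000), c(1000)
  call random_number(a)
  call random_number(b)
  ! One statement, no explicit loop: the compiler knows there is no
  ! loop-carried dependence and no aliasing between a, b and c.
  c = a + 2.0 * b
  where (c > 1.0) c = 1.0   ! masked elementwise assignment
  print *, maxval(c)
end program arrays
```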
That is the theory - I would love to see some studies of how well this actually works. Dennis Ritchie was dubious.
I’ve been looking around for books, and I’ve found a few PDFs on lainchan, but most of them seem to be meh. Can someone recommend a good book on Fortran, maybe even coming from the perspective of a more “modern” programmer?
It’s still a thing to the extent Algol is still a thing in the form of C#/Java.
Modern Fortran has a passing resemblance to FORTRAN-IV, like C#/Java has a passing resemblance to Algol-60. In fact, modern Fortran looks even less like FORTRAN-IV than it looks like Algol-60.