Smart aleck answer: NASA couldn’t have used Julia because it has a garbage collector :P
More serious answer: Properly handling units in the type system is a lot more complicated than just writing subtypes and conversions. For example, you need to be able to express meters/second as the division of one unit by another, and then handle converting that into feet/hour.
Properly handling measurements is still an open research question. My two favorite resources are The Measurement Data Report and the Frink Programming Language (whose site seems to be down right now).
Sounds like exactly what the Haskell dimensional package gives you. It’s all checked at compile time, and it covers all physical units and all combinations of them. Since you can also choose the number type being used, you can choose a type with more precision than a Double, such as the Real type, which supports arbitrary precision for most operations, or even an expression type to give you exact equations for any calculation.
You’d find Dimensional Analysis in Programming Languages interesting, as it compares measurement solutions across multiple languages. So far F# is the only mainstream language that makes measurements part of the core language and not just a library.
That’s a great link, thanks for that, I’ll have to have a look at how this is handled in different languages.
Is there a particular benefit to this being a feature of the language over it being implementable within the language, with the problem then just becoming type checking? It doesn’t look like the F# language features add much over what’s possible with some of the Haskell examples listed there, apart from some slightly nicer syntax, and arguably the use of quasi-quotes in the uom-plugin doesn’t add any more noise than the F# syntax. In the past I guess error reporting would have been better with it as a language feature, but these days we have custom type errors in Haskell, so it’s relatively easy to bolt on really helpful type errors.
IMO the main benefit is that it’s given “official status”. Other features of the language won’t actively undermine it, all tooling is expected to support it, etc. To my understanding F# doesn’t do a whole lot with it, but something like e.g. Frink does a lot.
Also, better compiler support for optimizations. This was one of the big problems with adding unit support to ’90s Ada, though I’m not sure whether it’s still a problem.
TL;DW: Julia has a units-of-measurement library, like F#, Swift, and probably others.
This isn’t particularly novel - Boost.Units is 4 years older than the Julia language.
The video is referring to the Mars Climate Orbiter.
The specification called for SI units but the software was written to use US units.
The article Why the Mars Probe Went Off Course has a lot more information about the crash.
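For context on the failure mode: the ground software reported thruster impulse in pound-force seconds while the navigation software expected newton-seconds, so the numbers were silently off by a factor of about 4.45. A tiny hypothetical sketch in Python (the names and interface here are invented for illustration, not taken from the actual flight software) shows how tagging a value with its unit turns that silent error into a loud one:

```python
from typing import NamedTuple

# 1 lbf = 0.45359237 kg * 9.80665 m/s^2 = 4.4482216152605 N (exact by definition)
LBF_S_TO_N_S = 4.4482216152605

class Impulse(NamedTuple):
    value: float
    unit: str   # "N*s" or "lbf*s"

def to_newton_seconds(i: Impulse) -> float:
    """Normalize an impulse to SI; unknown units fail loudly instead of silently."""
    if i.unit == "N*s":
        return i.value
    if i.unit == "lbf*s":
        return i.value * LBF_S_TO_N_S
    raise ValueError(f"unknown impulse unit: {i.unit}")

ground_output = Impulse(100.0, "lbf*s")   # what the US-units software produced
print(to_newton_seconds(ground_output))   # ~444.8, not 100.0
```

Using the raw float directly as N·s is exactly the ~4.45x understatement that sent the orbiter too deep into the Martian atmosphere; carrying the unit alongside the number forces the conversion (or an error) at the interface.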
What’s special in Julia that you can’t achieve in other high level programming languages with types/objects?
Not sure. But it’s like asking what’s special about any language. Aren’t they all Turing complete? Why not use Brainfuck for everything? It’s Turing complete.
Well, you’re correct but also missing the point. The title makes me think that there’s some feature in Julia that would have caught this error ahead of time, which isn’t the case.
This is a really interesting line of reasoning/argument. In particular, I think that Julia is pretty special because it was designed specifically for scientific/numerical computing, which has influenced a variety of language decisions and certainly the ecosystem.