inc_pair is a really weird example to use to talk about the sins of abstraction. It doesn’t really seem all that abstract to me; I would use it purely as a mechanical way to reduce lines of code. This talk completely misses genuinely valuable aspects of abstraction, such as composability. Using abstractions that compose is great! It helps you build larger programs than you otherwise could. Maybe this is just what abstraction means if you’ve been slumming in C or C++ most of your life, but in the land of OCaml or Haskell there is real value in abstractions that can be combined to build new things. That this post doesn’t even touch on this (though a later comment does) makes me think the author is just on a different plane.
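To make the composability point concrete, here is a minimal OCaml sketch (the names `compose`, `inc`, and `double` are my own illustrations, not from the talk):

```ocaml
(* Small, reusable pieces that know nothing about each other... *)
let compose f g x = f (g x)

let inc x = x + 1
let double x = x * 2

(* ...combine into new behaviour without modifying any of the pieces. *)
let inc_then_double = compose double inc

let () = Printf.printf "%d\n" (inc_then_double 3)  (* prints 8 *)
```

The same idea scales well past function composition — functors, combinator libraries, and so on — which is what makes abstraction pay off in these languages rather than just saving lines.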
I think the author’s points do indeed work best as a commentary on language design, though I don’t believe they were meant that way in particular.
Perhaps this needs a c tag?
I suggest running any new formats or protocols by the LANGSEC people to ensure their tools can turn them into secure, predictable parsers. Modify if not.
In this case the authors have made an announcement there, and at least Tony Arcieri lurks in that community besides. See the (currently quite short) discussion thread here.
Good to know they’re already talking.
This probably ought to be merged with this post from a day or two ago.
You’re right — I looked for the link, not the title.
The paper itself was posted a few days ago.
The actual release announcement can be read here.
This was a very nice read. Grounded, wry, full of interesting information. Good find, @halosghost.
The article refers to a “Digital Media Primer for Geeks”, but the link doesn’t resolve. It can be found here, and some other interesting bits and bobs from the author here.
Unrelated to the actual study, but I enjoyed seeing all the different ways the same findings were headlined by various news outlets right next to one another.
Specifically, they’re considered unsafe because the authors' experience computing a freestart SHA-1 collision on Kraken (not me, a cluster named Kraken) led them to a cost estimate of US$173k to compute a real SHA-1 collision, which is cheap enough that e.g. the intelligence services of Iran, China, Russia, USA, or el Chapo Guzmán could probably compute one.
This comes as a big surprise to me because I wasn’t expecting this for several more years.
Substantially cheaper if you own a cluster already and don’t need to rent the time from Amazon.
I’m not sure the difference is substantial. Maybe a factor of two or three, or maybe sub-unity. People build their own clusters for lots of reasons and with widely varying effectiveness, and some existing clusters are actually more expensive to run than renting the time from Amazon.
led them to a cost estimate of US$173k to compute a real SHA-1 collision
I believe their estimate is actually $75k-120k. $173k is Bruce Schneier’s standing estimate, mentioned on that page for contrast.
Critically, that was Schneier’s estimate for the cost in 2018. IIRC he was predicting around $750k today.
That’s an order of magnitude less…
Oh, thank you for the correction!
The OCaml community seems fairly active to me. For example, this post from a few days ago links to a lot of recent work. Perhaps someone with a better view into that world can elaborate.
I agree. OCaml is alive and kicking with some ferocity. Mirage and Jane Street come instantly to mind as major community drivers. Upcoming OCaml versions include patches aimed at giving more precise control over the GC and making inlining as aggressive as Haskell’s. There was a recent massive rejiggering of OCaml metaprogramming techniques to be a bit friendlier, and js_of_ocaml seems exciting if a bit sparse.
I don’t think the ML standardization process is moving forward, no, but OCaml as a language and platform seems to be continuing to move.
One way of rephrasing it might be: the ML community outside OCaml is stagnant. There used to be a number of “branches” of the ML community with active projects, but nowadays I think SML/NJ is the only other one even making regular releases, and their releases are mostly maintenance-oriented.
I’m not too sure whether everyone in ML-land simply coalesced around OCaml, or whether the people who were members of other parts of the community went elsewhere. I recall that at least when I last had much contact with ML (~10 years ago), many of the SML people weren’t big fans of OCaml, so I would be at least slightly surprised if they ended up there. But some may have. Also, I think many of the PL researchers, who used to be a biggish part of the ML community, have moved to Haskell (with some exceptions, like some of the folks at CMU).
Poly/ML made a release in 2014. HOL is quite active and made a release in 2014, and Poly/ML is a recommended compiler; as I understand, you can’t even build HOL with SML/NJ. Isabelle is active, made a release in 2014, and recommends Poly/ML.
Yes, OCaml seems to be more lively. To be honest, I don’t consider it to belong to the group of MLs I was asking about.
It’s hard to explain, but OCaml always felt like something separate from the rest of the ML family.