Cool, we need more practical uses of formal methods. It’s not so impractical if you do just the right amount of it.
For example, I have implemented a real-time control system by first creating a timed finite automaton, then proving the desired properties, then formally extracting a skeleton implementation, and finally filling in the small gaps in the methods.
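To make that workflow concrete, here is a toy sketch of the "timed automaton" part in Python - a hypothetical watchdog with one clock and a deadline guard, not the actual control system described above:

```python
from dataclasses import dataclass

# Toy timed automaton: a watchdog that must be pinged within DEADLINE
# time units, or it transitions to an error state. Purely illustrative.
DEADLINE = 5.0

@dataclass
class Watchdog:
    state: str = "waiting"  # "waiting" or "error"
    clock: float = 0.0      # time elapsed since the last ping

    def advance(self, dt: float) -> None:
        # Let time pass; the guard `clock > DEADLINE` forces the error state.
        self.clock += dt
        if self.state == "waiting" and self.clock > DEADLINE:
            self.state = "error"

    def ping(self) -> None:
        # A ping resets the clock, but only while we are still healthy.
        if self.state == "waiting":
            self.clock = 0.0

wd = Watchdog()
wd.advance(3.0)
wd.ping()        # ping arrives in time, clock resets to 0
wd.advance(4.0)
print(wd.state)  # -> waiting
wd.advance(2.0)
print(wd.state)  # -> error
```

In a tool like UPPAAL the deadline property would be proved over the model itself; in plain code like this it can only be tested.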
I would call Oil a middle ground … it’s not quite formal, but it is based on “exhaustive and relatively mathematical reasoning” – regular languages, algebraic data types, and state machines.
We still have some work to do on state machines, looking for help :-)
https://www.oilshell.org/blog/2022/01/notes-themes.html#a-model-of-the-runtime
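As an illustration of that style - a closed sum type plus a small state machine - here is a hypothetical toy tokenizer; this is not Oil's actual code, just the flavor of the approach:

```python
from dataclasses import dataclass
from typing import Union

# The token kinds form a small algebraic data type (a closed sum),
# so code consuming tokens can handle every case exhaustively.
@dataclass(frozen=True)
class Word:
    text: str

@dataclass(frozen=True)
class Operator:
    text: str

Token = Union[Word, Operator]

def tokenize(src: str) -> list[Token]:
    # A tiny state machine: accumulate word characters, emit on
    # whitespace or on one of the single-character operators.
    tokens: list[Token] = []
    buf = ""
    for ch in src:
        if ch in "|&;":
            if buf:
                tokens.append(Word(buf))
                buf = ""
            tokens.append(Operator(ch))
        elif ch == " ":
            if buf:
                tokens.append(Word(buf))
                buf = ""
        else:
            buf += ch
    if buf:
        tokens.append(Word(buf))
    return tokens

print(tokenize("ls | wc"))
# -> [Word(text='ls'), Operator(text='|'), Word(text='wc')]
```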
Middle grounds are great. I think formal verification is quite expensive, so most industries are reluctant to use it. But the alternative should not be abandoning it entirely; there are lightweight middle-ground methods, such as model checking.
That’s what I’ve been seeing the most recently: companies adopting lightweight formal methods on subsets of their systems. There seems to be some percentage of the system where the cost of FM is worth it, usually when something is high risk or high complexity and conventional quality methods just aren’t working.
I agree. My overarching goal is to find the best application of formal methods to my actual day job. That’s why I focus so heavily on things like the verification gap and making sure that a real, executable implementation is produced in anything that I experiment with.
This is my goal too. I actually worked in formal verification positions for some time (e.g. RTOS, railway control, etc.). But it’s pretty hard to find those kinds of positions without accepting much lower compensation. I guess this is because they are somehow related to hardware, which pays much less than software right now.
The trend is changing a bit, with Amazon and some other companies advertising formal verification positions. I never thought I would ever see ads mentioning Dafny or Agda! Nonetheless, the market is still tiny. I think there is room for small consultancies or industrial labs like Tweag or Galois to increase adoption.
It certainly seems like there’s a formal methods wave going on. Amazon specifically is doing great work there; even where they’re just doing property-based testing, the strategy comes from all of the classic formal verification ideas (refinement, etc.).
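A hand-rolled sketch of what that property-based testing looks like - generate random inputs and check a property, rather than prove it. (Real projects would use a library like Hypothesis; the run-length encoder here is just an illustrative stand-in.)

```python
import random

def rle_encode(s: str) -> list[tuple[str, int]]:
    # Run-length encode: "aab" -> [("a", 2), ("b", 1)]
    out: list[tuple[str, int]] = []
    for ch in s:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)
        else:
            out.append((ch, 1))
    return out

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    return "".join(ch * n for ch, n in pairs)

# The property: decoding an encoding gives back the original string.
random.seed(0)
for _ in range(1000):
    s = "".join(random.choice("ab") for _ in range(random.randrange(20)))
    assert rle_decode(rle_encode(s)) == s
print("1000 random cases passed")
```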
They’ve also been using the P language, which has a great story for integrating model checking into the general language workflow. Personally, this is where I think the sweet spot is - language-based tools that have some form of lightweight formal verification built in. It’s almost like guardrails: if you just follow the language conventions, you drastically reduce the cost of the verification effort.
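The lightweight end of that spectrum can be surprisingly small. Here is a hypothetical explicit-state reachability check in Python - the core idea behind model checkers like P's, though nothing like their actual implementations:

```python
from collections import deque

# Model: two processes that may enter a critical section only by taking
# a shared lock. A state is (lock_holder, (pc0, pc1)); the "bad" state
# is both processes in the critical section at once.

def successors(state):
    lock, pcs = state
    for i in (0, 1):
        if pcs[i] == "idle" and lock is None:
            # take the lock and enter the critical section
            yield (i, tuple("crit" if j == i else pcs[j] for j in (0, 1)))
        elif pcs[i] == "crit":
            # leave the critical section and release the lock
            yield (None, tuple("idle" if j == i else pcs[j] for j in (0, 1)))

def check(init):
    # Breadth-first search of every reachable state.
    seen, frontier = {init}, deque([init])
    while frontier:
        state = frontier.popleft()
        if state[1] == ("crit", "crit"):  # mutual exclusion violated
            return False
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True  # the bad state is unreachable

print(check((None, ("idle", "idle"))))  # -> True
```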
I had never heard of Smoosh - what an interesting project. It doesn’t look like they do any verification; they just specify the syntax and semantics of the POSIX shell. But I’m in favor of any project that uses formal specification - I think it’s one of the most underrated practices out there.
And thanks for mentioning Oil
Oh, no problem. I find your writing legitimately inspiring, particularly the middle-out post I linked here. In fact that’s one of the reasons I’m writing about language semantics - I’m playing around with a language-oriented approach to web applications.
That’s super neat. I had no idea you could export code from proofs in Isabelle/HOL. I checked the codegen doc you linked, and (unless the doc is outdated) it supports exporting to SML, OCaml, Haskell, and Scala.
Yea, the codegen functionality is really powerful. I haven’t scratched the surface of what it can do; from what I understand, you can customize it almost infinitely, as long as you prove that your changes preserve the semantics of the starting logic.
I’m not sure if that means we should favor customizing it to create more optimized exports, or just use the default and take the fuzzing approach, comparing its semantics to a hand-built implementation.
Probably the latter tbh, because verifying at the code level is still very cumbersome, especially when you care about performance.
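A sketch of that fuzzing approach: treat the slow extracted implementation as an oracle and compare the hand-optimized one against it on random inputs. (Both functions here are illustrative stand-ins, not anything actually extracted from Isabelle.)

```python
import random

def sum_of_squares_reference(xs: list[int]) -> int:
    # Stand-in for the trusted, extracted implementation:
    # a direct transcription of the definition.
    total = 0
    for x in xs:
        total += x * x
    return total

def sum_of_squares_fast(xs: list[int]) -> int:
    # Stand-in for the hand-optimized implementation under test.
    return sum(x * x for x in xs)

# Differential fuzzing: the two must agree on every random input.
random.seed(42)
for _ in range(1000):
    xs = [random.randrange(-100, 100) for _ in range(random.randrange(50))]
    assert sum_of_squares_fast(xs) == sum_of_squares_reference(xs)
print("implementations agree on 1000 random inputs")
```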
Hm cool I didn’t know this was the mechanism!
I’m pretty sure “Smoosh: the executable formal semantics for POSIX shell” uses this:
https://arxiv.org/abs/1907.05308
Oil has been using some of their tests since 2019! But using the semantics directly doesn’t seem feasible.
In some sense, we do the same thing, just with a different metalanguage. (And thanks for mentioning Oil :-) )
Cool, what tools did you use?
UPPAAL, SML and Isabelle.
In Agda you can compile the program itself, which uses either GHC or JS as the backend.
That’s definitely a benefit of dependent types - the code and specification are tied together. It’s also a downside, interestingly.