I used Herbie to analyze and improve several common algorithms in the Monte reference interpreter. Herbie is not a complete solution on its own, but it includes all of the important automatic parts. Afterwards, I wrote code which incorporates Herbie’s analysis.
Herbie is not a panacea. If the original algorithm is numerically unstable, Herbie can often stabilize it; but if the original algorithm is almost never precise to begin with (e.g. a truncated Taylor series), Herbie won't be able to improve on it much, if at all. Nonetheless, it's a wonderful tool.
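The quadratic formula is a classic example of the kind of instability Herbie can repair: when b*b is much larger than 4*a*c, the textbook formula cancels catastrophically. The rewrite below is the standard hand-derived fix (via Vieta's formulas), not Herbie's literal output, but it's the same style of transformation:

```python
import math

def quadratic_roots_naive(a, b, c):
    # Textbook formula: catastrophic cancellation when b*b >> 4*a*c,
    # because sqrt(disc) is then nearly equal to |b|.
    disc = math.sqrt(b * b - 4 * a * c)
    return ((-b + disc) / (2 * a), (-b - disc) / (2 * a))

def quadratic_roots_stable(a, b, c):
    # Rewritten form: compute the large-magnitude root first, then
    # derive the small root from the product of roots (c/a = x1*x2),
    # avoiding the subtraction of nearly equal quantities.
    disc = math.sqrt(b * b - 4 * a * c)
    q = -0.5 * (b + math.copysign(disc, b))
    return (q / a, c / q)
```

For a=1, b=1e8, c=1 the naive small root is off by roughly 25% in double precision, while the stable version is accurate to nearly full precision.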
One of the things I’d like to have is a way of tracking the quality of floating point operations over time, like some way of saying “okay, this data is the result of this many operations, so you need to recondition it or rerun it or something before you keep using it”. Anybody know how to track that?
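The crudest version of this I can imagine is to carry each value together with a running forward error bound that every operation widens. A hypothetical sketch (invented names, not a real library; tools like Herbgrind do something much more precise with shadow values):

```python
EPS = 2.0 ** -53  # unit roundoff for IEEE double precision

class Tracked:
    """A float paired with a running absolute error bound.

    Each operation propagates the operands' bounds and adds one
    rounding error of EPS * |result| for the operation itself.
    """
    def __init__(self, value, err=0.0):
        self.value = value
        self.err = err

    def __add__(self, other):
        v = self.value + other.value
        return Tracked(v, self.err + other.err + EPS * abs(v))

    def __sub__(self, other):
        v = self.value - other.value
        return Tracked(v, self.err + other.err + EPS * abs(v))

    def __mul__(self, other):
        v = self.value * other.value
        # |a|*err_b + |b|*err_a plus one rounding error
        # (the second-order err_a*err_b term is dropped)
        return Tracked(v, abs(self.value) * other.err
                          + abs(other.value) * self.err
                          + EPS * abs(v))
```

A bound like this flags exactly the “recondition or rerun” situation: after `(Tracked(1.0) + Tracked(1e-16)) - Tracked(1.0)` the value is 0.0 but the bound (~1.1e-16) exceeds the true answer, i.e. every significant digit is suspect. The bound is pessimistic, though, since it grows monotonically even when errors would cancel.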
The Valgrind-based version of this, Herbgrind, is interesting: http://herbgrind.ucsd.edu/using-herbgrind.html
I misread the domain name as “ulpwise” which I think would be a great name for a lab that was doing research into making floating point code better. :)
This is pretty amazing. Would it be feasible to integrate this into code editing environments, maybe even into language servers? Misuse of floating point values by programmers who don’t understand them is such a huge time-sink: everyone ends up hunting for the source of inaccuracies after the fact.