Why is that? Because computers are much faster, or better algorithms?

We have faster hardware and better algorithms now, yes. But the real reason is that early theoretical results, which emphasized worst-case performance, scared the entire field off even trying. Those complexity-class results are true, but misleading: as it turns out, the “average” SAT instance for many real-world problems probably is solvable. Only when this was recognized could we make progress on efficient SAT algorithms. Beware sound theories misapplied!

I saw SAT solvers as academically interesting but didn’t think they had many practical uses outside of other academic applications. … I have to say that modern SAT solvers are fast, neat and criminally underused by the industry.

Echoing a good comment on reddit: The author didn’t list any practical applications!!! How can you then say they are criminally underused?

The only one I know of is writing a versioned dependency solver for a package manager (in the style of Debian’s apt). However, very few people need to write such code.
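To make the package-manager connection concrete, here is a toy sketch of how version selection reduces to Boolean satisfiability. The package names and constraints are invented for illustration, and the brute-force search stands in for the CDCL algorithm a real solver would use:

```python
from itertools import product

# One Boolean variable per (package, version); True = install it.
# Hypothetical constraints: app 1.0 needs some lib >= 2.0, and
# tool 1.0 conflicts with lib 3.0.
VARS = ["app-1.0", "lib-2.0", "lib-3.0", "tool-1.0"]

# CNF clauses: each clause is a list of (variable, polarity) literals.
CLAUSES = [
    [("app-1.0", True)],                                          # want app
    [("tool-1.0", True)],                                         # want tool
    [("app-1.0", False), ("lib-2.0", True), ("lib-3.0", True)],   # app -> a lib
    [("lib-2.0", False), ("lib-3.0", False)],                     # one lib version
    [("tool-1.0", False), ("lib-3.0", False)],                    # tool vs lib 3.0
]

def solve(variables, clauses):
    """Brute-force SAT: try every assignment (fine for a toy example)."""
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(any(assignment[v] == pol for v, pol in clause)
               for clause in clauses):
            return assignment
    return None

model = solve(VARS, CLAUSES)
print(sorted(v for v, on in model.items() if on))
# → ['app-1.0', 'lib-2.0', 'tool-1.0']
```

The interesting part is that the conflict clause forces the solver to pick lib 2.0 even though lib 3.0 would also satisfy app’s dependency on its own.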

What are some other practical applications? I think they are all quite specialized and don’t come up in day-to-day programming. Happy to be proven wrong.

EDIT: I googled and found some interesting use cases, but they’re indeed specialized; not something I’ve done or seen my coworkers do:

They see some use in formal methods for model checking. The Alloy Analyzer converts Alloy specs to SAT problems, which is one of the reasons it’s such a fast solver.

I’ve used Z3 to verify that a certain optimized bitvector operation is equivalent to the obvious implementation of the intended calculation.

Just typed up the two variants as functions in the SMT language with the bitvector primitives and asked Z3 for the satisfiability of f(x) != g(x) and rejoiced when it said “unsatisfiable.”

(define-sort Word () (_ BitVec 64))
(define-fun zero () Word (_ bv0 64))

;; Signed addition can wrap if the signs of x and y are the same.
;; If both are positive and x + y < x, then overflow happened.
;; If both are negative and x + y > x, then underflow happened.
(define-fun add-overflow-basic ((x Word) (y Word)) Bool
  (or (and (bvslt x zero)
           (bvslt y zero)
           (bvsgt (bvadd x y) x))
      (and (bvsgt x zero)
           (bvsgt y zero)
           (bvslt (bvadd x y) x))))

;; Here is a clever way to calculate the same truth value,
;; from _Hacker's Delight_, section 2.13.
(define-fun add-overflow-clever ((x Word) (y Word)) Bool
  (bvslt (bvand (bvxor (bvadd x y) x)
                (bvxor (bvadd x y) y))
         zero))

(set-option :pp.bv-literals false)
(declare-const x Word)
(declare-const y Word)
(assert (not (= (add-overflow-basic x y)
                (add-overflow-clever x y))))
(check-sat)
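If you don’t have Z3 handy, the same identity can be sanity-checked by brute force at a smaller width. This Python sketch exhaustively checks all 65,536 pairs of 8-bit words; unlike the SMT query above, it is not a proof for 64 bits, just a quick confidence check:

```python
BITS = 8
MOD = 1 << BITS          # 256
SIGN = 1 << (BITS - 1)   # 128, the sign bit

def to_signed(u):
    """Interpret an unsigned 8-bit value as two's-complement."""
    return u - MOD if u & SIGN else u

def add_overflow_basic(x, y):
    """Overflow iff x and y have the same sign and the wrapped sum crosses x."""
    s = (x + y) % MOD
    sx, sy, ss = to_signed(x), to_signed(y), to_signed(s)
    return (sx < 0 and sy < 0 and ss > sx) or (sx > 0 and sy > 0 and ss < sx)

def add_overflow_clever(x, y):
    """Hacker's Delight: overflow iff ((x+y)^x) & ((x+y)^y) has the sign bit set."""
    s = (x + y) % MOD
    return bool((s ^ x) & (s ^ y) & SIGN)

assert all(add_overflow_basic(x, y) == add_overflow_clever(x, y)
           for x in range(MOD) for y in range(MOD))
print("equivalent on all 8-bit pairs")
```

The trick behind the clever version: the sum’s sign can differ from both operands’ signs only when a signed overflow occurred, so ANDing the two XORs isolates exactly that case in the sign bit.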

Here’s an example of SMT solvers used for that sort of thing; I added some more detail in the comments. You might also like some examples of Why3 code, which is translated for use with multiple solvers. Why3 is the main way people in the verification community use solvers, as far as I’m aware. WhyML is a nice intermediate language.

Keiran King and @raph compiled a SAT solver to WebAssembly to use for the “auto-complete” feature of Phil, a tool for making crossword puzzles (source).

This looks like it’s about linear programming, not SAT solving. They’re related for sure but one reason it would be nice to see some specific examples is to understand where each one is applicable!

I have personally used them when writing code to plan purchases from suppliers for industrial processes and to deal with shipping finished products out using the “best” carrier.

Cool, I’m surprised a SAT solver can do a Sudoku puzzle in just a millisecond or so. It’s got to be what, a couple thousand clauses? I guess it all collapses pretty quickly after a couple of values slot in.
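It’s a bit more than a couple thousand with the textbook encoding. A quick count, using one Boolean variable per (row, column, digit) and the minimal encoding (each cell gets at least one digit; each digit appears at most once per row, column, and box), before any given clues are added:

```python
from itertools import combinations

def sudoku_cnf():
    """Minimal CNF encoding of an empty 9x9 Sudoku.

    Variable v(r, c, d) is true iff cell (r, c) holds digit d.
    """
    def v(r, c, d):
        return r * 81 + c * 9 + d + 1   # DIMACS-style variables 1..729

    clauses = []
    # Each cell holds at least one digit: 81 clauses.
    for r in range(9):
        for c in range(9):
            clauses.append([v(r, c, d) for d in range(9)])
    # Each digit appears at most once per row, column, and 3x3 box.
    units = [[(r, c) for c in range(9)] for r in range(9)]            # rows
    units += [[(r, c) for r in range(9)] for c in range(9)]           # columns
    units += [[(3 * br + i, 3 * bc + j) for i in range(3) for j in range(3)]
              for br in range(3) for bc in range(3)]                  # boxes
    for unit in units:
        for d in range(9):
            for (r1, c1), (r2, c2) in combinations(unit, 2):
                clauses.append([-v(r1, c1, d), -v(r2, c2, d)])
    return clauses

print(len(sudoku_cnf()))   # → 8829 (81 + 27 units * 9 digits * 36 pairs)
```

So roughly nine thousand clauses over 729 variables, which is still tiny by modern solver standards.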

Yeah, one of the nice things about SAT solving is the efficient propagation of information: once a clause forces a value, the solver quickly updates all the clauses that were blocked by not knowing it.
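That step is usually called unit propagation: whenever a clause has all but one of its literals falsified, the remaining literal is forced, which can in turn reduce other clauses. A minimal sketch, with clauses as lists of signed integers (DIMACS-style, -n meaning "not n"):

```python
def unit_propagate(clauses):
    """Repeatedly assign literals forced by unit (one-literal) clauses.

    Returns (assignment, simplified clauses), or (None, None) on conflict.
    """
    assignment = {}
    while True:
        unit = next((c[0] for c in clauses if len(c) == 1), None)
        if unit is None:
            return assignment, clauses
        if assignment.get(abs(unit)) == (unit < 0):
            return None, None                  # contradicts an earlier assignment
        assignment[abs(unit)] = unit > 0
        new_clauses = []
        for clause in clauses:
            if unit in clause:
                continue                       # clause satisfied, drop it
            reduced = [lit for lit in clause if lit != -unit]
            if not reduced:
                return None, None              # empty clause: conflict
            new_clauses.append(reduced)
        clauses = new_clauses

# (x1) and (not x1 or x2) and (not x2 or x3) forces x1 = x2 = x3 = true.
model, rest = unit_propagate([[1], [-1, 2], [-2, 3]])
print(model, rest)   # → {1: True, 2: True, 3: True} []
```

In a real solver this loop is the hot path, so implementations use tricks like two-watched literals to avoid rescanning every clause on each assignment.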

Pretty close… the technical differences are mostly about the heuristics used. Here’s a nice paper about implementing a reasonably efficient toy SAT solver in Prolog:
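Independent of any particular paper, the classic DPLL loop at the heart of such a toy solver is small enough to sketch here (in Python rather than Prolog; clauses as lists of signed ints):

```python
def dpll(clauses, assignment=None):
    """Toy DPLL: unit-propagate, then branch on an unassigned literal."""
    assignment = dict(assignment or {})
    # Unit propagation: a one-literal clause forces that literal's value.
    unit = next((c[0] for c in clauses if len(c) == 1), None)
    while unit is not None:
        assignment[abs(unit)] = unit > 0
        clauses = [[l for l in c if l != -unit] for c in clauses
                   if unit not in c]
        unit = next((c[0] for c in clauses if len(c) == 1), None)
    if any(not c for c in clauses):
        return None                        # empty clause: conflict
    if not clauses:
        return assignment                  # every clause satisfied
    # Branch: pick a literal and try it both ways.
    lit = clauses[0][0]
    for choice in (lit, -lit):
        reduced = [[l for l in c if l != -choice] for c in clauses
                   if choice not in c]
        result = dpll(reduced, {**assignment, abs(choice): choice > 0})
        if result is not None:
            return result
    return None

print(dpll([[1, 2], [-1, 2], [-2, 3]]))   # → {1: True, 2: True, 3: True}
```

Modern CDCL solvers add conflict-driven clause learning, restarts, and branching heuristics on top of this skeleton, which is where most of the practical speed comes from.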

A while back, hwayne submitted this article on how SAT works, with code examples in Racket. It was one of the better ones I’ve seen. You might want to start with a primer on propositional logic first, though; there are lots of them to Google/DuckDuckGo for.


Some of the use cases I turned up:

https://www.quora.com/What-are-the-industrial-applications-of-all-SAT-solvers

http://www.carstensinz.de/talks/RISC-2005.pdf

I can think of a certain scheduling algorithm that might have used a SAT solver, but without details I can’t be sure.


There’s also this talk on analyzing floor plans.

A previous submission on SAT/SMT might help you answer that.


Hm this is interesting. Does this code happen to be open source?

I’ll just post it here. :)


I found this library (and its docs) interesting. They go over some practical examples.

https://developers.google.com/optimization/cp/


I have an intuition that a Prolog program works like a SAT solver; is this an accurate view?
