Learning Branching Heuristics for Propositional Model Counting

Authors: Pashootan Vaezipoor, Gil Lederman, Yuhuai Wu, Chris Maddison, Roger B. Grosse, Sanjit A. Seshia, Fahiem Bacchus

Propositional model counting, or #SAT, is the problem of computing the number of satisfying assignments of a Boolean formula. Many problems from different application areas, including many discrete probabilistic inference problems, can be translated into model counting problems to be solved by #SAT solvers. Exact #SAT solvers, however, often do not scale to industrial-size instances. In this paper, we present Neuro#, an approach for learning branching heuristics to improve the performance of exact #SAT solvers on instances from a given problem family. We show experimentally that our method reduces the step count on similarly distributed held-out instances and generalizes to much larger instances from the same family. It achieves these results on a number of problem families with very different structures. Beyond step-count reductions, Neuro# also attains orders-of-magnitude wall-clock speedups over the vanilla solver on larger instances in some problem families, despite the runtime overhead of querying the learned model.
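
As context for the abstract, here is a minimal sketch (not the authors' implementation) of exact #SAT via DPLL-style branching, showing where a branching heuristic like the one Neuro# learns would plug in. The function names (`count_models`, `simplify`, `first_var_heuristic`) and the DIMACS-style clause encoding are illustrative assumptions, not part of the paper.

```python
# Illustrative sketch of exact #SAT via DPLL-style branching.
# NOT the Neuro# implementation; names and encoding are assumptions.
# CNF encoding (DIMACS-style): a formula is a list of clauses, each a
# list of nonzero ints; positive = variable, negative = its negation.

def simplify(clauses, lit):
    """Assign `lit` True: drop satisfied clauses, strip falsified literals.
    Returns None if some clause becomes empty (a conflict)."""
    out = []
    for clause in clauses:
        if lit in clause:
            continue                          # clause satisfied, drop it
        reduced = [l for l in clause if l != -lit]
        if not reduced:
            return None                       # empty clause: conflict
        out.append(reduced)
    return out

def first_var_heuristic(clauses):
    """Naive static branching heuristic: pick the first variable seen.
    This is the component a learned policy would replace."""
    return abs(clauses[0][0])

def count_models(clauses, free_vars, choose_var=first_var_heuristic):
    """Exactly count satisfying assignments over the set `free_vars`."""
    if clauses is None:
        return 0                              # conflict: no models here
    if not clauses:
        return 2 ** len(free_vars)            # remaining vars unconstrained
    var = choose_var(clauses)                 # <-- branching decision point
    rest = free_vars - {var}
    return (count_models(simplify(clauses, var), rest, choose_var)
            + count_models(simplify(clauses, -var), rest, choose_var))

# Example: (x1 or x2) and (not x1 or x3) has 4 models over x1..x3.
if __name__ == "__main__":
    cnf = [[1, 2], [-1, 3]]
    print(count_models(cnf, {1, 2, 3}))       # -> 4
```

The branching choice matters because a poor decision can nearly double the work in the subtree below it. Per the abstract, Neuro# replaces a solver's hand-crafted branching rule with a learned policy, trading per-decision overhead from querying the model for a smaller search tree, which is why the paper reports both step counts and wall-clock time.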