State-of-the-art neural network verifiers demonstrate that combining the branch-and-bound (BaB) procedure with fast bounding techniques is key to tackling many challenging verification properties. In this work, we introduce the \emph{linear constraint-driven clipping} framework, a class of scalable and efficient methods for enhancing bound-propagation verifiers. Under this framework, we develop two novel algorithms that efficiently exploit constraints to 1) remove portions of the input space that are either already verified or irrelevant to a subdomain in the context of branch-and-bound, and 2) directly tighten intermediate bounds throughout the network. Unlike approaches that rely on off-the-shelf linear programming (LP) solvers, our procedure uses linear constraints that are readily available during verification, making it highly scalable. The resulting reduction tightens bounds globally and can significantly decrease the number of subproblems handled during BaB. Our clipping procedures can be incorporated simply and efficiently into BaB-based verifiers such as $\alpha,\beta$-CROWN, and are compatible with BaB procedures that split on either the input or the activation space. We demonstrate the effectiveness of our approach on a broad range of benchmarks: in some instances, we observe a 96\% reduction in the number of subproblems during branch-and-bound, and we achieve state-of-the-art verified accuracy across multiple benchmarks.