When modeling an application of practical relevance as an instance of a combinatorial problem X, we are often interested not merely in finding one optimal solution for that instance, but in finding a sufficiently diverse collection of good solutions. In this work we initiate a systematic study of diversity from the point of view of fixed-parameter tractability theory. We consider an intuitive notion of diversity of a collection of solutions which suits a large variety of combinatorial problems of practical interest. Our main contribution is an algorithmic framework which automatically converts a tree-decomposition-based dynamic programming algorithm for a given combinatorial problem X into a dynamic programming algorithm for the diverse version of X. Surprisingly, our algorithm has a polynomial dependence on the diversity parameter.
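For concreteness, here is a minimal sketch of one diversity measure of this kind, the sum of pairwise Hamming distances between solutions (taking this as the measure is an assumption here; solutions are represented as bit vectors, e.g. characteristic vectors of vertex sets):

```python
# Sketch: diversity of a collection of solutions as the sum of pairwise
# Hamming distances (an assumed measure, for illustration only).
from itertools import combinations

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def diversity(solutions):
    return sum(hamming(a, b) for a, b in combinations(solutions, 2))

# Three solutions of some instance, as characteristic vectors:
print(diversity([(1, 0, 1, 0), (1, 1, 0, 0), (0, 1, 1, 0)]))  # 6
```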
Constraint propagation and backtracking are two basic techniques for solving constraint satisfaction problems (CSPs). During the search for a solution, the variable-value pairs that do not belong to any solution can be discarded by constraint propagation to ensure generalized arc consistency and thus avoid fruitless search. However, on many CSPs constraint propagation is invoked frequently with little effect. Much effort has been devoted to predicting when to invoke constraint propagation when solving a CSP; however, no effective approach has been developed for the alldifferent constraint. Here we present a novel theorem for identifying the edges in the value graph of an alldifferent constraint whose removal can significantly reduce useless constraint propagation. We prove that if an alternating cycle exists for a prospectively removable edge that represents a variable-value assignment, the edge (and the assignment) can be discarded without constraint propagation. Based on this theorem, we develop a novel optimization technique for early detection of useless constraint propagation which can be incorporated into any existing algorithm for the alldifferent constraint. Our implementation of the new method achieved speedups of a factor of 1 to 5 over state-of-the-art approaches on 93 benchmark problem instances from 8 domains. Furthermore, the new algorithm scales well and runs increasingly faster than the existing methods on larger problems.
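To make the graph-theoretic setting concrete, here is a minimal sketch of the standard Régin-style value-graph machinery (not the paper's implementation): after orienting matched edges value-to-variable and unmatched edges variable-to-value, an unmatched edge lies on an alternating cycle exactly when its endpoints share a strongly connected component.

```python
# Sketch: value graph of alldifferent, a maximum matching, and detection of
# edges lying on alternating cycles via SCCs (illustrative only).
import networkx as nx
from networkx.algorithms.bipartite import hopcroft_karp_matching

domains = {"x1": {1, 2}, "x2": {1, 2}, "x3": {2, 3}}

G = nx.Graph()
for var, vals in domains.items():
    for v in vals:
        G.add_edge(var, ("val", v))  # tag values so the two sides stay distinct

matching = hopcroft_karp_matching(G, top_nodes=set(domains))

# Orient matched edges value -> variable, unmatched edges variable -> value.
D = nx.DiGraph()
for var, vals in domains.items():
    for v in vals:
        val = ("val", v)
        if matching.get(var) == val:
            D.add_edge(val, var)
        else:
            D.add_edge(var, val)

scc = {n: i for i, c in enumerate(nx.strongly_connected_components(D)) for n in c}

for var, vals in sorted(domains.items()):
    for v in sorted(vals):
        val = ("val", v)
        if matching.get(var) != val and scc[var] == scc[val]:
            print(f"edge ({var}, {v}) lies on an alternating cycle")
```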
Modern subgraph isomorphism solvers carry out sophisticated reasoning using graph invariants such as degree sequences and path counts. We show that all of this reasoning can be justified compactly using the cutting planes proofs studied in complexity theory. This allows us to extend a state-of-the-art subgraph isomorphism enumeration solver with proof logging support, so that the solutions it outputs may be audited and verified for correctness and completeness by a simple third-party tool which knows nothing about graph theory.
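As a point of reference (this worked example is ours, not the paper's), a cutting planes proof manipulates clauses as integer linear inequalities over 0/1 variables; for instance, one resolution step is simulated as follows:

```latex
% Simulating resolution of (x1 OR x2) and (x1 OR NOT x2) to derive x1.
\begin{align*}
  x_1 + x_2 \ge 1, \qquad x_1 + \bar{x}_2 \ge 1
      && \text{(clauses as 0/1 inequalities)} \\
  2x_1 + x_2 + \bar{x}_2 \ge 2
      && \text{(addition)} \\
  2x_1 \ge 1
      && \text{(since } x_2 + \bar{x}_2 = 1\text{)} \\
  x_1 \ge 1
      && \text{(division by 2 with rounding up)}
\end{align*}
```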
Satisfiability (SAT) and Maximum Satisfiability (MaxSAT) are two fundamental constraint problems with many important applications. SAT and MaxSAT are expressed in conjunctive normal form (CNF), which makes it difficult to deal with cardinality constraints. In this paper, we introduce Extended Conjunctive Normal Form (ECNF), which expresses cardinality constraints directly and does not need auxiliary variables or clauses. Then, we develop a simple and efficient local search solver LS-ECNF with a well-designed scoring function under ECNF. We also develop a generalized Unit Propagation (UP) based algorithm to generate the initial solution for local search. We encode instances from Nurse Rostering and Discrete Tomography problems into CNF with three different cardinality constraint encodings and into ECNF, respectively. Experimental results show that LS-ECNF has much better performance than state-of-the-art MaxSAT, SAT, Pseudo-Boolean and ILP solvers, which indicates that solving cardinality constraints with ECNF is promising.
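To illustrate the gap ECNF addresses (a generic illustration; ECNF's concrete syntax is the paper's), compare the clause blow-up of the naive CNF encoding of "at most k" with evaluating the cardinality constraint directly:

```python
# Sketch: the pairwise ("naive") CNF encoding of "at most k of n literals"
# needs C(n, k+1) clauses, while a cardinality-aware representation can just
# store the literals and the bound and evaluate them directly.
from itertools import combinations

def at_most_k_pairwise(lits, k):
    """CNF clauses forbidding every (k+1)-subset from being all true."""
    return [tuple(-l for l in subset) for subset in combinations(lits, k + 1)]

def at_most_k_holds(lits, k, assignment):
    """Direct evaluation of the cardinality constraint, no clauses needed."""
    return sum(assignment[l] for l in lits) <= k

lits = [1, 2, 3, 4, 5]
print(len(at_most_k_pairwise(lits, 2)))                          # 10 clauses
print(at_most_k_holds(lits, 2, {1: 1, 2: 1, 3: 0, 4: 0, 5: 0}))  # True
```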
Learning pseudo-Boolean (PB) constraints in PB solvers exploiting cutting-planes-based inference is not as well understood as clause learning in conflict-driven clause learning solvers. In this paper, we show that PB constraints derived using cutting planes may contain irrelevant literals, i.e., literals whose assigned values (whatever they are) never change the truth value of the constraint. Such literals may lead the solver to infer constraints that are weaker than they should be, impacting the size of the proof built by the solver, and thus also affecting its performance. This suggests that current implementations of PB solvers based on cutting planes should be reconsidered to prevent the generation of irrelevant literals. Indeed, detecting and removing irrelevant literals is too expensive in practice to be considered an option (the associated problem is NP-hard).
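A small worked example (ours, not from the paper): in 5x1 + x2 + x3 >= 5, the literals x2 and x3 are irrelevant, since the constraint's truth value is decided by x1 alone. A brute-force check makes the definition concrete, and its exponential cost hints at why detection is impractical in general:

```python
# Sketch: brute-force irrelevance check for one literal of a PB constraint
# a1*l1 + ... + an*ln >= d (illustrative; the general problem is NP-hard).
from itertools import product

def is_irrelevant(coeffs, degree, i):
    """True if literal i never changes the truth value of the constraint."""
    others = [c for j, c in enumerate(coeffs) if j != i]
    for bits in product((0, 1), repeat=len(others)):
        rest = sum(c * b for c, b in zip(others, bits))
        # Literal i matters only if setting it crosses the threshold.
        if (rest >= degree) != (rest + coeffs[i] >= degree):
            return False
    return True

# In 5*x1 + x2 + x3 >= 5: if x1 = 1 the constraint holds regardless of
# x2 and x3, and if x1 = 0 it fails regardless of them.
print([is_irrelevant([5, 1, 1], 5, i) for i in range(3)])  # [False, True, True]
```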
Constraint Satisfaction Problems (CSPs) are notoriously hard. Consequently, powerful decomposition methods have been developed to overcome this complexity. However, this poses the challenge of actually computing such a decomposition for a given CSP instance, and previous algorithms have shown their limitations in doing so. In this paper, we present a number of key algorithmic improvements and parallelisation techniques to compute so-called Generalized Hypertree Decompositions (GHDs) faster. We thus advance the ability to compute optimal (i.e., minimal-width) GHDs for a significantly wider range of CSP instances on modern machines. This lays the foundation for more systems and applications in evaluating CSPs and related problems (such as Conjunctive Query answering) based on their structural properties.
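For context, the standard definition of the object being computed (not specific to this paper), where H = (V, E) is the hypergraph whose vertices are the CSP's variables and whose hyperedges are the constraint scopes:

```latex
A GHD of $H=(V,E)$ is a triple $(T,\chi,\lambda)$ where $T$ is a tree and, for
each node $t$ of $T$, $\chi(t)\subseteq V$ and $\lambda(t)\subseteq E$, such that:
\begin{enumerate}
  \item for every $e\in E$ there is a node $t$ with $e\subseteq\chi(t)$;
  \item for every $v\in V$, $\{t : v\in\chi(t)\}$ induces a subtree of $T$;
  \item $\chi(t)\subseteq\bigcup\lambda(t)$ for every node $t$.
\end{enumerate}
The width of the GHD is $\max_t|\lambda(t)|$, and the generalized hypertree
width $\mathit{ghw}(H)$ is the minimum width over all GHDs of $H$.
```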
Solving a problem is an important part of optimization. An equally important part is the analysis of the solution, where several questions can arise. For a scheduling problem: is it possible to obtain a better solution by increasing the capacity of a resource? What happens to the objective value if we start a specific task earlier? Answering such questions is important to provide explanations and increase the acceptability of a solution. A lot of research has been done on sensitivity analysis, but few techniques can be applied to constraint programming. We present a new method for sensitivity analysis applied to constraint programming. It collects information during the search about the propagation of the CUMULATIVE constraint, the filtering of the variables, and the solution returned by the solver. Using machine learning algorithms, we predict whether increasing or decreasing the capacity of the cumulative resource allows a better solution. We also predict the impact on the objective value of forcing a task to finish earlier. We experimentally validate our method on the RCPSP.
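A minimal sketch of the kind of learned predictor involved, assuming hypothetical search-derived features and a scikit-learn classifier (the paper's actual features and models may differ):

```python
# Sketch: predicting whether extra resource capacity would improve the
# objective, from statistics collected during CP search (features invented
# here for illustration).
from sklearn.ensemble import RandomForestClassifier

# One row per solved instance: capacity, slack at the optimum,
# number of CUMULATIVE filterings, number of backtracks.
X_train = [
    [4, 0, 1200, 350],
    [4, 2,  400, 120],
    [6, 1,  800, 210],
]
# Label: did increasing the capacity by one improve the objective?
y_train = [1, 0, 1]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(clf.predict([[5, 0, 950, 300]]))  # predicted benefit of extra capacity
```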
Recently, several exact methods to compute decision trees have been introduced. On the one hand, these approaches can find optimal trees for various objective functions, including total size, depth, or accuracy on the training set. On the other hand, these methods are not yet widely used in practice, and classic heuristics are often still the methods of choice. In this paper we show how the SAT model proposed by [Narodytska et al. 2018] can be lifted to a MaxSAT approach, making it much more practically relevant. In particular, it scales to much larger data sets; the objective function can easily be adapted to take into account combinations of size, depth and accuracy on the training set; and the fine-grained control of the objective function it offers makes it particularly well suited for boosting. Our experiments show promising results. In particular, we show that the prediction quality of our approach often exceeds state-of-the-art heuristics. We also show that the MaxSAT formulation is well adapted for boosting using the well-known AdaBoost algorithm.
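For reference, the reweighting scheme of the standard discrete AdaBoost algorithm the abstract refers to, with the (here, MaxSAT-computed) tree learner abstracted behind a hypothetical fit_tree callback:

```python
# Sketch: discrete AdaBoost over an abstract weighted tree learner.
# fit_tree(X, y, w) returns a classifier h: x -> {-1, +1}; labels y in {-1, +1}.
import math

def adaboost(fit_tree, X, y, rounds):
    n = len(X)
    w = [1.0 / n] * n                     # uniform initial example weights
    ensemble = []
    for _ in range(rounds):
        h = fit_tree(X, y, w)             # e.g. a MaxSAT-optimal weighted tree
        err = sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-12))
        ensemble.append((alpha, h))
        # Up-weight misclassified examples, down-weight correct ones.
        w = [wi * math.exp(-alpha * yi * h(xi)) for wi, xi, yi in zip(w, X, y)]
        total = sum(w)
        w = [wi / total for wi in w]
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
```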
The Boolean satisfiability problem (SAT) is a famous NP-complete problem in computer science. An effective way of solving a satisfiable SAT problem is stochastic local search (SLS). However, in this method the initial assignment is chosen at random, which limits the effectiveness of SLS solvers. To address this problem, we propose NLocalSAT. NLocalSAT combines SLS with a solution prediction model, which boosts SLS by choosing initial assignments with a neural network. We evaluated NLocalSAT on five SLS solvers (CCAnr, Sparrow, CPSparrow, YalSAT, and probSAT) with instances from the random track of the SAT Competition 2018. The experimental results show that solvers with NLocalSAT achieve 27% to 62% improvement over the original SLS solvers.
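A minimal sketch of where such a learned initializer plugs into a generic SLS loop (predict_assignment is a hypothetical stand-in for NLocalSAT's network, and the flip heuristic here is deliberately naive):

```python
# Sketch: a generic SLS loop whose initial assignment may come from a model
# instead of random choice. Clauses are tuples of nonzero ints (DIMACS-style).
import random

def sls_solve(clauses, n_vars, max_flips, predict_assignment=None):
    if predict_assignment is not None:
        assign = predict_assignment(clauses, n_vars)          # learned init
    else:
        assign = [random.choice([False, True]) for _ in range(n_vars)]
    for _ in range(max_flips):
        unsat = [c for c in clauses
                 if not any((lit > 0) == assign[abs(lit) - 1] for lit in c)]
        if not unsat:
            return assign
        # Naive WalkSAT-style move: flip a random variable of a random
        # unsatisfied clause (real solvers use scoring functions here).
        v = abs(random.choice(random.choice(unsat))) - 1
        assign[v] = not assign[v]
    return None
```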
Constraint Satisfaction Problems (CSPs) are typically solved with Generalized Arc Consistency (GAC). A general CSP can also be encoded into a binary CSP and solved with Arc Consistency (AC). The well-known Hidden Variable Encoding (HVE) is still a state-of-the-art binary encoding for solving CSPs. We propose a new binary encoding, called Bipartite Encoding (BE), which uses the idea of partitioning constraints. A BE-encoded CSP can achieve a higher level of consistency than GAC on the original CSP. We give an algorithm for creating a compact bipartite encoding for non-binary CSPs. We present an AC propagator for the binary constraints from BE exploiting their special structure. Experiments on a large set of non-binary CSP benchmarks with table constraints, using the Wdeg, Activity and Impact heuristics, show that BE with our AC propagator can outperform existing state-of-the-art GAC algorithms (CT, STRbit) and binary encodings (HVE with HTAC).
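For background, a minimal sketch of the classic HVE construction that BE is compared against (BE itself is the paper's contribution and is not reproduced here): a non-binary table constraint becomes a "hidden" variable ranging over its allowed tuples, linked to each original variable by a binary constraint.

```python
# Sketch: Hidden Variable Encoding of one table constraint C(x, y, z).
table = [(0, 1, 2), (1, 0, 2), (2, 1, 0)]   # allowed tuples of C(x, y, z)

hidden_domain = list(range(len(table)))      # one hidden value per tuple
scope = ["x", "y", "z"]

def binary_support(i):
    """Binary constraint linking the hidden variable to scope[i]:
    the pairs (tuple index t, value of position i in tuple t)."""
    return {(t, table[t][i]) for t in hidden_domain}

for i, var in enumerate(scope):
    print(var, sorted(binary_support(i)))
```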
Exploiting dominance relations in many Constraint Optimization Problems can drastically speed up the solving process in practice. Identification and utilization of dominance relations, however, usually require human expertise. We present a theoretical framework for a useful class of constraint optimization problems to detect dominance automatically and formulate the generation of the associated dominance breaking nogoods as constraint satisfaction. By controlling the length and quantity of the nogoods, our method can generate dominance breaking nogoods of varying strengths. Experimentation confirms runtime improvements of up to three orders of magnitude against manual methods.
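A hand-derived example of the kind of nogood such a framework can generate automatically, here for a 0/1 knapsack (our illustration, not one of the paper's benchmarks): if item i is no heavier and no less valuable than item j, any solution taking j but not i can be transformed into one at least as good by swapping, so the assignment (x_j = 1, x_i = 0) may be forbidden.

```python
# Sketch: enumerating dominance-breaking nogoods for a 0/1 knapsack.
items = {"a": (3, 10), "b": (5, 8), "c": (5, 12)}  # name: (weight, value)

nogoods = []
for i, (wi, vi) in items.items():
    for j, (wj, vj) in items.items():
        # Item i dominates item j: lighter or equal, and at least as valuable
        # (ties on both excluded to avoid posting the pair twice).
        if i != j and wi <= wj and vi >= vj and (wi, vi) != (wj, vj):
            nogoods.append(f"not(x_{j} and not x_{i})")

print(nogoods)  # items a and c both dominate b, yielding two nogoods
```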