Maximal Quadratic-Free Sets
(2019)
The intersection cut paradigm is a powerful framework that facilitates the generation of valid linear inequalities, or cutting planes, for a potentially complex set S. The key ingredients in this construction are a simplicial conic relaxation of S and an S-free set: a convex zone whose interior does not intersect S. Ideally, such an S-free set would be inclusion-wise maximal, as it would generate a deeper cutting plane. However, maximality can be a challenging goal in general. In this work, we show how to construct maximal S-free sets when S is defined by a general quadratic inequality. Our maximal S-free sets are such that efficient separation of a vertex in LP-based approaches to quadratically constrained problems is guaranteed. To the best of our knowledge, this work is the first to provide maximal quadratic-free sets.
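As a rough illustration of the intersection cut recipe described in this abstract, the sketch below computes cut coefficients from a simplicial conic relaxation and an S-free set, using a Euclidean ball as a deliberately simple stand-in for the maximal quadratic-free sets constructed in the paper; all function names and the toy data are illustrative and not taken from the paper or any accompanying code.

import numpy as np

def ball_step_length(apex, ray, center, radius):
    # Largest t >= 0 with apex + t*ray still inside the S-free ball ||x - center|| <= radius.
    # The apex is assumed to lie in the ball's interior, so a positive root always exists.
    d = apex - center
    a = ray @ ray
    b = 2.0 * (ray @ d)
    c = d @ d - radius ** 2          # negative, since the apex is interior
    return (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

def intersection_cut(apex, rays, center, radius):
    # Coefficients alpha of the cut alpha @ s >= 1, valid for every point of the
    # simplicial cone x = apex + s @ rays, s >= 0, lying outside the ball's interior.
    steps = np.array([ball_step_length(apex, r, center, radius) for r in rays])
    return 1.0 / steps               # a ray that never leaves the S-free set would get coefficient 0

# Toy example in R^2: apex at the origin, coordinate rays, S-free ball around (0.2, 0.1).
apex = np.zeros(2)
rays = np.array([[1.0, 0.0], [0.0, 1.0]])
alpha = intersection_cut(apex, rays, center=np.array([0.2, 0.1]), radius=1.0)
print(alpha)  # the cut reads alpha[0]*s1 + alpha[1]*s2 >= 1 in the cone multipliers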
Deep Learning has received significant attention due to its impressive performance in many state-of-the-art learning tasks. Unfortunately, while very powerful, Deep Learning is not well understood theoretically; in particular, results on the complexity of training deep neural networks have only recently been obtained. In this work we show that large classes of deep neural networks with various architectures (e.g., DNNs, CNNs, Binary Neural Networks, and ResNets), activation functions (e.g., ReLUs and leaky ReLUs), and loss functions (e.g., hinge loss, Euclidean loss) can be trained to near optimality with a desired target accuracy using linear programming, in time that is exponential in the dimension of the input data and of the parameter space and polynomial in the size of the data set; improvements of the dependence on the input dimension are known to be unlikely assuming P≠NP, and improving the dependence on the parameter space dimension remains open. In particular, for a fixed network architecture we obtain polynomial-time training algorithms. Our work applies more broadly to empirical risk minimization problems, which allows us to generalize various previous results and obtain new complexity results for previously unstudied architectures in the proper learning setting.
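For context, and in generic notation that is not taken from the paper itself, the empirical risk minimization problem referred to above has the form

\min_{\theta \in \Theta} \; \frac{1}{n} \sum_{i=1}^{n} \ell\big( f(x_i; \theta),\, y_i \big),

where f(\cdot;\theta) is the network with parameters \theta, (x_1,y_1),\dots,(x_n,y_n) is the data set, and \ell is the loss function; the result above states that a near-optimal \theta can be found via linear programming in time polynomial in n but exponential in the dimensions of the data points and of \theta.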
The generation of strong linear inequalities for QCQPs has recently been tackled by a number of authors using the intersection cut paradigm, a highly studied tool in integer programming whose flexibility has triggered these renewed efforts in non-linear settings. In this work, we consider intersection cuts using the recently proposed construction of maximal quadratic-free sets. Using these sets, we derive closed-form formulas to compute intersection cuts, which allow for quick cut computations by simply plugging in parameters associated with an arbitrary quadratic inequality violated by a vertex of an LP relaxation. Additionally, we implement a cut-strengthening procedure that dates back to Glover and evaluate these techniques with extensive computational experiments.
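To convey the closed-form flavor of these cut computations in a heavily simplified setting, the sketch below computes the step length along a ray from an LP vertex to the boundary of a single quadratic constraint via the quadratic formula; the paper's actual formulas intersect rays with maximal quadratic-free sets rather than with the quadric itself, so the function below is purely illustrative.

import numpy as np

def quadratic_step_length(xbar, ray, A, b, c):
    # Smallest t > 0 at which xbar + t*ray hits the quadric
    # q(x) = x @ A @ x + b @ x + c = 0 (A symmetric); np.inf if the ray never reaches it.
    qa = ray @ A @ ray
    qb = 2.0 * (xbar @ A @ ray) + b @ ray
    qc = xbar @ A @ xbar + b @ xbar + c
    if abs(qa) < 1e-12:              # q is affine along this ray
        if abs(qb) < 1e-12:
            return np.inf
        t = -qc / qb
        return t if t > 0 else np.inf
    disc = qb * qb - 4.0 * qa * qc
    if disc < 0:
        return np.inf
    roots = ((-qb - np.sqrt(disc)) / (2.0 * qa), (-qb + np.sqrt(disc)) / (2.0 * qa))
    positive = [t for t in roots if t > 1e-12]
    return min(positive) if positive else np.inf

# Example: from the origin along ray (1, 0), the circle x1^2 + x2^2 - 4 = 0 is reached at t = 2.
print(quadratic_step_length(np.zeros(2), np.array([1.0, 0.0]), np.eye(2), np.zeros(2), -4.0))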
Using the recently proposed maximal quadratic-free sets and the well-known monoidal strengthening procedure, we show how to improve intersection cuts for quadratically constrained optimization problems by exploiting integrality requirements. We provide an explicit construction that allows an efficient implementation of the strengthened cuts, along with computational results showing their improvements over the standard intersection cuts. We also show that, in our setting, the lifting is unique, which implies that our strengthening procedure generates the best possible cut coefficients for the integer variables.
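For background, a simplified form of the classical Balas and Jeroslow monoidal strengthening idea (stated here with right-hand sides normalized to 1 and under the extra simplifying assumption that a^h s \ge 0 for every feasible s; the paper adapts this idea to intersection cuts from maximal quadratic-free sets, which is not reproduced here) reads as follows: if every feasible s \ge 0 satisfies the disjunction

\bigvee_{h=1}^{K} \Big( \sum_j a^h_j\, s_j \ge 1 \Big),

then \sum_j \big(\max_h a^h_j\big)\, s_j \ge 1 is a valid cut, and the coefficient of any integer-constrained variable s_j can be replaced by

\min_{m \in M} \; \max_{h} \big( a^h_j + m_h \big), \qquad M = \Big\{ m \in \mathbb{Z}^K : \sum_{h=1}^{K} m_h \ge 0 \Big\},

which is never larger than \max_h a^h_j, since m = 0 \in M.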
The most important ingredient for solving mixed-integer nonlinear programs (MINLPs) to global ϵ-optimality with spatial branch-and-bound is a tight, computationally tractable relaxation. Due to both theoretical and practical considerations, relaxations of MINLPs are usually required to be convex. Nonetheless, current optimization solvers can often successfully handle a moderate presence of nonconvexities, which opens the door for the use of potentially tighter nonconvex relaxations. In this work, we exploit this fact and make use of a nonconvex relaxation obtained via aggregation of constraints: a surrogate relaxation. These relaxations were actively studied for linear integer programs in the 1970s and 1980s, but they have scarcely been considered since. We revisit these relaxations in an MINLP setting and show the computational benefits and challenges they can have. Additionally, we study a generalization of such relaxations that allows for multiple aggregations simultaneously and present the first algorithm capable of computing the best set of aggregations. We propose a multitude of computational enhancements to improve its practical performance and evaluate the algorithm's ability to generate strong dual bounds through extensive computational experiments.
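To make the aggregation idea concrete in generic notation (a standard textbook formulation rather than this paper's notation, which also covers multiple simultaneous aggregations): for a problem \min \{ f(x) : g_i(x) \le 0,\ i = 1, \dots, m,\ x \in X \}, where X collects variable bounds and integrality requirements, and for any multipliers \lambda \in \mathbb{R}^m_{\ge 0}, the surrogate relaxation aggregates all constraints into a single one,

S(\lambda) \;=\; \min \Big\{ f(x) : \sum_{i=1}^{m} \lambda_i\, g_i(x) \le 0,\ x \in X \Big\},

so every S(\lambda) is a valid dual bound, and the surrogate dual \max_{\lambda \ge 0} S(\lambda) asks for the strongest such aggregation. Note that the aggregated constraint is in general nonconvex even when each g_i admits a convex relaxation, which is the nonconvexity this work deliberately retains.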