The Library Search Engine: A Smart Solution for Integrating Resources Beyond Library Holdings
(2008)
The Cooperative Library Network Berlin-Brandenburg (KOBV), Germany, addresses the problem of how to integrate library holdings and resources found outside the library into a single discovery tool. It presents a solution that uses open-source technology to develop a next-generation catalog interface called the Library Search Engine. This pilot project was launched in 2007 with the library of the Albert Einstein Science Park, Potsdam. The idea was to design and develop a fast and convenient search tool that integrates local holdings (books, journals, journal articles) as well as relevant scientific subject information such as open access publications and bibliographies.
Decomposition of the high-dimensional conformational space of biomolecules into metastable subsets is used for data reduction of long molecular trajectories in order to facilitate chemical analysis and to improve the convergence of simulations within these subsets. The metastability is identified by the Perron-cluster cluster analysis of a Markov process that generates the thermodynamic distribution. A necessary prerequisite of this analysis is the discretization of the conformational space. A combinatorial approach via discretization of each degree of freedom leads to the so-called "curse of dimension". In this paper we analyze Hybrid Monte Carlo simulations of small, drug-like biomolecules and focus on the dihedral degrees of freedom as indicators of conformational changes. To avoid the "curse of dimension", the projection of the underlying Markov operator on each dihedral is analyzed according to its metastability. In each decomposition step of a recursive procedure, those significant dihedrals that indicate high metastability are used for further decomposition. The procedure is introduced as part of a hierarchical protocol of simulations at different temperatures. The convergence of simulations within metastable subsets is used as an "a posteriori" criterion for the successful identification of metastability. All results are presented with the visualization program AmiraMol.
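As a rough illustration of the projection idea described above, the sketch below ranks dihedral angles by the metastability of the Markov chain obtained from a one-dimensional discretization of each dihedral; the array layout, bin count, lag time, and symmetrized counts are illustrative assumptions, not the protocol used in the paper.

    # Sketch: rank dihedral angles by the metastability of the Markov chain
    # obtained by projecting a trajectory onto each single dihedral.
    # Assumptions: `traj` is an (n_frames, n_dihedrals) NumPy array of angles
    # in degrees; bin count, lag time, and symmetrization are illustrative.
    import numpy as np

    def transition_matrix(states, n_states, lag=1):
        """Row-stochastic transition matrix estimated from a discrete trajectory."""
        counts = np.zeros((n_states, n_states))
        for i, j in zip(states[:-lag], states[lag:]):
            counts[i, j] += 1.0
        counts += counts.T                       # symmetrize counts (reversibility)
        rows = counts.sum(axis=1, keepdims=True)
        rows[rows == 0.0] = 1.0
        return counts / rows

    def metastability_per_dihedral(traj, n_bins=12, lag=10):
        """Second-largest eigenvalue modulus per dihedral; values near 1 indicate
        a slowly mixing, i.e. metastable, degree of freedom."""
        scores = []
        edges = np.linspace(0.0, 360.0, n_bins + 1)
        for k in range(traj.shape[1]):
            states = np.digitize(traj[:, k] % 360.0, edges) - 1
            states = np.clip(states, 0, n_bins - 1)
            T = transition_matrix(states, n_bins, lag)
            eig = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]
            scores.append(float(eig[1]) if len(eig) > 1 else 0.0)
        return np.array(scores)

    # The most metastable dihedrals would then drive the next decomposition step,
    # e.g. np.argsort(-metastability_per_dihedral(traj))[:3].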
Two essential ingredients of modern mixed-integer programming (MIP) solvers are diving heuristics, which simulate a partial depth-first search in a branch-and-bound search tree, and conflict analysis of infeasible subproblems, which learns valid constraints. So far, these techniques have mostly been studied independently: primal heuristics under the aspect of finding high-quality feasible solutions early during the solving process, and conflict analysis for fathoming nodes of the search tree and improving the dual bound. Here, we combine both concepts in two different ways. First, we develop a diving heuristic that targets the generation of valid conflict constraints from the Farkas dual. We show that in the primal this is equivalent to the optimistic strategy of diving towards the best bound with respect to the objective function. Second, we use information derived from conflict analysis to enhance the search of a diving heuristic akin to classical coefficient diving. The computational performance of both methods is evaluated using an implementation in the open-source MIP solver SCIP. Experiments are carried out on publicly available test sets including MIPLIB 2010 and Cor@l.
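A minimal sketch of the generic diving loop that both heuristics build on, assuming a hypothetical solve_lp(bounds) helper that returns an LP-relaxation solution vector or None for an infeasible subproblem; the rounding rule shown is a simple fractionality rule, not the Farkas- or conflict-based scoring developed in the paper.

    # Generic MIP diving loop: repeatedly tighten the bound of one fractional
    # variable and re-solve the LP relaxation until the solution is integral or
    # the subproblem becomes infeasible (the point where conflict analysis would
    # learn a constraint).  `solve_lp` and `is_integer_var` are hypothetical
    # helpers, not part of any particular solver's API.
    import math

    def dive(solve_lp, is_integer_var, bounds, max_depth=50):
        for _ in range(max_depth):
            sol = solve_lp(bounds)
            if sol is None:                        # infeasible: hand over to conflict analysis
                return None
            fractional = [(j, x) for j, x in enumerate(sol)
                          if is_integer_var(j) and abs(x - round(x)) > 1e-6]
            if not fractional:                     # integral solution found
                return sol
            # round the variable whose LP value is closest to an integer
            j, x = min(fractional, key=lambda jx: abs(jx[1] - round(jx[1])))
            lo, hi = bounds[j]
            if x - math.floor(x) < 0.5:
                bounds[j] = (lo, math.floor(x))    # dive downwards
            else:
                bounds[j] = (math.ceil(x), hi)     # dive upwards
        return None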
A new and time-efficient model to evaluate the free energy of solvation has been developed. The solvation free energy is separated into an electrostatic term, a hydrogen bond term, and a rest term combining entropic and van der Waals effects. The electrostatic contribution is evaluated with a simplified boundary element method using the partial charges of the MMFF94 force field. The number of hydrogen bonds and the solvent-excluded surface area over the surface atoms are used in a linear model to estimate the non-electrostatic contribution. This model is applied to a set of 213 small and mostly organic molecules, yielding an RMSD of 0.87 kcal/mol and a correlation with experimental data of r = 0.951. The model is applied as a supplementary component of the free energy of binding to estimate binding constants of protein-ligand complexes. The intermolecular interaction energy is evaluated using the MMFF94 force field.
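The additive structure of such a model can be written down in a few lines; the sketch below is a toy version in which the coefficients are placeholders rather than the values fitted in the paper.

    # Toy sketch of an additive solvation free energy model: an electrostatic
    # term plus a linear model in the number of hydrogen bonds and the
    # solvent-excluded surface area.  Coefficients are placeholders, not the
    # fitted values from the paper.
    def solvation_free_energy(dg_electrostatic, n_hbonds, ses_area,
                              c_hb=-1.0, c_ses=0.005, c_0=0.0):
        """Energies in kcal/mol, ses_area in square Angstrom (illustrative units)."""
        return dg_electrostatic + c_hb * n_hbonds + c_ses * ses_area + c_0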
The analysis of infeasible subproblems plays an important role in solving mixed integer programs (MIPs) and is implemented in most major MIP solvers. There are two fundamentally different concepts to generate valid global constraints from infeasible subproblems. The first is to analyze the sequence of implications obtained by domain propagation that led to infeasibility. The result of the analysis is one or more sets of contradicting variable bounds from which so-called conflict constraints can be generated. This concept has its origin in solving satisfiability problems and is similarly used in constraint programming. The second concept is to analyze infeasible linear programming (LP) relaxations. The dual LP solution provides a set of multipliers that can be used to generate a single new globally valid linear constraint. The main contribution of this short paper is an empirical evaluation of two ways to combine both approaches. Experiments are carried out on general MIP instances from standard public test sets such as MIPLIB 2010; the presented algorithms have been implemented within the non-commercial MIP solver SCIP. Moreover, we present a pool-based approach to managing conflicts which better reflects the way a MIP solver traverses the search tree than the aging strategies known from SAT solving.
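In symbols: if the LP relaxation of a subproblem with constraint rows Ax >= b is infeasible under the local bounds l <= x <= u, a dual ray (Farkas certificate) y >= 0 aggregates the rows into a single inequality that is valid for the original problem but cannot be satisfied within the local bounds, which certifies the infeasibility:

    \[
      y^\top A\,x \;\ge\; y^\top b, \qquad y \ge 0, \qquad
      \max_{\ell \le x \le u} \, y^\top A\,x \;<\; y^\top b .
    \]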
The analysis of infeasible subproblems plays an important role in solving mixed integer programs (MIPs) and is implemented in most major MIP solvers. There are two fundamentally different concepts to generate valid global constraints from infeasible subproblems. The first is to analyze the sequence of implications, obtained by domain propagation, that led to infeasibility. The result of this analysis is one or more sets of contradicting variable bounds from which so-called conflict constraints can be generated. This concept is called conflict graph analysis; it has its origin in solving satisfiability problems and is similarly used in constraint programming. The second concept is to analyze infeasible linear programming (LP) relaxations. Every ray of the dual LP provides a set of multipliers that can be used to generate a single new globally valid linear constraint. This method is called dual proof analysis. The main contribution of this paper is twofold. Firstly, we present three enhancements of dual proof analysis: presolving via variable cancellation, strengthening by applying mixed integer rounding functions, and a filtering mechanism. Furthermore, we provide an extensive computational study evaluating the impact of each presented component of dual proof analysis. Secondly, this paper presents the first integrated approach that uses both conflict graph and dual proof analysis simultaneously within a single MIP solution process. All experiments are carried out on general MIP instances from the standard public test set MIPLIB 2017; the presented algorithms have been implemented within the non-commercial MIP solver SCIP and the commercial MIP solver FICO Xpress.
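For reference, a textbook form of the mixed integer rounding (MIR) inequality for a constraint sum_j a_j x_j <= b over nonnegative integer variables with fractional right-hand side; the strengthening in the paper applies a variant of this function to dual proofs in >=-form.

    \[
      \sum_j \Big( \lfloor a_j \rfloor + \frac{\max(f_j - f_0,\, 0)}{1 - f_0} \Big) x_j \;\le\; \lfloor b \rfloor,
      \qquad f_j = a_j - \lfloor a_j \rfloor, \quad f_0 = b - \lfloor b \rfloor > 0 .
    \]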
Branching rules revisited
(2004)
Mixed integer programs are commonly solved with linear programming based branch-and-bound algorithms. The success of the algorithm strongly depends on the strategy used to select the variable to branch on. We present a new generalization, called reliability branching, of today's state-of-the-art strong branching and pseudocost branching strategies for linear programming based branch-and-bound algorithms. After reviewing commonly used branching strategies and performing extensive computational studies, we compare different parameter settings and show the superiority of our proposed new strategy.
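A compact sketch of the reliability branching rule, assuming hypothetical helpers strong_branching_score, pseudocost_score, and num_branchings; the reliability parameter eta_rel below is an illustrative default, not the tuned value from the computational study.

    # Sketch of reliability branching: use (expensive) strong branching for a
    # candidate only while its pseudocosts are still unreliable, i.e. based on
    # fewer than eta_rel observed branchings; otherwise trust the (cheap)
    # pseudocost estimate.  The three helper functions are hypothetical, not a
    # specific solver's API.
    def select_branching_variable(candidates, strong_branching_score,
                                  pseudocost_score, num_branchings, eta_rel=8):
        best_var, best_score = None, float("-inf")
        for var in candidates:
            if num_branchings(var) < eta_rel:        # pseudocosts not yet reliable
                score = strong_branching_score(var)  # expensive, but also updates pseudocosts
            else:
                score = pseudocost_score(var)
            if score > best_score:
                best_var, best_score = var, score
        return best_var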
In this paper, we introduce the Maximum Diversity Assortment Selection Problem (MADASS), a generalization of the 2-dimensional Cutting Stock Problem (2CSP). Given a set of rectangles and a rectangular container, the goal of 2CSP is to determine a subset of rectangles that can be placed in the container without overlapping, i.e., a feasible assortment, such that a maximum area is covered. In MADASS, we need to determine a set of feasible assortments, each of them covering a certain minimum threshold of the container, such that the diversity among them is maximized. Here, diversity is defined as the minimum or average normalized Hamming distance over all pairs of assortments. The MADASS Problem was used in the 11th AIMMS-MOPTA Competition in 2019; the methods we describe in this article, together with the computational results, won the contest.
In the following, we give a definition of the problem, introduce a mathematical model and solution approaches, determine upper bounds on the diversity, and conclude with computational experiments conducted on test instances derived from the 2CSP literature.
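As a small illustration of the diversity measure, the sketch below assumes each assortment is encoded as a 0/1 vector indicating which rectangles it uses and evaluates the minimum or average normalized Hamming distance over all pairs.

    # Diversity of a set of assortments, each encoded as a 0/1 vector over the
    # rectangles: minimum (default) or average normalized Hamming distance over
    # all pairs.
    from itertools import combinations

    def normalized_hamming(a, b):
        return sum(x != y for x, y in zip(a, b)) / len(a)

    def diversity(assortments, aggregate=min):
        """Use aggregate=min for min-diversity or
        aggregate=lambda d: sum(d) / len(d) for average diversity."""
        dists = [normalized_hamming(a, b) for a, b in combinations(assortments, 2)]
        return aggregate(dists)

    # Example: diversity([[1, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]]) == 0.5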
Jahresbericht 2008
(2009)
Conflict learning algorithms are an important component of modern MIP and CP solvers, but strong conflict information is typically gained by depth-first search. While this is the natural search mode for CP solving, it is not for MIP solving. Rapid Learning is a hybrid CP/MIP approach in which a CP search is applied at the root node to learn information that supports the remaining MIP solve. This has been demonstrated to be beneficial for binary programs. In this paper, we extend the idea of Rapid Learning to general integer programs, where not all variables are restricted to the domain {0, 1}, and rather than just running a rapid CP search at the root, we apply it repeatedly at local nodes within the MIP search tree. To do so efficiently, we present six heuristic criteria to predict whether local Rapid Learning is likely to be successful. Our computational experiments indicate that our extended Rapid Learning algorithm significantly speeds up MIP search and is particularly beneficial on highly dual degenerate problems.
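A rough sketch of the local Rapid Learning step, assuming hypothetical helpers cp_probing_search (a node-limited depth-first CP-style search returning learned conflicts and possibly an incumbent) and looks_promising (standing in for the six heuristic criteria, which are not reproduced here).

    # Rough sketch of local Rapid Learning: before continuing the MIP search at a
    # node, run a cheap node-limited depth-first CP-style probing search and feed
    # whatever it learned (conflict constraints, feasible solutions) back into the
    # MIP solve.  All helpers are hypothetical, not a specific solver's API.
    def rapid_learning_at_node(node, mip, cp_probing_search, looks_promising,
                               node_limit=500):
        if not looks_promising(node):             # cheap prediction of success
            return
        result = cp_probing_search(node, node_limit=node_limit)
        for conflict in result.conflicts:         # learned conflict constraints
            mip.add_constraint(conflict)
        if result.incumbent is not None:          # probing may find a feasible solution
            mip.update_incumbent(result.incumbent)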