TY - GEN
A1 - Hiller, Benjamin
A1 - Vredeveld, Tjark
T1 - Probabilistic alternatives for competitive analysis
N2 - In the last 20 years, competitive analysis has become the main tool for analyzing the quality of online algorithms. Despite this, competitive analysis has also been criticized: it sometimes cannot discriminate between algorithms that exhibit significantly different empirical behavior, or it even favors an algorithm that is worse from an empirical point of view. Therefore, several approaches have been proposed to circumvent these drawbacks. In this survey, we discuss probabilistic alternatives for competitive analysis.
T3 - ZIB-Report - 11-55
KW - online algorithms
KW - probabilistic analysis
KW - competitive analysis
KW - survey
Y1 - 2012
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:0297-zib-15131
SN - 1438-0064
ER -

TY - GEN
A1 - Kaplan, Bernhard
A1 - Laufer, Jan
A1 - Prohaska, Steffen
A1 - Buchmann, Jens
T1 - Monte-Carlo-based inversion scheme for 3D quantitative photoacoustic tomography
N2 - The goal of quantitative photoacoustic tomography (qPAT) is to recover maps of the chromophore distributions from multiwavelength images of the initial pressure. Model-based inversions that incorporate the physical processes underlying the photoacoustic (PA) signal generation represent a promising approach. Monte-Carlo models of the light transport are computationally expensive, but provide accurate predictions of the fluence distribution, especially in the ballistic and quasi-ballistic regimes. Here, we focus on the inverse problem of 3D qPAT of blood oxygenation and investigate the application of the Monte-Carlo method in a model-based inversion scheme. A forward model combining light transport based on the MCX simulator with acoustic propagation modeled by the k-Wave toolbox was used to generate a PA image data set as acquired in a tissue phantom over a planar detection geometry. The combination of the optical and acoustic models is shown to account for limited-view artifacts. In addition, the errors in the fluence due to, for example, partial volume artifacts and absorbers immediately adjacent to the region of interest are investigated. To accomplish large-scale inversions in 3D, the number of degrees of freedom is reduced by applying image segmentation to the initial pressure distribution to extract a limited number of regions with homogeneous optical parameters. The absorber concentrations in the tissue phantom were estimated using a coordinate-descent parameter search based on the comparison between measured and modeled PA spectra. The relative concentrations estimated with this approach lie within 5% of the known concentrations. Finally, we discuss the feasibility of this approach for recovering the blood oxygenation from experimental data.
T3 - ZIB-Report - 17-04
KW - quantitative photoacoustic tomography
KW - model-based inversion
KW - oxygen saturation
KW - chromophore concentration
KW - photoacoustic imaging
KW - Monte Carlo methods for light transport
KW - boundary conditions
KW - coordinate search
Y1 - 2017
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:0297-zib-62318
SN - 1438-0064
ER -
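A note on the coordinate search mentioned in the preceding record (ZIB-Report 17-04): the concentrations are estimated by minimizing the mismatch between measured and modeled PA spectra one parameter at a time. The following is a minimal illustrative sketch, not code from the report; it assumes a simplified linear absorption model (signal = E @ c) in place of the report's Monte-Carlo fluence model, and all names in it are hypothetical.

```python
import numpy as np

def coordinate_search(E, measured, c0, step=0.1, tol=1e-8, max_iter=200):
    """Minimize ||E @ c - measured||^2 by cyclic coordinate search.

    E[i, j]: assumed molar absorption of chromophore j at wavelength i
    (a linearity simplification; the report models the fluence with
    Monte-Carlo simulations instead).
    """
    c = np.array(c0, dtype=float)
    cost = lambda x: np.sum((E @ x - measured) ** 2)
    best = cost(c)
    for _ in range(max_iter):
        improved = False
        for j in range(len(c)):           # one concentration at a time
            for delta in (+step, -step):
                trial = c.copy()
                trial[j] = max(0.0, trial[j] + delta)  # concentrations stay >= 0
                if (t := cost(trial)) < best:
                    best, c, improved = t, trial, True
        if not improved:
            step *= 0.5                   # refine the search grid
            if step < tol:
                break
    return c

# Toy example: two chromophores, three wavelengths, noiseless "measurement".
E = np.array([[1.0, 0.2], [0.5, 0.8], [0.3, 1.1]])
true_c = np.array([0.7, 0.4])
print(coordinate_search(E, E @ true_c, c0=[1.0, 1.0]))  # ~ [0.7, 0.4]
```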
TY - GEN
A1 - Quer, Jannes
A1 - Donati, Luca
A1 - Keller, Bettina
A1 - Weber, Marcus
T1 - An automatic adaptive importance sampling algorithm for molecular dynamics in reaction coordinates
N2 - In this article, we propose an adaptive importance sampling scheme for dynamical quantities of high-dimensional, metastable complex systems. The main idea is to combine Metadynamics, a method from molecular dynamics simulation, with Girsanov's theorem from stochastic analysis. The proposed algorithm has two advantages over a standard estimator of dynamical quantities: first, it can produce estimators with lower variance, and second, it can speed up the sampling. One of the main problems in building importance sampling schemes for metastable systems is finding the metastable region in order to manipulate the potential accordingly. Our method circumvents this problem by using an adapted version of the Metadynamics algorithm, and thus creates a non-equilibrium dynamics that is used to sample the equilibrium quantities.
T3 - ZIB-Report - 17-09
KW - Adaptive Importance Sampling
KW - Molecular Dynamics
KW - Metastability
KW - Variance Reduction
KW - Non Equilibrium Sampling
KW - Metadynamics
KW - Girsanov
Y1 - 2017
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:0297-zib-62075
SN - 1438-0064
ER -

TY - GEN
A1 - Baum, Daniel
A1 - Lindow, Norbert
A1 - Hege, Hans-Christian
A1 - Lepper, Verena
A1 - Siopi, Tzulia
A1 - Kutz, Frank
A1 - Mahlow, Kristin
A1 - Mahnke, Heinz-Eberhard
T1 - Revealing hidden text in rolled and folded papyri
N2 - Ancient Egyptian papyri are often folded, rolled up, or kept as small packages, sometimes even sealed. Physically unrolling or unfolding these packages might severely damage them. We demonstrate a way to gain access to the hidden script without physical unfolding by employing computed tomography and mathematical algorithms for virtual unrolling and unfolding. Our algorithmic approaches are combined with manual interaction. This provides the necessary flexibility to enable the unfolding of even complicated and partly damaged papyrus packages. In addition, it allows us to cope with challenges posed by the structure of ancient papyrus, which is rather irregular compared to other writing substrates such as metallic foils or parchment. Unfolding of packages is done in two stages. In the first stage, we virtually invert the physical folding process step by step until the partially unfolded package is topologically equivalent to a scroll or to a papyrus sheet folded along only one fold line. To minimize distortions at this stage, we apply the method of moving least squares. In the second stage, the papyrus is simply flattened, which requires the definition of a medial surface. We have applied our software framework to several papyri. In this work, we present the results of applying our approaches to mockup papyri that were either rolled or folded along perpendicular fold lines. In the case of the folded papyrus, our approach represents the first attempt to address the unfolding of such complicated folds.
T3 - ZIB-Report - 17-02
KW - unfolding, papyri, computed tomography
Y1 - 2017
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:0297-zib-61826
SN - 1438-0064
ER -

TY - GEN
A1 - Shinano, Yuji
A1 - Achterberg, Tobias
A1 - Berthold, Timo
A1 - Heinz, Stefan
A1 - Koch, Thorsten
A1 - Winkler, Michael
T1 - Solving Open MIP Instances with ParaSCIP on Supercomputers using up to 80,000 Cores
N2 - This paper describes how we solved 12 previously unsolved mixed-integer programming (MIP) instances from the MIPLIB benchmark sets. To achieve these results, we used an enhanced version of ParaSCIP, setting a new record for the largest-scale MIP computation: up to 80,000 cores in parallel on the Titan supercomputer. In this paper, we describe the basic parallelization mechanism of ParaSCIP, improvements to its dynamic load balancing, and novel techniques for exploiting the power of parallelization in MIP solving. We give a detailed overview of computing times and statistics for solving the open MIPLIB instances.
T3 - ZIB-Report - 15-53
KW - Mixed Integer Programming
KW - Parallel processing
KW - Node merging
KW - Racing ParaSCIP
KW - Ubiquity Generator Framework
KW - MIPLIB
Y1 - 2015
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:0297-zib-56404
SN - 1438-0064
ER -
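The preceding record (ZIB-Report 15-53) and the following one both rest on parallel branch-and-bound. For orientation, here is a minimal sketch of the sequential best-first branch-and-bound skeleton that frameworks like UG/ParaSCIP parallelize, written as a toy 0/1 knapsack solver; it is purely illustrative, not code from any cited report. In a UG-style setup, the open subproblems held in `frontier` would be distributed across workers instead of processed in one loop.

```python
import heapq

def knapsack_branch_and_bound(values, weights, capacity):
    """Best-first branch-and-bound for 0/1 knapsack (illustrative sketch)."""
    n = len(values)
    # Sort items by value density; used by the fractional (LP-style) bound.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)

    def upper_bound(idx, value, weight):
        # Greedy fractional relaxation gives a valid upper bound.
        for i in order[idx:]:
            if weight + weights[i] <= capacity:
                weight += weights[i]
                value += values[i]
            else:
                return value + values[i] * (capacity - weight) / weights[i]
        return value

    best = 0
    # Max-heap on the bound: each entry is one open subproblem (node).
    frontier = [(-upper_bound(0, 0, 0), 0, 0, 0)]  # (-bound, depth, value, weight)
    while frontier:
        neg_bound, idx, value, weight = heapq.heappop(frontier)
        if -neg_bound <= best or idx == n:
            continue  # prune: bound cannot beat the incumbent
        i = order[idx]
        # Branch 1: include item i (if feasible), possibly updating the incumbent.
        if weight + weights[i] <= capacity:
            v, w = value + values[i], weight + weights[i]
            best = max(best, v)
            heapq.heappush(frontier, (-upper_bound(idx + 1, v, w), idx + 1, v, w))
        # Branch 2: exclude item i.
        heapq.heappush(frontier,
                       (-upper_bound(idx + 1, value, weight), idx + 1, value, weight))
    return best

print(knapsack_branch_and_bound([60, 100, 120], [10, 20, 30], 50))  # -> 220
```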
TY - GEN
A1 - Fujii, Koichi
A1 - Ito, Naoki
A1 - Kim, Sunyoung
A1 - Kojima, Masakazu
A1 - Shinano, Yuji
A1 - Toh, Kim-Chuan
T1 - Solving Challenging Large Scale QAPs
N2 - We report our progress on the project for solving large-scale quadratic assignment problems (QAPs). Our main approach to solving large-scale NP-hard combinatorial optimization problems such as QAPs is a parallel branch-and-bound method efficiently implemented on a powerful computer system using the Ubiquity Generator (UG) framework, which can utilize more than 100,000 cores. Lower bounding procedures incorporated in the branch-and-bound method play a crucial role in solving the problems. For a strong lower bounding procedure, we employ the Lagrangian doubly nonnegative (DNN) relaxation and the Newton-bracketing method developed by the authors’ group. In this report, we describe some basic tools used in the project, including the lower bounding procedure and branching rules, and present some preliminary numerical results. Having succeeded in solving tai30a and sko42 from QAPLIB for the first time, our next target is QAPs of dimension at least 50.
T3 - ZIB-Report - 21-02
KW - QAP
KW - Parallel Branch-and-Bound
Y1 - 2021
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:0297-zib-81303
SN - 1438-0064
ER -

TY - GEN
A1 - Schäfer, Patrick
T1 - Bag-Of-SFA-Symbols in Vector Space (BOSS VS)
N2 - Time series classification mimics the human understanding of similarity. When it comes to larger datasets, state-of-the-art classifiers reach their limits in terms of unreasonably long training or testing times. One representative example is the 1-nearest-neighbor DTW classifier (1-NN DTW), which is commonly used as the benchmark to compare against and has several shortcomings: it has quadratic time complexity and it degenerates in the presence of noise. To reduce the computational complexity, lower-bounding techniques and, more recently, a nearest-centroid classifier have been introduced. Still, execution times for classifying moderately sized datasets on a single core are on the order of hours. We present our Bag-Of-SFA-Symbols in Vector Space (BOSS VS) classifier, which is robust and accurate due to its invariance to noise, phase shifts, offsets, amplitudes, and occlusions. We show that it is as accurate as, yet multiple orders of magnitude faster than, state-of-the-art classifiers. BOSS VS thereby allows for mining massive time series datasets and for real-time analytics.
T3 - ZIB-Report - 15-30
KW - Time Series
KW - Classification
KW - Data Mining
Y1 - 2015
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:0297-zib-54984
SN - 1438-0064
ER -
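For context on the preceding record (ZIB-Report 15-30): the 1-NN DTW baseline it benchmarks against owes its cost to a quadratic-time dynamic program. A minimal sketch of unconstrained DTW in plain Python follows (no lower bounding and no warping-window constraint, both of which practical implementations add); it illustrates the baseline, not the BOSS VS classifier itself.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance; O(len(a) * len(b)) time and memory."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = cheapest warping path aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m] ** 0.5

# Two series with the same shape but different pacing align at zero cost.
print(dtw_distance([0, 1, 2, 1], [0, 0, 1, 2, 2, 1]))  # -> 0.0
```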
TY - GEN
A1 - Hasler, Tim
A1 - Peters-Kottig, Wolfgang
T1 - Vorschrift oder Thunfisch? – Zur Langzeitverfügbarkeit von Forschungsdaten
N2 - „Ich mache ihm ein Angebot, das er nicht ablehnen kann.” Diese Aussage aus einem gänzlich anderen Kontext lässt sich recht treffend übertragen als Wunsch von Dienstleistern und Zweck von Dienstleistungen für Datenproduzenten im Forschungsdatenmanagement. Zwar wirkt Druck zur Datenübergabe nicht förderlich, die Eröffnung einer Option aber sehr wohl. Im vorliegenden Artikel geht es um das Verständnis der Nachhaltigkeit von Forschung und ihren Daten anhand der Erkenntnisse und Erfahrungen aus der ersten Phase des DFG-Projekts EWIG. [Fn 01] Eine Auswahl von Fallstricken beim Forschungsdatenmanagement wird anhand der Erkenntnisse aus Expertengesprächen und eigenen Erfahrungen beim Aufbau von LZA-Workflows vorgestellt. Erste Konzepte in EWIG zur Datenübertragung aus unterschiedlich strukturierten Datenquellen in die „Langfristige Domäne” werden beschrieben.
N2 - "I'm gonna make him an offer he can't refuse." This quote from a completely different context can be aptly rendered as the wish of service providers and the purpose of services for data producers in the field of research data management. Although pressure is not the leverage of choice if you want researchers to deposit their research data in some kind of repository, offering an option does the trick quite well. In this article, we present some of the concepts for the sustainability of research and its data from the first phase of the project EWIG, funded by the Deutsche Forschungsgemeinschaft. A selection of pitfalls in research data management is presented, based on the findings from expert interviews and our own experiences in the construction of LTP workflows. First concepts in EWIG to transfer data from differently structured data sources into the "Permanent Domain" are described.
T3 - ZIB-Report - 13-70
KW - Langzeitverfügbarkeit
KW - Forschungsdatenmanagement
Y1 - 2013
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:0297-zib-43010
UR - http://libreas.eu/ausgabe23/08hasler/
SN - 1438-0064
ER -

TY - GEN
A1 - Dercksen, Vincent J.
A1 - Hege, Hans-Christian
A1 - Oberlaender, Marcel
T1 - The Filament Editor: An Interactive Software Environment for Visualization, Proof-Editing and Analysis of 3D Neuron Morphology
N2 - Neuroanatomical analysis, such as the classification of cell types, depends on reliable reconstruction of large numbers of complete 3D dendrite and axon morphologies. At present, the majority of neuron reconstructions are obtained from preparations in a single tissue slice in vitro, thus suffering from cut-off dendrites and, more dramatically, cut-off axons. In general, axons can innervate volumes of several cubic millimeters and may reach path lengths of tens of centimeters. Thus, their complete reconstruction requires in vivo labeling, histological sectioning, and imaging of large fields of view. Unfortunately, anisotropic background conditions across such large tissue volumes, as well as faintly labeled thin neurites, result in incomplete or erroneous automated tracings and even lead experts to make annotation errors during manual reconstructions. Consequently, tracing reliability remains the major bottleneck for reconstructing complete 3D neuron morphologies. Here, we present a novel set of tools, integrated into a software environment named ‘Filament Editor’, for creating reliable neuron tracings from sparsely labeled in vivo datasets. The Filament Editor allows for the simultaneous visualization of complex neuronal tracings and image data in a 3D viewer, proof-editing of neuronal tracings, alignment and interconnection across sections, and morphometric analysis in relation to 3D anatomical reference structures. We illustrate the functionality of the Filament Editor on the example of in vivo labeled axons and demonstrate that, for the exemplary dataset, the final tracing results after proof-editing are independent of the expertise of the human operator.
T3 - ZIB-Report - 13-75
Y1 - 2013
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:0297-zib-43157
SN - 1438-0064
ER -
TY - GEN
A1 - Shinano, Yuji
T1 - The Ubiquity Generator Framework: 7 Years of Progress in Parallelizing Branch-and-Bound
N2 - Mixed integer linear programming (MIP) is a general form for modeling combinatorial optimization problems and has many industrial applications. The performance of MIP solvers has improved tremendously in the last two decades, and these solvers have been used to solve many real-world problems. However, against the backdrop of modern computer technology, parallelization is of pivotal importance. In this regard, ParaSCIP is the most successful parallel MIP solver in terms of solving previously unsolvable instances from the well-known benchmark instance set MIPLIB by using supercomputers. It solved two instances from MIPLIB2003 and 12 from MIPLIB2010 to optimality for the first time, using up to 80,000 cores on supercomputers. ParaSCIP has been developed using the Ubiquity Generator (UG) framework, a general software package to parallelize any state-of-the-art branch-and-bound based solver. This paper discusses 7 years of progress in parallelizing branch-and-bound solvers with UG.
T3 - ZIB-Report - 17-60
KW - Parallelization, Branch-and-bound, Mixed Integer Programming, UG, ParaSCIP, FiberSCIP, ParaXpress, FiberXpress, SCIP-Jack
Y1 - 2017
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:0297-zib-65545
SN - 1438-0064
ER -