TY - JOUR
A1 - Semler, Phillip
A1 - Weiser, Martin
T1 - Adaptive Gaussian Process Regression for Efficient Building of Surrogate Models in Inverse Problems
JF - Inverse Problems
N2 - In a task where many similar inverse problems must be solved, evaluating costly simulations is impractical. Therefore, replacing the model y with a surrogate model y(s) that can be evaluated quickly leads to a significant speedup. The approximation quality of the surrogate model depends strongly on the number, position, and accuracy of the sample points. With the additional constraint of a finite computational budget, this leads to a problem of (computer) experimental design. In contrast to the selection of sample points, the trade-off between accuracy and effort has hardly been studied systematically. We therefore propose an adaptive algorithm to find an optimal design in terms of position and accuracy. Pursuing a sequential design by incrementally extending the computational budget leads to a convex and constrained optimization problem. As a surrogate, we construct a Gaussian process regression model. We measure the global approximation error in terms of its impact on the accuracy of the identified parameter and aim for a uniform absolute tolerance, assuming that y(s) is computed by finite element calculations. A priori error estimates and a coarse estimate of the computational effort relate the expected improvement of the surrogate model error to the computational effort, resulting in the most efficient combination of sample point and evaluation tolerance. We also allow for improving the accuracy of already existing sample points by continuing previously truncated finite element solution procedures.
Y1 - 2023
U6 - https://doi.org/10.1088/1361-6420/ad0028
VL - 39
IS - 12
SP - 125003
ER -

TY - CHAP
A1 - Semler, Phillip
A1 - Weiser, Martin
T1 - Adaptive Gradient Enhanced Gaussian Process Surrogates for Inverse Problems
T2 - Proceedings of the MATH+ Thematic Einstein Semester on Mathematical Optimization for Machine Learning
N2 - Generating the simulated training data needed for constructing sufficiently accurate surrogate models to be used in efficient optimization or parameter identification can incur a huge computational effort in the offline phase. We consider a fully adaptive greedy approach to the computational design of experiments problem, using gradient-enhanced Gaussian process regression surrogates. Designs are defined incrementally by solving an optimization problem for accuracy given a certain computational budget. We address not only the choice of evaluation points but also the required simulation accuracy, both for values and for gradients of the forward model. Numerical results show a significant reduction of the computational effort compared to purely position-adaptive and static designs, as well as a clear benefit of including gradient information in the surrogate training.
Y1 - 2024
ER -