
Adaptive Gradient Enhanced Gaussian Process Surrogates for Inverse Problems

accepted for publication
Generating the simulated training data needed to construct sufficiently accurate surrogate models for efficient optimization or parameter identification can incur a huge computational effort in the offline phase. We consider a fully adaptive greedy approach to the computational design of experiments problem, using gradient-enhanced Gaussian process regression as the surrogate. Designs are built incrementally by solving an optimization problem for accuracy under a given computational budget. We address not only the choice of evaluation points but also the required simulation accuracy, for both values and gradients of the forward model. Numerical results show a significant reduction of the computational effort compared to purely position-adaptive and static designs, as well as a clear benefit of including gradient information in the surrogate training. A minimal sketch of the gradient-enhanced surrogate idea is given below.
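To illustrate what gradient-enhanced Gaussian process regression means in practice, here is a minimal 1D sketch: value and gradient observations of the forward model are modelled jointly, with cross-covariances obtained by differentiating the kernel. This is not the authors' implementation; the squared-exponential kernel, hyperparameters, test function, and noise levels (standing in for the adaptively chosen value and gradient accuracies) are illustrative assumptions.

```python
# Minimal sketch of gradient-enhanced GP regression in 1D (illustrative only).
import numpy as np

def k(x, xp, sig2=1.0, ell=0.25):
    """Squared-exponential kernel k(x, x')."""
    return sig2 * np.exp(-(x - xp) ** 2 / (2 * ell ** 2))

def dk_dxp(x, xp, sig2=1.0, ell=0.25):
    """d k / d x': covariance between a value at x and a gradient at x'."""
    return k(x, xp, sig2, ell) * (x - xp) / ell ** 2

def d2k_dxdxp(x, xp, sig2=1.0, ell=0.25):
    """d^2 k / (dx dx'): covariance between gradients at x and x'."""
    r = x - xp
    return k(x, xp, sig2, ell) * (1.0 - r ** 2 / ell ** 2) / ell ** 2

# Assumed forward model f(x) = sin(2*pi*x), evaluated with its gradient.
X = np.array([0.1, 0.4, 0.7, 0.9])
f = np.sin(2 * np.pi * X)
g = 2 * np.pi * np.cos(2 * np.pi * X)

# Joint covariance of the augmented observation vector [values; gradients].
Kff = k(X[:, None], X[None, :])
Kfg = dk_dxp(X[:, None], X[None, :])
Kgg = d2k_dxdxp(X[:, None], X[None, :])
K = np.block([[Kff, Kfg], [Kfg.T, Kgg]])

# Separate noise levels per block stand in for value/gradient simulation accuracy.
noise = np.concatenate([1e-6 * np.ones_like(X), 1e-6 * np.ones_like(X)])
alpha = np.linalg.solve(K + np.diag(noise), np.concatenate([f, g]))

# Posterior mean at test points uses cross-covariances to values and gradients.
xs = np.linspace(0.0, 1.0, 5)
Ks = np.hstack([k(xs[:, None], X[None, :]), dk_dxp(xs[:, None], X[None, :])])
print(np.c_[xs, Ks @ alpha, np.sin(2 * np.pi * xs)])  # x, GP mean, true value
```

In the adaptive setting described in the abstract, the next evaluation point and the tolerances for value and gradient computations would be chosen by an accuracy-per-cost criterion; the fixed noise levels above merely indicate where such tolerances enter the model.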
Metadata
Authors: Phillip Semler, Martin Weiser
Document Type: In Proceedings
Parent Title (English): Proceedings of the MATH+ Thematic Einstein Semester on Mathematical Optimization for Machine Learning
Year of first publication: 2024
ArXiv Id: http://arxiv.org/abs/2404.01864