7.7 Modellierung und Simulation
Fiber optic sensors (FOS) are increasingly used for structural health monitoring of bridges, providing dense and distributed strain measurements. An especially relevant application is the detection and localization of tendon breaks, where not only the occurrence but also the depth of the damaged tendon must be identified. To explore this possibility, an experimental setup was established to reproduce the boundary and loading conditions of a bridge tendon, and FOS strain observations were collected during controlled failure events. A finite element (FE) model of this experiment was developed and is used here as a gray-box simulator. Its parameters are calibrated against the FOS data, and the resulting probabilistic characterization is transferred to more realistic models in which different tendon depths are simulated and compared to observed deformation fields. Because of the complexity of the tendon break phenomenon, model form uncertainty (MFU) plays a decisive role. Bayesian calibration is performed while explicitly accounting for MFU through an embedded representation in which a prescribed set of parameters captures the model–reality discrepancy. This formulation enables the quantified MFU to be consistently propagated to other models and quantities of interest. Due to the computational cost of FE simulations, Gaussian Process surrogates are employed, and their predictive uncertainty is incorporated into the calibration to ensure coherent uncertainty propagation. The calibration reveals that some observations cannot be represented by any admissible parameter values, indicating structural model inadequacies. To identify and characterize these, an influence analysis of the observations on the posterior distribution is carried out. The φ-divergence between posterior distributions with and without individual observations is evaluated to quantify their impact on the calibrated posterior. Furthermore, by examining the influence on the marginal posterior components within the embedded MFU formulation, it becomes possible to determine which observations drive the parameters associated with model discrepancy and which parameters are most affected, guiding model refinement and experimental design. Finally, when the calibrated parameters are transferred to alternative models, the separability of predictions for different tendon depths is analyzed. The overlap between predictive distributions is quantified to determine the minimum resolvable depth difference and regions of non-separability arising from MFU. This analysis defines the achievable resolution for tendon break localization and informs optimal sensor placement strategies under uncertainty.
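The influence analysis can be illustrated with a small sketch. The snippet below is a hypothetical simplification, assuming Gaussian moment-matching of the posterior samples and using the Kullback-Leibler divergence as one member of the φ-divergence family; all names, shapes, and the Gaussian simplification are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch: quantifying the influence of a single observation on a
# calibrated posterior via the KL divergence (a member of the phi-divergence
# family). Moment-matching the posterior samples with Gaussians is a
# simplification; names and shapes are illustrative, not from the paper.
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL( N(mu0, cov0) || N(mu1, cov1) )."""
    d = mu0.size
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def influence(samples_full, samples_loo):
    """Moment-match both posterior sample sets and return their divergence.

    samples_full: (n, d) posterior samples using all observations
    samples_loo:  (n, d) posterior samples with one observation left out
    """
    mu_f, cov_f = samples_full.mean(0), np.cov(samples_full, rowvar=False)
    mu_l, cov_l = samples_loo.mean(0), np.cov(samples_loo, rowvar=False)
    return gaussian_kl(mu_l, cov_l, mu_f, cov_f)

# Toy usage: an observation is "influential" if removing it shifts the posterior.
rng = np.random.default_rng(0)
full = rng.normal([1.0, 2.0], 0.1, size=(4000, 2))
loo = rng.normal([1.3, 2.0], 0.1, size=(4000, 2))  # shifted -> large divergence
print(influence(full, loo))
```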
3D concrete printing technologies enhance design freedom while reducing material use and costs, without the need for formwork. Structural build-up is the key property governing the stability and early strength evolution of 3D printed concrete after placement. It is influenced by various factors, e.g., environmental conditions such as temperature. In this paper, the influence of ambient temperature on structural build-up was investigated through experimental and numerical approaches. Three experimental setups (small amplitude oscillatory shear, constant shear rate, and small amplitude oscillatory extensional tests) were applied to materials of increasing complexity under varying temperature conditions. A common modeling framework based on the maturity approach was developed to capture the time and temperature evolution. A stochastic framework was employed to estimate the unknown model parameters from the experimental data. The experimental results demonstrate a significant temperature influence on structural build-up, consistent across all test setups and materials. The calibrated models successfully predict the structural build-up under different temperatures, confirming the applicability of the maturity approach to rheological parameters at early age. Furthermore, the stochastic parameter estimation enables a proper quantification of the uncertainties, enhancing model reliability. The comparison of two time evolution formulations indicates that a model with an additional linear stage is required to predict the increase of the storage moduli ($G'$, $E'$). In conclusion, the study demonstrates that temperature significantly affects the structural build-up and that the proposed modeling approach makes it possible to predict this behavior.
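As a rough illustration of the maturity approach described above, the sketch below assumes an Arrhenius-weighted equivalent age and a bilinear build-up law with an additional linear stage; all parameter values (E_a, G0, A_thix, t_switch, A2) and functional forms are assumptions for illustration, not the calibrated models from the paper.

```python
# Minimal sketch of the maturity approach, under assumed functional forms:
# temperature enters only through an Arrhenius-weighted "equivalent age",
# and the storage modulus evolves as a function of that equivalent age.
import numpy as np

R = 8.314          # J/(mol K), universal gas constant
E_a = 35_000.0     # J/mol, assumed apparent activation energy
T_ref = 293.15     # K, reference temperature (20 degC)

def equivalent_age(t, T):
    """Integrate the Arrhenius rate factor over the temperature history T(t)."""
    rate = np.exp(E_a / R * (1.0 / T_ref - 1.0 / T))
    return np.concatenate(([0.0],
                           np.cumsum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t))))

def storage_modulus(t_eq, G0=1.0e3, A_thix=5.0, t_switch=600.0, A2=15.0):
    """Two-stage (bilinear) build-up in equivalent age: an initial linear
    stage followed by an additional, steeper linear stage after t_switch."""
    return np.where(t_eq < t_switch,
                    G0 + A_thix * t_eq,
                    G0 + A_thix * t_switch + A2 * (t_eq - t_switch))

# Usage: the same parameter set predicts different real-time curves at
# different (constant) ambient temperatures.
t = np.linspace(0.0, 3600.0, 361)                    # s
for T_amb in (283.15, 293.15, 303.15):               # 10, 20, 30 degC
    G = storage_modulus(equivalent_age(t, np.full_like(t, T_amb)))
    print(f"T = {T_amb - 273.15:.0f} degC -> G'(1 h) = {G[-1]:.0f} Pa")
```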
To extend the service life of structures while preserving their stability and serviceability, effective monitoring and maintenance concepts are required. Within the priority programme 2388 „Hundert plus – Verlängerung der Lebensdauer komplexer Baustrukturen durch intelligente Digitalisierung" (SPP 100+), funded by the Deutsche Forschungsgemeinschaft (DFG), innovative interdisciplinary methods are being developed for this purpose and validated on the Nibelungenbrücke in Worms (NBW). This contribution presents some of these newly developed digital methods. Among them are two structural health monitoring (SHM) systems and their goal-oriented fusion of multiple acceleration measurement datasets for a comprehensive condition assessment. In addition, innovative data-based simulation methods for determining the temperature field of the bridge superstructure are introduced, and several finite element models of differing levels of detail are presented and compared. Finally, innovative methods for managing the knowledge about existing bridge structures are discussed. The methods were largely developed independently of one another and validated on the NBW. As a next step, they will be integrated to support the maintenance of the NBW.
Posterior sampling by Monte Carlo methods provides a more comprehensive solution to inverse problems than computing point estimates such as the maximum a posteriori estimate by optimization, at the expense of usually requiring many more evaluations of the forward model. Replacing computationally expensive forward models with fast surrogate models is an attractive option. However, computing the simulated training data for building a sufficiently accurate surrogate model can itself be computationally expensive, leading to a design-of-computer-experiments problem: finding evaluation points and accuracies such that the highest accuracy is obtained within a fixed computational budget. Here, we consider a fully adaptive greedy approach to this problem. Using Gaussian process regression as a surrogate, samples are drawn from the available posterior approximation while designs are incrementally defined by solving a sequence of optimization problems for evaluation accuracy and positions. The selection of training designs is tailored towards representing the posterior to be sampled as well as possible, while the interleaved sampling steps discard old, inaccurate samples in favor of new, more accurate ones. Numerical results show a significant reduction of the computational effort compared to merely position-adaptive and static designs.
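A minimal sketch of the position-adaptive part of such a scheme is given below, assuming a toy one-dimensional forward model; the fully adaptive approach described in the abstract additionally optimizes evaluation accuracies and interleaves sampling steps, which are omitted here. The candidate grid and all tuning values are illustrative assumptions.

```python
# Greedy, position-adaptive design sketch (a simplification of the fully
# adaptive scheme: evaluation accuracy is not adapted here).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def forward(theta):                      # expensive forward model (toy stand-in)
    return np.sin(3.0 * theta) + 0.5 * theta

y_obs, sigma = 0.8, 0.05                 # observation and noise level
cand = np.linspace(-2.0, 2.0, 401)[:, None]   # candidate design points

X = np.array([[-2.0], [0.0], [2.0]])     # small initial design
y = forward(X).ravel()

for _ in range(10):
    gp = GaussianProcessRegressor(RBF(0.5), alpha=1e-8).fit(X, y)
    mu, std = gp.predict(cand, return_std=True)
    # Approximate (unnormalized) posterior under the current surrogate.
    post = np.exp(-0.5 * ((y_obs - mu) / sigma) ** 2)
    # Greedy criterion: surrogate uncertainty weighted by posterior mass,
    # so new evaluations concentrate where the posterior actually lives.
    i = np.argmax(std * post)
    X = np.vstack([X, cand[i]])
    y = np.append(y, forward(cand[i, 0]))

print("final design points:", np.sort(X.ravel()))
```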
A key factor in ensuring the accuracy of computer simulations that model physical systems is the proper calibration of their parameters based on real-world observations or experimental data. Inevitably, uncertainties arise, and Bayesian methods provide a robust framework for quantifying these uncertainties and propagating them to model predictions. Nevertheless, Bayesian methods paired with inexact models usually produce predictions unable to represent the observed data points. Additionally, the quantified uncertainties of these overconfident models cannot be reliably propagated to other Quantities of Interest (QoIs). A promising solution involves embedding a model inadequacy term in the inference parameters, allowing the quantified model form uncertainty to influence non-observed QoIs. This paper introduces a more interpretable framework for embedding the model inadequacy compared to existing methods. To overcome the limitations of current approaches, we adapt the existing likelihood models to properly account for noise in the measurements and propose two new formulations designed to address their shortcomings. Moreover, we evaluate the performance of this inadequacy-embedding approach in the presence of discrepancies between measurements and model predictions, including noise and outliers. Particular attention is given to how the uncertainty associated with the model inadequacy term propagates to the QoIs, enabling a more comprehensive statistical analysis of the predictions' reliability. Finally, the proposed approach is applied to estimate the uncertainty in the predicted heat flux of a transient thermal simulation using temperature observations.
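The core idea of embedding an inadequacy term can be sketched under strongly simplified assumptions: a zero-mean Gaussian perturbation whose scale is inferred jointly with the physical parameter and then reused when predicting a non-observed QoI. The functional forms and names below are illustrative only; the paper proposes more refined likelihood formulations.

```python
# Minimal sketch of an "embedded" model inadequacy term, under assumed forms:
# the inadequacy scale s_mfu is inferred jointly with the physical parameter
# theta and adds to the measurement noise in the likelihood.
import numpy as np

def model(theta, x):
    return theta * x                      # toy physical model

def log_likelihood(params, x, y, sigma_noise=0.05):
    theta, log_s_mfu = params
    s_mfu = np.exp(log_s_mfu)             # embedded inadequacy scale, s_mfu > 0
    # Total predictive variance = measurement noise + model form uncertainty.
    var = sigma_noise**2 + (s_mfu * np.abs(model(theta, x)))**2
    resid = y - model(theta, x)
    return -0.5 * np.sum(resid**2 / var + np.log(2.0 * np.pi * var))

# The same s_mfu then inflates predictions of a *non-observed* QoI, so the
# quantified model form uncertainty propagates beyond the calibration data.
def predict_qoi(params, x_new, n_draws=1000, rng=np.random.default_rng(1)):
    theta, log_s_mfu = params
    mean = model(theta, x_new)
    return mean * (1.0 + np.exp(log_s_mfu) * rng.standard_normal(n_draws))

x = np.linspace(0.0, 1.0, 20)
y = 1.2 * x + 0.1 * x**2                  # "reality" with a model discrepancy
print(log_likelihood((1.2, np.log(0.05)), x, y))
print(predict_qoi((1.2, np.log(0.05)), 2.0).std())
```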
Methodological Prerequisites for Reliable Simulation Results in the Virtual Lab and Digital Twins
(2025)
Simulations are increasingly taking on a central role in safety-critical decision processes, for example in civil engineering, in digital twins, or in predictive maintenance. For such decisions to rest on reliable numerical models, the underlying sub-models must be unambiguously described, reproducible, and validated within the community. This manuscript outlines the necessary prerequisites for trustworthy, FAIR-compliant, and decision-ready simulation processes. First, the need for a formal, software-independent description of mathematical models and numerical implementations is discussed. Standards such as VMAP, ontologies such as MathModDB, and the BIM and STEP formats offer first approaches to semantic model description and to the automated generation of simulation-ready input data. The second part examines the role of experimental data in model validation. It is shown that high-quality, structured, and machine-readable validation datasets are essential, in particular for the assessment of constitutive models.
A further chapter is devoted to uncertainty quantification. In addition to classical error measures, aleatoric and epistemic uncertainties and their influence on model assessment are addressed. Particular attention is paid to Bayesian calibration, the separation of training and test data, and the need to also document accompanying material tests. Finally, the idea of a benchmarking platform is presented, based on reproducible workflows, automated provenance tracking, and the use of Research Object (RO) Crates. This platform allows the decentralized publication and centralized querying of benchmark results and fosters community-driven verification and validation. The goal is to improve the comparability of (open-source) simulation tools so that complex experimental setups can increasingly be replaced by reliable simulations.
This is a python library for finite element (FE) modelling of static/quasi-static structural mechanics problems with legacy FEniCS. It contains the following main modules:
- Module structure: defines a structural mechanics experiment, including geometry, mesh, boundary conditions (BCs), and time-varying loadings. The loading is quasi-static, i.e. no dynamic (inertia) effects are accounted for.
- Module material: handles constitutive laws such as elasticity, gradient damage, plasticity, etc.
- Module problem: establishes structural mechanics problems for desired structures and material laws from the two modules above, and solves them. Two main cases are supported: static (no time evolution), which also includes homogenization, and quasi-static (QS).
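Since the library's own module interface is not documented here, the following is not its API but a minimal legacy FEniCS (dolfin) linear elasticity problem of the kind the structure/material/problem modules are built around; the geometry, boundary conditions, and material parameters are illustrative assumptions.

```python
# Not the library's API: a minimal legacy FEniCS (dolfin) static linear
# elasticity problem illustrating the class of problems the library wraps.
from dolfin import *

mesh = UnitSquareMesh(32, 32)
V = VectorFunctionSpace(mesh, "P", 1)

# Clamp the left edge (Dirichlet BC), pull the body downwards (body force).
bc = DirichletBC(V, Constant((0.0, 0.0)), "near(x[0], 0.0)")
f = Constant((0.0, -1.0e-2))

E, nu = 210e3, 0.3                         # assumed material parameters
mu = E / (2.0 * (1.0 + nu))
lmbda = E * nu / ((1.0 + nu) * (1.0 - 2.0 * nu))

def sigma(u):                              # linear elastic constitutive law
    return lmbda * tr(sym(grad(u))) * Identity(2) + 2.0 * mu * sym(grad(u))

u, v = TrialFunction(V), TestFunction(V)
a = inner(sigma(u), sym(grad(v))) * dx     # bilinear form (static equilibrium)
L = dot(f, v) * dx

u_h = Function(V)
solve(a == L, u_h, bc)
print("displacement at (1, 0):", u_h(1.0, 0.0))
```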
Methodological prerequisites for reliable simulation results in the Virtual Lab and Digital Twins
(2025)
With the growing use of methods such as artificial intelligence, digital twins, and data-driven modelling, the range of applications of virtual product development is continuously expanding. In safety-relevant fields of application, the requirements on the quality, traceability, and external auditability of simulation results are rising, in particular with regard to reproducibility and transparency.
This contribution analyses key methodological and structural challenges for the reliable use of numerical simulations in such contexts. The focus is on the following aspects:
• The formal, software-independent specification of mathematical and numerical models, including a standardized description of both the problem statement and the result quantities.
• The validation of all model components against application scenarios with high similarity to the real system environment.
• The systematic quantification of uncertainties, in particular with regard to Sim2Real model discrepancies and their influence on the informative value of the simulation.
• The verification of the software implementation through standardized benchmark comparisons. To this end, a concept of federated benchmarks with documented provenance is presented, which enables a transparent, reproducible, and publicly accessible verification of simulation tools. The benchmarks follow defined formats, can be published in a decentralized manner, and can be integrated into a central, queryable database. Via these standardized interfaces, metrics for comparing the benchmarks can then be visualized and made available for analyses.
The contribution aims to identify and open for discussion key challenges and open questions in the methodological assurance of simulation-based statements, in particular at the intersection of model building, validation, verification, and uncertainty quantification.
A python implementation of the analytical variational Bayes algorithm from "Variational Bayesian inference for a nonlinear forward model", Chappell, Michael A., Adrian R. Groves, Brandon Whitcher, and Mark W. Woolrich, IEEE Transactions on Signal Processing 57, no. 1 (2009): 223-236, with an updated free energy equation to correctly capture the log evidence. The algorithm requires a user-defined model error, allowing an arbitrary combination of custom forward models and measured data.
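The required user-defined model error might look roughly like the hypothetical sketch below: a callable mapping latent parameters to residuals between a custom forward model and the measured data, with a Jacobian of the kind used by analytical VB updates for nonlinear forward models. The class name and interface are assumptions, not the library's actual API.

```python
# Hypothetical sketch of a user-defined model error: residuals between a
# custom forward model and measured data, plus a finite-difference Jacobian.
import numpy as np

class MyModelError:
    """Residuals k = (k1, k2) -> forward(k) - data for an exponential decay."""

    def __init__(self, t, data):
        self.t, self.data = t, data

    def forward(self, k):
        return k[0] * np.exp(-k[1] * self.t)

    def __call__(self, k):
        return self.forward(k) - self.data

    def jacobian(self, k, eps=1e-6):
        """Central finite-difference Jacobian of the residuals."""
        k = np.asarray(k, dtype=float)
        J = np.empty((self.t.size, k.size))
        for j in range(k.size):
            dk = np.zeros_like(k)
            dk[j] = eps
            J[:, j] = (self(k + dk) - self(k - dk)) / (2.0 * eps)
        return J

t = np.linspace(0.0, 5.0, 50)
data = 2.0 * np.exp(-0.7 * t) + 0.05 * np.random.default_rng(2).standard_normal(50)
me = MyModelError(t, data)
print(me([2.0, 0.7]).shape, me.jacobian([2.0, 0.7]).shape)
```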
Engineering simulations play a central role in predicting system behavior and supporting design decisions. Their impact will grow with the concepts of the virtual lab, digital twins, and predictive maintenance, where simulations reduce expensive prototyping or where real measurement data from structures or parts are combined with simulation models to improve decision making.
The credibility of a simulation model depends on the trust it earns within the scientific and engineering community. This trust can only be established through rigorous verification and validation. Verification and validation are inherently hierarchical processes. At the highest level, global approaches—such as standardized benchmarks and shared validation datasets—provide a common foundation for assessing model correctness and applicability across domains. These global resources enable regulatory bodies, industry, and research communities to evaluate whether models and their implementations meet agreed-upon standards.
Research Data Management (RDM) provides the infrastructure to support these processes by enabling formalized model definitions, structured validation datasets, transparent verification procedures, and complete provenance descriptions of the underlying workflows.
Currently, many engineering models are published as descriptive text or embedded in specific software implementations, which hinders reproducibility and standardization. Numerical results presented in publications must be fully reproducible. For regulatory assessment and interoperability, models require software-agnostic, machine-readable definitions that include all information needed for implementation and execution, such as scope, assumptions, parameter identification procedures, and uncertainty bounds.
Formal definitions alone do not guarantee consistency. Verification benchmarks are essential to confirm that different implementations of the same mathematical or numerical model produce equivalent results. Likewise, validation requires structured experimental data that is publicly available and documented according to community-agreed protocols for tests, measurements, and metadata. Robust calibration and uncertainty quantification frameworks must provide systematic, reproducible procedures for parameter estimation and uncertainty propagation, while the complete provenance of these steps has to be documented. Validation should extend beyond controlled laboratory conditions to diverse real-world scenarios.
The presentation will explore practical strategies for achieving reproducibility and reusability in engineering simulations through structured workflows, provenance tracking, and standardized data management practices. It will address the integration of experimental data for validation, the role of verification benchmarks, uncertainty quantification frameworks, and tool-independent model descriptions—including VMAP for result exchange—within collaborative platforms that support joint projects and regulatory compliance.