Extending the service life of structures while preserving their stability and serviceability requires effective monitoring and maintenance concepts. Within the priority programme 2388 „Hundert plus – Verlängerung der Lebensdauer komplexer Baustrukturen durch intelligente Digitalisierung" (SPP 100+), funded by the Deutsche Forschungsgemeinschaft (DFG), innovative interdisciplinary methods are being developed for this purpose and validated on the Nibelungenbrücke in Worms (NBW). This contribution presents several of these newly developed digital methods. Among them are two structural health monitoring (SHM) systems and their goal-oriented fusion of acceleration measurement data from several sources for a comprehensive condition assessment. In addition, innovative data-driven simulation methods for determining the temperature field of the bridge superstructure are presented, and several finite element models of different levels of detail are introduced and compared. Finally, innovative methods for managing the as-built knowledge of bridge structures are discussed. The methods were largely developed independently of one another and validated on the NBW. In a next step, the methods will be integrated to support the maintenance of the NBW.
The MaterialDigital initiative represents a major driver toward the digitalization of materials science. Besides providing the prototypical infrastructure required for building a shared data space and working on the semantic interoperability of data, a core focus area of the Platform MaterialDigital (PMD) is the use of workflows to encapsulate data processing and simulation steps in accordance with the findable, accessible, interoperable, and reusable (FAIR) principles. In collaboration with the funded projects of the initiative, the workflow working group strives to establish shared standards that enhance the interoperability and reusability of scientific data processing steps. Central to this effort is the Workflow Store, a pivotal tool for sharing workflows with the community and facilitating the exchange and replication of scientific methodologies. This article discusses the inherent challenges of adopting workflow concepts, presenting the perspectives of the various funded projects on developing and using workflows in their respective domains. It also introduces the Workflow Store's role within the initiative and outlines a future roadmap for the PMD workflow group, which aims to further refine and expand the role of scientific workflows as a means to advance digital transformation and foster collaborative research within materials science.
Simulation-based digital twins have emerged as a powerful tool for evaluating the mechanical response of bridges. As virtual representations of physical systems, digital twins can provide a wealth of information that complements traditional inspection and monitoring data. By incorporating virtual sensors and predictive maintenance strategies, they have the potential to improve our understanding of the behavior and performance of bridges over time. However, as bridges age and undergo regular loading and extreme events, their structural characteristics change, often differing from the predictions of their initial design. Digital twins must be continuously adapted to reflect these changes. In this article, we present a Bayesian framework for updating simulation-based digital twins in the context of bridges. Our approach integrates information from measurements to account for inaccuracies in the simulation model and to quantify uncertainties. Through its implementation and assessment, this work demonstrates the potential of digital twins to provide a reliable and up-to-date representation of bridge behavior, helping to inform decision-making for maintenance and management.
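A minimal sketch of the kind of Bayesian updating described above, under deliberately simple assumptions: a random-walk Metropolis sampler infers the Young's modulus of a simply supported beam from noisy deflection "measurements". All names and numbers here (simulate_deflection, the noise level, the prior) are illustrative stand-ins, not the paper's actual model or likelihood.

```python
import numpy as np

def simulate_deflection(E, loads):
    # Hypothetical forward model: midspan deflection of a simply
    # supported beam, w = P L^3 / (48 E I), with span L and inertia I fixed
    L, I = 30.0, 0.15
    return loads * L**3 / (48.0 * E * I)

rng = np.random.default_rng(0)
loads = rng.uniform(1e5, 5e5, size=20)            # synthetic load cases
sigma = 5e-4                                      # assumed sensor noise [m]
data = simulate_deflection(2.9e10, loads)         # "measurements" ...
data += rng.normal(0.0, sigma, size=loads.size)   # ... with noise added

def log_posterior(E):
    if E <= 0.0:
        return -np.inf
    resid = data - simulate_deflection(E, loads)
    log_lik = -0.5 * np.sum((resid / sigma) ** 2)                # Gaussian likelihood
    log_prior = -0.5 * ((np.log(E) - np.log(3e10)) / 0.2) ** 2   # log-normal prior
    return log_lik + log_prior

# Random-walk Metropolis over the Young's modulus E
E, lp, samples = 3.0e10, log_posterior(3.0e10), []
for _ in range(5000):
    E_prop = E + rng.normal(0.0, 1e8)
    lp_prop = log_posterior(E_prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
        E, lp = E_prop, lp_prop
    samples.append(E)

post = np.array(samples[1000:])                   # discard burn-in
print(f"posterior mean E = {post.mean():.3e}, std = {post.std():.3e}")
```

The same structure carries over to a full digital twin; only the forward model and the likelihood change.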
The rapidly growing importance of FAIR and open data for quality assurance, but also for the reusability of data and for scientific progress, creates an enormous need for action in research and development. A wide range of ambitious activities is therefore currently under way, for example on the creation of ontologies and knowledge graphs. The know-how is evolving rapidly, and implementation approaches are emerging in parallel in different disciplines and with different objectives, resulting in rather heterogeneous methodologies.
This publication focuses on work currently being pursued as an approach that is as holistic as possible for materials data within the digitalization initiative „Plattform MaterialDigital". The authors work on building-material aspects in the joint project „LeBeDigital – Lebenszyklus von Beton" (life cycle of concrete). The objective is the digital description of the material behaviour of concrete over the complete production process of a precast element, integrating data and models within a workflow for probabilistic material and process optimization.
The approach and the experience gained are reported, not without drawing attention to the often underestimated complexity of the topic.
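To make the ontology and knowledge-graph idea tangible, here is a minimal sketch that records one compressive-strength test as RDF triples with rdflib; the namespace and all class and property names are invented for illustration and do not reproduce the actual LeBeDigital or PMD ontologies.

```python
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

# Hypothetical namespace -- NOT the real LeBeDigital/PMD ontology IRI
LEBE = Namespace("https://example.org/lebedigital#")

g = Graph()
g.bind("lebe", LEBE)

specimen = LEBE["specimen_042"]
test = LEBE["test_042_compressive"]

# Describe a concrete specimen and one compressive-strength test
g.add((specimen, RDF.type, LEBE.ConcreteSpecimen))
g.add((specimen, LEBE.mixDesign, LEBE["mix_CEM_I_42_5"]))
g.add((test, RDF.type, LEBE.CompressiveStrengthTest))
g.add((test, LEBE.onSpecimen, specimen))
g.add((test, LEBE.strengthMPa, Literal(48.3, datatype=XSD.double)))
g.add((test, LEBE.ageDays, Literal(28, datatype=XSD.integer)))

print(g.serialize(format="turtle"))
```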
In recent years, the use of simulation-based digital twins for the monitoring and assessment of complex mechanical systems has greatly expanded. Their potential to increase the information obtained from limited data makes them an invaluable tool for a broad range of real-world applications. Nonetheless, there usually remains a discrepancy between the predicted response and the measurements of the system once it is built. In addition to miscalibrated model parameters, one of the main contributors to this difference is the model error. Quantifying this so-called model bias (as well as proper values for the model parameters) is critical for the reliable performance of digital twins. Model bias identification is ultimately an inverse problem in which information from measurements is used to update the original model, a task that Bayesian formulations can tackle. Including the model bias as a parameter to be inferred enables the use of a Bayesian framework to obtain a probability distribution that represents the uncertainty between the measurements and the model. Simultaneously, this procedure can be combined with a classic parameter updating scheme to account for the trainable parameters in the original model.
This study evaluates the effectiveness of different model bias identification approaches based on Bayesian inference methods. These include classical approaches such as direct parameter estimation using MCMC in a Bayesian setup, as well as more recent proposals such as stat-FEM or orthogonal Gaussian processes. Their potential use in digital twins, their generalization capabilities, and their computational cost are analyzed in detail.
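As a hedged illustration of the additive-bias idea discussed above, the sketch below uses a simplified two-step stand-in for the joint Bayesian treatment: first a point estimate of the model parameter, then a Gaussian process fitted to the residuals as an approximation of the bias term. The toy model, kernel choices, and noise levels are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 40)

def model(theta, x):
    # Hypothetical physics model with one trainable parameter
    return theta * x

# Synthetic "measurements": the true response has a nonlinearity the
# model cannot represent -- this is the model bias to be identified
y = 2.0 * x + 0.3 * np.sin(4.0 * np.pi * x) + rng.normal(0, 0.02, x.size)

# Step 1: point estimate of theta (stand-in for the full posterior)
theta_hat = minimize_scalar(lambda t: np.sum((y - model(t, x))**2)).x

# Step 2: a GP over the residuals approximates the bias term delta(x)
resid = y - model(theta_hat, x)
gp = GaussianProcessRegressor(kernel=RBF(0.1) + WhiteKernel(1e-4))
gp.fit(x.reshape(-1, 1), resid)
delta, delta_std = gp.predict(x.reshape(-1, 1), return_std=True)

print(f"theta_hat = {theta_hat:.3f}")
print(f"max |bias| = {np.abs(delta).max():.3f} +/- {delta_std.max():.3f}")
```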
A key limitation of most constitutive models that reproduce the degradation of quasi-brittle materials is that they generally do not address fatigue. One reason is the huge computational cost of resolving each load cycle on the structural level. The goal of this paper is the development of a temporal integration scheme that significantly increases the computational efficiency of the finite element method in comparison to conventional temporal integration.
The essential constituent of the fatigue model is an implicit gradient-enhanced formulation of the damage rate. The evolution of the field variables is computed as a multiscale Fourier series in time. On a microchronological scale attributed to single cycles, the initial boundary value problem is approximated by linear BVPs with respect to the Fourier coefficients. Using the adaptive cycle jump concept, the obtained damage rates are transferred to a coarser macrochronological scale associated with the duration of material deterioration. The performance of the developed method is thus improved due to an efficient numerical treatment of the microchronological problem in combination with the cycle jump technique on the macrochronological scale. Validation examples demonstrate the convergence of the obtained solutions to the reference simulations while significantly reducing the computational cost.
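A stripped-down illustration of the cycle-jump idea, independent of the paper's gradient-enhanced formulation: given a per-cycle damage rate (here an invented closed-form rate standing in for one linearized microchronological solve), the macrochronological damage is advanced over many cycles at once, with the jump size adapted so each extrapolated damage increment stays below a tolerance.

```python
def damage_rate_per_cycle(D):
    # Hypothetical microchronological result: damage growth per cycle,
    # accelerating as damage accumulates (stands in for solving one
    # linearized cycle of the fatigue problem)
    return 1e-6 * (1.0 + 50.0 * D)

D, n = 0.0, 0
tol = 1e-3                      # max admissible damage increment per jump
while D < 0.9:
    rate = damage_rate_per_cycle(D)
    jump = max(1, int(tol / rate))   # adaptive jump: as many cycles as tol allows
    D += jump * rate                 # forward-Euler extrapolation over the jump
    n += jump

print(f"reached D = {D:.3f} after ~{n} cycles")
```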
In this paper, a contact problem between two bodies, discretized by finite elements, is solved by adding an auxiliary NURBS layer between the bodies. The advantages of a smooth contact formulation in a NURBS approach are combined with simple mesh generation procedures for the bodies discretized with finite elements. Mesh tying conditions are used to couple the NURBS layer with the finite element discretization. The NURBS layer is the master side for contact and mesh tying. Mesh tying is enforced using either pointwise or mortar-type approaches. Frictionless 2D and 3D contact problems are considered under the assumption of small deformations. The contact problem is discretized with the mortar method, and a penalty approach is used to enforce the contact constraints. A robust element-based quadrature is applied for the mortar tying and contact discretizations, thus avoiding computationally expensive segmentation.
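For orientation, the penalty regularization mentioned above can be summarized in a common textbook form (generic notation, not taken from the paper): with the normal gap g_N taken negative where the bodies interpenetrate, the contact pressure is approximated by

```latex
p_N = \epsilon_N \,\langle -g_N \rangle, \qquad \langle x \rangle := \max(x, 0),
```

so that pressure acts only where penetration occurs, and the constraint g_N >= 0 is recovered in the limit of a large penalty parameter \epsilon_N.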
3D concrete printing is an innovative construction technology offering the potential to enable the efficient production of individual structures with a lower consumption of resources. The technology has the potential to fundamentally shape future construction practice. From the design of a structure to the printed component, many individual steps based on different software tools are required, which must be repeated for each new or even slightly changed design. The geometry of the structure is created in a CAD program. The print path is defined in slicer software, which produces the machine code for the printer to print the structure. A numerical model of the printed structure makes optimization in design and fabrication possible by predicting the behaviour of the structure and reducing the number of test prints and the associated costs. For that, additional steps such as meshing the design and running a simulation are required. To work efficiently, an automated workflow is necessary that runs all individual steps without manual interaction with each software program. Furthermore, changing parameters or exchanging parts of the chain (different designs or printers) must be simple. One way to develop such an automated workflow is presented in this paper; a minimal sketch of the chaining idea follows below. The interfaces are defined in a way that allows running the full chain of tools as well as individual steps. The workflow is demonstrated using the example of a parametrized wall element for extrusion-based concrete printing. Furthermore, a test series of cubes is printed, and the influence of different infill structures is compared numerically and experimentally.
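The sketch below illustrates only the chaining idea: hypothetical stage functions with a uniform file-based interface, so the chain can run end-to-end or be restarted at any step, and a parameter change touches a single dictionary. None of the function names correspond to the actual tools used in the paper.

```python
from pathlib import Path

# Hypothetical stage functions: each step reads the previous artifact
# and writes its own, so steps can run individually or as a full chain.

def design(params: dict, out: Path) -> Path:
    out.write_text(f"CAD geometry for {params}")    # stand-in for CAD export
    return out

def slice_geometry(cad: Path, out: Path) -> Path:
    out.write_text(f"print path from {cad.name}")   # stand-in for slicer
    return out

def mesh(cad: Path, out: Path) -> Path:
    out.write_text(f"FE mesh from {cad.name}")      # stand-in for mesher
    return out

def simulate(mesh_file: Path, path_file: Path, out: Path) -> Path:
    out.write_text(f"results from {mesh_file.name}, {path_file.name}")
    return out

def run_chain(params: dict, workdir: Path) -> Path:
    workdir.mkdir(exist_ok=True)
    cad = design(params, workdir / "wall.step")
    gcode = slice_geometry(cad, workdir / "wall.gcode")
    grid = mesh(cad, workdir / "wall.msh")
    return simulate(grid, gcode, workdir / "results.out")

# Swapping a design or printer profile only changes `params`
print(run_chain({"height_mm": 600, "infill": "honeycomb"}, Path("demo")))
```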
Using digital twins for decision making is a very promising concept that combines simulation models with corresponding experimental sensor data in order to support maintenance decisions or to investigate reliability. The quality of the prognosis strongly depends on both the data quality and the quality of the digital twin. The latter comprises the modeling assumptions as well as the correct parameters of these models. This article discusses the challenges of applying this concept to real measurement data for a demonstrator bridge in the lab, including the data management, the iterative development of the simulation model, and the identification/updating procedure using Bayesian inference with a potentially large number of parameters. The investigated scenarios include both the iterative identification of the structural model parameters and scenarios related to damage identification. In addition, the article aims to provide all models and data in a reproducible way so that other researchers can use this setup to validate their methodologies.
Multiscale modeling of linear elastic heterogeneous structures via localized model order reduction (2023)
In this paper, a methodology for fine scale modeling of large scale linear elastic structures is proposed, which combines the variational multiscale method, domain decomposition and model order reduction. The influence of the fine scale on the coarse scale is modelled by the use of an additive split of the displacement field, addressing applications without a clear scale separation. Local reduced spaces are constructed by solving an oversampling problem with random boundary conditions. Herein, we inform the boundary conditions by a global reduced problem and compare our approach using physically meaningful correlated samples with existing approaches using uncorrelated samples. The local spaces are designed such that the local contribution of each subdomain can be coupled in a conforming way, which also preserves the sparsity pattern of standard finite element assembly procedures. Several numerical experiments show the accuracy and efficiency of the method, as well as its potential to reduce the size of the local spaces and the number of training samples compared to the uncorrelated sampling.
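A heavily simplified sketch of the local-space construction described above: solve a toy "oversampling" problem for a batch of random boundary data, restrict the solutions to the target subdomain, and compress the snapshots with a POD (truncated SVD). The dense SPD toy operator stands in for the actual fine-scale elasticity solve, and all sizes and tolerances are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "oversampling" operator: a fixed SPD system mapping boundary
# data to an interior solution (stands in for a fine-scale FE solve)
n_bc, n_dof = 24, 200
A = rng.standard_normal((n_dof, n_dof))
A = A @ A.T + n_dof * np.eye(n_dof)         # SPD system matrix
B = rng.standard_normal((n_dof, n_bc))      # lifts boundary data inward

# Snapshots: solutions for random boundary conditions, restricted
# to the target subdomain (here: the first 80 interior dofs)
n_samples = 50
G = rng.standard_normal((n_bc, n_samples))  # random boundary data
snapshots = np.linalg.solve(A, B @ G)[:80, :]

# POD: SVD of the snapshot matrix, truncated by an energy criterion
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999)) + 1
basis = U[:, :r]                            # local reduced space

print(f"kept {r} of {n_samples} modes")
```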