Engineering sciences and associated activities
Filter
Document type
- Talk (11)
- Journal article (4)
- Contribution to conference proceedings (4)
- Research dataset (4)
Keywords
- Model calibration (4)
- Structural build-up (4)
- Additive manufacturing (3)
- Wire arc additive manufacturing (3)
- 3D concrete printing (2)
- Concrete (2)
- Digitalization (2)
- Material characterization (2)
- Modelling (2)
- Ontology (2)
- Optimization workflow (2)
- Performance oriented concrete design (2)
- Reduced order modelling (2)
- Rheological properties (2)
- Thixotropy (2)
- Bayesian Uncertainty Quantification (1)
- Concrete modelling (1)
- Data infrastructures (1)
- Data management (1)
- Digital Twin (1)
- Digital Twins (1)
- Digital representations (1)
- Digital workflows (1)
- Domain Decomposition (1)
- Early-age concrete (1)
- Fatigue (1)
- Experimental data (1)
- FAIR (1)
- FAIR data (1)
- FAIR principle (1)
- FEM (1)
- Fenics (1)
- Gaussian Processes (1)
- Hardly separable problem (1)
- Knowledge graphs (1)
- Localized model order reduction (1)
- Mapping for unseparable load (1)
- Material data (1)
- Materials informatics (1)
- Metadata (1)
- Metadata (1)
- Model Bias (1)
- Model Order Reduction (1)
- Model bias (1)
- Model order reduction (1)
- Model order reduction (MOR) (1)
- Multiscale Method (1)
- Multiscale methods (1)
- Ontology (1)
- Ontologies (1)
- Ontologies (1)
- Precast concrete (1)
- Proper generalized decomposition (1)
- Proper generalized decomposition (PGD) (1)
- Proper orthogonal (1)
- Reduced Order Model (1)
- Reproducibility, scientific workflow (1)
- Reproducible data processing (1)
- Scientific workflows (1)
- Semantic web (1)
- Sensitivity analysis (1)
- Simulation Models (1)
- Simulations (1)
- Statistical Finite Element Method (1)
- Thermal transient problem (1)
- Tool comparison (1)
- Uncertainty Quantification (1)
- Variational Multiscale Method (1)
- Variational multiscale method (1)
- Vocabulary providers (1)
- Knowledge graph (1)
- Workflow management (1)
- Cement (1)
BAM organizational unit
- 7.7 Modellierung und Simulation (23)
Invited talk
- no (11)
In the field of computational science and engineering, workflows often entail the application of various software tools, for instance for simulation or pre- and postprocessing. Typically, these components have to be combined in arbitrarily complex workflows to address a specific research question. In order for peer researchers to understand, reproduce and (re)use the findings of a scientific publication, several challenges have to be addressed. For instance, the employed workflow has to be automated, and information on all used software must be available to reproduce the results. Moreover, the results must be traceable and the workflow documented and readable to allow for external verification and greater trust. In this paper, existing workflow management systems (WfMSs) are discussed regarding their suitability for describing, reproducing and reusing scientific workflows. To this end, a set of general requirements for WfMSs was deduced from user stories that we deem relevant in the domain of computational science and engineering. On the basis of an exemplary workflow implementation, publicly hosted at GitHub (https:// this http URL), a selection of different WfMSs is compared with respect to these requirements, to support fellow scientists in identifying the WfMSs that best suit their requirements.
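The requirements described above can be illustrated, independently of any particular WfMS, with a small Python sketch: two dependent workflow steps are executed and a provenance record (software versions, input and output hashes, step order) is written alongside the results so that the run stays traceable and reproducible. The step functions and file names are hypothetical placeholders, not part of the published workflow.

```python
# Minimal, WfMS-agnostic sketch: run two dependent steps and record provenance
# (software versions, input hashes, parameters) so results stay traceable.
# File names and step functions are hypothetical placeholders.
import hashlib, json, platform, sys
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def simulate(params: dict, out: Path) -> None:
    # stand-in for a call to an external solver
    result = {"max_temperature": 20.0 + 3.5 * params["power"]}
    out.write_text(json.dumps(result))

def postprocess(sim_file: Path, out: Path) -> None:
    data = json.loads(sim_file.read_text())
    out.write_text(f"max temperature: {data['max_temperature']:.1f} C\n")

params = {"power": 2.0}
Path("params.json").write_text(json.dumps(params))

simulate(params, Path("simulation.json"))
postprocess(Path("simulation.json"), Path("report.txt"))

# provenance record: enough information to rerun and verify the workflow
provenance = {
    "python": sys.version,
    "platform": platform.platform(),
    "inputs": {"params.json": sha256(Path("params.json"))},
    "outputs": {"simulation.json": sha256(Path("simulation.json")),
                "report.txt": sha256(Path("report.txt"))},
    "steps": ["simulate", "postprocess"],
}
Path("provenance.json").write_text(json.dumps(provenance, indent=2))
```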
Multiscale modeling of linear elastic heterogeneous structures via localized model order reduction
(2024)
In this paper, a methodology for fine scale modeling of large scale linear elastic structures is proposed, which combines the variational multiscale method, domain decomposition and model order reduction. The influence of the fine scale on the coarse scale is modelled by the use of an additive split of the displacement field, addressing applications without a clear scale separation. Local reduced spaces are constructed by solving an oversampling problem with random boundary conditions. Herein, we inform the boundary conditions by a global reduced problem and compare our approach using physically meaningful correlated samples with existing approaches using uncorrelated samples. The local spaces are designed such that the local contribution of each subdomain can be coupled in a conforming way, which also preserves the sparsity pattern of standard finite element assembly procedures. Several numerical experiments show the accuracy and efficiency of the method, as well as its potential to reduce the size of the local spaces and the number of training samples compared to uncorrelated sampling.
Thermal transient problems, essential for modeling applications like welding and additive metal manufacturing, are characterized by a dynamic evolution of temperature. Accurately simulating these phenomena is often computationally expensive, thus limiting their applications, for example for model parameter estimation or online process control. Model order reduction, a solution to preserve the accuracy while reducing the computation time, is explored. This article addresses challenges in developing reduced order models using the proper generalized decomposition (PGD) for transient thermal problems, with a specific treatment of the moving heat source within the reduced model. Factors affecting accuracy, convergence, and computational cost, such as discretization methods (finite element and finite difference), a dimensionless formulation, the size of the heat source, and the inclusion of material parameters as additional PGD variables, are examined across progressively complex examples. The results demonstrate the influence of these factors on the PGD model's performance and emphasize the importance of their consideration when implementing such models. For the thermal example, it is demonstrated that a PGD model with a finite difference discretization in time, a dimensionless representation, a mapping for the moving heat source, and no separation of the spatial domain yields the best approximation to the full order model.
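To make the separated-representation idea concrete, the following numpy sketch implements a progressive PGD enrichment for a 1D transient heat problem with a backward Euler finite difference scheme in time, i.e. an approximation u(x,t) ~ sum_i X_i(x) T_i(t) built mode by mode via an alternating fixed point. It is only a minimal illustration of the technique named in the abstract; the discretization sizes, diffusivity and source term are illustrative assumptions and do not reproduce the welding examples.

```python
# Minimal progressive PGD sketch for a 1D transient heat problem,
#   u_t = alpha * u_xx + f(x, t),  u(0,t) = u(L,t) = 0,  u(x,0) = 0,
# using backward Euler finite differences in time. The separated approximation
# u(x,t) ~ sum_i X_i(x) * T_i(t) is enriched mode by mode with an alternating
# fixed point. Discretization, diffusivity and source are illustrative choices.
import numpy as np

nx, nt = 101, 200
L, t_end, alpha = 1.0, 1.0, 0.05
x = np.linspace(0.0, L, nx)[1:-1]            # interior nodes (homogeneous Dirichlet BCs)
t = np.linspace(0.0, t_end, nt + 1)[1:]      # time levels after t = 0
dx, dt = L / (nx - 1), t_end / nt

# spatial Laplacian (interior) and backward Euler time-derivative operator
A = (np.diag(-2.0 * np.ones(x.size)) + np.diag(np.ones(x.size - 1), 1)
     + np.diag(np.ones(x.size - 1), -1)) / dx**2
D = (np.eye(t.size) - np.eye(t.size, k=-1)) / dt

# stationary source in separated form: f(x, t) = g(x) * 1(t)
F = np.outer(np.exp(-200.0 * (x - 0.3) ** 2), np.ones(t.size))

modes_X, modes_T = [], []
for _ in range(8):                           # enrichment loop
    if modes_X:
        U = sum(np.outer(X, T) for X, T in zip(modes_X, modes_T))
        E = F - (U @ D.T - alpha * A @ U)    # residual of the current approximation
    else:
        E = F.copy()
    R, S = np.ones(x.size), np.ones(t.size)
    for _ in range(20):                      # alternating fixed-point iterations
        s1, s2 = S @ D @ S, S @ S
        R = np.linalg.solve(s1 * np.eye(x.size) - alpha * s2 * A, E @ S)
        r1, r2 = R @ R, R @ (A @ R)
        S = np.linalg.solve(r1 * D - alpha * r2 * np.eye(t.size), E.T @ R)
    modes_X.append(R)
    modes_T.append(S)

U_pgd = sum(np.outer(X, T) for X, T in zip(modes_X, modes_T))
print("separated approximation with", len(modes_X), "modes, max value", U_pgd.max())
```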
Despite the advances in hardware and software techniques, standard numerical methods fail in providing real-time simulations, especially for complex processes such as additive manufacturing applications. A real-time simulation enables process control through the combination of process monitoring and automated feedback, which increases the flexibility and quality of a process. Typically, before producing a whole additive manufacturing structure, a simplified experiment in the form of a bead-on-plate experiment is performed to get a first insight into the process and to set parameters suitably. In this work, a reduced order model for the transient thermal problem of the bead-on-plate weld simulation is developed, allowing an efficient model calibration and control of the process. The proposed approach applies the proper generalized decomposition (PGD) method, a popular model order reduction technique, to decrease the computational effort of each model evaluation required multiple times in parameter estimation, control and optimization. The welding torch is modeled by a moving heat source, which leads to difficulties separating space and time, a key ingredient in PGD simulations. A novel approach for separating space and time is applied and extended to 3D problems, allowing the derivation of an efficient separated representation of the temperature. The results are verified against a standard finite element model showing excellent agreement. The reduced order model is also leveraged in a Bayesian model parameter estimation setup, speeding up calibrations and ultimately leading to an optimized real-time simulation approach for welding experiments using synthetic as well as real measurement data.
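The Bayesian calibration step can be sketched with a plain random-walk Metropolis sampler. The analytic forward model below is a cheap stand-in for the PGD reduced order model (the actual ROM is not reproduced here), the data are synthetic, and all numbers are illustrative assumptions.

```python
# Random-walk Metropolis sketch for calibrating one heat-source parameter from
# noisy (synthetic) temperature data. forward() is a cheap analytic stand-in
# for the reduced order model; all numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
times = np.linspace(1.0, 10.0, 20)

def forward(power):
    # placeholder surrogate: temperature rise at a fixed probe location
    return 20.0 + power * (1.0 - np.exp(-0.3 * times))

true_power, sigma = 5.0, 0.5
data = forward(true_power) + rng.normal(0.0, sigma, times.size)

def log_post(power):
    if not 0.0 < power < 50.0:                  # uniform prior bounds
        return -np.inf
    residual = data - forward(power)
    return -0.5 * np.sum((residual / sigma) ** 2)

current = 10.0
lp = log_post(current)
samples = []
for _ in range(20000):
    proposal = current + rng.normal(0.0, 0.3)   # random-walk proposal
    lp_prop = log_post(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis acceptance
        current, lp = proposal, lp_prop
    samples.append(current)

posterior = np.array(samples[5000:])            # discard burn-in
print(f"posterior mean {posterior.mean():.2f}, std {posterior.std():.2f}")
```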
Concrete is one of the most important construction materials worldwide and is characterized by an enormous adaptability to changing requirements. This comes with a high and continuously increasing complexity with respect to raw materials, mix designs and the manufacturing process. Consequently, exploiting the technical and environmental potential of concrete construction demands the highest level of expertise from the individual actors of the construction industry.
FenicsXConcrete
(2023)
Additive manufacturing (AM) has revolutionized the manufacturing industry, offering a new paradigm to produce complex geometries and parts with customized properties. Among the different AM techniques, the wire arc additive manufacturing (WAAM) process has gained significant attention due to its high deposition rate and low equipment cost. However, the process is characterized by a complex thermal history, making it challenging to simulate in real time for online process control and optimization.
In this context, a reduced order model (ROM) using the proper generalized decomposition (PGD) method [1] is proposed as a powerful tool to overcome the limitations of conventional numerical methods and enable the real-time simulation of the temperature field of WAAM processes. These simulations use a moving heat source leading to a hardly separable parametric problem, which is handled by applying a novel mapping approach [2]. This procedure makes it possible to create a simple separated representation of the model, which also allows multiple layers to be simulated.
In this contribution, a PGD model is derived for the temperature field simulation of the WAAM process. A good agreement with a standard finite element method is shown. The reduced model is further used in a stochastic model parameter estimation using Bayesian inference, speeding up calibrations and ultimately leading to a calibrated real-time simulation.
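Why a moving heat source is "hardly separable", and why a mapping helps, can be illustrated with a small rank experiment: the space-time field of a travelling Gaussian source needs many separated modes in the fixed frame, but becomes rank one in a coordinate moving with the torch. This is only a conceptual sketch; the mapping used in the cited work [2] may differ in detail, and the grids, width and velocity are illustrative values.

```python
# Rank experiment: a travelling Gaussian source in the fixed frame versus the
# same source expressed in the co-moving coordinate xi = x - v*t.
import numpy as np

nx, nt, v = 400, 300, 1.0
x = np.linspace(0.0, 10.0, nx)
t = np.linspace(0.0, 8.0, nt)
X, T = np.meshgrid(x, t, indexing="ij")

Q_fixed = np.exp(-((X - v * T) ** 2) / 0.05)   # moving source, fixed frame
XI = X                                          # reuse the grid as xi = x - v*t
Q_mapped = np.exp(-(XI ** 2) / 0.05)            # same source, co-moving frame (time independent)

def numerical_rank(M, tol=1e-8):
    s = np.linalg.svd(M, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

print("separated rank, fixed frame:    ", numerical_rank(Q_fixed))
print("separated rank, co-moving frame:", numerical_rank(Q_mapped))
```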
The rapidly growing importance of FAIR and open data for quality assurance, but also for the reusability of data and the advancement of knowledge, creates an enormous need for action in research and development. In parallel, a wide range of ambitious activities is currently under way, for example concerning the creation of ontologies and knowledge graphs. The know-how is evolving rapidly, and implementation approaches are emerging in different communities and with different objectives in parallel, resulting in rather heterogeneous methodologies.
This publication focuses on work that is currently being advanced as a holistic approach for materials data within the digitalization initiative "Plattform MaterialDigital". The authors address building-material-related aspects in the joint project "LeBeDigital - Lebenszyklus von Beton" (life cycle of concrete). The objective is the digital description of the material behaviour of concrete over the complete manufacturing process of a precast element, with an integration of data and models within a workflow for probabilistic material and process optimization.
The procedure and the experience gained are reported, while also drawing attention to the often underestimated complexity of the topic.
Concrete has a long history in the construction industry and is currently one of the most widely used building materials. Unfortunately, the concrete industry has a significant impact on the environment by contributing about 9% of the total anthropogenic greenhouse gas (GHG) emissions. Concrete is a highly complex composite material. However, the main source of concrete's GHG emissions is the cement. This leads to two main strategies when trying to reduce the environmental impact. The first is to reduce the cement within the concrete mix. This can be done by substituting it using additives or by increasing the amount of aggregates. Usually, this leads to decreased material properties, such as compressive strength or stiffness. The second option is to reduce the amount of required concrete by optimizing the topology of the structure. However, this might require a higher compressive strength. In addition, there are other properties, such as the workability, which need to be considered. All in all, this leads to a highly complex optimization problem, which requires the estimation of effective concrete properties, based on the mixture, as input to a predictive simulation.
We present an automated workflow framework which combines experimental data with simulations, calibrates the simulation and performs the desired optimization. This workflow includes classical FE models, design guidelines based on model codes, as well as data-driven methods. The chosen example is a beam, for which the concrete mixture is optimized to reduce GHG emissions. The first step is an estimation of material parameters based on experimental data. This includes measures of the stochastic distribution, allowing the quality of the estimated parameters to be quantified. The second step is the optimization. It takes into account constraints such as the loading capacity after 28 days, the maximum allowed temperature during cement hydration and the maximum time until demoulding. The applied models include a Mori-Tanaka-based homogenization method to estimate effective concrete parameters and an FE simulation including the evolution of the concrete compressive strength and stiffness, the temperature field, displacements, and stresses. This research shows a way towards a more performance-oriented material design.
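The Mori-Tanaka homogenization step mentioned above can be sketched for the simplest case of a two-phase composite with spherical inclusions. The closed-form estimate below is the standard Mori-Tanaka result for isotropic phases; the moduli and volume fraction are illustrative assumptions, not values from the study.

```python
# Mori-Tanaka estimate of the effective elastic constants of a two-phase
# composite with spherical inclusions (matrix = cement paste, inclusions =
# aggregates). Input moduli and volume fraction are illustrative assumptions.
def bulk_shear(E, nu):
    return E / (3 * (1 - 2 * nu)), E / (2 * (1 + nu))

def mori_tanaka(E_m, nu_m, E_i, nu_i, c_i):
    K_m, G_m = bulk_shear(E_m, nu_m)
    K_i, G_i = bulk_shear(E_i, nu_i)
    # Eshelby-based reference terms for spherical inclusions
    K_star = 4.0 / 3.0 * G_m
    G_star = G_m * (9.0 * K_m + 8.0 * G_m) / (6.0 * (K_m + 2.0 * G_m))
    K_eff = K_m + c_i * (K_i - K_m) / (1.0 + (1.0 - c_i) * (K_i - K_m) / (K_m + K_star))
    G_eff = G_m + c_i * (G_i - G_m) / (1.0 + (1.0 - c_i) * (G_i - G_m) / (G_m + G_star))
    E_eff = 9.0 * K_eff * G_eff / (3.0 * K_eff + G_eff)
    nu_eff = (3.0 * K_eff - 2.0 * G_eff) / (2.0 * (3.0 * K_eff + G_eff))
    return E_eff, nu_eff

# matrix: hydrated paste (~20 GPa), inclusions: aggregate (~60 GPa), 70 % aggregate
E_eff, nu_eff = mori_tanaka(E_m=20e9, nu_m=0.2, E_i=60e9, nu_i=0.25, c_i=0.7)
print(f"effective E = {E_eff/1e9:.1f} GPa, nu = {nu_eff:.3f}")
```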
Additive manufacturing (AM) has revolutionized the manufacturing industry, offering a new paradigm to produce complex geometries and parts with customized properties. Among the different AM techniques, the wire arc additive manufacturing (WAAM) process has gained significant attention due to its high deposition rate and low equipment cost. However, the process is characterized by a complex thermal history, dynamic metallurgy, and mechanical behaviour that make it challenging to simulate in real time for online process control and optimization.
In this context, a reduced order model (ROM) using the proper generalized decomposition (PGD) method is proposed as a powerful tool to overcome the limitations of conventional numerical methods and enable the real-time simulation of the temperature field of WAAM processes. However, the simulation of a moving heat source leads to a hardly separable parametric problem, which is handled by applying a novel mapping approach. Using this procedure, it is possible to create a simple separated representation of the model, also allowing multiple layers to be simulated.
In this contribution, a PGD model is derived for the WAAM process, simulating the temperature field. A good agreement with a standard finite element method is shown. The reduced model is further used in a stochastic model parameter estimation using Bayesian inference, speeding up calibrations and ultimately leading to a calibrated real-time simulation.
Multiscale modeling of heterogeneous structures based on a localized model order reduction approach
(2023)
Many of today's problems in engineering demand reliable and accurate prediction of failure mechanisms of mechanical structures. Thus, it is necessary to take into account the heterogeneous structure on the smaller scale to capture the underlying physical phenomena. However, this poses a great challenge to the numerical solution, since the computational cost is significantly increased by resolving the smaller scale in the model. Moreover, in applications where scale separation as the basis of classical homogenization schemes does not hold, the influence of the smaller scale on the larger scale has to be modelled directly. This work aims to develop an efficient concurrent methodology to model heterogeneous structures combining the variational multiscale method (VMM) [1] and model order reduction techniques (e.g. [2]). First, the influence of the smaller scale on the larger scale can be taken into account following the additive split of the displacement field as in the VMM. Here, a decomposition of the global domain into subdomains, each containing a fine grid discretization of the smaller scale, is also introduced. Second, local reduced approximation spaces for the smaller scale solution are constructed by exploring possible solutions for each subdomain based on the concept of oversampling [3]. The associated transfer operator is approximated by random sampling [4]. Herein, we propose to incorporate the actual physical behaviour of the structure of interest in the training data by drawing random samples from a multivariate normal distribution with the solution of a reduced global problem as mean. The local reduced spaces are designed such that local contributions of each subdomain can be coupled in a conforming way. Thus, the resulting global system is sparse and reduced in size compared to the direct numerical simulation, leading to a faster solution of the problem.
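The role of the sampling strategy in the basis construction can be sketched with a synthetic transfer operator: boundary data are drawn either uncorrelated (i.i.d. normal) or correlated (multivariate normal centred at a placeholder "global" solution), and a local basis is extracted from the resulting snapshots by POD/SVD. The operator, covariance and tolerance are synthetic assumptions and do not reproduce the structures or findings of the work.

```python
# Conceptual sketch: approximate the range of a transfer operator (boundary
# data -> interior solution of a subdomain) from random samples and extract a
# reduced basis by POD/SVD. Operator, covariance and tolerance are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_bc, n_in = 120, 400                        # boundary / interior dofs of a subdomain

# synthetic transfer operator with decaying singular values
T = rng.standard_normal((n_in, n_bc)) @ np.diag(np.exp(-0.1 * np.arange(n_bc)))

def pod_basis(snapshots, tol=1e-6):
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    n = int(np.sum(s / s[0] > tol))
    return U[:, :n]

n_samples = 30
# (a) uncorrelated samples of the boundary data
S_uncorr = T @ rng.standard_normal((n_bc, n_samples))

# (b) correlated samples: multivariate normal around a coarse "global" solution
mean_bc = np.sin(np.linspace(0.0, np.pi, n_bc))   # placeholder for the reduced global solution
cov = np.exp(-np.abs(np.subtract.outer(np.arange(n_bc), np.arange(n_bc))) / 10.0)
S_corr = T @ rng.multivariate_normal(mean_bc, cov, size=n_samples).T

print("basis size, uncorrelated sampling:", pod_basis(S_uncorr).shape[1])
print("basis size, correlated sampling:  ", pod_basis(S_corr).shape[1])
```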
In recent years, the use of simulation-based digital twins for monitoring and assessment of complex mechanical systems has greatly expanded. Their potential to increase the information obtained from limited data makes them an invaluable tool for a broad range of real-world applications. Nonetheless, there usually exists a discrepancy between the predicted response and the measurements of the system once built. One of the main contributors to this difference, in addition to miscalibrated model parameters, is the model error. Quantifying this so-called model bias (as well as proper values for the model parameters) is critical for the reliable performance of digital twins. Model bias identification is ultimately an inverse problem where information from measurements is used to update the original model. Bayesian formulations can tackle this task. Including the model bias as a parameter to be inferred enables the use of a Bayesian framework to obtain a probability distribution that represents the uncertainty between the measurements and the model. Simultaneously, this procedure can be combined with a classic parameter updating scheme to account for the trainable parameters in the original model. This study evaluates the effectiveness of different model bias identification approaches based on Bayesian inference methods. This includes more classical approaches such as direct parameter estimation using MCMC in a Bayesian setup, as well as more recent proposals such as statFEM or orthogonal Gaussian Processes. Their potential use in digital twins, generalization capabilities, and computational cost are extensively analyzed.
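One simple flavour of bias identification (deliberately not statFEM or the orthogonal GP formulation compared in the study) can be sketched as follows: a model parameter is fitted first, and the remaining discrepancy between measurements and the calibrated model is represented by a Gaussian-process regression over the sensor locations. Model, kernel and numbers are illustrative assumptions.

```python
# Simplified sketch of model bias identification: fit a model parameter by
# least squares, then estimate the bias as a GP regression of the residuals.
# The "physical" model, kernel and all numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 25)                      # sensor locations

def model(theta, x):
    return theta * x                               # simplified model (misses curvature)

truth = 2.0 * x + 0.3 * np.sin(3.0 * x)            # "reality" = model + structural error
y = truth + rng.normal(0.0, 0.02, x.size)          # noisy measurements

# step 1: least-squares estimate of the model parameter
theta_hat = float(np.dot(x, y) / np.dot(x, x))
residual = y - model(theta_hat, x)

# step 2: GP regression of the residual = estimated model bias
def rbf(a, b, ell=0.15, var=0.1):
    return var * np.exp(-0.5 * np.subtract.outer(a, b) ** 2 / ell**2)

noise = 0.02 ** 2
K = rbf(x, x) + noise * np.eye(x.size)
alpha = np.linalg.solve(K, residual)
bias_mean = rbf(x, x) @ alpha                      # posterior bias mean at the sensors

print(f"theta_hat = {theta_hat:.3f}")
print(f"max estimated bias = {np.max(np.abs(bias_mean)):.3f}")
```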
For extrusion-based 3D concrete printing, the early-age mechanical behavior is influenced by various time-dependent phenomena: structural build-up, plasticity, as well as viscosity. The structural build-up governs the stability and early-age strength development of fresh printable cementitious materials and thereby influences the printability, buildability, and open time of the printing process. Generally, it is influenced by a number of factors, i.e. the composition of the printable material, the printing regime, and the ambient conditions (temperature, humidity, etc.). There are several approaches to model the structural build-up of cementitious materials. All models are based on a time-dependent internal structural parameter describing the flocculation state, which is assumed to be zero after mixing and increases with time. The approaches differ in the definition of the time dependency (linear, exponential, bi-linear). Usually, the parameters are defined for a specific material composition without considering the influence of ambient conditions.
In this contribution, the bi-linear structural build-up model [Kruger et al., Construction and Building Materials 224, 2019] is extended by the temperature influence. Temperature changes occur in real-life printing processes due to changing ambient conditions (summer, winter, day, night) as well as the printing process itself (pressure changes, etc.) and have a significant impact on the structural build-up process: an increase of the temperature leads to a faster dissolution of cement phases, accelerates hydration and boosts the Brownian motion. For that reason, the model parameters are made temperature dependent using an Arrhenius function. Furthermore, the proposed extended model is calibrated based on measurement data using Bayesian inference. A very good agreement of the predicted model data with the measured control data was reached. Additionally, the structural build-up model is integrated into a viscoelastic and elastoplastic mechanical model, simulating the whole mechanical behavior during layer deposition.
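A minimal sketch of such a temperature-dependent bi-linear build-up model is given below: the re-flocculation and structuration rates are scaled by an Arrhenius factor relative to a reference temperature. The functional form is paraphrased from the abstract and the cited bi-linear model; all parameter values (rates, activation energy, transition time) are illustrative assumptions, not the calibrated values.

```python
# Sketch of a bi-linear structural build-up model (in the spirit of Kruger et
# al.) with Arrhenius-scaled rates to introduce a temperature dependence.
# All parameter values are illustrative assumptions.
import numpy as np

R_GAS = 8.314  # J/(mol K)

def arrhenius_scale(T, T_ref=293.15, E_a=30e3):
    """Scale a rate measured at T_ref to temperature T (both in kelvin)."""
    return np.exp(-E_a / R_GAS * (1.0 / T - 1.0 / T_ref))

def static_yield_stress(t, T, tau0=300.0, R_thix=2.0, A_thix=0.5, t_rf=120.0):
    """Bi-linear build-up: fast re-flocculation (R_thix) up to t_rf, slower
    structuration (A_thix) afterwards; both rates are temperature scaled."""
    k = arrhenius_scale(T)
    R_T, A_T = R_thix * k, A_thix * k
    t = np.asarray(t, dtype=float)
    return np.where(t <= t_rf,
                    tau0 + R_T * t,
                    tau0 + R_T * t_rf + A_T * (t - t_rf))

times = np.linspace(0.0, 600.0, 7)                 # seconds at rest
for T_c in (10.0, 20.0, 30.0):
    tau = static_yield_stress(times, T=T_c + 273.15)
    print(f"{T_c:4.0f} C:", np.round(tau, 1))
```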
FAIR (findable, accessible, interoperable and reusable) data usage is one of the main principles that many research and funding organizations include in their strategic plans, which means that following the main principles of FAIR data is required in many research projects. The definition of data being FAIR is very general. When implementing it for a specific application or project, or even setting a standardized procedure within a working group, a company or a research community, many challenges arise. In this contribution, an overview of our experience with different methods and tools is outlined.
We begin with a motivation on potential use cases for the application of FAIR data with increasing complexity starting from a reproducible research paper over collaborative projects with multiple participants such as Round-Robin tests up to data-based models within standardization codes, applications in machine learning or parameter estimation of physics-based simulation models.
In a second part, different options for structuring the data (including metadata schemas) are discussed. The first one is the openBIS system, an open-source lab notebook and PostgreSQL-based data management system. A second option is a semantic representation using RDF based on ontologies for the domain of interest.
In a third section, requirements for workflow tools to automate data processing are discussed and their integration into reproducible data analysis is presented with an outlook on required information to be stored as metadata in the database.
Finally, the presented procedures are exemplarily demonstrated for the calibration of a temperature dependent constitutive model for additively manufactured mortar. A metadata schema for a rheological measurement setup is derived and implemented in an openBIS database. After a short review of a potential numerical model predicting the structural build-up behavior, the automatic workflow to use the stored data for model parameter estimation is demonstrated.
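For the semantic option, a minimal rdflib sketch shows how one rheological measurement could be described as RDF triples. The namespace, class and property names are hypothetical placeholders, not the actual LeBeDigital or openBIS vocabulary.

```python
# Minimal sketch of the semantic (RDF) option: describing one rheological
# measurement with triples. Namespace and terms are hypothetical placeholders.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

EX = Namespace("https://example.org/rheology#")
g = Graph()
g.bind("ex", EX)

meas = URIRef("https://example.org/data/measurement_001")
g.add((meas, RDF.type, EX.RheologicalMeasurement))
g.add((meas, EX.device, Literal("rotational viscometer")))
g.add((meas, EX.ambientTemperature, Literal(20.0, datatype=XSD.double)))
g.add((meas, EX.restingTime, Literal(300, datatype=XSD.integer)))
g.add((meas, EX.staticYieldStress, Literal(612.5, datatype=XSD.double)))

print(g.serialize(format="turtle"))
```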
Structural build-up describes the stability and early-age strength development of fresh mortar used in 3D printing. It is influenced by several factors, i.e. the composition of the printable material, the printing regime, and the ambient conditions. The existing modelling approaches for structural build-up usually define the model parameters for a specific material composition without considering the influence of the ambient conditions. The goal of this contribution is to explicitly include the temperature dependency in the modelling approach. Temperature changes have a significant impact on the structural build-up process: an increase of the temperature leads to a faster dissolution of cement phases and accelerates hydration. The proposed extended model includes temperature dependency using the Arrhenius theory. The new model parameters are successfully calibrated based on Viskomat measurement data using Bayesian inference. Furthermore, a higher impact of the temperature in the re-flocculation than in the structuration stage is observed.
The aim of the project LeBeDigital is to present opportunities of digitalization for concrete applications and to show a way towards a performance-oriented material design.
Due to the high complexity of the manufacturing process of concrete and the range of parameters affecting the effective composite properties, a global optimization is challenging. Currently, most optimization is only carried out on a narrow scope related to the respective players, e.g. a mix optimization for a target strength, or a design optimization for minimum weight using a given mix. Enabling a path toward a full global optimization requires a reproducible chain of data, accessible to all contributors.
We propose a framework based on an ontology, which automatically combines experimental data with numerical simulations. This not only simplifies experimental knowledge transfer, but also allows the model calibration and the resulting simulation predictions to be reproducible and interpretable. In addition to an optimized set of parameters, this setup makes it possible to study the quality and uncertainty of the data and models, as well as giving information about optimal experiments to improve the data set.
We will present the proposed optimization workflow, using the example of a precast concrete element. The contribution will focus on the workflow and challenges of an interoperable FEM formulation.
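The optimization step of such a workflow can be sketched with a simple constrained minimization: an (assumed) CO2 objective is minimized over the aggregate content subject to a predicted 28-day strength target. Both surrogate functions and all numbers are hypothetical placeholders standing in for the calibrated models of the workflow.

```python
# Conceptual sketch of the optimization step: minimize a CO2 surrogate over the
# aggregate content with a compressive strength constraint. The surrogates and
# all numbers are hypothetical placeholders.
from scipy.optimize import minimize

def co2_per_m3(agg_ratio):
    # more aggregate -> less cement -> lower emissions (illustrative linear surrogate)
    return 400.0 * (1.0 - agg_ratio) + 20.0

def strength_28d(agg_ratio):
    # illustrative surrogate: strength drops as cement is replaced by aggregate
    return 80.0 * (1.0 - agg_ratio) ** 0.5

target_strength = 38.0   # MPa

res = minimize(
    lambda r: co2_per_m3(r[0]),
    x0=[0.6],
    bounds=[(0.4, 0.85)],
    constraints=[{"type": "ineq", "fun": lambda r: strength_28d(r[0]) - target_strength}],
    method="SLSQP",
)
r_opt = res.x[0]
print(f"aggregate ratio {r_opt:.3f}, CO2 {co2_per_m3(r_opt):.0f} kg/m3, "
      f"strength {strength_28d(r_opt):.1f} MPa")
```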
With increasing focus on industrialized processing, investigating, understanding, and modelling the structural build-up of cementitious materials becomes more important. The structural build-up governs the key property of fresh printable materials -- buildability -- and it influences the mechanical properties after the deposition. The structural build-up rate can be adjusted by optimization of the mixture composition and the use of concrete admixtures. Additionally, it is known that the environmental conditions, i.e. humidity and temperature, have a significant impact on the kinetics of cement hydration and the resulting hardened properties, such as shrinkage, cracking resistance etc. In this study, small amplitude oscillatory shear (SAOS) tests are applied to examine the structural build-up rate of cement paste subject to different temperatures under controlled humidity. The results indicate significant influences of the ambient temperature on the intensity of the re-flocculation (Rthix) rate, while the structuration rate (Athix) is almost not affected. A bi-linear thixotropy model extended by temperature dependent parameters, coupled with a linear viscoelastic material model, is proposed to simulate the mechanical behaviour considering the structural build-up during the SAOS test.
Numerical simulations are essential in predicting the behavior of systems in many engineering fields and industrial sectors. The development of accurate virtual representations of actual physical products or processes (also known as digital twins) allows huge savings in cost and resources. In fact, digital twins would allow reducing the number of real, physical prototypes, tests, and experiments, thus also increasing the sustainability of production processes and products’ lifetime. Standard numerical methods fail in providing real time simulations, especially for complex processes such as additive manufacturing applications.
This work aims to use a reduced order model for efficient wire arc additive manufacturing simulations, calibrations and real-time process control. Model reduction, e.g. the proper generalized decomposition method [1,2], is a popular concept to decrease the computational effort. A new mapping approach [3] was applied to simulate a moving heat source with the proper generalized decomposition. Using this procedure, even complex models can be simulated in real time. The physical model is later on calibrated with the use of a stochastic model updating process and the reduced order model, leading to an optimized real-time simulation.
In this contribution, a proper generalized decomposition model for a bead-on-plate wire arc additive manufacturing process is presented. It is also coupled with a stochastic model updating process identifying the heat source characteristics as well as the boundary conditions of the transient thermal problem, where the heat source shape is modelled using a Goldak heat source.
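The Goldak heat source mentioned above is a double-ellipsoid power density in torch-fixed coordinates; a minimal implementation is sketched below. The geometry parameters, power and front/rear fractions are illustrative values, not the identified characteristics.

```python
# Sketch of a Goldak double-ellipsoid heat source: power density in front of
# and behind the torch centre, in torch-fixed coordinates (x along travel).
# Geometry parameters and power are illustrative values.
import numpy as np

def goldak(x, y, z, Q=2000.0, a=4e-3, b=3e-3, cf=3e-3, cr=8e-3, ff=0.6, fr=1.4):
    """Power density [W/m^3]; x along travel, ff + fr = 2 (Goldak's convention)."""
    c = np.where(x >= 0.0, cf, cr)     # front / rear semi-axis along travel
    f = np.where(x >= 0.0, ff, fr)     # front / rear power fraction
    pref = 6.0 * np.sqrt(3.0) * f * Q / (a * b * c * np.pi * np.sqrt(np.pi))
    return pref * np.exp(-3.0 * (x**2 / c**2 + y**2 / a**2 + z**2 / b**2))

# sample the source along the travel direction at the torch centre line
xs = np.linspace(-0.02, 0.01, 7)
print(np.round(goldak(xs, 0.0, 0.0) / 1e9, 2), "GW/m^3")
```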
Concrete has a long history in the construction industry and is currently one of the most widely used building materials. Especially precast concrete elements are frequently utilized in construction projects for standardized applications, increasing the quality of the composite material as well as reducing the required building time. Despite the accumulated knowledge, continuous research and development in this field is essential due to the complexity of the composite combined with the ever-growing number of applications and requirements. Especially in view of global climate change, design aspects such as CO2 emissions and resource efficiency require new mix designs and optimization strategies. A result of the material's high complexity and heterogeneity on multiple scales is that utilizing the full potential with changing demands is highly challenging, even for the established industry. We propose a framework based on an ontology, which automatically combines experimental data with numerical simulations. This not only simplifies experimental knowledge transfer, but also allows the model calibration and the resulting simulation predictions to be reproducible and interpretable. This research shows a way towards a more performance-oriented material design. In this talk, we present our workflow for an automated simulation of a precast element, demonstrating the interaction of the ontology and the finite element simulation. We show the automatic calibration of our early-age concrete model [1, 2] to improve the prediction of the optimal time for the removal of the formwork.
The amount of data generated worldwide is constantly increasing. These data come from a wide variety of sources and systems, are processed differently, have a multitude of formats, and are stored in an untraceable and unstructured manner, predominantly in natural language in data silos. This problem equally applies to the heterogeneous research data from materials science and engineering. In this domain, ways and solutions are increasingly being generated to smartly link material data together with their contextual information in a uniform and well-structured manner on platforms, thus making them discoverable, retrievable, and reusable for research and industry. Ontologies play a key role in this context. They enable the sustainable representation of expert knowledge and the semantically structured filling of databases with computer-processable data triples.
In this perspective article, we present the project initiative Materials-open-Laboratory (Mat-o-Lab) that aims to provide a collaborative environment for domain experts to digitize their research results and processes and make them fit for data-driven materials research and development. The overarching challenge is to generate connection points to further link data from other domains to harness the promised potential of big materials data and harvest new knowledge.