An analytical expression for the frequency response function of a coupled pedestrian–bridge system is presented and evaluated using an experimental measurement campaign performed on the Folke Bernadotte Bridge in Stockholm, Sweden. A finite element model and the modal models that consider the human–structure interaction effect are calibrated with respect to the measurements. The properties of the spring–mass–damper model representing the pedestrians were identified, considering the different structural modes of the system. Good agreement was obtained between the experimental and theoretical frequency response functions. A sensitivity analysis of the obtained solution was performed, validating the derived analytical expression for the frequency response function of the coupled system that accounts for the human–structure interaction effect.
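For orientation, a minimal sketch of the type of expression involved, assuming the bridge is reduced to a single structural mode (modal mass $m_s$, damping $c_s$, stiffness $k_s$) coupled to one spring–mass–damper pedestrian model ($m_p$, $c_p$, $k_p$); the notation and the single-mode reduction are illustrative assumptions, not the paper's formulation:

\[
H(\omega) = \frac{X_s(\omega)}{F(\omega)} = \frac{D_p(\omega)}{D_s(\omega)\,D_p(\omega) - \kappa(\omega)^2},
\]
\[
D_s = -\omega^2 m_s + i\omega(c_s + c_p) + (k_s + k_p), \qquad
D_p = -\omega^2 m_p + i\omega c_p + k_p, \qquad
\kappa = i\omega c_p + k_p,
\]

where $F$ is the force applied to the deck and $X_s$ its response; the coupling term $\kappa$ carries the human–structure interaction effect.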
The Infrastar training school provides lectures and hands-on training to Master's and PhD students, early-stage researchers, and (young) professionals on all aspects of the asset management of civil infrastructures with respect to fatigue of materials. It has been organized yearly since 2019, and up to 20 participants are selected to attend the 3.5-day training.
A team of 7 teachers provides insight into multidisciplinary and intersectoral basic concepts in three core fields, spanning the design through the dismantling of structures (bridges and wind turbines):
1. Monitoring and auscultation,
2. Structural and action models,
3. Reliability, risk and decision analyses.
Each year, a recognized expert is invited to deliver a keynote lecture to raise interest in a specific aspect of one of the fields covered. The presentation is recorded and published on the training school website.
For almost 50 years, reliability assessments of non-destructive testing (NDT) methods have been applied successfully in various industrial sectors. Nevertheless, many reservations remain, since the basis of assessment is not standardized and concrete procedural instructions are lacking. This stands in contrast to the recognition the topic enjoys in the NDT community. So far, only few are aware that knowledge about the reliability of an NDT method plays a key role for NDT 4.0.
The talk demonstrates the need for a standard or a guideline on reliability assessment. It discusses why no such guideline has existed (in Germany) to date and what users can expect from a guideline on the reliability assessment of NDT methods.
Concrete steps and procedures for assessing the reliability of a method are presented, from the definition of an application case, through the production of suitable test specimens, to the execution and evaluation of tests including human influences. The examples cover applications from civil engineering as well as from mechanical and plant engineering.
The talk summarizes the completed and planned work in the WIPANO project „Normung für die probabilistische Bewertung der Zuverlässigkeit für zerstörungsfreie Prüfverfahren“ (standardization of the probabilistic reliability assessment of non-destructive testing methods).
Human factors are a frequently mentioned topic when we talk about the reliability of non-destructive testing (NDT). However, the probability of detection (POD), the commonly used measure of NDT reliability, considers only the technical capability of an NDT system to detect a flaw. After several decades of research into the influence of human factors on NDT reliability, there is still no generally accepted approach for making human factors visible in reliability assessment. This contribution gives an overview of the various available methods for incorporating human factors into reliability assessment. The topic is an essential part of the ongoing WIPANO project "normPOD", which aims to advance the standardization of reliability assessment in Germany and, in contrast to the already established international standards, to place a focus on the treatment of human factors.
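For reference, the POD is the probability of detecting a flaw as a function of its size $a$; a widely used parametric form (an illustrative choice, not one prescribed by the contribution) is

\[
\mathrm{POD}(a) = \Pr(\text{detection} \mid a) = \Phi\!\left(\frac{\ln a - \mu}{\sigma}\right),
\]

where $\Phi$ is the standard normal CDF and $\mu$, $\sigma$ are fitted from hit/miss or signal-response data. Human factors enter, if at all, only implicitly through the scatter of these data.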
Every engineering decision is based on a number of more or less accurate pieces of information. In the assessment of existing structures, additional relevant information collected during on-site inspections facilitates better decisions. However, observed data represent the physical characteristic of interest only with an associated uncertainty. This uncertainty is a measure of the inspection quality and can be quantified by expressing the measurement uncertainty. The internationally accepted rules for calculating measurement uncertainty are well established and can be applied straightforwardly in many practical cases. Nevertheless, the calculations require the occasionally time-consuming development of an individually suitable measurement model. This contribution presents proposals for modelling the non-destructive depth measurement of tendons in concrete using the ultrasonic echo technique. The proposed model can serve as a guideline for determining the quality of the measured information in future comparable inspection scenarios.
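As a sketch of the underlying rule: for a measurement model $Y = f(X_1, \dots, X_N)$ relating the measurand (here, the tendon depth) to the input quantities, the GUM combines the standard uncertainties $u(x_i)$ of the input estimates into the combined standard uncertainty of the result (first order, uncorrelated inputs):

\[
u_c(y) = \sqrt{\sum_{i=1}^{N} \left(\frac{\partial f}{\partial x_i}\right)^2 u^2(x_i)}.
\]

The task addressed by the contribution is essentially to identify a suitable $f$ and the relevant $X_i$ (e.g., time-of-flight and wave speed in the ultrasonic echo technique).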
The acquisition and appropriate processing of relevant information about the considered system remains a major challenge in the assessment of existing structures. Both the values and the validity of computed results such as failure probabilities depend essentially on the quantity and quality of the incorporated knowledge. One source of information is on-site measurements of structural or material characteristics to be modeled as basic variables in reliability assessment. The explicit use of (quantitative) measurement results in assessment requires quantifying the quality of the measured information, i.e., the uncertainty associated with the acquisition and processing of the information. This uncertainty can be referred to as measurement uncertainty. Another crucial aspect is to ensure the comparability of the measurement results. This contribution outlines the necessity and the advantages of measurement uncertainty calculations in the modeling of measurement-data-based random variables to be included in reliability assessment. It is shown how measured data representing time-invariant characteristics, in this case non-destructively measured inner geometrical dimensions, can be transformed into measurement results that are both comparable and quality-evaluated. The calculations are based on the rules provided in the Guide to the Expression of Uncertainty in Measurement (GUM). The GUM framework is internationally accepted in metrology and can serve as a starting point for the appropriate processing of measured data to be used in assessment. In conclusion, the effects of incorporating the non-destructively measured data into reliability analysis are presented using a prestressed concrete bridge as a case study.
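A minimal sketch of how such a quality-evaluated measurement result might be produced, assuming n repeated readings of an inner dimension plus one systematic (Type B) contribution; all numbers and names are hypothetical, not the paper's data:

```python
import math

# Hedged sketch (not the paper's model): GUM-style evaluation of a
# non-destructively measured dimension from n repeated readings plus
# one systematic (Type B) contribution, e.g. a calibration bound.

readings_mm = [61.8, 62.3, 61.9, 62.1, 62.4]  # hypothetical repeated measurements

n = len(readings_mm)
mean = sum(readings_mm) / n

# Type A: standard uncertainty of the mean from the sample scatter
s = math.sqrt(sum((x - mean) ** 2 for x in readings_mm) / (n - 1))
u_type_a = s / math.sqrt(n)

# Type B: e.g. a +/-0.5 mm calibration bound, assumed rectangular
u_type_b = 0.5 / math.sqrt(3)

# Combined standard uncertainty (uncorrelated contributions)
u_c = math.sqrt(u_type_a ** 2 + u_type_b ** 2)

print(f"dimension = {mean:.2f} mm, u_c = {u_c:.2f} mm")
```

The pair (mean, u_c) can then parameterize a basic variable, e.g. a normal distribution, in the reliability model, which is the transfer step the contribution argues for.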
A modern-day light microscope has evolved from a tool devoted to making primarily empirical observations into a sophisticated, quantitative device that is an integral part of both physical and life science research. Nowadays, microscopes are found in nearly every experimental laboratory. However, despite their prevalent use in capturing and quantifying scientific phenomena, neither a thorough understanding of the principles underlying quantitative imaging techniques nor appropriate knowledge of how to calibrate, operate and maintain microscopes can be taken for granted. This is clearly demonstrated by the well-documented and widespread difficulties that are routinely encountered in evaluating acquired data and reproducing scientific experiments. Indeed, studies have shown that more than 70% of researchers have tried and failed to repeat another scientist’s experiments, while more than half have even failed to reproduce their own experiments [1]. One factor behind the reproducibility crisis of experiments published in scientific journals is the frequent underreporting of imaging methods caused by a lack of awareness and/or a lack of knowledge of the applied technique [2,3]. Whereas quality control procedures for some methods used in biomedical research, such as genomics (e.g., DNA sequencing, RNA-seq) or cytometry, have been introduced (e.g., ENCODE [4]), this issue has not been tackled for optical microscopy instrumentation and images. Although many calibration standards and protocols have been published, there is a lack of awareness and agreement on common standards and guidelines for quality assessment and reproducibility [5].
In April 2020, the QUality Assessment and REProducibility for instruments and images in Light Microscopy (QUAREP-LiMi) initiative [6] was formed. This initiative comprises imaging scientists from academia and industry who share a common interest in achieving a better understanding of the performance and limitations of microscopes and improved quality control (QC) in light microscopy. The goal of the QUAREP-LiMi initiative is to establish a set of common QC standards, guidelines, metadata models [7,8], and tools [9,10], including detailed protocols, with the ultimate aim of improving reproducibility in scientific research.
This White Paper 1) summarizes the major obstacles identified in the field that motivated the launch of the QUAREP-LiMi initiative; 2) identifies the urgent need to address these obstacles in a grassroots manner, through a community of stakeholders including researchers, imaging scientists [11], bioimage analysts, bioimage informatics developers, corporate partners, funding agencies, standards organizations, scientific publishers, and observers; 3) outlines the current actions of the QUAREP-LiMi initiative; and 4) proposes future steps that can be taken to improve the dissemination and acceptance of the proposed guidelines to manage QC.
To summarize, the principal goal of the QUAREP-LiMi initiative is to improve the overall quality and reproducibility of light microscope image data by introducing broadly accepted standard practices and accurately captured image data metrics.
In many industrial sectors, Structural Health Monitoring (SHM) is considered an addition to Non-Destructive Testing (NDT) that can reduce the maintenance effort during the lifetime of a technical facility, structural component or vehicle. A large number of SHM methods are based on ultrasonic waves, whose properties change depending on the structural health. However, the wide application of SHM systems is limited by the lack of suitable methods to assess their reliability. The evaluation of system performance usually refers to the determination of the Probability of Detection (POD) of a test procedure. Up to now, only a few limited methods exist to evaluate the POD of SHM systems, which prevents them from being standardised and widely accepted in industry. The biggest hurdle concerning the POD calculation is the large number of samples needed. A POD analysis requires data from numerous identical structures with integrated SHM systems. Each structure is then damaged at different locations and with various degrees of severity. All of this entails high costs. Therefore, one possible way to tackle this problem is to perform computer-aided investigations. In this work, the POD assessment procedure established in NDT according to the Berens model is adapted to guided-wave-based SHM systems. The approach implemented here is based solely on computer-aided investigations. After efficient modelling of the wave propagation phenomena across an automotive component made of a carbon-fibre-reinforced composite, the POD curves are extracted. Finally, the novel concept of a POD map is introduced to examine the effect of damage position on system reliability.
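A minimal sketch of the Berens "â versus a" signal-response POD model named above, with synthetic data; in the SHM setting of the paper the responses would come from simulated damage cases rather than physical specimens, and all numbers here are hypothetical:

```python
import numpy as np
from scipy import stats

# Berens signal-response model: ln(a_hat) = b0 + b1*ln(a) + eps,
# eps ~ N(0, tau^2); a detection is declared when the response a_hat
# exceeds a decision threshold a_dec.

rng = np.random.default_rng(0)
a = np.exp(rng.uniform(np.log(0.5), np.log(10.0), 200))          # damage sizes (e.g. mm)
a_hat = np.exp(0.2 + 0.9 * np.log(a) + rng.normal(0, 0.3, a.size))  # system responses

# Linear regression in log-log space
b1, b0, *_ = stats.linregress(np.log(a), np.log(a_hat))
resid = np.log(a_hat) - (b0 + b1 * np.log(a))
tau = resid.std(ddof=2)

a_dec = 2.0  # hypothetical decision threshold on the response

def pod(size):
    """POD(a) = P(ln a_hat > ln a_dec) = Phi((b0 + b1*ln a - ln a_dec) / tau)."""
    return stats.norm.cdf((b0 + b1 * np.log(size) - np.log(a_dec)) / tau)

for size in (1.0, 2.0, 4.0):
    print(f"POD({size:.1f}) = {pod(size):.3f}")
```

A POD map in the sense of the paper would repeat this calculation for responses simulated at different damage positions on the component.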
The current practice of operating and maintaining deteriorating structural systems ensures acceptable levels of structural reliability, but it is not clear how efficient it is. Changing the current prescriptive approach to a risk-based approach has great potential to enable a more efficient management of such systems. Risk-based optimization of operation and maintenance strategies identifies the strategy that optimally balances the cost for controlling deterioration in a structural system with the achieved risk reduction. Inspections and monitoring are essential parts of operation and maintenance strategies. They are typically performed to reduce the uncertainty in the structural condition and inform decisions on future operation and maintenance actions. In risk-based optimization of operation and maintenance strategies, Bayesian updating is used to include information contained in inspection and monitoring data in the prediction of the structural reliability. All computations need to be repeated many times for different potential inspection and monitoring outcomes. This motivates the development of robust and efficient approaches to this computationally challenging task.
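The updating step can be written in one line; assuming $F$ denotes structural failure within a given period and $Z$ the event that the observed inspection or monitoring outcome occurs (notation illustrative):

\[
\Pr(F \mid Z) = \frac{\Pr(F \cap Z)}{\Pr(Z)}.
\]

Both the numerator and the denominator must be evaluated with the prior probabilistic model, and this has to be redone for each candidate outcome $Z$, which is what drives the computational burden described above.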
The reliability of deteriorating structural systems is time-variant because the loads on them and their capacities change with time. In most practical applications, the reliability analysis of deteriorating structural systems can be approached by dividing their lifetime into discrete time intervals. The time-variant reliability problem can then be represented by a series of time-invariant reliability problems. Using this methodology as a starting point, this thesis proposes a novel approach to compute the time-variant reliability of deteriorating structural systems for which inspection and monitoring data are available. The problem is formulated in a nested way in which the prediction of the structural condition is separated from the computation of the structural reliability conditional on the structural condition. Information on the structural condition provided by inspections and monitoring is included in the reliability assessment through Bayesian updating of the system deterioration model employed to predict the structural condition. The updated system reliability is obtained by coupling the updated deterioration model with a probabilistic structural model utilized to calculate the failure probability conditional on the structural condition. This approach is the first main outcome of this thesis and termed nested reliability analysis (NRA) approach. It is demonstrated in two numerical examples considering inspected and monitored steel structures subject to high-cycle fatigue.
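The nested structure can be summarized compactly; writing $D_t$ for the deterioration state at time $t$ and $Z$ for the inspection and monitoring data (symbols chosen here for illustration), the updated failure probability takes the form

\[
\Pr(F_t \mid Z) = \int \Pr(F_t \mid d)\, f_{D_t \mid Z}(d)\, \mathrm{d}d,
\]

where $f_{D_t \mid Z}$ is the Bayesian-updated deterioration model and $\Pr(F_t \mid d)$ is supplied by the probabilistic structural model conditional on the structural condition.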
An alternative – recently developed – approach, which also follows the strategy of discretizing time, describes deteriorating structural systems with hierarchical dynamic Bayesian networks (DBNs). DBNs combined with approximate or exact inference algorithms also enable the computation of the time-variant reliability of deteriorating structural systems conditional on information provided by inspection and monitoring data. In this thesis – as a proof of concept – a software prototype is developed based on the DBN approach, which can be used to assess the reliability of a corroding concrete box girder for which half-cell potential measurements are available. This is the second main outcome of this thesis.
Both approaches presented in this thesis enable an integral reliability analysis of inspected and monitored structures that accounts for system effects arising from (a) the correlation among deterioration states of different structural elements, (b) the interaction between element deterioration and system failure, and (c) the indirect information gained on the condition of all unobserved structural elements from inspecting or monitoring the condition of some structural elements. Thus, both approaches enable a systemwide risk-based optimization of operation and maintenance strategies for deteriorating structural systems.
The NRA approach can be implemented relatively easily with subset simulation, which is a sequential Monte Carlo method suitable for estimating rare event probabilities. Subset simulation is robust and considerably more efficient than crude Monte Carlo simulation. It is, however, still sampling-based and its efficiency is thus a function of the number of inspection and monitoring outcomes, as well as the value of the simulated event probabilities. The current implementation of the NRA approach performs separate subset simulation runs to estimate the reliability at different points in time. The efficiency of the NRA approach with subset simulation can be significantly improved by exploiting the fact that failure events in different years are nested. The lifetime reliability of deteriorating structural systems can thus be computed in reverse chronological order in a single subset simulation run.
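The nesting argument can be made explicit; with $F(t)$ denoting the event of failure within the first $t$ years (notation assumed here), failure by an earlier year implies failure by any later year, so

\[
F(1) \subseteq F(2) \subseteq \dots \subseteq F(T), \qquad
\Pr\bigl(F(t)\bigr) = \Pr\bigl(F(T)\bigr) \prod_{k=t}^{T-1} \Pr\bigl(F(k) \mid F(k+1)\bigr),
\]

which is exactly the product form that subset simulation exploits: one run that conditions backwards from year $T$ yields the whole sequence of lifetime failure probabilities.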
The implementation of the DBN approach is much more demanding than the implementation of the NRA approach but it has two main advantages. Firstly, the graphical format of the DBN facilitates the presentation of the model and the underlying assumptions to stakeholders who are not experts in reliability analysis. Secondly, it can be combined with exact inference algorithms. In this case, its efficiency neither depends on the number of inspection and monitoring outcomes, nor on the value of the event probabilities to be calculated. However, in contrast to the NRA approach with subset simulation, the DBN approach with exact inference imposes restrictions on the number of random variables and the dependence structure that can be implemented in the model.
One of the most important goals in civil engineering is to guarantee the safety of a structure. Standards prescribe required failure probabilities on the order of 10⁻⁴ to 10⁻⁶. In general, it is not possible to compute the failure probability analytically.
Therefore, many approximation methods have been developed to estimate the failure probability. Nevertheless, these methods still require a large number of evaluations of the investigated structure, usually finite element (FE) simulations, making full probabilistic design studies infeasible for relevant applications. The aim of this paper is to increase the efficiency of structural reliability analysis by means of reduced-order models based on the proper generalized decomposition (PGD). The developed method paves the way for using full probabilistic approaches in industrial applications. In the proposed PGD reliability analysis, the solution of the structural computation is obtained directly by evaluating the PGD solution for a specific parameter set, without computing a full FE simulation. Additionally, an adaptive importance sampling scheme is used to minimize the total number of required samples. The accuracy of the failure probability depends on the accuracy of the PGD model (mainly influenced by the mesh discretization and mode truncation) as well as on the number of samples in the sampling algorithm. Therefore, a general iterative PGD reliability procedure is developed to automatically verify the accuracy of the computed failure probability. It is based on a goal-oriented refinement of the PGD model around the adaptively approximated design point. The methodology is applied and evaluated for 1D and 2D examples. The computational savings compared to the method based on an FE model are shown, and the influence of the accuracy of the PGD model on the failure probability is studied.
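As a sketch of the estimator behind this combination, assuming $g$ is the limit-state function evaluated via the PGD surrogate, $f_{\mathbf{X}}$ the joint input density, and $h$ the adaptively constructed importance-sampling density (notation illustrative):

\[
P_f = \int \mathbb{I}\bigl[g(\mathbf{x}) \le 0\bigr]\, f_{\mathbf{X}}(\mathbf{x})\, \mathrm{d}\mathbf{x}
\approx \frac{1}{N} \sum_{i=1}^{N} \mathbb{I}\bigl[g(\mathbf{x}_i) \le 0\bigr]\, \frac{f_{\mathbf{X}}(\mathbf{x}_i)}{h(\mathbf{x}_i)},
\qquad \mathbf{x}_i \sim h,
\]

where each evaluation of $g$ reduces to a cheap lookup in the precomputed PGD solution instead of a full FE solve.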