Chemistry and Process Engineering
The ongoing geopolitical conflicts and the growing need to improve the sustainability of the energy system are increasing the importance of tanks for storing cryogenic fluids in the energy industry. The most common example of cryogenic tank applications is the transport of natural gas and hydrogen in their liquid form (LNG and LH2, respectively), for which, at the same transport volume, cryogenic storage ensures significantly higher transport capacities than storage based solely on overpressure.
A common feature of all cryogenically transported fluids is that their condition must be maintained by minimizing heat leaks from the environment as much as possible. This is achieved by implementing thermal super insulation (TSI) systems based on, e.g., rock wool, perlite, microspheres, multilayer insulation (MLI) and vacuum, which have proven effective in service. However, due to the relatively short period of use in some applications, the small number of documented incidents, and the still limited research carried out in the field, the use of such systems in the cryogenic fluid transport sector still suffers from insufficient knowledge about the course and consequences of incidents. Accidents involving collisions, fires, and their combination are quite common in the transport sector and may generate extraordinary loads on the tank and its insulation system, eventually leading to tank failure.
The present study focuses on the behavior of TSI systems in tanks exposed to an external heat source representative of a hydrocarbon fire scenario. Such a fire may increase the heat flux into a tank by several orders of magnitude compared with normal design conditions, thus inducing severe degradation of the TSI, causing a rapid release of flammable gas and possibly even resulting in a Boiling Liquid Expanding Vapour Explosion (BLEVE).
To study such scenarios, a test rig was developed at BAM that allows TSI to be tested under industrial conditions and enables subsequent analysis of the TSI samples. The test rig reproduces the typical double-walled design of tanks for cryogenic fluids, with vacuum and an additional insulating material in the interspace. Adjustable electrical heating elements simulate the fire on one side of the double wall, allowing repeatable heat loads of up to 100 kW/m² to be applied. The other side of the double wall is represented by a fluid-supported heat exchanger, which allows cold or cryogenic conditions to be simulated in the test rig and the heat flux transmitted through the double wall to be determined. Thus, the test rig allows thermal loading and performance analysis of TSI samples at the same time.
In the presentation, the results of the various tested TSI systems will be presented and discussed. As a result of this study, the list of advantages and disadvantages guiding the choice of TSI expands. In the tests, all samples degraded as a consequence of the hydrocarbon-fire-oriented thermal load. Strong differences in the behavior of the tested TSI systems over temperature, location, and time were observed. Additionally, the tested MLI systems were significantly more resistant than their base materials. These results are relevant for the design, the definition of national and international regulations, the risk assessment, and the development of safety concepts for cryogenic tanks.
Safety investigations of high-pressure hydrogen jet flames at real scale
(2024)
Hydrogen as an energy carrier is gaining importance. The investigation of the consequences of incidents involving hydrogen is therefore coming more strongly into focus. Since hydrogen is usually stored and transported under pressure, one scenario to be considered is the release from a leak with subsequent ignition. The resulting jet flame must be characterized with respect to the thermal radiation emitted into the surroundings. Various models already exist in the literature, but these are largely based on data from low-momentum hydrocarbon flames. To verify these models, safety investigations of momentum-driven hydrogen jet flames are being carried out within the BAM-internal H2 Jet Flame project. For this purpose, real-scale tests take place at BAM's Test Site Technical Safety (BAM-TTS). The subject of the investigations is the assessment of the consequences of realistic release scenarios with regard to flame geometry and the thermal radiation released. Parameters such as release angle, leak diameter (currently 1 mm to 30 mm), pressure (currently up to max. 250 bar) and mass flow (up to max. 0.5 kg/s) are varied. In addition, influences such as the type of ignition, the ignition location and delayed ignition can also be investigated. The findings obtained are compared with the results of existing models, which are further developed where necessary. The focus is placed in particular on modelling the thermal radiation released by hydrogen flames. A particular challenge is the IR measurement of the flames and the modelling of their visible geometry. The flame geometry is visualized using several infrared camera systems (from at least two viewing angles).
Measurements reported in the literature to date are mostly based on unsteady outflow conditions. The test setup used here enables a steady outflow for several minutes and thus direct comparability with the existing (steady-state) models.
Furthermore, the test rig can be converted for comparative measurements with hydrocarbons (methane, etc.) as well as with mixtures of hydrogen and hydrocarbons.
In order to reduce the human footprint of CO2 emissions and limit the effects of global warming, hydrogen combustion is becoming increasingly important. To enable fuel cells and gas turbines to operate on this carbon-free fuel, unprecedentedly large amounts of hydrogen need to be produced and safely transported and stored. The investigation of the effects of accidents involving hydrogen is therefore becoming of utmost importance. Since hydrogen is usually stored and transported under pressure, one scenario to be considered is the release of hydrogen from a leak with subsequent ignition. The resulting jet flame must be characterized with respect to the thermal radiation emitted into the environment in order to define safety regulations. Various models that characterize the resulting flame shape and radiation already exist in the literature, but these are mainly based on empirical data from hydrocarbon jet flames [1-4]. To verify these models, the H2 Jet Flame project conducted at BAM is investigating the safety of momentum-driven hydrogen jet flames. For this purpose, large-scale tests are carried out at the Test Site Technical Safety (BAM-TTS). The objective of the investigations is to assess the effects of real-scale release scenarios regarding flame geometry and the thermal radiation emitted. Parameters such as release angle, leak diameter (currently 1 mm to 10 mm), pressure (currently up to max. 250 bar) and mass flow (up to max. 0.5 kg/s) are varied. In addition, influences such as the type of ignition, the ignition location and delayed ignition can also be investigated. The knowledge gained will be compared with existing jet flame models in order to validate them and identify possible needs for further development. In particular, the focus will be on the thermal radiation of hydrogen flames. The challenge here is the visualization and characterization of the flame geometry in an open environment. Visualization is performed using infrared (IR) camera systems from at least two viewing angles. Measurements of the heat radiation of jet flames found in the literature are mostly based on unsteady outflow conditions. The experimental setup used here allows the generation of a steady-state outflow for several minutes and thus direct comparability with existing (steady-state) models. Furthermore, the tests can be extended to comparative measurements with hydrocarbons (methane, etc.) as well as mixtures of hydrogen and hydrocarbons.
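For orientation, a widely used class of far-field models against which such radiation measurements can be compared is the single point source model (given here as a generic textbook relation; it is not necessarily identical to the specific models [1-4] referenced above):

q(r) = \frac{\tau_{atm}\, X_r\, \dot{m}\, \Delta h_c}{4 \pi r^2}

where q is the received radiant heat flux at distance r from the flame, \tau_{atm} the atmospheric transmissivity, X_r the radiant fraction, \dot{m} the fuel mass flow rate and \Delta h_c the heat of combustion. Model validation then amounts to checking whether the measured flux values and flame dimensions for hydrogen are consistent with such relations.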
Therapeutic monoclonal antibodies are the fastest-growing group of biological agents; they generated a yearly turnover of USD 210 billion in 2022, and their sales are expected to grow by 10% annually over the next 10 years. With steadily increasing market importance, analytical methods for the reliable quantification of therapeutic antibodies are also becoming more and more relevant. Liquid chromatography coupled with tandem mass spectrometry (LC–MS/MS) has become the main technology for antibody quantification. This approach, however, requires enzymatic digestion of the intact protein into peptides, for which a wide range of different protocols exists that often lead to different results depending on the digestion procedure or trypsin variant used. In particular, the amount and type of detergents added for protein unfolding prior to digestion are known to create significant bias in measurement results. The overall goal of the presented project is the application of novel thermostable and surface-functionalized trypsin particles for improved antibody digestion. Specifically, a trypsin variant described in the literature exhibiting increased activity and thermal stability above 80 °C will be examined. The application of this enzyme should allow digestion to be performed at elevated temperatures, where the protein naturally unfolds, thereby increasing enzyme accessibility without the need for detergents. Furthermore, we will immobilize the thermostable trypsin onto the particle surface to further enhance enzyme stability, prevent self-digestion, and enable separation of trypsin from the target peptides before LC–MS/MS analysis. As an immobilization platform, cheap and non-porous corundum particles will be used, as these show high chemical stability and low levels of interaction of matrix proteins with the functionalized surface. In a multidisciplinary collaboration with the SALSA Photonics Lab, we will investigate the characteristics of covalent enzyme binding and unspecific peptide binding using an interface-sensitive analytical tool, vibrational sum-frequency generation (VSFG) spectroscopy. The insights gained will not only lead to new competencies in peptide and enzyme surface analysis using VSFG spectroscopy in SALSA but will also significantly contribute to optimizing antibody quantification.
The statement of uncertainties for certified values of reference materials is of crucial importance. The correct incorporation of these uncertainties into the calculation of procedural measurement uncertainties is essential for ensuring the accuracy and reliability of measurements. This presentation outlines the various factors influencing the uncertainty of certified values according to ISO Guide 35. In particular, characterization, homogeneity and stability are considered as decisive factors in determining the uncertainty of a reference material. Finally, the concept is illustrated using a concrete example to demonstrate its practical application and its impact on the calculation of procedural measurement uncertainties.
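For orientation, the uncertainty contributions named above are commonly combined according to the model of ISO Guide 35 (shown here in its generic form; the notation in the talk may differ):

u_{CRM} = \sqrt{u_{char}^2 + u_{bb}^2 + u_{lts}^2 + u_{sts}^2}

where u_{char} is the uncertainty from characterization, u_{bb} the contribution from between-unit (in)homogeneity, and u_{lts} and u_{sts} the contributions from long-term and short-term stability, respectively. This combined uncertainty of the certified value is then propagated into the procedural measurement uncertainty of the user's method.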
Aerosol emissions of selected pyrotechnics - hazard assessment in case of improper use
(2024)
The presentation describes the results of measurements of aerosol release during the burning of pyrotechnic articles. Against the background of improper use (for example in football stadiums), attention is drawn to the specific hazards. The released particles have sizes well below 100 nm and are therefore particularly able to reach the alveoli.
The MBLabs consortium comprises various organizations that operate testing facilities encompassing a broad spectrum of tests relevant to the construction sector, particularly building envelopes. In the future, additional testing facilities will join the METABUILDING platform to offer their services. These services will be integrated in the MBLabs Open Innovation Test Bed and accessible via the METABUILDING platform. The METABUILDING platform is operated by the METABUILDING association.
In Task 8.5, the quality assurance system of the MBLabs OITB is being developed. The presentation gives an overview of the development of this system after three years of project execution.
In a series of experiments, the possible consequences of releasing liquefied hydrogen (LH2) on and below the water surface were investigated. The experiments aimed to simulate an unintended release of LH2 (e.g. due to hose rupture), in particular during the bunkering of a ship. For liquefied natural gas (LNG), so-called RPTs (rapid phase transitions) have been demonstrated, in which spontaneous evaporation generates relevant pressure waves. It cannot be ruled out that RPTs are also possible in the case of LH2. The tests were carried out on the hydrogen safety test area of the Test Site Technical Safety of the Bundesanstalt für Materialforschung und -prüfung (BAM-TTS) in Horstwalde, within a research cooperation between BAM and Gexcon as part of the SH2IFT programme. The LH2 releases took place directly from an LH2 tank truck via a long, flexible, vacuum-insulated transfer line. While the releases above and below the water surface were each oriented vertically, an additional horizontal outflow parallel to the water surface was realized for the underwater release. A weighing system under the tank truck was used to determine the mass flow. Special pressure sensors were used to measure the shock waves generated by the release processes both in the water and in the air. Gas concentrations above the water basin were measured at various positions. High-speed, infrared (IR) and standard cameras were used to record the phenomenology of the release and to track the behaviour of the gas cloud over time. In addition to the permanently installed systems on land, underwater cameras and a drone with standard and IR cameras were also used. Two weather stations measured wind speed, wind direction, temperature and humidity during all tests. Bolometers were also used to measure thermal radiation. Although the releases led to a highly turbulent LH2/water mixing zone, no significant overpressures due to RPT occurred. In contrast, an unexpected but reproducible ignition of the gas cloud in the open air was observed at some distance from the instruments and the release location. The resulting gas cloud explosions produced relevant overpressures and thermal radiation into the surroundings.
In a series of real-scale tests, three liquid hydrogen (LH2) tanks were exposed to an engulfing fire. One aim was to clarify whether, analogous to pressure-liquefied gases, a BLEVE (Boiling Liquid Expanding Vapour Explosion) can occur. The experiments were carried out on the hydrogen safety test area of the Test Site Technical Safety of the Bundesanstalt für Materialforschung und -prüfung (BAM-TTS), within a research cooperation between BAM and Gexcon as part of the SH2IFT programme. The tanks were double-walled, vacuum-insulated vessels with a volume of 1 m³. The cylindrical tanks differed in their orientation (horizontal or vertical) and in the insulation material used (perlite or multilayer insulation, MLI). The filling level of the tanks was about 35-40 % in each test. The fire load was generated homogeneously by a propane-fired burner system. The conditions inside the vessel (temperatures and pressure) were measured, as well as the external boundary conditions and the consequences of failure (thermal radiation, pressure waves, fireball development and fragmentation). Bolometers measured the thermal radiation generated both by the propane fire and by a possible fireball/BLEVE. So-called pencil probes were used to measure the pressure waves generated by vessel burst/BLEVE. In addition, several cameras were used to monitor the experiments: standard cameras, infrared (IR) cameras and high-speed cameras, both ground-based and on a drone. Two of the tanks investigated, one horizontal and the vertical one, both insulated with perlite, withstood the fire load without vessel failure. The MLI-insulated horizontal vessel burst after 1 hour and 6 minutes, producing a fireball, flying debris and a pressure wave. In addition to describing the consequences of a critical failure of an LH2 tank, further objectives of the work are the identification of critical vessel states and the compilation of a comprehensive dataset of the debris of the burst tank. The largest critical distances resulted from fragment throw. 3D scans are to be provided for use in CAD and FEM applications. Comprehensive data on fragment throw distances, masses and positions have already been published. This is intended to create the basis for checking whether existing throw-distance models, which so far have been designed only for single-walled vessels, are suitable for multi-walled vessels, or for adapting them on the basis of the real data. In addition, measurement and disassembly of the tanks that did not burst has begun, in order, among other things, to examine the perlite fill for impairment caused by transport and the fire exposure.
This presentation highlights ongoing scientific misconduct as found in the academic literature, including data and image manipulation and paper mills. Starting with an exposé of examples, it delves deeper into the causes and metrics driving this phenomenon. Finally, a range of possible tools is presented that young researchers can use to keep themselves from sliding into dark scientific practices.
By automatically recording as much information as possible in automated laboratory setups, reproducibility and traceability of experiments are vastly improved. This presentation shows what such an approach means for the quality of experiments in an X-ray scattering laboratory and an automated synthesis set-up.
Metabolomics
(2024)
Overview lecture on the types of corrosion that can occur in stainless steels under various service conditions. The fundamentals of passivity and of the local breakdown of passivity are presented in the context of the complex interacting factors of the corrosion system. The theoretical fundamentals developed are deepened using practical examples for all relevant types of corrosion.
In industrialised countries, more than 80% of people's time is spent indoors. Products such as building materials and furniture emit volatile organic compounds (VOCs), which are therefore ubiquitous in indoor air. VOCs in combination may, under certain environmental and occupational conditions, result in reported sensory irritation and health complaints. Emission concentrations can become further elevated in new or refurbished buildings, where the rate of air exchange with fresh ambient air may be limited by energy-saving measures. A healthy indoor environment can be achieved by controlling the sources and by eliminating or limiting the release of harmful substances into the air. One way is to use (building) materials proven to be low-emitting. Meanwhile, a worldwide network of professional commercial and non-commercial laboratories performing emission tests for the evaluation of products for interior use has been established. Therefore, the comparability of test results must be ensured. A laboratory's proficiency can be proven by internal and external validation measures, both of which include the application of suitable emission reference materials (ERM). For the emission test chamber procedure according to EN 16516, no artificial ERM is commercially available. The EU-funded EMPIR project MetrIAQ aims to fill this gap by developing new and improved ERMs. The goal is to obtain a material with a reproducible and temporally constant compound release (less than 10 % variability over 14 days). Two approaches were tested: the impregnation of porous materials with VOCs, and the encapsulation of VOCs in polymer microcapsules. Impregnation is performed with the help of an autoclave and supercritical CO2. The encapsulation is done by interfacial polymerisation on VOC droplets. For both approaches, synthesis and/or material parameters were varied to obtain an optimal ERM. Findings on the optimisation of ERM generation, as well as the performance of the best emission reference materials, will be presented.
The principles of (hard) X-ray photoelectron spectroscopy and some applications in the field of (core-shell) nanoparticles will be presented. The presentation will address how to obtain reliable results. Furthermore, examples of the correlation between physicochemical measurements and toxicological results are given, which are crucial for the risk assessment of nanoparticles.
Climate change and the increasing demand for electricity require the use of power electronics based on new wide-bandgap (WBG) compound semiconductors. Power electronic devices are used in numerous application areas to control and convert electric energy, including the generation and distribution of renewable energy for green hydrogen, the electrification of transport, and 5G communication. WBG electronics have much higher efficiency than silicon-based devices and can operate at higher power densities, voltages, temperatures and switching frequencies with low energy losses. However, defects in the semiconductors can considerably affect the performance of power electronic devices or even make their operation impossible. The presentation will show the application of spectroscopic and imaging ellipsometry as well as white light interference microscopy for defect characterisation in SiC, GaN and Ga2O3 over a wide wavelength range.
We used parameterized modelling of ellipsometric transfer parameters to determine the dielectric properties of bulk materials and thin layers. Imaging ellipsometry offers more information and is an advanced variant of optical microscopy, combining the lateral resolution of optical microscopy with the extreme sensitivity to surface and interface effects of ellipsometry. Surface topography and morphology of different types of defects were additionally investigated with imaging white light interference microscopy. Modern electronic thin film components require complex surface analysis methodologies and hybrid metrology. Hybrid measurement techniques enable fast and non-destructive traceable characterisation of thin film compound semiconductors as well as accurate detection and identification of defects. This methodical approach leads to a better understanding of the materials themselves and of the defect formation mechanisms during manufacturing.
This work aims to enable highly reproducible manufacturing of compound semiconductor power electronics as well as operation monitoring to ensure failure-safety of electronic systems in power electronic devices.
An overview of personal experience with laser-induced plasma (LIP) will be given. The combination of LIP with laser-induced fluorescence, atomic absorption, Raman spectroscopy and spatial heterodyne spectroscopy for elemental and isotopic analysis will be discussed. Unusual applications of LIP will be covered, such as LIP-based lasers and LIP-based chemical reactors.
This course will provide an introduction to plasma diagnostic techniques. The major focus of the course will be on the discussion of practical procedures as well as the underlying physical principles for the measurement of fundamental plasma characteristics (e.g., temperatures, thermodynamic properties, and electron number density). Particular emphasis will be placed on inductively coupled plasma–atomic emission spectrometry, but other analytical plasmas will also be used as examples where appropriate. Selected examples of how the operating conditions of the plasma source can be manipulated, based on the results of plasma diagnostic measurements, to improve its performance for spectrochemical analysis will also be covered. Topics to be covered include thermal equilibrium, line profiles, temperatures, electron densities, excitation processes, micro reactions, pump-and-probe diagnostics, tomography, and temporal and spatial resolution. The basics of plasma computer modeling will also be presented.
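As an example of one of the standard diagnostic relations typically covered (a general textbook result, not necessarily in the exact form used in this course), the excitation temperature follows from a Boltzmann plot of several emission lines of the same species:

\ln\!\left(\frac{I_{ki}\,\lambda_{ki}}{g_k\,A_{ki}}\right) = -\frac{E_k}{k_B\,T_{exc}} + C

where I_{ki} is the integrated line intensity, \lambda_{ki} the wavelength, g_k and E_k the statistical weight and energy of the upper level, A_{ki} the transition probability and k_B the Boltzmann constant; a linear fit versus E_k yields T_{exc} from the slope.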
Laser processing of materials with ultrashort laser pulses can lead to secondary emission of hazardous X-rays. This effect has so far been observed during the processing of technical materials such as metals. X-ray emission during ablative processing of biological tissue is still largely unexplored. The presentation presents first investigations and results on the radiological hazard potential of the medical application of ultrashort-pulse lasers on humans.
Until the 1980s, radiography was used to inspect civil structures in cases of special demand and offered much better resolution than other NDT techniques. However, due to safety concerns and cost issues, this method is hardly used anymore. Meanwhile, non-destructive techniques such as ultrasound or radar have found regular, successful practical application but sometimes suffer from limited resolution and accuracy, imaging artefacts or restrictions in detecting certain features when applied to reinforced or prestressed concrete inspection.
Muon tomography has received much attention recently. Muons are particles generated naturally by cosmic rays in the upper atmosphere and pose no risk to humans. Novel detectors and tomographic imaging algorithms have opened new fields of application, mainly in the nuclear sector, but also in spectacular cases such as the Egyptian pyramids.
As a first step towards practical application in civil engineering and as a proof of concept, we used an existing system to image the interior of a 600 kg reinforced concrete reference block. Even with a setup not yet optimized for this kind of investigation, the muon imaging results were at least of similar quality to ultrasonic and radar imaging, and potentially even better. Recently, the research was expanded to more realistic testing problems such as the detection of voids in certain structural elements. However, before practical implementation, more robust, mobile and affordable detectors would be required, as well as user-friendly imaging and simulation software.
Ellipsometry is a very powerful tool for accurate material investigation over a wide wavelength range. It is a non-destructive and fast method. Imaging ellipsometry, as a combination of optical microscopy and ellipsometry, enables spatially resolved measurements when determining the layer thickness and dielectric properties of thin layers. It is known for its high polarisation sensitivity and high contrast for surface structures. In this contribution, we show the application of imaging ellipsometry for the detection of defects in energy materials and for the quality validation of possible reference materials for nano-electronics.
Defects in wide-bandgap semiconductors, in homoepitaxial SiC and heteroepitaxial GaN layers on transparent SiC substrates, can be successfully detected and classified by means of imaging ellipsometry. Correlating imaging ellipsometry results with results from complementary techniques such as white light interference microscopy and atomic force microscopy contributes to the understanding of surface topography and defect formation mechanisms. We discuss the potential of different methods for analysing ellipsometric map data for monitoring defect densities.
Electrical properties of materials at the nanoscale can be investigated by means of scanning probe microscopy methods such as scanning microwave microscopy and conductive atomic force microscopy. However, the development of new, robust and easy-to-use calibration methods and calibration standards is essential to increase the traceability of these methods and to allow their broad application in industry. We show how imaging spectroscopic ellipsometry can be used for the development and monitoring of the processing quality of patterned reference samples based on indium tin oxide (ITO) layers with different thicknesses and conductivities.
The built infrastructure ages and requires regular inspection and, when in doubt, monitoring. To ensure that older concrete bridges showing signs of deterioration can be used safely, several innovative monitoring tools have been introduced, including but not limited to optical, fiber-optic, or acoustic emission techniques. However, there are gaps in the portfolio. A sensing technique that covers a wide range of damage scenarios and larger volumes, while still being sensitive and specific, would be beneficial.
For about 15 years, research has been conducted on ultrasonic monitoring of concrete structures that goes beyond the traditional ultrasonic pulse velocity test (PV test), mostly using a very sensitive data evaluation technique called coda wave interferometry. At BAM we have developed sensors and instrumentation specifically for this method.
We have instrumented a 70-year-old, severely damaged prestressed concrete bridge in Germany, in addition to a commercial monitoring system. We have now collected data for almost three years. We can show that we can provide information about the stress distribution in the bridge. We have also been able to confirm that there has been no significant additional damage to the bridge since the installation.
X-rays without X-rays: Can muon tomography provide pictures from within concrete and other objects?
(2024)
Until the 1980s, radiography was used to inspect civil structures in cases of special demand and offered much better resolution than other NDT techniques. However, due to safety concerns and cost issues, this method is hardly used anymore. Meanwhile, non-destructive techniques such as ultrasound or radar have found regular, successful practical application but sometimes suffer from limited resolution and accuracy, imaging artefacts or restrictions in detecting certain features when applied to reinforced or prestressed concrete inspection.
Muon tomography has received much attention recently. Muons are particles generated naturally by cosmic rays in the upper atmosphere and pose no risk to humans. Novel detectors and tomographic imaging algorithms have opened new fields of application, mainly in the nuclear sector, but also in spectacular cases such as the Egyptian pyramids.
As a first step towards practical application in civil engineering and as a proof of concept, we used an existing system to image the interior of a 600 kg reinforced concrete reference block. Even with a setup not yet optimized for this kind of investigation, the muon imaging results were at least of similar quality to ultrasonic and radar imaging, and potentially even better. Recently, the research was expanded to more realistic testing problems such as the detection of voids in certain structural elements. However, before practical implementation, more robust, mobile and affordable detectors would be required, as well as user-friendly imaging and simulation software.
The talk also discusses other applications, such as volcanology, mining and geothermal exploration.
Despite its mature scientific and technological foundations, thermography is still a relatively young member of the family of non-destructive testing methods. Thanks to a number of advantages, it is attracting a growing community of users. For its further dissemination, particularly in an industrial context, norms, standards and technical rules play an important role. This contribution presents the current state of standardization in Germany. We show which standards and technical rules exist for thermography in Germany and internationally, and we venture a look into the future. Moreover, standardization work also thrives on the participation of interested parties. These can be industrial and academic users, equipment manufacturers, research institutions or service providers. You are welcome to bring along your needs regarding standardization projects and/or send them directly to the authors.
Laser powder bed fusion is one of the most promising additive manufacturing techniques for printing complex-shaped metal components. However, the formation of subsurface porosity poses a significant risk to the service lifetime of the printed parts. In-situ monitoring offers the possibility to detect porosity already during manufacturing, thereby enabling process feedback control or a manual process interruption to cut financial losses.
Short-wave infrared thermography can monitor the thermal history of manufactured parts, which is closely connected to the probability of porosity formation. Artificial intelligence methods are increasingly used to predict porosity from the large amounts of complex monitoring data obtained. In this study, we aim to identify the potential and the challenges of deep-learning-assisted porosity prediction based on thermographic in-situ monitoring.
To this end, the porosity prediction task is studied in detail using an exemplary dataset from the manufacturing of two Haynes 282 cuboid components. Our trained 1D convolutional neural network model shows high performance (R² score of 0.90) for the prediction of local porosity in discrete sub-volumes with dimensions of (700 x 700 x 40) μm³.
We could demonstrate that the regressor correctly predicts layer-wise porosity changes but presumably has limited capability to predict differences in local porosity. Furthermore, the significance of the thermogram features used as inputs needs to be studied in order to streamline the model and to adjust the monitoring hardware. Moreover, we identified multiple sources of data uncertainty, arising from the in-situ monitoring setup, the registration with the ground-truth X-ray computed tomography data and the pre-processing workflow used, that might detrimentally influence the model's performance.
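To make the modelling task concrete, the following is a minimal sketch of a 1D-CNN regressor of the kind described above; the layer sizes, the number of thermogram features and the sequence length are illustrative assumptions, not the architecture actually used in the study.

```python
# Minimal sketch (PyTorch): 1D CNN mapping a sequence of thermogram features
# to a scalar porosity value per sub-volume. Illustrative assumptions only.
import torch
import torch.nn as nn

class PorosityRegressor1D(nn.Module):
    def __init__(self, n_features: int = 8, seq_len: int = 64):
        super().__init__()
        # 1D convolutions over the per-layer sequence of thermogram features
        self.encoder = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),        # global pooling over the sequence
        )
        self.head = nn.Linear(64, 1)         # scalar porosity per sub-volume

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features, seq_len)
        z = self.encoder(x).squeeze(-1)
        return self.head(z).squeeze(-1)      # predicted porosity fraction

model = PorosityRegressor1D()
dummy = torch.randn(4, 8, 64)                # 4 sub-volumes of synthetic features
print(model(dummy).shape)                    # -> torch.Size([4])
```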
The European Commission has identified Advanced Manufacturing and Advanced Materials as two of six Key Enabling Technologies (KETs), and metrology is considered a key enabler for the advancement of these KETs. Consequently, EURAMET, the association of metrology institutes in Europe, has strengthened the role of metrology for these KETs by enabling the creation of a European Metrology Network (EMN) for Advanced Manufacturing. The EMN comprises National Metrology Institutes (NMIs) and Designated Institutes (DIs) from across Europe and was formally established in October 2021. The aim of the EMN is to provide high-level coordination of European metrology activities for the Advanced Manufacturing community.
The EMN itself is organized in three sections representing the major stages of the manufacturing chain: 1) Advanced Materials, 2) Smart Manufacturing Systems, and 3) Manufactured Components & Products. The EMN for Advanced Manufacturing is engaging with stakeholders in the field of Advanced Manufacturing (large companies & SMEs, industry organisations, existing networks, and academia), as well as the wider Metrology community, including Technical Committees, to provide input for the Strategic Research Agenda (SRA) on Metrology for Advanced Manufacturing.
This contribution will give an overview of the first version of the SRA prepared by the EMN for Advanced Manufacturing.
Sr isotope ratio analysis
(2023)
Pb isotope ratio analysis
(2023)
Detector deadtime
(2023)
Isotope Ratio Analysis
(2023)
In recent years, chromium(III) complexes have received a lot of attention as novel near-infrared (NIR) emitters, triggered by the report on the first molecular ruby, Cr(ddpd)2(BF4)3, with a high photoluminescence quantum yield of 13.7% for its NIR emission band and a long luminescence lifetime of 1.122 ms at room temperature.[1] However, in an oxygen-containing environment, the photoluminescence quantum yields and luminescence lifetimes of these chromium(III) complexes show only very small values. This hampers their application as NIR luminescence labels. This application, which cannot be addressed by conventional deoxygenating approaches, requires suitable strategies to protect the luminescence of the chromium(III) complexes from oxygen quenching. An elegant approach explored by us to reduce the undesired luminescence quenching by triplet oxygen is the incorporation of these chromium(III) complexes into different types of amorphous, non-porous silica nanoparticles, which can be readily surface-functionalized, e.g., with targeting ligands and/or other sensor molecules. In this work, as a first proof-of-concept experiment, a set of chromium(III) complexes consisting of different ligands and counter anions was embedded into the core of silica nanoparticles. Subsequently, the optical properties of the resulting luminescent silica nanoparticles were assessed by steady-state and time-resolved luminescence spectroscopy. First results of time-resolved luminescence measurements confirm our design concept of nanoscale NIR-emissive Cr(III) complex-based reporters.
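For context, such time-resolved data are usually evaluated with a (multi)exponential decay model, and the photoluminescence quantum yield is defined by the ratio of emitted to absorbed photons (generic relations, not specific to the fitting procedure used here):

I(t) = I_0\, e^{-t/\tau}, \qquad \Phi_{PL} = \frac{N_{em}}{N_{abs}} = \frac{k_r}{k_r + k_{nr}}

where \tau is the luminescence lifetime and k_r and k_{nr} are the radiative and non-radiative decay rate constants; oxygen quenching increases k_{nr} and thus reduces both \tau and \Phi_{PL}.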
Engineered nanomaterials (NM), with their large surface-to-volume ratios and the size-dependent functional properties observed for some materials, are of increasing relevance for current and future developments in various fields such as the medical and pharmaceutical industry, computing and electronics, or food and consumer products. The performance and safety of NM are determined by the sum of their intrinsic physicochemical properties. In particular, the particle surface chemistry, which is largely controlled by the chemical nature and density of functional groups (FG) and ligands, is a key driver of NM performance, stability and processability, as well as of the interaction of NM with the environment. Thus, methods for FG quantification can foster the sustainable development of functional and safe(r) NM.
Aiming at the development of simple, versatile, and multimodal tools for the quantification of many bioanalytically relevant FG and ligands, we investigated and compared various analytical methods commonly used for FG quantification. This includes electrochemical titration methods, dye-based optical assays, and other instrumental analytical techniques such as nuclear magnetic resonance and thermal analysis methods.
The potential of our multimodal approach for FG quantification was demonstrated for commercial and custom-made polymeric and silica particles with varying FG, used as optical pH sensors. In the future, our strategy can contribute to establishing multi-method characterization strategies that provide a more detailed picture of the structure-property relationship.
In recent years, the use of functionalized micro- and nanomaterials has increased rapidly for a wide range of applications in the life and material sciences, owing to their unique properties in combination with their high surface-to-volume ratio and stability. For instance, functionalized micro- and nanomaterials that are labeled or stained with a multitude of sensor dyes can be used for the monitoring and quantification of neutral and ionic analytes. These materials have several advantages compared to conventional molecular probes, such as enhanced brightness, ease of designing ratiometric systems by combining analyte-sensitive and inert reference dyes, and increased photostability. Moreover, stained nanoparticles can enable the use of hydrophobic dyes in aqueous environments.
Biocompatible silica and polymeric particles are versatile templates and carriers for the fabrication of nanosensors by staining and/or labeling with different fluorophores and sensor molecules, because they can be synthesized at large scale and low cost with different surface chemistries.
Here we present our work on multicolored sensors for the measurement of pH, oxygen and saccharides utilizing commercially available or in-house synthesized silica and polymeric particles.
Nowadays, amorphous silica nanoparticles (SiO2-NP) are among the most abundant engineered nanomaterials; they are highly stable and can easily be produced on a large scale at low cost. Surface-functionalized SiO2-NP are of great interest in the life and material sciences, as they can be used, e.g., as drug carriers, fluorescent sensors, and multimodal labels in bioanalytical assays and imaging applications. Their performance in such applications depends not only on particle size, size distribution, and morphology, but also on surface chemistry, i.e. the total number of surface functional groups (FG) and the number of FG accessible for subsequent functionalization with ligands or biomolecules, which in turn determines surface charge, colloidal stability, biocompatibility, and toxicity. Aiming at the development of simple, versatile, and multimodal tools for the quantification of many bioanalytically relevant FG and ligands, we investigated and compared various analytical methods commonly used for FG quantification. This includes electrochemical titration methods, dye-based optical assays, and other instrumental analytical techniques such as nuclear magnetic resonance and thermal analysis methods.
The potential of our multimodal approach for FG quantification was demonstrated for commercial and custom-made silica particles with varying FG, showing an influence of the synthesis method not only on the number of FG but also on their performance. In the future, our strategy can contribute to establishing multi-method characterization strategies that provide a more detailed picture of the structure-property relationship.
The focus of the Biophotonics division is on the design, preparation, analytical and spectroscopic characterization, and application of molecular and nanoscale functional materials, particularly materials with photoluminescence in the visible, near-infrared (NIR) and short-wave infrared (SWIR). This includes optical reporters for bioimaging and sensing, security and authentication barcodes, and materials for solid-state lighting, energy conversion, and photovoltaics. For the identification of optimum particle structures, quantitative spectroscopic studies are performed under application-relevant conditions, focusing on the key performance parameter photoluminescence quantum yield. In addition, simple, cost-efficient, and standardizable strategies for quantifying functional groups on the surface of nano- and microparticles are developed, here with a focus on optical assays and electrochemical titration methods, cross-validated by more advanced methods such as quantitative NMR. Furthermore, reference materials and reference products are developed for optical methods, particularly luminescence techniques, and for analytical methods used for the characterization of nanomaterials.
For fluorescence microscopy, there is an increasing need for suitable calibration tools and reference materials for microscope calibration, the determination of performance parameters, and the regular control and validation of instrument performance. This is addressed in the BMWK-funded project FluMikal (WIPANO programme), carried out by two research groups from academia and two companies and coordinated by BAM. Here we present different approaches to liquid and solid fluorescence standards for determining the wavelength-dependent spectral sensitivity of fluorescence microscopes, as well as ideas concerning the choice of suitable fluorescence lifetime standards for the increasingly used fluorescence lifetime imaging (FLIM).
We employ in-house generated synthetic XCT data of Al-Si matrix composites for training deep convolutional neural networks for XCT data conditioning and automatic segmentation. We propose an in-house multilevel deep conditioning framework capable of sequentially rectifying noise and blur in corrupted XCT data. Furthermore, for automatic segmentation, we utilize a dedicated in-house network coupled with a novel iterative segmentation algorithm capable of generalized learning from synthetic data. We report consistent SSIM efficiencies of 92%, 99%, and 95% for combined denoising/deblurring, standalone denoising, and standalone deblurring, respectively. The overall segmentation precision was over 85% according to the Dice coefficient. We used experimental XCT data from various scans of Al-Si matrix composites reinforced with ceramic particles and fibers.
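A minimal sketch of the two evaluation metrics mentioned above (SSIM for the conditioning steps, the Dice coefficient for segmentation); the synthetic arrays and the thresholding are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch: compute SSIM between a restored and a reference XCT slice,
# and the Dice overlap between two binary segmentation masks. Illustrative only.
import numpy as np
from skimage.metrics import structural_similarity as ssim

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice overlap between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum() + 1e-12)

# Synthetic stand-ins for a restored XCT slice and its ground truth
rng = np.random.default_rng(0)
truth_slice = rng.random((128, 128))
restored_slice = truth_slice + 0.02 * rng.standard_normal((128, 128))

quality = ssim(truth_slice, restored_slice,
               data_range=restored_slice.max() - restored_slice.min())
overlap = dice_coefficient(restored_slice > 0.5, truth_slice > 0.5)
print(f"SSIM: {quality:.3f}, Dice: {overlap:.3f}")
```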
Within the research project "Artificial Intelligence for Rail Inspection" (AIFRI), an AI algorithm is being developed to improve flaw detection in the evaluation of rail inspections. The process of mechanized rail inspection is analysed, and the rail flaws and artefacts are represented in a digital twin so that, in a further step, automatic flaw detection and classification can be trained with AI algorithms. For this purpose, ultrasonic datasets containing the indications of the various rail flaws and artefacts are generated with simulation software, based on the applicable regulations and on information from maintenance.
During evaluation, the rail flaws are assigned to flaw classes, prioritized for AI training and examined on the basis of the information selected by DB Netz AG. For this purpose, the rail flaws are broken down according to the features relevant for AI training, and the configuration of the simulation parameters is adjusted accordingly.
For the basic structure of the dataset, a rail model one metre in length is used in the simulation; on this basis, all probes to be used in rail inspection are considered for the respective reflector type. The simulated data are validated on a laboratory-scale test rail. Possible influencing parameters, such as the signal-to-noise ratio and the inspection speed, are incorporated into the datasets. A test dataset with locally varying influencing variables is compiled from the simulated data using the script-based programming environments Python and Matlab.
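As an illustration of this kind of script-based dataset assembly, the following minimal sketch degrades simulated signals to a target signal-to-noise ratio before stacking them into a test dataset; the variable names and the SNR definition are assumptions and not the AIFRI project code.

```python
# Minimal sketch: degrade simulated A-scans to a target SNR (in dB) and stack
# them into a test dataset. Purely illustrative; not the AIFRI project code.
import numpy as np

def add_noise_at_snr(signal: np.ndarray, snr_db: float, rng=None) -> np.ndarray:
    rng = rng or np.random.default_rng()
    signal_power = np.mean(signal ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    return signal + rng.normal(0.0, np.sqrt(noise_power), size=signal.shape)

rng = np.random.default_rng(42)
clean_ascans = [np.sin(2 * np.pi * 5e6 * np.arange(0, 2e-5, 1e-8)) for _ in range(3)]
test_set = np.stack([add_noise_at_snr(a, snr_db, rng)
                     for a, snr_db in zip(clean_ascans, [20, 10, 5])])
print(test_set.shape)   # (3, 2000)
```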
The AIFRI project is funded by the Federal Ministry for Digital and Transport within the mFUND innovation initiative under funding code 19FS2014.
Thermal Destruction of PFAS
(2023)
Thermal treatment processes are currently the only full-scale option for the destruction of per- and polyfluoroalkyl substances (PFAS) in large waste streams. While all organic molecules, including PFAS, are susceptible to thermal destruction, their decomposition rates are controlled by process variables such as temperature, reaction atmosphere, and residence time. Concerns exist about the formation of products of incomplete destruction and their emission from hazardous waste incinerators. This talk will summarize the current state-of-the-art of thermal PFAS destruction, identify research needs, and showcase future research designed to address critical knowledge gaps.
NMR spectroscopy is one of the central non-invasive analytical methods in organic chemistry and has become indispensable in everyday laboratory work. The direct proportionality between the number of nuclei in the measurement volume and the signal area in the spectrum is comparable to "counting the nuclear spins". The calibration effort for quantification is minimal, and information on structure and identity is accessible from the NMR spectrum.
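This proportionality can be written in its generic qNMR form (a standard textbook relation, independent of any specific instrument or project):

\frac{n_A}{n_B} = \frac{I_A / N_A}{I_B / N_B}

where n is the molar amount, I the integrated signal area and N the number of equivalent nuclei contributing to the respective signal; quantification thus requires no substance-specific calibration beyond a single reference of known amount.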
Online NMR spectroscopy using special flow probes and measurement cells has been researched and applied for decades. However, due to the demanding requirements for the installation and operation of classical high-field NMR spectrometers, it never managed to make the leap from the laboratory into the process. This has changed fundamentally with the developments in the field of mobile benchtop NMR spectrometers. Lower investment and operating costs, as well as the robustness and ease of use of these systems, are decisive factors. This brings an application as an online PAT method within technical reach. Dedicated process NMR spectrometers, however, are still hard to find on the market today.
Reaction monitoring at laboratory scale has been successfully demonstrated on various systems. However, limitations with regard to a possible process deployment became apparent, in particular the temperature sensitivity of the magnet systems. Using a prototype, an active thermal insulation based on temperature-controlled air flows was developed and tested.
Field integration of laboratory systems entails numerous technical and regulatory challenges. The harsh environment of production plants as well as the requirements for explosion protection call for an approved enclosure. Based on the experience gained with a first prototype, an analysis cabinet designed to be as flexible as possible was developed jointly. In addition to the hardware integration, automated model-based evaluation of the NMR spectra and integration into the process control system are essential for reliable operation as a PAT analyser.
The BAMline at the synchrotron X-ray source BESSY II (Berlin, Germany) supports researchers, especially in materials science. As a non-destructive characterization method, synchrotron X-ray imaging, especially tomography with hard X-rays, plays an important role in structural 3D characterization. The imaging capabilities allow for in-situ and operando experiments. In this presentation, the data handling pipeline is presented.
Reference materials are, among other things, an important tool for the quality control of measurements of specific property values. It must be taken into account that certified property values always carry a certain uncertainty. The determination of these uncertainty contributions is the subject of this presentation. At the same time, reference materials are a valuable tool for determining the uncertainty of measurement procedures and of analyses of unknown samples. The procedure for determining measurement uncertainty with the help of a reference material is described.
An introductory lecture on the Dark Side of Science: what it is, why it exists, and what can be done to fight it. This lecture illuminates the increasing prevalence of fraudulent scientific work (e.g. faked data, manipulated images, paper mills) with plenty of examples and sources. The second section expands on the driving forces that caused this phenomenon to emerge, largely pressures from management, peers and the researchers themselves. The third section presents methods and tools that can be used to educate and arm oneself against this phenomenon. The 2023 edition includes new examples of larger fraudulent bodies of work emerging, and the problems posed by the arrival of LLMs.
The project series CTSimU was initiated with the goal of developing a set of procedures that enable the task-specific measurement uncertainty of a CT system to be determined numerically by radiographic simulation. The first project (2019-2022), "Radiographic Computed Tomography Simulation for Measurement Uncertainty Evaluation - CTSimU", focused on the sufficient physical correctness of the radiographic simulation and produced a test framework for simulation software and a draft of a VDI standard in the VDI/VDE 2630 series for this application. However, for the realistic simulation of a CT system in a simulation software (i.e. a digital twin), not only the correctness of the simulation software itself is crucial, but also the quality of the parameterization of the CT system in the simulation software. This is the starting point of the second project, "Realistic Simulation of Real CT Systems with a Basic-Qualified Simulation Software - CTSimU2" (2022-2024).
The parameterization of a CT system in a simulation software can be divided into four steps: data acquisition at the real CT system (step 1) is followed by the evaluation of the acquired data to generate general parameter specifications (step 2), the transfer of the parameters into the specific simulation software (step 3), and the validation of the resulting simulation parameters by a suitable test (step 4). The intended result of the CTSimU2 project is a draft VDI standard (for VDI/VDE 2630) for this test, which contains an informative annex on the state of the art regarding the possibilities for parameter determination.
The development of tools for the realistic reproduction of an industrial CT system in a simulation software is currently the main task of the WIPANO research project CTSimU2 "Realistic Simulation of Real CT Systems with a Basic-Qualified Simulation Software". The prerequisite is a simulation software that has been basic-qualified with the test framework from the preceding project CTSimU1. The test framework tests the sufficient physical correctness and functionality of a simulation software (basic qualification of the software). For a realistic reproduction, not only the quality of the simulation software but, in particular, the quality of the parameterization of the real CT system in the simulation software is decisive. The parameterization procedure can be divided into four steps: data acquisition at the real CT system (step 1), evaluation of the acquired data to generate general parameter specifications (step 2), transfer of the parameters into the specific simulation software (step 3), and validation of the resulting simulation parameters by a suitable test (step 4). In addition to developing a toolbox with general methods for data acquisition and data evaluation, the aim of the project is therefore to develop a test by which the sufficiently correct simulation of a real system can be assessed. As in the preceding project CTSimU1, the results obtained are to be transferred into a draft standard for the VDI/VDE 2630 series. This contribution gives an overview of the project and its first results.
Research into new sources for EUV lithography is driving advancements in experimental methods tailored for this short wavelength range. This progress enables the exploration of spectroscopic techniques aimed at monitoring electronic transitions within this energy spectrum. Laser-induced breakdown spectroscopy (LIBS) serves as a rapid tool for elemental analysis, primarily established in the UV-vis range. However, LIBS encounters challenges such as limited repeatability precision and elevated background noise resulting from continuum radiation.
In parallel, laser-induced extreme UV spectroscopy (LIXS) probes the initial stages of plasma evolution, characterized by the emission of soft X-ray and extreme UV radiation. The method benefits from this short time frame and the confined plasma, leading to better precision. Nevertheless, LIXS has to cope with convoluted spectra arising from unresolved transition arrays (UTAs), which are particularly pronounced for heavier elements. This complexity renders conventional univariate data analysis impractical and demands a multivariate data analysis approach.
Multiple cathode samples, each coated with varying stoichiometries of lithium nickel manganese cobalt oxide (NMC), were prepared and used for calibration. Applying partial least squares (PLS) regression, a robust correlation with an R² value exceeding 0.97 was achieved. The LIXS technique was evaluated against UV-vis LIBS. Furthermore, univariate and multivariate analysis approaches were compared, with validation by y-randomization to mitigate the risk of overfitting.
The viability of this approach was confirmed by testing an NMC reference material. The results showed metrological compatibility with the reference values, underscoring the potential of the proposed methodology.
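To illustrate the combination of PLS regression with a y-randomization check mentioned above, the following minimal Python sketch uses purely synthetic data; the spectra, the toy target variable, and the number of PLS components are assumptions for demonstration and not values from the study.

```python
# Minimal sketch (synthetic data, not the actual LIXS spectra): PLS regression of an
# element content from spectra plus a y-randomization check against overfitting.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 500))                         # 40 spectra, 500 channels
y = X[:, 10] * 3.0 + rng.normal(scale=0.1, size=40)    # toy "Ni content" target

pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()

# y-randomization: refitting on shuffled targets should destroy the correlation;
# if it does not, the original R² is likely an artifact of overfitting.
r2_rand = np.mean([
    cross_val_score(PLSRegression(n_components=5), X, rng.permutation(y),
                    cv=5, scoring="r2").mean()
    for _ in range(20)
])
print(f"R² (real) = {r2:.2f}, R² (y-randomized) = {r2_rand:.2f}")
```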
An overview of the activities of the Federal Institute for Materials Research and Testing (BAM, Berlin, Germany) in the field of additively manufactured material characterization will be presented. The research of our group is focused on 3D imaging of AM materials by means of X-ray computed tomography in the laboratory and at the synchrotron, and on residual stress characterization by diffraction (a non-destructive technique). Two successful research projects carried out in collaboration with CAM2, Sweden, are also presented.
Detailed knowledge of the elastic material properties is of fundamental importance in many engineering fields. In particular, for the application of predictive maintenance and structural health monitoring methods based on ultrasound, accurate knowledge of the elastic material constants is a basic prerequisite. However, the elastic material constants provided by manufacturers, especially for polymers and fiber-reinforced plastics, are often insufficient, since they depend on the production process and can additionally change due to material degradation or fatigue. In practice, polymeric materials, fiber-reinforced plastics, and metals are often present as thin, plate-like structures in which guided ultrasonic waves (UGWs) can propagate. Current research already offers various neural models for determining the elastic constants and for material characterization by means of UGWs: a simple neural network with frequency and wavenumber values of exploitable modes extracted from dispersion images as input has been used to predict the elastic constants, a recurrent neural network with a time-frequency vector as input has been applied, and both 1D convolutional neural networks (CNNs) using the temporal displacement of the fundamental modes and 2D CNNs using a polar group-velocity representation have been employed for the same purpose. In this talk, an approach for determining the isotropic elastic constants of thin plates from UGWs using dispersion images and 2D CNNs is presented. Dispersion maps from numerical simulations are preprocessed with various methods to emulate realistic measurement data. The model is trained with the modified data and the architecture is optimized. Subsequently, the accuracy of the resulting model is validated with real measurement data. It is shown that 2D CNNs are able to predict the isotropic elastic constants from multimodal features of dispersion images without requiring an initial parameter estimate or manual feature extraction.
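As a minimal sketch of the kind of 2D CNN regression on dispersion images described above, the following PyTorch snippet maps a frequency-wavenumber image to two isotropic elastic constants; the input size, architecture, and output pairing (Young's modulus and Poisson's ratio) are illustrative assumptions, not the model developed in the talk.

```python
# Minimal sketch (assumed input shape and network size): a small 2D CNN that maps a
# frequency-wavenumber dispersion image to two isotropic elastic constants.
import torch
import torch.nn as nn

class DispersionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 8 * 8, 64), nn.ReLU(),
            nn.Linear(64, 2),          # outputs: [E, nu] (hypothetical pairing)
        )

    def forward(self, x):
        return self.regressor(self.features(x))

model = DispersionCNN()
dispersion_images = torch.rand(4, 1, 128, 128)   # batch of 4 synthetic f-k images
elastic_constants = model(dispersion_images)     # shape (4, 2)
print(elastic_constants.shape)
```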
The phrase “ink on paper” offers a good starting point for describing the materiality of music manuscripts; numerous manuscripts written with iron gall ink on rag paper have survived.
The paper consists of fibers, mostly of plant origin, and is formed by draining a fiber suspension on a screen. The imprint of the paper mould (the paper structure) and any watermark present provide clues to provenance and dating; in addition, the analysis of the fibers, fillers, and sizing allows a further characterization of the writing support.
Since the Middle Ages at the latest, iron gall inks have been responsible for the black to brownish appearance of many manuscripts on paper. In addition, black carbon (soot) inks and colored rubrication inks can also be detected. Iron gall inks are produced by mixing an iron-bearing component with tannins. Iron sulfate is the most frequently mentioned iron-supplying ingredient, but iron-bearing minerals, nails, or rust are also conceivable. The gallic acid derives from gall nuts, pathological growths on parts of various oak species caused by the egg-laying of gall wasps.
As with the paper, the material-analytical detection of characteristic admixtures or impurities within the inks makes it possible to answer questions of cultural history, such as distinguishing original from correction or revealing later additions.
BAM (Federal Institute for Materials Research and Testing) is developing an electronic measurement system to be placed inside a waste drum, which will be filled with concrete. The goal of this measurement system is to monitor the process of hardening and the evolution of the concrete itself over time to indirectly identify potential defects such as corrosion or cracking. The measured parameters are humidity, temperature, and pressure. In this regard, particular attention was given to the design of the electronic board’s enclosure, to allow the sensors to measure the state of the concrete without being in direct contact with it. Within the scope of the European Commission’s PREDIS project, power is supplied to the battery-less sensors and the data acquired by these sensors are transmitted through the metallic waste drum by an innovative wireless technology developed by VTT (Technical Research Centre of Finland), in order to ensure long-term operation while preserving the integrity of the sealed container.
BAM is developing an electronic measurement system to be placed inside a waste drum, which will be filled with concrete. The goal of this measurement system is to monitor the process of hardening and the evolution of the concrete itself over time to indirectly identify potential defects such as corrosion or cracking. The measured parameters are humidity, temperature, and pressure. In this regard, particular attention was given to the design of the electronic board’s enclosure, to allow the sensors to measure the state of the concrete without being in direct contact with it. In the scope of the EU project PREDIS, the data acquired by such sensors are transmitted from inside to outside the metallic waste drum through wireless technology.
The sensing system is made of a chain of small sensing units, called SensorNodes. Each SensorNode includes two off-the-shelf sensors, one for relative humidity and temperature and one for pressure and temperature. A SensorNode is designed to have a unique identifier, in order to be connected to other units while being uniquely discoverable by a standard communication protocol. In this way, a distributed matrix of measurement points is created.
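As a minimal sketch of such a chain of uniquely addressable measurement points, the following Python snippet defines a reading record and polls a chain of nodes; the addressing scheme, the number of nodes, and the placeholder values are assumptions for illustration, not the actual SensorNode firmware or the wireless protocol.

```python
# Minimal sketch (hypothetical addressing and values): one reading per SensorNode,
# collected by polling each unique identifier in the chain.
from dataclasses import dataclass

@dataclass
class Reading:
    node_id: int
    humidity_rh: float      # relative humidity in %
    temperature_c: float    # temperature in °C
    pressure_hpa: float     # pressure in hPa

def poll_node(node_id: int) -> Reading:
    """Placeholder for the bus transaction that addresses one SensorNode by its
    unique identifier and returns the values of its two sensors."""
    return Reading(node_id, humidity_rh=85.0, temperature_c=21.3, pressure_hpa=1013.0)

chain = [poll_node(node_id) for node_id in range(1, 9)]   # e.g. 8 nodes in a chain
for reading in chain:
    print(reading)
```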
One of the most challenging tasks in designing a measurement system to run in a harsh environment (such as hardening concrete) is to let the sensors sense the external environment without damaging the sensor itself.
To keep the external environment away from the electronic board while still letting the sensors measure the concrete behavior, holes have been drilled through the lid and covered from the inside with a layer of a porous membrane. The membrane's pores allow water and gas particles to pass through and let the enclosed air equilibrate with the external environment.
With the help of the developed sensors, monitoring concrete in cemented waste drums will be possible. The derived data will also serve as the basis for ongoing modeling approaches for digital twins within the PREDIS project. Overall, the sensors provide a means of enabling safe nuclear waste management through advanced monitoring.
Laser-based active thermography is a contactless non-destructive testing method to detect material defects by heating the object and measuring its temperature increase with an infrared camera. Systematic deviations from predicted behavior provide insight into the inner structure of the object. However, its ability to resolve internal structures is limited by the diffusive nature of heat conduction. Thermographic super resolution (SR) methods aim to overcome this limitation by combining multiple thermographic measurements with mathematical optimization algorithms to improve the defect reconstruction.
Thermographic SR reconstruction methods involve measuring the temperature change in an object under test (OuT) heated with multiple different spatially structured illuminations. Subsequently, these measurements are fed into a severely ill-posed and heavily regularized inverse problem, producing a sparse map of the OuT’s internal defect structure. Solving this inverse problem relies on limited priors, such as defect sparsity, and on iterative numerical minimization techniques. Previously, the method was experimentally limited mostly to one-dimensional regions of interest (ROIs); this thesis aims to extend it to the reconstruction of two-dimensional ROIs with arbitrary defect distributions while maintaining reasonable experimental complexity. Ultimately, the goal of this thesis is to make the method suitable for a technology transfer to industrial applications by advancing its technology readiness level (TRL).
In order to achieve the aforementioned goal, this thesis discusses the numerical expansion of a thermographic SR reconstruction method and introduces two novel algorithms to invert the underlying inverse problem. Furthermore, a forward solution to the inverse problem in terms of the applied SR reconstruction model is set up. In conjunction with an additionally proposed algorithm for the automated determination of a set of (optimal) regularization parameters, both create the possibility to conduct analytical simulations to characterize the influence of the experimental parameters on the achievable reconstruction quality. On the experimental side, the method is upgraded to deal with two-dimensional ROIs, and multiple measurement campaigns are performed to validate the proposed inversion algorithms, the forward solution, and two exemplary analytical studies. For the experimental implementation of the method, the use of a laser-coupled DLP projector is introduced, which allows projecting binary pixel patterns that cover the whole ROI, reducing the number of necessary measurements per ROI significantly (up to 20x).
Finally, the achieved reconstruction of the internal defect structure of a purpose-made OuT is qualitatively and quantitatively benchmarked against well-established thermographic testing methods based on homogeneous illumination of the ROI. Here, the background-noise-free two-dimensional photothermal SR reconstruction results are shown to outperform all defect reconstructions obtained with the considered reference methods.
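As a rough illustration of the kind of regularized, sparsity-promoting inversion described above, the following Python sketch solves a tiny one-dimensional toy version of the problem with an iterative shrinkage-thresholding (ISTA) loop; the blur kernel, illumination patterns, sparsity weight, and data are assumptions for demonstration and do not reproduce the thesis' algorithms.

```python
# Toy 1D photothermal-SR-style reconstruction: the measurements are a blurred view of
# a sparse defect map under several structured illuminations; ISTA with an L1 penalty
# recovers the sparse map. All parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 100
x_true = np.zeros(n); x_true[[20, 55, 56, 80]] = 1.0      # sparse defect map

# Thermal blur as a convolution matrix C (Gaussian kernel mimicking heat diffusion).
kernel = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2); kernel /= kernel.sum()
C = np.array([np.convolve(np.eye(n)[i], kernel, mode="same") for i in range(n)]).T

# Structured illuminations: random binary patterns p; one blurred measurement each.
patterns = rng.integers(0, 2, size=(15, n)).astype(float)
A = np.vstack([C @ np.diag(p) for p in patterns])          # stacked forward operator
b = A @ x_true + 0.01 * rng.normal(size=A.shape[0])        # noisy measurements

# ISTA: iterative shrinkage-thresholding for L1-regularized least squares.
step = 1.0 / np.linalg.norm(A, 2) ** 2
lam = 0.02
x = np.zeros(n)
for _ in range(500):
    x = x - step * A.T @ (A @ x - b)                       # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0) # soft thresholding

print("recovered defect positions:", np.flatnonzero(x > 0.3))
```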
X-ray photoelectron spectroscopy (XPS) allows simultaneous irradiation and damage monitoring. Although water radiolysis is essential for radiation damage, all previous XPS studies were performed in vacuum. Here we present near-ambient-pressure XPS experiments to directly measure DNA damage under a water atmosphere. They permit in-situ monitoring of the effects of radicals on fully hydrated double-stranded DNA. Our results allow us to distinguish direct damage, by photons and secondary low-energy electrons (LEE), from damage by hydroxyl radicals or hydration-induced modifications of damage pathways. The exposure of dry DNA to X-rays leads to strand breaks at the sugar-phosphate backbone, while deoxyribose and nucleobases are less affected. In contrast, a strong increase of DNA damage is observed in water, where OH radicals are produced. In consequence, base damage and base release become predominant, even though the number of strand breaks increases further. Furthermore, first data on the degradation of single-stranded DNA binding proteins (G5P / GV5 and hmtSSB) under vacuum and NAP-XPS conditions are presented.
We give an overview of recent work concerning ionizing radiation damage to oligonucleotides, plasmid DNA, DNA binding proteins (G5P), and DNA-protein complexes.
We focus on combining new experimental setups with Geant4/TOPAS particle scattering simulations to understand the effects of ionizing radiation.
Radiation biophysics
(2023)
In the course of tomorrow's hydrogen-based energy transition, the construction of the corresponding infrastructure will play a central role. Most of the materials used to date are typically welded during component fabrication. In this context, steels are widely applied and can be prone to hydrogen embrittlement. This includes classical delayed cold cracking during welding as well as embrittlement phenomena during operation. To evaluate any hydrogen effect on, for example, the mechanical properties of a welded metallic material, the hydrogen content must be determined precisely. For welds, the method suggested by ISO 3690 is isothermal carrier gas hot extraction (CGHE). CGHE is based on accelerated hydrogen degassing due to thermal activation of hydrogen at elevated temperatures. In addition to the pure quantification of hydrogen, thermal desorption analysis (TDA) with varied heating rates can be used to determine and evaluate the bonding state of hydrogen at microstructural defects in the material. For both techniques, experimental and measurement influences that strongly affect the result have to be considered. For CGHE, for example, ISO 3690 suggests different sample geometries as well as minimum extraction times. The present study summarizes results and experiences from numerous investigations with different sample temperatures and geometries (ISO 3690 type B and cylindrical TDA samples) regarding the influence of the sample surface (polished/welded), measurement accuracies depending on the sample volume, and the insufficient monitoring of the effect of the PI controller on the extraction temperature. In particular, an extraction temperature deviating from the set temperature can significantly falsify the measurement results. Based on the results, methods are shown to reach the desired extraction temperature quickly without having to physically interfere with the measurement equipment. This substantially improves the reliability of hydrogen measurement through increased signal stability and accelerated hydrogen desorption. In general, an independent temperature measurement with dummy samples for the selected heating procedure is advisable to exclude possible unwanted temperature influences before the measurement. In addition, and more importantly, the methods described can be transferred directly to industrial applications.
Multi-principal element alloys (MPEAs) are innovative materials that have attracted extensive research attention within the last decade. MPEAs are characterized by a solid solution of equiatomic metallic elements. Depending on the number of elements, they are also referred to as high-entropy alloys (HEAs, with n ≥ 4 elements, like CoCrFeMnNi) or medium-entropy alloys (MEAs, with n = 3 elements, like CoCrNi). Depending on the alloy concept, MPEAs show exceptional properties in terms of mechanical performance or corrosion resistance in extreme environments. In that connection, hydrogen and the challenges it poses for most metallic materials are becoming increasingly important. MPEAs are candidate materials for the substitution of conventional materials such as austenitic stainless steels, e.g., at very high pressures up to 1,000 bar. Such pressures are typically reached in valves or compressors for the refueling of tanks with an operational pressure of 700 bar. So far, the susceptibility of HEAs/MEAs to hydrogen-assisted cracking (if any), and especially the underlying hydrogen uptake and diffusion, has not been within the scientific scope and has not been investigated in detail. For that reason, we focused on the hydrogen absorption and the characterization of hydrogen diffusion and trapping at elevated temperatures in a CoCrFeMnNi HEA (each element with 20 at.-%) and a CoCrNi MEA (each element with 33.3 at.-%). As reference grade, the commercially available austenitic stainless steel AISI 316L was investigated. High-pressure hydrogen charging was conducted at different pressures in an autoclave environment with a maximum value of 1,000 bar. Thermal desorption analysis (TDA) via carrier gas hot extraction with coupled mass spectrometry was used with a maximum heating rate of 0.5 K/s up to 650 °C. The measured desorption spectra of the different samples were deconvoluted into a defined number of individual peaks. The individually calculated peak temperatures allowed the determination of activation energies for the predominant trap sites in the respective materials as well as the percentage share of the totally absorbed hydrogen concentration. The results present for the first time the complex interaction of both MPEAs with high-pressure hydrogen charging. A deconvolution into four peaks was selected, and a main desorption peak was identified as the dominant hydrogen trap containing the biggest share of the absorbed hydrogen concentration. The chemical composition and austenitic phase of both MPEAs are responsible for delayed hydrogen diffusion and strong, but mostly reversible, trapping. The comparison with the 316L samples showed significantly higher activation energies in the MPEAs, where hydrogen remained trapped up to very high extraction temperatures. The absorbed maximum hydrogen concentration at 1,000 bar was 130 ppm for the CoCrFeMnNi HEA, 50 ppm for the CoCrNi MEA, and 80 ppm for the 316L. It is interesting that the CoCrFeMnNi HEA evidently has a much higher trapping capability compared to the conventional austenitic 316L, which could be a major advantage in terms of resistance to hydrogen-assisted cracking.
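To illustrate how activation energies can be obtained from desorption peak temperatures measured at varied heating rates, the following sketch applies a Kissinger-type evaluation; the heating rates and peak temperatures are invented example numbers, not the measured HEA/MEA data.

```python
# Kissinger-type evaluation: ln(phi / Tp^2) vs. 1/Tp is linear with slope -Ea/R,
# where phi is the heating rate and Tp the desorption peak temperature.
# The numbers below are invented for illustration only.
import numpy as np

R = 8.314                                    # gas constant in J/(mol K)
phi = np.array([0.05, 0.1, 0.2, 0.5])        # heating rates in K/s
Tp = np.array([540.0, 556.0, 573.0, 598.0])  # peak temperatures in K

x = 1.0 / Tp
y = np.log(phi / Tp**2)
slope, intercept = np.polyfit(x, y, 1)       # linear fit of the Kissinger plot
Ea = -slope * R                              # activation energy in J/mol
print(f"Activation energy ≈ {Ea / 1000:.1f} kJ/mol")
```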
In our (dramatically understaffed) X-ray scattering laboratory, developing a systematic, holistic methodology [1] let us provide scattering and diffraction information for more than 2100 samples for 200+ projects led by 120+ collaborators over the last five years. Combined with universal, automated data correction pipelines, as well as our analysis and simulation software, this led to more than 40 papers [2] in the last 5 years with just over 2 full-time staff members.
While this approach greatly improved the consistency of the results, the consistency of the samples and sample series provided by the users was less reliable and not necessarily reproducible. To address this issue, we built an EPICS-controlled, modular synthesis platform to add to our laboratory. To date, this has prepared over 1200 additional (Metal-Organic Framework) samples for us to measure, analyse and catalogue. By virtue of the automation, the synthesis of these samples is automatically documented in excruciating detail, preparing them for upload and exploitation in large-scale materials databases alongside the morphological results obtained from the automated X-ray scattering analysis.
Having developed these proofs of concept, we find that the consistency of the results is greatly improved by virtue of their reproducibility, hopefully adding to the reliability of the scientific findings as well. Additionally, the nature of the experiments has changed greatly, with much more emphasis on preparation and careful planning. This talk will discuss the advantages and disadvantages of this highly integrated approach and will touch upon upcoming developments.
Per- and polyfluoroalkyl substances (PFAS) are a large group of more than 10,000 anionic, cationic, zwitterionic or neutral organofluorine surfactants. As a result of continuous and prolific use, mainly in aviation firefighting foams, thousands of industrial and military installations have been found to contain contaminated soil, groundwater and surface water. While liquid chromatography tandem mass spectrometry (LC-MS/MS) is a commonly used technique to characterize targeted PFAS in environmental samples, more than 10,000 different PFAS with various headgroups and properties are known. Therefore, several analytical techniques are available to analyse various groups or pools of PFAS or “all” PFAS as a sum parameter. Current decontamination strategies for PFAS-burdened soils mainly consist of adsorption methods using adsorbents for the fixation of PFAS in the ground. A second option is the utilization of a “pump and treat” process, cycling polluted soils through a washing plant and thereby concentrating the pollutants in the fine fraction. Both approaches are cost-intensive and not intended for the direct decomposition of all PFAS contaminants. Hence, there is a great demand for innovative developments and chemical treatment technologies dealing with new strategies for tackling the PFAS problem. Previously, the mechanochemical treatment of polychlorinated organic compounds in soils showed efficient dechlorination. Thus, we investigated the mechanochemical treatment of PFAS-contaminated soils with various additives in a ball mill and analyzed the PFAS defluorination with gas chromatography mass spectrometry (GC-MS) and liquid chromatography tandem mass spectrometry (LC-MS/MS), respectively, as well as the fluoride mineralization by ion chromatography (IC) and fluorine K-edge X-ray absorption near-edge structure (XANES) spectroscopy.
This contribution provides an overview of the BAMline synchrotron radiation beamline, which specializes in hard X-ray spectroscopy techniques for materials research. The BAMline offers X-ray absorption spectroscopy (XAS), X-ray fluorescence spectroscopy (XRF), and tomography to study materials' electronic structure, chemical composition, and structure. Key capabilities include standard and dispersive XAS for electronic structure, micro-XRF for elemental mapping, coded aperture imaging, and depth-resolved grazing-exit XAS. The BAMline enables in situ characterization during materials synthesis and operation for applications in energy, catalysis, corrosion, biology, and cultural heritage.
Ongoing developments such as the implementation of machine learning techniques for experiment optimization and data analysis will be discussed. For instance, Bayesian optimization is being used to improve beamline alignment and scanning. Finally, an outlook will be given on how the BAMline will continue to pioneer dynamic and multi-scale characterization, aided by advanced data science methods, to provide unique insights for materials research.
Enhancing efficiency at the BAMline: employing data science and machine learning for X-ray research
(2023)
This talk discusses how data science and machine learning techniques are being applied at the BAM Federal Institute for Materials Research and Testing to enhance efficiency and automation at the BAMLine synchrotron facility. The methods presented include Gaussian processes and Bayesian optimization for beamline adjustment and optimization of X-ray measurements. These statistical techniques allow automated alignment of beamline components and active learning scanning to reduce measurement time.
Additional machine learning methods covered are neural networks for the quantification of X-ray fluorescence (XRF) data and for decoding coded apertures.
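As a minimal illustration of the Gaussian-process-based Bayesian optimization mentioned above, the following Python sketch searches for a motor position that maximizes detector intensity; the "beamline response" is a made-up test function and the kernel, acquisition function, and iteration count are assumptions, not the BAMline implementation.

```python
# Minimal Bayesian-optimization sketch: a Gaussian process with an expected-improvement
# criterion chooses the next motor position to measure. All settings are illustrative.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def beamline_intensity(pos):               # stand-in for a real single-point scan
    return np.exp(-0.5 * ((pos - 2.7) / 0.4) ** 2) + 0.01 * np.random.randn()

candidates = np.linspace(0.0, 5.0, 501).reshape(-1, 1)
X = np.array([[0.5], [4.5]])                         # two initial measurements
y = np.array([beamline_intensity(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
for _ in range(10):                                  # 10 adaptive measurements
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    best = y.max()
    z = (mu - best) / np.maximum(sigma, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
    x_next = candidates[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, beamline_intensity(x_next[0]))

print(f"estimated optimal motor position ≈ {X[np.argmax(y)][0]:.2f}")
```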
Laser-induced breakdown spectroscopy (LIBS) is a common tool for applications in various fields of science and technology. Originally an atomic analysis technique, LIBS was later extended to molecular analysis due to the transient nature of the laser-induced plasma, which develops from a hot dissociation stage on a nanosecond to several-microsecond scale to a relatively cold recombination stage on a scale of 10 to 100 microseconds after breakdown. Molecules formed during the recombination stage or incompletely dissociated after ablation can be efficiently detected, allowing the analysis of "difficult" elements or even molecular isotopes. However, with a small amount of ablated material and a short lifetime of the luminous plasma, the analytical signals, especially molecular ones, can be very weak.
Several methods have been proposed for reheating the plasma and increasing its lifetime, for example two-pulse LIBS or LIBS combined with microwave radiation or with an electric spark discharge. Here we propose another one: LIBS combined with a capacitively coupled RF discharge at 13.6 MHz. The advantages of this combination are an increase in the lifetime of atomic and molecular emission and operation in a low-pressure atmosphere, which significantly reduces pressure line broadening and allows high-resolution spectroscopy. Another major advantage is operation in a chemically controlled atmosphere that can predictably drive desired chemical reactions. In this presentation, we will show the first results obtained with the RF-LIBS combination. These will include separate and joint characterization of the LIBS and RF plasmas and an evaluation of their potential for elemental and molecular analysis and for plasma-enhanced chemical vapor deposition.
Many applications of LIBS require the measurement of plasma temperature and electron density, which in turn requires knowledge of the integrated line intensity and the shape of the spectral lines. While the integral intensity is preserved as light passes through the spectrometer, the shape emitted by an individual atom or ion is greatly distorted. This is due, firstly, to the transfer of light through the plasma (self-absorption), secondly, to the influence of the instrumental function of the spectrometer, and, thirdly, to the aberrations of the optical system. In addition, processing of spectral information, such as background removal, noise reduction, deconvolution, and line fitting, introduces additional errors in the reconstructed linewidth and line integral, which leads to erroneous temperature and electron density values.
This communication will be devoted to the general shortcomings of spectral data processing and the resulting inaccuracies in determining the plasma parameters. The analysis is based on the use of synthetic spectra generated by plasma with known temperature and particle density. The estimation of errors caused by inadequate processing of the spectral data is made by comparing the initial and determined plasma parameters. As a result, an improved data processing method will be proposed that takes into account the spectrum distortion by the instrumental function and integration on the pixel detector. The former is accounted for by convolution (instead of deconvolution) of the estimated line profile using a predetermined slit function, and the latter is achieved by piecewise integration of the line profile by the pixel detector, taking into account the pixel size and uniform or non-uniform pixel separation. Recommendations will be made for which analytic function best approximates the observed spectral lines and examples will be given for the application of this routine to calibration-free LIBS using both synthetic and experimental data.
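As a rough sketch of the forward-modelling idea described above, the following Python snippet fits an assumed emission line profile that is first convolved with a known instrument (slit) function and then integrated over each detector pixel before comparison with the data; the line shape, slit width, and pixel grid are illustrative assumptions, not the proposed routine itself.

```python
# Minimal sketch: forward model = emission line -> convolution with a known slit
# function -> piecewise integration over detector pixels; the model is then fitted
# to (here synthetic) pixel data. All parameters are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

fine_wl = np.linspace(499.0, 501.0, 4001)       # fine wavelength grid in nm
pixel_edges = np.linspace(499.0, 501.0, 41)     # 40 detector pixels

def instrument_kernel(width_nm=0.05):
    k = np.exp(-0.5 * ((fine_wl - fine_wl.mean()) / width_nm) ** 2)
    return k / k.sum()

def model(pixel_centers, amplitude, center, gamma):
    """Lorentzian line -> slit-function convolution -> per-pixel integration."""
    line = amplitude * gamma**2 / ((fine_wl - center) ** 2 + gamma**2)
    observed = np.convolve(line, instrument_kernel(), mode="same")
    return np.array([
        np.trapz(observed[(fine_wl >= lo) & (fine_wl < hi)],
                 fine_wl[(fine_wl >= lo) & (fine_wl < hi)])
        for lo, hi in zip(pixel_edges[:-1], pixel_edges[1:])
    ])

pixel_centers = 0.5 * (pixel_edges[:-1] + pixel_edges[1:])
data = model(pixel_centers, 1.0, 500.02, 0.03)   # synthetic "measured" spectrum
popt, _ = curve_fit(model, pixel_centers, data, p0=[0.8, 500.0, 0.05])
print("fitted amplitude, center, width:", popt)
```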
In the LIBS literature, almost every second article reports the determination of the plasma temperature using the Boltzmann plot method or the determination of the electron density using the Stark line broadening relation. The first requires the measurement of the integrated intensities of the spectral lines, and the second requires the measurement of the linewidth, under the same assumption of optical thinness. It is taken for granted that this can be easily done either by working with the raw spectra or by fitting an appropriate function to the observed spectral lines. As a rule, reported data are not verified either by an alternative method (e.g., Thomson scattering) or by computer simulations using synthetic spectra.
However, the question of how to extract the necessary information from the raw spectral data is not as simple as it might seem. The quality of such an extraction will depend critically on the type of spectral instrument used, its resolution, and the noise superimposed on the data. The problem is that we do not see the spectrum emitted by the plasma, but the spectrum distorted by the measurement; an exaggerated example of such a distortion is shown in Fig. 1. The elimination of this distortion belongs to the class of inverse problems, the so-called ill-posed problems, whose successful solution crucially depends on the quality of the information available. When it comes to spectroscopy, quality of information primarily means high spectral resolution and low noise. Not all spectrometers used in LIBS can provide the quality needed to solve the inverse problem; this casts doubt on many published plasma measurements.
The current presentation will be devoted to general shortcomings in the processing of spectral data and the inaccuracies in the determination of plasma parameters resulting from these shortcomings. The analysis is based on the use of synthetic spectra produced by a plasma with known characteristics, i.e., temperature, species densities, and electron density. The estimation of errors caused by inadequate processing of spectral data is made by comparing the initial and reconstructed plasma parameters. Recommendations will be given as to which analytic function best approximates the observed spectral lines, and how data processing errors affect the accuracy of calibration-free LIBS will be discussed. These issues were only partially covered in previously published works, for example [1, 2, 3].
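To illustrate the Boltzmann plot method referred to above, the following sketch derives a temperature from integrated line intensities under the optically thin assumption; the line data (wavelengths, intensities, statistical weights, transition probabilities, upper-level energies) are invented numbers for demonstration only.

```python
# Minimal Boltzmann-plot sketch: ln(I * lambda / (g_k * A_ki)) versus upper-level
# energy E_k is linear with slope -1/(k_B * T); a least-squares fit yields T.
# The line data below are invented for illustration.
import numpy as np

k_B = 8.617e-5                                  # Boltzmann constant in eV/K
wl  = np.array([404.6, 415.8, 426.0, 438.3])    # wavelengths in nm
I   = np.array([1.00, 0.41, 0.16, 0.042])       # integrated intensities (a.u.)
g   = np.array([9, 7, 5, 3])                    # statistical weights g_k
A   = np.array([8.5e7, 6.4e7, 5.0e7, 3.1e7])    # transition probabilities in 1/s
E_k = np.array([4.55, 4.84, 5.11, 5.39])        # upper-level energies in eV

y = np.log(I * wl / (g * A))
slope, intercept = np.polyfit(E_k, y, 1)
T = -1.0 / (k_B * slope)
print(f"Plasma temperature ≈ {T:.0f} K")
```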
The aim of the project is to develop an adequate model of the laser-induced plasma for conditions expected in space missions, i.e., vacuum or a low-pressure CO2 atmosphere. Numerical modeling will help to find optimal experimental parameters for laser ablation under artificial lunar or Martian environments and to obtain both qualitative (in terms of composition) and quantitative (in terms of elemental abundance) information about interrogated samples based on spectral data generated by the model. The best operational conditions will be found at low cost without conducting tedious and time-consuming optimization experiments. The modeling approaches will be supported by machine learning to accelerate the optimization.
The application of multivariate data analysis is essential in extracting the full potential of laser-induced XUV spectroscopy (LIXS) for high-precision elemental mapping. LIXS offers significant advantages over traditional laser-induced breakdown spectroscopy in UV-vis (LIBS), including higher precision and a wider dynamic range,[1,2] while making it possible to determine light elements like lithium and fluorine. However, it is challenged by the presence of unresolved transition arrays (UTAs) for heavier elements. These UTAs add considerable complexity to the spectral data, often concealing crucial information. In this study, we employ well-established multivariate data analysis techniques and intensive data preprocessing to unravel this contained information.
The refined analysis reveals a high level of detail, enabling the precise identification of inhomogeneities within material samples. Our approach has particular relevance for studying aging processes in lithium-ion batteries (LIBs), specifically in relation to varying cathode materials and fluorine-containing polymer binder content. By combining elemental distribution with structural information, this improved method can offer a more comprehensive understanding of sample inhomogeneities and aging processes in LIBs, contributing to the development of more reliable and sustainable battery technologies.
Laser-induced XUV spectroscopy (LIXS) is an emerging technique for elemental mapping. In comparison to conventional laser-induced breakdown spectroscopy in the UV-vis range (LIBS), it has higher precision and a wider dynamic range, and it is well suited for the quantification of light elements like lithium and fluorine. Furthermore, it can detect oxidation states. The XUV spectra are produced at a very early stage of the plasma formation; therefore, effects of the plasma evolution on the reproducibility can be neglected. It has been shown that high-precision elemental quantification in precursor materials for lithium-ion batteries (LIBs) can be performed using LIXS. Based on these results, LIXS mapping was used to investigate aging processes in LIBs. Different cathode materials with varying contents of fluorine-containing polymer binders were compared at different stages of aging. Owing to effects comparable to X-ray photoelectron spectroscopy, but in reverse, monitoring of changes in the oxidation state is envisioned, which makes information about the chemical environment of the observed elements accessible. The combination of elemental distribution and structural information leads to a better understanding of aging processes in LIBs and supports the development of more sustainable and safe batteries.
Improved Data Processing for Accurate Plasma Diagnostics with Implications for Calibration-Free LIBS
(2023)
Many LIBS papers report the determination of the plasma temperature using the Boltzmann plot method or the determination of the electron density using the Stark line broadening relation. This requires measuring the integrated intensities of the spectral lines and the linewidth under the assumption of optical thinness. It is taken for granted that this can be easily done either by working with the raw spectra or by fitting an appropriate function to the observed spectral lines. However, extracting the necessary information from raw spectral data is not as easy as it might seem. The quality of such an extraction depends to a decisive extent on the type of spectral instrument used. The spectrum emitted by the plasma is distorted by the instrument; an example is shown in Fig. 1. The elimination of this distortion belongs to the class of inverse problems, the successful solution of which fundamentally depends on the quality of the available information. When it comes to spectroscopy, quality of information primarily means high spectral resolution and low noise. Not all spectrometers used in LIBS can provide the quality needed to solve the inverse problem; this casts doubt on many published plasma measurements. This communication will be devoted to the general shortcomings of spectral data processing and the inaccuracies in determining the plasma parameters resulting from these shortcomings. The analysis is based on the use of synthetic spectra generated by a plasma with known temperature, particle density and electron density. The estimation of errors caused by inadequate processing of spectral data is made by comparing the initial plasma parameters with those determined from the spectra. Recommendations will be made as to which analytic function best approximates the observed spectral lines, and how data processing errors affect the accuracy of calibration-free LIBS will be discussed. These issues were only partially covered in previously published works, for example [1, 2].
Per- and polyfluoroalkyl substances (PFAS) are a large group of organofluorine surfactants used in the formulations of thousands of consumer goods. The continuous use of PFAS in household products and the discharge of PFAS from industrial plants into the sewer system have resulted in contaminated effluents and sewage sludge from wastewater treatment plants (WWTPs), which have become an important pathway for PFAS into the environment. Because sewage sludge is often used as fertilizer, its application on agricultural soils has been identified as a significant input path for PFAS into our food chain. To produce high-quality phosphorus fertilizers for a circular economy from sewage sludge, PFAS and other pollutants (e.g. pesticides and pharmaceuticals) must be separated from the sludge. Normally, PFAS are analyzed using protocols with time-consuming extraction steps and LC-MS/MS target quantification. However, for the screening of PFAS contaminations in wastewater-based fertilizers, the DGT technique can also be used for the PFAS extraction. Afterwards, combustion ion chromatography (CIC) can be applied to analyze the “total” amount of PFAS on the DGT binding layer. The DGT method was less sensitive and only comparable to the extractable organic fluorine (EOF) values of the fertilizers in samples with >150 µg/kg, because of the different diffusion properties of various PFAS as well as kinetic exchange limitations. However, the DGT approach has the advantage that almost no sample preparation is necessary. Moreover, the PFAS adsorption on the DGT binding layer was investigated via surface-sensitive spectroscopic methods such as Fourier-transform infrared (FT-IR) and fluorine K-edge X-ray absorption near-edge structure (XANES) spectroscopy.
O3BET Quality Protocols
(2023)
Presentation of the process-oriented approach for the development of the quality protocols (standard operating procedures and work instructions) for the O3BETs. O3BETs are innovative testing facilities for building envelopes which are being developed in the course of the Metabuilding Labs EU Horizon 2020 project.
In the research project PALUP, air-coupled phased-array transmitters and receivers are being developed. The transducers are based on charged cellular polymers, also known as ferroelectrets, which exhibit piezoelectric properties and a very low specific acoustic impedance, making them well suited for building air-coupled ultrasonic transducers.
X-ray refraction is analogous to the deflection of visible light by matter; it occurs at boundaries between different media. The main difference between visible light and X-rays is that in the latter case the deflection angles are very small, from a few seconds to a few minutes of arc (i.e., the refractive index n is close to 1). Importantly, the deflection of X-rays is also sensitive to the orientation of the object boundaries. These features make X-ray refraction techniques extremely suitable to a) detect defects such as pores and microcracks and quantify their densities in bulk (not too heavy) materials, and b) evaluate porosity and particle properties such as orientation, size, and spatial distribution (by mapping). While X-ray refraction techniques cannot in general image single defects, they can detect objects with sizes above a few wavelengths of the radiation.
Such techniques, especially at the Synchrotron BESSY II, Berlin, Germany, can be used in situ, i.e. while the specimen is subjected to elevated temperatures or external loads.
The use of X-ray refraction analysis yields quantitative information, which can directly serve as input to kinetic, mechanical, and damage models.
Here we show the application of non-destructive X-ray refraction radiography (SXRR, 2D mapping, also called topography) to problems in additive manufacturing:
1) Porosity analysis in PBF-LM-Ti64. Through the use of SXRR, we could not only map the (very sparse) porosity distribution between the layers and quantify it, but also classify, and thereby separate, the filled porosity (unmolten powder) from the keyhole and gas pores (Figure 1).
2) In-situ heat treatment of laser powder bed fusion PBF-LM-AlSi10Mg to monitor microstructure and porosity evolution as a function of temperature (Figure 2). By means of SXRR we indirectly observed the initial eutectic Si network breaking down into larger particles with increasing temperature. We could also detect the thermally induced porosity (TIP). Observing such changes in the Si-phase morphology upon heating is currently only possible using scanning electron microscopy, but with a much smaller field of view. SXRR also allows observing the growth of some individual pores, usually studied via X-ray computed tomography, but again on much smaller fields of view.
Our results show the great potential of in-situ SXRR as a tool to gain in-depth knowledge of the defect distribution and the susceptibility of any material to thermally induced damage and/or microstructure evolution over statistically relevant volumes.
Knowledge representation in the materials science and engineering (MSE) domain is a vast and multi-faceted challenge: Overlap, ambiguity, and inconsistency in terminology are common. Invariant and variant knowledge are difficult to align cross-domain. Generic top-level semantic terminology often is too abstract, while MSE domain terminology often is too specific.
The PMDco is designed in direct support of the FAIR principles to address the immediate needs and requirements of the global expert community. The illustrated findings show how the PMDco bridges semantic gaps between high-level, MSE-specific, and other science domain semantics, how the PMDco lowers development and integration thresholds, and how it can be fed from real-world data sources ranging from manually conducted experiments and simulations to continuously automated industrial applications.
LIBS ConSort: Development of a sensor-based sorting method for construction and demolition waste
(2023)
Closed material cycles and unmixed material fractions are required to achieve high recovery and recycling rates in the building industry. In construction and demolition waste (CDW) recycling, the preference to date has been to apply simple but proven techniques to process large quantities of construction rubble in a short time. This is in contrast to the increasingly complex composite materials and structures in the mineral building materials industry. Manual sorting involves many risks and dangers for the executing staff and is merely based on obvious, visually detectable differences for separation. An automated, sensor-based sorting of these building materials could complement or replace this practice to improve processing speed, recycling rates, sorting quality, and prevailing health conditions. A joint project of partners from industry and research institutions approaches this task by investigating and testing the combination of laser-induced breakdown spectroscopy (LIBS) with near-infrared (NIR) spectroscopy and visual imaging. Joint processing of information (data fusion) is expected to significantly improve the sorting quality of various materials like concrete, main masonry building materials, organic components, etc., and may enable the detection and separation of impurities such as SO3-containing building materials (gypsum, aerated concrete, etc.). Focusing on Berlin as an example, the entire value chain will be analyzed to minimize economic / technological barriers and obstacles at the cluster level and to sustainably increase recovery and recycling rates. We present current advances and results on the test stand development combining LIBS with NIR spectroscopy and visual imaging. In the future, this laboratory prototype will serve as a fully automated measurement setup to allow real-time classification of CDW on a conveyor belt.
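As a minimal illustration of the data-fusion idea described above, the following Python sketch concatenates LIBS and NIR feature vectors of the same object and trains a single classifier; the features, labels, and classifier choice are synthetic assumptions, not the project's actual fusion pipeline.

```python
# Minimal low-level data-fusion sketch (synthetic features): LIBS and NIR feature
# vectors are concatenated per object and fed to one classifier that assigns a CDW
# material class. With random toy data, the accuracy is at chance level.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 300
libs_features = rng.normal(size=(n, 40))   # e.g. 40 LIBS line intensities per object
nir_features = rng.normal(size=(n, 20))    # e.g. 20 NIR band absorbances per object
labels = rng.integers(0, 3, size=n)        # 0 = concrete, 1 = brick, 2 = gypsum (toy)

X = np.hstack([libs_features, nir_features])   # fused feature vector
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"toy classification accuracy: {clf.score(X_te, y_te):.2f}")
```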
Hot Isostatic Pressing (HIP) is often introduced to tackle the porosity issue in additively manufactured (AM) materials. For instance, HIP post-processing is recommended to improve the fatigue resistance of laser powder bed fusion (PBF-LB) manufactured parts [1, 2]. Even though HIP cannot completely remove porosity, it significantly decreases the defect population and brings its average size below the critical threshold that leads to early crack initiation.
In the present study, an in-situ investigation of the HIP procedure of PBF-LB Ti-6Al-4V parts was carried out to gain further insights into the densification mechanism occurring during HIP. The in-situ observations at high pressure and high temperature are uniquely possible at the PSICHE beamline of the Soleil synchrotron (France), thanks to the Ultrafast Tomography on a Paris-Edinburgh Cell (UToPEC) setup and the combination of fast phase-contrast tomography and energy-dispersive diffraction [3, 4]. A detailed methodology was developed to ensure that the correct pressure and temperature were maintained during the experiments.
The results allowed an estimation of the global densification rate during HIP of the PBF-LB Ti-6Al-4V material, as well as a detailed quantitative characterization of the influence of pore size and shape on the densification process, and thereby an understanding of the effectiveness of the HIP process for different pore categories. After 20 min, 75% of the porosity can be considered closed or has a size below the resolution of the XCT reconstruction. We also observed that the smallest defects showed the highest densification rate, while the defect shape did not have a significant effect on this rate. The current development of the in-situ HIP experiment allows experimental quantification and validation of simulation work. Ultimately, it paves the way to tailoring the HIP procedure to different materials depending on their porosity and microstructure.
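As a rough sketch of the kind of porosity quantification behind such a densification analysis, the following Python snippet labels pores in a segmented tomography volume and computes the porosity fraction and per-pore volumes, as one would track them between HIP time steps; the volume is synthetic and the threshold is an illustrative assumption.

```python
# Minimal porosity-quantification sketch (synthetic binary volume): connected-component
# labelling of a segmented pore mask, porosity fraction, and per-pore volumes.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
volume = rng.random((64, 64, 64)) > 0.995        # toy pore mask (True = pore voxel)

labels, n_pores = ndimage.label(volume)          # connected-component labelling
pore_volumes = ndimage.sum(volume, labels, index=range(1, n_pores + 1))
porosity = volume.sum() / volume.size

print(f"{n_pores} pores, porosity = {100 * porosity:.2f} %")
print("largest pore volume (voxels):", int(pore_volumes.max()))
```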
The achievable spatial resolution of active thermographic testing is inherently limited by the diffusive nature of heat conduction in solids. For a semi-infinite body, this degradation of the achievable spatial resolution acting on a defect signal can be approximated by a spatial convolution with the Green's function of the heat PDE. As the degradation in spatial resolution depends on the depth L, a common rule of thumb states that, for proper detection, any defect should feature a spatial extension greater than or equal to the depth at which it is located. However, as the exact shape of a defect can have a large impact on its severity, ideally a proper reconstruction of the defect shape should be performed, which therefore must also deal with the aforementioned adverse effects of heat conduction. One recent method to overcome the spatial resolution limit of thermographic testing is the photothermal super resolution reconstruction method. It is based on performing multiple active thermographic measurements on the same region of interest (ROI) with varying spatially structured heating and a subsequent numerical reconstruction of the measured defect signals by solving a severely ill-posed inverse reconstruction problem relying on heavy regularization. By extending the experimental implementation of the method to random pixel patterns projected onto the ROI using a laser-coupled DLP projector, defect reconstructions can now be performed within a reasonable time frame (~15 min per ROI) at high accuracy. Compared to conventional thermographic testing methods, the photothermal super resolution reconstruction stands out by producing a sparse representation of the defect structure of the ROI, making it especially well suited for further automatic defect classification and quality assurance measures in an Industry 4.0 context.
In our (dramatically understaffed) X-ray scattering laboratory, developing a systematic, holistic methodology let us provide scattering and diffraction information for more than 2100 samples for 200+ projects led by 120+ collaborators. Combined with automated data correction pipelines, and our analysis and simulation software, this led to more than 40 papers in the last 5 years with just over 2 full-time staff members.
This year, our new, modular synthesis platform has made more than 1000 additional samples for us to analyse and catalogue. By virtue of the automation, the synthesis of these samples is automatically documented in excruciating detail, preparing them for upload and exploitation in large-scale materials databases. Having developed these proof-of-concepts, we find that materials research itself is changed dramatically by automating dull tasks in a laboratory.
This talk is intended to spark ideas and collaborations by providing an overview of 1) the current improvements in our scattering laboratory methodology, 2) our open, modular robotic platform used for systematic sample preparation, and 3) the data structure of the synthesis logs and measurements. Finally, the remaining bottlenecks and points of attention across all three are highlighted.
In this talk, the importance of metadata is underscored by real-world examples.
Metadata is essential to alleviating the reproducibility crisis in science. This implies that a wide range of metadata must be collected, with a heavy emphasis on the automated collection of such metadata. It must subsequently be organized in an intelligible, archival structure, where possible with units and uncertainties.
Such metadata can aid in improving the usage efficiency of instrumentation, as is demonstrated on the MOUSE instrument. This metadata can now be used to connect the various aspects of the holistic experimental procedure to gain better insights on the materials structure.
A second example shows the extraction and organization of such metadata from an automated materials development platform, collected during the synthesis of 1200 samples. These metadata from the synthesis can then be linked to the results from the analysis of these samples, to find direct correlations between the synthesis parameters and the final structure of the materials.
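As a minimal sketch of what such an archival metadata record with units and uncertainties might look like, the following snippet links a synthesis step to a downstream scattering result; the field names and values are hypothetical and do not reflect the platform's actual schema.

```python
# Minimal sketch of an archival metadata record (hypothetical schema and values):
# each quantity carries a value, a unit and, where available, an uncertainty, so that
# synthesis parameters can later be correlated with analysis results.
import json

record = {
    "sample_id": "MOF-2024-0815",
    "synthesis": {
        "reagent_volume": {"value": 2.5, "unit": "mL", "uncertainty": 0.05},
        "temperature":    {"value": 120.0, "unit": "degC", "uncertainty": 1.0},
        "stirring_time":  {"value": 30, "unit": "min"},
    },
    "analysis": {
        "instrument": "MOUSE",
        "radius_of_gyration": {"value": 4.2, "unit": "nm", "uncertainty": 0.3},
    },
}

print(json.dumps(record, indent=2))   # archival, human- and machine-readable form
```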
This presentation provides a comprehensive overview of recent developments and the current status within the Semantic Interoperability work area, with a particular emphasis on the advancements related to the Platform MaterialDigital Core Ontology (PMDco). The presentation will delve into the collaborative and community-supported curation process that has been instrumental in shaping PMDco. Additionally, we will introduce the innovative Ontology Playground, showcasing its role in fostering experimentation and exploration within the realm of ontology development.
The development of more powerful and more efficient lithium-ion batteries (LIBs) is a key area in battery research, aiming to support the ever-increasing demand for energy storage systems. To better understand the causes and mechanisms of degradation, and thus the diminishing cycling performance and lifetime often observed in LIBs, in operando techniques are essential, because battery chemistry can be monitored non-invasively, in real time. Moreover, there is increasing interest in developing new battery chemistries. Beyond LIBs, sodium ion batteries (NIBs) have gained increasing interest in recent years, as they are a promising candidate to complement LIBs, owing to their improved sustainability and lower cost, while still maintaining high energy density.[1] Initial phases of NIB commercialisation have occurred in the past year. However, for the widespread commercialisation of NIBs, there are still challenges that need to be overcome in developing optimized electrode materials and electrolytes. For the development of such materials and greater understanding of sodium storage mechanisms, solid electrolyte interface (SEI) formation and stability, and degradation processes, in operando methodologies are crucial.
Among the techniques available for in operando analysis, nuclear magnetic resonance spectroscopy (NMR) and imaging (MRI) are becoming increasingly used to characterize the chemical composition of battery materials, study the growth and distribution of dendrites, and investigate battery storage and degradation mechanisms. In situ and in operando 1H, 7Li and 23Na NMR and MRI have recently been used to study LIBs and NIBs, identifying chemical changes in Li and Na species respectively, in metallic, quasimetallic and electrolytic environment as well as directly and indirectly studying dendrite formation in both systems.[2-4] The ability of NMR and MRI to probe battery systems across multiple environments can further be complemented by the enhanced spatial resolution of micro-computed X-ray tomography (μ-CT) which can provide insight into battery material microstructure and defect distribution.
Here, we report in operando 1H and 7Li NMR and MRI experiments that investigate LIB performance and identify changes in the Li signal during charge cycling, as well as signals in both 1H and 7Li NMR spectra that we attribute to diminishing battery performance, capacity loss, and degradation. Additionally, recent operando methodologies are adapted and implemented to study Sn-based anodes in NIBs. 23Na spectroscopy is performed to monitor the formation and evolution of peaks assigned to stages of Na insertion into Sn, while 1H MRI is used to indirectly visualize the volume expansion of Sn anodes during charge cycling. Battery operation and degradation in these NIBs are further explored using μ-CT, where the anode is directly visualized at higher resolution and the loss of electrolyte in the cell during cycling is observed.
With the continuous release of anthropogenic pollutants into the environment, substantial risks for human health arise. Especially concerning are persistent substances (e.g., PFAS), as they accumulate in food chains, which inevitably results in the transgression of negative-impact threshold levels. Environmental Analytical Chemistry interfaces with all disciplines of Risk Assessment. It is therefore the key tool to identify, monitor, and remediate environmental pollutants. Based on the example of PFAS, a workflow to tackle environmental pollutants in a retro- and prospective way is shown. Within the project, the worldwide situation of environmental pollutants will be illustrated using the example of PFAS. As the major discipline to confront the problem, analytical chemistry will be shown as a key tool for contesting PFAS and creating safe-by-design materials in the future.
The combination of non-target analysis (NTA) based on HPLC-ESI-MS with elemental fluorine speciation analysis based on HPLC-PARCI-MS for the identification and quantification of (unknown) organofluorines in environmental samples offers several advantages, such as a reduced non-target data treatment workflow and the possibility of quantification.
PFASs combine persistent, bioaccumulative, and toxic properties and are hence environmental contaminants of emerging concern. Thus, procedures for identifying potential sources of entry of these compounds into the environment, for identifying new organofluorine species, and for closing mass balances urgently need to be developed. The intrinsic fluorine tag appears in a sizable fraction of these and other xenobiotics, making elemental speciation desirable for quantitative NTA in these areas. Current non-target approaches based on ESI-HRMS suffer from a lack of data mining algorithms for the identification of PFASs with low fluorine mass percentages. Furthermore, the low ionization efficiencies of these compounds hamper detection limits.
Hence, the proposed combination of simultaneous HPLC-ESI-MS and HPLC-PARCI-MS via split-stream coupling is a promising approach for environmental PFAS monitoring. Furthermore, it could serve as an important analytical procedure to set up limiting values in compliance with the desired PFAS ban of the European Union.
The Bundesanstalt für Materialforschung und -prüfung (BAM) is a senior scientific federal institute and part of the departmental research of the Federal Republic of Germany. Under its guiding principle of “Safety in Technology and Chemistry”, it is responsible for public technical safety and for metrological tasks in chemistry. BAM's range of tasks, which is oriented towards current questions from science, industry, policy, and standardization, offers many interesting fields of activity for scientists and engineers.
Digital Calibration Certificates: Transforming Efficiency and Safety in Hydrogen Refuelling Station
(2023)
Digital certificates have emerged as a pivotal element in automation and digitalisation. This presentation highlights the added value of digitalised metrology, its impact on the workflows on both the calibration service provider and recipient side, as well as its significance in enhancing the quality infrastructure. An overview of digital calibration certificates (DCCs) for temperature sensors, including their structure and role in establishing trust in hydrogen refuelling stations (HRS), will be given. Additionally, the impact of DCCs on optimising the efficiency of the hydrogen refuelling process is explored.
We explore the paradigm shift brought about by Industry 4.0, in which machines can autonomously interpret digital certificate data, leading to streamlined safety checks and reduced human intervention. We will discuss how the automated verification of machine-readable certificates helps maintain and raise safety standards compared with human-readable certificates. Furthermore, we will take a deep dive into the application of DCCs in HRS, showcasing how they enhance operational efficiency, accuracy, and maintenance by enabling real-time monitoring and adjustment of process data.
By exploring the interdependent relationship between digital certificates, machine-readable environments, and HRS optimisation, this presentation will provide valuable insights into harnessing cutting-edge technologies to create a safer, more efficient, and technologically empowered hydrogen refuelling process.
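To indicate the kind of automated, machine-readable check described in this abstract, the following minimal sketch parses a simplified, entirely hypothetical temperature-sensor certificate and verifies that it covers an assumed HRS operating range; the XML layout and element names are placeholders and do not reproduce the actual DCC schema.

# Minimal sketch of machine-readable certificate verification.
# The XML layout and element names below are simplified placeholders,
# not the official DCC schema.
import xml.etree.ElementTree as ET

CERT = """<certificate>
  <instrument id="T-sensor-042"/>
  <calibration>
    <range unit="K" min="233.15" max="363.15"/>
    <maxUncertainty unit="K">0.05</maxUncertainty>
  </calibration>
</certificate>"""

def sensor_fit_for_use(xml_text, t_min, t_max, u_required):
    """Return True if the certificate covers the required temperature
    range with at most the required uncertainty."""
    root = ET.fromstring(xml_text)
    rng = root.find("./calibration/range")
    unc = float(root.findtext("./calibration/maxUncertainty"))
    covers = float(rng.get("min")) <= t_min and float(rng.get("max")) >= t_max
    return covers and unc <= u_required

# Hypothetical pre-cooling check: -40 °C to +85 °C, 0.1 K uncertainty budget
print(sensor_fit_for_use(CERT, t_min=233.15, t_max=358.15, u_required=0.1))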
Per- and polyfluoroalkyl substances (PFAS) are chemicals that were developed to improve humanity's quality of life. Due to their high chemical stability and resistance to degradation by heat or acids, PFAS were used in a variety of consumer products. The continuous use of PFAS in household products and the discharge of PFAS from industrial plants into the sewer system resulted in the contamination of effluents and sewage sludge from wastewater treatment plants (WWTPs) (Roesch et al. 2022). Since sewage sludge is often used as fertilizer, its application on agricultural soils has been identified as a significant entry path for PFAS into the environment and ultimately into our food chain. In Germany, the application of sewage sludge/biosolids on agricultural land was banned with the amendment of the German Sewage Sludge Ordinance, and by 2029 sewage sludge application will be prohibited entirely. However, phosphorus (P) from sewage sludge should still be recycled in WWTPs of cities with more than 50,000 residents. To produce high-quality P-fertilizers for a circular economy, PFAS and other pollutants (e.g. pesticides and pharmaceuticals) must be separated from sewage sludge. Due to the strong diversity of industrial PFAS usage, it is not clear whether a safe application of novel recycled P-fertilizers from WWTPs can be guaranteed.
Therefore, we analyzed various sewage sludges and wastewater-based fertilizers: sewage sludge (SL) samples from various WWTPs in Germany and Switzerland, six sewage sludge ashes (SSA) from Germany, six thermally treated SL and SSA samples with different additives (temperatures: 700-1050 °C), two pyrolyzed SL samples (temperature: 400 °C), and two struvite samples from Germany and Canada. The goal was to quantify PFAS in sewage sludges and wastewater-based P-fertilizers via the sum parameter extractable organic fluorine (EOF) by combustion ion chromatography (CIC). The results were compared with data from classical LC-MS/MS target analysis as well as with HR-MS suspect screening of selected samples.
The EOF values of the SLs ranged mainly between 154 and 538 µg/kg, except for one SL, which showed an elevated EOF value of 7209 µg/kg due to high organofluorine contamination. For the SSA samples, the EOF values were lower, ranging between the LOQ (approx. 60 µg/kg) and 121 µg/kg. For the pyrolyzed SLs, no EOF values above the LOQ were detected. Moreover, the two wastewater-based struvite fertilizers contained 96 and 112 µg/kg EOF, respectively. In contrast to the EOF values, the sums of the PFAS target values were relatively low for all SLs. The additionally applied PFAS HR-MS suspect screening aimed to tentatively identify PFAS that could contribute to the hitherto unexplained part of the EOF. The majority of the detected fluorinated compounds were legacy PFAS such as short- and long-chain perfluorocarboxylic acids (PFCA), perfluorosulfonic acids (PFSA), polyfluoroalkyl phosphate esters (PAPs), and perfluorophosphonic acids (PFPA). Moreover, fluorinated pesticides, pharmaceuticals, and aromatic compounds were also identified, all of which are included in the EOF parameter. Our research revealed that the current PFAS limit of 100 µg/kg for the sum of PFOS + PFOA in the German Fertilizer Ordinance is no longer up to date. Since the number of known PFAS already exceeds 10,000, the ordinance limit should be updated accordingly.
Recent regulations and restrictions on the use of long-chain PFAS (≥C8) have resulted in a significant shift in industry towards (ultra-)short-chain alternatives and other, partly unknown, emerging PFAS. Ultimately, fluorinated pesticides and pharmaceuticals, which end up as ultrashort-chain PFAS in WWTPs, also have to be considered as possible pollutants in wastewater-derived fertilizers.
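The gap between EOF and target results discussed above is commonly assessed through a fluorine mass balance; the following sketch, with purely hypothetical concentrations rather than the values reported in the study, illustrates the underlying arithmetic of converting target PFAS concentrations to fluorine equivalents and comparing them with the measured EOF.

# Sketch of an organofluorine mass balance: convert target-PFAS concentrations
# to fluorine equivalents and compare with the sum parameter EOF.
# Concentrations below are hypothetical, not the values reported in the study.

M_F = 18.998  # g/mol

# target PFAS: (concentration in µg/kg, number of F atoms, molar mass in g/mol)
targets = {
    "PFOA": (12.0, 15, 414.07),
    "PFOS": (25.0, 17, 500.13),
    "PFBA": (4.0,   7, 214.04),
}

eof_measured = 300.0  # µg F / kg (hypothetical EOF from combustion IC)

f_explained = sum(c * n_f * M_F / m for c, n_f, m in targets.values())
print(f"F explained by targets: {f_explained:.1f} µg/kg "
      f"({100 * f_explained / eof_measured:.0f} % of EOF)")
print(f"Unidentified organofluorine: {eof_measured - f_explained:.1f} µg/kg")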
When reassessing existing concrete bridges, the challenge is often to obtain missing or incomplete information on the internal structure. Of particular interest are the number and position of the existing reinforcement as well as the geometric dimensions of the components. Non-destructive testing methods such as radar or ultrasound, which work on the basis of the pulse-echo method, have been established for this purpose, as they only require access to the component from one side. The measurement data recorded on the structure require pre-processing before the internal structure can be reproduced with geometric accuracy. Besides various data processing steps, the geometrical reconstruction of the measured data based on the Synthetic Aperture Focusing Technique (SAFT) is state of the art today. In this paper, the technical possibilities of the ultrasonic echo method are presented based on measurements in the laboratory and on a real bridge structure. The precision of the reconstruction and its limitations are shown. In addition to the state-of-the-art SAFT technique, open questions and the latest research approaches, such as imaging by reverse time migration (RTM), including initial results, are discussed.
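For illustration, the core of a SAFT reconstruction is a delay-and-sum over the synthetic aperture; the following minimal Python sketch (simplified 2D monostatic geometry, no apodization or envelope detection, hypothetical parameter values) indicates the principle rather than the implementation used in the paper.

# Minimal delay-and-sum SAFT sketch (2D, monostatic pulse-echo). Real
# implementations add apodization, envelope detection and aperture limits.
import numpy as np

def saft(ascans, x_positions, dt, c, x_grid, z_grid):
    """ascans: (n_positions, n_samples) array of pulse-echo A-scans."""
    image = np.zeros((len(z_grid), len(x_grid)))
    n_samples = ascans.shape[1]
    for ix, x in enumerate(x_grid):
        for iz, z in enumerate(z_grid):
            # round-trip travel time from each transducer position to the pixel
            t = 2.0 * np.sqrt((x_positions - x) ** 2 + z ** 2) / c
            idx = np.round(t / dt).astype(int)
            valid = idx < n_samples
            image[iz, ix] = ascans[valid, idx[valid]].sum()
    return image

# Hypothetical use: 5 MHz sampling, assumed shear-wave speed in concrete
# img = saft(ascans, x_positions, dt=2e-7, c=2700.0, x_grid=..., z_grid=...)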
The focus of the presentation will be on 3D imaging by means of X-ray computed tomography (XCT) in the laboratory and at the synchrotron, and on the non-destructive characterization of residual stress (RS) by diffraction for different kinds of materials at the BAM division FB8.5 Micro-NDT. For instance, manufacturing defects and high RS are inherent to additive manufacturing techniques and affect the structural integrity of components. Using XCT, the size and shape distributions of defects as well as geometrical deviations can be characterized, allowing further optimization of the manufacturing process. Diffraction-based RS analysis methods using neutrons and synchrotron X-rays at large-scale facilities offer the possibility to non-destructively and spatially resolve both surface and bulk RS in complex components and to track their changes under applied thermal or mechanical loads.
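As an illustration of the diffraction-based RS evaluation mentioned above, the following sketch applies the classical sin²ψ method, in which the slope of lattice strain versus sin²ψ yields the in-plane stress component; the material constants and lattice spacings are illustrative values, not measured data.

# Sketch of the classical sin²(psi) evaluation of near-surface residual stress
# from diffraction data (biaxial stress state assumed). Values are illustrative.
import numpy as np

def sin2psi_stress(d_psi, psi_deg, d0, E, nu):
    """In-plane residual stress (same unit as E) from a linear fit
    of lattice strain against sin²(psi)."""
    strain = (np.asarray(d_psi) - d0) / d0
    x = np.sin(np.deg2rad(psi_deg)) ** 2
    slope = np.polyfit(x, strain, 1)[0]        # d(strain) / d(sin²psi)
    return slope * E / (1.0 + nu)

# Hypothetical ferritic steel {211} reflection: E = 210 GPa, nu = 0.28
d_psi = [1.17020, 1.17028, 1.17041, 1.17055]   # Å, measured at several tilts
psi   = [0.0, 20.0, 35.0, 45.0]                # deg
print(sin2psi_stress(d_psi, psi, d0=1.17010, E=210e3, nu=0.28), "MPa")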
This presentation summarizes recent developments within the scope of the national pre-standardization project “ZfPStatik”, which aims to prepare a guideline about NDT-supported structural analyses. The focus is on the purposeful and explicit utilization of geometrical tendon and reinforcement bar positions measured on-site in reliability analysis — shown by means of a prestressed concrete bridge as case study.
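To indicate how on-site measured reinforcement positions can enter such a reliability analysis, the following crude Monte Carlo sketch compares the failure probability of a simple bending limit state using a nominal effective depth with large scatter against a measured depth with reduced scatter; all distributions and numbers are illustrative assumptions, not values from the case study.

# Sketch: effect of on-site-measured reinforcement depth on the failure
# probability of a simple bending limit state, via crude Monte Carlo.
# All distributions and numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000

def pf(d_mean, d_std):
    """Failure probability of g = M_R - M_E with M_R = A_s * f_y * 0.9 * d."""
    A_s = 1.96e-3                                  # m², reinforcement area
    f_y = rng.normal(550e6, 30e6, N)               # Pa, yield strength
    d   = rng.normal(d_mean, d_std, N)             # m, effective depth
    M_E = rng.lognormal(np.log(600e3), 0.10, N)    # Nm, load effect
    g = A_s * f_y * 0.9 * d - M_E
    return np.mean(g < 0)

print("prior (nominal depth, large scatter):  ", pf(0.85, 0.03))
print("updated (measured depth, small scatter):", pf(0.87, 0.005))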
Experimentally informed multiscale creep modelling of additive manufactured Ni-based superalloys
(2023)
Excellent creep resistance at elevated temperatures, i.e. T/T_m > 0.5, owing to the γ-γ' microstructure is one of the main properties of nickel-based superalloys. Because of its great importance for industrial applications, a remarkable amount of research has been devoted to understanding the underlying deformation mechanisms over a wide spectrum of temperature and loading conditions. Additively manufactured (AM) nickel-based superalloys, while governed by a similar γ-γ' microstructure, exhibit AM-process-specific microstructural characteristics, such as columnar grains, strong crystallographic texture (typically a <001> fiber texture parallel to the build direction), and compositional inhomogeneity, which in turn lead to an anisotropic creep response in both the stationary and the tertiary creep stage.
Despite the deep insights recently gained into the correlation between process parameters and the resulting microstructure, the anisotropic creep behavior and the corresponding deformation mechanisms of these materials are still insufficiently understood. One reason for this is the lack of capable material models that can link the microstructure to the mechanical behavior. To overcome this challenge, a multiscale microstructure-based approach has been applied by coupling crystal plasticity (CP) with a polycrystal model, which enables the inclusion of different deformation mechanisms and microstructural characteristics such as crystallographic texture and grain morphology. The method has been applied to experimental data for AM INCONEL-738LC (IN738). The effect of different slip systems, texture, and morphology on creep anisotropy at 850 °C has been investigated. The results suggest a strong correlation between superlattice extrinsic stacking fault (SESF) formation, microtwinning, and the observed creep anisotropy.
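As a schematic illustration of the crystal-plasticity building block of such a multiscale model, the following sketch evaluates resolved shear stresses on the twelve FCC octahedral slip systems and a Norton-type power-law slip rate for two loading directions, showing qualitatively how crystallographic orientation produces creep anisotropy; the parameters are illustrative and not the calibrated IN738 values.

# Schematic crystal-plasticity kernel: Schmid factors of the 12 octahedral
# {111}<110> slip systems and a Norton-type slip rate, summed into an axial
# creep rate for a given loading direction. Parameters are illustrative.
import numpy as np

def fcc_slip_systems():
    """The 12 octahedral {111}<110> slip systems (one sign convention each)."""
    planes = [(1, 1, 1), (-1, 1, 1), (1, -1, 1), (1, 1, -1)]
    candidates = [(0, 1, -1), (1, 0, -1), (1, -1, 0),
                  (0, 1, 1), (1, 0, 1), (1, 1, 0)]
    normals, dirs = [], []
    for n in planes:
        for d in candidates:
            if sum(ni * di for ni, di in zip(n, d)) == 0:  # direction in plane
                normals.append(np.array(n, float) / np.sqrt(3.0))
                dirs.append(np.array(d, float) / np.sqrt(2.0))
    return np.array(normals), np.array(dirs)

def axial_creep_rate(load_dir, sigma=300.0, gdot0=1e-9, tau0=100.0, n_exp=5.0):
    """Norton-type slip rates summed into an axial strain rate (schematic)."""
    l = np.asarray(load_dir, float)
    l = l / np.linalg.norm(l)
    normals, dirs = fcc_slip_systems()
    schmid = (normals @ l) * (dirs @ l)          # Schmid factor per system
    tau = sigma * schmid                         # resolved shear stresses
    gdot = gdot0 * np.sign(tau) * np.abs(tau / tau0) ** n_exp
    return float(np.sum(gdot * schmid))          # projection back on load axis

print("creep rate, load || [001]:", axial_creep_rate([0, 0, 1]))
print("creep rate, load || [011]:", axial_creep_rate([0, 1, 1]))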