Chemistry and Process Engineering
Glimpses of the Future ✨: Advancing X-ray Scattering in an Automated Materials Research Laboratory
(2023)
In our (dramatically understaffed) X-ray scattering laboratory, developing a systematic, holistic methodology [1] has let us provide scattering and diffraction information for more than 2100 samples across 200+ projects led by 120+ collaborators. Combined with automated data correction pipelines and our analysis and simulation software, this has led to more than 40 papers [2] in the last 5 years with just over 2 full-time staff members.
This year, our new, modular synthesis platform has made more than 1000 additional samples for us to analyse and catalogue. By virtue of the automation, the synthesis of these samples is automatically documented in excruciating detail, preparing them for upload and exploitation in large-scale materials databases. Having developed these proofs of concept, we find that materials research itself is changed dramatically by automating dull tasks in a laboratory.
This talk is intended to spark ideas and invite collaborations by providing an overview of: 1) the current improvements in our wide-range X-ray scattering laboratory methodology, 2) some of our open-source analysis and simulation software, touching on scattering, diffraction and PDF, and 3) our open, modular robotic platform for systematic sample preparation. Finally, the remaining bottlenecks and points of attention across all three are highlighted.
The second talk for the Swiss Society for Crystallography (SSCr) workshop on SAXS will highlight the data processing challenges, holistic experimental workflow developments, and the pitfalls. In particular, the following items will be addressed:
- The importance of data processing and estimating uncertainty
- A universal correction pipeline – away with the headaches, at least for this step!
- Experiment planning part 2, some tips and advice to improve your corrected data.
- Sample preparation, background selection, some tips and advice to improve your corrected data.
- Automate for your mental well-being; electronic logbooks, measurement catalogs and workflow management software
- Life on the edge: several pitfalls to avoid…
This talk for the Swiss Society for Crystallography (SSCr) workshop on SAXS will introduce scattering from various angles, focusing in particular on:
- Information content of X-ray scattering experiments, three entry points…
- An introduction to Fourier Transforms
- Sample criteria, compatibility, and selection
- Key indicators of a measurement – where is the information?
- Key indicators of measurement quality
- Experiment planning, the basics
In our (dramatically understaffed) X-ray scattering laboratory, developing a systematic, holistic methodology has let us provide scattering and diffraction information for more than 2100 samples across 200+ projects led by 120+ collaborators. Combined with automated data correction pipelines and our analysis and simulation software, this has led to more than 40 papers in the last 5 years with just over 2 full-time staff members.
This year, our new, modular synthesis platform has made more than 1000 additional samples for us to analyse and catalogue. By virtue of the automation, the synthesis of these samples is automatically documented in excruciating detail, preparing them for upload and exploitation in large-scale materials databases.
This talk is intended to spark ideas and invite collaborations by providing an overview of: 1) the current improvements in our wide-range X-ray scattering laboratory methodology, and 2) our open, modular robotic platform for systematic sample preparation.
Introduction: The influence of copper, iron and zinc concentrations on the formation of β-amyloid plaques and neurofibrillary tangles in Alzheimer's disease (AD) is widely discussed in the community. The results from human and animal studies so far are mixed, with some studies showing a correlation and others not. From a number of studies, it is known that disease state and the isotopic composition of essential elements can be coupled.
Aim: The aim of the study was to identify changes in element content and isotopic composition in two transgenic mouse models used in AD research compared to their genetic WT relatives, and to establish whether element content and isotopic signature are comparable between different laboratories.
Methods: β-amyloid (5xFAD) and tau overexpressing (L66) mice together with their matching wild-types were bred at dedicated facilities in accordance with the European Communities Council Directive (2010/63/EU). Serum and brain were sampled after sacrifice and the samples distributed among the participants of the study. The tissues were acid digested for total element determination and high-precision isotope ratio determination. Element content was determined by either sector-field or quadrupole-based inductively coupled plasma mass spectrometry (ICPMS). For the determination of isotope ratios, multi-collector ICPMS was used.
Results: Total copper content was significantly higher for L66 and their matched WT compared to 5xFAD and WT. Brains of L66 mice contained more Fe than those of their WT, while Zn and Cu were not significantly different between L66 and WT. In contrast, 5xFAD mice had a slightly lower Cu and slightly higher Zn concentration in brain compared to WT. The Fe isotopic signature in the brain of L66 mice differed from their controls, whereas Zn isotope ratios were influenced in 5xFAD mice compared to their WT. The Cu isotopic ratio did not seem to be influenced in either strain. In serum, the shifts were less pronounced.
Conclusion: Even though neither tau protein nor amyloid precursor protein are known to be metal-dependent or metal-containing proteins, the overexpression of both influences the Fe, Cu and Zn metabolism in the brain and, to some extent, also in serum, as can be seen not only from total element determination but probably more clearly from the isotopic signatures of Fe, Cu and Zn.
Many metallic materials gain better mechanical properties through controlled heat treatments. For example, in age-hardenable aluminium alloys, the strengthening mechanism is based on the controlled formation of nanometre-sized precipitates, which represent obstacles to dislocation movement and consequently increase the strength. Precise tuning of the material microstructure is thus crucial for optimal mechanical behaviour under the service conditions of a component. Therefore, analysis of the microstructure, especially the precipitates, is essential to determine the optimum parameters for the interplay of material and heat treatment. Transmission electron microscopy (TEM) is utilized to identify precipitate types and orientations in a first step. Dark-field imaging (DF-TEM) is often used to image the precipitates and thereafter quantify their relevant dimensions. Often, these evaluations are still performed by manual image analysis, which is very time-consuming and to some extent also poses reproducibility problems.
Our work aims at a semantic representation of an automatable digital approach for this material-specific characterization method, adopting FAIR data practices. Based on DF-TEM images of different precipitation states of a wrought aluminium alloy, the modularizable, digital workflow of quantitative analysis of precipitate dimensions is described. The integration of this workflow into a data pipeline concept will also be discussed. Using ontologies, the raw image data, their respective contextual information, and the resulting output data of the quantitative image analysis can be linked in a triplestore. Publishing the digital workflow and the ontologies will ensure data reproducibility. In addition, the semantic structure enables data sharing and reuse for other applications and purposes, demonstrating interoperability.
We report on a joint project aiming at the digitalization of a lab course in materials testing. The undergraduate students were asked to prepare samples of a precipitation-hardened aluminum alloy and characterize them using hardness and tensile tests. In a first step, we developed the framework for the digital lab notebook using eLabFTW. The primary data and the relevant metadata of each run were saved in a central database and made available for analysis and reporting. The whole set of results produced in a course was made available in the database. This database can be improved further and serve as an open repository for data on this specific alloy.
The logical frame for the joint project was provided by the PMD Core Ontology (PMDco), a mid-level ontology that enables the representation and description of processes and process chains in an MSE-specific manner, ensuring full traceability of generated data. For the digitalization of this lab course, the tensile test ontology (TTO) was applied, which is designed as a module of the PMDco using closely related semantic concepts.
Knowledge representation in the materials science and engineering (MSE) domain is a vast and multi-faceted challenge: overlap, ambiguity, and inconsistency in terminology are common; invariant and variant knowledge are difficult to align across domains; and generic top-level semantic terminology is often too abstract, while MSE domain terminology is often too specific.
In this presentation, an approach for maintaining a comprehensive and intuitive MSE-centric terminology by composing a mid-level ontology, the PMD core ontology (PMDco), via MSE community-based curation procedures is shown.
The PMDco is designed in direct support of the FAIR principles to address the immediate needs and requirements of the global expert community. The illustrated findings show how the PMDco bridges semantic gaps between high-level, MSE-specific, and other science domain semantics, how the PMDco lowers development and integration thresholds, and how to fuel it from real-world data sources ranging from manually conducted experiments and simulations to continuously automated industrial applications.
The present work is part of the AIFRI project (Artificial Intelligence For Rail Inspection), where we and our project partners train a neural network for defect detection and classification. Our goal at BAM is to generate artificial ultrasound and eddy current training data for the A.I. This paper has an exploratory nature, where we focus on the simulation of eddy current signals for head check cracks, one of the most important rail surface defects. The goal of this paper is twofold. On the one hand, we present our general simulation setup. This includes geometric models for head check cracks with features like branching and direction change, a model for the HC10 rail testing probe, and the configuration of the Faraday simulation software.
On the other hand, we use the Faraday software to simulate eddy current testing signals with a strong focus on the influence of the damage depth on the signal, while differentiating between different crack geometries. Here, we observe an early saturation effect of the test signal at a damage depth of 2 mm (at a crack angle of 25° to the surface). That is about 2 mm earlier than we would expect from measurements at a crack angle of 90°. This behavior will be investigated further in a future paper. Finally, we interpolate the simulated signals in a two-step curve fitting process. With these interpolations we may generate eddy current test signals for any damage depth within the simulated range.
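As an illustration of this two-step idea (a hedged sketch, not the project's actual code; the Gaussian peak model, scan range and depth values are assumptions), each simulated signal can first be fitted with a simple parametric model, after which the fitted parameters are interpolated over damage depth:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.interpolate import interp1d

# Step 1: fit a simple peak model to each simulated signal (a Gaussian stand-in here).
def peak_model(x, amplitude, centre, width):
    """Illustrative model of the eddy current response along the scan axis."""
    return amplitude * np.exp(-0.5 * ((x - centre) / width) ** 2)

x = np.linspace(-10.0, 10.0, 201)                  # scan position in mm (assumed)
depths = np.array([0.5, 1.0, 1.5, 2.0, 3.0])       # simulated damage depths in mm (assumed)

# Stand-ins for the simulated signals; the amplitude saturates with depth, as described above.
signals = [peak_model(x, 0.2 + 0.5 * (1 - np.exp(-d)), 0.0, 2.0 + 0.1 * d) for d in depths]

fitted = np.array([curve_fit(peak_model, x, s, p0=(0.5, 0.0, 2.0))[0] for s in signals])

# Step 2: interpolate each fitted parameter as a smooth function of damage depth.
param_of_depth = [interp1d(depths, fitted[:, i], kind="cubic") for i in range(fitted.shape[1])]

def synthetic_signal(depth_mm):
    """Generate a test signal for any damage depth within the simulated range."""
    amplitude, centre, width = (f(depth_mm) for f in param_of_depth)
    return peak_model(x, amplitude, centre, width)

print(f"interpolated peak amplitude at 1.7 mm: {synthetic_signal(1.7).max():.3f}")
```

In practice, the first-step model would be chosen to match the measured HC10 signal shape rather than the Gaussian used here.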
Additive manufacturing of concrete structures is a novel and emerging technology. Free contouring in civil engineering, which allows for entirely new designs, is a significant advantage. In the future, lower construction costs are expected with increased construction speeds and decreasing required materials and workers. However, architects and civil engineers rely on a certain quality of execution to fulfil construction standards. Although several techniques and approaches demonstrate the advantages, quality control during printing is highly challenging and rarely applied. Due to the continuous mixing process commonly used in 3D concrete printing, it is impossible to exclude variations in the dry mixture or water content, and a test sample cannot be taken as representative of the whole structure. Although mortar properties vary only locally, a defect in one layer during printing could affect the integrity of the whole structure. Therefore, real-time process monitoring is required to record and document the printing process. At the Bundesanstalt für Materialforschung und -prüfung (BAM), a new test rig for the additive manufacturing of concrete has been built. Its primary purpose is measuring and monitoring the properties of a mortar during the printing process. The following study investigates an approach for calculating yield stress and plastic viscosity based on experimentally recorded pressure data. The calculations assume that fresh mortar behaves as a Bingham fluid and that the Buckingham–Reiner equation is applicable. A test setup consisting of rigid pipes with integrated pressure sensors at different positions is utilized. Monitoring the printing process with different sensors is crucial for the quality control of an ongoing process.
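For orientation, the Buckingham–Reiner equation relates the laminar flow rate of a Bingham fluid in a circular pipe to the pressure drop, yield stress and plastic viscosity, and can be inverted numerically to recover the Bingham parameters from pressure data. The sketch below uses invented pipe dimensions, flow rates and pressure drops, not the test-rig data:

```python
import numpy as np
from scipy.optimize import curve_fit

R, L = 0.0127, 2.0    # pipe radius and sensor spacing in m (assumed values)

def buckingham_reiner(delta_p, tau0, mu_p):
    """Volumetric flow rate of a Bingham fluid through a circular pipe (laminar flow)."""
    tau_w = delta_p * R / (2.0 * L)                        # wall shear stress
    ratio = np.clip(tau0 / tau_w, 0.0, 1.0)                # no flow once tau_w <= tau0
    return (np.pi * R**4 * delta_p) / (8.0 * mu_p * L) * (1.0 - 4.0 / 3.0 * ratio + ratio**4 / 3.0)

# Invented measurements: pressure drops between two sensors (Pa) and pump flow rates (m^3/s).
delta_p = np.array([80e3, 120e3, 160e3, 200e3])
q_true = buckingham_reiner(delta_p, tau0=200.0, mu_p=20.0)                       # synthetic truth
q_meas = q_true * (1.0 + 0.02 * np.random.default_rng(1).standard_normal(4))     # 2 % noise

(tau0_fit, mu_fit), _ = curve_fit(buckingham_reiner, delta_p, q_meas,
                                  p0=(100.0, 10.0), bounds=([0.0, 0.1], [1000.0, 500.0]))
print(f"yield stress ~ {tau0_fit:.0f} Pa, plastic viscosity ~ {mu_fit:.1f} Pa s")
```

In a test rig of this kind, the pressure differences between adjacent sensors along the pipe would play the role of delta_p here.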
Defects are still common in metal components built with Additive Manufacturing (AM). Process monitoring methods for laser powder bed fusion (PBF-LB/M) are used in industry, but the relationships between monitoring data and defect formation are not yet fully understood. Additionally, defects and deformations may develop with a time delay to the laser energy input. Thus, the component quality can currently only be determined after the process is finished.
Here, active laser thermography, a non-destructive testing method, is adapted to PBF-LB/M, using the defocused process laser as heat source. The testing can be performed layer by layer throughout the manufacturing process. The results of the defect detection using infrared cameras are presented for a custom research PBF-LB/M machine. Our work enables a shift from post-process testing of components towards in-situ testing during the AM process. The actual component quality is evaluated in the process chamber and defects can be detected between layers.
The presentation provides an overview and introduction to the need and drivers for the digital transformation of the quality infrastructure (QI). It illustrates the tools and processes that are the foundation of a digital QI and how the initiative QI-Digital aims to develop corresponding solutions. Special emphasis is given to the pilot projects at BAM.
Current decontamination strategies for PFAS-burdened soils mainly consist of adsorption methods using adsorbents for fixation of PFAS in the ground. A second option is the utilization of a "pump and treat" process, cycling polluted soils through a washing plant and concentrating the pollutants in the fine fraction. Only a subsequent, highly energy-consuming pyrolysis process guarantees the total destruction of all fluorinated organic contaminants. These approaches are cost-intensive and not intended for the direct decomposition of all PFAS contaminants. Hence, there is a great demand for innovative developments and chemical treatment technologies dealing with new strategies for tackling the PFAS problem. Thus, we investigated the mechanochemical treatment of PFAS-contaminated soils with various additives in a ball mill and analyzed the PFAS defluorination with gas chromatography mass spectrometry (GC-MS) and liquid chromatography tandem mass spectrometry (LC-MS/MS), respectively, as well as the fluoride mineralization by ion chromatography (IC) and fluorine K-edge X-ray absorption near-edge structure (XANES) spectroscopy.
The focus of the presentation will be on 3D imaging by means of X-ray Computed Tomography (XCT) in the laboratory and at the synchrotron, and on the non-destructive residual stress (RS) characterization by diffraction of additively manufactured (AM) materials at BAM (Berlin, Germany). Manufacturing defects and high RS are inherent to AM techniques and affect the structural integrity of the components. Using XCT, the defect size and shape distributions as well as geometrical deviations can be characterized, allowing further optimization of the manufacturing process. Diffraction-based RS analysis methods using neutrons and synchrotron X-rays at large-scale facilities offer the possibility to non-destructively and spatially resolve both surface and bulk RS in complex components and to track their changes following applied thermal or mechanical loads.
In recent years, additive manufacturing technologies have gained in importance. Laser powder bed fusion can be used for complex functional components or the production of workpieces in small quantities. High safety requirements, e.g. in aerospace, demand comprehensive quality control. Therefore, non-destructive offline inspection methods such as computed tomography are used after production. Recently, online non-destructive testing methods such as optical tomography have been developed to improve profitability and practicality. In this presentation, the applicability of eddy current inspection using GMR sensors for online inspection of PBF-LB/M parts is demonstrated. Eddy current testing is performed for each layer during the production process at frequencies up to 1.2 MHz. Despite the use of high-resolution arrays with 128 elements, the testing time is kept low by adapted hardware. Thus, the measurement can be performed during the manufacturing process without significantly slowing down production. In addition to the approach, the results of an online eddy current test of a step-shaped test specimen made of Haynes 282 are presented.
Ultrasonic coda wave interferometry can detect small changes in scattering materials like concrete. We embedded ultrasonic transducers in the Gänstorbrücke Ulm, a monitored road bridge in Germany, to test the methodology. Since fall 2020, we have been monitoring parts of the bridge and comparing the results to commercial monitoring systems. We calculate signal and volumetric velocity changes using coda waves, and long-term measurements show that the influence of temperature on strains and ultrasound velocity changes can be monitored. Velocity change maps indicate that different parts of the bridge react differently to environmental temperature changes, revealing local material property differences. A load experiment with trucks allows calibration to improve the detectability of possibly damaging events. Our work focuses on measurement reliability, the potential use of, and distinction from, temperature effects, the combination with complementary sensing systems, and converting measured values into information for damage and life cycle assessment.
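A common way to turn coda waves into velocity changes is the stretching technique; the following is a minimal, hedged sketch with a synthetic signal (it is not claimed to be the processing used for the Gänstorbrücke data):

```python
import numpy as np

def stretching_dvv(reference, current, t, candidates=np.linspace(-0.01, 0.01, 401)):
    """Estimate the relative velocity change dv/v by stretching the reference coda in time
    and picking the stretch factor with the highest correlation to the current coda."""
    best_cc, best_eps = -np.inf, 0.0
    for eps in candidates:
        # A velocity increase of dv/v = eps compresses arrival times by a factor (1 - eps).
        stretched = np.interp(t, t * (1.0 - eps), reference)
        cc = np.corrcoef(stretched, current)[0, 1]
        if cc > best_cc:
            best_cc, best_eps = cc, eps
    return best_eps, best_cc

# Synthetic coda: a decaying, band-limited noise burst; the "current" trace has +0.2 % velocity.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0e-3, 4000)                                   # 2 ms window (assumed)
noise = np.convolve(rng.standard_normal(t.size), np.ones(20) / 20, mode="same")
reference = noise * np.exp(-t / 5e-4)
current = np.interp(t, t * (1.0 - 0.002), reference)                 # simulate dv/v = +0.2 %

dvv, cc = stretching_dvv(reference, current, t)
print(f"estimated dv/v = {100 * dvv:.2f} % (correlation {cc:.3f})")
```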
Hydrogen, as a flexible and easily transportable energy carrier, is a key component on the path to a climate-neutral energy transition. For the storage of gaseous hydrogen, pressure vessels made of composite materials are mostly used in stationary as well as mobile applications. Ensuring the safety of these vessels during operation is highly relevant. Structural health monitoring (SHM) offers an innovative approach to ensure both the safety and the reliability of the pressure vessels and to avoid critical failure cases.
For the continuous monitoring and assessment of the condition of a hydrogen pressure vessel, this contribution presents an active method based on guided ultrasonic waves. The pitch-catch technique between a transmitter and receivers is used to monitor the structural integrity. Based on the investigated wave propagation behaviour in the pressure vessel and its geometric properties, a sensor network of piezoelectric wafer transducers (PZT) is designed to cover the region to be monitored. The aim is to achieve the largest possible coverage with the help of an optimization algorithm while minimizing the number of sensors. In addition, a uniform and space-saving distribution is prioritized so that further measurement methods can be applied to the pressure vessel in the future (e.g. acoustic or fibre-optic sensors).
The sensor network optimized for this application is then evaluated with respect to its robustness in damage detection and localization by applying artificial damage. The results show that the artificial damage on the hydrogen pressure vessel can be detected.
This presentation provides a short introduction to the Commission on Isotopic Abundances and Atomic Weights (CIAAW). It describes the role of the Commission and provides an insight into its work and the corresponding principal tasks. Finally, it provides the reader with the latest achievements and with the most recent publications.
Guided wave ultrasonic features in composite overwrapped pressure vessels towards digital twin
(2023)
The digitalization of quality control processes and the underlying data infrastructures for safety-relevant components, such as hydrogen pressure vessels, plays a significant role in the transition towards Industry 4.0. In the current safety regulations for hydrogen pressure vessels, there is no established concept for structural health monitoring. The development of a reliable structural health monitoring methodology for monitoring the structural integrity of pressure vessels enables a fast-forward transition from personnel- and cost-intensive recurring inspections, a.k.a. periodic maintenance, to predictive maintenance. In the work presented, we investigated the application of ultrasonic guided wave propagation to monitor and assess the condition of a Type IV composite overwrapped pressure vessel (COPV). A sensor network of fifteen piezoelectric wafers is placed on the carbon fibre reinforced composite cylinder. Five different artificial damage configurations are created by gluing two different weight blocks on three different locations. The database containing the measured guided wave data sets is enriched by two different boundary conditions. We utilized open-source software, the openBIS lab notebook, to store and analyse experimental datasets. The guided wave ultrasonic signals were investigated and analysed using commonly used ultrasonic features (e.g., amplitude, frequency, time of flight) as well as non-traditional time-series features (kurtosis, skewness, variance). The features were used to calculate damage indices, and the detection performance of the results was evaluated. The results suggest that both traditional and non-traditional features assume significant importance in artificial damage detection. Future work will additionally involve the impact of operational conditions, such as periodic pressure variations and temperature loads, as well as material degradation.
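The abstract does not spell out the damage index definition; as a hedged illustration only, the sketch below combines the traditional and statistical time-series features mentioned above into a simple baseline-deviation index (signal model, sampling rate and feature choices are assumptions):

```python
import numpy as np
from scipy.stats import kurtosis, skew

def features(signal, fs):
    """Traditional and statistical features of one guided wave signal."""
    envelope = np.abs(signal)
    return np.array([
        envelope.max(),                    # peak amplitude
        np.argmax(envelope) / fs,          # arrival time of the strongest packet (ToF proxy)
        np.var(signal),                    # variance
        skew(signal),                      # skewness
        kurtosis(signal),                  # kurtosis
    ])

def damage_index(baseline_signals, current_signal, fs):
    """Normalized deviation of the current feature vector from the baseline distribution."""
    base = np.array([features(s, fs) for s in baseline_signals])
    mu, sigma = base.mean(axis=0), base.std(axis=0) + 1e-12
    return np.linalg.norm((features(current_signal, fs) - mu) / sigma)

# Synthetic example: baseline = repeated noisy tone bursts; "damaged" = attenuated, delayed burst.
rng = np.random.default_rng(0)
fs = 1.0e6                                            # sampling rate in Hz (assumed)
t = np.arange(0, 2e-3, 1 / fs)

def burst(delay, amp):
    return amp * np.sin(2 * np.pi * 100e3 * (t - delay)) * np.exp(-(((t - delay) / 1e-4) ** 2))

baseline = [burst(0.5e-3, 1.0) + 0.02 * rng.standard_normal(t.size) for _ in range(10)]
damaged = burst(0.55e-3, 0.7) + 0.02 * rng.standard_normal(t.size)

print(f"damage index = {damage_index(baseline, damaged, fs):.1f}")
```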
Capillary-active interior insulation materials are an important approach to minimize energy losses of historical buildings. A key factor for their performance is a high liquid conductivity, which enables redistribution of liquid moisture within the material. We set up an experiment to investigate the development of moisture profiles within two different interior insulation materials, calcium silicate (CaSi) and expanded perlite (EP), under constant boundary conditions. The moisture profiles were determined by two different methods: simple destructive sample slicing with subsequent thermogravimetric drying as well as non-destructive NMR measurements with high spatial resolution. The moisture profiles obtained from both methods show good agreement when compared at the low spatial resolution of sample slicing, which demonstrates the reliability of this method. Moreover, the T2 relaxation-time distributions across the sample depth were measured, which may give further insight into the saturation degree of the different pore sizes. In order to explain differences in the moisture profiles between CaSi and EP, we determined their pore-size distributions with different methods: conversion of the NMR T2 relaxation-time distribution at full saturation, mercury intrusion porosimetry, and indirect determination from pressure plate measurements. CaSi shows a unimodal distribution at small pore diameters, while in EP, a bimodal or wider distribution was found. We assume that the smaller pore diameters of CaSi lead to a higher capillary conductivity, which causes a more distributed moisture profile in comparison with that of EP.
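The conversion from a T2 relaxation-time distribution to a pore-size distribution is typically done in the fast-diffusion limit; the sketch below illustrates this with an assumed surface relaxivity and spherical pore geometry (illustrative values, not those determined for CaSi or EP):

```python
import numpy as np

# Fast-diffusion limit: 1/T2 ≈ rho2 * (S/V). For spherical pores S/V = 3/r, hence r = 3 * rho2 * T2.
rho2 = 5e-6           # surface relaxivity in m/s (illustrative assumption)
shape_factor = 3.0    # spherical pore geometry assumed

# Example T2 distribution (relaxation times in s with associated amplitudes),
# e.g. as obtained from an inverse Laplace transform of CPMG decays.
t2 = np.logspace(-4, 0, 50)                                       # 0.1 ms ... 1 s
amplitude = np.exp(-0.5 * ((np.log10(t2) + 2.0) / 0.4) ** 2)      # synthetic peak around 10 ms

pore_radius = shape_factor * rho2 * t2          # each T2 bin maps onto one pore-radius bin
pore_diameter_nm = 2.0e9 * pore_radius          # diameters in nm; amplitudes carry over unchanged

for d, a in zip(pore_diameter_nm[::10], amplitude[::10]):
    print(f"pore diameter ~ {d:10.1f} nm, relative volume {a:.2f}")
```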
For a long time, the rule of thumb for active thermography as a non-destructive testing method was that the resolution of internal defects/inhomogeneities is limited to a ratio of defect depth/defect size ≤ 1. This is due to the diffusive nature of thermal conduction in solids. So-called super resolution approaches have recently allowed this physical limit to be overcome many times over. This offers the attractive possibility of developing thermography from a purely near surface-sensitive testing method to one with improved depth range. How far this development can be pushed is the subject of current research.
We have already been able to show that this classical limitation can be overcome for one- and two-dimensional defect geometries by illuminating the test object sequentially in a structured manner with individual laser spots and subsequently calculating a defect map from the resulting measurement data by applying photothermal super resolution reconstruction, which allows a significantly improved separation of individual, closely spaced defects. The method benefits strongly from the combination of sequential, spatially structured illumination and modern numerical optimization methods, which come at the expense of higher experimental complexity. This leads to long measurement times, large data sets, and tedious numerical analysis, in contrast to the application of established standard thermographic methods with homogeneous illumination.
In this work, we report on the application of full-area, spatially structured two-dimensional illumination patterns which, by applying state-of-the-art laser projector technology in conjunction with a high-power laser, make an efficient implementation of photothermal super resolution reconstruction possible even for larger test areas.
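The reconstruction step is, at its core, a regularized inverse problem. Purely as a 1-D toy (not the photothermal super resolution algorithm used in this work), the sketch below recovers a sparse defect profile from blurred, noisy data by iterative soft thresholding:

```python
import numpy as np

# 1-D toy: two closely spaced "defects" blurred by a Gaussian point spread function (PSF).
rng = np.random.default_rng(0)
n = 200
x_true = np.zeros(n)
x_true[[94, 106]] = 1.0                                   # two defects, 12 samples apart
psf = np.exp(-0.5 * (np.arange(-16, 17) / 4.0) ** 2)
psf /= psf.sum()

blur = lambda v: np.convolve(v, psf, mode="same")         # forward model; the PSF is symmetric,
                                                          # so the operator is its own adjoint
y = blur(x_true) + 0.002 * rng.standard_normal(n)         # noisy, blurred "measurement"

# ISTA: gradient step on the data-fit term plus soft thresholding to promote a sparse defect map.
lam, step = 2e-3, 1.0
x = np.zeros(n)
for _ in range(3000):
    x = x - step * blur(blur(x) - y)
    x = np.sign(x) * np.maximum(np.abs(x) - lam * step, 0.0)

print("true defect indices:", np.where(x_true > 0)[0])
print("recovered defect indices:", np.where(x > 0.1)[0])
```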
International standards describing reliable protocols will facilitate the commercialization of graphene and related 2D materials. One physico-chemical key property, next to flake size and thickness, is the chemical composition of the material. Therefore, an ISO standard is under development with X-ray photoelectron spectroscopy having a prominent role. With its information depth of around 10 nm, which is on a similar length scale as the thickness of particles of 2D materials consisting of a few monolayers, XPS seems to be highly suitable for this purpose. Different sample preparation methods, like pressing the powders onto adhesive tapes, into recesses, or into solid pellets, result in inconsistencies in the quantification. For the validation of the quantification with XPS, an interlaboratory comparison was initiated under the auspices of the "Versailles Project on Advanced Materials and Standards" (VAMAS). First results confirm that the sample preparation method (pellet vs. powder) clearly influences the quantification results.
International standards describing reliable protocols will facilitate the commercialization of graphene and related 2D materials. One physico-chemical key property, next to flake size and thickness, is the chemical composition of the material. Therefore, an ISO standard is under development with X-ray photoelectron spectroscopy having a prominent role. With its information depth of around 10 nm, which is on a similar length scale as the thickness of particles of 2D materials consisting of a few monolayers, XPS seems to be highly suitable for this purpose. Different sample preparation methods, like pressing the powders onto adhesive tapes, into recesses, or into solid pellets, result in inconsistencies in the quantification. For the validation of the quantification with XPS, an interlaboratory comparison was initiated under the auspices of the "Versailles Project on Advanced Materials and Standards" (VAMAS). First results confirm that the sample preparation method (pellet vs. powder) clearly influences the quantification results. Taking this effect into account, good agreement of the results from the different participants was observed. Similar results were observed for raw, N- and F-functionalized graphene.
Modular chemical production is a tangible implementation of the digital transformation of the specialty chemicals process industry. In particular, it enables acceleration of process development and thus faster time to market by flexibly interconnecting and orchestrating standardized physical modules and bringing them to life. For this purpose, specific (chemical) sensors of process analytics are needed, preferably without lengthy calibration or spectroscopic model development.
An excellent example of a "direct" analytical method is online nuclear magnetic resonance (NMR) spectroscopy. NMR spectroscopy meets the requirements of a direct analytical method because of the direct correlation between the signal area in the spectrum ("counting" the nuclear spins) and the analyte's amount-of-substance concentration. It is also extremely linear over the concentration range.
With the availability of compact benchtop NMR instruments, it is now possible to bring NMR spectroscopy directly into the field, in close proximity to specialized laboratory facilities, pilot plants, and even industrial-scale production facilities. The first systems are at TRL 8 (qualified system with proof of functionality in the field).
The presentation will discuss the many building blocks of online nuclear magnetic resonance spectroscopy, from flow cells to automated data analysis.
Moisture testing and mapping
(2023)
Luminescent Nanoparticles – From Photophysics to the Measurement of Photoluminescence Quantum Yields
(2023)
Inorganic nanocrystals with linear and nonlinear photoluminescence in the ultraviolet, visible, near infrared, and short-wave infrared, such as spectrally shifting lanthanide-based nanoparticles (LnNCs) like NaYF4:Yb,Er and semiconductor quantum dots, have meanwhile found applications in the life and material sciences, ranging from optical reporters for bioimaging and sensing via security barcodes to solid state lighting and photovoltaics. The identification of optimum particle architectures for photonic applications requires quantitative spectroscopic studies, ideally flanked by single particle studies to assess spectroscopic inhomogeneities on a particle-to-particle level for typical preparation methods. In the following, photoluminescence studies of LnNCs are presented, addressing parameters such as particle size, surface coating, and dopant ion concentration as well as excitation power density, which are mandatory for a profound mechanistic understanding of the nonradiative deactivation pathways in these nanocrystals. In addition, methods for the determination of particle brightness and photoluminescence quantum yield in different spectral windows are presented.
This talk introduces the expanded view that comes from wide-range X-ray scattering investigations.
Compared to X-ray diffraction studies alone, the additional angular range of this technique provides information on the larger structural dimensions present in your samples. This allows for the extraction of information on the size and size distribution of nanostructural components, such as nanoparticles, nanovoids, and any other structure exhibiting an electron density contrast.
The talk introduces the technique and the MOUSE instrument used for these investigations, and provides several real-world examples of its uses. In the latter segment of the talk, the audience is invited to choose which examples capture their interest from a range of options.
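As a hedged illustration of how a size estimate can be pulled out of the small-angle part of such data (one textbook route, not necessarily the MOUSE analysis workflow), a Guinier fit extracts the radius of gyration from the low-q slope:

```python
import numpy as np

# Guinier approximation: I(q) ≈ I0 * exp(-(q * Rg)^2 / 3), valid roughly for q * Rg < 1.3.
rg_true, i0_true = 5.0, 100.0                      # nm and arbitrary units, synthetic "sample"
q = np.linspace(0.05, 0.25, 40)                    # 1/nm, low-q region
rng = np.random.default_rng(2)
intensity = i0_true * np.exp(-(q * rg_true) ** 2 / 3) * (1 + 0.01 * rng.standard_normal(q.size))

# Linearize: ln I = ln I0 - (Rg^2 / 3) * q^2, and fit a line over the valid q-range.
# (Here the known Rg picks the range; in practice the q*Rg limit is checked iteratively.)
mask = q * rg_true < 1.3
slope, intercept = np.polyfit(q[mask] ** 2, np.log(intensity[mask]), 1)
rg_fit = np.sqrt(-3 * slope)
print(f"fitted radius of gyration: {rg_fit:.2f} nm (true value {rg_true} nm)")
```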
Utilizing Aspergillus niger Fumonisin Amine Oxidase for the Electrochemical Detection of Fumonisin
(2023)
Fumonisins are a class of toxic secondary metabolites produced by various Fusarium species. The two most important producers of fumonisins are F. verticillioides and F. proliferatum, but Aspergillus niger is also known to produce fumonisins. Most frequently they occur on maize, but other grains can also be contaminated with this group of mycotoxins. Exposure to fumonisins by dietary intake can have serious health effects on farm animals and also on humans. Thus, the European Commission sets legal limits for fumonisins in feed and foodstuffs. The detection of fumonisins is frequently performed in laboratories by chromatographic methods, which are costly and require trained personnel. Simplifying the analysis using portable detection systems is therefore a major goal. Electrochemical enzymatic biosensors offer great promise to meet this demand. Here we report for the first time an enzymatic fumonisin sensing approach with amperometric detection. For this purpose, an Aspergillus niger fumonisin amine oxidase (AnFAO) catalyzing the oxidative deamination of fumonisins, producing hydrogen peroxide, was recombinantly produced in E. coli. The specific activity of AnFAO was found to be higher for 20 μM fumonisin B1 as substrate than for 20 μM fumonisin B2, with 0.122 U mg-1 and 0.058 U mg-1, respectively. A dependence of the enzyme activity on enzyme and substrate concentration was shown. For fumonisin B1 detection, the enzyme was coupled covalently to magnetic particles and the enzymatically produced H2O2 was detected amperometrically in a flow injection system using Prussian blue carbon electrodes. The developed method allows fumonisin B1 concentrations down to 1.5 µM to be quantified and demonstrates that the recombinantly produced AnFAO was able to deaminate different concentrations of fumonisin even in immobilized form. Thus, this enzyme is well suited to developing an enzyme-based electrochemical biosensor for fumonisin-contaminated food and feed.
Per- and polyfluoroalkyl substances (PFASs) are a group of anthropogenic contaminants associated with persistent, bioaccumulative and toxic properties. Mostly, target-based approaches (e.g., LC-MS/MS) are utilized for the analysis of PFASs in the environment. Target approaches are limited to a few selected PFASs and therefore underestimate the total PFAS burden. Analytical approaches based on total fluorine for PFAS sum parameter analysis become increasingly important to indicate realistic PFAS pollution levels.
Recently, high-resolution continuum source graphite furnace molecular absorption spectrometry (HR-CS-GFMAS) has turned out to be a sensitive and highly selective tool for fluorine determination. The method is based on the in situ formation of diatomic gallium monofluoride (GaF) in a graphite furnace at a temperature of 1550°C. The molecular absorption of GaF can be detected at its most sensitive wavelength at 211.248 nm, providing limits of quantification in the low µg F/L range. HR-CS-GFMAS analysis can be combined with the extractable organically bound fluorine (EOF) approach, whereby PFASs are extracted from liquid or solid samples using organic solvents and/or solid phase extraction (SPE).
In this presentation, the applicability of HR-CS-GFMAS for organic fluorine analysis in various environmental samples including (1) water samples, (2) soil samples and (3) plant samples is demonstrated.
(1) We investigated EOF concentrations in water bodies in Berlin, Germany, and used additional PFAS target analysis for a PFAS mass balance approach. EOF concentrations were in the expected range for an urban river system. However, downstream of an effluent discharge, the EOF increased by one order of magnitude from 40 to 574 ng F/L. Target analysis determined mostly short-chained perfluorinated carboxylic acids and sulfonic acids, which, however, made up less than 10% of the EOF (the fluorine mass balance behind this comparison is sketched after item (3) below). This study highlights that EOF screening using HR-CS-GFMAS is useful and advantageous compared to target analysis to identify pollution sites in urban water systems.
(2) For soil samples, we optimized a fast and simple PFAS extraction method for EOF determination. The developed extraction method consists of a liquid-solid extraction without any additional SPE for fluoride removal. We investigated different soil samples using the optimized method with and without an additional SPE clean-up step and revealed a drastic underestimation of EOF mass fractions using SPE. The optimized method is a valuable screening tool for fast PFAS monitoring.
(3) For plant samples, we conducted a study on the uptake and fate of PFASs in bean plants. For PFAS mass balancing, HR-CS-GFMAS analysis was combined with LC-MS/MS analysis. PFASs were spiked as mixtures of known and unknown composition. Short-chained PFASs were determined with high mass fractions mainly in the fruits of the investigated plants, while long-chained PFASs were mainly determined in roots. Overall, both methods give comparable results, with target analysis being more reliable for known PFAS contamination and EOF/HR-CS-GFMAS analysis being more valuable to identify PFAS exposure of unknown composition.
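The mass-balance comparison mentioned in item (1) amounts to converting each target-analyte concentration into its fluorine equivalent and comparing the sum with the EOF. The sketch below uses invented target concentrations (only the EOF value of 574 ng F/L is taken from the text above):

```python
M_F = 19.00  # molar mass of fluorine in g/mol

# (formula mass in g/mol, number of F atoms) for a few common target analytes
pfas = {
    "PFBA": (214.04, 7),
    "PFBS": (300.10, 9),
    "PFOA": (414.07, 15),
    "PFOS": (500.13, 17),
}

# Invented target concentrations in ng/L (NOT the values measured in this study).
target_ng_per_l = {"PFBA": 12.0, "PFBS": 20.0, "PFOA": 5.0, "PFOS": 8.0}
eof_ng_f_per_l = 574.0   # EOF downstream of the discharge, as reported in item (1)

# Convert each analyte concentration to its fluorine equivalent and compare the sum with the EOF.
f_from_targets = sum(c * pfas[name][1] * M_F / pfas[name][0] for name, c in target_ng_per_l.items())
print(f"fluorine from target analytes: {f_from_targets:.0f} ng F/L "
      f"({100 * f_from_targets / eof_ng_f_per_l:.0f} % of the EOF)")
```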
Since its isolation, graphene has received growing attention from academia and industry due to its unique properties. Promising opportunities for applications are discussed in different fields like electronics and optoelectronics, detection and sensing devices, biosystems, or chemical and environmental corrosion inhibition. Here, functionalization with elements like oxygen, nitrogen or fluorine can broaden the applications, for example in composite materials. However, the lack of generally accepted operating procedures hinders commercialization, the so-called "what is my material" barrier. Therefore, first efforts were made to develop common, reliable, and reproducible ways to characterize the morphological and chemical properties of the industrially produced material.
In this contribution, our efforts in the development of reliable chemical characterization protocols for functionalized graphene are presented. An ISO standard for the chemical characterization of graphene-related materials (GRM) is under development with X-ray photoelectron spectroscopy (XPS) having a prominent role. With its information depth of around 10 nm, which is on a similar length scale as the thickness of particles of 2D materials consisting of a few monolayers, XPS seems to be highly suitable for the quantitative analysis of (functionalized) GRM. However, different sample preparation methods, like pressing the powders onto adhesive tapes, into recesses, or into solid pellets, result in inconsistencies in the quantification. Furthermore, different morphologies, like stacks of graphene layers or irregular particles, lead to different analysis results for the chemical composition.
For the validation of the quantification with XPS and the further development of standards, an international interlaboratory comparison was initiated under the auspices of the "Versailles Project on Advanced Materials and Standards" (VAMAS). First results are reported showing the suitability of the protocols. Finally, the XPS results are compared with the elemental composition results obtained after quantification with energy-dispersive X-ray spectroscopy (EDS), a fast analytical method which is usually combined with electron microscopy.
McSAS3 is a refactored software package for fitting large batches of (X-ray or neutron) scattering data. It uses a Monte Carlo acceptance-rejection algorithm to optimize model parameters, which is ideal for the analysis of size-disperse scatterers.
The refactored code can exploit multiprocessing, traceably stores (multiple) results in the output file, and allows re-histogramming of previous optimizations. Besides the analysis of large batches, it can also be integrated into automated data processing pipelines.
The live demonstration will show how to use the software, what its limitations are, and what outcomes can look like for batches of results.
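To give a flavour of the Monte Carlo acceptance-rejection idea (a self-contained toy with synthetic sphere data, not McSAS3's actual implementation or API), one radius in a set of contributions is replaced at a time and the change is kept only if the fit to the data improves:

```python
import numpy as np

def sphere_intensity(q, radii):
    """Scattering intensity of a dilute set of spheres (volume-weighted form factors, summed)."""
    qr = np.outer(q, radii)
    form = 3 * (np.sin(qr) - qr * np.cos(qr)) / qr**3
    volumes = 4 / 3 * np.pi * radii**3
    return ((volumes * form) ** 2).sum(axis=1)

rng = np.random.default_rng(0)
q = np.logspace(-2, 0, 100)                               # 1/nm
data = sphere_intensity(q, rng.normal(10.0, 2.0, 300))    # synthetic "measurement"
sigma = 0.02 * data                                       # assumed uncertainties

def reduced_chi2(radii):
    model = sphere_intensity(q, radii)
    scale = (data * model / sigma**2).sum() / ((model / sigma) ** 2).sum()
    return (((data - scale * model) / sigma) ** 2).mean()

# Acceptance-rejection: replace one sphere radius at a time, keep the move only if chi^2 improves.
radii = rng.uniform(1.0, 30.0, 300)
chi2 = reduced_chi2(radii)
for _ in range(3000):
    trial = radii.copy()
    trial[rng.integers(radii.size)] = rng.uniform(1.0, 30.0)
    trial_chi2 = reduced_chi2(trial)
    if trial_chi2 < chi2:
        radii, chi2 = trial, trial_chi2

print(f"reduced chi^2 ≈ {chi2:.2f}; volume-weighted mean radius ≈ "
      f"{np.average(radii, weights=radii**3):.1f} nm")
```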
In this study, we present an enhanced deep learning framework for the prediction of porosity based on thermographic in-situ monitoring data of laser powder bed fusion processes. The manufacturing of two cuboid specimens from Haynes 282 (Ni-based alloy) powder was monitored by a short-wave infrared camera. We use thermogram feature data and X-ray computed tomography data to train a convolutional neural network classifier. The classifier is used to perform a multi-class prediction of the spatially resolved porosity level in small sub-volumes of the specimen bulk.
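A hedged sketch of the kind of convolutional classifier described above is shown below; the architecture, patch size, feature channels and number of porosity classes are assumptions for illustration, not the network used in the study:

```python
import torch
import torch.nn as nn

class PorosityCNN(nn.Module):
    """Small CNN mapping a stack of thermogram feature maps to discrete porosity classes."""
    def __init__(self, in_channels=3, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64), nn.ReLU(),
            nn.Linear(64, n_classes),              # one logit per porosity level
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Toy training step: 32x32 pixel patches with 3 assumed feature channels (e.g. maximum temperature,
# time over threshold, integrated intensity); labels would come from registered CT porosity data.
model = PorosityCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

patches = torch.randn(16, 3, 32, 32)        # stand-in for thermogram feature patches
labels = torch.randint(0, 4, (16,))         # stand-in for CT-derived porosity classes

optimizer.zero_grad()
loss = loss_fn(model(patches), labels)
loss.backward()
optimizer.step()
print(f"toy training loss: {loss.item():.3f}")
```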
While the synthesis of Metal-Organic Framework (MOF) particles can be as easy as adding two solutions together, reproducibly obtaining the same particles, time and time again, is a lot harder. As laboratory-independent reproducibility is a cornerstone of the scientific method, we must put effort into finding and controlling all necessary parameters to achieve this.
An open-source Python/EPICS-controlled robotic platform was adapted to systematically explore this for a 20 ml MOF synthesis of the Zeolitic Imidazolate Framework-8 (ZIF-8) chemistry in methanol. Parameters that were explored included: 1) addition sequence, 2) addition speeds, 3) reaction times, 4) source chemicals, 5) stirring speeds, 6) stirring bar choice, 7) starting concentrations, and 8) workup methodologies. It was found that, by controlling these parameters, highly reproducible syntheses are obtained. Secondly, the variation of these parameters alone led to a dramatic difference in volume-weighted particle size means, exceeding an order of magnitude, as investigated by our in-house X-ray scattering instrument [1].
The syntheses are thoroughly documented in an automated fashion, and the synthesis libraries as well as the analysis libraries will become available in batches soon. With this library, it will be possible to extract previously unknown correlations, and other laboratories can produce specific particles by following the exact procedures for the particles of their choice.
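One way such a parameter exploration can be made systematic and self-documenting is to enumerate the grid up front and emit machine-readable metadata for every planned run; the sketch below uses invented parameter levels, not the settings used on the actual platform:

```python
import itertools
import json

# Illustrative parameter levels for a hypothetical 20 ml ZIF-8 synthesis screen
# (placeholder values, not the settings used on the actual platform).
parameters = {
    "addition_sequence": ["metal_into_linker", "linker_into_metal"],
    "addition_speed_ml_per_min": [2.0, 10.0],
    "reaction_time_min": [10, 60],
    "stirring_speed_rpm": [300, 700],
    "zinc_concentration_mM": [25, 100],
}

# Enumerate the full grid; every planned run becomes a machine-readable metadata record.
runs = [
    dict(zip(parameters, combination), run_id=i)
    for i, combination in enumerate(itertools.product(*parameters.values()))
]

print(f"{len(runs)} planned syntheses")
print(json.dumps(runs[0], indent=2))
```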
Robot-assisted laser thermography for surface breaking crack detection on complex shaped components
(2023)
Laser thermography using a focused (spot or line) beam has proved to be effective for the detection of surface breaking cracks on planar samples. In this work, we use the same principle, but applied to complex shaped components such as a rail cross-section, a gear, and a gas turbine blade. We use a six-axis robot to move the sample in front of our thermographic setup. Several scanning-path and thermographic parameters are explored: scanning speed, density of points in each scanning slice, laser power, and camera frame rate. Additionally, we explore semi-automatic evaluation algorithms for crack detection, as well as 2D-to-3D registration of the found indications.
The use of benchtop NMR instruments has been constantly increasing in recent years. The advantages of being affordable, portable and easy to operate without the need for trained staff make them especially interesting for industrial applications in quality control. However, applications of NMR spectroscopy as an online PAT tool are still very rare but offer a huge potential for process optimization and control. A key task to exploit this potential is the hardware field integration of the lab instruments into the rough environment of a chemical plant. Additionally, developments in automation and data evaluation are mandatory to ensure robust unattended operation with low maintenance requirements. Here, we show an approach for a fully automated analyzer enclosure considering explosion safety, field communication, and environmental conditions in the field.
Temperature sensitivity is still a limitation of benchtop NMR instruments in flow applications. Recent developments by manufacturers allow for limited operation at static temperature levels; however, a dynamic system for continuous operation is still not available. Using a prototype system offering a larger bore, active temperature shielding studies with thermostated air were performed to evaluate the performance.
Automated data evaluation of NMR spectra using a modular indirect hard modeling (IHM) approach showed good results and flexibility. A second data analysis approach based on artificial neural networks (ANN) was evaluated. For this purpose, the amount of data was augmented to be sufficient for training. The results show comparable performance while improving the calculation time tremendously, offering new ways to simultaneously evaluate large numbers of different models.
Introduction to microplastics: definition, relevance, analytical approaches & challenges. TED-GC/MS working principle, features & performance. Optimisation of the TED-GC/MS methodology, new validation data. Replacement of the internal standard, outlook towards an upcoming publication and application within the EU's revised drinking water directive.
Introduction
Per- and polyfluorinated alkyl substances (PFASs) are a group of over 4730 individual compounds. Several PFASs are extremely persistent, bioaccumulative and toxic. The analysis of PFASs is challenging because of their various chemical and physical properties as well as the high number of compounds. Target-based approaches (e.g., LC-MS/MS) are limited by the availability of analytical grade standards and are not suitable for the analysis of new/unknown PFASs and transformation products. Therefore, PFAS sum parameter methods become increasingly important to indicate realistic PFAS pollution levels.
Methods
For the instrumental analysis of such sum parameters, a fluorine-selective detector is needed. In our study we used high-resolution continuum source graphite furnace molecular absorption spectrometry (HR-CS-GFMAS), which is a sensitive and highly selective tool for fluorine determination. The method is based on the in situ formation of diatomic gallium monofluoride (GaF) in a graphite furnace at a temperature of 1550°C. The molecular absorption of GaF can be detected at its most sensitive wavelength at 211.248 nm, providing a limit of quantification of c(F) = 2.7 µg/L.
Results
Here, we present an improved method for the determination of PFASs using HR-CS-GFMAS via GaF detection. The optimized method includes a modifier pretreatment step using a mixture of Mg, Pd and Zr and a correction measurement using perfluorooctanoic acid. The combination of both resulted in increased accuracy and precision as well as overall lower detection limits. Furthermore, during optimization the influence of species-specific responses during HR-CS-GFMAS analysis was reduced resulting in a more accurate determination of PFAS sum parameters. To test the applicability of the improved method, we analysed soil samples from a former fire-fighting training area combining the improved method for detection with our previously optimized extraction method for extractable organically bound fluorine (EOF) determination in soils.
Innovative aspects
• Highly sensitive and selective method for fluorine/PFASs analysis based on HR-CS-GFMAS
• Increased accuracy for the determination of EOF
• Reduction of PFAS species-specific responses by optimized modifier conditions
X-ray photoelectron spectroscopy (XPS) allows simultaneous irradiation and damage monitoring. Although water radiolysis is essential for radiation damage, all previous XPS studies were performed in vacuum. Here we present near-ambient-pressure XPS experiments to directly measure DNA damage under a water atmosphere. They permit in-situ monitoring of the effects of radicals on fully hydrated double-stranded DNA. Our results allow us to distinguish direct damage, by photons and secondary low-energy electrons (LEE), from damage by hydroxyl radicals or hydration-induced modifications of damage pathways. The exposure of dry DNA to X-rays leads to strand breaks at the sugar-phosphate backbone, while deoxyribose and nucleobases are less affected. In contrast, a strong increase of DNA damage is observed in water, where OH radicals are produced. In consequence, base damage and base release become predominant, even though the number of strand breaks increases further.
The reference method for obtaining absolute isotope ratios is still the isotope mixture approach. Due to the huge effort required, the full isotope mixture approach is applied by only a few institutes worldwide. To enable an IRWG key comparison with a sufficiently large number of participants, a proposal for absolute Cu isotope ratios is presented in which participants will be provided with the enriched isotopes, the isotope mixtures and the samples. In parallel, a pilot study will be organized where alternative approaches for obtaining absolute Cu isotope ratios can be applied.
IRWG strategy update
(2023)
Per- and polyfluoroalkyl substances (PFAS) are a group of anionic, cationic and zwitterionic synthetic products, in which the hydrogen atoms on the carbon skeleton of at least one carbon atom have been completely replaced by fluorine atoms, and which include more than 4730 compounds, depending on the definition. As a result of continuous and prolific use, mainly in aviation firefighting foams, thousands of industrial and military installations have been found to contain contaminated soil, groundwater and surface water. Furthermore, current decontamination strategies for PFAS-burdened soils mainly consist of adsorption methods using adsorbents for fixation of PFAS in the ground. Hence, there is a great demand for innovative developments and chemical treatment technologies dealing with new strategies for tackling the PFAS problem. Thus, we investigated the mechanochemical treatment of PFAS-contaminated soils with various additives in a ball mill and analyzed the PFAS defluorination. In this presentation, the advantages of fluorine K-edge X-ray absorption near-edge structure (XANES) spectroscopy for various environmental samples are shown.
However trivial or provocative it may sound, the best neutron spectrometer in the world does not produce science and technology by itself. By definition of "Materials Science", neutron scattering data on engineering materials must be used as a tool to understand, and even tailor, materials performance. In order for this to happen, neutron data need to be
1. Acquired under the most relevant conditions possible
2. Coupled to other experimental techniques
3. Capitalized by means of proper simulations and data analysis
Point 1 calls for intense use and the development of top-notch in-situ techniques. Point 2 means that the sole use of neutron data will not lead to the solution of any global problem. All points above hint at the fact that access to neutron sources is not routine, and it is therefore imperative to search for ways to make neutron data worthwhile and sustainable for the materials science and industrial research community.
In this presentation, based on two examples, we will show a couple of strategies to combine neutron data with other experiments and with theoretical models, to raise the validity of experiments to the level of problem-solving. As one might imagine, these are only a few among the almost infinite possible combinations that can help improve material properties, performance, and safety, i.e., make neutron data ripe for everyday use.
Inorganic nanocrystals with linear and nonlinear luminescence in the ultraviolet, visible, near infrared and short-wave infrared, like semiconductor quantum dots and spectrally shifting lanthanide-based nanophosphors, have meanwhile found many applications in the life and material sciences. These include optical reporters for bioimaging and sensing, security and authentication barcodes, solid state lighting, converter materials, and photovoltaics. The identification of optimum particle structures requires quantitative spectroscopic studies under application-relevant conditions, focusing on the key performance parameter photoluminescence quantum yield, ideally flanked by single particle studies to assess spectroscopic inhomogeneities on a particle-to-particle level for typical preparation methods. In this context, methods to quantify the photoluminescence of these different nanoscale emitters are shown and utilized as a basis for a profound mechanistic understanding of the nonradiative deactivation pathways in semiconductor and upconversion nanocrystals of different size and particle architecture in different environments. As an example of the application potential of such nanomaterials, the design of optical sensors from different nanomaterials and functional organic dyes is additionally summarized briefly.
Laser-induced breakdown spectroscopy (LIBS) is a spectroscopic method for detecting the chemical composition of optically accessible surfaces. In principle, the measurement of all elements of the periodic table is possible. System calibrations allow the quantification of element concentrations. In combination with scanner systems, the two-dimensional element distribution can be determined. Even rough surfaces can be measured by online adjustment of the laser focus. To detect element ingress into the concrete, typically cores are taken, cut in half, and LIBS measurements are performed on the cross-section. The high spatial resolution as well as the simultaneous multi-element analysis enables a separate evaluation of the binder-matrix and aggregates. Therefore, the element concentrations can be determined directly related to the cement paste. LIBS measurements are applicable in the laboratory, on-site and also over a distance of several meters.
Common applications include the investigation of material deterioration due to the ingress of harmful ions and their interaction in porous building materials. LIBS is able to provide precise input parameters for the simulation and modelling of the remaining lifetime of a structure. Besides the identification of materials, their composition can also be determined on hardened concrete, such as the type of cement or type of aggregate. This also involves the identification of environmentally hazardous elements contained in concrete. Another possible application is the detection of the composition of material flows during dismantling. Non-contact NDT for "difficult to assess" structures, for example through safety glass or in combination with robotics and automation, is also possible.
This work presents the state of the art concerning LIBS investigations on concrete by showing exemplary laboratory and on-site applications.
Introduction
NMR spectroscopy is one of the most important analytical methods in organic chemistry. While most analyses are carried out qualitatively with the aim of substance identification and structure elucidation, quantitative NMR spectroscopy (qNMR) is increasingly gaining importance in research and industry. qNMR provides the most universally applicable form of direct purity determination, without the need for reference materials of the impurities or the calculation of response factors; the analytes only have to exhibit suitable NMR properties.
Methods
One of the most attractive features of quantitative NMR spectroscopy is that the NMR peak areas can be used directly for concentration quantification without further calibration. Another advantage of NMR spectroscopy is that the method has a high linearity between absolute signal area and sample concentration, which makes it an absolute analytical comparison method that is independent of the matrix. This enables automated robust data evaluation strategies that can be used for online applications of qNMR spectroscopy.
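This direct proportionality leads to the standard internal-standard relation used in qNMR purity determination; the numerical sketch below uses made-up masses and integrals (caffeine against dimethyl sulfone is only an illustrative pairing):

```python
def qnmr_purity(integral_analyte, integral_std, n_h_analyte, n_h_std,
                molar_mass_analyte, molar_mass_std, mass_analyte_mg, mass_std_mg, purity_std):
    """Analyte mass fraction from relative 1H signal areas against an internal standard:
    P_a = (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s."""
    return (integral_analyte / integral_std) * (n_h_std / n_h_analyte) \
        * (molar_mass_analyte / molar_mass_std) * (mass_std_mg / mass_analyte_mg) * purity_std

# Made-up example: caffeine quantified against dimethyl sulfone (a common qNMR standard).
purity = qnmr_purity(
    integral_analyte=0.320, integral_std=1.000,   # normalized peak areas from the spectrum
    n_h_analyte=3, n_h_std=6,                     # protons contributing to each evaluated signal
    molar_mass_analyte=194.19, molar_mass_std=94.13,
    mass_analyte_mg=20.05, mass_std_mg=15.02,
    purity_std=0.9999,
)
print(f"analyte purity: {100 * purity:.1f} %")
```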
Jancke et al. proposed NMR spectroscopy as a relative primary analytical method because it can be fully described by mathematical equations from which a complete uncertainty budget can be derived, allowing it to be used at the highest metrological level. Weber et al. discussed in detail important aspects of the procedure that enable the realisation of low measurement uncertainties in qNMR measurements. Since certification of CRM requires expanded measurement uncertainties of less than 0.5 % (relative), the work of Weber et al. demonstrated for the first time that qNMR can fulfil this criterion.
Results
To date, further comparative studies have been carried out in metrology and industry, demonstrating the performance of quantitative NMR spectroscopy and further reducing measurement uncertainties. The development of validation concepts and the commercial availability of suitable certified reference materials facilitate the application, especially in the usually highly regulated industrial environment. Users can thus accelerate the development of analytical methods. The talk covers a wide range of topics from current metrological activities to new challenges for qNMR spectroscopy and also addresses aspects such as validation and accreditation.
Innovative aspects
• qNMR provides the most universally applicable form of direct purity determination
• Expanded measurement uncertainties lower than 0.15 % (relative) possible
• Benchtop NMR instruments increasingly used for qNMR spectroscopy
Destructive and non-destructive 3D-characterization of inner metal structures in ceramic packages
(2023)
Ceramic multilayer packages provide successful solutions for manifold applications in telecommunication, microsystem, and sensor technology. In such packages, three-dimensional circuitry is generated by combining structured and metallized ceramic layers by means of tape casting and multilayer technology. During development and for quality assurance in manufacturing, characterization of the integrity, deformation, and positioning of the inner metal features is necessary.
Visualization with high resolution and material contrast is needed.
Robot-assisted 3D materialography is a useful technique to characterize such multi-material structures. In this approach, many sections of the specimen are polished and imaged automatically. A three-dimensional representation of the structure is created by digital combination of the image stack. A quasi non-destructive approach is to perform X-ray computed tomography (CT) with different beam energies. The energies are chosen to achieve good imaging of either the metal features or the ceramic matrix of the structure. The combination of the respective tomograms results in a high-contrast representation of the entire structure. Both methods were tested to characterize Ag and Ag/Pd conductors in a ceramic multilayer package. The results were compared in terms of information content, effort, and applicability of the methods.
The rails of modern railways face enormous wear and tear from ever increasing train speeds and loads. This necessitates diligent non-destructive testing of the entire railway system for defects.
Non-destructive testing of rail tracks is carried out by rail inspection trains equipped with ultrasonic and eddy current test devices. However, the evaluation of the gathered data is mainly done manually with a strong focus on ultrasonic data, and defects are checked on-site using hand-held testing equipment. Maintenance measures are derived based on these on-site findings.
The aim of the AIFRI project (Artificial Intelligence For Rail Inspection) is to
- increase the degree of automation of the inspection process, from the evaluation of the data to the planning of maintenance measures,
- increase the accuracy of defect detection,
- automatically classify detected indications into risk classes.
These aims will be achieved by training a neural network for defect detection and classification. Since the current testing data are unbalanced, insufficiently labeled and largely unverified, we will supplement them with fused, simulated eddy current and ultrasonic testing data in the form of a configurable digital twin.
Non-destructive testing of rail tracks is carried out by using rail inspection cars equipped with ultrasonic and eddy current measurement. The evaluation of test data is mainly done manually, supported by a software tool which pre-selects relevant indications shown to the evaluators. The resulting indications have to be checked on-site using hand-held testing equipment. Maintenance interventions are then derived on the basis of these on-site findings.
The overall aim of the AIFRI (Artificial Intelligence For Rail Inspection) project, funded by the German Federal Ministry of Digital and Transport (BMDV) as part of the mFUND programme under funding code 19FS2014, is to increase the degree of automation of the inspection process, from the evaluation of the data to the planning of maintenance interventions. The accuracy of defect detection shall be increased by applying AI methods in order to enable an automated classification of detected indications into risk classes. For this purpose, data from both eddy current and ultrasonic inspections will be used in combination.
Within the framework of this data-driven project, relevant defect patterns and artefacts present in the rail are analysed and implemented into a configurable digital twin. With the help of this digital twin, virtual defects can be generated and used to train AI algorithms for detection and classification. With the help of a reliability assessment, the trained AI algorithms will be evaluated with regard to the resulting quality of defect detection and characterisation.
A particular aspect of the development of the AI methods is the fusion of different NDT data sources: synergies that arise from linking eddy current and ultrasonic inspection data are exploited in a combined model.
In the course of the project a demonstrator consisting of the developed IT-tool and an asset management system will be implemented and tested in the field using real-world data.
The European project MefHySto addresses the need for large-scale energy storage, which is required for the shift to a renewable energy supply. The project is funded by the European Metrology Programme on Innovation and Research (EMPIR) and consists of 14 consortium partners from all over Europe (www.mefhysto.eu). It is demonstrated how MefHySto contributes to the EMN for Energy Gases, aiming at the prioritisation of measurement gaps and challenges in interaction with the EMN stakeholders.
Talk on the calculation of measurement uncertainties and on POD analysis for preparing NDT results for the reliability assessment of existing structures.
Determination of absolute (SI‐traceable) isotope ratios: The use of Gravimetric Isotope Mixtures
(2023)
The presentation is a brief overview of how gravimetric isotope mixtures can be used to determine SI-traceable isotope ratios. No mass spectrometer on earth directly measures isotope ratios; mass spectrometers always measure signal intensity ratios instead. The actual problem is that the measured intensity ratios differ more or less from the isotope ratios. The difference can exceed 10 % in the case of lithium, while it is below 1 % for heavier elements such as lead or uranium. Consequently, the signal intensity ratios are expressed, for example, in V/V depending on the type of mass spectrometer used, while the isotope ratios are expressed in mol/mol. This phenomenon is called instrumental isotopic fractionation (IIF for short), although the more common name is still mass bias (even though this name is not entirely correct). To convert the measured ratio into the isotope ratio, a simple multiplication with a so-called correction factor (K factor for short) is usually performed. The problem is therefore to determine the K factor. In the absence of isotope reference materials, the golden route is via gravimetric isotope mixtures, which will be explained within the presentation.
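A minimal sketch of this correction, with symbols chosen here for illustration rather than taken from the talk:

```latex
% IIF (mass bias) correction with a K factor determined from a gravimetric mixture
R_{\mathrm{true}} = K \cdot R_{\mathrm{obs}}, \qquad
K = \frac{R_{\mathrm{grav}}}{R_{\mathrm{obs,grav}}}
```

R_obs is the measured intensity ratio of the sample, R_grav the isotope amount ratio of a gravimetric mixture of enriched isotope materials (known from the weighed masses and their characterisation), and R_obs,grav the intensity ratio measured on that mixture; K then converts measured intensity ratios into isotope amount ratios.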
In industrialised countries more than 80% of the time is spent indoors. Products, such as building materials and furniture, emit volatile organic compounds (VOCs), which are therefore ubiquitous in indoor air. Different VOC combinations may, under certain environmental and occupational conditions, result in reported sensory irritation and health complaints. A healthy indoor environment can be achieved by controlling the sources and by eliminating or limiting the release of harmful substances into the air. One way is to use materials proven to be low emitting. Meanwhile, a worldwide network of professional commercial and non-commercial laboratories performing emission tests for the evaluation of products for interior use has been established. Therefore, comparability and metrological traceability of test results must be ensured. A laboratory’s proficiency can be proven by internal and external validation measures that both include the application of suitable reference materials. The emission test chamber procedure according to EN 16516 comprises several steps from sample preparation to sampling of test chamber air and chromatographic analysis. Quality assurance and quality control (QA/QC) must therefore be ensured. Currently, there is a lack of suitable reference products containing components relevant for the health-related evaluation of building products.
The EU-funded EMPIR project 20NRM04 MetrIAQ (Metrology for the determination of emissions of dangerous substances from building materials into indoor air) aims to develop 1) gaseous primary reference materials (gPRM), which are used for the certification of gaseous (certified) reference materials (gCRM) and 2) emission reference materials (ERM).
Most commercial gas standards of indoor-relevant compounds are not certified due to the lack of primary reference materials to which the project aims to contribute. The gPRM under development is a gas-phase standard containing trace levels of VOCs in nitrogen or air from the check standard according to EN 16516 (n-hexane, methyl isobutyl ketone, toluene, butyl acetate, cyclohexanone, o-xylene, phenol, 1,3,5-trimethylbenzene) with a target uncertainty of 5 %. The gPRM can be sampled into sorbent tubes to obtain transfer standards in the form of gCRM.
The well characterised ERM represents a sample of a test specimen, e.g. building material, that is loaded into the emission test chamber for a period of several days and is used to evaluate the whole emission test chamber procedure. It shall have a reproducible and temporally constant compound release of less than 10 % variability over 14 days. Different approaches for retarded VOC release, such as the encapsulation of pure compounds and the impregnation of porous materials, are being tested to reach this aim. Furthermore, the design of the ERM is accompanied by the development of a numerical model for the prediction of the emissions for each of the target VOCs. The current progress of the work on both materials will be presented.
Almost all building materials in civil engineering have an open porosity and interact with or are affected by the environmental conditions. Structures might suffer from effects such as moisture adsorption, carbonation, corrosion, penetration of salt ions and chemical substances, etc. In the hygroscopic range, these processes are mostly driven by diffusion. Due to the confinement of small pores (< 1 µm), the Knudsen effect reduces the molecular diffusion. This reduction can become more significant in the case of temporally changing pore systems because of physisorption of water vapor, carbonation, or chemisorption.
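For orientation, the Knudsen contribution in a pore of diameter d_p and its combination with molecular diffusion can be sketched with the standard relations below (generic textbook forms, not taken from the abstract):

```latex
% Knudsen diffusion coefficient and Bosanquet interpolation (standard relations)
D_{\mathrm{K}} = \frac{d_{\mathrm{p}}}{3}\sqrt{\frac{8RT}{\pi M}}, \qquad
\frac{1}{D_{\mathrm{eff}}} \approx \frac{1}{D_{\mathrm{mol}}} + \frac{1}{D_{\mathrm{K}}}
```

Here M is the molar mass of the diffusing species (water vapor); the smaller the pore, the more the Knudsen term limits the effective diffusivity.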
In this study, unstabilised earth blocks and earth masonry are investigated. In a first step, the pore size distribution of the blocks is measured and sorption isotherms are recorded in experiments. Besides the ordinary physisorption, the involved clay minerals undergo swelling or shrinking due to chemisorption. The following two effects must be considered: first, the reduction of the available pore space by the adsorbed water layer. For this, the Hillerborg sorption theory is used, which is a combination of the well-known Brunauer-Emmett-Teller sorption theory and the Kelvin equation. This allows the computation of adsorbed water layers even in curved pore geometries. Second, the variation of the initial pore size distribution due to chemisorption needs to be modelled. Based on these two models, the effective diffusion coefficient can be predicted. For validation, arrays of relative humidity sensors were embedded into a free-standing earth masonry wall located in Berlin, Germany. This monitoring was carried out over more than a year to cover a broad variety of environmental conditions.
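A minimal sketch of the first effect, using generic forms of the Kelvin equation and of the pore narrowing by the adsorbed film (the symbols are chosen here for illustration):

```latex
% Kelvin equation for capillary condensation and pore narrowing by the adsorbed film
\ln\frac{p}{p_0} = -\frac{2\gamma V_{\mathrm{m}}\cos\theta}{r_{\mathrm{K}} R T}, \qquad
r_{\mathrm{eff}}(\varphi) = r_{\mathrm{pore}} - t(\varphi)
```

γ is the surface tension of water, V_m its molar volume, r_K the Kelvin radius, and t(φ) the statistical thickness of the adsorbed water layer at relative humidity φ (obtainable from a BET-type isotherm). Pores with radii below r_K are water-filled and no longer contribute to vapor diffusion, while larger pores are effectively narrowed by t(φ).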
The prediction of the effective diffusion coefficient can also be transferred to other processes and allows the investigation of materials having temporally changing pore systems. Examples are the carbonation of cementitious materials, the alkali-silica reaction, calcium leaching of long-lasting structures, etc. This effect becomes most prominent in the meso-pore range and might alter the effective diffusion coefficient by more than 100 %.
pH and oxygen are amongst the most important and frequently measured analytes in the life and material sciences, indicating, e.g., diseases and corrosion processes. This includes the optical monitoring of pH in living cells for studying cellular internalization pathways, such as phagocytosis, endocytosis, and receptor-ligand internalization, with the aid of molecular and nanoscale fluorescent sensors. Nanoparticle (NP)-based sensors that are labeled or stained with a multitude of sensor dyes have several advantages compared to conventional molecular probes, like enhanced brightness, i.e., amplified signals, ease of designing ratiometric systems by combining analyte-sensitive and inert reference dyes, and increased photostability. Moreover, this can enable the use of hydrophobic dyes in aqueous environments. Surface-functionalized NPs like silica (SiO2-NP) and polystyrene (PS-NP) particles provide versatile templates and carriers for the fabrication of nanosensors by staining and/or labelling with different fluorophores and sensor molecules. Here we present the design of a versatile platform of color-emissive nanosensors and stimuli-responsive microparticles for the measurement of pH, oxygen, and other targets, utilizing both types of matrices and sets of spectrally distinguishable sensor and reference dyes, together with their characterization, and demonstrate the applicability of representative sensor particles for cellular studies.
Bioanalytical, diagnostic, and security applications require the fast and sensitive determination of a steadily increasing number of analytes or events in parallel, in a broad variety of detection formats and with increased sensitivities. This, flanked by recent technical advancements and the availability of simple-to-use commercial time-resolved photoluminescence measuring devices at reasonable cost, calls for the exploitation of the species- and environment-specific photoluminescence parameter luminescence lifetime. In this context, time-resolved photoluminescence measurements of different classes of molecular and nanocrystalline emitters and luminescent particles in different time windows are presented, and examples for applications such as lifetime multiplexing and barcoding in conjunction with fluorescence lifetime imaging microscopy (FLIM) and flow cytometry are given.
An overview of the activities of the Federal Institute for Materials Research and Testing (BAM, Berlin, Germany) in the field of additively manufactured materials characterization will be presented. The research of our group is focused on the 3D imaging of AM materials by means of X-ray computed tomography in the laboratory and at the synchrotron, and on residual stress characterization by diffraction (a nondestructive technique).
The surface chemistry of nanomaterials controls their interaction with the environment and biological species as well as their fate and is hence also relevant for their potential toxicity. This has meanwhile led to an increasing interest in validated and preferably standardized methods for the determination and quantification of surface functionalities on nanomaterials and has initiated different standardization projects within ISO/TC 229 and IEC/TC 113 as well as interlaboratory comparisons (ILCs) of different analytical methods for the quantification of surface coatings by the OECD. Here we present the results of a first ILC on the quantification of the amount of amino functionalities on differently sized inorganic nanoparticles, carried out by division Biophotonics and the National Research Council of Canada (NRC), and the PWI 19257 on the Characterization and Quantification of Surface Functional Groups and Coatings on Nanoobjects, approved by ISO/TC 229 (WG2) in fall 2022, which will result in a VAMAS study on this topic organized by division Biophotonics. Keywords: nanoparticles, surface analysis, surface functional groups, quantification, optical assay, qNMR, VAMAS, standardization, ILC, quality assurance, reference material.
Due to their unique physico-chemical properties, nanoparticles are well established in research and industrial applications. A reliable characterization of their size, shape, and size distribution is not only mandatory to fully understand and exploit their potential and develop reproducible syntheses, but also to manage environmental and health risks related to their exposure and to meet regulatory requirements. To validate and standardize methods for accurate and reliable particle size determination, nanoscale reference materials (nanoRMs) are necessary. However, there is only a very small number of nanoRMs for particle size offered by key distributors such as the National Institute of Standards and Technology (NIST) and the Joint Research Centre (JRC) and, moreover, few provide certified values. In addition, these materials are currently restricted to polymers, silica, titanium dioxide, gold and silver, which have a spherical shape except for the titania nanorods. To expand this list with other relevant nanomaterials of different shapes and elemental composition that can be used for more than one sizing technique, we are currently building up a platform of novel nanoRMs relying on iron oxide nanoparticles of different shape, size and surface chemistry. Iron oxide was chosen as a core material because of its relevance for the material and life sciences.
Inorganic nanocrystals with linear and nonlinear luminescence in the ultraviolet, visible, near infrared and short-wave infrared, like semiconductor quantum dots and spectrally shifting lanthanide-based nanophosphors, have meanwhile found applications in the life and material sciences ranging from optical reporters for bioimaging and sensing over security barcodes to solid state lighting and photovoltaics. These nanomaterials commonly have increasingly sophisticated core/shell particle architectures with shells of different chemical composition and thickness to minimize radiationless deactivation at the particle surface, which is usually the main energy loss mechanism [1]. For lanthanide-based spectral shifters, particularly for very small nanoparticles, surface coatings are also needed which protect near-surface lanthanide ions from luminescence quenching by high-energy vibrators like O-H groups and prevent the disintegration of these nanoparticles under high dilution conditions [2,3,4]. The identification of optimum particle structures requires quantitative spectroscopic studies focusing on the key performance parameter photoluminescence quantum yield [5,6], ideally flanked by single particle studies to assess spectroscopic inhomogeneities on a particle-to-particle level for typical preparation methods [7,8]. Moreover, in the case of upconversion nanoparticles with a multi-photonic and hence excitation power density (P)-dependent luminescence, quantitative luminescence studies over a broad P range are required to identify particle architectures that are best suited for applications ranging from fluorescence assays up to fluorescence microscopy. Here, we present methods to quantify the photoluminescence of these different types of emitters in the vis/NIR/SWIR and as a function of P and demonstrate the importance of such measurements for a profound mechanistic understanding of the nonradiative deactivation pathways in semiconductor and upconversion nanocrystals of different size and particle architecture in different environments.
Inductively coupled plasma mass spectrometry (ICP-MS) has emerged as a powerful technique for trace analysis of soil due to its multi-element capability, high sensitivity and low sample consumption. However, despite its success and widespread use, ICP-MS has several persistent drawbacks, such as high argon gas consumption, argon-based polyatomic interferences and the need for complicated RF-power generators. Unlike argon-based ICP, nitrogen-sustained microwave inductively coupled atmospheric pressure plasma mass spectrometry (MICAP-MS) uses nitrogen as plasma gas, which eliminates the high operating costs associated with argon gas consumption as well as the argon-based interferences1. In this work, the applicability of MICAP-MS for elemental analysis in different matrices is investigated. For this purpose, reference soil samples and steel samples are digested with aqua regia and used for analysis. Concentrations of selected elements are determined using MICAP-MS and validated with ICP-MS and certified values. Sensitivities, limits of detection and gas consumption for both methods are compared and discussed in detail. The performance of MICAP-MS under different nitrogen plasma gas concentrations is investigated and compared. Moreover, the performance of MICAP-MS in alloy matrices is investigated and discussed.
Sediments and soils can act as sinks of inorganic mercury species (Hg2+), while they are simultaneously sources of organic species, such as monomethylmercury (MMHg). Although the fraction of MMHg in the total Hg of sediments is suggested to be only 0.1–1 %, MMHg poses a threat to humans and wildlife due to its toxic properties, high bioaccumulation potential and the ability to pass the blood-brain barrier. One example of a highly Hg-contaminated waterbody is the Finow Canal, the oldest artificial waterway still in operation in Germany. Here, Hg mass fractions of up to 100 µg/g were found in the sediment in previous studies. These are suggested to be associated with a chemical plant producing mercury-based seed dressings. Despite this high mass fraction of Hg, no Hg speciation studies have been conducted there up to now.
In this study, Hg speciation in sediments of the Finow Canal at locations before and after the known polluted site was conducted using species-specific isotope dilution (SSID) GC-ICP-ToF-MS. Mass fractions of up to 0.41 µg/g MMHg were determined. In addition, waterbodies around the initially polluted site were investigated, and elevated concentrations were also determined around 14 km downstream. For MMHg analysis, the performance of ICP-ToF-MS for SSID GC/ICP-MS was compared with ICP-Q-MS and ICP-SF-MS. Here, the isotope ratio precision was similar between the tested instruments. However, the (quasi-)simultaneous detection of the whole mass spectrum will probably offer a much higher precision for ICP-ToF-MS when more than one isotope system is used.
These results are the first evidence of the occurrence of MMHg in this region and show the need for further investigations of the whole regional ecosystem, as well as the consideration of possible measures of remediation. SSID GC-ICP-(ToF)-MS is a suitable tool for investigating species-specific (multi) isotope systems for environmental monitoring.
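For context, the underlying single isotope dilution relation, written here in a generic form in terms of the reference-isotope amounts (the symbols are chosen for illustration and are not taken from the abstract), is:

```latex
% Single isotope dilution, expressed via the reference isotope of the analyte species
n_{x} = n_{y}\,\frac{R_{y} - R_{\mathrm{B}}}{R_{\mathrm{B}} - R_{x}}
```

n_x and n_y are the amounts of the reference isotope of the analyte species in the sample and in the enriched spike, and R_x, R_y and R_B are the isotope amount ratios (spike isotope over reference isotope) of the sample, the spike and the blend. In SSID, the spike is a species-specific, isotopically enriched standard (here an enriched MMHg spike), so the relation quantifies each species separately.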
Since its discovery, graphene has gained growing attention in industrial and applied research due to its unique properties. However, graphene has not yet been implemented into the industrial market, in particular due to the difficulty of properly characterizing this challenging material. As with most other nanomaterials, graphene's properties are closely linked to its chemical and structural properties, such as the number of layers, flake thickness, degree of functionalisation and C/O ratio. For commercialization, suitable procedures for the measurement and characterization of the ultrathin flakes, with lateral dimensions in the range from µm to tens of µm, are essential. Surface chemical methods, especially XPS, have an outstanding role in providing chemical information on the composition. Thereby, one well-known problem for surface analytical methods is the influence of contamination on the composition, as in the case of adventitious carbon. The differentiation between carbon originating from the contamination or from the graphene sample itself is often not obvious, which can lead to altered results in the determination of the composition. To overcome this problem, hard X-ray photoelectron spectroscopy (HAXPES) offers new possibilities due to its higher information depth. Therefore, XPS measurements obtained with Al Kα radiation (E = 1486.6 eV) were compared with analyses performed with Cr Kα (E = 5414.8 eV) excitation on functionalized graphene samples. Differences are discussed in terms of the influence of potential carbon contamination, but also of oxygen, on the composition of the samples. Measurements are performed on O-, N- and F-functionalized graphene. Different preparation procedures (powder, pellet, drop cast from liquid suspension) will also be discussed; the correlation of the results with the flake morphology as well as their validation with other independent methods are in progress.
In mass spectrometry-based proteomics, protein homology leads to many shared peptides within and between species. This complicates taxonomic inference. We introduce PepGM, a graphical model for taxonomic profiling of viral proteomes and metaproteomic datasets. Using the graphical model, our approach computes statistically sound scores for taxa based on peptide scores from a previous database search, eliminating the need for commonly used heuristics.
In mass spectrometry-based proteomics, protein homology leads to many shared peptides within and between species. This complicates taxonomic inference. We introduce PepGM, a graphical model for taxonomic profiling of viral proteomes and metaproteomic datasets. Using the graphical model, our approach computes statistically sound scores for taxa based on peptide scores from a previous database search, eliminating the need for commonly used heuristics.
In general, wind turbines transform the kinetic energy of the wind into electric power. In doing so, the wind turbine blades face unsteady loads, which are transferred to the hub to generate a rotation of the turbine's axis. This brief introduction focuses on the aerodynamics of the blades and the corresponding loads. Starting with the basic flow field and loads of an airfoil, terms like stagnation point, boundary layer, Reynolds number, transition, and separation are introduced. For different geometries, lift and drag coefficient curves are discussed. Then, full wings are considered, including their three-dimensional flow field due to wing tip vortices and crossflows. As a main source of increased loads, unsteady effects such as gusts, tower passing, earth boundary layer crossing, free-stream turbulence, and yaw misalignment are explained in more detail. At the end, extra loads due to an oscillating free stream are introduced.
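For reference, the non-dimensional quantities named above follow the standard definitions (generic forms, with symbols chosen here for illustration):

```latex
% Reynolds number and lift/drag coefficients (standard definitions)
Re = \frac{U\,c}{\nu}, \qquad
C_L = \frac{L}{\tfrac{1}{2}\,\rho\,U^{2} A}, \qquad
C_D = \frac{D}{\tfrac{1}{2}\,\rho\,U^{2} A}
```

U is the free-stream velocity, c a characteristic length (e.g. the chord), ν the kinematic viscosity, ρ the fluid density, L and D the lift and drag forces, and A the reference area.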
Applying data-driven AI systems makes it possible to extract patterns from given data, generate predictions and support decision-making. Materials research and testing holds a plethora of AI-based applications, for example the automated search for and synthesis of new materials, the detection of material defects, or the prediction of process and material parameters (inverse problems). However, AI algorithms can often only be as good as the training data from which the corresponding models are learned. Therefore, it is also indispensable to develop measures for the standardization and quality assurance of such data.
For this purpose, we develop and implement methods for transferring data from various sources into a homogeneous data repository with uniform data descriptions. Through this standardization and corresponding machine-readable interfaces, research data can be made usable and reusable for further data analyses. In addition to the technical implementation of integrative platforms, it is crucial that quality-assured research data management is recognized and implemented as an integral part of daily scientific work. Finally, we provide a vision of how the Federal Institute for Materials Research and Testing can benefit from data-driven AI systems. We discuss early applications and take a peek at future research.
Metaproteomics has substantially grown over the past years and supplements other omics approaches by bringing valuable functional information, enabling genotype-phenotype linkages and connections to metabolic outputs. Currently, a wide variety of metaproteomic workflows is available, yet their impact on the results remains to be thoroughly assessed.
Here, we carried out the first community-driven, multi-lab comparison in metaproteomics: the critical assessment of metaproteome investigation (CAMPI) study. Based on well-established workflows, we evaluated the influence of sample preparation, mass spectrometry acquisition, and bioinformatic analysis using two samples: a simplified, lab-assembled human intestinal model and a human fecal sample.
Although bioinformatic pipelines contributed to variability in peptide identification, wet-lab workflows were the most important source of differences between analyses. Overall, these peptide-level differences largely disappeared at the protein group level. Differences were observed between peptide- and protein-centric approaches for the predicted community composition but similar functional profiles were found across workflows.
The CAMPI findings demonstrate the robustness of current metaproteomics research and provide a perspective for future benchmarking studies.
Driven by recent technological advances and the need for improved viral diagnostic applications, mass spectrometry-based proteomics comes into play for detecting viral pathogens accurately and efficiently. However, the lack of specific algorithms and software tools presents a major bottleneck for analyzing data from host-virus samples. For example, accurate species- and strain-level classification of a priori unidentified organisms remains a very challenging task in the setting of large search databases. Another prominent issue is that many existing solutions suffer from the protein inference issue, aggravated because many homologous proteins are present across multiple species. One of the contributing factors is that existing bioinformatic algorithms have been developed mainly for single-species proteomics applications for model organisms or human samples. In addition, a statistically sound framework was lacking to accurately assign peptide identifications to viral taxa. In this presentation, an overview is given on current bioinformatics developments that aim to overcome the above-mentioned issues using algorithmic and statistical methods. The presented methods and software tools aim to provide tailored solutions for both discovery-driven and targeted proteomics for viral diagnostics and taxonomic sample profiling. Furthermore, an outlook is provided on how the bioinformatic developments might serve as a generic toolbox, which can be transferred to other research questions, such as metaproteomics for profiling microbiomes and identifying bacterial pathogens.
In ultrasonic testing, the time of flight (ToF) of a signal can be used to infer material and structural properties of a test item. In dispersive media, extracting the bulk wave velocity from a received signal is challenging, as the waveform changes along its path of propagation. When using signal features such as the first peak or the envelope maximum, the calculated velocity changes with the propagation distance. This does not occur when picking the signal onset. Borrowing from seismology, researchers have used the Akaike information criterion (AIC) picker to automatically obtain onset times. In addition to being dependent on arbitrarily set parameters, the AIC picker assumes no prior knowledge of the spectral properties of the signal. This is unnecessary in ultrasonic through-transmission testing, where the signal spectrum is known to differ significantly from noise. In this contribution, a novel, parameter-free onset picker is proposed that uses a spectral entropy criterion (SEC) to model the signal within the AIC framework. Synthetic and experimental data are used to compare the performance of the SEC and AIC pickers, showing an improved accuracy for densely sampled data.
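To make the reference method concrete, the sketch below implements the classical AIC onset picker used here as a baseline; the proposed SEC picker itself is not reproduced, and the function and variable names are illustrative only.

```python
import numpy as np

def aic_onset(signal: np.ndarray) -> int:
    """Classical AIC onset picker: the onset is the sample index k that
    minimises AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:]))."""
    x = np.asarray(signal, dtype=float)
    n = x.size
    aic = np.full(n, np.inf)
    for k in range(1, n - 1):
        var_pre = np.var(x[:k])    # variance of the (noise-dominated) leading part
        var_post = np.var(x[k:])   # variance of the (signal-dominated) trailing part
        # guard against log(0) for perfectly flat segments
        aic[k] = k * np.log(max(var_pre, 1e-20)) + (n - k - 1) * np.log(max(var_post, 1e-20))
    return int(np.argmin(aic))

# Toy example: white noise followed by a delayed, decaying burst starting at index 800
rng = np.random.default_rng(0)
t = np.arange(2000)
sig = 0.01 * rng.standard_normal(2000)
sig[800:] += np.sin(2 * np.pi * 0.05 * t[:1200]) * np.exp(-t[:1200] / 300)
print("picked onset index:", aic_onset(sig))
```

On a toy trace like this, the minimum of the AIC curve falls close to the true onset; the contribution's point is that a spectral-entropy-based signal model removes the remaining tuning choices for real, dispersive measurements.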
Climate change and related energy policies, exacerbated by unforeseen geopolitical developments, pose new challenges for gas analytics, such as the use of hydrogen, hydrogen-containing alternative gaseous fuels (NH3, etc.), the use of alternative methane-based energy gases (LNG, LPG, etc.) or decarbonisation via CCSU. In all topics, the quality, i.e. the actual chemical composition of the gases, naturally plays a decisive role. BAM is meeting this strategic importance with the further development of hydrogen analytics and is continuing to develop the methods used in order to support the German economy and research landscape with traceability, reference materials and analytical procedures as quickly as possible.
Mass spectrometry plays an important role for trace analysis in hydrogen matrix. The presentation shows first experimental results from the application of PTR-TOF-MS (Proton Transfer Reaction Time-of-Flight Mass Spectrometry).
Mycotoxins (toxins formed by fungi) in food have caused problems for mankind since the beginning of time. The group of ergot alkaloids plays a special role in human history. Several tens of thousands of deaths during the Middle Ages caused by ergotism (the disease caused by the continuous intake of ergot alkaloid-contaminated food) underscore the importance of reliable analytical methods to ensure food safety.
More than 50 compounds belong to the group of ergot alkaloids. The 12 most frequently found structures – the major ergot alkaloids – are typically measured when it comes to ergot alkaloid quantification. High performance liquid chromatography (HPLC) with a fluorescence detector (FLD) is typically used to quantify the ergot alkaloid content. The main disadvantages of this method are the high costs for calibration standards (12 different calibration substances are required), but also the time and effort required for the analysis of 12 peaks and the overlapping signals that occur in complex food samples such as bread. As all ergot alkaloids share the ergoline structure and only differ in the substituents attached to this backbone, measurement of all ergot alkaloids as one sum parameter presents a time- and cost-saving alternative. The most important step for the development of such a sum parameter method is the reaction used to transfer all ergot alkaloids into one uniform structure. In the talk, two promising reactions, the acidic esterification to lysergic acid methyl ester and the hydrazinolysis to lysergic acid hydrazide, are examined for possible use in a routine analysis method. In addition to yield and reaction rate, factors such as the handling of the reaction and the possibility of parallelization play a role. Besides the current status of the ongoing research project, this talk will discuss current approaches to ergot alkaloid quantitation.
Investigation of degradation of the aluminum current collector in lithium-ion batteries by GD-OES
(2022)
Lithium-ion batteries (LIBs) are one technology to overcome the challenges of the climate and energy crisis. They are widely used in electric vehicles, consumer electronics, or as storage for renewable energy sources. However, despite innovations in battery components like cathode and anode materials, separators, and electrolytes, the aging mechanism related to the degradation of the metallic aluminum current collector causes a significant drop in their performance and prevents the durable use of LIBs. Glow-discharge optical emission spectroscopy (GD-OES) is a powerful method for depth-profiling of battery electrode materials. This work investigates aging-induced aluminum deposition on the cathodes of commercial lithium cobalt oxide (LCO) batteries. The results illustrate the depth-resolved elemental distribution from the cathode surface to the current collector. An accumulation of aluminum is found on the cathode surface by GD-OES, consistent with results from energy-dispersive X-ray spectroscopy (EDX) combined with focused ion beam (FIB) cutting. In comparison to FIB-EDX, GD-OES allows a fast and manageable depth-profiling. Results from different positions on an aged cathode indicate an inhomogeneous aluminum film growth on the surface. The conclusions from these experiments can lead to a better understanding of the degradation of the aluminum current collector and thus to longer lifetimes of LIBs.
In industrialised countries, more than 80 % of the time is spent indoors. Products such as building materials and furniture emit volatile organic compounds (VOCs), which are therefore ubiquitous in indoor air. VOCs in combination may, under certain environmental and occupational conditions, result in reported sensory irritation and health complaints. Emission concentrations can become further elevated in new or refurbished buildings where the rate of air exchange with fresh ambient air may be limited due to improved energy-saving measures. A healthy indoor environment can be achieved by controlling the sources and by eliminating or limiting the release of harmful substances into the air. One way is to use (building) materials proven to be low emitting. Meanwhile, a worldwide network of professional commercial and non-commercial laboratories performing emission tests for the evaluation of products for interior use has been established. Therefore, the comparability of test results must be ensured. A laboratory's proficiency can be proven by internal and external validation measures that both include the application of suitable emission reference materials (ERM). For the emission test chamber procedure according to EN 16516, no artificial ERM is commercially available. The EU-funded EMPIR project MetrIAQ aims to fill this gap by developing new and improved ERMs. The goal is to obtain a material with a reproducible and temporally constant compound release (less than 10 % variability over 14 days). Different approaches, such as the impregnation of porous materials, are being tested. The generation as well as results of the most promising materials will be presented.
Guided ultrasonic waves are excellently suited for material characterization, since their propagation behaviour depends on the material properties of the inspected material.
To draw conclusions about the material parameters from the experimentally determined propagation behaviour of guided ultrasonic waves, various inverse methods are discussed in current research. Dispersion maps in the frequency-wavenumber domain represent the propagation behaviour of guided ultrasonic waves. Machine learning, and in particular convolutional neural networks (CNNs), offers one way to determine the material parameters automatically and inversely from the dispersion maps.
In this contribution, synthetic data are used to show how the propagation behaviour of guided ultrasonic waves can be exploited, using CNNs and dispersion maps, to determine the elastic constants of an isotropic plate-like structure. This example illustrates the general procedure for applying machine learning with neural networks. For this purpose, the data used are analysed, the preprocessing is explained, and a simple CNN architecture is chosen. In the evaluation, particular emphasis is placed on the explainability and reliability of the CNN used, thereby highlighting its limits and possibilities.
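A minimal sketch of such a regression CNN is shown below; the architecture, input size and output parametrisation are assumptions for illustration and not the network described in the contribution.

```python
import torch
import torch.nn as nn

class DispersionCNN(nn.Module):
    """Toy CNN mapping a frequency-wavenumber dispersion image to two
    elastic constants (e.g. Young's modulus and Poisson's ratio)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, 64), nn.ReLU(),
            nn.Linear(64, 2),  # regression output: two elastic constants
        )

    def forward(self, x):
        return self.head(self.features(x))

# Dummy forward pass on a synthetic 128 x 128 dispersion map
model = DispersionCNN()
dispersion_map = torch.randn(1, 1, 128, 128)  # (batch, channel, frequency bins, wavenumber bins)
print(model(dispersion_map).shape)  # torch.Size([1, 2])
```

In practice, such a network would be trained on many synthetic dispersion maps computed for known elastic constants, so that it learns the inverse mapping whose explainability and reliability the contribution evaluates.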
To explore their potential in the field of single particle analysis, a microdroplet generator (MDG) was coupled to an ICP-ToF-MS.
Isotope dilution analysis was also incorporated for the size determination of three different platinum nanoparticle samples (50, 63 and 70 nm). The performance of the technique was validated by comparison with traditional size characterization techniques (sp-ICP-ToF-MS, TEM), while its robustness was demonstrated by incorporating NaCl into the samples' matrix at concentrations of up to 100 mg/L.
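For context, single-particle measurements convert the analyte mass detected per particle event into a spherical equivalent diameter via the standard relation below (a generic form, with symbols chosen here for illustration):

```latex
% Spherical equivalent diameter from the per-particle analyte mass
d = \left(\frac{6\,m_{\mathrm{p}}}{\pi\,\rho}\right)^{1/3}
```

m_p is the element mass detected in a single-particle event (here the Pt mass per particle, obtainable from the isotope dilution quantification) and ρ the bulk density of the particle material.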
LIBS is a complementary method to XRF and can detect all elements without the need for vacuum conditions. Automated systems capable of scanning surfaces with a resolution of up to 0.1 mm within a few minutes are already commercially available. In addition to possible applications in R&D, LIBS is also used for practical applications in building materials laboratories and even on-site.
In view of ageing infrastructure facilities, a reliable assessment of the condition of concrete structures is of increasing interest. For concrete structures, the ingress of potentially harmful ions affects the serviceability and eventually the structural performance. Pitting corrosion induced by penetrating chlorides is the dominant deterioration mechanism. Condition assessment based on frequently performed chloride profiling can be useful to identify the extent and evolution of chloride ingress. This could prove to be more economical than extensive repairs, especially for important infrastructure facilities.
Currently, the most common procedure for determining the chloride content is wet chemical analysis with a standard depth resolution of 10 mm; the heterogeneity of the concrete is not considered. LIBS is an economical alternative for determining the chloride content at depth intervals of 1 mm or less. It provides 2D distributions of multiple elements and can locate spots with higher concentrations. The results are directly correlated to the mass of binder, and measurements can also be performed on-site with a mobile LIBS system.
The application of a LIBS system is presented. Calibration is required for quantitative analysis. Concrete cores were drilled, sliced and analyzed to determine the 2D distribution of harmful elements. By comparing the chloride ingress and the carbonation, the interaction of both processes can be visualized in a measurement that takes less than 10 minutes for a 50 mm x 100 mm drill core.
A leaflet on the use of LIBS for the chloride ingress assessment has been completed.
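As an illustration of the calibration step mentioned above, a linear calibration between a normalized Cl line intensity and a reference chloride content could be set up as sketched below; the data and variable names are hypothetical and do not reproduce the actual calibration of the system.

```python
import numpy as np

# Hypothetical calibration samples: normalized Cl line intensity (a.u.)
# versus reference chloride content (wt.-% related to the binder)
intensity_ratio = np.array([0.02, 0.05, 0.11, 0.18, 0.27])
chloride_wt_pct = np.array([0.10, 0.30, 0.70, 1.20, 1.80])

# Least-squares linear calibration: c = a * I + b
a, b = np.polyfit(intensity_ratio, chloride_wt_pct, deg=1)

def chloride_from_intensity(i_norm: float) -> float:
    """Convert a normalized Cl intensity into a chloride content (wt.-%)."""
    return a * i_norm + b

print(f"calibration: c = {a:.2f} * I + {b:.2f}")
print("predicted chloride content:", round(chloride_from_intensity(0.15), 2), "wt.-%")
```

A spatially resolved intensity map can then be converted pixel by pixel into a chloride map, which is the basis of the 2D ingress visualizations described above.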
In many scientific fields, isotopic analysis can offer valuable information, e.g., for tracing the origin of food products, environmental contaminants, and forensic and archaeological samples (provenance determination), for the age determination of minerals (geochronological dating) or for elucidating chemical processes. To date, typically bulk analysis is performed, aiming at measuring the isotopic composition of the entire elemental content of the sample. However, the analyte element is usually present in the form of different elemental species. Thus, separating species of interest from one another and from matrix components prior to isotope ratio measurements can provide species-specific isotopic information, which could be used for tracing the origin of environmental pollutants and for the elucidation of (environmental) speciation. Using on-line hyphenations of separation techniques with multicollector ICP-MS (MC-ICP-MS) can save time and effort and enables the analysis of different species during a single measurement.
In this work, we developed an on-line hyphenation of CE with multicollector ICP-MS (CE/MC-ICP-MS) for the isotopic analysis of sulfur species. With this method, the isotopic composition of sulfur in sulfate originating from river water could be analyzed without sample preparation. The results were compared with data from off-line analysis of the same samples to ensure accuracy. The precision of the on-line measurements was high enough to distinguish the rivers from one another by the isotopic signature of the river water sulfate. Next to environmental applications, a current field is the species-specific isotopic analysis of biomolecules, as sulfur is the only covalently bound constituent of proteins which can be analyzed by MC-ICP-MS. A further issue is the data analysis of transient signals in terms of isotope ratio determination; we developed a small, freely accessible app allowing for fast data analysis that takes relevant aspects (e.g., mass bias correction, peak picking, …) into account.
Due to the ageing of infrastructure facilities, a reliable assessment of the condition of concrete structures is of great interest to plan timely and appropriate measures. In concrete structures, pitting corrosion of the reinforcement is the predominant deterioration mechanism affecting serviceability and eventually structural performance. The determination of quantitative chloride ingress is not only necessary to obtain valuable information on the current condition of a structure, but the data obtained can also be used to predict future developments and the associated risks. An overview of the progress and the possibilities of the application of laser-induced breakdown spectroscopy for concrete analysis in daily civil engineering practice is given. High-resolution 2D measurements of drill cores to determine the penetration of harmful species into concrete are presented. Furthermore, the application of a mobile LIBS system in a parking garage is shown. The system consists of a diode-pumped low-energy laser (3 mJ, 1.5 ns, 100 Hz) and a compact NIR spectrometer. A scanner allows two-dimensional element mapping. Progress towards the establishment of LIBS in a leaflet for the analysis of chlorine ingress into concrete in civil engineering is presented.
The main application of LIBS in civil engineering is the detection of harmful ions in concrete, which can penetrate the component through the porous concrete structure. The advantages of LIBS over standard methods are the possibility of multi-element analysis, measurement speed, spatially resolved measurements, and minimal sample preparation. The spatially resolved measurements of LIBS allow the assessment of the heterogeneity of the concrete by measuring the chemical composition of the aggregates and the binder matrix separately. The latter is particularly relevant because the determined elemental distribution can be directly related to the binder matrix. This is not possible with standard methods, since the material is homogenized to powder during the sample preparation stage and the determined concentration is thus related to the total mass. In addition to the use of LIBS for the specific analysis of individual harmful ions, LIBS can also be used to estimate the concrete composition and thus determine, for example, the type of cement used. Corresponding information is relevant for the estimation of the remaining service life and for the preparation of a maintenance concept. In recent years, LIBS has been increasingly used in civil engineering. Currently, however, it is primarily used in research institutions and only occasionally in building materials laboratories. Special commercial devices have also been developed, which greatly simplify the application due to the high degree of automation. Mobile LIBS systems allow on-site application. A central point that limits the use of LIBS in the commercial sector is the lack of norms and standards. Therefore, within the framework of a project funded by the German government, work has been carried out on the preparation of a leaflet on quantitative chlorine determination in concrete, which will be published this year. In interlaboratory comparisons, the robustness and accuracy for the practical application were demonstrated. LIBS also has great potential in the recycling of construction waste in conjunction with hyperspectral sensors. This issue is currently being addressed in a national project. During the presentation, the state of the art of LIBS in civil engineering will be presented, next steps will be discussed, and future challenges will be outlined.
The metrological analysis uses an unbroken chain of comparative measurements to trace results back to the national or international standard. This enables comparable, absolute quantification between laboratories. In 2020/2021, a pilot study for the quantification of SARS-CoV-2 antibodies was initiated with the involvement of BAM. Despite the consistent use of ID-MS, a large discrepancy in the results of the participating laboratories was found. This was the motivation for a project to systematically investigate and optimize traceable methods of protein quantification using mass spectrometric nontarget analysis (NTA) and recombinant antibody panels.
The Bunsen-Kirchhoff Prize 2022 was awarded to Dr. Carlos Abad on 23 June 2022 on the occasion of the analytica conference in Munich, in recognition of his excellent developments in the field of continuum source atomic absorption spectrometry (CS-AAS).
Dr. Carlos Abad is an outstanding expert in the field of atomic and molecular absorption spectrometry. In particular, he contributed significantly to the substantial further development of echelle spectrometers for CS-AAS. This made it possible to gain quantitative access by AAS to elements such as boron, chlorine, fluorine and sulfur. Using a Zr modifier as an example, Dr. Carlos Abad demonstrated for the first time that the time resolution of the echelle systems employed enables mechanistic investigations of the effect of the modifier in the graphite tube furnace.
Particularly noteworthy is his work on the use of CS-AAS for the analysis of isotopes, which achieves an accuracy approaching that of multicollector inductively coupled plasma mass spectrometry (MC-ICP-MS). This opens up completely new possibilities for technologically highly relevant applications, such as the investigation of the ageing of lithium batteries or lithium analysis in blood serum.