Analytische Chemie
HR-CS-GFMAS – a new screening tool for per- and polyfluoroalkyl substances (PFAS) in the environment
(2023)
Per- and polyfluorinated alkyl substances (PFASs) are a large group of anthropogenic contaminants. Of particular concern are their persistent, bioaccumulative, and toxic properties. Mostly, target-based approaches (e.g., LC-MS/MS) are used for the analysis of PFASs in the environment. However, these approaches are limited by the availability of analytical-grade standards and therefore drastically underestimate the total PFAS burden. Analytical approaches based on total fluorine for PFAS sum parameter analysis are becoming increasingly important for indicating realistic PFAS pollution levels.
PFAS sum parameters represent the proportion of organically bound fluorine that can either be extracted (EOF, extractable organic fluorine) or adsorbed to activated carbon (AOF, adsorbable organic fluorine). For the instrumental analysis of such sum parameters, a fluorine-selective detector is needed. Besides combustion ion chromatography (CIC), high-resolution continuum source graphite furnace molecular absorption spectrometry (HR-CS-GFMAS) is a sensitive and highly selective tool for fluorine determination. The method is based on the in situ formation of diatomic gallium monofluoride (GaF) in a graphite furnace at a temperature of 1550 °C. The molecular absorption of GaF can be detected at its most sensitive wavelength, 211.248 nm, providing limits of quantification in the low µg F L⁻¹ range.
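To make the quantification step concrete, the sketch below shows how such a GaF-based calibration might be evaluated in practice: a linear fit of molecular absorbance against fluoride concentration, with the limit of quantification estimated from the blank scatter. All numbers, and the 10·σ criterion, are illustrative assumptions, not values from this work.

```python
# Hedged sketch: linear GaF calibration and an LOQ estimate.
# All values are invented for illustration; they are not data from this study.
import numpy as np

conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])               # µg F L^-1 standards
absorbance = np.array([0.002, 0.051, 0.103, 0.198, 0.401])  # integrated absorbance at 211.248 nm

slope, intercept = np.polyfit(conc, absorbance, 1)  # least-squares calibration line

blank_sd = 0.0015            # assumed standard deviation of repeated blank measurements
loq = 10 * blank_sd / slope  # common 10-sigma quantification criterion
print(f"LOQ ~ {loq:.1f} µg F L^-1")

# quantify an unknown sample from its measured absorbance
sample_abs = 0.147
print(f"sample ~ {(sample_abs - intercept) / slope:.1f} µg F L^-1")
```

With these invented numbers the estimated LOQ lands at roughly 1.5 µg F L⁻¹, consistent in magnitude with the "low µg F L⁻¹ range" stated above.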
Here, we present a comparison of total fluorine analysis methods – AOF vs. EOF and HR-CS-GFMAS vs. CIC. To this end, surface water samples from the Spree River in Berlin, Germany, were analyzed at 10 locations for total fluorine (TF), AOF, and EOF. The AOF made up 0.14–0.81% of TF and the EOF 0.04–0.28% of TF, with AOF concentrations being systematically higher. In the instrumental comparison, HR-CS-GFMAS proved more sensitive and precise for fluorine analysis than CIC.
Glimpses of the Future ✨: Advancing X-ray Scattering in an Automated Materials Research Laboratory
(2023)
In our (dramatically understaffed) X-ray scattering laboratory, developing a systematic, holistic methodology [1] has let us provide scattering and diffraction information for more than 2100 samples in 200+ projects led by 120+ collaborators. Combined with automated data correction pipelines and our analysis and simulation software, this has led to more than 40 papers [2] in the last 5 years with just over 2 full-time staff members.
This year, our new, modular synthesis platform has produced more than 1000 additional samples for us to analyse and catalogue. By virtue of the automation, the synthesis of these samples is automatically documented in excruciating detail, preparing them for upload to and exploitation in large-scale materials databases. Having developed these proofs of concept, we find that materials research itself changes dramatically once the dull tasks in a laboratory are automated.
This talk is intended to spark ideas and invite collaborations by providing an overview of: 1) the current improvements in our wide-range X-ray scattering laboratory methodology, 2) some of our open-source analysis and simulation software, touching on scattering, diffraction, and PDF, and 3) our open, modular robotic platform for systematic sample preparation. Finally, the remaining bottlenecks and points of attention across all three are highlighted.
The second talk for the Swiss Society for Crystallography (SSCr) workshop on SAXS will highlight data processing challenges, holistic experimental workflow developments, and common pitfalls. In particular, the following items will be addressed:
- The importance of data processing and estimating uncertainty
- A universal correction pipeline – away with the headaches, at least for this step! (a minimal sketch follows this list)
- Experiment planning, part 2: tips and advice to improve your corrected data.
- Sample preparation and background selection: further tips to improve your corrected data.
- Automate for your mental well-being; electronic logbooks, measurement catalogs and workflow management software
- Life on the edge: several pitfalls to avoid…
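As a taste of what such a universal pipeline does, here is a minimal sketch of a 1D correction chain (dark current, counting time, transmission, thickness, background subtraction, absolute scaling) with a crude Poisson uncertainty estimate carried along. This is an illustrative outline under assumed conventions, not the laboratory's actual pipeline.

```python
# Minimal 1D correction-chain sketch; variable names and the simple Poisson
# error model are assumptions, not the published pipeline.
import numpy as np

def correct_1d(i_raw, i_dark, i_bg, t_sample, t_bg, time_s, thickness_cm, k_abs):
    """Background-subtracted intensity on an absolute scale, with a rough
    Poisson counting uncertainty propagated through the linear steps."""
    sample = (i_raw - i_dark) / (time_s * t_sample * thickness_cm)
    backgr = (i_bg - i_dark) / (time_s * t_bg * thickness_cm)
    i_abs = k_abs * (sample - backgr)
    # crude estimate: treat all three signals as independent Poisson counts
    u_abs = k_abs * np.sqrt(i_raw + i_dark + i_bg) / (
        time_s * min(t_sample, t_bg) * thickness_cm)
    return i_abs, u_abs
```

Carrying the uncertainty array alongside the corrected intensity, as sketched here, is what makes the later fitting and model-selection steps statistically meaningful.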
This talk for the Swiss Society for Crystallography (SSCr) workshop on SAXS will introduce scattering from various angles, focusing in particular on:
- Information content of X-ray scattering experiments, three entry points…
- An introduction to Fourier Transforms (see the sketch after this list)
- Sample criteria, compatibility, and selection
- Key indicators of a measurement – where is the information?
- Key indicators of measurement quality
- Experiment planning, the basics
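To make the Fourier-transform point tangible, the following toy sketch evaluates the Debye equation, I(q) = Σᵢⱼ sin(q·rᵢⱼ)/(q·rᵢⱼ), for a random cloud of point scatterers filling a sphere. The coordinates are invented for illustration and are not connected to any dataset from the talk.

```python
# Toy Debye-equation sketch: real-space pair distances -> scattered intensity.
import numpy as np

rng = np.random.default_rng(1)
# 150 point scatterers uniformly filling a sphere of radius 5 nm
dirs = rng.normal(size=(150, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = 5.0 * dirs * rng.random((150, 1)) ** (1.0 / 3.0)

q = np.linspace(0.05, 3.0, 120)                                 # |q| in 1/nm
r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)  # pair distances

# np.sinc(x) = sin(pi x)/(pi x), so sinc(q r / pi) = sin(q r)/(q r),
# with the correct limit of 1 for the i = j terms
i_q = np.sinc(q[:, None, None] * r[None, :, :] / np.pi).sum(axis=(1, 2))
```

Plotting i_q against q reproduces the familiar sphere form-factor oscillations, showing directly how a real-space size is encoded in reciprocal space.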
Introduction: The influence of copper, iron, and zinc concentrations on the formation of β-amyloid plaques and neurofibrillary tangles in Alzheimer’s disease (AD) is widely discussed in the community. The results from human and animal studies so far are mixed, with some studies showing a correlation and others not. From a number of studies, it is known that disease state and the isotopic composition of essential elements can be coupled.
Aim: The aim of the study was to identify changes in element content and isotopic composition in two transgenic mouse models used in AD research compared to their genetic WT relatives, and to establish whether element content and isotopic signature are comparable between different laboratories.
Methods: β-amyloid-overexpressing (5xFAD) and tau-overexpressing (L66) mice, together with their matching wild-types, were bred at dedicated facilities in accordance with the European Communities Council Directive (2010/63/EU). Serum and brain were sampled after sacrifice, and the samples were distributed among the participants of the study. The tissues were acid-digested for total element determination and high-precision isotope ratio determination. Element content was determined by either sector-field or quadrupole-based inductively coupled plasma mass spectrometry (ICP-MS). For the determination of isotope ratios, multi-collector ICP-MS was used.
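For readers unfamiliar with isotope-ratio reporting, a minimal sketch of the usual delta-notation evaluation with standard-sample bracketing is given below; the routine and all numbers are illustrative assumptions, not the study's actual data reduction.

```python
# Hedged sketch: delta value (per mil) from a sample ratio bracketed by two
# standard measurements. All values are invented for illustration.
def delta_permil(r_sample, r_std_before, r_std_after):
    """delta = (R_sample / R_standard - 1) * 1000, with the standard ratio
    taken as the mean of the two bracketing standard runs."""
    r_std = 0.5 * (r_std_before + r_std_after)
    return (r_sample / r_std - 1.0) * 1000.0

# hypothetical 66Zn/64Zn measurement bracketed by reference-material runs
print(delta_permil(0.56520, 0.56502, 0.56498))   # ~ +0.35 per mil
```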
Results: Total copper content was significantly higher for L66 mice and their matched WT than for 5xFAD mice and their WT. Brains of L66 mice contained more Fe than those of their WT, while Zn and Cu were not significantly different between L66 and WT. 5xFAD mice, in contrast, had a slightly lower Cu and a slightly higher Zn concentration in brain compared to their WT. The isotopic signature of Fe in the brain of L66 mice differed from that of their controls, whereas Zn isotope ratios were affected in 5xFAD mice compared to their WT. The Cu isotopic ratio did not seem to be influenced in either strain. In serum, the shifts were less pronounced.
Conclusion: Even though neither tau protein nor amyloid precursor protein is known to be a metal-dependent or metal-containing protein, the overexpression of both influences Fe, Cu, and Zn metabolism in the brain and, to some extent, also in serum. This can be seen not only from total element determination but probably more clearly from the isotopic signatures of Fe, Cu, and Zn.
Many metallic materials gain better mechanical properties through controlled heat treatments. For example, in age-hardenable aluminium alloys, the strengthening mechanism is based on the controlled formation of nanometre-sized precipitates, which represent obstacles to dislocation movement and consequently increase the strength. Precise tuning of the material microstructure is thus crucial for optimal mechanical behaviour under the service conditions of a component. Therefore, analysis of the microstructure, especially of the precipitates, is essential to determine the optimum parameters for the interplay of material and heat treatment. In a first step, transmission electron microscopy (TEM) is used to identify precipitate types and orientations. Dark-field imaging (DF-TEM) is often used to image the precipitates and subsequently quantify their relevant dimensions. Often, these evaluations are still performed by manual image analysis, which is very time-consuming and to some extent also poses reproducibility problems.
Our work aims at a semantic representation of an automatable digital approach for this material-specific characterization method, adopting FAIR data practices. Based on DF-TEM images of different precipitation states of a wrought aluminium alloy, the modularizable digital workflow for the quantitative analysis of precipitate dimensions is described. The integration of this workflow into a data pipeline concept will also be discussed. Using ontologies, the raw image data, their respective contextual information, and the resulting output data of the quantitative image analysis can be linked in a triplestore. Publishing the digital workflow and the ontologies will ensure data reproducibility. In addition, the semantic structure enables data sharing and reuse for other applications and purposes, demonstrating interoperability.
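As an illustration of one module of such a workflow, the sketch below sizes bright precipitates in a DF-TEM image with scikit-image: Otsu thresholding, small-object removal, and per-particle major/minor axis measurement. The file name, threshold choice, and pixel calibration are placeholders; this is not the published pipeline itself.

```python
# Hedged sketch of automated precipitate sizing from a dark-field TEM image.
import numpy as np
from skimage import io, filters, measure, morphology

img = io.imread("df_tem_image.tif", as_gray=True)  # hypothetical input file

# precipitates appear bright in dark-field contrast
mask = img > filters.threshold_otsu(img)
mask = morphology.remove_small_objects(mask, min_size=20)  # drop noise specks

labels = measure.label(mask)
nm_per_px = 0.5  # assumed pixel calibration

for region in measure.regionprops(labels):
    length = region.major_axis_length * nm_per_px
    width = region.minor_axis_length * nm_per_px
    print(f"precipitate {region.label}: {length:.1f} nm x {width:.1f} nm")
```

The measured lengths and widths, together with the acquisition metadata, are exactly the kind of output that the ontology-based linking described above would attach to the raw image in the triplestore.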
Knowledge representation in the materials science and engineering (MSE) domain is a vast and multi-faceted challenge: overlap, ambiguity, and inconsistency in terminology are common; invariant and variant knowledge are difficult to align across domains; and generic top-level semantic terminology is often too abstract, while MSE domain terminology is often too specific.
In this presentation, an approach is shown for maintaining a comprehensive and intuitive MSE-centric terminology by composing a mid-level ontology, the PMD core ontology (PMDco), via MSE community-based curation procedures.
The PMDco is designed in direct support of the FAIR principles to address the immediate needs and requirements of the global expert community. The illustrated findings show how the PMDco bridges semantic gaps between high-level, MSE-specific, and other science-domain semantics, how the PMDco lowers development and integration thresholds, and how it can be fed from real-world data sources ranging from manually conducted experiments and simulations to continuously automated industrial applications.
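To indicate how such a mid-level ontology is used in practice, here is a minimal rdflib sketch that links a dataset to the process that generated it in a triplestore-ready graph. The base IRI, class names, and property names below are placeholders chosen for illustration; consult the published PMDco for the actual terms.

```python
# Hedged sketch: linking data and process context as RDF triples with rdflib.
from rdflib import Graph, Namespace, Literal, RDF

PMD = Namespace("https://w3id.org/pmd/co/")   # assumed base IRI
EX = Namespace("https://example.org/lab/")    # hypothetical local namespace

g = Graph()
g.bind("pmd", PMD)

g.add((EX.img_0042, RDF.type, PMD.Dataset))           # hypothetical class
g.add((EX.process_17, RDF.type, PMD.Process))         # hypothetical class
g.add((EX.img_0042, PMD.generatedBy, EX.process_17))  # hypothetical property
g.add((EX.img_0042, PMD.value, Literal("42.3 nm mean precipitate length")))

print(g.serialize(format="turtle"))
```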
The present work is part of the AIFRI project (Artificial Intelligence For Rail Inspection), in which we and our project partners train a neural network for defect detection and classification. Our goal at BAM is to generate artificial ultrasound and eddy current training data for the AI. This paper has an exploratory nature; we focus on the simulation of eddy current signals for head check cracks, one of the most important rail surface defects. The goal of this paper is twofold. On the one hand, we present our general simulation setup. This includes geometric models for head check cracks with features like branching and direction change, a model for the HC10 rail testing probe, and the configuration of the Faraday simulation software.
On the other hand, we use the Faraday software to simulate eddy current testing signals with a strong focus on the influence of the damage depth on the signal, while differentiating between different crack geometries. Here, we observe an early saturation effect of the test signal at a damage depth of 2 mm (at a crack angle of 25° to the surface). That is about 2 mm earlier than we would expect from measurements at a crack angle of 90°. This behavior will be investigated further in a future paper. Finally, we interpolate the simulated signals in a two-step curve-fitting process. With these interpolations we can generate eddy current test signals for any damage depth within the simulated range.
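The two-step procedure can be pictured as follows: first fit a saturating response of signal amplitude versus damage depth, then evaluate the fitted curve at arbitrary depths. The sketch below uses an exponential saturation model and invented numbers purely for illustration; the functional forms actually used in the paper may differ.

```python
# Hedged sketch: fit a saturating signal-vs-depth curve, then interpolate.
import numpy as np
from scipy.optimize import curve_fit

depth_mm = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0])      # simulated damage depths
signal = np.array([0.37, 0.59, 0.71, 0.79, 0.85, 0.87])  # invented amplitudes

def saturating(d, a, d0):
    # assumed model: amplitude saturates exponentially with depth
    return a * (1.0 - np.exp(-d / d0))

(a, d0), _ = curve_fit(saturating, depth_mm, signal, p0=(1.0, 1.0))
print(f"saturation level ~ {a:.2f}, characteristic depth ~ {d0:.2f} mm")

# step 2: generate a signal amplitude at any depth inside the simulated range
print(saturating(2.5, a, d0))
```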
Additive manufacturing of concrete structures is a novel and emerging technology. Free contouring in civil engineering, which allows for entirely new designs, is a significant advantage. In the future, lower construction costs are expected, with increased construction speeds and fewer required materials and workers. However, architects and civil engineers rely on a certain quality of execution to fulfil construction standards. Although several techniques and approaches demonstrate the advantages, quality control during printing is highly challenging and rarely applied. Due to the continuous mixing process commonly used in 3D concrete printing, it is impossible to exclude variations in the dry mixture or water content, and a test sample cannot be taken as representative of the whole structure. Although mortar properties vary only locally, a defect in one layer during printing could affect the integrity of the entire structure. Therefore, real-time process monitoring is required to record and document the printing process.

At the Bundesanstalt für Materialforschung und -prüfung (BAM), a new test rig for the additive manufacturing of concrete has been built. Its primary purpose is measuring and monitoring the properties of a mortar during the printing process. The following study investigates an approach for calculating yield stress and plastic viscosity based on experimentally recorded pressure data. The calculations assume that fresh mortar behaves as a Bingham fluid and that the Buckingham–Reiner equation is applicable. A test setup consisting of rigid pipes with integrated pressure sensors at different positions is utilized. Monitoring the printing process with different sensors is crucial for the quality control of an ongoing process.
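As a sketch of how yield stress and plastic viscosity might be recovered from such pressure data, the snippet below fits the Buckingham–Reiner equation, Q = (π R⁴ ΔP)/(8 μ L) · [1 − (4/3)(τ₀/τ_w) + (1/3)(τ₀/τ_w)⁴] with wall shear stress τ_w = ΔP·R/(2L), to flow-rate/pressure-drop pairs. Geometry and data values are invented assumptions, not measurements from the BAM test rig.

```python
# Hedged sketch: Bingham parameters from pipe-flow data via Buckingham-Reiner.
import numpy as np
from scipy.optimize import curve_fit

R, L = 0.0125, 2.0  # assumed pipe radius and pressure-sensor spacing in m

def buckingham_reiner(dp, tau0, mu):
    """Volume flow rate Q (m^3/s) of a Bingham fluid in laminar pipe flow."""
    tau_w = dp * R / (2.0 * L)             # wall shear stress
    x = np.clip(tau0 / tau_w, 0.0, 1.0)    # no flow once tau_w <= tau0
    return (np.pi * R**4 * dp) / (8.0 * mu * L) * (1 - 4 * x / 3 + x**4 / 3)

# invented pressure drops (Pa) and matching flow rates (m^3/s)
dp = np.array([8e3, 12e3, 16e3, 20e3])
q = np.array([2.7e-7, 1.8e-6, 3.6e-6, 5.5e-6])

(tau0, mu), _ = curve_fit(buckingham_reiner, dp, q, p0=(20.0, 10.0))
print(f"yield stress ~ {tau0:.1f} Pa, plastic viscosity ~ {mu:.1f} Pa·s")
```

Because the equation is nonlinear in τ₀ and μ, a least-squares fit over several pressure/flow operating points, as sketched here, is a natural way to extract both parameters from the in-line sensor data.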