Chemistry and Process Engineering
Due to the rapid growth of industry and the use of metal-containing compounds such as sewage sludge on agricultural fields, soil pollution by heavy metals poses a severe threat to the environment. Worldwide, there are already 5 million sites with heavy-metal-contaminated soil [1]. Some heavy metal pollutants can affect food chain safety and food quality, which in turn affects human health. According to the German Federal Soil Protection and Contaminated Sites Ordinance (BBodSchV), 13 heavy metals such as arsenic (As), lead (Pb) and cadmium (Cd) are classified as highly toxic to human health [2]. Therefore, elemental analysis and precise quantification of heavy metals in soil are of great importance.
Inductively coupled plasma mass spectrometry (ICP-MS) has emerged as a powerful technique for trace analysis of soil due to its multi-element capability, high sensitivity and low sample consumption. However, despite its success and widespread use, ICP-MS has several persistent drawbacks, such as high argon gas consumption, argon-based polyatomic interferences and the need for complex RF power generators. Unlike argon-based ICP, microwave inductively coupled atmospheric pressure plasma mass spectrometry (MICAP-MS) uses nitrogen as the plasma gas, which eliminates the high operating costs associated with argon consumption as well as the argon-based interferences [3]. In this work, the applicability of MICAP-MS to the elemental analysis of environmental soils is investigated for the first time. For this purpose, seven reference and three random soil samples containing vanadium (V), cobalt (Co), zinc (Zn), copper (Cu), chromium (Cr), mercury (Hg), As, Pb and Cd are digested with aqua regia and used for analysis. The concentrations of the selected elements are determined using MICAP-MS and validated using ICP-MS. Sensitivities, limits of detection and gas consumption for both methods are compared and discussed in detail. Moreover, the performance of MICAP-MS under different nitrogen plasma gas concentrations is investigated and compared.
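As an illustration of how such a sensitivity and limit-of-detection comparison is typically evaluated, the following Python sketch applies the common 3-sigma criterion (LOD = 3·σ_blank / calibration slope) to calibration data. All numbers, units and element assignments are placeholder assumptions for illustration, not results from this work.

```python
import numpy as np

def sensitivity_and_lod(concentrations, intensities, blank_counts):
    """Calibration sensitivity (slope) and 3-sigma limit of detection."""
    slope, intercept = np.polyfit(concentrations, intensities, 1)  # counts/s per (µg/L)
    lod = 3.0 * np.std(blank_counts, ddof=1) / slope               # µg/L
    return slope, lod

# Placeholder calibration data for a single isotope (illustrative values only)
conc = np.array([0.0, 1.0, 5.0, 10.0, 50.0])                   # µg/L
counts_micap = np.array([120., 950., 4600., 9100., 45500.])    # counts/s, MICAP-MS
counts_icp = np.array([150., 1250., 6100., 12200., 60800.])    # counts/s, ICP-MS
blank = np.array([118., 125., 122., 119., 121.])               # repeated blank readings

for name, y in (("MICAP-MS", counts_micap), ("ICP-MS", counts_icp)):
    s, lod = sensitivity_and_lod(conc, y, blank)
    print(f"{name}: sensitivity = {s:.0f} counts/s per µg/L, LOD = {lod:.3f} µg/L")
```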
Laser-induced plasmas are widely used in many areas of science and technology; examples include spectrochemical analysis, thin film deposition, and material processing. Several topics will be addressed. First, general phenomenology of laser-induced plasmas will be discussed. Then, a chemical model will be presented based on a coupled solution of Navier-Stokes, state, radiative transfer, material transport, and chemical equations. Results of computer simulations for several chemical systems will be shown and compared to experimental observations obtained by optical imaging, spectroscopy, and tomography. The latter diagnostic tools will also be briefly discussed. Finally, a prospective application of laser-induced plasma and plasma modeling will be illustrated on the example of chemical vapor deposition of molybdenum borides and micro processing and coating of titanium dental implants.
IRWG strategy update
(2022)
In view of the increasing digitization of research and the use of data-intensive measurement and analysis methods, research institutions and their staff are faced with the challenge of documenting a constantly growing volume of data in a comprehensible manner, archiving it for the long term, and making it available for discovery and re-use by others in accordance with the FAIR principles. At BAM, we aim to facilitate the integration of research data management (RDM) strategies throughout the whole research cycle, from the creation and standardized description of materials datasets to their publication in open repositories. To this end, we present the BAM Data Store, a central system for internal RDM that fulfills the heterogeneous demands of materials science and engineering labs. The BAM Data Store is based on openBIS, open-source software developed at ETH Zurich that was originally created for life-science laboratories but has since been deployed in a variety of research domains. The software offers a browser-based user interface for the digital representation of lab inventory entities (e.g., samples, chemicals, instruments, and protocols) and an electronic lab notebook for the standardized documentation of experiments and analyses.
To investigate whether openBIS is a suitable framework for the BAM Data Store, we carried out a pilot phase during which five research groups with employees from 16 different BAM divisions were introduced to the software. The pilot groups were chosen to represent a diverse array of domain use cases and RDM requirements (e.g., small vs. large data volumes, heterogeneous vs. structured data types) as well as varying levels of prior IT knowledge on the users' side.
Overall, the results of the pilot phase are promising: while the creation of custom data structures and metadata schemas can be time-intensive and requires the involvement of domain experts, the system offers specific benefits in the form of simplified documentation and automation of research processes, and it constitutes a basis for data-driven analysis. In this way, heterogeneous research workflows in various materials science research domains could be implemented, from the synthesis and characterization of nanomaterials to the monitoring of engineering structures. In addition to the technical deployment and the development of domain-specific metadata standards, the pilot phase also highlighted the need for suitable institutional infrastructures, processes, and role models. An institute-wide rollout of the BAM Data Store is currently being planned.
Closed material cycles and unmixed material fractions are required to achieve high recovery and recycling rates in the building industry. The growing diversity of construction and demolition waste is leading to increasing difficulties in separating the individual materials. Manual sorting involves many risks and hazards for the executing staff and relies solely on obvious, visually detectable differences for separation. Automated, sensor-based sorting of these building materials could complement or replace this practice to improve processing speed, recycling rates, sorting quality, and prevailing health conditions.
A joint project of partners from industry and research institutions approaches this task by investigating and testing the combination of laser-induced breakdown spectroscopy (LIBS) and visual (VIS)/ near-infrared (NIR) spectroscopy. Joint processing of information (data fusion) is expected to significantly improve the sorting quality of various materials like concrete, main masonry building materials, organic components, etc., and may enable the detection and separation of impurities such as SO3-containing building materials (gypsum, aerated concrete, etc.). Focusing on Berlin as an example, the entire value chain will be analyzed to minimize economic/technological barriers and obstacles at the cluster level and to sustainably increase recovery and recycling rates.
First LIBS measurements show promising results in distinguishing various material types. A meaningful validation shall be achieved with further practical samples. Future work will investigate the combination of LIBS and VIS/NIR spectroscopy in a fully automated measurement setup with conveyor belt speeds of 3 m/s.
Laser metal deposition is a rapidly evolving method for additive manufacturing that combines high performance with a simplified production routine. The quality of production depends on the instrument design and the operational parameters, which require constant control during the process. In this work, the feasibility of using optical spectroscopy as a control method is studied both by modeling and experimentally. A simplified thermal model is developed based on the time-dependent diffusion-conduction heat equation and the geometrical light collection into the detection optics. The intense light emitted by a laser-heated spot moving across the sample surface is collected and processed to yield the temperature and other temperature-related parameters. In the presence of surface defects, the temperature field is distorted in a specific manner that depends on the shape and size of the defect. Optical signals produced by such distorted temperature fields are simulated and verified experimentally using a 3D metal printer and a sample with artificially carved defects. Three quantities are tested as possible metrics for monitoring the process: temperature, integral intensity, and correlation coefficient. The shapes of the simulated signals qualitatively agree with the experimental signals; this allows for the cautious inference that optical spectroscopy can detect surface defects and, possibly, predict their character, e.g., internal or protruding.
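To make the monitoring metrics concrete, the following Python sketch computes two of the three quantities named above, the integral intensity and the correlation coefficient, for a measured trace compared against a defect-free reference. The traces, units and the simple Gaussian signal shape are assumptions for illustration only; evaluating the temperature would additionally require spectral (e.g., two-color) information.

```python
import numpy as np

def integral_intensity(signal):
    """Summed optical intensity over the evaluation window."""
    return float(np.sum(signal))

def correlation_metric(signal, reference):
    """Pearson correlation between the measured trace and a defect-free reference."""
    return float(np.corrcoef(signal, reference)[0, 1])

# Hypothetical traces along the scan path (arbitrary units)
x = np.linspace(0.0, 10.0, 500)                 # scan position, mm
reference = np.exp(-((x - 5.0) / 1.5) ** 2)     # smooth defect-free signal
measured = reference.copy()
measured[240:260] *= 0.7                        # local dip mimicking a surface defect

print("integral intensity:", integral_intensity(measured))
print("correlation with reference:", correlation_metric(measured, reference))
```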
A new method combining isotope dilution mass spectrometry (IDMS) and standard addition has been developed to determine the mass fractions w of different elements in complex matrices: (a) silicon in aqueous tetramethylammonium hydroxide (TMAH), (b) sulfur in biodiesel fuel, and (c) iron bound to transferrin in human serum. All measurements were carried out using inductively coupled plasma mass spectrometry (ICP-MS). The method requires the gravimetric preparation of several blends (b_i), each consisting of roughly the same masses m_x,i of the sample solution (x) and m_y,i of a spike solution (y) plus different masses m_z,i of a reference solution (z).
Only these masses and the isotope ratios R_b,i in the blends and in the reference and spike solutions have to be measured. The derivation of the underlying equations based on linear regression is presented and compared to a related concept reported by Pagliano and Meija. The achievable uncertainties, e.g., for the Si blank in extremely pure TMAH, u_rel(w(Si)) = 90 % (linear regression method, this work) versus u_rel(w(Si)) = 150 % (method reported by Pagliano and Meija), suggest that the new method is better suited for practical use owing to the higher robustness of the regression analysis.
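Since the method reduces the blend measurements to a linear regression, the core numerical step can be sketched as an ordinary least-squares fit with standard uncertainties for slope and intercept. The sketch below is generic Python; the regression variables x_i and y_i stand for a hypothetical reduction of the masses and measured ratios R_b,i and are not the authors' actual working equations.

```python
import numpy as np

def linear_fit_with_uncertainty(x, y):
    """Ordinary least-squares fit y = a + b*x with standard uncertainties of a and b."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    n = x.size
    b, a = np.polyfit(x, y, 1)                      # slope, intercept
    residuals = y - (a + b * x)
    s2 = np.sum(residuals ** 2) / (n - 2)           # residual variance
    sxx = np.sum((x - x.mean()) ** 2)
    u_b = np.sqrt(s2 / sxx)
    u_a = np.sqrt(s2 * (1.0 / n + x.mean() ** 2 / sxx))
    return a, b, u_a, u_b

# Hypothetical regression variables derived from the blend masses and measured ratios R_b,i
x_i = [0.0, 0.5, 1.0, 1.5, 2.0]          # e.g. m_z,i / m_x,i (placeholder)
y_i = [0.02, 0.51, 1.01, 1.49, 2.02]     # reduced response (placeholder)

a, b, u_a, u_b = linear_fit_with_uncertainty(x_i, y_i)
print(f"intercept a = {a:.4f} +/- {u_a:.4f}, slope b = {b:.4f} +/- {u_b:.4f}")
```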
The residual stresses and load transfer in multiphase metal alloys and their composites (with both random planar-oriented short fibers and particles) will be shown, as studied by neutron diffraction, by X-ray computed tomography, and by a model based on the reformulation of classic Maxwell’s homogenization method.
Contrary to common understanding and state-of-the-art models, we experimentally observe that randomly oriented phases possess non-hydrostatic residual stress. Moreover, we disclose that the unreinforced matrix alloy stays under hydrostatic compression even under external uniaxial compression.
The recently developed modeling approach allows calculating the residual stress in all phases of the composites. It rationalizes the presence of deviatoric stresses by accounting for the interaction of randomly oriented phases with fibers having a preferential orientation. It also allows the explanation of the unconventional in-situ behavior of the unreinforced alloy and the prediction of the micromechanical behavior of other similar alloys.
The overall interest in nanotoxicity, triggered by the increasing use of nanomaterials in the material and life sciences, and the synthesis of an ever increasing number of new functional nanoparticles call not only for standardized test procedures [1,2] and for efficient approaches to screen the potential genotoxicity of these materials, but also for standardized and validated methods for surface analysis [4,5].
The analysis and quantification of surface chemistry is hence the focus of an increasing number of standardization organizations, and interlaboratory comparisons with different analytical methods are being carried out [5]. For the monitoring of nanomaterial synthesis and the fast assessment of the number of functional groups such as carboxyl and amino functionalities, which are very commonly used in the life sciences, simple and validated methods are needed that can be performed with common laboratory instrumentation [5,6]. Here we provide a brief overview of the ongoing research in division Biophotonics employing quantitative NMR (qNMR), conductometry, and colorimetric and fluorometric optical assays for the determination of the total and the accessible number of carboxyl and amino groups on differently sized polymer and silica nano- and microparticles [5-7].
The fabrication of laser-generated surface structures on titanium and titanium alloys has recently gained remarkable interest, being technologically relevant for applications in optics, medicine, fluid transport, tribology, and the wetting of surfaces. The morphology of these structures, and hence their chemistry, is influenced by the different laser processing parameters such as the laser fluence, wavelength, pulse repetition rate, the effective number of laser pulses per beam spot area, etc. A simple way to characterize laser-generated surface structures is by means of optical microscopy (OM) or white light interference microscopy (WLIM). The latter can address the surface topography while having a lateral resolution limit of ~λ/2 (λ = illumination wavelength). To resolve morphologies with spatial periods significantly smaller than λ/2, scanning electron microscopy (SEM) is often used, taking advantage of the reduced de Broglie wavelength associated with electrons of several keV energy. However, all the above-mentioned techniques lack the necessary depth resolution to reveal and quantify sub-surface material modifications of these laser-generated structures. Time-of-flight secondary ion mass spectrometry (ToF-SIMS) represents a promising surface analytical technique for studying laser-induced chemical surface alterations, since the method combines high surface sensitivity with the capability to perform depth profiling of the laser-affected surface zone. In this study we combine WLIM and high-resolution SEM with ToF-SIMS to fully characterize the evolution of various types of laser-generated micro- and nanostructures formed on Ti-6Al-4V alloys upon irradiation by near-infrared ultrashort laser pulses (1030 nm, 925 fs) at different laser fluence levels, effective numbers of pulses, and pulse repetition rates (1 – 400 kHz). We show how this combined surface analytical approach allows us to evaluate alterations in the surface chemistry and topography of the laser-generated surface structures depending on the laser processing parameters.
Nanoforms with at least one dimension below 100 nm have an important part to play in more and more areas of our daily life. Therefore, risk assessment of these materials is becoming increasingly important. In this context, the European Chemicals Agency (ECHA) considered eleven physico-chemical properties as relevant, of which the following six are essential for the registration: chemical composition, crystallinity, particle size, particle shape, surface chemistry and specific surface area. Four of these priority properties can be obtained with electron microscopy and surface analytics like XPS and ToF-SIMS. The reliability of these data must be ensured, especially for their use in grouping and read-across approaches. On the other hand, the "reproducibility" crisis has revealed major shortcomings in the reliability of published data.
In a case study, we show how the quality of the data can be ensured by using existing standards and protocols for each step in the workflow of sample characterization. As exemplary samples, two Al-coated TiO2 nanopowders were selected from the JRC repository, capped either with a hydrophilic or a hydrophobic organic ultrathin shell. SEM results provided the size and shape of the nanoparticles, and a first overview of the composition was obtained with EDS. XPS and ToF-SIMS supplied the surface chemistry, in particular information about the shell and the coating of the particles. Standards and protocols for all steps of the analytical workflow, including preparation and data reduction, are discussed with regard to reliable and reproducible data. Additionally, uncertainties for the different steps are specified.
Only such a detailed description of all these factors allows a comprehensive physico-chemical characterization of the nanoparticles and thereby supports the assessment of their potential risk.
VAMAS - Enabling international standardisation for increasing the take-up of Emerging Materials
(2022)
VAMAS (Versailles Project on Advanced Materials and Standards) supports world trade in products dependent on advanced materials technologies by providing the technical basis for harmonized measurements, testing, specifications, reference materials and standards. The major tools for fulfilling this task are interlaboratory comparisons (ILCs). The organisational structure of VAMAS is presented, and it is discussed how a new technical activity can be initiated.
In the face of rising energy demand and impending climate change, the development of sustainable, fossil-free fuel and chemical production is of global importance. One possible goal is the development of electrochemical conversion processes using catalysts. Porous materials play an important role in such energy applications. The activity and stability of each catalyst are highly dependent on the properties of the coating, i.e., phase composition, crystallinity, accessible surface area, and many other factors. The key to the development of improved catalysts is a better understanding of the relations between their performance, stability and physico-chemical properties. However, the complex morphology of such catalysts constitutes a challenge even for modern analytical techniques. Spectroscopic ellipsometry (SE) is a versatile method for studying material properties by using appropriate models (e.g., film thickness, optical and electronic properties). Ellipsometric models need to be validated in order to produce accurate results. In a first step, the model for the ellipsometric fit studies of a calcination series of mesoporous iridium oxide films (300 – 600 °C) was investigated and validated with respect to the material properties [4]. The information on the electronic structure of the catalysts shows a direct correlation with the electrochemical activities. The development of an environmental electrochemical cell offers the possibility of investigations under operando conditions. Thus, changes in optical and electronic properties can be induced and monitored during the electrocatalytic oxygen evolution reaction.
A brief introduction is given to our data collection and organization procedure and to why we have settled on the HDF5-based NeXus format for describing experimental data.
The link between NeXus and the SciCat data catalog is also described, showing how the NeXus metadata are automatically added as searchable metadata in the catalog.
Due to the diffusive nature of heat propagation in solids, the detection and resolution of internal defects with active-thermography-based non-destructive testing is commonly limited to a defect-depth-to-defect-size ratio greater than or equal to one. We have recently demonstrated that this limitation can be overcome by using a spatially modulated illumination source and photothermal super-resolution-based reconstruction. Furthermore, by relying on compressed sensing and computational imaging methods we were able to significantly reduce the experimental complexity and make the method viable for investigating larger regions of interest. In this work we share our progress on improving the defect/inhomogeneity characterization using fully 2D spatially structured illumination patterns instead of scanning with a single laser spot. The experimental approach is based on repeated blind pseudo-random illumination using modern projector technology and a high-power laser. In the subsequent post-processing, several measurements are combined by exploiting the joint sparsity of the defects within the sample and applying a 2D photothermal super-resolution reconstruction. Here, enhanced nonlinear convex optimization techniques are utilized to solve the underlying ill-determined inverse problem for typical simple defect geometries. As a result, a higher-resolution defect/inhomogeneity map can be obtained in a fraction of the measurement time previously needed.
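The joint-sparsity reconstruction named above can be illustrated with a minimal iterative shrinkage-thresholding (ISTA) sketch in Python that promotes a sparsity pattern shared across several illumination patterns via row-wise (l2,1) soft-thresholding. The measurement operator, sizes and regularization weight are toy assumptions, not the actual photothermal forward model or the enhanced solvers used in the work.

```python
import numpy as np

def row_soft_threshold(X, thr):
    """Row-wise (l2,1) soft-thresholding: promotes a sparsity pattern shared by all columns."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    return X * np.maximum(1.0 - thr / np.maximum(norms, 1e-12), 0.0)

def ista_joint_sparsity(A, Y, lam=0.05, n_iter=300):
    """Minimise 0.5*||A X - Y||_F^2 + lam * sum_i ||X[i, :]||_2 by ISTA."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # Lipschitz-safe step size
    X = np.zeros((A.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        X = row_soft_threshold(X - step * (A.T @ (A @ X - Y)), step * lam)
    return X

# Toy 1-D demonstration: three blind illumination patterns share one defect location
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 60))                   # hypothetical measurement operator
X_true = np.zeros((60, 3))
X_true[25, :] = 1.0                                 # the shared "defect" row
Y = A @ X_true + 0.01 * rng.standard_normal((40, 3))

X_rec = ista_joint_sparsity(A, Y)
print("recovered defect index:", int(np.argmax(np.linalg.norm(X_rec, axis=1))))
```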
Additively manufactured (and in particular laser powder bed fused) materials represent a manifold challenge for the materials scientist and engineer because of their distinctive microstructure. If laser powder bed fusion is used to produce components, the complexity level increases because meso-structures (e.g., overhanging features, surface and internal defects) gain importance. Furthermore, if the main advantage of additive manufacturing, i.e., the freedom of design, is to be fully exploited, and geometrically complex structures, such as lattices, are manufactured, then such structures become meta-materials. This means that the geometry and the materials properties become equally important.
This matryoshka-like (more literary than the dry "multi-scale") complexity makes the characterization of residual stress fields by means of diffraction methods so difficult with the tools currently available that new paradigms are necessary to tackle the challenge.
Indeed, classic open problems acquire an extra layer of difficulty, such that new solutions need to be found and the sometimes-dormant debate needs to be re-opened. Examples include the determination of: (a) the unstrained reference, which can become location-dependent and needs to be carefully determined; (b) the so-called diffraction elastic constants, which become immensely challenging since even the single-crystal elastic constants are not known for additively manufactured materials.
On top of this, other problems arise. The determination of the principal axes of stress becomes non-trivial because the hatching strategy sometimes dominates over the sample geometry. Even further, in complex structures such as lattices, the textbook statement that strain measurements in six independent directions uniquely identify the strain tensor becomes simply invalid. The peculiar surface features of additively manufactured materials transform trivial tasks into formidable challenges: the precise alignment of a specimen in a beam or the determination of surface stresses with laboratory X-rays needs to be thoroughly re-discussed and is far from being a routine task.
In this paper, we will show a few examples of the cases mentioned above. We will demonstrate that sometimes the classic approach works very well, but at other times surprising conclusions can be drawn from in-depth studies of the residual stress in additively manufactured materials. In short, we contend that classic methods cannot be used on additively manufactured materials and structures without a critical evaluation of their validity and application range.
Quality assessment of components produced by metal-based additive manufacturing (AM) technologies such as laser powder bed fusion is rising in importance due to the increased use of AM in industrial production. Here, the presence of internal porosity was identified as a limiting factor for the final component quality. The use of thermography as an in-situ monitoring technique allows the determination of the part's thermal history, which was found to be connected to porosity formation [1]. Combining the local thermal information derived from thermography with the porosity information obtained by X-ray micro-computed tomography, machine learning algorithms can be utilized to predict the porosity distribution in the part. In this study, a first approach for the prediction of keyhole porosity in a cylindrical specimen of AISI 316L stainless steel is presented. It is based on data augmentation using the "SmoteR" algorithm [2] to compensate for the dataset imbalance and on a 1-dimensional convolutional neural network.
[1] C.S. Lough et al., Local prediction of Laser Powder Bed Fusion porosity by short-wave infrared thermal feature porosity probability maps. Journal of Materials Processing Technology, 302, p. 117473 (2022)
https://dx.doi.org/10.1016/j.jmatprotec.2021.117473
[2] L. Torgo et al., SMOTE for Regression. Progress in Artificial Intelligence, Chapter 33, p. 378-389 (2013)
https://dx.doi.org/10.1007/978-3-642-40669-0_33
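As a rough illustration of the modeling step described above, the following PyTorch sketch defines a small 1-dimensional convolutional network that maps a local thermal-history sequence to a porosity probability. The layer sizes, the sequence length of 128 and the sigmoid output are illustrative assumptions; the SmoteR resampling and the actual network used in the study are not reproduced here.

```python
import torch
import torch.nn as nn

class Porosity1DCNN(nn.Module):
    """Minimal 1-D CNN mapping a thermal-history sequence to a porosity probability."""
    def __init__(self, seq_len=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * (seq_len // 4), 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),
        )

    def forward(self, x):              # x: (batch, 1, seq_len)
        return self.head(self.features(x))

model = Porosity1DCNN()
dummy = torch.randn(8, 1, 128)         # eight hypothetical thermal-feature sequences
print(model(dummy).shape)              # -> torch.Size([8, 1])
```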
Atmospheric pressure plasmas interact in various physical ways with their surroundings. They release heat and generate charge carriers, which leads to two effects. The first effect is the generation of acoustic waves due to heat release, known as the thermoacoustic effect, and the second effect is the perturbation of the resting fluid provoked by the release of charge carriers, called “ionic wind”. The direct connection between the charge carrier production of the discharge arrangement and the surroundings also allows the detection of acoustic waves by tracking the electrical current of the arrangement.
This contribution introduces a multi-fluid model approach to describe the acoustic interaction of atmospheric plasmas. In addition, we present experimental results on commercially available and in-house fabricated discharge arrangements for either emitting or receiving acoustic waves.
The prerequisites for a successful energy transition, for the economic use of hydrogen as a clean, green energy carrier and for H2 readiness are a rapid market ramp-up and the establishment of the required value chains. Reliable quality and safety standards for innovative technologies are the prerequisite for ensuring security of supply, environmental compatibility and sustainable climate protection, and for building trust in these technologies, and they thus enable product and process innovations.
With the Competence Centre "H2Safety@BAM", BAM is creating the safety-related prerequisites for the successful implementation of hydrogen technologies at the national as well as the European level. BAM uses decades of experience with hydrogen technologies to develop the necessary quality and safety standards.
The presentation will span an arc from the typical basic tasks of BAM in the competence field "Sensors, analytics and certified reference materials", such as the maintenance and dissemination of the national gas composition standards for calorific value determination as the Designated Institute for Metrology in Chemistry within the framework of the Metre Convention, to the further development of measurement and sensor technology for these tasks. For the certification of reference materials, a mostly slow and time-consuming but solid reference analysis is common. With hydrogen and its special properties, completely new requirements are added. In addition, fast and simple online analysis is required for process control, for example to register quality changes during load changes or refuelling processes.
Dust deposition is an important source of phosphorus (P) to many ecosystems. However, there is little evidence of dust-derived P-containing minerals in soils. Here we studied P forms along a well-described climatic gradient on Hawaii, which is also a dust deposition gradient. Soil mineralogy and soil P forms from six sites along the climatic gradient were analyzed with bulk (X-ray diffraction and P K-edge X-ray absorption near edge structure) and microscale (X-ray fluorescence, P K-edge X-ray absorption near edge structure, and Raman) analysis methods. In the wettest soils, apatite grains ranging from 5 to 30 μm in size were co-located at the microscale with quartz, a known continental dust indicator, suggesting recent atmospheric deposition. In addition to the co-location with quartz, further evidence of dust-derived P included backward trajectory modeling indicating that dust particles could be brought to Hawaii from the major global dust-loading areas in central Asia and northern Africa. Although it is not certain whether the individual observed apatite grains were derived from long-distance transport of dust or from local dust sources such as volcanic ash or windblown fertilizer, these observations offer direct evidence that P-containing minerals have reached the surface layers of highly weathered grassland soils through atmospheric deposition.
The fabrication of nanostructures with ever-decreasing sizes has increased the demand for suitable characterization methods that allow their shape and size to be determined at the true nanoscale and, equally important, enable the investigation of their optical properties beyond the diffraction limit. Due to their high spectral and spatial resolution down to the (sub-)nanometer range, electron-beam-based techniques, namely cathodoluminescence (CL), have become powerful characterization tools, particularly for studying plasmonic and dielectric nanostructures. However, the interpretation of the resulting spectral CL maps is not always straightforward.
In this work, Mie resonances in single Si nanospheres of different sizes have been systematically studied using experimental CL spectroscopy and an analytical CL model. For smaller spheres (r ~ 75 nm), the eigenmodes can be unequivocally identified, with relative changes in the intensity of the electric and magnetic dipole depending on the electron beam position within the sphere. However, in larger spheres (r ~ 105 nm), the modal assignment becomes increasingly difficult due to the larger number of Mie modes in the visible spectral range. Additionally, penetrating electron beams generate two radiating dipoles at the two Si interfaces, due to the electron and its image charge collapsing at those interfaces, which can, depending on the electron beam's velocity and its path length inside the particle, produce distinct resonances or dips (constructive or destructive interference of those two radiating dipoles). It is demonstrated that, superimposed on the eigenmodes of the studied nanospheres, these resonances can distort the recorded spectrum and lead to a potentially erroneous assignment of modal characters to the spectral features. An intuitive analogy is developed to unambiguously distinguish those resonances induced by transition radiation from the nanoparticle-specific Mie resonances.
In current Li-ion batteries, electrode materials typically consist of inorganic materials such as LiCoO2, LiNixCoyMn1-zO2, LiFePO4 and Li4Ti5O12. These materials struggle with toxicity or limited mineral resources, making them expensive. Therefore, eco-friendly, sustainable, and low-cost alternatives have been researched in recent years. A series of organic compounds has been investigated as electrode materials for alkali-ion batteries. Among them, organic carbonyl-based materials show reversible storage of lithium or sodium ions. Metal terephthalates stand out with their easy synthesis, moderate operational voltage and enhanced dissolution stability compared to other organic compounds. One degradation pathway consists of the dissolution of the electrode material by HF, which is formed in a side reaction of water with the fluorine-containing electrolyte. Fluorinated metal terephthalates could offer higher dissolution stability against HF and less contamination by water due to their increased hydrophobicity.
The goal of this project is the synthesis and the investigation of the formation mechanism of a series of calcium-based MOFs with increasing fluorine content. For this purpose, we aimed for the construction of calcium-based MOFs with terephthalic acid (H2-pBDC), 2-fluoro-terephthalic acid (H2-2F-pBDC), 2,3,4,5-tetrafluoroterephthalic acid (H2-pBDC-F4), isophthalic acid (H2-mBDC) and 5-fluoro-isophthalic acid (H2-5F-mBDC).
Colours, nano and surface
(2022)
The basic ideas of colours and pigments are presented. The correlation between nanomaterials and colours is explained. Different methods for investigating nanoparticles and their surface are presented. Finally, a case study is presented that explains the importance of the coating for the properties of nanoparticles.
A brief introduction will be given on modeling chemical reactions in laser-induced plasmas using stoichiometric and non-stoichiometric approaches. Several applications that can benefit from such modeling will be considered. These include plasma-enhanced chemical vapor deposition (PECVD), surface modification and surface coating, and molecular analysis by LIBS. Each application will be illustrated by simulations of relevant chemical systems. For PECVD, the chemical systems are BCl3/H2/Ar, BF3/H2/Ar, BCl3/BF3 and Mo/BF3/H2; for surface modification/coating it is Ti/air; for molecular LIBS they are CaCO3/Ar, Ca(OH)2/Ar, and CaCl2/Ar. Advantages and shortcomings of equilibrium chemical hydrodynamic models of laser-induced plasmas will be discussed.
As an alternative to conventional transmission-based radiography and computed tomography, X-ray refraction techniques are being increasingly used to detect damage in light materials. In fact, their range of application has recently been extended even to metals. The big advantage of X-ray refraction techniques is that they are able to detect nanometric defects whose size lies below the resolution of even state-of-the-art synchrotron-based X-ray computed tomography (SXCT). The superiority of synchrotron X-ray refraction radiography and tomography (SXRR and SXRCT) has been shown in the case of light materials, in particular composites. X-ray refraction techniques also yield a quantification of the amount of damage (the so-called relative internal specific surface) and can thus be compared well with damage models. At the same time, it is impossible for SXRR and SXRCT to image single defects. We show that the combination of refraction- and transmission-based imaging techniques yields an impressive amount of additional information about the type and amount of defects in microstructured materials such as additively manufactured metals or metal matrix composites. We also show that the use of data fusion techniques allows the classification of defects in statistically significant representative volume elements.
Turbine blades for gas turbines are exposed to extreme working conditions in a demanding environment. In-service inspection, maintenance and refurbishment of the heavily stressed parts is necessary to ensure both safety and efficiency, e.g. based on immersion ultrasound testing (UT).
In the course of NDE 4.0, the European project MRO 2.0 aims to innovate the maintenance, repair and overhaul of turbine blades by linking these with modern digital methods. To this end, the goal of the project is to go beyond conventional automated and manual UT techniques.
The aim is to measure the actual geometry and wall thickness of the complex-shaped parts by applying an adaptive total focusing method (TFM) that takes into account the refraction of the ultrasonic waves at the transition from the coupling material (water) to the inspected part (steel). In this setup the phased array probe is held by a robotic arm that allows the part to be scanned while remaining mainly perpendicular to the inspected surface. In this way, even complex geometries can be inspected and a 3D model of the actual condition of the part can be created.
The laboratory setup is equipped with a Vantage 64 phased array instrument from Verasonics Inc. and an industrial robot from ABB. A 64 element linear array probe operating at 10 MHz is attached to the robot.
The focus is on optimizing resolution, reliability and inspection speed, as the reconstructed model will be fed to the digital twin at a later stage of the project and used for targeted repairs. In addition to enhancing the reconstruction algorithms, the required probe geometry and the parameters needed to inspect turbine blades with partially thin walls and anisotropic materials will also be investigated.
This talk will describe the 3-year project and present the results of the first year. The main focus will be on the development of the reconstruction algorithms used and the experimental setup.
Surface modification of titanium by laser ablation is investigated theoretically and experimentally. The modification consists of texturing the surface and redepositing chemically transformed material from the ablation plasma. The redeposition is driven by the hydrodynamic flow in the plasma. Such surface modification improves the biocompatibility of titanium implants.
This course will provide an introduction to plasma diagnostic techniques. The major focus of the course will be on the discussions of the practical procedures as well as the underlying physical principles for the measurements of plasma fundamental characteristics (e.g., temperatures and electron number density). Particular emphasis will be placed on laser induced plasma–atomic emission spectrometry, but other analytical plasmas will also be used as examples when appropriate. Selected examples on how one can manipulate the operating conditions of the plasma source, based on the results of plasma diagnostic measurements, to improve its performance used for spectrochemical analysis will also be covered. Topics to be covered include thermal equilibrium, line profiles, temperatures, electron densities, excitation processes, temporal and spatial resolution.
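One of the standard diagnostics mentioned above, the excitation temperature obtained from a Boltzmann plot, can be sketched in a few lines of Python: plotting ln(Iλ/(gA)) against the upper-level energy gives a slope of -1/(k_B·T). The line data below are synthetic placeholders, not a real line list.

```python
import numpy as np

K_B_EV = 8.617333262e-5   # Boltzmann constant, eV/K

def boltzmann_temperature(intensity, wavelength_nm, g_upper, A_ul, E_upper_eV):
    """Excitation temperature from a Boltzmann plot: ln(I*lambda/(g*A)) vs E_upper."""
    y = np.log(intensity * wavelength_nm / (g_upper * A_ul))
    slope, _ = np.polyfit(E_upper_eV, y, 1)         # slope = -1/(k_B * T)
    return -1.0 / (K_B_EV * slope)

# Synthetic emission-line data (placeholders, not a real line list)
E_up = np.array([3.3, 4.0, 4.6, 5.1])                # upper-level energies, eV
g_up = np.array([5.0, 7.0, 9.0, 3.0])                # statistical weights
A_ul = np.array([2.0e7, 6.5e7, 4.0e7, 1.2e8])        # transition probabilities, 1/s
wl = np.array([510.0, 480.0, 460.0, 430.0])          # wavelengths, nm
T_assumed = 9000.0                                   # K, used to synthesise intensities
I = g_up * A_ul / wl * np.exp(-E_up / (K_B_EV * T_assumed))

print(f"recovered excitation temperature: {boltzmann_temperature(I, wl, g_up, A_ul, E_up):.0f} K")
```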
In order to draw conclusions about the material parameters from experimentally determined dispersion maps of guided ultrasonic waves, various inverse methods are being discussed in current research. Machine learning, and in particular convolutional neural networks (CNNs), offers one way to automate inverse modeling and the evaluation of image data. In this talk, synthetic data are used to show how the propagation behavior of guided ultrasonic waves can be exploited with CNNs to determine the isotropic elastic constants of a plate-like structure.
There are many different methods to characterize air-coupled ultrasonic transducers for non-destructive testing. Data sheets of various manufacturers contain information about some parameters important for the performance of transducers, but this information is not standardized, so a comparison between probes of different manufacturers is difficult. Therefore, the German Society for Non-Destructive Testing (DGZfP) is working on a guideline for the characterization of air-coupled probes.
One of the topics in this guideline is the application of thermoacoustic transducers for the characterization of receivers, and another topic is the application of microphones for the characterization of transmitters. In this presentation we compare various characterization methods with a particular focus on the characterization of thermoacoustic transducers using an optical microphone. Both thermoacoustic transmitters and optical microphones have a very large bandwidth compared to conventional air-coupled transducers, but their spectrum is not entirely flat, which needs to be taken into account if they are applied as reference transducers.
Ultrasonic monitoring, making use of the sensitivity of the coda of repeated transmission measurements to changes in stress, temperature and moisture, as well as to localized or distributed damage, has recently gained attention in structural health monitoring (SHM) research. Analysis methods such as coda wave interferometry (CWI), including its nonlinear extension, have been shown to be able to measure ultrasonic wave velocity changes with a resolution of 1·10⁻⁵, while indicators such as cross-correlation or cross-coherence have been used to distinguish between reversible and irreversible changes. Several small- and large-scale laboratory experiments have demonstrated that stress changes in structures can be captured and damage detected at a very early stage. The use of this technique for pre-warning before failure is currently under investigation, as is detailed research on the physical causes and the connection between ultrasonic wave properties and material/structural behavior. Recently, several large-scale laboratory and real structures have been instrumented with embedded ultrasonic transducers to gather experience and evidence on how to use this technology in real-world applications. Preliminary results from installations on a new bridge, an existing bridge, a tunnel, a laboratory earthquake test and a historic stadium in Germany, Poland, and the United States are presented. Environmental influences (mainly temperature) and validation by load tests are discussed.
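For readers unfamiliar with CWI, the velocity change dv/v is often estimated with the stretching technique, sketched below in Python: the reference coda is evaluated at stretched times t·(1 + ε) and the ε that maximises the correlation with the current trace is taken as dv/v. The synthetic 200 Hz coda and the assumed 5·10⁻⁴ velocity change are placeholders for illustration only.

```python
import numpy as np

def stretching_dv_v(reference, current, t, eps_grid):
    """CWI stretching method: dv/v that maximises the correlation between the
    current trace and the reference evaluated at stretched times t*(1 + eps)."""
    cc = np.array([np.corrcoef(current, np.interp(t * (1.0 + eps), t, reference))[0, 1]
                   for eps in eps_grid])
    best = int(np.argmax(cc))
    return eps_grid[best], cc[best]

# Synthetic coda: reference trace and a copy with a 5e-4 relative velocity increase
t = np.linspace(0.0, 1.0, 5000)                          # s
ref = np.sin(2 * np.pi * 200.0 * t) * np.exp(-3.0 * t)   # decaying 200 Hz coda (placeholder)
dv_true = 5e-4
cur = np.sin(2 * np.pi * 200.0 * t * (1 + dv_true)) * np.exp(-3.0 * t)

eps_grid = np.linspace(-2e-3, 2e-3, 401)
dv_est, cc_max = stretching_dv_v(ref, cur, t, eps_grid)
print(f"estimated dv/v = {dv_est:.2e} (correlation {cc_max:.4f})")
```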
Non-destructive testing methods have been available in civil engineering for decades to estimate concrete properties or to detect flaws and features. But recently we have seen the dawn of next-generation tools, methods, and applications. Some of them will be discussed in the web talk:
- Better tools: deeper and more detailed insight into concrete constructions
- Better methods: quantitative use in probabilistic structural assessment
- Better rules: towards standardization, qualification, and certification
- Better application: digitalization and elimination of the boundaries between NDT, SHM, and BIM: NDT-CE 4.0
Not enough? I might show how cosmic rays could become a game-changer in NDT-CE. This live webinar recording was provided by https://eurostruct.org
ML@BAMLINE
(2021)
In this talk I’ll describe the use of artificial neural networks (ANN) for quantifying X-ray fluorescence (XRF) measurements. The main idea of this talk is to give an overview of the process needed to generate a model that can then be applied to a specific problem.
In XRF, a sample is excited with X-rays and the resulting characteristic radiation is detected to determine elements quantitatively and qualitatively. This is traditionally done in several time-consuming steps. I’ll show the possibilities and problems of using a neural network to realise a "one-click" quantification. This includes generating training data using Monte Carlo simulation and augmenting the existing data set with an ANN to generate more data. The search for the optimal hyperparameters, manually and automatically, is also described. For the case presented, we were able to train a network with a mean absolute error of 0.1% by weight for the synthetic data and 0.7% by weight for a set of experimental data obtained with certified reference materials.
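As a minimal illustration of the kind of spectrum-to-composition regressor described above, the PyTorch sketch below trains a small fully connected network with a mean-absolute-error loss, the metric quoted in the talk. The spectrum length, number of elements, network size and the random stand-in data are assumptions for illustration, not the network or data actually used.

```python
import torch
import torch.nn as nn

# Placeholder dimensions: spectrum channels and number of quantified elements
N_CHANNELS, N_ELEMENTS = 2048, 8

model = nn.Sequential(                       # simple fully connected regressor
    nn.Linear(N_CHANNELS, 256), nn.ReLU(),
    nn.Linear(256, 64), nn.ReLU(),
    nn.Linear(64, N_ELEMENTS),               # output: mass fractions in wt.%
)
loss_fn = nn.L1Loss()                        # mean absolute error
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

spectra = torch.rand(32, N_CHANNELS)         # stand-in for Monte Carlo simulated spectra
targets = torch.rand(32, N_ELEMENTS) * 10.0  # stand-in for known compositions (wt.%)

for _ in range(5):                           # tiny illustrative training loop
    optimizer.zero_grad()
    loss = loss_fn(model(spectra), targets)
    loss.backward()
    optimizer.step()
print("MAE after a few steps:", float(loss))
```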
The determination of the measurement uncertainty of quantitative and qualitative results is an important quality management tool, for example to describe measuring equipment, procedures, measurement results and the quality of products. The lecture presents the process description for the determination of the measurement uncertainty according to GUM, the determination of uncertainties in qualitative measurement results and the application of the measurement uncertainty for conformity assessment.
ML has been successfully applied to solve many NDT-CE tasks. This is usually demonstrated with performance metrics that evaluate the model as a whole based on a given set of data. However, since in most cases the creation of reference data is extremely expensive, the data used is generally much sparser than in other areas, such as e-commerce. As a result, performance indicators often do not reflect the practical applicability of the ML model. Estimates that quantify transferability from one case to another are necessary to meet this challenge and pave the way for real world applications.
In this contribution we investigate the uncertainty of ML in new NDT-CE scenarios. For this purpose, we have extended an existing training data set for the classification of corrosion damage by a new case study. Our data set includes half-cell potential mapping and ground-penetrating radar measurements. The measurements were performed on large-area concrete samples with built-in chloride-induced corrosion of the reinforcement. The experiment simulated the entire life cycle of chloride-exposed concrete components in the laboratory. The unique ability to monitor deterioration and to trigger targeted corrosion initiation allowed the data to be labelled, which is crucial for ML. To investigate transferability, we extend our data by including new design features of the test specimen and environmental conditions. This allows the change of these features in new scenarios to be expressed as uncertainties using statistical methods. We compare different sampling- and statistical-distribution-based approaches and show how these methods can be used to close knowledge gaps of ML models in NDT.
Virtual CT with aRTist
(2021)
Simulation becomes more and more important in modern CT imaging. It is increasingly used to optimize techniques for complex applications, to support the preparation of written procedures, and for educational purposes. The radiographic simulator aRTist is a modelling tool which simulates X-ray imaging using a hybrid analytical and Monte Carlo method to efficiently model the radiation transport. In addition to the relevant physical effects such as absorption, scattering and fluorescence, simplified fast models are employed to describe the characteristics of the X-ray source and the detector. aRTist is well equipped to model realistic X-ray imaging setups due to the ability to load exported CAD object descriptions. A simple CT scan module is contained in aRTist which allows the simulation of standard (circular cone beam) scanning trajectories.
This training session starts with a general introduction to aRTist and will highlight its basic usage. Furthermore, the focus is on the configuration of scan trajectories and batch simulations for virtual CT. The aim is to enable the audience to use the aRTist software for their own experiments in virtual CT.
It is fundamental to determine the machine geometry accurately for dimensional X-ray computed tomography (XCT) measurements. When performing high-accuracy scans, compensation of a non-static geometry, e.g. due to rotary axis errors or drift, might become necessary. Here we provide an overview of methods to determine and account for such deviations on a per projection basis. They include characterisation of stage error motions, in situ geometry measurements, numerical simulations, and reconstruction-based optimization relying on image quality metrics and will be discussed in terms of their metrological performance. Since a radiographic calibration is always required to provide an initial absolute geometry, this method will be presented as well.
High entropy alloys (HEAs) are considered a new class of alloys containing at least 5 elements with concentrations between 5 and 35 atomic percent. There has been a growing interest in HEAs in the materials research field in recent years. Due to their adjustable composition, which enables the modification of mechanical properties (such as hardness, strength and ductility), and their stability at high temperatures, HEAs have been the focus of various studies.
The corrosion behavior of HEAs in particular has attracted wide research interest. Since grazing exit X-ray fluorescence (GEXRF) offers a non-destructive way to collect notable information regarding high-temperature oxidation, we consider it a useful method to investigate how HEAs behave in corrosive environments.
The main idea of the grazing geometry is to enhance the fluorescence signal of the surface. This enables highly sensitive surface analyses of thin protective films on the sub-micrometer scale. Position-sensitive area detectors provide information on the signal emitted from the sample as a function of the emission angle and thus allow depth-sensitive analysis. Furthermore, data collected at incidence energies within a specific energy range provide XANES information to determine oxidation states. Moreover, since GEXRF profiles can also be simulated with physical models (Urbach 1999), they enable us to determine the layer thickness of a given sample in a non-destructive way.
In this contribution, we present the preliminary results of a conceptual study regarding the layer properties of a CrCoNi medium entropy alloy. The successful implementation of such a methodological concept will pave the way for the investigation of more complex alloys with multiple layers, which is planned for the later phases of the project.
Mechanochemistry has emerged as one of the most interesting synthetic protocols to produce new materials. The development of mechanochemistry as a synthetic method is supported by excellent research by many groups worldwide in a wide range of applications. The potential of mechanochemistry is also reflected in its inclusion in IUPAC's '10 chemical innovations that will change our world' [1]. Solvent-free methodologies lead to unique chemical processes during synthesis, with the consequent formation of materials with new properties [2]. In this contribution, we will discuss our recent results investigating the formation of (polymorphic) cocrystals, coordination polymers, metal oxides and metal nanoparticles [3-8]. We introduced different setups enabling the in situ investigation of mechanochemical reactions using synchrotron XRD combined with Raman spectroscopy and thermography.
Mechanochemistry is increasingly used for synthesizing soft matter materials including metal organic compounds and cocrystals [1]. The ever-increasing interest in this method is contrasted by a limited mechanistic understanding of mechanochemical reactivity and selectivity. Time-resolved in situ investigations of milling reactions provide direct insights into the underlying mechanisms [2-4]. We recently introduced different setups enabling the in situ investigation of mechanochemical reactions using synchrotron XRD combined with Raman spectroscopy and thermography. The presented setup allows the detection of crystalline, amorphous, eutectic, and liquid intermediates. Furthermore, the chemical composition of the reaction mixture was found to be directly correlated with changes in the temperature profile of the reaction. The resulting deeper kinetic and thermodynamic understanding of milling processes is the key to the future optimization of mechanochemical syntheses. In this contribution, we will discuss our recent results investigating the formation of (polymorphic) cocrystals and coordination polymers [2,3,5]. Our results indicate that the in situ investigation of milling reactions offers a new approach to tune and optimize mechanochemical syntheses.
Lithium exists as two stable isotopes, 6Li and 7Li. Their ratio varies from ore to ore depending on the geological history of the sample, thus providing a tool for fingerprinting the distinct origin of Li-containing samples. Determination of the exact isotope ratio, e.g. for the designation of provenance, today relies on expensive and bulky instrumentation such as multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). These instruments, however, are known to bear pitfalls in the characterization of particular elements, including lithium. BAM recently developed two alternative analytical devices for this task, relying solely on inexpensive optical spectroscopy in combination with state-of-the-art multivariate data analysis such as machine learning algorithms. Both techniques have been comprehensively studied using certified reference materials, and by comparison with MC-ICP-MS results they could be shown to yield comparable figures of merit, paving the way for more general accessibility of provenance determination instrumentation. The results also pave the way towards even further simplification of the required laboratory infrastructure and towards including additional elements in the isotopic fingerprinting methodology.
The unambiguous correlation of possible health and sustainability risks with nanoparticle size must be enabled by reliable measurement of nanoparticle size, to ensure comparability and compatibility between results obtained with different methods. The NPSIZE project, funded by the European Metrology Programme for Innovation and Research (EMPIR), develops methods, reference materials and modelling to improve the traceability chain, comparability and compatibility of nanoparticle size measurements. In this work, we present how spherical silica nanoparticles are synthesized with a controlled monomodal or bimodal dispersion for use as reference materials and in an international round robin. Improving the fabrication requires a fine understanding of the synthesis (1), coupled with expertise in in-situ and ex-situ analysis methods. This poses a new challenge for the analysis: determining not only average characteristics (size, chemical composition, shape, ...) but also the concentration and the distribution over the population studied (2). Small-angle X-ray scattering (3) allows very precise measurements of nanoparticle size and concentration that can be directly linked to the metric system (4) (metrological traceability). We developed a SAXS laboratory instrument dedicated to the in-situ characterization of nanoparticles, which enables fast measurements and the monitoring of the synthesis parameters. Measurement protocols and the software processing chain (5) (i.e., size distribution) are also combined and optimized.
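For context, the link between the SAXS pattern and the size of spherical particles is captured by the analytical sphere form factor; the short Python sketch below evaluates it for a monodisperse and a simple bimodal (Gaussian-weighted) population. Radii, widths and the mixing ratio are illustrative assumptions, not the project's reference values.

```python
import numpy as np

def sphere_form_factor(q, radius_nm):
    """Normalised form factor P(q) of a homogeneous sphere, P(0) = 1."""
    x = np.maximum(q * radius_nm, 1e-12)
    return (3.0 * (np.sin(x) - x * np.cos(x)) / x ** 3) ** 2

def intensity_gaussian_spheres(q, mean_r, sigma_r, n_points=201):
    """Relative scattered intensity for a Gaussian radius distribution (V^2-weighted sum)."""
    radii = np.linspace(max(mean_r - 4 * sigma_r, 0.1), mean_r + 4 * sigma_r, n_points)
    weights = np.exp(-0.5 * ((radii - mean_r) / sigma_r) ** 2)
    weights /= weights.sum()
    volumes = (4.0 / 3.0) * np.pi * radii ** 3
    return sum(w * v ** 2 * sphere_form_factor(q, r) for w, v, r in zip(weights, volumes, radii))

q = np.logspace(-2, 0, 200)                                  # scattering vector, 1/nm
i_mono = sphere_form_factor(q, 20.0)                         # monodisperse, 20 nm radius
i_bimodal = 0.5 * intensity_gaussian_spheres(q, 20.0, 1.0) \
          + 0.5 * intensity_gaussian_spheres(q, 40.0, 2.0)   # simple bimodal population
print("curves computed for", q.size, "q values")
```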
The formation of DNA-protein complexes occurs during replication and repair within cells. They are assumed to modify the damage caused by ionizing radiation during radiation therapy. The assumption here is that the underlying damage channels in DNA and proteins are modified, especially when compared to single molecules.
Hydrogen production via water electrolysis will be an essential cornerstone in the development of sustainable, fossil-free fuel and chemical production on a global scale. The activity and stability of each catalyst are highly dependent on the properties of the coating, i.e., phase composition, crystallinity, accessible surface area, and many other factors. The key to developing improved catalysts is a better understanding of the relationships between their performance, stability, and physicochemical properties. However, these relationships can be complex and are also strongly influenced by the reaction environment. Therefore, operando analysis of the catalyst material during catalysis at realistic potentials and current densities is highly desirable. However, many analytical techniques cannot be applied in liquid environments at realistic potentials and current densities.
We propose environmental ellipsometric analysis in a dedicated electrochemical flow cell as a method to evaluate gas evolution reactions operando under realistic working conditions. Figure 1 illustrates schematically the developed technique. Key factors to success are highly active model-type catalysts, a suitable sample environment, and a deep understanding of the appropriate model development, as well as concise cross validation with numerous other analytical techniques.
The method was developed and validated by analyzing a calcination series (300 – 600 °C) of mesoporous templated IrOx films ex-situ and operando under oxygen evolution reaction (OER) conditions. The employed environmental electrochemical spectroscopic ellipsometry (ECSE) analysis revealed changes in the optical and electronic properties during OER, i.e., the dielectric functions, resistivity and band-to-band transitions (p-d band transitions). Film thickness and porosity were validated by means of scanning electron microscopy (SEM), X-ray reflectometry (XRR) or ellipsometric porosimetry (EP), and the electrical and electronic properties by means of conductivity measurements, X-ray photoelectron spectroscopy (XPS) or UV-Vis-NIR absorption spectroscopy. The electronic structures of the catalysts, obtained from valence electron energy-loss spectra (VEELS) derived from ε1 and ε2 of the SE measurements, reveal a direct correlation with the electrochemical activities in the OER.
In the presentation reversible and irreversible potential-dependent changes of the catalyst properties during operation will be discussed along with the dynamics of gas formation, transport and dissolution at different potentials.
The formation and detection of molecules in laser-induced plasmas (LIPs) is a hot topic. In analytical plasmas, as in LIBS, the detection of molecules is important for the identification of geological and other materials and for the analysis of isotopes and difficult elements (Cl, F, etc.) via molecular emission. In chemical plasmas, as in PECVD (plasma-enhanced chemical vapor deposition) or PLD (pulsed laser deposition), molecules formed in the plasma determine the composition and thickness of the deposits. Similarly, molecules play an important role in microstructuring and oxidizing metal surfaces by laser ablation. It is unfortunate that the different communities which utilize plasma methods and seek solutions to similar problems do not overlap strongly and do not fully use the knowledge accumulated by each other.
In this presentation, the mechanisms of molecule formation will be analyzed using the example of LIPs employed for chemical vapor deposition and metal microstructuring. The theoretical analysis includes equilibrium chemistry calculations combined with plasma hydrodynamics. First, an LIP excited in a gas mixture of BCl3 or BF3 with H2 or CH4 will be analyzed; this chemical system is used for obtaining deposits of refractory solid boron and boron carbide. Second, a breakdown in SiF4 + SiCl4 gas mixtures will be described; this method allows the synthesis of fluorochlorosilanes SiFxCl4-x (x = 1, 2, 3), which are good etching agents (Figure). Third, ablation of solid Mo in BF3 + H2 and of Ti in air will be considered, aimed at obtaining deposits of high-hardness MoxBy and films of TixOy on textured Ti surfaces, respectively.
In the experiments, the reaction gases before and after laser illumination, as well as the solid deposits, are analyzed by optical emission spectroscopy (OES), IR and mass spectrometry (MS), SEM, X-ray methods, and AFM. It will be shown that the hydrodynamic-chemical model adequately predicts the composition of the LIPs, the zones of molecular formation, and the dependence on reactant stoichiometry, plasma temperature and pressure.
Current developments in PT
(2021)
Introduction
A good laboratory organization can help address the reproducibility crisis in science, and easily multiply the scientific output of a laboratory, while greatly elevating the quality of the measurements. We have demonstrated this for small- and wide-angle X-ray scattering in the MOUSE project (Methodology Optimization for Ultrafine Structure Exploration). In the MOUSE, we have combined: a) a comprehensive laboratory workflow with b) a heavily modified, highly automated X-ray scattering instrument. This combination allows us to collect fully traceable scattering data, with a well-documented data flow (akin to what is found at the more automated beamlines). With two full-time researchers, the lab collects and interprets thousands of datasets, on hundreds of samples for dozens of projects per year, supporting many users along the entire process from sample selection and preparation, to the analysis of the resulting data.
While these numbers do not hold a candle to those achieved by our hardworking compatriots at the synchrotron beamlines, the laboratory approach does allow us to continually modify and fine-tune the integral methodology. So for the last three years, we have incorporated e.g. FAIR principles, traceability, automated processing, data curation strategies, as well as a host of good scattering practices into the MOUSE system. We have concomitantly expanded our purview as specialists to include an increased responsibility for the entire scattering aspect of the resultant publications. This ensures full exploitation of the data quality, whilst avoiding common pitfalls.
Talk scope
This talk will present the MOUSE project as implemented to date, and will introduce foreseeable upgrades and changes. These upgrades include better pre-experiment sample scattering predictions to filter projects on the basis of their suitability, exploitation of the measurement database for detecting long-term changes and automated flagging of datasets, extending the measurement range through an Ultra-SAXS module, and enhancing MC fitting with sample scattering simulations for better matching of odd-shaped scatterers.
Compared to the clear, real-space images you can get from electron microscopy, X-ray scattering patterns are rather featureless. These patterns, however, contain structural information from all of the material structure illuminated by the X-ray beam. With this technique, you can measure nanoparticle dispersions, catalysts, composites, MOF powders, battery materials, light metal alloys and gels to reveal information on the structural features found within these materials. We have measured many such materials for several research groups from the University of Birmingham, revealing structural features in the sub-nm to micrometer range.
Measuring an X-ray scattering pattern is relatively easy, but measuring a high-quality, useful pattern requires significant effort and good laboratory organization. Such laboratory organization can help address the reproducibility crisis in science, and easily multiply the scientific output of a laboratory, while greatly elevating the quality of the measurements. We have demonstrated this for small- and wide-angle X-ray scattering in the MOUSE project (Methodology Optimization for Ultrafine Structure Exploration) [1]. With the MOUSE, we have combined: a) a comprehensive and highly automated laboratory workflow with b) a heavily modified X-ray scattering instrument. This combination allows us to collect fully traceable scattering data, within a well-documented, FAIR-compliant data flow (akin to what is found at the more automated synchrotron beamlines). With two full-time researchers, our lab collects and interprets thousands of datasets, on hundreds of samples, for dozens of projects per year, supporting many users along the entire process from sample selection and preparation, to the analysis of the resulting data.
Chemical companies must find new paths to stay productive in a rapidly changing environment. One of these paths is the use of digital technologies. Flexible and modular chemical plants can produce various high-quality products using multi-purpose equipment with short downtimes between campaigns and thereby reduce the time to market. Process safety is improved due to the smaller amounts processed, and efficient heat transfer makes otherwise difficult-to-produce compounds accessible.
To exploit these advantages, fully automated process control along with real-time quality control is mandatory and should be based on “chemical” information. The advantages of a fully automated NMR analyzer were demonstrated using a given pharmaceutical reaction step operated within a modular pilot plant. A commercially available benchtop NMR spectrometer was integrated to meet the requirements of an automated chemical production environment, such as explosion safety, field communication, and robust data evaluation. The results obtained were used for direct-loop advanced process control and real-time optimization of the process.
NMR spectroscopy proved to be an excellent online analytical tool and allowed a modular data analysis approach, which even served as a reliable reference method for further PAT applications. Using the available datasets, a second data analysis approach based on artificial neural networks (ANN) was evaluated. For this, the amount of data was augmented to be sufficient for training. The results show comparable performance while reducing the calculation time tremendously. In the future, such fully integrated and interconnected “smart” systems and processes can increase the efficiency of the production of specialty chemicals and pharmaceuticals.
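For illustration, a minimal sketch of such an ANN-based evaluation is given below (the abstract does not name the software stack, so scikit-learn, the file names and the network size are assumptions, not the implementation used in this work):

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    # X: (augmented) benchtop NMR spectra, one row per spectrum (hypothetical files)
    # y: reference mass fractions obtained from the modular spectral analysis
    X = np.load("nmr_spectra.npy")
    y = np.load("mass_fractions.npy")

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    # small feed-forward network as a fast surrogate for the modular analysis
    ann = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
    ann.fit(X_train, y_train)
    print("R^2 on held-out spectra:", ann.score(X_test, y_test))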
Raman spectroscopy for online monitoring of a homogeneous hydroformylation process in microemulsion
(2021)
An important industrial reaction is hydroformylation, the production of aldehydes from alkenes and syngas on the basis of homogeneous catalysis. The main cost factors of the processes currently used are product selectivity and the loss of the catalysts used. Therefore, various concepts for the hydroformylation of long-chain olefins have been developed, including hydroformylation in microemulsions, which is being investigated on a mini-plant scale at the Technical University of Berlin [1]. In this study, online Raman spectroscopy of the reaction of 1-dodecene to 1-tridecanal in a microemulsion was performed [2]. First, an experimental design was used to obtain a good representation of the operating range in the mini-plant with respect to the concentrations of five reactants in a laboratory setup [3]. Based on the Raman spectra, Partial Least Squares (PLS) models for the prediction of 1-dodecene and 1-tridecanal were calibrated, and with these the reactions were predicted on a laboratory scale. In the next step, the PLS models were applied to online spectra from a mini-plant. This resulted in promising estimates of 1-tridecanal and acceptable predictions of 1-dodecene mass fractions. The predictive power of the PLS models in this particular case was limited by unexpected by-product formation, which, however, can easily be compensated by an extended calibration. Hence, Raman spectroscopy is a promising technique for process analysis in microemulsions.
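A minimal sketch of the PLS calibration step is given below (Python and scikit-learn are used here purely for illustration; the preprocessed spectra, the reference values and the number of latent variables are hypothetical placeholders):

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    X = np.load("raman_spectra.npy")            # hypothetical: preprocessed Raman spectra
    y = np.load("dodecene_mass_fractions.npy")  # hypothetical: reference mass fractions

    pls = PLSRegression(n_components=5)         # latent variables to be chosen by cross-validation
    y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    print("RMSECV:", rmsecv)

    pls.fit(X, y)   # final model, to be applied to online mini-plant spectra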
Boron neutron capture therapy (BNCT) relies on the activation of 10B by thermal neutrons, which results in the emission of small, highly energetic particles that damage cancer cells. However, in order to overcome the limits of the currently used BNCT agents, it is necessary to design new systems which can specifically accumulate and deliver a sufficient amount of 10B in tumors. In this study, we designed a 10B-BSH-containing aza-BODIPY (aza-SWIR-BSH). It enabled the efficient vectorization of clinically used 10B-BSH to the tumor, resulting in higher therapeutic activity than 10B-BSH alone.
Inorganic nanocrystals with linear and nonlinear luminescence in the ultraviolet, visible, near-infrared and shortwave-infrared, such as semiconductor quantum dots and spectrally shifting lanthanide-based nanophosphors, have meanwhile found applications in the life and material sciences, ranging from optical reporters for bioimaging and sensing over security barcodes to solid state lighting and photovoltaics. These nanomaterials commonly have increasingly sophisticated core/shell particle architectures with shells of different chemical composition and thickness to minimize radiationless deactivation at the particle surface, which is usually the main energy loss mechanism [1]. For lanthanide-based spectral shifters, and particularly for very small nanoparticles, surface coatings are also needed which protect near-surface lanthanide ions from luminescence quenching by high-energy vibrators like O-H groups and prevent the disintegration of these nanoparticles under high dilution conditions [2,3,4]. The identification of optimum particle structures requires quantitative spectroscopic studies focusing on the key performance parameter photoluminescence quantum yield [5,6], ideally flanked by single-particle studies to assess spectroscopic inhomogeneities on a particle-to-particle level for typical preparation methods [7]. Moreover, in the case of upconversion nanoparticles with a multi-photonic and hence excitation power density (P)-dependent luminescence, quantitative luminescence studies over a broad P range are required to identify particle architectures that are best suited for applications ranging from fluorescence assays to fluorescence microscopy. Here, we present methods to quantify the photoluminescence of these different types of emitters in the vis/NIR/SWIR and as a function of P, and demonstrate the importance of such measurements for a profound mechanistic understanding of the nonradiative deactivation pathways in semiconductor and upconversion nanocrystals of different size and particle architecture in different environments.
The presentation will give a brief overview of the processes occurring in laser-induced plasma and of methods for modeling these processes. In particular, a chemical-hydrodynamic model will be considered, which relates to the modification of the surface of metallic titanium by laser pulses. The details and simplifications of this model, its shortcomings and the possibilities for eliminating them will be discussed. The model is applied to the laser structuring of the surface of dental implants and the deposition of an oxide film on it.
Materials in contact with the environment release, e.g., metal ions, elemental species and/or (nano-)particles. Once released, these species and/or particles are ingested by organisms and cells and thus might have a negative impact on the environment. Therefore, the identification as well as the quantification of potentially harmful substances is of utmost importance and highly needed to assess the ecotoxicological impact of (emerging) pollutants.
The oral presentation provides an overview of the power of elemental analytical techniques, in particular ICP-MS as well as HR-CS-GFMAS, in environmental research. Current research topics from Division 1.1 @ BAM - Inorganic trace analysis will be highlighted:
i) Elemental Speciation & Isotope analysis - new tools:
Besides elemental species separation and quantification, one of the main challenges in environmental elemental speciation analysis is the distinction between anthropogenic and natural elemental species. The on-line combination of elemental speciation and isotope analysis combines “the best of both worlds”: species-specific isotopic information becomes available.
As an application example, the analysis of current anti-fouling agents via CE/MC-ICP-MS will be highlighted.
ii) HR-CS-GFMAS for PFC analysis:
Per- and polyfluorinated compounds (PFC) are emerging contaminants in particular in soil and surface water samples. Due to the large number of compounds (>4700), target analytical methods are not sufficient and sum parameter methods for organically bound fluorine are highly needed.
High resolution-continuum source-graphite furnace molecular absorption spectroscopy (HR-CS-GFMAS) based methods for organically bound fluorine analysis will be presented. Application examples (soil and surface water) will be highlighted.
iii) Single cell-ICP-ToF-MS - ecotox. assessment:
Single cell and single organism analysis for, e.g., ecotoxicological/medicinal assessment are hot topics in the research field of ICP-MS. In particular, ICP-ToF-MS is a powerful, emerging technique for single cell/particle analysis.
Automated single cell/diatom-ICP-ToF-MS as a potential tool in ecotoxicological testing will be presented.
A wide range of analytical methods is used to estimate the plant availability of soil phosphorus (P). Previous investigations showed that analytical methods based on the Diffusive Gradients in Thin films (DGT) technique correlate very well with the amount of bioavailable nutrients and pollutants in environmental samples (Davison 2016, Vogel et al. 2017). However, the DGT results do not identify which soil P compound has the high bioavailability. Various spectroscopic techniques (infrared, Raman, P K-edge and L-edge XANES and P NMR spectroscopy) are, however, available to characterize P species in soils. Therefore, spectroscopic investigation of DGT binding layers after deployment allows us to determine the specific compounds. Nutrients such as phosphorus and nitrogen are often present as molecules in the environment, together with other elements. These ions are detectable and distinguishable by infrared, P K- and L-edge X-ray absorption near-edge structure (XANES) and NMR spectroscopy, respectively. Additionally, microspectroscopic techniques make it possible to analyze P compounds on the DGT binding layer with a lateral resolution down to 1 μm2. Therefore, species of elements and compounds of, e.g., a spatial soil segment (e.g. the rhizosphere) can be mapped and analyzed, providing valuable insight into the dynamics of nutrients in the environment.
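For context, the time-averaged concentration estimated by DGT follows the standard diffusion relation C_DGT = M·Δg/(D·A·t); a minimal sketch of this calculation is given below (generic symbols and illustrative numbers, not values from this study):

    def dgt_concentration(M_ng, dg_cm, D_cm2_s, A_cm2, t_s):
        """Standard DGT equation: C_DGT = M * dg / (D * A * t).

        M_ng     accumulated analyte mass in the binding layer [ng]
        dg_cm    thickness of the diffusive layer (gel + filter) [cm]
        D_cm2_s  diffusion coefficient of the analyte in the gel [cm^2/s]
        A_cm2    exposed sampling area [cm^2]
        t_s      deployment time [s]
        returns  time-averaged DGT concentration [ng/cm^3 = ug/L]
        """
        return M_ng * dg_cm / (D_cm2_s * A_cm2 * t_s)

    # illustrative numbers only
    print(dgt_concentration(M_ng=50.0, dg_cm=0.094, D_cm2_s=5.0e-6, A_cm2=3.14, t_s=24 * 3600))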
A brief introduction will be given on modeling chemical reactions in laser induced plasmas using stoichiometric and non-stoichiometric approaches. Several applications will be considered, which can benefit from such modeling. Those include plasma enhanced chemical vapor deposition (PECVD), surface modification and surface coating, and molecular analysis by LIBS. Each application will be illustrated by simulations of relevant chemical systems. For PECVD, chemical systems are BCl3/H2/Ar, BF3/H2/Ar, BCl3/BF3, Mo/BF3/H2; for surface modification/coating it is Ti/air; for molecular LIBS they are CaCO3/Ar, Ca(OH)2/Ar, and CaCl2/Ar. Advantages and shortcomings of equilibrium chemical hydrodynamic models of laser induced plasmas will be discussed.
X-ray photoelectron spectroscopy (XPS) allows simultaneous irradiation and damage monitoring. Although water radiolysis is essential for radiation damage, all previous XPS studies were performed in vacuum. Here we present near-ambient-pressure XPS experiments to directly measure DNA damage under a water atmosphere. They permit in-situ monitoring of the effects of radicals on fully hydrated double-stranded DNA. Our results allow us to distinguish direct damage, by photons and secondary low-energy electrons (LEE), from damage by hydroxyl radicals or hydration-induced modifications of damage pathways. The exposure of dry DNA to X-rays leads to strand breaks at the sugar-phosphate backbone, while deoxyribose and nucleobases are less affected. In contrast, a strong increase of DNA damage is observed in water, where OH radicals are produced. In consequence, base damage and base release become predominant, even though the number of strand breaks increases further.
ISO/IEC 17025 is the worldwide quality standard for testing and calibration laboratories. It is the basis for accreditation by an accreditation body. The current version was published in 2018.
Implementing ISO/IEC 17025 as part of laboratory quality initiatives offers both laboratory and business benefits, such as expanding the potential customer base for testing and/or calibration, increasing the reputation and image of the laboratory at national and international level, continuously improving the data quality and the effectiveness of the laboratory, or creating a good basis for most other quality systems in the laboratory sector, such as GxP. The main difference between a proper approach to analysis and formal accreditation lies in the targeted documentation, especially of the qualification of the personnel, the test equipment and the validation of the analytical methods.
Using quantitative NMR spectroscopy as an example, it is shown how accreditation can be carried out and what documentation is required. In our case, we have described the procedure in an SOP ("Determination of the quantitative composition of simple mixtures of structurally known compounds with 1H-NMR spectroscopy") and supported it with a modular system of organizational and equipment SOPs. The special feature is that the accredited method is independent of the choice of analyte and matrix, and it is therefore possible to operate with a single validated method. In our case, we have proposed three quality levels ("leagues") with different levels of analytical effort, which differ in their measurement uncertainty, in order to simplify the workflow and analysis design.
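For orientation, the quantification relation underlying qNMR with an internal standard can be sketched as follows (generic textbook form, not the accredited SOP itself; the numbers are illustrative only):

    def qnmr_purity(I_x, N_x, M_x, m_x, I_std, N_std, M_std, m_std, P_std):
        """Purity (mass fraction) of analyte x relative to an internal standard:
        P_x = (I_x/I_std) * (N_std/N_x) * (M_x/M_std) * (m_std/m_x) * P_std
        I: integrated signal area, N: number of protons behind the signal,
        M: molar mass [g/mol], m: weighed mass [mg], P: purity (mass fraction)."""
        return (I_x / I_std) * (N_std / N_x) * (M_x / M_std) * (m_std / m_x) * P_std

    # illustrative numbers only
    print(qnmr_purity(I_x=1.02, N_x=3, M_x=151.16, m_x=10.1,
                      I_std=1.00, N_std=2, M_std=178.18, m_std=9.8, P_std=0.999))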
The plasma-chemical approach has been used for the synthesis of various gaseous, liquid, and solid substances since the 1960s. Nowadays, the method of plasma-enhanced chemical vapor deposition (PECVD) is used for the production of thin films, protective coatings, carbon-based nanostructures, high-purity isotopic materials, biomaterials, and other products. Plasma for PECVD is typically created in various electrical discharges, e.g. DC and AC glow discharges or discharges operated at audio (10-20 kHz), radio (13.56 MHz), and microwave (2.45 GHz) frequencies. Plasma induced by a laser, a laser-induced plasma (LIP), is rarely used to deposit materials from the gas phase as in PECVD. This work is aimed at reviving interest in this latter technology and showing its efficiency and potential.
We have run several pilot experiments. First, an LIP is excited in BCl3 or BF3 plus H2 or CH4 to evaluate the efficiency of depositing solid boron and boron carbide, materials which are widely used for refractory coatings. Second, we investigate the possibility of synthesizing fluorochlorosilanes SiFxCl4-x (x = 1, 2, 3) by an LIP induced in SiF4 + SiCl4 gas mixtures. Using fluorochlorosilanes with different combinations of F and Cl in the SiFxCly molecule may add flexibility to processes of silicon deposition and etching. Third, an LIP is excited in the reactive mixture MoF6+H2+BF3 or on a Mo target ablated into an H2/BF3 atmosphere. The goal is to obtain the superhard molybdenum borides MoB, Mo2B, or MoB2. The gases used and the solid deposits are analyzed by optical emission spectroscopy (OES), IR and mass spectrometry (MS).
We also model the plasma and perform static equilibrium chemistry calculations to see whether the desired reaction products are thermodynamically favorable. Dynamic calculations of the expanding plasma plume are performed using a hydrodynamic code combined with open-source chemistry software.
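As an illustration of the static equilibrium step (the abstract does not name the software; Cantera and the thermodynamic data file below are assumptions), the equilibrium composition of a BCl3/H2 mixture at a given plasma temperature could be estimated as:

    import cantera as ct

    # hypothetical mechanism file containing B/Cl/H species with thermodynamic data
    gas = ct.Solution("b_cl_h_plasma.yaml")

    gas.TPX = 3000.0, ct.one_atm, "BCl3:1, H2:3"   # initial mixture and plasma temperature
    gas.equilibrate("TP")                          # constant-temperature, constant-pressure equilibrium

    # print the major equilibrium species (thermodynamically favorable products)
    for name, x in zip(gas.species_names, gas.X):
        if x > 1e-4:
            print(f"{name:12s} {x:.4f}")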
The successful shift to NDE 4.0 will not only require developing and embracing new technologies associated with the fourth industrial revolution or becoming an integral part of the overall Industry 4.0, but also developing and adopting new ways of working. It is beyond doubt that people will remain in charge of the inspections. However, it is arguable whether the current “procedure-following”, “level I-III” paradigm can withstand the changes that come along with NDE 4.0. With the increased autonomy and interconnectedness expected with NDE 4.0, the majority of traditional NDE tasks will no longer be needed. Instead, different skills, such as programming and adapting systems as well as problem solving, will become vital for the inspections. Therefore, we suggest that a new paradigm is needed, one in which inspector roles and, thus, also the requirements will have to be reinvented. We expect the inspectors to be relieved from the tedious and error-prone aspects of the current system, to take responsibility for increasingly complex automated systems, and to work in closer collaboration with other experts. Thus, we propose that the traditional inspector roles will be transformed into those of the system developer, caretaker and problem solver, each requiring a specific set of skills and assuming different responsibilities. In this talk, we will present the new roles and discuss the challenges that may arise with them.
That human factors (HF) affect the reliability of NDT is not a novelty. Still, when it comes to reliability assessments, the role of people is often neglected. Reliability is typically expressed in terms of POD curves, and the effects of human and organisational factors on the inspection are typically addressed by regulations, procedures and by the qualification and training of the inspection personnel. However, studies have shown that even the most experienced personnel can make mistakes and that the reliability in the field is never as high as the reliability measured in POD experiments. Generally, HF are considered too unpredictable and too uncontrollable to model. If that is the case, then what can we do? The engineering perspective on this problem has often been to find ways to automate inspections and, recently, to make use of artificial intelligence tools to decrease the direct effect of people on the inspection results and improve the overall efficiency and reliability. However, despite automation and AI, people remain the key players, though their tasks change. The contemporary approach to HF is not to engineer them out of the system but to design human-machine systems that make the best use of both. In this talk, ways of tackling HF in the design of systems and processes will be presented.
Carbonization of fluorinated metal-organic frameworks (MOFs) should yield fluorinated nitrogen- and metal-doped carbons (F-NMCs), a combination of NMCs and fluorinated carbons, each promising electrocatalysts in their own right. We synthesized two polymorphs of a fluorinated MOF by mechanical ball-mill grinding and carbonized them to yield potential electrocatalyst materials. The catalytic activity towards the oxygen reduction reaction (ORR) was examined, and good activities were found. Simulations based on a theoretical model helped to assess the stability of the proposed catalytic sites and to understand the measured activities towards ORR catalysis.
Additive Manufacturing (AM) in terms of laser powder-bed fusion (L-PBF) offers new prospects regarding the design of parts and therefore enables the production of complex structures. The quality of the feedstock material receives increasing attention, as it represents the first part of the L-PBF process chain. Powder quality control in terms of flowability and powder bed packing density is therefore mandatory.
In this work, a workflow for quantitative 3D powder analysis in terms of particle size, particle shape, particle porosity, inter-particle distance and packing density was established. Synchrotron computed tomography (CT) was used to correlate the packing density with the particle size and particle shape for three different powder batches. The polydisperse particle size distribution (PSD) was transformed into a statistically equivalent bidisperse PSD. The ratio of small to large particles helped to understand the powder particle packing density. While the particle shape had a negligible influence, the particle size distribution was identified as the major contributor to the packing density.
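One possible way to illustrate the reduction of a measured PSD to a bidisperse equivalent is moment matching; the sketch below is our own illustration under that assumption and is not necessarily the definition of "statistically equivalent" used in this work:

    import numpy as np
    from scipy.optimize import least_squares

    def equivalent_bidisperse(diameters):
        """Fit two discrete sizes (d1, d2) with weights (w, 1 - w) so that the first
        three raw moments of the measured diameter distribution are reproduced."""
        m = [np.mean(np.asarray(diameters) ** k) for k in (1, 2, 3)]

        def residual(p):
            d1, d2, w = p
            return [w * d1 ** k + (1 - w) * d2 ** k - m[k - 1] for k in (1, 2, 3)]

        d_lo, d_hi = np.percentile(diameters, [10, 90])
        sol = least_squares(residual, x0=[d_lo, d_hi, 0.5],
                            bounds=([0.0, 0.0, 0.0], [np.inf, np.inf, 1.0]))
        return sol.x  # d1, d2 and the number fraction of d1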
Typically, near-eutectic Al-Si alloys consist of a highly interconnected three-dimensional network of the eutectic silicon (Si) and intermetallics embedded in the aluminium (Al) matrix. To further improve the mechanical properties of such alloys, often a single ceramic reinforcement phase, e.g. silicon carbide (SiC) or aluminium oxide (Al2O3) in the form of fibres or particles, is added. However, hybrid reinforcements (fibres and particles) can further improve wear resistance and fracture toughness and, additionally, reduce the anisotropy of the material. The engineering of metal matrix composites (MMC) for specific application requirements benefits from a comprehensive knowledge of the failure behaviour. Therefore, damage evolution under compression was investigated on:
- pure near-eutectic AlSi12CuMgNi matrix alloy
- type I: matrix reinforced with random-planar oriented Al2O3 short fibres (15 vol.%)
- type II: matrix reinforced with random-planar oriented Al2O3 short fibres (7 vol.%) and additional SiC particles (15 vol.%)
The analysis of the damage mechanisms was carried out in two rather independent but complementary studies. First, selected sister samples of every material were exposed to quasi-static compression (traverse control). The compression tests were interrupted at different strain levels. Miniature cylinders with a diameter of 1 mm were extracted from the pre-strained samples and investigated by synchrotron computed tomography (SX-µCT) with a spatial resolution of about 0.7 µm. For the pure matrix alloy, microcracks are confined to the intermetallic particles and to the eutectic Si; hence, no damage was observed in the aluminium. The composite type II revealed a more effective strain accumulation (less damage) than type I at low plastic strain (up to 5 %), but a more catastrophic damage development due to cracking of the SiC clusters at higher strain levels.
The second approach to studying damage initiation and accumulation in the materials subjected to compressive load was Acoustic Emission (AE) analysis. In this case, in-situ monitoring of the acoustic emission signal was performed during compression tests on specimens with dimensions of several mm. For all three material types, AE activity set in at 2 % strain. Differences in the AE behaviour of the three materials were demonstrated based on AE hit rate, signal peak amplitudes as well as weighted peak frequencies (WPF). Future work focuses on the combination of AE and SX-µCT, aiming for more detailed knowledge of the damage mechanisms of metal matrix composites.
Polyethylene glycols (PEGs) are widely used in everyday items such as food additives and personal care products. In addition, they have multiple medical applications: as laxatives, as excipients, and covalently coupled to drug molecules to improve pharmacokinetics (PEGylation). While PEGs are generally regarded as biologically inert, the human body is known to produce antibodies against PEGylated molecules. In addition, PEGs have been shown to be part of a biomarker signature predicting colon cancer outcome, suggesting a more complex and as yet unknown behavior of PEGs in the human body.
Here, we introduce PEGomics, a retrospective screening approach for publicly available LC-MS data. Using a custom R script to process entire studies, the presence of PEGs was revealed in most of the human plasma, serum and whole blood samples investigated. Several PEG species and adducts were identified, and their correlation with different diseases and health conditions was investigated further.
Blood PEG levels significantly differed between patient groups in multiple clinical studies related to e.g. pregnancy duration, fasting and smoking. We discuss possible causes for these effects in the light of recent reports of allergies against PEGs and outline our further strategies to identify the source of PEGs in the human body as well as possible metabolic transformations.
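The screening itself was implemented as a custom R script; purely for illustration, the core idea of flagging PEG homologue series (peaks spaced by one ethylene-oxide repeat unit) can be sketched as follows (singly charged ions are assumed; charge states and adducts are ignored here):

    import numpy as np

    EO = 44.02621  # monoisotopic mass of the ethylene-oxide repeat unit C2H4O

    def find_peg_series(mz_values, tol=0.005, min_members=4):
        """Group peaks into homologous series spaced by one ethylene-oxide unit."""
        mz = np.sort(np.asarray(mz_values))
        series, used = [], set()
        for i, start in enumerate(mz):
            if i in used:
                continue
            chain, current = [start], start
            while True:
                j = int(np.argmin(np.abs(mz - (current + EO))))
                if abs(mz[j] - (current + EO)) <= tol:
                    chain.append(mz[j])
                    used.add(j)
                    current = mz[j]
                else:
                    break
            if len(chain) >= min_members:
                series.append(chain)
        return series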
In many scientific fields, isotopic analysis can offer valuable information, e.g., for tracing the origin of food products, environmental contaminants, forensic and archaeological samples (provenance determination), for the age determination of minerals (geochronological dating) or for elucidating chemical processes. To date, bulk analysis is typically performed, measuring the isotopic composition of the entire elemental content of the sample. However, the analyte element is usually present in the form of different species. Thus, separating the species of interest from one another and from matrix components prior to isotope ratio measurements can provide species-specific isotopic information, which could be used for tracing the origin of environmental pollutants and for elucidating (environmental) speciation. Using on-line hyphenations of separation techniques with multicollector ICP-MS (MC-ICP-MS) saves time and effort and enables the analysis of different species during a single measurement.
In this work, we developed an on-line hyphenation of CE with multicollector ICP-MS (CE/MC-ICP-MS) for the isotopic analysis of sulfur species, using a multiple-injection approach for instrumental mass bias correction by standard-sample bracketing. With this method, the isotopic composition of sulfur in sulfate originating from river water could be analyzed without sample preparation. The results were compared to data from off-line analysis of the same samples to ensure accuracy. The precision of the on-line measurements was promising with regard to differentiating the river systems by the isotopic signature of river water sulfate. The great potential of this method is based on the versatility of the applied separation technique, not only in the environmental field but also for, e.g., biomolecules, as sulfur is the only covalently bound constituent of proteins that can be analyzed by MC-ICP-MS.
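A generic sketch of the bracketing correction is given below (the multiple-injection CE implementation itself is not reproduced; for sulfur, the ratio of interest would typically be 34S/32S, expressed as a delta value relative to a reference material):

    def bracket_correct(r_sample, r_std_before, r_std_after, r_std_true):
        """Mass-bias correction by standard-sample bracketing: the measured sample
        ratio is scaled by the true/measured ratio of the bracketing standard,
        here simply averaged over the injections before and after the sample."""
        r_std_measured = 0.5 * (r_std_before + r_std_after)
        return r_sample * r_std_true / r_std_measured

    def delta_per_mil(r_corrected, r_reference):
        """Delta notation in per mil relative to a reference ratio."""
        return (r_corrected / r_reference - 1.0) * 1000.0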
Understanding how a machine learning model interprets data is a crucial step to verify its reliability and avoid overfitting. While the focus of the scientific community is nowadays oriented towards deep learning approaches, which are considered black-box approaches, this work presents a toolbox based on complementary methods of feature extraction and selection, in which the classification decisions of the model are transparent and can be physically interpreted. Using guided wave benchmark data from the Open Guided Waves platform, in which delamination defects were simulated at multiple positions on a carbon fiber reinforced plastic plate under varying temperature conditions, the authors could identify suitable frequencies for further investigations and experiments. Furthermore, the authors present a realistic validation scenario which ensures that the machine learning model learns global damage characteristics rather than position-specific characteristics.
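A minimal sketch of such a transparent pipeline is given below (scikit-learn, the feature files and the number of selected features are assumptions for illustration, not the toolbox itself; the position-aware validation scheme described above is likewise not reproduced here):

    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    # X: one row per guided-wave measurement, columns = physically meaningful
    # features (e.g. band-wise energies); y: 1 = delamination, 0 = pristine
    X = np.load("gw_features.npy")   # hypothetical file names
    y = np.load("gw_labels.npy")

    model = make_pipeline(SelectKBest(f_classif, k=10),
                          LogisticRegression(max_iter=1000))
    print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

    # the selected features and their weights remain physically interpretable
    model.fit(X, y)
    selected = model.named_steps["selectkbest"].get_support(indices=True)
    weights = model.named_steps["logisticregression"].coef_.ravel()
    for idx, w in zip(selected, weights):
        print(f"feature {idx}: weight {w:+.3f}")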
In recent years, the fabrication of laser-generated surface structures on metals such as titanium has gained remarkable interest, being technologically relevant for applications in optics, medicine, fluid transport, tribology, and the wetting of surfaces.
The morphology of these structures, and thus their chemistry, is influenced by the laser processing parameters, such as the laser fluence, wavelength, pulse repetition rate, laser light polarization type and direction, angle of incidence, and the effective number of laser pulses per beam spot area.
However, the characterization of the different surface structures can be difficult because of constraints on the analytical information as a function of depth and because of topographic artifacts, which may limit the lateral and depth resolution of elemental distributions as well as their proper quantification. A promising technique to investigate these structures even at the nanoscale is Time-of-Flight Secondary Ion Mass Spectrometry (ToF-SIMS), a very surface-sensitive technique that at the same time allows depth profiling, imaging and 3D reconstruction of selected ion-sputter fragment distributions on the surface.
In this study, we combine chemical analysis by Energy Dispersive X-ray spectroscopy (EDX) and high-resolution scanning electron microscopy (SEM) with ToF-SIMS to fully characterize the evolution of various types of laser-generated micro- and nanostructures formed on Ti and Ti alloys at different laser fluence levels, effective numbers of pulses and different pulse repetition rates (1 – 400 kHz), following irradiation by near-infrared ultrashort laser pulses (925 fs, 1030 nm) in air or under argon gas flow.
We show how this combined surface-analytical approach allows us to evaluate alterations in the surface chemistry of the laser-generated surface structures depending on the laser processing parameters and the ambient environment.
Controlling the thickness and tightness of surface passivation shells is crucial for many applications of core-shell nanoparticles (NP). Usually, to determine the shell thickness, core and core/shell particles are measured individually, which requires the availability of both nanoobjects. This is often not fulfilled for functional nanomaterials such as many photoluminescent semiconductor quantum dots (QD) used for bioimaging, solid state lighting, and display technologies, as the core does not show the application-relevant functionality, such as a high photoluminescence (PL) quantum yield. This calls for a whole-nanoobject approach. Moreover, the thickness of the organic coating often remains unclear.
By combining high-resolution transmission electron microscopy (HR-TEM) and X-ray photoelectron spectroscopy (XPS), a novel whole-nanoobject approach is developed, demonstrated representatively for an ultrabright oleic acid-stabilized, thick-shell CdSe/CdS QD with a PL quantum yield close to unity. The size of this spectroscopically assessed QD is in the range of the information depth of usual laboratory XPS. Information on particle size and monodispersity was validated with dynamic light scattering (DLS) and small-angle X-ray scattering (SAXS) and compared to data derived from optical measurements. The results of the different methods match very well within the respective measurement uncertainties. Additionally, results obtained with energy-resolved XPS using excitation energies between 200 eV and 800 eV are discussed with respect to a potential core/shell intermixing.
Moreover, the future application potential of this approach correlating different sizing and structural methods is discussed considering the method-inherent uncertainties and other core/multi-shell nanostructures.
Crystalline silicon undergoes complex phase-change dynamics of melting, amorphization, ablation and re-crystallization upon irradiation with high-intensity ultrashort laser pulses [1]. The final state of such a modified surface spot depends on many factors, most notably the local fluence and the surface's crystal orientation. In this study, we induced superficial structure and phase changes in silicon <111> and <100> wafers using single femtosecond laser pulses (790 nm, 30 fs) for a range of different peak fluences. The resulting surface modifications were studied in great detail using a number of different techniques, including spectroscopic imaging ellipsometry (SIE), atomic force microscopy, high-resolution transmission electron microscopy (HRTEM), and energy-dispersive X-ray spectroscopy within scanning transmission electron microscopy (STEM-EDX).
Playing a pivotal role in this work, SIE provided non-destructive measurements for the calculation of the radial amorphous-layer thickness profiles of the irradiated spots using a two-layer thin-film model (silicon dioxide and amorphous silicon on a crystalline silicon substrate). The measurements further allowed for the analysis of the oxide-layer modifications induced by the laser treatment. The results of the SIE calculations were cross-checked on a material lamella by HRTEM and STEM-EDX.
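To illustrate how Ψ and Δ follow from such a two-layer model, a minimal sketch using the open-source tmm package is given below; the optical constants, layer thicknesses, wavelength and angle of incidence are illustrative placeholders, not the values determined in this work:

    import numpy as np
    import tmm

    lam = 550.0                  # wavelength in nm
    theta0 = np.deg2rad(50.0)    # angle of incidence

    # two-layer model: native SiO2 / amorphous Si on a crystalline Si substrate
    n_list = [1.0, 1.46, 4.4 + 0.8j, 4.09 + 0.04j]   # air, SiO2, a-Si, c-Si (illustrative)
    d_list = [np.inf, 3.0, 40.0, np.inf]             # thicknesses in nm

    rp = tmm.coh_tmm('p', n_list, d_list, theta0, lam)['r']
    rs = tmm.coh_tmm('s', n_list, d_list, theta0, lam)['r']

    rho = rp / rs                                    # rho = tan(Psi) * exp(i * Delta)
    psi = np.degrees(np.arctan(np.abs(rho)))
    delta = np.degrees(np.angle(rho))
    print(f"Psi = {psi:.2f} deg, Delta = {delta:.2f} deg")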
Ultrasonic measurement technology has become indispensable in NDT. In order to reduce measurement time and extend the application to other materials, contactless ultrasound is the subject of research by many different groups. Department 8 has been researching this field successfully for years. A novel approach is based on so-called fluidic devices. These devices can be used to perform binary logic operations with the help of natural flow instabilities, hence the name Fluidic (FLUID + LogIC). Only a pressure reservoir of the fluid used is required as the energy supply. This enables the production of very robust actuators that generate ultrasonic signals in an extremely energy-efficient way.
The presentation includes the research results of the ZIM innovation project OsciCheck. The original idea will be presented and its application to different building materials validated. Beyond this, the possible application areas are much larger, and a detailed outlook is given to discuss the future potential of fluidic ultrasonic actuators.
The latest ICP-MS technology - ICP-ToF (time of flight)-MS - enables the analysis of the multi-element fingerprint of individual cells. The interface between material and environmental analysis thus receives special attention, e.g. when considering corrosion processes. Microbiologically influenced corrosion (MIC) is a highly unpredictable phenomenon due to the influence of the environment, the microbial communities involved and the respective electron source. However, the interaction pathway between cells and the metal surface remains unclear. The MIC-specific ICP-ToF-MS analytical method at the single-cell level presented here, developed in combination with the investigation of steel-MIC interactions, contributes significantly to progress in instrumental MIC analysis and will enable clarification of the processes taking place. For this, a MIC-specific staining procedure was developed. It allows the analysis of archaea at the single-cell level and provides information about the interaction of the cells with the staining agent, information which is extremely scarce for archaea compared to other, well-characterized organisms. Additionally, single-cell ICP-ToF-MS is used for the analysis of archaea involved in the MIC of steel. Hence, the possible uptake of individual elements from different steel samples is investigated; the information obtained will be used in the future to elucidate the underlying mechanisms and to develop possible material protection concepts, thus combining modern methods of the analytical sciences with materials research.
The substance class of per- and polyfluorinated alkyl substances (PFAS) comprises more than 5300 organic compounds. PFAS are completely fluorinated on at least one carbon atom. They are associated with negative impacts on human and animal health, are extremely persistent in the environment, and bioaccumulate along food chains. Therefore, PFAS are classified as emerging pollutants. At the same time, their physicochemical properties make them attractive for use in diverse technical applications. They are both hydrophobic and lipophobic and show high thermal as well as chemical resistance due to the strong C-F bond.
First regulations of some PFAS, in combination with their technically excellent properties, generated innovation pressure and led to an enormous increase in the number of fluorinated substitution compounds. Due to the increasing complexity of this substance class, target analysis is not able to cover such a variety and multitude of analytes.
Therefore, a suitable PFAS sum parameter method is necessary for an accurate detection of PFAS pollution in the environment, the identification of PFAS hotspots and an evaluation of appropriate remediation measures.
Here we provide insights into the current state of PFAS sum parameter development and present our latest results on method development for the quantitative analysis of PFAS as extractable organically bound fluorine (EOF) in environmental samples using high-resolution continuum source graphite furnace molecular absorption spectrometry (HR-CS-GFMAS). For this purpose, we optimized the extraction of PFAS from different solid matrices with simultaneous separation of inorganic fluoride. For quantification, the resulting extracts were measured using a fluorine-specific HR-CS-GFMAS method. By adding gallium salt solutions as modifiers in HR-CS-GFMAS, fluorine can be quantified indirectly and very selectively via the in situ formation of GaF, with low limits of quantification (instrumental LOQ c(F) < 3 µg/L). We will show results from real soil samples from sites with and without known contamination.
Ellipsometry has been an extremely successful and fast expanding method in the past decades along with other related techniques using polarisation sensitive measurements. Opening new fields of application for a successful measurement technique brings some requirements and issues that have to be solved. From a metrological point of view, ellipsometry has the problem that uncertainties are difficult to determine for model-based analysis techniques in general. In this presentation, we will explore how the usefulness of polarimetric methods like ellipsometry can be increased.
Ellipsometry as a method could profit from several current developments which will be discussed in this presentation:
• Standardisation initiatives at national and international level developing standards for best practice when using ellipsometry. A series of at least six standards is currently being developed at the national German and international level, covering different levels of sample complexity.
• Projects on traceability of ellipsometry and structured surface spectrometry as well as new dielectric function database initiatives.
• Metadata handling and data ontology providing a better framework for exchange and collaborative use of research data.
We will also explore the quantification of measurement uncertainty using examples from projects in which BAM is involved. Examples will be presented of multilayer and non-ideal materials as well as the determination of layer properties for technical applications such as thin layer catalysts and complex polymers. The definition of reference materials will be discussed.
The Dark side of Science
(2021)
We all may have started out as bright-eyed students trying to do science to the best of our abilities, but over time, some of us have gradually drifted to the dark side. The dark side of science has an impressive publication rate in high-ranking journals, good success with funding agencies, and rocks the world with stellar findings. Unfortunately, these findings aren't real, either by accident or on purpose. As the presenter and his colleagues found, trying to correct or even dispute any of these findings in literature is a supremely complex and time-consuming effort.
With no recent reduction in the frequency of such false findings, it is up to us to try to stem the flow. Besides looking at examples, we need to understand the underlying driving forces behind this dark scientific movement. By combining this understanding with a refresher of the core scientific principles, we can then develop the necessary argumentative tools and mechanisms that may prevent our own slide down the slippery slope.
This talk will therefore start out with several entertaining examples of probably accidental, as well as definitely deliberate, false scientific findings in literature (and in particular in the field of materials research). We will then take a brief look at the possible causes for these developments, after which some tools will be presented that can help both the fresh as well as the well-seasoned scientist to rise up against the dark side.
Today's speaker is a young scientist whose research on all aspects of small-angle scattering has taken him from his birthplace in the Netherlands, to Denmark, Japan and now Germany. His research has led to a new method and software for scattering pattern analysis, a comprehensive set of data corrections together with the Diamond Light Source, and a new ultra-SAXS plug-in instrument.
For the last few years, he has been working on a comprehensive and universal methodology to get high-quality X-ray scattering measurements for any sample, using his new instrument at the institute. This instrument has now been heavily modified both in hardware and software, so that it can deliver better data.
These developments are always driven by interesting collaborations with materials researchers and other scientists. As a joint member he has published works on a wide variety of materials, including self-assembled structures in liquids, composite materials and porous carbon catalysts.
He has also been very active in outreach, for example by co-organizing an online lecture series called ‘#the Light Stuff’ on scattering and diffraction, running the ‘looking at nothing’ weblog, hosting a yearly introductory scattering course, and he has many scattering-related lectures available on YouTube.
Our distinguished speaker is Dr. Brian Richard Pauw from the Federal Institute for Materials Research and Testing in Germany. I proudly invite Dr. Pauw to begin his talk.
The Meticulous Approach: Fully traceable X-ray scattering data via a comprehensive lab methodology
(2021)
To find out whether experimental findings are real, you need to be able to repeat them. For a long time, however, papers and datasets did not necessarily include sufficient details to accurately repeat experiments, leading to a reproducibility crisis. It is here that the MOUSE project (Methodology Optimization for Ultrafine Structure Exploration) tries to implement change – at least for small- and wide-angle X-ray scattering (SAXS/WAXS).
In the MOUSE project, we have combined: a) a comprehensive laboratory workflow with b) a heavily modified, highly automated Xenocs Xeuss 2.0 instrument. This combination allows us to collect fully traceable scattering data, with a well-documented data flow (akin to what is found at the more automated beamlines). With two full-time researchers, the lab collects and interprets thousands of datasets, on hundreds of samples for dozens of projects per year, supporting many users along the entire process from sample selection and preparation to the analysis of the resulting data.
While these numbers do not hold a candle to those achieved by our hardworking compatriots at the synchrotron beamlines, the laboratory approach does allow us to continually modify and fine-tune the integral methodology. So for the last three years, we have incorporated e.g. FAIR principles, traceability, automated processing, data curation strategies, as well as a host of good scattering practices into the MOUSE system. We have concomitantly expanded our purview as specialists to include an increased responsibility for the entire scattering aspect of the resultant publications, to ensure full exploitation of the data quality, whilst avoiding common pitfalls.
This talk will discuss the MOUSE project [1] as implemented to date, and will introduce foreseeable upgrades and changes. These upgrades include better pre-experiment sample scattering predictions to filter projects on the basis of their suitability, exploitation of the measurement database for detecting long-term changes and automated flagging of datasets, and enhancing MC fitting with sample scattering simulations for better matching of odd-shaped scatterers.
Reliable characterization of materials at the nanoscale regarding their physicochemical properties is a challenging task, which is important when utilizing and designing nanoscale materials. Nanoscale materials pose a potential toxicological hazard to the environment and the human body. For this reason, the European Commission amended the REACH Regulation in 2018 to govern the classification of nanomaterials, relying on the number-based particle size distribution.
Suitable methods exist for the granulometric characterization of monodisperse and ideally shaped nanoparticles. However, the evaluation of commercially available nanoscale powders is problematic. These powders tend to agglomerate, show a wide particle size distribution and are of irregular particle shape.
Zinc oxide, aluminum oxide and cerium oxide with particle sizes less than 100 nm were selected for the studies and different preparation methods were used comparatively.
First, the nanoparticles were dispersed in different dispersants and prepared on copper TEM grids. Furthermore, individual powders were deposited on carbon-based self-adhesive pads. In addition, the samples were embedded by hot mounting and then ground and polished.
The prepared samples were investigated by scanning electron microscopy (including the transmission mode, STEM-in-SEM) and dynamic light scattering. The software package ImageJ was used to segment the SEM images and to obtain the particle sizes and shapes and, finally, the number-based particle size distribution with the size expressed by various descriptors.
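ImageJ was used for the actual evaluation; purely for illustration, the equivalent segmentation and sizing steps can be sketched in Python with scikit-image (file name, pixel size and filter settings are placeholders; strongly agglomerated particles would additionally require watershed-type separation):

    import numpy as np
    from skimage import filters, io, measure, morphology

    img = io.imread("sem_particles.tif", as_gray=True)   # hypothetical SEM/STEM image
    pixel_size_nm = 2.0                                  # nm per pixel, from the SEM calibration

    # Otsu threshold, removal of small noise objects, labelling of connected particles
    binary = img > filters.threshold_otsu(img)
    binary = morphology.remove_small_objects(binary, min_size=20)
    labels = measure.label(binary)

    diameters = []
    for region in measure.regionprops(labels):
        area_nm2 = region.area * pixel_size_nm ** 2
        diameters.append(np.sqrt(4.0 * area_nm2 / np.pi))   # area-equivalent diameter

    # number-based particle size distribution
    counts, edges = np.histogram(diameters, bins=30)
    print("median equivalent diameter [nm]:", np.median(diameters))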
Ellipsometry is a powerful tool which allows the investigation of material properties over a broad spectral range. Over the course of several years, the ellipsometry lab at BAM has become an accredited testing lab according to ISO/IEC 17025, laying bare the need for better methods for accuracy and traceability. Despite its wide range of applications in research and development as well as in industry, there have been no generally accepted standards dealing with model validation and measurement uncertainties.
Based on the first German standard DIN 50989-1:2018 Ellipsometry - Part 1: Principles (now the international standard ISO 23131:2021) and under consideration of the GUM [1], a series of standards for ellipsometry was developed. The entire six-part series covers several model-based application cases. The series avoids narrow, material-specific application cases and instead classifies applications of ellipsometry according to sample complexity. The concept of ellipsometric transfer quantities (Ψ and Δ or, alternatively, the elements of transfer matrices) is implemented in the series. For each application case, a model-based validation strategy was developed. Thus, the standards are applicable to all materials, instruments and measuring principles.
The uniform structure concept of the series facilitates its practical applicability for users. The standards include the model-based GUM-compliant determination/estimation of the measurement uncertainties. In addition, the appendices of the documents contain numerous measurement and simulation examples as well as recommendations for measuring practice.
In this contribution we present the application cases and basic structure of the standards developed in collaboration with Accurion GmbH and SENTECH Instruments GmbH in the project SNELLIUS.
Pulse-compression thermography is an emerging technique that has shown versatility by combining pulsed and lock-in thermography. Accordingly, several aspects of this technique are still unexplored, and some others are not fully developed yet. Barker codes were widely used in radar applications due to their simplicity and their optimum autocorrelation function. Nevertheless, applications were limited by the amplitude of the sidelobes present in the autocorrelation function, and therefore several filters have been developed which aim to reduce the sidelobes. However, these filters usually depend on empirical parameters which must be determined for each application. A better alternative would improve the applicability of the Barker codes. In this work, we further develop the pulse-compression thermography technique by introducing a 13-bit modified Barker code (mBC): this drastically reduces the sidelobes characteristic of the 13-bit Barker code (BC). Consequently, the thermographic impulse response, obtained by cross-correlation, is almost free of such sidelobes. Deeper defects become easier to detect in comparison with the 13-bit Barker code. Numerical simulations using the finite element method are used for comparison, and experimental measurements are performed on a sample of steel grade St 37 with machined notches of three different depths: 2 mm, 4 mm and 6 mm.
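For reference, the sidelobe structure of the standard 13-bit Barker code can be checked numerically as sketched below (the modified code introduced in this work is not reproduced here):

    import numpy as np

    barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

    # autocorrelation of the standard 13-bit Barker code:
    # main lobe of amplitude 13, sidelobes of amplitude 1
    acf = np.correlate(barker13, barker13, mode="full")
    print(acf.astype(int))
    # [ 1  0  1  0  1  0  1  0  1  0  1  0 13  0  1  0  1  0  1  0  1  0  1  0  1]

    # pulse compression in general: cross-correlate the measured thermal signal
    # with the excitation sequence to recover an approximate impulse response
    # impulse_response = np.correlate(measured_signal, excitation, mode="full")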
Laser metal deposition is a rapidly evolving method for additive manufacturing that combines high performance and a simplified production routine. The quality of production depends on the instrumental design and operational parameters, which require constant control during the process. In this work, the feasibility of using optical spectroscopy as a control method is studied via modeling and experimentally. A simplified thermal model is developed based on the time-dependent diffusion-conduction heat equation and the geometrical light collection into the detection optics. The intense light emitted by a laser-heated spot moving across the sample surface is collected and processed to yield the temperature and other temperature-related parameters. In the presence of surface defects, the temperature field is distorted in a specific manner that depends on the shape and size of the defect. Optical signals produced by such distorted temperature fields are simulated and verified experimentally using a 3D metal printer and a sample with artificially carved defects. Three quantities are tested as possible metrics for process monitoring: temperature, integral intensity, and correlation coefficient. The shapes of the simulated signals qualitatively agree with the experimental signals; this allows the cautious inference that optical spectroscopy is capable of detecting a defect and, possibly, predicting its character, e.g. inner or protruding.
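As a generic illustration of the third monitoring quantity, a correlation-coefficient metric between the measured optical signal and a defect-free reference can be written as follows (our sketch; the exact metric definitions used in this work are not reproduced here):

    import numpy as np

    def correlation_metric(signal, reference):
        """Pearson correlation between the optical signal of the current track and a
        defect-free reference signal; values well below 1 indicate a local
        distortion of the temperature field, i.e. a possible defect."""
        return float(np.corrcoef(signal, reference)[0, 1])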
Processes of laser-induced oxidation of metals are typically studied in the framework of heterogeneous chemical reactions occurring on the irradiated surface, which lead to the formation of dense oxide films deposited on it. Such technology has many applications, such as color laser marking and laser recording on thin metal films for the creation of diffractive optical elements. Under the conditions of strong laser ablation, another oxidation mechanism becomes possible: evaporated atoms react with oxygen in the surrounding atmosphere, and the products of this reaction are redeposited back onto the substrate. The chemical and phase composition of such a deposited layer, its density, morphology and structure depend on the conditions of laser ablation. By varying these conditions, the main properties of such a coating can be controlled, which is important for some potential applications (for example, in biomedicine).
In our report, we present a study of the processes of redeposition of oxide structures under the conditions of multipulse nanosecond laser ablation of titanium (Grade 2) in air at normal conditions. Our experiments show that titanium implants with such a deposited oxide layer have increased biocompatibility.
Modelling of chemical reactions in laser-induced plasma, coupled with experimental methods of plasma optical emission spectroscopy, allows us to determine the main types of chemical reactions in the laser plasma as well as their influence on the plume dynamics and vapor condensation kinetics. As a result, we propose a general physical picture of the reverse deposition of oxide structures under the conditions of strong nanosecond laser ablation. The formation of the titanium oxide precipitate is explained not only by collisions in the plasma, but also by the chemical interaction of titanium and oxygen, which leads to the formation of a low-pressure area near the substrate and additionally stimulates the reverse deposition of oxides. We expect that similar processes are valid not only for titanium but also for other metals and, possibly, semiconductors.