Engineering and allied operations
Automatically recording as much information as possible in automated laboratory setups vastly improves the reproducibility and traceability of experiments. This presentation shows what such an approach means for the quality of experiments in an X-ray scattering laboratory and an automated synthesis set-up.
A round-robin study has been carried out to estimate the impact of the human element in small-angle scattering data analysis. Four corrected datasets were provided to participants ready for analysis. All datasets were measured on samples containing spherical scatterers, with two datasets from dilute dispersions and two from powders. Most of the 46 participants correctly identified the number of populations in the dilute dispersions, with half of the population mean entries within 1.5% and half of the population width entries within 40%. Owing to the added complexity of the structure factor, far fewer participants submitted answers on the powder datasets. For those who did, half of the entries for the means and widths were within 44 and 86%, respectively. This round-robin experiment highlights several causes for the discrepancies, for which solutions are proposed.
## Summary:
This notebook and associated datasets (including VASP details) accompany a manuscript available on the arXiv (https://doi.org/10.48550/arXiv.2303.13435) and, hopefully, soon in a journal as a short communication as well. Most of the details needed to understand this notebook are explained in that paper, which has the same title as above. For convenience, the abstract is repeated here:
## Paper abstract:
We demonstrate a strategy for simulating wide-range X-ray scattering patterns, spanning the small and wide scattering angles as well as the scattering angles typically used for Pair Distribution Function (PDF) analysis. Such simulated patterns can be used to test holistic analysis models and, since the diffraction intensity is presented coupled to the scattering intensity, may offer a novel pathway for determining the degree of crystallinity.
The “Ultima Ratio” strategy is demonstrated on a 64-nm Metal-Organic Framework (MOF) particle, calculated from $Q<0.01$\,$\mathrm{nm}^{-1}$ up to $Q\approx150$\,$\mathrm{nm}^{-1}$, with a resolution of 0.16\,\AA. The computations exploit a modified 3D Fast Fourier Transform (3D-FFT), whose modifications enable the transformation of matrices at least up to $8000^3$ voxels in size. Several of these modified 3D-FFTs are combined to improve the low-$Q$ behaviour.
The resulting curve is compared to a wide-range scattering pattern measured on a polydisperse MOF powder.
While computationally intensive, the approach is expected to be useful for simulating scattering from a wide range of realistic, complex structures, from (poly-)crystalline particles to hierarchical, multicomponent structures such as viruses and catalysts.
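The core of the strategy can be sketched in a few lines of Python: voxelize a structure, Fourier-transform its scattering density, and spherically average $|F(Q)|^2$ to obtain $I(Q)$. The toy sketch below uses a small grid and a plain NumPy FFT; it illustrates the principle only, not the authors' modified large-matrix 3D-FFT:

```python
# Toy illustration of the FFT route to a scattering pattern: the paper's
# modified 3D-FFT handles grids up to 8000^3 voxels; this uses 128^3.
import numpy as np

N, dx = 128, 0.5                       # voxels per edge, voxel size in nm (illustrative)
x = (np.arange(N) - N / 2) * dx
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")

# Toy structure: a uniform sphere of radius 8 nm as scattering density.
rho = (X**2 + Y**2 + Z**2 < 8.0**2).astype(float)

# The scattering amplitude is the Fourier transform of the density.
F = np.fft.fftshift(np.fft.fftn(rho))
I3d = np.abs(F) ** 2

# Spherical average of |F|^2 over shells of constant |Q| (angular units, Q = 2*pi*f).
q1 = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
QX, QY, QZ = np.meshgrid(q1, q1, q1, indexing="ij")
Qmag = np.sqrt(QX**2 + QY**2 + QZ**2).ravel()

bins = np.linspace(0, Qmag.max(), 150)
idx = np.digitize(Qmag, bins)
I_Q = np.bincount(idx, weights=I3d.ravel()) / np.maximum(np.bincount(idx), 1)
```

Combining several such transforms at different voxel sizes, as the paper does, extends the usable $Q$-range towards lower $Q$.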
This is a remote presentation I gave at the 2022 Small-angle Scattering conference in Campinas, Brazil. The video has been obtained from the conference organisers with their explicit permission for use on YouTube. I've tried to spruce up the audio from the remote recording as best I could.
The conference abstract for this talk was:
"How much do we, the small-angle scatterers, influence the results of an investigation? What uncertainty do we add by our human diversity in thoughts and approaches, and is this significant compared to the uncertainty from the instrumental measurement factors?
After our previous Round Robin on data collection, we know that many laboratories can collect reasonably consistent small-angle scattering data on easy samples[1]. To investigate the next, human component, we compiled four existing datasets from globular (roughly spherical) scatterers, each exhibiting a common complication, and asked the participants to apply their usual methods and toolset to the quantification of the results (https://lookingatnothing.com/index.php/archives/3274).
Accompanying the datasets was a modicum of supporting information to help with the interpretation of the data, similar to what we normally receive from our collaborators. More than 30 participants reported back with volume fractions, mean sizes and size distribution widths of the particle populations in the samples, as well as information on their self-assessed level of experience and years in the field.
While the Round Robin is still underway (until the 25th of April, 2022), the initial results already show significant spread. Some of this spread is due to the variety in interpretation of the meaning of the requested parameters, as well as to simple human errors, both of which are easy to correct for. Nevertheless, even after correcting for these differences in understanding, a significant spread remains. This highlights an urgent challenge to our community: how can we better help ourselves and our colleagues obtain more reliable results, and how could we take the human factor out of the equation, so to speak?
In this talk, we will introduce the four datasets, their origins and challenges. Hot off the press, we will summarize the anonymized, quantified results of the Data Analysis Round Robin. (Incidentally, we will also see if a correlation exists between experience and proximity of the result to the median). Lastly, potential avenues for improving our field will be offered based on the findings, ranging from low-effort yet somehow controversial improvements, to high-effort foundational considerations."
This dataset is a complete set of raw, processed and analyzed data, associated with the manuscript mentioned in the title.
All associated metadata and processing history have been included. Particle size distribution analyses using McSAS are included as well.
The samples consisted of a 4.2 mass% dispersion of yttria-stabilized zirconia nanoparticles in a cross-linked matrix. The measurements show a good dispersion with minimal agglomeration. The wide-angle region shows diffraction information consistent with zirconia.
A versatile software package in the form of a Python extension, named CDEF (computing Debye’s scattering formula for extraordinary form factors), is proposed to calculate approximate scattering profiles of arbitrarily shaped nanoparticles for small-angle X-ray scattering (SAXS). CDEF generates a quasi-randomly distributed point cloud in the desired particle shape and then applies the open-source software DEBYER for efficient evaluation of Debye’s scattering formula to calculate the SAXS pattern (https://github.com/j-from-b/CDEF). If self-correlation of the scattering signal is not omitted, the quasi-random distribution provides faster convergence compared with a true-random distribution of the scatterers, especially at higher momentum transfer. The usage of the software is demonstrated for the evaluation of scattering data of Au nanocubes with rounded edges, which were measured at the four-crystal monochromator beamline of PTB at the synchrotron radiation facility BESSY II in Berlin. The implementation is fast enough to run on a single desktop computer and perform model fits within minutes. The accuracy of the method was analyzed by comparison with analytically known form factors and verified with another implementation, the SPONGE, based on a similar principle with fewer approximations. Additionally, the SPONGE coupled to McSAS3 allows one to retrieve information on the uncertainty of the size distribution using a Monte Carlo uncertainty estimation algorithm.
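The principle behind CDEF can be illustrated in a few lines: fill the particle shape with a quasi-random point cloud, then evaluate Debye's formula $I(q) = \sum_{ij} \sin(q r_{ij})/(q r_{ij})$ over all pairs. The NumPy/SciPy sketch below is a naive stand-in for illustration only; CDEF itself delegates the pair sum to DEBYER for speed, and the cube shape and point count here are arbitrary assumptions:

```python
# Minimal illustration of the CDEF principle: quasi-random points in a
# particle shape + Debye's scattering formula. Naive O(n^2) pair sum,
# for clarity only (CDEF uses DEBYER for this step).
import numpy as np
from scipy.stats import qmc            # Halton sequence: quasi-random sampling

n = 1000
points = qmc.Halton(d=3, seed=0).random(n)   # quasi-random cloud in the unit cube
points = (points - 0.5) * 10.0               # scale to a 10 nm cube (illustrative shape)

# Pairwise distances between all scatterers.
diff = points[:, None, :] - points[None, :, :]
r = np.sqrt((diff**2).sum(axis=-1))

q = np.linspace(0.1, 10.0, 200)              # scattering vector in nm^-1
# Debye formula: I(q) = sum_ij sin(q r_ij)/(q r_ij). Since np.sinc(x)
# is sin(pi x)/(pi x), sin(qr)/(qr) == np.sinc(q r / pi); the i == j
# self-terms (r = 0) correctly contribute 1 each.
I = np.array([np.sinc(qi * r / np.pi).sum() for qi in q])
```

The quasi-random (low-discrepancy) cloud is the point of the method: as the abstract notes, it converges faster than a true-random distribution when the self-correlation terms are retained, especially at higher momentum transfer.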
Nanoforms, with at least one dimension below 100 nm, play an increasingly important part in our daily lives. Risk assessment of these materials is therefore becoming ever more important. In this context, the European Chemicals Agency (ECHA) considered eleven physico-chemical properties as relevant, of which the following six are essential for registration: chemical composition, crystallinity, particle size, particle shape, surface chemistry and specific surface area. Four of these priority properties can be obtained with electron microscopy and surface-analytical methods such as XPS and ToF-SIMS. The reliability of these data must be ensured, especially for their use in grouping and read-across approaches. At the same time, the “reproducibility” crisis has revealed major shortcomings in the reliability of published data.
In a case study, we show how the quality of the data can be ensured by using existing standards and protocols for each step in the sample characterization workflow. As exemplary samples, two Al-coated TiO2 nanopowders were selected from the JRC repository, capped either with a hydrophilic or a hydrophobic organic ultrathin shell. SEM results provided the size and shape of the nanoparticles; a first overview of the composition was obtained with EDS. XPS and ToF-SIMS supplied the surface chemistry, in particular information about the shell and the coating of the particles. Standards and protocols for all steps of the analytical workflow, including preparation and data reduction, are discussed with regard to reliable and reproducible data. Additionally, uncertainties for the different steps are specified.
Only such a detailed description of all these factors allows a comprehensive physico-chemical characterization of the nanoparticles and provides a sound basis for their risk assessment.
Both essential aspects of the surface of solid matter, its morphology and its chemistry, have traditionally been studied at BAM since the 1960s, with cyclically changing research focus areas related either to applied research or to method development. In recent years, the focus has shifted almost exclusively to the nano-analytics of advanced materials such as complex nanoparticles, (ultra)thin films/coatings, nanocomposites, 2D materials, energy materials, etc. This is also why BAM recently established the new Competence Center nano@BAM (www.bam.de/Navigation/DE/Themen/Material/Nanotechnologie/sichere-nanomaterialien.html) with the five sub-fields nanoCharacterisation, nanoMaterial, nanoSafety, nanoData and nanoTechnology. The link to BAM's central remit, safety in technology and chemistry, is provided by the development of reference products such as reference measurement procedures, reference (nano)materials and, most recently, reference data sets. Thus, an internationally well-networked group in surface analysis has been established at BAM, contributing regularly to integral analytical characterization with a metrological and standardization background.
Examples of newly developed methodological approaches will be given, with an emphasis on correlative nano-analysis of the morphology and chemistry of nanomaterials. Correlative imaging by STEM-in-SEM with high-resolution SEM and EDX, and further with AFM or the newer technique TKD (Transmission Kikuchi Diffraction), will be illustrated with various examples of nanostructures, both as starting materials and as embedded/functionalized nanoparticles in products. The unique analytical benefits of the Auger electron probe as a veritable nano-tool for surface chemistry will be highlighted. The panoply of advanced surface characterization methods at BAM is completed by examples of hybrid analysis combining bulk X-ray spectroscopy of nanomaterials with the most surface-sensitive methods, X-ray Photoelectron Spectroscopy (XPS) and Time-of-Flight Secondary Ion Mass Spectrometry (ToF-SIMS). Particularly for the analysis of the surface chemistry of nanostructures, such as the completeness of the shells of core-shell nanoparticles or in-depth and lateral chemical gradients within mesoporous thin layers, the latter methods are indispensable.
Other special developments, such as approaches for the quantitative determination of the roughness of particle surfaces by electron microscopy or of the porosity of thin mesoporous layers by electron probe microanalysis (EPMA) with SEM, will be presented in conjunction with the corresponding advanced materials studied.
Current research projects and promising ideas, including ongoing (pre-)standardization activities in the challenging field of nano/surface analysis, will be touched on systematically, with the open goal of identifying future bilateral cooperation possibilities between EMPA and BAM.
ACEnano is an EU-funded project which aims at developing, optimising and validating methods for the detection and characterisation of nanomaterials (NMs) in increasingly complex matrices, to improve confidence in the results and support their use in regulation. Within this project, several interlaboratory comparisons (ILCs) for the determination of particle size and concentration have been organised to benchmark existing analytical methods. In this paper the results of a number of these ILCs for the characterisation of NMs are presented and discussed. The analyses of pristine, well-defined particles, such as 60 nm Au NMs in a simple aqueous suspension, showed that laboratories are well capable of determining the sizes of these particles. The analysis of particles in complex matrices or formulations, such as consumer products, resulted in larger variations in particle sizes within technologies and clear differences in capability between techniques. Analysis of a sunscreen lotion sample by laboratories using spICP-MS and TEM/SEM identified and confirmed the TiO2 particles as being nanoscale and compliant with the EU definition of an NM for regulatory purposes. In a toothpaste sample, orthogonal results by PTA, spICP-MS and TEM/SEM agreed in classifying the TiO2 particles as not fitting the EU definition of an NM. In general, from the results of these ILCs we conclude that laboratories are well capable of determining particle sizes of NMs, even in fairly complex formulations.
This publicly available document encapsulates the first version of the Catalogue of Services of the future EC4Safenano Centre (CoS 2019).
The CoS 2019 is structured into 12 Service Categories and 27 Service Topics for each of the 12 categories. This architecture forms a 12 × 27 matrix that organizes the potential EC4Safenano offering into 324 types of services/groups of services.
Each type of service/group of services is described, in a simple and accessible way, by means of a specific service sheet: the EC4Safenano Service Data Sheet (EC4-SDS). These EC4-SDSs structure and summarize the information on each service, providing the customer with a concise view of the service's characteristics as well as the service provider's contact details.
The CoS 2019 deploys a map of services consisting of a set of 100 EC4-SDSs, covering 7 of the 12 Service Categories and 17 of the 27 Service Topics.
The harmonization of services is envisaged as a necessary future step for EC4Safenano, in order to strengthen the offering and provide added value to customers through a growing set of harmonized services in future versions of the CoS.
The information contained in this document is structured into three main sections, as follows:
• Catalogue structure. This section briefly describes the main characteristics of the CoS 2019.
• Catalogue content. This section is the core part of the document and encapsulates the set of 100 SDSs displaying the offering proposed by the CoS 2019.
• Online Catalogue. This section describes the resources implemented by EC4Safenano to facilitate online consultation of the CoS 2019 by customers and other interested parties.
The coming years are expected to bring rapid changes in the nanotechnology regulatory landscape, with the establishment of a new framework for nano-risk governance, in silico approaches for the characterisation and risk assessment of nanomaterials, and novel procedures for the early identification and management of nanomaterial risks. In this context, Safe(r)-by-Design (SbD) emerges as a powerful preventive approach to support the development of safe and sustainable (SSbD) nanotechnology-based products and processes throughout the life cycle. This paper summarises the work undertaken to develop a blueprint for the deployment and operation of a permanent European Centre of collaborating laboratories and research organisations supporting safe innovation in nanotechnologies. The proposed entity, referred to as “the Centre”, will establish a ‘one-stop shop’ for nanosafety-related services and a central contact point for addressing stakeholder questions about nanosafety. Its operation will rely on significant business, legal and market knowledge, as well as other tools developed and acquired through the EU-funded EC4SafeNano project and subsequent ongoing activities. The proposed blueprint adopts a demand-driven service update scheme to allow the necessary vigilance and flexibility to identify opportunities and adjust its activities and services in the rapidly evolving regulatory and nano-risk governance landscape.
The proposed Centre will play a major role as a conduit for transferring scientific knowledge between the research and commercial laboratories or consultants able to provide high-quality nanosafety services, and the end-users of such services (e.g., industry, SMEs, consultancy firms and regulatory authorities). The Centre will harmonise service provision and bring novel risk assessment and management approaches, e.g. in silico methodologies, closer to practice, notably through SbD/SSbD, and decisively support safe and sustainable innovation of industrial production in the nanotechnology industry in line with the European Chemicals Strategy for Sustainability.
Herein, we provide a "systems architecture"-like overview and detailed discussions of the methodological and instrumental components that, together, comprise the "MOUSE" project (Methodology Optimization for Ultrafine Structure Exploration). The MOUSE project provides scattering information on a wide variety of samples, with traceable dimensions for both the scattering vector (q) and the absolute scattering cross-section (I). The measurable scattering vector range of 0.012 ≤ q (nm⁻¹) ≤ 92 gives access to information across a hierarchy of structures with dimensions ranging from ca. 0.1 to 400 nm. In addition to the details that comprise the MOUSE project, such as its organisation and traceability aspects, several representative examples are provided to demonstrate its flexibility. These include measurements on alumina membranes, the tobacco mosaic virus, and dual-source information that overcomes fluorescence limitations on ZIF-8 and iron-oxide-containing carbon catalyst materials.
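As a rough orientation for what that q-range means in real space, one can apply the common rule of thumb d ≈ 2π/q; this quick check is an approximation added here for illustration, not a statement from the paper:

```python
# Rule-of-thumb real-space sizes probed by the quoted MOUSE q-range (d = 2*pi/q).
import numpy as np

q_min, q_max = 0.012, 92.0                     # nm^-1, as quoted above
print(f"{2 * np.pi / q_max:.2f} nm to {2 * np.pi / q_min:.0f} nm")
# -> "0.07 nm to 524 nm"; practical structural sensitivity is somewhat
#    narrower, hence the ca. 0.1-400 nm range stated in the abstract.
```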
Regulatory decisions require reliable data and the knowledge derived from them. Among stakeholders in nanotechnology, however, there is often uncertainty about the quality of data for regulatory purposes. In addition, the general public often finds itself excluded from nanoregulation and policy decisions. This creates uncertainty in the nanotechnology field, and in other branches of technology, and leads to concerns in society.
To address these issues, NANORIGO is elaborating a framework to support decision making as well as the sharing and use of data, information and knowledge. We refer to the “reliability” of data and knowledge as a degree of readiness or maturity. On this basis we worked out a nine-level scale in analogy to the TRL (technology readiness level) scale: the KaRL system (Knowledge, Data and Information Readiness Level). KaRL allows assessment of knowledge readiness for decision making by applying defined quality criteria for each level. It also provides guidance on how to raise the readiness level with the help of available tools and procedures. KaRL addresses SEIN[1] principles and the circular economy, and thus brings public concerns into regulation. A specialized nanorisk governance council (under development in NANORIGO) is suggested to perform the quality check of an actionable document, thus aiding consensus on the reliability (maturity) of knowledge for decision making. Moreover, KaRL facilitates the traceability of knowledge before its use in decision making. This enables the transparency demanded by all stakeholders.
Nanomaterials have brought many beneficial innovations into our daily lives and have become indispensable for society. However, their risks are still insufficiently known and studied, and there is therefore a need for nanorisk governance. At the core of nanorisk governance is the gathering, processing and analysis of reliable data to be used for decision making. The challenge is to assure data reliability and transform the data into knowledge. To address this challenge, we drew an analogy to the technology readiness level (TRL) approach developed by NASA and elaborated the knowledge readiness level (KaRL). KaRL is a nine-level system for categorizing data and knowledge (documents) into levels of readiness for particular purposes, and for raising the readiness level by using quality and completeness filters, compliance requirements, nanorisk-related tools and stakeholder input. With this approach we address key issues in nanotechnology such as societal and ethical concerns, the circular economy and sustainability, and the traceability of data, knowledge and decisions.
This dataset is a complete set of raw, processed and analyzed data, complete with Jupyter notebooks, associated with the manuscript mentioned in the title.
In the manuscript, we provide a "systems architecture"-like overview and detailed discussions of the methodological and instrumental components that, together, comprise the "MOUSE" project (Methodology Optimization for Ultrafine Structure Exploration). Through this project, we aim to provide a comprehensive methodology for obtaining the highest quality X-ray scattering information (at small and wide angles) from measurements on materials science samples.
Blueprint for a sustainable new European Centre to support safe innovation for nanotechnology
(2020)
This paper presents the blueprint for the operation of a sustainable and permanent European Centre of collaborating reference laboratories and research centres, to establish a one-stop shop for a wide variety of nanosafety related services, and to provide a central contact point for questions about nanosafety in Europe. The Centre aims to harmonise service provision, and bring novel risk assessment and management approaches closer to practice.
You Ask – ACEnano Replies
(2020)
The workshop starts with introductory information about the workshop and the H2020 project ACEnano, followed by two expert round tables focussing on how the project could address regulator and industry needs, respectively. This is followed by parallel sessions on tools (based on preferences expressed by those registered to attend, see “Questions”) and finally a question-and-answer session with the attendees.
The experts invited to Round Table 1 are prepared to answer questions related to obstacles and advantages for stakeholders such as SMEs in using the ACEnano approaches/tools. Standardisation needs are also discussed.
European legislation has responded to the wide use of nanomaterials in our daily lives and defined the term “nanoform” in the Annexes to the REACH (Registration, Evaluation, Authorization of Chemicals) Regulation. Specific information on the nanomaterials is now required from companies when registering the relevant materials in a dossier.
In the context of REACH, eleven physicochemical properties were considered relevant, of which the following six are essential for the registration of nanoforms (priority properties): chemical composition, crystallinity, particle size, particle shape, chemical nature of the surface (“surface chemistry”), and specific surface area (SSA). A key requirement is the reliable, reproducible and traceable character of the data on these priority properties.
In this context, we want to discuss exactly which ‘analytical’ information is required to fulfill these conditions. Time-of-Flight Secondary Ion Mass Spectrometry (ToF-SIMS) and X-ray Photoelectron Spectroscopy (XPS) were chosen as the most popular surface-analytical methods. Both methods allow a detailed understanding of the surface chemistry with an information depth below ten nanometers. As a rather bulk-sensitive method for the analysis of nanoforms, Electron Probe Microanalysis (EPMA) with energy-dispersive X-ray spectroscopy (EDS) is considered for the quick identification of the main chemical elements present in the sample. Furthermore, Scanning Electron Microscopy (SEM) results are discussed, which provide information on particle size and shape. Thus, four of the six priority properties can be obtained with these methods.
Identifying nanomaterials (NMs) according to European Union legislation is challenging, as there is an enormous variety of materials with different physico-chemical properties. The NanoDefiner Framework and its Decision Support Flow Scheme (DSFS) allow choosing the optimal method for measuring the particle size distribution by matching the material properties to the performance of the particular measurement techniques. The DSFS leads to a reliable and economical decision on whether a material is an NM or not, based on scientific criteria and respecting regulatory requirements. The DSFS starts beyond the regulatory requirements by identifying non-NMs via a proxy approach based on their volume-specific surface area. In a second step, it identifies NMs. The DSFS has been tested on real-world materials and is implemented in an e-tool. The DSFS is compared with a decision flowchart of the European Commission's (EC) Joint Research Centre (JRC), which rigorously follows the explicit criteria of the EC NM definition with a focus on identifying NMs, non-NMs being identified by exclusion. The two approaches build on the same scientific basis and measurement methods but start from opposite ends: the JRC Flowchart starts by identifying NMs, whereas the NanoDefiner Framework first identifies non-NMs.
In contrast to the crisp, clear images you can get from electron microscopy, small-angle X-ray scattering (SAXS) patterns are rather featureless. These patterns, however, contain averaged structural information on all of the finest material structures illuminated by the X-ray beam. With careful and precise investigation, and supplementary information from complementary techniques, this bulk material structure can be quantified to reveal structural information spanning four or even five decades in size. Additionally, while the data correction and analysis are complex, sample preparation is very straightforward, also allowing in-situ and operando measurements to be performed without breaking a sweat. In the right hands, then, this technique can be the most powerful tool in your analytical arsenal.
This chapter provides an introduction to secondary ion mass spectrometry as one of the leading surface chemical analysis and imaging techniques with molecular specificity in the field of materials science. The physical basics of the technique are explained along with a description of the typical instrumental setups and their modes of operation. The applications section focuses specifically on nanoparticle analysis by SIMS in terms of surface spectrometry, imaging, analysis in organic and complex media, and depth profiling.
A review of the existing literature is provided, and selected studies are showcased. Limitations and pitfalls as well as current technical developments of SIMS application in nanoparticle surface chemical analysis are equally discussed.
This chapter first gives an introduction to the concepts of SSA and volume-specific surface area (VSSA) and an outline of the BET method. It continues with a discussion of the relationship between particle size, shape and VSSA, followed by an overview of instrumentation, experimental methods and standards. Finally, sections on the use of VSSA as a tool to identify nanomaterials and non-nanomaterials, and on its role in a regulatory context, provide some insight into the importance of VSSA in the current regulation of nanomaterials.
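For idealized monodisperse spheres, the size-VSSA relationship discussed in the chapter reduces to VSSA = 6/d, which is what makes VSSA usable as a screening proxy. A minimal sketch follows; the 60 m²/cm³ threshold is taken from the EC nanomaterial-definition context rather than quoted from this chapter:

```python
# VSSA <-> size relation for idealized monodisperse spheres: VSSA = 6/d
# (surface area per unit particle volume).
def vssa_m2_per_cm3(d_nm: float) -> float:
    """VSSA of spheres with diameter d_nm, returned in m^2/cm^3."""
    d_m = d_nm * 1e-9
    return (6.0 / d_m) * 1e-6      # convert m^2/m^3 -> m^2/cm^3

# The EC definition's screening threshold of 60 m^2/cm^3 corresponds
# exactly to 100 nm spheres:
print(vssa_m2_per_cm3(100.0))      # -> 60.0
```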
Fluorescent particles like nm- and µm-sized polymeric beads doped or labeled with different types of fluorophores, and nanocrystalline systems like quantum dots and upconversion phosphors emitting in the visible (vis), near-infrared (NIR) and infrared (IR) regions, are of increasing importance as fluorescent reporters for bioanalysis and medical diagnostics. The assessment and comparison of material performance, and the development of rational design strategies for improved systems, require suitable spectroscopic tools for the determination of signal-relevant optical properties and analytical tools for the determination of the number of surface groups, ligands, biomolecules and/or fluorophores per bead. In this respect, suitable spectroscopic tools for the characterization of the optical properties of such materials, like photoluminescence quantum yields and brightness values, and for the determination of their surface chemistry are introduced. This includes integrating sphere setups for absolute measurements of the fluorescence quantum yields of liquid and solid, transparent and scattering materials in the wavelength region of 350 nm to 1600 nm at varying excitation power densities for the study of multi-photon processes, as well as simple optical assays, validated by comparison with established analytical techniques relying on different detection principles. Finally, different examples of the optical and analytical characterization of different types of nanoscale reporters are presented.
There is an increasing interest in optical reporters like semiconductor quantum dots and upconversion nanocrystals with emission > 800 nm for bioanalysis, medical diagnostics, and safety barcodes. Prerequisites for the comparison of material performance, the mechanistic understanding of nonradiative decay channels, and the rational design of new nanomaterials with improved properties are reliable fluorescence measurements and validated methods for the assessment of their surface chemistry. The latter is of special relevance for nanocrystalline emitters, where surface states and the accessibility of emissive states to quenchers largely control the achievable photoluminescence quantum yields and hence the signal sizes and detection sensitivities from the reporter side. Here, we present the design of integrating sphere setups for the excitation power density-dependent absolute measurement of emission spectra and photoluminescence quantum yields in the wavelength region of 350 to 1600 nm, and results from spectroscopic studies of semiconductor quantum dots and upconversion nanocrystals of different sizes and surface chemistries in various environments. Subsequently, examples of simple approaches to surface group and ligand analysis are presented.
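At its core, an absolute integrating-sphere quantum-yield measurement is photon-number bookkeeping: Φ = photons emitted / photons absorbed, determined against a blank. The schematic sketch below illustrates that ratio only; the two-measurement scheme and all variable names are illustrative assumptions, not the authors' protocol:

```python
# Schematic absolute photoluminescence quantum yield (PLQY) from
# integrating-sphere spectra: Phi = emitted photons / absorbed photons.
# Spectra are assumed background-corrected and in photon-flux units.
import numpy as np

def plqy(wl_nm, sample, blank, em_window, ex_window):
    """PLQY from sample and blank spectra measured in the sphere."""
    def photons(spectrum, window):
        lo, hi = window
        m = (wl_nm >= lo) & (wl_nm <= hi)
        return np.trapz(spectrum[m], wl_nm[m])
    # Extra photons in the emission window, relative to the blank.
    emitted = photons(sample, em_window) - photons(blank, em_window)
    # Excitation photons missing from the sample scan = absorbed photons.
    absorbed = photons(blank, ex_window) - photons(sample, ex_window)
    return emitted / absorbed
```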
Coatings, stabilization layers, functionalization of particles or simple contamination are common variants of a core-shell system. For smaller nanoparticles this is of major importance: a particle of 16 nm diameter with a typical 2 nm surface layer will have roughly the same volume in the core as in the shell. In this case, the material of the particle no longer has a clear definition. It is quite common for a particle to consist of four different layers: core, shell, stabilization layer and contamination. The properties of the particle differ according to this structure. For example, pure silver particles and silver grown as a shell on top of a different core might have different dissolution rates.
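A quick back-of-the-envelope check of the volume claim, assuming a spherical particle with the numbers from the text:

$$\frac{V_\mathrm{core}}{V_\mathrm{total}} = \left(\frac{d_\mathrm{core}}{d_\mathrm{total}}\right)^{3} = \left(\frac{16\,\mathrm{nm} - 2 \times 2\,\mathrm{nm}}{16\,\mathrm{nm}}\right)^{3} = \left(\frac{12}{16}\right)^{3} \approx 0.42,$$

so the 2 nm shell accounts for about 58% of the total volume, comparable to the roughly 42% in the core.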
A different solubility, or other defined properties of the materials, is a common reason for producing core-shell systems. Gold cores are surrounded by silica to stabilize them or to obtain a defined distance between the cores. Silica might be surrounded by gold and the silica dissolved afterwards, which delivers hollow shells. Another important example of core-shell systems are quantum dots: a small core is surrounded by a different material to increase the photoluminescence, and a stabilization layer is furthermore needed. The smallest part of the final particle is the initial core. The photoluminescence is based on this core, but the shells contain much more material. Categorization should address this.
Core-shell systems are not covered by most of the existing decision trees for grouping. They are either regarded as a special case or as a single layer. This disqualifies core-shell systems from grouping within the common models. There might be a very easy way to avoid this problem, and even to combine some of the different decision trees: starting the decision tree with the solubility of the outer shell and subsequently addressing the inner layers would be a pragmatic approach. If there is no shell, the categorization can start with a tiered approach or with the proposed “strawman” chemical categorization. If a shell covers the surface, one needs to check whether the shell is stable. If it is stable, the particle can be categorized based on this shell. If it is soluble, the ions need to be addressed as in the classic case. Furthermore, the shell might increase uptake by cells. If the ions and the uptake are not critical, the categorization can continue with the next layer, as sketched below.
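A minimal sketch of this outside-in decision loop; the layer attributes, their names and the example values are illustrative assumptions for this text, not part of any standardized scheme:

```python
# Illustrative sketch of the proposed outside-in decision loop for
# core-shell systems; attribute names and logic granularity are
# assumptions, not a standardized categorization scheme.
from dataclasses import dataclass

@dataclass
class Layer:
    material: str
    stable: bool            # insoluble under the relevant conditions?
    ions_critical: bool     # are released ions toxicologically critical?
    increases_uptake: bool  # does this layer increase cellular uptake?

def categorize(layers: list) -> str:
    """Walk the layers from the outermost shell inwards."""
    for layer in layers:                 # ordered outermost -> innermost
        if layer.stable:
            # A stable outer layer determines the categorization.
            return f"categorize based on {layer.material}"
        if layer.ions_critical or layer.increases_uptake:
            # Soluble shell with critical ions or enhanced uptake:
            # address as in the classic (ion-based) case.
            return f"assess ions/uptake of {layer.material}"
        # Neither is critical: continue with the next (inner) layer.
    return "no (remaining) shell: tiered or 'strawman' chemical categorization"

# Example: a silver particle with a soluble silica coating.
print(categorize([
    Layer("silica coating", stable=False, ions_critical=False, increases_uptake=False),
    Layer("silver core", stable=False, ions_critical=True, increases_uptake=False),
]))  # -> "assess ions/uptake of silver core"
```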
With this imperfect but pragmatic approach, the surface layers can be addressed with very limited additional effort. Most criteria are based on classically tabulated data. Including a rating system like the precautionary-matrix approach might even address the fact that some parameters are not always yes/no decisions, e.g. solubility, ion toxicity and uptake.