Research Datasets of BAM
FenicsXConcrete
(2023)
This dataset contains raw data resulting from Impact-Echo measurements at the reference concrete block "Radarplatte", located at BAM (German Federal Institute for Materials Research and Testing). This specimen has been described in detail by Niederleithinger et al. (2021), who applied muon tomography, ultrasonic echo measurements, radar and X-ray laminography to visualize its internal structure.
The Impact-Echo method is based on the excitation of the zero-group-velocity frequency of the first symmetric Lamb mode of a plate-like structure in order to assess its thickness. Numerous publications elaborate on Impact-Echo theory; examples are Gibson and Popovics (2005), Schubert and Köhler (2008) and Abraham and Popovics (2010).
The measurements have been conducted using a setup that contains only commercially available components. The setup consists of an Olson CTG-2 concrete thickness gauge (Olson Instruments, USA) for actuation and sensing, and an 8-bit NI USB-5132 digital storage oscilloscope (National Instruments, USA) combined with the Echolyst software (Schweizerischer Verein für technische Inspektionen (SVTI), Switzerland) for data acquisition.
Measurements were conducted on a grid of 23x23 points with a spacing of 50 mm. At each point, 8192 samples were recorded at a sampling rate of 1 MS/s.
The dataset contains the (X,Y) location in mm of the individual measurement points as well as the raw measurement data at those points.
The data is provided in the formats *.mir/*.mhdr (Echolyst), *.npy (Python), *.mat (MATLAB) and *.csv to ease the import into various post-processing tools.
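As an illustration of how such raw data might be post-processed, the following sketch loads the *.npy traces, computes an amplitude spectrum for one grid point and picks the dominant frequency, which in the Impact-Echo method corresponds to the zero-group-velocity Lamb mode. The file names and the array layout are assumptions made for illustration, not documented properties of the dataset.

```python
# Illustrative sketch, not part of the dataset: load the raw traces and
# estimate the dominant Impact-Echo frequency at one grid point via an FFT.
# File names and array layout are assumptions.
import numpy as np

FS = 1_000_000        # sampling rate: 1 MS/s (from the dataset description)
N_SAMPLES = 8192      # samples recorded per measurement point

traces = np.load("impact_echo_raw.npy")               # assumed shape: (529, 8192) for the 23x23 grid
points = np.loadtxt("points_xy.csv", delimiter=",")   # assumed (X, Y) locations in mm

spectrum = np.abs(np.fft.rfft(traces[0]))
freqs = np.fft.rfftfreq(N_SAMPLES, d=1.0 / FS)
peak = freqs[np.argmax(spectrum[1:]) + 1]             # skip the DC bin
print(f"Dominant frequency at point {points[0]}: {peak / 1e3:.1f} kHz")
```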
The KupferDigital project aims to develop digital methods, tools and data space infrastructures for digitalizing the entire life cycle of copper materials. Mechanical testing is one of the main chains of such life cycles and generates important test data on the mechanical properties of the materials, together with the related materials and testing metadata. To train the digitalization of the mechanical testing process, different kinds of copper alloys were provided for this project, and their mechanical properties were measured by typical methods such as Brinell and Vickers hardness testing and tensile testing. The primary raw test data as well as the secondary datasets of these tests are provided. The detailed materials specifications, the mechanical testing methods used and the provided datasets are described in the content file. The test data files of heterogeneous structure are processed by the KupferDigital tools and converted into standardized machine-readable data files.
KupferDigital mechanical testing datasets: Stress relaxation and low-cycle fatigue (LCF) tests
(2024)
The KupferDigital project deals with the development of a data ecosystem for digital materials research on the basis of ontology-based digital representations of copper and copper alloys. This document provides exemplary mechanical testing datasets for training the developed KupferDigital infrastructures. Different types of cast copper alloys were provided for this research, and their mechanical testing (stress relaxation and low-cycle fatigue) was performed in an accredited materials testing laboratory. The test results were reported according to the relevant DIN/ISO standards and accompanied by the maximum possible metadata on sample history, equipment and calibration. The attached content file consists of the obtained primary raw test data as well as the secondary datasets of these tests, containing the detailed metadata of the mechanical testing methods. These test data files are processed by the KupferDigital tools and converted into standardized machine-readable data files.
Mechanical testing ontology
(2023)
The materials mechanical testing ontology (MTO) was developed by collecting the mechanical testing vocabulary from the ISO 23718 standard, as well as the standardized testing processes described for various mechanical tests of materials such as tensile testing, the Brinell hardness test, the Vickers hardness test, the stress relaxation test and fatigue testing. Conforming to the ISO/IEC 21838-2 standard, MTO uses the Basic Formal Ontology (BFO), Common Core Ontology (CCO), Industrial Ontologies Foundry (IOF), Quantities, Units, Dimensions and Data Types ontology (QUDT), and Material Science and Engineering Ontology (MSEO) as upper-level ontologies.
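As a hedged illustration of how such an OWL ontology could be inspected programmatically, the sketch below loads a local copy with rdflib and enumerates its declared classes. The file name mto.owl is a placeholder, not the name of the published artifact.

```python
# Illustrative sketch: inspect the ontology with rdflib. The file name
# "mto.owl" is a placeholder for a local copy of the published MTO file.
from rdflib import Graph, RDF, RDFS, OWL

g = Graph()
g.parse("mto.owl")  # rdflib guesses the serialization from the file suffix

# Enumerate all declared OWL classes together with their labels, if any.
for cls in g.subjects(RDF.type, OWL.Class):
    for label in g.objects(cls, RDFS.label):
        print(cls, "-", label)
```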
Data for the publication "The role of extracellular polymeric substances of fungal biofilms in mineral attachment and weathering" (https://doi.org/10.1038/s41529-022-00253-1). It includes:
- The summary of the EPS concentration, EPS sugar components and EPS linkages.
- The summary of the XPS analysis of freeze-dried biofilm samples of all strains.
- The summary of the pH, Mg, SI and Fe concentrations, biomass and olivine dissolution rate for each time point of all dissolution experiments.
In the field of computational science and engineering, workflows often entail the application of various software, for instance, for simulation or pre- and postprocessing. Typically, these components have to be combined in arbitrarily complex workflows to address a specific research question. In order for peer researchers to understand, reproduce and (re)use the findings of a scientific publication, several challenges have to be addressed. For instance, the employed workflow has to be automated, and information on all used software must be available for a reproduction of the results. Moreover, the results must be traceable, and the workflow documented and readable to allow for external verification and greater trust. In this paper, existing workflow management systems (WfMSs) are discussed regarding their suitability for describing, reproducing and reusing scientific workflows. To this end, a set of general requirements for WfMSs was deduced from user stories that we deem relevant in the domain of computational science and engineering. On the basis of an exemplary workflow implementation, publicly hosted on GitHub, a selection of different WfMSs is compared with respect to these requirements, to support fellow scientists in identifying the WfMSs that best suit their requirements.
Software-driven scientific workflows are often characterized by a complex interplay of various pieces of software executed in a particular order. The output of a computational step may serve as input to a subsequent computation, which requires them to be processed sequentially with a proper mapping of outputs to inputs. Other computations are independent of each other and can be executed in parallel. Thus, one of the main tasks of a workflow tool is a proper and efficient scheduling of the individual processing steps.
Each processing step, just as the workflow itself, typically processes some input and produces output data. Apart from changing the input data to operate on, processing steps can usually be configured by a set of parameters to change their behavior. Moreover, the behavior of a processing step is determined by its source code and/or executable binaries/packages that are called within it. Beyond this, the computation environment not only has a significant influence on its behavior, but is also crucial in order for the processing step to work at all. The environment includes the versions of the interpreters or compilers, as well as all third-party libraries and packages that contribute to the computations carried out in a processing step.
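The scheduling task described above can be made concrete with a small sketch based on Python's standard-library graphlib; the step names and the dependency structure are invented for illustration and are not taken from any particular WfMS.

```python
# Minimal scheduling sketch: each step names the steps whose outputs it
# consumes; a step may run once its inputs exist, independent ones in parallel.
from graphlib import TopologicalSorter

steps = {
    "preprocess": set(),
    "simulate": {"preprocess"},
    "postprocess_a": {"simulate"},   # independent of postprocess_b,
    "postprocess_b": {"simulate"},   # so both could run in parallel
    "report": {"postprocess_a", "postprocess_b"},
}

ts = TopologicalSorter(steps)
ts.prepare()
while ts.is_active():
    ready = ts.get_ready()           # all steps whose inputs are now available
    print("can run in parallel:", ready)
    ts.done(*ready)                  # mark them finished to unlock successors
```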
Multiscale modeling of linear elastic heterogeneous structures via localized model order reduction
(2024)
In this paper, a methodology for fine scale modeling of large scale linear elastic structures is proposed, which combines the variational multiscale method, domain decomposition and model order reduction. The influence of the fine scale on the coarse scale is modelled by the use of an additive split of the displacement field, addressing applications without a clear scale separation. Local reduced spaces are constructed by solving an oversampling problem with random boundary conditions. Herein, we inform the boundary conditions by a global reduced problem and compare our approach using physically meaningful correlated samples with existing approaches using uncorrelated samples. The local spaces are designed such that the local contribution of each subdomain can be coupled in a conforming way, which also preserves the sparsity pattern of standard finite element assembly procedures. Several numerical experiments show the accuracy and efficiency of the method, as well as its potential to reduce the size of the local spaces and the number of training samples compared to uncorrelated sampling.
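As a rough, self-contained illustration of the sampling idea (not the paper's actual implementation), the sketch below stands in a random matrix for the transfer operator of an oversampling problem, draws random boundary data, and compresses the resulting snapshots with a POD. All sizes and the operator itself are placeholders.

```python
# Conceptual sketch only: a placeholder transfer operator T maps boundary
# data to interior solutions; POD of random snapshots yields a local basis.
# In the actual method, T stems from a finite element oversampling problem
# and the samples are correlated, informed by a global reduced problem.
import numpy as np

rng = np.random.default_rng(0)
n_boundary, n_interior, n_samples = 40, 200, 30

T = rng.standard_normal((n_interior, n_boundary))    # placeholder operator

boundary_samples = rng.standard_normal((n_boundary, n_samples))  # uncorrelated
snapshots = T @ boundary_samples

# POD: keep left singular vectors whose singular values exceed a tolerance.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, s / s[0] > 1e-8]
print("local reduced space dimension:", basis.shape[1])
```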
This is the repository of all experimental raw data used in the Scientific Reports publication "Specific adsorption sites and conditions derived by thermal decomposition of activated carbons and adsorbed carbamazepine" by Daniel Dittmann, Paul Eisentraut, Caroline Goedecke, Yosri Wiesner, Martin Jekel, Aki Sebastian Ruhl, and Ulrike Braun.
It includes
- overview_measurements.xlsx and overview_measurements.ods containing a list of all TGA experiments (TGA, TGA-FTIR, TED-GC-MS, and ramp-kinetics)
- TED-GC-MS.zip containing gas chromatography-mass spectrometry experiment files for ChemStation and OpenChrom
- TGA.zip containing thermogravimetric analyses raw data on evolved gas analyses experiments (TGA-FTIR and TED-GC-MS)
- TGA_kinetics.zip containing thermogravimetric analyses raw data on decomposition kinetic experiments (ramp-kinetics)
- TGA-FTIR.zip containing Fourier-transform infrared spectroscopy series files for OMNIC
- XRF.zip containing X-ray fluorescence data on elemental composition
Scientific welding data covers a wide range of physical domains and timescales and is measured using a variety of different sensors. Complex and highly specialized experimental setups at different welding institutes further complicate the exchange of welding research data. The WelDX research project aims to foster the exchange of scientific data inside the welding community by developing and establishing a new open-source file format suitable for the documentation of experimental welding data and by upholding associated quality standards. In addition to fostering scientific collaboration inside the national and international welding community, an associated advisory committee will be established to oversee the future development of the file format. The proposed file format will be developed with regard to the current needs of the community regarding interoperability, data quality and performance, and will be published under an appropriate open-source license. By using the file format, objectivity, comparability and reproducibility across different experimental setups can be improved.
2PP-TestArtifact
(2023)
This repository contains a test artifact (TA), also called a test structure, designed for two-photon polymerization (also known as direct laser writing (DLW) or two-/multi-photon lithography (2PA/MPA)). Test artifacts can be used to compare structures, to check options used by the slicer, to check the state of the 2PP machine itself, or to derive construction guidelines for a certain combination of power, velocity and settings.
The associated paper can be found here: https://dx.doi.org/10.1088/1361-6501/acc47a
General ideas behind the test artifact:
1. optimized for 2PP-DLW
2. should be fast and easy to analyse with optical microscopy or scanning electron microscopy without tilt
3. short time to fabricate
4. include a reasonable amount of different features
5. bulk and small structures on the substrate
Test artifact for fs-LDW
(2023)
"This data set contains three different data types obtained from concrete specimens. For each specimen, the rebound numbers, ultrasonic data (ultrasonic velocity, time of flight), and destructive concrete strength are given. Two kind of specimen geometries were tested: cubes and drilled cores. The files are labeled according to the specimen geometry as "cube" or "core" and the type of measurement data as "compressive_strength", "rn_R" and "rn_Q" for rebound numbers as well as "us" for ultrasonic data. The ultrasonic data were generated by six independent laboratories, the rebound numbers by five independent laboratories and the destructive tests by one laboratory. The designation of each specimen establishes the relationship between the different data types."
This data set contains three different data types obtained from concrete specimens. For each specimen, the rebound numbers, the ultrasonic data (ultrasonic velocity, time of flight) and the destructive concrete strength are given. Two kinds of specimen geometries were tested: cubes and drilled cores. The files are labeled according to the specimen geometry as "cube" or "core" and to the type of measurement data as "compressive_strength", "rn_R" and "rn_Q" for rebound numbers, as well as "us" for ultrasonic data. The ultrasonic data were generated by six independent laboratories, the rebound numbers by five independent laboratories and the destructive tests by one laboratory. The designation of each specimen establishes the relationship between the different data types.
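To show how the specimen designation could be used to link the three data types, here is a hypothetical pandas sketch; the actual file names and column headers in the dataset may differ.

```python
# Hypothetical sketch: join the three data types via the specimen
# designation. File names and column names are assumptions.
import pandas as pd

strength = pd.read_csv("cube_compressive_strength.csv")
rebound = pd.read_csv("cube_rn_R.csv")
ultrasonic = pd.read_csv("cube_us.csv")

# The specimen designation is the common key across all data types.
merged = (
    strength
    .merge(rebound, on="specimen")
    .merge(ultrasonic, on="specimen")
)
print(merged.head())
```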
Python Materials Genomics (pymatgen) is a robust materials analysis code that defines classes for structures and molecules with support for many electronic structure codes. This open-source software package powers the Materials Project.
In this particular contribution, the handling of orbital-resolved "ICOHPLIST.lobster" files from Lobster was implemented in the software package (GitHub handle: @JaGeo).
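As a brief, hedged usage sketch (attribute names can vary between pymatgen versions, so the docs should be checked), such a file could be read roughly as follows:

```python
# Hedged sketch: parse an ICOHPLIST.lobster file with pymatgen's Lobster IO.
# Attribute names may differ between pymatgen versions.
from pymatgen.io.lobster import Icohplist

icohplist = Icohplist(filename="ICOHPLIST.lobster")

# The parsed values are exposed as a dict keyed by the bond label.
for label, entry in icohplist.icohplist.items():
    print(label, entry)
```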