8.0 Abteilungsleitung und andere
Industrial radiology is used for the volumetric inspection of industrial objects. By penetrating these objects (typically weldments, pipes or castings) with X-ray or gamma radiation, the 3D volume is projected onto a 2D image detector. The X-ray film is the oldest radiographic image detector and is still in wide use in industry. The industrial X-ray film systems used today differ from those used in medicine. Medical film systems are well described in the literature, but industrial film systems are not. We therefore start with a description of the properties of and standards for industrial film systems. The requirements on image quality are defined by several standards and can be verified with different image quality indicators (IQIs). They describe the ability of a human observer to detect small and low-contrast indications in a noisy image background. The essential parameters for digital industrial radiology are described.
For about 30 years, electronic image detectors have been gradually replacing industrial film. These detectors are based on storage phosphor imaging plates in combination with laser scanners ("Computed Radiography", CR) or on a variety of digital detector arrays (DDAs). Typical applications of CR and DDAs are discussed, as well as the new possibilities opened up by computer-based image handling, processing and analysis.
The replacement of X-ray film by digital detectors in non-destructive testing in plant construction has been supported by new ISO standards since 2013. After the success story of digital photography and the almost complete replacement of film technology in optical imaging, the transition from film radiography to digital radiography is expected to happen quickly. This trend cannot yet be observed. For mobile inspection in Europe, film radiography is still used predominantly, and computed radiography with storage phosphor imaging plates only in part. In stationary inspection, digital detector arrays (DDAs) have largely prevailed, although alternative detectors such as image intensifiers and fluoroscopes are still in use there as well. With the introduction of DIN EN ISO 17636-2 for weld inspection, it has been regulated for the first time since 2013 how to select a digital detector for technical radiography in order to obtain an image quality equivalent to film radiography. Essential parameters determine whether the required image quality (flaw detection sensitivity) is achieved. An overview shows which parameters are important for practical use and how they are determined. Practical application examples from plant construction are discussed.
In the last two years, 11 new standards or standard revisions on radiographic testing (RT) were published (excluding radiation protection) and 5 were withdrawn in return. The new requirements of selected standards and standard revisions are presented, in particular the changed parameters that have to be considered in testing practice and in classification. The most important project is the revision of DIN EN ISO 17636-1, -2, RT of welded joints, in ISO TC 44 SC 5 WG 1. The standards for weld inspection in nuclear engineering, DIN 25435, were translated into English and processed at ISO TC 85 SC 6. The main activity at ASTM is currently the revision of the CT standards and the consideration of the requirements for dimensional measurement. The guide ASTM E 1441 on the determination of the MTF, the contrast-detail function (CDF) and the contrast-discrimination diagram (CDD) is being revised. E 2445 on CR long-term stability is also being revised; this revision is also to be incorporated into ISO 16371-1, and a round-robin test is planned for this purpose. The standards for radiographic testing for corrosion and deposits in pipes with X-rays and gamma rays (EN 16407-1, -2, 2014) were also revised. This revision was published as DIN EN ISO 20769-1, -2 in 2018, and EN 16407 will be withdrawn. The revision of the standards EN 12543 and EN 12679 on measuring the focal spot size and the source size is in progress with some delay, in order to take digital detectors and measurement options with image processing programs into account. A draft on measuring the focal spot size of nano-focus tubes is currently funded by the European metrology programme EMPIR. For radiographic testing of castings, the standard EN 12681 was revised and published in 2018. It was extended by acceptance limits, and digital radiography was taken into account in Part 2.
Metallic microparticles in the size fraction of approx. 10 to 100 micrometres form the base material for the additive build-up of complex structures by local melting with a laser beam. Decisive for this process is the knowledge of the size, shape and defect distributions of the particle fractions, since they directly influence the flowability of the powder during layer-wise deposition as well as the final internal porosity of the component. Pores present in the powder usually do not leave the melt pool before solidification, because their convection times are longer than the lifetime of the local melt pool, so pores introduced by the starting particles generally remain in the finished component. Strict quality control of the feedstock powder is therefore a prerequisite for obtaining high-quality components with low residual porosity. The advantage of X-ray tomography is that internal pores down to a diameter of a few micrometres can be detected with very little preparation effort; this is not possible in this way with competing methods (light/electron microscopy, laser diffraction). In summary, we will show that non-destructive imaging with X-ray microtomography is a powerful tool for quality control in additive manufacturing. We discuss the limits of the method by means of a numerical study using the simulation program aRTist. Different particle sizes are examined with regard to which pores can be detected with microradiographic techniques. For this purpose, a model of a three-layer densest sphere packing is developed, so that overlapping indications can also be investigated. As a result, different combinations of particle and pore sizes are examined and the corresponding detection limits are determined.
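As a rough illustration of the packing model mentioned above, the following sketch (an assumption made purely for illustration, not the aRTist input actually used) generates the centre coordinates of a three-layer hexagonal close packing of spherical particles; such a coordinate list could serve as input geometry for a radiographic simulation of overlapping indications.

```python
# Minimal sketch: centres of a three-layer (A-B-A) hexagonal close packing of
# powder particles with radius r; all dimensions are illustrative only.
import numpy as np

def hcp_three_layers(radius: float, nx: int = 5, ny: int = 5) -> np.ndarray:
    """Return an (N, 3) array of sphere centres for an A-B-A close packing."""
    d = 2.0 * radius                      # centre-to-centre distance
    row_step = d * np.sqrt(3.0) / 2.0     # in-plane row spacing
    layer_step = d * np.sqrt(2.0 / 3.0)   # spacing between close-packed layers
    centres = []
    for k, shift in enumerate([(0.0, 0.0), (d / 2.0, row_step / 3.0), (0.0, 0.0)]):
        for j in range(ny):
            for i in range(nx):
                x = i * d + (j % 2) * d / 2.0 + shift[0]
                y = j * row_step + shift[1]
                z = k * layer_step
                centres.append((x, y, z))
    return np.asarray(centres)

if __name__ == "__main__":
    particles = hcp_three_layers(radius=25e-3)   # 25 µm expressed in mm, example value
    print(particles.shape)                        # (75, 3) sphere centres
```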
In addition to classical transmission radiography, imaging X-ray backscattering has so far only been used to a very limited extent. More than 20 years ago, Philips found a first industrial application in aviation with "COMSCAN", which even offered depth resolution. The company AS&E in Boston offers X-ray backscatter systems for the security sector. In these systems, an object is scanned from one side with a highly collimated pencil beam produced by a rotating aperture wheel. To detect the radiation scattered back from the object, large-area detectors are placed directly next to the rotating pencil-beam collimator. Such a prototype is being investigated and optimised at BAM for use in non-destructive testing. In-situ monitoring in additive manufacturing is envisaged as a new industrial field of application. Here, access to the part being built in the 3D printing machines is severely restricted, which prevents the use of two-sided transmission radiography or computed tomography, in which the object is rotated. The materials used in additive manufacturing (polymers, ceramics, light metals such as Al or Ti) are also better suited for backscattering than metals of higher density, since the scatter signal decreases with atomic number and material density. However, the requirements on spatial resolution and contrast sensitivity in non-destructive testing of additively manufactured components are significantly higher than in the security sector, since component flaws with typical dimensions smaller than 1 mm must be detected reliably. Investigating these limits of the current state of pencil-beam X-ray backscatter technology is one part of the multi-year BAM focus area project "ProMoAM". The presentation explains the first optimisation results and the application limits found, using examples.
Neues aus der BAM 2019
(2019)
X-ray backscatter imaging is rarely applied compared to classical X-ray projection imaging. More than 20 years ago, the company Philips developed "COMSCAN", a first application case for the aircraft industry, which even allowed a depth resolution using backscatter imaging. The company AS&E in Boston offers backscatter imaging solutions for the security market. The principle is to scan the object with a highly collimated X-ray needle beam from one side only and to detect the backscattered radiation with a large-area detector placed next to the collimation wheel. A new prototype is being investigated at BAM for application and optimization in non-destructive testing. In-situ inspection in additive manufacturing is targeted as a modern industrial application field. The accessibility of the printed part during the production process is very limited. This prevents the application of two-sided X-ray inspection or computed tomography, where a rotation of the object is required to acquire projections over 360 degrees. An important advantage for the X-ray backscatter technique is also the set of materials used in additive manufacturing (polymers, ceramics, light metals like aluminum or titanium). These materials with lower density and lower Z values give better scatter signals than metals with higher densities and Z values; the backscatter intensity decreases with increasing density and Z value of the material. However, the requirements on spatial resolution and contrast sensitivity are more stringent for non-destructive testing of additively manufactured parts compared to the security area: in NDT, indications smaller than 1 mm have to be detected reliably. The investigation of these limits on a state-of-the-art prototype for X-ray backscattering using rotating collimated X-ray needle beams is part of the BAM project "ProMoAM". The contribution shows first results of the optimization for NDT and the achieved application limits for several example cases.
The technology of hybrid particle (or photon) counting image detectors has been developed for more than two decades by institutions such as CERN (MediPix/TimePix chips) and the Paul Scherrer Institute (Pilatus/Eiger chips). The origins of this technology lie in high-energy physics with its accelerator-based experiments, where reliable and noise-free detection of every ionising particle is required; energy resolution is often demanded as well. This is also useful in classical radiography, so these detectors have been optimised for it, too. The latest photon-counting detectors offer excellent imaging properties: noise-free photon detection over a high dynamic range (SNR > 2500 achievable), high spatial resolution (down to 10 µm pixel size), energy thresholds up to full spectral resolution and very high speed (up to thousands of frames per second). These hybrid detectors can be equipped with highly absorbing sensor layers made of CdTe, CZT or GaAs in order to realise imaging applications with gamma rays or hard X-rays, in which scintillator-based image detectors have dominated so far (e.g. for NDT, SPECT, PET). The contribution presents the first truly large photon-counting detector with CdTe and a gapless detection area composed of a theoretically unlimited number of MediPix/TimePix modules. The first commercial version of such a detector has an area of 7 cm x 7 cm, 1.6 megapixels and fulfils the requirements of test class "B" in industrial radiography. The energy-resolving capabilities of this detector allow a "colour radiography" in which different materials are displayed in different colours. The second part of this contribution presents a fully spectrally resolving gamma and X-ray image detector with thick CdTe and 2.4 megapixels, which is based on TimePix3 chips and achieves a very high spatial resolution of ~10 µm. The high time resolution of TimePix3 is used to great benefit for a substantial improvement of the acquired images and spectra, suppressing internal fluorescence radiation and Compton scattering contributions that smear the image information.
X-ray backscatter imaging is rarely applied compared to classical X-ray projection imaging. 20 years ago the company Philips developed "COMSCAN", a first application case in the aircraft industry, which even allowed a depth resolution using backscatter imaging. The company AS&E in Boston offers backscatter imaging solutions for the security market. Their principle is to scan the object with a highly collimated X-ray needle beam from one side only and to detect the backscattered radiation with a large-area detector placed next to the collimation wheel. A new prototype is being investigated at BAM for application and optimization in non-destructive testing. In-situ testing in additive manufacturing is targeted as a modern industrial application field. The accessibility of the printed part during the production process is very limited. This prevents the application of two-sided X-ray inspection or computed tomography, where a rotation of the object is required to acquire projections over 360 degrees. An important advantage for the X-ray backscatter technique is also the set of materials used in additive manufacturing (polymers, ceramics, light metals like aluminum or titanium). These materials with lower density and lower Z values give better scatter signals than metals with higher densities and Z values; the backscatter intensity decreases with increasing density and Z value of the material. However, the requirements on spatial resolution and contrast sensitivity are more stringent for non-destructive testing of additively manufactured parts compared to the security area: in NDT, indications smaller than 1 mm have to be detected reliably. The investigation of these limits on a state-of-the-art prototype for X-ray backscattering using rotating collimated X-ray needle beams is part of the BAM project "ProMoAM". The contribution shows first results of the optimization for NDT and the achieved application limits for several example cases.
The revision of ISO 24497:2007 started in 2017 (by decision of Com-V at the IIW annual assembly in Shanghai). After three years of discussion and the incorporation of many comments from all over the world, the working group C-V-E-b, headed by Uwe Zscherpel, finalized the work with a final draft to be forwarded for the final vote at ISO TC 44. The working group C-V-E-b has thus completed its task and can be dissolved. The chair thanked everyone for the work successfully done; the vote to forward the latest draft to ISO for the final vote passed without any negative votes.
The transition from X-ray film to digital detectors in radiography is accompanied by an increase in unsharpness, due to the larger inherent unsharpness of digital detectors in comparison to film. The basic spatial resolution of digital detectors (see EN ISO 17636-2) is used today to describe this unsharpness. The geometrical unsharpness of the radiographic projection of object structures onto the detector plane is determined by the focal spot size of the X-ray tube and the magnification. The focal spot size is measured today from pinhole camera exposures (see ASTM E 1165) or from edge unsharpness (see ASTM E 2903). The final image unsharpness results from a convolution of the geometrical and the inherent detector unsharpness functions, divided by the magnification factor of the object onto the detector plane. Different approximations of this convolution in ASTM E 1000 and ISO 17636-2 lead to different optimum values of the magnification factor for a given focal spot size of an X-ray tube and basic spatial resolution of the detector. The higher contrast sensitivity of digital radiography compared to film radiography is further improved by using higher X-ray voltages than are customary with film and smaller focal spots of the X-ray tubes. This allows a larger distance between object and detector, resulting in reduced object scatter in the image. The interactions between all these parameters will be discussed and simple rules for practitioners will be derived in this contribution.
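A simple numerical illustration of the magnification trade-off described above (the parameter values and the power-law addition of the unsharpness contributions are assumptions made for this sketch, not quotations from ASTM E 1000 or ISO 17636-2):

```python
# Sketch: total unsharpness referred to the object plane from geometric
# unsharpness u_G = d*(v-1) and detector unsharpness u_D = 2*SRb, combined
# with a power-law addition and divided by the magnification v.
import numpy as np

def object_plane_unsharpness(v, d, srb, p=3.0):
    """Unsharpness referred to the object plane for magnification v."""
    u_geo = d * (v - 1.0)              # geometric unsharpness, focal spot size d
    u_det = 2.0 * srb                  # detector unsharpness from basic spatial resolution
    u_total = (u_geo**p + u_det**p) ** (1.0 / p)
    return u_total / v

d_focal = 1.0      # mm, focal spot size (assumed example value)
srb = 0.10         # mm, basic spatial resolution of the detector (assumed)
v = np.linspace(1.0, 3.0, 2001)
u = object_plane_unsharpness(v, d_focal, srb)
v_opt = v[np.argmin(u)]
print(f"optimum magnification ~ {v_opt:.2f}, unsharpness ~ {u.min():.3f} mm")
# Closed-form optimum for the cubic addition rule: v_opt = 1 + (u_D/d)**(3/2)
print(1.0 + (2.0 * srb / d_focal) ** 1.5)
```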
In the field of optically excited thermography, flash lamps (impulse-shaped planar heating) and halogen lamps (modulated planar heating) have become established for the specific regimes of impulse and lock-in thermography. Flying-spot laser thermography is implemented by means of a rasterized focused laser, e.g. for crack detection (continuous-wave operation) and photothermal material characterization (high-frequency modulated). The availability of novel technologies, i.e. fast and high-resolution IR cameras, brilliant innovative light sources and high-performance data acquisition and processing technology, will enable a paradigm shift from stand-alone photothermal and thermographic techniques to a uniform quantitative measurement and testing technology that is faster and more precise. Similar to an LED array, but with an irradiance two orders of magnitude higher, a new type of brilliant laser source, the VCSEL array (vertical-cavity surface-emitting laser), is now available. This novel optical energy source eliminates the strong limitation of the temporal dynamics of established light sources and at the same time is spectrally clearly separated from the detection wavelength. It combines the fast temporal behavior of a diode laser with the high optical irradiance and the wide illumination area of flash lamps. In addition, heating can also be carried out in a structured manner, because individual areas of the VCSEL array can be controlled independently of each other. This new degree of freedom enables the development of completely new thermographic NDT methods.
Optical lock-in thermography is a completely contactless and very sensitive NDT technique. As an optical source of energy, incandescent lamps are most commonly used because they are relatively inexpensive and offer high irradiances at the test specimen. However, they are strongly restricted by their low modulation bandwidth with a maximum modulation frequency of only about 1 Hz. The use of high-power kilowatt-class laser sources, e.g. diode laser arrays, pushes this constraint beyond 100 Hz. This allows for the exploration of the near-surface region of metals and layer systems with better and more accurate penetration depth and depth resolution. Moreover, these lasers are virtually free of any additional thermal radiation that could interfere with the "true" thermal response emitted from the heated sample. In turn, they can be easily used in a one-sided test configuration. We present current activities with kilowatt-class high-power laser sources for advanced lock-in thermography and focus on the application of laser arrays that offer a very high irradiation strength over a large sample area beyond the mentioned advantages.
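For orientation, the thermal diffusion length mu = sqrt(alpha/(pi*f)) indicates which depth range a given modulation frequency probes; the short sketch below evaluates it for assumed, rough diffusivity values and shows why frequencies above 100 Hz confine the measurement to the near-surface region of metals.

```python
# Back-of-the-envelope sketch: thermal diffusion length versus lock-in
# modulation frequency. Diffusivities are rough literature values used only
# for illustration.
import math

def diffusion_length_mm(alpha_m2_s: float, f_hz: float) -> float:
    return math.sqrt(alpha_m2_s / (math.pi * f_hz)) * 1e3  # result in mm

materials = {"steel (~4e-6 m^2/s)": 4e-6, "aluminium (~9e-5 m^2/s)": 9e-5}
for name, alpha in materials.items():
    for f in (0.1, 1.0, 10.0, 100.0):
        print(f"{name}, f = {f:6.1f} Hz -> mu = {diffusion_length_mm(alpha, f):.3f} mm")
```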
With laser light, a material surface can be heated in a contactless way and with fast modulation. This generates a strongly damped thermal wave that can penetrate deep into the material. If such thermal waves are generated and superimposed coherently, hidden material flaws can be detected non-destructively and very precisely. Even imaging tomography is conceivable.
Impact damages and delaminations in fibre-reinforced composites (FRC) might not be visible at the surface, but could have an influence on the resistance and on the long-term behaviour of the component. Therefore, and especially for safety relevant structures, non-destructive methods are required for the assessment of such damages.
Active thermography methods are suitable for characterizing damage after loading, using different kinds of excitation techniques and various configurations of infrared (IR) camera and heating sources. Flash lamps, impulse excitation with infrared radiators and the lock-in technique with halogen lamps or widened laser beams are well suited here. In addition, non-optical sources like sonotrodes (requiring direct contact to the structure) or induction generators (only suited for carbon fibre reinforced polymer (CFRP) structures) can be applied as well. For the investigation of the evolution of the damage during the impact, passive thermography can be applied in-situ. Elastic and plastic deformations alter the temperature of the structure and thus the temperature on the surface.
In this contribution, the general principles of quantitative defect characterisation in FRC using active thermography with flash, impulse and lock-in excitation are described first. Optical and thermal properties of the FRC material and its anisotropy are considered. Results of phase differences obtained at flat bottom holes with flash and lock-in thermography are compared in order to qualify both methods for quantitative defect characterization. Secondly, the damage evolution of CFRP and GFRP structures under impact load and static tensile loading is described. The spatial and temporal evolution of the surface temperature enables us to distinguish matrix cracks or fibre-matrix separation from delaminations between the layers. Afterwards, all results for loading defects obtained by passive and active thermography are compared with each other. Figs. 1 and 2 show the difference between passive and flash thermography obtained at impact-loaded and tensile-loaded CFRP plates, respectively. As one purpose of these investigations is the development of standards within national (DIN) and European (CEN) standardisation bodies, new draft and final standards are presented and further needs are discussed at the end of the presentation.
In many environments, especially at high temperatures, in corrosive surroundings or on moving or poorly accessible surfaces, the temperature cannot be measured with contact thermometers, or only at unacceptable expense. Such conditions are found, among others, in the chemical industry, in food, metal, glass, plastics and paper production, and in paint drying. Radiation thermometers are used in these areas. The VDI status report shows typical fields of application of thermal imaging cameras without radiometric calibration and of radiometrically calibrated thermographic cameras. In order to measure temperatures contactlessly and reliably with specified measurement uncertainties, radiation thermometers and thermographic cameras must not only be calibrated but also comprehensively characterized radiometrically and with respect to radiation thermometry. The optical material property of spectral emissivity and the total radiation balance (radiation of the measured object and of the surroundings) are also of great importance in industrial temperature measurement. Over the last decades, a comprehensive body of technical rules has been developed on this subject, which we present in this VDI status report. Some of the characteristic quantities described in the guidelines may seem abstract. In this status report, we use concrete examples to show what these quantities mean for contactless temperature measurement. Application examples show where temperature-measuring thermographic cameras and purely imaging thermal cameras are used in practice. With an analysis of which topics and applications are currently being discussed particularly intensively, we try to identify trends for future developments.
Using spatial and temporal shaping of laser-induced diffuse thermal wave fields in thermography
(2020)
The diffuse nature of thermal waves is a fundamental limitation in thermographic nondestructive testing. In our studies we investigated different approaches by shaping the thermal wave fields which result from heating. We have used high-power laser sources to heat metallic samples. Using these spatial and temporal shaping techniques leads to a higher detection sensitivity in our measurements with the infrared camera. In this contribution we show our implementation of shaping laser-induced diffuse thermal wave fields and the effect on the defect reconstruction quality.
Despite its mature scientific and technological foundations, thermography is still a relatively young member of the family of non-destructive testing methods. Thanks to a number of advantages, it is attracting a growing user community. For its further dissemination, especially in an industrial context, norms, standards and guidelines play an important role. This contribution presents the current state of standardization. We will show which basic and application standards exist for thermography in Germany and internationally, and we venture a look into the future. Moreover, standardization work also thrives on the participation of interested parties. These can be industrial and academic users, equipment manufacturers, research institutions or service providers. You are welcome to bring along your needs regarding standardization projects and/or send them directly to the authors.
Optical lock-in thermography is a completely contactless and very sensitive NDT technique. As an optical source of energy, incandescent lamps are most commonly used because they are relatively inexpensive and offer high irradiances at the test specimen. However, they are strongly restricted by their low modulation bandwidth with a maximum modulation frequency of only about 1 Hz. The use of high-power kilowatt-class laser sources, e.g. diode laser arrays, pushes this constraint beyond 100 Hz. This allows for the exploration of the near-surface region of metals and layer systems with better and more accurate penetration depth and depth resolution. Moreover, these lasers are virtually free of any additional thermal radiation that could interfere with the “true” thermal response emitted from the heated sample. In turn, they can be easily used in a one-sided test configuration.
We present current activities with kilowatt-class high-power laser sources for advanced lock-in thermography and focus on the application of laser arrays that offer a very high irradiation strength over a large sample area beyond the mentioned advantages.
Optical lock-in thermography is a completely contactless and very sensitive NDT technique. As an optical source of energy, incandescent (i.e. halogen) lamps are most commonly used because they are relatively inexpensive and offer high irradiances at the test site. However, they are strongly restricted by their low modulation bandwidth with a maximum modulation frequency of only about 1 Hz. The use of high-power kilowatt-class laser sources, e.g. diode laser arrays, pushes this constraint beyond 100 Hz, see Fig.1. This allows for the exploration of the near-surface region of metals and layer systems with better and more accurate penetration depth and depth resolution. Moreover, these lasers are virtually free of any additional thermal radiation that could interfere with the “true” thermal response emitted from the heated sample. In turn, they can be easily used in a one-sided test configuration.
Using the one-dimensional solution to the thermal heat diffusion equation together with the absorptance of the material which is illuminated with a harmonically modulated light source, we can calculate the temperature oscillation at the surface of a solid. As a second step, we calculate the corresponding oscillation of the total thermal emission using the Stefan-Boltzmann law as a first-order approximation and taking into account the emissivity of the material. Within this framework we can calculate the minimal irradiance of a light source necessary to provoke a measurable signal within a thermographic camera at a noise equivalent temperature difference (NETD) of 30 mK. In Fig. 2 this relationship is displayed for a wide spectrum of modulation frequencies and for a number of different light sources scaled to the same electrical input power and illumination area. Using this figure, it is now easily possible to analyze the range of materials to be tested using lock-in thermography, since only the materials (dotted lines) below the irradiance-vs-frequency curves (solid lines) are heated in excess of the camera's NETD. This figure clearly shows that laser sources considerably increase the application range of lock-in thermography, since especially for metals with a high reflectance and high thermal diffusivity a high irradiance is vitally important to allow for lock-in testing.
We present current activities with kilowatt-class high-power laser sources for advanced lock-in thermography and focus on the application of laser arrays that offer a very high irradiation strength over a large sample area beyond the mentioned advantages.
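The order of magnitude behind this estimate can be reproduced in a few lines (a first-order sketch with assumed material and camera values, not the calculation shown in Fig. 2): for a semi-infinite solid heated with a harmonically modulated absorbed flux, the surface temperature amplitude is dT = q0 / (e*sqrt(w)) with the thermal effusivity e = sqrt(k*rho*c); requiring this amplitude, reduced by the emissivity as a crude first-order detection model, to exceed the camera NETD yields a minimal irradiance for each modulation frequency.

```python
# Rough sketch of the minimal-irradiance estimate; all parameter values are
# assumptions for illustration only.
import math

def min_irradiance_w_m2(f_hz, k, rho, c, absorptance, emissivity, netd_k=0.03):
    w = 2.0 * math.pi * f_hz
    effusivity = math.sqrt(k * rho * c)             # W*s^0.5/(m^2*K)
    q0_min = netd_k * effusivity * math.sqrt(w) / emissivity
    return q0_min / absorptance                     # incident irradiance in W/m^2

# Example: steel-like material, assumed optical properties
for f in (1.0, 10.0, 100.0):
    e_min = min_irradiance_w_m2(f, k=45.0, rho=7850.0, c=460.0,
                                absorptance=0.35, emissivity=0.25)
    print(f"f = {f:6.1f} Hz -> E_min ~ {e_min/1e3:.1f} kW/m^2")
```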
Optical lock-in thermography is a completely contactless and very sensitive NDE technique. As an optical source of energy, incandescent (i.e. halogen) lamps are most commonly used because they are relatively inexpensive, do not need any work safety measures and offer high irradiances at the test site. However, they are strongly restricted by their low modulation bandwidth with a maximum modulation frequency of only about 1 Hz. The use of high-power kilowatt-class laser sources, e.g. diode laser arrays, pushes this constraint beyond 100 Hz. This allows for the exploration of the near-surface region of metals and layer systems with better and more accurate penetration depth and depth resolution. Moreover, these lasers are virtually free of any additional thermal radiation that could interfere with the “true” thermal response emitted from the heated sample. In turn, they can be easily used in a one-sided test configuration. Altogether using lasers considerably increases the application range of lock-in thermography, since especially for metals with a high reflectance and high thermal diffusivity a high irradiance is vitally important to allow for lock-in testing [1, 2]. We report on the mentioned benefits of using such high-power lasers and analyze the range of materials to be tested using lock-in thermography in dependence on the laser irradiance, the modulation frequency, the infrared camera as well as the optical and thermal material parameters. In this context, we also address a number of systematic errors caused by the use of ideal and non-ideal heat sources. For example, the measured phase angle in lock-in thermography depends on the irradiance and the modulation bandwidth of the source. This in turn has a decisive influence on the uncertainty in the quantification of, e.g. layer thicknesses.
Despite its mature scientific and technological foundations, thermography is still a relatively young member of the family of non-destructive testing methods. Thanks to a number of advantages, it is attracting a growing user community. For its further dissemination, especially in an industrial context, norms, standards and technical rules play an important role. This contribution presents the current state of standardization in Germany. We show which standards and technical rules exist for thermography in Germany and internationally, and we venture a look into the future. Moreover, standardization work also thrives on the participation of interested parties. These can be industrial and academic users, equipment manufacturers, research institutions or service providers. You are welcome to bring along your needs regarding standardization projects and/or send them directly to the authors.
Due to their high irradiance and wide modulation bandwidth, high-power lasers open up a wide field of application. For example, the classical methods of pulse and lock-in thermography can be realized in high quality. In addition, structured heating is also possible by using arrays of such lasers. This makes it possible to implement new thermographic methods, such as interference-based detection of cracks or super resolution.
Thermographic NDT is based on the interaction of thermal waves with inhomogeneities. These inhomogeneities are related to the sample geometry or the material composition. Although thermography is suitable for a wide range of inhomogeneities and materials, its fundamental limitation is the diffuse nature of thermal waves and the fact that their effect can only be measured radiometrically at the sample surface. The propagation of the thermal waves from the heat source via the inhomogeneity to the detection surface degrades the spatial resolution of the thermographic technique. A new concerted approach based on spatially structured heating and joint sparsity of the signal ensemble enables an improved reconstruction of inhomogeneities.
The contribution presents the implementation of this approach based on structured illumination with a 1D laser array. The individual emitter cells of the laser array are controlled by a pseudo-random binary pattern and additionally shifted slightly. The repeated measurement of these different configurations, while the inhomogeneity remains constant, enables a reconstruction of the inhomogeneity that exploits the joint sparsity.
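The following toy sketch (synthetic 1-D data; the Gaussian blur as a stand-in for thermal diffusion, the iterative hard-thresholding solver and all parameter values are assumptions made only for illustration, not the authors' reconstruction code) shows the basic joint-sparsity idea: several measurements taken with different pseudo-random binary illumination patterns share one defect support, which is exploited by thresholding the joint row energy of the stacked unknowns.

```python
# Joint-sparsity reconstruction sketch with pseudo-random binary patterns.
import numpy as np

rng = np.random.default_rng(0)
n, m_meas, sparsity = 120, 16, 2

# "Thermal blur" operator as a Gaussian convolution matrix (illustrative PSF).
idx = np.arange(n)
G = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 3.0) ** 2)
G /= G.sum(axis=1, keepdims=True)

# Two closely spaced defects and pseudo-random binary illumination patterns.
defect = np.zeros(n); defect[[52, 61]] = 1.0
patterns = rng.integers(0, 2, size=(m_meas, n)).astype(float)
X_true = patterns.T * defect[:, None]          # column j: pattern j masked by the defects
Y = G @ X_true + 0.005 * rng.standard_normal((n, m_meas))

# Joint-sparsity iterative hard thresholding: keep the pixels (rows) whose
# energy across all measurements is largest.
X = np.zeros_like(Y)
tau = 1.0 / np.linalg.norm(G, 2) ** 2
for _ in range(300):
    X = X + tau * G.T @ (Y - G @ X)
    row_energy = np.linalg.norm(X, axis=1)
    keep = np.argsort(row_energy)[-sparsity:]
    mask = np.zeros(n, bool); mask[keep] = True
    X[~mask, :] = 0.0

# Ideally the recovered support coincides with the two defect pixels.
print("recovered defect support:", np.sort(np.flatnonzero(np.linalg.norm(X, axis=1))))
```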
Thermographic non-destructive testing is based on the interaction of thermal waves with inhomogeneities. The propagation of thermal waves from the heat source to the inhomogeneity and on to the detection surface, governed by the thermal diffusion equation, means that two closely spaced defects can incorrectly appear as one defect in the measured thermogram. In order to break this spatial resolution limit (super resolution), the combination of spatially structured heating and numerical methods of compressed sensing can be used. The improvement of the spatial resolution for defect detection then depends, in the classical sense, directly on the number of measurements. Current practical implementations of this super-resolution detection still suffer from long measurement times, since not only does the achievable resolution depend on performing multiple measurements, but, owing to the use of single-spot laser sources or laser arrays with low pixel count, the scanning process itself is also quite slow. With the application of the most recent high-power digital micromirror device (DMD) based laser projector technology, this issue can now be overcome.
The thermographic detection principle is based on the analysis of transient temperature distributions caused by the interaction of an externally supplied heat flow with the inner geometry of the test object or with inhomogeneities enclosed in it. An equivalent description of this interacting heat flow is the propagation of thermal waves inside the test object. Although thermography is suitable for detecting a large variety of inhomogeneities and for testing a wide range of materials, its fundamental limitation is the diffuse nature of thermal waves and the fact that their effect can only be measured radiometrically at the surface of the test object. The fundamental disadvantage of diffuse thermal waves compared to propagating waves, as they occur for example in ultrasonic testing, is the resulting rapid degradation of the spatial resolution with increasing defect depth. This degradation usually limits the applicability of thermography in the search for small, deep-lying defects.
A promising approach for improving the spatial resolution, and thus the detection sensitivity and the reconstruction quality, in thermographic testing lies in the specific shaping of these diffuse thermal wave fields by means of structured laser thermography or photothermal excitation. Some examples are:
- Narrow crack-like defects below the surface can be detected with high sensitivity by superimposing several interfering thermal wave fields,
- Closely spaced defects can be separated by several measurements with different heating structures,
- Defects at different depths can be distinguished by an optimized temporal shaping of the thermal excitation function,
- Narrow surface cracks can be found by robot-assisted scanning with focused laser spots,
- Defects that occur during additive manufacturing can be detected already in the build chamber and with the production laser.
We present the latest results of this technology, obtained with high-power laser systems and modern numerical methods.
High-strength aluminum alloys used in aerospace and automotive applications obtain their strength through precipitation hardening. Achieving the desired mechanical properties requires precise control over the nanometer-sized precipitates. However, the microstructure of these alloys changes over time due to aging, leading to a deterioration in strength. Typically, the size, number, and distribution of precipitates for a quantitative assessment of microstructural changes are determined by manual analysis, which is subjective and time-consuming. In our work, we introduce a progressive and automatable approach that enables a more efficient, objective, and reproducible analysis of precipitates. The method involves several sequential steps using an image repository containing dark-field transmission electron microscopy (DF-TEM) images depicting various aging states of an aluminum alloy. During the process, precipitation contours are generated and quantitatively evaluated, and the results are comprehensibly transferred into semantic data structures. The use and deployment of Jupyter Notebooks, along with the beneficial implementation of Semantic Web technologies, significantly enhances the reproducibility and comparability of the findings. This work serves as an exemplar of FAIR image and research data management.
Mechanical properties of metals and their alloys are strongly governed by their microstructure. The nanometer-sized precipitates in hardenable wrought aluminium alloys, which can be controlled by heat treatment, act as obstacles to dislocation movement within the material and are critical to the mechanical performance of the component, in this case a radial compressor wheel of a ship's engine. TEM-based image analysis is essential in this study to investigate the microstructural changes (precipitation coarsening) that occur as a result of ageing at elevated temperatures.
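A generic sketch of this kind of precipitate analysis (not the authors' actual notebooks; the file name, pixel size and Otsu threshold are placeholders chosen for illustration) using scikit-image could look like this:

```python
# Threshold a dark-field TEM image, label the bright precipitates and extract
# simple size statistics.
import numpy as np
from skimage import io, filters, measure, morphology

image = io.imread("dftem_aged_state.tif", as_gray=True)   # hypothetical file name
pixel_size_nm = 0.5                                        # assumed calibration

thresh = filters.threshold_otsu(image)
mask = morphology.remove_small_objects(image > thresh, min_size=20)
labels = measure.label(mask)

props = measure.regionprops(labels)
diameters_nm = [p.equivalent_diameter * pixel_size_nm for p in props]
print(f"{len(diameters_nm)} precipitates, mean diameter {np.mean(diameters_nm):.1f} nm"
      if diameters_nm else "no precipitates found")
```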
The tri-structural isotropic (TRISO)-coated particle is the fission energy source and the first safety barrier in high-temperature gas-cooled reactors (HTGRs). The integrity of TRISO particles should be carefully tested before operation, because the shape may affect the failure probability of the particles, leading to an increased risk of fission product release. Due to the large difference in density between the kernel and the coating layers in TRISO particles, traditional X-ray radiography cannot achieve an image quality sufficient to identify the coating layers reliably, while phase-contrast CT has the advantage of being sensitive to boundaries. This paper presents a non-destructive test and evaluation (NDT&E) method to facilitate three-dimensional (3D) measurement of a TRISO particle's structure using synchrotron phase-contrast CT. After reconstruction, the TRISO particle was rendered in 3D space, and the thickness and asphericity of the particle's layers were measured. It was found that the thickness of the coating layers of the tested particle follows a Gaussian distribution. The deviations of the thicknesses of the kernel and the other four layers are −2.42%, −16.32%, 26.51%, 0.98% and 7.49% compared with the design parameters. The deviations of the asphericity of the kernel, IPyC and OPyC layers are −11.51%, −0.41% and 3.39%, respectively. The effect of these deviations on the temperature distribution and the failure probability calculation of the particle will be investigated in the future.
When rocks dry out, the fraction of the pore space filled with water, referred to as saturation, decreases. With decreasing saturation, the electrical conductivity of the rock decreases. The dependence of the electrical conductivity on the saturation is described by a power law according to Archie's second equation, log(σ) = n log(S_w), where n is the saturation exponent. When drying rocks, it must be taken into account that the conductivity of the pore water increases with decreasing saturation, since the concentration of the salts dissolved in the water rises. As a consequence, the saturation exponents determined in drying experiments are significantly smaller than in displacement experiments (e.g. water displaced by oil) with constant pore-water conductivity. In spectral induced polarization, the complex electrical conductivity of the rock is considered as a function of frequency. Different saturation exponents are determined for the real and imaginary parts of the electrical conductivity.
We carried out drying experiments on two sandstone samples (Langenauer and Gravenhorster sandstone) and on a Baumberger calcareous sandstone. The saturation was reduced stepwise from 100 % to 10 %. At each saturation step, the spectra of the complex electrical conductivity of the samples were recorded. Our experiments show that the saturation exponents for the imaginary part of the complex electrical conductivity, at 1.23 to 1.59, are significantly larger than those of the real part, with values between 0.41 and 0.90. This difference results from the fact that the imaginary part of the conductivity increases less strongly with increasing pore-water conductivity than the real part. For the real part of the electrical conductivity, a linear relationship between pore-water conductivity and rock conductivity is expected according to Archie's first equation. The increase of the imaginary part with the pore-water conductivity can be described by a power law with an exponent between 0.1 and 0.62. Our experiments show that the imaginary part of the electrical conductivity reacts much more clearly to the drying out of rocks and can therefore be an effective indicator for monitoring saturation changes.
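To illustrate how a saturation exponent is obtained from such drying data, the following sketch (with synthetic numbers, not the measured values) fits n as the slope in log-log space, separately for the real and imaginary parts of the conductivity:

```python
# Sketch: saturation exponent n of Archie's second law sigma(S_w) ~ S_w**n,
# estimated as the slope of log10(sigma) versus log10(S_w). Data are synthetic.
import numpy as np

S_w = np.array([1.0, 0.8, 0.6, 0.4, 0.2, 0.1])        # saturation steps
sigma_real = 0.020 * S_w**0.7                          # synthetic real part, S/m
sigma_imag = 0.0005 * S_w**1.4                         # synthetic imaginary part, S/m

for name, sigma in (("real", sigma_real), ("imag", sigma_imag)):
    n, offset = np.polyfit(np.log10(S_w), np.log10(sigma), 1)
    print(f"{name} part: fitted saturation exponent n = {n:.2f}")
```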
We investigate the pore space of rock samples with respect to different petrophysical parameters using various methods that provide data on pore size distributions, including micro computed tomography (µ-CT), mercury intrusion porosimetry (MIP), nuclear magnetic resonance (NMR), and spectral induced polarization (SIP). The resulting cumulative distributions of pore volume as a function of pore size are compared. Considering that the methods differ with regard to their limits of resolution, a multiple-length-scale characterization of the pore space is proposed that is based on a combination of the results from all of these methods.
The approach is demonstrated using samples of Bentheimer and Röttbacher sandstone. Additionally, we compare the potential of SIP to provide a pore size distribution with other commonly used methods (MIP, NMR). The limits of resolution of SIP depend on the usable frequency range (between 0.002 and 100 Hz). Methods with similar resolution show a similar behavior of the cumulative pore volume distribution in the overlapping pore size range. We assume that µ-CT and NMR provide the pore body size, while MIP and SIP characterize the pore throat size. Our study shows that a good agreement between the pore radius distributions can only be achieved if the curves are adjusted considering the resolution and pore volume in the relevant range of pore radii. The MIP curve, with the widest range in resolution, should be used as reference.
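A minimal sketch of how two cumulative pore-volume curves with different resolution limits might be brought into agreement (synthetic curves and a simple least-squares scaling against the MIP reference; the actual adjustment procedure of the study may differ):

```python
# Interpolate a second cumulative pore-volume curve onto the MIP radius axis
# and scale it so that it matches the reference in the overlapping range.
import numpy as np

# radius in micrometres, cumulative pore volume (arbitrary units), synthetic
r_mip = np.logspace(-2, 1.5, 40); v_mip = 1.0 / (1.0 + (0.5 / r_mip) ** 1.2)
r_sip = np.logspace(-1, 1.0, 25); v_sip = 0.8 / (1.0 + (0.5 / r_sip) ** 1.2)

overlap = (r_mip >= r_sip.min()) & (r_mip <= r_sip.max())
v_sip_on_mip = np.interp(r_mip[overlap], r_sip, v_sip)
scale = np.sum(v_mip[overlap] * v_sip_on_mip) / np.sum(v_sip_on_mip ** 2)  # least-squares factor
print(f"scale factor for the second curve in the overlap range: {scale:.2f}")
```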
In this study, the optimized fabrication and the evolution of the microstructure and magnetic transition behavior of melt-extracted LaFe11.2Si1.8 microwires have been studied. After optimization of the extraction technique (heating power 22 kW, feeding rate 30-50 mm/s, rotation velocity 1700 r/min), the content of the La(Fe,Si)13 phase in the as-extracted microwires was 54 wt% due to the high solidification velocity; this was increased to 85 wt% by annealing at 1373 K for 20 min. The amount of the La(Fe,Si)13 phase increased and its composition became more homogeneous through the peritectic reaction and short-distance diffusion in the microwires during the annealing process. The coexistence of nanocrystalline and amorphous structures contributed to the broad magnetic transition temperature range of the as-extracted and annealed microwires. The annealed microwires exhibited a second-order magnetic transformation behavior and showed a maximum magnetic entropy change |ΔS_M|max of 6.2 J/(kg·K) and a working temperature interval of 36.0 K under a magnetic field of 20 kOe.
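The magnetic entropy change of magnetocaloric materials is commonly evaluated from magnetisation isotherms via the Maxwell relation ΔS_M(T) = ∫ (∂M/∂T)_H dH; whether exactly this procedure was used here is not stated in the abstract. A generic numerical sketch with synthetic data:

```python
# Numerical Maxwell-relation evaluation of the magnetic entropy change.
import numpy as np

def delta_s_m(T, H, M):
    """T: (nT,) in K, H: (nH,) field values, M: (nT, nH) magnetisation.
    Returns dS_M(T) integrated over the full field span."""
    dM_dT = np.gradient(M, T, axis=0)          # (dM/dT) at constant field
    return np.trapz(dM_dT, H, axis=1)          # integrate over the field

# Synthetic magnetisation isotherms around a transition at 200 K, for illustration.
T = np.linspace(180.0, 220.0, 41)
H = np.linspace(0.0, 2.0, 21)                  # field in tesla
M = 100.0 / (1.0 + np.exp((T[:, None] - 200.0 - 5.0 * H[None, :]) / 3.0))
dS = delta_s_m(T, H, M)
print(f"|dS_M|max = {np.abs(dS).max():.2f} J/(kg K) (synthetic example)")
```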
In this work, the fracture process in front of the crack tip inside a dentin sample was observed by means of ex-situ X-ray computed tomography after uniaxial compression at different deformation values. This ex-situ approach allowed the microstructure and the fracturing process of human dentin to be observed during loading. No cracks are observed up to the middle part of the irreversible deformation in the samples, at least none visible at 0.4 µm resolution. The first cracks appeared before the mechanical stress reached the compressive strength. The cracks grow by the main cracks connecting with satellite cracks that lie ahead of the main crack tip and run parallel to its trajectory. When, under the applied load, the deformation in the sample exceeds the deformation at the compressive strength of dentin, micro-cracks appear in front of the main cracks. The micro-cracks are inclined (~60°) to the trajectory of the main cracks. Further growth of the main cracks does not take place after the junction with the micro-cracks; we assume that the micro-cracks dissipate the energy of the main crack and suppress its growth. These micro-cracks serve as additional stress accommodation, which is why the samples do not break apart after the compression test, as is usually observed in bending and tension tests.
Historians and librarians are interested in watermarks and mould surface patterns in historic papers, because they represent the "fingerprints" of antique papers. However, these features are usually covered or hidden by printing, writing or other media. Different techniques have been developed to extract the watermarks in the paper while avoiding interference from media on the paper. Beta radiography provides good results, but this method cannot be widely used because of radiation safety regulations and the long exposure times required due to the weak isotope sources employed. In this work, two promising methods which can be used to extract digital high-resolution images of paper watermarks are compared: electron radiography and low-energy X-ray radiography. For electron radiography, a "sandwich" of a lead sheet, the paper object and a film in a dark cassette is formed and exposed at higher X-ray potentials (> 300 kV). The photoelectrons escaping from the lead sheet penetrate the paper and expose the film. After development, the film captures the watermark and mould surface pattern images of the paper being investigated. These images are then digitized using an X-ray film digitizer. The film employed could potentially be replaced by a special type of imaging plate with a very thin protection layer to directly generate digital images using computed radiography (CR). For the second method, a low-energy X-ray source is used with the specimen paper placed on a digital detector array (DDA). This method directly generates a low-energy digital radiography (DR) image. Both methods provide high-quality images without interference from the printing media and offer the potential to build a "fingerprint" database for historical papers. Nevertheless, differences were found in the images obtained using the two methods.
The second method, using a low-energy X-ray source, has the potential to be integrated into a portable device with a small footprint incorporating user safety requirements. Differences obtained using the two methods are shown and discussed.
Investigation on Wall Thickness Ranges Using Digital Radiography for Tangential Projection Technique
(2018)
X-ray testing is based on the attenuation of X-rays when passing through matter. Image detectors acquire the X-ray information, which is defined by the local penetrated wall thickness of the tested sample. By X-ray absorption in the detector and subsequent read-out and digitization steps, a digital image is generated. As detectors, radiographic film with film digitization, a storage phosphor imaging plate with a special laser scanner (computed radiography, CR) or a digital detector array (DDA) can be used. The digital image in the computer can then be further analyzed using many types of image processing. In the presented work, the automated evaluation of wall thickness profiles is investigated using a test steel pipe with 9 different wall thicknesses, various X-ray voltages and different filter materials at the tube port and between object and detector. In this way, the influence of different radiation qualities on the accuracy of the automated wall thickness evaluation as a function of the penetrated wall thickness of the steel pipe was investigated.
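The wall thickness evaluation rests on the exponential attenuation law; as a reminder (the general radiographic relation, simplified to monochromatic radiation without scatter, not a formula quoted from the abstract):

I = I_0 \, e^{-\mu t} \quad \Rightarrow \quad t = \frac{1}{\mu} \ln\frac{I_0}{I}

where I is the transmitted intensity, I_0 the unattenuated intensity, \mu the effective linear attenuation coefficient of steel for the chosen radiation quality, and t the penetrated wall thickness.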
Laser Induced Breakdown Spectroscopy – A Tool for Imaging the Chemical Composition of Concrete
(2022)
One of the most common causes of damage is the ingress of harmful ions into the concrete, which can lead to deterioration processes and affect structural performance. Therefore, the increasingly aging infrastructure is regularly inspected to assess durability. Regular chemical analysis can be useful to determine the extent and evolution of ion ingress and to intervene in a timely manner. This could prove more economical than extensive repairs for major damage, particularly for critical infrastructure. In addition to already established elemental analysis techniques in civil engineering such as potentiometric titration or X-ray fluorescence analysis, laser-induced breakdown spectroscopy (LIBS) can provide further important complementary information and benefits. The possibilities of LIBS are demonstrated using the example of a drill core taken from a parking garage.
Laser-induced breakdown spectroscopy (LIBS) is a spectroscopic method for the analysis of the chemical composition of sample materials. Generally, the measurement of all elements of the periodic table is possible. In particular, light elements such as H, Li, Be, S, C, O, N and halogens can be measured. Calibration with matrix-matching standards allows the quantification of element concentrations. In combination with scanner systems, the two-dimensional element distribution can be determined. Even rough surfaces can be measured by online adjustment of the laser focus. LIBS can also be used on-site with mobile systems. Hand-held systems are available for point measurements.
Common applications include the investigation of material deterioration due to the ingress of harmful ions and their interaction in porous building materials. Due to the high spatial resolution of LIBS and the consideration of the heterogeneity of concrete, precise input parameters for simulation and modelling of the remaining lifetime of a structure can be determined. In addition to the identification of materials, it is also possible to assess the composition of hardened concrete, for example the cement or aggregate type used. Other important fields of application are the detection of environmentally hazardous elements and material classification for sorting heterogeneous material waste streams during dismantling. Non-contact NDT of “difficult to assess” structures, for example through safety glass or in combination with robotics and automation, is also possible.
In this work, an overview of LIBS investigations on concrete is given based on exemplary laboratory and on-site applications.
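As a rough illustration of the calibration step mentioned above, the following sketch fits a linear calibration between a normalized chlorine line intensity and the chloride content of matrix-matched reference samples and applies it to an unknown; all values, and the choice of emission line, are illustrative assumptions and not taken from the work.

```python
# Illustrative LIBS calibration sketch: fit a linear calibration between a
# chlorine emission-line intensity (normalized to a matrix line) and the
# chloride content of matrix-matched reference samples, then apply it to an
# unknown. All numbers are made up for illustration.
import numpy as np

# Reference mortar standards: Cl content in wt.-% of binder vs. measured
# normalized line intensity (e.g. Cl I 837.6 nm / Ca matrix line).
cl_ref = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
intensity_ref = np.array([0.02, 0.11, 0.21, 0.40, 0.62])

slope, intercept = np.polyfit(intensity_ref, cl_ref, 1)  # inverse calibration

intensity_unknown = 0.27
cl_unknown = slope * intensity_unknown + intercept
print(f"Estimated chloride content: {cl_unknown:.2f} wt.-% of binder")
```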
A joint project of partners from industry and research institutions for the research and construction of an analysis system for automated, sensor-supported sorting of construction and demolition waste will be presented. This is intended to supplement or replace the previously practiced manual sorting, which poses many risks and dangers for the staff and only allows separation based on obvious, visually detectable differences. The method of laser-induced breakdown spectroscopy is to be used in combination with hyperspectral sensors. The jointly processed information (data fusion) should lead to a significant improvement in the separation of material types. In addition to the sorting of different materials (concrete, main masonry building materials, organic components, glass, etc.), impurities such as SO3-containing building materials (gypsum, aerated concrete, etc.) could also be detected and separated.
The subsequent recycling and sales opportunities are examined, such as the use of recycled aggregates in concrete, the recycling of building materials containing sulphate as a gypsum substitute for the cement industry or the agglomeration of synthetic lightweight aggregates for lightweight concrete or as a substrate for green roofs. At the same time, it is investigated whether soluble components (sulfates, heavy metals, etc.) can be detected by LIBS without a wet chemical analysis and what impact the recycling materials have on the environment.
The entire value chain is examined using the example of the Berlin location in order to minimize economic / technological barriers and obstacles on a cluster level and to sustainably increase the recovery and recycling rates.
LIBS is a complementary method to XRF and can detect all elements without the need for vacuum conditions. Automated systems capable of scanning surfaces with a resolution of up to 0.1 mm within a few minutes are already commercially available. In addition to possible applications in R&D, LIBS is also used for practical applications in building materials laboratories and even on-site.
In view of ageing infrastructure facilities, a reliable assessment of the condition of concrete structures is of increasing interest. For concrete structures, the ingress of potentially harmful ions affects the serviceability and eventually the structural performance. Pitting corrosion induced by penetrating chlorides is the dominant deterioration mechanism. Condition assessment based on frequently performed chloride profiling can be useful to identify the extent and evolution of chloride ingress. This could prove to be more economical than extensive repairs, especially for important infrastructure facilities.
Currently, the most common procedure for determining the chloride content is wet chemical analysis with a standard depth resolution of 10 mm; the heterogeneity of the material is not considered. LIBS is an economical alternative for determining the chloride content at depth intervals of 1 mm or less. It provides 2D distributions of multiple elements and can locate spots with higher concentrations. The results are directly correlated to the mass of binder, and measurements can also be performed on-site with a mobile LIBS system.
The application of a LIBS system is presented. Calibration is required for quantitative analysis. Concrete cores were drilled, sliced and analyzed to determine the 2D distribution of harmful elements. By comparing the chloride ingress and the carbonation, the interaction of both processes can be visualized in a measurement that takes less than 10 minutes for a 50 mm x 100 mm drill core.
A leaflet on the use of LIBS for the chloride ingress assessment has been completed.
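To illustrate how a 2D LIBS map of a sliced core can be condensed into a binder-related chloride depth profile, a minimal sketch with synthetic intensity maps is given below; the masking threshold and array sizes are assumptions, not values from the study.

```python
# Illustrative sketch: turn a 2-D LIBS element map of a sliced drill core
# into a chloride depth profile related to the binder. Pixels dominated by
# aggregate (low Ca signal) are masked out before averaging each depth row.
# The arrays are synthetic stand-ins for measured intensity maps.
import numpy as np

rng = np.random.default_rng(3)
depth_px, width_px = 100, 50                              # e.g. 1 px = 1 mm depth
ca_map = rng.uniform(0.2, 1.0, (depth_px, width_px))      # Ca signal (binder indicator)
cl_map = np.linspace(1.0, 0.05, depth_px)[:, None] * ca_map \
         + rng.normal(0, 0.02, (depth_px, width_px))      # Cl ingress decaying with depth

binder_mask = ca_map > 0.5                                # discard aggregate-dominated pixels
profile = np.array([
    (cl_map[i][binder_mask[i]] / ca_map[i][binder_mask[i]]).mean()
    for i in range(depth_px)
])                                                        # Cl/Ca ratio per depth step, binder only
print("Cl/Ca at surface vs. 50 mm depth:", profile[0].round(2), profile[50].round(2))
```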
Laser-induced breakdown spectroscopy (LIBS) is a spectroscopic method for detecting the chemical composition of optically accessible surfaces. In principle, the measurement of all elements of the periodic table is possible. System calibrations allow the quantification of element concentrations. In combination with scanner systems, the two-dimensional element distribution can be determined. Even rough surfaces can be measured by online adjustment of the laser focus. To detect element ingress into the concrete, typically cores are taken, cut in half, and LIBS measurements are performed on the cross-section. The high spatial resolution as well as the simultaneous multi-element analysis enables a separate evaluation of the binder matrix and the aggregates. Therefore, the element concentrations can be determined directly in relation to the cement paste. LIBS measurements are applicable in the laboratory, on-site and also over a distance of several meters.
Common applications include the investigation of material deterioration due to the ingress of harmful ions and their interaction in porous building materials. LIBS is able to provide precise input parameters for simulation and modelling of the remaining lifetime of a structure. Besides the identification of materials, their composition can also be determined on hardened concrete, such as the type of cement or aggregate used. This also involves the identification of environmentally hazardous elements contained in concrete. Another possible application is the detection of the composition of material flows during dismantling. Non-contact NDT of “difficult to assess” structures, for example through safety glass or in combination with robotics and automation, is also possible.
This work presents the state of the art concerning LIBS investigations on concrete by showing exemplary laboratory and on-site applications.
To date, there are very few technologies available for the conversion of low-temperature waste heat into electricity. Thermomagnetic generators are one approach proposed more than a century ago. Such devices are based on a cyclic change of magnetization with temperature. This switches a magnetic flux and, according to Faraday’s law, induces a voltage. Here we demonstrate that guiding the magnetic flux with an appropriate topology of the magnetic circuit improves the performance of thermomagnetic generators by orders of magnitude. Through a combination of experiments and simulations, we show that a pretzel-like topology results in a sign reversal of the magnetic flux. This avoids the drawbacks of previous designs, namely, magnetic stray fields, hysteresis and complex geometries of the thermomagnetic material. Our demonstrator, which is based on magnetocaloric plates, illustrates that this solid-state energy conversion technology presents a key step towards becoming competitive with thermoelectrics for energy harvesting near room temperature.
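For context, the induced voltage mentioned above follows Faraday's law of induction; for a pickup coil with N turns around the switched flux \Phi (a textbook relation, not a result of this work):

V(t) = -N \, \frac{\mathrm{d}\Phi}{\mathrm{d}t}

so that a large and fast change of flux, as achieved by the improved circuit topology, directly translates into a larger induced voltage.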
Strong coupling effects in magnetocaloric materials are the key factor to achieve a large magnetic entropy change. Combining insights from experiments and ab initio calculations, we review relevant coupling phenomena, including atomic coupling, stress coupling, and magnetostatic coupling. For the investigations on atomic coupling, we have used Heusler compounds as a flexible model system. Stress coupling occurs in first‐order magnetocaloric materials, which exhibit a structural transformation or volume change together with the magnetic transition. Magnetostatic coupling has been experimentally demonstrated in magnetocaloric particles and fragment ensembles. Based on the achieved insights, we have demonstrated that the materials properties can be tailored to achieve optimized magnetocaloric performance for cooling applications.
Using magnetic materials for energy conversion as an example, this talk shows how X-ray tomographic investigations can contribute to structure elucidation in composites and bulk samples. The components are tested non-destructively in order to characterize cracks, pores and other defects, and their influence on the functional properties, in three dimensions and at the right time within the material's life cycle. Combining microtomography with other methods of magnetic materials characterization yields unique insights into the structure and the functional properties.
To date, there are only very few technologies available for the conversion of low-temperature waste heat into electricity. In this talk, we first describe the principle of thermomagnetic generators. Then we focus on the impact of the topology of the magnetic circuit within thermomagnetic generators. We demonstrate that the key operational parameters strongly depend on the genus, i.e. the number of holes within the magnetic circuit.
To date, there are very few technologies available for the conversion of low-temperature waste heat into electricity. Thermomagnetic generators are one approach proposed more than a century ago. Such devices are based on a cyclic change of magnetization with temperature. For thermomagnetic materials, we used a commercial magnetocaloric alloy with a transition temperature of 300 K.
Using magnetic materials for energy conversion as an example, this talk shows how X-ray tomographic investigations can contribute to structure elucidation in composites and bulk samples. The components are tested non-destructively in order to characterize cracks, pores and other defects, and their influence on the functional properties, in three dimensions and at the right time within the material's life cycle. Combining microtomography with other methods of magnetic materials characterization yields unique insights into the structure and the functional properties.
To date, there are only very few technologies available for the conversion of low-temperature waste heat into electricity. More than a century ago, thermomagnetic generators were proposed; they are based on a change of magnetization with temperature, which switches a magnetic flux and, according to Faraday's law, induces a voltage. In this talk, we first describe the principle of thermomagnetic generators. Then we focus on the impact of the topology of the magnetic circuit within thermomagnetic generators. We demonstrate that the key operational parameters strongly depend on the genus, i.e. the number of holes within the magnetic circuit. A pretzel-like topology of the magnetic circuit with genus = 3 improves the performance of thermomagnetic generators by orders of magnitude. We will show that this technique is on its way to becoming competitive with thermoelectrics for energy harvesting near room temperature.
Thermomagnetic materials are a new type of magnetic energy material that enables the conversion of low-temperature waste heat into electricity by three routes: thermomagnetic motors, generators and microsystems. Taking our recent work on thermomagnetic generators as a starting point, in this talk we analyse the material requirements for a more energy-efficient and economically efficient conversion. We will describe the influence of magnetisation change and heat capacity on thermodynamic efficiency, as well as the consequences of thermal conductivity for power density. Our analysis will allow the best thermomagnetic materials to be selected in Ashby plots and illustrate their substantially different properties compared to magnetocaloric materials. (Supported by DFG, project FA 453/14)
Award lecture for the Georg-Sachs-Preis of the DGM
X-ray computed tomography (CT) is now a standard tool in materials characterization. In this talk, we show its application to the investigation of magnetic functional materials, additively manufactured parts and their feedstock powders, and present first CT results on biogenic feedstock powders.
One of the most common causes of damage is the ingress of harmful ions into the concrete, which can lead to deterioration processes and affect structural performance. Therefore, the increasingly aging infrastructure is regularly inspected to assess durability. Regular chemical analysis can be useful to determine the extent and evolution of ion ingress and to intervene in a timely manner. This could prove more economical than extensive repairs for major damage, particularly for critical infrastructure. In addition to already established elemental analysis techniques in civil engineering such as potentiometric titration or X-ray fluorescence analysis, laser-induced breakdown spectroscopy (LIBS) can provide further important complementary information and benefits. The possibilities of LIBS are demonstrated using the example of a drill core taken from a parking garage.
Responsible treatment of the environment and resources is a key element of sustainability. The building and construction industry is one of the largest consumers of natural resources. Consequently, there is a particular need for regulations and technologies that help to create closed material cycles. From the technological point of view, such efforts are complicated by the growing material diversity and the amount of composites contained in present and future construction and demolition waste (CDW). Nowadays, simple but proven techniques like manual sorting are mainly used. However, this practice not only poses health risks and dangers to the staff performing the work, but also relies on merely obvious, visually striking differences. Automated, sensor-based sorting of these building materials could complement or replace this practice to improve processing speed, recycling rates, sorting quality, and prevailing health conditions. The preliminary results for the identification of a wide variety of building materials with LIBS are presented.
Closed material cycles and single-variety material fractions are required to achieve high recovery and recycling rates in the construction industry. In the recycling of construction and demolition waste, simple but proven techniques have so far been preferred in order to process large quantities of rubble in a short time. This is in contrast to the increasingly complex composites in the mineral building materials industry. The manual sorting often practiced today carries many risks and dangers for the personnel performing it and relies only on obvious, visually detectable differences for separation. Automated, sensor-based sorting of these building materials could complement or replace this practice in order to improve processing speed, recycling rates, sorting quality and the prevailing health conditions.
In recent decades, the number of components in concrete has grown, particularly in formulations aimed at reducing carbon footprints. Innovations include diverse binders, supplementary cementitious materials, activators, concrete admixtures, and recycled aggregates. These developments target not only the enhancement of material properties but also the mitigation of the ecological and economic impacts of concrete, the most extensively used material by humankind. However, these advancements also introduce greater variability in the composition of raw materials, and the material's behavior is significantly influenced by its nanoscale properties, which can pose challenges for accurate characterization. Consequently, experimental tuning of formulations becomes more and more necessary, while the increased complexity in composition makes it difficult to find the ideal formulation through trial and error. Inverse design (ID) techniques offer a solution to this challenge by allowing a comprehensive search of the entire design space to create new and improved concrete formulations. In this publication, we introduce the concept of ID and demonstrate how our open-source app “SLAMD” provides all necessary steps of the workflow to adapt it in the laboratory, lowering the application barriers. The intelligent screening process, guided by a predictive model, leads to a more efficient and effective data-driven material design process, resulting in a reduced carbon footprint and improved material quality while considering socio-economic factors in the materials design.
This paper presents a novel approach for developing sustainable building materials through sequential learning. Data sets with a total of 1367 formulations of different types of alkali-activated building materials, including fly ash and blast furnace slag-based concrete and their respective compressive strength and CO2 footprint, were compiled from the literature to develop and evaluate this approach. Utilizing these data, a comprehensive computational study was undertaken to evaluate the efficacy of the proposed material design methodologies, simulating laboratory conditions reflective of real-world scenarios. The results indicate a significant reduction in development time and lower research costs enabled through predictions with machine learning. This work challenges common practices in data-driven materials development for building materials. Our results show that the training data required for data-driven design may be much less than commonly suggested. Further, it is more important to establish a practical design framework than to choose more accurate models. This approach can be immediately implemented in practical applications and can translate into significant advances in sustainable building materials development.
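A minimal sketch of the sequential-learning loop described above is given below, assuming a random-forest surrogate and an upper-confidence-bound style utility; the candidate space, target value and utility weighting are illustrative and not those used in the paper.

```python
# Sequential-learning sketch: rank untested formulations by a utility that
# trades off predicted strength (exploitation) against model uncertainty
# (exploration), "run" the best experiment, and repeat until the target is hit.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical candidate space: columns stand in for mix-design fractions
# (e.g. binder, slag, fly ash, activator, water); all values are synthetic.
candidates = rng.uniform(0.0, 1.0, size=(500, 5))
true_strength = 40 + 30 * candidates[:, 1] - 20 * candidates[:, 4] + rng.normal(0, 2, 500)

tested = [int(i) for i in rng.choice(len(candidates), size=5, replace=False)]  # initial experiments
target = 60.0  # MPa, illustrative target compressive strength

for cycle in range(30):
    untested = [i for i in range(len(candidates)) if i not in tested]
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(candidates[tested], true_strength[tested])
    # Per-tree predictions give a cheap uncertainty estimate for the utility.
    per_tree = np.stack([tree.predict(candidates[untested]) for tree in model.estimators_])
    mean, std = per_tree.mean(axis=0), per_tree.std(axis=0)
    utility = mean + 1.0 * std                      # upper-confidence-bound style ranking
    next_idx = untested[int(np.argmax(utility))]
    tested.append(next_idx)                         # "run" the most promising experiment
    if true_strength[next_idx] >= target:
        print(f"Target strength reached after {len(tested)} experiments")
        break
```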
SLAMD-FIB-Case-Study
(2022)
With 8% of man-made CO2 emissions, cement production is an important driver of the climate crisis. By using alkali-activated binders, part of the energy-intensive clinker production process can be dispensed with. However, because numerous chemicals are involved in the manufacturing process, the complexity of the materials increases by orders of magnitude. Finding a properly balanced cement formulation is like looking for a needle in a haystack. We have shown for the first time that artificial intelligence (AI)-based optimization of cement formulations can significantly accelerate research. The “Sequential Learning App for Materials Discovery” (SLAMD) aims to accelerate practice transfer. With SLAMD, materials scientists have low-threshold access to AI through interactive and intuitive user interfaces. The value added by AI can be determined directly. For example, the CO2 emissions saved per ton of cement can be determined for each development cycle: the more efficient the AI optimization, the greater the savings. Our material database already includes more than 120,000 data points of alternative cements and is constantly being expanded with new parameters. We are currently driving the enrichment of the data with a life cycle analysis of the building materials. Based on a case study, we show how intuitive access to AI can drive the adoption of techniques that make a real contribution to the development of resource-efficient and sustainable building materials of the future and make it easy to identify when classical experiments are more efficient.
With 8% of man-made CO2 emissions, cement production is an important driver of the climate crisis. By using alkali-activated binders, part of the energy-intensive clinker production process can be dispensed with. However, as numerous raw materials are involved in the manufacturing process, the complexity of the materials increases by orders of magnitude. Finding a properly balanced binder formulation is like looking for a needle in a haystack. We have shown for the first time that artificial intelligence (AI)-based optimization of alkali-activated binder formulations can significantly accelerate research.
The "Sequential Learning App for Materials Discovery" (SLAMD) aims to accelerate practice transfer. With SLAMD, materials scientists have low-threshold access to AI through interactive and intuitive user interfaces. The value added by AI can be determined directly. For example, the CO2 emissions saved per ton of cement can be determined for each development cycle: the more efficient the AI optimization, the greater the savings.
Our material database already includes more than 120,000 data points of alternative binders and is constantly being expanded with new parameters. We are currently driving the enrichment of the data with a life cycle analysis of the building materials.
Based on a case study we show how intuitive access to AI can drive the adoption of techniques that make a real contribution to the development of resource-efficient and sustainable building materials of the future and make it easy to identify when classical experiments are more efficient.
Half-cell potential mapping (PM) is the most popular non-destructive testing (NDT) method for locating active reinforcement corrosion in concrete. PM is influenced by parameters such as moisture and chloride gradients in the component, so its sensitivity to the spatially very limited but dangerous pitting corrosion is low. In this study we show how additional measurement information can be exploited with multi-sensor data fusion to improve the detection performance and automate the evaluation. The fusion is based on supervised machine learning (ML), i.e. methods that learn relationships in (sensor) data from given labels. We use supervised ML to distinguish areas labelled “defect” and “intact” in a multi-sensor data set. Our data set consists of 18 measurement campaigns, each containing half-cell potential, ground-penetrating radar, microwave moisture and Wenner resistivity data. Exact labels under changing environmental conditions were obtained in an experimental set-up in which a reinforced concrete slab was weathered in the laboratory in a controlled and accelerated manner. The progress of deterioration was monitored continuously and corrosion was initiated in a targeted way. The detection results are quantified and evaluated statistically. Data fusion shows a clear improvement over the best individual method (PM). We describe the challenges of data-driven approaches in non-destructive testing and outline possible solutions.
This work presents machine learning-inspired data fusion approaches to improve the non-destructive testing of reinforced concrete. The principal effects that are used for data fusion are shown theoretically. Their effectiveness is tested in case studies carried out on large-scale concrete specimens with built-in chloride-induced rebar corrosion. The data set consists of half-cell potential mapping, Wenner resistivity, microwave moisture and ground penetrating radar measurements. Data fusion is based on the logistic regression algorithm.
It learns an optimal linear decision boundary from multivariate labeled training data to separate intact and defect areas. The training data are generated in an experiment that simulates the entire life cycle of chloride-exposed concrete building parts. The unique possibility to monitor the deterioration and to initiate corrosion in a targeted way allows the data to be labeled.
The results exhibit an improved sensitivity of the data fusion with logistic regression compared to the best individual method, half-cell potential mapping.
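A minimal sketch of the fusion idea, assuming four synthetic feature channels and scikit-learn; the feature values, labels and performance comparison are purely illustrative and do not reproduce the study's data.

```python
# Illustrative data-fusion sketch: fuse several NDT feature maps (half-cell
# potential, Wenner resistivity, microwave moisture, GPR amplitude) into one
# corrosion classifier with logistic regression. All data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 2000
defect = rng.integers(0, 2, n)                          # 1 = corroding, 0 = intact (labels)
potential = -100 - 350 * defect + rng.normal(0, 80, n)  # mV, more negative if corroding
resistivity = 120 - 60 * defect + rng.normal(0, 25, n)  # ohm*m, lower in wet/chloride areas
moisture = 4 + 3 * defect + rng.normal(0, 1.5, n)       # mass-% (microwave sensor)
gpr_amp = 1.0 - 0.3 * defect + rng.normal(0, 0.2, n)    # normalized rebar reflection amplitude

X = np.column_stack([potential, resistivity, moisture, gpr_amp])
X_tr, X_te, y_tr, y_te = train_test_split(X, defect, test_size=0.3, random_state=0)

fused = make_pipeline(StandardScaler(), LogisticRegression()).fit(X_tr, y_tr)
print("AUC fused features :", roc_auc_score(y_te, fused.predict_proba(X_te)[:, 1]))
print("AUC potential alone:", roc_auc_score(y_te, -X_te[:, 0]))  # single-method baseline
```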
With 8% of man-made CO2 emissions, cement production is an important driver of the climate crisis. By using alkali-activated binders, part of the energy-intensive clinker production process can be dispensed with. However, as numerous raw materials are involved in the manufacturing process, the complexity of the materials increases by orders of magnitude. Finding a properly balanced binder formulation is like looking for a needle in a haystack. We have shown for the first time that artificial intelligence (AI)-based optimization of alkali-activated binder formulations can significantly accelerate research.
The "Sequential Learning App for Materials Discovery" (SLAMD) aims to accelerate practice transfer. With SLAMD, materials scientists have low-threshold access to AI through interactive and intuitive user interfaces. The value added by AI can be determined directly. For example, the CO2 emissions saved per ton of cement can be determined for each development cycle: the more efficient the AI optimization, the greater the savings.
Our material database already includes more than 120,000 data points of alternative binders and is constantly being expanded with new parameters. We are currently driving the enrichment of the data with a life cycle analysis of the building materials.
Based on a case study we show how intuitive access to AI can drive the adoption of techniques that make a real contribution to the development of resource-efficient and sustainable building materials of the future and make it easy to identify when classical experiments are more efficient.
Alkali-activated binders (AAB) can provide a clean alternative to conventional cement in terms of CO2 emissions. However, as yet there are no sufficiently accurate material models to effectively predict the AAB properties, thus making optimal mix design highly costly and reducing the attractiveness of such binders. This work adopts sequential learning (SL) in high-dimensional material spaces (consisting of composition and processing data) to find AABs that exhibit desired properties. The SL approach combines machine learning models and feedback from real experiments. For this purpose, 131 data points were collected from different publications. The data sources are described in detail, and the differences between the binders are discussed. The sought-after target property is the compressive strength of the binders after 28 days. The success is benchmarked in terms of the number of experiments required to find materials with the desired strength. The influence of some constraints was systematically analyzed, e.g., the possibility to parallelize the experiments, the influence of the chosen algorithm and the size of the training data set. The results show the advantage of SL, i.e., the amount of data required can potentially be reduced by at least one order of magnitude compared to traditional machine learning models, while at the same time exploiting highly complex information. This brings applications in laboratory practice within reach.
Machine learning (ML) has been applied successfully to many tasks in non-destructive testing in civil engineering (NDT-CE). However, creating reference data is in most cases extremely expensive, so such data are much scarcer than in other research fields. Moreover, the available data sometimes cover only a single scenario, so performance indicators often do not reflect the actual performance of the ML model in practical applications. Estimates that quantify the transferability from one scenario to another are required to meet this challenge and to pave the way for applications in practice.
In this contribution, we present tools for describing the uncertainty of ML in new NDT-CE scenarios. For this purpose, we extended an existing training data set for the classification of corrosion damage to reinforcement in concrete by a new case study. The measurements were performed on large-scale concrete specimens with built-in chloride-induced corrosion of the reinforcing steel. The experiment simulated the entire life cycle of chloride-exposed concrete components in the laboratory. Our data set comprises half-cell potential and ground-penetrating radar measurements. The unique ability to monitor the deterioration and to initiate corrosion in a targeted way made it possible to label the data, which is crucial for building ML models. To investigate transferability, we extend our model with metadata such as design features of the test specimen and environmental conditions. This allows the change of these features in new scenarios to be expressed as uncertainties using statistical methods. We compare different sampling-based and distribution-based approaches and show how these methods can be used to close knowledge gaps of ML models in NDT.
ML has been successfully applied to solve many NDT-CE tasks. This is usually demonstrated with performance metrics that evaluate the model as a whole based on a given set of data. However, since in most cases the creation of reference data is extremely expensive, the data used is generally much sparser than in other areas, such as e-commerce. As a result, performance indicators often do not reflect the practical applicability of the ML model. Estimates that quantify transferability from one case to another are necessary to meet this challenge and pave the way for real world applications.
In this contribution, we investigate the uncertainty of ML in new NDT-CE scenarios. For this purpose, we have extended an existing training data set for the classification of corrosion damage by a new case study. Our data set includes half-cell potential mapping and ground-penetrating radar measurements. The measurements were performed on large-area concrete samples with built-in chloride-induced corrosion of the reinforcement. The experiment simulated the entire life cycle of chloride-exposed concrete components in the laboratory. The unique ability to monitor deterioration and to initiate corrosion in a targeted way allowed the data to be labelled, which is crucial to ML. To investigate transferability, we extend our data by including design features of the test specimen and environmental conditions. This allows the change of these features in new scenarios to be expressed as uncertainties using statistical methods. We compare different sampling-based and statistical distribution-based approaches and show how these methods can be used to close knowledge gaps of ML models in NDT.
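As a rough illustration of how a shift in metadata features between the training scenario and a new scenario can be quantified statistically, the sketch below compares two synthetic distributions of a single feature; the feature choice and values are assumptions, not data from the study.

```python
# Illustrative transferability check: compare the distribution of a metadata
# feature (e.g. concrete cover depth) in the training scenario against a new
# scenario. Large shifts flag knowledge gaps of the trained model.
import numpy as np
from scipy.stats import ks_2samp, wasserstein_distance

rng = np.random.default_rng(2)
cover_train = rng.normal(35, 5, 400)   # mm, training specimens (synthetic)
cover_new = rng.normal(55, 8, 120)     # mm, new structure to be inspected (synthetic)

res = ks_2samp(cover_train, cover_new)
print(f"KS statistic {res.statistic:.2f}, p-value {res.pvalue:.3g}")
print(f"Wasserstein distance {wasserstein_distance(cover_train, cover_new):.1f} mm")
# A small p-value / large distance indicates the new scenario lies outside the
# training distribution, so model predictions there carry extra uncertainty.
```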
Environmentally friendly alternatives to cement are created through the synthesis of numerous base materials. The variation of their proportions alone leads to millions of materials candidates. Identifying suitable materials is very laborious; traditional systematic research in the laboratory consumes a lot of time and effort.
Sequential learning (SL) potentially speeds up the materials research process despite limited but highly complex available information. SL does not make direct predictions of material properties but ranks possible experiments according to their utility. The most promising experiments are prioritized over dead-end experiments and experiments whose outcome is already known.
Our work has shown that SL is promising for cement research. So far, research has mainly focused on materials whose synthesis is faster and whose material properties require less time for development or characterization (allowing many successive experiments). Conversely, in the case of binders, SL is only useful if few experiments lead to the desired goal, since, for example, the determination of the compressive strength alone typically requires 28 days.
In research practice, experimental designs and the availability of resources often determine which data can be used, for example when some laboratory resources are not available or are deemed irrelevant to a task. As a result, new research scenarios are constantly emerging, each of which requires SL's performance to be demonstrated.
We present the SLAMD app to facilitate the exploration of SL methods in numerous research scenarios. The app provides flexible and low-threshold access to AI methods via intuitive and interactive user interfaces. We deliberately pursue a software-based research approach (as opposed to a code- or script-based one). On the one hand, the results are more comprehensible since we refer to a common (code) basis (“reproducible science”). On the other hand, the methods are easily accessible to all, which accelerates the knowledge transfer into laboratory practice.
High greenhouse gas emissions from the production of building materials are a major contributor to the current climate crisis. However, developing alternative building materials is complex. Traditional laboratory methods are reaching their limits. Artificial intelligence, on the other hand, can give research a new dynamic.
Novel materials are usually developed manually in the laboratory rather than on a computer. This makes the processes time-consuming, difficult and expensive. With the app SLAMD (Sequential Learning App for Materials Discovery), materials researchers can explore the potential of artificial intelligence to speed up materials research and easily apply AI in the lab. The app was developed by our team at the Federal Institute for Materials Research and Testing (BAM) led by Prof. Sabine Kruschwitz together with a team in the Department of Building Materials and Construction Chemistry at TU Berlin led by Prof. Dietmar Stephan.
It uses material composition and characterization data to predict ideal material candidates. It can be used to optimize many material properties simultaneously and even incorporates database information such as carbon footprint, material cost or resource availability. Unlike the usual data-intensive AI methods, SLAMD optimally integrates existing knowledge and human feedback, and provides numerous decision support tools to precisely navigate complex scientific knowledge processes towards success.
In this talk, we will present some case studies where we were able to find suitable advanced materials in a few months instead of several years. We will talk about the challenges we overcame and the future potential we see for this approach to developing the green materials of the future.
With 8% of man-made CO2 emissions, cement production is an important driver of the climate crisis. By using alkali-activated binders, part of the energy-intensive clinker production process can be dispensed with. However, as numerous raw materials are involved in the manufacturing process, the complexity of the materials increases by orders of magnitude. Finding a properly balanced binder formulation is like looking for a needle in a haystack. We have shown for the first time that artificial intelligence (AI)-based optimization of alkali-activated binder formulations can significantly accelerate research.
The "Sequential Learning App for Materials Discovery" (SLAMD) aims to accelerate practice transfer. With SLAMD, materials scientists have low-threshold access to AI through interactive and intuitive user interfaces. The value added by AI can be determined directly. For example, the CO2 emissions saved per ton of cement can be determined for each development cycle: the more efficient the AI optimization, the greater the savings.
Our material database already includes more than 120,000 data points of alternative binders and is constantly being expanded with new parameters. We are currently driving the enrichment of the data with a life cycle analysis of the building materials.
Based on a case study we show how intuitive access to AI can drive the adoption of techniques that make a real contribution to the development of resource-efficient and sustainable building materials of the future and make it easy to identify when classical experiments are more efficient.
WEBSLAMD
(2022)
The objective of SLAMD is to accelerate materials research in the wet lab through AI. Currently, the focus is on sustainable concrete and binder formulations, but it can be extended to other material classes in the future.
1. Summary
- Leverage the Digital Lab and AI optimization to discover exciting new materials.
- Represent resources and processes and their socio-economic impact.
- Calculate complex compositions and enrich them with detailed material knowledge.
- Integrate laboratory data and apply it to novel formulations.
- Tailor materials to the purpose to achieve the best solution.
Workflow
Digital Lab
Specify resources: From base materials to manufacturing processes – "Base" enables a detailed and consistent description of existing resources
Combine resources: The combination of base materials and processes offers an almost infinite optimization potential. "Blend" makes it easier to design complex configurations.
Digital Formulations: With "Formulations" you can effortlessly convert your resources into the entire spectrum of possible concrete formulations. This automatically generates a detailed set of data for AI optimization (a minimal enumeration sketch is given after this workflow overview).
AI-Optimization
Materials Discovery: Integrate data from the "Digital Lab" or upload your own material data. Enrich the data with lab results and transfer the knowledge to new recipes via artificial intelligence. Leverage socio-economic metrics to identify recipes tailored to your requirements.
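The following sketch illustrates the formulation-enumeration idea referenced above; the base material names and the 10 % step size are assumptions for illustration, not the actual SLAMD implementation.

```python
# Minimal sketch of the "Formulations" idea: enumerate candidate mixes by
# combining base materials in discrete weight fractions that sum to 100 %.
from itertools import product

base_materials = ["cement", "slag", "fly_ash", "activator", "water"]  # illustrative names
step = 0.1  # 10 % increments

fractions = [round(i * step, 2) for i in range(int(round(1 / step)) + 1)]
formulations = [
    dict(zip(base_materials, combo))
    for combo in product(fractions, repeat=len(base_materials))
    if abs(sum(combo) - 1.0) < 1e-9        # keep only mixes that sum to 100 %
]
print(f"{len(formulations)} candidate formulations generated")
print(formulations[0])
```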
Driven by digitalization and so-called Industry 4.0, expectations regarding the use of sensor technology are rising. The uncertainty, contradiction and redundancy of individually processed single sources are to be overcome by the synergetic merging of heterogeneous data sets. The increasing complexity of the tasks is matched by an equally increasing availability of sensors, data and computing power.
The term data fusion covers approaches that process data into more abstract but more comprehensible information. Fusion is a core component of effective assistance systems and has proven its great potential in numerous tasks, from military applications and flight and driving assistance systems to the home. With the growing automation of measurement data acquisition, data fusion is also becoming increasingly attractive in industrial quality testing and assurance.
The talk gives an overview of this broad field and, in its theoretical part, specifically addresses the questions of which information is contained in multivariate data sets and how it can be extracted. Subsequently, an example of a successful application to the non-destructive testing of concrete components is presented. The data set shown is small, heterogeneous, high-dimensional and imbalanced. Using algorithms with different capabilities regarding adaptability and invariance to high dimensionality, it is explained which processes are necessary to improve the information quality.
Explore and Exploit - Strategic Extension of the Fractographic Database with Machine Learning
(2020)
In this talk, we present the current state of a master's thesis that deals with the question of how the generalizability of data models based on small data sets can be increased. We show how the data basis of a fractographic image classifier can be strategically extended with a statistical model, or adapted to an application.
We have arrived in the data age. But why is it so difficult for the NDT community to achieve real breakthroughs with data-driven science? In this seminar, we will take a brief look at the evolution of mainstream data science to understand why the most exciting times are perhaps just ahead. We will give an overview of our activities in the junior research group 8.K which are aimed at enabling the next generation of data science methods in NDT. The seminar addresses the two main work fields of our group: semantic data management and the handling of limited data resources.
The first field addresses the problem that a uniform representation of our data is not yet available. However, knowledge creation in data science - whose main contribution lies in the analysis of distributed resources - requires common data access based on a collective understanding. To achieve this, we present an ontology-based approach. Ontologies are already the core of many intelligent systems such as building information models or research databases. We summarize some of the basic principles of this technology and describe our approach to create an NDT ontology.
The second field ties in with the first and addresses the application of data-based methods in engineering practice. Especially in the field of non-destructive testing, many successful applications have been published. In most cases, however, the creation of reference data is extremely expensive, and such data are therefore much sparser than in other research areas. As a result, the available data may cover only one scenario, so that common benchmarks often do not reflect the actual performance of the model in practical applications. Estimates that quantify the transferability from one scenario to another are not only necessary to overcome this challenge - they also prove to be a powerful tool for the strategic expansion of what we consider knowledge.
Data-driven research is considered the new paradigm in science. In this field, data is the new resource from which knowledge is extracted that is too complex for traditional methods. Several factors, such as national funding and advances in information technology, are driving this development. In particular, the creation of databases and the analysis of data with artificial intelligence are playing an important role in establishing the new paradigm. However, there are numerous challenges that must be overcome to realize the full potential of data-driven methods. This talk sets the stage for the upcoming workshop by reviewing some of the historical developments and the current state of data-driven science in NDT and materials science.
Many PhD students are interested in applying machine learning, AI, data science, etc., and there are many good reasons for this. However, there is a disconnect between mainstream data science and materials science, for example, when it comes to the sheer size of the data. This talk will highlight some of the unique challenges in materials informatics and present some interesting approaches to overcome them. Although the field is large, this talk will focus on cases that have some practical relevance to PhD students at BAM.
In the field of non-destructive testing (NDT) in civil engineering, a large number of measurement data are collected. Although they serve as a basis for scientific analyses, there is still no uniform representation of the data. An analysis of various distributed data sets across different test objects is therefore only possible with high manual effort.
We present a system architecture for integrated data management of distributed data sets based on Semantic Web technologies. The approach is essentially based on a mathematical model, the so-called ontology, which represents the knowledge of our domain, NDT. The ontology developed by us is linked to data sources and thus describes the semantic meaning of the data. Furthermore, the ontology acts as a central concept for database access. Non-domain data sources can easily be integrated by linking them to the NDT construction ontology and are directly available for generic use in the sense of digitization. Based on an extensive literature review, we outline the possibilities that this offers for NDT in civil engineering, such as computer-aided sorting, analysis, recognition and explanation of relationships (explainable AI) for several million measurement data.
The expected benefits of this approach of knowledge representation and data access for the NDT community are an expansion of knowledge through data exchange in research (interoperability), the scientific exploitation of large existing data sources with data-based methods (such as image recognition, measurement uncertainty calculations, factor analysis, material characterization) and finally a simplified exchange of NDT data with engineering models and thus with the construction industry.
Ontologies are already the core of numerous intelligent systems such as building information modeling or research databases. This contribution gives an overview of the range of tools we are currently creating to communicate with them.
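To make the ontology-based access pattern concrete, a minimal sketch using rdflib is given below; the namespace, class and property names are invented for illustration and are not the NDT ontology developed by the authors.

```python
# Minimal sketch of ontology-based data access with rdflib. The namespace,
# classes and properties below are invented for illustration.
from rdflib import Graph, Namespace, Literal, RDF, XSD

NDT = Namespace("https://example.org/ndtce#")
g = Graph()
g.bind("ndt", NDT)

# Describe one measurement semantically: method, test object, value.
g.add((NDT.meas001, RDF.type, NDT.HalfCellPotentialMeasurement))
g.add((NDT.meas001, NDT.onTestObject, NDT.bridgeDeck42))
g.add((NDT.meas001, NDT.potential_mV, Literal(-412.0, datatype=XSD.double)))

# Generic access: one SPARQL query works for any data source linked to the ontology.
query = """
PREFIX ndt: <https://example.org/ndtce#>
SELECT ?m ?v WHERE {
    ?m a ndt:HalfCellPotentialMeasurement ;
       ndt:potential_mV ?v .
    FILTER(?v < -350)
}
"""
for row in g.query(query):
    print(row.m, row.v.toPython())
```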
In the field of non-destructive testing (NDT) in civil engineering, a large number of measurement data are collected. Although they serve as a basis for scientific analyses, there is still no uniform representation of the data. An analysis of various distributed data sets across different test objects is therefore hardly possible.
We present an approach for integrated data management of distributed data sets based on Semantic Web technologies. At its core, the approach is based on a mathematical model, the so-called ontology, which represents the knowledge of our domain, NDT in civil engineering. The NDT-CE ontology we developed is linked to data sources and thus describes the semantic meaning of the data. In addition, the ontology acts as a central concept for database access. Data sources from outside the domain can easily be integrated by linking them to the NDT-CE ontology and are directly available for generic use in the sense of digitization. Based on an extensive literature review, we outline the resulting possibilities for NDT in civil engineering, such as computer-aided sorting and analysis of measurement data and the recognition and explanation of relationships.
The expected benefits of this approach to knowledge representation and data access for the NDT community are an expansion of knowledge through data exchange in research (interoperability), the scientific exploitation of large existing data sources with data-based methods (such as image recognition, measurement uncertainty calculations, factor analyses, material characterization) and, finally, a simplified transfer of NDT data into engineering models and thus into construction practice.
Ontologies are already the core of numerous intelligent systems such as building information modeling or research databases. This contribution gives an overview of the tools we are currently creating for communicating with them.
We invite you to the data analysis training workshop. Driven by digitalization and so-called Industry 4.0, expectations regarding the use of data are rising. The uncertainty, contradiction and redundancy of individually processed single sources are to be overcome by the synergetic merging of heterogeneous data sets. The increasing complexity of the tasks is matched by an equally increasing availability of data and analysis methods.
The workshop conveys a conceptual understanding of modern data analysis methods (machine learning (ML), multivariate statistics) and, through a subsequent hands-on training with Python (https://www.python.org), is intended to provide an easy introduction to the topic. The course is aimed at early-career researchers.
Driven by digitalization and so-called Industry 4.0, expectations regarding the use of sensor technology are rising. The uncertainty, contradiction and redundancy of individually processed single sources are to be overcome by the synergetic merging of heterogeneous data sets. The increasing complexity of the tasks is matched by an equally increasing availability of sensors, data and computing power.
The term data fusion covers approaches that process data into more abstract but more comprehensible information. Fusion is a core component of effective assistance systems and has proven its great potential in numerous tasks, from military applications and flight and driving assistance systems to the home. With the growing automation of measurement data acquisition, data fusion is also becoming increasingly attractive in industrial quality testing and assurance.
The talk gives an overview of this broad field and specifically addresses the questions of which information is contained in multivariate data sets and how it can be extracted. Finally, an example of a successful application to the non-destructive testing of concrete components is presented. The systematic comparison of algorithms that address the characteristics of the laboratory data set in different ways allows surprising conclusions.
The application and benefits of Semantic Web technologies (SWT) for managing, sharing, and (re-)using research data are demonstrated in implementations in the field of Materials Science and Engineering (MSE). However, a compilation and classification are needed to fully recognize the scattered published works with their unique added values. Here, the primary use of SWT at the interface with MSE is identified using specifically created categories. This overview highlights promising opportunities for the application of SWT to MSE, such as enhancing the quality of experimental processes, enriching data with contextual information in knowledge graphs, or using ontologies to perform specific queries on semantically structured data. While interdisciplinary work between the two fields is still in its early stages, a great need is identified to facilitate access for non-experts and to develop and provide user-friendly tools and workflows. The full potential of SWT can best be achieved in the long term by the broad acceptance and active participation of the MSE community. In perspective, these technological solutions will advance the field of MSE by making data FAIR. Data-driven approaches will benefit from these data structures and their connections to catalyze knowledge generation in MSE.
Additive manufacturing (AM) of metals, and in particular laser powder bed fusion (LPBF), enables a degree of freedom in design unparalleled by conventional subtractive methods. To ensure that the designed precision is matched by the produced LPBF parts, a full understanding of the interaction between the laser and the feedstock powder is needed. It has been shown that the laser also melts subjacent layers of material. This effect plays a key role when designing small cavities or overhanging structures, because in these cases the material underneath is feedstock powder. In this study, we quantify the extension of the melt pool during laser illumination of powder layers and the spatial distribution of defects in a cylindrical specimen. During the LPBF process, several layers were intentionally not exposed to the laser beam at various locations, while the build process was monitored by thermography and optical tomography. The cylinder was finally scanned by X-ray computed tomography (XCT). To correlate the positions of the unmolten layers in the part, a staircase was manufactured around the cylinder for easier registration. The results show that healing among layers occurs if a scan strategy is applied in which the orientation of the hatches is changed for each subsequent layer. They also show that small pores and the surface roughness of solidified material below a thick layer of unmolten material (>200 µm) serve as seeding points for larger voids. The orientation of the first two layers fully exposed after a thick layer of unmolten powder shapes the orientation of these voids, created by a lack of fusion.