A numerical study is carried out to assess the impact of different chimney shapes, namely circular (outer diameter dc), convergent (outer diameter 0.5dc), divergent (outer diameter 1.5dc), sudden contraction (outer diameter 0.5dc), and sudden expansion (outer diameter 1.5dc), on the performance of a solar chimney power plant (SCPP). Furthermore, the parametric impact of the chimney divergence angle (CDA, ϕ) and the ground absorber slope angle (GSA, γ) on SCPP performance is examined. The optimum divergence angle (ϕ = +0.75°) enhances the power generation by up to ~47% (76 kW) with a horizontal ground absorber surface; increasing or decreasing the CDA away from this optimum reduces the power generation. With a sloped ground absorber angle of γ = 0.6°, the gain in power generation is 60% (82 kW). Combining the sloped ground absorber (γ = 0.6°) with the divergent chimney (ϕ = +0.75°) enhances the power generation by up to 80% (92 kW) over the classical Manzanares plant.
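The three reported gains are mutually consistent: each (percentage, kilowatt) pair implies the same baseline output for the reference plant. A minimal sketch of this arithmetic, where the ~51 kW baseline is inferred from the abstract's own numbers rather than stated in it:

```python
# Consistency check for the reported gains, assuming all percentages are
# relative to the same baseline (the classical Manzanares configuration).
# The ~51 kW baseline inferred below is derived from the abstract's numbers,
# not a figure stated in the paper.

cases = {
    "divergent chimney (phi = +0.75 deg)": (0.47, 76.0),
    "sloped absorber (gamma = 0.6 deg)":   (0.60, 82.0),
    "combined configuration":              (0.80, 92.0),
}

for name, (gain, power_kw) in cases.items():
    baseline_kw = power_kw / (1.0 + gain)  # P = P0 * (1 + gain)
    print(f"{name}: implied baseline {baseline_kw:.1f} kW")
```

All three cases imply a baseline of about 51 kW, which agrees with the roughly 50 kW nominal rating commonly cited for the Manzanares prototype.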
The importance of high data quality is increasing with the growing impact and distribution of ML systems and big data. The planned AI Act of the European Commission also defines challenging legal requirements for data quality, especially for the market introduction of safety-relevant ML systems. In this paper we introduce a novel approach that supports the data quality assurance process across multiple data quality aspects. This approach enables the verification of quantitative data quality requirements. The concept and its benefits are introduced and explained on small example data sets. The application of the method is demonstrated on the well-known MNIST data set of handwritten digits.
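As an illustration of what verifying quantitative data quality requirements can look like in practice, here is a minimal sketch; it is not the paper's method, and the specific checks and thresholds (class balance ratio, duplicate rate) are assumptions chosen for the example:

```python
import numpy as np

# Illustrative sketch only: the paper's actual verification approach is not
# reproduced here. This shows one way to check quantitative data quality
# requirements (class balance, duplicate rate) against fixed thresholds.
# The threshold values are assumptions chosen for the example.

def check_class_balance(labels: np.ndarray, max_imbalance: float = 1.5) -> bool:
    """Require that no class is more than max_imbalance times as frequent
    as the rarest class."""
    counts = np.bincount(labels)
    return counts.max() / counts.min() <= max_imbalance

def check_duplicate_rate(samples: np.ndarray, max_rate: float = 0.01) -> bool:
    """Require that the fraction of exact duplicate samples stays below max_rate."""
    flat = samples.reshape(len(samples), -1)
    n_unique = len(np.unique(flat, axis=0))
    return (len(flat) - n_unique) / len(flat) <= max_rate

# Example with synthetic stand-in data (MNIST arrays could be plugged in instead):
rng = np.random.default_rng(0)
labels = rng.integers(0, 10, size=1000)
samples = rng.integers(0, 256, size=(1000, 28, 28), dtype=np.uint8)
print(check_class_balance(labels), check_duplicate_rate(samples))
```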
Refractive power measurements serve as the primary quality standard in the automotive glazing industry. In light of autonomous driving, new optical metrics are becoming increasingly popular for specifying optical quality requirements for the windshield. Nevertheless, the link between those quantities and the refractive power needs to be established in order to ensure a holistic requirement profile for the windshield. As a consequence, traceable high-resolution refractive power measurements are still required for glass quality assessment. Standard measurement systems using Moiré patterns for refractive power monitoring in the automotive industry are severely resolution-limited and therefore insufficient for evaluating the camera window area. Consequently, there is a need for more sophisticated refractive power measurement systems that provide a higher spatial resolution. In addition, a calibration procedure has to be developed in order to guarantee the comparability of the measurement results. To increase the resolution, a measurement setup based on an auto-correlation algorithm is tested in this paper. Furthermore, a calibration procedure is established using a single reference lens with a nominal refractive power of 100 km⁻¹. To calibrate the entire measurement range of the system, the lens is tilted by an inclination angle orthogonal to the optical axis; the effective refractive power is then given by the Kerkhof model. By adopting the measurement and calibration procedure presented in this paper, glass suppliers in the automotive industry will be able to detect relevant manufacturing defects within the camera window area more accurately, paving the way for a holistic quality assurance of the windshield for future advanced driver-assistance system (ADAS) functionalities. Concurrently, the traceability of the measurement results is ensured by establishing a calibration chain based on a single reference lens, which is traced back to international standards.
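The abstract does not reproduce the Kerkhof model itself. As a rough stand-in, the following sketch uses the classical thin-lens tilt (Martin's) formula to show how tilting a single 100 km⁻¹ reference lens sweeps out a range of effective refractive powers; the formula, the refractive index, and the angle range are assumptions for illustration only:

```python
import numpy as np

# Hedged sketch: the Kerkhof model is not given in the abstract, so the
# classical thin-lens tilt (Martin's) formula is used here as a stand-in to
# show how one tilted reference lens spans a calibration range.
# F0, n, and the tilt angles are assumptions for illustration.

F0 = 100.0   # nominal refractive power of the reference lens in km^-1
n = 1.5      # assumed refractive index of the lens material

def effective_power(theta_deg: float) -> float:
    """Effective power in the tilt meridian of a thin lens tilted by theta."""
    theta = np.radians(theta_deg)
    sphere = F0 * (1.0 + np.sin(theta) ** 2 / (2.0 * n))  # tilt-induced sphere
    return sphere / np.cos(theta) ** 2                    # plus induced cylinder

for theta_deg in (0, 10, 20, 30, 40):
    print(f"tilt {theta_deg:2d} deg -> {effective_power(theta_deg):6.1f} km^-1")
```

The monotonic increase with tilt angle is what allows a single reference lens to calibrate the entire measurement range.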
With the increasing capabilities of machine learning systems and their potential use in safety-critical systems, ensuring high data quality is becoming increasingly important. In this paper we present a novel approach for the assurance of data quality. For this purpose, the mathematical foundations are first discussed, and the approach is then presented using multiple examples. The result is the detection of data points whose properties are potentially harmful for use in safety-critical systems.
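As a generic illustration of flagging data points with atypical properties, the sketch below scores each point by its mean distance to its k nearest neighbours; the paper's actual mathematical approach is not given in the abstract, and k and the flagging quantile are assumptions:

```python
import numpy as np

# Illustrative sketch only: a simple k-nearest-neighbour distance score is
# used as a generic stand-in for detecting data points with unusual
# (potentially harmful) properties; k and the quantile are assumptions.

def knn_outlier_scores(X: np.ndarray, k: int = 5) -> np.ndarray:
    """Mean distance to the k nearest neighbours, per data point."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)       # ignore self-distance
    knn = np.sort(d, axis=1)[:, :k]
    return knn.mean(axis=1)

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
X[:3] += 8.0                          # plant three atypical points
scores = knn_outlier_scores(X)
flagged = np.where(scores > np.quantile(scores, 0.98))[0]
print("flagged indices:", flagged)    # should include the planted points 0, 1, 2
```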
The modulation transfer function (MTF) is a fundamental optical metric for measuring the optical quality of an imaging system. In the automotive industry it is used to qualify camera systems for ADAS/AD. Every modern ADAS/AD system includes evaluation algorithms for environment perception and decision making that are based on AI/ML methods and neural networks. The performance of these AI algorithms is measured by established metrics such as Average Precision (AP) or precision-recall curves. In this article we investigate the robustness of the link between the optical quality metric and the AI performance metric. A series of numerical experiments was performed with object detection and instance segmentation algorithms (cars, pedestrians) evaluated on image databases of varying optical quality. We demonstrate that a distinct performance loss is apparent for strong optical aberrations, but that for subtle optical quality differences, such as might arise from production tolerances, the link does not exhibit a satisfactory correlation. This calls into question the reliability of the current industry practice, in which a produced camera is tested end-of-line (EOL) with the MTF and fixed MTF thresholds are used to qualify the performance of the camera under test.
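A common way to create image databases of controlled optical quality is Gaussian blurring, for which the blur strength maps analytically to an MTF value. A minimal sketch of that mapping, where the blur levels are assumptions and the paper's actual degradation procedure is not reproduced:

```python
import numpy as np

# Hedged sketch: for a Gaussian PSF of standard deviation sigma (pixels),
# the MTF is exp(-2 * pi^2 * sigma^2 * f^2) with f in cycles/pixel, so each
# blur level corresponds to a definite MTF50. The sigma values below are
# assumptions for illustration.

def mtf50_cycles_per_pixel(sigma: float) -> float:
    """Spatial frequency at which the Gaussian-blur MTF drops to 0.5."""
    return np.sqrt(np.log(2.0) / (2.0 * np.pi ** 2 * sigma ** 2))

for sigma in (0.5, 1.0, 2.0, 4.0):
    print(f"sigma = {sigma:3.1f} px -> MTF50 = {mtf50_cycles_per_pixel(sigma):.3f} cy/px")
```

Plotting a detector's AP against MTF50 across such degradation levels probes exactly the correlation the abstract examines.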
Datenlebenszyklus (Data Lifecycle)
(2023)
A data lifecycle serves as a framework for structured data management. It covers the entire flow of data, from planning through creation to preservation and provision. At every station of the data lifecycle, measures for structured data management can be taken in order to use data efficiently, ensure their quality, and guarantee their long-term availability.
The present graphic visualizes the data lifecycle, which consists of three main areas: planning and reuse, creation and processing, and preservation and provision. Each area comprises specific tasks and processes for handling data.
The first phase, "Planning and reuse", comprises two essential elements. First comes the planning of the research project, in which the goals, methods, and resources are defined. Second, it covers the search for and reuse of already existing data in order to assess their usefulness and potential for the project at hand.
The second phase, "Creation and processing", consists of three elements that are depicted in their own circular flow. First, data are collected and captured, whether through experimental studies, surveys, or other methods. The transition from the first to the second phase takes place via this element, since this is where the actual generation of one's own data within the research begins. Second, the collected data are stored and organized in order to ensure their integrity and availability. Third, the relevant information is processed, analyzed, and selected. The transition from the second to the third phase proceeds from this element, since the insights gained are prepared for subsequent preservation and provision. Overall, the second phase forms a continuous cycle.
The third phase, "Preservation and provision", consists of three elements. First, the research results are published, whether in scientific journals, repositories, or other relevant publication venues. Second, the data are archived in order to ensure their long-term availability and integrity. Third, access to the research results is provided, for example through targeted release to specific user groups or by increasing their visibility.
The soundscape approach highlights the role of situational factors in sound evaluations; however, only a few studies have applied a multi-domain approach including sound-related, person-related, and time-varying situational variables. Therefore, we conducted a study based on the Experience Sampling Method to measure the relative contribution of a broad range of potentially relevant acoustic and non-auditory variables in predicting indoor soundscape evaluations. Here we present the comprehensive dataset, for which 105 participants reported temporally (rather) stable trait variables such as noise sensitivity, trait affect, and quality of life. They rated 6,594 situations with respect to the soundscape standard dimensions, perceived loudness, and the salience of the sound components, and they evaluated situational variables such as state affect, perceived control, activity, and location. To complement these subject-centered data, we additionally crowdsourced object-centered data by having participants make binaural measurements of each indoor soundscape at their homes using a low-(self-)noise recorder. These recordings were used to compute (psycho-)acoustical indices such as the energetically averaged loudness level, the A-weighted energetically averaged equivalent continuous sound pressure level, and the A-weighted five-percent exceedance level. These complex hierarchical data can be used to investigate time-varying non-auditory influences on sound perception and to develop soundscape indicators based on the binaural recordings to predict soundscape evaluations.
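For reference, two of the named indices have standard definitions that can be computed directly from a calibrated pressure signal. A minimal sketch, assuming a mono signal, frequency-domain A-weighting, and 1 s frames for the exceedance level (the dataset's actual processing chain is not reproduced here):

```python
import numpy as np

# Hedged sketch: standard definitions of the A-weighted equivalent continuous
# level L_Aeq and the A-weighted five-percent exceedance level L_A5 (the level
# exceeded in 5% of short frames). Frame length and the synthetic test signal
# are assumptions for illustration.

P0 = 2e-5  # reference sound pressure in Pa

def a_weighting_db(f):
    """IEC 61672 A-weighting in dB for frequency f in Hz."""
    f = np.asarray(f, dtype=float)
    num = (12194.0 ** 2) * f ** 4
    den = ((f ** 2 + 20.6 ** 2)
           * np.sqrt((f ** 2 + 107.7 ** 2) * (f ** 2 + 737.9 ** 2))
           * (f ** 2 + 12194.0 ** 2))
    return 20.0 * np.log10(np.where(f > 0, num / den, 1e-12)) + 2.0

def a_weight(signal, fs):
    """Apply A-weighting in the frequency domain."""
    spec = np.fft.rfft(signal)
    gain = 10.0 ** (a_weighting_db(np.fft.rfftfreq(len(signal), 1.0 / fs)) / 20.0)
    return np.fft.irfft(spec * gain, n=len(signal))

def laeq_and_la5(pressure_pa, fs, frame_s=1.0):
    pa = a_weight(pressure_pa, fs)
    laeq = 10.0 * np.log10(np.mean(pa ** 2) / P0 ** 2)
    n = int(frame_s * fs)
    frames = pa[: len(pa) // n * n].reshape(-1, n)
    frame_levels = 10.0 * np.log10(np.mean(frames ** 2, axis=1) / P0 ** 2)
    return laeq, np.percentile(frame_levels, 95)  # L_A5: exceeded 5% of the time

# Synthetic test: a 1 kHz tone at ~0.02 Pa RMS (~60 dB SPL), 10 s at 16 kHz.
fs = 16000
t = np.arange(10 * fs) / fs
p = 0.02 * np.sqrt(2) * np.sin(2 * np.pi * 1000 * t)
print(laeq_and_la5(p, fs))
```

Since the A-weighting is normalized to ~0 dB at 1 kHz, the test tone should yield an L_Aeq close to 60 dB, which serves as a quick sanity check.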