TY - JOUR
A1 - Langmann, Reinhard
A1 - Stiller, Michael
T1 - The PLC as a Smart Service in Industry 4.0 Production Systems
JF - Applied Sciences
Y1 - 2019
U6 - https://doi.org/10.3390/app9183815
SN - 2076-3417
VL - 9
IS - 18
PB - MDPI
ER -

TY - CHAP
A1 - Echternacht, David
A1 - Schermuly, Rainer
ED - Cichowski, Rolf Rüdiger
T1 - Simulation der Auswirkungen privater Ladeinfrastruktur auf Niederspannungsnetze
T2 - Anlagentechnik 2020
Y1 - 2019
SN - 978-3-8007-4838-9
SP - 149
EP - 158
PB - VDE Verlag
CY - Berlin
ET - Neuerscheinung
ER -

TY - JOUR
A1 - Richter, Jessica
A1 - Steenmann, Anna
A1 - Schellscheidt, Benjamin
A1 - Licht, Thomas
T1 - On-Chip Diffusion Bonding creates Stable Interconnections Usable at Temperatures over 300°C
JF - International Symposium on Microelectronics
Y1 - 2019
U6 - https://doi.org/10.4071/2380-4505-2019.1.000530
VL - 2019
IS - 1
SP - 000530
EP - 000534
PB - IMAPS
ER -

TY - CHAP
A1 - Müller, Patrick
A1 - Lehmann, Matthias
A1 - Braun, Alexander
ED - Kress, Bernard C.
ED - Schelkens, Peter
T1 - Optical quality metrics for image restoration
T2 - Digital Optical Technologies 2019
N2 - Image restoration is a process used to remove blur (from different sources such as object motion or aberrations) from images by either non-blind or blind deconvolution. The metrics commonly used to quantify the restoration process are the peak signal-to-noise ratio (PSNR) and the structural similarity index measure (SSIM). Often only a small sample of test images is used (such as Lena or the cameraman). In optical design research, PSNR and SSIM are not normally used; here, image quality metrics based on linear system theory (e.g. the modulation transfer function, MTF) are used to quantify optical errors such as spherical or chromatic aberration. In this article we investigate how different image restoration algorithms can be quantified by applying image quality metrics. We start with synthetic image data that is used on camera test stands (e.g. the Siemens star), apply two different spatially variant degradation algorithms, and restore the original image by a direct method (Wiener filtering within sub-images) and by an iterative method (alternating direction method of multipliers, ADMM). Afterwards we compare the quality metrics (such as MTF curves) for the original, the degraded and the restored image. As a first result we show that restoration algorithms sometimes fail in dealing with non-natural scenes, e.g. slanted-edge targets. Further, these first results indicate a correlation between degradation and restoration, i.e. the restoration algorithms are not capable of removing the optically relevant errors introduced by the degradation, a fact neither visible in nor available from the PSNR values. We discuss the relevance in the context of the automotive industry, where image restoration may yield distinct advantages for camera-based applications, but testing methods rely on the image quality metrics used.
Y1 - 2019
U6 - https://doi.org/10.1117/12.2528100
VL - 11062
CY - Munich
ER -

TY - JOUR
A1 - Lehmann, Matthias
A1 - Wittpahl, Christian
A1 - Zakour, Hatem Ben
A1 - Braun, Alexander
T1 - Resolution and accuracy of nonlinear regression of point spread function with artificial neural networks
JF - Optical Engineering
N2 - We have already demonstrated a numerical model for the point spread function (PSF) of an optical system that can efficiently model both the experimental measurements and the lens design simulations of the PSF. The novelty lies in the portability and the parameterization of this model, which allow for completely new ways to validate optical systems; this is especially interesting not only for mass-production optics such as in the automotive industry but also for ophthalmology. The numerical basis for this model is a nonlinear regression of the PSF with an artificial neural network (ANN). After briefly describing both the principle and the applications of the model, we then discuss two optically important aspects: the spatial resolution and the accuracy of the model. Using the mean squared error (MSE) as a metric, we vary the topology of the neural network, both in the number of neurons and in the number of hidden layers. Measurement and simulation of a PSF can have a much higher spatial resolution than the typical pixel size used in current camera sensors. We discuss the influence this has on the topology of the ANN. The relative accuracy of the averaged pixel MSE is below 10^-4, thus giving confidence that the regression does indeed model the measurement data with good accuracy. This article is only the starting point, and we propose several research avenues for future work.
Y1 - 2019
U6 - https://doi.org/10.1117/1.oe.58.4.045101
SN - 0091-3286
VL - 58
IS - 4
SP - 045101
PB - SPIE
ER -

TY - JOUR
A1 - Lehmann, Matthias
A1 - Wittpahl, Christian
A1 - Zakour, Hatem Ben
A1 - Braun, Alexander
T1 - Modeling realistic optical aberrations to reuse existing drive scene recordings for autonomous driving validation
JF - Journal of Electronic Imaging
N2 - Training autonomous vehicles requires large numbers of driving sequences in all situations. Collecting and labeling these drive scenes is a very time-consuming and expensive process. Currently, it is not possible to reuse these drive scenes with different optical properties, because there exists no numerically efficient model for the transfer function of the optical system. We present a numerical model for the point spread function (PSF) of an optical system that can efficiently model both experimental measurements and lens design simulations of the PSF. The numerical basis for this model is a nonlinear regression of the PSF with an artificial neural network. The novelty lies in the portability and the parameterization of this model. We present a lens measurement series, yielding a numerical function for the PSF that depends only on the parameters defocus, field, and azimuth. By convolving existing images and videos with this PSF, we generate images as if seen through the measured lens. The methodology applies to any optical scenario, but we focus on the context of autonomous driving, where the quality of the detection algorithms depends directly on the optical quality of the camera system used. With this model, it is possible to reuse existing recordings, with the potential to avoid millions of test drive miles. The parameterization of the optical model allows for a method to validate the functional and safety limits of camera-based advanced driver assistance systems based on the real, measured lens actually used in the product.
Y1 - 2019
U6 - https://doi.org/10.1117/1.JEI.28.1.013005
SN - 1560-229X
VL - 28
IS - 1
SP - 013005
PB - SPIE
ER -