TY - JOUR
A1 - Barz, Tilman
A1 - Seliger, Dominik
A1 - Marx, Klemens
A1 - Sommer, Andreas
A1 - Walter, Sebastian F.
A1 - Bock, Hans Georg
A1 - Koerkel, Stefan
T1 - State and state of charge estimation for a latent heat storage
JF - Control Engineering Practice
N2 - A nonlinear state observer is designed for a thermal energy storage with solid/liquid phase change material (PCM). Using a physical 2D dynamic model, the observer reconstructs transient spatial temperature fields inside the storage and estimates the stored energy and the state of charge. The observer has been successfully tested with a lab-scale latent heat storage with a single-pass tube bundle and the phase change material located in a shell around each tube. It turns out that the observer robustly tracks the real process data with as few as four internal PCM temperature sensors. © 2017 Elsevier Ltd. All rights reserved.
KW - Heat conduction in cylindrical shell
KW - Kalman filter
KW - Latent heat thermal energy storage (LHTES)
KW - MODEL
KW - Nonlinear state observer
KW - OF-THE-ART
KW - Orthogonal collocation
KW - PCM
KW - PERFORMANCE ENHANCEMENT TECHNIQUES
KW - PHASE-CHANGE MATERIAL
KW - POLYETHYLENE
KW - Reduced model
KW - simulation
KW - Solidification
KW - State of charge (SOC)
KW - SYSTEMS
KW - THERMAL-ENERGY STORAGE
Y1 - 2018
U6 - https://doi.org/10.1016/j.conengprac.2017.11.006
VL - 72
SP - 151
EP - 166
PB - Pergamon-Elsevier Science Ltd
ER -
TY - CHAP
A1 - Schwindl, Tobias
A1 - Volbert, Klaus
A1 - Bock, Sebastian
ED - Fleury, Eric
ED - Ahrens, Andreas
ED - Benavente-Peces, César
ED - Cam-Winget, Nancy
T1 - Fast and Reliable Update Protocols in WSNs During Software Development, Testing and Deployment
T2 - Proceedings of the 7th International Conference on Sensor Networks - SENSORNETS, Funchal, January 22-24, 2018, Madeira, Portugal
N2 - A lot of research has been done in the area of Wireless Sensor Networks during the past years. Today, Wireless Sensor Networks are deployed in the field in many different applications (e.g. energy management services, heat and water billing, smoke detectors). Nevertheless, research and development in this area continues. After a network is deployed, software updates are performed very rarely, but during development and testing one typical, frequently recurring task is to deploy new firmware to thousands of nodes. In this paper, we consider such a software update for a special, but well-known and frequently used sensor network platform. There exist some interesting research papers about updating sensor nodes, but we place a special focus on the technical update process. In this context, we show why these existing update processes do not cover our challenges. Our goal is to allow a developer to update thousands of nodes reliably and very fast during development and testing. For this purpose, it is not so important that the update is optimal with regard to energy consumption. We do not need a multi-hop protocol, because all devices are in range, e.g., in a laboratory. In our work, we present a model of the update process and give very fast protocols that solve it. The results of our extensive simulations show that the developed protocols achieve fast, scalable and reliable updates.
KW - WSN
KW - Software Update
KW - Low-Power Devices
Y1 - 2018
UR - https://www.scitepress.org/DigitalLibrary/Link.aspx?doi=10.5220/0006534400190030
SN - 978-989-758-284-4
U6 - https://doi.org/10.5220/0006534400190030
SP - 19
EP - 30
PB - SCITEPRESS - Science and Technology Publications Lda
CY - Setúbal, Portugal
ER -
TY - INPR
A1 - Bock, Sebastian
A1 - Weiß, Martin Georg
T1 - Local Convergence of Adaptive Gradient Descent Optimizers
N2 - Adaptive Moment Estimation (ADAM) is a very popular training algorithm for deep neural networks and belongs to the family of adaptive gradient descent optimizers. However, to the best of the authors' knowledge, no complete convergence analysis exists for ADAM. The contribution of this paper is a method for the local convergence analysis in batch mode for a deterministic fixed training set, which gives necessary conditions for the hyperparameters of the ADAM algorithm. Due to the local nature of the arguments, the objective function can be non-convex but must be at least twice continuously differentiable. We then apply this procedure to other adaptive gradient descent algorithms and show local convergence with hyperparameter bounds for most of them.
KW - ADAM Optimizer
KW - Convergence
KW - momentum method
KW - dynamical system
KW - fixed point
Y1 - 2020
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:898-opus4-7546
ER -
TY - CHAP
A1 - Bock, Sebastian
A1 - Weiß, Martin Georg
T1 - A Proof of Local Convergence for the Adam Optimizer
T2 - Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN), 2019, Budapest, Hungary, July 14-19
N2 - Adaptive Moment Estimation (Adam) is a very popular training algorithm for deep neural networks, implemented in many machine learning frameworks. To the best of the authors' knowledge, no complete convergence analysis exists for Adam. The contribution of this paper is a method for the local convergence analysis in batch mode for a deterministic fixed training set, which gives necessary conditions for the hyperparameters of the Adam algorithm. Due to the local nature of the arguments, the objective function can be non-convex but must be at least twice continuously differentiable.
KW - Non-convex optimization
KW - Adam optimizer
KW - Convergence
KW - Momentum method
KW - Dynamical system
KW - Fixed point
KW - Neural network
KW - Machine learning
KW - Optimization algorithm
KW - Convergence 〈information technology〉
Y1 - 2019
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:898-opus4-501
UR - https://doi.org/10.1109/IJCNN.2019.8852239
VL - 2019
SP - 1
EP - 8
ER -
TY - INPR
A1 - Bock, Sebastian
A1 - Weiß, Martin Georg
T1 - Rotation Detection of Components with Convolutional Neural Networks
N2 - The main issues in many image processing applications are object recognition and object detection, which answer the questions of whether an object is present and, if so, where it is located. Popular object detection algorithms like YOLO use a regression formulation for the whole problem, especially for the bounding box parameters. In industrial production the setting is usually different: the object type is known, and one rather wants to know with high precision where the object is. We study a prototype application in this area where we identify the rotation of an object in a plane. To solve this problem we use a regression approach with a CNN architecture as a function approximator. We compare our results to standard image processing algorithms, which do not use neural networks, and present quantitative results on the accuracy. CNNs seem at least competitive with classical image processing.
KW - Neural networks
KW - Network Architecture
KW - CNN
KW - Function approximation
KW - Image orientation
Y1 - 2019
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:898-opus4-4120
ER -
TY - CHAP
A1 - Bock, Sebastian
A1 - Weiß, Martin Georg
T1 - Non-Convergence and Limit Cycles in the Adam Optimizer
T2 - Proceedings of the 28th International Conference on Artificial Neural Networks, 2019, Munich, Germany, September 17-19
N2 - One of the most popular training algorithms for deep neural networks is the Adaptive Moment Estimation (Adam) introduced by Kingma and Ba. Despite its success in many applications, there is no satisfactory convergence analysis: only local convergence can be shown for batch mode under some restrictions on the hyperparameters, and counterexamples exist for incremental mode. Recent results show that for simple quadratic objective functions limit cycles of period 2 exist in batch mode, but only for atypical hyperparameters, and only for the algorithm without bias correction. We extend the convergence analysis to all choices of the hyperparameters for quadratic functions. This finally answers the question of convergence for Adam in batch mode in the negative. We analyze the stability of these limit cycles and relate our analysis to other results where approximate convergence was shown, but under the additional assumption of bounded gradients, which does not apply to quadratic functions. The investigation relies heavily on the use of computer algebra due to the complexity of the equations.
KW - Adam optimizer
KW - Convergence
KW - Computer algebra
KW - Dynamical system
KW - Limit cycle
KW - Neural network
KW - Machine learning
KW - Optimization algorithm
KW - Convergence 〈information technology〉
Y1 - 2019
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:898-opus4-490
UR - https://doi.org/10.1007/978-3-030-30484-3_20
SP - 232
EP - 243
ER -