TY - INPR
A1 - Bock, Sebastian
A1 - Weiß, Martin Georg
T1 - Local Convergence of Adaptive Gradient Descent Optimizers
N2 - Adaptive Moment Estimation (ADAM) is a very popular training algorithm for deep neural networks and belongs to the family of adaptive gradient descent optimizers. However, to the best of the authors' knowledge, no complete convergence analysis exists for ADAM. The contribution of this paper is a method for the local convergence analysis in batch mode for a deterministic fixed training set, which gives necessary conditions for the hyperparameters of the ADAM algorithm. Due to the local nature of the arguments, the objective function can be non-convex but must be at least twice continuously differentiable. We then apply this procedure to other adaptive gradient descent algorithms and show local convergence with hyperparameter bounds for most of them.
KW - ADAM Optimizer
KW - Convergence
KW - Momentum method
KW - Dynamical system
KW - Fixed point
Y1 - 2020
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:898-opus4-7546
ER -
TY - INPR
A1 - Bock, Sebastian
A1 - Weiß, Martin Georg
T1 - Rotation Detection of Components with Convolutional Neural Networks
N2 - The main issues in many image processing applications are object recognition and object detection, which answer the questions of whether an object is present and, if so, where it is located. Popular object detection algorithms like YOLO use a regression formulation for the whole problem, especially for the bounding box parameters. In industrial production the setting is usually different: the object type is known, and one wants to know with high precision where the object is located. We study a prototype application in this area in which we identify the in-plane rotation of an object. To solve this problem we use a regression approach with a CNN architecture as a function approximator. We compare our results to standard image processing algorithms that do not use neural networks and present quantitative results on the accuracy. CNNs seem at least competitive with classical image processing.
KW - Neural networks
KW - Network architecture
KW - CNN
KW - Function approximation
KW - Image orientation
Y1 - 2019
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:898-opus4-4120
ER -
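The second record above describes regressing the rotation of a known component with a CNN used as a function approximator. The paper does not include code; the following is only a minimal PyTorch-style sketch of such a setup. The architecture, layer sizes, and the (sin θ, cos θ) target encoding are illustrative assumptions, not the authors' network.

```python
# Minimal sketch (not the authors' code): a small CNN that regresses the
# in-plane rotation of a component from a 64x64 grayscale image.
import math
import torch
import torch.nn as nn

class RotationRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 2),                      # predicts (sin θ, cos θ)
        )

    def forward(self, x):
        return self.head(self.features(x))

if __name__ == "__main__":
    model = RotationRegressor()
    images = torch.randn(8, 1, 64, 64)             # dummy batch of images
    angles = torch.rand(8) * 2 * math.pi           # dummy ground-truth angles
    target = torch.stack((angles.sin(), angles.cos()), dim=1)
    loss = nn.functional.mse_loss(model(images), target)
    print("MSE loss on dummy batch:", loss.item())
```

Predicting (sin θ, cos θ) rather than the raw angle is one common way to avoid the wrap-around discontinuity at 0°/360° in a regression target; whether the authors use this encoding is not stated in the abstract.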
TY - CHAP
A1 - Bock, Sebastian
A1 - Weiß, Martin Georg
T1 - Non-Convergence and Limit Cycles in the Adam Optimizer
T2 - Proceedings of the 28th International Conference on Artificial Neural Networks, 2019, Munich, Germany, September 17-19
N2 - One of the most popular training algorithms for deep neural networks is Adaptive Moment Estimation (Adam), introduced by Kingma and Ba. Despite its success in many applications, there is no satisfactory convergence analysis: only local convergence can be shown for batch mode under some restrictions on the hyperparameters, and counterexamples exist for incremental mode. Recent results show that for simple quadratic objective functions limit cycles of period 2 exist in batch mode, but only for atypical hyperparameters and only for the algorithm without bias correction. We extend the convergence analysis to all choices of the hyperparameters for quadratic functions. This finally answers the question of convergence for Adam in batch mode in the negative. We analyze the stability of these limit cycles and relate our analysis to other results where approximate convergence was shown, but under the additional assumption of bounded gradients, which does not apply to quadratic functions. The investigation relies heavily on computer algebra due to the complexity of the equations.
KW - Adam optimizer
KW - Convergence
KW - Computer algebra
KW - Dynamical system
KW - Limit cycle
KW - Neural network
KW - Machine learning
KW - Optimization algorithm
KW - Convergence (information technology)
Y1 - 2019
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:898-opus4-490
UR - https://doi.org/10.1007/978-3-030-30484-3_20
SP - 232
EP - 243
ER -
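To make the non-convergence result in the preceding record concrete, here is a small self-contained numerical experiment (the paper itself argues via computer algebra, not numerics): it runs Adam with bias correction on the scalar quadratic f(x) = x^2/2 and prints the tail of the trajectory. All hyperparameter values are illustrative; for which choices a period-2 limit cycle is attained is exactly what the paper analyzes.

```python
# Minimal numerical sketch: Adam with bias correction on the scalar
# quadratic f(x) = x^2 / 2, whose gradient is g = x and whose unique
# minimizer is x = 0.
import math

def adam_on_quadratic(x0=1.0, alpha=0.5, beta1=0.9, beta2=0.999,
                      eps=1e-8, steps=20000, tail=6):
    """Run Adam on f(x) = x^2 / 2 and return the last `tail` iterates."""
    x, m, v = x0, 0.0, 0.0
    history = []
    for t in range(1, steps + 1):
        g = x                                 # gradient of x^2 / 2
        m = beta1 * m + (1 - beta1) * g       # first-moment estimate
        v = beta2 * v + (1 - beta2) * g * g   # second-moment estimate
        m_hat = m / (1 - beta1 ** t)          # bias-corrected estimates
        v_hat = v / (1 - beta2 ** t)
        x -= alpha * m_hat / (math.sqrt(v_hat) + eps)
        history.append(x)
    return history[-tail:]

# If Adam converged, the printed tail would be close to 0; a pattern
# alternating between two values indicates a period-2 limit cycle.
# Which behavior occurs for which (alpha, beta1, beta2) is what the
# paper maps out; the defaults above are illustrative only.
print(adam_on_quadratic())
```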