
A Proof of Local Convergence for the Adam Optimizer

  • Adaptive Moment Estimation (Adam) is a very popular training algorithm for deep neural networks and is implemented in many machine learning frameworks. To the best of the authors' knowledge, no complete convergence analysis exists for Adam. The contribution of this paper is a method for the local convergence analysis in batch mode for a deterministic fixed training set, which yields necessary conditions on the hyperparameters of the Adam algorithm. Due to the local nature of the arguments, the objective function can be non-convex but must be at least twice continuously differentiable.
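
The paper concerns the batch-mode (deterministic, fixed training set) variant of the Adam iteration. For reference, the following is a minimal Python sketch of the standard Adam update with bias correction (Kingma & Ba) applied to a full, deterministic gradient; the hyperparameter defaults and the names adam_batch and grad are illustrative assumptions, not the paper's formulation or its exact conditions on the hyperparameters.

    import numpy as np

    def adam_batch(grad, w0, alpha=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, steps=5000):
        # Deterministic (full-batch) Adam: grad(w) returns the exact gradient of the
        # fixed-training-set objective, so there is no stochastic mini-batch noise.
        w = np.asarray(w0, dtype=float)
        m = np.zeros_like(w)  # first-moment (momentum) estimate
        v = np.zeros_like(w)  # second-moment estimate
        for t in range(1, steps + 1):
            g = grad(w)
            m = beta1 * m + (1.0 - beta1) * g
            v = beta2 * v + (1.0 - beta2) * g**2
            m_hat = m / (1.0 - beta1**t)   # bias correction of the first moment
            v_hat = v / (1.0 - beta2**t)   # bias correction of the second moment
            w = w - alpha * m_hat / (np.sqrt(v_hat) + eps)
        return w

    # Example: a non-convex but twice continuously differentiable objective
    # f(w) = w[0]**2 + cos(w[1]), with gradient [2*w[0], -sin(w[1])].
    w_star = adam_batch(lambda w: np.array([2.0 * w[0], -np.sin(w[1])]), w0=[1.0, 2.0])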

Metadata
Author: Sebastian Bock, Martin Georg Weiß
URN: urn:nbn:de:bvb:898-opus4-501
URL / DOI: https://doi.org/10.1109/IJCNN.2019.8852239
Parent Title (English): Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, July 14-19, 2019
Document Type: conference proceeding (article)
Language: English
Year of first Publication: 2019
Publishing Institution: Ostbayerische Technische Hochschule Regensburg
Release Date: 2019/11/28
Tag: Adam optimizer; Convergence; Dynamical system; Fixed point; Momentum method; Non-convex optimization
GND Keyword: Neuronales Netz; Maschinelles Lernen; Optimierungsalgorithmus; Konvergenz 〈Informationstechnik〉
Volume: 2019
First Page: 1
Last Page: 8
Institutes: Fakultät Informatik und Mathematik
Review status: peer-reviewed
Research focus: Digitalisation
Licence (German): No licence - German copyright law applies (§ 53 UrhG)