Adam: A Method for Stochastic Optimization


"Adam: A Method for Stochastic Optimization" is a highly influential machine learning paper that introduces the Adam optimizer, a widely used adaptive gradient-based optimization algorithm for training deep neural networks.


Referenced by (2)

Full triples — surface form annotated when it differs from this entity's canonical label.

Jimmy Ba coAuthorOf Adam: A Method for Stochastic Optimization
Jimmy Ba hasNotableWork Adam: A Method for Stochastic Optimization