Adam optimizer

E182821 UNEXPLORED

The Adam (Adaptive Moment Estimation) optimizer is a popular stochastic gradient descent variant in machine learning that adapts the learning rate for each parameter individually, using exponential moving-average estimates of the first moment (mean) and second moment (uncentered variance) of the gradients.
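
As a rough illustration of the update this description refers to, here is a minimal NumPy sketch of a single Adam step. The hyperparameter defaults (lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8) are the values commonly quoted from the original paper; the function name and signature here are illustrative, not taken from any particular library.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a parameter array `param` given gradient `grad`.

    m, v are the running first- and second-moment estimates; t is the
    1-indexed step count used for bias correction. Names are illustrative.
    """
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2     # second-moment estimate
    m_hat = m / (1 - beta1**t)                # bias-corrected first moment
    v_hat = v / (1 - beta2**t)                # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

The division by sqrt(v_hat) is what gives each parameter its own effective step size: parameters with consistently large gradients take smaller steps, while rarely-updated parameters take larger ones.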

Referenced by (1)

Full triples: surface form annotated when it differs from this entity's canonical label.

Jimmy Ba knownFor Adam optimizer