Batch Normalization

E701500

Batch Normalization is a deep learning technique that stabilizes and accelerates neural network training by normalizing layer inputs using mini-batch statistics.
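The normalization the statements below describe can be written compactly. For a mini-batch of m activations x_1…x_m, with γ and β the learnable scale and shift parameters and ε a small constant for numerical stability (symbols follow the original paper's notation):

```latex
\mu_B = \frac{1}{m}\sum_{i=1}^{m} x_i,\qquad
\sigma_B^2 = \frac{1}{m}\sum_{i=1}^{m}\left(x_i - \mu_B\right)^2,\qquad
\hat{x}_i = \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}},\qquad
y_i = \gamma\,\hat{x}_i + \beta
```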


Observed surface forms (1)

Statements (49)

Predicate Object
instanceOf deep learning technique
normalization method
regularization technique
appliedBetween linear transformation and nonlinearity
category neural network optimization technique
neural network regularization method
commonlyUsedIn convolutional neural networks
fully connected networks
residual networks
commonPlacement after convolutional layer
after fully connected layer
before activation function
describedIn Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
domain deep learning
machine learning
effect acts as regularizer
allows higher learning rates
can reduce need for dropout
improves gradient flow
reduces internal covariate shift
reduces sensitivity to initialization
implementationDetail often fused with preceding linear or convolutional layer for efficiency
influenced development of Group Normalization
development of Instance Normalization
development of Layer Normalization
introducedBy Christian Szegedy
Sergey Ioffe
introducesParameter beta
gamma
limitation depends on batch statistics
less effective with very small batch sizes
mathematicalOperation standardization of activations
normalizesTo unit variance
normalizesTo zero mean
operatesOn layer activations
mini-batches
parameterType learnable scale parameter gamma
learnable shift parameter beta
primaryGoal accelerate neural network training
stabilize neural network training
publicationYear 2015
requires mini-batch of examples
requiresPhase inference phase
training phase
updateRule exponential moving average of batch statistics
usesDuringInference running mean
running variance
usesStatistic mini-batch mean
mini-batch variance
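The statements above (mini-batch statistics, learnable γ/β, the exponential-moving-average update rule, and running statistics at inference) can be sketched in NumPy. This is a minimal illustration, not the fused implementation frameworks use in practice; the function names and the `momentum` default are illustrative choices:

```python
import numpy as np

def batch_norm_train(x, gamma, beta, running_mean, running_var,
                     momentum=0.9, eps=1e-5):
    """Training mode: standardize activations with mini-batch statistics.

    x has shape (batch, features). Each feature is normalized to zero
    mean and unit variance over the mini-batch, then rescaled by the
    learnable scale (gamma) and shift (beta) parameters.
    """
    mu = x.mean(axis=0)                     # mini-batch mean
    var = x.var(axis=0)                     # mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # standardized activations
    out = gamma * x_hat + beta              # learnable scale and shift

    # Update rule: exponential moving average of batch statistics,
    # accumulated during training for later use at inference.
    running_mean = momentum * running_mean + (1 - momentum) * mu
    running_var = momentum * running_var + (1 - momentum) * var
    return out, running_mean, running_var

def batch_norm_infer(x, gamma, beta, running_mean, running_var, eps=1e-5):
    """Inference mode: no mini-batch is available, so the running mean
    and running variance replace the batch statistics."""
    x_hat = (x - running_mean) / np.sqrt(running_var + eps)
    return gamma * x_hat + beta
```

The training-mode output has (approximately) zero mean and unit variance per feature before γ and β are applied, which is what makes the layer insensitive to the scale of the preceding linear transformation.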

Referenced by (5)

Full triples — surface form annotated when it differs from this entity's canonical label.

Layer Normalization relatedTo Batch Normalization
Sergey Ioffe knownFor Batch Normalization
Sergey Ioffe coAuthorOf Batch Normalization
this entity surface form: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Sergey Ioffe coInvented Batch Normalization
Sergey Ioffe notableWork Batch Normalization
this entity surface form: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift