Layer Normalization
E182824
Layer Normalization is a neural-network normalization technique that stabilizes and accelerates training by normalizing activations across the feature dimension within each individual sample; it is particularly useful in recurrent and transformer-based models, where batch statistics are unreliable or unavailable.
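The technique can be sketched in a few lines: compute the mean and variance over the feature axis of each sample, normalize, then apply a learned per-feature scale and shift. A minimal NumPy sketch, with the parameter names `gamma`, `beta`, and `eps` chosen for illustration:

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Statistics are taken over the feature (last) axis,
    # independently for each sample -- unlike batch norm,
    # which averages over the batch axis.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learned per-feature scale (gamma) and shift (beta)
    # restore the network's ability to represent any
    # mean/variance it needs.
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0, 3.0],
              [4.0, 6.0, 8.0]])
out = layer_norm(x, gamma=np.ones(3), beta=np.zeros(3))
```

After normalization, each row of `out` has (approximately) zero mean and unit variance, regardless of the scale of the corresponding input row.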