Oja rule

E899010

The Oja rule is a normalized form of Hebbian learning used in neural networks to extract principal components while keeping synaptic weight growth stable.
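For reference, a conventional statement of the rule (a sketch in standard notation, not text quoted from the 1982 paper): for a single linear neuron with output $y = \mathbf{w}^{\top}\mathbf{x}$ and learning rate $\eta$,

\[
\Delta \mathbf{w} \;=\; \eta\, y\,(\mathbf{x} - y\,\mathbf{w})
\;=\; \underbrace{\eta\, y\, \mathbf{x}}_{\text{Hebbian term}} \;-\; \underbrace{\eta\, y^{2}\, \mathbf{w}}_{\text{weight decay term}} .
\]

This is the first-order expansion of the explicitly normalized Hebbian update $\mathbf{w} \leftarrow (\mathbf{w} + \eta y \mathbf{x}) / \lVert \mathbf{w} + \eta y \mathbf{x} \rVert$; for stationary, zero-mean inputs, $\mathbf{w}$ converges (up to sign) to the unit-norm leading eigenvector of the input covariance matrix $C = \mathbb{E}[\mathbf{x}\mathbf{x}^{\top}]$.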


Statements (47)

Predicate Object
instanceOf learning rule
synaptic plasticity rule
unsupervised learning algorithm
appliesTo linear feedforward networks
single linear neuron
assumes stationary input statistics
zero-mean input data
basedOn Hebbian learning
category Hebbian learning rules
PCA learning rules
contrastsWith standard Hebbian rule without normalization
convergesTo first principal component
leading eigenvector of input covariance matrix
countryOfOrigin Finland
definedIn “Simplified neuron model as a principal component analyzer”
ensures bounded weight norm
extendedTo multi-neuron PCA networks
field computational neuroscience
machine learning
neural computation
hasComponent Hebbian term
weight decay term
hasParameter learning rate
hasProperty normalized Hebbian learning
inspired neural PCA algorithms
introducedBy Erkki Oja
learningType unsupervised
mathematicallyRelatedTo eigenvalue problem
stochastic gradient ascent
maximizes output variance under unit-norm constraint
normalizes weight vector magnitude
optimizes variance of neuron output
prevents unbounded synaptic weight growth
publicationYear 1982
publishedIn Journal of Mathematical Biology
relatedTo Kohonen learning rule
Sanger rule
stabilizes synaptic weights
updateType online learning rule
usedFor adaptive signal processing
dimensionality reduction
feature extraction
principal component analysis
principal component extraction
subspace tracking
usedIn neural networks
unsupervised neural learning
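To make the statements above concrete, here is a minimal Python sketch of the rule as conventionally stated (the synthetic data, variable names, and step size are illustrative choices, not part of this entry):

import numpy as np

rng = np.random.default_rng(0)

# Stationary, zero-mean 2-D inputs with one dominant direction
# (matches the 'assumes' statements above).
cov = np.array([[3.0, 1.0],
                [1.0, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=5000)

eta = 0.01                 # hasParameter: learning rate
w = rng.normal(size=2)     # random initial weight vector

for x in X:                # updateType: online learning rule
    y = w @ x              # single linear neuron output
    # Hebbian term minus weight-decay term (the two 'hasComponent' entries)
    w += eta * y * x - eta * (y ** 2) * w

# convergesTo: leading eigenvector of the input covariance matrix (up to sign)
eigvals, eigvecs = np.linalg.eigh(cov)
print("learned w :", w / np.linalg.norm(w))
print("leading PC:", eigvecs[:, np.argmax(eigvals)])

The learned weight vector approaches unit norm on its own, reflecting the 'ensures bounded weight norm' statement; no explicit renormalization step is needed.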

Referenced by (1)

Full triples — surface form annotated when it differs from this entity's canonical label.

Hebbian learning hasVariant Oja rule