Randomized ReLU

E565191

Randomized ReLU (RReLU) is a neural network activation function that, during training, introduces randomness into the slope of the negative part of ReLU to improve robustness and generalization.
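
A common formulation, given here for orientation (it is not itself a statement on this page; $l$ and $u$ are the lower and upper bounds of the negative-slope distribution listed under hasHyperparameter below):

```latex
f(x_i) =
  \begin{cases}
    x_i        & \text{if } x_i \ge 0 \\
    a_i \, x_i & \text{if } x_i < 0
  \end{cases}
  \qquad a_i \sim U(l, u) \quad \text{(sampled during training)}
```

At inference time the slope is typically fixed to the distribution mean, $a_i = (l + u)/2$, which is why the function is listed below as oftenDeterministicDuring the inference phase.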

Statements (45)

Predicate Object
instanceOf activation function
neural network component
affects model variance
network training dynamics
aimsTo improve generalization
improve robustness
reduce overfitting
appliedElementwiseTo neuron pre-activations
basedOn ReLU
canBeViewedAs noise injection method
category rectifier activation
comparedWith Leaky ReLU
Parametric ReLU
goal improve test performance
increase robustness to input perturbations
hasAbbreviation RReLU
hasHyperparameter lower bound of negative slope distribution
upper bound of negative slope distribution
hasInputDomain real numbers
hasOutputRange real numbers
hasProperty non-saturating for positive inputs
nonlinear
piecewise linear
random negative slope
stochastic
helpsWith gradient flow for negative inputs
regularization
implementedIn deep learning frameworks
introducesRandomnessIn slope of negative region
isDifferentiableAlmostEverywhere true
lessCommonIn output layers
modifies Rectified Linear Unit
negativeSlopeSampledFrom uniform distribution
oftenDeterministicDuring inference phase
reduces dying ReLU problem
relatedTo dropout
stochastic regularization techniques
requires random number generation
usedDuring training phase
usedFor classification tasks
image recognition tasks
regression tasks
usedIn convolutional neural networks
deep neural networks
hidden layers
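
Taken together, the statements above specify a complete forward pass: sample a per-element negative slope from a uniform distribution between the two hyperparameter bounds during training, and fall back to a fixed slope at inference. The sketch below is a minimal illustration of that behavior, assuming NumPy; the function name rrelu and the default bounds 1/8 and 1/3 are illustrative choices mirroring defaults popularized by common framework implementations, not statements on this page.

```python
import numpy as np

def rrelu(x, lower=1/8, upper=1/3, training=True, rng=None):
    """Randomized ReLU, applied elementwise to pre-activations.

    Training: negative inputs are scaled by a slope a ~ U(lower, upper),
    sampled independently per element (random negative slope).
    Inference: the slope is fixed to the distribution mean (lower+upper)/2,
    making the function deterministic.
    """
    rng = np.random.default_rng() if rng is None else rng
    if training:
        # requires random number generation: one slope per element
        slope = rng.uniform(lower, upper, size=x.shape)
    else:
        slope = (lower + upper) / 2.0
    return np.where(x >= 0, x, slope * x)

# Example: the same negative inputs map to different outputs across
# training calls, but to a single fixed output at inference.
x = np.array([-2.0, -0.5, 0.0, 1.5])
print(rrelu(x, training=True))   # stochastic for negative entries
print(rrelu(x, training=False))  # deterministic: slope = (1/8 + 1/3) / 2
```

Framework implementations covered by implementedIn (for example torch.nn.RReLU in PyTorch) switch between these two modes in the same way, via the module's train/eval state.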

Referenced by (1)

Full triples — surface form annotated when it differs from this entity's canonical label.

ReLU relatedFunction Randomized ReLU