ReLU

E134578

ReLU (Rectified Linear Unit) is a widely used activation function in neural networks. It outputs zero for negative inputs and the input value itself for positive inputs, i.e. f(x) = max(0, x), which makes it cheap to compute and helps mitigate the vanishing-gradient problem when training deep models.
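
As a minimal illustration of the definition above, the Python sketch below implements the forward pass and the subgradient used in backpropagation (function names are illustrative, not from the source; the value 0 is chosen at x == 0, where ReLU is not differentiable):

    import numpy as np

    def relu(x: np.ndarray) -> np.ndarray:
        # Forward pass: zero for negative inputs, identity for positive inputs.
        return np.maximum(0.0, x)

    def relu_grad(x: np.ndarray) -> np.ndarray:
        # Subgradient: 0 where x <= 0, 1 where x > 0.
        return (x > 0).astype(x.dtype)

    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(relu(x))       # [0.  0.  0.  0.5 2. ]
    print(relu_grad(x))  # [0. 0. 0. 1. 1.]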


Referenced by (1)

Subject (surface form when different)    Predicate
deep feedforward networks                canUseActivationFunction
