Derivative of the Rectified Linear Unit (ReLU) Activation Function
This function applies the derivative of the Rectified Linear Unit (ReLU) activation function to the input numeric vector.
dReLU(x)
x
: A numeric vector. All elements must be finite and non-missing.

A numeric vector where the derivative of the ReLU function has been applied to each element of x.
dReLU(c(-1, 0, 1, 2))

# Can also be used in rxode2:
x <- rxode2({
  r = dReLU(time)
})
e <- et(c(-1, 0, 1, 2))
rxSolve(x, e)
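The derivative of ReLU is 1 where the input is positive and 0 elsewhere (with 0 conventionally used at the non-differentiable point x = 0, matching dReLU(0) above). A minimal Python/NumPy sketch of that math, for illustration only (the function name d_relu is hypothetical, not part of rxode2):

```python
import numpy as np

def d_relu(x):
    # Derivative of ReLU: 1 for x > 0, else 0.
    # The subgradient value 0 is used at x = 0.
    x = np.asarray(x, dtype=float)
    return (x > 0).astype(float)

print(d_relu([-1, 0, 1, 2]))  # [0. 0. 1. 1.]
```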
Other Activation Functions: ELU(), GELU(), PReLU(), ReLU(), SELU(), Swish(), dELU(), dGELU(), dPReLU(), dSELU(), dSwish(), dlReLU(), dsoftplus(), lReLU(), softplus()