ReLU function

Rectified Linear Unit (ReLU) Activation Function

This function applies the Rectified Linear Unit (ReLU) activation function to the input numeric vector. The ReLU function is defined as the positive part of its argument: f(x) = max(0, x).
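The elementwise definition above can be sketched in base R with `pmax()`; this is an illustrative re-implementation, not the function exported by rxode2:

```r
# Minimal sketch of the ReLU definition f(x) = max(0, x),
# applied elementwise via base R's pmax(); rxode2 provides ReLU().
relu_sketch <- function(x) pmax(0, x)

relu_sketch(c(-1, 0, 1, 2))  # -> 0 0 1 2
```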

ReLU(x)

Arguments

  • x: A numeric vector. All elements must be finite and non-missing.

Returns

A numeric vector where the ReLU function has been applied to each element of x.

Examples

ReLU(c(-1, 0, 1, 2))

# Can also be used in rxode2:
x <- rxode2({
  r <- ReLU(time)
})
e <- et(c(-1, 0, 1, 2))
rxSolve(x, e)

See Also

Other Activation Functions: ELU(), GELU(), PReLU(), SELU(), Swish(), dELU(), dGELU(), dPReLU(), dReLU(), dSELU(), dSwish(), dlReLU(), dsoftplus(), lReLU(), softplus()

Author(s)

Matthew Fidler