nn_softplus function

Softplus module

Applies the element-wise function:

\mbox{Softplus}(x) = \frac{1}{\beta} * \log(1 + \exp(\beta * x))

nn_softplus(beta = 1, threshold = 20)

Arguments

  • beta: the \beta value for the Softplus formulation. Default: 1
  • threshold: values above this revert to a linear function. Default: 20

Description

Applies the element-wise function:

\mbox{Softplus}(x) = \frac{1}{\beta} * \log(1 + \exp(\beta * x))

Details

SoftPlus is a smooth approximation to the ReLU function and can be used to constrain the output of a machine to always be positive. For numerical stability the implementation reverts to the linear function when input \times \beta > threshold.
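The thresholding described above can be sketched in plain R (this is a reference illustration of the documented behaviour, not the package's internal implementation, which runs in libtorch):

```r
# Numerically stable softplus sketch, assuming the rule stated above:
# when beta * x > threshold, return x directly instead of the log form.
softplus_ref <- function(x, beta = 1, threshold = 20) {
  ifelse(beta * x > threshold,
         x,                              # linear regime: exp(beta * x) would overflow
         log1p(exp(beta * x)) / beta)    # smooth regime
}

softplus_ref(c(-1, 0, 1, 50))
```

For large inputs such as 50, the linear branch returns the input unchanged, matching the limit of the smooth formula while avoiding overflow in `exp()`.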

Shape

  • Input: (N, *) where * means any number of additional dimensions
  • Output: (N, *), same shape as the input

Examples

if (torch_is_installed()) {
  m <- nn_softplus()
  input <- torch_randn(2)
  output <- m(input)
}
  • Maintainer: Daniel Falbel
  • License: MIT + file LICENSE
  • Last published: 2025-02-14