nn_selu function

SELU module

Applied element-wise; see Details for the formula.

nn_selu(inplace = FALSE)

Arguments

  • inplace: (bool, optional) can optionally do the operation in-place. Default: FALSE (see the sketch below)
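As a minimal sketch of what the flag changes (the tensor here is freshly allocated and safe to overwrite; the comparison is illustrative, not part of the official example):

if (torch_is_installed()) {
  x <- torch_randn(3)
  m <- nn_selu(inplace = TRUE)
  y <- m(x)
  # with inplace = TRUE the activation is written into x itself,
  # so x and y hold the same values and no new tensor is allocated
  torch_equal(x, y)  # TRUE
}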

Details

\mbox{SELU}(x) = \mbox{scale} * (\max(0, x) + \min(0, \alpha * (\exp(x) - 1)))

with \alpha = 1.6732632423543772848170429916717 and \mbox{scale} = 1.0507009873554804934193349852946.
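To make the formula concrete, the following sketch recomputes it directly from the constants above and compares against the module's output, using torch_clamp to stand in for max(0, x) and min(0, ·):

if (torch_is_installed()) {
  alpha <- 1.6732632423543772848170429916717
  scale <- 1.0507009873554804934193349852946
  x <- torch_randn(5)
  # scale * (max(0, x) + min(0, alpha * (exp(x) - 1)))
  manual <- scale * (torch_clamp(x, min = 0) +
    torch_clamp(alpha * (torch_exp(x) - 1), max = 0))
  torch_allclose(manual, nn_selu()(x))  # TRUE
}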

More details can be found in the paper Self-Normalizing Neural Networks (https://arxiv.org/abs/1706.02515).

Shape

  • Input: (N, *) where * means any number of additional dimensions
  • Output: (N, *), same shape as the input
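For instance, the module passes any trailing dimensions through unchanged (a small sketch; the batch size 8 and image-like shape are arbitrary):

if (torch_is_installed()) {
  m <- nn_selu()
  x <- torch_randn(8, 3, 32, 32)  # (N, *) with three extra dimensions
  y <- m(x)
  y$shape  # 8 3 32 32, identical to the input shape
}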

Examples

if (torch_is_installed()) {
  m <- nn_selu()           # create the SELU module
  input <- torch_randn(2)  # a random input tensor
  output <- m(input)       # apply SELU element-wise
}
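SELU is typically used as the activation throughout a self-normalizing network. A hypothetical sketch of that pattern (the layer sizes are arbitrary):

if (torch_is_installed()) {
  # a small MLP using SELU between linear layers
  net <- nn_sequential(
    nn_linear(10, 32),
    nn_selu(),
    nn_linear(32, 1)
  )
  x <- torch_randn(4, 10)
  y <- net(x)  # shape: 4 1
}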