nn_silu function

Applies the Sigmoid Linear Unit (SiLU) function, element-wise. The SiLU function is also known as the swish function.
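
Element-wise, the SiLU computes the product of the input and its sigmoid (standard definition, stated here for reference):

silu(x) = x * sigmoid(x), where sigmoid(x) = 1 / (1 + exp(-x))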

nn_silu(inplace = FALSE)

Arguments

  • inplace: whether to perform the operation in-place, modifying the input tensor instead of allocating a new one. Default: FALSE
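
A minimal usage sketch follows; the tensor helpers torch_randn(), torch_sigmoid(), and torch_allclose() are standard torch functions assumed to be available and are not part of this page.

library(torch)

# Create the module and apply it element-wise to a tensor.
m <- nn_silu()
x <- torch_randn(4)
y <- m(x)

# SiLU multiplies each element by its sigmoid, so y should match
# x * sigmoid(x).
torch_allclose(y, x * torch_sigmoid(x))

# With inplace = TRUE the operation overwrites the input tensor.
m_inplace <- nn_silu(inplace = TRUE)
m_inplace(x)  # x now holds silu(x)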

Details

See Gaussian Error Linear Units (GELUs), where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was experimented with later.

  • Maintainer: Daniel Falbel
  • License: MIT + file LICENSE
  • Last published: 2025-02-14