Applies the Sigmoid Linear Unit (SiLU) function, element-wise: silu(x) = x * sigmoid(x), where sigmoid(x) is the logistic sigmoid. The SiLU function is also known as the swish function.
nn_silu(inplace = FALSE)
inplace
: whether to do the operation in-place. Default: FALSE
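A minimal usage sketch, assuming the torch R package is installed (torch_randn(), torch_sigmoid(), and torch_allclose() are standard torch tensor functions):

library(torch)

# construct the SiLU module and apply it element-wise
m <- nn_silu()
x <- torch_randn(2, 3)
y <- m(x)

# SiLU multiplies each input by its sigmoid: y = x * sigmoid(x)
torch_allclose(y, x * torch_sigmoid(x))  # TRUE

With inplace = TRUE the result is written back into the input tensor, which saves memory but overwrites the original values of x.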
See Gaussian Error Linear Units (GELUs), where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was experimented with later.