torch.nn.functional.silu#
- torch.nn.functional.silu(input, inplace=False)[source]#
Apply the Sigmoid Linear Unit (SiLU) function, element-wise: silu(x) = x * σ(x), where σ(x) is the logistic sigmoid.
The SiLU function is also known as the swish function.
Note
See Gaussian Error Linear Units (GELUs) where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function where the SiLU was experimented with later.
See
SiLU for more details.
- Return type
Tensor
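A minimal usage sketch, verifying that the functional form matches the element-wise definition silu(x) = x * σ(x):

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 1.0])

# Apply SiLU via the functional API
y = F.silu(x)

# SiLU is defined as x * sigmoid(x), element-wise
expected = x * torch.sigmoid(x)
print(torch.allclose(y, expected))  # True
```

Note that `silu(0) = 0`, and the function is non-monotonic for negative inputs, dipping slightly below zero before approaching it asymptotically.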