SiLU
- class torch.nn.modules.activation.SiLU(inplace=False)
Applies the Sigmoid Linear Unit (SiLU) function, element-wise:
silu(x) = x * σ(x), where σ(x) is the logistic sigmoid.
The SiLU function is also known as the swish function.
Note
See Gaussian Error Linear Units (GELUs), where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was experimented with later.
- Shape:
Input: (∗), where ∗ means any number of dimensions.
Output: (∗), same shape as the input.

Examples:
>>> m = nn.SiLU()
>>> input = torch.randn(2)
>>> output = m(input)
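As a quick sanity check, the module's output can be compared against the definition silu(x) = x * σ(x) (a minimal sketch, assuming PyTorch is installed):

```python
import torch
import torch.nn as nn

# Apply SiLU via the module API
m = nn.SiLU()
x = torch.randn(2, 3)
out = m(x)

# SiLU is defined as x * sigmoid(x); the two computations should match,
# and the output shape is the same as the input shape
expected = x * torch.sigmoid(x)
print(torch.allclose(out, expected))  # True
```

The same computation is also available functionally as `torch.nn.functional.silu(x)`, which avoids constructing a module when no state is needed.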