SELU#

classtorch.nn.SELU(inplace=False)[source]#

Applies the SELU function element-wise.

SELU(x) = scale ∗ (max(0, x) + min(0, α ∗ (exp(x) − 1)))

with α = 1.6732632423543772848170429916717 and scale = 1.0507009873554804934193349852946.
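The formula above can be checked numerically. A minimal sketch (the sample tensor values are arbitrary) that evaluates the definition by hand, using `clamp` for the `max(0, x)` and `min(0, ·)` terms, and compares it against the built-in `torch.nn.functional.selu`:

```python
import torch
import torch.nn.functional as F

# Constants from the SELU definition above.
alpha = 1.6732632423543772848170429916717
scale = 1.0507009873554804934193349852946

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

# max(0, x) is clamp(x, min=0); min(0, t) is clamp(t, max=0).
manual = scale * (
    torch.clamp(x, min=0) + torch.clamp(alpha * (torch.exp(x) - 1), max=0)
)

# The hand-written evaluation matches the library implementation.
assert torch.allclose(manual, F.selu(x))
```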

Warning

When using kaiming_normal or kaiming_normal_ for initialisation, nonlinearity='linear' should be used instead of nonlinearity='selu' in order to get Self-Normalizing Neural Networks. See torch.nn.init.calculate_gain() for more information.
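As a sketch of the warning above, a hypothetical linear layer (the sizes are arbitrary) initialised for a self-normalizing network with `nonlinearity='linear'`, which corresponds to a gain of 1:

```python
import torch
import torch.nn as nn

# Hypothetical layer for a SELU network; sizes chosen for illustration.
linear = nn.Linear(128, 128)

# Per the warning, use nonlinearity='linear' (gain 1), not 'selu'.
nn.init.kaiming_normal_(linear.weight, nonlinearity='linear')
nn.init.zeros_(linear.bias)
```

With `nonlinearity='linear'`, `kaiming_normal_` draws weights with standard deviation `1 / sqrt(fan_in)`, which is the variance the self-normalizing property assumes.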

More details can be found in the paper Self-Normalizing Neural Networks.

Parameters

inplace (bool, optional) – can optionally do the operation in-place. Default: False
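A short sketch of the inplace option (tensor values arbitrary): with inplace=True the module overwrites its input and returns the same tensor object, which saves memory but destroys the original values.

```python
import torch
import torch.nn as nn

x = torch.tensor([-1.0, 1.0])
m = nn.SELU(inplace=True)
y = m(x)

# In-place mode returns the input tensor itself, now holding SELU(x).
assert y is x
```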

Shape:
  • Input: (∗), where ∗ means any number of dimensions.

  • Output: (∗), same shape as the input.

[Plot of the SELU activation function.]

Examples:

>>> m = nn.SELU()
>>> input = torch.randn(2)
>>> output = m(input)
extra_repr()[source]#

Return the extra representation of the module.

Return type

str

forward(input)[source]#

Runs the forward pass.

Return type

Tensor