Mish#

class torch.nn.Mish(inplace=False)[source]#

Applies the Mish function, element-wise.

Mish: A Self Regularized Non-Monotonic Neural Activation Function.

\text{Mish}(x) = x \ast \text{Tanh}(\text{Softplus}(x))
Shape:
  • Input: (*), where * means any number of dimensions.

  • Output: (*), same shape as the input.

[Figure: plot of the Mish activation function.]

Examples:

>>> m = nn.Mish()
>>> input = torch.randn(2)
>>> output = m(input)
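As a quick sketch beyond the example above (not part of the official docs), the formula can be checked numerically by comparing the module's output against the definition `x * tanh(softplus(x))` built from `torch.tanh` and `torch.nn.functional.softplus`:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(4)

# Module form of the activation
mish = nn.Mish()

# Manual form: Mish(x) = x * Tanh(Softplus(x))
manual = x * torch.tanh(F.softplus(x))

print(torch.allclose(mish(x), manual))  # True
```

The same check works for inputs of any shape, since Mish is applied element-wise.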
extra_repr()[source]#

Return the extra representation of the module.

Return type

str

forward(input)[source]#

Runs the forward pass.

Return type

Tensor