torch.nn.functional.mish#

torch.nn.functional.mish(input, inplace=False)[source]#

Apply the Mish function, element-wise.

Mish: A Self Regularized Non-Monotonic Neural Activation Function.

\text{Mish}(x) = x * \text{Tanh}(\text{Softplus}(x))

See Mish for more details.

Return type

Tensor
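
As a quick sketch, the functional form can be checked against the formula above, since Mish is defined as the input multiplied by the tanh of its softplus:

```python
import torch
import torch.nn.functional as F

# Apply Mish element-wise to a sample tensor.
x = torch.tensor([-1.0, 0.0, 1.0])
out = F.mish(x)

# Verify against the definition: Mish(x) = x * tanh(softplus(x)).
expected = x * torch.tanh(F.softplus(x))
print(torch.allclose(out, expected))  # True
```

Note that Mish(0) = 0, and the function is non-monotonic for negative inputs, which distinguishes it from ReLU-style activations.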