CELU

class torch.nn.modules.activation.CELU(alpha=1.0, inplace=False)[source]

Applies the CELU function element-wise.

\text{CELU}(x) = \max(0, x) + \min(0, \alpha * (\exp(x/\alpha) - 1))

More details can be found in the paper Continuously Differentiable Exponential Linear Units.

Parameters
  • alpha (float) – the α value for the CELU formulation. Default: 1.0

  • inplace (bool) – can optionally do the operation in-place. Default: False

Shape:
  • Input: (*), where * means any number of dimensions.

  • Output: (*), same shape as the input.

[Plot of the CELU activation function]

Examples:

>>> m = nn.CELU()
>>> input = torch.randn(2)
>>> output = m(input)
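The paper's title refers to CELU being continuously differentiable, including at the origin where the two branches of the formula meet. A small numeric sketch of that property, using central finite differences on the pure-Python formula (the `celu` and `num_deriv` helpers are illustrative, not part of torch):

```python
import math

def celu(x, alpha=1.0):
    # Same formula as in the definition above.
    return max(0.0, x) + min(0.0, alpha * (math.exp(x / alpha) - 1.0))

def num_deriv(f, x, h=1e-6):
    # Central finite-difference estimate of f'(x).
    return (f(x + h) - f(x - h)) / (2.0 * h)

# The analytic derivative is 1 for x > 0 and exp(x / alpha) for x < 0;
# both branches approach 1 as x -> 0, so the derivative is continuous there.
left = num_deriv(celu, -1e-4)
right = num_deriv(celu, 1e-4)
print(left, right)  # both ≈ 1.0
```

This continuity of the first derivative is what distinguishes CELU (for any α) from ELU, whose derivative is only continuous at 0 when α = 1.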
extra_repr()[source]

Return the extra representation of the module.

Return type

str

forward(input)[source]

Runs the forward pass.

Return type

Tensor