
Softmax

class torch.nn.Softmax(dim=None)

Applies the Softmax function to an n-dimensional input Tensor.

It rescales the input so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1.

Softmax is defined as:

\text{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}
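As a quick numeric check of the definition (an illustrative sketch, not part of the module's API; the input values are made up):

>>> import torch
>>> x = torch.tensor([1.0, 2.0, 3.0])
>>> torch.exp(x) / torch.exp(x).sum()  # exp(x_i) / sum_j exp(x_j), computed by hand
tensor([0.0900, 0.2447, 0.6652])
>>> torch.softmax(x, dim=0)  # the built-in agrees
tensor([0.0900, 0.2447, 0.6652])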

When the input Tensor is a sparse tensor, the unspecified values are treated as -inf.
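The effect is the same as placing -inf at those positions in a dense tensor, since exp(-inf) = 0; a minimal dense sketch of that behavior (values made up):

>>> import torch
>>> x = torch.tensor([1.0, float('-inf'), 3.0])
>>> torch.softmax(x, dim=0)  # the -inf entry receives probability 0
tensor([0.1192, 0.0000, 0.8808])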

Shape:
  • Input: (*), where * means any number of additional dimensions

  • Output: (*), same shape as the input

Returns:

a Tensor of the same dimension and shape as the input, with values in the range [0, 1]

Parameters:

dim (int) – A dimension along which Softmax will be computed (so every slice along dim will sum to 1; see the sketch below).

Return type:

None
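A small doctest-style sketch of the dim parameter (tensor shapes here are made up for illustration): slices along the chosen dimension are normalized independently.

>>> import torch
>>> import torch.nn as nn
>>> x = torch.randn(2, 3)
>>> torch.allclose(nn.Softmax(dim=1)(x).sum(dim=1), torch.ones(2))  # each row sums to 1
True
>>> torch.allclose(nn.Softmax(dim=0)(x).sum(dim=0), torch.ones(3))  # each column sums to 1
True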

Note

This module doesn’t work directly with NLLLoss, which expects the Log to be computed between the Softmax and itself. Use LogSoftmax instead (it’s faster and has better numerical properties).
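A short sketch of the pairing described above (the logits and targets are made up); nn.CrossEntropyLoss fuses LogSoftmax and NLLLoss into one step:

>>> import torch
>>> import torch.nn as nn
>>> logits = torch.randn(4, 10)  # batch of 4 samples, 10 classes
>>> target = torch.randint(0, 10, (4,))
>>> loss = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), target)
>>> torch.allclose(loss, nn.CrossEntropyLoss()(logits, target))  # equivalent one-step loss
True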

Examples:

>>> m = nn.Softmax(dim=1)
>>> input = torch.randn(2, 3)
>>> output = m(input)

extra_repr()

Return the extra representation of the module.

Return type:

str

forward(input)

Runs the forward pass.

Return type:

Tensor