torch.nn.functional.glu

torch.nn.functional.glu(input, dim=-1) → Tensor

The gated linear unit. Computes:

\text{GLU}(a, b) = a \otimes \sigma(b)

where input is split in half along dim to form a and b, \sigma is the sigmoid function and \otimes is the element-wise product between matrices.

See Language Modeling with Gated Convolutional Networks.
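
The following is a minimal sketch illustrating the formula above, comparing glu against an equivalent manual computation with torch.chunk and torch.sigmoid; the tensor shape is an arbitrary example.

```python
import torch
import torch.nn.functional as F

# Input whose last dimension is even so it can be split in half.
x = torch.randn(4, 6)

# glu splits x into a (first half) and b (second half) along dim=-1,
# then returns a * sigmoid(b).
out = F.glu(x, dim=-1)

# Manual computation of the same quantity.
a, b = x.chunk(2, dim=-1)
manual = a * torch.sigmoid(b)

print(out.shape)                    # torch.Size([4, 3])
print(torch.allclose(out, manual))  # True
```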

Parameters:
  • input (Tensor) – input tensor

  • dim (int) – dimension on which to split the input. Default: -1

Return type:

Tensor
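
A short usage sketch showing the effect of the dim argument; the shapes are illustrative, not prescribed by the API.

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 8, 5)

# Split along dim=1: size 8 is halved into a and b of size 4 each.
out = F.glu(x, dim=1)
print(out.shape)  # torch.Size([2, 4, 5])

# Note: the size of the input along `dim` must be even,
# since it is split in half to form a and b.
```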