GLU

class torch.nn.modules.activation.GLU(dim=-1)

Applies the gated linear unit function.

GLU(a, b) = a ⊗ σ(b), where a is the first half of the input matrices and b is the second half.

Parameters

dim (int) – the dimension on which to split the input. Default: -1

Shape:
Input: (∗1, N, ∗2), where ∗ means any number of additional dimensions and N must be even

Output: (∗1, M, ∗2), where M = N / 2
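The shape rule above can be sketched with a non-default `dim`; this is an illustrative example (the tensor sizes are assumptions, not from the original docs). Splitting along the channel dimension of an (N, C, L) tensor halves C, so C must be even:

```python
import torch
import torch.nn as nn

# Split along dim=1 (the channel dimension of an (N, C, L) tensor).
# C=10 is halved to M=5 in the output.
glu = nn.GLU(dim=1)
x = torch.randn(8, 10, 20)   # N=8, C=10, L=20
y = glu(x)
print(y.shape)               # torch.Size([8, 5, 20])
```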

Examples:

>>> m = nn.GLU()
>>> input = torch.randn(4, 2)
>>> output = m(input)
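The definition GLU(a, b) = a ⊗ σ(b) can be checked by hand against the module output. This is an illustrative sketch (the tensor sizes and variable names are assumptions, not part of the API):

```python
import torch
import torch.nn as nn

# GLU along the last dimension should equal a * sigmoid(b),
# where a is the first half of the input and b the second half.
m = nn.GLU(dim=-1)
x = torch.randn(4, 6)           # last dim must be even; output last dim is 3
a, b = x.chunk(2, dim=-1)       # a: first half, b: second half
expected = a * torch.sigmoid(b)
out = m(x)
print(torch.allclose(out, expected))  # True
print(out.shape)                      # torch.Size([4, 3])
```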
extra_repr()

Return the extra representation of the module.

Return type

str

forward(input)

Runs the forward pass.

Return type

Tensor