GELU
- class torch.nn.GELU(approximate='none')[source]
Applies the Gaussian Error Linear Units function:

$$\text{GELU}(x) = x * \Phi(x)$$

where $\Phi(x)$ is the Cumulative Distribution Function for Gaussian Distribution.

When the approximate argument is 'tanh', GELU is estimated with:

$$\text{GELU}(x) = 0.5 * x * \left(1 + \text{Tanh}\left(\sqrt{2 / \pi} * \left(x + 0.044715 * x^{3}\right)\right)\right)$$
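The two forms can be compared numerically. The sketch below is not part of the reference itself: it writes the exact CDF form with torch.erf and the tanh estimate with torch.tanh, then checks both against nn.GELU; the helper names gelu_exact and gelu_tanh are illustrative only.

```python
import math

import torch
import torch.nn as nn

def gelu_exact(x: torch.Tensor) -> torch.Tensor:
    # Phi(x) = 0.5 * (1 + erf(x / sqrt(2))) is the standard Gaussian CDF.
    return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: torch.Tensor) -> torch.Tensor:
    # Tanh-based estimate used when approximate='tanh'.
    return 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x.pow(3))))

x = torch.randn(8)
assert torch.allclose(gelu_exact(x), nn.GELU(approximate='none')(x), atol=1e-6)
assert torch.allclose(gelu_tanh(x), nn.GELU(approximate='tanh')(x), atol=1e-6)
```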
- Parameters:
approximate (str, optional) – the gelu approximation algorithm to use: 'none' | 'tanh'. Default: 'none'
- Shape:
Input: (*), where * means any number of dimensions.
Output: (*), same shape as the input.

Examples:
>>> m = nn.GELU()
>>> input = torch.randn(2)
>>> output = m(input)
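The tanh estimate is selected the same way, via the approximate argument documented above; this extra snippet is illustrative:

>>> m = nn.GELU(approximate='tanh')
>>> output = m(input)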