Struct TransformerEncoderLayerOptions#

Struct Documentation#

struct TransformerEncoderLayerOptions#

Options for the TransformerEncoderLayer

Example:

auto options = TransformerEncoderLayerOptions(512, 8).dropout(0.2);

Public Functions

TransformerEncoderLayerOptions(int64_t d_model, int64_t nhead)#
inline auto d_model(const int64_t &new_d_model) -> decltype(*this)#

the number of expected features in the input

inline auto d_model(int64_t &&new_d_model) -> decltype(*this)#
inline const int64_t &d_model() const noexcept#
inline int64_t &d_model() noexcept#
inline auto nhead(const int64_t &new_nhead) -> decltype(*this)#

the number of heads in the multi-head attention models

inline auto nhead(int64_t &&new_nhead) -> decltype(*this)#
inline const int64_t &nhead() const noexcept#
inline int64_t &nhead() noexcept#
inline auto dim_feedforward(const int64_t &new_dim_feedforward) -> decltype(*this)#

the dimension of the feedforward network model; default is 2048

inline auto dim_feedforward(int64_t &&new_dim_feedforward) -> decltype(*this)#
inline const int64_t &dim_feedforward() const noexcept#
inline int64_t &dim_feedforward() noexcept#
inline auto dropout(const double &new_dropout) -> decltype(*this)#

the dropout value; default is 0.1

inline auto dropout(double &&new_dropout) -> decltype(*this)#
inline const double &dropout() const noexcept#
inline double &dropout() noexcept#
inline auto activation(const activation_t &new_activation) -> decltype(*this)#

the activation function of the intermediate layer; can be torch::kReLU, torch::kGELU, or a unary callable.

Default: torch::kReLU

inline auto activation(activation_t &&new_activation) -> decltype(*this)#
inline const activation_t &activation() const noexcept#
inline activation_t &activation() noexcept#