
LSTM

class torch.ao.nn.quantizable.LSTM(input_size, hidden_size, num_layers=1, bias=True, batch_first=False, dropout=0.0, bidirectional=False, device=None, dtype=None, *, split_gates=False)[source]

A quantizable long short-term memory (LSTM).

For the description and the argument types, please refer to torch.nn.LSTM.

Variables

layers – instances of the _LSTMLayer

Note

To access the weights and biases, you need to access them per layer. See the examples below.

Examples:

>>> import torch
>>> import torch.ao.nn.quantizable as nnqa
>>> rnn = nnqa.LSTM(10, 20, 2)
>>> input = torch.randn(5, 3, 10)
>>> h0 = torch.randn(2, 3, 20)
>>> c0 = torch.randn(2, 3, 20)
>>> output, (hn, cn) = rnn(input, (h0, c0))
>>> # To get the weights:
>>> print(rnn.layers[0].weight_ih)
tensor([[...]])
>>> print(rnn.layers[0].weight_hh)
AssertionError: There is no reverse path in the non-bidirectional layer
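As a sanity check on the example above, the input and output tensors follow the same shape conventions as torch.nn.LSTM (an assumption based on the quantizable module mirroring the float LSTM interface; the default batch_first=False is used here, so tensors are laid out as (seq_len, batch, feature)):

```python
import torch
import torch.ao.nn.quantizable as nnqa

# Same configuration as the docstring example above.
seq_len, batch, input_size, hidden_size, num_layers = 5, 3, 10, 20, 2
rnn = nnqa.LSTM(input_size, hidden_size, num_layers)

x = torch.randn(seq_len, batch, input_size)            # (5, 3, 10)
h0 = torch.randn(num_layers, batch, hidden_size)       # (2, 3, 20)
c0 = torch.randn(num_layers, batch, hidden_size)       # (2, 3, 20)
output, (hn, cn) = rnn(x, (h0, c0))

# output holds the last layer's hidden state at every time step;
# hn and cn hold the final time step's state for every layer.
print(tuple(output.shape))  # (5, 3, 20)
print(tuple(hn.shape))      # (2, 3, 20)
print(tuple(cn.shape))      # (2, 3, 20)
```

With batch_first=True the input and output would instead be laid out as (batch, seq_len, feature), while hn and cn keep the (num_layers, batch, hidden_size) layout.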