torch.nn.utils.rnn.pad_packed_sequence

torch.nn.utils.rnn.pad_packed_sequence(sequence, batch_first=False, padding_value=0.0, total_length=None)
Pad a packed batch of variable length sequences.

It is an inverse operation to `pack_padded_sequence()`. The returned Tensor's data will be of size `T x B x *` (if `batch_first` is `False`) or `B x T x *` (if `batch_first` is `True`), where `T` is the length of the longest sequence and `B` is the batch size.

Example
```python
>>> from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence
>>> seq = torch.tensor([[1, 2, 0], [3, 0, 0], [4, 5, 6]])
>>> lens = [2, 1, 3]
>>> packed = pack_padded_sequence(
...     seq, lens, batch_first=True, enforce_sorted=False
... )
>>> packed
PackedSequence(data=tensor([4, 1, 3, 5, 2, 6]), batch_sizes=tensor([3, 2, 1]),
               sorted_indices=tensor([2, 0, 1]), unsorted_indices=tensor([1, 2, 0]))
>>> seq_unpacked, lens_unpacked = pad_packed_sequence(packed, batch_first=True)
>>> seq_unpacked
tensor([[1, 2, 0],
        [3, 0, 0],
        [4, 5, 6]])
>>> lens_unpacked
tensor([2, 1, 3])
```
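As a further illustration, here is a minimal sketch of how `pad_packed_sequence` is typically paired with a recurrent layer; the layer sizes, batch shape, and lengths below are illustrative assumptions, not part of the original example.

```python
import torch
from torch import nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

batch, max_len, feat, hidden = 3, 5, 4, 8          # assumed sizes for this sketch
x = torch.randn(batch, max_len, feat)              # padded input, B x T x *
lengths = torch.tensor([5, 3, 2])                  # true length of each sequence

rnn = nn.LSTM(feat, hidden, batch_first=True)
packed = pack_padded_sequence(x, lengths, batch_first=True)
packed_out, _ = rnn(packed)                        # the RNN skips the padding
out, out_lens = pad_packed_sequence(packed_out, batch_first=True)

print(out.shape)    # torch.Size([3, 5, 8])
print(out_lens)     # tensor([5, 3, 2])
```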
Note

`total_length` is useful to implement the `pack sequence -> recurrent network -> unpack sequence` pattern in a `Module` wrapped in `DataParallel`. See this FAQ section for details.

- Parameters
sequence (PackedSequence) – batch to pad
batch_first (bool, optional) – if `True`, the output will be in `B x T x *` format, `T x B x *` otherwise.

padding_value (float, optional) – values for padded elements.
total_length (int, optional) – if not `None`, the output will be padded to have length `total_length`. This method will throw `ValueError` if `total_length` is less than the max sequence length in `sequence`.
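A short sketch of both `total_length` behaviors described above; the tensor values are made up for illustration.

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

seq = torch.tensor([[1, 2, 0], [3, 0, 0]])
packed = pack_padded_sequence(seq, lengths=[2, 1], batch_first=True)

# Pad out to 5 steps instead of the batch maximum of 2.
padded, lens = pad_packed_sequence(packed, batch_first=True, total_length=5)
print(padded.shape)   # torch.Size([2, 5])

# A total_length smaller than the longest sequence raises ValueError.
try:
    pad_packed_sequence(packed, batch_first=True, total_length=1)
except ValueError:
    print("too short")
```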
- Returns
Tuple of Tensor containing the padded sequence, and a Tensor containing the list of lengths of each sequence in the batch. Batch elements will be re-ordered as they were ordered originally when the batch was passed to `pack_padded_sequence` or `pack_sequence`.

- Return type

Tuple[Tensor, Tensor]
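The re-ordering guarantee above can be checked with a round trip: packing with `enforce_sorted=False` permutes the batch internally, and `pad_packed_sequence` restores the original order.

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

seq = torch.tensor([[1, 2, 0], [3, 0, 0], [4, 5, 6]])
lens = [2, 1, 3]

# Packing sorts the batch internally (enforce_sorted=False permits unsorted input).
packed = pack_padded_sequence(seq, lens, batch_first=True, enforce_sorted=False)
unpacked, out_lens = pad_packed_sequence(packed, batch_first=True)

assert torch.equal(unpacked, seq)       # original batch order restored
assert out_lens.tolist() == lens        # original lengths restored
```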