
torch.nn.utils.rnn.pad_packed_sequence

torch.nn.utils.rnn.pad_packed_sequence(sequence, batch_first=False, padding_value=0.0, total_length=None)[source]

Pad a packed batch of variable length sequences.

It is an inverse operation to pack_padded_sequence().

The returned Tensor’s data will be of size T x B x * (if batch_first is False) or B x T x * (if batch_first is True), where T is the length of the longest sequence and B is the batch size.

Example

>>> from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence
>>> seq = torch.tensor([[1, 2, 0], [3, 0, 0], [4, 5, 6]])
>>> lens = [2, 1, 3]
>>> packed = pack_padded_sequence(
...     seq, lens, batch_first=True, enforce_sorted=False
... )
>>> packed
PackedSequence(data=tensor([4, 1, 3, 5, 2, 6]), batch_sizes=tensor([3, 2, 1]),
               sorted_indices=tensor([2, 0, 1]), unsorted_indices=tensor([1, 2, 0]))
>>> seq_unpacked, lens_unpacked = pad_packed_sequence(packed, batch_first=True)
>>> seq_unpacked
tensor([[1, 2, 0],
        [3, 0, 0],
        [4, 5, 6]])
>>> lens_unpacked
tensor([2, 1, 3])

Note

total_length is useful to implement the pack sequence -> recurrent network -> unpack sequence pattern in a Module wrapped in DataParallel. See this FAQ section for details.
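As a minimal sketch of the total_length behavior described above: reusing the sequences from the Example, passing total_length=5 (an arbitrary length chosen here for illustration) pads the output past the longest sequence in the batch, which is 3.

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

seq = torch.tensor([[1, 2, 0], [3, 0, 0], [4, 5, 6]])
lens = [2, 1, 3]
packed = pack_padded_sequence(seq, lens, batch_first=True, enforce_sorted=False)

# Pad to a fixed total_length of 5 instead of the batch maximum of 3;
# the two extra time steps are filled with padding_value (default 0.0).
padded, out_lens = pad_packed_sequence(packed, batch_first=True, total_length=5)
print(padded.shape)   # torch.Size([3, 5])
print(out_lens)       # tensor([2, 1, 3]) - true lengths are unchanged
```

Requesting a total_length smaller than 3 here would raise the ValueError documented under Parameters.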

Parameters
  • sequence (PackedSequence) – batch to pad

  • batch_first (bool, optional) – if True, the output will be in B x T x * format, T x B x * otherwise.

  • padding_value (float, optional) – value for padded elements.

  • total_length (int, optional) – if not None, the output will be padded to have length total_length. This method will throw ValueError if total_length is less than the max sequence length in sequence.

Returns

Tuple of Tensor containing the padded sequence, and a Tensor containing the list of lengths of each sequence in the batch. Batch elements will be re-ordered as they were ordered originally when the batch was passed to pack_padded_sequence or pack_sequence.

Return type

tuple[torch.Tensor, torch.Tensor]
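A short sketch of the padding_value parameter, using pack_sequence on three illustrative 1-D sequences (already sorted by descending length, so the default enforce_sorted=True holds): padded positions are filled with -1 instead of the default 0.0.

```python
import torch
from torch.nn.utils.rnn import pack_sequence, pad_packed_sequence

# Three variable-length sequences in descending length order.
seqs = [torch.tensor([1, 2, 3]), torch.tensor([4, 5]), torch.tensor([6])]
packed = pack_sequence(seqs)

# Unpack with a custom fill value for the padded positions.
padded, lens = pad_packed_sequence(packed, batch_first=True, padding_value=-1.0)
print(padded)
# tensor([[ 1,  2,  3],
#         [ 4,  5, -1],
#         [ 6, -1, -1]])
print(lens)
# tensor([3, 2, 1])
```

A non-default padding_value such as -1 can make padded positions easy to distinguish from real zero-valued data downstream.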