PackedSequence(data, batch_sizes=None, sorted_indices=None, unsorted_indices=None)¶
Holds the data and list of batch_sizes of a packed sequence.
All RNN modules accept packed sequences as inputs.
Instances of this class should never be created manually. They are meant to be instantiated by functions like pack_padded_sequence().
Batch sizes represent the number of elements at each sequence step in the batch, not the varying sequence lengths passed to pack_padded_sequence(). For instance, given data abc and x the PackedSequence would contain data axbc with batch_sizes=[2,1,1].
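The packing scheme above can be sketched in plain Python (lists standing in for tensors; `pack` below is a hypothetical helper for illustration, not the PyTorch API — the real work is done by pack_padded_sequence()):

```python
def pack(sequences):
    # sequences must already be sorted by decreasing length, mirroring
    # the order pack_padded_sequence() uses internally.
    max_len = len(sequences[0])
    data, batch_sizes = [], []
    for t in range(max_len):
        # take step t from every sequence that is still "alive"
        step = [seq[t] for seq in sequences if len(seq) > t]
        data.extend(step)
        batch_sizes.append(len(step))
    return data, batch_sizes

data, batch_sizes = pack(["abc", "x"])
print(data)         # ['a', 'x', 'b', 'c']
print(batch_sizes)  # [2, 1, 1]
```

Note that batch_sizes has one entry per time step (the length of the longest sequence), and its entries sum to the total number of packed elements.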
data (Tensor) – Tensor containing packed sequence
batch_sizes (Tensor) – Tensor of integers holding information about the batch size at each sequence step
sorted_indices (Tensor, optional) – Tensor of integers holding how this PackedSequence is constructed from sequences.
unsorted_indices (Tensor, optional) – Tensor of integers holding how to recover the original sequences in the correct order.
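The two index fields are inverse permutations of each other: sorted_indices records the order in which sequences were rearranged for packing, and unsorted_indices undoes it. A small pure-Python sketch (plain lists standing in for the int64 tensors):

```python
def inverse_permutation(perm):
    # invert a permutation: if perm maps original -> packed order,
    # the result maps packed -> original order.
    inv = [0] * len(perm)
    for position, source in enumerate(perm):
        inv[source] = position
    return inv

sorted_indices = [2, 0, 1]      # example order used to build the packed batch
unsorted_indices = inverse_permutation(sorted_indices)
print(unsorted_indices)         # [1, 2, 0]

batch = ["seq0", "seq1", "seq2"]
packed_order = [batch[i] for i in sorted_indices]
restored = [packed_order[i] for i in unsorted_indices]
print(restored == batch)        # True: original order recovered
```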
batch_sizes should always be a CPU torch.int64 tensor.
This invariant is maintained throughout the lifetime of a PackedSequence, and all functions that construct a PackedSequence in PyTorch respect it (i.e., they only pass in tensors conforming to this constraint).
batch_sizes¶
Alias for field number 1
count(value, /)¶
Return number of occurrences of value.
data¶
Alias for field number 0
index(value, start=0, stop=9223372036854775807, /)¶
Return first index of value.
Raises ValueError if the value is not present.
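count() and index() are inherited tuple methods: PackedSequence is a NamedTuple over its four fields. A toy stand-in (not the real class, which carries tensors) shows where they come from:

```python
from collections import namedtuple

# Illustrative stand-in for PackedSequence's NamedTuple structure.
PackedSeq = namedtuple(
    "PackedSeq", "data batch_sizes sorted_indices unsorted_indices"
)

p = PackedSeq("axbc", [2, 1, 1], None, None)
print(p.index([2, 1, 1]))  # 1 -> field position of batch_sizes
print(p.count(None))       # 2 -> both index fields are None here
```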
is_cuda¶
Returns true if self.data is stored on a GPU.
is_pinned()¶
Returns true if self.data is stored in pinned memory.
sorted_indices¶
Alias for field number 2
to(*args, **kwargs)¶
Performs dtype and/or device conversion on self.data.
It has a similar signature to torch.Tensor.to(), except optional arguments like non_blocking and copy should be passed as kwargs, not args, or they will not apply to the index tensors.
If the self.data Tensor already has the correct torch.dtype and torch.device, then self is returned. Otherwise, a copy with the desired configuration is returned.
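This "return self when nothing changes" contract can be sketched with a toy immutable type (hypothetical names; dataclasses stand in for the tensor machinery):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class FakePacked:
    # toy stand-in for PackedSequence, only to illustrate the
    # identity shortcut of .to(): no copy when already converted.
    data: tuple
    dtype: str

    def to(self, dtype):
        if self.dtype == dtype:
            return self                    # identity: no copy made
        return replace(self, dtype=dtype)  # otherwise a new instance

p = FakePacked((1, 2, 3), "int64")
print(p.to("int64") is p)    # True: already the right dtype
print(p.to("float32") is p)  # False: a converted copy is returned
```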
unsorted_indices¶
Alias for field number 3