Attention

June 2024 Status Update: Removing DataPipes and DataLoader V2

We are re-focusing the torchdata repo to be an iterative enhancement of torch.utils.data.DataLoader. We do not plan to continue developing or maintaining the [DataPipes] and [DataLoaderV2] solutions, and they will be removed from the torchdata repo. We will also be revisiting the DataPipes references in pytorch/pytorch. In release torchdata==0.8.0 (July 2024) they will be marked as deprecated, and in 0.9.0 (October 2024) they will be deleted. Existing users are advised to pin to torchdata==0.8.0 or an older version until they are able to migrate away. Subsequent releases will not include DataPipes or DataLoaderV2. Please reach out if you have suggestions or comments (please use this issue for feedback).

Shuffler

class torchdata.datapipes.iter.Shuffler(datapipe: IterDataPipe[_T_co], *, buffer_size: int = 10000, unbatch_level: int = 0)

Shuffle the input DataPipe with a buffer (functional name: shuffle).

The buffer is first filled with buffer_size elements from the datapipe. Then, each item is yielded from the buffer via reservoir-style sampling as the DataPipe is iterated.

buffer_size must be larger than 0. With buffer_size == 1, the datapipe is not shuffled. To fully shuffle all elements of the datapipe, buffer_size must be greater than or equal to the size of the datapipe.
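The buffer-based shuffling described above can be sketched in plain Python. This is an illustrative implementation of the technique, not torchdata's actual code: the buffer fills with buffer_size items, after which each incoming item replaces a randomly chosen buffer slot whose occupant is yielded, and the remaining buffer is shuffled and drained at the end.

```python
import random

def buffered_shuffle(iterable, buffer_size=10000, rng=None):
    """Sketch of buffer-based shuffling (not the torchdata implementation).

    Fills a buffer of buffer_size items, then repeatedly yields a random
    buffer slot and refills it from the stream; drains the buffer at the end.
    """
    if buffer_size < 1:
        raise ValueError("buffer_size must be larger than 0")
    rng = rng or random.Random()
    buffer = []
    for item in iterable:
        if len(buffer) < buffer_size:
            # Fill the buffer before yielding anything.
            buffer.append(item)
            continue
        # Yield a random element and replace it with the new item.
        idx = rng.randrange(buffer_size)
        yield buffer[idx]
        buffer[idx] = item
    # Input exhausted: shuffle and drain what remains in the buffer.
    rng.shuffle(buffer)
    yield from buffer
```

Note that with buffer_size == 1 the single buffer slot is always chosen, so elements pass through in their original order, matching the behavior documented above.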

When it is used with torch.utils.data.DataLoader, the methods to set up random seed are different based on num_workers.

For single-process mode (num_workers == 0), the random seed is set before the DataLoader in the main process. For multi-process mode (num_workers > 0), worker_init_fn is used to set up a random seed for each worker process.
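The seeding scheme above can be sketched with the standard library alone. The BASE_SEED constant and the seed-derivation formula (base seed plus worker id) are illustrative assumptions, not a torchdata API; the only fact assumed from torch.utils.data is that DataLoader calls worker_init_fn with the worker's integer id in each worker process.

```python
import random

BASE_SEED = 12345  # hypothetical base seed, chosen for illustration

def worker_init_fn(worker_id: int) -> None:
    """Derive a distinct, reproducible seed for each worker process.

    In multi-process mode, DataLoader invokes this in every worker,
    passing that worker's id (0, 1, ..., num_workers - 1).
    """
    random.seed(BASE_SEED + worker_id)

# Single-process mode (num_workers == 0): seed once in the main process.
random.seed(BASE_SEED)

# Multi-process mode (num_workers > 0): pass the hook to DataLoader, e.g.
#   DataLoader(dataset, num_workers=4, worker_init_fn=worker_init_fn)
```

Deriving each worker's seed from a shared base keeps runs reproducible while ensuring different workers do not produce identical shuffle orders.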

Parameters:
  • datapipe – The IterDataPipe being shuffled

  • buffer_size – The buffer size for shuffling (default: 10000)

  • unbatch_level – Specifies whether the source data needs to be unbatched before the shuffle is applied

Example

>>> # xdoctest: +SKIP
>>> from torchdata.datapipes.iter import IterableWrapper
>>> dp = IterableWrapper(range(10))
>>> shuffle_dp = dp.shuffle()
>>> list(shuffle_dp)
[0, 4, 1, 6, 3, 2, 9, 5, 7, 8]
