Prefetcher

class torchdata.datapipes.iter.Prefetcher(source_datapipe, buffer_size: int = 10)

Prefetches elements from the source DataPipe and puts them into a buffer (functional name: prefetch). Prefetching performs the operations (e.g. I/O, computations) of the DataPipes up to this one ahead of time and stores the results in the buffer, ready to be consumed by the subsequent DataPipe. It has no effect other than getting samples ready ahead of time.

This is used by MultiProcessingReadingService when the arguments worker_prefetch_cnt (for prefetching within each worker process) or main_prefetch_cnt (for prefetching in the main loop) are greater than 0.
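
As a rough sketch of how those arguments are passed (assuming the torchdata.dataloader2 DataLoader2 and MultiProcessingReadingService APIs; the worker count and prefetch sizes below are illustrative, not recommended values):

>>> from torchdata.datapipes.iter import IterableWrapper
>>> from torchdata.dataloader2 import DataLoader2, MultiProcessingReadingService
>>> dp = IterableWrapper(range(1000)).shuffle().sharding_filter()
>>> rs = MultiProcessingReadingService(
...     num_workers=2,
...     worker_prefetch_cnt=10,  # prefetch buffer inside each worker process
...     main_prefetch_cnt=10,    # prefetch buffer in the main process
... )
>>> dl = DataLoader2(dp, reading_service=rs)
>>> for x in dl:
...     pass
>>> dl.shutdown()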

Beyond the built-in use cases, this can be useful to place after DataPipes with expensive I/O operations (e.g. ones that take a long time to request a file from a remote server), as the example below shows.

Parameters:
  • source_datapipe – IterDataPipe from which samples are prefetched

  • buffer_size – the size of the buffer which stores the prefetched samples

Example

>>> from torchdata.datapipes.iter import IterableWrapper
>>> dp = IterableWrapper(file_paths).open_files().prefetch(5)
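
For a slightly more complete sketch, the snippet below defines placeholder file paths (hypothetical; substitute your own) and consumes the opened streams; with buffer_size=5, up to 5 opened streams are kept ready ahead of the consumer:

>>> from torchdata.datapipes.iter import IterableWrapper
>>> file_paths = ["data/a.txt", "data/b.txt"]  # hypothetical paths
>>> dp = IterableWrapper(file_paths).open_files(mode="b").prefetch(buffer_size=5)
>>> for path, stream in dp:  # each element is a (path, stream) tuple
...     payload = stream.read()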
