InMemoryCacheHolder

class torchdata.datapipes.iter.InMemoryCacheHolder(source_dp: IterDataPipe[T_co], size: Optional[int] = None)

Stores elements from the source DataPipe in memory, up to a size limit if specified (functional name: in_memory_cache). The cache is FIFO: once it is full, further elements are not added until earlier ones have been yielded and popped off the cache.

Parameters:
  • source_dp – source DataPipe from which elements are read and stored in memory

  • size – The maximum size (in megabytes) that this DataPipe can hold in memory. This defaults to unlimited.
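For illustration, the holder can also be constructed directly rather than through the functional form. The following is a minimal sketch, assuming an IterableWrapper source as in the example below; it should behave the same as source_dp.in_memory_cache(size=5).

>>> from torchdata.datapipes.iter import InMemoryCacheHolder, IterableWrapper
>>> source_dp = IterableWrapper(range(10))
>>> cache_dp = InMemoryCacheHolder(source_dp, size=5)  # size limit is given in megabytes
>>> list(cache_dp)
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]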

Example

>>> from torchdata.datapipes.iter import IterableWrapper
>>> source_dp = IterableWrapper(range(10))
>>> cache_dp = source_dp.in_memory_cache(size=5)
>>> list(cache_dp)
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
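The benefit of the cache shows up when the same pipe is iterated more than once, for example across epochs. The following is a minimal sketch, assuming the default unlimited size so every element fits in the cache; expensive_load and the calls list are hypothetical stand-ins for a costly read or decode step.

>>> from torchdata.datapipes.iter import IterableWrapper
>>> calls = []
>>> def expensive_load(x):
...     calls.append(x)  # stand-in for slow I/O or decoding
...     return x * 10
>>> source_dp = IterableWrapper(range(4)).map(expensive_load)
>>> cache_dp = source_dp.in_memory_cache()
>>> list(cache_dp)  # first pass reads from the source and fills the cache
[0, 10, 20, 30]
>>> list(cache_dp)  # later passes are expected to be served from the in-memory cache
[0, 10, 20, 30]
>>> len(calls)  # expensive_load is expected to have run once per element
4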
