BatchMapper

class torchdata.datapipes.iter.BatchMapper(datapipe: IterDataPipe, fn: Callable, batch_size: int, input_col=None)

Combines elements from the source DataPipe into batches and applies a function over each batch, then flattens the outputs into a single, unnested IterDataPipe (functional name: map_batches).

Parameters:
  • datapipe – Source IterDataPipe

  • fn – The function to be applied to each batch of data

  • batch_size – The number of elements aggregated from datapipe into each batch

  • input_col

    Index or indices of the data to which fn is applied (see the sketch after the example below), such as:

    • None (default) to apply fn to each batch directly.

    • Integer(s) for list/tuple elements.

    • Key(s) for dict elements.

Example

>>> from torchdata.datapipes.iter import IterableWrapper
>>> def fn(batch):
...     return [d + 1 for d in batch]
>>> source_dp = IterableWrapper(list(range(5)))
>>> mapped_dp = source_dp.map_batches(fn, batch_size=3)
>>> list(mapped_dp)
[1, 2, 3, 4, 5]
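
When elements are structured, input_col selects which field of each element is passed to fn. The following is a minimal sketch with hypothetical (label, value) pairs, assuming the behavior matches the basic example above: fn receives only the values selected by input_col=1, and its outputs are flattened the same way.

>>> # Hypothetical (label, value) pairs; input_col=1 selects the values.
>>> source_dp = IterableWrapper([("a", 0), ("b", 1), ("c", 2)])
>>> def double(batch):
...     return [v * 2 for v in batch]  # batch is [0, 1], then [2]
>>> list(source_dp.map_batches(double, batch_size=2, input_col=1))
[0, 2, 4]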

Notes

Compared with map, map_batches does not take an output_col argument because the size of fn's output is not guaranteed to match the size of the input batch; when the sizes differ, this operation cannot assign the results back to the original data structure.
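
For instance, a fn that filters within each batch yields fewer elements than it received, so there is no one-to-one mapping back to the original elements. A minimal sketch:

>>> # A size-changing fn: each batch's output is shorter than its input,
>>> # so results cannot be assigned back element-for-element.
>>> def keep_even(batch):
...     return [d for d in batch if d % 2 == 0]
>>> source_dp = IterableWrapper(list(range(6)))
>>> list(source_dp.map_batches(keep_even, batch_size=3))
[0, 2, 4]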

This operation was introduced based on a use case from TorchText, where a pybind-wrapped, vectorized C++ function can be applied to each batch for efficiency.
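
As an illustration of that pattern, the sketch below uses torch tensor arithmetic as a stand-in for such a vectorized routine; the point is that fn makes one call per batch rather than one Python-level call per element.

>>> # torch ops as a stand-in for a pybind-wrapped, vectorized C++
>>> # function: a single call processes the whole batch at once.
>>> import torch
>>> def vectorized_fn(batch):
...     return (torch.tensor(batch) + 1).tolist()
>>> source_dp = IterableWrapper(list(range(5)))
>>> list(source_dp.map_batches(vectorized_fn, batch_size=3))
[1, 2, 3, 4, 5]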
