Attention
June 2024 Status Update: Removing DataPipes and DataLoader V2
We are re-focusing the torchdata repo to be an iterative enhancement of torch.utils.data.DataLoader. We do not plan on continuing development or maintaining the [DataPipes] and [DataLoaderV2] solutions, and they will be removed from the torchdata repo. We’ll also be revisiting the DataPipes references in pytorch/pytorch. In release torchdata==0.8.0 (July 2024) they will be marked as deprecated, and in 0.9.0 (Oct 2024) they will be deleted. Existing users are advised to pin to torchdata==0.8.0 or an older version until they are able to migrate away. Subsequent releases will not include DataPipes or DataLoaderV2. Please reach out if you have suggestions or comments (please use this issue for feedback).
BatchMapper
- class torchdata.datapipes.iter.BatchMapper(datapipe: IterDataPipe, fn: Callable, batch_size: int, input_col=None)

Combines elements from the source DataPipe into batches and applies a function over each batch, then flattens the outputs into a single, unnested IterDataPipe (functional name: map_batches).

Parameters:
- datapipe – Source IterDataPipe
- fn – The function to be applied to each batch of data
- batch_size – The size of each batch to be aggregated from datapipe
- input_col – Index or indices of the data to which fn is applied, such as:
  - None (the default) to apply fn to the data directly
  - Integer(s) for list/tuple elements
  - Key(s) for dict elements
Example
>>> from torchdata.datapipes.iter import IterableWrapper
>>> def fn(batch):
>>>     return [d + 1 for d in batch]
>>> source_dp = IterableWrapper(list(range(5)))
>>> mapped_dp = source_dp.map_batches(fn, batch_size=3)
>>> list(mapped_dp)
[1, 2, 3, 4, 5]
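The parameters above mention input_col, which the example does not exercise. Below is a minimal sketch, assuming that with input_col set, fn receives a list of just the extracted values and that, per the Notes below, its flattened output replaces the original elements rather than being written back; the dict data and the double function are hypothetical:

>>> from torchdata.datapipes.iter import IterableWrapper
>>> def double(batch):
>>>     # batch is assumed to be the list of values under key "x"
>>>     return [v * 2 for v in batch]
>>> dict_dp = IterableWrapper([{"x": 1, "y": "a"}, {"x": 2, "y": "b"}, {"x": 3, "y": "c"}])
>>> list(dict_dp.map_batches(double, batch_size=2, input_col="x"))
[2, 4, 6]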
Notes

Compared with map, map_batches does not take an output_col argument because the size of fn's output is not guaranteed to match the size of the input batch; when the sizes differ, the results cannot be assigned back to the original data structure.

This operation was introduced based on a use case from TorchText, where a pybind-wrapped, vectorized C++ function can be applied over each batch for efficiency.
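To make the size-mismatch point concrete, here is a minimal sketch (the drop_odd function is hypothetical) where fn returns fewer elements than it receives; because the output is simply flattened into the result stream, there is no row-by-row correspondence an output_col could rely on:

>>> from torchdata.datapipes.iter import IterableWrapper
>>> def drop_odd(batch):
>>>     # Returns a smaller batch than it received, so results
>>>     # cannot be assigned back to the original rows.
>>>     return [d for d in batch if d % 2 == 0]
>>> source_dp = IterableWrapper(list(range(6)))
>>> list(source_dp.map_batches(drop_odd, batch_size=3))
[0, 2, 4]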