SampleMultiplexer

class torchdata.datapipes.iter.SampleMultiplexer(pipes_to_weights_dict: Dict[IterDataPipe[T_co], float], seed: Optional[int] = None)

Takes a Dict of (IterDataPipe, Weight), and yields items by sampling from these DataPipes with respect to their weights. When individual DataPipes are exhausted, it continues to sample from the remaining DataPipes according to their relative weights. If you wish to maintain the same ratio of weights indefinitely, you need to ensure that the inputs are never exhausted, for instance by applying cycle to them.

Sampling is controlled by the provided random seed. If no seed is provided, sampling is not deterministic.
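The sampling logic described above can be sketched in plain Python. The function below is an illustrative stand-in, not the actual torchdata implementation: it draws from the active iterators in proportion to their weights and drops each one as it is exhausted, so the remaining weights are renormalized implicitly.

```python
import random
from typing import Dict, Iterable, Iterator, Optional


def sample_multiplex(
    pipes_to_weights: Dict[Iterable, float], seed: Optional[int] = None
) -> Iterator:
    """Sketch of SampleMultiplexer's behavior (not the real implementation)."""
    rng = random.Random(seed)
    # Map each live iterator to its weight; exhausted ones are removed.
    active = {iter(pipe): weight for pipe, weight in pipes_to_weights.items()}
    while active:
        iterators = list(active)
        weights = [active[it] for it in iterators]
        # random.choices normalizes the weights internally, matching the
        # "total weight of unexhausted DataPipes normalized to 1" behavior.
        chosen = rng.choices(iterators, weights=weights, k=1)[0]
        try:
            yield next(chosen)
        except StopIteration:
            # Exhausted: drop it and keep sampling from the rest.
            del active[chosen]
```

With heavily skewed weights, as in the example below, the heavy source is drained first and the light source only contributes afterwards.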

Parameters:
  • pipes_to_weights_dict – a Dict of IterDataPipes and Weights. The total weight of unexhausted DataPipes will be normalized to 1 for the purpose of sampling.

  • seed – random seed to initialize the random number generator

Example

>>> from torchdata.datapipes.iter import IterableWrapper, SampleMultiplexer
>>> source_dp1 = IterableWrapper([0] * 5)
>>> source_dp2 = IterableWrapper([1] * 5)
>>> d = {source_dp1: 99999999, source_dp2: 0.0000001}
>>> sample_mul_dp = SampleMultiplexer(pipes_to_weights_dict=d, seed=0)
>>> list(sample_mul_dp)
[0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
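To keep the weight ratio fixed indefinitely, the inputs must never run dry. A minimal sketch of that idea using only the standard library (itertools.cycle plays the role of torchdata's cycle here; this is not the torchdata API itself):

```python
import itertools
import random

rng = random.Random(0)
# Cycling both streams makes them inexhaustible, so the 2:1 weight
# ratio between them holds for as long as we keep sampling.
iterators = [itertools.cycle([0]), itertools.cycle([1])]
weights = [2.0, 1.0]

sample = [
    next(rng.choices(iterators, weights=weights, k=1)[0])
    for _ in range(9000)
]
# Roughly two 0s are drawn for every 1, indefinitely.
```

Without cycling, once the heavier stream is exhausted the output degenerates to the remaining streams, as the example above shows.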
