- class ignite.distributed.auto.DistributedProxySampler(sampler, num_replicas=None, rank=None)
Distributed sampler proxy that adapts a user-provided sampler to a distributed data-parallel configuration.
Code is based on https://github.com/pytorch/pytorch/issues/23430#issuecomment-562350407
Parameters
- sampler – Input torch data sampler.
- num_replicas – Number of processes participating in distributed training.
- rank – Rank of the current process within num_replicas.
Note: the input sampler is assumed to have a constant size.
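To illustrate the proxy idea behind this class (materialize the wrapped sampler's indices, pad them so every rank receives an equal share, then hand each rank a strided shard), here is a minimal self-contained sketch. It is not the Ignite implementation itself; the class name `ProxySamplerSketch` and its internals are illustrative assumptions.

```python
import math


class ProxySamplerSketch:
    """Sketch (not the ignite implementation) of a distributed sampler proxy:
    draw all indices from the wrapped sampler once, pad the list to a length
    divisible by num_replicas, then give each rank a strided shard."""

    def __init__(self, sampler, num_replicas, rank):
        self.sampler = sampler          # any iterable yielding dataset indices
        self.num_replicas = num_replicas
        self.rank = rank

    def __iter__(self):
        indices = list(self.sampler)
        # Pad by repeating indices from the start so every rank
        # ends up with the same number of samples.
        total_size = math.ceil(len(indices) / self.num_replicas) * self.num_replicas
        indices += indices[: total_size - len(indices)]
        # Strided shard: rank r takes indices r, r + num_replicas, ...
        return iter(indices[self.rank :: self.num_replicas])

    def __len__(self):
        return math.ceil(len(list(self.sampler)) / self.num_replicas)
```

For example, sharding `range(10)` across 3 replicas pads the index list to length 12, so rank 0 sees `[0, 3, 6, 9]` and rank 1 sees `[1, 4, 7, 0]`. Because the proxy materializes the full index list each epoch, the wrapped sampler must produce a constant number of indices, matching the note above.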