DistributedProxySampler

class ignite.distributed.auto.DistributedProxySampler(sampler, num_replicas=None, rank=None)

Distributed sampler proxy that adapts a user-provided sampler for a distributed data-parallel configuration.

Code is based on https://github.com/pytorch/pytorch/issues/23430#issuecomment-562350407

Note

Input sampler is assumed to have a constant size.

Parameters
  • sampler (Sampler) – Input torch data sampler.

  • num_replicas (Optional[int]) – Number of processes participating in distributed training.

  • rank (Optional[int]) – Rank of the current process within num_replicas.
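To illustrate the idea behind such a proxy (this is a simplified sketch, not ignite's actual implementation, and `TinyProxySampler` is a hypothetical name): the wrapped sampler's indices are padded by repetition so every rank receives the same number of samples, and each rank then takes a strided slice.

```python
import math

class TinyProxySampler:
    """Minimal sketch of the distributed-proxy idea (hypothetical class):
    pad the wrapped sampler's indices so the total is divisible by
    num_replicas, then give each rank a strided slice of equal length."""

    def __init__(self, sampler, num_replicas, rank):
        self.sampler = sampler
        self.num_replicas = num_replicas
        self.rank = rank

    def __iter__(self):
        indices = list(self.sampler)
        # Pad by repeating the first indices so that the total length
        # is divisible by num_replicas (every rank gets the same count).
        total = math.ceil(len(indices) / self.num_replicas) * self.num_replicas
        indices += indices[: total - len(indices)]
        # This rank's share: every num_replicas-th index, starting at rank.
        return iter(indices[self.rank : total : self.num_replicas])

    def __len__(self):
        return math.ceil(len(self.sampler) / self.num_replicas)

# Example: 10 indices split across 3 replicas.
rank0 = list(TinyProxySampler(range(10), num_replicas=3, rank=0))
rank1 = list(TinyProxySampler(range(10), num_replicas=3, rank=1))
```

Note the padding is why the wrapped sampler is assumed to have a constant size: each rank must be able to compute the same padded total independently. The real `DistributedProxySampler` builds on `torch.utils.data.distributed.DistributedSampler` and additionally reshuffles per epoch.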

Methods