
torchtnt.utils.distributed.get_global_rank

torchtnt.utils.distributed.get_global_rank() → int

Get the global rank using torch.distributed if it is available and initialized. Otherwise, fall back to the RANK environment variable if it is set. Returns 0 if neither condition is met.
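For illustration, the fallback order described above might look roughly like the following sketch. This is a minimal approximation, not the library's actual implementation; the name get_global_rank_sketch is hypothetical.

    import os

    import torch.distributed as dist

    def get_global_rank_sketch() -> int:
        # Prefer torch.distributed when the default process group is initialized.
        if dist.is_available() and dist.is_initialized():
            return dist.get_rank()
        # Fall back to the RANK env var, which launchers such as torchrun set.
        rank = os.environ.get("RANK")
        if rank is not None:
            return int(rank)
        # Neither condition met: assume single-process execution.
        return 0

A common use is gating work to the primary process, for example:

    from torchtnt.utils.distributed import get_global_rank

    if get_global_rank() == 0:
        print("running on the primary process")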
