is_distributed

torchtune.training.is_distributed() → bool

Check whether torch.distributed is available and all environment variables required to initialize it are set, which indicates a distributed run. See https://pytorch.org/docs/stable/distributed.html#environment-variable-initialization

Checks the following conditions (a sketch of the equivalent logic follows the list):

  • torch.distributed is available

  • the master address and master port environment variables (MASTER_ADDR, MASTER_PORT) are set

  • the world size (WORLD_SIZE) is greater than 1

  • the rank environment variable (RANK) is set
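
As a rough illustration, the check is equivalent to logic along these lines. This is a hedged sketch, not torchtune's actual implementation; the environment-variable names (MASTER_ADDR, MASTER_PORT, RANK, WORLD_SIZE) follow PyTorch's documented environment-variable initialization convention.

import os

import torch.distributed as dist

def _is_distributed_sketch() -> bool:
    # torch.distributed must be available in this build of PyTorch
    if not dist.is_available():
        return False
    # Rendezvous variables for environment-variable initialization
    addr = os.environ.get("MASTER_ADDR", "")
    port = os.environ.get("MASTER_PORT", "")
    # More than one process must be participating
    world_size = int(os.environ.get("WORLD_SIZE", -1))
    # Each process must know its own rank
    rank = os.environ.get("RANK", "")
    return bool(addr and port and world_size > 1 and rank)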

Returns:

True if all of the above conditions hold, False otherwise.

Return type:

bool
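
For example, a training script might consult this helper before setting up a process group. A minimal usage sketch, assuming a launcher such as torchrun has set the environment variables; the init_process_group call is standard torch.distributed usage, not torchtune-specific:

import torch.distributed as dist

from torchtune import training

if training.is_distributed():
    # Environment-variable initialization: MASTER_ADDR, MASTER_PORT,
    # RANK, and WORLD_SIZE are read from the environment.
    dist.init_process_group(backend="nccl")
else:
    # Single-process run: skip distributed setup entirely.
    print("torch.distributed not configured; running single-process")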
