is_distributed¶
- torchtune.training.is_distributed() bool [source]¶
Check whether all environment variables required to initialize torch.distributed are set and whether the distributed package is available. If so, this indicates a distributed run. See https://pytorch.org/docs/stable/distributed.html#environment-variable-initialization
Checks the following conditions:
- torch.distributed is available
- the master address and master port environment variables (MASTER_ADDR, MASTER_PORT) are set
- the world size (WORLD_SIZE) is > 1
- the rank environment variable (RANK) is set
- Returns:
True if all of the above conditions hold, False otherwise.
- Return type:
bool
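Example (a minimal sketch; it assumes the script is launched with torchrun, which exports MASTER_ADDR, MASTER_PORT, WORLD_SIZE, and RANK on every worker):

    import torch.distributed as dist
    from torchtune.training import is_distributed

    # Launched as: torchrun --nproc_per_node=2 train.py
    # torchrun sets the environment variables that is_distributed() checks,
    # so the call returns True and it is safe to initialize the default
    # process group before building the model.
    if is_distributed():
        dist.init_process_group(backend="nccl")
    else:
        print("Required env vars not set or world size <= 1; running single-process.")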