init_distributed

torchtune.utils.init_distributed(**kwargs: Dict) → bool

Initialize torch.distributed.

Parameters:
    **kwargs (Dict) – Additional arguments to pass to torch.distributed.init_process_group.

Returns:
    True if torch.distributed is initialized.

Return type:
    bool

Raises:
    RuntimeError – If torch.distributed is already initialized.
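
Below is a minimal usage sketch, assuming a torchrun-style launch that sets the usual environment variables (RANK, WORLD_SIZE, MASTER_ADDR, MASTER_PORT). The backend="nccl" argument is only an illustrative kwarg forwarded to torch.distributed.init_process_group, not something the API requires.

    from torchtune import utils

    # Any kwargs are forwarded to torch.distributed.init_process_group;
    # NCCL is the common backend for multi-GPU training (an assumption here).
    is_initialized = utils.init_distributed(backend="nccl")

    if is_initialized:
        # The default process group is set up; collectives such as
        # torch.distributed.all_reduce can now be used.
        ...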
