WandBLogger

class torchtune.training.metric_logging.WandBLogger(project: str = 'torchtune', entity: Optional[str] = None, group: Optional[str] = None, log_dir: Optional[str] = None, **kwargs)[source]

Logger for use with the Weights & Biases application (https://wandb.ai/). For more information about the arguments expected by WandB, see https://docs.wandb.ai/ref/python/init.

Parameters:
  • project (str) – WandB project name. Default is torchtune.

  • entity (Optional[str]) – WandB entity name. If you don’t specify an entity, the run will be sent to your default entity, which is usually your username.

  • group (Optional[str]) – WandB group name for grouping runs together. If you don’t specify a group, the run will be logged as an individual experiment.

  • log_dir (Optional[str]) – WandB log directory. If not specified, the dir argument provided in kwargs is used; if that is also absent, the root directory is used.

  • **kwargs – additional arguments to pass to wandb.init

Example

>>> from torchtune.training.metric_logging import WandBLogger
>>> logger = WandBLogger(project="my_project", entity="my_entity", group="my_group")
>>> logger.log("my_metric", 1.0, 1)
>>> logger.log_dict({"my_metric": 1.0}, 1)
>>> logger.close()
Raises:
  • ImportError – If the wandb package is not installed.

Note

This logger requires the wandb package to be installed. You can install it with pip install wandb. To use the logger, you need to log in to your WandB account, which you can do by running wandb login in your terminal.

close() → None[source]

Close log resource, flushing if necessary. Logs should not be written after close is called.

log(name: str, data: Union[Tensor, ndarray, int, float], step: int) → None[source]

Log scalar data.

Parameters:
  • name (str) – tag name used to group scalars

  • data (Scalar) – scalar data to log

  • step (int) – step value to record

log_config(config: DictConfig) → None[source]

Saves the config locally and also logs it to W&B. The config is stored in the same directory as the checkpoint. An example of a config logged to W&B can be seen at: https://wandb.ai/capecape/torchtune/runs/6053ofw0/files/torchtune_config_j67sb73v.yaml

Parameters:

config (DictConfig) – config to log
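The method both writes the config to disk (next to the checkpoints) and uploads it to W&B. Since running it requires an active wandb run and an OmegaConf DictConfig, the local-save half of its behavior can be sketched with plain stdlib code. This is an illustrative stand-in, not torchtune's implementation; the file name and JSON format here are assumptions (torchtune writes YAML, as the linked example shows):

```python
import json
import os
import tempfile

def save_config_locally(config: dict, output_dir: str) -> str:
    """Illustrative stand-in: persist a config mapping next to the
    checkpoints, as WandBLogger.log_config does before uploading
    the saved file to W&B."""
    os.makedirs(output_dir, exist_ok=True)
    # Hypothetical file name; torchtune uses a YAML file with a run-specific suffix.
    path = os.path.join(output_dir, "torchtune_config.json")
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
    return path

# Usage sketch: save a small config into a temporary "checkpoint" directory.
out_dir = tempfile.mkdtemp()
cfg = {"optimizer": {"lr": 3e-4}, "batch_size": 8}
saved_path = save_config_locally(cfg, out_dir)
```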

log_dict(payload: Mapping[str, Union[Tensor, ndarray, int, float]], step: int) None[source]

Log multiple scalar values.

Parameters:
  • payload (Mapping[str, Scalar]) – dictionary of tag name and scalar value

  • step (int) – step value to record
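The log, log_dict, and close methods above form the logger's whole lifecycle. A hypothetical in-memory stand-in (not part of torchtune) with the same interface can show the expected call pattern without a wandb login:

```python
from typing import Dict, List, Mapping, Tuple

class InMemoryLogger:
    """Hypothetical stand-in mirroring WandBLogger's interface,
    recording scalars in memory instead of sending them to W&B."""

    def __init__(self) -> None:
        # Per-tag history of (step, value) pairs.
        self.history: Dict[str, List[Tuple[int, float]]] = {}
        self.closed = False

    def log(self, name: str, data: float, step: int) -> None:
        # Record one scalar under its tag, keyed by step.
        self.history.setdefault(name, []).append((step, float(data)))

    def log_dict(self, payload: Mapping[str, float], step: int) -> None:
        # Log every scalar in the payload at the same step.
        for name, data in payload.items():
            self.log(name, data, step)

    def close(self) -> None:
        # Logs should not be written after close is called.
        self.closed = True

# Usage sketch matching the WandBLogger example above.
logger = InMemoryLogger()
logger.log("loss", 2.5, step=1)
logger.log_dict({"loss": 2.1, "lr": 3e-4}, step=2)
logger.close()
```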
