
LearningRateMonitor

class torchtnt.framework.callbacks.LearningRateMonitor(loggers: Union[MetricLogger, List[MetricLogger]], *, logging_interval: str = 'epoch')

A callback which logs the learning rate of tracked optimizers and learning rate schedulers. The learning rate is logged for each parameter group associated with an optimizer.

Parameters:

    loggers – Either a torchtnt.loggers.logger.MetricLogger or a list of torchtnt.loggers.logger.MetricLogger

    logging_interval – Whether to log at the start of every train epoch ('epoch') or every train step ('step'). Defaults to 'epoch'.
on_train_epoch_start(state: State, unit: TrainUnit[TTrainData]) → None

Hook called before a new train epoch starts.

on_train_step_start(state: State, unit: TrainUnit[TTrainData]) → None

Hook called before a new train step starts.
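The per-parameter-group logging described above can be sketched in plain Python. The stand-in optimizer and the `collect_lrs` helper below are hypothetical, not part of the torchtnt API; the stand-in only mimics the `param_groups` shape of `torch.optim` optimizers (a list of dicts, each with an "lr" key), which is what a callback like this reads at each epoch or step boundary.

```python
class FakeOptimizer:
    """Hypothetical stand-in with the same `param_groups` shape as
    torch.optim optimizers: a list of dicts, each carrying an "lr" key."""

    def __init__(self, lrs):
        self.param_groups = [{"lr": lr} for lr in lrs]


def collect_lrs(name, optimizer):
    """Return one metric per parameter group, keyed like 'name/pg0',
    ready to be handed to a MetricLogger's log call."""
    return {
        f"{name}/pg{i}": group["lr"]
        for i, group in enumerate(optimizer.param_groups)
    }


opt = FakeOptimizer([0.1, 0.01])
print(collect_lrs("sgd", opt))  # → {'sgd/pg0': 0.1, 'sgd/pg1': 0.01}
```

With `logging_interval='epoch'` this collection would run in `on_train_epoch_start`; with `'step'`, in `on_train_step_start`.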
