LRScheduler
class ignite.handlers.param_scheduler.LRScheduler(lr_scheduler, save_history=False)
A wrapper class to call torch.optim.lr_scheduler objects as ignite handlers.
Parameters
lr_scheduler (_LRScheduler) – lr_scheduler object to wrap.
save_history (bool) – whether to log the parameter values to engine.state.param_history (default: False); see the second example below.
Examples
from torch.optim.lr_scheduler import StepLR

from ignite.engine import Events
from ignite.handlers.param_scheduler import LRScheduler

# `default_trainer` and `default_optimizer` are assumed to be defined
# as in the other examples on this page.
torch_lr_scheduler = StepLR(default_optimizer, step_size=3, gamma=0.1)
scheduler = LRScheduler(torch_lr_scheduler)

# In this example, we assume to have installed PyTorch>=1.1.0
# (with new `torch.optim.lr_scheduler` behaviour) and
# we attach the scheduler to Events.ITERATION_COMPLETED
# instead of Events.ITERATION_STARTED to make sure to use
# the first lr value from the optimizer, otherwise it will be skipped:
default_trainer.add_event_handler(Events.ITERATION_COMPLETED, scheduler)

@default_trainer.on(Events.ITERATION_COMPLETED)
def print_lr():
    print(default_optimizer.param_groups[0]["lr"])

default_trainer.run([0] * 8, max_epochs=1)
0.1
0.1
0.010...
0.010...
0.010...
0.001...
0.001...
0.001...
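With save_history=True, each applied value is also recorded on the engine. A minimal sketch, reusing the setup above; the exact layout of param_history (a dict mapping the parameter name to per-event value lists) is an assumption worth checking against your ignite version:

torch_lr_scheduler = StepLR(default_optimizer, step_size=3, gamma=0.1)
scheduler = LRScheduler(torch_lr_scheduler, save_history=True)
default_trainer.add_event_handler(Events.ITERATION_COMPLETED, scheduler)

default_trainer.run([0] * 8, max_epochs=1)

# Assumed layout: {"lr": [[0.1], [0.1], [0.01], ...]}, one entry per event,
# each entry holding the value for each optimizer param group.
print(default_trainer.state.param_history["lr"])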
New in version 0.4.5.
Methods
get_param() – Method to get current optimizer's parameter value.
simulate_values(num_events, lr_scheduler, **kwargs) – Method to simulate scheduled values during num_events events.
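simulate_values lets you preview a schedule without running an engine. A minimal sketch, assuming the StepLR setup from the example above; note that it is a class method and receives the torch scheduler, not the wrapper:

import torch
from torch.optim.lr_scheduler import StepLR

from ignite.handlers.param_scheduler import LRScheduler

# A throwaway optimizer/scheduler pair, used only for the simulation.
tensor = torch.zeros(1, requires_grad=True)
optimizer = torch.optim.SGD([tensor], lr=0.1)
torch_lr_scheduler = StepLR(optimizer, step_size=3, gamma=0.1)

# Returns one [event_index, value] pair per simulated event.
values = LRScheduler.simulate_values(num_events=8, lr_scheduler=torch_lr_scheduler)
for event_index, lr in values:
    print(event_index, lr)

This should print the same schedule as the trainer run above (0.1, 0.1, 0.01, ...), which makes it a convenient way to sanity-check a schedule before training.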