LRScheduler

class ignite.handlers.param_scheduler.LRScheduler(lr_scheduler, save_history=False)

A wrapper class to call torch.optim.lr_scheduler objects as Ignite handlers.

Parameters
  • lr_scheduler (_LRScheduler) – lr_scheduler object to wrap.

  • save_history (bool) – whether to log the parameter values to engine.state.param_history (default: False).

from ignite.engine import Events
from ignite.handlers.param_scheduler import LRScheduler
from torch.optim.lr_scheduler import StepLR

step_scheduler = StepLR(optimizer, step_size=3, gamma=0.1)
scheduler = LRScheduler(step_scheduler)

# In this example, we assume PyTorch>=1.1.0 is installed
# (with the new `torch.optim.lr_scheduler` behaviour) and
# we attach the scheduler to Events.ITERATION_COMPLETED
# instead of Events.ITERATION_STARTED to make sure the
# first lr value from the optimizer is used, otherwise it would be skipped:
trainer.add_event_handler(Events.ITERATION_COMPLETED, scheduler)

New in version 0.4.5.
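As a minimal sketch of save_history, assuming a toy linear model and a no-op update step (the model, optimizer, and data below are illustrative), the logged values can be read back from engine.state.param_history after the run; LRScheduler logs under the "lr" key:

import torch
from torch import nn
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

from ignite.engine import Engine, Events
from ignite.handlers.param_scheduler import LRScheduler

model = nn.Linear(2, 1)
optimizer = SGD(model.parameters(), lr=0.1)
step_scheduler = StepLR(optimizer, step_size=3, gamma=0.1)

# save_history=True appends the lr values of all param groups
# to engine.state.param_history at every triggered event
scheduler = LRScheduler(step_scheduler, save_history=True)

trainer = Engine(lambda engine, batch: None)  # no-op update step
trainer.add_event_handler(Events.ITERATION_COMPLETED, scheduler)
trainer.run(range(8), max_epochs=1)

print(trainer.state.param_history["lr"])  # one per-group value list per iteration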

Methods

  • get_param – Method to get the current optimizer's parameter value.

  • simulate_values – Method to simulate scheduled values during num_events events.

get_param()

Method to get the current optimizer's parameter value.

Return type

Union[float, List[float]]
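For illustration, a short sketch of calling get_param, assuming a single param group (the optimizer setup here is hypothetical):

from torch import nn
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

from ignite.handlers.param_scheduler import LRScheduler

optimizer = SGD(nn.Linear(2, 1).parameters(), lr=0.1)
scheduler = LRScheduler(StepLR(optimizer, step_size=3, gamma=0.1))

# With a single param group a float is returned;
# with several groups, a list of floats.
print(scheduler.get_param())  # 0.1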

classmethod simulate_values(num_events, lr_scheduler, **kwargs)

Method to simulate scheduled values during num_events events.

Parameters
  • num_events (int) – number of events during the simulation.

  • lr_scheduler (_LRScheduler) – lr_scheduler object to wrap.

  • kwargs (Any) –

Returns

list of [event_index, value] pairs

Return type

List[List[int]]
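A minimal sketch of previewing a schedule with simulate_values; note that it is a classmethod and takes the raw torch scheduler, not the wrapper (the optimizer setup here is illustrative):

from torch import nn
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

from ignite.handlers.param_scheduler import LRScheduler

optimizer = SGD(nn.Linear(2, 1).parameters(), lr=0.1)
step_scheduler = StepLR(optimizer, step_size=3, gamma=0.1)

# Simulate 9 events; the real optimizer/scheduler state is restored afterwards.
values = LRScheduler.simulate_values(num_events=9, lr_scheduler=step_scheduler)
for event_index, lr in values:
    print(event_index, lr)  # lr drops by 10x every 3 events (step_size=3, gamma=0.1)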