CosineAnnealingScheduler

class ignite.handlers.param_scheduler.CosineAnnealingScheduler(optimizer, param_name, start_value, end_value, cycle_size, cycle_mult=1.0, start_value_mult=1.0, end_value_mult=1.0, save_history=False, param_group_index=None)

Anneals ‘start_value’ to ‘end_value’ over each cycle.

The annealing takes the form of the first half of a cosine wave (as suggested in [Smith17]).
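
Concretely, within a cycle the scheduled value follows the rule sketched below (illustrative Python, not the library's exact source; `t` stands for the scheduler's event count within the current cycle):

import math

def half_cosine_anneal(t, cycle_size, start_value, end_value):
    # First half of a cosine wave: the value starts at start_value (t=0)
    # and decays monotonically to end_value (t=cycle_size).
    progress = t / cycle_size
    return start_value + (end_value - start_value) / 2 * (1 - math.cos(math.pi * progress))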

Parameters
  • optimizer (Optimizer) – torch optimizer or any object with a param_groups attribute (a sequence of parameter-group dicts).

  • param_name (str) – name of optimizer’s parameter to update.

  • start_value (float) – value at the start of the cycle.

  • end_value (float) – value at the end of the cycle.

  • cycle_size (int) – length of cycle.

  • cycle_mult (float) – ratio by which to change the cycle_size at the end of each cycle (default=1.0).

  • start_value_mult (float) – ratio by which to change the start value at the end of each cycle (default=1.0).

  • end_value_mult (float) – ratio by which to change the end value at the end of each cycle (default=1.0).

  • save_history (bool) – whether to log the parameter values to engine.state.param_history (default=False).

  • param_group_index (Optional[int]) – index of the optimizer’s parameter group to operate on.

Note

If the scheduler is bound to an ‘ITERATION_*’ event, ‘cycle_size’ should usually be the number of batches in an epoch.
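
Conversely, when the scheduler is bound to an ‘EPOCH_*’ event, ‘cycle_size’ counts epochs. A minimal sketch, assuming the `optimizer` and `trainer` of the examples below:

from ignite.engine import Events
from ignite.handlers.param_scheduler import CosineAnnealingScheduler

# One half-cosine over 50 epochs; the value is updated once per epoch.
scheduler = CosineAnnealingScheduler(optimizer, 'lr', 1e-1, 1e-3, cycle_size=50)
trainer.add_event_handler(Events.EPOCH_STARTED, scheduler)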

Examples

from ignite.engine import Events
from ignite.handlers.param_scheduler import CosineAnnealingScheduler

# Anneal the learning rate from 1e-1 to 1e-3 over the course of one epoch.
scheduler = CosineAnnealingScheduler(optimizer, 'lr', 1e-1, 1e-3, len(train_loader))
trainer.add_event_handler(Events.ITERATION_STARTED, scheduler)

from torch.optim import SGD

from ignite.engine import Events
from ignite.handlers.param_scheduler import CosineAnnealingScheduler, LinearCyclicalScheduler

optimizer = SGD(
    [
        {"params": model.base.parameters(), 'lr': 0.001},
        {"params": model.fc.parameters(), 'lr': 0.01},
    ]
)

# Cycle the base parameters' lr linearly between 1e-7 and 1e-5 over each epoch.
# The trailing string is the handler's `name` argument, used as the key in
# engine.state.param_history when save_history=True.
scheduler1 = LinearCyclicalScheduler(optimizer, 'lr', 1e-7, 1e-5, len(train_loader), param_group_index=0)
trainer.add_event_handler(Events.ITERATION_STARTED, scheduler1, "lr (base)")

# Anneal the fc parameters' lr from 1e-5 to 1e-3 over each epoch.
scheduler2 = CosineAnnealingScheduler(optimizer, 'lr', 1e-5, 1e-3, len(train_loader), param_group_index=1)
trainer.add_event_handler(Events.ITERATION_STARTED, scheduler2, "lr (fc)")
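
To see how cycle_mult, start_value_mult and end_value_mult reshape successive cycles, the schedule can be inspected offline with simulate_values (a classmethod inherited from ParamScheduler); a sketch, assuming it returns one [event_index, value] pair per event:

from ignite.handlers.param_scheduler import CosineAnnealingScheduler

# Each new cycle is twice as long (cycle_mult=2.0) and restarts from half
# the previous start value (start_value_mult=0.5); 70 events span the
# first three cycles (10 + 20 + 40).
values = CosineAnnealingScheduler.simulate_values(
    num_events=70,
    param_name="lr",
    start_value=1e-1,
    end_value=1e-3,
    cycle_size=10,
    cycle_mult=2.0,
    start_value_mult=0.5,
)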

Smith17

Smith, Leslie N. “Cyclical learning rates for training neural networks.” 2017 IEEE Winter Conference on Applications of Computer Vision (WACV). IEEE, 2017.

New in version 0.4.5.

Methods

get_param()

Method to get the current value of the optimizer’s parameter.

Return type

float
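
A minimal sketch of querying the scheduler directly, assuming a throwaway SGD optimizer:

import torch
from torch.optim import SGD

from ignite.handlers.param_scheduler import CosineAnnealingScheduler

param = torch.zeros(1, requires_grad=True)
optimizer = SGD([param], lr=0.1)

scheduler = CosineAnnealingScheduler(optimizer, 'lr', 1e-1, 1e-3, cycle_size=10)
print(scheduler.get_param())  # at the start of the cycle this should print 0.1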