SequentialLR
- class torch.optim.lr_scheduler.SequentialLR(optimizer, schedulers, milestones, last_epoch=-1)[source]
Contains a list of schedulers expected to be called sequentially during the optimization process.
Specifically, each scheduler is active on the epoch interval defined by the milestone points, which mark the exact epochs at which control passes from one scheduler to the next.
- Parameters
  optimizer (Optimizer) – Wrapped optimizer.
  schedulers (list) – List of chained schedulers.
  milestones (list) – List of integers reflecting the milestone points.
  last_epoch (int) – The index of the last epoch. Default: -1.
Example
>>> # Assuming optimizer uses lr = 1. for all groups
>>> # lr = 0.1     if epoch == 0
>>> # lr = 0.1     if epoch == 1
>>> # lr = 0.9     if epoch == 2
>>> # lr = 0.81    if epoch == 3
>>> # lr = 0.729   if epoch == 4
>>> scheduler1 = ConstantLR(optimizer, factor=0.1, total_iters=2)
>>> scheduler2 = ExponentialLR(optimizer, gamma=0.9)
>>> scheduler = SequentialLR(optimizer, schedulers=[scheduler1, scheduler2], milestones=[2])
>>> for epoch in range(100):
>>>     train(...)
>>>     validate(...)
>>>     scheduler.step()
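A common use is a warmup phase followed by a decay phase. The sketch below is illustrative and not part of the official example; the scheduler classes are standard torch.optim.lr_scheduler classes, while the epoch counts are assumptions. Note that milestones must contain exactly one fewer entry than schedulers.
>>> # Sketch: 5-epoch warmup, cosine decay until epoch 50, then a constant tail.
>>> warmup = LinearLR(optimizer, start_factor=0.1, total_iters=5)
>>> decay = CosineAnnealingLR(optimizer, T_max=45)
>>> tail = ConstantLR(optimizer, factor=0.5, total_iters=50)
>>> scheduler = SequentialLR(
...     optimizer,
...     schedulers=[warmup, decay, tail],
...     milestones=[5, 50],  # len(milestones) == len(schedulers) - 1
... )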
- get_last_lr()[source]
Return the last learning rate computed by the current scheduler.
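A minimal sketch of querying the rate during training; get_last_lr() returns a list with one entry per parameter group, and train_one_epoch is a hypothetical helper:
>>> for epoch in range(100):
>>>     train_one_epoch(...)  # hypothetical training step
>>>     scheduler.step()
>>>     print(epoch, scheduler.get_last_lr())  # e.g. [0.1] with a single param group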
- get_lr()[source]
Compute the learning rate using the chainable form of the scheduler. This is invoked internally by step(); use get_last_lr() to query the most recently computed value.
- load_state_dict(state_dict)[source]
Load the scheduler’s state.
- Parameters
  state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().
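A sketch of a checkpoint round trip, assuming the optimizer and scheduler are rebuilt with the same arguments before restoring; the file name is illustrative:
>>> # Save scheduler (and optimizer) state alongside the model.
>>> torch.save({"opt": optimizer.state_dict(), "sched": scheduler.state_dict()}, "ckpt.pt")
>>> # ... later, after recreating optimizer and scheduler with the same arguments:
>>> ckpt = torch.load("ckpt.pt")
>>> optimizer.load_state_dict(ckpt["opt"])
>>> scheduler.load_state_dict(ckpt["sched"])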
- recursive_undo(sched=None)[source]
Recursively undo any step performed by the initialization of schedulers.