ExpStateScheduler

class ignite.handlers.state_param_scheduler.ExpStateScheduler(initial_value, gamma, param_name, save_history=False, create_new=False)[source]

Update a parameter during training using an exponential function. The function decays the parameter value by gamma every step. Based on the closed form of ExponentialLR from PyTorch: https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.ExponentialLR.html
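
The scheduled value therefore follows initial_value * gamma ** t after t completed events. A minimal sketch of that closed form (illustrative only; exp_value is a hypothetical helper, not part of the handler's API):

def exp_value(initial_value, gamma, t):
    # Closed-form exponential decay: value after t completed events.
    return initial_value * gamma ** t

exp_value(1.0, 0.9, 2)  # -> 0.81 (up to floating-point rounding), matching the example below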

Parameters
  • initial_value (float) – Starting value of the parameter.

  • gamma (float) – Multiplicative factor of parameter value decay.

  • param_name (str) – Name of the parameter to update.

  • save_history (bool) – Whether to log the parameter values to engine.state.param_history (default=False).

  • create_new (bool) – Whether to create param_name on engine.state, taking into account whether the param_name attribute already exists. Overrides an existing attribute by default (default=False).

Examples

from ignite.engine import Engine, Events
from ignite.handlers.state_param_scheduler import ExpStateScheduler

# Minimal trainer used only to drive the scheduler in this example.
default_trainer = Engine(lambda engine, batch: None)

param_scheduler = ExpStateScheduler(
    param_name="param", initial_value=1, gamma=0.9, create_new=True
)

# The parameter name is "param", initial_value sets it to 1, and gamma is 0.9.
# Epoch 1, param changes from 1 to 1*0.9, param = 0.9
# Epoch 2, param changes from 0.9 to 0.9*0.9, param = 0.81
# Epoch 3, param changes from 0.81 to 0.81*0.9, param = 0.729
# Epoch 4, param changes from 0.729 to 0.729*0.9, param = 0.6561

param_scheduler.attach(default_trainer, Events.EPOCH_COMPLETED)

@default_trainer.on(Events.EPOCH_COMPLETED)
def print_param():
    print(default_trainer.state.param)

default_trainer.run([0], max_epochs=4)
0.9
0.81
0.7290...
0.6561
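
As noted above, setting save_history=True also logs the scheduled values to engine.state.param_history. A minimal sketch (the history_trainer and history_scheduler names are illustrative; only the existence of engine.state.param_history is assumed from the parameter description above):

history_trainer = Engine(lambda engine, batch: None)
history_scheduler = ExpStateScheduler(
    param_name="param", initial_value=1, gamma=0.9, save_history=True, create_new=True
)
history_scheduler.attach(history_trainer, Events.EPOCH_COMPLETED)
history_trainer.run([0], max_epochs=4)

# Inspect the logged values; the exact layout is whatever the handler
# records under engine.state.param_history.
print(history_trainer.state.param_history)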

New in version 0.4.7.

Methods

get_param

Method to get current parameter values.

get_param()[source]

Method to get current parameter values.

Returns

List of params, or a scalar param.

Return type

Union[List[float], float]
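
For instance, calling get_param() on a freshly constructed scheduler should return the starting value, since no events have advanced it yet (a minimal sketch; the commented result is an expectation, not captured output):

scheduler = ExpStateScheduler(param_name="param", initial_value=1, gamma=0.9, create_new=True)
scheduler.get_param()  # expected: 1, the initial value before any decay steps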