pytorch_lightning_spells.lr_schedulers module
Classes:

- BaseLRScheduler
- CosineAnnealingScheduler
- ExponentialLR: Exponentially increases the learning rate between two boundaries over a number of iterations.
- LinearLR: Linearly increases or decreases the learning rate between two boundaries over a number of iterations.
- MultiStageScheduler
- class pytorch_lightning_spells.lr_schedulers.BaseLRScheduler(optimizer, last_epoch=-1, verbose='deprecated')[source]
Bases: _LRScheduler
- class pytorch_lightning_spells.lr_schedulers.CosineAnnealingScheduler(optimizer, T_max, eta_min=0, last_epoch=-1, verbose='deprecated')[source]
Bases: CosineAnnealingLR, BaseLRScheduler
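This class combines torch's CosineAnnealingLR schedule with the BaseLRScheduler base class. For reference, the cosine annealing curve it inherits from CosineAnnealingLR can be sketched in pure Python (the helper function below is illustrative, not part of the library):

```python
import math

def cosine_annealing_lr(step, T_max, base_lr, eta_min=0.0):
    # Closed-form cosine annealing, as in torch.optim.lr_scheduler.CosineAnnealingLR:
    # the LR falls from base_lr at step 0 to eta_min at step T_max.
    return eta_min + (base_lr - eta_min) * (1 + math.cos(math.pi * step / T_max)) / 2

# The LR starts at base_lr, reaches the midpoint halfway through the
# cycle, and bottoms out at eta_min at step T_max.
print([cosine_annealing_lr(s, 100, 0.1) for s in (0, 50, 100)])
```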
- class pytorch_lightning_spells.lr_schedulers.ExponentialLR(optimizer, min_lr_ratio, total_epochs, last_epoch=-1)[source]
Bases: BaseLRScheduler
Exponentially increases the learning rate between two boundaries over a number of iterations.
Mainly used by LR finders.
- __init__(optimizer, min_lr_ratio, total_epochs, last_epoch=-1)[source]
Initialize a scheduler.
- Parameters:
optimizer (Union[torch.optim.Optimizer, apex.fp16_utils.fp16_optimizer.FP16_Optimizer]) – the optimizer whose learning rate will be scheduled.
min_lr_ratio (float) – min_lr_ratio * base_lr will be the starting learning rate.
total_epochs (int) – the total number of “steps” in this run.
last_epoch (int, optional) – the index of the last epoch. Defaults to -1.
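A minimal sketch of the schedule described above, assuming the multiplier on base_lr grows geometrically from min_lr_ratio to 1 over total_epochs steps (the helper function is hypothetical, for illustration only):

```python
def exponential_lr_multiplier(step, total_epochs, min_lr_ratio):
    # Hypothetical reconstruction: the LR starts at min_lr_ratio * base_lr
    # and rises exponentially until it reaches base_lr at the final step.
    progress = step / total_epochs
    return min_lr_ratio ** (1.0 - progress)

# Typical LR-finder usage: sweep the LR upward over 100 steps and record
# the loss at each step to locate a good learning rate.
base_lr = 0.1
lrs = [base_lr * exponential_lr_multiplier(s, 100, 1e-4) for s in range(101)]
```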
- class pytorch_lightning_spells.lr_schedulers.LinearLR(optimizer, min_lr_ratio, total_epochs, upward=True, last_epoch=-1)[source]
Bases: BaseLRScheduler
Linearly increases or decreases the learning rate between two boundaries over a number of iterations.
- Parameters:
optimizer (Optimizer) –
min_lr_ratio (float) –
total_epochs (float) –
upward (bool) –
last_epoch (int) –
- __init__(optimizer, min_lr_ratio, total_epochs, upward=True, last_epoch=-1)[source]
Initialize a scheduler.
- Parameters:
optimizer (Union[torch.optim.Optimizer, apex.fp16_utils.fp16_optimizer.FP16_Optimizer]) – the optimizer whose learning rate will be scheduled.
min_lr_ratio (float) – min_lr_ratio * base_lr will be the starting learning rate.
total_epochs (float) – the total number of “steps” in this run.
upward (bool) – whether the learning rate goes up or down. Defaults to True.
last_epoch (int) – the index of the last epoch. Defaults to -1.
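The linear schedule can likewise be sketched as a straight-line interpolation of the multiplier between min_lr_ratio and 1 (the helper function is hypothetical, for illustration only):

```python
def linear_lr_multiplier(step, total_epochs, min_lr_ratio, upward=True):
    # Hypothetical reconstruction of the linear schedule: with upward=True the
    # multiplier climbs from min_lr_ratio to 1 (a warmup); with upward=False
    # it descends from 1 to min_lr_ratio (a linear decay).
    progress = step / total_epochs
    if upward:
        return min_lr_ratio + (1.0 - min_lr_ratio) * progress
    return 1.0 - (1.0 - min_lr_ratio) * progress

# Warmup multipliers at the start, middle, and end of a 10-step run.
warmup = [linear_lr_multiplier(s, 10, 0.1) for s in (0, 5, 10)]
```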
- class pytorch_lightning_spells.lr_schedulers.MultiStageScheduler(schedulers, start_at_epochs, last_epoch=-1)[source]
Bases: _LRScheduler
- Parameters:
schedulers (Sequence) –
start_at_epochs (Sequence[int]) –
last_epoch (int) –
- __init__(schedulers, start_at_epochs, last_epoch=-1)[source]
- Parameters:
schedulers (Sequence) –
start_at_epochs (Sequence[int]) –
last_epoch (int) –
- load_state_dict(state_dict)[source]
Loads the scheduler's state.
- Parameters:
state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().
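The constructor arguments suggest the dispatch rule: schedulers[i] becomes active once the current epoch reaches start_at_epochs[i]. A pure-Python sketch of that selection logic, under that assumption (the function is hypothetical, not part of the library):

```python
from bisect import bisect_right

def active_scheduler_index(epoch, start_at_epochs):
    # Assumes start_at_epochs is sorted ascending; scheduler i is then
    # active for epochs in [start_at_epochs[i], start_at_epochs[i + 1]).
    return bisect_right(start_at_epochs, epoch) - 1

# A typical combination: linear warmup for the first 10 epochs, then a
# cosine annealing schedule for the rest of training.
starts = [0, 10]
print(active_scheduler_index(0, starts))   # warmup scheduler (index 0)
print(active_scheduler_index(10, starts))  # cosine scheduler (index 1)
```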