pytorch_lightning_spells.losses module
Classes:
- LabelSmoothCrossEntropy: Cross Entropy with Label Smoothing
- MixupSoftmaxLoss: A softmax loss that supports MixUp augmentation
- Poly1CrossEntropyLoss: Poly-1 Cross-Entropy Loss
- Poly1FocalLoss: Poly-1 Focal Loss
- class pytorch_lightning_spells.losses.LabelSmoothCrossEntropy(eps)[source]
Bases:
Module
Cross Entropy with Label Smoothing
Reference: wangleiofficial/lable-smoothing-pytorch
The ground truth label will have a value of 1-eps in the target vector.
- Parameters:
eps (float) – the smoothing factor.
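To illustrate the smoothing scheme described above, here is a minimal pure-Python sketch of the math (not the module's actual implementation, which operates on batched torch tensors). It assumes one common convention: the true class receives 1 - eps and the remaining probability mass is spread evenly over the other classes.

```python
import math

def smoothed_targets(true_class: int, num_classes: int, eps: float) -> list:
    """Build the smoothed target vector: 1 - eps for the true class,
    eps spread evenly over the remaining classes (one common convention)."""
    off = eps / (num_classes - 1)
    target = [off] * num_classes
    target[true_class] = 1.0 - eps
    return target

def smoothed_cross_entropy(logits: list, true_class: int, eps: float) -> float:
    """Cross entropy against the smoothed target: -sum(t_k * log_softmax(z)_k)."""
    m = max(logits)  # subtract the max for numerical stability
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    log_probs = [x - log_z for x in logits]
    target = smoothed_targets(true_class, len(logits), eps)
    return -sum(t * lp for t, lp in zip(target, log_probs))
```

With eps=0 this reduces to the ordinary cross-entropy loss.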
- class pytorch_lightning_spells.losses.MixupSoftmaxLoss(class_weights=None, reduction='mean', label_smooth_eps=0, poly1_eps=0)[source]
Bases:
Module
A softmax loss that supports MixUp augmentation.
It requires the input batch to be manipulated into certain format. Works best with MixUpCallback, CutMixCallback, and SnapMixCallback.
Reference: Fast.ai’s implementation
- Parameters:
class_weights (torch.Tensor, optional) – The weight of each class. Defaults to the same weight.
reduction (str, optional) – Loss reduction method. Defaults to ‘mean’.
label_smooth_eps (float, optional) – If larger than zero, use LabelSmoothCrossEntropy instead of plain cross entropy. Defaults to 0.
poly1_eps (float, optional) – If larger than zero, add the Poly-1 correction term to the loss. Defaults to 0.
- forward(output, target)[source]
The feed-forward pass.
The target tensor should have three columns:
- the first class,
- the second class,
- the lambda value used to mix the two classes.
- Parameters:
output (torch.Tensor) – the model output.
target (torch.Tensor) – Shaped (batch_size, 3).
- Returns:
the resulting loss
- Return type:
torch.Tensor
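The per-sample computation behind this forward pass can be sketched in pure Python as follows. It assumes the lambda value weights the first class and (1 - lambda) weights the second, matching the (first class, second class, lambda) target layout above; this is an illustrative sketch, not the module's batched tensor code.

```python
import math

def cross_entropy(logits: list, true_class: int) -> float:
    """Plain softmax cross entropy for a single sample."""
    m = max(logits)  # subtract the max for numerical stability
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    return -(logits[true_class] - log_z)

def mixup_loss(logits: list, target_row: tuple) -> float:
    """Mix the losses of the two classes by lambda:
    lam * CE(logits, y1) + (1 - lam) * CE(logits, y2)."""
    y1, y2, lam = target_row
    return lam * cross_entropy(logits, int(y1)) + (1.0 - lam) * cross_entropy(logits, int(y2))
```

When lambda is 1 the loss reduces to the cross entropy of the first class alone, and when lambda is 0 to that of the second class.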
- class pytorch_lightning_spells.losses.Poly1CrossEntropyLoss(epsilon=1.0, reduction='none', weight=None)[source]
Bases:
Module
Poly-1 Cross-Entropy Loss
Adapted from abhuse/polyloss-pytorch.
Reference: PolyLoss: A Polynomial Expansion Perspective of Classification Loss Functions.
- Parameters:
epsilon (float) – the coefficient of the Poly-1 correction term. Defaults to 1.0.
reduction (str) – the loss reduction method ('none', 'mean', or 'sum'). Defaults to 'none'.
weight (Tensor | None) – the weight of each class. Defaults to None.
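The Poly-1 cross entropy from the referenced PolyLoss paper adds a single correction term to the standard loss: CE + epsilon * (1 - p_t), where p_t is the softmax probability of the true class. A minimal single-sample sketch (not the module's batched implementation):

```python
import math

def poly1_cross_entropy(logits: list, true_class: int, epsilon: float = 1.0) -> float:
    """Poly-1 cross entropy: CE + epsilon * (1 - p_t),
    where p_t is the softmax probability of the true class."""
    m = max(logits)  # subtract the max for numerical stability
    z = sum(math.exp(x - m) for x in logits)
    p_t = math.exp(logits[true_class] - m) / z
    ce = -math.log(p_t)
    return ce + epsilon * (1.0 - p_t)
```

Setting epsilon to 0 recovers the plain cross-entropy loss.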
- class pytorch_lightning_spells.losses.Poly1FocalLoss(num_classes, epsilon=1.0, alpha=0.25, gamma=2.0, reduction='none', weight=None, label_is_onehot=False)[source]
Bases:
Module
Poly-1 Focal Loss
Adapted from abhuse/polyloss-pytorch.
Reference: PolyLoss: A Polynomial Expansion Perspective of Classification Loss Functions.
- Parameters:
num_classes (int) – the number of classes.
epsilon (float) – the coefficient of the Poly-1 correction term. Defaults to 1.0.
alpha (float) – the focal loss balancing factor. Defaults to 0.25.
gamma (float) – the focal loss focusing exponent. Defaults to 2.0.
reduction (str) – the loss reduction method ('none', 'mean', or 'sum'). Defaults to 'none'.
weight (Tensor) – the weight of each class. Defaults to None.
label_is_onehot (bool) – whether the labels are one-hot encoded. Defaults to False.
- forward(logits, labels)[source]
The feed-forward pass.
- Parameters:
logits (torch.Tensor) – the output of the neural network, shaped [N, num_classes] or [N, num_classes, …].
labels (torch.Tensor) – the ground truth tensor, shaped [N] or [N, …] with class ids if label_is_onehot is False; otherwise a one-hot encoded tensor with the same shape as logits.
- Returns:
the Poly-1 focal loss
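In the PolyLoss paper, the Poly-1 variant of the focal loss adds the term epsilon * (1 - p_t) ** (gamma + 1) to the focal loss. The per-element sketch below uses the sigmoid (binary, per-class) form of the focal loss; it is an illustration of the formula under those assumptions, not the module's exact batched implementation.

```python
import math

def poly1_focal_loss(logit: float, target: float,
                     epsilon: float = 1.0, alpha: float = 0.25,
                     gamma: float = 2.0) -> float:
    """Per-element sigmoid focal loss plus the Poly-1 term:
    -alpha_t * (1 - p_t) ** gamma * log(p_t) + epsilon * (1 - p_t) ** (gamma + 1).
    target is 0.0 or 1.0 for this element."""
    p = 1.0 / (1.0 + math.exp(-logit))        # sigmoid probability
    p_t = p if target == 1.0 else 1.0 - p     # probability of the true label
    alpha_t = alpha if target == 1.0 else 1.0 - alpha
    fl = -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
    return fl + epsilon * (1.0 - p_t) ** (gamma + 1)
```

With epsilon set to 0 this reduces to the ordinary focal loss, and with gamma also 0 to a weighted binary cross entropy.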