`DiceLoss` has a `from_logits` parameter to handle both logits and probabilities, but `FocalLoss` does not. This creates an inconsistency when using models with a softmax activation and forces awkward workarounds.
```python
model = smp.create_model(..., activation='softmax')
outputs = model(x)  # Probabilities in [0, 1]

# This works
dice_loss = smp.losses.DiceLoss(mode='multiclass', from_logits=False)

# This fails - FocalLoss applies softmax to the output probabilities,
# producing an incorrect loss value
focal_loss = smp.losses.FocalLoss(mode='multiclass')
```

Currently you need to either:
- Remove the model activation and use logits everywhere
- Manually wrap `FocalLoss` to convert probabilities back to logits
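The second workaround relies on the fact that for softmax outputs, `log(p)` differs from the true logits only by a per-pixel additive constant, which softmax cancels out. A minimal sketch of the round trip (the helper name `probs_to_logits` is illustrative, not part of smp):

```python
import torch

def probs_to_logits(probs: torch.Tensor, eps: float = 1e-7) -> torch.Tensor:
    # Clamp to avoid log(0); softmax(log(p)) recovers p up to the clamp.
    return probs.clamp_min(eps).log()

p = torch.softmax(torch.randn(2, 3, 4, 4), dim=1)     # stand-in model probabilities
recovered = torch.softmax(probs_to_logits(p), dim=1)  # round-trip through log
print(torch.allclose(p, recovered, atol=1e-5))        # True
```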
`FocalLoss` should accept a `from_logits` parameter, like `DiceLoss`, for a consistent API.
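Until such a parameter exists, the proposed `from_logits=False` behavior can be emulated with a small adapter module. This is a hypothetical sketch (the class name is invented here, and the log conversion assumes softmax probabilities); it works with any logits-based loss, so `nn.CrossEntropyLoss` is used below as a stand-in:

```python
import torch
import torch.nn as nn

class LossFromProbs(nn.Module):
    """Hypothetical adapter emulating from_logits=False: converts incoming
    probabilities to pseudo-logits before calling a logits-based loss."""

    def __init__(self, base_loss: nn.Module, eps: float = 1e-7):
        super().__init__()
        self.base_loss = base_loss
        self.eps = eps

    def forward(self, probs: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # log(p) is a valid pseudo-logit for softmax-based losses:
        # softmax(log(p)) == p (up to the clamp guarding against log(0)).
        return self.base_loss(probs.clamp_min(self.eps).log(), target)

# Usage with smp would look like:
# focal = LossFromProbs(smp.losses.FocalLoss(mode='multiclass'))
# loss = focal(outputs, target)  # outputs are probabilities
```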
Environment
- segmentation-models-pytorch version: 0.5.0
- PyTorch version: 2.7.1