loss function

A loss function is a scalar objective that measures the discrepancy between a model’s predictions and the target data, providing the signal that guides the parameter updates during training.
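As a minimal sketch of this feedback loop (using NumPy, with hypothetical toy data and a learning rate chosen purely for illustration), the scalar loss below is differentiated with respect to the weight, and its gradient drives each parameter update:

```python
import numpy as np

# Hypothetical toy data: 4 samples, 1 feature, linear target y = 2x.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.0, 4.0, 6.0, 8.0])

w = np.zeros(1)   # model parameter
lr = 0.05         # learning rate

for step in range(100):
    preds = X @ w                     # model predictions
    errors = preds - y                # discrepancy per sample
    loss = np.mean(errors ** 2)       # scalar objective (mean squared error)
    grad = 2 * X.T @ errors / len(y)  # gradient of the loss w.r.t. w
    w -= lr * grad                    # update guided by the loss signal

print(w)  # approaches [2.0]
```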

Common choices include mean squared error for regression, cross-entropy or negative log-likelihood for classification, and margin-based or contrastive objectives for ranking and metric learning.
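For concreteness, here are plain-NumPy versions of two of these objectives; the function names and the small example arrays are illustrative, not part of any particular library:

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    """Average squared discrepancy, typical for regression."""
    return np.mean((y_true - y_pred) ** 2)

def cross_entropy(probs, labels, eps=1e-12):
    """Negative log-likelihood of the true class, typical for classification.

    probs: (n_samples, n_classes) predicted probabilities.
    labels: (n_samples,) integer class indices.
    """
    picked = probs[np.arange(len(labels)), labels]
    return -np.mean(np.log(picked + eps))

# Illustrative values.
print(mean_squared_error(np.array([1.0, 2.0]), np.array([1.5, 1.5])))        # 0.25
print(cross_entropy(np.array([[0.9, 0.1], [0.2, 0.8]]), np.array([0, 1])))   # ~0.16
```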

The training objective often combines a task loss with regularization terms to control model complexity. Properties such as differentiability, calibration behavior, and robustness to outliers influence optimization dynamics and generalization. Variants like class weighting, focal loss, or label smoothing adapt the signal to data imbalance or noise.
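One possible sketch of combining these ideas is shown below; the helper names, the regularization weight, and the smoothing factor are illustrative assumptions. It adds an L2 penalty to a cross-entropy term and softens the hard one-hot targets with label smoothing:

```python
import numpy as np

def smoothed_targets(labels, n_classes, smoothing=0.1):
    """One-hot targets with label smoothing: soften hard 0/1 labels."""
    targets = np.full((len(labels), n_classes), smoothing / (n_classes - 1))
    targets[np.arange(len(labels)), labels] = 1.0 - smoothing
    return targets

def training_objective(probs, labels, weights, l2=1e-3, smoothing=0.1, eps=1e-12):
    """Task loss (smoothed cross-entropy) plus an L2 regularization term."""
    targets = smoothed_targets(labels, probs.shape[1], smoothing)
    task_loss = -np.mean(np.sum(targets * np.log(probs + eps), axis=1))
    reg_term = l2 * np.sum(weights ** 2)
    return task_loss + reg_term

# Illustrative values.
probs = np.array([[0.9, 0.1], [0.2, 0.8]])
labels = np.array([0, 1])
weights = np.array([0.5, -1.2, 0.3])
print(training_objective(probs, labels, weights))
```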


By Leodanis Pozo Ramos • Updated Oct. 21, 2025