
Log Loss

[lɔg lɔs]
Machine Learning
Last updated: December 9, 2024

Definition

A performance metric for classification models whose predictions are probabilities between 0 and 1; lower log loss indicates better-calibrated predictions.

Detailed Explanation

Also known as cross-entropy loss, log loss measures the performance of a model whose output is a probability value between 0 and 1. It increases as the predicted probability diverges from the actual label, and it severely penalizes predictions that are both confident and wrong.
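For a binary label y and predicted probability p, the per-sample loss is -(y·log p + (1-y)·log(1-p)), averaged over all samples. A minimal sketch in plain Python (the clipping constant `eps` is an implementation choice to avoid log(0)):

```python
import math

def log_loss(y_true, y_pred, eps=1e-15):
    """Average binary cross-entropy over a set of predictions.

    Probabilities are clipped to [eps, 1 - eps] so that log(0)
    is never evaluated.
    """
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# A confident correct prediction incurs a small loss,
# while a confident wrong one is penalized heavily.
print(log_loss([1], [0.9]))  # ≈ 0.105
print(log_loss([1], [0.1]))  # ≈ 2.303
```

Note how the same 0.8 gap in probability produces a roughly 20x larger loss when the model is confidently wrong, which is the "severe penalty" described above.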

Use Cases

Common in probabilistic classification tasks such as credit risk assessment, disease probability prediction, and customer conversion prediction, where calibrated probabilities matter more than a hard class label.

Related Terms