Cross-Entropy Loss

[krɔs ˈɛntrəpi lɔs]
Machine Learning
Last updated: December 9, 2024

Definition

A loss function that measures the difference between a model's predicted probability distribution and the actual (target) distribution.

Detailed Explanation

Cross-entropy loss quantifies the difference between two probability distributions: the actual (target) distribution and the distribution predicted by the model. For a single example it is computed as -Σᵢ pᵢ log(qᵢ), where p is the true distribution (often one-hot) and q is the predicted distribution. It is particularly useful for multi-class classification problems and is closely related to log loss. The loss grows as the predicted probability for the correct class diverges from the actual label, and a perfect prediction yields a loss of 0.
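
As a minimal sketch (assuming NumPy is available), the computation for one example with three classes looks like this; the probabilities shown are illustrative values:

import numpy as np

# Actual (one-hot) distribution: class 1 is the correct label
p = np.array([0.0, 1.0, 0.0])
# Predicted distribution, e.g. the output of a softmax layer
q = np.array([0.10, 0.70, 0.20])

eps = 1e-12                          # small constant to avoid log(0)
loss = -np.sum(p * np.log(q + eps))  # cross-entropy: -sum_i p_i * log(q_i)
print(loss)                          # ~0.357; grows as q for the true class shrinks

Because p is one-hot, the sum reduces to -log(q) for the correct class, which is why cross-entropy and log loss coincide in this setting.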

Use Cases

Cross-entropy loss is used in image classification, natural language processing (for example, next-token prediction in language models), and virtually any multi-class classification problem.
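
In practice, deep learning frameworks provide this loss directly. A brief, hedged example using PyTorch's built-in nn.CrossEntropyLoss (assuming the torch package is installed), which expects raw logits and integer class labels and applies the softmax internally:

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()          # combines log-softmax and negative log-likelihood
logits = torch.tensor([[2.0, 0.5, -1.0]])  # raw scores for one sample over 3 classes
target = torch.tensor([0])                 # ground-truth class index
loss = criterion(logits, target)
print(loss.item())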

Related Terms