Definition
A loss function that measures the difference between a model's predicted probability distribution and the true distribution of the labels.
Detailed Explanation
Cross-entropy loss quantifies the difference between two probability distributions: the true distribution of the labels and the distribution predicted by the model. It is particularly useful for multi-class classification, where targets are typically one-hot encoded; in the binary case it reduces to log loss. The loss grows as the probability assigned to the true class diverges from 1, so a confident wrong prediction is penalized much more heavily than an uncertain one. See the sketch below.
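For intuition, here is a minimal NumPy sketch of categorical cross-entropy, averaging -sum_c y_c * log(p_c) over samples; the function name and the toy arrays are illustrative, not taken from any particular library.

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Categorical cross-entropy between one-hot targets and predicted probabilities.

    y_true: (n_samples, n_classes) one-hot encoded true labels
    y_pred: (n_samples, n_classes) predicted class probabilities (rows sum to 1)
    """
    # Clip predictions to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1.0)
    # Mean over samples of -sum_c y_true[c] * log(y_pred[c])
    return float(-np.sum(y_true * np.log(y_pred)) / y_true.shape[0])

# Toy example: the first prediction is confident and correct (low loss),
# the second is confident and wrong (high loss).
y_true = np.array([[0, 1, 0],
                   [1, 0, 0]])
y_pred = np.array([[0.05, 0.90, 0.05],
                   [0.10, 0.80, 0.10]])
print(cross_entropy(y_true, y_pred))
```

Note how the overall loss is dominated by the second row, where the true class received only 0.10 probability.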
Use Cases
Used in image classification, natural language processing tasks such as language modeling and text classification, and virtually any multi-class classification problem.