
Perplexity

[pərˈplɛksɪti]
Natural Language Processing
Last updated: December 9, 2024

Definition

A measurement of how well a probability model predicts a sample, commonly used in language models.

Detailed Explanation

Perplexity is the exponential of the average negative log-likelihood per token. Lower perplexity indicates the model assigns higher probability to the sample, i.e., it predicts the text better. It can be interpreted as a weighted average branching factor: a perplexity of k means the model is, on average, as uncertain as if it were choosing uniformly among k options at each step.
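The definition above can be sketched in a few lines of Python. This is a minimal illustration, assuming we already have the per-token probabilities the model assigned to a sample; real evaluations compute these from model log-probabilities over a corpus.

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-probability per token."""
    n = len(token_probs)
    avg_neg_log_likelihood = -sum(math.log(p) for p in token_probs) / n
    return math.exp(avg_neg_log_likelihood)

# A model that assigns probability 0.25 to each of 4 tokens has
# perplexity 4 -- the "branching factor" interpretation: the model is
# as uncertain as a uniform choice among 4 options at every step.
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # ≈ 4.0
```

In practice, frameworks compute this from summed log-likelihoods rather than raw probabilities to avoid numerical underflow on long sequences.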

Use Cases

Used to evaluate and compare language models, speech recognition systems, and text generation models; a standard intrinsic metric reported alongside benchmark scores.

Related Terms