
BERT

[bɜːt]
Deep Learning
Last updated: December 9, 2024

Definition

A bidirectional Transformer-based language model, introduced by Google in 2018, that substantially advanced the state of the art across natural language processing tasks

Detailed Explanation

BERT (Bidirectional Encoder Representations from Transformers) is pretrained with masked language modeling: a fraction of input tokens is hidden and the model learns to predict them from both the left and right context, producing contextualized word representations. The pretrained model can then be fine-tuned with a small task-specific output layer, and on release it achieved state-of-the-art results on a wide range of NLP benchmarks.
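To make the masking step concrete, here is a minimal, self-contained Python sketch of the token-corruption scheme described in the BERT paper: roughly 15% of tokens are selected for prediction, and of those, 80% are replaced with [MASK], 10% with a random vocabulary token, and 10% are left unchanged. The function name and toy vocabulary are illustrative, not part of any real BERT tokenizer.

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """Sketch of BERT-style masked-language-modeling corruption.

    Each token is selected with probability mask_prob; selected tokens
    become [MASK] 80% of the time, a random vocab token 10% of the
    time, and stay unchanged 10% of the time. Labels record the
    original token at every selected position (None elsewhere).
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)          # model must predict the original token
            r = rng.random()
            if r < 0.8:
                masked.append("[MASK]")
            elif r < 0.9:
                masked.append(rng.choice(vocab))
            else:
                masked.append(tok)      # kept as-is, but still predicted
        else:
            masked.append(tok)
            labels.append(None)         # position not scored in the MLM loss
    return masked, labels
```

Because the model cannot tell which unmasked tokens it will be asked to predict, it is pushed to build useful representations for every position, not just the [MASK] slots.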

Use Cases

Text classification, question answering, named entity recognition
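For tasks like text classification, a small task-specific head is placed on top of the encoder output, typically the vector for the special [CLS] token, and the whole model is fine-tuned. The NumPy sketch below shows only that final step; the [CLS] vector and the head's weights are randomly initialized stand-ins, since a real setup would take them from a pretrained BERT encoder.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden = 768      # hidden size of BERT-base
num_labels = 2    # e.g. positive / negative sentiment

# Stand-in for the pooled [CLS] representation a BERT encoder would produce.
cls_vec = rng.standard_normal(hidden)

# Classification head: a single linear layer followed by softmax.
W = rng.standard_normal((num_labels, hidden)) * 0.02
b = np.zeros(num_labels)
logits = W @ cls_vec + b
probs = np.exp(logits - logits.max())
probs /= probs.sum()   # class probabilities over num_labels
```

Question answering and named entity recognition follow the same pattern but attach their heads to every token position (span start/end scores for QA, per-token entity labels for NER) rather than to [CLS] alone.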

Related Terms