Definition
A bidirectional Transformer-based language model that set new state-of-the-art results across a wide range of natural language processing tasks
Detailed Explanation
BERT (Bidirectional Encoder Representations from Transformers) learns contextualized word representations via masked language modeling: a fraction of input tokens is replaced with a [MASK] token, and the model predicts the originals using both left and right context simultaneously. Pretrained this way on large unlabeled corpora, the encoder can then be fine-tuned with a small task-specific output layer; at release (Devlin et al., 2018), BERT set state-of-the-art results on eleven NLP benchmarks, including GLUE and SQuAD.
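As a minimal sketch of the masked-language-modeling objective, the fill-mask pipeline from the Hugging Face transformers library can query a pretrained BERT checkpoint directly; the checkpoint name "bert-base-uncased" is the standard public model, assumed here for illustration.

```python
from transformers import pipeline

# fill-mask asks the model to predict the token behind [MASK],
# using context from both the left and the right of the gap.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Each prediction carries the candidate token and its probability.
for prediction in unmasker("The capital of France is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```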
Use Cases
Text classification, question answering, and named entity recognition, typically handled by fine-tuning the pretrained encoder with a lightweight task-specific head (see the sketch below)
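To illustrate the question-answering use case, the sketch below runs a BERT checkpoint fine-tuned on SQuAD through the question-answering pipeline; the checkpoint name is one publicly available model, assumed here rather than prescribed by this entry.

```python
from transformers import pipeline

# Assumed public checkpoint: BERT-large fine-tuned on SQuAD for extractive QA.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

# The model extracts the answer span from the supplied context.
result = qa(
    question="What does BERT stand for?",
    context="BERT (Bidirectional Encoder Representations from Transformers) "
            "is a pretrained language model.",
)
print(result["answer"], round(result["score"], 3))
```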