Definition
Dense vector representations of words that capture semantic relationships.
Detailed Explanation
Word embeddings map words to dense, relatively low-dimensional vectors (typically 50–300 dimensions, versus vocabulary-sized one-hot vectors) such that words with similar meanings have similar vector representations. Because meaning is encoded geometrically, vector arithmetic can reflect semantic relationships; the classic example is king − man + woman ≈ queen. Popular methods include Word2Vec, GloVe, and FastText.
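A minimal sketch of the geometric intuition, using hand-picked toy vectors rather than learned embeddings (the 3-dimensional values below are hypothetical, chosen only so the relationships hold; real embeddings are learned from large corpora):

```python
import numpy as np

# Toy 3-dimensional "embeddings" (hypothetical values for illustration only;
# real learned embeddings typically have 50-300 dimensions).
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.2, 0.8]),
    "man":   np.array([0.5, 0.9, 0.1]),
    "woman": np.array([0.5, 0.3, 0.8]),
    "apple": np.array([0.1, 0.1, 0.9]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, lower for dissimilar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words score higher than unrelated ones.
print(cosine(emb["king"], emb["queen"]))  # higher
print(cosine(emb["king"], emb["apple"]))  # lower

# Analogy via vector arithmetic: king - man + woman lands near queen.
target = emb["king"] - emb["man"] + emb["woman"]
best = max((w for w in emb if w != "king"),
           key=lambda w: cosine(emb[w], target))
print(best)  # queen
```

With learned embeddings (e.g., from Word2Vec or GloVe) the same cosine-similarity and nearest-neighbor operations apply, just in higher dimensions and over a full vocabulary.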
Use Cases
Search relevance, recommendation systems, document classification, machine translation