Positional Encoding

[pəˈzɪʃənəl ɛnˈkoʊdɪŋ]
Machine Learning
Last updated: December 9, 2024

Definition

A technique used in transformer models to inject sequence-order information into the attention mechanism, which is otherwise invariant to token order. It works by adding position-dependent signals to the input embeddings.

Detailed Explanation

Positional encoding supplies information about each token's position in a sequence, typically via fixed sinusoidal functions or learned position embeddings that are added to the token embeddings. This is crucial for transformer models, whose self-attention mechanism is permutation-invariant and therefore has no inherent notion of sequence order. With the encoding in place, the model can take both the content and the position of tokens into account when processing sequences.
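
As a concrete illustration, here is a minimal NumPy sketch of the fixed sinusoidal scheme from the original Transformer paper (Vaswani et al., 2017), where even dimensions use sine and odd dimensions use cosine at geometrically spaced frequencies. The function name is a hypothetical helper for this example, and d_model is assumed to be even:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal position signals.

    Assumes d_model is even. Follows PE(pos, 2i)   = sin(pos / 10000^(2i/d_model))
                              and PE(pos, 2i + 1) = cos(pos / 10000^(2i/d_model)).
    """
    positions = np.arange(seq_len)[:, np.newaxis]       # shape (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]      # even indices 2i, shape (1, d_model/2)
    # Wavelengths grow geometrically with the dimension index
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even dimensions: sine
    pe[:, 1::2] = np.cos(angles)  # odd dimensions: cosine
    return pe

# The encoding is simply added elementwise to the token embeddings, e.g.:
# inputs = token_embeddings + sinusoidal_positional_encoding(seq_len, d_model)
```

Because the signal is added rather than concatenated, the model sees a single vector per token that blends content and position; learned position embeddings follow the same additive pattern but train the table instead of computing it.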

Use Cases

Transformer models
Language processing
Speech recognition
Sequential data processing

Related Terms