Mixture of Experts (MoE)

[ˈmɪkstʃər əv ˈɛkspɜrts ɛm oʊ i]
Machine Learning
Last updated: April 4, 2025

Definition

An ML architecture that combines multiple specialized sub-models (experts) with a gating mechanism that routes each input to the most relevant expert(s).

Detailed Explanation

A machine learning architecture, used especially in large models, that combines multiple specialized sub-models ('experts') with a learned gating network that routes each input to the most relevant expert(s). Because only a small subset of experts is activated for any given input (sparse activation), parameter count and model capacity can grow substantially while the compute cost per input stays roughly constant. The gate and the experts are trained jointly, so individual experts tend to specialize in different regions of the input space.

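To make the routing concrete, here is a minimal sketch of a top-k gated MoE layer in PyTorch. It is illustrative only; names such as MoELayer, num_experts, and top_k are assumptions rather than references to any particular implementation. A small gating network scores every expert for each input, only the k highest-scoring experts process that input, and their outputs are combined with the renormalized gate weights.

```python
# Minimal top-k gated Mixture-of-Experts layer (illustrative sketch).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    def __init__(self, dim: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        # Each expert is a small feed-forward sub-network.
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim))
             for _ in range(num_experts)]
        )
        # The gate scores every expert for each input.
        self.gate = nn.Linear(dim, num_experts)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, dim). The gate produces a routing distribution over experts.
        scores = F.softmax(self.gate(x), dim=-1)              # (batch, num_experts)
        weights, indices = torch.topk(scores, self.top_k)     # keep only the top-k experts
        weights = weights / weights.sum(dim=-1, keepdim=True) # renormalize the kept weights

        out = torch.zeros_like(x)
        # Sparse activation: each input is processed only by its selected experts.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out


# Example: route a batch of 8 input vectors through 4 experts, 2 experts per input.
x = torch.randn(8, 64)
layer = MoELayer(dim=64, num_experts=4, top_k=2)
print(layer(x).shape)  # torch.Size([8, 64])
```

Production MoE systems typically also add an auxiliary load-balancing loss so the gate does not collapse onto a few experts; that detail is omitted here for brevity.
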
Use Cases

Large language models (LLMs), sparse activation models, improving computational efficiency, scaling model capacity, handling diverse datasets.

Related Terms