
Dropout

[ˈdrɒpaʊt]
Machine Learning
Last updated: December 9, 2024

Definition

A regularization technique that randomly deactivates neurons during training to prevent overfitting.

Detailed Explanation

Dropout works by randomly 'dropping out' (setting to zero) a proportion of neuron activations and their connections during training. This prevents neurons from co-adapting too strongly and forces the network to learn more robust, redundant features. During inference, all neurons are active, but their outputs are scaled to match the expected activation magnitude seen during training, based on the dropout rate.
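The behavior described above can be sketched in a few lines of NumPy. This is a minimal illustration of classic dropout, not a production implementation; the function name and defaults are illustrative. During training each activation is kept with probability 1 − rate; at inference all activations are kept but scaled by 1 − rate:

```python
import numpy as np

def dropout_forward(x, rate=0.5, training=True, rng=None):
    """Classic dropout: zero each activation with probability `rate`
    during training; at inference keep all units and scale outputs
    by (1 - rate) so expected magnitudes match training."""
    if training:
        rng = rng if rng is not None else np.random.default_rng()
        # mask is 1 (keep) with probability 1 - rate, else 0 (drop)
        mask = (rng.random(x.shape) >= rate).astype(x.dtype)
        return x * mask
    # inference: no units dropped, outputs scaled down instead
    return x * (1.0 - rate)
```

Note that modern frameworks usually implement "inverted" dropout instead, dividing the kept activations by 1 − rate during training so that no scaling is needed at inference; the two variants are equivalent in expectation.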

Use Cases

Deep neural network training, Preventing overfitting, Ensemble learning approximation

Related Terms