Dropout Regularization

[ˈdrɒpaʊt ˌrɛɡjʊləraɪˈzeɪʃən]
Deep Learning
Last updated: December 9, 2024

Definition

A regularization technique that prevents overfitting by randomly deactivating a fraction of neurons during training

Detailed Explanation

During each training step, dropout temporarily removes a random subset of units from the network: each unit is zeroed with some probability p (commonly between 0.2 and 0.5). Because the network can never rely on any single unit being present, it is forced to learn redundant representations, which prevents co-adaptation of neurons, improves generalization, and reduces overfitting. In the widely used "inverted dropout" variant, the surviving activations are scaled by 1/(1−p) during training so that no rescaling is needed at inference time, when all units are active.
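A minimal NumPy sketch of the inverted-dropout variant described above; the function name and parameters are illustrative, not from any particular library:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p during
    training and scale survivors by 1/(1-p); identity at inference."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p   # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)       # rescale so the expected output matches the input

# Example: apply dropout to a batch of hidden activations
activations = np.ones((2, 4))
print(dropout(activations, p=0.5, training=True))   # roughly half the units zeroed
print(dropout(activations, p=0.5, training=False))  # unchanged at inference
```

The 1/(1−p) rescaling keeps the expected magnitude of the activations the same in both modes, which is why frameworks such as PyTorch and TensorFlow implement dropout this way and simply disable it at evaluation time.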

Use Cases

Training robust neural networks, preventing overfitting

Related Terms