Definition
A regularization technique that prevents overfitting by randomly deactivating neurons during training
Detailed Explanation
Dropout temporarily removes units from the network during training: on each forward pass, every unit is zeroed out independently with some probability p (the dropout rate). Because no unit can rely on any particular other unit being present, the network is forced to learn redundant representations, which prevents co-adaptation of neurons. At inference time all units are kept active, and activations are scaled so their expected magnitude matches training (or, in the common "inverted dropout" variant, the scaling is applied during training instead). This improves generalization and reduces overfitting.
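A minimal sketch of the inverted-dropout variant in NumPy; the function name `dropout` and the rate `p=0.5` are illustrative choices, not from the source. Each unit is zeroed with probability p during training, and survivors are scaled by 1/(1-p) so no rescaling is needed at inference:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training,
    scaling survivors by 1/(1-p) so expected activations match inference."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    # Bernoulli keep-mask: 1 with probability (1 - p), 0 with probability p
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

# Example: a layer's activations during training vs. inference
rng = np.random.default_rng(0)
activations = np.ones((2, 4))
print(dropout(activations, p=0.5, training=True, rng=rng))  # some units zeroed, survivors scaled to 2.0
print(dropout(activations, p=0.5, training=False))          # unchanged at inference
```

In frameworks such as PyTorch this behavior is provided by built-in layers (e.g. `torch.nn.Dropout`), which are active in training mode and become identity functions in evaluation mode.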
Use Cases
Training robust neural networks; reducing overfitting, particularly in large fully connected layers or when training data is limited