Definition
A regularization technique that stops training when the model's performance on a validation set starts to degrade.
Detailed Explanation
Early stopping monitors a chosen metric on a held-out validation set, typically the validation loss, at regular intervals during training (usually after each epoch). When the metric stops improving for a configurable number of consecutive checks, often called the patience, training is halted; implementations commonly also restore the model weights from the best-performing epoch. This prevents overfitting by ending training near the point where the model stops learning generalizable patterns and begins to fit noise in the training data.
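To make the monitoring logic concrete, below is a minimal sketch in Python. The EarlyStopping class, its patience and min_delta parameters, and the synthetic U-shaped loss curve are illustrative assumptions for this sketch, not any particular library's API.

```python
import math

class EarlyStopping:
    """Tracks validation loss and signals when training should stop."""

    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience      # epochs to wait after the last improvement
        self.min_delta = min_delta    # minimum decrease that counts as improvement
        self.best_loss = math.inf
        self.best_epoch = 0
        self.epochs_without_improvement = 0

    def should_stop(self, epoch, val_loss):
        if val_loss < self.best_loss - self.min_delta:
            # New best validation loss: record it and reset the patience counter.
            self.best_loss = val_loss
            self.best_epoch = epoch
            self.epochs_without_improvement = 0
        else:
            self.epochs_without_improvement += 1
        return self.epochs_without_improvement >= self.patience

# Demo with a synthetic validation-loss curve that bottoms out at
# epoch 10 and then rises again (the overfitting regime).
stopper = EarlyStopping(patience=3)
for epoch in range(100):
    val_loss = (epoch - 10) ** 2 / 100 + 0.5  # stand-in for a real validation pass
    if stopper.should_stop(epoch, val_loss):
        print(f"Stopped at epoch {epoch}; best was epoch {stopper.best_epoch} "
              f"with validation loss {stopper.best_loss:.3f}")
        break
```

In a real training loop, val_loss would come from evaluating the model on the validation set each epoch, and the best-epoch checkpoint would be saved so it can be restored after stopping.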
Use Cases
Neural network training, model optimization, preventing overfitting