Definition
A regularization method that adds the squared magnitude of the weights, scaled by a coefficient, as a penalty term to the loss function, discouraging large weight values.
Detailed Explanation
L2 regularization (also known as Ridge regularization, or weight decay in neural networks) adds the sum of squared weights, scaled by a coefficient λ, to the loss function: L_total = L + λ Σᵢ wᵢ². This penalizes large weights, shrinking them toward zero (though, unlike L1, rarely to exactly zero) and spreading weight values more evenly across correlated features. As a result, no single feature has a disproportionately large impact on the model's predictions.
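A minimal sketch of the idea in NumPy, assuming a mean-squared-error base loss and plain gradient descent (the function names, λ value, and toy data here are illustrative, not from any particular library):

```python
import numpy as np

def ridge_loss(w, X, y, lam=0.1):
    # Base MSE loss plus the L2 penalty lam * sum(w_i^2)
    residual = X @ w - y
    return np.mean(residual ** 2) + lam * np.sum(w ** 2)

def ridge_gradient_step(w, X, y, lam=0.1, lr=0.05):
    n = len(y)
    grad_mse = (2.0 / n) * X.T @ (X @ w - y)   # gradient of the MSE term
    grad_penalty = 2.0 * lam * w               # gradient of lam * sum(w_i^2)
    return w - lr * (grad_mse + grad_penalty)

# Toy data: y depends only on the first of three features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=100)

w = np.zeros(3)
for _ in range(500):
    w = ridge_gradient_step(w, X, y)
```

Note that the penalty gradient 2λw simply shrinks each weight proportionally at every step, which is why L2 regularization is equivalent to weight decay under plain gradient descent: the learned coefficient ends up slightly below the true value of 3.0.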
Use Cases
Neural network training (as weight decay), preventing overfitting, linear regression regularization (Ridge regression)