Momentum Optimization

[məˈmɛntəm ˌɒptɪmaɪˈzeɪʃən]
Machine Learning
Last updated: December 9, 2024

Definition

An optimization technique that accelerates gradient descent by adding a fraction of the previous update to the current update.

Detailed Explanation

Momentum optimization accumulates a velocity vector in directions of persistent reduction of the objective function across iterations. It adds a fraction (the momentum coefficient) of the previous update vector to the current update, helping the optimizer traverse ravines, damp oscillations, and escape shallow local minima more effectively than plain gradient descent.
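The explanation above can be sketched as a short update loop. This is a minimal illustration of the classical momentum rule (v ← βv − η∇f, θ ← θ + v); the function names and the choices of learning rate and momentum coefficient are illustrative assumptions, not values prescribed by this entry:

```python
import numpy as np

def momentum_gd(grad, theta, lr=0.1, beta=0.9, steps=100):
    """Gradient descent with momentum.

    grad: function returning the gradient at theta
    lr:   learning rate (eta) -- assumed value for illustration
    beta: momentum coefficient -- fraction of the past update retained
    """
    v = np.zeros_like(theta)  # velocity vector, accumulated across iterations
    for _ in range(steps):
        v = beta * v - lr * grad(theta)  # past update scaled by beta, plus current step
        theta = theta + v                # apply the accumulated update
    return theta

# Example: minimize f(x) = x^2, whose gradient is 2x; the minimum is at 0.
x = momentum_gd(lambda t: 2.0 * t, np.array([5.0]))
```

On a curved objective, the velocity term builds speed along directions where successive gradients agree and cancels oscillation along directions where they alternate, which is why momentum navigates ravines better than plain gradient descent.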

Use Cases

Neural network training, Deep learning optimization, Gradient descent acceleration

Related Terms