Definition
An optimization technique that accelerates gradient descent by adding a fraction of the previous update to the current update.
Detailed Explanation
Momentum optimization accumulates a velocity vector in directions of persistent reduction in the objective function across iterations. It adds a fraction (the momentum coefficient) of the previous time step's update vector to the current update, helping the optimizer traverse ravines, dampen oscillations, and escape shallow local minima more effectively.
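With velocity v, momentum coefficient beta, and learning rate eta, the update rule is v <- beta*v + eta*grad f(theta), followed by theta <- theta - v. Below is a minimal sketch in Python (NumPy) on a toy ill-conditioned quadratic whose elongated contours create the "ravine" geometry described above; the objective and the hyperparameter values are illustrative assumptions, not taken from this entry.

import numpy as np

def grad(theta):
    # Gradient of f(theta) = 0.5 * theta^T A theta, where A is an
    # ill-conditioned diagonal matrix (toy example, assumed here).
    A = np.diag([1.0, 25.0])
    return A @ theta

theta = np.array([5.0, 5.0])    # initial parameters
velocity = np.zeros_like(theta)
lr, beta = 0.03, 0.9            # illustrative learning rate and momentum coefficient

for step in range(200):
    # Add a fraction (beta) of the previous update to the current one,
    # accumulating velocity along directions of persistent descent.
    velocity = beta * velocity + lr * grad(theta)
    theta = theta - velocity

print(theta)  # approaches the minimum at the origin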
Use Cases
Neural network training, Deep learning optimization, Gradient descent acceleration