TAAFT

Catastrophic Forgetting

[ˌkætəˈstrɒfɪk fərˈɡɛtɪŋ]
Machine Learning
Last updated: December 9, 2024

Definition

A phenomenon where neural networks rapidly forget previously learned information when trained on new tasks.

Detailed Explanation

Catastrophic forgetting occurs when a neural network overwrites previously learned parameters while adapting to new data or tasks. It happens because neural networks typically learn through weight updates that optimize for the current training data, potentially disrupting the representations learned for earlier tasks. Various techniques, such as elastic weight consolidation, progressive neural networks, and replay buffers, have been developed to mitigate this issue.
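Of the mitigation techniques mentioned above, replay buffers are the simplest to illustrate: a fixed-size store of examples from earlier tasks is mixed into each new-task training batch so the network keeps seeing old data. The sketch below shows the idea only, not any particular library's API; the class and names are illustrative.

```python
import random
from collections import deque

class ReplayBuffer:
    """Fixed-size store of past-task examples (illustrative sketch).

    Interleaving replayed samples with new-task data is one common way
    to reduce catastrophic forgetting during continual learning.
    """

    def __init__(self, capacity=1000):
        # deque with maxlen evicts the oldest entries automatically
        self.buffer = deque(maxlen=capacity)

    def add(self, example):
        self.buffer.append(example)

    def sample(self, k):
        # Draw up to k stored examples to mix into the next batch
        return random.sample(list(self.buffer), min(k, len(self.buffer)))

# Usage: store task-A examples, then blend them into task-B batches
buf = ReplayBuffer(capacity=3)
for x in ["a1", "a2", "a3", "a4"]:   # "a1" is evicted at capacity 3
    buf.add(x)
mixed_batch = ["b1", "b2"] + buf.sample(2)
```

In a real training loop, the replayed examples would contribute to the loss alongside the new task's data, so the gradient updates are pulled toward solutions that still fit the old distribution.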

Use Cases

Continual learning systems
Autonomous agents
Adaptive AI systems
Multi-task learning applications

Related Terms