Batch Normalization

[bætʃ ˌnɔːməlaɪˈzeɪʃən]
Deep Learning
Last updated: December 9, 2024

Definition

A technique that normalizes the inputs to each layer of a neural network to improve training stability and speed

Detailed Explanation

Batch normalization standardizes the activations of each layer to zero mean and unit variance over each mini-batch, then rescales them with learned scale and shift parameters. This reduces internal covariate shift, allows higher learning rates, and acts as a regularizer.
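A minimal sketch of the training-time computation, assuming 2-D activations of shape (batch, features); the names batch_norm, gamma, and beta are illustrative, and the running statistics used at inference time are omitted:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Training-time batch norm over a mini-batch of shape (batch, features)."""
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # standardize: zero mean, unit variance
    return gamma * x_hat + beta              # learned scale (gamma) and shift (beta)

# Activations with mean ~5 and std ~3 come out near mean 0, std 1 per feature
x = np.random.randn(32, 4) * 3.0 + 5.0
y = batch_norm(x, np.ones(4), np.zeros(4))
print(y.mean(axis=0).round(3), y.std(axis=0).round(3))
```

The small eps term guards against division by zero when a feature's batch variance is near zero.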

Use Cases

Training deep neural networks, accelerating training
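In practice, frameworks provide batch normalization as a built-in layer; a minimal PyTorch sketch (layer sizes here are arbitrary) places it between a linear layer and its activation:

```python
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),  # normalizes the 256 activations over each mini-batch
    nn.ReLU(),
    nn.Linear(256, 10),
)
```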

Related Terms