222. Batch Normalization

A technique that standardizes the inputs to each layer of a neural network using the mean and variance of the current mini-batch, followed by a learnable scale and shift. It improves training speed and stability by reducing shifts in the distribution of layer inputs during training.
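A minimal sketch of the training-time forward pass, using NumPy; the function name `batch_norm` and the example data are illustrative, not from any particular library:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature (column) using the mini-batch mean and
    # variance, then apply the learnable scale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Example: a mini-batch of 4 samples with 3 features each
x = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.0],
              [4.0, 8.0, 12.0]])
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
# Each column of `out` now has approximately zero mean and unit variance.
```

At inference time, frameworks typically replace the mini-batch statistics with running averages of the mean and variance collected during training, so single examples can be processed deterministically.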
