91. Learning Rate Decay

  • A technique that gradually reduces the learning rate during training to help the model converge more smoothly and avoid overshooting minima; a short sketch follows below.
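
  A minimal sketch of two common decay schedules (step decay and exponential decay) in plain Python; the function names, signatures, and hyperparameter values are illustrative assumptions, not part of this entry.

```python
import math

def step_decay(lr0, epoch, drop_every=10, factor=0.5):
    """Drop the learning rate by `factor` every `drop_every` epochs."""
    return lr0 * factor ** (epoch // drop_every)

def exponential_decay(lr0, epoch, k=0.05):
    """Shrink the learning rate smoothly: lr = lr0 * e^(-k * epoch)."""
    return lr0 * math.exp(-k * epoch)

# Example: how an initial rate of 0.1 decays over training.
for epoch in (0, 10, 20, 30):
    print(f"epoch {epoch:2d}: step={step_decay(0.1, epoch):.4f}, "
          f"exp={exponential_decay(0.1, epoch):.4f}")
```

  Step decay changes the rate in discrete jumps, while exponential decay shrinks it continuously; both shrink step sizes near the end of training so updates settle into a minimum rather than bouncing around it.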