Problems that occur during neural network training when gradients become too small (vanishing) or too large (exploding), hindering effective learning. They arise because backpropagation computes the gradient at each layer as a product of per-layer Jacobians, so if those Jacobians consistently shrink or amplify the signal, the gradient decays or grows roughly exponentially with network depth.
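The effect can be seen directly by multiplying a gradient vector through a stack of random Jacobians. The following is a minimal sketch, not taken from any particular framework; `scale`, `depth`, `dim`, and `gradient_norm_through_depth` are illustrative names, with `scale` standing in for the typical singular value of a layer's Jacobian.

```python
import numpy as np

rng = np.random.default_rng(0)
depth = 50  # number of layers the gradient flows back through
dim = 64    # width of each layer


def gradient_norm_through_depth(scale: float) -> float:
    """Push a unit-ish gradient back through `depth` random Jacobians.

    Each Jacobian has entries drawn so its typical singular value is
    about `scale`; the gradient norm is therefore multiplied by roughly
    `scale` at every layer, i.e. scale**depth overall.
    """
    grad = np.ones(dim)
    for _ in range(depth):
        jacobian = scale * rng.standard_normal((dim, dim)) / np.sqrt(dim)
        grad = jacobian.T @ grad  # one step of backpropagation
    return float(np.linalg.norm(grad))


print("vanishing (scale=0.5):", gradient_norm_through_depth(0.5))  # ~1e-15
print("stable    (scale=1.0):", gradient_norm_through_depth(1.0))  # order 1
print("exploding (scale=1.5):", gradient_norm_through_depth(1.5))  # ~1e9
```

Common mitigations include careful weight initialization, normalization layers, residual connections, and gradient clipping.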