100. Weight Sharing

  1. A technique in neural networks where different parts of the network use the same weights, for example a convolutional filter applied at every spatial position or a recurrent cell applied at every time step. This reduces the total number of parameters and promotes generalization; see the sketch below.
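
A minimal sketch of weight sharing, assuming PyTorch; the model name `TiedStepModel` and its dimensions are hypothetical. One `nn.Linear` layer is reused at every time step, so the parameter count stays fixed no matter how long the input sequence is.

```python
import torch
import torch.nn as nn

class TiedStepModel(nn.Module):
    """Hypothetical example: one Linear layer shared across all time steps."""
    def __init__(self, dim):
        super().__init__()
        # A single layer reused at every step: all steps share
        # the same weight matrix and bias (weight sharing).
        self.step = nn.Linear(dim, dim)

    def forward(self, x):  # x: (batch, time, dim)
        h = torch.zeros_like(x[:, 0])
        for t in range(x.size(1)):
            # The same self.step weights are applied at each time step.
            h = torch.tanh(self.step(x[:, t] + h))
        return h

model = TiedStepModel(dim=8)
# Parameter count is independent of sequence length: 8*8 weights + 8 biases = 72.
print(sum(p.numel() for p in model.parameters()))  # 72
x = torch.randn(4, 10, 8)
print(model(x).shape)  # torch.Size([4, 8])
```

The same principle underlies convolutions, where one kernel's weights are shared across all spatial positions instead of across time steps.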
