100. Weight Sharing
A technique in neural networks where different parts of the network share the same weights, reducing the total number of parameters and promoting generalization.
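Convolutional layers are the canonical case: one small kernel is reused at every input position instead of learning a separate weight per input-output pair. A minimal NumPy sketch (illustrative names and sizes, not from any specific framework) comparing parameter counts:

```python
import numpy as np

# Weight sharing via a 1-D convolution: the SAME 3-weight kernel is
# applied at every position of the input, so only 3 parameters are
# learned regardless of the input length.
rng = np.random.default_rng(0)
x = rng.standard_normal(10)       # input signal of length 10
kernel = rng.standard_normal(3)   # 3 shared weights

# Slide the shared kernel over the 8 valid positions.
conv_out = np.array([x[i:i + 3] @ kernel for i in range(len(x) - 2)])

# Parameter counts for the same input/output sizes:
shared_params = kernel.size            # 3 (shared across positions)
dense_params = len(x) * len(conv_out)  # 80 (one weight per connection)
print(shared_params, dense_params)     # 3 80
```

A fully connected layer mapping the same 10 inputs to 8 outputs would need 80 weights; sharing the kernel cuts this to 3, which is the parameter reduction the definition describes.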