238. ReLU (Rectified Linear Unit)
An activation function defined as the positive part of its input, f(x) = max(0, x), commonly used in deep learning models because it is cheap to compute and mitigates vanishing gradients for positive inputs.
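A minimal sketch of the function, using NumPy's element-wise maximum (the helper name `relu` is illustrative):

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: returns the positive part of x, element-wise."""
    return np.maximum(0, x)

# Negative inputs are clamped to zero; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
```

For inputs below zero the output (and hence the gradient) is exactly zero, which is why variants such as Leaky ReLU exist.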