ReLU vs Sigmoid vs Softmax

ReLU (Rectified Linear Unit): $$y = \max(0, x)$$
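
A minimal sketch of the formula above (NumPy is not mentioned in the post; it is assumed here just for illustration):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negatives are clipped to 0, positives pass through.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# -> [0.  0.  0.  1.5 3. ]
```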

Sigmoid: $$y(x)= \frac{1}{1+e^{-x}}$$
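
The same kind of sketch for the sigmoid (again assuming NumPy for illustration):

```python
import numpy as np

def sigmoid(x):
    # 1 / (1 + e^(-x)): squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-4.0, 0.0, 4.0])))
# -> roughly [0.018 0.5 0.982]
```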

Softmax: $$y(z)_j = \frac{e^{z_j}}{\sum_{k=1}^{K} e^{z_k}}$$ for $$j = 1, 2, \ldots, K$$
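
A sketch of softmax over a single vector of scores (the max-subtraction is a common numerical-stability trick, not part of the formula; it does not change the result because softmax is invariant to adding a constant to every $$z_k$$):

```python
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability, then normalize the exponentials.
    e = np.exp(z - np.max(z))
    return e / e.sum()

p = softmax(np.array([2.0, 1.0, 0.1]))
print(p, p.sum())
# -> roughly [0.659 0.242 0.099], summing to 1.0
```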

ReLU and Sigmoid are typically used in hidden layers, while Softmax is used in the last layer, where it normalizes the outputs into a probability distribution over the K classes.
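
Putting the pieces together, here is a toy forward pass in the spirit of that split (the layer sizes and random weights below are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=4)           # assumed 4 input features
W1 = rng.normal(size=(8, 4))     # assumed hidden layer of 8 units
b1 = np.zeros(8)
W2 = rng.normal(size=(3, 8))     # assumed 3 output classes
b2 = np.zeros(3)

h = np.maximum(0, W1 @ x + b1)   # ReLU in the hidden layer
logits = W2 @ h + b2
e = np.exp(logits - logits.max())
probs = e / e.sum()              # softmax on the last layer

print(probs, probs.sum())        # class probabilities summing to 1
```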
