ReLU vs Sigmoid vs Softmax

ReLU: Rectified Linear Unit; f(x) = max(0, x). Outputs the input if it is positive, otherwise zero.

Sigmoid: σ(x) = 1 / (1 + e^(-x)). Squashes any real input into the range (0, 1).

Softmax: softmax(x_i) = e^(x_i) / Σ_j e^(x_j), for turning a vector of scores into a probability distribution that sums to 1.

ReLU and sigmoid are typically used in hidden layers. Softmax is used in the last layer, where it normalizes the outputs so they form a probability distribution over the classes.
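The three activations above can be sketched in a few lines of NumPy (a minimal illustration, not a framework implementation; the `softmax` here subtracts the max for numerical stability, a common trick not mentioned above):

```python
import numpy as np

def relu(x):
    # ReLU: keep positive values, clamp negatives to zero
    return np.maximum(0, x)

def sigmoid(x):
    # Sigmoid: squash each value independently into (0, 1)
    return 1 / (1 + np.exp(-x))

def softmax(x):
    # Softmax: exponentiate and normalize so outputs sum to 1
    e = np.exp(x - np.max(x))  # subtract max to avoid overflow
    return e / e.sum()

x = np.array([-1.0, 0.0, 2.0])
print(relu(x))     # [0. 0. 2.]
print(sigmoid(x))  # each value between 0 and 1
print(softmax(x))  # values sum to 1
```

Note that `relu` and `sigmoid` act elementwise, while `softmax` couples all the outputs together, which is why it belongs in the final layer.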
