# ReLU vs Sigmoid vs Softmax

ReLU (Rectified Linear Unit): $\mathrm{ReLU}(x) = \max(0, x)$

Sigmoid: $\sigma(x) = \dfrac{1}{1 + e^{-x}}$

Softmax: $\mathrm{softmax}(x)_i = \dfrac{e^{x_i}}{\sum_{j=1}^{K} e^{x_j}}$ for $i = 1, \dots, K$

ReLU and Sigmoid are typically used in hidden layers. Softmax is used in the last layer, where it normalizes a group of outputs into a probability distribution that sums to 1.
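
The three activations can be sketched in NumPy as follows (a minimal illustration; the function names and the max-subtraction trick in softmax are my own choices, the latter being a common guard against overflow):

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x), applied elementwise
    return np.maximum(0.0, x)

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^{-x}), squashes each value into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Subtracting the max does not change the result but avoids overflow in exp
    z = np.exp(x - np.max(x))
    return z / z.sum()

logits = np.array([2.0, 1.0, 0.1])
print(relu(np.array([-1.0, 0.5])))  # negatives clipped to 0
print(sigmoid(0.0))                 # midpoint of the sigmoid
print(softmax(logits))              # nonnegative, sums to 1
```

Note that softmax outputs depend on the whole input vector (it normalizes the group), whereas ReLU and sigmoid act on each value independently.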