Softmax vs softmax_cross_entropy_with_logits

Cross entropy means

$$ H = -\sum_i y'_i \log(y_i) $$

where y is the predicted probability distribution (the output of the softmax) and y' is the true distribution (the one-hot labels).
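Computing the softmax yourself and then applying this formula looks roughly like the following sketch, assuming TensorFlow 2; the logits and labels tensors are made-up examples:

```python
import tensorflow as tf

# Made-up logits and one-hot labels for a 3-class problem.
logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 2.5, 0.2]])
labels = tf.constant([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])

# Manual route: softmax first, then H = -sum_i y'_i * log(y_i).
y = tf.nn.softmax(logits)
manual_loss = -tf.reduce_sum(labels * tf.math.log(y), axis=1)
```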

is equivalent to the fused op, tf.nn.softmax_cross_entropy_with_logits, applied directly to the raw logits:
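Roughly, reusing the made-up tensors above:

```python
# Fused route: one op that takes the raw (unnormalized) logits.
fused_loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels,
                                                     logits=logits)

# Both give the same per-example losses, up to floating-point error.
print(manual_loss.numpy())
print(fused_loss.numpy())
```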

However, softmax_cross_entropy_with_logits is more numerically stable. Because it receives the raw logits, it can fold the softmax and the log into a single log-sum-exp computation instead of exponentiating, normalizing, and then taking the log; with large logits, the manual route underflows the softmax to exact zeros, log(0) blows up to -inf, and the loss comes out as inf, while the fused op still returns a finite value.
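A quick sketch of that failure mode, again with made-up numbers:

```python
# With an extreme logit the manual route breaks down.
big_logits = tf.constant([[1000.0, 0.0]])
big_labels = tf.constant([[0.0, 1.0]])

# tf.nn.softmax itself is computed stably, but the unlikely class
# underflows to exactly 0.0, so log(0) = -inf and the loss is inf.
manual_big = -tf.reduce_sum(
    big_labels * tf.math.log(tf.nn.softmax(big_logits)), axis=1)

# The fused op uses log-sum-exp internally and stays finite.
fused_big = tf.nn.softmax_cross_entropy_with_logits(labels=big_labels,
                                                    logits=big_logits)

print(manual_big.numpy())  # [inf]
print(fused_big.numpy())   # [1000.] -- finite
```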
