**Mean Squared Error vs Cross entropy loss function**
https://vitalflux.com/mean-squared-error-vs-cross-entropy-loss-function/

Aug 31, 2021 · What is cross-entropy loss? Cross-entropy loss is used in classification tasks to measure the difference between the probability distribution a model predicts over the classes and the true distribution of the labels. Training minimizes this loss, which is equivalent to maximizing the log-likelihood of the training data. Simply speaking, the closer the model's predicted class probabilities are to the true labels, the smaller the cross-entropy loss.
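A minimal sketch of the formula behind that definition, H(p, q) = −Σᵢ pᵢ log qᵢ, in plain Python (the function name and `eps` guard are illustrative, not from the article):

```python
import math

def cross_entropy(true_dist, pred_dist, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_i p_i * log(q_i).

    true_dist: true class probabilities (often a one-hot vector)
    pred_dist: model-predicted probabilities (should sum to 1)
    eps guards against log(0) for zero predicted probabilities.
    """
    return -sum(p * math.log(q + eps) for p, q in zip(true_dist, pred_dist))

# True label is class 1 (one-hot). A confident correct prediction
# gives a small loss; a confident wrong prediction a large one.
low_loss = cross_entropy([0, 1, 0], [0.05, 0.90, 0.05])   # ≈ 0.105
high_loss = cross_entropy([0, 1, 0], [0.90, 0.05, 0.05])  # ≈ 3.0
```

With a one-hot true distribution, the sum collapses to −log of the probability the model assigned to the correct class, which is why confidently wrong predictions are penalized so heavily.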
