Cross-Entropy Loss

Cross-entropy loss is the standard loss function for classification problems.

The cross-entropy between a “true” distribution $p$ and an estimated distribution $q$ is defined as:

$$H(p, q) = -\sum_x p(x) \log q(x)$$
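As a quick sketch of this definition in NumPy (the two distributions here are made-up example values):

```python
import numpy as np

# Hypothetical example: a one-hot "true" distribution p and an
# estimated distribution q over three classes
p = np.array([0.0, 1.0, 0.0])
q = np.array([0.2, 0.7, 0.1])

# H(p, q) = -sum_x p(x) log q(x); with a one-hot p, only the
# correct class's term survives in the sum
H = -np.sum(p * np.log(q + 1e-12))  # tiny epsilon guards against log(0)
print(H)  # ~0.357, i.e. -log(0.7)
```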

The cross-entropy loss for the Softmax classifier, with raw class scores $f_j$ and correct class $y_i$, is given by:

$$L_i = -\log\left(\frac{e^{f_{y_i}}}{\sum_j e^{f_j}}\right)$$
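A minimal NumPy sketch of this loss for a single example (the function and variable names here are my own, assuming a vector of raw class scores and the index of the correct class):

```python
import numpy as np

def cross_entropy_loss(scores, y):
    # Shift scores so the max is 0 for numerical stability;
    # this does not change the softmax probabilities
    shifted = scores - np.max(scores)
    # Softmax: exponentiate and normalize to get class probabilities q
    probs = np.exp(shifted) / np.sum(np.exp(shifted))
    # Cross-entropy against a one-hot true distribution: -log q(y)
    return -np.log(probs[y])

# Example usage with made-up scores for three classes
scores = np.array([3.2, 5.1, -1.7])
print(cross_entropy_loss(scores, y=0))  # ~2.04
```

In practice this per-example loss is averaged over all training examples (plus any regularization term).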

This is the same idea as the Negative Log Likelihood: when the true distribution $p$ is one-hot (all mass on the correct class $y_i$), the sum collapses to a single term, $H(p, q) = -\log q(y_i)$, so minimizing the cross-entropy is exactly minimizing the negative log likelihood of the correct class.

https://aboveintelligent.com/deep-learning-basics-the-score-function-cross-entropy-d6cc20c9f972

Also see Softmax Classifier