OhMyCalc

Cross-Entropy Calculator

Calculate the cross-entropy loss and KL divergence between a true and a predicted probability distribution. Cross-entropy is widely used as a loss function in classification tasks.

How to Use the Cross-Entropy Calculator

  1. Enter the true probability (0 to 1).
  2. Enter the predicted probability (0 to 1).
  3. Click Calculate to get cross-entropy and KL divergence.

Use Cases

Formel

H(p, q) = −[p×log(q) + (1−p)×log(1−q)]

KL(p||q) = p×log(p/q) + (1−p)×log((1−p)/(1−q))

Here p is the true probability, q is the predicted probability, and both formulas cover the binary (Bernoulli) case. The two quantities are related by H(p, q) = H(p) + KL(p||q), where H(p) is the entropy of the true distribution.
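The formulas above can be sketched in a few lines of Python. This is an illustrative implementation, not the calculator's own code; the function name and the small `eps` clamp (which keeps log(0) from occurring when q is exactly 0 or 1) are assumptions.

```python
import math

def cross_entropy_and_kl(p, q, eps=1e-12):
    """Binary cross-entropy H(p, q) and KL divergence KL(p || q)
    for true probability p and predicted probability q (natural log)."""
    # Clamp the prediction away from 0 and 1 to avoid log(0).
    q = min(max(q, eps), 1 - eps)
    h = -(p * math.log(q) + (1 - p) * math.log(1 - q))
    if p in (0.0, 1.0):
        # A degenerate true distribution has zero entropy, so KL = H.
        kl = h
    else:
        kl = p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))
    return h, kl

h, kl = cross_entropy_and_kl(0.8, 0.6)
print(round(h, 4), round(kl, 4))  # → 0.5919 0.0915
```

Note that the printed values satisfy H(p, q) = H(p) + KL(p||q): the entropy of p = 0.8 is about 0.5004, and 0.5004 + 0.0915 ≈ 0.5919.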

Frequently Asked Questions

What is cross-entropy loss?
Cross-entropy measures how well a predicted probability distribution matches the true distribution. Lower is better.
What is KL divergence?
KL divergence measures the difference between two probability distributions. It is always ≥ 0, and equals 0 exactly when the two distributions are identical.
What is a good cross-entropy value?
Values close to 0 indicate the model's predictions closely match the true labels.