Cross-Entropy Calculator
Calculate cross-entropy loss and KL divergence between a true and a predicted probability distribution. Cross-entropy is widely used as a loss function in classification tasks.
How to Use the Cross-Entropy Calculator
- Enter the true probability (0 to 1).
- Enter the predicted probability (0 to 1).
- Click Calculate to get cross-entropy and KL divergence.
Use Cases
- Monitoring training loss during neural network training.
- Comparing model outputs against ground truth labels.
- Understanding information theory in ML.
- Debugging overfitting or underfitting issues.
Formula
H(p,q) = −[p×log(q) + (1−p)×log(1−q)]. KL(p||q) = p×log(p/q) + (1−p)×log((1−p)/(1−q)).
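The binary formulas above can be sketched in a few lines of Python. This is a minimal illustration, not the calculator's actual implementation; it assumes the natural logarithm (the tool may use a different base) and treats 0×log(0/q) as 0 by convention.

```python
import math

def cross_entropy(p: float, q: float) -> float:
    """Binary cross-entropy H(p, q) = -[p*log(q) + (1-p)*log(1-q)], natural log."""
    return -(p * math.log(q) + (1 - p) * math.log(1 - q))

def kl_divergence(p: float, q: float) -> float:
    """Binary KL divergence KL(p || q); the term 0*log(0/q) is taken as 0."""
    def term(a: float, b: float) -> float:
        return 0.0 if a == 0 else a * math.log(a / b)
    return term(p, q) + term(1 - p, 1 - q)

# Example: true probability 1.0, predicted probability 0.9
print(cross_entropy(1.0, 0.9))  # ≈ 0.1054
print(kl_divergence(1.0, 0.9))  # ≈ 0.1054
```

In this example the two values coincide because H(p,q) = H(p) + KL(p||q), and the entropy H(p) of the deterministic distribution p = 1.0 is zero.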
Frequently Asked Questions
What is cross-entropy loss?
Cross-entropy measures how well a predicted probability distribution matches the true distribution. Lower is better.
What is KL divergence?
KL divergence measures the difference between two probability distributions. It is always ≥ 0, and equals 0 only when the two distributions are identical.
What is a good cross-entropy value?
Values close to 0 indicate the model's predictions closely match the true labels.