Confusion Matrix Calculator
Calculate accuracy, precision, recall, F1, specificity, and MCC from a confusion matrix. Enter TP, TN, FP, FN to get all key classification metrics.
How to Use the Confusion Matrix Calculator
- Enter True Positives (TP).
- Enter True Negatives (TN).
- Enter False Positives (FP).
- Enter False Negatives (FN).
- Click Calculate to see all metrics.
Use Cases
- Comprehensive evaluation of binary classifiers.
- Comparing model performance across multiple metrics.
- Diagnosing class imbalance effects on model quality.
- Reporting model evaluation results in ML projects.
Formula
Accuracy = (TP + TN) / (TP + TN + FP + FN)
Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
F1 = 2 × Precision × Recall / (Precision + Recall)
Specificity = TN / (TN + FP)
MCC = (TP×TN − FP×FN) / √((TP+FP)(TP+FN)(TN+FP)(TN+FN))
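The formulas above can be sketched in a few lines of Python. This is a minimal illustration, not the calculator's actual implementation; the helper name `confusion_metrics` is hypothetical, and zero denominators are mapped to 0.0 as a common convention:

```python
import math

def confusion_metrics(tp, tn, fp, fn):
    """All six metrics from raw confusion-matrix counts.

    Degenerate denominators (e.g. no predicted positives) yield 0.0
    by convention instead of raising ZeroDivisionError.
    """
    total = tp + tn + fp + fn
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return {
        "accuracy": (tp + tn) / total if total else 0.0,
        "precision": precision,
        "recall": recall,
        "f1": 2 * precision * recall / (precision + recall) if precision + recall else 0.0,
        "specificity": tn / (tn + fp) if tn + fp else 0.0,
        "mcc": (tp * tn - fp * fn) / denom if denom else 0.0,
    }

# Example: TP=8, TN=7, FP=2, FN=3
m = confusion_metrics(8, 7, 2, 3)
# accuracy = 15/20 = 0.75, precision = 8/10 = 0.8
```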
Frequently Asked Questions
What is a confusion matrix?
A confusion matrix shows the counts of true positives, true negatives, false positives, and false negatives for a classifier.
What is MCC?
The Matthews Correlation Coefficient (MCC) measures the correlation between predicted and actual labels and remains informative even for imbalanced classes. It ranges from −1 (total disagreement) through 0 (no better than chance) to +1 (perfect prediction).
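To illustrate why MCC is useful under class imbalance, consider a classifier that predicts every case negative on a 95:5 split: accuracy looks high while MCC reveals a chance-level model. This is a small sketch; mapping a vanishing denominator to 0.0 is a common convention, not a universal definition:

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews Correlation Coefficient; 0.0 when the denominator vanishes."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# "Always predict negative" on 95 negatives / 5 positives:
tp, tn, fp, fn = 0, 95, 0, 5
accuracy = (tp + tn) / (tp + tn + fp + fn)  # high, despite missing every positive
score = mcc(tp, tn, fp, fn)                 # chance-level
```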
What is specificity?
Specificity (true negative rate) = TN / (TN + FP), measuring how well the model identifies negatives.