Classification Metrics Explorer

Adjust the classification threshold to understand Accuracy, Precision & Recall

[Interactive panels: classification distribution plot, threshold slider (default 0.50), and logistic regression sigmoid curve]

[Live metric readouts, updated as the threshold moves: Accuracy, Precision, Recall, F1-Score]

Confusion Matrix

        Pred +  Pred −
Act +   TP      FN
Act −   FP      TN

Correct predictions (TP and TN) lie on the diagonal.
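The four metrics shown in the explorer all derive directly from these four counts. A minimal sketch (the counts below are made-up illustration values, not taken from the explorer):

```python
# Compute accuracy, precision, recall, and F1 from confusion-matrix counts.
def metrics(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return accuracy, precision, recall, f1

# Made-up counts for illustration.
acc, prec, rec, f1 = metrics(tp=8, fp=2, fn=4, tn=6)
print(f"Accuracy={acc:.2f} Precision={prec:.2f} Recall={rec:.2f} F1={f1:.2f}")
# → Accuracy=0.70 Precision=0.80 Recall=0.67 F1=0.73
```

Note the zero-division guards: at extreme thresholds the model may predict no positives at all, leaving precision or recall undefined.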

The sigmoid curve shows the predicted probability for each input; the black vertical line marks the decision threshold.
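In code, the sigmoid maps a raw linear score to a probability, and the threshold turns that probability into a class label. A minimal sketch (the score and thresholds are illustrative):

```python
import math

def sigmoid(z):
    """Map a linear score z to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def predict(z, threshold=0.5):
    """Label positive when the predicted probability clears the threshold."""
    return 1 if sigmoid(z) >= threshold else 0

# Lowering the threshold flips a borderline score to positive.
print(predict(-0.4, threshold=0.5))  # sigmoid(-0.4) ≈ 0.40 → 0
print(predict(-0.4, threshold=0.3))  # 0.40 ≥ 0.30 → 1
```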

[Charts: metrics vs. threshold, ROC curve with its AUC, and precision-recall curve with its AUC]
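ROC AUC has a useful probabilistic reading: it is the chance that a randomly chosen positive example is scored above a randomly chosen negative one (ties counting half). A small sketch of that definition, with made-up labels and scores:

```python
# ROC AUC as a pairwise ranking statistic over positive/negative pairs.
def roc_auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative data: two negatives, two positives.
print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # → 0.75
```

This pairwise form is O(P·N); production libraries compute the same quantity by sorting scores once, but the value is identical.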

💡 Understanding the Trade-off

Lower Threshold: More predictions labeled as positive → Higher Recall, Lower Precision

Higher Threshold: Fewer predictions labeled as positive → Lower Recall, Higher Precision

Why the trade-off? Lowering the threshold catches more true positives, but also more false positives. Raising it reduces false positives but misses more true positives.
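The trade-off can be seen directly by sweeping the threshold over a toy set of predicted probabilities (the data below is made up for illustration):

```python
# Sweep the threshold and watch precision rise while recall falls.
probs  = [0.1, 0.3, 0.45, 0.6, 0.7, 0.9]   # predicted probabilities
labels = [0,   0,   1,    0,   1,   1]      # true classes

for t in (0.2, 0.5, 0.8):
    preds = [1 if p >= t else 0 for p in probs]
    tp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 1)
    fp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 0)
    fn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 1)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    print(f"threshold={t:.1f} precision={precision:.2f} recall={recall:.2f}")
# → threshold=0.2 precision=0.60 recall=1.00
# → threshold=0.5 precision=0.67 recall=0.67
# → threshold=0.8 precision=1.00 recall=0.33
```

At 0.2 everything borderline is called positive (perfect recall, many false positives); at 0.8 only the most confident prediction survives (perfect precision, most true positives missed).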