Precision-recall cheat sheet

Written Nov 20, 2017

Plus sensitivity, specificity, false positive rate, AUC, etc.

Precision and recall


Precision is “how useful the search results are”, and recall is “how complete the results are”.


Recall == sensitivity == True Positive Rate (the rate for “positives”)

Specificity is the same idea, but for negatives: the True Negative Rate.

In simple terms, high precision means that an algorithm returned substantially more relevant results than irrelevant ones, while high recall means that an algorithm returned most of the relevant results.

Precision can be seen as a measure of exactness or quality, whereas recall is a measure of completeness or quantity.
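A minimal sketch of the two definitions in the search-results framing (the function name and example sets are illustrative, not from any particular library):

```python
def precision_recall(retrieved, relevant):
    """Precision = fraction of retrieved items that are relevant.
    Recall = fraction of relevant items that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    return hits / len(retrieved), hits / len(relevant)

# 4 results returned, 3 relevant documents exist, 2 overlap:
p, r = precision_recall({"a", "b", "c", "d"}, {"a", "b", "e"})
# p = 2/4 = 0.5 (exactness), r = 2/3 ≈ 0.67 (completeness)
```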

There are two curves

… for which you can compute Area under curve

AUC for ROC curve

  • The more popular, well-known one
  • TPR vs FPR

AUC for PR curve (precision-recall)

  • preferred in the case of imbalanced classes
  • Precision vs recall
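ROC AUC has a handy probabilistic reading: it equals the probability that a randomly chosen positive example is scored higher than a randomly chosen negative one. A minimal pure-Python sketch of that rank interpretation (the function name is illustrative; real code would use a library implementation, which also handles large inputs efficiently):

```python
def roc_auc(labels, scores):
    """ROC AUC as P(score of random positive > score of random negative),
    with ties counted as half a win. O(n_pos * n_neg) -- fine for a sketch."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# 3 of the 4 positive/negative pairs are ranked correctly:
auc = roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])  # 0.75
```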

ROC curve

TPR on y, FPR on x

The two axes of the plot above are defined as:

TPR = TP / (TP + FN)
FPR = FP / (FP + TN)
  • TPR = recall = sensitivity
  • FPR = 1 - specificity
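The formulas above, spelled out from the four confusion-matrix counts (the function name is illustrative):

```python
def rates(tp, fp, tn, fn):
    """TPR, FPR, and TNR from confusion-matrix counts."""
    tpr = tp / (tp + fn)  # true positive rate = recall = sensitivity
    fpr = fp / (fp + tn)  # false positive rate = 1 - specificity
    tnr = tn / (tn + fp)  # true negative rate = specificity
    return tpr, fpr, tnr

# e.g. 3 true positives, 1 false positive, 3 true negatives, 1 false negative:
tpr, fpr, tnr = rates(3, 1, 3, 1)  # (0.75, 0.25, 0.75)
```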


PPV (positive predictive value) == precision = TP / (TP + FP)


There is a comprehensive table of all these metrics at Wikipedia.