Confusion Matrix

From Open Risk Manual
Revision as of 15:08, 5 September 2020 by Wiki admin (talk | contribs)


The Confusion Matrix (also known as a Contingency Table or Prediction-Realization Table) is a method for evaluating forecasts by comparing the frequencies of predicted outcomes against observed outcomes.


The confusion matrix is the simplest form of model assessment for certain types of statistical models. For a binary classification system, such as a credit scoring card, it is simply a 2x2 matrix, as follows:

                        Predicted performance
                        Goods        Bads
Actual       Goods       800          100
performance  Bads         50           50
In this confusion matrix, of the 1000 actual clients, 900 performed (goods) and 100 did not (bads). The model predicted that 850 would perform and 150 would not. Hence the model is relatively good at identifying goods, but quite confused about identifying bads: it correctly flags only half of them.
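The arithmetic above can be sketched in a few lines of plain Python. The cell counts come from the worked example; the variable and metric names (accuracy, goods/bads identification rates) are illustrative labels, not terminology from the source.

```python
# Cell counts of the 2x2 confusion matrix from the example above.
goods_pred_good = 800  # actual goods the model predicted to perform
goods_pred_bad = 100   # actual goods the model predicted not to perform
bads_pred_good = 50    # actual bads the model predicted to perform
bads_pred_bad = 50     # actual bads the model predicted not to perform

total = goods_pred_good + goods_pred_bad + bads_pred_good + bads_pred_bad  # 1000 clients

# Overall share of correct predictions (diagonal of the matrix).
accuracy = (goods_pred_good + bads_pred_bad) / total

# Share of actual goods the model correctly identifies.
goods_identified = goods_pred_good / (goods_pred_good + goods_pred_bad)

# Share of actual bads the model correctly identifies.
bads_identified = bads_pred_bad / (bads_pred_good + bads_pred_bad)

print(f"accuracy: {accuracy:.2f}")           # 0.85
print(f"goods identified: {goods_identified:.2f}")  # 0.89
print(f"bads identified: {bads_identified:.2f}")    # 0.50
```

The asymmetry between the last two numbers is what the text means by the model being "confused" about bads: it identifies roughly 89% of goods but only 50% of bads.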

See Also