Mapping the confusion matrix
[This article was first published on modTools, and kindly contributed to R-bloggers.]
The ‘confusionLabel’ function below labels the predictions of a binary response model according to their confusion matrix categories, i.e. it classifies each prediction as a false positive, false negative, true positive or true negative, given a user-defined threshold value:
confusionLabel <- function(model,  # placeholder, not yet implemented
                           obs,    # vector of observed 0 and 1 values
                           pred,   # vector of predicted probabilities
                           thresh) {  # threshold to separate predictions
  if (length(obs) != length(pred))
    stop("'obs' and 'pred' must be of the same length (and in the same order)")
  res <- rep("", length(obs))
  res[pred >= thresh & obs == 1] <- "TruePos"
  res[pred < thresh & obs == 0] <- "TrueNeg"
  res[pred >= thresh & obs == 0] <- "FalsePos"
  res[pred < thresh & obs == 1] <- "FalseNeg"
  res
}
Usage example:
confusionLabel(obs = myspecies_presabs, pred = myspecies_P, thresh = 0.23)
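If you don't have the ‘myspecies_presabs’ and ‘myspecies_P’ objects at hand, here is a self-contained toy run with simulated stand-ins (the function definition is repeated so the snippet runs on its own; the data are made up, so the counts below are purely illustrative):

```r
# the function from above, repeated so this snippet is self-contained
confusionLabel <- function(model, obs, pred, thresh) {
  if (length(obs) != length(pred))
    stop("'obs' and 'pred' must be of the same length (and in the same order)")
  res <- rep("", length(obs))
  res[pred >= thresh & obs == 1] <- "TruePos"
  res[pred < thresh & obs == 0] <- "TrueNeg"
  res[pred >= thresh & obs == 0] <- "FalsePos"
  res[pred < thresh & obs == 1] <- "FalseNeg"
  res
}

# simulated stand-ins for 'myspecies_presabs' and 'myspecies_P'
set.seed(42)
obs <- rbinom(30, size = 1, prob = 0.5)  # observed presences/absences
pred <- runif(30)                        # predicted probabilities

labs <- confusionLabel(obs = obs, pred = pred, thresh = 0.23)
table(labs)  # counts per confusion matrix category
```

Note that ‘model’ can be left out of the call: thanks to R's lazy argument evaluation, an argument that is never used inside the function body does not need to be supplied.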
The result is a vector of the same length as ‘obs’ and ‘pred’, containing the confusion matrix label for each prediction. This allows you to then map the confusion matrix, by plotting each point in a colour corresponding to its category.
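One way such a map could be sketched with base R graphics (a hedged illustration, not the author's plotting code: the coordinates, data and colour choices below are all made up, and the labelling logic simply mirrors ‘confusionLabel’):

```r
# Hypothetical mapping example: colour point locations by their
# confusion matrix category. In practice you would use your species
# records' real coordinates and the output of confusionLabel().
set.seed(1)
n <- 200
xy <- cbind(x = runif(n), y = runif(n))  # stand-in point coordinates
obs <- rbinom(n, size = 1, prob = 0.4)   # stand-in observations
pred <- runif(n)                         # stand-in predictions
thresh <- 0.23

# same labelling logic as confusionLabel()
labs <- ifelse(pred >= thresh,
               ifelse(obs == 1, "TruePos", "FalsePos"),
               ifelse(obs == 1, "FalseNeg", "TrueNeg"))

cols <- c(TruePos = "royalblue", TrueNeg = "lightblue",
          FalsePos = "red", FalseNeg = "orange")
plot(xy, col = cols[labs], pch = 20, asp = 1,
     main = "Confusion matrix categories in space")
legend("topright", legend = names(cols), col = cols, pch = 20, cex = 0.8)
```

Named colour vectors like ‘cols’ are handy here: indexing with a character vector (‘cols[labs]’) maps each label straight to its colour.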
This function will be included in the next version of the ‘modEvA’ package, which is soon to be released.