score_roc_measures {mlr3} | R Documentation |
Calculate ROC Measures
Description
Calculates a set of ROC performance measures based on the confusion matrix.
- tpr: True positive rate (Sensitivity, Recall)
- fpr: False positive rate (Fall-out)
- fnr: False negative rate (Miss rate)
- tnr: True negative rate (Specificity)
- ppv: Positive predictive value (Precision)
- fomr: False omission rate
- lrp: Positive likelihood ratio (LR+)
- fdr: False discovery rate
- npv: Negative predictive value
- acc: Accuracy
- lrm: Negative likelihood ratio (LR-)
- dor: Diagnostic odds ratio
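The measures above can all be derived from the four cells of the confusion matrix. A minimal sketch, assuming made-up counts (tp, fp, fn, tn are illustrative values, not output of this package):

```r
# Illustrative confusion-matrix cells (absolute frequencies, made up)
tp <- 20; fp <- 10; fn <- 5; tn <- 65

tpr  <- tp / (tp + fn)                  # true positive rate (sensitivity, recall)
fpr  <- fp / (fp + tn)                  # false positive rate (fall-out)
fnr  <- fn / (fn + tp)                  # false negative rate (miss rate)
tnr  <- tn / (tn + fp)                  # true negative rate (specificity)
ppv  <- tp / (tp + fp)                  # positive predictive value (precision)
npv  <- tn / (tn + fn)                  # negative predictive value
fdr  <- fp / (fp + tp)                  # false discovery rate
fomr <- fn / (fn + tn)                  # false omission rate
acc  <- (tp + tn) / (tp + fp + fn + tn) # accuracy
lrp  <- tpr / fpr                       # positive likelihood ratio (LR+)
lrm  <- fnr / tnr                       # negative likelihood ratio (LR-)
dor  <- lrp / lrm                       # diagnostic odds ratio
```

Note that the diagnostic odds ratio also equals (tp * tn) / (fp * fn), which can serve as a cross-check.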
Usage
score_roc_measures(pred)
Arguments
pred
(PredictionClassif)
Value
list()
A list with two elements: confusion_matrix, the 2 x 2 confusion matrix of absolute frequencies, and measures, a list of the measures described above.
Examples
library(mlr3)
task = tsk("pima")
learner = lrn("classif.rpart", predict_type = "prob")
splits = partition(task, ratio = 0.7)
learner$train(task, splits$train)
pred = learner$predict(task, splits$test)
score_roc_measures(pred)
[Package mlr3 version 1.0.1 Index]