<model_name>!SHOW_THRESHOLD_METRICS

Returns raw counts and metrics at a specific threshold for each class, for models where evaluation was enabled at instantiation. This method takes no arguments. See Metrics for definitions of the individual metrics.
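Because the method takes no arguments, invoking it is a single statement. A minimal sketch, assuming a classification model instance named `my_model` (a hypothetical name) created with evaluation enabled:

```sql
-- Hypothetical model name; evaluation must have been enabled
-- when the model instance was created.
CALL my_model!SHOW_THRESHOLD_METRICS();
```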

Output

| Column | Type | Description |
| --- | --- | --- |
| dataset_type | VARCHAR | The name of the dataset used for metrics calculation; currently always EVAL. |
| class | VARCHAR | The predicted class. Each class has its own set of metrics, provided in multiple rows. |
| threshold | FLOAT | The threshold used to generate predictions. |
| precision | FLOAT | Precision for the given class: the ratio of true positives to total predicted positives. |
| recall | FLOAT | Recall (also called “sensitivity”) for the given class: the ratio of true positives to total actual positives. |
| f1 | FLOAT | F1 score (the harmonic mean of precision and recall) for the given class. |
| tpr | FLOAT | True positive rate for the given class. |
| fpr | FLOAT | False positive rate for the given class. |
| tp | INTEGER | Total count of true positives for the given class. |
| fp | INTEGER | Total count of false positives for the given class. |
| tn | INTEGER | Total count of true negatives for the given class. |
| fn | INTEGER | Total count of false negatives for the given class. |
| accuracy | FLOAT | Accuracy for the given class: the ratio of correct predictions, both positive and negative, to the total number of predictions. |
| support | INTEGER | Support for the given class: true positives plus false negatives (the number of actual occurrences of the class). |
| logs | VARIANT | Contains error or warning messages. |
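Because the method returns a result set, its output can be queried with Snowflake's `RESULT_SCAN` pattern. A sketch, again assuming a model named `my_model` and a hypothetical class label `'1'`, that finds the threshold with the highest F1 score for that class:

```sql
CALL my_model!SHOW_THRESHOLD_METRICS();

-- Query the result of the previous CALL to find the
-- threshold that maximizes F1 for one class.
SELECT threshold, precision, recall, f1
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE class = '1'
ORDER BY f1 DESC
LIMIT 1;
```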