MODEL_MONITOR_PERFORMANCE_METRIC¶
Gets performance metrics from a model monitor. Each model monitor monitors one machine learning model.
- See also:
Querying monitoring results for more information.
Syntax¶
MODEL_MONITOR_PERFORMANCE_METRIC( <model_monitor_name> , <metric_name>
  [ , <granularity> [ , <start_time> [ , <end_time> ] ] ] )
Arguments¶
Required:
MODEL_MONITOR_NAME
Name of the model monitor used to compute the metric.
Valid values:
A string that’s the name of the model monitor. It can be a simple or fully qualified name.
METRIC_NAME
Name of the performance metric.
Valid values if the model monitor is attached to a regression model:
'RMSE'
'MAE'
'MAPE'
'MSE'
Valid values if the model monitor is attached to a binary classification model:
'ROC_AUC'
'CLASSIFICATION_ACCURACY'
'PRECISION'
'RECALL'
'F1_SCORE'
Optional:
GRANULARITY
Granularity of the time range being queried. The default value is 1 DAY.
Valid values:
'<num> DAY'
'<num> WEEK'
'<num> MONTH'
'<num> QUARTER'
'<num> YEAR'
'ALL'
NULL
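For example, querying at a weekly granularity returns one metric row per week within the time range, while 'ALL' computes the metric over the entire range at once. A sketch, assuming a hypothetical monitor named MY_MONITOR attached to a regression model:

```sql
-- MY_MONITOR is a hypothetical monitor name; adjust to your environment.
-- Returns one RMSE row per week between the two timestamps.
SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
  'MY_MONITOR', 'RMSE', '1 WEEK',
  TO_TIMESTAMP_TZ('2024-01-01'), TO_TIMESTAMP_TZ('2024-03-01')));
```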
START_TIME
Start of the time range used to compute the metric. The default value is 60 days before the current time, and is calculated each time you call the function.
Valid values:
A timestamp expression or NULL.
END_TIME
End of the time range used to compute the metric. The default value is the current time, and is calculated each time you call the function.
Valid values:
A timestamp expression or NULL.
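Because both timestamps accept NULL, you can omit an explicit range and fall back to the defaults described above (60 days ago through the current time). A sketch, assuming a hypothetical monitor named MY_MONITOR:

```sql
-- NULL start and end times use the defaults: 60 days ago through now.
SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
  'MY_MONITOR', 'MAE', '1 DAY', NULL, NULL));
```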
Returns¶
| Column | Description | Example values |
|---|---|---|
| | Timestamp at the start of the time range. | |
| | Value of the metric within the specified time range. | |
| | Number of records used to compute the metric. | |
| | Number of records excluded from the metric computation. | |
| | Name of the metric that has been computed. | |
Usage Notes¶
If the value you’ve specified for model_monitor_name is case-sensitive or contains special characters or spaces, enclose it in double quotes. You must enclose the double quotes within single quotes. For example, '"<example_model_monitor_name>"'.
If you don’t use double quotes, model_monitor_name is treated as case-insensitive.
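For instance, if the monitor was created with a case-sensitive, double-quoted identifier, you must reproduce the double quotes inside the string literal. A sketch, using a hypothetical monitor named "my_Monitor":

```sql
-- The double quotes preserve the case-sensitive name; the single quotes
-- make the whole thing a string argument to the function.
SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
  '"my_Monitor"', 'RMSE', '1 DAY', NULL, NULL));
```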
General requirements¶
The model monitor must be associated with a model that supports the requested metric type.
The model monitor must contain the necessary data for each metric type, as described below.
Metric requirements¶
The following columns are required to get regression metrics:
RMSE: Requires the prediction_score and actual_score columns
MAE: Requires the prediction_score and actual_score columns
MAPE: Requires the prediction_score and actual_score columns
MSE: Requires the prediction_score and actual_score columns
The following columns are required to get binary classification metrics:
ROC_AUC: Requires the prediction_score and actual_class columns
CLASSIFICATION_ACCURACY: Requires the prediction_class and actual_class columns
PRECISION: Requires the prediction_class and actual_class columns
RECALL: Requires the prediction_class and actual_class columns
F1_SCORE: Requires the prediction_class and actual_class columns
The following columns are required to get multiclass classification metrics:
CLASSIFICATION_ACCURACY: Requires the prediction_class and actual_class columns
MACRO_AVERAGE_PRECISION: Requires the prediction_class and actual_class columns
MACRO_AVERAGE_RECALL: Requires the prediction_class and actual_class columns
MICRO_AVERAGE_PRECISION: Requires the prediction_class and actual_class columns
MICRO_AVERAGE_RECALL: Requires the prediction_class and actual_class columns
Note
For binary classification, you can use micro-average precision and recall metrics similarly to how you use classification accuracy in multi-class classification.
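Querying one of these averaged metrics looks the same as any other metric call. A sketch, assuming a hypothetical monitor named MY_MULTICLASS_MONITOR attached to a multiclass model:

```sql
-- MY_MULTICLASS_MONITOR is a hypothetical monitor name.
-- Micro-average precision over daily buckets in the default time range.
SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
  'MY_MULTICLASS_MONITOR', 'MICRO_AVERAGE_PRECISION', '1 DAY', NULL, NULL));
```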
Error cases¶
You might run into errors if you do the following:
Request an accuracy metric without setting the corresponding prediction or actual column.
Fail to provide data in the actual_score or actual_class column.
Examples¶
The following example gets the Root Mean Square Error (RMSE) at a one-day granularity for January 1, 2024:
SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
  'MY_MONITOR', 'RMSE', '1 DAY', TO_TIMESTAMP_TZ('2024-01-01'), TO_TIMESTAMP_TZ('2024-01-02')));
The following example gets the Root Mean Square Error (RMSE) over the last 30 days from the model monitor:
SELECT * FROM TABLE(MODEL_MONITOR_PERFORMANCE_METRIC(
  'MY_MONITOR', 'RMSE', '1 DAY', DATEADD('DAY', -30, CURRENT_DATE()), CURRENT_DATE()));