<model_name>!SHOW_GLOBAL_EVALUATION_METRICS
Returns overall evaluation metrics for models where evaluation was enabled at instantiation. This method takes no arguments. For descriptions of the individual metrics, see the Metrics section under `show_global_evaluation_metrics`.
Output
| Column | Type | Description |
|---|---|---|
| dataset_type | VARCHAR | The name of the dataset used for metrics calculation, currently EVAL. |
| average_type | VARCHAR | The method of aggregation used to calculate overall metrics from the individual class metrics, currently MACRO. |
| error_metric | VARCHAR | The error metric name, such as Precision, Recall, or F1. |
| metric_value | FLOAT | The error metric value. |
| logs | VARIANT | Contains error or warning messages. |
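For illustration, a call on a model instance might look like the following. This is a sketch: the instance name `my_model` is hypothetical, and the method is invoked with the `!` syntax shown in the heading above.

```sql
-- Hypothetical model instance name; replace with your own.
-- Returns one row per error metric, with the columns described above.
CALL my_model!SHOW_GLOBAL_EVALUATION_METRICS();
```

Each returned row pairs an `error_metric` name with its `metric_value`, aggregated across classes using the method given in `average_type`.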