<model_name>!SHOW_GLOBAL_EVALUATION_METRICS
Returns overall (aggregated across all classes) evaluation metrics for models where evaluation was enabled at instantiation. This method takes no arguments. For details on the available metrics, see Metrics.
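As a sketch of a typical invocation (the model name `my_model` is a placeholder for a model you have created with evaluation enabled):

```sql
-- my_model is a hypothetical model name; substitute your own model
-- that was instantiated with evaluation enabled.
CALL my_model!SHOW_GLOBAL_EVALUATION_METRICS();
```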
Output

Column | Type | Description
---|---|---
 | | The name of the dataset used for metrics calculation, currently EVAL.
 | | The method of aggregation used to calculate overall metrics from the individual class metrics, currently MACRO.
 | | The error metric name, such as Precision, Recall, or F1.
 | | The error metric value.
 | | Contains error or warning messages.