<model_name>!SHOW_EVALUATION_METRICS

Returns out-of-sample evaluation metrics generated using time-series cross-validation. Metrics are available only if evaluate was set to TRUE in the CONFIG_OBJECT when the model was created (TRUE is the default).

Syntax

<model_name>!SHOW_EVALUATION_METRICS();

Output

The SERIES column is present only for multi-series forecasts. Single-series forecasts do not have this column.

The output contains the following columns:

SERIES (VARIANT)

Series value. Present only if the model was trained on multiple time series.

ERROR_METRIC (VARCHAR)

The name of the error metric used. The method returns the following metrics:

Point Metrics:

Interval Metrics: These metrics use the prediction_interval argument from the Evaluation configuration.

  • COVERAGE_INTERVAL: The proportion of actual values that fall within the prediction interval.

  • WINKLER_ALPHA: The Winkler score: the width of the prediction interval, plus a penalty scaled by 2/alpha for each actual value that falls outside the interval.

LOGS (VARIANT)

Contains error or warning messages.
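The two interval metrics above can be sketched in plain Python. This is an illustrative computation, not Snowflake code; the data and the 95% prediction interval (alpha = 0.05) below are made-up values, not output of the method:

```python
def coverage_interval(actuals, lowers, uppers):
    """Proportion of actual values that fall inside [lower, upper]."""
    inside = sum(1 for y, lo, hi in zip(actuals, lowers, uppers) if lo <= y <= hi)
    return inside / len(actuals)

def winkler_score(actuals, lowers, uppers, alpha=0.05):
    """Mean Winkler score: interval width, plus a 2/alpha-scaled penalty
    for each actual value that falls outside the interval."""
    total = 0.0
    for y, lo, hi in zip(actuals, lowers, uppers):
        score = hi - lo                      # interval width
        if y < lo:
            score += (2 / alpha) * (lo - y)  # penalty for undershooting
        elif y > hi:
            score += (2 / alpha) * (y - hi)  # penalty for overshooting
        total += score
    return total / len(actuals)

# Illustrative actuals and 95% prediction-interval bounds.
actuals = [10.0, 12.5, 9.0, 15.0]
lowers  = [ 8.0, 11.0, 9.5, 13.0]
uppers  = [12.0, 14.0, 11.0, 16.0]

print(coverage_interval(actuals, lowers, uppers))  # → 0.75 (3 of 4 inside)
print(winkler_score(actuals, lowers, uppers))
```

A narrower interval lowers the Winkler score only while it still captures the actual values; misses are penalized heavily for small alpha, which is why the two metrics are reported together.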

Examples

See Examples.