COPY_HISTORY
This table function can be used to query Snowflake data loading history along various dimensions within the last 14 days. The function returns load activity for both COPY INTO <table> statements and continuous data loading using Snowpipe. The table function avoids the 10,000 row limitation of the LOAD_HISTORY view. The results can be filtered using SQL predicates.
You can also view data loading details in Snowsight. See Monitor data loading activity by using Copy History.
Syntax
COPY_HISTORY(
  TABLE_NAME => '<string>'
  , START_TIME => <constant_expr>
  [ , END_TIME => <constant_expr> ] )
Arguments
Required:
TABLE_NAME => 'string'
A string specifying a table name.
START_TIME => constant_expr
Timestamp (in TIMESTAMP_LTZ format), within the last 14 days, marking the start of the time range for retrieving load events.
Optional:
END_TIME => constant_expr
Timestamp (in TIMESTAMP_LTZ format), within the last 14 days, marking the end of the time range for retrieving load events.
Default: CURRENT_TIMESTAMP.
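For example, the following call retrieves load events for a bounded window within the last 14 days (the table name MYTABLE is a placeholder):

select * from table(information_schema.copy_history(TABLE_NAME=>'MYTABLE', START_TIME=> DATEADD(days, -2, CURRENT_TIMESTAMP()), END_TIME=> DATEADD(days, -1, CURRENT_TIMESTAMP())));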
Usage notes
For bulk data loads, this function returns results for a role that has the MONITOR privilege on the Snowflake account, or a role that has the USAGE privilege on the database and schema containing the table and any privilege on the table.
For Snowpipe data loads, this function returns results for a role that has the MONITOR privilege on the Snowflake account, or a role that has the USAGE privilege on the database and schema containing the pipe and any privilege on the table. In addition, if the MONITOR privilege on the pipe is not available, the pipe name, pipe table name, pipe schema name, and pipe catalog name are masked as NULL.
When calling an Information Schema table function, the session must have an INFORMATION_SCHEMA schema in use or the function name must be fully-qualified. For more details, see Snowflake Information Schema.
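For example, if the session has no schema in use, the function can be called with a fully-qualified name (the database name mydb and table name MYTABLE are placeholders):

select * from table(mydb.information_schema.copy_history(TABLE_NAME=>'MYTABLE', START_TIME=> DATEADD(hours, -1, CURRENT_TIMESTAMP())));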
This function returns a maximum of 14 days of copy history. To retrieve history beyond this limit, use the COPY_HISTORY view (Account Usage).
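For example, a query against the Account Usage view for the last 30 days might look like the following (the table name MYTABLE is a placeholder; note that Account Usage views can lag behind real-time activity):

select * from snowflake.account_usage.copy_history where table_name = 'MYTABLE' and last_load_time > DATEADD(days, -30, CURRENT_TIMESTAMP());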
The function only includes COPY INTO commands that executed to completion, with or without errors.
Dropping or recreating a table object removes the historical data for bulk data loads (COPY INTO <table> statements) into the table.
Dropping or recreating a pipe object removes the historical data for Snowpipe data loads using the pipe.
COPY_HISTORY shows copy history only after the most recent truncate operation on the target table. This applies to the COPY_HISTORY output both before and after replication.
Output
The function returns the following columns:
| Column Name | Data Type | Description |
|---|---|---|
| FILE_NAME | TEXT | Name of the source file and relative path to the file. |
| STAGE_LOCATION | TEXT | Name of the stage where the source file is located. |
| LAST_LOAD_TIME | TIMESTAMP_LTZ | Date and time when the file finished loading. |
| ROW_COUNT | NUMBER | Number of rows loaded from the source file. |
| ROW_PARSED | NUMBER | Number of rows parsed from the source file. |
| FILE_SIZE | NUMBER | Size of the source file loaded (in bytes). |
| FIRST_ERROR_MESSAGE | TEXT | First error of the source file. |
| FIRST_ERROR_LINE_NUMBER | NUMBER | Line number of the first error. |
| FIRST_ERROR_CHARACTER_POS | NUMBER | Position of the first error character. |
| FIRST_ERROR_COLUMN_NAME | TEXT | Column name of the first error. |
| ERROR_COUNT | NUMBER | Number of error rows in the source file. |
| ERROR_LIMIT | NUMBER | If the number of errors reaches this limit, the load is aborted. |
| STATUS | TEXT | Load status of the file. |
| TABLE_CATALOG_NAME | TEXT | Name of the database in which the target table resides. |
| TABLE_SCHEMA_NAME | TEXT | Name of the schema in which the target table resides. |
| TABLE_NAME | TEXT | Name of the target table. |
| PIPE_CATALOG_NAME | TEXT | Name of the database in which the pipe resides. |
| PIPE_SCHEMA_NAME | TEXT | Name of the schema in which the pipe resides. |
| PIPE_NAME | TEXT | Name of the pipe defining the load parameters. |
| PIPE_RECEIVED_TIME | TIMESTAMP_LTZ | Date and time when the INSERT request for the file loaded through the pipe was received. |
Examples
Retrieve details about all loading activity in the last hour:
select * from table(information_schema.copy_history(TABLE_NAME=>'MYTABLE', START_TIME=> DATEADD(hours, -1, CURRENT_TIMESTAMP())));
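The results can also be filtered with standard SQL predicates. For example, to list only files that produced errors in the last day (the table name MYTABLE is a placeholder):

select file_name, status, error_count, first_error_message from table(information_schema.copy_history(TABLE_NAME=>'MYTABLE', START_TIME=> DATEADD(days, -1, CURRENT_TIMESTAMP()))) where error_count > 0;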