SnowConvert AI - BigQuery Functional Differences¶
Note
Conversion Scope
SnowConvert AI for Google BigQuery currently supports assessment and translation for TABLES and VIEWS. Although SnowConvert AI can recognize other types of statements, they are not fully supported.
SSC-FDM-BQ0001¶
Accessing arrays produces NULL instead of an error for positive out of bounds indexes in Snowflake.
Description¶
When accessing an ARRAY object by index in Snowflake, specifying an index greater than the size of the array results in a NULL value. This differs from the behavior of BigQuery, where accessing an ARRAY with an out-of-bounds index produces an error unless the SAFE_OFFSET or SAFE_ORDINAL functions are used.
This FDM is added to any ARRAY access that is not safe.
Code Example¶
Input Code:¶
BigQuery¶
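A minimal sketch of the kind of BigQuery input that triggers this FDM (table and column names are hypothetical):

```sql
-- BigQuery: OFFSET raises a runtime error when the index is out of bounds
SELECT
    items[OFFSET(5)] AS sixth_item
FROM my_dataset.my_table;
```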
Generated Code:¶
Snowflake¶
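A hedged sketch of the corresponding Snowflake output (the exact code and annotation format emitted by SnowConvert AI may differ):

```sql
-- SSC-FDM-BQ0001: out-of-bounds access returns NULL in Snowflake
-- instead of raising an error as in BigQuery
SELECT
    items[5] AS sixth_item
FROM my_dataset.my_table;
```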
Best Practices¶
Analyze the uses of array access in the code. If the original code never risked an out-of-bounds error, no behavioral difference will be observed and this FDM can be safely ignored.
If the original code relies on out-of-bounds access raising an error (e.g., for flow control), add explicit bounds checking in Snowflake using ARRAY_SIZE before accessing the array.
SSC-FDM-BQ0002¶
Exception system variables are not supported in Snowflake.
Description¶
BigQuery’s exception system variables (@@error.message, @@error.stack_trace, @@error.statement_text, @@error.formatted_stack_trace) have no direct equivalent in Snowflake. SnowConvert AI replaces exception variable references with OBJECT_CONSTRUCT('SQLERRM', SQLERRM, 'SQLCODE', SQLCODE, 'SQLSTATE', SQLSTATE) as a workaround. This workaround provides basic error information but does not include stack trace or statement text details available in BigQuery. For more information, see Handling Exceptions in Snowflake.
Code Example¶
Input Code:¶
BigQuery¶
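A minimal sketch of BigQuery scripting that uses the exception system variables (the forced error is hypothetical):

```sql
-- BigQuery: exception system variables referenced in an EXCEPTION handler
BEGIN
    SELECT 1 / 0;  -- force a runtime error
EXCEPTION WHEN ERROR THEN
    SELECT @@error.message, @@error.statement_text;
END;
```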
Result¶
Generated Code:¶
Snowflake¶
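A hedged sketch of the corresponding Snowflake Scripting output with the OBJECT_CONSTRUCT workaround (the exact generated code may differ):

```sql
-- Snowflake Scripting: exception variables replaced with the
-- OBJECT_CONSTRUCT workaround built from SQLERRM, SQLCODE, and SQLSTATE
BEGIN
    SELECT 1 / 0;  -- force a runtime error
EXCEPTION
    WHEN OTHER THEN
        SELECT OBJECT_CONSTRUCT('SQLERRM', SQLERRM, 'SQLCODE', SQLCODE, 'SQLSTATE', SQLSTATE);
END;
```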
Result¶
Best Practices¶
Snowflake provides three built-in exception variables as an alternative to BigQuery's @@error system variables:

| BigQuery Variable | Snowflake Equivalent | Notes |
|---|---|---|
| @@error.message | SQLERRM | Error message text |
| @@error.statement_text | N/A | No direct equivalent in Snowflake |
| @@error.stack_trace | N/A | No direct equivalent in Snowflake |
| @@error.formatted_stack_trace | N/A | No direct equivalent in Snowflake |
| N/A | SQLSTATE | 5-character ANSI SQL state code |
| N/A | SQLCODE | 5-digit signed integer error code |
Review the generated OBJECT_CONSTRUCT('SQLERRM', SQLERRM, 'SQLCODE', SQLCODE, 'SQLSTATE', SQLSTATE) workaround and adjust it based on your specific error-handling requirements.
For more information, see Handling Exceptions in Snowflake.
SSC-FDM-BQ0003¶
Unable to generate correct return table clause due to missing dependent object information.
Note
This issue is deprecated and no longer generated by SnowConvert AI. Check SSC-EWI-BQ0009 for the issue now generated for this scenario.
Description¶
Snowflake requires a valid RETURNS TABLE clause for CREATE TABLE FUNCTION statements.
If the original BigQuery source code does not have a RETURNS TABLE clause, SnowConvert AI must build one. To do this, SnowConvert AI analyzes the CREATE TABLE FUNCTION query to infer the types of the columns of the resulting table. When SnowConvert AI cannot gather the required information, this FDM is added.
Code Example¶
Input Code:¶
BigQuery¶
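A minimal sketch of BigQuery input that triggers this (now deprecated) issue; the referenced dependent table is deliberately absent from the migration input (all names are hypothetical):

```sql
-- BigQuery: table function with no explicit RETURNS TABLE clause;
-- dependent_table is not defined anywhere in the migrated code
CREATE TABLE FUNCTION my_dataset.my_tf(min_id INT64)
AS
SELECT * FROM my_dataset.dependent_table WHERE id >= min_id;
```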
Generated Code:¶
Snowflake¶
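A hedged sketch of the kind of Snowflake output this scenario produced (the exact generated code and placeholder format may differ):

```sql
-- Snowflake: the RETURNS TABLE clause cannot be inferred without the
-- definition of dependent_table, so it is left incomplete
CREATE OR REPLACE FUNCTION my_dataset.my_tf (min_id INT)
RETURNS TABLE ()
AS
$$
    SELECT * FROM my_dataset.dependent_table WHERE id >= min_id
$$;
```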
Best Practices¶
Always try to include any dependent object definitions in the input code, so that SnowConvert AI has access to important information.
If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-BQ0004¶
The INFER_SCHEMA function requires a file path without wildcards to generate the table template; replace the FILE_PATH placeholder with it.
Warning
This FDM is deprecated; please refer to SSC-FDM-0035 for the latest version of this FDM.
Description¶
The INFER_SCHEMA function is used in Snowflake to generate the column definitions of a table based on the structure of a file. It requires a LOCATION parameter that specifies the path to a file or folder that will be used to construct the table columns. However, this path does not support regex, meaning that the wildcard * character is not supported.
When the table has no columns, SnowConvert AI will check all URIs to find one that does not use wildcards and use it in the INFER_SCHEMA function. When no URI meets this criteria, this FDM and a FILE_PATH placeholder are generated, and the placeholder has to be replaced with the path of one of the files referenced by the external table to generate the table columns.
Code Example¶
Input Code:¶
BigQuery¶
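A minimal sketch of BigQuery input where every URI uses a wildcard (bucket and table names are hypothetical):

```sql
-- BigQuery: external table with no column list and only wildcard URIs
CREATE EXTERNAL TABLE my_dataset.my_ext_table
OPTIONS (
    format = 'PARQUET',
    uris = ['gs://my_bucket/data/*.parquet']
);
```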
Generated Code:¶
Snowflake¶
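A hedged sketch of the kind of Snowflake output this scenario produced; the stage and file format names are assumptions, and the exact generated code may differ:

```sql
-- Snowflake: INFER_SCHEMA needs a concrete file path, so a FILE_PATH
-- placeholder is emitted and must be replaced manually
CREATE OR REPLACE TABLE my_dataset.my_ext_table
USING TEMPLATE (
    SELECT ARRAY_AGG(OBJECT_CONSTRUCT(*))
    FROM TABLE(INFER_SCHEMA(
        LOCATION => '@my_stage/FILE_PATH',
        FILE_FORMAT => 'my_parquet_format'
    ))
);
```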
Best Practices¶
If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-BQ0005¶
Parsing the CSV header is not supported in external tables; columns must be renamed to match the original names.
Description¶
Snowflake external tables do not support parsing the header of CSV files. SKIP_HEADER is used as a workaround to avoid runtime errors, but the resulting table column names will have auto-generated names (c1, c2, …, cN) instead of the original header names.
When SnowConvert AI detects an external table with CSV file format and no explicit column list, it adds the SKIP_HEADER = 1 file format option. The columns must be manually renamed to match the original names from the CSV header.
Code Example¶
Input Code:¶
BigQuery¶
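A minimal sketch of BigQuery input that reads column names from the CSV header (bucket and table names are hypothetical):

```sql
-- BigQuery: external table over CSV files, deriving column names
-- from the file header
CREATE EXTERNAL TABLE my_dataset.my_ext_table
OPTIONS (
    format = 'CSV',
    skip_leading_rows = 1,
    uris = ['gs://my_bucket/data/file.csv']
);
```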
Generated Code:¶
Snowflake¶
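A hedged sketch of the corresponding Snowflake output; the stage name is an assumption and the exact generated code may differ:

```sql
-- Snowflake: the CSV header cannot be parsed for column names.
-- SKIP_HEADER = 1 avoids runtime errors, and the resulting columns
-- carry auto-generated names (c1, c2, ..., cN) that must be renamed.
CREATE OR REPLACE EXTERNAL TABLE my_dataset.my_ext_table
LOCATION = @my_stage/data/
FILE_FORMAT = (TYPE = CSV, SKIP_HEADER = 1);
```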
Best Practices¶
Rename the auto-generated column names (c1, c2, …, cN) back to the original column names from the CSV file header.
If the original column names are known, use ALTER TABLE ... RENAME COLUMN or recreate the external table with explicit column definitions.
For non-external-table loading scenarios, consider using MATCH_BY_COLUMN_NAME with PARSE_HEADER = TRUE in the file format to automatically match columns by header names.
SSC-FDM-BQ0006¶
Reading from Google Drive is not supported in Snowflake; upload the files to the external location and replace the FILE_PATH placeholders.
Description¶
Snowflake does not support reading data from files hosted in Google Drive; this FDM is generated to flag such cases. Upload the Google Drive files to the external location so they can be accessed through the external stage.
The PATTERN clause will hold autogenerated placeholders FILE_PATH0, FILE_PATH1, …, FILE_PATHN that should be replaced with the file/folder paths after the files have been moved to the external location.
Code Example¶
Input Code:¶
BigQuery¶
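A minimal sketch of BigQuery input referencing a Google Drive file (the URI and table name are hypothetical):

```sql
-- BigQuery: external table reading a file hosted in Google Drive
CREATE EXTERNAL TABLE my_dataset.my_ext_table
OPTIONS (
    format = 'CSV',
    uris = ['https://drive.google.com/open?id=some-file-id']
);
```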
Generated Code:¶
Snowflake¶
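A hedged sketch of the corresponding Snowflake output with the autogenerated placeholder; the stage name is an assumption and the exact generated code may differ:

```sql
-- Snowflake: Google Drive is not readable; the FILE_PATH0 placeholder
-- must be replaced with the path of the relocated file
CREATE OR REPLACE EXTERNAL TABLE my_dataset.my_ext_table
LOCATION = @my_stage
PATTERN = 'FILE_PATH0'
FILE_FORMAT = (TYPE = CSV);
```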
Best Practices¶
Download the files from Google Drive and upload them to a cloud storage location accessible by Snowflake (e.g., Amazon S3, Azure Blob Storage, or Google Cloud Storage).
Create or configure an external stage in Snowflake pointing to the cloud storage location.
Replace the FILE_PATH placeholders in the PATTERN clause with the actual file or folder paths relative to the external stage.
SSC-FDM-BQ0007¶
The GOOGLE_SHEETS format is not supported in Snowflake. CSV file type is used as a workaround.
Description¶
The GOOGLE_SHEETS format is not supported in Snowflake. CSV file type is used as a workaround because the structure of Google Sheets data is similar to CSV.
When SnowConvert AI detects an external table using the GOOGLE_SHEETS format, it produces an external table with the CSV file format instead. The resulting table expects a CSV file rather than a Google Sheets source.
Code Example¶
Input Code:¶
BigQuery¶
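A minimal sketch of BigQuery input using the GOOGLE_SHEETS format (the URI and table name are hypothetical):

```sql
-- BigQuery: external table over a Google Sheets source
CREATE EXTERNAL TABLE my_dataset.my_sheet_table
OPTIONS (
    format = 'GOOGLE_SHEETS',
    uris = ['https://docs.google.com/spreadsheets/d/some-sheet-id']
);
```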
Generated Code:¶
Snowflake¶
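A hedged sketch of the corresponding Snowflake output; the stage name is an assumption and the exact generated code may differ:

```sql
-- Snowflake: CSV is used as a workaround for the unsupported
-- GOOGLE_SHEETS format; the table now expects CSV files on the stage
CREATE OR REPLACE EXTERNAL TABLE my_dataset.my_sheet_table
LOCATION = @my_stage
FILE_FORMAT = (TYPE = CSV);
```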
Best Practices¶
Export the Google Sheets data as CSV files and upload them to a cloud storage location accessible by Snowflake.
Verify that the CSV export preserves the expected data types and formatting, especially for dates, numbers, and text fields with commas.
If the external table also references Google Drive URIs, see SSC-FDM-BQ0006 for instructions on migrating the files to an external stage.
SSC-FDM-BQ0008¶
Where clause references a column of STRUCT type. Comparison operations may produce different results in Snowflake.
Description¶
BigQuery STRUCT types have no direct equivalent in Snowflake. VARIANT is used as a workaround (see SSC-FDM-0034). When a comparison involves a Snowflake VARIANT created from a BigQuery STRUCT, the results may differ because Snowflake compares both keys and values, whereas BigQuery compares only values regardless of field names.
This FDM is added when a WHERE clause comparison involves a column of STRUCT type that was converted to VARIANT.
Code Example¶
Input Code:¶
BigQuery¶
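A minimal sketch of BigQuery input comparing STRUCT columns in a WHERE clause (all names are hypothetical):

```sql
-- BigQuery: STRUCT equality compares values only, ignoring field names,
-- so STRUCT(1 AS sc1) and STRUCT(1 AS sc2) are considered equal
CREATE TABLE my_dataset.my_table (
    col1 STRUCT<sc1 INT64>,
    col2 STRUCT<sc2 INT64>
);

SELECT * FROM my_dataset.my_table WHERE col1 <> col2;
```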
Output Code:¶
Snowflake¶
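A hedged sketch of the corresponding Snowflake output (the exact code and annotation format emitted by SnowConvert AI may differ):

```sql
-- SSC-FDM-BQ0008: the STRUCT columns were converted to VARIANT.
-- Snowflake compares both keys and values, so {"sc1": 1} <> {"sc2": 1}
-- evaluates differently than the original BigQuery comparison.
SELECT * FROM my_dataset.my_table WHERE col1 <> col2;
```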
Best Practices¶
Review WHERE clause comparisons involving STRUCT-derived VARIANT columns. If the original BigQuery query compared STRUCTs by value only, extract and compare individual fields explicitly in Snowflake.
For example, replace WHERE col1 <> col2 with WHERE col1:sc1 <> col2:sc2 to compare specific field values instead of the entire VARIANT object.
For more information on VARIANT comparison behavior, see the Snowflake VARIANT documentation.
SSC-FDM-BQ0010¶
Geography function is not required in Snowflake.
Description¶
Snowflake automatically detects GEOGRAPHY data from WGS 84 formatted strings (WKT, WKB, GeoJSON), so explicit geography conversion functions like ST_GEOGFROMTEXT are not required in VALUES clause inserts. SnowConvert AI removes the function call and passes the string literal directly. This FDM is added to notify that the geography function was removed.
Code Example¶
Input Code:¶
BigQuery¶
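A minimal sketch of BigQuery input using an explicit geography conversion in a VALUES clause (table and column names are hypothetical):

```sql
-- BigQuery: explicit conversion from a WKT string to GEOGRAPHY
INSERT INTO my_dataset.geo_table (g)
VALUES (ST_GEOGFROMTEXT('POINT(-122.35 47.62)'));
```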
Output Code:¶
Snowflake¶
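A hedged sketch of the corresponding Snowflake output (the exact generated code may differ):

```sql
-- Snowflake: the conversion function is removed; the WKT string literal
-- is passed directly and implicitly converted to GEOGRAPHY on insert
INSERT INTO my_dataset.geo_table (g)
VALUES ('POINT(-122.35 47.62)');
```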
Best Practices¶
This FDM can be safely ignored in most cases. Snowflake natively supports GEOGRAPHY data from WKT, WKB, and GeoJSON string formats without requiring explicit conversion functions.
If the removed function performed validation or transformation beyond simple type casting, verify that the inserted data is valid GEOGRAPHY data in Snowflake.
For more information, see the Snowflake GEOGRAPHY data type documentation.
SSC-FDM-BQ0011¶
Named parameters in this script were transformed to Snowflake CLI variables.
Description¶
BigQuery supports named parameters using the @parameter_name syntax in queries. SnowConvert AI transforms these named parameters to Snowflake CLI variables using the <% parameter_name %> syntax.
To execute the transformed .sql scripts containing named parameters, use Snowflake CLI with variable substitution.
For more information on how to set up and use Snowflake CLI, see What is Snowflake CLI?
Code Example¶
Input Code:¶
BigQuery¶
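A minimal sketch of BigQuery input using a named parameter (table and parameter names are hypothetical):

```sql
-- BigQuery: named parameter referenced with the @parameter_name syntax
SELECT * FROM my_dataset.my_table WHERE id = @min_id;
```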
Example execution (using the bq query command)¶
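A hedged sketch of how the named parameter might be supplied with the bq command-line tool (parameter name, type, and value are hypothetical):

```shell
# Pass the named parameter as name:type:value to bq query
bq query --use_legacy_sql=false \
    --parameter=min_id:INT64:100 \
    'SELECT * FROM my_dataset.my_table WHERE id = @min_id'
```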
Output Code:¶
Snowflake¶
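A hedged sketch of the corresponding Snowflake output with the parameter transformed to a Snowflake CLI template variable (the exact generated code may differ):

```sql
-- Snowflake: the named parameter becomes a Snowflake CLI variable
SELECT * FROM my_dataset.my_table WHERE id = <% min_id %>;
```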
Example execution (Snowflake CLI)¶
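A hedged sketch of how the variable might be substituted when running the script with Snowflake CLI (script name and value are hypothetical):

```shell
# Substitute the template variable with the -D flag
snow sql -f script.sql -D "min_id=100"
```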
Best Practices¶
Install and configure Snowflake CLI to execute the transformed scripts with variable substitution using the -D flag (e.g., snow sql -f script.sql -D "param=value").
Review each transformed <% parameter_name %> variable to ensure the parameter name and intended value match the original BigQuery @parameter_name usage.
If the transformed script will be executed outside of Snowflake CLI (e.g., in a Snowflake worksheet), replace <% parameter_name %> variables with literal values or session variables as appropriate.
SSC-FDM-BQ0012¶
Select * with multiple UNNEST operators will produce column ambiguity in Snowflake
Description¶
As part of the SnowConvert AI transformation for the UNNEST operator, the FLATTEN function is used. This function generates several columns that are not required to emulate the UNNEST operator's functionality, such as the THIS and PATH columns.
When a SELECT * with the UNNEST operator is found, SnowConvert AI removes the unnecessary columns using the EXCLUDE keyword. However, when multiple UNNEST operators are used in the same statement, the columns cannot be removed due to ambiguity problems; this FDM is generated to mark these cases.
It is recommended to expand the SELECT expression list in order to specify only the expected columns and solve this issue.
Code Example¶
Input Code:¶
BigQuery¶
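A minimal sketch of BigQuery input combining SELECT * with multiple UNNEST operators (all names are hypothetical):

```sql
-- BigQuery: SELECT * combined with two UNNEST operators
SELECT *
FROM my_dataset.my_table,
    UNNEST(array_col1) AS item1,
    UNNEST(array_col2) AS item2;
```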
Generated Code:¶
Snowflake¶
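A hedged sketch of the corresponding Snowflake output (the exact generated code may differ):

```sql
-- Snowflake: each FLATTEN call adds its own metadata columns (SEQ, KEY,
-- PATH, INDEX, THIS); with two FLATTENs these names collide, so EXCLUDE
-- cannot be applied and SELECT * returns ambiguous duplicate columns
SELECT *
FROM my_dataset.my_table,
    LATERAL FLATTEN(INPUT => array_col1) item1,
    LATERAL FLATTEN(INPUT => array_col2) item2;
```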
Recommendations¶
Expand the SELECT list: Replace SELECT * with an explicit column list specifying only the columns you need from each UNNEST/FLATTEN result. This eliminates the ambiguity caused by duplicate metadata columns.
Use table aliases: Qualify each column reference with the corresponding table alias to avoid ambiguity between the FLATTEN results.