SnowConvert: Teradata Functional Differences

SSC-FDM-TD0001

Description

This message appears when SnowConvert detects a BLOB data type. Since Snowflake does not support BLOB, it is automatically converted to the BINARY data type.

Code Example

Input Code:
 CREATE TABLE TableExample
(
ColumnExample BLOB
)
Generated Code:
 CREATE OR REPLACE TABLE TableExample
(
ColumnExample BINARY /*** SSC-FDM-TD0001 - COLUMN CONVERTED FROM BLOB DATA TYPE ***/
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
;

Best Practices

SSC-FDM-TD0002

Description

This message appears when SnowConvert encounters a CLOB data type. Since Snowflake does not support CLOB, it is automatically converted to VARCHAR.

Code Example

Input Code:
 CREATE TABLE TableExample
(
ColumnExample CLOB
)
Generated Code:
 CREATE OR REPLACE TABLE TableExample
(
ColumnExample VARCHAR /*** SSC-FDM-TD0002 - COLUMN CONVERTED FROM CLOB DATA TYPE ***/
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
;

Best Practices

SSC-FDM-TD0003

Description

When SnowConvert migrates script files to Snowflake Scripting, it automatically converts Bash-style variable placeholders ($variable or ${variable}) into their SnowSQL equivalent format (&variable or &{variable}).

This script requires SnowSQL, with variable substitution enabled, to run successfully.

Example Code

Input Code:
 .LOGON dbc, dbc;

select '$variable', '${variable}', '${variable}_concatenated';

select $colname from $tablename where info = $id;

select ${colname} from ${tablename} where info = ${id};

.LOGOFF;
Generated Code:
-- Additional Params: -q snowscript

EXECUTE IMMEDIATE
$$
  --** SSC-FDM-TD0003 - BASH VARIABLES FOUND, USING SNOWSQL WITH VARIABLE SUBSTITUTION ENABLED IS REQUIRED TO RUN THIS SCRIPT **
  DECLARE
    STATUS_OBJECT OBJECT := OBJECT_CONSTRUCT('SQLCODE', 0);
  BEGIN
    --.LOGON dbc, dbc
    !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'BTLogOn' NODE ***/!!!
    null;
    BEGIN
      SELECT
        '&variable',
        '&{variable}',
        '&{variable}_concatenated';
      STATUS_OBJECT := OBJECT_CONSTRUCT('SQLROWCOUNT', SQLROWCOUNT);
    EXCEPTION
      WHEN OTHER THEN
        STATUS_OBJECT := OBJECT_CONSTRUCT('SQLCODE', SQLCODE, 'SQLERRM', SQLERRM, 'SQLSTATE', SQLSTATE);
    END;
    BEGIN
      SELECT
        &colname
      from
        &tablename
      where
        info = &id;
      STATUS_OBJECT := OBJECT_CONSTRUCT('SQLROWCOUNT', SQLROWCOUNT);
    EXCEPTION
      WHEN OTHER THEN
        STATUS_OBJECT := OBJECT_CONSTRUCT('SQLCODE', SQLCODE, 'SQLERRM', SQLERRM, 'SQLSTATE', SQLSTATE);
    END;
    BEGIN
      SELECT
        &{colname}
      from
        &{tablename}
      where
        info = &{id};
      STATUS_OBJECT := OBJECT_CONSTRUCT('SQLROWCOUNT', SQLROWCOUNT);
    EXCEPTION
      WHEN OTHER THEN
        STATUS_OBJECT := OBJECT_CONSTRUCT('SQLCODE', SQLCODE, 'SQLERRM', SQLERRM, 'SQLSTATE', SQLSTATE);
    END;
    --.LOGOFF
    !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'LogOff' NODE ***/!!!
    null;
  END
$$
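The placeholder rewrite itself is mechanical. A minimal Python sketch of the substitution (a hypothetical illustration, not SnowConvert's actual implementation) looks like this:

```python
import re

def bash_to_snowsql_vars(sql: str) -> str:
    """Rewrite Bash-style placeholders to SnowSQL substitution syntax:
    ${name} -> &{name} and $name -> &name."""
    sql = re.sub(r"\$\{(\w+)\}", r"&{\1}", sql)  # braced form first
    sql = re.sub(r"\$(\w+)", r"&\1", sql)        # then the bare form
    return sql

print(bash_to_snowsql_vars("select $colname from ${tablename};"))
# select &colname from &{tablename};
```

When running the migrated script, variable substitution must be enabled in SnowSQL (for example with `-o variable_substitution=true`, supplying values via `-D name=value`).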

Best Practices

SSC-FDM-TD0004

Description

Teradata’s PERIOD data type represents time intervals. Each PERIOD has a start and end value of the same type (TIME, DATE, or TIMESTAMP). Teradata provides built-in functions like PERIOD, BEGIN, END, and OVERLAPS to create and manage these time intervals.

Since Snowflake does not support the period data type, SnowConvert converts this type and its associated functions using specific transformation rules:

  • Period type declarations in table columns are converted into two columns of identical type.

  • The period value constructor function is split into two separate constructors: one for the start value and one for the end value.

  • Functions that require period type parameters are converted to User-Defined Functions (UDFs). These UDFs typically require two parameters: one for the start value and one for the end value.

Example code

Input code:
 -- Additional Params: --SplitPeriodDatatype
CREATE TABLE DateTable
(
	COL1 PERIOD(DATE) DEFAULT PERIOD (DATE '2005-02-03', UNTIL_CHANGED)
);
Generated Code:
CREATE OR REPLACE TABLE DateTable
(
	COL1_begin DATE DEFAULT DATE '2005-02-03',
	COL1_end DATE DEFAULT DATE '9999-12-31' /*** SSC-FDM-TD0004 - PERIOD DATA TYPES ARE HANDLED AS TWO DATA FIELDS ***/
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
;
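Operations on the split representation still follow Teradata's period semantics, where each period is a half-open interval (begin inclusive, end exclusive). A small sketch of an OVERLAPS check over the two generated columns, under that assumption:

```python
from datetime import date

def periods_overlap(b1: date, e1: date, b2: date, e2: date) -> bool:
    """Teradata-style OVERLAPS on a split PERIOD: each period is the
    half-open interval [begin, end)."""
    return b1 < e2 and b2 < e1

# COL1 split into COL1_begin / COL1_end, as in the generated table
assert periods_overlap(date(2005, 2, 3), date(9999, 12, 31),
                       date(2020, 1, 1), date(2020, 12, 31))
```

Note that two periods that merely touch at a boundary do not overlap under half-open semantics.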

Best Practices

SSC-FDM-TD0005

Description

Teradata allows users to set any time zone offset between -12:59 and +14:00 using the SET TIME ZONE command. However, Snowflake only supports time zones that are officially listed in the IANA Time Zone Database.

When the offset specified in SET TIME ZONE does not exactly match any IANA time zone, SnowConvert rounds it to the closest matching IANA time zone and emits a warning message to notify you of the adjustment.

Example Code

Input Code:
-- Will be rounded to Asia/Colombo (+05:30)
SET TIME ZONE '05:26';
Generated Code:
 -- Will be rounded to Asia/Colombo (+05:30)
--** SSC-FDM-TD0005 - NON-STANDARD TIME ZONE OFFSETS NOT SUPPORTED IN SNOWFLAKE, ROUNDED TO NEAREST VALID TIME ZONE **
ALTER SESSION SET TIMEZONE = 'Asia/Colombo';
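The rounding can be pictured as a nearest-neighbor search over valid offsets. The sketch below is a toy model only: the candidate list is a small hand-picked sample of real IANA offsets, whereas the actual conversion considers the full IANA database.

```python
def nearest_offset(minutes: int, candidates=(270, 300, 330, 345, 360)):
    """Pick the candidate UTC offset (in minutes) closest to the requested
    one. Candidates here are a sample of real IANA offsets:
    +04:30, +05:00, +05:30, +05:45, +06:00."""
    return min(candidates, key=lambda c: abs(c - minutes))

# '05:26' (326 minutes) rounds to +05:30, i.e. Asia/Colombo
assert nearest_offset(5 * 60 + 26) == 330
```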

Best Practices

SSC-FDM-TD0006

Description

This message appears when SnowConvert detects a view containing the WITH CHECK OPTION clause. Since Snowflake does not support this feature, the clause is automatically commented out in the converted code.

The WITH CHECK OPTION clause applies to updatable views, which support INSERT and UPDATE operations whose changes are applied to the underlying table associated with the view. The clause restricts those operations to rows that satisfy the view's WHERE condition.

For more information about this clause and its functionality, please refer to the official Teradata documentation.

Example code

Input code:
REPLACE VIEW VIEWWITHOPTIONTEST AS
LOCKING ROW FOR ACCESS
SELECT 
    *        
FROM SOMETABLE
WHERE app_id = 'SUPPLIER'
WITH CHECK OPTION;
Generated Code:
 CREATE OR REPLACE VIEW VIEWWITHOPTIONTEST
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
AS
SELECT
    *
FROM
    SOMETABLE
WHERE app_id = 'SUPPLIER'
--    --** SSC-FDM-TD0006 - VIEW WITH OPTION NOT SUPPORTED IN SNOWFLAKE **
--    WITH CHECK OPTION
                     ;
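If the check-option behavior must be preserved after migration, one option is to enforce it at the application layer. The sketch below is a hypothetical illustration of the idea, not a SnowConvert feature: rows are validated against the view's predicate before being inserted.

```python
def insert_with_check(rows, predicate, table):
    """Emulate WITH CHECK OPTION outside the database: only rows that
    satisfy the view's WHERE predicate may be inserted."""
    rejected = [r for r in rows if not predicate(r)]
    if rejected:
        raise ValueError(f"{len(rejected)} row(s) violate the view predicate")
    table.extend(rows)

table = []
pred = lambda r: r["app_id"] == "SUPPLIER"   # the view's WHERE condition
insert_with_check([{"app_id": "SUPPLIER", "x": 1}], pred, table)
assert table == [{"app_id": "SUPPLIER", "x": 1}]
```

Alternatives include a Snowflake stored procedure that performs the same validation before the DML statement.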

Best Practices

SSC-FDM-TD0007

Description

This message appears when SnowConvert encounters a COLLATE clause while transforming code with a Variant data type. Since Variant data types do not support COLLATE clauses, SnowConvert will remove the COLLATE clause and display a notification message.

Example code

Input code:
CREATE TABLE TableExample
(
ColumnExample JSON(2500) NOT CASESPECIFIC
)
Generated Code:
 CREATE OR REPLACE TABLE TableExample
(
ColumnExample VARIANT
--                      NOT CASESPECIFIC /*** SSC-FDM-TD0007 - VARIANT COLUMN DOES NOT SUPPORT COLLATION ***/
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
;

The JSON data type is automatically converted to VARIANT. A NOT CASESPECIFIC specification would normally be transformed into an equivalent COLLATE clause, but since VARIANT columns do not support collation, it is commented out instead.

Best Practices

SSC-FDM-TD0008

Description

When the fourth parameter of NVP_UDF (the value delimiter) is non-literal and contains a backslash, the backslash must be escaped in Snowflake to ensure proper functionality.

Example code

Input code
SELECT NVP('store = whole foods&&store: ?Bristol farms','store', '&&', valueDelimiter, 2);
Output code
 SELECT
PUBLIC.NVP_UDF('store = whole foods&&store: ?Bristol farms', 'store', '&&', valueDelimiter, 2) /*** SSC-FDM-TD0008 - WHEN NVP_UDF FOURTH PARAMETER IS NON-LITERAL AND IT CONTAINS A BACKSLASH, THAT BACKSLASH NEEDS TO BE ESCAPED ***/;
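The NVP semantics can be pictured with a toy re-implementation. This sketch supports only a single value delimiter; real Teradata NVP accepts space-separated delimiter lists in its third and fourth parameters, which is exactly where unescaped backslashes become a problem in Snowflake.

```python
def nvp(text, name, name_delim, value_delim, occurrence):
    """Toy NVP: split `text` into pairs on `name_delim`, split each pair
    on `value_delim`, and return the value of the `occurrence`-th pair
    whose name matches."""
    hits = []
    for pair in text.split(name_delim):
        key, sep, value = pair.partition(value_delim)
        if sep and key.strip() == name:
            hits.append(value.strip())
    return hits[occurrence - 1] if len(hits) >= occurrence else None

s = 'store = whole foods&&store: ?Bristol farms'
assert nvp(s, 'store', '&&', '=', 1) == 'whole foods'
assert nvp(s, 'store', '&&', ':', 1) == '?Bristol farms'
```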

Best Practices

SSC-FDM-TD0009

Description

This message appears when SnowConvert detects a DEFAULT SESSION with a data type other than VARCHAR. In such cases, SnowConvert automatically converts the data type to VARCHAR and generates a notification message.

Code Example

Input Code:
 CREATE TABLE TableExample
(
ColumnExample INTEGER DEFAULT SESSION,
ColumnExample2 VARCHAR DEFAULT SESSION
)
Generated Code:
 CREATE OR REPLACE TABLE TableExample
(
ColumnExample VARCHAR DEFAULT CURRENT_SESSION() /*** SSC-FDM-TD0009 - CONVERTED FROM INTEGER TO VARCHAR FOR CURRENT_SESSION DEFAULT ***/,
ColumnExample2 VARCHAR DEFAULT CURRENT_SESSION()
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
;

Let’s examine this example. The column named “ColumnExample” is defined with an INTEGER data type and has a DEFAULT SESSION setting. Because the data type is INTEGER and not VARCHAR, the system automatically converts it to VARCHAR in the output.

The data type of ColumnExample2 remains unchanged because it is already defined as VARCHAR.

Best Practices

SSC-FDM-TD0010

Description

The Teradata table DBC.COLUMNSV is mapped to Snowflake’s INFORMATION_SCHEMA.COLUMNS. However, please note that:

  1. Some Teradata columns do not have corresponding columns in Snowflake

  2. When columns do match between systems, the data content might differ

A sample view of the DBC.COLUMNSV table structure in Teradata

A sample view of the INFORMATION_SCHEMA.COLUMNS table in Snowflake

Notice that Snowflake does not have an equivalent column for “ColumnFormat”. Additionally, while “DATA_TYPE” appears to correspond to Teradata’s “ColumnType” column, their contents are significantly different.

Code Example

Input Code:
 SELECT columnname FROM dbc.columnsV WHERE tablename = 'TableN';
Generated Code:
 SELECT
COLUMN_NAME AS COLUMNNAME
FROM
--** SSC-FDM-TD0010 - USES OF TABLE DBC.COLUMNSV ARE CONVERTED TO INFORMATION_SCHEMA.COLUMNS, BUT SOME COLUMNS MIGHT NOT HAVE AND EXACT MATCH IN SNOWFLAKE **
INFORMATION_SCHEMA.COLUMNS
WHERE
TABLE_NAME = 'TableN';

Best Practices

  • Compare the columns used in Teradata with those available in Snowflake to ensure they meet your requirements.

  • For additional assistance, contact us at snowconvert-support@snowflake.com

SSC-FDM-TD0011

Description

Snowflake does not support Unicode Basic Multilingual Plane (BMP) escape sequences. This message appears when SnowConvert converts a Teradata Unicode Delimited Character Literal containing BMP escape sequences to Snowflake format.

Example code

Input Code:
 SELECT U&'hola #+005132 mundo' UESCAPE '#';
Generated Code:
 SELECT
--** SSC-FDM-TD0011 - UNICODE BMP IS NOT SUPPORTED IN SNOWFLAKE **
'hola \u+005132 mundo';
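The literal's meaning can be recovered by decoding the escape sequences manually. The sketch below assumes the Teradata forms: the escape character followed by four hex digits, `+` plus six hex digits, or a doubled escape character standing for itself.

```python
import re

def decode_uescape(body: str, esc: str = "#") -> str:
    """Decode a Teradata Unicode delimited literal body: esc+XXXXXX is a
    6-hex-digit code point, escXXXX a 4-digit one, and a doubled escape
    character stands for itself."""
    e = re.escape(esc)
    pattern = re.compile(e + r"(?:\+([0-9A-Fa-f]{6})|([0-9A-Fa-f]{4})|" + e + ")")
    def repl(m):
        digits = m.group(1) or m.group(2)
        return chr(int(digits, 16)) if digits else esc
    return pattern.sub(repl, body)

assert decode_uescape("hola #0041 mundo") == "hola A mundo"
assert decode_uescape("100## percent") == "100# percent"
```

Once decoded, the resulting characters can be embedded directly in a Snowflake string literal.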

Best Practices

SSC-FDM-TD0012

Note

This FDM has been deprecated. For more information, please see SSC-EWI-TD0006.

Description

The FLOAT data type does not support default values using DEFAULT TIME, DEFAULT DATE, DEFAULT CURRENT_DATE, DEFAULT CURRENT_TIME, or DEFAULT CURRENT_TIMESTAMP specifications.

Example Code

Teradata:
CREATE TABLE T_2004
(
    -- In the output code all of these columns will be FLOAT type
    -- and will include the SSC-FDM-TD0012 message.
    COL1 FLOAT DEFAULT TIME,
    COL2 FLOAT DEFAULT DATE,
    COL3 FLOAT DEFAULT CURRENT_DATE,
    COL4 FLOAT DEFAULT CURRENT_TIME,
    COL5 FLOAT DEFAULT CURRENT_TIMESTAMP
);
Snowflake Scripting
 CREATE TABLE T_2004
(
    -- In the output code all of these columns will be FLOAT type
    -- and will include the SSC-FDM-TD0012 message.
    COL1 FLOAT DEFAULT TIME /*** SSC-FDM-TD0012 - DEFAULT CURRENT_TIME NOT VALID FOR DATA TYPE ***/,
    COL2 FLOAT DEFAULT DATE /*** SSC-FDM-TD0012 - DEFAULT CURRENT_DATE NOT VALID FOR DATA TYPE ***/,
    COL3 FLOAT DEFAULT CURRENT_DATE /*** SSC-FDM-TD0012 - DEFAULT CURRENT_DATE NOT VALID FOR DATA TYPE ***/,
    COL4 FLOAT DEFAULT CURRENT_TIME /*** SSC-FDM-TD0012 - DEFAULT CURRENT_TIME NOT VALID FOR DATA TYPE ***/,
    COL5 FLOAT DEFAULT CURRENT_TIMESTAMP /*** SSC-FDM-TD0012 - DEFAULT CURRENT_TIMESTAMP NOT VALID FOR DATA TYPE ***/
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
;

Best Practices

SSC-FDM-TD0013

Description

This message appears because the error code stored in the BTEQ ERRORCODE built-in variable cannot be directly mapped to an equivalent code in Snowflake Scripting.

Example code

Input code:
SELECT * FROM table1;
 
.IF ERRORCODE<>0 THEN .EXIT 1

.QUIT 0
Output code:
 -- Additional Params: -q snowscript

EXECUTE IMMEDIATE
$$
  DECLARE
    STATUS_OBJECT OBJECT := OBJECT_CONSTRUCT('SQLCODE', 0);
  BEGIN
    BEGIN
      SELECT
        *
      FROM
        table1;
      STATUS_OBJECT := OBJECT_CONSTRUCT('SQLROWCOUNT', SQLROWCOUNT);
    EXCEPTION
      WHEN OTHER THEN
        STATUS_OBJECT := OBJECT_CONSTRUCT('SQLCODE', SQLCODE, 'SQLERRM', SQLERRM, 'SQLSTATE', SQLSTATE);
    END;
    IF (STATUS_OBJECT['SQLCODE'] /*** SSC-FDM-TD0013 - THE SNOWFLAKE ERROR CODE MISMATCH THE ORIGINAL TERADATA ERROR CODE ***/ != 0) THEN
      RETURN 1;
    END IF;
    RETURN 0;
  END
$$
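The STATUS_OBJECT pattern in the generated code can be pictured as follows. This is a hypothetical Python sketch of the same idea; the `-1` code is a placeholder, since the whole point of this FDM is that Snowflake's error codes do not match Teradata's.

```python
def run_with_status(execute, stmt, status):
    """Mimic the STATUS_OBJECT pattern: record the rowcount on success,
    or SQLCODE/SQLERRM-style fields on failure, and keep going."""
    try:
        rowcount = execute(stmt)
        status.update({"SQLCODE": 0, "SQLROWCOUNT": rowcount})
    except Exception as exc:
        # Snowflake's SQLCODE differs from Teradata's ERRORCODE; only
        # "zero vs non-zero" comparisons survive the migration intact.
        status.update({"SQLCODE": -1, "SQLERRM": str(exc)})
    return status

status = {"SQLCODE": 0}
run_with_status(lambda s: 1 / 0, "SELECT * FROM table1", status)
assert status["SQLCODE"] != 0  # mirrors .IF ERRORCODE <> 0
```

Comparisons against specific Teradata error codes should therefore be reviewed manually after conversion.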

Best Practices

SSC-FDM-TD0014

Description

This warning appears when migrating BTEQ code that executes SQL statements from an environment file (for example, $(<$INPUT_SQL_FILE)). There is an important behavioral difference to note: while BTEQ continues executing remaining statements even if one fails, the Python-generated code will stop execution when it encounters any error.

Example Code

Teradata BTEQ:
 .logmech LDAP;
.logon $LOGON_STR;
.SET DEFAULTS;


$(<$INPUT_SQL_FILE)

.export reset
.logoff
.quit
Python:
#*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***

from snowconvert.helpers import exec_file
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
#** SSC-FDM-TD0022 - SHELL VARIABLES FOUND, RUNNING THIS CODE IN A SHELL SCRIPT IS REQUIRED **
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. LOGMECH **
  #.logmech LDAP;
   
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. LOGON **
  #.logon $LOGON_STR
   
  #** SSC-EWI-TD0005 - THE STATEMENT WAS CONVERTED BUT ITS FUNCTIONALITY IS NOT IMPLEMENTED YET **
  Export.defaults()
  #** SSC-FDM-TD0014 - EXECUTION OF FILE WITH SQL STATEMENTS STOPS WHEN AN ERROR OCCURS **
  exec_file("$INPUT_SQL_FILE")
  #** SSC-EWI-TD0005 - THE STATEMENT WAS CONVERTED BUT ITS FUNCTIONALITY IS NOT IMPLEMENTED YET **
  Export.reset()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. LOGOFF **
  #.logoff
   
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()
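If BTEQ's continue-on-error behavior must be preserved, a wrapper along these lines can run each statement independently and collect failures instead of stopping. This is a hypothetical sketch, not part of the SnowConvert helpers, and the naive split on `;` would break on semicolons inside string literals.

```python
def exec_statements_continue_on_error(sql_text, execute):
    """Run each ;-terminated statement via `execute`, collecting failures
    instead of stopping at the first one, mirroring BTEQ's behavior."""
    errors = []
    for stmt in (s.strip() for s in sql_text.split(";")):
        if not stmt:
            continue
        try:
            execute(stmt)
        except Exception as exc:  # keep going, like BTEQ
            errors.append((stmt, exc))
    return errors

ran = []
def fake_execute(stmt):
    if "bad" in stmt:
        raise ValueError("syntax error")
    ran.append(stmt)

errs = exec_statements_continue_on_error("select 1; bad stmt; select 2;", fake_execute)
assert ran == ["select 1", "select 2"] and len(errs) == 1
```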

Best Practices

SSC-FDM-TD0015

Note

This FDM is no longer supported. For more information, please see SSC-EWI-0009.

Description

Snowflake supports only POSIX extended regular expression syntax. Advanced, Perl-style regular expression features are not available.

This warning appears whenever a function call to REGEXP_SUBSTR, REGEXP_REPLACE, or REGEXP_INSTR is converted to Snowflake. It alerts users that some regular expression features may not be supported in Snowflake. Notable unsupported features include lookahead, lookbehind, and non-capturing groups.

Example Code

Teradata:
 SELECT REGEXP_SUBSTR('qaqequ','q(?=u)', 1, 1);
Snowflake Scripting:
 SELECT
--** SSC-FDM-TD0015 - REGEXP_SUBSTR FUNCTION ONLY SUPPORTS POSIX REGULAR EXPRESSIONS **
REGEXP_SUBSTR('qaqequ','q(?=u)', 1, 1);

Best Practices

  • Review each regular expression pattern to determine if manual changes are required. For more details about Snowflake’s regex capabilities and alternative options, see the Snowflake documentation on regular expressions.

  • For additional assistance, contact us at snowconvert-support@snowflake.com

SSC-FDM-TD0016

Description

In Teradata, regular expression functions (REGEXP_SUBSTR, REGEXP_REPLACE, and REGEXP_INSTR) include a parameter called “match_arg”. This parameter is a character argument that accepts specific values:

  • 'i': Matches characters regardless of their case (uppercase or lowercase).

  • 'c': Matches characters exactly as they appear, considering case.

  • 'n': Allows the period (.) to match newline characters.

  • 'm': Treats the input string as multiple separate lines rather than one continuous line.

  • 'l': Returns NULL when the input string is larger than 16 MB, instead of generating an error.

  • 'x': Ignores spaces and whitespace characters in the pattern.

The parameter accepts a string of one or more of these characters.

In Snowflake, these functions use regexp_parameters as their equivalent argument. This argument is a string containing one or more characters that define how the regular expression pattern matching should behave. The supported values are:

  • c: Makes the pattern matching case-sensitive

  • i: Makes the pattern matching case-insensitive

  • m: Enables matching across multiple lines

  • e: Allows extraction of sub-patterns from the match

  • s: Allows the dot (.) wildcard to match newline characters

As shown, the values 'i', 'c', and 'm' are identical in both systems. The Teradata value 'n' corresponds to 's' in Snowflake. However, the Teradata values 'l' and 'x' have no matching equivalents.

When using the 'x' value, the functionality is replicated using the REGEXP_REPLACE function. However, the 'l' parameter cannot be replicated, which results in a warning message.

Input Code:

 SELECT REGEXP_SUBSTR('Chip Chop','ch(i|o)p', 1, 1, 'i'), 
       REGEXP_SUBSTR('Chip Chop','ch(i|o)p', 1, 1, 'c'),
       REGEXP_SUBSTR('Chip Chop','ch(i|o)p', 1, 1, 'm'),
       REGEXP_SUBSTR('Chip Chop','ch(i|o)p', 1, 1, 'n'),
       REGEXP_SUBSTR('Chip Chop','ch(i|o)p', 1, 1, 'l'),
       REGEXP_SUBSTR('Chip Chop','ch(i|o)p', 1, 1, 'x');
Generated Code:
 SELECT
       REGEXP_SUBSTR('Chip Chop', 'ch(i|o)p', 1, 1, 'i'),
       REGEXP_SUBSTR('Chip Chop', 'ch(i|o)p', 1, 1, 'c'),
       REGEXP_SUBSTR('Chip Chop', 'ch(i|o)p', 1, 1, 'm'),
       REGEXP_SUBSTR('Chip Chop', 'ch(i|o)p', 1, 1, 's'),
       --** SSC-FDM-TD0016 - VALUE 'l' FOR PARAMETER 'match_arg' IS NOT SUPPORTED IN SNOWFLAKE **
       REGEXP_SUBSTR('Chip Chop', 'ch(i|o)p', 1, 1),
       REGEXP_SUBSTR('Chip Chop', 'ch(i|o)p', 1, 1);
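The mapping rules above can be summarized in a short sketch (a hypothetical illustration of the conversion table, not SnowConvert's code). Note that in practice SnowConvert handles 'x' separately, by stripping whitespace from the pattern with REGEXP_REPLACE, while 'l' is simply dropped with this warning.

```python
def map_match_arg(match_arg: str):
    """Map Teradata match_arg characters to Snowflake regexp_parameters.
    'n' becomes 's'; characters with no equivalent are reported back."""
    mapping = {"i": "i", "c": "c", "m": "m", "n": "s"}
    converted, unsupported = "", []
    for ch in match_arg:
        if ch in mapping:
            converted += mapping[ch]
        else:
            unsupported.append(ch)
    return converted, unsupported

assert map_match_arg("in") == ("is", [])
assert map_match_arg("l") == ("", ["l"])
```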

Best Practices

SSC-FDM-TD0017

Note

This FDM has been deprecated. For more information, please see SSC-EWI-TD0076.

Description

Foreign tables in Teradata allow you to access data stored in external locations like Amazon S3, Azure Blob storage, and Google Cloud Storage. While Snowflake doesn’t support this specific syntax, you can achieve similar functionality using:

  • External tables

  • Iceberg tables

  • Standard tables


Example code

Input code:
 SELECT cust_id, income, age FROM 
FOREIGN TABLE (SELECT cust_id, income, age FROM twm_customer)@hadoop1 T1;
Output code:
 SELECT
cust_id,
income,
age FROM
--** SSC-FDM-TD0017 - THE USE OF FOREIGN TABLES IS NOT SUPPORTED IN SNOWFLAKE. **
 FOREIGN TABLE (SELECT cust_id, income, age FROM twm_customer)@hadoop1 T1;

Recommendations

  • To replace Teradata foreign tables, you can use Snowflake external tables. These tables allow you to query data stored in cloud storage platforms (Amazon S3, Google Cloud Storage, or Microsoft Azure) as if it were in a database. External tables support all data formats that are compatible with COPY INTO

  • Snowflake’s Iceberg tables provide another solution. These tables use open formats and store data in Parquet files within your own cloud storage.

  • Standard Snowflake tables can also provide similar functionality to Teradata foreign tables.

  • For additional assistance, please contact us at snowconvert-support@snowflake.com

SSC-FDM-TD0018

Note

This FDM is no longer supported. For more information, please see SSC-EWI-TD0063.

Description

This error occurs when SnowConvert is unable to process a JSON path because the string format is either invalid or not supported by Snowflake.

Example code

Input Code:
SELECT
    *
FROM
JSON_TABLE (
    ON (
        SELECT
            id,
            trainSchedule as ts
        FROM
            demo.PUBLIC.Train T
    ) USING rowexpr('$weekShedule.Monday[*]') colexpr(
        '[{"jsonpath"  "$.time",
              "type"" : "CHAR ( 12 )"}]'
    )
) AS JT(Id, Ordinal, Time, City);

Generated Code:
SELECT
    *
FROM
    --** SSC-FDM-TD0018 - UNRECOGNIZED JSON PATH $weekShedule.Monday[*] **
JSON_TABLE (
    ON
       !!!RESOLVE EWI!!! /*** SSC-EWI-0108 - THE FOLLOWING SUBQUERY MATCHES AT LEAST ONE OF THE PATTERNS CONSIDERED INVALID AND MAY PRODUCE COMPILATION ERRORS ***/!!! (
           SELECT
               id,
               trainSchedule as ts
           FROM
               demo.PUBLIC.Train T
    ) USING rowexpr('$weekShedule.Monday[*]') colexpr(
        '[{"jsonpath"  "$.time",
              "type"" : "CHAR ( 12 )"}]'
    )
) AS JT(Id, Ordinal, Time, City);

Recommendations

  • Verify that the JSON path is properly formatted and does not contain invalid characters.

  • For additional assistance, please contact us at snowconvert-support@snowflake.com

SSC-FDM-TD0019

Description

Teradata enables users to set query bands (metadata tags) at three different levels: transaction, session, and profile. Users can retrieve these query band values using functions such as GetQueryBandValue.

Snowflake uses the query_tag parameter instead of query bands. You can set query_tag at the session, user, or account level. Note that Snowflake does not support profiles.

This Functional Difference Message (FDM) alerts users that Snowflake does not support transaction or profile-level query tags. Instead, session-level query tags will be used as an alternative. This change may affect functionality in certain scenarios.

Example Code

Input Code:
SELECT GETQUERYBANDVALUE(3, 'account');

Generated Code:
SELECT
--** SSC-FDM-TD0019 - TRANSACTION AND PROFILE LEVEL QUERY TAGS NOT SUPPORTED IN SNOWFLAKE, REFERENCING SESSION QUERY TAG INSTEAD **
GETQUERYBANDVALUE_UDF('account');

Recommendations

SSC-FDM-TD0020

Note

For clarity, we have simplified the code by omitting some sections.

Description

This error occurs when SnowConvert attempts to process JSON data during a transformation, but encounters either incorrectly formatted JSON or invalid JSON content.

Example code

Input Code:
SELECT
*
FROM
 JSON_TABLE
(ON (SELECT id,
trainSchedule as ts
FROM demo.PUBLIC.Train T)
USING rowexpr('$.weekShedule.Monday[*]')
      colexpr('[ {"ordinal"  true},
                 {"jsonpath"  "$.time",
                  "type"" : "CHAR ( 12 )"},
                 {"jsonpath"  "$.city",
                  "type" : "VARCHAR ( 12 )"}]'))
AS JT(Id, Ordinal, Time, City);

SELECT
*
FROM
 JSON_TABLE
(ON (SELECT id,
trainSchedule as ts
FROM demo.PUBLIC.Train T)
USING rowexpr('$.weekShedule.Monday[*]')
      colexpr('{"jsonpath"  "$.time",
                  "type"" : "CHAR ( 12 )"}'))
AS JT(Id, Ordinal, Time, City);

Generated Code:
SELECT
*
FROM
(
 SELECT
  id
 --** SSC-FDM-TD0020 - UNRECOGNIZED JSON LITERAL [ {"ordinal" true}, {"jsonpath" "$.time", "type"" : "CHAR ( 12 )"}, {"jsonpath" "$.city", "type" : "VARCHAR ( 12 )"}] **
 FROM
  demo.PUBLIC.Train T,
  TABLE(FLATTEN(INPUT =>
  trainSchedule:weekShedule.Monday)) rowexpr
) JT;

SELECT
*
FROM
(
 SELECT
  id
 --** SSC-FDM-TD0020 - UNRECOGNIZED JSON LITERAL {"jsonpath" "$.time", "type"" : "CHAR ( 12 )"} **
 FROM
  demo.PUBLIC.Train T,
  TABLE(FLATTEN(INPUT =>
  trainSchedule:weekShedule.Monday)) rowexpr
) JT;

Recommendations

SSC-FDM-TD0021

Note

This FDM has been deprecated. Please refer to the SSC-EWI-TD0046 documentation for current information.

Description

This error occurs when running a query that references the DBC.DATABASES table and attempts to select a column that does not have a corresponding match in Snowflake.

Example Code

Input:
CREATE VIEW SAMPLE_VIEW
AS
SELECT PROTECTIONTYPE FROM DBC.DATABASES;

Output:
CREATE OR REPLACE VIEW SAMPLE_VIEW
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": {  "major": 0,  "minor": 0,  "patch": "0" }, "attributes": {  "component": "teradata",  "convertedOn": "08/14/2024" }}'
AS
SELECT
!!!RESOLVE EWI!!! /*** SSC-EWI-TD0046 - BUILT-IN REFERENCE TO PROTECTIONTYPE IS NOT SUPPORTED IN SNOWFLAKE ***/!!!
PROTECTIONTYPE FROM
INFORMATION_SCHEMA.DATABASES;

Recommendations

SSC-FDM-TD0022

Description

In Teradata scripts, shell variables serve as temporary storage containers for values that you can use and modify throughout your script. To create a shell variable, use a dollar sign ($) followed by a variable name. You can optionally enclose the variable name in curly braces {}. To assign a value to a shell variable, use the equals sign (=).

#!/bin/bash

## define a shell variable
tablename="mytable"

## use the variable in a Teradata SQL query
bteq <<EOF
    .LOGON myhost/myuser,mypassword
    SELECT * FROM ${tablename};
    .LOGOFF
EOF

Shell variables serve a similar purpose to string interpolation. When scripts are converted to Python, shell variables maintain their functionality by executing the converted code within a shell script (.sh file). To preserve this functionality, the shell variables in the converted code must match the format of the original input code.

Example Code

Input Code:
SELECT $column FROM ${tablename}

Generated Code:
#*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***

import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
#** SSC-FDM-TD0022 - SHELL VARIABLES FOUND, RUNNING THIS CODE IN A SHELL SCRIPT IS REQUIRED **
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  exec("""
    SELECT
      $column
    FROM
      ${tablename}
    """)
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

Recommendations

SSC-FDM-TD0023

Description

This Functional Difference Message (FDM) appears when SnowConvert converts the StringSimilarity function from Teradata to Snowflake. Please note that the function’s behavior may differ between the two platforms.

Example Code

Using the following data as an example:

Id | a              | b
1  |                |
2  | Gute nacht     | Ich weis nicht
3  | Ich weiß nicht | Ich wei? nicht
4  | Ich weiß nicht | Ich wei? nicht
5  | Ich weiß nicht | Ich weiss nicht
6  | Snowflake      | Oracle
7  | święta         | swieta
8  | NULL           |
9  | NULL           | NULL

Input Code:
-- Additional Params: -q SnowScript
SELECT * FROM StringSimilarity (
  ON (
    SELECT id, CAST(a AS VARCHAR(200)) AS a, CAST(b AS VARCHAR(200)) AS b
    FROM table_1
  ) PARTITION BY ANY
  USING
  ComparisonColumnPairs ('jaro_winkler(a,b) AS sim_fn')
  Accumulate ('id')
) AS dt ORDER BY 1;

Result:

Id | sim_fn
1  | 0
2  | 0.565079365
3  | 1
4  | 0.959047619
5  | 0
6  | 0.611111111
7  | 0.7777777777777777
8  | 0
9  | 0

Generated Code:
SELECT
* FROM
--** SSC-FDM-TD0023 - STRING SIMILARITY MIGHT HAVE A DIFFERENT BEHAVIOR. **
(
   SELECT
     id,
     JAROWINKLER_UDF(a, b) AS sim_fn
   FROM table_1
 ) dt ORDER BY 1;

Result:

ID | SIM_FN
1  | 0.000000
2  | 0.560000
3  | 0.970000
4  | 0.950000
5  | 0.000000
6  | 0.610000
7  | 0.770000
8  | 0.000000
9  | 0.000000
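The underlying metric can be approximated in plain Python. The sketch below implements the standard Jaro-Winkler formula; assuming that is what JAROWINKLER_UDF computes, the small numeric differences in the result tables above come from rounding and implementation details, which is exactly what this FDM warns about.

```python
def jaro(s1: str, s2: str) -> float:
    """Jaro similarity: common characters within a sliding window,
    penalized by transpositions."""
    if not s1 or not s2:
        return 0.0
    window = max(len(s1), len(s2)) // 2 - 1
    matched1, matched2_idx = [], []
    used = [False] * len(s2)
    for i, ch in enumerate(s1):
        lo, hi = max(0, i - window), min(len(s2), i + window + 1)
        for j in range(lo, hi):
            if not used[j] and s2[j] == ch:
                used[j] = True
                matched1.append(ch)
                matched2_idx.append(j)
                break
    if not matched1:
        return 0.0
    matched2 = [s2[j] for j in sorted(matched2_idx)]
    t = sum(a != b for a, b in zip(matched1, matched2)) / 2
    m = len(matched1)
    return (m / len(s1) + m / len(s2) + (m - t) / m) / 3

def jaro_winkler(s1: str, s2: str, p: float = 0.1) -> float:
    """Boost the Jaro score by the common prefix length (capped at 4)."""
    j = jaro(s1, s2)
    prefix = 0
    for a, b in zip(s1, s2):
        if a != b or prefix == 4:
            break
        prefix += 1
    return j + prefix * p * (1 - j)

# Classic textbook pair: Jaro 0.9444..., Jaro-Winkler 0.9611...
assert abs(jaro_winkler("MARTHA", "MARHTA") - 0.961111) < 1e-4
```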

Recommendations

SSC-FDM-TD0024

Description

This warning appears when SnowConvert detects a CREATE TABLE statement with the SET option. Since Snowflake does not support SET TABLE, SnowConvert removes this option during conversion.

Example Code

Teradata:
 CREATE SET TABLE TableExample
(
ColumnExample Number
)
 CREATE SET VOLATILE TABLE SOMETABLE, LOG AS 
(SELECT ColumnExample FROM TableExample);
Snowflake Scripting:
 --** SSC-FDM-TD0024 - SET TABLE FUNCTIONALITY NOT SUPPORTED. TABLE MIGHT HAVE DUPLICATE ROWS **
CREATE OR REPLACE TABLE TableExample
(
ColumnExample NUMBER(38, 18)
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
;
 --** SSC-FDM-TD0024 - SET TABLE FUNCTIONALITY NOT SUPPORTED. TABLE MIGHT HAVE DUPLICATE ROWS **
CREATE OR REPLACE TEMPORARY TABLE SOMETABLE
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
AS
(
SELECT
ColumnExample FROM
TableExample
);
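The functional difference is easy to picture: a Teradata SET table silently discards exact duplicate rows on insert, while Snowflake tables behave like MULTISET tables and keep them. A small sketch of the SET semantics that is lost:

```python
def set_table_insert(table: list, row: tuple) -> bool:
    """Teradata SET-table semantics: silently reject an exact duplicate
    row. Snowflake tables keep duplicates, so deduplication must be done
    explicitly (for example with SELECT DISTINCT or MERGE)."""
    if row in table:
        return False  # duplicate discarded, as a SET table would do
    table.append(row)
    return True

t = []
assert set_table_insert(t, (1, "a")) is True
assert set_table_insert(t, (1, "a")) is False  # second insert is a no-op
assert t == [(1, "a")]
```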

Recommendations

SSC-FDM-TD0025

Description

The Teradata Database’s temporal features, which include temporal tables and time-based operations (DDL and DML), cannot be directly replicated in Snowflake. Snowflake does not currently support temporal tables or time-aware data management in the same way as Teradata. For more information about Teradata’s temporal features, see Teradata Database Temporal Support.

These statements are recognized by SnowConvert during parsing, but they are removed during the translation process to ensure compatibility with Snowflake’s execution environment.

When an ABORT statement is encountered, it is converted to equivalent logic that evaluates the ABORT condition with a COUNT query and issues a ROLLBACK when any rows match. This preserves the original behavior of rolling back the transaction and restoring the database to its original state.

Example code

The following example demonstrates how a Temporal-form Select statement is converted into a standard Select statement.

Input code:
 SEQUENCED VALIDTIME  
   SELECT
   Policy_ID,
   Customer_ID
   FROM Policy
      WHERE Policy_Type = 'AU';
Output code:
 ----** SSC-FDM-TD0025 - TEMPORAL FORMS ARE NOT SUPPORTED IN SNOWFLAKE **
--SEQUENCED VALIDTIME
SELECT
   Policy_ID,
   Customer_ID
   FROM
   Policy
      WHERE Policy_Type = 'AU';

When a transaction needs to be rolled back, the Abort command is used to cancel all changes made during that transaction.

Input code:
 CREATE OR REPLACE PROCEDURE TEST.ABORT_STATS()
BEGIN
    CURRENT VALIDTIME AND NONSEQUENCED TRANSACTIONTIME ABORT 
     FROM table_1 
     WHERE table_1.x1 = 1;
END;
Output code:
 CREATE OR REPLACE PROCEDURE TEST.ABORT_STATS ()
RETURNS VARCHAR
LANGUAGE SQL
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
EXECUTE AS CALLER
AS
$$
    BEGIN
        --    CURRENT VALIDTIME AND NONSEQUENCED TRANSACTIONTIME
        --** SSC-FDM-TD0025 - TEMPORAL FORMS ARE NOT SUPPORTED IN SNOWFLAKE **
        LET _ROW_COUNT FLOAT;
        SELECT
            COUNT(*)
        INTO
            _ROW_COUNT
            FROM
            table_1
                 WHERE table_1.x1 = 1;
            IF (_ROW_COUNT > 0) THEN
            ROLLBACK;
            END IF;
    END;
$$;

Recommendations

SSC-FDM-TD0026

Note

For clarity, we have simplified the code by omitting some parts.

Description

When replicating SQL IF statement functionality, developers often combine GOTO commands with IF and LABEL commands. These combinations can be converted directly into if, if-else, or if-elseif-else statements. In such cases, SnowConvert removes the GOTO commands instead of converting them into LABEL sections, because they are no longer needed.

Example Code

Input Code:

 -- Additional Params: --scriptsTargetLanguage SnowScript
.If ActivityCount = 0 THEN .GOTO endIf
DROP TABLE TABLE1;
.Label endIf
SELECT A FROM TABLE1;

Generated Code:

 EXECUTE IMMEDIATE
$$
  DECLARE
    STATUS_OBJECT OBJECT := OBJECT_CONSTRUCT('SQLCODE', 0);
  BEGIN
    IF (NOT (STATUS_OBJECT['SQLROWCOUNT'] = 0)) THEN
      --** SSC-FDM-TD0026 - GOTO endIf WAS REMOVED DUE TO IF STATEMENT INVERSION **
       
      BEGIN
        DROP TABLE TABLE1;
        STATUS_OBJECT := OBJECT_CONSTRUCT('SQLROWCOUNT', SQLROWCOUNT);
      EXCEPTION
        WHEN OTHER THEN
          STATUS_OBJECT := OBJECT_CONSTRUCT('SQLCODE', SQLCODE, 'SQLERRM', SQLERRM, 'SQLSTATE', SQLSTATE);
      END;
    END IF;
    /*.Label endIf*/
    --** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE.  **
     
    BEGIN
      SELECT
        A
      FROM
        TABLE1;
      STATUS_OBJECT := OBJECT_CONSTRUCT('SQLROWCOUNT', SQLROWCOUNT);
    EXCEPTION
      WHEN OTHER THEN
        STATUS_OBJECT := OBJECT_CONSTRUCT('SQLCODE', SQLCODE, 'SQLERRM', SQLERRM, 'SQLSTATE', SQLSTATE);
    END;
  END
$$
Recommendations

SSC-FDM-TD0027

Note

This FDM is no longer supported. For more information, please see SSC-EWI-TD0061.

Description

SnowConvert (SC) can transform the TD_UNPIVOT function from Teradata. This function allows you to convert column data into rows, making it easier to analyze and manipulate your data.

This transformation requires information about the column names in the source table(s) to work correctly. If that information is not available, the transformation may be incomplete, with columns missing from the result, and a warning message is generated.

Example code

Input Code:
 CREATE TABLE unpivotTable  (
	myKey INTEGER NOT NULL PRIMARY KEY,
	firstSemesterIncome DECIMAL(10,2),
	secondSemesterIncome DECIMAL(10,2),
	firstSemesterExpenses DECIMAL(10,2),
	secondSemesterExpenses DECIMAL(10,2)
);

SELECT * FROM
 TD_UNPIVOT(
 	ON unpivotTable 
 	USING
 	VALUE_COLUMNS('Income', 'Expenses')
 	UNPIVOT_COLUMN('Semester')
 	COLUMN_LIST('firstSemesterIncome, firstSemesterExpenses', 'secondSemesterIncome, secondSemesterExpenses')
 	COLUMN_ALIAS_LIST('First', 'Second')
 )X ORDER BY mykey;

SELECT * FROM
 TD_UNPIVOT(
 	ON unknownTable
 	USING
 	VALUE_COLUMNS('MonthIncome')
 	UNPIVOT_COLUMN('Months')
 	COLUMN_LIST('januaryIncome', 'februaryIncome', 'marchIncome', 'aprilIncome')
 	COLUMN_ALIAS_LIST('January', 'February', 'March', 'April')
 )X ORDER BY yearKey;
Output Code:
 CREATE TABLE unpivotTable (
	myKey INTEGER NOT NULL PRIMARY KEY,
	firstSemesterIncome DECIMAL(10,2),
	secondSemesterIncome DECIMAL(10,2),
	firstSemesterExpenses DECIMAL(10,2),
	secondSemesterExpenses DECIMAL(10,2)
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"teradata"}}'
;

SELECT
	* FROM
	(
		SELECT
			myKey,
			TRIM(GET_IGNORE_CASE(OBJECT_CONSTRUCT('FIRSTSEMESTERINCOME', 'First', 'FIRSTSEMESTEREXPENSES', 'First', 'SECONDSEMESTERINCOME', 'Second', 'SECONDSEMESTEREXPENSES', 'Second'), Semester), '"') AS Semester,
			Income,
			Expenses
		FROM
			unpivotTable UNPIVOT(Income FOR Semester IN (
				firstSemesterIncome,
				secondSemesterIncome
			)) UNPIVOT(Expenses FOR Semester1 IN (
				firstSemesterExpenses,
				secondSemesterExpenses
			))
		WHERE
			Semester = 'FIRSTSEMESTERINCOME'
			AND Semester1 = 'FIRSTSEMESTEREXPENSES'
			OR Semester = 'SECONDSEMESTERINCOME'
			AND Semester1 = 'SECONDSEMESTEREXPENSES'
	) X ORDER BY mykey;

	SELECT
	* FROM
	--** SSC-FDM-TD0027 - TD_UNPIVOT TRANSFORMATION REQUIRES COLUMN INFORMATION THAT COULD NOT BE FOUND, COLUMNS MISSING IN RESULT **
	(
		SELECT
			TRIM(GET_IGNORE_CASE(OBJECT_CONSTRUCT('JANUARYINCOME', 'January', 'FEBRUARYINCOME', 'February', 'MARCHINCOME', 'March', 'APRILINCOME', 'April'), Months), '"') AS Months,
			MonthIncome
		FROM
			unknownTable UNPIVOT(MonthIncome FOR Months IN (
				januaryIncome,
				februaryIncome,
				marchIncome,
				aprilIncome
			))
	) X ORDER BY yearKey;

Recommendations

  • You can provide column information to the conversion tool using one of two methods:

    • Include the table specification in the same file as the TD_UNPIVOT call

    • List specific columns in the SELECT query of the ON expression instead of using SELECT * or just the table name

  • If you are unpivoting ALL columns from the input tables, you can ignore this issue. However, if you are only unpivoting some columns, the result will be missing data.

  • For additional support, please contact us at snowconvert-support@snowflake.com

SSC-FDM-TD0028

Note

This FDM is no longer supported. For more information, please see SSC-EWI-TD0060.

Description

The SnowConvert tool can transform the JSON_TABLE function, but it needs to know the specific column names that are being selected in the JSON_TABLE ON subquery to perform the transformation correctly.

This warning appears when column names are not explicitly specified in a subquery (such as when using SELECT *) and the system cannot find the table’s structure information. Without this information, the system cannot determine the specific column names being referenced.

Example code

Input Code:
 CREATE TABLE demo.Train (
    firstCol INT,
    jsonCol JSON(400),
    thirdCol VARCHAR(30)
);

SELECT * FROM JSON_TABLE 
(ON (SELECT T.*
           FROM demo.Train T)
USING rowexpr('$.schools[*]')
               colexpr('[ {"jsonpath" : "$.name",
                           "type" : "CHAR(20)"},
                          {"jsonpath" : "$.type",
                           "type" : "VARCHAR(20)"}]')
)
AS JT;

SELECT * FROM JSON_TABLE 
(ON (SELECT T.*
           FROM demo.missingTable T)
USING rowexpr('$.schools[*]')
               colexpr('[ {"jsonpath" : "$.name",
                           "type" : "CHAR(20)"},
                          {"jsonpath" : "$.type",
                           "type" : "VARCHAR(20)"}]')
)
AS JT;
Output Code:
 CREATE TABLE demo.Train (
    firstCol INT,
    jsonCol VARIANT,
    thirdCol VARCHAR(30)
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"teradata"}}'
;

SELECT
    * FROM
    (
        SELECT
            firstCol,
            rowexpr.value:name :: CHAR(20) AS Column_0,
            rowexpr.value:type :: VARCHAR(20) AS Column_1,
            thirdCol
        FROM
            demo.Train T,
            TABLE(FLATTEN(INPUT => jsonCol:schools)) rowexpr
    ) JT;

    SELECT
    * FROM
    --** SSC-FDM-TD0028 - JSON_TABLE NOT TRANSFORMED, COLUMN NAMES COULD NOT BE RETRIEVED FROM SEMANTIC INFORMATION **
    JSON_TABLE
   (ON
       !!!RESOLVE EWI!!! /*** SSC-EWI-0108 - THE FOLLOWING SUBQUERY MATCHES AT LEAST ONE OF THE PATTERNS CONSIDERED INVALID AND MAY PRODUCE COMPILATION ERRORS ***/!!! (
        SELECT
            T.*
                  FROM
            demo.missingTable T)
   USING rowexpr('$.schools[*]')
                  colexpr('[ {"jsonpath" : "$.name",
                           "type" : "CHAR(20)"},
                          {"jsonpath" : "$.type",
                           "type" : "VARCHAR(20)"}]')
   )
   AS JT;

Recommendations

  • Ensure that you include the table definition when providing code to SnowConvert; otherwise, you will need to add the definition and run the conversion again.

  • For additional assistance, contact us at snowconvert-support@snowflake.com

SSC-FDM-TD0029

Format elements that depend on session parameters

The following Teradata format elements are mapped to Snowflake functions that require specific session parameter settings. To ensure consistent results between Teradata and Snowflake, you must configure these session parameters to match your Teradata settings:

  • D: Mapped to DAYOFWEEK function. The results differ between Teradata and Snowflake due to different default settings. Teradata uses Sunday as the first day of the week, while Snowflake uses Monday. This is controlled by the WEEK_START session parameter.

  • WW: Mapped to WEEK function. The behavior is controlled by the WEEK_OF_YEAR_POLICY session parameter. Snowflake’s default setting follows the ISO standard (first week must contain at least four days of January). Teradata considers January 1st as the start of the first week.

To change session parameters, use the ALTER SESSION SET parameter_name = value command. For more details about the available session parameters, see the Snowflake documentation.
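
The parameter changes described above can be sketched as follows. These values mirror Teradata's defaults (Sunday as the first day of the week, and the week containing January 1st counted as week 1), but you should verify them against your own Teradata configuration before applying them:

```sql
-- WEEK_START accepts 0 (legacy) or 1-7, where 1 = Monday and 7 = Sunday;
-- 7 makes DAYOFWEEK treat Sunday as the first day, as Teradata does.
ALTER SESSION SET WEEK_START = 7;

-- WEEK_OF_YEAR_POLICY: 0 = ISO semantics, 1 = the week containing
-- January 1st is week 1, matching Teradata's behavior for WW.
ALTER SESSION SET WEEK_OF_YEAR_POLICY = 1;
```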

Single parameter version of TO_CHAR

The TO_CHAR(Datetime) function with a single parameter uses default date and time formats defined in your session parameters. These parameters include:

  • TIMESTAMP_LTZ_OUTPUT_FORMAT

  • TIMESTAMP_NTZ_OUTPUT_FORMAT

  • TIMESTAMP_TZ_OUTPUT_FORMAT

  • TIME_OUTPUT_FORMAT

To ensure consistent behavior between Teradata and Snowflake, make sure these parameters match your Teradata settings.
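
As a sketch, the output-format parameters above can be aligned with Teradata like this. The format strings shown are assumptions for illustration and should be replaced with the formats your Teradata session actually uses:

```sql
-- Single-parameter TO_CHAR(Datetime) picks up these session formats.
ALTER SESSION SET TIMESTAMP_NTZ_OUTPUT_FORMAT = 'YYYY-MM-DD HH24:MI:SS';
ALTER SESSION SET TIME_OUTPUT_FORMAT = 'HH24:MI:SS';
```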

When converting numeric values to strings using TO_CHAR(Numeric), Snowflake automatically uses either the TM9 or TME format to create a compact string representation. Since Teradata also creates compact number representations by default, no additional formatting is needed.

Example Code

Input Code:
 select to_char(date '2008-09-13', 'DD/RM/YYYY');

select to_char(date '2010-10-20', 'DS');

select to_char(1255.495, 'SC9999.9999', 'nls_iso_currency = ''EUR''');

select to_char(45620);
Output Code:
 SELECT
TO_CHAR(date '2008-09-13', 'DD/') || PUBLIC.ROMAN_NUMERALS_MONTH_UDF(date '2008-09-13') || TO_CHAR(date '2008-09-13', '/YYYY') /*** SSC-FDM-TD0029 - SNOWFLAKE SUPPORTED FORMATS FOR TO_CHAR DIFFER FROM TERADATA AND MAY FAIL OR HAVE DIFFERENT BEHAVIOR ***/;

SELECT
TO_CHAR(date '2010-10-20', 'MM/DD/YYYY') /*** SSC-FDM-TD0029 - SNOWFLAKE SUPPORTED FORMATS FOR TO_CHAR DIFFER FROM TERADATA AND MAY FAIL OR HAVE DIFFERENT BEHAVIOR ***/;

SELECT
PUBLIC.INSERT_CURRENCY_UDF(TO_CHAR(1255.495, 'S9999.0000'), 2, 'EUR') /*** SSC-FDM-TD0029 - SNOWFLAKE SUPPORTED FORMATS FOR TO_CHAR DIFFER FROM TERADATA AND MAY FAIL OR HAVE DIFFERENT BEHAVIOR ***/;

SELECT
TO_CHAR(45620) /*** SSC-FDM-TD0029 - SNOWFLAKE SUPPORTED FORMATS FOR TO_CHAR DIFFER FROM TERADATA AND MAY FAIL OR HAVE DIFFERENT BEHAVIOR ***/;

Best Practices

  • When working with the FF (fractional seconds) format element, use DateTime types whose precision matches Teradata's, or specify a precision in the format element to ensure consistent behavior.

  • For timezone formatting, ensure the first parameter is of type TIMESTAMP_TZ for consistent results. Note that Snowflake’s TIME data type does not support timezone information.

  • Configure session parameters to match Teradata’s default values to maintain consistent behavior.

  • For additional assistance, contact us at snowconvert-support@snowflake.com

SSC-FDM-TD0030

Description

When SC replaces a Goto statement with a Label section, it automatically adds a return statement at the end of the section if one is not present. This ensures that the program’s execution flow remains the same as in the original code.

When a BTEQ Goto command is executed, all statements between the Goto command and its corresponding label are skipped. To prevent any unintended execution after reaching the label, you should include a return statement in the label section.

Note that the Goto command skips every statement until it reaches a matching Label, and execution then continues from that Label. Because Goto only jumps forward, a Label section that appears before the Goto command can never be the target of the jump.

Example Code

Input Code:
 -- Additional Params: --scriptsTargetLanguage SnowScript
.LOGON dbc,dbc;
select 'STATEMENTS';
.GOTO LABEL_B
select 'IGNORED STATEMENTS';
.label LABEL_B
select 'LABEL_B STATEMENTS';
Output Code:
 EXECUTE IMMEDIATE
$$
  DECLARE
    STATUS_OBJECT OBJECT := OBJECT_CONSTRUCT('SQLCODE', 0);
  BEGIN
    -- Additional Params: --scriptsTargetLanguage SnowScript
    --.LOGON dbc,dbc
    !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'BTLogOn' NODE ***/!!!
    null;
    BEGIN
      SELECT
        'STATEMENTS';
      STATUS_OBJECT := OBJECT_CONSTRUCT('SQLROWCOUNT', SQLROWCOUNT);
    EXCEPTION
      WHEN OTHER THEN
        STATUS_OBJECT := OBJECT_CONSTRUCT('SQLCODE', SQLCODE, 'SQLERRM', SQLERRM, 'SQLSTATE', SQLSTATE);
    END;
     
    /*.label LABEL_B*/
     
    BEGIN
      SELECT
        'LABEL_B STATEMENTS';
      STATUS_OBJECT := OBJECT_CONSTRUCT('SQLROWCOUNT', SQLROWCOUNT);
    EXCEPTION
      WHEN OTHER THEN
        STATUS_OBJECT := OBJECT_CONSTRUCT('SQLCODE', SQLCODE, 'SQLERRM', SQLERRM, 'SQLSTATE', SQLSTATE);
    END;
    --** SSC-FDM-TD0030 - A RETURN STATEMENT WAS ADDED AT THE END OF THE LABEL SECTION LABEL_B TO ENSURE THE SAME EXECUTION FLOW **
    RETURN 0;
    BEGIN
      SELECT
        'IGNORED STATEMENTS';
      STATUS_OBJECT := OBJECT_CONSTRUCT('SQLROWCOUNT', SQLROWCOUNT);
    EXCEPTION
      WHEN OTHER THEN
        STATUS_OBJECT := OBJECT_CONSTRUCT('SQLCODE', SQLCODE, 'SQLERRM', SQLERRM, 'SQLSTATE', SQLSTATE);
    END;
    /*.label LABEL_B*/
    --** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE.  **
     
    BEGIN
      SELECT
        'LABEL_B STATEMENTS';
      STATUS_OBJECT := OBJECT_CONSTRUCT('SQLROWCOUNT', SQLROWCOUNT);
    EXCEPTION
      WHEN OTHER THEN
        STATUS_OBJECT := OBJECT_CONSTRUCT('SQLCODE', SQLCODE, 'SQLERRM', SQLERRM, 'SQLSTATE', SQLSTATE);
    END;
  END
$$

Recommendations

SSC-FDM-TD0031

Description

The Teradata function ST_SPHERICALDISTANCE uses the Haversine formula to calculate the distance between two points on Earth using spherical coordinates. Snowflake's ST_DISTANCE function instead returns the minimum geodesic distance between two geographical points, so the results differ slightly.

Example Code

Input Code:
 --The distance between New York and Los Angeles
Select Cast('POINT(-73.989308 40.741895)' As ST_GEOMETRY) As location1,
	Cast('POINT(40.741895 34.053691)' As ST_GEOMETRY) As location2,
	location1.ST_SPHERICALDISTANCE(location2) As Distance_In_km;
Teradata Results

location1                    | location2                   | Distance_In_Km
-----------------------------|-----------------------------|------------------
POINT (-73.989308 40.741895) | POINT (40.741895 34.053691) | 9351139.978062356

Output Code:
 --The distance between New York and Los Angeles
SELECT
	TO_GEOGRAPHY('POINT(-73.989308 40.741895)') As location1,
	TO_GEOGRAPHY('POINT(40.741895 34.053691)') As location2,
	--** SSC-FDM-TD0031 - ST_DISTANCE RESULTS ARE SLIGHTLY DIFFERENT FROM ST_SPHERICALDISTANCE **
	ST_DISTANCE(
	location1, location2) As Distance_In_km;
Snowflake Results

LOCATION1                                                     | LOCATION2                                                    | DISTANCE_IN_KM
--------------------------------------------------------------|--------------------------------------------------------------|-----------------
{ "coordinates": [ -73.989308, 40.741895 ], "type": "Point" } | { "coordinates": [ 40.741895, 34.053691 ], "type": "Point" } | 9351154.65572674

Recommendations

SSC-FDM-TD0032

Note

For clarity, we have simplified some sections of the output code.

Description

This warning appears when a LIKE expression includes the [NOT] CASESPECIFIC clause. Snowflake does not support this clause, so SnowConvert removes it.

Example Code

Input Code:
 SELECT * FROM MY_TABLE
WHERE Name Like 'Marco%' (NOT CASESPECIFIC);
Output Code:
 SELECT
* FROM
MY_TABLE
WHERE Name LIKE 'Marco%' /*** SSC-FDM-TD0032 - NOT CASESPECIFIC CLAUSE WAS REMOVED ***/;

Recommendations

  • Teradata’s case-sensitivity behavior is determined by the TMODE system configuration setting.

  • For additional assistance, please contact us at snowconvert-support@snowflake.com

SSC-FDM-TD0033

Description

The ACTIVITY_COUNT status variable returns the number of rows affected by a DML statement (such as INSERT, UPDATE, or DELETE) in embedded SQL or stored procedures. For more details, see the Teradata documentation for ACTIVITY_COUNT.

To replicate the behavior of ACTIVITY_COUNT, you can use the workaround described in the translation specification.

 SELECT $1 FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));

However, there are several limitations to consider:

Limitations

First case

If you call ACTIVITY_COUNT multiple times before executing another DML statement, you may receive incorrect results. Make sure to execute a DML statement between each ACTIVITY_COUNT call to get accurate values.

 REPLACE PROCEDURE InsertEmployeeSalaryAndLog_1 ()
BEGIN
    DECLARE row_count1 INT;

    INSERT INTO employees (employee_id, first_name, last_name, department_id, salary)
    VALUES (101, 'Alice', 'Smith', 10, 70000.00);

    -- Get the ACTIVITY_COUNT
    SET row_count1 = ACTIVITY_COUNT;
    SET row_count1 = ACTIVITY_COUNT;

    -- Insert the ACTIVITY_COUNT into the activity_log table
    INSERT INTO activity_log (operation, row_count)
    VALUES ('INSERT PROCEDURE', row_count1);
END;

REPLACE PROCEDURE InsertEmployeeSalaryAndLog_2 ()
BEGIN
    DECLARE row_count1 INT;
    DECLARE message VARCHAR(100);

    INSERT INTO employees (employee_id, first_name, last_name, department_id, salary)
    VALUES (101, 'Alice', 'Smith', 10, 70000.00);

    -- Get the ACTIVITY_COUNT
    SET row_count1 = ACTIVITY_COUNT + 10;
    SET row_count1 = ACTIVITY_COUNT;

    -- Insert the ACTIVITY_COUNT into the activity_log table
    INSERT INTO activity_log (operation, row_count)
    VALUES ('INSERT PROCEDURE', row_count1);
END;
 CREATE OR REPLACE PROCEDURE InsertEmployeeSalaryAndLog_1 ()
RETURNS VARCHAR
LANGUAGE SQL
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": {  "major": 0,  "minor": 0,  "patch": "0" }, "attributes": {  "component": "teradata",  "convertedOn": "07/15/2024" }}'
EXECUTE AS CALLER
AS
$$
    DECLARE
        row_count1 INT;
    BEGIN
         
        INSERT INTO employees (employee_id, first_name, last_name, department_id, salary)
        VALUES (101, 'Alice', 'Smith', 10, 70000.00);

           -- Get the ACTIVITY_COUNT
        row_count1 := (
            SELECT
                $1
            FROM
                TABLE(RESULT_SCAN(LAST_QUERY_ID()))
        ) /*** SSC-FDM-TD0033 - 'ACTIVITY_COUNT' TRANSFORMATION MIGHT REQUIRE MANUAL ADJUSTMENTS ***/;
        row_count1 := (
            SELECT
                $1
            FROM
                TABLE(RESULT_SCAN(LAST_QUERY_ID()))
        ) /*** SSC-FDM-TD0033 - 'ACTIVITY_COUNT' TRANSFORMATION MIGHT REQUIRE MANUAL ADJUSTMENTS ***/;

        -- Insert the ACTIVITY_COUNT into the activity_log table
        INSERT INTO activity_log (operation, row_count)
        VALUES ('INSERT PROCEDURE', :row_count1);
    END;
$$;

CREATE OR REPLACE PROCEDURE InsertEmployeeSalaryAndLog_2 ()
RETURNS VARCHAR
LANGUAGE SQL
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": {  "major": 0,  "minor": 0,  "patch": "0" }, "attributes": {  "component": "teradata",  "convertedOn": "07/15/2024" }}'
EXECUTE AS CALLER
AS
$$
    DECLARE
        row_count1 INT;
        message VARCHAR(100);
    BEGIN
         
         
        INSERT INTO employees (employee_id, first_name, last_name, department_id, salary)
        VALUES (101, 'Alice', 'Smith', 10, 70000.00);

           -- Get the ACTIVITY_COUNT
        row_count1 := (
            SELECT
                $1
            FROM
                TABLE(RESULT_SCAN(LAST_QUERY_ID()))
        ) /*** SSC-FDM-TD0033 - 'ACTIVITY_COUNT' TRANSFORMATION MIGHT REQUIRE MANUAL ADJUSTMENTS ***/ + 10;
        row_count1 := (
            SELECT
                $1
            FROM
                TABLE(RESULT_SCAN(LAST_QUERY_ID()))
        ) /*** SSC-FDM-TD0033 - 'ACTIVITY_COUNT' TRANSFORMATION MIGHT REQUIRE MANUAL ADJUSTMENTS ***/;

        -- Insert the ACTIVITY_COUNT into the activity_log table
        INSERT INTO activity_log (operation, row_count)
        VALUES ('INSERT PROCEDURE', :row_count1);
    END;
$$;

When using ACTIVITY_COUNT in Teradata, you can call it multiple times after a DML statement (like INSERT), and it will consistently return the number of affected rows from that statement. However, in Snowflake, the transformed code uses LAST_QUERY_ID(), which behaves differently. The result from LAST_QUERY_ID() depends on the most recent query executed, so multiple calls may not return the same value if other queries are executed in between.

The stored procedure InsertEmployeeSalaryAndLog_1() works without requiring any modifications. You can verify this by reviewing the Query History in chronological order from bottom to top.

Query History showing the execution of InsertEmployeeSalaryAndLog_1()

  1. The INSERT statement runs first, and LAST_QUERY_ID() references this operation.

  2. The first SELECT statement with ACTIVITY_COUNT executes, setting $1 to 1. LAST_QUERY_ID() now points to this SELECT statement.

  3. The second SELECT statement with ACTIVITY_COUNT runs. Since the previous statement returned 1, $1 remains 1 for this SELECT.

  4. The value 1 is stored in row_count1 and is then inserted into the activity_log table.

Looking at the InsertEmployeeSalaryAndLog_2() procedure, manual modifications are needed. Let’s examine the Query History in chronological order, starting from the oldest entries.

Query History showing the execution results of the InsertEmployeeSalaryAndLog_2() procedure

  1. When the INSERT statement runs, LAST_QUERY_ID() references this specific statement.

  2. During the first SELECT with ACTIVITY_COUNT, $1 equals 1. However, the QUERY_TEXT contains + 10, which will modify the final result. At this point, LAST_QUERY_ID() points to this SELECT statement.

  3. When the second SELECT with ACTIVITY_COUNT executes, it returns 11 (instead of 1) because of the previous query. This value is stored in $1.

  4. The variable row_count1 receives the value 11, which is then stored in the activity_log table.

The following values are stored in the activity_log table:

LOG_ID | OPERATION        | ROW_COUNT | LOG_TIMESTAMP
-------|------------------|-----------|------------------------
1      | INSERT PROCEDURE | 1         | 2024-07-15 09:22:21.725
101    | INSERT PROCEDURE | 11        | 2024-07-15 09:22:26.248

Adjustments for the first case

According to Snowflake’s documentation for LAST_QUERY_ID, you can retrieve specific queries using a position number. For example:

  • LAST_QUERY_ID(-1) retrieves the most recent query

  • LAST_QUERY_ID(-2) retrieves the second most recent query, and so on for older queries.

The solution to fix the issue in InsertEmployeeSalaryAndLog_2() is to use LAST_QUERY_ID(-2) in the second SELECT statement when retrieving the ACTIVITY_COUNT. This ensures that we get the row count from the previous INSERT statement.

 ...
INSERT INTO employees (employee_id, first_name, last_name, department_id, salary)
        VALUES (101, 'Alice', 'Smith', 10, 70000.00);

           -- Get the ACTIVITY_COUNT
        row_count1 := (
            SELECT
                $1
            FROM
                TABLE(RESULT_SCAN(LAST_QUERY_ID()))
        ) /*** SSC-FDM-TD0033 - 'ACTIVITY_COUNT' TRANSFORMATION MIGHT REQUIRE MANUAL ADJUSTMENTS ***/ + 10;
        row_count1 := (
            SELECT
                $1
            FROM
                TABLE(RESULT_SCAN(LAST_QUERY_ID(-2)))
        ) /*** SSC-FDM-TD0033 - 'ACTIVITY_COUNT' TRANSFORMATION MIGHT REQUIRE MANUAL ADJUSTMENTS ***/;
...

Second case

If ACTIVITY_COUNT is used after executing a non-DML statement (such as a SELECT), it will not return the correct count of affected rows.

REPLACE PROCEDURE InsertEmployeeSalaryAndLog_3 ()
BEGIN
    DECLARE row_count1 INT;
    DECLARE emp_id INT;
    DECLARE message VARCHAR(100);

    INSERT INTO employees (employee_id, first_name, last_name, department_id, salary)
    VALUES (101, 'Alice', 'Smith', 10, 70000.00);

    SELECT employee_id INTO emp_id FROM employees;
    -- Get the ACTIVITY_COUNT
    SET row_count1 = ACTIVITY_COUNT;
    SET message = 'EMPLOYEE INSERTED - ID: ' || emp_id;

    -- Insert the ACTIVITY_COUNT into the activity_log table
    INSERT INTO activity_log (operation, row_count)
    VALUES (message, row_count1);
END;
 CREATE OR REPLACE PROCEDURE InsertEmployeeSalaryAndLog_3 ()
RETURNS VARCHAR
LANGUAGE SQL
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": {  "major": 0,  "minor": 0,  "patch": "0" }, "attributes": {  "component": "teradata",  "convertedOn": "07/15/2024" }}'
EXECUTE AS CALLER
AS
$$
    DECLARE
        row_count1 INT;
        emp_id INT;
        message VARCHAR(100);
    BEGIN
         
         
         
        INSERT INTO employees (employee_id, first_name, last_name, department_id, salary)
        VALUES (101, 'Alice', 'Smith', 10, 70000.00);
        SELECT
            employee_id INTO
            :emp_id
        FROM
            employees;
               -- Get the ACTIVITY_COUNT
               row_count1 := (
            SELECT
                $1
            FROM
                TABLE(RESULT_SCAN(LAST_QUERY_ID()))
               ) /*** SSC-FDM-TD0033 - 'ACTIVITY_COUNT' TRANSFORMATION MIGHT REQUIRE MANUAL ADJUSTMENTS ***/;
               message := 'EMPLOYEE INSERTED - ID: ' || emp_id;

               -- Insert the ACTIVITY_COUNT into the activity_log table
               INSERT INTO activity_log (operation, row_count)
               VALUES (:message, :row_count1);
    END;
$$;

The LAST_QUERY_ID function returns an incorrect value for row_count1 because it references the wrong query. You can verify this by reviewing the Query History in reverse chronological order.

Query History showing the results of executing InsertEmployeeSalaryAndLog_3()

  1. First, an INSERT statement runs. The LAST_QUERY_ID() function references this INSERT operation.

  2. Next, a SELECT INTO statement executes, setting $1 to 101. The LAST_QUERY_ID() function now points to this SELECT INTO operation.

  3. Then, a SELECT statement runs to get the ACTIVITY_COUNT. Since the last query returned 101, $1 contains 101 instead of the expected value of 1.

  4. As a result, row_count1 stores 101, which is then recorded in the activity_log table.

The following values are recorded in the activity_log:

LOG_ID | OPERATION                   | ROW_COUNT | LOG_TIMESTAMP
-------|-----------------------------|-----------|------------------------
1      | EMPLOYEE INSERTED - ID: 101 | 101       | 2024-07-15 11:00:38.000

Adjustments for the second case

  1. To resolve this issue, use LAST_QUERY_ID with the correct query reference number. For example, LAST_QUERY_ID(-2) will retrieve the specific query you need.

 ...
row_count1 := (
            SELECT
                $1
            FROM
                TABLE(RESULT_SCAN(LAST_QUERY_ID(-2)))
               ) /*** SSC-FDM-TD0033 - 'ACTIVITY_COUNT' TRANSFORMATION MIGHT REQUIRE MANUAL ADJUSTMENTS ***/;
               ...
  2. You can also verify the success of the INSERT statement by checking the ACTIVITY_COUNT using a SELECT statement right after the insertion.

...
INSERT INTO employees (employee_id, first_name, last_name, department_id, salary)
VALUES (101, 'Alice', 'Smith', 10, 70000.00);
-- Get the ACTIVITY_COUNT
       row_count1 := (
    SELECT
        $1
    FROM
        TABLE(RESULT_SCAN(LAST_QUERY_ID()))
       ) /*** SSC-FDM-TD0033 - 'ACTIVITY_COUNT' TRANSFORMATION MIGHT REQUIRE MANUAL ADJUSTMENTS ***/;
SELECT
    employee_id INTO
    :emp_id
FROM
    employees;
       message := 'EMPLOYEE INSERTED - ID: ' || emp_id;
...

Recommendations

  • When using LAST_QUERY_ID, verify that you are referencing the intended query.

  • Execute ACTIVITY_COUNT immediately after your DML statement to ensure accurate evaluation.

  • For additional assistance, contact us at snowconvert-support@snowflake.com