SnowConvert: Teradata Scripts To Python Translation Reference

Browse through the following pages to find more information about specific topics.

  • BTEQ: the translation reference for Basic Teradata Query syntax.

  • FastLoad: the translation reference for FastLoad syntax.

  • MultiLoad: the translation reference for MultiLoad syntax.

  • TPT: the translation reference for TPT syntax.

TPT

TPT statements transformation

Like the other Teradata scripting languages, TPT statements are converted to Python code. Here are some examples of the transformations already supported.

Define Job header transformation

The job statement is translated to a Python class that contains all of its statements, such as operators, schema definitions, and steps.

Source code

 /* Some comments on the job  */
DEFINE JOB LOADJOB
DESCRIPTION 'LOAD AC_SCHEMA TABLE FROM A FILE'
JobBody

Translated code

 ## Some comments on the job
class LOADJOB:
    # DESCRIPTION 'LOAD AC_SCHEMA TABLE FROM A FILE'
    JobBody

Define Schema transformation

The schema statement is translated to an attribute in the class created for the job statement.

Source code

 DEFINE SCHEMA DCS_SCHEMA
DESCRIPTION 'DCS DATA'
(
PNRHEADER_ID   PERIOD(DATE),
PNRLOCPERIOD   PERIOD(TIMESTAMP(0)),
CRTDATE        CLOB,
REQTYP         JSON(100000),
seqno          INTEGER,
resdata        INTEGER
);

Translated code

 class JOBNAME:
    DCS_SCHEMA = """(
    PNRHEADER_ID VARCHAR(24),
    PNRLOCPERIOD VARCHAR(58),
    CRTDATE VARCHAR /*** MSC-WARNING - SSC-FDM-TD0002 - COLUMN CONVERTED FROM CLOB DATA TYPE ***/,
    REQTYP VARIANT,
    seqno INTEGER,
    resdata INTEGER,
    );"""

Define Operator transformation

The operators are translated to Python functions inside the class generated for the job. The examples provided below are the operators that SnowConvert currently supports.

DDL Operator

Source code for DDL operator

 DEFINE OPERATOR DDL_OPERATOR()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DDL OPERATOR'
TYPE DDL
ATTRIBUTES
(
  VARCHAR PrivateLogName ,
  VARCHAR TdpId          = @MyTdpId,
  VARCHAR UserName       = @MyUserName,
  VARCHAR UserPassword   = 'SomePassWord',
  VARCHAR AccountID,
  VARCHAR ErrorList      = ['3807','2580']
);

Translated code

 class JobName:
    def DDL_OPERATOR(self):
        #'TERADATA PARALLEL TRANSPORTER DDL OPERATOR'
        global args
        self.con = log_on(user = args.MyUserName, password = 'SomePassWord')
UPDATE Operator

Source code for UPDATE operator

 DEFINE OPERATOR LOAD_OPERATOR()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER LOAD OPERATOR'
TYPE UPDATE
SCHEMA AC_MASTER_SCHEMA
ATTRIBUTES
(
    VARCHAR PrivateLogName ,
    INTEGER MaxSessions       =  32,
    INTEGER MinSessions       =  1,
    VARCHAR TargetTable       = '&TARGET_TABLE',
    VARCHAR TdpId             = @MyTdpId,
    VARCHAR UserName          = @MyUserName,
    VARCHAR UserPassword      = @MyPassword,
    VARCHAR AccountId,
    VARCHAR ErrorTable1       = '&LOG_DB_NAME.ERR1',
    VARCHAR ErrorTable2       = '&LOG_DB_NAME.ERR2',
    VARCHAR LogTable          = '&LOG_DB_NAME.LOG_TABLE'
);

Translated code

 class JobName:
    def LOAD_OPERATOR(self, query):
        #'TERADATA PARALLEL TRANSPORTER LOAD OPERATOR'
        #USES SCHEMA AC_MASTER_SCHEMA
        operator_name = "LOAD_OPERATOR"
        return query
DATA CONNECTOR PRODUCER Operator

Source code for Data Connector Producer operator

 DEFINE OPERATOR FILE_READER()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
TYPE DATACONNECTOR PRODUCER
SCHEMA AC_MASTER_SCHEMA
ATTRIBUTES
(
  VARCHAR PrivateLogName ,
  VARCHAR DirectoryPath   = '&INPUTFILEPATH' ,
  VARCHAR FileName        = '&INPUTTEXTFILE' ,
  VARCHAR Format          = 'delimited',
  VARCHAR OpenMode        = 'Read',
  VARCHAR TextDelimiter     = '~',
  VARCHAR IndicatorMode   = 'N'
);

Translated code

 class JobName:
    def FILE_READER(self):
        #'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
        #USES SCHEMA AC_MASTER_SCHEMA
        operator_name = "FILE_READER"
        stage_name = f"{self.jobname}_{operator_name}"
        format_name = f"{self.jobname}_{operator_name}_FILEFORMAT"
        exec(f"""CREATE OR REPLACE FILE FORMAT {format_name} TYPE = 'CSV' FIELD_DELIMITER = '~' TRIM_SPACE = TRUE SKIP_HEADER = 0""")
        exec(f"""CREATE STAGE IF NOT EXISTS {self.jobname}_STAGE""")
        exec(f"""PUT file://{INPUTFILEPATH}/{INPUTTEXTFILE} @{stage_name} OVERWRITE = TRUE AUTO_COMPRESS = FALSE;""")
        temp_table_name = f"{self.jobname}_{operator_name}_TEMP"
        exec(f"""DROP TABLE IF EXISTS {temp_table_name}""")
        exec(f"""CREATE TEMPORARY TABLE {temp_table_name} {self.AC_MASTER_SCHEMA}""")
        exec(f"""COPY INTO {temp_table_name} FROM @{stage_name} FILE_FORMAT = (format_name = '{format_name}')""")
        return temp_table_name

Define step transformation

Steps are also translated to Python functions inside the class generated for the job; they are called in the main function of the translated code.

Step source code

 STEP setup_tables
(
  APPLY
  ('DELETE FROM  &STAGE_DB_NAME.EMS_AC_MASTER_STG;')
   TO OPERATOR (DDL_OPERATOR() );
);

STEP stLOAD_FILE_NAME
(
  APPLY
  ('INSERT INTO CRASHDUMPS.EMP_NAME
  (EMP_NAME, EMP_YEARS, EMP_TEAM)
  VALUES
  (:EMP_NAME, :EMP_YEARS, :EMP_TEAM);')
  TO OPERATOR (ol_EMP_NAME() [1])
  SELECT * FROM OPERATOR(op_EMP_NAME);
);

Translated code

 def setup_tables(self):
    self.DDL_OPERATOR()
    exec(f"""DELETE FROM DATABASE1.{STAGE_DB_NAME}.EMS_AC_MASTER_STG""")

def stLOAD_FILE_NAME(self):
    exec(f"""INSERT INTO DATABASE1.CRASHDUMPS.EMP_NAME (EMP_NAME, EMP_YEARS, EMP_TEAM)
SELECT EMP_NAME, EMP_YEARS, EMP_TEAM
FROM (
{self.ol_EMP_NAME('SELECT * FROM ' + self.op_EMP_NAME() )})""")

Main function

The main function is always generated for any scripting language. For TPT, the main function contains an instance of the job class and calls to the steps in the job.

Main function sample code

 def main():
  _LOADJOB = LOADJOB()
  _LOADJOB.setup_tables()
  _LOADJOB.stLOAD_FILE_NAME()
  snowconvert.helpers.quit_application()

MLOAD

Teradata MultiLoad is a command-driven utility for fast, high-volume maintenance on multiple tables and views in Teradata Database.

In order to simulate the MultiLoad functionality for Teradata in Snowflake, MultiLoad files and commands are transformed to Python code, similar to the transformations performed for BTEQ and FastLoad scripts. The generated code uses the Snowflake Python project called snowconvert.helpers which contains the required functions to simulate the MultiLoad statements in Snowflake.

MultiLoad Commands Translation

Most of the MultiLoad commands are considered not relevant in Snowflake, so these commands are commented out. Below is the summary list of MultiLoad commands and their transformation status in Snowflake:

| Command | Transformation Status | Note |
|---|---|---|
| ACCEPT | Commented | |
| BEGIN MLOAD | Transformed | The node is commented out since the transformation occurs in other related statements instead. |
| BEGIN DELETE MLOAD | Commented | |
| DATEFORM | Commented | |
| DELETE | Partially transformed | Check known issues. |
| DISPLAY | Commented | |
| DML LABEL | Transformed | |
| END MLOAD | Transformed | Commented out since it is not necessary for the transformation of the BEGIN MLOAD. |
| EOC | Commented | |
| FIELD | Transformed | |
| FILLER | Transformed | This command needs to be with a FIELD and LAYOUT command to be converted. |
| IF, ELSE, and ENDIF | Commented | |
| IMPORT | Transformed | |
| INSERT | Transformed | This is taken as a Teradata statement, so it doesn't appear in this chapter. |
| LAYOUT | Transformed | This command needs to be with a FIELD and FILLER command to be converted. |
| LOGDATA | Commented | |
| LOGMECH | Commented | |
| LOGOFF | Commented | |
| LOGON | Commented | |
| LOGTABLE | Commented | |
| PAUSE ACQUISITION | Commented | |
| RELEASE MLOAD | Commented | |
| ROUTE MESSAGES | Commented | |
| RUN FILE | Commented | |
| SET | Commented | |
| SYSTEM | Commented | |
| TABLE | Commented | |
| UPDATE | Transformed | This is taken as a Teradata statement, so it doesn't appear in this chapter. |
| VERSION | Commented | |

However, some exceptional commands must be converted into Python-specific code to work as intended in Snowflake. See the sections below.

If you have any additional questions regarding this documentation, you can email us at snowconvert-support@snowflake.com.

BEGIN MLOAD

The transformation for the command .BEGIN MLOAD is a multi-part transformation that requires the .LAYOUT, .FIELD, .FILLER, .DML LABEL, and .IMPORT commands to simulate its behavior correctly.

This transformation is fully explained in the following subsections.

.LAYOUT, .FIELD and .FILLER

The transformation for the commands .LAYOUT, .FIELD, and .FILLER will create variable definitions to be used in a future function call of the IMPORT of this layout.

Teradata (MultiLoad)

 .LAYOUT INFILE_LAYOUT;
.FIELD TABLE_ID        * INTEGER;
.FIELD TABLE_DESCR     * CHAR(8);
.FILLER COL1           * CHAR(1);
.FIELD TABLE_NBR       * SMALLINT;
.FIELD TABLE_SOMEFIELD * SMALLINT;

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  INFILE_LAYOUT_TableName = "INFILE_LAYOUT_TEMP_TABLE"
  INFILE_LAYOUT_Columns = """TABLE_ID INTEGER, 
TABLE_DESCR CHAR(8), 
COL1 CHAR(1), 
TABLE_NBR SMALLINT, 
TABLE_SOMEFIELD SMALLINT"""
  INFILE_LAYOUT_Conditions = """TABLE_ID AS TABLE_ID, TABLE_DESCR AS TABLE_DESCR, COL1 AS COL1, TABLE_NBR AS TABLE_NBR, TABLE_SOMEFIELD AS TABLE_SOMEFIELD"""
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

.DML LABEL

The transformation for the .DML LABEL command will create a function containing the statements after the label definition. Note that the .DML LABEL command is usually followed by an INSERT, UPDATE, or DELETE statement.

Teradata (MultiLoad)

 -- Example of .DML LABEL with INSERT:
.DML LABEL INSERT_TABLE;                                                                                    
INSERT INTO mydb.mytable( TABLE_ID,TABLE_DESCR,TABLE_NBR ) VALUES( :TABLE_ID,:TABLE_DESCR,:TABLE_NBR );

-- Example of .DML LABEL with DELETE:
.DML LABEL DELETE_TABLE;              
DELETE FROM Employee WHERE EmpNo  = :EmpNo;

-- Example of .DML LABEL with an UPDATE, followed by an INSERT:
.DML LABEL UPSERT_TABLE DO INSERT FOR MISSING UPDATE ROWS;
UPDATE   mydb.mytable SET TABLE_ID = :TABLE_ID WHERE TABLE_DESCR = :somedescription
INSERT INTO mydb.mytable(TABLE_ID, TABLE_DESCR, TABLE_NBR) VALUES(:TABLE_ID, :TABLE_DESCR, :TABLE_NBR );

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  def INSERT_TABLE(tempTableName, queryConditions = ""):
    exec(f"""INSERT INTO mydb.mytable (TABLE_ID, TABLE_DESCR, TABLE_NBR)
SELECT
   :TABLE_ID,
   :TABLE_DESCR,
   :TABLE_NBR
FROM {tempTableName} SRC {queryConditions}""")
  exec("""
    DELETE FROM
      Employee
    WHERE
      EmpNo = :EmpNo
    """)
  def UPSERT_TABLE(tempTableName, queryConditions = ""):
    exec(f"""MERGE INTO mydb.mytable TGT USING (SELECT * FROM {tempTableName} {queryConditions}) SRC ON TABLE_DESCR = :somedescription
WHEN MATCHED THEN UPDATE SET
   TABLE_ID = :TABLE_ID
WHEN NOT MATCHED THEN INSERT (TABLE_ID, TABLE_DESCR, TABLE_NBR)
VALUES (:TABLE_ID, :TABLE_DESCR, :TABLE_NBR)""")
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

.IMPORT

The transformation of the .IMPORT command will create a call to the import_file_to_temptable helper to load the data from the file into a temporary table. Then, the calls to all the APPLY labels used in the original import will be created. Finally, the calls for an INSERT label will be transformed to a query parameter and can optionally have a query condition.

Teradata (MultiLoad)

 .IMPORT INFILE INFILE_FILENAME
    LAYOUT INFILE_LAYOUT
    APPLY INSERT_TABLE
    APPLY UPSERT_TABLE
    Apply DELETE_TABLE;

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW **
  #.IMPORT INFILE INFILE_FILENAME LAYOUT INFILE_LAYOUT APPLY INSERT_TABLE APPLY UPSERT_TABLE Apply DELETE_TABLE
   
  snowconvert.helpers.import_file_to_temptable(fr"INFILE_FILENAME", INFILE_LAYOUT_TableName, INFILE_LAYOUT_Columns, INFILE_LAYOUT_Conditions, ',')
  INSERT_TABLE(INFILE_LAYOUT_TableName)
  UPSERT_TABLE(INFILE_LAYOUT_TableName)
  DELETE_TABLE(INFILE_LAYOUT_TableName)
  exec(f"""DROP TABLE {INFILE_LAYOUT_TableName}""")
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

Large Example

Given the transformations shown above for a variety of commands, consider the following example.

With this input data:

 id,name,age
1,John,25
2,Maria,29
3,Carlos,31
4,Mike,40
5,Laura,27

Teradata (MultiLoad)

 .begin import mload
        tables
	mySampleTable1
sessions 20
ampcheck none;

.layout myLayOut;
 .field ID * VARCHAR(2) NULLIF ID = '1';
 .field NAME * VARCHAR(25);
 .field AGE * VARCHAR(10);
.dml label insert_data;

INSERT INTO mySampleTable1
 ( 
    ID,
    NAME,
    AGE	
 )
VALUES
 (
    :ID,
    SUBSTRING(:NAME FROM 2),
    :AGE
 );

.import infile sampleData.txt
layout myLayOut
apply insert_data

.end mload;
.logoff;

| ROW | ID | NAME | AGE |
|---|---|---|---|
| 1 | NULL | ohn | 25 |
| 2 | 2 | aria | 29 |
| 3 | 3 | arlos | 31 |
| 4 | 4 | ike | 40 |
| 5 | 5 | aura | 27 |

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE.  **
  #.begin import mload tables mySampleTable1 sessions 20 ampcheck none
   
  myLayOut_TableName = "myLayOut_TEMP_TABLE"
  myLayOut_Columns = """ID VARCHAR(2), 
NAME VARCHAR(25), 
AGE VARCHAR(10)"""
  myLayOut_Conditions = """CASE
   WHEN ID = '1'
      THEN NULL
   ELSE ID
END AS ID, NAME AS NAME, AGE AS AGE"""
  def insert_data(tempTableName, queryConditions = ""):
    exec(f"""INSERT INTO mySampleTable1 (ID, NAME, AGE)
SELECT
   SRC.ID,
   SUBSTRING(SRC.NAME, 2),
   SRC.AGE
FROM {tempTableName} SRC {queryConditions}""")
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW **
  #.import infile sampleData.txt layout myLayOut apply insert_data
   
  snowconvert.helpers.import_file_to_temptable(fr"sampleData.txt", myLayOut_TableName, myLayOut_Columns, myLayOut_Conditions, ',')
  insert_data(myLayOut_TableName)
  exec(f"""DROP TABLE {myLayOut_TableName}""")

  if con is not None:
    con.close()
    con = None
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

| ROW | ID | NAME | AGE |
|---|---|---|---|
| 1 | NULL | ohn | 25 |
| 2 | 2 | aria | 29 |
| 3 | 3 | arlos | 31 |
| 4 | 4 | ike | 40 |
| 5 | 5 | aura | 27 |

Known Issues

1. Delete statement is partially supported

The DELETE statement is partially supported because its WHERE conditions, when found, are not converted correctly if they point to a LAYOUT-defined column.

In the example below, :EmpNo points to a LAYOUT-defined column. However, the transformation does not take this into account, so the code will reference a column that does not exist.

  exec("""
    DELETE FROM
      Employee
    WHERE
      EmpNo = :EmpNo
    """)


If you have any additional questions regarding this documentation, you can email us at snowconvert-support@snowflake.com.

SnowConvert Scripts Helpers

SnowConvert for Teradata can take in any Teradata SQL or scripts (BTEQ, FastLoad, MultiLoad, and TPump) and convert them to functionally equivalent Snowflake SQL, JavaScript embedded in Snowflake SQL, and Python. Any output Python code from SnowConvert will call functions from these helper classes to complete the conversion and create a functionally equivalent output in Snowflake.

The Snowflake Connector for Python will also be called in order to connect to your Snowflake account and run the output Python code created by SnowConvert.

The latest version information of the package can be found here.

Note

The Python package snowconvert-helpers supports Python versions 3.6, 3.7, 3.8, and 3.9.

Script Migration

Source

Suppose you have the following BTEQ code to be migrated.

 insert into table1 values(1, 2);
insert into table1 values(3, 4);
insert into table1 values(5, 6);

Output

You should get an output like the one below.

The log_on function parameters ('user', 'password', 'account', 'database', 'warehouse', 'role', 'token') should be defined by the user.
 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  exec("""
    INSERT INTO table1
    VALUES (1, 2)
    """)
  exec("""
    INSERT INTO table1
    VALUES (3, 4)
    """)
  exec("""
    INSERT INTO table1
    VALUES (5, 6)
    """)
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

Getting Started

To install the package, run the following command in your Python environment. If you're not familiar with installing packages in Python, visit the following page on Python packages (https://packaging.python.org/tutorials/installing-packages/).

 pip install snowconvert-helpers

Once the package is installed, you will be able to run the migrated script code in Python.

Run the code

To run the migrated code, you just have to open the Command Prompt or the Terminal and execute the following command.

 python sample_BTEQ.py

If the script has no errors, you will see console output like the one below.

 Executing: INSERT INTO PUBLIC.table1 VALUES (1, 2).
Printing Result Set:
number of rows inserted
1

Executing: INSERT INTO PUBLIC.table1 VALUES (3, 4).
Printing Result Set:
number of rows inserted
1

Executing: INSERT INTO PUBLIC.table1 VALUES (5, 6).
Printing Result Set:
number of rows inserted
1

Error Code 0
Script done >>>>>>>>>>>>>>>>>>>>
Error Code 0

Passing connection parameters

There are several ways to pass the connection parameters for the database connection:

  • As parameters in the function call snowconvert.helpers.log_on inside the Python file.

  • As positional parameters with the specific order of user, password, account, database, warehouse, and role when the Python script is executed from the command line.

  • As named parameters with no order restriction of SNOW_USER, SNOW_PASSWORD, SNOW_ACCOUNT, SNOW_DATABASE, SNOW_WAREHOUSE, SNOW_ROLE, SNOW_QUERYTAG, SNOW_AUTHENTICATOR, and SNOW_TOKEN when the Python script is executed from the command line and any of them is passed like --param-VARNAME=VALUE.

  • As environment variables named SNOW_USER, SNOW_PASSWORD, SNOW_ACCOUNT, SNOW_DATABASE, SNOW_WAREHOUSE, SNOW_ROLE, SNOW_QUERYTAG, SNOW_AUTHENTICATOR, and SNOW_TOKEN defined before the Python execution.

The order specified above determines the precedence.

Parameters in the function call

They can be set as positional parameters in the function call as follows.

    .....
   con = snowconvert.helpers.log_on(
     'myuser',
     'mypassword',
     'myaccount',
     'mydatabase',
     'mywarehouse',
     'myrole',
     5,
     'myauthenticator',
     'mytoken')
   .....

Or any of the named parameters can be set in any order in the function call, as follows.

    .....
   con = snowconvert.helpers.log_on(
     account='myaccount',
     password='mypassword',
     user='myuser',
     warehouse='mywarehouse',
     login_timeout=5,
     authenticator='myauthenticator',
     token='mytoken')
   .....
Positional parameters

They need to be set in the specific order in the command line as follows.

 python sample_BTEQ.py myuser mypassword myaccount mydatabase mywarehouse myrole myauthenticator mytoken

Or only some of the parameters can be set, always starting with the user parameter, as follows.

 python sample_BTEQ.py myuser mypassword myaccount
Named parameters

Any of the named parameters can be set in any order on the command line as follows (use a single line; multiple lines are shown here for readability).

python sample_BTEQ.py --param-SNOW_WAREHOUSE=mywarehouse 
  --param-SNOW_ROLE=myrole 
  --param-SNOW_PASSWORD=mypassword 
  --param-SNOW_USER=myuser 
  --param-SNOW_QUERYTAG=mytag 
  --param-SNOW_ACCOUNT=myaccount
  --param-SNOW_DATABASE=mydatabase 
  --param-SNOW_AUTHENTICATOR=myauthenticator
  --param-SNOW_TOKEN=mytoken
  --param-PRIVATE_KEY_PATH=myprivatekey
  --param-PRIVATE_KEY_PASSWORD=myprivatekeypassword

Environment variables

Before calling the Python script, any of the following environment variables can be set:

  • SNOW_USER

  • SNOW_PASSWORD

  • SNOW_ACCOUNT

  • SNOW_DATABASE

  • SNOW_WAREHOUSE

  • SNOW_ROLE

  • SNOW_QUERYTAG

  • SNOW_AUTHENTICATOR

  • SNOW_TOKEN

  • PRIVATE_KEY_PATH

  • PRIVATE_KEY_PASSWORD

Key Pair Authentication

The log_on function also supports the key pair authentication process. Review the Snowflake documentation for more information about key creation. Note the required parameters:

 log_on(
   user='YOUR_USER',
   account='YOUR_ACCOUNT',
   role='YOUR_ROLE',
   warehouse='YOUR_WAREHOUSE',
   database='YOUR_DATABASE',
   private_key_path='/YOUR_PATH/rsa_key.p8',
   private_key_password='YOUR_PASSWORD')

Example of passing environment variables

Here is an example of using SNOW_AUTHENTICATOR, SNOW_USER, and SNOW_PASSWORD. They must be defined before running the generated Python file.

On Windows:

 SET SNOW_AUTHENTICATOR=VALUE
SET SNOW_USER=myuser
SET SNOW_PASSWORD=mypassword
python sample_BTEQ.py

On Linux or macOS:

 export SNOW_AUTHENTICATOR=VALUE
export SNOW_USER=myuser
export SNOW_PASSWORD=mypassword
python3 sample_BTEQ.py

Enabling Logging

To enable logging, set an environment variable called SNOW_LOGGING to true.
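
A minimal sketch in Python, assuming the variable is read when the helpers configure logging (setting it in the shell before execution works as well):

 import os

# Assumption: SNOW_LOGGING must be set before configure_log() runs
os.environ["SNOW_LOGGING"] = "true"

import snowconvert.helpers
snowconvert.helpers.configure_log()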

Then, if you want to customize the logging configuration you can pass a parameter to the snowconvert.helpers.configure_log() method like this:

 snowconvert.helpers.configure_log("SOMEPATH.conf")

The configuration file should contain the following structure. For more information about Python logging, refer to the Python logging documentation.

 [loggers]
keys=root

[handlers]
keys=consoleHandler

[formatters]
keys=simpleFormatter

[logger_root]
level=DEBUG
handlers=consoleHandler

[logger_simpleExample]
level=DEBUG
handlers=consoleHandler
qualname=simpleExample
propagate=0

[handler_consoleHandler]
class=FileHandler
level=DEBUG
formatter=simpleFormatter
args=('python2.log', 'w')

[formatter_simpleFormatter]
format=%(asctime)s -%(levelname)s - %(message)s

Snowflake

Once any of your migrated code has been executed, you can go to Snowflake and check your changes or deployments.

 select * from PUBLIC.table1;

You will be able to see the rows you have inserted in the example above.

Query result

Local Helpers Documentation

First, install the Python package named pydoc (available since version 2.0.2 of snowconvert-helpers).

 pip install pydoc

Then, to display the Python documentation of the snowconvert-helpers package, go to a folder that contains the converted output code.

D:\bteq\Output>dir

 Volume in drive D is Storage
 Volume Serial Number is 203C-168C

 Directory of D:\bteq\Output

05/25/2021  03:55 PM    <DIR>          .
05/25/2021  03:55 PM    <DIR>          ..
05/25/2021  03:55 PM               630 input_BTEQ.py
               1 File(s)            630 bytes
               2 Dir(s)  1,510,686,502,912 bytes free


From this directory, you need to run:

 python -m pydoc -b

This starts a local server and opens your preferred browser with the HTML documentation for all the installed packages.

D:\bteq\Output>python -m pydoc -b
Server ready at http://localhost:61355/
Server commands: [b]rowser, [q]uit
server>


This will open the browser with the documentation of your code like:

Home page for the generated local documentation

Scroll through to the end of the page to see the installed packages. You will see something similar to:

Local installed packages documentation index

Clicking on SnowConvert (package), you will see something like:

Home page for the snowconvert-helpers documentation

Clicking on the helpers module will display a screen similar to:

Home page for helpers module

Then you can scroll through the functions and classes of the module.

Functions documentation

Known Issues

No issues were found.

Related EWIs

No related EWIs.

Technical Documentation

Functions

All the functions defined in the project.

access

**`access`**`(path, mode, *, dir_fd=None, effective_ids=False, follow_symlinks=True)`
Description:

Use the real uid/gid to test for access to a path.

dir_fd, effective_ids, and follow_symlinks may not be implemented on your platform. If they are unavailable, using them will raise a NotImplementedError.

Note that most operations will use the effective uid/gid, therefore this routine can be used in a suid/sgid environment to test if the invoking user has the specified access to the path.

Parameters:
  • path, Path to be tested; can be string, bytes, or a path-like object

  • mode, Operating-system mode bitfield. Can be F_OK to test existence, or the inclusive-OR of R_OK, W_OK, and X_OK

  • dir_fd, If not None, it should be a file descriptor open to a directory, and path should be relative; path will then be relative to that directory

  • effective_ids, If True, access will use the effective uid/gid instead of the real uid/gid

  • follow_symlinks, If False, and the last element of the path is a symbolic link, access will examine the symbolic link itself instead of the file the link points to

at_exit_helpers

**`at_exit_helpers`**`()`
Description:

Executes at the exit of the script execution.

colored

**`colored`**`(text, color='blue')`
Description:

Prints the given text in the specified color.

Parameters:
  • text, The text to be printed

  • color="blue", The color to print

configure_log

**`configure_log`**`(configuration_path)`
Description:

Configures the logging that will be performed for any data-related execution on the snowflake connection. The log file is named ‘snowflake_python_connector.log’ by default.

Parameters:

  • configuration_path, The configuration path of the file that contains all the settings desired for the logging

drop_transient_table

**`drop_transient_table`**`(tempTableName, con=None)`
Description:

Drops the transient table with the specified name.

Parameters:

  • tempTableName, The name of the temporary table

  • con=None, The connection to be used, if None is passed it will use the last connection performed

exception_hook

**`exception_hook`**`(exctype, value, tback)`
Description:

Parameters:

  • exctype

  • value

  • tback

exec

**`exec`**`(sql_string, using=None, con=None)`
Description:

Executes a SQL string using the last connection; optionally it uses arguments or a specific connection. A usage sketch follows the parameter list below. Examples:

  • exec("SELECT * FROM USER")

  • exec("SELECT * FROM USER", con)

  • exec("SELECT * FROM CUSTOMER WHERE CUSTOMERID = %s", customer)

Parameters:

  • sql_string, The definition of the sql

  • using=None, The optional parameter that can be used in the sql passed

  • con=None, The connection to be used, if None is passed it will use the last connection performed
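
A short usage sketch tying the examples above together (the connection values and the customer variable are placeholders, not part of the helpers):

 import snowconvert.helpers
from snowconvert.helpers import exec

con = snowconvert.helpers.log_on(user="myuser", password="mypassword", account="myaccount")
# Runs against the last opened connection
exec("SELECT * FROM USER")
# Binds the value of 'customer' into the %s placeholder
customer = 42
exec("SELECT * FROM CUSTOMER WHERE CUSTOMERID = %s", customer)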

exec_file

**`exec_file`**`(filename, con=None)`
Description:

Reads the content of a file and executes the SQL statements it contains using the specified connection.

Parameters:

  • filename, The filename to be read and executed

  • con=None, The connection to be used, if None is passed it will use the last connection performed

exec_os

**`exec_os`**`(command)`
Description:

Executes a command in the operating system.

exec_sql_statement

**`exec_sql_statement`**`(sql_string, con, using=None)`
Description:

Executes a SQL statement on the connection passed, with the optional arguments.

Parameters:

  • sql_string, The sql containing the string to be executed

  • con, The connection to be used

  • using, The optional parameters to be used in the sql execution

expands_using_params

**`expands_using_params`**`(statement, params)`
Description:

Expands the statement passed with the parameters.

Parameters:

  • statement, The sql containing the string to be executed

  • params, The parameters of the sql statement

expandvar

**`expandvar`**`(str)`
Description:

Expands the variable from the string passed.

Parameters:

  • str, The string to be expanded with the variables

expandvars

**`expandvars`**`(path, params, skip_escaped=False)`
Description:

Expand environment variables of form $var and ${var}. If parameter ‘skip_escaped’ is True, all escaped variable references (i.e. preceded by backslashes) are skipped. Unknown variables are set to ‘default’. If ‘default’ is None, they are left unchanged.

Parameters:

  • path,

  • params,

  • skip_escaped=False,
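
A small sketch of the expansion behavior described above (the input string and params dictionary are illustrative):

 from snowconvert.helpers import expandvars

# Expands $var and ${var} references using the params mapping
expanded = expandvars("load $env/${name}.csv", {"env": "prod", "name": "data"})
print(expanded)  # expected: load prod/data.csv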

fast_load

**`fast_load`**`(target_schema, filepath, stagename, target_table_name, con=None)`
Description:

Executes the fast load with the passed parameters target_schema, filepath, stagename, and target_table_name. A hypothetical call follows the parameter list below.

Parameters:

  • target_schema, The name of the schema to be used in the fast load

  • filepath, The filename path to be loaded in the table

  • target_table_name, The name of the table that will have the data loaded

  • con=None, The connection to be used, if None is passed it will use the last connection performed
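
A hypothetical call, following the parameter list above (connection values and the schema, file, stage, and table names are placeholders):

 import snowconvert.helpers

con = snowconvert.helpers.log_on(user="myuser", password="mypassword", account="myaccount")
# Load data/my_data.csv into PUBLIC.MY_TABLE through the MY_STAGE stage
snowconvert.helpers.fast_load("PUBLIC", "data/my_data.csv", "MY_STAGE", "MY_TABLE", con)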

file_exists_and_readable

**`file_exists_and_readable`**`(filename)`
Description:

Parameters:

  • filename,

get_argkey

**`get_argkey`**`(astr)`
Description:

Gets the argument key value from the passed string. It must start with the string '--param-'.

Parameters:

  • astr, The argument string to be used. The string should have a value similar to --param-column=32, and the returned string will be '32'.
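
For example, a sketch using the --param- convention described above:

 import snowconvert.helpers

# Extracts the value portion of a --param- style argument
value = snowconvert.helpers.get_argkey("--param-column=32")
print(value)  # expected: '32'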

get_error_position

**`get_error_position`**`()`
Description:

Gets the error position from the file using the information of the stack of the produced error.

get_from_vars_or_args_or_environment

**`get_from_vars_or_args_or_environment`**`(arg_pos, variable_name, vars, args)`
Description:

Gets the argument from the position specified, or the value from the vars table, or the environment variable with the passed name.

Parameters:

  • arg_pos, The argument position to be used from the arguments parameter

  • variable_name, The name of the variable to be obtained

  • vars, The hash with the variables names and values

  • args, The arguments array parameter

import_data_to_temptable

**`import_data_to_temptable`**`(tempTableName, inputDataPlaceholder, con)`
Description:

Imports data to a temporary table using an input data placeholder.

Parameters:

  • tempTableName, The temporary table name.

  • inputDataPlaceholder, The input placeholder used, which is a stage in the Snowflake database

  • con, The connection to be used

import_file

**`import_file`**`(filename, separator=' ')`
Description:

Imports the passed filename with the optional separator.

Parameters:

  • filename, The filename path to be imported

  • separator=' ', The optional separator

import_file_to_temptable

**`import_file_to_temptable`**`(filename, tempTableName, columnDefinition)`
Description:

Imports the passed file into a temporary table. It uses a public stage named after the temporary table with the prefix Stage_. At the end of the load into the temporary table, the stage used in the process is deleted.

Parameters:

  • filename, The name of the file to be read

  • tempTableName, The name of the temporary table

  • columnDefinition, The definition of all the fields that will have the temporary table
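
For reference, the MultiLoad .IMPORT translation earlier in this document generates a call of this shape (the file name and column definition here are placeholders; generated code may pass additional layout arguments, such as a conditions string and a delimiter):

 import snowconvert.helpers

# Loads input.dat into the temporary table MY_TEMP_TABLE through a public
# stage named Stage_MY_TEMP_TABLE, which is dropped after the load
columns = "TABLE_ID INTEGER, TABLE_DESCR CHAR(8), TABLE_NBR SMALLINT"
snowconvert.helpers.import_file_to_temptable(r"input.dat", "MY_TEMP_TABLE", columns)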

import_reset

**`import_reset`**`()`
Description:

log

**`log`**`(*msg, level=20, writter=None)`
Description:

Prints a message to the console (standard output) or to the log file, depending on whether logging is enabled.

Parameters:

  • *msg, The message to print or log

  • level=20,

  • writter=None,

log_on

**`log_on`**`(user=None, password=None, account=None, database=None, warehouse=None, role=None, login_timeout=10, authenticator=None)`
Description:

Logs on to the Snowflake database with the passed credentials, database, warehouse, role, login_timeout, and authenticator parameters.

Parameters:

  • user, The user of the database

  • password, The password of the user of the database

  • database, The database to be connected

  • warehouse, The warehouse of the database to be connected

  • role, The role to be connected

  • login_timeout, The maximum timeout before returning an error if the connection is taking too long to connect

  • authenticator, The supported authenticator value to use, such as SNOWFLAKE, EXTERNALBROWSER, SNOWFLAKE_JWT, or OAUTH

  • token, The OAUTH or JWT token

os

**`os`**`(args)`
Description:

Parameters:

  • args,

quit_application

**`quit_application`**`(code=None)`
Description:

Quits the application and optionally returns the passed code.

Parameters:

  • code=None, The code to be returned after it quits

read_param_args

**`read_param_args`**`(args)`
Description:

Reads the parameter arguments from the passed array.

Parameters:

  • args, The arguments to be used

readrun

**`readrun`**`(line, skip=0)`
Description:

Reads the given filename lines and optionally skips some lines at the beginning of the file.

Parameters:

  • line, The filename to be read

  • skip=0, The lines to be skipped

remark

**`remark`**`(arg)`
Description:

Prints the argument.

Parameters:

  • arg, The argument to be printed

repeat_previous_sql_statement

**`repeat_previous_sql_statement`**`(con=None, n=1)`
Description:

Repeats the previously executed SQL statement(s).

Parameters:

  • con=None, Connection if specified. If it is not passed it will use the last connection performed

  • n=1, The number of previous statements to be executed again
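
For instance, a quick sketch (assuming con holds a connection returned by log_on):

 import snowconvert.helpers

con = snowconvert.helpers.log_on(user="myuser", password="mypassword", account="myaccount")
# Re-runs the last two SQL statements executed on the connection
snowconvert.helpers.repeat_previous_sql_statement(con, 2)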

set_default_error_level

**`set_default_error_level`**`(severity_value)`
Description:

Parameters:

  • severity_value,

set_error_level

**`set_error_level`**`(arg, severity_value)`
Description:

Parameters:

  • arg,

  • severity_value,

simple_fast_load

**`simple_fast_load`**`(con, target_schema, filepath, stagename, target_table_name)`
Description:

Executes a simple fast load on the connection with the passed target_schema, filepath, stagename, and target_table_name parameters.

Parameters:

  • con, The connection to be used

  • target_schema, The name of the schema to be used in the fast load

  • filepath, The filename path to be loaded in the table

  • target_table_name, The name of the table that will have the data loaded

stat

**`stat`**`(path, *, dir_fd=None, follow_symlinks=True)`
Description:

Performs a stat system call on the given path. dir_fd and follow_symlinks may not be implemented on your platform. If they are unavailable, using them will raise a NotImplementedError. It's an error to use dir_fd or follow_symlinks when specifying path as an open file descriptor.

Parameters:

  • path, Path to be examined; can be string, bytes, a path-like object or
    open-file-descriptor int

  • dir_fd, If not None, it should be a file descriptor open to a directory, and path should be a relative string; path will then be relative to that directory

  • follow_symlinks, If False, and the last element of the path is a symbolic link, stat will examine the symbolic link itself instead of the file the link points to

system

**`system`**`(command)`
Description:

Execute the command in a subshell.

Parameters:

  • command,

using

**`using`**`(*argv)`
Description:

Parameters:

  • *argv,

Classes

All the classes defined in the project

BeginLoading Class

This class contains the import_file_to_tab static function, which provides support for the BEGIN LOADING and associated commands in FastLoad. A usage sketch follows the parameter list below.

import_file_to_tab()

Parameters:

  1. target_schema_table

    • the target schema (optional) and table name

  2. define_file

    • The name of the file to be read

  3. define_columns

    • The definition of all the columns for the temporary table

  4. begin_loading_columns

    • The column names to insert. Dictates the order in which values are inserted

  5. begin_loading_values

    • The list of raw insert values to convert

  6. field_delimiter

    • The field delimiter

  7. (optional) skip_header

    • The number of rows to skip

  8. (optional) input_data_place_holder

    • The location of the file in a supported cloud provider. Set parameter when the file is not stored locally

  9. (optional) con

    • The connection to be used
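
For reference, the FastLoad Large Example later in this document invokes this helper as import_file_to_table, with arguments taken from that example:

 from snowconvert.helpers import BeginLoading

ssc_define_columns = "ID (INTEGER), UFACTOR (INTEGER), MISC (CHAR(42))"
ssc_begin_loading_columns = "(ID, UFACTOR, MISC)"
ssc_begin_loading_values = [":ID", ":MISC"]
BeginLoading.import_file_to_table(
    "FastTable",
    ssc_define_columns,
    "FileName",
    ssc_begin_loading_columns,
    ssc_begin_loading_values,
    ",")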

Export Class

Static methods in the class

  • defaults()

  • null(value=None)

  • record_mode(value=None)

  • report(file, separator=' ')

  • reset()

  • separator_string(value=None)

  • separator_width(value=None)

  • side_titles(value=None)

  • title_dashes(value=None, withValue=None)

  • title_dashes_with(value=None)

  • width(value=None)


Data and other attributes defined here

  • expandedfilename = None

  • separator = ''

Import Class

Methods in the class

  • reset()

Static methods in the class

  • file(file, separator=' ')

  • using(globals, *argv)

Data and other attributes defined in the class

  • expandedfilename = None

  • no_more_rows = False

  • read_obj = None

  • reader = None

  • separator = ' '

Parameters Class

Data and other attributes defined in the class

  • passed_variables = {}

BTEQ

Basic Teradata Query (BTEQ) is a general-purpose, command-based program that enables users on a workstation to communicate with one or more Teradata Database systems, and to format reports for both print and screen output.

In order to simulate the BTEQ functionality for Teradata in Snowflake, BTEQ files and commands are transformed to Python code, similar to the transformations performed for MultiLoad and FastLoad scripts. The generated code uses the Snowflake Python project called snowconvert.helpers which contains the required functions to simulate the BTEQ statements in Snowflake.

BTEQ Commands Translation

The following table presents the conversion for the BTEQ commands to Snowflake.

| Teradata | Snowflake | Notes |
|---|---|---|
| ERRORCODE != 0 | snowconvert.helpers.error_code != 0 | |
| .EXPORT DATA FILE=fileName | Export.report("fileName", ",") | The function has no functionality |
| .EXPORT INDICDATA FILE=fileName | Export.report("fileName", ",") | The function has no functionality |
| .EXPORT REPORT FILE=fileName | Export.report("fileName", ",") | The function has no functionality |
| .EXPORT DIF FILE=fileName | Export.report("fileName", ",") | The function has no functionality |
| .EXPORT RESET | Export.reset() | The function has no functionality |
| .IF ERRORCODE != 0 THEN .QUIT ERRORCODE | if snowconvert.helpers.error_code != 0: snowconvert.helpers.quit_application(snowconvert.helpers.error_code) | |
| .IMPORT RESET | snowconvert.helpers.import_reset() | The function has no functionality |
| .LABEL newLabel | def NEWLABEL(): snowconvert.helpers.quit_application() | |
| .LOGOFF | | The statement is commented out |
| .LOGON | | The statement is commented out |
| .LOGMECH | | The statement is commented out |
| .OS /fs/fs01/bin/filename.sh 'load' | snowconvert.helpers.os("/fs/fs01/bin/filename.sh 'load'") | |
| .RUN FILE=newFile | for statement in snowconvert.helpers.readrun("newFile"): eval(statement) | |
| .SET DEFAULTS | Export.defaults() | The function has no functionality |
| .SET ERRORLEVEL 3807 SEVERITY 0 | snowconvert.helpers.set_error_level(3807, 0) | |
| .SET RECORDMODE OFF | Export.record_mode(False) | |
| .SET RECORDMODE ON | Export.record_mode(True) | |
| .SET SEPARATOR '\|' | Export.separator_string('\|') | The function has no functionality |
| .SET WIDTH 120 | Export.width(120) | The function has no functionality |
| .REMARK "Hello world!" | snowconvert.helpers.remark(r"""Hello world!""") | |
| .QUIT ERRORCODE | snowconvert.helpers.quit_application(snowconvert.helpers.error_code) | |
| .QUIT | snowconvert.helpers.quit_application() | |
| SQL statements | exec(statement) | |
| $(<$INPUT_SQL_FILE) | exec_file("$INPUT_SQL_FILE") | |
| = (Repeat previous command) | snowconvert.helpers.repeat_previous_sql_statement(con) | |

For the more complicated statements presented in the previous table, subsections with more elaborate examples follow.

.GOTO Conversion

Since we are converting BTEQ scripts to Python, certain structures that are valid in BTEQ are not inherently supported in Python. This is the case for the .GOTO command, which works together with the .LABEL commands.

For this reason, an alternative has been developed so that the functionality of these commands can be emulated, turning the .Label commands into functions with subsequent call statements.

Check the following code:

 .LABEL FIRSTLABEL
SELECT * FROM MyTable1;
.LABEL SECONDLABEL
SELECT * FROM MyTable2;
SELECT * FROM MyTable3;

In the example above, there were five commands. Two of them were .LABEL commands. The command FIRSTLABEL was transformed into a function with the statement(s) that follow it until another .LABEL command is found. When another label is called (in this case, SECONDLABEL), that call ends the first function and starts a new one.

If we were to migrate the previous example, the result would be:

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  FIRSTLABEL()
  snowconvert.helpers.quit_application()
def FIRSTLABEL():
  exec("""
    SELECT
      *
    FROM
      MyTable1
    """)
  SECONDLABEL()
def SECONDLABEL():
  exec("""
    SELECT
      *
    FROM
      MyTable2
    """)
  exec("""
    SELECT
      *
    FROM
      MyTable3
    """)

if __name__ == "__main__":
  main()

Notice there is a call to the function FIRSTLABEL. This function has only one statement, which is the only non-label command that follows FIRSTLABEL in the original code. Before the FIRSTLABEL function ends, it calls SECONDLABEL, with the statements that followed it.

Notes:

  • Creating a connector variable con = None, and populating it in the main() function: con = snowconvert.helpers.log_on().

  • Setting up a log: snowconvert.helpers.configure_log().

Execute Query Statements

Every SQL statement found in a BTEQ file is executed through the exec function provided by snowconvert.helpers. Take, for example, the following code:

 CREATE TABLE aTable (aColumn BYTEINT);

This is converted to:

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  exec("""
    CREATE OR REPLACE TABLE aTable (
      aColumn BYTEINT
    )
    """)
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

Execute Script Files

Files that contain a user’s BTEQ commands and Teradata SQL statements are called scripts, run files, macros, or stored procedures. For example, create a file called SAMPFILE, and enter the following BTEQ script:

    .LOGON tdpid/userid,password 
   SELECT * FROM department;
   .LOGOFF

To execute the run file, enter either form of the BTEQ RUN command:

.RUN FILE=sampfile


If you convert the second snippet, the result is the following:

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()

  for statement in snowconvert.helpers.readrun(fr"sampfile"):
    eval(statement)
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

The snowconvert.helpers.readrun("sampfile") call returns each line from the SAMPFILE, and in the for statement each line is passed to the eval function, a method that parses the expression passed to it and runs it as a Python expression within the program (the SAMPFILE itself should be converted for this to work).

Execute SQL Files

In some instances during the execution of a BTEQ file, a SQL file can be found. Take, for example, the SQL file called NEWSQL:

 CREATE TABLE aTable (aColumn BYTEINT);

This can be executed during a script with the following line:

 $(<$NEWSQL)

And after the conversion of the script the result is:

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  exec("""
    CREATE OR REPLACE TABLE aTable (
      aColumn BYTEINT
    )
    """)
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

The exec_file helper function will read each line from the NEWSQL file and then use the exec function, as explained in the Execute Query Statements section.

Known Issues

No issues were found.

Related EWIs

No related EWIs.

REPEAT

Note

Some parts in the output code are omitted for clarity reasons.

As per Teradata’s documentation, the REPEAT statement enables users to specify the maximum number of times the next SQL request is to be submitted. Note that a SQL request can be a single or multi-statement request. This is defined by the position of the semicolons for each statement following the REPEAT statement.

Syntax

REPEAT [ n [ PACK p [ REQBUFLEN b ] ] | * | RECS r]
<sql_request>


Sample Source Patterns

With this input data (inputData.dat):

 A B C
D E F
G H I

And this second input file (inputData2.dat):

 * [
] *

Teradata:

 .IMPORT DATA FILE = inputData.dat;
.REPEAT *
USING var_1 (CHARACTER), var_2 (CHARACTER), var_3 (CHARACTER)
INSERT INTO testtabu (c1) VALUES (:var_1)
;INSERT INTO testtabu (c1) VALUES (:var_2) 
;INSERT INTO testtabu (c1) VALUES (:var_3) 
;UPDATE testtabu 
   SET c2 = 'X'
   WHERE c1 = :var_1 
;UPDATE testtabu 
   SET c2 = 'Y' 
   WHERE c1 = :var_2 
;UPDATE testtabu 
   SET c2 = 'Z'
   WHERE c1 = :var_3
;INSERT INTO TESTTABU (c1, c2) VALUES ('?','_');

.REPEAT 10
INSERT INTO TESTTABU2 VALUES ('John Doe', 23);

.REPEAT RECS 5
INSERT INTO TESTTABU2 VALUES ('Bob Alice', 21);

.IMPORT DATA FILE = inputData2.dat;
USING (var_1 CHARACTER, var_2 CHARACTER)
INSERT INTO testtabu (c1) VALUES (:var_1) 
;INSERT INTO testtabu (c1) VALUES (:var_2);

Result in testtabu:

| C1 | C2 |
|---|---|
| A | X |
| D | X |
| G | X |
| B | Y |
| E | Y |
| H | Y |
| C | Z |
| F | Z |
| I | Z |
| ? | _ |
| ? | _ |
| ? | _ |
| * | null |
| [ | null |

Result in TESTTABU2:

| MY_NAME | MY_AGE |
|---|---|
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| Bob Alice | 21 |
| Bob Alice | 21 |
| Bob Alice | 21 |
| Bob Alice | 21 |
| Bob Alice | 21 |

Snowflake:

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  snowconvert.helpers.import_file(fr"inputData.dat")
  ssc_repeat_value = '*'
  ssc_max_iterations = 1

  for ssc_repeat_position in range(0, ssc_max_iterations):

    if ssc_repeat_position == 0:
      using = snowconvert.helpers.using("var_1", "CHARACTER", "var_2", "CHARACTER", "var_3", "CHARACTER", rows_to_read = ssc_repeat_value)
      exec("""
        INSERT INTO testtabu (c1)
        VALUES (:var_1)
        """, using = using)
      exec("""
        INSERT INTO testtabu (c1)
        VALUES (:var_2)
        """, using = using)
      exec("""
        INSERT INTO testtabu (c1)
        VALUES (:var_3)
        """, using = using)
      exec("""
        UPDATE testtabu
          SET
            c2 = 'X'
          WHERE
            c1 = :var_1
        """, using = using)
      exec("""
        UPDATE testtabu
          SET
            c2 = 'Y'
          WHERE
            c1 = :var_2
        """, using = using)
      exec("""
        UPDATE testtabu
          SET
            c2 = 'Z'
          WHERE
            c1 = :var_3
        """, using = using)
      exec("""
        INSERT INTO TESTTABU (c1, c2)
        VALUES ('?', '_')
        """, using = using)

  ssc_repeat_value = 10
  ssc_max_iterations = 10

  for ssc_repeat_position in range(0, ssc_max_iterations):
    exec("""
      INSERT INTO TESTTABU2
      VALUES ('John Doe', 23)
      """)
  ssc_repeat_value = 5
  ssc_max_iterations = 5

  for ssc_repeat_position in range(0, ssc_max_iterations):
    exec("""
      INSERT INTO TESTTABU2
      VALUES ('Bob Alice', 21)
      """)
  snowconvert.helpers.import_file(fr"inputData2.dat")
  using = snowconvert.helpers.using("var_1", "CHARACTER", "var_2", "CHARACTER", rows_to_read = 1)
  exec("""
    INSERT INTO testtabu (c1)
    VALUES (:var_1)
    """, using = using)
  exec("""
    INSERT INTO testtabu (c1)
    VALUES (:var_2)
    """, using = using)
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

Result in testtabu:

| C1 | C2 |
|---|---|
| A | X |
| D | X |
| G | X |
| B | Y |
| E | Y |
| H | Y |
| C | Z |
| F | Z |
| I | Z |
| ? | _ |
| ? | _ |
| ? | _ |
| * | null |
| [ | null |

Result in TESTTABU2:

| MY_NAME | MY_AGE |
|---|---|
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| John Doe | 23 |
| Bob Alice | 21 |
| Bob Alice | 21 |
| Bob Alice | 21 |
| Bob Alice | 21 |
| Bob Alice | 21 |

Known Issues

No issues were found.

Related EWIs

No related EWIs.

USING REQUEST MODIFIER

Note

Some parts in the output code are omitted for clarity reasons.

As per Teradata’s documentation, the USING REQUEST MODIFIER defines one or more variable parameter names to be used in the subsequent SELECT, INSERT, UPDATE, or DELETE statements to import or export data.

The syntax for this statement is as follows:

 USING ( <using_spec> [,...] ) SQL_request

<using_spec> ::= using_variable_name data_type [ data_type_attribute [...] ]
  [ AS { DEFERRED [BY NAME] | LOCATOR } ]

As stated in Teradata’s documentation, the USING REQUEST MODIFIER needs to be preceded by an .IMPORT statement for it to load the data into the defined parameters.

Thus, the transformation for this statement follows these steps:

  1. Call the import_file() function from the SnowConvert Helpers. This loads the data into a temporary file.

  2. Call the using() function from the SnowConvert Helpers to create a dictionary with the loaded data.

  3. For each query, run the exec() function from the SnowConvert Helpers and pass the previously defined dictionary. This will use Snowflake Python Connector data binding capabilities.

With this input data:

 A,B,C

Teradata (MultiLoad)

 .IMPORT DATA FILE = inputData.dat;
USING var_1 (CHARACTER), var_2 (CHARACTER), var_3 (CHARACTER)
INSERT INTO testtabu (c1) VALUES (:var_1)
;INSERT INTO testtabu (c1) VALUES (:var_2) 
;INSERT INTO testtabu (c1) VALUES (:var_3) 
;UPDATE testtabu 
   SET c2 = 'X'
   WHERE c1 = :var_1 
;UPDATE testtabu 
   SET c2 = 'Y' 
   WHERE c1 = :var_2 
;UPDATE testtabu 
   SET c2 = 'Z'
   WHERE c1 = :var_3;

| ROW | C1 | C2 |
|---|---|---|
| 1 | A | X |
| 2 | B | Y |
| 3 | C | Z |

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  snowconvert.helpers.import_file(fr"inputData.dat")
  using = snowconvert.helpers.using("var_1", "CHARACTER", "var_2", "CHARACTER", "var_3", "CHARACTER", rows_to_read = 1)
  exec("""
    INSERT INTO testtabu (c1)
    VALUES (:var_1)
    """, using = using)
  exec("""
    INSERT INTO testtabu (c1)
    VALUES (:var_2)
    """, using = using)
  exec("""
    INSERT INTO testtabu (c1)
    VALUES (:var_3)
    """, using = using)
  exec("""
    UPDATE testtabu
      SET
        c2 = 'X'
      WHERE
        c1 = :var_1
    """, using = using)
  exec("""
    UPDATE testtabu
      SET
        c2 = 'Y'
      WHERE
        c1 = :var_2
    """, using = using)
  exec("""
    UPDATE testtabu
      SET
        c2 = 'Z'
      WHERE
        c1 = :var_3
    """, using = using)
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

| ROW | C1 | C2 |
|---|---|---|
| 1 | A | X |
| 2 | B | Y |
| 3 | C | Z |

Known Issues

1. .REPEAT command is not yet supported

The .REPEAT command is not yet supported. This means that the USING REQUEST MODIFIER will only use the data loaded from the first row of the input file. Thus, the queries will only run once.

This issue should be fixed when the .REPEAT command receives proper transformation support.

If you have any additional questions regarding this documentation, you can email us at snowconvert-support@snowflake.com.

Related EWIs

No related EWIs.

FLOAD

Teradata FastLoad is a command-driven utility for quickly loading large amounts of data into an empty table on a Teradata Database.

In order to simulate the FastLoad functionality for Teradata in Snowflake, FastLoad files and commands are transformed to Python code, similar to the transformations performed for BTEQ and MultiLoad scripts. The generated code uses the Snowflake Python project called snowconvert.helpers which contains the required functions to simulate the FastLoad statements in Snowflake.

FastLoad Commands Translation

Most of the FastLoad commands are considered not relevant in Snowflake, so these commands are commented out. Below is the summary list of FastLoad commands and their transformation status in Snowflake:

| Teradata FastLoad Command | Transformation Status | Note |
|---|---|---|
| AXSMOD | Commented | |
| BEGIN LOADING | Transformed | The node is commented out since the transformation occurs in the related INSERT statement instead. |
| CLEAR | Commented | |
| DATEFORM | Commented | |
| DEFINE | Transformed | |
| END LOADING | Transformed | Commented out since it is not necessary for the transformation of the BEGIN LOADING. |
| ERRLIMIT | Commented | |
| HELP | Commented | |
| HELP TABLE | Commented | |
| INSERT | Transformed | Transformed as part of the BEGIN LOADING. |
| LOGDATA | Commented | |
| LOGMECH | Commented | |
| LOGOFF | Commented | |
| LOGON | Commented | |
| NOTIFY | Commented | |
| OS | Commented | |
| QUIT | Commented | |
| RECORD | Commented | |
| RUN | Commented | |
| SESSIONS | Commented | |
| SET RECORD | Transformed | |
| SET SESSION CHARSET | Commented | |
| SHOW | Commented | |
| SHOW VERSIONS | Commented | |
| SLEEP | Commented | |
| TENACITY | Commented | |

Default Transformation

The default behavior of the ConversionTool for these statements is to comment them out. For example:

Teradata (FastLoad)

 SESSIONS 4;
ERRLIMIT 25;

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE.  **
  #SESSIONS 4
   
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE.  **
  #ERRLIMIT 25
   
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()

Nonetheless, there are some exceptions that must be converted to specific Python statements in order to work as intended in Snowflake.

Embedded SQL

FastLoad scripts can contain Teradata SQL statements in the same file. Most of these statements are converted just as if they appeared in a BTEQ file, with some exceptions.

Dropping an error table is commented out when it appears inside a FastLoad file.

Teradata (FastLoad)

 DROP TABLE Error1;
DROP TABLE Error2;
Copy

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  exec("""
    DROP TABLE Error1
    """)
  exec("""
    DROP TABLE Error2
    """)
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()
Copy

Large Example

Given the transformations shown above for a variety of commands, consider the following example.

Teradata (FastLoad)

 SESSIONS 4;
ERRLIMIT 25;
DROP TABLE FastTable;
DROP TABLE Error1;
DROP TABLE Error2;
CREATE TABLE FastTable, NO FALLBACK
   ( ID INTEGER, UFACTOR INTEGER, MISC CHAR(42))
   PRIMARY INDEX(ID);
DEFINE ID (INTEGER), UFACTOR (INTEGER), MISC (CHAR(42))
   FILE=FileName;
SHOW;
BEGIN LOADING FastTable ERRORFILES Error1,Error2
   CHECKPOINT 10000;
INSERT INTO FastTable (ID, UFACTOR, MISC) VALUES
   (:ID, :UFACTOR, :MISC);
END LOADING;
Copy

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
#** SSC-FDM-0007 - MISSING DEPENDENT OBJECTS "Error1", "Error2" **
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE.  **
  #SESSIONS 4
   
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE.  **
  #ERRLIMIT 25
   
  exec("""
    DROP TABLE FastTable
    """)
  exec("""
    CREATE OR REPLACE TABLE FastTable (
      ID INTEGER,
      UFACTOR INTEGER,
      MISC CHAR(42)
    )
    """)
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS ASSIGNMENT STATEMENTS **
  #DEFINE ID (INTEGER), UFACTOR (INTEGER), MISC (CHAR(42)) FILE = FileName
   
  ssc_define_columns = "ID (INTEGER), UFACTOR (INTEGER), MISC (CHAR(42))"
  #Set file name manually if empty
  ssc_define_file = f"""FileName"""
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE.  **
  #SHOW
   
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW **
  #BEGIN LOADING FastTable ERRORFILES Error1, Error2 CHECKPOINT 10000
   
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS PART OF THE BEGIN LOADING TRANSLATION **
  #INSERT INTO FastTable (ID, UFACTOR, MISC) VALUES (:ID, :UFACTOR, :MISC)
   
  ssc_begin_loading_columns = "(ID, UFACTOR, MISC)"
  ssc_begin_loading_values = [":ID", ":UFACTOR", ":MISC"]
  BeginLoading.import_file_to_table(f"""FastTable""", ssc_define_columns, ssc_define_file, ssc_begin_loading_columns, ssc_begin_loading_values, ",")
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. END LOADING **
  #END LOADING
   
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()
Copy

If you have any additional questions regarding this documentation, you can email us at snowconvert-support@snowflake.com.

Known Issues

No issues were found.

Related EWIs

  1. SSC-FDM-0007: Element with missing dependencies.

  2. SSC-FDM-0027: Removed next statement, not applicable in SnowFlake.

BEGIN LOADING

The transformation for the command BEGIN LOADING is a multi-part transformation that requires the DEFINE, INSERT and (optionally) SET RECORD commands to simulate its behavior correctly.

This transformation is fully explained in the following subsections.
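
As an overview, the following condensed sketch shows how the variables produced by those commands feed a single helper call; every value here is illustrative and mirrors the generated examples in the subsections below.

 from snowconvert.helpers import BeginLoading

# From SET RECORD (optional): the field delimiter, "," when the command is absent.
ssc_set_record = ","
# From DEFINE: the column definitions and the input file path.
ssc_define_columns = "id (INTEGER), name (VARCHAR(50))"
ssc_define_file = "/tmp/inputData.txt"
# From the INSERT statement: the target columns and the values to load, in order.
ssc_begin_loading_columns = "(id, name)"
ssc_begin_loading_values = [":id", ":name"]

# Simulates the whole FastLoad BEGIN LOADING ... END LOADING process.
BeginLoading.import_file_to_table(
  "MY_DATABASE.MY_TABLE",  # illustrative target table
  ssc_define_columns,
  ssc_define_file,
  ssc_begin_loading_columns,
  ssc_begin_loading_values,
  ssc_set_record)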

SET RECORD

As stated above, this command is not required for the transformation of the BEGIN LOADING. If it is not present, the delimiter defaults to ',' (comma); otherwise, the defined delimiter is used. The value is stored in the ssc_set_record variable.

As of now, only SET RECORD VARTEXT, SET RECORD FORMATTED, and SET RECORD UNFORMATTED are supported. For the BINARY and TEXT keyword specifications, an error EWI is placed instead.

Teradata (FastLoad)

 SET RECORD VARTEXT DELIMITER 'c' DISPLAY ERRORS 'efilename';
SET RECORD VARTEXT 'l' 'c' NOSTOP;
SET RECORD VARTEXT 'l' TRIM NONE LEADING 'p';
SET RECORD VARTEXT 'l' TRIM NONE TRAILING 'p';
SET RECORD VARTEXT 'l' TRIM NONE BOTH 'p';
SET RECORD FORMATTED TRIM NONE BOTH;
SET RECORD UNFORMATTED QUOTE NO OPTIONAL;
SET RECORD BINARY QUOTE NO YES 'q';
SET RECORD TEXT QUOTE OPTIONAL;
Copy

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS AN ASSIGNMENT STATEMENT **
  #SET RECORD VARTEXT DELIMITER 'c' DISPLAY ERRORS 'efilename'
   
  ssc_set_record = ""
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS AN ASSIGNMENT STATEMENT **
  #SET RECORD VARTEXT 'l' 'c' NOSTOP
   
  ssc_set_record = "'l'"
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS AN ASSIGNMENT STATEMENT **
  #SET RECORD VARTEXT 'l' TRIM NONE LEADING 'p'
   
  ssc_set_record = "'l'"
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS AN ASSIGNMENT STATEMENT **
  #SET RECORD VARTEXT 'l' TRIM NONE TRAILING 'p'
   
  ssc_set_record = "'l'"
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS AN ASSIGNMENT STATEMENT **
  #SET RECORD VARTEXT 'l' TRIM NONE BOTH 'p'
   
  ssc_set_record = "'l'"
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS AN ASSIGNMENT STATEMENT **
  #SET RECORD FORMATTED TRIM NONE BOTH
   
  ssc_set_record = ","
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS AN ASSIGNMENT STATEMENT **
  #SET RECORD UNFORMATTED QUOTE NO OPTIONAL
   
  ssc_set_record = "UNFORMATTED"
  #** SSC-EWI-0021 - 'BINARY' KEYWORD SPECIFICATION FOR SET RECORD NOT SUPPORTED IN SNOWFLAKE **
  #SET RECORD BINARY QUOTE NO YES 'q'
   
  #** SSC-EWI-0021 - 'TEXT' KEYWORD SPECIFICATION FOR SET RECORD NOT SUPPORTED IN SNOWFLAKE **
  #SET RECORD TEXT QUOTE OPTIONAL
   
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()
Copy

DEFINE

The transformation for the DEFINE command sets the ssc_define_columns and ssc_define_file variables to the column definitions and the input file path, respectively, to be used later in the BEGIN LOADING transformation.

Teradata (FastLoad)

 DEFINE
    id (INTEGER),
    first_name (VARCHAR(50)),
    last_name (VARCHAR(50)),
    salary (FLOAT)
FILE=/tmp/inputData.txt;

DEFINE
    id (INTEGER),
    first_name (VARCHAR(50)),
    last_name (VARCHAR(50)),
    salary (FLOAT)

DEFINE
FILE=/tmp/inputData.txt;

DEFINE;
Copy

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS ASSIGNMENT STATEMENTS **
  #DEFINE id (INTEGER), first_name (VARCHAR(50)), last_name (VARCHAR(50)), salary (FLOAT) FILE = /tmp/inputData.txt
   
  ssc_define_columns = "id (INTEGER), first_name (VARCHAR(50)), last_name (VARCHAR(50)), salary (FLOAT)"
  #Set file name manually if empty
  ssc_define_file = f"""/tmp/inputData.txt"""
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS ASSIGNMENT STATEMENTS **
  #DEFINE id (INTEGER), first_name (VARCHAR(50)), last_name (VARCHAR(50)), salary (FLOAT)
   
  ssc_define_columns = "id (INTEGER), first_name (VARCHAR(50)), last_name (VARCHAR(50)), salary (FLOAT)"
  #Set file name manually if empty
  ssc_define_file = f""""""
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS ASSIGNMENT STATEMENTS **
  #DEFINE FILE = /tmp/inputData.txt
   
  ssc_define_columns = ""
  #Set file name manually if empty
  ssc_define_file = f"""/tmp/inputData.txt"""
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS ASSIGNMENT STATEMENTS **
  #DEFINE
   
  ssc_define_columns = ""
  #Set file name manually if empty
  ssc_define_file = f""""""
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()
Copy
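
Note that when the original DEFINE has no FILE clause, the generated ssc_define_file is an empty string. As the inline comment suggests, set it by hand before running the script, for example (the path below is illustrative):

  #Set file name manually if empty
  ssc_define_file = f"""/path/to/myInputData.txt"""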

BEGIN LOADING

The BEGIN LOADING command is commented out since the relevant information for the transformation is found in the associated INSERT statement instead.

The ERRORFILES, NODROP, CHECKPOINT, INDICATORS, and DATAENCRYPTION specifications are not necessary for the transformation and are therefore commented out as well.

Teradata (FastLoad)

 BEGIN LOADING FastTable ERRORFILES Error1,Error2
   CHECKPOINT 10000;
Copy

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW **
  #BEGIN LOADING FastTable ERRORFILES Error1, Error2 CHECKPOINT 10000
   
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()
Copy

INSERT

The transformation for the associated INSERT statement sets the ssc_begin_loading_columns and ssc_begin_loading_values variables, which determine the order in which the values are inserted during the load.

Finally, these variables, together with the ones described in the sections above, are used to call the BeginLoading.import_file_to_table function, part of the snowconvert.helpers module. This function simulates the behavior of the whole FastLoad BEGIN LOADING process. To learn more about this function, check here.

Teradata (FastLoad)

 SET RECORD VARTEXT """";
DEFINE
    _col1 (CHAR(10)),
    _col2 (CHAR(7)),
    _col3 (CHAR(2, NULLIF = 'V5'))
FILE=inputDataNoDel.txt;
BEGIN LOADING TESTS.EmpLoad4
ERRORFILES ${CPRDBName}.ET_${LOADTABLE},${CPRDBName}.UV_${LOADTABLE}
CHECKPOINT 1000;
INSERT INTO TESTS.EmpLoad4 (col2, col3, col1, col4)	
VALUES
(
    :_col2,
    :_col3,
    :_col1,
    CURRENT_DATE
);
Copy

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
#** SSC-FDM-0007 - MISSING DEPENDENT OBJECT "TESTS.EmpLoad4" **
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
#** SSC-FDM-TD0022 - SHELL VARIABLES FOUND, RUNNING THIS CODE IN A SHELL SCRIPT IS REQUIRED **
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS AN ASSIGNMENT STATEMENT **
  #SET RECORD VARTEXT "" ""
   
  ssc_set_record = ""
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS ASSIGNMENT STATEMENTS **
  #DEFINE _col1 (CHAR(10)), _col2 (CHAR(7)), _col3 (CHAR(2, NULLIF = 'V5')) FILE = inputDataNoDel.txt
   
  ssc_define_columns = "_col1 (CHAR(10)), _col2 (CHAR(7)), _col3 (CHAR(2, NULLIF = 'V5'))"
  #Set file name manually if empty
  ssc_define_file = f"""inputDataNoDel.txt"""
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW **
  #BEGIN LOADING TESTS.EmpLoad4 ERRORFILES ${CPRDBName}.ET_${LOADTABLE}, ${CPRDBName}.UV_${LOADTABLE} CHECKPOINT 1000
   
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. TRANSLATED BELOW AS PART OF THE BEGIN LOADING TRANSLATION **
  #INSERT INTO TESTS.EmpLoad4 (col2, col3, col1, col4) VALUES (:_col2, :_col3, :_col1, CURRENT_DATE)
   
  ssc_begin_loading_columns = "(col2, col3, col1, col4)"
  ssc_begin_loading_values = [":_col2", ":_col3", ":_col1", "CURRENT_DATE()"]
  BeginLoading.import_file_to_table(f"""TESTS.EmpLoad4""", ssc_define_columns, ssc_define_file, ssc_begin_loading_columns, ssc_begin_loading_values, ssc_set_record)
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()
Copy

Internally, the import_file_to_table function creates a temporary stage and puts the local file in the stage to load it into the specified table. However, the file might already be stored in one of the cloud providers supported by Snowflake:

| Stage | Input Data Placeholder |
|-------|------------------------|
| Internal stage | @my_int_stage |
| External stage | @my_int_stage/path/file.csv |
| Amazon S3 bucket | s3://mybucket/data/files |
| Google Cloud Storage | gcs://mybucket/data/files |
| Microsoft Azure | azure://myaccount.blob.core.windows.net/mycontainer/data/files |

If this is the case, manually add the additional parameter input_data_place_holder="<cloud_provider_path>" to the import_file_to_table call. For example:

 BeginLoading.import_file_to_table(
  f"""TESTS.EmpLoad4""", 
  ssc_define_columns, 
  ssc_define_file, 
  ssc_begin_loading_columns, 
  ssc_begin_loading_values, 
  ssc_set_record,
  input_data_place_holder="s3://mybucket/data/files")
Copy
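
For reference, the internal-stage path described above is roughly equivalent to the following standalone commands, shown here with the Snowflake Python connector. This is a simplified sketch for illustration only, not the helper's actual implementation; the connection parameters, stage name, file path, and file format are placeholders.

 import snowflake.connector

con = snowflake.connector.connect(
  account="<account>", user="<user>", password="<password>")
cur = con.cursor()

# 1. Create a temporary stage; it is dropped automatically when the session ends.
cur.execute("CREATE TEMPORARY STAGE my_temp_stage")

# 2. Upload the local input file to the stage.
cur.execute("PUT file:///tmp/inputData.txt @my_temp_stage")

# 3. Copy the staged file into the target table, using the delimiter
#    captured from SET RECORD (',' by default).
cur.execute("""
  COPY INTO TESTS.EmpLoad4
  FROM @my_temp_stage
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',')
""")

cur.close()
con.close()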

END LOADING

The END LOADING command is commented out since it is not necessary for the transformation of the BEGIN LOADING.

Teradata (FastLoad)

 END LOADING;
Copy

Snowflake (Python)

 #*** Generated code is based on the SnowConvert Python Helpers version 2.0.6 ***
 
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading
con = None
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  #** SSC-FDM-0027 - REMOVED NEXT STATEMENT, NOT APPLICABLE IN SNOWFLAKE. END LOADING **
  #END LOADING
   
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()
Copy

Known Issues

1. BINARY and TEXT keyword specification not supported

The BINARY and TEXT keyword specification for the SET RECORD command are not yet supported.

2. Only base specification for VARTEXT is supported

Extra specifications for the SET RECORD VARTEXT such as TRIM or QUOTE are not yet supported.

Related EWIs

  1. SSC-FDM-0007: Element with missing dependencies.

  2. SSC-FDM-0027: Removed next statement, not applicable in SnowFlake.

  3. SSC-EWI-0021: Not supported.

  4. SSC-FDM-TD0022: Shell variables found, running this code in a shell script is required.