SnowConvert: Oracle SQL Statements¶
Alter Table¶
Note
Some parts in the output code are omitted for clarity reasons.
1. Description¶
Use the ALTER TABLE statement to alter the definition of a nonpartitioned table, a partitioned table, a table partition, or a table subpartition. For object tables or relational tables with object columns, use ALTER TABLE to convert the table to the latest definition of its referenced type after the type has been altered (Oracle documentation).
Oracle syntax
ALTER TABLE [ schema. ] table
[ alter_table_properties
| column_clauses
| constraint_clauses
| alter_table_partitioning
| alter_external_table
| move_table_clause
]
[ enable_disable_clause
| { ENABLE | DISABLE } { TABLE LOCK | ALL TRIGGERS }
] ...
;
Note
To review the equivalent Snowflake syntax, see the following documentation.
2. Sample Source Patterns¶
2.1. Alter table with clauses¶
Warning
The memoptimize_read_clause and memoptimize_write_clause are not applicable in Snowflake, so they are removed.
Oracle
ALTER TABLE SOMESCHEMA.SOMENAME
MEMOPTIMIZE FOR READ
MEMOPTIMIZE FOR WRITE
ADD (SOMECOLUMN NUMBER , SOMEOTHERCOLUMN VARCHAR(23))
(PARTITION PT NESTED TABLE COLUMN_VALUE STORE AS SNAME
( SUBPARTITION SPART NESTED TABLE COLUMN_VALUE STORE AS SNAME))
ENABLE TABLE LOCK;
Snowflake
ALTER TABLE SOMESCHEMA.SOMENAME
ADD (SOMECOLUMN NUMBER(38, 18), SOMEOTHERCOLUMN VARCHAR(23));
Note
Only some column_clauses and constraint_clauses are applicable in Snowflake. In Oracle, ALTER TABLE allows modifying the properties of existing partitions, but in Snowflake these actions are not required.
2.2. Alter table with unsupported cases¶
Oracle
ALTER TABLE SOMENAME MODIFY COLUMN SCOLUMN NOT SUBSTITUTABLE AT ALL LEVELS FORCE;
ALTER TABLE SOMENAME MODIFY(SCOLUMN VISIBLE,SCOLUMN INVISIBLE);
ALTER TABLE SOMENAME MODIFY VARRAY VARRAYITEM (
STORAGE(PCTINCREASE 10));
Snowflake
!!!RESOLVE EWI!!! /*** SSC-EWI-0109 - ALTER TABLE SYNTAX NOT APPLICABLE IN SNOWFLAKE ***/!!!
ALTER TABLE SOMENAME
MODIFY COLUMN SCOLUMN NOT SUBSTITUTABLE AT ALL LEVELS FORCE;
!!!RESOLVE EWI!!! /*** SSC-EWI-0109 - ALTER TABLE SYNTAX NOT APPLICABLE IN SNOWFLAKE ***/!!!
ALTER TABLE SOMENAME
MODIFY(SCOLUMN VISIBLE,SCOLUMN INVISIBLE);
!!!RESOLVE EWI!!! /*** SSC-EWI-0109 - ALTER TABLE SYNTAX NOT APPLICABLE IN SNOWFLAKE ***/!!!
ALTER TABLE SOMENAME
MODIFY VARRAY VARRAYITEM (
STORAGE(PCTINCREASE 10));
2.3. ADD CONSTRAINT action¶
The ADD CONSTRAINT action has an equivalent in Snowflake, but only one constraint can be added per ALTER TABLE statement, so the statement is flagged when it contains two or more constraints.
Warning
enable_disable_clause is removed since it is not relevant in Snowflake.
Oracle
-- MULTIPLE CONSTRAINT ADDITION SCENARIO
ALTER TABLE TABLE1 ADD (
CONSTRAINT TABLE1_PK
PRIMARY KEY
(ID)
ENABLE VALIDATE,
CONSTRAINT TABLE1_FK foreign key(ID2)
references TABLE2 (ID) ON DELETE CASCADE);
-- ONLY ONE CONSTRAINT ADDITION SCENARIO
ALTER TABLE TABLE1 ADD (
CONSTRAINT TABLE1_FK foreign key(ID2)
references TABLE2 (ID) ON DELETE CASCADE);
Snowflake
-- MULTIPLE CONSTRAINT ADDITION SCENARIO
!!!RESOLVE EWI!!! /*** SSC-EWI-OR0067 - MULTIPLE CONSTRAINT DEFINITION IN A SINGLE STATEMENT IS NOT SUPPORTED IN SNOWFLAKE. ***/!!!
ALTER TABLE TABLE1
ADD (
CONSTRAINT TABLE1_PK
PRIMARY KEY
(ID) ,
CONSTRAINT TABLE1_FK foreign key(ID2)
references TABLE2 (ID) ON DELETE CASCADE);
-- ONLY ONE CONSTRAINT ADDITION SCENARIO
ALTER TABLE TABLE1
ADD
CONSTRAINT TABLE1_FK foreign key(ID2)
references TABLE2 (ID) ON DELETE CASCADE;
Known Issues¶
Some table properties may be adapted or removed because they are not applicable in Snowflake.
Create Database Link¶
Warning
Currently, the Create Database Link statement is not converted, but it is parsed. Also, if your source code has create database link statements, these are accounted for in the Assessment Report.
Example of a Source Code¶
CREATE PUBLIC DATABASE LINK db_link_name
CONNECT TO CURRENT_USER
USING 'connect string'
CREATE DATABASE LINK db_link_name2
CONNECT TO user_name IDENTIFIED BY user_password
USING 'connect string'
CREATE PUBLIC DATABASE LINK db_link_name3
Snowflake output¶
----** SSC-OOS - OUT OF SCOPE CODE UNIT. CREATE DATABASE LINK IS OUT OF TRANSLATION SCOPE. **
--CREATE PUBLIC DATABASE LINK db_link_name
--CONNECT TO CURRENT_USER
--USING 'connect string'
----** SSC-OOS - OUT OF SCOPE CODE UNIT. CREATE DATABASE LINK IS OUT OF TRANSLATION SCOPE. **
--CREATE DATABASE LINK db_link_name2
--CONNECT TO user_name IDENTIFIED BY user_password
--USING 'connect string'
----** SSC-OOS - OUT OF SCOPE CODE UNIT. CREATE DATABASE LINK IS OUT OF TRANSLATION SCOPE. **
--CREATE PUBLIC DATABASE LINK db_link_name3
Database Link References¶
If your input code uses objects from a database link, the output code keeps the names of those objects, but the name of the database link they use is removed.
Example of a Source Code¶
-- CREATE DATABASE LINK STATEMENTS
CREATE DATABASE LINK mylink1
CONNECT TO user1 IDENTIFIED BY password1
USING 'my_connection_string1';
CREATE DATABASE LINK mylink2
CONNECT TO user2 IDENTIFIED BY password2
USING 'my_connection_string2';
-- SQL statements that use the database links
SELECT * FROM products@mylink1;
INSERT INTO employees@mylink2
(employee_id, last_name, email, hire_date, job_id)
VALUES (999, 'Claus', 'sclaus@oracle.com', SYSDATE, 'SH_CLERK');
UPDATE jobs@mylink2 SET min_salary = 3000
WHERE job_id = 'SH_CLERK';
DELETE FROM employees@mylink2
WHERE employee_id = 999;
-- SQL statement where it uses an object from
-- a database link that is not created
SELECT * FROM products@mylink;
Snowflake output¶
---- CREATE DATABASE LINK STATEMENTS
----** SSC-OOS - OUT OF SCOPE CODE UNIT. CREATE DATABASE LINK IS OUT OF TRANSLATION SCOPE. **
--CREATE DATABASE LINK mylink1
-- CONNECT TO user1 IDENTIFIED BY password1
-- USING 'my_connection_string1'
----** SSC-OOS - OUT OF SCOPE CODE UNIT. CREATE DATABASE LINK IS OUT OF TRANSLATION SCOPE. **
--CREATE DATABASE LINK mylink2
-- CONNECT TO user2 IDENTIFIED BY password2
-- USING 'my_connection_string2'
-- SQL statements that use the database links
SELECT * FROM
!!!RESOLVE EWI!!! /*** SSC-EWI-OR0123 - DBLINK CONNECTIONS NOT SUPPORTED [ DBLINK : mylink1 | USER: user1/password1 | CONNECTION: 'my_connection_string1' ] ***/!!!
products;
INSERT INTO
!!!RESOLVE EWI!!! /*** SSC-EWI-OR0123 - DBLINK CONNECTIONS NOT SUPPORTED [ DBLINK : mylink2 | USER: user2/password2 | CONNECTION: 'my_connection_string2' ] ***/!!!
employees
(employee_id, last_name, email, hire_date, job_id)
VALUES (999, 'Claus', 'sclaus@oracle.com', CURRENT_TIMESTAMP(), 'SH_CLERK');
UPDATE
!!!RESOLVE EWI!!! /*** SSC-EWI-OR0123 - DBLINK CONNECTIONS NOT SUPPORTED [ DBLINK : mylink2 | USER: user2/password2 | CONNECTION: 'my_connection_string2' ] ***/!!!
jobs
SET min_salary = 3000
WHERE job_id = 'SH_CLERK';
DELETE FROM
!!!RESOLVE EWI!!! /*** SSC-EWI-OR0123 - DBLINK CONNECTIONS NOT SUPPORTED [ DBLINK : mylink2 | USER: user2/password2 | CONNECTION: 'my_connection_string2' ] ***/!!!
employees
WHERE employee_id = 999;
-- SQL statement where it uses an object from
-- a database link that is not created
--** SSC-FDM-0007 - MISSING DEPENDENT OBJECT "mylink" **
SELECT * FROM
!!!RESOLVE EWI!!! /*** SSC-EWI-OR0123 - DBLINK CONNECTIONS NOT SUPPORTED [ DBLINK : mylink | USER: / | CONNECTION: ] ***/!!!
products;
Related EWIs¶
SSC-EWI-OR0123: Db Link connections not supported.
SSC-FDM-0007: Element with missing dependencies.
Drop Table¶
Note
Some parts in the output code are omitted for clarity reasons.
Description¶
A Drop Table statement is used to remove a table. This statement varies slightly between Oracle and Snowflake; please double-check each database's documentation for more information regarding the differences.
In Oracle, the Drop Table syntax is:
DROP TABLE <table_name> [ CASCADE CONSTRAINTS ] [ PURGE ]
In Snowflake, the Drop table syntax is:
DROP TABLE [ IF EXISTS ] <table_name> [ CASCADE | RESTRICT ]
The main difference is that Snowflake has no equivalent for the PURGE clause, since the table is not permanently removed from the system. However, the CASCADE CONSTRAINTS and CASCADE clauses are equivalent: both drop the table even if foreign keys exist that reference it.
Examples¶
Now, let’s see some code examples and what they look like after being transformed. Each example uses a different variation of the Drop Table statement.
Example 1:¶
This example uses the Drop Table statement in its simplest form.
Input Code:
DROP TABLE TEST_TABLE1;
Transformed Code:
DROP TABLE TEST_TABLE1;
Example 2:¶
This example uses the Drop Table statement with the PURGE clause. Remember there is no equivalent in Snowflake for the PURGE clause inside a Drop Table statement.
Input Code:
DROP TABLE TEST_TABLE1 PURGE;
Transformed Code:
DROP TABLE TEST_TABLE1;
Example 3:¶
This example uses the Drop Table statement with the CASCADE CONSTRAINTS clause.
Input Code:
DROP TABLE TEST_TABLE1 CASCADE CONSTRAINTS;
Transformed Code:
DROP TABLE TEST_TABLE1 CASCADE;
In the transformed code, the CONSTRAINTS word is removed from the CASCADE CONSTRAINTS clause.
Example 4:¶
This example uses the Drop Table statement with the CASCADE CONSTRAINTS and the PURGE clauses.
Input Code:
DROP TABLE TEST_TABLE1 CASCADE CONSTRAINTS PURGE;
Transformed Code:
DROP TABLE TEST_TABLE1 CASCADE;
As seen, the code changes. In the new Snowflake code, the PURGE clause is removed and the CONSTRAINTS word is also removed from the CASCADE clause.
Functional Equivalence¶
Run the following code to check for functional equivalence. Bear in mind that the only part that is not equivalent is the PURGE clause, which in Oracle removes the table from the system entirely and has no equivalent in Snowflake. In both cases, the table is dropped even if it is referenced by another table.
Oracle:
CREATE TABLE TEST_TABLE2 (
col2 INTEGER,
CONSTRAINT constraint_name PRIMARY KEY (col2)
);
CREATE TABLE OTHER_TABLE (
other_col INTEGER REFERENCES TEST_TABLE2 (col2)
);
DROP TABLE TEST_TABLE2 CASCADE CONSTRAINTS PURGE;
Snowflake:
CREATE OR REPLACE TABLE TEST_TABLE2 (
col2 INTEGER,
CONSTRAINT constraint_name PRIMARY KEY (col2)
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
CREATE OR REPLACE TABLE OTHER_TABLE (
other_col INTEGER REFERENCES TEST_TABLE2 (col2)
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
DROP TABLE TEST_TABLE2 CASCADE;
Related EWIs¶
No related EWIs.
Create Index¶
Warning
Currently, the Create Index statement is not converted, but it is parsed. Also, if your source code has create index statements, these are accounted for in the Assessment Report.
Example of a create index parsed code:¶
CREATE UNIQUE INDEX COL1_INDEX ILM (ADD POLICY OPTIMIZE AFTER 10 DAYS OF NO ACCESS) ON CLUSTER CLUSTER1
ONLINE USABLE DEFERRED INVALIDATION;
CREATE BITMAP INDEX COL1_INDEX ILM (ADD POLICY OPTIMIZE ( ON FUNC1 )) ON TABLE1 AS TAB1 (COL1 ASC) GLOBAL PARTITION BY RANGE (COL1, COL2) ( PARTITION VALUES LESS THAN (MAXVALUE) ) UNUSABLE IMMEDIATE INVALIDATION;
CREATE MULTIVALUE INDEX COL1_INDEX ILM (ADD POLICY SEGMENT TIER TO LOW_COST_TBS) ON TABLE1( TAB1 COL1 DESC, TAB1 COL2 ASC) FROM TABLE1 AS TAB1 WHERE COL1 > 0 LOCAL STORE IN (STORAGE1)
VISIBLE USABLE DEFERRED INVALIDATION;
CREATE INDEX COL1_INDEX ILM (DELETE POLICY POLICY1) ON CLUSTER CLUSTER1
PCTFREE 10
LOGGING
ONLINE
TABLESPACE DEFAULT
NOCOMPRESS
SORT
REVERSE
VISIBLE
INDEXING PARTIAL
NOPARALLEL;
CREATE INDEX COL1_INDEX ILM (DELETE_ALL) ON TABLE1 AS TAB1 (COL1 ASC) LOCAL (
PARTITION PARTITION1 TABLESPACE TABLESPACE1 NOCOMPRESS USABLE) DEFERRED INVALIDATION;
CREATE INDEX COL1_INDEX ON TABLE1 (COL1 ASC) GLOBAL
PARTITION BY HASH (COL1, COL2) (PARTITION PARTITION1 LOB(LOB1) STORE AS BASICFILE LOB_NAME (TABLESPACE TABLESPACE1)) USABLE IMMEDIATE INVALIDATION;
CREATE INDEX COL1_INDEX ON TABLE1 (COL1 DESC, COL2 ASC) INDEXTYPE IS INDEXTYPE1 LOCAL ( PARTITION PARTITION1 PARAMETERS('PARAMS')) NOPARALLEL PARAMETERS('PARAMS') USABLE DEFERRED INVALIDATION;
CREATE INDEX COL1_INDEX ON TABLE1 (COL1 ASC) INDEXTYPE IS XDB.XMLINDEX LOCAL ( PARTITION PARTITION1) PARALLEL 6 UNUSABLE IMMEDIATE INVALIDATION;
Note
Due to architectural reasons, Snowflake does not support indexes, so SnowConvert removes all code related to the creation of indexes. Snowflake automatically creates micro-partitions for every table, which help speed up the performance of DML operations, so the user does not have to worry about creating or managing them.
Usually, this is enough for exceptionally good query performance. However, performance can be improved further by creating data clustering keys, as sketched below. Snowflake’s official documentation provides more information about micro-partitions and data clustering.
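As a brief, illustrative sketch of the clustering-key alternative mentioned above (the table and column names are hypothetical, not SnowConvert output), a clustering key can be declared at creation time or added to an existing table:
-- Hypothetical Snowflake example: a clustering key defined where an Oracle index existed
CREATE OR REPLACE TABLE TABLE1 (
    COL1 NUMBER(38, 0),
    COL2 VARCHAR(100)
)
CLUSTER BY (COL1);

-- A clustering key can also be added to an existing table
ALTER TABLE TABLE1 CLUSTER BY (COL1);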
Create Materialized Views¶
Description ¶
In SnowConvert, Oracle Materialized Views are transformed into Snowflake Dynamic Tables. To properly configure Dynamic Tables, two essential parameters must be defined: TARGET_LAG and WAREHOUSE. If these parameters are left unspecified in the configuration options, SnowConvert will default to preassigned values during the conversion, as demonstrated in the example below.
For more information on Materialized Views, click here.
For details on the necessary parameters for Dynamic Tables, click here.
Sample Source Patterns¶
Oracle
CREATE MATERIALIZED VIEW sales_total
AS
SELECT SUM(amount) AS total_sales
FROM sales;
Snowflake
CREATE OR REPLACE DYNAMIC TABLE sales_total
--** SSC-FDM-0031 - DYNAMIC TABLE REQUIRED PARAMETERS SET BY DEFAULT **
TARGET_LAG='1 day'
WAREHOUSE=UPDATE_DUMMY_WAREHOUSE
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"oracle"}}'
AS
SELECT SUM(amount) AS total_sales
FROM
sales;
Refresh Modes¶
Snowflake dynamic tables support an equivalent to Oracle’s materialized view refresh modes. The corresponding modes are as follows:
Oracle:
FAST: Refreshes only the rows that have changed.
COMPLETE: Refreshes the entire materialized view.
FORCE: Uses FAST if possible, otherwise uses COMPLETE.
Snowflake:
AUTO: Automatically determines the best refresh method.
FULL: Refreshes the entire table, equivalent to Oracle’s COMPLETE mode.
INCREMENTAL: Refreshes only the changed rows.
Default Refresh Mode¶
When using SnowConvert, the dynamic table’s default refresh mode is AUTO.
Mode Mappings¶
Oracle FAST and FORCE -> Snowflake AUTO
Oracle COMPLETE -> Snowflake FULL
For more details, refer to the official documentation on Oracle Refresh Modes and Snowflake Refresh Modes.
Oracle
CREATE MATERIALIZED VIEW CUSTOMER_SALES_SUMMARY
REFRESH COMPLETE
AS
SELECT
CUSTOMER_ID,
SUM(AMOUNT) AS TOTAL_AMOUNT
FROM
SALES
GROUP BY
CUSTOMER_ID;
Snowflake
CREATE OR REPLACE DYNAMIC TABLE CUSTOMER_SALES_SUMMARY
--** SSC-FDM-0031 - DYNAMIC TABLE REQUIRED PARAMETERS SET BY DEFAULT **
TARGET_LAG='1 day'
WAREHOUSE=UPDATE_DUMMY_WAREHOUSE
REFRESH_MODE=FULL
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"oracle"}}'
AS
SELECT
CUSTOMER_ID,
SUM(AMOUNT) AS TOTAL_AMOUNT
FROM
SALES
GROUP BY
CUSTOMER_ID;
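As a sketch of the FAST/FORCE mapping listed above, a materialized view declared with REFRESH FAST is expected to land on the AUTO refresh mode. The Snowflake snippet below is illustrative only and reuses the default parameter values shown earlier; since AUTO is also the default refresh mode, the converted output may simply omit REFRESH_MODE.
Oracle
CREATE MATERIALIZED VIEW SALES_BY_REGION
REFRESH FAST
AS
SELECT
    REGION_ID,
    SUM(AMOUNT) AS TOTAL_AMOUNT
FROM
    SALES
GROUP BY
    REGION_ID;
Snowflake
CREATE OR REPLACE DYNAMIC TABLE SALES_BY_REGION
TARGET_LAG='1 day'
WAREHOUSE=UPDATE_DUMMY_WAREHOUSE
REFRESH_MODE=AUTO
AS
SELECT
    REGION_ID,
    SUM(AMOUNT) AS TOTAL_AMOUNT
FROM
    SALES
GROUP BY
    REGION_ID;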
Known Issues¶
No known errors detected at this time.
Related EWIs¶
SSC-FDM-0031: Dynamic Table required parameters set by default.
Create Sequence¶
Let’s first see a code example, and what it would look like after it has been transformed.
Input Code:¶
CREATE SEQUENCE SequenceSample
START WITH 1000
INCREMENT BY 1
NOCACHE
NOCYCLE;
Output Code:¶
CREATE OR REPLACE SEQUENCE SequenceSample
START WITH 1000
INCREMENT BY 1
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}';
The first change is that the schema or data warehouse name is applied to the name of the sequence. The second transformation consists of removing some elements and adding them as comments, since Oracle has elements in CREATE SEQUENCE that are not supported in Snowflake.
In Oracle, after the name of the Sequence, the elements that are NOT commented are the following:
START WITH 1000
INCREMENT BY 1
If the element is not one of those, it is commented out and a warning is added just before the CREATE SEQUENCE, as in the example.
The following elements are removed:
MAXVALUE
NOMAXVALUE
MINVALUE
NOMINVALUE
CYCLE
NOCYCLE
CACHE
NOCACHE
ORDER
NOORDER
KEEP
NOKEEP
SESSION
GLOBAL
SCALE EXTEND
SCALE NOEXTEND
NOSCALE
SHARD EXTEND
SHARD NOEXTEND
NOSHARD
SEQUENCE EXPRESSIONS¶
NEXTVAL: the Snowflake grammar is the same as the Oracle one.
CURRVAL: Snowflake does not have an equivalent, so it is transformed to a stub function. Check this link to understand Snowflake’s approach.
Input Code:¶
select seq1.nextval from dual;
select seq1.currval from dual;
Output Code:¶
select seq1.nextval from dual;
select
!!!RESOLVE EWI!!! /*** SSC-EWI-OR0069 - THE SEQUENCE CURRVAL PROPERTY IS NOT SUPPORTED IN SNOWFLAKE. ***/!!! seq1.currval from dual;
Sequence START WITH¶
The START WITH value may exceed the maximum value allowed by Snowflake. The Snowflake documentation states the following about the start value: "Specifies the first value returned by the sequence. Supported values are any value that can be represented by a 64-bit two’s complement integer (from -2^63 to 2^63-1)." Accordingly, the maximum start value allowed is 9223372036854775807 and the minimum is -9223372036854775808.
Example Code¶
Input Code:¶
CREATE SEQUENCE SEQUENCE1
START WITH 9223372036854775808;
CREATE SEQUENCE SEQUENCE2
START WITH -9223372036854775809;
Output Code:¶
CREATE OR REPLACE SEQUENCE SEQUENCE1
!!!RESOLVE EWI!!! /*** SSC-EWI-OR0068 - SEQUENCE START VALUE EXCEEDS THE MAX VALUE ALLOWED BY SNOWFLAKE. ***/!!!
START WITH 9223372036854775808
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}';
CREATE OR REPLACE SEQUENCE SEQUENCE2
!!!RESOLVE EWI!!! /*** SSC-EWI-OR0068 - SEQUENCE START VALUE EXCEEDS THE MAX VALUE ALLOWED BY SNOWFLAKE. ***/!!!
START WITH -9223372036854775809
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}';
Related EWIs¶
SSC-EWI-OR0069: The sequence CURRVAL property is not supported in Snowflake.
SSC-EWI-OR0068: The sequence start value exceeds the max value allowed by Snowflake.
Alter Session¶
Alter session¶
Alter Session has an equivalent in Snowflake, and some of the variables are mapped to Snowflake session parameters. If a variation of Alter Session is not supported, the node is commented out and a warning is added.
Oracle Code:¶
alter session set nls_date_format = 'DD-MM-YYYY';
Snowflake Code:¶
ALTER SESSION SET DATE_INPUT_FORMAT = 'DD-MM-YYYY' DATE_OUTPUT_FORMAT = 'DD-MM-YYYY';
Session Parameters Reference¶
Note
Session parameters that do not appear in the table are not currently being transformed.
| Session Parameter | Snowflake transformation |
|---|---|
| NLS_DATE_FORMAT | DATE_INPUT_FORMAT and DATE_OUTPUT_FORMAT |
| NLS_NUMERIC_CHARACTERS | NOT SUPPORTED |
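As a hypothetical illustration, an input that sets a parameter with no Snowflake mapping (such as NLS_NUMERIC_CHARACTERS from the table above) is expected to be left untranslated and flagged, as described at the beginning of this section:
-- Hypothetical Oracle input; this parameter has no Snowflake equivalent
alter session set nls_numeric_characters = ',.';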
Known Issues ¶
No issues were found.
Related EWIs ¶
No related EWIs.
Create Synonym¶
Note
Some parts in the output code are omitted for clarity reasons.
Create Synonym¶
Synonyms are not supported in Snowflake. References to a synonym are replaced with the original object name.
Input Code:¶
CREATE OR REPLACE SYNONYM B.TABLITA_SYNONYM FOR TABLITA;
Output Code:¶
----** SSC-FDM-OR0005 - SYNONYMS NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS SYNONYM WERE CHANGED BY THE ORIGINAL OBJECT NAME. **
--CREATE OR REPLACE SYNONYM B.TABLITA_SYNONYM FOR TABLITA
;
Example 1: Synonym that refers to a table.¶
Oracle source code:
CREATE TABLE TABLITA
(
COLUMN1 NUMBER
);
CREATE OR REPLACE SYNONYM B.TABLITA_SYNONYM FOR TABLITA;
SELECT * FROM B.TABLITA_SYNONYM WHERE B.TABLITA_SYNONYM.COLUMN1 = 20;
Snowflake migrated code: you’ll notice that the SELECT originally referred to a synonym, but now it refers to the table that the synonym points to.
CREATE OR REPLACE TABLE TABLITA
(
COLUMN1 NUMBER(38, 18) /*** SSC-FDM-0006 - NUMBER TYPE COLUMN MAY NOT BEHAVE SIMILARLY IN SNOWFLAKE. ***/
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
-- --** SSC-FDM-OR0005 - SYNONYMS NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS SYNONYM WERE CHANGED BY THE ORIGINAL OBJECT NAME. **
-- CREATE OR REPLACE SYNONYM B.TABLITA_SYNONYM FOR TABLITA
;
SELECT * FROM
TABLITA
WHERE
TABLITA.COLUMN1 = 20;
Example 2: Synonym that refers to another synonym.¶
Oracle source code:
CREATE TABLE TABLITA
(
COLUMN1 NUMBER
);
CREATE OR REPLACE SYNONYM B.TABLITA_SYNONYM FOR TABLITA;
CREATE OR REPLACE SYNONYM C.TABLITA_SYNONYM2 FOR B.TABLITA_SYNONYM;
SELECT * FROM C.TABLITA_SYNONYM2 WHERE C.TABLITA_SYNONYM2.COLUMN1 = 20;
UPDATE C.TABLITA_SYNONYM2 SET COLUMN1 = 10;
INSERT INTO C.TABLITA_SYNONYM2 VALUES (1);
Snowflake migrated code: you’ll notice that the SELECT, UPDATE, and INSERT originally referred to a synonym, and now they refer to the atomic object, which is a table.
CREATE OR REPLACE TABLE TABLITA
(
COLUMN1 NUMBER(38, 18) /*** SSC-FDM-0006 - NUMBER TYPE COLUMN MAY NOT BEHAVE SIMILARLY IN SNOWFLAKE. ***/
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
-- --** SSC-FDM-OR0005 - SYNONYMS NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS SYNONYM WERE CHANGED BY THE ORIGINAL OBJECT NAME. **
-- CREATE OR REPLACE SYNONYM B.TABLITA_SYNONYM FOR TABLITA
;
-- --** SSC-FDM-OR0005 - SYNONYMS NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS SYNONYM WERE CHANGED BY THE ORIGINAL OBJECT NAME. **
--CREATE OR REPLACE SYNONYM C.TABLITA_SYNONYM2 FOR B.TABLITA_SYNONYM
;
SELECT * FROM
TABLITA
WHERE
TABLITA.COLUMN1 = 20;
UPDATE TABLITA
SET COLUMN1 = 10;
INSERT INTO TABLITA
VALUES (1);
Example 3: Synonym that refers to a view¶
Oracle Source Code
CREATE OR REPLACE SYNONYM B.TABLITA_SYNONYM FOR TABLITA;
CREATE OR REPLACE SYNONYM C.TABLITA_SYNONYM2 FOR B.TABLITA_SYNONYM;
CREATE VIEW VIEW_ORGINAL AS SELECT * FROM C.TABLITA_SYNONYM2;
CREATE OR REPLACE SYNONYM VIEW_SYNONYM FOR VIEW_ORGINAL;
SELECT * FROM VIEW_SYNONYM;
Snowflake migrated code: you’ll notice that the SELECT originally referred to a synonym, and now it refers to the atomic object, which is a view.
----** SSC-FDM-OR0005 - SYNONYMS NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS SYNONYM WERE CHANGED BY THE ORIGINAL OBJECT NAME. **
--CREATE OR REPLACE SYNONYM B.TABLITA_SYNONYM FOR TABLITA
;
----** SSC-FDM-OR0005 - SYNONYMS NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS SYNONYM WERE CHANGED BY THE ORIGINAL OBJECT NAME. **
--CREATE OR REPLACE SYNONYM C.TABLITA_SYNONYM2 FOR B.TABLITA_SYNONYM
;
CREATE OR REPLACE VIEW VIEW_ORGINAL
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
AS
--** SSC-FDM-0001 - VIEWS SELECTING ALL COLUMNS FROM A SINGLE TABLE ARE NOT REQUIRED IN SNOWFLAKE AND MAY IMPACT PERFORMANCE. **
SELECT * FROM
TABLITA;
----** SSC-FDM-OR0005 - SYNONYMS NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS SYNONYM WERE CHANGED BY THE ORIGINAL OBJECT NAME. **
--CREATE OR REPLACE SYNONYM VIEW_SYNONYM FOR VIEW_ORGINAL
;
SELECT * FROM
VIEW_ORGINAL;
Synonym transformation can be enabled or disabled from the UI settings.
Related EWIs¶
SSC-FDM-0001: Views selecting all columns from a single table are not required in Snowflake.
SSC-FDM-0006: Number type column may not behave similarly in Snowflake.
SSC-FDM-OR0005: Synonyms are not supported in Snowflake but references to this synonym were changed by the original object name.
Create Table¶
1. Description¶
In Oracle, the CREATE TABLE statement is used to create one of the following types of tables: a relational table, which is the basic structure to hold user data, or an object table, which is a table that uses an object type for a column definition (Oracle documentation).
Oracle syntax
CREATE [ { GLOBAL | PRIVATE } TEMPORARY | SHARDED | DUPLICATED | [ IMMUTABLE ] BLOCKCHAIN
| IMMUTABLE ]
TABLE
[ schema. ] table
[ SHARING = { METADATA | DATA | EXTENDED DATA | NONE } ]
{ relational_table | object_table | XMLType_table }
[ MEMOPTIMIZE FOR READ ]
[ MEMOPTIMIZE FOR WRITE ]
[ PARENT [ schema. ] table ] ;
Snowflake Syntax
CREATE [ OR REPLACE ]
[ { [ { LOCAL | GLOBAL } ] TEMP | TEMPORARY | VOLATILE | TRANSIENT } ]
TABLE [ IF NOT EXISTS ] <table_name> (
-- Column definition
<col_name> <col_type>
[ inlineConstraint ]
[ NOT NULL ]
[ COLLATE '<collation_specification>' ]
[
{
DEFAULT <expr>
| { AUTOINCREMENT | IDENTITY }
[
{
( <start_num> , <step_num> )
| START <num> INCREMENT <num>
}
]
[ { ORDER | NOORDER } ]
}
]
[ [ WITH ] MASKING POLICY <policy_name> [ USING ( <col_name> , <cond_col1> , ... ) ] ]
[ [ WITH ] TAG ( <tag_name> = '<tag_value>' [ , <tag_name> = '<tag_value>' , ... ] ) ]
[ COMMENT '<string_literal>' ]
-- Additional column definitions
[ , <col_name> <col_type> [ ... ] ]
-- Out-of-line constraints
[ , outoflineConstraint [ ... ] ]
)
[ CLUSTER BY ( <expr> [ , <expr> , ... ] ) ]
[ ENABLE_SCHEMA_EVOLUTION = { TRUE | FALSE } ]
[ STAGE_FILE_FORMAT = (
{ FORMAT_NAME = '<file_format_name>'
| TYPE = { CSV | JSON | AVRO | ORC | PARQUET | XML } [ formatTypeOptions ]
} ) ]
[ STAGE_COPY_OPTIONS = ( copyOptions ) ]
[ DATA_RETENTION_TIME_IN_DAYS = <integer> ]
[ MAX_DATA_EXTENSION_TIME_IN_DAYS = <integer> ]
[ CHANGE_TRACKING = { TRUE | FALSE } ]
[ DEFAULT_DDL_COLLATION = '<collation_specification>' ]
[ COPY GRANTS ]
[ COMMENT = '<string_literal>' ]
[ [ WITH ] ROW ACCESS POLICY <policy_name> ON ( <col_name> [ , <col_name> ... ] ) ]
[ [ WITH ] TAG ( <tag_name> = '<tag_value>' [ , <tag_name> = '<tag_value>' , ... ] ) ]
Note
For more information on the Snowflake syntax, review the following documentation.
2. Sample Source Patterns¶
2.1. Physical and Table Properties¶
Oracle
CREATE TABLE "MySchema"."BaseTable"
(
BaseId NUMBER DEFAULT 10 NOT NULL ENABLE
) SEGMENT CREATION IMMEDIATE
PCTFREE 0 PCTUSED 40 INITRANS 1 MAXTRANS 255
COLUMN STORE COMPRESS FOR QUERY HIGH NO ROW LEVEL LOCKING LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1
BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
TABLESPACE "MyTableSpace"
PARTITION BY LIST ("BaseId")
(
PARTITION "P20211231" VALUES (20211231) SEGMENT CREATION DEFERRED
PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255
ROW STORE COMPRESS ADVANCED LOGGING
STORAGE(
BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
TABLESPACE "MyTableSpace"
)
PARALLEL;
Snowflake
CREATE OR REPLACE TABLE "MySchema"."BaseTable"
(
BaseId NUMBER(38, 18) /*** SSC-FDM-0006 - NUMBER TYPE COLUMN MAY NOT BEHAVE SIMILARLY IN SNOWFLAKE. ***/ DEFAULT 10 NOT NULL
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
Note
Table properties are removed because they are not required after the migration in Snowflake.
2.2. Constraints and Constraint States¶
The following constraints will be commented out:
CHECK Constraint
Note
The USING INDEX constraint will be entirely removed from the output code during the conversion.
Oracle
CREATE TABLE "MySchema"."BaseTable"
(
BaseId NUMBER DEFAULT 10 NOT NULL ENABLE NOVALIDATE,
"COL1" NUMBER CHECK( "COL1" IS NOT NULL ),
CHECK( "COL1" IS NOT NULL ),
CONSTRAINT "Constraint1BaseTable" PRIMARY KEY (BaseId)
USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1) ENABLE
);
Snowflake
CREATE OR REPLACE TABLE "MySchema"."BaseTable"
(
BaseId NUMBER(38, 18) /*** SSC-FDM-0006 - NUMBER TYPE COLUMN MAY NOT BEHAVE SIMILARLY IN SNOWFLAKE. ***/ DEFAULT 10 NOT NULL,
"COL1" NUMBER(38, 18) /*** SSC-FDM-0006 - NUMBER TYPE COLUMN MAY NOT BEHAVE SIMILARLY IN SNOWFLAKE. ***/ NOT NULL
!!!RESOLVE EWI!!! /*** SSC-EWI-0035 - CHECK STATEMENT NOT SUPPORTED ***/!!!
CHECK( "COL1" IS NOT NULL ),
!!!RESOLVE EWI!!! /*** SSC-EWI-0035 - CHECK STATEMENT NOT SUPPORTED ***/!!!
CHECK( "COL1" IS NOT NULL ),
CONSTRAINT "Constraint1BaseTable" PRIMARY KEY (BaseId)
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"oracle"}}'
;
Similarly, if you have any of the following constraint states after a NOT NULL constraint:
RELY
NO RELY
RELY ENABLE
RELY DISABLE
VALIDATE
NOVALIDATE
These will also be commented out.
Note
The ENABLE constraint state will be completely removed from the output code during the conversion process. In the case of the DISABLE state, the NOT NULL constraint is removed along with it.
Oracle
CREATE TABLE Table1(
col1 INT NOT NULL ENABLE,
col2 INT NOT NULL DISABLE,
col3 INT NOT NULL RELY
);
Snowflake
CREATE OR REPLACE TABLE Table1 (
col1 INT NOT NULL,
col2 INT ,
col3 INT NOT NULL /*** SSC-FDM-OR0006 - CONSTRAINT STATE RELY REMOVED FROM NOT NULL INLINE CONSTRAINT ***/
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
2.3. Foreign Key¶
If a table has a NUMBER column with no precision or scale, and another table has a NUMBER(*,0) column whose foreign key references that NUMBER column, the foreign key will be commented out.
Oracle
CREATE TABLE "MySchema"."MyTable"
(
"COL1" NUMBER,
CONSTRAINT "PK" PRIMARY KEY ("COL1")
);
Snowflake
CREATE OR REPLACE TABLE "MySchema"."MyTable"
(
"COL1" NUMBER(38, 18) /*** SSC-FDM-0006 - NUMBER TYPE COLUMN MAY NOT BEHAVE SIMILARLY IN SNOWFLAKE. ***/,
CONSTRAINT "PK" PRIMARY KEY ("COL1")
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
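The sample above shows only the referenced table. The following hypothetical Oracle input (the child table and constraint names are illustrative) completes the scenario described: a NUMBER(*,0) column whose foreign key references the plain NUMBER column, which is the case where the foreign key would be commented out.
-- Hypothetical Oracle input illustrating the scenario described above
CREATE TABLE "MySchema"."MyChildTable"
(
    "COL2" NUMBER(*,0),
    CONSTRAINT "FK1" FOREIGN KEY ("COL2") REFERENCES "MySchema"."MyTable" ("COL1")
);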
2.4. Virtual Column¶
Oracle
CREATE TABLE "MySchema"."MyTable"
(
"COL1" NUMBER GENERATED ALWAYS AS (COL1 * COL2) VIRTUAL
);
Snowflake
CREATE OR REPLACE TABLE "MySchema"."MyTable"
(
"COL1" NUMBER(38, 18) /*** SSC-FDM-0006 - NUMBER TYPE COLUMN MAY NOT BEHAVE SIMILARLY IN SNOWFLAKE. ***/ AS (COL1 * COL2)
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
2.5. Identity Column¶
For identity columns, a sequence is created and assigned to the column.
Oracle
CREATE TABLE "MySchema"."BaseTable"
(
"COL0" NUMBER GENERATED BY DEFAULT ON NULL
AS IDENTITY MINVALUE 1 MAXVALUE 9999999999999999999999999999
INCREMENT BY 1
START WITH 621
CACHE 20
NOORDER NOCYCLE NOT NULL ENABLE
);
Snowflake
CREATE OR REPLACE SEQUENCE MySchema.BaseTable_COL0
INCREMENT BY 1
START WITH 621
COMMENT = 'FOR TABLE-COLUMN "MySchema.BaseTable".COL0';
CREATE OR REPLACE TABLE "MySchema"."BaseTable" (
"COL0" NUMBER(38, 18) /*** SSC-FDM-0006 - NUMBER TYPE COLUMN MAY NOT BEHAVE SIMILARLY IN SNOWFLAKE. ***/ DEFAULT MySchema.BaseTable_COL0.NEXTVAL NOT NULL
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
2.6. CLOB and BLOB column declaration¶
Columns declared as CLOB will be changed to VARCHAR, and columns declared as BLOB will be changed to BINARY.
Oracle
CREATE TABLE T
(
Col1 BLOB DEFAULT EMPTY_BLOB(),
Col5 CLOB DEFAULT EMPTY_CLOB()
);
Snowflake
CREATE OR REPLACE TABLE T
(
Col1 BINARY,
Col5 VARCHAR
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
2.7. Constraint Name¶
Warning
The constraint name is removed from the code because it is not applicable in Snowflake.
Oracle
CREATE TABLE "CustomSchema"."BaseTable"(
"PROPERTY" VARCHAR2(64) CONSTRAINT "MICROSOFT_NN_PROPERTY" NOT NULL ENABLE
);
Snowflake
CREATE OR REPLACE TABLE "CustomSchema"."BaseTable" (
"PROPERTY" VARCHAR(64) NOT NULL
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
2.8. Default columns with times¶
Default values involving dates and times will be cast to match the declared column type.
Oracle
CREATE TABLE TABLE1
(
"COL1" VARCHAR(50) DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE TABLE1
(
COL0 TIMESTAMP(6) DEFAULT CURRENT_TIMESTAMP,
COL1 TIMESTAMP(6) DEFAULT CURRENT_TIME,
COL2 TIMESTAMP(6) WITH LOCAL TIME ZONE DEFAULT '1900-01-01 12:00:00',
COL3 TIMESTAMP(6) WITH TIME ZONE DEFAULT '1900-01-01 12:00:00',
COL4 TIMESTAMP(6) WITHOUT TIME ZONE DEFAULT '1900-01-01 12:00:00',
COL5 TIMESTAMP(6) DEFAULT TO_TIMESTAMP('01/01/1900 12:00:00.000000 AM', 'MM/DD/YYYY HH:MI:SS.FF6 AM')
);
Snowflake
CREATE OR REPLACE TABLE TABLE1
(
"COL1" VARCHAR(50) DEFAULT TO_VARCHAR(CURRENT_TIMESTAMP(), 'YYYY-MM-DD HH:MI:SS')
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
--** SSC-FDM-0019 - SEMANTIC INFORMATION COULD NOT BE LOADED FOR TABLE1. CHECK IF THE NAME IS INVALID OR DUPLICATED. **
CREATE OR REPLACE TABLE TABLE1
(
COL0 TIMESTAMP(6) DEFAULT CURRENT_TIMESTAMP() :: TIMESTAMP(6),
COL1 TIMESTAMP(6) DEFAULT CURRENT_TIME() :: TIMESTAMP(6),
COL2 TIMESTAMP_LTZ(6) DEFAULT '1900-01-01 12:00:00' :: TIMESTAMP_LTZ(6),
COL3 TIMESTAMP_TZ(6) DEFAULT '1900-01-01 12:00:00' :: TIMESTAMP_TZ(6),
COL4 TIMESTAMP(6) WITHOUT TIME ZONE DEFAULT '1900-01-01 12:00:00' :: TIMESTAMP(6) WITHOUT TIME ZONE,
COL5 TIMESTAMP(6) DEFAULT TO_TIMESTAMP('01/01/1900 12:00:00.000000 AM', 'MM/DD/YYYY HH:MI:SS.FF6 AM') :: TIMESTAMP(6)
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
2.10 AS SubQuery¶
The following properties and clauses are not supported when creating a table through AS SubQuery in Snowflake.
[ immutable_table_clauses ]
[ blockchain_table_clauses ]
[ DEFAULT COLLATION collation_name ]
[ ON COMMIT { DROP | PRESERVE } DEFINITION ]
[ ON COMMIT { DELETE | PRESERVE } ROWS ]
[ physical_properties ]
Oracle
create table table1
-- NO DROP NO DELETE HASHING USING sha2_512 VERSION v1 -- blockchain_clause not yet supported
DEFAULT COLLATION somename
ON COMMIT DROP DEFINITION
ON COMMIT DELETE ROWS
COMPRESS
NOLOGGING
AS
select
*
from
table1;
Snowflake
CREATE OR REPLACE TABLE table1
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"oracle"}}'
-- NO DROP NO DELETE HASHING USING sha2_512 VERSION v1 -- blockchain_clause not yet supported
AS
select
*
from
table1;
Known Issues¶
Some table properties may be adapted or commented out because the behavior in Snowflake is different.
Related EWIs¶
SSC-EWI-0035: Check statement not supported.
SSC-FDM-0006: Number type column may not behave similarly in Snowflake.
SSC-FDM-0019: Semantic information could not be loaded.
SSC-FDM-OR0042: Date Type Transformed To Timestamp Has A Different Behavior.
SSC-FDM-OR0006: Constraint state removed from not null inline constraint.
Create View¶
Note
Some parts in the output code are omitted for clarity reasons.
Create View¶
CREATE OR REPLACE VIEW View1 AS SELECT Column1 from Schema1.Table1;
CREATE OR REPLACE VIEW View1
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
AS
SELECT Column1 from
Schema1.Table1;
The following clauses for Create View are removed:
No Force/ Force
Edition Clause
Sharing Clause
Default collation
Bequeath clause
Container clause
CREATE OR REPLACE
NO FORCE
NONEDITIONABLE
VIEW Schema1.View1
SHARING = DATA
DEFAULT COLLATION Collation1
BEQUEATH CURRENT_USER
AS SELECT Column1 from Schema1.Table1
CONTAINER_MAP;
CREATE OR REPLACE VIEW Schema1.View1
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
AS
SELECT Column1 from
Schema1.Table1;
Alter View¶
Alter View is not supported by SnowConvert yet.
Drop View¶
The CASCADE CONSTRAINT clause is not supported yet.
DROP VIEW Schema1.View1;
DROP VIEW Schema1.View1
CASCADE CONSTRAINTS;
DROP VIEW Schema1.View1;
DROP VIEW Schema1.View1
CASCADE CONSTRAINTS !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'DropBehavior' NODE ***/!!!;
Related EWIs¶
SSC-EWI-0073: Pending Functional Equivalence Review.
Create Type¶
General Description¶
One of the most important features the Oracle database engine offers is an object-oriented approach. PL/SQL offers capabilities beyond other relational databases in the form of OOP, using Java-like statements such as packages, functions, tables, and types. This document covers the last one, types, and how SnowConvert handles them while remaining functionally compliant.
Oracle supports the following specifications:
Abstract Data Type (ADT) (including an SQLJ object type).
Standalone varying array (varray) type.
Standalone nested table type.
Incomplete object type.
All of this according to the information found in the Oracle Create Type Statement Documentation.
CREATE [ OR REPLACE ] [ EDITIONABLE | NONEDITIONABLE ] TYPE <type name>
[ <type source creation options> ]
[<type definition>]
[ <type properties> ]
Limitations¶
Snowflake doesn’t support user-defined data types, according to its online documentation on Unsupported Data Types, but it does support Semi-structured Data Types, which can be used to mimic the hierarchy-like structure of most user-defined types. Even so, there are multiple type features that have no workaround.
The following are the user-defined type features for which NO workaround is proposed:
Subtypes: Type Hierarchy¶
These statements aren’t supported in Snowflake. SnowConvert only recognizes them, but no translation is offered.
CREATE TYPE person_t AS OBJECT (name VARCHAR2(100), ssn NUMBER)
NOT FINAL;
/
CREATE TYPE employee_t UNDER person_t
(department_id NUMBER, salary NUMBER)
NOT FINAL;
/
CREATE TYPE part_time_emp_t UNDER employee_t (num_hrs NUMBER);
/
Type properties¶
These refer to the options that are normally used when using OOP in PL/SQL: Persistable, Instantiable and Final.
CREATE OR REPLACE TYPE type1 AS OBJECT () NOT FINAL NOT INSTANTIABLE NOT PERSISTABLE;
CREATE OR REPLACE TYPE type2 AS OBJECT () FINAL INSTANTIABLE PERSISTABLE;
Nested Table Type¶
These statements aren’t supported in Snowflake. SnowConvert only recognizes them, but no translation is offered.
CREATE TYPE textdoc_typ AS OBJECT
( document_typ VARCHAR2(32)
, formatted_doc BLOB
) ;
/
CREATE TYPE textdoc_tab AS TABLE OF textdoc_typ;
/
Type Source Creation Options¶
These are custom options that control access to and querying of the type.
CREATE TYPE type1 FORCE OID 'abc' SHARING = METADATA DEFAULT COLLATION schema1.collation ACCESSIBLE BY (schema1.unitaccesor) AS OBJECT ();
CREATE TYPE type2 FORCE OID 'abc' SHARING = NONE DEFAULT COLLATION collation ACCESSIBLE BY (PROCEDURE unitaccesor) AS OBJECT ();
CREATE TYPE type3 AUTHID CURRENT_USER AS OBJECT ();
CREATE TYPE type4 AUTHID DEFINER AS OBJECT ();
Proposed workarounds¶
About types definition¶
For the type definition, the proposed workaround is to use a semi-structured data type to mimic Oracle’s data type, as sketched below.
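As a minimal sketch of this workaround (the names below are illustrative, not SnowConvert output), what would have been an object-typed column in Oracle can be stored in a Snowflake VARIANT column, populated with OBJECT_CONSTRUCT, and read back with semi-structured access paths:
-- Hypothetical Snowflake sketch: a VARIANT column mimicking an Oracle object type
CREATE OR REPLACE TABLE address_demo (
    address VARIANT
);

INSERT INTO address_demo (address)
SELECT OBJECT_CONSTRUCT('street', 'Main St', 'zip', '10001');

-- Attributes are read back with semi-structured access paths
SELECT address:street::VARCHAR AS street, address:zip::VARCHAR AS zip
FROM address_demo;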
About types member function¶
For member functions containing logic and DML, the proposed workaround relies on helpers to translate them into stored procedures, as sketched below.
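As a minimal sketch of this idea, assuming the type column has already been converted to VARIANT (the procedure, table, and attribute names are illustrative, not SnowConvert output), a member function’s logic can be moved into a Snowflake Scripting stored procedure that reads the attribute from the VARIANT column:
-- Hypothetical Snowflake Scripting sketch of a member function rewritten as a stored procedure
CREATE OR REPLACE PROCEDURE GET_SQUARE_SP()
RETURNS NUMBER
LANGUAGE SQL
AS
$$
DECLARE
    x NUMBER;
BEGIN
    -- Read attribute a1 from the VARIANT column and square it (assumes a single row)
    SELECT c.column1:a1::NUMBER * c.column1:a1::NUMBER
    INTO :x
    FROM table_member_function_demo c;
    RETURN x;
END;
$$;

CALL GET_SQUARE_SP();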
Current SnowConvert Support¶
The next table summarizes the current support provided by the SnowConvert tool. Please keep in mind that translations may not be final, and more work may be needed.
| Type Statement Element | Current recognition status | Current translation status | Has Known Workarounds |
|---|---|---|---|
| Object Type Definitions | Recognized. | Partially Translated. | Yes. |
| Subtype Definitions | Recognized. | Not Translated. | No. |
| Array Type Definitions | Recognized. | Not Translated. | Yes. |
| Nested Table Definitions | Recognized. | Not Translated. | No. |
| Member Function Definitions | Recognized. | Not Translated. | Yes. |
Known Issues¶
1. DML usages for Object Types are not being transformed¶
As of now, only DDL definitions that use user-defined types are transformed into VARIANT. This means that any INSERT, UPDATE, or DELETE using user-defined types is not transformed and needs to be migrated manually. There is no EWI for this yet, but there is a work item to add the corresponding EWI.
2. Create Type creation options are not supported¶
Currently, there is no known workaround for any of the creation options; for this reason, they are not taken into account when defining the type.
Related EWIs¶
No related EWIs.
Array Type Definition¶
SnowConvert only recognizes these definitions and, for the moment, does not support any translation for them. This page serves as a future reference for translations.
Note
Some parts in the output code are omitted for clarity reasons.
Description¶
Array Types define an array structure of a previously existing datatype (including other Custom Types).
For the translation of array types, the type definition is replaced by a Semi-structured Data Type, which is then expanded on any usages across the code. This means taking the type’s definition and expanding it in the original code.
CREATE TYPE <type name>
AS { VARRAY | [VARYING] ARRAY } ( <size limit> ) OF <data type>
Sample Source Patterns¶
Inserts for the array usage¶
The following data is inserted into the table before running the SELECT query. Please note that these INSERT statements currently need to be migrated to Snowflake manually.
Oracle¶
INSERT INTO customer_table_demo(customer_table_id, customer_data) VALUES
(1, phone_list_typ_demo('2000-0000', '4000-0000', '0000-0000'));
INSERT INTO customer_table_demo(customer_table_id, customer_data) VALUES
(1, phone_list_typ_demo('8000-2000', '0000-0000', '5000-0000'));
Snowflake¶
INSERT INTO customer_table_demo(customer_table_id, customer_data)
SELECT 1, ARRAY_CONSTRUCT('2000-0000', '4000-0000', '0000-0000');
INSERT INTO customer_table_demo(customer_table_id, customer_data)
SELECT 1, ARRAY_CONSTRUCT('8000-2000', '0000-0000', '5000-0000');
Array Type usage¶
Oracle¶
CREATE TYPE phone_list_typ_demo AS VARRAY(3) OF VARCHAR2(25);
/
CREATE TABLE customer_table_demo (
customer_table_id INTEGER,
customer_data phone_list_typ_demo
);
/
SELECT * FROM customer_table_demo;
/
CUSTOMER_TABLE_ID|CUSTOMER_DATA |
-----------------+---------------------------------------+
1|[['2000-0000','4000-0000','0000-0000']]|
1|[['8000-2000','0000-0000','5000-0000']]|
Snowflake¶
!!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'VARYING ARRAY' NODE ***/!!!
CREATE TYPE phone_list_typ_demo AS VARRAY(3) OF VARCHAR2(25);
CREATE OR REPLACE TABLE customer_table_demo (
customer_table_id INTEGER,
customer_data VARIANT !!!RESOLVE EWI!!! /*** SSC-EWI-0062 - CUSTOM TYPE 'phone_list_typ_demo' USAGE CHANGED TO VARIANT ***/!!!
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"oracle"}}'
;
CREATE OR REPLACE VIEW PUBLIC.customer_table_demo_view
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": { "major": 0, "minor": 0, "patch": "0" }, "attributes": { "component": "oracle", "convertedOn": "" }}'
AS
SELECT
customer_table_id,
customer_data
FROM
customer_table_demo;
SELECT * FROM
customer_table_demo_view;
CUSTOMER_TABLE_ID|CUSTOMER_DATA |
-----------------+---------------------------------------+
1|{['2000-0000','4000-0000','0000-0000']}|
1|{['8000-2000','0000-0000','5000-0000']}|
Known Issues¶
1. Create Type creation options are not supported¶
Currently, there is no known workaround for any of the creation options; for this reason, they are not taken into account when defining the type.
2. Migrated code output is not functional¶
The statements are being changed unnecessarily, which makes them no longer functional in the output code. This will be addressed when a proper transformation for them is in place.
Related EWIs¶
SSC-EWI-0062: Custom type usage changed to variant.
SSC-EWI-0073: Pending Functional Equivalence Review.
Member Function Definitions¶
SnowConvert does not yet recognize type member functions or type body definitions. This page is only used as a future reference for translation.
Note
Some parts in the output code are omitted for clarity reasons.
Description¶
Like other class definitions, Oracle’s TYPE can implement methods to expose behaviors based on its attributes. Due to Snowflake limitations, MEMBER FUNCTION definitions will be transformed to Snowflake Stored Procedures to maintain functional equivalence.
Since functions are being transformed into procedures, the transformation reference for PL/SQL also applies here.
Sample Source Patterns¶
Inserts for Simple square() member function¶
The following data is inserted into the table before running the SELECT query. Please note that these INSERT statements currently need to be migrated to Snowflake manually.
Oracle¶
INSERT INTO table_member_function_demo(column1) VALUES
(type_member_function_demo(5));
Snowflake¶
INSERT INTO table_member_function_demo (column1)
SELECT OBJECT_CONSTRUCT('a1', 5);
Simple square() member function¶
Oracle¶
-- TYPE DECLARATION
CREATE TYPE type_member_function_demo AS OBJECT (
a1 NUMBER,
MEMBER FUNCTION get_square RETURN NUMBER
);
/
-- TYPE BODY DECLARATION
CREATE TYPE BODY type_member_function_demo IS
MEMBER FUNCTION get_square
RETURN NUMBER
IS x NUMBER;
BEGIN
SELECT c.column1.a1*c.column1.a1 INTO x
FROM table_member_function_demo c;
RETURN (x);
END;
END;
/
-- TABLE
CREATE TABLE table_member_function_demo (column1 type_member_function_demo);
/
-- QUERYING DATA
SELECT
t.column1.get_square()
FROM
table_member_function_demo t;
/
T.COLUMN1.GET_SQUARE()|
----------------------+
25|
Snowflake¶
-- TYPE DECLARATION
!!!RESOLVE EWI!!! /*** SSC-EWI-0056 - CUSTOM TYPES ARE NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS CUSTOM TYPE WERE CHANGED TO VARIANT ***/!!!
CREATE TYPE type_member_function_demo AS OBJECT (
a1 NUMBER,
MEMBER FUNCTION get_square RETURN NUMBER
)
;
---- TYPE BODY DECLARATION
--!!!RESOLVE EWI!!! /*** SSC-EWI-OR0007 - CREATE TYPE WITHOUT BODY IS NOT SUPPORTED IN SNOWFLAKE ***/!!!
--CREATE TYPE BODY type_member_function_demo IS
-- MEMBER FUNCTION get_square
-- RETURN NUMBER
-- IS x NUMBER;
-- BEGIN
-- SELECT c.column1.a1*c.column1.a1 INTO x
-- FROM table_member_function_demo c;
-- RETURN (x);
-- END;
--END
;
-- TABLE
CREATE OR REPLACE TABLE table_member_function_demo (column1 VARIANT !!!RESOLVE EWI!!! /*** SSC-EWI-0062 - CUSTOM TYPE 'type_member_function_demo' USAGE CHANGED TO VARIANT ***/!!!
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},"attributes":{"component":"oracle"}}'
;
CREATE OR REPLACE VIEW PUBLIC.table_member_function_demo_view
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": { "major": 0, "minor": 0, "patch": "0" }, "attributes": { "component": "oracle", "convertedOn": "" }}'
AS
SELECT
column1:a1 :: NUMBER AS a1
FROM
table_member_function_demo;
-- QUERYING DATA
SELECT
t.column1.get_square() !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 't.column1.get_square' NODE ***/!!!
FROM
table_member_function_demo t;
GET_SQUARE()|
------------+
25|
Known Issues¶
No Known issues.
Related EWIs¶
SSC-EWI-0056: Create Type Not Supported.
SSC-EWI-0062: Custom type usage changed to variant.
SSC-EWI-0073: Pending Functional Equivalence Review.
SSC-EWI-OR0007: Create Type Not Supported in Snowflake
Nested Table Type Definition¶
SnowConvert only recognizes these definitions; it does not support any translation, and there is no known workaround for them.
Description
Nested Table Types define an embedded table structure of a previously existing datatype (including other Custom Types). They can be used as a more powerful version of the Array Type.
Unlike any of the other types, there is still no known workaround or any possible translation for them.
CREATE TYPE <type name> AS TABLE OF <data type>
Sample Source Patterns
Nested Table Type usage
Oracle
CREATE TYPE textdoc_typ AS OBJECT (
document_typ VARCHAR2(32),
formatted_doc BLOB
);
/
CREATE TYPE textdoc_tab AS TABLE OF textdoc_typ;
/
Snowflake
!!!RESOLVE EWI!!! /*** SSC-EWI-0056 - CUSTOM TYPES ARE NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS CUSTOM TYPE WERE CHANGED TO VARIANT ***/!!!
CREATE TYPE textdoc_typ AS OBJECT (
document_typ VARCHAR2(32),
formatted_doc BLOB
)
;
!!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'NESTED TABLE' NODE ***/!!!
CREATE TYPE textdoc_tab AS TABLE OF textdoc_typ;
Known Issues
1. Create Type creation options are not supported
Currently, there is no known workaround for any of the creation options; for these reasons, they are not taken into account when defining the type.
Related EWIs
SSC-EWI-0073: Pending Functional Equivalence Review
SSC-EWI-0056: Create Type Not Supported.
Object Type Definition
Note
SnowConvert supports a translation for Object Type Definitions itself. However, their usages are still a work in progress.
Note
Some parts in the output code are omitted for clarity reasons.
Description¶
Object Types define a structure of data similar to a record, with the added advantage of member function definitions, meaning that their data may be used along with some behavior within the type.
For the translation of object types, the type definition is replaced by a Semi-structured Data Type, which is then expanded on any usages across the code. For tables, this means replacing the column with a VARIANT and adding a view so that SELECTs (and also views) against the original table can still function.
CREATE TYPE <type name> AS OBJECT
( [{<type column definition> | type method definition } , ...]);
Sample Source Patterns¶
Inserts for Simple Type usage¶
The following data is inserted into the table before running the SELECT query. Please note that these INSERT statements currently need to be migrated to Snowflake manually.
Oracle¶
INSERT INTO customer_table_demo(customer_table_id, customer_data)
VALUES ( 1, customer_typ_demo(1, 'First Name 1', 'Last Name 1'));
INSERT INTO customer_table_demo(customer_table_id, customer_data)
VALUES ( 2, customer_typ_demo(2, 'First Name 2', 'Last Name 2'));
Snowflake¶
INSERT INTO customer_table_demo(customer_table_id, customer_data)
VALUES ( 1, customer_typ_demo(1, 'First Name 1', 'Last Name 1') !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'customer_typ_demo' NODE ***/!!!);
INSERT INTO customer_table_demo(customer_table_id, customer_data)
VALUES ( 2, customer_typ_demo(2, 'First Name 2', 'Last Name 2') !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'customer_typ_demo' NODE ***/!!!);
Simple Type usage¶
Oracle¶
CREATE TYPE customer_typ_demo AS OBJECT (
customer_id INTEGER,
cust_first_name VARCHAR2(20),
cust_last_name VARCHAR2(20)
);
CREATE TABLE customer_table_demo (
customer_table_id INTEGER,
customer_data customer_typ_demo
);
SELECT * FROM customer_table_demo;
CUSTOMER_TABLE_ID|CUSTOMER_DATA |
-----------------+------------------------------+
1|[1, First Name 1, Last Name 1]|
2|[2, First Name 2, Last Name 2]|
Snowflake¶
!!!RESOLVE EWI!!! /*** SSC-EWI-0056 - CUSTOM TYPES ARE NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS CUSTOM TYPE WERE CHANGED TO VARIANT ***/!!!
CREATE TYPE customer_typ_demo AS OBJECT (
customer_id INTEGER,
cust_first_name VARCHAR2(20),
cust_last_name VARCHAR2(20)
)
;
CREATE OR REPLACE TABLE customer_table_demo (
customer_table_id INTEGER,
customer_data VARIANT !!!RESOLVE EWI!!! /*** SSC-EWI-0062 - CUSTOM TYPE 'customer_typ_demo' USAGE CHANGED TO VARIANT ***/!!!
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
CREATE OR REPLACE VIEW PUBLIC.customer_table_demo_view
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": { "major": 0, "minor": 0, "patch": "0" }, "attributes": { "component": "oracle", "convertedOn": "" }}'
AS
SELECT
customer_table_id,
customer_data:customer_id :: INTEGER AS customer_id,
customer_data:cust_first_name :: VARCHAR AS cust_first_name,
customer_data:cust_last_name :: VARCHAR AS cust_last_name
FROM
customer_table_demo;
SELECT * FROM
customer_table_demo_view;
CUSTOMER_TABLE_ID|CUST_ID|CUST_FIRST_NAME|CUST_LAST_NAME|
-----------------+-------+---------------+--------------+
1|1 |First Name 1 |Last Name 1 |
2|2 |First Name 2 |Last Name 2 |
Inserts for Nested Type Usage¶
These statements need to be placed between the table creation and the select statement to test the output.
Oracle¶
INSERT INTO customer_table_demo(customer_id, customer_data) values
(1, customer_typ_demo('Customer 1', email_typ_demo('email@domain.com')));
INSERT INTO customer_table_demo(customer_id, customer_data) values
(2, customer_typ_demo('Customer 2', email_typ_demo('email2@domain.com')));
Snowflake¶
INSERT INTO customer_table_demo(customer_id, customer_data) values
(1, customer_typ_demo('Customer 1', email_typ_demo('email@domain.com') !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'email_typ_demo' NODE ***/!!!) !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'customer_typ_demo' NODE ***/!!!);
INSERT INTO customer_table_demo(customer_id, customer_data) values
(2, customer_typ_demo('Customer 2', email_typ_demo('email2@domain.com') !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'email_typ_demo' NODE ***/!!!) !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'customer_typ_demo' NODE ***/!!!);
Nested Type Usage¶
Oracle¶
CREATE TYPE email_typ_demo AS OBJECT (email VARCHAR2(20));
CREATE TYPE customer_typ_demo AS OBJECT (
cust_name VARCHAR2(20),
cust_email email_typ_demo
);
CREATE TABLE customer_table_demo (
customer_id INTEGER,
customer_data customer_typ_demo
);
SELECT * FROM customer_table_demo;
CUSTOMER_ID|CUSTOMER_DATA |
-----------+---------------------------------+
1|[Customer 1, [email@domain.com]] |
2|[Customer 2, [email2@domain.com]]|
Snowflake¶
!!!RESOLVE EWI!!! /*** SSC-EWI-0056 - CUSTOM TYPES ARE NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS CUSTOM TYPE WERE CHANGED TO VARIANT ***/!!!
CREATE TYPE email_typ_demo AS OBJECT (email VARCHAR2(20))
;
!!!RESOLVE EWI!!! /*** SSC-EWI-0056 - CUSTOM TYPES ARE NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS CUSTOM TYPE WERE CHANGED TO VARIANT ***/!!!
CREATE TYPE customer_typ_demo AS OBJECT (
cust_name VARCHAR2(20),
cust_email email_typ_demo
)
;
CREATE OR REPLACE TABLE customer_table_demo (
customer_id INTEGER,
customer_data VARIANT !!!RESOLVE EWI!!! /*** SSC-EWI-0062 - CUSTOM TYPE 'customer_typ_demo' USAGE CHANGED TO VARIANT ***/!!!
)
COMMENT = '{"origin":"sf_sc","name":"snowconvert","version":{"major":1, "minor":0},{"attributes":{"component":"oracle"}}'
;
CREATE OR REPLACE VIEW PUBLIC.customer_table_demo_view
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": { "major": 0, "minor": 0, "patch": "0" }, "attributes": { "component": "oracle", "convertedOn": "" }}'
AS
SELECT
customer_id,
customer_data:cust_name :: VARCHAR AS cust_name,
customer_data:cust_email:email :: VARCHAR AS email
FROM
customer_table_demo;
SELECT * FROM
customer_table_demo_view;
CUSTOMER_ID|CUST_NAME |CUST_EMAIL |
-----------+----------+-----------------+
1|Customer 1|email@domain.com |
2|Customer 2|email2@domain.com|
Known Issues¶
1. Migrated code output is not the same¶
The view statement is being changed unnecessarily, which means the table no longer has the same behavior in the output code. There is a work item to fix this issue.
2. DML for User-defined Types is not being transformed¶
DML that interacts with elements that have user-defined types within them (like a table) is not being transformed. There is a work item to implement this in the future.
3. Create Type creation options are not supported¶
Currently, there is no known workaround for any of the creation options; for this reason, they are not taken into account when defining the type.
Related EWIs¶
SSC-EWI-0056: Create Type Not Supported.
SSC-EWI-0062: Custom type usage changed to variant.
SSC-EWI-0073: Pending Functional Equivalence Review.
Subtype Definition¶
Since there are no known workarounds, SnowConvert only recognizes these definitions and does not support any translation for them.
Description
Subtypes define a structure of data similar to a record, with the added advantage of member function definitions, meaning that their data may be used along with some behavior within the type. Unlike Object Types, Subtypes are built as an extension of another existing type.
Regarding subtype definitions, there is still no translation, but there might be a way to reimplement them using Object Type Definitions and then use their respective translation, as sketched after the syntax below.
CREATE TYPE <type name> UNDER <super type name>
( [{<type column definition> | type method definition } , ...]);
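As a hypothetical sketch of that idea (a manual rewrite, not SnowConvert output), the attributes of a subtype and its parent could be folded into a single object type, which can then follow the Object Type Definition translation. The example below flattens a person/employee hierarchy like the one shown in the sample source patterns:
-- Hypothetical manual rewrite: flatten the parent and subtype attributes into one object type
CREATE TYPE employee_flat_t AS OBJECT (
    name          VARCHAR2(100),
    ssn           INTEGER,
    department_id INTEGER,
    salary        INTEGER
);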
Sample Source Patterns
Subtypes under an Object Type
Oracle
CREATE TYPE person_t AS OBJECT (name VARCHAR2(100), ssn INTEGER)
NOT FINAL;
/
CREATE TYPE employee_t UNDER person_t
(department_id INTEGER, salary INTEGER)
NOT FINAL;
/
CREATE TYPE part_time_emp_t UNDER employee_t (num_hrs INTEGER);
/
Snowflake
!!!RESOLVE EWI!!! /*** SSC-EWI-0056 - CUSTOM TYPES ARE NOT SUPPORTED IN SNOWFLAKE BUT REFERENCES TO THIS CUSTOM TYPE WERE CHANGED TO VARIANT ***/!!!
CREATE TYPE person_t AS OBJECT (name VARCHAR2(100), ssn INTEGER)
NOT FINAL;
--!!!RESOLVE EWI!!! /*** SSC-EWI-OR0007 - CREATE TYPE SUBTYPE IS NOT SUPPORTED IN SNOWFLAKE ***/!!!
--CREATE TYPE employee_t UNDER person_t
-- (department_id INTEGER, salary INTEGER)
-- NOT FINAL
;
--!!!RESOLVE EWI!!! /*** SSC-EWI-OR0007 - CREATE TYPE SUBTYPE IS NOT SUPPORTED IN SNOWFLAKE ***/!!!
--CREATE TYPE part_time_emp_t UNDER employee_t (num_hrs INTEGER)
;
Known Issues
1. Create Type creation options are not supported
Currently, there is no known workaround for any of the creation options; for this reason, they are not taken into account when defining the type.
Related EWIs
SSC-EWI-0056: Create Type Not Supported.
SSC-EWI-OR0007: Create Type Not Supported in Snowflake.