SnowConvert AI - Hive Functional Differences

SSC-FDM-HV0001

Inserting values into an external table is not supported in Snowflake

Description

Hive-format tables allow values to be inserted, but Snowflake external tables do not support inserting values. This means that while the table structure is converted, any attempt to insert data directly into the external table in Snowflake will fail.

Code Example

Input

Spark
 CREATE EXTERNAL TABLE IF NOT EXISTS External_table_hive_format
(
  order_id int,
  date string,
  client_name string,
  total float
)
stored as AVRO
LOCATION 'gs://sc_external_table_bucket/folder_with_avro/orders.avro';

Output

Snowflake
 --** SSC-FDM-HV0001 - INSERTING VALUES INTO AN EXTERNAL TABLE IS NOT SUPPORTED IN SNOWFLAKE **
CREATE EXTERNAL TABLE IF NOT EXISTS External_table_hive_format
(
  order_id int AS CAST(GET_IGNORE_CASE($1, 'order_id') AS int),
  date string AS CAST(GET_IGNORE_CASE($1, 'date') AS string),
  client_name string AS CAST(GET_IGNORE_CASE($1, 'client_name') AS string),
  total float AS CAST(GET_IGNORE_CASE($1, 'total') AS float)
)
!!!RESOLVE EWI!!! /*** SSC-EWI-0032 - EXTERNAL TABLE REQUIRES AN EXTERNAL STAGE TO ACCESS gs:, DEFINE AND REPLACE THE EXTERNAL_STAGE PLACEHOLDER ***/!!!
LOCATION = @EXTERNAL_STAGE
AUTO_REFRESH = false
FILE_FORMAT = (TYPE = AVRO)
PATTERN = '/sc_external_table_bucket/folder_with_avro/orders.avro'
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": {  "major": 0,  "minor": 0,  "patch": "0" }, "attributes": {  "component": "spark",  "convertedOn": "06/18/2025",  "domain": "no-domain-provided" }}';

Best Practices

  • If you need additional support, you can email us at snowconvert-support@snowflake.com.
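Because the converted external table is read-only, inserts have to be redirected: either load the new rows into a regular Snowflake table, or add the data files to the external stage and refresh the external table. A minimal sketch reusing the names from the example above (`orders_internal` is an illustrative name):

```sql
-- The converted external table remains queryable:
SELECT * FROM External_table_hive_format;

-- New rows must go into a regular (internal) table instead,
-- because Snowflake external tables are read-only:
CREATE TABLE IF NOT EXISTS orders_internal (
  order_id INT,
  date STRING,
  client_name STRING,
  total FLOAT
);
INSERT INTO orders_internal VALUES (1, '2025-06-18', 'ACME', 99.99);

-- Alternatively, add new AVRO files to the stage location and refresh:
ALTER EXTERNAL TABLE External_table_hive_format REFRESH;
```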

SSC-FDM-HV0002

Partitioned column added to table definition

Description

For Hive/Spark partitioned tables, the partition columns are stored in the directory structure rather than in the table data. Snowflake does not support this pattern. SnowConvert AI adds the partitioned columns to the table definition as regular columns so the table schema is complete.

Code Example

Input

Hive
 CREATE EXTERNAL TABLE sales_data
(
  product_id INT,
  amount DECIMAL(10,2)
)
PARTITIONED BY (sale_month STRING)
STORED AS PARQUET
LOCATION 's3://bucket/sales/';

Output

Snowflake
 CREATE EXTERNAL TABLE sales_data (
  product_id INT,
  amount DECIMAL(10,2),
  sale_month STRING
)
--** SSC-FDM-HV0002 - PARTITIONED COLUMN ADDED TO TABLE DEFINITION. **
LOCATION = @EXTERNAL_STAGE
FILE_FORMAT = (TYPE = PARQUET);

Best Practices

  • Verify that partition columns are correctly mapped to your file path structure.

  • If you need additional support, you can email us at snowconvert-support@snowflake.com.
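If the partition values should instead be derived from the Hive-style directory layout (e.g., .../sales/sale_month=2025-06/...), Snowflake external tables can compute them from the METADATA$FILENAME pseudocolumn and declare them in PARTITION BY. A sketch under the assumption that the files follow that layout; the stage name is a placeholder:

```sql
CREATE EXTERNAL TABLE sales_data (
  product_id INT AS CAST(GET_IGNORE_CASE($1, 'product_id') AS INT),
  amount DECIMAL(10,2) AS CAST(GET_IGNORE_CASE($1, 'amount') AS DECIMAL(10,2)),
  -- Recover the partition value from the directory name in the file path,
  -- e.g. 'sales/sale_month=2025-06/part-0000.parquet' -> '2025-06'
  sale_month STRING AS SPLIT_PART(SPLIT_PART(METADATA$FILENAME, 'sale_month=', 2), '/', 1)
)
PARTITION BY (sale_month)
LOCATION = @EXTERNAL_STAGE
FILE_FORMAT = (TYPE = PARQUET);
```

Declaring the column in PARTITION BY lets Snowflake prune files by partition value, which is closer to the original Hive behavior than storing the value as a regular data column.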

SSC-FDM-HV0003

NULL format parameter is not supported in FROM_UNIXTIME

Description

Hive’s FROM_UNIXTIME function allows a NULL format parameter, in which case it uses a default format. Snowflake’s equivalent (TO_VARCHAR with TO_TIMESTAMP_NTZ) does not support a NULL format parameter. SnowConvert AI passes the NULL through, but the conversion may fail at runtime or behave unexpectedly.

Code Example

Input

Hive
 SELECT FROM_UNIXTIME(1697328000, CAST(NULL AS STRING));

Output

Snowflake
 SELECT
  --** SSC-FDM-HV0003 - NULL FORMAT PARAMETER IS NOT SUPPORTED IN FROM_UNIXTIME. **
  TO_VARCHAR(TO_TIMESTAMP_NTZ(1697328000), CAST(NULL AS STRING));

Best Practices

  • Replace the NULL format parameter with an explicit format string; since the converted code runs in Snowflake, use Snowflake format elements (e.g., 'YYYY-MM-DD HH24:MI:SS', the equivalent of Hive's default 'yyyy-MM-dd HH:mm:ss').

  • If you need additional support, you can email us at snowconvert-support@snowflake.com.
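To make the intent explicit, substitute the format Hive would have applied by default. A sketch; note that Snowflake uses its own format elements rather than Hive's Java-style pattern, and `fmt_col` / `my_formats` are illustrative names:

```sql
-- Hive's FROM_UNIXTIME default format is 'yyyy-MM-dd HH:mm:ss';
-- the equivalent Snowflake format elements are 'YYYY-MM-DD HH24:MI:SS'.
SELECT TO_VARCHAR(TO_TIMESTAMP_NTZ(1697328000), 'YYYY-MM-DD HH24:MI:SS');

-- If the format comes from a column that may be NULL, substitute the default:
SELECT TO_VARCHAR(
  TO_TIMESTAMP_NTZ(1697328000),
  COALESCE(fmt_col, 'YYYY-MM-DD HH24:MI:SS')
)
FROM my_formats;
```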

SSC-FDM-HV0004

INSTR transformed to REGEXP_INSTR changes literal to regex pattern

Description

Hive’s INSTR function uses literal string matching. Snowflake does not have INSTR; SnowConvert AI translates it to REGEXP_INSTR. REGEXP_INSTR interprets the pattern as a regex, so metacharacters (e.g., ., *, $) will behave differently than in Hive’s literal matching.

Code Example

Input

Hive
 SELECT INSTR('price: $10.99', pattern_col, 1, 1);

Output

Snowflake
 SELECT
  --** SSC-FDM-HV0004 - HIVE'S INSTR USES LITERAL STRING MATCHING, BUT REGEXP_INSTR INTERPRETS THE PATTERN AS A REGEX. METACHARACTERS WILL BEHAVE DIFFERENTLY. **
  REGEXP_INSTR('price: $10.99', pattern_col, 1, 1);

Best Practices

  • When the pattern contains regex metacharacters, escape them or use REGEXP_REPLACE to sanitize the pattern.

  • If you need additional support, you can email us at snowconvert-support@snowflake.com.
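Two possible ways to recover literal-match behavior are sketched below; the pattern values are illustrative. Note that in Snowflake string literals the backslash is itself an escape character, so each regex backslash must be doubled:

```sql
-- Option 1: POSITION performs literal (non-regex) matching
-- of the first occurrence of a substring.
SELECT POSITION('$10.99', 'price: $10.99');

-- Option 2: keep REGEXP_INSTR (which also supports a start position and
-- occurrence number), but escape regex metacharacters in the pattern
-- so that '$' and '.' are matched literally.
SELECT REGEXP_INSTR('price: $10.99', '\\$10\\.99', 1, 1);
```

POSITION is the simpler choice when only the first occurrence matters; escaping the pattern is needed when the INSTR call's start-position or occurrence arguments must be preserved.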