SnowConvert AI - Informatica PowerCenter

This topic documents the Informatica PowerCenter components supported by SnowConvert AI and describes the output generated for each component category.

For general information about the ETL migration process, prerequisites, and step-by-step migration instructions, see ETL Migration.

Control flow elements

Informatica Workflows define the orchestration logic: which Sessions run, in what order, and with what variable context. SnowConvert AI converts Workflow elements into Snowflake TASK DAGs and stored procedures.

These Informatica Workflow elements are supported:

| Element | Category | Conversion Target | Notes |
| --- | --- | --- | --- |
| Session | Task | Snowflake TASK with EXECUTE DBT PROJECT | See Session task section |
| Session Overrides | Configuration | Applied during dbt project generation | Pre/Post SQL, SQL Override |
| Worklet | Container | Nested Snowflake Task Graph | See Worklet task section |
| Start | Task | Initialization TASK | Variable setup via InitVariablesFromConfig |
| Variable Assignment | Configuration | Table-driven variable management | See Variable management section |
| Parameter Files | Configuration | Loaded into control_variables at runtime | .txt and .xml formats supported |

Note

Unlisted Workflow elements generate an EWI code indicating manual conversion is required.

Session task

A Session in Informatica executes a Mapping at runtime. Each Session is converted into a Snowflake TASK that runs the corresponding dbt project.

Conversion output:

  • A CREATE OR REPLACE TASK statement with AFTER dependency on the preceding task

  • Variable initialization via BuildDbtVarsJsonUDF

  • EXECUTE DBT PROJECT call with the variable JSON payload

Example:

CREATE OR REPLACE TASK public.wf_daily_load_s_load_customers
WAREHOUSE = DUMMY_WAREHOUSE
AFTER public.wf_daily_load
AS
BEGIN
    LET dbt_vars VARCHAR := public.BuildDbtVarsJsonUDF('wf_daily_load');
    EXECUTE DBT PROJECT public.m_load_customers ARGS = :dbt_vars;
END;

Notes:

  • Replace DUMMY_WAREHOUSE with your actual Snowflake warehouse name

  • The AFTER clause preserves the original Workflow execution order

  • Reusable Sessions are resolved to reference the correct Mapping

Session overrides

Informatica Sessions can override properties defined at the Mapping level. SnowConvert AI captures these overrides and applies them during conversion.

Pre-SQL and Post-SQL

  • Session-level Pre-SQL statements are generated before the EXECUTE DBT PROJECT call within the TASK body

  • Session-level Post-SQL statements are generated after the EXECUTE DBT PROJECT call

  • Pre-SQL and Post-SQL on Target transformations are also supported

SQL Override

  • When a Session defines a SQL Override for a Source Qualifier, the override takes priority over the Mapping-level SQL

  • The override is applied during dbt model generation, replacing the default source query
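Putting these together, the placement of Session-level Pre-SQL and Post-SQL inside a generated TASK body can be sketched as follows. This is an illustrative sketch, not verbatim generated output: the TRUNCATE and INSERT statements stand in for whatever Pre-SQL and Post-SQL the Session actually defines.

```sql
CREATE OR REPLACE TASK public.wf_daily_load_s_load_customers
  WAREHOUSE = DUMMY_WAREHOUSE
  AFTER public.wf_daily_load
AS
BEGIN
    -- Session-level Pre-SQL runs before the dbt project (hypothetical statement)
    TRUNCATE TABLE staging.customer_errors;
    LET dbt_vars VARCHAR := public.BuildDbtVarsJsonUDF('wf_daily_load');
    EXECUTE DBT PROJECT public.m_load_customers ARGS = :dbt_vars;
    -- Session-level Post-SQL runs after the dbt project (hypothetical statement)
    INSERT INTO audit.load_log SELECT 'm_load_customers', CURRENT_TIMESTAMP();
END;
```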

Worklet task

Informatica Worklets are reusable, nested sub-workflows that can be embedded inside a parent Workflow. SnowConvert AI converts Worklets into nested Snowflake Task Graphs.

Conversion output:

  • A separate SQL file per Worklet in the worklets/ directory

  • The Worklet’s internal Start task and Session tasks are converted following the same patterns as the parent Workflow

  • The parent Workflow references the Worklet root task via an AFTER dependency

Example structure:

{FolderName}/{MappingName}/
├── wf_daily_load.sql               # Parent Workflow
└── worklets/
    └── wklt_validate.sql             # Worklet sub-graph

The Worklet Task Graph preserves the original execution hierarchy. Worklet tasks follow the same naming and variable management patterns as the parent Workflow.
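As a sketch of how the parent Workflow chains into the Worklet sub-graph (task and file names are illustrative; the actual generated names may differ):

```sql
-- worklets/wklt_validate.sql: root task of the Worklet sub-graph
CREATE OR REPLACE TASK public.wklt_validate
  WAREHOUSE = DUMMY_WAREHOUSE
  AFTER public.wf_daily_load   -- AFTER dependency referenced by the parent Workflow
AS
BEGIN
    CALL public.InitVariablesFromConfig('wf_daily_load');
END;

-- The Worklet's internal Session tasks then chain off public.wklt_validate
-- with their own AFTER clauses, following the Session task pattern above.
```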

Start task

The Informatica Start task (the entry point of a Workflow) is converted into an initialization TASK that sets up the variable context:

CREATE OR REPLACE TASK public.wf_daily_load AS
BEGIN
    CALL public.InitVariablesFromConfig('wf_daily_load');
END;

This TASK loads default variable values and applies parameter file overrides before any Session tasks execute.

dbt project execution

Within the orchestration code, Sessions are executed using Snowflake’s EXECUTE DBT PROJECT command:

EXECUTE DBT PROJECT schema.project_name ARGS = :dbt_vars;

Important requirements:

  • The project_name must match the name used when deploying the dbt project (via CREATE DBT PROJECT or Snowflake Workspace deployment)

  • The ARGS parameter passes workflow variables as a JSON payload built by BuildDbtVarsJsonUDF

  • Each execution runs the entire dbt project with all models in dependency order

Deployment options:

  • Snowflake CLI: snow dbt deploy --schema schema_name --database database_name --force package_name

  • Snowflake Workspace: Upload and deploy via UI

Variable management

Informatica variables ($$var assignments) are converted into a table-driven management system. The infrastructure components are generated in the etl_configuration/ folder.

Control variables table

The control_variables table stores all workflow variables, parameters, and their values:

| Field | Type | Description |
| --- | --- | --- |
| variable_name | VARCHAR | Variable name |
| variable_value | VARIANT | Value (accommodates any data type) |
| variable_type | VARCHAR | Original Informatica data type |
| variable_scope | VARCHAR | Workflow or session name |
| is_parameter | BOOLEAN | Distinguishes parameters from variables |
| is_persistent | BOOLEAN | Reserved for future use |
| last_updated_at | TIMESTAMP | Last update time |
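As a concrete illustration of this schema, a single workflow variable row might be seeded like this (the schema name, variable name, and values are hypothetical):

```sql
INSERT INTO etl_configuration.control_variables
    (variable_name, variable_value, variable_type, variable_scope,
     is_parameter, is_persistent, last_updated_at)
SELECT '$$LOAD_DATE',            -- variable name as written in Informatica
       TO_VARIANT('2024-01-01'), -- VARIANT accommodates any data type
       'date/time',              -- original Informatica data type
       'wf_daily_load',          -- scope: the owning workflow
       FALSE, FALSE, CURRENT_TIMESTAMP();
```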

UDFs and procedures

| Component | Purpose |
| --- | --- |
| GetControlVariableUDF | Retrieves a variable value from the control table |
| BuildDbtVarsJsonUDF | Constructs a JSON payload of all variables for a workflow scope, passed as ARGS to EXECUTE DBT PROJECT |
| InitVariablesFromConfig | Initializes variables at workflow start; loads defaults and applies parameter file overrides |
| UpdateControlVariable | Updates a variable value during orchestration execution |
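Typical usage of these components during orchestration might look like the following. The argument lists are assumptions inferred from the descriptions above, not confirmed signatures; check the generated etl_configuration/ code for the exact parameters.

```sql
-- Read one variable value for a given workflow scope (assumed signature)
SELECT public.GetControlVariableUDF('$$LOAD_DATE', 'wf_daily_load');

-- Update a variable mid-orchestration (assumed signature)
CALL public.UpdateControlVariable('$$LOAD_DATE', '2024-01-02', 'wf_daily_load');
```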

Parameter file support

Informatica .txt and .xml parameter files are parsed during migration. Parameter values are loaded into the control_variables table at runtime via InitVariablesFromConfig, overriding default variable values for each workflow execution.
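For reference, a minimal PowerCenter .txt parameter file follows the standard section-header format (folder, workflow, and variable names here are illustrative):

```text
[FolderName.WF:wf_daily_load]
$$LOAD_DATE=2024-01-01
$$REGION=EMEA
```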

Built-in variables

| Variable | Description |
| --- | --- |
| SESSSTARTTIME | Session start timestamp. Captured at task start via CURRENT_TIMESTAMP(). |
| SYSDATE | Current date and time. Mapped to CURRENT_TIMESTAMP(). |

Data flow components

Informatica Mappings define the data transformation logic: how data flows from sources through transformations to targets. SnowConvert AI converts each Mapping into a standalone dbt project with a three-tier model architecture.

These Informatica Mapping transformations are supported:

| Informatica Component | Category | dbt Output | Naming Pattern | Notes |
| --- | --- | --- | --- | --- |
| Source Qualifier | Source | Staging Model | stg_{source_name} | |
| Source Definition | Source | Staging Model | stg_{source_name} | |
| Expression | Transformation | Intermediate Model | int_{transformation_name} | |
| Filter | Transformation | Intermediate Model | int_{transformation_name} | |
| Joiner | Transformation | Intermediate Model | int_{transformation_name} | |
| Lookup (Connected) | Transformation | Intermediate Model | int_{transformation_name} | |
| Lookup (Unconnected) | Transformation | Intermediate Model | int_{transformation_name} | |
| Aggregator | Transformation | Intermediate Model | int_{transformation_name} | |
| Aggregator (No Group By) | Transformation | Intermediate Model | int_{transformation_name} | |
| Router | Transformation | Intermediate Model | int_{transformation_name} | |
| Sorter | Transformation | Intermediate Model | int_{transformation_name} | |
| Union | Transformation | Intermediate Model | int_{transformation_name} | |
| Normalizer | Transformation | Intermediate Model | int_{transformation_name} | |
| Rank | Transformation | Intermediate Model | int_{transformation_name} | |
| Sequence Generator | Transformation | Intermediate Model | int_{transformation_name} | |
| Update Strategy | Transformation | Intermediate Model | int_{transformation_name} | Mart uses incremental materialization with merge strategy |
| Stored Procedure | Transformation | Intermediate Model | int_{transformation_name} | |
| Mapplet | Reuse | dbt Macro | macros/{mapplet_name}.sql | |
| Target Definition | Destination | Mart Model | {target_name} | |

Note

Unlisted Mapping transformations generate an EWI code indicating manual conversion is required.

Layer organization

| Layer | Materialization | Purpose |
| --- | --- | --- |
| models/staging/ | View | Clean, type-safe access to source data referenced in sources.yml. Generated from Source Qualifier and Source Definition transformations. |
| models/intermediate/ | Ephemeral | Transformation logic from the original Mapping. Not persisted to the database. Generated from Expression, Joiner, Filter, Lookup, and other transformations. |
| models/marts/ | Incremental or Table | Business-ready data models corresponding to Target Definitions. Uses incremental materialization with merge strategy when an Update Strategy transformation is present. |
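For example, a staging model generated from a Source Qualifier is typically a thin view over the declared source. The source name and column list below are illustrative, not guaranteed generated output:

```sql
-- models/staging/stg_customers.sql (materialized as a view)
SELECT
    customer_id,
    customer_name,
    region_id
FROM {{ source('informatica_sources', 'CUSTOMERS') }}
```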

dbt project structure

Each Mapping produces a standalone dbt project:

{MappingName}/
├── dbt_project.yml                   # Materialization config
├── profiles.yml                      # Snowflake connection profile
├── models/
│   ├── sources.yml                   # Source table definitions
│   ├── staging/
│   │   ├── stg_customers.sql
│   │   └── stg_regions.sql
│   ├── intermediate/
│   │   ├── int_expression.sql
│   │   └── int_joiner.sql
│   └── marts/
│       └── customer_dim.sql
└── macros/
    └── *.sql                         # Mapplet-derived macros

Important

Before deploying, replace the YOUR_SCHEMA and YOUR_DB placeholders in sources.yml and profiles.yml with your actual Snowflake schema and database names.
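In sources.yml the placeholders appear roughly as follows (a standard dbt sources file; the source and table names are illustrative):

```yaml
version: 2

sources:
  - name: informatica_sources
    database: YOUR_DB        # replace with your Snowflake database
    schema: YOUR_SCHEMA      # replace with your Snowflake schema
    tables:
      - name: CUSTOMERS
      - name: REGIONS
```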

Expression functions

SnowConvert AI supports over 60 Informatica expression functions across the following categories:

| Category | Functions |
| --- | --- |
| String | CONCAT, SUBSTR, LENGTH, LOWER, UPPER, LTRIM, RTRIM, LPAD, RPAD, INSTR, REPLACECHR, REPLACESTR, REVERSE, INITCAP, CHR, ASCII |
| Date and time | ADD_TO_DATE, GET_DATE_PART, TO_DATE, DATE_DIFF, DATE_COMPARE, LAST_DAY, SYSTIMESTAMP, TO_CHAR |
| Numeric | ROUND, TRUNC, ABS, CEIL, FLOOR, POWER, SQRT, EXP, LN, LOG, MOD, MEDIAN, SIGN |
| Aggregate | SUM, MIN, MAX, AVG, COUNT, CUME |
| Type conversion | TO_INTEGER, TO_DECIMAL, TO_BIGINT, TO_FLOAT, TO_CHAR, TO_DATE |
| Conditional | IIF, DECODE, ISNULL, IN |
| Encoding | MD5, ENC_BASE64 |
| Variable management | SETVARIABLE, SETMAXVARIABLE, SETMINVARIABLE, ABORT |
| Other | GREATEST, LEAST, REG_MATCH |

Naming and sanitization rules

SnowConvert AI applies consistent sanitization rules to all Informatica object names to ensure dbt and Snowflake compatibility:

| Rule | Description | Example |
| --- | --- | --- |
| Convert to lowercase | All names converted to lowercase | M_Load_Customers → m_load_customers |
| Replace invalid characters | Spaces, hyphens, and special characters become underscores | m_load-customer data → m_load_customer_data |
| Remove consecutive underscores | Avoids __ sequences | m___load → m_load |
| Prefix with t_ | Adds prefix if name starts with a number | 123_mapping → t_123_mapping |
| Remove quotes and brackets | Strips surrounding quotes and brackets | "M_Load" → m_load |

These rules apply uniformly across all generated artifacts: dbt model names, Snowflake TASK names, procedure names, and variable identifiers.
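The rules above can be sketched as a small Python function. This is a hypothetical re-implementation for illustration only; SnowConvert AI's internal sanitization logic may differ in edge cases.

```python
import re

def sanitize_name(name: str) -> str:
    """Apply the documented sanitization rules to an Informatica object name."""
    name = name.strip('"\'[]')                # remove surrounding quotes and brackets
    name = name.lower()                       # convert to lowercase
    name = re.sub(r'[^a-z0-9_]', '_', name)   # invalid characters become underscores
    name = re.sub(r'_+', '_', name)           # collapse consecutive underscores
    if name and name[0].isdigit():            # prefix names that start with a number
        name = 't_' + name
    return name

print(sanitize_name('M_Load_Customers'))   # m_load_customers
print(sanitize_name('"M_Load"'))           # m_load
```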