SnowConvert AI - Informatica PowerCenter

This topic documents the Informatica PowerCenter components supported by SnowConvert AI and describes the output generated for each component category.

For general information about the ETL migration process, prerequisites, and step-by-step migration instructions, see ETL Migration.

Control flow elements

Informatica Workflows define the orchestration logic: which Sessions run, in what order, and with what variable context. SnowConvert AI converts Workflow elements into Snowflake TASK DAGs and stored procedures.

These Informatica Workflow elements are supported:

| Element | Category | Conversion Target | Notes |
| --- | --- | --- | --- |
| Session | Task | Snowflake TASK with EXECUTE DBT PROJECT | See Session task section |
| Session Overrides | Configuration | Applied during dbt project generation | Pre/Post SQL, SQL Override |
| Worklet | Container | Nested Snowflake Task Graph | See Worklet task section |
| Start | Task | Initialization TASK | Variable setup via InitVariablesFromConfig |
| Variable Assignment | Configuration | Table-driven variable management | See Variable management section |
| Parameter Files | Configuration | Loaded into control_variables at runtime | .txt and .xml formats supported |

Unlisted Workflow elements generate an EWI code indicating that manual conversion is required.

Session task

A Session in Informatica executes a Mapping at runtime. Each Session is converted into a Snowflake TASK that runs the corresponding dbt project.

Conversion output:

  • A CREATE OR REPLACE TASK statement with AFTER dependency on the preceding task
  • Variable initialization via BuildDbtVarsJsonUDF
  • EXECUTE DBT PROJECT call with the variable JSON payload

Example:

```sql
CREATE OR REPLACE TASK public.wf_daily_load_s_load_customers
WAREHOUSE = DUMMY_WAREHOUSE
AFTER public.wf_daily_load
AS
BEGIN
    LET dbt_vars VARCHAR := public.BuildDbtVarsJsonUDF('wf_daily_load');
    EXECUTE DBT PROJECT public.m_load_customers ARGS = :dbt_vars;
END;
```

Notes:

  • Replace DUMMY_WAREHOUSE with your actual Snowflake warehouse name
  • The AFTER clause preserves the original Workflow execution order
  • Reusable Sessions are resolved to reference the correct Mapping

Session overrides

Informatica Sessions can override properties defined at the Mapping level. SnowConvert AI captures these overrides and applies them during conversion.

Pre-SQL and Post-SQL

  • Session-level Pre-SQL statements are generated before the EXECUTE DBT PROJECT call within the TASK body
  • Session-level Post-SQL statements are generated after the EXECUTE DBT PROJECT call
  • Pre-SQL and Post-SQL on Target transformations are also supported
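The ordering above can be pictured as a simple body-assembly step. The sketch below is illustrative only (not SnowConvert internals, and `build_task_body` is a hypothetical helper): Pre-SQL statements are emitted first, then the EXECUTE DBT PROJECT call, then Post-SQL statements.

```python
# Illustrative sketch: how Pre-SQL and Post-SQL overrides wrap the
# EXECUTE DBT PROJECT call inside the generated TASK body.
def build_task_body(project: str, pre_sql: list, post_sql: list) -> str:
    statements = []
    statements.extend(pre_sql)       # Session-level Pre-SQL runs first
    statements.append(
        f"EXECUTE DBT PROJECT {project} ARGS = :dbt_vars;"
    )                                # then the dbt project execution
    statements.extend(post_sql)      # Session-level Post-SQL runs last
    return "BEGIN\n    " + "\n    ".join(statements) + "\nEND;"
```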

SQL Override

  • When a Session defines a SQL Override for a Source Qualifier, the override takes priority over the Mapping-level SQL
  • The override is applied during dbt model generation, replacing the default source query

Worklet task

Informatica Worklets are reusable, nested sub-workflows that can be embedded inside a parent Workflow. SnowConvert AI converts Worklets into nested Snowflake Task Graphs.

Conversion output:

  • A separate SQL file per Worklet in the worklets/ directory
  • The Worklet’s internal Start task and Session tasks are converted following the same patterns as the parent Workflow
  • The parent Workflow references the Worklet root task via an AFTER dependency

Example structure:

```
{FolderName}/{MappingName}/
├── wf_daily_load.sql             # Parent Workflow
└── worklets/
    └── wklt_validate.sql         # Worklet sub-graph
```

The Worklet Task Graph preserves the original execution hierarchy. Worklet tasks follow the same naming and variable management patterns as the parent Workflow.

Start task

The Informatica Start task (the entry point of a Workflow) is converted into an initialization TASK that sets up the variable context:

```sql
CREATE OR REPLACE TASK public.wf_daily_load AS
BEGIN
    CALL public.InitVariablesFromConfig('wf_daily_load');
END;
```

This TASK loads default variable values and applies parameter file overrides before any Session tasks execute.

dbt project execution

Within the orchestration code, Sessions are executed using Snowflake’s EXECUTE DBT PROJECT command:

```sql
EXECUTE DBT PROJECT schema.project_name ARGS = :dbt_vars;
```

Important requirements:

  • The project_name must match the name used when deploying the dbt project (via CREATE DBT PROJECT or Snowflake Workspace deployment)
  • The ARGS parameter passes workflow variables as a JSON payload built by BuildDbtVarsJsonUDF
  • Each execution runs the entire dbt project with all models in dependency order

Deployment options:

  • Snowflake CLI: `snow dbt deploy --schema schema_name --database database_name --force package_name`
  • Snowflake Workspace: Upload and deploy via the UI

Variable management

Informatica variables ($$var assignments) are converted into a table-driven management system. The infrastructure components are generated in the etl_configuration/ folder.

Control variables table

The control_variables table stores all workflow variables, parameters, and their values:

| Field | Type | Description |
| --- | --- | --- |
| variable_name | VARCHAR | Variable name |
| variable_value | VARIANT | Value (accommodates any data type) |
| variable_type | VARCHAR | Original Informatica data type |
| variable_scope | VARCHAR | Workflow or session name |
| is_parameter | BOOLEAN | Distinguishes parameters from variables |
| is_persistent | BOOLEAN | Reserved for future use |
| last_updated_at | TIMESTAMP | Last update time |

UDFs and procedures

| Component | Purpose |
| --- | --- |
| GetControlVariableUDF | Retrieves a variable value from the control table |
| BuildDbtVarsJsonUDF | Constructs a JSON payload of all variables for a workflow scope, passed as ARGS to EXECUTE DBT PROJECT |
| InitVariablesFromConfig | Initializes variables at workflow start; loads defaults and applies parameter file overrides |
| UpdateControlVariable | Updates a variable value during orchestration execution |
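The payload-building step can be sketched in a few lines. This is a hypothetical reimplementation of the idea behind BuildDbtVarsJsonUDF, not the shipped UDF: rows from control_variables (shaped like the table above) are filtered by workflow scope and serialized to the JSON that EXECUTE DBT PROJECT receives via ARGS.

```python
import json

# Hypothetical sketch of the BuildDbtVarsJsonUDF logic: collect the
# control_variables rows for one workflow scope into a JSON payload.
def build_dbt_vars_json(rows: list, scope: str) -> str:
    payload = {
        row["variable_name"]: row["variable_value"]
        for row in rows
        if row["variable_scope"] == scope  # only this workflow's variables
    }
    return json.dumps(payload)
```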

Parameter file support

Informatica .txt and .xml parameter files are parsed during migration. Parameter values are loaded into the control_variables table at runtime via InitVariablesFromConfig, overriding default variable values for each workflow execution.
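A .txt parameter file groups `$$NAME=value` lines under bracketed section headers such as `[Global]` or `[Folder.WF:wf_daily_load]`. The parser below is a simplified sketch of that format (the shipped InitVariablesFromConfig handles more cases, including .xml files); `parse_param_file` is a hypothetical helper name.

```python
# Simplified sketch of Informatica .txt parameter file parsing:
# bracketed headers open a scope, "$$NAME=value" lines fill it.
def parse_param_file(text: str) -> dict:
    sections = {}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", ";")):
            continue                              # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]                  # new scope header
            sections.setdefault(current, {})
        elif "=" in line and current is not None:
            name, _, value = line.partition("=")  # split on first "="
            sections[current][name.strip()] = value.strip()
    return sections
```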

Built-in variables

| Variable | Description |
| --- | --- |
| SESSSTARTTIME | Session start timestamp. Captured at task start via CURRENT_TIMESTAMP(). |
| SYSDATE | Current date and time. Mapped to CURRENT_TIMESTAMP(). |

Data flow components

Informatica Mappings define the data transformation logic: how data flows from sources through transformations to targets. SnowConvert AI converts each Mapping into a standalone dbt project with a three-tier model architecture.

These Informatica Mapping transformations are supported:

| Informatica Component | Category | dbt Output | Naming Pattern | Notes |
| --- | --- | --- | --- | --- |
| Source Qualifier | Source | Staging Model | stg_{source_name} | |
| Source Definition | Source | Staging Model | stg_{source_name} | |
| Expression | Transformation | Intermediate Model | int_{transformation_name} | |
| Filter | Transformation | Intermediate Model | int_{transformation_name} | |
| Joiner | Transformation | Intermediate Model | int_{transformation_name} | |
| Lookup (Connected) | Transformation | Intermediate Model | int_{transformation_name} | |
| Lookup (Unconnected) | Transformation | Intermediate Model | int_{transformation_name} | |
| Aggregator | Transformation | Intermediate Model | int_{transformation_name} | |
| Aggregator (No Group By) | Transformation | Intermediate Model | int_{transformation_name} | |
| Router | Transformation | Intermediate Model | int_{transformation_name} | |
| Sorter | Transformation | Intermediate Model | int_{transformation_name} | |
| Union | Transformation | Intermediate Model | int_{transformation_name} | |
| Normalizer | Transformation | Intermediate Model | int_{transformation_name} | |
| Rank | Transformation | Intermediate Model | int_{transformation_name} | |
| Sequence Generator | Transformation | Intermediate Model | int_{transformation_name} | |
| Update Strategy | Transformation | Intermediate Model | int_{transformation_name} | Mart uses incremental materialization with merge strategy |
| Stored Procedure | Transformation | Intermediate Model | int_{transformation_name} | |
| Mapplet | Reuse | dbt Macro | macros/{mapplet_name}.sql | |
| Target Definition | Destination | Mart Model | {target_name} | |

Unlisted Mapping transformations generate an EWI code indicating that manual conversion is required.

Layer organization

| Layer | Materialization | Purpose |
| --- | --- | --- |
| models/staging/ | View | Clean, type-safe access to source data referenced in sources.yml. Generated from Source Qualifier and Source Definition transformations. |
| models/intermediate/ | Ephemeral | Transformation logic from the original Mapping; not persisted to the database. Generated from Expression, Joiner, Filter, Lookup, and other transformations. |
| models/marts/ | Incremental or Table | Business-ready data models corresponding to Target Definitions. Uses incremental materialization with a merge strategy when an Update Strategy transformation is present. |
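These tiers correspond to per-layer materialization settings in the generated dbt_project.yml, roughly along these lines (a sketch, not the exact generated file; `my_mapping` stands in for the generated project name):

```yaml
# Sketch of per-layer materialization config in dbt_project.yml.
models:
  my_mapping:
    staging:
      +materialized: view
    intermediate:
      +materialized: ephemeral
    marts:
      +materialized: incremental   # or table when no Update Strategy is present
```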

dbt project structure

Each Mapping produces a standalone dbt project:

```
{MappingName}/
├── dbt_project.yml               # Materialization config
├── profiles.yml                  # Snowflake connection profile
├── models/
│   ├── sources.yml               # Source table definitions
│   ├── staging/
│   │   ├── stg_customers.sql
│   │   └── stg_regions.sql
│   ├── intermediate/
│   │   ├── int_expression.sql
│   │   └── int_joiner.sql
│   └── marts/
│       └── customer_dim.sql
└── macros/
    └── *.sql                     # Mapplet-derived macros
```
Before deploying, replace the `YOUR_SCHEMA` and `YOUR_DB` placeholders in `sources.yml` and `profiles.yml` with your actual Snowflake schema and database names.
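When many projects are generated, the substitution can be scripted. The helper below is purely illustrative (not part of the SnowConvert output; `fill_placeholders` is a hypothetical name) and rewrites both files in place:

```python
from pathlib import Path

# Illustrative helper: replace the YOUR_DB / YOUR_SCHEMA placeholders
# in a generated dbt project's sources.yml and profiles.yml.
def fill_placeholders(project_dir: str, database: str, schema: str) -> None:
    for name in ("models/sources.yml", "profiles.yml"):
        path = Path(project_dir) / name
        text = path.read_text()
        text = text.replace("YOUR_DB", database).replace("YOUR_SCHEMA", schema)
        path.write_text(text)  # rewrite the file in place
```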

Expression functions

SnowConvert AI supports over 60 Informatica expression functions across the following categories:

| Category | Functions |
| --- | --- |
| String | CONCAT, SUBSTR, LENGTH, LOWER, UPPER, LTRIM, RTRIM, LPAD, RPAD, INSTR, REPLACECHR, REPLACESTR, REVERSE, INITCAP, CHR, ASCII |
| Date and time | ADD_TO_DATE, GET_DATE_PART, TO_DATE, DATE_DIFF, DATE_COMPARE, LAST_DAY, SYSTIMESTAMP, TO_CHAR |
| Numeric | ROUND, TRUNC, ABS, CEIL, FLOOR, POWER, SQRT, EXP, LN, LOG, MOD, MEDIAN, SIGN |
| Aggregate | SUM, MIN, MAX, AVG, COUNT, CUME |
| Type conversion | TO_INTEGER, TO_DECIMAL, TO_BIGINT, TO_FLOAT, TO_CHAR, TO_DATE |
| Conditional | IIF, DECODE, ISNULL, IN |
| Encoding | MD5, ENC_BASE64 |
| Variable management | SETVARIABLE, SETMAXVARIABLE, SETMINVARIABLE, ABORT |
| Other | GREATEST, LEAST, REG_MATCH |

Naming and sanitization rules

SnowConvert AI applies consistent sanitization rules to all Informatica object names to ensure dbt and Snowflake compatibility:

| Rule | Description | Example |
| --- | --- | --- |
| Convert to lowercase | All names converted to lowercase | M_Load_Customers → m_load_customers |
| Replace invalid characters | Spaces, hyphens, and special characters become underscores | m_load-customer data → m_load_customer_data |
| Remove consecutive underscores | Avoids `__` sequences | m___load → m_load |
| Prefix with t_ | Adds prefix if name starts with a number | 123_mapping → t_123_mapping |
| Remove quotes and brackets | Strips surrounding quotes and brackets | "M_Load" → m_load |

These rules apply uniformly across all generated artifacts: dbt model names, Snowflake TASK names, procedure names, and variable identifiers.
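Applied in order, the rules behave roughly as follows. This is a sketch reconstructed from the table above, not the actual SnowConvert implementation, and `sanitize_name` is a hypothetical name; edge cases may differ.

```python
import re

# Sketch of the sanitization rules from the table above, applied in order.
def sanitize_name(name: str) -> str:
    name = name.strip('"\'[]')               # remove surrounding quotes/brackets
    name = name.lower()                      # convert to lowercase
    name = re.sub(r"[^a-z0-9_]", "_", name)  # invalid characters -> underscore
    name = re.sub(r"_{2,}", "_", name)       # collapse consecutive underscores
    if name and name[0].isdigit():
        name = "t_" + name                   # prefix names starting with a digit
    return name
```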