Snowflake Migration Skill

The Snowflake Migration Skill is an AI-powered skill for Cortex Code that guides you through an end-to-end database migration to Snowflake. It provides a conversational, interactive workflow — from connecting to your source database through code conversion, deployment, data migration, and validation.

Migrations take time. A full migration — especially for large workloads with hundreds of objects — is not completed in a single session. The skill automatically tracks your progress at every step. Each time you start a session, it reads your project state and picks up exactly where you left off. There is no need to start over.


What you can do

The skill guides you through the full migration lifecycle and lets you jump directly to any stage at any time:

| Capability | Description |
| --- | --- |
| Connect to source | Set up a connection to your source database with credentials stored securely for reuse |
| Extract source code | Pull DDL and stored procedures directly from a live database, or import local .sql files |
| Convert code | Translate source SQL to Snowflake-compatible SQL via SnowConvert, with a full EWI report |
| AI-assisted conversion | AI explains remaining conversion issues, suggests fixes, and applies them interactively |
| Assess workloads | Generate an interactive HTML report covering deployment waves, object exclusions, dynamic SQL patterns, and SSIS/Informatica ETL analysis |
| Deploy objects | Deploy converted tables, views, functions, and procedures to Snowflake wave by wave |
| Migrate data | Copy rows from source tables to Snowflake with automatic row-count validation |
| Test functions and procedures | Capture source-side baselines and run two-sided validation to confirm output equivalence |
| Fix and rule engine | Create reusable fix rules from corrections you make and propagate them across the project automatically |
| Repoint reports | Repoint Power BI reports to use Snowflake as the data source |


Supported source systems

Not all capabilities are available for all source systems. The following table shows what is supported today and what is coming soon.

| Capability | SQL Server | Redshift | Teradata | Oracle | Other source systems |
| --- | --- | --- | --- | --- | --- |
| Code extraction | Yes | Yes | Planned | Planned | |
| Deterministic code conversion | Yes | Yes | Yes | Yes | Yes |
| AI conversion and verification | Yes | Yes | | | |
| Code deploy | Yes | Yes | Planned | Planned | |
| SSIS to dbt (deterministic) | Yes | Yes | Yes | Yes | Yes |
| Informatica to dbt (deterministic) | Yes | Yes | Yes | Yes | Yes |
| AI conversion of SSIS to dbt | Yes | Yes | Yes | Yes | Yes |
| AI conversion of Informatica to dbt | Yes | Yes | Yes | Yes | Yes |
| Cloud data migration | Yes | Yes | Planned | Planned | |
| Cloud data validation | Yes | Yes | Yes | Planned | |
| Testing framework | Yes | Yes | Planned | Planned | |
| Testing with synthetic data | Planned | Planned | | | |
| AI assessment (via Cortex Code) | Yes | Yes | Yes | Planned | |
| Power BI report repointing | Yes | Yes | Yes | Yes | Synapse and PostgreSQL only |

Other dialects with deterministic conversion support: Azure Synapse, Sybase IQ, Google BigQuery, Greenplum, Netezza, PostgreSQL, Spark SQL, Databricks SQL, Vertica, Hive, IBM DB2

Available extraction scripts: Teradata, SQL Server, Synapse, Oracle, Redshift, Netezza, Vertica, DB2, Hive, BigQuery, Databricks, Sybase IQ


Prerequisites

Before starting a migration, ensure you have:

  • Cortex Code installed (the migration skill is bundled).

  • A Snowflake account with a connection configured in ~/.snowflake/connections.toml.

  • A source database (SQL Server or Redshift) accessible from your machine.
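A minimal `~/.snowflake/connections.toml` entry looks like this (the connection name and all values below are placeholders to adapt to your account):

```toml
# ~/.snowflake/connections.toml
[migration]                      # connection name you give the skill during setup
account   = "myorg-myaccount"    # Snowflake account identifier
user      = "MIGRATION_USER"
password  = "********"
role      = "SYSADMIN"
warehouse = "MIGRATION_WH"
database  = "MIGRATION_DB"
```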

All other dependencies (uv, scai, Python packages) are installed automatically when the skill runs for the first time.


Quick Start

1. Launch Cortex Code:

cortex

2. Start a migration:

migrate

Or describe what you want:

I want to migrate my SQL Server database to Snowflake

The migration-guide skill activates automatically, confirms plugin installation with you, registers the plugin, and walks you through the full migration workflow. If a project already exists, the agent picks up where you left off.


Migration workflow

The skill guides you through six stages. Each session starts by detecting your current progress and resuming automatically.

| Stage | Name | What happens |
| --- | --- | --- |
| 1 | Connect | Set up a connection to your source database |
| 2 | Init | Create a local migration project |
| 3 | Register | Extract DDL and code from the source, or import local .sql files |
| 4 | Convert | Translate source SQL to Snowflake-compatible SQL via SnowConvert |
| 5 | Assess | Generate an interactive report covering waves, exclusions, dynamic SQL, and ETL |
| 6 | Migrate | Deploy objects, migrate data, validate output, and fix errors — wave by wave |

Every session shows a live progress checklist so you always know where you stand:

✅  1. Connect             — Connected to SQL Server
✅  2. Init                — Project initialized
✅  3. Register            — 342 objects registered
◐   4. Initial Conv        — 280/342 converted
⬚   5. Assess              — Not run
⬚   6. Migrate Objects     — 0/120 tables deployed

Skills Reference

The skill is organized as a skill tree. The root skill detects your project state and delegates to the right sub-skill. You can also invoke any skill directly by describing what you want.

Setup Skills (Stages 1–5)

connection

Walks you through connecting to your source database. The agent collects credentials, tests the connection, and saves it for reuse across sessions. Supports:

  • SQL Server — configures ODBC driver, host, port, and authentication.

  • Amazon Redshift — configures host, port, database, and IAM or password authentication.

register-code-units

Gets source code into the migration project. Two paths are available:

| Path | When to use |
| --- | --- |
| Extract from database | You have a live source connection and want the agent to pull DDL and object code directly |
| Import local files | You already have .sql files on disk and want to import them into the project |

convert

Runs SnowConvert to translate your source SQL (T-SQL or Redshift SQL) into Snowflake-compatible SQL. After conversion, the agent presents:

  • Total objects converted successfully.

  • EWI (Early Warning Issue) summary broken down by severity (errors, warnings, informational).

  • A list of objects that require manual review.

assessment

Generates an interactive multi-tab HTML report. The assessment includes four analyses that can be run individually or together:

| Analysis | What it does |
| --- | --- |
| Deployment Waves | Analyzes object dependencies to produce an ordered deployment sequence. Objects within a wave have no inter-dependencies; waves are ordered so dependencies are always deployed first. |
| Object Exclusion | Identifies objects that do not need migration: temporary tables, staging objects, deprecated objects, and test artifacts. Reduces scope before deployment. |
| Dynamic SQL Analysis | Classifies and scores dynamic SQL patterns in your converted code. Identifies patterns that Snowflake handles natively, patterns requiring manual rewrite, and patterns with elevated migration complexity. |
| ETL/SSIS Assessment | Analyzes SSIS packages individually: classifies each package (Ingestion, Transformation, Export, Orchestration, Hybrid), maps control and data flow, and estimates migration effort. |

The report is generated as a single self-contained HTML file. You can iterate on the wave plan interactively — for example, reprioritizing objects, adjusting wave sizes, or relocating specific objects — before locking it for deployment.
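The wave ordering described above is essentially a layered topological sort over the dependency graph. A minimal sketch of the idea (hypothetical helper, not the skill's actual implementation):

```python
def deployment_waves(deps):
    """Group objects into waves: deps maps each object to the set of
    objects it depends on. Everything in a wave depends only on earlier waves."""
    remaining = {obj: set(d) for obj, d in deps.items()}
    waves = []
    while remaining:
        # Objects with no undeployed dependencies form the next wave.
        ready = sorted(obj for obj, d in remaining.items() if not d)
        if not ready:
            raise ValueError("dependency cycle detected")
        waves.append(ready)
        for obj in ready:
            del remaining[obj]
        for d in remaining.values():
            d.difference_update(ready)
    return waves
```

For example, a view over a table lands in the wave after the table, and a procedure reading that view lands one wave later still.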

Migration Skills (Stage 6)

migrate-objects

The main deploy loop. Processes all objects in the current wave in dependency order:

| Object type | What happens |
| --- | --- |
| Tables | Deployed to Snowflake, then data is migrated from the source. |
| Views | Deployed to Snowflake. Blocked views are retried after the functions/procedures they depend on pass. |
| Functions & Procedures | Deployed, tested against source output, and fixed if tests fail. The loop repeats until tests pass or the user decides to skip. |
After each wave completes, the agent automatically advances to the next wave.
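The row-count validation that follows each table's data migration amounts to comparing per-table counts captured on both sides. A simple sketch of that check (the skill's actual validation may be richer):

```python
def validate_rowcounts(counts_source, counts_snowflake):
    """Compare per-table row counts from the source and from Snowflake.
    Returns {table: (source_count, snowflake_count)} for every mismatch;
    a table missing on either side also counts as a mismatch (None)."""
    mismatches = {}
    for table in counts_source.keys() | counts_snowflake.keys():
        src = counts_source.get(table)
        dst = counts_snowflake.get(table)
        if src != dst:
            mismatches[table] = (src, dst)
    return mismatches
```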

baseline-capture

Captures the expected output of source stored procedures and functions for use as test baselines. Two approaches are supported:

| Approach | When to use |
| --- | --- |
| Query Logs | You have CSV logs of real EXEC or CALL statements from your source system. The agent parses these to extract parameters and expected outputs. |
| AI-Assisted | No logs are available. A swarm of specialized agents generates test cases covering business logic, data-driven scenarios, and edge cases by analyzing the source SQL. |

Baselines are stored locally and uploaded to Snowflake so they can be used for two-sided validation (source output vs. Snowflake output) during the migrate-objects loop.
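Conceptually, the two-sided check asks whether the baseline result set and the Snowflake result set agree. A minimal sketch of such a comparison, assuming rows arrive as tuples and ignoring row order and float noise (hypothetical helpers, not the skill's internals):

```python
def normalize(rows, ndigits=9):
    """Sort rows and round floats so incidental ordering and precision
    differences do not count as mismatches."""
    return sorted(
        tuple(round(v, ndigits) if isinstance(v, float) else v for v in row)
        for row in rows
    )

def outputs_match(source_rows, snowflake_rows, ndigits=9):
    """True when the two result sets agree up to row order and float precision."""
    return normalize(source_rows, ndigits) == normalize(snowflake_rows, ndigits)
```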

rule-engine

Manages reusable migration rules stored in Snowflake. Rules encode known source-to-Snowflake fix patterns and are shared across all objects in the project. Each rule can operate in two modes:

| Mode | How it works |
| --- | --- |
| Regex | A regex find-and-replace applied mechanically to SQL files |
| AI-guided | The rule provides context and strategy; the AI interprets and applies it |
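A regex-mode rule is just a stored pattern and replacement applied mechanically. The sketch below shows the idea with a hypothetical rule record; the schema the skill actually stores in Snowflake may differ:

```python
import re

# Hypothetical rule record: rewrite T-SQL GETDATE() to Snowflake CURRENT_TIMESTAMP().
RULE = {
    "name": "getdate-to-current_timestamp",
    "pattern": r"\bGETDATE\s*\(\s*\)",
    "replacement": "CURRENT_TIMESTAMP()",
}

def apply_regex_rule(sql, rule):
    """Mechanically apply one regex-mode rule to a SQL string."""
    return re.sub(rule["pattern"], rule["replacement"], sql, flags=re.IGNORECASE)
```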

The rule engine has four sub-capabilities:

| Sub-skill | What it does |
| --- | --- |
| search | Scans a SQL file against all rules using regex pattern matching and Cortex semantic search. Returns matched rules ranked by relevance. |
| apply | Applies matched rules to local SQL files. Regex rules are applied automatically; AI-mode rules are shown for review before applying. Supports single-file and batch application. |
| extract | Creates a new reusable rule from a fix you just made. Works from an interactive before/after comparison or retroactively from git history. |
| propagate | Given a rule, finds every code unit in the project it applies to (via reverse regex + semantic search), then hands off to batch apply. |

Rules accumulate over the lifetime of the project. Every time the agent fixes an object and extracts a rule, that rule becomes available to all subsequent objects — reducing manual effort as the migration progresses.


What You Can Ask

You do not need to follow the prescribed path. You can ask for any capability at any time.

Status and navigation

| Prompt | What happens |
| --- | --- |
| "What is the current state?" | Shows the progress checklist |
| "What should I work on next?" | Returns the next dependency-ready object |
| "Continue" | Picks up the prescribed migration path |

Setup

"Connect to my SQL Server database"
"Extract objects from the source"
"Import SQL files from ./my-scripts/"
"Convert my source code"

Assessment

"Run a full assessment"
"Generate deployment waves"
"I want a maximum of 30 objects per wave"
"Prioritize all Payroll objects in Wave 1"
"Identify temporary and staging objects"
"Analyze dynamic SQL patterns"
"Assess my SSIS packages"

Migration

"Deploy tables"
"Migrate data"
"Deploy and test the next function"
"Capture baselines for dbo.GetCustomerOrders"

Rule engine

"Search rules for this file"
"Apply all matched rules"
"Extract a rule from my last fix"
"Propagate this rule across the project"
"Show me all rules"

Example Workload

You can use AdventureWorksDW as an example source database to try the skill end-to-end. Substitute any SQL Server or Redshift database you have access to — the skill adapts to whatever source you connect.


Troubleshooting

| Problem | Resolution |
| --- | --- |
| migration-guide skill doesn't appear | Ensure you're on the latest version of Cortex Code. Run cortex --version to confirm. |
| Plugin installation fails during skill setup | Check that python3 is available on your PATH. The skill installs dependencies via Homebrew (macOS/Linux) or winget (Windows). |
| scai not found after skill install | Run the install hook manually: migration-plugin/hooks/install-dependencies. Or install directly: brew install --cask snowflakedb/snowconvert-ai/snowconvert-ai. |
| Snowflake connection errors | Verify your connection in ~/.snowflake/connections.toml and confirm the connection name matches what you provided during setup. |
| Agent seems lost | Say "What is the current state?" — the agent re-reads project status and resets context. |


Support

For help with the migration skill, contact: snowconvert-support@snowflake.com