Snowflake Migration Plugin¶
The Snowflake Migration Plugin is an AI-powered plugin for Cortex Code that guides you through an end-to-end database migration to Snowflake. It provides a conversational, interactive workflow — from connecting to your source database through code conversion, deployment, data migration, and validation.
The plugin is organized as a skill tree: a hierarchy of AI instructions that detect where you are in the migration lifecycle and route you to the right action automatically. You can follow the prescribed path from start to finish, or jump directly to any capability at any time.
Supported sources: SQL Server, Amazon Redshift
Download¶
Download the latest release from:
Prerequisites¶
Before starting a migration, ensure you have:
- A Snowflake account with a connection configured in `~/.snowflake/connections.toml`.
- A source database (SQL Server or Redshift) accessible from your machine.
- Cortex Code installed.
All other dependencies (uv, scai, Python packages) are installed automatically on first launch.
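A minimal `connections.toml` entry looks roughly like this — the connection name, account identifier, and credentials below are placeholders, not values the plugin requires:

```toml
# ~/.snowflake/connections.toml — placeholder values, substitute your own
[default]
account = "myorg-myaccount"
user = "migration_user"
password = "********"
role = "SYSADMIN"
warehouse = "MIGRATION_WH"
```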
Quick Start¶
1. Launch Cortex Code with the plugin:
2. Start migrating:
The agent reads your project state, shows a progress checklist, and picks up where you left off. If no project exists, it starts from the beginning.
Tip: You can also combine the plugin with a profile:
Migration Workflow¶
The plugin guides you through six stages. At every session start, the agent detects your current progress and resumes from where you left off.
| Stage | Name | What happens |
|---|---|---|
| 1 | Connect | Set up a connection to your source database (SQL Server or Redshift) |
| 2 | Init | Create a local migration project |
| 3 | Register | Extract DDL and code from the source database, or import local files |
| 4 | Convert | Translate source SQL to Snowflake-compatible SQL via SnowConvert |
| 5 | Assess | Generate an interactive assessment report covering deployment waves, object exclusions, dynamic SQL patterns, and ETL/SSIS analysis |
| 6 | Migrate | Deploy objects, migrate data, validate output, and fix errors — wave by wave |
Each time you start a session, the agent presents a live progress checklist:
Skills Reference¶
The plugin is organized as a skill tree. The root skill detects your project state and delegates to the right sub-skill. You can also invoke any skill directly by describing what you want.
Setup Skills (Stages 1–5)¶
connection¶
Walks you through connecting to your source database. The agent collects credentials, tests the connection, and saves it for reuse across sessions. Supports:
- SQL Server — configures ODBC driver, host, port, and authentication.
- Amazon Redshift — configures host, port, database, and IAM or password authentication.
register-code-units¶
Gets source code into the migration project. Two paths are available:
| Path | When to use |
|---|---|
| Extract from database | You have a live source connection and want the agent to pull DDL and object code directly |
| Import local files | You already have source SQL files on disk and want the agent to import them |
convert¶
Runs SnowConvert to translate your source SQL (T-SQL or Redshift SQL) into Snowflake-compatible SQL. After conversion, the agent presents:
- Total objects converted successfully.
- EWI (Early Warning Issue) summary broken down by severity (errors, warnings, informational).
- A list of objects that require manual review.
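As a sketch of what the severity breakdown involves, a conversion report can be reduced to a count per severity level. The EWI record shape and codes below are hypothetical, not SnowConvert's actual output format:

```python
from collections import Counter

def ewi_summary(ewis):
    """Tally EWIs (Early Warning Issues) by severity so the
    conversion report can surface errors first."""
    counts = Counter(e["severity"] for e in ewis)
    return {sev: counts.get(sev, 0) for sev in ("error", "warning", "info")}

# Hypothetical EWI records for illustration only.
summary = ewi_summary([
    {"code": "EWI-0001", "severity": "warning"},
    {"code": "EWI-0002", "severity": "error"},
    {"code": "EWI-0001", "severity": "warning"},
])
```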
assessment¶
Generates an interactive multi-tab HTML report. The assessment includes four analyses that can be run individually or together:
| Analysis | What it does |
|---|---|
| Deployment Waves | Analyzes object dependencies to produce an ordered deployment sequence. Objects within a wave have no inter-dependencies; waves are ordered so dependencies are always deployed first. |
| Object Exclusion | Identifies objects that do not need migration: temporary tables, staging objects, deprecated objects, and test artifacts. Reduces scope before deployment. |
| Dynamic SQL Analysis | Classifies and scores dynamic SQL patterns in your converted code. Identifies patterns that Snowflake handles natively, patterns requiring manual rewrite, and patterns with elevated migration complexity. |
| ETL/SSIS Assessment | Analyzes SSIS packages individually: classifies each package (Ingestion, Transformation, Export, Orchestration, Hybrid), maps control and data flow, and estimates migration effort. |
The report is generated as a single self-contained HTML file. You can iterate on the wave plan interactively — for example, reprioritizing objects, adjusting wave sizes, or relocating specific objects — before locking it for deployment.
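The wave computation described above is essentially a layered topological sort of the dependency graph. A minimal sketch, with illustrative function and object names rather than the plugin's actual implementation:

```python
def deployment_waves(dependencies):
    """Group objects into waves: objects in a wave share no
    inter-dependencies, and every dependency lands in an earlier wave.
    `dependencies` maps each object to the set of objects it depends on."""
    remaining = {obj: set(deps) for obj, deps in dependencies.items()}
    waves = []
    while remaining:
        # Objects whose dependencies are all already deployed form the next wave.
        ready = sorted(obj for obj, deps in remaining.items() if not deps)
        if not ready:
            raise ValueError("circular dependency detected")
        waves.append(ready)
        for obj in ready:
            del remaining[obj]
        for deps in remaining.values():
            deps.difference_update(ready)
    return waves

# Hypothetical objects: the fact table depends on both dimensions,
# and the view depends on the fact table.
waves = deployment_waves({
    "dim_customer": set(),
    "dim_product": set(),
    "fact_sales": {"dim_customer", "dim_product"},
    "vw_sales_summary": {"fact_sales"},
})
```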
Migration Skills (Stage 6)¶
migrate-objects¶
The main deploy loop. Processes all objects in the current wave in dependency order:
| Object type | What happens |
|---|---|
| Tables | Deployed to Snowflake, then data is migrated from the source. |
| Views | Deployed to Snowflake. Blocked views are retried after their dependent functions and procedures pass. |
| Functions & procedures | Deployed, tested against source output, and fixed if tests fail. The loop repeats until tests pass or the user decides to skip. |
After each wave completes, the agent automatically advances to the next wave.
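The deploy–test–fix cycle for functions and procedures can be sketched as a simple retry loop. The function names and retry limit below are illustrative assumptions, not the plugin's actual code:

```python
def deploy_with_retries(obj, deploy, run_tests, fix, max_attempts=3):
    """Deploy an object, validate it against its baseline, and apply
    fixes until tests pass or attempts run out (at which point the
    agent would ask the user whether to skip the object)."""
    for attempt in range(1, max_attempts + 1):
        deploy(obj)
        failures = run_tests(obj)
        if not failures:
            return {"object": obj, "status": "passed", "attempts": attempt}
        fix(obj, failures)
    return {"object": obj, "status": "needs_review", "attempts": max_attempts}

# Stub callbacks simulating an object that passes after one fix.
state = {"fixed": False}
result = deploy_with_retries(
    "usp_load_sales",
    deploy=lambda o: None,
    run_tests=lambda o: [] if state["fixed"] else ["row count mismatch"],
    fix=lambda o, failures: state.update(fixed=True),
)
```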
baseline-capture¶
Captures the expected output of source stored procedures and functions for use as test baselines. Two approaches are supported:
| Approach | When to use |
|---|---|
| Query Logs | You have CSV logs of real calls made against the source procedures and functions. |
| AI-Assisted | No logs are available. A swarm of specialized agents generates test cases covering business logic, data-driven scenarios, and edge cases by analyzing the source SQL. |
Baselines are stored locally and uploaded to Snowflake so they can be used for two-sided validation (source output vs. Snowflake output) during the migrate-objects loop.
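Two-sided validation boils down to comparing the baseline rows captured from the source against the rows the converted object produces on Snowflake. A simplified, order-insensitive comparison might look like this (illustrative only, not the plugin's validator):

```python
def two_sided_validate(source_rows, snowflake_rows):
    """Compare baseline output from the source against the converted
    object's output on Snowflake, ignoring row order."""
    def normalize(row):
        # Round floats to absorb representation noise between engines.
        return tuple(round(v, 9) if isinstance(v, float) else v for v in row)
    src = sorted(map(normalize, source_rows))
    snow = sorted(map(normalize, snowflake_rows))
    mismatches = [(a, b) for a, b in zip(src, snow) if a != b]
    return {"row_count_match": len(src) == len(snow), "mismatches": mismatches}

# Same rows in a different order should validate cleanly.
result = two_sided_validate(
    [("A", 10.0), ("B", 20.0)],
    [("B", 20.0), ("A", 10.0)],
)
```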
rule-engine¶
Manages reusable migration rules stored in Snowflake. Rules encode known source-to-Snowflake fix patterns and are shared across all objects in the project. Each rule can operate in two modes:
| Mode | How it works |
|---|---|
| Regex | A regex find-and-replace applied mechanically to SQL files |
| AI-guided | The rule provides context and strategy; the AI interprets and applies it |
The rule engine has four sub-capabilities:
| Sub-skill | What it does |
|---|---|
| search | Scans a SQL file against all rules using regex pattern matching and Cortex semantic search. Returns matched rules ranked by relevance. |
| apply | Applies matched rules to local SQL files. Regex rules are applied automatically; AI-mode rules are shown for review before applying. Supports single-file and batch application. |
| extract | Creates a new reusable rule from a fix you just made. Works from an interactive before/after comparison or retroactively from git history. |
| propagate | Given a rule, finds every code unit in the project it applies to (via reverse regex + semantic search), then hands off to batch apply. |
Rules accumulate over the lifetime of the project. Every time the agent fixes an object and extracts a rule, that rule becomes available to all subsequent objects — reducing manual effort as the migration progresses.
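A regex-mode rule is a mechanical find-and-replace. The sketch below shows how such rules might be applied in batch; the rule records and the specific T-SQL-to-Snowflake mappings are illustrative examples, not rules the plugin ships with:

```python
import re

# Hypothetical regex-mode rules; AI-guided rules would carry prose
# context and strategy instead of a pattern/replacement pair.
RULES = [
    {"name": "getdate-to-current_timestamp",
     "pattern": r"\bGETDATE\s*\(\s*\)",
     "replacement": "CURRENT_TIMESTAMP()"},
    {"name": "isnull-to-coalesce",
     "pattern": r"\bISNULL\s*\(",
     "replacement": "COALESCE("},
]

def apply_regex_rules(sql, rules):
    """Apply each regex rule mechanically; return the rewritten SQL
    plus the names of the rules that actually matched."""
    applied = []
    for rule in rules:
        new_sql, n = re.subn(rule["pattern"], rule["replacement"], sql,
                             flags=re.IGNORECASE)
        if n:
            applied.append(rule["name"])
            sql = new_sql
    return sql, applied

fixed, applied = apply_regex_rules(
    "SELECT ISNULL(updated_at, GETDATE()) FROM t", RULES
)
```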
What You Can Ask¶
You do not need to follow the prescribed path. You can ask for any capability at any time.
Setup¶
Assessment¶
Migration¶
Rule engine¶
Example Workload¶
You can use AdventureWorksDW as an example source database to try the plugin end-to-end. Substitute any SQL Server or Redshift database you have access to — the plugin adapts to whatever source you connect.
Troubleshooting¶
| Problem | Resolution |
|---|---|
| Plugin fails on startup | Verify the automatic dependency install (uv, scai, Python packages) completed; if not, run the install hook manually |
| Snowflake connection errors | Verify your connection in `~/.snowflake/connections.toml` |
| Agent seems lost | Ask it to resume the migration; the agent re-reads project state and shows the progress checklist |
Support¶
For help with the migration plugin, contact: snowconvert-support@snowflake.com