What is SnowConvert CLI?¶
SnowConvert AI CLI (scai) encapsulates all SnowConvert functions into a single command line tool dedicated to increasing the speed of migrations from various source platforms into Snowflake.
With the SnowConvert AI CLI, migration engineers can:
- Extract code from their source platform
- Run a deterministic conversion on that code
- Further advance their migration using AI conversion to cover objects that the deterministic engine could not translate
- Deploy that code to Snowflake
- Migrate data from the source system to Snowflake
- Validate that data between the two systems
The CLI will also allow developers to create skills and agents that utilize the tool to automate their process.
Prerequisites¶
- macOS, Windows, or Linux
- Snowflake CLI: recommended for Snowflake connection configuration (see the Snowflake CLI install guide)
- A source database to extract from, or a set of code to use
Snowflake Connection Setup¶
The SnowConvert AI CLI (scai) reuses your Snowflake CLI connection configuration. The connection is used for functionality in ai-convert, deploy, and the cloud versions of data migration and validation. Your Snowflake account also authenticates you for SnowConvert, removing the need for the access code required in prior versions.
Snowflake Account Requirement
Before using scai init, scai code convert, scai code extract, scai ai-convert, or scai code deploy, ensure that you:
- Can connect to Snowflake with snow connection test
- Have a default connection configured in your Snowflake CLI (this is used when you don't specify a connection name)
To configure a Snowflake connection:
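A minimal setup with the Snowflake CLI looks like this (snow connection add prompts interactively for account, user, and authentication details):

```shell
snow connection add      # create a connection (interactive prompts)
snow connection test     # verify the default connection works
```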
Once this is configured, commands needing a connection to Snowflake will use your Snowflake CLI connection automatically.
Installation¶
Homebrew Installation
If you do not have Homebrew installed, follow the installation instructions at brew.sh.
There are two public channels for builds, Preview and GA.
Stable Version (recommended)
Install the stable production (GA) release:
Preview Version
Install the Preview (pr) version with pre-release features from the beta/staging environment:
Usage
After installation, you can use the SnowConvert CLI:
Managing Installations
View the installed version
Switch between versions
Update to latest version
Important: you must run brew update first to sync the tap with the latest cask definitions:
Why both commands? brew update synchronizes your local tap with the latest cask definitions from GitHub. Without it, brew upgrade won’t see new versions even if they exist on the server.
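Assuming the cask name you used at install time (substitute the actual name from the tap), the update sequence looks like:

```shell
brew update                       # sync the tap with the latest cask definitions
brew upgrade --cask <cask-name>   # then upgrade to the newest version
```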
Installer Packages:¶
GA Releases

| OS | Installer |
|---|---|
| macOS |  |
| macOS |  |
| Linux |  |
| Linux |  |
| Linux |  |
| Linux |  |
| Linux |  |
| Linux |  |
| Windows |  |
| Windows |  |
Preview Releases

| OS | Installer |
|---|---|
| macOS |  |
| macOS |  |
| Linux |  |
| Linux |  |
| Linux |  |
| Linux |  |
| Linux |  |
| Linux |  |
| Windows |  |
| Windows |  |
Accept Terms and Conditions¶
Issuing the following command displays the license terms for the SnowConvert AI CLI. You must accept them before you can use the product.
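If the terms command follows the scai terms form listed in the CLI reference, accepting them would look like this (the exact subcommand is an assumption; run scai terms --help to confirm):

```shell
scai terms accept   # assumption: 'accept' subcommand of 'scai terms'
```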
Understanding Projects¶
A project is required before you can use any other scai command. This is similar to how Git requires you to run git init before using other Git commands.
A project:
- Organizes your migration work in a dedicated folder structure
- Tracks your source dialect (Oracle, SQL Server, Teradata, etc.)
- Stores configuration, source code, converted code, and reports
When you run scai init, it creates this folder structure in the target directory.
If you pass a PATH, scai will create that folder (if it doesn’t exist) and initialize the project inside it.
If you omit PATH, scai initializes the current directory (which must be empty).
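The two forms side by side:

```shell
scai init my-migration   # creates ./my-migration (if needed) and initializes it
# or, from inside an empty directory:
scai init
```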
The project uses the following folder structure:
Important: after creating a project, run all subsequent scai commands from within the project folder (where the .scai directory is).
CLI logs are written to ~/.scai/logs/jobs.log by default.
Quick Start: Code Conversion Only¶
Use this workflow when you have existing SQL files to convert. Works with all supported dialects.
[!IMPORTANT] You must already be in an empty directory to create a project.
1. Create a project folder and initialize it (the project name is inferred from the folder name)
2. Add your source code
3. Convert to Snowflake SQL
4. Deploy to Snowflake
Your converted code will be in the snowflake/ folder, and conversion reports in reports/.
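The four steps can be sketched as commands; the path argument to scai code add is an assumption (run scai code add --help for the exact form):

```shell
mkdir my-migration && cd my-migration
scai init                          # 1. initialize (name inferred from folder)
scai code add /path/to/sql-files   # 2. add source code (path argument assumed)
scai code convert                  # 3. convert to Snowflake SQL
scai code deploy                   # 4. deploy to Snowflake
```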
Quick Start: End-to-End Migration¶
Use this workflow to extract code directly from your source database. Only available for SQL Server and Redshift.
Step 1: Create Project
Create a project folder and initialize it (the project name is inferred from the folder name).
Step 2: Configure Source Connection
The interactive mode will prompt you for connection details.
Set a default source connection (used when scai code extract runs without --source-connection)
Or
Step 3: Extract, Convert, Deploy
- Extract code from the source database
- Convert to Snowflake SQL
- Deploy to Snowflake
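These three steps, as commands:

```shell
scai code extract    # extract code from the source database
scai code convert    # convert to Snowflake SQL
scai code deploy     # deploy to Snowflake
```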
Filtering Objects with the --where Clause¶
Many scai commands operate on the Code Unit Registry – a local index of every code unit (table, view, procedure, function, etc.) in your project. The --where flag lets you filter which objects a command acts on, using a SQL-like expression against that registry.
This section covers how the WHERE clause works, which commands support it, and how to use it effectively.
How the Code Unit Registry works¶
- When you run scai code add or scai code extract, source code is added to your project and split into individual code units.
- When you run scai code convert, the CLI builds a registry entry for every code unit, tracking its source/target metadata, object type, and conversion status.
- When you pass --where to a supported command, the CLI queries that registry and applies the operation only to matching objects.

The registry must exist before --where can be used. If you haven't run at least scai code add (or scai code extract) yet, --where will fail with a "registry not found" error.
Discovering queryable fields: scai code where¶
Run scai code where to see the full, up-to-date reference of all queryable fields, supported operators, and usage examples. The output is generated from the actual registry library, so it is always current.
The most commonly used fields (all field names are camelCase, all enum values are lowercase):
| Field | Description | Example values |
|---|---|---|
|  | Object name in the source database |  |
|  | Object type in source |  |
|  | Source database name |  |
|  | Source schema name |  |
|  | Object name in Snowflake |  |
|  | Object type in Snowflake |  |
|  | Target database name |  |
|  | Target schema name |  |
|  | Conversion result |  |
|  | Registration/extraction result |  |
|  | AI verification result |  |
Previewing results: scai code find¶
Before running a destructive or long-running operation, use scai code find to test your filter and see which objects match:
scai code find accepts the same --where syntax as all other commands that support it. Use it as a dry-run before committing to an operation.
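A minimal dry-run sketch; the field name is illustrative (run scai code where for the actual field names):

```shell
scai code find --where "sourceType = 'procedure'"
```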
Commands that support --where¶
The following table summarizes every scai command that accepts the --where flag:
| Command | Purpose | Notes |
|---|---|---|
| scai code find | Preview/query code units in the registry | Primary tool for testing filters before using them elsewhere |
| scai code convert | Convert source code to Snowflake SQL | Only matched units are transformed; dependencies are still parsed for symbol resolution |
| scai code deploy | Deploy converted code to Snowflake | Also supports --include-dependencies |
| scai code accept | Accept latest artifact versions into the snowflake folder |  |
| scai ai-convert start | Start AI-powered conversion improvement | Cannot be combined with --selector |
| scai ai-convert accept | Accept AI-suggested fixes | Filters which suggested fixes to review/accept |
| scai data migrate | Migrate data from source to Snowflake | Full migration projects only |
| scai data validate | Validate data between source and Snowflake | Full migration projects only |
Each of these commands uses the same WHERE clause syntax. The recommended workflow is to test your filter with scai code find first, then run the actual operation with the same --where expression.
scai code find --where¶
Query the Code Unit Registry and display matching objects. This is the safest way to test a filter before using it with a command that makes changes.
| Flag | Description |
|---|---|
| --where | SQL-like WHERE clause to filter objects |
|  | Show all results (default limit is 100) |
Examples:
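Sketches with illustrative field names (run scai code where for the actual names):

```shell
scai code find --where "sourceSchema = 'dbo'"
scai code find --where "sourceType = 'table' AND sourceSchema = 'sales'"
```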
scai code convert --where¶
Convert only a filtered subset of code units to Snowflake SQL. Objects that don’t match the filter are still parsed for dependency and symbol resolution, but only matched units produce converted output.
| Flag | Description |
|---|---|
| --where | SQL-like filter to select which code units to convert |
|  | Overwrite existing output files in the snowflake/ directory |
|  | Show detailed EWI table instead of summary |
Examples:
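A sketch using an illustrative field name (run scai code where for the actual names):

```shell
scai code convert --where "sourceType = 'view'"
```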
Note: Even when filtering, the converter still parses all source files for symbol resolution. This ensures that cross-object references (e.g., a procedure referencing a table) are resolved correctly, even if the referenced object is not in the --where filter.
scai code deploy --where¶
Deploy a filtered subset of converted objects to Snowflake, instead of deploying everything.
| Flag | Description |
|---|---|
| --where | SQL-like WHERE clause to filter objects to deploy |
| --include-dependencies | Also deploy the dependencies of the filtered code units. Has no effect without --where |
|  | Snowflake connection to use |
|  | Target database name for deployment |
|  | Deploy all successfully converted objects without selection prompt |
|  | Number of retry attempts for failed deployments (default: 1) |
|  | Continue deploying remaining objects even if some fail (default: True) |
|  | Warehouse override (in-memory only, applied if connection has none) |
|  | Schema override (in-memory only) |
|  | Role override (in-memory only) |
Examples:
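Sketches with illustrative field names (run scai code where for the actual names):

```shell
scai code deploy --where "sourceType = 'table'"
scai code deploy --where "sourceType = 'procedure'" --include-dependencies
```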
Why --include-dependencies matters: When you filter with --where, you may select procedures that depend on tables or views. Without --include-dependencies, those dependent objects won’t be deployed, and the procedures may fail at runtime. Use this flag to automatically pull in everything the filtered objects need.
scai ai-convert start --where¶
Send a filtered subset of objects for AI-powered conversion improvement, instead of processing everything.
| Flag | Description |
|---|---|
| --where | SQL-like WHERE clause to filter objects. Requires the code unit registry. |
|  | Snowflake connection to use |
| --selector | Path to object selector file (code-conversion-only projects). Cannot be combined with --where |
|  | Path to instructions file |
|  | Wait for completion and show progress |
|  | Skip disclaimer prompt |
|  | Warehouse override (in-memory only) |
|  | Schema override (in-memory only) |
|  | Role override (in-memory only) |
|  | Database override (in-memory only) |
Examples:
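A sketch with an illustrative field name (run scai code where for the actual names):

```shell
scai ai-convert start --where "sourceSchema = 'dbo'"
```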
--where and --selector cannot be combined. Use --selector when you have a selector file for a code-conversion-only project. Use --where for expressive filtering by type, schema, status, or any other registry field.
scai code accept --where¶
Accept the latest artifact versions into the snowflake output folder for a filtered subset of objects. Without --where, all objects are accepted.
| Flag | Description |
|---|---|
| --where | Filter expression to select which objects to accept |
Examples:
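A sketch with an illustrative field name (run scai code where for the actual names):

```shell
scai code accept --where "sourceType = 'procedure'"
```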
scai ai-convert accept --where¶
Review and accept AI-suggested fixes for a filtered subset of objects, instead of reviewing everything.
| Flag | Description |
|---|---|
| --where | SQL-like WHERE clause to filter which suggested fixes to accept. Full migration projects only. |
|  | Accept all matching AI-suggested fixes without prompting |
|  | Review each matching code unit one by one |
|  | Preview affected code units without making changes (default) |
|  | Output results in JSON format (for automation) |
Examples:
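A sketch with an illustrative field name (run scai code where for the actual names):

```shell
scai ai-convert accept --where "sourceSchema = 'dbo'"
```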
scai data migrate --where¶
Migrate data from the source system to Snowflake for a filtered subset of tables. Available only for full migration projects (SQL Server, Redshift).
| Flag | Description |
|---|---|
| --where | SQL-like WHERE clause to filter tables from the Code Unit Registry |
|  | Source connection to extract data from |
|  | Snowflake connection to migrate data to |
Examples:
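A sketch with an illustrative field name (run scai code where for the actual names):

```shell
scai data migrate --where "sourceSchema = 'sales'"
```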
Note: --where and --selector serve the same purpose (filtering tables) but for different project types. Use --where for full migration projects that have a Code Unit Registry. Use --selector for code-conversion-only projects.
scai data validate --where¶
Compare data between source and Snowflake for a filtered subset of tables. Available only for full migration projects (SQL Server, Redshift).
| Flag | Description |
|---|---|
| --where | SQL-like WHERE clause to filter tables from the Code Unit Registry |
|  | Source connection for validation |
|  | Snowflake connection for validation |
|  | Target Snowflake database for validation |
|  | Database name mapping |
|  | Schema name mapping |
Examples:
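A sketch with an illustrative field name (run scai code where for the actual names):

```shell
scai data validate --where "sourceSchema = 'sales'"
```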
Common --where scenarios¶
Re-process objects that failed conversion¶
After running scai code convert, some objects might have failed. Target just those for AI conversion:
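A sketch; the status field and enum value are illustrative (run scai code where for the actual names):

```shell
scai ai-convert start --where "conversionStatus = 'failure'"
```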
Run AI conversion on objects that haven’t been AI-verified yet¶
If you’ve already run AI conversion on some objects but not others, target the ones still pending:
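A sketch; the AI-verification field and value are illustrative (run scai code where for the actual names):

```shell
scai ai-convert start --where "aiVerificationStatus = 'pending'"
```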
Focus on a specific object type in a specific schema¶
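For example, previewing all views in a single schema (illustrative field names):

```shell
scai code find --where "sourceType = 'view' AND sourceSchema = 'reporting'"
```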
Deploy only tables, then only procedures with dependencies¶
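A two-pass sketch: tables first, then procedures with their dependencies pulled in (illustrative field names):

```shell
scai code deploy --where "sourceType = 'table'"
scai code deploy --where "sourceType = 'procedure'" --include-dependencies
```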
Incremental deployment of a single schema¶
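A sketch restricting deployment to one schema, with dependencies included (illustrative field name):

```shell
scai code deploy --where "sourceSchema = 'hr'" --include-dependencies
```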
Things to keep in mind¶
- --where and --selector can't be combined (on scai ai-convert start and scai ai-convert accept). Use --selector for code-conversion-only projects with a short list, or --where for full migration projects with expressive filtering.
- --where and --selector serve the same purpose on scai data migrate and scai data validate. Use --where for full migration projects; use --selector for code-conversion-only projects.
- The registry must exist. You need to have run at least scai code add or scai code extract before --where will work. Otherwise you'll get a "registry not found" error.
- Use scai code find to preview. Always test your filter with scai code find --where "..." before running a deployment or AI conversion job.
- scai code where is the definitive reference. The field list in this document covers the most common fields. Run scai code where for the full, always-up-to-date list of fields, operators, and examples.
AI Convert Quick Guide¶
Use scai ai-convert to improve your converted code with AI. It uploads your converted SQL to Snowflake, analyzes it for functional equivalence issues, generates regression tests, and produces improved code.
Supported languages: SQL Server, Redshift, BigQuery, PostgreSQL
Before you start¶
- A project initialized with scai init
- Code already converted with scai code convert
- A Snowflake connection configured via snow connection add
- CREATE MIGRATION privilege on your Snowflake account
- A warehouse set in the connection
Basic usage¶
For filtering objects with --where, see the Filtering Objects with the --where Clause section above.
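With the prerequisites above in place, a minimal run processes all converted objects:

```shell
scai ai-convert start
```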
Managing jobs¶
Once a job is running, several commands are available to monitor and manage it:
Accepting AI fixes¶
After a job completes, review what the AI suggested and decide what to keep:
Output structure¶
Results are written to the ai-converted/ directory inside your project:
All ai-convert start options¶
| Flag | Short | Description |
|---|---|---|
|  |  | Snowflake connection to use |
|  |  | Comma-separated object names, or |
| --where |  | SQL-like WHERE clause to filter objects (see Filtering Objects) |
|  |  | Path to instructions file for custom config |
|  |  | Wait for completion and show progress |
|  |  | Skip disclaimer prompt |
|  |  | Override warehouse (in-memory only) |
|  |  | Override schema (in-memory only) |
|  |  | Override role (in-memory only) |
|  |  | Override database (in-memory only) |
Workflow Examples¶
Example 1: Migrate Oracle Stored Procedures¶
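A possible sequence, assuming your Oracle DDL is already exported to files and using illustrative --where field names (run scai code where for the actual names):

```shell
mkdir oracle-migration && cd oracle-migration
scai init
# copy your Oracle DDL into the project's source folder, then:
scai code convert --where "sourceType = 'procedure'"
scai code deploy --where "sourceType = 'procedure'" --include-dependencies
```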
Example 2: SQL Server End-to-End with Specific Schema¶
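A possible end-to-end sequence for SQL Server, filtered to one schema (illustrative field name):

```shell
scai init sqlserver-migration && cd sqlserver-migration
scai connection add-sql-server
scai code extract
scai code convert --where "sourceSchema = 'dbo'"
scai code deploy --where "sourceSchema = 'dbo'" --include-dependencies
```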
Example 3: AI Convert After Code Conversion¶
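A possible sequence targeting only objects the deterministic converter could not handle; the status field and value are illustrative:

```shell
scai code convert
scai ai-convert start --where "conversionStatus = 'failure'"
scai ai-convert accept
```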
Example 4: Selective Migration Using --where¶
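A possible selective data workflow for a full migration project (illustrative field name):

```shell
scai data migrate --where "sourceSchema = 'sales'"
scai data validate --where "sourceSchema = 'sales'"
```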
Getting Help¶
Use --help with any command to see available options:
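--help works at every level of the command tree:

```shell
scai --help
scai code --help
scai code convert --help
```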
Troubleshooting¶
“Project file not found” You must run commands from within a project directory. Navigate to your project folder (where the .scai/ directory exists) before running commands:
“Connection not found” (source database)
- List your connections: scai connection list -l <language>
- Add a connection if needed: scai connection add-sql-server or scai connection add-redshift
- Or set a default: scai connection set-default -l <language> --connection-name <name>
“Authentication failed” for Snowflake
The SCAI CLI uses your Snowflake CLI configuration. Ensure your connection is working:
Make sure you have a default Snowflake connection configured in the Snowflake CLI (used when no connection name is specified).
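To check the Snowflake CLI side directly:

```shell
snow connection list   # show configured connections
snow connection test   # test the default connection
```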
“Registry not found” when using --where
The --where flag requires a Code Unit Registry, which is created when you run scai code add or scai code extract. Make sure you’ve run one of those commands before using --where:
Supported Source Dialects¶
| Dialect | Extract | Convert | Deploy |
|---|---|---|---|
| SQL Server | X | X | X |
| Redshift | X | X | X |
| Oracle |  | X |  |
| Teradata |  | X |  |
| BigQuery |  | X |  |
| Databricks |  | X |  |
| Greenplum |  | X |  |
| Sybase |  | X |  |
| PostgreSQL |  | X |  |
| Netezza |  | X |  |
| Spark |  | X |  |
| Vertica |  | X |  |
| Hive |  | X |  |
| DB2 |  | X |  |
Complete CLI Reference¶
For quick reference, here is every top-level command and subcommand available in the scai CLI:
| Command | Subcommands | Description |
|---|---|---|
| init |  | Create a new migration project |
|  |  | View and manage project configuration |
| connection |  | Manage source database connections |
| code |  | Manage code migration operations |
| ai-convert |  | AI-powered conversion improvement |
| data |  | Data migration and validation |
|  |  | Generate and run test cases for stored procedures |
|  |  | Generate selector files for filtering objects |
|  |  | Execute SQL queries on source database systems |
|  |  | Manage offline license operations |
|  |  | View and accept terms and conditions |
|  |  | Show log directory and recent log files |