# SnowConvert AI (scai) Command Reference
This is the full command reference for the SnowConvert CLI.
## Quick Start

Basic workflow to get started:

1. Create a project (use `-c` to set the default Snowflake connection):
2. Add source code (E2E languages: SqlServer, Redshift):
3. Add source code (other languages):
4. Convert to Snowflake SQL:
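As a sketch, the quick-start steps could look like the following; the `--source-language` flag name and the paths are illustrative assumptions (only `-c` and the language names SqlServer/Redshift appear elsewhere in this reference), so check `scai --help` for the exact spelling:

```shell
# 1. Create a project with a default Snowflake connection
scai init my-migration --source-language SqlServer -c my_sf_connection

# 2. Add source code to the project's Source folder
scai code add /path/to/sql-scripts

# 3. Convert to Snowflake SQL
scai code convert
```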
## Global Options

| Option | Description |
|---|---|
| - | Show help message |
| - | Display version information |
## Commands

### scai init

Create a new migration project.

#### scai init

Create a new migration project in the specified directory (or the current directory if PATH is omitted).

Usage:

Prerequisites:

- Target directory must not contain an existing project
- Valid source language must be specified

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Optional directory to create the project in. If omitted, uses the current directory. | No | - |
| - | Project name. If omitted, defaults to the target folder name. | No | - |
| - | Source language for the project | Yes | - |
| - | Optional path to source code files to copy into the project's Source folder during initialization | No | - |
| `--git-flow` | Enable git workflow automation for iterative conversions | No | |
| - | Name of the baseline branch for conversion results (requires `--git-flow`) | No | |
| - | Enables state management for the project | No | |
| `-c` | Snowflake connection name to save as project default. Precedence: `-c` option > project connection > default TOML connection. | No | - |
Examples:
Create a project in a new folder (recommended):
Create a project in the current directory:
Create project with source code:
Create project with a specific connection:
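The four examples above could look roughly like this; the `--source-language` and `--source` flag names are illustrative assumptions, while `-c` is documented above:

```shell
# Create a project in a new folder (recommended)
scai init my-project --source-language SqlServer

# Create a project in the current directory
scai init --source-language SqlServer

# Create a project with source code copied in during initialization
scai init my-project --source-language SqlServer --source /path/to/code

# Create a project with a specific default Snowflake connection
scai init my-project --source-language SqlServer -c prod_conn
```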
### scai ai-convert

AI-powered code improvement and test generation.

#### scai ai-convert cancel

Cancel a running AI code conversion job.

Usage:

Prerequisites:

- A running job started with `scai ai-convert start`
- A Snowflake connection (uses the job's connection if `--connection` is not specified)

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | The job ID to cancel. If omitted, cancels the last started job. | No | - |
| `--connection` | Override the Snowflake connection. By default, uses the connection saved when the job was started. | No | - |
Examples:
Cancel last job:
Cancel specific job:
Use different connection:
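A sketch of these examples; passing the job ID positionally is an assumption, while `--connection` is documented above:

```shell
# Cancel the last started job
scai ai-convert cancel

# Cancel a specific job (ID shown is illustrative)
scai ai-convert cancel JOB_12345

# Use a different connection
scai ai-convert cancel --connection other_conn
```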
#### scai ai-convert list

List AI code conversion jobs for the current project.

Usage:

Prerequisites:

- A migration project initialized with `scai init`
- A Snowflake connection for refreshing job status

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Maximum number of jobs to display | No | |
| - | Show all jobs (ignores the limit) | No | - |
| - | Override the Snowflake connection for refreshing job status. By default, uses the connection saved when the job was started. | No | - |
Examples:
List recent jobs:
Show all jobs:
Refresh with different connection:
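A sketch of these examples; the `--all` and `--connection` flag names are assumptions for this subcommand:

```shell
# List recent jobs
scai ai-convert list

# Show all jobs (illustrative flag)
scai ai-convert list --all

# Refresh status using a different connection (illustrative flag)
scai ai-convert list --connection other_conn
```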
#### scai ai-convert start

Start AI-powered code conversion on converted code.

Usage:

Prerequisites:

- Code converted with `scai code convert` (generates the TopLevelCodeUnits report)
- A Snowflake connection configured with `snow connection add`
- The CREATE MIGRATION privilege granted on the Snowflake account
- A warehouse configured in the Snowflake connection
- Must accept AI disclaimers (interactive prompt or the `-y` flag)

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Name of the Snowflake connection to use for AI code conversion | No | - |
| - | Comma-separated list of object names to convert, or 'all' for all objects | No | |
| - | Path to an instructions file with custom AI code conversion configuration | No | - |
| - | Display job progress until completion (may take several minutes to hours, depending on code size) | No | |
| `-y` | Accept all AI code conversion disclaimers without prompting (required for non-interactive use) | No | |
Examples:
Start AI code conversion:
Start and wait for completion:
Convert specific objects:
Non-interactive (CI/CD):
Source system verification:
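A sketch of these examples; `-y` is documented above, while `--wait` and `--objects` are illustrative flag names:

```shell
# Start AI code conversion
scai ai-convert start

# Start and wait for completion (illustrative flag)
scai ai-convert start --wait

# Convert specific objects (illustrative flag)
scai ai-convert start --objects "SCHEMA.PROC1,SCHEMA.PROC2"

# Non-interactive (CI/CD): accept all disclaimers
scai ai-convert start -y
```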
#### scai ai-convert status

Check the status of an AI code conversion job.

Usage:

Prerequisites:

- A job started with `scai ai-convert start`
- A Snowflake connection (uses the job's connection if `--connection` is not specified)

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | The job ID to check status for. If omitted, checks the last started job. | No | - |
| `--connection` | Override the Snowflake connection. By default, uses the connection saved when the job was started. | No | - |
| - | Monitor job progress until completion. For finished jobs, forces a server-side refresh and downloads detailed results. | No | |
Examples:
Check last job status:
Check specific job:
Wait and download results:
Use different connection:
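A sketch of these examples; `--connection` is documented above, while the positional job ID and `--wait` are assumptions:

```shell
# Check the last job's status
scai ai-convert status

# Check a specific job (ID shown is illustrative)
scai ai-convert status JOB_12345

# Wait and download detailed results (illustrative flag)
scai ai-convert status --wait

# Use a different connection
scai ai-convert status --connection other_conn
```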
### scai code

Code operations: extract, convert, add, deploy.

#### scai code add

Add source code from an input path to the project's Source folder.

Usage:

Prerequisites:

- A migration project initialized with `scai init`
- The `source/` folder must be empty
- The input path must contain valid SQL source files

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Path to the source code files to add to the project | Yes | - |
Examples:
Add source code to project:
Add code using full option name:
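A sketch of these examples; whether the path is positional and the long option name `--input` are both assumptions:

```shell
# Add source code to the project
scai code add /path/to/source

# Using the full option name (illustrative)
scai code add --input /path/to/source
```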
#### scai code convert

Transform source database code to Snowflake SQL.

Usage:

Prerequisites:

- A migration project initialized with `scai init`
- Source code in the `source/` folder (from `scai code extract`, `scai code add`, or a manual copy)

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Display all the conversion settings available for the specified source language | No | - |
| - | Path to ETL replatform source files for cross-project code analysis. Must be provided for each conversion run. | No | - |
| - | Path to Power BI files for input repointing. Must be provided for each conversion run. | No | - |
| - | Show a detailed EWI (Early Warning Issues) table instead of a summary | No | - |
| - | Path to read migration context from. Defaults to `.scai/conversion-context`. Generated context is always written to `.scai/conversion-context`. | No | - |

Examples:
Convert using project defaults:
Convert with a custom context path:
Show all conversion settings for the project's dialect:
Convert with a custom schema:
Convert with a comment on missing dependencies:
Convert with an object renaming file:
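A sketch of the first three examples; both flag names are assumptions:

```shell
# Convert using project defaults
scai code convert

# Convert with a custom context path (illustrative flag)
scai code convert --context-path ./my-context

# Show all conversion settings for the project's dialect (illustrative flag)
scai code convert --show-settings
```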
#### scai code deploy

Deploy converted SQL code to Snowflake.

Usage:

Prerequisites:

- Converted code in `converted/Output/` (from `scai code convert`)
- A Snowflake connection configured (set with `scai init -c` or project settings)
- Appropriate Snowflake privileges (CREATE TABLE, CREATE VIEW, etc.)

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | The name of the Snowflake connection to use. Uses the default if not specified. | No | - |
| - | Target database name for deployment. Uses the converted database name if not specified. | No | - |
| - | Deploy all successfully converted objects without the selection prompt. | No | |
| - | Number of retry attempts for failed object deployments. | No | |
| - | Continue deploying remaining objects even if some fail. | No | |
Examples:
Deploy using default connection:
Deploy all objects:
Deploy with specific connection:
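A sketch of these examples; the flag names are assumptions:

```shell
# Deploy using the default connection
scai code deploy

# Deploy all objects without the selection prompt (illustrative flag)
scai code deploy --all

# Deploy with a specific connection (illustrative flag)
scai code deploy --connection prod_conn
```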
#### scai code extract

Extract code from the source database.

Usage:

Prerequisites:

- A migration project initialized with `scai init`
- A source database connection configured (use `scai connection add-redshift` or `scai connection add-sql-server`)
- Network access to the source database

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Name of the source connection to extract code from | No | - |
| - | Schema name to extract code from | No | - |
| - | Object types to extract (comma-separated), e.g. TABLE,VIEW,PROCEDURE | No | - |
Examples:
Extract tables from a schema:
Extract tables and views:
Extract from all schemas:
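A sketch of these examples; the flag names are assumptions:

```shell
# Extract tables from a schema (illustrative flags)
scai code extract --schema sales --object-types TABLE

# Extract tables and views
scai code extract --schema sales --object-types TABLE,VIEW

# Extract from all schemas
scai code extract
```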
### scai connection

Manage source database connections (Redshift, SQL Server).

#### scai connection add-redshift

Add a new Redshift source database connection.

Usage:

Prerequisites:

- Network access to the Redshift cluster/serverless endpoint
- For IAM auth: AWS credentials configured (AWS CLI or environment variables)
- For standard auth: username and password

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Name for this connection | No | - |
| - | Authentication method (iam-serverless, iam-provisioned-cluster, standard) | No | - |
| - | Username | No | - |
| - | Database name | No | - |
| - | Connection timeout in seconds | No | - |
| - | Redshift Serverless workgroup name | No | - |
| - | Redshift Provisioned Cluster ID | No | - |
| - | AWS region | No | - |
| - | AWS Access Key ID | No | - |
| - | AWS Secret Access Key | No | - |
| - | Redshift host | No | - |
| - | Port number | No | - |
| - | Password | No | - |
Examples:
Add connection interactively (recommended):
IAM Serverless (inline):
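A sketch of these examples; all flag names are assumptions (the interactive form prompts for each value):

```shell
# Add a connection interactively (recommended)
scai connection add-redshift

# IAM Serverless, inline (illustrative flags)
scai connection add-redshift --name rs_dev --auth-method iam-serverless \
  --workgroup my-workgroup --region us-east-1 --database dev
```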
#### scai connection add-sql-server

Add a new SQL Server source database connection.

Usage:

Prerequisites:

- Network access to the SQL Server instance
- For Windows auth: valid domain credentials
- For standard auth: SQL Server username and password

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Name for this connection | No | - |
| - | Authentication method (windows, standard) | No | - |
| - | Username | No | - |
| - | Database name | No | - |
| - | Connection timeout in seconds | No | - |
| - | SQL Server URL | No | - |
| - | Port number | No | - |
| - | Password | No | - |
| - | Trust server certificate | No | - |
| - | Encrypt connection | No | - |
Examples:
Add connection interactively (recommended):
Windows Authentication:
Standard Authentication:
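A sketch of these examples; all flag names are assumptions:

```shell
# Add a connection interactively (recommended)
scai connection add-sql-server

# Windows Authentication (illustrative flags)
scai connection add-sql-server --name mssql_dev --auth-method windows \
  --url sqlserver.example.com --database AdventureWorks

# Standard Authentication (illustrative flags)
scai connection add-sql-server --name mssql_dev --auth-method standard \
  --url sqlserver.example.com --database AdventureWorks --user sa
```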
#### scai connection list

List connections for a given source database.

Usage:

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Source language of the connection. If omitted, shows a summary of all connections. | No | - |
Examples:
List all connections summary:
List Redshift connections:
List SQL Server connections:
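A sketch of these examples; passing the source language positionally is an assumption:

```shell
# Summary of all connections
scai connection list

# Redshift connections only
scai connection list Redshift

# SQL Server connections only
scai connection list SqlServer
```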
#### scai connection set-default

Set the default source connection for a database type.

Usage:

Prerequisites:

- Connection already added with `scai connection add-redshift` or `scai connection add-sql-server`

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Database type of the connection | Yes | - |
| - | Name of the source connection to set as default | Yes | - |
Examples:
Set default Redshift connection:
Set default SQL Server connection:
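A sketch of these examples; the positional argument order is an assumption:

```shell
# Set the default Redshift connection
scai connection set-default Redshift rs_prod

# Set the default SQL Server connection
scai connection set-default SqlServer mssql_prod
```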
#### scai connection test

Test a source database connection.

Usage:

Prerequisites:

- Connection already configured
- Network access to the database

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Source language of the connection. Supported languages: SqlServer and Redshift. | Yes | - |
| - | Name of the connection to test | No | - |
Examples:
Test SQL Server connection:
Test Redshift connection:
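A sketch of these examples; the positional language argument and the `--name` flag are assumptions:

```shell
# Test the default SQL Server connection
scai connection test SqlServer

# Test a named Redshift connection (illustrative flag)
scai connection test Redshift --name rs_dev
```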
### scai data

Data operations: migrate, validate.

#### scai data migrate

Migrate data from the source system into a Snowflake account.

Usage:

Prerequisites:

- Code converted with `scai code convert` (generates the TopLevelCodeUnits report)
- Code deployed with `scai code deploy` (creates the target tables in Snowflake)
- A source database connection configured
- A Snowflake connection configured with INSERT privileges
- If using `--selector`: a selector file (create one with `scai object-selector create`)
- For Redshift: an S3 bucket, a Snowflake Storage Integration, and an External Stage configured

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Name of the source connection to extract data from. If not provided, the default connection is used. | No | - |
| - | Name of the target connection to migrate data to. If not provided, the default connection is used. | No | - |
| `--selector` | Name of the selector file to use for migration. If not provided, all tables from the TopLevelCodeUnits report are migrated. | No | - |
| - | (Redshift only) The S3 bucket URI where data will be staged (e.g., s3://my-bucket/path). | No | - |
| - | (Redshift only) The fully qualified name of the Snowflake stage used to load parquet files from the S3 bucket (e.g., database.schema.stage_name). | No | - |
| - | (Redshift only) The IAM role ARN used to unload parquet files to the S3 bucket. | No | - |
Examples:
Migrate all tables:
Migrate selected tables:
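A sketch of these examples; `--selector` is documented above, and the selector-file name is illustrative:

```shell
# Migrate all tables from the TopLevelCodeUnits report
scai data migrate

# Migrate only the tables listed in a selector file
scai data migrate --selector my_selector
```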
Important: Redshift Data Migration Considerations
When migrating data from Amazon Redshift, you must ensure that the Snowflake stage properly connects to the AWS S3 bucket that you are using to unload the data. The stage must be configured with the correct Storage Integration and have the appropriate permissions to access the S3 bucket.
For detailed instructions on setting up the S3 bucket, configuring the stage, and specifying the data unloading IAM role ARN, see the Migrate Amazon Redshift data section in the data migration guide.
#### scai data validate

Compare data between the source and Snowflake to verify data integrity.

Usage:

Prerequisites:

- A source database connection configured
- A Snowflake connection configured
- Tables must exist in both the source and target databases

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Name of the source connection to use (from the configured source connections). Uses the default if not specified. | No | - |
| - | Name of the Snowflake connection to use (from connections.toml). Uses the default if not specified. | No | - |
| - | Target Snowflake database for validation. Uses the database from the connection if not specified. | No | - |
| - | Name of the selector file to use for validation. If not provided, all tables from the TopLevelCodeUnits report are validated. | No | - |
| - | Database name mapping in the format 'source:target'. Can be specified multiple times for multiple mappings. | No | - |
| - | Schema name mapping in the format 'source:target'. Can be specified multiple times for multiple mappings. | No | - |
Examples:
Validate all tables from report:
Validate with selector file:
With target database:
With name mappings:
With explicit connections:
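A sketch of these examples; all flag names are assumptions:

```shell
# Validate all tables from the report
scai data validate

# Validate with a selector file and name mappings (illustrative flags)
scai data validate --selector my_selector \
  --database-mapping src_db:tgt_db --schema-mapping src_schema:tgt_schema
```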
### scai git

Git workflow for iterative conversions.

#### scai git enable

Enable the Git workflow for iterative conversions.

Usage:

Prerequisites:

- A migration project initialized with `scai init`
- Git installed on the system

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Name of the baseline branch for conversion results | No | |
Examples:
Enable git workflow:
Enable with custom baseline branch:
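A sketch of these examples; the baseline-branch flag name is an assumption:

```shell
# Enable the git workflow
scai git enable

# Enable with a custom baseline branch (illustrative flag)
scai git enable --baseline-branch conversion-baseline
```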
### scai license

Install an offline license for air-gapped environments.

#### scai license install

Install an offline license for running conversions without online activation.

Usage:

Prerequisites:

- A valid offline license file (.lic) from Snowflake

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Path to the license file to install | Yes | - |
Examples:
Install license:
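A sketch of this example; passing the license path positionally is an assumption:

```shell
# Install an offline license
scai license install /path/to/snowconvert.lic
```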
### scai object-selector

Create selector files for filtering objects.

#### scai object-selector create

Create a selector file to filter objects for data migration.

Usage:

Prerequisites:

- Code converted with `scai code convert` (generates the TopLevelCodeUnits report)

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Filter objects by source database name. | No | - |
| - | Filter objects by source schema name. | No | - |
| - | Filter objects by type (comma-separated, e.g., table,view,procedure). | No | - |
| - | Label for the selector file (becomes …) | No | - |
Examples:
Create selector file:
Create with custom output path:
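A sketch of these examples; the flag names are assumptions:

```shell
# Create a selector for tables and views in one schema (illustrative flags)
scai object-selector create --schema sales --types table,view

# Create a selector with a label (illustrative flag)
scai object-selector create --schema sales --label sales-tables
```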
### scai project

View and manage project configuration.

#### scai project info

Display project details including name, source language, and status.

Usage:

Prerequisites:

- Must be run from within a migration project directory
Examples:
Show current project details:
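The command takes no required arguments, so the example is the bare invocation:

```shell
# Show current project details
scai project info
```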
#### scai project set-default-connection

Set the default Snowflake connection for the current project.

Usage:

Prerequisites:

- A migration project initialized with `scai init`
- A Snowflake connection available in connections.toml or config.toml

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| - | Name of the Snowflake connection to set as the project default. | Yes | - |
Examples:
Set project default connection:
Change to production connection:
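A sketch of these examples; passing the connection name positionally is an assumption:

```shell
# Set the project default connection
scai project set-default-connection dev_conn

# Change to a production connection
scai project set-default-connection prod_conn
```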
### scai state

Track code unit state through migration stages.

#### scai state disable

Disable state management for the project.

Usage:

Prerequisites:

- A migration project with state management enabled
Examples:
Disable state management:
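The command takes no required arguments, so the example is the bare invocation:

```shell
scai state disable
```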
#### scai state enable

Enable state management for the project.

Usage:

Prerequisites:

- A migration project initialized with `scai init`
Examples:
Enable state management:
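The command takes no required arguments, so the example is the bare invocation:

```shell
scai state enable
```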
#### scai state status

Show the current state of code units in the project.

Usage:

Prerequisites:

- A migration project with state management enabled
Examples:
Show current state:
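The command takes no required arguments, so the example is the bare invocation:

```shell
scai state status
```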
Generated: 2026-01-30 11:17:01