SnowConvert AI (scai) Command Reference¶
This is the full command reference for the SnowConvert AI command-line interface (scai).
Quick Start¶
Basic workflow to get started
Create a project (use -c to set default Snowflake connection):
scai init -n <name> -l <language> -c <connection>
Add source code (E2E languages: SqlServer, Redshift):
scai code extract
Add source code (other languages):
scai code add -i <path>
Convert to Snowflake SQL:
scai code convert
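Put together, the quick-start steps can be scripted end to end. A minimal sketch, where `my-project`, `./legacy-sql`, and `my-snowflake-conn` are placeholder names; with `DRY_RUN=1` (the default here) each command is printed rather than executed:

```shell
#!/usr/bin/env sh
# Quick-start workflow sketch. my-project, ./legacy-sql, and
# my-snowflake-conn are placeholders. With DRY_RUN=1 (the default
# here) each command is printed instead of executed.
DRY_RUN=${DRY_RUN:-1}
run() { [ "$DRY_RUN" = "1" ] && echo "[dry-run] $*" || "$@"; }

run scai init my-project -l Teradata -c my-snowflake-conn
cd my-project 2>/dev/null || true   # the folder only exists on a real run
run scai code add -i ../legacy-sql
run scai code convert
```

Set `DRY_RUN=0` to execute the commands against an installed scai.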
Global Options¶
Option |
Description |
|---|---|
|
Show help message |
|
Display version information |
Commands¶
scai init¶
Create a new migration project in the specified directory (or current directory if PATH is omitted).
Usage:
scai init [PATH] -l <LANGUAGE> [-n <NAME>] [-i <INPUT_PATH>] [-c <CONNECTION>]
Prerequisites:
Target directory must not contain an existing project
Valid source language must be specified
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `PATH` | Optional directory to create the project in. If omitted, uses the current directory. | No | - |
| `-n` | Project name. If omitted, defaults to the target folder name. | No | - |
| `-l` | Source language for the project | Yes | - |
| `-i` | Optional path to source code files to copy into the project's Source folder during initialization | No | - |
| `--git-flow` | Enable git workflow automation for iterative conversions | No | |
| `--baseline-branch` | Name of the baseline branch for conversion results (requires `--git-flow`) | No | |
| | Enables state management for the project | No | |
| `-c` | Snowflake connection name to save as project default. Precedence: `-c` option > project connection > default TOML connection. | No | - |
Examples:
Create a project in a new folder (recommended):
scai init my-project -l Teradata
Create a project in the current directory:
scai init -l Teradata
Create project with source code:
scai init my-project -l Oracle -i /path/to/code
Create project with a specific connection:
scai init my-project -l Oracle -c my-snowflake-conn
scai ai-convert¶
AI-powered code improvement and test generation
scai ai-convert cancel¶
Cancel a running AI code conversion job.
Usage:
scai ai-convert cancel [JOB_ID] [OPTIONS]
Prerequisites:
A running job started with ‘scai ai-convert start’
Snowflake connection (uses the job's connection if --connection is not specified)
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `JOB_ID` | The job ID to cancel. If omitted, cancels the last started job. | No | - |
| `-c, --connection` | Override the Snowflake connection. By default, uses the connection saved when the job was started. | No | - |
Examples:
Cancel last job:
scai ai-convert cancel
Cancel specific job:
scai ai-convert cancel JOB_20260112041123_XYZ
Use different connection:
scai ai-convert cancel -c other-snowflake
scai ai-convert list¶
List AI code conversion jobs for the current project.
Usage:
scai ai-convert list [OPTIONS]
Prerequisites:
A migration project initialized with ‘scai init’
Snowflake connection for refreshing job status
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| | Maximum number of jobs to display | No | |
| `--all` | Show all jobs (ignores limit) | No | - |
| `-c, --connection` | Override the Snowflake connection for refreshing job status. By default, uses the connection saved when the job was started. | No | - |
Examples:
List recent jobs:
scai ai-convert list
Show all jobs:
scai ai-convert list --all
Refresh with different connection:
scai ai-convert list -c other-snowflake
scai ai-convert start¶
Start AI-powered code conversion on converted code.
Usage:
scai ai-convert start [OPTIONS]
Prerequisites:
Code converted with ‘scai code convert’ (generates TopLevelCodeUnits report)
Snowflake connection configured with ‘snow connection add’
CREATE MIGRATION privilege granted on the Snowflake account
A warehouse configured in the Snowflake connection
Must accept AI disclaimers (interactive prompt or -y flag)
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `-c, --connection` | Name of the Snowflake connection to use for AI code conversion | No | - |
| `-o` | Comma-separated list of object names to convert, or 'all' for all objects | No | |
| `-i` | Path to instructions file with custom AI code conversion configuration | No | - |
| `-w` | Display job progress until completion (may take several minutes to hours depending on code size) | No | |
| `-y` | Accept all AI code conversion disclaimers without prompting (required for non-interactive use) | No | |
Examples:
Start AI code conversion:
scai ai-convert start
Start and wait for completion:
scai ai-convert start -w
Convert specific objects:
scai ai-convert start -o PROC1,PROC2
Non-interactive (CI/CD):
scai ai-convert start -y -w
Source system verification via a custom instructions file:
scai ai-convert start -i config/instructions.yml
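In a CI/CD pipeline, the non-interactive form can gate the build on the conversion result. A sketch that assumes (unverified here) scai exits non-zero when the job fails; `ci-snowflake` is a placeholder connection name, and `DRY_RUN=1` (the default here) only prints the command:

```shell
#!/usr/bin/env sh
# CI sketch: run AI conversion non-interactively and fail the build fast.
# Assumes (unverified) that scai exits non-zero when the job fails.
DRY_RUN=${DRY_RUN:-1}
run() { [ "$DRY_RUN" = "1" ] && echo "[dry-run] $*" || "$@"; }

if run scai ai-convert start -y -w -c ci-snowflake; then
  echo "AI conversion finished"
else
  echo "AI conversion failed" >&2
  exit 1
fi
```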
scai ai-convert status¶
Check the status of an AI code conversion job.
Usage:
scai ai-convert status [JOB_ID] [OPTIONS]
Prerequisites:
A job started with ‘scai ai-convert start’
Snowflake connection (uses the job's connection if --connection is not specified)
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `JOB_ID` | The job ID to check status for. If omitted, checks the last started job. | No | - |
| `-c, --connection` | Override the Snowflake connection. By default, uses the connection saved when the job was started. | No | - |
| `-w` | Monitor job progress until completion. For finished jobs, forces a server-side refresh and downloads detailed results. | No | |
Examples:
Check last job status:
scai ai-convert status
Check specific job:
scai ai-convert status JOB_20260112041123_XYZ
Wait and download results:
scai ai-convert status -w
Use different connection:
scai ai-convert status -c other-snowflake
scai code¶
Code operations: extract, convert, add, deploy
scai code add¶
Add source code from an input path to the project’s Source folder.
Usage:
scai code add -i <INPUT_PATH>
Prerequisites:
A migration project initialized with ‘scai init’
source/ folder must be empty
Input path must contain valid SQL source files
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `-i, --input-path` | Path to the source code files to add to the project | Yes | - |
Examples:
Add source code to project:
scai code add -i /path/to/source/code
Add code using full option name:
scai code add --input-path ./my-sql-scripts
scai code convert¶
Transform source database code to Snowflake SQL.
Usage:
scai code convert [OPTIONS]
Prerequisites:
A migration project initialized with ‘scai init’
Source code in the ‘source/’ folder (from ‘scai code extract’, ‘scai code add’, or manual copy)
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `--help` | Display all the conversion settings available for the specified source language | No | - |
| | Path to ETL replatform source files for cross-project code analysis. Must be provided for each conversion run. | No | - |
| | Path to Power BI files for input repointing. Must be provided for each conversion run. | No | - |
| | Show detailed EWI (Early Warning Issues) table instead of summary | No | - |
| `--context-path` | Path to read migration context from. Defaults to .scai/conversion-context. Generated context is always written to .scai/conversion-context. | No | - |
Examples:
Convert using project defaults:
scai code convert
Convert with custom context path:
scai code convert --context-path /path/to/context
Show all conversion settings for the project’s dialect:
scai code convert --help
Convert with custom schema:
scai code convert --customschema MY_SCHEMA
Convert with comment on missing dependencies:
scai code convert --comments
Convert with object renaming file:
scai code convert --renamingfile /path/to/renaming.json
scai code deploy¶
Deploy converted SQL code to Snowflake.
Usage:
scai code deploy [OPTIONS]
Prerequisites:
Converted code in ‘converted/Output/’ (from ‘scai code convert’)
Snowflake connection configured (set with ‘scai init -c’ or project settings)
Appropriate Snowflake privileges (CREATE TABLE, CREATE VIEW, etc.)
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `--connection` | The name of the Snowflake connection to use. Uses default if not specified. | No | - |
| | Target database name for deployment. Uses converted database name if not specified. | No | - |
| `--all` | Deploy all successfully converted objects without selection prompt. | No | |
| | Number of retry attempts for failed object deployments. | No | |
| | Continue deploying remaining objects even if some fail. | No | |
Examples:
Deploy using default connection:
scai code deploy
Deploy all objects:
scai code deploy --all
Deploy with specific connection:
scai code deploy --connection my-snowflake
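Since deployment expects converted code in converted/Output/, the two steps are often chained so deploy runs only when conversion succeeds. A sketch using only flags shown above; `DRY_RUN=1` (the default here) prints the commands instead of executing them:

```shell
#!/usr/bin/env sh
# Convert, then deploy all objects only if conversion succeeds.
DRY_RUN=${DRY_RUN:-1}
run() { [ "$DRY_RUN" = "1" ] && echo "[dry-run] $*" || "$@"; }

run scai code convert && run scai code deploy --all
```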
scai code extract¶
Extract code from the source database.
Usage:
scai code extract [OPTIONS]
Prerequisites:
A migration project initialized with ‘scai init’
Source database connection configured (use ‘scai connection add-redshift’ or ‘scai connection add-sql-server’)
Network access to the source database
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| | Name of the source connection to extract code from | No | - |
| `--schema` | Schema name to extract code from | No | - |
| `--object-type` | Object types to extract (comma-separated), e.g., TABLE,VIEW,PROCEDURE | No | - |
Examples:
Extract tables from a schema:
scai code extract --schema public --object-type TABLE
Extract tables and views:
scai code extract --object-type TABLE,VIEW
Extract from all schemas:
scai code extract
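When extracting from several schemas, the schema-scoped form above can be looped. A sketch with placeholder schema names (`sales`, `finance`, `hr`); `DRY_RUN=1` (the default here) prints each command instead of running it:

```shell
#!/usr/bin/env sh
# Extract tables and views schema by schema; schema names are placeholders.
DRY_RUN=${DRY_RUN:-1}
run() { [ "$DRY_RUN" = "1" ] && echo "[dry-run] $*" || "$@"; }

for schema in sales finance hr; do
  run scai code extract --schema "$schema" --object-type TABLE,VIEW
done
```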
scai connection¶
Manage source database connections (Redshift, SQL Server)
scai connection add-redshift¶
Add a new Redshift source database connection.
Usage:
scai connection add-redshift [OPTIONS]
Prerequisites:
Network access to the Redshift cluster/serverless endpoint
For IAM auth: AWS credentials configured (AWS CLI or environment variables)
For standard auth: Username and password
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `--connection` | Name for this connection | No | - |
| `--auth` | Authentication method (iam-serverless, iam-provisioned-cluster, standard) | No | - |
| | Username | No | - |
| `--database` | Database name | No | - |
| | Connection timeout in seconds | No | - |
| `--workgroup` | Redshift Serverless workgroup name | No | - |
| | Redshift Provisioned Cluster ID | No | - |
| `--region` | AWS region | No | - |
| | AWS Access Key ID | No | - |
| | AWS Secret Access Key | No | - |
| | Redshift host | No | - |
| | Port number | No | - |
| | Password | No | - |
Examples:
Add connection interactively (recommended):
scai connection add-redshift
IAM Serverless (inline):
scai connection add-redshift --connection my-redshift --auth iam-serverless --workgroup my-workgroup --database mydb --region us-east-1
scai connection add-sql-server¶
Add a new SQL Server source database connection.
Usage:
scai connection add-sql-server [OPTIONS]
Prerequisites:
Network access to the SQL Server instance
For Windows auth: Valid domain credentials
For standard auth: SQL Server username and password
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `--connection` | Name for this connection | No | - |
| `--auth` | Authentication method (windows, standard) | No | - |
| `--username` | Username | No | - |
| `--database` | Database name | No | - |
| | Connection timeout in seconds | No | - |
| `--server-url` | SQL Server URL | No | - |
| | Port number | No | - |
| | Password | No | - |
| | Trust server certificate | No | - |
| | Encrypt connection | No | - |
Examples:
Add connection interactively (recommended):
scai connection add-sql-server
Windows Authentication:
scai connection add-sql-server --connection my-sqlserver --auth windows --server-url localhost --database mydb
Standard Authentication:
scai connection add-sql-server --connection my-sqlserver --auth standard --server-url localhost --database mydb --username sa
scai connection list¶
List connections for a given source database.
Usage:
scai connection list [-l <LANGUAGE>]
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `-l` | Source language of the connection. If omitted, shows a summary of all connections. | No | - |
Examples:
List all connections summary:
scai connection list
List Redshift connections:
scai connection list -l redshift
List SQL Server connections:
scai connection list -l sqlserver
scai connection set-default¶
Set the default source connection for a database type.
Usage:
scai connection set-default -l <LANGUAGE> -c <CONNECTION>
Prerequisites:
Connection already added with ‘scai connection add-redshift’ or ‘scai connection add-sql-server’
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `-l` | Database type of the connection | Yes | - |
| `-c, --connection` | Name of the source connection to set as default | Yes | - |
Examples:
Set default Redshift connection:
scai connection set-default -l redshift --connection prod
Set default SQL Server connection:
scai connection set-default -l sqlserver --connection dev
scai connection test¶
Test a source database connection.
Usage:
scai connection test -l <LANGUAGE> [-c <CONNECTION>]
Prerequisites:
Connection already configured
Network access to the database
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `-l` | Source language of the connection. Supported languages: SqlServer and Redshift. | Yes | - |
| `-c` | Name of the connection to test | No | - |
Examples:
Test SQL Server connection:
scai connection test -l sqlserver -c my-sqlserver
Test Redshift connection:
scai connection test -l redshift -c my-redshift
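A typical first-time setup registers a source connection, makes it the default, and verifies connectivity in one script. A sketch reusing the flags from the add-redshift example; all names are placeholders, and `DRY_RUN=1` (the default here) prints the commands instead of executing them:

```shell
#!/usr/bin/env sh
# Register a Redshift source connection, set it as default, and test it.
# Connection, workgroup, database, and region values are placeholders.
DRY_RUN=${DRY_RUN:-1}
run() { [ "$DRY_RUN" = "1" ] && echo "[dry-run] $*" || "$@"; }

run scai connection add-redshift --connection my-redshift \
  --auth iam-serverless --workgroup my-workgroup \
  --database mydb --region us-east-1
run scai connection set-default -l redshift --connection my-redshift
run scai connection test -l redshift -c my-redshift
```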
scai data¶
Data operations: migrate, cloud-migrate, validate
scai data cloud-migrate¶
Migrate data from the source system into a Snowflake account using cloud infrastructure.
Usage:
scai data cloud-migrate --config <CONFIG_PATH> --compute-pool <POOL> [-c <CONNECTION>] [OPTIONS]
Prerequisites:
Data migration configuration file (JSON format)
Snowflake compute pool created and accessible
Snowflake connection with appropriate privileges
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `--config` | Path to the configuration file | Yes | - |
| `--compute-pool` | Name of the compute pool that the Data Migration Service will run on | Yes | - |
| | Name of the warehouse that the Data Migration Service will use for queries | No | - |
| `-c, --connection` | Name of the Snowflake connection to use (from connections.toml). Uses default if not specified. | No | - |
| `--watch` | Wait for the workflow to complete. Without this flag, the command returns after the workflow is created. | No | |
Examples:
Start migration (returns immediately):
scai data cloud-migrate --config my-data-migration-config.json --compute-pool MY_COMPUTE_POOL --connection my-snowflake
Start and wait for completion:
scai data cloud-migrate --config my-data-migration-config.json --compute-pool MY_COMPUTE_POOL --connection my-snowflake --watch
scai data cloud-migrate-status¶
Check the status of a Cloud Data Migration workflow.
Usage:
scai data cloud-migrate-status <WORKFLOW_NAME> [OPTIONS]
Prerequisites:
A workflow started with ‘scai data cloud-migrate’
Snowflake connection with access to the workflow
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `WORKFLOW_NAME` | The workflow name to check status for. | Yes | - |
| | The Snowflake connection name to use | No | - |
| `--watch` | Display progress bars and poll for updates until the workflow completes | No | |
Examples:
Check workflow status:
scai data cloud-migrate-status DATA_MIGRATION_WORKFLOW_xxx
Watch workflow progress:
scai data cloud-migrate-status DATA_MIGRATION_WORKFLOW_xxx --watch
Watch with custom interval:
scai data cloud-migrate-status --watch --poll-interval 10
scai data migrate¶
Migrate data from the source system into a Snowflake account.
Usage:
scai data migrate [OPTIONS]
Prerequisites:
Code converted with ‘scai code convert’ (generates TopLevelCodeUnits report)
Code deployed with ‘scai code deploy’ (creates target tables in Snowflake)
Source database connection configured
Snowflake connection configured with INSERT privileges
If using --selector: a selector file (create with 'scai object-selector create')
For Redshift: S3 Bucket, Snowflake Storage Integration, and External Stage configured
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `--source-connection` | Name of the source connection to extract data from. If not provided, the default connection will be used. | No | - |
| `--target-connection` | Name of the target connection to migrate data to. If not provided, the default connection will be used. | No | - |
| `--selector` | Name of the selector file to use for migration. If not provided, all tables from the TopLevelCodeUnits report will be migrated. | No | - |
| | (Redshift only) The S3 bucket URI where data will be staged (e.g., s3://my-bucket/path). | No | - |
| | (Redshift only) The fully qualified name of the Snowflake stage used to load parquet files from the S3 bucket (e.g., database.schema.stage_name). | No | - |
| | (Redshift only) The IAM role ARN used to unload parquet files to the S3 bucket. | No | - |
Examples:
Migrate all tables:
scai data migrate --source-connection my-redshift --target-connection my-snowflake
Migrate selected tables:
scai data migrate --source-connection my-redshift --target-connection my-snowflake --selector my-selector.yml
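For Redshift, the prerequisites above call for an S3 bucket, a storage integration, and an external stage on the Snowflake side. A hedged sketch of that one-time setup, issued through the Snowflake CLI (`snow sql -q`, assumed to be installed); the integration name, stage name, role ARN, and bucket are placeholders, and `DRY_RUN=1` (the default here) prints the commands instead of running them:

```shell
#!/usr/bin/env sh
# One-time Snowflake-side setup for Redshift data staging.
# s3_migration_int, migration_stage, the role ARN, and the bucket
# are placeholders; adjust them to your environment.
DRY_RUN=${DRY_RUN:-1}
run() { [ "$DRY_RUN" = "1" ] && echo "[dry-run] $*" || "$@"; }

run snow sql -q "CREATE STORAGE INTEGRATION s3_migration_int
  TYPE = EXTERNAL_STAGE STORAGE_PROVIDER = 'S3' ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/migration-role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/migration/')"

run snow sql -q "CREATE STAGE mydb.public.migration_stage
  URL = 's3://my-bucket/migration/'
  STORAGE_INTEGRATION = s3_migration_int
  FILE_FORMAT = (TYPE = PARQUET)"

run scai data migrate --source-connection my-redshift --target-connection my-snowflake
```

Pass the Redshift staging options (S3 bucket, stage, IAM role) listed in the table above on the migrate command; their current flag names can be confirmed with `--help`.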
scai data validate¶
Compare data between source and Snowflake to verify data integrity.
Usage:
scai data validate [OPTIONS]
Prerequisites:
Source database connection configured
Snowflake connection configured
Tables must exist in both source and target databases
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `--source-connection` | Name of the source connection to use (from the configured source connections). Uses default if not specified. | No | - |
| `--target-connection` | Name of the Snowflake connection to use (from connections.toml). Uses default if not specified. | No | - |
| `--target-database` | Target Snowflake database for validation. Uses the database from the connection if not specified. | No | - |
| `--selector` | Name of the selector file to use for validation. If not provided, all tables from the TopLevelCodeUnits report will be validated. | No | - |
| `--db-mapping` | Database name mapping in the format 'source:target'. Can be specified multiple times for multiple mappings. | No | - |
| `--schema-mapping` | Schema name mapping in the format 'source:target'. Can be specified multiple times for multiple mappings. | No | - |
Examples:
Validate all tables from report:
scai data validate
Validate with selector file:
scai data validate --selector my-tables.yml
With target database:
scai data validate --target-database PROD_DB
With name mappings:
scai data validate --db-mapping "sourcedb:TARGETDB" --schema-mapping "dbo:PUBLIC"
With explicit connections:
scai data validate --source-connection my-sqlserver --target-connection my-snowflake
scai git¶
Git workflow for iterative conversions
scai git enable¶
Enable Git workflow for iterative conversions.
Usage:
scai git enable [OPTIONS]
Prerequisites:
A migration project initialized with ‘scai init’
Git installed on the system
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `--baseline-branch` | Name of the baseline branch for conversion results | No | |
Examples:
Enable git workflow:
scai git enable
Enable with custom baseline branch:
scai git enable --baseline-branch vendor
scai license¶
Install offline license for air-gapped environments
scai license install¶
Install an offline license for running conversions without online activation.
Usage:
scai license install -p <LICENSE_PATH>
Prerequisites:
A valid offline license file (.lic) from Snowflake
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `-p, --path` | Path to the license file to install | Yes | - |
Examples:
Install license:
scai license install --path /path/to/license.lic
scai object-selector¶
Create selector files for filtering objects
scai object-selector create¶
Create a selector file to filter objects for data migration.
Usage:
scai object-selector create [OPTIONS]
Prerequisites:
Code converted with ‘scai code convert’ (generates TopLevelCodeUnits report)
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| | Filter objects by source database name. | No | - |
| | Filter objects by source schema name. | No | - |
| | Filter objects by type (comma-separated, e.g., table,view,procedure). | No | - |
| | Label for the selector file | No | - |
Examples:
Create selector file:
scai object-selector create
Create with custom output path:
scai object-selector create -o custom-selector.yml
scai project¶
View and manage project configuration
scai project info¶
Display project details including name, source language, and status.
Usage:
scai project info
Prerequisites:
Must be run from within a migration project directory
Examples:
Show current project details:
scai project info
scai project set-default-connection¶
Set the default Snowflake connection for the current project.
Usage:
scai project set-default-connection -c <CONNECTION>
Prerequisites:
A migration project initialized with ‘scai init’
Snowflake connection available in connections.toml or config.toml
Options:
| Option | Description | Required | Default |
|---|---|---|---|
| `-c` | Name of the Snowflake connection to set as the project default. | Yes | - |
Examples:
Set project default connection:
scai project set-default-connection -c my-snowflake
Change to production connection:
scai project set-default-connection -c prod-snowflake
scai state¶
Track code unit state through migration stages
scai state disable¶
Disable state management for the project.
Usage:
scai state disable
Prerequisites:
A migration project with state management enabled
Examples:
Disable state management:
scai state disable
scai state enable¶
Enable state management for the project.
Usage:
scai state enable
Prerequisites:
A migration project initialized with ‘scai init’
Examples:
Enable state management:
scai state enable
scai state status¶
Show the current state of code units in the project.
Usage:
scai state status
Prerequisites:
A migration project with state management enabled
Examples:
Show current state:
scai state status
Generated: 2026-01-30 11:17:01