SnowConvert AI (scai) Command Reference

This is the full command reference for the SnowConvert AI command-line interface (scai).

Quick Start

Basic workflow to get started

  1. Create a project (use -c to set the default Snowflake connection):

scai init -n <name> -l <language> -c <connection>

  2. Add source code (E2E languages: SqlServer, Redshift):

scai code extract

  3. Add source code (other languages):

scai code add -i <path>

  4. Convert to Snowflake SQL:

scai code convert
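
For orientation, a complete run against a SQL Server source might look like the sketch below. The project name, connection names, and the choice of SqlServer are placeholders, and the Snowflake connection my-snowflake is assumed to already exist in connections.toml.

scai init my-migration -l SqlServer -c my-snowflake   # create the project and set its default Snowflake connection
cd my-migration
scai connection add-sql-server                        # configure the SQL Server source interactively
scai code extract                                     # pull source code into source/
scai code convert                                     # produce Snowflake SQL in converted/Output/
scai code deploy --all                                # deploy every successfully converted object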

Global Options

  -h, --help
      Show help message
  -v, --version
      Display version information

Commands

scai init

Create a new migration project in the specified directory (or the current directory if PATH is omitted).

Usage:

scai init [PATH] -l <LANGUAGE> [-n <NAME>] [-i <INPUT_PATH>] [-c <CONNECTION>]

Prerequisites:

  • Target directory must not contain an existing project
  • Valid source language must be specified

Options:

  [PATH]
      Optional directory to create the project in. If omitted, uses the current directory. [optional]
  -n, --name <NAME>
      Project name. If omitted, defaults to the target folder name. [optional]
  -l, --source-language <LANGUAGE>
      Source language for the project. [required]
  -i, --input-code-path <PATH>
      Optional path to source code files to copy into the project's Source folder during initialization. [optional]
  --git-flow
      Enable git workflow automation for iterative conversions. [optional; default: False]
  --baseline-branch <NAME>
      Name of the baseline branch for conversion results (requires --git-flow). [optional; default: baseline]
  -s, --state-management
      Enables state management for the project. [optional; default: False]
  -c, --connection <NAME>
      Snowflake connection name to save as project default. Precedence: -c option > project connection > default TOML connection. [optional]

Examples:

Create a project in a new folder (recommended):

scai init my-project -l Teradata

Create a project in the current directory:

scai init -l Teradata

Create project with source code:

scai init my-project -l Oracle -i /path/to/code

Create project with a specific connection:

scai init my-project -l Oracle -c my-snowflake-conn
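
The workflow-related flags can also be set at initialization time. The sketch below combines them in one call; the project, branch, and connection names are placeholders, and combining these flags in a single invocation is assumed to be supported.

scai init my-project -l SqlServer -c my-snowflake --git-flow --baseline-branch vendor -s   # git workflow with a 'vendor' baseline branch plus state management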

scai ai-convert

AI-powered code improvement and test generation

scai ai-convert cancel

Cancel a running AI code conversion job.

Usage:

scai ai-convert cancel [JOB_ID] [OPTIONS]

Prerequisites:

  • A running job started with 'scai ai-convert start'
  • Snowflake connection (uses the job's connection if --connection is not specified)

Options:

  [JOB_ID]
      The job ID to cancel. If omitted, cancels the last started job. [optional]
  -c, --connection <NAME>
      Override the Snowflake connection. By default, uses the connection saved when the job was started. [optional]

Examples:

Cancel last job:

scai ai-convert cancel

Cancel specific job:

scai ai-convert cancel JOB_20260112041123_XYZ

Use different connection:

scai ai-convert cancel -c other-snowflake

scai ai-convert list

List AI code conversion jobs for the current project.

Usage:

scai ai-convert list [OPTIONS]

Prerequisites:

  • A migration project initialized with 'scai init'
  • Snowflake connection for refreshing job status

Options:

  -l, --limit <N>
      Maximum number of jobs to display. [optional; default: 10]
  -a, --all
      Show all jobs (ignores limit). [optional]
  -c, --connection <NAME>
      Override the Snowflake connection for refreshing job status. By default, uses the connection saved when the job was started. [optional]

Examples:

List recent jobs:

scai ai-convert list

Show all jobs:

scai ai-convert list --all

Refresh with different connection:

scai ai-convert list -c other-snowflake

scai ai-convert start

Start AI-powered code conversion on converted code.

Usage:

scai ai-convert start [OPTIONS]

Prerequisites:

  • Code converted with 'scai code convert' (generates TopLevelCodeUnits report)
  • Snowflake connection configured with 'snow connection add'
  • CREATE MIGRATION privilege granted on the Snowflake account
  • A warehouse configured in the Snowflake connection
  • Must accept AI disclaimers (interactive prompt or -y flag)

Options:

  -c, --connection <NAME>
      Name of the Snowflake connection to use for AI code conversion. [optional]
  -o, --objects <OBJECTS>
      Comma-separated list of object names to convert, or 'all' for all objects. [optional; default: all]
  -i, --instructions <PATH>
      Path to instructions file with custom AI code conversion configuration. [optional]
  -w, --watch
      Display job progress until completion (may take several minutes to hours depending on code size). [optional; default: False]
  -y, --accept-disclaimers
      Accept all AI code conversion disclaimers without prompting (required for non-interactive use). [optional; default: False]

Examples:

Start AI code conversion:

scai ai-convert start

Start and wait for completion:

scai ai-convert start -w

Convert specific objects:

scai ai-convert start -o PROC1,PROC2

Non-interactive (CI/CD):

scai ai-convert start -y -w

Source system verification:

scai ai-convert start -i config/instructions.yml

scai ai-convert status

Check the status of an AI code conversion job.

Usage:

scai ai-convert status [JOB_ID] [OPTIONS]

Prerequisites:

  • A job started with 'scai ai-convert start'
  • Snowflake connection (uses the job's connection if --connection is not specified)

Options:

  [JOB_ID]
      The job ID to check status for. If omitted, checks the last started job. [optional]
  -c, --connection <NAME>
      Override the Snowflake connection. By default, uses the connection saved when the job was started. [optional]
  -w, --watch
      Monitor job progress until completion. For finished jobs, forces a server-side refresh and downloads detailed results. [optional; default: False]

Examples:

Check last job status:

scai ai-convert status

Check specific job:

scai ai-convert status JOB_20260112041123_XYZ

Wait and download results:

scai ai-convert status -w

Use different connection:

scai ai-convert status -c other-snowflake
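
Taken together, the ai-convert subcommands support an asynchronous flow like the one below. The job ID shown is the documentation placeholder; in practice, use the ID printed by 'scai ai-convert start'.

scai ai-convert start -y                            # submit the job without prompting for disclaimers
scai ai-convert list                                # locate the job ID among recent jobs
scai ai-convert status JOB_20260112041123_XYZ -w    # follow progress and download detailed results
scai ai-convert cancel JOB_20260112041123_XYZ       # or cancel it if it is no longer needed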

scai code

Code operations: extract, convert, add, deploy

scai code add

Add source code from an input path to the project’s Source folder.

Usage:

scai code add -i <INPUT_PATH>

Prerequisites:

  • A migration project initialized with 'scai init'
  • source/ folder must be empty
  • Input path must contain valid SQL source files

Options:

  -i, --input-path <PATH>
      Path to the source code files to add to the project. [required]

Examples:

Add source code to project:

scai code add -i /path/to/source/code

Add code using full option name:

scai code add --input-path ./my-sql-scripts

scai code convert

Transform source database code to Snowflake SQL.

Usage:

scai code convert [OPTIONS]

Prerequisites:

  • A migration project initialized with 'scai init'
  • Source code in the 'source/' folder (from 'scai code extract', 'scai code add', or manual copy)

Options:

  -h, --help
      Display all the conversion settings available for the specified source language. [optional]
  -e, --etl-replatform-sources-path <PATH>
      Path to ETL replatform source files for cross-project code analysis. Must be provided for each conversion run. [optional]
  -p, --powerbi-repointing <PATH>
      Path to Power BI files for input repointing. Must be provided for each conversion run. [optional]
  -x, --show-ewis
      Show the detailed EWI (Errors, Warnings, and Issues) table instead of a summary. [optional]
  --context-path <PATH>
      Path to read migration context from. Defaults to .scai/conversion-context. Generated context is always written to .scai/conversion-context. [optional]

Examples:

Convert using project defaults:

scai code convert

Convert with custom context path:

scai code convert --context-path /path/to/context

Show all conversion settings for the project's dialect:

scai code convert --help

Convert with custom schema:

scai code convert --customschema MY_SCHEMA

Convert with comment on missing dependencies:

scai code convert --comments

Convert with object renaming file:

scai code convert --renamingfile /path/to/renaming.json
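
The -x/--show-ewis flag listed above has no dedicated example; to review conversion issues in detail rather than as a summary, a run might look like this:

scai code convert --show-ewis    # print the detailed EWI table instead of the summary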

scai code deploy

Deploy converted SQL code to Snowflake.

Usage:

scai code deploy [OPTIONS]

Prerequisites:

  • Converted code in 'converted/Output/' (from 'scai code convert')
  • Snowflake connection configured (set with 'scai init -c' or project settings)
  • Appropriate Snowflake privileges (CREATE TABLE, CREATE VIEW, etc.)

Options:

  -c, --connection <NAME>
      The name of the Snowflake connection to use. Uses default if not specified. [optional]
  -d, --database <NAME>
      Target database name for deployment. Uses converted database name if not specified. [optional]
  -a, --all
      Deploy all successfully converted objects without selection prompt. [optional; default: False]
  -r, --retry <N>
      Number of retry attempts for failed object deployments. [optional; default: 1]
  --continue-on-error
      Continue deploying remaining objects even if some fail. [optional; default: True]

Examples:

Deploy using default connection:

scai code deploy

Deploy all objects:

scai code deploy --all

Deploy with specific connection:

scai code deploy --connection my-snowflake
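
The retry and error-handling options documented above can be combined with these examples. The connection name below is a placeholder and the retry count is arbitrary:

scai code deploy --all --retry 3 --connection my-snowflake    # retry each failed object deployment up to 3 times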

scai code extract

Extract code from the source database.

Usage:

scai code extract [OPTIONS]

Prerequisites:

  • A migration project initialized with 'scai init'
  • Source database connection configured (use 'scai connection add-redshift' or 'scai connection add-sql-server')
  • Network access to the source database

Options:

  -c, --connection <NAME>
      Name of the source connection to extract code from. [optional]
  -s, --schema <SCHEMA>
      Schema name to extract code from. [optional]
  -t, --object-type <TYPES>
      Object types to extract (comma-separated), e.g. TABLE,VIEW,PROCEDURE. [optional]

Examples:

Extract tables from a schema:

scai code extract --schema public --object-type TABLE

Extract tables and views:

scai code extract --object-type TABLE,VIEW

Extract from all schemas:

scai code extract
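
When several source connections exist, the connection, schema, and object types can all be named explicitly; the values below are placeholders:

scai code extract -c my-sqlserver -s dbo -t TABLE,VIEW,PROCEDURE    # extract tables, views, and procedures from the dbo schema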

scai connection

Manage source database connections (Redshift, SQL Server)

scai connection add-redshift

Add a new Redshift source database connection.

Usage:

scai connection add-redshift [OPTIONS]

Prerequisites:

  • Network access to the Redshift cluster/serverless endpoint
  • For IAM auth: AWS credentials configured (AWS CLI or environment variables)
  • For standard auth: Username and password

Options:

  -c, --connection <NAME>
      Name for this connection. [optional]
  --auth <AUTH>
      Authentication method (iam-serverless, iam-provisioned-cluster, standard). [optional]
  --user <USER>
      Username. [optional]
  --database <DATABASE>
      Database name. [optional]
  --connection-timeout <SECONDS>
      Connection timeout in seconds. [optional]
  --workgroup <NAME>
      Redshift Serverless workgroup name. [optional]
  --cluster-id <ID>
      Redshift Provisioned Cluster ID. [optional]
  --region <REGION>
      AWS region. [optional]
  --access-key-id <KEY>
      AWS Access Key ID. [optional]
  --secret-access-key <KEY>
      AWS Secret Access Key. [optional]
  --host <HOST>
      Redshift host. [optional]
  --port <PORT>
      Port number. [optional]
  --password <PASSWORD>
      Password. [optional]

Examples:

Add connection interactively (recommended):

scai connection add-redshift

IAM Serverless (inline):

scai connection add-redshift --connection my-redshift --auth iam-serverless --workgroup my-workgroup --database mydb --region us-east-1
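
The other documented authentication methods follow the same inline pattern. These sketches use only the flags listed above, with placeholder values; --password is intentionally omitted from the standard-auth line and would still need to be supplied (for example, via the interactive mode).

scai connection add-redshift --connection my-redshift-cluster --auth iam-provisioned-cluster --cluster-id my-cluster --region us-east-1 --database mydb
scai connection add-redshift --connection my-redshift-std --auth standard --host my-cluster.example.us-east-1.redshift.amazonaws.com --port 5439 --database mydb --user admin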

scai connection add-sql-server

Add a new SQL Server source database connection.

Usage:

scai connection add-sql-server [OPTIONS]

Prerequisites:

  • Network access to the SQL Server instance
  • For Windows auth: Valid domain credentials
  • For standard auth: SQL Server username and password

Options:

  -c, --connection <NAME>
      Name for this connection. [optional]
  --auth <AUTH>
      Authentication method (windows, standard). [optional]
  --user <USER>
      Username. [optional]
  --database <DATABASE>
      Database name. [optional]
  --connection-timeout <SECONDS>
      Connection timeout in seconds. [optional]
  --server-url <URL>
      SQL Server URL. [optional]
  --port <PORT>
      Port number. [optional]
  --password <PASSWORD>
      Password. [optional]
  --trust-server-certificate
      Trust server certificate. [optional]
  --encrypt
      Encrypt connection. [optional]

Examples:

Add connection interactively (recommended):

scai connection add-sql-server

Windows Authentication:

scai connection add-sql-server --connection my-sqlserver --auth windows --server-url localhost --database mydb

Standard Authentication:

scai connection add-sql-server --connection my-sqlserver --auth standard --server-url localhost --database mydb --user sa

scai connection list

List connections for a given source database.

Usage:

scai connection list [-l <LANGUAGE>]

Options:

  -l, --source-language <LANGUAGE>
      Source language of the connection. If omitted, shows a summary of all connections. [optional]

Examples:

List a summary of all connections:

scai connection list

List Redshift connections:

scai connection list -l redshift

List SQL Server connections:

scai connection list -l sqlserver

scai connection set-default

Set the default source connection for a database type.

Usage:

scai connection set-default -l <LANGUAGE> -c <CONNECTION>

Prerequisites:

  • Connection already added with 'scai connection add-redshift' or 'scai connection add-sql-server'

Options:

  -l, --source-language <LANGUAGE>
      Database type of the connection. [required]
  -c, --connection <NAME>
      Name of the source connection to set as default. [required]

Examples:

Set default Redshift connection:

scai connection set-default -l redshift --connection prod

Set default SQL Server connection:

scai connection set-default -l sqlserver --connection dev

scai connection test

Test a source database connection.

Usage:

scai connection test -l <LANGUAGE> [-c <CONNECTION>]

Prerequisites:

  • Connection already configured
  • Network access to the database

Options:

  -l, --source-language <LANGUAGE>
      Source language of the connection. Supported languages: SqlServer and Redshift. [required]
  -c, --connection <NAME>
      Name of the connection to test. [optional]

Examples:

Test SQL Server connection:

scai connection test -l sqlserver -c my-sqlserver

Test Redshift connection:

scai connection test -l redshift -c my-redshift
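
Putting the connection subcommands together with extraction, a typical source-setup sequence is sketched below; all names and values are placeholders.

scai connection add-sql-server --connection my-sqlserver --auth standard --server-url localhost --database mydb --user sa
scai connection test -l sqlserver -c my-sqlserver            # verify connectivity before extracting
scai connection set-default -l sqlserver -c my-sqlserver     # make it the default source connection
scai code extract                                            # extract using the default connection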

scai data

Data operations: migrate, validate

scai data cloud-migrate

Migrate data from the source system into a Snowflake account using cloud infrastructure.

Usage:

scai data cloud-migrate --config <CONFIG_PATH> --compute-pool <POOL> [-c <CONNECTION>] [OPTIONS]

Prerequisites:

  • Data migration configuration file (JSON format)
  • Snowflake compute pool created and accessible
  • Snowflake connection with appropriate privileges

Options:

  --config <PATH>
      Path to the configuration file. [required]
  -p, --compute-pool <NAME>
      Name of the compute pool that the Data Migration Service will run on. [required]
  -w, --warehouse <NAME>
      Name of the warehouse that the Data Migration Service will use for queries. [optional]
  -c, --connection <NAME>
      Name of the Snowflake connection to use (from connections.toml). Uses default if not specified. [optional]
  --watch
      Wait for the workflow to complete. Without this flag, the command returns after the workflow is created. [optional; default: False]

Examples:

Start migration (returns immediately):

scai data cloud-migrate --config my-data-migration-config.json --compute-pool MY_COMPUTE_POOL --connection my-snowflake

Start and wait for completion:

scai data cloud-migrate --config my-data-migration-config.json --compute-pool MY_COMPUTE_POOL --connection my-snowflake --watch
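
The optional -w/--warehouse flag and the status command pair naturally with the examples above. The workflow name on the second line stands in for whatever name 'scai data cloud-migrate' reports when it creates the workflow; the other values are placeholders.

scai data cloud-migrate --config my-data-migration-config.json --compute-pool MY_COMPUTE_POOL --warehouse MY_WH --connection my-snowflake
scai data cloud-migrate-status DATA_MIGRATION_WORKFLOW_xxx --watch    # follow the created workflow until it completes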

scai data cloud-migrate-status

Check the status of a Cloud Data Migration workflow.

Usage:

scai data cloud-migrate-status <WORKFLOW_NAME> [OPTIONS]

Prerequisites:

  • A workflow started with 'scai data cloud-migrate'
  • Snowflake connection with access to the workflow

Options:

  <WORKFLOW_NAME>
      The workflow name to check status for. [required]
  -c, --connection <NAME>
      The Snowflake connection name to use. [optional]
  -w, --watch
      Display progress bars and poll for updates until the workflow completes. [optional; default: False]

Examples:

Check workflow status:

scai data cloud-migrate-status DATA_MIGRATION_WORKFLOW_xxx

Watch workflow progress:

scai data cloud-migrate-status DATA_MIGRATION_WORKFLOW_xxx --watch

Watch with custom interval:

scai data cloud-migrate-status DATA_MIGRATION_WORKFLOW_xxx --watch --poll-interval 10

scai data migrate

Migrate data from the source system into a Snowflake account.

Usage:

scai data migrate [OPTIONS]

Prerequisites:

  • Code converted with 'scai code convert' (generates TopLevelCodeUnits report)
  • Code deployed with 'scai code deploy' (creates target tables in Snowflake)
  • Source database connection configured
  • Snowflake connection configured with INSERT privileges
  • If using --selector: a selector file (create with 'scai object-selector create')
  • For Redshift: S3 bucket, Snowflake storage integration, and external stage configured

Options:

  -c, --source-connection <NAME>
      Name of the source connection to extract data from. If not provided, the default connection will be used. [optional]
  -t, --target-connection <NAME>
      Name of the target connection to migrate data to. If not provided, the default connection will be used. [optional]
  -o, --selector <PATH>
      Name of the selector file to use for migration. If not provided, all tables from the TopLevelCodeUnits report will be migrated. [optional]
  -b, --bucket-uri <URI>
      (Redshift only) The S3 bucket URI where data will be staged (e.g., s3://my-bucket/path). [optional]
  --stage <STAGE_NAME>
      (Redshift only) The fully qualified name of the Snowflake stage used to load parquet files from the S3 bucket (e.g., database.schema.stage_name). [optional]
  -i, --iam-role-arn <ARN>
      (Redshift only) The IAM role ARN used to unload parquet files to the S3 bucket. [optional]

Examples:

Migrate all tables:

scai data migrate --source-connection my-redshift --target-connection my-snowflake

Migrate selected tables:

scai data migrate --source-connection my-redshift --target-connection my-snowflake --selector my-selector.yml
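
The Redshift-only staging options (-b/--bucket-uri, --stage, -i/--iam-role-arn) have no example of their own; a sketch with placeholder bucket, stage, and IAM role values might look like this:

scai data migrate --source-connection my-redshift --target-connection my-snowflake \
  --bucket-uri s3://my-bucket/staging \
  --stage MYDB.PUBLIC.MY_EXTERNAL_STAGE \
  --iam-role-arn arn:aws:iam::123456789012:role/redshift-unload-role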

scai data validate

Compare data between source and Snowflake to verify data integrity.

Usage:

scai data validate [OPTIONS]

Prerequisites:

  • Source database connection configured
  • Snowflake connection configured
  • Tables must exist in both source and target databases

Options:

  --source-connection <NAME>
      Name of the source connection to use (from the configured source connections). Uses default if not specified. [optional]
  --target-connection <NAME>
      Name of the Snowflake connection to use (from connections.toml). Uses default if not specified. [optional]
  -d, --target-database <NAME>
      Target Snowflake database for validation. Uses database from connection if not specified. [optional]
  -o, --selector <PATH>
      Name of the selector file to use for validation. If not provided, all tables from the TopLevelCodeUnits report will be validated. [optional]
  -m, --db-mapping <MAPPING>
      Database name mapping in format 'source:target'. Can be specified multiple times for multiple mappings. [optional]
  -e, --schema-mapping <MAPPING>
      Schema name mapping in format 'source:target'. Can be specified multiple times for multiple mappings. [optional]

Examples:

Validate all tables from report:

scai data validate

Validate with selector file:

scai data validate --selector my-tables.yml

With target database:

scai data validate --target-database PROD_DB

With name mappings:

scai data validate --db-mapping "sourcedb:TARGETDB" --schema-mapping "dbo:PUBLIC"

With explicit connections:

scai data validate --source-connection my-sqlserver --target-connection my-snowflake

scai git

Git workflow for iterative conversions

scai git enable

Enable Git workflow for iterative conversions.

Usage:

scai git enable [OPTIONS]

Prerequisites:

  • A migration project initialized with 'scai init'
  • Git installed on the system

Options:

  -b, --baseline-branch <NAME>
      Name of the baseline branch for conversion results. [optional; default: baseline]

Examples:

Enable git workflow:

scai git enable

Enable with custom baseline branch:

scai git enable --baseline-branch vendor

scai license

Install offline license for air-gapped environments

scai license install

Install an offline license for running conversions without online activation.

Usage:

scai license install -p <LICENSE_PATH>

Prerequisites:

  • A valid offline license file (.lic) from Snowflake

Options:

  -p, --path <LICENSE_PATH>
      Path to the license file to install. [required]

Examples:

Install license:

scai license install --path /path/to/license.lic

scai object-selector

Create selector files for filtering objects

scai object-selector create

Create a selector file to filter objects for data migration.

Usage:

scai object-selector create [OPTIONS]

Prerequisites:

  • Code converted with 'scai code convert' (generates TopLevelCodeUnits report)

Options:

  -d, --database <NAME>
      Filter objects by source database name. [optional]
  -s, --schema <NAME>
      Filter objects by source schema name. [optional]
  -t, --type <TYPES>
      Filter objects by type (comma-separated, e.g., table,view,procedure). [optional]
  -n, --name <NAME>
      Label for the selector file (becomes ..yml); if not provided, the file is named object-selector..yml. [optional]

Examples:

Create selector file:

scai object-selector create

Create with custom output path:

scai object-selector create -o custom-selector.yml
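
A selector file is usually created and then passed to the data commands. In the sketch below the filter values are placeholders, and the file name given to --selector is an assumption; substitute the path that 'scai object-selector create' actually reports.

scai object-selector create -d salesdb -s public -t table -n sales-tables
scai data migrate --selector sales-tables.yml    # assumed file name; use the file the create command writes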

scai project

View and manage project configuration

scai project info

Display project details including name, source language, and status.

Usage:

scai project info

Prerequisites:

  • Must be run from within a migration project directory

Examples:

Show current project details:

scai project info

scai project set-default-connection

Set the default Snowflake connection for the current project.

Usage:

scai project set-default-connection -c <CONNECTION>

Prerequisites:

  • A migration project initialized with 'scai init'
  • Snowflake connection available in connections.toml or config.toml

Options:

  -c, --connection <NAME>
      Name of the Snowflake connection to set as the project default. [required]

Examples:

Set project default connection:

scai project set-default-connection -c my-snowflake

Change to production connection:

scai project set-default-connection -c prod-snowflake

scai state

Track code unit state through migration stages

scai state disable

Disable state management for the project.

Usage:

scai state disable

Prerequisites:

  • A migration project with state management enabled

Examples:

Disable state management:

scai state disable

scai state enable

Enable state management for the project.

Usage:

scai state enable

Prerequisites:

  • A migration project initialized with ‘scai init’

Examples:

Enable state management:

scai state enable

scai state status

Show the current state of code units in the project.

Usage:

scai state status

Prerequisites:

  • A migration project with state management enabled

Examples:

Show current state:

scai state status

Generated: 2026-01-30 11:17:01