SnowConvert AI CLI - Getting Started¶

What is SnowConvert AI CLI?¶

SnowConvert AI CLI (scai) encapsulates all SnowConvert functions into a single command line tool dedicated to increasing the speed of migrations from various source platforms into Snowflake.

With the SnowConvert AI CLI, migration engineers can extract code from their source platform, run a deterministic conversion on that code, use AI conversion to cover objects the deterministic engine could not translate, deploy the converted code to Snowflake, migrate data from the source system to Snowflake, and then validate that data between the two systems.

The CLI also lets developers build skills and agents that use the tool to automate their migration process.
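
For example, an end-to-end run could be wrapped in a small shell function (a sketch only; it assumes `scai` is on `PATH` and strings together the subcommands shown later in this guide, with illustrative arguments):

```shell
# Hypothetical wrapper: initialize, load, convert, and deploy in one call.
# Runs in a subshell so `cd` and `set -e` do not leak into the caller.
migrate() (
  set -e                       # stop on the first failing step
  scai init "$1" -l "$2"       # e.g. my-project Oracle
  cd "$1"
  scai code add -i "$3"        # path to your source SQL files
  scai code convert
  scai code deploy
)
# Usage: migrate my-project Oracle /path/to/sql
```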

Prerequisites¶

  • macOS, Windows, or Linux

  • Snowflake CLI: recommended for Snowflake connection configuration (install guide)

Snowflake Connection Setup¶

The SnowConvert AI CLI reuses your Snowflake CLI connection configuration.

Before using scai init, scai code convert, scai code extract, scai ai-convert or scai code deploy, ensure:

  • You can connect to Snowflake with snow connection test

  • Your Snowflake CLI has a default connection configured (the one used when you don’t specify a connection name)
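
For reference, the Snowflake CLI stores connections in a TOML config file (by default `~/.snowflake/config.toml`), and the default connection is the one named by `default_connection_name`. A minimal sketch with placeholder values:

```toml
# ~/.snowflake/config.toml (account and user values are placeholders)
default_connection_name = "my_conn"

[connections.my_conn]
account = "myaccount"
user = "myuser"
```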

To configure a Snowflake connection:

# Add a new connection using Snowflake CLI

snow connection add

# Test your connection

snow connection test

Once configured, scai code deploy will use your Snowflake CLI connections automatically.

Installation¶

Package Manager (macOS only)¶

PUBLIC and BETA channels are available via Homebrew:

brew tap snowflakedb/snowconvert-ai
brew install --cask snowconvert-ai

Verify installation:

scai --version

If you encounter issues, perform a clean install:

brew update && brew upgrade
brew untap snowflakedb/snowconvert-ai
brew tap snowflakedb/snowconvert-ai
brew install --cask snowconvert-ai

Installer packages (macOS, Linux, Windows)¶

BETA Releases¶

| OS | Package type | Permalink |
| --- | --- | --- |
| macOS | arm64 `.pkg` | https://snowconvert.snowflake.com/storage/darwin_arm64/beta/cli/snowflake-scai-cli-darwin-arm64-beta.pkg |
| macOS | x64 `.pkg` | https://snowconvert.snowflake.com/storage/darwin_x64/beta/cli/snowflake-scai-cli-darwin-x64-beta.pkg |
| Linux | arm64 `.rpm` | https://snowconvert.snowflake.com/storage/linux/beta/cli/snowflake-scai-cli-linux-arm64-beta.rpm |
| Linux | arm64 `.deb` | https://snowconvert.snowflake.com/storage/linux/beta/cli/snowflake-scai-cli-linux-arm64-beta.deb |
| Linux | x64 `.rpm` | https://snowconvert.snowflake.com/storage/linux/beta/cli/snowflake-scai-cli-linux-x64-beta.rpm |
| Linux | x64 `.deb` | https://snowconvert.snowflake.com/storage/linux/beta/cli/snowflake-scai-cli-linux-x64-beta.deb |
| Windows | arm64 `.msi` | https://snowconvert.snowflake.com/storage/windows_arm64/beta/cli/snowflake-scai-cli-windows-arm64-beta.msi |
| Windows | x64 `.msi` | https://snowconvert.snowflake.com/storage/windows/beta/cli/snowflake-scai-cli-windows-x64-beta.msi |

Understanding Projects¶

A project is required before you can use any other scai command. This is similar to how Git requires you to run git init before using other Git commands.

A project:

  • Organizes your migration work in a dedicated folder structure

  • Tracks your source dialect (Oracle, SQL Server, Teradata, etc.)

  • Stores configuration, source code, converted code, and reports

When you run scai init, it creates this folder structure in the target directory.

  • If you pass a PATH, scai will create that folder (if it doesn’t exist) and initialize the project inside it.

  • If you omit PATH, scai initializes the current directory (which must be empty).

.

├── .scai/            # Project configuration
│   └── config/       # project.yml and local settings
├── source/           # Your source code (input)
├── converted/        # Converted Snowflake code (output)
└── results/          # Additional job artifacts (conversion reports are under converted/Reports)

Important: After creating a project, run all subsequent scai commands from within the project folder (where the .scai/ directory exists).

CLI logs are written to ~/.scai/logs/jobs.log by default.
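
To inspect recent activity, you can tail that log (a small illustrative snippet; the path is the default noted above):

```shell
# Show the last lines of the scai job log, if it exists yet
LOG="$HOME/.scai/logs/jobs.log"
if [ -f "$LOG" ]; then
  tail -n 50 "$LOG"
else
  echo "No log file yet at $LOG"
fi
```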

Quick Start: Code Conversion Only¶

Use this workflow when you have existing SQL files to convert. Works with all supported dialects.

# 1. Create a project folder and initialize it (project name is inferred from folder name)

scai init my-project -l Oracle
cd my-project

# 2. Add your source code

scai code add -i /path/to/your/sql/files

# 3. Convert to Snowflake SQL

scai code convert

# 4. Deploy to Snowflake

scai code deploy

Your converted code will be in the converted/ folder (including converted/Output/SnowConvert/), and conversion reports in converted/Reports/.
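
From the project root, a quick way to check what the conversion produced (illustrative; folder names as documented above):

```shell
# List conversion reports when they exist; otherwise say so
if [ -d converted/Reports ]; then
  ls converted/Reports
else
  echo "No reports yet (run 'scai code convert' first)"
fi
```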

Quick Start: End-to-End Migration¶

Use this workflow to extract code directly from your source database. Only available for SQL Server and Redshift.

Step 1: Create Project¶

# Create a project folder and initialize it (project name is inferred from folder name)

scai init my-project -l SqlServer  # or Redshift
cd my-project

Step 2: Configure Source Connection¶

# SQL Server (interactive mode - recommended)

scai connection add-sql-server

# Redshift (interactive mode - recommended)

scai connection add-redshift

The interactive mode will prompt you for connection details. Run the command with --help to see non-interactive options.

Set a default source connection (used when scai code extract runs without --source-connection):

scai connection set-default -l sqlserver --connection <NAME>

# or:

scai connection set-default -l redshift --connection <NAME>

Step 3: Extract, Convert, Deploy¶

# Extract code from source database

scai code extract

# Convert to Snowflake SQL

scai code convert

# Deploy to Snowflake

scai code deploy

Workflow Examples¶

Example 1: Migrate Oracle Stored Procedures¶

# Create project

scai init oracle-migration -l Oracle
cd oracle-migration

# Add your PL/SQL files

scai code add -i ./oracle-procs/

# Convert

scai code convert

# Review converted code in converted/ folder, then deploy

scai code deploy --all

Example 2: SQL Server End-to-End with Specific Schema¶

# Create project

scai init sqlserver-migration -l SqlServer

cd sqlserver-migration

# Add connection

scai connection add-sql-server

# Extract only the 'sales' schema

scai code extract --schema sales

# Convert

scai code convert
# Deploy

scai code deploy

Example 3: Redshift Migration with Git Tracking¶

# Create project with git workflow

scai init redshift-migration -l Redshift --git-flow

cd redshift-migration

# Add connection

scai connection add-redshift

# Extract

scai code extract

# Convert (changes are tracked in git)

scai code convert

# Deploy

scai code deploy --all

Example 4: AI Convert After Code Conversion¶

# Create project

scai init ai-convert-demo -l SqlServer

cd ai-convert-demo

# Add connection

scai connection add-sql-server

# Extract and convert

scai code extract

scai code convert

# Start AI code conversion and wait for completion

scai ai-convert start -w

# Review last executed job results

scai ai-convert status

# Review executed job list

scai ai-convert list

Getting Help¶

Use --help with any command to see available options:

scai --help

scai init --help

scai code convert --help

scai connection add-redshift --help

Troubleshooting¶

“Project file not found”¶

You must run commands from within a project directory. Navigate to your project folder (where the .scai/ directory exists) before running commands:

cd <project-folder>

scai code convert
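
A quick way to confirm you are in the right place (an illustrative snippet; it only checks for the `.scai/` directory described above):

```shell
# Verify the current directory is a scai project before running commands
if [ -d .scai ]; then
  echo "In a scai project"
else
  echo "Not in a scai project: cd into the project folder (or run 'scai init')"
fi
```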

“Connection not found” (source database)¶

  1. List your connections: scai connection list -l <language>

  2. Add a connection if needed: scai connection add-sql-server or scai connection add-redshift

  3. Or set a default: scai connection set-default -l <language> --connection-name <name>

“Authentication failed” for Snowflake¶

The SnowConvert AI CLI uses your Snowflake CLI configuration. Make sure you have a working default Snowflake connection configured in the Snowflake CLI (the one used when no connection name is specified):

# List available Snowflake connections

snow connection list

# Test your connection

snow connection test

# Add a new connection if needed

snow connection add

Supported Source Dialects¶

| Dialect | Code Extraction | Code Conversion | Deployment |
| --- | --- | --- | --- |
| SQL Server | ✅ | ✅ | ✅ |
| Redshift | ✅ | ✅ | ✅ |
| Oracle | — | ✅ | — |
| Teradata | — | ✅ | — |
| BigQuery | — | ✅ | — |
| Databricks | — | ✅ | — |
| Greenplum | — | ✅ | — |
| Sybase | — | ✅ | — |
| PostgreSQL | — | ✅ | — |
| Netezza | — | ✅ | — |
| Spark | — | ✅ | — |
| Vertica | — | ✅ | — |
| Hive | — | ✅ | — |
| DB2 | — | ✅ | — |