# SnowConvert AI CLI (scai) Command Reference
SnowConvert AI (scai) is a CLI tool for accelerated database migration to Snowflake. It manages end-to-end migration workflows including code extraction from source databases, automated conversion to Snowflake SQL, AI-powered code improvement, deployment, data migration, and validation.
## Global Options

These options are available on every scai command.

| Option | Description |
|---|---|
| `-h`, `--help` | Show help message |
| | Display version information |
| | Enable debug-level logging. Can also be set via an environment variable. |
## Quick Start

The basic workflow to get started with scai. For a more detailed walkthrough, see the quick start guide.

1. Create a project (use `-c` to set a default Snowflake connection).
2. Add source code:
   - For full migration (SQL Server, Redshift), extract directly from the source database.
   - For other languages, add source files from disk.
3. Convert to Snowflake SQL.
4. Optional additional steps: deploy, data migration, and validation.

Tip: Using `-c <connection>` during `scai init` saves the Snowflake connection in the project, so you don't need to specify it for each command. Run `scai <command> -h` for detailed help on any command.
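As a sketch, the steps above might look like the following. `MyProject`, the paths, and the connection name are placeholders; only flag spellings documented elsewhere in this reference are used.

```shell
# 1. Create a project with a default Snowflake connection
scai init ./MyProject -c my_snowflake_conn

# 2a. Full migration (SQL Server, Redshift): extract from the source database
scai code extract

# 2b. Other languages: add source files from disk
scai code add -i ./source_code

# 3. Convert to Snowflake SQL
scai code convert
```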
## Commands
### scai init

Create a new migration project in the specified directory (or the current directory if PATH is omitted).

Prerequisites:

- Target directory must not contain an existing project
- Valid source language must be specified (see Supported Languages)
Options:

| Option | Description | Required | Default |
|---|---|---|---|
| `PATH` | Directory to create the project in. If omitted, uses the current directory. | No | |
| | Project name. If omitted, defaults to the target folder name. | No | |
| | Source language for the project. | Yes | |
| `-i`, `--input-code-path` | Path to source code files to add during initialization. SQL Server and Redshift sources are processed through the arrange and assess pipeline; other languages are copied directly to `source/`. | No | |
| `--skip-split` | Skip the arrange/split phase when source code is already split (SQL Server and Redshift only). Requires `--input-code-path`. | No | |
| `-c`, `--connection` | Snowflake connection name to save as project default. | No | |
Behavior notes:

- Creates the project directory structure and configuration files.
- When `--input-code-path` is provided: SQL Server and Redshift run the arrange and parse-and-assess pipeline, promote processed files to `source/`, and generate a code unit registry. Other languages copy source directly to `source/`.
- When `--skip-split` is used with `--input-code-path` (SQL Server and Redshift only), skips the arrange/split phase, promotes raw source directly to `source/`, runs assessment only for code unit registry generation, and marks the project as type Full (new folder structure).
- Redshift source files require paired SC tags (e.g., `-- <sc-table> table_name </sc-table>`) for the arrange step. If validation or arrange fails, the project is still created but source code is not added. Recovery: fix source files and run `scai code add -i <path>`, or use `--skip-split` if the code is already split.
Project folder structure:

| Path | Description |
|---|---|
| | Project configuration |
| `artifacts/` | Intermediate processing artifacts |
| `source/` | Processed source code (populated by `scai code extract` or `scai code add`) |
| `snowflake/` | Converted code, reports, and logs |
Examples:
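A few hedged sketches; folder names and the connection name are placeholders, and the required source-language option is omitted because its flag spelling is not shown in this reference.

```shell
# Initialize in the current directory (project name defaults to the folder name)
scai init

# Initialize with a default connection and source code added up front
scai init ./mssql_migration -c my_snowflake_conn -i ./extracted_ddl

# Source code is already split into one file per object: skip the arrange phase
scai init ./mssql_migration -i ./extracted_ddl --skip-split
```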
### scai project

View and manage project configuration.
#### scai project info

Display project details including name, source language, and status.

Prerequisites:

- Must be run from within a migration project directory.

Output fields: Project Name, Project ID, Source Language, Snowflake Connection, Project Root.
#### scai project set-default-connection

Set the default Snowflake connection for the current project.

Prerequisites:

- A migration project initialized with `scai init`.
- Snowflake connection available in `connections.toml` or `config.toml`.
Options:

| Option | Description | Required |
|---|---|---|
| | Name of the Snowflake connection to set as the project default. | Yes |

Connection precedence:

1. `-c`/`--connection` option (per-command override)
2. Project connection (set by this command)
3. Default TOML connection
Examples:
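A hypothetical invocation; the `-c` spelling is an assumption based on the `-c`/`--connection` convention used elsewhere in this reference, and the connection name is a placeholder.

```shell
scai project set-default-connection -c my_snowflake_conn
```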
### scai connection

Manage source database connections (Redshift, SQL Server).

#### scai connection add-sql-server

Add a new SQL Server source database connection.

Prerequisites:

- Network access to the SQL Server instance.
- For Windows auth: valid domain credentials.
- For standard auth: SQL Server username and password.

Authentication methods: Windows Authentication (Integrated Security) or Standard Authentication (username/password).

Operation modes: Interactive (prompts for all required information; recommended) or Inline (command-line options for automation/CI).

Connections are saved to `~/.snowflake/connections.toml` (or project-local).
Options:

| Option | Description | Required |
|---|---|---|
| | Name for this connection. | No |
| | Authentication method. | No |
| | Username. | No |
| | Database name. | No |
| | Connection timeout in seconds. | No |
| | SQL Server URL. | No |
| | Port number. | No |
| | Password. | No |
| | Trust server certificate. | No |
| | Encrypt connection. | No |
Examples:
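Since inline flag spellings are not reproduced here, the safest illustration is the interactive mode the reference itself recommends:

```shell
# Interactive mode (recommended): prompts for server, credentials, and options
scai connection add-sql-server
```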
#### scai connection add-redshift

Add a new Redshift source database connection.

Prerequisites:

- Network access to the Redshift cluster/serverless endpoint.
- For IAM auth: AWS credentials configured (AWS CLI or environment variables).
- For standard auth: username and password.

Authentication methods: IAM Serverless (AWS IAM with Redshift Serverless), IAM Provisioned (AWS IAM with Provisioned Cluster), or Standard (username/password).

Operation modes: Interactive (recommended) or Inline (for automation/CI).

Connections are saved to `~/.snowflake/connections.toml` (or project-local).
Options:

| Option | Description | Required |
|---|---|---|
| | Name for this connection. | No |
| | Authentication method. | No |
| | Username. | No |
| | Database name. | No |
| | Connection timeout in seconds. | No |
| | Redshift Serverless workgroup name. | No |
| | Redshift Provisioned Cluster ID. | No |
| | AWS region. | No |
| | AWS Access Key ID. | No |
| | AWS Secret Access Key. | No |
| | Redshift host. | No |
| | Port number. | No |
| | Password. | No |
Examples:
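As with SQL Server connections, the interactive mode recommended above needs no flags:

```shell
# Interactive mode (recommended): prompts for endpoint, auth method, and credentials
scai connection add-redshift
```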
#### scai connection set-default

Set the default source connection for a database type.

Prerequisites:

- Connection already added with `scai connection add-redshift` or `scai connection add-sql-server`.

Options:

| Option | Description | Required |
|---|---|---|
| | Database type of the connection. | Yes |
| | Name of the source connection to set as default. | Yes |
#### scai connection list

List connections for a given source database.

Options:

| Option | Description | Required |
|---|---|---|
| | Source language of the connection. | No |

Output: Table with columns: Name, Default, Host, Database.
#### scai connection test

Test a source database connection.

Prerequisites:

- Connection already configured.
- Network access to the database.

Options:

| Option | Description | Required |
|---|---|---|
| | Source language of the connection. | Yes |
| | Name of the connection to test. | No |
### scai code

Code operations: add, extract, convert, deploy, find, accept, where, resync.

#### scai code add

Add source code from an input file or directory to the project's `source/` folder.

Prerequisites:

- A migration project initialized with `scai init`.
- Input must be a valid SQL source file or a directory containing SQL source files.
Options:

| Option | Description | Required | Default |
|---|---|---|---|
| `-i` | Path to a source code file or directory to add to the project. | Yes | |
| `--overwrite` | Overwrite existing files in the project's `source/` folder. | No | |
| `--skip-split` | Skip the arrange/split phase when source code is already split (SQL Server and Redshift only). | No | |
| | Identifier for the source system (e.g., server hostname). Recorded in the code unit registry. | No | |
Behavior notes:

- SQL Server and Redshift: runs arrange-only, produces `artifacts/source_raw_Processed/`, and merges into `source/`.
- Other languages: copies source directly into `source/`.
- Checks for conflicting files when destination folders are non-empty (unless `--overwrite` is set).
Examples:
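A few sketches using the flags documented above; file and directory paths are placeholders.

```shell
# Add a single file
scai code add -i ./procs/load_sales.sql

# Add a directory, overwriting any conflicting files already in source/
scai code add -i ./extracted_ddl --overwrite

# Code is already split into one file per object (SQL Server / Redshift)
scai code add -i ./split_ddl --skip-split
```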
#### scai code extract

Extract code from the source database.

Supported languages: SQL Server, Redshift.

Prerequisites:

- A migration project initialized with `scai init`.
- Source database connection configured (use `scai connection add-redshift` or `scai connection add-sql-server`).
- Network access to the source database.
Options:

| Option | Description | Required | Default |
|---|---|---|---|
| | Name of the source connection to extract code from. | No | |
| `--schema` | Schema name to extract code from. | No | |
| `-t`, `--object-type` | Object types to extract (comma-separated). | No | |
| `-n`, `--name` | Filter objects by name. Supports substring match or wildcard patterns with `*`. | No | |
| `-i` | Interactive mode: browse and select schemas, object types, and filter by name. | No | |
| | Identifier for the source system. Recorded in the code unit registry. | No | |
Interactive mode:

- Requires an interactive terminal. In non-interactive or CI environments, use `--schema`, `--object-type`, and `--name` instead.
- Pre-fetch phase: prompt for schema (or leave empty for all) and multi-select object types to scope the catalog query.
- Post-fetch phase: multi-select schemas to include, optional name filter (wildcard `*` supported), summary table, then confirm extraction.
- Options `--schema`, `-t`/`--object-type`, and `-n`/`--name` pre-fill the interactive prompts when used with `-i`.
Examples:
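Sketches using the documented flags; the schema name and filter pattern are placeholders.

```shell
# Extract everything using the default source connection
scai code extract

# Extract only tables and views from one schema
scai code extract --schema sales -t table,view

# Wildcard name filter
scai code extract --schema sales -n 'dim_*'

# Browse and select interactively
scai code extract -i
```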
#### scai code convert

Transform source database code to Snowflake SQL.

Prerequisites:

- A migration project initialized with `scai init`.
- Source code in the `source/` folder (from `scai code extract`, `scai code add`, or manual copy).
Options:

| Option | Description | Required |
|---|---|---|
| | Display all conversion settings available for the project's source language. | No |
| | Path to ETL replatform source files for cross-project code analysis. | No |
| | Path to Power BI files for input repointing. | No |
| | Show detailed EWI (Early Warning Issues) table instead of a summary. | No |
| | Path to read migration context from. | No |
| | Overwrite the output files in the `snowflake/` folder. | No |
| `--where` | SQL-like filter to select which code units to convert (see WHERE Clause Reference). Only matched units are transformed; dependencies are still parsed for symbol resolution. | No |
Dialect-specific settings:

Additional options are dynamically loaded based on the project's source language. Run `scai code convert --help` within a project to see all available options for that dialect.

Common options available across multiple dialects:
| Option | Description | Dialects |
|---|---|---|
| | Comment nodes that have missing dependencies. | SQL Server, Oracle, Teradata |
| | File encoding for source files (default: UTF-8). | All |
| | Custom schema name to apply. | SQL Server, Oracle, Teradata |
| | Custom database name to apply. | SQL Server, Oracle, Teradata |
| | Preserve existing name qualification from input code. | SQL Server, Oracle, Teradata |
| | Path to a file that specifies new names for objects. | Redshift, SQL Server, Teradata |
| | Arrange the code before translation. | SQL Server, Oracle, Teradata, Redshift |
| | Target language for stored procedure transformation. | SQL Server, Oracle, Teradata |
| | Warehouse name for dynamic table refresh. | SQL Server, Oracle, Teradata, Databricks, Spark |
| | Target lag for dynamic tables. | SQL Server, Oracle, Teradata, Databricks, Spark |
| | Feature flags to enable Snowflake preview features. | All |
| | Generate estimation reports. | All |
Examples:
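Sketches built from the documented `--where` fields and `--help` behavior; filter values are illustrative.

```shell
# Convert everything in source/
scai code convert

# Convert only tables
scai code convert --where "source.objectType = 'table'"

# List every conversion setting available for this project's dialect
scai code convert --help
```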
#### scai code deploy

Deploy converted SQL code to Snowflake.

Prerequisites:

- Converted code in `snowflake/Output/` (from `scai code convert`).
- Snowflake connection configured (set with `scai init -c` or project settings).
- Appropriate Snowflake privileges (CREATE TABLE, CREATE VIEW, etc.).
Options:

| Option | Description | Required | Default |
|---|---|---|---|
| `-c`, `--connection` | The Snowflake connection to use. Uses the default if not specified. | No | |
| | Target database name for deployment. Also sets the connection database if not already configured. | No | |
| `--warehouse` | Warehouse for the Snowflake connection. Only applied if the connection does not already have one. | No | |
| `--schema` | Schema for the Snowflake connection. Only applied if the connection does not already have one. | No | |
| `--role` | Role for the Snowflake connection. Only applied if the connection does not already have one. | No | |
| `--where` | SQL-like WHERE clause to filter objects to deploy (see WHERE Clause Reference). | No | |
| | Deploy all successfully converted objects without the selection prompt. | No | |
| | Number of retry attempts for failed object deployments. | No | |
| | Continue deploying remaining objects even if some fail. | No | |
| | When used with … | No | |
Behavior notes:

- `--warehouse`, `--schema`, and `--role` temporarily set missing connection fields (in-memory only; the TOML file is not modified).
- If the connection already has a value for an overridden field, an error is returned.
Examples:
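Sketches using the documented flags; the warehouse name and filter value are placeholders.

```shell
# Deploy interactively, selecting objects at the prompt
scai code deploy

# Deploy only views, supplying a warehouse for a connection that has none
scai code deploy --where "source.objectType = 'view'" --warehouse MIGRATION_WH
```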
#### scai code find

Find code units from the project's Code Unit Registry.

Prerequisites:

- A migration project initialized with `scai init`.
- An initialized Code Unit Registry (generated after `scai code convert`).
Options:

| Option | Description | Required | Default |
|---|---|---|---|
| `--where` | SQL-like WHERE clause to filter objects (see WHERE Clause Reference). | No | |
| | Disable the default 100-row limit on displayed objects. | No | |

Output: Table with columns: Id, Fully Qualified Name, Object Type.
Examples:
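Two sketches using the documented registry fields and enum values:

```shell
# List code units whose conversion failed
scai code find --where "codeStatus.conversion.status = 'failed'"

# List all procedures
scai code find --where "source.objectType = 'procedure'"
```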
#### scai code accept

Accept the latest converted artifact versions into the `snowflake/` output folder.

Prerequisites:

- A migration project initialized with `scai init`.
- Source code must be split and registry files must be generated (run `scai code add`).
- At least one code conversion run with `scai code convert`.
Options:

| Option | Description | Required |
|---|---|---|
| `--where` | Filter expression to select which objects to accept (see WHERE Clause Reference). | No |

Behavior notes:

- Scans the `artifacts/` directory for timestamped conversion outputs.
- For each code unit, selects the most recent version based on the timestamp folder name (`yyyyMMdd.HHmmss`).
- Copies the latest `.sql` files into the `snowflake/` folder, preserving directory structure.
#### scai code where

Show the WHERE clause query reference for code unit filtering.

This command displays all queryable fields, supported operators, and usage examples for WHERE clause filtering. It does not require a project directory or network access. The reference is generated at runtime from the Code Unit Registry schema.

Field naming conventions:

- Field names use camelCase with dot-notation: `source.objectType`, `target.objectType`, `codeStatus.conversion.status`, `codeStatus.aiVerification.status`, `codeStatus.registration.status`
- Enum values are lowercase: `'table'`, `'procedure'`, `'view'`, `'function'`, `'completed'`, `'failed'`, `'pending'`, `'excluded'`
Commands that support `--where` (as documented in their respective sections):

| Command | Usage |
|---|---|
| `scai code convert` | Filter which code units to convert |
| `scai code deploy` | Filter objects to deploy |
| `scai code find` | Filter objects to display |
| `scai code accept` | Filter objects to accept |
| `scai ai-convert start` | Filter objects to convert |
| `scai ai-convert accept` | Filter objects to accept |
| `scai data migrate` | Filter tables to migrate |
| `scai data validate` | Filter tables to validate |
Examples:
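Sketches combining the documented fields and enum values with commands that accept `--where`:

```shell
# Print the full field and operator reference
scai code where

# Example filters built from the documented fields
scai code find --where "source.objectType = 'table'"
scai code deploy --where "codeStatus.conversion.status = 'completed'"
```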
#### scai code resync

Re-scan modified converted files and update issue metadata in the Code Unit Registry.

Prerequisites:

- A migration project initialized with `scai init`.
- Code converted with `scai code convert`.

Behavior notes:

- Detects code units whose converted files have been modified.
- Re-scans each modified file for SnowConvert issue codes (EWI, FDM, OOS, PRF).
- Updates the issue metadata in the registry.
### scai ai-convert

AI-powered code improvement and test generation.

#### scai ai-convert start

Start AI-powered code conversion on converted code.

Prerequisites:

- Code converted with `scai code convert` (generates the TopLevelCodeUnits report).
- Snowflake connection configured with `snow connection add`.
- `CREATE MIGRATION` privilege granted on the Snowflake account.
- A warehouse configured in the Snowflake connection.
- Must accept AI disclaimers (interactive prompt or `-y` flag).
Options:

| Option | Description | Required | Default |
|---|---|---|---|
| `-c`, `--connection` | Snowflake connection for AI code conversion. | No | |
| `--selector` | Path to object selector file (YAML). Only for code_conversion_only projects. | No | |
| | Path to instructions file with custom AI conversion configuration. | No | |
| | Display job progress until completion (may take several minutes to hours). | No | |
| `-y` | Accept all AI disclaimers without prompting (required for non-interactive use). | No | |
| `--where` | SQL-like WHERE clause to filter objects to convert (see WHERE Clause Reference). | No | |
| `--warehouse` | Warehouse for the Snowflake connection. Only applied if the connection does not already have one. | No | |
| `--schema` | Schema for the Snowflake connection. Only applied if the connection does not already have one. | No | |
| `--role` | Role for the Snowflake connection. Only applied if the connection does not already have one. | No | |
| `--database` | Database for the Snowflake connection. Only applied if the connection does not already have one. | No | |
Testing modes:

- Default: tests converted code on Snowflake only.
- Source system verification: also runs tests against the source database (requires an instructions file).
Examples:
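Sketches using the documented flags; the selector path and filter value are placeholders.

```shell
# Start a job non-interactively, accepting the AI disclaimers
scai ai-convert start -y

# Restrict the job to procedures
scai ai-convert start -y --where "source.objectType = 'procedure'"

# code_conversion_only project: scope the job with a selector file
scai ai-convert start -y --selector ./selector.yml
```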
#### scai ai-convert status

Check the status of an AI code conversion job.

Prerequisites:

- A job started with `scai ai-convert start`.
- Snowflake connection (uses the job's connection if `--connection` is not specified).
Options:

| Option | Description | Required | Default |
|---|---|---|---|
| | The job ID to check. If omitted, checks the last started job. | No | |
| `--connection` | Override the Snowflake connection. | No | |
| | Monitor progress until completion. For finished jobs, forces a server-side refresh and downloads detailed results. | No | |
| `--warehouse` | Warehouse override (if not already configured on the connection). | No | |
| `--schema` | Schema override (if not already configured on the connection). | No | |
| `--role` | Role override (if not already configured on the connection). | No | |
| `--database` | Database override (if not already configured on the connection). | No | |
#### scai ai-convert cancel

Cancel a running AI code conversion job.

Prerequisites:

- A running job started with `scai ai-convert start`.
- Snowflake connection (uses the job's connection if `--connection` is not specified).
Options:

| Option | Description | Required |
|---|---|---|
| | The job ID to cancel. If omitted, cancels the last started job. | No |
| `--connection` | Override the Snowflake connection. | No |
| `--warehouse` | Warehouse override (if not already configured on the connection). | No |
| `--schema` | Schema override (if not already configured on the connection). | No |
| `--role` | Role override (if not already configured on the connection). | No |
| `--database` | Database override (if not already configured on the connection). | No |
#### scai ai-convert list

List AI code conversion jobs for the current project.

Prerequisites:

- A migration project initialized with `scai init`.
- Snowflake connection for refreshing job status.
Options:

| Option | Description | Required | Default |
|---|---|---|---|
| | Maximum number of jobs to display. | No | |
| | Show all jobs (ignores the limit). | No | |
| `--connection` | Override the Snowflake connection for refreshing job status. | No | |
| `--warehouse` | Warehouse override (if not already configured on the connection). | No | |
| `--schema` | Schema override (if not already configured on the connection). | No | |
| `--role` | Role override (if not already configured on the connection). | No | |
| `--database` | Database override (if not already configured on the connection). | No | |

Output: Table with columns: Job ID, Status, Start Time, Duration, Objects. Possible status values: PENDING, IN_PROGRESS, FINISHED, FAILED, CANCELLED.
#### scai ai-convert accept

Review, compare, and accept AI-suggested fixes from a completed verification job.

Prerequisites:

- A completed AI code conversion job (run `scai ai-convert start` first).
- If using `--selector`: a selector file (code_conversion_only projects). Create one with `scai object-selector create`.
- If using `--where`: a full migration project. Run `scai code where` for syntax.
Options:

| Option | Description | Required | Default |
|---|---|---|---|
| | The job ID to accept changes for. If omitted, uses the last finished job. | No | |
| `-i` | Review each code unit one by one with options to accept, verify, or compare. | No | |
| `--selector` | Path to object selector file (YAML). Only for code_conversion_only projects. | No | |
| `--where` | SQL-like WHERE clause to filter which objects to accept (see WHERE Clause Reference). Full migration projects only. | No | |
| `--all` | Replace all converted files with their AI-fixed versions without prompting. | No | |
| `--summary` | Show a summary of what would be affected without making changes. | No | |
| | Output results in JSON format (for automation). | No | |
Review modes:

- Summary (`--summary`, default): preview affected code units without making changes.
- Interactive (`-i`): review each code unit with accept/verify/diff options.
- All (`--all`): accept all AI-suggested fixes without prompting.

Interactive actions:

- `[d]` Diff: open a diff tool to compare original and AI-fixed code
- `[v]` Verify: mark as verified (you applied changes manually)
- `[a]` Accept: overwrite the converted file with the AI fix
- `[s]` Skip: decide later
- `[q]` Quit: exit (progress is saved)
Examples:
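The three review modes above map directly onto the documented flags:

```shell
# Preview which code units would be affected
scai ai-convert accept --summary

# Review each fix one by one (diff / verify / accept / skip / quit)
scai ai-convert accept -i

# Accept every AI-suggested fix without prompting
scai ai-convert accept --all
```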
### scai data

Data operations: migrate and validate.

#### scai data migrate

Migrate data from the source system into a Snowflake account.

Prerequisites:

- Code converted with `scai code convert` (generates the TopLevelCodeUnits report).
- Code deployed with `scai code deploy` (creates target tables in Snowflake).
- Source database connection configured.
- Snowflake connection configured with INSERT privileges.
- If using `--selector`: a selector file (code_conversion_only projects). Create one with `scai object-selector create`.
- If using `--where`: a full migration project; filters tables from the Code Unit Registry.
- For Redshift: S3 bucket, Snowflake Storage Integration, and External Stage configured.
Options:

| Option | Description | Required |
|---|---|---|
| | Source connection to extract data from. Uses the default if not specified. | No |
| | Snowflake connection to migrate data to. Uses the default if not specified. | No |
| `--selector` | Selector file for migration. | No |
| `--where` | SQL-like WHERE clause to filter tables from the Code Unit Registry (see WHERE Clause Reference). Full migration projects only. | No |
| | (Redshift only) S3 bucket URI for staging data. | No |
| | (Redshift only) Fully qualified Snowflake stage name for loading parquet files. | No |
| | (Redshift only) IAM role ARN to unload parquet files to S3. | No |
| `--warehouse` | Warehouse override (if not already configured on the connection). | No |
| `--schema` | Schema override (if not already configured on the connection). | No |
| `--role` | Role override (if not already configured on the connection). | No |
| `--database` | Database override (if not already configured on the connection). | No |
Examples:
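Sketches using the documented flags; the selector path and filter value are placeholders.

```shell
# Migrate all tables using the default source and Snowflake connections
scai data migrate

# Full migration project: migrate only tables matched by a WHERE filter
scai data migrate --where "source.objectType = 'table'"

# code_conversion_only project: migrate tables listed in a selector file
scai data migrate --selector ./selector.yml
```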
#### scai data validate

Compare data between source and Snowflake to verify data integrity.

Prerequisites:

- Source database connection configured.
- Snowflake connection configured.
- Tables must exist in both source and target databases.
Options:

| Option | Description | Required |
|---|---|---|
| | Source connection to use. Uses the default if not specified. | No |
| | Snowflake connection to use as the source (for Snowflake-to-Snowflake validation). | No |
| | Snowflake target connection. Uses the default if not specified. | No |
| | Target Snowflake database for validation. | No |
| `--selector` | Selector file for validation. | No |
| `--where` | SQL-like WHERE clause to filter tables from the Code Unit Registry (see WHERE Clause Reference). Full migration projects only. | No |
| | Database name mapping. | No |
| | Schema name mapping. | No |
| | Path to an existing data validation config file (YAML). When provided, uses this config instead of generating one. | No |
Examples:
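Sketches using the documented flags; the filter value is illustrative.

```shell
# Validate all migrated tables between source and Snowflake
scai data validate

# Validate only tables matched by a WHERE filter (full migration projects)
scai data validate --where "source.objectType = 'table'"
```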
### scai test

Generate test cases for migrated stored procedures.

#### scai test seed

Generate YAML test case files from an execution log of stored procedure calls.

Prerequisites:

- A migration project initialized with `scai init`.
- An execution log file produced by running the original stored procedures.
- Code converted with `scai code convert`.
Options:

| Option | Description | Required | Default |
|---|---|---|---|
| | Path to the execution log file. | Yes | |
| | Source connection to use. Uses the default if not specified. | No | |
| | Snowflake connection. Uses project/default if not specified. | No | |
| | Maximum number of test cases to generate per procedure. | No | |
| | Append test cases to existing test files instead of replacing them. | No | |

Output: One YAML test case file per procedure at `artifacts/<target_db>/<target_schema>/<object_type>/.../<procedure_name>.yml`.
#### scai test capture

Capture test baselines from the source database.

Prerequisites:

- A migration project initialized with `scai init`.
- Test YAML files in `artifacts/**/test/*.yml` (generated by `scai test seed`).
- A configured source database connection.
Options:

| Option | Description | Required |
|---|---|---|
| | Source connection to use. Uses the default if not specified. | No |
| | Snowflake connection (for baseline stage upload). Uses project/default if not specified. | No |
| `--baseline-dir` | Directory to write baseline files to. Defaults to `.scai/baselines/`. | No |

Output: JSON baseline files written to `.scai/baselines/` (or the directory specified by `--baseline-dir`).
#### scai test validate

Validate Snowflake procedures against captured baselines.

Prerequisites:

- A migration project initialized with `scai init`.
- Baselines captured with `scai test capture`.
- A configured Snowflake connection.
Options:

| Option | Description | Required |
|---|---|---|
| | Snowflake connection. Uses project/default if not specified. | No |
| `--baseline-dir` | Directory containing baseline files. Falls back to the Snowflake stage if not specified. | No |
| | Snowflake stage containing baselines. Uses the implicit default stage if not specified. | No |
| | Regex pattern to filter test files by procedure name. Defaults to all procedures. | No |
| | Create the VALIDATION schema and objects before running. | No |
### scai object-selector

Create selector files for filtering objects.

#### scai object-selector create

Create a selector file to filter objects for data migration and other operations.

Prerequisites:

- Code converted with `scai code convert` (generates the TopLevelCodeUnits report).
Options:

| Option | Description | Required |
|---|---|---|
| | Filter objects by source database name. | No |
| | Filter objects by source schema name. | No |
| | Filter objects by type (comma-separated). | No |
| | Label for the selector file. | No |

Output: A YAML selector file.
Examples:
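A hypothetical invocation; the flag spellings below are assumptions (this reference does not reproduce them), and the filter values are placeholders.

```shell
# Select all tables in one schema and label the resulting selector file
scai object-selector create --schema sales --object-type table --label sales_tables
```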
### scai query

Execute SQL queries on source database systems.

Prerequisites:

- Source database connection configured via `scai connection add-sql-server` or `scai connection add-redshift`.
- Network access to the source database.
Options:

| Option | Description | Required |
|---|---|---|
| | SQL query to execute on the source system. | Yes |
| | Source connection to use for query execution. | Yes |
| | Source database type. | No |

Output: Query results printed as a formatted table (limited to 1000 rows).
Examples:
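A hypothetical invocation; the `-q` and `-c` spellings are assumptions, and the query and connection name are placeholders.

```shell
# Run a row count against a configured SQL Server connection
scai query -q "SELECT COUNT(*) FROM sales.orders" -c my_sqlserver_conn
```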
### scai logs

Display the location of CLI log files and list recent entries.

Options:

| Option | Description | Required | Default |
|---|---|---|---|
| | Number of recent log files to display. | No | |
| | Open the log directory in the system file explorer. | No | |
### scai license

Install offline licenses for air-gapped environments.

#### scai license install

Install an offline license for running conversions without online activation.

Prerequisites:

- A valid offline license file (`.lic`) from Snowflake.

Use cases:

- Running in air-gapped environments without internet access.
- CI/CD pipelines that cannot use online activation.
- Environments with restricted network access.

Options:

| Option | Description | Required |
|---|---|---|
| | Path to the license file to install. | Yes |
## Supported Languages

scai supports two project types depending on the source language.

### Full Migration

These languages support the complete migration workflow: code extraction from a live source database, conversion, AI improvement, deployment, data migration, and validation.

| Language |
|---|
| SqlServer |
| Redshift |

### Code Conversion Only

These languages support code conversion from files on disk. Source code is added manually via `scai code add` or `scai init -i`.

| Language |
|---|
| Oracle |
| Teradata |
| BigQuery |
| Databricks |
| Greenplum |
| Sybase |
| Postgresql |
| Netezza |
| Spark |
| Vertica |
| Hive |
| Db2 |
## Workflows

### Full Migration (SQL Server / Redshift)

Complete migration workflow for full project types with source database connectivity.

### Code Conversion Only

Workflow for projects without source database connectivity. Source code is added from local files.
## Snowflake Connection

SnowConvert AI uses the Snowflake CLI (`snow`) to manage Snowflake connections; this is separate from the `scai` CLI itself. Connections are configured with `snow connection add` and then referenced from `scai` commands.
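A sketch of the round trip, using only commands named in this reference; the connection name is a placeholder.

```shell
# Define a Snowflake connection with the Snowflake CLI
snow connection add

# Make it the default for snow/scai commands
snow connection set-default my_snowflake_conn

# Or reference it per project / per command
scai init ./MyProject -c my_snowflake_conn
```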
Connection precedence (highest to lowest):

1. `-c`/`--connection` option on the `scai` command
2. Project connection (set by `scai project set-default-connection` or `scai init -c`)
3. Default TOML connection (set by `snow connection set-default`)

For more details on configuring Snowflake connections, see the Snowflake CLI connection documentation.