DevOps with Snowflake¶
Snowflake provides tools and practices for managing your Snowflake environments as code, validating changes before they reach production, and automating deployments through CI/CD pipelines.
What is DevOps with Snowflake?¶
DevOps with Snowflake brings software engineering best practices to data infrastructure management. The core principles are:
Define as code. Declare the desired state of your Snowflake objects in version-controlled files. Snowflake determines and applies the necessary changes (create, alter, or drop) to reach that state.
Validate before you deploy. Preview proposed changes in a plan step before applying them to your account. Review creates, alters, and drops, then deploy when you’re confident the changes are correct.
Automate with CI/CD. Integrate Snowflake into your existing CI/CD pipelines so that deployments are triggered by pull requests, merges, or scheduled runs rather than manual steps.
The recommended approach is to use DCM Projects (Database Change Management Projects), which unify declarative object management, plan-then-deploy validation, multi-environment targeting, and CI/CD automation into a single workflow.
Define your Snowflake objects as code¶
DCM Projects (recommended)¶
DCM Projects (Database Change Management Projects) provide a declarative, infrastructure-as-code approach to managing your Snowflake environment. Instead of writing imperative scripts that specify each step, you define the desired target state of your objects. Snowflake compares those definitions against the current state and determines the necessary changes.
A DCM project consists of:
- A manifest file (manifest.yml) that specifies deployment targets, owner roles, and templating configurations for each environment.
- Definition files (SQL files under sources/definitions/) that contain DEFINE statements for your Snowflake objects, GRANT statements for access control, and ATTACH statements for data quality expectations.
The following example shows a definition file that creates infrastructure for multiple teams using Jinja2 templating:
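A minimal sketch of such a definition file follows. The team list, object names, and role names are illustrative, and the exact DEFINE syntax should be checked against the DCM Projects documentation; the `teams` variable is assumed to come from a configuration profile in manifest.yml:

```sql
-- Loop over a list of teams supplied by the active configuration profile
-- ("teams" is an assumed variable defined in manifest.yml).
{% for team in teams %}

-- Declare the desired state of each team's schema and table.
DEFINE SCHEMA analytics_db.{{ team }}_schema;

DEFINE TABLE analytics_db.{{ team }}_schema.events (
    event_id INTEGER,
    event_ts TIMESTAMP_NTZ
);

-- Access control lives alongside the object definitions.
GRANT USAGE ON SCHEMA analytics_db.{{ team }}_schema TO ROLE {{ team }}_role;

{% endfor %}
```

Because the file declares target state rather than migration steps, adding a team to the `teams` list is enough for the next deployment to create that team's objects.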
For complete documentation on DCM Projects, including how to set up your project files, manage multiple environments, and automate deployments, see Snowflake DCM Projects.
dbt Projects on Snowflake¶
dbt Projects on Snowflake let you deploy and run dbt Core projects as native Snowflake objects. You define SQL transformations in dbt models, deploy them as a versioned DBT PROJECT object, and execute them with Snowflake SQL or the Snowflake CLI. You can schedule runs with Snowflake tasks and integrate deployment into CI/CD pipelines.
For more information, see dbt Projects on Snowflake.
Alternative: CREATE OR ALTER with versioned scripts¶
For individual object changes outside of a DCM project, you can use the CREATE OR ALTER <object> command, which creates the object or alters it to match the definition specified by the command. By using this command from a versioned file in a remote repository, you can roll back changes to a previous version by executing a previous version of the file.
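For example, a versioned file might contain a statement like the following (table and column names are illustrative); rerunning the file converges the table to whatever definition the current version declares:

```sql
-- orders.sql, version-controlled in the remote repository.
-- Creates the table if absent, or alters it to match this definition.
CREATE OR ALTER TABLE analytics_db.public.orders (
    order_id INTEGER,
    customer_id INTEGER,
    status VARCHAR
);
```

Rolling back is then a matter of executing the previous version of the same file.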
Note
You can also use the Snowflake Python APIs and Snowflake CLI to manage Snowflake resources. If you prefer to do your data engineering work in Python, the Snowflake Python API lets you perform the same resource management in the language where you are most productive.
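As a sketch of the Python API approach (the database name and connection name are illustrative, and the exact `snowflake.core` call signatures should be verified against the Python API reference):

```python
from snowflake.core import CreateMode, Root
from snowflake.core.database import Database
from snowflake.snowpark import Session

# Connect using a named connection from the local Snowflake configuration.
session = Session.builder.config("connection_name", "dev").create()
root = Root(session)

# Idempotently ensure a database exists, analogous to CREATE ... IF NOT EXISTS.
root.databases.create(Database(name="analytics_db"), mode=CreateMode.if_not_exists)
```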
Validate and preview changes¶
Before deploying changes to your Snowflake account, you can preview the proposed modifications to verify they match your intent.
Plan with DCM Projects¶
DCM Projects use a plan-then-deploy model. The PLAN command compares your definition files against the current state of your account and produces a list of proposed changes without modifying anything.
You can run a plan using the Snowflake CLI:
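A sketch of the CLI invocation, assuming a configured connection named `dev` and a DCM project object named MY_PROJECT (command flags may differ across Snowflake CLI versions):

```shell
# Preview proposed changes without applying them.
snow dcm plan MY_PROJECT --connection dev
```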
Or using SQL:
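A sketch of the equivalent SQL, with an illustrative project name (check the DCM Projects reference for the exact EXECUTE DCM PROJECT syntax):

```sql
-- Compare definitions against the account and list proposed changes
-- without modifying anything.
EXECUTE DCM PROJECT my_db.public.my_project PLAN;
```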
Review the output to confirm the expected creates, alters, and drops before proceeding to deploy.
Automate deployment with CI/CD¶
You can integrate Snowflake into your CI/CD pipelines so that deployments are triggered automatically by events such as pull request merges, branch pushes, or scheduled runs.
The following table maps common CI/CD pipeline jobs to the corresponding Snowflake CLI commands:
| Pipeline job | CLI command | Description |
|---|---|---|
| Plan on pull request |  | Generates a plan that previews the changes that would be applied to the target environment. You can post the plan output as a PR comment for review. |
| Deploy on merge |  | Applies the planned changes to the target environment. Typically runs after a PR is merged to the main branch. |
| Refresh dynamic tables |  | Triggers a refresh of dynamic tables after deployment to ensure downstream data is up to date. |
| Test expectations |  | Runs expectation checks defined in your DCM project to verify that the deployment produced the expected results. |
GitHub Actions¶
You can use GitHub Actions to automate the jobs that constitute a CI/CD pipeline.
To authenticate securely, Snowflake recommends using workload identity federation (WIF) with OpenID Connect (OIDC) instead of static credentials like passwords or private keys. With WIF OIDC, GitHub Actions requests a short-lived token from GitHub’s OIDC provider, and Snowflake verifies the token directly. No long-lived secrets are stored in your repository.
To set up WIF OIDC, create a Snowflake service user that trusts GitHub’s OIDC provider:
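A sketch of the CREATE USER statement, with placeholder organization, repository, and branch names in the subject claim (verify the exact WORKLOAD_IDENTITY syntax against the workload identity federation documentation):

```sql
-- Service user that accepts short-lived tokens from GitHub's OIDC provider.
-- The SUBJECT claim pins the trust to a single repository and branch
-- (my-org, my-repo, and main are placeholders).
CREATE USER ci_deploy_user
  TYPE = SERVICE
  WORKLOAD_IDENTITY = (
    TYPE = OIDC
    ISSUER = 'https://token.actions.githubusercontent.com'
    SUBJECT = 'repo:my-org/my-repo:ref:refs/heads/main'
  );
```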
For more information about configuring the subject claim and WIF in general, see Workload identity federation.
The following example shows a workflow that uses WIF OIDC and DCM Projects to plan and deploy changes on push to the main branch:
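A sketch of such a workflow, assuming the service user from the previous step, a DCM project object named MY_PROJECT, and the `snowflake-cli` package on PyPI; the environment variable names and authenticator value should be checked against the Snowflake CLI documentation:

```yaml
name: Deploy to Snowflake
on:
  push:
    branches: [main]

permissions:
  id-token: write   # required so the job can request GitHub's OIDC token
  contents: read

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install Snowflake CLI
        run: pip install snowflake-cli
      - name: Plan and deploy
        env:
          SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
          SNOWFLAKE_USER: ci_deploy_user
          SNOWFLAKE_AUTHENTICATOR: WORKLOAD_IDENTITY
        run: |
          snow dcm plan MY_PROJECT
          snow dcm deploy MY_PROJECT
```

No password or private key appears anywhere in the workflow; Snowflake verifies the short-lived OIDC token directly.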
For more information about setting up CI/CD with the Snowflake CLI, including alternative authentication methods, see Integrating CI/CD with Snowflake CLI.
Manage environments¶
By maintaining separate environments for development, test, and production, your teams can isolate development activities from the production environment, which reduces the chance of unintended consequences and data corruption.
Connection profiles for environment targeting¶
With DCM Projects, you can define multiple deployment targets in your manifest.yml file. Each target maps to a specific Snowflake
account (or database), project object, owner role, and templating configuration. The same definition files can deploy to all environments
with environment-specific settings applied through configuration profiles.
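As an illustration of the idea, a manifest.yml might define per-environment configuration profiles along these lines (the keys and structure here are an assumption based on the description above; see the DCM Projects documentation for the exact schema):

```yaml
# Configuration profiles supply environment-specific values to the
# shared definition files (key names are illustrative).
configurations:
  dev:
    target_database: analytics_dev
  prod:
    target_database: analytics_prod
```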
For enterprise patterns such as multi-project setups and team collaboration, see Enterprise use cases for DCM Projects.
Advanced: Jinja parameterization for custom scripts¶
DCM Projects natively support Jinja2 templating in definition files. You can use template variables, loops, conditions, macros, and
dictionaries to make your definitions reusable across environments. Variable values come from configuration profiles in the
manifest.yml or from runtime overrides.
For details on DCM templating, see DCM Projects files and templates.
You can also parameterize standalone SQL scripts (outside of DCM Projects) using Jinja2 with EXECUTE IMMEDIATE FROM. The Snowflake CLI allows you to pass environment variables to Python scripts as well.
To change a deployment target, for example, you replace the name of the target database with a Jinja variable such as
{{ environment }} in SQL scripts, or an environment variable in Python scripts. This technique is shown in the following SQL
and Python code examples:
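For the SQL case, a sketch assuming a templated script staged at @deploy_stage/create_objects.sql (the stage name, file name, and variable binding are illustrative; check the EXECUTE IMMEDIATE FROM documentation for the exact templating syntax):

```sql
-- The staged script create_objects.sql references the Jinja variable:
--   CREATE OR ALTER TABLE {{ environment }}_db.public.orders (id INT);

-- Render and run the script, binding the variable at execution time:
EXECUTE IMMEDIATE FROM @deploy_stage/create_objects.sql
  USING (environment => 'prod');
```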
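For the Python case, a minimal sketch of reading the deployment target from an environment variable set by the CI/CD pipeline (the ENVIRONMENT variable name, the `{{ environment }}` placeholder convention, and the table name are all illustrative):

```python
import os

def render_target(sql_template: str, default_env: str = "dev") -> str:
    """Substitute the deployment target into a SQL template based on
    the ENVIRONMENT variable (an assumed variable name set by the pipeline)."""
    env = os.environ.get("ENVIRONMENT", default_env)
    return sql_template.replace("{{ environment }}", env)

# With ENVIRONMENT=prod in the pipeline, this targets prod_db instead.
statement = render_target(
    "CREATE OR ALTER TABLE {{ environment }}_db.public.orders (id INT)"
)
```

The rendered statement can then be executed through any Snowflake connector or the Snowflake CLI.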
Getting started¶
To get started with DCM Projects, see Snowflake DCM Projects for a complete overview of the feature, including how to set up your project files, configure environments, and deploy changes.
For sample projects, CI/CD templates, and quickstarts, see the snowflake-labs DCM repository.
To follow a step-by-step tutorial, try the Getting Started with Snowflake DCM Projects quickstart.