Deploy and manage DCM Projects

This topic describes how to create and deploy DCM Projects to manage Snowflake environments, including entire accounts.

Managing a DCM project involves the following steps:

  1. Prepare your Snowflake account for a DCM project.

  2. Define project configuration and objects in project files.

  3. Create a DCM Projects object.

  3. Plan the project to preview proposed changes before deployment.

  5. Deploy the project.

  6. Maintain the project by monitoring, updating, and repeating the process as needed.

You can deploy both incremental changes and large-scale account infrastructure changes. Snowflake recommends that you continuously deploy incremental changes and additions to your project, rather than making large-scale account infrastructure changes in a single deployment.

Prepare for a DCM project

To get started with DCM Projects, your Snowflake account must satisfy the following prerequisites:

  • A database and schema where you can create your DCM Projects object

  • A role with privileges to create a DCM Projects object and access to run queries on a warehouse

  • For Snowflake CLI, a role with privileges to create a temporary stage

This section describes the tasks that you need to complete to prepare for DCM Projects:

Note

The snowflake-labs DCM repository is continuously updated with resources to help you get started.

  • Quickstarts and demo projects: A repository you can clone into a Snowflake Workspace or local folder to try out DCM Projects commands and explore DCM Projects capabilities.

  • Sample GitHub actions and workflows: CI/CD pipeline templates using Snowflake CLI to test and deploy DCM projects.

Interface tools

You have the following interface options available for DCM Projects.

Interface tool

Best for

Snowsight

A workspace in Snowsight is a Snowflake native cloud IDE in your account.

  • Easily create or upload DCM definition files via the UI.

  • Connect to a Git repository to pull and push changes.

  • Review, edit, and debug definition files.

  • Execute DCM PLAN and DEPLOY commands using the workspace UI.

  • Browse the database catalog to see DCM project objects and their configuration, managed objects, and deployment history.

  • Select a target profile to automatically use the linked DCM project and templating configuration.

Local IDE with Snowflake CLI

The most familiar and personalized interface for software engineers.

  • Create and edit definition files locally.

  • Connect to a Git repository to pull and push changes.

  • Concise Snowflake CLI commands with directory context and optional flags.

  • Rich formatted output and an option to save as a .json file.

  • Option to leverage Cortex Code CLI for agentic or assisted development.

  • See Snowflake CLI for DCM Projects for information about installing and running Snowflake CLI in your local IDE.

Cortex Code

An agentic AI tool for Snowflake. See Cortex Code for DCM Projects for more information.

  • AI assisted or agentic authoring of local definition files.

  • AI assisted or agentic code validation and debugging by running static analysis and DCM PLAN commands.

SQL commands

  • Run SQL commands from the Snowflake CLI REPL, workspaces, notebooks, or worksheets.

  • Customize commands with additional arguments.

  • Same commands work across all Snowflake SQL interfaces.

Cortex Code for DCM Projects

Cortex Code is an agentic AI tool for Snowflake. With the DCM skill enabled, Cortex Code can autonomously create, migrate, debug, and deploy DCM Projects. It can also work alongside you step by step.

Note

Cortex Code with the DCM skill is currently available via the Cortex Code CLI only. It is not available in Snowsight Workspaces.

The Cortex Code DCM skill enables the following:

  • Scaffold a new DCM project from scratch, including the manifest file, the folder structure, and definition files.

  • Author and edit DEFINE statements, Jinja templates, and macros.

  • Run PLAN, DEPLOY, REFRESH, TEST, and PREVIEW commands.

  • Interpret plan output, diagnose failures, and suggest fixes.

  • Download and inspect deployment artifacts.

  • Navigate and explain an existing DCM project.

To get started with the Cortex Code DCM skill, follow these steps:

  1. Install Cortex Code CLI as described in Installing Cortex Code.

  2. Start Cortex Code in your terminal.

  3. Use the $dcm skill reference or use the term DCM in your natural language prompt to interact with your DCM Projects conversationally.

For example:

  • “Create a new DCM project for our analytics pipeline”

  • “Plan my project against the PROD target”

  • “Why did my last plan fail?”

  • “Add a new dynamic table definition for customer spending”

Snowflake CLI for DCM Projects

Snowflake CLI is a command-line tool that you can use to interact with your Snowflake account from your local IDE.

  1. DCM Projects require Snowflake CLI version 3.16 or higher. Install or upgrade Snowflake CLI as described in Installing Snowflake CLI.

  2. Configure your connection to your Snowflake account, as described in Configuring Snowflake CLI and connecting to Snowflake. Confirm you have a working connection:

    snow connection test
    
  3. Navigate to the local directory of your Git repository clone. For example:

    cd ./Quickstarts/DCM_Quickstart_1
    
  4. See the Snowflake CLI DCM commands available to you:

    snow dcm --help
    

Git integration

Connect to the Git repository where your DCM project definition files are stored.

  1. Create a new workspace from a Git repository.

  2. Create or select a Git branch for your planned changes.

    Snowflake clones files from that branch into your workspace editor.

  3. Navigate to the folder where you have your DCM project definition files or want to create them.

Create a DCM project

Required roles and privileges

The role of the user who creates a DCM project object must have the following roles and privileges:

  • The CREATE DCM PROJECT ON SCHEMA privilege:

    GRANT CREATE DCM PROJECT ON SCHEMA <schema_name> TO ROLE <role_name>;
    

Create a DCM project

Create a DCM project object by using one of the following options.

CREATE [OR REPLACE] DCM PROJECT [IF NOT EXISTS] <my_project>
[COMMENT = 'my comment'];

Access control and role privileges

You can control access to the schema-level DCM project object by granting roles the READ, MONITOR, or OWNERSHIP privilege.

These privileges are independent of the access control for definition files stored in a workspace, stage, or repository.

Privilege

Description

Allowed operations

READ

  • Shows if the DCM project object exists.

  • Lists the objects and grants deployed by the DCM project, which are visible to the user’s role.

    This means that you need both READ on the DCM project and privileges to see the objects themselves.

  • SHOW DCM PROJECTS LIKE '%project'

  • DESCRIBE DCM PROJECT <project>

  • SHOW ENTITIES IN DCM PROJECT <project>

MONITOR

  • Gives access to the complete deployment history, including all artifacts.

  • Gives the role the ability to analyze, debug, or audit production deployments without the ability to deploy changes directly.

  • All READ privileges

  • DESCRIBE DCM PROJECT <project> (with source and deployment path of latest deployment)

  • INFORMATION_SCHEMA.DCM_DEPLOYMENT_HISTORY (project_name => 'db.schema.project')

  • SHOW DEPLOYMENTS IN DCM PROJECT <project>

  • LIST all files in the deployment

  • GET any file inside the DCM project

OWNERSHIP

  • The role that is used to create the DCM project object is the owner of that project.

  • Gives the role the ability to deploy changes.

  • Gives the role the ability to transfer ownership of the project to another role when the project has not yet been deployed.

  • All MONITOR privileges

  • EXECUTE DCM PROJECT <project> PLAN

  • EXECUTE DCM PROJECT <project> DEPLOY

  • EXECUTE DCM PROJECT <project> PREVIEW

  • EXECUTE DCM PROJECT <project> REFRESH

  • EXECUTE DCM PROJECT <project> TEST

  • DROP DCM PROJECT <project>

  • ALTER DCM PROJECT <project>

  • GRANT READ on DCM PROJECT <project> TO ROLE <role2>

  • GRANT MONITOR on DCM PROJECT <project> TO ROLE <role2>

Note

Like other Snowflake commands, EXECUTE DCM PROJECT respects privileges from secondary roles when they are enabled for the user who runs the command. Run USE SECONDARY ROLES NONE; so that you are not leveraging privileges from roles other than the project owner role. This ensures that deployment behavior is consistent across different environments when the command is executed by different service users with the same primary role.

Ownership on DCM-managed objects

The role that deploys a DCM project has, by default, the OWNERSHIP privilege on all deployed objects.

The project definitions can include GRANT OWNERSHIP statements to other roles. Snowflake recommends that the DCM project owner role only grant ownership of DCM-managed objects to another lower-level role that it also holds. Then the project can continue to manage this object, as the project owner role “inherits” the privileges of the new object owner role.

If the DCM project owner role grants ownership of DCM-managed objects to another role that it does not hold itself, the project can no longer manage this deployed object because the project owner role no longer has ownership of it. Subsequent deployments will fail. The object definition needs to be removed from the project or ownership needs to be granted back to the project owner role.

If you want to migrate existing objects to be managed by a DCM project, the role that owns the DCM project object must also have the OWNERSHIP privilege (direct or inherited through other roles) on each object to be managed by the DCM project.

Note

If you migrate an object, we recommend adding the corresponding GRANT OWNERSHIP statement to the project definitions as well, to ensure that the current state and the DCM project definitions are in sync.

Define a DCM project

A DCM project is based on a manifest file and one or more SQL object definition files. These files are typically stored and managed in a Git repository or your local workspace.

  • The manifest file

    • Specifies one or more target environments with corresponding account identifiers, DCM project objects, owner roles for these objects, and optional templating configurations.

    • Optionally, specifies templating defaults and one or more configurations with values for template variables.

  • The object definition files

    • Define a group of Snowflake objects, grants, and expectations that you want to manage together in this DCM project.
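As a purely illustrative sketch (not the authoritative manifest schema), a manifest that defines two targets and two configuration profiles might look like the following. The key names are taken from fields referenced elsewhere in this topic (default_target, account_identifier, project_name, project_owner, templating_config), and all values are hypothetical:

```yaml
# Illustrative sketch only -- not the authoritative manifest schema.
# Key names follow fields referenced in this topic; all values are hypothetical.
default_target: DEV
targets:
  DEV:
    account_identifier: MYORG-DEV_ACCOUNT
    project_name: DCM_DEMO.PROJECTS.DCM_PROJECT_DEV
    project_owner: DCM_DEV_ROLE
    templating_config: DEV
  PROD:
    account_identifier: MYORG-PROD_ACCOUNT
    project_name: DCM_DEMO.PROJECTS.DCM_PROJECT_PROD
    project_owner: DCM_PROD_ROLE
    templating_config: PROD
configurations:
  DEV:
    wh_size: XSMALL
    suffix: DEV
  PROD:
    wh_size: MEDIUM
    suffix: PROD
```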

See Create a DCM project folder to store your definition files for how to set up a DCM project folder and the definition files and how to use templates to define your DCM project.

Plan a DCM project

Planning a DCM project performs a dry run to preview changes before deployment. Snowflake compares your project definition files to existing objects and shows which objects will be created, altered, or dropped. No changes are made to your account.

Use planning to review and validate changes before deploying a DCM project. You can specify options such as a configuration or an output path for plan results.

The PLAN command mimics the DEPLOY command as much as possible, except that it doesn’t actually execute any DDL statements.

Important

Always run the PLAN command on your projects before deployment to help ensure there are no syntax, templating, object dependency, or access privilege errors. Review the plan output to debug any errors, preview the rendered Jinja with the provided variables, and preview the changes that will be made once you deploy.

The plan performs the following steps:

  1. Renders all Jinja templating with the selected configuration profile or values provided at runtime.

  2. Compares all definitions against the current state of entities that were defined as part of the last deployment.

  3. Converts all defined statements into CREATE, ALTER, DROP, GRANT, and REVOKE statements.

  4. Sorts all statements based on their interdependencies.

  5. Compiles all statements.
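Step 4 is essentially a topological sort over the object dependency graph. As a rough illustration (not Snowflake's implementation), the idea can be sketched with Python's standard library; the object names and dependencies here are hypothetical:

```python
# Illustrative sketch of step 4 (dependency ordering), not Snowflake's
# implementation. Hypothetical dependencies: a view reads a table, and a
# dynamic table reads that view. Each object maps to the set of objects
# its definition depends on.
from graphlib import TopologicalSorter

dependencies = {
    "MY_DB.ANALYTICS.ORDERS": set(),
    "MY_DB.ANALYTICS.ORDER_VIEW": {"MY_DB.ANALYTICS.ORDERS"},
    "MY_DB.ANALYTICS.ORDER_SUMMARY": {"MY_DB.ANALYTICS.ORDER_VIEW"},
}

# static_order yields each object only after all of its dependencies.
creation_order = list(TopologicalSorter(dependencies).static_order())
print(creation_order[0])  # MY_DB.ANALYTICS.ORDERS
```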

Note

Although PLAN catches almost all possible errors that can occur during deployment, it does not guarantee a successful deployment.

Run the PLAN command

The PLAN command takes the following information as input:

  • The path to the manifest file

    The CLI reads the target from the manifest (default_target or --target flag). For SQL commands, the path to the manifest file and the project name must be provided.

  • Defined values for Jinja variables (optional).

  • The target’s templating_config automatically selects the configuration profile. For SQL commands, use the USING CONFIGURATION clause to specify the profile.

  • One or more values of the configuration profile to overwrite (optional).

The following are examples of how to run the PLAN command.

Run the snow dcm plan command in your local IDE terminal or as part of a Git workflow.

An example of a CLI command to plan a DCM project from a local directory is:

cd ./Quickstarts/DCM_Project_Quickstart_1/
snow dcm plan

An example of a CLI command to plan a DCM project from a Snowflake stage or Git repository clone is:

snow dcm plan --target PROD_US --save-output

An example of a CLI command to plan a DCM project with optional arguments is:

snow dcm plan
--variable "wh_size='MEDIUM'" --variable "teams = ['TEAM_A', 'TEAM_B']"
--save-output

Variables must be enclosed in double quotes, with additional single quotes around string values. Lists of values require square brackets.
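These quoting rules can be surprising because the shell consumes the outer double quotes before the CLI sees the arguments. As a sketch, the standard-library shlex module shows what the snow command actually receives:

```python
# Illustrates how the shell tokenizes the quoted --variable arguments above.
# shlex.split follows POSIX shell quoting rules, like the shell that
# invokes the Snowflake CLI.
import shlex

cmd = """snow dcm plan --variable "wh_size='MEDIUM'" --variable "teams = ['TEAM_A', 'TEAM_B']" --save-output"""
args = shlex.split(cmd)

# The shell consumes the outer double quotes; the inner single quotes
# survive and mark string values for the CLI.
print(args[4])  # wh_size='MEDIUM'
print(args[6])  # teams = ['TEAM_A', 'TEAM_B']
```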


Definition file path

You have the following options to reference the location of the manifest and definition files.

  • From a Workspace path

    The Snowsight user interface automatically lists all DCM project definitions inside the current workspace. You can select one of these paths and workspaces will use it to run DCM commands.

    If you want to manually run SQL commands in workspaces you can also refer to that same path inside any of your workspaces.

    Tip: The three-dot menu next to each file in your workspace lets you copy the full path to that file into your SQL code.

    An example of a SQL command to plan a DCM project from a workspace path is:

    EXECUTE DCM PROJECT DCM_PROJECT_DEV
      PLAN
      USING CONFIGURATION DEV
    FROM
      'snow://workspace/USER$.PUBLIC.DEFAULT$/versions/live/Quickstarts/DCM_Project_Quickstart_1'
    
  • From a local Git repository clone on your disk

    Select the directory that contains your manifest.yml file before running the CLI command in your local IDE. Alternatively you can specify a different local directory that contains the manifest and definitions you want to use.

    An example of a CLI command to plan a DCM project from the current directory of a local Git repo:

    cd ./Quickstarts/DCM_Project_Quickstart_1/
    
    snow dcm plan
    
    snow dcm plan --target PROD
    

    An example of a CLI command to plan a DCM project from a different directory in a local Git repo clone:

    snow dcm plan DCM_PROJECT_DEV --configuration DEV --from ./Quickstarts/DCM_Project_Quickstart_2/
    
  • From your remote repository in a workflow

    The same CLI syntax can be used when the DCM commands are executed in a CI/CD workflow.

    An example of a CI/CD workflow to plan a DCM project from a local Git repo clone:

    steps:
      - name: Clone Repo
        uses: actions/checkout@v4
      - name: Setup SnowCLI
        uses: snowflakedb/snowflake-cli-action@v2.0
      - name: Run PLAN
        run: |
          cd ./Quickstarts/DCM_Project_Quickstart_1/
          snow dcm plan --target PROD --save-output
    
  • From a Stage or Git repository clone in Snowflake

    If you want to run a procedure or task inside Snowflake that executes DCM commands, the SQL command can reference an absolute path to a Snowflake stage or Git repository clone inside the account.

    For Git repository clones, consider first running ALTER GIT REPOSITORY <name> FETCH so that you have the latest version.

    '@...' paths can only be used when executing DCM SQL commands.

    An example of a SQL command to plan a DCM project from a Stage or Git repository clone in Snowflake is:

    EXECUTE DCM PROJECT DCM_PROJECT_DEV
      PLAN
      USING CONFIGURATION DEV
    FROM
      '@DCM_DEMO.DEPLOY.DCM_DEMO/branches/main/Quickstarts/DCM_Project_Quickstart_1/'
    

Plan output

Note

While this feature is in preview, the exact output format is subject to change.

The standard plan output contains the following information about the plan execution in JSON format:

{
  "version": 2,
  "metadata": {
    "timestamp": <timestamp>,
    "query_id": <query_id>,
    "project_name": <project_name>,
    "user": <user>,
    "role_name": <role_name>,
    "command": <command>
  },
  "changeset": [
    {
      "type": <type>,
      "object_id": {
        "domain": <domain>,
        "name": <name>,
        "fqn": <fqn>,
        "database": <database>,
        "schema": <schema>
      },
      "changes": [
        {
          "kind": <kind>,
          "attribute_name": <attribute_name>,
          "value": <value>
        }
      ]
    }
  ]
}

Property

Description

version

Schema version of the output format. Version 2 is the latest and only supported version.

metadata

Contextual information about the execution.

metadata.timestamp

ISO 8601 timestamp of when the command was executed.

metadata.query_id

Unique identifier for the query that produced this plan.

metadata.project_name

Fully qualified name of the DCM Project object.

metadata.user

Name of the user who executed the command.

metadata.role_name

Active role used to execute the command.

metadata.command

The command that was executed: PLAN or DEPLOY.

changeset

An array of change entries. Each entry represents one object that would be or was created, altered, or dropped. An empty array indicates the project definitions are already in sync with the account.

changeset[].type

The planned action for the object. Possible values: CREATE, ALTER, DROP.

changeset[].object_id

Identifies the target object.

changeset[].object_id.domain

The Snowflake object type.

changeset[].object_id.name

Name of the object.

changeset[].object_id.fqn

Fully qualified name of the object.

changeset[].object_id.database

Database containing the object. Omitted for account-level objects.

changeset[].object_id.schema

Schema containing the object. Omitted for database-level and account-level objects.

changeset[].changes

An array of change descriptors detailing the specific attribute modifications.

changeset[].changes[].kind

The type of change. Possible values: set, changed, unset, nested, collection. The value of kind determines the remaining keys in the object.

changeset[].changes[].attribute_name

Name of the attribute being set or changed. Present when kind is set, changed, or unset.

changeset[].changes[].value

The new value for the attribute. Present when kind is set or changed.

changeset[].changes[].prev_value

The previous value of the attribute before the change. Present only when kind is changed.

changeset[].changes[].collection_name

Name of the collection being modified (for example, columns, constraints, privileges, expectations). Present only when kind is collection.

changeset[].changes[].id_label

Label used to identify items within the collection (for example, name). Present only on certain collections.

changeset[].changes[].changes

A nested array of collection item descriptors. Present only when kind is collection.

changeset[].changes[].changes[].kind

The type of change to the collection item. Possible values: added, removed, modified.

changeset[].changes[].changes[].item_id

Identifies the item within the collection. Can be a string or an object, depending on the collection type.

changeset[].changes[].changes[].changes

An array of further change descriptors for this item. Present for added and modified items. Always absent for removed items.

An example of a plan output:

{
  "version": 2,
  "metadata": {
    "timestamp": <timestamp>,
    "query_id": <query_id>,
    "project_name": <project_name>,
    "user": <user>,
    "role_name": <role_name>,
    "command": <command>
  },
  "changeset": [
    {
      "type": "CREATE",
      "object_id": {
        "domain": "TABLE",
        "name": "CUSTOMER_SUMMARY",
        "fqn": "MY_DB.ANALYTICS.CUSTOMER_SUMMARY",
        "database": "MY_DB",
        "schema": "ANALYTICS"
      },
      "changes": [
        {
          "kind": "set",
          "attribute_name": "warehouse_size",
          "value": "XSMALL"
        },
        {
          "kind": "set",
          "attribute_name": "query",
          "value": "SELECT customer_id, SUM(amount) AS total FROM orders GROUP BY customer_id"
        }
      ]
    },
    {
      "type": "ALTER",
      "object_id": {
        "domain": "DYNAMIC_TABLE",
        "name": "ORDER_DETAILS",
        "fqn": "MY_DB.ANALYTICS.ORDER_DETAILS",
        "database": "MY_DB",
        "schema": "ANALYTICS"
      },
      "changes": [
        {
          "kind": "changed",
          "attribute_name": "warehouse_size",
          "value": "SMALL",
          "prev_value": "XSMALL"
        },
        {
          "kind": "collection",
          "collection_name": "columns",
          "id_label": "name",
          "changes": [
            {
              "kind": "added",
              "item_id": "DISCOUNT_AMOUNT",
              "changes": [
                {
                  "kind": "set",
                  "attribute_name": "data_type",
                  "value": "NUMBER(10,2)"
                }
              ]
            },
            {
              "kind": "modified",
              "item_id": "ORDER_STATUS",
              "changes": [
                {
                  "kind": "changed",
                  "attribute_name": "data_type",
                  "value": "VARCHAR(50)",
                  "prev_value": "VARCHAR(20)"
                }
              ]
            },
            {
              "kind": "removed",
              "item_id": "LEGACY_FLAG"
            }
          ]
        }
      ]
    },
    {
      "type": "DROP",
      "object_id": {
        "domain": "VIEW",
        "name": "OLD_REPORT_VIEW",
        "fqn": "MY_DB.ANALYTICS.OLD_REPORT_VIEW",
        "database": "MY_DB",
        "schema": "ANALYTICS"
      },
      "changes": []
    }
  ]
}
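Because the plan output is structured JSON, it is straightforward to post-process. The following Python sketch counts planned operations and lists the objects that would be dropped; it uses a trimmed-down version of the changeset above, while in practice you would load a plan saved with --save-output:

```python
# Summarize a plan changeset (version 2 schema). In practice, load a saved
# plan instead: plan = json.load(open("plan_result.json")).
from collections import Counter

plan = {
    "version": 2,
    "changeset": [
        {"type": "CREATE",
         "object_id": {"domain": "TABLE", "fqn": "MY_DB.ANALYTICS.CUSTOMER_SUMMARY"}},
        {"type": "ALTER",
         "object_id": {"domain": "DYNAMIC_TABLE", "fqn": "MY_DB.ANALYTICS.ORDER_DETAILS"}},
        {"type": "DROP",
         "object_id": {"domain": "VIEW", "fqn": "MY_DB.ANALYTICS.OLD_REPORT_VIEW"}},
    ],
}

counts = Counter(change["type"] for change in plan["changeset"])
drops = [c["object_id"]["fqn"] for c in plan["changeset"] if c["type"] == "DROP"]

print(dict(counts))  # {'CREATE': 1, 'ALTER': 1, 'DROP': 1}
print(drops)         # ['MY_DB.ANALYTICS.OLD_REPORT_VIEW']
```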

Deploy a DCM project

When you deploy a DCM project, the following actions are performed:

  • Objects that are defined but don’t exist yet are created.

  • Objects that already exist but differ from the current definition are altered.

  • Objects that already exist as defined are skipped.

  • Objects that already exist but are no longer defined are dropped.

The same behavior applies to grants and attached data quality expectations defined in the project.
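Conceptually, the choice between create, alter, skip, and drop follows from comparing the defined state with the deployed state. The following Python sketch is a simplification with hypothetical object names, not Snowflake's implementation:

```python
# Conceptual sketch of the deploy decision, not Snowflake's implementation.
# Hypothetical maps of object name -> definition, for the project files
# ("defined") and for the last deployment on the account ("deployed").
defined = {"T1": "v2", "T2": "v1", "V1": "v1"}
deployed = {"T2": "v1", "V1": "v0", "OLD": "v1"}

created = [n for n in defined if n not in deployed]                            # ['T1']
altered = [n for n in defined if n in deployed and defined[n] != deployed[n]]  # ['V1']
skipped = [n for n in defined if deployed.get(n) == defined[n]]                # ['T2']
dropped = [n for n in deployed if n not in defined]                            # ['OLD']
```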

Important

To avoid any unintended data loss, always run and review your PLAN output before running DEPLOY.

Each DCM project can have only one deployed instance at any time; multiple configuration profiles can’t coexist. Deploying configuration B with the same DCM project drops any objects from previous configurations that are not defined in B.

Create one DCM project for each target environment. The DCM project for each environment can then point to the same definition files, but deploy independently with different values for each variable, like suffix => 'DEV_JS', so that they can exist independently side-by-side on the same Snowflake account.

You can overwrite values for selected variables at runtime if you want to use a pre-defined profile with a slight variation.

For example:

EXECUTE DCM PROJECT DCM_DEMO.PROJECTS.DCM_PROJECT_DEV
  DEPLOY
  USING CONFIGURATION DEV (suffix=>'DEV_USER', user=>'JANEDOE')
FROM
  'snow://workspace/USER$.PUBLIC.DEFAULT$/versions/live/DCM_Project_Quickstart_1';
The equivalent Snowflake CLI command is:

snow dcm deploy DCM_PROJECT_DEV --configuration DEV --variable "suffix='DEV_USER'" --variable "user='JANEDOE'"

Each deployment attempt (successful, failed, or canceled) has a deployment number, for example DEPLOYMENT$1. Optionally you can specify a unique string as a deployment alias in order to name individual deployments for better observability in the deployment history. Think of the deployment alias like a commit message for your code change.

Each DEPLOY command first runs an internal PRE-PLAN as part of the deployment. If the PRE-PLAN succeeds, the DEPLOY is executed directly afterwards. There is no option to intercept or review this internal plan step; it is executed to further reduce the risk of failure during the deployment. If a DEPLOY fails, the error message indicates whether it failed during the PRE-PLAN or DEPLOY step. Failure during the PRE-PLAN step is similar to a PLAN failure: no DDL changes are executed.

Important

Failure during the DEPLOY step can result in partial execution of the defined changes. This can potentially cause some of the managed objects to be in an undefined state. In most cases fixing the root cause and executing DEPLOY again restores the defined target state.

The target path for the DEPLOY output file can’t be customized. Deployment artifacts are always stored inside the DCM project.

Run the DEPLOY command

To execute the DEPLOY command, provide the following inputs:

  • The path to the manifest file.

  • The name of a configuration profile, if configuration profiles are defined in the manifest.

  • Optionally, values for the configuration profile overriding the default values.

  • Optionally, a deployment alias.

The following are examples of how to run the DEPLOY command.

An example of a SQL command to deploy a DCM project is:

EXECUTE DCM PROJECT DCM_DEMO.PROJECTS.DCM_PROJECT_DEV
  DEPLOY
FROM
  'snow://workspace/USER$.PUBLIC.DEFAULT$/versions/live/Quickstarts/DCM_Project_Quickstart_1';

An example of a SQL command to deploy a DCM project when using Jinja with configuration profiles but overwriting wh_size and teams is:

EXECUTE DCM PROJECT DCM_DEMO.PROJECTS.DCM_PROJECT_DEV
  DEPLOY AS "testing 2 teams"
  USING CONFIGURATION DEV (wh_size => 'MEDIUM', teams => ['TEAM_A', 'TEAM_B'])
FROM
  'snow://workspace/USER$.PUBLIC.DEFAULT$/versions/live/Quickstarts/DCM_Project_Quickstart_1';

See Plan output for the standard plan output structure.

Manage a DCM project

Show all objects managed by a DCM project

The SHOW ENTITIES IN DCM PROJECT command allows you to see a list of all Snowflake objects that are currently managed by a specific DCM project. It provides a list of fully qualified names for all objects. To see the results, you need both READ privilege on the DCM project and privileges to see the managed object itself.

Note

The result does not necessarily match the objects of the most recent deployment. Objects that were manually dropped or detached from the project are not listed in the result.

You can use LIKE to search by name or use a flow operator to further process or filter the result set.

Similarly you can SHOW GRANTS and SHOW FUTURE GRANTS that are defined and deployed with this project.

Examples to see all objects that are currently managed by a DCM project:

SHOW ENTITIES LIKE '%DASHBOARD%' IN DCM PROJECT DCM_DEMO.PROJECTS.DCM_PROJECT_DEV;

SHOW ENTITIES IN DCM PROJECT DCM_DEMO.PROJECTS.DCM_PROJECT_DEV
  ->> SELECT * FROM $1 WHERE "object_type" = 'DYNAMIC_TABLE';

SHOW [FUTURE] GRANTS IN DCM PROJECT DCM_DEMO.PROJECTS.DCM_PROJECT_DEV;

Detach objects from a DCM project

Using the ALTER <object> command with the UNSET DCM PROJECT clause, you can detach an object that was deployed and is now managed by a DCM project. The command removes the association between the object and the DCM project without dropping the object. You can use this command when you want to start managing an object by a different DCM project.

Make sure to remove the corresponding DEFINE statement from your project definition files before you deploy it again. Otherwise, the object will be reintegrated into the DCM project.

An example of a SQL command to detach an object from a DCM project:

ALTER TABLE MY_DB.MY_SCH.MY_TABLE
  UNSET DCM PROJECT;

You cannot detach deployed grants or expectations from a DCM project.

Drop a DCM project

When a DCM project object is dropped, all managed entities, grants, and expectations remain in place as “unmanaged”.

Important

Dropping or replacing a DCM project object causes you to lose all deployment history artifacts that the object contains.

DROP DCM PROJECT [IF EXISTS] <my_project>;

Automate a DCM project deployment

CI/CD best practices

Follow these practices when automating deployments with CI/CD pipelines:

  • A DCM project targeting a non-production environment should be owned by a different role than its production counterpart to avoid accidental deployments to production.

  • A DCM project targeting a production environment should be owned by a dedicated role for production deployments with specifically tailored access privileges that are just enough to deploy all objects in the project.

    • Avoid using general administrator roles for DCM project ownership.

    • Grant the dedicated production deployment role only to service users, not to individual developers.

    • Restrict the ownership to the production deployment role to ensure immutability of critical infrastructure or data products.

      If the dedicated production deployment role grants ownership of production objects to other roles, users who are granted those roles can still modify or drop the production objects.

GitHub Actions examples

This section provides sample GitHub Actions workflows that illustrate typical CI/CD patterns for DCM Projects. The same concepts apply to other Git-based platforms such as Azure DevOps, GitLab CI/CD, or Bitbucket Pipelines — only the workflow syntax differs.

Each sample provides building blocks that you can customize and combine based on your pipeline setup, environment topology, and organizational requirements.

The sample workflows demonstrate the following patterns applicable to any DCM CI/CD setup:

  • Manifest-driven configuration: Each workflow reads account_identifier, project_owner, and project_name from the manifest targets. This keeps environment configuration in one place and avoids duplicating it across GitHub secrets.

  • Data drop protection: The deploy workflow parses plan_result.json for destructive DROP operations on data-bearing objects, such as databases, schemas, tables, and stages, and blocks the deployment if any are found.

  • Sequential stage-to-production promotion: Production deployment starts only after staging deployment succeeds, dynamic tables are refreshed, and data quality tests pass.

  • Structured plan output parsing: Workflows use jq to extract operation counts and object domains from plan_result.json, making it easy to build custom summaries and checks.

  • AI-powered summaries: snow cortex complete generates natural-language summaries of post-hook script results and Dynamic Table refresh output for the GitHub Actions job summary.
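The data drop protection pattern can be sketched in Python as follows. This is an illustrative check against the version 2 plan schema, not the exact script from the sample repository:

```python
# Sketch of the data drop protection check described above: scan a saved
# plan (e.g. plan_result.json) and block deployment if any data-bearing
# object would be dropped. Illustrative only.
import json

DATA_BEARING = {"DATABASE", "SCHEMA", "TABLE", "STAGE"}

def destructive_drops(plan):
    """Return fully qualified names of data-bearing objects the plan drops."""
    return [
        change["object_id"]["fqn"]
        for change in plan.get("changeset", [])
        if change["type"] == "DROP" and change["object_id"]["domain"] in DATA_BEARING
    ]

# Hypothetical plan that would drop a table:
plan = json.loads('{"changeset": [{"type": "DROP", '
                  '"object_id": {"domain": "TABLE", "fqn": "MY_DB.PUBLIC.ORDERS"}}]}')
blocked = destructive_drops(plan)
if blocked:
    print(f"Blocking deployment; plan drops data-bearing objects: {blocked}")
    # In CI, exit non-zero here to fail the job.
```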

Before running these sample workflows, complete the following prerequisites:

  • Store the DCM project files in a Git repository.

  • Grant a user privileges to create and run GitHub Actions.

  • Configure GitHub secrets for the Snowflake service user credentials (SNOWFLAKE_USER, SNOWFLAKE_PASSWORD or a personal access token).

  • Configure GitHub variables for the path to the DCM project folder (DCM_PROJECT_PATH).

  • Configure GitHub environments for each manifest target (for example, DCM_STAGE, DCM_PROD_US).

For setting up a Snowflake connection in GitHub Actions, see the first half of the blog post, A Practical Guide to GitHub Actions CI/CD.

See the snowflake-labs DCM repository for a set of GitHub Actions workflows that cover the full CI/CD lifecycle for DCM Projects.

All sample workflows read the Snowflake account_identifier and project_owner role directly from the manifest targets using yq, so that environment-specific configuration lives in the version-controlled manifest.yml rather than in duplicated GitHub secrets. Only the service user credentials are stored as secrets.

Sample workflow: Validate connections and privileges

This workflow validates that the GitHub Actions service user can connect to every target environment defined in the manifest. Use it when setting up a new repository, onboarding a new account, or debugging authentication issues. The workflow performs the following steps:

  • Parses all target names from manifest.yml dynamically.

  • Uses a GitHub Actions matrix strategy to test each target in parallel.

  • For each target, verifies the Snowflake connection, reports the connected account, user, and role, and checks whether the connected role matches the DCM project owner.

  • Reports whether the DCM project object already exists and whether the service user has deployment privileges.
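The connection-verification step can be expressed with standard Snowflake context functions. The queries below are an illustrative sketch; the 'DCM_ADMIN' role name is hypothetical.

```sql
-- Report the connected account, user, and role for the job log:
SELECT CURRENT_ACCOUNT(), CURRENT_USER(), CURRENT_ROLE();

-- Check whether the connected role matches the project owner read from
-- the manifest ('DCM_ADMIN' is a hypothetical example value):
SELECT CURRENT_ROLE() = 'DCM_ADMIN' AS role_matches_project_owner;
```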

Sample workflow: Preview changes on pull request

  • Workflow configuration file: DCM_1_Test_PR_to_main.yml

  • Trigger: Pull request opened, synchronized, or reopened against the main branch

This workflow runs a PLAN against both the staging and production targets as an integration test for every pull request. It provides reviewers with a summary of the planned changes directly on the pull request. The workflow performs the following steps:

  • Runs snow dcm plan against the STAGE and PROD targets in parallel.

  • Parses plan_result.json to summarize CREATE, ALTER, and DROP operations grouped by object domain.

  • Uploads plan artifacts for later inspection.

  • Posts a consolidated comment to the pull request with the plan summary for both environments.

  • Fails the check if either PLAN fails, blocking the merge.
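The per-domain summary step can be sketched with jq's group_by. As before, the plan_result.json shape used here is an assumption, not the real schema.

```shell
# Create a simplified sample plan_result.json (assumed shape, not the real schema).
cat > plan_result.json <<'EOF'
{
  "operations": [
    {"operation": "CREATE", "domain": "TABLE",  "name": "ORDERS"},
    {"operation": "CREATE", "domain": "VIEW",   "name": "ORDERS_V"},
    {"operation": "ALTER",  "domain": "SCHEMA", "name": "PUBLIC"}
  ]
}
EOF

# Summarize operations grouped by type, for example to post as a PR comment.
jq -r '.operations
       | group_by(.operation)
       | map("\(.[0].operation): \(length)")
       | .[]' plan_result.json > plan_summary.txt

cat plan_summary.txt
```

The same pattern with group_by(.domain) yields the per-object-domain breakdown.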

Sample workflow: Deploy changes to Stage and Prod

This workflow implements a sequential promotion pipeline. Changes are first deployed to staging, validated end-to-end, and only then promoted to production. If any stage fails, the pipeline stops and production is not affected.

Stage deployment sequence:

  1. Plan: Runs snow dcm plan and summarizes the change set.

  2. Data drop detection: Parses the plan output and blocks the pipeline if it contains DROP operations for databases, schemas, tables, or stages.

  3. Deploy: Runs snow dcm deploy.

  4. Post scripts: Runs optional SQL post-hook scripts with Jinja variable injection using snow sql.

  5. Refresh Dynamic Tables: Runs snow dcm refresh to apply any new transformation logic.

  6. Test expectations: Runs snow dcm test to validate data quality expectations.
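As a sketch, the stage sequence might map onto workflow steps like the following. The snow dcm subcommand names come from the steps above, but required arguments are omitted, and the post-hook path is hypothetical.

```yaml
# Hypothetical step list mirroring the six-step stage sequence
# (data drop detection between plan and deploy omitted for brevity).
steps:
  - name: Plan
    run: snow dcm plan                     # target/project arguments omitted
  - name: Deploy
    run: snow dcm deploy
  - name: Post scripts
    run: snow sql -f hooks/post_hook.sql   # path is hypothetical
  - name: Refresh Dynamic Tables
    run: snow dcm refresh
  - name: Test expectations
    run: snow dcm test
```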

Production deployment sequence:

The same six steps are repeated for the production target, but only after all staging jobs pass.

After all jobs complete, the workflow posts a final status summary to the originating pull request.

Note

The deploy workflow uses snow cortex complete to generate human-readable summaries for post-hook script results and Dynamic Table refresh output in the GitHub Actions job summaries.

Sample workflow: Test expectations on Stage

This workflow provides an on-demand way to validate data quality on the staging environment without triggering a full deployment. Use it to verify expectations after manual data changes or upstream data refreshes, or as a periodic quality check. The workflow performs the following steps:

  • Refreshes all Dynamic Tables managed by the staging DCM project.

  • Runs all data quality expectation tests attached to tables, Dynamic Tables, and views in the project.

  • Reports pass or fail status with details about violated expectations.

Frequently asked questions (FAQ)

How do I rename an existing object?
  1. Rename the object by running an ALTER ... RENAME TO command outside of the DCM project.

  2. Update the object's definition in your project files to use the new name.

  3. Run PLAN to verify that the new definition matches the renamed object (PLAN reports no changes).

  4. Run DEPLOY to save the new state.
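For example, step 1 for a table rename might look like the following (the object names are hypothetical):

```sql
-- Run outside of the DCM project:
ALTER TABLE analytics.public.orders_v1 RENAME TO analytics.public.orders;
```

Then change every reference to orders_v1 in the project files to orders before running PLAN.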

How do I deploy objects that are not yet supported by DEFINE statements?

You can run CREATE IF NOT EXISTS or CREATE OR REPLACE statements in a separate SQL script after executing your DCM project plan or deployment.

Both approaches support Jinja2 templating and a dry-run mode (dry-run renders the Jinja templating but does not verify that the SQL compiles successfully).

For example:

EXECUTE DCM PROJECT my_project
  PLAN ...
  USING ...
  FROM ...;

EXECUTE IMMEDIATE
  FROM 'snow://workspace/USER$.PUBLIC.DEFAULT$/versions/head/DCM_Project_Quickstart_1/hooks/post_hook.sql'
  USING (db => 'DEV')
  DRY_RUN = TRUE      -- renders the Jinja templating but does not verify successful compilation
;