Creating, running, and managing dbt projects on Snowflake¶
dbt Core is an open-source data transformation tool and framework that you can use to define, test, and deploy SQL transformations. dbt on Snowflake allows you to use familiar Snowflake features to create, edit, test, run, and manage your dbt Core projects. Snowflake integrates with Git repositories and offers Snowflake CLI commands to support continuous integration and continuous delivery (CI/CD) workflows for data pipelines.
Key concepts¶
Snowflake offers the following features for you to interact with dbt projects on Snowflake. Combine the features that you need to suit your workflow and business needs.
Workspaces - A workspace is a container for your dbt projects in Snowsight where you can create, edit, test, and deploy your dbt projects to Snowflake. Workspaces can be integrated with Git repositories and offer a web-based IDE for you to work with your dbt projects. For more information, see Workspaces.
SQL - Use familiar SQL commands to work with dbt project objects in Snowflake. You can perform traditional operations like creating, updating, and scheduling a dbt project object. You can use the EXECUTE DBT PROJECT command to execute dbt operations such as run and test with dbt flags and parameters.
Snowflake CLI - Use Snowflake CLI commands to integrate dbt on Snowflake with your data CI/CD workflows. For more information about the Snowflake CLI, see Snowflake CLI.
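For example, a minimal EXECUTE DBT PROJECT call on a hypothetical deployed project object might look like the following (the object name and target are placeholders; substitute your own):

```sql
-- Run the project's models against the dev target defined in its profiles.yml.
-- Object name and target are placeholders for illustration.
EXECUTE DBT PROJECT my_database.my_schema.my_dbt_project_object
  args='run --target dev';
```

The full syntax and supported flags are described in the EXECUTE DBT PROJECT section later in this topic.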
Before you start with dbt on Snowflake, you might find a summary of the terms we use in this guide helpful:
dbt project - A working directory in dbt Core.
dbt project object - A deployed dbt project in Snowflake, which can be managed and scheduled as an object entity on Snowflake.
dbt monorepo - A folder structure in dbt Core that contains multiple dbt projects and is typically associated with a Git repository and a workspace for dbt on Snowflake.
workspace for dbt on Snowflake - The editing environment in Snowflake for working with a dbt monorepo and any files you associate with that monorepo.
Requirements, considerations, and limitations¶
Before you start using dbt on Snowflake, review the following requirements, considerations, and limitations:
dbt Core projects only - dbt Cloud projects are not supported.
dbt versions - dbt on Snowflake runs dbt-core version 1.8.9 and dbt-snowflake version 1.8.4. When you migrate an existing dbt project to Snowflake, your dbt versions don’t have to align with Snowflake versions.
Integrating with a dbt Git repository requires that you create and set up an API integration on Snowflake. For more information, see Create an API integration for interacting with the repository API.
dbt dependencies have the following requirements and limitations:
Specifying Git packages in the packages.yml file is not supported.
A network rule and external access integration are required to allow Snowflake to access the repositories for the dependencies. For more information, see Create an external access integration in Snowflake for dbt dependencies.
You must run the dbt deps command from within a workspace for dbt on Snowflake to build and populate the dbt_packages project folder.
Each dbt project must have a profiles.yml file stored in its working directory. Snowflake credentials (username and password) are not required in this file.
dbt projects are limited to 20,000 files - A dbt project in a workspace cannot have more than 20,000 files in its folder structure. This limit includes all files in the dbt project directory and sub-directories, including the target, dbt_packages, and logs directories, where log files are saved when a dbt project runs from within the workspace.
Workspaces cannot be shared because they are created in a personal database.
Personal databases must be enabled at the account level. For more information, see Enable and disable private notebooks for the account.
Viewing logs and tracing requires that you set the LOG_LEVEL and TRACE_LEVEL on the dbt project object. For more information, see Access control and Observability and monitoring. By default, Snowflake collects telemetry in the default SNOWFLAKE.TELEMETRY.EVENTS table. If you have a custom event table that is set as the event table for your account, telemetry data is collected there. If you use an Enterprise Edition account, you can create an event table to collect telemetry data and associate it with the database where the dbt project object is deployed. For more information, see Event table overview.
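As an illustration of the Enterprise Edition option described above, associating a custom event table with the database where the dbt project object is deployed might look like the following sketch (object names are placeholders; verify the exact syntax against the event table documentation):

```sql
-- Create an event table and set it as the event table for the database
-- where the dbt project object is deployed (names are placeholders).
CREATE EVENT TABLE my_database.telemetry.dbt_events;
ALTER DATABASE my_database SET EVENT_TABLE = my_database.telemetry.dbt_events;
```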
Known issues¶
The following are known issues with dbt on Snowflake during the private preview:
Operational dbt commands in Workspaces - Some dbt commands listed in Workspaces are not fully supported and might result in errors. Support for additional commands is on the roadmap.
dbt output not immediately available in workspace - stdout does not appear incrementally during project execution but only appears after the command completes.
Scheduling not yet available in Snowsight - Using a Snowflake task to schedule a dbt project object currently requires running SQL commands as described in Create a task to schedule dbt project execution. Scheduling a dbt project using tasks in Snowsight is on the roadmap.
dbt run history not yet available - The ability for you to view dbt run history pages in Snowsight is on the roadmap. As a workaround, you can use Logging, tracing, and metrics to view the history of queries that a dbt project ran. For more information, see Observability and monitoring.
OAuth for workspaces integrated with Git is not yet supported.
Upcoming improvements¶
Performance improvements - The team is implementing a 20-40x improvement in dbt warmup time, that is, the time between when a job starts and when the dbt process starts. For a dbt project with files a few bytes in size, the new method takes approximately 3 ms per file, reduced from approximately 130 ms per file. This improvement is targeted for the 9.14 release.
Manifest file generation fix - This fix resolves the "No project manifest found, please compile first" exception when trying to view a DAG.
Access control¶
The following privileges apply to dbt project objects and workspaces:
To grant privileges to create a dbt project object:
GRANT CREATE DBT PROJECT ON SCHEMA my_database.my_schema TO ROLE my_role;
To grant privileges to execute a dbt project object and to list or get files:
GRANT USAGE ON DBT PROJECT my_dbt_project_object TO ROLE my_role;
To grant privileges to alter or drop a dbt project object:
GRANT OWNERSHIP ON DBT PROJECT my_dbt_project_object TO ROLE my_role;
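To confirm which roles hold privileges on a dbt project object, you can inspect its grants. Assuming SHOW GRANTS accepts the DBT PROJECT object type as it does other schema-level objects, a minimal check might be:

```sql
-- List the privileges granted on the dbt project object
-- (the object name is a placeholder).
SHOW GRANTS ON DBT PROJECT my_dbt_project_object;
```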
Getting started tutorial¶
This getting started tutorial guides you through creating a workspace for dbt on Snowflake that is integrated with a GitHub repository that you fork from our getting-started-with-dbt-on-snowflake repository in Snowflake Labs. You then use the workspace to update dbt project files, and test and run the dbt project, which materializes the data model output of the dbt project in target Snowflake databases and schemas. Finally, you will deploy the project to create a dbt project object on Snowflake, set up a task to schedule execution of the project, and then view logs and tracing information on the initial execution.
Prerequisites¶
GitHub
A GitHub account with the ability to create a repository and manage access to that repository.
Git on the command line. For more information about installation, see Set up Git.
Snowflake
A Snowflake account and user with privileges as described in Access control.
Privileges to create and edit the following objects or access to an administrator who can create each of them on your behalf:
An API integration
If your GitHub repository is private, a secret
A network rule
An external access integration that references the network rule
Your user object
Fork and clone the dbt on Snowflake getting started repository¶
Go to https://github.com/Snowflake-Labs/getting-started-with-dbt-on-snowflake, select the down arrow next to Fork, and then select Create a new fork.
Specify the owner and name of your forked repository and other details. Later in the tutorial, we use the following URL to represent your forked repository:
https://github.com/my-github-account/getting-started-with-dbt-on-snowflake.git
Activate all secondary roles for your user¶
Workspaces that you create in Snowflake are created in the personal database associated with the active user. To use Workspaces, you must run the following SQL commands to activate all secondary roles for your user.
Sign in to Snowsight, open a worksheet, and run the following SQL commands. Replace my_user with your Snowflake user name.
ALTER USER my_user UNSET DEFAULT_SECONDARY_ROLES;
ALTER USER my_user SET DEFAULT_SECONDARY_ROLES = ('ALL');
After the default secondary role has been set for your user, run the following command from a worksheet while logged in as that user.
USE SECONDARY ROLES ALL;
Sign out of Snowsight and sign back in.
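To confirm that secondary roles are active in your new session, you can check the session context with the CURRENT_SECONDARY_ROLES context function:

```sql
-- Returns the secondary roles active in the current session;
-- after the steps above, the result should indicate "ALL".
SELECT CURRENT_SECONDARY_ROLES();
```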
Create a warehouse for executing workspace actions (optional)¶
Having a dedicated warehouse assigned to your workspace can help you log, trace, and identify actions initiated from within that workspace. In this tutorial, we use a warehouse named TASTY_BYTES_DBT_WH. Alternatively, you can use an existing warehouse in your account. For more information about creating a warehouse, see Creating a warehouse.
The Tasty Bytes data model that you create for source data is fairly large, so we recommend using an XL warehouse. Run the following SQL to create the warehouse:
CREATE WAREHOUSE tasty_bytes_dbt_wh WAREHOUSE_SIZE = 'X-LARGE';
Create a database and schema for integrations and model materializations¶
This tutorial uses a database named TASTY_BYTES_DBT_DB. Within that database, a schema named INTEGRATIONS will store objects you create that Snowflake needs for GitHub integration. Schemas named DEV and PROD will store materialized objects that your dbt project creates.
Run the following SQL commands to create the database and schema:
CREATE DATABASE tasty_bytes_dbt_db;
CREATE SCHEMA tasty_bytes_dbt_db.integrations;
CREATE SCHEMA tasty_bytes_dbt_db.dev;
CREATE SCHEMA tasty_bytes_dbt_db.prod;
Create an API integration in Snowflake for connecting to GitHub¶
Snowflake needs an API integration to interact with GitHub.
If your repository is private, you must also create a secret in Snowflake to store GitHub credentials for your repository. You then specify the secret in the API integration definition as one of the ALLOWED_AUTHENTICATION_SECRETS. You also specify this secret when you create the workspace for your dbt project later.
Creating a secret requires a personal access token for your repository. For more information about creating a token, see Managing your personal access tokens in GitHub documentation.
In this tutorial, we use a secret named TB_DBT_GIT_SECRET. For more information about creating a secret, see Create a secret with credentials for authenticating.
Run the following SQL commands in Snowflake to create a secret for GitHub:
USE SCHEMA tasty_bytes_dbt_db.integrations;
CREATE OR REPLACE SECRET tasty_bytes_dbt_db.integrations.tb_dbt_git_secret
TYPE = password
USERNAME = 'your-gh-username'
PASSWORD = 'YOUR_PERSONAL_ACCESS_TOKEN';
After you create the secret, run the following SQL commands to create an API integration for GitHub. Replace https://github.com/my-github-account with the HTTPS URL of the GitHub account for your forked repository:
CREATE OR REPLACE API INTEGRATION tb_dbt_git_api_integration
API_PROVIDER = git_https_api
API_ALLOWED_PREFIXES = ('https://github.com/my-github-account')
-- Comment out the following line if your forked repository is public
ALLOWED_AUTHENTICATION_SECRETS = (tasty_bytes_dbt_db.integrations.tb_dbt_git_secret)
ENABLED = TRUE;
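Before moving on, you can verify that the integration was created and is enabled. For example:

```sql
-- Inspect the integration's properties, including
-- API_ALLOWED_PREFIXES and ENABLED.
DESCRIBE INTEGRATION tb_dbt_git_api_integration;
```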
Create an external access integration in Snowflake for dbt dependencies¶
Most dbt projects specify dependencies in their packages.yml file. You must install these dependencies in the dbt project workspace. You can't update a deployed dbt project object with dependencies.
To get dependency files from remote URLs, Snowflake needs an external access integration that relies on a network rule.
For more information about external access integrations in Snowflake, see Creating and using an external access integration.
Run the following SQL commands in Snowflake to create a network rule and an external access integration:
-- Create NETWORK RULE for external access integration
CREATE OR REPLACE NETWORK RULE dbt_network_rule
MODE = EGRESS
TYPE = HOST_PORT
-- Minimal URL allowlist that is required for dbt deps
VALUE_LIST = (
'hub.getdbt.com',
'codeload.github.com'
);
-- Create EXTERNAL ACCESS INTEGRATION for dbt access to external dbt package locations
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION dbt_ext_access
ALLOWED_NETWORK_RULES = (dbt_network_rule)
ENABLED = true;
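You can confirm that the integration exists and is enabled before you run dbt deps later in the tutorial. For example:

```sql
-- List external access integrations whose names start with "dbt".
SHOW EXTERNAL ACCESS INTEGRATIONS LIKE 'dbt%';
```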
Enable logging, tracing, and metrics¶
You can capture logging and tracing events for a dbt project object and for the task that runs it, if applicable. For more information, see Observability and monitoring later in this topic.
Set logging, tracing, and metrics on the schema where the dbt project object and task are deployed. The following commands enable logging, tracing, and metrics for the DEV and PROD schemas in the TASTY_BYTES_DBT_DB database.
ALTER SCHEMA tasty_bytes_dbt_db.dev SET LOG_LEVEL = 'INFO';
ALTER SCHEMA tasty_bytes_dbt_db.dev SET TRACE_LEVEL = 'ALWAYS';
ALTER SCHEMA tasty_bytes_dbt_db.dev SET METRIC_LEVEL = 'ALL';
ALTER SCHEMA tasty_bytes_dbt_db.prod SET LOG_LEVEL = 'INFO';
ALTER SCHEMA tasty_bytes_dbt_db.prod SET TRACE_LEVEL = 'ALWAYS';
ALTER SCHEMA tasty_bytes_dbt_db.prod SET METRIC_LEVEL = 'ALL';
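To confirm that the levels took effect, you can list the relevant schema parameters. For example:

```sql
-- Shows LOG_LEVEL, TRACE_LEVEL, and METRIC_LEVEL
-- among the parameters set on the schema.
SHOW PARAMETERS LIKE '%LEVEL' IN SCHEMA tasty_bytes_dbt_db.dev;
```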
Create a workspace integrated with your repository¶
In this step, you create a workspace in Snowsight that is integrated with your GitHub repository. For more information about workspaces, see Workspaces.
Sign in to Snowsight.
Select Projects » Workspaces in the left-side navigation menu.
From the Workspaces list above the workspace explorer, under Create Workspace, select From Git repository.
For Repository URL, enter the HTTPS URL of your forked GitHub repository, for example, https://github.com/my-github-account/getting-started-with-dbt-on-snowflake.git
For Workspace name, enter a name. Later in this tutorial, we use tasty_bytes_dbt.
Under API integration, select the name of the API integration that you created earlier, for example, TB_DBT_GIT_API_INTEGRATION.
If your GitHub repository is public, select Public repository, and then select Create.
If your GitHub repository is private, and you created a secret for your API integration during setup, do the following:
Select Personal access token.
Under Credentials secret, select Select database and schema.
Select the database from the list (for example, TASTY_BYTES_DBT_DB) and then select the schema from the list (for example, INTEGRATIONS) where you stored the API integration.
Select Select secret, and then select your secret from the list, for example, tb_dbt_git_secret.
Select Create.
Snowflake connects to the GitHub repository that you specified and opens your new workspace. A single folder in the workspace named tasty_bytes_dbt_demo contains the dbt project that you will work with.
Run the SQL commands in tasty_bytes_setup.sql to set up source data¶
As source data for its transformations, the dbt project in your repository uses the foundational data model for the fictitious Tasty Bytes food truck brand. The SQL script to create the data model is in the workspace.
Navigate to the tasty_bytes_dbt_demo/setup/tasty_bytes_setup.sql file in your workspace and open it.
From the Run on warehouse list in the upper right of the worksheet tab, select the warehouse you created earlier, for example, TASTY_BYTES_DBT_WH.
Run all worksheet commands in the file (Cmd + Shift + Enter). The Output tab displays the message tb_101 setup is now complete.
Verify the contents of the profiles.yml file in your dbt project root¶
Each dbt project folder in your Snowflake workspace must contain a profiles.yml file that specifies a target warehouse, database, schema, and role in Snowflake for the project. The type must be set to snowflake.
dbt requires an account and user, but these can be left as an empty or arbitrary string because the dbt project runs in Snowflake under the current account and user context.
When you run dbt commands, your workspace reads profiles.yml. When at least one valid target is specified in profiles.yml, the workspace menu bar offers the targets defined in that project and the dbt commands available to run.
Open the tasty_bytes_dbt_demo/profiles.yml file and verify the contents as shown below. If you specified different database or warehouse names earlier, replace them with your own.
tasty_bytes:
target: dev
outputs:
dev:
type: snowflake
account: 'not needed'
user: 'not needed'
role: accountadmin
database: tasty_bytes_dbt_db
schema: dev
warehouse: tasty_bytes_dbt_wh
prod:
type: snowflake
account: 'not needed'
user: 'not needed'
role: accountadmin
database: tasty_bytes_dbt_db
schema: prod
warehouse: tasty_bytes_dbt_wh
Update dependencies by running dbt deps for each dbt target defined in profiles.yml¶
The first dbt commands that you run update the dependency specified in your project's packages.yml file. The dbt demo repository in your branch specifies version 1.3.0 of the dbt-labs/dbt_utils package.
Select Output in the workspace footer to open the Output tab. This allows you to see stdout when you run dbt commands from the workspace.
From the workspace menu bar, confirm that the default Project (tasty_bytes_dbt_demo) and target (dev) are selected.
Select Deps from the command list.
Select the down arrow next to the execute button. In the dbt Deps window, leave Run with defaults selected, and enter the name of the External Access Integration you created during setup in the space provided, for example, dbt_ext_access.
Select Deps to run the command.
The command that runs on Snowflake is shown in the Output tab similar to the following:
execute dbt project from workspace "USER$"."PUBLIC"."tasty_bytes_dbt" project_root='tasty_bytes_dbt_demo' args='deps --target dev' external_access_integrations = (dbt_ext_access)
When the command finishes, stdout messages similar to the following appear:
14:47:19 Running with dbt=1.8.9
14:47:19 Updating lock file in file path: /tmp/dbt/package-lock.yml
14:47:19 Installing dbt-labs/dbt_utils
14:47:19 Installed from version 1.3.0
14:47:19 Up to date!
Uploading /tmp/dbt/package-lock.yml to snow://workspace/USER$ADMIN.PUBLIC."tasty_bytes_dbt"/versions/live/dbt//package-lock.yml
Repeat the steps above, selecting Prod as the target.
Execute common dbt commands¶
You can use the workspace to execute common dbt commands on a project.
You specify the dbt project and target in the workspace menu bar, select the command to execute, and then select the execute button. The down arrow next to the execute button allows you to specify additional arguments that the dbt command supports.
When you execute any dbt command within the workspace, the Output tab shows the command executed on Snowflake (in green) and the stdout for that command so that you can monitor command success or failure.
Important
Choosing the dbt Run command for a project from within the workspace materializes target output for the project as defined in the project's profiles.yml file.
Compile the dbt project and view compiled SQL¶
Compiling a project in dbt creates executable SQL from modeled SQL files. For more information, see About dbt models in the dbt documentation. After you compile the project in the workspace, you can open any SQL file in the models folder to see the model SQL and the compiled SQL in side-by-side tabs.
Select the project and target you want to compile.
Select Compile from the command list and then select the execute button (optionally, you can select the down arrow and specify compile command arguments).
After the command completes successfully, select any file in the models directory of the project and then select View Compiled SQL.
View the dbt demo project DAG¶
A workspace for dbt on Snowflake features a DAG pane, which allows you to visualize your dbt project transformations from source files to materialized data model objects in Snowflake.
After you compile a project, select DAG on the right of the workspace menu bar to open the DAG tab. Click and drag any DAG object to pan the view. Use the + and – buttons to zoom in and out.
Run the dbt dev project and verify the materialized Snowflake objects¶
Executing the dbt run command runs your compiled SQL against the target database and schema using the Snowflake warehouse and role that are specified in the profiles.yml file of the project. In this step, we'll materialize the output of the Dev target in your dbt demo project. We then create a SQL worksheet named dbt_sandbox.sql in the workspace where you can run SQL to verify object creation.
Select Dev from the target list, Run from the command list, and then select Execute.
The output pane shows the completion status of the run.
Navigate to the examples folder in your tasty_bytes_dbt_demo project, select the + next to the folder name, and then select SQL File. Type dbt_sandbox.sql and then press Enter.
In the workspace tab for dbt_sandbox.sql, run the following query:
SHOW TABLES IN DATABASE tasty_bytes_dbt_db;
You should see the tables CUSTOMER_LOYALTY_METRICS, ORDERS, and SALES_METRICS_BY_LOCATION in the Status and Results pane.
Run the following command to see the views that your dbt project run created:
SHOW VIEWS IN DATABASE tasty_bytes_dbt_db;
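As a further check, you can query one of the materialized tables directly. For example, a simple row count on the CUSTOMER_LOYALTY_METRICS table created by the dev run:

```sql
-- Row count of one table materialized by the dbt run against the dev target.
SELECT COUNT(*) AS row_count
  FROM tasty_bytes_dbt_db.dev.customer_loyalty_metrics;
```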
Push your file updates from the workspace to your repository¶
Now that you have updated your workspace and compiled, tested, and run your project, you can push the changes you made in the workspace to your GitHub repository.
With your workspace open, select Changes.
The object explorer showing the list of folders and files in your workspace appears.
A indicates a file added in the workspace but not yet in the Git repository.
M indicates a modified file.
D indicates a deleted file.
Select a file to view its diff with GitHub since the last pull (in this case, when the workspace was created).
The menu bar above the object explorer has a branch selector, which should be set to main for this tutorial.
Select Pull to pull any changes from GitHub.
A text box allows you to enter a commit message, which is required to push.
Type a commit message in the box provided to enable the Push button, for example, Updating project with initial changes from dbt on Snowflake.
Select Push.
A push to your repository might take several minutes.
Deploy the dbt project object from the workspace¶
Deploying your dbt project from a workspace creates a dbt project object. You can use the object to schedule, run, and monitor a dbt project in Snowflake outside of the workspace.
When you deploy your dbt project object from the workspace to a Snowflake database and schema, you can create or overwrite an object that you previously created.
Select Deploy on the right of the workspace menu bar.
Select Select database and schema, and then select the TASTY_BYTES_DBT_DB database and the DEV schema.
Under Select or Create dbt Object, select Create dbt Object.
Under Enter Name, type TASTY_BYTES_DBT_DEV and then select Deploy.
The Output tab displays the command executed on Snowflake, similar to the following:
create or replace DBT PROJECT "TASTY_BYTES_DBT_DB"."DEV"."TASTY_BYTES_DBT_DEV" from snow://workspace/USER$ADMIN.PUBLIC."tasty_bytes_dbt_demo"/versions/live/dbt
The Output tab also displays stdout similar to the following:
tasty_bytes_dbt_dev successfully created.
To verify the creation of the project, run the following SQL command from the dbt_sandbox.sql file worksheet that you created earlier:
SHOW DBT PROJECTS LIKE 'tasty%';
Enable logging, tracing, and metrics for the task that runs the dbt project object¶
To view logging, tracing, and metrics for the task that executes your dbt project object, you must set the LOG_LEVEL, TRACE_LEVEL, and METRIC_LEVEL parameters on the schema where the task is deployed. For more information, see Observability and monitoring.
Create a task to schedule dbt project execution¶
Now that your tasty_bytes_dbt_dev dbt project object is deployed, you can use SQL to set up a task that executes a dbt run command on your dbt project object.
Open the workspace tab for the dbt_sandbox.sql file that you created earlier and run the following SQL command, which creates a task in Snowflake that runs your dbt project using the customer_loyalty_metrics model every minute. You can adjust the schedule to suit your preference.
For more information about tasks and task options, see Introduction to tasks and CREATE TASK.
CREATE OR ALTER TASK tasty_bytes_dbt_db.dev.run_prepped_data_dbt
WAREHOUSE = tasty_bytes_dbt_wh
SCHEDULE = '1 minute'
AS
EXECUTE DBT PROJECT tasty_bytes_dbt_db.dev.tasty_bytes_dbt_dev args='run --select customer_loyalty_metrics';
-- Tasks are created in a suspended state by default.
-- Use the following command to run the task immediately:
EXECUTE TASK tasty_bytes_dbt_db.dev.run_prepped_data_dbt;
-- Use the following command to resume the task so that it runs on schedule:
ALTER TASK tasty_bytes_dbt_db.dev.run_prepped_data_dbt RESUME;
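Once the task is resumed, you can check its recent executions with the TASK_HISTORY table function. For example:

```sql
-- Most recent runs of the scheduling task, newest first.
SELECT name, state, scheduled_time, completed_time
  FROM TABLE(tasty_bytes_dbt_db.INFORMATION_SCHEMA.TASK_HISTORY(
    TASK_NAME => 'RUN_PREPPED_DATA_DBT'))
  ORDER BY scheduled_time DESC
  LIMIT 10;
```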
Clean up¶
You can delete the databases, workspaces, and warehouse that you created to clean up after this tutorial.
To delete the tasty_bytes_dbt_db database:
Run the following SQL commands from your dbt_sandbox.sql worksheet to remove the TASTY_BYTES_DBT_WH warehouse and the TASTY_BYTES_DBT_DB and TB_101 databases that you created, along with all schemas and objects created in the databases:
DROP WAREHOUSE IF EXISTS tasty_bytes_dbt_wh;
DROP DATABASE IF EXISTS tasty_bytes_dbt_db;
DROP DATABASE IF EXISTS tb_101;
To delete your tasty_bytes_dbt_demo workspace:
Observability and monitoring¶
You can capture logging and tracing events for a dbt project object and for the task that runs it, if applicable. Set logging, tracing, and metric levels on the schema where the dbt project object is deployed. The following commands use the tasty_bytes_dbt_db.dev schema from this tutorial; replace it with the identifier of the schema where your dbt project object is deployed:
ALTER SCHEMA tasty_bytes_dbt_db.dev SET LOG_LEVEL = 'INFO';
ALTER SCHEMA tasty_bytes_dbt_db.dev SET TRACE_LEVEL = 'ALWAYS';
ALTER SCHEMA tasty_bytes_dbt_db.dev SET METRIC_LEVEL = 'ALL';
With events, logging, and tracing enabled, you can view traces and logs for dbt project object execution. For more information, see Logging, tracing, and metrics.
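For example, assuming your account uses the default event table described earlier, a query for recent log records emitted during dbt project execution might look like the following (adjust the table name if you use a custom event table):

```sql
-- Recent log messages from the default event table.
-- The RECORD column is a variant that carries severity_text for LOG records.
SELECT timestamp,
       record:severity_text AS severity,
       value AS message
  FROM SNOWFLAKE.TELEMETRY.EVENTS
  WHERE record_type = 'LOG'
  ORDER BY timestamp DESC
  LIMIT 20;
```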
You can view the history of compiled queries that your dbt project runs in query history. For more information, see Monitor query activity with Query History.
If you use a task to schedule dbt project object execution, you can view task history using SQL or Snowsight. For more information, see Viewing the task history for your account and Viewing tasks and task graphs in Snowsight.
Using the Snowflake CLI to integrate dbt with CI/CD¶
You can use the Snowflake CLI to create a CI/CD workflow for your dbt project that integrates with popular systems and frameworks, such as GitHub Actions. For more information, see Integrating CI/CD with Snowflake CLI and Managing dbt projects in Snowflake CLI.
Using SQL to manage dbt project objects¶
You can specify dbt project objects in SQL using standard notation for object identifiers, for example:
my_account.my_database.my_schema.my_dbt_project_object
Some SQL commands require you to specify the name of the workspace for dbt on Snowflake rather than the deployed dbt project object. In this case, use the following notation:
user$.public."my_workspace_name"
DESCRIBE DBT PROJECT¶
Displays information about the dbt project object.
DESCRIBE DBT PROJECT <database>.<schema>.<dbt_project_object>;
DROP DBT PROJECT¶
Removes the dbt project object from the system.
DROP DBT PROJECT <database>.<schema>.<dbt_project_object>;
EXECUTE DBT PROJECT¶
Executes the specified dbt command for the specified dbt project object.
Syntax¶
EXECUTE DBT PROJECT <name>
args='<dbt_command> [--<dbt_flag> <flag_value> [...]] [...]';
Required parameters¶
name - The fully qualified name of the dbt project object to run, for example, my_account.my_database.my_schema.my_dbt_project_object.
args='<dbt_command>' - The dbt command to execute and any flags to pass to the command. The args value is passed as a string and must be enclosed in single quotes.
Optional parameters¶
Optional dbt flags for commands must be specified in the format --<dbt_flag> <flag_value>, the same format that dbt supports.
For example, to execute the test command, specifying the dbt models to test with the --select flag, you would specify:
EXECUTE DBT PROJECT my_account.my_database.my_schema.my_dbt_project_object
args='test --select my_first_model my_second_model';
The following flags are not supported by dbt on Snowflake:
--log-path
--profiles-dir
--project-dir
--log-format
--log-format-file
LIST¶
Lists files in a dbt project object.
Syntax¶
LIST 'snow://dbt/<database>.<schema>.<dbt_project_object>/versions/last/';