PutDatabricksSQL 2025.10.9.21

Bundle

com.snowflake.openflow.runtime | runtime-databricks-processors-nar

Description

Submits a SQL statement for execution using the Databricks REST API, then writes the JSON response to the FlowFile content. For high-performance SELECT or INSERT queries, use ExecuteSQL instead.
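As a rough illustration of what this processor submits, the sketch below builds the JSON body for the Databricks SQL Statement Execution API (POST /api/2.0/sql/statements). The helper function name is hypothetical; the payload fields follow the public API, with catalog and schema corresponding to the Default Catalog and Default Schema properties.

```python
import json

# Hypothetical helper: builds the request body this processor's REST call
# would carry. Not the processor's actual implementation.
def build_statement_payload(statement, warehouse_id, catalog=None, schema=None):
    payload = {"statement": statement, "warehouse_id": warehouse_id}
    if catalog is not None:
        payload["catalog"] = catalog  # maps to the Default Catalog property
    if schema is not None:
        payload["schema"] = schema    # maps to the Default Schema property
    return payload

payload = build_statement_payload(
    "SELECT 1", warehouse_id="abc123", catalog="main", schema="default")
print(json.dumps(payload))
```

Note that, as the property descriptions below point out, some statements such as 'COPY INTO' do not honor a default catalog or schema, so those fields may need to be omitted and the objects fully qualified in the statement itself.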

Tags

databricks, openflow, sql

Input Requirement

Supports Sensitive Dynamic Properties

false

Properties

Databricks Client: Databricks Client Service.
Default Catalog: Default table catalog. Some SQL statements, such as 'COPY INTO', do not support a default catalog.
Default Schema: Default table schema. Some SQL statements, such as 'COPY INTO', do not support a default schema.
Record Writer: Specifies the Controller Service to use for writing results to a FlowFile. The Record Writer may use Inherit Schema to emulate the inferred-schema behavior: an explicit schema need not be defined in the writer; it will be supplied by the same logic used to infer the schema from the column types.
SQL Warehouse ID: Warehouse ID used to execute the SQL statement.
SQL Warehouse Name: SQL Warehouse name used to execute the SQL statement; all SQL Warehouses are searched for a matching name.
Statement: The SQL statement to execute.

Relationships

failure: Databricks failure relationship.
http.response: The HTTP response to the SQL API request.
original: The original FlowFile is routed to this relationship when processing is successful.
records: Serialized SQL records.

Writes attributes

statement.state: The final state of the executed SQL statement.
error.code: The error code for the SQL statement if an error occurred.
error.message: The error message for the SQL statement if an error occurred.
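To show where these attribute values come from, the sketch below extracts them from a Databricks SQL Statement Execution API response. The response shape (a status object with a state and an optional error) follows the documented API; the extraction helper itself is a hypothetical illustration, not the processor's code.

```python
import json

# Hypothetical helper: derives the written attributes from the API response.
def extract_attributes(response_json):
    status = response_json.get("status", {})
    attrs = {"statement.state": status.get("state")}
    error = status.get("error")
    if error:
        attrs["error.code"] = error.get("error_code")
        attrs["error.message"] = error.get("message")
    return attrs

sample = json.loads(
    '{"statement_id": "abc", "status": {"state": "FAILED", '
    '"error": {"error_code": "BAD_REQUEST", "message": "Table not found"}}}')
print(extract_attributes(sample))
```

On a successful execution, statement.state would typically be SUCCEEDED and the error attributes would be absent.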