Openflow Connector for Salesforce Bulk API: Configure the connector¶
Note
This connector is subject to the Snowflake Connector Terms.
This topic describes the steps to configure the Openflow Connector for Salesforce Bulk API.
Install the connector¶
Follow these steps to install the Openflow Connector for Salesforce Bulk API in an Openflow runtime:
1. Navigate to the Openflow Overview page. In the Featured connectors section, select View more connectors.
2. On the Openflow connectors page, find Openflow Connector for Salesforce Bulk API and select Add to runtime.
3. In the Select runtime dialog, select your runtime from the Available runtimes drop-down.
The Openflow canvas appears with the connector process group added to it.
Configure the connector¶
To configure the connector, perform the following steps:
1. Right-click on the imported process group and select Parameters.
2. Populate the required parameter values as described in the table below.
| Parameter | Description |
|---|---|
| Column Removal Strategy | Defines the strategy to adopt when a column should be removed from the destination table based on the latest received schema. Three possible values are supported; one of them renames removed columns using the Removed Column Name Suffix parameter. |
| Connected App Key | Copy-paste the content of the private key file created during the Salesforce Setup steps. |
| Connected App Key File | Alternatively, you can upload the private key file directly instead of pasting its content into the Connected App Key parameter. |
| Connected App Key Password | Password set on the private key file during the Salesforce Setup steps. |
| Destination Database | Name of the Snowflake database into which the Salesforce data is replicated. The database must exist before starting the connector; see the SQL sketch after this table. |
| Destination Schema | Name of the schema, in the Destination Database, in which the connector creates tables for the Salesforce data. The schema must exist before starting the connector. |
| Filter | Comma-separated list of objects to replicate from Salesforce, or a regular expression applied against all existing objects. Note: If left empty, all objects are replicated. This is not recommended, because a Salesforce instance usually contains thousands of objects. |
| Incremental Offload | Whether the processor should perform incremental offloads, loading only changes since the previous run instead of a full load on every run. |
| Initial Load Chunking | If set, the initial historical load is split into chunks rather than fetched in a single query. This is useful for large datasets, where loading all historical data in a single query may time out, exceed API limits, or exceed the storage size of the runtime's content repository. Once caught up, the processor continues with normal incremental offload behavior. |
| OAuth2 Audience | Audience (`aud` claim) to set in the JWT token. This is usually set to `https://login.salesforce.com`. |
| OAuth2 Client ID | Set this to the Consumer Key value retrieved during the Salesforce Setup steps. |
| OAuth2 Subject | Set this to the username of an admin-approved user; the application interacts with Salesforce APIs on behalf of this user. |
| OAuth2 Token Endpoint URL | Endpoint used to negotiate tokens via the JWT Bearer Flow. Example: `https://login.salesforce.com/services/oauth2/token`. |
| Object Fields Filter JSON | A JSON document specifying, per Salesforce object, which fields and field patterns should be included or excluded. Takes the form of an array with one item per object. For example, one entry can include all fields that end with 'name' in the Account object, while another can include only the fields Id, Name, and Revenue. See the hedged sketch after this table for the general shape. |
| Object Identifier Resolution | Determines whether schema, table, and column names are treated as case-sensitive or case-insensitive. Note: Changing this parameter value requires clearing the state and doing a full reload of all objects. |
| Removed Column Name Suffix | Suffix appended to a column name when the Column Removal Strategy parameter is set to the renaming strategy. |
| Run Schedule | Frequency at which the connector checks Salesforce for updates to the objects configured via the Filter parameter. |
| Salesforce Instance | Qualified name of the Salesforce instance. |
| Snowflake Account Identifier | Snowflake account name formatted as `[organization-name]-[account-name]`. |
| Snowflake Username | The name of the service user the connector uses to connect to Snowflake. |
| Snowflake User Private Key | The RSA private key used by the connector for authentication to Snowflake, formatted according to PKCS8 standards and including standard PEM headers and footers. The header line starts with `-----BEGIN PRIVATE KEY`. You may also use the Snowflake User Private Key File parameter to upload the private key to the Openflow runtime instead. |
| Snowflake User Private Key File | The file containing the RSA private key used by the connector for authentication to Snowflake, formatted according to PKCS8 standards and including standard PEM headers and footers. The header line starts with `-----BEGIN PRIVATE KEY`. Select the Reference asset checkbox to upload the private key file and store it securely in the Openflow runtime. |
| Snowflake User Private Key Password | The password associated with the Snowflake user private key file, if it is encrypted. |
| Snowflake User Role | Name of the Snowflake role, assigned to the Snowflake username, that is used during query execution. |
| Snowflake Authentication Strategy | Authentication strategy for the connector's connection to Snowflake. With key-pair authentication, the Snowflake User Private Key parameters must be set. |
| Snowflake Warehouse | The Snowflake warehouse used to run queries. |
| Special Objects Filter | Comma-separated list of objects to offload from Salesforce using direct API access, or a regular expression applied against all existing objects. Use this filter only for objects that are not supported by the Salesforce Bulk API, such as knowledge data. This parameter should not overlap with the Filter parameter. |
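Both the Destination Database and Destination Schema must exist before the connector starts, and the role named in Snowflake User Role needs privileges to create and write tables there. The following is a minimal sketch, assuming the placeholder names `SALESFORCE_DB`, `SALESFORCE_DATA`, `OPENFLOW_ROLE`, and `OPENFLOW_WH`; your privilege model may differ.

```sql
-- Placeholder names; substitute the values you configured in the parameters above.
CREATE DATABASE IF NOT EXISTS SALESFORCE_DB;
CREATE SCHEMA IF NOT EXISTS SALESFORCE_DB.SALESFORCE_DATA;

-- Minimal privileges for the connector's role.
GRANT USAGE ON WAREHOUSE OPENFLOW_WH TO ROLE OPENFLOW_ROLE;
GRANT USAGE ON DATABASE SALESFORCE_DB TO ROLE OPENFLOW_ROLE;
GRANT USAGE, CREATE TABLE ON SCHEMA SALESFORCE_DB.SALESFORCE_DATA TO ROLE OPENFLOW_ROLE;
```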
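As a rough illustration of the Object Fields Filter JSON shape described above (an array with one item per Salesforce object), the sketches below cover the two examples from the table. The key names `object`, `includedPattern`, and `includedFields` are hypothetical placeholders, not confirmed connector syntax.

```
// Example 1 (hypothetical key names): include all Account fields ending with 'name'.
[
  { "object": "Account", "includedPattern": ".*name$" }
]
```

```
// Example 2 (hypothetical key names): include only Id, Name, and Revenue on Account.
[
  { "object": "Account", "includedFields": ["Id", "Name", "Revenue"] }
]
```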
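With key-pair authentication, the private key supplied in Snowflake User Private Key (or the key file parameter) must correspond to a public key registered on the Snowflake service user. A sketch, assuming a hypothetical user name `OPENFLOW_SVC`:

```sql
-- Attach the public key to the service user. The value is the Base64 body of the
-- public key PEM file, without the header and footer lines (shortened placeholder here).
ALTER USER OPENFLOW_SVC SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';
```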
Run the connector¶
Follow these steps to start the connector and begin replicating data from Salesforce to Snowflake:
1. Right-click on an empty area in the canvas and select Enable all Controller Services.
2. Right-click on the connector process group and select Start.
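Once the connector is running, one way to confirm that replication has started is to inspect the destination schema for the tables the connector creates. A sketch, reusing the placeholder names from the configuration section and assuming an Account table has been created:

```sql
-- List the tables the connector has created so far.
SHOW TABLES IN SCHEMA SALESFORCE_DB.SALESFORCE_DATA;

-- Spot-check row counts for a replicated object (table name assumed for illustration).
SELECT COUNT(*) FROM SALESFORCE_DB.SALESFORCE_DATA.ACCOUNT;
```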
Next steps¶
To monitor and troubleshoot the connector, see Openflow Connector for Salesforce Bulk API: Troubleshooting.