Openflow Connector for Oracle: Configure connector

Note

The connector is subject to the Connector Terms.

This topic describes the steps to set up the Openflow Connector for Oracle.

As a data engineer, perform the following tasks to install and configure the connector:

Install the connector

  1. Navigate to the Openflow Overview page. In the Featured connectors section, select View more connectors.

  2. On the Openflow connectors page, find the connector and select Add to runtime.

  3. In the Select runtime dialog, select your runtime from the Available runtimes drop-down list.

  4. Select Add.

    Note

    Before you install the connector, ensure that you have created a database and schema in Snowflake for the connector to store ingested data.

  5. Authenticate to the deployment with your Snowflake account credentials and select Allow when prompted to allow the runtime application to access your Snowflake account. The connector installation process takes a few minutes to complete.

  6. Authenticate to the runtime with your Snowflake account credentials.

The Openflow canvas appears with the connector process group added to it.

Configure the connector

To configure the connector, perform the following steps:

  1. Right-click on the added process group and select Parameters.

  2. Populate the required parameter values as described in Specify flow parameters.

Specify flow parameters

This section describes the flow parameters that you can configure based on the following parameter contexts:

Snowflake Destination Parameters

Parameter

Description

Required

Destination Database

The database where data will be persisted. It must already exist in Snowflake. The name is case-sensitive. For unquoted identifiers, provide the name in uppercase.

Yes

Snowflake Account Identifier

When using:

  • Session Token Authentication Strategy: Must be blank.

  • KEY_PAIR: Snowflake account name formatted as [organization-name]-[account-name] where data will be persisted.

Yes

Snowflake Authentication Strategy

When using:

  • Snowflake Openflow Deployment: Use SNOWFLAKE_SESSION_TOKEN. This token is managed automatically by Snowflake.

  • BYOC: Use KEY_PAIR as the value for authentication strategy.

Yes

Snowflake Private Key

When using:

  • Session Token Authentication Strategy: Must be blank.

  • KEY_PAIR: Must be the RSA private key used for authentication.

    The RSA key must be formatted according to PKCS8 standards and have standard PEM headers and footers. Note that either a Snowflake Private Key File or a Snowflake Private Key must be defined.

No

Snowflake Private Key File

When using:

  • Session Token Authentication Strategy: Must be blank.

  • KEY_PAIR: Upload the file that contains the RSA private key used for authentication to Snowflake, formatted according to PKCS8 standards and including standard PEM headers and footers. The header line begins with -----BEGIN PRIVATE. To upload the private key file, select the Reference asset checkbox.

No

Snowflake Private Key Password

When using:

  • Session Token Authentication Strategy: Must be blank.

  • KEY_PAIR: Provide the password associated with the Snowflake Private Key File.

No

Snowflake Role

When using:

  • Session Token Authentication Strategy: Use your Runtime Role. You can find your Runtime Role in the Openflow UI by navigating to View Details for your Runtime.

  • KEY_PAIR Authentication Strategy: Use a valid role configured for your service user.

Yes

Snowflake Username

When using:

  • Session Token Authentication Strategy: Must be blank.

  • KEY_PAIR: Provide the username used to connect to the Snowflake instance.

Yes

Snowflake Warehouse

Snowflake warehouse used to run queries.

Yes
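The key-pair parameters above expect an RSA private key in PKCS8 format with standard PEM headers and footers, where the header line begins with -----BEGIN PRIVATE. A minimal sketch (the helper name and example key text are illustrative, not part of the connector) that checks a key file has the expected PEM framing before you reference it:

```python
# Sketch: verify a private key looks like an unencrypted PKCS8 PEM
# before using it for the Snowflake Private Key / Private Key File
# parameters. The example key body below is truncated and fake.

def looks_like_pkcs8_pem(pem_text: str) -> bool:
    """Return True if the text has standard PKCS8 PEM header/footer lines."""
    lines = [line.strip() for line in pem_text.strip().splitlines()]
    return (
        len(lines) >= 2
        and lines[0].startswith("-----BEGIN PRIVATE")  # e.g. -----BEGIN PRIVATE KEY-----
        and lines[-1].startswith("-----END PRIVATE")
    )

example = """-----BEGIN PRIVATE KEY-----
MIIEvQIBADANBgkqhkiG9w0BAQEFAASC...
-----END PRIVATE KEY-----"""
print(looks_like_pkcs8_pem(example))  # True
```

A key whose header reads -----BEGIN RSA PRIVATE KEY----- is in the older PKCS1 format and would fail this check; convert it to PKCS8 before use.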

Oracle Ingestion Parameters

Parameter

Description

Included Table Names

Comma-separated list of fully-qualified table paths (e.g., SCHEMA_NAME.TABLE_NAME).

Included Table Regex

A regular expression to match table paths for automatic inclusion of existing and new tables.

Filter JSON

A JSON array to include specific columns based on a regex pattern for given tables. This parameter is currently unsupported.

Merge Task Schedule CRON

A CRON expression that defines when merge operations from the journal to the destination table are triggered. For example, * * * * * ? for continuous merge.

Object Identifier Resolution

Specifies how source object identifiers, such as schema, table, and column names, are stored and queried in Snowflake. This setting determines whether you need to use double quotes in SQL queries.

Option 1: Default, case-sensitive. For backwards compatibility.

  • Transformation: Case is preserved. For example, My_Table remains My_Table.

  • Queries: SQL queries must use double quotes to match the exact case for database objects. For example, SELECT * FROM "My_Table";.

Note

Snowflake recommends this option if you must preserve source casing for legacy or compatibility reasons. For example, if the source database includes table names that differ only in case, such as MY_TABLE and my_table, case-insensitive comparisons would result in a name collision.

Option 2: Recommended, case-insensitive

  • Transformation: All identifiers are converted to uppercase. For example, My_Table becomes MY_TABLE.

  • Queries: SQL queries are case-insensitive and don’t require double quotes. For example, SELECT * FROM my_table; returns the same results as SELECT * FROM MY_TABLE;.

Note

Snowflake recommends using this option if database objects are not expected to have mixed-case names.

Important

Do not change this setting after ingestion has begun; doing so will break the existing ingestion. If you must change this setting, create a new connector instance.
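The Included Table Regex is matched against fully-qualified SCHEMA_NAME.TABLE_NAME paths, and Option 2 of Object Identifier Resolution uppercases identifiers. An illustrative Python sketch (the pattern and table names are made up, not connector behavior) of how both work:

```python
import re

# Hypothetical inclusion pattern: every table in the HR schema,
# plus any table whose name ends in _AUDIT, matched against
# fully-qualified SCHEMA_NAME.TABLE_NAME paths.
included = re.compile(r"^HR\.\w+$|^\w+\.\w+_AUDIT$")

tables = ["HR.EMPLOYEES", "SALES.ORDERS", "SALES.ORDERS_AUDIT", "HR.My_Table"]
selected = [t for t in tables if included.match(t)]
print(selected)  # ['HR.EMPLOYEES', 'SALES.ORDERS_AUDIT', 'HR.My_Table']

# With Option 2 (case-insensitive resolution), identifiers are
# stored uppercased, e.g. HR.My_Table becomes HR.MY_TABLE.
print([t.upper() for t in selected])
```

Note that with Option 2, source tables whose names differ only in case would collide after uppercasing, which is why Option 1 exists for such sources.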

Oracle Source Parameters

Parameter

Description

Required

Oracle Connection URL

JDBC URL of the Oracle database connection. For example, jdbc:oracle:thin:@<host>:<port>/YOUR_DB_NAME.

Yes

Oracle Username

Username of the connect user that has access to the XStream Server.

Yes

Oracle Password

Password of the connect user that has access to the XStream Server.

Yes

Oracle Database Processor Multiplier

Core Processor Licensing Factor, as described in the Oracle Processor Core Factor Table.

Required for Embedded License only

Oracle Database Processor Cores

The number of processor cores in your Oracle database.

Required for Embedded License only

XStream Billing Acknowledgement

A confirmation of the licensing agreement.

Required for Embedded License only

XStream Out Server Name

The name of the XStream Out server, which must already exist in Oracle.

Yes

XStream Out Server URL

JDBC URL of the database connection for XStream; it must use the OCI driver. For example, jdbc:oracle:oci:@<host>:<port>/SID.

Yes
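The two JDBC URLs above use different driver prefixes: the thin driver for the regular Oracle connection and the OCI driver for the XStream connection. A small sketch (host, port, and names are placeholders) that assembles and sanity-checks both formats:

```python
import re

def thin_url(host: str, port: int, service: str) -> str:
    """Thin-driver JDBC URL, the format used for Oracle Connection URL."""
    return f"jdbc:oracle:thin:@{host}:{port}/{service}"

def oci_url(host: str, port: int, sid: str) -> str:
    """OCI-driver JDBC URL, the format required for XStream Out Server URL."""
    return f"jdbc:oracle:oci:@{host}:{port}/{sid}"

url = thin_url("db.example.com", 1521, "ORCLPDB1")
print(url)  # jdbc:oracle:thin:@db.example.com:1521/ORCLPDB1

# Basic shape check covering either driver prefix.
pattern = re.compile(r"^jdbc:oracle:(thin|oci):@[^:]+:\d+/\w+$")
print(bool(pattern.match(url)))                                      # True
print(bool(pattern.match(oci_url("db.example.com", 1521, "ORCL"))))  # True
```

Note the colon after the driver name (thin: / oci:) before the @ sign; omitting it is a common cause of connection failures.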

Run the flow

  1. Right-click on the canvas and select Enable all Controller Services.

  2. Right-click on the imported process group and select Start. The connector starts the data ingestion.

Next steps

Monitor the flow.