Set up the Openflow Connector for Google Ads

Note

This connector is subject to the Snowflake Connector Terms.

This topic describes the steps to set up the Openflow Connector for Google Ads.

Prerequisites

  1. Ensure that you have reviewed About Openflow Connector for Google Ads.
  2. Ensure that you have Set up Openflow - BYOC or Set up Openflow - Snowflake Deployments.
  3. If using Openflow - Snowflake Deployments, ensure that you’ve reviewed configuring required domains and have granted access to the required domains for the Google Ads connector.

Get the credentials

As a Google Ads administrator, perform the following steps:

  • Ensure that you have access to a Google Cloud project or create a new one.
  • Ensure that the Google Ads API is enabled for your Google Cloud project. Google Ads API access is required to ingest data.
  • Configure Service account authentication for Google Ads.
  • Obtain a developer token for your organization by following the instructions.

Note

The developer token must have an access level of either Basic or Standard. For more information about access levels, see the documentation.

Set up Snowflake account

As a Snowflake account administrator, perform the following tasks:

  1. Create a new role or use an existing role, and grant the required Database privileges.

  2. Create a new Snowflake service user with the type as SERVICE.

  3. Grant the Snowflake service user the role you created in the previous steps.

  4. Configure key-pair authentication for the Snowflake SERVICE user from step 2.

  5. Snowflake strongly recommends this step. Configure a secrets manager supported by Openflow (for example, AWS, Azure, or HashiCorp) and store the public and private keys in the secret store.

    Note

    If for any reason, you do not wish to use a secrets manager, then you are responsible for safeguarding the public key and private key files used for key-pair authentication according to the security policies of your organization.

    1. Once the secrets manager is configured, determine how you will authenticate to it. On AWS, it’s recommended that you use the EC2 instance role associated with Openflow, because that way no other secrets have to be persisted.
    2. In Openflow, configure a Parameter Provider associated with this secrets manager: from the hamburger menu in the upper right, navigate to Controller Settings » Parameter Provider, and then fetch your parameter values.
    3. At this point all credentials can be referenced with the associated parameter paths and no sensitive values need to be persisted within Openflow.
  6. If any other Snowflake users require access to the raw ingested documents and tables ingested by the connector (for example, for custom processing in Snowflake), then grant those users the role created in step 1.

  7. Designate a warehouse for the connector to use. Start with the smallest warehouse size, then experiment with size depending on the number of tables being replicated, and the amount of data transferred. Large table numbers typically scale better with multi-cluster warehouses, rather than larger warehouse sizes.
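The account setup steps above can be sketched in SQL. This is a hedged outline, not the connector's required statements: the role, user, and warehouse names are placeholders, and the RSA public key value must come from your own key-pair generation.

```sql
-- Step 1: create (or reuse) a role for the connector; name is a placeholder
CREATE ROLE IF NOT EXISTS GOOGLE_ADS_CONNECTOR_ROLE;

-- Step 2: create a service user; name is a placeholder
CREATE USER IF NOT EXISTS GOOGLE_ADS_CONNECTOR_USER TYPE = SERVICE;

-- Step 3: grant the role to the service user
GRANT ROLE GOOGLE_ADS_CONNECTOR_ROLE TO USER GOOGLE_ADS_CONNECTOR_USER;

-- Step 4: attach the RSA public key for key-pair authentication
-- (replace the value with your own generated public key)
ALTER USER GOOGLE_ADS_CONNECTOR_USER SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';

-- Step 7: designate a warehouse; start with the smallest size
CREATE WAREHOUSE IF NOT EXISTS GOOGLE_ADS_WH
  WAREHOUSE_SIZE = XSMALL
  AUTO_SUSPEND = 300
  AUTO_RESUME = TRUE;
GRANT USAGE ON WAREHOUSE GOOGLE_ADS_WH TO ROLE GOOGLE_ADS_CONNECTOR_ROLE;
```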

Set up the connector

As a data engineer, perform the following tasks to install and configure the connector:

Install the connector

  1. Create a database and schema in Snowflake for the connector to store ingested data. Grant the required Database privileges to the role created in the first step. Substitute the role placeholder with the actual value, and run the following SQL commands:
CREATE DATABASE GOOGLE_ADS_DESTINATION_DB;
CREATE SCHEMA GOOGLE_ADS_DESTINATION_DB.GOOGLE_ADS_DESTINATION_SCHEMA;
GRANT USAGE ON DATABASE GOOGLE_ADS_DESTINATION_DB TO ROLE <GOOGLE_ADS_CONNECTOR_ROLE>;
GRANT USAGE ON SCHEMA GOOGLE_ADS_DESTINATION_DB.GOOGLE_ADS_DESTINATION_SCHEMA TO ROLE <GOOGLE_ADS_CONNECTOR_ROLE>;
GRANT CREATE TABLE ON SCHEMA GOOGLE_ADS_DESTINATION_DB.GOOGLE_ADS_DESTINATION_SCHEMA TO ROLE <GOOGLE_ADS_CONNECTOR_ROLE>;
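Optionally, you can confirm the grants landed on the role (a quick sanity check; substitute the role placeholder as above):

```sql
SHOW GRANTS TO ROLE <GOOGLE_ADS_CONNECTOR_ROLE>;
```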

  2. Install the connector:

  1. Navigate to the Openflow overview page. In the Featured connectors section, select View more connectors.

  2. On the Openflow connectors page, find the connector and select Add to runtime.

  3. In the Select runtime dialog, select your runtime from the Available runtimes drop-down list and click Add.

    Note

    Before you install the connector, ensure that you have created a database and schema in Snowflake for the connector to store ingested data.

  4. Authenticate to the deployment with your Snowflake account credentials and select Allow when prompted to allow the runtime application to access your Snowflake account. The connector installation process takes a few minutes to complete.

  5. Authenticate to the runtime with your Snowflake account credentials.

The Openflow canvas appears with the connector process group added to it.

Configure the connector

  1. Right-click on the imported process group and select Parameters.
  2. Populate the required parameter values as described in Flow parameters.

Flow parameters

There are three parameter contexts: Google Ads Source Parameters and Google Ads Destination Parameters are responsible for connections to the Google Ads API and to Snowflake, respectively, while Google Ads Ingestion Parameters defines the configuration of the data downloaded from Google Ads. Google Ads Parameters aggregates all of them in one.

Parameter | Description | Required
Client Account ID | ID of the Google Ads account for which the given report should be ingested | true
Login Customer ID | Customer ID of the Google Ads manager account (MCC) for which the report should be ingested | false
Google Ads Resource Name | Name of the resource in Google Ads that is the source for the report | true
Report Attributes | Attributes of the selected resource | true
Report Metrics | Metrics collected in the context of the given resource | false
Report Segments | Buckets in which metrics should be grouped | false
Report Start Date | Start date from which the ingestion should happen, in YYYY-MM-DD format | false
Schedule | Get Google Ads Report processor schedule | true

Note

The easiest way to obtain a proper combination of Report Attributes, Report Metrics, and Report Segments is to use the Google Ads Query Builder. Select the resource matching the one inserted into the Google Ads Resource Name parameter and construct the query. Then copy and paste the attributes, metrics, and segments into the corresponding parameters.
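For example, a query built in the Query Builder for the campaign resource might look like the following GAQL (Google Ads Query Language) statement; the specific field selection here is illustrative:

```sql
SELECT
  campaign.id,
  campaign.name,
  metrics.clicks,
  metrics.impressions,
  segments.date
FROM campaign
```

In this example, campaign is the Google Ads Resource Name, campaign.id and campaign.name go into Report Attributes, metrics.clicks and metrics.impressions into Report Metrics, and segments.date into Report Segments.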

Parameter | Description | Required
Google Developer Token | Developer token required to query the Google Ads API | true
Google Service Account JSON | Service account JSON required for Google Ads authentication | true
The following parameters are listed with their description and whether they are required.
Destination Database

The database where data will be persisted. It must already exist in Snowflake. The name is case-sensitive. For unquoted identifiers, provide the name in uppercase.

Yes
Destination Schema

The schema where data will be persisted, which must already exist in Snowflake. The name is case-sensitive. For unquoted identifiers, provide the name in uppercase.

See the following examples:

  • CREATE SCHEMA SCHEMA_NAME or CREATE SCHEMA schema_name: use SCHEMA_NAME
  • CREATE SCHEMA "schema_name" or CREATE SCHEMA "SCHEMA_NAME": use schema_name or SCHEMA_NAME, respectively
Yes
Snowflake Authentication Strategy

When using:

  • Snowflake Openflow Deployment or BYOC: Use SNOWFLAKE_MANAGED_TOKEN. This token is managed automatically by Snowflake. BYOC deployments must have previously configured runtime roles to use SNOWFLAKE_MANAGED_TOKEN.
  • BYOC: Alternatively, BYOC deployments can use KEY_PAIR as the value for the authentication strategy.
Yes
Snowflake Account Identifier

When using:

  • Session Token Authentication Strategy: Must be blank.
  • KEY_PAIR: Snowflake account name formatted as [organization-name]-[account-name] where data will be persisted.
Yes
Snowflake Private Key

When using:

  • Session Token Authentication Strategy: Must be blank.
  • KEY_PAIR: Must be the RSA private key used for authentication.

The RSA key must be formatted according to PKCS8 standards and have standard PEM headers and footers. Note that either a Snowflake Private Key File or a Snowflake Private Key must be defined.

No
Snowflake Private Key File

When using:

  • Session Token Authentication Strategy: Must be blank.
  • KEY_PAIR: Upload the file that contains the RSA private key used for authentication to Snowflake, formatted according to PKCS8 standards and including standard PEM headers and footers. The header line begins with -----BEGIN PRIVATE. To upload the private key file, select the Reference asset checkbox.
No
Snowflake Private Key Password

When using:

  • Session Token Authentication Strategy: Must be blank.
  • KEY_PAIR: Provide the password associated with the Snowflake private key file.
No
Snowflake Role

When using:

  • Session Token Authentication Strategy: Use your Snowflake role. You can find your Snowflake role in the Openflow UI, by navigating to View Details for your Runtime.
  • KEY_PAIR Authentication Strategy: Use a valid role configured for your service user.
Yes
Snowflake Username

When using:

  • Session Token Authentication Strategy: Must be blank.
  • KEY_PAIR: Provide the user name used to connect to the Snowflake instance.
Yes
Oversized Value Strategy

Determines how the connector handles values that exceed its internal size limits (16 MB) during replication. Possible values are:

  • Fail Table (default): The table is marked as permanently failed, and replication stops for that table.
  • Set Null: The value is replaced with NULL in the destination table. Use this to prevent table failures when it is acceptable to lose the oversized values.
No
Snowflake Warehouse

Snowflake warehouse used to run queries.

Yes

Run the flow

  1. Right-click on the canvas and select Enable all Controller Services.
  2. Right-click on the imported process group and select Start. The connector starts the data ingestion.

How to reset the connector

To fully reset the connector to its initial state, do the following:

  1. Ensure that there are no more flow files in the queues.

  2. Stop all the processors.

  3. Clear the state of the initial processor.

    1. Right-click the Get Google Ads Report processor and select View State.
    2. Select Clear State. This resets the state of the processor.
  4. Drop the destination table in Snowflake.
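The last step can be performed with a statement like the following, assuming the database and schema names from the installation steps; the table name placeholder depends on the report you configured:

```sql
DROP TABLE IF EXISTS
  GOOGLE_ADS_DESTINATION_DB.GOOGLE_ADS_DESTINATION_SCHEMA.<REPORT_TABLE>;
```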