Set up the Openflow Connector for Google Ads¶
Note
The connector is subject to the Connector Terms.
This topic describes the steps to set up the Openflow Connector for Google Ads.
Prerequisites¶
Ensure that you have reviewed Openflow Connector for Google Ads.
Ensure that you have set up Openflow.
Get the credentials¶
As a Google Ads administrator, perform the following steps:
Ensure that you have access to a Google Cloud project or create a new one.
Ensure that the Google Ads API is enabled for your Google Cloud project. Google Ads API access is required to ingest data.
Configure Service account authentication for Google Ads.
Obtain a developer token for your organization by following the instructions.
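The credential steps above leave you with a service account JSON key file. As a quick sanity check before wiring it into the connector, you can verify that the file contains the standard fields Google includes in every service account key. This is an illustrative sketch only, not part of the connector; the field names are the standard Google service account key fields.

```python
import json

# Required fields present in every Google service account JSON key.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account_json(raw: str) -> list[str]:
    """Return a list of problems found in the service account key JSON.

    An empty list means the key file has the expected shape.
    """
    data = json.loads(raw)
    problems = sorted(REQUIRED_FIELDS - data.keys())
    if data.get("type") != "service_account":
        problems.append('type is not "service_account"')
    return problems
```

A key that passes this check is not necessarily valid for the Google Ads API, but a key that fails it will certainly be rejected, so it is a cheap first test.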
Set up Snowflake account¶
As a Snowflake account administrator, perform the following tasks:
Create a new role or use an existing role, and grant it the required database privileges.
Create a new Snowflake service user with the type as SERVICE.
Grant the Snowflake service user the role you created in the previous steps.
Configure key-pair authentication for the Snowflake SERVICE user from step 2.
Snowflake strongly recommends this step. Configure a secrets manager supported by Openflow, such as AWS, Azure, or HashiCorp, and store the public and private keys in the secret store.
Note
If, for any reason, you do not wish to use a secrets manager, you are responsible for safeguarding the public and private key files used for key-pair authentication according to your organization's security policies.
Once the secrets manager is configured, determine how you will authenticate to it. On AWS, it is recommended that you use the EC2 instance role associated with Openflow, because this way no other secrets have to be persisted.
In Openflow, configure a Parameter Provider associated with this secrets manager from the hamburger menu in the upper right. Navigate to Controller Settings » Parameter Provider and then fetch your parameter values.
At this point all credentials can be referenced with the associated parameter paths and no sensitive values need to be persisted within Openflow.
If any other Snowflake users require access to the raw ingested documents and tables ingested by the connector (for example, for custom processing in Snowflake), then grant those users the role created in step 1.
Designate a warehouse for the connector to use. Start with the smallest warehouse size, then experiment with size depending on the number of tables being replicated, and the amount of data transferred. Large table numbers typically scale better with multi-cluster warehouses, rather than larger warehouse sizes.
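Key-pair authentication in step 4 requires an RSA private key in PKCS8 format with standard PEM headers and footers. Before storing the key in your secrets manager, a quick format check like the following can catch the common mistake of generating a PKCS1 key instead. This is an illustrative sketch only; the connector itself does not include this code.

```python
# PEM header lines for the two PKCS8 variants Snowflake accepts.
PKCS8_HEADERS = (
    "-----BEGIN PRIVATE KEY-----",            # unencrypted PKCS8
    "-----BEGIN ENCRYPTED PRIVATE KEY-----",  # password-protected PKCS8
)

def looks_like_pkcs8_pem(pem_text: str) -> bool:
    """Return True if the text begins with a PKCS8 PEM header.

    A "-----BEGIN RSA PRIVATE KEY-----" header indicates a PKCS1 key,
    which fails this check and would need to be converted to PKCS8.
    """
    if not pem_text.strip():
        return False
    first_line = pem_text.lstrip().splitlines()[0]
    return first_line in PKCS8_HEADERS
```

The check only inspects the header line; it does not validate the key material itself.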
Configure the connector¶
As a data engineer, perform the following tasks to configure the connector:
Create a database and schema in Snowflake for the connector to store ingested data.
Download the connector definition file.
Import the connector definition into Openflow:
Open the Snowflake Openflow canvas.
Add a process group. To do this, drag and drop the Process Group icon from the tool palette at the top of the page onto the canvas. Once you release your pointer, a Create Process Group dialog appears.
On the Create Process Group dialog, select the connector definition file to import.
Right-click on the imported process group and select Parameters.
Populate the required parameter values as described in Flow parameters.
Flow parameters¶
There are three parameter contexts: GoogleAdsConnectionContext and SnowflakeConnectionContext establish the connections to the Google Ads API and Snowflake, respectively, while GetGoogleAdsReportContext defines the configuration of the data downloaded from Google Ads.
GetGoogleAdsReportContext¶
| Property name | Description | Example | Required |
|---|---|---|---|
| Client Account ID | ID of the Google Ads account for which the given report should be ingested | 3242231648 | true |
| Conversion Window | The number of days after an ad interaction, such as an ad click or video view, during which a conversion, such as a purchase, is recorded in Google Ads | 30 | true |
| Resource Name | Name of the resource in Google Ads that is the source for the report | campaign | true |
| Report Attributes | Attributes of the selected resource | campaign.id, campaign.name | true |
| Report Metrics | Metrics collected in the context of the given resource | metrics.absolute_top_impression_percentage, metrics.clicks | false |
| Report Segments | Buckets in which metrics should be grouped | segments.date | false |
| Report Start Date | Start date from which ingestion should begin | 2025-01-01 | false |
| Schedule | GetGoogleAds processor schedule | 24 h | true |
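The parameters above can be thought of as a simple key-value mapping. The following sketch shows example values drawn from the table and checks a subset of the required keys; it is an illustration of the parameter shape, not connector code, and the required-key set shown here is not exhaustive.

```python
# Subset of parameters marked "true" in the Required column above.
REQUIRED_KEYS = {"Client Account ID", "Resource Name",
                 "Report Attributes", "Schedule"}

# Example values taken from the table above.
report_context = {
    "Client Account ID": "3242231648",
    "Resource Name": "campaign",
    "Report Attributes": "campaign.id, campaign.name",
    "Report Metrics": "metrics.clicks",   # optional
    "Report Segments": "segments.date",   # optional
    "Report Start Date": "2025-01-01",    # optional
    "Schedule": "24 h",
}

missing = REQUIRED_KEYS - report_context.keys()
assert not missing, f"missing required parameters: {missing}"
```

In Openflow itself these values are entered through the process group's Parameters dialog rather than as code.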
GoogleAdsConnectionContext¶
| Property name | Description | Example | Required |
|---|---|---|---|
| Google Service Account JSON | Service account JSON required for Google Ads authentication | N/A | true |
| Google Developer Token | Developer token required to query the Google Ads API | N/A | true |
SnowflakeConnectionContext¶
| Property name | Description | Example | Required |
|---|---|---|---|
| Destination Database Name | Name of the Snowflake database where data will be ingested | GADS_DB | true |
| Destination Schema Name | Name of the Snowflake schema where tables will be created | GADS_SCHEMA | true |
| Snowflake Account | Name of the Snowflake account to which the connection will be made | snowflake.qa6.us-west-2.aws.snowflakecomputing.com | true |
| Snowflake User | User name used to connect to the Snowflake instance | N/A | true |
| Snowflake Role | Snowflake role used during query execution | N/A | true |
| Snowflake Private Key | The RSA private key used for authentication. The RSA key must be formatted according to PKCS8 standards and have standard PEM headers and footers. Note that either Snowflake Private Key or Snowflake Private Key File must be defined | N/A | false |
| Snowflake Private Key File | The file that contains the RSA private key used for authentication to Snowflake, formatted according to PKCS8 standards and having standard PEM headers and footers. The header line starts with -----BEGIN PRIVATE | /opt/resources/snowflake/rsa_key.p8 | false |
| Snowflake Private Key Password | The password associated with the Snowflake Private Key File | N/A | false |
| Warehouse Name | Snowflake warehouse on which Snowflake queries will be executed | APP_WAREHOUSE | true |
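Note that Snowflake Private Key and Snowflake Private Key File are both individually optional, but at least one of them must be set. A sketch of that either-or rule, using the parameter names from the table above, is shown below; it is illustrative only and not part of the connector.

```python
def key_config_valid(params: dict) -> bool:
    """Return True if at least one of the two key parameters is set.

    Mirrors the rule that either Snowflake Private Key or
    Snowflake Private Key File must be defined.
    """
    return bool(params.get("Snowflake Private Key")) or \
           bool(params.get("Snowflake Private Key File"))
```

If both are left empty, key-pair authentication to Snowflake cannot succeed, so the flow will fail at connection time rather than during ingestion.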
Run the flow¶
Right-click on the canvas and select Enable all Controller Services.
Right-click on the imported process group and select Start. The connector starts the data ingestion.