Set up the Openflow Connector for Meta Ads¶
Note
The connector is subject to the Connector Terms.
This topic describes the steps to set up the Openflow Connector for Meta Ads.
Prerequisites¶
Ensure that you have reviewed About Openflow Connector for Meta Ads.
Ensure that you have set up Openflow.
Get the credentials¶
As a Meta Ads administrator, perform the following actions in your Meta Ads account:
Create a Meta App or ensure that you have access to one.
Enable Marketing API in the App dashboard.
Generate a long-lived token.
Optional: Increase the rate limit by changing the app access type from Standard access to Advanced access of the Ads Management Standard Access. Enable the ads_read and ads_management permissions.
Set up Snowflake account¶
As a Snowflake account administrator, perform the following tasks:
Create a new role or use an existing role and grant that role the required database privileges.
Create a new Snowflake service user with the type SERVICE.
Grant the Snowflake service user the role you created in step 1.
Configure key-pair authentication for the Snowflake SERVICE user from step 2.
Snowflake strongly recommends this step. Configure a secrets manager supported by Openflow, for example, AWS, Azure, or HashiCorp, and store the public and private keys in the secret store.
Note
If, for any reason, you do not wish to use a secrets manager, then you are responsible for safeguarding the public key and private key files used for key-pair authentication according to the security policies of your organization.
Once the secrets manager is configured, determine how you will authenticate to it. On AWS, it is recommended that you use the EC2 instance role associated with Openflow, because then no other secrets have to be persisted.
In Openflow, configure a Parameter Provider associated with this secrets manager from the hamburger menu in the upper right: navigate to Controller Settings » Parameter Provider and then fetch your parameter values.
At this point, all credentials can be referenced with the associated parameter paths, and no sensitive values need to be persisted within Openflow.
If any other Snowflake users require access to the raw documents and tables ingested by the connector (for example, for custom processing in Snowflake), then grant those users the role created in step 1.
Designate a warehouse for the connector to use. Start with the smallest warehouse size, then experiment with the size depending on the number of tables being replicated and the amount of data transferred. Large numbers of tables typically scale better with multi-cluster warehouses than with larger warehouse sizes. A minimal SQL sketch of these account-setup steps follows this list.
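The following is a minimal SQL sketch of the account-setup steps above. The names META_ADS_CONNECTOR_ROLE, META_ADS_CONNECTOR_USER, and META_ADS_WH are placeholders, and the public key value is truncated; substitute your own values before running.

-- Step 1: create (or reuse) a role for the connector.
CREATE ROLE IF NOT EXISTS META_ADS_CONNECTOR_ROLE;

-- Step 2: create a service user for the connector.
CREATE USER IF NOT EXISTS META_ADS_CONNECTOR_USER TYPE = SERVICE;

-- Step 3: grant the connector role to the service user.
GRANT ROLE META_ADS_CONNECTOR_ROLE TO USER META_ADS_CONNECTOR_USER;

-- Step 4: register the RSA public key (PEM body without header and footer lines)
-- on the service user for key-pair authentication.
ALTER USER META_ADS_CONNECTOR_USER SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';

-- Optionally grant the role to other users who need access to the ingested data.
-- GRANT ROLE META_ADS_CONNECTOR_ROLE TO USER <OTHER_USER>;

-- Designate a warehouse for the connector, starting with the smallest size.
CREATE WAREHOUSE IF NOT EXISTS META_ADS_WH
  WAREHOUSE_SIZE = XSMALL
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;
GRANT USAGE ON WAREHOUSE META_ADS_WH TO ROLE META_ADS_CONNECTOR_ROLE;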
Set up the connector¶
As a data engineer, perform the following tasks to install and configure the connector:
Install the connector¶
Create a database and schema in Snowflake for the connector to store ingested data. Grant the required database privileges to the role created in the first step. Substitute the role placeholder with the actual value and use the following SQL commands:
CREATE DATABASE META_ADS_DESTINATION_DB;
CREATE SCHEMA META_ADS_DESTINATION_DB.META_ADS_DESTINATION_SCHEMA;
GRANT USAGE ON DATABASE META_ADS_DESTINATION_DB TO ROLE <META_ADS_CONNECTOR_ROLE>;
GRANT USAGE ON SCHEMA META_ADS_DESTINATION_DB.META_ADS_DESTINATION_SCHEMA TO ROLE <META_ADS_CONNECTOR_ROLE>;
GRANT CREATE TABLE ON SCHEMA META_ADS_DESTINATION_DB.META_ADS_DESTINATION_SCHEMA TO ROLE <META_ADS_CONNECTOR_ROLE>;
Navigate to the Openflow Overview page. In the Featured connectors section, select View more connectors.
On the Openflow connectors page, find the connector and select Add to runtime.
In the Select runtime dialog, select your runtime from the Available runtimes drop-down list.
Select Add.
Note
Before you install the connector, ensure that you have created a database and schema in Snowflake for the connector to store ingested data.
Authenticate to the deployment with your Snowflake account credentials and select Allow when prompted to allow the runtime application to access your Snowflake account. The connector installation process takes a few minutes to complete.
Authenticate to the runtime with your Snowflake account credentials.
The Openflow canvas appears with the connector process group added to it.
Configure the connector¶
Right-click on the imported process group and select Parameters.
Populate the required parameter values as described in Flow parameters.
Flow parameters¶
This section describes the flow parameters that you can configure based on the following parameter contexts:
Meta Ads Source Parameters: Used to establish a connection with the Meta Ads API.
Meta Ads Destination Parameters: Used to establish a connection with Snowflake.
Meta Ads Ingestion Parameters: Used to define the configuration of data downloaded from Meta Ads.
Meta Ads Source Parameters¶
| Parameter | Description |
| --- | --- |
| Access Token | Token required to request the Meta Ads Insights API |
Meta Ads Destination Parameters¶
| Parameter | Description |
| --- | --- |
| Destination Database | The database where data will be persisted. It must already exist in Snowflake. |
| Destination Schema | The schema where data will be persisted. It must already exist in Snowflake. |
| Snowflake Account Identifier | Snowflake account name formatted as [organization-name]-[account-name] where data will be persisted |
| Snowflake Authentication Strategy | Strategy of authentication to Snowflake. Possible values: |
| Snowflake Private Key | The RSA private key used for authentication. The RSA key must be formatted according to PKCS8 standards and have standard PEM headers and footers. Note that either Snowflake Private Key File or Snowflake Private Key must be defined. |
| Snowflake Private Key File | The file that contains the RSA private key used for authentication to Snowflake, formatted according to PKCS8 standards and having standard PEM headers and footers. The header line starts with |
| Snowflake Private Key Password | The password associated with the Snowflake Private Key File |
| Snowflake Role | Snowflake role used during query execution |
| Snowflake Username | User name used to connect to the Snowflake instance |
| Snowflake Warehouse | Snowflake warehouse used to run queries |
Meta Ads Ingestion Parameters¶
| Parameter | Description |
| --- | --- |
| Report Name | Name of the report to be used as a destination table name. The name must be unique within the destination schema. |
| Report Object Id | Identifier of the downloaded object from Meta Ads. |
| Report Ingestion Strategy | Mode in which data is fetched, either snapshot or incremental |
| Meta Ads Version | Version of the Meta Ads API used for downloading reports. Allowed value: |
| Report Level | Presents the aggregation level of the result. |
| Report Fields | Comma-separated list of report fields |
| Report Breakdowns | Comma-separated list of report breakdowns. The full list of available breakdowns can be found here. |
| Report Time Increment | Level of aggregation based on the day count |
| Report Action Time | Time of action stats |
| Report Click Attribution Window | Attribution window for the click action |
| Report View Attribution Window | Attribution window for the view action |
| Report Schedule | Schedule time for the processor that creates reports |
| Report Start Date | Start date from which the ingestion should happen. The date format is YYYY-MM-DD. |
Run the flow¶
Right-click on the canvas and select Enable all Controller Services.
Right-click on the imported process group and select Start. The connector starts the data ingestion.
How to reset the connector¶
To fully reset the connector to its initial state, do the following:
Ensure that there are no more flow files in the queues.
Stop all the processors.
Clear the state of the initial processor.
Right-click on the Create Meta Ads Report processor and select View State. Select the option Clear State. This resets the state of the processor.
Drop the destination table in Snowflake.
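For example, assuming the destination database and schema created earlier in this topic and a report table named after the Report Name parameter (placeholder below), the drop statement might look like the following:

-- Replace <REPORT_NAME> with the value of the Report Name parameter.
DROP TABLE IF EXISTS META_ADS_DESTINATION_DB.META_ADS_DESTINATION_SCHEMA.<REPORT_NAME>;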