Post BigQuery data on Facebook Marketing API

Victor Paulillo
5 min read · Oct 29, 2020

How to easily post data from a BigQuery table on Facebook Marketing API using Python and the Google Cloud Platform.

There’s a direct relationship between an advanced advertising strategy and the acquisition and monetization of better customers. One way to accomplish that is to use advanced analytics to find specific customers of your company and target them with ads customized to their stage in the customer lifecycle.

This project shows how to post data on Facebook Ads that can be used to build audiences for ad targeting or to set as an offline conversion.

Offline conversions enable Facebook’s machine learning to better understand who your company’s best customers are, especially when you have products and services that aren’t distributed online.

Before you start

Follow the Facebook for Developers documentation to fulfill all the requirements for using the Marketing API.

To use the API you will need:

  • App ID;
  • Advertising ID from Android devices or the Identifier for Advertisers (IDFA) from Apple devices;
  • A Facebook access token.

The architecture

Architecture of the project

Our goal is to give the data analysts on the marketing team autonomy to set up any custom audiences and conversions on Facebook Ads, while ensuring that all of our data is delivered correctly.

Assuming that you already have a table on BigQuery, the first Cloud Function, scheduled by Cloud Scheduler, reads that table and publishes each row to Pub/Sub; each message on Pub/Sub then triggers the second Cloud Function, which posts the message to Facebook Ads.

This project uses Cloud Scheduler to run the Cloud Function daily, but you could trigger it any other way using the trigger URL.

The solution

Assuming that you already have a table on BigQuery to post on Facebook Ads, let’s start by creating a Pub/Sub topic.

Step 1. Create a pub/sub topic

Creating the topic

Step 2. Create a pull subscription for your topic

Creating pull subscription for the topic
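The two steps above can also be done in code instead of the console. Below is a minimal sketch using the google-cloud-pubsub client library; the project ID and topic name are placeholders, so replace them with your own.

```python
def topic_path(project, topic):
    return f"projects/{project}/topics/{topic}"

def subscription_path(project, subscription):
    return f"projects/{project}/subscriptions/{subscription}"

def create_topic_and_pull_subscription(project="my-gcp-project",
                                       topic="facebook-ads-rows"):
    # google-cloud-pubsub must be installed (pip install google-cloud-pubsub)
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    subscriber = pubsub_v1.SubscriberClient()
    publisher.create_topic(request={"name": topic_path(project, topic)})
    # A subscription created without a push endpoint uses pull delivery.
    subscriber.create_subscription(request={
        "name": subscription_path(project, f"{topic}-pull"),
        "topic": topic_path(project, topic),
    })
```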

Step 3. Create the first Cloud Function

If you have a table with hundreds of thousands of rows, you will need to allocate more memory and increase the function’s timeout.

If you have an even larger table, you can work around the timeout in a couple of ways, such as publishing in bulk or reading the table in multiple passes of a few rows each, but that’s a topic for another article.

Creating the first Cloud Function

Use both snippets below in the Cloud Function to read the BigQuery table and publish each row to the Pub/Sub topic. The first image shows how to read the BigQuery table.

Reading BigQuery table
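Since the code in the image may be hard to reproduce, here is a minimal sketch of the same read step; the dataset and table names are placeholders for your own.

```python
def build_query(dataset, table):
    # Full-table read; add a WHERE clause or LIMIT for partial reads.
    return f"SELECT * FROM `{dataset}.{table}`"

def read_table(dataset="marketing", table="facebook_audience"):
    # google-cloud-bigquery must be listed in requirements.txt
    from google.cloud import bigquery

    client = bigquery.Client()
    # result() waits for the query job and returns an iterable of rows
    return list(client.query(build_query(dataset, table)).result())
```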

And this second image shows the code that publishes each row to Pub/Sub.

Publishing each row at topic
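A sketch of that publishing step, assuming the table has an advertiser_id column plus three illustrative fields (event_name, event_time, value — use whatever fields your table actually has):

```python
import json

def row_to_message(row):
    # advertiser_id is mandatory; the other three fields are examples.
    return json.dumps({
        "advertiser_id": row["advertiser_id"],
        "event_name": row.get("event_name"),
        "event_time": row.get("event_time"),
        "value": row.get("value"),
    }).encode("utf-8")

def publish_rows(rows, project="my-gcp-project", topic="facebook-ads-rows"):
    # google-cloud-pubsub must be listed in requirements.txt
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project, topic)
    futures = [publisher.publish(topic_path, row_to_message(r)) for r in rows]
    for future in futures:
        future.result()  # block until Pub/Sub confirms each message
```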

This function publishes four parameters to the topic. The advertiser_id is mandatory (Facebook Ads needs it to recognize your customer); the other three parameters are optional, and you can add any fields you want.

Don’t forget to add the libraries to requirements.txt:

google-cloud-bigquery
google-cloud-pubsub

Step 4. Validate if the first Function is working properly

Alright, now let’s test and validate that the publisher function is sending the desired message and parameters to the topic. To validate, do the following:

  • Open your subscription and select VIEW MESSAGES;
  • Open the first Cloud Function, click TESTING, and then trigger the function by pressing the TEST THE FUNCTION button;
  • When the test finishes running, press the PULL button under VIEW MESSAGES (press it a few times if needed).

Validating messages from the first function

Step 5. Create the function to post on Facebook Marketing API

Okay, now that everything checks out, create the function that will be triggered by each message from Pub/Sub and post the desired parameters to Facebook Ads.

Go to the topic and press the + TRIGGER CLOUD FUNCTION button, then define the function that will make the POST request to the API.

Use the function name ‘post_api’ as the Entry Point for your Cloud Function.

Defining the function that posts the request to the Facebook Marketing API
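As a reference, here is a hedged sketch of such a request function, modeled on the offline-events upload endpoint (POST /&lt;OFFLINE_EVENT_SET_ID&gt;/events). The event set ID, access token, upload tag, currency, and the exact match-key normalization rules are assumptions — confirm them against the current Marketing API documentation.

```python
import json

GRAPH_URL = "https://graph.facebook.com/v9.0"  # version current in late 2020

def build_event(advertiser_id, event_name, event_time, value):
    # match_keys identifies the customer; madid is the mobile advertising ID.
    # Check the docs for current normalization/hashing requirements.
    return {
        "match_keys": {"madid": advertiser_id.lower()},
        "event_name": event_name,
        "event_time": int(event_time),
        "value": float(value),
        "currency": "USD",  # assumption: use your ad account's currency
    }

def post_event(advertiser_id, event_name, event_time, value,
               event_set_id="YOUR_OFFLINE_EVENT_SET_ID",
               access_token="YOUR_ACCESS_TOKEN"):
    import requests  # add requests to requirements.txt

    payload = {
        "access_token": access_token,
        "upload_tag": "bigquery-upload",  # any label for this batch
        "data": json.dumps([build_event(advertiser_id, event_name,
                                        event_time, value)]),
    }
    response = requests.post(f"{GRAPH_URL}/{event_set_id}/events", data=payload)
    response.raise_for_status()
    return response.json()
```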

After that, define the function that the Cloud Function will trigger. This function gets the parameters from the message and calls the request function above to post to the Facebook Marketing API.
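A sketch of that entry point, assuming the message body is the JSON produced by the first function (a Pub/Sub-triggered function receives the payload base64-encoded in event["data"]); post_event below is a placeholder for the request function from the previous step.

```python
import base64
import json

def parse_message(event):
    # Pub/Sub delivers the published bytes base64-encoded in event["data"]
    payload = base64.b64decode(event["data"]).decode("utf-8")
    return json.loads(payload)

def post_event(advertiser_id, event_name, event_time, value):
    # Placeholder: replace with the request function from the previous step.
    raise NotImplementedError

def post_api(event, context):
    """Entry point: one Pub/Sub message -> one Marketing API request."""
    params = parse_message(event)
    post_event(params["advertiser_id"], params["event_name"],
               params["event_time"], params["value"])
```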

And don’t forget to add the libraries to requirements.txt:

google-cloud-pubsub

Step 6. Create a job on Cloud Scheduler to trigger the first function

Creating the scheduler job to trigger the first function
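The job can also be created with the google-cloud-scheduler client. The schedule, time zone, location, and job name below are assumptions; the URI should be your first function’s trigger URL.

```python
def build_job(project, location, function_url, schedule="0 9 * * *"):
    # Cron schedule "0 9 * * *" = daily at 09:00 in the given time zone.
    parent = f"projects/{project}/locations/{location}"
    job = {
        "name": f"{parent}/jobs/bigquery-to-pubsub-daily",
        "schedule": schedule,
        "time_zone": "America/Sao_Paulo",
        "http_target": {"uri": function_url},
    }
    return parent, job

def create_job(project, location, function_url):
    # google-cloud-scheduler must be installed (pip install google-cloud-scheduler)
    from google.cloud import scheduler_v1

    client = scheduler_v1.CloudSchedulerClient()
    parent, job = build_job(project, location, function_url)
    job["http_target"]["http_method"] = scheduler_v1.HttpMethod.POST
    client.create_job(request={"parent": parent, "job": job})
```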

Lastly, go to Facebook Ads and validate your data.

Conclusion

This article covered a simple way to connect BigQuery with the Facebook Marketing API, giving the marketing team alternatives to improve their custom audiences and conversions using GCP tools, which consequently improves their advertising performance.

Now it’s possible to dive deep into customer analytics and machine learning models to sustain new marketing strategies involving Facebook Ads.

For future articles, this project could be advanced on the following topics:

  • Generalize the functions that connect to the API, so that it’s possible to send any parameters and any event names from a generic table;
  • Save the requests, responses, and payloads in a BigQuery table for future analytics;
  • Integrate with other media players, such as Google Ads;
  • Publish the messages to Pub/Sub in bulk, to support larger audiences.

Feel free to comment or connect with me on LinkedIn.
