Facebook Offline Conversions

Update (2025-03-10): Meta has announced that it is deprecating the Offline Conversions API (OCAPI), which the current Facebook Offline Conversions app uses. In addition, Meta will disable the ability to create new offline event sets (OES). Meta anticipates that the OCAPI will be discontinued on May 1, 2025 (see https://www.facebook.com/business/help/1835755323554976). In preparation for this change, Treasure Data has added support for offline conversions in the Meta Conversions API connector: https://docs.treasuredata.com/articles/#!int/facebook-conversions-api-export-integration. Please review the field name differences between the two APIs: https://docs.treasuredata.com/articles/int/migration-guide-to-facebook-conversion-connector. Users can switch to that connector today to prepare for the upcoming changes to the Facebook Offline Conversions app.

You can use the Facebook Offline Conversions integration to send job results (in the form of offline event data) from Treasure Data directly to Facebook to measure how much your Facebook ads lead to real-world outcomes, such as purchases in your stores, phone orders, bookings, and more.

Prerequisites

  • Basic knowledge of Treasure Data.
  • Basic knowledge of Facebook Offline Conversions and Facebook Offline Events.
  • To upload event data, you need access to one of the following on Facebook:
    • Business Manager admin
    • Admin system user who created the offline event set
    • Admin on the ad_account connected to the offline event set

Offline Event Set ID

  1. Open the Business Manager dashboard and select Event Manager.
  2. Select an Event Set.
  3. Select Settings. The Offline event set ID is displayed.

Use the TD Console to Create Your Connection

Create a New Connection

In Treasure Data, you must create and configure the data connection to be used during export prior to running your query. As part of the data connection, you provide authentication to access the integration.

  1. Open TD Console.
  2. Navigate to Integrations Hub > Catalog.
  3. Search for and select Facebook Offline Conversions.
  4. When the dialog opens, choose your authentication method, which is described further in the following section.

  5. Enter a name for your connection.
  6. Select Done.

Authenticating your Connection

The method you use to authenticate Treasure Data with Facebook affects the steps you take to enable the data connector to access Facebook. You can choose to authenticate in the following ways:

  • Access Token
  • OAuth

Using Access Token to Authenticate

To authenticate using an access token, you need an access token and client secret. A long-lived user access token or system user access token is recommended; you may need to create one.

You must assign the ads_management permission to your access token.

Using OAuth to Authenticate

Using OAuth is the most common authentication method. Authentication requires that you manually connect your Treasure Data account to your Facebook Ads account. To authenticate, complete the following procedure:

  1. Select Click here to connect a new account. You are redirected to Facebook to log in (if you haven't already) or to the consent page to grant access to Treasure Data.
  2. Log in to your Facebook account in the popup window and grant access to the Treasure Data app. You are redirected back to TD Console.
  3. Repeat the first step (Create a New Connection) and choose your new OAuth connection.
  4. Name your new Facebook Offline Conversions connection.
  5. Select Done.

Configure Export Results in Your Data Connection

In this step, you create or reuse a query. In the query, you configure the data connection.

You need to define the column mapping in the query. The columns in the query represent Offline Event data to be uploaded to Facebook.

Additionally, match_keys columns and their data are hashed and normalized before being sent to Facebook. Learn more about Facebook's hashing and normalization requirements. You need at least one match_keys column to configure export results.

Column name | Data type | Match Key | Required | Multiple | Example
email | string | Yes | No | Yes | foo@fb.com
phone | string | Yes | No | Yes | 1-202-555-0192
gen | string | Yes | No | No | M
doby | string | Yes | No | No | 1990
dobm | string | Yes | No | No | 10
dobd | string | Yes | No | No | 20
ln | string | Yes | No | No | Bar
fn | string | Yes | No | No | Foo
fi | string | Yes | No | No | L
ct | string | Yes | No | No | Long Beach
st | string | Yes | No | No | California
zip | string | Yes | No | No | 90899
country | string | Yes | No | No | US
madid | string | Yes | No | No | aece52e7-03ee-455a-b3c4-e57283
extern_id | string | Yes | No | No |
lead_id | string | Yes | No | No | 12399829922
event_time | long | No | Yes | No | 1598531676
event_name | string | No | Yes | No | Purchase
currency | string | No | Yes | No | USD
value | double | No | Yes | No | 100.00
content_type | string | No | No | No |
contents | json string | No | No | Yes | {"id": "b20", "quantity": 100}
custom_data | json string | No | No | No | {"a":12, "b":"c"}
order_id | string | No | No | No | OD123122
item_number | string | No | No | No |

To include Data Processing Options, specify the following column mappings in your query.

Column name | Data Type | Required | Multiple | Example
data_processing_options | string | No | No | "LDU"
data_processing_options_country | long | No | No | 1
data_processing_options_state | long | No | No | 1000
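
For example, reusing the values from the tables above, a query that attaches Data Processing Options to an event could look like the following sketch:

SELECT
  'elizabetho@fb.com' as email,
  'Purchase'          as event_name,
  1598531676          as event_time,
  'LDU'               as data_processing_options,
  1                   as data_processing_options_country,
  1000                as data_processing_options_state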

To query multiple values with the same name, you specify the name multiple times in the query. For example:

SELECT home_email as email, work_email as email, first_name as fn, last_name as ln
FROM my_table
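
Match key columns and their data are hashed and normalized by the connector before upload, but it can still help to clean the source values in the query itself. The following is only a sketch; the raw_email and raw_phone source columns are hypothetical placeholders:

SELECT
  lower(trim(raw_email))                  as email,  -- lowercase, trimmed email address
  regexp_replace(raw_phone, '[^0-9]', '') as phone,  -- digits-only phone number
  'Purchase'                              as event_name,
  1598531676                              as event_time
FROM my_table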

Configure the Connection by Specifying the Parameters

  1. Open the TD Console.
  2. Navigate to Data Workbench > Queries.
  3. Select the query that you plan to use to export data.
  4. Select Export Results, located at the top of your query editor.
  5. The Choose Integration dialog opens.
  6. You have two options when selecting a connection to use to export the results: use an existing connection or create a new one.

Use an Existing Connection

  1. Type the connection name in the search box to filter.
  2. Select your connection.
  3. Set the following parameters.
Parameter | Description
Offline Event Set ID (required) | The Facebook offline event set ID. See the Appendix for the Offline Event Set ID.
Upload Tag (required) | Used to track your event uploads.
Namespace ID (optional) | The scope used to resolve extern_id or tpid. It can be another data set or data partner ID. Example: 12345
Match Keys (required) | The identifying information used to match people on Facebook. The value is a comma-separated string. Example: email,phone,fn,ln,st,country…
Skip Invalid Data (optional) | Determines how the job handles invalid records, for example, a record missing required columns such as event_name or event_time. If this option is not enabled, the job terminates (without reverting) when such a record is encountered.

Here is a sample configuration:
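
For reference, the same parameters can also be supplied from a workflow through result_settings (the full workflow example appears later in this article). The sketch below reuses the confirmed event_set_id, upload_tag, and match_keys keys; the namespace_id and skip_invalid_data key names are assumptions included only for illustration:

result_settings:
  event_set_id: 361738844830373       # Offline Event Set ID
  upload_tag: purchase_event_upload   # Upload Tag
  match_keys: email,phone,fn,ln       # Match Keys (comma-separated)
  namespace_id: 12345                 # assumed key name for Namespace ID
  skip_invalid_data: true             # assumed key name for Skip Invalid Data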

Example of a Query to Populate Offline Events Data

From Treasure Data, run the following query with export results into a connection for Facebook Offline Conversions:

  • Regular SELECT query from a table
SELECT 
  an_email_column       AS EMAIL,
  a_phone_column        AS PHONE,
  an_event_time_column  AS EVENT_TIME,
  an_event_name_column  AS EVENT_NAME,
  a_double_column       AS VALUE,
  a_currency_column     AS CURRENCY
FROM your_table;
  • Query multiple email and phone columns for multiple values.
SELECT
  'elizabetho@fb.com' as email,
  'olsene@fb.com'     as email,
  '1-(650)-561-5622'  as phone,
  '1-(650)-782-5622'  as phone,
  'Elizabeth'         as fn,
  'Olsen'             as ln,
  '94046'             as zip,
  'Menlo Park'        as st,
  'US'                as country,
  '1896'              as doby,
  'Purchase'          as event_name,
  1598531676          as event_time,
  150.01              as value,
  'USD'               as currency
  • Query with multiple contents
SELECT
  'elizabetho@fb.com' as email,
  'Purchase'          as event_name,
  1598531676          as event_time,
  150.01              as value,
  'USD'               as currency,
  '{"id": "b20", "quantity": 100}' as contents,
  '{"id": "b21", "quantity": 200}' as contents
  • Query custom_data column
SELECT
  'elizabetho@fb.com' as email,
  'Purchase'          as event_name,
  1598531676          as event_time,
  150.01              as value,
  'USD'               as currency,
  '{"a":12, "b":"c"}' as custom_data

(Optional) Schedule Query Export Jobs

You can use Scheduled Jobs with Result Export to periodically write the output result to a target destination that you specify.

Treasure Data's scheduler feature supports periodic query execution to achieve high availability.

When the cron schedule contains conflicting specifications, the specification that requests more frequent execution is followed and the other is ignored.

For example, if the cron schedule is '0 0 1 * 1', the 'day of month' and 'day of week' specifications are discordant: the former requires the job to run on the first day of each month at midnight (00:00), while the latter requires it to run every Monday at midnight (00:00). The latter specification is followed.

Scheduling your Job Using TD Console

  1. Navigate to Data Workbench > Queries

  2. Create a new query or select an existing query.

  3. Next to Schedule, select None.

  4. In the drop-down, select one of the following schedule options:

    Drop-down Value | Description
    Custom cron... | Review Custom cron... details.
    @daily (midnight) | Run once a day at midnight (00:00) in the specified time zone.
    @hourly (:00) | Run every hour at 00 minutes.
    None | No schedule.

Custom cron... Details

Cron Value | Description
0 * * * * | Run once an hour.
0 0 * * * | Run once a day at midnight.
0 0 1 * * | Run once a month at midnight on the morning of the first day of the month.
"" | Create a job that has no scheduled run time.
 *    *    *    *    *
 -    -    -    -    -
 |    |    |    |    |
 |    |    |    |    +----- day of week (0 - 6) (Sunday=0)
 |    |    |    +---------- month (1 - 12)
 |    |    +--------------- day of month (1 - 31)
 |    +-------------------- hour (0 - 23)
 +------------------------- min (0 - 59)

The following named entries can be used:

  • Day of Week: sun, mon, tue, wed, thu, fri, sat.
  • Month: jan, feb, mar, apr, may, jun, jul, aug, sep, oct, nov, dec.

A single space is required between each field. The values for each field can be composed of:

Field Value | Example | Example Description
A single value, within the limits displayed above for each field. | |
A wildcard '*' to indicate no restriction based on the field. | '0 0 1 * *' | Configures the schedule to run at midnight (00:00) on the first day of each month.
A range '2-5', indicating the range of accepted values for the field. | '0 0 1-10 * *' | Configures the schedule to run at midnight (00:00) on the first 10 days of each month.
A list of comma-separated values '2,3,4,5', indicating the list of accepted values for the field. | '0 0 1,11,21 * *' | Configures the schedule to run at midnight (00:00) every 1st, 11th, and 21st day of each month.
A periodicity indicator '*/5' to express how often, based on the field's valid range of values, a schedule is allowed to run. | '30 */2 1 * *' | Configures the schedule to run on the 1st of every month, every 2 hours starting at 00:30. '0 0 */5 * *' configures the schedule to run at midnight (00:00) every 5 days starting on the 5th of each month.
A comma-separated list of any of the above except the '*' wildcard is also supported, e.g. '2,*/5,8-10'. | '0 0 5,*/10,25 * *' | Configures the schedule to run at midnight (00:00) every 5th, 10th, 20th, and 25th day of each month.
  5. (Optional) You can delay the start time of a query by enabling Delay execution.

Optionally Configure Export Results in Workflow

Within Treasure Workflow, you can specify the use of this data connector to export data.

timezone: UTC

_export:
  td:
    database: sample_datasets

+td-result-into-target:
  td>: queries/sample.sql
  result_connection: facebook_offline_conversions
  result_settings:
    event_set_id: 361738844830373
    upload_tag: purchase_event_upload
    match_keys: email,phone,ln,fn

Learn about Exporting Data with Parameters for more information on using data connectors in a workflow to export data.
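
If you prefer to schedule the export from the workflow itself rather than through the TD Console scheduler described earlier, Digdag's built-in schedule directive can be added near the top of the same workflow file. A minimal sketch:

schedule:
  daily>: 00:00:00   # run the workflow once a day at midnight in the workflow's timezone (UTC above)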