# Facebook Offline Conversions

## Update 2025-03-10

Meta has announced that it is deprecating the Offline Conversions API (OCAPI), which the current Facebook Offline Conversions app uses. In addition, Meta will disable the ability to create new offline event sets (OES). Meta anticipates that the OCAPI will be discontinued on May 1, 2025. Ref. [https://www.facebook.com/business/help/1835755323554976](https://www.facebook.com/business/help/1835755323554976)

In preparation for this change, Treasure Data has added support for offline conversions to the Meta Conversions API connector:

- [https://docs.treasuredata.com/articles/#!int/facebook-conversions-api-export-integration](/int/facebook-conversions-api-export-integration)

Please review the differences in field names between the two APIs:

- [https://docs.treasuredata.com/articles/int/migration-guide-to-facebook-conversion-connector](/int/migration-guide-to-facebook-conversion-connector)

Users can switch to that connector today to prepare for the upcoming changes to the Facebook Offline Conversions app.

You can use the Facebook Offline Conversions integration to send job results (in the form of offline event data) from Treasure Data directly to Facebook, to measure how your Facebook ads lead to real-world outcomes such as in-store purchases, phone orders, bookings, and more.

## Prerequisites

- Basic knowledge of Treasure Data
- Basic knowledge of Facebook Offline Conversions and Facebook offline events
- To upload event data, you need one of the following roles on Facebook:
  - Business Manager admin
  - Admin system user who created the offline event set
  - Admin on the `ad_account` connected to the offline event set

## Offline Event Set ID

1. Open the Business Manager dashboard and select **Events Manager**.
2. Select an **Event Set**.
3. Select **Settings**; the offline event set ID is displayed.
![](/assets/image-20200916-125520.871f138abf2ac583105cd7a6388da2fef20b40ab55f783cc84fa22394da6955e.105d1a9c.png)

## Use the TD Console to Create Your Connection

### Create a New Connection

In Treasure Data, you must create and configure the data connection before running your export query. As part of the data connection, you provide authentication to access the integration.

1. Open TD Console.
2. Navigate to **Integrations Hub** > **Catalog**.
3. Search for and select **Facebook Offline Conversions**.
4. When the following dialog opens, choose your authentication method, which is described further in the next section.

![](/assets/image-20200913-011149.0ff5d47236e5f84f2e24d951f852b97cb42e65b7aebd1ed7a0d28ff02f4ff20e.105d1a9c.png)

5. Enter a name for your connection.
6. Select **Done**.

### Authenticating your Connection

The method you use to authenticate Treasure Data with Facebook affects the steps you take to enable the data connector to access Facebook. You can authenticate in either of the following ways:

- Access Token
- OAuth

#### Using an Access Token to Authenticate

To authenticate with an access token, you need the access token and a client secret. A long-lived user access token or a system user access token is recommended. You may need to create a [long-lived access token](https://www.sociablekit.com/get-facebook-long-lived-user-access-token/) or a [system user access token](https://developers.facebook.com/docs/audience-network/guides/reporting/system-user/). You must assign the `ads_management` permission to your access token.

#### Using OAuth to Authenticate

OAuth is the most common authentication method. Authentication requires that you manually connect your Treasure Data account to your Facebook Ads account. To authenticate, complete the following procedure:

1. Select **Click here** to connect a new account.
You are redirected to Facebook: to the login page if you are not logged in yet, or to the consent page to grant access to Treasure Data.

![](/assets/image-20200913-030209.a9b9a438d4dc4a9a15d2db5b21f980d016a40bfecb507e21ecd649b31596b80d.105d1a9c.png)

2. Log into your Facebook account in the popup window and grant access to the Treasure Data app. You are then redirected back to TD Console.

![](/assets/data-connector-facebook-login.ffc61a088d2fba0cad93ac39afbcb5c2cf06e120c8fdf113b84fe3855bd8b04d.105d1a9c.png)

3. Repeat the first step (Create a New Connection) and choose your new OAuth connection.

![](/assets/image-20200913-030623.f479d771e5df0a0ead39bade8109c76d67ccba05579d884d2b47a6da366130dc.105d1a9c.png)

4. Name your new Facebook Offline Conversions connection.
5. Select **Done**.

### Configure Export Results in Your Data Connection

In this step, you create or reuse a query and configure the data connection in it. You need to define the column mapping in the query; the columns in the query represent the offline event data to be uploaded to Facebook. Additionally, the **match_keys** columns and their data are hashed and normalized before being sent to Facebook. Learn more about [hashing and normalization requirements](https://developers.facebook.com/docs/marketing-api/audiences/guides/custom-audiences#hash). You need at least one **match_keys** column to configure export results.
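As an illustration of the hashing and normalization mentioned above, the sketch below follows Meta's documented handling of email and phone match keys: trim and lowercase emails, reduce phone numbers to digits (including the country code), then SHA-256 hash. This is for understanding only; the connector applies it for you automatically.

```python
import hashlib
import re

def normalize_email(email: str) -> str:
    # Meta's normalization for email match keys: trim whitespace, lowercase.
    return email.strip().lower()

def normalize_phone(phone: str) -> str:
    # Meta's normalization for phone match keys: digits only,
    # including the country code (no spaces, dashes, or parentheses).
    return re.sub(r"\D", "", phone)

def hash_match_key(value: str) -> str:
    # Match keys are SHA-256 hashed (lowercase hex) before upload.
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

# Examples using values from the column table below:
normalized = normalize_email("  Foo@FB.com ")   # "foo@fb.com"
digits = normalize_phone("1-202-555-0192")      # "12025550192"
hashed = hash_match_key(normalized)             # 64-character hex digest
```

Other match keys (such as `gen`, `ct`, `st`, and `zip`) have their own normalization rules in Meta's documentation; the connector applies the appropriate rule per key.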
| **Column name** | **Data type** | **Match Key** | **Required** | **Multiple** | **Example** |
| --- | --- | --- | --- | --- | --- |
| `email` | string | Yes | No | Yes | foo@fb.com |
| `phone` | string | Yes | No | Yes | 1-202-555-0192 |
| `gen` | string | Yes | No | No | M |
| `doby` | string | Yes | No | No | 1990 |
| `dobm` | string | Yes | No | No | 10 |
| `dobd` | string | Yes | No | No | 20 |
| `ln` | string | Yes | No | No | Bar |
| `fn` | string | Yes | No | No | Foo |
| `fi` | string | Yes | No | No | L |
| `ct` | string | Yes | No | No | Long Beach |
| `st` | string | Yes | No | No | California |
| `zip` | string | Yes | No | No | 90899 |
| `country` | string | Yes | No | No | US |
| `madid` | string | Yes | No | No | aece52e7-03ee-455a-b3c4-e57283 |
| `extern_id` | string | Yes | No | No | |
| `lead_id` | string | Yes | No | No | 12399829922 |
| `event_time` | long | No | Yes | No | 1598531676 |
| `event_name` | string | No | Yes | No | Purchase |
| `currency` | string | No | Yes | No | USD |
| `value` | double | No | Yes | No | 100.00 |
| `content_type` | string | No | No | No | |
| `contents` | json string | No | No | Yes | {"id": "b20", "quantity": 100} |
| `custom_data` | json string | No | No | No | {"a":12, "b":"c"} |
| `order_id` | string | No | No | No | OD123122 |
| `item_number` | string | No | No | No | |

To include [Data Processing Options](https://developers.facebook.com/docs/marketing-apis/data-processing-options), specify these column mappings in your query:

| **Column name** | **Data Type** | **Required** | **Multiple** | **Example** |
| --- | --- | --- | --- | --- |
| `data_processing_options` | string | No | No | "LDU" |
| `data_processing_options_country` | long | No | No | 1 |
| `data_processing_options_state` | long | No | No | 1000 |

To send multiple values for the same match key, specify the same column name multiple times in the query.
For example:

```SQL
SELECT
  home_email as email,
  work_email as email,
  first_name as fn,
  last_name as ln
FROM my_table
```

### Configure the Connection by Specifying the Parameters

1. Open the TD Console.
2. Navigate to **Data Workbench** > **Queries**.
3. Select the query that you plan to use to export data.
4. Select **Export Results**, located at the top of your query editor. The **Choose Integration** dialog opens.
5. Select a connection to use to export the results: either use an existing connection, or first create a new one.

#### Use an Existing Connection

1. Type the connection name in the search box to filter.
2. Select your connection.
3. Set the following parameters.

| **Parameter** | **Description** |
| --- | --- |
| **Offline Event Set ID** (required) | Facebook offline event set ID. See the Offline Event Set ID section for how to find it. |
| **Upload Tag** (required) | Used to track your event uploads. |
| **Namespace ID** (optional) | Scope used to resolve `extern_id` or `tpid`. It can be another data set or data partner ID. Example: `12345` |
| **Match Keys** (required) | The identifying information used to match people on Facebook, as a comma-separated string. Example: `email,phone,fn,ln,st,country…` |
| **Skip Invalid Data** (optional) | Determines whether invalid records (for example, records missing required columns such as `event_name` or `event_time`) are skipped, or terminate the job (without reverting). |

Here is a sample configuration:

![](/assets/image-20200916-024823.38c88f7c6235bf6dc4b3dfb5451a59e998bbc0c4cdfba20e8b1202570ae69bd3.105d1a9c.png)

### Example of a Query to Populate Offline Events Data

From Treasure Data, run the following query with export results into a connection for Facebook Offline Conversions:

- Regular SELECT query from a table

```SQL
SELECT
  an_email_column AS EMAIL,
  a_phone_column AS PHONE,
  an_event_time_column AS EVENT_TIME,
  an_event_name_column AS EVENT_NAME,
  a_double_column AS VALUE,
  a_currency_column AS CURRENCY
FROM your_table;
```

- Query multiple email and phone columns for multiple values.

```SQL
SELECT
  'elizabetho@fb.com' as email,
  'olsene@fb.com' as email,
  '1-(650)-561-5622' as phone,
  '1-(650)-782-5622' as phone,
  'Elizabeth' as fn,
  'Olsen' as ln,
  '94046' as zip,
  'Menlo Park' as ct,
  'US' as country,
  '1896' as doby,
  'Purchase' as event_name,
  1598531676 as event_time,
  150.01 as value,
  'USD' as currency
```

- Query with multiple `contents`

```SQL
SELECT
  'elizabetho@fb.com' as email,
  'Purchase' as event_name,
  1598531676 as event_time,
  150.01 as value,
  'USD' as currency,
  '{"id": "b20", "quantity": 100}' as contents,
  '{"id": "b21", "quantity": 200}' as contents
```

- Query the `custom_data` column

```SQL
SELECT
  'elizabetho@fb.com' as email,
  'Purchase' as event_name,
  1598531676 as event_time,
  150.01 as value,
  'USD' as currency,
  '{"a":12, "b":"c"}' as custom_data
```

### (Optional) Schedule Query Export Jobs

You can use scheduled jobs with Result Export to periodically write the output result to a target destination that you specify. Treasure Data's scheduler supports periodic query execution with high availability. When a cron schedule contains conflicting specifications, the specification that requests more frequent execution is followed and the other is ignored.
For example, if the cron schedule is `'0 0 1 * 1'`, the 'day of month' and 'day of week' specifications are discordant: the former requires a run on the first day of each month at midnight (00:00), while the latter requires a run every Monday at midnight (00:00). The latter specification runs more often, so it is followed.

#### Scheduling your Job Using TD Console

1. Navigate to **Data Workbench > Queries**.
2. Create a new query or select an existing query.
3. Next to **Schedule**, select **None**.

![](/assets/image2021-1-15_17-28-51.f1b242f6ecc7666a0097fdf37edd1682786ec11ef80eff68c66f091bc405c371.0f87d8d4.png)

4. In the drop-down, select one of the following schedule options:

![](/assets/image2021-1-15_17-29-47.45289a1c99256f125f4d887e501e204ed61f02223fde0927af5f425a89ace0c0.0f87d8d4.png)

| Drop-down Value | Description |
| --- | --- |
| Custom cron... | Review [Custom cron... details](#custom-cron-details). |
| @daily (midnight) | Run once a day at midnight (00:00 am) in the specified time zone. |
| @hourly (:00) | Run every hour at 00 minutes. |
| None | No schedule. |

#### Custom cron... Details

![](/assets/image2021-1-15_17-30-23.0f94a8aa5f75ea03e3fec0c25b0640cd59ee48d1804a83701e5f2372deae466c.0f87d8d4.png)

| **Cron Value** | **Description** |
| --- | --- |
| `0 * * * *` | Run once an hour. |
| `0 0 * * *` | Run once a day at midnight. |
| `0 0 1 * *` | Run once a month at midnight on the morning of the first day of the month. |
| `""` | Create a job that has no scheduled run time. |

```
* * * * *
- - - - -
| | | | |
| | | | +----- day of week (0 - 6) (Sunday=0)
| | | +---------- month (1 - 12)
| | +--------------- day of month (1 - 31)
| +-------------------- hour (0 - 23)
+------------------------- min (0 - 59)
```

The following named entries can be used:

- Day of week: sun, mon, tue, wed, thu, fri, sat
- Month: jan, feb, mar, apr, may, jun, jul, aug, sep, oct, nov, dec

A single space is required between each field.
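Each of the five cron fields is matched independently against its valid range. As a rough illustration of how one field expression expands into concrete values, here is a simplified Python sketch; it is not TD's scheduler. The step (`*/n`) semantics here follow the day-of-month examples documented in the table that follows (`*/5` matching days divisible by 5), which may differ from other cron implementations.

```python
def expand_field(expr: str, lo: int, hi: int) -> list[int]:
    """Expand one cron field expression ('*', '3', '2-5', '*/5',
    '1,11,21', or a comma list mixing these forms) into the sorted
    list of values it matches within the field's range [lo, hi]."""
    values = set()
    for part in expr.split(","):
        if part == "*":
            # Wildcard: no restriction on this field.
            values.update(range(lo, hi + 1))
        elif part.startswith("*/"):
            # Periodicity indicator: values divisible by the step
            # (matches the documented day-of-month examples).
            step = int(part[2:])
            values.update(v for v in range(lo, hi + 1) if v % step == 0)
        elif "-" in part:
            # Inclusive range of accepted values.
            a, b = map(int, part.split("-"))
            values.update(range(a, b + 1))
        else:
            # Single value.
            values.add(int(part))
    return sorted(values)

# Day-of-month field (1-31) examples:
every_fifth = expand_field("*/5", 1, 31)    # [5, 10, 15, 20, 25, 30]
listed_days = expand_field("1,11,21", 1, 31)  # [1, 11, 21]
```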
The values for each field can be composed of:

| Field Value | Example | Example Description |
| --- | --- | --- |
| A single value, within the limits displayed above for each field. | | |
| A wildcard `'*'` to indicate no restriction based on the field. | `'0 0 1 * *'` | Configures the schedule to run at midnight (00:00) on the first day of each month. |
| A range `'2-5'`, indicating the range of accepted values for the field. | `'0 0 1-10 * *'` | Configures the schedule to run at midnight (00:00) on the first 10 days of each month. |
| A list of comma-separated values `'2,3,4,5'`, indicating the list of accepted values for the field. | `'0 0 1,11,21 * *'` | Configures the schedule to run at midnight (00:00) on the 1st, 11th, and 21st day of each month. |
| A periodicity indicator `'*/5'` to express how often, based on the field's valid range of values, a schedule is allowed to run. | `'30 */2 1 * *'` | Configures the schedule to run on the 1st of every month, every 2 hours starting at 00:30. `'0 0 */5 * *'` configures the schedule to run at midnight (00:00) every 5 days starting on the 5th of each month. |
| A comma-separated list of any of the above except the `'*'` wildcard, for example `'2,*/5,8-10'`. | `'0 0 5,*/10,25 * *'` | Configures the schedule to run at midnight (00:00) on the 5th, 10th, 20th, and 25th day of each month. |

(Optional) You can delay the start time of a query by enabling **Delay execution**.

## Optionally Configure Export Results in Workflow

Within Treasure Workflow, you can specify the use of this data connector to export data.
```yaml
timezone: UTC

_export:
  td:
    database: sample_datasets

+td-result-into-target:
  td>: queries/sample.sql
  result_connection: facebook_offline_conversions
  result_settings:
    event_set_id: 361738844830373
    upload_tag: purchase_event_upload
    match_keys: email,phone,ln,fn
```

See [Exporting Data with Parameters](https://docs.treasuredata.com/smart/project-product-documentation/exporting-data-with-parameters) for more information on using data connectors in a workflow to export data.
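Before running or scheduling an export (from the console or a workflow), you may want to screen records for the columns marked Required in the mapping table earlier (`event_name`, `event_time`, `currency`, `value`). The helper below is a hypothetical preflight sketch, not part of the connector; it mirrors the decision that the **Skip Invalid Data** option makes for you at export time.

```python
# Columns marked Required in the offline event mapping table above.
REQUIRED = ("event_name", "event_time", "currency", "value")

def split_valid(records):
    """Separate records that carry all required event fields from
    records the export would treat as invalid."""
    valid, invalid = [], []
    for rec in records:
        # A record is invalid if any required field is absent or empty.
        ok = all(rec.get(key) not in (None, "") for key in REQUIRED)
        (valid if ok else invalid).append(rec)
    return valid, invalid

rows = [
    {"email": "foo@fb.com", "event_name": "Purchase",
     "event_time": 1598531676, "currency": "USD", "value": 100.0},
    {"email": "bar@fb.com", "event_name": "Purchase"},  # missing fields
]
good, bad = split_valid(rows)
```

Running a check like this in your own tooling before the export surfaces bad rows early, instead of discovering them through skipped or failed export jobs.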