Braze Cohort Export Integration

Braze is a customer engagement platform that connects consumers and brands. It lets brands deliver targeted messages to consumers' mobile applications and websites via its SDK. The platform also provides a scalable user profile management system, including scheduled cleanup of opted-out users and segments. Treasure Data partnered with Braze to support Treasure Data Cohorts, which can be used to build segments.

This Braze Cohort export integration allows you to add profiles to, or remove profiles from, a particular cohort.

Prerequisites

  • Basic knowledge of Treasure Data.
  • Basic knowledge of Braze and Braze Cohort.
  • To use the Braze Cohort service, contact your Braze representative to enable this service on your Braze account.
  • The ability to retrieve the data import key (client secret) and REST endpoint. For more information, review Retrieving Required Parameter Values from Braze.

Requirements and Limitations

  • Query columns must use the exact column names and data types expected by the integration.
  • The query must include at least one of the following columns: user_ids, device_ids, or a Braze alias column matching the configuration in the UI.

Static IP Address of Treasure Data Integration

If your security policy requires IP whitelisting, you must add Treasure Data's IP addresses to your allowlist to ensure a successful connection.

Please find the complete list of static IP addresses, organized by region, at the following link:
https://api-docs.treasuredata.com/en/overview/ip-addresses-integrations-result-workers/

Use the TD Console to Create a Connection

You must create and configure the data connection in Treasure Data before running your query. As part of the data connection, you provide authentication to access the integration.

Create Authentication

Your first step is to create a new authentication with a set of credentials.

  1. Select Integrations Hub.
  2. Select Catalog.
  3. Search for your Integration in the Catalog; hover your mouse over the icon and select Create Authentication.
  4. Ensure that the Credentials tab is selected, and then enter the credential information for the integration.

New Authentication Fields

  • Rest Endpoint: The Treasure Data Partner Cohort REST endpoint. For more information, review Retrieving Required Parameter Values from Braze.
  • Cluster Region: The Braze cluster region. Two regions are supported: US and EU.
  • Data Import Key: The Treasure Data Partner Cohort data import key. For more information, review Retrieving Required Parameter Values from Braze.
  5. Select Continue.
  6. Enter a name for your authentication, and select Done.

Define your Query

  1. Navigate to Data Workbench > Queries.
  2. Select New Query.
  3. Run the query to validate the result set.

Specify the Result Export Target

  1. Select Export Results.

You can select an existing authentication or create a new authentication for the external service to be used for output. Choose one of the following:

  • Use Existing Integration
  • Create a New Integration

Parameter Configuration

  • Cohort ID: The Braze Cohort ID.
  • Cohort Name: The Braze Cohort name.
  • Operation: The export mode. APPEND adds the profiles to the target cohort; REMOVE removes the profiles from the target cohort.
  • Aliases: A comma-separated list of column names to treat as aliases when sending records to the target Braze Treasure Data Cohort. Each column name is mapped to the Alias Label, and the column's values are mapped to the Alias Name.
  • Thread Count: The number of concurrent requests to the Braze Cohort API. The minimum value is 1, and the maximum value is 10.
  • Skip on Invalid Record: If enabled, the job continues running when it encounters invalid records; otherwise, it stops.
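The Aliases mapping above can be illustrated with a short sketch. This is a hypothetical helper, not the connector's actual implementation, and it shows only the column-to-alias mapping described in the table (column name becomes the alias label, cell value becomes the alias name); the exact payload the Braze API expects is not shown here.

```python
# Hypothetical illustration of the Aliases mapping: each configured
# alias column of a query result row becomes one alias object.
def build_aliases(row, alias_columns):
    """Map configured alias columns of a row to alias objects.

    The column name is used as the alias label and the cell value
    as the alias name, per the Aliases parameter description.
    """
    aliases = []
    for col in alias_columns:
        value = row.get(col)
        if value is not None:
            aliases.append({"alias_label": col, "alias_name": value})
    return aliases

row = {"user_ids": "u-123", "alias_label_1": "amy@example.com"}
print(build_aliases(row, ["alias_label_1"]))
# → [{'alias_label': 'alias_label_1', 'alias_name': 'amy@example.com'}]
```

Rows that have no value in a configured alias column simply contribute no alias object.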

Example Query

  • Query columns must use the exact column names and data types expected by the integration.
  • The query must include at least one of the following columns: user_ids, device_ids, or a Braze alias column matching the configuration in the UI.
SELECT
   user_ids,
   device_ids,
   alias_label_1
FROM
   cohort

(Optional) Schedule Query Export Jobs

You can use Scheduled Jobs with Result Export to periodically write the output result to a target destination that you specify.

Treasure Data's scheduler feature supports periodic query execution to achieve high availability.

When a cron schedule contains conflicting field specifications, the specification that would execute more often is followed, and the other is ignored.

For example, in the cron schedule '0 0 1 * 1', the 'day of month' and 'day of week' specifications conflict: the former requires a run on the first day of each month at midnight (00:00), while the latter requires a run every Monday at midnight (00:00). Because the day-of-week specification executes more often, it is the one that is followed.
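The "executes more often" rule can be checked numerically. The sketch below (an illustration, not part of the scheduler) counts how often each half of '0 0 1 * 1' would fire over one year, showing why the day-of-week half wins.

```python
from datetime import date, timedelta

def count_runs(year, day_of_month=None, day_of_week=None):
    """Count days in `year` matching a day-of-month or day-of-week rule.

    Note: cron numbers days of the week with Sunday=0, so cron's
    '1' (Monday) corresponds to Python's weekday() value 0.
    """
    d = date(year, 1, 1)
    runs = 0
    while d.year == year:
        if day_of_month is not None and d.day == day_of_month:
            runs += 1
        if day_of_week is not None and d.weekday() == day_of_week:
            runs += 1
        d += timedelta(days=1)
    return runs

monthly = count_runs(2024, day_of_month=1)  # "first day of each month"
weekly = count_runs(2024, day_of_week=0)    # "every Monday"
print(monthly, weekly)  # prints: 12 53
```

The weekly specification fires 53 times in 2024 versus 12 for the monthly one, so under the rule above the schedule runs every Monday.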

Scheduling your Job Using TD Console

  1. Navigate to Data Workbench > Queries.

  2. Create a new query or select an existing query.

  3. Next to Schedule, select None.

  4. In the drop-down, select one of the following schedule options:

    • Custom cron...: Review the Custom cron... details.
    • @daily (midnight): Run once a day at midnight (00:00) in the specified time zone.
    • @hourly (:00): Run every hour at minute 00.
    • None: No schedule.

Custom cron... Details

  • '0 * * * *': Run once an hour.
  • '0 0 * * *': Run once a day at midnight.
  • '0 0 1 * *': Run once a month at midnight on the first day of the month.
  • "" (empty string): Create a job that has no scheduled run time.
 *    *    *    *    *
 -    -    -    -    -
 |    |    |    |    |
 |    |    |    |    +----- day of week (0 - 6) (Sunday=0)
 |    |    |    +---------- month (1 - 12)
 |    |    +--------------- day of month (1 - 31)
 |    +-------------------- hour (0 - 23)
 +------------------------- min (0 - 59)

The following named entries can be used:

  • Day of Week: sun, mon, tue, wed, thu, fri, sat.
  • Month: jan, feb, mar, apr, may, jun, jul, aug, sep, oct, nov, dec.

A single space is required between each field. The values for each field can be composed of:

  • A single value, within the limits displayed above for each field.
  • A wildcard '*' to indicate no restriction based on the field. Example: '0 0 1 * *' configures the schedule to run at midnight (00:00) on the first day of each month.
  • A range '2-5', indicating the range of accepted values for the field. Example: '0 0 1-10 * *' configures the schedule to run at midnight (00:00) on each of the first 10 days of each month.
  • A list of comma-separated values '2,3,4,5', indicating the list of accepted values for the field. Example: '0 0 1,11,21 * *' configures the schedule to run at midnight (00:00) on the 1st, 11th, and 21st day of each month.
  • A periodicity indicator '*/5' to express how often, based on the field's valid range of values, a schedule is allowed to run. Example: '30 */2 1 * *' configures the schedule to run on the 1st of every month, every 2 hours starting at 00:30; '0 0 */5 * *' configures the schedule to run at midnight (00:00) every 5 days starting on the 5th of each month.
  • A comma-separated list of any of the above except the '*' wildcard, such as '2,*/5,8-10'. Example: '0 0 5,*/10,25 * *' configures the schedule to run at midnight (00:00) on the 5th, 10th, 20th, and 25th day of each month.
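The field syntax above can be sketched as a small expansion function. This is an illustration following the standard cron reading of single values, ranges, lists, and step indicators; it is not Treasure Data's scheduler, whose exact step-value semantics may differ.

```python
# A sketch of expanding one cron field into its set of accepted values.
def expand_field(spec, lo, hi):
    """Expand a cron field like '2,*/5,8-10' into a sorted list of values.

    lo and hi are the field's valid range, e.g. 0-59 for minutes
    or 1-31 for day of month.
    """
    values = set()
    for part in spec.split(","):
        part, _, step = part.partition("/")
        step = int(step) if step else 1
        if part == "*":
            start, end = lo, hi
        elif "-" in part:
            a, b = part.split("-")
            start, end = int(a), int(b)
        else:
            start = end = int(part)
        values.update(range(start, end + 1, step))
    return sorted(values)

print(expand_field("0", 0, 23))        # hour field '0' → [0]
print(expand_field("1-10", 1, 31))     # '1-10' → days 1 through 10
print(expand_field("1,11,21", 1, 31))  # list → [1, 11, 21]
```

A full schedule is then the combination of the five expanded fields, with the day-of-month/day-of-week conflict resolved as described earlier.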
  5. (Optional) You can delay the start time of a query by enabling Delay execution.

Execute the Query

Save the query with a name and run, or just run the query. Upon successful completion of the query, the query result is automatically exported to the specified destination.

Scheduled jobs that continuously fail due to configuration errors may be disabled on the system side after several notifications.


Retrieving Required Parameter Values from Braze

You must retrieve a few values from your Braze environment to complete the connector configuration.

  1. Log in to your Braze environment.
  2. Go to Partner Integrations > Technology Partners, and select Treasure Data.
  3. Scroll down to find Data Import Using Cohort Import.
  4. Copy the values of the Data Import Key and REST Endpoint into the Authentication configuration of this connector.