Maximize your data resources by using Treasure Data with Google DoubleClick Bid Manager.
You can create audience lists in your Google DoubleClick Bid Manager (DBM) using data held in Treasure Data. This integration helps you move cookies and Mobile Advertising Identifiers to new or existing audience lists within Google DBM.
Google has rebranded DoubleClick Bid Manager as Display & Video 360, part of the Google Marketing Platform.
For sample workflows on how to export to Google DoubleClick Bid Manager, view Treasure Boxes.
Prerequisites:

- Basic knowledge of Treasure Data, including the TD Toolbelt
- A Google DBM account
- Authorized Treasure Data DMP access to your Google DBM account
Updates to audience lists can take up to 24 hours to appear in Google DoubleClick Bid Manager (DBM), measured from the time the query completes.
Google Data Platform Policy (Identifying Users and Obtaining User Consent) requires that each segment identify at least 100 users.
Grant Access for Treasure Data
Treasure Data’s DDP (DoubleClick Data Platform) connector requires permission to create audience segments in your Google DBM account. Use the Google Contact Us form to reach the DoubleClick Bid Manager support team and request that Treasure Data be granted access to your DBM account. Provide the following information in the form:
- Request: Grant Treasure Data permissions
- Your DoubleClick Bid Manager account ID (referred to by Google as the Partner ID or Advertiser ID)
- Treasure Data DMP:
You are sending this information so that Google can recognize Treasure Data and connect your Google DBM account to Treasure Data.
Use the TD Console to Create Your Connection
Create a New Connection
In Treasure Data, you must create and configure the data connection prior to running your query. As part of the data connection, you provide authentication to access the integration.
1. Open TD Console.
2. Navigate to Integrations Hub > Catalog.
3. Search for and select Display & Video 360.
4. Select Create Authentication.
5. Type the credentials to authenticate.
6. Select Continue.
7. Type a name for your connection.
8. Select Done.
Source Column Name Mappings (Optional)
Define the mapping between the Google DDP column names and the output column names in your query, in the form target:source. For example, if google_cookie is the identifier column in your TD data source, define the mapping as cookie:google_cookie. If the source column is omitted from a mapping, the target column name is used as the source; for example, cookie is equivalent to cookie:cookie.
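The target:source convention can be sketched in a few lines of Python. This helper is illustrative only; the comma separator and the parsing details are assumptions for the sketch, not the connector's actual implementation:

```python
def parse_mappings(spec):
    """Parse a 'target:source,target2:source2' mapping string.

    A bare target (no ':source' part) maps to a source column of
    the same name, so 'cookie' is equivalent to 'cookie:cookie'.
    """
    mappings = {}
    for entry in spec.split(","):
        target, _, source = entry.strip().partition(":")
        # Fall back to the target name when the source is omitted.
        mappings[target] = source or target
    return mappings

# 'cookie' is read from the google_cookie column; 'list_name' maps to itself.
print(parse_mappings("cookie:google_cookie,list_name"))
# {'cookie': 'google_cookie', 'list_name': 'list_name'}
```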
Cookie or Mobile Identifier Column Header
Specify the original source of the user cookie or mobile identifier.
You must select one of the options:
cookie_encrypted: An encrypted identifier obtained from Google systems (for example, on the web, a cookie hash of a user ID)
cookie_idfa: iOS Advertising Identifier
cookie_adid: Android Advertising Identifier
cookie_epid: An externally provided ID
The DDP connector reads the data source table by column and uses the following column name mappings to process each row:
cookie: The encrypted Google ID or Mobile Advertising Identifier that DDP uses for ID matching. This column contains the cookie hash or mobile identifier of your users.
list_name: This column contains the name of the audience list (segment) that you want to create in your DBM audience. If the list name does not exist in DBM, a new list is created. If the list name exists, the existing list is updated.
timestamp (optional): The timestamp, in seconds since the epoch. If this column does not exist or a value is missing, the current timestamp is used.
delete (optional): Boolean values (true or false) or numbers (1 or 0) indicating whether the cookie should be added to (false/0) or removed from (true/1) the given audience segment. The value defaults to false if it is blank or the column is not provided.
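A minimal sketch of these per-row defaults, assuming the column semantics described above; the function name and its handling of raw values are illustrative, not the connector's actual code:

```python
import time

def normalize_row(row):
    """Apply the documented defaults to one source row.

    `row` maps column names (cookie, list_name, timestamp, delete)
    to raw values; timestamp and delete are optional.
    """
    ts = row.get("timestamp")
    delete = row.get("delete")
    return {
        "cookie": row["cookie"],
        "list_name": row["list_name"],
        # Missing or blank timestamp: fall back to current epoch seconds.
        "timestamp": int(ts) if ts not in (None, "") else int(time.time()),
        # Blank or absent delete defaults to false (the cookie is added).
        "delete": str(delete).lower() in ("true", "1"),
    }

row = normalize_row({"cookie": "abc123", "list_name": "high_value_users"})
print(row["delete"])  # False: the cookie is added, not removed
```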
Define your Query
Sometimes you need to define the column mapping before writing the query.
Plan to transfer your data at least 24 hours ahead of when you need the audience lists (also referred to as segments) to be in Google DoubleClick Bid Manager.
1. Complete the instructions in Creating a Destination Integration.
2. Navigate to Data Workbench > Queries.
3. Select a query for which you would like to export data.
4. Run the query to validate the result set.
5. Select Export Results.
6. Select an existing integration authentication.
7. Define any additional Export Results details and review the integration parameters for your export integration. (Your Export Results screen might differ, or there might be no additional details to fill out.)
8. Run your query.
9. Validate that your data moved to the destination you specified.
Integration Parameters for Display & Video 360
|Parameter|Description|
|---|---|
|Source Column Name Mappings (optional)|Defines the mapping between the Google DDP column names and the output column names in your query, in the form target:source. For example, if google_cookie is the identifier column in your TD data source, define the mapping as cookie:google_cookie. If the source column is omitted, the target column name is used; cookie is equivalent to cookie:cookie.|
|Cookie or Mobile Identifier Name|The upload process supports several identifier types, and each type must be uploaded to the segment using the correct upload file format. Identifiers fall into two categories: encrypted identifiers (anything obtained from Google systems) and raw identifiers (obtained from an external system or source). Encrypted identifiers are uploaded using the cookie_encrypted file format; raw identifiers are uploaded in a type-specific format, such as cookie_idfa or cookie_adid for Mobile Advertising Identifiers. Supported cookie types are cookie_encrypted, cookie_epid, cookie_idfa, and cookie_adid.|
|Membership life span|The number of days a user's cookie stays on the user list.|
|Temp file size threshold|The maximum size (in bytes) of the local temp file; the temp file is flushed to the remote file when it reaches this threshold.|
Optionally Schedule the Query Export Jobs
You can use Scheduled Jobs with Result Export to periodically write the output result to a target destination that you specify.
1. Navigate to Data Workbench > Queries.
2. Create a new query or select an existing query.
3. Next to Schedule, select None.
4. In the drop-down, select one of the following schedule options (review the Custom cron... details below):

|Schedule option|Description|
|---|---|
|@daily (midnight)|Run once a day at midnight (00:00 am) in the specified time zone.|
|@hourly (:00)|Run every hour at 00 minutes.|
|Custom cron...|Define your own schedule; see Custom cron... Details.|
Custom cron... Details

|Cron schedule|Description|
|---|---|
|0 * * * *|Run once an hour.|
|0 0 * * *|Run once a day at midnight.|
|0 0 1 * *|Run once a month at midnight on the morning of the first day of the month.|
|(blank)|Create a job that has no scheduled run time.|
The following named entries can be used:
- Day of Week: sun, mon, tue, wed, thu, fri, sat.
- Month: jan, feb, mar, apr, may, jun, jul, aug, sep, oct, nov, dec.
A single space is required between each field. The values for each field can be composed of:
|Field Value|Example|Example Description|
|---|---|---|
|A single value, within the limits displayed above for each field|0 0 1 * *|Configures the schedule to run at midnight (00:00) on the first day of each month.|
|A range|0 0 1-10 * *|Configures the schedule to run at midnight (00:00) on the first 10 days of each month.|
|A list of comma-separated values|0 0 1,11,21 * *|Configures the schedule to run at midnight (00:00) every 1st, 11th, and 21st day of each month.|
|A periodicity indicator|30 */2 1 * *|Configures the schedule to run on the 1st of every month, every 2 hours starting at 00:30.|
|A comma-separated list of any of the above except the periodicity indicator|0 0 5,10,20,25 * *|Configures the schedule to run at midnight (00:00) every 5th, 10th, 20th, and 25th day of each month.|
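As a sketch of how these field values combine, the following assumes standard cron semantics for a single field (this is an illustration, not Treasure Data's scheduler code):

```python
def expand_field(spec, low, high):
    """Expand one cron field into the sorted list of matching values.

    Supports a single value, a range (1-10), a periodicity indicator
    (*/2), and comma-separated lists of these, within [low, high].
    """
    values = set()
    for part in spec.split(","):
        part, _, step = part.partition("/")
        step = int(step) if step else 1
        if part == "*":
            start, end = low, high
        elif "-" in part:
            start, end = (int(x) for x in part.split("-"))
        else:
            start = end = int(part)
        values.update(range(start, end + 1, step))
    return sorted(values)

print(expand_field("1,11,21", 1, 31))  # days of month: [1, 11, 21]
print(expand_field("*/2", 0, 23))      # every 2 hours: [0, 2, ..., 22]
```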
5. (Optional) Delay the start time of the query by enabling Delay execution.
Execute the Query
Save the query with a name and run it, or just run the query. Upon successful completion of the query, the query result is automatically exported to the specified destination.
Scheduled jobs that continuously fail due to configuration errors may be disabled on the system side after several notifications.
Optionally Configure Export Results in Workflow
Within Treasure Workflow, you can specify the use of this data connector to export data.
Learn more at Using Workflows to Export Data with the TD Toolbelt.
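For example, a Treasure Workflow definition can attach a saved authentication to a query through the td> operator's result_connection and result_settings parameters. The file name, database, connection name, and settings below are placeholders, not values from this integration:

```yaml
# export_audiences.dig -- illustrative sketch only; replace the
# placeholders with your own query, database, and connection name.
+export_to_dv360:
  td>: queries/audience_lists.sql        # query producing cookie/list_name columns
  database: my_database
  result_connection: my_dv360_connection # authentication created in TD Console
  result_settings:
    # connector-specific integration parameters go here
    # (see Integration Parameters for Display & Video 360 above)
```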