You can use the Amazon DSP Data Provider Export Integration connectors to export segment data to Amazon DSP.
Prerequisites
Basic knowledge of Amazon DSP and an advertiser account
Basic knowledge of Treasure Data: Authentication, Query, Results Export, and (optionally) Workflow.
Limitations
The External Audience ID is the key that specifies the target audience on Amazon (it is unique across all audiences). The values of this field are unique for each Advertiser account.
The job may split the result set into multiple batches. If one batch fails, the connector does not revert successfully uploaded batches.
The connector adds the prefixes required by the Amazon DSP API (COOKIE- or MAID-). Do not add these prefixes manually in the result set.
The connector does not update Audience Metadata.
Null or empty column data is ignored.
The result output schema must have at least a maid column, a cookie column, or both cookie and maid columns in its metadata.
The maid and cookie columns must be String data type.
The maximum length for a cookie field is 1999 characters.
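For example, a query that satisfies these schema requirements might look like the following (the table and source column names are illustrative, not part of the integration):

```sql
-- Hypothetical table and column names; alias the ID columns to the
-- required names (cookie, maid), cast both to string, and drop rows
-- where both values are missing, since null/empty data is ignored.
SELECT
  CAST(cookie_id AS VARCHAR) AS cookie,
  CAST(mobile_ad_id AS VARCHAR) AS maid
FROM my_segment_table
WHERE cookie_id IS NOT NULL
   OR mobile_ad_id IS NOT NULL
```

Note that the COOKIE- and MAID- prefixes are added by the connector, so the query should output the raw identifier values only.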
Use the TD Console to Create Your Connection
Create a New Connection
In Treasure Data, you must create and configure the data connection to be used during export before running your query. As part of the data connection, you provide authentication to access the integration.
Open TD Console.
Navigate to Integrations Hub > Catalog.
Click the search icon on the far right of the Catalog screen and enter Amazon DSP.
Hover over the Amazon DSP Data Provider connector and select Create Authentication.
The following dialog opens. Choose the Region of your advertiser account.
Enter your Advertiser ID. Then select Continue.
Enter a name for your connection.
Select Done.
Export Results to Amazon DSP
Create or reuse a query. You may need to define the column mapping in the query, aliasing your columns to cookie and maid.
When exporting your data to Amazon DSP, you must provide the following parameter values:
Parameter | Description |
---|---|
Audience Name (required) | The name of audience you would like to create. |
Audience Description (optional) | Description for your audience. |
External Audience ID (required) | Unique key for your audience. |
Time to live (required) | The time in seconds that the audience remains valid. |
Operation (required) | Operation to perform on the output data (add or remove). |
Ignore Invalid Records | If this check box is selected, invalid records and errors encountered while sending data to Amazon DSP are skipped and the export continues to the last row. Otherwise, an exception is thrown. |
To export results:
Open the TD Console.
Navigate to Data Workbench > Queries.
Select the query that you plan to use to export data.
For example: SELECT a_cookie_column AS cookie, a_maid_column AS maid FROM your_table;
Select Export Results located at top of your query editor.
The Choose Integration dialog opens.
You have two options when selecting a connection to use to export the results: using an existing connection or creating a new one.
Use an Existing Connection
Type the connection name in the search box to filter.
Select your connection.
Select Next.
Type values for Audience Name, Audience Description, External Audience ID, Time to live, Operation, and Ignore Invalid Records.
Create a New Amazon DSP Data Provider Connection
Select Create New Integration.
Type a Name for your connection.
Select your region from Region.
Type your advertiser ID in Advertiser ID.
Select Next.
Type values for Audience Name, Audience Description, External Audience ID, Time to live, Operation, and Ignore Invalid Records.
Select Done.
Use of Scheduled Jobs for Export
You can use Scheduled Jobs with Result Export, to periodically write the output result to a target destination that you specify.
Configure Export Results in Workflow
```yaml
timezone: UTC

_export:
  td:
    database: sample_datasets

+td-result-into-target:
  td>: queries/sample.sql
  result_connection: your_connections_name
  result_settings:
    region: NA
    advertiser_id: 12345
    audience_name: This is a test audience
    audience_description: This is for testing purpose
    external_audience_id: test_audiences
    time_to_live: 3600
    operation: Add
    ignore_error: false
```
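To run this export periodically, you can add a schedule block to the same workflow file. The interval below is only an example; adjust it to your needs:

```yaml
# Hypothetical schedule: run the export workflow daily at 02:00 UTC.
timezone: UTC

schedule:
  daily>: 02:00:00
```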
To learn more about using data connectors in a workflow, check out the Workflows section of the docs.