You can write Treasure Data job results directly to your MediaMath server.


This feature is in BETA. Contact your Customer Success Representative for more information.


Prerequisites

  • Basic knowledge of Treasure Data, including the TD Toolbelt.

  • A MediaMath account.

Requirements and Limitations

  • Your query must output specific columns with exact column names (case insensitive) and data types.
  • The size of the data file uploaded to the MediaMath SFTP server may not exceed 4 GB.

Static IP Address of Treasure Data

The static IP address of Treasure Data is the access point and source of the linkage for this Integration. To determine the static IP address, contact your Customer Success representative or Technical support.

Use the TD Console to Create Your Connection

We support two options for connecting to MediaMath. You can use Treasure Data as the Data Provider or you can use your own MediaMath credentials. We encourage you to bring your own credentials.

Create a New Connection

In Treasure Data, you must create and configure the data connection before running your query. As part of the data connection, you provide authentication to access the integration.

1. Open TD Console.
2. Navigate to Integrations Hub > Catalog.
3. Search for and select MediaMath.
4. Select Create Authentication.
5. Select Data Provider. You can select either Treasure Data or User account.
  • If you select Treasure Data, select Continue.
  • If you select User account, enter your credentials to authenticate, then select Continue.

6. Create a name for your authentication.
7. Select Done.


Define Your Query

Your query must include exactly one column named `id`. All other column names are treated as segment names.
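For example, a minimal sketch (with hypothetical table and column names) might alias an existing identifier column to `id`, with the remaining columns providing the segment names to export:

SELECT
  td_client_id AS id,            -- identifier column must be named id (placeholder source column)
  likes_sports AS sports_fans,   -- each additional column name becomes a segment name
  recent_purchase AS frequent_buyers
FROM customer_profiles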

Integration Parameters for MediaMath

Parameter

Description

Action
Select one of: Add All Segments, Remove All Segments, or Update Segments Based On Data.

Agency Name
The agency name from MediaMath. Accepts alphanumeric characters only.

User Name Space
If the IDs came from MediaMath, use mm. If the IDs are first-party (1P), use your Partner Namespace. Accepts alphabetic characters only and cannot be blank or empty.

Segment Name Space
If the segments came from MediaMath, use mm. If the segments are first-party (1P), use your Partner Namespace. Accepts alphabetic characters only and cannot be blank or empty.

User Table Id
The Exchange ID used to identify the match table for user IDs outside of the MediaMath namespace. Required when User Name Space is not mm.

Is Mobile?
Check this option if the ID is a mobile or CTV ID.

Is Id GUID v4 format?
Check this option if the ID is in GUID version 4 format.

Is Segment 32bit integer?
Check this option if the segment identifier is a 32-bit integer (segment name from 0 to 2^32).
  • If this option is unchecked, the segment is the value entered in "Output Column Name".
  • If this option is checked, the segment is the value entered for "SegmentId".

Skip Invalid Records
If checked, the export job continues and succeeds even when some data rows contain errors. Otherwise, the job fails.

Example Query

Your query requires exactly one column named `id` with a string data type. Other columns are treated as segment names.


SELECT id, segment_1, segment_2 FROM your_table;
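If the identifier in your source table is not already a string, a minimal sketch like the following (assuming the Presto query engine; user_id is a placeholder column) casts it before export:

-- Hypothetical example: user_id is cast to a string so the exported id column
-- has the required string type; your_table and the segment columns are placeholders.
SELECT CAST(user_id AS VARCHAR) AS id, segment_1, segment_2
FROM your_table
WHERE user_id IS NOT NULL;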

Optionally Schedule the Query Export Jobs

You can use Scheduled Jobs with Result Export to periodically write the output result to a target destination that you specify.


1. Navigate to Data Workbench > Queries.
2. Create a new query or select an existing query.
3. Next to Schedule, select None.

4. In the drop-down, select one of the following schedule options:

Drop-down Value

Description

Custom cron...
See the Custom cron... Details section below.

@daily (midnight)
Run once a day at midnight (00:00) in the specified time zone.

@hourly (:00)
Run every hour at 00 minutes.

None
No schedule.

Custom cron... Details

Cron Value

Description

0 * * * *
Run once an hour.

0 0 * * *
Run once a day at midnight.

0 0 1 * *
Run once a month at midnight on the first day of the month.

"" (empty)
Create a job that has no scheduled run time.

 *    *    *    *    *
 -    -    -    -    -
 |    |    |    |    |
 |    |    |    |    +----- day of week (0 - 6) (Sunday=0)
 |    |    |    +---------- month (1 - 12)
 |    |    +--------------- day of month (1 - 31)
 |    +-------------------- hour (0 - 23)
 +------------------------- min (0 - 59)

The following named entries can be used:

  • Day of Week: sun, mon, tue, wed, thu, fri, sat.

  • Month: jan, feb, mar, apr, may, jun, jul, aug, sep, oct, nov, dec.

A single space is required between each field. The values for each field can be composed of:

Field Value

Example

A single value, within the limits displayed above for each field.

A wildcard '*' to indicate no restriction based on the field.
'0 0 1 * *' configures the schedule to run at midnight (00:00) on the first day of each month.

A range '2-5', indicating the range of accepted values for the field.
'0 0 1-10 * *' configures the schedule to run at midnight (00:00) on the first 10 days of each month.

A list of comma-separated values '2,3,4,5', indicating the list of accepted values for the field.
'0 0 1,11,21 * *' configures the schedule to run at midnight (00:00) on the 1st, 11th, and 21st day of each month.

A periodicity indicator '*/5' to express how often, based on the field's valid range of values, a schedule is allowed to run.
'30 */2 1 * *' configures the schedule to run on the 1st of every month, every 2 hours starting at 00:30. '0 0 */5 * *' configures the schedule to run at midnight (00:00) every 5 days starting on the 5th of each month.

A comma-separated list of any of the above except the '*' wildcard, for example '2,*/5,8-10'.
'0 0 5,*/10,25 * *' configures the schedule to run at midnight (00:00) on the 5th, 10th, 20th, and 25th day of each month.
5. (Optional) You can delay the start time of a query by enabling Delay execution.

Execute the Query

Save the query with a name and run, or just run the query. Upon successful completion of the query, the query result is automatically imported to the specified container destination.

Scheduled jobs that continuously fail due to configuration errors may be disabled on the system side after several notifications.

Optionally Configure Export Results in Workflow


Within Treasure Workflow, you can specify the use of a data connector to export data.

Learn more at Using Workflows to Export Data with the TD Toolbelt.

The `action` parameter accepts the following values:

  • add
  • remove
  • data_driven


_export:
  td:
    database: mediamath_db

+mediamath_export_task:
  td>: export.sql
  database: ${td.database}
  result_connection: new_created_mediamath_auth
  result_settings:
    type: media_math
    data_provider: user
    host: host
    port: 22
    user_name: user_name
    password: password
    partner_name: partner_name
    action: add
    agency_name: agency_name
    id_name_space: mm
    segment_name_space: mm
    user_table_id: user_table_id
    is_mobile: true
    is_id_guid_format: true
    is_segment_32bit_integer: true
    skip_invalid_records: true
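
The task above runs the query stored in export.sql. A minimal sketch of that file, mirroring the earlier example query (your_table and the segment columns are placeholders):

-- export.sql: a hypothetical query whose result is exported to MediaMath
-- using the result_settings defined in the workflow above.
SELECT id, segment_1, segment_2
FROM your_table;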


