Google has rebranded and folded DoubleClick for Publishers into Google Ad Manager.

This connector was renamed from "Google DoubleClick for Publisher" to "Google Ad Manager via Data Transfer Reports".


You can import Google DoubleClick for Publishers (DFP) data objects into Treasure Data.


Prerequisites

  • Basic knowledge of Treasure Data

  • Basic knowledge of Google DFP

  • Authorized Treasure Data Service Account access to your Google DFP Account

Grant Access for Treasure Data

Treasure Data’s DFP input connector requires permission to read data from your Google DFP account. In your Google DFP console, click the “Admin” tab, select “Global settings”, and then “All network settings”. In the “Service account user” section of the right panel, click the button to grant access to a service account user, then complete the form as shown in the following image:


After you grant access to our service account email (google-dfp@affable-beach-161802.iam.gserviceaccount.com), complete the following steps to import your data.

Use TD Console

Create a New Connection

Go to Integrations Hub > Catalog, then search for and select Google Ad Manager via Data Transfer Reports for Import. The dialog opens. Enter your DFP network information, which you can find in your Google DFP console by clicking “Admin” > “Global settings” > “All network settings” > “Network code”.



Create a New Source

After creating the connection, you are automatically taken to the My Authentications tab. Look for the connection you created and click New Source.



The following dialog opens. Provide the required details and click Next.


Next, you see a preview of your data similar to the following dialog. To make changes, click Advanced Settings; otherwise, click Next.

Select the database and table where you want to transfer the data:


Specify the schedule of the data transfer using the following dialog and click Start Transfer.




You see the new data transfer in progress listed under the My Input Transfers tab, and a corresponding job is listed in the Jobs section.

Use Command Line

Install ‘td’ Command v0.11.9 or Later

Install the newest version of TD Toolbelt and verify that the td command is v0.11.9 or later:

$ td --version
0.15.0


Create Configuration File

Prepare a configuration file (for example, load.yml) with your Google DFP account access information, as shown in the following example:

in:
  type: google_dfp
  target: order
  network_code: 1234567
  auth_method: SERVICE_ACCOUNT
  start_date: 2017-01-02T12:00:00
  end_date: 2017-11-10T10:00:00
out:
  mode: replace

This example dumps the Google DFP Order data object:

  • target: Google DFP data object you want to import.

    • See the Appendix for the list of available targets.

  • network_code: Google DFP network code

  • auth_method: Support authorization via a Google service account (required, supported value: SERVICE_ACCOUNT)

  • start_date: import data from this date (optional), format is: yyyy-MM-dd'T'hh:mm:ss

  • end_date: import data until this date (optional), format is: yyyy-MM-dd'T'hh:mm:ss

The start_date and end_date options are available (and optional) for the order target. For the list of all options available for each target, see the Appendix.

For more details on available out modes, see the Appendix.

Preview Data to Import (Optional)

You can preview data to be imported using the command td connector:preview.

$ td connector:preview load.yml
+---------------------------+----------------------------------------------------------+--------------------+----
| end_date_time:timestamp   | total_budget:json                                        | advertiser_id:long | ...
+---------------------------+----------------------------------------------------------+--------------------+----
| "2017-02-27 12:59:00 UTC" | "{\"currencyCode\":\"AUD\",\"microAmount\":10471350000}" | 123456789          | ...
| "2017-02-10 12:59:00 UTC" | "{\"currencyCode\":\"AUD\",\"microAmount\":35000000000}" | 987654321          | ...
+---------------------------+----------------------------------------------------------+--------------------+----


Submit the Load Job

Submit the load job. It might take a couple of hours depending on the data size. You must specify the database and table where your data will be stored.

It is recommended to specify the --time-column option, because Treasure Data’s storage is partitioned by time (see also data partitioning). If the option is not given, the Data Connector chooses the first long or timestamp column as the partitioning time. The type of the column specified by --time-column must be either long or timestamp.

If your data doesn’t have a time column, you can add one using the add_time filter option. For more details, see the add_time filter plugin.
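
For example, here is a minimal sketch of a load.yml that adds a time column with the add_time filter. It assumes the standard add_time options (to_column and from_value with the upload_time mode); adjust it to your data:

in:
  type: google_dfp
  target: order
  network_code: 1234567
  auth_method: SERVICE_ACCOUNT
filters:
  - type: add_time
    to_column:
      name: time            # new column to use as the partitioning key
      type: timestamp
    from_value:
      mode: upload_time     # stamp each record with the time of the import
out:
  mode: replace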

$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column updated_date

The td connector:issue command assumes that you have already created the database (td_sample_db) and the table (td_sample_table). If the database or the table does not exist in TD, this command will not succeed. Create the database and table manually, or use the --auto-create-table option with the td connector:issue command to create them automatically:

$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column updated_date --auto-create-table

You can assign a Time Format column to the "Partitioning Key" by using the "--time-column" option.


Scheduled Execution

You can schedule periodic Data Connector execution for periodic Google DFP imports. We configure our scheduler carefully to ensure high availability. By using this feature, you no longer need a cron daemon in your local data center.

Create the Schedule

A new schedule can be created using the td connector:create command. The name of the schedule, a cron-style schedule, the database and table where the data will be stored, and the Data Connector configuration file are required.

$ td connector:create \
    daily_google_dfp_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml

The `cron` parameter also accepts these three options: `@hourly`, `@daily` and `@monthly`.

By default, the schedule is set up in the UTC timezone. You can set the schedule in a different timezone using the -t or --timezone option. The `--timezone` option only supports extended timezone formats like 'Asia/Tokyo' and 'America/Los_Angeles'. Timezone abbreviations like PST and CST are *not* supported and may lead to unexpected schedules.
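
For example, to run the same schedule at 00:10 Tokyo time, pass the --timezone option described above (a sketch reusing the schedule created earlier):

$ td connector:create \
    daily_google_dfp_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml \
    --timezone Asia/Tokyo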


List the Schedules

You can see the list of scheduled entries by using td connector:list.

$ td connector:list
+--------------------------+--------------+----------+-------+--------------+-----------------+------------------------------+
| Name                     | Cron         | Timezone | Delay | Database     | Table           | Config                       |
+--------------------------+--------------+----------+-------+--------------+-----------------+------------------------------+
| daily_google_dfp_import  | 10 0 * * *   | UTC      | 0     | td_sample_db | td_sample_table | {"type"=>"google-dfp", ... } |
+--------------------------+--------------+----------+-------+--------------+-----------------+------------------------------+


Show the Setting and History of Schedules

td connector:show shows the execution setting of a schedule entry.

% td connector:show daily_google_dfp_import
Name     : daily_google_dfp_import
Cron     : 10 0 * * *
Timezone : UTC
Delay    : 0
Database : td_sample_db
Table    : td_sample_table

td connector:history shows the execution history of a schedule entry. To investigate the results of each individual execution, use td job <jobid>.

% td connector:history daily_google_dfp_import
+--------+---------+---------+--------------+-----------------+----------+---------------------------+----------+
| JobID  | Status  | Records | Database     | Table           | Priority | Started                   | Duration |
+--------+---------+---------+--------------+-----------------+----------+---------------------------+----------+
| 578066 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-18 00:10:05 +0000 | 160      |
| 577968 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-17 00:10:07 +0000 | 161      |
| 577914 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-16 00:10:03 +0000 | 152      |
| 577872 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-15 00:10:04 +0000 | 163      |
| 577810 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-14 00:10:04 +0000 | 164      |
| 577766 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-13 00:10:04 +0000 | 155      |
| 577710 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-12 00:10:05 +0000 | 156      |
| 577610 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-11 00:10:04 +0000 | 157      |
+--------+---------+---------+--------------+-----------------+----------+---------------------------+----------+
8 rows in set


Delete the Schedule

td connector:delete removes the schedule.

$ td connector:delete daily_google_dfp_import


Appendix

Modes for Out Plugin

You can specify the file import mode in the out section of load.yml.

append (default)

This is the default mode and records are appended to the target table.

in:
  ...
out:
  mode: append

replace (In td 0.11.10 and later)

This mode replaces data in the target table. Any manual schema changes made to the target table remain intact with this mode.

in:
  ...
out:
  mode: replace


Available Targets

  • company: Company data object

  • creative: Creative data object

  • inventory_adunit: Inventory AdUnit data object

  • line_item: Line Item data object

  • order: Order data object

  • placement: Placement data object

  • report: Reporting using a saved report query


Available Target Options

The options supported by each target are:

  • company: last_fetched_datetime

  • creative: last_fetched_datetime

  • inventory_adunit: last_fetched_datetime

  • line_item: start_date, end_date, last_fetched_datetime

  • order: start_date, end_date, last_fetched_datetime

  • placement: last_fetched_datetime

  • report: report_query


  • start_date (optional): import data from this date, format is: yyyy-MM-dd'T'hh:mm:ss

  • end_date (optional): import data until this date, format is: yyyy-MM-dd'T'hh:mm:ss

  • report_query (required): the query name (or id) of the saved report query in your Google DFP console

  • last_fetched_datetime (optional): only import data whose last modified time is after (exclusive) this date and time. The value is in epoch milliseconds, for example ‘1509511116000’ (2017-11-01 04:38:36 UTC).

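For example, a minimal sketch of an incremental load.yml for the order target that only picks up records modified after the given epoch-millisecond watermark (replace the value with your own):

in:
  type: google_dfp
  target: order
  network_code: 1234567
  auth_method: SERVICE_ACCOUNT
  last_fetched_datetime: '1509511116000'   # epoch millis; only records modified after this time are imported
out:
  mode: append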

Report Target

The report target executes the saved report query specified by report_query to get the report and imports the data into your database. The query must therefore be accessible by our system in advance. You can grant access to our Google DFP service account email (google-dfp@affable-beach-161802.iam.gserviceaccount.com) in the “User able to edit” field:

Pay attention to the date range you use in the report query. Refer to the Google API documentation for the list of supported dynamic date ranges.
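
A minimal sketch of a load.yml for the report target, assuming a saved report query that our service account can access (the query name my_saved_report is a placeholder):

in:
  type: google_dfp
  target: report
  network_code: 1234567
  auth_method: SERVICE_ACCOUNT
  report_query: my_saved_report   # name (or id) of the saved report query in your DFP console; placeholder
out:
  mode: replace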


