Data Connector for Salesforce Marketing Cloud (BETA)

This data connector allows you to import Salesforce Marketing Cloud (beta) Data Source objects into Treasure Data. Contact us for further details about the private beta.

You can use the same connection to import and export SFMC data. See Exporting Data to SFMC (BETA).

Prerequisites

  • Basic knowledge of Treasure Data
  • Basic knowledge of Salesforce Marketing Cloud

Option 1: Use Web Console

Create a new connection

Go to Treasure Data Connections. Locate and select Salesforce Marketing Cloud (beta). The connection dialog opens.

Provide your Salesforce Marketing Cloud Client ID and Client Secret, click Next, and give your connection a name:

Create a new transfer

After creating the connection, you are automatically taken to the My Connections tab. Look for the connection you created and click New Transfer.

The following dialog opens. Provide the required details and click Next.

Next, you see a Preview of your data similar to the following dialog. To make changes, click Advanced Settings; otherwise, click Next.

From here, if you want to change options such as skipping records on errors or rate limits, click Advanced Settings:

The next step is to select the database and table where you want to transfer the data, as shown in the following dialog:

Finally, specify the schedule of the data transfer using the following dialog, and click Start Transfer:

You will see the new data transfer in progress listed under the My Input Transfers tab, and a corresponding job listed in the Jobs section.

You are ready to start analyzing your data.

Option 2: Use Command Line

Step 0: Install ‘td’ command v0.11.9 or later

Install the newest Treasure Data Toolbelt, then verify the version:

$ td --version

Step 1: Create Configuration File

Prepare a configuration file (for example, load.yml) as shown in the following example, with your Salesforce Marketing Cloud account access information.

  type: salesforce_marketing_cloud
  client_id: xxxxxxxxxx
  client_secret: xxxxxxxxxx
  target: campaign (required, see Appendix B)
  mode: replace

This example dumps the Salesforce Marketing Cloud Campaign Data Source:

  • client_id: Salesforce Marketing Cloud client ID.
  • client_secret: Salesforce Marketing Cloud client secret.
  • target: Salesforce Marketing Cloud entity object to be imported.
    • See Appendix B for the list of available targets.

For more details on available out modes, see Appendix A.
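
For instance, here is a sketch of a configuration for the data_extension target with incremental loading enabled. The keys are the ones documented in this article; mydatefield is a placeholder for your own date field (see Incremental Loading for Data Extensions):

```yaml
# Sketch: incremental load of a Data Extension.
# mydatefield is a placeholder for a date field in your Data Extension.
type: salesforce_marketing_cloud
client_id: xxxxxxxxxx
client_secret: xxxxxxxxxx
target: data_extension
incremental: true
incremental_column_name: mydatefield
from_date: 2018-02-01T00:00:00.000Z
fetch_days: 2
mode: append
```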

Step 2 (optional): Preview data to import

You can preview data to be imported using the command td connector:preview.

$ td connector:preview load.yml
| id:long         | name:string         | description:string | ...
| 42023           | "Hello"             | apps               |
| 42045           | "World"             | apps               |

Step 3: Execute Load Job

Finally, submit the load job. It may take a couple of hours depending on the data size. Specify the database and table where the data will be stored.

It is recommended to specify the --time-column option, since Treasure Data’s storage is partitioned by time (see also data partitioning). If the option is not given, the data connector selects the first long or timestamp column as the partitioning time. The column specified by --time-column must be of either long or timestamp type.

If your data doesn’t have a time column, you can add one using the add_time filter option. For more details, see the add_time filter plugin documentation.

$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column modifieddate

The above command assumes that the database (td_sample_db) and table (td_sample_table) already exist. If the database or table does not exist in TD, the command fails, so either create the database and table manually or use the --auto-create-table option with the td connector:issue command to create them automatically:

$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column modifieddate --auto-create-table
The column specified by the --time-column option is used as the partitioning key.

Scheduled execution

You can schedule periodic data connector execution for recurring Salesforce Marketing Cloud imports. We take great care in distributing and operating our scheduler in order to achieve high availability. By using this feature, you no longer need a cron daemon in your local data center.

Create the schedule

A new schedule can be created using the td connector:create command. The name of the schedule, the cron-style schedule, the database and table where the data will be stored, and the Data Connector configuration file are required.

$ td connector:create \
    daily_salesforce_marketing_cloud_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml
The `cron` parameter also accepts these three options: `@hourly`, `@daily`, and `@monthly`.
By default, the schedule is set up in the UTC timezone. You can set the schedule in a different timezone using the -t or --timezone option. The `--timezone` option supports only extended timezone formats such as 'Asia/Tokyo' and 'America/Los_Angeles'. Timezone abbreviations such as PST and CST are *not* supported and may lead to unexpected schedules.

List the Schedules

You can see the list of currently scheduled entries by td connector:list.

$ td connector:list
| Name                                    | Cron         | Timezone | Delay | Database     | Table           | Config                                       |
| daily_salesforce_marketing_cloud_import | 10 0 * * *   | UTC      | 0     | td_sample_db | td_sample_table | {"type"=>"salesforce_marketing_cloud", ... } |

Show the Setting and History of Schedules

td connector:show shows the execution setting of a schedule entry.

$ td connector:show daily_salesforce_marketing_cloud_import
Name     : daily_salesforce_marketing_cloud_import
Cron     : 10 0 * * *
Timezone : UTC
Delay    : 0
Database : td_sample_db
Table    : td_sample_table

td connector:history shows the execution history of a schedule entry. To investigate the results of each individual execution, use td job <jobid>.

$ td connector:history daily_salesforce_marketing_cloud_import
| JobID  | Status  | Records | Database     | Table           | Priority | Started                   | Duration |
| 578066 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-18 00:10:05 +0000 | 160      |
| 577968 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-17 00:10:07 +0000 | 161      |
| 577914 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-16 00:10:03 +0000 | 152      |
| 577872 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-15 00:10:04 +0000 | 163      |
| 577810 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-14 00:10:04 +0000 | 164      |
| 577766 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-13 00:10:04 +0000 | 155      |
| 577710 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-12 00:10:05 +0000 | 156      |
| 577610 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-11 00:10:04 +0000 | 157      |
8 rows in set

Delete the Schedule

td connector:delete will remove the schedule.

$ td connector:delete daily_salesforce_marketing_cloud_import

Incremental Loading for Data Extensions

Treasure Data supports incremental loading for Data Extensions that have a date field.

If incremental: true is set, the data connector loads records within the range defined by from_date and fetch_days for the specified date field.

For example:

  incremental_column_name: mydatefield
  from_date: 2018-02-01T00:00:00.000Z
  fetch_days: 2
  • 1st iteration: The data connector fetches records from Feb 01 00:00:00 UTC 2018 to Feb 03 00:00:00 UTC 2018.
  • 2nd iteration: The data connector fetches records for the next available 2-day period, from Feb 03 00:00:00 UTC 2018 until Feb 05 00:00:00 UTC 2018. This process repeats for each successive iteration.
  • When the increment includes the present date, additional records are fetched as each complete time period becomes available.
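
The window arithmetic above can be sketched in Python (an illustration only, not the connector's actual implementation):

```python
from datetime import datetime, timedelta

def incremental_windows(from_date, fetch_days, iterations):
    """Yield the (start, end) windows the iterations above walk through."""
    start = datetime.strptime(from_date, "%Y-%m-%dT%H:%M:%S.%fZ")
    step = timedelta(days=fetch_days)
    for _ in range(iterations):
        yield start, start + step
        start += step

for start, end in incremental_windows("2018-02-01T00:00:00.000Z", 2, 2):
    print(start.date(), "->", end.date())
# 2018-02-01 -> 2018-02-03
# 2018-02-03 -> 2018-02-05
```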

If incremental: false is set, the data connector loads all records for the specified target. This is a one-time activity.


Appendix A: Modes for out plugin

You can specify the file import mode in the out section of load.yml.

append (default)

This is the default mode and records are appended to the target table.

  mode: append

replace (In td 0.11.10 and later)

This mode replaces data in the target table. Any manual schema changes made to the target table remain intact with this mode.

  mode: replace

Appendix B: Available targets

| Target         | Description                                                       |
| campaign       | The e-mail campaign                                               |
| contact        | The contact list                                                  |
| data_extension | The data extensions to satisfy the need for flexible data storage |

The target contact provides an option to ingest data in multiple requests in case your contact model has many attributes. If you select this option, you must also specify the number of attributes per request. The default value of this field is 100.
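
As an illustration of what the attributes-per-request option controls, here is a hypothetical batching sketch (the attribute names and the helper are mine, not part of the connector):

```python
def batch_attributes(attributes, per_request=100):
    """Split a contact model's attribute list into request-sized batches."""
    return [attributes[i:i + per_request]
            for i in range(0, len(attributes), per_request)]

# 250 hypothetical attributes -> requests of 100, 100, and 50 attributes
attrs = ["attr_%d" % n for n in range(250)]
print([len(b) for b in batch_attributes(attrs)])  # [100, 100, 50]
```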

In some cases, ingestion can fail because some attributes are faulty. You can use the Attribute set names will be ignored option to skip faulty attributes. This option helps you ingest data from the Salesforce Marketing Cloud API without errors.

The target data_extension provides an option to filter the data extensions that you want to ingest, and another option to enable ingesting shared data extensions.

Appendix C: Additional Settings

| Parameter | Description | Default value |
| Number of records per page | Number of records per page for target data_extension. The maximum value is 2500, per the Salesforce documentation. | 50 |
| Number of records per page for target contact | Number of records per page for target contact. This option is useful when you have a large amount of data. | 1000 |
| Request timeout per API call (milliseconds) | The time to wait for a response from the Salesforce Marketing Cloud API. This value is useful when you have network issues. | 60000 |
| Sleep time per API call (milliseconds) | The time to sleep between requests to the Salesforce Marketing Cloud API, to avoid flooding the API. | 3000 |
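
To get a feel for how these settings interact, here is a rough back-of-the-envelope sketch (my own illustration, not connector code) estimating the request count and the minimum sleep overhead for a given record volume:

```python
import math

def estimate(total_records, page_size, sleep_ms):
    """Estimate the number of API requests and the total sleep overhead (seconds)."""
    requests = math.ceil(total_records / page_size)
    return requests, requests * sleep_ms / 1000

# 100,000 data_extension records at the default page size (50) and
# default sleep (3000 ms): 2000 requests, 6000 s spent sleeping.
# Raising the page size toward the 2500 maximum cuts this sharply.
print(estimate(100_000, 50, 3000))    # (2000, 6000.0)
print(estimate(100_000, 2500, 3000))  # (40, 120.0)
```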

Last modified: Mar 29 2018 19:37:35 UTC
