Data Connector for Intercom

This article describes how to use the data connector for Intercom, which allows you to import data directly from your Intercom account into Treasure Data.

Prerequisites

  • Basic knowledge of Treasure Data
  • Basic knowledge of Intercom

Option 1: Use Web Console

Step 1: Create a new Connection with OAuth

Authorize with OAuth

Visit Treasure Data Connections and choose “Intercom”. A dialog like the one below opens. Skip App id and API Key, and start the Intercom OAuth flow from “Click here to connect a new account” at the bottom.



After completing Intercom’s OAuth flow, you are navigated back to the Treasure Data Connections page. Choose “Intercom” again, and you will see your name (and the date) in the OAuth connection dropdown. Select it, click Continue, name the connection in the next dialog, and finally click Create Connection.



Memo: App id and API Key were used formerly. Intercom has since introduced an OAuth flow, and its API keys are deprecated.

Tips: If you use Google Sign-In to log in to Intercom, make sure that you are already logged in to Intercom before starting the OAuth flow. As of February 2017, Intercom’s OAuth flow requires password login and does not support Google Sign-In.

Update an existing API key-based connection to OAuth

Just initiate the OAuth flow as above even if you have been using API keys. OAuth is prioritized over API keys if both are specified.

Step 2: Create a new Transfer

After creating or updating a connection as above, you are navigated to the My Connections tab. Find the created connection and click New Transfer. First, choose the data type to import in the dialog shown below.



Next, check the preview, then choose the database and table into which the data will be imported. The process finishes by setting when to run the transfer. You will see the newly created Transfer in the My Input Transfers tab.

You are finally ready to analyze your data from Intercom!

Option 2: Use Command Line

Step 0: Install ‘td’ command v0.11.9 or later

You can install the newest Treasure Data Toolbelt.
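For example, on a machine with Ruby installed, the toolbelt can be installed or updated via RubyGems (one of several installation options):

$ gem install td

Then confirm that the installed version is v0.11.9 or later: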

$ td --version
0.15.0

Step 1: Create Configuration File

Prepare a configuration file (for example, load.yml) like the one below, with your Intercom account access information.

in:
  type: intercom
  access_token: xxxxxxx
  target: users
out:
  mode: replace

Access Token

The example above dumps Intercom’s users objects. Here, access_token is a valid access token obtained from Intercom. Although using the OAuth flow through our web console (above) is recommended, you can use your Personal Access Token as access_token instead.

Target

You can select which data to fetch from Intercom with the target option.
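For example, to import a different object type, change target in load.yml. This is only a sketch: users is the target shown in this article, and companies below is an assumed value, so check which targets your connector version supports.

in:
  type: intercom
  access_token: xxxxxxx
  target: companies   # assumed target value, for illustration only
out:
  mode: replace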

Step 2 (optional): Preview data to be imported

You can preview data to be imported using the command td connector:preview.

$ td connector:preview load.yml
+-----------+----------------+----------------+-----
| id:string | user_id:string | email:string   | ...
+-----------+----------------+----------------+-----
| "1"       | "33"           | "xxxx@xxx.com" |
| "2"       | "34"           | "yyyy@yyy.com" |
| "3"       | "35"           | "zzzz@zzz.com" |
| "4"       | "36"           | "aaaa@aaa.com" |
| "6"       | "37"           | "bbbb@bbb.com" |
+-----------+----------------+----------------+-----

Step 3: Execute Load Job

Finally, submit the load job. It may take a couple of hours depending on the data size. Specify the database and table where the data will be stored.

It is recommended to specify the --time-column option, since Treasure Data’s storage is partitioned by time (see also the architecture documentation). If the option is not given, the Data Connector chooses the first long or timestamp column as the partitioning time. The column specified by --time-column must be of either long or timestamp type.

If your data doesn’t have a time column, you can add one using the add_time filter option. See the add_time filter plugin documentation for more details.
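For example, a load.yml that uses the add_time filter to build the time column from created_at might look like the following. This is a minimal sketch that assumes created_at is a timestamp column; see the add_time filter plugin documentation for the full option set.

in:
  type: intercom
  access_token: xxxxxxx
  target: users
filters:
  - type: add_time
    to_column:
      name: time          # column used for Treasure Data's time partitioning
      type: timestamp
    from_column:
      name: created_at    # assumed source column
out:
  mode: replace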

$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column created_at

The above command assumes you have already created the database (td_sample_db) and the table (td_sample_table). If the database or the table does not exist in TD, this command will not succeed, so create the database and table manually or use the --auto-create-table option with the td connector:issue command to create them automatically:

$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column created_at --auto-create-table
You can assign a time-format column as the partitioning key with the --time-column option.

Scheduled execution

You can schedule the Data Connector to run periodically for recurring Intercom imports. We take great care in distributing and operating our scheduler in order to achieve high availability. By using this feature, you no longer need a cron daemon on your local data center.

Create the schedule

A new schedule can be created using the td connector:create command. The name of the schedule, a cron-style schedule, the database and table where the data will be stored, and the Data Connector configuration file are required.

$ td connector:create \
    daily_intercom_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml
The `cron` parameter also accepts these three options: `@hourly`, `@daily` and `@monthly`.
By default, the schedule is set up in the UTC timezone. You can set the schedule in a different timezone using the -t or --timezone option. Note that the `--timezone` option only supports extended timezone formats such as 'Asia/Tokyo' and 'America/Los_Angeles'. Timezone abbreviations such as PST and CST are *not* supported and may lead to unexpected schedules.
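For example, to run the same daily import at 00:10 Japan time, the schedule above could be created with the timezone option (a sketch reusing the names from the previous command):

$ td connector:create \
    daily_intercom_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml \
    -t Asia/Tokyo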

List the Schedules

You can see the list of currently scheduled entries by td connector:list.

$ td connector:list
+-----------------------+--------------+----------+-------+--------------+-----------------+----------------------------+
| Name                  | Cron         | Timezone | Delay | Database     | Table           | Config                     |
+-----------------------+--------------+----------+-------+--------------+-----------------+----------------------------+
| daily_intercom_import | 10 0 * * *   | UTC      | 0     | td_sample_db | td_sample_table | {"type"=>"intercom", ... } |
+-----------------------+--------------+----------+-------+--------------+-----------------+----------------------------+

Show the Setting and History of Schedules

td connector:show shows the execution setting of a schedule entry.

% td connector:show daily_intercom_import
Name     : daily_intercom_import
Cron     : 10 0 * * *
Timezone : UTC
Delay    : 0
Database : td_sample_db
Table    : td_sample_table

td connector:history shows the execution history of a schedule entry. To investigate the results of each individual execution, use td job <jobid>, as shown after the table below.

% td connector:history daily_intercom_import
+--------+---------+---------+--------------+-----------------+----------+---------------------------+----------+
| JobID  | Status  | Records | Database     | Table           | Priority | Started                   | Duration |
+--------+---------+---------+--------------+-----------------+----------+---------------------------+----------+
| 578066 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-18 00:10:05 +0000 | 160      |
| 577968 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-17 00:10:07 +0000 | 161      |
| 577914 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-16 00:10:03 +0000 | 152      |
| 577872 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-15 00:10:04 +0000 | 163      |
| 577810 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-14 00:10:04 +0000 | 164      |
| 577766 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-13 00:10:04 +0000 | 155      |
| 577710 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-12 00:10:05 +0000 | 156      |
| 577610 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-11 00:10:04 +0000 | 157      |
+--------+---------+---------+--------------+-----------------+----------+---------------------------+----------+
8 rows in set
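For example, to inspect the most recent run listed above in more detail, pass its JobID to td job:

$ td job 578066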

Delete the Schedule

td connector:delete will remove the schedule.

$ td connector:delete daily_intercom_import

Appendix

A) Modes for out plugin

You can specify the import mode in the out section of load.yml.

append (default)

This is the default mode and records are appended to the target table.

in:
  ...
out:
  mode: append

replace (in td 0.11.10 and later)

This mode replaces data in the target table. Please note that any manual schema changes made to the target table will remain intact with this mode.

in:
  ...
out:
  mode: replace
