# Intercom Import Integration

You can directly import data from Intercom to Treasure Data.

# Prerequisites

- Basic knowledge of Treasure Data, including the [TD Toolbelt](https://toolbelt.treasuredata.com/)
- Basic knowledge of Intercom

# Use the TD Console to Create Your Connection

## Create a New Connection

Go to Integrations Hub > Catalog, then search for and select Intercom.

![](/assets/image-20191021-163206.227de535ba11d859b5f72ca39993368b027215d1731b56966a1f61fae0195574.5ce474d2.png)

Select **Create**. You are creating an authenticated connection. The following dialog opens.

![](/assets/image-20191021-163213.a3fc44d8e4fdc2cec029167063f6d4fe801e2201ca2ddb84b327389428ee322e.5ce474d2.png)

Access to Intercom requires OAuth2 authentication. Select **Click here** to connect to your Intercom account, then enter your credentials to sign in to Intercom.

![](/assets/image-20191021-163226.7bf8e404820f7180f50906343f51d0975c1cdab2ac3c1ac293ace4808e89c6be.5ce474d2.png)

After you grant access to Treasure Data, you are redirected back to TD Console. Choose the Intercom connector again, then choose the OAuth authentication method. An OAuth connection with your account name appears in the dropdown list. Choose the account you want to use and proceed to create the connection.

![](/assets/image-20191021-163235.7be7aea3c68d676a1522ba30ec3a9044437329da72b90211304534929a4c4ebb.5ce474d2.png)

![](/assets/image-20191021-163244.5d4e9fb45ea3b663b3d1aa82393219c3e8b58ad988d22c150ea86bd560dcb556.5ce474d2.png)

Name your new Intercom connection. Select **Done**.

Previously, this data connector used `App id` and `API Key` for authentication. However, [Intercom started their OAuth flow](https://developers.intercom.com/blog/oauth-support) and [Intercom API keys were deprecated](https://developers.intercom.com/blog/announcement-upcoming-deprecation-of-api-keys).

If you use Google Sign-In to log into Intercom, make sure that you are already logged in to Intercom before starting the OAuth flow. The OAuth flow requires a password login; it does not support Google Sign-In.

#### Update an existing API key-based connection to OAuth

Initiate the OAuth flow as described above, even if you have been using API keys. If both are specified, OAuth takes priority over API keys.

## Transfer Your Data to Treasure Data

After creating the authenticated connection, you are automatically taken to the Authentications tab. Look for the connection you created and select **New Source**.

### Import From Users and Conversations

From Source, select users or conversations.

![](/assets/image-20191021-163319.73d1548449516a00ddb5fe3c93b2ec55e5d5387ad117d63f61e25d017dcf58cc.5ce474d2.png)

![](/assets/image-20191021-163327.738525ff2b84c207d192141fbb7774a6f23ab67899feb9cbba612d9c4e490d6b.5ce474d2.png)

Parameters:

- **Incremental**: Use when importing data on a schedule, so that only users or conversations created since the last run are imported. (See the configuration sketch after the next section.)

### Import From Tags and Segments

From Source, choose tags or segments.

![](/assets/image-20191021-163335.e3abc2f92bae7da6366e17d05abdd8ec36628fd85e2cfabd83dc65eb7aea1002.5ce474d2.png)

![](/assets/image-20191021-163342.ba298752c9b7ab290168b0d743fb9931301c223806f3c4d527b2a81ff24444eb.5ce474d2.png)
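The **Incremental** option in the TD Console corresponds to the `incremental` key in the connector configuration used in the command-line section later on this page. As a minimal sketch, a scheduled users import would look like this (the access token is a placeholder):

```
in:
  type: intercom
  access_token: xxxxxxx
  target: users
  incremental: true    # import only users created since the last run
out:
  mode: append
```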
### Data Preview

You can see a [preview](/products/customer-data-platform/integration-hub/batch/import/previewing-your-source-data) of your data before running the import by selecting Generate Preview. Data preview is optional, and you can safely skip to the next page of the dialog if you choose to.

1. Select **Next**. The Data Preview page opens.
2. If you want to preview your data, select **Generate Preview**.
3. Verify the data.

### Data Placement

For data placement, select the target database and table where you want your data placed and indicate how often the import should run.

1. Select **Next**. Under Storage, create a new database or select an existing one, and create a new table or select an existing one, for where you want to place the imported data.
2. Select a **Database** > **Select an existing** or **Create New Database**.
3. Optionally, type a database name.
4. Select a **Table** > **Select an existing** or **Create New Table**.
5. Optionally, type a table name.
6. Choose the method for importing the data:
   - **Append** (default): Data import results are appended to the table. If the table does not exist, it is created.
   - **Always Replace**: Replaces the entire content of an existing table with the result output of the query. If the table does not exist, a new table is created.
   - **Replace on New Data**: Replaces the entire content of an existing table with the result output only when there is new data.
7. Select the **Timestamp-based Partition Key** column. If you want to set a different partition key seed than the default key, you can specify a long or timestamp column as the partitioning time. By default, the connector uses upload_time with the add_time filter as the time column.
8. Select the **Timezone** for your data storage.
9. Under **Schedule**, choose when and how often you want to run this query.

#### Run Once

1. Select **Off**.
2. Select **Scheduling Timezone**.
3. Select **Create & Run Now**.

#### Repeat Regularly

1. Select **On**.
2. Select the **Schedule**. The UI provides four options: *@hourly*, *@daily*, and *@monthly*, or a custom *cron* expression.
3. You can also select **Delay Transfer** and add a delay to the execution time.
4. Select **Scheduling Timezone**.
5. Select **Create & Run Now**.

After your transfer has run, you can see the results in **Data Workbench** > **Databases**.

## Details

Name your Transfer and select **Done** to start.

![](/assets/image-20191021-163431.eb0a775efe14f22aa08cc9d933172ac06f1c5bfc04d45f03b66a17a1054affcb.5ce474d2.png)

# Use Command Line

## Install the td Command v0.11.9 or Later

Install the newest [TD Toolbelt](https://toolbelt.treasuredata.com/).

```
$ td --version
0.15.0
```

## Create Configuration File

Prepare a configuration file (for example, `load.yml`) with your Intercom account access information, as shown in the following examples.

Import users:

```
in:
  type: intercom
  access_token: xxxxxxx
  target: users
  incremental: false
out:
  mode: append
```

Import conversations:

```
in:
  type: intercom
  access_token: xxxxxxx
  target: conversations
  incremental: false
out:
  mode: append
```

Import segments:

```
in:
  type: intercom
  access_token: xxxxxxx
  target: segments
out:
  mode: append
```

Import tags:

```
in:
  type: intercom
  access_token: xxxxxxx
  target: tags
out:
  mode: append
```

#### Access Token

The preceding example dumps Intercom's `users` objects. Here, `access_token` is a valid access token obtained from Intercom. Using the OAuth flow through TD Console is recommended. Alternatively, your [Personal Access Token](https://developers.intercom.com/docs/personal-access-tokens) can be used as the `access_token` instead of the OAuth flow.

#### Target

Use the `target` option to select which data to fetch: `users`, `conversations`, `tags`, or `segments`.
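Targets such as `tags` and `segments` do not set `incremental` and may not include a usable time column. As noted in the Execute Load Job section below, you can add one with the `add_time` filter. A minimal sketch, assuming the keys documented for that filter (see the linked add_time filter page for the exact syntax):

```
in:
  type: intercom
  access_token: xxxxxxx
  target: tags
filters:
  # add a "time" column set to the upload time of each record
  - type: add_time
    to_column: {name: time, type: timestamp}
    from_value: {mode: upload_time}
out:
  mode: append
```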
## Preview Data (Optional)

You can preview the data to be imported using the command `td connector:preview`.

```
$ td connector:preview load.yml
+-----------+----------------+----------------+-----
| id:string | user_id:string | email:string   | ...
+-----------+----------------+----------------+-----
| "1"       | "33"           | "xxxx@xxx.com" |
| "2"       | "34"           | "yyyy@yyy.com" |
| "3"       | "35"           | "zzzz@zzz.com" |
| "4"       | "36"           | "aaaa@aaa.com" |
| "6"       | "37"           | "bbbb@bbb.com" |
+-----------+----------------+----------------+-----
```

## Execute Load Job

Submit the load job. It may take a couple of hours depending on the data size. You must specify the database and table where the data will be stored.

It is recommended to specify the `--time-column` option, because Treasure Data's storage is partitioned by time; this option assigns a time-format column as the partitioning key. If the option is not given, the Data Connector chooses the first `long` or `timestamp` column as the partitioning time. The column specified by `--time-column` must be of either `long` or `timestamp` type. If your data doesn't have a time column, you can add one using the `add_time` filter option. For more details, see the [add_time filter plugin](https://docs.treasuredata.com/smart/project-product-documentation/add_time-filter-function).

```
$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column created_at
```

The preceding command assumes that you have already created the database (*td_sample_db*) and table (*td_sample_table*). If the database or table does not exist in TD, the command fails. Create the database and table [manually](https://docs.treasuredata.com/smart/project-product-documentation/data-management), or use the `--auto-create-table` option with the `td connector:issue` command to create them automatically:

```
$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column created_at --auto-create-table
```

# Scheduled Execution

You can schedule periodic Data Connector execution for periodic Intercom imports. We configure our scheduler carefully to ensure high availability. By using this feature, you no longer need a `cron` daemon in your local data center.

## Create the Schedule

Create a new schedule using the `td connector:create` command. The name of the schedule, the cron-style schedule, the database and table where the data will be stored, and the Data Connector configuration file are required.

```
$ td connector:create \
    daily_intercom_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml
```

The `cron` parameter also accepts three shortcut options: `@hourly`, `@daily`, and `@monthly`.

By default, the schedule is set up in the UTC timezone. You can set the schedule in a different timezone using the `-t` or `--timezone` option. The `--timezone` option only supports extended timezone formats such as 'Asia/Tokyo' and 'America/Los_Angeles'. Timezone abbreviations such as PST and CST are *not* supported and may lead to unexpected schedules.

## List the Schedules

You can see the list of scheduled entries with `td connector:list`.

```
$ td connector:list
```

## Show the Setting and History of Schedules

`td connector:show` shows the execution settings of a schedule entry:

```
$ td connector:show daily_intercom_import
```

`td connector:history` shows the execution history of a schedule entry. To investigate the results of an individual execution, use `td job <jobid>`.

```
$ td connector:history daily_intercom_import
```

## Delete the Schedule

`td connector:delete` removes the schedule.

```
$ td connector:delete daily_intercom_import
```
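Putting the scheduling commands together, a typical session might look like the following sketch. The `-t Asia/Tokyo` flag is shown for illustration; without it, the schedule defaults to UTC as noted above.

```
# Create a daily schedule that runs at 00:10 in the Asia/Tokyo timezone
$ td connector:create \
    daily_intercom_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml \
    -t Asia/Tokyo

# Confirm the schedule exists and inspect past runs
$ td connector:list
$ td connector:history daily_intercom_import

# Remove the schedule when it is no longer needed
$ td connector:delete daily_intercom_import
```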