# Mailchimp Import Integration

[Learn more about Mailchimp Export Integration.](/int/mailchimp-export-integration)

You can use this data connector to import List Members, Member Activity, Campaigns, and Lists data into Treasure Data. You can use the same connection to create and update MailChimp lists with Treasure Data.

## Prerequisites

- Basic knowledge of Treasure Data, including the [TD Toolbelt](https://toolbelt.treasuredata.com).
- A MailChimp account that can grant permissions to Treasure Data.

## Rate Limit

Each MailChimp user account is permitted up to 10 simultaneous HTTP connections, and there is currently no option to raise this limit on a per-customer basis. When running simultaneous jobs, ensure that the total number of HTTP connections across all jobs doesn't exceed 10. Otherwise, all jobs will fail, because MailChimp closes all HTTP connections of a user account when the 10-connection limit is exceeded.

Using batch operations to import Member Activity reduces the number of requests sent to the MailChimp API and therefore helps you stay within the rate limit. See the Member Activity section below.
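For example, if you plan to run three import jobs simultaneously, you can cap each job at 3 connections so the total stays at 9. The following is a minimal sketch using the `http_max_connections` option described under Advanced Configuration later on this page; the API key and list ID are placeholders:

```yaml
in:
  type: mailchimp
  apikey: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-dcxx
  target: list_members
  list_id: xxxxxxxxxxxxx
  # 3 simultaneous jobs x 3 connections each = 9, which stays under MailChimp's limit of 10
  http_max_connections: 3
out:
  mode: append
```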
## Use TD Console

### Create a New Connection

Go to Integrations Hub > Catalog, then search for and select the Mailchimp tile. A dialog opens in which you provide the required credentials. Specify an Authentication Method.

### Authenticating Your Connection

The method you use to authenticate Treasure Data with MailChimp affects the steps you take to enable the data connector. You can choose to authenticate using:

- API Key
- OAuth

### Using an API Key to Authenticate

You can specify a [MailChimp API Key](https://mailchimp.com/help/about-api-keys/?utm_source=mc-api&utm_medium=docs&utm_campaign=apidocs) to authorize Treasure Data access. Note that the API key grants full access to your MailChimp account.

### Using OAuth to Authenticate

The OAuth method is currently not supported for JP and IDCF customers.

You can select an existing OAuth connection for MailChimp from the drop-down.

![](/assets/image-20191013-183746.500a5933493f84e0c9854acf4a439081298d5e11324d327d51663e6da5382eaa.4fc380c5.png)

Or you can select the link under **OAuth connection** to create a new one.

### Create a New OAuth Connection

When you select **Click here to connect a new account**, you must sign in to your MailChimp account in the popup window.

![](/assets/image-20191013-183809.d907b70a2d1a239d5b3914941a0460def982bb939763e3a612c94e9a00bb113a.4fc380c5.png)

Signing in to MailChimp generates an OAuth authentication, and you are redirected back to the Treasure Data Catalog page. Repeat the first step (Create a New Connection), choose your new OAuth connection, and finish creating your connection.

![](/assets/image-20191013-183845.2fac9b364dba1b96e2ad03cb4df3354e58a82f33422d9fd2ea71069450d3ac63.4fc380c5.png)

You now have an authenticated connection that you use in the next step to configure the input of data to Treasure Data. Select **Continue** and provide a name for your connector. You have completed the first part of creating a connector.

### Create a New Source

After creating the authenticated connection, you are automatically taken to the Authentications tab. Look for the connection you created and select **New Source**.

![](/assets/image-20191013-183857.787257ef8783db6c7a7f8e84880a62cc051c07ee6d3c83114ee26dc9636fed5c.4fc380c5.png)

#### List Members

In the New Source dialog, choose List Members for the Import Data, choose whether to import data for all or specific lists, enter the Start Time (Optional) and End Time (Optional), and then select **Next**.

![](/assets/image-20191013-183913.97f937322c78618567467cac761a8dc5fded8c924dc9012c5684dc9e7bd57a9d.4fc380c5.png)

When the Incremental loading option is checked, each run imports only members whose information has been updated since the last import.

| **Parameter** | **Description** |
| --- | --- |
| Import data for all lists | When checked, import data for all lists |
| List ID | A valid MailChimp list ID |
| Add more list IDs | Enter more list IDs if needed |
| Start Time | Import members whose information was updated at or after this time (inclusive) |
| End Time | Import members whose information was updated before this time (exclusive) |

Next, you see a preview of your list member data similar to the following dialog. Select **Next**.

![](/assets/image-20191013-183929.39d6451930bf55c408d1b5719493cabff9f0434e124660d13b1121e1a36637c2.4fc380c5.png)

#### Member Activity

In the New Source dialog, choose Member Activity for the Import Data, choose whether to import data for all or specific lists, and then select **Next**.

![](/assets/image-20191013-183943.ac5cf788e3bec84bb5e6a5c8c645fd220d741753b74a5024950757a297c378a7.4fc380c5.png)

Only the last 50 events of a member's activity are available for import.

| **Parameter** | **Description** |
| --- | --- |
| Import data for all lists | When checked, import data for all lists |
| List ID | A valid MailChimp list ID |
| Add more list IDs | Enter more list IDs if needed |
| Use Batch Operations | Importing the activity of 10,000 members normally requires 10,000 requests. Use batch operations to group the requests and send them at once. |

Next, you see a preview of your member activity data similar to the following dialog. Select **Next**.

![](/assets/image-20191013-183954.341221a76db1f067bd5e71680aed1416582a809aec40ba1c00619990b9e78881.4fc380c5.png)

#### Campaigns

In the New Source dialog, choose Campaigns for the Import Data, enter a Start Time (Optional) and End Time (Optional), and then select **Next**.

![](/assets/image-20191013-184007.af1c8f00f4bf639e46abf46d731abbf220ab95d409986c91212e5e55d87489bd.4fc380c5.png)

When the Incremental loading option is checked, each run imports only campaigns created since the last import.

| **Parameter** | **Description** |
| --- | --- |
| Start Time | Import campaigns created at or after this time (inclusive) |
| End Time | Import campaigns created before this time (exclusive) |

Next, you see a preview of your campaign data similar to the following dialog. Select **Next**.

![](/assets/image-20191013-184020.2fb843cac01de350e2bcaed048ed040e9463f2172476c22e1a1b47c161e14571.4fc380c5.png)

#### Lists

In the New Source dialog, choose Lists for the Import Data, enter a Start Time (Optional) and End Time (Optional), and then select **Next**.

![](/assets/image-20191013-184031.4dc220479657d0c2331ee06ac30fd4596f68e0528e15d876c6edd8f6ae9cd52c.4fc380c5.png)

When the Incremental loading option is checked, each run imports only lists created since the last import.

| **Parameter** | **Description** |
| --- | --- |
| Start Time | Import lists created at or after this time (inclusive) |
| End Time | Import lists created before this time (exclusive) |

Next, you see a preview of your list data similar to the following dialog. Select **Next**.

![](/assets/image-20191013-184041.ab90318dabb04505dc55e7ad3b0b53a0346fd25702146f5597fcc369ebeef4d5.4fc380c5.png)

### Transfer to

Select the database and table where you want to transfer the data, as shown in the following dialog:

![](/assets/image-20191013-184055.6ab726c9b2e13a75d2c6d1662ab5e4f428925440df290ffa95d5764123a69899.4fc380c5.png)

If you are creating a new database, check **Create new database** and give your database a name. Do the same with **Create new table**. Select whether to append records to an existing table or replace your existing table. If you want to set a different partition key seed than the default key, you can specify one using the popup menu.

### Schedule

Finally, specify the schedule of the data transfer using the following dialog and select **Start Transfer**:

![](/assets/image-20191013-184108.bcf984edf58a382be5fcf8d1cf368f0bee58971fcee35204dd08e37391ac5caa.4fc380c5.png)

In the When tab, you can specify this as a one-time transfer, or you can schedule an automated recurring transfer. If you selected Once now, select **Start Transfer**. If you selected Repeat… specify your schedule options, then select **Schedule Transfer**. You see the new data transfer in progress listed under the My Input Transfers tab, and a corresponding job is listed in the Jobs section.

After your transfer runs, you can see the results of your transfer in the Databases tab.

![](/assets/image-20191013-184117.5b269953fa6d868d5e47f0ee9710def0d0c5202a6e1f17f73f52bd99a8f86102.4fc380c5.png)

You are ready to start analyzing your data.
## Use Command Line

### Install ‘td’ Command v0.11.9 or Later

You can install the latest [TD Toolbelt](https://toolbelt.treasuredata.com/).

```
$ td --version
0.15.8
```

### Create Configuration File

Prepare a configuration file (for example, `load.yml`) as shown in the following example, with your MailChimp credentials and transfer information.

```yaml
in:
  type: mailchimp
  apikey: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-dcxx
  target: list_members
  list_id: xxxxxxxxxxxxx
  start_time: "2018-05-07T00:00:00Z"
  end_time: "2018-05-08T00:00:00Z"
out:
  mode: append
```

### Preview Data to Import (Optional)

You can preview the data to be imported using the command `td connector:preview`.

```bash
$ td connector:preview load.yml
```

### Execute Load Job

Submit the load job. It may take a couple of hours depending on the data size.

```bash
$ td connector:issue load.yml \
    --database td_sample_db \
    --table td_sample_table \
    --time-column last_changed
```

You must specify the database and table where the data is stored. We recommend specifying the `--time-column` option, because Treasure Data's storage is partitioned by time (see also [data partitioning](https://docs.treasuredata.com/smart/project-product-documentation/data-partitioning-in-treasure-data)). If the option is not given, the data connector selects the first `long` or `timestamp` column as the partitioning time. The column specified by `--time-column` must be of either `long` or `timestamp` type.

If your data doesn't have a time column, you can add one using the `add_time` filter option. For more details, see the [add_time filter plugin](https://docs.treasuredata.com/smart/project-product-documentation/add_time-filter-function).
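As a sketch of that approach, the following configuration adds a `time` column populated with the upload time of each record; the exact filter options are described in the add_time documentation linked above, and the credentials here are placeholders:

```yaml
in:
  type: mailchimp
  apikey: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-dcxx
  target: lists
filters:
  # add a "time" column of type timestamp, set to the time the record is uploaded
  - type: add_time
    to_column:
      name: time
      type: timestamp
    from_value:
      mode: upload_time
out:
  mode: append
```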
You can assign a time-format column to the partitioning key with the `--time-column` option.

The `td connector:issue` command assumes that you have already created the database (`td_sample_db`) and table (`td_sample_table`). If the database or table does not exist in TD, the command will not succeed, so create the database and table [manually](https://docs.treasuredata.com/smart/project-product-documentation/data-management) or use the `--auto-create-table` option with the `td connector:issue` command to create them automatically:

```bash
td connector:issue load.yml \
    --database td_sample_db \
    --table td_sample_table \
    --time-column last_changed \
    --auto-create-table
```

### Scheduled Execution

You can schedule periodic data connector execution for periodic MailChimp imports. We have configured our scheduler carefully to ensure high availability. By using this feature, you no longer need a `cron` daemon in your local data center.

A new schedule can be created using the `td connector:create` command. You must specify: the name of the schedule, a cron-style schedule, the database and table where the data will be stored, and the data connector configuration file.

```bash
td connector:create \
    daily_mailchimp_import \
    "9 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml
```

The `cron` parameter also accepts these three options: `@hourly`, `@daily`, and `@monthly`.

By default, the schedule is set up in the UTC timezone. You can set the schedule in another timezone using the `-t` or `--timezone` option. The `--timezone` option only supports extended timezone formats like 'Asia/Tokyo' and 'America/Los_Angeles'. Timezone abbreviations like PST and CST are *not* supported and may lead to unexpected schedules.
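For example, a sketch of the same schedule evaluated in Japan time rather than UTC, using the `--timezone` option mentioned above (the schedule name is an illustrative placeholder):

```bash
td connector:create \
    daily_mailchimp_import_jst \
    "9 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml \
    --timezone Asia/Tokyo
```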
### Configuration

See the following tables for more details on the available `in` options.

### Basic Configuration

| Option name | Description | Type | Required? | Default value |
| --- | --- | --- | --- | --- |
| auth_method | Auth method. Valid values: api_key, oauth | string | yes | api_key |
| apikey | A valid MailChimp API key | string | yes for api_key auth method | |
| access_token | A valid MailChimp access token | string | yes for oauth auth method | |
| target | Supported data targets: list_members, member_activity, campaigns, lists | string | yes | |
| list_id | A valid MailChimp list ID. Imports all lists when neither list_id nor more_list_ids is present. Only valid for the list_members and member_activity targets | string | no | |
| more_list_ids | Array of valid MailChimp list IDs. Imports all lists when neither list_id nor more_list_ids is present. Only valid for the list_members and member_activity targets | array | no | |
| start_time | The date and time to fetch records from, formatted `yyyy-MM-dd'T'hh:mm:ss'Z'` (e.g. '2018-05-07T00:00:00Z'). Inclusive | string | no | |
| end_time | The date and time to fetch records until, formatted `yyyy-MM-dd'T'hh:mm:ss'Z'` (e.g. '2018-05-08T00:00:00Z'). Exclusive | string | no | |
| incremental | `true` for "mode: append", `false` for "mode: replace" | boolean | no | true |
| use_batch_request | Group all MailChimp API calls into batches and submit the batches instead of making individual API calls | boolean | no | false |

### Advanced Configuration

| Option name | Description | Type | Default value |
| --- | --- | --- | --- |
| http_max_connections | Maximum simultaneous HTTP connections (min: 1, max: 10) | integer | 5 |
| skip_on_invalid_records | `false`: fail fast. `true`: ignore invalid records and errors and continue loading the remaining records | boolean | false |
| max_records_per_request | Maximum records per batch request (min: 10) | integer | 500 |
| max_requests_per_minute | Maximum requests per minute (min: 1, max: 300) | integer | 300 |
| maximum_retries | Maximum retry count per API call | integer | 7 |
| initial_retry_interval_millis | Initial retry interval per API call, in milliseconds | integer | 1000 |
| maximum_retry_interval_millis | Maximum retry interval per API call, in milliseconds | integer | 120000 |
| http_connect_timeout_millis | HTTP connect timeout, in milliseconds | integer | 60000 |
| http_read_timeout_millis | HTTP read timeout, in milliseconds | integer | 300000 |

### Max Simultaneous HTTP Connections

The default value of http_max_connections is 5. When running two or more jobs at the same time, reduce this value so that the total number of HTTP connections across all jobs doesn't exceed MailChimp's rate limit of 10 connections. You can change http_max_connections by using the `http_max_connections` parameter in the CLI configuration file, or by using Advanced Settings in the Preview step of the TD Console.

- Config File

```yaml
in:
  http_max_connections: 5
out:
  mode: append
```

- TD Console

![](/assets/image-20191013-184142.4bcc7748a7ed3e5feb90be10f6926d8ee34f5813c2f8fff485eb0a1436609c86.4fc380c5.png)

### Use Batch Operations

Due to MailChimp's internal implementation, batch operations do not improve performance compared to a normal import (sending synchronous requests). Their purpose is to reduce the number of requests made by the connector. Consider using batch operations, as shown in the sketch after this list, if you experience any of the following:

- Your import fails or cannot complete using non-batch operations.
- Your MailChimp jobs frequently get rate limited due to too many requests made by the connector.
- Your MailChimp jobs receive HTTP 501 or 503 status codes and reach the retry limit for these errors.
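A minimal sketch of a Member Activity import with batch operations enabled; `use_batch_request` and `max_records_per_request` are documented in the configuration tables above, and the credentials are placeholders:

```yaml
in:
  type: mailchimp
  apikey: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-dcxx
  target: member_activity
  list_id: xxxxxxxxxxxxx
  # group individual API calls into batch requests of up to 500 records each
  use_batch_request: true
  max_records_per_request: 500
out:
  mode: append
```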