# Repro Import Integration

[Learn more about Repro Export Integration](/int/repro-export-integration).

You can use the Repro Import Integration to ingest files from your Amazon S3 buckets, with parameters you can customize for easy configuration.

## Prerequisites

- Basic knowledge of Treasure Data, including the TD [Toolbelt](https://toolbelt.treasuredata.com/).
- A Repro application ID, Access key ID, and Secret access key.

## Limitations

- If you enter a file name pattern and select Incremental?, the data does not load. This is because Repro does not put new data into the old folder; it creates a new folder every time.

## Using the TD Console to Create Your Connection

### Create a New Connection

When you create a data connection, you must provide authentication to access the integration. In Treasure Data, configure the authentication and then specify the source information.

1. Open **TD Console**.
2. Navigate to **Integrations Hub > Catalog**.
3. Search for and select **Repro**.

   ![](/assets/image-20200527-022110.ad047e132d984058503e6ec7184f2727f55af90bfa0f6bc7493b0b8010ceb4a9.44e271e2.png)

   The following dialog opens:

   ![](/assets/screen-shot-2020-05-26-at-09.00.48.c61c96c4aa63651afee638cba4aa2688e9bddf13342bf43ddfd9c1c4ef14bee4.44e271e2.png)

4. Enter the required information:
   - Region. The region of your Repro application (for example, `ap-northeast-1`, `us-east-1`, …).
   - Authentication Method. Select basic.
   - Access key ID. Enter the access key ID you obtained from Repro.
   - Secret access key. Enter the secret access key you obtained from Repro.
5. Select **Continue**.
6. Enter a name for your connection.

   ![](/assets/screen-shot-2020-05-26-at-09.06.52.d66e64c13225f849390aaedb9fd1820b079becbf44fd10d08cf7289f716ce59d.44e271e2.png)

7. Select **Done**.

### Transfer Your Repro Account Data to Treasure Data

After creating the authenticated connection, you are automatically taken to Authentications.

1. Search for the connection you created.

   ![](/assets/screen-shot-2020-05-26-at-09.07.56.605693a219b797f0e38c9621e695cd80ac3e6fad7e15b6e85a02d4b09f1c975d.44e271e2.png)

2. Select **New Source**.

### Create Your Source

1. Type a name for your **Source** in the Data Transfer field.

   ![](/assets/screen-shot-2020-05-26-at-09.08.53.41f588232923eb4fa0b384b686f7916d10d89a0656c2e53bde588fb388ea82e7.44e271e2.png)

2. Select **Next**.

   ![](/assets/screen-shot-2020-05-26-at-09.09.33.e6b5b5aa52ec41642f3523e16fe15a312fa4af642a1dfd6ff5ac18263a399eee.44e271e2.png)

3. Edit the following parameters in your source table.

| **Parameters** | **Description** |
| --- | --- |
| Bucket | The bucket where your Repro application is located, for example, `repro-data-for-outer-production`. |
| App ID | Your Repro application ID. |
| Upload Time | The specific time at which you would like to ingest the data (`YYYYMMDDHH` format). |
| Filename pattern | Use a regexp to match file paths. If a file path doesn't match the specified pattern, the file is skipped. For example, if you specify the pattern `.csv$`, then any file whose path doesn't end in `.csv` is skipped (see the sketch after this table). Read more about [regular expressions](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions). |
| Filter by Modified Time | Select if you would like to use modified time as the main criterion to load data. |
| Modified after *(Available if Filter by Modified Time is selected)* | Set this timestamp so that the first execution skips files that were modified before it. For example, `2019-06-03T10:30:19.806Z`. |
| Incremental by Modified Time *(Available if Filter by Modified Time is selected)* | Select to ingest only new data since the previous ingestion. |
| Incremental? *(Available if Filter by Modified Time is selected)* | Select to ingest only new data since the previous ingestion. |
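The Filename pattern parameter is an ordinary regular expression applied to each file path. As a rough illustration of the matching behavior, here is a minimal Python sketch; the paths are hypothetical examples, and the connector's regex engine may differ slightly from Python's `re` module.

```python
import re

# Hypothetical file paths as they might appear in the Repro S3 bucket;
# the actual folder layout depends on your application.
paths = [
    "repro/my_app/2020052709/events_001.csv",
    "repro/my_app/2020052709/events_001.csv.gz",
    "repro/my_app/2020052709/manifest.json",
]

# The Filename pattern is matched against each path; files whose
# paths do not match are skipped during ingestion.
pattern = re.compile(r"\.csv$")

loaded = [p for p in paths if pattern.search(p)]
print(loaded)  # only the path ending in .csv would be ingested
```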
### Configure Your Data

1. Select **Next**. The Data Settings page opens.
2. Optionally, edit the data settings, or skip this page.

   ![](/assets/screen-shot-2020-05-26-at-09.39.53.01af20027a538b26831e3017f3f8af4717773c68fd139c67813d88d0fe8d3168.44e271e2.png)

### Data Preview

You can see a [preview](/products/customer-data-platform/integration-hub/batch/import/previewing-your-source-data) of your data before running the import by selecting Generate Preview. Data preview is optional, and you can safely skip to the next page of the dialog if you choose to.

1. Select **Next**. The Data Preview page opens.
2. If you want to preview your data, select **Generate Preview**.
3. Verify the data.

### Data Placement

For data placement, select the target database and table where you want your data placed and indicate how often the import should run.

1. Select **Next**. Under Storage, create a new database or select an existing one, and create a new table or select an existing one, for where you want to place the imported data.
2. Select a **Database** > **Select an existing** or **Create New Database**.
3. Optionally, type a database name.
4. Select a **Table** > **Select an existing** or **Create New Table**.
5. Optionally, type a table name.
6. Choose the method for importing the data:
   - **Append** (default): Data import results are appended to the table. If the table does not exist, it is created.
   - **Always Replace**: Replaces the entire content of an existing table with the result output of the query. If the table does not exist, a new table is created.
   - **Replace on New Data**: Replaces the entire content of an existing table with the result output only when there is new data.
7. Select the **Timestamp-based Partition Key** column. If you want to set a partition key seed other than the default key, you can specify a long or timestamp column as the partitioning time. As a default time column, it uses `upload_time` with the `add_time` filter.
8. Select the **Timezone** for your data storage.
9. Under **Schedule**, choose when and how often you want to run this query.

#### Run once

1. Select **Off**.
2. Select **Scheduling Timezone**.
3. Select **Create & Run Now**.

#### Repeat Regularly

1. Select **On**.
2. Select the **Schedule**. The UI provides four options: *@hourly*, *@daily*, and *@monthly*, or custom *cron*.
3. You can also select **Delay Transfer** and add a delay to the execution time.
4. Select **Scheduling Timezone**.
5. Select **Create & Run Now**.

After your transfer has run, you can see the results of your transfer in **Data Workbench** > **Databases**.
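If you prefer to verify the results programmatically rather than in the console, Treasure Data's official [pytd](https://github.com/treasure-data/pytd) Python client can run a quick query against the destination table. This is a minimal sketch; `repro_import` and `repro_events` are hypothetical placeholders for the database and table you chose in Data Placement, and your API key is assumed to be set in the `TD_API_KEY` environment variable.

```python
import pytd

# Spot-check the imported rows. The database and table names below are
# hypothetical; substitute the ones you selected in Data Placement.
# The API key is read from TD_API_KEY when not passed explicitly.
client = pytd.Client(database="repro_import")

result = client.query("SELECT COUNT(1) AS cnt FROM repro_events")
print(result["columns"], result["data"])  # e.g. ['cnt'] [[12345]]
```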