# Dynalyst Import Integration

[Learn more about Dynalyst Export Integration](/int/dynalyst-export-integration).

The Dynalyst data connector enables you to import data from JSON, TSV, and CSV files stored in your S3 buckets into Treasure Data's customer data platform. Dynalyst uses AWS S3 as its storage layer, so the import process is similar to importing data from AWS S3.

![](/assets/image2020-11-19_8-34-57.c2df22233c53cb13e754ee1bd4b9a05966488bb70abd9438cb16826b9e154e6b.3872a78d.png)

You can also use this connector to export data to Dynalyst. See [Dynalyst Export Integration](/int/dynalyst-export-integration).

You must complete the connection and authentication setup for Dynalyst before running your imports or query exports.

## Import from Dynalyst via TD Console

### Completing Authentication Setup

1. Review and complete the following in [Dynalyst Import and Export Integration](/int):
   - Prerequisites
   - Create a New Connection
2. Search for your Dynalyst Authentication.
3. Select **New Source**.

![](/assets/image-20191015-190024.8fa79966d61bf202e526a92fc7201937cda6b7d4f33147acd96bb4d001adc388.3872a78d.png)

### Defining a New Source

1. In the New Source window, provide the name of the bucket that contains the files that you want to import.
   - **Path prefix:** Configures the source to import all files whose paths start with the specified prefix (e.g., /path/to/customers.csv).
   - **Path regex:** Configures the source to import all files whose paths match the specified regex pattern. See the sketch after this section for how the two matching modes differ.

![](/assets/image-20191015-190914.03854e07ba679cf02dce13515a6fc519b33a111bb68458bb50402c2ba2d62f39.3872a78d.png)
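The difference between the two matching modes can be illustrated with a short, hypothetical sketch. The object keys, prefix, and regex below are made-up examples and are not part of the connector; the actual matching is performed by the data connector against the keys in your S3 bucket.

```python
import re

# Hypothetical object keys in an S3 bucket (illustration only).
keys = [
    "/path/to/customers.csv",
    "/path/to/customers_2020-11-19.csv",
    "/path/to/archive/customers_old.csv",
    "/other/orders.tsv",
]

# Path prefix: selects every file whose key starts with the prefix.
prefix = "/path/to/customers"
prefix_matches = [k for k in keys if k.startswith(prefix)]
# -> ['/path/to/customers.csv', '/path/to/customers_2020-11-19.csv']

# Path regex: selects every file whose key matches the pattern,
# e.g., only dated customer files directly under /path/to/.
pattern = re.compile(r"^/path/to/customers_\d{4}-\d{2}-\d{2}\.csv$")
regex_matches = [k for k in keys if pattern.match(k)]
# -> ['/path/to/customers_2020-11-19.csv']

print(prefix_matches)
print(regex_matches)
```

A prefix is the simpler choice when every file you want shares a common path; a regex is useful when you need only a structured subset, such as files with a date in their names.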
### Previewing the Source Data

1. In the source preview window, select **Advanced Settings** to make any adjustments needed for the import. For example, you can change the import parser from CSV to JSON, set line delimiters, and so on.
2. Select **Next**.

![](/assets/image-20191015-191001.02c0a2fdf59c8192e51edc9855c363204be523c316ce205062b4556886da6fb8.3872a78d.png)

### Data Placement

For data placement, select the target database and table where you want your data placed and indicate how often the import should run.

1. Select **Next**. Under Storage, create a new database and table, or select existing ones, to hold the imported data.
2. Select a **Database** > **Select an existing** or **Create New Database**.
3. Optionally, type a database name.
4. Select a **Table** > **Select an existing** or **Create New Table**.
5. Optionally, type a table name.
6. Choose the method for importing the data:
   - **Append** (default): Data import results are appended to the table. If the table does not exist, it is created.
   - **Always Replace**: Replaces the entire content of an existing table with the result output. If the table does not exist, a new table is created.
   - **Replace on New Data**: Replaces the entire content of an existing table with the result output only when there is new data.
7. Select the **Timestamp-based Partition Key** column. If you want a partition key seed other than the default, you can specify a long or timestamp column as the partitioning time. By default, the time column is upload_time with the add_time filter.
8. Select the **Timezone** for your data storage.
9. Under **Schedule**, choose when and how often you want to run this import.

#### Run Once

1. Select **Off**.
2. Select **Scheduling Timezone**.
3. Select **Create & Run Now**.

#### Repeat Regularly

1. Select **On**.
2. Select the **Schedule**. The UI provides four options: *@hourly*, *@daily*, *@monthly*, or custom *cron*.
3. Optionally, select **Delay Transfer** and add a delay to the execution time.
4. Select **Scheduling Timezone**.
5. Select **Create & Run Now**.

After your transfer has run, you can see the results of your transfer in **Data Workbench** > **Databases**.

#### Naming Your Source

1. Type a name for your source.
2. Select **Done**.

Your source job runs according to the schedule you specified.

![](/assets/image-20191015-191337.38712cacf8e163ef5318a1583e50c9e09b8650f51f18414f43fec885b1a89cdd.3872a78d.png)
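After the transfer has completed, you can also verify the imported rows programmatically rather than through the TD Console. The following is a minimal sketch using the pytd client library; the `TD_API_KEY` environment variable, the `dynalyst_db` database, and the `customers` table are assumptions standing in for whatever you configured under Data Placement, and the endpoint depends on your region.

```python
import os

import pytd

# Hypothetical names; substitute the database and table you chose under
# Data Placement, and your own API key and regional endpoint.
client = pytd.Client(
    apikey=os.environ["TD_API_KEY"],
    endpoint="https://api.treasuredata.com",
    database="dynalyst_db",
)

# Count the rows that landed in the target table after the import ran.
result = client.query("SELECT COUNT(1) AS imported_rows FROM customers")
print(result["columns"], result["data"])

client.close()
```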