# Amazon Redshift Import Integration

Learn more about [Amazon Redshift Export Integration](/int/amazon-redshift-export-integration).

You can connect Amazon Redshift to import data into Treasure Data.

# Prerequisites

- A Redshift instance.
- The Treasure Data TD Toolbelt installed.
- Basic knowledge of Amazon Redshift.
- Basic knowledge of Treasure Data, including the [TD Toolbelt](https://toolbelt.treasuredata.com/).

# Use the TD Console to Create Your Connection

## Create a New Connection

When you configure a data connection, you provide authentication to access the integration. In Treasure Data, you configure the authentication and then specify the source information.

1. Open TD Console.
2. Navigate to **Integrations Hub** > **Catalog**.
3. Click the search icon on the far right of the Catalog screen and enter **Amazon Redshift**.
4. Hover over the Amazon Redshift connector and select **Create Authentication**.
   ![](/assets/amazonredshift.72b30214eceebe7b0fea54137a0622f0286279c4112f82ec8cef094f631cb6ae.b6850247.png)
5. Specify the required credentials:
   - **Host**: The host information of the source database, such as an IP address.
   - **Port**: The connection port on the source instance. The Amazon Redshift default is 5439.
   - **User**: The username used to connect to the source database.
   - **Password**: The password used to connect to the source database.
   - **Use SSL**: Select this checkbox to connect using SSL.
   - **JDBC Connection options**: Any special JDBC connection options required by the source database (optional).
   - **Region**: The AWS region in which your Redshift instance is hosted.
   - **Socket connection timeout**: Timeout, in seconds, for the socket connection. The default is 300.
   - **Network timeout**: Timeout, in seconds, for network socket operations. 0 means no timeout.
   - **Rows per batch**: The number of rows to fetch at a time.
6. Select **Continue** after entering the required connection details.
7. Name the connection so you can find it later, should you need to modify any of the connection details.
8. Optionally, select **Share with others** if you would like to share this connection with other users in your organization.
9. Select **Done**.

If the connection is successful, it appears in your list of authentications.

## Transfer Your Redshift Data to Treasure Data

After creating the authenticated connection, you are automatically taken to Authentications.

1. Search for the connection you created.
2. Select **New Source**.
   ![](/assets/image-20191204-002533.4e8d301835f3c81faeaf3800f2f86f1ae4dbf07db2db305a1c99b17e25148cf0.b6850247.png)

### Connection

1. Type a name for your **Source** in the Data Transfer field.
2. Click **Next**.
   ![](/assets/image-20200812-000957.a2316a234651e0a456f9a11a74a550fe45c1d50859c458d41d34023fe37c4e53.b6850247.png)
3. Specify the details of the database and table that you want to ingest data from. These fields correspond to the clauses of a SQL query; see the sketch after this list.
   - **Database name**: The name of the database you are transferring data from (for example, `your_database_name`).
   - **Use custom SELECT query?**: Select this option if you need more than a simple `SELECT (columns) FROM table WHERE (condition)`.
   - **Schema**: The schema to transfer data from.
   - **SELECT columns**: If there are only specific columns you would like to pull data from, list them here. Otherwise, all columns are transferred.
   - **Table**: The table from which you would like to import the data.
   - **WHERE condition**: If you need additional specificity on the data retrieved from the table, you can specify it here as part of a `WHERE` clause.
   - **ORDER BY**: Specify if you need the records ordered by a particular field.
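For reference, the Connection fields above map onto an ordinary SQL query run against your Redshift database. The following is a minimal sketch of such a query; the schema `public`, table `events`, and column names are hypothetical illustrations, not values from your actual source:

```sql
-- Hypothetical illustration of the Connection fields:
--   Schema          = public
--   Table           = events
--   SELECT columns  = user_id, event_name, created_at
--   WHERE condition = created_at >= '2024-01-01'
--   ORDER BY        = created_at
SELECT user_id, event_name, created_at
FROM public.events
WHERE created_at >= '2024-01-01'
ORDER BY created_at;
```

If you need joins, aggregation, or anything else beyond this shape, select **Use custom SELECT query?** and write the statement yourself.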
### Data Settings

1. Select **Next**. The Data Settings page opens.
   ![](/assets/image-20200812-002114.9e0b1777e5f750d145c3639467f22108ac794d3bd6b52f2b7592d2abe9f1812f.b6850247.png)
2. Optionally, edit the data settings or skip this page of the dialog:
   - **Incremental**: When you want to run this transfer repeatedly, select this checkbox to import only the data added since the last time the import ran.
   - **Default timezone**: The timezone to use for the import. The default is UTC.
   - **After SELECT**: SQL that is executed after the SELECT query, in the same transaction.
   - **Column Options**: Select this option to modify the type of a column before importing it.
3. Select **Next**.

### Data Preview

You can see a [preview](/products/customer-data-platform/integration-hub/batch/import/previewing-your-source-data) of your data before running the import by selecting Generate Preview. Data preview is optional, and you can safely skip to the next page of the dialog if you choose to.

1. Select **Next**. The Data Preview page opens.
2. If you want to preview your data, select **Generate Preview**.
3. Verify the data.

### Data Placement

For data placement, select the target database and table where you want your data placed and indicate how often the import should run.

1. Select **Next**. Under Storage, create a new database or select an existing one, and create a new table or select an existing one, as the destination for the imported data.
2. Select **Database** > **Select an existing** or **Create New Database**.
3. Optionally, type a database name.
4. Select **Table** > **Select an existing** or **Create New Table**.
5. Optionally, type a table name.
6. Choose the method for importing the data:
   - **Append** (default): Data import results are appended to the table. If the table does not exist, it is created.
   - **Always Replace**: Replaces the entire content of an existing table with the result output of the query. If the table does not exist, a new table is created.
   - **Replace on New Data**: Replaces the entire content of an existing table with the result output only when there is new data.
7. Select the **Timestamp-based Partition Key** column. If you want to use a partition key seed other than the default, you can specify a long or timestamp column as the partitioning time. By default, the upload time is used as the time column, via the add_time filter.
8. Select the **Timezone** for your data storage.
9. Under **Schedule**, choose when and how often you want to run this query.

#### Run Once

1. Select **Off**.
2. Select **Scheduling Timezone**.
3. Select **Create & Run Now**.

#### Repeat Regularly

1. Select **On**.
2. Select the **Schedule**. The UI provides four options: *@hourly*, *@daily*, *@monthly*, or custom *cron*.
3. You can also select **Delay Transfer** and add a delay to the execution time.
4. Select **Scheduling Timezone**.
5. Select **Create & Run Now**.

After your transfer has run, you can see the results of your transfer in **Data Workbench** > **Databases**.
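Once the run completes, a quick query against the destination table is an easy way to confirm the import. A minimal sketch, assuming a hypothetical destination table named `redshift_import` (every Treasure Data table carries a unix-timestamp `time` column; the partition key settings above determine how it is populated):

```sql
-- Confirm the import: count rows and find the newest record's timestamp.
-- Run against your destination database from Data Workbench > Queries.
SELECT COUNT(1)  AS imported_rows,
       MAX(time) AS latest_unix_time
FROM redshift_import;
```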