Learn more about PostgreSQL Import Integration.

The Data Connector for PostgreSQL enables you to import data directly from your PostgreSQL database to Treasure Data.

For sample workflows showing how to import data from your PostgreSQL database, see Treasure Boxes.

Continue to the following topics:

Prerequisites

Using TD Console

Create a New Connection

When you configure a data connection, you provide authentication to access the integration. In Treasure Data, you configure the authentication and then specify the source information.

  1. Open TD Console.

  2. Navigate to Integrations Hub > Catalog.

  3. Search for and select PostgreSQL. Select Create.


  4. The following dialog opens.


  5. Enter the required credentials and set the parameters. Select Continue.



Host: The host information of the source database, such as an IP address.

Port: The connection port on the source instance. The PostgreSQL default is 5432.

User: The username used to connect to the source database.

Password: The password used to connect to the source database.

Use SSL: Select this checkbox to connect using SSL.

Specify SSL version: Select which SSL version to use for the connection.

Socket connection timeout: Timeout, in seconds, for establishing the socket connection (default is 300).

Network timeout: Timeout, in seconds, for network socket operations. 0 means no timeout.
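For reference, the authentication fields above correspond to the standard PostgreSQL (libpq-style) connection keywords. The sketch below is illustrative only; the function name and sample values are hypothetical, not part of Treasure Data, and the mapping of the SSL checkbox to `sslmode` is an assumption:

```python
# Sketch: mapping the TD Console authentication fields onto generic
# libpq-style connection keywords. All values are placeholders.

def build_dsn(host, port, user, password, use_ssl, connect_timeout=300):
    """Build a libpq-style DSN string from the connector form fields."""
    parts = {
        "host": host,                        # "Host" field (IP address or hostname)
        "port": port,                        # "Port" field (PostgreSQL default: 5432)
        "user": user,                        # "User" field
        "password": password,                # "Password" field
        "sslmode": "require" if use_ssl else "disable",  # "Use SSL" checkbox (assumed mapping)
        "connect_timeout": connect_timeout,  # "Socket connection timeout" (seconds)
    }
    return " ".join(f"{k}={v}" for k, v in parts.items())

dsn = build_dsn("203.0.113.10", 5432, "td_reader", "secret", use_ssl=True)
print(dsn)
```

A string like this is what a PostgreSQL client would use to reach the same database, which makes it a convenient way to verify the credentials you enter in the dialog.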

Name the Connection

  1. Type a name for your connection. If you would like to share this connection with other users in your organization, select Share with others. If this box is unchecked, then the connection is visible only to you.

  2. Select Done.

Transfer Your PostgreSQL Account Data to Treasure Data

After creating the authenticated connection, you are automatically taken to Authentications.

  1. Search for the connection you created. 

  2. Select New Source. The Create Source dialog opens.

Connection

  1. Type a name for your Source in the Data Transfer field.


  2. Select Next.

Source Table

  1. Edit the following parameters:

Database name: The name of the database you are transferring data from. For example, your_database_name.

Use custom SELECT query?: Use this option if you need more than a simple SELECT (columns) FROM table WHERE (condition).

SELECT columns: If you want to pull data from only specific columns, list them here. Otherwise, all columns are transferred.

Table: The table from which you want to import the data.

WHERE condition: To restrict the rows retrieved from the table, specify the body of a WHERE clause here.

ORDER BY: Specify a field if you need the records ordered by it.
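Taken together, the Source Table parameters describe a single SELECT statement. A minimal sketch of how they combine (the function and its arguments are illustrative, not a Treasure Data API):

```python
def compose_import_query(table, columns=None, where=None, order_by=None):
    """Assemble the SELECT statement implied by the Source Table settings."""
    cols = ", ".join(columns) if columns else "*"  # "SELECT columns" (default: all)
    query = f"SELECT {cols} FROM {table}"          # "Table"
    if where:
        query += f" WHERE {where}"                 # "WHERE condition"
    if order_by:
        query += f" ORDER BY {order_by}"           # "ORDER BY"
    return query

q = compose_import_query("events", columns=["id", "created_at"],
                         where="created_at > '2024-01-01'", order_by="id")
print(q)
# SELECT id, created_at FROM events WHERE created_at > '2024-01-01' ORDER BY id
```

If your import needs anything beyond this shape, such as a join or a subquery, that is when to enable "Use custom SELECT query?" instead.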


Data Settings

  1. Select Next. The Data Settings page opens.

  2. Optionally, edit the data settings or skip this page of the dialog.


Incremental: When you want to run this transfer repeatedly, select this checkbox to import only data added since the last run.

Rows per batch: Extremely large datasets can lead to memory issues and, subsequently, failed jobs. Use this setting to break the import job into batches of the given number of rows, reducing the chance of memory issues and failed jobs.

Default timezone: The timezone to use when doing the import.

After SELECT: This SQL is executed after the SELECT query, in the same transaction.

Column Options: Select this option to modify the type of a column before importing it. Select Save to save any data settings you have entered.

Default Column Options: Select this option to define data types according to default SQL types before importing. Select Save to save any data settings you have entered. This option is not available in the TD Console; set it using the TD CLI or TD Workflow.
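The Incremental setting can be pictured as keeping a watermark of the last imported value and filtering on it in the next run. The sketch below is a simplified illustration of that idea, not Treasure Data's actual mechanism, which tracks incremental state for you:

```python
def incremental_fetch(rows, watermark_column, last_value):
    """Return only rows newer than the stored watermark, plus the new watermark."""
    new_rows = [r for r in rows if r[watermark_column] > last_value]
    new_watermark = max((r[watermark_column] for r in new_rows), default=last_value)
    return new_rows, new_watermark

rows = [{"id": 1}, {"id": 2}, {"id": 3}]
first, wm = incremental_fetch(rows, "id", last_value=0)    # first run: imports all rows
rows.append({"id": 4})                                     # new data arrives at the source
second, wm = incremental_fetch(rows, "id", last_value=wm)  # next run: imports only id 4
```

This is why incremental imports work best with a monotonically increasing column, such as a sequence-backed ID or a creation timestamp.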


Preview


Data Placement


Further Information