MySQL Import Integration

This article describes how to use the data connector for MySQL, which allows you to directly import data from MySQL to Treasure Data.

For sample workflows on importing data from MySQL, view Treasure Boxes.

Prerequisites

  • Basic knowledge of Treasure Data
  • Basic knowledge of MySQL
  • A MySQL instance that is reachable from Treasure Data

If you are using MySQL Community Server 5.6 or 5.7 and want to use SSL, set the JDBC parameter enabledTLSProtocols to TLSv1.2 to resolve a compatibility issue with the underlying library of the connector. For other MySQL versions, the integration automatically tries to use the highest available TLS version.
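
For reference, enabledTLSProtocols is a standard MySQL Connector/J connection property and TLSv1.2 is its value. The snippet below shows it as a key=value pair, which would typically go in the JDBC Connection options field described later in this article; the exact input format that field expects is an assumption here:

    enabledTLSProtocols=TLSv1.2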

Create a New Authentication

When you configure a data connection, you provide authentication to access the integration. In Treasure Data, you configure the authentication and then specify the source information.

  1. Open TD Console.

  2. Navigate to Integrations Hub > Catalog.

  3. Search for and select MySQL.

  4. Select Create.

  5. An authentication dialog opens.

  6. Set the following parameters. Select Continue.

Parameters:

  • Host: The host information of the remote database, for example an IP address.
  • Port: The connection port on the remote instance. The MySQL default is 3306.
  • Username: The username used to connect to the remote database.
  • Password: The password used to connect to the remote database.
  • SSL Mode: Learn more about SSL mode.
  • useLegacyDatetimeCode: Learn more about useLegacyDatetimeCode.

Options:

  • JDBC Connection options: Any special JDBC connection options required by the remote database.
  • Socket connection timeout: Timeout (in seconds) for the socket connection. The default is 300.
  • Network timeout: Timeout (in seconds) for network socket operations. 0 means no timeout.

Name Your Connection

  1. Type a name for your connection.
  2. Select Done.

Transfer Your MySQL Account Data to Treasure Data

After creating the authenticated connection, you are automatically taken to Authentications.

  1. Search for the connection you created.
  2. Select New Source.

Connection

  1. Type a name for your Source in the Data Transfer field.

Source Table

Provide the details of the database and table you would like to ingest data from.

  1. Select Next.
  2. Edit the following parameters.

Parameters:

  • Database name: The name of the database you are transferring data from (for example, your_database_name).
  • Use custom SELECT query?: Select this if you need more than a simple SELECT (columns) FROM table WHERE (condition).
  • SELECT columns: If you only want to pull data from specific columns, list them here. Otherwise, all columns are transferred.
  • Table: The table from which you would like to import the data.
  • WHERE condition: If you need additional specificity on the data retrieved from the table, specify it here as you would in a WHERE clause (see the example after this list).
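
As an illustration, the sketch below shows the kind of statement the custom SELECT query option is intended for; the table and column names (customers, id, email, created_at) are hypothetical and not part of the connector:

    -- Hypothetical custom SELECT: import three columns for rows created in 2024 or later
    SELECT id, email, created_at
    FROM customers
    WHERE created_at >= '2024-01-01';

With the simpler fields instead, the same import would use id, email, created_at as the SELECT columns, customers as the Table, and created_at >= '2024-01-01' as the WHERE condition.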

Data Settings

  1. Select Next. The Data Settings page opens.

  2. Optionally, edit the data settings or skip this page of the dialog.

Parameters:

  • Incremental: If you run this transfer repeatedly, this checkbox allows you to import only the data added since the last time the import ran.
  • Rows per batch: Extremely large datasets can lead to memory issues and failed jobs. This setting breaks the import job into batches of the specified number of rows to reduce the chance of memory issues and failed jobs.
  • Default timezone: The timezone used when doing the import. The default is UTC, but you can change it if needed.
  • After SELECT: This SQL is executed after the SELECT query, in the same transaction (see the example after this list).
  • Column Options: If you need to modify the type of a column before importing it, select this option and provide the relevant column details. Select Save to save any advanced settings you have entered.
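
As a sketch of what After SELECT can be used for, the statement below marks the rows that were just read as exported so that a later run can skip them; the table and column names (orders, exported) are hypothetical:

    -- Hypothetical bookkeeping statement, executed after the import SELECT in the same transaction
    UPDATE orders SET exported = 1 WHERE exported = 0;

Running it in the same transaction keeps this bookkeeping consistent with the rows the import query actually read.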

Data Preview

You can see a preview of your data before running the import by selecting Generate Preview. Data preview is optional and you can safely skip to the next page of the dialog if you choose to.

  1. Select Next. The Data Preview page opens.
  2. If you want to preview your data, select Generate Preview.
  3. Verify the data.

Data Placement

For data placement, select the target database and table where you want your data placed and indicate how often the import should run.

  1. Select Next. Under Storage, create a new database or select an existing one, and create a new table or select an existing one, to hold the imported data.

  2. Under Select a Database, select an existing database or select Create New Database.

  3. Optionally, type a database name.

  4. Under Select a Table, select an existing table or select Create New Table.

  5. Optionally, type a table name.

  6. Choose the method for importing the data.

    • Append (default): Data import results are appended to the table. If the table does not exist, it is created.
    • Always Replace: Replaces the entire content of an existing table with the result output of the query. If the table does not exist, a new table is created.
    • Replace on New Data: Replaces the entire content of an existing table with the result output only when there is new data.
  7. Select the Timestamp-based Partition Key column. If you want to use a partition key other than the default, you can specify a long or timestamp column as the partitioning time. By default, the time column is upload_time, added with the add_time filter.

  8. Select the Timezone for your data storage.

  9. Under Schedule, you can choose when and how often you want to run this transfer.

Run once

  1. Select Off.
  2. Select Scheduling Timezone.
  3. Select Create & Run Now.

Repeat Regularly

  1. Select On.
  2. Select the Schedule. The UI provides four options: @hourly, @daily, and @monthly, or custom cron (see the example after this list).
  3. You can also select Delay Transfer to add a delay to the execution time.
  4. Select Scheduling Timezone.
  5. Select Create & Run Now.
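
If you choose custom cron, the schedule is written in standard five-field cron syntax (minute, hour, day of month, month, day of week) and runs in the scheduling timezone you selected; the value below is only a hypothetical example:

    # Hypothetical custom cron schedule: run the transfer every day at 02:30
    30 2 * * *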

After your transfer has run, you can see the results of your transfer in Data Workbench > Databases.