This TD integration allows you to transfer data from your LINE Official Account to Treasure Data.
- Basic knowledge of Treasure Data, including TD Toolbelt
- A LINE Official Account
- A Channel Access Token for your LINE Messaging API channel
- Importing from the CLI requires the Ruby gem for TD Toolbelt.
- The LINE Messaging API rate limit is 60 requests per hour.
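The 60-requests-per-hour limit matters most when backfilling many days, assuming the insight data is fetched one request per date (an assumption for illustration; the connector manages pacing internally). A minimal sketch of the arithmetic:

```python
from datetime import datetime

RATE_LIMIT_PER_HOUR = 60  # stated LINE Messaging API limit
DATE_FMT = "%Y%m%d"

def min_interval_seconds(limit_per_hour: int = RATE_LIMIT_PER_HOUR) -> float:
    """Minimum average spacing between requests to stay under the hourly limit."""
    return 3600 / limit_per_hour

def requests_needed(start: str, end: str) -> int:
    """Requests for an inclusive yyyyMMdd date range, assuming one call per day."""
    days = (datetime.strptime(end, DATE_FMT) - datetime.strptime(start, DATE_FMT)).days
    return days + 1
```

At this limit a 90-day backfill needs more than one hour to complete, so long historical imports should be planned accordingly.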
To import from the LINE Messaging API, you need a Channel Access Token. Obtain a long-lived token manually via the LINE Developers Console:
- Log in to the LINE Developers Console.
- Select your provider and then the Messaging API channel you want to use.

- Choose the Messaging API channel, then go to the "Messaging API" tab.

- Scroll down to the "Channel access token (long-lived)" section. Click the "Issue" button. The channel access token will be displayed.

- Copy the token and store it securely.
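When calling the LINE Messaging API directly, the channel access token is sent as a standard Bearer token in the Authorization header. A sketch that only builds (never sends) such a request, using LINE's bot-info endpoint as an example; the token value is a placeholder:

```python
import urllib.request

def build_line_request(channel_access_token: str) -> urllib.request.Request:
    """Build an authenticated LINE Messaging API request (not sent here)."""
    return urllib.request.Request(
        "https://api.line.me/v2/bot/info",
        headers={"Authorization": f"Bearer {channel_access_token}"},
    )

# The connector authenticates every API call with this same header scheme.
req = build_line_request("YOUR_CHANNEL_ACCESS_TOKEN")
```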
Your first step is to create a new authentication with your credentials.
- Open TD Console.
- Navigate to Integrations Hub > Catalog.
- Search for and select LINE OA Insights.

- Select Create Authentication.
- Type your channel access token. Select Continue.

- Enter a name for your connection. Select Done.
After creating the authenticated connection, you are automatically taken to Authentications.
- Search for the connection you created.

- Select New Source.
- Type a name for your Source in the Data Transfer field.
- Select Next. The Source Table dialog opens.
- Edit the following parameters:

The following table describes the parameters for configuring a source table:
| Parameter | Mandatory | Description |
|---|---|---|
| Target | Yes | Specify which data you need to ingest. Supported values: Message Deliveries, Followers, and User Interaction Statistics. |
| Start Date | Yes, if Target is Message Deliveries or Followers | Start date for data import in yyyyMMdd format. |
| End Date | No | End date for data import in yyyyMMdd format. Defaults to the current date if not specified. |
| Incremental | No | This checkbox is visible only if Target is Message Deliveries or Followers. Imports only new data since the last import date or the last date for which data is in ready status. |
| Request IDs | Yes, if Target is User Interaction Statistics | Specify the request IDs to fetch insights data, as comma-separated values. |
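Both date parameters use the yyyyMMdd format. A quick local check (not part of the connector) for validating a date string before saving the source:

```python
from datetime import datetime

def is_valid_yyyymmdd(value: str) -> bool:
    """Return True if value parses as a real calendar date in yyyyMMdd format."""
    try:
        datetime.strptime(value, "%Y%m%d")
        return True
    except ValueError:
        return False

# "20251101" is valid; "2025-11-01" (wrong separator) and
# "20251301" (month 13) are rejected.
```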
- Select Next to define Data Settings.

- Select Next to preview your data.

- To preview your data, select Generate Preview. Optionally, select Next to skip to the next section.
- Define Your Data Placement

In the Storage section, specify details for where you want the data to reside in TD:
- Database — Select the database where you want to save your data.
- Table — The destination table where you want to store your imported data.
- Method
- Append — Add records to your existing table. (Be aware that data may be duplicated.)
- Always Replace — Always clear your destination table before adding records.
- Replace on new data — If new data is found, then old data will be overwritten by the new data.
- Timestamp-based Partition Key — Select your custom timestamp column used as the partition key.
- Data Storage Timezone — The expected timezone for your database.
In the Schedule section, you can choose when and how often you want to run this query:
- Repeat — Select On or Off.
- Schedule — The dropdown list provides these options: @daily (midnight), @hourly (:00), or Custom cron.
- Delay Transfer — You can specify a delay to the execution time.
- Scheduling Timezone — Select the timezone for scheduling.
- Select Create & Run Now.
After your transfer has run, you can see the results of your transfer in Data Workbench > Databases.
Create and run a workflow:

```yaml
_export:
  td:
    database: workflow_line_oa_insights
    table: line_oa_insights

+import_from_line_oa_insights:
  td_load>: imports/seed.yml
  database: ${td.database}
  table: ${td.table}
```

Modify the seed.yml file with your connection details for the import:
```yaml
in:
  type: line_oa_insights
  channel_access_token: {your_channel_access_token}
  target: message_deliveries
  start_date: 20251101
  end_date: 20260105
  incremental: true
out:
  mode: append
```

| Parameter | Data Type | Mandatory | Description |
|---|---|---|---|
| target | string | Yes | Specify which data you need to ingest. Supported values: message_deliveries, followers, and user_interaction_statistics. |
| start_date | string | Yes, if target is message_deliveries or followers | Start date for data import in yyyyMMdd format. |
| end_date | string | No | End date for data import in yyyyMMdd format. Defaults to the current date if not specified. |
| incremental | boolean (true/false); default is false | No | This value applies only when target is message_deliveries or followers. Imports only new data since the last import date or the last date for which data is in ready status. |
| request_ids | string | Yes, if target is user_interaction_statistics | Specify the request IDs to fetch insights data, as comma-separated values. |
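Conceptually, an incremental run covers the dates after the last imported date, up to today. A sketch of that window bookkeeping, illustrative only and not the connector's actual implementation:

```python
from datetime import datetime, timedelta

FMT = "%Y%m%d"

def next_window(last_imported: str, today: str):
    """Return the (start, end) yyyyMMdd range for the next incremental run,
    or None if there is no new data since the last import."""
    start = datetime.strptime(last_imported, FMT) + timedelta(days=1)
    end = datetime.strptime(today, FMT)
    if start > end:
        return None  # already caught up
    return start.strftime(FMT), end.strftime(FMT)

# After importing through 20260105, a run on 20260110 covers 20260106-20260110.
```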
You can import data from the LINE Messaging API using the TD Toolbelt.
Install the latest TD Toolbelt using RubyGems:

```shell
$ gem install td
$ td --version
0.16.10
```

There are other install methods. For more information, see Treasure Data Toolbelt.
Create a config.yml configuration file:

```yaml
in:
  type: line_oa_insights
  channel_access_token: {your_channel_access_token}
  target: message_deliveries
  start_date: 20251101
  end_date: 20260105
  incremental: true
out:
  ...........
```

You can preview the data your config file will pull in:
```shell
$ td connector:preview config.yml
+-----------+-----------------------+-------------------------+-------------------------+
| id:string | name:string           | created:string          | updated:string          |
+-----------+-----------------------+-------------------------+-------------------------+
| SPg3aL    | chrome_line_name      | 2021-08-30 05:35:46 UTC | 2021-08-30 05:35:46 UTC |
| QPg3zL    | 100_w_phone_line_name | 2021-08-27 23:49:56 UTC | 2021-08-27 23:49:56 UTC |
+-----------+-----------------------+-------------------------+-------------------------+
```

You can schedule periodic Data Connector execution for recurring LINE imports. With this feature, you no longer need a cron daemon in your local data center.
A new schedule can be created using the td connector:create command. The following needs to be specified:
- The name of the schedule
- The cron-style schedule
- The database and table where your data will be stored
- The Data Connector configuration file
```shell
$ td connector:create \
    daily_line_oa_insights_import \
    "10 0 * * *" \
    sample_db \
    sample_table \
    config.yml
```

The cron parameter also accepts three special values: @hourly, @daily, and @monthly. For more details, see Scheduled Jobs.
By default, the schedule is set up in the UTC timezone. You can set the schedule in a timezone using the -t or --timezone option. The --timezone option supports only extended timezone formats like 'Asia/Tokyo', 'America/Los_Angeles', etc. Timezone abbreviations like PST and CST are not supported and may lead to unexpected schedules.
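You can check locally whether a name is an extended (IANA) timezone with Python's zoneinfo module; this sketch mirrors, but is separate from, the td CLI's own validation:

```python
from zoneinfo import ZoneInfo, ZoneInfoNotFoundError

def is_supported_timezone(name: str) -> bool:
    """True if name resolves as an IANA timezone such as 'Asia/Tokyo'."""
    try:
        ZoneInfo(name)
        return True
    except ZoneInfoNotFoundError:
        return False

# 'Asia/Tokyo' and 'America/Los_Angeles' resolve; the abbreviation 'PST'
# is not an IANA key and does not.
```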
You can see the list of currently scheduled entries by entering td connector:list.
```shell
$ td connector:list
+-------------------------------+------------+----------+-------+-----------------+--------------+
| Name                          | Cron       | Timezone | Delay | Database        | Table        |
+-------------------------------+------------+----------+-------+-----------------+--------------+
| daily_line_oa_insights_import | 10 0 * * * | UTC      | 0     | sample_database | sample_table |
+-------------------------------+------------+----------+-------+-----------------+--------------+
```

td connector:show shows the execution settings of a schedule entry.
```shell
$ td connector:show daily_line_oa_insights_import
Name     : daily_line_oa_insights_import
Cron     : 10 0 * * *
Timezone : UTC
Delay    : 0
Database : sample_db
Table    : sample_table
```

td connector:history shows the execution history of a schedule entry. To investigate the results of each individual execution, use td job:show jobid.
```shell
+--------+---------+---------+-----------+--------------+----------+---------------------------+----------+
| 577914 | success | 20000   | sample_db | sample_table | 0        | 2015-04-16 00:10:03 +0000 | 152      |
| 577872 | success | 20000   | sample_db | sample_table | 0        | 2015-04-15 00:10:04 +0000 | 163      |
| 577810 | success | 20000   | sample_db | sample_table | 0        | 2015-04-14 00:10:04 +0000 | 164      |
| 577766 | success | 20000   | sample_db | sample_table | 0        | 2015-04-13 00:10:04 +0000 | 155      |
| 577710 | success | 20000   | sample_db | sample_table | 0        | 2015-04-12 00:10:05 +0000 | 156      |
| 577610 | success | 20000   | sample_db | sample_table | 0        | 2015-04-11 00:10:04 +0000 | 157      |
+--------+---------+---------+-----------+--------------+----------+---------------------------+----------+
```

td connector:delete removes the schedule.
```shell
$ td connector:delete daily_line_oa_insights_import
```