# LINE OA Insights Import Integration

This TD import integration allows you to transfer insights data from your LINE Official Account to Treasure Data.

## Prerequisites

- Basic knowledge of Treasure Data, including [TD Toolbelt](/tools/cli-and-sdks/quickstart)
- A LINE Official Account
- A channel access token from your LINE Official Account


## Requirements and Limitations

- Importing from the CLI requires the [Ruby Gem for TD Toolbelt](https://toolbelt.treasuredata.com/).
- The LINE Messaging API rate limit is 60 requests per hour.
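If you script calls against the API yourself (for example, to issue request IDs), it helps to pace requests client-side to stay under this limit. A minimal Python sketch; the helper name and structure are illustrative, not part of the connector:

```python
import time

# Documented limit for this integration: 60 requests per hour.
RATE_LIMIT_PER_HOUR = 60
MIN_INTERVAL_SECONDS = 3600 / RATE_LIMIT_PER_HOUR  # 60.0 seconds between calls

def paced_calls(fn, items, sleep=time.sleep):
    """Call fn(item) for each item, pausing between calls to respect the limit."""
    results = []
    for i, item in enumerate(items):
        if i:  # no pause before the first call
            sleep(MIN_INTERVAL_SECONDS)
        results.append(fn(item))
    return results
```

Injecting `sleep` keeps the pacing logic testable; in production the default `time.sleep` applies.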


## Obtain Channel Access Token

To import from the LINE Messaging API, you need a channel access token. Obtain a long-lived token manually via the LINE Developers Console:

1. Log in to the [LINE Developers Console](https://developers.line.biz/console/).
2. Select your provider and then the Messaging API channel you want to use.
![](/assets/2026-02-09_11-36-40.4512c8a09eff5bdb9185f230afe063198895756be03584f989b7fabb12a1b90d.f9f43318.png)
3. Choose the Messaging API channel, and then go to the **Messaging API** tab.
![](/assets/2026-02-09_11-36-41.19c90f2cc4eace36a5ceab22d45bf249f5eee1f5316213664c43f0d1e64bf8f8.f9f43318.png)
4. Scroll down to the **Channel access token (long-lived)** section and select **Issue**. The channel access token is displayed.
![](/assets/2026-02-09_11-36-42.2499f334d4f9be8451ddf1b66e06a0b430b403efbb29b30fc91adf1c4a7e3fc6.f9f43318.png)
5. Copy the token and store it securely.
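The connector presents this token as a Bearer credential on each request. For reference, here is a hedged Python sketch that builds (but does not send) a request to the Insight API's followers endpoint; the endpoint path follows LINE's Insight API documentation, so verify it against the current docs before relying on it:

```python
from urllib.request import Request

def build_followers_insight_request(channel_access_token: str, date: str) -> Request:
    """Build a GET request for follower insights on a given yyyyMMdd date.

    The endpoint path is taken from LINE's Insight API docs; treat it as an
    assumption and confirm it against the current documentation.
    """
    url = f"https://api.line.me/v2/bot/insight/followers?date={date}"
    return Request(url, headers={"Authorization": f"Bearer {channel_access_token}"})
```

No network call is made here; the sketch only shows how the token travels in the `Authorization` header.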


## Import from LINE Message via TD Console

### Create Authentication

Your first step is to create a new authentication with your credentials.

1. Open **TD Console**.
2. Navigate to **Integrations Hub** > **Catalog**.
3. Search for and select LINE OA Insights.


![](/assets/2026-02-09_11-36-43.b87969bb583337d8e00351b8e0b4be160f4738402f8eada1eefcdc1ab0d20d9a.f9f43318.png)

4. Select **Create Authentication**.
5. Type your channel access token. Select **Continue**.


![](/assets/2026-02-09_11-36-44.eb185bd31a04fd5ea9970a56896ffd35275616b7e3b3a46d6ee655bec3ec85e0.f9f43318.png)

6. Enter a name for your connection. Select **Done**.


### Transfer Your Data to Treasure Data

After creating the authenticated connection, you are taken to the Authentications page.

1. Search for the connection you created.
![](/assets/2026-02-09_11-36-45.915c5f59d6def3eaba1b3315d6559777b6540cdc8e323940f20cafd88477d051.f9f43318.png)
2. Select **New Source**.
3. Type a name for your **Source** in the Data Transfer field.
4. Select **Next**.
The Source Table dialog opens.
5. Edit the following parameters:
![](/assets/2026-02-09_11-36-49.98db05747b76fd29991e0313ab12588cc092ce66a7fc567f24f2e82f09c71a06.f9f43318.png)


The following table describes the parameters for configuring a source table:

| Parameter | Mandatory | Description |
| --- | --- | --- |
| Target | Yes | Specify which data to ingest. Supported values: Message Deliveries, Followers, Demographics, User Interaction Statistics. |
| Start Date | Yes, if Target is Message Deliveries or Followers | Start date for the data import, in yyyyMMdd format. |
| End Date | No | End date for the data import, in yyyyMMdd format. Defaults to the current date if not specified. |
| Incremental | No | This checkbox is visible only when Target is Message Deliveries or Followers. Imports only new data since the last import date, or the last date for which data is in ready status. |
| Request IDs | Yes, if Target is User Interaction Statistics | Request IDs used to fetch insights data, as comma-separated values. |


6. Select **Next** to define Data Settings.


![](/assets/2026-02-09_11-36-47.0079204c964f9a959f8efad3a000d284ff6c1cf0cbbc321f2c4c03096710fbbb.f9f43318.png)

7. Select **Next** to preview your data.


![](/assets/2026-02-09_11-36-46.4163f11ef6b22aeb05e639cdd2619198ed7efb08fb856729021f64c1e3b5859c.f9f43318.png)

- To preview your data, select **Generate Preview**. Optionally, select **Next** to skip to the next section.


8. Define your data placement.


![](/assets/2026-02-09_11-36-48.28d883dc00fe939750fef2839785ea688d2116aeb27de48e1d5397b39f6fe5d9.f9f43318.png)

In the Storage section, specify details for where you want the data to reside in TD:

- Database — Select the database where you want to save your data.
- Table — The destination table where you want to store your imported data.
- Method
  - Append — Add records to your existing table. (Be aware that data may be duplicated.)
  - Always Replace — Always clear your destination table before adding records.
  - Replace on new data — Overwrite old data with new data only when new data is found.
- Timestamp-based Partition Key — Select the custom timestamp column used as the partition key.
- Data Storage Timezone — The expected timezone for your database.


In the **Schedule** section, you can choose when and how often to run this transfer:

- Repeat — Select **On** or **Off**.
- Schedule — The dropdown list provides these options: *@daily (midnight)*, *@hourly (:00)*, or *Custom cron*.
- Delay Transfer — You can specify a delay to the execution time.
- Scheduling Timezone — Select the timezone for scheduling.


9. Select **Create & Run Now**.


After your transfer has run, you can see the results of your transfer in **Data Workbench** > **Databases**.

## Import from LINE Message via TD Workflow

Create and run a workflow:


```yaml
_export:
  td:
    database: workflow_line_oa_insights
    table: line_oa_insights
+import_from_line_oa_insights:
  td_load>: imports/seed.yml
  database: ${td.database}
  table: ${td.table}
```

Modify the *seed.yml* file with your connection details for the import:


```yaml
in:
  type: line_oa_insights
  channel_access_token: {your_channel_access_token}
  target: message_deliveries
  start_date: 20251101
  end_date: 20260105
  incremental: true
out:
  mode: append
```
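To illustrate what `incremental: true` implies for the date window, here is a small Python sketch (a hypothetical helper, not the connector's actual logic) that derives the next window from the last imported date in yyyyMMdd format:

```python
from datetime import datetime, timedelta

DATE_FMT = "%Y%m%d"  # yyyyMMdd, matching start_date / end_date above

def next_window(last_imported: str, today: str) -> tuple[str, str]:
    """Return (start_date, end_date) for the next incremental run:
    the day after the last imported date, through today."""
    start = datetime.strptime(last_imported, DATE_FMT) + timedelta(days=1)
    end = datetime.strptime(today, DATE_FMT)
    if start > end:
        raise ValueError("no new dates to import yet")
    return start.strftime(DATE_FMT), end.strftime(DATE_FMT)
```

For example, if the last imported date was 20251101 and today is 20251105, the next run would cover 20251102 through 20251105.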

| Parameter | Data Type | Mandatory | Description |
| --- | --- | --- | --- |
| target | string | Yes | Specify which data to ingest. Supported values: message_deliveries, followers, demographics, user_interaction_statistics. |
| start_date | string | Yes, if target is message_deliveries or followers | Start date for the data import, in yyyyMMdd format. |
| end_date | string | No | End date for the data import, in yyyyMMdd format. Defaults to the current date if not specified. |
| incremental | boolean (true/false); default is false | No | Applies only when target is message_deliveries or followers. Imports only new data since the last import date, or the last date for which data is in ready status. |
| request_ids | string | Yes, if target is user_interaction_statistics | Request IDs used to fetch insights data, as comma-separated values. |
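The cross-field rules in the table can be checked locally before submitting a config. A minimal Python sketch of such a validator (illustrative only, not part of TD or the connector):

```python
SUPPORTED_TARGETS = {
    "message_deliveries", "followers", "demographics", "user_interaction_statistics",
}

def validate_in_config(config: dict) -> list[str]:
    """Return error messages for the parameter rules described above."""
    errors = []
    target = config.get("target")
    if target not in SUPPORTED_TARGETS:
        errors.append("target must be one of: " + ", ".join(sorted(SUPPORTED_TARGETS)))
    # start_date is mandatory only for the date-ranged targets.
    if target in ("message_deliveries", "followers") and not config.get("start_date"):
        errors.append(f"start_date is required when target is {target}")
    # request_ids is mandatory only for user interaction statistics.
    if target == "user_interaction_statistics" and not config.get("request_ids"):
        errors.append("request_ids is required when target is user_interaction_statistics")
    return errors
```

Running this against the `in:` section of *seed.yml* before `td_load>` runs can surface configuration mistakes early.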


## Import from LINE Message via CLI (Toolbelt)

You can import data from LINE Message using the TD Toolbelt.

### Prerequisites

Install the latest TD Toolbelt using Ruby Gem:


```
$ gem install td
$ td --version
0.16.10
```

There are other install methods. For more information, see [Treasure Data Toolbelt](https://toolbelt.treasuredata.com/).

### Create Config File

Create a config.yml configuration file:


```yaml
in:
  type: line_oa_insights
  channel_access_token: {your_channel_access_token}
  target: message_deliveries
  start_date: 20251101
  end_date: 20260105
  incremental: true
out:
  ...........
```

### Preview Data Coming In from Config File

You can preview the data your config file will pull in:


```
$ td connector:preview config.yml

+-----------+-----------------------+-------------------------+-------------------------+
| id:string |           name:string |          created:string |          updated:string |
+-----------+-----------------------+-------------------------+-------------------------+
|    SPg3aL |      chrome_line_name | 2021-08-30 05:35:46 UTC | 2021-08-30 05:35:46 UTC |
|    QPg3zL | 100_w_phone_line_name | 2021-08-27 23:49:56 UTC | 2021-08-27 23:49:56 UTC |
+-----------+-----------------------+-------------------------+-------------------------+
```

### Scheduled Execution

You can schedule periodic data connector execution for recurring LINE imports. With this feature, you no longer need a `cron` daemon running on a local server.

#### Create the Schedule

A new schedule can be created using the `td connector:create` command. The following data needs to be specified:

- Name of the schedule
- The cron-style schedule
- The database and table where your data will be stored
- The Data Connector configuration file



```
$ td connector:create \
  daily_line_oa_insights_import \
  "10 0 * * *" \
  sample_db \
  sample_table \
  config.yml
```

The cron parameter also accepts three special options: `@hourly`, `@daily`, and `@monthly`. For more details, see [Scheduled Jobs](https://docs.treasuredata.com/smart/project-product-documentation/scheduling-jobs-using-td-console).

By default, the schedule is set up in the UTC timezone. You can set the schedule in a timezone using the `-t` or `--timezone` option. The `--timezone` option supports only extended timezone formats like 'Asia/Tokyo', 'America/Los_Angeles', etc. Timezone abbreviations like PST and CST are *not* supported and may lead to unexpected schedules.
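One way to confirm that a name is in the extended IANA form before passing it to `--timezone` is to resolve it with Python's `zoneinfo` module (an illustrative local check, not part of Toolbelt; requires timezone data to be available on the system):

```python
from zoneinfo import ZoneInfo, ZoneInfoNotFoundError

def is_extended_timezone(name: str) -> bool:
    """True for IANA names like 'Asia/Tokyo'; abbreviations like 'PST' are not keys."""
    try:
        ZoneInfo(name)
        return True
    except (ZoneInfoNotFoundError, ValueError):
        return False
```

For example, `Asia/Tokyo` and `America/Los_Angeles` resolve, while the abbreviation `PST` does not.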

#### List the Schedules

You can see the list of currently scheduled entries by entering `td connector:list`.


```
$ td connector:list
  +-------------------------------+------------+----------+-------+-----------+--------------+
  | Name                          | Cron       | Timezone | Delay | Database  | Table        |
  +-------------------------------+------------+----------+-------+-----------+--------------+
  | daily_line_oa_insights_import | 10 0 * * * | UTC      | 0     | sample_db | sample_table |
  +-------------------------------+------------+----------+-------+-----------+--------------+
```

#### Show the Setting and History of Schedules

`td connector:show` shows the execution setting of a schedule entry.


```
$ td connector:show daily_line_oa_insights_import
  Name     : daily_line_oa_insights_import
  Cron     : 10 0 * * *
  Timezone : UTC
  Delay    : 0
  Database : sample_db
  Table    : sample_table
```

`td connector:history` shows the execution history of a schedule entry. To investigate the results of each individual execution, use `td job:show jobid`.


```
$ td connector:history daily_line_oa_insights_import
  +--------+---------+---------+-----------+--------------+----------+---------------------------+----------+
  | JobID  | Status  | Records | Database  | Table        | Priority | Started                   | Duration |
  +--------+---------+---------+-----------+--------------+----------+---------------------------+----------+
  | 577914 | success | 20000   | sample_db | sample_table | 0        | 2015-04-16 00:10:03 +0000 | 152      |
  | 577872 | success | 20000   | sample_db | sample_table | 0        | 2015-04-15 00:10:04 +0000 | 163      |
  | 577810 | success | 20000   | sample_db | sample_table | 0        | 2015-04-14 00:10:04 +0000 | 164      |
  | 577766 | success | 20000   | sample_db | sample_table | 0        | 2015-04-13 00:10:04 +0000 | 155      |
  | 577710 | success | 20000   | sample_db | sample_table | 0        | 2015-04-12 00:10:05 +0000 | 156      |
  | 577610 | success | 20000   | sample_db | sample_table | 0        | 2015-04-11 00:10:04 +0000 | 157      |
  +--------+---------+---------+-----------+--------------+----------+---------------------------+----------+
```

#### Delete the Schedule

`td connector:delete` removes the schedule.


```
$ td connector:delete daily_line_oa_insights_import
```