# LINE Via Crescendo Lab Import Integration

LINE is to mobile users in Asia what Facebook Messenger and Instagram are to mobile users in the US: a fast and easy way to communicate with friends and to discover new promotions for your favorite products and services. LINE is the number one mobile messaging platform in Taiwan, and one of the largest mobile messaging platforms in Japan and Thailand.

Through Crescendo Lab's MAAC API, we are developing an out-of-the-box integration to activate LINE. With this integration, you can ingest your customer information (members and segments).

## Prerequisites

- Basic knowledge of Treasure Data.
- Basic knowledge of the Crescendo Lab MAAC account.

## Limitation

- The access token is valid for the contract period signed by Crescendo Lab and the partner.

## About Incremental Data Loading

Incremental loading is loading only new or updated records from a source into Treasure Data. It is helpful because it runs efficiently compared to full loads, particularly for large data sets. Incremental loading is available for many of the Treasure Data integrations. In some cases, it is a simple checkbox choice; in others, after you select incremental loading, you are provided with other fields that must be specified.

## Limitations, Supported Columns, and Suggestions

- For some integrations, if you choose incremental loading, you might need to ensure that the columns have an index to avoid a full table scan.
- Only Timestamp, Datetime, and numerical columns are supported as incremental_columns.
- The incremental_columns option is required for raw queries because primary keys cannot be detected for a complex query.

## About Incremental Loading for Integrations

Treasure Data incremental loading has four patterns (three types of data connector plus the workflow td_load operator). The three data connector loading patterns are as follows:

- Cloud storage service (e.g., AWS S3, GCS)
  - Lexicographic order of file name
- Query (e.g., MySQL, BigQuery)
  - Date-time column
- Variable period (e.g., Google Analytics)
  - Use start_date for loading

## Incremental Loading for Connectors

If incremental loading is selected, data for the connector is loaded incrementally. This mode is useful when fetching just the object targets that have changed since the previously scheduled run. For example, in the UI:

![](/assets/snippet-about-incremental-loading-2024-11-18-1.63417e89e255d0e02570d0882a4929cde027422a934af21ba7541c5001c313a7.7499f82f.png)

Database integrations, such as MySQL, BigQuery, and SQL Server, require column or field names to load incremental data.

![](/assets/snippet-about-incremental-loading-2024-11-18.398c24903f2c97039ca73ab56fd98e5cd1158120244ca53bf37dce03cf0513bb.7499f82f.png)

Learn more about [Database-Based Integrations](/int/about-database-based-integrations).

## Obtain API Token

1. Navigate to [https://maac.cresclab.com/apitoken](https://maac.cresclab.com/apitoken).

   ![](/assets/image2021-10-18_15-11-57.5d1b503f0d2a3320ac74257493ea98c64c7597da818405ee2170f55e6b16d050.2e601cf1.png)

2. Select **Get token**.
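After you have the token, you can optionally verify that it works before configuring anything in Treasure Data. The following is a minimal sketch only; the host, endpoint path, and bearer-token header are assumptions for illustration, so consult Crescendo Lab's MAAC API reference for the actual values.

```bash
# Hypothetical token check -- the URL and auth header format are assumptions;
# substitute the real members endpoint from the MAAC API reference.
curl -s \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  "https://api.cresclab.com/v1/members?limit=1"
```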
## Use the TD Console to Create Your Connection

### Create a New Connection

In Treasure Data, you must create and configure the data connection prior to running your query. As part of the data connection, you provide authentication to access the integration.

1. Open **TD Console**.
2. Navigate to **Integrations Hub** > **Catalog**.
3. Search for and select **LINE via Crescendo**.

   ![](/assets/image2021-10-18_14-9-32.2778d17c341cb0c79ef096e5435acb6c659aeccf77caf9fc0853068ce456c022.2e601cf1.png)

4. Select **Create Authentication**.
5. Type the credentials to authenticate.

   ![](/assets/image2021-10-18_14-10-21.a2eec66f259bd24db30b462980f5d5ec286ce03d8bff21e9c73fd64748a1a119.2e601cf1.png)

6. Enter a name for your connection.
7. Select **Continue**.

## Transfer Your Data to Treasure Data

After creating the authenticated connection, you are automatically taken to Authentications.

1. Search for the connection you created.
2. Select **New Source**.
3. Type a name for your **Source** in the Data Transfer field.

   ![](/assets/image2021-10-18_14-16-18.b65d2eeeaa72cdb4161ffc9326605f22c7da1d3411b0957388643a6560d04c83.2e601cf1.png)

4. Select **Next**. The Source Table dialog opens.

   ![](/assets/image2021-10-18_14-16-55.393b646e2a5246908554ad70c8af9f13b77600ba9ea990c6780c2505beb375ae.2e601cf1.png)

5. Edit the following parameters:

   | Parameters | Description |
   | --- | --- |
   | **Data Objects** | Data Object type to import: Members or Segments. |
   | **Start Time** (optional; shown only when the Members Data Object is selected) | For UI configuration, pick the date and time from the browser control, or type the date in the format the browser expects. For example, Chrome provides a calendar to select Year, Month, Day, Hour, and Minute; on Safari, you type text such as 2020-10-25T00:00. For CLI configuration, use a timestamp in RFC 3339 UTC "Zulu" format, accurate to nanoseconds, for example: "2014-10-02T15:01:23Z". |
   | **End Time** (optional; shown only when the Members Data Object is selected) | Same format as Start Time: pick the date and time in the browser, or for CLI configuration use a timestamp in RFC 3339 UTC "Zulu" format, for example: "2014-10-02T15:01:23Z". |
   | **Incremental** | Import only data that is new since the last run. See About Incremental Loading. |

6. Select **Next**. The Data Settings page can be modified for your needs, or you can skip the page.
7. Select **Next**.

### Data Preview

You can see a [preview](/products/customer-data-platform/integration-hub/batch/import/previewing-your-source-data) of your data before running the import by selecting Generate Preview. Data preview is optional, and you can safely skip to the next page of the dialog if you choose to.

1. Select **Next**. The Data Preview page opens.
2. If you want to preview your data, select **Generate Preview**.
3. Verify the data.
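If you prefer to validate the source outside the console, the same kind of preview can be produced with the td CLI using a configuration file like the one shown in the Using the Command Line section below. A minimal sketch, assuming a seed.yml/config.yml you have already written:

```bash
# Optionally let td fill in guessed settings from a seed configuration,
# then print a sample of the rows that would be imported.
td connector:guess seed.yml -o config.yml
td connector:preview config.yml
```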
### Data Placement

For data placement, select the target database and table where you want your data placed and indicate how often the import should run.

1. Select **Next**. Under Storage, create a new database or select an existing one, and create a new table or select an existing one, for where you want to place the imported data.
2. Select a **Database** > **Select an existing** or **Create New Database**.
3. Optionally, type a database name.
4. Select a **Table** > **Select an existing** or **Create New Table**.
5. Optionally, type a table name.
6. Choose the method for importing the data:
   - **Append** (default): Data import results are appended to the table. If the table does not exist, it is created.
   - **Always Replace**: Replaces the entire content of an existing table with the result output of the query. If the table does not exist, a new table is created.
   - **Replace on New Data**: Replaces the entire content of an existing table with the result output only when there is new data.
7. Select the **Timestamp-based Partition Key** column. If you want to set a different partition key seed than the default key, you can specify a long or timestamp column as the partitioning time. As a default time column, it uses upload_time with the add_time filter.
8. Select the **Timezone** for your data storage.
9. Under **Schedule**, choose when and how often you want to run this query.

#### Run Once

1. Select **Off**.
2. Select **Scheduling Timezone**.
3. Select **Create & Run Now**.

#### Repeat Regularly

1. Select **On**.
2. Select the **Schedule**. The UI provides four options: *@hourly*, *@daily*, and *@monthly*, or a custom *cron* expression.
3. Optionally, select **Delay Transfer** and add a delay to the execution time.
4. Select **Scheduling Timezone**.
5. Select **Create & Run Now**.

After your transfer has run, you can see the results of your transfer in **Data Workbench** > **Databases**.

## Using the Command Line

You can create the data connector from the CLI instead of the TD Console if you prefer.

### Install the Prerequisites

Install the latest td tool via Ruby gem:

```bash
gem install td
td --version
```

There are other install methods. For more information, see [Treasure Data Toolbelt](https://toolbelt.treasuredata.com/).

### Create the Config File (config.yml)

#### Example (config.yml)

The following is an example configuration file that incrementally imports the Members data object for a given time window:

```yaml
in:
  type: line_via_crescendo
  api_token: dummy_token
  data_object: members
  incremental: true
  start_time: 2021-01-01T07:16:00.000Z
  end_time: 2021-07-15T07:16:00.000Z
out:
  mode: append
```
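With the config file in place, you can submit the load once or register it to run on a schedule. A minimal sketch; the database, table, and schedule names below are placeholders for illustration:

```bash
# One-time load into a database/table (placeholder names).
td connector:issue config.yml \
  --database my_database \
  --table line_members

# Or register the same config to run daily at 00:10.
td connector:create daily_line_import "10 0 * * *" \
  my_database line_members config.yml
```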