# Airship Import Integration

Airship is a platform for customer engagement, lifecycle marketing, analytics, and data solutions. It helps marketers activate and engage customers by providing notification and messaging solutions, targeting, and campaign management tools. With this integration, you can ingest your customer information: opt-in devices, static lists, and reports.

## Prerequisites

- Basic knowledge of Treasure Data.
- Basic knowledge of the Airship platform.

## Static IP Address of Treasure Data Integration

If your security policy requires IP whitelisting, you must add Treasure Data's IP addresses to your allowlist to ensure a successful connection. The complete list of static IP addresses, organized by region, is available at the following link: [https://api-docs.treasuredata.com/en/overview/ip-addresses-integrations-result-workers/](https://api-docs.treasuredata.com/en/overview/ip-addresses-integrations-result-workers/)

## Limitations

- The "Static Lists" data object does not work with the Bearer Token authentication method, even though the Airship API documentation states that it is supported. See [https://docs.airship.com/reference/security/api-security/](https://docs.airship.com/reference/security/api-security/).
- For some integrations, if you choose incremental loading, you might need to ensure that the columns have an index to avoid a full table scan.
- Only Timestamp, Datetime, and numerical columns are supported as incremental columns.
- The `incremental_columns` option is required for a raw query because primary keys cannot be detected for a complex query.

## Obtaining List Names

1. Navigate to your project within Airship.
2. Under Audience, select List:
   ![](/assets/image2020-12-14_19-27-42.d48e90d6eda950be50b3b446154f67842b3c615ff349fed3ac8a4fb0b9a74ccc.c7507d0f.png)
3. Make note of the list names that you might want to import into Treasure Data.
   ![](/assets/image2020-12-14_19-28-54.a9aab9ef16ec826f33053682212163c611e46dd175a0fc9efc7519a4e11ab971.c7507d0f.png)

## Use the TD Console to Create Your Connection

## Create a New Authentication

In Treasure Data, you must create and configure the data connection before running your query. As part of the data connection, you provide authentication to access the integration.

1. Open **TD Console**.
2. Navigate to **Integrations Hub** > **Catalog**.
3. Search for Airship and then select **Create Authentication**.
   ![](/assets/airship.c54cea610f705e1019f4414c31acda543ac6a3560d0191d4ec1189cfbe4e2612.c7507d0f.png)
4. The following dialog opens.
   ![](/assets/image2020-12-9_14-48-36.41f3a135e9338d4635059b3943e1fdc41be5682c02bb721440227e3106174b33.c7507d0f.png)
5. Enter the Base URL:
   - **Airship's North American cloud site**: [https://go.urbanairship.com](https://go.urbanairship.com/)
   - **Airship's European cloud site**: [https://go.airship.eu](https://go.airship.eu/)
6. Choose one of the following authentication methods:

   **Basic Authentication**
   - In the Airship project dashboard, select **Settings** > **APIs & Integrations**.
   - Enter the App key, App secret, and App master secret.

   ![](/assets/image-20201001-105225.8c36beeab1beaa336666eeb2b7e31823e05fb2749a86343da7bb5d31070a7609.429f9851.png)

   **Access Token**
   - In the Airship project dashboard, select **Settings** > **APIs & Integrations** > **Tokens**.
   - Create a new token and grant the **Audience Modification** or **All-access** role.

   ![](/assets/image-20201001-110148.02affccc2dabc28ee51534b28b5607998e557003533835501ef7c7fd4b7cab7a.429f9851.png)

7. Enter a name for your connection.
8. Select **Continue**.

## Transfer Your Airship Account Data to Treasure Data

After creating the authenticated connection, you are automatically taken to Authentications. Search for the connection you created.

1. Select **New Source**.
2. Type a name for your **Source** in the Data Transfer field.
   ![](/assets/image2020-12-9_15-40-22.74201d979c95be9640411c51ce549d18c920a98b38aacbcbbb38e10cd4655f70.c7507d0f.png)
3. Select **Next**. The Source Table dialog opens.
   ![](/assets/image2020-12-9_15-40-58.70e94fb6c62d177119ecc1ff85862eb31c3474c9d60158d50100df78e789d350.c7507d0f.png)
4. Edit the following parameters:

| **Parameter** | **Description** |
| --- | --- |
| **Data Source** | Data type to import: Reports, Named Users, or Static Lists. |
| **Data Objects** | Report type to import (shown when the data source is Reports): Custom Event Report, Opt-in Report, Opt-out Report, Time In App Report, Web Response Report, Response List, Device Report, or Experiment Overview Report. |
| **Start Time** (required for the Reports data source) | For UI configuration, pick the date and time from a supported browser, or enter the date in the format the browser expects. For example, Chrome provides a calendar to select year, month, day, hour, and minute; Safari requires text input such as 2020-10-25T00:00. For CLI configuration, provide a timestamp in RFC3339 UTC "Zulu" format, accurate to nanoseconds, for example: "2014-10-02T15:01:23Z". |
| **End Time** (required for the Reports data source) | Same format as Start Time. |
| **Precision** | The granularity of results to return: Hourly, Daily, or Monthly. |
| **Incremental Loading** | Import only data that is new since the last run. See About Incremental Loading. |
| **Until** (required for the Device Report data object) | Import data only up to this date. |
| **Push ID** (required for the Experiment Overview Report data object) | Format: UUID. Returns statistics and metadata about an experiment. |
| **Named User ID** (shown for the Named Users data source) | If left blank, retrieves the full list of all named users. |
| **List Name** (required for the Static Lists data source) | Downloads the contents of a static list. |

### Data Settings

1. Select **Next**. The Data Settings page opens.
2. Skip this page of the dialog.

### Data Preview

You can see a [preview](/products/customer-data-platform/integration-hub/batch/import/previewing-your-source-data) of your data before running the import by selecting Generate Preview. Data preview is optional; you can safely skip to the next page of the dialog if you choose to.

1. Select **Next**. The Data Preview page opens.
2. If you want to preview your data, select **Generate Preview**.
3. Verify the data.

### Data Placement

For data placement, select the target database and table where you want your data placed and indicate how often the import should run.

1. Select **Next**. Under Storage, create a new database and table, or select an existing database and table, for the imported data.
2. Select a **Database** > **Select an existing** or **Create New Database**.
3. Optionally, type a database name.
4. Select a **Table** > **Select an existing** or **Create New Table**.
5. Optionally, type a table name.
6. Choose the method for importing the data:
   - **Append** (default): Data import results are appended to the table. If the table does not exist, it is created.
   - **Always Replace**: Replaces the entire content of an existing table with the result output of the query. If the table does not exist, a new table is created.
   - **Replace on New Data**: Replaces the entire content of an existing table with the result output only when there is new data.
7. Select the **Timestamp-based Partition Key** column. If you want to set a different partition key seed than the default key, you can specify a long or timestamp column as the partitioning time. By default, the time column is upload_time with the add_time filter.
8. Select the **Timezone** for your data storage.
9. Under **Schedule**, choose when and how often you want to run this import.

#### Run Once

1. Select **Off**.
2. Select **Scheduling Timezone**.
3. Select **Create & Run Now**.

#### Repeat Regularly

1. Select **On**.
2. Select the **Schedule**. The UI provides four options: *@hourly*, *@daily*, *@monthly*, or custom *cron*.
3. Optionally, select **Delay Transfer** and add a delay of execution time.
4. Select **Scheduling Timezone**.
5. Select **Create & Run Now**.

After your transfer has run, you can see the results of your transfer in **Data Workbench** > **Databases**.
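As a reference for the Basic Authentication and Access Token methods described above, the HTTP `Authorization` headers involved can be sketched in Python. This is an illustration only: `MY_APP_KEY`, `MY_MASTER_SECRET`, and `MY_TOKEN` are placeholder values, and the exact header construction is an assumption based on Airship's API security documentation, not the connector's actual implementation.

```python
import base64

def basic_auth_header(app_key: str, master_secret: str) -> str:
    # Assumption: Basic auth encodes "app_key:master_secret" in base64,
    # per Airship's API security documentation.
    credentials = f"{app_key}:{master_secret}".encode("utf-8")
    return "Basic " + base64.b64encode(credentials).decode("ascii")

def bearer_auth_header(access_token: str) -> str:
    # Bearer tokens are sent as-is. Note the limitation above: the
    # Static Lists data object does not work with bearer tokens.
    return "Bearer " + access_token

print(basic_auth_header("MY_APP_KEY", "MY_MASTER_SECRET"))
print(bearer_auth_header("MY_TOKEN"))
```

Whichever method you configure in the TD Console, keep the Base URL (North American or European cloud site) consistent with the project the credentials belong to.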
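For CLI configuration, the Start Time and End Time parameters in the table above must be RFC3339 UTC "Zulu" timestamps such as "2014-10-02T15:01:23Z". A minimal Python sketch for producing values in that format (the dates are illustrative):

```python
from datetime import datetime, timezone

def zulu(dt: datetime) -> str:
    """Format a datetime as an RFC3339 UTC "Zulu" timestamp."""
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# Example: a one-day report window
start_time = zulu(datetime(2020, 10, 25, 0, 0, tzinfo=timezone.utc))
end_time = zulu(datetime(2020, 10, 26, 0, 0, tzinfo=timezone.utc))
print(start_time)  # 2020-10-25T00:00:00Z
print(end_time)    # 2020-10-26T00:00:00Z
```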
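The three import methods under Data Placement (Append, Always Replace, Replace on New Data) can be sketched as follows. This is only an illustration of the documented behavior on a hypothetical list-of-rows "table", not Treasure Data's actual implementation.

```python
def apply_import(existing_rows, new_rows, mode):
    """Illustrate the three data placement modes (hypothetical sketch)."""
    if mode == "append":
        # Append: import results are added to the existing table.
        return existing_rows + new_rows
    if mode == "always_replace":
        # Always Replace: the table content becomes the new result.
        return list(new_rows)
    if mode == "replace_on_new_data":
        # Replace on New Data: replace only when the run produced new data.
        return list(new_rows) if new_rows else existing_rows
    raise ValueError(f"unknown mode: {mode}")

table = ["row1"]
print(apply_import(table, ["row2"], "append"))          # ['row1', 'row2']
print(apply_import(table, ["row2"], "always_replace"))  # ['row2']
print(apply_import(table, [], "replace_on_new_data"))   # ['row1']
```

The practical difference shows up on runs that return no rows: Always Replace would empty the table, while Replace on New Data leaves the previous content in place.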