Airship is a platform for customer engagement, lifecycle marketing, analytics, and data solutions. It helps marketers activate and engage customers by providing notification and messaging solutions, targeting, and campaign management tools. With this integration, you can ingest your customer information, such as opt-in devices, static lists, and reports.
Prerequisites
- Basic knowledge of Treasure Data.
- Basic knowledge of the Airship platform.
Limitation
The Airship application has a limitation: the "Static Lists" data object does not work with the Bearer Token authentication method, even though the Airship API documentation states that it is supported. See https://docs.airship.com/reference/security/api-security/.
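As a workaround, static list requests can use HTTP basic authentication (app key and master secret) instead of a bearer token. Below is a minimal Python sketch, assuming the requests library and Airship's static list CSV download endpoint; the app key, master secret, and list name are placeholder values.

```python
import requests
from requests.auth import HTTPBasicAuth

# Placeholder credentials: replace with your Airship app key and master secret.
APP_KEY = "your-app-key"
MASTER_SECRET = "your-master-secret"
BASE_URL = "https://go.urbanairship.com"  # use https://go.airship.eu for the EU site

# Download the contents of a static list as CSV with basic auth,
# since bearer tokens do not work for static lists.
resp = requests.get(
    f"{BASE_URL}/api/lists/my-list/csv",  # "my-list" is a placeholder list name
    auth=HTTPBasicAuth(APP_KEY, MASTER_SECRET),
    headers={"Accept": "application/vnd.urbanairship+json; version=3"},
)
resp.raise_for_status()
print(resp.text)
```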
About Incremental Data Loading
Incremental loading is the activity of loading only new or updated records from a source into Treasure Data. Incremental loads are useful because they run more efficiently than full loads, particularly for large data sets. Incremental loading is available for many of the Treasure Data integrations. In some cases it is a simple checkbox choice; in others, after you select incremental loading you are given additional fields that must be specified.

Treasure Data incremental loading has four patterns (three types of data connector plus the workflow td_load operator). The three data connector patterns are:

- Cloud storage services (e.g., AWS S3, GCS): lexicographic order of file name
- Queries (e.g., MySQL, BigQuery): date time
- Variable period (e.g., Google Analytics): use start_date for loading

If incremental loading is selected, data for the connector is loaded incrementally. This mode is useful when you want to fetch just the object targets that have changed since the previously scheduled run. Database integrations, such as MySQL, BigQuery, and SQL Server, require column or field names to load incremental data. A generic sketch of the checkpointing idea appears after the links below.

Learn more:
- About Database-based Integrations
- About Incremental Loading for Integrations
- Incremental Loading for Connectors
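The following is a rough, hypothetical Python sketch of the checkpointing idea behind incremental loading, not the connector's actual implementation; fetch_records and the state file are illustrative stand-ins.

```python
from datetime import datetime, timezone
from pathlib import Path

STATE_FILE = Path("last_run.txt")  # hypothetical store for the previous run time

def load_incrementally(fetch_records):
    """Fetch only records newer than the previous successful run (illustrative)."""
    # On the first run there is no checkpoint, so load everything from the epoch.
    if STATE_FILE.exists():
        start = STATE_FILE.read_text().strip()
    else:
        start = "1970-01-01T00:00:00Z"
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

    # fetch_records is a stand-in for whatever the connector uses
    # (file listing, SQL query with a date column, or a start_date parameter).
    records = fetch_records(start_date=start, end_date=now)

    # Persist the new checkpoint only after a successful fetch.
    STATE_FILE.write_text(now)
    return records
```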
Obtaining List Names
- Navigate to your project within Airship.
- Under Audience, select List.
- Make note of the list names that you might want to import into Treasure Data.
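You can also collect list names programmatically. A minimal sketch, assuming basic authentication and Airship's /api/lists endpoint; the credentials are placeholders.

```python
import requests
from requests.auth import HTTPBasicAuth

# Placeholder credentials: replace with your Airship app key and master secret.
APP_KEY = "your-app-key"
MASTER_SECRET = "your-master-secret"

# GET /api/lists returns metadata about the project's static lists.
resp = requests.get(
    "https://go.urbanairship.com/api/lists",
    auth=HTTPBasicAuth(APP_KEY, MASTER_SECRET),
    headers={"Accept": "application/vnd.urbanairship+json; version=3"},
)
resp.raise_for_status()
for lst in resp.json().get("lists", []):
    print(lst["name"])  # note these names for the List Name parameter
```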
Use the TD Console to Create Your Connection
Create a New Connection
In Treasure Data, you must create and configure the data connection before running your query. As part of the data connection, you provide authentication to access the integration.
Open TD Console.
Navigate to Integrations Hub > Catalog.
Search for Airship and then click Create Authentication.
The following dialog opens.
Enter the Base URL:
- Airship's North American cloud site: https://go.urbanairship.com
- Airship's European cloud site: https://go.airship.eu
Choose an authentication method.
Enter a name for your connection.
Select Continue.
Transfer Your Airship Account Data to Treasure Data
After creating the authenticated connection, you are automatically taken to Authentications. Search for the connection you created.
Select New Source.
Type a name for your Source in the Data Transfer field.
Select Next.
The Source Table dialog opens. Edit the following parameters:
| Parameter | Description |
| --- | --- |
| Data Source | Data type to import: Reports, Named Users, Static Lists, or Data Objects. |
| Report Type | Report type to import (shown if the data type is Reports): Custom Event Report, Opt-in Report, Opt-out Report, Time In App Report, Web Response Report, Response List, Device Report, or Experiment Overview Report. |
| Start Time | (Required when the Reports data source is selected.) In the UI, pick the date and time from a supported browser control, or enter the date and time in the format the browser expects. For example, Chrome provides a calendar to select year, month, day, hour, and minute, while Safari requires text such as 2020-10-25T00:00. For CLI configuration, supply a timestamp in RFC 3339 UTC "Zulu" format, accurate to nanoseconds, for example "2014-10-02T15:01:23Z" (see the timestamp sketch after this table). |
| End Time | (Required when the Reports data source is selected.) Uses the same UI and CLI formats as Start Time. |
| Precision | The granularity of results to return: Daily, Hourly, or Monthly. |
| Incremental Loading | Import only data that is new since the last run. See About Incremental Data Loading. |
| Until | (Required when the Device Report type is selected.) Import data only up to this date. |
| Push ID | (Required when the Experiment Overview Report type is selected.) Format: UUID. Returns statistics and metadata about an experiment. |
| Named User ID | (Shown when the Named Users data source is selected.) If left blank, the full list of all named users is retrieved. |
| List Name | (Required when the Static Lists data source is selected.) Downloads the contents of a static list. |
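As referenced in the Start Time and End Time rows, a simple way to produce an RFC 3339 UTC "Zulu" timestamp for CLI configuration is to format a UTC datetime in Python:

```python
from datetime import datetime, timezone

# Format the current UTC time in RFC 3339 "Zulu" style, e.g. "2014-10-02T15:01:23Z".
now_utc = datetime.now(timezone.utc)
print(now_utc.strftime("%Y-%m-%dT%H:%M:%SZ"))

# With fractional seconds if higher precision is needed
# (Python datetimes carry microseconds, printed as six digits).
print(now_utc.strftime("%Y-%m-%dT%H:%M:%S.%fZ"))
```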
Data Settings
Select Next.
The Data Settings page opens. Skip this page of the dialog.
Data Preview
To preview your data before running the import, select Generate Preview. Data shown in the preview is approximated from your source; it is not the actual data that is imported. Verify that the data looks approximately as you expect, then select Next.
Data preview is optional; you can safely skip to the next page of the dialog.
Data Placement
For data placement, select the target database and table where you want your data placed and indicate how often the import should run. Then select Next.

Under Storage, create a new database and table or select existing ones to hold the imported data:
- Select a Database > select an existing database or Create New Database. Optionally, type a database name.
- Select a Table > select an existing table or Create New Table. Optionally, type a table name.
- Choose the method for importing the data:
  - Append (default): data import results are appended to the table.
  - Always Replace: replaces the entire content of an existing table with the result output of the query. If the table does not exist, a new table is created.
  - Replace on New Data: replaces the entire content of an existing table with the result output only when there is new data.
- Select the Timestamp-based Partition Key column.
- Select the Timezone for your data storage.

Under Schedule, choose when and how often you want to run this query:
- To run the transfer once: select Off, select a Scheduling Timezone, and select Create & Run Now.
- To repeat the query: select On, then select the Schedule. The UI provides four options: @hourly, @daily, @monthly, or custom cron. You can also select Delay Transfer and add a delay of execution time. Select a Scheduling Timezone, and select Create & Run Now.

After your transfer has run, you can see the results of your transfer in Data Workbench > Databases.
If the table does not exist, it will be created.
If you want to set a partition key seed other than the default, you can specify a long or timestamp column as the partitioning time. As the default time column, it uses upload_time with the add_time filter.
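Besides checking Data Workbench > Databases, you can spot-check the imported rows with a query. A minimal sketch using the pytd client library; the API key, database, and table names are placeholders:

```python
import pytd

# Placeholder values: use the database and table you chose in Data Placement.
client = pytd.Client(apikey="YOUR_TD_API_KEY", database="airship_import")

# Count the imported rows to confirm the transfer landed where expected.
result = client.query("SELECT COUNT(1) AS cnt FROM airship_static_list")
print(result["data"])  # e.g. [[12345]]
```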