Data Connector for AppAnnie (BETA)

This Data Connector allows you to import AppAnnie Data Source objects into Treasure Data. This feature is currently in beta, so any feedback would be appreciated.


Prerequisites

  • Basic knowledge of Treasure Data
  • Basic knowledge of AppAnnie

Option 1: Use Web Console

Create a new connection

Please visit Treasure Data Connections, then search for and select AppAnnie. The dialog below will open.



Please fill in your AppAnnie API key, click Next, and give your connection a name:



Create a new transfer

Upon creating the connection above, you will be automatically taken to the My Connections tab. Look for the connection you created and click New Transfer.



The dialog below will open. Please fill in the details and click Next.



Next, you will see a preview of your data similar to the dialog below. If you wish to change anything, click Advanced Settings; otherwise, click Next.



From here, if you want to change options such as skipping invalid records or adjusting rate limits, click Advanced Settings:



The third step is to select the database and table to which you want to transfer the data, as shown in the following dialog:



Finally, specify the schedule of the data transfer using the dialog below and click Start Transfer:



You will see the new data transfer in progress listed under the My Input Transfers tab and a corresponding job will be listed in the Jobs section.

Now, you are ready to start analyzing your data!

Option 2: Use Command Line

Step 0: Install ‘td’ command v0.11.9 or later

Install the newest Treasure Data Toolbelt and verify the version:

$ td --version
0.15.0

Step 1: Create Configuration File

Prepare a configuration file (e.g., load.yml) as shown below, with your AppAnnie account access information.

in:
  type: app_annie
  apikey: xxxxxxxx
  target: product_sales (required, see Appendix B)
  breakdown_sales: date+country+iap (optional, see Appendix C)
  fetch_type: shared_products (optional, default: `both`, see Appendix D)
  start_date: 2017-01-01 (optional but required here as breakdown contains `iap`) 
  end_date: 2017-02-01 (optional, default: current date)
  currency: USD (optional, default: USD, see Appendix E)
  skip_on_invalid_records: true (optional, default: false)
  calls_per_minute_limit: 15 (optional, 30 by default, see Appendix F)
  calls_per_day_limit: 800 (optional, 1000 by default, see Appendix F)
out:
  mode: replace

This example dumps AppAnnie product sales data:

  • apikey: AppAnnie API key.
  • target: AppAnnie entity object to be imported.
    • See Appendix B for the list of available targets.
  • breakdown: Breakdown type by which product sales/usage data is fetched.
    • The field name depends on the selected target: breakdown_sales for product_sales and breakdown_usage for product_usage (see the example configuration below).
    • See Appendix C for usage and the list of available breakdowns.
  • fetch_type: The source of products to import (products from connected accounts, shared products, or both).
    • See Appendix D for usage and the list of available fetch_types.
  • start_date: The date (yyyy-MM-dd) from which product data is imported. This field is required when fetching product usage (target is product_usage) or product sales with an in-app purchase breakdown (target is product_sales and breakdown contains iap).
  • end_date: The date (yyyy-MM-dd) up to which product data is imported. This field is optional and is automatically capped at 60 days after start_date and at the current date.
  • currency: The currency in which the data is presented.
    • See Appendix E for the list of available currencies.
  • skip_on_invalid_records: Ignore errors (such as invalid JSON or unsupported data) and continue fetching records (false by default).
  • calls_per_minute_limit / calls_per_day_limit: Limit the number of API calls per minute / per day (see Appendix F).

For more details on available out modes, see Appendix A.
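
As a minimal sketch (the apikey is a placeholder and the dates are examples), a configuration for importing product usage instead of product sales swaps the target and uses the breakdown_usage field name:

in:
  type: app_annie
  apikey: xxxxxxxx
  target: product_usage
  breakdown_usage: date+country+device (see Appendix C)
  fetch_type: connected_products (optional, see Appendix D)
  start_date: 2017-01-01 (required for product_usage)
  end_date: 2017-02-01 (optional, default: current date)
out:
  mode: replace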

Step 2 (optional): Preview data to import

You can preview data to be imported using the command td connector:preview.

$ td connector:preview load.yml
+-----------------+---------------------+-----------------+----
| account_id:long | account_name:string | vertical:string | ...
+-----------------+---------------------+-----------------+----
| 42023           | "Hello"             | apps            |
| 42045           | "World"             | apps            |
+-----------------+---------------------+-----------------+----

Step 3: Execute Load Job

Finally, submit the load job. It may take a couple of hours depending on the data size. You must specify the database and table where the data will be stored.

It is recommended to specify the --time-column option, since Treasure Data’s storage is partitioned by time (see also the architecture documentation). If the option is not given, the Data Connector chooses the first long or timestamp column as the partitioning time. The column specified by --time-column must be of either long or timestamp type.

If your data doesn’t have a time column, you can add one using the add_time filter option. For more details, see the add_time filter plugin documentation.
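
As a rough sketch only (option names such as to_column and from_value follow the add_time filter plugin and should be verified against its documentation), a filters section that fills the time column from the job's upload time might look like this:

in:
  ...
filters:
  - type: add_time
    to_column:
      name: time (column to be created)
      type: timestamp
    from_value:
      mode: upload_time (use the upload time of the job as the value)
out:
  ...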

$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column updated_date

The above command assumes that the database (td_sample_db) and table (td_sample_table) already exist. If the database or the table does not exist in TD, the command will not succeed; either create them manually or pass the --auto-create-table option to td connector:issue to create them automatically:

$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column updated_date --auto-create-table
You can assign a time format column to the partitioning key with the --time-column option.

Scheduled execution

You can schedule periodic Data Connector executions for recurring AppAnnie imports. We take great care in distributing and operating our scheduler in order to achieve high availability. By using this feature, you no longer need a cron daemon in your local data center.

Create the schedule

A new schedule can be created using the td connector:create command. The name of the schedule, a cron-style schedule, the database and table where the data will be stored, and the Data Connector configuration file are required.

$ td connector:create \
    daily_appannie_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml
The `cron` parameter also accepts these three options: `@hourly`, `@daily` and `@monthly`.
By default, the schedule is set up in the UTC timezone. You can set the schedule in a different timezone using the -t or --timezone option. Please note that the `--timezone` option only supports extended timezone formats like 'Asia/Tokyo' and 'America/Los_Angeles'. Timezone abbreviations like PST and CST are *not* supported and may lead to unexpected schedules.
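
For example, to run the same daily schedule at 00:10 Japan time instead of UTC:

$ td connector:create \
    daily_appannie_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml \
    -t "Asia/Tokyo"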

List the Schedules

You can see the list of currently scheduled entries with td connector:list.

$ td connector:list
+-----------------------+--------------+----------+-------+--------------+-----------------+-----------------------------+
| Name                  | Cron         | Timezone | Delay | Database     | Table           | Config                      |
+-----------------------+--------------+----------+-------+--------------+-----------------+-----------------------------+
| daily_appannie_import | 10 0 * * *   | UTC      | 0     | td_sample_db | td_sample_table | {"type"=>"app_annie", ... } |
+-----------------------+--------------+----------+-------+--------------+-----------------+-----------------------------+

Show the Setting and History of Schedules

td connector:show shows the execution setting of a schedule entry.

% td connector:show daily_appannie_import
Name     : daily_appannie_import
Cron     : 10 0 * * *
Timezone : UTC
Delay    : 0
Database : td_sample_db
Table    : td_sample_table

td connector:history shows the execution history of a schedule entry. To investigate the results of each individual execution, please use td job <jobid>.

% td connector:history daily_appannie_import
+--------+---------+---------+--------------+-----------------+----------+---------------------------+----------+
| JobID  | Status  | Records | Database     | Table           | Priority | Started                   | Duration |
+--------+---------+---------+--------------+-----------------+----------+---------------------------+----------+
| 578066 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-18 00:10:05 +0000 | 160      |
| 577968 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-17 00:10:07 +0000 | 161      |
| 577914 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-16 00:10:03 +0000 | 152      |
| 577872 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-15 00:10:04 +0000 | 163      |
| 577810 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-14 00:10:04 +0000 | 164      |
| 577766 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-13 00:10:04 +0000 | 155      |
| 577710 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-12 00:10:05 +0000 | 156      |
| 577610 | success | 10000   | td_sample_db | td_sample_table | 0        | 2015-04-11 00:10:04 +0000 | 157      |
+--------+---------+---------+--------------+-----------------+----------+---------------------------+----------+
8 rows in set

Delete the Schedule

td connector:delete will remove the schedule.

$ td connector:delete daily_appannie_import

Appendix

A) Modes for out plugin

You can specify the import mode in the out section of load.yml.

append (default)

This is the default mode and records are appended to the target table.

in:
  ...
out:
  mode: append

replace (In td 0.11.10 and later)

This mode replaces data in the target table. Please note that any manual schema changes made to the target table will remain intact with this mode.

in:
  ...
out:
  mode: replace

B) Available targets

Target                Description
account_connections   Connected accounts
connected_products    Products from connected accounts
shared_products       Shared products from external accounts
product_sales         Product sales data
product_usage         Product usage data
app_details           Application details
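
As a sketch (the apikey is a placeholder), a minimal configuration for a target that needs no date range or breakdown, such as app_details, could look like this:

in:
  type: app_annie
  apikey: xxxxxxxx
  target: app_details
  fetch_type: connected_products (optional, see Appendix D)
out:
  mode: append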

C) Available breakdowns

This field is available only when importing product sales or product usage.

  • If target is product_sales, the breakdown field name is breakdown_sales
  • If target is product_usage, the breakdown field name is breakdown_usage
Breakdown             Product Sales   Product Usage
country               x               x
country+iap           x
country+device                        x
date                  x               x
date+country          x               x
date+country+device                   x
date+country+iap      x
date+device                           x
date+iap              x
date+type+iap         x
device                                x
iap                   x

D) Available fetch-types

This field is available for importing product sales, product usage, and app details.

Source               Description
connected_products   Import only data of products from connected accounts
shared_products      Import only data of products from the sharing list
both                 Import data of products from both sources above

E) Available currencies

This field is available for importing product sales only.

Currency Code   Symbol   Full Name of Currency
AUD             A$       Australian Dollar
BGN             лв       Bulgarian lev
BRL             R$       Brazilian real
CAD             C$       Canadian Dollar
CHF             CHF      Swiss Franc
CNY             ¥        Chinese Yuan
CZK             Kč       Czech koruna
DKK             kr       Danish krone
EEK             kr       Estonian kroon
EUR             €        Euro
GBP             £        Pound sterling
HKD             HK$      Hong Kong dollar
HRK             kn       Croatian kuna
HUF             Ft       Hungarian forint
IDR             Rp       Indonesian rupiah
ILS             ₪        Israeli new shekel
INR             ₹        Indian rupee
JPY             ¥        Japanese yen
KRW             ₩        South Korean won
LTL             Lt       Lithuanian litas
LVL             Ls       Latvian lats
MXN             Mex$     Mexican peso
MYR             RM       Malaysian ringgit
NOK             kr       Norwegian krone
NZD             $        New Zealand dollar
PHP             ₱        Philippine peso
PLN             zł       Polish złoty
RON             lei      Romanian new leu
RUB             p.       Russian rouble
SEK             kr       Swedish krona/kronor
SGD             S$       Singapore dollar
THB             ฿        Thai baht
TRY             TL       Turkish lira
TWD             NT$      New Taiwan dollar
USD             $        United States dollar
ZAR             R        South African rand

Reference: https://support.appannie.com/hc/en-us/articles/204209074-4-Currency-List

F) Rate Limits

There are two different rate limits in AppAnnie: a calls-per-minute limit and a calls-per-user-per-day limit. The per-minute limit is automatically refreshed after a number of seconds, while the daily limit is refreshed once a day at 00:00 PST.

If you have multiple transfers under the same AppAnnie account, you can control the rate limit usage of each transfer via the calls_per_minute_limit and calls_per_day_limit settings, as long as the totals do not exceed your account quota. For example, assume your account quota is 100 calls/minute and 10,000 calls/day. If you create two transfers, e.g. product sales and product usage, you could allot 50 calls/minute and 5,000 calls/day to the product sales transfer and the remaining 50 calls/minute and 5,000 calls/day to the product usage transfer.
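
For instance, the in sections of those two configuration files could split the quota as follows (a sketch only; the apikey, breakdowns, and dates are placeholders):

in:
  type: app_annie
  apikey: xxxxxxxx
  target: product_sales
  breakdown_sales: date+country
  calls_per_minute_limit: 50
  calls_per_day_limit: 5000
  ...

in:
  type: app_annie
  apikey: xxxxxxxx
  target: product_usage
  breakdown_usage: date+country
  start_date: 2017-01-01 (required for product_usage)
  calls_per_minute_limit: 50
  calls_per_day_limit: 5000
  ...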

