Google Business Profile Import Integration Using The CLI

You can import customer information using the Google Business Profile (formerly Google My Business) import integration. With this connector, you can consolidate statistics, including ratings and reviews, across all of your locations. If you are interested in participating in the Google Business Profile beta, contact your Treasure Data Customer Success representative.

Prerequisites

Install ‘td’ Command

Install the Treasure Data Toolbelt.
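
If you have a Ruby environment available, one common way to install the Toolbelt is through RubyGems (platform-specific installer packages are also available from Treasure Data); for example:

$ gem install td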

Create Config File (load.yml)

For Accounts:

in:
    type: google_business_profile
    client_id: my-client-id
    client_secret: secret
    refresh_token: token
    data_type: accounts

filters:
  - type: add_time
    from_value: {mode: upload_time}
    to_column: {name: time}

out: {mode: append}

For Locations:

in:
    type: google_business_profile
    client_id: my-client-id
    client_secret: secret
    refresh_token: token
    data_type: locations

filters:
  - type: add_time
    from_value: {mode: upload_time}
    to_column: {name: time}

out: {mode: append}

For Location Reviews:

in:
    type: google_business_profile
    client_id: my-client-id
    client_secret: secret
    refresh_token: token
    data_type: location_reviews

filters:
  - type: add_time
    from_value: {mode: upload_time}
    to_column: {name: time}

out: {mode: append}

For Location Daily Metrics Time Series:

in:
    type: google_business_profile
    client_id: my-client-id
    client_secret: secret
    refresh_token: token
    data_type: location_daily_metrics_time_series
    daily_metric: BUSINESS_IMPRESSIONS_DESKTOP_MAPS
    start_time: "2014-10-02"
    end_time: "2014-10-03"
    incremental: true

filters:
  - type: add_time
    from_value: {mode: upload_time}
    to_column: {name: time}

out: {mode: append}

For Location Monthly Search Keywords Impressions:

in:
    type: google_business_profile
    client_id: my-client-id
    client_secret: secret
    refresh_token: token
    data_type: location_monthly_search_keywords_impressions
    start_time: "2014-10"
    end_time: "2014-11"

filters:
  - type: add_time
    from_value: {mode: upload_time}
    to_column: {name: time}

out: {mode: replace}

Configuration keys and descriptions are as follows:

client_id (string, required): OAuth client ID.
client_secret (string, required): OAuth client secret.
refresh_token (string, required): OAuth refresh token, used to exchange for an access token.
data_type (string, required): Type of data to import: accounts, locations, location_reviews, location_daily_metrics_time_series, or location_monthly_search_keywords_impressions.
daily_metric (string, required only for the location_daily_metrics_time_series data type): Supported values are BUSINESS_IMPRESSIONS_DESKTOP_MAPS, BUSINESS_IMPRESSIONS_DESKTOP_SEARCH, BUSINESS_IMPRESSIONS_MOBILE_MAPS, BUSINESS_IMPRESSIONS_MOBILE_SEARCH, BUSINESS_CONVERSATIONS, BUSINESS_DIRECTION_REQUESTS, CALL_CLICKS, WEBSITE_CLICKS, BUSINESS_BOOKINGS, and BUSINESS_FOOD_ORDERS.
start_time (string, optional): For Location Daily Metrics Time Series, the inclusive start date of the report range in YYYY-MM-DD format (for example, 2022-10-10). For Location Monthly Search Keywords Impressions, the inclusive start month in YYYY-MM format (for example, 2022-10).
end_time (string, optional): For Location Daily Metrics Time Series, the inclusive end date of the report range in YYYY-MM-DD format (for example, 2022-10-10). For Location Monthly Search Keywords Impressions, the inclusive end month in YYYY-MM format (for example, 2022-10).
incremental (boolean, optional): Whether the import should be incremental.
retry_limit (int, optional): Maximum number of retries for each API call. Default: 7.
initial_retry_wait (int, optional): Wait time before the first retry. Default: 5 seconds.
max_retry_wait (int, optional): Maximum wait time between retries. Default: 300 seconds.
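
For example, a Location Daily Metrics Time Series configuration that also sets the optional incremental and retry keys might look like the following. This is an illustrative sketch: the metric, date range, and retry values are placeholders, and the optional keys are shown alongside the other in: settings.

in:
    type: google_business_profile
    client_id: my-client-id
    client_secret: secret
    refresh_token: token
    data_type: location_daily_metrics_time_series
    daily_metric: WEBSITE_CLICKS        # any supported daily_metric value
    start_time: "2022-10-01"            # inclusive, YYYY-MM-DD for daily metrics
    end_time: "2022-10-31"
    incremental: true                   # import incrementally on subsequent runs
    retry_limit: 7                      # optional; default values shown
    initial_retry_wait: 5
    max_retry_wait: 300

filters:
  - type: add_time
    from_value: {mode: upload_time}
    to_column: {name: time}

out: {mode: append}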

Execute Load Job

Submit the load job. It may take a couple of hours, depending on the data size. You must specify the database and table where the data will be stored.

$ td connector:issue load.yml --database td_sample_db --table td_sample_table

The preceding command assumes that you have already created the database (td_sample_db) and the table (td_sample_table). If the database or the table does not exist in TD, the command will not succeed. Either create the database and table manually, or use the --auto-create-table option with the td connector:issue command to create them automatically, as follows.

$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column created_at --auto-create-table 

You can assign a time-format column as the partitioning key by using the --time-column option.

Scheduled Execution

You can schedule periodic data connector execution for recurring data imports. We carefully configure our scheduler to ensure high availability. By using this feature, you no longer need to run a cron daemon in your local data center.

Create the Schedule

A new schedule can be created by using the td connector:create command. The name of the schedule, a cron-style schedule, the database and table where the data will be stored, and the data connector configuration file are required.

$ td connector:create \
    daily_report_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml 

The cron parameter also accepts three special options: @hourly, @daily and @monthly.
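
For example, to run the same import once a day using the @daily shortcut instead of an explicit cron expression (illustrative only; it reuses the database, table, and load.yml from the previous example):

$ td connector:create \
    daily_report_import \
    "@daily" \
    td_sample_db \
    td_sample_table \
    load.yml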

List the Schedules

You can see the list of scheduled entries by running td connector:list.

$ td connector:list
+---------------------------+--------------+----------+-------+--------------+-----------------+-------------------------------------------+
| Name                      | Cron         | Timezone | Delay | Database     | Table           | Config                                    |
+---------------------------+--------------+----------+-------+--------------+-----------------+-------------------------------------------+
| daily_report_import       | 10 0 * * *   | UTC      | 0     | td_sample_db | td_sample_table | {"type"=>"google_business_profile",...    |
+---------------------------+--------------+----------+-------+--------------+-----------------+-------------------------------------------+

Show the Setting and History of Schedules

td connector:show shows the execution settings of a schedule entry.

$ td connector:show daily_report_import
Name     : daily_report_import
Cron     : 10 0 * * *
Timezone : UTC
Delay    : 0
Database : td_sample_db
Table    : td_sample_table
Config
---
// Displayed load.yml configuration.

td connector:history shows the execution history of a schedule entry. To investigate the results of each individual execution, use td job <jobid>.
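
For example, to list the past runs of the schedule created above:

$ td connector:history daily_report_import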

Delete the Schedule

td connector:delete removes the schedule.

$ td connector:delete daily_report_import