The Google Analytics Data API gives you programmatic access to Google Analytics 4 (GA4) report data. Google Analytics 4 helps you understand how people use your web, iOS, or Android app.

This Data Connector is in Beta. For more information, contact support@treasuredata.com.

Prerequisites

  • Basic knowledge of Treasure Data

  • Basic knowledge of Google Analytics 4 (GA4)

  • A Google Analytics 4 account

Requirements

A Google user account (for OAuth) or a Google service account JSON key

Authentication Method

Google User Account: OAuth

Using OAuth is the most common method and requires fewer setup steps. You can skip the rest of this section and go directly to TD Console.

Google Service Account—JSON key

Using a JSON key might be required for your implementation.

This method requires setup steps in the Google API Console.

Set the Google Analytics Data API for JSON Key Authentication


1. Open the Google API Console: https://console.developers.google.com
2. Sign in to the Google account you want to access through the API.
3. Select NEW PROJECT.
4. Name your project.
5. Select Create.
6. Enable the Google Analytics Data API v1: navigate to APIs & Services for your project, then select + ENABLE APIS & SERVICES.
7. From the Library page, search for the Google Analytics Data API, then press Enter.
8. Choose the Google Analytics Data API.
9. Select Enable.
10. Create a new service account.
11. From APIs & Services, select Credentials, then Service account.
12. Complete the fields in step 1 to create the service account. You can skip the two optional steps.


13. Select Done.
14. Select the Actions tab.
15. Select Keys.
16. From the Add key pulldown, select Create new key.
17. Choose the JSON key type and select Create. The key file downloads to your local machine.
18. A notice appears that your private key is saved to your computer. Keep the file in a secure place.
19. Edit the configuration file for the integration using the private key information. Your file might look like this:
{
  "type": "service_account",
  "project_id": "central-stream-314923",
  "private_key_id": "94d03bf7dd9c05bc122c695d1aa13f2a8a28f88e",
  "private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvQIBADANBgkqhkiG9w**********************AoIBAQCNhAICLr/dozCQ\nTW9ZNMNJ6RF+fVqhd0FUbw0VBIwy6BWu/LuaocJrzl2DHChAl0PNvGCDUAObBTRz\nbUT/HOu47q**********7ENGK\nOir9VChG+Qubq25bAtOq/yTVEPJgnj*******AGVjojVnK4f\n2YtW6ti7xBPwFBF1RPY56yTDeVQVko+KK3x+LFS+lTj1+jBBjvedWHrpQQfRHqV/\nVtXyKyDybQlnfAOucMHzMxjQVLN4f9D7JVxCe52Wp7RaALCIdkKDqN/ffkNMF9QT\nCjffudeTAgMBAAECggEAFMQnS0yy6QI2cSZ7zXpZofHqmEYq04DdfFdjcw8cx6eY\n7vm1Seas0gcRX9j06y2HTJx1CS/np4rm/H0vX8RNrvCPYXrOJzUG2DOnW9pwi9Hl\nKb1Z0VErenzy/em78BI958fXIJ4vv5pjNUZ94njEBE4tbuWEJyTODMyuCfoXpye4\nkCDY6DJFxDKUA7tZOTcK3t0YiVV0O2MwcUhJdr107kw4F1HXY/mlh87ki5z3tMy0\nISBKjvau2aWf0SVLZHtlo88JZGUak7tkuxnWaXQN+dUo1rZWKj867pBT4KWXzAbJ\nUVQ7pBrDFri90fNQ5XFsQdS//dO2pFEn+1Aum86Q0QKBgQDB4RHjBWdJ3eMyvWWi\nipdCx4gC6G5Hqjt+icKv9yddyV/WvuMH82xDAHUJJBzaj9I45O5D+07O6TZO8CkZ\n6Tqq92N3HEkHZWiUTo91C4qbO4ai5SXxpnWn5gsYc+JYPqNp1b+T1gZjA4Pj8l+t\neJ7VDGxu0tjK17Vj13turImXCQKBgQC629KRpvq9FAIWuA8NAXBSeqNyzktPVdOZ\n5GJvwCevVzIapvwZPoZTaJ6xehta1hrR859ZReZx/j7ntoOjAjGw1rS//T5N98Hf\nt+JpCemAa5ApcoUBAXmlb80jIHysRBgMUTLcTKnZuFT3RwsD1xtXjRct0doIF8EC\nd8RLE7FkuwKBgQCJSuGIuwXWqBtAjiBPxwawUm29aWzWsPTqeZF1XHbzEiwc/RX2\nRmmu1L8MFxebqmb6xRr45xh6q2k64xSn9aIG+aLk8RHB/AzfoPYzs1WW8cM4zT5e\nbjs5B01qJn3tcYX051l/zfq92Ppny/X2+Mi5I9ARdpvwoGoh5rDQwbu5SQKBgHx8\naGtKsC75Pm7+TmCevcLlGzEoCHohNqiGw6GphYbF84ZYCwmSYxD8WQTp0YGRtCp9\nQILME7uL40KhkE8v7gTe9WoWf8SXs5ykt/y8cshwYImMVtmVrwItWp/1S7nEX7UM\n/3JOzLVUnZ5jwQ3c58VLJM8MyFGt6ZMIUUinJP5zAoGAICmOlDqPWR2RXPo+9SkN\nok82AjvsjeUMDsiCkEVsAQMBZkYbND0047BAg7STqVjIaJg0zYFvQ5oow5zgu1lk\n46nxtfQm3U58lILErGsmClxcOZR2nO7kvm0PJMUgENADGhP5pqE+8w+e4JC45Ojw\nX7X+hhL/a7pu2Un9O/rXZVM=\n-----END PRIVATE KEY-----\n",
  "client_email": "meg-td-service-account@central-stream-314923.iam.gserviceaccount.com",
  "client_id": "117460147437348814027",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/meg-td-service-account%40central-stream-314923.iam.gserviceaccount.com"
}
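Before pasting the key into the integration, you can sanity-check it with a few lines of Python. This helper is illustrative, not part of the connector; the sample key below is a trimmed placeholder in the shape of a real downloaded key.

```python
import json

# A trimmed sample of a downloaded key (values are placeholders).
sample = """{
  "type": "service_account",
  "project_id": "central-stream-314923",
  "private_key": "-----BEGIN PRIVATE KEY-----\\n...\\n-----END PRIVATE KEY-----\\n",
  "client_email": "meg-td-service-account@central-stream-314923.iam.gserviceaccount.com",
  "token_uri": "https://oauth2.googleapis.com/token"
}"""

def check_key(text: str) -> str:
    """Validate a service account key and return the client_email to grant access to."""
    key = json.loads(text)  # raises ValueError if the pasted text is not valid JSON
    required = {"type", "project_id", "private_key", "client_email", "token_uri"}
    missing = required - key.keys()
    if missing:
        raise ValueError(f"key is missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError("not a service_account key")
    return key["client_email"]

# This is the address you grant access to in Google Analytics (next section).
print(check_key(sample))
```

The returned client_email value is the address you add to the GA4 property in the next section.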




Add the Service Account to Google Analytics 4

You need the JSON key file from the previous step to add a service account to Google Analytics 4. You must have permission to manage users on the GA4 property that is accessed via the Google Analytics Data API v1.


1. Using a text editor, open the JSON key file and search for the client_email field to collect the service account email address. It might look similar to the following: "client_email": "meg-td-service-account@central-stream-314923.iam.gserviceaccount.com".
2. Sign in to Google Analytics https://analytics.google.com.
3. Select Admin, then navigate to the desired account/property.

4. In the Account or Property pane (depending upon whether you want to add users at the account or property level), select Access Management.
5. From the Account permissions list, select +, then select Add users.

6. Enter the email address you collected from the JSON key. Select the Viewer role or higher (Analyst, Editor, or Administrator). Select Add.

Your service account now has access to Google Analytics 4 through the Google Analytics Data API.



Obtain the Property ID from Google Analytics

You must have the Property ID to create a data import into Treasure Data.

1. Sign in to Google Analytics https://analytics.google.com/
2. Select Admin, and navigate to the desired account/property

3. Capture or copy the Property ID; you need it when you create the Treasure Data import.


Obtain the custom dimension or custom metrics name from Google Analytics

You may want to customize your data by creating custom dimensions or custom metrics.

1. Sign in to Google Analytics https://analytics.google.com.
2. Select Admin, and navigate to the desired account and select Configure.

3. Select Custom definitions.

You can obtain a custom dimension or metric name by concatenating: custom{Scope}:{dimension or metric name}

For example, customUser:test_cs_dimension is a custom dimension name and customEvent:test_cs_metric is a custom metric name.
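The concatenation rule above can be expressed as a small helper. The scope values checked here are the GA4 custom-definition scopes; the helper itself is illustrative, not part of the connector:

```python
def custom_field_name(scope: str, name: str) -> str:
    """Build a Data API custom field name of the form custom{Scope}:{name}."""
    allowed = {"User", "Event", "Item"}  # GA4 custom-definition scopes
    if scope.capitalize() not in allowed:
        raise ValueError(f"unknown scope: {scope}")
    return f"custom{scope.capitalize()}:{name}"

print(custom_field_name("User", "test_cs_dimension"))   # customUser:test_cs_dimension
print(custom_field_name("Event", "test_cs_metric"))     # customEvent:test_cs_metric
```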


Creating the Data Connector from the TD Console

Create a New Connection

In Treasure Data, you must create and configure the data connection prior to running your query. As part of the data connection, you provide authentication to access the integration.

1. Open TD Console.
2. Navigate to Integrations Hub > Catalog.
3. Search for and select Google Analytics Data API.

4. Select Create Authentication.
5. Choose one of the following authentication methods:

JSON Key

Enter the JSON key information. Make sure that you include the private key of the service account and that the entire JSON key is enclosed in brackets {…}.

  1. Locate and open the JSON file that you downloaded from the Google Cloud Platform in your favorite text editor.
  2. Copy and paste the JSON key into the text box.
  3. Select Continue.


OAuth

  1. Select OAuth.
  2. If you know the OAuth connection, type it; otherwise, select the option to connect a new account.
  3. If you chose to connect a new account, you are taken through a series of screens where you select the account to link and confirm that you grant Treasure Data access to that account.
    You are then returned to Authentications in TD Console.
  4. Search and select Google Analytics Data API.

  5. Select your account to show in the Authentication connection field. 
  6. Select Continue.

6. Now that you've selected your authentication method, enter a name for your connection.
7. Select Done.



Transfer Your Data to Treasure Data


After creating the authenticated connection, you are automatically taken to Authentications.

You must enter Dimension and Metric (or Custom Dimension and Custom Metric) information from Google Analytics 4. Visit https://developers.google.com/analytics/devguides/reporting/data/v1/api-schema for the list of dimensions and metrics.

1. Search for the connection you created. 
2. Select New Source.
3. Type a name for your Source in the Data Transfer field.
4. Select Next.

The Source Table dialog opens. You must provide Dimension or Custom Dimension, and Metric or Custom Metric information.



5. Edit the following parameters:
Property ID

Your GA4 Property ID. See Obtain the Property ID from Google Analytics.

Report Mode

Basic or Advanced report. Visit https://developers.google.com/analytics/devguides/reporting/data/v1/advanced if you want to build an advanced report.

Dimensions

Dimension or Custom Dimension names. Visit https://developers.google.com/analytics/devguides/reporting/data/v1/api-schema for the list of dimensions, and read Obtain the custom dimension or custom metrics name from Google Analytics to retrieve the list of custom dimensions.

For example:

  • audienceId
  • customUser:test_cs_dimension
Metrics

Metric or Custom Metric names. Visit https://developers.google.com/analytics/devguides/reporting/data/v1/api-schema for the list of metrics, and read Obtain the custom dimension or custom metrics name from Google Analytics to retrieve the list of custom metrics.

For example:

  • promotionViews
  • promotionClicks
  • customEvent:test_cs_metric
Start Date

The start date for the report query. Supported formats: YYYY-MM-DD, today, yesterday, NdaysAgo. The parameter value is case-sensitive.

For example: 

  • 2022-03-12
  • today
  • yesterday
  • 4daysAgo
End Date

The end date for the report query. Supported formats: YYYY-MM-DD, today, yesterday, NdaysAgo. The parameter value is case-sensitive.

For example: 

  • 2022-03-12
  • today
  • yesterday
  • 4daysAgo
Advanced Payload

Advanced report definition in JSON format. Visit https://developers.google.com/analytics/devguides/reporting/data/v1/advanced for more detail.

For example:

{
  "metrics": [ { "name": "cohortActiveUsers" } ],
  "dimensions": [
    { "name": "cohort" },
    { "name": "cohortNthWeek" },
    { "name": "country" }
  ],
  "cohortSpec": {
    "cohorts": [
      {
        "dimension": "firstSessionDate",
        "dateRange": {
          "startDate": "2022-10-04",
          "endDate": "2022-10-10"
        }
      }
    ]
  }
}
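Because the Start Date and End Date values are case-sensitive, it can be worth validating them before saving the source. A small illustrative check (not part of the connector):

```python
import re

# Accepted forms: YYYY-MM-DD, today, yesterday, NdaysAgo (case-sensitive).
DATE_RE = re.compile(r"\d{4}-\d{2}-\d{2}|today|yesterday|\d+daysAgo")

def valid_report_date(value: str) -> bool:
    """Return True if value is an accepted Start Date / End Date string."""
    return DATE_RE.fullmatch(value) is not None

for v in ["2022-03-12", "today", "4daysAgo", "Yesterday"]:
    # "Yesterday" fails because the values are case-sensitive
    print(v, valid_report_date(v))
```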


6. Select Next.
7. Select Next.
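For reference, the Basic report parameters above correspond to the fields of the Data API's runReport request body, which the connector presumably builds on your behalf. A minimal sketch of that mapping (the function name is ours; the JSON field names are from the Data API schema):

```python
import json

def run_report_body(dimensions, metrics, start_date, end_date):
    """Build the JSON body of a Data API runReport request from connector-style parameters."""
    return {
        "dimensions": [{"name": d} for d in dimensions],
        "metrics": [{"name": m} for m in metrics],
        "dateRanges": [{"startDate": start_date, "endDate": end_date}],
    }

body = run_report_body(
    ["audienceId", "customUser:test_cs_dimension"],
    ["promotionViews", "customEvent:test_cs_metric"],
    "2022-03-12",
    "today",
)
print(json.dumps(body, indent=2))
```

The Advanced Payload parameter accepts this same JSON shape directly, extended with fields such as cohortSpec shown above.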

Data Preview 

You can see a preview of your data before running the import by selecting Generate Preview.

Data shown in the data preview is approximated from your source. It is not the actual data that is imported.

  1. To preview your data, select Generate Preview.
    Data preview is optional; you can safely skip it by selecting Next.

  2. Verify that the data looks approximately as you expect.

  3. Select Next.

Data Placement

For data placement, select the target database and table where you want your data placed and indicate how often the import should run.

  1. Select Next. Under Storage, create a new database and table, or select existing ones, for the imported data.

  2. Select a Database > Select an existing or Create New Database.

  3. Optionally, type a database name.

  4. Select a Table > Select an existing or Create New Table.

  5. Optionally, type a table name.

  6. Choose the method for importing the data.

    • Append (default): data import results are appended to the table.
      If the table does not exist, it is created.

    • Always Replace: replaces the entire content of an existing table with the result output of the query. If the table does not exist, a new table is created.

    • Replace on New Data: replaces the entire content of an existing table with the result output only when there is new data.

  7. Select the Timestamp-based Partition Key column.
    If you want a different partition key seed than the default key, you can specify a long or timestamp column as the partitioning time. By default, the upload_time column generated by the add_time filter is used.

  8. Select the Timezone for your data storage.

  9. Under Schedule, you can choose when and how often you want to run this query.

    • Run once:
      1. Select Off.

      2. Select Scheduling Timezone.

      3. Select Create & Run Now.

    • Repeat the query:

      1. Select On.

      2. Select the Schedule. The UI provides four options: @hourly, @daily, @monthly, or custom cron.

      3. Optionally, select Delay Transfer and add a delay to the execution time.

      4. Select Scheduling Timezone.

      5. Select Create & Run Now.

After your transfer has run, you can see the results in Data Workbench > Databases.


Import via Workflow

You can import data from Google Analytics 4 by using the td_load>: operator in a workflow, if you have already created a Source.

1. In the TD Console, navigate to Integrations Hub > Sources.

2. Select a connector and then select the more menu (…). Select Copy Unique ID.
3. Define a workflow task using the td_load>: operator:

+load:
  td_load>: unique_id_of_your_source
  database: ${td.dest_db}
  table: ${td.dest_table}

4. Run the workflow.