# KARTE Export Integration

[Learn more about KARTE Import Integration](/int/karte-import-integration).

You can use the KARTE connector to write job results from queries run in Treasure Data to your Google Cloud Storage bucket.

# Prerequisites

- Basic knowledge of Treasure Data, including the [Toolbelt](https://toolbelt.treasuredata.com/).
- A Google Cloud Platform account.

# Getting Google Cloud Platform Credentials

Before creating the connector, gather the following information: your *Project ID* and *JSON Credential*.

## JSON Credential

The integration with Google Cloud Storage is based on server-to-server API authentication. Go to your *Google Developer Console* and select **Credentials** under *APIs & auth* in the left menu. Then, select **Service account**.

![](/assets/image-20191002-134701.439a883f70e8d8d23edfb7dd8443b07cbf84c337e1e83b8fc34d613648729db6.45d33724.png)

Select the *JSON* key type, which is Google's recommended configuration. The key is automatically downloaded by the browser.

![](/assets/image-20191002-135027.40548c465aab9c1206e754feddfea9ce75b35a6f1758bbc448c6058fca0f710d.45d33724.png)

The Service Account owner who generated the JSON Credential must have Write permission for the destination bucket.

# Specifying Output in Your KARTE Connector

## Using the TD Console

### Create the destination Bucket in Google Cloud Storage

Create your *bucket* from your Google Cloud Storage console.

### Write the query

Go to the [Treasure Data Console query editor page](https://console.treasuredata.com/app/queries/editor) and compose your query.

### Specify the Result Export target

In the same window, select **Create New Connection** in *Result Export*.

![](/assets/image-20191002-134829.388713a639208dedf7274c5ce4a5635d96a085dd5638e16ad520b5441ecd6da2.45d33724.png)

In the dialog that opens, search for and select your connection type: *KARTE*. Provide a name for your *KARTE* connection.
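As a reference, a downloaded service-account key file has roughly the following shape. All values below are placeholders, not real credentials; the field names match those used in the CLI example later on this page:

```
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "ABCDEFGHIJ",
  "private_key": "-----BEGIN PRIVATE KEY-----\nABCDEFGHIJ\n-----END PRIVATE KEY-----\n",
  "client_email": "ABCDEFGHIJ@developer.gserviceaccount.com",
  "client_id": "ABCDEFGHIJ.apps.googleusercontent.com"
}
```

You paste the contents of this file into the JSON Credential field of the connection dialog.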
Complete the information, including your JSON Credential, Bucket name, and Path:

![](/assets/image-20191002-134857.656a9f0de46cfb1fd35ed35b84c57cdb29c512644f868b21f9d063d1ebca8c9f.45d33724.png)

![](/assets/image-20191002-134934.06e85952c6070679f8d059ac8fe564d9ebe46f729e3420d08d3abbc1efe88dda.45d33724.png)

### Execute the query

Finally, either save the query with a name or just run it. After the query completes successfully, the results are automatically sent to the specified Google Cloud Storage destination.

![](/assets/image-20191002-134949.40e5f9cff9bab28ade6ce38bc95fec50f470b84b0c1c75dae446b32e80489edd.45d33724.png)

## Activate a Segment in Audience Studio

You can also send segment data to the target platform by creating an activation in Audience Studio.

1. Navigate to **Audience Studio**.
2. Select a parent segment.
3. Open the target segment, right-click, and then select **Create Activation**.
4. In the **Details** panel, enter an activation name and configure the activation according to the previous section on configuration parameters.
5. Customize the activation output in the **Output Mapping** panel.

   ![](/assets/ouput.b2c7f1d909c4f98ed10f5300df858a4b19f71a3b0834df952f5fb24018a5ea78.8ebdf569.png)

   - Attribute Columns
     - Select **Export All Columns** to export all columns without making any changes.
     - Select **+ Add Columns** to add specific columns for the export. The Output Column Name pre-populates with the same Source column name. You can update the Output Column Name. Continue to select **+ Add Columns** to add new columns for your activation output.
   - String Builder
     - Select **+ Add string** to create strings for export. Select from the following values:
       - String: Choose any value; use text to create a custom value.
       - Timestamp: The date and time of the export.
       - Segment Id: The segment ID number.
       - Segment Name: The segment name.
       - Audience Id: The parent segment number.
6. Set a **Schedule**.
   ![](/assets/snippet-output-connector-on-audience-studio-2024-08-28.a99525173709da1eb537f839019fa7876ffae95045154c8f2941b030022f792c.8ebdf569.png)

   - Select the values to define your schedule and optionally include email notifications.

7. Select **Create**.

If you need to create an activation for a batch journey, review [Creating a Batch Journey Activation](/products/customer-data-platform/journey-orchestration/batch/creating-a-batch-journey-activation).

## Use the CLI

You can set a scheduled query to send job results to Google Cloud Storage using KARTE. Specify your json_key inline and escape its newlines with backslashes.

**Example**

```
$ td sched:create scheduled_karte "10 6 * * *" \
  -d dataconnector_db "SELECT id,account,purchase,comment,time FROM data_connectors" \
  -r '{"type":"karte","bucket":"samplebucket","path_prefix":"/output/test.csv","format":"csv","compression":"","header_line":false,"delimiter":",","null_string":"","newline":"CRLF", "json_keyfile":"{\"private_key_id\": \"ABCDEFGHIJ\", \"private_key\": \"-----BEGIN PRIVATE KEY-----\\nABCDEFGHIJ\\nABCDEFGHIJ\\n-----END PRIVATE KEY-----\\n\", \"client_email\": \"ABCDEFGHIJ@developer.gserviceaccount.com\", \"client_id\": \"ABCDEFGHIJ.apps.googleusercontent.com\", \"type\": \"service_account\"}"}'
```

## (Optional) Configure Export Results in Workflow

Within Treasure Workflow, you can specify the use of this data connector to output data.

```
timezone: UTC

_export:
  td:
    database: sample_datasets

+td-result-into-target:
  td>: queries/sample.sql
  result_connection: karte_integration
  result_settings:
    bucket: your_bucket
    path_prefix: dir/example.csv.gz
    format: csv
    compression: gz
    newline: CR
    ....
```

For more information on using data connectors in Treasure Data Workflow to export data, review [Exporting Data with Parameters](https://docs.treasuredata.com/display/public/PD/Exporting+Data+with+Parameters).

## KARTE Connector Parameters

You can use the following parameters when configuring the KARTE connector.
| Parameter | Description |
| --- | --- |
| **bucket** | Bucket name |
| **path_prefix** | File path |
| **format** | "csv" or "tsv" |
| **compression** | "", "gz", or "encryption_pgp" |
| **public key** | The public key used to encrypt the file before it is uploaded |
| **Key Identifier** | The Key ID of the encryption subkey used to secure the file. The master key is excluded from the encryption process. |
| **Armor** | Whether to use ASCII armor |
| **Compression Type** | The compression algorithm used to compress the file. The file is compressed before encryption and upload, so when you decrypt it, the file returns to a compressed format such as .gz or .bz2. |
| **header_line** | true or false |
| **delimiter** | "," or "\t" or "\|" |
| **null_string** | "" or "\N" |
| **newline** | "CRLF" or "CR" or "LF" |
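As a sketch, the lowercase parameters above map directly onto the `result_settings` block of a workflow, in the same way as the earlier workflow example. The bucket name and path below are placeholders; this variant writes an uncompressed TSV with a header row:

```
  result_connection: karte_integration
  result_settings:
    bucket: your_bucket
    path_prefix: dir/example.tsv
    format: tsv
    compression: ""
    header_line: true
    delimiter: "\t"
    null_string: "\N"
    newline: LF
```

The PGP-related parameters (public key, Key Identifier, Armor, Compression Type) apply only when `compression` is set to `encryption_pgp`.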