Zendesk® Sunshine is a modern, cloud-based CRM platform used to identify customer needs and trends. You can use the Zendesk Export Integration connector to export Events data from Treasure Data to Zendesk Sunshine.
- Basic knowledge of Treasure Data
- Zendesk Sunshine account
The output schema of your query result must match the required columns (name and data type) of the target event.
You can define up to 20 profile_identifier columns, numbered consecutively in ascending order from 1 to 20 (profile_identifier_1, profile_identifier_2, and so on).
| Action | Target | Required columns | At least one required | Optional columns | Note |
|---|---|---|---|---|---|
| Create | Events for profiles | profile_source(String), profile_type(String), event_source(String), event_type(String), event_properties(JSON String) | profile_identifier_1 to profile_identifier_20 (String) | profile_name, profile_created_at, profile_attributes, event_description, event_created_at, event_received_at | See "Identifier columns" below for format and limits. |
| Create | Events for a profile ID | profile_id(String), event_source(String), event_type(String), event_properties(JSON String) | Not applicable | event_description, event_created_at, event_received_at | See "Events for a profile ID" details. |
| Create | Events for a user ID | user_id(String), profile_source(String), profile_type(String), event_source(String), event_type(String), event_properties(JSON String) | profile_identifier_1 to profile_identifier_20 (String) | profile_name, profile_created_at, profile_attributes, event_description, event_created_at, event_received_at | See "Identifier columns" below for format and limits. |
Events for profiles
Required columns
- profile_source (String)
- profile_type (String)
- event_source (String)
- event_type (String)
- event_properties (JSON String) – provide serialized key-value pairs for the event payload.
Identifiers (at least one)
- Up to 20 columns named profile_identifier_1 through profile_identifier_20.
- Each value must follow the type,value pattern (for example, email,example@gmail.com); a sample query illustrating this format appears after these column lists.
Optional columns
- profile_name
- profile_created_at
- profile_attributes
- event_description
- event_created_at
- event_received_at
Events for a profile ID
Required columns
- profile_id (String)
- event_source (String)
- event_type (String)
- event_properties (JSON String)
Optional columns
- event_description
- event_created_at
- event_received_at
Events for a user ID
Required columns
- user_id (String)
- profile_source (String)
- profile_type (String)
- event_source (String)
- event_type (String)
- event_properties (JSON String)
Identifiers (at least one)
- The same profile_identifier_1 through profile_identifier_20 columns and the same type,value rule as listed above.
Optional columns
- profile_name
- profile_created_at
- profile_attributes
- event_description
- event_created_at
- event_received_at
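As referenced above, the following is a minimal sketch of a query that produces rows in the shape required by the Events for profiles target. It assumes the Presto/Trino engine and a hypothetical source table web_events with columns user_email and page_url; the literal profile and event values are placeholders. The identifier uses the type,value pattern and event_properties is serialized as a JSON string.

```sql
-- Minimal sketch for the "Events for profiles" target.
-- web_events, user_email, and page_url are hypothetical; replace them
-- with your own table and columns.
SELECT
  'treasure_data'               AS profile_source,
  'customer'                    AS profile_type,
  CONCAT('email,', user_email)  AS profile_identifier_1,   -- type,value pattern
  'website'                     AS event_source,
  'page_view'                   AS event_type,
  -- Serialize the event payload as a JSON string
  JSON_FORMAT(CAST(MAP(ARRAY['url'], ARRAY[page_url]) AS JSON)) AS event_properties
FROM
  web_events;
```

Building event_properties from a MAP rather than concatenating strings keeps the JSON valid even when values contain quotes or other special characters.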
The API enforces rate limiting on standard create endpoints. Limits include requests made from the Zendesk Apps framework. The rate limits for each plan type are:
| Resource | Sunshine Lite | Sunshine Professional | Sunshine Enterprise |
|---|---|---|---|
| Custom objects | 500 requests per minute | 750 requests per minute | 1,000 requests per minute |
| Profiles | Not included | 250 requests per minute | 500 requests per minute |
| Events | Not included | 250 requests per minute | 500 requests per minute |
The following table shows the hard limits for profiles. Availability and limits can vary according to Sunshine plan type.
| Category | Limit |
|---|---|
| Profiles | 30,000,000 |
| Profile sources | 50 |
| Profile types | 2500 |
| Identifiers per profile | 20 |
| Profile source character length | 40 |
| Profile type character length | 40 |
| Identifier source character length | 60 |
| Identifier type character length | 60 |
The following table shows the hard limits for events. Availability and limits can vary according to Sunshine plan type.
| Category | Limit |
|---|---|
| Event sources | 50 |
| Event types | 2500 |
| Event source character length | 40 |
| Event type character length | 40 |
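Before exporting, you may want to confirm that your result set stays within these limits. The following is a minimal sketch of such a pre-check, assuming the Presto/Trino engine and a hypothetical staging table zendesk_events_export that already contains the output columns described earlier; compare the returned values against the profiles and events limits above.

```sql
-- Minimal pre-check sketch against the hard limits listed above.
-- zendesk_events_export is a hypothetical staging table holding the
-- columns you plan to export.
SELECT
  COUNT(DISTINCT profile_source) AS profile_sources,        -- limit: 50
  COUNT(DISTINCT profile_type)   AS profile_types,          -- limit: 2,500
  COUNT(DISTINCT event_source)   AS event_sources,          -- limit: 50
  COUNT(DISTINCT event_type)     AS event_types,            -- limit: 2,500
  MAX(LENGTH(profile_source))    AS max_profile_source_len, -- limit: 40 characters
  MAX(LENGTH(profile_type))      AS max_profile_type_len,   -- limit: 40 characters
  MAX(LENGTH(event_source))      AS max_event_source_len,   -- limit: 40 characters
  MAX(LENGTH(event_type))        AS max_event_type_len      -- limit: 40 characters
FROM
  zendesk_events_export;
```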
In Treasure Data, you must create and configure the data connection before running your query so that it can be used during export. As part of the data connection, you provide the authentication credentials needed to access the integration.
Open the TD Console.
Navigate to Integrations Hub > Catalog.
Search and select Zendesk.

- Click Create. You are creating an authenticated connection. The following dialog opens.

To export event data, type the login URL for Zendesk. For example: https://{your_domain}.zendesk.com.
There are different options for Auth method: basic, token, and OAuth.
For basic authentication, type values for username and password.
For token authentication, type values for username and token.
For OAuth, type a value for OAuth Access Token.
Select Continue.
Name your new Zendesk connection.
Select Done.

Create or reuse a query.
In some cases, you need to define the column mapping in the query so that the output matches the required columns of the target.
Open the TD Console.
Navigate to Data Workbench > Queries.
Select the query that you plan to use to export data. Each type of resource requires specific columns with exact column names (case sensitive) and data types. The following code examples can be used to export data:
Events for profiles
```sql
SELECT
  profile_source,
  profile_type,
  profile_identifier_1,
  event_source,
  event_type,
  event_properties
FROM
  your_table;
```
Events for a profile ID
```sql
SELECT
  profile_id,
  event_source,
  event_type,
  event_properties
FROM
  your_table;
```
Events for a user ID
```sql
SELECT
  user_id,
  profile_source,
  profile_type,
  profile_identifier_1,
  event_source,
  event_type,
  event_properties
FROM
  your_table;
```
Select Export Results located at the top of your query editor. The Choose Integration dialog opens. You have two options when selecting a connection to export the results: using an existing connection or creating a new one. These instructions assume you are selecting an existing connection.
Type the connection name in the search box to filter.
Select your connection.
Select one of the following:

Optionally, select Skip invalid records.
Select Done.
You can use Scheduled Jobs with Result Export to periodically write the output result to a target destination that you specify.
Treasure Data's scheduler feature supports periodic query execution to achieve high availability.
If a cron schedule contains two conflicting specifications, the specification that requests more frequent execution is followed and the other is ignored.
For example, if the cron schedule is '0 0 1 * 1', the 'day of month' and 'day of week' specifications conflict: the former requires the job to run on the first day of each month at midnight (00:00), while the latter requires it to run every Monday at midnight (00:00). The latter specification is followed.
Navigate to Data Workbench > Queries.
Create a new query or select an existing query.
Next to Schedule, select None.

In the drop-down, select one of the following schedule options:

| Drop-down Value | Description |
|---|---|
| Custom cron... | Review Custom cron... details. |
| @daily (midnight) | Run once a day at midnight (00:00) in the specified time zone. |
| @hourly (:00) | Run every hour at 00 minutes. |
| None | No schedule. |

| Cron Value | Description |
|---|---|
| 0 * * * * | Run once an hour. |
| 0 0 * * * | Run once a day at midnight. |
| 0 0 1 * * | Run once a month at midnight on the morning of the first day of the month. |
| "" | Create a job that has no scheduled run time. |
```
* * * * *
- - - - -
| | | | |
| | | | +----- day of week (0 - 6) (Sunday=0)
| | | +---------- month (1 - 12)
| | +--------------- day of month (1 - 31)
| +-------------------- hour (0 - 23)
+------------------------- min (0 - 59)
```
The following named entries can be used:
- Day of Week: sun, mon, tue, wed, thu, fri, sat.
- Month: jan, feb, mar, apr, may, jun, jul, aug, sep, oct, nov, dec.
A single space is required between each field. The values for each field can be composed of:
| Field Value | Example | Example Description |
|---|---|---|
| A single value, within the limits displayed above for each field. | | |
| A wildcard '*' to indicate no restriction based on the field. | '0 0 1 * *' | Configures the schedule to run at midnight (00:00) on the first day of each month. |
| A range '2-5', indicating the range of accepted values for the field. | '0 0 1-10 * *' | Configures the schedule to run at midnight (00:00) on the first 10 days of each month. |
| A list of comma-separated values '2,3,4,5', indicating the list of accepted values for the field. | '0 0 1,11,21 * *' | Configures the schedule to run at midnight (00:00) on the 1st, 11th, and 21st day of each month. |
| A periodicity indicator '*/5' to express how often, within the field's valid range of values, a schedule is allowed to run. | '30 */2 1 * *' | Configures the schedule to run on the 1st of every month, every 2 hours starting at 00:30. '0 0 */5 * *' configures the schedule to run at midnight (00:00) every 5 days starting on the 5th of each month. |
| A comma-separated list of any of the above except the '*' wildcard, for example '2,*/5,8-10'. | '0 0 5,*/10,25 * *' | Configures the schedule to run at midnight (00:00) on the 5th, 10th, 20th, and 25th day of each month. |
- (Optional) You can delay the start time of a query by enabling Delay execution.
Save the query with a name and run, or just run the query. Upon successful completion of the query, the query result is automatically exported to the specified destination.
Scheduled jobs that continuously fail due to configuration errors may be disabled on the system side after several notifications.
You can also send segment data to the target platform by creating an activation in the Audience Studio.
- Navigate to Audience Studio.
- Select a parent segment.
- Open the target segment, right-click, and then select Create Activation.
- In the Details panel, enter an Activation name and configure the activation according to the previous section on Configuration Parameters.
- Customize the activation output in the Output Mapping panel.

- Attribute Columns
- Select Export All Columns to export all columns without making any changes.
- Select + Add Columns to add specific columns for the export. The Output Column Name pre-populates with the same Source column name. You can update the Output Column Name. Continue to select + Add Columns to add new columns for your activation output.
- String Builder
- + Add string to create strings for export. Select from the following values:
- String: Choose any value; use text to create a custom value.
- Timestamp: The date and time of the export.
- Segment Id: The segment ID number.
- Segment Name: The segment name.
- Audience Id: The parent segment number.
- Set a Schedule.

- Select the values to define your schedule and optionally include email notifications.
- Select Create.
If you need to create an activation for a batch journey, review Creating a Batch Journey Activation.
See About Using Workflows to Export Data with the TD Toolbelt for more information on using data connectors in a workflow to export data.
Using Basic Auth (Events for profiles):
```yaml
timezone: UTC

_export:
  td:
    database: sample_datasets

+td-result-into-target:
  td>: queries/sample.sql
  result_connection: your_connections_name
  result_settings:
    login_url: https://{example}.zendesk.com
    auth_method: basic
    username: {username}
    password: {password}
    target: event_profile
    skip_invalid_records: true
```
Using Auth Token (Events for profiles):
```yaml
timezone: UTC

_export:
  td:
    database: sample_datasets

+td-result-into-target:
  td>: queries/sample.sql
  result_connection: your_connections_name
  result_settings:
    login_url: https://{example}.zendesk.com
    auth_method: token
    username: {username}
    token: {token}
    target: event_profile
    skip_invalid_records: true
```
Using OAuth Token (Events for profiles):
```yaml
timezone: UTC

_export:
  td:
    database: sample_datasets

+td-result-into-target:
  td>: queries/sample.sql
  result_connection: your_connections_name
  result_settings:
    login_url: https://{example}.zendesk.com
    auth_method: oauth
    access_token: {access_token}
    target: event_profile
    skip_invalid_records: true
```
Using Basic Auth (Events for a profile ID):
```yaml
timezone: UTC

_export:
  td:
    database: sample_datasets

+td-result-into-target:
  td>: queries/sample.sql
  result_connection: your_connections_name
  result_settings:
    login_url: https://{example}.zendesk.com
    auth_method: basic
    username: {username}
    password: {password}
    target: event_profile_id
    skip_invalid_records: true
```
Using Basic Auth (Events for a user ID):
```yaml
timezone: UTC

_export:
  td:
    database: sample_datasets

+td-result-into-target:
  td>: queries/sample.sql
  result_connection: your_connections_name
  result_settings:
    login_url: https://{example}.zendesk.com
    auth_method: basic
    username: {username}
    password: {password}
    target: event_user_id
    skip_invalid_records: true
```