Use the Treasure Data Customer Data Platform to ingest your event data from Cvent, a popular event management platform.
Basic knowledge of TD Console and TD Toolbelt
Basic knowledge of Cvent
Using TD Console
Create a New Connection
Go to Integrations Hub > Catalog and search for Cvent. Select it.
The following dialog opens.
Enter the required credentials. The API User Name and API Password are different from your Cvent application username and password; contact your Cvent administrator to generate them.
Select Sandbox if you are using a test API account against the Cvent Sandbox environment.
Save the connection.
Create New Source
Select New Source from the authenticated connections.
Select the data type and the fetch time range for this transfer.
Data Type: The supported types are Registration, Invitee, Contact, and Event.
Start Date: The starting point of the data time window. In the preceding example, all Registrations modified on or after 2018-08-01 00:00:00 UTC are fetched.
Duration: The length of the time window. In the preceding example, data is fetched from 2018-08-01 00:00:00 UTC to 2018-09-01 00:00:00 UTC.
Incremental: When running on a schedule, the time window of the fetched data automatically shifts forward on each run. For example, if the initial config is January 1 with a 10-day duration, the first run fetches data modified from January 1 to January 10, the second run fetches data modified from January 11 to January 20, and so on.
The preview shows the actual data for the specified transfer configuration. Columns are sorted alphabetically, with custom field columns (if any) placed at the end. The same order applies to the final results in the target database.
Choose the Target Database and Table
Choose an existing database and table, or create new ones.
The connector inserts the literal time values received from the Cvent API. These values are relative to the Cvent server's implicit time zone and are assumed to be UTC. The Event data type's time-related fields, however, are relative to the event's own time zone. Consequently, the Data Storage Timezone setting does not indicate the actual time zone of the data.
Set a schedule if you want to. The import starts at the scheduled time, or immediately if you choose 'Once now'.
Select START TRANSFER and then check for a running job on the Jobs page.
Using the Command Line
Install the Prerequisites
Install the latest td tool via Ruby gem:
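For example (requires a working Ruby installation):

```shell
gem install td
```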
There are other install methods. For more information, check out Treasure Data Toolbelt.
Create the Config File (config.yml)
Create a configuration file:
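A minimal sketch of a configuration file follows. The option keys under `in:` (such as `api_username`, `data_type`, and `start_date`) are assumptions based on the settings described above — check the preview in TD Console or the connector reference for the exact key names.

```yaml
in:
  type: cvent
  api_username: xxxxxxxx        # API credentials, not your Cvent login (assumed key names)
  api_password: xxxxxxxx
  sandbox: false                # true when testing against Cvent Sandbox
  data_type: registration       # registration | invitee | contact | event
  start_date: "2018-08-01T00:00:00Z"
  duration: 31                  # length of the fetch window, in days
  incremental: true             # shift the window forward on each scheduled run
out:
  mode: append
```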
Execute Load Job
You must specify the database and table where the data is stored.
It is recommended to specify the --time-column option because Treasure Data's storage is partitioned by time. If the option is not given, the data connector selects the first timestamp column for partitioning. The column specified by --time-column must be of timestamp type (use the preview results to check the available column names and types; most data types have a last_modified_date column).
If your data doesn't have a time column, you can add one by using the add_time filter option. See the add_time filter plugin for details.
Submit the load job. It may take a couple of hours depending on the data size. The td connector:issue command assumes that the database (sample_db) and table (sample_table) already exist; if they do not, the command fails. Either create them manually, or pass the --auto-create-table option to td connector:issue to create them automatically:
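For example (the configuration file name `load.yml` is illustrative):

```shell
# Issue the load job; --time-column sets the partitioning column
td connector:issue load.yml \
  --database sample_db \
  --table sample_table \
  --time-column last_modified_date \
  --auto-create-table   # create the database and table if they do not exist
```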
You can schedule the data connector to run periodically to import from Cvent on a regular basis. With this feature, you no longer need a cron daemon in your local data center.
Create the Schedule
A new schedule can be created by using the td connector:create command. The name of the schedule, the cron-style schedule, the database and table where the data will be stored, and the Data Connector configuration file are required.
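For example (the schedule name and configuration file name are illustrative):

```shell
# Run every day at 00:10 UTC
td connector:create daily_cvent_import \
  "10 0 * * *" \
  sample_db sample_table load.yml
```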
The `cron` parameter also accepts three special options: `@hourly`, `@daily`, and `@monthly`. For more detail, see Scheduled Jobs.
By default, the schedule is set up in the UTC time zone. You can set the schedule in a different time zone using the --timezone option, which supports only extended time zone names such as 'Asia/Tokyo' or 'America/Los_Angeles'. Time zone abbreviations like PST and CST are not supported and may lead to unexpected schedules.
List the Schedules
You can see the list of currently scheduled entries with the td connector:list command:
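For example:

```shell
td connector:list
```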
Show the Setting and History of Schedules
td connector:show shows the execution settings of a schedule entry. td connector:history shows the execution history of a schedule entry. To investigate the results of each individual execution, use td job <jobid>.
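For example (the schedule name `daily_cvent_import` is illustrative; replace `<jobid>` with a job ID from the history output):

```shell
td connector:show daily_cvent_import     # execution settings
td connector:history daily_cvent_import  # past executions
td job <jobid>                           # details of one execution
```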
Delete the Schedule
td connector:delete removes the schedule.
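For example (schedule name illustrative):

```shell
td connector:delete daily_cvent_import
```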
The Cvent Connector has the same upstream limit as the Cvent API. This means:
Contacts, Invitees, and Events are only retrievable if fewer than 25,000 records were modified in the same second.
Registrations are only retrievable if fewer than 25,000 records were modified in the same second.
There is a cap of 10,000 requests per day, which roughly equals 2 million records per day.
Appendix A: How Column Names are Mapped
All columns are in snake case (for example, first_name, last_name).
Due to an upstream issue in the Cvent API, "RSVP By Date" is mapped to an unusual column name: "rsv_pby_date".
Custom field names are mapped to snake case by the following steps:
Replace all non-alphanumeric characters with an underscore "_"
Remove all leading and trailing underscores
If the first character is a digit, prefix the name with "col_"
Collapse consecutive underscores into a single underscore "_"
If the name is empty after the preceding steps, name it "custom_field" (this column field name is different from the column name in TD Console)
Lowercase all of the characters
For example, "Hello @ World" is mapped to "hello_world", and "" (empty) is mapped to "custom_field".
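The mapping steps above can be sketched in Python as follows (illustrative only, not the connector's actual implementation):

```python
import re

def map_custom_field(name: str) -> str:
    """Map a Cvent custom field name to a snake_case column name."""
    s = re.sub(r"[^0-9A-Za-z]", "_", name)  # replace non-alphanumeric with "_"
    s = s.strip("_")                         # drop leading/trailing underscores
    if s and s[0].isdigit():                 # digit-leading names get a prefix
        s = "col_" + s
    s = re.sub(r"_+", "_", s)                # collapse consecutive underscores
    if not s:                                # empty names get a default
        s = "custom_field"
    return s.lower()                         # lowercase everything

print(map_custom_field("Hello @ World"))  # hello_world
print(map_custom_field(""))               # custom_field
```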
If there are naming conflicts, the conflicted custom field name is appended with the ID of the field. For example, if there is a custom field name "First Name" on Contact type (which already has a predefined field with that name), the custom field name is mapped to something like "first_name_A3E3_ERQNIHOIU_324AE".
Appendix B: Event's Time Zone
Unlike other datetime fields, the preceding four event-related datetime fields are imported as text and can be recognized in the database by a slightly different format (for example, a Start Date is imported as "2018-10-09T17:59:00", whereas a default datetime value would appear as "2018-10-09 17:59:00.000"). Event-related datetime fields are relative to their own event's time zone, so they are not treated as absolute time references like other fields.
The event-related datetime fields correspond to the following columns after import: