Use the Treasure Data Customer Data Platform to ingest your event data from Cvent, a popular event management platform.

Prerequisites

  • Basic knowledge of TD Console and TD Toolbelt

  • Basic knowledge of Cvent

Using TD Console

Create a New Connection

  1. Open the TD Console.

  2. Navigate to the Integrations Hub > Catalog.

  3. Click the search icon on the far-right of the Catalog screen, and enter cvent.
  4. Hover over the cvent connector and select Create Authentication.
  5. Edit the required credentials.

    The API User Name and API Password are different from your Cvent application username and password. Contact your Cvent admin to generate an API User Name and API Password.

    Select Sandbox if you are using a test API account against the Cvent Sandbox.

  6. Click Continue.

  7. Name the connection.

  8. Click Done.

Create a New Source

From the list of authenticated connections, select New Source.

Select the data type and the fetch time range for this transfer:

Data Type: The supported types are Registration, Invitee, Contact, and Event.

Start Date: The starting point of the data time window. For example, with a Start Date of 2018-08-01 00:00:00 UTC, all Registrations modified from that time onward are fetched.

Duration: The length of the time window. Continuing the example, a one-month duration fetches data modified from 2018-08-01 00:00:00 UTC to 2018-09-01 00:00:00 UTC.

Incremental: When running on schedule, the time window of the fetched data automatically shifts forward on each run. For example, if the initial config is January 1, with 10 days in duration, the first run fetches data that was modified from January 1 to January 10, the second run fetches data that was modified from January 11 to January 20, and so on.
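The window-shifting behavior described above can be sketched as follows (illustrative Python only, not the connector's actual code; the year 2024 is arbitrary and the inclusive-day semantics match the January example):

```python
from datetime import date, timedelta

# Sketch of incremental window shifting: each scheduled run advances
# the fetch window by the configured duration.
start = date(2024, 1, 1)        # initial Start Date (January 1)
duration = timedelta(days=10)   # Duration: 10 days

windows = []
for run in range(3):
    window_start = start + run * duration
    # The window covers 10 calendar days, so its last day is start + 9.
    window_end = window_start + duration - timedelta(days=1)
    windows.append((window_start.isoformat(), window_end.isoformat()))

print(windows)
# First run covers January 1-10, second January 11-20, and so on.
```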


The preview shows the actual data for the specified transfer configuration. Columns are sorted alphabetically, except custom field columns (if any), which are placed at the end. This order also applies to the final results in the target database.

Choose the Target Database and Table

Choose an existing database and table, or create new ones.

The connector inserts the literal time values received from the Cvent API. These values are relative to the implicit time zone of the Cvent server and are assumed to be UTC. In addition, the Event data type's time-related fields are relative to the event's own time zone. As a result, the Data Storage Timezone does not indicate the actual time zone of the data.


Set a schedule if you want one. The import starts at the scheduled time, or immediately if you choose Once now.

Select START TRANSFER and then check for a running job on the Jobs page.

Using the Command Line

Install the Prerequisites

Install the latest td tool via Ruby gem:

$ gem install td
$ td --version

There are other install methods. For more information, check out Treasure Data Toolbelt.

Create the Config File (config.yml)

Create a configuration file:

  in:
    type: cvent
    target: registration
    account_name: xxxxxxxxxxxxxxxx
    api_username: xxxxxxxxxxxxxxxx
    api_password: xxxxxxxxxxxxxxxx
    start_time: 2017-08-01
    fetch_days: 31
    parallel: true
  out:
    mode: append

Preview how the data will be loaded:

$ td connector:preview config.yml
  | archive_date:timestamp    | capacity:long | category:string |...| currency:string | event_code:string | event_description:string        |
  | "2018-12-29 21:59:00 UTC" | -1            | "Conference"    |...| "U.S. Dollar"   | "KGNH2JXF45K"     | ""                              |
  | "2019-01-26 21:59:00 UTC" | 500           | "Conference"    |...| "U.S. Dollar"   | "W4NKF4YWY4W"     | "Devcon for Cvent TD Engineers" |

Execute Load Job

You must specify the database and table where the data is stored.

It is recommended to specify the --time-column option because Treasure Data's storage is partitioned by time. If the option is not given, the data connector selects the first long or timestamp column as the partitioning time. The column specified by --time-column must be of long or timestamp type. (Use the preview results to check the available column names and types; most data types have a last_modified_date column.)
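For example, assuming a last_modified_date column appears in your preview (as it does for most data types), the option can be passed like this:

```
$ td connector:issue config.yml \
  --database sample_db \
  --table sample_table \
  --time-column last_modified_date
```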

If your data doesn't have a time column, you can add one by using the add_time filter option. See the add_time filter plugin for details.
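As a sketch, an add_time filter section in config.yml might look like the following (the option names follow the add_time plugin's documentation; verify against the plugin reference before use):

```
  filters:
    - type: add_time
      to_column:
        name: time
        type: timestamp
      from_value:
        mode: upload_time
```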

Submit the load job. It may take a couple of hours depending on the data size.

$ td connector:issue config.yml \
  --database sample_db \
  --table sample_table

The connector:issue command assumes that you have already created the database (sample_db) and table (sample_table). If the database or table does not exist in TD, the command fails. Create the database and table manually, or pass the --auto-create-table option to td connector:issue to create them automatically:

$ td connector:issue config.yml \
  --database sample_db \
  --table sample_table \
  --auto-create-table

Scheduled Execution

You can schedule periodic data connector execution for recurring Cvent imports. With this feature, you no longer need a cron daemon in your local data center.

Create the Schedule

Create a new schedule with the td connector:create command. The schedule name, a cron-style schedule, the database and table where the data will be stored, and the Data Connector configuration file are required.

$ td connector:create \
  daily_cvent_import \
  "10 0 * * *" \
  sample_db \
  sample_table \
  config.yml

The `cron` parameter also accepts three special options: `@hourly`, `@daily`, and `@monthly`. For more details, see Scheduled Jobs.
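For example, using `@daily` in place of an explicit cron expression:

```
$ td connector:create \
  daily_cvent_import \
  "@daily" \
  sample_db \
  sample_table \
  config.yml
```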

By default, the schedule is set up in the UTC time zone. You can set a different time zone with the -t or --timezone option. The --timezone option supports only extended time zone names such as Asia/Tokyo and America/Los_Angeles. Time zone abbreviations such as PST and CST are not supported and may lead to unexpected schedules.

List the Schedules

You can see the list of currently scheduled entries with td connector:list.

$ td connector:list
  | Name               | Cron       | Timezone | Delay | Database        | Table        |
  | daily_cvent_import | 10 0 * * * | UTC      | 0     | sample_db       | sample_table |

Show the Setting and History of Schedules

td connector:show shows the execution setting of a schedule entry.

$ td connector:show daily_cvent_import
  Name     : daily_cvent_import
  Cron     : 10 0 * * *
  Timezone : UTC
  Delay    : 0
  Database : sample_db
  Table    : sample_table

td connector:history shows the execution history of a schedule entry. To investigate the results of each individual execution, use td job <jobid>.

$ td connector:history daily_cvent_import
  | JobID  | Status  | Records | Database  | Table        | Priority | Started                   | Duration |
  | 578066 | success | 20000   | sample_db | sample_table | 0        | 2015-04-18 00:10:05 +0000 | 160      |
  | 577968 | success | 20000   | sample_db | sample_table | 0        | 2015-04-17 00:10:07 +0000 | 161      |
  | 577914 | success | 20000   | sample_db | sample_table | 0        | 2015-04-16 00:10:03 +0000 | 152      |
  | 577872 | success | 20000   | sample_db | sample_table | 0        | 2015-04-15 00:10:04 +0000 | 163      |
  | 577810 | success | 20000   | sample_db | sample_table | 0        | 2015-04-14 00:10:04 +0000 | 164      |
  | 577766 | success | 20000   | sample_db | sample_table | 0        | 2015-04-13 00:10:04 +0000 | 155      |
  | 577710 | success | 20000   | sample_db | sample_table | 0        | 2015-04-12 00:10:05 +0000 | 156      |
  | 577610 | success | 20000   | sample_db | sample_table | 0        | 2015-04-11 00:10:04 +0000 | 157      |

Delete the Schedule

td connector:delete removes the schedule.

$ td connector:delete daily_cvent_import


Limitations

The Cvent Connector has the same upstream limits as the Cvent API. This means:

  • Contacts, Invitees, Events, and Registrations are only retrievable if fewer than 25,000 records were modified within the same second.

  • There is a cap of 10,000 requests per day, which roughly equals 2 million records per day.

Appendix A: How Column Names are Mapped

All columns are in snake case (for example, first_name, last_name).

Due to an upstream issue in the Cvent API, "RSVP By Date" is mapped to an unusual column name: "rsv_pby_date".

Custom field names are mapped to snake case by the following steps:

  1. Replace all non-alphanumeric characters with underscores ("_").

  2. Remove all leading and trailing underscores.

  3. If the first character is a digit, prefix the name with "col_".

  4. Collapse consecutive underscores ("__") into a single underscore.

  5. If the name is empty after the preceding steps, name it "custom_field". (This field name can differ from the column name shown in TD Console.)

  6. Lowercase all characters.

For example, "Hello @ World" is mapped to "hello_world", and "" (empty) is mapped to "custom_field".
If there are naming conflicts, the conflicting custom field name is appended with the ID of the field. For example, a custom field named "First Name" on the Contact type (which already has a predefined field with that name) is mapped to something like "first_name_A3E3_ERQNIHOIU_324AE".
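The mapping steps above can be sketched in Python (an illustrative reimplementation, not the connector's actual code; it does not handle the conflict-suffix case):

```python
import re

def map_custom_field(name: str) -> str:
    """Map a Cvent custom field name to a snake_case column name."""
    # 1. Replace all non-alphanumeric characters with underscores.
    s = re.sub(r"[^0-9A-Za-z]", "_", name)
    # 2. Remove leading and trailing underscores.
    s = s.strip("_")
    # 3. Prefix with "col_" if the first character is a digit.
    if s and s[0].isdigit():
        s = "col_" + s
    # 4. Collapse consecutive underscores into one.
    s = re.sub(r"_+", "_", s)
    # 5. Fall back to "custom_field" if nothing remains.
    if not s:
        s = "custom_field"
    # 6. Lowercase.
    return s.lower()

print(map_custom_field("Hello @ World"))  # hello_world
print(map_custom_field(""))               # custom_field
```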

Appendix B: Event's Time Zone

Unlike other datetime fields, event-related datetime fields are imported as text and can be recognized in the database by a slightly different format (for example, a Start Date is imported as "2018-10-09T17:59:00", whereas a default datetime value would be "2018-10-09 17:59:00.000"). Event-related datetime fields are relative to their own event's time zone, so they are not treated as absolute time references like other fields.
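Because these values carry no zone designator, they parse as naive datetimes. A short Python sketch:

```python
from datetime import datetime

# Event datetime fields arrive as zone-less text, e.g. "2018-10-09T17:59:00".
raw = "2018-10-09T17:59:00"
parsed = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S")

# The parsed value is naive: interpret it in the event's own time zone,
# not as UTC like the other imported time fields.
print(parsed.tzinfo is None)  # True
```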

After import, the event-related datetime fields correspond to dedicated columns for each target type.
