This Treasure Data integration empowers digital sales organizations with modern remote collaboration capabilities for exceptional teamwork and frictionless engagement:

  • Find and build stronger relationships.
  • Improve productivity and performance.
  • Get a single view of customers.

You can use the import integration to ingest contact data and transactional data (including quotes and sales orders) from MS Dynamics 365 into Treasure Data.


Prerequisites

  • Basic knowledge of Treasure Data.

  • Client Credentials authentication: Administrator privileges to access Azure Active Directory and Dynamics CRM security settings.

  • OAuth authentication: A tenant administrator, or a user who has access to Azure "Enterprise applications" to grant consent. 

About Incremental Data Loading

  • When incremental loading is enabled, the query issued to the Dynamics API contains the $filter and $orderby statements. $filter restricts the data to the desired criteria, and $orderby sorts the data in descending order.
  • The first value received is kept as a reference for the next job's filter. The next job's $filter excludes the previous job's data and fetches new data only.
  • The process repeats for subsequent executions.

  • When incremental loading is enabled, leave End Time blank (by default it is set to the current time).
  • The filter column (modifiedon by default) must not contain null or empty values.


The following is an example when incremental loading is enabled:

  • Assumptions/conditions:
    • Start Time = 2021-01-01T00:03:01Z
    • Job scheduled to run daily
  • 1st job, current time = 2021-01-15T00:03:01Z: $filter=modifiedon > 2021-01-01T00:03:01Z and modifiedon <= 2021-01-15T00:03:01Z, $orderby=modifiedon desc. The first record returned has modifiedon = 2021-01-10T00:00:00Z.
  • 2nd job, current time = 2021-01-16T00:03:01Z: $filter=modifiedon > 2021-01-10T00:00:00Z and modifiedon <= 2021-01-16T00:03:01Z, $orderby=modifiedon desc. The first record returned has modifiedon = 2021-01-16T00:03:01Z.
  • 3rd job, current time = 2021-01-17T00:03:01Z: $filter=modifiedon > 2021-01-16T00:03:01Z and modifiedon <= 2021-01-17T00:03:01Z, $orderby=modifiedon desc. The first record returned has modifiedon = 2021-01-17T00:00:01Z.
  • ...
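The filter bookkeeping above can be sketched as a short simulation (a minimal, hypothetical illustration; next_filter is not part of the connector, and the connector's internals may differ):

```python
# Minimal sketch of the incremental-loading bookkeeping described above.
# next_filter() is a hypothetical helper, not a connector API.
def next_filter(last_seen: str, current_time: str, column: str = "modifiedon") -> str:
    # The lower bound is exclusive (>), the upper bound inclusive (<=).
    return f"{column} > {last_seen} and {column} <= {current_time}"

# 1st job: the lower bound is the configured Start Time.
f1 = next_filter("2021-01-01T00:03:01Z", "2021-01-15T00:03:01Z")

# The newest modifiedon value the 1st job received (2021-01-10T00:00:00Z)
# becomes the lower bound of the 2nd job's filter.
f2 = next_filter("2021-01-10T00:00:00Z", "2021-01-16T00:03:01Z")
```

Each run therefore re-fetches nothing from the previous run: the stored reference value excludes already-imported rows.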

Obtaining Client ID and Client Secret

These values are necessary to connect using the Client Credentials authentication option. They are optional if you expect to use the OAuth option to authenticate.

Follow the Microsoft documentation to create your own client application and obtain your client ID and client secret:

It is recommended that you create a custom security role with minimal permissions for your registered application. See:

Entity Type

You can fetch all available entities by importing with the following settings:

  • Entity Type = entity
  • Filter Column = overwritetime
  • Start Time = 1800-06-21T00:00:00Z

Use the TD Console to Create Your Connection

Create a New Connection

In Treasure Data, you must create and configure the data connection prior to running your query. As part of the data connection, you provide authentication to access the integration.

1. Open TD Console.
2. Navigate to Integrations Hub > Catalog.
3. Search for and select Microsoft Dynamics 365 Sales.

4. Select Create Authentication.


5. Type your MS Dynamics domain name. 
6. Choose one of the following authentication methods:
  • OAuth:
    1. Select OAuth.
    2. Type the credentials to authenticate.
    3. Optionally, select Click here and log in to Microsoft Dynamics 365 to grant consent.
      1. Return to Integrations Hub > Catalog.
      2. Search for and select Microsoft Dynamics 365 Sales.
      3. Type the value of your Domain.
      4. Select the OAuth Authentication Method.
      5. Select your newly created OAuth connection.
      6. Review the OAuth connection field definition.
  • Client Credentials:
    1. Select Client Credentials.
    2. Type the value of your Domain.
    3. Type the value of your Tenant ID.
    4. Type the value of your Client ID.
    5. Type your Client Secret.
7. Select Continue.
8. Enter a name for your connection.
9. Select Continue.

Transfer Your Data to Treasure Data

After creating the authenticated connection, you are automatically taken to Authentications.

1. Search for the connection you created. 
2. Select New Source.
3. Type a name for your Source in the Data Transfer field.

4. Select Next.

The Source Table dialog opens.

5. Edit the following parameters:
  • Entity Type: The entity logical name, e.g., contact, sales_order, account. Use Entity Type = entity to fetch all available entities.
  • Filter Column: The column used to filter data (date-time columns only).
  • Start Time: In UTC format YYYY-MM-DDThh:mm:ssZ. Imports data modified after this timestamp. Start Time is exclusive, meaning rows whose filter column equals this value are not imported; to include them, set the time one second earlier (End Time, by contrast, is inclusive).
  • End Time: Optional. In UTC format YYYY-MM-DDThh:mm:ssZ. Imports data modified up to and including this timestamp. If not specified, the current time is used. It is recommended to leave this field blank when incremental loading is enabled.
  • Incremental Loading?: If enabled, only new data since the last ingestion is imported.
  • Skip Invalid Data?: When a column's data cannot be converted to a known value, the row is skipped. If more than 30% of processed rows are invalid, the job stops with a status of failed.
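Because Start Time is exclusive, including rows that carry exactly the desired timestamp means shifting it back by one second. A minimal sketch of that adjustment (inclusive_start is an illustrative helper, assuming the documented UTC format):

```python
from datetime import datetime, timedelta, timezone

# Shift a UTC timestamp (YYYY-MM-DDThh:mm:ssZ) back by one second so
# that rows equal to the original value fall inside the exclusive Start Time.
def inclusive_start(ts: str) -> str:
    dt = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
    return (dt - timedelta(seconds=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
```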

6. Select Next.

The Data Settings page can be modified for your needs or you can skip the page.

7. Optionally, edit the following parameters:

  • Retry Limit: Maximum number of retries for each API call.
  • Initial retry time wait in millis: Wait time before the first retry (in milliseconds).
  • Max retry wait in millis: Maximum wait time between retries before the API call gives up.
  • HTTP Connect Timeout: The amount of time before the connection times out when making API calls.
  • HTTP Read Timeout: The amount of time to wait for data when reading the API response.
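The retry settings above typically interact as an exponential backoff bounded by the maximum wait. The sketch below assumes the wait doubles after each failed attempt; the connector's exact policy is not specified here:

```python
# Sketch of exponential backoff bounded by the retry settings above.
# Doubling the wait after each failure is an assumption for illustration,
# not documented connector behavior.
def wait_times(retry_limit: int, initial_ms: int, max_ms: int) -> list[int]:
    waits, wait = [], initial_ms
    for _ in range(retry_limit):
        waits.append(min(wait, max_ms))
        wait *= 2  # double the wait after each failed attempt
    return waits
```

For example, a Retry Limit of 5 with an initial wait of 500 ms and a maximum wait of 4000 ms would produce waits of 500, 1000, 2000, 4000, and 4000 ms under this assumed policy.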
Column Settings

You can remove a column from the result or define its data type.

Do not change a column's name; renaming a column results in null values for that column.

8. Select Next.

Data Preview 

You can see a preview of your data before running the import by selecting Generate Preview.

Data shown in the data preview is approximated from your source. It is not the actual data that is imported.

  1. Data preview is optional; you can safely select Next to skip to the next page of the dialog.

  2. To preview your data, select Generate Preview.

  3. Verify that the data looks approximately like you expect it to.

  4. Select Next.

Data Placement

For data placement, select the target database and table where you want your data placed and indicate how often the import should run.

  1. Under Storage, create a new database or select an existing one, and create a new table or select an existing one, for where you want to place the imported data.

  2. Under Select a Database, select an existing database or Create New Database.

  3. Optionally, type a database name.

  4. Under Select a Table, select an existing table or Create New Table.

  5. Optionally, type a table name.

  6. Choose the method for importing the data.

    • Append (default): Data import results are appended to the table.
      If the table does not exist, it is created.

    • Always Replace: Replaces the entire content of an existing table with the result output of the query. If the table does not exist, a new table is created.

    • Replace on New Data: Replaces the entire content of an existing table with the result output only when there is new data.

  7. Select the Timestamp-based Partition Key column.
    If you want to set a partition key seed other than the default, you can specify a long or timestamp column as the partitioning time. By default, the upload_time column generated with the add_time filter is used as the time column.

  8. Select the Timezone for your data storage.

  9. Under Schedule, you can choose when and how often you want to run this query.

    • Run once:
      1. Select Off.

      2. Select Scheduling Timezone.

      3. Select Create & Run Now.

    • Repeat the query:

      1. Select On.

      2. Select the Schedule. The UI provides four options: @hourly, @daily, @monthly, or custom cron.

      3. You can also select Delay Transfer and add a delay of execution time.

      4. Select Scheduling Timezone.

      5. Select Create & Run Now.

 After your transfer has run, you can see the results of your transfer in Data Workbench > Databases.

Import from MS Dynamics 365 Sales via Workflow

You can import data from MS Dynamics 365 Sales by using the td_load>: operator in a workflow. If you have already created a source, you can run it from a workflow as follows:

1. Identify your source.
2. To obtain a unique ID, open the Source list and then filter by MS Dynamics 365 Sales.
3. Open the menu and select Copy Unique ID.

4. Define a workflow task using the td_load> operator. For example:
  +load:
    td_load>: unique_id_of_your_source
    database: ${td.dest_db}
    table: ${td.dest_table}
5. Run the workflow.