Microsoft Dynamics 365 Sales Import Integration Using CLI

You can also use the MS Dynamics 365 Sales data connector from the command-line interface (CLI). The following instructions show you how to import data using the CLI.

Install the ‘td’ command v0.11.9 or later

Install the most current Treasure Data Toolbelt.
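If you have Ruby installed, one way to get the Toolbelt is via RubyGems (platform-specific installers are also available from Treasure Data):

$ gem install td

Then verify the installed version: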

$ td --version
0.11.10

Create Seed Config File (seed.yml)

Follow the steps here to set up access to your Dynamics 365 CRM:

  1. Create your application and get your client_id and client_secret: https://docs.microsoft.com/en-us/powerapps/developer/data-platform/use-single-tenant-server-server-authentication
  2. Create a custom security role for your application user. We recommend a custom role with only minimal permissions (read permission for your desired object). See https://docs.microsoft.com/en-us/previous-versions/dynamicscrm-2016/administering-dynamics-365/dn531130(v=crm.8)#create-a-security-role and https://docs.microsoft.com/en-us/previous-versions/dynamicscrm-2016/administering-dynamics-365/dn531090(v=crm.8)#security-roles
  3. Create an application user in your Dynamics 365 instance. See https://docs.microsoft.com/en-us/power-platform/admin/create-users-assign-online-security-roles#create-an-application-user
  4. Assign the custom security role to the application user
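Before wiring these credentials into the connector, you can optionally sanity-check them with a direct token request to Azure AD. This sketch assumes the standard OAuth 2.0 client-credentials flow; replace TENANT_ID, CLIENT_ID, CLIENT_SECRET, and YOUR_DYNAMICS_365_DOMAIN (the scope requires the https:// prefix) with your own values:

$ curl -s -X POST \
  "https://login.microsoftonline.com/TENANT_ID/oauth2/v2.0/token" \
  -d "grant_type=client_credentials" \
  -d "client_id=CLIENT_ID" \
  -d "client_secret=CLIENT_SECRET" \
  -d "scope=https://YOUR_DYNAMICS_365_DOMAIN/.default"

A successful response contains an access_token field; an error here usually points to a problem with the application registration or secret rather than the connector configuration.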

Prepare seed.yml as shown in the following example, with your Microsoft Dynamics 365 details.

in:
  type: ms_dynamics
  domain: YOUR_DYNAMICS_365_DOMAIN
  tenant_id: TENANT_ID
  auth_method: client_credentials
  client_id: CLIENT_ID
  client_secret: CLIENT_SECRET
  entity_type: contact
  filter_from: FROM_DATE
  incremental: true
out:
  mode: append
exec: {}

For more details on the available out modes, see the Appendix.

Supported config parameters

| Parameter | Value | Required | Description |
| --- | --- | --- | --- |
| type | string | yes | ms_dynamics |
| domain | string | yes | Dynamics 365 CRM domain, with or without the https:// prefix |
| tenant_id | string | yes | Microsoft account tenant ID |
| auth_method | string | yes | client_credentials |
| client_id | string | yes | Application client ID |
| client_secret | string | yes | Application client secret |
| entity_type | string | yes | Entity type to import, e.g. contact, account, quote |
| filter_from | string | yes | Filter data from this value |
| filter_to | string | no | Filter data to this value. If not set, the execution time is used as the default |
| filter_column | string | no | Column to filter data on. Default: modifiedon |
| incremental | boolean | no | Import only new data since the last run |
| skip_invalid | boolean | no | Skip a record when a column contains an invalid data format |
| entity_set | string | no | Entity set name, usually the plural form of entity_type, e.g. contacts, accounts, quotes. If not set, the connector guesses entity_set from the API metadata |
| columns | array | no | Columns and their data types to import. If not set, the connector guesses the columns and data types from the API metadata |
| retry_limit | int | no | Maximum number of retries for each API call |
| retry_initial_wait_millis | int | no | Wait time before the first retry (in milliseconds) |
| max_retry_wait_millis | int | no | Maximum wait time for an API call before it is given up |
| connection_timeout_millis | int | no | Amount of time before the connection times out when making API calls |
| read_timeout_millis | int | no | Amount of time to wait when reading data from a response |
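For reference, the following sketch combines several of the optional parameters above in a single seed.yml; the entity, date values, and retry settings are illustrative only:

in:
  type: ms_dynamics
  domain: YOUR_DYNAMICS_365_DOMAIN
  tenant_id: TENANT_ID
  auth_method: client_credentials
  client_id: CLIENT_ID
  client_secret: CLIENT_SECRET
  entity_type: account
  entity_set: accounts
  filter_column: modifiedon
  filter_from: "2023-01-01T00:00:00Z"
  filter_to: "2023-06-30T00:00:00Z"
  incremental: false
  skip_invalid: true
  retry_limit: 5
  retry_initial_wait_millis: 1000
  max_retry_wait_millis: 60000
out:
  mode: append
exec: {}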

Guess Fields (Generate load.yml)

Use connector:guess. This command reads the seed.yml created above and outputs a new file, load.yml.

$ td connector:guess seed.yml -o load.yml

If you open load.yml, you will see the guessed entity_set and columns.

---
in:
  type: ms_dynamics
  domain: <YOUR_DYNAMICS_365_DOMAIN>
  tenant_id: <TENANT_ID>
  auth_method: client_credentials
  client_id: <CLIENT_ID>
  client_secret: <CLIENT_SECRET>
  entity_type: contact
  filter_from: <FROM_DATE>
  custom_expression_filter: your custom expression filter
  incremental: true
  entity_set: contacts
  columns:
  - {name: exchangerate, type: double}
  - {name: anniversary, type: string}
  - {name: lastname, type: string}
  - {name: employeeid, type: string}
  - {name: firstname, type: string}
  - {name: spousesname, type: string}
  - {name: customersizecode, type: long}
  - {name: fullname, type: string}

out: {mode: append}
exec: {}
filters:
- from_value: {mode: upload_time}
  to_column: {name: time}
  type: add_time

Then, you can preview the result by using the preview command.

td connector:preview load.yml

If the system guesses a column type incorrectly, or you want to remove a column from the result, modify load.yml directly and preview again.

Currently, the Data Connector supports parsing of “boolean”, “long”, “double”, “string”, and “timestamp” types.
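For example, to parse anniversary as a timestamp instead of a string and drop the spousesname column, you could edit the columns section of load.yml as follows (a sketch; whether the values actually parse as timestamps depends on the source data format):

  columns:
  - {name: exchangerate, type: double}
  - {name: anniversary, type: timestamp}
  - {name: lastname, type: string}
  - {name: employeeid, type: string}
  - {name: firstname, type: string}
  - {name: customersizecode, type: long}
  - {name: fullname, type: string}

Then run td connector:preview load.yml again to confirm the result.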

Before issuing the load job, create the destination database and table in Treasure Data:

td database:create td_sample_db
td table:create td_sample_db td_sample_table

Execute Load Job

Submit the load job. It may take a couple of hours depending on the size of the data. Specify the Treasure Data database and table where the data should be stored.

It’s also recommended to specify the --time-column option, because Treasure Data’s storage is partitioned by time (see data partitioning). If the option is not provided, the data connector chooses the first long or timestamp column as the partitioning time. The column specified by --time-column must be of long or timestamp type.

If your data doesn’t have a time column you can add a time column by using the add_time filter option. For more details, see the add_time filter plugin.
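The guessed load.yml above already includes an add_time filter that sets time from the upload time (from_value with mode: upload_time). As a sketch of the alternative, the filter can instead derive time from an existing column; createdon here is an illustrative Dynamics column, and the exact options are documented in the add_time filter plugin:

filters:
- from_column: {name: createdon}
  to_column: {name: time, type: timestamp}
  type: add_time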

$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column created_at

The connector:issue command assumes that you have already created the database (td_sample_db) and table (td_sample_table). If the database or table does not exist in TD, the connector:issue command fails. Either create them manually as shown above, or pass the --auto-create-table option to td connector:issue to create them automatically:

td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column created_at --auto-create-table

If you have a field called time, you don't have to specify the --time-column option.

td connector:issue load.yml --database td_sample_db --table td_sample_table

Scheduled execution

You can schedule periodic Data Connector executions to incrementally import new data. Treasure Data configures its scheduler carefully to ensure high availability. By using this feature, you no longer need a cron daemon in your local data center.

For more details about incremental loading, see About Incremental Loading.

Create the schedule

A new schedule can be created using the td connector:create command. The following are required:

  • name of the schedule
  • cron-style schedule
  • database and table where the data will be stored
  • Data Connector configuration file

td connector:create \
daily_import \
"10 0 * * *" \
td_sample_db \
td_sample_table \
load.yml

It’s also recommended to specify the --time-column option because Treasure Data’s storage is partitioned by time (see data partitioning).

td connector:create \
daily_import \
"10 0 * * *" \
td_sample_db \
td_sample_table \
load.yml \
--time-column created_at

The cron parameter also accepts three special options: @hourly, @daily, and @monthly.

By default, the schedule is set in the UTC timezone. You can set the schedule in a different timezone using the -t or --timezone option. The --timezone option supports only extended timezone formats like 'Asia/Tokyo', 'America/Los_Angeles', etc. Timezone abbreviations like PST and CST are *not* supported and may lead to unexpected schedules.
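For example, the following combines the @daily shortcut with the --time-column and -t options:

td connector:create \
daily_import \
"@daily" \
td_sample_db \
td_sample_table \
load.yml \
--time-column created_at \
-t Asia/Tokyo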

List the Schedules

You can see the list of currently scheduled entries by running the command td connector:list.

td connector:list

Show the Setting and Schedule History

td connector:show shows the execution settings of a schedule entry.

td connector:show daily_import

td connector:history shows the execution history of a schedule entry. To investigate the results of each individual run, use td job:show <jobid>.

td connector:history daily_import

Delete the Schedule

td connector:delete will remove the schedule.

$ td connector:delete daily_import

Appendix

Modes for out plugin

You can specify the import mode in the out section of your seed.yml file.

append (default)

This is the default mode and records are appended to the target table.

in:
  ...
out:
  mode: append

replace (in td 0.11.10 and later)

This mode replaces data in the target table. Note that any manual schema changes made to the target table will remain intact with this mode.

in:
  ...
out:
  mode: replace