# Microsoft Dynamics 365 Sales Import Integration Using CLI

You can also use the MS Dynamics 365 Sales data connector from the command line interface. The following instructions show you how to import data using the CLI.

## Install the ‘td’ command v0.11.9 or later

Install the most current [Treasure Data Toolbelt](https://toolbelt.treasuredata.com/).

```
$ td --version
0.11.10
```

## Create Seed Config File (seed.yml)

Follow these steps to set up access to your Dynamics 365 CRM:

1. Create your application and get your client_id and client_secret. See [https://docs.microsoft.com/en-us/powerapps/developer/data-platform/use-single-tenant-server-server-authentication](https://docs.microsoft.com/en-us/powerapps/developer/data-platform/use-single-tenant-server-server-authentication)
2. Create a custom security role for your application user. We recommend creating a custom security role with minimal permissions (read permission for your desired object only). See [https://docs.microsoft.com/en-us/previous-versions/dynamicscrm-2016/administering-dynamics-365/dn531130(v=crm.8)#create-a-security-role](https://docs.microsoft.com/en-us/previous-versions/dynamicscrm-2016/administering-dynamics-365/dn531130(v=crm.8)#create-a-security-role) and [https://docs.microsoft.com/en-us/previous-versions/dynamicscrm-2016/administering-dynamics-365/dn531090(v=crm.8)#security-roles](https://docs.microsoft.com/en-us/previous-versions/dynamicscrm-2016/administering-dynamics-365/dn531090(v=crm.8)#security-roles)
3. Create an application user in your Dynamics 365 instance. See [https://docs.microsoft.com/en-us/power-platform/admin/create-users-assign-online-security-roles#create-an-application-user](https://docs.microsoft.com/en-us/power-platform/admin/create-users-assign-online-security-roles#create-an-application-user)
4. Assign the custom role to the application user.

Prepare seed.yml as shown in the following example, with your Microsoft Dynamics 365 details.
```yaml
in:
  type: ms_dynamics
  domain: YOUR_DYNAMICS_365_DOMAIN
  tenant_id: TENANT_ID
  auth_method: client_credentials
  client_id: CLIENT_ID
  client_secret: CLIENT_SECRET
  entity_type: contact
  filter_from: FROM_DATE
  incremental: true
out:
  mode: append
exec: {}
```

For more details on available out modes, see [Appendix](/int/microsoft-dynamics-365-sales-import-integration-using-cli#h1__1835053169).

### Supported config parameters

| **Parameter** | **Value** | **Required** | **Description** |
| --- | --- | --- | --- |
| type | string | yes | ms_dynamics |
| domain | string | yes | Dynamics 365 CRM domain, with or without the https:// prefix |
| tenant_id | string | yes | Microsoft account tenant ID |
| auth_method | string | yes | client_credentials |
| client_id | string | yes | Application client ID |
| client_secret | string | yes | Application client secret |
| entity_type | string | yes | Entity type to import, e.g. contact, account, quote |
| filter_from | string | yes | Filter data from this value |
| filter_to | string | no | Filter data to this value. If not set, the execution time is used as the default |
| filter_column | string | no | Column to filter data on. Default: **modifiedon** |
| incremental | boolean | no | Import only new data since the last run |
| skip_invalid | boolean | no | Skip a record when a column contains an invalid data format |
| entity_set | string | no | Entity set name, usually the plural form of entity_type, e.g. contacts, accounts, quotes. If not set, the connector guesses the entity_set from API metadata |
| columns | array | no | Columns and their data types to import. If not set, the connector guesses the columns and data types from API metadata |
| retry_limit | int | no | Maximum number of retries for each API call |
| retry_initial_wait_millis | int | no | Wait time before the first retry (in milliseconds) |
| max_retry_wait_millis | int | no | Maximum wait time for an API call before giving up (in milliseconds) |
| connection_timeout_millis | int | no | The amount of time before the connection times out when making API calls |
| read_timeout_millis | int | no | The amount of time to wait for response data before the request times out |

## Guess Fields (Generate load.yml)

Use td connector:guess. This command reads the seed.yml above and outputs a new file.

```bash
$ td connector:guess seed.yml -o load.yml
```

If you open load.yml, you will see the guessed entity_set and columns.

```yaml
---
in:
  type: ms_dynamics
  domain:
  tenant_id:
  auth_method: client_credentials
  client_id:
  client_secret:
  entity_type: contact
  filter_from:
  custom_expression_filter: your custom expression filter
  incremental: true
  entity_set: contacts
  columns:
  - {name: exchangerate, type: double}
  - {name: anniversary, type: string}
  - {name: lastname, type: string}
  - {name: employeeid, type: string}
  - {name: firstname, type: string}
  - {name: spousesname, type: string}
  - {name: customersizecode, type: long}
  - {name: fullname, type: string}
out: {mode: append}
exec: {}
filters:
- from_value: {mode: upload_time}
  to_column: {name: time}
  type: add_time
```

Then, you can preview the result by using the preview command.

```
td connector:preview load.yml
```

If the system detects a column type unexpectedly, or you want to remove a column from the result, modify load.yml directly and preview again. Currently, the Data Connector supports parsing of “boolean”, “long”, “double”, “string”, and “timestamp” types.

Create the destination database and table in Treasure Data:

```bash
td database:create td_sample_db
td table:create td_sample_db td_sample_table
```

## Execute Load Job

Submit the load job. It may take a couple of hours depending on the size of the data. Specify the Treasure Data database and table where the data should be stored.
It’s also recommended to specify the *--time-column* option, because Treasure Data’s storage is partitioned by time (see [data partitioning](https://docs.treasuredata.com/smart/project-product-documentation/data-partitioning-in-treasure-data)). If the option is not provided, the data connector chooses the first long or timestamp column as the partitioning time. The column specified by --time-column must be of either long or timestamp type.

If your data doesn’t have a time column, you can add one by using the add_time filter option. For more details, see the [add_time filter plugin](https://docs.treasuredata.com/smart/project-product-documentation/add_time-filter-function).

```
$ td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column created_at
```

The connector:issue command assumes that you have already created a *database (td_sample_db)* and a *table (td_sample_table)*. If the database or the table does not exist in TD, the connector:issue command fails, so create the database and table [manually](https://docs.treasuredata.com/smart/project-product-documentation/data-management) or use the *--auto-create-table* option with the td connector:issue command to auto-create the database and table:

```
td connector:issue load.yml --database td_sample_db --table td_sample_table --time-column created_at --auto-create-table
```

If you have a field called `time`, you don't have to specify the `--time-column` option.

```bash
td connector:issue load.yml --database td_sample_db --table td_sample_table
```

## Scheduled execution

You can schedule periodic Data Connector execution for incremental import of new data. Treasure Data configures the scheduler carefully to ensure high availability. By using this feature, you no longer need a cron daemon in your local data center.
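Conceptually, incremental loading works by filtering on filter_column (default **modifiedon**) with a moving time window: each run imports records modified after the previous run's upper bound. The following Python sketch illustrates that window logic only; it is not the connector's actual implementation, and the `build_filter` helper and OData-style expression it produces are illustrative assumptions.

```python
from datetime import datetime, timezone

def build_filter(filter_column, filter_from, filter_to=None):
    """Illustrative sketch of an incremental filter window.

    Not the connector's real query builder: the helper name and the
    OData-style expression are hypothetical, for explanation only.
    """
    if filter_to is None:
        # When filter_to is not set, the execution time is used.
        filter_to = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return f"{filter_column} gt {filter_from} and {filter_column} le {filter_to}"

# Each scheduled run advances the window: the previous run's upper bound
# becomes the next run's lower bound, so only new or changed records load.
expr = build_filter("modifiedon", "2021-01-01T00:00:00Z", "2021-01-02T00:00:00Z")
print(expr)
# modifiedon gt 2021-01-01T00:00:00Z and modifiedon le 2021-01-02T00:00:00Z
```

This is why `incremental: true` pairs with `mode: append`: each run appends only the new window of records.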
For more details about incremental loading, see [About Incremental Loading](https://docs.treasuredata.com/display/INT/MS+Dynamics+365+Sales+Import+Integration#MSDynamics365SalesImportIntegration-AboutIncrementalDataLoading).

## Create the schedule

A new schedule can be created using the td connector:create command. The following are required:

- the name of the schedule
- a cron-style schedule
- the database and table where the data will be stored
- the Data Connector configuration file

```bash
td connector:create \
    daily_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml
```

It’s also recommended to specify the --time-column option, because Treasure Data’s storage is partitioned by time (see [data partitioning](https://docs.treasuredata.com/smart/project-product-documentation/data-partitioning-in-treasure-data)).

```bash
td connector:create \
    daily_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml \
    --time-column created_at
```

The `cron` parameter also accepts three special values: `@hourly`, `@daily`, and `@monthly`.

By default, the schedule is set in the UTC timezone. You can set the schedule in a different timezone using the -t or --timezone option. The `--timezone` option supports only extended timezone formats like 'Asia/Tokyo' or 'America/Los_Angeles'. Timezone abbreviations like PST and CST are *not* supported and may lead to unexpected schedules.

## List the Schedules

You can see the list of currently scheduled entries by running td connector:list.

```
td connector:list
```

## Show the Setting and Schedule History

td connector:show shows the execution settings of a schedule entry.

```
td connector:show daily_import
```

td connector:history shows the execution history of a schedule entry. To investigate the results of each individual run, use td job <job_id>.

```
td connector:history daily_import
```

## Delete the Schedule

td connector:delete will remove the schedule.
```
$ td connector:delete daily_import
```

# Appendix

## Modes for out plugin

You can specify the import mode in the out section of seed.yml.

### append (default)

This is the default mode; records are appended to the target table.

```
in:
  ...
out:
  mode: append
```

### replace (in td 0.11.10 and later)

This mode replaces the data in the target table. Note that any manual schema changes made to the target table remain intact with this mode.

```
in:
  ...
out:
  mode: replace
```
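Putting the modes together with the parameters from the table above, here is a hedged sketch of a seed.yml that uses replace mode with an explicit filter window. The entity and date values are placeholders for illustration only; because each replace run rewrites the whole table, it pairs naturally with a fixed or full reload window rather than `incremental: true`.

```yaml
in:
  type: ms_dynamics
  domain: YOUR_DYNAMICS_365_DOMAIN
  tenant_id: TENANT_ID
  auth_method: client_credentials
  client_id: CLIENT_ID
  client_secret: CLIENT_SECRET
  entity_type: account
  filter_column: modifiedon
  filter_from: "2021-01-01T00:00:00Z"
  filter_to: "2021-02-01T00:00:00Z"
  incremental: false
out:
  mode: replace
exec: {}
```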