# Migrating Salesforce Integrations

Complete the following steps to migrate from the legacy Salesforce data connector to the new Salesforce connector. The legacy data connector imports data using only the REST API. The new Salesforce data connector lets you use either bulk import or the REST API.

- [Characteristics of Ingested Data](/int/migrating-salesforce-integrations#characteristics-of-ingested-data)
  - [Campaign](/int/migrating-salesforce-integrations#campaign)
  - [Contact](/int/migrating-salesforce-integrations#contact)
  - [Data Extension](/int/migrating-salesforce-integrations#data-extension)
  - [Email Event](/int/migrating-salesforce-integrations#email-event)
- [Create a New Salesforce V2 Connector](/int/migrating-salesforce-integrations#create-a-new-salesforce-v2-connector)
- [Save Settings and Run the Legacy Salesforce Data Connector One Last Time](/int/migrating-salesforce-integrations#save-settings-and-run-the-legacy-salesforce-data-connector-one-last-time)
  - [Using TD Console](/int/migrating-salesforce-integrations#using-td-console)
  - [Using CLI and Workflow](/int/migrating-salesforce-integrations#using-cli-and-workflow)
- [For Result Output](/int/migrating-salesforce-integrations#for-result-output)
  - [Using TD Console](/int/migrating-salesforce-integrations#using-td-console-1)
  - [Using CLI](/int/migrating-salesforce-integrations#using-cli)
  - [Using Workflow](/int/migrating-salesforce-integrations#using-workflow)

# Characteristics of Ingested Data

When migrating data from one place or version to another, it is worth being aware of how that data might be transformed. The following sections outline some important characteristics to be aware of.

## Campaign

You can ingest more than 50 assets.
| **Column** | **Old Data Type** | **New Data Type** |
| --- | --- | --- |
| **createdDate** | string | timestamp |
| **modifiedDate** | string | timestamp |

### Campaign Assets

| **Column** | **Old Data Type** | **New Data Type** |
| --- | --- | --- |
| **createdDate** | string | ISO 8601 string |

Other date-time values are converted to UTC.

## Contact

Ingestion is limited to:

- root and system-defined data
- **one-to-one** and **one-to-many** relationships
  - one-to-one relationships are saved as a single JSON object
  - one-to-many relationships are saved as a JSON array

Other attributes must be ingested using the Treasure Data ingestion feature. Contact attributes are collected for root and system only; you cannot limit which attributes are ingested. The number of records per page uses the default value of 2000.

## Data Extension

Ingestion is limited to:

- one data extension at a time

| **Old Column Name** | **New Column Name** |
| --- | --- |
| data-extension-column-name | column-name |

| **Column** | **Old Data Type** | **New Data Type** | **Format of Data** |
| --- | --- | --- | --- |
| **any-datetime** | string | timestamp | UTC |

TD-generated properties have an underscore ("_") prefix so that they can be easily identified. The number of records per page uses the default value of 2500.

## Email Event

Ingestion excludes subscribers associated with an event.

# Create a New Salesforce V2 Connector

Go to the Treasure Data Catalog, then search for and select **Salesforce v2**.

![](/assets/image-20190920-220512.90e6a331b9492e4811a07d129b52ba372e154827afd21d8b2a86d6bd77e11dfc.a47fdf65.png)

In the dialog box, enter the values that you entered in your legacy Salesforce connector. The Salesforce v2 connector requires that you remove unnecessary query parameters from the Login URL. For example, instead of `https://login.salesforce.com/?locale=jp`, use `https://login.salesforce.com/`.
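If you manage connector configurations from the CLI rather than the console, the same connection settings can be sketched as a load configuration for the v2 connector. This is a minimal sketch, assuming the field names shown in the workflow examples later in this guide; all credential values are placeholders:

```
in:
  type: sfdc_v2
  username: your.name@example.com
  password: your_password
  client_id: your_client_id
  client_secret: your_client_secret
  security_token: your_security_token
  # Use the bare login URL; drop query parameters such as ?locale=jp
  login_url: https://login.salesforce.com/
  target: Lead
out: {}
```

The `login_url` value here reflects the cleanup described above: query parameters are removed before use.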
Enter your username (your email) and password, as well as your Client ID, Client Secret, and Security Token.

![](/assets/image-20190920-220620.ea9aff1d36519f589b82fc46eaa23d60b1cabf146a991e1477cd3366738b1e69.a47fdf65.png)

# Save Settings and Run the Legacy Salesforce Data Connector One Last Time

You can save your legacy settings from TD Console or from the CLI.

## Using TD Console

### Save the Settings of Your Scheduled Legacy Salesforce Connector and Run a Final Import

Go to **Integration Hub > Sources**. Search for your scheduled Salesforce source, select the source, and select **Edit**.

![](/assets/image-20190920-220740.bd3854fa9c1f86b66293dcfd44b1707bb6e5d01075ea7b919f906ef9500e5e2b.a47fdf65.png)

In the dialog box, copy the settings to use later:

![](/assets/image-20190920-223337.2adf53d0abfebe0118affc5a062201916f90eb341fdb34565cbabfa162b58965.a47fdf65.png)

Also copy any advanced settings:

![](/assets/image-20190920-223416.c23211b8d05c516ff679ff1e03ddd3d78a6a1e46bb0062cb642387c2c0f05a36.a47fdf65.png)

Next, you configure one final run with the legacy data connector to create a temporary table against which you can run a config-diff. You use the diff to identify and confirm the latest data imported into Treasure Data.
![](/assets/image-20190920-223452.4bf2fa218a9517425dacb967c12df40769e981e8c13a7c1f2ae45db5ba8decb8.a47fdf65.png)

Before running the final import with the legacy connector, make sure that you change the schedule to run only once:

![](/assets/image-20190920-223521.97b218bd0b67056dd5358973c55e88e9b4c4c667b32526fa94565b7750d59a96.a47fdf65.png)

After the job completes, copy the **config_diff** value from the job query information to use later.

![](/assets/image-20190920-223541.b56a9ac85b1cd27d402643428847df0a61045b7641ee3f81a8284d27a18b0493.a47fdf65.png)

### **Create New Salesforce V2 Source**

Go to **Integration Hub > Authentication**. Search for the new Salesforce v2 connection that you created:

![](/assets/image-20190920-223620.7c996d3ecda4cd5281739073f28ce744338c3093e24dae3b0a3db44fe6044d77.a47fdf65.png)

Select **New Source**. Fill in all the basic and advanced settings that you copied in the preceding steps. Then, if you want the new source to continue ingesting from the point where the legacy connector left off, fill in the **Last Record** field with the **config_diff** information that you copied from the previous job.

![](/assets/image-20190920-223653.2f41e4091afd038a05037227d49d15c460bc098eafc64af39b885d9c225d5f22.a47fdf65.png)

After completing the settings, choose the database and table to populate data into, then schedule the job and provide a name for your new data connector. Select **Save** and then run the new data connector.

## Using CLI and Workflow

Update `in: type` in your yml configuration from `sfdc` to `sfdc_v2`.
For example, your existing workflow configuration might look something like this:

```
in:
  type: sfdc
  username: ${secret:sfdc.username}
  password: ${secret:sfdc.password}
  client_id: ${secret:sfdc.client_id}
  client_secret: ${secret:sfdc.client_secret}
  security_token: ${secret:sfdc.security_token}
  login_url: ${secret:sfdc.login_url}
  target: Lead
out: {}
exec: {}
filters: []
```

Your new workflow configuration would look like this:

```
in:
  type: sfdc_v2
  username: ${secret:sfdc.username}
  password: ${secret:sfdc.password}
  client_id: ${secret:sfdc.client_id}
  client_secret: ${secret:sfdc.client_secret}
  security_token: ${secret:sfdc.security_token}
  login_url: ${secret:sfdc.login_url}
  target: Lead
out: {}
exec: {}
filters: []
```

# For Result Output

The SFDC connection is shared between the data connector and result output. Although nothing changes in the result output itself, if you use either of them, you should upgrade it as well.

## Using TD Console

### **Save the Settings of Legacy Export Connector**

Go to TD Console, then go to the Query Editor. Open the query that uses SFDC for its connection.

![](/assets/image-20190920-223748.6045bf691a26e9cff3e37e890e76be4628cf82bb8e01eb79af5bdbdb26902e42.a47fdf65.png)

Select the SFDC connector, then copy and save the details of the existing connection to use later.

![](/assets/image-20190920-223817.da15e7766ba6d24cc18ba368e4d32ed9a2376d2faa5bb7282b57c604f05f76c8.a47fdf65.png)

Select **DELETE** to remove the legacy connection.

### **Modify the Existing Query (to Replace the Legacy Connection)**

In the query, select **Output Results**. Next, set up the SFDC v2 connector by finding and selecting the SFDC v2 export connector that you created.

![](/assets/image-20190920-223850.847f20da49fae475453d03d1fb0c5c966d1486f75499c347f0016a29f2841f28.a47fdf65.png)

In the Configuration pane, specify the fields you saved in the previous step, then select **Done**. Check **Output results to...** to verify that you are using the output connection you created. Select **Save**.
| |
| --- |
| It is strongly recommended to create a test target and use it for the first data export, to verify that the exported data looks as expected and that the new export does not corrupt existing data. In your test case, choose an alternate “Object” for your test target. |

## Using CLI

The result type protocol needs to be updated from `sfdc` to `sfdc_v2`, for instance from:

```
sfdc://username:passwordsecurity_token@hostname/object_name
```

to:

```
sfdc_v2://username:passwordsecurity_token@hostname/object_name
```

Note that in this URL format the security token is concatenated directly onto the password.

## Using Workflow

If you have a workflow that uses the SFDC connection, you can keep your result settings the same, but you need to update **result_connection** to the new connection name. An example of the old workflow result output settings is as follows:

```
+td-result-output-sfdc:
  td>: queries/sample.sql
  database: sample_datasets
  result_connection: your_old_connection_name
  result_settings:
    object: object_name
    mode: append
    concurrency_mode: parallel
    retry: 2
    split_records: 10000
```

An example of the new workflow result output settings is as follows:

```
+td-result-output-sfdc:
  td>: queries/sample.sql
  database: sample_datasets
  result_connection: your_new_connection_name
  result_settings:
    object: object_name
    mode: append
    concurrency_mode: parallel
    retry: 2
    split_records: 10000
```
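As an illustration of the CLI protocol change above, a query that writes its results directly to Salesforce by result URL would switch protocols as follows. This is a sketch, not a definitive invocation: the credentials, hostname, object name, database, and query are placeholders, and it assumes the td CLI's `-r`/`--result` option for specifying result output:

```
td query -d sample_datasets \
  -r "sfdc_v2://username:passwordsecurity_token@hostname/object_name" \
  "SELECT * FROM your_table"
```

Only the protocol prefix changes; the rest of the result URL keeps the same format as before.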