# Salesforce Marketing Cloud (ExactTarget) Export Integration Using SFTP (Beta)

This plugin is recommended for large data sets. For small data sets, you can instead use the Salesforce Marketing Cloud (ExactTarget) Data Connector to write job results.

Treasure Data can publish user segments into [Salesforce Marketing Cloud](https://www.salesforce.com/products/marketing-cloud/overview/) (ExactTarget), enabling you to send personalized emails to your customers. You can run data-driven email campaigns using your first-party data from web, mobile, CRM, and other data sources.

This feature is in beta. For more information, contact your Customer Success Representative.

## Prerequisites

- Basic knowledge of Treasure Data
- Basic knowledge of Salesforce Marketing Cloud
- TD account

## Supported Key Formats

OpenSSH 7.8 private keys are supported. The format of the key is detected automatically and the correct library is chosen.

## Set Up a Secure, Automated Account in Salesforce

Access your Salesforce account to begin setup.

### Create an FTP Account

On the SFMC dashboard, in your account, select **Administration**.

![](/assets/image-20191204-002010.42445ff8df7cbb65071298f2958ff593082a11325c30f565a01adac4a047ae55.a0d1c478.png)

From the Account drop-down menu, select **FTP Accounts**. This allows you to establish an SFTP account.

![](/assets/image-20191204-001209.f3f8b78f45b6d32231278162d5a335160736c060fe4efd43475875b50210b949.a0d1c478.png)

In the FTP Accounts panel, select **Add FTP User**.

![](/assets/image-20191204-002120.f7e9678741a24db050d2e19b98facf8b9c775a4008d4cb7f4f25ea239319b5ca.a0d1c478.png)

Provide an FTP account password.

![](/assets/image-20191204-001252.9e21310b3bb3fabbdcec106bf4c57292dc33df06f4c0abecb170994bdf09ef0c.a0d1c478.png)

Review your SFTP account information.

![](/assets/image-20191204-002021.fb6a443184da3766947f55d4c12ec3195d3cad1c293b694689672c3941edbd29.a0d1c478.png)

### Configure for Automatic Mail Delivery

Go back to the SFMC dashboard, and select **Email Studio > Email**.

![](/assets/image-20191204-001302.f19ada618bfac43c0a0d89da8ca410b4e03fa1ccffe0b3c9fd26d3fa8fada8b3.a0d1c478.png)

Select **Content > Create > Template > From Existing Template** to create an email template.

![](/assets/image-20191204-002156.276d183168c7f6f81e5a7278a4731e7b70647bf726bdf51e2da8c6d9e79a1494.a0d1c478.png)

After creating the template, select **Save > Save and Exit**, provide a template name and location, and save that information as well.

Remaining on the Email page, select **Create > Email** to create email content (for example, for a campaign) from a template.

![](/assets/image-20191204-001452.d2dc1111862f119c31cd2a756c98cc5cf0c20237f8129ebdff1bcd8b393e81d0.a0d1c478.png)

Select the template, define the email properties, including name and location, and select **Next** to provide content. Continue creating the email and save it when you are finished.

![](/assets/image-20191204-002032.a63962a0998f6ee3e83aeae659999dbb809ac80b971a00c0d7e107a31ee3d425.a0d1c478.png)

### Create an Import Interaction in Salesforce

From the Email view, select **Interactions**, and then select **Import**.

![](/assets/image-20191204-001503.4b78d56319543ea313786f09eee4b45edf867ed5c239cbea7ac7eec451b486a4.a0d1c478.png)

Select **Create** to make a new import interaction definition. Provide the import interaction information, including SFTP information and the data import location. Save the information.

### Specify the Import Trigger

From the Email view, select the SFDC blue cloud icon to view menu options.
Select **Journey Builder > Automation Studio**.

![](/assets/image-20191204-002131.88ed4960c20d788b5e2d3fc3038a576b18f1c3ff8c9ff10a89020e26ed11df60.a0d1c478.png)

Select **New Automation**. Drag the File Drop icon to Starting Source.

![](/assets/image-20191204-001532.9d699a06da20e7a68ff3c21dc298cf0f468bd0928435a142151a06650f86c1a2.a0d1c478.png)

Select **Configure > Trigger Automation**. Specify Use Filename Pattern and then select **Done**.

![](/assets/image-20191204-002043.d7e9397d50783a76d14c6b9cdff9ba894c3ae64e3353eea2d67637de793908f2.a0d1c478.png)

Drag the **Send Email** icon to the canvas and select **Create New**.

![](/assets/image-20191204-001543.39c3e0d99fe16ec5e2bb2be053a14ac4c051dc614a40858b018afddbeb31cb15.a0d1c478.png)

Select an email object, for example, the one you created in this section. Select **Next**.

Select an email target list. Select **Next**.

Verify the email configuration information, and select **Finish**.

Provide a name for the import trigger and an external key that Treasure Data refers to, and select **Save**. Select **Active** to enable the import trigger.

![](/assets/image-20191204-001601.93e8d7dab08d2ac4f0280b1020fa79512bf46895e9f1a79fbe7c68e027eaea95.a0d1c478.png)

Select **Save** and **Close**.

## Use the TD Console to Create Your Connection

### Create a New Connection

In Treasure Data, you must create and configure the data connection before running your query. As part of the data connection, you provide authentication to access the integration.

1. Open **TD Console**.
2. Navigate to **Integrations Hub** > **Catalog**.
3. Search for and select Salesforce Marketing Cloud via SFTP.

   ![](/assets/image2021-1-29_14-10-26.470fc5eef93922c17a07fc7312b6d2f2f3607676aa7b1c38d9da06c6641c04dc.a0d1c478.png)

4. Select **Create Credentials**.
5. Enter the required credentials for your remote SFTP instance.

   ![](/assets/image2021-1-29_14-11-48.83f486e2b19d4f19197ba524515962fa9d3465ea58a10d2d5cc71d0c6068cdb7.a0d1c478.png)

6. Type or select values for the parameters:

   | Parameter | Description |
   | --- | --- |
   | Host | The host information of the remote SFTP instance, for example, an IP address. |
   | Port | The connection port on the remote SFTP instance; the default is 22. |
   | User | The user name used to connect to the remote SFTP instance. |
   | Authentication mode | How you authenticate with your SFTP server: public/private key pair (OpenSSH 7.8 private keys are supported; the format of the key is detected and the correct library is chosen) or password. |
   | Secret key file | Required if public/private key pair is selected as the Authentication mode. The ed25519 key type is not supported, but the ecdsa key type is. See the key-generation sketch after this procedure. |
   | Passphrase for secret key file | (Optional) If required, provide a passphrase for the provided secret key file. |
   | Retry limit | Number of times to retry a failed connection (default 10). |
   | Timeout | Connection timeout in seconds. |
   | Use proxy? | If selected, enter the details for the proxy server: Type, Host, Port, User, Password, and Command. |
   | Sequence format | Format for the sequence part of output file names (string, default: `".%03d.%02d"`). |

7. Select **Continue**.
8. Type a name for your connection. If you would like to share this connection with other users in your organization, select the `Share with others` checkbox. If this box is unchecked, the connection is visible only to you.
9. Select **Done**.
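If you use key-pair authentication, the private key must be in a supported format (ecdsa is supported; ed25519 is not). The following is a minimal sketch of generating a compatible key pair; the file name and comment are placeholders, and whether your SFTP account accepts the key depends on how it is configured:

```sh
# Hypothetical example: generate an ecdsa key pair in PEM format for the
# SFTP connection. ed25519 keys are not supported by this integration.
ssh-keygen -t ecdsa -b 256 -m PEM -f sfmc_sftp_key -C "td-sfmc-export"

# Register the public key (sfmc_sftp_key.pub) with your SFTP account, and
# upload the private key (sfmc_sftp_key) as the Secret key file in TD Console.
```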
### Define Your Query

Create a job that selects data from within Treasure Data. The column names in the query result must match the column names in the SFMC (ExactTarget) mail; the "Email Address" and "Subscriber Key" columns are required. If needed, you can rename the mapped columns in the TD database from within the TD Console.

### Exporting Your Query Data to Your Destination

1. Complete the instructions in [Creating a Destination Integration](https://docs.treasuredata.com/smart/project-product-documentation/creating-a-destination-integration).
2. Navigate to **Data Workbench > Queries**.
3. Select a query for which you would like to export data.
4. Run the query to validate the result set.
5. Select **Export Results**.
6. Select an existing integration authentication.

   ![](/assets/image2020-12-18_13-44-6.09e8af43184e33e337bef7c546600eaaa5be9f010b690af1d591c7c2b4bb2df3.c27c97ee.png)

7. Define any additional Export Results details and review the integration parameters. Your Export Results screen might look different, or you might not have additional details to fill out:

   ![](/assets/image2023-5-17_14-42-52.d2483b20e117c4abf1aaa32c5071595644e4c6b9d987ef4ea4ce4a2038171b85.c27c97ee.png)

8. Select **Done**.
9. Run your query.
10. Validate that your data moved to the destination you specified.

### Integration Parameters

You can specify additional parameters for the target export file:

- **Path prefix**: the path where the plugin saves your output files on the target server
- **Rename file after upload finish**: select to upload the file with a .tmp extension first, then rename it without .tmp after the upload completes successfully
- **Format**: format of the file (CSV or TSV)
- **Compression**: select to compress the file; gzip and bzip2 compression are supported
- **Header line**: select to write the column names as the first line
- **Delimiter**: delimiter between values in the target file (`|`, tab, or comma)
- **Quote policy**: quoting applied to each column; MINIMAL, ALL, or NONE
- **Null string**: the value written for null fields in the query result
- **End-of-line character**: the end-of-line character: Carriage Return Line Feed (CRLF, used in Windows file systems), Line Feed (LF, used in Unix and macOS), or Carriage Return (CR, used in classic Mac OS)
- **Encryption column names**: list of columns to encrypt, separated by commas
- **Encryption key**: the key needed to perform the encryption algorithm
- **Encryption iv**: the initialization vector, a number used to prevent repetition in data encryption

### Example Query

![](/assets/image-20191204-002208.2ea44edc828e9f310983210bfff13b5b8e3a7d676a5ae2573f8758844e159d73.a0d1c478.png)
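As a sketch of a query with this shape — the table and column names (`my_db.customers`, `email`, `customer_id`, `first_name`) are hypothetical — a query that produces the required columns might look like:

```sql
-- Hypothetical example: alias source columns to the column names the SFMC
-- import definition expects. "Email Address" and "Subscriber Key" are required.
SELECT
  email       AS "Email Address",
  customer_id AS "Subscriber Key",
  first_name  AS "First Name"
FROM
  my_db.customers
WHERE
  TD_INTERVAL(time, '-30d', 'JST')
```

The quoted aliases let the exported header line carry spaces, so the file columns match the SFMC import definition without further mapping.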
### (Optional) Schedule Query Export Jobs

You can use Scheduled Jobs with Result Export to periodically write the output result to a target destination that you specify.

Treasure Data's scheduler feature supports periodic query execution to achieve high availability. When two schedule specifications conflict, the specification requesting to execute more often is followed and the other is ignored.

For example, if the cron schedule is `'0 0 1 * 1'`, the 'day of month' and 'day of week' specifications are discordant: the former requires the job to run on the first day of each month at midnight (00:00), while the latter requires it to run every Monday at midnight (00:00). The latter specification is followed.

#### Scheduling Your Job Using TD Console

1. Navigate to **Data Workbench > Queries**.
2. Create a new query or select an existing query.
3. Next to **Schedule**, select None.

   ![](/assets/image2021-1-15_17-28-51.f1b242f6ecc7666a0097fdf37edd1682786ec11ef80eff68c66f091bc405c371.0f87d8d4.png)

4. In the drop-down, select one of the following schedule options:

   ![](/assets/image2021-1-15_17-29-47.45289a1c99256f125f4d887e501e204ed61f02223fde0927af5f425a89ace0c0.0f87d8d4.png)

   | Drop-down Value | Description |
   | --- | --- |
   | Custom cron... | Review [Custom cron... details](#custom-cron-details). |
   | @daily (midnight) | Run once a day at midnight (00:00 am) in the specified time zone. |
   | @hourly (:00) | Run every hour at 00 minutes. |
   | None | No schedule. |

5. (Optional) You can delay the start time of a query by enabling Delay execution.

#### Custom cron... Details

![](/assets/image2021-1-15_17-30-23.0f94a8aa5f75ea03e3fec0c25b0640cd59ee48d1804a83701e5f2372deae466c.0f87d8d4.png)

| **Cron Value** | **Description** |
| --- | --- |
| `0 * * * *` | Run once an hour. |
| `0 0 * * *` | Run once a day at midnight. |
| `0 0 1 * *` | Run once a month at midnight on the morning of the first day of the month. |
| `""` | Create a job that has no scheduled run time. |

```
 * * * * *
 - - - - -
 | | | | |
 | | | | +----- day of week (0 - 6) (Sunday=0)
 | | | +---------- month (1 - 12)
 | | +--------------- day of month (1 - 31)
 | +-------------------- hour (0 - 23)
 +------------------------- min (0 - 59)
```

The following named entries can be used:

- Day of week: sun, mon, tue, wed, thu, fri, sat.
- Month: jan, feb, mar, apr, may, jun, jul, aug, sep, oct, nov, dec.

A single space is required between each field. The values for each field can be composed of:

| Field Value | Example | Example Description |
| --- | --- | --- |
| A single value, within the limits displayed above for each field. | | |
| A wildcard `'*'` to indicate no restriction based on the field. | `'0 0 1 * *'` | Configures the schedule to run at midnight (00:00) on the first day of each month. |
| A range `'2-5'`, indicating the range of accepted values for the field. | `'0 0 1-10 * *'` | Configures the schedule to run at midnight (00:00) on each of the first 10 days of each month. |
| A list of comma-separated values `'2,3,4,5'`, indicating the list of accepted values for the field. | `'0 0 1,11,21 * *'` | Configures the schedule to run at midnight (00:00) on the 1st, 11th, and 21st day of each month. |
| A periodicity indicator `'*/5'` to express how often, based on the field's valid range of values, a schedule is allowed to run. | `'30 */2 1 * *'` | Configures the schedule to run on the 1st of every month, every 2 hours starting at 00:30. `'0 0 */5 * *'` configures the schedule to run at midnight (00:00) every 5 days starting on the 5th of each month. |
| A comma-separated list of any of the above except the `'*'` wildcard is also supported, for example `'2,*/5,8-10'`. | `'0 0 5,*/10,25 * *'` | Configures the schedule to run at midnight (00:00) on the 5th, 10th, 20th, and 25th day of each month. |

### Execute the Query

Save the query with a name and run it, or just run the query. Upon successful completion of the query, the query result is automatically exported to the specified destination.

Scheduled jobs that continuously fail due to configuration errors may be disabled on the system side after several notifications.

### Verifying the Query Export Job

After the job finishes, you can check the output file on the SFTP server by using standard SFTP commands, as shown in the following example:

![](/assets/image-20191204-001950.fd1686166c81155f0ac282cd52c2afea9f26d16fd157be848b3252a2df940c1d.a0d1c478.png)
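For reference, a command-line session to confirm the upload might look like the following sketch; the host, user, path, and file name are placeholders, and the `.000.00` suffix assumes the default Sequence format `".%03d.%02d"`:

```sh
# Hypothetical example: connect to the SFTP endpoint and list exported files.
sftp td_export_user@ftp.example.exacttarget.com
sftp> ls /Import
td_segment_export.000.00.csv
sftp> exit
```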
Check the SFMC dashboard to verify a successful import. If the import and mail delivery are successful, **Complete** is displayed on the Automation Studio Overview page.

## Activate a Segment in Audience Studio

You can also send segment data to the target platform by creating an activation in Audience Studio.

1. Navigate to **Audience Studio**.
2. Select a parent segment.
3. Open the target segment, right-click, and then select **Create Activation**.
4. In the **Details** panel, enter an activation name and configure the activation according to the previous section on integration parameters.
5. Customize the activation output in the **Output Mapping** panel.

   ![](/assets/ouput.b2c7f1d909c4f98ed10f5300df858a4b19f71a3b0834df952f5fb24018a5ea78.8ebdf569.png)

   - Attribute Columns
     - Select **Export All Columns** to export all columns without making any changes.
     - Select **+ Add Columns** to add specific columns for the export. The Output Column Name pre-populates with the same Source column name. You can update the Output Column Name. Continue to select **+ Add Columns** to add new columns for your activation output.
   - String Builder
     - Select **+ Add string** to create strings for export. Select from the following values:
       - String: Choose any value; use text to create a custom value.
       - Timestamp: The date and time of the export.
       - Segment Id: The segment ID number.
       - Segment Name: The segment name.
       - Audience Id: The parent segment number.

6. Set a **Schedule**.

   ![](/assets/snippet-output-connector-on-audience-studio-2024-08-28.a99525173709da1eb537f839019fa7876ffae95045154c8f2941b030022f792c.8ebdf569.png)

   - Select the values to define your schedule and optionally include email notifications.

7. Select **Create**.

If you need to create an activation for a batch journey, review [Creating a Batch Journey Activation](/products/customer-data-platform/journey-orchestration/batch/creating-a-batch-journey-activation).

## Others

- You can involve this integration in a [TD workflow](/int/using-td-workflow-with-td-integrations) as part of a more advanced data pipeline, as shown in the sketch below.
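As a hedged sketch of that idea — the connection name `sfmc_sftp`, the database, the query file, and the settings are hypothetical and must match your own configuration — a workflow task that runs a query and exports the result through this integration might look like:

```yaml
# Hypothetical example: export_to_sfmc.dig
# Runs daily, executes a query, and exports the result through the
# Salesforce Marketing Cloud via SFTP connection created in TD Console.
timezone: UTC

schedule:
  daily>: 01:00:00

_export:
  td:
    database: my_db                # placeholder database

+export_segment:
  td>: queries/sfmc_export.sql     # query producing "Email Address" and "Subscriber Key"
  result_connection: sfmc_sftp     # name of the authentication created earlier
  result_settings:
    path_prefix: /Import/td_segment
    format: csv
    header_line: true
```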