# Adobe Analytics Import Integration

Version 2 of this connector, which supports cloud storage services, has been released. For details, see [Adobe Analytics Import Integration (V2)](/int/adobe-analytics-import-integration-v2).

The data connector for Adobe Analytics enables you to import files stored on your Adobe Analytics SFTP server to Treasure Data.

# Prerequisites

- Basic knowledge of Treasure Data
- SFTP file server from Adobe

## Static IP Address of Treasure Data Integration

If your security policy requires IP whitelisting, you must add Treasure Data's IP addresses to your allowlist to ensure a successful connection. You can find the complete list of static IP addresses, organized by region, at [https://api-docs.treasuredata.com/en/overview/ip-addresses-integrations-result-workers/](https://api-docs.treasuredata.com/en/overview/ip-addresses-integrations-result-workers/).

# Limitations

- This integration supports only the gzip compression format. The tar.gz and zip compression formats are not supported.

# Use the TD Console to create your connection

You can use the TD Console to create your data connector.

## Create an Authentication

When you configure a data connection, you provide authentication to access the integration. In Treasure Data, you configure the authentication and then specify the source information.

1. Open the **TD Console**.
2. Navigate to **Integrations Hub > Catalog**.
3. Click the search icon on the far right of the Catalog screen, and enter **Adobe Analytics**.
4. Hover over the Adobe Analytics connector and select **Create Authentication**.

   ![](/assets/adobeanalytics.aeaf842e9004042c974cf20c8dd91dfea31d48e7b7a63b8a81d645b700f92a55.426954b1.png)

5. Enter the required credentials for your remote Adobe Analytics SFTP instance.
6. Set the parameters.
7. Select **Continue** after entering the required connection details. Name the connection so you can easily find it later should you need to modify any of the connection details.
If you would like to share this connection with other users in your organization, check the **Share with others** checkbox. If this box is unchecked, this connection is visible only to you.

8. Select **Create Authentication** to complete the connection. If the connection succeeds, the connection you just created appears in your list of authentications with the name you provided.

| **Parameter** | **Description** |
| --- | --- |
| **Host** | The host information of the remote Adobe Analytics SFTP instance, for example, an IP address. |
| **Port** | The connection port on the remote SFTP instance; the default is 22. |
| **User** | The user name used to connect to the remote SFTP instance. |
| **Authentication mode** | The method you choose to authenticate with your Adobe Analytics SFTP server. |
| **Secret key file** | Required if **public / private key pair** is selected as the `Authentication Mode`. (The ed25519 key type is not supported, but the ecdsa key type is supported.) |
| **Passphrase for secret key file** | (Optional) If required, provide a passphrase for the provided secret key file. |
| **Retry limit** | Number of times to retry a failed connection (default 10). |
| **Timeout** | Connection timeout in seconds (default 600). |

## Transfer Data into Treasure Data

### Connection

Now that you have created the connection to your remote Adobe Analytics SFTP instance, the next step is getting the data from your Adobe Analytics SFTP server into Treasure Data. You can set up an ad hoc one-time transfer or a recurring transfer at a regular interval. In this section, you specify source details as described in the following steps.

After creating the authenticated connection, you are automatically taken to Authentications.

1. Search for the connection you created.
2. Select **New Source**.

**Connection**

1. Type a name for your **Source** in the Data Transfer field.
2. Select **Next**.
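If the connection or a later transfer fails, it can help to first confirm that the SFTP host and port from the authentication are reachable from your network at all (for example, when a firewall or a non-default port is involved, as noted in the FAQ below). The following is a minimal sketch using Python's standard library; the hostname is a placeholder, not a real Adobe endpoint:

```python
import socket


def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures.
        return False


# Placeholder host; substitute the Host and Port values from your authentication.
# can_reach("sftp.example.com", 22)
```

Reachability only confirms that the network path is open; authentication failures (wrong user, key, or passphrase) are reported separately in the job log.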
![](/assets/image-20200402-200406.a2fa091252d0882797c1dc5aec3e0d395489268d0fd3b28f385c28be71500bea.426954b1.png)

### Source Table

Provide the details of the database and table that you want to ingest data from.

![](/assets/image-20200402-204953.1c0b6fb3ee9991d3f82cdb13f8d382ac9c2f933f0cb8c42e858cd4e68ac23397.426954b1.png)

| **Parameter** | **Description** |
| --- | --- |
| **Path prefix** | Prefix of target files (string, required). |
| **Incremental** | Enables incremental loading (boolean, optional; default: true). If incremental loading is enabled, the config diff for the next execution includes the `last_path` parameter so that the next execution skips files before that path. Otherwise, `last_path` is not included. |

### Data Settings

1. Select **Next**. The Data Settings page opens.
2. Optionally, edit the data setting parameters or skip this page of the dialog. The parameters are described on the page.

![](/assets/image-20200402-201409.fabf9295896b0aeb039018410903c82fb4f7f6d9186c1826e1a50262e27ddd6a.426954b1.png)

![](/assets/image-20200402-201440.6e3df5c4f3730425281e0966432c29cb0935ea6cc2264bd4a73d6253f464c1b6.426954b1.png)

### Data Preview

You can see a [preview](/products/customer-data-platform/integration-hub/batch/import/previewing-your-source-data) of your data before running the import by selecting **Generate Preview**. Data preview is optional, and you can safely skip to the next page of the dialog if you choose to.

1. Select **Next**. The Data Preview page opens.
2. If you want to preview your data, select **Generate Preview**.
3. Verify the data.

### Data Placement

For data placement, select the target database and table where you want your data placed and indicate how often the import should run.

1. Select **Next**. Under Storage, create a new database or select an existing one, and create a new table or select an existing one, for the imported data.
2. Select a **Database** > **Select an existing** or **Create New Database**.
3. Optionally, type a database name.
4. Select a **Table** > **Select an existing** or **Create New Table**.
5. Optionally, type a table name.
6. Choose the method for importing the data.
   - **Append** (default): Data import results are appended to the table. If the table does not exist, it is created.
   - **Always Replace**: Replaces the entire content of an existing table with the result output of the query. If the table does not exist, a new table is created.
   - **Replace on New Data**: Replaces the entire content of an existing table with the result output only when there is new data.
7. Select the **Timestamp-based Partition Key** column. If you want to set a partition key seed other than the default key, you can specify a long or timestamp column as the partitioning time. By default, the time column is `upload_time` with the `add_time` filter.
8. Select the **Timezone** for your data storage.
9. Under **Schedule**, choose when and how often you want to run this query.

#### Run once

1. Select **Off**.
2. Select **Scheduling Timezone**.
3. Select **Create & Run Now**.

#### Repeat Regularly

1. Select **On**.
2. Select the **Schedule**. The UI provides these four options: *@hourly*, *@daily*, *@monthly*, or custom *cron*.
3. You can also select **Delay Transfer** and add a delay to the execution time.
4. Select **Scheduling Timezone**.
5. Select **Create & Run Now**.

After your transfer has run, you can see the results of your transfer in **Data Workbench** > **Databases**.

## FAQ for the Adobe Analytics Data Connector

1. I can't connect to my Adobe Analytics SFTP server. What can I do?
   - Check which protocol is valid. If you intend to use SFTP, you can use this data connector for Adobe Analytics SFTP. If the protocol is FTP/FTPS, try connecting with the [FTP Data Connector](/int/ftp-server-import-integration).
   - If you are using a firewall, check your accepted IP range/port. Server administrators sometimes change the default port number from TCP/22 for security reasons.
   - Be sure that your private key is in the *OpenSSH* format. Other formats, such as the "PuTTY" format, are not supported.
   - The default private key format generated since [OpenSSH 7.8](https://www.openssh.com/releasenotes.md) is not supported. Regenerate the key using the '-m PEM' option.
2. How do I troubleshoot data import problems?

   Review the job log. Warnings and errors provide information about the success of your import. For example, you can [identify the source file names associated with import errors](/products/customer-data-platform/integration-hub/batch/import/data-import-error-troubleshooting).
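You can check the key-format requirement from the FAQ by inspecting the first line of your private key file. The sketch below is a minimal illustration; `key_format` is a hypothetical helper name, and the header strings are the standard markers for PEM keys, new-format OpenSSH keys, and PuTTY key files:

```python
def key_format(path: str) -> str:
    """Classify a private key file by its first line (illustrative only)."""
    with open(path) as f:
        first = f.readline().strip()
    if first == "-----BEGIN OPENSSH PRIVATE KEY-----":
        return "openssh-new"  # OpenSSH 7.8+ default; regenerate with '-m PEM'
    if first.startswith("-----BEGIN") and first.endswith("PRIVATE KEY-----"):
        return "pem"          # e.g. -----BEGIN RSA PRIVATE KEY----- (supported)
    if first.startswith("PuTTY-User-Key-File"):
        return "putty"        # not supported; export an OpenSSH key instead
    return "unknown"
```

A result of `"openssh-new"` means the key matches the unsupported post-7.8 default and should be regenerated with `ssh-keygen -m PEM`, as described above.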