# NetSuite Import Integration

Treasure Data allows you to import your data from NetSuite. For sample workflows of importing data from NetSuite, view [Treasure Boxes](https://github.com/treasure-data/treasure-boxes/tree/master/td_load/netsuite).

## Prerequisites

- Basic knowledge of Treasure Data
- NetSuite account

## Use TD Console

### Create a New Connection

Go to Integrations Hub > Catalog and search for NetSuite. Select NetSuite.

![](/assets/netsuitecatalog.c1bb76655f5dc9f99ebd4c21cb5e74436f355ebcf570bb11c894ce24beafc2aa.bc56b822.png)

Complete the NetSuite login information. Click **Continue**.

![](/assets/netsuiteconnectornewauth.d92d01f320d6378172200d47e2eca504b82791cbb4a335382a8cd34c521b679e.bc56b822.png)

Provide a connection name.

![](/assets/netsuitenameconnection.8df3a266b2572924b61d675e952dedb1fd77d572466b0f1dae0f091be282e455.bc56b822.png)

Click **Done**.

### Create a New Source

After creating the connection, you are automatically taken to the Authentications page. Look for the connection you created and click **New Source**. Specify the source location.

![](/assets/netsuitenewsourcetransfer.ed91937afd373df455bbe42c5ecbccc9d35a00884f009a09d4b30739f16f7164.bc56b822.png)

Next, you see a preview of your data. To make changes, such as skipping records on errors or adjusting rate limits, click **Advanced Settings**; otherwise, click **Next**.

Select the database and table where you want to transfer the data.

In the **Schedule** tab, you can specify a one-time transfer, or you can schedule an automated recurring transfer. If you select **Once now**, click **Start Transfer**. If you select **Repeat…**, specify your schedule options, then click **Schedule Transfer**.

After your transfer has run, you can see the results of your transfer in **Data Workbench** > **Databases**. You are ready to start analyzing your data.

## Install ‘td’ Command v0.11.9 or Later

Install the newest [TD Toolbelt](https://toolbelt.treasuredata.com/).

```bash
$ td --version
0.11.10
```

### Create Seed Config File (seed.yml)

First, prepare seed.yml as shown in the following example, with your email, password, account, and role. You’ll use “replace” mode:

```yaml
in:
  type: netsuite
  target: Customer
  email: YOUR_EMAIL_ADDRESS
  password: YOUR_PASSWORD
  account: YOUR_ACCOUNT_ID
  role: YOUR_ROLE_ID
  incremental: INCREMENTAL_OPTION   # optional, default: true
  from_datetime: DATE TO LOAD DATA  # optional; a date value, e.g. 2017-07-24 15:43:11 +0700, specifying the point from which data is loaded
  sandbox: false                    # set "sandbox: true" if you want to get sandbox account data
out:
  mode: replace
```

The account and role (which is optional) can be found under Setup > Integration > Web Service Preference in the navigation bar.

The target specifies what type of object you want to dump from NetSuite. Types are listed in the [NetSuite official document](https://system.netsuite.com/help/helpcenter/en_US/srbrowser/Browser2015_2/schema/record/account.md), but some types are not supported. Target types tested by Treasure Data are Customer, Invoice, CustomerPayment, and CustomerRefund.

Specifying `incremental: true` results in incremental runs: the task attempts to import only data that is new since the last import. For more details on available out modes, see the Appendix.
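For example, a seed file that pulls Invoice records (one of the tested targets) from a sandbox account, importing only data created after a given date, might look like the following sketch. The credentials, account ID, and role ID shown are placeholders, not real values:

```yaml
in:
  type: netsuite
  target: Invoice                              # tested target type
  email: you@example.com                       # placeholder NetSuite login
  password: YOUR_PASSWORD
  account: "1234567"                           # placeholder account ID
  role: "3"                                    # placeholder role ID (optional)
  incremental: true                            # import only new data on each run
  from_datetime: "2017-07-24 15:43:11 +0700"   # fetch records after this time
  sandbox: true                                # read from the sandbox environment
out:
  mode: replace
```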
### Guess Fields (Generate load.yml)

Use connector:guess. This command automatically reads the seed file and intelligently guesses the file format.

```bash
$ td connector:guess seed.yml -o load.yml
```

When you open load.yml, you see guessed file format definitions including, in some cases, file formats, encodings, column names, and types.

```yaml
in:
  type: netsuite
  target: Customer
  email: YOUR_EMAIL_ADDRESS
  password: YOUR_PASSWORD
  account: YOUR_ACCOUNT_ID
  role: YOUR_ROLE_ID
  incremental: true
  sandbox: false
  columns:
  - {name: access_role_external_id, type: string}
  - {name: access_role_internal_id, type: long}
  - {name: access_role_name, type: string}
  - {name: access_role_type, type: string}
  - {name: comments, type: string}
  - {name: company_name, type: string}
  - {name: credit_hold_override, type: string}
  - {name: currency_external_id, type: string}
  - {name: currency_internal_id, type: long}
  - {name: currency_name, type: string}
  - {name: currency_type, type: string}
  - {name: custom_fields, type: json}
  - {name: date_created, type: timestamp, format: "%Y-%m-%d %H:%M:%S %z"}
  - {name: days_overdue, type: long}
  - {name: default_address, type: string}
  - {name: email, type: string}
  - {name: email_preference, type: string}
  - {name: email_transactions, type: boolean}
  - {name: entity_id, type: string}
  - {name: entity_status_external_id, type: string}
  - {name: entity_status_internal_id, type: long}
  - {name: entity_status_name, type: string}
  - {name: entity_status_type, type: string}
  - {name: fax, type: string}
  - {name: fax_transactions, type: boolean}
  - {name: give_access, type: boolean}
  - {name: is_budget_approved, type: boolean}
  - {name: is_inactive, type: boolean}
  - {name: is_person, type: boolean}
  - {name: language, type: string}
  - {name: last_modified_date, type: timestamp, format: "%Y-%m-%d %H:%M:%S %z"}
  - {name: phone, type: string}
  - {name: print_transactions, type: boolean}
  - {name: sales_rep_external_id, type: string}
  - {name: sales_rep_internal_id, type: long}
  - {name: sales_rep_name, type: string}
  - {name: sales_rep_type, type: string}
  - {name: stage, type: string}
  - {name: subsidiary_external_id, type: string}
  - {name: subsidiary_internal_id, type: long}
  - {name: subsidiary_name, type: string}
  - {name: subsidiary_type, type: string}
  - {name: taxable, type: boolean}
  - {name: terms_external_id, type: string}
  - {name: terms_internal_id, type: long}
  - {name: terms_name, type: string}
  - {name: terms_type, type: string}
  - {name: unbilled_orders, type: string}
  - {name: url, type: string}
filters: []
out: {mode: replace}
exec: {}
```

You can preview how the system parses the file by using the preview command.

```bash
$ td connector:preview load.yml
```

If the system guesses a column name or type incorrectly, modify load.yml directly and preview again. Currently, the Data Connector supports parsing of "boolean", "long", "double", "string", "timestamp", and "json" types.

### Execute Load Job

Finally, submit the load job. It might take a couple of hours depending on the data size. You must specify the database and table where your data will be stored.

```bash
$ td connector:issue load.yml \
    --database td_sample_db \
    --table td_sample_table
```

The above command assumes you have already created the *database (td_sample_db)* and *table (td_sample_table)*. If the database or the table does not exist in Treasure Data, the command will not succeed, so create the database and table [manually](https://docs.treasuredata.com/smart/project-product-documentation/data-management) (a CLI sketch follows at the end of this section) or use the --auto-create-table option with the td connector:issue command to create them automatically:

```bash
$ td connector:issue load.yml \
    --database td_sample_db \
    --table td_sample_table \
    --time-column created_at \
    --auto-create-table
```

You can assign a time-format column as the partitioning key with the --time-column option.
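If you prefer to create the database and table yourself before issuing the job, the TD Toolbelt can do it directly. A minimal sketch, assuming your API key is allowed to create resources (the names match the example above):

```bash
# Create the destination database and table up front,
# then issue the load job without --auto-create-table.
$ td db:create td_sample_db
$ td table:create td_sample_db td_sample_table
```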
### Scheduled Execution

You can schedule periodic Data Connector execution for periodic NetSuite imports. We take great care in distributing and operating our scheduler in order to achieve high availability. By using this feature, you no longer need a cron daemon in your local data center.

### Incremental Load

You can load data incrementally by setting incremental to true and specifying from_datetime, as shown in the following example:

```yaml
in:
  type: netsuite
  target: Customer
  email: YOUR_EMAIL_ADDRESS
  password: YOUR_PASSWORD
  account: YOUR_ACCOUNT_ID
  role: YOUR_ROLE_ID
  incremental: INCREMENTAL_OPTION   # optional, default: true
  from_datetime: DATE TO LOAD DATA  # optional; a date value, e.g. 2017-07-24 15:43:11 +0700, specifying the point from which data is loaded
  sandbox: false                    # set "sandbox: true" if you want to get sandbox account data
out:
  mode: replace
```

If you omit from_datetime, all available records are fetched. Only timestamp and timestamptz (timestamp with time zone) columns are supported.

### Create the Schedule

You can create a new schedule by using the td connector:create command. The name of the schedule, a cron-style schedule, the database and table where the data will be stored, and the Data Connector configuration file are required.

```bash
$ td connector:create \
    daily_netsuite_import \
    "10 0 * * *" \
    td_sample_db \
    td_sample_table \
    load.yml
```

The `cron` parameter also accepts three special options: `@hourly`, `@daily`, and `@monthly`.

By default, the schedule is set up in the UTC timezone. You can set the schedule in a different timezone using the -t or --timezone option. The `--timezone` option supports only extended timezone formats like 'Asia/Tokyo', 'America/Los_Angeles', etc. Timezone abbreviations like PST and CST are *not* supported and might lead to unexpected schedules.

## Further Information

The following list provides details of all available options:

```
target:        Target object, e.g. Customer, Invoice, etc. It should accept the search operation. (string, required)
email:         Email address. (string, required)
password:      Password. (string, required)
account:       Account ID. (string, required)
role:          Role ID. (string, default: nil)
sandbox:       Use the sandbox environment if true. (bool, default: false)
from_datetime: Fetch data after this time. (string, default: nil)
```
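Once a schedule exists, you can also inspect it from the CLI. A brief sketch using the schedule name created earlier; the exact set of connector subcommands may vary with your Toolbelt version:

```bash
# List all scheduled connector sessions in the account
$ td connector:list

# Show the stored configuration of one schedule
$ td connector:show daily_netsuite_import

# Review the history of past runs for the schedule
$ td connector:history daily_netsuite_import
```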