
This TD export integration allows you to write job results from Treasure Data directly to Amazon S3.

This topic includes:

Table of Contents

What can you do with this Integration?

Features available in Amazon S3 v2 but not in Amazon S3 v1:

  • Server-side Encryption with Customer Master Key (CMK) stored in AWS Key Management Service
  • Support for Quote Policy for the output data format
  • Support for the Assume Role authentication method



  • Basic knowledge of Treasure Data, including the TD Toolbelt.

  • For AWS: an IAM user:

    • with s3:PutObject and s3:AbortMultipartUpload permissions.

    • with kms:Decrypt and kms:GenerateDataKey* permissions when selecting the sse-kms setting.
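
The permissions above can be granted with an IAM policy along these lines. This is a minimal sketch: the bucket name, account ID, and key ARN are placeholders you must replace with your own values, and you may want to scope the KMS statement to the specific key used for sse-kms.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:AbortMultipartUpload"],
      "Resource": "arn:aws:s3:::your_bucket_name/*"
    },
    {
      "Effect": "Allow",
      "Action": ["kms:Decrypt", "kms:GenerateDataKey*"],
      "Resource": "arn:aws:kms:us-east-1:123456789012:key/your-key-id"
    }
  ]
}
```

The second statement is only needed when you select the sse-kms setting.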


  1. Select Export Results.
  2. Select an existing authentication or create a new authentication for the external service to be used for output. Choose one of the following:
    • Use Existing Integration
    • Create a New Integration
  3. (Optional) Specify information for Export to Amazon S3.

Is user directory root?

If selected, the user directory is treated as the root directory (e.g., '/home/treasure-data' as '/').

Path prefix

The file path where the file will be stored.

Rename file after upload finishes

If selected, the result output renames the file on the remote SFTP server from ".xxx.tmp" to ".xxx" after all the data is transferred. Some MA tools try to import data when a file with a specific name exists on the SFTP server; the temp name option is useful for such cases.

Format

The format of the exported files:

  • csv (comma-separated)
  • tsv (tab-separated)

Compression

The compression format of the exported files:

  • None
  • GZ
  • bzip2

Header line?

Includes the header line, with column names, as the first line.

Delimiter

The delimiter character:

  • Default
  • , (comma)
  • Tab
  • |
Quote policy

The policy for a quote:

  • ALL
  • MINIMAL: Add the quote character only to fields that contain the delimiter, the quote character, or any of the characters in the line terminator.
  • NONE
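
As a rough analogy (not the exporter's actual implementation), the three policies behave like the quoting modes in Python's csv module:

```python
import csv
import io

# Illustration only: an assumed analogy between the export quote
# policies and Python's csv quoting constants.
def render(rows, quoting):
    buf = io.StringIO()
    writer = csv.writer(buf, quoting=quoting, escapechar='\\',
                        lineterminator='\n')
    writer.writerows(rows)
    return buf.getvalue()

rows = [["plain", "has,comma", 'has"quote']]

print(render(rows, csv.QUOTE_ALL))      # ALL: quote every field
print(render(rows, csv.QUOTE_MINIMAL))  # MINIMAL: quote only fields that need it
print(render(rows, csv.QUOTE_NONE))     # NONE: never quote; escape instead
```

With MINIMAL, `plain` stays unquoted while `has,comma` becomes `"has,comma"`; with NONE, the delimiter is escaped instead (`has\,comma`).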
Null string

How null values in the query result are displayed:

  • Default
  • empty string
  • \N
  • NULL
  • null
End-of-line character

The EOL (end-of-line) character:

  • CRLF
  • LF
  • CR
Temp filesize threshold

The maximum file size (in bytes) of a local temp file. When the temp file reaches the threshold, the file flushes to a remote file.

If you encounter the error `channel is broken`, reduce the value of this option to resolve the error.


  1. Define any additional Export Results details and review the integration parameters for your export integration. For example, your Export Results screen might be different, or you might not have additional details to fill out.
  2. Select Done.
  3. Run your query.
  4. Validate that your data moved to the destination you specified.


Parameters (with data type, whether required, and V1 support where known):

Server-side Encryption (String; supported in V1: yes, only sse-s3)

Supported values:

  • sse-s3: Server-side Encryption Mode

  • sse-kms: new SSE Mode

Server-side Encryption Algorithm (String)

Supported value:

  • AES256

KMS Key ID (String; supported in V1: no)

Symmetric AWS KMS Key ID. If no KMS Key ID is provided, the default KMS Key is created or used.

Bucket (String; required: yes; supported in V1: yes)

Provide the S3 bucket name (e.g., your_bucket_name).

Path (String; required: yes; supported in V1: yes)

Specify the S3 filename (object key), and include an extension (e.g., test.csv).

Format (String; supported in V1: yes)

Format of the exported file: csv, tsv, jsonl

Compression (String; supported in V1: yes)

The compression format of the exported files (e.g., None or gz)

Header (supported in V1: yes)

Include a header in the exported file.

Delimiter (supported in V1: yes)

Use to specify the delimiter character (e.g., , (comma))

String for NULL values (String; supported in V1: yes)

Placeholder to insert for null values (e.g., Empty String)

End-of-line character (String; supported in V1: yes)

Specify the EOL (End-Of-Line) representation (e.g., CRLF, LF)

Quote Policy (String; supported in V1: no)

Determines which fields to quote. Supported values:

  • ALL: Quote all fields
  • MINIMAL: Only quote fields that contain the delimiter, the quote character, or any of the characters in the line terminator.
  • NONE: Never quote fields. When the delimiter occurs in a field, escape it with the escape character.

Default value: MINIMAL

Quote character (Optional) (Char; supported in V1: yes)

The character used for quotes in the exported file (e.g., "). Applied only to fields that contain the delimiter, the quote character, or any of the characters in the line terminator. If the input is more than one character, the default value is used.

Escape character (Optional) (Char)

The escape character used in the exported file. If the input is more than one character, the default value is used.

Part Size (MB) (Optional) (Integer)

The part size in multipart upload.

Default 10, min 5, max 5000


Code Block
SELECT * FROM www_access

(Optional) Schedule


Query Export Jobs

You can use Scheduled Jobs with Result Export to periodically write the output result to a target destination that you specify.


Code Block
_export:
  td:
    database: ${td.database}

+td-result-into-s3:
  td>: export_s3v2_test.sql
  database: ${td.database}
  result_connection: s3v2_conn
  result_settings:
    bucket: my-bucket
    path: /path/to/target.csv
    sse_type: sse-s3
    format: csv
    compression: gz
    header: false
    delimiter: default
    null_value: empty
    newline: LF
    quote_policy: MINIMAL
    escape: '"'
    quote: '"'
    part_size: 20





Integration Using the CLI

To output the result of a single query to an S3 bucket, add the --result option to the td query command. After the job is finished, the results are written to your S3 bucket.
You can specify detailed settings for your S3 export via the --result parameter.
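
A sketch of what such an invocation can look like. The connection name, database, bucket, and path are placeholder values, and the exact JSON keys should mirror the parameters in the table above; check your own integration settings before running it.

```shell
# Hypothetical example: replace the placeholder values with your own.
td query \
  --result '{"type":"s3_v2","bucket":"my-bucket","path":"/path/to/target.csv","format":"csv","compression":"gz","header":false}' \
  -d my_database \
  -w 'SELECT * FROM www_access'
```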