List of Maximum Upper Limits on Treasure Data

The following table describes some of the most commonly asked-about limits on Treasure Data.

Property | Limit | Comment
TD Console user password | 8-128 characters
Database name | 128 characters | If you exceed this limit, you will see an error similar to: "Name is too long must be between 3 and 128 characters long"
Table name | 128 characters | If you exceed this limit, you will see an error similar to: "Name is too long must be between 3 and 128 characters long"
Column name | 128 characters | If you exceed this limit, you will see an error similar to: "Validation failed: Schema SQL column alias 'column_name' must be same or shorter than 128 characters"
Segment name | 255 characters
Number of databases | No limit
Number of tables | No limit
Number of users | No limit
Number of columns in a table | 4,096 | This is a soft limit.
Number of imported columns (streaming import) | 512
Number of imported columns (Bulk Import and Data Connector) | 4,096 | If exceeded, the job fails (applies to both Bulk Import and Data Connector).
Number of Bulk Import sessions | 2,448
Schema size | 16,777,215 characters (assuming single-byte characters) | This character limit includes the column name, data type, and any SQL alias name in the table schema.
Result download size (via browser) | 50 MB
Result download size (via TD CLI) | 16 GB | If you exceed this limit, you will see an error similar to: "TreasureData::API::IncompleteError"
Table preview | Only a few records | Only a subset of imported records is shown in the table preview. To see the full data, query the table with Presto or Hive.
Domain keys | 255 characters | Used for REST API job request idempotency; see the domain-key sketch after this table.
Treasure Workflow concurrent tasks | 30 maximum | All other tasks are queued and issued on a first-come, first-served basis. See also Treasure Workflow Prerequisites and Limitations.
Number and size of attribute values that a Profile API can return to a segment | 5 attributes, within 10 KB | The total size of the attribute values returned by the Profile API must be less than 10 KB. When you configure a Profile API, limit the attributes for which you want values returned to 5. If you exceed this limit, you will see a response similar to: Status Code: 400, {"error":"Bad Request","message":"Maximum size of a value is 10KB"}. A client-side size check is sketched after this table.
Segment IDs as an array returned by the Profile API | 1 KB | There is a limit to the number of segment IDs that the API can return. When you create a Profile API, limit the number of segments that you specify.
Data Connector's input config (in:) | 65,535 bytes
Query result size for which pagination is disabled (via browser) | 10 MB
Result exports (or queued activation jobs) | Variable | There is a limit to how many result exports (also known as activation jobs) can be processed at one time. If you exceed this limit, you will see an error similar to: "This account already has its maximum number of result export jobs queued or running (<your_limit> jobs)." Because Treasure Data may need to change this limit to manage server load, Treasure Data does not promise or advertise a specific limit.
Import limits for the records.in endpoint | 1 to 500 events per request; maximum 1,000 KiB per event; maximum 5 MiB for all events in one request | A batching sketch that respects these limits follows this table.
Workflow server data size limit (maximum response size) for HTTP operators | http> operator: 1 MB; http_call> operator: 2 MB
Length of query statement | Presto: 1,000,000 characters; Hive: 2,097,152 characters
Number of column annotations in a column | 5
Number of column annotations in a table | 20
Number of column annotations (sum of all databases) | 10,000
Number of tokens that you can specify in a single Profile API call | 5 | You can include up to 5 tokens when you call the Profile API or the JS SDK fetchUserSegments method.
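
As a rough illustration of how a domain key can make a REST API job request idempotent, the hedged Python sketch below generates one UUID per logical job and reuses it across retries, so a retried submission can be recognized as the same job rather than starting a new one. The endpoint path, the domain_key parameter name, and the header format are assumptions for illustration only; check the Treasure Data REST API documentation for the exact request format.

```python
import uuid
import requests

API_BASE = "https://api.treasuredata.com"   # assumed endpoint; adjust for your region
API_KEY = "YOUR_TD_API_KEY"                 # placeholder credential

def issue_query_idempotently(database: str, query: str, domain_key: str) -> requests.Response:
    """Submit a query job, tagging it with a caller-supplied domain key.

    Reusing the same domain_key on a retry is what makes the request
    idempotent: the server can detect the duplicate instead of running the
    job twice. The path and parameter name below are assumptions, not the
    documented wire format.
    """
    return requests.post(
        f"{API_BASE}/v3/job/issue/presto/{database}",   # hypothetical path
        headers={"Authorization": f"TD1 {API_KEY}"},
        data={"query": query, "domain_key": domain_key},
        timeout=30,
    )

# One key per logical job; a UUID stays well under the 255-character limit.
key = str(uuid.uuid4())
resp = issue_query_idempotently("sample_datasets", "SELECT COUNT(1) FROM www_access", key)
if resp.status_code >= 500:
    # Retry with the SAME domain key so the duplicate can be deduplicated.
    resp = issue_query_idempotently("sample_datasets", "SELECT COUNT(1) FROM www_access", key)
```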
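
The 10 KB / 5-attribute ceiling on Profile API attribute values is easiest to respect by checking, before you configure the API, that the attributes you plan to expose stay within those bounds. The sketch below is a local, client-side estimate only, assuming the attribute values are JSON-serializable; the function and variable names are illustrative and not part of any Treasure Data SDK.

```python
import json

MAX_ATTRIBUTES = 5
MAX_PAYLOAD_BYTES = 10 * 1024  # 10 KB ceiling on returned attribute values

def check_profile_attributes(attributes: dict) -> None:
    """Raise if a planned Profile API attribute set would exceed the limits.

    This is a rough estimate of the serialized size, not an exact
    reproduction of how Treasure Data measures the response.
    """
    if len(attributes) > MAX_ATTRIBUTES:
        raise ValueError(f"Configure at most {MAX_ATTRIBUTES} attributes, got {len(attributes)}")
    payload = json.dumps(attributes, ensure_ascii=False).encode("utf-8")
    if len(payload) >= MAX_PAYLOAD_BYTES:
        raise ValueError(
            f"Attribute values serialize to {len(payload)} bytes; must stay under {MAX_PAYLOAD_BYTES}"
        )

# Example: a small attribute set that comfortably fits the limits.
check_profile_attributes({
    "first_name": "Ada",
    "loyalty_tier": "gold",
    "last_purchase_category": "books",
})
```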
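
The per-request limits on the records.in ingestion endpoint (1 to 500 events, at most 1,000 KiB per event, and at most 5 MiB per request) lend themselves to a simple client-side batching step. The sketch below shows only that batching logic; the final POST call, endpoint URL, headers, and payload encoding are deliberately omitted or assumed, and are not the documented wire format of the Treasure Data ingestion API.

```python
import json
from typing import Iterable, Iterator

MAX_EVENTS_PER_REQUEST = 500
MAX_EVENT_BYTES = 1000 * 1024          # 1,000 KiB per event
MAX_REQUEST_BYTES = 5 * 1024 * 1024    # 5 MiB for all events in one request

def batch_events(events: Iterable[dict]) -> Iterator[list[dict]]:
    """Group events into batches that respect the records.in limits.

    Events larger than the per-event ceiling are rejected outright,
    since no batch could legally contain them.
    """
    batch: list[dict] = []
    batch_bytes = 0
    for event in events:
        size = len(json.dumps(event).encode("utf-8"))
        if size > MAX_EVENT_BYTES:
            raise ValueError(f"Single event is {size} bytes; the per-event limit is {MAX_EVENT_BYTES}")
        if batch and (len(batch) == MAX_EVENTS_PER_REQUEST or batch_bytes + size > MAX_REQUEST_BYTES):
            yield batch
            batch, batch_bytes = [], 0
        batch.append(event)
        batch_bytes += size
    if batch:
        yield batch

# Example: 1,200 small events become three requests of at most 500 events each.
events = [{"user_id": i, "action": "click"} for i in range(1200)]
for chunk in batch_events(events):
    # POST each chunk to the ingestion endpoint here (URL and auth omitted;
    # see the Treasure Data ingestion API documentation for the real format).
    print(len(chunk))
```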