The following table describes commonly asked-about limits in Treasure Data.
| Property | Limit | Comment |
|---|---|---|
| TD Console user password | 8-128 characters | |
| Database name | 128 characters | If you exceed this limit, you will see a message similar to this: `Error: Name is too long must be between 3 and 128 characters long` |
| Table name | 128 characters | If you exceed this limit, you will see a message similar to this: `Error: Name is too long must be between 3 and 128 characters long` |
| Column name | 128 characters | If you exceed this limit, you will see a message similar to this: `Error: Validation failed: Schema SQL column alias 'column_name' must be same or shorter than 128 characters` |
| Segment name | 255 characters | |
| Number of databases | No Limit | |
| Number of tables | No Limit | |
| Number of users | No Limit | |
| Number of columns in a table | 4096 | This is a soft limit. |
| Number of imported columns (Streaming) | 512 | |
| Number of imported columns (Bulkimport & Dataconnector) | 4096 | If this limit is exceeded, the job fails. |
| Number of bulkimport sessions | 2,448 | |
| Schema size | 16,777,215 characters (assuming single-byte characters) | This character limit includes the name of the column, datatype and any SQL alias name in the table schema. |
| Result download size (via browser) | 50 MB | |
| Result download size (via TD CLI) | 16 GB | If you exceed this limit, you will see a message similar to this: `Error: TreasureData::API::IncompleteError:` |
| Table preview | Only a few records | Only a subset of imported records are shown in the table preview. To see full data, use either Presto or Hive to query the table. |
| Domain keys | 255 characters | Used for REST API Job request idempotency. |
| Treasure Workflow concurrent tasks | 30 maximum | All other tasks get queued and are issued on a first-come-first-served basis. See also Treasure Workflow Prerequisites and Limitations. |
| Number and size of attribute values that a Profile API can return to a segment | 5 attributes, within 10KB | The total bytes of attribute values returned by the Profile API must be less than 10KB. When you configure a Profile API, limit the attributes for which you want values returned to 5. If you exceed this limit, you will see a message similar to this: `Status Code: 400 {"error":"Bad Request","message":"Maximum size of a value is 10KB"}` |
| Segment IDs as an array returned by Profile API | 1KB | There is a limit to the number of segment IDs that the API can return. When you create a Profile API, limit the number of segments that you specify. |
| Data Connector’s input config (in: ) | 65,535 bytes | |
| Query result size for which pagination is disabled (via browser) | 10 MB | |
| Result exports (or queued activation jobs) | Variable | There is a limit to how many result exports, also known as activation jobs, can be processed at one time. If you exceed this limit, you will see a message similar to this: `Error: This account already has its maximum number of result export jobs queued or running (<your_limit> jobs).` Because Treasure Data may need to change this limit to manage server load, Treasure Data does not promise or advertise a specific limit. |
| Import limits for records.in endpoint | | |
| Workflow Server: data size limit (maximum response size) for HTTP operators | | |
| Length of query statement | | |
| Number of column annotations in a column | 5 | |
| Number of column annotations in a table | 20 | |
| Number of column annotations (sum of all databases) | 10,000 | |
| Number of tokens that you can specify in a single Profile API call | 5 | You can include up to 5 tokens when you call the Profile API or the JS SDK `fetchUserSegments` method. |
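To avoid the name-length errors shown in the table, you can check identifiers before creating objects. The following is an illustrative sketch, not part of any Treasure Data SDK; the `LIMITS` table mirrors the documented maximums, and the minimum of 1 for column and segment names is an assumption (the documented error messages state a 3-character minimum only for database and table names).

```python
# Illustrative helper (not a Treasure Data API): validate identifier
# lengths against the limits documented in the table above.
LIMITS = {
    "database": (3, 128),   # "must be between 3 and 128 characters long"
    "table": (3, 128),      # same documented range as database names
    "column": (1, 128),     # 128-character cap; minimum of 1 is assumed
    "segment": (1, 255),    # 255-character cap; minimum of 1 is assumed
}


def validate_name(kind: str, name: str) -> None:
    """Raise ValueError if `name` falls outside the documented length range."""
    lo, hi = LIMITS[kind]
    if not lo <= len(name) <= hi:
        raise ValueError(
            f"{kind} name must be between {lo} and {hi} characters long "
            f"(got {len(name)})"
        )


validate_name("database", "my_analytics_db")  # within 3-128, passes silently
```

Running this check client-side gives a clearer failure point than waiting for the server-side validation error.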