# Importing Logs Using CSV, TSV, and LTSV Formats
You can use Treasure Agent (td-agent) to continuously import CSV, TSV, and LTSV formatted access logs into the cloud.
**td-agent handles log-rotation**. td-agent keeps a record of the last position of the log, ensuring that each line is read exactly once even if the td-agent process goes down. However, because the information is kept in a file, the "exactly once" guarantee breaks down if the file becomes corrupted.
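This position record is the file named by the `pos_file` parameter in the configuration shown later. As a rough illustration only (the exact layout is an internal detail of the in_tail plugin), each line pairs a tailed file path with a byte offset and inode:
```
/path/to/log/foo.csv	0000000000000352	0000000000a1b2c3
```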
## Prerequisites
td-agent is developed under the [Fluentd project](http://fluentd.org/) and extends Fluentd with custom plugins for Treasure Data.
## Installing td-agent
Install td-agent on your application servers. td-agent is a daemon program dedicated to the streaming upload of any kind of time-series data.
To set up td-agent, refer to the following articles; we provide deb/rpm packages for Linux systems.
| **If you have...** | **Refer to...** |
| --- | --- |
| macOS | [Installing td-agent on macOS](https://docs.fluentd.org/installation/install-by-dmg) |
| Ubuntu System | [Installing td-agent for Debian and Ubuntu](https://docs.fluentd.org/installation/install-by-deb) |
| RHEL / CentOS System | [Installing td-agent for Red Hat and CentOS](https://docs.fluentd.org/installation/install-by-rpm) |
| AWS Elastic Beanstalk | [Installing td-agent on AWS Elastic Beanstalk](https://github.com/treasure-data/elastic-beanstalk-td-agent) |
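For example, on an Ubuntu host the installation typically comes down to a one-line shell script from the article above. The script name below is illustrative; pick the one matching your distribution release and the td-agent version you want:
```
$ curl -L https://toolbelt.treasuredata.com/sh/install-ubuntu-focal-td-agent4.sh | sh
```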
## Modifying td-agent.conf
Specify your authentication key by setting the `apikey` option in `/etc/td-agent/td-agent.conf`. You can view your API key from the TD Console.
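If you already have the td command-line client installed and authenticated, you can also print the key from a terminal (optional; this assumes the td toolbelt is set up):
```
$ td apikey:show
```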
In the configuration below, replace *YOUR_API_KEY* with your API key string.
```conf
# Tailing the CSV formatted Logs
<source>
  type tail
  format csv
  path /path/to/log/foo.csv
  pos_file /var/log/td-agent/foo.pos
  tag td.production.foo
  keys key1,key2,key3
  time_key key3
</source>

# Tailing the TSV formatted Logs
<source>
  type tail
  format tsv
  path /path/to/log/bar.tsv
  pos_file /var/log/td-agent/bar.pos
  tag td.production.bar
  keys key1,key2,key3
  time_key key3
</source>

# Tailing the LTSV formatted Logs
<source>
  type tail
  format ltsv
  path /path/to/log/buz.ltsv
  pos_file /var/log/td-agent/buz.pos
  tag td.production.buz
  time_key time_field_name
</source>

# Treasure Data Input and Output
<match td.*.*>
  type tdlog
  endpoint api.treasuredata.com
  apikey YOUR_API_KEY
  auto_create_table
  buffer_type file
  buffer_path /var/log/td-agent/buffer/td
  use_ssl true
</match>
```
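For reference, here are purely illustrative log lines that the tail sources above would parse. `key1`, `key2`, `key3`, and `time_field_name` are the placeholder field names from the configuration; substitute your own columns and a time format td-agent can parse:
```
# foo.csv -- comma-separated values mapped to key1,key2,key3; key3 is used as the event time
value1,value2,2014-01-01 00:00:00
# buz.ltsv -- tab-separated label:value pairs; time_field_name carries the event time
status:200	path:/index.html	time_field_name:2014-01-01 00:00:00
```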
Restart your agent once the preceding lines are in place.
```
$ sudo /etc/init.d/td-agent restart
```
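On newer, systemd-based installations the equivalent restart is (assuming the package installed the usual td-agent service unit):
```
$ sudo systemctl restart td-agent
```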
td-agent tails the log files, buffers them (*/var/log/td-agent/buffer/td*), and automatically uploads the data into the cloud.
## Confirming Data Import
Sending a SIGUSR1 signal flushes td-agent’s buffer; upload starts immediately.
```
# append new records to the logs
$ ...
# flush the buffer
$ kill -USR1 `cat /var/run/td-agent/td-agent.pid`
```
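To watch the flush and upload happen, you can tail the agent's own log (the path below assumes the default deb/rpm layout):
```
$ tail -f /var/log/td-agent/td-agent.log
```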
To confirm that your data uploads successfully, issue the `td tables` command as follows.
```
$ td tables
+------------+------------+------+-----------+
| Database | Table | Type | Count |
+------------+------------+------+-----------+
| production | foo | log | 1 |
| production | bar | log | 3 |
| production | buz | log | 5 |
+------------+------------+------+-----------+
```
If the import is not working correctly, check `/var/log/td-agent/td-agent.log`. Also make sure the `td-agent` user and group (`td-agent:td-agent`) have permission to read the log files being tailed.
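If the counts look right, you can also inspect the imported data directly with an ad hoc query (the database and table names follow the `td.production.foo` tag used above):
```
$ td query -w -d production "SELECT COUNT(1) FROM foo"
```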
## Next Steps
We offer a schema mechanism that is more flexible than that of traditional RDBMSs. For queries, we leverage the Hive Query Language.
- [Schema Management](https://docs.treasuredata.com/articles/project-product-documentation/schema+Management)
- [Hive Query Language](https://docs.treasuredata.com/articles/project-product-documentation/hive-and-presto-query-engine-reference)
- [Programmatic Access with REST API and its Bindings](https://docs.treasuredata.com/articles/project-product-documentation/rest-apis-in-treasure-data)