Create Workflows of Saved Queries

In this tutorial, we will create a workflow that runs saved jobs on Treasure Data to import data, run a query, and then export the results to an external database or service.


Introductory Tutorial

If you haven’t already, start by going through the TD Workflows introductory tutorial.

This will make sure you are set up to run TD Workflows.


In this tutorial we are going to create a workflow project from scratch.

Create Workflow Directory

# this creates a directory called wf_of_saved_queries
$ mkdir wf_of_saved_queries
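The remaining commands assume you are working inside the project directory, so change into it after creating it:

```shell
# create the project directory (idempotent) and enter it
mkdir -p wf_of_saved_queries
cd wf_of_saved_queries
```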

Add the workflow definition file

Now we’ll add the workflow file itself. Replace <replace_with_saved_data_connector_job_name> with your data connector job's name and <replace_with_saved_query_name> with your query's saved name in Treasure Data.

$ cat > saved_queries.dig <<EOF
_export:
  td:
    database: workflow_temp

+load_data:
  td_load>: <replace_with_saved_data_connector_job_name>

+run_query:
  td_run>: <replace_with_saved_query_name>
EOF

To find a saved data connector job, you can issue the following command using the td-toolbelt CLI:

$ td connector:list

To find a saved query, go to the Queries section of the TD Console. Add a few saved queries to your workflow definition file; you can copy the names you want to use directly from the query page.
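As a sketch, a workflow with two saved queries might look like the following (the task names and query names here are placeholders, not queries that exist in your account):

```
+first_query:
  td_run>: daily_sales_rollup

+second_query:
  td_run>: weekly_active_users
```

Tasks run in the order they appear in the file, so the second query starts only after the first finishes.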

Run workflow

Now you can run your workflow!

$ td wf run saved_queries

Just like any other workflow, you can now add a schedule to this workflow and submit it to Treasure Data to run on a regular basis.
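For example, a schedule block like the following (the daily run time shown is just an example) can be added to the top of saved_queries.dig:

```
timezone: UTC

schedule:
  daily>: 07:00:00
```

After adding the schedule, push the project to Treasure Data with `td wf push wf_of_saved_queries` so it runs on a regular basis on Treasure Data's servers.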


If you have any feedback, we welcome your thoughts on our TD Workflows ideas forum.

Also, if you have any ideas or feedback on the tutorial itself, we’d welcome them!

Last modified: Feb 24 2017 09:41:25 UTC
