# Python Custom Scripting Example

Python scripts can be run from TD Workflow or Digdag using the Python operator `py>`. You can create your workflows for TD using the TD Console or from the command line. For the workflow to run the Python script, you must specify a Docker image. When the workflow task starts, a new Docker container is created based on the specified Docker image. Docker allows the Python script to execute in the container in an isolated environment.

Running this tutorial takes about 30 minutes and does not require prior experience with Python or Docker images.

## Prerequisites

* Make sure this feature is enabled for your TD account.
* Download and install the TD Toolbelt and the TD Toolbelt Workflow module. For more information, see [Using Treasure Workflow](/products/customer-data-platform/data-workbench/workflows/using-treasure-workflow-from-the-command-line-interface).
* Python 3.9. Your Python code must be compatible with this version.
* Basic knowledge of Treasure Workflow's syntax.

## 3rd-Party Python Libraries

The Python scripts in TD Workflows are managed and run by Treasure Data in isolated Docker containers. Treasure Data provides a number of base Docker images to run in the container. 3rd-party Python libraries can be installed from your Python script using the pip install command:

```python
import os
import sys

os.system(f"{sys.executable} -m pip install tensorflow")
import tensorflow
```

## For Docker images compatible with Python 3.9

To add more libraries from within your Python script, use:

```bash
pip install ${package_name}
```

## Python Examples

See the [examples](https://github.com/treasure-data/treasure-boxes/tree/master/integration-box/python) for basics such as:

* How to call functions
* How to pass parameters to functions
* How to use environment variables
* How to import functions

A minimal sketch of a function that a `py>` task could call appears at the end of this page.

## Reading and Writing Data from Treasure Data

The examples show how to read data in Treasure Data into a DataFrame, manipulate the data, and write it back to Treasure Data as a table.
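The following is a minimal sketch of that read-manipulate-write pattern, not part of the treasure-boxes examples themselves. It assumes the `pytd` client library is installed (for example with the pip pattern shown above), that a TD API key is available in the `TD_API_KEY` environment variable, and that the database and table names (`my_database`, `my_output_table`) exist; the `sample_datasets.nasdaq` table is the sample data shipped with TD accounts.

```python
import os

import pandas as pd
import pytd

# Connect to Treasure Data. The US-region endpoint is shown; the database
# name is a hypothetical destination database in your account.
client = pytd.Client(
    apikey=os.environ["TD_API_KEY"],
    endpoint="https://api.treasuredata.com/",
    database="my_database",
)

# Read data into a pandas DataFrame via a query against the sample dataset.
res = client.query("SELECT symbol, open, close FROM sample_datasets.nasdaq LIMIT 100")
df = pd.DataFrame(res["data"], columns=res["columns"])

# Manipulate the data.
df["change"] = df["close"] - df["open"]

# Write the DataFrame back to Treasure Data as a table.
client.load_table_from_dataframe(df, "my_output_table", if_exists="overwrite")
```

Inside a TD workflow, the API key is typically supplied as a workflow secret rather than hard-coded. To push the ready-made examples from the treasure-boxes repository to your TD environment: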
1. Copy or clone the entire [repository](https://github.com/treasure-data/treasure-boxes).
2. Navigate to the directory containing `simple.dig`: `treasure-boxes/integration-box/python`.
3. From the command line, type `ls` to verify that you are in the correct directory. You should see the following:

   `README.md other_scripts scripts simple.dig`

4. Push the simple examples to your TD environment by typing the following:

   ```bash
   td wf push simple-example
   ```

   This uploads the simple-example project, including the simple.dig workflow, to TD.

### To verify that the sample was added to TD:

1. Open TD Console.
2. Navigate to Workflows.
3. Search for simple.
4. Double-click the simple workflow to open the editor. For example:

![](/assets/image-20200820-235344.091faec0e06c7e47885e43024ed49778cb2ccafa1464457fb657079bc3959128.2e4f387b.png)

### To run the Workflow

1. Select New Run.

![](/assets/image-20200820-235401.aa2d370859b47b08aacc9ff5784e9728c79e5b3831fdd8efdbbc27940fa6a760.2e4f387b.png)

### Or to run the sample from the command line

1. Type `td wf start simple-example simple --session now`

![](/assets/image-20200820-235614.873918124685fbfa81341a014d04523f84bcbdcde0eafb4a994ef42b8ab1ce66.2e4f387b.png)

### To validate the workflow job run

1. From the TD Console, navigate to the workflow editor.
2. Select Run History.

![](/assets/image-20200820-235519.b23245d31276dfe1fdac3ffa272a75e83b0291a3ae946a07ed80c6ed02a8e73d.2e4f387b.png)

3. If there are multiple instances of the job, select one to open the job history.

From here you can view at what time the job ran, audit logs, and other helpful diagnostic information about the job.

![](/assets/image-20200820-235638.0079a62ba3bd15303cd64632e888881c86d89c54dbac095551a36010ab5f5062.2e4f387b.png)
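As a reference for the Python Examples section above, here is a minimal sketch of a script that a `py>` task could call. The module path, function name, parameter name (`msg`), and environment variable (`TD_USER`) are all hypothetical, and the exact way a workflow passes parameters may differ; the treasure-boxes examples are the authoritative reference.

```python
import os


def print_message(msg="hello"):
    """A function a `py>` task could reference, e.g. py>: scripts.example.print_message.

    Keyword arguments such as `msg` can be supplied as task parameters, and
    environment variables set on the task can be read with os.environ.
    All names here are illustrative.
    """
    user = os.environ.get("TD_USER", "unknown")
    print(f"{msg} from {user}")


if __name__ == "__main__":
    # Allows local testing outside a workflow.
    print_message("testing locally")
```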