# Setting Workflow Secrets From the Command Line

Workflows often need to use specific TD API keys and PostgreSQL credentials. Because TD API keys, database credentials, and other objects can be used to access your potentially sensitive business data, it is important to treat them securely. TD Workflows offers a secure secret management system that you can use to manage credentials separately from normal workflow parameters.

See also [Secrets Best Practices](/products/customer-data-platform/data-workbench/workflows/about-workflow-secret-management).

For a complete example, review the [TD Workflows SFTP Data Connector Example](/products/customer-data-platform/data-workbench/workflows/workflows-sftp-data-connector-example).

## Limitations

* For security reasons, it is not possible to download secrets that have been uploaded to the TD Workflows service.
* Secrets are mainly used in `td_load` to expand credentials into the load configuration YAML. They are not available as SQL variables, operator parameters, and so on.

## Default Workflow Permissions

When a workflow is pushed to TD Workflows, it automatically uses the permissions of the user who pushed it. To run your workflow with another user's permissions, upload that user's API key to the TD Workflows service using the `td wf secrets` command. The following sections describe how to configure a workflow to use a specific TD API key.

## Setting, Changing, and Reverting an API Key

Upload the API key to use by running the following command, entering your API key when prompted, and pressing Enter. The API key is not visible in the terminal.

```bash
td wf secrets \
  --project nasdaq_analysis \
  --set td.apikey
```

Now all workflows in this project use the uploaded API key. Try it out by starting the workflow.

```bash
td wf start nasdaq_analysis \
  nasdaq_analysis \
  --session now
```

The specified API key is not visible in the workflow logs, but the workflow should run successfully. If an invalid API key was specified, the run fails.

To change the API key, run the secrets set command again and specify another API key. To revert to using the default API key, delete the uploaded API key.

```bash
td wf secrets \
  --project nasdaq_analysis \
  --delete td.apikey
```

## Configuring a Workflow to Use Multiple API Keys

Sometimes you might want to configure tasks in a workflow to use different API keys to access different TD accounts. For the workflow to run successfully, you might need to create the database `workflow_temp` in the accounts of the two respective API keys, if it does not already exist.

Upload two different API keys.

```bash
td wf secrets --project nasdaq_analysis --set apikey1
td wf secrets --project nasdaq_analysis --set apikey2
```

Configure the two tasks in `nasdaq_analysis.dig` to use the two different API keys.

```yaml
timezone: UTC

schedule:
  daily>: 07:00:00

_export:
  td:
    database: workflow_temp

+task1:
  _secrets:
    td:
      apikey: apikey1
  td>: queries/daily_open.sql
  create_table: daily_open

+task2:
  _secrets:
    td:
      apikey: apikey2
  td>: queries/monthly_open.sql
  create_table: monthly_open
```

Start the workflow and check that it runs successfully.

```bash
td wf start nasdaq_analysis \
  nasdaq_analysis \
  --session now
```

## Securely Configuring PostgreSQL Credentials

This section assumes that you already have a workflow that uses the `pg>` operator.
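If you do not have one yet, such a workflow might look roughly like the following sketch. The host, database, and query file names are placeholder values, and the PostgreSQL user and password are intentionally not written in the file because they are supplied as secrets, as shown next.

```yaml
# Minimal sketch of a workflow task that uses the pg> operator.
# The host, database, and query file below are placeholder values;
# pg.user and pg.password are provided as secrets, not in this file.
_export:
  pg:
    host: postgres.example.com   # placeholder host
    database: sales              # placeholder database

+daily_report:
  pg>: queries/report.sql        # placeholder query file
```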
To securely configure the PostgreSQL user and password, upload them to TD Workflows using the `td wf secrets` command.

```bash
td wf secrets --project my_project --set pg.user
td wf secrets --project my_project --set pg.password
```

Make sure to remove `pg.user` and `pg.password` from the workflow file, if present. All workflows in this project use the specified user and password when executing `pg>` operators.

Several other operator configuration options can also be securely defined, including:

* `host`
* `port`
* `user`
* `password`
* `database`
* `ssl`
* `connect_timeout`
* `socket_timeout`
* `schema`

For more information about the different configuration options, see the [Digdag documentation](http://docs.digdag.io/).

To use different sets of PostgreSQL credentials within a workflow project, upload them with different names.

```bash
td wf secrets --project my_project --set db1.pg.user
td wf secrets --project my_project --set db1.pg.password
td wf secrets --project my_project --set db2.pg.user
td wf secrets --project my_project --set db2.pg.password
```

Then refer to the credentials using the `db1` and `db2` names, respectively. The names can be freely chosen.

```yaml
+task1:
  _secrets:
    pg: db1
  pg>: query1.sql

+task2:
  _secrets:
    pg: db2
  pg>: query2.sql
```

Now `task1` uses the `db1` credentials and `task2` uses the `db2` credentials.

## Uploading Secrets from a File

When managing several sets of credentials, it can be more convenient to upload them all at one time using a file. YAML and JSON formatted samples are below.

### YAML Format

1. Create a file, for example `credentials.yml`, with credential key-value pairs.

   ```yaml
   db1.pg.user: user1
   db1.pg.password: pw1
   db2.pg.user: user2
   db2.pg.password: pw2
   db3.pg.user: user3
   db3.pg.password: pw3
   ```

2. Upload the credentials to TD Workflows.

   ```bash
   td wf secrets --project my_project --set @credentials.yml
   ```

### JSON Format

1. Create a file, for example `credentials.json`, with credential key-value pairs.

   ```json
   {
     "db1.pg.user": "user1",
     "db1.pg.password": "pw1",
     "db2.pg.user": "user2",
     "db2.pg.password": "pw2",
     "db3.pg.user": "user3",
     "db3.pg.password": "pw3"
   }
   ```

2. Upload the credentials to TD Workflows.

   ```bash
   td wf secrets \
     --project my_project \
     --set @credentials.json
   ```

## Listing Secrets

To list the secrets that have been uploaded to a project, omit the `--set` option when running the secrets command.

```bash
td wf secrets --project my_project
```

## Local Secrets

To use secrets in local mode, use the `--local` option instead of `--project` when running the secrets command.

```bash
td wf secrets --local --set @credentials.yml
```

## Deleting Secrets

To delete credentials that have been uploaded to a project, use the `--delete` option and specify one or more secrets to delete.

```bash
td wf secrets \
  --project my_project \
  --delete db2.pg.user db2.pg.password
```
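To confirm that the secrets were removed, list the project's remaining secrets again, as described in Listing Secrets above.

```bash
td wf secrets --project my_project
```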