# s3_copy>: Copy files in Amazon S3

The **s3_copy>** operator copies files within Amazon S3.


```yaml
+copy_file:
  s3_copy>: 
  source: source-bucket/source-key
  destination: destination-bucket/destination-key
```

## Secrets

If you are not sure how to set secrets, refer to [Managing Workflow Secret](/products/customer-data-platform/data-workbench/workflows/secret-management).

* **aws.s3.region, aws.region**
An optional explicit AWS Region in which to access S3. Default is us-east-1.
* **aws.s3.access_key_id, aws.access_key_id**
The AWS Access Key ID to use when accessing S3. When using `credential_provider: assume_role`, this is not required.
* **aws.s3.secret_access_key, aws.secret_access_key**
The AWS Secret Access Key to use when accessing S3. When using `credential_provider: assume_role`, this is not required.


## Options

* **source**: `SOURCE_BUCKET/SOURCE_KEY`
Path to the source file in Amazon S3 to copy from. Use either this parameter or the combination of `source_bucket` and `source_key`.
Examples:

```yaml
  source: source-bucket/my-data.gz
```


```yaml
  source: source-bucket/file/in/a/directory
```

* **source_bucket**: `SOURCE_BUCKET`
The S3 bucket where the source file is located. Can be used together with the `source_key` parameter.
* **source_key**: `SOURCE_KEY`
The S3 key of the source file. Can be used together with the `source_bucket` parameter.
* **destination**: `DESTINATION_BUCKET/DESTINATION_KEY`
Path to the destination file in Amazon S3 to copy to. Use either this parameter or the combination of `destination_bucket` and `destination_key`.
Examples:

```yaml
  destination: destination-bucket/my-data-copy.gz
```


```yaml
  destination: destination-bucket/file/in/another/directory
```

* **destination_bucket**: `DESTINATION_BUCKET`
The S3 bucket where the destination file will be created. Can be used together with the `destination_key` parameter.
* **destination_key**: `DESTINATION_KEY`
The S3 key of the destination file. Can be used together with the `destination_bucket` parameter.
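For example, the first example on this page could equivalently be written with the split bucket/key parameters (the bucket and key names below are placeholders):

```yaml
+copy_file_split_params:
  s3_copy>:
  source_bucket: source-bucket
  source_key: source-key
  destination_bucket: destination-bucket
  destination_key: destination-key
```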
* **recursive**: `BOOLEAN`
Copy all objects with the specified prefix recursively. Default is false.
Examples:

```yaml
+copy_directory:
  s3_copy>:
  source: source-bucket/my-directory/
  destination: destination-bucket/backup/
  recursive: true
```

* **objects_per_iteration**: `NUMBER`
Maximum number of objects to copy per iteration when using recursive mode. Must be between 1 and 1000. Default is 1000.
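For example, a recursive copy that processes objects in smaller batches might look like this (the value 500 and the paths are illustrative):

```yaml
+copy_large_directory:
  s3_copy>:
  source: source-bucket/my-directory/
  destination: destination-bucket/backup/
  recursive: true
  objects_per_iteration: 500
```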
* **region**: `REGION`
An optional explicit AWS Region in which to access S3. This may also be specified using the `aws.s3.region` secret. Default is us-east-1.
* **path_style_access**: `BOOLEAN`
An optional flag to control whether to use path-style or virtual hosted-style access when accessing S3. Default is false.
* **log_each_object**: `BOOLEAN`
Whether to log each object being copied. Default is true.
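For example, a copy against a bucket in another region, using path-style access and with per-object logging disabled, might look like this (the region and flag values are illustrative):

```yaml
+copy_file_custom_access:
  s3_copy>:
  source: source-bucket/source-key
  destination: destination-bucket/destination-key
  region: eu-west-1
  path_style_access: true
  log_each_object: false
```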
* **credential_provider**: `NAME`
The credential provider to use for AWS authentication. Supported values are `access_key` (default) and `assume_role`.
Examples:

```yaml
  +copy_file_with_assume_role:
    s3_copy>: 
    source: source-bucket/source-key
    destination: destination-bucket/destination-key
    credential_provider: assume_role
    assume_role_authentication_id: ${auth_id}
```

* **assume_role_authentication_id**: `NUMBER`
The authentication ID for assume role when using `credential_provider: assume_role`. This corresponds to the `Amazon S3 Import Integration v2` configuration.
See [Reusing the existing Authentication](/products/customer-data-platform/integration-hub/authentications/reusing-an-existing-authentication) for how to find the authentication ID.


## Notes

* When copying folders recursively, you cannot copy a folder into itself or into one of its subfolders. For example, you cannot copy `my-folder/` to `my-folder/backup/`.