Required permissions

Minimum IAM roles

To use table loading

  • bigquery.tables.get

  • bigquery.tables.getData

  • BigQuery Data Viewer

To use query loading

  • BigQuery Job User

To use "Import Large Dataset"

  • bigquery.tables.export

  • bigquery.tables.delete

  • storage.buckets.get

  • storage.objects.list

  • storage.objects.create

  • storage.objects.delete

  • storage.objects.get

  • BigQuery Data Editor

  • Storage Legacy Bucket Writer
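The permission lists above can be captured in a small preflight helper that reports which permissions a service account still lacks for a given loading mode. This is an illustrative sketch, not part of any Google Cloud API; the mode names and the `missing_permissions` helper are made up here, while the permission strings themselves come from the lists above.

```python
# Required IAM permissions per loading mode, as listed above.
# The dict and helper are illustrative only, not a Google Cloud API.
REQUIRED_PERMISSIONS = {
    "table_loading": {
        "bigquery.tables.get",
        "bigquery.tables.getData",
    },
    "import_large_dataset": {
        "bigquery.tables.export",
        "bigquery.tables.delete",
        "storage.buckets.get",
        "storage.objects.list",
        "storage.objects.create",
        "storage.objects.delete",
        "storage.objects.get",
    },
}

def missing_permissions(mode: str, granted: set) -> set:
    """Return the permissions still needed for `mode`."""
    return REQUIRED_PERMISSIONS[mode] - granted
```

For example, `missing_permissions("table_loading", {"bigquery.tables.get"})` reports that `bigquery.tables.getData` is still missing.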


When you load a large dataset (more than 500MB as a rough benchmark), we recommend that you use this "Import Large Dataset" option. This option exports the data as GCS (Google Cloud Storage) objects and then loads the data in multiple parallel tasks, so loading is faster.
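The export-to-GCS step relies on BigQuery's sharded export: BigQuery can write at most 1 GB to a single file, so exports of larger tables use a wildcard (`*`) in the destination URI, which BigQuery replaces with a shard number to produce multiple objects that can then be loaded in parallel. A minimal sketch of building such a URI from the bucket and path prefix settings (the `part-*.csv.gz` naming is an assumption for illustration, not the tool's documented file naming):

```python
def export_uri(bucket: str, prefix: str) -> str:
    """Build a wildcard GCS destination URI for a sharded BigQuery
    export. BigQuery replaces `*` with a shard number, producing
    multiple objects. The `part-*.csv.gz` pattern is illustrative."""
    return f"gs://{bucket}/{prefix.strip('/')}/part-*.csv.gz"
```

A URI like this could be passed as the destination of a BigQuery extract job (for example via the `google-cloud-bigquery` client's `Client.extract_table`), though the tool performs this step for you when the option is enabled.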

To enable this option, check "Import Large Dataset", then specify "Temp dataset", "Temp table", "GCS bucket", and "GCS path prefix". Note that the "Temp dataset" must be created manually in advance.
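Because the import fails if any of the four settings is missing (and the temp dataset must already exist), it can help to validate the configuration up front. A minimal sketch, assuming hypothetical field names that mirror the settings above:

```python
from dataclasses import dataclass

@dataclass
class LargeImportSettings:
    # Hypothetical field names mirroring the four settings above;
    # not the tool's actual configuration schema.
    temp_dataset: str
    temp_table: str
    gcs_bucket: str
    gcs_path_prefix: str

    def empty_fields(self) -> list:
        """Return the names of settings left empty; all four are required."""
        return [name for name, value in vars(self).items() if not value]
```

Checking whether the temp dataset actually exists would additionally require a call to the BigQuery API with the permissions listed earlier.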