How to copy a BigQuery dataset from one region to another

Use the BigQuery Data Transfer Service

The usual way to copy BigQuery datasets and tables is to use bq cp:

bq cp source_project:source_dataset.source_table \
  dest_project:dest_dataset.dest_table

Unfortunately, the copy command doesn’t support cross-region copies, only copies within the same region.

To do a cross-region copy, you can use the BigQuery Data Transfer Service.

First, create the destination dataset:

bq mk --location eu ch10eu

Then, create a transfer config whose data source is cross_region_copy:

bq mk --transfer_config --data_source=cross_region_copy \
--params='{"source_dataset_id": "iowa_liquor_sales", "source_project_id": "bigquery-public-data"}' \
--target_dataset=ch10eu --display_name=liquor \
--schedule_end_time="$(date -v +1H -u +%Y-%m-%dT%H:%M:%SZ)"

Because the Data Transfer Service is meant for routine copies, this transfer will repeat every 24 hours by default. In my example, I set the end time to be one hour from now, so that the transfer happens only once. Note that the date -v +1H syntax is macOS/BSD-specific; on Linux (GNU coreutils), the equivalent is date -u -d '+1 hour' +%Y-%m-%dT%H:%M:%SZ.
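If you'd rather not worry about platform-specific date syntax, here is a small Python sketch that produces the same RFC 3339 end-time string. The function name schedule_end_time is my own; it is not part of the bq tooling — you would simply paste its output into the --schedule_end_time flag:

```python
from datetime import datetime, timedelta, timezone

def schedule_end_time(hours_from_now: int = 1) -> str:
    """Return an RFC 3339 UTC timestamp N hours from now,
    in the format expected by bq's --schedule_end_time flag."""
    end = datetime.now(timezone.utc) + timedelta(hours=hours_from_now)
    return end.strftime("%Y-%m-%dT%H:%M:%SZ")

print(schedule_end_time())  # e.g. 2023-04-01T13:45:00Z
```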

Please star this issue if you’d like bq cp to support cross-region copies.




A collection of technical articles and blogs published or curated by Google Cloud Developer Advocates. The views expressed are those of the authors and don't necessarily reflect those of Google.

Lak Lakshmanan

Operating Executive at a technology investment firm; articles are personal observations and not investment advice.
