Export your Google Cloud instances in one command

Fabio Ferrari
3 min read · May 3, 2019


Photo by Samuel Zeller on Unsplash

If you want to migrate your instances from Google Cloud Platform to another solution like VMware, Xen, or even AWS, or simply to download and archive instance disks off-platform, you must export the Google Cloud disks to a format compatible with the destination environment. Supported formats include:

  • VMDK (Virtual Machine Disk) describes containers for virtual hard disk drives to be used in virtual machines like VMware Workstation or VirtualBox.
  • VHDX (Hyper-V virtual hard disk) is a disk image file format used to create a virtual hard disk (VHD) within Windows Server 2012-based virtualization.
  • VPC (VirtualPC) VHD compatible image format starting with Windows 7 and Windows Server 2008 R2.
  • VDI (VirtualBox Disk Image) format is the VirtualBox-specific Disk Image.
  • QCOW2 (QEMU Copy On Write) is the disk image format used by QEMU, with a copy-on-write storage optimization strategy.
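As an aside, if you later need to convert between these formats locally, the `qemu-img` tool (shipped with QEMU, e.g. in the qemu-utils package) can do so; a minimal sketch, assuming you have a local disk.vmdk file:

```shell
# Convert a VMDK image to qcow2 (hypothetical local file names)
qemu-img convert -f vmdk -O qcow2 disk.vmdk disk.qcow2

# Inspect the resulting image
qemu-img info disk.qcow2
```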

I developed a bash script to accomplish all required jobs in one command:

  • Prompt you to select the disks to export
  • Create a new Google Cloud image from each selected source disk
  • Export the image to the chosen format
  • Store the exported image in Cloud Storage
  • Remove the temporary Google Cloud image

The script can export a single disk or all disks at once; a list of available disks is displayed at the beginning of execution.
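Under the hood, the steps above map to a handful of gcloud commands; a minimal sketch for a single disk (the disk, zone, and bucket names are placeholders, and the real script adds the interaction and error handling):

```shell
DISK=disk1-linux        # hypothetical source disk name
ZONE=europe-west1-b     # hypothetical zone
BUCKET=my-bucket-name   # hypothetical destination bucket

# 1. Create a temporary image from the source disk
gcloud compute images create "${DISK}-image" \
  --source-disk="$DISK" --source-disk-zone="$ZONE"

# 2. Export the image to the bucket in the desired format (runs via Cloud Build)
gcloud compute images export \
  --image="${DISK}-image" \
  --destination-uri="gs://${BUCKET}/${DISK}.vmdk" \
  --export-format=vmdk

# 3. Remove the temporary image
gcloud compute images delete "${DISK}-image" --quiet
```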

How to export disks

Before launching the script, create a new Cloud Storage bucket to store the images, and make sure you have set the right permissions on the bucket.
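For example, a bucket can be created with gsutil (the bucket name and location below are placeholders; bucket names must be globally unique):

```shell
# Create a regional bucket for the exported images
gsutil mb -l europe-west1 gs://my-bucket-name/

# Review the IAM policy to check who can write to the bucket
gsutil iam get gs://my-bucket-name/
```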

You can launch the script from a Bash console logged in with your Google Cloud account, or from Google Cloud Shell (recommended).

Clone the repository from GitHub:
https://github.com/fabio-particles/gce-disks-export

# Clone the repository and enter the project folder
$ git clone https://github.com/fabio-particles/gce-disks-export
$ cd gce-disks-export/
$ ./gce-disks-export.sh BUCKET_NAME [IMAGE_FORMAT]

# Without a format argument, vmdk is used as the default
$ ./gce-disks-export.sh my-bucket-name

# Export images in qcow2 format to the my-bucket-name bucket
$ ./gce-disks-export.sh my-bucket-name qcow2

Image export requires the Cloud Build API to be enabled; if you haven't already enabled it, the script will ask you to activate it:

The "cloudbuild.googleapis.com" service is not enabled for this
project. It is required for this operation.
Would you like to enable this service? (Y/n)? y
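If you prefer to enable the API up front instead of answering the prompt, you can do it yourself before running the script:

```shell
# Enable the Cloud Build API for the current project
gcloud services enable cloudbuild.googleapis.com
```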

You must also grant permissions to the Cloud Build service account; answer yes to have them set automatically:

The following IAM permissions are needed for this operation:
[roles/iam.serviceAccountTokenCreator
serviceAccount:347021062934@cloudbuild.gserviceaccount.com
roles/compute.admin
serviceAccount:347021062934@cloudbuild.gserviceaccount.com
roles/iam.serviceAccountUser
serviceAccount:347021062934@cloudbuild.gserviceaccount.com]

Would you like to add the permissions (Y/n)? y
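The same roles can also be granted manually with gcloud; a sketch, where PROJECT_ID is a placeholder and the numeric service-account address is the one from the log above (yours will differ):

```shell
PROJECT_ID=my-project-id   # hypothetical project ID
CB_SA="serviceAccount:347021062934@cloudbuild.gserviceaccount.com"

# Grant each role the export operation needs to the Cloud Build service account
for ROLE in roles/iam.serviceAccountTokenCreator roles/compute.admin roles/iam.serviceAccountUser; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="$CB_SA" --role="$ROLE"
done
```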

Next, choose whether to export a single disk or all of them together.
Type “0” to export all disks:

[0] All Disks
[1] disk1-linux
[2] disk2-windows
[3] disk3-data
Select disk number to export, 0 for all disks:

Check the logs during the export; if the exporter completes its jobs successfully, you will find the new disk images in the specified Cloud Storage bucket.
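You can verify the result from the command line (the bucket name is a placeholder):

```shell
# List the exported images and their sizes
gsutil ls -l gs://my-bucket-name/
```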

Google Cloud Storage

Cloud Storage is the Google service used here to store files such as the exported images.
When you create a new bucket you choose a storage class; each class has different properties.

Available Cloud Storage classes:

  • Multi-Regional
  • Regional
  • Nearline
  • Coldline
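The class is chosen at bucket creation time and can also be changed later for individual objects; a sketch using gsutil (bucket and object names are placeholders):

```shell
# Create a bucket with the Nearline storage class
gsutil mb -c nearline -l europe-west1 gs://my-archive-bucket/

# Move an already-exported image to Coldline to reduce storage costs
gsutil rewrite -s coldline gs://my-archive-bucket/disk1-linux.vmdk
```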

Multi-Regional Storage

  • > 99.99% typical monthly availability
  • Geo-redundant
  • Storing data that is frequently accessed
  • Data stored in dual-regional locations

Regional Storage

  • 99.99% typical monthly availability
  • Data stored in a narrow geographic region
  • Storing frequently accessed data in the same region

Nearline Storage

  • 99.95% and 99.9% typical monthly availability in multi-regional and regional locations, respectively
  • Very low cost per GB stored
  • Data you do not expect to access frequently
    (i.e., no more than once per month)

Coldline

  • 99.95% and 99.9% typical monthly availability in multi-regional and regional locations, respectively
  • Lowest cost per GB stored
  • Data retrieval costs
  • Higher per-operation costs
  • 90-day minimum storage duration
  • Data you expect to access infrequently
    (i.e., no more than once per year)

Pricing (updated 02/05/2019)

Pricing for 100 GB/month:
Multi-Regional: $2.60
Regional: $2.00
Nearline Multi-Regional: $1.00
Nearline Regional: $1.00
Coldline Multi-Regional: $0.70
Coldline Regional: $0.40

For up-to-date pricing, please check the official site: https://cloud.google.com/storage/pricing
