GCP Cost Optimization Best Practices

Emerson Bossi
5 min read · Aug 12, 2021

--

Google Cloud Platform gives you fantastic flexibility over the resources you use. The platform lets developers scale storage, compute, networking, and more, ensuring you always have the resources to meet any demand while letting you scale back down just as quickly. Still, many companies end up spending more than they need by missing some standard practices that can considerably reduce the cost of running their cloud.

So, whether you are using the cloud for archival file storage, working on a small project, or dealing with large workloads, these tips ensure you get the most out of your money while avoiding any unpleasant surprises at the end of the month.


Managing Storage Costs

Keeping track of storage costs can be pretty tricky when running your own servers. Google Cloud Platform improves billing transparency by breaking storage out as its own line item, which makes it easier to track storage spending and make the configuration changes that can lead to considerable savings at the end of the month.

Choosing the correct storage class is another factor that can significantly impact your storage bill. If your data is not frequently accessed, moving it to Nearline, Coldline, or even Archive storage will decrease costs while maintaining resiliency. Just keep in mind that these lower-cost classes are designed for infrequently accessed data: each comes with retrieval fees and a minimum storage duration, so frequent access or early deletion incurs extra charges.
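If you know your access patterns, you can automate these transitions with Object Lifecycle Management. Here is a minimal sketch using the google-cloud-storage Python client; the bucket name is hypothetical, default credentials are assumed, and the age thresholds should be tuned to your own access patterns:

```python
from google.cloud import storage

client = storage.Client()  # assumes default application credentials
bucket = client.get_bucket("my-archive-bucket")  # hypothetical bucket name

# Move objects to cheaper classes as they age; mind each class's
# minimum storage duration and retrieval fees before going lower.
bucket.add_lifecycle_set_storage_class_rule("NEARLINE", age=30)
bucket.add_lifecycle_set_storage_class_rule("COLDLINE", age=90)
bucket.add_lifecycle_set_storage_class_rule("ARCHIVE", age=365)
bucket.patch()  # push the updated lifecycle configuration to GCS
```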

Using Cost Management Tools

Using tools to understand the costs of a running cloud is essential to knowing and controlling your expenses. It’s even more critical to have tools that can act automatically based on parameters you define, avoiding wasted resources and the possibility of human error.

Tools like CloudHealth, Google Cloud’s operations suite, Apptio Cloudability, and CloudCheckr provide greater control over cloud resource costs. They support quotas, alerts, and rightsizing, and by automating those tasks they save money and developer time that can be better spent elsewhere in the project.
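As a first-party starting point, GCP’s own budget alerts can also be created programmatically. Below is a rough sketch using the google-cloud-billing-budgets client; the billing account ID, budget amount, and thresholds are placeholder assumptions, and the exact fields should be checked against the library’s documentation:

```python
from google.cloud.billing import budgets_v1
from google.type import money_pb2

BILLING_ACCOUNT = "billingAccounts/XXXXXX-XXXXXX-XXXXXX"  # placeholder ID

client = budgets_v1.BudgetServiceClient()
budget = budgets_v1.Budget(
    display_name="monthly-cap",
    amount=budgets_v1.BudgetAmount(
        specified_amount=money_pb2.Money(currency_code="USD", units=500)
    ),
    # Fire notifications at 50%, 90%, and 100% of budgeted spend.
    threshold_rules=[
        budgets_v1.ThresholdRule(threshold_percent=p) for p in (0.5, 0.9, 1.0)
    ],
)
client.create_budget(parent=BILLING_ACCOUNT, budget=budget)
```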

Removing Obsolete Snapshots

Snapshots might not be an expensive resource if you only have a few of them. However, once a company or team begins to stockpile them, they steadily drive up the cost of operations. To keep your expenses optimized, run scripts that identify old, obsolete snapshots and remove them. If there’s a possibility of using them in the future, move them to a cheaper storage option such as Coldline to lower storage costs.
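A minimal cleanup sketch using the google-cloud-compute Python client, assuming a hypothetical project ID and a 90-day retention window; in practice you would dry-run this (print only) before deleting anything:

```python
from datetime import datetime, timedelta, timezone
from google.cloud import compute_v1

PROJECT = "my-project"        # hypothetical project ID
MAX_AGE = timedelta(days=90)  # hypothetical retention window

client = compute_v1.SnapshotsClient()
now = datetime.now(timezone.utc)
for snapshot in client.list(project=PROJECT):
    # creation_timestamp is an RFC 3339 string with a UTC offset
    created = datetime.fromisoformat(snapshot.creation_timestamp)
    if now - created > MAX_AGE:
        print(f"Deleting {snapshot.name} (created {created:%Y-%m-%d})")
        client.delete(project=PROJECT, snapshot=snapshot.name)
```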

Scheduling Compute Resources Usage

One of the foundations of cost optimization is eliminating wasted resources, especially those no longer being actively used. That’s why Compute Engine offers instance schedules that start and stop VMs automatically. These can shut down any instances that aren’t needed in production, cutting off further charges once developers, testers, and QA finish their work.

The scheduler is highly flexible: you can set a one-off VM to start and stop as needed, or define recurring schedules. For instance, instances can run only during working hours or during scheduled QA or maintenance windows. For companies that aren’t shutting down VMs after use, this change alone can significantly reduce QA and testing costs.
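The built-in instance schedules cover the common cases, but the same effect can come from a small script run on a cron (for example, from Cloud Scheduler). Here is a sketch that stops running instances labeled env=dev; the project, zone, and label are hypothetical:

```python
from google.cloud import compute_v1

PROJECT = "my-project"   # hypothetical project ID
ZONE = "us-central1-a"   # hypothetical zone

instances = compute_v1.InstancesClient()
for instance in instances.list(project=PROJECT, zone=ZONE):
    # Only touch running dev machines; production stays untouched.
    if instance.labels.get("env") == "dev" and instance.status == "RUNNING":
        print(f"Stopping {instance.name}")
        instances.stop(project=PROJECT, zone=ZONE, instance=instance.name)
```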

Leveraging Preemptible VMs

Preemptible VMs come with a heap of benefits for short-lived tasks. These VMs run for at most 24 hours and cost up to 80% less than on-demand VMs. They also don’t suffer from market price variance: Google uses a fixed pricing model for preemptible capacity.

These machines will be much cheaper for projects that run short-lived workloads, sporadic testing, intensive but quick stress tests, and similar jobs. They can also be combined with Google Kubernetes Engine node pools for even greater savings, provided your workloads are fault-tolerant and stateless, since preemptible nodes can be reclaimed at any time.
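Here’s a rough sketch of creating a preemptible instance with the google-cloud-compute client; the project, zone, machine type, and image are placeholder assumptions. The key part is the scheduling block:

```python
from google.cloud import compute_v1

PROJECT, ZONE = "my-project", "us-central1-a"  # hypothetical values

instance = compute_v1.Instance(
    name="batch-worker-1",  # hypothetical instance name
    machine_type=f"zones/{ZONE}/machineTypes/e2-standard-2",
    # Preemptible VMs cannot auto-restart and must terminate on host
    # maintenance; Compute Engine may reclaim them at any time.
    scheduling=compute_v1.Scheduling(
        preemptible=True,
        automatic_restart=False,
        on_host_maintenance="TERMINATE",
    ),
    disks=[
        compute_v1.AttachedDisk(
            boot=True,
            auto_delete=True,
            initialize_params=compute_v1.AttachedDiskInitializeParams(
                source_image="projects/debian-cloud/global/images/family/debian-11"
            ),
        )
    ],
    network_interfaces=[
        compute_v1.NetworkInterface(network="global/networks/default")
    ],
)
compute_v1.InstancesClient().insert(
    project=PROJECT, zone=ZONE, instance_resource=instance
).result()  # block until the create operation finishes
```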

Optimizing BigQuery

BigQuery is at the center of data analytics in GCP. Optimizing it and enforcing limits will reduce costs significantly at the end of the month. Some of the best practices:

  1. Limiting query costs: With the maximum bytes billed setting, any query that would scan more than your limit fails before it runs, and you aren’t charged for the attempt (a sketch follows this list). The same limit can be enforced per user, making it easier to cap the resources your team consumes.
  2. Batch loading data vs. streaming inserts: Companies often choose streaming inserts, which make data available in seconds instead of hours. That benefit, while worthwhile for many operational tasks, isn’t always necessary. Unless your data genuinely needs to be consumed in real time, you can save by switching to batch loading, which is free of charge.
  3. On-demand vs. flat rate: Businesses and projects with a steady, high workload may find that the flexibility of on-demand pricing becomes expensive over time. It’s worth checking what on-demand is costing you and comparing it to flat-rate prices; you might find flat rate friendlier for your needs, with unlimited bytes processed at a fixed price.
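A minimal sketch of the per-query cap from item 1, using the google-cloud-bigquery client; the 10 GiB limit and the example query are illustrative:

```python
from google.api_core.exceptions import BadRequest
from google.cloud import bigquery

client = bigquery.Client()  # assumes default project and credentials
job_config = bigquery.QueryJobConfig(
    maximum_bytes_billed=10 * 1024**3  # hypothetical 10 GiB cap
)
query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
"""
try:
    rows = client.query(query, job_config=job_config).result()
except BadRequest as exc:
    # Queries estimated to scan more than the cap fail up front,
    # before any bytes are billed.
    print(f"Query rejected: {exc}")
```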

If you’re unsure which plan to pick, Google offers flex slots with commitments as short as 60 seconds. These are perfect for measuring resource usage, and their noncommittal nature is ideal when you expect an upswing in usage during the year. Flex slots can also be purchased just to handle a period of high load and then dropped back to your usual plan.

Finding and Terminating Zombie Assets

Zombie assets appear when VMs have been stopped but resources such as IP addresses, load balancers, and disk storage keep running in the background. You may no longer be actively using those resources, but you will still be billed for them.

You can either run automated scripts when stopping a VM to ensure every asset related to it is correctly terminated, or use third-party tools to find unused assets and resources and prune them as needed.
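A starting point for the script approach, sketched with the google-cloud-compute client and hypothetical project, zone, and region values; it only reports candidates, leaving deletion as a deliberate second step:

```python
from google.cloud import compute_v1

PROJECT = "my-project"   # hypothetical project ID
ZONE = "us-central1-a"   # hypothetical zone
REGION = "us-central1"   # hypothetical region

# Unattached persistent disks: no instance listed in `users`.
disks = compute_v1.DisksClient()
for disk in disks.list(project=PROJECT, zone=ZONE):
    if not disk.users:
        print(f"Unattached disk: {disk.name}")

# Reserved static IPs not in use: status RESERVED rather than IN_USE.
addresses = compute_v1.AddressesClient()
for address in addresses.list(project=PROJECT, region=REGION):
    if address.status == "RESERVED":
        print(f"Unused address: {address.name} ({address.address})")
```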

Conclusion

Moving your business to the cloud will often be cheaper than running dedicated servers. Yet understanding the many nuances of resource and asset costs will significantly affect how much you are billed at the end of the month. Following these best practices, along with third-party cost optimization services and tools, will make for a cheaper, more efficient, and more reliable cloud environment.

--

Emerson Bossi

I’m a professional freelance content writer. Check out my portfolio for more info here: https://www.bossiwriter.com/