Blog Series: Unlocking Cloud Savings: Optimizing Storage Costs by Disabling Soft Delete Policy (Part 2)

Kirat pal Singh Lamba · Published in Quinbay · 4 min read

Optimizing Storage Costs by Disabling Soft Delete Policy

Introduction

Google Cloud Storage’s soft delete policy allows for data recovery within a specified retention period, but it can also lead to unexpected storage costs. In this blog post, we will discuss how disabling the soft delete policy on buckets can lead to significant cost savings. We’ll also share our custom automation script that ensures the soft delete policy is disabled for all new and existing buckets.

Understanding Soft Delete Policy

Soft Delete Policy

A soft delete policy enables data recovery by retaining deleted objects for a specified retention period (seven days by default) before they are permanently removed. While this is valuable for recovering from accidental deletions, the retained objects continue to incur storage charges, which can add up quickly for large volumes of data.

Impact on Costs

Retaining deleted objects can significantly increase storage costs, particularly for buckets with frequent data turnover, since every deleted or overwritten object keeps accruing charges for the full retention period. Disabling the policy eliminates these charges for data you never intend to recover.
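Before disabling anything, it helps to see whether a bucket currently has a soft delete policy and how long it retains deleted objects. A minimal check, assuming a hypothetical bucket named my-bucket:

# Show the soft delete policy (if any) for a single bucket; the bucket name is illustrative
gcloud storage buckets describe gs://my-bucket --format="default(soft_delete_policy)"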

The Importance of Disabling Soft Delete Policy

Cost Efficiency

  • Reduced Storage Costs: By disabling the soft delete policy, you can avoid paying for the storage of deleted objects, leading to substantial cost savings.
  • Efficient Resource Utilization: Resources can be better utilized for active data rather than retaining deleted objects.

Operational Simplicity

  • Simplified Management: Managing buckets without the soft delete policy simplifies storage management and reduces administrative overhead.
  • Consistency: Ensuring that all buckets adhere to the same policy helps maintain consistency across the storage environment (a quick audit command is sketched after this list).
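A quick way to audit that consistency is to list every bucket in the current project alongside its retention setting, a sketch that reads the same soft_delete_policy field the script below checks:

# Tabulate bucket names with their soft delete retention; the column is empty when no policy is set
gcloud storage buckets list --format="table(name, soft_delete_policy.retentionDurationSeconds)"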

Automating the Disablement of Soft Delete Policy

We developed an in-house script to automatically remove the soft delete policy from every bucket across all of our GCP projects. The script iterates through each project and bucket, checks for an existing soft delete policy, clears it if found, and logs each change. Together with disabling the policy on newly created buckets, this ensured the soft delete policy was off everywhere before billing for soft-deleted storage took effect.

Automation Script for Disabling Soft Delete Policy

Here’s the script we created to automate the disabling of the soft delete policy:

#!/bin/bash

# Fetch all project IDs visible to the active gcloud account
PROJECT_IDS=$(gcloud projects list --format="value(projectId)")

for PROJECT_ID in $PROJECT_IDS; do
  # Set the current project
  gcloud config set project "$PROJECT_ID"

  # List all buckets in the current project
  BUCKETS=$(gcloud storage buckets list --format="value(name)")

  for BUCKET in $BUCKETS; do
    # Read the bucket's soft delete retention duration from its metadata
    SOFT_DELETE_POLICY=$(gcloud storage buckets describe "gs://$BUCKET" --format="json" | jq -r '.soft_delete_policy.retentionDurationSeconds')

    # jq -r prints "null" when the field is absent, so test for that as well as an empty value
    if [[ -n "$SOFT_DELETE_POLICY" && "$SOFT_DELETE_POLICY" != "null" && "$SOFT_DELETE_POLICY" != "0" ]]; then
      # Clear the policy so deleted objects are no longer retained (or billed)
      gcloud storage buckets update "gs://$BUCKET" --clear-soft-delete
      echo "Soft delete policy removed for bucket $BUCKET in project $PROJECT_ID"
    else
      echo "Bucket $BUCKET in project $PROJECT_ID does not have a soft delete policy"
    fi
  done
done

echo "Soft delete policies updated for all buckets in all projects."

Benefits of Our In-House Solution

Proactive Cost Management

  • Preemptive Action: By building this solution before billing for soft delete started, we headed off the cost increase rather than reacting to it.
  • Customizability: The script can be tailored to meet specific needs, offering flexibility beyond the standard GCP offerings.

Operational Efficiency

  • Automation: Automated disabling of the soft delete policy reduces manual effort and ensures consistency across all buckets.
  • Scalability: The script handles large numbers of buckets, making it suitable for extensive storage environments; for very large fleets the per-bucket work can also be parallelized, as sketched below.
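With thousands of buckets, the sequential loop can take a while. One possible optimization is to fan the per-bucket update out with xargs; the parallelism level of 8 is an arbitrary example, and this variant clears unconditionally, which is harmless because clearing an already-disabled policy changes nothing:

# Clear the soft delete policy on up to 8 buckets concurrently in the current project
gcloud storage buckets list --format="value(name)" \
  | xargs -P 8 -I {} gcloud storage buckets update gs://{} --clear-soft-delete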

Deadline-Driven Action and High Bucket Volume

We had to act swiftly: our infrastructure contains a large number of buckets, and the date on which soft delete billing would begin left us a tight window. Updating that many buckets by hand was impractical, which made the automation script indispensable for rolling out the change efficiently. These proactive measures kept us ahead of the pricing change and preserved cost-effective storage management practices.

By focusing on these critical aspects, we continue to refine our approach to cost optimization, ensuring that our infrastructure remains efficient and cost-effective.

Conclusion

Disabling the soft delete policy led to significant cost savings and operational efficiency. Our in-house automation script managed our extensive bucket infrastructure effectively, ensuring consistency and simplifying storage management. This approach addressed the potential cost increase before it materialized and optimized resource utilization.

Our solution highlights the importance of customizability and scalability, demonstrating that tailored scripts can efficiently manage large-scale storage environments. This reduces administrative overhead and ensures effective cost management.

Stay tuned for part 3 of this series, where we’ll explore further cost optimization strategies and share more insights on managing storage expenses efficiently.
