Google Cloud Platform Security Checklist: Part 6/7 — Data Security

Hassene BELGACEM
Google Cloud - Community
4 min read · Jul 9, 2023

Welcome to the sixth installment in our series exploring best practices for securing your Google Cloud Platform (GCP) environment. So far, we have delved into Identity and Access Management (IAM), Key Management System (KMS), Network Security, Compute Engine, and Google Kubernetes Engine (GKE). Each of these elements plays a crucial role in safeguarding your resources and services.

In this article, we shift our focus to Data Security. Given the wide array of services GCP provides, we will cover as many as possible and update this article as new Data Security recommendations emerge. We will focus on the most common services and on how to protect and secure data within Cloud SQL, Cloud Storage, BigQuery, and other storage and processing services. These GCP services provide robust and scalable solutions for data storage, management, and analysis, and are widely used in cloud-based infrastructure. However, their extensive capabilities come with security considerations that require a thoughtful approach to keep your data secure.

To facilitate understanding, our approach has two parts: first, we lay out general recommendations applicable to all data services, followed by service-specific guidelines where needed.

Let’s start with the main data security recommendations that should be implemented across all data services:

  • Principle of least privilege: Limiting the permissions of users to only what is necessary for their tasks reduces the risk of unauthorized access and data breaches. This concept should be applied when granting access to data services.
  • Review data storage locations: Knowing where your data is stored is essential for compliance with regulations, especially when dealing with data residency requirements. Periodic reviews help ensure data sovereignty is maintained.
  • Prevent public access to sensitive data: Storage services that contain sensitive or restricted data should be protected from public access. This is a vital measure to prevent data exposure and safeguard the integrity and privacy of the data.
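These blanket recommendations can be enforced centrally with organization policy constraints rather than bucket by bucket. The sketch below assumes a hypothetical organization ID (`123456789012`, replace with your own): the `gcp.resourceLocations` constraint pins where data-holding resources may be created, and `storage.publicAccessPrevention` blocks public Cloud Storage access org-wide.

```shell
# Sketch, assuming a hypothetical organization ID of 123456789012.

# Data residency: only allow resources to be created in EU locations.
gcloud resource-manager org-policies allow \
    constraints/gcp.resourceLocations in:eu-locations \
    --organization=123456789012

# Prevent public access to Cloud Storage across the whole organization.
gcloud resource-manager org-policies enable-enforce \
    constraints/storage.publicAccessPrevention \
    --organization=123456789012
```

Setting these at the organization (or folder) level means individual teams cannot accidentally opt out at the project level.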

Now, let’s elaborate a bit more on each storage service:

Cloud Storage

  • Limit default ACL permissions: By minimizing default access control list (ACL) permissions, you can decrease the probability of inadvertent data exposure. This provides a safeguard against unintentional public access to data.
  • Prevent public access to sensitive storage buckets: Ensure that sensitive data within storage buckets/objects is not exposed to the public. Appropriate access controls should be in place to protect this data.
  • Data lifecycle policy: Configuring a data lifecycle policy ensures data that is no longer necessary is automatically removed. This reduces the risk of data leakage and optimizes storage costs.
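The three Cloud Storage recommendations above can be applied with `gcloud storage buckets update`. This is a sketch with a hypothetical bucket name; the lifecycle rule deleting objects after 365 days is only an example retention period.

```shell
# Hypothetical bucket name; adjust to your environment.
BUCKET=gs://my-sensitive-bucket

# Enforce public access prevention and uniform bucket-level access
# (the latter disables per-object ACLs entirely).
gcloud storage buckets update "$BUCKET" \
    --public-access-prevention \
    --uniform-bucket-level-access

# Lifecycle policy: automatically delete objects older than 365 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 365}}
  ]
}
EOF
gcloud storage buckets update "$BUCKET" --lifecycle-file=lifecycle.json
```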

Cloud SQL

  • Private IP configuration: Databases with public IPs are more vulnerable to attacks. Restricting Cloud SQL to private IPs significantly reduces this risk.
  • Strong root user password: The root user has full control over the database. Therefore, you MUST use a strong password for the root user to prevent unauthorized access.
  • Enforce SSL: If you access your Cloud SQL instance using an IP, you MUST enforce SSL to secure the connection. Without this, any issue with your SSL/TLS configuration could make your connection fall back to being unencrypted.
  • IP range based access control: Limiting database access to trusted IP ranges reduces the attack surface and enhances database security.
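The Cloud SQL hardening steps above map to a few `gcloud sql` commands. The instance, project, network, and IP range below are hypothetical placeholders.

```shell
# Private IP only: remove the public IP and attach the instance to a VPC
# (requires private services access to be configured on that network).
gcloud sql instances patch my-instance \
    --no-assign-ip \
    --network=projects/my-project/global/networks/my-vpc

# Enforce SSL/TLS on all connections.
gcloud sql instances patch my-instance --require-ssl

# If a public IP is unavoidable, restrict access to trusted ranges.
gcloud sql instances patch my-instance \
    --authorized-networks=203.0.113.0/24

# Set a strong root password interactively (kept out of shell history).
gcloud sql users set-password root --host=% \
    --instance=my-instance --prompt-for-password
```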

BigQuery

  • No public access to datasets: BigQuery datasets that contain sensitive or restricted data should not be publicly accessible. Proper access controls should be enforced to protect this data.
  • Dataset-level access control: Rather than granting access at the project level, restrict it at the dataset level. This reduces the chance of unintended access and adheres to the principle of least privilege.
  • Data expiration: By setting BigQuery expiration settings, unneeded tables and partitions are automatically removed, reducing potential data exposure.
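With the `bq` CLI, expiration defaults are a one-liner, and dataset-level access is managed by editing the dataset's `access` entries. Project and dataset names below are hypothetical.

```shell
# Default table expiration of 90 days (value is in seconds) for a dataset;
# a default partition expiration can be set the same way.
bq update --default_table_expiration=7776000 my-project:my_dataset
bq update --default_partition_expiration=7776000 my-project:my_dataset

# Dataset-level access control: export the dataset definition, edit the
# "access" array (e.g. remove project-wide entries, add specific groups),
# then re-apply it.
bq show --format=prettyjson my-project:my_dataset > dataset.json
# edit dataset.json, then:
bq update --source dataset.json my-project:my_dataset
```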

Pub/Sub

  • Clear separation of roles: Ensure there’s a clear division of IAM roles between administrators, publishers, and subscribers. This enhances visibility and control over access to Pub/Sub topics and subscriptions.
  • VPC Service Controls: Use these to limit which services or devices can connect to the Pub/Sub API. This measure is part of a comprehensive Defense in Depth strategy and provides better control over your data’s security.
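Role separation in Pub/Sub comes down to granting `roles/pubsub.publisher` and `roles/pubsub.subscriber` to distinct identities on the specific topic or subscription, not at the project level. All names below (topic, subscription, service accounts, perimeter) are hypothetical.

```shell
# Publisher identity can only publish to this one topic.
gcloud pubsub topics add-iam-policy-binding my-topic \
    --member=serviceAccount:publisher@my-project.iam.gserviceaccount.com \
    --role=roles/pubsub.publisher

# Consumer identity can only pull from this one subscription.
gcloud pubsub subscriptions add-iam-policy-binding my-sub \
    --member=serviceAccount:consumer@my-project.iam.gserviceaccount.com \
    --role=roles/pubsub.subscriber

# VPC Service Controls: add the Pub/Sub API to an existing perimeter.
gcloud access-context-manager perimeters update my-perimeter \
    --add-restricted-services=pubsub.googleapis.com \
    --policy=my-access-policy-id
```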

Secret Manager

  • Align Secret Manager Replication Policy: Secret Manager offers different replication policies. You can use ‘User Managed’ to have more control over where your secrets’ data is stored, which is helpful in meeting specific compliance requirements.
  • Review Owners: The Owner role allows access to secrets, and thus, it’s crucial to review the list of identities with this role and secure their accounts appropriately (e.g., using multi-factor authentication with U2F FIDO keys).
  • Enable Data Access Audit Logs: Google Cloud services write audit logs to track user activity within your Google Cloud projects. Having these logs enabled in Secret Manager will help in keeping track of who accessed which secret, when, and where, enhancing visibility and control over your secrets.
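The replication policy is chosen at secret creation time, and Data Access audit logs are enabled through the project IAM policy's `auditConfigs`. A sketch with hypothetical secret, project, and region choices:

```shell
# User-managed replication pinned to specific regions (here, two EU regions)
# to meet data residency requirements.
gcloud secrets create my-secret \
    --replication-policy=user-managed \
    --locations=europe-west1,europe-west3

# Review who can access the secret.
gcloud secrets get-iam-policy my-secret

# Enable Data Access audit logs for Secret Manager: fetch the current
# project policy, append the auditConfigs entry below, then re-apply.
gcloud projects get-iam-policy my-project --format=yaml > policy.yaml
# Add to policy.yaml:
#   auditConfigs:
#   - service: secretmanager.googleapis.com
#     auditLogConfigs:
#     - logType: DATA_READ
#     - logType: DATA_WRITE
gcloud projects set-iam-policy my-project policy.yaml
```

Note that `set-iam-policy` replaces the whole policy, which is why the existing policy must be fetched and edited rather than written from scratch.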

Conclusion

These recommendations, when implemented effectively, can significantly enhance the security of your Google Cloud Platform data services, providing a safer and more reliable environment for your operations. Remember, data security is not a one-time action but a continuous process requiring regular reviews and updates as per evolving requirements and threats.

Hassene BELGACEM
Cloud Architect | Trainer. Here, I share my thoughts and experience on topics like cloud computing and cybersecurity. https://www.linkedin.com/in/hassene-belgacem