Getting Ready for Cloud Data Security Posture Management

5 min read · Aug 22, 2022


By Ravi Ithal, CTO and Cofounder at Normalyze

One of the biggest questions for cybersecurity is, “Where is our data?” You can’t begin to secure data until you know where it is — especially critical business, customer, or regulated data. As we’ve learned in this new era of agile, your data can be almost anywhere in the cloud. Getting better visibility is the first step to a process of securing cloud data called Data Security Posture Management.

The analyst and vendor communities describe various types of posture management. They all address two general questions: What are the issues, and how can we fix them? Data Security Posture Management (DSPM) is a new prescriptive approach for securing cloud data. At a high level, DSPM entails a three-step process:

  1. Discover where your data is
  2. Detect which data is at risk
  3. Remediate to secure data
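The three steps above can be sketched as a simple pipeline. This is a minimal illustration with hypothetical store records and findings, not a real DSPM engine, which would query cloud provider APIs at each step:

```python
# Minimal sketch of the DSPM cycle: discover -> detect -> remediate.
# Store records and risk conditions here are hypothetical stand-ins.

def discover(inventory):
    """Step 1: enumerate every known data store."""
    return list(inventory)

def detect(stores):
    """Step 2: flag stores whose posture puts data at risk."""
    return [s for s in stores if s.get("public") or s.get("unencrypted")]

def remediate(at_risk):
    """Step 3: produce a remediation action per finding."""
    actions = []
    for store in at_risk:
        if store.get("public"):
            actions.append((store["name"], "restrict public access"))
        if store.get("unencrypted"):
            actions.append((store["name"], "enable encryption at rest"))
    return actions

inventory = [
    {"name": "orders-db", "public": False, "unencrypted": False},
    {"name": "ml-scratch-bucket", "public": True, "unencrypted": True},
]
plan = remediate(detect(discover(inventory)))
```

In practice each step runs continuously rather than once, a point the article returns to below.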

Discover where your data is

Discovery of data location is a huge issue because of the nature of agile. In DevOps and model-driven organizations, there is a vastly larger and expanding amount of structured and unstructured data that could be located almost anywhere.

In legacy scenarios, all the data was stored on premises, which spawned the “Castle & Moat” network security model of restricting external access while allowing internally trusted users. Those were the easy days of security! Cloud has fragmented the legacy architecture by storing data at external locations operated by service providers and other entities. For security architects and practitioners, and those responsible for compliance, this titanic shift in data volumes and locations calls for a different approach to securing the data: hence Data Security Posture Management.

The DSPM approach acknowledges that agile architectures are far more complex because the cloud is not a monolithic place. For most enterprises, cloud encompasses many physical and virtual places: two or more cloud service providers such as Amazon, Microsoft, or Google; software-as-a-service providers; platform and infrastructure-as-a-service providers; data lake providers; business partners; and, of course, a myriad of hybrid clouds, servers, and endpoints within your own organization.

Data isn’t just moving to more places. The velocity of data creation is soaring with a modern explosion of microservices, growing frequency of changes, acceleration of access for modeling, and constant iterations of new code by DevOps. Some of the fallout for security includes shadow data stores and abandoned databases, which lure attackers like honey draws bees.
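One way to surface shadow data stores is to compare what automated discovery actually finds against the officially cataloged inventory. A minimal sketch, with hypothetical store names:

```python
# Sketch: flagging shadow data stores as the set difference between
# what discovery finds and what the official catalog records.
# All store names are hypothetical examples.

cataloged = {"orders-db", "customers-db", "analytics-lake"}
discovered = {"orders-db", "customers-db", "analytics-lake",
              "dev-snapshot-2019", "tmp-export-bucket"}

# Anything discovered but never cataloged is a shadow store and a
# candidate for review, protection, or decommissioning.
shadow_stores = sorted(discovered - cataloged)
```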

Locating your data is just the beginning. Classification analysis is needed to help your team understand the nature of the data and determine what level of protection and monitoring it requires — especially if the data is subject to compliance mandates.
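At its simplest, classification can be rule-based: match data samples against patterns that indicate sensitive content. The patterns below are deliberately simplified illustrations, not production-grade detectors:

```python
import re

# Sketch of rule-based data classification: map regex patterns to
# sensitivity labels. These patterns are simplified for illustration
# and would produce false positives/negatives in real scanning.
PATTERNS = {
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "CREDIT_CARD": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify(text):
    """Return the set of sensitive-data labels found in a text sample."""
    return {label for label, rx in PATTERNS.items() if rx.search(text)}

sample = "Contact jane@example.com, SSN 123-45-6789."
labels = classify(sample)
```

Real classifiers add validation (checksums, context keywords) and machine-learned detectors, but the label-per-pattern structure is the same.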

Detect which data is at risk

The second phase of DSPM is detecting which cloud-native data is at risk. A precursor is identifying all systems and related operations running in your organization’s cloud environment. Mapping all infrastructure helps determine every access path to your data and which paths may require permission changes or new controls for protection.

The issue of access rights is challenging because structured and unstructured data can be found in many types of receptacles. Examples are cloud native databases, block storage, and file storage services. For each of these, your team will need to spot access misconfigurations, inflated access privileges, dormant users, vulnerable applications, and exposed resources with access to sensitive data.

If your organization is coming up to speed on these issues, be aware that security teams must collaborate closely with data and engineering teams, because cloud application architectures, microservices, and data stores all evolve rapidly.

Access is not the only risk factor; so is the nature of the data. Your teams will need to classify and rank cloud data by importance and risk level. Is the data proprietary, regulated, or otherwise sensitive? Risk is a composite of vulnerability severity, the nature of the data, its access paths, and the condition of its resource configurations. Higher risk means remediation becomes Priority One!
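A composite risk score of this kind can be sketched as a weighted sum of the factors just named. The weights and 0–10 scales below are illustrative assumptions, not an industry standard:

```python
# Sketch: composite risk score from vulnerability severity, data
# sensitivity, access exposure, and misconfigurations.
# Weights and scales are illustrative assumptions.

def risk_score(vuln_severity, data_sensitivity, exposed_paths, misconfigs):
    """All inputs conceptually 0-10; higher output means remediate first."""
    exposure = min(exposed_paths, 10)   # cap the path count at 10
    config = min(misconfigs * 2, 10)    # each misconfiguration adds 2
    return round(0.35 * vuln_severity
                 + 0.35 * data_sensitivity
                 + 0.20 * exposure
                 + 0.10 * config, 2)

# A vulnerable, widely reachable store of regulated data should outrank
# a locked-down internal store of low-sensitivity data.
high = risk_score(vuln_severity=8, data_sensitivity=9,
                  exposed_paths=4, misconfigs=3)
low = risk_score(vuln_severity=3, data_sensitivity=2,
                 exposed_paths=1, misconfigs=0)
```

The point is the structure, not the specific weights: prioritization needs all four inputs, because a severe vulnerability on unimportant data may matter less than a moderate one on regulated data.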

Remediate to secure data

Securing the cloud data at risk entails remediating the associated vulnerabilities discovered during the Discovery and Detection phases of DSPM. In legacy scenarios, data security often focused on securing the classic perimeter. But since data has long since moved beyond this quaint antiquity, securing it requires addressing a different scope of issues. As mentioned, remediation will frequently need collaboration by a cross-discipline team. Depending on the scenario, the team may need expertise in network and infrastructure operations, cloud configuration management, identity management, databases, DevOps, and more.

Security of cloud data is usually governed with controls provided by a particular service provider. However, under the cloud’s shared-responsibility model, the enterprise subscriber also plays a critical role in addressing several issues, mostly related to configuration management:

  • Identify where workloads are running
  • Chart relationships between the data and cloud infrastructure and related business processes to discover exploitable paths
  • Verify user and administrator account privileges to find people with overprivileged access rights and roles
  • Inspect all public IP addresses related to your cloud accounts for potential hijacking
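The overprivilege check in the list above can be illustrated by scanning IAM-style policy documents (shown here as plain dicts with hypothetical contents) for wildcard grants:

```python
# Sketch: flag IAM-style policy statements that grant every action or
# every resource. Policy documents and Sids here are hypothetical.

def overprivileged(policy):
    """Return the Sids of Allow statements with wildcard grants."""
    findings = []
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        if "*" in stmt.get("Action", []) or "*" in stmt.get("Resource", []):
            findings.append(stmt.get("Sid", "unnamed"))
    return findings

policy = {
    "Statement": [
        {"Sid": "ReadOrders", "Effect": "Allow",
         "Action": ["s3:GetObject"],
         "Resource": ["arn:aws:s3:::orders/*"]},
        {"Sid": "AdminAll", "Effect": "Allow",
         "Action": ["*"], "Resource": ["*"]},
    ]
}
flagged = overprivileged(policy)
```

Real entitlement analysis must also expand partial wildcards (e.g. `s3:*`) and resolve role-assumption chains, which is why dedicated CIEM tooling exists.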

Since the major cloud service providers do not provide integrated, interoperable security and configuration controls for disparate clouds, it’s on your organization to ensure that security access controls are properly configured for multi-cloud environments.

While remediation is the last of a three-step process for DSPM, take note that all three steps are really part of a continuous cycle. When your enterprise relies on the cloud, it’s vital to always be on top of data security posture!

Getting Started

In reviewing how to prepare for DSPM, alert readers may have already reacted in exasperation: “There’s no way we could do all that with manual processes!” And this would be a correct response! The very essence of agile is spinning up (or down) virtual assets on demand to fulfill functions on the fly. DSPM takes a similar approach by applying automation to processes within each phase.

Automation is the ideal, and there are several automated solutions available now for various parts of the DSPM process. For example,

  • Catalog assets and attributes: Configuration Management Databases (CMDB)
  • Data classification: Data Loss Prevention (DLP) and some PrivacyOps solutions
  • Access management: Software-as-a-Service Security Posture Management (SSPM), Cloud Infrastructure Entitlement Management (CIEM), Database Access Monitoring and file analysis software
  • Risk and vulnerability management: Cloud Access Security Brokers (CASB) and Cloud-Native Application Protection Platforms (CNAPP)
  • Compliance: PrivacyOps solutions

The downside is that most of these solutions are siloed and cannot interoperate in the seamless, automated workflows of a fully integrated DSPM solution, especially in a complex multi-cloud environment.

Toward this end, a new genre of DSPM solutions is emerging to operationalize and automate security of cloud data in modern agile environments. If the issues described above are challenging your organization’s teams, this is a good time to explore DSPM solutions to get visibility and control of cloud data security.

Ravi Ithal is Co-founder and Chief Technology Officer at Normalyze, a data-first cloud security provider for the digital enterprise.



