Data exposure

Protect your gold

Published in Cloud Security

Do you know where the data lives in your organization? If you don’t, how do you know if it is exposed or not? If you think you know where it is, do you understand all the ways it can flow in and out of systems that might inadvertently expose it to the Internet or third parties?

In my last post in my online book, Cybersecurity for Executives, I wrote about how open network ports can expose your organization to unnecessary risk. The concept of blocking network ports to keep things inside or outside your network presumes you have control over your network perimeter and can track what goes in and out via controlling ports. Sometimes in this day and age of cloud services, this is easier said than done. This post will look at some of the changes in systems and architectures in recent years that make managing network access difficult. Some of it has to do with approved third-party access to systems that create paths for data exfiltration (a fancy way to say data extracted from your network in a subversive manner).

Click here to purchase a full copy of the ebook or paperback on Amazon: Cybersecurity for Executives in the Age of Cloud

The Center for Internet Security publishes a list of the top controls businesses should implement to maintain a secure environment. The list is derived from the most common causes of data breaches and the controls that stop them. The first two items on the list involve maintaining an inventory of hardware and software assets. However, I have often wondered: isn't it just as important to know where your data is, and where it can flow? When I give security presentations, I like to say, "Your data is your gold." Businesses derive value, in part, from the data they maintain. Having that data leak may decrease a company's competitive advantage, and along with it, the reason one company is worth more than another. Think about the impact of your competitors knowing your business plans, customers, and intellectual property, not to mention the potential $350 million price tag of the mega breaches I wrote about in the first post.

What are some of the recent causes of data exposure? Of course, I have to mention the rash of AWS S3 buckets left open to the Internet. An AWS S3 bucket is a place to store data. Technically it's called "object storage," but to a business executive, it's a way to store files so applications can access and manage them in a cloud environment. Azure and Google Cloud Platform have similar constructs. Because of how S3 was originally built, an organization could not keep S3 storage inside a private network. As I wrote in a separate blog post, new features now help lock down S3 buckets, but during the time I was on the Capital One cloud team, files had to traverse the Internet to travel between applications and the buckets. Many companies probably never thought about this, but Capital One put in feature requests for some of the controls that exist today that can keep those files off the Internet.
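One of those newer controls is S3 Block Public Access, and an audit script can flag any bucket where its four settings are not all enabled. The minimal sketch below only evaluates a configuration dictionary shaped like the response from S3's GetPublicAccessBlock API; the sample data is hypothetical, and a real audit would fetch the live configuration for each bucket.

```python
def public_access_fully_blocked(config: dict) -> bool:
    """Return True only if all four S3 Block Public Access settings are on.

    `config` mirrors the shape of the PublicAccessBlockConfiguration
    returned by the S3 GetPublicAccessBlock API.
    """
    required = (
        "BlockPublicAcls",
        "IgnorePublicAcls",
        "BlockPublicPolicy",
        "RestrictPublicBuckets",
    )
    return all(config.get(key) is True for key in required)

# Hypothetical bucket with one setting left off: flagged as not fully blocked.
print(public_access_fully_blocked({
    "BlockPublicAcls": True,
    "IgnorePublicAcls": True,
    "BlockPublicPolicy": False,
    "RestrictPublicBuckets": True,
}))  # prints False
```

A missing setting counts the same as a disabled one, so a bucket with no Block Public Access configuration at all is also flagged.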

People who don't understand networking, and are not thinking about it, won't consider these risks when they choose to use a particular service. Often it's a business owner focused on how something functions, or a developer who is just trying to get something to work. The network and security teams may not even understand how an S3 bucket works, or know that anyone is using one. Buckets get exposed because developers who are not adequately trained in networking end up managing Internet-facing access. They change the configuration to open up the bucket without fully understanding the implications of those actions. In other cases, the bucket may have been securely configured to start, but someone changed the configuration after the fact: an ops person in production responding to a ticket, or malware that got into the cloud environment.

Another common occurrence lately has to do with databases being exposed directly to the Internet. In the past, this was not as likely because a network and security team were involved in the design of most systems containing sensitive data. The network architecture and the server implementation put those critical database and data storage servers in a private network that required traversing additional network layers before reaching the data. Developers did not have a choice regarding which server would host the data or in which part of the network.

The recent shift is that developers are now often responsible for implementing cloud environments where networking is easy to change, and databases are sometimes set up with direct Internet access. Additionally, some database services from cloud providers only operate over the Internet and are exposed by virtue of how they work. This type of breach is especially surprising, given that it is an obvious flaw in system architecture and configuration, but it is happening much too often. Some of the most common culprits are MongoDB and Elasticsearch. However, attackers have also breached relational databases and cloud storage for the same reason.
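A basic check for this class of exposure is to scan firewall rules for database ports opened to the entire Internet. This is a sketch, not a full scanner: each rule mirrors the shape of an entry in an EC2 security group's IpPermissions list, the port-to-service table is illustrative, and the sample rule is hypothetical.

```python
# Common database ports and the services that listen on them (illustrative).
DB_PORTS = {27017: "MongoDB", 9200: "Elasticsearch", 3306: "MySQL", 5432: "PostgreSQL"}

def internet_exposed_db_rules(ingress_rules):
    """Return (port, service) pairs for rules opening a database port to 0.0.0.0/0.

    Each rule mirrors an entry in an EC2 security group's IpPermissions
    list; feed it real data from the DescribeSecurityGroups API.
    """
    findings = []
    for rule in ingress_rules:
        world_open = any(
            ip_range.get("CidrIp") == "0.0.0.0/0"
            for ip_range in rule.get("IpRanges", [])
        )
        if not world_open:
            continue
        low = rule.get("FromPort", 0)
        high = rule.get("ToPort", 65535)
        for port, service in DB_PORTS.items():
            if low <= port <= high:
                findings.append((port, service))
    return findings

# Hypothetical rule: MongoDB's default port open to the whole Internet.
rules = [{"IpProtocol": "tcp", "FromPort": 27017, "ToPort": 27017,
          "IpRanges": [{"CidrIp": "0.0.0.0/0"}]}]
print(internet_exposed_db_rules(rules))  # prints [(27017, 'MongoDB')]
```

The same idea applies to any provider's firewall construct; the point is that "database port reachable from 0.0.0.0/0" is a finding a script can surface automatically.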

Similar to S3 buckets, services like Dropbox, Box, and Google Drive facilitate storage, but often for non-technical users in a company. I was listening to a panel talk about trying to maintain governance when it comes to cloud services. A sales executive for a security vendor spoke up and said that he uses Dropbox, even though his company does not permit it, because it is the only way he can do business and get files to clients. He works for a security company! Organizations need to understand what people need to do their jobs, provide systems that allow what is required, and find a way to monitor for unauthorized file transfers. Employees also need to be trained on the repercussions of data exposure. Companies need to be aware that insiders have used these types of storage services to steal incredibly sensitive data from organizations.

Services that facilitate data flows and data transfers are particularly risky if they are not architected with proper visibility, so a company can monitor where its data flows between integrated systems and services. I've reviewed the way companies set up cloud applications as a Director of SaaS Engineering and Cloud Architect. I also examined system architecture related to data breaches and potential acquisitions as Director of Security Strategy and Research for a security vendor. Additionally, I've had the opportunity to review system architecture while working on cloud audits through my company, 2nd Sight Lab. Due to the way some cloud systems work, it may be challenging for IT and security departments to obtain visibility into data flows between systems. Some companies also do not consider the risk related to these data flows when they are just trying to make things work.

For example, one particular type of application allows companies to easily transfer data to and from other companies and cloud services by quickly setting up new connections through a user-friendly console with a few button clicks. The first problem with this type of system is that implementing governance is challenging when people are simply clicking buttons. One application I reviewed had no evidence of change control, and no one was monitoring the logs for suspicious activity. The team was also unaware of the steps to take if a data breach occurred. A high risk exists in scenarios like this of an inadvertent or deliberately malicious change diverting data to the wrong place.
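One way to restore a measure of change control over click-to-configure connections is to keep a reviewed baseline of each connection's settings in source control and periodically diff it against what is actually deployed. This is a minimal sketch under that assumption; the setting names in the sample are hypothetical.

```python
def config_drift(baseline: dict, deployed: dict) -> dict:
    """Return settings whose deployed value differs from the reviewed baseline.

    Both arguments are flat dictionaries of setting name to value. The
    baseline is assumed to live in source control and go through review.
    """
    return {
        key: {"baseline": baseline.get(key), "deployed": deployed.get(key)}
        for key in set(baseline) | set(deployed)
        if baseline.get(key) != deployed.get(key)
    }

# Hypothetical connection settings: the destination changed outside review.
baseline = {"destination": "sftp.partner.example.com", "encryption": "tls1.2"}
deployed = {"destination": "sftp.unknown.example.net", "encryption": "tls1.2"}
print(config_drift(baseline, deployed))
```

Any non-empty result means someone clicked a button that never went through review, which is exactly the change-control evidence the application I looked at was missing.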

Another problem in this scenario is that two data flows could exist: one that sends data into the service and another that sends data out to some other third party. This type of data flow would never traverse the IT and security monitoring systems the company has in place. Perhaps the IT and security teams don't even know these connections exist. No one would ever know the data exfiltration occurred. No data loss prevention system such as a CASB (Cloud Access Security Broker) would see it if the cloud provider is not configured to send those logs to a SIEM and does not integrate with any logging system via an API. In the case of an incident, the company would depend on the cloud provider to deliver logs for evidence, if those logs exist and are handled properly to maintain chain of custody (the handling required for evidence to stand up to scrutiny in a court case).
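If the provider can export transfer logs at all, even a simple allow-list check catches flows to destinations nobody approved. A minimal sketch: the "destination" field and the approved hosts below are assumptions for illustration, not any vendor's real log schema.

```python
# Hypothetical list of destinations that went through an approval process.
APPROVED_DESTINATIONS = {"partner-a.example.com", "sftp.partner-b.example.com"}

def unapproved_transfers(log_events):
    """Return transfer events whose destination is not on the approved list.

    Each event is a dictionary parsed from an exported transfer log; any
    destination outside the allow-list is treated as a potential finding.
    """
    return [
        event for event in log_events
        if event.get("destination") not in APPROVED_DESTINATIONS
    ]

events = [
    {"destination": "partner-a.example.com", "bytes": 1024},
    {"destination": "files.unknown.example.net", "bytes": 50_000_000},
]
print(unapproved_transfers(events))  # flags only the unknown destination
```

The check is crude, but it turns "no one would ever know" into a daily report someone can act on, provided the logs actually reach your SIEM.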

Another recent source of data breaches involves websites that blindly send customers to third-party domains or code when a webpage loads. When a website loads, it may include links to third-party domain names (for example, to fetch a font used on the site). Sometimes I open a webpage and find 20 or 30 different connections to domains other than the one I am trying to view! This seems excessive. News websites are some of the worst offenders in my experience.

When a customer opens your webpage, if your site connects to all these other third-party sites, data the customer downloads from your website or enters into it could be accessible to those third parties if a vulnerability or misconfiguration exists. The third-party website or code could also be serving up malware. The Magecart campaign has reportedly infected over 960 e-commerce stores by injecting malicious code. Note that the article suggests the problem has something to do with cloud providers, but more likely the attackers are simply running automated attacks against applications in those environments, rather than exploiting a problem with the cloud providers themselves. More details should emerge over time.

Have your developers download third-party code files and store them in your own source control system. Validate that the code does not contain software vulnerabilities, and serve it from your own domain name. Make sure your site does not expose customer data to external sources that serve up code, advertisements, and even images. Alternatively, proxy requests through your web server in a secure manner to the site serving the ads, and validate the content on the way back through. Serve all content from your own domain. One other way data is exposed is through misconfiguration of CORS (Cross-Origin Resource Sharing), which specifies which third-party websites can access data from your website.
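On the CORS point, the safest configuration is an explicit allow-list of your own origins rather than a wildcard. The sketch below shows only the decision itself, with hypothetical origin names; in practice, most web frameworks provide this as middleware rather than hand-written code.

```python
# Hypothetical origins your application legitimately serves.
ALLOWED_ORIGINS = {"https://www.example.com", "https://app.example.com"}

def cors_response_headers(request_origin):
    """Return CORS headers only for explicitly allowed origins.

    Omitting the Access-Control-Allow-Origin header entirely (the empty
    dict here) causes browsers to block cross-origin reads of the response.
    """
    if request_origin in ALLOWED_ORIGINS:
        # Echo the specific origin; never respond with a wildcard "*"
        # on endpoints that return customer data.
        return {"Access-Control-Allow-Origin": request_origin, "Vary": "Origin"}
    return {}

print(cors_response_headers("https://evil.example.net"))  # prints {}
```

The Vary: Origin header tells caches that the response differs per origin, so an allowed origin's response is not served to a disallowed one.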

As an executive, you don’t need to understand all the details of how these particular web technologies work, but you can ask how many different domains customers are exposed to when they open your web site or web application in their browser. Limit those connections to limit data exposure while customers are browsing your website. Additionally, make sure your development staff gets proper security training on top threats, how data breaches work, and potential vulnerabilities so they can design and implement more secure systems.
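One way to answer that question is to parse a page's HTML and count the domains it references beyond your own. A rough sketch using Python's standard library follows; the sample page and domain names are made up, and a real audit should also account for connections that scripts open after the page loads, which static parsing cannot see.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyDomainCounter(HTMLParser):
    """Collect external domains referenced by script, link, img, and iframe tags."""

    def __init__(self, site_domain):
        super().__init__()
        self.site_domain = site_domain
        self.external_domains = set()

    def handle_starttag(self, tag, attrs):
        if tag not in ("script", "link", "img", "iframe"):
            return
        for name, value in attrs:
            if name in ("src", "href") and value:
                domain = urlparse(value).netloc
                # Relative URLs have no netloc and stay on your own domain.
                if domain and domain != self.site_domain:
                    self.external_domains.add(domain)

# Hypothetical page mixing first-party and third-party resources.
page = """
<html><head>
<script src="https://cdn.example-analytics.net/track.js"></script>
<link href="https://fonts.exampleapis.com/css" rel="stylesheet">
</head><body>
<img src="/images/logo.png">
</body></html>
"""
counter = ThirdPartyDomainCounter("www.example.com")
counter.feed(page)
print(sorted(counter.external_domains))
# prints ['cdn.example-analytics.net', 'fonts.exampleapis.com']
```

Run against your own homepage HTML, the size of that set is a number an executive can track over time, and every entry on it is a domain your customers are exposed to.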

One other point of data exposure exists when your organization hires external vendors to maintain systems and data. A recent rash of attacks on MSPs, MSSPs, and organizations that develop systems for other companies has exposed data of the customers they support. A Wall Street Journal report explains how Chinese hackers breached U.S. Navy contractors. A company named Attunity exposed data belonging to Ford, TD Bank, and Netflix. I already mentioned the breach of Wipro, India's third-largest IT outsourcing company.

Think about how you monitor network access within your organization. If your IT or security team sees suspicious actions happening on your network, they will hopefully take appropriate action. If they see a connection by a vendor you have hired to assist with system maintenance or development, would they consider this activity suspicious? Likely the vendor is connecting over an encrypted channel from a trusted network. Your IT or security team has limited or no visibility into the vendor systems or networks. Companies need to be aware of how well their vendors maintain security within these external systems and networks. Organizations also need to be cognizant of what type of data is accessible to the consultants and vendors connecting to their network. Additionally, consider how the vendors are connecting and what can and cannot be monitored.

Companies need to govern how connections to data are established and monitor data transfers for potential loss or exfiltration. Reporting should show the types of data stored in systems, plus the intended and possible data flows between systems, to correctly evaluate the risk of loss, modification, or exposure. Organizations need to establish controls over who can set up new databases, networking, and data connections, and with what requirements, to help prevent data breaches. Employees, both technical and non-technical, also need to understand the risk of data exposure through adequate security training.

Teri Radichel

If you liked this story please clap and follow:

Medium: Teri Radichel or Email List: Teri Radichel
Twitter: @teriradichel or @2ndSightLab
Request services via LinkedIn: Teri Radichel or IANS Research

© 2nd Sight Lab 2020


Want to learn more about Cloud Security?

Check out: Cybersecurity for Executives in the Age of Cloud.

Cloud Penetration Testing and Security Assessments

Are your cloud accounts and applications secure? Hire 2nd Sight Lab for a penetration test or security assessment.

Cloud Security Training

Virtual training available for a minimum of 10 students at a single organization. Curriculum: 2nd Sight Lab Cloud Security Training

Have a Cybersecurity or Cloud Security Question?

Ask Teri Radichel by scheduling a call with IANS Research.


2020 Cybersecurity and Cloud Security Podcasts

Cybersecurity for Executives in the Age of Cloud with Teri Radichel

Teri Radichel on Bring Your Own Security Podcast

Understanding What Cloud Security Means with Teri Radichel on The Secure Developer Podcast

2020 Cybersecurity and Cloud Security Conference Presentations

RSA 2020 ~ Serverless Attack Vectors

AWS Women in Tech Day 2020

Serverless Days Hamburg

Prior Podcasts and Presentations

RSA 2018 ~ Red Team vs. Blue Team on AWS with Kolby Allen

AWS re:Invent 2018 ~ RedTeam vs. Blue Team on AWS with Kolby Allen

Microsoft Build 2019 ~ DIY Security Assessment with SheHacksPurple

AWS re:Invent and AWS re:Inforce 2019 ~ Are you ready for a Cloud Pentest?

Masters of Data ~ Sumo Logic Podcast

Azure for Auditors ~ Presented to Seattle ISACA and IIA

OWASP AppSec Day 2019 — Melbourne, Australia

ISACA Québec 2019 Conference Keynote ~ Quebec City, Canada (October 7–9)

Cloud Security and Cybersecurity Presentations

White Papers and Research Reports

Securing Serverless: What’s Different? What’s Not?

Create a Simple Fuzzer for Rest APIs

Improve Detection and Prevention of DOM XSS

Balancing Security and Innovation with Event-Driven Automation

Critical Controls that Could have Prevented the Target Breach

Packet Capture on AWS



