Strategy and the Cloud — Access to Data

Scottie Bryan
Hashmap, an NTT DATA Company
Jan 19, 2021 · 9 min read

Migrating to the cloud gives IT organizations a unique opportunity to eliminate years or even decades of technical debt. A cloud migration project also opens an opportunity for an IT department to remove many of the roadblocks and frustrations business units face in getting to the data they need to run their departments. Finding the intersections where IT technical debt meets business frustration opens up a series of win-wins between IT and the business that can dramatically increase buy-in from stakeholders while also improving IT operations.

A common area where technical debt meets business frustration is data access. I have seen organizations with decades of technical debt maintaining over 1,000 unique AD groups that the IT department had to manage and keep SOX-compliant. This was not just a headache for IT; it was also a nightmare for the business. Interns would show up in May and finally receive access to all of the underlying data for their summer project just a couple of weeks before heading back to college. Vice presidents, managers, and field personnel were all frustrated, and the cost to the company was real and measurable. Receiving and opening another department's Tableau report was like untying a Gordian knot: when one access request was granted, the team would open the report only to find that an underlying table required yet another request. This would repeat four or five times before a team could finally answer a vice president's or executive's question about a specific budget item or performance metric. Because of these roadblocks, retrieving the answer to a relatively simple question could take a month.

On top of this, it became apparent that too much security resulted in no security. With so much to manage, nothing was managed effectively. Managers were charged with granting access to AD groups without knowing what access they were actually granting. With so many groups, and with team members moving from department to department, it became difficult to track who should have access to what.

Another problem with managing data is that IT does not 'own' the data. If a department wants to hoard its data (ahem, accounting), the lockdown is often left to IT to implement and enforce. The rest of the organization then blames IT for not making that data available.

In a world where data is becoming the most valuable asset in an organization, rethinking how the organization interacts with data is critical for a successful data transformation or modernization project.

Reestablish Ownership of Data

The first step in improving data accessibility is making sure the business owns its data strategically.

DAMA (Data Management Association) recommends establishing data families to own data with a data family owner, a data family owner-delegate(s), and data family stewards. We’ll discuss the roles and strategies that I’ve seen work really well in a future post. For now, let’s focus on selecting data family owners to establish ownership of data.

The data family owner is the ultimate owner of the data. They usually lead the team(s) or department(s) that input the data and consume the data. I like to use the leadership level that is one or two steps below the C-Suite for data family owners for several reasons.

First, it is a relatively small pool of leaders, and a small pool is usually pretty easy to manage. Promotions and reorganizations happen less often at this level. When they do, they are company-wide events, which makes it easier to know when changes occur and to adjust the roles accordingly.

The smaller pool of leaders also allows for better discussions and enhances peer pressure. And it sits only one step away from the leaders handing out the mandates.

Picture this: the C-suite establishes KPIs for the year, and half of the KPIs require some level of accounting data to create. Accounting doesn't want to share its data, so it is locked down in the accounting system, or it sits in the data warehouse behind a higher level of security. In my experience, persuading a vice president is much easier than applying the same pressure to a director or manager, for several reasons. When an operations vice president is sitting across from the accounting vice president, they are generally only one or two leadership layers away from the executive who issued the KPI requirements. Knowing that one of their peers needs to talk to their boss often compels wayward leaders to get on board with the rest of the team. This works much better than pressure that has to percolate down several bureaucratic layers.

I also find this effective because vice presidents are multiple steps removed from implementing the work. The tactical roadblocks that might exist for a manager (understaffed, overwhelmed with projects, etc.) generally don’t exist at the strategic level. Their strategic mandate can force a tactical reprioritization that managers may not have the freedom to make.

Lastly, using higher leadership levels lets these leaders focus solely on data strategy without having to worry about the tactical day-to-day implementation and execution.

Establish ZERO Security Measures

It's a joke, don't do this. But it is a great thought exercise as you move into the cloud. Instead of starting with what you need to lock down, challenge yourself and your team to open everything up. I tried this with a publicly traded company struggling to access accounting data because it was segregated in its own ERP. The business was frustrated with the lack of data. It was being held accountable for its financial performance, but teams could not tell whether they were winning or losing until the end of the quarter. IT wasn't feeling the pressure, and accounting was unaware of the problem.

When we met with the stakeholders, we had already gone through the thought exercise internally, discussing the risks of opening accounting data up to anyone and everyone. We also knew what the business needed. The result was that we were able to convince accounting to open up the tables that would assist with managerial accounting (think back to your business classes) while keeping the tables that pertained only to financial accounting locked down. Business units could create internal profit and loss KPIs, but they could not access the corporate accounts that would enable someone to recreate the company's financial statements.

Another convincing point we were able to make to accounting on this project was that too much security translated into no security. Teams were cobbling together sensitive information as it became available. This created two issues:

  • First Issue: The data sat in Excel; there was zero security wrapped around it. Worse, there was zero visibility. It could be emailed to anyone inside or outside the company with very few measures in place to protect that information.
  • Second Issue: The data was tracked manually, which meant that no teams’ data agreed with anyone else’s. Accounting was having to defend the right answer in multiple meetings each month.

By right-sizing security, we improved access to data. The business could use data to lower costs and increase revenues, and interdepartmental friction was dramatically reduced.

The other result we were able to deliver through this thought exercise was an extremely streamlined approach to accessing data. We created three classifications: Internal, Sensitive, and Restricted.

Internal was accessible by any employee across the company. This included roughly 90% of the company’s data that was migrated into the cloud.

Sensitive data was accessible to anyone in the relevant role-based security group. Remember those 1,000 AD groups? We reduced them to five. It became much easier to manage, and much easier to ensure that the right people had the right access to data. Conversely, it also became much easier to ensure that the wrong people did not have access to the wrong data, which had been a challenge as team members moved from department to department.

Restricted data was protected with row-based security and was heavily locked down.
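The article describes the model in organizational terms rather than code, but the three classifications can be sketched as a simple access check. This is a hypothetical illustration only: the table names, group names, and the business-unit row filter are all invented, and a real implementation would live in the warehouse's own role and row-access-policy features rather than application code.

```python
# Hypothetical sketch of the three-tier access model described above.
# Classification names (Internal/Sensitive/Restricted) come from the
# article; tables, groups, and the row filter are invented.

INTERNAL, SENSITIVE, RESTRICTED = "internal", "sensitive", "restricted"

# Roughly 90% of tables are Internal; Sensitive tables map to one of a
# handful of role-based groups; Restricted tables add a row-level check.
TABLE_CLASSIFICATION = {
    "sales.orders": (INTERNAL, None),
    "hr.overtime": (SENSITIVE, "hr_analytics"),
    "finance.corporate_ledger": (RESTRICTED, "finance_close"),
}

def row_filter(user, row):
    # Hypothetical row-based rule: users only see rows belonging
    # to their own business unit.
    return row.get("business_unit") == user["business_unit"]

def can_read(user, table, row=None):
    classification, group = TABLE_CLASSIFICATION[table]
    if classification == INTERNAL:
        return True                        # any employee
    if group not in user["groups"]:
        return False                       # must hold the role group
    if classification == RESTRICTED and row is not None:
        return row_filter(user, row)       # row-based security on top
    return True

analyst = {"groups": {"hr_analytics"}, "business_unit": "west"}
print(can_read(analyst, "sales.orders"))              # True: Internal
print(can_read(analyst, "finance.corporate_ledger"))  # False: wrong group
```

The point of the sketch is how little there is to manage: one classification per table, five groups instead of 1,000, and row rules only where Restricted data demands them.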

Lastly, we left a lot of sensitive data in the source ERP. 95% of HR data was not needed for business purposes, so we left it in the system, and HR KPIs were generated directly from the SaaS tool HR used. When data (like overtime metrics) was needed for analytics, we would work with HR to heavily sanitize it so it could carry an Internal classification without exposing the company to any data-management risk.
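The sanitization step can be sketched as follows. The field names and figures here are invented for illustration; the article does not specify how the data was sanitized, so this simply shows one plausible approach: aggregate the overtime metric per department and drop everything person-level before the data leaves HR's hands.

```python
# Hypothetical sketch of sanitizing HR overtime data for Internal use.
# Raw rows carry employee identifiers; the sanitized output keeps only
# department-level totals, so nothing person-level is exposed.
from collections import defaultdict

raw_overtime = [  # invented sample rows
    {"employee_id": "E100", "department": "field_ops", "ot_hours": 12.5},
    {"employee_id": "E101", "department": "field_ops", "ot_hours": 7.0},
    {"employee_id": "E200", "department": "plant", "ot_hours": 3.0},
]

def sanitize(rows):
    totals = defaultdict(float)
    for row in rows:
        totals[row["department"]] += row["ot_hours"]  # no IDs retained
    return [{"department": d, "total_ot_hours": h}
            for d, h in sorted(totals.items())]

print(sanitize(raw_overtime))
# [{'department': 'field_ops', 'total_ot_hours': 19.5},
#  {'department': 'plant', 'total_ot_hours': 3.0}]
```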

Consider a Team for Tiebreaking

It is rare, but sometimes you come across a team that is territorial with its data. They feel too much ownership in the data and want to apply a high security level to everything. This is where an independent data council can assist. The data family owners agree to vest authority in the council, which is made up of data experts from across the company. I generally find that individual contributors who meet certain criteria work best.

The criteria for selecting these individuals include:

  • They should possess a significant amount of political capital across the organization. (This gives them authority even as an individual contributor.)
  • They should be knowledgeable about the data.
  • They should be knowledgeable about their area of business.
  • They should have a general knowledge of data across the organization.

Placing a few such individuals on a council led by an IT leader that votes on data access creates an impartial team that can help the organization balance data security against data accessibility. Vesting authority in the council before conflicts arise helps ensure that when an issue comes up with a data family owner or data family, the council is ready to provide an impartial ruling that works best for the broader organization.

Final Thoughts

A New Data Warehouse Equals New Opportunities.

Migrating data into the cloud doesn't just offer new tactical and strategic advantages from the technology stack. It also creates an opportunity for teams to rethink how the organization interacts with data as a whole. My final tip for this post is to challenge all of the assumptions that exist not just within your data warehouse but all around it.

Don’t hesitate to bring in an outside team, like Hashmap, that has this as a core competency. They can help not only stand up the warehouse (which is deceptively easy), but they can help you configure it in ways to anticipate the future while keeping costs low.

Hashmap’s Data & Cloud Migration and Modernization Workshop is an interactive, two-hour experience for you and your team to help understand how to accelerate desired outcomes, reduce risk, and enable modern data readiness. We’ll talk through options and make sure that everyone has a good understanding of what should be prioritized, typical project phases, and how to mitigate risk. Sign up today for our complimentary workshop.

Other Tools and Content You Might Like

Feel free to share on other channels, and be sure to keep up with all new content from Hashmap here. To listen in on a casual conversation about all things data engineering and the cloud, check out Hashmap's podcast Hashmap on Tap on Spotify, Apple, Google, and other popular streaming apps.

Scottie Bryan is a Delivery Manager with Hashmap, an NTT Data Company, providing Data, Cloud, IoT, and AI/ML solutions and consulting expertise across industries with a group of innovative technologists and domain experts accelerating high-value business outcomes for our customers. Connect with Scottie on LinkedIn.


With over twenty years in operations, I’m passionate about using my technical and operational knowledge to help teams extract value from their data.