Transforming your Product Team’s Analytics Prowess: Culture

Joriz De Guzman
Dec 21, 2019 · 10 min read

Transforming your Product Team’s Analytics Prowess is a synthesis of my experiences and lessons from evolving Cornerstone OnDemand’s Product analytics competency. Cornerstone is a 20-year-old public company with 2,000+ employees globally, currently valued at over $3.4B. This series might be of interest if you are looking to introduce, deepen, or evolve the practice of data-driven decision making within your Product team. It becomes increasingly relevant if you’re trying to do so in a large enterprise with… hmm… an “established” culture.

The series is organized into key building blocks necessary to transform your organization’s analytics prowess. Namely,

  • Data Culture within your Product team
  • Data sources, availability, and access
  • Analytics acumen: Asking the right questions
  • Analytics acumen: Analyzing and interpreting data
  • Data governance
  • Sharing knowledge with the broader organization

This article will focus on what I believe is the hardest and most fundamental building block: Data Culture. As I mention later, changing culture is really about changing attitudes and behaviors. The human factors require the most personal investment and can make or break the evolution you’re looking to inspire.

The structure is fairly straightforward. First, I’ll offer a way to assess your Product Team’s (or, broadly, your organization’s) data culture. Next, we’ll get into a simple model to facilitate a cultural shift. Finally, we’ll close out with some reflection on how I applied these models within my first two months at Cornerstone. Clearly there’s a lot to cover, so let’s get started.

But first, some context…

Lone Growth Hacking: Expectations vs. Reality

In September 2016, I joined Cornerstone OnDemand as the first Growth Hacker (or as our CEO calls me, “the original Growth Hacker”). The mission for the role was a broad one: increase usage across all of our products. Pretty easy charter for a lone Growth Hacker, right? Spoiler alert: Analytics were a required organizational capability to effectively operate on that mission.

A few notable needs surfaced quickly after I had joined:

  • Assessing data sources, availability, and access
  • Establishing new data sources
  • Auditing the role of data in Product Management decisions
  • Identifying the key stakeholders in any Product-related decisions
  • Assessing the openness of the organization to data
  • Scaling analytics as a core Product competency across the Product team

The list above is a small subset of the bigger set of challenges you’ll face when evolving an organization. However, it reduces to the most fundamental enabler (or disabler): your organization’s Data Culture. Changing your org’s data culture is about changing behaviors and attitudes. There are a few things you can do to catalyze that transformation.

Baseline and Assess your Data Climate

You can use the attributes I list below — they are organization agnostic. Once you have your list, I suggest scoring each attribute on a simple scale (say, 1–5) based on how prevalent it is in your organization. You could also score other dimensions, such as consistency or impact. Whatever scale you choose, let the high end of the range represent your ideal state. Your job is to honestly rate where the organization is today.

Data Climate Attributes

  • Attitudes toward negative data — This naturally belongs under “Biases,” but I want to call it out specifically. People and organizations will celebrate and implement data that supports what they want to hear or see. The true test, however, is what they will do when the data counters what they want. A couple of years ago, we were value testing and usability testing a new content consumption experience that had been designed in a vacuum. I remember asking our VP of Product and CEO, “What will you do if the results are inconvenient to the truth you want?”
  • Data sources — Based on the stated objectives of the different products, I wanted to understand which existing systems and teams could produce the data we needed. I was largely interested in user behavior, product utilization, satisfaction, issues, and finance. I pinpointed all such existing systems and, more importantly, identified the missing data sources.
  • Data availability — Given the list of sources, I wanted to see how quickly I could access the data within each of them. This allowed me to understand what hoops I needed to jump through, the process to request data, which query languages I needed to learn, and how to estimate turnaround time.
  • Data integrity and completeness — Accuracy and consistency are a constant challenge. Looking at Account-level data in our CRM was exhausting because some critical fields were outdated, missing, or incorrect. Our other logs either double-counted or dropped important records.
  • Analytical capabilities — This is the make-or-break attribute, and you need the individuals on your team to index solidly against it. Capability is complex: you can break it down further into question clarity, analytical objectivity, interpretation quality, rigor, depth, and communicative ability. Weakness here leads to poor decision-making, which is a waste on so many levels.
  • Biases — I needed to take stock of the different biases that individuals and teams were predisposed to. Confirmation bias runs rampant: decision-makers listen to data that supports the outcome they want and ignore other important points. Belief bias is also commonplace in siloed organizations. In our case, the Support org was adamant that any data Product conjured was a setup to make them look bad or to wash our hands of responsibility to fix issues. This article is a great overview of different types of bias to look out for.
  • Consistency of data practices across teams — This is more appropriate for larger organizations. If you’re trying to scale practices across teams, this signal will tell you how much effort and enablement you’ll need to put in to scale effectively.
  • Clear articulation of goals and related metrics — Clear objectives and measures seem like a no-brainer, but there are so many factors that could obfuscate these. Normally, if multiple teams can’t state why they’re working on something and how they measure success, I would advise you to run away. But we don’t have that luxury, and sometimes it’s our job to fix that. Right? I digress… In any case, this is probably the easiest and fastest tell of an organization’s analytical ability.

Looking back, the most critical attributes to address are clear goal articulation, biases around data, and the inclusion of data in decisions. These attributes are fundamentally human factors and require investment in behavioral shifts. The other attributes are largely outcomes of proper instrumentation, governance, and some training.

This seems like a lot of work, but in reality it isn’t. I was able to take stock of these within the first 6 weeks of joining Cornerstone. Arguably, data sources, availability, and integrity require the most effort to document. The rest of the attributes can be assessed purely through observation and some light questioning.
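If it helps to make the baseline concrete, here’s a minimal sketch of how you might record and rank those scores in Python. The attribute names come from the list above; the scores themselves are purely hypothetical.

```python
# Minimal sketch of a data-climate baseline. Scores are hypothetical
# illustrations on a 1-5 scale, where 5 represents the ideal state.
climate = {
    "Attitudes toward negative data": 2,
    "Data sources": 3,
    "Data availability": 2,
    "Data integrity and completeness": 2,
    "Analytical capabilities": 3,
    "Biases": 2,
    "Consistency of data practices": 1,
    "Clear goals and related metrics": 2,
}

IDEAL = 5  # the high end of the scale is your ideal state

# Rank attributes by gap to ideal to help pick the 2-3 to tackle first.
for attribute, score in sorted(climate.items(), key=lambda kv: kv[1]):
    print(f"{attribute:40s} score={score}  gap={IDEAL - score}")
```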

Planning your (first) Cultural Shift

“What it feels like to do behavioral analysis with clean usage data”

This is where the fun really begins. Pick 2–3 attributes above that you want to evolve. As cliché as it sounds, these evolutions won’t happen overnight and will likely incur some frustration. At the risk of an even bigger cliché: believe in the process. Here’s a very simple model that I use to drive regular, incremental changes.

For each attribute, document the following items: desired behavior(s), reliable partners, and arenas where you want to see the behavior. Desired Behaviors should describe actions that you want to see or outcomes from a process. Reliable Partners are the people who can help you evangelize the desired behaviors or influence the relevant actors and processes. Lastly, Arenas should describe where or when the behaviors should be demonstrated. The idea is to visualize a successful behavioral shift.
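To make the template concrete, here’s a minimal sketch of one filled-in entry, assuming a simple Python structure. The fields mirror the three items above; the specific behaviors, partners, and arenas are hypothetical examples, not a prescription.

```python
from dataclasses import dataclass, field

@dataclass
class ShiftPlan:
    """One entry in a cultural-shift plan: the attribute to evolve,
    what success looks like, who helps, and where it should show up."""
    attribute: str
    desired_behaviors: list[str] = field(default_factory=list)
    reliable_partners: list[str] = field(default_factory=list)
    arenas: list[str] = field(default_factory=list)

# Hypothetical example entry
goal_setting = ShiftPlan(
    attribute="Clear articulation of goals and related metrics",
    desired_behaviors=[
        "Every initiative states its objective and success metric up front",
        "Metrics are reviewed against the goal, not debated after the fact",
    ],
    reliable_partners=["VP of Product", "2-3 senior PMs on visible projects"],
    arenas=["Sprint reviews", "Product all-hands", "Quarterly planning"],
)
```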

I won’t go into change management practices here; the how will, of course, require planning and effort. However, with the behaviors you listed, your partners, and your visualized success criteria, you have the building blocks for evolving your Data Culture.

Putting it all together: A Reflection

Here’s what my initial assessment of Cornerstone’s data climate surfaced:

  • Accessing data like product usage and account-related information was difficult as PMs either didn’t have authorization, required developers to write complex SQL, or had to learn a query language to extract data themselves.
  • The organization needed more than trustworthy, high-quality data. They needed quality interpretation.
  • Literacy and rigor needed to be verified, especially when data was being used in decision-making.
  • There was an obsession with metrics. Often, metrics and measurements were defined before the questions they were meant to answer. This created many vanity metrics that were tossed around but didn’t help move any meaningful needles.
  • Goal-setting, while “practiced”, didn’t always map back to business objectives.
  • Or… targets were sandbagged and didn’t really measure anything meaningful. Under the “Biases” attribute, there was a real fear that “missing” targets would be interpreted as failure. This might be the single most crippling factor in our cultural transformation.

Knowing that getting quality data would take some time, and that building literacy across a team of 30+ PMs would take even longer, I focused my cultural evolution on three levers:

  1. Breaking down the “Missing Targets = Failure” bias
  2. Standardizing Goal-Setting across all of our Product teams
  3. Showcasing a future state of product analytics prowess

I was able to catalyze change on items (1) and (2) by presenting at a product-team all-hands within my first two weeks. The full presentation covered an audit of our data sources and a vision of how we would blend data together. However, most of the energy was spent on cultural mythbusting and introducing a simple metrics frame (the HEART framework).
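For readers unfamiliar with it, the HEART framework (from Google’s research team) maps five user-centered categories (Happiness, Engagement, Adoption, Retention, Task success) to goals, signals, and metrics. Here’s a minimal sketch of what a plan might look like for a hypothetical “content search” feature; the signals and metrics are illustrative, not the ones we actually used.

```python
# A minimal HEART metrics plan for a hypothetical "content search" feature.
# Each category maps a goal to an observable signal and a concrete metric.
heart_plan = {
    "Happiness":    {"goal": "Users find search pleasant",
                     "signal": "In-app survey responses",
                     "metric": "Avg. satisfaction score (1-5)"},
    "Engagement":   {"goal": "Users search regularly",
                     "signal": "Search events per session",
                     "metric": "Median searches per weekly active user"},
    "Adoption":     {"goal": "New users discover search",
                     "signal": "First search event",
                     "metric": "% of new accounts searching in week 1"},
    "Retention":    {"goal": "Searchers keep coming back",
                     "signal": "Repeat search sessions",
                     "metric": "4-week searcher retention rate"},
    "Task success": {"goal": "Searches end in useful results",
                     "signal": "Result clicks after a search",
                     "metric": "% of searches with a click in the top 5"},
}
```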

Some thoughts and some principles. The most critical one for us has been: “Measure to continuously improve, not to grade or judge.”

After this high-level introduction at the all-hands, I partnered with a few PMs — mostly ones working on high-visibility projects — and helped them build out a metrics plan and a review cadence, and even ran some example analyses for them. This equipped them with high-quality data (with interpretations) that they could share with stakeholders across the company. After several rinse-and-repeat cycles, we had created enough small wins and behavioral evolutions to market to the rest of the team.

For item (3), I injected myself into a cross-functional post-product-release meeting. This was a good arena for me, as I was able to connect with people from client success, support, product, engineering, and a few other departments. The overarching message I wanted to convey was that we already had sufficient data (not easy to get to, but still…) to sharpen our decision-making AND that we were open to sharing it. The last part was the most important for battling the bias against showing negative data and sharing across silos. It also helped shift the perception that “missing targets” means failure; instead, it’s an opportunity to improve our go-to-market.

Cohort Retention Analysis that was presented to the cross-functional team

I chose a cohort retention chart as the centerpiece of my analysis to show how simple, single-source data can generate insights. It resonated with many attendees because it created surface area for questions and discussions that hadn’t happened before.
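For anyone who hasn’t built one, a cohort retention table needs only two columns of usage data: a user identifier and an activity date. Here’s a minimal pandas sketch, assuming a monthly grain and a hypothetical usage_events.csv schema; it isn’t our actual pipeline.

```python
import pandas as pd

# usage: one row per (user_id, activity_date) event. Hypothetical schema.
usage = pd.read_csv("usage_events.csv", parse_dates=["activity_date"])

# Bucket activity by month and stamp each user with their first-active month.
usage["activity_month"] = usage["activity_date"].dt.to_period("M")
usage["cohort_month"] = usage.groupby("user_id")["activity_month"].transform("min")

# Number of months elapsed since the cohort's first month.
usage["period"] = (usage["activity_month"] - usage["cohort_month"]).apply(lambda d: d.n)

# Count distinct active users per (cohort, period), normalize by cohort size.
counts = (usage.groupby(["cohort_month", "period"])["user_id"]
               .nunique()
               .unstack(fill_value=0))
retention = counts.div(counts[0], axis=0)  # period 0 = full cohort size

print(retention.round(2))
```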

For all three cultural evolution levers above, the behavior I wanted to observe was primarily engagement. My thesis was that if a few PMs could openly discuss goals and share metrics without fear of judgment, then we could foster curiosity and exploration across the rest of the team. Similarly, if the broader organization could see data — insightful and honest — then we could drive that same curiosity along with shared accountability.

In these instances, I would call the effort successful. I saw PMs engaging with me and among themselves. Other departments looked to standardize data and reporting in conjunction with the Product team.

Strong Charts and Full Hearts: How all of our team meetings start now

Of course, this was just the beginning. We had a ways to go with overall literacy, data-driven decision making, data availability, and completeness. As I would soon find out, the Product team began asking questions at a pace and depth that our existing data could not answer. That lack of availability and access would be my next challenge.

Stay tuned for my upcoming posts on Data Sources — Acquisition and Access! Please interact with me if you have any questions, comments, or your own stories to share!


