Build a Better Performance Measurement System: Step 5

Katelyn P Mack
Learning for Change
10 min read · Sep 23, 2020



Dive Into Design

When you remodel your kitchen, you don’t just start tearing out countertops and pulling up floorboards. Like a home remodel or construction project, a performance measurement system requires a solid design before breaking ground. Most platforms require some design work to configure the system for your needs, which might take a few hours or a few months depending on the platform you have selected and the complexity of your programs.

The data system design is similar to an architect’s plan. When our friends were remodeling their backyard, the architect asked questions to understand the scope of the project and their preferences. Fireplace or fire pit? Modern or traditional? Pavers or stones? Most importantly, how do you want to use your yard? Should it be a quiet retreat or a place for a party? If you have been following along step by step, then your Scope of Work (Step 2) already holds much of the information you need to dive into design.

An architect’s plans are only as good as your ability to articulate what you want the remodel to look like when it’s complete. The same is true for your performance measurement system. That’s why in Step 2 we articulate priorities and measurable goals. For example, I was speaking to a colleague who was frustrated with the implementation of their new performance measurement system. Their organization focuses on goal attainment as its primary impact measure. This was clearly communicated to their vendor at the outset, along with the specification that services provided to their clients be related to goal attainment. Easy peasy, right? Well, unfortunately not. The vendor’s existing product could relate many services to a goal, but it was not designed to easily capture and update multiple goals within a single service entry. As a result, my colleague had to work with the vendor to develop a custom solution, which increased the cost of the project and considerably delayed their launch.
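To make that gap concrete, here is a minimal sketch, in Python with hypothetical names (the vendor’s actual product was proprietary), of the two data shapes: a service entry hard-wired to a single goal versus a junction record that lets one entry track many goals.

```python
from dataclasses import dataclass

# The vendor's existing shape: each service entry references exactly
# one goal, so logging progress on a second goal means a second entry.
@dataclass
class SingleGoalServiceEntry:
    client_id: str
    service_type: str
    goal_id: str  # single foreign key; one goal per entry

# The shape the organization needed: an entry plus a junction record,
# making the relationship many-to-many so one entry can capture and
# update progress on several goals at once.
@dataclass
class ServiceEntry:
    entry_id: str
    client_id: str
    service_type: str

@dataclass
class ServiceGoalLink:
    entry_id: str  # points at a ServiceEntry
    goal_id: str
    progress_note: str = ""  # per-goal update recorded on the same entry
```

Retrofitting the second shape onto a product built around the first is exactly the kind of change that turns into custom development.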

You might be tempted to blame the mishap on a lack of specificity when sourcing the project; however, even the most detailed and comprehensive Scope of Work will not provide all of the information needed for a vendor to start configuring the data system to meet your needs. After all, you developed the Product Specifications Document (PSD) without knowing which vendor or implementation partner you would select (Step 3). You also may not have known your full budget if your project depends on raising additional funds (Step 4). There are also tradeoffs in functionality and scope depending on the platform and vendor you select, as well as your budget. And we are human. Part of building a successful performance measurement system is learning and adjusting plans as you go.

When it comes to designing our performance measurement system, five lessons stand out. Several I wish I had known before I got started!

  1. Make conversations leading up to the Design phase count
  2. Be conservative in expecting efficiencies
  3. Make internal training and capacity building part of implementation
  4. Be willing to reallocate resources to your most pressing needs
  5. Set realistic timelines

#1: Make conversations leading up to the Design phase count

When I was interviewing potential implementation partners for our Salesforce build, none wanted to jump right into the work without a brief “Discovery” phase. In fact, one vendor refused (yes, I asked twice!) to give me an estimate for the entire implementation project and only would provide a quote for an initial Discovery phase. I was irked.

I had put all of this work into creating a thorough and (I thought) comprehensive Product Specifications Document (PSD). I had interviewed staff and gathered input on priorities. We were ready to go. What else did they need?

I came to find out that the pre-Design phase of work is critical for implementation partners to more deeply understand our organization’s needs and our readiness for what we had planned. Since these efforts are usually fee-for-service or limited to a few phone calls, it is helpful to know some strategies for making the most of this phase.

  1. Organize and prioritize your source documents. Do not expect your vendor to efficiently organize and manage multiple documents sent at random in a dozen different emails. Make their internal document management easy. Use naming conventions that will make it easy for their team to find documents, such as Org Name_Priority Name_Topic. To illustrate: our performance measurement system includes student assessments. Our postsecondary success program includes quarterly staff assessments of college student experiences and progress monitoring. The assessment differs by quarter and asks different questions of students at community colleges and those at 4-year institutions. While I could have sent our partner six PDFs of the various Google Forms, I instead organized the questions into an Excel worksheet. This makes it easy to identify the standard assessment questions and which questions are unique to specific student groups or asked only once a year. By organizing information in a more digestible format, we reduced the time it took our partner to get up to speed (see the sketch after this list).
  2. Get interview questions in advance and prep your staff. While documents provide the content for what should go in your data system, interviews with your staff provide the context for understanding current business processes and readiness for implementing something new. Having our partner speak with program staff created buy-in. We had about 40 people (of 170 program staff) participate in interviews during our pre-Design phase. We made sure the interviews focused on our highest-priority topics. While we sent some general questions ahead of time, I wish we had also asked our team to review the questions in advance and gather information from their own teams. We’ll do that next time.
  3. Take control of “Discovery” conversations. Remember that you are your organization’s greatest asset in ensuring a focused and right-sized performance measurement system. Given how burned we were by our first foray into performance measurement, I was committed to being in control of the Discovery conversations. I joined every interview and focus group with our staff. My team developed our own interview questions and fed them to our implementation partner to use when their questions were too vague or off-track. By being involved in this pre-Design phase, I got to hear ideas from our team about how our system could function better to meet their needs. Some ideas were relevant to the initial phase of work; others we set aside for future development. After two days of interviews, we debriefed with our implementation partner. This allowed me to contextualize what the partner had heard from the team. And it allowed our partner to share reflections that could disrupt my existing narratives about what was important and necessary. So while your vendor or partner might own the Discovery phase, be involved and take control to ensure the right information surfaces to drive a strong design and implementation plan.
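As a minimal sketch of what that kind of consolidation can look like, the snippet below merges several exported assessment CSVs into one combined sheet, tagging each question with its source. The file names, the ‘question’ column, and the specific Org Name_Priority Name_Topic labels are illustrative; adapt them to your own exports.

```python
import csv

# Hypothetical exports of the quarterly assessments, named with the
# Org Name_Priority Name_Topic convention described above.
EXPORTS = [
    ("BGCP_PostsecondarySuccess_CommunityCollege_Q1.csv", "Community college", "Q1"),
    ("BGCP_PostsecondarySuccess_FourYear_Q1.csv", "4-year institution", "Q1"),
]

rows = []
for filename, student_group, quarter in EXPORTS:
    with open(filename, newline="") as f:
        for record in csv.DictReader(f):  # assumes each export has a 'question' column
            rows.append(
                {
                    "question": record["question"],
                    "student_group": student_group,
                    "quarter": quarter,
                }
            )

# One combined sheet: questions that repeat across every group and
# quarter are the standard items; the rest are group- or time-specific.
with open("BGCP_PostsecondarySuccess_AllAssessments.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["question", "student_group", "quarter"])
    writer.writeheader()
    writer.writerows(rows)
```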

#2: Be conservative in expecting efficiencies

In order to reduce the costs of project implementation, a Design plan often seeks to maximize efficiencies. For example, a partner might expect to spend 20 hours configuring Program A, 15 hours configuring Program B, and 10 hours each configuring Programs C, D, E, and so on. While that may be true in theory, efficiencies like these assume that Programs are more alike than different.

However, if I look at BGCP (Boys & Girls Clubs of the Peninsula) and many other nonprofits, distinct “Programs” are often more different than they are alike. For example, our elementary literacy intervention has a completely different logic model and set of activities than our high school advising program. Just because they both have services and outcomes does not make them similar.

Assumptions around efficiencies burned a couple of nonprofit leaders I have spoken with about their performance measurement systems. They and their implementation partners badly underestimated the lift of setting up multiple programs. As a result, projects took longer and went way over budget. It’s better to be conservative in any estimates of efficiencies, and to avoid them altogether if your Programs are very distinct.
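To see how much an optimistic efficiency curve can skew a budget, here is a back-of-the-envelope sketch. The hour figures echo the example above, and the hourly rate is hypothetical, not from any actual quote.

```python
# Configuration hours for five programs under two assumptions.
optimistic = [20, 15, 10, 10, 10]  # later programs reuse earlier work
conservative = [20] * 5            # each program is as distinct as the first

HOURLY_RATE = 150  # hypothetical implementation-partner rate, in dollars

for label, hours in [("Optimistic", optimistic), ("Conservative", conservative)]:
    print(f"{label}: {sum(hours)} hours, ${sum(hours) * HOURLY_RATE:,}")

# Optimistic: 65 hours, $9,750
# Conservative: 100 hours, $15,000
```

If your Programs have distinct logic models and activities, budget closer to the conservative line.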


#3: Make internal training and capacity building part of implementation

There is a way to conserve your budget that doesn’t depend on your vendor or implementation partner spending fewer hours on each additional program: have your staff learn the system as it is being built. Most implementation partners and vendors are very amenable to this approach. After all, they want you to be successful and to be a loyal customer. Making sure someone on your team is skilled in and knowledgeable about the system provides that security.

If you want to build a better performance measurement system then you need a staff person who knows the system inside and out. We created a new position, Database Administrator & Analyst, within our Impact & Evaluation team when we adopted our first performance measurement system. Transitioning to Salesforce took us back to square one.

An advantage of selecting Salesforce was that we got immediate access to high-quality training resources and to an engaged user community. These were two reasons why we believed Salesforce would be the right system for us. So our Database Administrator and I mapped out a plan for completing Salesforce Trailhead modules that would ensure he understood the platform well enough to update, maintain, and configure new programs once our initial implementation project was done.

We also set clear expectations with Exponent Partners that our Database Administrator would be trained side-by-side with their team. This typically followed an “I do. We do. You do.” approach. Having Exponent team members based in the Bay Area meant our staff could literally sit side-by-side as the Exponent team set up activities, created new fields, added process flows, and more. When Exponent completed an important task without our team available, they shared a video that our Database Administrator could watch later and apply.

In our case, we set up 3 major programs in Phase 1 of the project. Exponent took the lead on two of the programs, and our Database Administrator configured the third with guidance and oversight by the Exponent team. We were able to launch on time with all 3 programs in our performance measurement system.

Note: Be persistent with this! It’s easy to get caught up in getting things done most efficiently (which will limit any training). In the big picture, you want your team to be able to update and change the system once it’s in use.

#4: Be willing to reallocate resources to your most pressing needs

Wait…so we have already articulated our priorities and yet, our plans may need to change? Yes! Ok, hear me out. A lot of evaluators…err, people…are uncomfortable adjusting plans. Many of us are wired (and incentivized) to stick to a plan and make it work at all costs.

However, you will learn things leading up to or during implementation that require you to revisit your plan and adapt. For example, we had scoped creating an online registration form as part of Phase 1. We assumed it would reduce the time our staff spent distributing, receiving, and entering applications. However, once we learned more about the process and what it would take to deploy an online registration form across the organization, we realized 1) there was not a sense of urgency or need from our sites for an online form, 2) there was skepticism about whether our families would fill out an application online, and 3) there was not a clear process already in place for managing and following up with online applications.

Taking the advice of our implementation partners, we reallocated time from creating an online registration form to a mission-critical reporting feature on how often students attend programs.

#5: Set realistic timelines

While every implementation is different, this was our timeline:

  • November — Demo based on Product Specifications Document
  • January — Signed contract
  • February — Discovery phase begins
  • June — Testing complete
  • August — System launched

This project took 7 months from signed contract to launch. It could not have happened any faster, and it required nearly two full-time staff to pull off. I spent at least 80% of my time managing this project once the Discovery phase began, and our Database Administrator was 100% committed to it.

Let’s quickly talk timeline ballparks. I’ve talked with nonprofits that spent a year (or more) building their performance measurement system. A year seems too long. It is a long time for staff to wait once they know a change is coming. Remember, change usually makes people anxious and can impact productivity. Your business model also can change in that amount of time (at least in my experience), so you may need to revisit or redo work that was completed early in configuration.

On the other hand, any timeline under 3 months seems suspect. Unless you are focusing on a very narrow set of use cases, adopting a system straight out of the box, and expecting only a small team (<20) to adopt it, 3 months is not realistic. Having now led and experienced major organizational changes, I can tell you it takes at least:

  • One month to work with your vendor and implementation partner to understand the goals, priorities, issues, and roadmap and to get all necessary stakeholders (staff, leadership, trustees) on board
  • One month to design and develop the product / tool(s) and do some initial testing to get feedback and make sure they work in practice
  • A week or two to make changes and iterate based on feedback
  • Two weeks to train staff and roll out the product

That’s 3 months — minimum.
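Stacking those phases up confirms the arithmetic; a quick sketch, taking the upper end of “a week or two”:

```python
# Minimum weeks per phase, as listed above.
phases = {
    "Align vendor, partner, and stakeholders": 4,
    "Design, develop, and do initial testing": 4,
    "Iterate on feedback": 2,  # upper end of "a week or two"
    "Train staff and roll out": 2,
}

total_weeks = sum(phases.values())
print(f"{total_weeks} weeks ~ {total_weeks / 4.33:.1f} months")
# 12 weeks ~ 2.8 months: call it 3 months, minimum.
```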

Timelines are subject to the law of 3: time, cost, and scope/quality trade off against one another. If you have a short timeline, expect to spend more or to limit your scope and/or quality.

Lastly, your launch date should make sense given your annual program cycle. We decided to launch our system at the beginning of the school year. This allowed us to piggyback on existing staff training and start with a “clean slate” of data as we entered into a new enrollment cycle.

Conclusion

There is so much more to say about the next phases of implementation. Yet, success depends on getting off to a good start. Make the early stages of building a better performance measurement system count. You can do it!

And good luck.


Katelyn P Mack
Learning for Change

Social impact strategist | Data geek | Lover of learning | VP Impact & Evaluation @ Boys & Girls Clubs of the Peninsula | Previously @ FSG