Collaborative evaluation: a case study in building learning partnerships to improve educational outcomes
Coauthored by Alanna Williams
Evaluation is a key tool to understand the impact of public sector grants. That’s why CalData is interested in developing shared approaches to evaluation. But across the public sector, evaluations of grant programs can vary in quality and approach, and can be:
- Box-ticking activities: centered on compliance or one-off needs, lacking clear goals and a learning-oriented process to build our understanding of what really works.
- Rigid: limited flexibility to address grantee priority questions, constraints, and unexpected changes
- Uncoordinated: unable to extract broader lessons from varied projects
- Not transparent: limited feedback and self-examination, and hard to find or navigate materials
- Under-resourced: not funded to address the issues above
- A bunch of other things: for another day :)
The Regional K16 Collaboratives Grant Program (from our partners at the Office of Public School Construction) gave us a chance to test some approaches to grant evaluations. This post discusses the program and our approach: the state as a learning partner, not just a funder.
What is the Regional K16 Collaboratives program?
The California Regional K16 Education Collaboratives Program supports regional efforts to create education-to-career pathways. Although it’s a statewide effort, the program recognizes that sub-regions of California have unique needs, so each region brings together education and workforce partners to form a collaborative. Each regional collaborative will:
- participate in California’s Cradle to Career (C2C) data system,
- create at least two occupational pathways (in business, computing / engineering, education, or healthcare) per regional needs, and
- implement at least four strategies to improve student success from the 2021 Recovery with Equity (RWE) report.
Early on, the program put a strong emphasis on data and evaluation, hiring dedicated staff (Alex :-)) to develop a thoughtful approach and implement it.
Developing a Theory of Change
The first step was to create a statewide theory of change for the program. Each collaborative has its own theory of how its work will create impact; as the funder, we wanted a statewide theory of change to measure the grant’s overall impact. With our stakeholders, we identified our goals:
Big-picture goals are:
- Support regional collaboratives in their work to develop career pathways and close equity gaps through proven student success strategies,
- Increase use of the Cradle to Career data system and its tools, and
- Foster sustained cross-sector collaboration post-award.
Process goals are:
- Develop custom evaluation plans for each collaborative,
- Identify best practices in pathway implementation and student support strategies, and
- Document our iterative grant process.
Developing a Learning Agenda
We then used the theory of change to create a public learning agenda. Learning agendas help us:
- dive deeper to clarify goals and knowledge gaps,
- document what kind of evidence gathering activities are needed,
- establish what kinds of data we will need to conduct analyses, and
- propose deliverables and methods for disseminating findings.
We summarized the theory of change document into five goals. Each goal describes what we want to know and how we plan to gather evidence. We will also examine how the goals and questions relate to one another. For example, certain student success strategies like creating inclusive classroom spaces might be extra helpful for students in education pathways.
Implementing the Approach
We’ll need many different ways to answer our priority questions, including:
- Foundational fact finding: setting baselines with descriptive statistics and exploratory analysis
- Performance measurement: tracking each collaborative’s implementation progress. This helps us understand what’s working, what’s not, and how we can help
- Program impact evaluation: examining updated data against baselines and relevant comparisons. This helps us see collaboratives’ successes and challenges and keep an eye on equity gaps.
- Policy analysis: gathering feedback from collaboratives to identify roadblocks. This gives us opportunities to figure out how we can work together to resolve them. Stakeholders envisioned the state playing a more active role as a learning partner, which makes for a more responsive policy cycle, a win for both the state and our regions!
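To make the first two methods concrete, here is a minimal sketch in Python (using pandas) of how a baseline and a progress comparison might be computed. The data, column names, and collaborative names are all hypothetical illustrations, not the program’s actual reporting schema.

```python
import pandas as pd

# Hypothetical reporting data: one row per collaborative per year.
# Names and numbers are made up for illustration only.
records = pd.DataFrame({
    "collaborative": ["North", "North", "Central", "Central"],
    "year": [2022, 2023, 2022, 2023],
    "pathway_enrollment": [120, 150, 200, 190],
})

# Foundational fact finding: treat each collaborative's earliest
# reporting year as its descriptive baseline.
baseline = (
    records.sort_values("year")
    .groupby("collaborative", as_index=False)
    .first()
    .rename(columns={"pathway_enrollment": "baseline_enrollment"})
    [["collaborative", "baseline_enrollment"]]
)

# Performance measurement: compare the most recent year's data
# against the baseline to see each collaborative's progress.
latest = (
    records.sort_values("year")
    .groupby("collaborative", as_index=False)
    .last()
)
progress = latest.merge(baseline, on="collaborative")
progress["change"] = (
    progress["pathway_enrollment"] - progress["baseline_enrollment"]
)
```

In practice the equivalent comparisons would use each collaborative’s agreed-upon metrics, but the pattern — set a baseline first, then measure movement against it — is the same.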
Each collaborative has unique plans, so it doesn’t make sense to make everyone report the same data. Instead of taking a one-size-fits-all approach, we will work with each collaborative to create custom data reporting: custom metrics where collaboratives have unique approaches, and consistent measurement where they share practices. We’ll continue to meet with collaboratives regularly and work through any challenges together as partners.
To sum it up — we are excited about this model!
Centralized theories of change and learning agendas build a coherent, transparent, and learning-oriented foundation.
Collaboration and coordination across a variety of approaches mean we can have both:
- the flexibility to meet grantee needs
- the direction and tools to answer our statewide questions
Next Steps
For the Regional K16 Collaboratives program:
- This will be a multi-year evaluation process. We will keep learning, providing updates, and publishing material.
For CalData:
- We will continue to showcase promising evaluation practices.
- We will start to build statewide shared learnings, language, and communities around evaluation. Follow our newsletter to keep up on this work!
For you:
- We would love to hear from departments who are trying similar (or different!) strategies — feel free to drop a note in this form.