Cities and Economic Mobility: What It Takes to Jumpstart Progress
How nine cities designed projects to boost economic mobility and evaluate their effectiveness
More and more Americans find themselves stuck in place, unable to build a better economic future, and the COVID-19 pandemic has only exacerbated this crisis. Even before the pandemic, however, cities across the country were stepping up to tackle this challenge and create greater opportunities for their residents. But not all approaches cities try are equally likely to lead to positive economic mobility outcomes, and without ways to evaluate what is and is not working, cities miss important opportunities. The What Works Cities Economic Mobility Initiative was created to help fill this gap.
Launched last year with an inaugural nine-city cohort, the initiative gives cities the opportunity to experiment with evidence-informed strategies for accelerating economic mobility for residents. Strategies focus on policy areas with a robust evidence base for improving economic mobility outcomes, including education, economic stability, workforce development, and safe, affordable housing.*
But how do cities build the case for a specific idea? How do they design an intervention with local partners? And how do they evaluate its impact?
One answer to all these questions is: thoughtfully, and with some outside help. As part of the initiative, the selected participating cities worked directly with the Behavioral Insights Team (BIT), one of What Works Cities’ expert partners, to design and evaluate the interventions being implemented over the 18-month program. To better understand how this work was stood up, WWC spoke with BIT staff.
The following responses have been edited for brevity and clarity.
How did each city land on a particular intervention to boost economic mobility?
Kelsey Gohn, BIT advisor: Each city took its own path to selecting a project. In their applications, cities could describe different policy areas they were interested in and projects they were currently working on. For a handful of cities, such as Lansing, the project idea came directly from the application.
There were other cities where we started from more of an exploratory angle, thinking about some of the outcomes they wanted to see, or whether there was something in the mayor's plan or from a task force on economic mobility that they wanted to home in on. We were able to help cities look at the existing information and evidence, and help them narrow down to a project. Detroit, for example, knew they wanted to focus on improving economic mobility outcomes through the lens of their affordable housing work, so we started there and helped design a project that specifically addressed how the city might help residents secure more stable housing.
For all of the cities, though, we helped them at the outset to identify potential projects that fit within the following criteria:
- Alignment with city and mayoral priorities;
- Relevance to economic mobility outcomes;
- A promising evidence base; and
- Potential scalability for cities beyond our cohort.
Broadly speaking, what did the design process look like for interventions?
Kelsey: At BIT, we have a process we go through on most of our projects to help us design and evaluate interventions — we call it “TESTS”: Target, Explore, Solution, Trial, and Scale. TESTS helps us structure the process as we break down a problem, identify and design solutions, evaluate those ideas, and act on the results.
For example, in Target, we helped cities think about their core problem statement: what kinds of outcomes or metrics they wanted to influence through this initiative, what impact that would have on their residents, and how those specific pieces related to their economic mobility goals.
Of course, this is an iterative process and we refine our problem statement as we learn more in each phase.
Next, in Explore, we helped cities engage with stakeholders to better understand how they see the problem the city has chosen to target. For example, we want to understand how they describe the barriers they face, and what they might suggest as ideas for improvement. Most of the time this takes the form of on-the-ground work to make sure we are designing with users, rather than for them. Our first site visit was really critical for most cities: we interacted with a lot of residents and users of different services to hear directly from them and learn how they were experiencing the challenge the city is trying to solve.
In Solution, we brainstormed and prioritized potential solutions for cities to implement, co-developing and refining the projects with the city and adding elements or tweaking evidence-based practices to fit what we heard from the city’s residents.
Then in Trial, we developed an evaluation plan so we could understand "what worked" and where there were areas for improvement. BIT helps plan these evaluations to make sure they're rigorous, but also feasible for cities to implement. Once we've designed this plan, we launch the interventions and start to monitor!
And finally, in Scale, we look to see what worked and what didn’t, to help the city and partners make decisions on how to carry this work forward.
What role did data play in designing an intervention? Was there a specific type of data a city needed to develop a project in say, early childhood education?
Carolina Toth, principal advisor, city governments: It can be tricky. You might wish for data that says, if you have poor attendance at kindergarten, then that makes you less economically mobile by this much when you’re 35 years old. But economic mobility is a difficult field in this way: ideal data tying long-term outcomes back to each thing that happened over the course of a person’s life doesn’t exist.
The approach cities took was much more pragmatic. Taking the example of early childhood education, we do know that high-quality education has a strong relationship to positive economic mobility outcomes, even if we don't have that all in one dataset. Research shows that improving preschool attendance can improve kindergarten readiness, which in turn has an evidence base connecting it to long-term outcomes, including economic outcomes.
So in Dayton, the data the city used to craft its intervention focused on the gap in attendance between Black and white preschool students. They were collecting that data in their schools and could see disparities. In order to increase economic mobility for all Dayton residents, they said, "Let's close this gap."
The same approach was behind interventions across the cohort: we worked with cities to place the data they had available in specific areas within a theory of change for how those factors affect economic mobility.
Michael Kaemingk, senior advisor: In the Explore phase, we also help generate qualitative data or other data, if needed, to inform the ultimate intervention design and make sure it’s tailored to the specific environment. Take the city of Detroit, for example. Its project is about increasing housing stability among affordable housing residents. We worked with a community partner to conduct semi-structured interviews with residents to understand what barriers those residents face to paying rent on time, and what kinds of services would be able to address those barriers.
What role did user feedback play in design?
Kelsey: Tulsa is a good example. Their intervention is to recruit more low-income young adults into a career education and job training program. We invited people in the target audience to give us feedback on sample recruitment and orientation materials, and then asked, "How did you feel about this?" The feedback was very specific and helpful. People would point to materials and say, "This word works, this word doesn't." Or, "I hated this five-minute part of the orientation."
That’s one concrete tip: Look at the data you have, but also prototype solutions with the actual people involved. You can learn a lot by listening to them.
What does the evaluation process look like for these projects?
Emily Cardon, principal advisor, head of research: We’ve tried to be really holistic about what evaluation means. For us, it accompanies all program work across all phases. It doesn’t have to mean a costly process at the end. Evaluation should be part of the whole program development process — it’s about incorporating data, being thoughtful and systematic about assumptions, and trying to find ways to test and validate.
In fact, the best evaluations are ones that are developed alongside the program rather than afterward. This way, we're able to evaluate the right things: each project is different and requires an evaluation tailored to its specific needs and expected outcomes.
[Learn more about evaluations in our Q&A with Emily: Building a Useful and Human-Facing Government]
There’s lots of times when things don’t work out. In those cases, evaluation is still telling you something, helping you learn and think about next steps. I hope people don’t think it’s an extra thing — it’s a really critical piece of doing the work itself.
Carolina: I’d add that being able to do an evaluation requires data on the outcomes that you as a city care about. Do you care about evictions, and how they might perpetuate racial inequities? Then you need to work with the courts to get access to evictions data and track it by different racial groups.
The other thing that can be helpful at a mindset level is an openness to randomization, which is the best way to know whether your intervention (versus another external factor) is responsible for any change you may see.
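To make the randomization point concrete, here is a minimal sketch, in Python, of how a city team might randomly assign residents to an intervention and then compare average outcomes between groups. This is an illustration only, not BIT's actual methodology; the resident IDs, outcome values, and function names are all hypothetical.

```python
import random
import statistics

def assign_groups(resident_ids, seed=2020):
    """Randomly split residents into treatment and control groups (hypothetical example)."""
    rng = random.Random(seed)          # fixed seed keeps the assignment reproducible and auditable
    shuffled = list(resident_ids)
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

def estimate_effect(outcomes, treatment_ids, control_ids):
    """Difference in average outcomes between the treatment and control groups."""
    treated = [outcomes[i] for i in treatment_ids]
    control = [outcomes[i] for i in control_ids]
    return statistics.mean(treated) - statistics.mean(control)

# Hypothetical data: 1 means a resident paid rent on time, 0 means they did not.
resident_ids = list(range(200))
treatment, control = assign_groups(resident_ids)
treated_set = set(treatment)
rng = random.Random(7)
outcomes = {i: int(rng.random() < (0.75 if i in treated_set else 0.65)) for i in resident_ids}
print(f"Estimated effect: {estimate_effect(outcomes, treatment, control):+.3f}")
```

Because assignment is random, a sizable difference in the measured outcome is more plausibly attributable to the intervention itself rather than to an outside factor, which is the point Carolina makes above.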
What advice would you give to cities that want to stand up interventions on their own? How should they get started?
Carolina: Look at the data you have about economic mobility in your city — who is being left behind? At what phase of their lives do people suffer setbacks? Who can help you as a city make a dent on a problem, extending your efforts both now and in the future?
Also, pick something very concrete that you can do in the short term to move the needle on this issue, in parallel with having a broader and longer-term discussion, solidifying relationships and ensuring you’re able to track all the outcomes you care about. An 18-month initiative can help you build momentum and demonstrate you are serious about tackling something as complex and multifaceted as economic mobility.
Can you share any major lessons learned thus far from the initiative?
Carolina: One thing is that while bringing together multiple partners is hard work and can add complexity at the beginning of a project, it is critical for city governments to have relationships with community partners and pursue a joint agenda for residents. These relationships and partnerships help build something sustainable.
For example, Rochester's project has allowed for close collaboration between the City's Office of Community Wealth Building, Rochester's Financial Empowerment Center, and CASH, a local non-profit that provides free tax prep services. They all work to serve the same people, so being able to more efficiently refer, follow up, and identify shared challenges allows them to have a cumulative impact on people's lives that they couldn't achieve working alone or in silos.
Another thing we’re learning is that if you really consider your priorities carefully and pick an intervention aligned to them, even a global pandemic will not upend your plans!
How has the COVID-19 pandemic affected this work around economic mobility?
Kelsey: What was important for economic mobility before COVID remains important after COVID, if not even more important. So at the intervention level, it’s been about tweaking things, like changing evidence-based practices to be sensitive to an employer that now has very little funding. Or removing program elements that are now health risks, like encouraging people to carpool.
Tulsa, for example, had to pause recruitment, in-person trainings, and in-person coaching in its program because of the pandemic. But youth need jobs now more than ever, and higher-mobility jobs are still being brought to the city as a result of Tulsa's economic development efforts, which have continued despite COVID. So, over the course of the last couple of months, we've worked with the city to modify the program and adapt the work to fit an online environment. The intended outcomes are still the same: to get young people in the door and connect them to the support they need to secure high-mobility jobs in Tulsa. The path toward that end might have had to be adjusted, but the goal hasn't changed.
* Other policy areas with strong evidence on economic mobility outcomes include: community and social capital, health and well-being, and criminal justice.
The What Works Cities Economic Mobility Initiative is made possible by the support of Bloomberg Philanthropies, the Bill and Melinda Gates Foundation, and Ballmer Group. Learn more about Cincinnati, OH; Dayton, OH; Detroit, MI; Lansing, MI; New Orleans, LA; Newark, NJ; Racine, WI; Rochester, NY; and Tulsa, OK and their projects, here.