A Framework for Data-Driven Government
Last week, I wrote about the barriers to data-driven decision making in government, covered in the first week of the class I teach in the Executive MPA program at Syracuse University. In this week’s class, we talked about the framework around which data and analytics projects can be implemented. This framework is challenging because it comes off as linear, but it very much is not. If I’ve learned anything during my time doing this work, it’s that projects need a lot of iteration, sometimes they need to be shut down, and there is a need for reflection and feedback at every step. Generally, though, the framework is as follows: discovery, insight, action, and outcomes.
Discovery
As a project is developing, defining what problems the project will tackle is critical. Discovery, in my experience, can look a few different ways.
When we worked on projects relating to physical infrastructure in Syracuse, the team would observe staff fixing a water main break. We even filled potholes and threw trash in an effort to gain some understanding of what the staff dealt with on a day-to-day basis. Our hope was that this would frame the problem more specifically, and that it might build some trust when we made our recommendations.
Discovery can also get extremely detailed. For a data project, it might mean understanding how two tables in a database are joined together, or finding where staff have built a workaround for a less-than-optimal software system — one that makes their work easier but might make it harder to do an analysis.
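To make this concrete, here is a hypothetical sketch of that kind of detail work — all table names, column names, and records are invented for illustration. Checking how cleanly a join key actually matches across two tables is often where staff workarounds (like hand-typed IDs) first show up:

```python
# Hypothetical example: does the key linking a work_orders table to an
# assets table actually match cleanly? (All data here is invented.)
work_orders = [
    {"order_id": 1, "asset_id": "H-0042"},
    {"order_id": 2, "asset_id": "h0042"},  # staff workaround: hand-typed ID
    {"order_id": 3, "asset_id": "H-0099"},
]
assets = [{"asset_id": "H-0042"}, {"asset_id": "H-0099"}]

asset_keys = {a["asset_id"] for a in assets}
unmatched = [w for w in work_orders if w["asset_id"] not in asset_keys]
match_rate = 1 - len(unmatched) / len(work_orders)
print(f"Join match rate: {match_rate:.0%}; unmatched orders: {unmatched}")
```

A check like this, run early in discovery, tells you whether an analysis built on that join can be trusted before anyone invests in it.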
Ultimately, the hope is that this step gives a clear definition of what the project should look like, and that many of the initial questions have been asked and answered.
Insight
In this step, data should be analyzed to better understand general trends, and initial insights should be shared and presented to relevant stakeholders. Many times, the results of this step prompt another round of discovery as new questions arise. A good thing to remember is that this may be one of the first times stakeholders have ever seen the data presented the way you are showing it. In my case, a couple of departments had never seen the data they generated on a daily basis put on a map. When they saw all of the incidents they were responding to mapped out, it opened their eyes to a number of insights that they hadn’t thought of before and that I never would have come up with on my own.
This is also a good time to understand where data might be missing. When we were thinking about better community outreach for 311 calls (pothole complaints and the like), we had a sense that some parts of the City called less frequently than others, but we weren’t sure. Mapping this information made it clear that some neighborhoods called much less — and as a result got less service. Revealing this insight was important, and it also helped lead to the next step: action.
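The core of that insight is a simple normalization: raw call counts alone can hide an underserved neighborhood. A hypothetical sketch (neighborhood names, populations, and call volumes are all invented) of the calculation behind the map:

```python
from collections import Counter

# Invented 311 call records (one entry per call, by neighborhood)
calls = ["Northside", "Northside", "Eastwood",
         "Northside", "Eastwood", "Southside"]
# Invented neighborhood populations
population = {"Northside": 9000, "Eastwood": 8000, "Southside": 10000}

counts = Counter(calls)
# Calls per 1,000 residents — normalizing by population is what turns
# "this neighborhood seems quiet" into something you can act on.
rate = {n: 1000 * counts[n] / pop for n, pop in population.items()}
quietest = min(rate, key=rate.get)
print(f"Lowest call rate: {quietest} ({rate[quietest]:.2f} per 1,000)")
```

In practice this would run against real 311 and census data and feed a map rather than a print statement, but the logic is the same.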
Action
With insights, stakeholders might have a better idea of what is going on, but next it is important to design potential interventions. Those actions should be led by the data — in the 311 example above, we had a sense of which neighborhoods were making service requests the most and least. A few actions could have worked: a marketing campaign aimed at those neighborhoods, a reimagining of the process for submitting a service request, or not relying on service requests at all and instead building tools to predict where problems may arise.
Any action will likely require additional insight, and will also probably need additional discovery to ensure the data being used actually informs the actions being taken.
In some cases, the actions may not rely on a large amount of data and analytics work — the initial insights might be so clear that the action is obvious. In other cases, more analytics work is required. When we were working with the fire department to help them understand which fire hydrants could supply the most water during a fire incident, we had general information about high- and low-pressure zones within the system, but nothing more granular. We needed to give them a prediction of water pressure for any of the more than 5,000 hydrants in the City. The action ended up being more data collection, followed by building a predictive model and a map that was accessible from the fire truck itself.
Outcomes
Once the action is in place, the project is still not done. It is important to monitor whether the action taken is effective, and whether it continues to be effective. To do this, some measures should be put in place to define success: is the department more efficient or effective because of the recommended action?
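A success measure can be very simple. As a hypothetical sketch (the metric and all numbers are invented), monitoring might start with nothing more than comparing a single measure — say, average days to close a service request — before and after the intervention:

```python
from statistics import mean

# Invented numbers: days to close a service request, before and
# after the recommended action was put in place.
days_to_close_before = [14, 21, 9, 30, 18]
days_to_close_after = [10, 12, 8, 15, 11]

before = mean(days_to_close_before)
after = mean(days_to_close_after)
improved = after < before
print(f"before={before:.1f} days, after={after:.1f} days, improved={improved}")
```

The point is less the arithmetic than the discipline: pick the measure before acting, then keep checking it, since an action that worked at launch can stop working later.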
This ends up being cyclical as well. It is more than likely that the action will surface additional problems to tackle, or that, with the key problem crossed off the list, less critical problems can now be addressed. And sometimes the proposed action simply doesn’t work, requiring more discovery, insight, and action.
This is a rigorous process. Some projects may only need a quick answer, though the general framework will remain the same. While this framework is useful, it does require buy-in from leadership. It certainly can be quicker or easier to do things the way they have always been done, or to make a gut decision without analyzing the outcomes. Doing it that way, though, is obviously not how I’d approach a problem.
What are your thoughts on this framework, and is it missing anything?