Design Thinking for Better Data Collection in Chaotic Environments

Anonymous HKS Student
DPI-662: Digital Government
5 min read · Sep 10, 2016

“Data” is one of the last words a nurse working in a rural, poorly resourced health system in Sub-Saharan Africa wants to hear. Nurses have the unenviable task of caring for patients in the middle of several interlocking health crises (HIV, TB, chronic malnutrition, etc.), and they are also responsible for a lot of dull work: making sure the health clinic is clean, tracking the gasoline in the generator, and, most importantly to us outside researchers, collecting data. When gravely ill patients arrive at these often one-woman health posts, some of these tasks take a back seat, particularly data collection.

For two years I worked as a research manager in such a health system, overseeing data collection and analysis for a study designed to improve HIV identification and care for infants in 60 rural health facilities. Part of my job was to design new data collection forms to make sure all of the study outcomes the Ministry of Health and external funders cared about were accurately recorded. Working on a tight timeline, I visited a few sample clinics, took stock of the existing data tools, and set to work filling the gaps.

Invariably in my early visits, nurses complained about the workload created by the 15+ overlapping “data registers” that had to be completed each week. Many of the forms had been photocopied on poor machines over and over again and reduced to size 8 font to cram more information onto each page. It was beyond our mandate to get rid of any of the 15 registers in existence, but we could at least redesign a few of them and print clearer copies with size 14 font. This won a few friends among the older nurses with thicker glasses and unsteady hands. However, our experience making a few forms a bit prettier masked some deeper problems that went unresolved.

Mistakes Were Made

While we recognized many design flaws in the existing system and forms, when it came time to design our own new forms, we also failed miserably. In the process, we inadvertently created more work for nurses and ended up with unusable data after the forms were abandoned halfway through the project. The biggest mistake among the many we made was making our new forms a combination of data that was already collected elsewhere (albeit irregularly) and new data specific to the project. The new task was perceived as both redundant and extra work after it was introduced to the facilities. Over time, compliance faded to levels that rendered certain data unusable.

In hindsight, the forms were designed to meet our needs as researchers rather than the needs of the rural nurses tasked with completing the paperwork. The new form centralized all of the information critical to our project in one place and ensured that we could follow up on a specific form if the paperwork was not filled out. From the perspective of the true users, the nurses, however, this was just form number 16, no more urgent or less bothersome than the other 15.

In addition to this overarching mistake, we also did not make the forms very easy to fill out. While we did complete one round of piloting before finalizing the forms, we only fixed the surface-level problems that we encountered in our first test. We did not anticipate any of the problems that would arise from edge cases that didn’t show up in our test drive. This experience clarified that piloting in such research settings needs to be more than a formality of testing a survey or form once and making the obvious corrections. It requires multiple iterations to get it right.

Institutions Not Conducive to Innovation

While I think we knew at the time that multiple iterations of piloting, revision, and training would have been ideal, several structural barriers prevented such an approach. First, all research protocol changes needed to be approved by two separate ethics review boards, one in the US and one in-country. This process was expensive and time-intensive, taking several months to clear all of the hurdles. Any change to a form required opening up the process again and seeking new approval. Second, the Ministry of Health at various levels wanted to approve all final forms, which also took several months. These overlapping approval processes did not lend themselves to multiple iterations.

Carving Out a Design Space

Although the existing environment was not well suited to tinkering during our project design phase, there are certainly steps we could have taken to carve out such a space. One would have been to conduct several rounds of piloting before submitting the first draft of the materials to ethics for approval. This could have been accomplished in one to two weeks, iterating on a daily basis and working with a small group of facilities close to our office to reduce cost (even if these were not the facilities we would have worked with at scale).

Second, we could have engaged more actively with the health facilities to see whether existing data collection forms could be improved to the point where staff were willing to work with them more regularly, avoiding the need to introduce a new form at all. This would have required a longer-term engagement with the Ministry of Health and approval from a more senior level.

Third, we could have more actively asked the Ministry of Health for a window of time to experiment. To gain such approval, we could have laid out our initial designs and agreed on the bounds of the experimenting, only going back for regular approval if those bounds were exceeded. If such approval were not possible at a higher level, perhaps decision-making power could have been temporarily devolved to a local authority who could review each iteration on a much quicker timeframe, before higher-level approval was sought to scale up the approach.

A More Systemic Issue

While these suggestions might have made our individual project more successful, they would have done little to improve the Ministry’s broader data collection program. Layers upon layers of poorly designed systems, unfriendly to nurses and other frontline workers, would still have persisted and sucked up valuable resources. Many of these layers exist because outside researchers (like me) and other partner organizations all have enough sway to get their niche data collection form added or their system tweak implemented in some portion of the system. These processes pile up over time, outliving the partners’ involvement, and do not serve the long-run needs of the users: the frontline workers and the broader health system. Tackling this systemic problem requires a more fundamental rethinking of the health system and of how design thinking could be applied to its problems.


Musings on tech, development, and public policy for Digital Governance #DPI662