Sometimes it *is* about the tech

Making All Voices Count Research Spotlight

Kate Thompson Davy
Civic Tech Innovation Network
6 min read · Jun 22, 2017


Mobile journalists in training. Picture: ZOE KLAR via ALLISSA RICHARDSON

“It’s not about the tech” is a well-known trope. It encapsulates the idea that the choice of one technology tool over another is less significant than how that tool is used and in what context. But is it true? This report argues that the choice of a tech tool is still critical to the success of a project, particularly for technology for transparency and accountability initiatives (T4TAIs). In other words, “sometimes it *is* about the tech”. The researchers investigated how organisations go about choosing their tech — what they do and where they “drop the ball” — in order to promote better decisions and interventions.

Why you should care

The choice and roll-out of a technology tool can make or break a civic tech project: it affects the project’s ability to reach scale and the availability of collected data for analysis, and may even shape how you engage with stakeholders around the project. This research found that fewer than 25% of the survey participants were happy with the tool they chose, including tools built for purpose. Researching available existing tools before developing one, and preparing a solid project plan with defined outcomes, are among the remedial steps this report suggests.

The gist of it

Indra de Lanerolle, Chris Wilson, Sasha Kinney and Tom Walker researched the processes behind choosing a technology tool in transparency and accountability initiatives. Through semi-structured interviews with almost 40 Kenyan and South African organisations that used tech for accountability and transparency, they identify common starting points and conditions for success and failure in those choices. They find that most organisations’ tech choices were unsuccessful, and make the case that more research and planning in the design phase of projects is needed to make better tool choices. They also find that a few key practices could make a significant difference in making tech work better. They capture these in six ‘rules’ or heuristics, which include ‘trialling’ (testing out tech choices with users before launching a new service) and finding people who have already tried to do what you are aiming to do before building new technologies yourself.

Report summary

This study focuses on the process of tool selection, using a mixed methods approach that incorporated an online survey, and telephonic and in-person interviews.

The tools in question included both relatively simple and fairly advanced systems, from communication tools like SMS to web-based data portals. The researchers found that few of the organisations had dedicated, experienced technical staff to choose or manage their tools. Organisations’ decision-making processes were “rarely linear or highly formalised”, and many organisations had conducted little to no research on their end-users or on the tool itself. Alarmingly, less than a quarter of the organisations interviewed reported that they were happy with the tools they had ultimately chosen. The study’s remedial outputs include “six basic rules of thumb for organisations deploying ICT for development” and “four recommendations for funders”.

You can download the report (PDF, 3.2MB) here.

What was done?

Researchers interviewed staff members from 38 diverse organisations in Kenya and South Africa about the way they had chosen a digital technology tool to use in a transparency and accountability initiative. Each of the interviewed organisations had recently chosen tools to use in their work. Survey questions focused on why they had chosen that particular tool, the process of choosing it, and whether they were happy with the choice.

What happened?

From the semi-structured interviews and surveys with the 38 organisations, the researchers identified three common starting points for the technology tools organisations chose:

  • Need: Most (21 out of 38, or 55%) started with a need that they thought a new technology could address.
  • Tool: Some (9 out of 38, 23%) reported that they had discovered a tool and wanted to find a way to use it.
  • Peer knowledge: Others (8 out of 38, 21%) reported seeing a tool used by a peer organisation and then decided to “implement a similar project in their own context”.

Particularly revealing was the finding that in just under half the cases (17 out of 38, or 44.7%), the organisation reported starting with the tool or type of tool they wanted to use before they knew how they would use it.

How organisations choose tools. Screengrab from report.

The researchers also found that more than half of the organisations (10 out of 20 in Kenya, and 11 out of 18 in South Africa) built their tech tool from scratch, without investigating existing similar tools, despite having limited development and technology experience in-house. Additionally, the study showed that many organisations had completely outsourced or delegated the decision-making process to a “technical partner”.

What did the researchers learn?

Key findings include:

  • After choosing tools, many organisations discovered functional limitations that significantly affected how useful the tool was for their initiative.
  • Half the organisations had problems in getting their intended groups of users to use the tools that they chose. Others had collected very little information on whether people were using them.
  • Where organisations built a new tool, they almost always did so without planning or budgeting for developing the tool beyond its launch version.

In addition to identifying the paths organisations use in their tool decision-making process, the researchers offer the following two sets of guidelines:

Six “rules of thumb” for choosing tools

The researchers identified six basic rules of thumb for organisations deploying ICT for development:

  1. Map out what you need to know: At the very least, include research on the issue you want the tool to address, the needs of the people you hope will use it, and the digital tool options already available.
  2. Think twice before you build: Look for existing tools that can do the job; building new technologies from scratch is complex and risky.
  3. Get a second opinion: Someone else has probably tried a similar approach before you.
  4. Always take it for a test drive: Trial the tool; it highlights problems and raises key questions early on.
  5. Plan for failure: Don’t expect to get it right first time; budget time and money to make adjustments.
  6. Stop and reflect on what you’re doing: Keep thinking about what is working, and what isn’t.

Four recommendations for funders

  1. Help organisations do more (and more effective) research: Organisations need to be supported and encouraged to develop project plans covering intended users, similar tools, and aims of the project — before they commit to using a particular tool.
  2. Give the space to trial and adjust: The first attempt to use a tool is unlikely to be the one that succeeds. Accepting failure, the need to trial products, and the need for an iterative process that adjusts as it continues all promote overall success.
  3. Support networks that provide face-to-face advice: Make connections and support spaces where organisations can share experiences openly or get access to appropriate, tool-agnostic advice, to help your recipients find and work well with suitable technology partners.
  4. Make research more accessible and actionable: Without access to appropriate research, organisations often repeat common mistakes in the tool selection process. To help them make better informed choices, investigate various ways to present key heuristics and guidance in ways that are relevant and actionable.

Conclusions

Overall, they conclude that, as the report title states, sometimes it is about the tech. “Choosing the right tool is a necessary, though not sufficient, part of ensuring that a T4TAI meets its goals”, they write. Making the right choices requires research (particularly into the end-users of the tool), as well as an honest acknowledgement of an organisation’s own strengths and skills gaps. Stronger learning networks and support structures from funders can significantly help to plug those gaps.

For more details, you can access the research report here.

Contact person: Indra de Lanerolle on indra.delanerolle@wits.ac.za or on Twitter @indradl

Project: Research report for Making All Voices Count

Publication: “Sometimes it is about the tech: choosing tools in South African and Kenyan transparency and accountability initiatives: findings and summary”

Authors: Indra de Lanerolle, Tom Walker and Sasha Kinney
