Off The Record #7: Understanding agent networks

Incentives and Identity

Busara Center
The Busara Blog
Apr 12, 2019

Photo © Busara Center

One question we are frequently asked is the following:

“What are some things that you have noticed working in the Global South that are specific to these markets?”

As a behavioral science organization, we are fascinated by contextual data specific to a population, and we have noticed a common thread across the majority of programs we consult on for clients: a reliance on a network of agents to reach the desired population, particularly for last-mile delivery.

This is especially true in rural locations and among the poor. For example, access to financial services for these populations is made possible through an agent who acts as the liaison point for last-mile cash-out. To address health concerns, trusted community doctors are far more effective than a faraway clinic. Similarly, in agricultural programs, training agents spread better farming practices and enforce accountability for loan repayments.

Because a majority of large-scale initiatives and programs rely on agents, we decided to explore the behavioral triggers of agents themselves: what motivates them, how can they be incentivized, and what makes them successful? We also wanted to explore techniques to increase agent efficiency, specifically in reaching rural women, who remain the least included group, especially in last-mile delivery. Are women harder to reach because agents are typically male? To what extent does agent gender lead to selectivity? This Off The Record is a first step toward exploring these questions experimentally, and we are excited to dig into them in more depth moving forward!

The research design

In order to identify the best nodes of diffusion and understand how to make an agent network more efficient, we decided to recruit “agents” ourselves. These agents were tasked with recruiting individuals to join our database for lab studies in Nairobi, subject to two selection criteria: recruits had to own a smartphone and come from Kibera or Kawangware (Nairobi).

This design is a proxy for how agents mobilize people to participate in programs.

To measure agent motivation, we created two different invitation messages:

  1. Invitation framed as an opportunity for them to earn money
  2. Invitation framed as an opportunity for them to serve their community

243 “agents” (male: 114, female: 129) agreed to the task and recruited respondents from the target population over a period of 5 days.

Once the “agents” arrived in our lab, we explained the task, cross-randomized them into treatment groups, and gave them job contracts accordingly (a rough sketch of the assignment logic follows the list below). The treatments were the following:

  • Treatment 1: support to plan their recruitment strategies. Participants engaged in short group discussions to identify the best strategies for recruiting respondents over the following 5 days. This was in addition to a simple commission of 50 KES per recruit.
  • Treatment 2: each “agent” committed individually to recruiting a minimum of 50 respondents by writing this down on paper and signing it. This was in addition to a simple commission of 10 KES per recruit.
  • Control: “agents” received a simple commission of 10 KES per recruit.
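
For readers curious about the mechanics, here is a minimal sketch of what such a cross-randomization could look like, assuming a simple mapping from agent IDs to the invitation framing each agent received. The names, IDs, and assignment logic below are illustrative only, not our actual tooling.

```python
# Minimal, illustrative sketch: balance the three lab treatments within
# each invitation framing so the two factors are fully crossed.
# Names and data are hypothetical, not our actual pipeline.
import random
from collections import defaultdict

TREATMENT_ARMS = ["planning_support", "written_commitment", "control"]

def cross_randomize(agents, seed=7):
    """agents: dict {agent_id: invite_frame}; returns {agent_id: treatment_arm}."""
    rng = random.Random(seed)
    by_frame = defaultdict(list)
    for agent_id, frame in agents.items():
        by_frame[frame].append(agent_id)

    assignment = {}
    for frame, ids in by_frame.items():
        rng.shuffle(ids)
        # Deal agents round-robin into arms so each framing group is
        # split roughly evenly across the three treatments.
        for i, agent_id in enumerate(ids):
            assignment[agent_id] = TREATMENT_ARMS[i % len(TREATMENT_ARMS)]
    return assignment

# Example with 243 hypothetical agents: odd IDs received the "earn money"
# invitation, even IDs the "serve your community" invitation.
agents = {i: ("earn_money" if i % 2 else "serve_community") for i in range(1, 244)}
arms = cross_randomize(agents)
```

Balancing the arms within each invitation framing keeps the two factors independent of one another, which is what allows the effects of the framings and the treatments to be compared separately.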

Our results

We tested whether the different invitation messages affected attendance, and whether the treatments affected the number of recruits, and found no significant differences between treatment groups. However, we found that the gender of agents strongly influences the gender of their recruits: women recruit more women, and men recruit more men.
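
For the curious, this pattern boils down to asking whether recruit gender is independent of agent gender. Below is a minimal sketch of one way such a check could be run, using a chi-square test of independence; the counts are made-up placeholders, not the study's actual figures.

```python
# Minimal, illustrative sketch: chi-square test of whether recruit gender
# depends on agent gender. The counts below are hypothetical placeholders.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: agent gender (female, male); columns: recruit gender (female, male)
recruit_counts = np.array([
    [200, 100],  # recruits brought in by female agents (hypothetical split)
    [100, 200],  # recruits brought in by male agents (hypothetical split)
])

chi2, p_value, dof, expected = chi2_contingency(recruit_counts)
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Recruit gender is associated with agent gender (gender homophily).")
```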

While this is not surprising, it is important to document and demonstrate, and it should be factored into intervention design whenever agents are engaged.

Regarding the recruitment of respondents, 20% of our agents actively undertook their assignment (roughly 49 of the 243), and we obtained a total of 673 new respondents in our lab database by the end of the study, or around 14 recruits per active agent. (We will likely call upon these new respondents for a future “Off The Record”!) This low uptake could be because the 10 KES incentive was too low, or because the recruitment strategies identified in the planning discussions were not helpful enough.

What next?

This study serves as Off The Record evidence that agents recruit people who are similar to themselves, a behavior consistent with other studies documenting our biases. This could mean that the more similar an agent is to a target group, the higher the chances of successfully engaging that group. In this research design, agents were tasked with simple recruitment and received no specific guidelines on the demographic composition of their target sample. To make these findings useful, our next step is to unpack the mechanisms that drive this effect and then translate them into interventions that reach target groups as effectively as women recruiting women and men recruiting men.

Follow our next “Off The Record” to stay up to date, and reach out to us on Twitter with ideas for future studies we could run!

A final note

Our goal is to make research accessible in order to start, continue or inform conversations that help us to better understand human behavior.

With this in mind, each “Off The Record” post provides access to the full findings from our studies, freely available here.

As part of our commitment to Open Science, we keep anonymized data from all our ongoing research efforts live at this page.

Access the full data relating to this post here.

Busara is a research and advisory firm dedicated to advancing Behavioral Science in the Global South