Build communities around a program: Version with focus

There are no perfect solutions; nothing is sacred.

--

Thursday, 8 PM. I can’t believe my WhatsApp notifications: 30, 40, sometimes 60 messages pouring into our communities!

The teachers and leaders are highly engaged, sending words of appreciation to each other: “I congratulate @Sujatateacher for being a Dedicated Educator and helping all her children learn”, “I want to celebrate @RahulSir for pushing his students to think critically!”, “@Fatimamiss is a Modern Professional because she uses innovative solutions in her class every day, like Attention Grabbers and Pair & Share”, “Special claps for @Dharshanamissprincipal for always supporting us and reminding us we are Nation Builders!”

Thrilled by the responses, I open the 321 Designers and Moderators group. It is a celebration there too: “Amazing job Design Team — the responses keep coming!”, “Thanks for the early feedback, the 3rd version seems to work today.” The different elements of our approach to creating educators’ online communities were slowly coming together.

In my first article, I described the first element, #1. Build on existing user habits; let me share a little more today.

#2. Measure your success

At 321 we use data to “Learn, Improve, and Prove”. Reflecting on data defines what we want to learn, which in turn helps us define what we are trying to achieve for our users. This reflection comes at the start of the design process.

One thing we have learned over the years is to be very mindful of what data we collect, and how much of it. It requires discipline. Our Kaizen team (Monitoring & Evaluation) helps every team in the organization build that discipline by defining key metrics to focus on; they ask us: “What exactly do you want to know by collecting this data? How will you use it? Which questions do you have? How will you prioritize?”

With regard to the educators’ online communities, our answer was straightforward: “We want to know how engaged our users are.” After researching how other communities measure their engagement, we decided that our measure of success would be the ‘reply rate’. Once the metric was defined, a collection process was built, and dashboards for analysis were created.
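The article doesn’t spell out the exact formula behind the reply rate, but one plausible definition is the share of community members who reply to a given prompt. A minimal sketch in Python; the `Message` structure and its fields are assumptions for illustration, not 321’s actual data model:

```python
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    is_prompt: bool  # True for the designed message 321 sends to the group

def reply_rate(thread: list[Message], member_count: int) -> float:
    """Hypothetical reply rate: unique repliers / community members.

    The article only names the metric; this exact formula is an assumption.
    """
    repliers = {m.sender for m in thread if not m.is_prompt}
    return len(repliers) / member_count if member_count else 0.0

# Example: a prompt followed by replies from 2 of 40 educators -> 0.05
thread = [
    Message("321-moderator", is_prompt=True),
    Message("Sujata", is_prompt=False),
    Message("Rahul", is_prompt=False),
    Message("Sujata", is_prompt=False),  # repeat replies count once
]
print(reply_rate(thread, member_count=40))  # 0.05
```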

When the system was set up, after multiple tests and versions, the magic started. During our weekly data analysis call, we tried to understand what the data wanted us to wonder about. Dashboards and graphs presented information that triggered more questions: ‘Is the engagement rate different in Hyderabad and Mumbai?’, ‘How does this compare with last week’s engagement rate?’, ‘What about these specific groups that were low on engagement?’. The data also helped us reflect on the specifics of our design: ‘Could this content be the reason why engagement is low?’, ‘Would this format increase the responses?’, or ‘We have used the same call-for-action (defined later) twice, and the rates went up each time; could that be it?’.

Assumptions were triggered… Hypotheses shaped…

But before we explore and understand how 321 started applying the hypothesis-driven approach, we first need to have a closer look at the details of the messages.

#3. Version the content until one sticks

During research on building communities, I once read “When launching a new community, we can ponder endlessly what kinds of posts will spark engagement. Unfortunately, simply pondering these questions does nothing to move the needle. The team jumped right in and tried several different types of content to initially engage people after their group launched.” (source)

And that is what we’ve done as well. We thought about what we wanted our educators to do or feel with our messages, and built on that. We wanted them to share their successes using new solutions, to see the impact on their students, to feel motivated, and to solve problems as a community of educators.

Hence, the content has three categories: 1) Follow-up, 2) Motivation, and 3) Problem-solving. Follow-up messages ask educators how they are implementing the teaching solutions in their classes (with images and students’ stories), or help them revise the solutions. Motivation messages contain inspirational quotes or facts aligned with the program’s beliefs, and celebrate educators’ “small wins”. Finally, problem-solving messages present a reality-based problem which educators can solve.

321’s Beliefs for Educators

The second aspect of the messages is their format. By choosing WhatsApp, one aim was to build on existing user habits, but also to play with different media: text, video, image, or even audio. Keeping in mind John Medina’s Brain Rules №10, “Vision trumps all other senses”, the use of images and videos was prioritized.

As our measure of success is the reply rate, it seemed important to help the educators do just that: reply. That is why we defined a third component of our messages: the “call-for-action”. The call-for-action was inspired by other social platforms such as 9GAG, Instagram, Facebook, and Twitter, which have developed various mechanisms for users to engage with a post.

For a few months, we did as Udemy did and created messages playing with combinations of content, format, and call-for-actions. After some time, we also reached out to the communities’ moderators for frequent feedback on the messages. As the moderators are the educators’ trainers and coaches, they have a great understanding of what would work, what might not, and what else to try out. Moreover, as Ruth, a designer at 321, explained: “At 321, we have a lot of schools for which our content is differentiated. Since the trainers are up to date with which solutions are used or not, thanks to their frequent school visits and classroom observations, they can often help the design team narrow down relevant content.” The creation then evolved towards a ‘co-creation’ process with multiple versions of a single message.

This approach matched several of our broader guidelines: user-centered (with a clear measure of success), good design (simple, real-world-ready messages based on ground observations), versioning (frequent feedback from moderators and timely tracking of the users’ reply rate), and borrowing from other successful approaches (the call-for-action).

We were applying the mantra ‘version the content until one sticks’, and we could see some trends emerge. The problem now lay in knowing for sure that one was sticking. There was also the risk of spending too much time and energy going in the wrong direction, which seemed to be happening as engagement rates were dipping in certain communities. We needed a way to bring focus to our versioning.

#4. Version with focus: hypothesis-driven development

The method that best fit our needs was the hypothesis-driven approach: “Instead of randomly testing ideas that you ‘feel’ are good, the focus should be on building a solid hypothesis that maximizes chances for winning.” (source)

We embarked on the journey of hypothesis-driven development.

The first step was to define for ourselves what a hypothesis is. Here’s the definition we selected:

“A hypothesis is a proposed statement made on the basis of limited evidence that can be proved or disproved and is used as a starting point for further investigation.” (source)

The second step was to put down our proposed statements, our “predictions for our experiments”, informed by as much knowledge as we had. Thanks to the first few months of versioning, we had the “limited evidence” to build our hypotheses: “We think that by making this change, it will cause this effect.”

Here are some initial hypotheses we put down to test:

  1. Sending messages on a holiday will have a higher engagement rate than sending messages on a day which is not a holiday.
  2. For schools with mixed language comprehension, sending the message in Hindi and English will show the highest engagement.
  3. Puzzles for 321 teachers will show the highest engagement.
  4. Sharing reflections after coaching will increase engagement in the next week.

The third step was to define how we would know whether the change in the metric proved or disproved a hypothesis. For that, we used a different sample set of communities for each hypothesis. For the 1st hypothesis, we chose a set of communities forming a representative sample of low, medium, and high engagement rates, sent them the message on a holiday, and compared their reply rates with a control group. For hypothesis 2, we selected communities to which the “mixed language comprehension” parameter applied: to a few we sent messages in both Hindi and English, and to the rest only in English. For hypotheses 3 and 4, we used the entire pool of communities and compared their engagement with that of a previous message (one which wasn’t a puzzle or a reflection).
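The article doesn’t name the decision rule the team used to call a result true or false, so purely as an illustration: for a treatment-versus-control comparison like hypothesis 1, one standard choice is a two-proportion z-test on the reply rates. A minimal sketch in Python, with made-up reply counts:

```python
from math import sqrt

def two_proportion_z(replies_a: int, n_a: int, replies_b: int, n_b: int) -> float:
    """Z statistic for the difference between two reply rates.

    replies_*: members who replied; n_*: members who received the message.
    """
    p_a, p_b = replies_a / n_a, replies_b / n_b
    pooled = (replies_a + replies_b) / (n_a + n_b)          # pooled reply rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error
    return (p_a - p_b) / se

# Hypothesis 1 with illustrative numbers: holiday group vs control group
z = two_proportion_z(replies_a=54, n_a=120, replies_b=38, n_b=115)
print(f"z = {z:.2f}")  # |z| > 1.96 -> unlikely to be chance at the ~95% level
```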

The fourth step was to create a calendar to know when to test which hypothesis. I would like to mention a very important point here: the hypothesis-driven approach isn’t a scientifically rigorous method, because it doesn’t pretend to take into account all the variables that could influence the results of an experiment. Instead, the approach makes explicit which variables a test will consider and which it will not. To reduce the influence of uncontrolled variables, each hypothesis was scheduled to be tested many times.
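The article doesn’t describe the actual calendar, but as a hypothetical sketch of that scheduling step, a simple round-robin rotation gives every hypothesis several test slots spread across the weeks (the labels and dates below are invented):

```python
from datetime import date, timedelta
from itertools import cycle

# Shorthand labels for the four hypotheses listed above (assumed names)
HYPOTHESES = ["H1: holiday", "H2: bilingual", "H3: puzzle", "H4: reflection"]

def build_calendar(start: date, weeks: int) -> list[tuple[date, str]]:
    """Rotate through the hypotheses so each one is tested several times."""
    rotation = cycle(HYPOTHESES)
    return [(start + timedelta(weeks=w), next(rotation)) for w in range(weeks)]

for day, hypothesis in build_calendar(date(2019, 1, 3), weeks=8):
    print(day, hypothesis)  # each hypothesis appears twice over 8 weeks
```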

Once the calendar was set, the final step was to create the content of the messages to test the hypotheses.

We were all set.

A few months later, we were satisfied to observe a rise in engagement rates across communities. Analyzing the results more closely, we saw that many hypotheses were validated and some were not. “We welcomed the results telling us ‘the hypothesis is wrong’ with joy, as it really helps to challenge the team’s assumptions of what works and what doesn’t.” (Anukshi, a Designer at 321). The results will now be distilled into learnings, and help us take focused decisions on which elements to keep, which to change, and how.

Zooming out, we can say that our creation approach, based on focused versioning, is successful. But as the definition of a hypothesis states: “it is [just] a starting point for further investigation”…

To know more about our work, check out our website: https://321-foundation.org/

This article was originally written and published by Sophie Bereau, former Design Manager at 321 Education Foundation. You can follow her work at https://medium.com/@sophie_b


Learn. Share. Repeat. is 321 Education Foundation’s official blog, where team members share their insights and learnings about the work we do.