Consult, Design, Test, Repeat

Agile and Human-Centered Digital Training Development for Micro- and Small Enterprises

Strive Community
Mastercard Strive
8 min read · Jul 20, 2022


The following is a guest article by TechnoServe, which offers practical insights on agile and human-centered development approaches to designing digital training content for micro- and small enterprises.

How do you design a global digital training program that meets the needs of small business owners in Abuja and Almaty, Mombasa and Michoacán?

We confronted this question while designing Strive Community’s digital training toolkit on growing a successful e-commerce business for micro- and small enterprises (MSEs) around the world. We had to learn what information entrepreneurs needed and how to deliver it, and we had to learn it quickly.

To get the answers, we adopted an agile and human-centered development process: gathering user feedback, using it to design the training, testing the content, and then adjusting our approach in real time and revising content based on the results.

This type of consultative, iterative approach is useful not only for global programs like Strive Community, but for any design process. This article describes how we implemented this approach and what we learned in the process.

Our process in five steps

1) Needs assessment: understanding entrepreneurs’ needs and high-priority topics

The first step was to conduct a needs assessment to determine the training topics and delivery channels. We recruited 120 MSE owners from Brazil, India, Indonesia, Kazakhstan, Mexico, and Nigeria through our existing networks of entrepreneurs and targeted advertising on social media. These MSE owners had between two and ten employees, were focused primarily on retail, and were already selling online. We conducted online surveys and phone interviews with these entrepreneurs, as well as with e-commerce experts, to identify the topics on which MSEs most urgently needed training. We also asked entrepreneurs about their device and app usage, internet access, and how much time they could dedicate to learning new skills.

Based on this needs assessment, we developed initial hypotheses to further test in our pilot:

  • Focus on promotion and digital marketing (e.g., advertising, keywords, cross-/up-selling), customer segmentation, and competition (e.g., pricing strategy)
  • Focus on selling on social media, which is where most MSEs are
  • Deliver training via smartphone, which has the highest penetration rate compared to other devices
  • Create efficient training programs that take one to two hours per week at most

2) Small-scale testing of learners’ preferences for content presentation and style

Once we had a clearer picture of our target segment’s needs, we quickly tested their preferences for digital training content at a small scale to establish some “ground rules” for our content development.

We asked entrepreneurs from the needs assessment and new recruits from social media outreach for feedback on sample video and training content. To ensure global representation, we sampled a variety of countries and targeted a minimum of ten responses per country.

Using WhatsApp, we sent a series of Google Form surveys with multiple choice, ranking, and open-response questions to test user preferences such as how characters are portrayed (their image and voice) and how content is presented (perspective, level of technical language, balance of text and visuals, and degree of interactivity).

For example, we asked entrepreneurs to compare different character styles and voices, levels of technical language, text-to-visual balances, and degrees of interactivity.
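To give a sense of how lightweight this kind of analysis can be, the sketch below shows one way responses exported from a form tool could be tallied by country. It is a minimal illustration only; the file name and column names (e.g., “country”, “preferred_narrator”) are assumptions, not our actual survey schema.

```python
# Minimal sketch: tally multiple-choice preference responses by country.
# The CSV layout and column names are hypothetical examples.
import csv
from collections import Counter, defaultdict

def tally_preferences(path, question="preferred_narrator"):
    """Count answers to one multiple-choice question, grouped by country."""
    counts = defaultdict(Counter)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["country"]][row[question]] += 1
    return counts

if __name__ == "__main__":
    for country, tally in sorted(tally_preferences("preference_survey.csv").items()):
        total = sum(tally.values())
        # Flag countries that fall short of the ten-response minimum we targeted.
        note = "" if total >= 10 else " (below 10-response target)"
        print(f"{country}: {tally.most_common(3)}{note}")
```

Even a tally this simple makes it easy to see where preferences are consistent across countries and where they diverge.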

3) Content development

Based on the findings from the needs assessment and small-scale surveys, we created ten modules on high-priority e-commerce topics across the customer journey.

4) Rapid testing and feedback for each module (10x)

Once each module was developed, it was immediately tested “in the field” with a pilot group of approximately 90 English-speaking MSEs from Kenya, Nigeria, and India. Users were given one to two weeks to complete each module, followed by a one- to two-week period for gathering and analyzing feedback.

For each of the ten modules, we collected qualitative and quantitative user feedback through multiple channels:

  • End-of-module survey: ratings on overall satisfaction, relevance and usefulness of the content, and likelihood to adopt practices, as well as open-ended feedback on likes, dislikes, and questions
  • One-on-one interviews / advisory sessions: phone or video conversations between entrepreneurs and the program team to gather in-depth feedback on the course experience and practice adoption, and to give entrepreneurs a chance to ask for advice
  • Group Q&A sessions: group video calls to assess actions learners took after the module, identify frequently asked questions, and offer advice
  • Discussion forums and WhatsApp groups: monitoring users’ posts and messages to assess engagement, pain points, etc.
  • Assignment submissions: analyzing learners’ coursework to assess understanding of material, likelihood to adopt practices, and content’s user-friendliness and difficulty level

Among these feedback channels, the surveys and interviews were most useful for understanding content strengths and specific areas for improvement, while users’ discussion posts and coursework were most meaningful for assessing practice adoption.
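As an illustration of how the quantitative side of this feedback can be rolled up across modules, a short script like the sketch below could average end-of-module ratings per module. The column names (“module”, “satisfaction”, “relevance”, “likely_to_adopt”) are hypothetical stand-ins for the actual survey fields.

```python
# Illustrative only: summarize end-of-module survey ratings per module.
import pandas as pd

def summarize_module_feedback(path="module_feedback.csv"):
    df = pd.read_csv(path)
    # Pair each mean rating with the response count so low-sample modules
    # are not over-interpreted.
    return (
        df.groupby("module")[["satisfaction", "relevance", "likely_to_adopt"]]
          .agg(["mean", "count"])
          .round(2)
    )

if __name__ == "__main__":
    print(summarize_module_feedback())
```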

Feedback from testing was then used to shape the development of ongoing and future modules. Over the testing and iteration cycles, our work became increasingly refined and responsive to user feedback.

5) Revisions to prior modules

At the end of all ten testing cycles, we further revised the content based on cumulative learnings and specific feedback on individual modules.

What we learned: Developing effective digital training content for MSEs

Our testing process yielded rich insights about what works well in digital training for entrepreneurs. These findings fall under three themes.

1) Content should be presented in an accessible and engaging way.

MSEs…

  • Want less text and more visuals, video, and audio.
  • Appreciate learning new terminology and technical concepts when they are easy to understand and explained step by step.
  • Prefer active learning (e.g., quizzes and exercises) to passive learning.

Two- to three-minute animated videos for each module were engaging, convenient, and memorable.

A highly interactive and visual approach promoted learning and engagement.

2) Entrepreneurs want “real” content that reflects their experiences.

MSEs…

  • Prefer relatable human characters in animated videos.
  • Prefer to learn from a “peer” / fellow entrepreneur rather than an instructor.
  • Appreciate real-life examples and case studies grounded in the experiences entrepreneurs actually face.
  • Want to see a diversity of business types represented to better extrapolate to their own businesses.

Examples used throughout were based on real entrepreneurs.

3) Content should focus on the urgent, practical, and actionable.

MSEs…

  • Have limited time / capacity, so content should be as brief as possible and immediately get to actionable practices.
  • Particularly appreciate “easy wins” that make a difference in their business.
  • Found the assignment templates at the end of each module, which helped them apply lessons to their own business, to be one of the most useful parts of the course.

Templates / worksheets at the end of each module allowed learners to immediately apply lessons to their business.

Addressing challenges in the development process

Our rapid testing and feedback-intensive approach presented unique challenges.

1) Challenge: Remaining agile and balancing time between content development and testing

Testing and developing content in parallel placed pressure on timelines, as we needed to test modules sequentially and quickly incorporate feedback into the development of the next module.

Solutions

  • Allocate more time to development and testing on earlier modules where the learning curve is the steepest. Later modules’ timelines can be compressed as learnings are established.
  • When time is limited, focus on the highest-value and most urgent feedback, circling back to other improvements at the end of the process.

2) Challenge: Avoiding testing fatigue and maintaining user engagement in digital channels

Testing and developing in parallel extends the length of the pilot program, which risks users dropping out or becoming less active. A feedback-intensive process also runs the risk of fatiguing users.

Solutions

  • To reduce the burden on users, iterate on and streamline feedback collection over time to prioritize the highest value-add questions.
  • Analyze natural outputs of the learning process, like users’ coursework and messages, that don’t require additional effort from them.
  • Make the feedback process mutually beneficial. For example, we gathered feedback in interviews that doubled as advisory sessions and group Q&A sessions with experts that learners found highly valuable.
  • Build community (virtually). Learners are more engaged when they understand the role their input plays and feel like part of a learning community. We did this by creating opportunities for interaction, like WhatsApp groups and virtual meetings.

3) Challenge: Limitations of user feedback

Users are often reluctant to give honest, critical feedback, and they cannot always clearly articulate what they need or what is missing from the content (after all, they don’t know what they don’t know!).

Solutions

  • Supplement direct user feedback with additional data points. Getting insights from testing often requires “reading between the lines” and observing user behaviors. For example, dropout rates, time spent on the content, engagement with coursework, frequently asked questions, and user practice adoption all yield important information on the efficacy of the content.
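As a purely illustrative example, a few lines of Python can turn a simple engagement log into these indirect signals. The log format assumed here (one row per learner per module, with “completed” and “minutes_spent” fields) is an assumption, not any particular platform’s export.

```python
# Illustrative sketch: derive dropout rates and typical time-on-module
# from a hypothetical engagement log.
import csv
from statistics import median

def module_signals(path="engagement_log.csv"):
    started, completed, minutes = {}, {}, {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            m = row["module"]
            started[m] = started.get(m, 0) + 1
            if row["completed"] == "yes":
                completed[m] = completed.get(m, 0) + 1
                # Only learners who finished contribute to the time estimate here.
                minutes.setdefault(m, []).append(float(row["minutes_spent"]))
    for m in sorted(started):
        dropout = 1 - completed.get(m, 0) / started[m]
        time_note = (f"median time {median(minutes[m]):.0f} min"
                     if m in minutes else "no completions")
        print(f"Module {m}: dropout {dropout:.0%}, {time_note}")

if __name__ == "__main__":
    module_signals()
```

Signals like these are most useful when read alongside the qualitative feedback, since a high dropout rate alone doesn’t say whether the problem is content, length, or connectivity.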

Conclusion: Key takeaways on agile and human-centered development of training content for MSEs

1. Leverage small-scale survey or A/B testing early in the process to establish users’ preferences and digital capabilities before you start development. This can help ensure that content is broadly aligned with user needs from the start.

2. Use an agile and iterative approach with multiple rounds of testing to gather “real-time” feedback during content development. This “rapid testing” approach avoids costly redesigning of content later and allows for continuous improvement, culminating in a better final product.

3. Create mobile-friendly training content that is visual and engaging, representative of real experiences, and focused on immediate actions MSEs can take to make a difference in their business.

4. Engage your community of users. Users like to know that their input matters and are more engaged if they are part of a community and mission. Make the feedback process valuable for them as well; for example, have focus groups or Q&A sessions that double as opportunities for users to get their questions answered.

5. Leverage indirect feedback. To avoid fatiguing users with too many questions and to derive more accurate insights, observe how users engage with the content and examine outputs like assignments, discussion posts, messages, etc., to understand the training’s gaps and successes.

6. Be prepared to change your process. Through iteration, we found that we needed to rebalance our resources to bring in more experts with direct subject matter experience and increase our investment in visual resources. A flexible budget and workplan are critical for adapting to user feedback.

Developing effective digital training content will become only more important in the years to come. By taking a consultative, iterative approach, organizations can efficiently and effectively develop content to help entrepreneurs grow their businesses and build stronger livelihoods.
