
Learning about learning

Insights from developing monitoring, evaluation and learning practices for the SENS initiative.

Griffith Centre for Systems Innovation
Jan 10, 2024


Introduction

The Australian Social Enterprise National Strategy (SENS) initiative started in 2020. After extensive research and engagement to scope the potential for a national strategy, Social Enterprise Australia (SEA) was established in 2022 to facilitate and lead its development. SEA also provided a ‘peak’ function for social enterprise in Australia: it sought to increase engagement with the Federal Government and other national bodies, and to support the sector to develop its collective capability and capacity and to work together, and with others, towards shared goals.

Griffith Centre for Systems Innovation (formerly The Yunus Centre, Griffith) supported SENS by undertaking the initial research and strategy development. It then played a role, alongside other partners, in incubating the establishment of SEA and, more specifically, leading the design and implementation of the SENS evaluation and learning (E&L) framework. This work has been underway for around 18 months.

In this blog, we share insights generated through the process of developing and evolving the SENS E&L framework and embedding evaluative practices. While these are specific to this work, we believe they reflect some of the general challenges of undertaking and embedding E&L in new, ambitious, and stretched initiatives operating in dynamic systems contexts.

We hope our learning will be useful to others grappling with similar challenges or to those seeking to support systems initiatives.

Juggling short-term survival with long-term success

Fuelled by ambition and driven by the desire to make a positive impact, for-purpose start-ups throw their energies into the many tasks required to set up and get going. These include fundraising, networking, recruiting staff, and raising their profile in typically noisy operating environments.

However, amidst the busy-ness of doing, limited systematic reflection and learning may hinder the prospects of long-term success. It can lead to missed opportunities, poorly allocated resources, ineffective activities, and unintended consequences. The effects of poor learning may be amplified when more ambitious and complex goals — such as systems change — are set.

Therefore, establishing a culture of learning, and the processes that support continuous improvement, is a necessary investment for early-stage organisations to make if they are to succeed in the ever-changing landscapes they operate within. Research[1] has shown a range of benefits from learning, including:

  • being a source of innovation that helps to create competitive advantages;
  • improving an organisation’s performance;
  • influencing expectations of future growth;
  • enhancing team members’ competencies; and
  • creating new knowledge.

Where we started

The initial SENS evaluation and learning framework was developed in 2022 with SEA. It laid out the principles, underlying evaluation approaches (participatory and developmental evaluation approaches[2]), and key learning questions organised around SEA’s spheres of control (or nominal control), influence, and interest[3].

We sought to answer questions related to what SEA did and what influence it had, while also looking at emergent changes in the broader social enterprise ecosystem and whether these were influenced by SEA’s activities. The primary aim was that E&L would support SEA (as the primary implementing actor of SENS) to generate timely, useful, and usable information to support its work. The approach also aimed to provide accountability to funders and sector stakeholders, and to contribute to broader learning in the sector.

Unsurprisingly, implementation has diverged from original plans and intentions. For example, from early 2023 the co-design approach to developing tools and processes was undermined by SEA’s pace of change and limited bandwidth. This was stretched largely as a result of SEA’s presence and activities generating much higher levels of interest (and demands) than anticipated — effectively, success compromised capacity.

This led to systematic E&L largely being carried out in parallel to the core work program rather than being fully integrated into it. It also provoked discussions about what was possible and by when, with an acceptance that interim arrangements would have to adjust to fit reality. We were also forced to acknowledge the challenges of working in an emergent way, where plans often change before they have even had the chance to be articulated properly. The following sections summarise where we landed in coping with constraints and rapid change. The medium-term aim remains that E&L will be fully integrated into SEA’s ways of working.

Shaping strategy and action by asking questions

Asking questions can be a powerful tool in embedding learning practices. So far in SEA, our E&L activities are helping to shape strategy and action mostly through asking, rather than answering, questions.

Even without having answers, questions from a critical friend can help focus people’s attention and build the practice of learning. As Elder and Paul (1998) argue[4]:

‘Thinking is driven not by answers but by questions.’

Here are some strategies we are using:

  • Asking questions of SEA has been helpful in pulling team members out of busy operational work, like setting up financial management and IT systems. Our questions have included: What is SEA doing? How is it doing it? Why is it doing those things? And how do activities relate to the underpinning theory of change (the original framing of what SEA exists to do)? Just the process of having to answer a question can help people to articulate their thinking and reveal gaps in their rationale and implementation logic, all of which can then trigger improvements.
  • Following a simple after action review format, we use open questions, such as ‘What happened? So what? Now what?’ The benefits of this approach are:
  1. Team members do not need a lot of preparation time (although some preparation helps).
  2. Reflections can be timed in line with key project milestones and/or used on an ad-hoc basis as opportunities arise. Most of the ‘after action reviews’ we have conducted so far have been ad-hoc.
  3. It complements SEA’s ‘probe-sense-respond’ strategy process[5], whereby SEA tests stakeholders’ interests and needs through conversations and small actions to identify demand and where limited resources can best be used.

This approach might best be described as trial-and-error learning, whereby organisations inadvertently learn from the results of previous actions, or improvisational learning, where organisations learn while activities are in motion and react to issues as they arise[6].

To enhance the benefit of asking questions and build an organisation’s ‘reflective muscles’, we learned to:

  • Encourage open dialogue and create spaces for discussions that allow team members to internalise the importance of learning and what it might mean for their roles. This has been done by:
  1. Using after action reviews to provide a simple framework for team members to ask questions of themselves and each other, thereby creating a broader culture where learning is a team responsibility and not that of a single team member.
  2. Encouraging individual team members to share and discuss their progress and reflections, so as to weave together learnings and mitigate risks of siloed responsibilities in small teams.
  • Navigate different views, which may be necessary as team members are likely to have different experiences and perspectives on what has happened and why. It’s important that team members can agree on a way forward, even if this involves compromise rather than consensus. Done proactively, naming and working through differences (rather than avoiding them) can also strengthen practices, culture, and ongoing learning.
  • Capture data generated from individual team reflections in ways that allow it to be more easily analysed over time. Planning for data analysis, not just data collection, can improve the efficiency of analysis over the long term as well as the depth of the insights as patterns emerge. Deciding how data will be used, by whom, and what for early on helps to shape how data can be collected, categorised, and sorted in ways that enable better analysis.
  • Expand who participates in reflections where possible, since different organisations have different purposes and cultures. The value, however, may not be obvious or a high priority for partner organisations, who also have time constraints.
  • Be proactive in closing feedback loops. In some cases, after action reviews have identified actions to be taken that were then not implemented because of changes in priorities or changes to the external environment. Capturing these changes is important to ongoing learning and ‘conscious’ decision making.

We also recognise that funders and boards play an influential role in helping to establish learning practices, setting the tone for learning and evaluation. A 2021 study of evaluation practices in Australia’s philanthropic sector found ‘Many boards were comfortable with anecdotal evidence regarding impactful work, as opposed to looking to evaluation and data insights to draw definitive conclusions.’[7] And, like the managers of new organisations, funders and directors may be focused on short-term priorities.

As part of the E&L framework, the SEA Board has started to ask itself a range of questions about how it wants to frame and enact ‘impact governance’. Currently, it is articulating this role through:

  • Ensuring SEA develops and implements appropriate E&L systems, processes, and practices.
  • Developing a shared view of what information the board requires of the Executive at any given time, how it relates to SEA’s strategic priorities and theory of change, and how the board will interpret and incorporate data and learning into its wider duties (e.g. adapting/directing strategy).
  • Ensuring SEA’s impact reporting obligations are met and that the needs of stakeholders with respect to information and impact measurement are anticipated and served.

The board recognises it needs to invest its own time and capability to set up and implement these functions well. Such processes may also include reaching a common understanding of what good learning practices look like, as board members come from diverse backgrounds and have similarly diverse views about learning and evaluation.

Shaping strategy and action through evidence

To improve the quality of our questions and discussions, we recognise that we also need to improve the quality of evidence. In 2024, SEA aims to create a broader body of evidence that provides insight into its effectiveness and influence. This will help SEA ask more probing questions and test the assumptions underpinning its strategy.

Because evidence gathering and documentation creates work in the short term, we are necessarily taking a light-touch approach and integrating such activities into existing management processes where possible. For instance:

  • Developing simple systems that enable SEA to record and track progress against activities, including what happened versus what was planned, and creating a simple dashboard that shows progress against core areas of work.
  • Recording information that will provide a chronicle of SEA’s work. This includes asking staff to record their reflections on key meetings (in addition to formal meeting minutes, which are often sanitised versions of discussions) and recording external ‘critical incidents’ that reflect key events in the broader sector (like policy changes, movements of key individuals between roles, and the launch of new sector alliances).
  • Collecting data through periodic events like stakeholder consultations.
  • Designing and planning stakeholder surveys and interviews in 2024.

In trying to move towards a more integrated E&L approach we are navigating the following issues:

Putting in place systematic data collection requires effective coordination around roles, routines, rules and plans, but new organisations often find coordination challenging.[8] Systematic evidence generation activities compete for time with other, often more short-term activities like getting the quarterly financial reports processed and updating the website, where people can see the immediate result. Indeed, systematic data collection may not appear to have an immediate tangible benefit because evidence needs to be built over time for it to add value. This can mean that this work continually drops to the bottom of the pile. Ongoing effort is needed to keep it on the agenda.

Effective communication is crucial, but it requires more than using common words or agreeing that evaluation and learning is important and should be embedded in everything an organisation does. Outsourcing the establishment of evaluation and learning may be seen as a short-term solution, but it doesn’t guarantee findings will be understood and/or implemented. To overcome these risks, we are:

  • Encouraging team members who work alongside external supporters in developing evaluation and learning to build in time for internal-facing engagement with colleagues, to gather their inputs and feedback and bring them along in the process.
  • Asking that team members guide external supporters on the appropriate sizing of evaluation and learning activities.
  • Preparing for gaps in understanding, or misunderstandings, that lead to extra work and adjustments to systems and practices.

Many learning and evaluation discussions focus on impact, which can contribute to an undervaluing of process information. Sometimes this may be because organisations do not see the link between processes and impact, or assume processes will automatically lead to impact because of their intent. At other times, people see evaluation as something that happens at the end of a program (as is common with government programs) and, therefore, as something that can always be dealt with later. Undoubtedly, information about both processes and impact is important, and too much emphasis on either creates problems. For new organisations and initiatives, process-related evaluative activities provide real-time, or near real-time, understanding of what may (or may not) be working. In the short term, this helps to surface where improvements are needed and can be made; in the longer term, it provides important foundational information for impact-focused evaluations.[9]

While light-touch approaches may be pragmatic, they may not provide sufficiently in-depth insights, particularly in relation to systems changes. For instance, we had initially planned to conduct interviews with key sector stakeholders every six months to gather perspectives on changes in the sector and SEA’s relative contribution. This timing was intended to create a shorter feedback cycle, increase understanding of what in the ‘system’ was changing, and help SEA adapt to emerging opportunities and risks. Several factors led us to take a more episodic approach, whereby we would conduct interviews and a survey annually (yet to be implemented). These factors included limited resources, prioritising the gathering of feedback from sector stakeholders to inform specific programming, and sensitivity to how often we could seek inputs from key informants.

End note

New organisations and initiatives juggle many tasks and pressures, and structured learning is often seen as a luxury that can be done later, when the mission has matured and there is more time. However, practising learning from the outset has an immediate and critical impact, determining the trajectory and rate of development.

Our key learning about learning is that it has to be structured and enacted in ways that are pragmatic, manageable, and meaningful in order for busy teams to buy into it. And while E&L processes and practices are always somewhat subject to entropy, dedicated capacity (even if initially external) and top-down expectations can be important enablers for integrating them into core culture and ways of working over time.

Footnotes

[1] Sekliuckiene, J., Vaitkiene, R., & Vainauskiene, V. (2018). Organisational Learning in Startup Development and International Growth. Entrepreneurial Business and Economics Review, 6(4), 125–144; Bae, B., & Choi, S. (2021). The Effect of Learning Orientation and Business Model Innovation on Entrepreneurial Performance: Focused on South Korean Start-Up Companies. Journal of Open Innovation: Technology, Market, and Complexity, 7(4), 245; Yang, Y., Zheng, Y., Xie, G., & Tian, Y. (2022). The Influence Mechanism of Learning Orientation on New Venture Performance: The Chain-Mediating Effect of Absorptive Capacity and Innovation Capacity. Frontiers in Psychology, 13, 818844.

[2] Better Evaluation (2021). Participatory evaluation. Accessed December 2023: https://www.betterevaluation.org/methods-approaches/approaches/participatory-evaluation; Better Evaluation (2021). Developmental evaluation. Accessed December 2023: https://www.betterevaluation.org/methods-approaches/approaches/developmental-evaluation

[3] This framing is from another evaluation approach, outcome mapping, but we did not use the full approach.

[4] Elder, L., & Paul, R. (1998). ‘The Role of Socratic Questioning in Thinking, Teaching, and Learning’. The Clearing House, 71(5), 297–301.

[5] Snowden, D.J., & Boone, M.E. (2007). ‘A Leader’s Framework for Decision Making’. Harvard Business Review. https://hbr.org/2007/11/a-leaders-framework-for-decision-making

[6] Jones, M., & Schou, P.K. (2023). ‘Structuring the start-up: how coordination emerges in start-ups through learning sequencing’. Academy of Management Journal, 66(3), 859–893.

[7] The study concluded that governance (boards and trustees) was important in influencing evaluative (and learning) practices and critical to strategic thinking at the governance level. Weak board-level interest in evaluation practice, and a lack of understanding and curiosity around social impact among boards, impeded progress. Asia Pacific Social Impact Centre (2021). Philanthropy: The continued journey to real impact and better practice. Are evaluation, strategy and social impact frameworks the key to achieving change? Asia Pacific Social Impact Centre, Melbourne Business School, University of Melbourne.

[8] Jones, M., & Schou, P.K. (2023).

[9] Rogers, P., & Woolcock, M. (2023). Process and Implementation Evaluations: A Primer. Center for International Development Faculty Working Paper No. 433, May 2023. Harvard University.

Contributors

Dr Donna Loveridge
GCSI, Adjunct Industry Fellow and Evaluation + Learning Lead for SEA

Alex Hannant
GCSI, Executive-in-Residence and lead on SENS project

Jess Moore
Social Enterprise Australia, CEO
