Five things Brightside learned through a DataDive project with DataKind UK

Published in DataKind UK · 6 min read · Jan 23, 2023

By Louise Jones, Head of Impact at Brightside

[Image: a Zoom screenshot of about 20 young adults smiling, seated in their living rooms — the online team of volunteers at Brightside’s virtual DataDive weekend]

At Brightside, we strive for positive and lasting impact. Over the last 18 years, we have learned a huge amount about how to run successful mentoring programmes, and we’ve also learned that measuring the impact of mentoring on mentees’ thoughts and feelings is complex. In 2019/20, 95% of mentee respondents felt mentoring had helped them feel optimistic about the future, but the proportion who recorded an increase in hope between the start and end of mentoring was much lower. While we consistently see significant positive change at the overall level for human and social capital, our behavioural outcomes show less change because individual mentees report a mix of positive, negative and no change in their responses. This is despite overwhelmingly positive feedback about their mentoring experience.

One of the great things about Brightside’s mentoring is the fact that all communication is text-based, and captured via our safe and secure mentoring platform. It means that we have the content of a mentoring conversation and can link it to a mentee’s baseline and exit survey responses. We knew we could learn so much from looking at this data together, but didn’t know where to start. And that’s where DataKind UK came in.

DataKind UK helps social organisations use data science to increase their impact. We had heard excellent things from contacts and partners about their DataDive programmes — 6–8 weeks that culminate in a weekend event where data science volunteers work on your data to help you answer a set of key questions for your organisation. In May 2021, 25+ volunteers spent a whole weekend with data pulled from around 1,500 mentoring conversations, helping us investigate the following questions:

  • What mentee, mentor, and project-specific factors relate to positive change in outcomes and good quality scores? How do different mentee outcomes relate to each other?
  • What messaging behaviours (e.g. message length, response time, sentiment, number of messages) by mentors and mentees relate to more positive outcomes and quality ratings?
  • Which topics in mentee and mentor messages relate to positive outcomes and quality ratings?

It was a weekend full of energy, enthusiasm and insights. Read on for five things we learned from our DataDive project and what they mean for Brightside:

1. There is no silver bullet for impact; mentoring is made up of many elements all working together.

I thought we might stumble upon one or two game-changers within our programmes that consistently achieve high impact and high quality for the mentees taking part. It can be tempting to look for a single major element you can add to a programme and be assured it will work. But that’s not what happened, and it’s not surprising. The DataDive project didn’t highlight any project factors, messaging behaviours, or topics that were particularly correlated with impact on capital and behavioural outcomes. This was encouraging, in that it didn’t find particular topics from our mentoring guides, or project models that we deliver each year, to have a negative impact. But it further highlighted that understanding why and how a mentee developed their growth mindset, for example, is challenging and complex. The DataDive’s outcomes emphasised, for us, the importance of tracking and testing project inputs (i.e. if we change this one small thing for some mentees and not others, what happens?). It also reminded us how important our qualitative data is to understanding why change happens.

2. Messaging behaviours impact the quality of a mentee’s experience, and we can influence this.

While the DataDive project didn’t find correlations between messaging behaviours and impact scores, there were some relationships with the quality ratings mentees gave at the end of a programme. For example, information provision and signposting (measured through sharing of URLs and file attachments) were associated with higher quality scores. We already advocate these behaviours in mentor training and support materials, but being able to explain to mentors and partners that this is linked with higher quality ratings from mentees makes it more powerful and more likely to be put into practice. Other findings were more surprising and challenged some of our assumptions. We understand that, for mentees, coming up with questions for a mentor can be challenging, and so we focus a lot of advice for mentors on the importance of asking open questions to prompt discussion. The DataDive project found little to no relationship between mentor questions and mentee quality scores, but it did find one between mentee questions and quality scores. While this doesn’t mean our hypothesis about mentor questions is false, it suggests that doing more to equip mentees to come up with and ask questions could have a positive result.
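As a loose illustration of this kind of analysis — not Brightside’s actual pipeline, and with entirely invented numbers — one could correlate a simple URL-sharing count per conversation with the quality rating the mentee gave at the end:

```python
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient for two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-conversation features: how many URLs the mentor shared,
# and the quality rating (1-5) the mentee gave at the end of the programme.
urls_shared = [0, 1, 2, 3, 4]
quality_scores = [2, 3, 3, 4, 5]

r = pearson(urls_shared, quality_scores)  # positive r suggests an association
```

A positive coefficient on real data would echo the finding above, though a correlation like this says nothing on its own about causation.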

3. Don’t skim over the missing data.

When preparing for the DataDive weekend, we focused on collating a complete data set (i.e. the messages sent and received by mentees for whom we also had both baseline and exit survey responses). However, during the weekend, volunteers noted that we were looking at a set of mentees who predominantly engaged really well and reported high quality experiences. If we had also looked at messaging content and behaviours from mentees who did not engage long enough to ‘complete’ the programme (i.e. complete an exit survey), we might have seen more variation and potentially some additional correlations. One of the main recommendations from our team of Data Ambassadors, volunteers who guided us through the DataDive project from beginning to end, was to repeat the analysis for those less engaged mentees, which may highlight behaviours — or a lack of behaviours — that increase the risk of non-completion.

4. Establishing ‘normal’ behaviours could help to address mentee retention and conversion.

The fact that the DataDive project focused on mentees who ‘completed’ a programme means it can serve as a guide to ‘usual’ mentee and mentor behaviour. We learned that a mentee who let more than 14 days pass after their last received message rarely replied and went on to finish the programme. From this, we’ve confirmed our assumption that frequent communication is important for engagement, and we can build on our existing mechanisms (such as automated engagement reminders from our app) to focus our support for mentees on the 14 days following a received message, where it is most likely to make a difference.
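A minimal sketch of how that 14-day signal could be monitored — the timestamps, field layout, and threshold handling here are invented for illustration, not our platform’s actual implementation:

```python
from datetime import datetime, timedelta

# Threshold suggested by the DataDive finding: replies after 14 days are rare
STALE_AFTER = timedelta(days=14)

# Hypothetical (received_at, replied_at) pairs for one mentee's exchanges
exchanges = [
    (datetime(2021, 5, 1), datetime(2021, 5, 4)),    # replied in 3 days
    (datetime(2021, 5, 10), datetime(2021, 5, 30)),  # replied after 20 days
]

# Flag gaps longer than the threshold, where a nudge is least likely to help
stale_gaps = [reply - received for received, reply in exchanges
              if reply - received > STALE_AFTER]
```

In practice a check like this would run on unanswered messages as they age, so that reminders land inside the 14-day window rather than after it.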

5. The power and potential of data science for social change organisations.

We could have written a much longer list of key takeaways from our first DataDive project, but had to dedicate one learning point to the value of data science for organisations like Brightside. Some insights can be applied straight away to our day-to-day work, and others will help inform our next Impact Strategy. Seeing 25+ data experts give up their weekend to help Brightside increase our impact on young people was inspiring. During the eight-week lead-in, when we worked with our amazing Data Ambassadors on refining our questions and preparing the data, it really felt like we had four additional Brightsiders who understood and were invested in our mission. The DataDive project has shown us the value of building relationships with organisations like DataKind UK, as well as the wealth of opportunities for learning and upskilling your team that exist in the sector through events such as the Data4Good Festival.

We are excited to continue using our DataDive’s insights to inform our decision-making and approach to programme design and delivery. We will be talking about our progress with partners and our wider network over the coming months, but please do get in touch if this has sparked your interest in a discussion about mentoring!

A huge thank you to the DataKind UK team, all the volunteers that attended the DataDive weekend, and our wonderful Data Ambassadors and trouble-shooters: Marlene, Agata, Darren, Katy and Nick.

If you think that DataKind UK could support your organisation with its data question(s), take a look at the free support we offer here.
