Proving the concept: Avoiding the hack day prototype graveyard

Comic Relief Engineering
Comic Relief Technology
Jul 31, 2018

Back in February, we held a hack day, which led to this post on practical tips for organising one. One of the problems we tried to solve on the day came from our Impact & Investment team: they were receiving a lot of applications for funding that didn’t meet the eligibility criteria we’d set out. This wasted the valuable time of both the organisations completing applications and the team at Comic Relief who review them.

The hack day gave us two great possible solutions to the problem. Both were based on a similar assumption: that giving organisations a way to quickly check whether they might be eligible against our criteria before starting an application would reduce the number of ineligible applications.
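To make the idea concrete, here’s a rough sketch of how a checker like this could work. It’s a minimal illustration with entirely made-up questions and names, not our real criteria or code:

```typescript
// Hypothetical sketch of an eligibility checker: a short list of yes/no
// questions, each tied to a criterion. Questions and ids are illustrative
// only, not Comic Relief's real rules.
interface Question {
  id: string;
  text: string;
  eligibleAnswer: boolean; // the answer that keeps an organisation eligible
}

const questions: Question[] = [
  { id: 'uk-registered', text: 'Is your organisation registered in the UK?', eligibleAnswer: true },
  { id: 'over-income-cap', text: 'Is your annual income above the funding cap?', eligibleAnswer: false },
];

// Eligible only if every answer matches the eligible answer for its question.
function isEligible(answers: Record<string, boolean>): boolean {
  return questions.every(q => answers[q.id] === q.eligibleAnswer);
}

console.log(isEligible({ 'uk-registered': true, 'over-income-cap': false }));  // true
console.log(isEligible({ 'uk-registered': false, 'over-income-cap': false })); // false
```

The appeal of something this simple is that an organisation gets an answer in a minute or two, rather than finding out they were ineligible after completing a full application.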

To avoid our solutions falling into the ‘hack day prototype graveyard’, we wanted to test the prototype on our website and see whether our assumptions were correct. Once we were back in the office, we put a team together to turn one of the prototypes into a working proof of concept on the website, which we managed to do within a month!

The results of our test are in and I thought I’d share them with you… once I’ve shared a few things we did which helped us along the way.

Stop! Collaborate and listen

Getting this proof of concept together so we could test it required a team who could not only design and build it but also formulate the right questions to ask organisations that used it.

The Digital and Innovation team are many things, but experts in our eligibility criteria we are not. So we brought in a reinforcement from the Impact & Investment team to join us and apply their expertise throughout the process. Working as one team meant we could make decisions quickly, work out which key questions we needed to ask and never be blocked by having no one to review our work. I know everyone talks about how collaboration is key, but that’s because it is. By involving the right people from the start, we had the knowledge and expertise we needed to get something working quickly.

Lock down the scope and know what success is

It’s easy to get lost in the detail of what could be when working on a proof of concept. The problems it could solve, the features it could have, the things it could do! To avoid this, we agreed as a team from the start what the scope of the test would be, which assumptions we were testing and what success would look like. And we wrote them down.

I’ve found that these conversations often happen but are never documented, so when it comes to assessing whether something worked, you’re never quite sure which assumptions were proved or disproved.

Having the assumptions and scope written down also helped us to make decisions like what to include in the proof of concept and where it would sit on the website. We made sure we shared this document with stakeholders throughout the Impact & Investment team so that everyone had the same expectations of the work we were doing.

The Scope

We agreed the test would cover two initiatives that would open for applications within a couple of days of each other. We launched what we had begun to refer to as ‘the checker’ on the website at the same time and took it down when the two initiatives closed for applications.

We knew there was an initiative on the website that had been open for some time before ‘the checker’ would go on the site. To give our users a simple journey, we covered that initiative’s criteria in our questions but agreed it was out of scope for measuring success.

What is success?

This was a test, so we needed to know what would determine whether it was a success or not. There was one obvious big measure we could use: the number of ineligible applications received for each initiative. Because we had agreed the scope of the test up front, we could benchmark in advance against similar previous initiatives and the percentage of ineligible applications each had received.

We looked at six comparable initiatives and the number of ineligible applications for each. This gave us a figure of 23% to beat as part of the test.

We knew that if this figure dropped for the two initiatives in scope, we could consider our test a success.
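For the curious, one straightforward way to calculate a benchmark like this is to pool applications across the past initiatives. The sketch below uses made-up figures to show the arithmetic; only the 23% target comes from our real data:

```typescript
// Hypothetical benchmark calculation. The exact method isn't described
// above; this version pools ineligible applications across past initiatives.
interface Initiative {
  name: string;
  total: number;      // total applications received
  ineligible: number; // applications that failed the eligibility criteria
}

function pooledIneligibleRate(initiatives: Initiative[]): number {
  const total = initiatives.reduce((sum, i) => sum + i.total, 0);
  const ineligible = initiatives.reduce((sum, i) => sum + i.ineligible, 0);
  return (ineligible / total) * 100;
}

// Made-up figures for illustration, not our real application numbers.
const past: Initiative[] = [
  { name: 'Initiative A', total: 200, ineligible: 50 },
  { name: 'Initiative B', total: 120, ineligible: 24 },
];
console.log(pooledIneligibleRate(past).toFixed(1)); // "23.1" for these figures
```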

The assumptions

One of our main assumptions was that organisations would actually use the checker. It’s an obvious one, but it’s important: to test whether the proof of concept worked, we needed people to use it!

This made us think carefully about the journey into it and the call to action to use it. I’m pleased to report that this particular assumption was proved, with nearly 3,000 events recorded for the button click that took users into the checker.
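I haven’t said which analytics tool recorded those clicks, but to illustrate how a figure like that can be gathered, here’s a minimal sketch assuming Universal Analytics (widely used in 2018); the element id and event names are placeholders, not our real markup:

```typescript
// Hypothetical click tracking for the checker's call-to-action button.
// Assumes Universal Analytics' ga() is loaded on the page; the element id
// and event category/action/label below are illustrative only.
declare function ga(command: string, hitType: string, category: string, action: string, label?: string): void;

const cta = document.getElementById('eligibility-checker-cta');
if (cta) {
  cta.addEventListener('click', () => {
    // Each click is recorded as one analytics event, which is where a
    // figure like "nearly 3,000 events" can be read off the dashboard.
    ga('send', 'event', 'Eligibility Checker', 'click', 'start-checker');
  });
}
```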

Did we prove the concept?

I know by now you’re all wondering whether our proof of concept worked! Did the checker reduce those ineligible applications? Was time saved for all?

I can confirm that our test was a success for both initiatives! Only 3% of the applications submitted for the first initiative were ineligible and the second initiative’s rate was a little higher at 13%. Both were a huge improvement on that 23% figure we were aiming to beat.

We’re now putting what we learnt to use, iterating on the proof of concept to develop it into something we hope will become a more permanent feature on our website for organisations to use. Watch this space!
