Ride2Work User testing

With everything complete and ready for setup, I put the whole course up for user testing.

Before anything else, I made sure the user test included a before-and-after quiz so that I could gather data from my users.

The user tests were conducted in a professional environment with a supervisor present. The sessions were also screen recorded with audio, so we could watch each user go through the game and note things like where they clicked. Thankfully, we managed to stick to the time limit. I was able to test two different users on my prototype and collect feedback from both of them.

Surveys

Overall, the tests were successful. The users were able to complete the task and pass, with little to no need for me to intervene or take over. This was because I ensured there was sufficient information throughout, which is supported by the responses to my survey.

The first survey was completed before the initial test. Its questions gathered data about what kind of learner each user is: how they like to learn, what tools they use, how confident they feel, and so on. I did this to gain some insight into how those traits would affect their experience with my eLearning activity.
Looking back on this data, I can see that I tested a range of learners, which is good, because I built my course hoping to offer broad activities that suit all learners.

This survey was conducted online so that the users could complete it easily.

Changes

If I had to make any changes, I would incorporate more hints, because the test itself was quite difficult for some people. I noticed in the test scores that all users passed, but only barely, so this needed to be addressed with some extra help.

I would also test with more people so that I could gather more data.

This was the final test I hosted for people's feedback. Much of the feedback said the course was easy to use and interactive, and that it conveyed the relevant information for the learning activity.

This feedback was vital to me because it validated which aspects of the learning activity worked and which did not.

I'm glad that my activity was neither confusing nor boring, and that it served its intended purpose of conveying information. Based on this feedback, what could be improved is the initial wording and structure of the questions in the learning course, so that it is a bit less confusing.
