Science Olympiad: A case study

Shea Hunter Belsky
11 min read · Apr 13, 2019


Every year, hundreds of students, parents, and coaches fill places like Cornell University’s Statler Auditorium for Science Olympiad tournaments. This case study dives into the ups and downs of running these tournaments, and presents a new solution for making tournament management and participation easy: Ezra.

For the past four years, I have been involved with the Science Olympiad community as a tournament organizer of Science Olympiad at Cornell’s yearly invitational tournament (SciOly@Cornell). I wanted to understand how technology and user experience design could serve the needs and interests of this unique community.

Science Olympiad: What is it?

Science Olympiad is an interdisciplinary tournament that sees students compete against their peers in a variety of academic disciplines. Events that students compete in include Anatomy & Physiology, Experimental Design, Mousetrap Vehicle, Herpetology, Wright Stuff, and Circuit Lab. In middle school and high school, students compete at the regional level, then move on to the state level, and finally the national level. Teams can also compete in invitationals, which allow them to practice and prepare in an environment similar to what they will experience during normal tournaments. Science Olympiad features over 8,000 teams from across the United States, and is in its 35th year as of 2019.

Problem: No crosstalk between different software

One of the largest pain points in running a tournament was managing all the information of every team that was going to compete. The largest tournaments have over 70 teams, and with up to 15 students per team, plus 2 coaches on average per team, that’s 1,190 people per tournament. A variety of software is used to manage data from countless different sources. But once you have it, that data doesn’t copy over nicely into other systems. Teams need to self-schedule for time slots during the day to compete in certain events, they need to have scores entered for every event, the winners need to be put into a presentation, and the results need to be made public to all the parents who want to know how their children did. At worst, that’s five different pieces of software into which the same information has to be copied every single time.
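That headcount is easy to sanity-check. A back-of-envelope sketch (the numbers come from the paragraph above, not from any real roster):

```python
# Back-of-envelope headcount for a large invitational:
# 70 teams, up to 15 students each, plus ~2 coaches per team.
teams = 70
students_per_team = 15
coaches_per_team = 2

people = teams * (students_per_team + coaches_per_team)
print(people)  # 1190
```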

How is this problem currently being addressed?

At the time I was thinking about these issues, the only major solution for Science Olympiad tournament management was Avogadro. It took a lot of the pain out of running a tournament: online event signups, score entry, presentations, and results, all on one platform.

Avogadro: The results page from Massachusetts’ Division C state tournament

But from the tournament organizer’s perspective, there was still more to be desired. Other members of SciOly@Cornell and I used this software for our invitational, volunteered at tournaments that used it, and found it lacking in certain areas. How can schools register for tournaments on their own, rather than needing to be added by organizers? Can organizers add custom content and media to their tournament page? What about customizing the registration form, as on SurveyMonkey, Google Forms, or Typeform? Or maintaining information on a team’s long-term performance? It was a good solution, but it could be even better.

Who is the target audience? What are they trying to accomplish?

There are several target audiences among the Science Olympiad community. Some overlap is present between the audiences, though they each have distinct objectives in mind. I was able to have casual conversations and formal user interviews with a multitude of people across the Science Olympiad ecosystem to hear their voices. Questions in these conversations and interviews included:

  1. In your own words, what are your top goals at a Science Olympiad tournament?
  2. Do you find that it takes a lot of work to accomplish those goals? Or is it easy to get them done?
  3. If accomplishing these goals could be easier, how would you make it easier?
  4. If it’s already easy to achieve your goals, what could get in the way of making it easy?
  5. How often do you use technology leading up to, on the day of, and after a tournament?
  6. How, if at all, could technology make it easier for you to achieve your goals?
  7. In what ways might technology impede your ability to achieve those same goals, if at all?

The first thing I discovered was that anything meant for the Science Olympiad community should include more than just the showrunners. My initial goal was to target tournament organizers, but I soon broadened this to include competitors, parents, team coaches, event judges, and score counselors. Science Olympiad represents teamwork and cooperation from many different angles. Thus, anything large-scale that addresses Science Olympiad should be inclusive of everyone who participates in it.

Competitors and Parents: They want to know how they did after a tournament day. They may only have their cellphone on them, and want to know as much about their score as possible in a format that lets them scroll through events and scores quickly. Asking their coach for the results can be difficult if the coach is preoccupied with post-tournament details (ensuring everyone is on the bus home, that everything they brought to the tournament is still with them, and so on).

Similarly, parents of competitors want to know how their children have done in a tournament. Depending on the coach for this information can place a lot of stress on a small number of people.

Historically, tournament organizers upload Excel spreadsheets of the day’s final results after everything is over. Being able to actually read these results can be difficult on mobile devices, and it’s still not great on laptops and desktop computers.

Coaches: They want to register for tournaments, including their local regional tournament and invitationals. Re-entering the same information for each tournament is a pain; entering it once and having it carry over would be far easier.

Coaches also need to self-schedule their teams for events. Certain events allow teams to choose when during the day they want to compete, and others happen at a specified time. For the former, coaches should be able to go somewhere and submit their signups.
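Self-scheduling boils down to claiming one of a fixed set of time slots on a first-come, first-served basis. A hypothetical sketch of that data model (the class, event names, and times are illustrative, not Ezra’s actual implementation):

```python
# Hypothetical sketch of self-scheduled event signups: each self-schedule
# event exposes a fixed set of time slots, and a coach claims one slot
# per team on a first-come, first-served basis.

class EventSchedule:
    def __init__(self, event, slots):
        self.event = event
        self.signups = {slot: None for slot in slots}  # slot -> team or None

    def open_slots(self):
        """Slots no team has claimed yet."""
        return [s for s, team in self.signups.items() if team is None]

    def sign_up(self, team, slot):
        """Claim a slot; reject the claim if another team already holds it."""
        if self.signups.get(slot) is not None:
            raise ValueError(f"{slot} is already taken for {self.event}")
        self.signups[slot] = team

schedule = EventSchedule("Mousetrap Vehicle", ["9:00", "9:30", "10:00"])
schedule.sign_up("Ithaca HS", "9:30")
print(schedule.open_slots())  # ['9:00', '10:00']
```

Events that happen at a fixed time would simply not expose a schedule like this; only the coach-selectable events need one.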

Teams compete at many tournaments during a season, but they have no meaningful metric of performance over time. Coaches can calculate this information on their own, but combining the information from all these tournaments that are in very different formats (Excel, online, PDF) can be time-consuming.
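Once every tournament’s results live in one system, that season-long metric becomes a simple aggregation rather than a manual merge of Excel sheets, PDFs, and web pages. A hypothetical sketch (the tournament names and placements are made up for illustration):

```python
# Hypothetical sketch: with all results in one place, a team's season-long
# performance is a simple aggregation instead of a manual cross-format merge.
results = [  # (tournament, team, overall placement)
    ("Cornell Invitational", "Ithaca HS", 4),
    ("Regionals",            "Ithaca HS", 2),
    ("States",               "Ithaca HS", 6),
]

def season_summary(team, results):
    """Summarize one team's placements across every recorded tournament."""
    placements = [p for _, t, p in results if t == team]
    return {
        "tournaments": len(placements),
        "best": min(placements),
        "average": sum(placements) / len(placements),
    }

print(season_summary("Ithaca HS", results))
# {'tournaments': 3, 'best': 2, 'average': 4.0}
```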

Event Judges/Score Counselors: Judges and score counselors are assigned to one or more events to grade them and validate their scores. Only once scores have been entered, ties broken, and results confirmed can the event be considered complete. Score counselors check the scores entered for an event to prevent data entry errors or typos.

Entering scores for events with complex rubrics can be troublesome. The rubrics provided by the Science Olympiad national organization are converted into Excel sheets, but those sheets can be hard to use and understand. Judges without comprehensive prior knowledge of an event, especially volunteer judges, may be daunted and confused by what is being asked. Handing such a rubric to a volunteer and asking them to score a team would be nearly impossible without prior experience with the event.

Example of an Excel score sheet used by event judges for Roller Coaster. All score sheets can be found here.
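One way a digital form can beat an Excel rubric is by validating each entry as the judge types. A hypothetical sketch, assuming a rubric encoded as structured data (the fields and bounds below are invented for illustration, not the real Roller Coaster rubric):

```python
# Hypothetical sketch: encoding a rubric as structured data lets a digital
# form validate each entry immediately, instead of trusting an unfamiliar
# volunteer to interpret a dense Excel sheet.
RUBRIC = {
    "track_length_cm": {"min": 0, "max": 500},
    "run_time_s":      {"min": 0, "max": 60},
    "design_points":   {"min": 0, "max": 20},
}

def validate_entry(entry, rubric=RUBRIC):
    """Return a list of human-readable problems with a judge's entry."""
    errors = []
    for field, bounds in rubric.items():
        value = entry.get(field)
        if value is None:
            errors.append(f"{field}: missing")
        elif not (bounds["min"] <= value <= bounds["max"]):
            errors.append(f"{field}: {value} outside {bounds['min']}-{bounds['max']}")
    return errors

print(validate_entry({"track_length_cm": 320, "run_time_s": 75}))
# ['run_time_s: 75 outside 0-60', 'design_points: missing']
```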

If their event has coach-selectable time slots associated with it, the judge and counselor should be able to view the information of teams that have signed up at specific times. Judges may be located in a space that’s far from the tournament’s headquarters, and might not be able to immediately get in touch with an organizer to get this information.

Organizers: There’s so much to do, and a lot to organize. Organizers often rely on a committee of people to make tournaments happen. One issue they frequently encounter is inconsistent information, either incorrect or missing, spread across the organizing committee. With multiple different systems to juggle, and people with many different responsibilities coming together, organizers can become strained and daunted.

Technology should make things more consistent and centralized, a repository of sorts. At the same time, teaching volunteers and supervisors how to use the software is a top concern. The technical ability of tournament staff varies greatly from one tournament to the next. It should be accessible, straightforward, and even hand-holding to ensure that everyone can properly use and understand the system.

Personas: The people of Science Olympiad

Images that represent different people of the Science Olympiad community. Images from Personas 1, 2, and 4 are from Unsplash.com

The most common theme from my conversations and interviews was that across the board, people in Science Olympiad had a lot of things to accomplish, but were spread far too thin in terms of making them happen. Everyone comes into a tournament with some specific goals, but they get so caught up on the question of “How do I do this?”, that meaningfully carrying out their objective becomes sidelined.

Another takeaway is that existing methods and tools do a poor job of communicating and representing information. There is a lot of wiggle room, and this creates discrepancies and disagreements when two people interpret the same thing differently. Through its many events and its overall structure, Science Olympiad is very focused on detail and precision. Leaving something up to the imagination only causes headaches rather than promoting flexibility.

Everyone comes into a tournament with some specific goals, but they get so caught up on the question of “How do I do this?”, that meaningfully carrying out their objective becomes sidelined.

Scribbles, car rides, and translations

My initial work consisted of finding the features that would create the most immediate impact for tournament organizers and competitors. As several other members of SciOly@Cornell and I traveled from Syracuse to Ithaca, returning from a tournament that had run more than two hours late, we fantasized about what we would do if we could create our own system. A more efficient system could have expedited the scoring process, allowing the tournament to end on time (or even, gasp, early!)

On this hour-long car ride, we just let our mouths run and thought up the wildest things we would want. Someone took notes, which were copied into this Google Document. Having already conducted user interviews, I found that this car ride gave definition to some of my ideas, while other ideas were shot down, and new things came up that I would never have thought of myself. We filled a page and a half with dreams before getting back to Cornell and going home.

Prototyping

I developed a prototype of Ezra to present to other members of the SciOly@Cornell executive board, and to members of New York State Science Olympiad’s (NYSSO) own committee of directors. This was not a reflection of minute design ideas and specific functionalities, but an overview on what I wanted to accomplish. There wasn’t a lot of content on the site, but there was a general structure and outline. I didn’t call it a prototype at the time; rather, I labelled it a “Technical Demo”. But it was really a prototype.

Screenshot from the Events page of the Ezra prototype.

I was not paying too much attention to visual hierarchy, structure, or the minutiae of colors and typography. My goal was to lay out a blueprint and say “This is what I want to accomplish. This is what I want Ezra to be able to do.” I left the meeting with a lot of positive feedback, suggestions, and ideas about how best to proceed.

Building the platform

Over several weeks, I took the time to plan and think about what I wanted to accomplish. I thought back to the conversations I informally had at Cornell’s invitational tournament, and the New York State tournament I had just returned from. I recognized that my perspective was limited as a tournament organizer, and having never been a competitor made it hard for me to empathize with a large portion of my audience. My few times judging and grading events could not compare to the bevy of experience that I was surrounded by, within Science Olympiad at Cornell and elsewhere.

At the same time, my lack of experience as a competitor gave me a unique perspective: I could be objective and critical of wording, terms, and site functionality that I didn’t think made sense. In Ezra’s early days, when my business partner would explain something to me, it often went over my head, and I’d just stare at him. After he broke it down in plainer, wordier terms, I would translate that second explanation into what was ultimately conveyed on Ezra. I developed features and content for Ezra so that anyone would be able to understand and use the platform, even with little to no prior background in Science Olympiad. I spent the summer, fall, and winter of 2016 building the first version of the platform.

Usability Testing

When the competition season began, which unofficially started in January of the school year, I attended as many tournaments as I could to pilot Ezra and see how people used it. Though my user interviews, prototyping, notes, and conversations shaped the product, the ways in which people used Ezra would ultimately shape its future.

One thing I learned very quickly is that more information is better, not less. Some parts of the platform did a great job of communicating functionality and intent, while others did not. I worked directly with a variety of judges and tournament organizers to determine what to say and how to say it. A good example of this is our scoring page. Once all scores are entered for an event, they must be score counseled in order to be considered “final”. These scores can change after being counseled, but the process must then repeat. This was not immediately clear to users of Ezra. I added text to the score counseling section to convey what the button did, what would happen afterward, and why someone might need to press it again.

Score Counseling section of the events page. Details: What the button does, what happens after the button is pressed, and why the button might need to be pressed again in the future.
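The counseling flow described above is essentially a small state machine: scores start as entered, become final once counseled, and any later change reopens them for counseling. A hypothetical sketch of that logic (not Ezra’s actual code):

```python
# Hypothetical sketch of the score-counseling flow: scores start as entered,
# become final once counseled, and any later change reopens them so they
# must be counseled again.
class EventScores:
    def __init__(self):
        self.scores = {}          # team -> score
        self.counseled = False    # True once a counselor has confirmed

    def enter_score(self, team, score):
        self.scores[team] = score
        self.counseled = False    # any change invalidates prior counseling

    def counsel(self):
        self.counseled = True

    def is_final(self):
        return self.counseled

event = EventScores()
event.enter_score("Ithaca HS", 92)
event.counsel()
print(event.is_final())             # True
event.enter_score("Ithaca HS", 95)  # correction after counseling
print(event.is_final())             # False: must be counseled again
```

This is exactly why the button sometimes needs to be pressed more than once: every correction sends the event back through counseling.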

I sat in the same room as event judges while they graded events and used digital forms to score teams, with no need for Excel or a calculator. I even had the chance to work with judges who supervise events at the national level, adjusting and improving the forms to make them intuitive and straightforward. Whether it is a judge’s first time scoring an event or their hundredth, the parameters of an event should be clear to them and to the competitors.

Next steps for Ezra

To view Ezra, check it out at this link. To view an example of the results from a tournament, click here.

Ezra will be used at the 2019 Science Olympiad National Tournament, to be held at Cornell University. We now have a free mobile app, for iOS and Android, to connect everyone on a tournament day together. Push notifications, tournament information, results, and more are included in the app.

I am excited to see what the future of Ezra, and of Science Olympiad, holds in store. This has been an amazing, stressful, exciting, and insightful project, and I’m eager to unveil Ezra on the National Tournament stage in June!
