Gameplay Survey for Dawn @ Conrad High School
During the development sprint before the Alpha milestone, I coordinated a 90+ student survey at Conrad High School in Dallas, TX. This was a critical phase in design because the team had just finished iterating on the first fully playable build — they were ready for any user feedback that could help prioritize problem areas in gameplay.
Producer Andrew Curley had set up some early conversations through our marketing staff at SMU Guildhall, and he later helped me gather resources from the team: designers to take gameplay notes, game controllers, and drivers to transport designers to and from the school. Game Designer Max Krembs, with help from discipline leads Heather Tierney (Art), Andy Wang (LD), and Dylan Byrd (SD), worked with me to identify the gameplay areas we wanted to test through the survey:
· General Fun, Challenge, Boredom, and Frustration
· Favorite part of the game + Most frustrating part of the game
· Understanding of the game’s conclusion (Was it Logical? Natural? Satisfying?)
· Strength of the tutorials
· Understanding and navigation of the level
· Understanding and challenge of the flower, crystal, bouncy flower, vines, and wind / interact ability
· Perceived accuracy of the interact ability
· Summary of the game story
· Understanding of the game UI that totals flowers collected
· Quality of the music and sound
· Aspects of sound effects or music that did not meet expectations
Our target player demographic specified two of Richard Bartle’s four player types: explorers and achievers. We prioritized the data from the 71 players who matched these types and dismissed data from players who identified only as killer and/or socializer types, since we felt their preferred play styles would keep them from understanding or enjoying our game.
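The filtering step described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the actual survey pipeline: the field names (`player`, `bartle_types`) and the example records are assumptions for demonstration only.

```python
# Hypothetical sketch of filtering survey respondents by Bartle player type.
# Field names and sample records are illustrative, not the real survey data.

TARGET_TYPES = {"explorer", "achiever"}

def matches_target(types):
    """Keep a respondent who reports at least one target Bartle type."""
    return bool(TARGET_TYPES & set(types))

responses = [
    {"player": "A", "bartle_types": ["explorer"]},
    {"player": "B", "bartle_types": ["killer", "socializer"]},
    {"player": "C", "bartle_types": ["achiever", "socializer"]},
]

# Prioritize explorers/achievers; dismiss killer- and/or socializer-only players.
prioritized = [r for r in responses if matches_target(r["bartle_types"])]
dismissed = [r for r in responses if not matches_target(r["bartle_types"])]
```

Note that a player reporting a mix of types (like player C) is kept, matching the decision to dismiss only those who identified exclusively as killers and/or socializers.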
Because each class had many students and only 9 computers were available, we had 45 minutes to test two groups of players. Depending on the class period, a single group of 9 students had only 13–15 minutes to finish the game before starting the survey. We also tracked how many students finished the entire game — 27, as it turned out — because we did not want our data about the conclusion of the game invalidated by students who never saw it. The time limit also constrained how many survey questions we could ask.
Survey data is often an incomplete picture because players are poor at reporting game behaviors and perceptions from memory. To balance out this reporting bias, our designers also took observation notes during the playtest sessions. Furthermore, the large number of players who finished the survey (83) gave us confidence in the strong trends that emerged.
As a result of the survey, we learned some key user behaviors and perceptions that marked game features we could iterate on and polish.
Strong evidence suggested that some players had problems understanding how to use the leaves and vines. Several players wanted to run on the vine, but the intended design was to force players to use the leaves. Therefore, we made the vines more slippery so that the intent was clear.
Other evidence suggested that players did not always understand the visual communication of the flowers. We re-textured the flowers so that there was a clearly noticeable difference between a closed flower and an open one.
The story conclusion was not fully understood by players at the time, in part due to incomplete story assets. This is a common playtesting tradeoff: showing players incomplete content in exchange for earlier feedback. Nonetheless, we got to hear what players liked and did not like about the ending, and we finished the story assets in the sprint after our Alpha milestone.
I recommend large-scale surveys for user research in games as a supplement to smaller, focused usability testing sessions. Teams depend on objective information from their target demographic to drive design, and it can even help resolve decision-making and communication issues stemming from strong personalities and creative differences.
To see full results of the playtest survey, check out my presentation from November 2016: https://docs.google.com/presentation/d/1ooZTpcx3aIPmFE1z7Aw6VhtQRsX7bFamuj_vpRZBhzg/edit?usp=sharing