A Math EdTech Mixed Methods Study: “Building the Plane While Flying It” (Part 3 of 6)
Silicon Valley Education Foundation (SVEF) is a nonprofit resource and advocate for students and educators. We are dedicated to supporting all students to be on track for college and careers, focusing on the critical areas of science, technology, engineering, and math (STEM). SVEF’s Learning Innovation Hub (iHub) leverages research, practice, partnerships, and policy to raise student achievement through technology equity.
In our third post, we share how we planned and launched a research study with the guidance of WestEd. The study aims to gain insight into how middle school students and their teachers work with EdTech products to support learning, instruction, and performance in math.
How do you prepare a research study in the midst of ambiguity? Previously, the Learning Innovation Hub (iHub) had not brought ethnographic research and classroom observations together into a mixed-methods study. This study required, in a sense, that iHub apply innovation strategies on itself. Given the short timeline and accelerated ramp-up period of our study, we ventured a bit into the Silicon Valley cliché of building the plane while flying it.
We were not flying blind, however; we had a plan, a growing research skill set, and the guidance of professional researchers. We crafted overarching goals and research questions and then identified the data elements and general methodologies needed to accurately answer our research questions. As we were initially designing our study, we were still in the process of securing school sites. We did not know the culture of the classrooms we would observe, how many teachers and students we would get to work with, or even what math EdTech products we would see. What would the study actually end up looking like? We were not only wondering what conclusions we would draw but also what kind of conclusions we would be able to draw.
As we confirmed district partners, we focused on preparing what we could control: enhancing our research capacity. We wanted to be deliberate and thoughtful about norming across our two organizations from the outset. Our team of four had extensive backgrounds in education (two of us are former teachers), but entering a school for the purposes of research is quite different. We were ready to work with WestEd to hone our skills.
Lesson One: Bias
We all carry preconceived ideas or perspectives that have been substantiated by what we’ve experienced, what we’ve read, or what we’ve heard. However, researchers aim to be as unbiased and objective as possible and to always base findings on data. This meant that first, we had to recognize our biases regarding education and tech so that we could actively deal with them and minimize their influence on our observations. We discussed as a group what we expected of an ideal classroom, what we considered high-quality classroom practices, and what we hoped to see in our field visits.
Our study aim was to gain insight into how students and teachers worked with math EdTech products to support math learning and instruction. It was not to assess teachers or students. Discussing our biases openly helped us begin to recognize which classroom activity would be most relevant to our goals and how to curb our inner school evaluator. We found videos of teachers using technology in their classrooms and used them as "trial runs" to see how our biases might influence our focus. One of our team members, formerly a teacher and department chair, was accustomed to observing teachers in order to critique and support their practice. It took a conscious effort for this team member to separate how the math EdTech products influenced student learning from how the teacher and classroom environment did. These discussions helped us steer around potential pitfalls during our observations and figure out ways to support one another.
We also had to keep biases in check because we were a team of researchers. The data that we collected through our observations and field notes had to be reliable and compatible with one another. Discussing our individual perspectives was an important part of the norming process to establish consistency across the team. Being aware of what might influence our observations and interpretations would allow us all to consciously focus on what would inform our study.
One useful tactic was to acknowledge and give space to our biases during an observation rather than try to stifle them. In our field notes (“Lesson Two”), we set aside a section where we could jot down our emotional reactions and thoughts not directly related to our study. In this way, we could attend to our personal perspectives and minimize their influences on the rest of our observations.
Lesson Two: Tools
The tools of an ethnographic researcher (yes, that's what we were!) are field notes and a recorder. These provide the basis for the observation write-ups. The quick tip here is to purchase recorders that are easy to use, with a simple on-off switch and a USB port to make downloading the audio files foolproof. Recording others seemed awkward at first but quickly became a critical tool during our classroom visits. It allowed us to engage more and be better attuned to the students and class by removing the pressure to write every single word down.
Given the short timeline of the study, we had to norm our field note-taking skills quickly. Our WestEd teammate assigned homework: read articles on how to write field notes and then study actual samples. We discussed format and style to develop best practices. We recognized that consistent information and practices were central to creating reliable and informative observations. We decided on a simple template and structure for our field notes. This included sections to note down basic information such as time and student count, as well as our subjective thoughts, alongside the observations.
Our learning continued even as we were actively conducting classroom observations. Objective language–in both field notes and the narrative write-ups that followed–became a focus. Researchers must recognize words that might be loaded with subjective definitions. Rather than use words such as "engaged" or "frustrated," we described what we saw that was leading us to these conclusions. Was the student engaged because her eyes were on the teacher and she was nodding? It was these visual cues that we had to note down, not our interpretations of them. Each narrative also closed with a reflection piece, in which we could express any concerns, questions, and personal perspectives. It also was where we could acknowledge our own personal state, should that have influenced the quality of our notes–were we agitated that morning? Tired? Feeling ill?
We continued to hone these habits as we produced our own field notes–we were airborne. The professional researcher on our team reviewed our first set of narratives, providing detailed feedback on language and organization. Throughout the study, she read every narrative write-up, helping to maintain a high degree of consistency and quality control.
Lesson Three: Flexibility
As part of our norming process, we invested time in researching and drafting a rubric that would help us organize our framework for EdTech observations. We focused on personalized learning, an approach to education that many hope technology can help enable. The personalized learning framework developed by LEAP Innovations guided us in our development of a rubric for math EdTech product use. This prompted us to explore what an ideal product would do, whether we could truly expect one product to do all things, and whether teachers would even want a product that did all these things.
The ambiguity we were dealing with necessitated a mindset of flexibility and acceptance of change. To maintain focus on the student experience, we ultimately did not pursue the rubric. This by no means meant that our time and effort were wasted. The process helped us surface more of our biases and enabled us to better focus on the activities we were truly interested in understanding once we entered the classrooms. It forced us to better distinguish product from teacher practice, and we identified what we thought could be answered through desk research rather than direct observation, such as available product features. While the concepts within the rubric were embedded in other channels of data collection, what we learned by developing it contributed to a deeper understanding of the environment we were entering and to our better preparation as researchers.
Finally, flexibility was required to adjust the scope of our project. We actively recruited more districts that were able to commit to the study and explored four products in use before homing in on two that were more widely adopted. The constraints of time and the size of our team meant we would be limited in delivering useful insights and results if we pursued everything. We focused on building good relationships with schools, collecting high-quality data, and refining our approach each time we entered a classroom.
Our situation is not unique. By nature, research deals with ambiguity. Its purpose is to learn and begin to develop an understanding of what is not yet fully understood. Every study is constrained by a “budget” of time, human resources, and capacity. We prepared by building up our team’s skills. We trained ourselves on tools we would have to employ, and we adopted an approach and mindset of flexibility. We had to accept the uncertainty as to how everything would actually come together and trust that if we kept our eye on the guiding goal, all of our work would support us to reach that goal. With preparation and continuous learning, our plane was flying.
This is the third of six posts we will be releasing on our research. Each week, we will release a new post, so please follow us and continue to check back. If you are interested in hearing more, email email@example.com for additional information.