What’s the connection between flipped learning, perception, and self-regulation?

One of the questions about flipped learning that I’ve been thinking about for a while is how flipped learning is related to “lifelong learning”. Existing research on flipped learning tends to focus on a single chunk of time: a single class that’s flipped, or control/experiment sections of a single class where one is flipped and the other is not. Generally speaking, the research shows that learning outcomes are at least as good in a flipped learning class as in a traditional class, and in most cases better. But what about later on, once the class is over? Do those gains persist, and does the flipped learning experience stick with students and grow into a lifelong ability to learn independently?

I’ve always felt like the answer is clearly “yes” to all of the above, since flipped learning requires students to put the basic skills of lifelong learning into practice every day: learning new concepts independently by tapping into print and video resources, monitoring and regulating their cognition in the process, taking initiative to change things up if their strategies aren’t working, and so on. Ideally in a flipped learning environment, lifelong learning is more than just an aphorism from the university strategic plan — it’s a daily practice woven into the DNA of the course.

Those skills and attitudes are collectively known as self-regulated learning (SRL). I’ve written about the relationship between flipped learning and SRL before (see that article for a more detailed description of the concept) and I firmly believe that flipped learning is an ideal platform to develop SRL in students. But firm beliefs are not data, so I’ve been looking for research articles that address this area of overlap.

The article I am reviewing today is by Dr. Sarah Rae Sletten, a professor of biology at Mayville State University in North Dakota:

Sletten, S. R. (2017). Investigating Flipped Learning: Student Self-Regulated Learning, Perceptions, and Achievement in an Introductory Biology Course. Journal of Science Education and Technology, 26(3), 347–358. http://doi.org/10.1007/s10956-016-9683-8

This paper is the distillation of her Ph.D. thesis written in 2015.

Research Questions and Methods

Sletten shares my belief that flipped learning presupposes a certain degree of facility with self-regulated learning, and that students come into a flipped environment with a nonzero amount of SRL skill. But just because students show evidence of SRL doesn’t mean they are skilled in SRL. She notes that previous research suggests that students’ perceptions about their learning environments have an impact on their actual usage of SRL strategies. The prior research quoted in the paper had to do with student perceptions of online versus face-to-face class environments. So it’s natural to wonder whether the same perception effects happen when students are placed in a flipped learning environment. Sletten’s study revolves around two research questions:

  1. Do student perceptions of the flipped model correlate with the use of self-regulated learning strategies?
  2. Does student use of self-regulated learning strategies then correlate with academic performance?

Sletten hypothesizes “yes” answers to both of these questions. To study these questions, Sletten enlisted 76 undergraduates in a single section of an introductory biology course at her university. The variables of interest were measured like this:

  • SRL activity was measured using the Motivated Strategies for Learning Questionnaire (MSLQ), an instrument developed by psychologist Paul Pintrich. The MSLQ in its raw form poses 81 questions about motivation and learning strategies, phrased as statements (for example, “It is my own fault if I don’t learn the material in this course” and “I try to apply ideas from course readings in other class activities such as lecture and discussion”). Students rate each statement on a scale of 1 to 7, with 1 meaning “not at all true of me” and 7 meaning “very true of me”. The questions on the MSLQ are clustered into subscales addressing elements of SRL such as effort, help-seeking, and so on, and it’s designed so that one can use just a subset of the questions if desired. This study used only the parts for academic cognition, academic motivation, and academic behavior. Each of those parts was further broken down into subscales; for example, “academic behavior” contained the subscales for “effort” and “help seeking”. Some of the items were reworded to better fit a flipped learning environment.
  • Student perceptions of the flipped environment were measured using a 32-item survey developed for the study. The items were aimed at what Sletten identified as the “two dimensions” of the flipped model: student perceptions about video lectures before class, and student perceptions about active learning in class. (The focus on video lectures is an issue; I’ll address that below.) The video questions pertained to students’ preference for video, the perceived value of the videos, and how often they watched them.
  • Finally, academic performance was measured using the students’ final letter grades (A, B, C, D, F) in the course.
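To make the MSLQ mechanics concrete, here’s a minimal sketch of how Likert-style subscale scores are typically computed. Everything here is made up for illustration (the item IDs, the ratings, the function name) — this is not Sletten’s actual scoring procedure. The usual convention for reverse-keyed items on a 1–7 scale is to score them as 8 minus the rating, then average the items in each subscale:

```python
# Hypothetical MSLQ-style subscale scoring -- illustrative only.
def score_subscale(ratings, reverse_keyed=()):
    """ratings: dict mapping item id -> rating on a 1-7 scale.
    reverse_keyed: ids of negatively worded items, scored as 8 - rating."""
    scores = [(8 - r) if item in reverse_keyed else r
              for item, r in ratings.items()]
    return sum(scores) / len(scores)

# One (fabricated) student's responses to a hypothetical "effort" subscale;
# item e3 is negatively worded, so it gets reverse-scored.
effort = {"e1": 6, "e2": 5, "e3": 2}
print(score_subscale(effort, reverse_keyed={"e3"}))  # mean of 6, 5, and 8-2
```

Scoring like this turns each student’s raw responses into one number per subscale, which is what then feeds into the correlational analyses.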

All the data were collected in a single session at one point in the semester, in week 13 (of a 16-week term). Multiple regression was used to see if a combination of SRL usage and student perceptions correlated with academic performance. Simple (single-predictor) regression was used to determine if student perceptions of flipped learning correlated with academic performance. There were additional regression analyses done to determine relationships between some of the SRL and perception subscales. Note that this was a purely correlational study, so proceed with caution when interpreting the results.
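For readers who haven’t run one, a multiple regression of this general shape takes only a few lines. Everything below is fabricated for illustration — the data, the coefficient values, and the variable names are my assumptions, not the study’s analysis:

```python
import numpy as np

# Fabricated data shaped like the study's: 76 students, a composite SRL
# score, a composite perception score, and a numeric achievement outcome.
rng = np.random.default_rng(0)
n = 76
srl = rng.uniform(1, 7, n)       # hypothetical SRL strategy use (1-7 scale)
percep = rng.uniform(1, 7, n)    # hypothetical perception score (1-7 scale)
outcome = 1.0 + 0.3 * srl + 0.1 * percep + rng.normal(0, 0.5, n)

# Ordinary least squares fit: outcome ~ intercept + srl + percep
X = np.column_stack([np.ones(n), srl, percep])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(coef)  # fitted [intercept, srl slope, perception slope]
```

A real analysis would also attach a significance test to each coefficient, which is where claims like “perceptions significantly predict SRL use” come from.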

Insights

Unsurprisingly, there were significant correlations within the SRL strategy subscales: The “study strategies” subscale correlated with “metacognition”, “self-talk” (a facet of SRL in which learners use positive language with themselves while learning), and “effort”. Self-talk was also correlated with metacognition and effort. Again: not surprising that different aspects of SRL are mutually supportive like this.

Between student perceptions and SRL strategy use there were also significant correlations. In fact the only student perception subscale not to correlate significantly with any of the SRL strategy subscales was students’ preference for videos. There were other areas of weak correlation: students’ previous experience with flipped learning also did not correlate with any of the SRL strategy subscales, and course grade only weakly correlated with academic behavior and not at all with student perceptions.

Within the student perception survey responses, the study found that students placed a high value on the in-class active learning activities but a low value on the video lectures. More on that below.

Addressing the main research questions, the study found:

  • Re: the question of student perceptions predicting SRL strategy use: The only factors not showing significant correlation were “environmental structuring” (which is what it sounds like — an SRL strategy based on arranging one’s environment to minimize distractions) and “self-consequating” (choosing one’s own rewards and punishments based on performance, another SRL strategy). All other subscales of student perceptions significantly correlated with the other subscales for SRL.
  • Re: the question of whether SRL strategy use in a flipped environment predicted course grades: There was no significant correlation between the two. At first glance this might seem like a major blow to flipped learning, but see below for an unpacking of the use of grades.

So in short, what the study found was that there is a significant relationship within the various aspects of SRL, and in most cases there is a significant relationship between students’ perceptions of a flipped learning environment and their use of SRL strategies. But, the resulting SRL strategy use did not predict final grades.

Issues

The results of the analysis — both the presence and absence of significant correlations in those key areas — have to be filtered through what I see as some important issues with the study.

First and foremost for me is the use of final course grades as a measure of academic achievement. Here, the assumption appears to be that course grades are an accurate measurement of students’ mastery of basic biology concepts. As readers know from my posts on specifications grading, and as Sletten acknowledges herself in the paper, this is a shaky assumption. Frankly, I have no confidence that grades measure student mastery of course concepts. Any number of factors not related to concept mastery could figure into a student’s grade and corrupt the information that the grade conveys about subject mastery, and the paper doesn’t mention the algorithm for how grades in the course were determined. I can potentially be on board with the idea that grades measure “achievement” (the term used in the research question), but it’s not at all clear that “achievement” correlates with actual knowledge of the subject, or that grades are reliable measures of knowledge.

So in some ways it’s no surprise that there was no significant correlation between perceptions/SRL strategies and course grades, and the fact that there was no significant correlation doesn’t really tell us much about student academic achievement. If the study had used a more reliable measure of academic skill, like a concept inventory, we’d know more.

Another issue is that the precise formulations of the items on the surveys, as well as the MSLQ items used, weren’t disclosed in the paper, so it’s never entirely clear what students were responding to. I can read through some of Pintrich’s work on the MSLQ (especially this) and sort of piece together the SRL questions, but I’m not certain about it. And the student perceptions survey was MIA. Without knowing the questions, it’s difficult to draw conclusions about those correlations.

A third issue is that the definition of “flipped classroom” seems to be unnecessarily centered on videos. The first sentence of the abstract makes this clear:

In flipped classrooms, lectures, which are normally delivered in-class, are assigned as homework in the form of videos, and assignments that were traditionally assigned as homework, are done as learning activities in class.

But flipped learning environments do not require video use, and in fact there’s good research that suggests videos are not the best approach for pre-class work in a flipped environment. By framing flipped learning in terms that assume that students are watching lectures on video, the scope of student responses to the “perceptions” survey becomes inaccurate because students are not talking about flipped learning but rather about videos. We don’t know much about those videos: How long were they? Did the instructor make them or were they done by a third party? Did they integrate well with the active learning activities in the class or were they more like add-ons? Without the details, we can’t say what those student perceptions are really about; maybe students had a valid point when they gave largely negative perceptions of them.

Rather than ask about how students perceive videos, a more relevant question might have been to ask about students’ perceptions of independent learning in their individual spaces (regardless of medium). And we really need collectively to get away from the notion that flipped learning must involve lectures and videos. It’s clouding our data.

Finally, there doesn’t appear to have been any explicit training on self-regulated learning, or even a discussion of what SRL is. Without training in self-regulation, it’s not surprising that students’ use of SRL didn’t translate into higher grades (although like I said, grades are a poor choice for measurement). And it’s not hard to see why students might have given poor ratings to the usefulness of video lectures, if they were expected to figure out how best to use video on their own with no training.

Takeaways

There were some important lessons to be learned from this study that all of us who teach in flipped environments would do well to ponder:

  • Student perceptions matter. Students come into our courses with preconceptions. Many of them have heard of flipped learning and a few (more and more each year) have experienced flipped learning environments both good and bad. They aren’t blank slates. It’s crucial for instructors to discover what those perceptions are as soon as possible, and help shape the narrative about flipped learning to alter those perceptions if necessary.
  • Perceptions can influence cognition. The signature lesson from this study is that students’ perceptions of flipped learning environments have a real relationship with how they study. So if we’re teaching in a flipped environment and students aren’t doing their pre-class work, or they’re not using the videos effectively, it’s a good idea to ask what they’re thinking about the environment.
  • Students need training on self-regulation. This study also showed that even though students may have positive views about flipped learning and this has a good chance of leading to competent use of self-regulated learning strategies, it might turn out that none of this corresponds to improved learning. There’s something missing in the ligature that binds perception to strategy to performance. I think that this something is explicit training on how to use SRL properly. Sletten mentions that we “may have to spend time with students at the start of a flipped course, showing them ways to develop their SRL skills”. This is true, but not enough: SRL has to become a regular, even daily part of the classroom experience, iterated and reiterated throughout the course.

Next steps

On the point about training in SRL, it would be interesting to see this study recast as a quasi-experimental study, using two flipped sections of the course, with the only difference between the sections being that one includes explicit training in SRL and the other doesn’t. In fact, this very setup is the basis of the next article that I’ll be reviewing here on the blog! So stay tuned.


Thanks for reading. This was originally published at my website, rtalbert.org, where I write on mathematics, education, technology, and academic productivity. You’re invited to read more there, and to sign up for my monthly newsletter The Monthly Review. I also have a book out!