ELI ’17 and Learning Analytics

Katrina
Feb 21, 2017

“Learning Analytics” was a hot topic at EDUCAUSE ELI 2017. After presenting and listening to other presentations about learning analytics, I’ve managed to thread together two themes. Lots of folks are aware of analytics and the value of backing up our assumptions with data. And lots of institutions, both public and private, are working with data in some shape or form, but they either have a very clear direction for what they’re tracking or they are truly in the exploration phase.

As a young professional observing the current state of the field, I find it an exciting place to be. There’s enough baseline research to establish that learner data is valuable to organizations, but as for the true potential of this data, no one seems to have a solid answer yet.

I noticed two broad approaches at ELI 2017. The first, and more common, is the traditional business-style analysis of historical data for institutional advancement. Administrators and other data professionals look at data sets that have already been collected and make decisions or recommendations based on the numbers. Predictive analysis is a common lens through which to view such data. These analyses typically look for correlations with retention, future success, and the like, such as what folks at the University of Memphis are trying to accomplish with their analytics dashboard. The resulting decisions can also include institutional strategies like what to market to potential students, where to direct monetary resources, or where energy and time should be focused for maximum program improvement.

The other approach is more small-scale and seems to be rooted in research on online/digital learner behavior and outcomes. Individuals look at data sets gathered from a course or a specific group of courses to try to make improvements to learning experiences. Analyses of this type tend to go beyond the typical evaluative piece that comes at the end of most popular design frameworks and look for real-time, actionable learner data that can drive change within the same course cycle in which the data was collected. This approach tends to be more flexible, agile, and adaptable, which raises questions about the long-term usefulness of the data and whether that’s even a concern. A great example of this use case for learner data collection comes from Perry Samson, a professor at the University of Michigan, who showed how he used data to drive pedagogical decisions in his large lecture class that ultimately leveled the playing field for minority groups.

The approach I’ve taken, with the assistance of the ELMS project, is more locally focused on driving change that can occur during the semester. My colleague Bryan Ollendyke has been embedding survey questions intermittently throughout courses in the Arts & Architecture portfolio to obtain real-time feedback on how students are feeling about both the course design and the content. I’ve been looking at data on which videos and content pieces students use most in our flagship entrepreneurship course, with the intention of offering data-backed redesign suggestions that maximize the learning experience. I’m also exploring how students use online resources to complement a once-a-week seminar and whether that has any impact on student performance.
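For readers who haven’t worked with xAPI (the spec behind our workshop session, mentioned below), usage data like this is typically captured as xAPI “statements” sent to a Learning Record Store. Here’s a minimal sketch of what recording a single video view might look like in Python; the endpoint, credentials, and activity IDs are hypothetical placeholders, not our actual ELMS setup.

```python
import requests

# Hypothetical LRS endpoint and credentials -- placeholders for illustration only.
LRS_ENDPOINT = "https://lrs.example.edu/xapi/statements"
LRS_AUTH = ("username", "password")

# An xAPI statement has three core parts: actor (who), verb (did what), object (to what).
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "mbox": "mailto:student@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://courses.example.edu/ent101/videos/intro",  # made-up course video ID
        "definition": {
            "name": {"en-US": "Intro video, week 1"},
            "type": "http://adlnet.gov/expapi/activities/media",
        },
    },
}

# POST the statement to the Learning Record Store; the version header is required by the spec.
response = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=LRS_AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()
print("Statement stored:", response.json())
```

Aggregating statements like this by activity ID is what makes it possible to see which videos and content pieces are getting the most use.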

A representative from LMS company Blackboard attended our xAPI workshop session at the end of the first day and asked us about scalability for our approach. As it stands right now, the approach doesn’t scale well. However, it’s important to note that we are not striving for a one-size-fits-all strategy. I believe different subject areas require different expertise and different treatments. Data collection strategies for change at the design level require a high-touch approach in order to best serve the learners in the respective fields. I doubt anyone would argue with the assumption that engineering students respond to different pedagogical strategies than English or education majors do. The approach Bryan and I are advocating accounts for those differences by offering a customizable data collection strategy for each unique constituent group. By prioritizing pedagogy over technology, we can offer better insight to stakeholders. This approach also relies on expertise from those who work in the field.

“Pedagogy first” is not a new style of leveraging technology. If you have ever seen Bryan or me present, you’ve heard us preach the value of ignoring technical constraints (within reason) and letting content drive decision making. Most of the projects we collaborate on are guided by this core principle, so it’s no surprise that our data collection approach aligns with it.

If you’d like to talk more about our strategy, or offer insights and perspectives we’re overlooking, please engage with us on Twitter! In the meantime, check out some of the fantastic learning analytics conference presentations from ELI 2017.
