Assessment in open online courses

An introduction to assessment approaches in a computer-graded learning environment

MIT Open Learning
Jun 13, 2022


Illustration of miniature figures holding pencils around a clipboard with a checklist on it
Image credit: Imam Fathoni, iStock

By Jessica Sandland, PhD
Lecturer and MITx Digital Learning Lab Scientist, DMSE

I always find it exciting to start the development of a new course, because when we begin development we encounter countless decision points that will ultimately shape the learner experience. We need to make choices big and small, from overall curriculum design to the details of our video recording schedule. But to me, the most challenging thing to develop is the course assessment strategy. Until we have developed a clear plan of assignments and assessments to help learners develop and evaluate their skills, we can’t really say that we have a course development plan.

Before we start thinking about assessment, it’s best to take some time to think about the kinds of learning that we want to assess. Bloom’s Taxonomy gives us a great framework for thinking about the various types of skills that we’d like our learners to develop.

Illustration of Bloom’s taxonomy, a pyramid listing from bottom to top: remember, understand, apply, analyze, evaluate, create
Figure 1: From the Vanderbilt University Center for Teaching (CC-BY)

Bloom’s Taxonomy presents us with a hierarchy of skills, ranging from basic recall (“remember”) to the ability to develop entirely original works (“create”). In a traditional classroom, instructors have a wide variety of tools at their disposal to develop and assess learners’ skills across this hierarchy. Every day across MIT, students are asked to solve engineering problems, engage in class discussions, investigate research questions in the laboratory, and write original academic papers.

In an online environment, developing and assessing these higher-order skills becomes more difficult. Multiple choice questions can give us nearly instantaneous insight into learner recall, but they are much less suited to assessing the higher-order skills that we aim to develop in our learners. Happily, even in an at-scale online learning environment, there are many different assessment approaches that we can take to challenge and engage our learners. Below, I introduce a few approaches we take to help learners “apply” their knowledge to solving a problem or “analyze” an engineering scenario that’s new to them.

Make a Calculation
Numerical graders give us the opportunity to have students apply their knowledge to solving a variety of different problems. Typically, we grade these problems in a right-or-wrong fashion, but we are careful to specify the accuracy required for an answer to be considered correct. (For example, all answers within 2% of the instructor-provided value could be marked as correct.)

An illustration of a word problem with a diagram with measurements
Figure 2: From 3.032: Mechanical Behavior of Materials by Prof. Lorna Gibson
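The tolerance check described above can be sketched in a few lines of Python. This is a simplified, hypothetical stand-in for illustration, not the platform’s actual grading code:

```python
def grade_numeric(student_answer, instructor_answer, rel_tol=0.02):
    """Mark a numerical response correct if it falls within a
    relative tolerance of the instructor-provided value
    (here, 2%, as in the example above)."""
    return abs(student_answer - instructor_answer) <= rel_tol * abs(instructor_answer)
```

With a 2% tolerance, `grade_numeric(9.95, 10.0)` is marked correct, while `grade_numeric(10.5, 10.0)` is not.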

Unlike instructor-graded problems, it’s typically rather difficult to award partial-credit with computer-graded problems. Therefore, we sometimes take slightly different approaches to computer-graded problems, giving learners multiple attempts at a question, for example, or asking learners to provide intermediate steps in a calculation. Because there are trade-offs here between assessment rigor and learner ease, we carefully consider the goals of the problem when deciding how much scaffolding and support to provide.
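One way to picture the intermediate-steps approach: grade each step of the calculation independently, so a learner who slips partway through still earns credit for the steps they got right. A minimal sketch under the same 2% tolerance assumption (hypothetical helper, not actual platform code):

```python
def grade_multipart(responses, answers, rel_tol=0.02):
    """Grade a sequence of intermediate steps independently and
    return the fraction correct -- an approximation of partial
    credit in a right-or-wrong grading environment."""
    correct = [
        abs(response - answer) <= rel_tol * abs(answer)
        for response, answer in zip(responses, answers)
    ]
    return sum(correct) / len(correct)
```

A learner who enters two of three intermediate values within tolerance would receive two-thirds credit, rather than losing everything to a single arithmetic slip.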

Derive an Expression
Students in science and engineering classes are often asked to derive an expression for one variable as a function of several other constants and variables, and learners in our MIT online courses are no exception.

A series of equations
Figure 3: From 3.012T: Thermodynamics of Materials by Prof. Rafael Jaramillo

The formulas that learners are asked to enter can range from simple to complex. The example shown above makes use of an excellent resource called the MITx Grading Library (developed by my former colleague Dr. Jolyon Bloomfield), which allows learners to enter expressions involving variables with quite complicated names. Thus, our online learners are able to solve problems even when a complex level of expression is required.
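Under the hood, formula graders typically decide whether a learner’s expression matches the instructor’s by evaluating both at randomly sampled variable values and comparing the results. The sketch below illustrates that core idea only; it is not the MITx Grading Library’s actual API, and a real grader parses expressions safely rather than using `eval`:

```python
import math
import random

def formulas_match(student_expr, answer_expr, variables, samples=5, rel_tol=1e-6):
    """Compare two formula strings by evaluating both at randomly
    sampled variable values. If they agree at every sample point,
    treat the expressions as equivalent."""
    rng = random.Random(42)  # fixed seed for reproducible grading
    env = {"sqrt": math.sqrt, "sin": math.sin, "cos": math.cos, "exp": math.exp}
    for _ in range(samples):
        values = {v: rng.uniform(1, 2) for v in variables}
        student = eval(student_expr, {"__builtins__": {}}, {**env, **values})
        answer = eval(answer_expr, {"__builtins__": {}}, {**env, **values})
        if abs(student - answer) > rel_tol * max(1.0, abs(answer)):
            return False
    return True
```

Sampling-based comparison accepts any algebraically equivalent form: `(x+1)**2` matches `x**2 + 2*x + 1`, while `x**2` and `x**3` are correctly distinguished.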

These examples are only a few of the approaches that we use to give depth to our online course assessments. If you are new to course development, and you’re interested in seeing more examples, I encourage you to check out some of the courses available from MIT on MITx Open Learning Library, MITx Online, and edX.

Jessica Sandland is a Lecturer in the Department of Materials Science and Engineering and an MITx Digital Learning Scientist. Jessica leads online learning initiatives in DMSE, creating MOOCs and designing blended courses for MIT students. She has coordinated the development of a wide variety of DMSE’s online courses, including 3.086x: Innovation and Commercialization, 3.032x: Mechanical Behavior of Materials, 3.072x: Symmetry, Structure, and Tensor Properties of Materials, 3.15x: Electrical, Optical and Magnetic Materials and Devices, and 3.054x: Cellular Solids.

MITx courses and the innovative work of the Digital Learning Lab are possible in part because of the support of MITx learners. If you’re able, please consider a donation to MITx today.
