“I’ll have the 3-year course, please.” Can we expect ‘menu’ pricing to become the norm for universities?

Brighteye Ventures

--

By Rhys Spence

I started at university the year that tuition fees in the UK rose from £3,000 to £9,000 per year. My parents had been saving to help me cover the costs of university, but the tripling of fees scuppered the plan. Had I started in 2011, the fees would have been £9,000 across the three years. Starting in 2012 instead meant fees of £27,000 (excluding living costs).

As soon as the new fees were established, my parents encouraged me to look internationally for cheaper alternatives, with various Dutch and German institutions in the mix. In the Netherlands and Germany, tuition is free but administrative fees come to around €6,000 and €2,500 respectively for a three-year course. Rates are similar in France and in many other European nations.

With more of the costs falling on the student, my parents were concerned about whether I would get value for money and whether it was sensible to take on a debt that the government expected to take 30 years to pay off (taking a 21-year-old graduate through to their 51st birthday).

They were right to question the value of degrees, and of education courses more broadly: are the costs worth students' investment? In the UK, many refer to tuition fees as a 'graduate tax'. The term suggests that graduates can expect to earn more across their lifetimes than non-graduates, with the tax gradually clawing back a portion of those earnings to cover the costs of attendance.

This is fine in principle, assuming that attending university delivers a benefit equal to or exceeding the costs. But should universities be more accountable for the value they add to the learning and earnings of their students?

Quantifying the links between post-school education and individual and social prosperity is difficult: learning processes and outcomes are complex to understand. But it's necessary if the aim is to assess quality and value for money.

In the UK, where which university you attend counts for as much as, if not more than, the grade you receive, it's particularly challenging to convince universities to 'risk' measuring their 'value-added', or 'learning gain'. What happens if they are shown to add less value, or provide less learning, than an institution ranked 50 places lower? It's no wonder this kind of measurement is a hard sell to institutions that have spent decades polishing their reputations and investing in world-leading research. Perhaps the issue is the way we amalgamate student experience with research quality to form rankings. Would separating the two give students a more accurate picture of their likely university experience and their prospects post-graduation?

The OECD has outlined four ways in which universities could measure their value, or 'learning gain':

⁃ Comparing expected against actual estimates of performance

⁃ Comparing first and later year assessments

⁃ Assessing students’ engagement in effective education practices

⁃ Capturing feedback from graduate employers

All require robust baseline data to estimate the net effects of an education experience.

Various trials have been undertaken in Australia and in the UK (via HEFCE), but none have led to accepted measures of value or learning gain. The important word here is 'accepted'. One of the major obstacles to establishing consensus is institutions' autonomy: there is no direct 'need' to evidence value-added while application numbers remain high (and indeed grew during the pandemic).

Why does this matter?

Measuring value-added is not strictly necessary for an effective HE system, but it's difficult to justify tuition costs, whatever the level, without some form of proven or expected return for students.

Students want two things from university: a positive experience of self-exploration and a passport to a career they want. As the OECD puts it, they want to be 'work-, career- and future-ready'; they want a return on their investment.

The question becomes, what should we do for students for whom work-readiness is the priority?

The pandemic has changed the way students experience university. Pivoting to remote operation no doubt carried high start-up costs for universities that hadn't yet built a remote offer, in some cases equal to or exceeding the costs of in-person provision. But it raises a question: should a remote-only offer be made permanent for students who prefer it? And more broadly, should students be able to choose how they undertake their qualification?

This is similar in spirit to the decision of some universities to offer shortened, 'condensed' courses over two years rather than three, with reduced breaks between terms. Graduates enter the workforce a year early and benefit from a year of earnings, lower total fees and a year less of interest on those fees. There are negatives to this approach too, such as fewer opportunities for placement periods alongside study and less time for part-time work. But it's undoubtedly positive that universities can flex to the needs and preferences of different student groups.

The pandemic has shown what's possible: virtual lecture halls, experiments, peer working (enabled by platforms like Aula), resources and assessment (made possible by companies like Rosalyn AI) all represent alternative delivery modes.

With this renewed flexibility, there's a strengthening case for universities to begin 'menu pricing', assuming the equivalence of the qualifications awarded at the end of the course. Variables could include course length, remote or in-person provision, full access to lectures, seminars and content versus reduced access, the type of assessment, and a host of other options. This could reap significant rewards for universities: they might be able to expand the size of their cohorts and broaden their income base. Prices would reflect the student experience in each circumstance.

We have already seen universities doing a version of this via MOOCs, making content available on platforms like Coursera and Udemy. Employability is clearly not their sole purpose, and it's hard to gauge how these courses affect students' employment prospects; many MOOCs have lower barriers to entry in terms of prior attainment, and less rigorous testing, than formal degrees. But more technical courses, with more clearly measurable skills and abilities, have been shown to boost outcomes in the job market (whether because of the honed skillset or because completing a course in one's spare time demonstrates 'get-up-and-go'!). This is one of the reasons why virtual and in-person skills courses and bootcamps, like those provided by Ironhack, are gaining significant traction.

The point is that these online courses provide one item, or type of item, that could appear on a university's menu of options, alongside variables like course length, delivery mode and assessment type.

Providing this flexibility, and meeting the needs of their customers, might help universities to thrive post-Covid. Such dynamism will prove necessary as non-HE pathways grow in popularity and rigour while remaining the vastly cheaper option, often with more direct routes to employment.

Had I been presented with these options at 18, I might have chosen a different course of action. It’s only looking back that I realise how narrow my options were.

Move over set menu, I’d like the à la carte.

--

Brighteye Ventures

We are the leading European VC focused on EdTech. We invest in companies that help people learn & grow: www.brighteyevc.com