What’s inside the Times Higher Education World University Rankings’ ‘Academic Reputation Survey’?
Number 64: #USSbriefs64
1. Your reputation is at stake!
The so-called ‘Academic Reputation Survey’ (hereinafter ARS; IPA pronunciation /ɑːs/), undertaken in partnership with Elsevier, is a key component of the Times Higher Education (hereinafter THE) World University Rankings. According to ‘World University Rankings 2019: Methodology’, a university is scored on five weighted areas:
- Teaching (the learning environment) — 30%
- Research (volume, income and reputation) — 30%
- Citations (research influence) — 30%
- International outlook (staff, students, research) — 7.5%
- Industry income (knowledge transfer) — 2.5%
The ARS is used to generate scores for Teaching and Research. The Teaching score is generated using:
- ARS — 15%
- Staff-to-student ratio — 4.5%
- Doctorate-to-bachelor’s ratio — 2.25%
- Doctorates-awarded-to-academic-staff ratio — 6%
- Institutional income — 2.25%
The Research score is generated using:
- ARS — 18%
- Research income — 6%
- Research productivity — 6%
In total, the ‘Academic Reputation Survey’ (ARS) accounts for 33% of a university’s score, as illustrated by THE below (emphasis mine):
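The arithmetic behind that 33% figure can be made explicit. The following is my own illustrative sketch of how THE’s published 2019 weightings combine, not code from THE itself; the indicator names are my own labels for the items listed above.

```python
# Sketch (my own illustration, not THE's implementation) of how the
# published 2019 indicator weightings combine into an overall score,
# and how the two ARS components sum to 33%.

# ARS-derived components, per THE's published methodology
ARS_WEIGHTS = {
    "teaching_reputation": 0.15,   # ARS share of the Teaching pillar
    "research_reputation": 0.18,   # ARS share of the Research pillar
}

# All indicator weights, as fractions of the overall score
INDICATOR_WEIGHTS = {
    "teaching_reputation": 0.15,
    "staff_student_ratio": 0.045,
    "doctorate_bachelor_ratio": 0.0225,
    "doctorates_per_staff": 0.06,
    "institutional_income": 0.0225,
    "research_reputation": 0.18,
    "research_income": 0.06,
    "research_productivity": 0.06,
    "citations": 0.30,
    "international_outlook": 0.075,
    "industry_income": 0.025,
}

def overall_score(indicator_scores):
    """Weighted sum of per-indicator scores (each on a 0-100 scale)."""
    return sum(INDICATOR_WEIGHTS[k] * v for k, v in indicator_scores.items())

ars_share = sum(ARS_WEIGHTS.values())
print(f"ARS share of overall score: {ars_share:.0%}")  # 33%
```

Note that the eleven indicator weights sum to exactly 1.0, matching the five pillar percentages (30 + 30 + 30 + 7.5 + 2.5), and that the two reputation components alone contribute 0.15 + 0.18 = 0.33 of the total.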
So who gets to fill out the ARS? As Phil Baty, THE’s ‘Chief Knowledge Officer’, helpfully explains (emphasis mine):
The THE Academic Reputation Survey, carried out in partnership with Elsevier, is uniquely rigorous and balanced. It is invitation-only, to ensure that only experienced, published scholars can take part and to ensure a truly representative statistical sample of global scholarship, across countries and academic disciplines. Universities cannot nominate anyone to take part, they cannot supply contact lists and individuals cannot nominate themselves to take part […] If you are selected to take part in the survey, you have been chosen based on a proven record of research publication and will be representing thousands of your peers in your discipline and your country. Please take the opportunity to provide your expert input and help us develop a uniquely rich perspective on global higher education.
Why ‘reputation’? Again as Phil Baty explains:
[T]here is also another, more simple reason that we include the reputation survey in the THE World University Rankings. Reputation matters. In today’s highly competitive global academy, reputation is the currency. It is a key consideration for faculty when moving jobs, it influences the formation of new research collaborations and helps persuade philanthropists or industrial funders to invest. It is also a key consideration for international students in deciding who to invest their future with.
Given the centrality of the ARS in generating the THE World University Rankings, the importance of those rankings in setting the priorities and objectives of some universities’ governing bodies, and the urgent need for greater transparency in all our debates on the present and future of global higher education, the questions inside the ARS should be of enormous interest to all university staff and students. The content of the ARS should not be restricted to a number of ‘experienced, published’ senior scholars (allegedly 10,000+ per year — ‘junior’ scholars are presumably excluded), chosen by THE on undisclosed criteria to complete the survey.
The ARS has been open since early November 2018. I have now seen a copy, and I am posting its contents in their entirety below as screenshots. I emphasise again: only ‘invited’ ‘senior’ scholars can see the ARS — the hypothetical ‘student-consumer’ who uses THE rankings to inform their education decisions has no access to either the ARS’s content or the selection criteria for those who complete it. I will not be offering commentary on the validity (or otherwise) of the THE rankings and the ARS. Interested parties should feel free to draw their own conclusions, and I would deeply appreciate it if you could share your thoughts on Twitter using the hashtag #TheARS.
2. The ARS: invitation email and 22 screens
3. Monetising your voluntary labour
In a July 2018 post on the blog of The Society for Research into Higher Education, Paul Temple (Reader Emeritus in Higher Education at the UCL Institute of Education) bluntly asks, ‘What is Times Higher Education for?’ Temple points out:
[THE] seems to be saying that the data used to create the THE rankings are available, at a price, to allow universities to improve their own performance. Leaving aside the old joke about a consultant being someone who borrows your watch to tell you the time, referring to the data used to produce rankings and in the following sentence proposing using the same data to help universities achieve their strategic goals (and I’d be surprised if these goals didn’t include rising in the aforementioned rankings) will suggest to potential clients that these two THE activities are linked. Otherwise why mention them in the same breath? This is skating on thin ethical ice.
Temple is referring to the fact that Times Higher Education now sells data to universities as a package called ‘THE DataPoints’. As described on THE’s website:
Our DataPoints suite of tools is designed to provide detailed performance information across all of the core areas of university activity, as well as allowing comparison and benchmarking against other institutions — whether competitors or collaborators — across regions, subjects and other key criteria.
Within the THE DataPoints package is THEReputation, which provides (emphasis mine):
a 360º view of your institution’s reputation among participants, beneficiaries and influencers in higher education. The product is built on millions of granular data points capturing the opinion, sentiment and behaviour of academics, students and the public, including the THE Global Academic Reputation Survey and data from social media streams.
In other words, the ARS is used to generate 33% of a university’s score in the THE World University Rankings, and the data — the free labour of ARS participants — are then sold back to universities as ‘THEReputation’. Of course, university staff are very much used to donating their free labour, which is then monetised by big corporations like Elsevier. However, to extend Paul Temple’s metaphor: THE’s consultants charge university senior management for telling the ‘time’, using a ‘watch’ they have borrowed for free from you, the ‘watchmaker’. The resulting data are then weaponised to create institutional policies (e.g. research and teaching performance management) that often work against you.
How to disrupt 33% of THE World University Rankings and one of THE’s data products? Simple: if you receive an email invitation from Phil Baty to fill out the ARS, just delete it.
Update by the author, 26 November 2018
In a tweet posted on Monday 26 November 2018 (see also screenshot below), Phil Baty, the Chief Knowledge Officer of THE, appeared to suggest that the current THE World University Rankings was built on a survey — the ‘Academic Reputation Survey’ (ARS) — of ‘hundreds of academics’. This would seem to contradict claims made in a number of places, including in the THEReputation brochure quoted above, that the ARS has ‘10,000+ respondents per annum’.
Phil Baty was approached by the author on Twitter for comment. While Baty did not respond to the author directly, he later clarified in a tweet that the methodology of THE World University Rankings was ‘developed after a survey of hundreds of academics in 2009/10’, and the ARS ‘attracts over 10,000 statistically representative responses each year’.
This paper represents the views of the author only. The author believes all information to be reliable and accurate; if any errors are found please contact us so that we can correct them. We welcome discussion of the points raised and suggest that discussants use Twitter with the hashtags #USSbriefs64 and #TheARS; the author will try to respond as appropriate. This post is licensed under a Creative Commons 1.0 Universal Public Domain Dedication License.