Data collection via crowdsourcing platforms (Amazon MTurk, Prolific Academic)

Joanna Liu
Published in Psyc 406–2016 · Jan 31, 2016

Accessible. Instantaneous. High-quality?

We are living in the Cyber Age. Over the past decade, the Internet's rise has had an unprecedented effect on individuals and society. Look at the boom of e-commerce: successful businesses, of all sizes and all industries, recognize and profit from the increasingly fast-paced nature of consumer demand. Could this mentality extend to psychologists who aim to run cost-effective, time-efficient labs?

Many forward-thinking, globally minded psychologists are taking the leap to the World Wide Web for its cheaper participant pools and near-overnight survey and experimental data collection. Crowdsourcing websites such as Amazon Mechanical Turk (https://www.mturk.com/mturk/welcome, US bank account required) and Prolific Academic (https://www.prolific.ac/, UK-based, international) may soon become mainstream in social sciences research. Moving online has its fair share of concerns, ranging from ethical (e.g. informed consent) to practical (e.g. sampling, confounding variables) to social (e.g. language, culture), all of which pose potential threats to experimental validity.

Online experiments typically work like this: researchers build their experiment, with optional audio and/or visual stimuli, on an online survey-hosting platform (Qualtrics, LimeSurvey, and SurveyMonkey, to name a few), then publish the survey on a crowdsourcing platform. Amazon MTurk and Prolific Academic allow extensive control over participant background information, such as gender, age, nationality, native and known languages, previous study history, and approval ratings from other researchers. Researchers tend to appreciate the automatic screening of inclusion and exclusion criteria across a participant pool of tens of thousands: their study is only visible to their population of interest.

For "workers", participation in studies can become a steady source of income. The convenience and accessibility benefit individuals who stay at home (due to disability, maternity leave, etc.), who need a flexible part-time job (read: single parents), or who live in developing countries; the second-largest labour market for Amazon MTurk, after the USA, is India. On the researchers' side, instead of paying $10+ per participant for a 60-minute study, they can pay as little as $3 to $5 for the same survey. Thanks to widespread Internet proliferation, researchers can collect computerized survey and experimental data from hundreds of people within days or weeks while barely lifting a finger; there is no need for manual data entry.
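To make the workflow concrete, here is a minimal sketch of publishing a pre-screened study through MTurk's requester API, using the boto3 Python library. The survey URL, reward amount, and qualification thresholds below are illustrative assumptions, not values from any particular study.

```python
# Minimal sketch: post a pre-screened study (a "HIT") on Amazon MTurk.
# Survey URL, reward, and thresholds are illustrative assumptions.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    # Sandbox endpoint for testing; remove it to post on the live marketplace.
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# Screening: only US-based workers with a >= 95% approval rating see the HIT.
qualifications = [
    {
        "QualificationTypeId": "00000000000000000071",  # worker locale
        "Comparator": "EqualTo",
        "LocaleValues": [{"Country": "US"}],
    },
    {
        "QualificationTypeId": "000000000000000000L0",  # % assignments approved
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [95],
    },
]

# Point workers at an externally hosted survey (e.g. a Qualtrics form).
external_question = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.qualtrics.com/jfe/form/SV_EXAMPLE</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

hit = mturk.create_hit(
    Title="60-minute psychology survey",
    Description="Answer a computerized survey about decision making.",
    Keywords="survey, psychology, research",
    Reward="4.00",                       # USD, within the $3-5 range above
    MaxAssignments=100,                  # number of participants sought
    AssignmentDurationInSeconds=90 * 60,
    LifetimeInSeconds=7 * 24 * 60 * 60,  # HIT stays visible for one week
    QualificationRequirements=qualifications,
    Question=external_question,
)
print("HIT ID:", hit["HIT"]["HITId"])
```

Prolific Academic exposes the same ideas (prescreening filters, fixed reward, external study URL) through its web dashboard rather than requiring code.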

That said, the pros need to be weighed against the cons. Is the online participant pool of the same quality we expect from the traditional university-based subject pool? Internet-based data collection platforms promise a lot, but the reality falls short for many researchers. Some tips: estimate the minimum and average completion time with a pilot participant, filter on the relevant sample variables, and, of course, quality-check all of the data. After the experiment, researchers can refuse to pay, and give low ratings to, participants who cheat. However, you cannot ethically throw away outlier data without reason. In the end, the validity of data collected via crowdsourcing platforms boils down to the experimental protocol the researcher puts forth. Hence, exercise common sense, always.
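As one way to picture that quality-check step, here is a hypothetical post-collection pass in Python with pandas. The file name, the column names (duration_sec, attention_check), and the 180-second floor taken from a pilot run are all assumptions, not part of any platform's actual export format.

```python
# Hypothetical quality-check on an exported results file.
# Column names and the pilot-derived time floor are assumptions.
import pandas as pd

df = pd.read_csv("survey_results.csv")

MIN_DURATION_SEC = 180  # fastest honest completion time observed in the pilot

too_fast = df["duration_sec"] < MIN_DURATION_SEC
failed_check = df["attention_check"] != "correct"

flagged = df[too_fast | failed_check]
print(f"Flagged {len(flagged)} of {len(df)} submissions for review")

# Flagged rows get a human look before any rejection; outliers are never
# silently dropped without a documented reason.
clean = df[~(too_fast | failed_check)]
clean.to_csv("survey_results_clean.csv", index=False)
```

Note that the script only flags submissions; the decision to reject or exclude stays with the researcher, in keeping with the ethics point above.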
