Announcing the Launch of the Social Science Prediction Platform

The Center for Effective Global Action (CEGA)
Jul 14, 2020

This post was written by BITSS Senior Program Associate Aleksandar Bogdanoski and Program Manager Katie Hoeberling. The post is also published on the BITSS blog and the Development Impact blog.

Photo: Social Science Prediction homepage

What will be the effect of raising the minimum wage by $2 an hour? How will cash transfers impact local prices in rural Kenya? Will a basic income support economic recovery from the coronavirus pandemic? When faced with such questions, policymakers and practitioners rely on research and expert perspectives to make decisions. Making good policy decisions often comes down to being able to reasonably predict the effects of policy alternatives, and choosing those that maximize societal benefits. Too often, however, hindsight bias leads us to believe we knew how a policy, intervention, or evaluation would play out even when our priors don’t actually match the outcomes.

Systematically collecting and recording predictions can help us understand how results relate to our prior beliefs, as well as how we can improve the accuracy of those beliefs. Recorded predictions can also reveal how surprising results really are, potentially protecting against publication bias or the mistaken discounting of results as “uninteresting” after the fact. Finally, tracking prediction accuracy over time makes it possible to identify super-forecasters — individuals who make consistently accurate predictions — who can help prioritize research questions, as well as design strategies and policy options in the absence of rigorous evidence. All of these benefits become even more important in times of emergency, such as the COVID-19 pandemic, which has disrupted the collection of field data for many research projects.

Recognizing the potential of a more systematic approach to forecasting, the Berkeley Initiative for Transparency in the Social Sciences (BITSS) has been working with Stefano DellaVigna and Eva Vivalt to build the first platform of its kind that will allow social scientists to systematically collect predictions about the results of research. Today we are excited to announce the official launch of the Social Science Prediction Platform!

The Social Science Prediction Platform makes it easy to collect forecasts

The Social Science Prediction Platform, or SSPP, allows researchers to standardize how they source and record predictions, similar to how study registries have systematized hypothesis and design pre-registration. Perhaps most importantly, the SSPP streamlines the development and distribution of forecast collection surveys. With the SSPP, researchers can:

  • Use our survey template or develop their own survey on Qualtrics to collect predictions for various outcomes such as effect sizes, standard errors, and confidence intervals.
  • Distribute the survey directly from the platform to a sample of their choosing, including topic or method experts such as senior academics and professionals, disciplinary experts such as pre-screened Ph.D. students, or members of the general public who sign up on the platform. Importantly, the SSPP enables more efficient coordination of survey distribution, reducing the risk of overburdening a small group of popular forecasters and mitigating “forecaster fatigue” and low response rates.
  • Access survey results on a pre-specified date and easily download them in .csv files.
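Once results are released, the downloaded .csv file can be summarized in a few lines of code. The sketch below is a hypothetical example: the column names (`forecaster_id`, `predicted_effect`) and sample values are illustrative assumptions, not the SSPP’s actual export schema.

```python
# Hypothetical sketch: summarizing forecasts exported from the SSPP as a .csv.
# The column names and values below are illustrative assumptions only.
import csv
import io
import statistics

# Stand-in for the contents of a downloaded results file.
sample_csv = """forecaster_id,predicted_effect
f01,0.12
f02,0.05
f03,0.20
f04,0.08
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))
effects = [float(r["predicted_effect"]) for r in rows]

mean_effect = statistics.mean(effects)   # central tendency of the priors
sd_effect = statistics.stdev(effects)    # disagreement among forecasters

print(f"n={len(effects)}, mean={mean_effect:.3f}, sd={sd_effect:.3f}")
```

Comparing the mean predicted effect against the study’s eventual estimate is one simple way to gauge how surprising a result was relative to the community’s priors.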

To learn more about key issues to consider when designing surveys for collecting forecasts, as well as how to use the platform, see our Forecasting Survey Guide. The SSPP team is also offering consulting support for those who need it — contact Nick Otis at support@socialscienceprediction.org for help.

Who is the SSPP for?

Two types of accounts allow users to access different functionalities based on their interests and experience. Basic accounts allow users to view available studies and make predictions, while Researcher accounts are restricted to researchers at academic or other institutions and allow users to upload and distribute surveys.

To encourage predictions, we will offer the first 200 graduate students $25 for their first 10 predictions, re-evaluating as more projects and predictions are added to the SSPP.

Though widely applicable across research designs and disciplines, the SSPP may be particularly useful when studying complex, flagship projects that are unlikely to be replicated, or projects that have been offered “in-principle acceptance” as part of a registered reports journal track.

These projects need your predictions!

As part of the SSPP’s soft launch, three projects are already available for prediction.

Open questions

Since the SSPP is the first of its kind, this launch will be a learning process. We are discussing how best to address the following open questions and welcome feedback from the social science community!

  • What role should the SSPP play in coordinating or facilitating incentives? Cash payments and gift cards tend to be the most commonly used survey incentives, but other systems like dashboards that display individuals’ accuracy might work just as well, if not better, for academics and other scientists whose expertise is an important form of capital.
  • Relatedly, what role should reciprocity play in contributing predictions, and how should we encourage it? We are considering how many predictions we should encourage users to provide for each survey they upload. As forecasting becomes more common, we hope to foster a shared sense of responsibility for both eliciting and contributing predictions.
  • Should the SSPP assign Digital Object Identifiers (DOIs) to projects or predictions? The platform assigns unique identifiers to projects, but we are discussing what value DOIs may add.
  • How should the SSPP share results back with forecasters, policymakers, or the public? Are some reporting aspects, such as design, timing, or level of detail, more helpful than others in facilitating the updating of priors? Should the SSPP also play a role in understanding belief updating?

Send us your suggestions, questions, and comments and help us make the platform as useful as possible. Follow us on Twitter @socscipredict and @UCBITSS, or join the discussion using #socsciprediction. And sign up on the platform to get started!


CEGA is a hub for research on global development, innovating for positive social change.