Open Science Monitor: What’s Next?

David Osimo
Sep 12, 2018


Dear friends,

Thank you for the generous response to the consultation on the Open Science Monitor, which closed on 31 August 2018. We received almost 300 comments, some of them in the form of full papers. Many were critical, and we welcome such criticism, especially when it is constructive — indeed, this was the reason for opening up the methodology in the first place. We are particularly thankful to those who took the time to learn more about this exciting project and made the effort to propose new indicators and concrete sources for areas where data gaps exist.

We are analyzing these comments with a view to revising and updating the methodology. Our approach is to be "as open as possible, as closed as necessary", in line with the principles of open science. Concretely, this means that we collaborate with the broad community committed to open science, and that we share and discuss both the methodology and the process. As in any open science initiative, this can be risky and cumbersome, but it is uniquely helpful. It also means favoring open data sources when they are available, while using proprietary sources when they are necessary to provide the data — and being transparent about their limitations and biases. Our ultimate goal is to provide high-quality data to support open science policy in Europe today, and this is also our main selection criterion for indicators and data sources.

In fact, better data are urgently needed to make open science a reality. While open mandates are increasingly adopted by governments, funding bodies and journals, these mandates are difficult to implement. One of the main reasons for this implementation gap is a data gap: for instance, when it comes to research data sharing, we have nothing close to the quality of the bibliometric data on open access to publications, because there are no widely adopted standards for data citation. Of course, there are pioneering initiatives that could potentially solve this problem, such as the FAIR data metrics and innovative work on data mining in patents. But they still have to scale, and their results will arrive several years from now.

If we want open science policy to be more than a statement of principles, we need these data now — and the Open Science Monitor aims to provide at least part of the solution. We would love to have valid and robust open data for every trend, but they are not always available. We cannot expect data that cover every aspect comprehensively, but we can expect data that are valid enough to capture the direction of trends over time. And we can be transparent about the methods and the limitations of the data that we have.

Ultimately, we need to find workable solutions in the short term probably based on a mix of bibliometric data, proxies, and surveys. And we need to prepare the ground for more long-term, stable options based on automatic and open data generation. And to achieve this, we need the help and the knowledge of the community. There is no single established way to measure open science — this is very much a work in progress.

We will discuss how to update the methodology based on the comments received at an expert workshop in Brussels on 19 September 2018, and publish a revised methodology at the end of September, as planned, together with responses to the comments received. We are also reaching out to those who made constructive contributions to seek further clarification where needed.

Last but not least: this is only the beginning. We would like to invite all those interested to collaborate with us for the remaining part of the project (until the end of 2019). We all have our role to play in order to improve the evidence base for open science in Europe.

Thanks again, and we look forward to the discussion.
