Who are the wine critics with the closest ratings to the Global Wine Score? (NB: this is NOT a critics classification but a statistical study of score deviation)

The Global Wine Score (GWS)
Jun 27, 2017

DISCLAIMER: We do not support any claim of a supposed critics' classification based on this blog article. The score deviation from the GWS is in no way a criterion for judging the quality of a critic.

The Global Wine Score aims to give an objective and consensual score by aggregating experts' ratings.

All these wine critics have their own legitimate opinions about the wines they taste. In order to assess how consensual their ratings are, this article measures the average deviation between the experts' scores and the Global Wine Score.

How is it possible to compare the critics’ ratings and the Global Wine Score?

As seen in the previous articles of this blog, the experts use different scales to assess wines. The Global Wine Score is calculated by normalizing the ratings given by these different wine critics; in other words, we put all the wines on a common 100-point scale. This normalization uses an algorithm that considers all of an expert's ratings in order to assess their rating distribution.
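As an illustration, here is a minimal sketch of what such a normalization could look like, assuming a simple z-score rescaling onto a shared 100-point distribution. The actual GWS algorithm is not published, and the `target_mean` and `target_std` parameters below are illustrative assumptions, not the real method.

```python
import numpy as np

def normalize_scores(critic_scores, target_mean=90.0, target_std=3.0):
    """Map one critic's raw scores onto a common 100-point scale.

    A simple z-score rescaling: each score is expressed in standard
    deviations from the critic's own mean, then projected onto a shared
    distribution (target_mean, target_std). The real GWS algorithm is
    not public; this is only an illustrative assumption.
    """
    scores = np.asarray(critic_scores, dtype=float)
    z = (scores - scores.mean()) / scores.std()
    return np.clip(z * target_std + target_mean, 0, 100)

# Example: a critic rating on a 20-point scale vs. one on a 100-point scale
critic_a = [16.5, 17.0, 18.5, 19.0]   # e.g. a 20-point scale
critic_b = [88, 90, 94, 96]           # e.g. a 100-point scale
print(normalize_scores(critic_a))     # both end up on the same 100-point scale
print(normalize_scores(critic_b))
```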

This study presents the average deviation between the normalized experts' ratings and the Global Wine Score. The second part focuses on the dispersion of that deviation for each journalist.

The results

This study concerns red Bordeaux En Primeur wines over the last five vintages (2012 to 2016), rated by at least 3 journalists.

The chart presents the journalists sorted from the closest ratings to the GWS (left) to the furthest (right). It shows that the wine critic with the closest ratings to the Global Wine Score is Jeff Leve, followed by Jacques Perrin and Decanter. The journalists furthest from the GWS ratings are those whose ratings differ most from the experts' pool as a whole (Tim Atkin, Jancis Robinson and Jacques Dupont on the right side of the chart). Overall, all the wine critics are quite close to the Global Wine Score: the average deviation ranges from 1.21 points for Jeff Leve to 1.80 points for Jacques Dupont.
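To make the computation concrete, here is a minimal sketch of how such an average deviation per critic could be derived, assuming the deviation is the mean absolute difference between a critic's normalized score and the GWS (the article does not state the exact formula). The table structure, column names and score values are made up for illustration; only the critic names come from the article.

```python
import pandas as pd

# Hypothetical table: one row per (wine, critic) with the normalized score
# and the Global Wine Score of that wine. All numbers are illustrative.
ratings = pd.DataFrame({
    "wine":   ["A", "A", "A", "B", "B", "B"],
    "critic": ["Jeff Leve", "Decanter", "Tim Atkin",
               "Jeff Leve", "Decanter", "Tim Atkin"],
    "normalized_score": [94.0, 93.0, 91.5, 90.0, 89.5, 92.0],
    "gws":              [93.5, 93.5, 93.5, 90.5, 90.5, 90.5],
})

# Average absolute deviation from the GWS for each critic,
# sorted from the most consensual (smallest deviation) to the least.
deviation = (ratings["normalized_score"] - ratings["gws"]).abs()
avg_dev = deviation.groupby(ratings["critic"]).mean().sort_values()
print(avg_dev)
```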

The dispersion

The chart above presents the candlesticks of each journalist's deviation from the Global Wine Score. It confirms that all the journalists give very close ratings. Deviations globally range from -5 to +5 points, with slight differences. The journalists with the closest ratings to the GWS have narrower ranges (left half of the chart).

Some of the journalists (Yves Beck, James Suckling) tend to slightly underscore wines in comparison to the others.

The points on the chart represent values that fall significantly outside the main cloud of points (outliers). For instance, Robert Parker has many such points because some of his scores differ markedly from the GWS.
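As a rough illustration of this kind of dispersion chart, here is a minimal plotting sketch. The deviation values are made up, and the outlier rule (points beyond 1.5 times the interquartile range, matplotlib's default) is an assumption: the article does not say which convention the original chart uses.

```python
import matplotlib.pyplot as plt

# Signed deviations (normalized score minus GWS) per critic; the numbers
# are invented for illustration, only the critic names come from the article.
deviations = {
    "Jeff Leve":     [-0.8, 0.5, 1.0, -1.2, 0.3],
    "Yves Beck":     [-2.5, -1.8, -0.5, -2.0, -1.0],
    "Robert Parker": [-0.5, 0.8, 4.8, -4.5, 1.2],
}

fig, ax = plt.subplots()
# Box-and-whisker per critic; fliers (individual points) mark the outliers.
ax.boxplot(list(deviations.values()), labels=list(deviations.keys()))
ax.axhline(0, linestyle="--", linewidth=0.8)  # zero deviation from the GWS
ax.set_ylabel("Deviation from GWS (points)")
plt.show()
```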

Conclusion

The scores are globally very close, with slight differences from one journalist to another.

Jeff Leve appears to be the most consensual critic, followed by Jacques Perrin and Decanter. Tim Atkin, Jancis Robinson and Jacques Dupont give ratings that differ more from the pool and appear more singular. Yves Beck and James Suckling tend to underscore compared with the GWS.

This study is not about assessing the “quality” of each critic's ratings. Each journalist has their own tastes and qualities. We just want to see how consensual these journalists are. Hopefully this article gives a better idea of how to consider the ratings coming from the different critics. And do not forget to match them with your OWN personal tastes.
