Has peer review hit rock bottom?
Just as we wonder how many shootings it will take for us to change our gun laws, we should be wondering what it will take for us to realize our peer review system is broken. Peer review supposedly ensures a publication is of sufficient quality to be included in the scientific literature, but time and time again at Retraction Watch and PubPeer we see blatant examples of fraud that have slipped past the academic editors and reviewers. And if peer reviewers are not even capable of detecting blatant fraud, how do we expect them to stop more sophisticated attempts at fraud, or, God forbid, do what they are really there for: critically assess the quality of the publication?
I recently became aware of the most blatant example of fraud I have ever seen. It appears that every image in this paper has been manipulated, and it didn’t just pass peer review at one journal, but at FOUR different journals. Microbiome Digest does an excellent job showing the four publications side by side and the image manipulation that has occurred.
So if we assume that there were 2 or 3 peer reviewers for each paper, along with an academic editor, that means that the fraud in these papers went undetected by at the very least TWELVE “experts”. How is this possible? Well, I’m going to tell you. Or rather, I am going to show you.
One of the journals that published this paper was PeerJ, which allows publishing of the peer review history. Unfortunately, the authors decided not to make this history public (I wish they had to make it public, and I wish reviewers were not anonymous). Luckily, I happen to have just published an article in PeerJ Computer Science, and I made the peer review history public. How about we take a look at what the peer reviewers had to say?
I should note that this is not an attack on PeerJ, but rather peer review in general. PeerJ is my favorite journal because of the extremely low publication costs, publication speed, and dedication to transparency such as making the peer review history public. And PeerJ Preprints is my preferred preprint server.
Below are the comments along with appropriate responses. The full reviewer comments and my actual responses are available at PeerJ.
First Submission
Reviewer 1
“This work developed a web tool, named lncOnco”
Hmm, the name is OncoLnc. It’s in the title of the paper. You didn’t read the paper, did you? It’s not even that long.
“Table S1-S3 should be a part of this tool”
That is the data in OncoLnc, just in Excel format. You definitely didn’t read the paper.
“I can’t find any link or button back to main page”
The search results are one page removed from the main page. Are you aware of your browser’s back button?
Reviewer 3
“Three kinds of datasets, mRNA, miRNA and LncRNA, were used in this study. However, their importances were not addressed”
Did you really just ask why these biomolecules are important?
“Thrid, the datasets used in this study need more details, including original sources and appropriate citations.”
TCGA data is publicly available; it’s clear you haven’t worked with TCGA data before.
Second Submission
Reviewer 1
“Synonymous gene name search is the basic function for bioinformatics web tools. Please add this feature in OncoLnc.”
I added the feature already. You didn’t use OncoLnc again before you submitted your comments, did you? At least you are spelling OncoLnc correctly now.
Reviewer 2
“Yes, the users should know what they would like to query. However, in biology research field, there are lots of alias for genes and other biomolecules. A function to list the possible alias for the queried item is definitely not useless.”
Sigh, none of you are even using OncoLnc are you? I already added this functionality.
Reviewer 3
“lncOnco provides the results of precomputed survival analyses.”
Now you can’t spell OncoLnc?
Third Submission
Reviewer 3
“The author only add the download page. The author do not provide any further customized or analysis functions.
I still think IncOnco is a bit too rough.”
Thanks for being really specific on what you’d like to see. Are you trolling me? You’ve now spelled OncoLnc three different ways. I kind of wish I had a fourth submission to see what else you had up your sleeve (why is it always reviewer 3?).
___________________________________________________________________
What I hope these snippets show is that these reviewers are clearly not “experts”. More concerning than what the reviewers said is what they didn’t say. Their comments were very brief and gave the impression that they had skimmed the paper and written just enough to appear as if they had done their job. Nothing they said showed they were knowledgeable about TCGA data, cancer studies, web development, or even basic biology.
And this is not an unusual case; this is the NORM. Although you can suggest up to five reviewers you think are qualified to review your publication, you never get those reviewers. What ensues is the journal desperately trying to get scientists to agree to review your paper, and what it ends up with are people who neither understand the science nor care enough about the paper to even read it.
And what does this watered-down peer review process get us? In my case, did they do anything to check the accuracy of my work? The code is on GitHub; did they try running any of it? Did they try to find any bugs in the web application? Did they check that values in the supplement matched values in the web application? I’m going to go out on a limb and guess not.
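To be concrete, that last check takes only a few lines to script. Here is a minimal sketch of what a reviewer could have done, comparing a supplemental table against a results file exported from the web application; the file names and column names below are hypothetical placeholders, not the actual ones from the paper.

```python
# Hypothetical spot check: do values in the supplemental table match values
# exported from the web application? File and column names are placeholders.
import csv

def load_results(path, key_col, value_col):
    """Map each identifier (gene/miRNA/lncRNA name) to its reported value."""
    with open(path, newline="") as handle:
        return {row[key_col]: float(row[value_col]) for row in csv.DictReader(handle)}

supplement = load_results("supplemental_table_s1.csv", "Name", "Cox Coefficient")
web_export = load_results("web_app_export.csv", "Name", "Cox Coefficient")

# Flag any identifier whose value differs by more than rounding error.
mismatches = [name for name, value in supplement.items()
              if name in web_export and abs(value - web_export[name]) > 1e-6]

print(f"Checked {len(supplement)} entries; {len(mismatches)} mismatches found.")
```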
And did OncoLnc have errors when it was submitted? Yes! Although just minor bugs (that I know of). For example, I discovered that it was possible to get a server error if users tried to make a Kaplan-Meier plot with a small number of patients.
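Fixing that class of bug is just a matter of validating the request before the analysis runs. As a rough illustration only (this is not OncoLnc’s actual code; the threshold and function names are made up), the guard looks something like this:

```python
# Illustrative guard only -- not OncoLnc's actual code.
# The idea: refuse to build the Kaplan-Meier groups when too few patients
# fall in a group, and return a readable message instead of a server error.
MIN_PATIENTS_PER_GROUP = 2  # hypothetical threshold

def split_for_km(patients_sorted_by_expression, lower_pct, upper_pct):
    """Split patients into low/high expression groups, or explain why we can't."""
    n = len(patients_sorted_by_expression)
    low_n = int(n * lower_pct / 100)
    high_n = int(n * upper_pct / 100)
    if low_n < MIN_PATIENTS_PER_GROUP or high_n < MIN_PATIENTS_PER_GROUP:
        return None, "Too few patients in a group to draw a Kaplan-Meier plot."
    low_group = patients_sorted_by_expression[:low_n]
    high_group = patients_sorted_by_expression[-high_n:]
    return (low_group, high_group), None
```

The specific check doesn’t matter; the point is that none of the reviewers appear to have poked at edge cases like this at all.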
So in my case, what did peer review accomplish? A few changes to the manuscript and a little added functionality. Was that worth delaying publication of the manuscript by months? Is that a valid reason to treat the published article any differently than the preprint?
And what if the data in OncoLnc is wrong? Heck, what if I made up all the data? Would this peer review process have caught it? I think we know the answer to that.