Image integrity: the new frontier of publishing ethics

Ben Mudrak
Published in Research Square
Nov 7, 2017

Academic publishing has long played an important role in safeguarding the scholarly record so that researchers can continue to build on past results in new ways. Since the mid-18th century, journals have used peer review to evaluate the papers they publish, an important step that helps filter out incorrect or incomplete work.

In more recent years, the explosion of research publishing has been accompanied by a rise in the unethical practice of plagiarism. While not always intentionally fraudulent, plagiarism became a sufficiently pervasive problem that it gave rise to tools like the nearly ubiquitous iThenticate. Plagiarism is still reasonably common; we see substantial text overlap in over 20% of the manuscripts we screen for our publisher partners.

Another area that has received less attention, and has fewer widely adopted solutions, is image fraud. Researchers and publishers are undeniably aware of its deleterious effects thanks to high-profile cases like that of Woo Suk Hwang, a stem cell researcher who had several papers retracted in the mid-2000s. But such anecdotes do not paint a picture of the publishing landscape as a whole. Are they isolated to high-pressure fields and labs, or do they point to a deeper problem across the scholarly literature?

Is image manipulation really an issue?

In recent years, we and several others have set out to determine the prevalence of image issues in scientific papers. One of the earliest journals to explicitly target image integrity, the Journal of Cell Biology, shared internal data on how commonly it finds figures that are improperly spliced, adjusted, or duplicated, reporting that

25% of all accepted manuscripts have had one or more illustrations that were manipulated in ways that violate the journal’s guidelines.

Fortunately, the overwhelming majority of cases could be resolved through communication with the authors and inspection of the original images. Altogether, only 1% of cases appeared to involve true fraud on the part of authors.

Another, broader-ranging study of image manipulation in published papers was released in 2016 in the journal mBio. In this study, Bik et al. screened 20,621 papers from 40 scientific journals and found that 3.8% of published papers contained improperly altered images, many of which appeared to be deliberate manipulations on the part of the authors. Year-by-year analysis led to the unsurprising result that image issues have become more prevalent in recent years.

Fig. 5. Percentage of papers containing inappropriate image duplications by year of publication. Reproduced from Bik et al. mBio vol. 7 no. 3 e00809-16. DOI: 10.1128/mBio.00809-16. CC-BY Bik et al.

Most recently, Research Square screened nearly 500 manuscripts containing gels, blots, or microscopy images for a pair of open access biomedical journals and presented the findings at the recent Peer Review Congress. We found that 19% displayed indications of improper image manipulation or duplication (see figure below). As at the Journal of Cell Biology, the vast majority of cases could be cleared up after communication with the authors, leaving only 11% of the identified cases needing additional follow-up by the journal.

Data collected in 2016 and 2017. © 2017 Research Square

What can be done to combat improper image manipulation?

Even at the most modest level indicated above (1%), the problem is substantial: with roughly two million or more scholarly articles published each year, that rate would translate to over 20,000 articles with fraudulent images annually. What can the industry do to combat this problem, much of which likely stems from misunderstanding on the part of authors?

Author guidelines. One important area to consider is your journal’s guidelines. If you expect images to adhere to certain standards of integrity, list those standards on your site. The Journal of Cell Biology publishes some best practices in this area. A little education may go a long way toward avoiding the cases that stem from author error.

Screen your manuscripts for image manipulation. An ounce of prevention may be worth a pound of cure, but not every author will carefully internalize your guidelines, and a handful may knowingly skirt them. Consider screening manuscripts as you receive them (or, alternatively, as you prepare to officially accept them for publication). This extra effort will safeguard your journal against unwanted retractions and corrigenda while steadily improving the images you receive. The American Physiological Society began screening images in its journals in 2009, and it has seen the rate of author queries and corrigenda decrease as researchers grow accustomed to its image standards. Perhaps most tellingly, of 190 authors who were flagged for image issues and later submitted another manuscript, only 8 provided manuscripts with image issues a second time. These trends are promising even as the overall incidence of improper image manipulation has grown over the past decade.

The birth of VerifEye, Research Square’s online software tool

Based on the data shared above, we wondered whether simplifying how journals screen manuscript images would enable more journals to take that step. By screening images and returning manipulated ones to their authors, journals help researchers submit better images the next time around. In turn, the integrity of the scholarly record is strengthened.

Research Square’s VerifEye is a simple web-based solution that enables the simultaneous evaluation of multiple images for potential manipulation. Currently an internal tool supporting our editorial checks, it is being upgraded for use by journals and other interested parties. Using different filters and transformations, the tool helps anomalous features stand out, with no expensive software to install; the sketch below gives a rough sense of the kinds of transformations involved. Signups for VerifEye are coming soon!
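
To illustrate (and only to illustrate) what filters and transformations can reveal, here is a minimal Python sketch of the general idea. This is not VerifEye’s implementation; the file names are hypothetical placeholders, and it assumes the Pillow and NumPy libraries are available. Aggressive contrast stretching can expose faint background seams left by splicing, an edge filter highlights abrupt boundaries, and a pixel-wise difference between two panels makes duplicated content stand out.

```python
# Hypothetical illustration only -- not VerifEye's actual code.
import numpy as np
from PIL import Image, ImageFilter, ImageOps

def exaggerate_anomalies(path):
    """Return a high-contrast, edge-enhanced view of an image.

    Stretching and equalizing the contrast often reveals faint background
    seams left by splicing; the edge filter emphasizes abrupt transitions
    that are hard to see at normal brightness.
    """
    img = Image.open(path).convert("L")               # work in grayscale
    stretched = ImageOps.autocontrast(img, cutoff=1)  # clip 1% tails, stretch contrast
    equalized = ImageOps.equalize(stretched)          # flatten the histogram
    return equalized.filter(ImageFilter.FIND_EDGES)   # highlight sharp boundaries

def difference_map(path_a, path_b):
    """Pixel-wise absolute difference between two same-sized panels.

    Large regions of near-zero difference suggest duplicated content,
    such as a blot lane reused between figures.
    """
    a = np.asarray(Image.open(path_a).convert("L"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("L"), dtype=np.int16)
    return np.abs(a - b).astype(np.uint8)

if __name__ == "__main__":
    # File names below are placeholders for a submitted figure and its panels.
    exaggerate_anomalies("figure2_blot.png").save("figure2_blot_check.png")
    Image.fromarray(difference_map("panel_a.png", "panel_b.png")).save("panel_diff.png")
```

In practice, a screener would run several such transformations side by side and inspect the results visually; no single filter proves misconduct on its own, which is why the studies above emphasize following up with authors for their original images.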
