A new study published in PLOS Biology has uncovered widespread image-related problems in pre-clinical animal studies of subarachnoid hemorrhage. The findings highlight serious concerns for research integrity in a field that shapes early scientific understanding of a severe type of stroke.
This project, led by researchers at Radboud university medical center, reviewed 608 scientific papers and found image-related concerns in a large proportion of them. The problems, ranging from duplicated western blot panels to microscopy images reused from unrelated disease models, undermine the reliability of evidence across this area of neuroscience.
The research team, including René Aquarius, a Research Scientist and Forensic Meta-Scientist at Radboudumc, and Kim Wever, an Assistant Professor and Meta-Scientist at Radboud university medical center, worked on this investigation for nearly three years. What began as a systematic literature review evolved into the most extensive examination of image problems in pre-clinical subarachnoid hemorrhage studies to date.
“Imagetwin made it possible for us to systematically assess all included literature for image integrity issues. We could not have completed the project within a reasonable time frame without Imagetwin.”
- René Aquarius, Research Scientist and Forensic Meta-Scientist at Radboudumc
Why Image Integrity Matters
Subarachnoid hemorrhage is a life-threatening form of stroke. Pre-clinical animal models are essential for building evidence on potential mechanisms, interventions, and treatment pathways. When images in these studies contain duplication, manipulation, or reuse from unrelated experiments, it can distort results, misdirect follow-up research, and reduce trust in the field.
The study found that:
- ~40% of articles included duplicated western blot panels.
- Of the 243 problematic articles, 239 had image-related issues.
- One article was corrected without a formal correction notice being issued.
- Policies for dealing with problematic articles varied widely across publishers.
The team’s findings point to a broader structural issue: researchers rely on a body of literature that may contain compromised visual data, and publishers do not always apply consistent correction standards.
How Imagetwin Supported the Project
The Radboud university medical center team performed the full scientific analysis and evaluation. Imagetwin’s role was to support the technical side of the work.
We collaborated with the researchers and provided access to the Imagetwin platform for the duration of the project. This enabled the team to screen hundreds of papers at scale and confirm cases of duplication or reuse through database comparisons. Imagetwin’s database of more than 115 million scientific figures allowed the researchers to detect visual overlaps that would have been extremely difficult, if not impossible, to identify manually.
The research team defined the methodology, made all classification decisions, and interpreted every result. Our contribution was to provide the tooling and guidance needed to handle a large amount of visual data efficiently.
“Detecting duplicated images by eye is extremely time-consuming, and it’s easy to miss something. It was great to have access to Imagetwin to accelerate our investigation and make it more accurate. We used Imagetwin as our initial detection method and checked everything it flagged by eye.”
- Kim Wever, Assistant Professor and Meta-Scientist at Radboud university medical center
Three Years of Research Highlight the Need for Stronger Integrity Checks
The project began when the team noticed a suspicious duplication during a review session with student interns. That single finding led to a systematic investigation involving multiple collaborators, including experts in research integrity, scientific sleuths, and the CAMARADES collaboration.
The results have already drawn international attention and raised important questions:
- How can journals apply consistent image-screening practices?
- Should publishers use tools to retrospectively assess previously published articles?
- What level of transparency should accompany post-publication corrections?
- How can research fields avoid polluted evidence bases that mislead future studies?
While answers will differ across fields, this work shows that large-scale image screening is both possible and necessary, and that stronger tools can give research communities a clearer view of the evidence they rely on.