By bringing cases of potential plagiarism out into the open, researchers at UT Southwestern Medical Center have shed light on the peer-review and scientific-publication process.
In the past two years, UT Southwestern researchers have used a computer-based text-searching tool they developed, called eTBLAST, to analyze millions of abstracts randomly selected from Medline, one of the largest databases of biomedical research articles. They turned up nearly 70,000 highly similar citations.
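eTBLAST's actual matching algorithm is more sophisticated, but the core idea of flagging highly similar abstracts can be sketched with a simple cosine-similarity comparison over word counts. This is an illustrative sketch only; the function names and the 0.8 threshold are assumptions, not part of eTBLAST:

```python
from collections import Counter
import math

def similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between the word-count vectors of two texts (0.0 to 1.0)."""
    a = Counter(text_a.lower().split())
    b = Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def flag_similar(abstracts: list[str], threshold: float = 0.8) -> list[tuple[int, int]]:
    """Return index pairs of abstracts whose similarity meets the threshold."""
    pairs = []
    for i in range(len(abstracts)):
        for j in range(i + 1, len(abstracts)):
            if similarity(abstracts[i], abstracts[j]) >= threshold:
                pairs.append((i, j))
    return pairs
```

Flagged pairs would then go to human inspection, as in the study, since high textual similarity alone does not establish plagiarism (e.g., it can reflect legitimate updates by the same authors).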
Their subsequent analysis of a small sampling of these, including human inspection of the articles in question, revealed 207 pairs of articles with signs of potential plagiarism.
Now, in a commentary appearing in the March 6 issue of Science, the UT Southwestern researchers outline the wide range of reactions they received when they followed up with both victims and perpetrators of possible misconduct, as well as responses from journal editors.
"Studying these reactions might help to illuminate the reasons for such misconduct and might provide a way for the scientific community to prevent such activity in the future," said Dr. Harold "Skip" Garner, professor of biochemistry and internal medicine at UT Southwestern and senior author of the Science article, which appears in the Policy Forum section of the journal.
Dr. Garner and his colleagues sent 162 sets of questionnaires to the authors of original articles and to the authors of highly similar articles published later. Surveys also were sent to the editors of each journal in which the similar papers appeared. They received a reply in 143 cases. All respondents were guaranteed anonymity.
"Although our goal was merely to solicit information, our questionnaire triggered 83 internal investigations by editors, 46 of which have led to retraction," Dr. Garner said.
On the other hand, Dr. Garner said, nearly half of the duplications brought to light have prompted no action. In addition, on 12 separate occasions, editors specifically indicated that cases involving their journal would not be reviewed.
This variation in feedback reveals a great deal about the attitudes and motivation of scientists around the globe, including why some journal editors do not pursue obvious cases of duplication, Dr. Garner and his colleagues note. They speculate that some editors may not want to deal with the added stress of conducting a thorough investigation, while others might feel it would bring bad publicity or reflect poorly on their journal's review process.
The Science article and supplemental material available on the journal's Web site also include excerpts from statements made by authors and editors. The comments reveal many perspectives in response to being presented with evidence of possible plagiarism.
For example, before receiving the questionnaire, 93 percent of the original authors were not aware of the duplicate's existence. The responses from duplicate authors were more varied, however. Of the 60 replies, 28 percent denied any wrongdoing; 35 percent admitted to having borrowed previously published material; and 22 percent were from co-authors claiming no involvement in the writing of the manuscript. An additional 17 percent said they were unaware their names appeared on the article in question.
Of the 174 journal editors who responded, 11 admitted they had never personally dealt with a potentially plagiarized manuscript and were unaware how to proceed.
"The majority of these editors showed deep concern and were open to any helpful suggestions or recommendations we could offer," said Dr. Garner, who is a faculty member in the Eugene McDermott Center for Human Growth and Development at UT Southwestern.
"Our objective is to make a significant impact on how scientific publications may be handled in the future," Dr. Garner said. "As it becomes more widely known that there are tools such as eTBLAST available, and that journal editors and others can use it to look at papers during the submission process, we hope to see the numbers of potentially unethical duplications diminish considerably."