Thomson Reuters suspends journals from its rankings for ‘citation stacking’.
Mauricio Rocha-e-Silva thought that he had spotted an easy way to raise the profiles of Brazilian journals. From 2009, he and several other editors published articles containing hundreds of references to papers in each other’s journals — in order, he says, to elevate the journals’ impact factors.
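For readers unfamiliar with the metric: the impact factor published each year in Thomson Reuters’ Journal Citation Reports is the number of citations a journal receives in a given year to the papers it published in the previous two years, divided by the number of citable items it published in those two years. A minimal, purely illustrative calculation (the figures below are invented):

    # Illustrative two-year impact-factor calculation, not Thomson Reuters' code.
    def two_year_impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
        """Citations received this year to items from the previous two years,
        divided by the number of citable items published in those two years."""
        return citations_to_prev_two_years / citable_items_prev_two_years

    # Hypothetical journal: 150 citations in 2012 to its 2010-11 papers, 200 citable items.
    print(two_year_impact_factor(150, 200))  # prints 0.75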
                    
Because each article
 avoided citing papers published by its own journal, the agreement flew 
under the radar of analyses that spot extremes in self-citation — until 
19 June, when the pattern was discovered. Thomson Reuters, the firm that
 calculates and publishes the impact factor, revealed that it had 
designed a program to spot concentrated bursts of citations from one 
journal to another, a practice that it has dubbed ‘citation stacking’. 
Four Brazilian journals were among 14 to have their impact factors 
suspended for a year for such stacking. And in July, Rocha-e-Silva was 
fired from his position as editor of one of them, the journal Clinics, based in São Paulo.
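Thomson Reuters has not described the program in detail. As a rough sketch only (this is not the firm’s method, and the thresholds below are hypothetical), one way to flag such a pattern is to count, for each donor–recipient journal pair, the citations falling in the recipient’s impact-factor window and flag donors that supply an outsized share of them:

    # Illustrative sketch of flagging concentrated donor-to-recipient citation bursts.
    from collections import Counter

    def flag_stacking(citations, share_threshold=0.25, min_citations=100):
        """Flag (donor, recipient) journal pairs in which a single donor supplies an
        outsized share of the citations feeding a recipient's impact factor.
        `citations` is a list of (citing_journal, cited_journal) pairs, already
        restricted to the recipient's impact-factor window."""
        per_pair = Counter(citations)                      # citations per donor -> recipient pair
        per_recipient = Counter(r for _, r in citations)   # total window citations per recipient
        flags = []
        for (donor, recipient), n in per_pair.items():
            if donor == recipient:
                continue  # journal self-citation is screened separately
            share = n / per_recipient[recipient]
            if n >= min_citations and share >= share_threshold:
                flags.append((donor, recipient, n, round(share, 2)))
        return flags

    # Toy data: "Journal A" supplies 120 of "Journal B"'s 150 window citations.
    toy = [("Journal A", "Journal B")] * 120 + [("Journal C", "Journal B")] * 30
    print(flag_stacking(toy))  # [('Journal A', 'Journal B', 120, 0.8)]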
                    
“We’ve
 been caught wrong-footed,” says Rocha-e-Silva, a retired physiologist. 
The editors of the other three Brazilian journals collared by Thomson 
Reuters remain in place. In addition to these four journals, “there are a
 few others which played a part in this game, and they escaped”, he 
says.
                    
Editors have tried before to 
artificially boost impact factors, usually by encouraging the citation 
of a journal’s own papers. Each year, Thomson Reuters detects and cracks
 down on excessive self-citation. This year alone, it red-flagged 23 
more journals for the wearily familiar practice. But the revelation that
 journals have gained excessively from citations elsewhere suggests that
 some editors may be searching for less detectable ways to boost their 
journals’ profiles. In some cases, authors may be responsible for 
stacking, perhaps trying to boost citations of their own papers.
The journals flagged by the new algorithm 
extend beyond Brazil — but only in that case has an explanation for the 
results emerged. Rocha-e-Silva says the agreement grew out of 
frustration with his country’s fixation on impact factor. In Brazil, an 
agency in the education ministry, called CAPES, evaluates graduate 
programmes in part by the impact factors of the journals in which 
students publish research. As emerging Brazilian journals are in the 
lowest ranks, few graduates want to publish in them. This vicious cycle,
 in his view, prevents local journals improving.
                    
Abel
 Packer, who coordinates Brazil’s system of free government-sponsored 
journals, known as SciELO, says that the citation-stacking venture was 
“unfortunate and unacceptable”. But he adds that many editors have long 
been similarly critical of the CAPES policy because it encourages local 
researchers to publish in high-impact journals, increasing the 
temptation for editors to artificially boost their own impact factors.
Brazilian 
editors have campaigned for years for CAPES to change its system. “But 
they have always adamantly refused to do this,” Rocha-e-Silva says. 
Bruno Caramelli, editor of Revista da Associação Médica Brasileira,
 which was also hauled up by Thomson Reuters, says that by 2009, editors
of eight Brazilian journals decided to take matters into their own hands (see ‘Citation stacking’).
But
 other editors involved in the agreement dispute Rocha-e-Silva’s 
assertion that they aimed solely to increase impact factors; Caramelli 
and Carlos Roberto Ribeiro de Carvalho, editor of the Jornal Brasileiro de Pneumologia,
argue that the idea was also to showcase articles in Brazilian journals, attracting better contributions and raising quality all round.
The
 punishment for the four journals caught out has been severe: CAPES says
 that any articles they published in 2010–12 will not count in its 
October 2013 evaluation of graduate programmes. The journals have also 
been suspended from CAPES’ evaluation system until their impact factors 
are restored.
                    
Citation stacking first came to 
light in 2012, after Thomson Reuters was tipped-off about a case in 
which (now-retracted) articles in two journals contained hundreds of 
references to a third, boosting its impact factor (see go.nature.com/rehuit). In response, the company pioneered its new algorithm, designed to spot the practice.
What
 happened in the cases of the other ten journals censured for citation 
stacking is unclear. One involves a close pattern of citations between 
three Italian journals (International Journal of Immunopathology and Pharmacology, Journal of Biological Regulators and Homeostatic Agents and European Journal of Inflammation)
 all with the same editor-in-chief, Pio Conti, an immunologist at the 
University of Chieti-Pescara. Many of the authors on the relevant papers
 are also at that university. Conti told Nature that he regretted that the anomalous citations had occurred, but that “we have no quick explanation of the patterns”.
                    
In another case, review articles with hundreds of references to Science China Life Sciences
were meant not to lift its impact factor, but to clarify confusion
after a rebranding and to “promote the newly reformed journal to 
potential new readers”, says its editor-in-chief Dacheng Wang, who is 
based at the Chinese Academy of Sciences’ Institute of Biophysics in 
Beijing.
In a further case, the Journal of Instrumentation saw hundreds of cross-citations from papers authored in SPIE Proceedings by Ryszard Romaniuk, an electronics engineer who was part of the collaboration that put together the CMS experiment at the Large Hadron Collider at CERN, Europe’s high-energy physics laboratory near Geneva in Switzerland. Romaniuk, who repeatedly cited CMS engineering papers, has not replied to Nature, nor to the Journal of Instrumentation’s publishers — the Institute of Physics and the International School for Advanced Studies (SISSA).
Marie McVeigh, who leads Journal Citation Reports, Thomson
 Reuters’ annual report of journal citation patterns and impact factors,
 says that the firm does not assume intent in the patterns it sees. “We 
just analyse data, and see where the impact factor, due to this unusual 
concentration of citations, is not a reflection of the journal’s 
citation in the broader literature. Until we can algorithmically measure
 motive, we are just going to measure citations.” She says that because 
citation stacking is only a problem if it excessively distorts journal 
rank, only four Brazilian journals, and not all of those that 
Rocha-e-Silva says participated in the arrangement, have had their 
impact factors suspended.
Thomson Reuters has 
received some appeals against its rulings, with some editors arguing that it was unfair to suspend impact factors for incoming citations of which they were unaware, or that outgoing citations did not affect their own journal. The firm will deal with these by September, McVeigh says.
Rocha-e-Silva appealed against the suspension of Clinics, but his request was turned down.
                    
He
 feels that journal impact-factor scores are a bad way to judge 
scientists’ work and that the real problem is CAPES’ policy. But he also
 accepts responsibility for what he did. “We’re not going to play that 
again wherever I become editor, if I ever do. I’ve burned my fingers 
once,” he says.
The journals currently 
suspended for either self-citation or citation stacking represent only 
0.6% of the 10,853 in Thomson Reuters’ respected directory. But 
impact-factor anomalies will continue, McVeigh says. Aside from some 
journals with excessive self-citations, “we’ve already identified some 
journals citation stacking for next year”, she says.
- Nature 500, 510–511 (2013) doi:10.1038/500510a

 
 
