
All publishers are predatory - some are bigger than others

Key words:
Elsevier; Peer review; Preprint; Science Policy; Science Evaluation; Scientific Publishing.

Recently, an AABC editorial by Cordeiro and Lima (2017) discussed the problem of “predatory journals”. The authors tell the by now all-too-familiar story of a researcher being contacted for material by a journal with a seemingly reputable name, which turns out to be a haphazardly organized, pay-for-publication trap that profits from the number of articles it publishes, and therefore has no interest whatsoever in peer review or scientific rigor (Butler 2013).

Having read that story with interest, I would like to share an experience of my own with predatory journals.

On April 6th of this year, my coauthors and I submitted a systematic review of peripheral biomarkers in multiple psychiatric disorders to a scientific journal specializing in reviews in the neuroscience field. The work had already been available as a preprint in an open repository for a while (Pinto et al. 2016), attracting a reasonable amount of attention - since November 2016, it has accumulated more than 2,000 views, more than 400 downloads and a number of tweets. Nevertheless, for the sake of making it more visible, as well as for it to count in funding agency evaluations, we decided to submit it to a formal journal as well.

On May 15th, we received an e-mail telling us that reviews were complete and decisions were pending, after two reviewers had evaluated the manuscript. Nevertheless, the editor decided to send it to a third reviewer, and the manuscript went back into slumber.

On June 14th, more than two months after initial submission, we enquired about the manuscript’s status, only to be told that the editor was still waiting for the third reviewer’s recommendation, and that we should therefore wait longer.

On July 2nd, we received an automatic e-mail telling us that reviews were complete and that an editorial decision was pending. Nevertheless, it was not until July 17th - 102 days after we had originally submitted the manuscript - that we finally received feedback from the journal.

The editor’s decision was that the paper required major revision, based on the feedback received from 3 reviewers - or actually 2, as “reviewer #1” was merely a blank space on the editorial letter, perhaps because he or she had never sent in a review in the first place. Reviewer #2 had 3 major comments and 2 minor ones, all of them relatively simple to address. Reviewer #3 said the manuscript could be accepted as it was. The feedback received from both reviewers and the editor added up to 497 words - that is, less than 5 words for each day that the manuscript had been under review.

Nevertheless, we worked on the reviewers’ comments and submitted a revised version on August 31st. On October 1st - 172.7 days after initial submission, according to the publisher’s own website - the article was finally accepted.

Shortly after, we received the proofs for the formatted version of the manuscript. The tables had been typeset automatically, leaving their formatting much worse than in the original manuscript. We made corrections and complaints on the proof, which we submitted on October 9th. We still don’t know if they were taken into consideration, as to this date, more than five months later, a typeset version of the article has still not been made available, and all that can be found on the website is our own PDF version.

Nevertheless, the journal’s website shows that it is already charging US$ 35.95 for a single copy of the article (Pinto et al. 2017). Of course, most customers will access it through an institutional subscription to the journal - which, also according to the website, costs US$ 4,420 per year for an institution, not including taxes.

Now let’s briefly review what the publisher has done in order to charge those prices:

  • It provided a platform for submission - which basically merged files into a PDF and forwarded it by e-mail to the editor, something a marginally tech-savvy person could have done in less than 10 minutes.

  • Through the editor, it looked for reviewers for the article. We do not know how much work that took, as the process occurs behind closed doors. Nevertheless, it took more than 3 months for the publisher to provide us with a couple of brief reviews. In comparison, a mere 48 hours after the preprint version of the article had been posted on bioRxiv, it had already been retweeted 15 times (although no formal comments had been made), suggesting that scientific readership might not be that hard to find.

  • After a whopping 102 days, the publisher was able to provide some feedback on the article - 5 comments that fit into little more than half a page. That feedback was based on the opinion of reviewers who, in all likelihood, provided their expertise to the publisher for free. That said, we cannot know for sure whether the chosen reviewers were indeed experts in the field, as both of them remained anonymous.

  • After resubmission, it took around 30 days to confirm that the paper was accepted, without any further comments or feedback to justify that delay.

  • After acceptance, it provided an automatically typeset version of the article in which the formatting of the tables was ruined. Despite our complaints on the proof, the publisher has still not been able to provide a final typeset version of the manuscript at the time of writing, even though more than three months have passed.

On the other side of the deal, my coauthors and I had worked for around 2 years on the manuscript by the time of submission. To arrive at the final product, we reviewed 1050 original studies and 142 meta-analyses, and extracted data from 326 of those articles in order to produce a 6,486-cell, 282-row-by-23-column spreadsheet. We produced figures, had countless discussions on the subject, read a huge number of articles, worked hundreds of hours on the manuscript, and showed it to multiple collaborators who provided their input. Both my salary and my coauthors’ scholarships were paid by Brazilian governmental agencies. Ultimately, that is the value for which the publisher charges the fees quoted above - science produced with public money whose copyright is transferred for free to a private company.

Does that sound like a fair deal? Hardly. In fact, comparing this kind of predatory behavior to that of the “predatory journals” in Beall’s list alluded to by Cordeiro and Lima (2017) is like comparing sharks to zooplankton, in terms of both magnitude and greed.

The predator in this case, as one might have guessed, was Elsevier, the largest scientific publisher in the world. It is currently the “Scientific, Technical & Medical” branch of the RELX Group, one of the internet’s largest and most profitable companies. In 2016, RELX reported revenues of over US$ 9 billion, with over US$ 3 billion coming from Elsevier, most of it through journal subscriptions (RELX Group 2016). Elsevier’s net profit on that amount was around US$ 1.1 billion, a 37% margin. The large margin is not a surprise, as the company’s business model mainly involves acquiring content for free, putting it behind paywalls and charging for access, which is of course highly profitable.

The journal’s name is Neuroscience & Biobehavioral Reviews. It specializes in reviews in the field of neurobiology and has an impact factor of 8.299, according to the 2016 Journal Citation Reports.

This last number might make some researchers think that the deal was not completely unfair, at least on an individual basis. After all, publishing in a journal with such an impact factor - ranked as A1 in the current Qualis classification for the CAPES Biological Sciences II and Medicine II areas - will definitely be a boost for the next evaluation of the authors’ graduate programs. Thus, 172 days of waiting for an editorial decision and the free transfer of the copyright over the authors’ own work have in the end paid off in a sense, haven’t they?

We would argue otherwise. The value created by Elsevier in the process is purely artificial. It consists of a virtual stamp of approval, based on the work of anonymous reviewers, which is interpreted by Brazilian funding agencies as an indicator of quality that is transferred to the authors and their graduate programs. Nevertheless, this whole process creates little value, none of which is directly provided by the publisher.

Before being submitted, accepted and published, the article had already been available on bioRxiv, which provides a platform for the open distribution of preprints and is currently maintained by the Cold Spring Harbor Laboratory, with support from institutions such as the Chan-Zuckerberg Initiative. The postprint version has also been archived on the author’s page at ResearchGate - a commercial website whose business model is built on providing access to scientific articles, rather than restricting it. The actual peer review work was most likely not paid for, and could have happened just as easily and more quickly on bioRxiv itself, or in post-publication peer review forums such as PubPeer. The role played by Elsevier in the process was merely that of an inefficient and ludicrously expensive middleman, as the scientific content it charges for would still be available without its participation.

Thus, the assumption that the publication of the article in a high-impact-factor, indexed journal somehow adds value to Brazilian or international science is a collective illusion - one that is unfortunately shared by funding agencies, institutions and researchers. This illusion - which serves as an excuse to delegate the evaluation of science to for-profit companies and anonymous reviewers for the sake of false objectivity - costs taxpayers dearly: in Brazil, around US$ 85 million were spent by CAPES on national journal subscriptions in 2016 (Tuffani 2016), a figure that excludes other costs handed over to publishers, such as open-access fees and local institutional subscriptions.

More absurdly still, basing evaluation systems on journal impact factors - also measured and published by a for-profit company, Clarivate Analytics - creates a perverse situation in which scientists, even though fully aware that no value is created by commercial publishers, will still choose to offer them the copyright of their work for free, in order to ensure the financial survival of their research groups in a competitive academic environment.

Ironically, there is little empirical evidence that the prepublication peer review market commanded by these companies actually improves the trustworthiness of science (Jefferson et al. 2007). On the contrary, the woeful reproducibility rates of many fields of biomedical science suggest that the publish-or-perish mentality driven by this system, which makes publication an end in itself, might actually be making science less reliable (Smaldino and McElreath 2016) - and in the process, draining billions of dollars that could be invested in advancing knowledge rather than putting it behind paywalls.

For the alternative of reading and reviewing science outside the traditional publishing system to become a reality, however, some cultural barriers must be overcome. Although there are myriad forums for commenting on preprints and published articles, these are still used very sparingly. In February 2018, the National Center for Biotechnology Information (NCBI) announced the discontinuation of its post-publication peer review feature, PubMed Commons, due to its meager level of use: more than 4 years after its launch, only around 6,000 (or 0.02%) of the more than 27 million articles on PubMed had received a comment (NCBI 2018).

This suggests that convincing scientists to use these forums requires incentives, which need not be material: in spite of not being directly rewarded, prepublication peer review is seen as a duty and a sign of prestige by the scientific community. The scientist-driven initiative ASAPbio has been discussing proposals to establish journal-agnostic, post-publication peer review as a recognized practice, such as Peer Feedback (Vale et al. 2018) and APPRAISE (Eisen 2018). Technological implementation of such proposals is certainly feasible, but their adoption depends on cultural change on the part of scientists concerning journal-independent peer review.

A closely related discussion is how to replace the heuristic value of journals in orienting scientists on what to read in an age of information overload. Currently, reviewers and editors perform this duty by weighing the perceived importance of an article in their acceptance decisions. Nevertheless, I would argue that methodological soundness and importance are dimensions that should be evaluated separately - with the latter ideally assessed after publication. This was the rationale behind the creation of large-scale, ‘impact-agnostic’ journals such as PLOS ONE, which focus peer review on methodology, with the expectation that scientific importance will later be assessed by the community. Nevertheless, this hope has not been completely fulfilled, due to the scarcity of post-publication peer review - a problem that, once again, seems to require cultural rather than technological change to be solved (Nosek and Bar-Anan 2012).

Recent events, however, suggest that change is possible - and perhaps inevitable. In January 2012, mathematician Tim Gowers started a movement urging scientists to boycott Elsevier - by refusing to publish in, review for and/or do editorial work for its journals - in protest against the exorbitant prices charged by the company for subscriptions, as well as its legal opposition to the free exchange of information (The Cost of Knowledge 2012). At the time of writing, the boycott had been signed by 16,974 researchers - including myself, after the predatory incident I have just described. This number is still exceedingly small, and has had little impact on Elsevier’s profits. Nevertheless, even this small-scale protest was successful in removing the company’s support for the Research Works Act, a bill that sought to prohibit open-access mandates for federally funded research in the United States.

More recently, around 200 German universities and research institutes have not renewed their contracts with Elsevier, in order to pressure the publisher into a nationwide agreement that would include both access to its journals and open-access publishing for the country’s scientists. As recently declared by a member of the German negotiating team, “most papers are now freely available somewhere on the Internet, or else you might choose to work with preprint versions” (Schiermeier 2018). This possibility is of course dependent on most scientists using preprints as a routine form of dissemination, something that is not yet a reality in most fields. Nevertheless, the exponential growth of preprint usage in the life sciences (Kaiser 2017) suggests that the practice is developing rapidly, and is likely to accelerate if more scientists and institutions start to break away from large for-profit publishers.

Much has been said and written about the problem of predatory open-access journals (Butler 2013, Cordeiro and Lima 2017). Nevertheless, the profit made by these journals at the expense of science seems trifling compared to the much larger-scale predation performed by Elsevier and other large publishers - and the prey in this case is not individual authors, but science itself. The existence of predatory open-access journals is only the most pathetic facet of a much deeper cultural problem within science - a form of comic relief against the backdrop of a tragedy, which we should all take as a reminder of how far from our goals we have strayed.

REFERENCES

  • BUTLER D. 2013. Investigating journals: The dark side of publishing. Nature 495: 433-435.
  • CORDEIRO Y AND LIMA LMTR. 2017. Publish and perish in the hands of predatory journals. An Acad Bras Cienc 89: 787-788.
  • EISEN MB. 2018. APPRAISE (A Post-Publication Review and Assessment In Science Experiment). Available at http://asapbio.org/eisen-appraise.
  • JEFFERSON T, RUDIN M, BRODNEY FOLSE S AND DAVIDOFF F. 2007. Editorial peer review for improving the quality of reports of biomedical studies. Cochrane Database Syst Rev 2: MR000016.
  • KAISER J. 2017. The preprint dilemma. Science 357: 1344-1349.
  • NCBI. 2018. PubMed Commons to be discontinued. Available at https://ncbiinsights.ncbi.nlm.nih.gov/2018/02/01/pubmed-commons-to-be-discontinued/.
  • NOSEK BA AND BAR-ANAN Y. 2012. Scientific Utopia: I. Opening scientific communication. arXiv:1205.1055 [physics.soc-ph].
  • PINTO JV, MOULIN TC AND AMARAL OB. 2016. On the transdiagnostic nature of peripheral biomarkers in major psychiatric disorders: a systematic review. Neurosci Biobehav Rev 83: 97-108.
  • RELX GROUP. 2016. Annual Reports. Available at https://www.relx.com/investors/annual-reports/2016.
  • SCHIERMEIER Q. 2018. Germany vs Elsevier: universities win temporary journal access after refusing to pay fees. Nature 553: 137. DOI: 10.1038/d41586-018-00093-7.
  • SMALDINO PE AND MCELREATH R. 2016. The natural selection of bad science. R Soc Open Sci 3: 160384.
  • THE COST OF KNOWLEDGE. 2012. Statement of purpose. Available at https://gowers.files.wordpress.com/2012/02/elsevierstatementfinal.pdf.
  • TUFFANI M. 2016. Capes negocia redução de US$ 20 milhões em contratos e mantém Portal de Periódicos. Direto da Ciência, 15/8/2016. Available at http://www.diretodaciencia.com/2016/05/18/capes-negocia-reducao-de-us-20-milhoes-em-contratos-e-mantem-portal-de-periodicos/.
  • VALE R, HYMAN T AND POLKA J. 2018. Peer Feedback. Available at http://asapbio.org/peer-feedback.

Publication Dates

  • Publication in this collection
    21 May 2018
  • Date of issue
    Apr-June 2018

History

  • Received
    07 Dec 2017
  • Accepted
    12 Jan 2018