
Citation Indicators and Scientific Relevance: Genealogy of a Representation

ABSTRACT

This article analyzes the association between citation indicators and scientific relevance as a social representation supported by assumptions formulated and consolidated during the second half of the twentieth century. When approached in Latin America as a recent problem, this association is stripped of the weight of the processes which have historically configured this construction. Drawing on certain concepts of Bourdieu, Even-Zohar, and Moscovici, this paper investigates the political and economic processes which historically contributed to this representation and the aspects of it that are present in the citation indicators currently in use. Claiming that citation indicators are a form of evaluating scientific knowledge generates a distortion of the system of values which relegates conceptual ecosystems and innovative or little-explored themes, and which stimulates remaining within already highly populated areas in order to ensure citation.

Keywords: scientific journals; evaluation of scientific production; citation metrics


INTRODUCTION

One of the most complex and deeply rooted representations within the scientific community is the one in which the value of a citation indicator obtained by a journal is synonymous with its scientific relevance and, thus, with that of the research published there. In other words, it rests on the premise that the citation indicator reflects, transparently and directly, the impact or contribution to scientific knowledge of the research published in a journal, in such a way that the higher the indicator value, the greater the relevance, and vice versa.

Although this association has been much questioned by the scientific community (DORA, 2012; Hicks et al., 2015; Adler, Ewing and Taylor, 2008; Vessuri, Guédon and Cetto, 2014; Beigel, 2014; Beigel, Gallardo and Bekerman, 2018), the bodies responsible for assessing academic and scientific production in certain Latin American countries, such as Mexico, Colombia, Chile, Brazil, and Argentina, are increasingly clinging to corporate metrics and simplifying evaluation parameters based, above all, on the quartiles published in the Scimago Journal & Country Rank, funded by Elsevier (Vasen and Vilchis, 2017; Alperin and Rozemblum, 2017; Gómez-Morales, 2018; Farias et al., 2017).

We understand that this association is based on assumptions formulated and consolidated in European and, especially, US scientific culture in the second half of the twentieth century. However, since in Latin America it has been approached in a synchronic form, as if it were a recent problem limited to disputes over funding or leadership in certain disciplinary areas, it is stripped of its historicity and, therefore, of the complexity of the economic, political, ideological, and cultural processes which have historically shaped its construction.

Although scientific journals are key publishing agents within the scientific field, responsible for the circulation and, above all, for the legitimization of both the symbolic goods produced and their producers (Bourdieu, 1994), they are nevertheless mechanisms cut across by economic interests, governmental policies, and the modes of production of the publishing field, an aspect which does not necessarily become an object of study for the scientific community. This distance from the object prevents the passage from testimony to direct observation, meaning that the system of representation is nourished by a system of alien meanings which impregnate practices and discursivities (Moscovici, 1979). These types of symbolic constructions seem to carry more weight the further they move away from being confronted with direct experience.

Hence, it is precisely those who were the primary participants in the apparent “benefits” of citation metrics who most question this association. For example, Nobel laureates such as Joseph Goldstein, Peter Doherty, Paul Nurse, Bruce Beutler, and Randy Schekman (The Nobel Prize, 2017; Schekman, 2013); institutions such as Research Councils UK (2013), the Wellcome Trust (2019), the European Molecular Biology Organization (2018), the American Society for Microbiology (Bertuzzi et al., 2016), the Association of the Scientific Medical Societies in Germany (Herrmann-Lingen et al., 2012), the UK Forum for Responsible Research Metrics (2018), and the International Mathematical Union, the International Council for Industrial and Applied Mathematics, and the Institute of Mathematical Statistics (Adler et al., 2008), amongst many other organizations, have spoken out against judging the relevance of research on the basis of the journal in which it is published and of citation metrics.

From this perspective, this paper proposes to open the discussion and address the following questions: what were the political and economic processes which historically contributed to the formation of the current symbolic representation that associates citation indicators with scientific relevance? And which aspects of this representation are present in the construction of the citation indicators used at present?

ABOUT THE RESEARCH: CENTRAL CONCEPTS AND EMPIRICAL DATA

To address both these questions, we focus on scientific journals as publishing agents resulting from a social practice in which both the scientific field (Bourdieu, 1994) and the publishing field (Bourdieu, 1995, 1999) participate, intersected by political and economic dimensions. In Bourdieuian terms, the agents of the publishing field – in this case scientific journals, their publishers, and the institutions of which they are part – not only have the power to give a text and its authors access to a “public existence”, but in the same act they can transfer to the text and its authors the symbolic capital accumulated within the publishing field (Bourdieu, 1999). The particularity of scientific journals is that the symbolic capital transferred to a text and its authors, in the very act of publication, is shaped by values configured within the scientific field, which is why many of those engaged in publishing academic books do not understand the practices of publication, circulation, and legitimation of scientific journals. In turn, scientific journals are materialized on the basis of modes of production from the publishing field which, in Bourdieu’s terms, is “the site of the antagonistic coexistence of two modes of production and circulation obeying inverse logics”: on the one hand, a logic based on the rejection of commercial and economic benefits, oriented to the accumulation of symbolic capital, and, on the other, an industrial economic logic which, “since they make the trade in cultural goods just another trade, confer priority on distribution, on immediate and temporary success” (Bourdieu, 1995:214).

These concepts are central to addressing the association between citation indicators and scientific “relevance” or quality as a symbolic construction since, as we will argue in this text, industrial economic logic played a great role in the shaping of this association, affecting both scientific and publishing practices. As Moscovici (1979) mentions, a social representation is an organization of language which structures and symbolizes acts and situations that become common, but which are configured through testimonies that are not part of our direct experiences, meaning that it is an indirect construction which comes to us intermediated by the interests of those who organize this “world of discourse.”

In turn, we propose the concept of “conceptual ecosystems”, based on Even-Zohar’s theory of polysystems (1990), to try to overcome the notion of “center-periphery” so present in the works of sociology and the history of science produced in the twentieth century, but which we believe has now lost its explanatory force. According to Even-Zohar, the notion of center and periphery is based on a unisystem exclusively identified with the central strata – for example, official culture, industrialized science, “high impact” journals – as the only valid model, and which regards peripheries as something “categorically extra-systemic”. The concept of polysystems breaks away from this notion, allowing for the coexistence of adjacent systems within the same polysystem, in other words, the coexistence of distinct conceptual ecosystems within a specific area of knowledge.

Based on these concepts, the first part of this article analyzes how, at the end of the Second World War, the process of the industrialization of science consolidated a “unisystem” in the interests of US Big Science and the scientific publishing industry, placing in the periphery, in other words in this extra-systemic space, not only Latin America, but also the German sociological and philosophical tradition, French sociology and linguistics, and the production of Asian countries, Russia, and the rest of the Eastern European countries.

In the second part, the principal dimensions of citation indicators are analyzed, such as the selection of the journal sample, the construction of numerators, the quartiles per thematic category, and the periods analyzed by the indicator, in which it can be seen how Mertonian universalism fades and is replaced by a logic of stratification. It is worth noting that while the most discussed citation indicator in the literature is the impact factor (IF), currently the property of Clarivate Analytics, to exemplify the proposed dimensions we focus on the records, indicators, and quartiles of the Scimago Journal & Country Rank, as it is the most used measure in various Latin American countries. This site, created in 2007, uses as its raw material the journals that are part of Elsevier’s Scopus (Butler, 2008). For the analysis, all the journal records available in November 2018 were extracted from the Scimago Journal & Country Rank and the number of journals included by country and region was counted. To analyze the quartiles, the list of countries in each of the thematic categories was standardized, the number of journals which each country had in each quartile was counted, and those thematic categories which contained at least one Latin American journal in Quartile 1 (Q1) were identified.
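The counting procedure described above can be sketched in a few lines of code. The snippet below is a minimal illustration only, assuming the Scimago Journal & Country Rank records have been exported to a CSV file with hypothetical column names (title, country, region, category, quartile); it is not the script actually used for the analysis.

```python
import pandas as pd

# Hypothetical export of the Scimago Journal & Country Rank records (November 2018).
# Column names are illustrative: title, country, region, category, quartile.
records = pd.read_csv("sjr_records_2018.csv")

# Number of journals included by country and by region (as in Figure 1).
journals_by_country = records.groupby("country")["title"].nunique()
journals_by_region = records.groupby("region")["title"].nunique()

# Journals per country and quartile within each thematic category (as in Table 1).
by_category = (
    records.groupby(["category", "country", "quartile"])["title"]
    .nunique()
    .unstack("quartile", fill_value=0)
)

# Thematic categories with at least one Latin American journal in Quartile 1.
latam_q1 = records[(records["region"] == "Latin America") & (records["quartile"] == "Q1")]
categories_with_latam_q1 = sorted(latam_q1["category"].unique())

print(len(categories_with_latam_q1), "categories contain a Latin American journal in Q1")
```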

ASSOCIATION BETWEEN CITATION INDICATORS AND SCIENTIFIC RELEVANCE: POLITICAL AND ECONOMIC PROCESSES

When Robert Merton postulated the values of “universalism” and “communism” as part of the “ethos” of science in 1942, he was drawing on the scientific culture of the beginning of the twentieth century, unaware of the changes which would occur after the end of the Second World War. In the first decades of the twentieth century, scientific practice itself and, consequently, the publication, distribution, and indexing practices of science were intersected by certain ideals of inclusion and by the compilation of the largest possible amount of research from distinct regions of the world. In fact, in the Catalogue of scientific papers published by the Royal Society of London, which covered the period 1800-1900 in four series, articles can be found published in journals from Chile, Argentina, Brazil, Mexico, Cuba, etc. (Royal Society of London, 1867, 1877, 1891, 1914). In this period, scientific journals, whether from Europe, the US, or Latin America, were distributed for free among those associated with the institution, and many copies served as a currency of exchange to obtain journals published by other scientific societies through an “exchange” system, and thus increase library collections with journals from other regions of the world. As Cook mentions, academic publication in those years “was not a commercial business, it was undertaken to promote and disseminate knowledge” (Cook, 2001:20).

“Communism”, in the non-technical and extended sense of common ownership of goods, is a second integral element of the scientific ethos. The substantive findings of science are a product of social collaboration and are assigned to the community. They constitute a common heritage in which the equity of the individual producer is severely limited (Merton, 1973:273).

In relation to “universalism”, Merton postulated a science in which the acceptance or rejection of a scientific formulation should not depend on personal or social attributes, but should rather be based on objectivity and “preestablished impersonal criteria” founded on observation and previously confirmed knowledge. Ethnocentrism and private interests were thus alien to the ethos of science: “when the dominant definition of the situation is such as to emphasize national loyalties, the man of science is subjected to the conflicting imperatives of scientific universalism and of ethnocentric particularism” (Merton, 1973:270). For Bourdieu, Merton does not address the relationship between these ideal values of science, the norms it professes, and the social structure of the scientific universe, in other words, “the mechanisms which tend to facilitate ‘control’ and communication, evaluation and retribution” (Bourdieu, 1997:86). However, it is this idealized vision of science which, even today, makes invisible the participation of industrial economic logic, corporate interests (Camargo Jr., 2009), or ideological-cultural values in the shaping of representations, practices, and discursivities within the scientific field.

By the end of the Second World War, the US dominated the global economy, with almost two thirds of global industrial production (Hobsbawm, 2010). In the 1950s, driven by US state policy, a process of the global stratification of science began, which supported a scientific-technological elite financed with governmental funds. By 1960, with the military-industrial complex, this elite had accumulated immeasurable power, as described by the then US president, Dwight D. Eisenhower, at the end of his mandate (Eisenhower, 1961).

In 1961, the nuclear physicist Alvin Weinberg, in his paper on the impact of large-scale science, asked, “Is Big Science ruining science?” (Weinberg, 1961:161). He argued that to ensure that members of Congress continued to approve the high percentages of funding, significant public support was needed, achieved through publicity. To put science onto the public agenda it was necessary to inject “news” into the market that would be readily taken up by the means of communication, leading to an “enormous proliferation of scientific writing, which largely remains unread” (Weinberg, 1961:161). This led to the expansion of the publishing industry in the countries where the academic-scientific community had more money in circulation, whether from the state or from distinct industrial sectors, which in turn led to a change in the modes of production: a high percentage of the new journals were not endorsed by scientific societies, but were created in the commercial offices of large publishers such as Pergamon Press, Springer, Elsevier, and Taylor & Francis with the purpose of absorbing the publication of articles from various institutions (Fredriksson, 2001). This is when the industrial economic logic of the publishing sector entered the scene since, in Bourdieu’s terms, it makes “the trade in cultural goods just another trade” and, aligned with the interests of Big Science, gives priority to immediate success and diffusion.

In this context, in 1955 Garfield founded the Institute for Scientific Information (ISI) and, within this institutional framework, created three databases which divided knowledge into three main areas: science, the social sciences, and, finally, the arts and humanities. This division strengthened the separation of a central core of “science” from the social sciences. The first version of the Science Citation Index (SCI) was published in 1963 in a print format (Garfield, 1963). In 1972, the Social Science Citation Index (SSCI) was published (Garfield, 1972a), followed a few years later, in 1978, by the Arts & Humanities Citation Index (AHCI) (Garfield, 1977). In the introduction to the first print edition of the SCI, Garfield stated:

The citation index is your historical roadmap of the literature. Where you travel is primarily your decision. What you find will depend on you and what is available. Only you can measure its relevance. What may be valuable for one man is irrelevant for another. Two otherwise unrelated scientific observations may be correlated in the citation index through a common reference. (Garfield, 1963)

This idea that each researcher could measure the relevance of an article and reconstruct what was being discussed about a theme, by creating a relationship roadmap based on the crossing of citations, would be strongly overshadowed after the publication of the first “impact factor” ranking in 1969, under the title Journal Citation Reports. With this citation indicator it was no longer necessary to reconstruct relations between authors, nor to recover the distinct coexisting conceptual frameworks. This process was gradually reduced and simplified, as mentioned in a report from the International Mathematical Union, the International Council for Industrial and Applied Mathematics, and the Institute of Mathematical Statistics:

The idea that research assessment must be done using “simple and objective” methods is increasingly prevalent today [...] There is a belief that citation statistics are inherently more accurate [...] While numbers appear to be “objective”, their objectivity can be illusory. The meaning of a citation can be even more subjective than peer review. Because this subjectivity is less obvious for citations, those who use citation data are less likely to understand their limitations. (Adler et al., 2008)

One of these limitations was the sample of journals on which the impact factor was calculated. It was the time of “punch cards”, and the limited processing capacity of computers in the 1970s allowed the citations of only a limited number of journals to be tracked. In fact, Garfield himself explained that journals which did not use the Roman alphabet, such as Russian and Japanese journals, were not as easy and economical to include in the SCI database (Garfield, 1975a).

According to Jean-Claude Guédon (2016), Garfield made many efforts to explain how the essence of world science could be analyzed with just a few journals; to conceal technical problems, it was necessary to argue that the 2,400 journals included at the time in the Science Citation Index were not only representative of what was researched and published at the global level, but that they were the “most important”, and the basis on which the citation indexes were calculated:

The coverage of SCI [Science Citation Index] is international and multidisciplinary; it has grown from 600 journals in 1964 to 2,400 in 1972, and now includes the world’s most important scientific and technical journals in most disciplines. (Garfield, 1972b:527)

For more than 50 years the impact factor created by Eugene Garfield (2006) was the only citation indicator at a global level, which allowed it to hold the monopoly of the “symbolic construction of exclusivity” (Münch, 2015) and to mold a representation of scientific quality as synonymous with a high citation rate. The conceptual and theoretical ecosystems of each country were devalued by a new “geography” of world science, a “unisystem” in the terms of Even-Zohar (1990), which divided the scene between central science, which followed the industrial logic, with journals produced by the publishing industry and with a good performance within the system created by Garfield (in other words, a high impact factor), and peripheral or “extra-systemic” science, consisting of everything published in journals which did not reach the top of the ranking. Trapped within this package called peripheral science was not only Latin American production, but also many other areas of knowledge whose logics of production and citation dynamics were not consistent with the standards set by the indicators. While the unlimited entry of journals created by the publishing industry and by US and English scientific societies was permitted, the entry of journals from the rest of the world was restricted, and the performance of the former group was immediately evaluated as if it were a representative sample of the journals existing at a global level.

This mechanism was and is functional for the demands of US Big Science which, as research supported by an industrial logic, was transformed into an apparatus of economic production needing immediate results that could be easily transmitted by the means of communication, whose publicity allowed the justification and renewal of high levels of investment on the part of the state (Capshew and Rader, 1992; Andriesse, 2008; Weinberg, 1961). Thus, a version of the bibliometric laws formulated by Lotka, Bradford, and Price – adjusted to industrial necessities and applied to the Journal Citation Reports – reorganized the focus on reality, creating a “truth” which acquired the status of a set of rules, and both the scientific community and national policies adopted, transformed, and reproduced these rules, not necessarily because their technical superiority had been demonstrated (Shenhav and Kamens, 1991), nor because their premises were a great contribution to the scientific relevance of each country, but rather because they established themselves as “highly rationalized myths” (Meyer and Rowan, 1977).

Lotka (1926) noted a constant fact: a reduced number of authors publish most articles. However, he does not prove any relationship between high productivity and a greater contribution to science and scientific knowledge. Something similar happens with Bradford (1934), who postulates that a reduced number of journals (the core) concentrates the largest quantity of articles on a given theme, while a large number of journals contain only a reduced number of articles on that theme. The association between the journals which publish the most articles about a particular topic and a greater relevance or contribution to science is not present in this formulation. Nevertheless, it laid the basis for certain symbolic constructions which took hold during the second half of the twentieth century.

In turn, the law of obsolescence or the immediacy factor, which Derek J. de Solla Price proposed in 1965 in his paper “Networks of Scientific Papers”, shows that more recent articles are cited more frequently which, in his words, generates the “well-known phenomenon of papers being considered obsolescent after a decade” (Price, 1965:513). In relation to this, in 1975, Eugene Garfield stated that:

Citation practices differ from one field to another. [...] Rapid obsolescence may characterize one field but not another. Thus, for example, it would be foolish to conclude, merely on the basis of citation counts, that Journal of the American Chemical Society is a “better” journal than Annals of Mathematics, or to hypothesize, without a great deal of study, which serves its own field “better”. (Garfield, 1975a:3)

The ten-year obsolescence mentioned by Price and the differences between fields suggested by Garfield in 1975 were later replaced by simplified and generalized statements applied to all fields:

...papers that achieve a high impact are usually cited within months of publication and certainly within a year or so. This pattern of immediacy has enabled Thomson Scientific to identify “hot papers”. (Garfield, 2006:92)

Garfield related the “pattern of immediacy” and “hot papers” to science as a whole and reduced the period of obsolescence to two years. This decision had a significant impact on scientific production. In this way, the parameters adopted by Eugene Garfield in the shaping of bibliometric indicators, together with the interests of the scientific publishing industry, came to be a cog in the productive chain of Big Science, and the impact factor was converted into a form of classifying and advertising the scientific scope of its products.

The new-look journals, with their emphasis on big results, shot to the top of these new rankings, and scientists who published in “high impact” journals were rewarded with jobs and funding. Almost overnight, a new currency of prestige had been created in the scientific world. (Buranyi, 2017)

As we will see in the next section, these decisions adopted in the second half of the twentieth century not only remain in force, but are reaffirmed on the basis of this symbolic construction which associates citation metrics with scientific relevance.

ASPECTS PRESENT IN CITATION INDICATORS: THE ASSOCIATION BETWEEN CITATION INDICATORS AND SCIENTIFIC RELEVANCE

The stratification of science in the second half of the twentieth century was constructed on “closed” calculations, whose results were only distributed in print form. Only in 2002 could the databases created by Eugene Garfield, then owned by Thomson Scientific, be consulted on a web platform in which the citation data from Web of Science was integrated with the citation indicators from Journal Citation Reports.

In 2004, as a way of countering the power accumulated by the Web of Science and by the impact factor as the only citation indicator, the scientific publisher Elsevier launched the Scopus database with the promise of expanding the geographic scope of journals and forming a more representative sample. The SCImago Research Group designed a new indicator for Scopus, called the SCImago Journal Rank (SJR), in which quartiles are used to measure position, determining the location of a journal within a thematic category (Scimago Research Group, 2007; González-Pereira, Guerrero-Bote and Moya-Anegón, 2010; Guerrero-Bote and Moya-Anegón, 2012). However, this new database not only does not expand the geographic scope, but also reproduces, and in some cases amplifies, some of the inconsistencies of the already existing indicators. Nevertheless, it was adopted by the bodies that assess scientific production and categorize journals in various Latin American countries, replacing the preexisting indicators (Vasen and Vilchis, 2017; Alperin and Rozemblum, 2017; Gómez-Morales, 2018; Farias et al., 2017).
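As a rough illustration of how position quartiles of this kind are typically derived, the sketch below sorts the journals of one thematic category by their indicator value and splits the ranking into four equal groups. It is a simplified reading of the procedure, with invented journal names and values, not SCImago's actual implementation (which, for instance, must handle ties and journals assigned to several categories).

```python
import pandas as pd

# Toy records for one thematic category: journal title and its SJR value.
# Titles and values are invented for illustration only.
category = pd.DataFrame({
    "title": ["Journal A", "Journal B", "Journal C", "Journal D",
              "Journal E", "Journal F", "Journal G", "Journal H"],
    "sjr": [4.2, 2.9, 1.7, 1.1, 0.8, 0.5, 0.3, 0.2],
})

# Rank journals by SJR (highest first) and cut the ranking into four equal slices:
# the top 25% fall into Q1, the next 25% into Q2, and so on.
category = category.sort_values("sjr", ascending=False).reset_index(drop=True)
category["quartile"] = pd.qcut(category.index, 4, labels=["Q1", "Q2", "Q3", "Q4"])

print(category)
```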

SELECTION OF THE SCIENTIFIC JOURNAL SAMPLE

In relation to the composition of the sample of journals, in November 2018 the Scimago Journal & Country Rank database listed a total of 34,169 journals included in Scopus, of which 40.8% were from the US, 17.1% from the United Kingdom, 6.9% from the Netherlands, and 5.6% from Germany; in other words, 70.4% were journals from the countries with the greatest weight in the publishing industry. According to information from Elsevier (2016), more than 10,000 journals included in Scopus (30% of the total) are owned by five publishing companies (10% Elsevier, 8% Springer, 5% Wiley-Blackwell, 5% Taylor & Francis, and 2% Sage).

When the journals belonging to the eight regions present in the database (North America, Western Europe, the Asian Region, Eastern Europe, Latin America, the Middle East, the Pacific Region, and Africa) are counted, the share of Latin America, a region comprising a total of 48 countries, was 2.5% (842 journals) (Figure 1).

Figure 1
Total number of scientific journals included in the Scimago Journal & Country Rank database (N=34,169), according to region. November 2018.

This unequal participation would not be a problem if the objective of the database were to present a convenience sample of the production of certain regions. However, one of the objectives of the database is to generate citation indicators which can symbolically compete with the impact factor of Clarivate Analytics, so the current composition of this database conditions the calculation of indicators such as CiteScore and SJR, since it is a non-probabilistic sample, created by a selection process defined by the database itself.

The potential quantity of “citing” journals available to each journal within a thematic category depends on the composition of this journal sample. In Figure 2, the proportion of citing journals from the United Kingdom and the US is much higher than that of all the other countries – which means that an important number of journals from both countries will always be in quartile 1 (Q1) – followed by the Netherlands and Germany. The proportions in Figure 2 are reproduced with few variations in the distinct categories, since these four countries together account for 70.4% of the journals included in the database.

Figure 2
Scientific journals included in the Cultural Studies thematic category in Scimago Journal & Country Rank (n=876), according to country and citation quartile. 2018.

Note: The ordering criterion was the total number of journals per country.


According to the unisystemic perspective, within a thematic area journals establish a homogeneous dialogue, intersected by the same thematic, conceptual, and methodological interests in order to deal with the same social problems. Hence, the permanence of a large number of US and English journals in the first quartile expresses a recognition by world science of the scientific contribution of both countries, while journals from the rest of the world do not make significant contributions to this unidimensional dialogue.

This is the representation consolidated during the second half of the twentieth century. However, as we will see in the analysis of the shaping of the quartiles, the more journals a country has in the lower quartiles within each thematic category, the more journals it will have in quartile 1 (Q1). As a result, the criteria that form the sample, which is non-probabilistic, are what allow the current proportions to be regulated and sustained.

CONSTRUCTION OF THE NUMERATOR

While the numerator of the Clarivate Analytics impact factor or of the CiteScore from Scopus counts the number of citations received by a journal in a given year, assigning the same degree of relevance to all the citing journals, the SJR numerator, according to its own creators, “is based on the transfer of prestige from one journal to another,” so not all the citations received by a journal have the same weight. To obtain the numerator, first the “prestige” of each journal is calculated and, second, the citations received by a journal for all the articles published in the three previous years are counted and then weighted according to the calculated “prestige” of each citing journal. In this way, citations obtained from a more “prestigious” journal carry more weight than those from a less “prestigious” journal (González-Pereira et al., 2010; Scimago Research Group, 2007; Guerrero-Bote and Moya-Anegón, 2012). This generates an effect by which, according to the creators of the indicator, high values tend to concentrate in fewer journals, so that the distance between the best classified journals and the rest tends to be greater (González-Pereira et al., 2010).
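To make the difference concrete, the toy sketch below contrasts a simple citation count, in which every citation weighs the same, with a prestige-weighted count, in which each citation is multiplied by a prestige score previously assigned to the citing journal. The journal names, prestige values, and citation counts are invented, and the weighting shown is a deliberate simplification of the SJR algorithm, which computes prestige iteratively over the whole citation network.

```python
# Invented example: citations received by journal X, broken down by citing journal.
citations_to_x = {"Journal A": 40, "Journal B": 25, "Journal C": 10}

# Hypothetical prestige scores previously computed for each citing journal.
prestige = {"Journal A": 0.9, "Journal B": 0.3, "Journal C": 0.1}

# Impact-factor / CiteScore style numerator: every citation counts the same.
unweighted = sum(citations_to_x.values())

# SJR-style numerator (simplified): each citation is weighted by the
# prestige of the journal it comes from.
weighted = sum(count * prestige[journal] for journal, count in citations_to_x.items())

print(f"Unweighted citations: {unweighted}")           # 75
print(f"Prestige-weighted citations: {weighted:.1f}")  # 44.5
```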

In contrast with the Mertonian universalism and communism in force since the beginning of the twentieth century, this logic of the transfer of “prestige” stimulates inequalities and stratification, ensuring that the core of industrial science monopolizes the symbolic representation of scientific relevance. This generates a distortion in the system of values which relegates and devalues regional conceptual ecosystems, national problems, and innovative or uncommon themes, and which stimulates support for theories in vogue within areas that are already highly populated, in order to assure high citation rates.

QUARTILES PER THEMATIC CATEGORY

Based on the SJR value obtained, the Scimago Journal & Country Rank database ranks the journals included within a thematic category and calculates the citation quartiles. Of the 313 thematic categories in the Scimago Journal & Country Rank, based on the Scopus classification, only 31 categories (10%) contain a journal from Latin America positioned in Quartile 1 (Q1). When the quantity of journals of each country in each quartile is analyzed for each of these 31 thematic categories (Table 1), the first constant which emerges is that the more journals a country has in the lower quartiles within a thematic category, the more journals it will have in quartile 1 (Q1). We understand that this constant occurs because the journals included within a thematic category respond to distinct conceptual ecosystems and thus do not all dialogue with each other; rather, the dialogue occurs within each ecosystem, in which journals tend to share priority problems to be worked on, conceptual frameworks, researcher networks, and modes of scientific production with greater or lesser participation of industrial logic. In other words, a thematic category is neither isomorphic nor a uniform block; rather, it is polysystemic. Therefore, if the journal sample is not representative of this diversity of ecosystems, the indicators will reproduce this initial bias and show, within a unisystemic logic, certain ecosystems as if they had greater scientific “influence” or “prestige,” when in reality they only have a greater proportion in numerical terms, which is what ensures their position. Within this scheme, Latin American journals have very few real possibilities of reaching the superior quartiles due to the low proportion of journals cited within their conceptual ecosystems.

Table 1
Number of journals from all regions, US, United Kingdom, and the Latin American region that are part of the thematic subcategories in the Scimago Journal & Country Rank in which at least one Latin American journal is in quartile 1. November 2018.

Within the group of categories selected in Table 1, Veterinary (miscellaneous) (14.0%), Horticulture (11.5%), and Animal Science and Zoology (10.0%) had the highest percentage of Latin American journals, which coincides with the lowest percentage of US journals. The same is true in reverse: the areas with the lowest participation of Latin American journals – Surfaces, Coatings and Films (1.8%); Management, Monitoring, Policy and Law (1.6%); Ceramics and Composites (1.5%); Industrial and Manufacturing Engineering (1.0%); and Metals and Alloys (1.0%) – are the areas with the highest percentage of US journals. These data are consistent with the productive matrices of the countries, which have distinct priority problems to be worked on and distinct degrees of participation of industrial logic within the modes of scientific-publishing production (Martinovich, 2019).

Another aspect to highlight is that, when observing the data in Table 1, analyses based on the classic divisions between the exact and natural sciences, the social sciences, and technology lose relevance, since the problems related to the selection of the journal sample, the construction of the numerator, the quartiles per thematic category, and the period analyzed by the indicator are common to all areas of knowledge. The problem does not lie in a low representation of the social sciences within the Scopus selection, since thematic areas such as History (n=1120) or Cultural Studies (n=876) have a higher number of journals than areas such as Physics and Astronomy (n=268), Molecular Biology (n=406), or even Public Health (n=526), which are not included in this selection because they do not have any Latin American journal in quartile 1. Even the percentage share of Latin American journals is greater in areas such as History (3.5%) than in Medicine (2.5%).

Since 57.9% of the total sample is concentrated in journals from the US (40.8%) and the United Kingdom (17.1%), the possibility that these journals will be part of the 25% corresponding to quartile 1 is much greater than that of the 2.5% corresponding to Latin American journals.
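A small simulation can illustrate this point under an explicit assumption: if every journal drew its indicator value from the same distribution (i.e., there were no underlying differences in performance), quartile 1 membership would simply mirror each group's share of the sample. The composition figures below are illustrative, loosely echoing the shares reported above, and the sketch is not part of the analysis carried out for this article.

```python
import random

random.seed(0)

# Illustrative composition of one hypothetical thematic category of 1,000 journals.
composition = {"US/UK": 579, "Latin America": 25, "Other": 396}

# Assumption for the illustration: every journal draws its score from the same
# distribution, i.e. there is no underlying difference in performance.
journals = [(country, random.random())
            for country, n in composition.items() for _ in range(n)]

# Rank by score and take the top 25% as quartile 1.
journals.sort(key=lambda j: j[1], reverse=True)
q1 = journals[: len(journals) // 4]

for country in composition:
    count = sum(1 for c, _ in q1 if c == country)
    print(f"{country}: {count} journals in Q1")
# Q1 membership roughly mirrors each group's share of the sample.
```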

PERIOD ANALYZED PER INDICATOR

The citations which are valid for the calculation of the SJR correspond to the last three years (unlike the impact factor, which covers two years). This means that the 2017 SJR of a given journal includes in the numerator the citations received in 2017 by the articles published in the three previous years (2016, 2015, and 2014), and in the denominator the articles which the same journal published in those three years (2016, 2015, and 2014). However, in Citation Statistics (Adler et al., 2008), the International Mathematical Union mentions that in fields like mathematics the majority of citations occur outside this two-year period: approximately 50% of citations correspond to articles which appeared in earlier decades, 25% cite articles which appeared in the previous decade, and 12.5% cite articles from the current decade. As a result, approximately 90% of journal citations fall outside the two- or three-year window, meaning that citation indexes are based on only 10% of citation activity (Adler et al., 2008). As in many other areas, this loss of citations remains unknown and invisible, and this biased value, resulting from a very reduced proportion of total citations, is associated with low citation and with the entire complex of representations that links low citation to a lack of scientific relevance.
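As a rough illustration of how little of the citation activity such a window can capture, the sketch below takes a hypothetical citation-age distribution of the kind described by Adler et al. and computes the fraction of citations that falls inside a two- or three-year window. The distribution values are invented for illustration and are not taken from the report.

```python
# Hypothetical share of citations by age of the cited article (in years).
# The distribution is invented: most citations point to older work, as
# Adler et al. (2008) report for fields like mathematics.
citation_age_share = {
    1: 0.03, 2: 0.04, 3: 0.05, 4: 0.06, 5: 0.07,
    6: 0.07, 7: 0.07, 8: 0.06, 9: 0.05, 10: 0.05,
}
citation_age_share["older"] = 1.0 - sum(citation_age_share.values())  # > 10 years

def window_coverage(window_years):
    """Fraction of all citations that an indicator window of this length sees."""
    return sum(share for age, share in citation_age_share.items()
               if isinstance(age, int) and age <= window_years)

print(f"2-year window captures {window_coverage(2):.0%} of citations")  # ~7%
print(f"3-year window captures {window_coverage(3):.0%} of citations")  # ~12%
```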

According to Leydesdorff et al. (2016), short-term citations have a higher probability of being transitory and are thus problematic as indicators of quality, since they tend to measure participation in recent discussions and not the epistemic quality of research. As Burton and Kebler mention, each thematic field is composed of two or more different types of article, each with its own half-life. For example, there exists in most fields a body of work regarded as “classics.” These articles tend to have a relatively longer half-life than the so-called “ephemeral” ones (Burton and Kebler, 1960). For instance, the article “Molecular structure of nucleic acids: A structure for Deoxyribose Nucleic Acid” by James Watson and Francis Crick (1953), one of the most important works in biology in the twentieth century, in which they postulate the double-helix structure of DNA, took ten years to reach its maximum citation rate (Olby, 2003; Lawrence, 2007), doing so in 1963, after the authors had received the Nobel prize in 1962 (Gingras, 2010). The same occurs with innovative topics which do not yet have a group of researchers who can potentially cite the production. In this sense, referring to cellular biology, Bruce Alberts, president of the US National Academy of Sciences and editor-in-chief of Science, states:

The misuse of the journal impact factor is highly destructive, inviting a gaming of the metric that can bias journals against publishing important papers in fields [...]. Any evaluation system in which the mere number of a researcher’s publications increases his or her score creates a strong disincentive to pursue risky and potentially groundbreaking work, because it takes years to create a new approach in a new experimental context, during which no publications should be expected. Such metrics further block innovation because they encourage scientists to work in areas of science that are already highly populated, as it is only in these fields that large numbers of scientists can be expected to reference one’s work, no matter how outstanding. (Alberts, 2013)

CONCEPTUAL ECOSYSTEMS VS ISOMORPHIC SCIENCE

The analysis carried out shows that the scientific production evaluation models adopted by certain Latin American countries, based on the impact factor and the citation quartiles, stimulate a science conceived in unisystemic terms, aligned with agendas centered on much-worked and not very innovative topics which assure citation but whose ephemeral validity does not exceed the two or three years analyzed by the citation indicators, leaving out much of the relevant theoretical and empirical production of the region. Given the composition of the journal sample, Latin American journals, with a share of only 2.5% of the total, have very restricted effective possibilities of reaching the superior quartiles due to the low proportion of journals cited within their conceptual ecosystems. However, these aspects are not usually problematized, because the association between citation indicators and scientific relevance, as a representation, is not based on values configured in our own experiences, but on a system of values intermediated by the interests of those who organize this discursive universe and who managed to establish it as if it were a generalized consensus which does not require verification (Moscovici, 1979).

In this sense, claiming that citation indicators are a form of evaluating the relevance of research and its contribution to scientific knowledge, while functional to the interests of Big Science and, above all, of the scientific publishing industry, translates into a “tragedy of the commons” (Hardin, 1968; Casadevall et al., 2016), because it stimulates a science based on competitiveness and personal benefit, regardless of whether it benefits or harms the social group to which it belongs (Casadevall and Fang, 2015). A distortion of the value system is thus generated which relegates and devalues regional conceptual ecosystems, national problems, and innovative or little-discussed topics, and which stimulates enlisting in theories in vogue within areas that are already highly populated in order to ensure citation (Alberts, 2013). In summary, it is biased towards an “isomorphic science” (Shenhav and Kamens, 1991), based on rules which function as “highly rationalized myths” (Meyer and Rowan, 1977) and which reproduce the scientific interests of those at the top of the pyramid.

One example of this “tragedy of the commons” is the impact of the changes adopted in 2016 by Colombia's Publindex system (Colciencias). This involved implementing four categories (A1, A2, B, and C), equivalent to the four citation quartiles (Q1, Q2, Q3, and Q4), in order not only to categorize Colombian journals but also to standardize the foreign journals in which Colombian researchers published. This modification resulted in such a downgrade of Colombian journals “that only one journal remained in the A1 classification” (Castaño Castrillón, 2018), and a large number of journals from Brazil, Argentina, Mexico, and Chile fell from A1 to C. What were the negative effects of this? On the one hand, since Colombian researchers received a higher score for publishing in A1 journals, a high percentage of journals from Colombia and other Latin American countries, now categorized as C, were relegated by their own system. On the other hand, a transfer of economic resources from the Colombian state to the scientific publishing industry was stimulated through the payment of article processing charges (APCs), losing the possibility of investing these resources in the development of a national scientific publishing sector. Perhaps the most noxious effect, however, is that since the issues that may be relevant in Colombia are not necessarily those of interest to A1 journals, published above all in the US and the United Kingdom, the research agenda is beginning to shift to align itself with the themes that have greater acceptance in this type of journal, and to segment research into multiple fragments in order to reach the high productivity quotas required. These aspects are not exclusive to Colombia: very similar parameters were implemented in 2016 in Mexico (Vasen and Lujano Vilchis, 2017) and in Brazil (Farias et al., 2017; Martinovich, 2016).
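As a schematic illustration of the equivalence described above, and not a reproduction of the official Colciencias implementation, the 2016 Publindex rule can be read as a simple quartile-to-category lookup:

```python
# Schematic reading of the 2016 Publindex rule described in the text:
# journal categories mapped one-to-one onto citation quartiles.
QUARTILE_TO_PUBLINDEX = {"Q1": "A1", "Q2": "A2", "Q3": "B", "Q4": "C"}

def publindex_category(quartile: str) -> str:
    """Return the Publindex category corresponding to a journal's citation quartile."""
    return QUARTILE_TO_PUBLINDEX[quartile]

# A journal reclassified from Q1 to Q4 in the source index drops from A1 to C.
print(publindex_category("Q1"), "->", publindex_category("Q4"))
```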

The analysis carried out in this article has shown that the parameters used to evaluate scientific production mold the type of science produced, which merits questioning and rethinking what Amilcar Herrera (1973) calls “explicit scientific policy,” that is, the formal and declarative façade of official policy, which is very distinct from the scientific policy that actually exists, the “implicit scientific policy,” which truly expresses the role of science in society. Both spheres are traversed by representations and practices that feed back into the discursive sphere, which is usually dissociated from the valorization of regional or national conceptual ecosystems. Questions should therefore focus on science and technology for what and for whom (Kreimer and Zabala, 2006; Varsavsky, 1971), and only then should an evaluation system for scientific production be formulated that respects and values the policy decisions adopted. In summary, commercial databases should return to the original purpose of distributing scientific knowledge, while evaluation parameters should be generated on the basis of the science policy priorities that each country establishes.

Within this framework, the precepts of open science, to which both the Scientific Electronic Library Online (SciELO) and the Red de Revistas Científicas de América Latina y el Caribe, España y Portugal (Redalyc) subscribe, are central to increasing the visibility of Latin American science. Beyond their undeniable relevance, however, the challenge remains of bringing into discussion the network of meanings that rests on exclusion as an essential condition for holding the symbolic power of scientific relevance. The precepts of open science must therefore conceive of the knowledge generated by the scientific community as a public good, a task that is still pending, in order to reinstate an inclusive scientific culture, thought of in terms of polysystems, which would allow the recovery of and respect for the diversity of conceptual ecosystems present in research from different regions of the world.

REFERENCES

  • ADLER, Robert; EWING, John; TAYLOR, Peter. (2008), “Citation statistics”. Available at: https://www.mathunion.org/fileadmin/IMU/Report/CitationStatistics.pdf Accessed 3 Dec. 2018.
  • ALBERTS, Bruce. (2013), “Impact Factor Distortions”. Science, v. 340, n. 6134, p. 787. doi: 10.1126/science.1240319.
  • ALPERIN, Juan Pablo; ROZEMBLUM, Cecilia. (2017). “La reinterpretación de visibilidad y calidad en las nuevas políticas de evaluación de revistas científicas”. Revista Interamericana de Bibliotecología, v. 40, n. 3, pp. 231-241. doi: 10.17533/udea.rib.v40n3a04.
  • ANDRIESSE, Cornelis D. (2008), “Saturation”. In: Dutch messengers: A history of science publishing, 1930-1980. Leiden: Brill. pp. 225-241.
  • BEIGEL, Fernanda. (2014), “Publishing from the periphery: structural heterogeneity and segmented circuits: The evaluation of scientific publications for tenure in Argentina’s CONICET”. Current Sociology, v. 62, n. 5, pp. 743-765. doi: 10.1177/0011392114533977.
  • BEIGEL, Fernanda; GALLARDO, Osvaldo; BEKERMAN, Fabiana. (2018). “Institutional expansion and scientific development in the periphery: the structural heterogeneity of Argentina’s academic field”. Minerva, v. 56, n. 3, pp. 305-331. doi: 10.1007/s11024-017-9340-2.
  • BERTUZZI, Stefano et al. (2016), “Journal impact factors: changing the weather”. Microbe Magazine, v. 11, n. 7, p. 289. doi: 10.1128/microbe.11.289.1.
  • BOURDIEU, Pierre. (1994), “El campo científico”. Redes, n. 2, pp. 130-160.
  • BOURDIEU, Pierre. (1995), “El mercado de los bienes simbólicos”. In: Las reglas del arte: génesis y estructura del campo literario. Barcelona: Anagrama, pp. 213-261.
  • BOURDIEU, Pierre. (1997), “La doble ruptura”. In: Razones prácticas: sobre la teoría de la acción. Barcelona: Editorial Anagrama, pp. 84-90.
  • BOURDIEU, Pierre. (1999), “Una revolución conservadora en la edición”. In: Intelectuales, política y poder. Buenos Aires: Eudeba, pp. 223-264.
  • BRADFORD, Samuel C. (1934), “Sources of information on specific subjects”. Engineering: An Illustrated Weekly Journal, v. 137, n. 3550, pp. 85-86.
  • BURANYI, Stephen. (2017), “Is the staggeringly profitable business of scientific publishing bad for science?” The Guardian. Available at: https://www.theguardian.com/science/2017/jun/27/profitable-business-scientific-publishing-bad-for-science? Accessed 5 Dec. 2018.
  • BURTON, Robert E.; KEBLER, R. W. (1960), “The ‘half-life’ of some scientific and technical literatures”. American Documentation, v. 11, n. 1, pp. 18-22. doi: 10.1002/asi.5090110105.
  • BUTLER, Declan. (2008), “Free journal-ranking tool enters citation market”. Nature, v. 451, n. 7174, p. 6. doi: 10.1038/451006a.
  • CAMARGO Jr., Kenneth R. (2009), “Public health and the knowledge industry”. Revista de Saúde Pública, v. 43, n. 6, pp. 1078-1283. doi: 10.1590/S0034-89102009005000076.
  • CAPSHEW, James H.; RADER, Karen A. (1992), “Big Science: price to the present”. Osiris, v. 7, n. 1, pp. 2-25.
  • CASADEVALL, Arturo et al. (2016), “ASM Journals Eliminate Impact Factor Information from Journal Websites”. MSystems, v. 1, n. 4. doi: 10.1128/mSystems.00088-16.
  • CASADEVALL, Arturo; FANG, Ferric C. (2015), “Impacted Science: impact is not importance”. MBio, v. 6, n. 5, p. e01593-15. doi: 10.1128/mBio.01593-15.
  • CASTAÑO CASTRILLÓN, José Jaime. (2018), “Penurias de un editor”. Archivos de Medicina (Col), v. 18, n. 2. doi 10.30554/archmed.18.2.2876.2018.
  • COOK, Alan. (2001), “Academic Publications before 1940”. In: E. H. Fredriksson (ed.) A century of science publishing: a collection of essays. Amsterdam: IOS Press. pp. 14-24.
  • DORA. (2012). “San Francisco Declaration on Research Assessment”. Available at: https://sfdora.org/read Accessed 5 Dec. 2018.
  • EISENHOWER, Dwight D. (1961), “Text of the address by President Eisenhower, broadcast and televised from his office in the White House, Tuesday evening, January 17, 1961”. Available at: https://tinyurl.com/y8vfrtdw Accessed 5 Dec. 2018.
  • ELSEVIER. (2016). “Scopus: Content Coverage Guide”. Available at: https://www.elsevier.com/__data/assets/pdf_file/0007/69451/scopus_content_coverage_guide.pdf Accessed 3 Dec. 2018.
  • EUROPEAN MOLECULAR BIOLOGY ORGANIZATION. (2018). “EMBO Long-Term Fellowships: Application guidelines”. Available at: http://www.embo.org/documents/LTF/LTF_Guidelines_for_Applicants.pdf Accessed 4 Dec. 2018.
  • EVEN-ZOHAR, Itamar. (1990). “Polysystem studies”. Poetics Today, v. 11, n. 1, p. 9–26.
  • FARIAS, Mareni Rocha; STORB, Bernd Heinrich; STORPIRTIS, Silvia; et al. (2017). “Impact Factor: an appropriate criterion for the Qualis journals classification in the Pharmacy area?” Brazilian Journal of Pharmaceutical Sciences, v. 53, n. 3, p. e01001. doi: 10.1590/s2175-97902017000301001.
  • FREDRIKSSON, Einar H. (2001), “The Dutch publishing scene: Elsevier and North-Holland”. In: A century of science publishing. Amsterdam: IOS Press. pp. 61-76.
  • GARFIELD, Eugene. (1963), “Science Citation Index”. In: Science Citation Index 1961. Philadelphia: Institute for Scientific Information. pp. V-XVI.
  • GARFIELD, Eugene. (1972a), “The new Social Sciences Citation Index (SSCI): will add a new dimension to research on man and society”. Current Contents, n. 21, pp. 317–319.
  • GARFIELD, Eugene. (1972b), “Citation analysis as a tool in journal evaluation”. Science, n. 178, p. 471-479.
  • GARFIELD, Eugene. (1975a), Journal Citation Reports: A bibliometric analysis of references processed for the 1974 Science Citation Index. Philadelphia: Institute for Scientific Information.
  • GARFIELD, Eugene. (1975b), “The Social Science Citation Index, more than tool”. Current Contents, n. 12, pp. 6-9.
  • GARFIELD, Eugene. (1977), “Will ISI’S Arts & Humanities Citation Index revolutionize scholarship?” Current Contents, n. 32, pp. 5-9.
  • GARFIELD, Eugene. (2006), “The History and Meaning of the Journal Impact Factor”. JAMA, v. 295, n. 1, pp. 90-93. doi: 10.1001/jama.295.1.90.
  • GINGRAS, Yves. (2010), “Revisiting the ‘Quiet Debut’ of the Double Helix: a bibliometric and methodological note on the ‘impact’ of scientific publications”. Journal of the History of Biology, v. 43, n. 1, pp. 159-181. doi: 10.1007/s10739-009-9183-2.
  • GÓMEZ-MORALES, Yuri Jack. (2018), “Abuso de las medidas y medidas abusivas. Crítica al pensamiento bibliométrico hegemónico”. Anuario Colombiano de Historia Social y de la Cultura, v. 45, n. 1, pp. 269-290. doi: 10.15446/achsc.v45n1.67559.
  • GONZÁLEZ-PEREIRA, Borja; GUERRERO-BOTE, Vicente P.; MOYA-ANEGÓN, Félix. (2010). “A new approach to the metric of journals’ scientific prestige: the SJR Indicator”. Journal of Informetrics, v. 4, n. 3, pp. 379-391. doi: 10.1016/j.joi.2010.03.002.
  • GUÉDON, Jean Claude. (2016), “Los retos de la comunicación científica para el sur global”. Available at: https://www.youtube.com/watch?v=L9goyc2A8gk&index=5&list=FLYe4miAPo-9fD1ob-rr0W3A Accessed 3 Dec. 2018.
  • GUERRERO-BOTE, Vicente P.; MOYA-ANEGÓN, Félix. (2012), “A further step forward in measuring journals’ scientific prestige: the SJR2 Indicator”. Journal of Informetrics, v. 6, n. 4, pp. 674-688. doi: 10.1016/j.joi.2012.07.001.
  • HARDIN, Garrett. (1968), “The Tragedy of the Commons”. Science, v. 162, n. 3859, pp. 1243-1248. doi: 10.1126/science.162.3859.1243.
  • HERRERA, Amilcar O. (1973), “Los determinantes sociales de la política científica en América Latina: Política científica explícita y política científica implícita”. Desarrollo Económico: Revista de Ciencias Sociales, v. 13, n. 49, pp. 113-134.
  • HERRMANN-LINGEN, Christoph et al. (2012), “Evaluation of Medical Research Performance: Position Paper of the Association of the Scientific Medical Societies in Germany (AWMF)”. GMS, v. 12, Doc11. doi: 10.3205/000196.
  • HICKS, Diana et al. (2015), “Bibliometrics: The Leiden Manifesto for research metrics”. Nature, v. 520, n. 7548, pp. 429-431. doi: 10.1038/520429a.
  • HOBSBAWM, Eric. (2010), “Los años dorados”. In: Historia del siglo XX. Buenos Aires: Crítica, pp. 260-289.
  • KREIMER, Pablo; ZABALA, Juan Pablo. (2006), “¿Qué conocimiento y para quién? Problemas sociales, producción y uso social de conocimientos científicos sobre la enfermedad de Chagas en Argentina”. Redes, v. 12, n. 23, pp. 55-64.
  • LAWRENCE, Peter A. (2007), “The mismeasurement of science”. Current Biology, v. 17, pp. 583-585. doi: 10.1016/j.cub.2007.06.014.
  • LEYDESDORFF, Loet et al. (2016), “Citations: indicators of quality? The impact fallacy”. Frontiers in Research Metrics and Analytics, v. 1, n.1. doi: 10.3389/frma.2016.00001.
  • LOTKA, Alfred A. (1926), “The frequency distribution of scientific productivity”. Journal of the Washington Academy of Sciences, v. 16, n. 12, pp. 317-323.
  • MARTINOVICH, Viviana. (2016), “Práctica editorial contextualizada: Carlos Augusto Monteiro y la Revista de Saúde Pública”. Salud Colectiva, v. 12, n. 2, pp. 295-304. doi: 10.18294/sc.2016.980.
  • MARTINOVICH, Viviana. (2019), “Revistas científicas argentinas de acceso abierto y circulación internacional: un análisis desde la teoría de los campos de Pierre Bourdieu”. Información, Cultura y Sociedad, n. 40, pp. 93-115. doi: 10.34096/ics.i40.5540.
  • MERTON, Robert K. (1973), “The normative structure of science”. In: The sociology of science: Theoretical and empirical investigations. Chicago: University of Chicago Press.
  • MEYER, John W.; ROWAN, Brian. (1977), “Institutionalized organizations: Formal structure as myth and ceremony”. American Journal of Sociology, v. 83, n. 2, pp. 340-363. doi: 10.1086/226550.
  • MOSCOVICI, Serge. (1979), El psicoanálisis, su imagen y su público. Buenos Aires: Editorial Huemul.
  • MÜNCH, Richard. (2015), “El mecanismo de monopolio en la ciencia”. Literatura: Teoría, Historia, Crítica, v. 17, n. 2, pp. 251-286. doi. 10.15446/lthc.v17n2.51293.
  • OLBY, Robert. (2003), “Quiet Debut for the Double Helix”. Nature, v. 421, n. 6921, pp. 402-405. doi: 10.1038/nature01397.
  • PRICE, Derek J. de Solla. (1965), “Networks of scientific papers”. Science, v. 149, n. 3683, pp. 510-515.
  • RESEARCH COUNCILS UK. (2013), “RCUK policy on open access and supporting guidance”. Available at: https://www.ukri.org/files/legacy/documents/rcukopenaccesspolicy-pdf Accessed 5 Dec. 2018.
  • ROYAL SOCIETY OF LONDON. (1867), Catalogue of scientific papers, 1800-1863. Vol. 1. London: Royal Society of London. Available at: http://biodiversitylibrary.org/page/18305124 Accessed 3 Dec. 2018.
  • ROYAL SOCIETY OF LONDON. (1877), Catalogue of scientific papers, 1864-1873. Vol. 7. London: Royal Society of London. Available at: https://www.biodiversitylibrary.org/item/60932 Accessed 3 Dec. 2018.
  • ROYAL SOCIETY OF LONDON. (1891), Catalogue of scientific papers, 1874-1883. Vol. 9. London: Royal Society of London. Available at: https://www.biodiversitylibrary.org/item/60933 Accessed 3 Dec. 2018.
  • ROYAL SOCIETY OF LONDON. (1914), Catalogue of scientific papers, 1884-1900. Vol. 13. London: Royal Society of London. Available at: https://www.biodiversitylibrary.org/item/17787 Accessed 3 Dec. 2018.
  • SCHEKMAN, Randy. (2013), “How journals like Nature, Cell and Science are damaging science”. The Guardian. Available at: https://www.theguardian.com/commentisfree/2013/dec/09/how-journals-nature-science-cell-damage-science Accessed 3 Dec. 2018.
  • SCIMAGO RESEARCH GROUP. (2007), “Description of Scimago Journal Rank indicator”. Available at: https://www.scimagojr.com/SCImagoJournalRank.pdf Accessed 3 Dec. 2018.
  • SHENHAV, Yehouda A.; KAMENS, David H. (1991), “The ‘costs’ of Institutional Isomorphism: Science in Non-Western Countries”. Social Studies of Science, v. 21, n. 3, pp. 527-545. doi: 10.1177/030631291021003005.
  • THE NOBEL PRIZE. (2017), “The research counts, not the journal!” Available at: https://www.youtube.com/watch?v=6MQ8R0OyvyQ Accessed 3 Dec. 2018.
  • UK FORUM FOR RESPONSIBLE RESEARCH METRICS. (2018), “UK Progress towards the use of metrics responsibly: Three years on from The Metric Tide report”. Available at: https://tinyurl.com/yavmu3g3 Accessed 3 Dec. 2018.
  • VARSAVSKY, Oscar. (1971), Ciencia política y cientificismo. 2a ed. Buenos Aires: Centro Editor de América Latina.
  • VASEN, Federico; LUJANO VILCHIS, Ivonne. (2017), “Sistemas nacionales de clasificación de revistas científicas en América Latina: tendencias recientes e implicaciones para la evaluación académica en ciencias sociales”. Revista Mexicana de Ciencias Políticas y Sociales, v. 62, n. 231, pp. 199-228. doi: 10.1016/S0185-1918(17)30043-0
  • VESSURI, Hebe; GUÉDON, Jean-Claude; CETTO, Ana María. (2014), “Excellence or quality? Impact of the current competition regime on science and scientific publishing in Latin America and its implications for development”. Current Sociology, v. 62, n. 5, pp. 647-665. doi: 10.1177/0011392113512839.
  • WATSON, James D.; CRICK, Francis H. C. (1953), “Molecular structure of nucleic acids: a structure for deoxyribose nucleic acid”. Nature, v. 171, n. 4356, pp. 737-738. doi: 10.1038/171737a0.
  • WEINBERG, Alvin M. (1961), “Impact of Large-Scale Science on the United States: Big Science is here to stay, but we have yet to make the hard financial and educational choices it imposes”. Science, v. 134, n. 3473, pp. 161-164. doi: 10.1126/science.134.3473.161.
  • WELLCOME TRUST. (2019), “Open access policy 2020 – Frequently asked questions”. Available at: https://wellcome.ac.uk/sites/default/files/wellcome-open-access-policy-2020-faq.pdf Accessed 5 Dec. 2018.

Publication Dates

  • Publication in this collection
    24 July 2020
  • Date of issue
    2020

History

  • Received
    21 Apr 2019
  • Reviewed
    16 July 2019
  • Accepted
    2 Aug 2019