EDITORIAL

Peer Review

The advances in Brazilian science, beginning in the early 1950s, when the Conselho Nacional de Pesquisas (CNPq) and the Campanha de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) were created, are unquestionable and clearly visible without the aid of a magnifying lens.

The establishment of the Advisory Committees (Comitês Assessores - CAs) at CNPq and of the Area Coordinations (Coordenações de Áreas) at CAPES, to evaluate researchers and post-graduate courses, respectively, was decisive in elevating Brazil to the select group of the 20 nations with the greatest scientific production in the world. Thanks to the CAPES system of evaluation, the best post-graduate programs in Chemistry are of a quality comparable to that of many industrialized nations. It is noteworthy that the academic sector produces 300 new PhDs yearly and is responsible for over 90% of Brazil's publications abroad, notwithstanding the country's two periodicals in the field, Química Nova (http://quimicanova.sbq.org.br/quimicanova.htm), IF = 0.627, and the Journal of the Brazilian Chemical Society (http://jbcs.sbq.org.br), IF = 1.16, the latter having the greatest impact among all Latin American scientific periodicals in this area.1-3 If Brazilian academic Chemistry is, at this moment, at the frontier in publishing scientific articles and producing PhDs, it lags behind in transforming the Brazilian chemical industry into an industry of specialties and in converting knowledge into wealth for the nation.

The system of peer review should be governed by the academic-scientific-technological merit of the researcher, or of the course itself, and should rely on qualitative criteria, while allowing quantitative criteria to be used in a subsidiary role. The criteria should be widely known and legitimate, and any change to them should require broad debate and the building of consensus. In peer review, the criteria for evaluation necessarily belong to the community being evaluated and not to the evaluators; otherwise, the evaluation would not be by peers!

What is currently observed is a growing wave of dissatisfaction among those evaluated (researchers and courses) with the criteria and parameters for evaluation, which very often are either poorly qualified or purely quantitative, thereby negating peer review itself. This would seem to point to an uncertain future without Evaluating Committees, in which a checklist and a computer program could produce the ranking, as is done, for example, for tennis players!

In the early 1990s, some Brazilian biochemists began to propose the introduction of the Impact Factor (IF) of scientific periodicals1 and the number of citations as parameters to evaluate researchers. What began as a simple proposal gained momentum and became a veritable fever, even though the impact factor parameter has been abandoned by agencies and universities in developed countries.

The use of the IF in the evaluation systems of departments, universities and funding agencies has gained more importance than it deserves. This does not mean that it should not be used in evaluations. The Brazilian Chemical community is proud of the impact factors of the Brazilian Chemical Society's periodicals.2,3 These numbers were tenaciously pursued for five years. Today, however, one can observe a tendency in evaluation processes to count papers and to credit each one with the impact factor of the periodical in which it is published. The excess of quantification in evaluations protects the evaluator (the reviewer) more than the evaluated (the reviewed), especially when the result of the evaluation process is funding, the granting of research scholarships, or the rating of a post-graduate program. The recognition of merit is more complex and demands excellence from both the evaluated and the evaluators themselves; it cannot be limited to just one or two indicators.

One of the main questions in evaluation is the characterization of merit and of impact. The evaluation of merit is necessarily qualitative, while the evaluation of impact involves quantitative criteria. How should merit and impact be considered? This, certainly, can only be done by peers, who can perceive and recognize academic-scientific-technological merit as well as identify parameters for measuring impact. For example, a researcher whose work has merit and relevant impact does not automatically transfer these to the course on which he serves as a faculty member! The reverse is also true. A researcher's evaluation involves his academic-scientific-technological performance, while the evaluation of a post-graduate course takes into account the performance of its professors and students; its regional, national, international and collective impact; and the fate of its students.

It is not for us to define here which criteria for evaluation are better or worse, but rather to invite the Chemical community to reflect on and debate the theme of peer review. With this objective, readers are invited to write to editor@jbcs.sbq.org.br, in order to contribute to perfecting the system of peer review, in which merit, publication impact, academic degrees and the transformation of knowledge into wealth coexist harmoniously.

To better mark the main points of this discussion, the following questions are asked:

i) is it possible to differentiate the criteria for evaluating researchers from those for evaluating post-graduate courses without being detrimental to either?

ii) is it possible to qualify (and measure) the intellectual contributions of the different authors of a scientific study and to distinguish them from technical and/or instrumental contributions?

iii) how can seldom-cited articles published in high-impact periodicals be compared with often-cited articles published in periodicals of low or medium impact factor?

iv) how should the scientific and technological production of faculty members be appropriately weighted in the evaluation of post-graduate courses?

v) how should the impact of the knowledge generated in post-graduate programs be weighed, and what importance should be given to the number of entrants to these programs for the development of scientific and technological activities in Brazil?

Jailson B. de Andrade

(Editor - J. Braz. Chem. Soc.)

References

1. Pinto, A. C.; de Andrade, J. B.; Quím. Nova 1999, 22, 448.

2. Pinto, A. C.; de Andrade, J. B.; J. Braz. Chem. Soc. 2005, 16, 4 (Editorial).

3. de Torresi, S. I. C.; Pardini, V. L.; Ferreira, V. F.; Pinto, A. C.; de Andrade, J. B.; Quím. Nova 2005, 28, 745 (Editorial).
