SciELO - Scientific Electronic Library Online

 



Revista Brasileira de Epidemiologia

Print version ISSN 1415-790X · Online version ISSN 1980-5497

Abstract

OLIVEIRA, Natália Sanchez; OLIVEIRA, Julicristie Machado de and BERGAMASCHI, Denise Pimentel. Agreement among raters in the selection of articles in a systematic review. Rev. bras. epidemiol. [online]. 2006, vol.9, n.3, pp.309-315. ISSN 1415-790X. https://doi.org/10.1590/S1415-790X2006000300005.

The objective of this study is to present the methodological aspects of inter-rater agreement in the initial selection of studies for a systematic review, with or without meta-analysis. As an example, we used data from the initial phase of the study "Vitamin A supplementation for breastfeeding mothers: systematic review". The data come from two raters' independent readings of article abstracts retrieved through searches of electronic bibliographic databases. For each study, the raters answered the questions "Does the study involve post-partum females?", "Is it a vitamin A supplementation study?", and "Is it a clinical trial?", followed by a decision (inclusion/exclusion) concerning the study. The data were keyed twice into an Excel spreadsheet and validated. The kappa coefficient of agreement was computed for four aspects: population, intervention, study type, and the inclusion/exclusion decision. We identified 2,553 studies. The kappa values were k = 0.46 for suitability of the population studied, k = 0.59 for intervention type, k = 0.59 for study type, and k = 0.44 for the inclusion/exclusion decision. Given the fair (intervention and study type) and slight (population studied) agreement between raters, we emphasize the need for studies to be read initially by at least two raters. The consensus meetings held when raters disagreed were useful for resolving differences of interpretation, prompted new understanding and deeper reflection, reduced the chance of excluding necessary studies, and thus improved control over a possible selection bias.
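The kappa coefficient described above can be computed directly from the two raters' paired decisions: it compares the observed proportion of agreement with the agreement expected by chance from each rater's marginal proportions. A minimal sketch follows; the ten screening decisions are hypothetical illustrations, not the paper's data.

```python
# Cohen's kappa for two raters' paired categorical ratings.
# Sketch only: the decision lists below are invented for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Return Cohen's kappa for two equal-length lists of ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: proportion of items rated identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of marginal proportions, summed over categories.
    count_a = Counter(rater_a)
    count_b = Counter(rater_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical inclusion/exclusion decisions for ten abstracts.
a = ["include", "include", "exclude", "exclude", "include",
     "exclude", "exclude", "include", "exclude", "exclude"]
b = ["include", "exclude", "exclude", "exclude", "include",
     "exclude", "include", "include", "exclude", "exclude"]
print(round(cohens_kappa(a, b), 2))  # → 0.58
```

Here the raters agree on 8 of 10 abstracts (p_o = 0.80), but chance alone would produce p_e = 0.52 agreement given the marginals, so kappa = (0.80 − 0.52)/(1 − 0.52) ≈ 0.58, in the same range as the values reported above.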

Keywords: Meta-analysis; Inter-rater agreement; Kappa statistics; Selection bias.

        · abstract in Portuguese     · full text in Portuguese     · Portuguese (pdf)

 

Creative Commons License: all the content of this journal, except where otherwise noted, is licensed under a Creative Commons License.