
Brazilian Version of the ACE (Assessing Competencies in Evidence-Based Medicine) Tool: a Validation Study

Abstract:

Introduction:

The ACE (Assessing Competencies in Evidence-Based Medicine) Tool is a recently developed questionnaire to assess competencies in Evidence-Based Medicine. The aim of this study is to validate the Brazilian version of the ACE Tool.

Methods:

This is a cross-sectional validation study carried out in two phases. In the first phase, the questionnaire was translated. In the second phase, the questionnaire was applied to undergraduate students and teachers/preceptors of the medical course. The properties evaluated were construct validity, internal consistency and reliability.

Results:

Seventy-six medical undergraduate students and 12 teachers/preceptors were included. The mean score of the teachers/preceptors was significantly higher than that of the students (10.25±1.71 vs 8.73±1.80, mean difference of 1.52, 95%CI 0.47-2.57, p=0.005), demonstrating construct validity. The Brazilian version of the ACE Tool showed adequate internal consistency (Cronbach’s alpha = 0.61) and reliability (item-total correlation ≥ 0.15 in 14 of the 15 items).

Conclusion:

The Brazilian version of the ACE Tool shows acceptable psychometric properties and can be used as an instrument to assess competencies for Evidence-Based Medicine in Brazilian medical students.

Key words:
Medical Education; Validation Study; Evidence-Based Medicine


INTRODUCTION

In 1991, in an editorial in the ACP Journal Club, Gordon Guyatt used the term “Evidence-Based Medicine” (EBM) for the first time in the medical literature to describe a new way of thinking about and practicing medicine, favoring skills in literature searching, critical appraisal of scientific articles and synthesis of information for individualized clinical decision-making, rather than appeals to the authority of more experienced professionals and textbooks¹.

David Sackett, one of the pioneers of clinical epidemiology, defined EBM as “the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients”. The practice of EBM therefore integrates the best scientific evidence with the professional’s experience and expertise and with the patient’s particularities, including values and preferences, to make a better choice².

The practice of EBM, and therefore its teaching and assessment, must comprise five steps (or domains), as summarized by the Sicily Statement: ask, search, appraise, integrate and evaluate, as shown in Chart 1³.

Chart 1
Steps for evidence-based practice

The ACE (Assessing Competencies in Evidence-Based Medicine) Tool is a questionnaire to assess competencies in EBM, proposed and validated by Ilic et al., in which respondents are presented with a clinical scenario, a clinical question, a search strategy and a hypothetical article summary. They then answer 15 closed questions, each with a “yes” or “no”, covering four of the five steps of evidence-based practice: constructing the clinical question (questions 1 and 2); searching the scientific literature in databases (questions 3 and 4); critically appraising the evidence found (questions 5 to 11); and applying the evidence to the specific clinical setting (questions 12 to 15)⁴.
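As an illustration of how such an instrument can be scored, the sketch below follows the item-to-domain mapping described above; the answer key and respondent data are invented for the example, not the real ACE key.

```python
# Hypothetical scoring sketch for a 15-item yes/no questionnaire:
# one point per correct answer, items grouped into four EBM steps.
# The answer key below is illustrative, NOT the actual ACE Tool key.

DOMAINS = {
    "ask": range(1, 3),         # questions 1-2: clinical question
    "search": range(3, 5),      # questions 3-4: literature search
    "appraise": range(5, 12),   # questions 5-11: critical appraisal
    "integrate": range(12, 16), # questions 12-15: application
}

def score(responses, answer_key):
    """Return total and per-domain scores for one respondent.

    responses, answer_key: dicts mapping item number (1-15) to "yes"/"no".
    """
    per_domain = {
        domain: sum(responses[i] == answer_key[i] for i in items)
        for domain, items in DOMAINS.items()
    }
    return sum(per_domain.values()), per_domain

# Example with a made-up key and a respondent who misses only item 5
key = {i: "yes" if i % 2 else "no" for i in range(1, 16)}
answers = dict(key)
answers[5] = "no" if key[5] == "yes" else "yes"
total, by_domain = score(answers, key)  # total = 14, "appraise" = 6
```

Reporting the per-domain breakdown alongside the total is what lets the instrument discriminate specific competencies rather than only overall knowledge.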

The aim of this study is to validate the Brazilian version of the ACE Tool.

METHODS

Design, participants and ethics

This is a cross-sectional validation study. Medical students from Universidade Federal do Rio Grande do Norte (UFRN) enrolled in a complementary or extension course on Evidence-Based Medicine were invited to answer the questionnaire after the first class. Teachers and preceptors of the medical course, recognized by the researchers as familiar with the topic, were also invited, aiming to assess the discriminatory capacity of the questionnaire. The research protocol was reviewed and approved by the Research Ethics Committee of Hospital Universitário Onofre Lopes (Huol - UFRN) with CAAE n. 30445120.0.0000.5292 and Opinion n. 4,074,739.

Translation and adaptation of the assessment questionnaire

The initial translation of the questionnaire was carried out independently by two researchers with experience in the subject and fluency in English, after which a single version was established. This consensus version in Portuguese was back-translated into English by a professional translator, who did not participate in the previous phases. The back-translated version was then compared to the original version of the questionnaire in English and new adjustments were made, until a final version was attained by consensus between the researchers and the translator. The final questionnaire items are shown in Chart 2, while the full translated and adapted version of the ACE Tool in Portuguese is shown in the supplementary material.

Chart 2
Questionnaire Items - translated version of the ACE Tool

Application of the Questionnaire

The questionnaire was applied through an online platform, to be answered in a single attempt with no time limit. All participants provided free and informed consent to participate.

Statistical analysis

The following variables were collected: group (students and teachers/preceptors); semester attended by the student; responses to each item of the ACE questionnaire; and total number of correct answers. A sample size of 75 students was estimated (5 participants per item of the questionnaire). The difficulty of the questionnaire items, internal consistency and reliability were evaluated. The difficulty of each item was evaluated as the percentage of candidates who answered the question correctly. The internal consistency of the questionnaire was assessed using Cronbach’s alpha: a Cronbach’s alpha between 0.6 and 0.7 was considered acceptable internal consistency; between 0.7 and 0.9, good; and above 0.9, excellent. Reliability was assessed by the item-total correlation; an item-total correlation (ITC) ≥ 0.15 was considered acceptable⁵. The students’ results were compared with those of the teachers/preceptors using Student’s t test for independent samples. P values < 0.05 were considered statistically significant.
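The three item-level statistics described above can be sketched in a few lines; the snippet below is an illustration (not the authors' code) computing the difficulty index, Cronbach's alpha and the corrected item-total correlation on a small made-up matrix of dichotomous (0/1) item scores.

```python
# Illustrative psychometric computations on made-up 0/1 item data.
from statistics import mean, pvariance

def difficulty(item):
    """Difficulty index: proportion of respondents who got the item right."""
    return sum(item) / len(item)

def cronbach_alpha(items):
    """items: list of k lists, each holding one 0/1 score per respondent."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]  # total score per respondent
    item_var = sum(pvariance(it) for it in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def item_total_correlations(items):
    """Corrected ITC: each item correlated with the total of the OTHER items."""
    out = []
    for i, it in enumerate(items):
        rest = [sum(col) - col[i] for col in zip(*items)]
        out.append(pearson(it, rest))
    return out

# Toy matrix: 3 items x 4 respondents
items = [[1, 1, 0, 0], [1, 1, 0, 0], [1, 0, 1, 0]]
alpha = cronbach_alpha(items)  # 0.6 for this toy matrix
```

On real data one would typically use a statistics package, but the toy version makes clear why heterogeneous domains depress alpha: items that do not co-vary shrink the variance of the totals relative to the summed item variances.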

RESULTS

Eighty-eight responses were obtained: 76 from undergraduate medical students (from the first to the tenth semester of the course) and 12 from teachers/preceptors of the medical course.

Figure 1 below shows the distribution of the number of correct answers in the students’ assessment according to the course semester.

Figure 1
Box and whisker plot (median and interquartile ranges) of the number of correct answers in the students’ assessment according to the course semester

Difficulty, reliability and internal consistency of the translated version of the ACE tool

Table 1 shows the analysis of individual items.

The Cronbach’s alpha value was 0.61.

Table 1
Analysis of Individual Items: distribution of items according to step, difficulty index and item-total correlation

Construct validity

The mean scores obtained by the students were compared with those obtained by the teachers/preceptors to assess the questionnaire’s ability to discriminate between different degrees of expertise. The mean number of correct answers among the teachers/preceptors was significantly higher than among the 76 students (10.25±1.71 vs. 8.73±1.80, mean difference of 1.52, 95%CI 0.47-2.57, p=0.005).
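This comparison can be reconstructed from the summary statistics alone. The sketch below (an illustration, not the authors' analysis) runs a pooled-variance Student's t test from the reported means, standard deviations and group sizes; it uses a normal approximation for the p-value and critical value, which is close for df = 86, so the interval differs slightly from the paper's, which was presumably computed from the raw data.

```python
# Student's t test for independent samples, from summary statistics only.
from math import sqrt
from statistics import NormalDist

def pooled_t_from_stats(m1, s1, n1, m2, s2, n2):
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df  # pooled variance
    se = sqrt(sp2 * (1 / n1 + 1 / n2))                # SE of the difference
    t = (m1 - m2) / se
    z = NormalDist()
    p = 2 * (1 - z.cdf(abs(t)))    # two-sided p, normal approximation
    crit = z.inv_cdf(0.975)        # ~1.96 (exact t crit for df=86 is ~1.99)
    ci = (m1 - m2 - crit * se, m1 - m2 + crit * se)
    return t, p, ci

# Teachers/preceptors (n=12) vs students (n=76), values from the paper
t, p, (lo, hi) = pooled_t_from_stats(10.25, 1.71, 12, 8.73, 1.80, 76)
# t ~ 2.74, two-sided p ~ 0.006, 95%CI ~ 0.43 to 2.61
```

The reproduced interval brackets the reported one (0.47 to 2.57), and the conclusion (a significant difference of 1.52 points) is unchanged.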

Summary of the properties of the translated version of the ACE tool

Chart 3 summarizes the properties of the translated version of the ACE Tool.

Chart 3
Properties of the translated version of the ACE Tool

DISCUSSION

Our results demonstrate that the translated version of the ACE Tool retains the capacity to discriminate between different levels of expertise, as well as acceptable internal reliability and consistency, in line with the original version. The undergraduate medical students in our study obtained a mean of 8.73 correct answers and the teachers, 10.25, comparable to the 8.6 for the “beginner” level and 10.4 for the “advanced” level in the original study. The reliability and consistency indices in our study also matched the results of the previous study⁴. It is worth drawing attention to the relatively low value of internal consistency (Cronbach’s alpha between 0.6 and 0.7), which was only “acceptable” both in our research and in the original study by Ilic et al. The very purpose of the questionnaire, which addresses competencies in different domains (construction of the clinical question, literature search, critical appraisal and integration into the clinical scenario), contributes to a weaker relationship between the items and, therefore, a lower numerical value of Cronbach’s alpha. On the other hand, each of the 15 items has its own relevance, as it addresses a specific competency, such as identifying the adequacy of randomization, blinding, intention-to-treat analysis, etc., so that the answer to each question carries an important meaning, even when it diverges from the answers to other questions.

The ACE Tool is one of several standardized questionnaires used to assess competencies in EBM, alongside the Berlin questionnaire⁶ and the Fresno Test⁷, the latter of which has also been validated in Brazilian Portuguese⁸. The Berlin questionnaire addresses only critical appraisal. The Fresno Test, in turn, assesses three domains (“ask”, “search” and “critically appraise”) through open-ended questions, but requires a long time to answer (approximately one hour). The ACE Tool allows a broad assessment (“ask”, “search”, “critically appraise” and “integrate”) that stimulates clinical reasoning, combined with high practicality and a short response time. In fact, the ACE Tool has been used internationally to assess students⁹,¹⁰ and educational strategies¹¹⁻¹⁴, although Buljan et al. observed a lower “sensitivity to change”, that is, a lower capacity of the ACE Tool to discriminate the knowledge gained after courses, limiting its use as a post-test evaluation in comparison with the Berlin questionnaire and the Fresno Test¹⁵. It is also important to note that the ACE Tool specifically targets a therapy question; important focal points of clinical activity, such as diagnosis and prognosis, are not covered. Standardized questionnaires for diagnostic reasoning and evidence-based prognosis remain a gap in the literature.

The National Curriculum Guidelines for the Undergraduate Course in Medicine recognize the need for decision-making based on critical and contextualized analysis of scientific evidence and explicitly identify as a “key action” the promotion of scientific and critical thinking and support for the production of new knowledge¹⁶. The ACE Tool addresses four of the five steps of evidence-based practice and makes it possible to discriminate specific knowledge and skills. It is therefore an important tool for understanding participants’ prior knowledge, for planning and/or adapting the curriculum, and for identifying specific educational needs.

CONCLUSIONS

The Brazilian version of the ACE Tool shows acceptable psychometric properties similar to the original version and can be used as an instrument to assess competencies for Evidence-Based Medicine in Brazilian medical students.

ACKNOWLEDGMENT

To the Professional Master’s Degree in Health Education (MPES) and the Tutorial Education Program (PET) of Universidade Federal do Rio Grande do Norte for their support in conducting the study.

REFERENCES

1. Guyatt GH. Evidence-Based Medicine. ACP J Club. 1991;114:A16.
2. Sackett DL, Rosenberg WMC, Gray JAM, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn’t. BMJ. 1996;312:71-2.
3. Dawes M, Summerskill W, Glasziou P, Cartabellotta A, Martin J, Hopayian K, et al. Sicily Statement on evidence-based practice. BMC Med Educ. 2005;5:1.
4. Ilic D, Nordin RB, Glasziou P, Tilson JK, Villanueva E. Development and validation of the ACE tool: assessing medical trainees’ competency in Evidence Based Medicine. BMC Med Educ. 2014;14:114.
5. Kline P. The handbook of psychological testing. 2nd ed. London: Routledge; 2000.
6. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer H-H, Kunz R. Do short courses in Evidence Based Medicine improve knowledge and skills? Validation of Berlin Questionnaire and before and after study of courses in Evidence Based Medicine. BMJ. 2002;325:1338-41.
7. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno Test of Competence in Evidence Based Medicine. BMJ. 2003;326:319-21.
8. Salermo MR, Herrmann F, Debon LM, Soldatelli MD, Forte GC, Bastos MD, et al. Brazilian version of the Fresno Test of Competence in Evidence-Based Medicine: a validation study. Sci Med. 2019;29(1):e32295.
9. Clode NJ, Danielson K, Dennett E. Perceptions of competency with Evidence-Based Medicine among medical students: changes through training and alignment with objective measures. N Z Med J. 2021;134(1531):63-75.
10. Mahmoud MA, Laws S, Kamal A, Al Mohanadi D, Al Mohammed A, Mahfoud ZR. Examining aptitude and barriers to Evidence-Based Medicine among trainees at an ACGME-I Accredited Program. BMC Med Educ. 2020;20:414.
11. Ilic D, Nordin RB, Glasziou P, Tilson JK, Villanueva E. A randomised controlled trial of a blended learning education intervention for teaching Evidence-Based Medicine. BMC Med Educ. 2015;15:39.
12. Yoon SH, Kim M, Tarver C, Loo LK. “ACEing” the evidence within Physical Medicine and Rehabilitation (PM&R). MedEdPortal. 2020;16:11051.
13. Goodarzi H, Teymourzadeh E, Rahimi S, Nasiri T. Efficacy of active and passive Evidence-Based Practice training for postgraduate medical residents: a non-randomized controlled trial. BMC Res Notes. 2021;14(1):317.
14. Kumaravel B, Stewart C, Ilic D. Face-to-face versus online clinical integrated EBM teaching in an undergraduate medical school: a pilot study. BMJ Evid Based Med. 2022;27:162-8.
15. Buljan I, Jeroncic A, Malicki M, Marusic M, Marusic A. How to choose an evidence-based medicine knowledge test for medical students? Comparison of three knowledge measures. BMC Med Educ. 2018;18:290.
16. Brasil. Resolução nº 3, de 20 de junho de 2014. Institui Diretrizes Curriculares Nacionais do Curso de Graduação em Medicina e dá outras providências. Brasília: Ministério da Educação; 2014.
Evaluated by double blind review process.
  • SOURCES OF FUNDING

    The authors declare no sources of funding.
Chief Editor: Daniela Chiesa. Associate editor: Not assigned.

Publication Dates

  • Publication in this collection
    12 Aug 2022
  • Date of issue
    2022

History

  • Received
    27 May 2022
  • Accepted
    31 May 2022
Associação Brasileira de Educação Médica - Brasília - DF - Brazil