
EDITORIAL

Objective structured clinical assessment as an evaluation tool for medical students

Renato S. ProcianoyI,*; Rita C. SilveiraII

IFull Professor of Pediatrics, Universidade Federal do Rio Grande do Sul, and Head of the Neonatology Service, Hospital de Clínicas de Porto Alegre, Porto Alegre, RS

IIAssociate Professor of Pediatrics, Universidade Federal do Rio Grande do Sul, and Physician at the Neonatology Service, Hospital de Clínicas de Porto Alegre, Porto Alegre, RS

As physicians working with students, we are often asked to evaluate their knowledge, skills, attitudes, and interest in learning during the medical course. When involved in a selection process for medical residency, it is very important to judge medical competency in combination with knowledge. Therefore, the ability to evaluate whether educational objectives have been achieved is a major issue for any program.

Validity and reliability must be considered when developing any method of assessment. The validity of a test is the extent to which it measures what it is intended to measure; its reliability is the reproducibility of a set of measurements, that is, their consistency or stability over time. Besides these two criteria, the objectivity and practicability of the method must also be considered1. Any method that lacks objectivity or is very difficult to employ is not practical for assessment purposes.
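
In psychometric terms, reliability is often formalized through classical test theory, a minimal sketch of which may help fix ideas: an observed score is modeled as a true score plus random error, and reliability is the share of observed-score variance attributable to true scores,

$$X = T + E, \qquad \rho_{XX'} = \frac{\sigma_T^2}{\sigma_T^2 + \sigma_E^2}$$

where $\sigma_T^2$ is the variance of true scores and $\sigma_E^2$ the error variance; a perfectly reliable test ($\sigma_E^2 = 0$) yields $\rho_{XX'} = 1$, while an increasingly noisy one drifts toward 0.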

For many years, examinations took only knowledge into consideration. Essential cognitive components are well evaluated by written examinations, whether open-ended or multiple-choice questions2. However, such tests do not assess trainees' clinical skills and attitudes.

Nowadays the focus is on competencies rather than on knowledge acquisition. Competency is defined as a complex set of behaviors built on the components of knowledge, skills, and attitudes3. One attempt at measuring clinical competency is the use of the Objective Structured Clinical Examination (OSCE).

The OSCE was first described in 1975. Thirty-three students spent 5 minutes at each of 16 stations, either procedure stations or question-and-answer stations. At procedure stations, students were asked to take a history, perform a physical examination, or carry out some focused task. An examiner assigned points for the information obtained at each station4. The OSCE involves observing students in simulated encounters and often provides information about a student's communication skills as well as his or her ability to collect clinical data. There are, however, some criticisms of the use of the OSCE. The timing and setting may seem artificial, and students may feel inhibited by the environment. The format penalizes those who use shortcuts to reach a final decision. The method is also very expensive and time consuming: it requires a minimum of 10 stations, visited over 3 to 4 hours, in order to achieve a reliability of 0.85 to 0.905. The cost is a real problem and a limiting factor for medical schools in developing countries.
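
The relation between the number of stations and reliability can be sketched with the Spearman-Brown prophecy formula, which, assuming stations of comparable quality, predicts the reliability $\rho_k$ of an examination with $k$ stations from the single-station reliability $\rho_1$:

$$\rho_k = \frac{k\,\rho_1}{1 + (k - 1)\,\rho_1}$$

For illustration, with an assumed single-station reliability of $\rho_1 = 0.35$ (a hypothetical figure, not one reported in the cited studies), ten stations would give $\rho_{10} = (10 \times 0.35)/(1 + 9 \times 0.35) \approx 0.84$, close to the 0.85 level mentioned above; reaching higher reliability requires still more stations and testing time.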

To improve the quality of clinical skills and knowledge assessment, several studies have combined the OSCE with other evaluation methods or compared its results with those of other assessment methods. Carraccio and Englander suggested that a combination of the OSCE, standardized board examinations, and direct observation in the clinical setting has the potential to become the gold standard for measuring a physician's competence6. Another study suggested that combining the OSCE with observation of actual patient encounters may provide a more valid measure of clinical performance7. It seems that the clinical skills of medical students should be routinely assessed with clinical evaluation forms. Finally, a study comparing the OSCE with a computer-controlled patient simulator suggested that both can be used effectively as performance evaluation tools8. There is a consensus that, in order to achieve high levels of reliability, the OSCE takes more time than is often practicable and should therefore be combined with other methods of assessment9.

In this issue of the Revista da Associação Médica Brasileira, Santos et al. publish a very interesting article showing that, among multiple-choice and open questions, the OSCE, interviews, and curriculum analysis (participation in scientific meetings, published papers, and voluntary activities), the OSCE alone was able to differentiate residency candidates whose clerkship had lasted 2 years from those whose clerkship had lasted less10. This finding is not surprising, since the OSCE evaluates competency, and a longer clinical medical education (internship) will improve skills and attitudes; that is why some specialties require longer residency periods than others. Some limitations of the study must be pointed out. Brazilian medical education and medical schools are very heterogeneous in quality, and the paper does not present any information about the candidates' schools of graduation. The number of applicants who had finished their medical education just before taking the exam was significantly higher in the 2-year group than in the less-than-2-year group, which suggests that the best candidates of that group had probably already been admitted to a residency program in previous years. Finally, the OSCE consisted of only five stations.

The findings of Santos et al. are very interesting because they show that the OSCE was the only tool, among all those chosen, able to identify the candidates who had a longer internship and who probably had the best clinical skills and medical attitudes. We encourage the use of the OSCE, or a similar method, combined with other methods of evaluation for the adequate assessment of medical students.

  • 1.  Barman A. Critiques on the Objective Structured Clinical Examination. Ann Acad Med Singapore. 2005;34(8):478-82.
  • 2.  Rogers PL, Grenvik A, Willenkin RL. Teaching medical students complex cognitive skills in the intensive care unit. Crit Care Med. 1995;23(3):575-81.
  • 3.  Carraccio C, Englander R, Wolfsthal S, Martin C, Ferentz K. Educating the pediatrician of the 21st century: defining and implementing a competency-based system. Pediatrics. 2004;113(2):252-8.
  • 4.  Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447-51.
  • 5.  Epstein RM. Assessment in medical education. N Engl J Med. 2007;356(4):387-96.
  • 6.  Carraccio C, Englander R. The objective structured clinical examination: a step in the direction of competency-based evaluation. Arch Pediatr Adolesc Med. 2000;154(7):736-41.
  • 7.  Rogers PL, Jacob H, Rashwan AS, Pinsky MR. Quantifying learning in medical students during a critical care medicine elective: a comparison of three evaluation instruments. Crit Care Med. 2001;29(6):1268-73.
  • 8.  Kreiter CD, Bergus GR. A study of two clinical performance scores: assessing the psychometric characteristics of a combined score derived from clinical evaluation forms and OSCEs. Med Educ Online. 2007;12:10 [cited 2009 Jan 9]. Available from: http://www.med-ed-online.org
  • 9.  Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ. 2004;38(2):199-203.
  • 10.  Santos IS, Vieira JE, Nunes MPT. Length of internship influences performance on medical residency exam. Rev Assoc Med Bras. 2009;55(6):744-8.
  • *
    Correspondência: Rua Silva Jardim,n° 1155 - apto 701 90450-071 -Porto Alegre, RS. Tel.: (51) 3331-5726 e-mail:
    The authors do not have any conflict of interest to disclose.
  • Publication Dates
    • Publication in this collection: 25 Feb 2010
    • Date of issue: 2009