Cacho et al. (2016)14
Professors met prior to the examination to devise the clinical cases. Task accomplishment was rated as “yes”, “no” or “insufficient”, with a final score of 10 points. The second phase referred to the use of the OSCE: students rotated through stations (simulated care environments) where two examiners applied the instrument.
Reliability was excellent in stations 1 (ICC=0.89), 2 (ICC=0.99) and 3 (ICC=0.99), and satisfactory in station 4 (ICC=0.73), showing the OSCE to be a useful method for the evaluation process in professional education. However, it requires pedagogical preparation and prior examiner experience, and dichotomous response options (e.g., correct or incorrect) are preferable.
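The inter-rater reliability reported above is an intraclass correlation between the two examiners' station scores. As a minimal sketch (assuming a two-way random-effects single-rater model, ICC(2,1); the score matrix below is purely illustrative, not data from the study):

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    scores: n_subjects x k_raters matrix of station scores.
    """
    Y = np.asarray(scores, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)   # per-student means
    col_means = Y.mean(axis=0)   # per-examiner means

    # Two-way ANOVA sums of squares
    ss_total = ((Y - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(2,1) formula
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical station scores: 4 students rated by 2 examiners
print(icc_2_1([[8, 7], [6, 6], [9, 9], [7, 8]]))
```

Values near 1 (as in stations 2 and 3) indicate that the two examiners' scores are almost interchangeable.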
Davies et al. (2015)26
Telephone interviews were conducted to determine which physiotherapy courses used the OSCE and to collect data about the methodology employed: number of stations, time spent in each station, type of examiner, and evaluation methods/scales. The interviews also included questions regarding professional behavior, criteria and classification components.
The number of stations varied from 1 to 10 (mean 3-4), with durations from 5 to 15 minutes. All programs reported evaluating only practical skills during the OSCE and using different classes of examiners, both full-time and assistant professors. All used evaluation forms such as checklists to assess communication, respect, patient safety and work characteristics.
Edgar et al. (2014)27
Predictive variables (course entrance interview scores) and outcome variables (students’ marks during the course and in the OSCE) were analyzed. The way the OSCE was carried out was not described.
The course entrance interview scores showed a significant association with performance in three of the six clinical internships of the course. The authors highlighted the role of entrance measurements in selecting physiotherapy undergraduate students.
Maloney et al. (2013)28
Students were divided into two groups: one group was assigned the task of learning an ability through self-video; for the other group, the self-video showed an ability less relevant to the clinical case. At the end of the semester, both groups took an OSCE based on the following abilities: introduction to the patient, interpretation of the clinical examination results, execution of the intervention, giving feedback to the patient, reformulating the intervention or evaluation technique, proper handling, and communication with the patient, among others.
Students scored significantly higher in the OSCE when the clinical ability had been analyzed with a self-video of how to perform the task (p=0.048). Analysis of the students’ perceptions identified that the self-video contributed to improved clinical performance and self-confidence for future clinical practice. The highest scores were reached when traditional teaching methods were complemented with self-videos.
Silva et al. (2011)29
Two types of examination were evaluated: traditional and OSCE. The traditional test consisted of four theoretical questions and one practical question. The OSCE included five stations (0-2 points each); students had one minute to read the task description and five minutes to accomplish it. The abilities evaluated were divided into three classes: cognitive, psychomotor and behavioral skills.
OSCE mean scores ranged between 4.4 and 9.6, with good internal consistency between stations (0.7). Agreement between the two examinations was low (r=-0.1, p=0.9), indicating that they are not comparable. The OSCE evaluates distinctly different abilities from the traditional examination, suggesting that it complements the evaluation of skills and competences that traditional exams fail to assess.
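Internal consistency across stations, as reported above, is conventionally computed as Cronbach’s alpha over a students × stations score matrix. A minimal sketch, assuming that convention (the score matrix below is illustrative, not data from the study):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a students x stations score matrix."""
    X = np.asarray(scores, dtype=float)
    n, k = X.shape
    item_vars = X.var(axis=0, ddof=1)       # variance of each station's scores
    total_var = X.sum(axis=1).var(ddof=1)   # variance of students' total scores
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 4 students, 5 stations scored 0-2 each
print(cronbach_alpha([
    [2, 1, 2, 2, 1],
    [1, 1, 0, 1, 1],
    [0, 0, 0, 1, 0],
    [2, 2, 2, 1, 2],
]))
```

Values around 0.7, as in the study, are usually read as acceptable consistency: the stations tend to rank students similarly.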
Maloney et al. (2013)32
Three teaching methods were compared: traditional, pre-recorded lessons (tutorial videos) and students’ self-performance videos. Students received a clinical case that required the use of their skills (evaluated by OSCE). Performance was scored on communication abilities, clinical reasoning, safety and risk management, and patient handling, and the results were classified as “concluded”, “partially concluded” or “insufficient”.
No significant differences in clinical performance were found between the three teaching practices. However, the educational value perceived by the students differed significantly between the video-based approaches, with tutorial videos considered superior to self-performance videos. Alternative teaching methods may produce equivalent learning results when applied to the development of practical skills, and might be used instead of live lessons rather than as a mere complement or supplement to traditional approaches.
Snodgrass et al. (2014)38
An electronic OSCE (eOSCE) was devised using an iPad to improve feedback to students. The eOSCE was tested in two universities regarding user-friendliness, preference (electronic or printed) and the quality of the feedback given to students. The authors did not describe how the OSCE was applied.
Most examiners (68%) preferred the eOSCE, mainly due to the consistency and speed of feedback to students. The advantages listed were automation, individual and immediate feedback, and time saved in OSCE administration. The disadvantages were the pre-exam preparation required and the challenges faced by examiners who were not comfortable using the technology.