Jornal de Pediatria
Print version ISSN 0021-7557
SANDOVAL, Gloria E. et al. Analysis of a learning assessment system for pediatric internship based upon objective structured clinical examination, clinical practice observation and written examination. J. Pediatr. (Rio J.) [online]. 2010, vol.86, n.2, pp. 131-136. ISSN 0021-7557. http://dx.doi.org/10.1590/S0021-75572010000200009.
OBJECTIVE: To describe and analyze three tools used in the assessment system applied to the pediatric internship over a 7-year period at the School of Medicine, Pontificia Universidad Católica de Chile, Santiago, Chile.

METHODS: Retrospective observational study of the assessment modalities implemented in the pediatric internship from 2001 through 2007. The tools were: objective structured clinical examination (OSCE), written examination, and daily clinical practice observation guidelines (DCPOG). The assessment methods were applied to the sixth-year pediatric internship, with a total of 697 students. Statistical analysis included descriptive statistics, correlation, simple linear and multiple regression (ANOVA), the Bonferroni test, and Cronbach's alpha coefficient. The significance level was set at p < 0.05.

RESULTS: The mean OSCE score was 75.7±8%, with a higher mean among females (p < 0.001). OSCE scores improved after the third year of implementation. Cronbach's alpha coefficient ranged from 0.11 to 0.78. The written examination had a mean score of 79.8±10%, with no sex differences. The mean DCPOG score was 97.1±3%, and results were better among females (p < 0.005). Correlation among the three assessment methods showed a moderate positive relationship, except in 2007, when the correlation was higher (p < 0.001).

CONCLUSIONS: The analysis showed that OSCE, written examination, and DCPOG are complementary assessment tools and yielded good results.
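The reliability statistic reported above, Cronbach's alpha, measures the internal consistency of a multi-station examination such as the OSCE: it compares the sum of the per-station score variances against the variance of the total scores. A minimal sketch of the computation is below; the student data are illustrative only and are not taken from the study.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a score matrix.

    scores: list of rows, one per student, each row a list of k
    station (item) scores. Uses sample variance (ddof = 1).
    """
    k = len(scores[0])

    def var(xs):
        # Sample variance of a sequence of numbers.
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Variance of each station's scores across students.
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    # Variance of each student's total score.
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)


# Hypothetical scores: 5 students x 4 OSCE stations (illustrative data).
students = [
    [4, 5, 3, 4],
    [3, 4, 4, 3],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
]
alpha = cronbach_alpha(students)  # roughly 0.88 for this toy matrix
```

Values near 1 indicate that the stations rank students consistently; the 0.11-0.78 range reported for the OSCE suggests that station-level consistency varied considerably across years.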
Keywords: Internship and residency; medical school; clinical competence; educational measurement; professional competence; program evaluation.