
OSCE 3D: a virtual clinical skills assessment tool for coronavirus pandemic times

Abstract:

Introduction:

In pandemic times, when the “lockdown strategy” has been adopted, technological innovations, such as instruments that can replace traditional teaching-learning methods in the training of health professionals, become essential.

Objective:

The aim of this study was to develop and evaluate the usability of a realistic interactive simulation computer system using three-dimensional imaging technology and virtual reality, built with free-access computational tools available on the web.

Methods:

The development of the prototype (OSCE 3D) followed the steps used to build “serious game” simulation software. The free-access version of the Unity Editor 3D platform (Unity Technologies, version 2018), used for developing educational games, and the software GNU Image Manipulation Program (GIMP, version 2.10.12), Blender (version 2.79) and MakeHuman (version 1.1.1) were used to create textures and build models of the 3D environments. An experimental phase assessed usability through a questionnaire based on the System Usability Scale. The study was approved by the institution’s Research Ethics Committee and all participants signed the Informed Consent Form.

Results:

A total of 39 undergraduate medical students attending the 6th semester at a private university center in northeastern Brazil voluntarily participated in the evaluation of the OSCE 3D. The usability evaluation yielded a mean score of 75.4 with a margin of error of 3.2, which is considered good usability according to the literature.

Conclusions:

This study allowed the development of a low-cost prototype of a three-dimensional realistic simulation system for clinical skills assessment. Even in the prototype phase, the product showed good usability.

Keywords:
Computer Simulation; Teaching; Assessment; Virtual Reality; Coronavirus Infections; Pandemics


INTRODUCTION

Working with distance learning and remote assessments in a course with many practical activities requires creativity from educators1,2. While remote cognitive assessments are already a reality in many universities, the remote practical assessment of clinical skills remains a challenge3.

Objective Structured Clinical Examination (OSCE)-type assessment has become widely used in medical education. However, because it is an expensive strategy, attempts to use computing and virtual environments to simulate clinical situations began to appear globally. These systems use different technologies to simulate the structured clinical examination, some with interaction via text or voice, others through interaction with images and videos in an intuitive interface4,5.

Technological strategies have been used to expand and adapt health education. Game-design techniques and solutions can be employed to create a simulated reality and the experience of something real6,7. Computer-based tools for clinical skills assessment have been described; however, few of them have been validated, had their development described, or used platforms freely available on the web8,9. A virtual OSCE is a new concept for most institutions, so significant time must be spent planning a format that meets each institution’s needs10.

In the assessment of clinical skills, a successful experiment was carried out using realistic simulation techniques in a three-dimensional (3D) virtual environment, with an interactive simulator of clinical cases. According to the students who participated in that study, the system improved communication skills and professionalism3.

Another simulator, using web-based immersive technology and developed at the University Hospital of Cologne, Germany, allowed the assessment of skills regarding surgical patients, with an impact on performance when compared to the traditional OSCE and an increase in student motivation11.

Coping with the COVID-19 pandemic and the use of the “lockdown strategy” have been significant challenges for medical education12,13. The use of traditional OSCE assessments, with adequate planning, has been reported at Duke-National University of Singapore Medical School during a period of high risk of contamination by COVID-1914.

Since conventional face-to-face OSCEs were cancelled due to the risks posed to students, examiners and patients by the COVID-19 pandemic, assessments of clinical communication, practical skills and examination skills became a real challenge for health schools. Every effort is needed to avoid delaying the graduation of new doctors, who will comprise the medical workforce15,16.

The objective of this work was to develop a prototype that simulates the traditional OSCE, using free-access tools available on the internet, and to evaluate its usability with undergraduate medical students.

METHODS

This is an applied research study that developed the prototype of a technological tool using computer graphics for OSCE-type clinical skills assessment, followed by an experimental phase that evaluated the prototype’s usability, from January to November 2019.

Development of the OSCE 3D prototype

The OSCE 3D was developed with the participation of a multi-professional team consisting of three professors from a medical course and one from computer science, a systems analyst, a programmer and a graphic designer. This multi-professional composition aimed at building a system that met the students’ needs regarding self-directed learning. For this purpose, the Co-Design methodology was used, consisting of five phases:

  1. Scope: overview of the learning objectives;

  2. Shared understanding: exchange of experiences between participants regarding the scenarios, types of pedagogical technologies and methodologies as the basis for the app;

  3. Brainstorming: sketching the primary interfaces for the software;

  4. Refining: modelling the app screens, images, clinical cases and designing the activities;

  5. Implementation: iterative development of the software with incremental deliveries. During the process, phases 3, 4 and 5 were cyclically reviewed to improve the system17.

The prototype was developed as a workstation-based simulator, to be installed on a computer and offer immersion with 3D graphics, without depending on internet transmission. The free-access personal version (for students and beginners) of the Unity Editor 3D platform (Unity Technologies, version 2018) for developing educational games and simulations was used. The additional free-access software GNU Image Manipulation Program (GIMP, version 2.10.12), Blender (version 2.79) and MakeHuman (version 1.1.1) were also used for creating textures and building models of the 3D environments.

The method used to develop the virtual environment of the OSCE 3D was the same used to create traditional 3D computer games with the realistic approach of serious games18. The steps used to develop the OSCE 3D prototype were: (1) simulation concept (objectives, visual characteristics, type of interactivity, description of the characters, technical material of the simulation, evaluation model and feedback); (2) design of the user interface and preparation of images and textures of the virtual environment (interaction buttons, selection of textures of objects and materials in the scene); (3) 3D modeling of the environment and characters (creation of 3D models, human characters and their animations); (4) development of the simulation platform (formatting the scenario, importing objects and programming the player’s movement and interactivity); (5) creation of the student’s score and feedback system; and (6) functionality tests and final adjustments.
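As an illustration of the interactivity programmed in step 4, the sketch below shows one way a mouse click on the virtual patient could open the interaction menu in Unity. This is a minimal sketch under assumed names (PatientInteraction, interactionMenu, the “Patient” tag); the article does not publish the prototype’s source code.

```csharp
using UnityEngine;

// Hypothetical sketch: open the interaction menu when the player clicks the
// virtual patient (step 4: programming the player's interactivity).
public class PatientInteraction : MonoBehaviour
{
    [SerializeField] private Camera playerCamera;        // first-person view camera
    [SerializeField] private GameObject interactionMenu; // menu with the station options

    void Update()
    {
        // On a left click, cast a ray from the cursor into the 3D scene
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = playerCamera.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit) && hit.collider.CompareTag("Patient"))
            {
                interactionMenu.SetActive(true); // show "Dialogue", "Examine", etc.
            }
        }
    }
}
```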

Development of practical stations for the OSCE 3D prototype

The main elements of traditional OSCE stations were reproduced through virtual reality simulation. Clinical and anatomical information was collected from the medical literature to create three clinical cases, which were used to compose the stations of the OSCE 3D prototype. The physical space of the corridors of a clinical skills laboratory, medical consultation rooms with examination materials, and door commands were modeled for the virtual reality environment. Scripts with possible responses were created to be presented on screen according to the activation of interaction devices. Checklists, filled in automatically according to the options chosen, were added to the system and presented as feedback at the end of each station.
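As an illustration of how such automatically filled checklists might be structured, here is a minimal sketch. The article does not describe the prototype’s internal data model, so every type and field name below is an assumption.

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical data model: a station holds a checklist whose items are
// fulfilled by specific options; the score is computed from the student's choices.
public class ChecklistItem
{
    public string Description;    // e.g., "Asked about the duration of symptoms"
    public string ExpectedOption; // identifier of the option that fulfils this item
    public float Weight;          // points awarded when that option is chosen
}

public class Station
{
    public string ClinicalCase;
    public List<ChecklistItem> Checklist = new List<ChecklistItem>();

    // Auto-fill the checklist from the selected options and return the score,
    // which the feedback screen can show next to the most appropriate options.
    public float Score(IEnumerable<string> selectedOptions)
    {
        var chosen = new HashSet<string>(selectedOptions);
        return Checklist.Where(item => chosen.Contains(item.ExpectedOption))
                        .Sum(item => item.Weight);
    }
}
```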

Evaluation of the OSCE 3D prototype

This phase of the study was interventional and exploratory, of a quantitative nature, and sought to assess the usability of the OSCE 3D as perceived by the students.

Participants

The participants were undergraduate medical students from the institution who were attending the sixth semester; they were invited to participate voluntarily and selected by non-probability sampling. According to the literature, samples with at least 12 participants are sufficient to assess usability with the method applied in this study19.

The choice of students in the third year of the medical course was motivated by the fact that these students were already familiar with the traditional OSCE assessment method.

The institution’s medical course started in 2006, with 60 students per class (semester). Since then, six student assessments using the traditional OSCE have been carried out each year, for all classes, including the internship.

Usability assessment (experimental phase)

The students underwent the same preparation process as for a traditional OSCE assessment. Before the evaluation, they were confined and received general guidance on how to access the system. Then, the students went to the institution’s computer lab and accessed the OSCE 3D prototype. There was no support or interference from the researchers, so as not to influence the usability evaluation.

Right after using the system, the students assessed its usability through a highly reliable questionnaire based on the System Usability Scale (SUS)20,21, validated for Portuguese22, consisting of 10 items answered on a 5-point Likert scale to indicate agreement or disagreement23. The 10 items of the applied questionnaire are: item 1, “I would use this system frequently”; item 2, “I found the system unnecessarily complex”; item 3, “I found the system easy to use”; item 4, “I think I would need help from technical support to be able to use this system”; item 5, “I thought that the various functions of the system were well integrated”; item 6, “I thought there were a lot of inconsistencies in this system”; item 7, “I would imagine that most people would learn to use this system quickly”; item 8, “I found the system too heavy to use”; item 9, “I felt very confident using the system”; and item 10, “I needed to learn a number of things before I could continue using the application”.

The usability score was calculated by adding the individual contribution of each item. For odd items, 1 point was subtracted from the value assigned to the answer; for even items, the value assigned to the answer was subtracted from 5. The sum of these contributions was then multiplied by 2.5, so that the total score varied between 0 and 100 points23.
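This scoring rule can be expressed directly in code. The sketch below (the class and method names are ours, not from the article) implements exactly the calculation described above.

```csharp
using System;

// SUS scoring as described in the text: odd items contribute (response - 1),
// even items contribute (5 - response), and the sum is multiplied by 2.5.
public static class SusScore
{
    public static double Compute(int[] responses) // 10 Likert answers, each 1..5
    {
        if (responses.Length != 10)
            throw new ArgumentException("SUS requires exactly 10 responses.");

        double sum = 0;
        for (int i = 0; i < 10; i++)
        {
            // index 0 corresponds to item 1, so even indices are the odd items
            sum += (i % 2 == 0) ? responses[i] - 1  // odd items: value - 1
                                : 5 - responses[i]; // even items: 5 - value
        }
        return sum * 2.5; // scales the 0-40 sum to a 0-100 score
    }
}
```

For example, the most favorable response pattern (5 on every odd item, 1 on every even item) contributes 4 points per item, giving 40 × 2.5 = 100.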

Statistical analysis

The data were tabulated in Microsoft Excel for Windows® and exported for statistical analysis to the Statistical Package for the Social Sciences (SPSS) software, version 20.0 (IBM). The data are presented as absolute and percentage frequencies. Cronbach’s alpha coefficient was used to estimate the reliability of the applied questionnaires, with a lower limit of 0.70 for acceptable reliability24.
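For reference, Cronbach’s alpha for a scale with $k$ items (here, the 10 SUS items) follows the standard formula, which the article does not reproduce:

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^2}{\sigma_T^2}\right)$$

where $\sigma_i^2$ is the variance of item $i$ and $\sigma_T^2$ is the variance of the total score.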

Ethical aspects

The study was approved by the institution’s Research Ethics Committee (CAAE: 84915418.3.0000.5049), in compliance with Resolution 466/12 of the National Health Council and with the Declaration of Helsinki. The research subjects participated voluntarily, after signing the Informed Consent Form, and were not identified, so as to ensure the confidentiality of the answers.

RESULTS

OSCE 3D prototype

The prototype of the realistic simulation system OSCE 3D was developed for offline use on computers running Windows, and is intended to reproduce the OSCE assessment method in a virtual environment aimed at undergraduate medical students. The system simulates the 3D space of a clinical skills laboratory (Figure 1a).

Figure 1a
Three-dimensional space of a clinical skills laboratory simulated by the OSCE 3D prototype.

The objects of a medical office and the detailed physical characteristics of the virtual patient, including facial and breathing movements, were simulated by the prototype in order to obtain maximum realism (Figure 1b).

Figure 1b
Medical office and physical characteristics of a virtual patient simulated by the OSCE 3D prototype.

The operator’s movement within the 3D environment was programmed to allow interaction with and examination of the virtual patient along three degrees of freedom: rotation and approach (zoom in/out). This feature was added so that the student can obtain the best viewing angle during the examination (Figure 1c); a code sketch of this camera control follows the figure caption.

Figure 1c
Interaction with a virtual patient through the rotation and zoom commands in the OSCE 3D prototype.
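Below is a minimal sketch of how such rotation and zoom around the virtual patient could be implemented in Unity. The script name, parameters and control bindings (right mouse button to rotate, scroll wheel to zoom) are illustrative assumptions, not the prototype’s actual code.

```csharp
using UnityEngine;

// Hypothetical examination camera: orbit around the patient and zoom in/out.
public class ExamCamera : MonoBehaviour
{
    [SerializeField] private Transform patient;        // focus of the examination
    [SerializeField] private float rotateSpeed = 120f; // degrees per second
    [SerializeField] private float zoomSpeed = 2f;     // metres per scroll unit
    [SerializeField] private float minDistance = 0.5f;
    [SerializeField] private float maxDistance = 3f;

    void Update()
    {
        // Rotate around the patient while the right mouse button is held
        if (Input.GetMouseButton(1))
        {
            float h = Input.GetAxis("Mouse X") * rotateSpeed * Time.deltaTime;
            float v = Input.GetAxis("Mouse Y") * rotateSpeed * Time.deltaTime;
            transform.RotateAround(patient.position, Vector3.up, h);
            transform.RotateAround(patient.position, transform.right, -v);
        }

        // Zoom by moving the camera along its line of sight, within limits
        float scroll = Input.GetAxis("Mouse ScrollWheel");
        Vector3 toPatient = patient.position - transform.position;
        float distance = Mathf.Clamp(toPatient.magnitude - scroll * zoomSpeed,
                                     minDistance, maxDistance);
        transform.position = patient.position - toPatient.normalized * distance;
        transform.LookAt(patient);
    }
}
```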

Navigation and interaction are performed with the mouse and keyboard, with selection of icons and texts on the screen throughout the entire session, in order to facilitate usability. The interaction takes place through a task bar with the options “Dialogue”, “Examine”, “Order exams”, “Prescribe” and “Diagnose” (Figure 1d). When one of these is selected, a side menu opens with options to choose from.

Figure 1d
Task bar for interaction with a virtual patient simulated by the OSCE 3D prototype.

At the end of each station, a feedback screen is presented with the scores, the options selected by the student and a comparison with the most appropriate options. A countdown timer was added to the system, triggered from the moment the student gives the command to enter the station.
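A minimal sketch of such a per-station countdown as a Unity script is shown below; the names, the 10-minute default and the UI hookup are assumptions for illustration.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical station timer: starts when the student enters the station and
// counts down to zero, when the feedback screen would be displayed.
public class StationTimer : MonoBehaviour
{
    [SerializeField] private float stationSeconds = 600f; // assumed 10-minute station
    [SerializeField] private Text timerLabel;             // on-screen countdown display
    private float remaining;
    private bool running;

    public void EnterStation() // hooked to the door/enter command
    {
        remaining = stationSeconds;
        running = true;
    }

    void Update()
    {
        if (!running) return;
        remaining -= Time.deltaTime;
        if (remaining <= 0f)
        {
            remaining = 0f;
            running = false;
            // time is up: show the feedback screen with scores here
        }
        timerLabel.text = Mathf.CeilToInt(remaining).ToString();
    }
}
```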

Evaluation of system usability by students

A total of 39 students assessed the usability of the OSCE 3D prototype using the SUS-based questionnaire; the majority were male (53.8%), with a mean age of 23.2 ± 0.5 years, ranging from 20 to 38 years. The assessment identified a good degree of usability, with a total mean score of 75.4, ranging from 52.5 to 90, with a margin of error of 3.2 and a 95% confidence interval of 72.2 to 78.6.
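For clarity, the reported interval is simply the mean plus or minus the stated margin of error (our arithmetic, consistent with the figures above):

$$\mathrm{CI}_{95\%} = 75.4 \pm 3.2 = [72.2,\ 78.6]$$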

In the analysis of the individual responses to each questionnaire item, it was found that most items scored above 70 (Chart 1). The exception was item 6 (“I thought there were a lot of inconsistencies in this system”), which scored 24.4.

Chart 1
Usability evaluation of the OSCE 3D prototype using the System Usability Scale (SUS) questionnaire by undergraduate medical students (N = 39).

The analysis of the responses to the SUS questionnaire revealed acceptable reliability, with a Cronbach’s alpha coefficient of 0.714 for all items.

DISCUSSION

A prototype to be used in the assessment of clinical skills was developed using computer graphics technology. This prototype maintains characteristics typical of the traditional OSCE and adds advantages typical of virtual simulators, such as the use of a playful environment.

The simulation modality based on virtual reality using 3D environments has been described as an alternative that can bring good results for learning and assessment in health education, in comparison with expensive physical simulator models26.

Multiple online platforms have recently been used to run virtual OSCEs, including Microsoft Teams, Zoom, Google Hangouts and Adobe Connect. Each platform has its pros and cons, and all are frequently updated with new features. Although the majority of skills can be assessed using an online set-up, authentic settings may help candidates feel more comfortable, with less impact on station focus10.

A study recently conducted at the Children’s Hospital of Cincinnati (USA) with pediatric cases evaluated the perception of third-year medical students exposed to a clinical-case simulator built with the same game development platform (Unity) used in this study. Most students strongly agreed that the simulations were clinically accurate (97.4%) and covered the main learning objectives (100%). In addition, the virtual reality simulator was considered more effective than traditional lectures, distance learning and learning with low-fidelity mannequins. The students also rated training with the virtual simulator as equally or more effective, in terms of learning, than high-fidelity mannequins and standardized patients. The only modality compared to which the virtual simulator was rated as less effective was bedside teaching26.

Usability

The prototype developed in this study showed good usability when evaluated by a group of undergraduate medical students. Usability is the software’s capacity to be understood, learned, operated and found attractive by the user when used under specified conditions27. To evaluate the usability of the OSCE 3D prototype, the standard SUS questionnaire was adopted, which has previously been used in studies analyzing educational software28,29.

According to some authors, a score higher than 68.5 corresponds to an acceptable degree of usability21. In another publication on the interpretation of SUS results, the same authors identified that a score above 73 would indicate good usability30. The average SUS score of the OSCE 3D prototype was 75.4, therefore considered “good” according to the method employed.

In the item-by-item analysis of the answers to the SUS-based questionnaire, item 6 (“I thought there were a lot of inconsistencies in this system”) was the only one with an average score below 70. A previous study using the SUS scale to assess a virtual learning environment observed interference in the score due to technical failures related to hardware and network configuration, which influence system use31. Our result might reflect a bias in the students’ interpretation of the term “inconsistencies in the application”, due to possible instability and crashes during the execution of the prototype on computer lab machines not adapted for computer graphics.

One explanation is that the score depends on the level of ‘product’ experience or exposure; another may be related to the complexity of the product32,33. In some cases, professionals have to adapt to a new system and learn how to integrate it into their work routine34. Although the SUS is favored as the industry standard for simple and reliable assessment of usability, and allows comparison with the usability of other routinely used technologies, it has limitations. In particular, it assesses satisfaction more than the efficiency and effectiveness dimensions35.

In a systematic review of immersive environments with simulated patients using 3D technology, 13 publications relevant to medical education were identified8. Of these, only 5 carried out a validation study: 1 assessed content validity36, 2 assessed satisfaction37,38 and 2 assessed performance improvement39,40. None of them validated the system’s usability, which limits our comparison with other studies on student assessment systems.

Students need a considerable amount of training to ensure that their performance is not affected by the assessment format and, instead, reflects their clinical competence. In addition to training the candidates on the virtual OSCE platform, it is necessary to train the hosts, examiners and role players; everyone needs to be familiar with the logistics of how the assessment will be run. Future opportunities should also include collecting feedback from candidates and examiners before the results become available.

Limitations

As limitations of the study, it should be considered that it was carried out in a single university center and was restricted to the evaluation of usability. It has not yet been possible to compare the prototype with clinical skills assessment through the traditional OSCE, so there is no evidence to support one or more of the key inferences. The absence of computers adapted for computer graphics software is another factor that may have influenced the students’ evaluation. Furthermore, as it is a prototype, it has not yet been possible to evaluate communication or procedural skills.

Another limitation is the fact that the prototype is not yet accessible online, which implies the need to install an application and makes updates more difficult.

Innovations

This study is innovative in that it describes the steps for the development of a workstation-based 3D simulator, not dependent on internet transmission, built with free-access computational tools available on the web, which can be used in OSCE-type assessments of clinical skills.

The developed prototype can favor improvements in economic viability, mobility, reproducibility, logistics and physical space, with less need for human resources. In addition, the virtual OSCE system may have other applications, such as formative assessment, beyond the obvious applications in pandemic situations.

CONCLUSION

A prototype of a 3D immersive, realistic simulation environment was developed with computer graphics, using free-access tools available on the internet and functioning independently of the web, seeking to simulate a traditional OSCE-type evaluation. The prototype showed good usability.

As this is still an early version of a method for the virtual assessment of clinical skills, further studies are needed to explore other aspects related to the functionality of this prototype or of similar versions, applicable in times of lockdown strategies to face pandemics such as COVID-19.

REFERENCES

  • 1
    Patil NG, Chan Y, Yan H. SARS and its effect on medical education in Hong Kong. Med Educ. 2003;37(12):1127-8.
  • 2
    Donkin R, Askew E, Stevenson H. Video feedback and e-Learning enhances laboratory skills and engagement in medical laboratory science students. BMC Med Educ. 2019;19(1):310.
  • 3
    Yang W, Hebert D, Kim S, Kang B. MCRDR Knowledge-Based 3D Dialogue Simulation in Clinical Training and Assessment. J Med Syst. 2019;43(7):200.
  • 4
    Johnsen K, Dickerson RF, Raij A, Harrison C, Lok B, Stevens AO, et al. Evolving an immersive medical communication skills trainer. Presence: Teleoperators and Virtual Environments. 2006;15:(1):33-46.
  • 5
    Courteille O, Bergin R, Stockeld D, Ponzer S, Fors U. The use of a virtual patient case in an OSCE-based exam--a pilot study. Med Teach. 2008;30(3):e66-76.
  • 6
    van Gaalen AEJ, Brouwer J, Schönrock-Adema J, Bouwkamp-Timmer T, Jaarsma ADC, Georgiadis JR. Gamification of health professions education: a systematic review. Adv Health Sci Educ Theory Pract. 2020 Oct 31. Epub ahead of print.
  • 7
    Middeke A, Anders S, Schuelper M, Raupach T, Schuelper N. Training of clinical reasoning with a Serious Game versus small-group problem-based learning: A prospective study. PLoS One. 2018;13(9):e0203851.
  • 8
    Kleinert R, Wahba R, Chang DH, Plum P, Hölscher AH, Stippel DL. 3D immersive patient simulators and their impact on learning success: a thematic review. J Med Internet Res. 2015;17(4):e91.
  • 9
    Brossier D, Sauthier M, Alacoque X, Masse B, Eltaani R, Guillois B, et al. Perpetual and Virtual Patients for Cardiorespiratory Physiological Studies. J Pediatr Intensive Care. 2016;5(3):122-8.
  • 10
    Hopwood J, Myers G, Sturrock A. Twelve tips for conducting a virtual OSCE. Med Teach. 2020:1-4.
  • 11
    Chon SH, Hilgers S, Timmermann F, Dratsch T, Plum PS, Berlth F, et al. Web-Based Immersive Patient Simulator as a Curricular Tool for Objective Structured Clinical Examination Preparation in Surgery: Development and Evaluation. JMIR Serious Games. 2018;6(3):e10693.
  • 12
    Li L, Xv Q, Yan J. COVID-19: the need for continuous medical education and training. Lancet Respir Med. 2020;8(4):e23.
  • 13
    Mian A, Khan S. Medical education during pandemics: a UK perspective. BMC Med. 2020;18(1):100.
  • 14
    Boursicot K, Kemp S, Ong T, Wijaya L, Goh SH, Freeman K, et al. Conducting a high-stakes OSCE in a COVID-19 environment. MedEdPublish. 2020;9(1):54.
  • 15
    Musa TH, Ahmad T, Khan M, Haroon H, Wei P. Global outbreak of COVID-19: a new challenge? J Infect Dev Ctries. 2020;14(3):244-5.
  • 16
    Bedford J, Enria D, Giesecke J, Heymann DL, Ihekweazu C, Kobinger G, et al. WHO Strategic and Technical Advisory Group for Infectious Hazards. COVID-19: towards controlling of a pandemic. Lancet. 2020;395(10229):1015-8.
  • 17
    Millard D, Howard Y, Gilbert L, Wills G. Co-design and co-deployment methodologies for innovative m-learning systems. In: Goh TT, (Ed). Multiplatform E-Learning Systems and Technologies: Mobile Devices for Ubiquitous ICT-Based Education. New York: IGI Global 2010.
  • 18
    Zyda M. From Visual to Virtual Reality to Games. Computer. 2005;38(9):25-32.
  • 19
    Sauro J. A practical guide to the system usability scale: Background, benchmarks & best practices. Measuring Usability LLC 2011.
  • 20
    Brooke J. SUS: a “quick and dirty” usability scale. In: Jordan PW, Thomas B, Weerdmeester B, McClelland IL, editors. Usability evaluation in industry. London: Taylor & Francis; 1996. p. 189-94. ISBN-10: 0748404600.
  • 21
    Bangor A, Kortum P, Miller J. An Empirical Evaluation of the System Usability Scale. Int J Hum Comput Interact. 2008;24(6):574-94.
  • 22
    Tenório JM, Cohrs FM, Sdepanian VL, Pisa IT, de Marin HF. Desenvolvimento e Avaliacão de um Protocolo Eletrônico para Atendimento e Monitoramento do Paciente com Doença Celíaca. Rev Inform Teór Aplic. 2011;17(2):210-20.
  • 23
    Lewis JR, Sauro J. The Factor Structure of the System Usability Scale. In: Kurosu M. (eds) Human Centered Design. HCD 2009. Lecture Notes in Computer Science, vol 5619. Springer, Berlin, Heidelberg 2009.
  • 24
    Bonett DG, Wright TA. Cronbach’s alpha reliability: Interval estimation, hypothesis testing, and sample size planning. J Organ Behav. 2014;36(1):3-15.
  • 25
    Haerling KA. Cost-Utility Analysis of Virtual and Mannequin-Based Simulation. Simul Healthc. 2018;13(1):33-40.
  • 26
    Zackoff MW, Real FJ, Cruse B, Davis D, Klein M. Medical Student Perspectives on the use of Immersive Virtual Reality for Clinical Assessment Training. Acad Pediatr. 2019;19(7):849-51.
  • 27
    Brooke J. SUS: a “quick and dirty” usability scale. In: Jordan PW, Thomas B, Weerdmeester B, McClelland IL, editors. Usability evaluation in industry. London: Taylor & Francis; 1996.
  • 28
    Zbick J, Nake I, Milrad M, Jansen M. A web-based framework to design and deploy mobile learning activities: evaluating its usability, learnability and acceptance. In: 2015 IEEE 15th International Conference on Advanced Learning Technologies (ICALT); 2015. p. 88-92.
  • 29
    Chung H, Chen S, Kuo M. A study of EFL college students’ acceptance of mobile learning. Procedia - Social and Behavioral Sciences. 2015;176:333-9.
  • 30
    Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: Adding an adjective rating scale. Journal of usability studies. 2009;4(3):114-23.
  • 31
    Boucinha RM, Tarouco LMR. Avaliação de Ambiente Virtual de Aprendizagem com o uso do SUS - System Usability Scale. RENOTE Revista Novas Tecnologias na Educação. 2013;11(3):1-10.
  • 32
    Borsci S, Federici S, Bacci S, Gnaldi M, Bartolucci F. Assessing user satisfaction in the era of user experience: comparison of the SUS, UMUX, and UMUX-LITE as a function of product experience. Int J Hum Comput Interact. 2015;31(8):484-95.
  • 33
    Roxo-Gonçalves M, Martins MAT, Martins MD, Schmitz CAA, Dal Moro RG, D’Avila OP, et al. Perceived usability of a store and forward telehealth platform for diagnosis and management of oral mucosal lesions: a cross-sectional study. PLoS One. 2020;15(6):e0233572.
  • 34
    Mol M, van Schaik A, Dozeman E, Ruwaard J, Vis C, Ebert DD, et al. Dimensionality of the system usability scale among professionals using internet-based interventions for depression: a confirmatory factor analysis. BMC Psychiatry. 2020;20(1):218.
  • 35
    Melnick ER, Dyrbye LN, Sinsky CA, Trockel M, West CP, Nedelec L, et al. The Association Between Perceived Electronic Health Record Usability and Professional Burnout Among US Physicians. Mayo Clin Proc. 2020;95(3):476-87.
  • 36
    Cohen D, Sevdalis N, Patel V, Taylor M, Lee H, Vokes M, et al. Tactical and operational response to major incidents: feasibility and reliability of skills assessment using novel virtual environments. Resuscitation. 2013;84(7):992-8.
  • 37
    Heinrichs WL, Youngblood P, Harter PM, Dev P. Simulation for team training and assessment: case studies of online training with virtual worlds. World J Surg. 2008;32(2):161-70.
  • 38
    Youngblood P, Harter PM, Srivastava S, Moffett S, Heinrichs WL, Dev P. Design, development, and evaluation of an online virtual emergency department for training trauma teams. Simul Healthc. 2008;3(3):146-53.
  • 39
    Funke K, Bonrath E, Mardin WA, Becker JC, Haier J, Senninger N, et al. Blended learning in surgery using the Inmedea Simulator. Langenbecks Arch Surg. 2013;398(2):335-40.
  • 40
    Knight JF, Carley S, Tregunna B, Jarvis S, Smithies R, de Freitas S, et al. Serious gaming technology in major incident triage training: a pragmatic controlled trial. Resuscitation. 2010;81(9):1175-9.
Evaluated by the double-blind review process.
SOURCES OF FUNDING

    None of the authors received funding for the research and publication of this article.

Edited by

Chief Editor: Rosiane Viana Zuza Diniz. Associate Editor: Izabel Cristina Meister Martins Coelho.

Publication Dates

  • Publication in this collection
    11 June 2021
  • Date of issue
    2021

History

  • Received
    04 Oct 2020
  • Accepted
    28 Mar 2021