OSCE 3D: a virtual clinical skills assessment tool for coronavirus pandemic times



Abstract
Background: In pandemic times in which the "lockdown strategy" has been adopted, it is essential to use technological innovations, such as instruments that can replace traditional teaching-learning methods in the training of health professionals. The aim of this study was to develop and evaluate the usability of a realistic interactive simulation computer system using three-dimensional imaging technology and virtual reality with free-access computational tools available on the web.
Methods: the development of the prototype (OSCE 3D) was based on the steps used to build "Serious Game" simulation software. An experimental phase was carried out to assess usability through a questionnaire based on the System Usability Scale. The study was approved by the Research Ethics Committee of the institution, and all participants signed the Informed Consent Form.
Results: a total of 39 undergraduate medical students from the 6th semester of a private university center in northeastern Brazil voluntarily participated in the evaluation of the OSCE 3D. The usability evaluation presented a mean score of 75.4 with a margin of error of 3.2, considered good usability according to the literature.
Conclusions: this work allowed the development of a low-cost prototype of a three-dimensional realistic simulation system for OSCE assessment stations. Even in the prototype phase, the product showed good usability.

Background
Coping with the COVID-19 pandemic and the use of the "lockdown strategy" have been significant challenges for medical education 1,2 .
Although developing or underdeveloped countries were the last to be affected by the COVID-19 pandemic, public health systems in these locations are fragile and require a larger number of qualified health professionals. Thus, in these countries, every effort is needed to avoid delaying the graduation of new doctors, who will make up the medical workforce 3,4 .
Working with distance learning and remote assessments in a course with a large amount of practical activities has required creativity from educators 5,6 . While distance cognitive assessments are already a reality in many universities, the practical assessment of clinical skills remains a difficulty to be faced 7 .
There is a report in the literature on the use of traditional Objective Structured Clinical Examination (OSCE) assessments, with adequate planning, at the Duke-National University of Singapore Medical School, during a period of high risk of contamination by COVID-19 8 .
Computer-based tools for clinical skills assessment have been described. However, few of these have been validated or have had their development described using platforms available on the web 9,10 .
The objective of this work was to develop a low-cost prototype that simulates the traditional OSCE, using free-access tools available on the internet, and to validate it for usability by undergraduate medical students.

Methods
This is applied research that developed the prototype of a technological tool using computer graphics, for use in OSCE-type evaluation of clinical skills, followed by an experimental phase that evaluated the usability of this prototype.

Development of the OSCE 3D prototype
The OSCE 3D was developed with the participation of a multi-professional team consisting of three teachers from a medical course and one from computing, a systems analyst, a programmer and a graphic designer. The multi-professional composition of the development team had the objective of constructing a tool that met student needs with regard to self-learning. For this, the Co-Design methodology was used, consisting of five phases: (I) scope: overview of the learning objectives; (II) shared understanding: exchange of experiences between stakeholders on the scenarios, types of pedagogical technologies and methodologies as the basis for the app; (III) brainstorming: sketching the primary interfaces for the software; (IV) refining: modelling the app screens, images, clinical cases and designing the activities; (V) implementation: interactive development of the software with incremental deliveries. During the process, phases III, IV and V were reviewed cyclically to improve the system 11 .
The prototype was developed as a workstation-based simulator to be installed on a computer and offer immersion with three-dimensional (3D) graphics, not dependent on internet transmission. The free-access personal version of the Unity Editor 3D platform, intended for students and beginners, was used. The method used to develop the virtual environment of the OSCE 3D was the same used to create traditional 3D computer games, with the realistic approach of serious games 12 . The steps used to develop the OSCE 3D prototype were: (1) simulation concept (objectives, visual characteristics, type of interactivity, description of the characters, technical material of the simulation, evaluation model and feedback); (2) design of the user interface, preparation of images and textures of the virtual environment (interaction buttons, selection of textures of objects and materials in the scene); (3) 3D modeling of the environment and characters (creation of 3D models, human characters and their animations); (4) development of the simulation platform (formatting the scenario, importing objects and programming the movement and interactivity of the player); (5) creation of the student's score and feedback system; and (6) functionality tests and final adjustments.

Development of practical stations for the OSCE 3D prototype
The main elements of traditional OSCE stations were reproduced through simulation using virtual reality.
Clinical and anatomical information was collected from the medical literature to compose three clinical cases, which were used to compose the stations of the OSCE 3D prototype. The physical space of the corridors of a clinical skills laboratory, medical consultation rooms with examination materials and door commands were modeled for the virtual reality environment. Scripts with possible responses were created to be presented on the screen according to the activation of interaction devices. Checklists filled in automatically, according to the options chosen, were added to the system and presented as feedback after the end of each station.
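The auto-filled checklist described above can be sketched as a simple data structure; the item names and scoring weights below are illustrative assumptions for a hypothetical station, not the actual OSCE 3D data model:

```python
# Minimal sketch of an auto-filled OSCE station checklist.
# Item names and weights are hypothetical, for illustration only.

CHECKLIST = {
    "greets_patient": 1.0,
    "asks_pain_history": 2.0,
    "examines_abdomen": 2.0,
    "orders_correct_exam": 2.0,
    "gives_correct_diagnosis": 3.0,
}

def grade_station(selected_options):
    """Fill the checklist from the options the student selected and
    return (score, feedback) to present at the end of the station."""
    score = sum(w for item, w in CHECKLIST.items() if item in selected_options)
    # Feedback compares what was selected with the expected actions.
    feedback = {item: (item in selected_options) for item in CHECKLIST}
    return score, feedback
```

In this scheme each on-screen option the student activates marks the corresponding checklist item, so the feedback screen can list, item by item, which expected actions were and were not performed.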

Evaluation of the OSCE 3D prototype
This phase of the study was interventional and exploratory, of a quantitative nature, and sought to assess the usability of the OSCE 3D by the students.

Participants
The participants were undergraduate medical students from the institution who were in the sixth semester, invited to participate voluntarily. Students in the third year of the medical course were chosen because they are already familiar with the traditional OSCE assessment method.

Usability assessment (experimental phase)
Students underwent the same preparation process as for a traditional OSCE assessment. Before the evaluation, they were confined and received general guidance about the evaluation. Then, the students went to the institution's computer lab and accessed the OSCE 3D prototype. Right after using the system, students assessed usability through a questionnaire based on the System Usability Scale (SUS), composed of 10 items answered on a 5-point Likert scale to identify agreement or disagreement 13 . The 10 items of the applied questionnaire are: item 1 "I would use this system frequently"; item 2 "I found the system unnecessarily complex"; item 3 "I found the system easy to use"; item 4 "I think I would need the support of a technical person to be able to use this system"; item 5 "I thought that the various functions of the system were well integrated"; item 6 "I thought there was a lot of inconsistency in this system"; item 7 "I would imagine that most people would learn to use this system quickly"; item 8 "I found the system very cumbersome to use"; item 9 "I felt very confident using the system"; and item 10 "I needed to learn a number of things before I could continue using the application".
The calculation of the usability score was obtained by adding the individual contribution of each item.
For odd items, 1 point was subtracted from the value assigned to the answer. For even items, the calculation was made by subtracting the value assigned to the response from the total of 5 points. For the calculation of the total score, the value obtained from the sum of the even and odd items was multiplied by 2.5. In the end, the total score of the usability scale could vary between 0 and 100 points. According to the literature, samples of at least 12 participants are sufficient to assess usability using the same method applied in this study 14 .
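The scoring rule described above can be expressed as a short function; this is a sketch of the standard SUS computation, in which odd items are positively worded and even items negatively worded:

```python
def sus_score(responses):
    """Compute the System Usability Scale score for one respondent.

    `responses` is a list of 10 Likert answers (1-5), in item order.
    Odd items contribute (answer - 1); even items contribute
    (5 - answer).  The sum of the ten contributions is multiplied
    by 2.5, giving a score between 0 and 100.
    """
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:          # odd item: positively worded
            total += r - 1
        else:                   # even item: negatively worded
            total += 5 - r
    return total * 2.5
```

For example, a respondent who answers 5 to every odd item and 1 to every even item obtains the maximum score of 100, while all-neutral answers (3 throughout) yield 50.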

Statistical analysis
The data were tabulated in Microsoft Excel for Windows® and exported for statistical analysis in the Statistical Package for the Social Sciences (SPSS) software, version 20.0 (IBM). The data were presented as absolute and percentage frequencies. Cronbach's alpha coefficient was used to estimate the reliability of the questionnaires applied, with a lower limit of 0.70 for acceptable reliability 15 .

Ethical aspects
The study was approved by the Research Ethics Committee of the institution (CAAE: 84915418.3.0000.5049), in compliance with Resolution 466/12 of the National Health Council and the Declaration of Helsinki. The research subjects participated voluntarily, after signing the Informed Consent Form, and were not identified, so as to ensure the confidentiality of the answers.

Results

OSCE 3D prototype
The prototype of the OSCE 3D realistic simulation system was developed for use on computers, in an offline version for the Windows platform, intended to reproduce the evaluation by the OSCE method in a virtual way, for students of the undergraduate medical course. The system simulates a 3D space of a clinical skills laboratory (figure 1a).

The objects of a medical office and the detailed physical characteristics of the virtual patient, with facial and breathing movements, were simulated by the prototype in order to obtain maximum realism (figure 1b).
The operator's movement within the 3D environment was programmed to allow interaction with and examination of the virtual patient with three degrees of freedom: rotation and approach (zoom in/out). This feature was added to allow the operator to obtain the best viewing angle during the examination (figure 1c).
Navigation and interactivity were done through the mouse and keyboard, with selection of icons and texts on the screen during the entire execution, in order to facilitate usability. The interaction takes place through a task bar with the options "Dialogue", "Examine", "Order exams", "Prescribe" and "Diagnose" (figure 1d). When selecting each of these, a side menu opens with options to be selected.
At the end of each station, a feedback screen with scores, the options selected by the student and the comparison with the most appropriate options is presented. A countdown timer was added to the system, which is triggered from the moment that the student gives the command to enter the station.

Evaluation of system usability by students
A total of 39 students assessed the usability of the OSCE 3D prototype using the SUS-based questionnaire; the majority were male (53.8%), with an average age of 23.2 ± 0.5 years. The assessment identified a good degree of usability, with a total average score of 75.4, a margin of error of 3.2 and a 95% confidence interval of 72.2 to 78.6. In the analysis of the individual responses to each item of the questionnaire, it was found that most items had a score greater than 70 (figure 2). The exception was item 6, which states "I thought there was a lot of inconsistency in this system". The analysis of the responses to the SUS questionnaire showed acceptable reliability, with a Cronbach's alpha coefficient of 0.714 for odd items and 0.713 for even items (without item 6).

Discussion
It was possible to develop a prototype for use in the assessment of clinical skills using computer graphics technology. This prototype maintains characteristics typical of the traditional OSCE and adds advantages typical of virtual simulators, such as the use of a playful environment. The prototype showed good usability after evaluation by a group of undergraduate medical students.
The simulation modality based on virtual reality using 3D environments has been described as an alternative that can bring good results for learning and evaluation in health education, in comparison with the expensive models of physical simulators 16 .
Serious games are usually developed to enhance learning. In a study comparing a group of undergraduate medical students submitted to the Problem-Based Learning method with another submitted to a serious game developed for teaching accident and emergency care, it was identified that the virtual method resulted in a higher score in the evaluation with the use of clinical cases 17 .
In a more recent study, the authors tested the effect of a serious game simulating an emergency department and assessed satisfaction and the effect on students' declarative and procedural knowledge through a pre-test/post-test design. In this study, a significant increase in declarative knowledge was observed 18 . However, that serious game was aimed at the teaching process, not at the assessment of skills.
A study conducted recently at the Children's Hospital of Cincinnati (USA) with pediatric cases evaluated the perception of third-year medical students submitted to a clinical case simulator built with the same game development platform (Unity). It was found that most students strongly agreed that the simulations were clinically accurate (97.4%) and that they covered the main learning objectives (100%). In addition, the virtual reality simulator was rated more effective than traditional lectures, distance learning and learning with low-fidelity mannequins. The students also rated training with the virtual simulator as equally or more effective, in terms of learning, than high-fidelity mannequins and the use of standardized patients. The only modality compared to which the virtual simulator was rated as less effective was bedside teaching 19 .
In the evaluation of clinical skills, a successful experiment was carried out, using realistic simulation techniques in a 3D virtual environment, with an interactive simulator system of clinical cases. According to students who participated in the study, the system improved communication skills and professionalism 7 .
Another simulator using a web-based immersive technology, developed at the University Hospital of Cologne, in Germany, allowed the assessment of skills in surgical patients, with an impact on performance when compared to the traditional OSCE and with an increase in student motivation 20 .

Usability
Usability is the software's capacity to be understood, learned, operated and attractive to the user when used under specified conditions 21 . To validate the OSCE 3D prototype, the standard SUS assessment questionnaire was adopted, which has already been used in studies for the analysis of educational software 22,23 .
According to some authors, a score higher than 68.5 corresponds to an acceptable degree of usability 24 . In another publication on the interpretation of the result obtained with the SUS questionnaire, the same authors identified that a score above 73 would be considered good usability for a software product 25 .
In assessing the usability of the OSCE 3D prototype, the average SUS score was 75.4, and it was therefore considered "Good" according to the method employed.
During the item-by-item analysis of the answers obtained with the SUS-based questionnaire, it was observed that item 6 ("I thought there was a lot of inconsistency in this system") was the only one with an average score below 70. This result could stem from a bias in the students' interpretation of the term "inconsistency in the application", due to possible instability and crashes during the execution of the prototype on computer lab machines not adapted for computer graphics.
In a systematic review of immersive environments with simulated patients using 3D technology, a total of 13 publications with relevance to medical education were identified 9 . Of these publications, only 5 carried out a validation study: 1 with content validity 26 , 2 with satisfaction 27,28 and 2 with improvement of performance 29,30 . None of these validated the system's usability.

Limitations
As limitations of the study, it should be considered that it was carried out at a single university center, with a limited group of students, and was restricted to the evaluation of usability; at this time, it was not possible to compare it with clinical skills assessment through the traditional OSCE. The absence of computers adapted for computer graphics software is another factor that may have influenced the students' evaluation. Furthermore, as a prototype, it was not yet possible to evaluate communication or procedural skills.

Innovations
However, the study is innovative in describing the steps for the development of a low-cost workstation-based 3D simulator, not dependent on internet transmission, built with free-access computational tools available on the web, which can be used in the assessment of clinical skills such as the OSCE.
In addition, the developed prototype can favor improvements in terms of economic viability, mobility, reproducibility, reduced need for human resources, logistics and physical space.

Conclusion
It was possible to develop a low-cost prototype of a 3D immersive realistic simulation environment with computer graphics, using free-access tools available on the internet and functioning independently of the web, which sought to simulate a traditional OSCE-type evaluation. The prototype was also validated for usability.
As it is a still early version of a method for assessing clinical skills remotely, further studies are needed to explore other aspects related to the functionality of this prototype or other similar versions, applicable in times of lockdown strategies to face pandemics like COVID-19.

Abbreviations
OSCE - Objective Structured Clinical Examination
3D - three-dimensional
SUS - System Usability Scale

Figure 2: Usability evaluation of the OSCE 3D prototype using the System Usability Scale (SUS) questionnaire by undergraduate medical students (N = 39). Source: prepared by the authors.