
Development and Validation of a Mobile Application for the Teaching of Electrocardiogram

ABSTRACT

Introduction

The interpretation of the electrocardiogram (ECG) is fundamental for the identification of cardiovascular diseases, which are among the main causes of death in the world. The acquisition of this competence is essential in the training of the general practitioner, and new technology-based teaching tools, such as mobile learning, can facilitate the acquisition of this knowledge.

Objectives

The objective of this study was to develop a mobile application for teaching ECG interpretation and to evaluate its usability and its potential for use in medical education.

Methods

With the participation of two health-area professors, one computing professor, a systems analyst, a programmer and a graphic designer, a mobile application was developed to teach ECG interpretation using the Java language. For the validation analysis, two instruments were used: a usability evaluation questionnaire based on the System Usability Scale (SUS) and a questionnaire used to evaluate the suitability of software for use in medical education, previously translated into Portuguese and applied in Brazil.

Results

The “ECG Fácil” application was developed for off-line use and free access on the iOS and Android platforms. In the validation phase, a total of 109 students had free access to the mobile application for six weeks and then evaluated its usability through a SUS-based questionnaire. The answers showed good reliability according to the validation analysis by Cronbach’s alpha coefficient (0.74), and the application presented excellent acceptance, with a mean SUS score of 85.3. In addition, fifteen faculty members evaluated the application through the questionnaire used to assess the suitability of software for use in medical education, with most agreeing with the items indicating that it is suitable for this purpose.

Conclusion

The “ECG Fácil” application was considered to have good usability by students and to be suitable for educational purposes by teachers. Future studies with this application will be needed to assess students’ acquisition and retention of knowledge about ECG interpretation.

Electrocardiogram; Mobile Applications; Teaching; Health; Cardiology


INTRODUCTION

The physiologist Augustus D. Waller, in 1887, was the first to record a human electrocardiogram tracing, by means of electrodes connected to the front and back walls of the chest¹. Currently, the electrocardiogram (ECG) is a low-cost, non-invasive, risk-free examination, used routinely in daily clinical practice to identify serious conditions such as myocardial infarction, cardiac arrhythmias, electrolyte disturbances and pulmonary thromboembolism².

Considering its importance for diagnosing serious illnesses and the wide availability of the method in health units, learning ECG interpretation has become essential in undergraduate medical courses. However, students often regard the interpretation of this examination as difficult, as it requires knowledge of anatomy, cardiac physiology and electrical vectors³.

At both undergraduate and postgraduate levels, ECG interpretation is traditionally taught by specialists through lectures to large groups of students⁴. In addition, the use of teaching machines for this subject has been reported in the literature since 1964⁵. Since then, a variety of educational software for ECG interpretation has been designed for use on computers⁶.

Due to easy and frequent access to smartphones and tablets, the use of these devices as instruments to assist the teaching/learning process is becoming increasingly common⁷,⁸. Specifically with regard to ECG interpretation, there are various applications (apps) accessible through these devices. However, according to the current literature, there is no app for mobile devices designed for educational purposes and tested for usability by medical students.

Therefore, the present work describes the development of an app for mobile devices for teaching ECG interpretation and the evaluation of its usability and its potential for educational use.

METHOD

This is applied research, involving the development of a technological tool for use in teaching, followed by quantitative analysis of its use, with information collected by means of questionnaires and processed statistically.

Development of the app for mobile devices

The app for mobile devices was developed for Android and iOS using the Java language, with the participation of a multi-professional team consisting of two teachers from the medical course and one from computing, a systems analyst, a programmer and a graphic designer. Software Development Kits (SDKs) specific to Android and Apple devices were used. For the Android platform, the Android Studio Integrated Development Environment (IDE) from Google was used, together with Application Programming Interfaces (APIs) and Open Source Computer Vision (OpenCV). After finalizing a version without apparent errors, the app was registered at the National Industrial Property Institute (Instituto Nacional de Propriedade Industrial).

The multi-professional composition of the development team had the objective of constructing an app that met students’ needs with regard to self-learning. For this, an adapted co-design methodology was used, consisting of five phases: (I) scope – overview of the learning objectives; (II) shared understanding – exchange of experiences between stakeholders on the scenarios, types of pedagogical technologies and methodologies underlying the app; (III) brainstorming – sketching the primary interfaces of the app; (IV) refining – modelling the app screens, images and clinical cases and designing the activities; (V) implementation – iterative development of the software with incremental deliveries. During the process, phases III, IV and V were revisited cyclically to improve the app⁹,¹⁰.

Evaluation of the app for mobile devices

The evaluation of the app was conducted by students and teaching staff at a private higher education institution (HEI) in the Northeast of Brazil.

Participants (students and teaching staff)

The participating students were in the second semester of the medical course of the HEI, in which the cardiology module is taught. All students who took this module during the 2017.2 and 2018.1 semesters were invited to participate voluntarily. Students were excluded if they were not regularly enrolled at the HEI during this period or if they did not have a smartphone with which to access the app to be evaluated.

The group of teachers that participated in the evaluation of the app was formed by specialists in cardiology from the same institution, who had contact with the students during the 2017.2 and 2018.1 semesters, not only during the cardiology module but also during the internship period. All who met these requirements were invited to participate voluntarily. Teachers involved in developing the app were excluded.

Use of the app and the evaluation instruments

The participants, students and teachers, had free access to the app for mobile devices during the six-week period of the cardiology module and were then invited to answer the questionnaires evaluating this tool.

The evaluation conducted by the students aimed to measure the usability of the app. For this, a questionnaire based on the System Usability Scale (SUS)¹¹ was used, consisting of ten items answered by indicating agreement or disagreement with each statement on a five-point Likert scale. The version of the SUS questionnaire applied was translated into Portuguese by Tenório et al.¹², and its items are listed below.

Item 1. I would use this app often.

Item 2. I found the app unnecessarily complex.

Item 3. I found the app easy to use.

Item 4. I think help from technical support would be needed to be able to use this app.

Item 5. I thought the different functions of the app were well integrated.

Item 6. I thought there was great inconsistency in this app.

Item 7. I would imagine that the majority of people would quickly learn how to use this app.

Item 8. I found the app very hard to use.

Item 9. I felt confident using this app.

Item 10. I needed to learn a series of things before I could use the app.

The SUS usability score is obtained by adding the individual contribution of each item. For the odd-numbered items, one point is subtracted from the value attributed to the response. For the even-numbered items, the value attributed to the response is subtracted from five. The contributions of the odd- and even-numbered items are then summed and multiplied by 2.5, resulting in a total usability score between 0 and 100 points¹¹.
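As an illustration of this scoring rule, the sketch below shows how a single respondent’s SUS score could be computed from the ten answers. It is a minimal, hypothetical example in Python (the study itself used SPSS for its analyses; the function name and sample answers are invented here).

```python
def sus_score(answers):
    """Compute the SUS score for one respondent.

    answers: list of ten integers from 1 to 5, in item order (item 1 first).
    """
    if len(answers) != 10:
        raise ValueError("SUS requires exactly 10 answers")
    total = 0
    for item, value in enumerate(answers, start=1):
        if item % 2 == 1:
            total += value - 1   # odd-numbered items: response minus one
        else:
            total += 5 - value   # even-numbered items: five minus response
    return total * 2.5           # final score ranges from 0 to 100


# Hypothetical respondent who agrees with the positive items and disagrees with the negative ones
print(sus_score([5, 1, 5, 2, 4, 2, 5, 1, 4, 2]))  # 87.5
```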

The evaluation conducted by the teachers aimed to judge the potential of the app for use as a tool in the health teaching/learning process. For this purpose, a ten-item questionnaire was applied, based on the “ten golden rules” intended to evaluate whether software is adequate for use in medical education¹³. This questionnaire was also answered on a five-point Likert scale, identifying agreement or disagreement with each statement. The instrument had previously been applied in Brazil as part of a quality evaluation model for software used in medical teaching¹⁴. The ten items of the questionnaire are described below.

Item 1. Does the content of the app meet the educational purpose?

Item 2. Is the content of the app based on evidence and not on opinions?

Item 3. Does the app enable the use of hypermedia and hypertext to promote knowledge?

Item 4. Does the app have an interesting, pleasing and challenging interface?

Item 5. Is the use of multimedia in the app appropriate?

Item 6. Does the app enable students to interactively explore and experience the possibility of resolving clinical cases?

Item 7. Does the app present the content in a way that stimulates the use of analytical and clinical skills for problem solving?

Item 8. Is the app easy to use and is its navigation appropriate?

Item 9. Can the app be defined as an ideal tool to use, because of the benefits it provides?

Item 10. Can the app be defined as a tool with low maintenance costs, making it easy to maintain the cases presented, enabling the content to be quickly updated?

Statistical analysis

The data were tabulated in Microsoft Excel for Windows® and exported for statistical analysis to the Statistical Package for the Social Sciences (SPSS) software, version 20.0 (IBM). The data were presented as absolute and percentage frequencies. Cronbach’s alpha coefficient was used to estimate the reliability of the questionnaires applied, with a lower limit of 0.70 adopted for acceptable reliability¹⁵.
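For reference, Cronbach’s alpha can be computed directly from a respondents-by-items answer matrix using the standard formula α = k/(k−1) · (1 − Σ item variances / variance of total scores). The sketch below is a hypothetical Python/NumPy illustration of that formula, not the SPSS procedure used in the study; the function name and data are invented.

```python
import numpy as np


def cronbach_alpha(scores):
    """Estimate Cronbach's alpha.

    scores: 2-D array-like, rows = respondents, columns = questionnaire items.
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the respondents' total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)


# Hypothetical example: 4 respondents answering 3 Likert items
answers = [[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]]
print(round(cronbach_alpha(answers), 2))
```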

Ethical aspects

The study was approved by the Research Ethics Committee of the institution (CAAE: 73150617.5.0000.5049), in compliance with Resolution 466/12 of the National Health Council and the Declaration of Helsinki. The research subjects participated voluntarily, after signing the Informed Consent Form, and were not identified, so as to ensure the confidentiality of the answers.

RESULTS

The “ECG Fácil” (“Easy ECG”) app was developed for mobile devices (smartphones and tablets), for off-line use on the iOS and Android platforms, available for free download from the App Store and Google Play Store, and intended for teaching ECG interpretation. It offers open user registration, which enables students and others interested in the subject to have individualized and secure access after quickly registering their identification details.

The app developed

The opening screen of the app shows seven tabs that can be accessed: normal electrocardiogram, chamber overloads, intraventricular conduction disorders, ST segment changes, tachyarrhythmias, bradyarrhythmias and review. This enables the user to access content that becomes progressively more complex, from basic information about the ECG to more specific content, such as arrhythmias. The initial screens of the app are shown in Figure 1.

FIGURE 1
Initial screens to download and access the “ECG Fácil” app

With the objective of optimizing teaching through interactivity, after the theory of ECG interpretation is presented, screens with challenges are shown, with immediate feedback given on the answers (Figure 2).

FIGURE 2
Screens showing the content and feedback

The content is displayed not only as text but also with images, with the objective of facilitating understanding, and is always followed by challenges. If an incorrect option is chosen, the student is informed of the error and shown a screen with a new presentation of the same content or tips on ECG interpretation, in order to improve understanding (Figure 3).

FIGURE 3
Screens displaying the content (text and images), feedback and tips in the case of errors

With the aim of achieving sequential learning, as the user advances through the theory screens and solves the challenge-type questions, the tabs following the initial screen are unlocked, enabling new content to be accessed. In addition, to enable cumulative learning contextualized with practice, the new content is followed by clinical cases that test the content learned in the previous unit (previous tab) together with that being accessed at that moment (Figure 4).

FIGURE 4
Screens showing the sequential unlocking of the content and the use of clinical cases
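The app itself was written in Java and its source code is not reproduced in this article. Purely as an illustration of the progressive-unlock rule described above, the hypothetical Python sketch below makes each topic tab available only after all preceding tabs have been completed; all names and labels are invented for this example.

```python
# Topic tabs in the order they appear on the opening screen (simplified labels).
TOPICS = [
    "normal_ecg", "chamber_overloads", "conduction_disorders",
    "st_segment_changes", "tachyarrhythmias", "bradyarrhythmias", "review",
]


def unlocked_topics(completed):
    """Return the tabs the user may open.

    completed: set of topic names whose theory screens and challenges are done.
    A tab is available only if every preceding tab has been completed.
    """
    unlocked = []
    for topic in TOPICS:
        unlocked.append(topic)          # the next tab in sequence is available
        if topic not in completed:      # stop unlocking at the first unfinished tab
            break
    return unlocked


# Example: after completing the first two topics, the third tab becomes available.
print(unlocked_topics({"normal_ecg", "chamber_overloads"}))
# ['normal_ecg', 'chamber_overloads', 'conduction_disorders']
```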

The app also has some display tools, such as the possibility of enlarging the images by pinch-zooming with the fingers on the screen. In addition, as a teaching/learning tool, the final tab provides revision exercises covering the content.

The app interface for the teacher consists of a Web page, which enables secure access to the data provided by the students. However, the version available to the teacher is a prototype created to validate the app tests and cannot yet be freely accessed. In addition, the prototype does not yet allow the introduction of new cases or new exercises.

Evaluation of the usability of the app by students

After the app had been available for use for a period of six weeks, 109 students assessed its usability by means of the SUS questionnaire. The answers showed good reliability, according to the validation analysis by Cronbach’s alpha coefficient (value: 0.74).

In the students’ evaluation, the app obtained an overall mean SUS score greater than 70; these results are presented in Table 1.

TABLE 1
Evaluation of the usability of the “ECG Fácil” app (N = 109 students)

With regard to the analysis of the individual items of the SUS scale, the majority had scores above 70 (Figure 5). The exception was item 10 – “I needed to learn a series of things before I could use the app”.

FIGURE 5
Evaluation of the “ECG Fácil” app by students using the usability questionnaire based on the System Usability Scale

Evaluation of the suitability of the app for mobile devices for use in medical education by the teachers

The invitation was sent to 20 teachers, and 15 agreed to participate in the research. All participants were physicians specialized in cardiology. Among them, 47% lectured on the cardiology module; 47% gave lessons on electrocardiogram interpretation; and 53% worked in cardiology outpatient clinics with students of the cardiology module or of the internship rotation in clinical medicine.

The teachers considered the app easy to use, with appropriate navigation, suited to its educational purpose and with evidence-based content. In addition, they considered the use of multimedia appropriate (Figure 6).

FIGURE 6
Evaluation of the “ECG Fácil” app by teachers who are specialists in cardiology with regard to the appropriateness for use in medical education

Among the points for improvement are the interactivity in solving the clinical cases, the stimulation of analytical and clinical problem-solving skills, and the expansion of the app with hypermedia and hypertext (Figure 6).

DISCUSSION

Software related to the ECG, whether with educational objectives or only for diagnostic assistance, already exists⁶. The innovation of the present study consisted of, besides developing an app for mobile devices for teaching the ECG, evaluating its usability with potential users, as well as its strengths and shortcomings in the teaching/learning process with specialist professionals involved in the teaching of cardiology.

One of the first experiences of using “machines” as support tools in medical education focused on teaching the ECG. Owen et al.⁵ validated the Grundy tutor teaching machine for teaching the ECG to small groups of students.

With technological evolution, other tools for teaching the ECG have been developed. One of them used information recorded on and accessed from CD-ROM, with the aim of adding to teaching through the use of virtual reality and self-learning¹⁶.

With the objective of advancing interactive teaching, Shahein¹⁷ developed an interactive computer program, which combined the process of teaching/learning about the normal and abnormal ECG with the management of individualized teaching. According to this author, it was one of the first experiences of using a computer system designed for self-learning, with quick and flexible responses to users, for teaching the ECG.

With the advent of the internet, initiatives arose for teaching through this platform. With regard to teaching ECG interpretation, one of the first initiatives was developed at Harvard University: Nathanson et al.¹⁸ described an extensive library of electrocardiogram tracings and clinical data from patients, registered over more than ten years, which could be accessed by students.

The use of apps for mobile devices in medical education is more recent. The first app described in the literature for this purpose was developed and validated at the University of Colorado: the Radiology Resident iPad Toolbox was validated as an educational, economical and mobile instrument that increases the efficiency of study¹⁹. In relation to the study of the electrocardiogram, the first app for mobile devices for this purpose was described by Tofield²⁰, although without a description of the validation of this tool for teaching.

To evaluate the usability of the “ECG Fácil” app, a questionnaire specific to this purpose (SUS) was adopted, which has been used in other studies to analyze educational apps for mobile devices²¹,²². Conceptually, usability is the capacity of a piece of software to be understood, learned and operated by an individual when used for specific purposes²³.

According to the literature, a SUS score above 68 indicates an acceptable degree of usability²⁴. In an extensive analysis of the application of the SUS questionnaire, Bangor et al.²⁵ found that a score of 85 is associated with excellent acceptance of a piece of software or an app. The mean SUS score obtained in the evaluation of the usability of the “ECG Fácil” app met these parameters described in the literature.

In the item-by-item analysis of the SUS scale, item 10 – “I needed to learn a series of things before I could use the app” – had a score below 70. This result could be explained by the fact that the questionnaire was not explained in advance, at least with regard to item 10. This statement refers to the need to learn how to handle the app before using it for learning, not to learning the theoretical content about the electrocardiogram before using the “ECG Fácil” app, which may have skewed this answer.

Attempts to evaluate ECG teaching programs are at an early stage. One of the first experiences in validating the use of software for teaching the ECG occurred in 1986, with the development of an interactive, low-cost graphical platform. According to the authors, it was well accepted by resident doctors, junior doctors and medical students, but this evaluation was not standardised²⁶.

Two decades later, Criley and Nelson²⁷ developed a program called ECG Viewer (Blaufuss Medical Multimedia Laboratories, Palo Alto, California) and sought to validate it with a small group of 21 individuals, comprising fourth-year medical students and residents. By means of a questionnaire prepared by the authors, the majority agreed that this program was useful for teaching²⁷.

Another attempt to validate the teaching of the ECG, this time by means of a Web-based program, was made by Nilsson et al.²⁸ In this study, 20 medical students evaluated the utility of the program using a five-point scale and considered it useful for learning.

More recently, other authors evaluated the teaching of the ECG to psychiatry residents by means of Web-based slides, with a focus on identifying electrocardiographic changes related to medications. The majority of the participants completed the training and responded that they were “very interested” in continuing to use this teaching/learning method²⁹.

Using a qualitative method, with a limited sample of only ten medical students, Keis et al.³⁰ identified, as requirements for better learning of the ECG through distance and asynchronous strategies, the contextualization and targeting of the subject according to the applicability of the knowledge, with the objective of improving extrinsic and intrinsic motivation, as well as a minimum level of interaction with the teacher. The “ECG Fácil” app aims, through the use of clinical cases, to contextualize learning about the interpretation of electrocardiographic changes.

There are still few studies that evaluate the learning and retention of information about ECG interpretation using technological tools, and the current results are conflicting³¹,³². Possibly, this is due to the diversity of learning styles: active, theoretical, pragmatic and reflective³³.

In a study conducted by Granero-Molina et al.³⁴ using electrocardiogram simulation in a Web environment, it was possible to stimulate the theoretical learning style, which seeks to understand the theory before the practice, and the pragmatic style, in which what matters is knowing how to apply the knowledge in the real world. The “ECG Fácil” app provides theoretical information about ECG interpretation in the first tabs and then uses clinical cases, aiming to illustrate the application of knowledge to clinical practice.

One of the limitations of the present study was the absence of an evaluation of knowledge retention, as well as the fact that it was conducted at a single university center. Another limitation was the questionnaire used for the evaluation by teachers, which, although validated, included some questions that assessed more than one aspect in the same item, which could complicate interpretation.

This study provides the first description of a Brazilian app for mobile devices developed for teaching ECG interpretation, evaluated for usability by a significant number of student users and by cardiology teachers with regard to its potential use in medical education.

The results of the evaluation of the usability and the potential use of the “ECG Fácil” app in teaching ECG interpretation provide motivation for future updates to improve its functionality and add new content.

CONCLUSIONS

The production of the app for the iOS and Android platforms was conducted successfully. The app was evaluated by medical students, who gave it excellent usability ratings. In addition, the evaluation by the teachers participating in the study indicated that the app shows good potential as a tool for use in health teaching/learning. Further studies are necessary to evaluate learning and knowledge retention with this app.

REFERENCES

  • 1
    Giffoni RT, Torres RM. Breve história da eletrocardiografia. Revista Médica de Minas Gerais 2010;20(2):263-270.
  • 2
    Pastore CA, Pinho JA, Pinho C, Samesima N, Pereira Filho HG, Kruse JCL, et al. III Diretrizes da Sociedade Brasileira de Cardiologia Sobre Análise e Emissão de Laudos Eletrocardiográficos. Arq Bras Cardiol 2016;106(4 Supl 1):1-23.
  • 3
    Dong R, Yang X, Xing B, Zou Z, Zheng Z, Xie X, et al. Use of concept maps to promote electrocardiogram diagnosis learning in undergraduate medical students. Int J Clin Exp Med 2015;8(5):7794-7801.
  • 4
    Cantillon P. ABC of learning and teaching in medicine: Teaching large groups. BMJ 2003;7386(326):437-437.
  • 5
    Owen SG, Hall R, Waller IB. Use of a teaching machine in medical education; preliminary experience with a program in electrocardiography. Postgrad Med J 1964;40:59-65.
  • 6
    Pontes PAI, Chaves RO, Castro RC, de Souza ÉF, Seruffo MCR, Francês CRL. Educational Software Applied in Teaching Electrocardiogram: A Systematic Review. Biomed Res Int 2018;2018:8203875.
  • 7
    Marçal E, Andrade R, Rios R. Aprendizagem utilizando Dispositivos Móveis com Sistemas de Realidade Virtual. RENOTE - Revista Novas Tecnologias na Educação 2005;3(1):1-11.
  • 8
    Nogueira JBS. Desenvolvimento e avaliação de usabilidade de aplicativo para planejamento de artroplastias totais de joelho. Fortaleza; 2016. Mestrado [Dissertação] - Mestrado Profissional em Tecnologia Minimamente Invasiva e Simulação na Área de Saúde, Centro Universitário Christus (UNICHRISTUS).
  • 9
    Millard D, Howard Y, Gilbert L, Wills G. Co-design and co-deployment methodologies for innovative m-learning systems. In: Goh TT, (Ed). Multiplatform E-Learning Systems and Technologies: Mobile Devices for Ubiquitous ICT-Based Education. New York: IGI Global, 2010. p.147-163.
  • 10
    Pereira RVS, Kubrusly M, Marçal E. Desenvolvimento, Utilização e Avaliação de uma Aplicação Móvel para Educação Médica: um Estudo de Caso em Anestesiologia. RENOTE - Revista Novas Tecnologias na Educação 2017;15(1):1-10.
  • 11
    Brooke J. SUS - A quick and dirty usability scale. Usability Eval Ind 1996;189(194):4–7.
  • 12
    Tenório JM, Cohrs FM, Sdepanian VL, Pisa IT, Marin HF. Desenvolvimento e avaliação de um protocolo eletrônico para atendimento e monitoramento do paciente com doença celíaca. Revista de Informática teórica e aplicada 2010;17(2):210–220.
  • 13
    Jha J, Duffy S. ‘Ten golden rules’ for designing software in medical education: results from a formative evaluation of DIALOG. Medical Teacher 2002;24(4):417-421.
  • 14
    Barros PRM. Avaliando a Qualidade de Produto de Software em Saúde: o caso SimDeCS. Porto Alegre; 2013. Mestrado [Dissertação] - Curso de Ciências da Saúde, Fundação Universidade Federal de Ciências da Saúde de Porto Alegre.
  • 15
    Bonett DG, Wright TA. Cronbach’s alpha reliability: Interval estimation, hypothesis testing, and sample size planning. Journal of Organizational Behavior 2014;36(1):3-15.
  • 16
    Jeffries PR, Woolf S, Linde B. Technology-based vs. traditional instruction. A comparison of two methods for teaching the skill of performing a 12-lead ECG. Nurs Educ Perspect 2003;24(2):70-74.
  • 17
    Shahein HI. Computers in health-sciences education. An application to electrocardiography. Comput Programs Biomed 1983;17(3):213-223.
  • 18
    Nathanson LA, Safran C, McClennen S, Goldberger AL. ECG Wave-Maven: a self-assessment program for students and clinicians. Proc AMIA Symp. 2001:488-492.
  • 19
    Sharpe EE 3rd, Kendrick M, Strickland C, Dodd GD 3rd. The Radiology Resident iPad Toolbox: an educational and clinical tool for radiology residents. J Am Coll Radiol 2013;10(7):527-532.
  • 20
    Tofield A. Cardiac arrhythmia challenge: a new App. Eur Heart J 2013;34(44):3392.
  • 21
    Zbick J, Nake I, Milrad M, Jansen M. A web-based framework to design and deploy mobile learning activities: evaluating its usability, learnability and acceptance. In: 2015 IEEE 15th International Conference on Advanced Learning Technologies (ICALT). IEEE Press; 2015. p. 88-92. Available at: http://dx.doi.org/10.1109/ICALT.2015.97
  • 22
    Chung H, Chen S, Kuo M. A study of EFL college students’ acceptance of mobile learning. Procedia - Social and Behavioral Sciences 2015;176:333-339.
  • 23
    Maramba I, Chatterjee A, Newman C. Methods of usability testing in the development of eHealth applications: A scoping review. Int J Med Inform 2019;126:95-104.
  • 24
    Sauro J. A practical guide to the System Usability Scale: background, benchmarks & best practices. Measuring Usability LLC; 2011.
  • 25
    Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: Adding an adjective rating scale. Journal of usability studies 2009;4(3):114-123.
  • 26
    Thomas RA, Bowyer AF. Development of electrocardiographic teaching materials using an MC68000-based, interactive graphics microcomputer. Comput Methods Programs Biomed 1986;22(1):87-91.
  • 27
    Criley JM, Nelson WP. Virtual tools for teaching electrocardiographic rhythm analysis. Journal of Electrocardiology 2006;39:113-119.
  • 28
    Nilsson M, Bolinder G, Held C, Johansson B, Fors U, Östergren J. Evaluation of a web-based ECG-interpretation programme for undergraduate medical students. BMC Medical Education 2008;8:25.
  • 29
    DeBonis K, Blair TR, Payne ST, Wigan K, Kim S. Viability of a Web-Based Module for Teaching Electrocardiogram Reading Skills to Psychiatry Residents: Learning Outcomes and Trainee Interest. Acad Psychiatry 2015;39(6):645-648.
  • 30
    Keis O, Grab C, Schneider A, Öchsner W. Online or face-to-face instruction? A qualitative study on the electrocardiogram course at the University of Ulm to examine why students choose a particular format. BMC Medical Education 2017;17:194.
  • 31
    Montassier E, Hardouin JB, Segard J, Batard E, Potel G, Planchon B, et al. e-Learning versus lecture-based courses in ECG interpretation for undergraduate medical students: a randomized noninferiority study. Eur J Emerg Med 2016;23(2):108-113.
  • 32
    Fent G, Gosai J, Purva M. A randomized control trial comparing use of a novel electrocardiogram simulator with traditional teaching in the acquisition of electrocardiogram interpretation skill. Journal of Electrocardiology 2016;49:112-116.
  • 33
    Vinales JJ. The learning environment and learning styles: a guide for mentors. Br J Nurs 2015;24(8):454-457.
  • 34
    Granero-Molina J, Fernández-Sola C, López-Domene E, Hernández-Padilla JM, Preto LS, Castro-Sánchez AM. Effects of web-based electrocardiography simulation on strategies and learning styles. Rev Esc Enferm USP 2015;49(4):650-656.

Publication Dates

  • Publication in this collection
    13 Jan 2020
  • Date of issue
    2019

History

  • Received
    22 Aug 2019
  • Accepted
    24 Aug 2019