Limitations of Automated Accessibility Evaluation in a MOOC Platform: Case Study of a Brazilian Platform

ABSTRACT

Massive Open Online Courses (MOOCs) have been announced as a promising educational tool, attracting a high number of enrollments and enabling open learning with an emphasis on the diversity of users. However, the literature has pointed to a lack of accessibility in MOOCs. With the intention of improving the accessibility support of a MOOC platform, a multi-format course was evaluated using automated tools. The results of the automated tests indicated some accessibility problems in the platform that can hinder the participation of people with disabilities, signaling the need to design such platforms in an inclusive way. However, this type of test is not capable of identifying the relevant areas, functionalities, and contents within a web page, nor is it able to differentiate between user roles. As a result, platform areas that are not visible to students (such as platform administration and course editing areas) are also evaluated - even when that is not the goal. For this reason, it was concluded that, to evaluate inclusion support, it is necessary to combine the automated tests with a qualitative method. The indication of this limitation of automated testing is the most important finding.

KEYWORDS:
Distance education; Person with disability; Course

RESUMO

Os Massive Open Online Courses (MOOCs) têm sido anunciados como uma ferramenta educacional promissora, impulsionando um elevado número de inscrições, de forma a viabilizar a aprendizagem de forma aberta e com ênfase na diversidade de usuários. No entanto, a literatura tem apontado a falta de acessibilidade em MOOCs. Com a intenção de aprimorar o suporte à acessibilidade de uma plataforma de MOOCs, um curso com material multiformato foi avaliado com uso de ferramentas automatizadas. Os resultados dos testes automatizados indicaram alguns problemas de acessibilidade da plataforma, que podem dificultar a participação de pessoas com deficiência, sinalizando a necessidade de projetar tais plataformas de forma inclusiva. Contudo, esse tipo de teste não é capaz de identificar áreas, funcionalidades e conteúdos relevantes dentro de uma página da web, bem como não é capaz de reconhecer diferentes papéis de usuários. O resultado é que áreas da plataforma que não são visíveis aos estudantes (como áreas de administração da plataforma e de edição dos cursos) também são avaliadas nesses testes - mesmo que não seja esse o objetivo. Por esse motivo, concluiu-se que, para avaliar o suporte à inclusão, é necessário aliar aos testes automatizados um método qualitativo. A indicação da limitação desse tipo de teste é a constatação mais importante.

PALAVRAS-CHAVE:
Educação a Distância; Pessoa com Deficiência; Curso

1 Introduction

One of the goals of Massive Open Online Courses (MOOCs) is to enable open learning for all, which includes providing equal opportunities for people with disabilities through accessible MOOCs and platforms (Iniesto, McAndrew, Minocha, & Coughlan, 2018). The fact that MOOCs enable learning for a variety of people without specific requirements presupposes that the educational materials/resources will suit all students, regardless of their location, socioeconomic factors, or skill levels.

This perspective raises questions about the existence of barriers to learning in MOOCs (Ichou, 2018): on the one hand, making educational content available to a diverse population broadens access to learning; on the other hand, the content may not accommodate differences in instructional level, participants' origins, and skills, among other factors (Ichou, 2018). Since MOOCs are consumed by a heterogeneous audience, there is a need to produce content considering the different profiles and skills of participants (Bastos & Biagiotti, 2014), because even while serving a large number of learners and providing them with opportunities, many people may be disadvantaged due to the lack of accessibility in MOOCs (Królak, Chen, Sanderson, & Kessel, 2017).

In this scenario, Assistive Technologies (AT) - covering alternative features, services, and miscellaneous equipment designed to support people with disabilities in their tasks, such as hardware, software, applications, sensors, mobile devices, specialty keyboards, displays, and navigation support modules, among others (Abascal, Barbosa, Nicolle, & Zaphiris, 2016; Csapó, Wersényi, Nagy, & Stockman, 2015) - may be an answer to this problem. However, even with the availability of various types of assistive technologies, people with disabilities may encounter obstacles when using the Web. Assistive technology features only perform satisfactorily in online learning contexts if the materials and platforms adopt accessibility principles, providing navigation, interaction, and learning experiences for all user profiles without impediments (Kurt, 2019). In educational contexts, especially in face-to-face teaching, assistive technologies are still little explored, for reasons that include the lack of resources in the school environment, professionals' and students' lack of knowledge about the existence and functionalities of this category of technology, and the sometimes low user acceptability of AT (Borges & Mendes, 2018).

In this context, those involved in the production of MOOCs need adequate accessibility training to design instructional materials accessible to all, integrating theoretical and practical elements related to accessibility (Heap & Thompson, 2018). Accessibility is often not considered during the design process and content development for MOOCs, reducing the possibility for people with disabilities to participate (Iniesto, McAndrew, Minocha, & Coughlan, 2017).

Therefore, an accessibility analysis of the Lúmina MOOC platform (available at https://lumina.ufrgs.br) was conducted. The platform offers courses in Portuguese, and the analysis focused on a course produced by a multidisciplinary team, with materials in various formats. The choice of this MOOC was motivated by the fact that it integrates accessibility features into all learning sessions, allowing the student to choose material in the format of their choice, following their learning style or specific educational need. The choice of the platform, in turn, was motivated by its being (in 2019) the platform with the largest number of courses in the Portuguese language, with more than 40 courses, and by the fact that the results of this analysis can be effectively implemented in the platform's source code, as it is managed and maintained by a center coordinated by the authors of this paper.

2 Accessibility in MOOCs Platforms

The term "accessibility" may have different definitions, and its descriptions may vary across areas of knowledge. From a general perspective, however, accessibility can be associated with a quality attribute that aims to offer products and services capable of serving a wide audience, favoring individuals with diverse characteristics and skills and promoting equity in all areas of society, including access to physical spaces, services, technologies, and information (Persson, Ahman, Yngling, & Gulliksen, 2014). In the context of online learning, problems are often associated with platforms that do not provide accessibility, or that present content unsuitable for access with AT support (Sanderson, Chen, Bong, & Kessel, 2016).

Due to this concern, research on digital accessibility with emphasis on the design and evaluation of MOOCs has been carried out. The research conducted by Cinquin, Guitton and Sauzéon (2018), for example, involved the design and accessibility assessment of a MOOC. The authors produced a MOOC accessible to people with cognitive disabilities by adopting a participatory design process. The MOOC was evaluated by users with cognitive disabilities and other types of disabilities, presenting good results regarding usability. Acedo and Osuna (2016) presented the "ECO European Project", which seeks to build MOOCs based on inclusive education, universal design, accessible web design and responsive web design, providing subtitles in different languages, transcriptions, keyboard operation for people who cannot use a mouse, and other accessibility features.

In the field of accessibility assessment, Sanderson et al. (2016) evaluated the accessibility of the online course platform Canvas against the "Authoring Tool Accessibility Guidelines" (ATAG). The evaluation was performed by three professionals with a background in Computer Science and experience in digital accessibility. The authors mention that, out of 28 accessibility criteria, the Canvas platform fully met only 11.

Park, Kim and So (2016) evaluated the accessibility of three MOOC platforms (Coursera, edX and Khan Academy) based on a set of accessibility guidelines. The inspection was performed manually by two accessibility specialists and four visually impaired users. Several problems were found, the most serious being related to responsiveness.

Bohnsack and Puhl (2014) assessed the accessibility of five MOOC platforms (Coursera, Udacity, edX, OpenCourseWorld, and Iversity) for visually impaired users. The evaluation took place through the use of screen reader software and followed a task protocol elaborated by the authors. The results revealed that most platforms failed at accessibility (edX had fewer issues and was the only one to offer accessibility settings for the visually impaired). Iniesto and Rodrigo (2014) performed an accessibility assessment of three MOOC platforms - UNED COMA, COLMENIA and Miríada X - selecting one course per platform and using different assessment tools. In addition to the use of the tools, learning materials including audio, videos, and PDF documents were analyzed. The results showed that all platforms have limitations regarding accessibility.

These works on accessibility in MOOCs have evidenced the frequent occurrence of problems that hinder the participation of people with disabilities. These results are not surprising, as enabling accessibility on MOOC platforms is a complex task and, according to Borges and Mendes (2018), the educational team's little or no knowledge about it further worsens the situation. Another difficulty - cited by Iniesto and Rodrigo (2014) - concerns the fact that audiovisual and interactive elements are not evaluated by automated tools. In addition, designing educational materials that include translations, subtitles, audio descriptions, and sign language means expanding the workforce, as it requires a multidisciplinary team, which increases costs. Add to this the financial difficulties experienced by Brazilian educational institutions, which make it difficult to attract new members to the work team.

3 Accessibility assessment

Accessibility assessment in the web context is a task that seeks to identify accessibility problems through the use of specific tools and guidelines, as well as to verify whether an individual with a disability can interact with a system satisfactorily; whether or not users participate in the evaluation process depends on the method used in the research (Ramos & Dantas, 2017). Paddison and Englefield (2004) distinguish two types of accessibility assessment: technical and usable. Technical evaluation inspects source code, while usable evaluation relates to task support.

In the context of this study, the accessibility assessment was performed with the support of two online tools. This form of assessment does not involve users with disabilities and can be done by members of the development team (i.e. designers, programmers, reviewers, and educators) or by experts in accessibility, universal design, and related fields - provided they have basic knowledge of HTML and CSS. For this reason, it is usually less costly and produces results faster than evaluations with users. Another user-free form of evaluation is heuristic evaluation, one of the most popular approaches to usability assessment. There are heuristics for the most diverse situations, including accessibility assessments, such as those of Paddison and Englefield (2004) - which were largely based on the rules used by automated assessment tools. It is noteworthy that the result of this type of evaluation is highly dependent on the evaluator's familiarity with the heuristics and with the product being evaluated. In addition, it is recommended that more than one evaluator carry out the evaluation (Stanton & Young, 1999).

3.1 Automatic accessibility assessment

Automatic accessibility assessment tools make it possible to assess adherence to internationally accepted standards - such as the Web Content Accessibility Guidelines (WCAG) versions 1.0 and 2.0 of the World Wide Web Consortium (W3C) - by validating code syntax (Pacheco, Amorim, Barbosa, & Ferreira, 2016). As of 2019, the most current version of the W3C accessibility guidelines is 2.1. According to Avila (2018), version 2.1 of the guidelines contains 17 new success criteria that address a wide range of specific needs, as follows: learning (3), low vision (4), speech (2), vestibular disorders (1), motor (6), blindness (1). Some of the success criteria of this new version are difficult to verify using automated tests, for example: providing keyboard shortcuts; allowing single-pointer operation instead of path-based gestures (on touchscreen devices); pointer cancellation (on touchscreen devices); timeout warnings. Thus, tools based on these guidelines provide a purely objective diagnosis, as neither the importance of each functionality nor the specifics of the programming or design are considered.

4 Methods

This research applied an accessibility assessment to a MOOC available on the Lúmina platform. Lúmina courses are self-instructional, with free registration and free access for an unlimited number of participants. Lúmina is based on Moodle, with theme and source code customizations to suit the needs of open courses. It is noteworthy that the authors of this paper participate in the management and maintenance of the Lúmina platform, which motivated its choice as the object of this analysis, as the test results may be implemented in future updates.

4.1 Evaluated MOOC description

For the accessibility evaluation, the MOOC "Deconstructing racism in practice" was selected, from the area of Permanent Education and with an estimated study time of 60 hours. The course contents are directed towards anti-racist education, distributed in four modules: Ethnic-racial relations in Brazil; Black women; Blackness and education; Construction of racial equality. The MOOC's learning resources are multi-format, integrating videos with Brazilian Sign Language (LIBRAS) translation, deaf and deafened subtitles (DDS) and audio description (AD), as well as textual material available in PDF, audio (MP3) and/or editable files (DOC or TXT) for printing in ink and Braille, including the description of images.

In each module, there are evaluative activities in the form of questionnaires (true-or-false and multiple-choice questions), all available in the formats mentioned above. People with disabilities were part of the MOOC development team, through the research group COM Acesso - Comunicação Acessível, coordinated by Eduardo Cardoso, the second author of this paper; they acted as consultants from the initial stages to the final course evaluation before the course was made available on the platform. In this paper, the course homepage and content pages were evaluated (Figure 1). All content pages share the same structure, changing only the labels, texts, and available files. Since these pages - both the homepage and the internal content pages - are the same in all courses, this assessment is equivalent to an assessment of the platform itself. The choice of this MOOC was motivated by the fact that it integrates accessibility features into all learning sessions, allowing the student to choose material in the format of their choice, according to their learning style or specific educational need. Another motivation was having access to the development team, which could clarify doubts if the automated tests pointed to problems with the materials produced.

Figure 1
Screenshot of the evaluated page.

To assess accessibility, we sought a tool compatible with version 2.1, the latest version of the guidelines. Several tools are listed on the W3C website (Web Accessibility Evaluation Tools List, https://www.w3.org/WAI/ER/tools/?q=wcag-21-w3c-web-content-accessibility-guidelines-21); however, none are free. A trial license was obtained for Make-Sense's A-Web inspection service (http://mk-sense.com/a-web/). This tool works through a script that must be inserted into the <head> of the homepage; the script communicates with the company's website and scans the entire site.

The AccessMonitor tool (http://www.acessibilidade.gov.pt/accessmonitor/) was also used; it is not on the W3C list, but it is free and compliant with version 2.0 of the guidelines. AccessMonitor assigns a score to the analyzed page on a scale from 0 to 10: Very poor (score = 1); Bad (score = 2 or 3); Fair (score = 4 or 5); Good (score = 6 or 7); Very good (score = 8 or 9); Excellent (score = 10). Unlike A-Web, pages must be tested one at a time.

5 Results

Regarding the quantification of the accessibility level with AccessMonitor, the MOOC homepage achieved a score of 7.5, which is equivalent to "good" accessibility practice. The following accessibility issues were pointed out by AccessMonitor: form markup (1); use of absolute units, such as "points" (1); marking of the main language of the page (1).

The Lúmina "form markup" issue does not cause harm to users, even though it lowers the page's grade. AccessMonitor describes the error as "form without submit button" in a form carrying the "hidden" attribute: its automated test only looks for occurrences such as <input type="submit">, <input type="image"> or <button type="submit">, without considering that elements with the "hidden" attribute are ignored by assistive technologies.
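This false positive can be reproduced with a short sketch (a guess at the kind of rule AccessMonitor applies, written with Python's standard html.parser; the markup below is illustrative, not Lúmina's actual code):

```python
# Sketch of a naive "form without submit button" rule: a form is flagged
# unless it contains a submit control, even when the whole form carries the
# "hidden" attribute and is therefore ignored by assistive technologies.
from html.parser import HTMLParser

class FormChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.forms = []     # one {"hidden": ..., "submit": ...} dict per form
        self._stack = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            self._stack.append({"hidden": "hidden" in attrs, "submit": False})
        elif self._stack and (
            (tag == "input" and attrs.get("type") in ("submit", "image"))
            or (tag == "button" and attrs.get("type", "submit") == "submit")
        ):
            self._stack[-1]["submit"] = True

    def handle_endtag(self, tag):
        if tag == "form" and self._stack:
            self.forms.append(self._stack.pop())

page = """
<form hidden><input type="hidden" name="token"></form>
<form><input type="text" name="q"><button type="submit">Go</button></form>
"""
checker = FormChecker()
checker.feed(page)
flagged = [f for f in checker.forms if not f["submit"]]
print(len(flagged))  # 1: only the hidden form is flagged
```

The rule correctly passes the visible form, but the hidden form - irrelevant to users of assistive technologies - still lowers the grade.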

Similarly, the language problem - described as "the document type does not allow the use of the xml:lang attribute" - fails to account for the fact that the page's HTML header declares both versions of the language attribute: lang="pt-br" xml:lang="pt-br". These two errors are classified as having a high level of severity, although they do not impair browsing or the user experience.
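The dual declaration can be verified directly; the sketch below (again using Python's standard html.parser) simply reads both attributes from the <html> start tag:

```python
# Reads the lang and xml:lang attributes from the <html> element, as declared
# on the Lúmina pages described in the text.
from html.parser import HTMLParser

class LangChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.lang = None
        self.xml_lang = None

    def handle_starttag(self, tag, attrs):
        if tag == "html":
            attrs = dict(attrs)
            self.lang = attrs.get("lang")
            self.xml_lang = attrs.get("xml:lang")

checker = LangChecker()
checker.feed('<html lang="pt-br" xml:lang="pt-br"><head></head><body></body></html>')
print(checker.lang, checker.xml_lang)  # pt-br pt-br
```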

The problem with using absolute units could be detrimental if it prevented the user from resizing content - something that does not occur in any of today's most commonly used browsers (Google Chrome, Safari, Microsoft Edge, Mozilla Firefox; browser market shares can be seen at https://www.w3counter.com/globalstats.php). This rule is included in the AccessMonitor test because older browsers (such as Internet Explorer 6) prevent scaling when the page uses absolute units. Although this issue is not rated as severe, it also lowers the page's rating. An example of the use of absolute units is setting the style of a text font to 12 points. Alternatively, such styles can be configured with relative units, such as "em", which are evaluated against the font size the user has configured in the browser (1em equals 100% of the browser's configured size).
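A minimal sketch of this unit check (the function name, the regular expression, and the unit list are assumptions for illustration, not AccessMonitor's implementation) flags absolute CSS length units in a font-size declaration while letting relative units pass:

```python
# Flags absolute CSS length units in a font-size declaration; relative units
# such as "em" (evaluated against the user's configured font size) pass.
import re

ABSOLUTE_UNITS = ("pt", "pc", "cm", "mm", "in")  # absolute CSS length units

def flags_absolute_font_size(declaration: str) -> bool:
    match = re.search(r"font-size\s*:\s*([\d.]+)\s*([a-z%]+)", declaration)
    return bool(match) and match.group(2) in ABSOLUTE_UNITS

print(flags_absolute_font_size("font-size: 12pt"))    # True: absolute unit
print(flags_absolute_font_size("font-size: 0.75em"))  # False: relative unit
```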

Regarding the content page, the score obtained was 7.2, which also corresponds to "good" practice. From a qualitative point of view, the evaluation result was more informative. The tool reported errors in the following categories: heading markup (1); metadata (1); marking of the main language of the page (1). An issue marked as a warning was also found in the metadata category.

As with the homepage, the error related to "main language markup" does not harm the user, although it lowers the page's grade. The error in the "heading markup" category refers to the fact that the page does not use the heading tags (<h1> to <h6>) that assistive technologies use to identify hierarchies in the text. This is a real problem, but it cannot be fixed without a change to the Moodle source code. Moodle development is done through plugins and themes, which are independent modules that can be installed (and customized) according to the needs of each user. Changing Moodle's core source code - that is, any file that is not part of a plugin or theme - is therefore discouraged. When this recommendation is violated, there is a risk that updates will no longer be supported - and Moodle releases more than one update per year.
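The heading check itself is simple to automate. The sketch below (an assumed version of the rule: a page should have heading tags, with no skipped levels) collects <h1>-<h6> occurrences and reports structural issues:

```python
# Collects heading levels from a page and reports two kinds of issues:
# no headings at all, and skipped levels (e.g. an <h3> right after an <h1>).
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            self.levels.append(int(tag[1]))

def heading_issues(page: str) -> list:
    collector = HeadingCollector()
    collector.feed(page)
    issues = []
    if not collector.levels:
        issues.append("page has no heading tags")
    for prev, cur in zip(collector.levels, collector.levels[1:]):
        if cur > prev + 1:
            issues.append(f"level skip: h{prev} followed by h{cur}")
    return issues

print(heading_issues("<div><p>Course content</p></div>"))   # no headings at all
print(heading_issues("<h1>Course</h1><h2>Module 1</h2>"))   # no issues
```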

The "metadata" category error refers to a meta refresh element that is used to redirect the page but can cause confusion if the redirect takes too long. Although this is not good practice (as explained in the previous paragraph), the element was added directly into the source code to create the "I want to subscribe" button, which does not exist in Moodle.
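Detecting this element is straightforward; the sketch below looks for <meta http-equiv="refresh"> tags (the content value is a hypothetical example, not Lúmina's actual redirect target):

```python
# Finds <meta http-equiv="refresh"> elements, which automated tools warn
# about because a timed redirect can disorient users who read or operate
# the page slowly.
from html.parser import HTMLParser

class RefreshFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.refreshes = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("http-equiv") or "").lower() == "refresh":
            self.refreshes.append(attrs.get("content"))

finder = RefreshFinder()
finder.feed('<meta http-equiv="refresh" content="0; url=/enrol">')
print(finder.refreshes)  # ['0; url=/enrol']
```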

Regarding the A-Web tool, the result of the analysis was much more severe: over 1,500 problems were reported, which came as a great surprise. The sheer number of problems is explained by the fact that the A-Web system lists all occurrences of a problem in all "dependencies" of a page, scanning the execution of all scripts without discerning what is actually rendered. For example, the problems with "ensuring that click targets are of acceptable sizes", reported 885 times, refer to icons that are used in the page editing and course administration mode. This was noticed when, after checking the size of the clickable icons and images actually rendered to the student user, it was found that few of the reported occurrences applied.

Similarly, other problems were overcounted due to A-Web's lack of separation between the screens rendered to students and the course administration and editing screens. The errors reported were: ensure that image elements have alternative text (102); ensure that the role attribute has an appropriate value (59); ensure that buttons have discernible text (63); content should be presented without requiring horizontal scrolling (22); ensure that buttons and texts have text that is not repeated as an image (19); ensure that "radio" elements have a group (25); ensure contrast between foreground and background (10); ensure that "image" elements have alternative text or "role=none" (15); ensure that the order of the headers is semantically correct (4); for elements with "labels" that include text, ensure the name contains the text (2); ensure that "p" elements are not used to style headers; ensure that the document has no more than one banner landmark (1).

The second major source of errors (alternative text in images) can also be explained by the lack of differentiation between the screens rendered for editors and administrators and those rendered for students - mainly because the course design team took special care with alternative texts. The third major source of problems is related to the roles of HTML elements in Accessible Rich Internet Applications (ARIA). According to Tonato (2016), ARIA is a W3C specification that provides extra information to screen readers through HTML attributes; it has no effect on how elements appear or behave in browsers, serving only as an extra descriptive layer for screen readers. The A-Web tool indicated that it could not evaluate whether the labels were correct.

Regarding the fourth most prevalent error - ensuring that buttons have discernible text - the error refers to elements that have role="button" but do not have a label. These are also elements that are not rendered to the end user, as they are course editing controls; however, the tool does not differentiate these contexts. Similarly, the fifth most prevalent error, related to the lack of radio button grouping, also refers to elements that are not rendered to the student.
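A role-aware evaluator would need to filter out such elements before counting issues. The hypothetical sketch below illustrates the idea for one check (alt text on images): subtrees that are hidden, or that carry an assumed "editing-controls" marker class (not actual Moodle markup), are skipped, so only what a student actually sees is evaluated.

```python
# Counts <img> elements without alt text, skipping subtrees a student never
# sees: elements with the "hidden" attribute or with the (hypothetical)
# "editing-controls" class used here to mark editor/administrator widgets.
from html.parser import HTMLParser

VOID = {"img", "input", "br", "hr", "meta", "link"}  # elements with no end tag

class StudentViewAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.skip_depth = 0   # > 0 while inside a non-student subtree
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        hidden = "hidden" in attrs or "editing-controls" in (attrs.get("class") or "")
        if self.skip_depth or hidden:
            if tag not in VOID:          # void elements never get an end tag
                self.skip_depth += 1
            return
        if tag == "img" and not attrs.get("alt"):
            self.missing_alt += 1

    def handle_endtag(self, tag):
        if self.skip_depth and tag not in VOID:
            self.skip_depth -= 1

page = """
<div class="editing-controls"><img src="move.png"></div>
<div><img src="chart.png"><img src="logo.png" alt="Lumina logo"></div>
"""
checker = StudentViewAltChecker()
checker.feed(page)
print(checker.missing_alt)  # 1: only the student-visible image without alt
```

A tool without this filtering would report two missing alternative texts here, doubling the count with an icon the student never encounters.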

It is worth mentioning that the problems identified by the AccessMonitor and A-Web validators refer to the elements contained in the course's HTML code. Therefore, learning resources such as the multi-format materials with accessibility elements (subtitles, audio description, Brazilian Sign Language) were not evaluated by the tools. This is another limitation of the evaluation method, revealing the need for complementary methods, such as those involving heuristics and manual analysis by professionals and users.

6 Conclusions

This research addressed accessibility on the Lúmina MOOC platform, more specifically in the course "Deconstructing Racism in Practice", motivated by studies pointing out that accessibility on these platforms does not serve all user profiles, making it difficult for people with disabilities to participate. The study was also motivated by the fact that the authors of this paper manage and maintain the platform, which makes it possible to use the results in future source code updates.

The automatic accessibility assessment performed with AccessMonitor rated the MOOC as a good accessibility practice; however, the A-Web tool detected a larger number of issues. It is noteworthy that A-Web inspects all platform and course pages, but it does not identify screens/interfaces that are not viewed by students (such as the editing and administration screens). This difference between AccessMonitor and A-Web results from the way the pages are analyzed. Because AccessMonitor inspects code through calls to a URL, only one file can be analyzed at a time, and only what is rendered by the browser is evaluated. The A-Web system requires a code snippet to be inserted inside the <head> of the index page, so that the service runs inside the server where the platform is hosted. Thus, the service makes no distinction between what is rendered for each user role. This characteristic is common to paid accessibility assessment services.

Compared to the results obtained by Iniesto and Rodrigo (2014), who used the eXaminator tool, which applies the same scoring scale as AccessMonitor, Lúmina presented better results: the UNED COMA platform obtained an average score of 6.1; COLMENIA reached 5.2; and Miriada X, 4.3. Although Bohnsack and Puhl's (2014) research did not use the WCAG/W3C guidelines, their task-based protocol with the JAWS screen reader indicated problems with language markers, page language, metadata and untitled buttons, results similar to those of Lúmina. Sanderson, Chen, Bong and Kessel (2016) also used the W3C guidelines to assess the accessibility of the Canvas platform; of the 28 accessibility criteria applied, Canvas fully met 11. Since the AccessMonitor tool also applies the W3C accessibility success criteria, contrasting these results with those of the Lúmina MOOC shows that the Canvas platform presented a larger set of issues failing the accessibility success criteria. There are, however, common faults that compromise assistive technologies such as screen readers: missing alternative text for images and problems with buttons.

Like the "Deconstructing Racism in Practice" course analyzed in this research, the MOOCs developed and presented in the studies by Acedo and Osuna (2016) and Cinquin, Guitton and Sauzéon (2018) also offer accessibility features that users can adjust according to their needs and preferences. In Cinquin, Guitton and Sauzéon (2018), user evaluations showed that the accessibility settings available in the MOOC gave participants with disabilities more autonomy and significantly influenced their persistence throughout the course. Other promising results were reported by Acedo and Osuna (2016), who applied questionnaires to participants and obtained positive answers regarding accessibility, usability and satisfaction with the learning materials of the MOOCs built in the ECO European Project. Influenced by these successful results, future work will include research with people with disabilities in order to analyze accessibility from the user's perspective.

As a limitation, it is noteworthy that automatic accessibility assessment tools, although valuable resources, should not be the only form of inspection for the following reasons:

  1. They do not differentiate features within websites, so it is not possible to prioritize the changes needed. For example, an image without alternative text that is merely a logo in the footer and an image without alternative text that is didactic material raise the same type of error.

  2. They do not evaluate elements other than source code, such as teaching materials. Videos, images, PDFs and spreadsheets are important elements that are not inspected.

  3. They do not differentiate roles within the platform. A platform like Lúmina has multiple roles, each with a different extent of access to management tools. Services that run inside the server where the platform is hosted, such as A-Web and other paid services, have this limitation.

For this reason, the integration of different accessibility assessment methods is important. Furthermore, both in this research and in the accessibility assessment reports in the literature, some of the problems detected do not necessarily originate in the MOOC platform itself, but rather in tools, editors and other features used by the platform, so fixing only the platform code may not be sufficient to meet the accessibility criteria. We believe this to be our most significant finding, as the third-party research cited throughout the paper, which also evaluated MOOC platforms with automated accessibility testing tools, does not point out this limitation.

Given the above, besides incorporating diversified methods and technological and human-factors knowledge into the design and evaluation of MOOCs, it is necessary to reflect on the inclusion of people with disabilities in all educational settings, especially in distance learning through MOOCs. It is also urgent to look for solutions that guide the construction of accessibility-aware MOOCs and platforms, in order to serve the various user profiles without technological, educational or even attitudinal barriers.

Acknowledgement

This study was financed in part by the Coordination for the Improvement of Higher Education Personnel - Brazil (CAPES) - Finance Code 001.

  • Abascal, J., Barbosa, S. D., Nicolle, C., & Zaphiris, P. (2016). Rethinking universal accessibility: a broader approach considering the digital gap. Universal Access in the Information Society, 15(2), 179-182.
  • Acedo, S. O., & Osuna, S. M. T. (2016). ECO European Project: Inclusive Education through Accessible MOOCs. Proceedings of the Fourth International Conference on Technological Ecosystems for Enhancing Multiculturality, Salamanca, Spain, 881-886.
  • Avila, J. (2018). WCAG 2.1: Exploring the New Success Criteria. Retrieved January 21, 2019, from https://www.levelaccess.com/wcag-2-1-exploring-new-success-criteria/
  • Bastos, R. C., & Biagiotti, B. (2014). MOOCs: uma alternativa para a democratização do ensino. RENOTE, 12(1), 1-9. DOI: https://doi.org/10.22456/1679-1916.50333
  • Bohnsack M., & Puhl S. (2014). Accessibility of MOOCs. In K. Miesenberger, D. Fels, D. Archambault, P. Peňáz, & W. Zagler (Eds.), Computers Helping People with Special Needs - ICCHP 2014. Lecture Notes in Computer Science (v. 8547, pp. 141-144). Springer: Cham.
  • Borges, W. F., & Mendes, E. G. (2018). Usabilidade de aplicativos de tecnologia assistiva por pessoas com baixa visão. Revista Brasileira de Educação Especial, 24(2), 483-500. DOI: http://dx.doi.org/10.1590/s1413-65382418000500002
  • Cinquin, P. A., Guitton, P., & Sauzéon, H. (2018). Towards Truly Accessible MOOCs for Persons with Cognitive Disabilities: Design and Field Assessment. In K. Miesenberger, & G. Kouroupetroglou (Eds.), Computers Helping People with Special Needs - ICCHP 2018. Lecture Notes in Computer Science (v. 10896). Springer: Cham.
  • Csapó, Á., Wersényi, G., Nagy, H., & Stockman, T. (2015). A survey of assistive technologies and applications for blind users on mobile platforms: a review and foundation for research. Journal of Multimodal User Interfaces, 9(4), 275-286.
  • Heap, T. P., & Thompson, M. (2018). Optimizing Accessibility Training in Online Higher Education. Proceedings of the 33rd CSUN Assistive Technology Conference, March, San Diego, CA, USA.
  • Ichou, R. P. (2018). Can MOOCs reduce global inequality in education?. Australasian Marketing Journal, 26(2), 116-120. DOI: https://doi.org/10.1016/j.ausmj.2018.05.007
  • Iniesto, F., McAndrew, P., Minocha, S., & Coughlan, T. (2018). Widening disabled learners' participation to HE through the use of MOOCs. Widening Participation Conference: Is widening participation to higher education enough?, Milton Keynes, England.
  • Iniesto, F., McAndrew, P., Minocha, S., & Coughlan, T. (2017). An investigation into the perspectives of providers and learners on MOOC accessibility. Proceedings of the 5th International Conference on Technological Ecosystems for Enhancing Multiculturality, Cadiz, Spain.
  • Iniesto, F., & Rodrigo, C. (2014). Accessibility assessment of MOOC platforms in Spanish: UNED COMA, COLMENIA and Miriada X. International Symposium on Computers in Education (SIIE), Logrono, Spain.
  • Królak, A., Chen, W., Sanderson, N. C., & Kessel, S. (2017). The Accessibility of MOOCs for Blind Learners. Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility, Baltimore, Maryland, USA.
  • Kurt, S. (2019). Moving toward a universally accessible web: Web accessibility and education. Assistive Technology, 31(4), 199-208. DOI: https://doi.org/10.1080/10400435.2017.1414086
  • Pacheco, H. S., Amorim, P. F., Barbosa, P. G. F., & Ferreira, S. B. L. (2016). Comparative Analysis of Web Accessibility Evaluation Tools. Proceedings of the 15th Brazilian Symposium on Human Factors in Computing Systems, São Paulo, Brasil.
  • Paddison, C., & Englefield, P. (2004). Applying heuristics to accessibility inspections. Interacting with Computers, 16, 507-521. DOI: https://doi.org/10.1016/j.intcom.2004.04.007
  • Park, K., Kim, H. J., & So, H. (2016). Are Massive Open Online Courses (MOOCs) Really Open to Everyone?: A Study of Accessibility Evaluation from the Perspective of Universal Design for Learning. Proceedings of HCI Korea, Jeongseon, Republic of Korea, 29-36.
  • Persson, H., Ahman, H., Yngling, A., & Gulliksen, J. (2014). Universal design, inclusive design, accessible design, design for all: different concepts-one goal? On the concept of accessibility-historical, methodological and philosophical aspects. Universal Access in the Information Society, 14(4), 505-526.
  • Ramos, A. L. B. M., & Dantas, A. E. V. B. (2017). Internet para Todos: uma abordagem metodológica para avaliação multidimensional da acessibilidade web. Revista Mangaio Acadêmico, 2(1), 1-11.
  • Sanderson, N. C., Chen, W., Bong, W. K., & Kessel, S. (2016). The Accessibility of MOOC Platforms from Instructors' Perspective. International Conference on Universal Access in Human-Computer Interaction, 124-134.
  • Stanton, N., & Young, M. (1999). Guide to methodology in ergonomics: Designing for human use. London: Taylor & Francis.
  • Tonato, A. F. (2016). Melhorando a acessibilidade de aplicações Web com ARIA. BrazilJS. Retrieved January 21, 2019, from https://braziljs.org/blog/melhorando-a-acessibilidade-de-aplicacoes-web-com-aria/

Publication Dates

  • Publication in this collection
    25 Nov 2019
  • Date of issue
    Oct-Dec 2019

History

  • Received
    13 June 2019
  • Reviewed
    07 Aug 2019
  • Accepted
    11 Aug 2019
Associação Brasileira de Pesquisadores em Educação Especial - ABPEE Av. Eng. Luiz Edmundo Carrijo Coube, 14-01 Vargem Limpa, CEP: 17033-360 - Bauru, SP, Tel.: 14 - 3402-1366 - Bauru - SP - Brazil
E-mail: revista.rbee@gmail.com