
Mobile Augmented Reality enhances indoor navigation for wheelchair users

Abstract

Introduction: Individuals with mobility impairments associated with lower limb disabilities often face enormous challenges to participate in routine activities and to move around various environments. For many, wheelchairs are paramount for providing mobility and social inclusion. Nevertheless, these users still face a number of challenges to function properly in our society. Among the many difficulties, one in particular stands out: navigating in complex indoor environments. The main objective of this work is to propose an architecture based on Mobile Augmented Reality to support the development of indoor navigation systems dedicated to wheelchair users, one that is also capable of recording CAD drawings of buildings and dealing with accessibility issues for that population.

Methods

Overall, five main functional requirements are proposed: allow indoor navigation by means of Mobile Augmented Reality techniques; register and configure building CAD drawings and the position of fiducial markers, points of interest and obstacles to be avoided by the wheelchair user; find the best route for wheelchair indoor navigation, taking stairs and other obstacles into account; allow for the visualization of virtual directional arrows on the smartphone display; and incorporate touch or voice commands to interact with the application. The architecture is proposed as a combination of four layers: User Interface, Control, Service and Infrastructure. A proof-of-concept application was developed and tests were performed with disabled volunteers operating manual and electric wheelchairs.

Results

The application was implemented in Java for the Android operating system. A local database was used to store the test building CAD drawings and the positions of fiducial markers and points of interest. The AndAR (Android Augmented Reality) library was used to implement Augmented Reality, and the Blender open-source 3D modeler provided the basis for creating the directional navigation arrows. OpenGL ES provided support for the various graphics and mathematical transformations on embedded systems such as smartphones. Experiments were performed in an academic building with various labs, classrooms and male and female bathrooms. Two disabled volunteers using wheelchairs showed no difficulties in interacting with the application, either by entering touch or voice commands, or in navigating within the testing environment with the help of the navigational arrows implemented by the Augmented Reality modules.

Conclusion

The novel features implemented in the proposed architecture, with special emphasis on the use of Mobile Augmented Reality and the ability to identify the best routes free of potential hazards for wheelchair users, were capable of providing significant benefits for wheelchair indoor navigation when compared to current techniques described in the literature.

Keywords:
Mobile augmented reality; Assistive technology; Pervasive computing; Indoor wheelchair navigation


Introduction

In spite of the technological advances of the last decades, a large percentage of people with disabilities still face enormous challenges to perform basic activities of daily living. Assistive Technology resources can provide those people with various means to improve their quality of life and social inclusion (Santarosa et al., 2012). Mobility impairments are among the most common problems experienced by individuals with disabilities, affecting a wide range of the world population (Alm et al., 1998; Kouroupetroglou, 2013; Mirza et al., 2012; Sanchez et al., 2007; Tsetsos et al., 2006). Wheelchairs are commonly used assistive devices and, beyond providing mobility, allow for social inclusion and improved quality of life. Nevertheless, wheelchair users still face a number of challenges when navigating indoor environments (Kouroupetroglou, 2013). Finding the most appropriate route through large and complex areas, such as hospitals, bus terminals, supermarkets and shopping centers, is not an easy task (Tsetsos et al., 2005). The best route may be the shortest one, the easiest trajectory (a path without stairs, for instance, or one offering appropriate dropped curbs for wheelchair access), one that passes by points of interest, or simply the safest route to the destination.

With the proliferation of wireless technology, a number of solutions for intelligent navigation have become common over recent years, for instance in the context of driving cars and other vehicles (Chakraborty and Hashimoto, 2010; Tsetsos et al., 2005). However, when such techniques are applied to wheelchair users, the systems fail in many aspects, such as the precision needed to indicate the best route to be followed or the most suitable route for those with mobility difficulties (Alm et al., 1998; Deruwe and Wall, 2008; Park et al., 2006; Postolache et al., 2011; Stefanov et al., 2004). Limitations of this type prevent wheelchair users from moving around safely and with agility. As a matter of fact, proper navigation systems for this class of users must also take into account their various impairments and physical limitations. On many occasions, the task of steering the wheelchair may already be too demanding for some users, while for others, reaching the elevator call button or manipulating a portable device may not be possible at all. Despite various efforts, current systems still lack proper features to fully overcome those limitations (Kouroupetroglou, 2013; Mirza et al., 2012; Tsetsos et al., 2006). Furthermore, the majority of the solutions described in the literature do not provide crucial environmental information to wheelchair users, such as information concerning the access ramps, counters, shelves, furniture, stairs and elevators that define the internal space of a building. Mobile Augmented Reality has emerged as an alternative technology that can fulfill those requirements (Blum et al., 2011; Höllerer et al., 1999; Marston et al., 2006; Mulloni et al., 2012; Rovadosky et al., 2012).

Augmented Reality (AR) has the potential to provide a natural interface for large-scale environments when combined with Ubiquitous Computing (Denning and Metcalfe, 1998). Ubiquitous Computing, also known as Pervasive Computing, describes how current technological models, based on intelligent devices (mobile phones and wireless devices), intelligent environments (embedded device systems) and intelligent interaction (between devices), relate to and support computing across a wide range of devices, environments and human activities (Poslad, 2009; Xing et al., 2009). While Pervasive Computing, along with mobile devices, allows for the development of new and useful applications for daily activities, which can also be directed towards improving the quality of life of wheelchair users (Ahluwalia et al., 2014; Chiara et al., 2010), Augmented Reality technologies are well suited to indoor navigation, where they can provide environmental information and navigation routes (Delail et al., 2012; Hervas et al., 2014; Kotsakos et al., 2013; Matuszka et al., 2013; Rehman and Cao, 2015).

Positioning systems for outdoor environments have reached a high degree of success and are now commercially available (Chakraborty and Hashimoto, 2010; Hansen and Thomsen, 2007; Tsetsos et al., 2006). Indoor navigation, on the other hand, has yet to achieve the same level of success (Kouroupetroglou, 2013; Ran et al., 2004; Tsetsos et al., 2005; Tsetsos et al., 2006; Yohan et al., 2012). For example, a number of navigation systems are based on GPS data. However, GPS signals can be weak or imprecise indoors, resulting in insufficient resolution and sometimes rendering navigation impossible. Furthermore, most of those systems do not allow for the incorporation of new navigational data, being limited to the information received where GPS signals are present, and do not incorporate feedback about the actual location that could allow for more precise localization (Hub, 2008).

Recent works have provided important advances to facilitate indoor navigation for wheelchair users (Cheein et al., 2011; De La Cruz et al., 2010; De La Cruz et al., 2011). Nevertheless, these new technologies do not yet incorporate the many advantages of Augmented Reality as a means of facilitating indoor navigation for wheelchair users while taking into account their needs and abilities (Golfarelli et al., 2001; Kouroupetroglou, 2013; Parker and Tomitsch, 2014; Tsetsos et al., 2005). In fact, to date, the authors have not found in the literature systems that use Mobile Augmented Reality for indoor navigation and real-time visualization of information while also addressing accessibility issues. Besides, the lack of data to support indoor navigation also poses a difficult challenge, primarily when it comes to the design of a generalized navigation system for wheelchair users. In other words, a more flexible navigation system is necessary, one that is omnipresent and capable of incorporating user preferences while supplying a clear set of choices, in order to satisfy a wider spectrum of society (Chakraborty and Hashimoto, 2010).

Table 1 shows a number of studies published in the literature over the past eighteen years, dealing with indoor and outdoor navigation for individuals with physical disabilities, describing the various technologies used and the main features that each system incorporates. No studies were found covering the main features we consider paramount for the successful generalization of navigation systems for wheelchair users: indoor navigation; information visualization (interactive visual representation of data to reinforce human cognition) endorsed by Augmented Reality techniques; specific navigation tools (such as those that consider dropped curbs and ramps); specific functionalities for different user profiles; capacity of switching visualizations and algorithms according to user feedback; and voice commands and gesture recognition.

Table 1
Comparison of research-based navigation systems.

The scarcity of research combining mobile AR for indoor navigation with techniques that identify the best routes by taking physical disabilities into account, and that is directed at wheelchair users, was in fact the main motivation for this research. It is believed that such a combination will improve accessibility and facilitate navigation in various environments, such as shopping centers, hospitals and workplaces.

The authors propose an architecture to support the development of indoor navigation systems dedicated to wheelchair users that is also capable of recording CAD drawings of buildings alongside accessibility information. The overall strategy is based on recording the location of special areas (such as rooms and bathrooms) and of fiducial markers positioned at points of interest. By means of voice or tapping commands in a smartphone interface, the user can request a specific location. In turn, the system provides an optimized route that considers the user's possible physical limitations (a path without ramps and other obstacles, for instance). When moving along the proposed route, the user will be able to spot fiducial markers. Positioning the smartphone's camera in front of those markers allows for the visualization of directional arrows in an Augmented Reality environment, facilitating continuous navigation along the correct path to the next marker, until the desired location is reached.

Methods

We propose an architecture for the development of applications in which a wheelchair user is able to visualize routes that provide better and safer indoor navigation, avoiding ramps, stairs and other kinds of obstacles. To support indoor navigation, Augmented Reality fiducial markers are employed to present direction arrows. Besides, the system should allow for the registering of different environments/locations, as well as the configuration of places of interest (such as bathrooms and emergency exits) and the repositioning of fiducial markers. It is also important to take into consideration the motor and cognitive abilities of possible users (Jia et al., 2007). In other words, the navigation algorithm must consider the needs and capacities of each user. These could influence a number of aspects associated with the design of the indoor navigation system, as well as the algorithms used for navigation and the interface design. For example, the positioning of markers needs to take into consideration the height of the user seated in the chair. Furthermore, the application should be designed to respond to both vocal and written (typed on a keyboard or selected from a form) instructions to define, for instance, the desired target for which the best route should be calculated and presented to the user.

In this manner, we propose the following main functional requirements (FR) for a system dedicated to indoor navigation for wheelchair users:

  • FR01: allow indoor navigation for wheelchair users by means of Mobile Augmented Reality techniques;

  • FR02: register and configure the building CAD drawing on the smartphone, with fiducial markers locating points of interest or obstacles to be avoided by the wheelchair;

  • FR03: find the best route for wheelchair indoor navigation, taking stairs and other obstacles into account;

  • FR04: allow for the visualization of virtual direction arrows on the smartphone display, according to the AR fiducial markers and the position of the arrival point. The placing of fiducial markers shall consider wheelchair user limitations (such as eye height level) and mobile device specifications (such as the focal range of the digital camera);

  • FR05: incorporate voice commands to call for specific locations (such as bathrooms and emergency exits) in case the wheelchair user has limitations in manipulating the smartphone.

Considering those Functional Requirements, the authors propose an architecture for the design of applications dedicated to indoor navigation for wheelchair users composed of four layers, as shown in Figure 1:

Figure 1
Proposed architecture for an indoor navigation system for wheelchair users based on fiducial markers and Augmented Reality.
  • User interface layer: responsible for creating specific events for each client, depending on the user. This layer contains two modules: the Data Acquisition module, responsible for interpreting the commands (touch or voice) when the user selects a point of arrival, and the Event Capture module, which captures those interaction events.

  • Control layer: responsible for accepting or rejecting data or events that arrive from the interface layer. Two modules comprise this layer: the Data Processing module, which receives and interprets the option selected by the user in the Data Acquisition module (previous layer), and the Event Controls module, required when events such as voice/gesture commands are identified in the User Interface Layer.

  • Service layer: responsible for managing all services requested to ensure wheelchair indoor navigation. For instance, the Service Location module is responsible for managing the position of fiducial markers within the CAD drawing, which is important for generating arrow directions according to the user's selection of the point of arrival, performed by the Routing Algorithm module. The Indoor Navigation module controls user navigation, presenting direction arrows on the smartphone display through Augmented Reality techniques. The User Profile Identification module identifies the type of wheelchair user (able to use both hands and voice, only voice, etc.). The Outdoor Navigation module is designed to allow for outdoor navigation with the help of GPS data. When GPS signals can no longer be identified, the Navigation Service module switches control to the Indoor Navigation module.

  • Infrastructure layer: responsible for accessing information (such as CAD drawing, position of markers in this drawing etc.) stored in a database.
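
To make the module responsibilities above concrete, the four layers could be expressed as plain Java interfaces, as in the following minimal sketch (class, method and type names are illustrative assumptions, not the authors' actual code):

    /** Illustrative layering sketch; all names are assumptions, not the authors' code. */
    public final class ArchitectureSketch {

        enum Direction { LEFT, RIGHT, FORWARD, ARRIVED }

        // User interface layer
        interface DataAcquisition { void onArrivalPointSelected(String pointOfInterest); }
        interface EventCapture    { void onVoiceCommand(String utterance); }

        // Control layer
        interface DataProcessing  { void processSelection(String pointOfInterest); }
        interface EventControls   { void handleEvent(String eventName); }

        // Service layer
        interface ServiceLocation  { int[] cellOf(String markerId); }   // (row, column) in the CAD grid
        interface RoutingAlgorithm { Direction directionAt(String currentMarker, String arrivalMarker); }
        interface IndoorNavigation { void showArrow(String markerId, Direction direction); }

        // Infrastructure layer
        interface BuildingRepository {
            byte[] loadCadDrawing(String buildingId);
            int[]  markerPosition(String markerId);
        }
    }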

In order to develop an application based on this architecture, one must first register the CAD drawing of the building of interest in the smartphone's storage. Next, the CAD drawing is divided into a matrix of cells. Each cell represents a square of 2 meters per side, labeled according to its position (row, column); cells with different dimensions can also be configured. Fiducial markers are then scattered along the corridors of the building and labeled according to their positions. In Figure 2, for example, fiducial marker '4' is labeled as position (5, 5). Likewise, the system labels fiducial marker '6' as position (5, 8). The Service Location module of the Service Layer in the proposed architecture handles the registration of cells and fiducial markers.
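
As an illustration of this cell and marker registration (a minimal sketch with hypothetical names, not the authors' implementation), the grid could be represented as follows:

    import java.util.HashMap;
    import java.util.Map;

    /** Cell grid derived from the building CAD drawing (hypothetical helper class). */
    public class BuildingGrid {
        private final double cellSizeMeters;                  // default of 2 m per cell side
        private final boolean[][] blocked;                    // true for stairs/obstacle cells
        private final Map<String, int[]> markerCells = new HashMap<>(); // markerId -> {row, col}

        public BuildingGrid(int rows, int cols, double cellSizeMeters) {
            this.cellSizeMeters = cellSizeMeters;
            this.blocked = new boolean[rows][cols];
        }

        /** Registers a fiducial marker at a grid cell, e.g. registerMarker("4", 5, 5). */
        public void registerMarker(String markerId, int row, int col) {
            markerCells.put(markerId, new int[] { row, col });
        }

        /** Marks a cell as not traversable by a wheelchair (stairs, furniture, ...). */
        public void blockCell(int row, int col) { blocked[row][col] = true; }

        public int[] cellOf(String markerId)       { return markerCells.get(markerId); }
        public boolean isBlocked(int row, int col) { return blocked[row][col]; }
        public int rows() { return blocked.length; }
        public int cols() { return blocked[0].length; }
        public double cellSizeMeters() { return cellSizeMeters; }
    }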

Figure 2
CAD drawing of the building used in the proof-of-concept experiment. In the zoomed area are shown the start and arrival points, along with the position of fiducial markers (small squares placed on the walls) and a possible path (dashed lines).

For operation, the wheelchair user accesses the application and requests a point of arrival, selecting it on the smartphone form by touch (handled by the Data Acquisition module) or by voice (managed by the Event Capture module), in the User Interface Layer of the proposed architecture. For example, considering Figure 2, suppose the user is at the building's main entrance (starting point) and selects a room referenced by marker '4'. The Data Processing module of the architecture's Control Layer processes the request and registers the position of marker '4' as the point of arrival. At this point, the Routing Algorithm module of the Service Layer is activated and remains in a 'listening' mode, waiting for a flag from the Indoor Navigation module indicating that a marker has been found and a directional arrow is requested. After the selection, the user starts navigating along the building corridors, looking for markers.
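
The paper does not detail the routing algorithm itself; one simple way to realize FR03 over the registered cell grid is a breadth-first search that skips cells marked as stairs or other obstacles. The sketch below assumes the hypothetical BuildingGrid class from the previous example and is not the authors' implementation:

    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Queue;

    /** Illustrative wheelchair-accessible route planner (assumption, not the authors' algorithm). */
    public class RoutePlanner {
        /** Breadth-first search returning the cells from start to goal, or an empty list if unreachable. */
        public static List<int[]> shortestAccessibleRoute(BuildingGrid grid, int[] start, int[] goal) {
            int rows = grid.rows(), cols = grid.cols();
            int[][] prev = new int[rows * cols][];            // predecessor cell, indexed by row * cols + col
            boolean[] visited = new boolean[rows * cols];
            Queue<int[]> queue = new ArrayDeque<>();
            queue.add(start);
            visited[start[0] * cols + start[1]] = true;
            int[][] moves = { {1, 0}, {-1, 0}, {0, 1}, {0, -1} };

            while (!queue.isEmpty()) {
                int[] cell = queue.poll();
                if (cell[0] == goal[0] && cell[1] == goal[1]) {
                    List<int[]> path = new ArrayList<>();
                    for (int[] c = cell; c != null; c = prev[c[0] * cols + c[1]]) path.add(0, c);
                    return path;
                }
                for (int[] m : moves) {
                    int r = cell[0] + m[0], c = cell[1] + m[1];
                    if (r < 0 || r >= rows || c < 0 || c >= cols) continue;
                    if (grid.isBlocked(r, c) || visited[r * cols + c]) continue;  // skip stairs/obstacles
                    visited[r * cols + c] = true;
                    prev[r * cols + c] = cell;
                    queue.add(new int[] { r, c });
                }
            }
            return new ArrayList<>();                          // no wheelchair-accessible route found
        }
    }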

When the user finds a marker, he/she must position the smartphone, with its digital camera activated, in front of the marker to identify the navigation direction. For example, suppose the user reaches fiducial marker '9' in Figure 2. When the user positions the smartphone camera in front of this marker, the Routing Algorithm module proceeds with the arrow direction calculation. This is mathematically achieved by projecting the cell positions of both markers onto the plane of the digital camera. A normal vector of this plane goes from the camera to the fiducial marker. The direction of the arrow to be presented on the user's smartphone display is then given by the vector from the current marker position to the point of arrival. Once this calculation is complete, the Indoor Navigation module is requested to present the directional arrow on the user's smartphone display. In this example, the calculated direction arrow would point to the right. However, if fiducial marker '9' were positioned on the opposite wall (see Figure 2), the calculated direction arrow would point to the left. It is important to note that when the markers' projections in the camera display plane are coincident, the direction arrow points forward, unless the current fiducial marker is the point of arrival.
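
In two dimensions, this left/right/forward decision can be reduced to expressing the floor-plan vector from the current marker to the destination in the frame defined by the user's viewing direction (the opposite of the marker's outward wall normal). The following is a simplified sketch under that assumption, with hypothetical names and a sign convention that takes x as growing to the right and y as growing upward in the plan:

    /** Simplified 2-D version of the arrow direction decision (illustrative, not the authors' code). */
    public class ArrowDirectionCalculator {

        public enum Direction { LEFT, RIGHT, FORWARD, ARRIVED }

        /**
         * @param currentCell (row, col) cell of the marker the camera is pointing at
         * @param arrivalCell (row, col) cell of the destination marker
         * @param userFacing  unit vector, in floor-plan coordinates, of the direction the user is
         *                    looking (i.e. the opposite of the marker's outward wall normal)
         */
        public static Direction directionAt(int[] currentCell, int[] arrivalCell, double[] userFacing) {
            if (currentCell[0] == arrivalCell[0] && currentCell[1] == arrivalCell[1]) {
                return Direction.ARRIVED;                       // the current marker is the point of arrival
            }
            // Vector from the current marker to the destination (x = column, y = row).
            double dx = arrivalCell[1] - currentCell[1];
            double dy = arrivalCell[0] - currentCell[0];

            double lateral = dx * userFacing[1] - dy * userFacing[0];  // component along the user's right

            if (Math.abs(lateral) < 1e-9) return Direction.FORWARD;    // projections coincide: go straight
            return lateral > 0 ? Direction.RIGHT : Direction.LEFT;
        }
    }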

As a proof of concept, the authors developed a smartphone application based on the proposed architecture to allow for indoor navigation using fiducial markers and Augmented Reality. Navigation tests were performed with two disabled volunteers. Volunteer 1 used a manual wheelchair, while Volunteer 2 used a motorized/electric wheelchair. Also, since the system relies on the smartphone's camera to capture the images of the markers for further processing, we tested the application with two different smartphones: a Samsung™ Galaxy S2 with the Android 2.3 operating system and a Samsung™ Galaxy S5 with the Android 5.0 operating system.

This study was approved by the Ethics Committee of the Federal University of Uberlândia, Brazil (CAAE 37874014.0.0000.5152). The volunteers were made aware of the objectives of the experiments and signed the free informed consent forms agreeing to participate in the trials, as well as with the use of their images in scientific publications.

Results

In order to evaluate the proposed architecture, a proof-of-concept application has been developed. The computer platform and tools used to implement this application are as follows:

  • Android operating system, version 2.3 (Burnette, 2009). Android is a widely used open-source platform, based on the Java programming language, for the development of mobile applications;

  • SQLite (Aditya and Vikash, 2014), a local database integrated with the Android platform, used to store the building CAD drawings.

  • AndAR (Android Augmented Reality) (Chen et al., 2016), a software library for creating Augmented Reality applications. It contains a set of modules responsible for registering fiducial markers and associating virtual objects with them. When the smartphone's digital camera captures a real image, AndAR provides all the mathematical transformations needed to position the generated direction arrows onto the fiducial marker. AndAR is based on ARToolKit, one of the very first platforms for developing desktop AR applications.

  • Blender: an open-source 3D modeler that was used to create the directional arrows (Brito, 2008).

  • OpenGL ES (Open Graphics Library for Embedded Systems): this library is necessary because AndAR only places the virtual objects on the fiducial marker, whereas a set of rotations and translations is needed to orient the arrows in the right navigation direction. OpenGL ES performs all of these rigid mathematical transformations (Mithwick et al., 2012), as sketched after this list.
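
As a hedged illustration of this last step, the extra in-plane rotation could be applied with the standard fixed-function GL10 calls; the pose conventions, the angle parameter and the drawArrowGeometry() helper are assumptions, not AndAR's actual API:

    import javax.microedition.khronos.opengles.GL10;

    /** Sketch: extra rotation applied on top of the marker pose so the arrow points
     *  along the computed navigation direction. */
    public class DirectionalArrow {

        /** 0 = forward, 90 = right, -90 = left, measured in the marker's plane (assumed convention). */
        private float headingDegrees;

        public void setHeadingDegrees(float headingDegrees) { this.headingDegrees = headingDegrees; }

        public void draw(GL10 gl) {
            gl.glPushMatrix();
            // The AR library is assumed to have loaded the model-view matrix with the marker pose,
            // so the arrow only needs to be lifted off the marker plane and rotated within it.
            gl.glTranslatef(0f, 0f, 1f);               // lift above the marker surface (marker units)
            gl.glRotatef(headingDegrees, 0f, 0f, 1f);  // spin about the marker's normal
            drawArrowGeometry(gl);                      // hypothetical: issues the arrow's vertex arrays
            gl.glPopMatrix();
        }

        private void drawArrowGeometry(GL10 gl) {
            // Placeholder for the Blender-exported arrow mesh (vertex buffers + glDrawArrays).
        }
    }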

Experiments were performed in an academic building with various labs, classrooms and male and female bathrooms. For the experiment, the CAD drawing of the building (Figure 2) was inserted into the system and the positions of the markers (placed along possible paths) were registered in the application. It is also important to note that the height of the fiducial markers was defined beforehand to match the shoulder height of an average user seated in a wheelchair.

Navigation tests were performed with two disabled volunteers using wheelchairs. The starting point was established at the entrance of the building. After starting the application, the user can select the desired environment and then the desired arrival point, as shown in Figure 3. Navigation can also be controlled through the smartphone's standard voice interface, simply by vocalizing commands and the names of specific locations, such as "environment one" followed by "bathroom". Synonyms are not handled in the current version of the system.
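
The text only states that the smartphone's standard voice interface is used; one plausible way to wire it to destination selection on Android is through RecognizerIntent, as in the sketch below (the selectDestination() hook is hypothetical, not the authors' implementation):

    import android.app.Activity;
    import android.content.Intent;
    import android.speech.RecognizerIntent;
    import java.util.ArrayList;

    /** Sketch of voice-driven destination selection (FR05), under the assumptions stated above. */
    public class VoiceCommandActivity extends Activity {

        private static final int SPEECH_REQUEST_CODE = 1;

        void startListening() {
            Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
            intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
            intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Say a destination, e.g. \"bathroom\"");
            startActivityForResult(intent, SPEECH_REQUEST_CODE);
        }

        @Override
        protected void onActivityResult(int requestCode, int resultCode, Intent data) {
            super.onActivityResult(requestCode, resultCode, data);
            if (requestCode == SPEECH_REQUEST_CODE && resultCode == RESULT_OK && data != null) {
                ArrayList<String> results =
                        data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
                if (results != null && !results.isEmpty()) {
                    selectDestination(results.get(0));   // exact match only; synonyms are not handled
                }
            }
        }

        private void selectDestination(String spokenName) {
            // Hypothetical hook into the Data Acquisition module of the User Interface layer.
        }
    }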

Figure 3
User accessing the application to (a) select the desired environment (left) and (b) the target location within, prior to starting navigation. If necessary, the user can also input commands to the application via a standard voice interface (c).

Once the final destination is selected, the interface presents the direction that the wheelchair user should follow to reach the destination via the best obstacle-free route. When passing by a marker, the user can capture an image of that marker using the camera of the mobile device, which prompts the presentation of the direction to be followed for the best route from that point on. Note that the use of fiducial markers allows for precise and swift navigation even in areas where traditional systems would not work at all, or would be imprecise or slow to respond due to the lack of proper GPS signals, for instance. Figure 4 shows various stages of navigation using the proof-of-concept application.

Figure 4
Demonstration of the various steps associated with the use of the proof-of-concept application to guide navigation of wheelchair users in indoor environments. (a) Volunteer 1 starting navigation; (b) Volunteer 2 starting navigation; (c) User in front of the first fiducial marker, with the system presenting the direction for the best obstacle-free route from that point to the desired location; (d) User in front of another fiducial marker midway; (e) User at the arrival location.

We also tested the capacity of the two different smartphones, as described earlier, to correctly identify (focus on) those images. We found that recognition could be easily achieved for marker-camera distances between 10 cm and 3.5 meters.

Finally, we also tested the use of voice commands to execute the same path shown in Figure 3c. This feature was designed to help those users who have difficulty handling the smartphone with their hands. The results complied with the requirements, allowing for proper navigation.

Discussion

In this work, we propose an architecture based on Augmented Reality for the design of indoor navigation systems dedicated to wheelchair users. An algorithm that considers physical difficulties along the route (such as ramps, stairs and other obstacles) has been provided to identify the best routes and points of interest. In contrast to traditional approaches, the proposed technique also handles voice commands to provide support for different users. By integrating these three features (AR techniques, voice commands and best-route identification), advances in indoor navigation for wheelchair users have been achieved, as discussed hereafter.

In our experiments, the volunteers were able to navigate indoors and find points of interest, as stated by the first functional requirement (FR01). The users reached the point of arrival without any additional help, simply by following the instructions on the system's display. The ability to configure different locations was also achieved, as demanded by requirement FR02. Figure 2 shows the CAD drawing of the testing environment, which was registered in the smartphone application, along with the positions of the markers scattered along the building corridors. When a best route was identified by the system (FR03), visual information was presented to guide the user through the building. Our experiments suggest that the inclusion of fiducial markers along the route identified by the algorithm simplifies indoor navigation for wheelchair users. This is achieved through Augmented Reality techniques that dynamically update the arrow directions displayed on the smartphone screen to guide navigation (FR04). The routing algorithm successfully calculated the directional arrows during the experiments. Finally, in compliance with requirement FR05, the system supported voice commands, which were also helpful in identifying points of interest. This feature facilitates usability, especially for those who have difficulty handling smartphones manually.

It is important to highlight that the identification of best routes, associated with an algorithm to detect potential hazards related to wheelchair mobility, was not handled by previous intelligent navigation systems (Cankaya et al., 2015; Deruwe and Wall, 2008; Dong et al., 2015; Kouroupetroglou, 2013; Mirza et al., 2012; Park et al., 2006; Postolache et al., 2011; Rovadosky et al., 2012; Stefanov et al., 2004). In fact, when deriving the building's CAD drawing information to be registered in the user's smartphone, the location of ramps and stairs can also be taken into account with the proposed architecture, so that the optimized route identification algorithm can use such information to derive possible new routes.

Previous AR attempts were designed only for people able to walk and manipulate a smartphone touch screen (Koch et al., 2014; Neges et al., 2015; Ruta et al., 2015). However, due to their limitations, wheelchair users move around the environment with an eye level below that of standing people. That is why the location and position of the fiducial markers were carefully calculated to guarantee that the information could be reached through the smartphone's image-capturing function. Experiments have shown that, when the users were seated in their wheelchairs, the camera was able to provide helpful information when positioned up to about 3 meters from the markers. Furthermore, the inclusion of voice commands has been shown to be a suitable option for standard users as well as for those with more severe limitations (Jia et al., 2007). This is an important component that has not been observed in other works, such as those proposed by Sinyukov et al. (2014) and Takahashi and Kondo (2015), as shown in Table 1.

In addition, traditional indoor navigation systems are often fully integrated or on-board (Blum et al., 2011; Chiara et al., 2010; Mulloni et al., 2012; Rehman and Cao, 2015; Sinyukov et al., 2014). Although this may seem a more cohesive approach, we believe it also poses a number of problems when maintenance or substitution of modules is needed. In this sense, the proposed architecture has been designed in such a way that new modules (such as outdoor navigation or gesture recognition) can be included without compromising the structure of the whole system and with minimal code construction. Such flexibility is a major bonus when developing indoor navigation systems for wheelchair users, since different particularities can arise from one user to another.

In summary, the authors believe that the various features implemented in the proposed architecture are capable of providing significant benefits for wheelchair indoor navigation. In fact, none of the works shown in Table 1 addressed many of the aspects incorporated in our proposal, such as the need to derive optimal routes considering the presence of ramps or stairs and the use of Augmented Reality techniques to improve the overall experience. Also, with the exception of the work developed by Sinyukov et al. (2014), where Augmented Reality techniques are handled by Google Glass™, no other work was found in the literature handling voice commands for wheelchair users. However, their solution does not handle the concurrent presentation of information about the derived routes, possibly due to limitations of the Google Glass. Besides, the ability to generate directional arrows following the users' movements is not handled by their approach.

Although the proposed architecture presents a number of improvements over current strategies, the authors acknowledge the need for future research to further improve indoor navigation. The integration of location devices known as Bluetooth beacons (Chawathe, 2009), which can be detected by smartphones, will be considered in the near future. This technology is currently used in supermarkets, where push notifications with information about nearby products are sent to smartphone and tablet users. This feature can be explored to provide wheelchair users with additional information (such as safety issues along the route, nearby emergency exits, etc.) when a direction arrow provided by a fiducial marker is identified. Finally, we envision a future integration of this platform into fully automated wheelchairs to improve safety and indoor navigation for those with severe motor impairments, such as people with quadriplegia or those in advanced stages of neurodegenerative diseases, such as amyotrophic lateral sclerosis.

Acknowledgements

The authors thank CAPES (Coordination of Improvement of Higher Education Personnel) and FAPEMIG (Foundation of Research of the State of Minas Gerais) for the financial support that enabled this work.

References

  • Aditya SK, Vikash KK. Android sQLite essentials. Birmingham: Packt Publishing Ltd; 2014.
  • Ahluwalia P, Varshney U, Koong KS, Wei J. Ubiquitous, mobile, pervasive and wireless information systems: current research and future directions. International Journal of Mobile Communications. 2014; 12(2):103-41. http://dx.doi.org/10.1504/IJMC.2014.059738
    » http://dx.doi.org/10.1590/2446-4740.01515
  • Alm N, Arnott JL, Murray IR, Buchanan I. Virtual reality for putting people with disabilities in control. IEEE International Conference on Systems, Man, and Cybernetics; 1998; San Diego, USA. San Diego: IEEE; 1998. p. 1174-9.
  • Barberis C, Bottino A, Malnati G, Montuschi P. Experiencing indoor navigation on mobile Devices. IT Professional. 2014; 16(1):50-7. http://dx.doi.org/10.1109/MITP.2013.54
    » http://dx.doi.org/10.1109/MITP.2013.54
  • Blum JR, Bouchard M, Cooperstock JR. What's around me? Spatialized audio augmented reality for blind users with a smartphone. New York: Springer; 2011. http://dx.doi.org/10.1007/978-3-642-30973-1_5
    » http://dx.doi.org/10.1007/978-3-642-30973-1_5
  • Brito A. Blender 3D: architecture, buildings, and scenery: create photorealistic 3D architectural visualizations of buildings, interiors, and environmental scenery. Birmingham: Packt Publishing Ltd; 2008.
  • Burnette E. Hello, android: introducing Google’s mobile development platform. 2nd ed. Raleigh: Pragmatic Bookshelf; 2009.
  • Cankaya IA, Koyun A, Yigit T, Yuksel AS. Mobile indoor navigation system in iOS platform using augmented reality. AICT: Proceedings of the 9th International Conference on Application of Information and Communication Technologies; 2015 Oct 14-16; Rostov-on-Don, Russia. Rostov-on-Don: IEEE; 2015. p. 281-4.
  • Chakraborty B, Hashimoto T. A framework for user aware route selection in pedestrian navigation system. ISAC: Proceedings of the 2nd International Symposium on Aware Computing; 2010 Nov 1-4; Tainan, Taiwan. Tainan: IEEE; 2010. p. 150-3. http://dx.doi.org/10.1109/isac.2010.5670466
    » http://dx.doi.org/10.1109/isac.2010.5670466
  • Chawathe SS. Low-latency indoor localization using bluetooth beacons. ITSC '09: Proceedings of the 12th International IEEE Conference on Intelligent Transportation Systems; 2009 Oct 4-7; St. Louis, USA. St. Louis: IEEE; 2009. p. 1-7.
  • Cheein FAA, De La Cruz C, Filho TFB, Carelli R. Maneuverability strategy for assistive for assistive vehicles navigating within confined space. International Journal of Advanced Robotic Systems. 2011; 8:62-75.
  • Chen P, Peng Z, Li D, Yang L. An improved augmented reality system based on AndAR. Journal of Visual Communication and Image Representation. 2016; 37(1):63-9. http://dx.doi.org/10.1016/j.jvcir.2015.06.016
    » http://dx.doi.org/10.1016/j.jvcir.2015.06.016
  • Chiara DD, Paolino L, Romano M, Sebillo M, Tortora G, Vitiello G. Augmented map navigation through customizable mobile interfaces. Proceedings of the 16th International Conference on Distributed Multimedia Systems; 2010; Oak Brook, USA. Oak Brook: Knowledge Systems Institute Graduate School; 2010. p. 265-70.
  • De La Cruz C, Bastos TF, Cheein FAA, Carelli R. SLAM-based robotic wheelchair navigation system designed for confined spaces. ISIE: Proceedings of the 2010 IEEE International Symposium on Industrial Electronics; 2010 Jul 4-7; Bari, Italy. Bari: IEEE; 2010. p. 2331-6. http://dx.doi.org/10.1109/isie.2010.5637760
  • De La Cruz C, Celeste WC, Bastos TF. A robust navigation system for robotic wheelchairs. Control Engineering Practice. 2011; 19(6):575-90. http://dx.doi.org/10.1016/j.conengprac.2010.11.007
  • Deepesh PC, Rath R, Tiwary A, Rao VN, Kanakalata N. Experiences with using iBeacons for indoor positioning. Proceedings of the 9th India Software Engineering Conference; 2016 Feb 18-20; Goa, India. Rajasthan: Birla Institute of Technology and Science; 2016. p. 184-9.
  • Delail BA, Weruaga L, Zemerly MJ. CAViAR: context aware visual indoor augmented reality for a university campus. WI-IAT: Proceedings of the 2012 IEEE/WIC/ACM International Conferences on Web Intelligence and Intelligent Agent Technology; 2012 Dec 4-7; Macau, China. Macau: IEEE; 2012. p. 286-90. http://dx.doi.org/10.1109/wi-iat.2012.99
  • Denning PJ, Metcalfe RM. Beyond calculation: the next fifty years of computing. Göttingen: Copernicus; 1998.
  • Deruwe G, Wall R. Pedestrian navigation and integration with distributed smart signal traffic controls. IECON 2008: Proceedings of the 34th Annual Conference of IEEE - Industrial Electronics; 2008 Nov 10-13; Orlando, USA. Orlando: IEEE; 2008. p. 2923-8. http://dx.doi.org/10.1109/iecon.2008.4758424
  • Dong J, Xiao Y, Noreikis M, Ou Z, Ylä-Jääski A. iMoon: using smartphones for image-based indoor navigation. Proceedings of the 13th ACM Conference on Embedded Networked Sensor Systems; 2015 Nov 1-4; Seoul, South Korea. Seoul: ACM; 2015. p. 85-97.
  • Ferreira ALS, Santos SRD, Miranda LCD. TrueSight: sistema de navegação para pedestres baseado na detecção e extração automática de landmark em smartphone Android. SVR 2012: Anais do XIV Simpósio de Realidade Virtual e Aumentada; 2012 May 21-31; Niterói, Brasil. Niterói: Universidade Federal Fluminense; 2012. p. 91-9. http://dx.doi.org/10.1109/SVR.2012.14
  • Golfarelli M, Maio D, Rizzi S. Correction of dead-reckoning errors in map building for mobile robots. IEEE Transactions on Robotics and Automation. 2001; 17(1):37-47.
  • Hansen R, Thomsen B. Using weighted graphs for computationally efficient WLAN location determination. MobiQuitous 2007: Proceedings of the Fourth Annual International Conference on Mobile and Ubiquitous Systems: Networking & Services; 2007 Aug 6-10; Philadelphia, PA, USA. Philadelphia: IEEE; 2007. p. 1-5.
  • Hervas R, Bravo J, Fontecha J. An assistive navigation system based on augmented reality and context awareness for people with mild cognitive impairments. IEEE Journal of Biomedical and Health Informatics. 2014; 18(1):368-74. http://dx.doi.org/10.1109/JBHI.2013.2266480 PMid:24403436.
  • Höllerer T, Feiner S, Terauchi T, Rashid G, Hallaway D. Exploring MARS: developing indoor and outdoor user interfaces to a mobile augmented reality system. Computers & Graphics. 1999; 23(6):779-85. http://dx.doi.org/10.1016/S0097-8493(99)00103-X
  • Hub A. Precise indoor and outdoor navigation for the blind and visually impaired using augmented maps and the TANIA system. Vision 2008: Proceedings of the 9th International Conference on Low Vision; 2008 Jul 7-11; Montreal, Canada. Canada: Online; 2008. p. 1-4.
  • Jia P, Hu HH, Lu T, Yuan K. Head gesture recognition for hands-free control of an intelligent wheelchair. Industrial Robot: An International Journal. 2007; 34(1):60-8.
  • Ju JS, Shin Y, Kim EY. Vision based interface system for hands free control of an Intelligent Wheelchair. Journal of Neuroengineering and Rehabilitation. 2009; 6(1):1-17. http://dx.doi.org/10.1186/1743-0003-6-33 PMid:19660132.
  • Kalkusch M, Lidy T, Knapp M, Reitmayr G, Kaufmann H, Schmalstieg D. Structured visual markers for indoor pathfinding. Proceedings of the First IEEE International Workshop on Augmented Reality Toolkit; 2002; Darmstadt, Germany. Germany: IEEE; 2002. p. 8. http://dx.doi.org/10.1109/art.2002.1107018
  • Koch C, Neges M, König M, Abramovici M. Natural markers for augmented reality-based indoor navigation and facility maintenance. Automation in Construction. 2014; 48:18-30. http://dx.doi.org/10.1016/j.autcon.2014.08.009
  • Kotsakos D, Sakkos P, Kalogeraki V, Gunopulos D. Using smart mobile devices for monitoring in assistive environments. Proceedings of the 6th International Conference on Pervasive Technologies Related to Assistive Environments; 2013 May 29-31; Rhodes, Greece. USA: ACM; 2013. p. 1-4.
  • Kouroupetroglou G. Disability informatics and web accessibility for motor limitations. In: Kouroupetroglou G, editor. Advances in medical technologies and clinical. Hershey: IGI Global; 2013.
  • Lokuge Y, Madumal P, Kumara T, Ranasinghe N. Indoor navigation framework for mapping and localization of multiple robotic wheelchairs. ISMS 2014: Proceedings of the 5th International Conference on Intelligent Systems, Modelling and Simulation; 2014 Jan 27-29; Langkawi, Malaysia. Langkawi: IEEE; 2014. p. 197-200. http://dx.doi.org/10.1109/isms.2014.39
  • Maghdid HS, Lami IA, Ghafoor KZ, Lloret J. Seamless outdoors-indoors localization solutions on smartphones: implementation and challenges. ACM Computing Surveys. 2016; 48(4):1-34. http://dx.doi.org/10.1145/2871166
  • Marston JR, Loomis JM, Klatzky RL, Golledge RG, Smith EL. Evaluation of spatial displays for navigation without sight. ACM Transactions on Applied Perception. 2006; 3(2):110-24. http://dx.doi.org/10.1145/1141897.1141900
  • Matuszka T, Gombos G, Kiss A. A new approach for indoor navigation using semantic web technologies and augmented reality. In: Shumaker R, editor. Virtual, augmented and mixed reality. Lecture notes in computer science: designing and developing augmented and virtual environments. Heidelberg: Springer; 2013. v. 8021, p. 202-10. http://dx.doi.org/10.1007/978-3-642-39405-8_24
  • Mirza R, Tehseen A, Kumar AVJ. An indoor navigation approach to aid the physically disabled people. ICCEET 2012: Proceedings of the International Conference on Computing, Electronics and Electrical Technologies; 2012 March 21-22; Nagercoil, India. Kumaracoil: IEEE; 2012. p. 979-83. http://dx.doi.org/10.1109/icceet.2012.6203860
  • Mithwick R, Michael MV, Muhtasib L. Pro OpenGL ES for Android. New York: Apress; 2012.
  • Mulloni A, Grubert J, Seichter H, Langlotz T, Grasset R, Reitmayr G, Schmalstieg D. Experiences with the impact of tracking technology in mobile augmented reality. MobileHCI and MobiVis 2012: Proceedings of Mobile Vision (MobiVis 2012): vision-based applications and HCI; Workshop at MobileHCI 2012. 2012 Sept 21; San Francisco, USA. San Francisco: ACM; 2012. p. 1-4.
  • Neges M, Koch C, König M, Abramovici M. Combining visual natural markers and IMU for improved AR based indoor navigation. Advanced Engineering Informatics. 2015. In press. http://dx.doi.org/10.1016/j.aei.2015.10.005
  • Newman J, Wagner M, Bauer M, Macwilliams A, Pintaric T, Beyer D, Pustka D, Strasser F, Schmalstieg D, Klinker G. Ubiquitous tracking for augmented reality. ISMAR 2004: Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality; 2004 Nov 2-5; Arlington, USA. Arlington: IEEE; 2004. p. 192-201.
  • Onorati T, Malizia A, Diaz P, Aedo I. Modeling an ontology on accessible evacuation routes for emergencies. Expert Systems with Applications. 2014; 41(16):7124-34. http://dx.doi.org/10.1016/j.eswa.2014.05.039
  • Park K-H, Bien Z, Lee J-J, Kim BK, Lim J-T, Kim J-O, Lee H, Stefanov DH, Kim D-J, Jung J-W, Do J-H, Seo K-H, Kim CH, Song W-G, Lee W-J. Robotic smart house to assist people with movement disabilities. Autonomous Robots. 2006; 22(2):183-98. http://dx.doi.org/10.1007/s10514-006-9012-9
  • Parker C, Tomitsch M. Data visualisation trends in mobile augmented reality applications. Proceedings of the 7th International Symposium on Visual Information Communication and Interaction; 2014 Aug 5-8; Sydney, Australia. USA: ACM; 2014. p. 228-31.
  • Poslad S. Ubiquitous computing: smart devices, environments and interactions. Hoboken: Wiley; 2009. http://dx.doi.org/10.1002/9780470779446
  • Postolache O, Silva PG, Pinheiro E, Pereira MD, Madeira R, Mendes J, Cunha M, Postolache G, Moura CM. Pervasive sensing and computing for wheelchairs users health assessment. Proceedings of the 1st Portuguese Meeting in Bioengineering; 2011 Mar 1-4; Lisbon, Portugal. Lisbon: IEEE; 2011. p. 1-4. http://dx.doi.org/10.1109/enbeng.2011.6026070
  • Ran L, Helal S, Moore S. Drishti: an integrated indoor/outdoor blind navigation system and service. PerCom 2004: Proceedings of the Second IEEE International Conference on Pervasive Computing and Communications; 2004 Mar 14-17; Orlando, USA. Orlando: IEEE; 2004. p. 23.
  • Rehman U, Cao S. Augmented reality-based indoor navigation using Google Glass as a wearable head-mounted display. SMC 2015: Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics; 2015 Oct 9-12; Hong Kong, China. Hong Kong: IEEE; 2015. p. 1452-7. http://dx.doi.org/10.1109/smc.2015.257
  • Rovadosky DS, Pavan W, Dalbosco J, Cervi CR. Uma ferramenta de realidade aumentada usando dispositivo móvel com sistema operacional Android. Revista Brasileira de Computação Aplicada. 2012; 4(1):25-37.
  • Rui Z, Yuanqing L, Yongyong Y, Hao Z, Shaoyu W, Tianyou Y, Zhenghui G. Control of a wheelchair in an indoor environment based on a brain-computer interface and automated navigation. IEEE Transactions on Neural Systems and Rehabilitation Engineering. 2016; 24(1):128-39. http://dx.doi.org/10.1109/TNSRE.2015.2439298 PMid:26054072.
  • Ruta M, Scioscia F, Ieva S, Filippis DD, Sciascio ED. Indoor/outdoor mobile navigation via knowledge-based POI discovery in augmented reality. WI-IAT 2015: Proceedings of the 2015 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology; 2015 Dec 6-9; Singapore. Singapore: IEEE; 2015. p. 26-30. http://dx.doi.org/10.1109/wi-iat.2015.243
  • Sanchez JH, Aguayo FA, Hassler TM. Independent outdoor mobility for the blind. Proceedings of the Virtual Rehabilitation 2007; 2007 Sep 27-29; Venice, Italy. Venice: IEEE; 2007. p. 114-20. http://dx.doi.org/10.1109/icvr.2007.4362150
  • Santarosa LMC, Conforto D, Basso LDO. Eduquito: ferramentas de autoria e de colaboração acessíveis na perspectiva da web 2.0. Revista Brasileira de Educação Especial. 2012; 18(3): 449-68.
  • Sharhan SMH, Zickau S. Indoor mapping for location-based policy tooling using Bluetooth Low Energy beacons. WiMob 2015: Proceedings of the IEEE 11th International Conference on Wireless and Mobile Computing, Networking and Communications; 2015 Oct 19-21; Abu Dhabi, United Arab Emirates. Abu Dhabi: IEEE; 2015. p. 28-36. http://dx.doi.org/10.1109/WiMOB.2015.7347937
  • Sinyukov DA, Ran L, Otero NW, Runzi G, Padir T. Augmenting a voice and facial expression control of a robotic wheelchair with assistive navigation. SMC 2014: Proceedings of the 2014 IEEE International Conference on Systems, Man and Cybernetics; 2014 Oct. 5-8; San Diego, USA. San Diego: IEEE; 2014. p. 1088-94. http://dx.doi.org/10.1109/smc.2014.6974059
  • Stefanov DH, Bien Z, Bang W-C. The smart house for older persons and persons with physical disabilities: structure, technology arrangements, and perspectives. IEEE Transactions on Neural Systems and Rehabilitation Engineering. 2004; 12(2):228-50. http://dx.doi.org/10.1109/TNSRE.2004.828423 PMid:15218937.
  • Takahashi C, Kondo K. Indoor positioning method for augmented audio reality navigation systems using iBeacons. GCCE 2015: Proceedings of the IEEE 4th Global Conference on Consumer Electronics; 2015 Oct 27-30. Osaka, Japan. Osaka: IEEE; 2015. p. 451-2. http://dx.doi.org/10.1109/gcce.2015.7398636
  • Tsetsos V, Anagnostopoulos C, Kikiras P, Hadjiefthymiades S. Semantically enriched navigation for indoor environments. International Journal of Web and Grid Services. 2006; 2(4):453-78. http://dx.doi.org/10.1504/IJWGS.2006.011714
  • Tsetsos V, Anagnostopoulos C, Kikiras P, Hasiotis T, Hadjiefthymiades S. A human-centered semantic navigation system for indoor environments. ICPS '05: Proceedings of the 2005 International Conference on Pervasive Services; 2005 Jul 11-14; Santorini, Greece. Greece: IEEE; 2005. p. 146-55.
  • Xing L, Alpcan T, Bauckhage C. Adaptive wireless services for augmented environments. MobiQuitous '09: Proceedings of the 6th Annual International Conference on Mobile and Ubiquitous Systems: Networking & Services; 2009 Jul 13-1; Toronto, Canada. Toronto: IEEE; 2009. p. 1-8. http://dx.doi.org/10.4108/icst.mobiquitous2009.6821
  • Yayan U, Akar B, Inan F, Yazici A. Development of indoor navigation software for intelligent wheelchair. INISTA 2014: Proceedings of the 2014 IEEE International Symposium on Innovations in Intelligent Systems and Applications; 2014 Jun 23-25; Alberobello, Italy. Alberobello: IEEE; 2014. p. 325-9. http://dx.doi.org/10.1109/inista.2014.6873639
  • Yohan C, Talipov E, Hojung C. Autonomous management of everyday places for a personalized location provider. IEEE Transactions on Systems, Man and Cybernetics. Part C, Applications and Reviews. 2012; 42(4):518-31. http://dx.doi.org/10.1109/TSMCC.2011.2131129

Publication Dates

  • Publication in this collection
    24 May 2016
  • Date of issue
    Apr-Jun 2016

History

  • Received
    23 July 2015
  • Accepted
    11 May 2016