Turn off the graphics: designing non-visual interfaces for mobile phone games

Luis ValenteI,* (*e-mail: lvalente@inf.puc-rio.br); Clarisse Sieckenius de SouzaII; Bruno FeijóI

IVisionLab, Department of Informatics, Pontifical Catholic University of Rio de Janeiro - PUC-Rio, 22453-900, Rio de Janeiro, RJ, Brazil

IISemiotic Engineering Research Group - SERG, Department of Informatics, Pontifical Catholic University of Rio de Janeiro - PUC-Rio, 22453-900, Rio de Janeiro, RJ, Brazil

ABSTRACT

Mobile phones are a widespread platform for ICT applications because they are highly pervasive in contemporary society. Hence, we can think of mobile gaming as a serious candidate to become a prominent form of entertainment in the near future. However, most games (for computers, consoles and mobile devices) make extensive use of the visual medium, which tends to exclude visually-impaired users from play. While mobile gaming could potentially reach many visually-impaired users, who are very familiar with this technology, currently there seem to be very few alternatives for this community. In an attempt to explore new interactive possibilities for such users, this work presents an initial study on non-visual interfaces for mobile phone games. It is based on Semiotic Engineering principles, emphasizing communication through aural, tactile and gestural signs, and deliberately excluding visual information. Results include a number of issues that can be incorporated into a wider research agenda on mobile gaming accessibility, for both visually-impaired and sighted players.

Keywords: Mobile non-visual games, accessibility, audio games, haptics and gestures, visually-impaired users, semiotic engineering.

1. Introduction

Non-visual games for mobile platforms are rare. Exploring this technology has at least two appealing characteristics:

• Including visually-impaired users in the game play; and

• Providing new kinds of gaming experiences for sighted players.

Furthermore, this exploration must take into account the differences in hardware and software between the desktop and mobile phone platforms. The latter has the advantage of being a convergent device (several input and output devices in the same apparatus) and the drawback of suffering severe limitations (e.g. restricted computing power, no room for more powerful microprocessors and batteries because cooling systems are nonexistent, use under adverse environmental conditions, and huge diversity amongst cell phone models). In the present work we try to define the simplest level that characterizes a non-visual game on a mobile phone platform.

This paper reports the results of preliminary research steps with non-visual (and non-verbal) games for mobile phones. Our main objective is to start learning about the challenges and opportunities involved with this kind of technology. We want to contribute to the design and development of accessible mobile phone games for visually-impaired users, who are currently unable to play games on that platform due to a lack of accessible interfaces. At the same time, we want to explore new kinds of mobile gaming experiences for sighted users, who are used to playing games where visual signs prevail.

Visually-impaired players and sighted users are two different populations with strong differences in space modeling ability, 3D perception capacity, and sound perception accuracy. When we consider both populations in this work, we are trying to understand the first general layer of non-visual interfaces for mobile phone games. However, we are aware that further investigation will be required in designing games for visually-impaired people only and for different age groups.

Because the technology we describe has no precedent, we decided to build a prototype mobile phone game and perform a qualitative study with a small group of participants, some of them sighted and some of them visually-impaired.

The mobile game prototype was built using principles of semiotic engineering1 for the design of a non-visual and non-verbal interactive language. This language expressed the gist of our design intent, and an additional aspect of our empirical study was to find out whether the users understood the essence of the design message encoded in the artifact. The most salient feature of the design is the complete absence of visual signs. All game interaction is based solely on aural, gestural and tactile signs. We avoid extra modalities, such as voice interaction, because we want to isolate the simplest core of non-visual games.

Our preliminary results point to a number of interesting research issues that do not seem to have been addressed systematically or jointly to this date. They range from physical interaction to mental representations, and from emotional engagement to social inclusion.

In the next section we discuss aspects of game HCI that are important for the present work. Then, in the third section, we discuss related research. In the fourth section we present the semiotic engineering of the game interface, including a brief description of the prototype game, The Audio Flashlight. Finally, in the last two sections we present a preliminary evaluation of the game play (fifth section) and our current conclusions (sixth section).

2. Game HCI

In recent years, game graphics and sound have reached an amazing level of realism and received most of the attention from the game community. However, computer game research still lacks a robust theoretical foundation, in spite of games themselves being as old as human culture19. Modern game design fundamentals2,37 have greatly expanded the early theoretical concepts of Chris Crawford9, but they are still far from being a complete conceptual framework. One of the areas that needs further development is HCI in computer games43, which is surprising, because interaction shares many of the characteristics that prevail in games. Crawford9 even says that "interactiveness is an index of gaminess".

The traditional focus of HCI has been set almost exclusively on usability, which emphasizes ease of use and productivity for accomplishing tasks. Software interfaces should be easy to learn, use, and master, which is somewhat the opposite of games, which are usually easy to learn but difficult to master29. Barr and co-authors4 point out that computer games are not made to support external, user-defined tasks, but instead define their own activities for players to engage in. Understanding these differences is the starting point for Game HCI.

Among a number of different research topics in Game HCI, works on game semiotics31, heuristics10, accessibility20, and presence36 deserve special attention from the game research community and industry.

Semiotic analyses of computer games have been done by several researchers. The work by Myers31 very convincingly characterizes playing computer games as a form of semiosis. Caldwell8 analyzes the user interface of a particular turn-based strategy game (Civilization II) using semiotics. But, as far as the authors know, the present study is the first one to employ semiotic engineering principles specifically to designing mobile games.

Heuristics are rules of thumb used as general principles for game design and playability evaluation. Desurvire et al.10 propose playtesting heuristics for the evaluation of game usability (interface and interaction devices), game play (problems and challenges that players must face to win a game), game story (plot and character development), and game mechanics (environment interaction programming). More abstract game characteristics have been considered elsewhere29, such as fantasy and curiosity. Korhonen and Koivisto22 propose playability heuristics for mobile games. However, as far as the present authors know, there is no work on heuristics for non-visual mobile games. This subject is in our plans for future research.

Presence is key for games. It is the perception of being in a particular space or place. Presence has been studied from different perspectives5,27,42, but most of the approaches are related to the sensorial experience of users in general-purpose virtual environments. Specific research on presence in games, however, is scarce36.

Inspired by Huizinga's playground19 and Roger Caillois' "second-order reality"7, Salen and Zimmermann37 propose the concept of the Magic Circle, the mental universe created when the player enters the game. The Magic Circle can be greatly expanded by the player's imagination, which allows users to experience convincing collisions through simple joystick vibrations, or even the sensation of flying over the scene without any sort of feedback.

Liljedahl and co-authors25 propose a term related to presence that they call the "scary shadow syndrome". This term was conceived after observing horror films, and means that an event may have a greater impact when imagined than when seen. As an example, they say that "a suspicious shadow can trigger a strong emotional response in an audience"25, because the audience can imagine all sorts of "bad things" that could possibly be associated with that event.

The prevalence of sight over other senses in gaming is widespread. It becomes clearer when we think that many games have an option to "turn off sound" completely. In this case, the game can still be played and enjoyed. Sound can be regarded as a subsidiary resource. But, how many games have the option to turn off the graphics?

Haptics have also been extensively used to enhance the sensorial experience of gamers. Video game consoles have supported "force feedback" joysticks and other input devices since early works in the second half of the 1990s. Ouhyoung and co-authors32 presented a game-like flight simulator with vibration feedback in 1995. In 1997 Nintendo released the Rumble Pak, an accessory that connects to the Nintendo 64 joystick to produce tactile feedback. More recent and complex haptic-based games can be found elsewhere14. However, there are few works on haptics for mobile phones38,40.

Using gesture commands in games had a breakthrough with the release of the Nintendo Wii. Instead of the traditional joystick input commonly found in games, the Nintendo Wii uses input devices such as wireless controllers (the Wii Remote) with built-in 3D accelerometers and optical sensors, which make it possible to detect movements with one hand (while the other hand can use the Nunchuk unit, an analog stick with motion sensing). Another kind of input device for the Nintendo Wii, the Wii Balance Board, can be used to sense movements of the user's body.

2.1. Mobile games

Mobile game design is several steps behind personal computer (PC) and console game design. In particular, mobile phones still remain a casual gaming platform despite recent technological advancements6. Problems with haptics and game accessibility are especially acute.

In the mobile world, haptics has typically not been used in game interfaces, despite the built-in motors available in many phone models. Only recently has it caught the attention of the mobile game market, through such initiatives as the VibeTonz system40. VibeTonz provides a tool to implement mobile phone applications that use haptic feedback.

Mobile phones probably are the most pervasive kind of device nowadays. This opens up the possibility of reaching out for a large user base, visually-impaired people in particular.

Traditionally, PC and console games often feature advanced graphics, physics and AI simulations due to the high processing power available on those platforms. This is not the case with mobile phones, which have simpler hardware and limited input methods. For example, mobile phones (compared to PCs) have low processing power and tiny screens. Another issue is that phones have been designed mainly for making calls, and thus usually have keypads optimized for that purpose. In many cases, such keypads cannot detect when two or more keys are pressed simultaneously, making it difficult to design key-based interfaces for action games.

However, mobile phone design is getting more and more sophisticated when it comes to feature convergence. For example, the mobile phone used to test the game prototype features two cameras (for pictures and video), a music player, network connections (3G, WiFi and Bluetooth), an acceleration sensor, a GPS receiver, reasonable storage capacity (up to 4 GB), and other accessories.

This opens up new possibilities for designing games that exploit the above-mentioned features of mobile phones, one being that mobile phones are always connected to some network. Thus, developing multi-player games for those devices is a natural move, and can be an opportunity to increase social inclusion and interaction among players (both sighted and visually-impaired).

2.2. Mobile games and accessibility

According to the IGDA20, Game Accessibility can be defined as the ability to play a game even when functioning under limiting conditions, which can be functional limitations, or disabilities - such as blindness, deafness, or mobility limitations.

The work by Glinert and Wyse18 claims that there are fewer than 300 games available for the visually-impaired that are known to the general public. Compared to the number of regular games, this is very few. An important source of games for users with special needs is the AudioGames web site [www.audiogames.net]. In the specific case of visually-impaired users, games roughly fall into three categories16: games not designed to be accessible (like conventional games); games designed to be accessible (like audio games); and games adapted to be accessible.

Although audio games are designed to be accessible, they are not necessarily designed specifically for visually-impaired users. Because they do not rely on visual information, they seem to be an excellent alternative for such users. However, because not much is known about other needs and desires that visually-impaired players might have, audio games still represent an open field for research, as suggested by Friberg and Gärdenfors15, and Drewes and co-authors11, for example. None of these proposals, however, focuses on games for mobile phones.

Accessible mobile games are scarce. Although mobile devices are rapidly becoming more powerful, with more memory, more processing power, and more multimedia functionalities, such resources have not yet been used to promote accessible gaming interfaces. Additionally, as is the case with most accessible technologies39, exploring non-visual game interaction clearly opens up novel possibilities for sighted users as well.

2.3. Benefits of non-visual games

Research on non-visual games can generate various kinds of benefits:

• An opportunity to include a visually-impaired audience in the play, by fostering game designs and environments that boost their participation;

• An opportunity for gameplay innovation in trying to represent the game environment, characters and events, using audio and tactile feedback;

• An opportunity to explore and exercise other senses;

• An opportunity to create more personalized experiences, as people imagine things in different ways; and

• An opportunity to increase the immersive experiences in games.

Some authors13 point out that accessibility is the main motivation for research on game audio. However, another important motivation may be finding new means for richer interaction. For example, exploring audio and haptics may lead to innovative game designs that minimize issues like display limitations on mobile phones.

3. Related Work

In this section we describe work on non-speech signification systems using semiotics, non-visual games for PCs, mobile phone experiences with haptics and gestures, and game accessibility.

Using semiotic theories to design sonic signification systems is a strategy previously used by Pirhonen and co-authors35. The authors propose a design method for non-speech sound systems based on structural semiotics, using syntagmatic analysis. They specifically discuss the needs of visually-impaired users, but focus on web accessibility, not games - and interacting with games is considerably different from navigating the web.

Similarly to the previously mentioned work, Murphy and co-authors30 resort to structural semiotics as a theoretical basis for designing non-speech signification systems. They discuss the combination of audio and haptics to convey information to visually-impaired web users.

With respect to PC games exploring the experience of sighted players, the works of Liljedahl and co-authors25 and Drewes and co-authors11 provide interesting examples. Liljedahl and co-authors25 describe a PC game, called Beowulf, where the player wanders in a dark labyrinth composed of tunnels and caves, inhabited by monsters and other dangerous characters. The player uses audio to navigate the environment and to fight enemies. Moreover, players hear the audio as if they were actually located in the environment (a "first-person audio game"). The authors define the game as an "audio-mostly game", because the majority of game play is driven by audio and not by graphics. The visual information comprises a representation of the game world map with no details, displaying the areas already explored by the player.

One of the goals of Beowulf's authors25 was to explore the players' imagination by "not telling or showing everything", letting them interpret the game experience by themselves. They claim that by using this strategy they are contributing to increase the "suspension of disbelief", the user's willingness to accept his/her experience as true even if it seems unreal or impossible. Their goal was to explore an immersive experience based on audio. They sum up the whole idea by defining the "scary shadow syndrome" concept, something that we have also identified in our game, The Audio Flashlight.

Liljedahl and co-authors25 present a classification of the sound elements that can be used in game design. This classification is not based on semiotic theories, and their design does not include music and narrative elements. They state that "general design principles and methodologies are still very much to be developed in the field of sound design".

Beowulf is not an audio-only game project for visually-impaired people, but an investigation on gameplay in situations almost entirely driven by audio and only marginally by visuals. The Beowulf authors want to investigate whether the player will be freer to create his/her mental universe in an "audio-mostly game". The present work is influenced by the Beowulf game, but it goes in a different direction when it focuses on the complete absence of visuals, includes haptics, and considers both visually-impaired and sighted users.

The work by Drewes and co-authors11 describes Sleuth, an audio game for PCs where the player plays the role of a detective. The game begins with a narrative that describes what happened in the game scene and places the player in that role. As a detective, the player wanders through many rooms in the game, trying to gather evidence to help him/her solve the mystery. The evidence is brought to the player in the form of audio messages. Later, the player can try to guess who the murderer is, what the weapon was, and where the crime took place. If right, s/he wins the game.

In Sleuth the authors explore spatialized audio perception (through cue design and identification), navigation feedback, and narrative elements. The navigation feedback consists of the system telling the user when certain events happen. For example, sound is used to notify the player when s/he is walking, or has hit a wall. The authors also stress the recommendation to avoid overloading audio with too much information. This is even more critical for non-visual games, where players cannot use the visual system to resolve ambiguous audio information.

In spite of being an audio-only game, Sleuth does not seem to have been designed for visually-impaired people. In their evaluation, the researchers hand the players paper maps with a "blueprint" representation of the game world (unnamed rooms and doors), and a checklist of possible weapons, room names and game characters. Then they ask the players to annotate the map, which helps them orient themselves and infer the right answers by naming the visual representations on the map.

Regarding games that explore accessibility issues, the work by Friberg and Gärdenfors15, and Glinert and Wyse18 bring up interesting insights.

Friberg and Gärdenfors15 explore three audio-based games (for PCs) initially targeted at visually-impaired children. Their goal is to explore possibilities and to find new approaches for designing game audio, and they propose a model inspired by film music conventions.

They use a categorization system for different kinds of sound that may exist in a game. They also present a semiotic analysis of sound objects in their games. Their work is based on that of a cartoon and comic scholar - Scott McCloud - who proposed a semiotic model of visual vocabulary based on Charles Peirce's triangular sign structures34. Friberg and Gärdenfors stress the importance of providing feedback to player actions through the auditory interface. In our case, this is even more important given the subtlety of movement variations when commanding the game through gestures only.

Glinert and Wyse18 discuss the AudiOdyssey game, an accessible PC game with audio and visual information. The game aims at offering a multi-player (online) environment where sighted and non-sighted players can play equally and, in the authors' words, "with the same level of challenge and sharing a common game experience". This work is concerned with the very issue of accessibility for games.

The game itself is defined as a "rhythm game", where the player takes the role of a DJ whose task is to keep the audience happy on the dance floor. The player generates sound in real time by responding to the music beats. The player interacts with the application using the keyboard or the Nintendo Wiimote. Using the latter input device makes it possible for visually-impaired (and sighted) people to play the game with a more natural and intuitive interface. Listening to the speakers, players get cues that guide them on how to swing the Wiimote in order to play.

Several works have investigated haptics focusing on visually-impaired people. A comprehensive survey on this matter can be found in a report by Vincent Lévesque24. However, there are only a few experiences with games. Iglesias and co-authors21 presented an adventure and search game using a system in which the interaction mechanism relies on two robot arms. This system, called GRAB, is an innovative haptic and audio virtual environment for visually-impaired people, but it was designed with a focus on virtual environments, not mobile applications.

Regarding mobile phones, the works of Linjama and Kaaresoja26, Gilbertson and co-authors17, Baek and Yun3, Ur-Rehman and co-authors38, and Ekman and co-authors13, provide interesting examples on games, haptics, and gestures.

Linjama and Kaaresoja26 explore gestural input and tactile feedback on mobile phones, although not for playing games. The authors describe their "bouncing ball" demo, where the user taps the device to change the orientation of the ball. The demo responds with vibration when the ball hits the walls. The ball is always visible on the phone screen.

Gilbertson and co-authors17 conclude that tilt movements of mobile phones require minimal signal processing and no external references, and as such are especially suitable for mobile phone applications. They explore a tilt interface for a 3D first-person driving game called Tunnel Run and compare the user experience with the same game using a traditional phone joystick interface. Their results are consistent with our conclusions about player's fun and difficulties with an interface control based on tilting gestures.

Another interesting approach to mobile phone interface control can be found in the work by Baek and Yun3, which proposes a state machine algorithm for sequence-action recognition using the mobile phone's accelerometer. This is a promising alternative that we want to explore in future work.

Ur-Rehman and co-authors38 describe a system that translates information from a live soccer game (non-interactive, as on television) into vibration sensations on a mobile phone, to convey what is going on in the match. However, as is the case with the work by Linjama and Kaaresoja, ur-Rehman and co-authors are interested in tactile feedback as a complementary mode of communication and representation to be used with visual modes.

The work by Ekman and co-authors13 describes a location-based mobile phone game, The Songs of the North, which is based on Finnish mythology. This game was not designed for visually-impaired users, but it takes audio as the primary information channel. The authors explore audio as a medium to convey information about places, characters, objects and actions in the game. One of the goals of the project is to provide enough information about the game, so that the player is able to interact without having to look at the phone screen. The authors warn us, however, that using sound to convey game information is still an unfamiliar approach to some sighted players. This is something that we wanted to explore with our experiment.

They also note that due to the technical characteristics of mobile phones in general, there are some limitations on alternatives for creating immersive experiences in this platform. Visual and audio information, as commonly found in PCs, are not likely to be as expressive in mobile phones. This means that it is necessary to come up with innovative approaches to reach desired immersive effects. Hence, exploring new ways of expression using audio and haptics may be a promising alternative.

Finally, exploring mental imaging is one of the topics discussed by Lumbreras and Sánchez28. The authors provide a framework to describe and implement virtual audio environments, so that it can be used in games, for example. They describe an audio game (for PC), and one of their goals when testing the application was to have the users (visually-impaired children) reconstruct the virtual environment with LEGO blocks, to verify whether the users' perceptions matched the physical environment modeled by the researchers. Hence, their focus is slightly different from ours.

We share some of the concerns found in that work, such as: how audio-based entertainment can help creating cognitive spatial structures in the minds of visually-impaired people; how to describe an acoustic navigable environment; and whether haptic/acoustic correlation helps spatial navigation naturally.

4. The Semiotic Engineering of the Audio Flashlight

The Audio Flashlight is a "treasure hunt" game we have developed for the purposes of this particular research study. The game takes place in a dark room, where the treasure is lying somewhere.

While inside the virtual room, the player cannot see anything. All s/he can use to find the treasure is a special device called "The Audio Flashlight". This device can be regarded as a kind of radar that guides the player to the treasure through sound.

Occasionally, the player may bump into walls or other internal obstacles that lie around the room. The player should dodge these obstacles and keep walking in search of the treasure. Figure 1 illustrates a typical map for a room in this game. The map in Figure 1 is only a representation for illustrative purposes in this text, as the game does not display anything on the screen.


The platform chosen to test the game concept is a Nokia N95 mobile phone33.

Following the trend of some previous works, ours was also inspired by semiotics. However, unlike authors who have resorted to fundamental semiotic theories, we have used Semiotic Engineering1, a theory of Human-Computer Interaction with semiotic foundations stemming mainly from the work of Peirce34 and Eco12.

The main tenet of Semiotic Engineering is that interactive systems designers actually communicate with users (at interaction time) through computer systems interfaces. Interfaces act as the designers' proxies (the designers' deputy, according to the theory). Thus, when designing any system's interface, designers are actually deciding what kinds of conversations they will have with users, using which modes and media, and for what purposes.

Given the exploratory goals of this project, we selected a simple "treasure hunt" game for mobile phones as the test-bed for exploration, The Audio Flashlight. Our design challenge was to communicate to our users the whole idea and experience of the game without using a single visual sign. Instructions on how to play and game initialization procedures were provided by the game designer in person, and fall outside the scope of the current study.

Our first step was to identify the critical meanings we wanted to convey to the users. We decided we needed the following kinds of signs:

• A sign to represent that the game has begun;

• A sign to represent that the user has decided to abandon the game;

• A sign to represent that the user has accomplished the game goal (i.e. has found the treasure);

• A sign to indicate that the user is walking around the environment;

• A sign to represent that the user is doing nothing;

• A sign to represent the current state of the user relative to the game's goal;

• A sign to indicate every single action that the user can take;

• A sign to represent the obstacles that may hinder the user's walk.

4.1. Choosing the signs

An important step in the Semiotic Engineering of the game's interface was to choose the appropriate signification system(s) that the user and the designer (i.e. the system's interface) will use to communicate with each other, excluding systems that rely on visual representations.

Signification system choices must be based on cultural conventions usually associated with the messages that need to be communicated in the game. Otherwise, users will be required to learn an unfamiliar and arbitrary signification system to play the game, which is surely a source of usability problems. We resorted to visual game interfaces, which often apply such cultural conventions as ancillary reinforcements to communication. For example, to increase the perceived sensations and emotional setting in the game, sound and music may be used, as well as tactile signs of different sorts. The role of cognitive metaphors23 is particularly important in this setting.

4.1.1. Game events

The sign to represent that the game has begun corresponds to an "opening door" sound, relating to the metaphor of "entering an environment".

The sign to represent that the user has decided to abandon the game, before or after finding the treasure, corresponds to a "closing door" sound.

The sign to represent that the user has accomplished the game goal corresponds to a sound conveying "applause".

Note that all three meanings are conveyed in aural mode, based on signs that express primitive metaphors associated with their object.
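As an illustration only, the mapping from these game events to their aural signs can be thought of as a simple lookup table. The following sketch is a hypothetical Python fragment; the file names and the play_sound helper are assumptions made for the example, not the prototype's actual implementation.

# Hypothetical event-to-sound mapping; file names are illustrative assumptions.
EVENT_SOUNDS = {
    "game_started": "opening_door.wav",    # "entering an environment" metaphor
    "game_abandoned": "closing_door.wav",  # leaving the environment
    "goal_accomplished": "applause.wav",   # the treasure has been found
}

def play_event(event, play_sound):
    # play_sound stands in for whatever audio API the platform provides.
    play_sound(EVENT_SOUNDS[event])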

4.1.2. The radar

The sign to indicate the current progress of the user toward the game goal is represented with musical patterns, through volume and rhythm variations. The metaphor here is that of an "audio radar". Hence, the player must use his/her sense of hearing for orientation within the environment.

The audio radar is designed as a set of music files with varying volume and rhythm. Figure 2 illustrates a schematic view of the audio radar.


The music spectrum is divided into five musical patterns. All of them are very similar, but they differ in rhythm. The radar selects the pattern according to the distance between the player and the hidden treasure. The closer the player gets, the faster the music plays. The radar also changes the music volume using the same strategy: the closer the player gets to the target, the louder the music. Thus we achieve a redundancy of dimensions within the same signification system - volume and rhythm reinforce each other in conveying the user's status with respect to the ultimate goal in the game.
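A minimal sketch of how such an audio radar can be driven is given below. It is an illustrative Python fragment, not the prototype's code: the number of patterns matches the design described above, but the room size, the distance normalization and the volume range are assumptions made for the example.

import math

NUM_PATTERNS = 5          # five similar musical patterns, differing in rhythm
MAX_DISTANCE = 100.0      # assumed extent of the virtual room (arbitrary units)
MIN_VOLUME, MAX_VOLUME = 0.2, 1.0

def radar_signal(player_pos, treasure_pos):
    # Map the player-to-treasure distance to a pattern index and a volume, so
    # that rhythm and volume redundantly encode the player's progress.
    distance = math.hypot(treasure_pos[0] - player_pos[0],
                          treasure_pos[1] - player_pos[1])
    closeness = max(0.0, 1.0 - min(distance, MAX_DISTANCE) / MAX_DISTANCE)
    pattern = min(NUM_PATTERNS - 1, int(closeness * NUM_PATTERNS))
    volume = MIN_VOLUME + closeness * (MAX_VOLUME - MIN_VOLUME)
    return pattern, volume

# Example: roughly halfway across the room yields a mid-tempo pattern at medium volume.
print(radar_signal((10.0, 10.0), (60.0, 10.0)))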

The music was designed to help create a tension aura in the scene, following the concept explained by the "scary shadow syndrome", and thus contributing to improve the immersive experience of the game.

4.1.3. Player actions

The signs that represent user actions are expressed through a gestural interface.

In the game, the user can walk around the environment in four basic directions: forward, backward, left, and right. The user communicates this command to the game by tilting the device, i.e. turning the mobile phone screen in the desired direction. For example, when turning the mobile phone screen toward his/her chest, the player walks backward. When turning the mobile phone screen forward, as if trying to point at something on the ground, the player walks forward. Figure 3 illustrates these tilting gestures. The player is not required to walk physically in the environment to play the game.


While the player is walking, s/he hears the sound of footsteps at a constant pace. The player keeps walking as long as the phone screen remains tilted in the desired direction. The player stops walking by positioning the phone with the screen facing up, parallel to the ground. The sign to indicate that the user is idle (not walking) is "silence": the footstep sound is not heard.

The gestural sign to communicate that the player wants to abandon the game is to position the phone with the screen facing the ground.

The main motivation to adopt a gestural interface is to provide a more natural way to interact with the phone. Compared to pressing buttons to signal those commands, for instance, gestures are clearly more direct and expected to be easier to perform.
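A minimal sketch of this gesture-to-command mapping is shown below, assuming the phone's accelerometer readings have already been converted into pitch and roll angles. The axis conventions, the threshold value and the command names are assumptions made for the example; they are not taken from the prototype's implementation.

TILT_THRESHOLD = 0.35  # assumed minimum lean before a move command is issued

def command_from_tilt(pitch, roll, face_down):
    # pitch > 0: screen leaning forward; pitch < 0: screen toward the chest.
    # roll  > 0: leaning right;          roll  < 0: leaning left.
    # face_down: True when the screen faces the ground (abandon the game).
    if face_down:
        return "QUIT"
    if abs(pitch) < TILT_THRESHOLD and abs(roll) < TILT_THRESHOLD:
        return "IDLE"  # phone held flat, screen up, parallel to the ground
    if abs(pitch) >= abs(roll):
        return "FORWARD" if pitch > 0 else "BACKWARD"
    return "RIGHT" if roll > 0 else "LEFT"

Note that this sketch resolves diagonal tilts to the dominant axis, which is one possible reading of the grid-like movement refinement discussed later in the evaluation.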

4.1.4. Obstacles

The sign to indicate the presence of obstacles is represented using haptics, through the vibration feature of the mobile phone.

There are two kinds of obstacles in the game: the room boundaries and internal obstacles. Room boundaries are represented with long (and stronger) vibrations, while internal obstacles are represented with short (and weaker) vibrations.

The motivation to use vibrations is to associate the idea of "physical collision" with the physical sensation provided by vibration.
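The distinction between the two collision signs could be realized roughly as in the sketch below, where vibrate stands in for the phone's actual vibration API, and the durations and intensities are assumed values for illustration only.

def vibrate(duration_ms, intensity):
    # Placeholder for the platform's vibration call (duration in ms, intensity 0..1).
    pass

def signal_collision(obstacle_kind):
    # Room boundaries get a long, strong pulse; internal obstacles a short, weak one.
    if obstacle_kind == "wall":
        vibrate(600, 1.0)
    else:
        vibrate(200, 0.5)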

In sum, we notice that the system communicates with the user through aural and tactile signs, while the user communicates with the system through gestures. The game interface designer decided on this distribution guided first by a general constraint on the nature of the signification systems' modes, and then by the most salient metaphors available in non-visual modes of signification for mutual communication.

5. Preliminary Evaluation

To examine the quality of the Semiotic Engineering of The Audio Flashlight, we carried out an empirical pilot study with seven participants. They played the game prototype in a Nokia N95 phone provided by the researchers for the test. The goal of the evaluation was to indicate some initial directions for further research, both in terms of method and issues to explore.

Among the seven people who volunteered to do the test, three of them were sighted, and four were visually-impaired. Table 1 lists the players' profiles.

Two of the sighted players were gamers, very used to playing (traditional) computer games. They are also game developers. The other sighted player is a casual gamer.

Three of the visually-impaired participants were totally blind, and the fourth one had sub-normal sight. This impairment made it severely difficult for this person to see images and graphics on mobile phone screens and keys, but did not prevent her from moving around by herself without special aid. One of the visually-impaired participants (V4) was born with sub-normal sight and then lost his sight at the age of 21. Another participant (V2) was born with normal sight and also lost his sight at 21, because of a severe disease.

Regarding the visually-impaired participants, players V2, V3, and V4 are not currently active gamers, although players V2 and V3 reported playing simple games on a PC sometimes. Player V4, in particular, reported having played games before he lost his sight completely (six years before he participated in the study). V1, a participant with sub-normal sight, reported that she occasionally played games on a PC.

None of the play-testers had ever played a non-visual game like The Audio Flashlight.

5.1. Methodology

Qualitative research methods are particularly useful in the preliminary stages of research with innovative artifacts or concepts. Unlike quantitative studies, which test a priori hypotheses formulated by researchers in the form of "yes/no" questions, qualitative studies tend to have no a priori element other than an open-ended research question.

We therefore chose a qualitative method for our research at this stage. The research question we wanted to investigate was: "What do sighted and visually-impaired people experience when playing with a non-visual mobile phone game?" Two techniques were used for data collection: direct observation and interviews. Volunteer participants were invited and selected according to a maximal variation strategy - we wanted to work with a small group, as is the case with most preliminary evaluation studies, but still have the opportunity to cover many different types of potential users. Gamers and developers, for instance, were selected for their expert knowledge and fine appreciation of technology. The sighted non-gamer and the person with sub-normal sight were chosen as representatives of two classes of "relatively unbiased" users who have other options. Finally, among the other visually-impaired users, we had a considerable variation of age, experience, and history of impairment.

After the participants consented to make the test, the evaluation was carried out according to the following sequence of steps:

• the evaluator read aloud the game instructions to the players;

• the evaluator conducted a pre-game interview to gather information about the participant's experience, expectations, and history of impairment (the latter only for visually-impaired participants);

• the participant played the game; and

• the evaluator conducted a post-game interview to capture the reported game experience and the participant's suggestions and comments.

The game instructions were read aloud so that all the relevant signs of the game were fully explained, in the same way, for all players, under the same conditions. The evaluator also made a brief demonstration of how the vibration signs felt, so that the players could check that they were feeling the right (intended) thing during the play. As a "surprise" ingredient in the game, the instructor did not tell participants what sound would play when they found the treasure. He just told them "they would know" when they found it.

The preliminary interview consisted of questions that were slightly different for sighted and visually-impaired participants.

What we wanted to know from participants (and thus constituted the interview script) had to do with:

I. General gaming

a) Did they play digital games? Which games? (Why not?)

b) Did they use mobile phones? How? Other mobile ICT devices?

c) If so, did they play mobile phone games? Which games? (Why not?)

d) Were they game developers?

II. Familiarity with non-visual games

a) Had they ever played a non-visual game? If so, how was the experience?

b) What were their expectations regarding the game play that was about to begin?

What we wanted to hear from visually-impaired participants was slightly different:

I. The nature and history of their disability (viz. when it was acquired).

II. Their former experience with any form of accessible digital entertainment.

a) If they had no such experience, what expectations they had for the game play that was about to begin.

III. How they used mobile phones (if at all):

a) What functions they used most and how.

b) What did they find hard and easy to use on a mobile phone? Why?

After this brief interview the evaluator started the game and let the participant play at leisure. There was no time limit for the play, although researchers had decided previously to gently close down the experiment should the participant take too long to find the treasure and get bored or tense.

The virtual room configured for the game test is shown in Figure 1. Notice that this visual representation was never displayed to sighted or non-sighted players; it is only a schematic representation of the spatial configuration parameter for the play. The players started at the lower-left corner of the map, although they did not have this information. All they knew was that they had just opened the door of a completely dark room, where they could not see anything, but should orient their treasure hunt using the signs emitted by the audio flashlight. This guaranteed that sighted and non-sighted players had the same spectrum of interactivity in the play.

After the test, another short interview followed, in which we wanted to capture their report on the following:

I. What was the experience like for them

a) Easy or hard? Why?

b) Entertaining or not?

II. Did they feel immersed in the game?

III. Would they play this (or a similar) game again?

IV. Could the game be made more challenging? How?

V. What did they think about the non-visual representations in the game?

VI. Did they think that this interaction style could be used in other applications or situations? Which ones? Why?

VII. Did they have further suggestions, comments, or questions?

Interviews were recorded in audio, and all test sessions were videotaped. All participants were sitting in a quiet place (typically a closed room), holding the mobile phone close to their lap.

The results of the experiment are reported in the next subsections, categorized according to dimensions that are directly related to non-visual games interfaces.

5.2. Overall experience

All the play testers except V3 had no particular difficulty in finding the treasure, although only one (V2) said that playing was easy. Here are excerpts from the interviews.

"I think I was lucky." S2

"I found it moderately difficult." S2

"Does the game pose the same difficulties to all players? Cause I found it easy. I got to the treasure really fast." V2

"[Playing was] neither easy, nor difficult. I had to get used to playing. I was a bit confused with the vibration; it took me a short while to figure out how it worked." V4

The evidence above is particularly revealing if we look at the spatial configuration of the game in Figure 1, and see how easy the game would be with a visual interface. Also, we should note that the only participant who was emphatic about the game being easy was V2, a person who is completely blind.

One of the sighted players (S2), who was a gamer, was able to find the treasure faster than the other participants, although he was closely followed by the next fastest player (V4), one of the visually-impaired participants. This strongly suggests that this modality of playing leveled out the conditions of the sighted and non-sighted players we observed. Player V3, however, was not able to find the treasure. Although he said that he found the game "easy to play after some training", the interview clearly showed that he did not fully understand the audio cues (see below).

The players who found the treasure accomplished the task in 149 seconds (2 minutes, 29 seconds) on average. Table 2 lists the timings for the play sessions. In the case of player V3, we decided to gently close down the experiment to prevent boredom and tension for the participant.

Although all participants said they had high expectations and curiosity regarding what was about to happen in the play, they all reported having enjoyed the game, and found it very interesting because of its unconventional approach. Here are some interview excerpts:

"Very cool." S2

"Fun to play!" V4

"Congratulations! Great idea!" V2

An unexpected revelation was made by V1, who reported that she enjoyed the play, especially in comparison with "the boring game that I can play on my mobile", but added:

"In the beginning I was somewhat distressed because I couldn't see anything."

This was a participant with sub-normal sight, for whom being completely deprived of sight certainly has a different emotional meaning than that for sighted and blind people.

5.3. The footstep sound

One thing that stood out in the tests was how the players relied on the footstep sound to know what they were doing. Listening to this sound was critical to differentiate between walking and being idle.

In traditional games, where commands are activated by pressing phone keys, the "no action" state corresponds to not pressing anything. This is a clear sign for "idle". However, with the gestural interface, the "no action" state is much more subtle to command (at least in this game). Participants did not always realize that the phone in their hand was no longer tilted forward or backward, but was actually turned up, parallel to the ground.

The footstep sound was intentionally played at a lower volume than the music when the player was very close to the treasure. The researchers wanted to check whether the excitement of the music playing faster and louder would take over and suggest that the participant was moving towards the target. However, all the players reported getting lost in this situation.

This suggests that the footsteps were the primary system feedback that provided the constant engagement conditions for users to keep playing, as Friberg and Gärdenfors15 had already proposed.

One sighted player reported that indicating explicitly that he was walking in the room (with the footstep sound) was a much better alternative than using an implicit indication (like the system "doing" nothing until the player hits an obstacle or a wall, for example). However, player V4 was a bit confused with the interpretation of move commands when facing an obstacle. There, the sound of footsteps gave way to vibration, a sign of collision. At a certain point during the game, V4 felt continuous vibration while thinking he was holding the phone in the idle position. See the following excerpt of his post-test interview:

V4: "I was confused in the beginning because ...What was it again? Oh, yes! It had stopped [walking], but there was still vibration. I could not figure out why."

Interviewer: "Well, in fact the stop position was slightly different from how you were holding the phone."

V4: "Oh, I see."

Interviewer: "You were not really standing still - you were trying to move and bumping against something".

5.4. The music

The music in The Audio Flashlight was intentionally designed to build a tension aura to the scene.

Participants reported that the music helped to create tension in the game, and contributed to create excitement, as defined in the "scary shadow syndrome"25.

They also found the audio radar a useful tool to guide them to the treasure.

One of the participants (V3) stayed idle for long periods of time. So, sometimes the same music ended and started over again. This restart created a gap in the sound, and he thought the gap was due to some game event. The other participants did not experience this problem because they were constantly moving around (and so the music changed often).

Player V2 reported that the music guidance was very useful and clear for him, and made his task very easy.

5.5. Immersion

Participants also reported that the game was very immersive. Sighted participants said that despite the absence of graphics, they still felt immersed. One of them even got startled when he was walking in the room and felt a vibration due to a wall collision. In the interview he said:

"Oh, yes! I was so immersed I closed my eyes to focus on the game." S3

A sighted participant even said that the immersion experience was similar to the one with a graphics-intensive game of the same kind, despite the absence of graphics. He said:

"I think that using only your ears and hands is a completely innovative way to play. [...] And it's fun. [...] The greater the innovation, the greater the fun, because if you keep your eyes closed you will hear better, you will be more concentrated, your results will be better. So, your immersion will increase." S2

5.6. Vibration

Players S2, V1 and V2 did not care to differentiate between wall and internal obstacle vibrations. For them, both were the same because they felt they had to dodge them anyway.

All participants except player V3 were able to use this tactile feedback to navigate in the environment. Player V3 had some difficulty in using the game due to the sound produced by the vibration motor in the phone. He thought that the motor noise was actually a sign of some game event. The other players, who were more used to using mobile phones, seemed not to be bothered. A visually-impaired participant (V2), for example, stated that he did not notice this sound at all. This issue came up unexpectedly, but might be explained by the pronounced aural acuity that blind people develop in order to interpret environmental cues around them.

An interesting insight to us came from V2, who explained why the two different types of vibration made no particular difference to him:

"I thought that the combination of both sound (music) and vibration made perfect sense. [...] But to me, sensing the vibration, short or long, is really what matters. It means that I have to get out of that place and go where the music is playing louder. So, it didn't really matter whether I was hitting a wall or another obstacle. To me, a wall is an obstacle." V2

This participant also remarked on the importance of signifying collision. In his words:

"For us, blind people, this is crucial. If you leave us in the middle of a square, with leveled and smooth pavement, no obstacles around us, we'll be lost."

5.7. Gestural Interface

The players found the tilting gestural interface very convenient for this kind of game.

V3 acknowledged the usefulness of this approach because he could move naturally in the environment, and also faster. He reported that he would have had much trouble with the game if he had been required to play with the keypad. He said that keys in current mobile phone keypads are not easy to tell apart by touch. Thus, he would have had to memorize the key locations to try to play the game, something that might have made his experience less enjoyable (and more difficult).

However, another participant, V4, told us that maybe we could use a joystick as "an alternative for the game". He showed us that his mobile phone had a joystick, which he used for interacting with other programs on the phone. Notice that he was the one who was confused with the idle position, and could not understand why there was vibration when he (thought he) had halted all movement.

Player V2 initially thought that the interface was too sensitive and tried only small gestures, which did not make him move. After a short while, he got used to the interface and was able to move with no problems. Interestingly enough, he did not move continuously like the other players, but moved "step by step". In other words, he would move to one side and then return the phone to the idle position after hearing the footstep sound. He did that for every direction he wanted to move in, until he found the treasure.

As we mentioned before, when discussing the importance of the footstep sound, it is crucial to give appropriate feedback for all the events triggered by the gestural interface, or using the interface may become quite confusing. This point is also stressed in15, where the authors note that the connection between game input and audio feedback is very important to keep the player informed that the game has acknowledged the command.

Another observation was that some of the participants seemed to have some difficulty in keeping the phone in the "straight" position and often moved diagonally. One interesting direction to take when refining the game interaction is to check whether it would be more useful for players if they could move only in the basic directions, as if the game world were a grid, dropping diagonal moves.

One of the sighted players suddenly started manipulating the phone in ways that had nothing to do with the ones required by the game (see Figure 4). He would turn the phone around and move it in landscape position. In the post-test interview he reported that he had lost his sense of orientation and was using the phone physically as a tool to try to recover it.


As pointed out above, moving in straight lines in a grid-like fashion might help players regain their sense of orientation in the game. However, this hypothesis should be further investigated.

5.8. Event Representation

Participants reported that the representation of the events (beginning and ending of the game, footsteps, finding the treasure) was generally adequate. However, some of them did not notice the signs that communicate the "beginning and ending game events". They said they "did not hear it". The footstep event was the most important one to them.

As they did not know a priori what the sound for the "finding the treasure" event was, some of them asked whether they had found it the first time the music reached its peak.

The game was designed so that when a player finds the treasure, the game starts over again. The game plays the opening door sound again to communicate this event. However, participants seemed not to know what to do when this happened. They looked as if they had forgotten the instructions about what that sign meant.

An interesting contribution came from participant V2, who said:

"I thought that maybe along with vibration, when I hit the wall I might hear a scream, or a bumping noise;[...] an obstacle, for example, hearing the noise of something falling and breaking. [...]. Maybe this would be more challenging to me. " V2

The use of headphones might solve some of the above-mentioned problems. Moreover, headphones allow interesting future research on auralization and 3D space structuring41.

V2 is clearly alluding to the addition of a plot to the game. He mentioned that having to decide what to do when he bumped into an obstacle and heard the sound of something breaking, as opposed to what he would do when he bumped into a wall and heard a scream, would make the game "more elaborate, more instigating". He explicitly said that his strategy of turning away from the vibration towards where the music was playing louder "made it very easy for [him] to win the game fast".

6. Conclusions

The design of non-visual games is a compelling task and still not very well explored, especially on mobile phones. Research on non-visual games can help bring visually-impaired people into play, and can also benefit gaming as a whole, given the opportunity to explore other senses beyond vision.

Current mobile phones have interesting capabilities regarding audio, vibration motors, acceleration sensors and connectivity that may help to spark novel approaches to designing games. To explore the possibilities of non-visual mobile phone games we designed The Audio Flashlight, a mobile phone game whose interface relies solely on audio, vibrations, and gestural input, and carried out an empirical pilot study. Audio, it must be said, was completely non-verbal in the game. No speech was used to command the game or provide feedback.

In this section we present general conclusions and the most important comments on the experimental results. We believe that discussing these results here helps readers to visualize a more global research agenda.

Our experiment showed, not surprisingly, that sound design in non-visual games presents very specific challenges. Although sound is commonly taken as the substitute medium to convey visually encoded information, sound that excludes speech, as in our case, is not necessarily a "substitute" for visual signs.

The non-verbal encoding must be carefully designed so as not to overload the users' senses with too much information at the expense of other sensorial experiences. This may confuse both sighted and visually-impaired players. For blind players, this issue becomes particularly critical, and we found that sighted designers are prone to missing certain distinctions if they are not helped by blind collaborators (possibly users) at design time. The main lesson learned was how and why certain design options were not helpful to our users.

In The Audio Flashlight we tried to avoid auditory overload by representing collisions through another sensory channel, touch. However, as mentioned above, the vibration motor of the mobile phone made a sound that one of the visually-impaired participants thought was a meaningful element in the sound signification system.

Likewise, we missed the point that, unlike in the real world, where colliding with an obstacle makes one stop, in the game the "move" command co-existed with the "colliding with an obstacle or wall" situation, which confused one of the participants. This is mainly because the discontinuity of media and modality for the two commands - gesture for "move" and vibration for "collision" - does not lead to one canceling the other, as would happen in a physical model. This brings about interesting non-verbal representation and communication issues that must be resolved in order to turn this possibility into a real style of interaction with mobile phone games.
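A minimal, hypothetical sketch of the "physical model" alternative discussed above follows: a collision cancels the pending move before it is applied, instead of letting the "move" gesture and the "collision" vibration simply co-exist. GridWorld, is_blocked and the vibrate callback are illustrative names, not the original game's API.

    class GridWorld:
        def __init__(self, blocked_cells):
            self.blocked = set(blocked_cells)

        def is_blocked(self, cell):
            return cell in self.blocked

    def apply_move(position, step, world, vibrate):
        """Apply a one-cell move, unless it would run into a wall or obstacle."""
        candidate = (position[0] + step[0], position[1] + step[1])
        if world.is_blocked(candidate):
            vibrate()          # tactile feedback signals the collision...
            return position    # ...and the move itself is cancelled, as in the real world
        return candidate       # otherwise the move goes through

    if __name__ == "__main__":
        world = GridWorld(blocked_cells={(1, 0)})
        pos = apply_move((0, 0), (1, 0), world, vibrate=lambda: print("bzzz"))
        print(pos)             # (0, 0): the player stays put after hitting the wall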

Another issue with games highly dependent on audio is that people may not be able to play them properly in some environments like public places, due to the surrounding noise. Also, the player may not feel comfortable playing the game while other people are watching and hearing what comes out of the speakers. Social and privacy issues must clearly be addressed for appropriate playability, and we were not fully aware of those in our preliminary design. One of the participants actually verbalized this problem, by saying:

"The only thing that struck me [as a limitation] about this game is that I would not be able to play it on the bus" V2

Using vibrations might be an interesting way to enrich gameplay. For example, a game could use different vibration patterns to distinguish interactions with various non-player characters. However, just as with sounds, we must be careful not to overload the users' senses with too much information encoded in tactile patterns.
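As a hedged illustration of this idea, the sketch below assigns each non-player character its own vibration pattern, expressed as (on, off) durations in milliseconds. The character names, the pattern values and the vibrate_ms callback are assumptions for illustration; a real phone API would provide the actual motor call.

    import time

    NPC_PATTERNS = {
        "merchant": [(100, 100), (100, 300)],   # two short pulses
        "guard":    [(400, 200), (400, 200)],   # two long pulses
        "monster":  [(80, 80)] * 5,             # rapid burst
    }

    def play_pattern(npc, vibrate_ms):
        """Play the tactile pattern associated with an NPC, pulse by pulse."""
        for on_ms, off_ms in NPC_PATTERNS[npc]:
            vibrate_ms(on_ms)                      # request a pulse of on_ms
            time.sleep((on_ms + off_ms) / 1000.0)  # wait out the pulse plus the gap

    if __name__ == "__main__":
        play_pattern("guard", vibrate_ms=lambda ms: print("vibrate for", ms, "ms"))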

An important key to signification seems to have been given by V2, when he mentioned the importance of narrative sounds in the game (something breaking, a scream, etc.). We interpreted it, as already mentioned, as a request for a plot, the heart of so many games. However, the semiotic challenge is even greater in this case, because the semiotic engineering of plot-related signs and interaction-related signs (i.e. commands and feedback) should contemplate not only communicability, that is, the ability to communicate clearly the meanings that sounds were intended to convey, but also usability, the ability to support ease of use and agility in game playing.

Moreover, games that explore non-visual senses open up the opportunity for players to create personalized experiences, because they can use their imagination to visualize the game scenario mentally. This can also contribute to increased engagement with the game, in line with the "scary shadow syndrome" concept described in [25]. Some of our sighted participants mentioned that they would "close their eyes" for greater enjoyment of the game. However, we should not forget the distress that "not being able to see" caused, even if momentarily, to one of the participants with sub-normal sight. The scare, in this case, had a completely different connotation, which we must carefully investigate and understand.

The preliminary evaluation of the game indicated that all players enjoyed the experience because it was different, challenging, and new. They all had ideas about how to make the game more exciting. In fact, most of them, sighted or non-sighted, explicitly asked about "the next stages of the game". Some of them even asked what the next stage was like: what new difficulties they would have to face. Participant V3 was very curious about how he had performed compared to others; he explicitly asked the evaluator: "How close was I to the treasure?" Participant V2 wanted to know what his performance was like (the equivalent of "his score"). Participants V2, V3, and V4 said that they would like "to practice more" for the next time.

This aroused competitiveness among visually-impaired participants is a strong sign of the engagement potential of this kind of game. In particular, it points to the social dimension of competition - comparing oneself with others - which in this particular case is an inclusive comparison, because both sighted and non-sighted players reported equivalent challenges and experiences.

The mobile phone platform brings about unprecedented opportunities for inclusive social interaction with accessible games. Besides networked multi-player games, along the lines of Glinert and Wyse [18], such devices are portable and can actually support interaction among co-located players - sighted and non-sighted - sitting around a table in a cafeteria or a pub. The mobile phone might be used as the equivalent of a game board - a common space where all players meet and compete to win the game.

Of course, on the road towards this kind of scenario there is a long research agenda to investigate the classes of games for which this modality of interaction is suitable. For this purpose, the use of semiotic engineering concepts is a very promising alternative, because it brings HCI onto the stage of games technology, where semiotics itself has previously been used to analyze a wide spectrum of issues.

Some possible directions for future work that we have identified are listed below, in decreasing order of priority:

• Testing the game with different populations of visually-impaired users, trying to include not only people with different kinds of sight problems, but also with different emotional attitudes towards games. We realize we know very little about how visually-impaired people relate to what sighted people normally refer to as games (i.e. computer based games).

• Studying how visually-impaired people navigate physically in the world, focusing on how they mentally conceive space and how they physically select and perceive spatial cues that contribute to building a spatial mental model. Then we can try to support their navigational strategies in non-visual game interfaces.

• Testing the game with more elaborate spatial configurations, making it harder to find the treasure.

• Adding a plot to the game and finding out how spatial information integrates with narrative information in a game situation. Designing appropriate interfaces for this kind of game is likely to be a hard challenge.

• Elaborating on gestural interface signs, to communicate a wide spectrum of contents. For example, could it be that holding the phone in landscape mode, with both hands, would convey different kinds of orientation contents? Also, more elaborate sequences of gestures can be implemented using some sort of machine intelligence [3] (see the sketch after this list).

• Investigating how hearing and touch jointly (or separately) affect the sense of presence and immersion.

• Structuring sound within two spaces, a "3D space" (auralization [41] to improve immersion) and a "temporal space" (auditory icons played before, during or after a vibration feedback or other sounds to distinguish semantic elements).

• For sighted people, we also find it interesting to study the similarities and differences between the immersion provided by audio games and by graphics-intensive games. How can we increase immersion in audio games? How would visual games benefit from an elaborate sound design?
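Regarding gesture sequences, a minimal, hypothetical sketch of the state-machine approach mentioned above (in the spirit of the sequence-action recognition in [3]) is shown below: a sequence such as "tilt_left, tilt_right, shake" is recognized only when its steps arrive in order. The gesture names and the sequence itself are illustrative assumptions.

    class GestureSequenceRecognizer:
        def __init__(self, sequence):
            self.sequence = sequence   # e.g. ["tilt_left", "tilt_right", "shake"]
            self.position = 0          # index of the next expected gesture

        def feed(self, gesture):
            """Advance on the expected gesture; return True when the sequence completes."""
            if gesture == self.sequence[self.position]:
                self.position += 1
                if self.position == len(self.sequence):
                    self.position = 0
                    return True
            else:
                # restart, letting the unexpected gesture begin a new attempt
                self.position = 1 if gesture == self.sequence[0] else 0
            return False

    if __name__ == "__main__":
        recognizer = GestureSequenceRecognizer(["tilt_left", "tilt_right", "shake"])
        for g in ["tilt_left", "shake", "tilt_left", "tilt_right", "shake"]:
            if recognizer.feed(g):
                print("gesture sequence recognized")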

As can be seen in the list above, the research agenda in this field is rich and challenging, touching on various interdisciplinary issues. Our study is only a modest step into this territory, and it raises more questions than it provides answers. However, given the relative scarcity of research on non-visual games for mobile phones, and the evidence of interest, curiosity and pleasure experienced by the participants in our study, we believe this may be a good start.

Acknowledgements

The authors want to thank the participants of the experiment reported in this paper, and also CAPES, CNPq and FINEP for giving financial support to their research projects. They also want to thank Fabio Vecchia for designing the game music, and the anonymous reviewers who contributed to improve this paper.

Received: November 8, 2008; Accepted: February 18, 2009

A previous version of this paper appeared at IHC 2008 (VIII Brazilian Symposium on Human Factors in Computing Systems).

References

1. Souza CS. The semiotic engineering of human-computer interaction. Cambridge: The MIT Press; 2005.
2. Adams E, Rollings A. Fundamentals of game design. Upper Saddle River: Prentice Hall; 2006.
3. Baek J, Yun B. A sequence-action recognition applying state machine for user interface. IEEE Transactions on Consumer Electronics 2008; 54(2):719-726.
4. Barr P, Noble J, Biddle R. Video game values: human-computer interaction and games. Interacting with Computers 2007; 19(2):180-195.
5. Bystrom KE, Barfield W, Hendrix C. A conceptual model of sense of presence in virtual environments. Presence: Teleoperators and Virtual Environments 1999; 8(2):109-121.
6. Cai Y. The new frontier: portable and mobile gaming. Parks Associates Report. [on the internet]. Available from: <http://www.parksassociates.com/research/reports/tocs/2008/mobilegaming.htm>. Access in: 16/08/2008.
7. Caillois R. Man, play and games. Champaign: University of Illinois Press; 2001.
8. Caldwell N. Theoretical frameworks for analyzing turn-based computer strategy games. Media International Australia Incorporating Culture and Policy 2004; (110):42-51.
9. Crawford C. The art of computer game design. [on the internet]. Washington State University Vancouver; 1982. Available from: <http://www.vancouver.wsu.edu/fac/peabody/game-book/Coverpage.html>. Access in: 16/08/2008.
10. Desurvire H, Caplan M, Toth JA. Using heuristics to evaluate the playability of games. In: Proceedings of the Conference on Human Factors in Computer Systems; 2004; Vienna. p. 1509-1512.
11. Drewes T, Mynatt E, Gandy M. Sleuth: an audio experience. In: Proceedings of the International Conference on Auditory Display; 2000; Atlanta.
12. Eco U. A theory of semiotics. Bloomington: Indiana University Press; 1976.
13. Ekman L, Ermi L, Lahti J, Nummela J, Lankoski P, Mäyrä F. Designing sound for a pervasive mobile game. In: Proceedings of the International Conference on Advances in Computer Entertainment Technology; 2005; Valencia. Valencia: ACM Press; 2005. p. 110-116.
14. Faust M, Yoo YH. Haptic feedback in pervasive games. In: Third International Workshop on Pervasive Gaming Applications; 2006; Dublin. Available from: <http://www.e56.de/download/HapticFeedbackInPervasiveGames.pdf>. Access in: 11/08/2008.
15. Friberg J, Gärdenfors D. Audio games: new perspectives on game audio. In: Proceedings of the International Conference on Advances in Computer Entertainment Technology; 2004; Singapore. Singapore: ACM Press; 2004. p. 148-154.
16. GAME Accessibility: gaming with a visual disability. [on the internet]. Available from: <http://www.game-accessibility.com/index.php?pagefile=visual>. Access in: 11/08/2008.
17. Gilbertson P, Coulton P, Chehimi F, Vajk T. Using "tilt" as an interface to control "no-button" 3-D mobile games. ACM Computers in Entertainment 2008; 6(3).
18. Glinert E, Wyse L. AudiOdyssey: an accessible video game for both sighted and non-sighted gamers. In: Proceedings of the 2007 Conference on Future Play; 2007; Ontario. p. 251-252.
19. Huizinga J. Homo Ludens: a study of the play-element in culture. Boston: Beacon Press; 1966.
20. IGDA Game Accessibility SIG. Accessibility in games: motivations and approaches. [on the internet]. Available from: <http://www.igda.org/accessibility/IGDA_Accessibility_WhitePaper.pdf>. Access in: 11/08/2008.
21. Iglesias R, Casado S, Gutierrez T, Barbero JI, Avizzano CA, Marcheschi S, Bergamasco M. Computer graphics access for blind people through a haptic and audio virtual environment. In: Proceedings of the 3rd IEEE International Workshop on Haptic, Audio, and Visual Environments and their Applications; 2004; Ottawa. p. 13-18.
22. Korhonen H, Koivisto EMI. Playability heuristics for mobile games. In: Proceedings of the 8th Conference on Human-Computer Interaction with Mobile Devices and Services; 2006; Helsinki. p. 9-16.
23. Lakoff G, Johnson M. Metaphors we live by. Chicago: University of Chicago Press; 1980.
24. Lévesque V. Blindness, technology and haptics. Technical Report TR-CIM-05.08. Montreal: Center for Intelligent Machines, McGill University; 2005.
25. Liljedahl M, Papworth N, Lindberg S. Beowulf: an audio mostly game. In: Proceedings of the International Conference on Advances in Computer Entertainment Technology; 2007; Salzburg. p. 200-203.
26. Linjama J, Kaaresoja T. Novel, minimalist haptic gesture interaction for mobile devices. In: Proceedings of NordiCHI 2004; 2004; Tampere. p. 457-458.
27. Lombard M, Ditton T. At the heart of it all: the concept of presence. Journal of Computer Mediated Communication 1997; 3(2). [on the internet]. Available from: <http://jcmc.indiana.edu/vol3/issue2/lombard.html>. Access in: 11/08/2008.
28. Lumbreras M, Sánchez J. Interactive 3D sound hyperstories for blind children. In: Proceedings of the Conference on Human Factors in Computer Systems; 1999; Pittsburgh. Pittsburgh: ACM Press; 1999. p. 318-325.
29. Malone TW. Heuristics for designing enjoyable user interfaces: lessons from computer games. In: Proceedings of the Conference on Human Factors in Computer Systems; 1982; Gaithersburg. p. 63-68.
30. Murphy E, Kuber R, Strain P, McAllister G, Yu W. Developing sounds for a multimodal interface: conveying spatial information to visually impaired web users. In: Proceedings of the International Conference on Auditory Display; 2007; Montréal. p. 26-29.
31. Myers D. The nature of computer games: play as semiosis. New York: Peter Lang; 2003.
32. Ouhyoung M, Tsai WN, Tsai MC, Wu JR, Huang CH, Yang TJ. A low-cost force feedback joystick and its use in PC video games. IEEE Transactions on Consumer Electronics 1995; 41(3):787-794.
33. Nokia. Nokia N95 specification. [on the internet]. Available from: <http://www.forum.nokia.com/devices/N95>. Access in: 11/08/2008.
34. Peirce CS. The Essential Peirce: selected philosophical writings. Houser N, Kloesel CJW (Orgs.). Bloomington: Indiana University Press; 1998.
35. Pirhonen A, Murphy E, McAllister G, Yu W. Non-speech sounds as elements of a use scenario: a semiotic perspective. In: Proceedings of the International Conference on Auditory Display; 2007; London. p. 20-23.
36. Ravaja N, Saari T, Turpeinen M, Laarni J, Salminen M, Kivikangas M. Spatial presence and emotions during video game playing: does it matter with whom you play? Presence: Teleoperators and Virtual Environments 2006; 15(4):381-392.
37. Salen K, Zimmermann E. Rules of play: game design fundamentals. Cambridge: The MIT Press; 2004.
38. Ur-Rehman S, Liu L, Li H. Vibration soccer: tactile rendering of football game on mobiles. In: Proceedings of the Next Generation Mobile Applications, Services and Technologies; 2007; Cardiff. Cardiff: IEEE Press; 2007. p. 9-13.
39. Vanderheiden G. Fundamental principles and priority setting for universal usability. In: Proceedings of the ACM Conference on Universal Usability; 2000; Arlington. Arlington: ACM Press; 2000. p. 32-37.
40. VIBETONZ system. [on the internet]. Available from: <http://www.vibetonz.com>. Access in: 11/08/2008.
41. Vorländer M. Auralization: fundamentals of acoustics, modelling, simulation, algorithms and acoustic virtual reality. Cambridge: Springer; 2007.
42. Witmer BG, Singer MJ. Measuring presence in virtual environments: a presence questionnaire. Presence: Teleoperators and Virtual Environments 1998; 7(3):225-240.
43. Zaphiris P, Ang CS. HCI issues in computer games. Interacting with Computers 2007; 19(2):135-139.