
ORIGINAL ARTICLE

Facial emotional recognition in schizophrenia: preliminary results of the virtual reality program for facial emotional recognition

Reconhecimento emocional de faces na esquizofrenia: resultados preliminares do programa de realidade virtual para o reconhecimento emocional de faces

Teresa SoutoI, II; Alexandre BaptistaI; Diana TavaresI, III; Cristina QueirósI, II; António MarquesI, III

I Psychosocial Rehabilitation Laboratory of the Faculty of Psychology and Educational Sciences, Porto University / School of Allied Health Sciences, Porto Polytechnic Institute (FPCEUP/ESTSPIPP), Porto, Portugal

II FPCEUP, Porto, Portugal

III ESTSPIPP, Porto, Portugal

Correspondence: Cristina Queirós, Faculdade de Psicologia e de Ciências da Educação da Universidade do Porto, Rua Alfredo Allen, s/n, 4200-135 Porto, Portugal. Telephone: (+351 22) 607-9720. E-mail: cqueiros@fpce.up.pt

ABSTRACT

BACKGROUND: Significant deficits in emotional recognition and social perception characterize patients with schizophrenia and have a direct negative impact both on interpersonal relationships and on social functioning. Virtual reality, as a methodological resource, may have high potential for the assessment and training of skills in people suffering from mental illness.

OBJECTIVES: To present preliminary results of a facial emotional recognition assessment designed for patients with schizophrenia, using 3D avatars and virtual reality.

METHODS: Presentation of 3D avatars that reproduce emotional expressions developed with the FaceGen® software and integrated into a three-dimensional virtual environment. Each avatar was presented to a group of 12 patients with schizophrenia and to a reference group of 12 subjects without psychiatric pathology, with recognition responses and frontal electroencephalographic activity being assessed.

RESULTS: The results show that the facial emotions of happiness and anger were the best recognized by both groups and that the major difficulties arose in the recognition of fear and disgust. Variations in frontal alpha electroencephalographic activity were found during the presentation of the anger and disgust stimuli among patients with schizophrenia.

DISCUSSION: The evaluation module of the developed program can be of added value both for the patient and for the therapist, allowing the task to be performed in a non-anxiogenic environment that is nonetheless close to the real experience.

Keywords: Virtual reality, 3D avatar facial emotion recognition, schizophrenia, qEEG, alpha frontal activity.

RESUMO

CONTEXTO: Pessoas diagnosticadas com esquizofrenia apresentam um défice significativo na cognição social, com implicações negativas relativamente ao funcionamento interpessoal e social. A realidade virtual apresenta grandes potencialidades para a avaliação e o treino de competências em pessoas com doença mental.

OBJETIVOS: Apresentar os resultados preliminares de um programa construído para avaliação do reconhecimento emocional de faces por pessoas com esquizofrenia, utilizando avatares 3D e realidade virtual.

MÉTODOS: Apresentação de avatares 3D que reproduzem expressões emocionais, construídas por meio do FaceGen® e integradas num ambiente virtual tridimensional. Apresentou-se cada avatar a 12 doentes com esquizofrenia e a 12 pessoas sem patologia psiquiátrica, avaliando as respostas de reconhecimento e a atividade eletroencefalográfica frontal.

RESULTADOS: Os resultados demonstraram que as expressões de alegria e raiva foram as mais bem reconhecidas pelos dois grupos, enquanto as de medo e nojo foram as de maior dificuldade. Verificaram-se alterações na atividade alfa frontal para os estímulos raiva e nojo na amostra de doentes com esquizofrenia.

CONCLUSÃO: Apesar de algumas expressões emocionais poderem ser melhoradas, o programa desenvolvido pode constituir uma mais-valia para o paciente e para o terapeuta, proporcionando a execução da tarefa em condições não ansiogênicas e aproximadas à experiência real.

Palavras-chave: Realidade virtual, reconhecimento emocional facial em avatar 3D, esquizofrenia, qEEG, atividade alfa frontal.

Introduction

The worldwide prevalence of schizophrenia is estimated to be between 0.5% and 1.5%. Several studies argue that individuals with schizophrenia have social cognition difficulties, especially in emotional recognition and social perception. Social functioning and skills become compromised, as well as communication and interpersonal relationships1, contributing to social exclusion. Although schizophrenia affects each individual differently, causing chronic and multidimensional functioning deficits, treatment is vital for the quality of life of the mentally ill person and for their social inclusion and participation.

Emotions play a key role in human communication and, according to Ekman2, human beings have the innate capacity to recognize six universal facial expressions: anger, fear, sadness, disgust, happiness and surprise. Emotions are expressed and recognized primarily in the human face, leading researchers to debate posed versus spontaneous emotional facial expressions and static versus dynamic expressions. Studies that aim to teach emotion recognition report mixed results, and the majority use photographs of facial expressions, despite the possibility of using real-time facial synthesis.

Virtual reality: strengths and applications

Virtual reality (VR) may offer a new possibility for facial emotional recognition, since software developed with VR is potentially more representative of daily-life situations. Definitions of VR in the literature are numerous, but not always consensual3. There is, however, broad agreement that VR is a means for the individual to visualize, manipulate and interact with computer-generated 3D graphic images. It is therefore a computer-created simulation of the real world, constituting the most advanced type of man-machine interface currently available.

The application of VR to mental health pushes beyond the present limitations of intervention, providing scenarios with elements that allow patients to engage in activities very similar to those of daily life4. VR enables the use of flexible and controlled situations suitable for experimental and therapeutic contexts5, particularly in the treatment of phobias and other anxiety disorders, allowing the controlled confrontation of patients with phobic situations. In posttraumatic stress disorders, it enables exposure of the patient to the environmental stressor in a gradual, prolonged and systematic way. Support for diagnosis and evaluation is an equally important benefit of this technological resource6. The potential of VR also results from its main features, including the possibility of immersing the person in a perceptual, computer-generated world; in other words, it allows the person to feel that she is really there (presence)6,7. The possibility of real-time interaction with the environment gives the individual an active role in exploring the virtual environment and increases involvement, understood as the participation, attention and persistence of the individual in the task, namely their motivational state6,7. An application that uses VR must provide immersion, provide interaction and stimulate imagination, the requirements designated as Burdea's VR triangle3, allowing the individual to put abstract knowledge into practice in the virtual context as in the real environment, using their skills in manipulating objects and interacting with virtual "other people". It can be stated that the higher the level of immersion, interaction and imagination applied to a system, the closer it comes to the synthesis of a new reality, the VR. When combined, these features allow the developed system to approximate the individual's mental representation, incorporating values, skills and interests. Given the recent evolution of technology, the boundaries between the virtual and physical worlds are increasingly tenuous, providing the individual with a realistic experience of the proposed task, whether for recreational or therapeutic aims.

Virtual reality has the advantage of allowing the person to reproduce and interact with imaginary situations, using fictional scenarios involving virtual objects, whether static or in movement. It also allows a faithful reproduction of real-life environments, such as a virtual home, office, bank, supermarket, or even a virtual city, with which the person may interact using, for instance, the hands with the aid of equipment such as a digital glove, and eventually gestures or voice commands. Within these scenarios the person can explore the setting and interact both with the scenario and with other people in the virtual environment. The person can, for example, go into a virtual bank and use the cash machine, much as he would with real equipment, or even interact with an employee, represented in the environment by a virtual digital humanoid, commonly referred to as an avatar. This interaction takes place in real time, using computer equipment and techniques that increase the sense of presence.

In experimental studies on the processing of emotional expressions, virtual faces are a substantial advantage, since they can be easily animated and varied according to the needs of the researcher. Regarding cognitive deficits in schizophrenia, whether for evaluation or rehabilitation, several experiments have used VR7,8; few, however, have addressed emotions. Kim and collaborators8 used VR as a methodology to assess the social perception of individuals with schizophrenia, arguing that it could be used more widely in the evaluation of strategies for problem solving, assertiveness and, generically, social skills. A similar perspective is presented by Wendt6, who emphasizes the use of VR in diverse contexts and its proven effectiveness in healthcare. Adequate social functioning requires decoding and interpreting social and emotional cues from other people, a capacity that is impaired8 in people with schizophrenia.

Development of a Virtual Reality Program for Facial Emotional Recognition (RV-REF)

Traditionally, to elicit and assess the recognition of emotions, researchers use stimuli such as pictures, music or sounds. The RV-REF program is innovative in that it allows interactive recognition methodologies, more specific to the evaluation of individuals' emotional recognition competence, to be analyzed in greater detail. Thus, the RV-REF aims to assess the ability to recognize the six basic facial emotions (happiness, sadness, anger, disgust, fear and surprise). The program was developed based on published studies and a literature review on emotional recognition, including previous studies using the widely published methodology of Ekman and Friesen2. The RV-REF uses three-dimensional avatars within virtual three-dimensional environments, an innovative and ecological approach that is more representative of the individual's real world.

The RV-REF presents 3D avatars that mimic the image of a young adult male, built using several computer programs. The avatar was visualized using 3D technology systems in which each of the six basic emotions was prepared. To build the avatar and its three-dimensional environment, several tools were used in a sequence of stages. First, a picture of a young adult male was chosen by a panel of experts from the Psychosocial Rehabilitation Laboratory of FPCEUP/ESTSPIPP from the validated Radboud Faces Database9. That photograph was transformed into a three-dimensional model using the program FaceGen® v3.5, in which the six emotions were modelled with the "Morph" tool (Figure 1). After all the necessary data were obtained, they were exported, via the "Customizer" tool, to the "obj" format, which allows the three-dimensional model to be worked on in a wide range of 3D modelling programs.


After completing this stage, another avatar was filmed and integrated within the 3D virtual environment (a room of a house). This avatar has a full body and acts as a guide (cicerone) to the RV-REF user, also allowing the user to train with the application and the three-dimensional environment. Finally, there is another virtual character in the scene, a cat with a highly realistic appearance and expressions, introduced to create the familiar scenario of a house (Figure 2).


The full-body human avatar introduces the task to the user and, after the explanation, presents the three-dimensional faces on a screen on the wall of the virtual room, reproducing each of the emotions over a period of 7 seconds. This period comprises the time elapsed from the neutral expression, through the expressed emotion, and back to neutral (Figure 3).
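As an illustration of how such a dynamic expression could be parameterized, the sketch below (not part of the RV-REF software; the ramp and hold durations are our own assumptions) expresses the 7-second neutral-apex-neutral presentation as a single blend-shape weight over time.

```python
def expression_weight(t, total=7.0, ramp_up=2.0, hold=3.0, ramp_down=2.0):
    """Blend-shape weight (0 = neutral, 1 = full expression) at time t seconds.

    Illustrative envelope: ramp from neutral to the apex expression,
    hold it, then return to neutral within the 7-second window.
    """
    if t < 0 or t > total:
        return 0.0
    if t < ramp_up:                      # neutral -> apex
        return t / ramp_up
    if t < ramp_up + hold:               # hold apex
        return 1.0
    return max(0.0, 1.0 - (t - ramp_up - hold) / ramp_down)  # apex -> neutral

# Example: sample the envelope once per second
weights = [round(expression_weight(t), 2) for t in range(8)]
print(weights)  # [0.0, 0.5, 1.0, 1.0, 1.0, 1.0, 0.5, 0.0]
```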


All these actions take place in a three-dimensional environment, using a stereoscopic effect that allows the user to feel inside the environment. This effect is achieved by exporting the files to a high-definition video (1920x1080i) format obtained with the Side by Side (SbS) method. This method is the most basic and proved the most useful among the various ways of delivering the VR application. An application can provide a VR session in three different forms: passive, exploratory or interactive10. A passive VR session gives the participant the opportunity to explore the environment automatically, without interference: the pathway and the observation points are explicit and controlled exclusively by the software, and the participant has no control except to exit the session. The evaluation of the stimuli created for the RV-REF was held at the PING laboratory (Porto Interactive NeuroScience Group, Porto, Portugal), using a "wall" consisting of a screen of about 4 x 3 m, onto which the 3D stereoscopic projection was made by two multimedia projectors (via back-projection mirror). Passive polarized glasses were also used, providing a sense of depth into and out of the screen, together with a surround sound system that allowed the verbal presentation of the emotions, reminding the user of the six emotions under evaluation. In addition to recording emotion recognition responses, an EEG record was made of the values of the power (qEEG) of alpha activity at electrodes F3 and F4 (Figure 4), in order to check whether the different types of emotion represented by the avatars induced specific electroencephalographic changes upon presentation, according to Davidson's asymmetry model11.
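For readers unfamiliar with the SbS format, the following sketch (our illustration, using the Pillow library and placeholder file names, not the production pipeline described above) shows how two pre-rendered eye views can be composed into a single 1920x1080 side-by-side frame.

```python
from PIL import Image

def make_side_by_side(left_path, right_path, width=1920, height=1080):
    """Compose a full-width side-by-side (SbS) stereo frame.

    Each eye view is squeezed to half the horizontal resolution and the
    two halves are placed next to each other, as expected by SbS-capable
    3D projectors and displays.
    """
    half = width // 2
    left = Image.open(left_path).resize((half, height))
    right = Image.open(right_path).resize((half, height))
    frame = Image.new("RGB", (width, height))
    frame.paste(left, (0, 0))
    frame.paste(right, (half, 0))
    return frame

# Hypothetical usage with two pre-rendered eye views of the same scene:
# make_side_by_side("avatar_left_eye.png", "avatar_right_eye.png").save("sbs_frame.png")
```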


Given the specific characteristics of this methodology, the aim was to develop an innovative program to assess individual emotion recognition competence, with high discriminatory power (sensitivity and effectiveness), and to support an adequate assessment of emotional recognition using the power (qEEG) of alpha activity at electrodes F3 and F4 as an indicator of the occurrence of affective processing12 during stimulus presentation.

In this study we present preliminary results on the effectiveness of the stimuli built for the RV-REF, with data collected during the first semester of 2012 from individuals diagnosed with schizophrenia and individuals with no psychiatric pathology.

Method

Participants

The sample included 12 patients, of both sexes, with a diagnosis of schizophrenia (DSM-IV)13, from a private institution of social solidarity in the area of mental health. This convenience sample14 had as selection criteria (a) being stabilized from the psychopathological point of view for at least one year, and (b) having effective control of compliance with the prescribed psychopharmacological treatment. All participants were medicated with antipsychotic drugs, and the absence of mental deterioration in all patients was checked by applying the Mini Mental State Examination15. The psychosocial functioning of the participants with schizophrenia was characterized with the Personal and Social Performance Scale16 and the Global Assessment of Functioning Scale16, revealing moderate valuesi.

A reference groupii of 12 volunteers with no psychiatric disorder was also used, obtained through an intentional sampling procedure in order to match some independent variables, including age and gender (Table 1). All participants gave their informed consent to participate in this study, which followed the ethical principles recommended by the University of Porto.

Instruments

We used a socio-demographic questionnaire to characterize both groups. Within the schizophrenia group, additional data were collected regarding the disease, hospitalization, medication and psychosocial functioning, assessed using the Mini Mental State Examination, the Personal and Social Performance Scale and the Global Assessment of Functioning Scale (DSM-IV-R). For both groups, emotional recognition was also rated during the first stage of the Virtual Reality Program for Facial Emotional Recognition (RV-REF), considering the number of correct answers and the number and type of errors. Finally, an electroencephalogram (EEG) was used to record the values of the power (qEEG) of alpha activity at electrodes F3 and F4, in order to determine whether different types of emotion represented by the avatars induced specific electroencephalographic changes during presentation, according to Davidson's asymmetry model11.
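The kind of computation involved in this qEEG measure can be sketched as follows. This is an illustrative example only, assuming the F3 and F4 signals are available as NumPy arrays and using Welch's method from SciPy; the 8-13 Hz alpha band and the log-power asymmetry index follow the conventional Davidson-style formulation and are not necessarily the exact parameters used in the study.

```python
import numpy as np
from scipy.signal import welch

def alpha_power(signal, fs, band=(8.0, 13.0)):
    """Mean power spectral density of `signal` within the alpha band."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def frontal_alpha_asymmetry(f3, f4, fs):
    """Davidson-style asymmetry index: ln(alpha power at F4) - ln(alpha power at F3).

    Higher alpha power reflects lower cortical activation, so a positive
    index suggests relatively greater left-frontal activation.
    """
    return np.log(alpha_power(f4, fs)) - np.log(alpha_power(f3, fs))

# Illustrative call on synthetic data (256 Hz sampling, 15 s of noise)
fs = 256
rng = np.random.default_rng(0)
f3, f4 = rng.standard_normal(fs * 15), rng.standard_normal(fs * 15)
print(frontal_alpha_asymmetry(f3, f4, fs))
```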

Procedure

Data on the participants' psychosocial functioning were collected in their own institution, before the 3D emotional recognition task at the PING laboratory. The emotional recognition task was carried out in the laboratory in the following sequence:

- Reception and general information - The participant was informed that he would watch a 3D movie on the screen, wearing polarized glasses, and that he would also be monitored by EEG throughout the viewing.

- Placement of glasses and EEG electrodes - After the placement of the electrodes, a first recording was made, asking the person to close and open his eyes; this yielded the baseline eyes-open (EO) record relevant for comparison with the other values (F3 and F4). Once it was confirmed that the participant was prepared and the polarized glasses were on, the application was started.

- Task prelude - The participant begins by viewing and listening to (Dolby Surround® sound) rain falling on the sea for a few seconds, followed by an introduction with the FPCEUP and LABRP logos as 3D cubes moving on several axes, accompanied by music. Prior to the first scene of the task itself, the virtual environment was shown through a 360º panorama at different parallaxesiii.

- Emotional recognition task - In the first scene a full-body avatar appears, serving as a cicerone throughout the application, presenting the task and providing instructions. The stimuli were then presented by another avatar, corresponding to the face of figures 2 and 3, on a white screen placed on one of the walls of the virtual room. The cicerone avatar tells the participant that every stimulus is preceded by a 5-second countdown, so the participant knows exactly when it will start. The display of the stimulus lasts 7 seconds, from the neutral expression through the elaboration of the emotion and back to neutral. After this display, a list with the emotions is shown for 10 seconds, after which the avatar maintains a neutral expression, slightly moving his head axially for 15 seconds, to allow the participant to return to a baseline record (a timing sketch of one trial is given after this list). After the participant identifies the emotion, the cicerone avatar appears at the bottom right of the screen (in miniature), verbalizing a phrase of positive reinforcement and inviting the participant to identify the next emotion. This process is repeated until the sixth and final emotion.

- End of the task - Once the recognition process is complete, the cicerone avatar thanks the person for participating. In this scene, as well as briefly in the overview scene and in the scene with the task instructions, there is a third element that is also considered an avatar: a cat which, at the end of the application, is petted by the cicerone avatar and reacts to the touch. This element, like the whole preparation of the room, is intended to create a familiar environment fostering the participant's involvement, a concept directly related to the motivation level that connects individuals with a certain activity.
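As referenced in the emotional recognition task item above, the per-trial timing can be summarized in the short sketch below; the durations are taken from the description, while the phase names are ours.

```python
# Per-trial phases and durations (seconds), as described in the task protocol
TRIAL_PHASES = [
    ("countdown", 5),         # cicerone announces the countdown before each stimulus
    ("stimulus", 7),          # neutral -> expressed emotion -> neutral
    ("response_list", 10),    # list of the six emotions shown for identification
    ("neutral_baseline", 15), # avatar stays neutral so the EEG can return to baseline
]

def trial_timeline(phases=TRIAL_PHASES):
    """Print the cumulative onset/offset of each phase within a single trial."""
    t = 0
    for name, duration in phases:
        print(f"{name:>16}: {t:3d}-{t + duration:3d} s")
        t += duration
    return t  # total trial length in seconds (37 s per emotion here)

trial_timeline()
```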

The total session time for data collection was approximately 25 minutes, including the placement and removal of the electrodes. Participants' performance in the facial emotional recognition task was analyzed through the total number of emotions correctly identified out of the overall set of stimuli presented, and through the electroencephalographic response. Statistical analysis was performed using PASW v20, resorting to nonparametric tests because of the small size of the two groups of participants.
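As an illustration of the kind of nonparametric tests suited to this design (the analysis itself was run in PASW v20, not in Python, and the data below are invented for demonstration), a Mann-Whitney U test can compare the two independent groups, and a Wilcoxon signed-rank test can contrast baseline and stimulus qEEG values within subjects.

```python
from scipy.stats import mannwhitneyu, wilcoxon

# Illustrative data only (n = 12 per group), not the study's values
correct_schizophrenia = [4, 3, 5, 2, 4, 3, 4, 5, 3, 2, 4, 3]
correct_reference     = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4, 5, 4]

# Between-group comparison of correctly identified emotions
u_stat, p_between = mannwhitneyu(correct_schizophrenia, correct_reference,
                                 alternative="two-sided")

# Within-subject contrast: baseline (eyes open) vs. stimulus alpha power at F3
alpha_f3_eo       = [1.2, 0.9, 1.5, 1.1, 1.3, 0.8, 1.0, 1.4, 1.2, 1.1, 0.9, 1.3]
alpha_f3_stimulus = [1.5, 1.1, 1.8, 1.2, 1.6, 1.0, 1.3, 1.7, 1.4, 1.3, 1.1, 1.6]
w_stat, p_within = wilcoxon(alpha_f3_eo, alpha_f3_stimulus)

print(f"Mann-Whitney U: U={u_stat:.1f}, p={p_between:.3f}")
print(f"Wilcoxon: W={w_stat:.1f}, p={p_within:.3f}")
```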

Results

Regarding the emotional recognition task (Table 2), no statistically significant differences were found between the two groups. We noticed that for happiness nearly all participants answered correctly (92% of the schizophrenic group and 100% of the reference group). For anger, both groups obtained 75% correct answers. Difficulties arose in the recognition of fear (17% accuracy for both groups), of disgust for the schizophrenic group (33% correct answers) and of sadness for the reference group (42% correct answers).

The qEEG results showed no significant changes in the reference group (Table 3). However, there were changes in frontal alpha activity for the anger and disgust stimuli in the group of people with schizophrenia. Regarding anger, the most statistically significant value occurred in the comparison of the F3-EO value with the F3 value for stimulus 1. This result corresponds to an increase in EEG power for the alpha rhythm in the left hemisphere (electrode F3) during emotional recognition of the angry face, which means a decrease in the activation of that region and therefore a preponderance of the contralateral hemisphere for this stimulus. The same stimulus was also associated with an increase in EEG power for the alpha rhythm in the right hemisphere (comparison of the F4-EO value with the F4 value for stimulus 1), suggesting a decrease in activation in this region and a preponderance of the contralateral hemisphere. A similar change occurred for the disgust stimulus, having in common the increase in EEG power for the alpha rhythm in the left hemisphere (F3-EO value compared with the F3 value for stimulus 3), which means a decrease in activation in this region and a preponderance of the contralateral hemisphere for that stimulus.

Finally, regarding the correlation between age and correct responses, no statistically significant results were found for the two groups.

Discussion

Although we did not find any statistically significant differences between the groups during emotional facial recognition, the performance of people with schizophrenia was inferior. It should be noted that this methodology is not directly comparable with the existing literature, since three-dimensional stereoscopic stimuli differ from the static or posed methods (photography) and the dynamic and spontaneous methods (2D movies) traditionally used by researchers. In the recognition of happiness, minor differences were found between groups, and the performance of the schizophrenic group was similar to that of the reference group, a result corroborated by some of the literature17. Additionally, it was with happiness that fewer misidentifications and exchanges occurred, with the reference group achieving 100% success and people with schizophrenia 92% accuracy plus one non-response.

The fear stimuli produced the worst performance in both groups, with a massive exchange of the fear stimulus for the surprise stimulus. There was a 17% rate of correct identification in both groups and a wrong attribution of the emotion surprise of around 83% for the group with schizophrenia and 75% for the reference group, with the remaining percentage of the reference group explained by non-responses.

Some authors argue that recognition of negative emotional facial expressions is comparatively worse than that of positive ones in patients with schizophrenia17-20. The fact that the reference group presents similar results is discussed in the literature by authors such as Ekman21, who highlights the difficulty of distinguishing fear from surprise using pictures. According to Kohler et al.18, surprise distances itself from the remaining five basic emotions in that it depends entirely on the precipitating event and may then turn into any other emotion, depending only on the speed with which it is installed.

Regarding the EEG results, the emotional recognition of anger suggests, in the two comparisons, the dominance of the right hemisphere and of the left hemisphere, respectively. This result is in line with the research of Demaree et al.22, who noted that studies of frontal asymmetry associating the negative emotion of anger with approach behaviour support the Approach-Avoidance Model. The meta-analysis of Wager et al.23 concluded that there is a lack of support for the hypothesis of overall right lateralization of emotional function, and only limited support for valence-specific lateralization of emotional activity in the frontal cortex. Generically, there has been a shift from perceptual/expressive information to experiential data, which led to emphasizing the involvement of the valence model, assigning emotions to cerebral hemispheres (left/right and positive/negative), and of processing within the approach-avoidance model, under a new perspective: positive/approach and negative/avoidance, respectively. Despite the progress made, these authors consider it necessary to include an approach and intensity measurement, since its absence represents a significant loss in this theoretical modification22.

Exploring the relationship between variables, a weak association between the number of correct responses and the participant's age can be identified within the reference group. In the group with schizophrenia, the relationship is moderate and negative. Neither of these results can be attributed to mental deterioration, since that was an exclusion criterion. Although we have no concrete indicator, motivational influences may have played a role, such as the emotional state of the participant, acceptance of the task or, in general, the level of involvement during the task.

The results may have methodological limitations regarding the program used to build the stimuli (FaceGen®), as there are studies, such as that of Ochs et al.24, that point to the need to create algorithms that generate a diversity of smiles, because smiles may be associated with different emotions and not just with happiness. For instance, the Swiss Centre for Affective Science of the University of Geneva developed the software FACSGen v2.025 as an extension of the commercial software FaceGen®; according to the authors, the latter provides limited control over the manipulation of facial expressions, while FACSGen allows facial expressions to be created as specified by the Facial Action Coding System (FACS)26.

On the other hand, the state of the art in equipment for working with basic emotions such as happiness, anger, fear and disgust appears to be quite advanced, especially when it comes to analysing recorded activity in environments restricted to the laboratory27. However, the analysis of equipment for social emotions such as empathy, envy and admiration has not yet been attempted28. While some social emotions could arguably be represented in terms of the dimensions of affect (valence, activation, expectation, power and intensity), pioneering efforts towards dimensional and continuous emotional recognition have recently been proposed20. According to Pantic et al.28, a set of critical issues needs to be addressed first if these approaches to the automatic dimensional and continuous recognition of emotion are to be used freely with moving subjects in real-world scenarios, such as patient-doctor interaction, television programmes (talk shows) or job interviews, among others28.

Another limitation relates to the fact that our schizophrenia group was obtained through an intentional sampling method and cannot be considered representative of the population of patients with schizophrenia in Portugal. The total n was small (n = 12), requiring caution when interpreting and generalizing these results. The fact that the sample comes from a group of outpatients attending a socio-occupational forum may be a possible influence, as may the mean length of time (58 months) in the project and the fact that they were stimulated in this area.

Given its characteristics related specifically to virtual reality, the RV-REF approach seems to have the potential to become a powerful tool, since it includes characteristics and attributes that may support many situations and research contexts, such as learning and rehabilitation. Virtual reality as a resource also seems advantageous from a clinical point of view, since the emotional expressions of virtual faces can be animated29 by the therapist. Thus, they allow training emotional recognition in a non-anxiogenic setting and help to minimize the impact of the disease on social interaction and integration.

As far as we know, the RV-REF is the first attempt to use VR as a methodology for evaluating the recognition of emotional faces, suggesting that "the limitations on the use of virtual reality in health sciences depend only on the imagination of researchers"30. Building the materials, more specifically the emotional facial stimuli, was a difficult task, since although several semi-intuitive software tools exist, they are not always oriented to these (therapeutic) aims. Reproducing human facial emotions reliably is essential for this type of study, and its effectiveness depends on it. For future work, we suggest the use of software that produces better facial emotional stimuli. Additionally, we must consider issues related to a new challenge and a new field of research: Social Signal Processing (SSP), which aims to understand and model social interactions (human and scientific purpose) and to develop computers with similar interaction competencies in human-computer scenarios (technological purpose)28. In the context of SSP, the cues that reveal a person's individual emotions and those of the mass media include, beyond facial expressions, vocal actions and outbursts, body gestures and postures28, which makes more complex the set of "signs" that must be valued when designing an ecologically valid methodology for emotional recognition. Further research is needed to study a full-body avatar in a virtual environment within a meaningful social context.

References

1. Bellack A, Mueser K, Gingerich S, Agresta J. Social skills training for schizophrenia: a step by step guide. 2nd ed. New York: Guilford; 2004.

2. Ekman P, Friesen WV. Unmasking the face: a guide to recognizing emotions from facial expressions. Cambridge: Cambridge University Press; 2003.

3. Burdea GC, Coiffet P. Virtual reality technology. New York: Wiley; 2003.

4. Carvalho MR, Freire RC, Nardi AE. Realidade virtual no tratamento do transtorno de pânico. J Bras Psiquiatr. 2008;57(1):64-9.

5. Dyck M, Winbeck M, Leiberg S, Chen Y, Gur RC, Mathiak K. Virtual faces as a tool to study emotion recognition deficits in schizophrenia. Psychiatry Res. 2010;179:247-52.

6. Wendt GW. Tecnologias de interface humano-computacional: realidade virtual e novos caminhos para pesquisa. Rev Psiq Clín. 2011;38(5):211-2.

7. Marques A, Queirós C, Rocha N. Virtual reality and neuropsychology: a cognitive rehabilitation approach for people with psychiatric disabilities. In: Sharkey P, Lopes Santos PL, Weiss P, Brooks T, editors. ICDVRAT - Proceedings of 7th International Conference on Disability Virtual Reality and Associated Technologies (Sep 8-11 2008). Maia, Portugal; 2008. p. 39-46.

8. Kim K, Kim JJ, Kim J, Park D, Jang H, Ku J, et al. Characteristics of social perception assessed in schizophrenia using virtual reality. Cyberpsychol Behav. 2007;10:215-9.

9. Langner O, Dotsch R, Bijlstra G, Wigboldus DJ, Hawk ST, van Knippenberg A. Presentation and validation of the Radboud Faces Database. Cogn Emotion. 2010;24(8):1377-88.

10. Adams L. Visualização e realidade virtual. São Paulo: Makron Books; 2004. p. 255-9.

11. Davidson RJ. Affective neuroscience and psychophysiology: toward a synthesis. Psychophysiology. 2003;40:655-65.

12. Cipresso P, Gaggioli A, Serino S, Pallavicini F, Raspelli S, Grassi A, et al. EEG alpha asymmetry in virtual environments for the assessment of stress-related disorders. Stud Health Technol Inform 2012:102-4.

13. American Psychiatric Association. DSM-IV-TR, Manual de diagnóstico e estatística das perturbações mentais. 4ª ed. Lisboa: Climepsi Editores; 2002.

14. Almeida LS, Freire T. Metodologia da investigação em psicologia e educação. Braga: Psiquilíbrios Edições; 2007.

15. Folstein MF, Folstein SE, McHugh PR. "Mini-mental state": a practical method for grading the cognitive state of patients for the clinician. J Psychiatr Res. 1975;12(3):189-98.

16. Morosini P, Magliano L, Brambilla L, Ugolini S, Pioli R. Development, reliability and acceptability of a new version of the DSM-IV Social and Occupational Functioning Assessment Scale (SOFAS) to assess routine social functioning. Acta Psychiatr Scand. 2000;101:323-9.

17. Bediou B, Franck N, Saoud M, Baudouin JY, Tiberghien G, Daléry J, et al. Effects of emotion and identity on facial affect processing in schizophrenia. Psychiatry Res. 2005;133:149-57.

18. Kohler C, Turner T, Bilker W, Brensinger C, Siegel S, Kanes S, et al. Facial emotion recognition in schizophrenia: intensity effects and error pattern. Am J Psychiatry. 2003;160:1768-74.

19. van 't Wout M, Aleman A, Kessels RPC, Cahn W, de Haan E, Kahn RS. Exploring the nature of facial affect processing deficits in schizophrenia. Psychiatry Res. 2007;150:227-35.

20. Tsoi D, Lee K, Khokhar W, Mir N, Swalli J, Gee K, et al. Is facial emotion recognition impairment in schizophrenia identical for different emotions? A signal detection analysis. Schizophr Res. 2008;99:263-9.

21. Ekman P. Emotions revealed: recognizing faces and feeling to improve communication and emotional life. New York: Times Books, Henry Holt and Company; 2003.

22. Demaree H, Youngstrom E, Harrison D. Brain lateralization of emotional processing: historical roots and a future incorporating "dominance". Behav Cogn Neurosci Rev. 2005;4(1):3-20.

23. Wager TD, Phan KL, Liberzon I, Taylor SF. Valence, gender, and lateralization of functional brain anatomy in emotion: a meta-analysis of findings from neuroimaging. NeuroImage. 2003;19:513-31.

24. Ochs M, Niewiadomski R, Pelachaud C. How a virtual agent should smile? Morphological and dynamic characteristics of virtual agent's smiles. IVA. 2010:427-40.

25. Roesch EB, Tamarit L, Reveret L, Grandjean D, Sander D, Scherer KR. FACSGen: a tool to synthesize emotional facial expressions through systematic manipulation of facial action units. J Nonverbal Behav. 2011;35:1-16.

26. Krumhuber EG, Tamarit L, Roesch EB, Scherer KR. FACSGen 2.0 animation software: generating 3D FACS-Valid facial expressions for emotion research. Emotion. 2012;12(2):351-63.

27. Zeng Z, Pantic M, Roisman GI, Huang TH. A survey of affect recognition methods: audio, visual and spontaneous expressions. IEEE Trans Pattern Anal Mach Intell. 2009;31(1):39-58.

28. Pantic M, Cowie R, D'Errico F, Heylen D, Mehu M, Pelachaud C, et al. Social signal processing: the research agenda. Visual Analysis Hum. 2011:511-38.

29. Nicolaou M, Gunes H, Pantic M. Output-associative RVM regression for dimensional and continuous emotion prediction. 2011. Available at: <http://ibug.doc.ic.ac.uk/people/mnicolaou>.

30. Wiederhold BK, Wiederhold MD. Postface. In: Riva G, Galimberti C. Towards cyberpsychology: mind, cognition, and society in the Internet age. Amsterdam: IOS Press; 2001.

Received: 11/6/2012

Accepted: 2/14/2013

Institution where the study was elaborated: Faculty of Psychology and Educational Sciences, Porto University, Portugal.

Notes

i MMSE: Mean = 29.08, SD = 0.79; PSPS: Mean = 52.58, SD = 15.62; GAFS: Mean = 52.92, SD = 10.60.

ii We named the control group the reference group, since this group does not come from a common universe, presenting differences in some sociodemographic variables.

iii In a 3D application there are three possibilities for where an image can appear relative to the projection screen: in front of the screen, behind the screen or in the plane of the screen; these effects are named, respectively, negative parallax (in front), positive parallax (behind) and zero parallax (same depth as the screen).