
Comparison of digital games as a cognitive function assessment tool for current standardized neuropsychological tests

Abstract

Objective:

Cognitive dysfunction may occur postoperatively. Fast and efficient assessment of Postoperative Cognitive Dysfunction (POCD) can minimize loss of quality of life, and therefore, a study comparing a digital game with standard neuropsychological tests to assess executive, mnemonic, and attention functions to evaluate POCD seems to be relevant both for research and clinical practice.

Methods:

A battery of standardized tests and a digital game (MentalPlus®) were administered to 60 patients at the Central Institute of Hospital das Clínicas in São Paulo (36 women and 24 men), with ages between 29 and 82 years, pre- and post-surgery performed under anesthesia. Correlation and a linear regression model were used to compare the scores obtained from the standardized tests to the scores of the six executive and cognitive functions evaluated by the game (short- and long-term memory, selective and alternating attention, inhibitory control, and visual perception).

Results:

After correlation analysis, a statistically significant result was found mainly for the correlation between the scores from the phase of the digital game assessing the visuoperception function and the scores from the A and B cards of the Stroop Test (p < 0.001, r = 0.99 and r = 0.64, respectively), and the scores from TMTA (p = 0.0046, r = 0.51). We also found a moderate correlation between the phase of the game assessing short-term memory function and VVLT (p < 0.001, r = 0.41). No statistically significant correlations were found for the other functions assessed.

Conclusion:

The digital game provided scores in agreement with standardized tests for evaluating visual perception and possibly short-term memory cognitive functions. Further studies are necessary to verify the correlation of other phases of the digital game with standardized tests assessing cognitive functions.

KEYWORDS
Neuropsychology; Neuropsychological tests; Videogames; Cognitive function; Cognitive dysfunction; Anesthesia

Introduction

Neuropsychology, a branch of clinical psychology, studies cognitive functions, exploring their inter-relations and effects on human behavior. It is worth pointing out that cognitive functions are integrated, and therefore it is challenging to establish precise boundaries among them.1,2 Consequently, assessing the cognitive functions of a patient is laborious, and several hours may be required to complete a comprehensive assessment of all domains.3-5 Assessment models, however, can be used in a variety of clinical and research scenarios and can be adapted to a particular setting by selecting tests and their assessment times.

Attention, memory (implicit and explicit), and perception, as well as expressive and executive functions, are assessed by commercially available tests administered using questionnaires and drawings, also called ‘‘pen and paper’’ tests. In brief, the main functions tested to check a patient’s health are: selective attention, the ability to select and remain focused on a stimulus; divided attention, the ability to focus on simultaneous stimuli; alternating attention, which allows alternating between stimuli and returning to the initial one4,6; memory, which integrates, codes, stores, and recovers significant content for immediate or long-term use; and perception, dependent on attention and memory (by vision, for example), which is triggered by the so-called executive functions, of which language and drawing are two concrete examples, while planning and problem solving are more refined examples in the development of productive skills.4-6

Assessing memory is even more complex, as it is considered to involve active consciousness; memory can be explicit or implicit, the latter being independent of active consciousness for executing tasks.4,5 The literature suggests using computerized tests, as they offer the advantages of standardized administration, correction, and data extraction.5 However, topics such as theoretical procedures, preparation of items, and evidence of validity based on content are still under study.7 The use of videogames points toward better cognitive performance, notably in attention and visuoperception functions, and also as a tool to stimulate neuroplasticity.8-12 Studies on using this digital tool for assessment are still scarce.

In this scenario, the digital game MentalPlus® was developed to assess attention, memory, and executive functions, reducing the time required for administration to roughly 25 minutes. A study with 163 healthy volunteers suggested its usefulness as a cognitive function assessment tool for subjects without cognitive impairment.13,14 The rationale for the present study was comparison for the assessment of Postoperative Cognitive Dysfunction (POCD), a condition that can occur after surgery under general anesthesia and in which the level of change is variable.15-17

Herein, we aimed to compare scores obtained from routinely used neuropsychological tests to their equivalent scores obtained from MentalPlus®, with both administered under professional guidance. Evaluating this digital game as a neuropsychological assessment tool for POCD is relevant to support further investigation toward its validation and clinical use.

Methods

This study applied tools to assess cognitive functions by routine neuropsychological tools and by the MentalPlus® game. Data were collected from patients of the Central Institute of Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, after approval by the ethics committee of the institution (CAPPesq research project #14086, CAAE: 49463315.5.1001.0068) and registration on ClinicalTrials.gov (NCT02551952). Patients selected were seen at the gynecology, urology, gastrointestinal, and head and neck surgery services between October 2017 and August 2018.

The instruments were administered by trained professionals. The patients selected were candidates for non-cardiac elective surgery to be performed under any anesthetic technique, with ages between 20 and 80 years, without limitation of upper limb mobility or visual acuity preventing visualization of the tests or the game. The neuropsychological test battery and game were administered preoperatively and repeated at the bedside in the ward after patient discharge from the postanesthetic care unit or intensive care unit.

The physical test battery was selected based on the tests proposed by the International Study of Postoperative Cognitive Dysfunction18 and on the study that validated a POCD test battery adapted for the Brazilian population.19 The physical test battery comprised the following tests:

Telephone Interview for Cognitive Status (TICS): a 14-question questionnaire to assess overall cognition;

Visual Verbal Learning Test (VVLT): the respondent is asked to read a list of fifteen words and then perform immediate recall (three successive repetitions) and late recall (after 20 minutes) to assess short- and long-term verbal memory;

Brief Visuospatial Memory Test Revised (BVMTR): a card with six geometric figures is displayed for 10 seconds and the respondent is asked to draw them right after display (immediate recall, three consecutive repetitions) and after 20 minutes (late recall), and to identify the figures, to assess short- and long-term visuospatial memory;

Stroop Test - Victoria version: three cards are presented for reading, the first with names of colors to assess selective attention, the second only for naming colors and visuoperception assessment, and the third with the name of the colors written with an incongruent ink color to assess inhibitory control;

Trail Making Test (TMT): respondents are asked to link numbers in ascending order using a continuous line (task A), to assess selective attention and visuoperception; and to link, with a continuous line, letters and numbers alternately, letters in alphabetical order and numbers in ascending order (task B), to assess alternating attention and visuoperception;

Patient Health Questionnaire 9 (PHQ-9): a 9-question questionnaire assessing depression symptoms;

Short Form 8 Health Survey (SF 8): questionnaire to assess quality of life;

Pain Scale: 0 to 10 scale, where 0 is absence of pain and 10 is severe pain, to evaluate patient’s pain intensity at time of assessment.

MentalPlus® is a game developed to assess and stimulate neuropsychological functions, patented and registered at Fundação Biblioteca Nacional according to Law #9.610/98, copyright #663.707. Test administration takes roughly 25 minutes; the game acquires the player’s sociodemographic data and displays its six phases in a different order at each session. It assesses Short-Term Memory (MCP MP), Long-Term Memory (MLP MP), Selective Attention (AS MP), Alternating Attention (AA MP), Inhibitory Control (CI MP), and Visuoperception (VP MP). Information on the game can be obtained by accessing the link https://www.youtube.com/watch?v=aJqvYb7jeHA&feature=youtu.be

In order to present data aimed at examining the potential equivalence between pen and paper tests and the digital game, we analyzed data from both preoperative and postoperative assessments, checking whether the correlation between tests held in both scenarios. The comparison pairs were:

VVLT13 (sum of the three short-term memory lists) and BVMTR13 (sum of the three short-term memory cards) to the MentalPlus® (MCP MP) short-term memory phase;

VVLT4 (list of four late recall) and BVMTR4 (identifying) to the MentalPlus® (MLP MP) long-term memory phase;

TMTA (TMT part A) and STROOPA (Stroop Test card A) to the MentalPlus® (AS MP) selective attention phase;

TMTB (TMT part B) to the MentalPlus® (AA MP) alternating attention phase;

STROOPC (Stroop Test card C) to the MentalPlus® (CI MP) inhibitory control phase;

STROOPA and STROOPB (Stroop Test, cards A and B) and TMTA to the MentalPlus® (VP MP) visual perception phase.

Statistical analysis was performed using R 3.6.3 (released 29/02/2020, nicknamed ‘‘Holding the Windsock’’). Correlation analysis and simple linear regression by the method of least squares were used to compare pen and paper tests and their corresponding MentalPlus® phase, using raw data. Linear regression near the bisector was not expected, because the physical methods and the game do not use the same measurement units. The coefficient of determination (R², equal to the square of the Pearson correlation coefficient, r), however, allows checking whether there is a linear relationship between the two tests. Additionally, the strength of the correlation can be classified according to the Pearson correlation coefficient20 as negligible (r < 0.1, r² < 0.01), weak (0.1 ≤ r < 0.3, 0.01 ≤ r² < 0.09), moderate (0.3 ≤ r < 0.5, 0.09 ≤ r² < 0.25), or strong (0.5 ≤ r ≤ 1, 0.25 ≤ r² ≤ 1). Sample size was calculated using the pwr package, which implements power analysis according to Cohen.21 We obtained an estimated sample size of 67 patients assuming a significance level of 5% and power of 80%, seeking to detect a correlation of at least moderate strength. In order not to interfere in patient treatment, the recruitment plan followed the pre-established surgery schedule. As the pen and paper assessment requires hours of interaction, interviewers were only able to assess one participant a day, so the first suitable participant who accepted would take the test.
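Power analysis for a correlation test is commonly based on the Fisher z transformation; a minimal Python sketch of that approximation (the helper name `required_n` is ours, not from the study) illustrates how a target near the reported 67 patients arises under the stated assumptions:

```python
import math
from statistics import NormalDist

def required_n(r, alpha=0.05, power=0.80, two_sided=True):
    """Approximate sample size to detect a Pearson correlation of
    magnitude r, via the Fisher z (normal) approximation."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2) if two_sided else nd.inv_cdf(1 - alpha)
    z_beta = nd.inv_cdf(power)
    c = 0.5 * math.log((1 + r) / (1 - r))  # Fisher z of the target r
    return math.ceil(((z_alpha + z_beta) / c) ** 2 + 3)

print(required_n(0.3))                   # two-sided, alpha = 0.05, power = 0.80
print(required_n(0.3, two_sided=False))  # one-sided variant
```

With r = 0.3 (the lower bound of moderate strength), the two-sided approximation gives 85 and the one-sided variant 68, close to the reported 67; the exact figure depends on the options passed to pwr in R.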

Results

Considering the possibility of losses, we interviewed 112 patients preoperatively. Because the study was part of a larger research project with a pre-post design assessing the decline of cognitive function after surgery under anesthesia, we only analyzed participants with complete data, that is, with both pre- and postoperative evaluations. Thus, 60 participants who accepted to repeat the evaluation in the postoperative period were included in the study, a number near the sample calculation target. The interval between pre- and postoperative evaluations ranged between 1 and 12 days, depending on the clinical recovery of the participant. Thus, 65% of participants were reassessed up to the second postoperative day, 27.6% between the third and seventh day, and 6.9% in the second week.

We assessed 36 women and 24 men with ages ranging between 29 and 82 years, and approximately 70% of participants with ages between 40 and 70 years. The median and mean ages were 54 and 52.7 years, respectively. Only 5 participants had less than 3 years of formal schooling, although even among them, interviewers reported no difficulty for the participants to understand the game and provide answers. As participant recruitment depended on operating room availability for surgery specialties, we interviewed inpatients from different surgery specialties. Thus, the distribution of participants according to units was the following: gastrointestinal surgery (26 patients), gynecology (11 patients), urology (8 patients), general surgery (9 patients), and head and neck surgery (5 patients). Information on one patient was lost.

Table 1 summarizes the comparison between conventional tests and the corresponding MentalPlus® game phases. Given that the measurement units of scores from pen and paper tests and from the game are different, our aim was not to test the equivalence between measurement methods, but their correlation. Thus, it was important to verify whether the linear regression angular coefficient was statistically different from zero (H0: ac = 0 represents the null hypothesis that the regression line slope, given by the angular coefficient, is null) and to assess the effect size using the Pearson correlation coefficient classification described by Ellis, 2010.20

Table 1
Linear regressions by the ordinary least squares method, assuming the pen and paper test measurement as the Independent Variable (IV) and the digital game measurement as the Dependent Variable (DV), assessed pre- and post-surgery (see text for description of tests). Mean ± standard deviation, the angular coefficient of the linear regressions, the statistical significance of the line slopes (H0: ac = 0, null hypothesis of a null angular coefficient), and the practical significance (effect size) are reported according to Ellis, 2010 (see text). We checked whether the correlation remained after controlling for age (H0: r = r’, null hypothesis that the correlation remained).
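The per-pair computation behind Table 1 can be sketched in Python (helper names are ours; NumPy assumed): the OLS angular coefficient and Pearson r are obtained from the raw scores, and the effect size is then classified with the Ellis (2010) cut-offs given in the Methods:

```python
import numpy as np

def slope_and_r(x, y):
    """OLS angular coefficient of y ~ x and the Pearson r, from raw scores
    (x: pen and paper test as IV, y: corresponding MentalPlus phase as DV)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    sxx = ((x - x.mean()) ** 2).sum()
    sxy = ((x - x.mean()) * (y - y.mean())).sum()
    syy = ((y - y.mean()) ** 2).sum()
    return sxy / sxx, sxy / np.sqrt(sxx * syy)

def effect_size(r):
    """Classify |r| per the Ellis (2010) cut-offs used in the text."""
    r = abs(r)
    if r < 0.1:
        return "negligible"
    if r < 0.3:
        return "weak"
    if r < 0.5:
        return "moderate"
    return "strong"
```

For example, r = 0.99 (visuoperception vs. STROOPA) classifies as strong and r = 0.41 (short-term memory vs. VVLT13) as moderate; the slope’s significance against H0: ac = 0 is then tested with the usual t statistic for a regression coefficient.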

Linear regression and correlation analysis were performed on pre- and postoperative data, considering results from the tests on paper and from the supposedly corresponding phases of the MentalPlus® game. Table 1 shows that the strongest correlations occurred between the visuoperception phase of MentalPlus® (VP) and three tests on paper: STROOPA (pre: R² = 0.986; post: R² = 0.960), STROOPB (pre: R² = 0.443; post: R² = 0.369), and TMTA (pre: R² = 0.272; post: R² = 0.264). Secondly, a moderate correlation was found between the phase of the game that is supposed to assess short-term memory and the scores from VVLT13 (pre: R² = 0.166; post: R² = 0.137), while a weak correlation was found between that phase of the game and the scores from BVMTR13 (pre: R² = 0.063; post: R² = 0.057). The remaining cognitive functions revealed weak or negligible correlations (Table 1).

Figure 1 depicts some typical cases of linear regression. The strong correlation between the scores from the phase of the game assessing visuoperception and the scores from STROOPA (Fig. 1A), STROOPB (Fig. 1B), and TMTA (Fig. 1C) is noticeable, as is the moderate correlation between the scores from the phase of the game assessing short-term memory and the scores from VVLT13 (Fig. 1D). Conversely, some expected correlations were not confirmed in the present study. Figure 1E shows the association between the phase assessing alternating attention and TMTB in which, despite the value of R² suggesting weak to moderate correlation, the slope of the regression line was unexpectedly negative and significant only for postoperative assessments (Table 1). Figure 1F depicts the regression analysis between the scores from the phase assessing inhibitory control and the STROOPC scores. Here, in addition to the correlation oscillating between weak and negligible strength, the regression line slopes are not even significant, which can be verified numerically in Table 1 or in the graphical representation of the regression analysis (a horizontal line can be drawn within the 95% confidence bands, built from the dotted lines of the several regression lines obtained by bootstrapping, which merge to shade the plot’s background).

Figure 1
Cases of linear regression. Notable: large effect size for (A) STROOPA, (B) STROOPB, and (C) TMTA; medium effect size for (D) VVLT. Among expected associations not confirmed, (E) shows the association between the alternating attention phase and TMTB, with a significant negative regression line slope for postoperative measurements, and (F) shows the regression between the inhibitory control phase of the game and STROOPC.
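The bootstrapped confidence bands described for Figure 1F amount to refitting the regression on resampled pairs; a minimal sketch under our own naming (the study's code was not published):

```python
import numpy as np

def bootstrap_slope_ci(x, y, n_boot=1000, seed=0):
    """95% bootstrap interval for the OLS slope: resample (x, y) pairs
    with replacement and refit. If the interval contains 0, a horizontal
    line fits within the confidence bands (a non-significant slope)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    slopes = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, len(x), size=len(x))
        xb, yb = x[idx], y[idx]
        sxy = ((xb - xb.mean()) * (yb - yb.mean())).sum()
        sxx = ((xb - xb.mean()) ** 2).sum()
        slopes[b] = sxy / sxx
    lo, hi = np.percentile(slopes, [2.5, 97.5])
    return lo, hi
```

On a strongly linear pair the interval stays well away from zero, as in Figures 1A-1C; on weakly related scores, as in Figure 1F, it typically straddles zero.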

A potential issue related to our results was the age variability among participants, as age was not an exclusion criterion. The wide range of patient ages (29 to 82 years) could make age act as a confounding variable. The last column of Table 1 shows the statistical significance of the correlation comparison test (first-order correlation versus partial correlation controlled for age). This comparison did not show an age effect for the large effect size correlations (visuoperception), indicating that age did not change the associations observed in these cases. There was a significant effect of age on the correlation between VVLT13 and the phase assessing short-term memory, as there were significant negative correlations between age and the scores of both tests (Age vs. VVLT13[pre]: r = -0.404, p = 0.0014; VVLT13[post]: r = -0.310, p = 0.0169; MCP MP[pre]: r = -0.307, p = 0.0170; MCP MP[post]: r = -0.308, p = 0.0176). In the remaining cases in which the correlation changed, the effect size was already small or negligible and, therefore, without practical relevance.

Moreover, Figure 2 shows the changes in the Pearson correlation coefficients when the partial correlation controlled by age was computed, showing that variations were small for stronger correlations and, therefore, that age did not interfere importantly with the conclusions of the present study.

Figure 2
Changes in Pearson correlation coefficients showing that age does not interfere in the conclusions of the study.
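The first-order partial correlation controlling for age, used above, has a closed form; a minimal sketch follows (the magnitudes in the example are illustrative, taken from the reported preoperative values with the signs stated in the text):

```python
import math

def partial_corr(r_xy, r_xz, r_yz):
    """First-order partial correlation between x and y, controlling for z
    (here z = age): (r_xy - r_xz*r_yz) / sqrt((1-r_xz^2)(1-r_yz^2))."""
    return (r_xy - r_xz * r_yz) / math.sqrt(
        (1 - r_xz ** 2) * (1 - r_yz ** 2)
    )

# Illustration near the reported pre-op values:
# r(VVLT13, MCP MP) ~ 0.41; r(age, VVLT13) ~ -0.40; r(age, MCP MP) ~ -0.31.
print(round(partial_corr(0.41, -0.40, -0.31), 2))
```

With these magnitudes the correlation drops from 0.41 to roughly 0.33 once age is partialled out, consistent with the significant age effect reported for this pair.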

Discussion

The scores of the cognitive function tests used in the present study revealed an appropriate degree of comparability between the ‘‘pen and paper’’ tests and the digital game only for the visuoperception function. For the remaining functions assessed, short- and long-term memory, selective and alternating attention, and inhibitory control, the digital game phases revealed moderate, weak, or absent correlation with their assumed corresponding standardized tests.

When the study was planned, we did not expect so many weak or absent correlations, as the phases of the digital game were specifically designed to assess certain cognitive domains. A possible interpretation of the results is that digital games have their own merit, but their distinctive dynamics do not correspond well to the pen and paper tests they were supposed to be equivalent to. The digital game phase assessing short-term memory, for example, revealed weak correlation with BVMTR13. However, BVMTR13 does not impose a delay before memory recall, unlike the game, in which recall is requested from patients after 1 minute.22

Another example of the different dynamics among methods is observed when long-term memory is assessed. For VVLT4, a list is presented to the respondent three times and recall is required after 20 minutes, unlike the digital game, in which the respondent is exposed visually to the figures only once before recall is requested after 20 minutes.22

There are also requirements that should bring forth skills depending on different cognitive processes. Regarding selective attention, STROOPA requires reading words, while the digital game requires motor skills and processing speed. For alternating attention, TMTB requires alternation between concepts (numbers and letters), while the digital game requires alternation of the position of figures. Concerning inhibitory control, STROOPC depends, in addition to the function mentioned, on reading words and knowledge of colors, while the game requires short-term memory and motor skill functions simultaneously.23

The absence of correlation between TMTA and AS MP is harder to explain, given that both tests measure selective attention influenced by motor coordination and processing speed, so the researchers assumed the requirements to be similar. According to the data obtained, however, the two assessment methods did not consistently reveal equivalence.24

More coherent results were observed among the STROOPA, STROOPB, and TMTA tests and the phase of the game designed to assess visuoperception. In this case, as both methods depend on a visual search for figures, the requirements were expected to be similar, as was observed.23

It is interesting to observe that the visuoperception phase of the digital game involves a visual search for figures that appear sequentially on the screen. Respondents should select the one that appeared last, which clearly seems to involve, in addition to the main visual task, short-term memory for the position of the figures. One can assume that visuoperception and short-term memory are two of the functions most stimulated by the digital game.8-12

A previous study compared neuropsychological tests administered by digital tools with ‘‘pen and paper’’ tests in the same patient population, and reliability was observed to be only moderate. The results obtained in the present study seem to confirm the difficulty of obtaining strong correlations, but point out visuoperception and short-term memory as functions to which the digital game was sensitive.25

Cognitive dysfunction assessment is not performed routinely. A study in heart surgery compared Mini-Mental and clock drawing screening with attention, executive function, and verbal fluency tests, and suggested the advantage of using these specific tests, as they showed significantly altered results in executive functions up to six months postoperatively.26 Also, self-administered cognitive screening tests can be tools for preoperative cognitive assessment.27

Although in the present study the remaining comparison pairs (short-term and long-term memory, selective and alternating attention, and inhibitory control) did not reveal satisfactory correlation with their supposed equivalents on paper, this does not mean that the MentalPlus® phases are not measuring any function, but rather that they may be partially measuring the same function. The two methods do not use the same measurement units, and explaining exactly which functions they measure and how they correspond remains an open challenge.

Recently, postoperative delirium has been the subject of a great deal of research, reviews, and recommendations, but robust evidence is still lacking. The present study is supported by the need to identify high-risk patients, inform them, and establish routine assessments and techniques to reduce this risk, notably the possibility of multidisciplinary non-pharmacological interventions.28 For elderly patients, anesthesia has been recommended as a factor in the decision to proceed with surgery or not, due to the risk of POCD, in addition to seeking alternative approaches to general anesthesia.29 Often, among the elderly, cognitive dysfunction seems to be exacerbated by toxic effects of drugs used in general anesthesia, detected in several areas of visual-spatial function. These effects are long-lasting, probably originate from subcortical vascular injury, and show little difference between general and regional anesthesia.30

The present study has some major limitations. Digital games require construct validation, establishing task identity and correlation with objective neuropsychological assessment. The tests adopted should be measures capable of identifying subtle changes over relatively short intervals. Regarding the comparisons in this study, the results suggest a reasonable correlation and, given the applicability (time and usability) of digital games, they can support the notion of cognitive recovery, helping to rule out the ‘‘all or nothing’’ concept when diagnosing this status.31 A second limitation relates to the age of participants, which was not considered among the inclusion or exclusion criteria of the study. Recruitment was therefore based on opportunity, following the surgery schedule. As a result, patients of varied ages underwent a variety of procedures, with little control over the postoperative time at which assessment was possible. Additionally, the attempt to compare the digital game with traditional pen-and-paper tools made the assessment sessions too long.

Although only 60 of the 112 patients agreed to repeat the assessment session in the postoperative period, we can only speculate about how much less cooperative participants would be during postoperative recovery after having experienced a long preoperative assessment. However, since the aim of the study was not to obtain a clinical assessment, but only to examine whether the scores obtained with traditional instruments and with the digital game followed the same pattern, not having a second assessment was probably not harmful to the patients who refused (although they may have missed some benefit).

Although cognitive tests, whether validated or under investigation, provide only an estimate of a skill or function, this study adequately assessed the visuoperception function in the context of POCD, within the limitations discussed. However, the MentalPlus® digital game still needs further studies to be validated. The associations described here suggest the digital game is an instrument with potential, notably because it reduces neuropsychological assessment time, offers simple results based on the number of correct responses in each phase of the game, and is self-explanatory to patients. Therefore, its use can be encouraged as a fast and accessible method to monitor POCD.

References

  • 1
Luria AR. Fundamentos de Neuropsicologia. Aranha JR, translator. Rio de Janeiro: Livros Técnicos e Científicos; São Paulo: Ed. da Universidade de São Paulo; 1981.
  • 2
    Mader MJ. Avaliação neuropsicológica: aspectos históricos e situação atual. Psicologia, ciência, profissão [Internet]. 1996;16(3):12-8, http://dx.doi.org/10.1590/S1414-98931996000300003 [cited 2019 Dec 5].
  • 3
    Capovilla AGS. Contribuições da neuropsicologia cognitiva e da avaliação neuropsicológica à compreensão do funcionamento cognitivo humano. Cadernos de Psicopedagogia [on-line]. 2007;6(11).
  • 4
Lezak MD. Neuropsychological Assessment. 5th ed. New York: Oxford University Press; 2012.
  • 5
    Strauss E, Sherman EMS, Spreen O. A Compendium of Neuropsychological Tests: Administration, Norms and Commentary. 3rd ed. New York: Oxford University Press; 2006.
  • 6
    Malloy-Diniz LF, Fuentes D, Mattos P, et al. Avaliação Neuropsicológica. 2nd ed. Porto Alegre: Artmed Editora; 2018.
  • 7
    Reppold CT, Gurgel LG, Hutz CS. O processo de construção de escalas psicométricas. Avaliação Psicológica. 2014;13:307-10.
  • 8
    Rivero TS, Querino EHG, Starling-Alves I. Videogame: seu impacto na atenção, percepção e funções executivas. Neuropsicologia Latinoamericana. 2012;4:38-52.
  • 9
    Green CS, Li R, Bavelier D. Perceptual learning during action video game playing. Top Cogn Sci. 2010;2:202-16.
  • 10
    Blumberg FC, Fisch SM. Introduction: digital games as a context for cognitive development, learning, and developmental research. New Dir Child Adolesc Dev. 2013;2013:1-9.
  • 11
    Merabet LB, Connors EC, Halko MA, et al. Teaching the blind to find their way by playing video games. PLoS One. 2012;7:e44958.
  • 12
    Bavelier D, Green CS, Pouget A, et al. Brain plasticity through the life span: learning to learn and action video games. Annu Rev Neurosci. 2012;35:391-416.
  • 13
    Valentin LSS, Pereira VFA. Digital game: a scale to evaluate the perioperative cognitive function (MentalPlus®), https://www.mnkjournals.com/journal/ijlrst/pdf/Volume_6_1_2017/10705.pdf [Accessed 18 January 2021].
  • 14
    Valentin LSS, Pereira VFA. MentalPlus® digital game is reliable to measure cognitive function in healthy adults. A future accessible tool to assess postoperative cognitive dysfunction and rehabilitation. Inter J Psychiatry. 2017;2:1-6.
  • 15
    Canet J, Reader J, Rasmussen LS, et al. Cognitive dysfunction after minor surgery in the elderly. Acta Anaesthesiol Scand. 2003;47:1204-10.
  • 16
    Steinmetz J, Funder KS, Dahl BT, et al. Depth of anaesthesia and post-operative cognitive dysfunction. Acta Anaesthesiologica Scandinavica. 2010;54:162-8.
  • 17
    Rasmussen LS, ISPOCD2 investigators. Post-operative cognitive dysfunction in the elderly. Acta Anaesthesiol Scand. 2005;49:1573.
  • 18
    Silverstein JH, Steinmetz J, Reichenberg A, et al. Postoperative cognitive dysfunction in patients with preoperative cognitive impairment: which domains are most vulnerable? Anesthesiology. 2007;106:431-5.
  • 19
    Valentin LSS, Pietrobon R, Aguiar Junior W, et al. Definition and application of neuropsychological test battery to evaluate postoperative cognitive dysfunction. Einstein (São Paulo). 2015;13:20-6.
  • 20
    Ellis PD. The Essential Guide to Effect Sizes. 1st ed. Cambridge University Press; 2010.
  • 21
    Cohen J. Statistical power analysis for the behavioral sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum; 1988.
  • 22
    Benedict RHB, Schretlen D, Groninger L, et al. Revision of the Brief Visuospatial Memory Test: Studies of normal performance, reliability, and validity. Psychol Assess. 1996;8:145-53.
  • 23
    Scarpina F, Tagini S. The Stroop Color and Word Test. Front Psychol. 2017;8:557.
  • 24
    Llinàs-Reglà J, Vilalta-Franch J, López-Pousa S, et al. The Trail Making Test. Assessment. 2017;24:183-96.
  • 25
    Radtke FM, Franck M, Papkalla N, et al. Postoperative cognitive dysfunction: computerized and conventional tests showed only moderate inter-rater reliability. J Anesth. 2010;24:518-25.
  • 26
    Pérez-Belmonte LM, Florido-Santiago M, Osuna-Sánchez J, et al. Screening versus brief domain-specific tests to assess long-term postoperative cognitive dysfunction after concomitant aortic valve replacement and coronary artery bypass grafting. J Cardiovasc Nurs. 2019;34:511-6.
  • 27
Stoicea N, Koehler KN, Scharre DW, et al. Cognitive self-assessment scales in surgical settings: Acceptability and feasibility. Best Pract Res Clin Anaesthesiol. 2018;32:303-9.
  • 28
    Hughes CG, Boncyk CS, Culley DJ, et al. American Society for Enhanced Recovery and Perioperative Quality Initiative Joint Consensus Statement on Postoperative Delirium Prevention. Anesth Analg. 2020;130:1572-90.
  • 29
    Cottrell JE, Hartung J. Anesthesia and cognitive outcome in elderly patients: a narrative viewpoint. J Neurosurg Anesthesiol. 2020;32:9-17.
  • 30
    Ancelin ML, de Roquefeuil G, Scali J, et al. Long-term post-operative cognitive decline in the elderly: the effects of anesthesia type, apolipoprotein E genotype, and clinical antecedents. J Alzheimers Dis. 2010;22 Suppl 3:105-13.
  • 31
    Piggin LH, Newman SP. Measuring and monitoring cognition in the postoperative period. Best Pract Res Clin Anaesthesiol. 2020;34:e1-12.

Publication Dates

  • Publication in this collection
    28 Feb 2022
  • Date of issue
    Jan-Feb 2022

History

  • Received
    13 Jan 2020
  • Accepted
    26 June 2021
  • Published
    16 Aug 2021
Sociedade Brasileira de Anestesiologia (SBA), Rua Professor Alfredo Gomes, 36, Botafogo, CEP: 22251-080, Rio de Janeiro - RJ, Brazil. Tel: +55 (21) 97977-0024.
E-mail: editor.bjan@sbahq.org