
Naive skepticism scale: development and validation tests applied to the Chilean population

Abstract

Background

Skepticism has traditionally been associated with critical thinking. However, philosophy has proposed a particular type of skepticism, termed naive skepticism, which may increase susceptibility to misinformation, especially when evaluating information from official sources. Although some scales have been proposed to measure skepticism, they are scarce and topic-specific; thus, new instruments are needed to assess this construct.

Objective

This study aimed to develop a scale to measure naive skepticism in the adult population.

Method

The study involved 446 individuals from the adult population. Subjects were randomly selected for either the pilot study (phase 2; n = 126) or the validity-testing study (phase 3; n = 320). Parallel analyses and exploratory structural equation modelling were conducted to assess the internal structure of the test. Scale reliability was estimated using Cronbach's alpha and McDonald's omega coefficients. Finally, a multigroup confirmatory factor analysis was performed to assess invariance, and Set-Exploratory Structural Equation Modeling (Set-ESEM) was applied to estimate evidence of validity based on associations with other variables.

Results

The naive skepticism scale provided adequate levels of reliability (ω > 0.8), evidence of validity based on the internal structure of the test (CFI = 0.966; TLI = 0.951; RMSEA = 0.079), gender invariance, and a moderate inverse effect on attitudes towards COVID-19 vaccines.

Conclusions

The newly developed naive skepticism scale showed acceptable psychometric properties in an adult population, thus enabling the assessment of naive skepticism in similar demographics. This paper discusses the implications for the theoretical construct and possible limitations of the scale.

Keywords
Naive Skepticism; Measurement; Misinformation; Scale development

In recent literature, misinformation is understood as any partially false information (Ecker et al., 2022; Wang et al., 2019), with a focus on its consequences, such as influencing the spread of risky health behaviors (Wang et al., 2019). Misinformation has been identified as the leading direct cause of death among young people and an indirect cause in adults (WHO, 2020). Its impact was particularly evident during the COVID-19 pandemic, notably affecting attitudes towards vaccines (Dubé et al., 2022; Zheng et al., 2022). These attitudes can be influenced by a lack of knowledge or a predisposition to assimilate false or biased information, thus increasing misbeliefs about the consequences of risky health behaviors. A plausible hypothesis, therefore, is that naive skepticism affected attitudes towards vaccines in the context of COVID-19 (Van Bavel et al., 2020; Roozenbeek & van der Linden, 2019; Roozenbeek et al., 2020). To this end, part of the scientific community is investigating variables that heighten susceptibility to misinformation, suggesting various psychological traits as protective factors (e.g., analytical thinking, deductive and inductive reasoning) (Sindermann et al., 2020) or risk factors (e.g., receptivity to nonsense, political orientation, religious beliefs) (Gligorić et al., 2022; Pennycook & Rand, 2019a).

From the perspective of educational philosophy, the concept of naive skepticism has emerged. It potentially impairs the ability of young adults to correctly judge the veracity of information (Wright, 2019), thus playing a role in discriminating between false and accurate information. Naive skepticism is defined as a psychological trait characterized by the tendency to dismiss information without critical analysis, even when such information is supported by truthful, or at least reasonably acceptable, evidence. This trait is distinct from reasoned skepticism, which is grounded in arguments or evidence supporting the skeptical stance (Wright, 2019). Furthermore, naive skepticism differs from conspiracy belief. Naive skepticism corresponds to a general tendency to question the credibility of official sources, whereas conspiracy beliefs concern individuals who endorse specific conspiracies or have a strong inclination towards conspiracy thinking; such beliefs are often tied to particular topics, making them less applicable across contexts (Douglas et al., 2019; Imhoff & Lamberty, 2018; Van Prooijen & Douglas, 2017; Quiring et al., 2021). Notably, naive skeptics are predisposed to accepting conspiracy theories (Quiring et al., 2021).

Regarding how naive skepticism might increase vulnerability to disinformation, it has been suggested that naive skepticism influences the level of information processing, primarily through reasoning motivated by maintaining one's beliefs (Wood & Porter, 2019). This reasoning makes individuals more susceptible to disinformation that aligns with their central belief systems or group identity (Erion, 2005). Consequently, naive skepticism emerges as a risk factor, leading to a higher tendency to believe disinformation (Wright, 2020).

Naive skepticism usually manifests as a reluctance to trust official information sources, which, in the context of healthy democracies, typically base their communications on evidence or reasonable conjecture. Notable sources often subjected to unfounded questioning by individuals with higher levels of naive skepticism include scientific organizations (e.g., Steffens et al., 2019), governmental organizations (e.g., Van Scoy et al., 2021; Lynch, 2023), and mainstream media (e.g., Nekmat, 2020).

Given its explanatory and predictive potential regarding susceptibility to misinformation, some scales have been developed to assess skepticism as a naive trait. These include: (1) the Skepticism Towards Advertising Scale, which measures skepticism towards advertisements with 9 items (e.g., "We can depend on getting the truth in most advertising") and has a unidimensional structure (Obermiller & Spangenberg, 1998); (2) the Climate Change Skepticism Scale, which assesses skepticism towards climate change with 3 items (e.g., "I doubt that there is global warming going on") (Ojala, 2015); and (3) the Professional Skepticism Scale, which measures professional skepticism in the audit process multidimensionally (i.e., as an individual trait and as a state in professionals) with 30 items (e.g., "I often accept other people's explanations without further thought") (Hurtt, 2010). However, these scales are domain-specific and do not assess naive skepticism as a general trait, as conceptualized by Wright (2020).

Despite the relevance of naive skepticism as a general trait in susceptibility to misinformation, empirical studies supporting this notion are limited. This paucity may be due to the lack of measurement scales that provide valid and reliable means of testing this and other related hypotheses. Consequently, the aim of this study was to develop a new scale to measure naive skepticism, providing evidence of its validity, reliability, and invariance in an adult population. Given the nascent nature of this field of study and the scarcity of prior research, establishing equivalence between men and women in comprehending the concept of naive skepticism was crucial. This aligns with the ongoing debate surrounding the gender similarity hypothesis, which considers the potential for theorizing similarities between men and women (Hyde, 2014). Hence, gender invariance was a focus. This new scale will expand the capabilities for studying susceptibility to misinformation and its ramifications.

Materials and method

Participants

This research, an instrumental study with a cross-sectional design (Ato et al., 2013), was conducted in three primary phases. Phase 1 involved drafting new items, which were then evaluated by expert judges (refer to the Instruments section for more details). Phase 2 focused on exploring the dimensionality of the scale (i.e., the revised, post-expert-review version). The final phase involved assessing scale validity based on the internal structure of the test and its associations with other variables. It is important to note that only the last two phases had distinct data sets.

In Phase 2, a total of 126 adults participated. Of these, 55.6% (n = 70) were female and 44.4% (n = 56) were male, with a mean age of 24 years (SD = 7.01). Phase 3 saw the participation of 320 adults, with 54.4% (n = 174) female, 44.7% (n = 143) male, and 0.6% (n = 2) identifying as non-binary. The mean age in this phase was 29.32 years (SD = 11.01). Detailed demographic information for Phase 3 is presented in Table 1.

Table 1
Sociodemographic characteristics of the study (Phase 3)

Instruments

Naive Skepticism Scale (NSS): The NSS is an ad-hoc instrument designed to assess the level of naive skepticism in individuals. It comprises two dimensions: skepticism towards governmental organizations and the official press (hereafter referred to as SGO; 7 items) and skepticism towards science (hereafter referred to as SS; 7 items). The scale utilizes a Likert format with five response categories, ranging from 1 = "Never" to 5 = "Always." The items in the NSS are statements reflecting distrust towards science, governmental organizations, and the press.

Given the absence of standardized instruments for measuring naive skepticism broadly, the initial phase involved developing a comprehensive operational definition based on a literature review. This definition encompassed the overall tendency to reject information without critical analysis or the support of reliable evidence, as well as specific aspects of each sub-dimension (i.e., SGO, SS). Subsequently, 37 items were drafted following the guidelines for creating Likert-type scale items proposed by AERA, APA, and NCME (2014) and by Muñiz and Fonseca-Pedrero (2019). These items were then evaluated by four expert judges (all from the social sciences field; three doctoral researchers and one master's student) for grammatical adequacy (coherence and clarity) and construct representativeness. Judges individually scored each item as 1, 0, or −1, where "1" indicated grammatical adequacy and construct representativeness. Items with mean scores less than or equal to 0 were discarded.
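As a rough illustration, the judge-based filtering rule described above can be sketched as follows; the ratings here are simulated, not the study's actual judge data:

```python
import numpy as np

# Simulated ratings: 4 judges score each of the 37 drafted items
# as 1 (adequate and representative), 0, or -1.
rng = np.random.default_rng(0)
ratings = rng.choice([-1, 0, 1], size=(37, 4), p=[0.1, 0.2, 0.7])

item_means = ratings.mean(axis=1)
retained = np.where(item_means > 0)[0]    # items kept for the pilot version
discarded = np.where(item_means <= 0)[0]  # items dropped (mean <= 0)

print(f"{len(retained)} items retained, {len(discarded)} discarded")
```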

The revised version resulted in a 23-item scale, which was used in Phase 2 (n = 126) for initial dimensionality exploration. Item selection was guided by content relevance, the corrected homogeneity index, and parallel analysis. The process culminated in a refined 14-item scale used in Phase 3 in the adult population. Detailed psychometric evidence for this final version is presented in the Results section.

Spanish Version of the COVID-19 Vaccine Attitude Scale (Campo-Arias et al., 2021): This 8-item instrument measures favorable attitudes toward COVID-19 vaccines. Responses are given in a Likert format ranging from 0 = "Strongly Disagree" to 4 = "Strongly Agree." Items are scored directly from 0 to 4, except for item 7, which is reverse-scored. Total scores range from 0 to 32, with higher scores indicating more favorable attitudes towards, or acceptance of, COVID-19 vaccines. The Spanish version demonstrated strong internal consistency (Cronbach's alpha of 0.94 and McDonald's omega of 0.95) and a unidimensional structure with acceptable goodness-of-fit indicators (CFI = 0.94, TLI = 0.91, SRMR = 0.04) (Campo-Arias et al., 2021).

Procedure

Data collection for this study was limited to Phases 2 and 3. Phase 2 data were gathered between August and November 2021 through an online questionnaire. In contrast, Phase 3 data collection occurred from May to October 2022, utilizing both a self-administered pencil-and-paper questionnaire and an online format.

In both phases, participants received an explanation of the study and were asked to sign an informed consent form. This form outlined the research objectives, participants’ rights, and the anonymity and confidentiality of their involvement. Participation was entirely voluntary, with no rewards or incentives offered. The mean response time to complete the questionnaires was between 15 and 20 min for both phases.

The Scientific Ethics Committee of the Universidad de Tarapacá granted ethical approval for this research, conducted as part of FONDECYT Regular Project No. 1220664.

Data analysis

In Phase 2, parallel analysis was performed to establish the dimensionality of the instrument, using minimum-residual estimation and oblimin rotation (Goretzko et al., 2021). The scale was then refined iteratively based on three criteria: (1) retaining items with substantial factor loadings (λ > 0.5); (2) removing redundant items (Abad et al., 2011); and (3) removing items with large cross-loadings (> 0.3) (Muthén & Asparouhov, 2012; Xiao et al., 2019). An item analysis using the corrected homogeneity index followed, with indices greater than 0.05 deemed adequate and significant. This process resulted in a 14-item scale across two dimensions.
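The general logic of parallel analysis (Horn's method) can be sketched with simulated data. This is a simplified version based on Pearson correlations of random normal data, whereas the study used minimum-residual estimation with oblimin rotation on ordinal items:

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Horn's parallel analysis on a (respondents x items) data matrix.

    Retains factors whose observed eigenvalues exceed the mean
    eigenvalues obtained from random data of the same shape.
    """
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand_eig = np.zeros((n_iter, p))
    for i in range(n_iter):
        random_data = rng.standard_normal((n, p))
        rand_eig[i] = np.sort(
            np.linalg.eigvalsh(np.corrcoef(random_data, rowvar=False)))[::-1]
    return int(np.sum(obs_eig > rand_eig.mean(axis=0)))

# Simulated two-factor structure: 126 respondents, 14 items,
# 7 items loading 0.7 on each factor plus noise.
rng = np.random.default_rng(1)
factors = rng.standard_normal((126, 2))
loadings = np.zeros((14, 2))
loadings[:7, 0], loadings[7:, 1] = 0.7, 0.7
data = factors @ loadings.T + 0.5 * rng.standard_normal((126, 14))
print(parallel_analysis(data))
```

For this simulated structure the procedure should suggest two factors, mirroring the two-dimensional solution retained in the study.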

For Phase 3, an Exploratory Structural Equation Model (ESEM) with GEOMIN rotation (Asparouhov & Muthén, 2009) and the weighted least squares estimation method was performed to establish evidence of validity based on the internal structure of the test. This estimation method is robust for non-normal discrete variables (DiStefano & Morgan, 2014; Li, 2016). Given the ordinal structure of the data, a polychoric correlation matrix was also used (Barendse et al., 2015). Reliability for each dimension was estimated using non-ordinal versions of Cronbach's alpha and McDonald's omega coefficients (Viladrich et al., 2017). Measurement invariance across genders was assessed through multigroup (i.e., metric and scalar) confirmatory factor analysis (MGCFA), with Comparative Fit Index (CFI) decreases of less than 0.010 taken as evidence of invariance (Chen, 2007). Because the survey was administered in both pencil-and-paper and online formats, invariance across application formats was also tested. Furthermore, evidence of validity based on the relationship with other variables was established through Set-ESEM (employing GEOMIN rotation, the weighted least squares estimator, and polychoric correlations), examining the relationship between the dimensions of the naive skepticism scale and attitudes towards COVID-19 vaccines.

Model fit was evaluated following Schreiber's (2017) recommended cut-off indicators: the CFI, the Tucker-Lewis Index (TLI), and the Root Mean Square Error of Approximation (RMSEA) (e.g., CFI > 0.95; TLI > 0.95; RMSEA < 0.06). Parallel analysis, reliability coefficients, and the homogeneity index were obtained using the Jamovi program v2.0.0 (The Jamovi Project, 2020), while the ESEM analyses were conducted in Mplus v8.2 (Muthén & Muthén, 1998–2017).

Results

Phase 2: Pilot study

Parallel analysis

In the 23-item version of the scale, parallel analysis suggested a three-factor solution (see Fig. 1A). The first four eigenvalues of this version were: 1 = 79.131; 2 = 16.878; 3 = 0.8765; 4 = 0.5506. However, on closer examination of this structure, six items showed cross-loadings or loadings lower than 0.4. Consequently, an iterative revision and refinement process was undertaken, focusing on item content and the corrected homogeneity index. This revision resulted in a refined two-dimensional scale (see Fig. 1B) comprising 14 items: (a) SGO (7 items) and (b) SS (7 items), as illustrated in Fig. 1. The first three eigenvalues of this version were: 1 = 46.830; 2 = 14.015; 3 = 0.2222.

Fig. 1
Parallel analysis model. A 23-item version; B 14-item version

Phase 3: Validity testing

Evidence of validity based on the internal structure of the test

The ESEM with the 14-item version of the NSS exhibited fit indicators aligning with literature recommendations (Schreiber, 2017). The model demonstrated satisfactory fit: χ2(64) = 190.597; CFI = 0.966; TLI = 0.951; RMSEA = 0.079 (confidence interval: 0.066–0.092); SRMR = 0.034. Factor loadings and reliability estimates are presented in Table 2.

Table 2
Descriptive information of the NSS and resulting factor loadings from the ESEM

Item factor loadings for each dimension indicated strong representation (SGO, λ = 0.53–0.79; SS, λ = 0.52–0.84) with low cross-factor loadings (SGO, λ = −0.07–0.27; SS, λ = −0.01–0.25). Reliability estimates were adequate for both SGO (α = 0.841, ω = 0.843) and SS (α = 0.845, ω = 0.850) (Cho & Kim, 2015).

Factorial invariance

The CFI deltas did not show a decrease in fit greater than 0.010 in either the metric or scalar models compared to the configural model (i.e., multigroup CFA by sex). This finding suggests equivalence of factor loadings and intercepts between male and female participants, indicating that the items hold consistent meaning across these groups. Regarding the application format, the CFI deltas likewise did not show a decrease in fit greater than 0.010 in either the metric or scalar models compared to the configural model, suggesting equivalence of factor loadings and intercepts between the paper-and-pencil and online formats. A t-test was also conducted to assess whether the two application formats differed. The results for independent samples showed statistically significant differences in skepticism towards governmental organizations and the official press (Student's t(318) = 2.49; p = 0.013; Cohen's d = 0.337; online: M = 3.45, SD = 0.71; paper-and-pencil: M = 3.22, SD = 0.59) and in skepticism towards science (Welch's t(318) = 2.52; p = 0.013; Cohen's d = 0.310; online: M = 2.13, SD = 0.73; paper-and-pencil: M = 1.93, SD = 0.53). In short, participants showed higher levels of both skepticism dimensions in the online format than in the paper-and-pencil format. Detailed invariance tests by sex and application format, using the final version of the scale, are presented in Table 3.
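A format comparison of this kind can be sketched as follows. The simulated scores only mirror the reported SGO means and SDs by format, so the resulting t, p, and d values will differ from the study's:

```python
import numpy as np
from scipy import stats

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                     / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled

# Simulated scores mirroring the reported SGO means/SDs by format
rng = np.random.default_rng(42)
online = rng.normal(3.45, 0.71, 160)
paper = rng.normal(3.22, 0.59, 160)

t_student, p_student = stats.ttest_ind(online, paper)               # Student's t
t_welch, p_welch = stats.ttest_ind(online, paper, equal_var=False)  # Welch's t
print(f"d = {cohens_d(online, paper):.3f}")
```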

Table 3
Fit indexes for multi-group confirmatory factor analysis of the NSS by sex and application format

Evidence of validity based on the relationship with other variables

The effect of the final version of the NSS on attitudes towards COVID-19 vaccines was evaluated using a Set-ESEM. This model demonstrated satisfactory fit indicators: χ2(174) = 354.029; CFI = 0.966; TLI = 0.955; RMSEA = 0.057 (confidence interval: 0.048–0.065); and SRMR = 0.039, thus aligning with the recommendations of Schreiber (2017) (illustrated in Fig. 2).

Fig. 2
Set-ESEM model: graphical representation of the relationships between naive skepticism and attitudes towards COVID-19 vaccines

The model revealed moderate, inverse, and statistically significant relationships between the latent variables: SGO and SS were inversely correlated with attitudes towards COVID-19 vaccines (SGO: γ = −0.202, p < 0.001; SS: γ = −0.425, p < 0.001).
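The fit indices reported for the Set-ESEM model can be screened mechanically against conventional cutoffs. The sketch below is illustrative only: the thresholds (CFI/TLI ≥ .95, RMSEA ≤ .08, SRMR ≤ .08) are commonly used conventions in SEM reporting, not values quoted from Schreiber (2017).

```python
# Conventional cutoffs commonly used in SEM reporting (assumed, not quoted
# from Schreiber, 2017): higher is better for CFI/TLI, lower for RMSEA/SRMR.
CUTOFFS = {
    "cfi":   lambda v: v >= 0.95,  # comparative fit index
    "tli":   lambda v: v >= 0.95,  # Tucker-Lewis index
    "rmsea": lambda v: v <= 0.08,  # root mean square error of approximation
    "srmr":  lambda v: v <= 0.08,  # standardized root mean square residual
}

def screen_fit(indices: dict) -> dict:
    """Return, for each reported index, whether it meets its cutoff."""
    return {name: CUTOFFS[name](value) for name, value in indices.items()}

# Fit of the Set-ESEM model reported above:
set_esem_fit = {"cfi": 0.966, "tli": 0.955, "rmsea": 0.057, "srmr": 0.039}
print(screen_fit(set_esem_fit))
# {'cfi': True, 'tli': True, 'rmsea': True, 'srmr': True}
```

All four indices clear their thresholds, which is what the text summarizes as "satisfactory fit indicators."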

Discussion

The primary objectives of this study were to develop a scale for measuring naive skepticism among adults in Chile and to gather preliminary psychometric evidence to support its interpretation and application in researching risk factors associated with misinformation. The fit statistics of the 14-item model, the magnitude of factor loadings, and the lack of relevant cross-loadings substantiate the model's two-dimensional structure. These findings provide evidence of validity based on the internal structure, ensuring accurate interpretation of the scores. Furthermore, reliability coefficient estimates affirm that each dimension exhibits a satisfactory consistency level.

The 14-item model also demonstrated metric and scalar measurement invariance across genders, allowing the application of the scale to both men and women. This invariance signifies that the factor loadings were equivalent between groups and that the dimensions exhibited similar variability across sexes. Consequently, this new scale presents an opportunity to investigate the gender similarity hypothesis in susceptibility to misinformation in future research, as proposed by Hyde (2014). The model also demonstrated metric and scalar invariance between application formats, allowing for administration in both paper-and-pencil and online formats. It should be noted that the slight differences in means between the application formats are not sufficient to suggest the use of one format over the other, and there is no differential functioning of the instrument (i.e., evidence of invariance). Therefore, for future psychometric testing, the application formats can be used in combination or separately.

In terms of validity evidence based on the association with other variables, the dimensions of the NSS were found to correlate with attitudes toward COVID-19 vaccines, aligning with the anticipated direction and corroborating prior studies (Van Bavel et al., 2020; Roozenbeek & van der Linden, 2019; Roozenbeek et al., 2020). These studies have indicated that negative attitudes towards vaccines are frequently underpinned by mistaken beliefs about the consequences of such behaviors, stemming from either ignorance or a predisposition to accept false or biased information. This inclination, potentially influenced by individual factors like naive skepticism, increases vulnerability to misinformation (Wright, 2020). Consequently, misinformation hampers the development of behaviors essential for health prevention and maintenance, inadvertently encouraging behaviors that compromise health.

Limitations and implications

The main limitation of this study lies in the size and representativeness of the sample. Being non-probabilistic, the generalizability of the findings to the broader population is constrained. Accordingly, it is recommended that future psychometric studies utilizing this instrument be extended to diverse groups, such as adolescents, older adults, and individuals from varied socioeconomic backgrounds and educational levels, as well as in the medical, health, and educational contexts.

Given the nascent stage of this field, future research should also explore the convergent and discriminant validity of the NSS in comparison with scales measuring belief in conspiracy theories (e.g., the Generic Conspiracist Beliefs Scale, Brotherton et al., 2013; the Belief in Conspiracy Theories Inventory, Swami et al., 2010) and critical thinking (e.g., the Critical Thinking Disposition Scale, Sosu, 2013). This approach will enable the differentiation and correlation of these theoretical constructs.

Considering the significant impact of naive skepticism on health behaviors (Wang et al., 2019; Zheng et al., 2022), incorporating this new scale into assessment protocols within health services or educational centers could prove beneficial. The insights garnered from this instrument could aid in identifying individuals susceptible to misinformation and to the practice of risky behaviors, who may therefore require tailored preventive interventions. Consequently, current strategies addressing health risk behaviors in adults could be enhanced, focusing on the adoption of certain behaviors and the avoidance of others.

Conclusion

The final 14-item version of the NSS demonstrated evidence of reliability and validity. This evidence, grounded in the internal structure of the test, measurement invariance, and associations with other variables, supports the applicability of the scale in sample groups akin to those in this study. The preliminary findings indicate that this scale represents a novel, concise instrument crafted using modern psychometric techniques. The NSS provides an updated and alternative proposal to assess naive skepticism and holds potential for use in researching psychological factors associated with health risk behaviors.

  • Funding
    ANID (National Agency for Research and Development) sponsored and funded this research through grant FONDECYT regular No 1220664.

Declarations

Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Acknowledgements

Not applicable.

References

  • Abad, F., Olea, J., Ponsoda, V., García, C. (2011). Medición en ciencias sociales y de la salud. Madrid: Síntesis. 26–38 p.
  • American Educational Research Association, American Psychological Association & National Council on Measurement in Education (2014). Standards for educational and psychological testing. Washington, DC.
  • Asparouhov, T., & Muthén, B. (2009). Exploratory structural equation modeling. Structural Equation Modeling: A Multidisciplinary Journal, 16(3), 397–438. https://doi.org/10.1080/10705510903008204
    » https://doi.org/10.1080/10705510903008204
  • Ato, M., López-García, J., & Benavente, A. (2013). Un sistema de clasificación de los diseños de investigación en psicología. Anales De Psicología, 29(3), 1038–1059.
  • Baban, A., & Craciun, C. (2007). Changing health-risk behaviors: A review of theory and evidence-based interventions in health psychology. Journal of Evidence-Based Psychotherapies, 7(1), 45.
  • Barendse, M., Oort, F., & Timmerman, M. (2015). Using exploratory factor analysis to determine the dimensionality of discrete responses. Structural Equation Modeling: A Multidisciplinary Journal, 22(1), 87–101. https://doi.org/10.1080/10705511.2014.934850
    » https://doi.org/10.1080/10705511.2014.934850
  • Brotherton, R., French, C. C., & Pickering, A. D. (2013). Measuring belief in conspiracy theories: The generic conspiracist beliefs scale. Frontiers in Psychology, 4(279), 1–15. https://doi.org/10.3389/fpsyg.2013.00279
    » https://doi.org/10.3389/fpsyg.2013.00279
  • Campo-Arias, A., Caamaño-Rocha, L., & Pedrozo-Pupo, J. (2021). Spanish Version of the Attitude Towards COVID-19 Vaccines Scale: Reliability and Validity Assessment. medRxiv https://doi.org/10.1101/2021.07.18.21260733
    » https://doi.org/10.1101/2021.07.18.21260733
  • Chen, F. (2007). Sensitivity of goodness of fit indexes to lack of measurement invariance. Structural Equation Modeling: A Multidisciplinary Journal, 14(3), 464–504. https://doi.org/10.1080/10705510701301834
    » https://doi.org/10.1080/10705510701301834
  • Cho, E., & Kim, S. (2015). Cronbach's coefficient alpha: Well-known but poorly understood. Organizational Research Methods, 18(2), 207–230. https://doi.org/10.1177/1094428114555994
    » https://doi.org/10.1177/1094428114555994
  • Cohen, J. (1988). Statistical Power Analysis for the Behavioural Sciences (2nd ed.). Routledge.
  • Datta, S., O'Connor, P., Jankovic, D., Muscat, M., Ben Mamou, M., Singh, S., Kaloumenos, T., Reef, S., Papania, M., & Butler, R. (2017). Progress and challenges in measles and rubella elimination in the WHO European Region. Vaccine, 36(36), 5408–5415. https://doi.org/10.1016/j.vaccine.2017.06.042
    » https://doi.org/10.1016/j.vaccine.2017.06.042
  • De Keersmaecker, J., Dunning, D., Pennycook, G., Rand, D., Sanchez, C., Unkelbach, C., & Roets, A. (2020). Investigating the robustness of the illusory truth effect across individual differences in cognitive ability, need for cognitive closure, and cognitive style. Personality and Social Psychology Bulletin, 46(2), 204–215. https://doi.org/10.1177/0146167219853844
    » https://doi.org/10.1177/0146167219853844
  • Diamantopoulos, A., Sarstedt, M., Fuchs, C., Wilczynski, P., & Kaiser, S. (2012). Guidelines for choosing between multi-item and single-item scales for construct measurement: A predictive validity perspective. Journal of the Academy of Marketing Science, 40(3), 434–449. https://doi.org/10.1007/s11747-011-0300-3
    » https://doi.org/10.1007/s11747-011-0300-3
  • DiStefano, C., & Morgan, G. (2014). A comparison of diagonal weighted least squares robust estimation techniques for ordinal data. Structural Equation Modeling: A Multidisciplinary Journal, 21(3), 425–438. https://doi.org/10.1080/10705511.2014.915373
    » https://doi.org/10.1080/10705511.2014.915373
  • Douglas, K. M., Uscinski, J. E., Sutton, R. M., Cichocka, A., Nefes, T., Ang, C. S., & Deravi, F. (2019). Understanding Conspiracy Theories. Political Psychology, 40, 3–35. https://doi.org/10.1111/pops.12568
    » https://doi.org/10.1111/pops.12568
  • Dubé E., MacDonald Sh., Manca T., Bettinger J., Driedger S., Graham J., Greyson D., MacDonald N., Meyer S., Roch G., Vivion M., Aysworth L., Witteman H., Gélinas-Gascon F., Sathler L., Hakim H., Gagnon D., Béchard B., Gramaccia J., Khoury R.,Tremblay S. (2022). Understanding the Influence of Web-Based Information, Misinformation, Disinformation, and Reinformation on COVID-19 Vaccine Acceptance: Protocol for a Multicomponent Study. JMIR Research Protocols. 11(10). https://doi.org/10.2196/41012
    » https://doi.org/10.2196/41012
  • Erion, G. (2005). Engaging student relativism. Discourse, 5(1), 120–133.
  • Ecker, U. K. H., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., Kendeou, P., Vraga, E. K., & Amazeen, M. A. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1(1), 13–29. https://doi.org/10.1038/s44159-021-00006-y
    » https://doi.org/10.1038/s44159-021-00006-y
  • Glanz, K., & Bishop, D. (2010). The Role of Behavioral Science Theory in Development and Implementation of Public Health Interventions. Annual Review of Public Health, 31(1), 399–418. https://doi.org/10.1146/annurev.publhealth.012809.103604
    » https://doi.org/10.1146/annurev.publhealth.012809.103604
  • Gligorić, V., Feddes, A., & Doosje, B. (2022). Political bullshit receptivity and its correlates: A cross-country validation of the concept. Journal of Social and Political Psychology, 10(2), 411–429. https://doi.org/10.5964/jspp.6565
    » https://doi.org/10.5964/jspp.6565
  • Goretzko, D., Pham, T. T. H., & Bühner, M. (2021). Exploratory factor analysis: Current use, methodological developments and recommendations for good practice. Current Psychology, 40, 3510–3521. https://doi.org/10.1007/s12144-019-00300-2
    » https://doi.org/10.1007/s12144-019-00300-2
  • Hurtt, R. (2010). Development of a Scale to Measure Professional Skepticism. AUDITING: A Journal of Practice & Theory, 29(1), 149–171. https://doi.org/10.2308/aud.2010.29.1.149
    » https://doi.org/10.2308/aud.2010.29.1.149
  • Hyde, J. S. (2014). Gender similarities and differences. Annual Review of Psychology, 65, 373–398. https://doi.org/10.1146/annurev-psych-010213-115057
    » https://doi.org/10.1146/annurev-psych-010213-115057
  • Imhoff, R., & Lamberty, P. (2018). How paranoid are conspiracy believers? Toward a more fine-grained understanding of the connect and disconnect between paranoia and belief in conspiracy theories. European Journal of Social Psychology, 48(7), 909–926.
  • Karlova, N., & Fisher, K. (2012). "Plz RT": A Social Diffusion Model of Misinformation and Disinformation for Understanding Human Information Behaviour. Proceedings of the ISIC2012 (Tokyo). Recuperado de https://www.hastac.org/sites/default/files/documents/karlova_12_isic_misdismodel.pdf
    » https://www.hastac.org/sites/default/files/documents/karlova_12_isic_misdismodel.pdf
  • Li, C. (2016). Confirmatory factor analysis with ordinal data: Comparing robust maximum likelihood and diagonally weighted least squares. Behavior Research Methods, 48(3), 936–949. https://doi.org/10.3758/s13428-015-0619-7
    » https://doi.org/10.3758/s13428-015-0619-7
  • Lynch, M. P. (2023). Political Skepticism, Bias, and Epistemic Colonization. In H. Samarzija & Q. Cassam (Eds.) The Epistemology of Democracy. Routledge.
  • Muthén, L., & Muthén, B. (1998–2017). Mplus User's Guide, 8th Edition Los Angeles, CA: Muthén & Muthén.
  • Muñiz, J., & Fonseca-Pedrero, E. (2019). Diez pasos para la construcción de un test. Psicothema, 31(1), 7. https://doi.org/10.7334/psicothema2018.291
    » https://doi.org/10.7334/psicothema2018.291
  • Muthén, B., & Asparouhov, T. (2012). Bayesian structural equation modeling: A more flexible representation of substantive theory. Psychological Methods, 17(3), 313. https://doi.org/10.1037/a0026802
    » https://doi.org/10.1037/a0026802
  • Napper, L., Reynolds, G., & Fisher, D. (2010). Measuring perceived susceptibility, perceived vulnerability and perceived risk of HIV infection. Psychology of risk perception. Hauppauge: Nova Science Publishers, Inc.
  • Nekmat, E. (2020). Nudge effect of fact-check alerts: Source influence and media skepticism on sharing of news misinformation in social media. Social Media + Society, 6(1), https://doi.org/10.1177/2056305119897322
    » https://doi.org/10.1177/2056305119897322
  • Obermiller, C., & Spangenberg, E. (1998). Development of a Scale to Measure Consumer Skepticism Toward Advertising. Journal of Consumer Psychology, 7(2), 159–186. https://doi.org/10.1207/s15327663jcp0702_03
    » https://doi.org/10.1207/s15327663jcp0702_03
  • Ojala, M. (2015). Climate change skepticism among adolescents. Journal of Youth Studies, 18(9), 1135–1153. https://doi.org/10.1080/13676261.2015.1020927
    » https://doi.org/10.1080/13676261.2015.1020927
  • Pennycook, G., & Rand, D. (2019a). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39–50. https://doi.org/10.1016/j.cognition.2018.06.011
    » https://doi.org/10.1016/j.cognition.2018.06.011
  • Pennycook, G., & Rand, D. (2019b). Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. Journal of Personality, 88(2), 185–200. https://doi.org/10.1111/jopy.12476
    » https://doi.org/10.1111/jopy.12476
  • Pennycook, G., Cannon, T., & Rand, D. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12), 1865–1880. https://doi.org/10.1037/xge0000465
    » https://doi.org/10.1037/xge0000465
  • Quiring, O., Ziegele, M., Schemer, C., Jackob, N., Jakobs, I., & Schultz, T. (2021). Constructive skepticism, dysfunctional cynicism? Skepticism and cynicism differently determine generalized media trust. International Journal of Communication, 15, 22.
  • Reimann, Z., Miller, J., Dahle, K., Hooper, A., Young, A., Goates, M., Magnusson, B., & Crandall, A. (2020). Executive functions and health behaviors associated with the leading causes of death in the United States: A systematic review. Journal of Health Psychology, 25(2), 186–196. https://doi.org/10.1177/1359105318800829
    » https://doi.org/10.1177/1359105318800829
  • Roozenbeek, J., Schneider, C., Dryhurst, S., Kerr, J., Freeman, A., Recchia, G., Van Der Bles, A., & Van Der Linden, S. (2020). Susceptibility to misinformation about COVID-19 around the world: Susceptibility to COVID misinformation. Royal Society Open Science, 7(10). https://doi.org/10.1098/rsos.201199
    » https://doi.org/10.1098/rsos.201199
  • Roozenbeek, J., & Van Der Linden, S. (2019). The fake news game: Actively inoculating against the risk of misinformation. Journal of Risk Research, 22(5), 570–580. https://doi.org/10.1080/13669877.2018.1443491
    » https://doi.org/10.1080/13669877.2018.1443491
  • Schreiber, J. (2017). Update to core reporting practices in structural equation modeling. Research in Social and Administrative Pharmacy, 13(3), 634–643. https://doi.org/10.1016/j.sapharm.2016.06.006
    » https://doi.org/10.1016/j.sapharm.2016.06.006
  • Sindermann, C., Schmitt, H., Rozgonjuk, D., Elhai, J., & Montag, C. (2020). Which factors influence the evaluation of fake and true news? Ability versus non-ability traits. OSF Preprints.
  • Sosu, E. M. (2013). The development and psychometric validation of a Critical Thinking Disposition Scale. Thinking Skills and Creativity, 9, 107–119. https://doi.org/10.1016/j.tsc.2012.09.002
    » https://doi.org/10.1016/j.tsc.2012.09.002
  • Steffens, M. S., Dunn, A. G., Wiley, K. E., & Leask, J. (2019). How organisations promoting vaccination respond to misinformation on social media: A qualitative investigation. BMC Public Health, 19(1), 1–12. https://doi.org/10.1186/s12889-019-7659-3
    » https://doi.org/10.1186/s12889-019-7659-3
  • Storm, L., & Thalbourne, M. (2005). The effect of a change in pro attitude on paranormal performance: A pilot study using naive and sophisticated skeptics. Journal of Scientific Exploration, 19(1), 11–29.
  • Swami, V., Chamorro-Premuzic, T., & Furnham, A. (2010). Unanswered questions: A preliminary investigation of personality and individual difference predictors of 9/11 conspiracist beliefs. Applied Cognitive Psychology., 24, 749–761. https://doi.org/10.1002/acp.1583
    » https://doi.org/10.1002/acp.1583
  • Syam, H., & Nurrahmi, F. (2020). "I Don't Know If It Is Fake or Real News": How Little Indonesian University Students Understand Social Media Literacy. Jurnal Komunikasi: Malaysian Journal of Communication, 36(2), 92–105. https://doi.org/10.17576/JKMJC-2020-3602-06
    » https://doi.org/10.17576/JKMJC-2020-3602-06
  • The Jamovi Project. (2020). Jamovi (Version 1.8.1) [Computer Software]. Available online at: https://www.jamovi.org (accessed 26 March 2021)
    » https://www.jamovi.org
  • Van Bavel, J., Baicker, K., Boggio, P., Capraro, V., Cichocka, A., Cikara, M., Crockett, M., Crum, A., Douglas, K., Druckman, J., Drury, J., Dube, O., Ellemers, N., Finkel, E., Fowler, J., Gelfand, M., Han, S., Haslam, A., Jetten, J., & Willer, R. (2020). Using social and behavioural science to support COVID-19 pandemic response. Nature human behaviour, 4(5), 460–471. https://doi.org/10.1038/s41562-020-0884-z
    » https://doi.org/10.1038/s41562-020-0884-z
  • Van Scoy, L. J., Snyder, B., Miller, E. L., Toyobo, O., Grewel, A., Ha, G., … Lennon, R. P. (2021). Public anxiety and distrust due to perceived politicization and media sensationalism during early COVID-19 media messaging. Journal of Communication in Healthcare, 1–13. https://doi.org/10.1080/17538068.2021.1953934
    » https://doi.org/10.1080/17538068.2021.1953934
  • Van Prooijen, J. W., & Douglas, K. M. (2017). Conspiracy theories as part of history: The role of societal crisis situations. Memory Studies, 10(3), 323–333. https://doi.org/10.1177/1750698017701615
    » https://doi.org/10.1177/1750698017701615
  • Viladrich, C., Angulo-Brunet, A., & Doval, E. (2017). A journey around alpha and omega to estimate internal consistency reliability. Annals of Psychology, 33(3), 755–782. https://doi.org/10.6018/analesps.33.3.268401
    » https://doi.org/10.6018/analesps.33.3.268401
  • Wang, Y., McKee, M., Torbica, A., & Stuckler, D. (2019). Systematic Literature Review on the Spread of Health-related Misinformation on social media. Social Science and Medicine, 240, 112552. https://doi.org/10.1016/j.socscimed.2019.112552
    » https://doi.org/10.1016/j.socscimed.2019.112552
  • Wood, T., & Porter, E. (2019). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, 41(1), 135–163. https://doi.org/10.1007/s11109-018-9443-y
    » https://doi.org/10.1007/s11109-018-9443-y
  • World Health Organization. (2020). Infodemics and misinformation negatively affect people's health behaviours, new WHO review finds. https://www.who.int/europe/news/item/01-09-2022-infodemics-and-misinformation-negatively-affect-people-s-health-behaviours--new-who-review-finds
    » https://www.who.int/europe/news/item/01-09-2022-infodemics-and-misinformation-negatively-affect-people-s-health-behaviours--new-who-review-finds
  • Wright, J. (2019). The truth, but not yet: Avoiding naive skepticism via explicit communication of metadisciplinary aims. Teaching in Higher Education, 24(3), 361–377. https://doi.org/10.1080/13562517.2018.1544552
    » https://doi.org/10.1080/13562517.2018.1544552
  • Wright, J. (2020). "Many People Are Saying…": Applying the Lessons of Naive Skepticism to the Fight against Fake News and Other "Total Bullshit". Postdigital Science and Education, 2(1), 113–131. https://doi.org/10.1007/s42438-019-00051-0
    » https://doi.org/10.1007/s42438-019-00051-0
  • Xiao, Y., Liu, H., & Hau, K. (2019). A comparison of CFA, ESEM, and BSEM in test structure analysis. Structural Equation Modeling: A Multidisciplinary Journal, 26(5), 665–677. https://doi.org/10.1080/10705511.2018.1562928
    » https://doi.org/10.1080/10705511.2018.1562928
  • Zheng, L., Elhai, J. D., Miao, M., Wang, Y., Wang, Y., & Gan, Y. (2022). Health-related fake news during the COVID-19 pandemic: Perceived trust and information search. Internet Research, 32(3), 768–789. https://doi.org/10.1108/INTR-11-2020-0624
    » https://doi.org/10.1108/INTR-11-2020-0624

Publication Dates

  • Publication in this collection
    13 May 2024
  • Date of issue
    2024

History

  • Received
    30 Aug 2023
  • Accepted
    26 Jan 2024
  • Published
    20 Feb 2024