
CITIZENS AND COLLECTIVE DELIBERATION IN SOCIAL SCIENCE

ABSTRACT

It is argued that, under certain conditions related to the intellectual character of the deliberators and their cognitive diversity, small research teams that engage in deliberation in the analysis of data and involve citizens can better promote good epistemic results than teams which do not involve citizens. In particular, it is argued that certain communities within the social sciences that lack the relevant cognitive diversity among their professionals can take advantage of the diversity found in the citizenry to increase the epistemic quality of their research, as long as the citizens possess the relevant virtues.

KEYWORDS:
Citizen Science; Division of Labor; Collective Deliberation; Intellectual Virtue; Cognitive Diversity; Ideology

Citizen science is a rapidly growing field, expanding across varied research domains within the sciences, such as astronomy, computer science, ecology, genetics, geography and medicine (Kullenberg and Kasperowski 2016). Citizen science programs involve non-professional scientists in scientific inquiry. These citizens voluntarily contribute toward scientific research in collaboration with professional scientists. Nowadays, opportunities to participate in citizen science are boundless and there are different ways (and degrees) in which people can participate (Novak et al. 2018). That said, citizen involvement in citizen science typically consists of data collection. Citizens mainly participate in the process of gathering, annotating and/or categorizing data individually and according to specific scientific protocols. Nevertheless, since its inception, one major strand of citizen science has aimed to give citizens a less limited role in scientific research (Irwin 1995; cf. Bonney 1996). According to this approach, citizens are meant to have more cognitive engagement and influence on scientific projects; for example, by being empowered to collaborate with professional scientists on many core aspects of the projects, such as the interpretation of results and the choice of problem to study. However, the contribution of citizens to scientific research has been questioned on epistemological grounds (Riesch and Potter 2014; Elliott and Rosenberg 2019). Even research that relies on the restricted contribution of citizens to data collection is often questioned about the epistemic quality of such data.1 [Footnote 1: The contribution of citizens has also been questioned on ethical and other grounds, but my focus here is exclusively on epistemological issues. In fact, this work falls within the epistemology of science and, in particular, the growing sub-field of virtue epistemology of science (e.g., Paternotte and Ivanova 2017; van Dongen and Paul 2017), which studies the role of the epistemic agent’s virtues and vices in scientific inquiry (as opposed to the more traditional issue of theoretical virtues; e.g., Schindler 2018). However, much work done within this sub-field is individualistic in nature: it does not focus on the social dimensions of scientific inquiry. This work attempts to address the vacuum that one can presently find in the epistemology of science with regard to the study of the social dimensions of intellectual virtues relevant to scientific inquiry, and it aims to show that diverse, virtuous citizens can have a legitimate epistemic role in research within the social sciences.]

Here, however, I do not consider such epistemic worries, since I am not concerned with the possible contributions of citizens to data collection. Instead, I focus on the contribution they can make to the collective process of deliberation within research teams. Teamwork in science is ubiquitous and varies enormously in terms of the number of participants, the social structure and the sorts of work teams engage in. I argue that, under certain conditions related to the intellectual character of the deliberators and their cognitive diversity, small research teams that engage in deliberation in the analysis of data and involve citizens can better promote good epistemic results, in terms of truth-seeking, than teams which do not involve citizens or do not deliberate collectively at all.2 [Footnote 2: I focus on the analysis phase, which can be conducted by one person and concerns, among other things, interpreting results (say, as outliers or violations), reaching conclusions about the sample studied and extrapolating those conclusions to a larger population. Nevertheless, what I say applies equally to other phases, such as the hypothesizing and reporting phases (some hints are provided in §2). See Gerring and Christenson 2017 and Wicherts et al. 2016 for more on these phases and their tasks.] In particular, certain communities within the social sciences that lack the relevant cognitive diversity among their professionals can take advantage of the diversity found in the citizenry to increase the epistemic quality of their research, as long as the citizens possess the relevant virtues.

The paper proceeds as follows. First, some aspects of modern science are briefly highlighted: in particular, its truth-seeking goal and ethos, including its cooperative structure. Second, the practice of collective deliberation is introduced as an instance of the cooperative division of cognitive labor in science, and some of its potential epistemic benefits are appreciated: in particular, eliminating errors and cognitive biases. But those benefits depend on certain conditions holding. So, third, it is argued that the intellectual virtues of autonomy and humility play a key role in realizing such benefits. Moreover, a certain cognitive diversity, regarding the opinions and cognitive styles of the deliberators, is also required. Having argued that collective deliberation can increase the epistemic performance of the group, and having identified the above two conditions for reaping its benefits, fourth, it is noted that professional scientists are likely, given the ethos of science, to possess the relevant intellectual character. However, some relevant cognitive diversity is missing in some social science fields (such as social and political psychology). So, fifth, it is concluded that scientific research that makes use of citizens in its deliberative processes can benefit epistemically if the relevant diversity and intellectual character are found in the citizens.

1. SOME ASPECTS OF MODERN SCIENCE

Science is often thought of as one of the most reliable forms of inquiry. Since the scientific revolution introduced changes in the methods, goals and standards of conduct of scientific practice, attempts to overcome barriers to revealing the underlying workings of nature have been center stage. Francis Bacon, the first spokesperson of what we now call “modern science,” consistently maintained that the duty of the scientist is to “cultivate truth in charity” (Sargent 2005, 72), and so he was concerned, like many others after him, with the epistemic maladies of scientific inquiry.3 [Footnote 3: Although science is also concerned with other epistemic goals (such as understanding; indeed, “it seems commonplace to state that the desire for understanding is a chief motivation for doing science”; de Regt et al. 2009, p.1), here I focus on truth.] His “idols of the mind” famously outlined a series of biases, prejudices and other cognitive shortcomings that scientists are likely to suffer from, given how deeply set they are in our cognitive makeup.4 [Footnote 4: Scientists, like everyone else, are subject to unintentional errors due to biases and other shortcomings (e.g., Jussim et al. 2019; Koehler 1993; Lilienfeld 2010; Mercier and Heintz 2014; Peterson 2020). Scientists are also capable of deliberate deception and self-deception. Scientific fraud, such as Andrew Wakefield’s, is the most troubling source of intentional error but also not very common, partly due to the known consequences of such conduct. And because of those consequences, it is not unlikely that much error is the product of questionable work which scientists rationalize away (Trivers 2011). For example, scientists might use a small set of data, or exclude some data, or create huge amounts of data so as to find some spurious statistically significant correlation, and excuse this and other flexibility in their treatment of data on the basis of “researcher degrees of freedom” (Simmons et al. 2011; Wicherts et al. 2016). Here, however, I do not focus on these two deceptive sources of error.] For Bacon, the mind is beset by idols that are the result of natural or acquired tendencies, and he urges that we should guard against the idols’ influence by prescribing manners of proceeding that cancel them (Sharpe 2019). Modern science has since attempted to eliminate the idols, and this conscious effort by the scientific community has partly earned science its positive evaluation as one of the most reliable forms of inquiry.5 [Footnote 5: Although there has recently been an increasing decline in public trust in scientific experts and evidence (e.g., Nichols 2017; Eyal 2019).]

Yet science is by no means a perfectly good form of inquiry (let alone the only good one) and, indeed, Bacon did not believe that the idols could be totally eliminated. Like all human enterprises, science is ineradicably fallible and imperfect. The process of science is in constant need of improvement. This is why it is misleading to talk about the scientific method. There is no single recipe for producing scientific theories (that is, no unique methodology for doing science), not even within scientific branches (e.g., Gerring and Christenson 2017; Potochnik et al. 2018). As historians of science have shown, scientists use many different methods at any one time, and these methods change through time too, in part to address epistemic deficiencies in scientific practices.

However, science is not just a collection of procedures and practices that scientists exploit. In fact, the methodological changes take place because science also involves a willingness by its community to embrace a particular mindset (McIntyre 2019; Dea and Silk 2020). This scientific attitude revolves primarily around caring about evidence, its quality and its sources: particularly, being “earnestly willing to seek out and consider evidence that may have a bearing on the grounds for our beliefs” (McIntyre 2019, p.48). The roots of this scientific attitude, like those of modern scientific methods, can also be found throughout the work of Bacon (Rossi 1996; Sargent 2005; McIntyre 2019). But this attitude, like the methods of science, has also developed through time and, centuries later, this more thorough attitude is (mostly) captured in the work of the sociologist of science Robert Merton. He initially listed four social norms that instantiated the ethos of science and guided the scientific community (Merton 1973, pp.267-278). The first, “Universalism,” says that “acceptance or rejection of claims entering the lists of science is not to depend on the personal or social attributes of their protagonist; his race, nationality, religion, class, and personal qualities are as such irrelevant” (p.270). The second, “Communism,” claims that the “substantive findings of science are a product of social collaboration and are assigned to the community” (p.273). The third, “Disinterestedness,” states that “scientific research is under the exacting scrutiny of fellow experts” (p.276). The fourth, “Organized Skepticism,” recommends “the detached scrutiny of beliefs in terms of empirical and logical criteria” (p.277; see also p.264). However, Merton rightly realized that these four norms did not fully capture the ethos of science, and so he amended and expanded them. One such expansion was the inclusion of a norm of humility. There are different aspects of this norm, but one is that humility “is expected also in the form of the scientist's insisting upon his personal limitations and the limitations of scientific knowledge altogether” (p.303). But regardless of the specifics of the ethos that guides scientific inquiry, it is a shared one, embraced by the scientific community at large, which fosters the critical scrutiny of scientific claims.

Indeed, Bacon’s vision for science included not only a given methodology and attitude that aimed to eradicate errors (and so promote truth) but also a community dedicated to the task. Bacon saw the scientific enterprise as a collective, as opposed to an individual, effort. Contrary to Descartes (for whom the quest for knowledge is a solitary enterprise6 [Footnote 6: That is, knowledge requires autonomy as absence of external interference. But, as we will see, this Cartesian ideal of autonomy entails a vicious excess. For more on this and related ideals, see Code 1991, pp.110-116; also Solomon 2001, pp.2-7 and Mercier and Heintz 2014, pp.515-516.]), for Bacon science is a social institution that enjoys a cooperative structure to overcome epistemic limitations (Rossi 1996; Sargent 1996). In this era of “Big Science,” the teamwork aspects of science are quite familiar and clearly visible: research teams with hundreds and even thousands of members that collaborate in the research, and research articles with hundreds and even thousands of co-authors (de Ridder 2020).7 [Footnote 7: One main reason why science nowadays tends to involve more collaboration is the increasing difficulty of research problems; hyper-specialization (even within disciplines) requires the collaboration of different individuals. For more, see Muldoon 2018.] Importantly for present purposes, one common element in modern science is the critical scrutiny of scientific claims by the scientific community. Even in the case of scientific research done by single individuals, as opposed to teams, scientists cooperate in evaluating their research so as to filter out faulty or false findings through, say, the mechanisms of peer review and replication.

The above suggests different types of division of labor within science. Leaving aside the (macro) division of labor promoted by the competitive side of science (Kitcher 1993; Weisberg and Muldoon 2009),8 [Footnote 8: The concerns in this literature are mainly with the coordination of research efforts. For example, it shows how different research problems are adopted by different research teams and, when teams share a research problem, how different research methods are adopted, due to the Priority Rule (Strevens 2003).] there is clearly a division of epistemic and cognitive labor, at both the macro- and micro-level, that accommodates the cooperative side of science (Wagenknecht 2017; Muldoon 2018).9 [Footnote 9: Notice that both communism and disinterestedness presuppose this cooperative structure of science.] By the division of epistemic labor, I mean the distribution, across people, of cognitive work to separately perform distinct epistemic tasks required for some positive epistemic status. Of course, the most familiar example of this delegation of labor is testimony: the speaker and the hearer perform different but complementary tasks (i.e., competent inquiry and legitimate acceptance, respectively) in order for the hearer’s testimonially-based belief to be justified or to count as knowledge. However, it would be a mistake to think that the division of epistemic labor only concerns the transmission of epistemic goods (at either the macro- or micro-level). For example, this division, at the macro-level, often takes place with regard to the epistemic procedures that one exploits in scientific research (often one is not aware of the rationale behind them; De Brasi 2015) and always takes place in the aforementioned refereeing process (where one party generates claims that are evaluated by another).

By the division of cognitive labor, I mean the distribution, across people, of cognitive work to jointly perform a given epistemic task required for some positive epistemic status. One example of this is deliberation of the interpersonal form that sometimes takes place within research teams, where the interlocutors exchange and evaluate reasons in order to acquire some epistemic good. In this natural and ubiquitous sort of deliberation (Mercier and Sperber 2017), the interlocutors jointly tackle the same epistemic tasks and, as we will see, certain epistemic benefits are gained, in particular with regard to the eradication of errors.

2. COLLECTIVE DELIBERATION AND SOME OF ITS EPISTEMIC BENEFITS

Collaborative relations within a group can be multifarious. For example, two or more people can co-author a paper by delegating work (say, each one writes a section). But the close collaboration alluded to here involves interactional, collective work (say, they all write each section together). This latter form of collaboration does not allow different tasks of the work to be attributed to different individuals. Instead, the group’s members jointly tackle, say, the writing, by each providing reasons to write one thing rather than another and deciding together what to write, and the end result depends on the quality of the reasons provided by their joint efforts. But one need not co-author a paper to engage in this sort of close collaboration. Indeed, in different contexts, members of research groups often deliberate together, presenting different reasons for and against some claim. These collective deliberations instantiate a sort of division of cognitive labor in which individuals weigh the merits of competing reasons in discussion together.10 [Footnote 10: This involves the epistemic weighing of reasons. Some claim that this sort of collective deliberation is just collective scientific reasoning (Mercier and Heintz 2014).] In particular, the individuals, conversing together, jointly explore the plausibility of some claim, typically each bringing a slightly different perspective to bear. The individuals are meant to defend those perspectives, which are challenged by their interlocutors. These challenges cannot be ignored, and reasons (some of which are tailored to specific objections raised) are evaluated in this exchange11 [Footnote 11: For example, one party may advance a reason R1 in favour of some claim and another may respond by introducing a counter-reason or defeater D that speaks against the claim; then the first may introduce a defeater of the defeater DD, or concede that R1 has been defeated (or weakened to a certain extent) and perhaps introduce some new reason R2, and so on; eventually they weigh the reasons for and against to see how strong the case for the claim is.] and, as we will now see, this process can significantly improve our epistemic performance.12 [Footnote 12: Of course, given the present interdisciplinary nature of much research, this division of cognitive labor is also likely to rely on some sort of division of epistemic labor (see, e.g., Green 2017, pp.135ff.).]

Why can collective deliberation be epistemically better than deliberating on one’s own? One reason has to do with the dispersal of knowledge. Different people often bring different knowledge to the discussion (particularly when team members come from different disciplines, and even different sub-disciplines). This additional flow of information can increase the chance that erroneous views will be corrected (Fearon 1998). But people can be diverse in other ways too. For example, people can also vary with respect to the cognitive skills they exploit to target some issue (or how they use them).13 [Footnote 13: For example, some might be better at producing certain kinds of explanations and not others, while others might be more holistic, as opposed to analytical, in their approach to issues. See, e.g., Zhang, Sternberg and Rayner 2012.] And of course people can be diverse with regard to their experiences, evidence, interpretative frameworks, and more. And this too can serve that purpose. For example, by pooling our limited and fallible cognitive abilities, we increase the chance of picking out errors, either because you think of (on your own, given your abilities) some possibility that I would not have thought of (and vice versa), or because some possibility is put forward that is the brainchild of our abilities and interaction, such that neither of us could have thought of it on our own (cf. Fearon 1998). So, when the correctness of some claim is demonstrable, though difficult to reach, diverse groups are much more likely than individuals on their own to identify the truth (Moshman and Geil 1998; Larson 2010).

Another, related, reason has to do with neutralizing certain cognitive shortcomings.14 [Footnote 14: Although I will not provide further reasons here, this is not meant to be an exhaustive list of benefits. For example, collective deliberation also increases performance when creativity is required to work out a complex issue (Sawyer 2017), and creativity is required in the formation of factual hypotheses (Gold and Ochu 2018).] To appreciate this, let me first introduce two natural and pervasive forms of faulty thinking. The confirmation bias is the long-recognized phenomenon (Bacon 1620, I.46) regarding the tendency to seek and collect reasons that support one’s beliefs and to ignore those that contradict them (Mercier 2017; Nickerson 1998). Specifically, it is the tendency to find only reasons for one’s beliefs and against beliefs one opposes; it does not apply to the evaluation of reasons (Mercier 2017). However, collective deliberation involves both producing and evaluating reasons. And when it comes to evaluating reasons we oppose, we scrutinize them for longer and subject them to much more extensive refutational analysis than those that agree with our prior beliefs (Edwards and Smith 1996; Evans 2017). Given this disconfirmation bias (Lodge and Taber 2013), we are more inclined to detect errors in reasoning for a conclusion with which we disagree to begin with. Relatedly, research by Pronin et al. (2004) on our bias blind spot shows that we are better at detecting cognitive biases in others than in ourselves. All this means that one is a more rigorous evaluator of opposing views than of one’s own (leaving aside the fact that, given the confirmation bias, one is unlikely to find a counter-reason against one’s own view).

Nevertheless, none of this means that one cannot recognize and concede to a good reason against one’s view. In fact, even when people are confident about some view, they can change it if the reasons suggest it (Fishkin 2011; Hess and McAvoy 2015; Mercier and Sperber 2017).15 [Footnote 15: See also Redlawsk et al. 2010, p.590, for the possibility of change even when holding extreme views very confidently. Having said that, there is the well-known phenomenon of belief perseverance (Ross et al. 1975) that seems to speak against our capacity to change our minds in response to counter-reasons. In fact, we do not only seem to display belief perseverance when facing disconfirming information; we also seem to suffer a “backfire effect” (Nyhan and Reifler 2010): we increase our confidence in our belief when facing such information. However, the settings for the experiments where these phenomena are meant to show up are not interpersonally deliberative; hence the studied phenomena are not incompatible with our changing our minds in response to reasons in collective deliberation. Moreover, a recent meta-analysis regarding the psychological efficacy of disconfirming information suggests that debunking is more effective when people are provided with new detailed reasons that enable them to adjust the mental model justifying their beliefs, which is likely to be the case in collective deliberation with relevantly diverse individuals (see below) (Chan et al. 2017). Furthermore, as our anxiety increases because more counter-evidence is confronted, we pay more attention to the evidence and start making the right adjustments to our beliefs (Redlawsk et al. 2010). So it is not surprising that the “backfire effect” has failed to replicate (Wood and Porter 2019) and, given the above, one can reasonably expect someone to change her mind if the reasons offered in collective deliberation suggest it. Also, as we will see, a certain intellectual character seems required for successful collective deliberation, and such character fosters a more adequate use of such disconfirming information (see also Chan et al. 2017, p.14; Myers 2019, p.93).] But, of course, the confirmation bias suggests that one probably cannot come up with some such reason against one’s view on one’s own, and so collective deliberation can again prove itself epistemically useful, given that it allows for the interaction of individuals with disagreeing views. On the one hand, the confirmation bias makes each individual come up with a relevantly strong case in favor of their own view. One does not search for reasons against one’s view, but the other, who disagrees with one, does. So each presents a relevantly strong case for a given view and against opposing ones. On the other hand, the disconfirmation bias makes each individual a rigorous judge of the other’s reasons. One does not scrutinize one’s own reasons assiduously, but the other, who disagrees with one, does: each one controls the quality of the reasons provided by the other. Moreover, given that we can recognize and concede to a better reason, the less error-prone case, given the available evidence, is likely to prevail.

So, in collective deliberation, we are not merely neutralizing these natural, systematic tendencies, which prevent us from deliberating effectively, in truth-seeking terms, when deliberating individually, but also taking advantage of them (by exploiting each other’s tendencies).16 [Footnote 16: These tendencies are unlikely to be mere cognitive bugs, as opposed to features of the mind, given how strong, universal and pervasive they are (as well as pleasurable; Gorman and Gorman 2017, p.134). Having said that, if these natural tendencies are not exploited in an interactive setting, they become epistemically harmful. For more on this, and a plausible evolutionary story supporting it, see Mercier and Sperber 2017.] Thinking together in a critically engaged manner can increase our epistemic performance.17 [Footnote 17: Of course, I have not argued that collective deliberation can counteract all relevant biases, nor is it my intention to do so. I have only argued that two pervasive biases, which can in principle affect the analysis of data in all scientific research, can be neutralized. But, although collective deliberation can help us improve our epistemic performance with regard to certain biases that affect us at the individual level, it might introduce new ones that affect us at the group level. Deliberation is a group-based phenomenon, and below we will be concerned with a common problem facing groups: group polarization. In this case, as we will see, groups of likeminded individuals polarize, and so the relevant heterogeneity is to be fostered in order to avoid the polarization. But, of course, there are other issues to be concerned about, such as the well-known in-group bias. Overcoming (individual- and group-level) biases is not easy for us, particularly when left to our own devices. It is, moreover, likely that no single strategy will be effective in overcoming them.] The strategy employed here to deal with the confirmation and disconfirmation biases is to exploit group interaction under certain conditions (deliberators should be relevantly diverse and possess a certain intellectual character; see immediately below). That is, deliberation does not cancel cognitive biases merely by assembling individuals into a group. Moreover, collective deliberation, by requiring people to justify their public decisions, can interrupt some of our usual cognitive patterns. It nudges people to “more careful processing of information and reduces cognitive biases (e.g., stereotyping, group bias, primacy effects, anchoring effects in probability judgements, fundamental attribution errors [and framing effects])” (Chong 2013, pp.114, 118). Nevertheless, even if certain individual- and group-level biases can be overcome with the right kind of deliberation (via both the interaction of certain individuals, as specified below, and the nudging it promotes), collective deliberation is not here presented as the solution to all our cognitive shortcomings (only to two very general ones) and, importantly, it is not incompatible with other efforts to overcome them (though a more comprehensive inquiry into the different measures to be adopted against all known shortcomings is certainly beyond the scope of the present paper). And surely scientists need to be both boosted and nudged in many different ways (for the distinction between nudging and the sort of boosting strategy exploited here, see Hertwig and Grüne-Yanoff 2017). This reaffirms that a strength of modern science in protecting itself from error lies in its social organization.18 [Footnote 18: Notice that none of the above epistemic benefits depends on the so-called “diversity trumps ability” theorem (Hong and Page 2004), about which serious doubts regarding both its significance and applicability have rightly been raised (Thompson 2014; Weymark 2015). Further, they do not depend on simulations, like Weisberg and Muldoon’s (2009), which have also been questioned (Alexander et al. 2015; Thoma 2015).] Indeed, epistemic goods can be achieved at the group level even if the group’s members cannot individually achieve them (cf. Kitcher 1993; Solomon 2001).

3. COGNITIVE DIVERSITY AND INTELLECTUAL VIRTUES

Collective deliberation can enjoy certain epistemic benefits over individual deliberation, but whether it does so depends on certain conditions holding. We have already noticed the role that diversity of knowledge and cognitive skills, among other things,19 [fn.19: See, e.g., fn.12. Of course, not all diversity (such as diversity about epistemic standards) is welcome. See also Rolin 2011.] plays in promoting these benefits. But notice too that diversity of opinion is required if people are to search for different reasons and to assiduously evaluate opposing views. Indeed, collective deliberation seems to be epistemically harmful in the absence of this diversity.

Homogeneous groups may fail to produce reasons against their shared beliefs, and members may provide each other with additional evidence supporting them. Group homogeneity can promote group polarization: that is, the members of the group end up with more extreme beliefs than they had prior to deliberation (Isenberg 1986). This polarization can result from the sharing of confirmatory information (Sunstein 2006, pp.65-7). Yet another process that can bring about group polarization is social comparison: the polarization results from in-group comparisons. Even if the group does not share information, its members attempt to maintain their reputation and self-conception by emphasizing the attitudes they perceive to be normative within the group (Sunstein 2006, pp.67-9). But this again is likely to happen only in a group of likeminded people, so the suggested group heterogeneity can also remedy group polarization due to social comparison.20 [fn.20: At this point, one might complain that even if a group is heterogeneous, minorities within it might silence themselves (for fear of ridicule or some other social punishment; Sunstein 2006, p.68), but a group norm that welcomes dissent can nudge the minority to contribute (Paluck and Green 2009). Of course, whether someone is then heard when speaking is another issue; people might not listen to others because of differences in social identity, gender, etc., but, as we will see, intellectual humility is in part required to promote such listening. Anyhow, the evidence in favor of consistent group polarization in deliberative events is found only in homogeneous groups (e.g., Levendusky et al. 2016; Luskin et al. 2014).]
Having said that, Sunstein (2006, p.55) notices that deliberation can promote an increase not only in group confidence but also in group unity: “After talking together, group members come into greater accord with one another.” But there is nothing per se wrong with that result; in fact, the aforementioned diversity for collective deliberation is required only pre-deliberation and so can be transient (cf. Zollman 2010, who argues for the benefits of transient diversity in a different, community-wide, division of cognitive labor).

So epistemically beneficial cognitive diversity within the deliberative group comes in different forms (as diversity of opinion, of knowledge, and of skills, among other things), but more is required of the deliberators if collective deliberation is actually to increase epistemic performance. In particular, the individuals in collective deliberation need to possess a certain intellectual character. For example, they need to be willing to engage in collective deliberation even though their views seem right to them, and to revise their views in response to the reasons brought forward by others (otherwise, collective deliberation, with its back-and-forth of reasons, would just be a futile exercise). So, to be able to exploit this division of cognitive labor, one needs to possess the intellectual virtue of humility.21 [fn.21: A virtue is understood as consisting of attitudes and dispositions of the agent which “perfect” a natural human faculty or correct for proneness to dysfunction and error in certain situations (Roberts and Wood 2007, p.59).] This is so if intellectual humility is understood as the virtuous mean between intellectual arrogance and self-deprecation: the intellectually humble person neither overestimates her epistemic goods and epistemic capacities nor underestimates them.22 [fn.22: Roberts and Wood (2007, pp.236ff.) can be interpreted as understanding intellectual humility as being opposed to intellectual arrogance while neglecting the self-deprecation extreme. It seems, however, that one can be too humble.] In particular, the intellectual virtue of humility reduces intellectual arrogance (without underappreciation of one’s epistemic goods and capacities) by promoting a doubting attitude owing to the recognition of our fallibility (due to, say, biases and prejudices) and the limits of our knowledge (due to, say, our finite cognitive power and time).
This dimension of intellectual humility makes clear how it can render one open to engaging in collective deliberation. Moreover, this virtue also seems to involve a disposition to change and make up one’s mind even on the basis of others’ opinions. After all, if the above recognition does not impact one’s opinions, then it is difficult to think of it as such a recognition. This dimension of intellectual humility makes clear how it can help one depend epistemically on others in certain circumstances. Given the above, intellectual humility can be understood as a sort of confidence management of one’s beliefs and epistemic capacities, which allows us to make epistemically proper use of others (cf. Baehr 2015; Church and Samuelson 2017; Kidd 2016; Whitcomb et al. 2017).

However, to benefit from collective deliberation the individual is required to be not only intellectually humble but also intellectually autonomous. This is so if intellectual autonomy is understood as the virtue that reduces sheer epistemic dependence on others by promoting a willingness and ability to think critically for oneself in judging views, without capitulating to hyper-individualism (cf. Baehr 2015; Grasswick 2019; Roberts and Wood 2007, pp.257ff.; Siegel 2017, pp.89ff.). In other words, one has the dispositions and attitudes to self-govern intellectually, which include, among other things, seeking evidence, demanding justifications, and assessing for oneself beliefs, reasons and their sources.23 [fn.23: Note that the virtue requires not only the critical skills to self-govern intellectually but also the willingness to exploit them (Baehr 2020).] So, given that, as mentioned, each party in collective deliberation controls the quality of the reasons provided by the opposing party and tailors their reasons to the objections raised, the virtue’s dispositions and attitudes to engage in critical assessment also play a central role in such deliberation. So both intellectual virtues, humility and autonomy, are required for us to take proper advantage of this division of labor.

It is important to notice that intellectual autonomy is not in tension with intellectual humility, contrary to what one might think if autonomy is taken to entail a Cartesian-like ideal of autonomy (fn.6) and humility to entail openness to epistemic dependence. To see that they are not incompatible, note first that the above virtue of intellectual autonomy is not a matter of sheer epistemic independence. Indeed, such Cartesian hyper-individualism is one of the vicious extremes between which lies the virtuous mean of intellectual autonomy (the other extreme being sheer epistemic dependence). So being intellectually humble (and so being open to epistemic dependence) is not in tension with being intellectually autonomous. Indeed, intellectual autonomy concerns what one does with one’s inevitable epistemic dependence on others: put differently, it involves a sort of dependence management. Although the Cartesian ideal of autonomy might be an apt ideal for a superior being, like Descartes’ God, it is not an ideal for the kind of being we are (Fricker 2006). Indeed, given our limited powers, we cannot, say, know for ourselves everything we want and need to know. So, given, more precisely, our cognitive, spatial and temporal limitations, and given that such dependence rewards us with epistemic riches, a good human epistemic subject should rely epistemically on others.24 [fn.24: Indeed, relying on others seems to be cognitively fundamental for beings like us and, given our social and cooperative nature, one would expect social arrangements, such as the aforementioned divisions of labor, to be in place in order to help us overcome our limitations (De Brasi 2018; Kitcher 2011).]
But one should not do this indiscriminately, and intellectual autonomy, qua dependence management, requires that one be able to discriminate between the epistemically good and bad beliefs and reasons that others offer us. So although intellectual autonomy is not in tension with intellectual humility (the former involves the management of our epistemic dependence and the latter the management of our epistemic confidence so as to be open to such epistemic dependence), it helps us avoid gullibility, including scientific gullibility (Jussim et al. 2019).25 [fn.25: See also Dunning 2019, p.230 and Myers 2019, p.93.]

4. PROFESSIONAL SCIENTISTS AND IDEOLOGICAL TRENDS

One can expect to find the above intellectual character in professional scientists. After all, as seen, the ethos of science seems to require it. On the one hand, organized skepticism, which recommends “the detached scrutiny of beliefs in terms of empirical and logical criteria,” clearly requires intellectual autonomy.26 [fn.26: So too does disinterestedness, if scientific research is to be under the scrutiny of fellow experts.] On the other hand, as Merton realized, intellectual humility, as the recognition of one’s personal limitations, is also required by the ethos of science. So a typical, well-trained, professional scientist will possess, to a greater or lesser degree, both virtues,27 [fn.27: See also Kidd 2011, van Dongen and Paul 2017, Jussim et al. 2019, Mayo 2019, McIntyre 2019, Myers 2019.] which are required to adequately exploit collective deliberation. This is good news since, as we know, professional scientists are hardly immune to cognitive biases (fn.4). In fact, one can find striking illustrations of, say, the confirmation bias (particularly, ignoring counter-evidence) in different scientific domains, such as psychological science (Kelley and Blashfield 2009). But collective deliberation can better protect research from this bias and its accompanying disconfirmation bias, and scientists are likely to have the intellectual character to make proper use of it.

Having said that, the relevant cognitive diversity also required to adequately exploit collective deliberation is not always found within certain scientific communities. For example, in some parts of the world, there are certain ideological trends in academia, where much research is done (Klein and Stern 2009a; Inbar and Lammers 2012; Gross 2013; Langbert et al. 2016).28 [fn.28: While American academia seems to lean strongly to the political left, European academia seems to be more heterogeneous (van de Werfhorst 2020). There are, of course, many different characterizations of ideology (Eagleston 1991, pp.1-2). Here I am concerned with political ideologies, and I take them to be a body of values and beliefs characteristic of a particular political group.] Now, given that the epistemically beneficial cognitive diversity pointed out above includes diversity of opinion, it seems that diversity of opinions related to ideologies will also be epistemically desirable (here I focus on ideology-related beliefs as opposed to values; see fn.29). But, of course, as a referee rightly pointed out, it might be the case that some diversities in ideology are not ethically and/or epistemically desirable (cf. Hicks’ (2011) “Nazi problem”).
Having said that, as far as beliefs are concerned, if the above epistemic benefits are to be had in collective deliberation, then diversity of opinion shouldn’t, epistemically speaking, be unwelcome, even if it can, in some cases, be ethically undesirable. Yet much ideology-related research, in social science and particularly in social, political and personality psychology, is likely to be done in the absence of ideological diversity. Indeed, such lack of diversity in certain fields, and particularly in psychology, and the need for diversification in order to increase the reliability (and so the credibility) of the research have repeatedly been noticed (Tetlock 1994; Redding 2001; Duarte et al. 2015). Normally, however, the focus is on the ideology-relevant political values. It is argued that social scientists can import political values that shape not only the analysis of data but also the research topics and hypotheses (Tetlock 1994; Redding 2001; Duarte et al. 2015).29 [fn.29: For example, Tetlock (1994) shows the negative epistemic effects that the homogeneity of political values, which shape standards of evidence in testing competing hypotheses, can have. And Duarte et al. (2015) emphasize the negative role that such homogeneity can have in the selection of research topics and in the misattribution of traits to people with contrary political views. Redding (2001) considers these and other examples of the negative effects of homogeneity with regard to ideology-related values. This, of course, is not to suggest that epistemic and other values cannot play a legitimate role in scientific inquiry, as long noted by Kuhn (1977), Longino (1990) and Douglas (2009), among others, but only that homogeneity with regard to some values can have negative epistemic effects. Although this is an important issue that invites further consideration, my concern here is with the negative effect that the homogeneity of beliefs related to ideologies can have.] But, surely, political ideology can affect the analysis of data via diverse mechanisms and, given §§2-3, it should be clear how such lack of diversity can also render the analysis of data error-prone due to ideology-related beliefs. These beliefs are not only the ones that, together with certain values, constitute some ideology but also the beliefs that accompany it. As Kahan (2016) has shown, beliefs of all sorts, including scientific ones, are normally associated with ideologies so as to protect one’s group identity.30 [fn.30: Typical examples are beliefs about climate change, nuclear waste and the like that are associated with left and right ideologies. More relevant to our purposes are beliefs, say, about the dogmatism and self-deception of the right; see Tetlock 1983 and Jost 2017.] In particular, non-constitutive ideology-related beliefs are brought into alignment with those held by others with whom one shares an ideology, so as to signify one’s loyalty to the group.

Now, given that people tend to search for evidence that confirms their existing ideology-related beliefs while ignoring disconfirming evidence, and that they are far worse at identifying biases and errors in their own thinking than in that of someone who holds a different view, lack of ideological diversity within a research group can render the analysis of the evidence more error-prone (and trigger group polarization; see also Klein and Stern 2009b). So, when group members have homogeneous ideological views, ideology-related beliefs can, via the confirmation and disconfirmation biases, lead the scientist to analyze data in ways that are ideology-friendly. Of course, such research (if the effort is made) can be externally scrutinized by researchers who are relevantly diverse. However, such scrutiny, even in the case of peer review, is likely to succeed only if all the evidence collected (including evidence eventually disregarded) and its analysis are provided. Nevertheless, even if journals were to ask for that disclosure,31 [fn.31: As some have recommended in order to avoid (intentional or not; fn.4) p-hacking (Simmons et al. 2011; Jussim et al. 2019).] the possible error-correction would anyway come too late for the research team (even if not for the scientific community).

5. CITIZENS IN SOCIAL SCIENCE

Given ideological homogeneity in academia, ideology-related research taking place in it is less likely to overcome certain general cognitive shortcomings, rendering it more error-prone. Such research would benefit from the inclusion of ideologically diverse individuals in its collective deliberations, who, given their opposing views, can counteract the confirmation and disconfirmation biases from which data analysis is likely to suffer. Now, given the shortage of such individuals within certain scientific communities, citizens could play such a role.32 [fn.32: Notice that real dissent is needed (even if by a minority), not someone, contra Tollefsen (2006), playing devil’s advocate (Crisp and Turner 2011).] But notice that these citizens need not only be ideologically diverse; they also need to be intellectually humble and autonomous if they are to be able to neutralize those biases in deliberation (cf. Wray 2014). For example, intellectually arrogant citizens (just like intellectually arrogant scientists) are not likely to listen to opposing views (and, of course, self-deprecating ones are not likely to put forward their views for consideration). So one couldn’t, if intellectually arrogant, adequately tailor one’s criticism. And if one didn’t possess intellectual autonomy, then even if one listened to others’ views, one couldn’t adequately engage in their critical assessment.33 [fn.33: So one would expect deliberation among citizens and scientists to go wrong in cases where citizens, scientists or both do not instantiate the appropriate intellectual character. And, in fact, some public scientific discussion within heterogeneous groups seems to be rather unfruitful. If the above is right, then this is due to a lack of appropriate intellectual character, which can be empirically tested by means of survey-based studies or more experimental designs.]
So citizens should be tested not only for the relevant diversity but also for the dispositions and attitudes associated with the intellectual virtues of humility and autonomy. For example, citizens should prove their critical skills (which they will probably be willing to apply, as autonomy requires, if they volunteer for the job; fn.22).34 [fn.34: Of course, one might worry about this testing, given that the ideological homogeneity of the researchers might influence the tests’ results; after all, intellectual virtue is highly contextual (Battaly 2015; Pigden 2017; Madva 2019), so the dispositions and attitudes associated with the virtues need to be checked within the relevant context. Now, leaving aside that there is still some disagreement as to how to measure these virtues (not least because of the different conceptions of them available), some scales have been developed to be applied to diverse intellectual domains, such as the political one (e.g. Hoyle et al. 2016). But notice that such scales concern self-reports about, say, one’s tendency to consider others’ perspectives (within the political domain) and to accept feedback from others, even if negative (within the political domain), so it is not clear (leaving aside worries about the reliability of these reports) that the researchers’ ideology can interfere with the selection (given this sort of measurement). Moreover, given the virtues’ context-relativity (which needn’t commit us to a form of epistemic relativism; Battaly 2015, p.66), someone might worry about the psychological reality of the virtues. For example, situationist research in social psychology suggests that (often trivial) environmental variables can have greater explanatory power than character traits. And although it is true that there is good evidence that situations are sometimes quite powerful (Benjamin and Simpson 2009), the stronger claim that also appears to have widespread acceptance is that personal traits and individual differences have little to no effect once the impact of the situation is accounted for (Jost and Kruglanski 2002). Harman (2000) famously argues against the existence of moral character traits, and Doris (2002) argues, more moderately, that only “narrow” (i.e. local), as opposed to global, traits can be empirically supported. And, as one would expect, the debate has recently moved to the intellectual domain (Fairweather and Alfano 2017). But this seems to overlook scholarship that has produced contrary evidence. In fact, the typical effect size for a situational effect on conduct is about the same as the typical effect size for personality characteristics (Fleeson 2004; Fleeson and Noftle 2008; Funder 2006). Indeed, some virtue theorists have argued that the situation-person debate is misguided and that situational factors should in fact be exploited in virtue development (Athanassoulis 2016; Battaly 2014; McWilliams 2019).] To a certain extent, and as one would expect, the citizens who are to collaborate in scientific research are also to do so according to the ethos of science; in particular, according to organized skepticism and intellectual humility, given the need for intellectually autonomous and humble deliberators in order to take advantage of the epistemic benefits of collective deliberation.

However, notice that intellectually humble citizens might become too humble in this particular deliberative context if they do not feel they are equal deliberators with the professional scientists. So, although there will certainly be differences in expertise, citizens should be treated as equally capable of understanding reasons as well as of providing criticisms of opposing views (which is something one cannot do very effectively against one’s own view, given one’s biases). So, to use Longino’s (2002, pp.131-3) phrase, some “tempered equality” of intellectual authority is particularly required in this context. Indeed, tempered equality is one of Longino’s (2002) criteria to ensure critical discourse in a deliberative setting. She thinks that, although individual scientists are biased in a variety of ways, these biases can be overcome by promoting critical dialogue between them. We have seen how, for some biases, that is possible. But a certain diversity and intellectual character are required, and to help all parties remain intellectually humble, an explicit norm embracing tempered equality might just nudge them to do so. Longino certainly thinks cognitive diversity is crucial to correct false beliefs and biased accounts, and she does not merely care about the diversity of perspectives that better gender equality would bring into science.35 [fn.35: The benefits of such perspectives are undeniable. A remarkable example of scientific progress due to the introduction of women’s perspectives into a field is that of primatology, and particularly the study of primates’ social and sexual behavior; see, e.g., Lloyd 2005. See also Longino 2002, p.132.] In fact, her tempered equality criterion requires that a community also be inclusive of scientists independently of their race, ethnic identity, nationality, age, sexual orientation and, of course, ideology, among other things.
Longino’s criterion, just like Merton’s universalism and communism (§1), invites outsiders to participate in scientific debates, something which Bacon and other members of the Royal Society already saw as necessary (Sargent 2005). And when such outsiders are not to be found within the scientific community, citizens can fill that role in order to increase epistemic performance, on the condition that they respect the ethos of science.

So, independently of the discussion about the epistemic merits of data collection by citizens in much citizen science, relevantly diverse citizens who are intellectually humble and autonomous can contribute in research teams’ deliberations towards the elimination of errors in the analysis of data. Indeed, we have seen that some research within the social sciences can benefit epistemically from the inclusion of such citizens in this division of cognitive labor (as opposed to the division of epistemic labor found in much citizen science). So, in some cases, it is worth taking seriously a different kind of citizen participation à la Irwin (1995). Indeed, the time is right for some professional social scientists to lay aside the prejudices they may have and join deliberative efforts with the relevant citizens to increase their research’s epistemic quality, just as Bacon would probably have recommended.36 [fn.36: The research for this article has been supported by the Fondecyt project #1180886 (Chile). I would like to thank the anonymous referees for their helpful comments.]

REFERENCES

  • ALEXANDER, J., HIMMELREICH, J., THOMPSON, C. “Epistemic landscapes, optimal search and the division of cognitive labor.” Philosophy of Science, 82(3), pp.424-453, 2015.
  • ATHANASSOULIS, N. “The Psychology of Virtue Education.” In Masala, A. and Webber, J. (eds.) (2016), pp.207-228.
  • BACON, F. The New Organon. Edited by Lisa Jardine and Michael Silverthorne. Cambridge: CUP, 1620/2002.
  • BAEHR, J. Cultivating Good Minds: A Philosophical and Practical Guide to Educating for Intellectual Virtues. (MS) Accessed 17/April/2020. http://intellectualvirtues.org/why-should-we-educate-for-intellectual-virtues-2-2/, 2015.
  • BAEHR, J. Intellectual Virtues and Education. London: Routledge, 2016.
  • BAEHR, J. “Intellectual Virtues, Critical Thinking and the Aims of Education.” In Fricker, M. et al. (eds.) (2020), pp.447-56.
  • BATTALY, H. “Acquiring Epistemic Virtue: Emotions, Situations and Education.” In Fairweather, A. and Flanagan, O. (eds.) (2014), pp.175-196.
  • BATTALY, H. Virtue. London: Polity, 2015.
  • BATTALY, H. The Routledge Handbook of Virtue Epistemology. London: Routledge, 2019.
  • BENJAMIN, L., SIMPSON, J. “The Power of the Situation.” American Psychologist, 64, pp.12-9, 2009.
  • BONNEY, R. “Citizen Science: A Lab Tradition.” Living Birds, 15(4), pp.7-15, 1996.
  • BOYER-KASSEM, T., MAYO-WILSON, C., WEISBERG, M. Scientific Collaboration and Collective Knowledge. Oxford: OUP, 2018.
  • CHAN, M., JONES, C., JAMIESON, K., ALBARRACÍN, D. “Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation.” Psychological Science, 28(11), pp.1531-46, 2017.
  • CHONG, D. “Degrees of Rationality in Politics.” In Huddy, L. et al. (eds.) (2013), pp.96-129.
  • CHURCH, I., SAMUELSON, P. Intellectual Humility. London: Bloomsbury, 2017.
  • CODE, L. What Can She Know? Ithaca: CUP, 1991.
  • CRISP, R., TURNER, R. “Cognitive Adaptation to the Experience of Social and Cultural Diversity.” Psychological Bulletin, 137(2), pp.242-66, 2011.
  • DEA, S., SILK, M. “Sympathetic Knowledge and the Scientific Attitude.” In Fricker, M. et al. (eds.) (2020), pp.344-53.
  • DE BRASI, L. “Reliability and Social Knowledge-Relevant Responsibility,” Transformaçao, 38(1), pp. 187-212, 2015.
  • _____. “Citizenry Incompetence and the Epistemic Structure of Society”, Filosofía Unisinos, 19(3), pp. 201-12, 2018.
  • DE REGT, H., LEONELLI, S., EIGNER, K. Scientific Understanding: Philosophical Perspectives. Pittsburgh: University of Pittsburgh Press, 2009.
  • DE RIDDER, J. “How Many Scientists Does It Take to Have Knowledge?” In McCain, K. and Kampourakis, K. (eds.) (2020), pp.3-17.
  • DORIS, J. Lack of Character: Personality and Moral Behavior. Cambridge: CUP, 2002.
  • DOUGLAS, H. Science, Policy, and the Value-Free Ideal. Pittsburgh: University of Pittsburgh Press, 2009.
  • DUARTE, J., CRAWFORD, J., STERN, C., HAIDT, J., JUSSIM, L., TETLOCK, P. “Political diversity will improve social psychological science.” Behavioral and Brain Sciences, 38, E130, 2015.
  • DUNNING, D. “Gullible to Ourselves.” In Forgas, J. and Baumeister, R. (eds) (2019), pp.217-33.
  • EAGLETON, T. Ideology: An Introduction. London: Verso, 1991.
  • EDWARDS, K., SMITH, E. “A Disconfirmation Bias in the Evaluation of Arguments.” Journal of Personality and Social Psychology, 71(1), pp.5-24, 1996.
  • ELLIOTT, K., ROSENBERG, J. “Philosophical Foundations for Citizen Science.” Citizen Science, 4(1), pp.1-9, 2019.
  • ELSTER, J. Deliberative Democracy. Cambridge: CUP, 1998.
  • EVANS, J. “Belief Bias in Deductive Reasoning.” In Pohl, R. (ed.) (2017), pp.165-181.
  • EYAL, G. The Crisis of Expertise. London: Polity, 2019.
  • FAIRWEATHER, A., ALFANO, M. Epistemic Situationism. Oxford: OUP, 2017.
  • FAIRWEATHER, A., FLANAGAN, O. Naturalizing Epistemic Virtue. Cambridge: CUP, 2014.
  • FEARON, J. “Deliberation as Discussion.” In Elster, J. (ed.) (1998), pp.44-68.
  • FISHKIN, J. When the People Speak. Oxford: OUP, 2011.
  • FLEESON, W. “Moving personality beyond the person-situation debate: The challenge and opportunity of within-person variability.” Current Directions in Psychological Science, 13, pp.83-7, 2004.
  • FLEESON, W., NOFTLE, E. “The end of the person-situation debate: An emerging synthesis in the answer to the consistency question.” Social and Personality Psychology Compass, 2, pp.1667-84, 2008.
  • FORGAS, J., BAUMEISTER, R. The Social Psychology of Gullibility. London: Routledge, 2019.
  • FRICKER, E. “Testimony and Epistemic Autonomy.” In Lackey, J. and Sosa, E. (eds.) (2006), pp.225-250.
  • FRICKER, M., GRAHAM, P., HENDERSON, D., PEDERSEN, N. Routledge Handbook of Social Epistemology. London: Routledge, 2020.
  • FUNDER, D. “Towards a resolution of the personality triad: Persons, situations, and behaviors.” Journal of Research in Personality, 40, pp.21-34, 2006.
  • GAUKROGER, S. Knowledge in Modern Philosophy. London: Bloomsbury, 2019.
  • GERRING, J., CHRISTENSON, D. Applied Social Science Methodology. Cambridge: CUP, 2017.
  • GOLD, M., OCHU, E. “Creative collaboration in citizen science and the evolution of ThinkCamps.” In Hecker, S. et al. (eds.) (2018), pp.124-45.
  • GORMAN, S., GORMAN, J. Denying to the Grave. Oxford: OUP, 2017.
  • GRASSWICK, H. “Epistemic Autonomy in a Social World of Knowing.” In Battaly, H. (ed.) (2019), pp.196-208.
  • GREEN, A. The Social Contexts of Intellectual Virtue: Knowledge as a Team Achievement. London: Routledge, 2017.
  • GROSS, N. Why Are Professors Liberal and Why Do Conservatives Care? Massachusetts: HUP, 2013.
  • HARMAN, G. “The nonexistence of character traits.” Proceedings of the Aristotelian Society, 100, pp.223-6, 2000.
  • HECKER, S., HAKLAY, M., BOWSER, A., MAKUCH, Z., VOGEL, J., BONN, A. Citizen Science. London: UCL Press, 2018.
  • HERTWIG, R., GRÜNE-YANOFF, T. “Nudging and Boosting.” Perspectives on Psychological Science, 12(6), pp.973-86, 2017.
  • HESS, D., MCAVOY, P. The Political Classroom: Evidence and Ethics in Democratic Education. London: Routledge, 2015.
  • HICKS, D. “Is Longino’s Conception of Objectivity Feminist?” Hypatia, 26(2), pp.333-51, 2011.
  • HONG, L., PAGE, S. “Groups of Diverse Problem Solvers Can Outperform Groups of High-Ability Problem Solvers.” Proceedings of the National Academy of Sciences of the United States, 101(46), pp.16385-89, 2004.
  • HOYLE, R., DAVISSON, E., DIEBELS, K., LEARY, M. “Holding specific views with humility: Conceptualization and measurement of specific intellectual humility”. Personality and Individual Differences, 97, pp.165-72, 2016.
  • HUDDY, L., SEARS, D., LEVY, J. Oxford Handbook of Political Psychology. Oxford: OUP, 2013.
  • INBAR, Y., LAMMERS, J. “Political diversity in social and personality psychology.” Perspectives on Psychological Science, 7(5), pp.496-503, 2012.
  • IRWIN, A. Citizen Science: A Study of People, Expertise and Sustainable Development. London: Routledge, 1995.
  • ISENBERG, D. “Group Polarization: A critical review and meta-analysis.” Journal of Personality and Social Psychology, 50, pp.1141-51, 1986.
  • JOST, J. “Ideological Asymmetries and the Essence of Political Psychology.” Political Psychology, 38(2), pp.167-208, 2017.
  • JOST, J., KRUGLANSKI, A. “The estrangement of social constructionism and experimental social psychology.” Personality and Social Psychology Review, 6, pp.168-87, 2002.
  • JUSSIM, L., STEVENS, S., HONEYCUTT, N., ANGLIN, S., FOX, N. “Scientific Gullibility.” In Forgas, J. and Baumeister, R. (eds) (2019), pp.279-303.
  • KAHAN, D. “The Politically Motivated Reasoning Paradigm, Part 1.” In Scott, R. and Kosslyn, S. (eds.) (2016), pp.1-16.
  • KELLEY, L., BLASHFIELD, R. “An Example of Psychological Science’s Failure to Self-Correct.” Review of General Psychology, 13(2), pp.122-129, 2009.
  • KIDD, I. “Pierre Duhem’s epistemic claims and the intellectual virtue of humility.” Studies in History and Philosophy of Science, 42(1), pp.185-189, 2011.
  • KIDD, I. “Educating for Intellectual Humility.” In Baehr, J. (ed.) (2016), pp.54-70.
  • KITCHER, P. The Advancement of Science: Science without Legend, Objectivity without Illusions. Oxford: OUP, 1993.
  • KITCHER, P. Science in a Democratic Society. New York: Prometheus, 2011.
  • KLEIN, D., STERN, C. “By the Numbers: The Ideological Profile of Professors.” In Maranto, R. et al. (eds.) (2009a), pp.15-38.
  • KLEIN, D., STERN, C. “Groupthink in Academia: Majoritarian Departmental Politics and the Professional Pyramid.” In Maranto, R. et al. (eds.) (2009b), pp.79-98.
  • KOEHLER, J. “The Influence of Prior Beliefs on Scientific Judgments of Evidence Quality.” Organizational Behavior and Human Decision Processes, 56, pp.28-55, 1993.
  • KOERTGE, N. Scientific Values and Civic Virtues. Oxford: OUP, 2005.
  • KUHN, T. The Essential Tension: Selected Studies in Scientific Tradition and Change. Chicago: University of Chicago Press, 1977.
  • KULLENBERG, C., KASPEROWSKI, D. “What Is Citizen Science?-A Scientometric Meta-Analysis.” PLoS ONE, 11(1), e0147152, 2016.
  • LACKEY, J., SOSA, E. The Epistemology of Testimony. Oxford: OUP, 2006.
  • LANGBERT, M., QUAIN, A., KLEIN, D. “Faculty Voter Registration in Economics, History, Journalism, Law and Psychology.” Econ Journal Watch, 13(3), pp.422-451, 2016.
  • LARSON, J. In Search of Synergy in Small Group Performance. New York: Psychology Press, 2010.
  • LEVENDUSKY, M., DRUCKMAN, J., MCLAIN, A. “How Group Discussions Create Strong Attitudes and Strong Partisans.” Research and Politics, 3, pp.1-6, 2016.
  • LILIENFELD, S. “Can psychology become a science?” Personality and Individual Differences, 49, pp.281-88, 2010.
  • LIPPERT-RASMUSSEN, K., BROWNLEE, K., COADY, D. A Companion to Applied Philosophy. Oxford: Wiley-Blackwell, 2017.
  • LLOYD, E. The Case of the Female Orgasm. Massachusetts: HUP, 2005.
  • LODGE, M., TABER, C. The Rationalizing Voter. Cambridge: CUP, 2013.
  • LONGINO, H. Science as Social Knowledge: Values and Objectivity in Scientific Inquiry. Princeton: Princeton University Press, 1990.
  • LONGINO, H. The Fate of Knowledge. Princeton: PUP, 2002.
  • LUSKIN, R., O'FLYNN, I., FISHKIN, J., RUSSELL, D. “Deliberating Across Deep Divides.” Political Studies, 62(1), pp.116-35, 2014.
  • MADVA, A. “The Inevitability of Aiming for Virtue.” In Sherman, B. and Goguen, S. (eds.) (2019), pp.85-99.
  • MARANTO, R., REDDING, R., HESS, F. The Politically Correct University: Problems, Scope and Reforms. Washington: AEI Press, 2009.
  • MASALA, A., WEBBER, J. From Personality to Virtue. Oxford: OUP, 2016.
  • MAYO, R. “The Skeptical (Ungullible) Mindset.” In Forgas, J. and Baumeister, R. (eds) (2019), pp.140-58.
  • MCCAIN, K., KAMPOURAKIS, K. What is Scientific Knowledge? London: Routledge, 2020.
  • MCINTYRE, L. The Scientific Attitude. Massachusetts: MIT Press, 2019.
  • MCWILLIAMS, E. “Can Epistemic Virtue Help Combat Epistemologies of Ignorance?” In Sherman, B. and Goguen, S. (eds.) (2019), pp.101-18.
  • MERCIER, H. “Confirmation bias-myside bias.” In Pohl, R. (ed.) (2017), pp.99-114.
  • MERCIER, H., HEINTZ, C. “Scientists’ Argumentative Reasoning.” Topoi, 33, pp.513-524, 2014.
  • MERCIER, H., SPERBER, D. The Enigma of Reason. London: Allen Lane, 2017.
  • MERTON, R. The Sociology of Science. Chicago: UCP, 1973.
  • MOSHMAN, D., GEIL, M. “Collaborative Reasoning: Evidence for Collective Rationality.” Thinking & Reasoning, 4(3), pp.231-48, 1998.
  • MULDOON, R. “Diversity, Rationality and the Division of Cognitive Labor.” In Boyer-Kassem, T. et al. (eds.) (2018), pp.78-94.
  • MYERS, D. “Psychological Science Meets a Gullible Post-Truth World.” In Forgas, J. and Baumeister, R. (eds.) (2019), pp.77-100.
  • NICHOLS, T. The Death of Expertise. Oxford: OUP, 2017.
  • NICKERSON, R. “Confirmation Bias: A ubiquitous phenomenon in many guises.” Review of General Psychology, 2(2), pp.175-220, 1998.
  • NOVAK, J., BECKER, M., GREY, F., MONDARDINI, R. “Citizen Engagement and Collective Intelligence for Participatory Digital Social Innovation.” In Hecker, S. et al. (eds.) (2018), pp.124-45.
  • NYHAN, B., REIFLER, J. “When Corrections Fail.” Political Behaviour, 32, pp.303-30, 2010.
  • PALUCK, E., GREEN, D. “Deference, Dissent and Dispute Resolution.” American Political Science Review, 103, pp.622-44, 2009.
  • PATERNOTTE, C., IVANOVA, M. “Virtues and Vices in Scientific Inquiry.” Synthese, 194, pp.1787-807, 2017.
  • PELTONEN, M. The Cambridge Companion to Bacon. Cambridge: CUP, 1996.
  • PETERSON, E. 2020. “Can Scientific Knowledge Sift the Wheat from the Tares?: A Brief History of Bias (and Fears about Bias) in Science.” In McCain, K. and Kampourakis, K. (eds.) (2020), pp.195-211.
  • PIGDEN, C. “Are Conspiracy Theorists Epistemically Vicious?” In Lippert-Rasmussen, K. et al. (eds.) (2017), pp.120-132.
  • POHL, R. Cognitive Illusions. London: Routledge, 2017.
  • POTOCHNIK, A., COLOMBO, M., WRIGHT, C. Recipes for Science. London: Routledge, 2018.
  • PRONIN, E., GILOVICH, T., ROSS, L. “Objectivity in the eye of the beholder.” Psychological Review, 111, pp.781-799, 2004.
  • REDDING, R. “Sociopolitical Diversity in Psychology.” American Psychologist, 56(3), pp.205-215, 2001.
  • REDLAWSK, D., CIVETTINI, A., EMMERSON, K. “The Affective Tipping Point: Do Motivated Reasoners Ever ‘Get It’?” Political Psychology, 31(4), pp.563-93, 2010.
  • REISCH, H., POTTER, C. “Citizen Science as seen by scientists.” Public Understanding of Science, 23(1), pp.107-120, 2014.
  • ROBERTS, R., WOOD, J. Intellectual Virtues. Oxford: OUP, 2007.
  • ROLIN, K. “Diversity and Dissent in the Social Sciences.” Philosophy of the Social Sciences, 41(4), pp.470-494, 2011.
  • ROSS, L., LEPPER, M., HUBBARD, M. “Perseverance in self-perception and social perception.” Journal of Personality and Social Psychology, 32(5), pp.880-92, 1975.
  • ROSSI, P. “Bacon’s idea of science.” In Peltonen, M. (ed.) (1996), pp.25-46.
  • SARGENT, R. “Bacon as an advocate for cooperative scientific research.” In Peltonen, M. (ed.) (1996), pp.146-171.
  • SARGENT, R. “Virtues and the Scientific Revolution.” In Koertge, N. (ed.) (2005), pp.71-79.
  • SAWYER, K. Group Genius: The Creative Power of Collaboration. New York: Basic Books, 2017.
  • SCHINDLER, S. Theoretical Virtues in Science. Cambridge: CUP, 2018.
  • SCOTT, R., KOSSLYN, S. Emerging Trends in the Social and Behavioral Sciences. Hoboken: Wiley, 2016.
  • SHARPE, M. “Bacon.” In Gaukroger, S. (ed.) (2019), pp.7-26.
  • SHERMAN, B., GOGUEN, S. Overcoming Epistemic Injustice. London: Rowman & Littlefield, 2019.
  • SIEGEL, H. Education’s Epistemology. Oxford: OUP, 2017.
  • SIMMONS, J., NELSON, L., SIMONSOHN, U. “False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant.” Psychological Science, 22(11), pp.1359-66, 2011.
  • SOLOMON, M. Social Empiricism. Massachusetts: MIT Press, 2001.
  • STREVENS, M. “The Role of the Priority Rule in Science.” Journal of Philosophy, 100, pp.55-79, 2003.
  • SUNSTEIN, C. Infotopia. Oxford: OUP, 2006.
  • TETLOCK, P. “Cognitive style and political ideology.” Journal of Personality and Social Psychology, 45, pp.118-26, 1983.
  • TETLOCK, P. “Political Psychology or Politicized Psychology.” Political Psychology, 15(3), pp.509-29, 1994.
  • THOMA, J. “The epistemic division of labour revisited.” Philosophy of Science, 82(3), pp.454-72, 2015.
  • THOMPSON, A. “Does Diversity Trump Ability? An example of the misuse of mathematics in the social sciences.” Notices of the American Mathematical Society, 61, pp.1024-30, 2014.
  • TOLLEFSEN, D. “Group Deliberation, Social Cohesion and Scientific Teamwork.” Episteme, 3(1-2), pp.37-51, 2006.
  • TRIVERS, R. The Folly of Fools. New York: Ingram, 2011.
  • VAN DE WERFHORST, H. “Are universities left-wing bastions? The political orientation of professors, professionals, and managers in Europe.” British Journal of Sociology, 71(1), pp.47-73, 2020.
  • VAN DONGEN, J., PAUL, H. Epistemic Virtues in the Sciences and the Humanities. Dordrecht: Springer, 2017.
  • WAGENKNECHT, S. A Social Epistemology of Research Groups. Dordrecht: Springer, 2017.
  • WEISBERG, M., MULDOON, R. “Epistemic landscapes and the division of cognitive labour.” Philosophy of Science, 76(2), pp.225-52, 2009.
  • WEYMARK, J. “Cognitive Diversity, Binary Decisions and Epistemic Democracy.” Episteme, 12, pp.497-511, 2015.
  • WICHERTS, J., VELDKAMP, C., AUGUSTEIJN, H., BAKKER, M., VAN AERT, R., VAN ASSEN, M. “Degrees of Freedom in Planning, Running, Analyzing and Reporting Psychological Studies.” Frontiers in Psychology, 7, pp.1-12, 2016.
  • WHITCOMB, D., BATTALY, H., BAEHR, J., HOWARD-SNYDER, D. “Intellectual Humility: Owning our limitations.” Philosophy and Phenomenological Research, 94(3), pp.509-39, 2017.
  • WOOD, T., PORTER, E. “The Elusive Backfire Effect: Mass Attitudes’ Steadfast Factual Adherence.” Political Behaviour, 41, pp.135-63, 2019.
  • WRAY, B. “Collaborative Research, Deliberation and Innovation.” Episteme, 11(3), pp.291-303, 2014.
  • ZHANG, L., STERNBERG, R., RAYNER, S. Handbook of Intellectual Styles: Preferences in Cognition, Learning and Thinking. Dordrecht: Springer, 2012.
  • ZOLLMAN, K. “The Epistemic benefits of Transient Diversity.” Erkenntnis, 72(1), pp.17-35, 2010.
  • 1
    The contribution of citizens has also been questioned on ethical and other grounds, but my focus here is exclusively on epistemological issues. In fact, this work falls within the epistemology of science and, in particular, the growing sub-field of virtue epistemology of science (e.g., Paternotte and Ivanova 2017; van Dongen and Paul 2017), which studies the role of the epistemic agent’s virtues and vices in scientific inquiry (as opposed to the more traditional issue of theoretical virtues; e.g., Schindler 2018). However, much work done within this sub-field is individualistic in nature: it does not focus on the social dimensions of scientific inquiry. This work attempts to fill the vacuum that one presently finds in the epistemology of science with regard to the study of the social dimensions of intellectual virtues relevant to scientific inquiry, and it aims to show that diverse, virtuous citizens can have a legitimate epistemic role in research within the social sciences.
  • 2
    I focus on the analysis phase, which can be conducted by one person and concerns, among other things, interpreting results (say, as outliers or violations), reaching conclusions about the sample studied and extrapolating those conclusions to a larger population. Nevertheless, what I say applies equally to other phases, such as the hypothesizing and reporting phases (some hints are provided in §2). See Gerring and Christenson 2017 and Wicherts et al. 2016 for more on these phases and their tasks.
  • 3
    Although science is also concerned with other epistemic goals (such as understanding; indeed, “it seems commonplace to state that the desire for understanding is a chief motivation for doing science”—de Regt et al. 2009, p.1), here I focus on truth.
  • 4
    Scientists, like everyone else, are subject to unintentional errors due to biases and other shortcomings (e.g., Jussim et al. 2019; Koehler 1993; Lilienfeld 2010; Mercier and Heintz 2014; Peterson 2020). Scientists are also capable of deliberate deception and self-deception. Scientific fraud, such as Andrew Wakefield’s, is the most troubling source of intentional error, but it is also not very common, partly due to the known consequences of such conduct. And because of those consequences, it is not unlikely that much error is the product of questionable work which scientists rationalize away (Trivers 2011). For example, scientists might use a small set of data, exclude some data, or create huge amounts of data so as to find some spurious statistically significant correlation, and excuse this and other flexibility in their treatment of data on the basis of “researcher degrees of freedom” (Simmons et al. 2011; Wicherts et al. 2016). Here, however, I do not focus on these two deceptive sources of error.
  • 5
    Although there has recently been an increasing decline in public trust in scientific experts and evidence (e.g., Nichols 2017; Eyal 2019).
  • 6
    That is, knowledge requires autonomy as absence of external interference. But, as we will see, this Cartesian ideal of autonomy entails a vicious excess. For more on this and related ideals, see Code 1991, pp.110-116; also Solomon 2001, pp.2-7 and Mercier and Heintz 2014, pp.515-516.
  • 7
    One main reason why science nowadays tends to involve more collaboration is that the increasing difficulty of research problems and the hyper-specialization (even within disciplines) require the collaboration of different individuals. For more, see Muldoon 2018.
  • 8
    The concerns in this literature are mainly with the coordination of research efforts. For example, they show how different research problems are adopted by different research teams and, when teams share a research problem, how different research methods are adopted, due to the Priority Rule (Strevens 2003).
  • 9
    Notice that both communism and disinterestedness presuppose this cooperative structure of science.
  • 10
    This involves the epistemic weighting of reasons. Some claim that this sort of collective deliberation is just collective scientific reasoning (Mercier and Heintz 2014).
  • 11
    For example, one party may advance a reason R1 in favour of some claim, and another may respond by introducing a counter-reason or defeater D that speaks against the claim; then the first may introduce a defeater of the defeater DD, or concede that R1 has been defeated—or weakened to a certain extent—and perhaps introduce some new reason R2, and so on; eventually they weigh the reasons for and against to see how strong the case for the claim is.
  • 12
    Of course, given the present interdisciplinary nature of much research, this division of cognitive labor is also likely to rely on some sort of division of epistemic labor (see, e.g., Green 2017, pp.135ff.).
  • 13
    For example, some might be better at producing certain kinds of explanations and not others, while others might be more holistic, as opposed to analytical, in their approach to issues. See, e.g., Zhang, Sternberg and Rayner 2012. And of course people can be diverse with regard to their experiences, evidence, interpretative frameworks, and more.
  • 14
    Although I will not here provide further reasons, this is not meant to be an exhaustive list of benefits. For example, collective deliberation also increases performance when creativity is required to work out a complex issue (Sawyer 2017), and creativity is required in the formation of factual hypotheses (Gold and Ochu 2018).
  • 15
    See also Redlawsk et al. 2010, p.590, for the possibility of change even when holding extreme views very confidently. Having said that, there is the well-known phenomenon of belief-perseverance (Ross et al. 1975) that seems to speak against our capacity to change our minds in response to counter-reasons. In fact, we not only seem to display belief-perseverance when facing disconfirming information, we also seem to suffer a “backfire effect” (Nyhan and Reifler 2010): we increase our confidence in our belief when facing such information. However, the settings for the experiments in which these phenomena are meant to show up are not interpersonally deliberative; hence the studied phenomena are not incompatible with our changing our minds in response to reasons in collective deliberation. Moreover, a recent meta-analysis regarding the psychological efficacy of disconfirming information suggests that the debunking effect is more effective when people are provided with new detailed reasons that enable them to adjust the mental model justifying their beliefs, which is likely to be the case in collective deliberation with relevantly diverse individuals (see below) (Chan et al. 2017). Furthermore, as our anxiety increases because more counter-evidence is confronted, we pay more attention to the evidence and start making the right adjustments to our beliefs (Redlawsk et al. 2010). So it is not surprising that the “backfire effect” has failed to replicate (Wood and Porter 2019) and, given the above, one can reasonably expect someone to change her mind if the reasons aired in collective deliberation suggest it. Also, as we will see, a certain intellectual character seems required for successful collective deliberation, and such character fosters a more adequate use of disconfirming information (see also Chan et al. 2017, p.14; Myers 2019, p.93).
  • 16
    These tendencies are unlikely to be mere cognitive bugs, as opposed to features of the mind, given how strong, universal and pervasive they are (as well as pleasurable—Gorman and Gorman 2017, p.134). Having said that, if these natural tendencies are not exploited in an interactive setting, they become epistemically harmful. For more on this and a plausible evolutionary story supporting it, see Mercier and Sperber 2017.
  • 17
    Of course, I have not argued that collective deliberation can counteract all relevant biases, nor is it my intention to do so. I have only argued that two pervasive biases, which can in principle affect the analysis of data in all scientific research, can be neutralized. But although collective deliberation can help us improve our epistemic performance with regard to certain biases that affect us at the individual level, it might introduce new ones that affect us at the group level. Deliberation is a group-based phenomenon, and below we will be concerned with a common problem facing groups: group polarization. In this case, as we will see, groups of likeminded individuals polarize, and so the relevant heterogeneity is to be fostered in order to avoid the polarization. But, of course, there are other issues to be concerned about, such as the well-known in-group bias. Overcoming (individual- and group-level) biases is not easy for us, particularly when left to our own devices. It is, moreover, likely that no single strategy will be effective in overcoming them. The strategy here employed to deal with the confirmation and disconfirmation biases is to exploit group interaction in certain conditions (deliberators should be relevantly diverse and possess a certain intellectual character; see immediately below). That is, deliberation does not cancel cognitive biases merely by assembling individuals into a group. Moreover, collective deliberation, by requiring people to justify their public decisions, can interrupt some of our usual cognitive patterns. It nudges people to “more careful processing of information and reduces cognitive biases (e.g., stereotyping, group bias, primacy effects, anchoring effects in probability judgements, fundamental attribution errors [and framing effects])” (Chong 2013, pp.114, 118). Nevertheless, even if certain individual- and group-level biases can be overcome with the right kind of deliberation (via both the interaction of certain individuals, as specified below, and the nudging it promotes), collective deliberation is not here presented as the solution to all our cognitive shortcomings (but two very general ones) and, importantly, it is not incompatible with other efforts to overcome them (though this more comprehensive inquiry into the different measures to be adopted to overcome all known shortcomings is certainly beyond the scope of the present paper). And surely scientists need to be both boosted and nudged in many different ways (for the distinction between nudging and the sort of boosting strategy here exploited, see Hertwig and Grüne-Yanoff 2017).
  • 18
    Notice that none of the above epistemic benefits depend on the so-called “diversity trumps ability” theorem (Hong and Page 2004), about which serious doubts concerning both its significance and applicability have rightly been raised (Thompson 2014; Weymark 2015). Further, they do not depend on simulations, like Weisberg and Muldoon’s (2009), which have also been questioned (Alexander et al. 2015; Thoma 2015).
  • 19
    See, e.g., fn.12. Of course, not all diversity (such as diversity about epistemic standards) is welcome. See also Rolin 2011.
  • 20
    At this point, one might complain that even if a group is heterogeneous, minorities within it might silence themselves (for fear of being ridiculed or some other social punishment—Sunstein 2006, p.68), but a group norm that welcomes dissent can nudge the minority to contribute (Paluck and Green 2009). Of course, whether someone is then heard when speaking is another issue; people might not listen to others because of differences in social identity, gender, etc., but, as we will see, intellectual humility is in part required to promote such listening. Anyhow, the evidence in favor of consistent group polarization in deliberative events is only found in homogeneous groups (e.g., Levendusky et al. 2016; Luskin et al. 2014). Having said that, Sunstein (2006, p.55) notices that deliberation can promote an increase not only in group confidence but also in group unity: “After talking together, group members come into greater accord with one another.” But there is nothing per se wrong with that result, and in fact the aforementioned diversity for collective deliberation is only required pre-deliberation and so can be transient (cf. Zollman 2010, who argues for the benefits of transient diversity in a different, community-wide, division of cognitive labor).
  • 21
    A virtue is understood as consisting of attitudes and dispositions of the agent which “perfect” a natural human faculty or correct for proneness to dysfunction and error in certain situations (Roberts and Wood 2007, p.59).
  • 22
    Roberts and Wood (2007, pp.236ff.) can be interpreted as understanding intellectual humility as opposed to intellectual arrogance while neglecting the opposite extreme of self-deprecation. It seems, however, that one can be too humble.
  • 23
    Note that the virtue requires not only the critical skills to intellectually self-govern but also the willingness to exploit them (Baehr 2020).
  • 24
    Indeed, relying on others seems to be cognitively fundamental for beings like us and, given our social and cooperative nature, one would expect social arrangements, such as the aforementioned divisions of labor, to be in place in order to help us overcome our limitations (De Brasi 2018; Kitcher 2011).
  • 25
    See also Dunning 2019, p.230 and Myers 2019, p.93.
  • 26
    So too is disinterestedness, if scientific research is to be under the scrutiny of fellow experts.
  • 27
    See also Kidd 2011, van Dongen and Paul 2017, Jussim et al. 2019, Mayo 2019, McIntyre 2019, Myers 2019.
  • 28
    While American academia seems to lean strongly to the political left, European academia seems to be more heterogeneous (van de Werfhorst 2020). There are, of course, many different characterizations of ideology (Eagleton 1991, pp.1-2). Here I am concerned with political ideologies, which I take to be bodies of values and beliefs characteristic of particular political groups. Now, given that the epistemically beneficial cognitive diversity pointed out above includes diversity of opinion, it seems that diversity of opinions related to ideologies will also be epistemically desirable (here I focus on ideology-related beliefs as opposed to values; see fn.29). But, of course, as a referee rightly pointed out, it might be the case that some diversities in ideology are not ethically and/or epistemically desirable (cf. Hicks’ (2011) “Nazi problem”). Having said that, as far as beliefs are concerned, if the above epistemic benefits are to be had in collective deliberation, then diversity of opinion shouldn’t, epistemically speaking, be unwelcome, even if it can, in some cases, be ethically undesirable.
  • 29
    For example, Tetlock (1994) shows the negative epistemic effects that the homogeneity of political values, which shape standards of evidence in testing competing hypotheses, can have. And Duarte et al. (2015) emphasize the negative role that such homogeneity can play in the selection of research topics and in the misattribution of traits to people with contrary political views. Redding (2001) considers these and other examples of the negative effects of homogeneity with regard to ideology-related values. This, of course, is not to suggest that some epistemic and other values cannot play a legitimate role in scientific inquiry, as long noted by Kuhn (1977), Longino (1990) and Douglas (2009), among others, but only that homogeneity with regard to some values can have negative epistemic effects. Although this is an important issue that invites further consideration, my concern here is with the negative effect that the homogeneity of beliefs related to ideologies can have.
  • 30
    Typical examples are beliefs about climate change, nuclear waste and the like that are associated with left and right ideologies. More relevant to our purposes are beliefs, say, about the dogmatism and self-deception of the right; see Tetlock 1983 and Jost 2017.
  • 31
    As some have recommended in order to avoid (intentional or not; fn.4) p-hacking (Simmons et al. 2011; Jussim et al. 2019).
  • 32
    Notice that real dissent is needed (even if by a minority), not someone, contra Tollefsen (2006), playing devil’s advocate (Crisp and Turner 2011).
  • 33
    So one would expect deliberation among citizens and scientists to go wrong in cases where the citizens, the scientists or both do not instantiate the appropriate intellectual character. And, in fact, some public scientific discussion within heterogeneous groups seems to be rather unfruitful. If the above is right, then this is due to a lack of appropriate intellectual character, which can be empirically tested by means of survey-based studies or more experimental designs.
  • 34
    Of course, one might worry about this testing, given that the ideological homogeneity of the researchers might influence the tests’ results; after all, intellectual virtue is highly contextual (Battaly 2015; Pigden 2017; Madva 2019), so the dispositions and attitudes associated with the virtues need to be checked within the relevant context. Now, leaving aside that there is still some disagreement as to how to measure these virtues (not least because of the different conceptions of them available), some scales have been developed to be applied to diverse intellectual domains, such as the political one (e.g. Hoyle et al. 2016). But notice that such scales concern self-reports about, say, one’s tendency to consider others’ perspectives (within the political domain) and to accept feedback from others, even if negative (within the political domain), so it is not clear (leaving aside worries about the reliability of these reports) that the researchers’ ideology can interfere with the selection (given this sort of measurement). Moreover, given the virtues’ context-relativity (which needn’t commit us to a form of epistemic relativism; Battaly 2015, p.66), someone might worry about the psychological reality of the virtues. For example, situationist research in social psychology suggests that (often trivial) environmental variables can have greater explanatory power than character traits. And although it is true that there is good evidence that sometimes situations are quite powerful (Benjamin and Simpson 2009), the stronger claim, which also appears to have widespread acceptance, is that personal traits and individual differences have little to no effect once the impact of the situation is accounted for (Jost and Kruglanski 2002). Harman (2000) famously argues against the existence of moral character traits and Doris (2002) argues, more moderately, that only “narrow” (i.e. local), as opposed to global, traits can be empirically supported. And, as one would expect, the debate has recently moved to the intellectual domain (Fairweather and Alfano 2017). But this seems to overlook scholarship that has produced contrary evidence. In fact, the typical effect size for a situational effect on conduct is about the same as the typical effect size for personality characteristics (Fleeson 2004; Fleeson and Noftle 2008; Funder 2006). Indeed, some virtue theorists have argued that the person-situation debate is misguided and that situational factors should in fact be exploited in virtue development (Athanassoulis 2016; Battaly 2014; McWilliams 2019).
  • 35
    The benefits of such perspectives are undeniable. A remarkable example of scientific progress due to the introduction of women’s perspectives into a field is that of primatology, and particularly the study of primates’ social and sexual behavior; see, e.g., Lloyd 2005. See also Longino 2002, p.132.
  • 36
    The research for this article has been supported by the Fondecyt project #1180886 (Chile). I would like to thank the anonymous referees for their helpful comments.
  • Article info CDD: 300

Publication Dates

  • Publication in this collection
    21 Oct 2020
  • Date of issue
    Jul-Sep 2020

History

  • Received
    22 July 2020
  • Received
    11 Sept 2020
  • Accepted
    16 Sept 2020
UNICAMP - Universidade Estadual de Campinas, Centro de Lógica, Epistemologia e História da Ciência Rua Sérgio Buarque de Holanda, 251, 13083-859 Campinas-SP, Tel: (55 19) 3521 6523, Fax: (55 19) 3289 3269 - Campinas - SP - Brazil
E-mail: publicacoes@cle.unicamp.br