Political Training in Four Generations of Activists in Argentina and Brazil

This paper, which is part of wider research on the transformation of political linkages in Argentina and Brazil, analyzes a specific dimension: political activist training. It seeks to understand how transformations such as weaker partisanship and intense political fluctuation manifest in the way activists have defined and experienced political training. I examine narratives in interviews held between 2007 and 2015 with four generational groups of activists, classified according to the historical period in which they engaged in youth activism. All [...]


Transparency criteria and replicability
At the request of the Editors of the Brazilian Political Science Review (BPSR), I provide here a detailed description of some of Aguinis and Solarino's (2019) 'transparency criteria' (I was asked to detail at least two) as I apply them to my own research. After that, I will also discuss some of the postulates of Aguinis and Solarino's (2019) article. My intention is to express and substantiate certain concerns regarding their proposal for widespread use of their list of transparency criteria, as well as their expectation of 'replicability' on the part of editors and reviewers evaluating qualitative research articles in social science journals (especially in political science journals).
First, I find it important to point out the following: on the one hand, I value transparency in academic research, and a good initiative from journals would be to encourage authors to send a separate file of "supporting information" describing different aspects of the research design, fieldwork, and analysis, with a more flexible word limit. On the other hand, I disagree with the idea that replication is a desirable goal when it comes to qualitative research. I will return to this matter later.

Second, my article "Political Training in Four Generations of Activists in
Argentina and Brazil", to be published in the BPSR, is only a small part of a broader research project on the transformations of political linkage in both countries, through the study of activists' political engagement in organizations that actively supported the Kirchner (2003-2015) and the Workers' Party (2003-2016) administrations. So, most of the aspects that I will describe here are associated with my broader research project rather than with this particular and more recent article, which resulted from the analysis of one single aspect of activism (training) and used a smaller sample, with a selection of interviewees divided into four generations. Aguinis and Solarino refer to a similar case when they say that their transparency criteria should not be applied rigidly to all qualitative research: although the criteria are broad in nature, not all of them apply to every situation and type of qualitative study; for example, a manuscript may describe a qualitative study that is only a small portion of a larger effort (Aguinis and Solarino, 2019: 1311).
With the two above-mentioned specifications in mind, I have chosen the following six "transparency criteria" out of the 12 formulated by Aguinis and Solarino (2019). In the following pages, I mention each specific criterion and also quote Aguinis and Solarino's indication of when that criterion is met (according to their table on page 1299).

Criterion #1: "Kind of qualitative method"
"The authors clearly identify the type of qualitative research approach they adopted" I will describe here only aspects of my qualitative approach that were not mentioned in the BPSR article itself (in its section "methodological decisions" and the appendix). It is also important to clarify some issues.
First of all, unlike the articles analyzed by Aguinis and Solarino, my research's main methodological technique did not consist of interviews with "elite informants" (IEIs). In the course of my broader research project (of which the BPSR article is only a part), a few of the subjects (Baltasar, Leonele, and Enrique, in Brazil, and Javier, in Argentina, all fictional names) could be considered "key informants", which is not equivalent, in qualitative research, to EIs. 1 In other words, they had considerable and useful knowledge and provided me with important insights.
1 Here I draw on Rosana Guber's definition of "informante central" ("key informant", in English), that is, a person seen as a fundamental source of information about a wide range of significant issues related to their own culture or social unit. This informant represents privileged and qualified access to the culture we are studying (Guber, 2004: 86).
Second, I wanted to approach the subjects' narratives and interpretations following two main features of the interpretive paradigm I adopted within qualitative research: 1) an interest in meaning and interpretation, with an emphasis on the importance of context and processes, coupled with an inductive and hermeneutic strategy (Maxwell, 2004); and 2) the double hermeneutic (Giddens, 1976) present in the interpretive paradigm, in which the researcher interprets something that has already been interpreted by the subjects themselves, thus creating second-order concepts. As researchers, we reinterpret a situation that is meaningful (in other words, that makes sense) to those who participate in it (Bryman, 2000; Denzin and Lincoln, 2005), and we clearly commit to looking at events, actions, norms, and values from the perspective and frames of meaning of the subjects being examined.
The ontological perspective underpinning the interpretive paradigm (in other words, how it conceives the nature of reality) is one in which reality is perceived as socially constructed and relative. Now, if reality is a social construction, then there are several realities: at least the realities of the actor (or actors), the observer, and the eventual reader of the study (Denzin and Lincoln, 2005). The methodological perspective of the interpretive paradigm involves inductive mechanisms and a focus on the particular characteristics of what we seek to understand, thus resulting in an in-depth analysis of the case. As a qualitative researcher, I did "not seek laws but meanings; the importance of these kinds of findings lies in their specificity and circumstantial character" (Vasilachis de Gialdino, 1992: 60, my translation). In terms of results, the interpretive paradigm assumes the impossibility of generalizing and predicting. Theories are not verified or falsified by the facts but appear as a result of understanding those facts. Thus, while quantitative research is associated with the accumulation of knowledge for purposes of control and prediction, qualitative research aims to understand the meaning of social action in specific contexts (Bryman, 2000; Meo and Navarro, 2009).
I would like to clarify a third issue: different traditions had some bearing on the research. For example, even though this is not a grounded theory study, the fieldwork and data analysis were influenced by grounded theory's notion of a reciprocal relationship between data gathering, analysis, and theory, in which unforeseen elements can be crucial and concepts do not necessarily pre-exist the empirical analysis but result from induction (Strauss and Corbin, 1990; Glaser, 1992); the research also followed some of grounded theory's techniques and criteria. In that sense, Schatz's (2009) concept of "ethnographic sensitivity" for political studies was quite useful, especially for raising attention to the importance of gathering and taking into consideration the meanings that subjects give to their own political and social reality, as I previously mentioned.

Criterion #3: "Position of researcher along the insider-outsider continuum"
"The authors clearly position themselves on the insider-outsider continuum" I was clearly seen as an outsider by all the organizations and subjects I had access to in both countries (Argentina and Brazil). I had not been a member of any of those organizations in the past, and I did not have a personal relationship with any of the subjects. Of course, in some cases, some of the key informants I mentioned earlier in this description (whom I met in the course of the fieldwork) provided me with phone numbers of acquaintances, comrades, and compañeros/companheiros (fellow activists). These key informants not only facilitated my access to the field but most likely inspired more confidence and openness from interviewees since I was contacting them through a person they already knew.
As a qualitative researcher (and here I move beyond Aguinis and Solarino's criterion #3), I acknowledge the notion of reflexivity in the interpretive paradigm, according to which the researcher recognizes that his or her values and multiple identities have an impact on the research process. Reflexivity involves examining the researcher's "self" and inquiring about how his or her participation affects the production and analysis of data (Reay, 1996; Meo and Navarro, 2009).

Criterion #4: "Sampling procedures"
"The authors describe the kind of variability they seek and how they identified the participants or cases"
2 Aguinis and Solarino (2019) define three types of replication. "Exact replication" is the replication (by another researcher) of a previous study "using the same population and the same procedures". In "empirical replication", on the other hand, a previous study is replicated using the same procedures but a different population. And, finally, in a "conceptual replication", a previous study is replicated using the same population but different procedures.
The case selection (Argentina and Brazil during both the PT and the Kirchner administrations) and the decision on where to carry out the fieldwork (cities/geographical areas) remained the same throughout the fieldwork (2005-2015) and are already justified in the article approved by the BPSR.
The general sampling procedure was "snowball sampling", aimed at building a "purposeful sample" (Patton, 2002), that is, one with valuable "information-rich cases", to achieve a deep understanding rather than a general theory (Flick, 2004).
Thus, with this sampling procedure, the definition of the sample (who, how many, etc.) is gradual and open throughout the research process (Meo and Navarro, 2009).
As for the kind of variability I sought when building a smaller sample for the BPSR article on activist training, I used the following criteria: a) Regarding the organizations, of all the organizations that actively supported the government, I wanted those with some political relevance or public importance to be included. In Argentina, the sample included a wider range of organizations because I decided to take into consideration the numerous Kirchnerist collectives that were proliferating beyond and outside the PJ. In Brazil, conversely, since the presence of PT members in non-partisan organizations was quite significant, the concern was to represent in the [...]
As an icebreaker, the first question was always a general one about the moment the interviewee had engaged in politics. During the interviews, I practiced "active" and "floating" listening and followed tactics proposed by different authors (Denscombe, 1999; Patton, 2002; Guber, 2004; Meo and Navarro, 2009), such as moments of intentional silence and ways to animate or elicit a (re)elaboration by the interviewees. All of this was mastered in the course of the interviewing process (Kvale, 1996).
After leaving the interview site, I would often write a short post-interview memo (Hammersley and Atkinson, 1994;Strauss and Corbin, 2002), with a "thick description" 3 of the situation (before, during, and after the conversation), as well as anything else that seemed relevant during the interview but had not been captured by the recording. Sometimes, if something especially important or interesting had emerged in the conversation, I would write an "analytical memo" with ideas, potential new codes or patterns, and even questions.
3 In their article, Aguinis and Solarino (2019) say that they "see the 12 transparency criteria as an extension to what Lincoln and Guba (1985) referred to as 'thick description' in qualitative research". For me, the notion of thick description in qualitative research is, instead, evocative of Geertz's (1973) concept of thick description, which involves, in ethnography, describing human conduct by explaining not only the behaviour itself but also its context, so that the meaning given by the subject can be understood by an outsider.
There are certain ethical considerations in the qualitative research paradigm (Warwick, 1982; Denscombe, 1999; Meo, 2010) that were taken into account when conducting the interviews and documenting the interactions. Despite dealing with subjects who were involved in politics and sometimes in the public sphere, and who therefore were often familiar, to some extent, with the dynamics of interviewing, I would always give the interviewees an explicit description of the interview's dynamics from the start, as well as information about the nature of the investigation, so that informed consent could be obtained. Furthermore, I used pseudonyms or fictional names to protect the interviewees from any harm that could come from publicizing their names along with their accounts. With this decision, I prioritized their accounts over their real identities, even though in some cases their identities were themselves an important piece of information (in the case of a political leader, for example). Later, I will return to the issue of research confidentiality, which is key in qualitative research and not sufficiently addressed by Aguinis and Solarino (2019) in their list of transparency criteria (see below, under the description of criterion #12, "Data disclosure").

Criterion #10: "Data coding and first-order codes"
"The authors describe the first-order coding methodology and present the full code list" Following the tradition of grounded theory, the analysis and interpretation of data (for my broader research project) began during fieldwork, not after it (Mendizábal, 2006), in an uninterrupted and recurrent "zig-zag" sequence (Creswell, 1998). For this particular article, the analysis was not done with the use of software but manually, by taking notes and analyzing previous output (specifically looking for references to activist training). Pictures of some of these handwritten notes are in the database. This kind of manual and handwritten analysis was possible, in this case, because the sample was smaller, the article's topic was quite circumscribed, and some of the findings, such as those related to nostalgia or to activism and State immersion, had already undergone a preliminary analysis for other papers (Rocca Rivarola, 2017b and 2019).
The first-order coding (for my broader research project) combined "open coding" and "in vivo coding". For instance, the term professionalização da militância (in English, activist professionalization), which refers to full-time activism with a salary (through a job in the State, for example), emerged as a common category among Brazilian interviewees. This term eventually became one of the in vivo codes, which was then related to Ribeiro's (2010) study on the PT. The code list includes descriptive categories, categories of a more conceptual nature (or of a higher level of abstraction), and sub-dimensions of certain categories. For the analysis process, I drew on several studies on coding and qualitative analysis of interviews (Dey, 1993; Hammersley and Atkinson, 1994; Miles and Huberman, 1994; Coffey and Atkinson, 2003; Strauss and Corbin, 2002). For Strauss and Corbin (2002), for example, coding is a process that enables not only assigning tags to data, but also conceptualizing, asking questions, and providing tentative answers about the potential relationships between newly created categories.
As I revisited the interviews (in a specific order, so as to more easily identify and remember regularities or patterns within each generation of interviewees) I took "output" notes following the previously elaborated codes (notes on the interviewees' references to a particular issue related to one or more codes/categories), and I also registered how the issue of activist training emerged in the subjects' narratives. Then, I built comparative matrices with findings for the four generational groups of the sample in both countries. I am attaching the scanned image of one of these matrices.

Criterion #12: "Data disclosure"
"The authors disclose the raw materials gathered and examined during study" I am attaching, in a compressed file, all the interview recordings and transcriptions so that the editors of the BPSR can see, check, and evaluate them. However, I strongly disagree with Aguinis and Solarino's (2019) idea that disclosing these materials is the only and best way to meet this "transparency" criterion.
Furthermore, I cannot authorize the BPSR to make these materials public. The reason is simple: anonymity and confidentiality are part of a consolidated ethical principle in qualitative research (Grinyer, 2002). I believe that seeking transparency in qualitative research should not put these ethical considerations at risk. The dilemma and problems for anonymity and confidentiality created by recent transparency-related requirements (such as disclosing recordings and transcriptions) have been detected and analyzed, for example, by Monroe (2018).
Having respected these ethical considerations over all these years (for example, using fictional names for the interviewees, concealing supplementary personal data, and refraining from using parts of the interviewees' accounts that they had specifically asked me not to quote), I do not intend to radically change them in order to convey methodological rigor and transparency. Thus, I request that the BPSR dispose of all the recordings and transcriptions after the editors have checked them and verified the research's transparency.
Having presented the detailed description of six of Aguinis and Solarino's 12 "transparency/replicability criteria", I would like, at this point, to briefly discuss some of the authors' postulates. Aguinis and Solarino (2019: 1292) themselves acknowledge that the desirability of replicability is "a potentially contentious issue in qualitative research". Indeed, although there are numerous methodological handbooks and articles on rigor and even on transparency in qualitative research, that is not the case with replication (see Kincheloe and Berry, 2004, for example). Replication is, rather, a concept that the authors have transplanted from other methodological perspectives. This procedure would not be in itself problematic if the qualitative perspective could, as a result, be enriched with new terms and discussions. But one of the main conclusions of Aguinis and Solarino's work is that none of the 52 articles they analyzed (all published in the same academic journal, after undergoing thorough double-blind peer review) were sufficiently transparent. None of them met their expectations or their transparency criteria. And they go further, suggesting (inductively?) that this conclusion could be generalized to other journals. The underlying argument (see Aguinis and Solarino, 2019: 1305), therefore, is that qualitative research is not replicable and, thus, not transparent enough (as opposed to quantitative research).
I share the concerns raised by one of the anonymous reviewers of Aguinis and Solarino's article, who, as the authors themselves report, pointed out that widespread use of this list of criteria by editors and reviewers of social science journals would have serious implications for qualitative studies, possibly leading to the increased rejection of such studies. Publishing qualitative research articles would be even more challenging in political science journals, given that quantitative studies are still predominant in that field, as many authors have argued (Sartori, 2004; Munck, 2007; Soares, 2008).