
Accounting scholarship and management by numbers

There is a plethora of rankings of universities, departments, and individual researchers, based on a variety of indices. These invariably include a measurement of research, usually based on a combination of the quantity and quality of journal publications. Informal discussions with accounting researchers inevitably turn to the question of journal rankings and performance management indicators. Why is this so?

The reasons are complex, but granters of research funding, especially government sources, wish to concentrate funds in centres of excellence. Given the drive to mass university education, it is deemed impossible to fund every academic researcher. Also, research-based universities are seen as triggers of economic development, as they can innovate in terms of high value-added goods and services. Consequently, policy-makers are sensitive to international rankings of domestic research universities with respect to research output and its impact. There is also a legitimate desire to monitor and hold accountable those who receive grants and to check whether funding is proving effective.

For research-based universities, rankings are vital for reputation and impact, not least in recruiting the best students, often at a premium price, attracting leading researchers, and gaining research funding. Thus, senior management and specialized units constantly monitor and seek to manage research performance. In principle, this is reasonable: in the past, some (but probably a small minority of) academics funded to conduct research neglected their duties in this respect. It is wrong to assume that there was a 'golden age' in which autonomous researchers invariably made substantive advances.

Arguably, the turn to greater research evaluation has brought benefits for accounting research. Its output and range in terms of theory, issues, and empirical studies have increased enormously over the past forty years. Accounting has become a vibrant social science. The rankings also give protection to accounting researchers: university administrators cannot staff accounting departments only with non-researchers committed solely to teaching, who often merely replicate professional courses, or starve accounting departments of research funding by diverting funds to other areas, without risking a diminution in external research quality indicators. Given the scarcity of effective accounting researchers internationally, their value has increased, which is reflected in salary levels.

However, this context has also brought perils. University and department rating indices are many, and they cover (albeit very differently) a wide range of factors, such as research impact, contributions to the academic and wider community, student earnings after graduation, and student teaching evaluations. Controversies have emerged concerning the reliability, measurement, and effectiveness of such indices. For example, high student teaching ratings are often correlated with the generosity of marking: hence a tendency towards upward grade drift and a dumbing down of the curriculum and testing. Journal evaluations by peers often rely on judgements by deans, who may be unaware of the modus operandi in specialized areas, and by established academics, who may resist new ideas challenging their work. The ability to raise research grants is not invariably linked to effective outputs. Rankings can be subject to manipulation and the pursuit of vested interests. For example, there are frequent anecdotes of editors requesting authors to delete references to competitor journals and include more references to their own journals, in order to boost their citation rates and diminish those of their competitors. Moreover, citations can be unreliable. Review papers tend to be cited more frequently than specialized empirical papers, and not all papers in top-ranked journals show high citation rates, whereas many papers published elsewhere do. An analysis of all these issues is beyond the scope of this contribution, which focuses on just one element, namely the ranking of accounting research journals and its role in performance evaluation.

Evaluating research performance is problematic (Gray, Guthrie, & Parker, 2002; Guthrie, Parker, & Gray, 2004a). It can be measured in several ways, including the volume of publications, their 'rank' according to the publication medium, the amount of research funding, or the number of copyright or patent agreements. Journal rankings come from numerous sources. First, there are indices constructed by publishers, such as the Science Citation Index, Web of Science, Scopus, and ISI, which are based on citations in selected journals. Second, there are innumerable indices constructed by individual universities and their departments, especially in the USA. Scoring varies considerably, but is often based on senior academics', often deans', perceptions of research quality. Third, there are national ranking lists, of which the UK Chartered Association of Business Schools' list (CABS) and the Australian Business Deans Council list (ABDC) are especially prominent. Both have been developed over time and play a significant role in research evaluation exercises conducted by their respective governments. Neither follows mechanistic metrics; instead, panels of academic experts base their scores on perceptions gained from scrutiny of public submissions, qualitative and quantitative data, public exposure and feedback, and international expert consultation. Both carry warnings that they should not be used for 'absolute' judgments. Regardless of the method, the result is often a ranking of journals on a scale ranging from A* to D. This has led many universities to struggle to recruit and retain '4 × 4' scholars (those who have 4 papers published in top-ranked journals within a 4-year period). Staff who do not achieve such results may suffer consequences in terms of retention, extra teaching loads, and promotion.

As might be expected, journal and university research rankings have undergone considerable scrutiny and debate (see, for example Critical Perspectives on Accounting, 2015, 26(1), and Academy of Management Learning and Education, 2009, 8(1)). There are even journals devoted to the theme, e.g. Research Evaluation and Evaluation: The International Journal of Theory, Research and Practice. The outcomes raise serious issues for accounting research internationally.

Many indices, especially those by publishers and U.S.-oriented business schools, cover only a small number of journals within management and the social sciences. This is especially marked in accounting, where the Journal of Accounting Research (JAR), The Accounting Review (TAR), the Journal of Accounting and Economics, and, to some degree, Accounting, Organizations and Society (AOS) and Contemporary Accounting Research (CAR) predominate. Many, if not most, of the other journals are either excluded or lowly rated. All of these journals are from the USA, except AOS, which might, arguably, be classified as a 'mid-Atlantic' journal. All of them, except AOS and, to a lesser degree, CAR, are oriented to quantitative, economic, and positivist research, especially in financial accounting, and, paradoxically, given their designation as leading international journals, they are heavily concentrated on U.S. data and issues, something reflected in the authorship of papers and the membership of their editorial boards.

However, a feature of accounting research over the past fifty years has been the growing adoption of alternative political economy and social science theories and paradigms, and of qualitative research methods, especially within Europe, as well as an extension into new topics, e.g. increasing civil society involvement and social and environmental accounting. Over the past 20 years, much high-quality work has been published in an increasingly diverse and larger group of journals (Guthrie, Parker, & Dumay, 2015). This has brought divisions within the academic community. For example, Lowe and Locke (2005), in a survey of British accounting academics, found an approximately bimodal distribution of perceptions of journal rankings: those with a positivist/functional orientation tended to rate journals in their field most highly, while those with a critical/interpretivist orientation favoured journals oriented to their interests, e.g. AOS, Critical Perspectives on Accounting (CPA), and the Accounting, Auditing & Accountability Journal (AAAJ). Generally, there was evidence of growing perceptions of high quality attributed to these and other newer journals, such as Management Accounting Research.

Potentially, this is exciting, as the variety of approaches to research and the emergence of new topics may lead to more vibrant accounting research, but this has not yet materialised. Rather, it is more like two ships passing in the night. The positivist/functional, allegedly high-quality, journals rarely publish papers in newer and emerging fields, their authors rarely cite works outside their methodological circle, and even when shifting towards newer topics, they ignore relevant research outside their select domain of journals. The silence of JAR, TAR, and the Journal of Accounting and Economics regarding methodological criticism has been deafening. This has consequences. For example, scholars previously focused on financial and investor returns have begun examining the impact of social, environmental, and sustainability strategies and disclosures. Such works, largely informed by economic theory, are increasingly presented at U.S. and similarly oriented conferences, but they display no reference to, or knowledge of, prior research in the field (Guthrie & Parker, 2016). Thus, the overwhelming evidence of 'greenwashing' and of the limitations of business case solutions is ignored. This is curious as, under some measures, social and environmental studies are the most frequently cited and downloaded papers from accounting journals. In part, this myopia may be due to the narrowness of U.S. Ph.D. programmes, which limit reading to a small range of 'elite' U.S. journals. For example, Schwartz, Williams, and Williams (2005) found that students in U.S. Ph.D. programmes had little familiarity with journals other than those traditionally regarded as 'premier' journals, and that this was more marked in the more 'elite' programmes. Moreover, there are widespread reports of leaders in the U.S. accounting academy warning junior faculty and Ph.D. students about the potential career damage of publishing in 'alternative' journals.

Conservative journal rankings pose a series of problems for global accounting scholarship. High ratings are linked to the dates of foundation of journals: basically, older, well-established journals have high ratings. Commercial indices (where publishers can have a vested interest) tend to ignore newer journals. Getting a journal into their indices requires a long process of submissions which, even if successful, is likely to take at least six years. This might not be unreasonable if one believed that entering the highest level of journal quality rankings requires substantive proof of competitive prowess and that the so-called 'elite' journals cover the full spectrum of accounting research. However, this is not so. Given the pressures placed upon academics to publish in 'elite' journals, they may restrict their research to the narrow fields covered by these journals. Hence, the focus of accounting research is frozen in time, pursuing topics and employing methods favoured by previous generations. For example, accounting history, the focus of much research in Latin countries, is discouraged (nearly all journal indices inexplicably assign low ratings to meritorious journals such as Accounting History). The list of research topics neglected by many leading accounting journals is almost endless. It includes vital areas such as accounting and development and public sector accounting. The results can be bizarre. For example, researchers in many developing countries often mimic research on U.S. capital markets, although their countries have weak, sometimes nearly non-existent, capital markets, economies dominated by the State and publicly-owned enterprises, and issues such as corruption and civil society involvement that are paramount. Despite the worldwide commitment to the Millennium Development Goals, which emphasise a broad range of development objectives, e.g. education, health, the eradication of poverty, and environmental sustainability, national accounting research becomes deflected from such issues in pursuit of better means to further corporate and stakeholder interests. The problem is not confined to poor countries. Palea (in press) argues that the widespread practice of assessing academic quality by journal rankings that conform to U.S.-mainstream criteria has led European researchers to neglect research relevant to the European economic and socio-political context and to the fundamental objectives and constitution of the European Union.

Most research indices rely exclusively on journals published in English, with obvious disadvantages for works written in other languages. However, if rankings are constructed to cover scholarship worldwide on an online basis, irrespective of the language of publication, then quite different results can ensue. For example, the French accounting scholar Gérard Charreaux has accumulated 30 citations in ISI-listed journals over his lifetime, whereas Google Scholar, a web-based citation search, finds that he has received over 1,000 citations. Most of Charreaux's citations come from French journals not listed in the ISI database, but it would be wrong to conclude, based on ISI data alone, that Charreaux has had little impact on his field (Adler & Harzing, 2009). Moreover, a global knowledge network of producers and consumers of academic research outputs has emerged alongside an increasing volume of work from various emerging countries (e.g. China, Brazil, Russia, and South Korea), but this is rendered invisible by most indices (Larivière, Lozano, & Gingras, 2014).

Most accounting research evaluations privilege a single form of academic output: peer-reviewed journal articles cited by other scholars in their published works. Books, monographs, conference papers, and chapters are mostly ignored (the exception is Google Scholar), and even web-based citation search engines are weak at recording citations within and to such works (for which some publishers can take the blame). The result has been the virtual disappearance of the single-authored scholarly book within accounting research. Moreover, seeking rapid publication to meet the short time horizons of tenure-track decisions, Ph.D. theses increasingly comprise several chapters in the format of prospective papers ready for submission to journals. Thus, they may sometimes only lightly cover methodological issues. The emphasis on journal publications has encouraged the 'salami-slicing' of research results and the eschewal of comprehensive presentations of research in book format. This is curious for, in other social sciences, e.g. anthropology, sociology, and politics, books remain an important medium for disseminating knowledge and are evaluated accordingly (Guthrie, Parker, & Gray, 2004b). From personal experience, it is increasingly difficult to persuade academic leaders to contribute to edited books seeking to make knowledge of their field more accessible to a broader audience, be they students, practitioners, or policy-makers. For example, along with colleagues, I recently edited a collection of works on accounting in less developed countries (Hopper, Tsamenyi, Uddin, & Wickramasinghe, 2012). The aim was to review research in the field in a single collection to encourage and aid new entrants to the area. Fortunately, we eventually found suitable authors, but several first-choice authors replied that they would have been delighted to contribute, as they recognised the need for such a book, but were prohibited from doing so by edicts from deans stipulating that they must submit only journal articles to 3- or 4-ranked journals.

For some, superficial research quality assessments (especially journal ratings) are driving out the traditional model of a university as a forum where colleagues engage with ideas and insights (Gendron, 2008; Hopwood, 2008). The benchmarking of journals can be the antithesis of scholarship and of the pursuit of knowledge, research creativity, risk-taking, disciplinary breakthroughs, and engagement with communities, professions, government, and business. For some, it reflects the commercialisation, corporatisation, and financialisation of universities globally, whereby research becomes a commodity proxied by a simple key performance indicator. The global expansion of the Higher Education market means that creating a reputation and exhibiting research performance have often been reduced to maximising such scores, with little regard for the quality and significance of the knowledge produced (Gray et al., 2002; Neumann & Guthrie, 2002).

This has created a market for research and researchers who can be commercially traded. Research output quantity, journal rankings, and a researcher's publishing record have become all-embracing objectives (Marginson & Considine, 2000). This not only disadvantages emergent journals and areas of interest, but also contributes to a debasement of academic culture, whereby business schools and academics emphasise status and league table positioning rather than addressing important issues of concern to wider society. Now, many researchers identify themselves not by their research field, but through methodological allegiance and the 'top' journals they publish in, which fosters a monoculture where the publication medium is more important than its content and contribution (Willmott & Mingers, 2012).

One of my most depressing duties over recent years has been sitting on appointment panels across various universities whose members, especially senior academic managers, simply evaluate candidates by the number of publications weighted by a journal quality measure over a period of time. Often this has resulted in the appointment of individuals pursuing trivial research. Once, when asked to make such a rating, I could not resist stating that I was an academic, not a scorer. On reflection, I wonder why some departments go through the ritual of interviews when they could simply rank applicants according to their research scores, draw a baseline acceptable score, and select from the top of the list, just as in zero-based budgeting. Similarly, it is sad to see young researchers pressured by 'publish or perish' imperatives at the expense of their broader research development. In many respects, universities have reproduced methods of controlling and processing knowledge akin to the Fordist mass-production model.

Nevertheless, despite the above reservations about evaluating research attainments simply by volume and quality indicators, I would not advocate their complete abolition, as others recommend. As mentioned, researchers need to be accountable. Moreover, in supplying somewhat more objective evidence, such indicators can facilitate equal opportunities, not least by reducing discrimination against women and foreigners. Leaving appointments, promotions, and the allocation of research funding to the fiefdom of senior academics, uncorroborated by more objective evidence, has promoted and can promote discrimination and patronage. For those seeking harder evidence of research quality, the ABDC list provides the fairest and most extensive source, though it should be cross-checked against Google Scholar, especially for papers not published in English. However, as most accountants might recognise, relying on simple numerical indicators alone is not reasonable. Adjudicators must, at a minimum, read a researcher's work and evaluate its potential worth instead of just checking external key performance indicators: Is it innovative? Does it address important policy and practice issues? Paradoxically, it has been my experience that leading departments of accounting have established their reputation not by copying, but by leading the field. However, my experience is that, rather than listening to expert professorial judgements, the relatively uninformed views of senior university managers increasingly predominate, which stifles such creativity. It is incumbent on grant-giving agencies, be they government departments, charities, or professional associations, to appoint panels of adjudicators that span the entire gamut of methodological approaches now used within accounting research, and to ensure that submissions are judged by the methodological and practical canons underpinning that research approach rather than by those of a competing paradigm. This would normally require extensive consultation, diverse expert advice, and the soliciting of external comments from interested parties.

Lastly, throughout my academic career, I have heard editors of top-ranked journals pursuing a positivist/functional agenda publicly pronounce that they are open to new ideas and different research approaches. However, changes in the contents of their journals have been negligible. Perhaps they are simply ignorant of alternative emergent work, feel threatened by it, or simply do not understand it. The problem is not that their journals are not legitimate and worthy vehicles for diffusing esteemed specialist areas of research, but rather that they purport to be (or are perceived as) the adjudicators of all leading research internationally. Persuading them to change their editorial practices is almost certainly doomed to failure. However, such a state of affairs does not and need not prevail internationally, as is attested by the rise of newer journals conveying fresh ideas. It is incumbent on accounting professors wishing for a plurality of methods and topics to ensure that this plurality is enacted in decisions made by managers within and beyond their institutions. Perhaps the most important immediate means of doing so is ensuring that Ph.D. programmes in accounting cover the wide range of topics, theories, and methods now resplendent in the accounting literature. This may leave a lasting and valuable legacy.

REFERENCES

  • Adler, N. J., & Harzing, A. W. (2009). When knowledge wins: transcending the sense and nonsense of academic rankings. The Academy of Management Learning & Education, 8(1), 72-95.
  • Gendron, Y. (2008). Constituting the academic performer: the spectre of superficiality and stagnation in academia. European Accounting Review, 17(1), 97-127.
  • Gray, R., Guthrie, J., & Parker, L. (2002). Rites of passage and the self-immolation of academic accounting labour: an essay exploring exclusivity versus mutuality in accounting. Accounting Forum, 26(1), 1-30.
  • Guthrie, J., & Parker, L. D. (2016). Whither the accounting profession, accountants and accounting researchers? Commentary and projections. Accounting, Auditing & Accountability Journal, 29(1), 2-10.
  • Guthrie, J., Parker, L. D., & Dumay, J. (2015). Academic performance, publishing and peer review: peering into the twilight zone. Accounting, Auditing & Accountability Journal, 28(1), 2-13.
  • Guthrie, J., Parker, L., & Gray, R. (2004a). From thesis to publication. In S. Burton & P. Steane (Eds.), Surviving your thesis (pp. 232-247). London: Routledge.
  • Guthrie, J., Parker, L., & Gray, R. (2004b). Requirements and understandings for publishing academic research: an insider view. In C. Humphrey & W. Lee (Eds.), The real life guide to accounting research: a behind-the-scenes view of using qualitative research methods (pp. 411-432). Oxford: Elsevier.
  • Hopper, T. M., Tsamenyi, M., Uddin, S., & Wickramasinghe, D. (Eds.). (2012). Handbook of accounting and development. Cheltenham: Edward Elgar.
  • Hopwood, A. G. (2008). Changing pressures on the research process: on trying to research in an age when curiosity is not enough. European Accounting Review, 17(1), 87-96.
  • Larivière, V., Lozano, G. A., & Gingras, Y. (2014). Are elite journals declining? Journal of the Association for Information Science and Technology, 65(4), 649-655.
  • Lowe, A. D., & Locke, J. (2005). Perceptions of journal quality and research paradigm: results of a web-based survey of British accounting academics. Accounting, Organizations and Society, 30(1), 81-98.
  • Marginson, S., & Considine, M. (2000). The enterprise university: power, governance and reinvention in Australia. Cambridge: Cambridge University Press.
  • Neumann, R., & Guthrie, J. (2002). The corporatization of research in Australian Higher Education. Critical Perspectives on Accounting Journal, 13, 721-741.
  • Palea, V. (in press). Whither accounting research? A European view. Critical Perspectives on Accounting.
  • Schwartz, B., Williams, S., & Williams, P. F. (2005). U.S. doctoral students' familiarity with accounting journals: insights into the structure of the U.S. academy. Critical Perspectives on Accounting, 16(2), 327-348.
  • Willmott, H., & Mingers, J. (2012). Taylorizing business school research: on the "one best way" performative effects of journal ranking lists. Human Relations, 66(8), 1051-1073.

Publication Dates

  • Publication in this collection: May-Aug 2016