
CARDIM DE CARVALHO’S CONTRIBUTION TO THE UNDERSTANDING OF DECISION MAKING IN CONTEMPORANEOUS FINANCIAL MARKETS

ABSTRACT

According to the post-Keynesian approach, uncertainty is inherent to the loci in which investors decide the portfolio allocation of their wealth (or the wealth they manage) in a monetary economy. In financial assets' markets, to deal with uncertainty, agents use instruments that evolved with time and encompass two elements of the decision-making process: (i) which premises will be considered to make a decision, the focus of Keynes' General Theory; (ii) the sequel between the premises and the very decision, the focus of Keynes' Treatise on Probability. The aim of this paper is twofold. The first is to summarize the contribution of Fernando Cardim de Carvalho to the understanding of decision-making under uncertainty. The second is to analyse the decision-making instruments used in contemporaneous financial assets' markets in light of his contribution, which is especially suitable for this goal given his deep acquaintance with the Treatise.

KEYWORDS:
Keynesian uncertainty; decision making; financial assets' markets; conventions; speculation

INTRODUCTION

Fernando Cardim de Carvalho (hereafter, Cardim de Carvalho) was not only one of the main post-Keynesian authors, with key contributions to the research program of this school of thought, but also erudite in the arts. To take some examples: in the field of music, he was a connoisseur of Johann Sebastian Bach; in the field of literature, he was keen on William Shakespeare.

In a paper published in the Journal of Post Keynesian Economics (JPKE) in 2003 (CARVALHO, 2003), he uses three of Shakespeare's tragedies (Hamlet, Macbeth, and Julius Caesar) to identify the distinct problems that have to be considered by any relevant theory of decision-making under Keynesian uncertainty. [Footnote 1: As Carvalho (1992a, p. 69) summarizes, "Keynes's notion of uncertainty is connected to the unknowability of the relevant distribution functions that preside over social processes". This concept is detailed in the next section.] He shows, rather brilliantly and creatively, that "Shakespeare is the playwright of nondeterminism", with "decision-making under uncertainty" as his central theme (2003, p. 193). Hence, "Shakespeare is an exceptional source of hypotheses" (ibidem, p. 217) for modelling human behaviour in economics and other modern social sciences.

Decision-making under uncertainty, a crucial subject for post-Keynesian theory, had already been approached by Cardim de Carvalho in other papers, among which "Keynes on probability, uncertainty and decision-making" (CARVALHO, 1988). [Footnote 2: A new version of this paper was published as a chapter in Carvalho (1992a).] In this paper, he digs into Keynes' concepts of probability and uncertainty, bringing to light the evolution of Keynes' thinking on decision making from the Treatise on Probability (TP) to the General Theory (GT). He points out that in the TP (KEYNES, 1973a), Keynes was concerned "with probability as the foundation for decision making rather than with descriptive statistics or with probability as a feature of the world as such" (1988, p. 67). In the GT, Keynes changed his focus from "the logical unfolding" of premises into a conclusion (ibidem, p. 70) "to the quantity and quality of the information on which agents base their decision-making processes" (p. 66-67). The incompleteness of this information in the case of some economic processes, such as an entrepreneur's investment decisions, results in the emphasis on uncertainty in his later work. Yet, his basic framework did not undergo any important change. Exactly because of that, as Cardim de Carvalho stresses in another paper (CARVALHO, 2010), "Keynes's views on uncertainty are clearer to those who are acquainted with his treatise on probability" (p. 711).

Besides Shakespeare's characters and entrepreneurs, agents in financial assets' markets also face shortcomings in their decision-making, as uncertainty is inherent to the loci in which investors decide the portfolio allocation of their wealth (or the wealth they manage) in a monetary economy, as Keynes points out in chapter 12 of the GT (KEYNES, 1936). Yet, in that case, agents need to choose how to allocate their wealth among different financial assets, and not between instrumental and financial (and/or monetary) assets, as analysed in Carvalho (1988).

The strategies adopted to deal with uncertainty in these markets encompass the use of instruments that change with time and involve the two elements of the decision-making process pointed out by Carvalho (1992a): (i) which premises will be considered to make decisions, the focus of the GT; and (ii) the sequel between the premises and the very decision, the focus of the TP.

The aim of this paper is twofold. The first is to summarize the contribution of Cardim de Carvalho to the understanding of decision-making under uncertainty. The second is to analyse the decision-making instruments used in contemporaneous financial assets' markets in light of this contribution, which is especially suitable for this goal given his acquaintance with the TP.

The arguments are organized as follows. In the second section, after justifying why the post-Keynesian approach is the most suitable for understanding agents' decision-making in contemporaneous financial asset markets, we present this approach based mainly on Cardim de Carvalho's contribution. Yet, we will also resort to Keynes' original works. In the third section, we present and analyse the instruments used by these agents to make decisions in such markets. The last section concludes.

1. DECISION-MAKING UNDER UNCERTAINTY: CARDIM DE CARVALHO’S CONTRIBUTION

Contemporaneous financial assets' markets are characterized by assets of different temporalities, negotiated in securities markets (spot) and derivatives markets (deferred settlement). [Footnote 3: Financial derivatives represent a market response to the context of high uncertainty and instability of expectations that emerged after the end of the Bretton Woods system in 1973. Besides its deferred settlement, another key specificity of a derivative is pointed out by BIS (1995, p. 6-7): "a contract whose value depends on the price of underlying assets, but which does not require any investment of principal in those assets. As a contract between two counterparts to exchange payments based on underlying prices or yields, any transfer of ownership of the underlying asset and cash flows becomes unnecessary." The high embedded leverage of financial derivatives stems from this second specificity.] Hence, the theoretical analysis of the logic underlying agents' decision-making in financial assets' markets must encompass both spot and derivatives assets. Consequently, the following distinguishing characteristics of financial derivatives need to be considered: (i) their markets are a zero-sum game; (ii) there are no previous inventories of contracts; and (iii) they have a finite maturity.

Already decisive in spot markets, where finite stocks of assets switch hands without prior temporal determination, the presence of agents with opposite positions (i.e., buyers and sellers) becomes essential in derivatives markets, in which the very existence of the so-called virtual contracts, with a previously defined temporality, depends on the agreement between them. Such presence is only possible because participants (speculators, hedgers, and arbitrageurs) have divergent expectations about the future price of the underlying asset.

This means that in both segments of financial assets' markets, the existence of different expectations is a pre-condition for the liquidity of the market, and hence for its very existence. If a consensus were formed among the various market participants on the future direction of prices, liquidity would disappear, as all would wish either to sell or to buy the asset in question. The market would simply stop trading, or would trade in small volumes with huge price gaps.

The neoclassical theory offers some hypotheses of expectations' formation underlying agents' behaviour in financial assets' markets. Under the adaptive expectations hypothesis (MILLS, 1961) embraced by monetarists, agents could incur learning errors, but these would not be repeated, guaranteeing the return of the economy to its long-term equilibrium. This hypothesis is valid neither for securities nor for financial derivatives markets, as a high number of agents make systematic errors and end up leaving the markets, being replaced by others.

The new classical approach adopts the rational expectations hypothesis. If we consider its original and stronger version (MUTH, 1961), rationality would reside, at a given moment, in buying securities or derivatives rather than selling them, implying that the counterpart is acting irrationally or at least in a non-rational way. [Footnote 4: In mathematical or econometric models, the existence of uniform expectations in derivatives operations is admitted by postulating that the hedger, although sharing the expectations of all other participants, is willing to perform an operation contrary to these expectations due to pure risk aversion. Although it may explain the existence of some derivatives business, this hypothesis cannot be extended to large and liquid derivatives markets in which hedging accounts for only a small fraction of trades (FARHI, 1999).] This hypothesis eliminates the possibility of systematic ex-ante and ex-post errors, resulting in perfect and homogeneous forecasting and in information that is fully efficient and accessible to all through price observation, thus making it impossible to explain the formation of divergent expectations among agents.

This version of rational expectations is a key pillar of the efficient market theory, [Footnote 5: The efficient market theory was developed in the 1950s based on the research carried out by Kendall (1953) and Osborne (1959). Further research by Mandelbrot (1963) and by Fama (1970) extended their theoretical statement, claiming that if a market is efficient no forecast can be established based on a historical series of prices and no model that reproduces over time can be identified.] according to which prices would integrate all available information, constituting the best valuation of the real value of the asset. Market prices would follow a random walk in which successive price changes are independent of each other over time. Consequently, any attempt to forecast prices would be useless and vain. Any information likely to affect assets' prices would be immediately known to all and instantly incorporated into quotations, with no business at intermediate levels. [Footnote 6: The so-called 'passive management of resources' stems from the practical application of the efficient market theory, having been adopted by several categories of agents, including mutual funds. As this class of forecasting and decision-making tool is not used in the derivatives markets, it will not be analysed in the next section.]
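To make the random-walk claim concrete, the following minimal numerical sketch (our illustration in Python; the figures are simulated, not market data) generates independent price changes and verifies that their lag-1 autocorrelation is close to zero, i.e., that the past of the series is useless for forecasting:

import random

random.seed(42)

# Simulate a random walk: successive price changes are independent draws,
# as the efficient market theory postulates.
changes = [random.gauss(0, 1) for _ in range(10_000)]

# Lag-1 sample autocorrelation of the changes; under the random-walk
# hypothesis it should be close to zero, so past changes carry no
# information usable to forecast the next one.
mean = sum(changes) / len(changes)
num = sum((changes[t] - mean) * (changes[t - 1] - mean) for t in range(1, len(changes)))
den = sum((c - mean) ** 2 for c in changes)
print(f"lag-1 autocorrelation of price changes: {num / den:+.4f}")  # ~ 0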

New Keynesians retain the assumption of rational expectations but seek to explain current financial assets markets by admitting heterogeneity in agents' anticipations. Such heterogeneity is introduced through the hypothesis of imperfect or asymmetric information (such as the cost of information or unpredictable and uncorrelated random shocks), linked to the paradoxes of Grossman and Stiglitz (1980) and Tirole (1982).

Another neoclassical approach that also supposes heterogeneous agents is behavioural finance. This approach has blossomed since the 1990s and rejects the rational representative agent hypothesis underlying the three previous approaches. Based on the work of psychologists Amos Tversky and Daniel Kahneman (TVERSKY and KAHNEMAN, 1974), it holds that agents' decisions in financial assets' markets are determined by psychological factors inherent to human beings (SHILLER, 2003, p. 93). More specifically, in the face of the difficulty of collecting and processing complex information, agents make decisions based on heuristic principles.

Even if this last approach adopts a different hypothesis about agents' behaviour, it shares with the other neoclassical approaches the hypothesis that social processes are ergodic. Carvalho (1992a) details the concept of ergodicity proposed by Davidson (1982-1983): ergodicity "demands replicability, which means that processes should be time-independent" (p. 62). This means that social processes are "replicable and obey stable statistical laws of distribution" (ibidem, p. 64). In this environment, agents could identify all the data necessary to guide their decisions through trial and error. This is the basis for the neoclassical notion of uncertainty (probabilistic risk).

Conversely, Keynes and post-Keynesians argue that in real-world economies, where time is irreversible and crucial decisions (i.e., decisions that destroy or transform the environment in which they are made) are possible, social processes are non-ergodic. Hence, agents interacting in both goods and financial markets face Keynesian uncertainty about the future. As summarized by Carvalho (1992a, p. 69), "Keynes's notion of uncertainty is connected to the unknowability of the relevant distribution functions that preside over social processes". In an uncertain context, "the tendency to gravitation towards a long-period equilibrium does not operate and (…) the kinds of behaviour described by neoclassical models are themselves irrational" (ibidem, p. 42).

To understand agents' behaviour under uncertainty, it is important to distinguish two elements of the processes of choice: "the initial data and the reasoning processes leading to the outcomes" (CARVALHO, 1992a, p. 56). As mentioned in the Introduction, in the TP Keynes was interested in the second element, more specifically, in "the construction of the relation between starting propositions and final outcomes", i.e., "with the validity of inductive methods on which to ground decision making-processes" (1992a, p. 66). Hence, even before he became an economist, Keynes was aligned with the British tradition of approaching economics as being "especially concerned with the making of decisions and with the consequences that follow from the decisions" (HICKS, 1979, p. 5, apud CARVALHO, 1992a, p. 55).

In the GT and its related papers, Keynes (1936 and 1973b) focuses on the first element of decision making, namely, the starting propositions or premises. Yet, Keynes dealt with the appraisal of the premises for the first time in the TP, "when he introduced a discussion of the 'weight of arguments'" (CARVALHO, 1992a, p. 58). This issue is taken up again in the GT in the discussion of the role of states of confidence in agents' behaviour in an uncertain environment.

Keynes' concept of uncertainty refers exactly to the insufficiency of premises, as "not only may some premises be unknown at the moment of the decision, but they may actually be unknowable", since crucial decisions and the interaction among agents change the very environment in which choices are made. In situations like that, "no meaningful numerical probability can be obtained" (CARVALHO, 1992a, p. 60). Another situation that induces the same kind of behaviour by economic agents, also emphasized by Keynes, is one in which the premises are combined in a too complex way. In both cases, the degree of confidence is important, as "the state of long-term expectation, upon which our decisions are made" does not depend only on the best forecast the agent can make, but also on the confidence with which he makes this forecast (KEYNES, 1936, p. 148). Like the weights of the arguments, the degree of confidence is a subjective function, particular to each agent making investment decisions.

In the words of Carvalho (1992a, p. 61):

For practical purposes (…) the two cases may be taken as equivalent in the sense that they are likely to induce the same kind of behaviour by economic agents. Social processes developing in historical time may be perhaps the most important example of situations where the power of reasoning is too weak to derive the global picture necessary to the identification of numerical probabilities. In both cases, the agent has to limit somehow the set of observable premises to a workable group of alternatives and to build sequels, in the knowledge, however, of its unavoidable incompleteness. The awareness by the agent of being at least partially ignorant of the influencing elements in a given process will be a crucial feature distinguishing behaviour in the Post Keynesian and neoclassical models.

According to the post-Keynesian approach, in an uncertain environment agents adopt defensive strategies to avoid paralysis. One of these strategies is liquidity preference, which is defined in the GT "in terms of the exposure to yet undefined risks that parting with liquidity implies for a wealth-holder" (CARVALHO, 2010, p. 715). Yet, as Carvalho stresses, it was only in the 1937 paper that Keynes made clear the "more revolutionary elements of his theory" (2010, p. 718), namely, the demand for money as a defence against uncertainty (the precautionary demand for money) and the link between the state of confidence and this demand. [Footnote 7: On the relationship between liquidity preference and uncertainty, see also Carvalho (2015a, 2015b).]

Another typically rational answer of agents who need to make prospective calculations and take decisions in such an environment is the resort to conventions. Keynes (1936, 1973b) characterises conventions in the following way. First, current market conditions provide a reasonable guide for decision-making (under 'normal' conditions, agents tend to give little importance to future changes). Second, agents presume that the current state of opinion, as expressed in prices and production, is based on a correct summary of the future perspectives of the economy, which they accept until something new and relevant comes up. In other words, agents' decisions are based on inductive reasoning, on the belief that the future will resemble the past. Moreover, Keynes emphasises the intersubjectivity of agents' actions. Since they believe that their knowledge is "limited, vague and uncertain" and that other agents might be better informed than they are, the individual agent tends "to conform with the behaviour of the majority or the average opinion" (KEYNES, 1973b, p. 114).

Thus, conventions emerge because agents have limited, uncertain knowledge concerning the different relevant factors affecting their decisions, as well as concerning the results of these decisions. Under fundamental uncertainty, conventions function as an anchor for decision-making processes, i.e., they act as a kind of "substitute for the knowledge which is unattainable" (KEYNES, 1973b, p. 124). In the words of Carvalho (1992b), conventions could be seen as techniques developed by agents to live with uncertainty "that will allow them to act 'as if' it was not relevant" (p. 54, authors' translation). [Footnote 8: Carvalho (2014) also addresses the link between uncertainty and conventions.]

Concerning financial assets' markets specifically, Keynes (1936) stresses in chapter 12 of the GT that in liquid stock markets the majority of agents make their investment decisions based on "a conventional valuation which is established as the outcome of the mass psychology" (p. 155). Yet, liquidity, a crucial feature of those markets, depends on the presence of divergent expectations.

Hence, a question arises: how do different opinions about the future evolution of prices emerge in a market dominated by conventional behaviour? They emerge because of the precariousness of conventions. In Keynes' words:

A conventional valuation which is established as the outcome of the mass psychology of a large number of ignorant individuals is liable to change violently as the result of a sudden fluctuation of opinion due to factors which do not really make much difference to the prospective yield, since there will be no strong roots of conviction to hold it steady. (1936, p. 154)

The possibilities of speculation arise exactly because of the divergences among agents regarding the future evolution of financial assets' prices. Speculators seek to foresee "changes in the conventional basis of valuation a short time ahead of the general public" (KEYNES, 1936, p. 154). This means that they follow a non-conventional behaviour, making decisions to buy or sell assets in the direction opposite to that of the 'mass'. It is the interaction of market participants and their heterogeneous expectations that produces the price oscillations of financial assets (and, by extension, of their derivatives).

Keynes's approach to decision making in financial assets' markets meets the two conditions for decision making to be seen as a fundamental object of economic science pointed out by Carvalho (1992a), based on Shackle (1979). Firstly, "it has to be creative" (or 'uncaused' in Shackle's terminology), i.e., "to decide cannot be reducible to mere adaptation, which means that one cannot link directly environmental conditions to behavioural results" (CARVALHO, 1992a, p. 55). Secondly, "there must be a criterion of decision and a method of constructing the sequels of a set of premises to inform the decision" (ibidem, p. 55).

Carvalho (2003, p. 191) returns to this issue in the analysis of Shakespeare's characters' choices. He summarizes the two conditions as follows: the first is that "the future must be open to be created by the act of choice"; the second is "the existence of an ordered universe. If there is no order, any sequel can follow any decision, and the act of choosing is idle". Order does not result in deterministic trajectories, independent of agents' choices. It relies on the existence of social rules and "is compatible with freedom if it is represented by patterns of reaction, rather than the result of giving an anthropomorphic character to history itself. Order is itself moving (…) It is the variety of individual behaviours that are compatible with a given ordering that allows change and evolution, allowing ever-new possibilities of interaction." Yet, chaotic stages could emerge "if disturbances are large enough to prevent new rules from being developed when current ones are challenged". But "chaos is not the unavoidable consequence of freedom, and to show this is the core of Shakespearean tragedies". [Footnote 9: Carvalho (1994) details the relationship among order, uncertainty and chaos.]

In financial assets' markets, conventional behaviour ensures order, while creativity is present in the activity of speculation, which breaks the order until a new convention arises. In these markets, we could say that chaotic stages are exactly what Keynes (1936, p. 154) called "abnormal times", in which

the hypothesis of an indefinite continuance of the existing state of affairs is less plausible than usual even though there are no express grounds to anticipate a definite change, the market will be subject to waves of optimistic and pessimistic sentiment, which are unreasoning and yet in a sense legitimate where no solid basis exists for a reasonable calculation. (KEYNES, 1936, p. 154)

2. DECISION-MAKING INSTRUMENTS IN FINANCIAL ASSETS’ MARKETS

Any operation in a financial assets' market has one of the following possible results: weak profit, significant profit, weak loss, or heavy loss. As such, all decisions concerning wealth allocation in financial instruments can be described as crucial decisions. The key for all participants in the financial assets and derivatives markets is to avoid heavy losses, in the hope that small gains and losses will offset each other until a high profit is achieved. To achieve this goal, they use the most diverse types of forecasting tools to assist them in decision-making.

The development of decision-making tools has been fostered by market diversification, increasing liquidity, and market volatility, leading investors to shrug off long-term prospects and focus on the possibilities of gains on short-term operations. In the spot markets, earnings prospects begin to look larger if decisions are made based on short-term movements, conditioned by market psychology. In the derivatives markets, time is a decisive factor that prevents agents from operating with longer horizons and neglecting daily fluctuations, especially in futures markets, where margin adjustments are daily. Also, making profits in these markets depends on the losses of the other participants, which requires much more precise and accurate forecasting and timing of operations.

In the following, we present a historical perspective on the decision-making instruments used in financial assets markets (sub-section 2.1) and detail the main instrument currently used, the so-called High-Frequency Trading (HFT) (sub-section 2.2).

2.1. FROM FUNDAMENTALIST TO QUANTITATIVE ANALYSIS

Decision-making tools began to be developed for stock exchange operations at the end of the 19th century, when liquidity was much lower than it is currently. Fundamentalist analysis, a method based on macroeconomic indicators and data related to the assets, was the first to be employed. Numerous criticisms of this kind of instrument soon emerged in the professional circles of technicians and among participants in financial assets markets. The criticisms focused on the ability of traditional methods of analysis to serve as an operational decision-making tool. Curiously, many of them highlight features of these markets stressed by the Keynesian theory that this method does not take into account, among which the subjectivity regarding both the selection of the premises and the sequel adopted by the decision-maker.

Teweles, Harlow and Stone (1974) summarize the main criticisms of this class of instrument: (i) it cannot explain, much less predict, short-term movements and oscillations; (ii) it tends mainly to provide ex-post explanations, while financial assets markets tend to anticipate economic trends; (iii) the interpretation of statistical series has a subjective bias, as the analyst selects certain indicators and may neglect others; (iv) the same objections apply to the evaluation of news as the basis of operational decisions; (v) it cannot solve the problem of the timing of transactions, which is so important in financial assets markets; (vi) the statistical information used by fundamentalist analysis does not represent the present situation, but the one existing at the time of its data collection.

Shortly after, one of the most popular methods of decision making, called technical or chartist analysis, gained great popularity when, in 1901, W. P. Hamilton, then editor of the Wall Street Journal, published an accurate analysis of U.S. Steel's first public offering of shares. [Footnote 10: In the same period, market professionals searched for other tools to support their operational decisions. One of them was based on the determination of the profile of the participants and their positions. By the mimetic effect, agents followed the orders of big investors, those who had the reputation of winners and those who were considered well informed. It was a primitive version of what is now called noise trading, which seeks to determine the position and price level on which large orders are concentrated. Despite the emergence of much more sophisticated analytical instruments, as detailed below, this mode of operation continues to play an important role in contemporaneous financial assets' markets, usually among small investors and day traders, and very strongly among the bulk of uninformed investors during sharp price reversals.] His readers supposed that he had an informant inside the company. Hamilton surprised them by revealing that all of his predictions were based on the tabulation and charting of price series and traded volumes on the market. Technical analysis was then employed in commodity futures markets (TEWELES and JONES, 1987). To a large extent, its success at that time was due to its simplicity of updating, which could be done manually without the need for complex calculations.

Murphy (1986) presents technical analysis as the study of the evolution of a market, mainly based on charts that include all market aspects (evolution of prices, turnover, and open interest in the case of derivatives), to predict future trends. For a long time, this analysis was viewed by investors, market professionals, and portfolio managers as a fast and valuable tool for forecasting price movements. It became so widely used as to be termed a 'self-fulfilling prophecy.'

Technical analysis has some basic principles. The first is that all elements that may influence the price of an asset (including those not yet in the public domain) are present and crystallized in the prices, trading volumes, and open interest of each specific market. The chart analyst does not care to know the reason for a certain price movement; he rather tries to determine when and how it should occur and to estimate its magnitude. The second principle refers to price developments. This analysis does not accept that prices have a random evolution, although it does not deny that some movements could have this feature. But these movements are considered of secondary importance. Abstracting from these minor fluctuations, the technical analyst is convinced that prices evolve in trends that last a certain time before being reversed. It is intuitive, for all who have a certain familiarity with these markets, that prices go through periods of continued highs, periods of continued lows, and times of relative stability that the market terms 'sideways.' Even a superficial observation of a price chart tends to confirm this principle.
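The first two principles can be illustrated with a hypothetical sketch in Python (the function names, windows, and threshold below are our own choices, not a documented chartist rule): a short moving average is compared with a long one to classify the market as trending up, trending down, or 'sideways':

def moving_average(prices, window):
    # Simple moving average of the last `window` prices.
    return sum(prices[-window:]) / window

def classify_trend(prices, short=5, long=20, band=0.002):
    # Stylized chartist rule: if the short and long averages are within
    # `band` of each other, the market is read as 'sideways'.
    if len(prices) < long:
        return "insufficient data"
    fast, slow = moving_average(prices, short), moving_average(prices, long)
    if fast > slow * (1 + band):
        return "uptrend"
    if fast < slow * (1 - band):
        return "downtrend"
    return "sideways"

# Usage: a gently rising series is read as an uptrend.
prices = [100 + 0.1 * t for t in range(30)]
print(classify_trend(prices))  # uptrend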

One of the most important objectives of technical analysis is to identify the emergence of these trends, their possible extension, and the reversal points that mark their changes in direction. According to its proponents, this fills one of the gaps of fundamentalist analysis, in which information, once it becomes public, no longer allows profit, as it has already been anticipated and embedded in prices. The famous (at least among participants in financial assets markets) phrase of Charles Dow, one of the founders of the Dow Jones economic intelligence agency and of the technical school, "buy the rumour, sell the fact", is frequently reaffirmed in the financial assets' markets.

By stressing the importance of studying how prices have evolved to determine their future trend, rather than determining why they evolved, technical analysts are forced to admit the third and most controversial principle, namely, that the way prices behaved in the past is indicative of how they will behave in the future. Béchu and Bertrand (1995, p. 30) state:

the technical analyst can point to the great stability of the configurations that he uses. Some methods date back to the end of the last century. Since then, the changes - political, legislative, fiscal, economic, etc. - have been numerous and of a fantastic breadth. However, the chartist settings used by the technical analysts remained the same without loss of their effectiveness. What worked well in the cotton market in 1885 continues to work in 1991 over Apple Computers' stock.

Some aspects of the logic underlying this type of instrument are compatible with the Keynesian approach to decision-making under uncertainty. First, the agents who use it (investors, asset managers, traders) adopt the convention that price performance in the past is indicative of price movement in the future and reject the hypothesis of randomness. Second, there are many elements of subjectivism in the two elements of the decision-making process. The way the premises (the market aspects) are combined and the degree of confidence in each premise vary among investors. The second element (the reasoning process leading to the outcomes) is subjective as well, inasmuch as each trader can interpret the charts in different (sometimes opposite) ways. For instance, traders can be trend followers (buying when prices are rising) or contrarians (buying when prices are falling). Third, the search for forecasting the moment of reversal of the current price trend is similar to the activity of speculation, i.e., foreseeing "changes in the conventional basis of valuation a short time ahead of the general public" (KEYNES, 1936, p. 154).

Conversely, other aspects are incompatible with such an approach. The use of prices' past behaviour (sometimes over a long period) as indicative of how they will behave in the future clashes with the non-ergodic character of economic processes emphasized by Keynes and post-Keynesians. As Carvalho (2003, p. 192) stresses, one key determinant of this character is the presence of creative behaviours, which

result from decisions that are not endogenously determined. Simply to know current data is not sufficient to allow one to anticipate a given person’s decisions because these decisions are being made on the basis of imagined (future) events. One is creating, not merely reacting to events.

It is this kind of decision, which changes the course of history, even if on a small scale, that Shackle (1979, p. 58) called a "crucial decision" (p. 192). Hence, history "cannot, by definition, be explicable by experience" (ibidem, p. 192). Like investment in instrumental assets, financial assets market trends are also shaped by crucial decisions. For instance, a boom phase could bust due to a change in the position of a big investor, resulting in huge losses for the other investors that were following the previous convention.

It is precisely because of the subjective character of technical analysis that it is impossible, despite observations that span decades, to evaluate its performance in predicting price developments. Some chart analysts, possibly more sensitive to market psychology, obtained appreciable financial results. However, others ended up losing money, although the losses were generally limited by the operational discipline that is the most important by-product of technical analysis. Such discipline spread beyond its followers and was adopted by the subsequent decision-making instruments.

The most important tool of operational discipline is the stop-loss order, whose aim is to limit losses when the movement of prices goes against the position of the agent. Stop orders involve buying back or selling a position at market price once it reaches a previously specified price level, and they can be placed at points that, according to the graphical analysis, would indicate a trend reversal if reached. Determining the price at which stop orders are concentrated is one of the favourite activities of day traders, as volatility is accentuated when they are triggered.
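The mechanism can be pictured in a minimal sketch (illustrative only; actual exchange order types and fill mechanics are more involved):

def check_stop_loss(position, entry_price, stop_price, market_price):
    # Illustrative stop-loss trigger: once the market reaches the
    # previously specified stop level, the position is closed at market
    # price, limiting the loss on the trade.
    if position == "long" and market_price <= stop_price:
        return f"SELL at market ({market_price}): loss limited to {entry_price - market_price:.2f} per unit"
    if position == "short" and market_price >= stop_price:
        return f"BUY back at market ({market_price}): loss limited to {market_price - entry_price:.2f} per unit"
    return "hold"

# Usage: a long position entered at 100 with a stop placed at 95.
for px in (99.0, 97.5, 94.8):
    print(px, "->", check_stop_loss("long", 100.0, 95.0, px))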

The operational discipline brought about by technical analysis has clear and sometimes deep repercussions on price movements. In certain circumstances, these repercussions stem from the fact that the self-imposed operational discipline functions as an additional instrument of consensus-building on prices' direction, contributing to the loss of market liquidity. There were occasions when the activation of these accumulated orders contributed significantly to an already extremely pronounced trend, in which successive levels of support were broken, reinforcing the optimistic or pessimistic sentiment and contributing to the emergence of chaotic situations instead of curbing them.

In search of a more objective decision-making instrument, financial assets market agents turned to quantitative analysis, which seeks to overcome the elements of subjectivity and to result, unequivocally, in a decision. The idea of using quantitative reasoning as an instrument is not in itself a novelty. However, its employment was rather limited before the popularization of computers, as it involved long and sometimes complex calculations that had to be updated at least daily.

Quantitative analysis adopts the same principles as technical analysis and could be seen as its sophisticated extension in the age of computers. Like technical analysis, quantitative analysis rejects the theory of efficient markets, draws on the study of past data to predict the future, and seeks to identify patterns and structures that tend to replicate over time.

Its most intensive application began in the 1970s with the use of series of past data: regression analysis to clarify the relationships between the data studied, and numerical filtering methods to isolate the basic price trend from the short-term oscillations that can mask it. Gradually, new technical measures of the behaviour of prices were introduced. Among the most interesting are: speed, which measures the intensity of price movements; the standard deviation, describing the effective behaviour of prices; and oscillators, used to determine the tensions and the power of price movements.
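These three measures can be sketched in generic textbook form (the formulas below are standard constructions chosen by us for illustration; the proprietary models discussed in the text were far more elaborate):

def speed(prices, lag=1):
    # 'Speed' (momentum): intensity of the price movement over `lag` periods.
    return [prices[t] - prices[t - lag] for t in range(lag, len(prices))]

def rolling_std(prices, window):
    # Rolling standard deviation, describing the effective behaviour of prices.
    out = []
    for t in range(window, len(prices) + 1):
        w = prices[t - window:t]
        m = sum(w) / window
        out.append((sum((p - m) ** 2 for p in w) / window) ** 0.5)
    return out

def oscillator(prices, window):
    # Stochastic-type oscillator in [0, 100]: where the last price sits
    # within the recent high-low range, gauging the tension of the move.
    w = prices[-window:]
    hi, lo = max(w), min(w)
    return 100.0 * (prices[-1] - lo) / (hi - lo) if hi != lo else 50.0

# Usage on a small illustrative series.
prices = [100, 101, 103, 102, 105, 107, 106, 109, 111, 110]
print(speed(prices)[:3], rolling_std(prices, 5)[:2], round(oscillator(prices, 5), 1))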

All of these, and possibly other significant measures of price behaviour, were embedded in complex mathematical models that required, at the time, the latest generation of information technology (IT) equipment. These models were formulated mostly in the US by the so-called rocket scientists (physicists, mathematicians, and statisticians). In any case, the secrecy surrounding the development and application of these models certainly resembles the one that surrounded the aerospace, ballistics, and defense areas at the height of the Cold War.

Since the end of the 1980s, several banks recruited these scientists at very high salaries to develop their programs. Other researchers created their own 'quant shops,' some of which were remarkably successful in managing third-party resources, while others failed on a scale that posed potential systemic risks, as was the case of Long-Term Capital Management in 1998. Midland Bank, an English financial institution, was one of the first banks to apply these models to its operations, after seven years of proprietary research led by a researcher from the University of Oxford who was a specialist in non-linear optics. Other banking and non-banking institutions have entered into agreements to provide funds to, or acquire participation in, research groups and 'quant shops' in order to obtain access to the latest and most sophisticated models.

For these scientists, financial assets markets and their derivatives could not be modelled under the assumptions of perfect efficiency and random variables. On the contrary, they view these markets as complex adaptive systems similar to climate formation, air turbulence, or cell growth. As a result, their research was conducted as if these markets were natural phenomena, using series of observations and data such as historical prices, volume, and open interest (in the case of derivatives).

In general, these models consist of nonlinear equations (i.e., the effects do not follow the causes in a clear and unidimensional way) and regressions that seek to identify how a combination of variables produces a given result, namely, how the price of a certain asset derives from several factors, among which its past prices. A model should be adequate both to explain past price movements and to predict future developments. Thus, models are initially tested with historical data to verify the rate of success they would have had in forecasting the subsequent movements of prices.
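This validation step can be pictured with a stylized backtest loop (our sketch; the naive momentum signal and the simulated data are purely illustrative, not any specific proprietary model):

import random

def backtest(prices, signal):
    # At each date, feed the signal only the history available up to that
    # date and tally the hit rate of its one-step-ahead direction forecasts.
    hits = trades = 0
    for t in range(20, len(prices) - 1):
        forecast = signal(prices[:t + 1])      # uses past data only
        realized = prices[t + 1] - prices[t]   # next-period price move
        if forecast != 0:
            trades += 1
            hits += (forecast > 0) == (realized > 0)
    return hits / trades if trades else float("nan")

# Usage: a naive momentum signal (+1 = buy, -1 = sell, 0 = stand aside).
momentum = lambda h: (h[-1] > h[-6]) - (h[-1] < h[-6])

random.seed(1)
walk = [100.0]
for _ in range(500):
    walk.append(walk[-1] + random.gauss(0, 1))
print(f"hit rate on a pure random walk: {backtest(walk, momentum):.2%}")  # ~50%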

Another feature of such models is their adaptability to each change in market conditions and the constant search for new price structures that may emerge. Some of them used computer neural networks, which were intended to imitate the functioning of the human brain and to learn by 'trial and error,' given the need for constant redefinition.

In the 1990s, the first results obtained from the effective application of quantitative analysis to the securities and financial derivatives markets seemed promising. GK Capital Management said that its exchange rate model had allowed an average gain of 12% per year since 1989. Global Market Technologies, whose models span multiple markets, claimed 20% profits in 1994, while Midland Bank's diversified portfolio gained 23% in the same period.

At the end of the 20th century, it was still difficult to evaluate the results of quantitative analysis, since huge gains were not uncommon in the financial assets markets, even in institutions that managed resources without resorting to any technological paraphernalia. The real test was whether those gains could be consistently maintained over time. As time passed, it became clear that the results obtained by this type of analysis were much more consistent in the very short term.

In comparison with technical analysis, the set of premises is bigger and its degree of subjectivism is lower, yet subjectivism is still present, as the final decision to buy or sell an asset is made by the agent, who may or may not follow the model. Such a tool also disregards the non-ergodic character of financial assets' prices, as these forecasting models are stochastic. Exactly because of that, their projections have been insufficiently accurate, which made it impossible to use them for longer-term operations. Consequently, these models focused on a short-term horizon: intraday, or around two or three days.

2.2. HIGH-FREQUENCY TRADING (HFT): ALGORITHMS AT THE CENTRE OF DECISION-MAKING

Quantitative analysis represented the initial building blocks of the current HFT, a class of electronic trading [Footnote 11: The term 'electronic trading' is used in many ways. In this paper, we adopt the definition of Allen, Hawkins and Sato (2001), which refers mainly to trading in wholesale financial markets (as opposed to e-commerce more generally) and focuses on the central feature of electronic trading systems, which is the automation of trade execution.] featured by high-speed connections and the use of complex algorithms that replace human decisions in the issuing of orders in the markets. HFT comes from the convergence between finance and technology. This financial innovation features a high sunk cost due to its strong technological content, compensated, at least in part, by a reduction in cost as less and less human labour is required to run a trading desk.

HFT is typically performed by powerful computers able to execute electronic transactions within milliseconds or even microseconds, generating a huge number of trades on a daily basis. Speed allows them to take advantage of small discrepancies in prices. This new class of electronic trading has specific characteristics, among which: super-fast dedicated cables; extraordinarily high-speed and sophisticated computer programs for generating, routing, and executing orders; powerful algorithms, to a large extent resulting from quantitative analysis, but also linked to numerous news feeds and programmed to react very quickly to specific keywords; the use of co-location services [Footnote 12: Exchanges built or are building huge data centers where traders, members and non-members alike, after paying a fee, can place computers containing their trading algorithms next to an exchange's engine, which matches 'buy' and 'sell' orders. This 'co-location' shaves crucial milliseconds from the time it takes to complete a trade (Financial Times Lexicon).] and of individual data feeds offered by exchanges and others to minimize network and other types of latencies; very short time-frames for opening and closing positions; the submission of a large number of orders, frequently cancelled shortly after; and ending the trading day as close as possible to a flat position (in which no net positions are carried overnight and thus are not recorded in the balance sheets).
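Two of these characteristics, the mass submission of orders that are quickly cancelled and the flat end-of-day position, can be pictured in a deliberately toy sketch (the class, the time thresholds, and the order flow are our own; no real exchange connectivity or matching engine is involved):

import time

class ToyHFTDesk:
    # Toy illustration: orders left unexecuted beyond a microsecond-scale
    # age are cancelled, and no net position is carried overnight.
    def __init__(self):
        self.open_orders, self.position = {}, 0
        self.sent = self.cancelled = 0

    def submit(self, oid, side, price):
        self.open_orders[oid] = (side, price, time.perf_counter_ns())
        self.sent += 1

    def cancel_stale(self, max_age_ns=50_000):  # 50 microseconds
        now = time.perf_counter_ns()
        for oid in [o for o, (_, _, t) in self.open_orders.items() if now - t > max_age_ns]:
            del self.open_orders[oid]
            self.cancelled += 1

    def end_of_day(self):
        self.open_orders.clear()
        assert self.position == 0, "no net overnight position"

desk = ToyHFTDesk()
for i in range(1000):
    desk.submit(i, "buy" if i % 2 else "sell", 100.0)
    desk.cancel_stale()
desk.end_of_day()
print(f"orders sent: {desk.sent}, cancelled unexecuted: {desk.cancelled}")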

The initial aim of HFT was to gain speed: to perform almost simultaneous arbitrage trades in various markets, and to buy or sell assets in the time-lapse that large orders, travelling at a slower pace, take to reach the market, forcing 'slow investors' to buy at a higher price or sell at a lower one.

Although its emergence in stock trading goes back to the mid-2000s, HFT boomed only after the 2008 global crisis, due to banks' and other financial institutions' reactions to the changes in financial regulations, the so-called regulatory arbitrage. In the face of the ban on banks' proprietary trading (the Volcker rule), large American banks started to reduce the big trading positions they once carried. Competition among non-banking financial institutions to grab part of this business and to restore liquidity in the markets became fierce. In this setting, many HFT firms were created with that same aim. These firms, which can trade in microseconds, are usually trading for themselves rather than fulfilling clients' orders. They are usually designated as proprietary firms. As such, they are not required to be registered and are absolutely unregulated.

Meanwhile, banks searching for new sources of revenue have engaged in HFT for their proprietary trading, as it allows them to make trading profits during the day while not carrying assets that need to be included in their balance sheets. This activity allows them to comply with the Volcker rule and, at the same time, retain at least part of those profits. The banks' rush to maintain or regain their places as brokers is still going on and involves making HFT available to their clients. For instance, Goldman Sachs, which has long been one of the top two stock brokerages in terms of revenue and customer rankings, saw its status slip because its technology did not keep up with client demands for ever-faster trades. In March 2015, Goldman hired dozens of technologists and support staff to elevate its position in a fast-growing slice of the market and win back business from its chief competitor, Morgan Stanley (MACCRANK, 2016).

This kind of operation is far from new. The trading strategy that attempts to profit from small intraday price changes has long been known as scalping. While trading occurred on the floor of the exchanges, scalpers were common figures, located at the bottom of the scale of financial resources and risk-taking. Back then, their attempts to buy (or sell) at the bid (or ask) price and then quickly sell (or buy) a few cents higher (or lower) for a profit were based on their feeling of the short-term market trend. Then internet trading became the norm, and scalpers switched to online day trading.

HFT scalping is a huge business, relying on algorithms that present the same characteristics as quantitative analysis to make very short-term profits with a higher probability. Most of the time the difference in price can be counted in fractions, but many small profits can easily compound into large gains, as shown by some extraordinary performances among prominent HFT firms. A report by Barron's (2010), for example, estimates that Renaissance Technology's Medallion - a quantitative HFT fund - achieved a 62.8% annual compound return in the three years before the report. For the banks, scalping presents an additional advantage, as it allows them to bypass the Volcker rule's restrictions.
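The arithmetic behind such figures (tiny margins compounding through sheer repetition) can be sketched as follows; this is a stylized calculation of ours with hypothetical numbers, ignoring losing trades, fees, and capacity limits:

# A two-cent spread captured on a 100-dollar asset yields 0.02% per
# round trip; reinvested 20 times a day over 250 trading days, these
# small profits compound into a large gross multiple.
bid, ask = 100.00, 100.02
margin = (ask - bid) / bid              # 0.0002 per winning round trip
round_trips_per_day, trading_days = 20, 250

capital = 1.0
for _ in range(round_trips_per_day * trading_days):
    capital *= 1 + margin               # each small profit is reinvested

print(f"gross compounded multiple over one year: {capital:.2f}x")  # ~2.72x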

However, on the one hand, those returns are very unevenly distributed. HFT firms compete among themselves by investing in small increases in speed. Then, there are differences in their algorithms and their strategies. Baron et al. (2014, p. 3) show that

HFT firms who specialize in aggressive strategies generate substantially more revenue than those who specialize in passive strategies. Moreover, revenue persistently and disproportionally accumulates to the top performing HFTs, suggesting winner-takes-all market structure.

On the other hand, the best days of HFT returns may be behind it, as its success and proliferation intensify competition and reduce benefits.

In the face of the HFT boom, a great number of electronic trading venues were created, such as Bloomberg Terminal, Thomson Reuters 3000 Xtra (replaced by the Eikon platform in 2010), BondsPro, Thomson TradeWeb, and CanDeal. Conversely, these new venues have fostered the boom even more. In that setting, a new regulation was launched that forces brokerage houses to seek among these venues the best execution for their orders. With trading in the US currently spread over 11 public stock exchanges, uncounted electronic trading venues, and more than 40 dark pools, [Footnote 13: Dark pools are trading venues that do not publish prices, volumes, or the origin of bid and ask orders.] market operators are dependent on HFT firms for liquidity and revenue. To cater to them, and in a framework of intense competition, the exchanges and dark pools sell advantages to HFT firms: myriad special order types, faster data, price rebates, and co-location (the right to put their computers in the same data centers as the exchanges so they can trade even faster).

The very stiff competition among financial institutions and technological advances have been fostering constant updates (new algorithms, faster cable connections, etc.). As a result of this competition between proprietary firms and banks, HFT has rapidly spread to the trading of all kinds of assets and their derivatives all over the world. According to Zhang (2010, p. 1), “high-frequency trading has become a dominant force in the U.S. capital market, accounting for over 70% of dollar trading volume.” In foreign exchange markets, for instance, HFT is largely employed in cross-rate operations. Chaboud et al. (2009) use the example of the euro/dollar, dollar/yen, and euro/yen pairs to underline that “in this cross-rate, computers have a clear advantage over humans in detecting and reacting more quickly to triangular arbitrage opportunities, where the euro-yen price is briefly out of line with prices in the euro-dollar and dollar-yen markets.” A recent study by McKinsey (2015, p. 4) estimates that “trading through electronic channels now accounts for 90 percent or more of spot G10 FX14 and equity transactions and is increasingly common in certain areas of rates and credit.”

The rise of HFT in the U.S. and around the world has been rapid, as competition mechanisms led to a huge expansion. Biais (2011) describes a situation in which agents invest in HFT because it is costlier to remain slow while others get faster. He compares that investment to an arms race: expensive, socially useless, but if others make it, you must match them. According to a Bank of England report (BANK OF ENGLAND, 2010), by 2010 HFT accounted for 70% of all trading volume in U.S. equities and 30-40% of all trading volume in European equities.

While HFT was being implemented, it remained relatively unknown. As it expanded, news of its existence started to spread, mostly based on information provided by disgruntled non-HFT traders (OLOFFSON and GANDEL, 2009; ARNUK and SALUZZI, 2012). It really made the headlines after the publication, in March 2014, of Michael Lewis’s bestselling book,15 Flash Boys: A Wall Street Revolt. Written almost like a novel, overcoming the complexity of the subject, Lewis’s book used the testimonies of different traders and computer scientists, who became its main characters. He explained how Wall Street had developed an overly complex, opaque system for trading stocks that levied a tax on investing. His main allegation, that the market is ‘rigged’ in favour of high-frequency traders, relies essentially on the finding that trading venues sold access to HFT technology that allowed it to gain speed and pricing advantages over public orders. None of this was new information, yet it was the first time it reached the broad public and provoked a real impact.

Markets’ behaviour also called attention to the impacts of HFT. On May 6, 2010, a ‘Flash Crash’16 occurred in the United States, lasting approximately 36 minutes. Stock indexes, such as the Dow Jones Industrial Average and the Nasdaq 100, collapsed and rebounded very rapidly. The Dow Jones Industrial Average had its biggest intraday point drop (from the opening) up to that point, plunging 998.5 points (about 9%) within minutes, only to recover a large part of the loss. The prices of stocks, stock index futures, options, and Exchange Traded Funds were extremely volatile, and trading volume spiked during that short period, in which nearly USD 1 trillion in market value evaporated. In the New York Times, Bowley (2010) described the process of contagion to other markets:

The selling pressure was then transferred from the futures markets to the stock market by arbitrageurs who started to buy the cheap futures contracts but sell cash shares on markets like the New York Stock Exchange. Automatic computerized traders on the stock market shut down as they detected the sharp rise in buying and selling. Altogether, this led to the abrupt drop in prices of individual stocks and other financial instruments such as exchange-traded funds, and caused shares of some prominent companies like Procter & Gamble and Accenture to trade down as low as a penny or as high as $100,000. The rout continued until an automatic stabilizer on the futures exchange cut in and paused trading for five seconds, after which the markets recovered.

The 2010 flash crash exemplifies the risk created by a new and accelerating trend, the market’s reliance on automated computer systems in trading, and, accordingly, a new class of risk to the markets: the computer-based trading malfunction. Since then, other computer-based trading malfunctions, or ‘glitches,’ have occurred, highlighting at-risk areas in the global trading system. One of them stands out: in 2013, a false tweet from a hacked Associated Press (AP) account, claiming that President Obama had been injured in an explosion at the White House, sent financial assets markets into a tailspin. Once the nature of the tweet was discovered, the markets corrected themselves almost as quickly as they had been skewed by the bogus information. This episode underlines the importance of the news feed in the algorithms that provide the basic foundation of HFT.
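To make the point concrete, the sketch below caricatures how a news feed can enter an automated decision rule; the keyword list and the ‘sell’ trigger are hypothetical illustrations, not a description of any actual HFT system:

```python
# Caricature of a news-sensitive trading rule of the kind the AP-tweet
# episode exposed. The keywords and the 'sell' trigger are hypothetical.

NEGATIVE_KEYWORDS = {"explosion", "attack", "injured", "default", "crash"}

def classify_headline(headline):
    """Naive keyword filter standing in for a real news-analytics model."""
    words = set(headline.lower().split())
    return "sell" if words & NEGATIVE_KEYWORDS else "hold"

# The bogus 2013 AP tweet would trip such a filter instantly, with no
# opportunity to verify the source before orders were sent.
print(classify_headline("Breaking: explosion at White House, Obama injured"))
```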

Those incidents led capital markets regulators in many countries to change regulatory and procedural safeguards. These changes are largely designed to mitigate the potential risk of volatility or damage to the markets and have not, at least so far, addressed the question of markets’ structure and the very existence of HFT, which seems more and more firmly established in international financial assets markets.

FINAL REMARKS

In this paper, we drew on Cardim de Carvalho’s contribution to the Keynesian approach to decision-making under uncertainty to analyse the instruments used in the two segments (securities and derivatives) of contemporaneous financial assets markets. In section 1, we argued that the theoretical categories of this approach explored by Cardim de Carvalho (such as conventional behaviour, uncertainty, non-ergodicity of economic processes, speculation, and degree of confidence) allow us to understand the logic behind agents’ decision-making in such markets. In section 2, we presented these instruments and analysed them in light of Cardim de Carvalho’s contribution. Our assessment yielded four main conclusions.

Firstly, the decision-making tools differ with regard to the two elements of the decision-making process (the premises considered to make a decision and the sequel between the premises and the very decision) and to the degree of subjectivity involved in these elements.

Secondly, these instruments are techniques that support agents in making decisions on the allocation of their wealth in an uncertain context. Yet, could we also classify them as conventions? Undoubtedly, they have the first defining property of every convention, i.e., conformity with conformity, which means that agents adopt them because others have adopted or are expected to adopt them. However, as Dequech (2017) clarifies, a convention has another property as well, arbitrariness, which means that a non-inferior alternative to the prevailing rule or system of rules exists or is conceivable.

We may affirm that fundamental, chartist, and quantitative analyses also show this second property, but HFT probably does not. Currently, there seems to be no non-inferior alternative to this instrument that does not involve subjectivism in decision-making, as its predecessors do. As mentioned above, the very stiff competition between proprietary firms and banks has resulted in the rapid spread of HFT to assets and derivatives markets all over the world. As the proportion of HFT increases, slow traders that use the former instruments are driven out of the market. Thus, fast traders stop realizing all the potential gains from speedy trading. In other words, they now mostly trade amongst and against each other and have to rely on the quality of their algorithms, rather than on speed alone, to obtain a profit. The arbitrariness in the case of these instruments thus seems to refer to the class of algorithm that will be used. Indeed, some money managers and investors have not yet adopted a ‘robot’ only because they cannot afford one.

Thirdly, HFT is a decision-making tool that incorporates features of contemporaneous financial assets markets emphasized by the post-Keynesians. As automatic buy or sell orders can be triggered by the news, and not only by the forecasts of the stochastic models bundled in the algorithms, this tool in fact admits that agents’ reactions to uncertain events can change the probability distribution of asset price movements. In other words, agents make crucial decisions, hence “history cannot, by definition, be explicable by experience” (CARVALHO, 2003, p. 192).

Fourthly, HFT has reinforced the short-termism and the speculative character of financial assets markets. The evolution of decision-making instruments has thus carried to their apex the characteristics of liquid financial assets markets stressed by the post-Keynesians. The following statement by Keynes (1936, p. 159) was never truer: “Speculators may do no harm as bubbles on a steady stream of enterprise. But the position is serious when enterprise becomes the bubble on a whirlpool of speculation.” As mentioned in the previous section, the new risks created by such a tool could result in chaotic situations. However, they have not yet been addressed by financial regulators.

REFERENCES

  • ALLEN, H.; HAWKINS, J.; SATO, S. Electronic trading and its implications for financial systems. BIS Papers, n. 7, p. 30-52, 2001.
  • ARNUK, S. J.; SALUZZI, J. C. Broken markets: how high frequency trading and predatory practices on Wall Street are destroying investor confidence and your portfolio. USA: FT Press, 2012.
  • BANK OF ENGLAND. Sterling Money Markets. Liaison Group, 15 Nov. 2007.
  • BANK OF ENGLAND. Financial Stability Report, n. 27, jun. 2010. Available at: <https:// www.bankofengland.co.uk/-/media/boe/files/financial-stability-report/2010/june-2010.pdf>.
    » https:// www.bankofengland.co.uk/-/media/boe/files/financial-stability-report/2010/june-2010.pdf
  • BARON, M.; BROGAARD, J.; KIRILENKO, A. Risk and return in High Frequency Trading. Commodity Futures Trading Commission, Apr. 2014. Available at: <https://www.cftc.gov/sites/default/files/idc/groups/public/@economicanalysis/documents/file/oce_riskandreturn0414.pdf>.
    » https://www.cftc.gov/sites/default/files/idc/groups/public/@economicanalysis/documents/file/oce_riskandreturn0414.pdf
  • BARRON’S. Top 100 Hedge Funds. Barron’s, May 24, 2010.
  • BÉCHU, T.; BERTRAND, E. L’analyse technique: pratique et méthodes. 2nd ed. Paris: Economica, 1995.
  • BIAIS, B.; WOOLLEY, P. High Frequency Trading. London School of Economics, 1 Mar. 2011. Available at: <http://idei.fr/sites/default/files/IDEI/documents/pw/hft_financial_world.pdf>.
    » http://idei.fr/sites/default/files/IDEI/documents/pw/hft_financial_world.pdf
  • BIAIS, B. High frequency trading. Presentation prepared for the European Institute of Financial Regulation. Paris, Sept 2011. Available at: <https://repository.nu.edu.sa/bitstream/123456789/2206/1/High%20frequency%20trading%20Bruno%20Biais%20(Toulouse%20School%20of%20Economics).pdf>.
    » https://repository.nu.edu.sa/bitstream/123456789/2206/1/High%20frequency%20trading%20Bruno%20Biais%20(Toulouse%20School%20of%20Economics).pdf
  • BIS - BANK FOR INTERNATIONAL SETTLEMENTS. Issues of measurement related to market size and macro-prudential risks in derivatives markets. Basle: BIS, 1995. Available at: <https://www.bis.org/publ/ecsc05.pdf>.
    » https://www.bis.org/publ/ecsc05.pdf
  • BOWLEY, G. Lone $4.1 billion sale led to ‘flash crash’ in May. The New York Times, Oct. 1, 2010. Available at: <https://www.nytimes.com/2010/10/02/business/02flash.html>.
    » https://www.nytimes.com/2010/10/02/business/02flash.html
  • CARVALHO, F. J. C. Uncertainty and liquidity preference. In: CARVALHO, F. J. C. Liquidity preference and monetary economies. 1. ed. London: Routledge, 2015a.
  • CARVALHO, F. J. C. Keynes on expectations, uncertainty and defensive behavior. Brazilian Keynesian Review, v. 1, p. 44-54, 2015b.
  • CARVALHO, F. J. C. Expectativas, incerteza e convenções. In: MONTEIRO, D.; PRADO, L. C.; LASTRES, H. (Orgs.). Estratégias de desenvolvimento, política industrial e inovação: Ensaios em Memória de Fabio Erber. Rio de Janeiro: BNDES, 2014, p. 235-258.
  • CARVALHO, F. J. C. Uncertainty and money: Keynes, Tobin and Kahn and the disappearance of the precautionary demand for money from liquidity preference theory. Cambridge Journal of Economics, v. 34, n. 4, p. 709-725, 2010.
  • CARVALHO, F. J. C. Decision-making under uncertainty as drama: Keynesian and Shacklean themes in three of Shakespeare’s tragedies. Journal of Post-Keynesian Economics, v. 25, n. 2, p. 189-218, 2003.
  • CARVALHO, F. J. C. Sobre ordem, incerteza e caos em economia. Revista Brasileira de Economia, v. 48, n. 2, p. 179-88, 1994.
  • CARVALHO, F. J. C. Mr Keynes and the Post Keynesians: principles of macroeconomics for a monetary production economy. Aldershot: Edward Elgar, 1992a.
  • CARVALHO, F. J. C. Elasticidade de expectativas e surpresa potencial: reflexões sobre a natureza e a estabilidade do equilíbrio sob incerteza. Revista Brasileira de Economia, v. 46, n. 1, p. 53-75, 1992b.
  • CARVALHO, F. J. C. Keynes on probability, uncertainty and decision making. Journal of Post-Keynesian Economics, v. XI, n. 1, p. 66-99, 1988.
  • CHABOUD, A. et al. Rise of the machines: algorithmic trading in the foreign exchange market, board of governors of the Federal Reserve System. International Finance Discussion Papers, Board of Governors of the Federal Reserve System, n. 980, Oct. 2009. Available at: <http://www.federalreserve.gov/pubs/ifdp/2009/980/ifdp980.pdf>.
    » http://www.federalreserve.gov/pubs/ifdp/2009/980/ifdp980.pdf
  • CHLISTALLA, M. High-frequency trading: better than its reputation? Deutsche Bank Research, Feb. 7, 2011. Available at: <https://secure.fia.org/ptg-downloads/dbonhft2-11.pdf>.
    » https://secure.fia.org/ptg-downloads/dbonhft2-11.pdf
  • DAVIDSON, P. International money and the real world. New York: St Martin’s Press, 1982.
  • DEQUECH, D. The concept of development conventions: some suggestions for a research agenda. Journal of Economic Issues, v. 51, p. 285-296, 2017.
  • FAMA, E. F. Efficient capital markets: a review of theory and empirical work. The Journal of Finance, v. 25, n. 2, 1970.
  • FARHI, M. Derivativos financeiros: hedge, especulação e arbitragem. Economia e Sociedade, v. 8, n. 2, jan. 1999.
  • FINANCIAL TIMES. Financial Times Lexicon. Entry - Co-Location. Financial Times Lexicon. [On-line]. Available at: <http://lexicon.ft.com/Term?term=co_location>.
    » http://lexicon.ft.com/Term?term=co_location
  • GROSSMAN, S. J.; STIGLITZ, J. E. On the impossibility of informationally efficient markets. American Economic Review, v. 70, n. 1, 1980.
  • HICKS, J. R. On Coddington’s interpretation: A reply. Journal of Economic Literature, v. 17, p. 989-995, 1979.
  • KENDALL, M. The analysis of economic time-series, part I: prices. Journal of the Royal Statistical Society, Series A (General), v. 116, n. 1, 1953.
  • KEYNES, J. M. The general theory of employment, interest, and money. London: Palgrave Macmillan, 1936.
  • KEYNES, J. M. A treatise on probability. The Collected Writings of J. Maynard Keynes. London: Macmillan for the Royal Economic Society, 1973a, v. 8.
  • KEYNES, J. M. The general theory and after. Defence and development. The Collected Writings of J. Maynard Keynes. London: Macmillan for the Royal Economic Society, 1973b, v. 14.
  • LUCAS, R. Expectations and the neutrality of money. Journal of Economic Theory, v. 4, p. 103-124, 1972.
  • MACCRANK, J. Goldman revamps electronic stock trading to catch rival. Reuters News, March 14, 2016. Available at: <http://www.reuters.com/article/us-goldman-stocktrading-idUSKCN0WG10C>.
    » http://www.reuters.com/article/us-goldman-stocktrading-idUSKCN0WG10C
  • MCKINSEY. Two different routes to digital success. McKinsey Working Papers on Corporate & Investment Banking, n. 10, Oct. 2015.
  • MANDELBROT, B. The variation of certain speculative prices. The Journal of Business, v. 36, n. 4, 1963.
  • MILLS, E. S. The use of adaptive expectations: a comment. The Quarterly Journal of Economics, v. 75, n. 2, p. 330-335, 1961. Available at: <https://academic.oup.com/qje/article-abstract/75/2/327/1846609>.
    » https://academic.oup.com/qje/article-abstract/75/2/327/1846609
  • MURPHY, J. J. Technical analysis of the futures markets. New York: New York Institute of Finance, 1986.
  • MUTH, J. F. Rational expectations and the theory of price movements. Econometrica, v. 29, n. 3, p. 315-335, jul. 1961.
  • OLOFFSON, K.; GANDEL, S. High-frequency trading grows, shrouded in secrecy. Time, Aug. 05, 2009. Available at: <http://content.time.com/time/business/article/0,8599,1914724,00.html>.
    » http://content.time.com/time/business/article/0,8599,1914724,00.html
  • SARGENT, T. J. Rational expectations. In: The New Palgrave: A Dictionary of Economics, v. 4, p. 76-79, 1987.
  • SHACKLE, G. L. S. Imagination and the nature of choice. Edinburgh, UK: Edinburgh University Press, 1979.
  • SHILLER, R. J. From efficient markets theory to behavioral finance. Journal of Economic Perspectives, v. 17, n. 1, 2003.
  • TEWELES, R. J.; HARLOW, C. V.; STONE, H. L. The commodity futures game. New York: McGraw-Hill Book Company, 1974.
  • TEWELES, R. J.; JONES, F. J. The futures game. New York: McGraw-Hill Book Company, 1987.
  • TIROLE, J. On the possibility of speculation under rational expectations. Econometrica, v. 50, n. 5, 1982.
  • TVERSKY, A.; KAHNEMAN, D. Judgment under uncertainty: heuristics and biases. Science, New Series, v. 185, n. 4157, p. 1124-1131, Sept. 1974.
  • ZHANG, X. F. The effect of high-frequency trading on stock volatility and price discovery. New Haven: Yale University School of Management, Nov. 2010. Available at: <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1691679>.
    » https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1691679
  • JEL Codes: E12; G11; G15.
  • 1
    As Carvalho (1992a, p. 69) summarizes, “Keynes’s notion of uncertainty is connected to the unknowability of the relevant distribution functions that preside over social processes”. In the next section, this concept will be detailed.
  • 2
    A new version of this paper was published as a chapter in Carvalho (1992a).
  • 3
Financial derivatives represent a market response to the context of high uncertainty and instability of expectations that emerged after the end of the Bretton Woods system in 1973. Besides its deferred settlement, another key specificity of a derivative is pointed out by BIS (1995, p. 6-7): “a contract whose value depends on the price of underlying assets, but which does not require any investment of principal in those assets. As a contract between two counterparts to exchange payments based on underlying prices or yields, any transfer of ownership of the underlying asset and cash flows becomes unnecessary.” The high embedded leverage of financial derivatives stems from this second specificity.
  • 4
    In mathematical or econometric models, the existence of uniform expectations in the operations of derivatives is admitted by postulating that the hedger, although sharing the expectations of all other participants, is willing to perform an operation contrary to these expectations due to pure risk aversion. Although it may explain the existence of some derivatives business, this hypothesis could not be extended to large and liquid derivative markets in which hedging accounts for only a small fraction of trades (FARHI, 1999).
  • 5
The theory of market efficiency was developed in the 1950s based on research carried out by Kendall (1953) and Osborne (1959). Further research by Mandelbrot (1963) and Fama (1970) extended their theoretical statement, claiming that if a market is efficient, no forecast can be established based on a historical series of prices and no model that repeats over time can be identified.
  • 6
    The so-called ‘passive management of resources’ stems from the practical application of the efficient market theory, having been adopted by several categories of agents, including mutual funds. As this class of forecasting and decision-making tool is not used in the derivatives markets, it will not be analysed in the next section.
  • 7
    On the relationship between liquidity preference and uncertainty, see also Carvalho (2015a, 2015b).
  • 8
    Carvalho (2014) also addresses the link between uncertainty and conventions.
  • 9
    Carvalho (1994) details the relationship among order, uncertainty and chaos.
  • 10
In the same period, market professionals searched for other tools to support their operational decisions. One of them was based on determining the profile of the participants and their positions. Through the mimetic effect, agents followed the orders of big investors: those with a reputation as winners and those considered well informed. It was a primitive version of what is now called noise trading, which seeks to determine the positions and price levels on which large orders are concentrated. Despite the emergence of much more sophisticated analytical instruments, as detailed below, this mode of operation continues to play an important role in contemporaneous financial assets’ markets, used mostly by small investors and day traders, and very strongly by the bulk of uninformed investors during sharp price reversals.
  • 11
The term ‘electronic trading’ is used in many ways. In this paper, we adopt the definition of Allen, Hawkins and Sato (2001). It refers mainly to trading in wholesale financial markets (as opposed to e-commerce more generally) and focuses on the central feature of electronic trading systems, which is the automation of trade execution.
  • 12
    Exchanges built or are building huge data centers where traders, members and non-members alike, after paying a fee, can place computers containing their trading algorithms next to an exchange’s engine, which matches ‘buy’ and ‘sell’ orders. This ‘co-location’ shaves crucial milliseconds from the time it takes to complete a trade (Financial Times Lexicon).
  • 13
    Dark pools are trading venues that do not publish prices, volumes and the origin of bid and ask orders.
  • 14
    Advanced economies currencies foreign exchange transactions.
  • 15
    Lewis’ Flash Boys stood four consecutive weeks at Number 1 on the New York Times nonfiction list.
  • 16
    Flash Crash is a very rapid, deep, and volatile movement in security prices occurring within an extremely short time.

Publication Dates

  • Publication in this collection
    07 Aug 2020
  • Date of issue
    2020

History

  • Received
    12 Feb 2019
  • Accepted
    29 Oct 2019