Defining away the most prevalent economic problems of modern economies and failing to communicate the limitations and assumptions of its popular models, the economics profession bears some responsibility for the current crisis, aver the authors of an essay included in ‘Lessons from the Financial Crisis’ (www.wiley.com). They fret that the economics profession appears to have been unaware of the long build-up to the current worldwide financial crisis and to have significantly underestimated its dimensions once it started to unfold.
The authors, David Colander et al., are of the view that many of the financial economists who developed the theoretical models upon which the modern financial structure is built were well aware of the strong and highly unrealistic restrictions imposed on their models to assure stability. “Yet, financial economists gave little warning to the public about the fragility of their models, even as they saw individuals and businesses build a financial system based on their work.”
Why so? Because of ‘an ethical breakdown,’ the essay bemoans. Economists, the authors argue, have an ethical responsibility to communicate the limitations of their models and the potential misuses of their research. “Currently, there is no ethical code for professional economic scientists. There should be one.”
Questionable theoretical foundations
A section on how models can be a source of risk cites, as an example, the mathematical portfolio and risk management models that have been the academic backbone of the tremendous increase in trading volume and diversification of instruments in financial markets. For instance, portfolio insurance and dynamic hedging are based on derivative pricing models.
“While useful, these models are far from perfect as a foundation for new financial instruments, because of the limited availability of historical data to base these models on and the analytic computing limitations that led modellers to use simulations with relatively arbitrary assumptions about correlations between risks and default probabilities.”
As a result, the theoretical foundations of all financial products based on these models were open to question, much like erecting a building with cement whose composition you could not be sure of, reads a telling analogy in the essay.
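The cement analogy can be made concrete with a small simulation. The sketch below is my illustration, not a model from the essay: a one-factor default model in the spirit the authors criticise, in which the 99th-percentile portfolio loss swings widely with the assumed correlation between borrowers, even though each loan's default probability is held fixed at 2 per cent.

```python
# Illustrative sketch (not the authors' model): how an arbitrary
# correlation assumption drives the tail of a simulated loss distribution.
import random
import statistics

def simulate_losses(n_loans=100, default_prob=0.02, correlation=0.0,
                    n_trials=5_000, seed=42):
    """One-factor model: a loan defaults when a mix of a shared market
    factor and its own idiosyncratic noise falls below a threshold."""
    rng = random.Random(seed)
    # Threshold chosen so the marginal default probability equals
    # default_prob when the combined factor is standard normal.
    threshold = statistics.NormalDist().inv_cdf(default_prob)
    w = correlation ** 0.5            # loading on the shared factor
    losses = []
    for _ in range(n_trials):
        market = rng.gauss(0, 1)      # shared systematic factor
        defaults = 0
        for _ in range(n_loans):
            idio = rng.gauss(0, 1)
            if w * market + (1 - correlation) ** 0.5 * idio < threshold:
                defaults += 1
        losses.append(defaults)
    return losses

def tail_loss(losses, q=0.99):
    """Loss level exceeded in only (1 - q) of the trials."""
    return sorted(losses)[int(q * len(losses))]

independent = tail_loss(simulate_losses(correlation=0.0))
correlated = tail_loss(simulate_losses(correlation=0.3))
# Same 2% default rate per loan, yet higher correlation bunches defaults
# into bad states and fattens the tail of portfolio losses.
```

The point of the sketch is the essay's point: the tail estimate depends almost entirely on a correlation parameter for which historical data are thin, so a 'relatively arbitrary' choice quietly decides how safe the product looks.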
A research agenda which the authors prescribe for coping with financial fragility calls for a systematic analysis of network vulnerability as done in computer science and operations research. How woefully naïve it was to introduce new derivatives, seeing them through the lens of general equilibrium models.
“Unfortunately, the claimed efficiency gains through derivatives are merely a theoretical implication of a highly stylised model and, therefore, have to count as a hypothesis… The idea that the system was made less risky with the development of more derivatives led to financial actors taking positions with extreme degrees of leverage…”
Improbable risks underestimated
Another essay, in the book edited by Robert W. Kolb, explores the behavioural basis of the financial crisis. “Our existing risk measures account for perhaps 95 per cent of what occurs. The major catastrophic risks lurk in the fat tails of the remaining 5 per cent. We tend to underestimate these improbable risks because of behavioural biases,” notes the author, J. V. Rizzi.
Defining risk as ‘exposure to the consequences of the unknown,’ he classifies risk along two dimensions. “The first concerns high frequency events with relatively clear cause-and-effect relationships like missing your connecting flight. Other risks such as health problems occur infrequently. Consequently, the cause-and-effect relationship is unclear.”
The author underlines 'impact severity' as the second dimension and notes that, no matter how remote, high-impact events cannot be ignored because they can threaten an institution's existence, as the current market crisis has demonstrated.
Cyclical risks are ‘low frequency-high impact events’ with fat-tailed loss distributions, the essay instructs; for instance, investors incurring such risk can expect mainly small, positive events but are subject to a few cases of extreme loss.
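That shape of loss distribution is easy to simulate. The toy strategy below (my example, not Rizzi's) collects a small premium almost every period but suffers a rare crash; its expected value, 0.99 x 1 + 0.01 x (-120) = -0.21 per period, is negative even though roughly 99 per cent of observed periods are winners.

```python
# Hedged illustration of a 'low frequency-high impact' payoff:
# frequent small gains punctuated by a rare, outsized loss.
import random

def premium_strategy(n_periods=10_000, crash_prob=0.01,
                     small_gain=1.0, crash_loss=-120.0, seed=7):
    """Each period: a small gain, except a crash with probability crash_prob."""
    rng = random.Random(seed)
    return [crash_loss if rng.random() < crash_prob else small_gain
            for _ in range(n_periods)]

returns = premium_strategy()
win_rate = sum(r > 0 for r in returns) / len(returns)
# A track record built in a calm stretch can show an almost unbroken
# run of wins, while the risk sits entirely in the rare crash periods.
```

This is exactly why, as the essay argues, a short sample of 'mainly small, positive events' says little about exposure to the extreme-loss tail.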
Why are these risks difficult to understand? First, there are insufficient data to determine meaningful probability distributions, Rizzi reasons. “In this case, the statistics are descriptive, not predictive. Consequently, no amount of mathematics can tease out certainty from uncertainty.”
But more important is the second reason that he lays out: ‘infrequency clouds perception,’ so much so that risk estimates become anchored on recent events. For example, an overemphasis on recent events can produce ‘an under-appreciation of risk during a bull market because instruments are priced without regard to the possibility of a crash.’
Four behavioural biases
Four biases that Rizzi describes are overconfidence, statistical, herding, and groupthink. Overconfidence occurs when we exaggerate our predictive skills and ignore the impact of chance or outside circumstances, he cautions. “Risk managers who took credit for results during the boom failed to consider the impact of randomness and mean reversion, creating an illusion of control. Compounding this is their selective recall of confirming information to overestimate their ability to predict the correct outcome, which inhibits learning.”
Statistical bias, in the author's view, involves confusing beliefs with probability and skill with chance by selecting evidence in accordance with our expectations. Economics is a social science based on human behaviour, he reminds; prices are not determined by random number machines, but come from trades by real people. As a result, statistically based risk management practices are inherently limited. “They are unable to reflect the hidden risk that the market conditions may change, such as formerly diversified positions that begin moving together, triggering unexpected losses.”
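The diversification point can be shown with elementary portfolio arithmetic. In this hedged sketch (illustrative volatilities and correlations, not figures from the book), a 50/50 portfolio of two assets looks safely diversified while their correlation is near zero, but most of that benefit vanishes when correlation jumps in a stressed market.

```python
# Illustrative numbers only: how the risk of a 50/50 two-asset portfolio
# depends on the correlation assumption, and what a crisis shift does to it.
def portfolio_vol(sigma1, sigma2, rho, w=0.5):
    """Volatility of a two-asset portfolio with weights w and 1 - w."""
    var = (w * sigma1) ** 2 + ((1 - w) * sigma2) ** 2 \
          + 2 * w * (1 - w) * rho * sigma1 * sigma2
    return var ** 0.5

calm = portfolio_vol(0.2, 0.2, rho=0.0)    # positions look uncorrelated
crisis = portfolio_vol(0.2, 0.2, rho=0.9)  # correlations jump in stress
# calm: about 0.141, well below either asset's 0.2 -- diversification works.
# crisis: about 0.195 -- the same book is suddenly nearly one big position.
```

A risk model calibrated on the calm regime would report the lower figure right up until the regime changed, which is the 'hidden risk' the quoted passage describes.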
One may like to think of ‘herding’ as a characteristic of most animals, but the truth is that humans can often behave like herds, with individuals mimicking the decisions of others. Through herding, individuals avoid falling behind and looking bad if they pursue an alternative action, Rizzi observes. “It is based on the social pressure to conform, and reflects safety by hiding in the crowd. In so doing, you can blame any failing on the collective action and maintain your reputation and job.”
And the fourth bias, groupthink, occurs when individuals identify with the organisation and uncritically accept its actions, thereby suppressing inconsistent information and validating only mutually reinforcing individual biases and unrealistic views. Experts are prone to groupthink, rues the author. “They tend to limit information from all but other expert sources. Thus, they repeat statements until they become accepted dogma, regardless of their validity, because of a lack of critical thinking…”
Imperative read for finance managers, especially the smart-and-frenzied ones.