
03/2010/#28

journal
the journal of financial transformation

Cass-Capco Institute Paper Series on Risk

Recipient of the APEX Awards for Publication Excellence 2002-2009


Minimise risk, optimise success

With a Masters degree from Cass Business School, you will gain the knowledge and skills to stand out in the real world.

MSc in Insurance and Risk Management

Cass is one of the world's leading academic centres in the insurance field. What's more, graduates from the MSc in Insurance and Risk Management gain exemption from approximately 70% of the examinations required to achieve the Advanced Diploma of the Chartered Insurance Institute (ACII).

For applicants to the Insurance and Risk Management MSc who already hold a CII Advanced Diploma, there is a fast-track January start, giving exemption from the first term of the degree.

To find out more about our regular information sessions (the next is 10 April 2008), visit www.cass.city.ac.uk/masters and click on 'sessions at Cass' or 'International & UK'.

Alternatively call admissions on: +44 (0)20 7040 8611
Editor
Shahin Shojai, Global Head of Strategic Research, Capco

Advisory Editors
Cornel Bender, Partner, Capco
Christopher Hamilton, Partner, Capco
Nick Jackson, Partner, Capco

Editorial Board
Franklin Allen, Nippon Life Professor of Finance, The Wharton School,
University of Pennsylvania
Joe Anastasio, Partner, Capco
Philippe d’Arvisenet, Group Chief Economist, BNP Paribas
Rudi Bogni, former Chief Executive Officer, UBS Private Banking
Bruno Bonati, Strategic Consultant, Bruno Bonati Consulting
David Clark, NED on the boards of financial institutions and a former senior
advisor to the FSA
Géry Daeninck, former CEO, Robeco
Stephen C. Daffron, Global Head, Operations, Institutional Trading & Investment
Banking, Morgan Stanley
Douglas W. Diamond, Merton H. Miller Distinguished Service Professor of Finance,
Graduate School of Business, University of Chicago
Elroy Dimson, BGI Professor of Investment Management, London Business School
Nicholas Economides, Professor of Economics, Leonard N. Stern School of
Business, New York University
José Luis Escrivá, Group Chief Economist, Grupo BBVA
George Feiger, Executive Vice President and Head of Wealth Management,
Zions Bancorporation
Gregorio de Felice, Group Chief Economist, Banca Intesa
Hans Geiger, Professor of Banking, Swiss Banking Institute, University of Zurich
Wilfried Hauck, Chief Executive Officer, Allianz Dresdner Asset Management
International GmbH
Pierre Hillion, de Picciotto Chaired Professor of Alternative Investments and
Shell Professor of Finance, INSEAD
Thomas Kloet, Senior Executive Vice-President & Chief Operating Officer,
Fimat USA, Inc.
Mitchel Lenson, former Group Head of IT and Operations, Deutsche Bank Group
David Lester, Chief Information Officer, The London Stock Exchange
Donald A. Marchand, Professor of Strategy and Information Management,
IMD and Chairman and President of enterpriseIQ ®

Colin Mayer, Peter Moores Dean, Saïd Business School, Oxford University
Robert J. McGrail, Executive Managing Director, Domestic and International Core
Services, and CEO & President, Fixed Income Clearing Corporation
John Owen, CEO, Library House
Steve Perry, Executive Vice President, Visa Europe
Derek Sach, Managing Director, Specialized Lending Services, The Royal Bank
of Scotland
John Taysom, Founder & Joint CEO, The Reuters Greenhouse Fund
Graham Vickery, Head of Information Economy Unit, OECD
Norbert Walter, Group Chief Economist, Deutsche Bank Group
Table of contents

Opinion

8 A partial defense of the giant squid
Sanjiv Jaggia, Satish Thosar

12 Thou shalt buy 'simple' structured products only
Steven Vanduffel

14 Enterprise friction — the mandate for risk management
Sandeep Vishnu

19 Enterprise Risk Management — a clarification of some common concerns
Madhusudan Acharyya

22 Capital at risk — a more consistent and intuitive measure of risk
David J. Cowen, David Abuaf

Articles

27 Economists' hubris — the case of risk management
Shahin Shojai, George Feiger

37 Best practices for investment risk management
Jennifer Bender, Frank Nielsen

45 Lessons from the global financial meltdown of 2008
Hershey H. Friedman, Linda W. Friedman

55 What can leaders learn from cybernetics? Toward a strategic framework for managing complexity and risk in socio-technical systems
Alberto Gandolfi

61 Financial stability, fair value accounting, and procyclicality
Alicia Novoa, Jodi Scarlata, Juan Solé

77 Can ARMs' mortgage servicing portfolios be delta-hedged under gamma constraints?
Carlos E. Ortiz, Charles A. Stone, Anne Zissu

87 The time-varying risk of listed private equity
Christoph Kaserer, Henry Lahr, Valentin Liebhart, Alfred Mettler

95 The developing legal risk management environment
Marijn M.A. van Daelen

103 Interest rate risk hedging demand under a Gaussian framework
Sami Attaoui, Pierre Six

109 Non-parametric liquidity-adjusted VaR model: a stochastic programming approach
Emmanuel Fragnière, Jacek Gondzio, Nils S. Tuchschmid, Qun Zhang

117 Optimization in financial engineering — an essay on 'good' solutions and misplaced exactitude
Manfred Gilli, Enrico Schumann

123 A VaR too far? The pricing of operational risk
Rodney Coleman

131 Risk management after the Great Crash
Hans J. Blommestein
Dear Reader,

As the variety and volume of shocks facing the financial system increase, the demand for those with a thorough
understanding of risk, its measurement, and its management will continue to rise. And this has certainly
been the case over the past couple of years. The financial system came very close to collapsing, and had it
not been for the aggressive and immediate responses of the global monetary authorities, many of today's
surviving institutions might have also disappeared, to say nothing of the moral hazard debate that this
has engendered.

In order to avoid similar events taking place in the future, the world's leading executives and policymakers
have come together to develop means of predicting, and perhaps preventing, similar crises from engulfing the
financial system, though sadly the two groups sit on opposite sides of the debate. While these efforts are to
be commended, it is not clear whether they are targeting the real causes of the current crisis,
or whether such preventive measures would be effective against future crises, which will have their own specificities and
implications.

From our perspective, these efforts will be in vain unless executives, policymakers, and the developers of scientific
models come together for an open debate about the true causes and implications of the current crisis and
the actions that need to be taken to rectify them, since many of the main culprits are still present within
the system. This is the reason that we established the Cass-Capco Institute Paper Series on Risk in collaboration
with Cass Business School: to focus on the real issues that impact our industry, and not just those that get press
attention.

These issues are covered in a number of the articles in this edition of the paper series, which we are confident
will be of interest, and of genuine benefit, to our executive readers.

At Capco we strive to bring together leading academic thinking and business application, since either in
isolation is of limited benefit. We hope that this issue of the paper series has also achieved that
objective.

Good reading and good luck.

Yours,

Rob Heyvaert,
Founder and CEO, Capco
The dream of effective ERM

The past year or so has been very bruising for executives sitting at the helms of the world’s major financial insti-
tutions. Many came very close to failing, and quite a few in fact did. Many of the most respected global financial
brands have now disappeared.
 
There have been numerous reasons put forward for the current crisis, but most have focused on the most obvious
causes and not necessarily on the underlying issues that are most pertinent. The aim of this edition of the Cass-
Capco Institute Paper Series on Risk is to give adequate attention to those issues that have stayed below the radar,
whether neglected by headline-grabbing journalists or simply overshadowed because press attention was focused elsewhere.
 
The papers address the damage that securitization has caused through the disappearance of due diligence among
mortgage originators. They also examine why most, if not all, of the risk measurement and management
models look great on paper but are of very little value to the institutions that need them most.
 
Risk management had in recent years attained the status of a science. Yet if anything, the current crisis proved
that it is anything but: a myriad of models and formulas were found wanting during the recent
crisis. In fact, none of the quantitative models worked. Those institutions that survived did so simply because of
the actions of the monetary authorities of the countries in which they were based.
 
We still have a long way to go before the models developed at academic institutes that view economics and finance
as a science rather than an art can be put into effective practice within financial institutions. The gap between
theory and practice remains as large today as it has ever been.
 
Similarly, in the field of risk management the gap between hoped-for and actual effectiveness has also remained
quite large. Financial institutions will have a very tough time instituting effective risk management policies so long
as they are dealing with spaghetti-like IT systems that simply don’t communicate with one another. This situation
prevents management from getting a holistic picture of the different pockets containing highly correlated risks.
 
This edition of the paper series highlights these issues so that actions can be taken in time for the next inevitable
crisis. It insists that financial executives wake up to the fact that investing in new and more advanced models is of
little benefit if they are not aware of all the risks they face. Those at the helm need to get their IT in order before
any of the models can be applied. We cannot continue putting the cart before the horse.
 
We hope that you enjoy this issue of the Journal and that you continue to support us in our efforts to narrow the
gap between academic thinking and business application by submitting your ideas to us.

On behalf of the board of editors


Opinion

A partial defense of the giant squid

Thou shalt buy ‘simple’ structured products only

Enterprise friction — the mandate for risk management

Enterprise Risk Management — a clarification of some common concerns

Capital at risk — a more consistent and intuitive measure of risk


A partial defense of the giant squid
Sanjiv Jaggia, Professor of Economics and Finance, California Polytechnic State University
Satish Thosar, Professor of Finance, University of Redlands

For those who have been meditating at a Buddhist monastery over the last year, the giant squid in the title refers to Goldman Sachs Inc. Matt Taibbi, writing in Rolling Stone magazine1, characterizes the investment bank as: "a great vampire squid wrapped around the face of humanity, relentlessly jamming its blood funnel into anything that smells like money." The article is (to put it mildly) a colorful polemic hurled at Goldman Sachs, accusing it of essentially creating and profiting from various financial bubbles since the onset of the Great Depression.

Taibbi's rhetoric was not well received in the mainstream business press. Reactions were dismissive (along the lines of: simplistic analysis; he's not a real business reporter!), indignant (basically objecting to the article's over-the-top language), and defensive (all of them do it, why pick on Goldman?), but seemed not to engage with the substance of Taibbi's accusations. In fact, an 'audit' done by the Columbia Journalism Review's Dean Starkman largely validates Taibbi's substantive claims2.

One of these claims relates to the tech sector bubble of the late 1990s. With the benefit of hindsight, it is clear that many of the high-tech IPOs launched in this period were based on dubious valuations. Goldman was certainly active in IPO underwriting and had the highest ranking in terms of underwriter reputation [Carter et al. (1998)]3. The firm also had its share of high-profile misfires (for example: Webvan, eToys) when the dot.com mania peaked and crashed in the spring of 2000. Goldman was also arguably involved in activities such as spinning and laddering4; the latter has the effect of artificially pumping up the stock prices of IPO firms in the aftermarket. But was Goldman a particularly egregious offender in a climate in which underwriting best practices had slipped precipitously?5 And how should this be evaluated?

As it turns out, we were involved in researching high-tech firms that had an IPO in the late 1990s. We found significant positive momentum and sharp reversals within a six-month aftermarket window6. When we were doing the study, underwriter reputation was not a central concern — it was only one of several control variables we employed. However, in the wake of the Taibbi article and the considerable controversy it has generated, we thought it would be interesting to revisit our sample to see if we could uncover any interesting facts related to underwriter identity.

The set up
Our primary sample was drawn from ipo.com, which lists the universe of U.S. IPOs with dates, offer prices, etc., classified in a number of categories. We chose all IPOs from January 1, 1998 through October 30, 1999 in the following sectors: biotechnology, computer hardware, computer software, electronics, Internet services, Internet software, and telecommunications. This resulted in a sample of 301 high-tech IPO firms. We stopped at October 30, 1999 because we wanted to study medium-term aftermarket price behavior beyond the IPO date while excluding the market correction that commenced in 2000 [Jaggia and Thosar (2004)].

In Figure 1, we provide selected descriptive statistics relating to our sample, broken down by three lead underwriter reputation tiers: top, medium, and bottom. The top-tier underwriters are those that received the highest score of 9 in the Carter et al. (1998) ranking system. These are: Goldman Sachs, Credit Suisse First Boston (renamed Credit Suisse), Hambrecht & Quist, and Salomon Smith Barney. The medium-tier underwriters are those with a score between 8.75 and 8.99, while the bottom tier includes all firms with a score below 8.75.

Figure 1 – Selected descriptive statistics for IPO firms classified by underwriter reputation

Variables | Top-tier | Medium-tier | Bottom-tier
Cumulative market-adjusted return (CMAR) at the end of six months | 44.71 (140.91) | 19.95 (101.88) | -16.55 (57.23)
Percentage change from offer to market open price | 65.74 (72.26) | 68.38 (105.15) | 43.66 (73.18)
Percentage of firms with positive net income in pre-IPO year | 21.11 (41.04) | 16.90 (37.61) | 24.64 (43.41)
Revenue in pre-IPO year ($ millions) | 83.71 (285.64) | 50.71 (200.94) | 75.84 (397.87)
Offer size ($ millions) | 134.48 (176.86) | 126.97 (490.20) | 48.60 (47.48)
Percentage of firms with green-shoe (over-allotment) option | 64.44 (48.14) | 54.23 (50.00) | 55.07 (50.11)
Percentage of firms belonging to the Internet services or software categories | 61.11 (49.02) | 60.56 (49.04) | 59.42 (49.46)
Firm age at IPO date (years) | 4.45 (3.40) | 5.61 (5.88) | 5.90 (6.03)
Number | 90 | 142 | 69

Notes:
1. Standard deviations are in parentheses beside the sample means.
2. Top-tier underwriter firms are those assigned the highest point score of 9 in the Carter et al. (1998) system. This category includes Goldman Sachs. Medium-tier firms are those with a score of 8.75–8.99. Bottom-tier are all those below 8.75.
3. A green-shoe provision gives the underwriter the option to purchase additional shares at the offer price to cover over-allotments. Presence of the provision indirectly increases underwriter compensation.

1 Taibbi, M., 2009, "The great American bubble machine," Rolling Stone magazine, issue 1082-83, July 2
2 See: "Don't dismiss Taibbi: what the mainstream press can learn from a Goldman takedown," The Audit, posted on the CJR website on August 08, 2009
3 Based on the ranking system developed in: Carter, R. B., F. H. Dark, and A. K. Singh, 1998, "Underwriter reputation, initial returns and the long-run performance of IPO stocks," Journal of Finance, 53:1, 285-311
4 Spinning involves the underwriter allocating underpriced IPOs to favored executives – the quid pro quo being a promise of future business. Laddering involves allocations conditioned upon buyers agreeing to purchase additional shares of the IPO in the aftermarket. The SEC sanctioned various underwriting firms including Goldman Sachs, which paid a fine of U.S.$40 million without admitting wrongdoing. The firm also reportedly paid U.S.$110 million to settle an investigation by New York state regulators.
5 Taibbi quotes Professor Jay Ritter, a leading IPO researcher at the University of Florida: "In the early eighties, the major underwriters insisted on three years of profitability. Then it was one year, then it was a quarter. By the time of the Internet bubble, they were not even requiring profitability in the foreseeable future."
6 See: Jaggia, S., and S. Thosar, 2004, "The medium-term aftermarket in high-tech IPOs: patterns and implications," Journal of Banking and Finance, 28, 931-950
There do not appear to be any obvious differences between IPO firms represented by top-tier underwriters and the others in terms of objective quality criteria. If anything, metrics such as the level of initial underpricing, the percentage of profitable firms, and firm age seem to favor the bottom-tier group. The average IPO offer size is considerably larger for the top-tier than for the bottom-tier group, which is not surprising. One expects that superior reputation carries with it the ability to tap more extensive investor networks to raise larger chunks of capital at a given time. Also worth noting is that the top-tier group has the highest proportion of contracts with a green-shoe option. A green-shoe provision gives the underwriter the option to purchase additional shares at the offer price to cover over-allotments and thereby indirectly increases underwriter compensation.

A striking and somewhat surprising difference is in the cumulative market-adjusted returns (CMAR) registered by each group. To study this in greater detail, we graph (Figure 2) the CMAR for each group using the post-IPO day 1 open price as the base, through trading day 125 or approximately six months after the IPO date7. There are visual and arguably economically significant differences across groups. The bottom-tier group (green) immediately slips into negative territory and stays there, for a six-month CMAR of -16.5 percent. The medium (red) and top (blue) groups display strong positive momentum and reach CMAR peaks of 40.5 percent (at day 114) and 61.5 percent (at day 112) respectively. The CMAR then tapers off, possibly due to the onset of lock-up expiration pressures, and at the end of six months ends up at 20.0 percent and 44.7 percent for the medium and top groups respectively.

[Figure 2 – Cumulative market-adjusted returns (CMAR) for IPO firms grouped by lead underwriter reputation: line chart of CMAR (percent) against trading days 0-125. Note: Top-tier underwriter firms are those assigned the highest point score of 9 in the Carter et al. (1998) system; in our sample they represent 90 firms. Medium-tier are those with a score of 8.75–8.99, representing 142 firms. Bottom-tier are all those below 8.75, representing 69 firms.]

This can be viewed in a number of ways. If the market is behaving rationally and recognizing 'true value' as time elapses, the 45 percent CMAR displayed by the top group represents serious underestimation of the initial IPO offer prices. It represents in effect a wealth transfer from the founders and seed financiers of the firm to outside investors, and this is over and above the initial underpricing of 66 percent for this group (Figure 1). Under normal circumstances, the underwriters could be justly accused either of incompetence in terms of valuation or of extorting their IPO clients to enrich themselves and their favored customers.

On the other hand, if informed investors recognize that tech sector stock prices are inflated and unsustainable, and are in the market to exploit the 'greater fool,' the CMAR patterns may reflect the ability of certain underwriters — through their analyst coverage, laddering arrangements, etc. — to not only stabilize but pump up prices in the aftermarket until the wealth transfer from uninformed to informed investors is duly complete.

We decided that a closer disaggregated look at the top-tier group might be useful.

The defense
In Figures 3 and 4, we report descriptive statistics and CMAR patterns for the IPOs underwritten by top-tier firms. Hambrecht & Quist and Salomon Smith Barney are combined so as to represent a reasonable sample size; Credit Suisse and Goldman Sachs are reported separately.

A few metrics are worth noting. More firms represented by Goldman (27 percent) were profitable in their pre-IPO year than those represented by Credit Suisse (14 percent) and the rest (20 percent). Goldman firms were also marginally longer in business before the IPO date. On the other hand, Goldman firms were subject to greater initial underpricing on average. They also had a higher average offer size and were more likely to be subject to a green-shoe provision8. But the most suggestive statistic in our view is the six-month CMAR. The Goldman group's CMAR, at 28 percent, is significantly lower than that of the Credit Suisse group, which racked up 80 percent. Thus aftermarket momentum (or manipulation, if one were to take the cynical view) is lowest for firms represented by Goldman.

Figure 3 – Selected descriptive statistics for IPO firms classified by top-tier underwriters

Variables | CS | GS | Rest
Cumulative market-adjusted return (CMAR) at the end of six months | 80.09 (165.39) | 27.51 (131.84) | 30.55 (121.06)
Percentage change from offer to market open price | 74.34 (75.17) | 85.67 (81.23) | 26.63 (28.61)
Percentage of firms with positive net income in pre-IPO year | 14.29 (35.64) | 27.03 (45.02) | 20.00 (40.83)
Revenue in pre-IPO year ($ millions) | 41.34 (147.31) | 37.32 (50.56) | 199.81 (504.84)
Offer size ($ millions) | 101.97 (103.99) | 155.59 (222.89) | 139.63 (165.43)
Percentage of firms with green-shoe (over-allotment) option | 67.86 (47.56) | 86.47 (34.66) | 28.00 (45.83)
Percentage of firms belonging to the Internet services or software categories | 71.43 (46.00) | 59.46 (49.77) | 52.00 (50.99)
Firm age at IPO date (years) | 4.01 (2.13) | 5.03 (3.98) | 4.07 (3.63)
Number | 28 | 37 | 25

Notes:
1. Standard deviations are in parentheses beside the sample means.
2. Top-tier underwriter firms are those assigned the highest point score of 9 in the Carter et al. (1998) system; CS = Credit Suisse, GS = Goldman Sachs, Rest = Hambrecht & Quist and Salomon Smith Barney.
3. A green-shoe provision gives the underwriter the option to purchase additional shares at the offer price to cover over-allotments. Presence of the provision indirectly increases underwriter compensation.

7 Let Pi1 represent the day 1 open price of the ith firm and let Pm1 be the corresponding level of the market (Nasdaq) index. Similarly, Pit and Pmt represent the open price at day t of the ith firm and the market respectively. The CMAR of the firm at time t is calculated as: CMARit = [Pit/Pi1] ÷ [Pmt/Pm1] − 1. The time in question does not refer to calendar time, but to the time from the IPO date.
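For readers who want to reproduce the calculation, a minimal sketch of the CMAR defined in footnote 7 (the prices below are hypothetical; event time is indexed from the first post-IPO trading day):

```python
import numpy as np

def cmar(firm_open, market_open):
    # CMAR_it = (P_it / P_i1) / (P_mt / P_m1) - 1, per footnote 7.
    # firm_open[t] and market_open[t] are open prices/index levels at
    # event time t, with t = 0 the first post-IPO trading day.
    firm_open = np.asarray(firm_open, dtype=float)
    market_open = np.asarray(market_open, dtype=float)
    return (firm_open / firm_open[0]) / (market_open / market_open[0]) - 1.0

# A firm that doubles while the Nasdaq gains 25% has a CMAR of 60%:
print(cmar([20.0, 40.0], [2000.0, 2500.0]))  # [0.  0.6]
```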
[Figure 4 – Cumulative market-adjusted returns (CMAR) for IPO firms grouped by top-tier underwriters: line chart of CMAR (percent) against trading days 0-125. Note: CS = Credit Suisse, representing 28 firms; GS = Goldman Sachs, representing 37 firms; Rest = Hambrecht & Quist and Salomon Smith Barney, together representing 25 firms.]

Figure 5 – Current status of selected IPO firms launched during the dotcom bubble era

IPO firm name/current or former ticker symbol | CMAR % | Lead underwriter | Status
Infospace Inc (INSP) | 199.2 | Hambrecht & Quist | B
Art Technology Group (ARTG) | 255.4 | Hambrecht & Quist | B
F5 Networks Inc (FFIV) | 469.9 | Hambrecht & Quist | A
Inktomi Corp (INKT) | 142.7 | Goldman Sachs | C
Ebay Inc (EBAY) | 623.4 | Goldman Sachs | A
Viant Corp (VIAN) | 183.6 | Goldman Sachs | D
Active Software Inc (ASWX) | 248 | Goldman Sachs | C
Allscripts Inc (MDRX) | 103.9 | Goldman Sachs | A
Tibco Software (TIBX) | 182.8 | Goldman Sachs | B
Inet Technologies (INTI) | 119 | Goldman Sachs | C
Juniper Network Inc (JNPR) | 116.5 | Goldman Sachs | A
NetIQ Corp (NTIQ) | 141.9 | Credit Suisse | C
Appnet Systems Inc (APNT) | 158.4 | Credit Suisse | C
Commerce One Inc (CMRC) | 609.8 | Credit Suisse | C
E.Piphany Inc (EPNY) | 138.3 | Credit Suisse | C
Phone.com Inc (PHCM) | 141.4 | Credit Suisse | C
Software.com Inc (SWCM) | 237.3 | Credit Suisse | C
Tumbleweed Software Corp (TMWD) | 201.7 | Credit Suisse | C
Liberate Technologies (LBRT) | 487.2 | Credit Suisse | C
Vitria Technology (VITR) | 250 | Credit Suisse | C

Notes:
1. The above firms were represented by top-tier lead underwriters and experienced post-IPO six-month cumulative market-adjusted returns (CMAR) greater than 100 percent.
2. Status (August 2009) definitions:
A. Successful ongoing enterprises; significant positive returns realized by early long-term investors.
B. Viable ongoing enterprises.
C. Merged, restructured or otherwise consolidated; significant impairment to early valuations.
D. Defunct.
This is borne out by the CMAR patterns in Figure 4. The red line representing Goldman firms is virtually flat in the immediate aftermarket, when most purported price pumping takes place. The blue (Credit Suisse) and green (Hambrecht & Quist and Salomon Smith Barney) lines suggest higher levels of momentum and reversal within a six-month period — more of a bubble within a bubble pattern, with the benefit of hindsight.

After all is said and done, the tech bubble is only one instance of a series of such events in recorded history. And, while these events result in a lot of wealth destruction, the firms left standing in the end usually signify technological and productivity gains to society, which may in the long run exceed the Schumpeterian costs.

We do not profess to know how to execute such a cost-benefit analysis. Instead, we decided to undertake an outlier analysis within our sample. We carried out a case study-type analysis of the 20 firms that registered a six-month CMAR of more than 100 percent and were represented by top-tier lead underwriters. We were essentially projecting ourselves back in time before the crash and picking a small subset of the likeliest candidates for success. How did they perform over the long term? We traced the fortunes of these 20 firms from their IPO date up until the present (August 2009). We examined available financials, stock price performance, mergers, consolidations, etc. Several firms were targets of class-action lawsuits filed by aggrieved stockholders claiming misstatements in the IPO prospectus and the like. Our findings are summarized in Figure 5, which is essentially a status report on each firm. We assigned each firm to one of the following categories, or letter grades if you will.

A These are all firms that have survived and thrived. In our judgment, they all have successful business models and good prospects going forward. An investor who bought shares soon after the IPO date and held on to them till August 2009 would have realized significant positive returns. Only four of the 20 firms receive an A grade — three of these (Ebay, Juniper Network, Allscripts) were lead underwritten by Goldman Sachs. The fourth (F5 Networks) was underwritten by Hambrecht & Quist.

B The three firms in this category are viewed as viable ongoing enterprises. There is a fair amount of within-group variation. For instance, Infospace (Hambrecht & Quist) has negative income in its latest financial year but still has a market capitalization of U.S.$293 million. Early post-IPO investors who held on to their position would see a negative return. In contrast, Tibco Software (Goldman) is profitable, has a current market capitalization of U.S.$1.61 billion, and a P/E multiple of 27. The only reason Tibco did not get an A grade is that early buy-and-hold investors would register a negative stock return.

C The twelve firms in this group were severely impacted in the tech sector crash of 2000. While a small number survive with their original stock ticker symbol, none of these are profitable or actively traded. Most have merged, restructured, or otherwise consolidated. The common element is that early investors who had not divested before the crash would have suffered significant (if not quite total) losses. Goldman represented three firms in this group.

D The one firm in this category (Viant; Goldman) declared bankruptcy in 2003 and is essentially defunct.

Hambrecht & Quist represented only three firms (1 A; 2 Bs), all of which survive and in aggregate delivered considerable value to early investors. Goldman's record is mixed, with three As and a B balanced out by three Cs and a D. Credit Suisse has the poorest record in our sample (9 Cs); none of the firms they represented were successful in weathering the tech sector shakeout. Thus, even among the small subset of IPO firms represented by top-tier underwriters and greeted with sustained enthusiasm by investors, ex-post analysis reveals considerable variation in the staying power of their business models.

Conclusion
A respected market observer recently commented: "When faced with market euphoria, whatever its source, financial institutions will always be confronted with the same stark choice: lower your standards or lower your market share."9

Goldman was certainly part of the general deterioration of underwriting standards, but our analysis reveals that they did represent some very good firms and, in terms of our CMAR analysis, were a reasonably responsible player in the IPO aftermarket. Perhaps their quality control mechanisms were not quite so compromised. More recently, they seem to have recognized the risks stemming from subprime lending well ahead of their competitors, hedged with some success, and have emerged from the financial crisis more or less intact10. We doubt that Taibbi would set much store by this, but there it is.

8 This may reflect Goldman's greater clout even within the top-tier group.
9 Jonathan A. Knee, senior managing director at Evercore Partners, in the New York Times, DealBook Dialogue, October 6, 2009.
10 Critics may point out that Goldman would likely have gone under (or at least taken large losses) if the U.S. taxpayer had not bailed out AIG and thereby its counterparties.
Thou shalt buy ‘simple’ structured
products only
Steven Vanduffel, Faculty of Economics and Political Science, Vrije Universiteit Brussel (VUB)

Structured products are a special type of investment product; they appear to offer good value in two situations. The first is when you can sell them with an attractive margin, such that the payoff provided at the end of the investment horizon T>0 is hedged and not of your concern. This method of value creation is possible for banks and financial planners but is not really in reach of retail customers. The second possibility consists of purchasing these instruments and making sufficiently high investment returns with them. In this note we claim that one must be extremely careful when one is on the buyer's side; the odds may go against you, transforming a potential cash cow into a loss generator. We will elaborate on this point further.

An obvious question people ask themselves when investing is how to do it in an optimal way. Of course, everyone who prefers more to less, which is akin to having an increasing utility curve [von Neumann and Morgenstern (1947)], will want to invest the available funds in a product that will ultimately provide the highest return. Unfortunately an investor cannot predict with certainty which investment product will outperform all others. However, ex-ante one may be able to depict for each investment vehicle all possible returns together with the odds of achieving them, hence determining the distribution function for the wealth that will be available at time T; this will be called the wealth distribution.

Hence, the quest for the best investment vehicle amounts to determining the most attractive wealth distribution first. There is, however, no such thing as a wealth distribution function that is universally optimal, simply because optimality is intimately linked to the individual preference structure, the utility curve, of the investor at hand. Some people may prefer certainty above all, and in such an instance putting the available money in a state deposit is the best one can do. Others may prefer a higher expected return at the cost of more uncertainty and may find it appropriate to invest in a fund that tracks the FTSE-100. For some people neither of these options is satisfactory and they may wish to seek capital protection while taking advantage of increasing markets as well. For example, they may want to purchase a product that combines the protection of capital with a bonus of 50% in case the FTSE-100 has increased in value during the entire investment horizon under consideration. Others may find this too risky and may prefer a product which protects their initial capital augmented with 50% of the average increase (if any) of the FTSE-100 during the investment horizon.
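To make the two example products concrete, here is a minimal payoff sketch (the capital, participation rate, and observation dates are hypothetical, and whether the average includes the starting level is an assumption). The contrast foreshadows the discussion below: the bonus product looks only at the terminal index level, while the average-based product depends on the whole trajectory.

```python
import numpy as np

def payoff_bonus(S, capital=100.0, bonus=0.5):
    # Capital protection plus a 50% bonus if the index ends above its
    # starting level: only S[0] and S[-1] matter (path-independent).
    return capital * (1.0 + bonus) if S[-1] > S[0] else capital

def payoff_average(S, capital=100.0, participation=0.5):
    # Capital plus 50% of the average increase (if any) of the index:
    # the running average depends on the whole path (path-dependent).
    S = np.asarray(S, dtype=float)
    avg_increase = np.mean(S / S[0] - 1.0)
    return capital * (1.0 + participation * max(avg_increase, 0.0))

# Two hypothetical index paths with identical start and end levels:
steady = [100.0, 105.0, 110.0, 115.0, 120.0]
dip    = [100.0,  80.0,  70.0,  90.0, 120.0]

print(payoff_bonus(steady), payoff_bonus(dip))      # 150.0 150.0
print(payoff_average(steady), payoff_average(dip))  # 105.0 100.0
```

The first product pays the same on both paths; the second does not, even though the index starts and ends at the same levels.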
The different examples hint that building the optimal wealth distribution and the corresponding investment strategy often consists of combining several underlying products, such as fixed income instruments, equities, and derivatives. In the context of this note, such a cocktail of products will be further called a structured product. Hence, using structured products one is able to generate any desired wealth distribution, and in this sense there is no doubt that structured products can offer good value.

Is this the end of the story? Not really, because once a convenient wealth distribution has been put forward and a structured product has been designed to achieve it, one may still raise the following question: how can we obtain this wealth distribution at the lowest possible cost? Of course, such a cost-efficient strategy, if it exists, would be preferred by all investors. But does it exist? Intuitively one may think that two products with the same wealth distribution at maturity should bear the same cost. Surprisingly, this common belief is absolutely untrue, and careless design of a structured product can cause a lot of harm.

Cost efficiency in a complete single-asset market
Dybvig (1988) showed that in a so-called complete single-asset market the most efficient way to achieve or build a wealth distribution is by purchasing 'simple' structured products. Here 'completeness' refers to a financial market where all payoffs linked to the single underlying risky asset are hedgeable or replicable [Harrison and Kreps (1979)], and 'simple' refers to the feature that the payoff can be generated using a (risk-free) zero-coupon bond and plain vanilla derivatives (calls and puts).

Indeed, Dybvig showed that optimal payoffs depend only on the value of the underlying asset at maturity, not at intermediate times. In other words, they should be path-independent. He also showed that they should be non-decreasing in the terminal value of the underlying risky asset. For example, if S(t) (0≤t≤T) reflects the price process of the risky asset, then the exotic payoff S²(T)−3 is a path-independent and non-decreasing payoff, whereas the payoff S(T/2)·S(T) is path-dependent.

Moreover, since these path-independent payoffs can be approximated to any degree of precision by a series of zero-coupon bonds, calls, and puts, the logical conclusion of Dybvig's work would indeed be that proper investments consist only of purchasing the appropriate proportions of bonds and plain vanilla derivatives (calls and puts) written on the underlying risky asset. Path-dependent, thus complex, structured products are then to be avoided. We remark that Cox and Leland (1982, 2000) had already shown (using other techniques) that optimal payoffs are necessarily path-independent, but this result is more limited because it holds only for investors that are risk averse, whereas Dybvig's result holds for all investors (assuming they all prefer more to less).
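The approximation argument is not spelled out in the note, but a standard static-replication identity (in the spirit of Breeden-Litzenberger and Carr-Madan, not a result specific to this paper) makes it concrete: for any twice-differentiable payoff f applied to S(T) and any expansion point κ ≥ 0,

$$
f\big(S(T)\big) = f(\kappa) + f'(\kappa)\big(S(T)-\kappa\big)
+ \int_0^{\kappa} f''(K)\,\big(K-S(T)\big)^{+}\,dK
+ \int_{\kappa}^{\infty} f''(K)\,\big(S(T)-K\big)^{+}\,dK .
$$

The first term is a holding of zero-coupon bonds, the second a call minus a put struck at κ, and the integrals are strips of puts and calls weighted by f''(K); truncating the strips at finitely many strikes yields the approximation by bonds, calls, and puts. For the payoff S²(T)−3 above, f''(K) = 2 for all K, so the replicating strip holds options across strikes with constant density 2.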


More general markets
One can argue that the above-mentioned results are limited in the sense that they hold only for a single-asset market, which is quite restrictive, and also that they are contingent on a major assumption, i.e., the claim that markets are complete.

Let us start with the first limitation and analyze to what extent the results can be relaxed to multi-asset markets. Assuming that the subsequent periodical asset returns of the different risky assets are (multivariate) normally and independently distributed (nowadays also referred to as a multidimensional Black-Scholes market), Merton (1971) already proved that risk averse investors exhibiting a particular, so-called CRRA utility function would always allocate their funds to a riskless account and a mutual fund representing the so-called market portfolio, implying that their payoff at maturity can be understood as a particular path-independent derivative written on an underlying market portfolio. Recently, Maj and Vanduffel (2010) have generalized this to include all risk averse decision-makers. They have shown that optimal payoffs are necessarily path-independent in the underlying market portfolio. Hence, in a multidimensional Black-Scholes market, the only valuable structured products are those which can be expressed as a combination of a fixed income instrument and plain vanilla derivatives written on the underlying market portfolio.

Regarding the assumption of normally distributed returns, it is well known that this is a false assumption in general. See, for instance, Madan and Seneta (1990) for evidence, but also Vanduffel (2005), where it is argued that the validity of this assumption also depends on the length of the investment horizon involved. Nevertheless, a widely accepted paradigm for describing asset returns involves the use of Lévy processes. Hence, let us assume that this is actually being used by market participants, and also that they agree to use the so-called Esscher transform (exponential tilting) to price derivatives. Vanduffel et al. (2009a) have shown that optimal structured products are then necessarily path-independent, providing further evidence against the use of complex (path-dependent) structures.
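For reference, a sketch of the Esscher transform mentioned here, following the standard actuarial formulation (the notation is ours, not the note's): for a Lévy process X(t) driving the risky asset, the exponentially tilted measure Q_θ is defined by

$$
\left.\frac{dQ_{\theta}}{dP}\right|_{\mathcal{F}_t}
= \frac{e^{\theta X(t)}}{E_P\!\left[e^{\theta X(t)}\right]},
$$

and the pricing parameter θ is pinned down by requiring the discounted asset price to be a Q_θ-martingale; for S(t) = S(0)e^{X(t)} and risk-free rate r this amounts to solving E_P[e^{(θ+1)X(1)}] / E_P[e^{θX(1)}] = e^r for θ.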
Final remarks
In this note we have given some theoretical evidence that, from a buyer's point of view, the only (financially) valuable structured products are path-independent and thus 'simple.' While complicated structured products may provide some emotional value or happiness to the investor, they do not seem to be the right vehicles for generating financial value. For illustrations of the different theoretical results we refer to Dybvig (1988), Ahcan et al. (2009), Vanduffel et al. (2009a,b), Maj and Vanduffel (2010), and Bernard and Boyle (2010).

Let us also remark that, besides their theoretical inefficiency, path-dependent payoffs also have a practical cost disadvantage. Indeed, as compared to plain vanilla products they suffer from a lack of transparency and liquidity, creating the potential for their promoters to make them more expensive than they fairly should be.

We also remark that at this point we do not claim that in all theoretical frameworks path-dependent structures are value-destroying, and we leave it for future research to analyze to what extent the different findings can be generalized and interpreted further. Our conjecture is that, at least, structured products should be designed in some particular way in order to be potentially optimal.

All in all, from a buyer's perspective the only good structured products to invest in seem to be the simple ones, and one can argue that even these may be unnecessary. Indeed, using basic products such as equities, property, and bonds one may be able to design a well-balanced wealth distribution as well, hence avoiding all the other bells and whistles associated with derivatives.

Finally, note that the optimal design of structured products is a research topic that has also been picked up by other authors, including Boyle and Tian (2008) and Bernard and Boyle (2010).

References
• Bernard, C., and P. Boyle, 2010, "Explicit representation of cost efficient strategies," working paper, University of Waterloo
• Boyle, P., and W. Tian, 2008, "The design of equity indexed annuities," Insurance: Mathematics and Economics, 43:3, 303-315
• Cox, J. C., and H. E. Leland, 1982, "On dynamic investment strategies," Proceedings of the seminar on the Analysis of Security Prices, 262, Center for Research in Security Prices, University of Chicago
• Cox, J. C., and H. E. Leland, 2000, "On dynamic investment strategies," Journal of Economic Dynamics and Control, 24, 1859-1880
• Dybvig, P. H., 1988, "Inefficient dynamic portfolio strategies or how to throw away a million dollars in the stock market," The Review of Financial Studies, 1:1, 67-88
• Harrison, J. M., and D. M. Kreps, 1979, "Martingales and arbitrage in multiperiod securities markets," Journal of Economic Theory, 20, 381-408
• Madan, D. B., and E. Seneta, 1990, "The variance gamma (VG) model for share market returns," Journal of Business, 63, 511-524
• Maj, M., and S. Vanduffel, 2010, "Improving the design of financial products," working paper, Vrije Universiteit Brussel
• Merton, R., 1971, "Optimum consumption and portfolio rules in a continuous-time model," Journal of Economic Theory, 3, 373-413
• Vanduffel, S., 2005, "Comonotonicity: from risk measurement to risk management," PhD thesis, University of Amsterdam
• Vanduffel, S., A. Ahcan, B. Aver, L. Henrard, and M. Maj, 2009b, "An explicit option-based strategy that outperforms dollar cost averaging," working paper
• Vanduffel, S., A. Chernih, M. Maj, and W. Schoutens, 2009a, "A note on the suboptimality of path-dependent pay-offs for Lévy markets," Applied Mathematical Finance, 16:4, 315-330
Enterprise friction —
the mandate for risk management
Sandeep Vishnu, Partner, Capco
In today's battered economy, few are willing to put in place anything that might meddle with earnings potential. Even fewer are willing to spend money on something that may offer only a theoretical return on investment. Behind the polite but forced smiles and handshakes from executives, there is a silent accusation: risk management dampens revenue and puts brakes on innovation. This is a challenge faced by risk managers as they try to put in place structures to guard against losses.

But risk management is not about playing it safe; it is about playing it smart. It is about minimizing, monitoring, and controlling the likelihood and/or fallout of unfavorable events caused by unpredictable financial markets, legal liabilities, project failures, accidents, security snafus — even terrorist attacks and natural disasters. There is always risk in business, and risk management should be designed to help companies navigate the terrain.

Sure, risk management may at times call on companies to pull back on the reins, and it certainly is not free. However, risk management provides a counterpoint to enterprise opportunity — friction, if you will — that not only avoids unnecessary losses, but enhances the ability of organizations to respond effectively to the threats and vulnerabilities to which they are exposed in the course of business.

Friction is a much-maligned term. The connotations are more often negative than positive: a retarding effect, in-fighting, etc. Even in physics, it is characterized as a necessary evil. The fact that friction allows us to walk properly is often overlooked. One can understand the importance of friction simply by trying to walk on ice, where the absence of friction causes one to slip and slide. By contrast, high friction makes walking on wet sand inordinately difficult.

Imagine Michael Jordan without his Nikes. One might argue that a stockinged Jordan might be less encumbered without the extra weight and trappings of footwear. But few could argue that he would have been as effective on the polished court that was his field of operations.

Achieving a proper balance of risk in strategic planning and operations for the enterprise is critical — especially in today's environment, when resilience and agility in the face of uncertainty are as important as the effective use of identified variables.

Now, even more than before, the goal of enterprise risk management is to define and deliver the right level of friction, because getting this calibration exactly right can help improve enterprise agility. Too little friction, and a company could slip into dangerous scenarios; too much friction, and a company could just get stuck. Getting this right can not only drive corrective measures, but can also serve as an effective counterbalance.

To create that necessary, well-balanced friction, companies need to take some fundamental steps. In this report, we will examine the three cornerstones of robust risk management — cornerstones that will help win over the naysayers and, more importantly, ensure that companies have the most efficient risk management programs in place. For this to work, we believe that:

■■ Companies will need to change their view of risk management as a necessary evil, and elevate the role of this critical function within their organizations. Risk should have a strong voice at the management table, and risk mitigation should be given as much importance as risk taking.
■■ Executives will need to design a new blueprint for strategic management that integrates risk management into every critical element of the enterprise and thereby drives a risk-sensitive culture.
■■ Organizations will have to buttress the risk management infrastructures they already have in place, including data, analytics, and reporting.

History lessons: extending the time horizon of risk management
One of the toughest issues associated with implementing an effective risk management program revolves around effectively funding the effort. The challenge often boils down to cost versus benefit. In the months and years that led to the housing market and mortgage meltdown, credit crisis, and subsequent recession, companies paid less attention to risk management mainly because times were so good for so long. Typically, managing risks was not an embedded element in critical business processes; it was a bolt-on activity.

Consider a 2007 global risk information management survey conducted by OpRisk & Compliance magazine1. Although a majority of respondents said they were at least somewhat effective in providing the right information to the right people at the right time to meet the organization's business requirements, the survey indicated that many felt the information they had was not being used effectively to create a risk management culture within their organizations. That same survey also illuminated the ever-present ROI hurdle. Only 9 percent of respondents said their firms were able to trace the ROI of information management initiatives designed to capture and manage vital corporate data.

The biggest problem with risk management is the establishment of its ROI. Risk management is primarily about loss avoidance, and it is difficult to measure what has been avoided and thus has not occurred. Efficiency and effectiveness metrics are often dwarfed by the magnitude of loss avoidance; however, while the former can be measured, the latter can only be estimated.

1 www.opriskandcompliance.com
Executives are often heard saying, "We haven't faced a serious problem; why are we spending this much money?" It is a classic Catch-22.

The ROI problem is compounded by the fact that risk management can sometimes act as a retardant to growth. Anything that adds cost to fund an initiative with fuzzy ROI metrics, or that threatens a company's profit margin, is definitely going to be suspect.

We suggest that firms refrain from looking at risk management costs within quarterly financial reporting time frames. Analysis of risk management's return needs to be considered over much longer time horizons.

Sometimes retardants are necessary for growth. With 20/20 hindsight, many contend that Wall Street should have invested more strategically in risk management practices and infrastructure. If risk profiles had been based on a 30-year historical record rather than the standard 100-year window, a fair number of the subprime and adjustable rate mortgage (ARM) loans processed during the housing bubble and in the years leading up to the recession would have been seen more clearly for what they became — toxic assets that damaged the health of the financial industry and ultimately the entire U.S. economy.

In all likelihood, many on Wall Street probably knew intuitively the risks they were taking; they just chose to make a management decision that the risk was acceptable because the top-line return was so attractive. We would not call it willful negligence, but we do believe it was a measured decision recognizing that the benefit of the upside potentially outweighed the likelihood of the downside. Of course, we doubt anybody suspected that the downside would turn out to be as severe as it has been. This phenomenon is exacerbated when dealing with new products and structures [i.e., collateralized debt obligations (CDOs) and mortgage-backed securities (MBS)], which do not have the same experiential basis as traditional products such as fixed-rate mortgages.

While longer-term views are desirable for assessments, ROI calculations, etc., they are sometimes hard to achieve. Consequently, enterprises should try to embed risk management in all 'risk-taking' activities and ultimately drive a risk-sensitive culture. This can be facilitated by the increased use of risk-sensitive measures such as risk-adjusted return on capital (RAROC), economic value added (EVA), shareholder value added (SVA), etc. For example, within investment banks this would mean that desk heads are not just responsible for P&L, but also for the amount of capital used to generate that P&L. Extending this sentiment further, incentive compensation can also be based on RAROC or similar metrics. These metrics allow for a degree of normalization across the enterprise and allow the board and senior management to determine investment strategy and consistently evaluate returns. If such metrics and management approaches gain broad support, then markets would begin assessing enterprise performance through risk-adjusted measures in addition to traditional top-line and bottom-line metrics. Regulatory standardization and public availability of such information has greater potential to drive a shift toward sustainable growth versus short-term results.
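To make the normalization point concrete, here is a minimal RAROC sketch. RAROC definitions differ across institutions; the formula and figures below are illustrative assumptions rather than a standard prescribed by this article.

```python
def raroc(revenue, costs, expected_loss, economic_capital):
    # One common form: risk-adjusted net income over the economic
    # capital the activity consumes (simplified; real implementations
    # typically also credit a return on the capital itself).
    return (revenue - costs - expected_loss) / economic_capital

# Two desks with identical P&L but different capital consumption:
print(raroc(revenue=50.0, costs=20.0, expected_loss=5.0, economic_capital=100.0))  # 0.25
print(raroc(revenue=50.0, costs=20.0, expected_loss=5.0, economic_capital=250.0))  # 0.10
```

On raw P&L the two desks look identical; on RAROC the first is clearly the better use of the firm's capital, which is exactly the comparison the board-level normalization described above is meant to enable.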

Fast forward to today, and risk management is still receiving short shrift because so many companies are scrambling to make ends meet. Few feel they have the financial resources or time to finance and bolster existing risk management practices. It is once again a Catch-22 situation:
■■ When times are good, people do not want to pay attention to risk management because they are too busy making money.
■■ When times are bad, people do not want to pay too much attention to risk management because they are already incurring losses and do not want to spend more money.

In the end, risk management gets only lip service. That needs to change.

Striking a balance
Enlightened enterprises promote creative tension between strategy and risk management, and put in place a set of checks and balances to guard against the exploitation of short-term opportunities at the expense of long-term viability. Failure to strike this balance can have devastating consequences, as evidenced by Countrywide's demise in 2009. In 2005, it was the largest mortgage originator; however, 19 percent of its loans were option ARMs, and of those, 91 percent had low documentation.

More specifically, strategic considerations and risk assessments need to be made in tandem. There must be a dynamic — even symbiotic — interaction between these two perspectives. They should be seen as two sides of the same coin — like the classic yin-yang balance principle.

[Figure: enterprise strategy and enterprise risk management depicted as two interlocking halves of the same whole.]

To effectively integrate risk considerations into critical strategic decision-making processes, organizations should incorporate the following principles into every aspect of their management philosophy.
Promote a culture of resilience
Executives may well consider revisiting many of the major pillars of their organization and refine critical processes by integrating risk considerations into their enterprise architecture. Resilience and agility should be primary goals of such efforts and should address foundational elements such as data, as well as derived capabilities, including analytics, and feedback loops driven through reporting.

Often organizations conduct risk assessments as a bolt-on activity. But organizations that integrate resilience (and risk management in general) into their culture in a granular manner stand a better chance of not only mitigating risks more effectively, but also more cost efficiently.

The agile software development process adopted by high-tech organizations has demonstrated that integrating quality assurance into the development process results in both higher-quality and less-expensive final products. Checking for mistakes 'after the fact' is almost always more expensive.

Robust enterprise risk management (ERM) needs to leverage formal structures — data, processes, and technology used for creating, storing, sharing, and analyzing information — as well as informal networks represented by the communication and relationships both within and without the risk management organization. Informal networks have repeatedly shown their usefulness in identifying and mitigating fraud, and often provide early warnings of potential tail events.

[Figure: informal networks surrounding and complementing the organization's formal structures]

The interplay between formal structures and informal networks is important because this allows risk managers to compensate for shortcomings in one by using the other. But this requires the right culture to be in place: one that encourages staff to ask tough questions without fear of being seen as inhibitors to growth. Risk identification should not have punitive consequences. A culture of appropriately calibrated enterprise friction should be fostered. Doing this would allow critical elements of the organization to accelerate their pursuit of opportunities while knowing that they have the perspective, and operational ability, to slow down, accelerate, or change course because of an appropriate sensitivity to key risk parameters.

Compensation is, and always has been, a key lever in determining the nature of a corporate culture. Culture depends heavily on incentives, which today are often skewed toward rewarding upside benefits and not necessarily avoiding downside losses. Compensation practices need to become more risk-sensitive, so that they reward long-term value creation and not just short-term gains. Risk mitigation should be as important as risk-taking to drive the appropriate culture.

[Figure: the resilience stack — Governance, Reporting, Analytics, Data, Resilience]

Data as a foundation for risk management
There is a growing consensus among risk managers across industries — from government, to financial services, to manufacturing and health care — that the data upon which key organizational decisions are made represent the foundational layer for enterprise risk management (ERM). Bad data can have an immediate and negative impact at any point of the organization, but the downstream impacts of bad data can snowball out of control.

Some data challenges, such as completeness and timeliness, are harder to overcome than others. However, incorporating a risk management perspective on the design of a robust data model can help reduce inconsistency and inaccuracy, and drive overall efficiency. This can help address the challenges that result from the fact that data often exists in silos, making it difficult to get an accurate view of a related set of information across these silos. Wachovia's write-down of the Golden West financial portfolio, which stemmed largely from overreliance on poor data, offers an example of disproportionate emphasis being placed on valuations rather than on borrower income and assets.

Another challenge relates to inconsistent labels, which make it hard to match customers to key metrics over a common information life cycle. Different technological platforms (which help create the silos in the first place) make aggregation and synthesis challenging. Most organizations lack a clear enterprise-wide owner in charge of addressing such data quality issues, which complicates their identification and remediation.
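As a minimal sketch of the labeling problem (ours; the books, names, and amounts are hypothetical), consider how two silos can hold the same counterparty under different labels, and how a canonicalization step in the data model restores a single view:

```python
# Hypothetical sketch: the same customer labeled differently in two
# silos. Without a shared key or normalization rule, aggregation
# silently splits the exposure in two.

mortgage_book = {"ACME Corp.": 25_000_000}
derivatives_book = {"Acme Corporation": 40_000_000}

def normalize(label):
    # A crude canonicalization rule; real programs rely on reference
    # data, legal-entity identifiers, and data stewardship instead.
    return label.lower().replace("corporation", "corp").rstrip(". ")

exposure = {}
for book in (mortgage_book, derivatives_book):
    for name, amount in book.items():
        key = normalize(name)
        exposure[key] = exposure.get(key, 0) + amount

print(exposure)  # {'acme corp': 65000000} -- one view of total exposure
```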



Analytical risks
Analytical frameworks help translate data into actionable information. However, analytics should not just be simple characterizations of data. They should be timely and insightful so that analysis can enable appropriate actions. In the financial services industry, the credit crisis demonstrated how neglected — or inappropriate — analytical frameworks prevented organizations from identifying knowable risks (i.e., flawed model assumptions) and illustrated why key decision-makers were unable to break through the opacity of others (i.e., lack of transparency into the risk of underlying assets being traded in secondary markets, especially when it related to second-order derivatives).

All too often, analytical frameworks emerge as simplistic characterizations of the 'real world' that may not be able to convey a complete risk profile. This is evidenced by the overreliance on value-at-risk as a key risk metric in the recent financial crisis. The dissolution of Lehman Brothers and the near collapse of AIG offer good examples of the shortcomings of traditional analytics, which were unable to adequately account for dramatic increases in leverage, counterparty risk, and capital impacts as markets and ratings deteriorated.
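The simplification is easy to see in miniature. A historical-simulation VaR, sketched below under simplifying assumptions of ours, is just an empirical quantile of past P&L, so leverage spikes, counterparty failures, or ratings migrations that never occurred in the look-back window are invisible to it by construction:

```python
# Minimal historical-simulation VaR (our sketch): the 99% one-day VaR
# is simply the 1st percentile of observed daily P&L. Anything absent
# from the look-back window cannot appear in the answer.

def historical_var(daily_pnl, confidence=0.99):
    ordered = sorted(daily_pnl)                  # worst outcomes first
    cutoff = int((1 - confidence) * len(ordered))
    return -ordered[cutoff]                      # report loss as positive

calm_window = [0.2, -0.1, 0.3, -0.4, 0.1, -0.2, 0.25, -0.15] * 50
print(historical_var(calm_window))  # small number: the window was calm
```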
Reporting deficiencies
Reporting is a multidimensional concept that does not necessarily capture the dynamic nature of information presentation. Typically, reporting has at least four major stakeholders, two external (regulators and investors) and two internal (senior management, including the board of directors, and line management).

A strong risk-information architecture is crucial to delivering the right information to the right audience in a timely manner. It should present salient information as a snapshot, as well as provide the ability to drill down into the detail. Well-defined business usage will help drive overall requirements, while integrated technology platforms can help deliver the processing efficiency needed to manage the volumes and timeliness of information presentation.

Reporting has often been segmented into regulatory reporting and management reporting, directed toward specific compliance requirements for the former and financial statements for the latter. The financial crisis highlighted the need for organizations in many industries to develop both ad-hoc and dynamic reporting, which not only meet compliance requirements but also, and more importantly, improve the decision-making process. Many organizations are coming to the conclusion that current architectures and infrastructures might not necessarily facilitate easy achievement of these requirements. For example, a March 2007 statement to investors by Bear Stearns represented that only 6 percent of one of its hedge funds was invested in subprime mortgages. However, subsequent examination revealed that it was closer to 60 percent.

Governance imperatives
Governance has many definitions and flavors, which span the strategic as well as the tactical. It is probably simplest to think of governance as the way that an enterprise steers itself. This involves using key conceptual principles to define objectives, as well as monitoring the performance of processes to ensure that objectives are being met.

Reporting, or information presentation, is the mechanism that enables governance. Governance relies on this function to provide timely and insightful information that allows executives to take preventative and corrective action so that they can avoid imbalance and tail events. For example, executives from Bear Stearns and the SEC, which was providing regulatory oversight, failed to recognize that risk managers at Bear Stearns had little experience with mortgage-backed securities, where the greatest risk was concentrated. Reporting mechanisms were oriented toward capturing and characterizing transactions and did not appropriately address competencies and capabilities, thereby creating a knowledge gap.

Defining and facilitating the integrated management of different risk types should become a primary activity for enterprise governance.

Conclusion: where to go from here
Risk management is not new; most companies already have infrastructure in place to help execute their risk management strategy. The good news is that companies do not need to scrap what they have got. Instead, firms need to enhance and buttress current risk management infrastructures to drive the right level of enterprise friction.

The first order of business an organization needs to put in place to begin reversing negative attitudes about risk management is to elevate its role. The CEO needs to be involved, along with boards of directors and senior executives across the lines of business. Senior management needs to define and then promulgate a shared set of risk management values across the company. One positive outcome of the recession is that risk management has greater executive mindshare, and risk managers need to capitalize on that.

Specific goals for creating a risk management culture include:
■■ Institutionalize risk management. Develop and articulate an explicit risk management strategy. Establish roles that reflect the organization's risk management model.
■■ Determine and define who has ownership over risk management issues and actions, and who will take on the roles established in the risk management model.
■■ Nurture a culture of risk awareness and action. Include risk-based metrics in performance scorecards and operate a reward system that balances risk taking with risk mitigation.

■■ Be pragmatic. Focus on business needs, such as compliance and shareholder value. Then attack those needs in bite-size portions to demonstrate success early and often.

In the end, creative tension between strategy and risk management should be seen as a positive development in organizations. This helps to ensure that short-term opportunities are not exploited at the expense of long-term viability. However, these strategic considerations and risk assessments should be made at the same time, ensuring a symbiotic interaction between these two perspectives.

In summary, three components are critical to delivering enterprise friction:
■■ Enterprise risk management should have a strong voice at the management table and should work in tandem with enterprise strategy across all enterprise activities.
■■ Formal risk management structures must be buttressed across data, analytics, reporting, and governance to help the enterprise achieve the appropriate level of resilience.
■■ Informal networks should be encouraged. These networks can evolve to fill the white space left uncovered by formal structures.



Enterprise Risk Management — a clarification of some common concerns
Madhusudan Acharyya, The Business School, Bournemouth University1

1 The author is grateful to John Fraser, Chief Risk Officer of Hydro One, for revision of the manuscript before submission.

There have been several discussions on the topic of Enterprise Risk Management (ERM) in both practitioner and academic communities. However, there still remains considerable confusion and misunderstanding in the ongoing debate on this topic. This may be because we have focused more on the application of the subject than on the causes of the confusion. This article articulates a different understanding of ERM and provides clarifications of several misunderstood issues.

It is understood that risk is universal and that it has a holistic effect on the organization. All organizational functions are exposed to risk to various extents. In addition, there exist differences in risk attitude at both the individual and organizational levels. One of the controversial areas of risk management is the classification of risk. Risk is traditionally classified by professionals depending on the phenomena they observe in their functions at various levels of the management hierarchy (i.e., lower-middle-top). From financial and economic perspectives, risks are often classified as market risk (price, interest rate, and exchange risks), liquidity risk (which involves interaction between the firm and the external market), and credit risk. From the strategic perspective, risk can be classified as systemic risk and reputational risk (which involve policy and decision-making issues at the top management level). From the operational perspective, risk can be classified as fraud risk and model risk (which involve people and processes at the organizational level). From the legal perspective, risk can be classified as litigation risk, sovereignty risk, regulatory/compliance risk, and so on. The list of risk categories is so long that it never stops, creating complexity in understanding them. There exists another understanding that attempts to classify the risks of a firm by their sources, i.e., both internal and external. Organizations take risks knowingly and unknowingly, and risks are also produced during the operations of the business. Some risks cause massive failure and some do not. Moreover, some risks are core to the business and some are ancillary. As such, the list of significant risks differs extensively from one industry to another. For a bank the core risks arise from lending activities (i.e., credit risk); for an insurance company the core risks lie in providing cover for insurable risks at the right price and offloading them through appropriate pooling and diversification. In addition, the professional expertise of the leader also influences the risk management priorities of the organization. To some organizations risk management may be thought of as a matter of common sense, but to others it might be the subject of particular specialization. On one hand, risk management is close to the general management function, with particular attention to risk built on a management process (i.e., identification, measurement, mitigation, and monitoring). It is a matter of understanding the culture, growing awareness, selecting priorities, and communicating effectively. On the other hand, it is a tool for protecting the survival of the organization from unexpected losses while capturing potential opportunities. Alternatively, in the latter view, risk is concerned with extreme events, which are rare but have massive power of destruction. Statisticians call them lower-tail risks, with low probability and high severity, which produce volatility in value creation. In sum, some view risk as a danger, while to others taking risks is an opportunity.

The meaning of ERM
So, what does ERM mean? It actually means different things to different people depending on their professional training, nature of job, type of business, and the objective to be achieved. There exist two types of definitions of ERM. From the strategic perspective, ERM means managing the firm's activities in order to prevent failure in achieving its corporate objectives. From another perspective, ERM is related to the successful operationalization of the corporate strategy of the firm in the dynamic market environment, which requires management of all significant risks in a holistic framework rather than in silos. It is argued that ERM does not deal with the risks that are related to the day-to-day business functions of the firm and are routinely checked upon by lower or line-level management. Herein lies the key difference between business risk management and ERM. It is evident that practitioners, at least in the financial industry, are developing ERM to deal with the unusual risks, which statisticians call outliers or extreme events. Enterprise risk is a collection of extraordinary risks that are full of surprising potentiality and can threaten the survival of the business. Indeed, overlooking the less vulnerable areas, which currently might seem less risky, is dangerous, as they may produce severe unexpected risks over a period of time. Consequently, a comparison and differentiation of the key significant risks with the less significant risks is an inherent issue in ERM. A continuous process of selecting key significant risks and the relevant time frame are essential elements of the identification of enterprise risks. Notwithstanding, whether ERM is a specialized branch of [corporate] finance or an area of general business management is still an issue of debate in both practitioner and academic communities.

The evolution of ERM
How did ERM evolve? In some cases it arose from the internal initiatives of organizations, and in others the motivation was external (i.e., regulations). Indeed, the regulators and rating agencies play an observational (or monitoring) role in the business journey of financial firms. They are there to maintain the interest of the
customers and investors. Theoretically, innovation remains within the expertise of the firm, and most [large] corporations, in essence, want to get ahead of the regulatory curve to maintain their superior role in the competitive market. It would be frightening for the market if the goals of regulators and rating agencies in terms of risk management were not aligned to the business goals of the corporations. If not, at least two things could happen — first, the creation of systemic risk and, second, an increase in the cost of the product that is often charged to the customers. It is interesting to see if the ERM initiatives of regulators and rating agencies influence firms' internal (or actual) risk management functions. Nevertheless, organizations, in particular those that have large-scale global operations, should be given enough flexibility to promote innovations. Otherwise, the current movement of ERM in financial firms will be limited to the boundary of regulators' and rating agencies' requirements. On the other hand, it is true that strict (or new) regulations trigger innovations, mainly in the areas of governance. Interestingly, there appears a positive indication through the introduction of a principle-based regulatory approach.

The uniqueness of ERM
Uniqueness is an important aspect of ERM. Since the risks of each firm (and industry) differ from those of another, it is important to view ERM from a specific firm or industry perspective. Take the example of the insurance business. The key sources of risk of a typical insurance company are its underwriting, investment, and finance/treasury functions. Similar to other industries, insurers also face several types of risks, such as financial, operational, strategic, hazard, reputational, etc. However, the risks are traditionally seen in silos, and the view of an underwriter and an actuary on risk is very different from that of the investment and finance people. The opposite is also true. In the banking world the views of a trader on risk are very different from those of the lenders. The risk profiles of an investment bank and a commercial bank will be different. Consequently, some common questions are emerging in both practitioner and academic communities in recognizing ERM. Does ERM seek a common view of risks among all these professionals? Is it necessary? Is it possible? The short answer is 'indeed not.' So, what does a common language of risk mean? In my view, it means that everybody should have the capability of assessing the downside risk and upside opportunity of their functions and making decisions within their position in terms of the corporate objectives of the firm. This understanding places the achievement of corporate objectives and strategic decision-making at the heart of ERM. Alternatively, employees should have the ability to judge the risk and return of their actions in a proactive fashion, judging the implications of their functions for the entire organization (i.e., another department/division), and communicating their concerns across the firm. In practice, a 'group risk policy,' which is the best example of such a common risk language, provides valuable guidance.

Value of ERM
The value proposition of ERM is another disputed area. What is the objective of ERM? Why should a firm pursue ERM? In reality, the goal of a firm is to create value for its owners. Consequently, maximization of profit is the overriding objective of an enterprise, which in modern terms is called the creation of [long-term] shareholder value. This is fully aligned with the expectations of the shareholders of a firm. Within this justification, senior executives are paid commensurately with their risk-taking and risk-managing capabilities (an issue of agency theory). There remains a lot of speculation as to the benefits of ERM, such as value creation for the firm, securing competitive advantage, reducing the cost of financial distress, lowering taxes, etc. In addition, there remains analysis of the benefits of risk management in ex-ante (pre-loss) and ex-post (post-loss) situations. However, the recent 2007 financial crisis demonstrated the failure of several large organizations that were believed to have ERM in place.

Risk ownership
A further unresolved issue of ERM is the risk ownership structure within the organization. Take the example of a CEO. How does he/she view the risks of the firm? How much of the firm's risk does he/she personally hold? In fact, the CEO is (at the executive level) the ultimate owner of the risk of the firm going bust (i.e., survival). A relevant question for risk ownership is who else, along with the CEO, owns and constantly monitors the total risk of the firm? Although at the upper level it is the board of directors, it might be too late to get them involved to deal with risks that have already caused irreparable damage to the organization. In essence, the CEO is the only person who takes a holistic view of the entire organization. Consequently, it is important to support ERM by creating risk ownership across the various levels of the management hierarchy within the organization.

Risk appetite and tolerance
How much risk should an organization take? This is often referred to as risk appetite or the level of risk tolerance. This needs a bottom-up assessment of risk with the integration of the several key risks that exist within the firm. Moreover, it is directly linked to the corporate objectives of the firm, which, in essence, means where the firm wants to be at a certain point of time in the future. Certainly, it is not limited to the tangible risks of the firm, those associated with capital market variables such as asset and liability risks. In essence, it includes a lot of intangible issues, such as the culture of the firm, the expertise of the people who drive the business process, the market where the firm operates, and the future cash flows that the firm wants to produce. In fact, the appetite for risk is an essential element of formulating a firm's corporate strategy. Indeed, it is dangerous to rely solely on statistical models that generate numbers where the scope for including managerial judgment is limited.


Challenges of ERM
How can the ERM objectives be achieved? Should we prefer the mathematical approach? Indeed, a mathematical treatment (or extrapolation) of risk (i.e., quantitative modeling) is necessary to transform ideas into actions; but the limitation of mathematics, as a language, is that it cannot transform all ideas into numerical equations. This is equally true for the progress of ERM, which is still going through a transition period on its way to maturity. Indeed, risk involves a range of subjective elements (i.e., individual experience, judgment, emotion, trust, confidence, etc.), and ignoring these behavioral attributes will likely lead to failure in the adoption of ERM. It is important to remember that beyond mathematical attempts to theorize risk effects/impacts, risk management is about processes and systems that involve human understanding and actions. Ideally, neither approach is singly sufficient to handle the enterprise risk of the firm. An effective ERM must balance both in a common framework. ERM is truly a multidisciplinary subject.

The role of the CRO
Who shares the responsibility of the CEO in relation to risk? The common practice is to have a 'risk leadership team,' equivalent to a 'group risk committee,' comprising the heads of each major function. Theoretically, this team is supported by both technical and non-technical persons; hence there could be communication problems, since they might speak different languages of risk. Consequently, there should be somebody responsible for coordination (or facilitation) in order to maintain the proper communication of risk issues across the organization in terms of the requirements of the corporate objectives. This person, at least theoretically, should be technically familiar with all the subject areas (which practically appears impossible) but should have the capability to understand the sources of risks and their potential impact, either solely or with the help of relevant experts. Currently such a role is emerging and is often called the Chief Risk Officer (CRO). In the meantime, the presence of CROs has appeared in both the financial (i.e., banking and insurance) and the non-financial sectors (i.e., commodities). Typically, a CRO is responsible for developing ERM with adequate policies and processes for managing risks at all levels of the firm. In addition to an adequate knowledge of the quantitative side of risk management, a CRO should have a fair understanding of the behavioral aspects of risk, as he/she has to deal with human perceptions and systems (i.e., communication and culture). One of the challenging jobs of CROs is to report to the board of directors (most of whose members often do not have a technical knowledge of the risks) through the CFO or the CEO, depending on the specific structure of the firm. Ideally, the board of directors is, by regulation (e.g., the Combined Code in the U.K., Sarbanes-Oxley in the U.S., and similar regulations in some other countries), responsible for all risks of the firm. Another big challenge for a CRO is to promote a risk awareness and ownership culture throughout the firm, including the units at the corporate centre and the subsidiaries/divisions at various geographical locations. The recent review by Sir David Walker in the U.K. has highlighted the significance of the role of the CRO and recommended the establishment of a board-level risk committee for banking and other bank-like financial firms (i.e., life insurers).

Future of ERM
Where is ERM going? Certainly, globalization influences businesses to change their business patterns and operational strategies. As stated earlier, ERM is currently maturing and more significant developments are likely in the future. It is assumed that ERM will move from its narrow focus on core business risks to take on a more general perspective. Risk management will gradually be embedded within firms' strategic issues. Certainly, risk is an integral part of all businesses, and their success depends on the level of each firm's capability for managing risks. However, integrating the two views of managing risks (i.e., fluctuation of performance in the area of corporate governance with the volatility in shareholder value creation) in a common framework is challenging. Importantly, the robustness of an ERM program depends on the commitment of the top management in promoting a strong risk management culture across the organization. Moreover, an innovative team of people within the organization with a structured risk-taking ability and approach is significantly important for the success of ERM.

References
• Calandro, J., W. Fuessler, and R. Sansone, 2008, "Enterprise risk management – an insurance perspective and overview," Journal of Financial Transformation, 22, 117-122
• Dickinson, G., 2001, "Enterprise risk management: its origins and conceptual foundation," The Geneva Papers on Risk and Insurance: Issues and Practice, 26, 360-366
• Fraser, J. R. S., and B. J. Simkins, 2007, "Ten common misconceptions about enterprise risk management," Journal of Applied Corporate Finance, 19, 75-81
• Mehr, R. I., and B. A. Hedges, 1963, Risk management in the business enterprise, Richard D. Irwin, Inc., Homewood, IL
• Nocco, B. W., and R. M. Stulz, 2006, "Enterprise risk management: theory and practice," Journal of Applied Corporate Finance, 18, 8-20
• Walker, D., 2009, "A review of corporate governance in U.K. banks and other financial industry entities – final recommendations," HM Treasury
Capital at risk — a more consistent and intuitive measure of risk
David J. Cowen, Quasar Capital, and President and CEO, Museum of American Finance
David Abuaf

This paper will explain a risk methodology for traders and hedge funds that trade in the most liquid of markets, like G10 futures and foreign exchange. The methodology is called 'capital at risk' (CaR) and is a replacement for 'value at risk' (VaR). CaR obviates the need to worry about fat tails, or the outliers called Black Swans, as it virtually eliminates downside surprises1. It is a conservative measure of risk and focuses on assessing the maximum downside to the portfolio. The traditional profile of a risk manager who should use CaR is a short-term trader investing in the most liquid of markets — where slippage is almost entirely avoidable; however, CaR is by no means exclusive to short-term traders. In the volatility of the 3rd and 4th quarters of 2008 this tool would have been very useful to those with a medium- to longer-term trading horizon as well.

Problems with traditional risk metrics
In traditional risk management, VaR is used by traders to assess the probability of a deviation in the portfolio's return in excess of a specific value. This measurement, like many others, has flaws. The most obvious is its basis on past performance — wherein historical volatility is taken to be indicative of future volatility. This flaw leads to two discrete problems. The first is that it cannot take into account severe market dislocations that are not reflected in historical data. The second is that from a practitioner's standpoint VaR can be completely different from one trader to the next due to subjective choices, i.e., the time limit utilized or the confidence threshold.

In the more liquid markets which short-term traders frequent, VaR is a risk model that has the potential for catastrophic drawdowns. When using VaR, only historical returns are factored into future volatility expectations, and as a result infrequent occurrences are not reflected (stock market crashes, high commodity demand, terrorist attacks, etc.). Additionally, when one sees a return indicated at a 99% confidence level (roughly a 2.33σ move under normality), we can expect the event to occur 2.5x per year, yet amongst traders it is not uncommon to observe moves of this magnitude or greater more than 5x per year. And with VaR there is the required subjective choice of the timeframe used. For instance, a VaR model that has twenty years of look-back data might seem sufficient; however, it would not include the October 1987 market crash. A five-year model would not have the technology bubble of 2000. Therefore, there are inherent caution flags when using VaR.
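The expected-frequency arithmetic behind the 2.5x figure is worth spelling out (our note, assuming roughly 250 trading days in a year):

$$
250 \times (1 - 0.99) = 2.5 \ \text{expected exceedances per year}
$$

Observing moves of that size five or more times a year is therefore direct evidence against the model's distributional assumptions.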
To see the problems of using VaR, one need only look at the performance of hedge funds and statistical arbitrage traders during the summer of 2007. Many funds lost 20% of their capital in those months alone. Matthew Rothman, head of Quantitative Equity Strategies for the now defunct Lehman Brothers, was quoted in the Wall Street Journal as saying "[Today] is the type of day people will remember in quant-land for a very long time. Events that models predicted would happen once in 10,000 years happened every day for three days"2. That quote is testimony enough to find a different measure of risk.

What is CaR? When should it be used?
CaR is a measure of risk originally designed to value the maximum downside to the portfolio without using any assumptions. There are specific conditions which must exist in order to properly use CaR:

1 Each trade must have a predetermined stop-loss.
   a. Stop-loss levels are continually readjusted for profitable trades to lock in profits. Consequently, CaR is not a static number. Additionally, even if the stop-loss level does not move, because the market is moving, CaR will by definition be a dynamic number.
   b. Even if two trades have a high degree of correlation, they must be treated as separate trades, each to its own stop-loss (illustrated in the sketch after this list). For instance, if one were short the equivalent of Australian dollar against U.S. dollar with 25 basis points of risk to the portfolio and long the equivalent of New Zealand dollar with 25 basis points of risk to the portfolio, CaR will report 50 basis points. VaR risk models would look at this as a cross trade, long New Zealand dollar versus short Australian dollar, note that those two currency pairs are highly correlated, which they are, and then report a significantly lower risk to the portfolio, say something on the order of 10 basis points.
2 Trades must be in liquid futures or spot foreign exchange so that slippage is mitigated.
   a. Emerging market trades should use the CaR methodology with caution, for these currencies have substantial gap risk, negating the usefulness of CaR.
3 If options are used, then the position is calculated at the full premium of the option, no matter what its maturity or delta. It is the full cost of the option.
   a. Therefore CaR can only be used in a long-only option-based strategy. It has limitations if the risk manager is naked shorting options.

These conditions are not just optimal but essential.
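Condition 1b above is where CaR's conservatism comes from, and the contrast with a correlation-based model can be shown in a few lines. This is our sketch: the 25-basis-point figures and the roughly 10-basis-point result are those of the example above, while the 0.92 correlation is an illustrative assumption of ours:

```python
# Our sketch of condition 1b: CaR adds stop-loss exposures outright,
# while a variance-covariance model nets highly correlated opposing
# legs against each other. Figures in basis points of capital.

short_aud_usd = 25.0   # risk to the stop, in bp
long_nzd_usd = 25.0    # risk to the stop, in bp
correlation = 0.92     # AUD/USD vs NZD/USD, illustrative value

car = short_aud_usd + long_nzd_usd   # 50bp: each trade can hit its stop

# Opposite-sign legs: the model's portfolio variance nets almost to zero.
var_like = (short_aud_usd**2 + long_nzd_usd**2
            - 2 * correlation * short_aud_usd * long_nzd_usd) ** 0.5

print(f"CaR: {car:.0f}bp, correlation-based estimate: {var_like:.0f}bp")
# CaR: 50bp, correlation-based estimate: 10bp
```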
How to calculate CaR?
One of the more fortunate effects of transacting in extremely liquid markets and not relying on historical performance or outside assumptions is the ease of calculation of CaR:

1 Revalue all cash and futures positions to their stop-loss levels, i.e., what would have been lost if the risk manager had been stopped out of every position.
2 Add up the total cost of all your options, based on a revaluation to zero.
1 A reference to Nassim Taleb's popular book Black Swan. David Cowen and Nassim worked at the same firm in London, Bankers Trust, 16 years ago for a brief period. Nassim has led the charge against VaR as a risk model.
2 Wall Street Journal, August 2007. http://online.wsj.com/article/SB118679281379194803.html. For a more in-depth look at the limitations of VaR see this recent article: http://www.nytimes.com/2009/01/04/magazine/04riskt.html?_r=2&ref=magazine&pagewanted=all
3 Add 1 and 2.
4 Divide the above amount by the total capital of the portfolio.
5 The end result is the maximum amount of loss to the portfolio, CaR (see the code sketch below).

CaR's ease of calculation means it can be applied to all trades and easily aggregated to specific market or portfolio levels.

CaR is not the most difficult of measures to use or calculate. The benefit of CaR is that a trader can rest assured that they know their maximum portfolio loss in the event of a catastrophe. It is always a worst-case scenario. In that manner there are no portfolio surprises, as one is always cognizant of the full risk to the portfolio.

Benefits of CaR
■■ Ease of use.
■■ Ease of calculation.
■■ Easily understandable.
■■ Not easily manipulated.
■■ Not subject to historical data or assumptions.
■■ A more intuitive manner in which to assess the risks of a trade.
■■ Eliminates downside surprise.

Example usage of CaR
The following illustrates a portfolio of $10,000,000 invested in a few securities.

AUM: $10,000,000

        Contract             Quantity  Current price  Stop  CaR
Day 1   Feb COMEX gold call  20        13.1           —     26.2bp
        DEC S&P Mini s.1     20        840            823   17bp
        DEC S&P Mini s.2     10        840            830   5bp
        Total portfolio CaR                                 48.2bp

Day 2   Feb COMEX gold call  20        14             —     28bp
        DEC S&P Mini s.1     20        866            850   16bp
        DEC S&P Mini s.2     10        866            850   8bp
        DEC S&P Mini s.3     10        866            845   10.5bp
        Total portfolio CaR                                 62.5bp

The notations s.1, s.2, and s.3 indicate distinct sets of contracts; the gold calls carry no stop, as they are valued to a premium of zero.

Here we see that as of the close of day 1 the firm has 20 outstanding Feb gold calls presently priced at 13.1 and 30 outstanding December S&P mini contracts with different stops. The CaR was calculated assuming the call would fall from present prices to zero and that the mini contracts could fall from 840 to their respective stops. Each gold contract is worth $100 per point. Consequently, we multiplied 13.1 x $100 x 20 to reach $26,200. Then we divided $26,200 by $10,000,000. The mini example is (840-823) = 17. 17 is multiplied by $50 per mini contract and then by 20, or 17 x $50 x 20 = $17,000. $17,000 is then divided by $10,000,000 to arrive at the 17 basis points of risk.

On day 2, we see the portfolio has more risk associated with the gold call — this is because the price of the call has risen, so the possible value lost has increased. In the first two mini contracts, the stops were rolled upwards, locking in profit but still exposing the portfolio to a loss from day 2's NAV. We also see the addition of a third S&P mini contract, further adding to the portfolio's CaR.

Real world examples of not knowing your risk
In 2008, we witnessed catastrophic losses at Bear Stearns, Lehman Brothers, AIG, and a score of other high-profile firms. The cost to the economy and taxpayers has been enormous. To use AIG as but one example, the U.S. government has pumped U.S.$152 billion into AIG in the form of direct loans, investment in preferred stock, and acquisition of troubled assets. AIG's exposure was through its Financial Products Division, which became a major player in the derivatives market, both in credit default swaps and collateralized debt obligations. In the case of the credit default swap, in exchange for fees AIGFP would guarantee, or insure, a firm's corporate debt in case of default. It was a growing market and the firm was booking hefty profits. With respect to CDOs, the firm had a portfolio of structured debt securities that held either secured or unsecured bonds or loans. By the end of 2005 AIG had almost U.S.$80 billion of CDOs.

The firm was comfortable with its derivatives portfolio. In a public forum in August 2007 AIG Financial Products President Joseph Cassano boasted on a conference call to investors that the derivatives portfolio was secure: "It is hard for us, without being flippant, to even see a scenario within any kind of realm of reason that would see us losing $1 in any of those transactions"3. What prompted this confidence? According to interviews with top officials at AIG, they were relying on a computer model to assess their credit default swap portfolio. According to their proprietary model, there was only a 0.15% chance of paying out, or a 99.85% chance of reaping reward. AIG believed that there was only a 1.5 in 1,000 chance of disaster. Once again a model which predicted only a slim chance of an event occurring, the so-called fat tail, sank a once-prestigious firm.

While AIG had credit risk, they never faced the reality of the magnitude of their risk. Had CaR been used, it would not have presented a probability, but rather a finite amount of risk. In particular, what hurt AIG was that it had to post collateral against these swaps, and when their AAA rating became imperiled the calls for margin occurred. Perhaps if AIG had thought in terms of full-collateral capital at risk they would have thought differently about their risk. The reality of life is that risk in the financial markets is never 0.15%, no matter what the model states.

3 Washington Post, December 30, 2008. http://www.washingtonpost.com/wp-dyn/content/article/2008/12/30/AR2008123003431_5.html?sid=ST2008123003491&s_pos
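The five-step calculation and the Day 1 table above lend themselves to a very short program. The following is our sketch, not the authors' code; the position figures and contract multipliers are those of the worked example:

```python
# Minimal sketch of the CaR calculation (Day 1 portfolio). Futures are
# revalued to their stop-loss levels; options are charged at full
# premium. Figures are taken from the Day 1 table.

AUM = 10_000_000

def futures_loss(quantity, price, stop, point_value):
    # Loss if stopped out: points to the stop, times dollars per point,
    # times the number of contracts.
    return abs(price - stop) * point_value * quantity

def option_loss(quantity, premium, point_value):
    # Options are always carried at full premium, i.e., revalued to zero.
    return premium * point_value * quantity

losses = [
    option_loss(20, 13.1, 100),        # Feb COMEX gold calls: $26,200
    futures_loss(20, 840, 823, 50),    # DEC S&P Mini s.1: $17,000
    futures_loss(10, 840, 830, 50),    # DEC S&P Mini s.2: $5,000
]

car_bp = sum(losses) / AUM * 10_000    # express as basis points of capital
print(f"Portfolio CaR: {car_bp:.1f}bp")  # 48.2bp, matching the table
```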

And we know that VaR in practice is simply unable to handle the stress. Bear Stearns reported on May 31, 2007 an interest rate risk to its portfolio of U.S.$30.5 million using a VaR confidence level of 95%. Moreover, it claimed that diversification benefits from elsewhere reduced the entire firm-wide exposure to only U.S.$28.7 million. When Bear Stearns failed, we know that it had levered its money so highly, over thirty times, that a run developed that could not be met. How does a firm state a risk of less than U.S.$30 million when its true risk is significantly higher? We know that VaR is part of the problem and not part of the answer.

Conclusion
CaR has been used by David Cowen to measure risk in his hedge funds. Over his two-decade trading career he was never satisfied with VaR. David set out to find a simple method to value the maximum downside to the portfolio. The result is CaR. This is an easy-to-calculate measure which appropriately factors in all risks to the portfolio without relying on historical and often erroneous data. While David originated this idea, he does not claim sole authorship of the concept. Others could easily have developed this method simultaneously as well.



Articles

Economists’ hubris — the case of risk management

Best practices for investment risk management

Lessons from the global financial meltdown of 2008

What can leaders learn from cybernetics?


Toward a strategic framework for managing complexity and risk in socio-technical systems

Financial stability, fair value accounting, and procyclicality

Can ARMs’ mortgage servicing portfolios be delta-hedged under gamma constraints?

The time-varying risk of listed private equity

The developing legal risk management environment

Interest rate risk hedging demand under a Gaussian framework

Non-parametric liquidity-adjusted VaR model: a stochastic programming approach

Optimization in financial engineering — an essay on ‘good’ solutions and misplaced exactitude

A VaR too far? The pricing of operational risk

Risk management after the Great Crash



Economists' hubris — the case of risk management1

Shahin Shojai
Global Head of Strategic Research, Capco

George Feiger
CEO, Contango Capital Advisors

Abstract
In this, the third paper in the Economists' hubris series, we highlight the shortcomings of academic thought in developing models that can be used by financial institutions to institute effective enterprise-wide risk management systems and policies. We find that pretty much all of the models fail when put under intense scientific examination and that we still have a long way to go before we can develop models that are indeed effective. However, we find that irrespective of the models used, the simple fact that the current IT and operational infrastructures of banking institutions do not allow management to obtain a holistic view of risk, and of the silos it sits within, means that instituting an effective enterprise-wide risk management system is, as of today, nothing more than a pipe dream. The main worry is that it is not only academics who fail to realize this fact; practitioners also believe that these models work, even without a holistic view of the risks within their organizations. In fact, we can state that this is the first paper in which we highlight not only the hubris exhibited by economists but also the hubris of practitioners who still believe that they are able to accurately measure and manage the risk of the institutions they manage, monitor, or regulate.

1 The views expressed in this paper reflect only those of the authors and are in no way representative of the views of Capco, Contango Capital Advisors, or any of their partners.

In this, our third article in the economists' hubris series, we look at the shortcomings of academic thinking in financial risk management, a very topical subject. In the previous two articles, we examined whether contributions from the academic community in the fields of mergers and acquisitions [Shojai (2009)] and asset pricing [Shojai and Feiger (2009)] were of much practical use to practitioners, and demonstrated that economists have drifted into realms of sterile, quasi-mathematical and a priori theorizing instead of coming to grips with the empirical realities of their subjects. In this sense, they have stood conventional scientific methodology, which develops theories to explain facts and tests them by their ability to predict, on its head. Not surprisingly, this behavior has carried over into the field of risk management, with an added twist. Rather like the joke about the man who looks for his dropped keys under the street light because that is where the light is rather than where he dropped them, financial economists have focused on things that they can 'quantify' rather than on things that actually matter. The latter include both the structure of the financial system and the behavior of its participants. Consequently, the gap between academic thinking and business application remains as large today as it has ever been.

Irrespective of one's views regarding academic finance, or even the practitioners who are expected to apply the models devised, few can deny that there were serious failures in risk management at major global financial institutions, perpetrated in all probability by the belief that the models developed work and that they can withstand environments such as the recent financial crisis. However, it is not enough to simply make such generalized statements without taking the time to get a better understanding of why such failures took place and whether we will be able to correct them in the future. Our opinion is that the tools that are currently at the disposal of the world's major global financial institutions are not adequate to help them prevent such crises in the future and that the current structure of these institutions makes it literally impossible to avoid the kind of failures that we have witnessed.

Our objective with this article is not to exonerate the risk management divisions of these institutions, nor is it to suggest that the entire enterprises should be forgiven for the incalculable damage that they have caused. Our aim is to demonstrate that even if the risk management divisions of these institutions had acted with the best intentions of their organizations, and even the other stakeholders, in mind, they would have still had a difficult task in effectively managing the risks within their enterprises given the tools that were at their disposal and the structures of the firms they operated in. Furthermore, and in our opinion given the focus of this paper, more importantly, had the risk management divisions of these institutions effectively instituted the tools that were provided to them by academic finance, the situation would be no better than it is today.

The more one delves into the intricacies of academic finance the more one realizes how little understanding there is among a majority of academics about what actually takes place within these institutions and how difficult it really is to implement the theories that are devised in so-called scientific finance institutions at major business schools in the West.

In this paper, we will focus on a number of these issues. We will highlight why it is that the current structures within the major financial institutions make it almost impossible to have a holistic view of the enterprise's risk, despite the many different suggestions as to its viability in the academic literature, which we must add are not that many. We will discuss why it is that the current compensation structures make it very hard for management to control risks at the individual, divisional, and group level. And, finally, we will explain why it would still be impossible to prevent such crises in the future, even if the two former issues were somehow miraculously solved, given the tools that are available to risk managers and their management.

However, before we get too deep into the intricacies of financial institutions and their risk management operations, it is important to cast a critical eye on what has been suggested to be the main cause of the recent crisis and discuss why it is that one of the main causes has been overlooked, namely the disappearance of due diligence by banks.

What are the causes of the recent market crisis?
If one were to choose the one area which has borne the most criticism for the current crisis it would be the CDO market, and those that rate them [Jacobs (2009)]. Of course, the regulators and central bankers have also not got away unscathed, but most studies seem to suggest that had we had a better understanding of CDOs and their risks we might have been able to prevent the current crisis.

The first problem with this point of view is the expectation that complex financial assets, such as CDOs or other securitized assets, can be accurately valued using scientific methods. Even if we were able to calculate the risk of simple, vanilla-structure financial assets to correctly price them, and by that we mean that the pricing models arrive at the same price that the asset is trading at in the markets, we would still have a very difficult time pricing securitized assets with any degree of accuracy. The reason is that the prepayment rights of these instruments, which are related, with some friction, to movements in interest rates, would necessitate an absolutely perfect prediction of interest rate movements into the future, and a similarly accurate assessment of the proportion of borrowers who choose to repay [Boudoukh et al. (1997)]. As one can appreciate, that is quite an unrealistic expectation.

The sad fact is that academic finance has failed in its efforts to even provide valuation models that can price simple assets, such
as equities, with any degree of accuracy. Expecting these models to perform any better for highly complex instruments is nothing more than wishful thinking [Shojai and Feiger (2009)].
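A toy simulation (entirely ours, with invented parameters) shows how sharply a prepayable pool's value depends on the assumed rate path, which is the 'perfect prediction' problem described above in miniature:

```python
# Toy illustration (ours): borrowers prepay when rates fall sufficiently
# below the pool's coupon, so small changes in the assumed rate drift
# move the price materially. Rates follow a simple random walk and
# discounting is kept flat at r0 for simplicity.
import random

def pool_price(rate_drift, coupon=0.06, r0=0.05, years=10, n_paths=2000):
    random.seed(42)
    total = 0.0
    for _ in range(n_paths):
        r, pv, balance = r0, 0.0, 1.0
        for t in range(1, years + 1):
            r = max(0.0, r + rate_drift + random.gauss(0, 0.01))
            cash = balance * coupon
            if r < coupon - 0.015:      # refinancing incentive: prepay all
                cash += balance
                balance = 0.0
            pv += cash / (1 + r0) ** t
            if balance == 0.0:
                break
        pv += balance / (1 + r0) ** years   # principal repaid at maturity
        total += pv
    return total / n_paths

for drift in (-0.005, 0.0, 0.005):
    print(f"assumed rate drift {drift:+.3f}: pool price {pool_price(drift):.3f}")
```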
If we accept that these assets cannot be priced with any degree of accuracy, then we must accept that neither the financial institutions that created or traded these assets nor the rating agencies would have been able to help prevent the crisis that was brought about by the subprime mortgage market, even if they did know what they were doing.

But, in our opinion, focusing on the pricing of these assets, especially since all of the people involved with these instruments are familiar with their intricacies, ensures that we ignore the main reason why this market became so big and finally imploded. In our opinion, even if interest rates had not been kept so low for as long as they were, we would still not have been able to prevent the crisis. And that is because securitization, by its mere existence, creates an environment that would lead to such crises. The fact that it had not happened before is the real surprise.

Now, our job is not to provide a primer on securitization or how the U.S. financial system works for those of our readers who are significantly more qualified to talk on the subject than we are, but it would not hurt to just examine how the mere existence of securitized mortgages can lead to potential crises. If we look at how mortgages were issued in the past we would find that, prior to the development of the so-called individual risk rating models [Hand and Blunt (2009), Hand and Yu (2009)], banks used to use the information that they had about their client of many years to decide on whether, and how large, a mortgage to issue them with. The decision and the amount were, therefore, related to the history of that client's relationship with the bank.

When banks started lending to nonclients more aggressively they had to resort to using the information provided by personal risk ranking organizations, such as Experian. Now, the issue is not whether these rating agencies provide information that is genuinely accurate or of much use, though what we do find is that it is not remotely as scientific in its accuracy as many are led to believe. The fact is that the loan was still issued by the bank or its brokers after doing some sort of due diligence on the client, and, more importantly, the loan remained on the books of the bank. The fact that the loan remained on the bank's own books ensured that they monitored the activities of their brokers more closely, since any defaults would directly cost the bank itself. What securitization does is to allow the bank to package these mortgages and sell them to third-party investors. In other words, the loss would no longer hit the bank but the investors, who in the case of credit-enhanced MBSs would have no idea who the original issuing bank(s) was (were). When this happens, the bank is no longer a mortgage company but simply a financial engineer of mortgages. Its income will come not from the difference between its borrowing rate and the rate at which it lends, less any loss experienced as a result of defaults, but from the fees it can charge from manufacturing mortgage pools that are then sold to investors. As a result it does not spend as much time on due diligence as it used to; hence the accusations that banks were pushing mortgages onto those they knew would not be able to repay them. The fact was that once the mortgages were packaged, non-payment was somebody else's problem. However, since house prices were continually going up and interest rates were going in the opposite direction, few would default, since they could simply extract more value from their homes by remortgaging; yet another source of fees for the banks.

With the banks relinquishing their role as the conductors of due diligence, it was left to the rating agencies to act as a sort of auditor for these issues. But they never had access to the underlying borrowers to have any idea of their true state of health. That information was with the issuing bank, and remember that they had stopped caring about collecting that kind of information when they started selling the mortgages on to other investors [Keys et al. (2010)]. So, the rating agencies had to use aggregate data to rate these instruments, or the credit quality of the insurer who had provided credit enhancement to the security [Fabozzi and Kothari (2007)]. In either case, neither the credit enhancer nor the rating agency had any idea about the underlying quality of the borrowers. Consequently, all that was needed to bring the house of cards down was a correction in house prices, which is exactly what happened.

Consequently, no matter how we change the regulations governing rating agencies, unless they admit that they have no idea how to rate securitized assets, which they really cannot for the aforementioned reasons, such crises are likely to occur again and again. Given that the number of securitized asset issues dwarfs the other, more traditional issues that rating agencies used to live off, it is highly unlikely that they will admit to having no idea about how to rate these instruments, no matter how complex the mathematical models they use are.

The moral of this overview is that when risk is passed on and compensation is based on the number of sales, due diligence goes out of the window. It is this lack of due diligence that brought about the current crisis, and not only in terms of CDOs and MBSs but across the entire financial ecosystem. From traders who can only win to asset managers that share in client winnings but do not have to recoup their losses, or even share in them, when they lose.

Because of the aforementioned issues it is essential that when we do examine institutional risk management policies we recognize the importance that human aspects play in their successful implementation.

Having shared with you our perspective on what actually caused the recent crisis, we will, in the next section, highlight the shortcomings of academic finance in developing models that financial institutions can use to institute effective enterprise-wide risk management systems and policies.

A framework for thinking about financial risks
In order to evaluate the relevance of academic thinking on risk management it is helpful to use a three-level schema:
■■ Risk at the level of the individual financial instrument.
■■ Risk at the level of a financial institution holding diverse instruments.
■■ Risk at the level of the system of financial institutions.

A financial instrument might be a credit card or a residential mortgage or a small business loan. In the U.S., risks in these instruments have been intensively studied and are moderately predictable in large pools. One example that has become common parlance is to equate the default rate on national pools of seasoned credit card balances to the national unemployment rate. A financial institution holding a diverse portfolio of such instruments might be a bank which originates them and retains all or some, or an investing institution like a pension fund or a hedge fund or an insurance company (or, indeed, an individual investor with a personal portfolio).

The system of financial institutions is the total of these individual players, in particular embracing the diverse obligations of each to the others. A bank may have loaned unneeded overnight cash to another bank, or it may have borrowed to fund its portfolio of tradable securities by overnight repurchase agreements; a hedge fund may have borrowed to leverage a pool of securities; an insurance company may have guaranteed another institution's debt, backing that guarantee by pledging part of its own holding of securities; and an individual may have guaranteed a bank loan to his small business by pledging a real estate investment, itself leveraged by a mortgage.

The fallacy in academic approaches to risk management (enthusiastically adopted by the financial institutions themselves) is to assume that the techniques shown to be reasonably useful for the analysis of large samples of individual instruments can be of significant value in assessing the other levels of risk.

Academic approach to risk management within financial institutions
The academic approach to handling the risk of a financial institution holding a diverse pool of instruments is to look at some set of historic correlations among the instruments and model the institution as the portfolio of the instruments that it holds. Technically, it assumes that the outcomes of all the instruments are drawn from a stationary joint probability distribution, from which all sorts of enticing estimates are possible, such as Enterprise Value at Risk and others.

Anyone who has attempted to estimate the variance/covariance matrices of the instruments knows that the distributions are not stationary. Indeed, they vary markedly according to the sample period chosen. Hence the statistically trained have used various weighting schemes to create 'more relevant' data, for example, weighting recent data more heavily than older data using some distributed lag scheme. This is typical of the economists' methods: create some theoretical structure and impose it on the data. A more productive approach would be to inquire why the distributions are not stationary.
The fallacy in academic approaches to risk management (enthu- health of General Motors. Let alone to then create the probability
siastically adopted by the financial institutions themselves) is to distribution of the fate of these causal factors.
assume that the techniques shown to be reasonably useful for the
analysis of large samples of individual instruments can be of signifi- Consider now ‘convergence of trading behavior.’ Correlations of
cant value in assessing the other levels of risk. returns of assets like European and U.S. and emerging stock indices,
various commodities, and the like have been highly variable but
Academic approach to risk management within clearly rising over the past 15 years [Ned Davis research]. A combina-
financial institutions tion of deregulation of global capital flows, development of sophisti-
The academic approach to handling the risk of a financial institu- cated capital market intermediaries operating globally, and data and
tion holding a diverse pool of instruments is to look at some set of trading technology have enabled more and more money to be placed
historic correlations among the instruments and model the institu- on the same bets at the same time. And it is. Once, Harvard and Yale
tion as the portfolio of the instruments that it holds. Technically it were unique among institutions in investing heavily in timberland and
assumes that the outcomes of all the instruments are drawn from private equity and hedge strategies. Now everyone is doing it and the
a stationary joint probability distribution, from which all sorts of returns are falling while the volatility is rising.
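To see what such a weighting scheme looks like in practice, the following minimal sketch (not from the original paper) implements an exponentially weighted covariance estimator in the RiskMetrics style; the 0.94 decay factor is the standard RiskMetrics choice for daily data, and the simulated return series is purely illustrative. Estimating on two different windows of the same series shows how sharply the answer depends on the sample period chosen, which is the nonstationarity described above.

import numpy as np

def ewma_covariance(returns, lam=0.94):
    """Exponentially weighted covariance: recent observations count most.

    returns: (T, N) array of asset returns, oldest first.
    lam: decay factor; an observation of age k gets weight ~ (1 - lam) * lam**k.
    """
    T = returns.shape[0]
    ages = np.arange(T - 1, -1, -1)          # newest row has age 0
    weights = (1 - lam) * lam ** ages
    weights /= weights.sum()                 # renormalize the truncated series
    demeaned = returns - returns.mean(axis=0)
    return (weights[:, None] * demeaned).T @ demeaned

# Same instruments, two different sample periods: the two estimates
# differ markedly.
rng = np.random.default_rng(0)
returns = rng.normal(size=(500, 3)) * [0.010, 0.020, 0.015]
print(ewma_covariance(returns[:250]))
print(ewma_covariance(returns[250:]))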


And finally 'credit-fueled valuation bubbles.' Asset markets do not price efficiently, no matter what your professor said in business school. Reinhart and Rogoff (2009) have demonstrated that for centuries, valuation bubbles funded by excess credit creation have occurred in all economies for which we have decent records. Why do investors not simply recognize these and "trade them away," as the efficient markets hypothesis would imply? It is not so easy, because while the bubbles are 'obvious,' when they will burst is not. Smithers (2009) explains this in eloquent terms. However, if you think about it, had you started shorting the tech bubble aggressively in 1998 you would have been bankrupt well before the end. As Keynes said, the markets can remain irrational longer than you can remain solvent.

Systemic connections and liquidity risk
Once we acknowledge that valuations are not perfect but can undershoot and overshoot, we can explain systemic risk that is, essentially, a broad loss of liquidity in most assets. Why pay more for something today than it is likely to be worth tomorrow?

Consider the example of the CLO market in 2008. Several hundred billion of corporate loans were held in CLOs, on the largely invisible books of offshore hedge funds and underwriting institutions (via off-balance-sheet vehicles), in structures leveraged 20 times and more. The leverage came from major banks and insurance companies which, we should remember, devoted the bulk of their other business activities to making loans to entities like businesses and real estate developers and to each other. They in turn raised their liabilities by issuing bonds and commercial paper.

Some of the loans in the CLOs started falling in price in secondary trading (the market making in which, by the way, was provided by the same banks which were providing the leverage to the CLO holders). This precipitated margin calls on the holders that they could not all meet. With leverage of 20 times, the fund equity could quickly disappear, so the only recourse was to dump loans and deleverage as quickly as possible. So we sat at our Bloomberg screens in amazement as blocks of hundreds of millions or billions of loans were thrown out for whatever bid they could get. Well, would you buy then, even if you thought that almost all of the loans were ultimately money-good, as we indeed did? Of course not, because in that panic it was certain that the prices would fall further, which they did.

Normally the market makers would buy bargains, but they were providing the leverage and they were holding trading inventories that were tumbling in price. So they withdrew the leverage and reduced the inventories and so forced prices down further. This killed the hedge funds but also inflicted losses on other holders, including their own balance sheets, and created credit losses on the loans extended to the leveraged CLO holders. Now the banks were in trouble themselves. So they dried up the interbank lending market, essential for liquidity in the world trading system, and their commercial paper appeared risky and fell in price, damaging the money market fund industry which held a large part of liquid assets in the U.S.

Similar things happened with mortgage-backed CDOs and other instruments. We need not elaborate on the history, but you can see why the variance/covariance matrix did not work out to be relevant, and, consequently, why the core of the academic work on risk management did not turn out to be relevant. And indeed, how insidious the concept of market efficiency has been in blinding market participants to the nature of real risk by implying that it has already been priced into all assets as well as it can be.

The role of collateral and incentives
We have not quite finished with criticism of academic treatment of risk management. We need to turn now to the two academic nostrums for the kind of risk we have described: 'adequate collateral' and 'appropriate incentives.'

Collateral, whether in the hands of a central counterparty or put up over the counter against individual transactions (the equity cushion in all those leveraged CLOs), is claimed to allow the system to absorb unanticipated shocks. That may be, but the question is, how much collateral is enough? Here we come back to the stationary joint probability distribution of asset prices, which defines the likely magnitude of these unanticipated shocks, and we can start over. On reflection, systemic risk renders collateral least helpful when you need it most. Think of something as simple as a mortgage on a house. The collateral is the down payment, which is the lender's cushion against default. If your house goes on the market in 'normal times,' say because of a divorce, all will be fine and the lender is likely to recoup his loan. If a real estate bubble has collapsed and every house on your block is for sale, the cushion is non-existent. This is not an easy problem to solve. Think dynamic collateral requirements driven by rules on market value versus 'true value.'

Let us now turn to incentives, particularly the notion that if incentives are paid in deferred common stock of the originators, there will be a much lower likelihood of people making 'bad trades.' We believe that such attempts will fail the test of practical reality, for at least the three following reasons.


■■ They are incompatible with a free market for talent. Institutions that attempt long deferral of incentive payments into restricted vehicles will experience loss of their highest performers to institutions that do not do this. You can see this happening right now on Wall Street.
■■ The way in which we choose to measure corporate performance encourages the opposite, namely short-term risk taking for near-term results. We measure results quarterly and annually even though the economic cycle takes place over years, not quarters. Who doubts that any financial institution CEO who dropped out of playing in late 2006 would have long lost his job and his key staff, well before the markets collapsed? Is this not the meaning of the infamous remark by Chuck Prince, former CEO of Citi, that "as long as the music is playing we have to dance"?
■■ The proposed holding periods for the restricted incentive payments are always some calendar interval. The economy does not oblige by revealing the quality of systemic risk provoking actions like credit bubbles over short periods.

Having highlighted what we believe to have been the main causes of the recent financial crisis and some of the failures of academic finance in the area of risk management for financial institutions, in the rest of the paper we will highlight the shortcomings of VaR and its offspring, and explain why, irrespective of the effectiveness of the models, enterprise-wide risk management will remain a pipe dream until the internal infrastructures of financial institutions are modified to eliminate many of the problems that exist today.

VaR and its shortcomings
Contrary to what many who study the subject of risk might believe, there are very few studies dedicated to the institution of effective risk management systems and policies within financial institutions. In fact, had it not been for Value-at-Risk (VaR) [Jorion (2006)] and its numerous offspring [Kuester et al. (2006)], we would not have had much to refer to in this article. The reality is that as of today we are still forced to refer back to the contributions of Harry Markowitz in the early 1950s [Markowitz (1952)] towards the diversification benefits of less than perfectly correlated assets within a portfolio when trying to determine both the risk tolerance and appetite of financial firms (as we discussed in the previous section).

VaR at its most basic tries to measure the maximum loss that a firm is willing to accept within a prespecified parameter, which in most cases is a simple confidence interval over a set time horizon. It is used to determine what the firm's appetite is for risk. A simple example that most use is that if we use a 95% confidence interval and the firm's portfolio of investments has a one-day 5% VaR of $100 million, there is a 5% probability that the portfolio will fall in value by more than $100 million over a one-day period. Another way of stating this is that a loss of $100 million or more on this portfolio is expected on 1 day in 20. However, VaR assumes that markets are normal and that there is no trading, i.e., no change in the portfolio composition.

In reality the composition of the firm's portfolio is changing every second, with every trade that the traders make or every new instrument that is engineered. In fact, for complex instruments the exposure could change during the day without any additional trades or changes in portfolio composition.

The assumption that the bank's entire portfolio of assets follows a normal distribution is also not very valid. Different institutions have different portfolio compositions, and based on that the shape of their risk profile could be very different from their peers. Using the incorrect distribution could result in arriving at risk appetite levels that are completely inappropriate for the firm in question. In fact, even if one starts off with the right distribution it does not mean that one will end with the correct distribution, as distributions could change over time, as with the subprime mortgages that became more risky, with different distribution profiles, over time.

The fact that different assets have different distributions means that their risks cannot simply be added to each other to arrive at a group risk figure. Given the enormous difficulties that fund management companies, and their consultants, face when trying to compute the overall risk of a portfolio of shares that follow similar distributions, it should come as no surprise just how difficult it would be if one were to try to calculate the overall risk of a financial institution.

More importantly, the relationships between the different assets within the portfolio could be, and most probably are, miscalculated, even if we assume that historical patterns will hold into the future. How would the model account for liquidity risk when recalibrating the overall VaR [Fragnière et al. (2010)]? What happens to the correlations during a crisis? If we have learned nothing else from the recent crisis, we have learned that correlations among assets that were previously unrelated become significantly stronger during crises. If the correlations converge, we lose some of the benefits of diversification that we were relying on, pushing VaR figures against the wall. And sadly, the fact is that historical patterns rarely hold in the future. They are of little use in correctly determining VaR, especially since no future crisis will exhibit the patterns of behavior that its predecessors did. Hence, it is literally impossible to determine the true number of bad events that fall outside our predetermined confidence interval.

Even if you were able to accurately determine the number of events that will fall beyond your accepted confidence level, VaR will not help you determine their magnitude. The problem is that VaR looks at the number of events that might fall beyond the confidence interval we have established and not at their magnitude. What is the point of knowing that there is a 1 in 20 chance of losing more than $100 million when, because of that single event, the bank could go bankrupt? Recent events have proved that a one-off outlier could bring the entire system down.
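To make the arithmetic of the $100 million example concrete, the sketch below computes a one-day 95% VaR in the two most common ways; the portfolio value and the return history are hypothetical stand-ins, not data from any institution discussed here.

import numpy as np

rng = np.random.default_rng(1)
portfolio_value = 2_000_000_000                  # hypothetical $2bn portfolio
daily_returns = rng.normal(0.0, 0.031, 1_000)    # hypothetical one-day returns

# Parametric VaR: assume normality and scale the estimated volatility by the
# 95th-percentile z-score (1.645).
var_parametric = 1.645 * daily_returns.std(ddof=1) * portfolio_value

# Historical VaR: the empirical 5th percentile of past returns; no normality
# assumed, but still entirely backward looking.
var_historical = -np.percentile(daily_returns, 5) * portfolio_value

# A one-day 95% VaR of roughly $100m means a loss of $100m or more on about
# 1 trading day in 20; as argued above, VaR says nothing about how large
# that loss will be when it arrives.
print(f"parametric VaR: ${var_parametric / 1e6:,.0f}m")
print(f"historical VaR: ${var_historical / 1e6:,.0f}m")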


VaR also fails to account for the interrelationships between financial institutions, which in recent years, thanks to the CDO and MBS markets, have increased dramatically. The fact is that many banks had used MBSs as collateral with the Fed, and when the crisis hit their values plummeted, preventing the rebalancing needed in light of higher home loan borrower default rates. As a result, banks were left with just one alternative when default rates shot up: decreasing credit availability by lending less and calling in outstanding loans [Hunter (2009)]. The implications were that the interbank market dried up and lending to businesses disappeared, which had secondary and tertiary impacts on different assets of other institutions.

The reality of the situation is that VaR, and all other risk management methodologies, rely on data that is backward looking, is incomplete, and fails to account for the important events that financial institutions really need to look out for. These risk management tools behave like the ratings that ratings agencies issue. It really does not matter whether a firm has a BBB rating rather than a AAA rating if neither experiences default. All that happens is that the investors in the BBB issue earn a higher return for the same risk. The differences in ratings matter when the BBB firm defaults while the AAA firm does not, and we have seen that the granularity that investors expect simply does not exist. All borrowers become equally highly risky when the situation turns sour. Citigroup is a perfect example of an institution that went from being a superstar to a destroyed entity.

Most importantly, all risk measurements are far more subjective than many want to accept. They are based on individual decisions about what should be incorporated into the model and what should not.

The problem sadly is that even if the methodologies that were provided were reliable, which they are far from being, the current structures of most financial institutions make instituting effective risk management policies literally impossible. And it is these institutional intricacies that we will turn to in the next section.

Why risk management is a pipe dream in today's financial environment
Even if we choose to ignore the fact that different instruments have different risk profiles and follow disparate distributions, we cannot ignore the fact that they are based within different silos of the institution that is trying to manage its risk.

These divisions have different management styles, compensation structures, and risk appetites. More importantly, in order to determine the firm's overall risk appetite we need to be able to first place each of the risks within their correct buckets, such as credit, market, operational, etc. The reality of the situation is that most institutions have major difficulties in quantifying the risks of their credit and market instruments, let alone the firm's operational or liquidity risks.

Most institutions have only recently started to come to grips with the fact that operational risk is a major risk and that it must be managed. Learning how to quantify it will take years, if not decades. Once that is done, we need to be able to compute the operational risks of each division. Many institutions still segregate their different businesses; hence it is literally impossible to quantify operational risk for the group and determine what diversification benefits could be derived. For example, many institutions combine credit and FX instruments within the same divisions, others keep them separate. Some combine complex instruments with the fixed income division, others with equities. In some firms FX, credit, and equities each have their own quants teams, whose risks no one can understand.

As a result, the firm will have a view on the risk of each silo, but will not be able to aggregate them in any useful way for the group. Similarly, since they are sitting in different parts of the business, it is difficult to accurately determine the correlation benefits that the firm is experiencing in a meaningful way.

The other problem is that there is just too much data to deal with, even if we are able to aggregate them. The risk management teams are facing a hosepipe of data which they have to decipher, and its contents change with every trade and even by a simple change in the hour of the day.

Even if so much data could be effectively analyzed, the next tough task is to present it in a useful way to the management, since it is they who determine the firm's overall risk appetite and not the risk management division. Once the management sets the parameters, it is the risk management team's job to ensure that everyone operates within the set boundaries.

These kinds of complex reports are very hard for the management to understand. And the simpler they are made, the greater the sacrifices that have to be made about the specificities of risk. The result is that either the firm does not allocate adequate capital to the risks it faces or it allocates too much, hence limiting some of the firm's profit potential.

Even if the firm were fully intent on getting all the available information, and it were able to correctly label all of the risks it faces and place them in the correct buckets, IT and human aspirations would still make the task impossible.

Implications of IT and human greed
While the academic literature does account for personal greed, and a number of papers have been written on the subject by Michael Jensen and his many colleagues and students of the subject [Jensen and Meckling (1976)], it is usually overlooked when it comes to subjects that deal with anything other than corporate finance.


For some reason, there is an assumption that just as investors all behave rationally, individuals are also mostly honest, and as a result models should work.

The fact is that any risk model is only as reliable as the data that is inputted into it, and financial executives are inclined to reduce the allocation of risk to their activities insofar as possible so that they can take on greater risks. After all, their compensation is based purely on the returns they generate. They have an option on the bank. They share in the gains and not the losses. Consequently, the higher the risk, the higher the return, with a floor attached.

Given the complexity of many of the instruments that financial institutions are dealing with, it is only natural that they will be collecting underestimated risk profiles from the traders and engineers. Furthermore, because of the silo structures, many firms will find that they have assets that are highly correlated with each other but sitting in different silos. As a result, during the downturn the firm might find itself significantly more exposed than it thought and not as capitalized as necessary.

The desire to underestimate risk is not limited to the traders or engineers sitting on the quants desks; the firm's own management are compensated on their annual performance, if not quarterly. Hence they would also like to be able to take more risks than they should be permitted to. And none of these people are stupid. The traders know that they will be bailed out by the bank if the situation goes sour and at worst lose their jobs, but if the bet pays off they will be able to retire with savings that even their grandchildren cannot spend. And the management knows that if the bank goes close to failure it will be bailed out by the government if it is important enough to the local or global economy. The moral hazard problem, as some had predicted1, has become severely worse, since banks are now even more confident of taking big risks.

Sadly, however, there is nothing that can be done about that. And the different Basel accords will in no way help, since they are based on parameters that have no connection with the real world. They are arbitrary numbers plucked from the air. The BIS is today about as effective in preventing financial risk as the U.N. is in preventing global conflicts.

In addition to the human aspects of risk management, most firms are unable to get a holistic view of their risk due to the fact that their IT systems are simply unable to provide that kind of data. Most financial institutions are unable to determine the actual cost of the instruments they develop, let alone their risk and how it might change over time.

In today's world very few financial institutions have not undergone some sort of a merger or acquisition, with the pace increasing in the past couple of years. The result is a spaghetti-like infrastructure of systems that are simply unable to communicate with one another [Dizdarevic and Shojai (2004)]. Incompatible systems make the development of a viable enterprise-wide risk management close to impossible, since without accurate and timely data the firm will be ill prepared to respond to changes in asset compositions and prices.

And, sadly, that situation will remain for many years to come. Despite the huge annual investments made by major financial institutions, different systems supporting different instruments are simply unable to share information in any meaningful way. This means that no matter how remarkable the models used are, the data that they will be dealing with are incomplete.

Consequently, despite the best of ambitions, the dream of having a reliable and effective enterprise-wide risk management shall remain just that, a dream. And as long as firms remain incapable of determining their own exposures, irrespective of which of the reasons mentioned above is the main culprit, they will continue to face enormous risks at times when markets do correct themselves rapidly; and sadly, the precedent set in the recent crisis does nothing but add fuel to the fire. Financial institutions are now even more confident of taking risks than they were two years ago, which means that the next crisis can only be greater than the one we just lived through.

1 Shojai, S., 2009, “Why the rush to repay TARP?” Riskcenter, May 12

Conclusion
In this, the third article in the Economists' Hubris series, we turn our attention not only to the hubris of economists but also to the hubris of practitioners, be they bankers or regulators. We find that while a number of models have been developed to help financial institutions manage their risk, none is really that reliable when placed under strict scientific examination. Similar to other economic disciplines, risk management is also prone to being effective solely within the confines of the academic institutions in which it is developed.

However, the models are not our only problem. In order to institute effective risk management systems and policies at financial institutions we first need to be able to collect reliable data, which given today's operational and IT infrastructures is literally impossible to do. Risk data are sitting in a number of silos across the globe without any viable way of aggregating them usefully. It is this issue that needs to be dealt with first, before we work on developing new models that can effectively manage them.

The sad fact is that just like the academics who develop these models, the practitioners who use them also assume that they work. Unlike asset pricing, where there is a clear gap between academic thinking and business application, when it comes to enterprise-wide risk management both are equally wrong. Consequently, practitioners, both bankers and their regulators, were living under a false sense of security that was shattered by the recent financial crisis. The current governmental and regulatory responses are focusing on the periphery. The real issue is that banks should be forced to improve their operational and IT infrastructure in order to be able to get a holistic view of the risks that are sitting within the many pockets of their organizations. Our aim with this paper is to highlight the main issues that financial institutions face in instituting effective risk management systems and policies, so that public policy debates are redirected to focus on those issues that truly matter and not those that simply get press attention.

References
• Boudoukh, J., M. Richardson, and R. Stanton, 1997, "Pricing mortgage-backed securities in a multifactor interest rate environment: a multivariate density estimation approach," Review of Financial Studies, 10, 405-446
• Dizdarevic, P., and S. Shojai, 2004, "Integrated data architecture – the end game," Journal of Financial Transformation, 11, 62-65
• Fabozzi, F. J., and V. Kothari, 2007, "Securitization: the tool of financial transformation," Journal of Financial Transformation, 20, 33-45
• Fragnière, E., J. Gondzio, N. S. Tuchschmid, and Q. Zhang, forthcoming 2010, "Non-parametric liquidity-adjusted VaR model: a stochastic programming approach," Journal of Financial Transformation
• Hand, D. J., and G. Blunt, 2009, "Estimating the iceberg: how much fraud is there in the U.K.," Journal of Financial Transformation, 25, 19-29
• Hand, D. J., and K. Yu, 2009, "Justifying adverse actions with new scorecard technologies," Journal of Financial Transformation, 26, 13-17
• Hunter, G. W., 2009, "Anatomy of the 2008 financial crisis: an economic analysis postmortem," Journal of Financial Transformation, 27, 45-48
• Jacobs, B. I., 2009, "Tumbling tower of Babel: subprime securitization and the credit crisis," Financial Analysts Journal, 66:2, 17-31
• Jensen, M. C., and W. H. Meckling, 1976, "Theory of the firm: managerial behavior, agency costs and ownership structure," Journal of Financial Economics, 3:4, 305-360
• Jorion, P., 2006, Value at Risk: the new benchmark for managing financial risk, 3rd ed., McGraw-Hill
• Keys, B. J., T. Mukherjee, A. Seru, and V. Vig, 2010, "Did securitization lead to lax screening? Evidence from subprime loans," Quarterly Journal of Economics, forthcoming
• Kuester, K., S. Mittnik, and M. S. Paolella, 2006, "Value-at-Risk prediction: a comparison of alternative strategies," Journal of Financial Econometrics, 4:1, 53-89
• Markowitz, H., 1952, "Portfolio selection," Journal of Finance, 7:1, 77-91
• Reinhart, C. M., and K. Rogoff, 2009, This time is different: eight centuries of financial folly, Princeton University Press
• Shojai, S., 2009, "Economists' hubris – the case of mergers and acquisitions," Journal of Financial Transformation, 26, 4-12
• Shojai, S., and G. Feiger, 2009, "Economists' hubris: the case of asset pricing," Journal of Financial Transformation, 27, 9-13
• Smithers, A., 2009, Wall Street revalued: imperfect markets and inept central bankers, John Wiley & Sons

Articles

Best practices
for investment risk
management

Jennifer Bender
Vice President, MSCI Barra

Frank Nielsen
Executive Director, MSCI Barra

Abstract
A successful investment process requires a risk management
structure that addresses multiple aspects of risk. In this paper, we
lay out a best practices framework that rests on three pillars: risk
measurement, risk monitoring, and risk-adjusted investment man-
agement. All three are critical. Risk measurement means using the
right tools accurately to quantify risk from various perspectives.
Risk monitoring means tracking the output from the tools and flag-
ging anomalies on a regular and timely basis. Risk-adjusted invest-
ment management (RAIM) uses the information from measurement
and monitoring to align the portfolio with expectations and risk
tolerance.


The last 18 months have brought risk management to the forefront and highlighted the need for guidance on best practices for investors. Many institutional investors were surprised by the violent market moves during the current crisis. Some have argued that current risk management practices failed when they were needed most, and with multi-sigma events extending across formerly uncorrelated asset classes, investors have questioned the very meaning of the term 'well diversified portfolio.' What does sound risk management mean for plans, foundations, endowments, and other institutional investors? How should these institutions think about best practices in risk management? We start with three guiding principles:

1 Risk management is not limited to the risk manager; anyone involved in the investment process, from the CIO to the portfolio managers, should be thinking about risk — risk management should not be limited to an after-the-fact reporting function but must be woven into the investor's decision-making process, whether it is the asset allocation decision or the process for hiring managers. Those responsible for asset allocation and management should be risk managers at heart and consider risk and return tradeoffs before making investment decisions.
2 If you cannot assess the risk of an asset, maybe you should not invest in it — for those institutions invested in alternative asset classes, such as private equity and hedge funds, or who have exposure to complex instruments, such as derivatives and structured products, the risk management requirements have greatly increased. These investors need a framework for managing risk that far exceeds what was needed for the plain vanilla stock and bond investing that prevailed only ten years ago. We argue that one should assess one's risk management capabilities before making the decision to invest in certain asset types.
3 Proactive risk management is better than reactive risk management — being prepared for unlikely events is perhaps the most important lesson learned from the recent crisis. This applies to both market and nonmarket risks such as counterparty, operational, leverage, and liquidity. Addressing this issue transcends the simple use of the output of models and tools. It requires an institutional mindset that analyzes the global economic outlook, understands the aggregate portfolio exposures across asset classes, and is willing to use the model output intelligently to align the portfolio structure with the plan sponsor's assessment of the risks that may impact the portfolio.

In this paper, we propose a risk management framework that:
■■ Is aligned with the investment objectives and investment horizon.
■■ Tackles multiple aspects of risk and is not limited to a single measure like tracking error or Value at Risk (VaR).
■■ Measures, monitors, and manages exposures to economic and fundamental drivers of risk and return across asset classes to avoid overexposures to any one risk factor.
■■ Manages risk for normal times but is cognizant of and aims to be prepared for extreme events.

We developed this framework with institutional investors and their risk management challenges in mind, but this framework can be adapted easily to the requirements of asset managers. In the first section of the paper, we describe a framework that takes into account the three guiding principles. In the second section of the paper, we illustrate this framework in more detail and provide examples for its implementation.

Three pillars for risk management
Risk management has evolved rapidly over the last few decades, marked by key developments like the adoption of the 1988 Basel Accord and significant episodes like the U.S. Savings and Loan crisis, the collapse of LTCM, the 'Dot.com' bust, and the recent financial crisis. However, the degree to which various risk methodologies and practices have been implemented by institutional investors has varied. In particular, there remains a wide range in how market participants (pension plans, endowments, asset managers, hedge funds, investment banks, etc.) have integrated the risk management function. Best and Reeves (2008), for example, highlight the divergence in risk management practices between buy-side and sell-side institutions, the latter being subject to greater regulatory pressure.

Our goal is to establish a framework for sound market risk management for institutional investors. We rely on three pillars: risk measurement, monitoring, and management (or risk-adjusted investment management, RAIM). Risk measurement refers to the tools institutional investors use to measure risk. Risk monitoring focuses on the process of evaluating changes in portfolio risk over time. RAIM refers to how investors may adjust their portfolios in response to expected changes in risk. Robust risk management integrates all three areas.

The risk manager's toolkit may include a variety of measures capturing different views of risk. Figure 1 illustrates one way of categorizing the suite of tools needed.

              Alpha (active risk)                         Beta (total risk)
              Normal               Extreme                Normal                Extreme
              Tracking error       Stress testing         Asset class           Stress testing
                                   active bets            volatility/beta       asset classes
              Contribution to      Active contribution    Contributions to      Total contribution
              tracking error       to tail risk           total risk            to tail risk
              Active exposures,    Maximum active         Sources of return/    Maximum drawdown,
              benchmark misfits    drawdown               exposures             contagion effect

Figure 1 – Structure for risk measurement and monitoring


We distinguish between risk measures for normal and extreme times as well as risk measures that relate to absolute losses or losses relative to a benchmark. On one hand, institutional investors need to manage the total risk of their investments, which means protecting themselves from asset-liability deficits, declines in broad asset classes, and more generally, any losses large enough to make it difficult to meet the investor's obligations. On the other hand, institutions need to manage the risk of managers underperforming their benchmarks, which involves monitoring the tracking error and performance relative to the assigned benchmark.

To assess future risks, it is essential to measure and monitor risk both at the aggregate level and at the factor level. For risk measurement, most institutional investors measure aggregate portfolio risk with volatility or tracking error, which rely on individual volatilities and correlations of asset classes and managers. However, while volatility, tracking error, and correlations capture the overall risk of the portfolio, they do not distinguish between the sources of risk, which may include market risk, sector risk, credit risk, and interest rate risk, to name a few. For instance, energy stocks are likely to be sensitive to oil prices, and BBB corporate bonds are likely to be sensitive to credit spreads. Sources of risk, or factors, reflect the systematic risks investors are actually rewarded for bearing and are often obscured at the asset class level [Kneafsey (2009)]. Institutional investors can decompose portfolio risk using a factor model to understand how much return and risk from different asset classes or managers resulted from prescribed factor exposures in the past1, or how much risk to expect going forward.2

Risk monitoring enables institutions to monitor changes in the sources of risk on a regular and timely basis. For instance, many well diversified U.S. plans saw a growing exposure to financial sector, housing, and credit risk from 2005-2006. While risk managers may not have foreseen a looming correction, the ability to monitor these exposures would have at least alerted them to the risks in the event of a correction.

Portfolio decomposition plays an important role in stress testing. Here, the sources of risk are stressed by the risk manager to assess the impact on the portfolio. Stress testing is flexible in enabling risk managers to gauge the impact of an event on the portfolio. The stress scenario might be real or hypothetical, commonplace or rare, but stress tests are used typically to assess the impact of large and rare events. Scenarios can come in different flavors, such as macro shocks, market shocks, or factor shocks. The intuition behind stress testing for market risk can be applied to nonmarket or systemic risks, such as leverage and liquidity risk. When leverage and liquidity shocks occur, as in 2008, they may result in unexpected increases in investment commitments for which no immediate funding source is available. While largely unpredictable, the impact of such shocks can be analyzed using stress tests. Below we list a number of stress test categories that investors might employ on a regular basis to assess the immediate impact on the portfolio as well as the change in impact over time.

Systemic shock
■■ Liquidity shock
■■ Leverage shock

Macro shock
■■ Interest rate shock
■■ Oil price shock

Market-wide shock
■■ Market-wide decline in equity prices

Targeted shock
■■ U.S. value stocks hit
■■ Japan growth stocks hit

Our discussion of stress testing segues naturally into the problem of managing tail risk, or the risk of some rare event occurring. Whereas stress tests do not address the likelihood of extreme shocks occurring, other methods for analyzing tail risk do. This recent period of turmoil has acutely highlighted both the importance of managing tail risk and the inadequacy of generic tail risk measures, such as parametric VaR.

While the simplest measure of parametric VaR assumes that returns are normally distributed, more sophisticated methods for calculating VaR do not. These span a wide range of modeling choices that may rely on parametrically or non-parametrically specified non-normal distributions, or Extreme Value Theory. For a more detailed discussion on the latter, we refer to Barbieri et al. (2009) and Goldberg et al. (2009). Other measures of tail risk, such as Expected Shortfall (conditional VaR) and Maximum Drawdown,3 seek to capture a different and potentially more relevant facet of tail risk. In general, turbulent times highlight the need for monitoring appropriate tail risk measures. Such times also call for the frequent reporting of exceptional developments, i.e., reporting that highlights unusual changes in risk measures or increases in exposure to certain factors.
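To make the factor-level view concrete, here is a toy sketch of the standard linear factor decomposition and a single factor shock. The exposures, covariances, and shock size are invented for illustration; they are not taken from this paper or from any commercial model.

import numpy as np

# Toy two-factor model: asset returns r = B f + e, so portfolio variance
# decomposes as w'(B F B' + D)w = factor risk + specific risk.
B = np.array([[1.1, 0.2],                 # exposures of three assets to an
              [0.9, 0.8],                 # equity factor and an
              [0.1, 1.0]])                # interest-rate factor
F = np.array([[0.040, 0.004],
              [0.004, 0.010]])            # factor covariance (annualized)
D = np.diag([0.02, 0.03, 0.01])           # specific (idiosyncratic) variances
w = np.array([0.5, 0.3, 0.2])             # portfolio weights

factor_var = w @ B @ F @ B.T @ w
specific_var = w @ D @ w
print("total volatility:", np.sqrt(factor_var + specific_var))

# Stress test: shock the equity factor by -30% and read the first-order
# portfolio impact straight off the aggregated factor exposures.
shock = np.array([-0.30, 0.0])
print("stressed portfolio return:", w @ B @ shock)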

1 Performance attribution, the attribution of realized returns to a set of exposures times factor returns, can provide valuable insight to risk managers seeking to identify where their investments or managers added value. In addition, it can highlight similarities between asset groups or managers' strategies in a way that is far more informative than looking at historical inter-manager correlations alone.
2 The Barra Integrated Model (BIM) is such a multi-asset class model for forecasting the asset- and portfolio-level risk of global multi-asset class portfolios or plans. The model begins with a detailed analysis of individual assets from 56 equity markets and 46 fixed income markets to uncover the factors that drive their risk and return. The assets, both equity and fixed income, together with commodities, hedge funds, and currencies, are then combined into a single risk model. This makes it suitable for a wide range of investment purposes, from conducting an in-depth analysis of a single-country portfolio to understanding the risk profile of a broad set of international investments across several asset classes.
3 VaR captures the expected loss at some threshold, while Expected Shortfall captures the expected loss once that threshold has been exceeded. Maximum drawdown is defined as the largest drop from a peak to a bottom in a certain period.
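The three measures defined in footnote 3 are straightforward to compute historically; the sketch below does so for a hypothetical fat-tailed return series (Student-t draws), purely for illustration.

import numpy as np

def tail_risk_measures(returns, level=0.95):
    """Historical VaR, Expected Shortfall, and maximum drawdown.

    VaR: the loss threshold exceeded on (1 - level) of days.
    Expected Shortfall: the average loss on the days beyond that threshold.
    Maximum drawdown: the largest peak-to-trough drop of the cumulative path.
    """
    losses = -returns
    var = np.quantile(losses, level)
    es = losses[losses >= var].mean()
    wealth = np.cumprod(1 + returns)
    peaks = np.maximum.accumulate(wealth)
    max_dd = ((peaks - wealth) / peaks).max()
    return var, es, max_dd

rng = np.random.default_rng(2)
var, es, max_dd = tail_risk_measures(0.01 * rng.standard_t(4, 2500))
print(f"VaR {var:.2%}, Expected Shortfall {es:.2%}, max drawdown {max_dd:.2%}")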

Before we move on to the third pillar, RAIM, it is important to point out that risk monitoring requires the necessary IT and infrastructure resources for support. First, accurate data is essential, as is sufficient coverage of the assets held in the portfolio. Delays in a risk manager's ability to view changes in holdings, prices, or characteristics are often caused by infrastructure limitations. In some cases, data may not be readily available, or the resources required to collect data from custodians or individual managers may be prohibitively expensive. In addition, hard-to-model assets, such as complex derivatives, hedge funds, and private equity, can pose a challenge for even the most advanced systems. In sum, institutions should consider the costs of implementing the necessary risk management systems when they decide in which assets to invest.

One consequence of the current crisis may be that investors become more cautious when they choose their investments. Warren Buffett, for example, commented at his recent shareholder

meeting on complex calculations used to value purchases: "If you need to use a computer or a calculator to make the calculation, you shouldn't buy it." Even though that statement may be extreme, the point is well taken. The damage that exotic, illiquid, and hard-to-value instruments have triggered over the last 18 months highlighted the need to be able to assess the risks of such investments before money is allocated to them.

The third pillar in our framework is RAIM, which puts risk measurement and monitoring outputs into action. While risk measurement provides the measures, and risk monitoring ensures that the measures are timely and relevant, without the ability to make adjustments to the portfolio, this information is of limited value for protecting the investor against losses. RAIM aligns the investment decision-making process with the risk management function. For instance, RAIM might be used to make portfolio adjustments as the correlations between assets or managers rise or as the probability of certain tail risk or disaster scenarios increases. RAIM could also facilitate the management of risks coming from certain sources of return, or it could aid in better diversifying the portfolio. Specifically, RAIM could be used in the development of overlay strategies that would facilitate certain hedges, such as currency hedges, or tail risk insurance.

As an example, the declines in the broad equity market last year caused many pension plans to become underfunded. Decision-makers may decide that their tolerance for losses should be limited to a specific percentage. They should then decide whether that limit should be maintained through a passive hedge or through a trigger mechanism defined by the breach of clearly defined parameters of a risk measure. Some pension plans started hedging their equity exposure to limit downside risk, though for many it was too late. One reason why pension plans may not have hedged their market exposure more frequently is the cost of hedging. Hedging reduces the performance of the portfolio in up markets, but in periods when the market declines, hedging limits the downside. Figure 2 illustrates a successful market hedge that includes a stop-loss plan at a point when assets drop below a specified level.
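A stylized simulation of the stop-loss mechanism just described (and depicted in Figure 2 below) might look as follows; the market path, trigger level, and hedging cost are all hypothetical choices, not parameters of any actual plan.

import numpy as np

rng = np.random.default_rng(3)
market = rng.normal(-0.001, 0.02, 500)    # a hypothetical drifting-down market
trigger = -0.15                           # de-risk after a 15% cumulative loss
hedge_cost = 0.0001                       # the plan bears a small daily cost

uninsured = np.cumprod(1 + market) - 1
insured, value, stopped = [], 1.0, False
for r in market:
    if not stopped:
        value *= 1 + r - hedge_cost       # pay for insurance in normal markets
        stopped = value - 1 <= trigger    # breach of trigger: move to cash
    insured.append(value - 1)             # after the stop, losses are capped

print(f"uninsured: {uninsured[-1]:.1%}, insured: {insured[-1]:.1%}")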
[Figure 2 – Risk-adjusted investment management to protect against downside risk. Line chart of cumulative returns, roughly +60% to -100% on the vertical axis, from late 2006 through early 2009, for an uninsured and an insured portfolio. Annotations: portfolio insurance takes effect as the market falls; the plan bears the cost of insurance during normal markets but benefits from large unexpected drops due to systemic blow-ups.]

All three pillars — risk measurement, risk monitoring, and RAIM — are indispensable to a complete risk management structure. Figure 3 summarizes the three pillars, illustrated with specific examples. The Figure uses the same idea we presented before, namely, that risk measures can be categorized by normal and extreme times and relative versus absolute investment objectives.

                 Risk measurement         Risk monitoring                RAIM
Normal   Total   Volatility               Monitor sources of             Limit exposure to biggest
                                          volatility                     sources of volatility
         Active  Tracking error           Monitor sources of             Limit exposure to biggest
                                          tracking error                 sources of tracking error
Extreme  Total   Stress tests/tail risk   Monitor expected shortfall     Implement portfolio
                 measures                 of the total plan              insurance plan
         Active  Stress tests/tail risk   Monitor changes in potential   Ask managers to limit
                 measures                 active losses if market        exposures to certain
                                          declines by X%                 sources of risk

Figure 3 – Three pillars of risk management

Implementing a market risk management framework
In practice, the needs of institutional investors can be wide ranging, and their ideal measurement, monitoring, and managing capabilities will differ. In this section, we illustrate the case of a hypothetical but typical U.S. plan sponsor. Although there may be additional criteria, the three critical drivers of risk management requirements are as follows:


1 Return requirements — the plan's liabilities or expected payouts will influence not only the assets in which it invests but also which benchmarks are used and how much it can lose over certain periods. The latter, in turn, may drive how much risk it is willing to take and with how much exposure to certain sources of return/risk it is comfortable.
2 Investment horizon — the plan's investment horizon, or willingness to sustain shorter-term shocks, will influence which risk measures are appropriate and how frequently they need to be monitored.
3 Complexity of investments — plans that invest in difficult-to-value assets with potentially non-normal return distributions or unusually high exposure to tail events require additional risk measures, higher monitoring frequencies, and advanced RAIM capabilities.

These criteria are naturally linked, although the degree of importance might vary from plan to plan. For instance, return requirements may play the primary role in driving the choice of instruments and asset classes for some plans, while they may play a less important role for other plans. For some plans, the investment horizon is tied directly to their return requirements, while for others, it is more a function of how much they are willing to lose over a given period. Regardless, these three criteria determine the guiding principles for any plan's risk management function.

For example, a plan sponsor with reasonable and relatively infrequent payout obligations, a large surplus, and with limited exposure to alternatives and complex instruments does not need short-term measures or frequent monitoring. This plan would benefit from focusing on longer-term risk measures. Instead of setting up a system to calculate 10-day VaR measures, the plan could focus on how multiyear regime shifts in different risk factors, such as interest rate cycles, may affect the portfolio's value. In contrast, a plan with frequent and significant expected payouts, a limited ability to sustain short-term losses, and with substantial exposure to alternatives and complex instruments would require a wide variety of risk measures, frequent risk monitoring, and a well developed RAIM process. Most plans are likely to fall between these two extremes.

Another example may help to shed light on these ideas. Below, we group investors into one of three categories using the third criterion — complexity of instruments and asset classes.

A Type-1 plan invests in a straightforward allocation to equities and fixed income. Equity and fixed income allocations may be limited to the domestic market, and fixed income investments are mostly concentrated in government bonds and AAA-rated corporates. A Type-2 plan may invest in equities globally, including emerging markets. Fixed income investments may include high yield bonds and mortgage-backed securities, and the plan may also invest in alternatives and complex derivatives. However, as a percentage of the plan's total value, the investments in alternatives and derivatives would be small. A Type-3 plan would invest in a variety of alternative asset classes as well as complex instruments, but to a larger extent than Type-2 plans.

Currently, the vast majority of pension plans fall into the second group. We, therefore, consider a hypothetical Type-2 plan for our illustration of a risk management framework. Its asset allocation is as follows: equity (60%) [U.S. (36%), international (24%)], fixed income (U.S.) (25%), alternatives (15%) [real estate (5%), private equity (5%), hedge funds (5%)].
these two extremes. multiple simultaneous shocks impact the overall portfolio (i.e.,
tail risk and tail correlations). The plan could then establish an
Another example may help to shed light on these ideas. Below, we action plan if asset values drop by some absolute or relative (i.e.,
group investors into one of three categories using the third criteria — to liabilities) amount.
complexity of instruments and asset classes.
Active risk:
A Type-1 plan invests in a straightforward allocation to equities and ■■ Normal periods — the plan can ask for reports on their sources
fixed income. Equity and fixed income allocations may be limited of risk from all equity, fixed income, and alternatives managers,
to the domestic market, and fixed income investments are mostly or the plan can estimate them internally. Sources of active risk
concentrated in government bonds and AAA-rated corporate. A should be detailed and focused on the specific risk and return
Type-2 plan may invest in equities globally, including emerging drivers of the manager’s investment strategy. For instance,
markets. Fixed income investments may include high yield bonds analyzing a value, small cap equity manager’s tracking error will
and mortgage-backed securities, and the plan may also invest in focus on the active bets relative to the agreed upon benchmark,
alternatives and complex derivatives. However, as a percentage of ideally a small cap value benchmark like the MSCI Small Cap
the plan’s total value, the investments in alternatives and deriva- Value Index. Measuring and monitoring will focus on questions


of active bets relative to this benchmark, for example, does the


manager’s portfolio have a consistent small cap value bias or did Senior investment /
the portfolio move towards growth-oriented companies over the risk committee

last few years when value stocks underperformed? The active


risk analysis should ensure that the hired manager is following • Initial asset/manager allocation
his or her mandate and is not deviating from the agreed upon Risk manager • Standard risk reports
• Red flags/exceptions reporting
guidelines.
■■ Extreme events — the plan may want to stress test the impact
of the joint underperformance of a number of active strategies
that historically have been uncorrelated. For example, during Equity Alternatives Fixed income
the quant meltdown in August 2007, a number of return factors
became suddenly highly correlated, leading to severe negative
portfolio performance relative to their respective benchmarks. Figure 4 – Organizational structure for risk management
Such stress tests enable the plan to evaluate how shocks impact an entire group of managers. Other useful measures for rare events are tail risk and tail risk correlations of active bets across managers. Certain factors or strategies may become highly correlated across asset classes or markets during crises (e.g., value or momentum across equity markets) and could lead to vastly higher losses relative to their respective benchmarks than estimated by the tracking error.

The above examples illustrate how a model that decomposes risk along its sources can help institutions evaluate different types of risk across different dimensions. It can be applied similarly to realized returns in order to attribute past performance. For our hypothetical Type-2 plan, a suggested set of minimum components for risk management may include:
■■ Aggregate measures of volatility and tracking error across managers and asset classes.
■■ An accurate decomposition of return and risk across asset classes, utilizing an integrated (across asset classes) multi-factor risk model.
■■ A stress testing framework and/or extreme risk measures for understanding tail risk and tail risk correlations in the portfolio.
■■ An appropriate set of benchmarks.

The exact measures, monitoring frequencies, and RAIM processes the plan adopts will depend on its return requirements and expected payouts, and on its investment horizon and willingness to tolerate shorter-term losses. For instance, a plan with limited ability to withstand short-term losses may want to build out its ability to assess tail risk over different horizons using risk measures such as Expected Shortfall based on Extreme Value Theory [Goldberg et al. (2009)], which is more conservative than parametric VaR. These plans may also want to implement extensive stress tests across asset classes and within certain subcategories of investments. Meanwhile, plans with greater ability to withstand short-term losses may opt for more basic tail risk measures and stress tests.

Our example focused on a Type-2 plan. For a Type-3 plan, this illustration would also be relevant, but the requirements for measuring, monitoring, and managing different types and sources of risk would be more extensive. For a Type-1 plan, the extent to which it invests in risk management should depend on its liabilities structure and its short-term risk tolerance.

Most plans, regardless of their specific characteristics, can take some basic actions on an organizational or administrative level to manage risk. Our hypothetical plan may establish a risk committee consisting of the CIO, risk managers, senior portfolio managers, and legal and compliance officers that meets at least once a quarter to discuss the overall economic and financial environment. Participants can discuss their concerns regarding systemic risk issues such as liquidity and leverage, review the results of stress tests, and debate whether hedging strategies should be activated to address undesired exposures or potential tail risk events. The plan could also develop a reporting framework where the risk committee would receive at least monthly reports on unusual developments identified by the risk manager. Then, if the investment committee is sufficiently concerned about exposure to a certain segment, it could ask those managers with large exposures to hedge them or to eliminate the undesired exposures.

Figure 4 illustrates this type of setup, where the risk manager prepares risk reports and recommendations for the risk committee and delivers risk management services and advice to the different asset class managers.

Finally, the plan may establish minimum risk management requirements for external managers. For instance, the external managers could be required to demonstrate their ability to calculate tracking error, VaR, or other measures, as well as how risk management impacts their portfolio construction.
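To make two of these measures concrete, the following minimal Python sketch computes an annualized tracking error and an empirical 99% expected shortfall for a single manager’s monthly active returns. This is our illustration, not the plan’s or any vendor’s actual model: the simulated return series and every parameter are hypothetical. Goldberg et al. (2009) fit an Extreme Value Theory model to the tail instead; the simple empirical estimate below is a noisy stand-in when only 120 observations are available.

# Minimal sketch; all inputs are simulated and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
# Ten years of monthly active returns (manager minus benchmark), fat-tailed.
active_returns = rng.standard_t(df=4, size=120) * 0.01

# Tracking error: standard deviation of active returns, annualized.
tracking_error = active_returns.std(ddof=1) * np.sqrt(12)

# Empirical 99% expected shortfall: average loss in the worst 1% of months.
# (An Extreme Value Theory fit would model this tail parametrically instead.)
cutoff = np.quantile(active_returns, 0.01)
expected_shortfall = -active_returns[active_returns <= cutoff].mean()

print(f"annualized tracking error: {tracking_error:.2%}")
print(f"99% monthly expected shortfall: {expected_shortfall:.2%}")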


Conclusion
Recent events have put into stark relief the inadequacy of the cur-
rent state of risk management. Much has been said about the need
for better risk management and a greater degree of risk aware-
ness in the broader investment community. Risk management is
a dynamic area, and any set of best practices is bound to evolve
over time. Here we set out to clarify some of the principles and
tools that we believe are required for a sound risk management
framework.

Specifically, we lay out a framework that rests on three pillars — risk measurement, monitoring, and RAIM (or risk-adjusted investment
management). Each of the three domains is critical for risk manage-
ment. Risk measurement means having the right tools to measure
risk accurately from various perspectives. Risk monitoring means
observing the risk measures on a regular and timely basis. RAIM
means using the information from the measurement and monitor-
ing layers intelligently to ensure that the portfolio management
process is aligned with expectations of risk and risk tolerance.
While each pillar encompasses a different aspect of risk manage-
ment, each is indispensable to a strong risk management process.
Moreover, they are interdependent and should be aligned with the
investor’s objectives. Their interconnectedness drives the key con-
ceptual theme — that risk management and the investment process
should be fully integrated.

References
• Barbieri, A., V. Dubikovsky, A. Gladkevich, L. Goldberg, and M. Hayes, 2009, “Central limits and financial risk,” MSCI Barra Research Insights
• Best, P., and M. Reeves, 2008, “Risk management in the evolving investment management industry,” Journal of Financial Transformation, 25, 88-90
• Goldberg, L., M. Hayes, J. Menchero, and I. Mitra, 2009, “Extreme risk analysis,” MSCI Barra Research Insights
• Kneafsey, K., 2009, “Four demons,” Journal of Financial Transformation, 26, 18-23

Articles

Lessons from the global financial meltdown of 2008

Hershey H. Friedman
Professor of Business and Marketing, Department of Economics, Brooklyn College, City University of New York

Linda W. Friedman
Professor, Department of Statistics and Computer Information Systems, Baruch College and the Graduate Center of the City University of New York

Abstract
The current financial crisis that threatens the entire world has created an ideal opportunity for educators. A number of important lessons can be learned from this financial meltdown. Some are technical and deal with the value of mathematical models and measuring risk. The most important lesson, however, is that unethical behavior has many consequences. This debacle could not have occurred if the parties involved had been socially responsible and not motivated by greed. Conflicts of interest and the way CEOs are compensated are at the heart of this financial catastrophe that has wiped out trillions of dollars in assets and millions of jobs. We present a set of lessons as teaching opportunities for today’s students and tomorrow’s decision makers.


The current financial crisis is the worst debacle we have experienced since the Great Depression. Millions of jobs have been lost and trillions of dollars in market value have evaporated. The entire world is suffering. The financial crisis is far from over and is adversely affecting people all over the world. Are there lessons that can be learned from this financial debacle? In fact, it is the perfect teaching tool since it makes it so easy to demonstrate what can go wrong when firms do not behave in an ethical, socially responsible manner.

It is interesting to note that the financial meltdown of 2008 did not suddenly appear out of nowhere. The corporate world was heading down a dangerous path for more than 20 years. We feel that an early warning was the Savings and Loan disaster, in which 1,043 banks failed at a cost to U.S. taxpayers of about U.S.$124 billion [Curry and Shibut (2000)]. That happened between 1986 and 1995. The financial world ignored this warning. A few years later came several colossal corporate scandals including Enron, which filed for bankruptcy in late 2001, Tyco International, Adelphia, Global Crossing, WorldCom, and many other firms. These companies were found to use dubious accounting practices or engage in outright accounting fraud to deceive the public and enrich executives. In fact, the Sarbanes-Oxley Act of 2002 was enacted in order to prevent future financial disasters such as Enron. Suskind (2008) reports that Alan Greenspan, Chairman of the Federal Reserve, was at a meeting on February 22, 2002 after the Enron debacle and was upset with what was happening in the corporate world. Mr. Greenspan noted how easy it was for CEOs to “craft” financial statements in ways that could deceive the public. He slapped the table and exclaimed: “There’s been too much gaming of the system. Capitalism is not working! There’s been a corrupting of the system of capitalism.” This was another warning sign that it was easy for unrestrained greed to harm the entire economy.

The dot.com bubble, which took place between 1995 and 2001 (NASDAQ peaked at over 5100 in March 2000), was a different kind of crisis. It was fueled by irrational spending on Internet stocks without considering traditional business models. Investors did not seem to care about earnings per share or other more traditional measures. Moreover, there were too many companies trying to create online businesses. The price of many dot.com stocks came tumbling down, but the bubble was not based on fraud as much as overvaluation of stocks (especially the IPOs) and excessive speculation. The housing bubble that helped cause the current financial meltdown, on the other hand, was fueled to a large degree by the ready availability of deceitful mortgages. What was apparent from the dot.com bubble was that prices cannot go up forever — a lesson not heeded by those in the mortgage business.

Long Term Capital Management (LTCM), a hedge fund founded in 1994, which went out of business in 2000, showed how risky a highly-leveraged hedge fund could be. In 1998, LTCM had borrowed over U.S.$125 billion, but only had equity of U.S.$5 billion. The financial crisis started by LTCM at that time also demonstrated how the entire financial system could be at risk because of the actions of one fund. The Federal Reserve Bank was involved in a U.S.$3.5 billion rescue package in 1998 to protect the financial markets from a total collapse because of the actions of LTCM. One lesson that should have been learned from this financial debacle was that we should not rely so much on sophisticated mathematical models. LTCM’s models were developed by two Nobel laureates — Myron Scholes and Robert C. Merton — who were both members of the board of the hedge fund.

Lowenstein (2008) observes that “regulators, too, have seemed to replay the past without gaining from the experience. What of the warning that obscure derivatives needed to be better regulated and understood? What of the evident risk that intervention from Washington would foster yet more speculative behavior — and possibly lead to a string of bailouts?” Lowenstein (2008) states that only six months after the LTCM fiasco, Alan Greenspan “called for less burdensome derivatives regulation, arguing that banks could police themselves.” Needless to say, Greenspan was proven quite wrong in this assertion.

The scandal involving Bernard Madoff, which has been called the largest Ponzi scheme ever, has also served to cast serious doubts on how well our financial system is being monitored. Gandel (2008) notes that KPMG, PricewaterhouseCoopers, BDO Seidman, and McGladrey & Pullen all signed off that all was well with the many feeder funds that had invested with Madoff. What has shocked everyone is that the auditors did not recognize that billions of dollars of assets were just not there. Auditors are supposed to check that the stated assets actually exist. Interestingly, Madoff himself was not a client of any of the large auditing firms; he used a tiny accounting firm in New City, NY that had only three employees. It is now apparent that this alone should have been an indication that something was very wrong with the way Madoff conducted business. One lawyer remarked: “All they really had to substantiate the gains of these funds was Madoff’s own statements. They were supposed to be the watchdogs. Why did they sign off on these funds’ books?” [Gandel (2008)].

What can be learned from the above crises, especially the financial meltdown of 2008? Firstly, we should recognize that what we have experienced is not the breakdown of an economic system. This is a values meltdown more than anything else. In fact, a recent poll showed that bankers are near the bottom of the list when it comes to respect felt by the public and barely beat prostitutes and convicted felons [Norris (2009)].

Is the pursuit of self-interest always good?
One pillar of mainstream economics taught in most economics courses is based on the famous saying of Adam Smith in his classic
work, “Wealth of Nations,” that: “it is not from the benevolence of the butcher, the brewer or the baker that we expect our dinner, but from their regard to their own interest.” Smith demonstrated how self-interest and the “invisible hand” of the marketplace allocates scarce resources efficiently. Students are taught that “economic man” or homo economicus acts with perfect rationality and is interested in maximizing his/her self-interest. This results in an economic system (capitalism) that is efficient and productive. For the corporation, self-interest is synonymous with maximization of profits and/or maximization of shareholder wealth. Indeed, students are being taught that self-interest plus free markets and deregulation results in prosperity for everyone. The famous speech by Gordon Gekko in the movie Wall Street is based on the idea that the pursuit of self-interest is good for all of us1: “Greed, for lack of a better word, is good. Greed is right. Greed works. Greed clarifies, cuts through, and captures the essence of evolutionary spirit. Greed in all of its forms, greed for life, for money, for love, knowledge has marked the upward surge of mankind. And greed, you mark my words, will not only save Teldar Paper, but that other malfunctioning corporation called the USA.”

Change the word “greed” to pursuit of self-interest and you have in effect what has been taught to millions of business and economics students. Even the so-called watchmen and gatekeepers — corporate directors, investment bankers, regulators, mutual funds, accountants, auditors, etc. — have fallen into the self-interest trap and disregarded the needs of the public [Lorsch et al. (2005)].

It is ironic that the discipline of economics started as part of the discipline of moral philosophy, and even became a moral science [Alvey (1999)]. Adam Smith, in his first book, “The Theory of Moral Sentiments,” made it clear that he believed that economic growth depended on morality. To Smith, benevolence — not pursuit of self-interest — was the highest virtue [Alvey (1999)]. The following quotation demonstrates what Smith actually believed: “Man ought to regard himself, not as something separated and detached, but as a citizen of the world, a member of the vast commonwealth of nature and to the interest of this great community, he ought at all times to be willing that his own little interest should be sacrificed.”

Robinson made the point more than 30 years ago that the pursuit of self-interest has caused much harm to society and that Adam Smith should not be associated with this doctrine. In actuality, Smith believed that “society, however, cannot subsist among those who are at all times ready to hurt and injure one another.” Raw self-interest without a foundation of morality is not what Adam Smith is all about. Robinson ended a commencement address with the following warning: “I hope … that you will find that the doctrines of Adam Smith are not to be taken in the form in which your professors are explaining them to you” [Robinson (1977)].

Howard (1997) uses the expression “tragedy of maximization” to describe the devastation that the philosophy of maximizing self-interest has wrought. Unrestrained capitalism that is obsessed with self-interest and is unconcerned about the long run can lead to monopoly, inequitable distribution of income, unemployment, and environmental disaster [Pitelis (2002)].

In 1937, at his second inaugural address, President Franklin D. Roosevelt stated: “We have always known that heedless self-interest was bad morals; we know now that it is bad economics.” [Roosevelt (1937)]. Lawrence H. Summers, in a 2003 speech to the Chicago Economic Club, made the following remark: “For it is the irony of the market system that while its very success depends on harnessing the power of self-interest, its very sustainability depends upon people’s willingness to engage in acts that are not self-interested.”

The financial meltdown of 2008 shows quite clearly what happens when everyone is solely concerned with self-interest. Closely tied to the concept of pursuit of self-interest is the belief that free markets do not need regulation. Many economists and CEOs promoted the belief that capitalism could only work well with very little regulation.

Free markets and the role of regulation
There has been a small movement in economics that questions the neoclassical model in economics and its belief that free markets and laissez-faire economics will solve all problems [Cohen (2007)]. According to one economist, only 5% to 10% of America’s 15,000 economists are “heterodox,” i.e., do not follow the neoclassical model promoted by free market enthusiasts such as Milton Friedman. Some heterodox economists feel that neoclassical economics has become “sycophantic to capitalism;” the discipline is concerned with mathematical solutions that do not resemble the real world. The discipline is more concerned about models than solving social problems [Monaghan (2003)].

Lichtblau (2008) believes that the federal government did not do a good job monitoring Wall Street. He cites what Arthur Levitt, former chairman of the SEC, said regarding his former agency: “As an overheated market needed a strong referee to rein in dangerously risky behavior, the commission too often remained on the sidelines.” Sean Coffey, who used to be a fraud prosecutor, also was not happy with the performance of the SEC: the SEC “neutered the ability of the enforcement staff to be as proactive as they could be. It’s hard to square the motto of investor advocate with the way they’ve performed the last eight years.” Not only was there a relaxation of enforcement, there was also a reduction in SEC staff. Coffey asserts that the administration used the argument that loosening up regulations was necessary in order to make it possible for American companies to compete globally [Lichtblau (2008)]. Senator Charles E. Schumer also believed that the rules had to be

1 Available at: http://www.americanrhetoric.com/MovieSpeeches/moviespeechwallstreet.html

changed to encourage free markets and deregulation if the United States was to remain competitive [Lipton and Hernandez (2008)].

The problems began at a meeting on April 28, 2004 between five major investment banks and the five members of the SEC. The investment banks wanted an exemption from a regulation that limited the amount of debt — known as the net capital rule — they could have on their balance sheets. By increasing the amount of debt they would be able to invest in the “opaque world of mortgage-backed securities” and credit default swaps (CDSs). The SEC agreed to loosen the capital rules and also decided to allow the investment banks to monitor their own riskiness by using computer models to analyze the riskiness of various securities, i.e., switch to a voluntary regulatory program [Labaton (2008)]. The firms did act on the new requirements and took on huge amounts of debt. The leverage ratio at Bear Stearns rose to 33:1. This made the firm very risky since it held only $1 of equity for $33 of debt. Regarding what transpired, James D. Cox said [Labaton (2008)]: “We foolishly believed that the firms had a strong culture of self-preservation and responsibility and would have the discipline not to be excessively borrowing. Letting the firms police themselves made sense to me because I didn’t think the SEC had the staff and wherewithal to impose its own standards and I foolishly thought the market would impose its own self-discipline. We’ve all learned a terrible lesson.”
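A back-of-the-envelope calculation shows why 33:1 leverage left so little margin for error. The sketch below is ours, using only the $1-of-equity-per-$33-of-debt figure quoted above: with $34 of assets resting on $1 of equity, a fall of barely 3% in asset values wipes out the equity entirely.

# Illustrative only: equity left at 33:1 leverage after a drop in asset values.
equity, debt = 1.0, 33.0
assets = equity + debt  # $34 of assets supported by $1 of equity

for drop in (0.01, 0.02, 0.03):
    remaining = equity - assets * drop
    print(f"assets fall {drop:.0%}: equity per $34 of assets becomes ${remaining:.2f}")

At a 3% decline the remaining equity is already negative; the firm is insolvent before markets fall even one more point.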
Alan Greenspan finally admitted at a congressional hearing in October 2008 that he had relied too much on the “self-correcting power of free markets” [Andrews (2008)]. He also acknowledged that he did not anticipate the “self-destructive power of wanton mortgage lending” [Andrews (2008)]. Greenspan was asked whether his ideology — i.e., belief in deregulation and that government regulators did not do a better job than free markets in correcting abuses — had contributed to the financial crisis. He admitted that he had made a mistake and actually took partial responsibility. Some of the mistakes Greenspan made had to do with risky mortgages and out-of-control derivatives.

Risky mortgages
Greenspan allowed the growth of highly risky and fraudulent mortgages without recognizing the “self-destructive power” of this type of mortgage lending with virtually no regulation [Skidelsky (2008)]. The Fed could have put a stop to it by using its power under a 1994 law (Home Owner Equity Protection Act) to prevent fraudulent lending practices. It was obvious that the mortgage industry was out of control and was allowing individuals with very little money to borrow huge sums of money. Greenspan could also have used the monetary powers of the Fed to raise interest rates, which would have ended the housing bubble.

Here are just a few examples of how deregulation affected the subprime mortgage market: WaMu, for example, lent money to nearly everyone who asked for it. Loan officers were encouraged to approve mortgages with virtually no checking of income [Goodman and Morgenson (2008)]. To encourage people with little income to borrow money, WaMu used option ARMs (Adjustable Rate Mortgages). The very low initial rates enticed people to take out mortgages. Of course, many borrowers thought the low payments would continue indefinitely and would never balloon [Goodman and Morgenson (2008)]. The number of ARMs at WaMu increased from 25% (2003) to 70% (2006). It did not take long for word to spread that WaMu would give mortgages to nearly anyone. WaMu even ran an advertising campaign telling the world that they would give mortgages to anyone, “The power of yes.” WaMu did exactly that: it approved almost every mortgage. Revenues at WaMu’s home lending unit increased from $707 million to approximately $2 billion in one year. As one person noted: “If you were alive, they (WaMu) would give you a loan. Actually, I think if you were dead, they still would give you a loan” [Goodman and Morgenson (2008)].

In the mortgage business, NINJA loans are mortgage loans made to people with “no income, no job or assets.” These mortgage loans were made on the basis of unsubstantiated income claims by either the applicant or the mortgage broker or both. There are stories of individuals with income of U.S.$14,000 a year purchasing U.S.$750,000 homes with no money down and no mortgage payments for two years [Friedman (2008)]. Other types of mortgages that became popular include the “balloon mortgage” (borrower only makes the interest payment for 10 years but then has to pay a huge amount — the balloon payment); the “liar loan” (borrower claims an annual income, no one checks and there is no documentation); the “piggyback loan” (this combines a first and second mortgage so a down payment is not necessary); the “teaser loan” (mortgage at very low interest rate for first two years but when it is readjusted after the two years, the borrower does not have the income to make the payments); the “option ARM loan” (discussed above); and the “stretch loan” (borrower has to use more than 50% of his monthly income to make the mortgage payments) [Pearlstein (2007)].
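The payment shock built into the teaser and option-ARM structures described above is easy to quantify. The sketch below is a hypothetical illustration, not data from any lender: the loan size and both rates are invented, and any negative amortization during the teaser period is ignored. It uses the standard fixed-rate amortization formula.

# Hypothetical illustration of a teaser-rate reset; all numbers are invented.
def monthly_payment(principal, annual_rate, years):
    """Standard amortization formula: P * r / (1 - (1 + r)**-n)."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

principal = 500_000
teaser = monthly_payment(principal, 0.015, 30)  # low introductory rate
reset = monthly_payment(principal, 0.070, 28)   # rate after a two-year reset

print(f"teaser payment: ${teaser:,.0f} per month")
print(f"reset payment:  ${reset:,.0f} per month ({reset / teaser:.1f}x the teaser)")

Under these assumptions the payment roughly doubles at the reset, which is exactly the moment the borrowers described above could no longer pay.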
President Bush wanted to encourage homeownership, especially among minorities. Unfortunately, his approach, which encouraged easy lending with little regulation, helped contribute to the financial crisis [Becker et al. (2008)]. The increase in homeownership was accomplished through the use of toxic mortgages. President Bush also encouraged mortgage brokers and corporate America to come up with innovative solutions to enable low-income people to own homes. L. William Seidman, an advisor to Republican presidents, stated: “This administration made decisions that allowed the free market to operate as a bar room brawl instead of a prize fight.” Bush’s banking regulators made it clear to the industry that they would do everything possible to eliminate regulations. They even used a chainsaw to symbolically show what they would do to the
banking regulations. Various states, at one point, did try to do something about predatory lending but were blocked by the federal government. The attorney general of North Carolina said: “They took 50 sheriffs off the beat at a time when lending was becoming the Wild West” [Becker et al. (2008)].

The policy of encouraging homeownership affected how Fannie Mae and Freddie Mac conducted business. Fannie and Freddie are government-sponsored, shareholder-owned companies, whose job is to purchase mortgages. They purchase mortgages from banks and mortgage lenders, keep some and sell the rest to investors. This enables banks to make additional loans, thus allowing more people to own homes. Fannie and Freddie were encouraged by President Bush to do everything possible to help low-income people buy homes. One way to accomplish this was to reduce the down payments. There was a drive to allow new homeowners to obtain a federally-insured mortgage with no money down. There was little incentive to do something about the super-easy lending practices, since the housing industry was helping to pump up the entire economy. The increasing home values helped push consumer spending. More and more people were becoming homeowners in line with what President Bush wanted.

Back in the year 2000, Fannie Mae, under CEO Franklin D. Raines and CFO J. Timothy Howard, decided to expand into riskier mortgages. They would use sophisticated computer models and rank borrowers according to how risky the loan was. The riskier the loan, the higher the fees that would be charged to guarantee the mortgage. The big risk was if a huge number of borrowers would be unable to make the payments on their mortgages and walk away from their obligations. That danger was not seen as likely in 2000. Moreover, the computer models would ensure that the higher fees for the riskier mortgages would offset any losses from mortgage defaults. The company announced that it would purchase U.S.$2 trillion in loans from low-income (and risky) borrowers by 2010 [Duhigg (2008)].

What this accomplished — besides enriching the executives at Fannie Mae — was that it made subprime mortgages that in the past would have been avoided by lenders more acceptable to banks all over the country. These banks did not have the sophistication or experience to understand the kind of risk they were taking on. Between 2001 and 2004, the subprime market grew from U.S.$160 billion to U.S.$540 billion [Duhigg (2008)]. In 2004, there were allegations of accounting fraud at Freddie and Fannie and both had to restate earnings. Daniel H. Mudd became CEO of Fannie Mae in 2005 after Raines and Howard resigned from Fannie Mae under a cloud. Under Mudd’s watch, Fannie purchased even riskier mortgages, those that were so new that the computer models could not analyze them properly. Mr. Mudd was warned by Enrico Dallavecchia, his chief risk officer, that the company was taking on too much risk and should charge more. According to Dallavecchia, Mudd’s response was that “the market, shareholders, and Congress all thought the companies should be taking more risks, not fewer. Who am I supposed to fight with first?” [Duhigg (2008)].

As early as February 2003, Armando Falcon, Jr., who ran the Office of Federal Housing Enterprise Oversight (OFHEO), wrote a report that warned that Fannie and Freddie could default because they were taking on far too many mortgages and did not have the capital to protect themselves against losses. Falcon almost got fired for this report. After some accounting scandals at Freddie, the President decided to keep Falcon on. A bill was written by Michael Oxley, a Republican and Chairman of the House Financial Services Committee, that would have “given an aggressive regulator enough power to keep the companies from failing” [Becker et al. (2008)]. Since the bill was not as strong as what President Bush wanted, he opposed it and it died in the Senate. The bottom line was that in Bush’s desire to get a tougher bill, he ended up with no bill at all. Eventually, James B. Lockhart III became the Director of the OFHEO. Under his watch, Freddie and Fannie purchased U.S.$400 billion of the most risky subprime mortgages. In September 2008, the federal government had to take over Fannie Mae and Freddie Mac.

Derivatives out of control
Greenspan also admitted that he allowed the market for derivatives to go out of control. Greenspan was opposed to tighter regulation of derivatives going back to 1994 [Andrews (2008)]. George Soros, the renowned financier, avoided derivatives “because we don’t really understand how they work”; Felix G. Rohatyn, the prominent investment banker, called derivatives “potential hydrogen bombs”; and Warren E. Buffett remarked that derivatives were “financial weapons of mass destruction.” Greenspan, on the other hand, felt that “derivatives have been an extraordinarily useful vehicle to transfer risk from those who shouldn’t be taking it to those who are willing and capable of doing so” [Goodman (2008)].

Greenspan also admitted that the market for credit default swaps (CDSs), which became a multi-trillion dollar business, went out of control [Andrews (2008)]. The CDSs were originally developed to insure bond investors against default risk but they have taken on a life of their own and have been used for speculative purposes. A CDS is a credit derivative and resembles insurance since the buyer makes regular payments and collects if the underlying financial instrument defaults. In a CDS, there is the protection buyer who can use this instrument for credit protection; the protection seller who gives the credit protection; and the “reference entity,” which is the specific bond or loan that could go bankrupt or into default. It could be compared to buying fire insurance on someone else’s house. Imagine how the insurance industry would work if 1000 people were able to buy fire insurance on Jane Doe’s house. With
traditional insurance, if Jane Doe’s house burns down, there is only one payment. If, on the other hand, 1000 people were allowed to each own a fire insurance policy on Jane Doe’s home, one thousand payouts must be made. The insurance company would be collecting fees from 1,000 customers and have a nice revenue stream, but the risk would be very great. In fact, we would have a situation where one thousand people would do everything possible to make sure the Jane Doe home burns down.

Another difference between a CDS and traditional insurance is that there is a market for the CDS, which makes it attractive for speculators, since the owner of the CDS does not actually have to own the underlying security. One problem with this is that a hedge fund can buy a CDS on a bond and then do everything possible (i.e., sell the stock short to force the price of the stock down) to make sure that the reference entity does indeed default [Satow (2008)].
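The asymmetry in the fire-insurance analogy can be put in rough numbers. The stylized sketch below is our own and every parameter value in it is hypothetical: a protection seller collects a steady premium stream, but a single default on the reference entity triggers a payout of notional times one minus the recovery rate, and with 1,000 contracts written on the same name the seller owes 1,000 such payouts at once.

# Stylized CDS cash flows from the seller's side; every number is hypothetical.
def seller_cash_flows(notional, annual_spread, years, recovery, contracts):
    premiums_collected = notional * annual_spread * years * contracts
    payout_on_default = notional * (1 - recovery) * contracts
    return premiums_collected, payout_on_default

# One contract: $10m notional, 2% annual spread, 3 years of premiums, 40% recovery.
p, d = seller_cash_flows(10_000_000, 0.02, 3, 0.40, 1)
print(f"1 contract:      premiums ${p:,.0f}, payout on default ${d:,.0f}")

# The "1,000 fire policies on Jane Doe's house" case.
p, d = seller_cash_flows(10_000_000, 0.02, 3, 0.40, 1000)
print(f"1,000 contracts: premiums ${p:,.0f}, payout on default ${d:,.0f}")

The premiums are small and certain; the payout is enormous and, unlike a house fire, can be actively engineered by speculators holding the other side.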
It should be noted that the market for derivatives and CDSs became unregulated thanks to the Commodity Futures Modernization Act of 2000. This law was pushed by the financial industry in the name of free markets and deregulation. It also made it virtually impossible for states to use their own laws to do anything about these financial instruments. This bill was passed “at the height of Wall Street and Washington’s love affair with deregulation, an infatuation that was endorsed by President Clinton at the White House and encouraged by Federal Reserve Chairman Alan Greenspan” [Kroft (2008)].

According to Eric Dinallo, the insurance superintendent of New York State [Kroft (2008)]: “As the market began to seize up and as the market for the underlying obligations began to perform poorly, everybody wanted to get paid, had a right to get paid on those credit default swaps. And there was no ‘there’ there. There was no money behind the commitments. And people came up short. And so that’s to a large extent what happened to Bear Stearns, Lehman Brothers, and the holding company of AIG. It’s legalized gambling. It was illegal gambling. And we made it legal gambling…with absolutely no regulatory controls. Zero, as far as I can tell.” In response to the question as to whether the CDS market was like a “bookie operation,” Dinallo said: “Yes, and it used to be illegal. It was very illegal 100 years ago.”

SEC Chairman Christopher Cox stated that it is of utmost importance to bring transparency to the unregulated market in CDSs. They played a major role in the financial meltdown and were also the cause of the near bankruptcy of AIG, which necessitated a government bailout [Landy (2008), Cox (2008)]. According to Landy (2008), no one even knows the exact size of the CDS market; estimates range from U.S.$35 trillion to U.S.$55 trillion. When AIG was bailed out by the federal government it held U.S.$440 billion of CDSs [Philips (2008)].

One banker from India made the point that his colleagues were surprised at the “lack of adequate supervision and regulation.” What made it even more amazing was that all this occurred after the Enron debacle and the passing of the Sarbanes-Oxley bill [Nocera (2008)]. In fact, India avoided a subprime crisis because a bank regulator by the name of V. Y. Reddy, who was the opposite of Greenspan, believed that if “bankers were given the opportunity to sin, they would sin.” Unlike Greenspan, he felt that his job was to make sure that the banking system would not be part of a huge real estate bubble. He used his regulatory powers to deflate potential bubbles by not allowing bank loans for the purchase of undeveloped land. Bankers were not happy that Reddy was preventing them from making huge amounts of money, like their American peers were making. Now everyone knows that Reddy was right. As one Indian banker explained, regarding what happened in the United States: “It was perpetuated by greedy bankers, whether investment bankers or commercial bankers. The greed to make money is the impression it has made here.” [Nocera (2008)].

Credit rating agencies
There are three major credit-rating agencies: Moody’s, Standard & Poor’s, and Fitch Ratings. All have been accused of being overly generous in how they rated the securities that consisted of bundled, low-quality mortgages. The big question that has arisen is whether these firms assigned very good ratings (AAA) because of sheer incompetence or to make more money. By ingratiating themselves with clients, they were able to steer more business to themselves. There is no question that the firms were able to charge considerably more for providing ratings for complex financial securities than for simple bonds. Rating a complex U.S.$350 million mortgage pool would generate approximately U.S.$250,000 in fees, as compared to U.S.$50,000 for an equally sized municipal bond. Morgenson (2008a) quotes a managing director at Moody’s, a firm that rates the quality of bonds, as saying that: “These errors make us look either incompetent at credit analysis or like we sold our soul to the devil for revenue, or a little of both.”

Moody’s graded the securities that consisted of Countrywide Financial’s mortgages — Countrywide is the largest mortgage lender in the United States. Apparently, the ratings were not high enough and Countrywide complained. One day later, Moody’s raised the rating. This happened several times with securities issued by Countrywide.

Morgenson (2008a) provides an interesting example of how unreliable the ratings had become. A pool of residential subprime mortgages was bundled together by Goldman Sachs during the summer of 2006 (it was called GSAMP 2006-S5). The safest part of this, consisting of U.S.$165 million in debt, received a AAA rating from Moody’s and S&P on August 17, 2006. Eight months later, the rating of this security was lowered to Baa; and on December 4, 2007, it was downgraded to junk bond status.

Method of compensation may encourage risk taking
Warren Buffett once said: “in judging whether corporate America is serious about reforming itself, CEO pay remains the acid test” [Kristof (2008)]. It is obvious to all that corporate America has not performed well on this test. The fact that as many as 29% of firms have been backdating options makes it appear that the system of compensation of executives needs an overhaul [Burrows (2007)]. According to a Watson Wyatt survey, approximately 90% of institutional investors believe that top executives are dramatically overpaid [Kirkland (2006)].

Richard Fuld, CEO of Lehman Brothers, earned approximately half-a-billion dollars between 1993 and 2007. Kristof (2008) observes that Fuld earned about U.S.$17,000 an hour to destroy a solid, 158-year-old company. AIG Financial Products, a 377-person office based in London, nearly destroyed the mother company, a trillion-dollar firm with approximately 116,000 employees. This small office found a way to make money selling credit default swaps to financial institutions holding very risky collateralized debt obligations. AIG Financial Products made sure that its employees did very well financially: they earned U.S.$3.56 billion in the last seven years [Morgenson (2008b)].

One of the big problems at many Wall Street firms was how compensation was determined. Bonuses made up a huge part of how people were compensated. One individual at Merrill Lynch received U.S.$350,000 as salary but U.S.$35,000,000 as bonus pay. Bonuses were based on short-term profits, which distorted the way incentives work. Employees were encouraged to take huge risks since they were interested in the bonus. Bonuses were based on the earnings for that year. Thus, billions in bonuses were handed out by Merrill Lynch in 2006 when profits hit U.S.$7.5 billion. Of course, those profits were only an illusion as they were based on toxic mortgages. After that, the company lost triple that amount, yet the bonuses were not rescinded [Story (2008)]. Lucian Bebchuk, a compensation expert, asserted that “the whole organization was responding to distorted incentives” [Story (2008)].

It is clear that bonuses based on the profits of a particular year played a role in the financial meltdown. Firms are now changing the way bonuses work so that employees have to return them if the profits turn out to be illusory. The money will be kept in escrow accounts and not distributed unless it is clear that the profits are real. E. Stanley O’Neal, former CEO of Merrill Lynch, not only collected millions of dollars in bonuses but was given a package worth about U.S.$161 million when he left Merrill Lynch. He did quite well for someone whose claim to fame is that he nearly destroyed Merrill Lynch; it was finally sold to Bank of America.

According to WaMu employees, CEO Kerry K. Killinger put a huge amount of pressure on employees to lend money to borrowers with little in the way of income or assets. Real estate agents were given fees of thousands of dollars for bringing borrowers to WaMu. WaMu also gave mortgage brokers generous commissions for the riskiest loans since they produced higher fees and resulted in increased compensation for executives. Between 2001 and 2007, Killinger earned approximately U.S.$88 million [Goodman and Morgenson (2008)].

Cohan (2008) also sees “Wall Street’s bloated and ineffective compensation system” as a key cause of the financial meltdown. Several politicians are examining the way bonuses work in order to make sure that they will not be given to executives of firms that are receiving government assistance. Cohan (2008) feels very strongly that compensation reform in Wall Street is badly needed. He feels that the change of the old system, where the big firms (i.e., Donaldson, Lufkin, and Jenrette; Merrill Lynch; Morgan Stanley; Goldman Sachs; Lazard; etc.) switched from being a partnership to a public company, contributed to the financial debacle. When these firms were partnerships, there was collective liability so the firms were much more cautious. Once they became corporations with common stock, “bankers and traders were encouraged to take short-term risks with shareholder’s money.” These bankers and traders did not mind taking on more and more risk since their bonuses depended on the annual profits. Put these ingredients together — encourage risk taking, no accountability, and use of shareholder’s money — and you get a financial meltdown. Siegel (2009) feels that the CEOs deserve most of the blame for the financial crisis. When the major investment banks such as Lehman Brothers and Bear Stearns were partnerships, they were much more conservative since they were risking their own money. Once they became public companies, they did not mind taking on huge amounts of risk since they were no longer risking their own wealth. In effect, they were using other people’s money to become super wealthy.

Cohan (2008) feels that compensation, which consumes approximately 50% of generated revenues, is far too high. It is a myth that these high salaries and bonuses are needed to keep good executives from leaving. There was a time that CEOs earned about 30 to 40 times more than the ordinary worker. Recently, that ratio at large companies has been 344 to 1 [Kristof (2008)]. For those who believe that large bonuses lead to improved performance, research by Dan Ariely indicates the opposite. Bonuses cost the company more and lead to worse performance [Ariely (2008)]. The reason given by Ariely is that the stress caused by trying to win the big bonus overwhelms any ability of the bonus to motivate performance.

It appears that the method of compensation used by Wall Street firms was a distorted incentive that did not improve performance. Rather, it encouraged firms to take on huge amounts of risk — risk that eventually destroyed many of these firms. Amy Borrus, deputy director at the Council of Institutional Investors, asserts that
“poorly structured pay packages encouraged the get-rich-quick mentality and overly risky behavior that helped bring financial markets to their knees and wiped out profits at so many companies” [Morgenson (2009)].

To make matters even worse, the public has learned that Merrill Lynch rushed U.S.$4 billion in end-of-year bonuses to top management a few days before the company was taken over by Bank of America. These bonuses were paid out while the company was reporting a fourth quarter loss of more than U.S.$15 billion. Clearly, the bonuses are not rewards for a job well done. John Thain, former CEO of Merrill Lynch, argued that “if you don’t pay your best people, you will destroy your franchise.” The massive losses suffered by Merrill Lynch forced Bank of America to ask for additional billions in bailout money from the government on top of the U.S.$25 billion they had already been promised. Thain also agreed to reimburse Bank of America for the U.S.$1.2 million renovation of his office a year earlier [LA Times (2009)]. The renovation included area rugs for U.S.$131,000, chairs for U.S.$87,000, a U.S.$13,000 chandelier, and an antique commode for U.S.$35,000 [Erman (2009)]. Citigroup was pressured by the Obama administration to cancel a deal to purchase a U.S.$50 million new French corporate jet. Citigroup was castigated by the media and called Citiboob by the New York Post for this planned purchase. Citigroup has lost billions of dollars, fired 52,000 employees, and received U.S.$45 billion in government bailout money, yet it was ready to purchase this luxury for its executives [Keil and Bennett (2009)].

The year 2008 was a time in which Wall Street firms lost billions. The country was surprised to learn that this did not stop these firms from giving out bonuses totaling U.S.$18.4 billion — the sixth largest amount ever. Apparently, bonuses are also doled out for a job poorly done. President Obama declared these bonuses “shameful” and “the height of irresponsibility.” Vice President Biden stated: “I’d like to throw these guys in the brig. They’re thinking the same old thing that got us here, greed. They’re thinking, ‘Take care of me’” [Stolberg and Labaton (2009)].

What has become clear to the public is that Wall Street just does not get it. The public and the media see the Wall Street executives who are indifferent to what they caused. They still feel that they deserve enormous amounts of money even if their firms have shed thousands of jobs and have nearly destroyed the financial system. Ira Kay, an executive consultant with Watson Wyatt, feels that you can make a strong case that the Wall Street culture (some refer to it as the “eat what you kill” mentality) as well as the bonuses that resulted from this kind of selfish culture have contributed greatly to the current financial meltdown [Nocera (2009b)].

There is no question that executive compensation is going to be scrutinized very carefully in the future, even by boards that were far too chummy with CEOs. There is talk of passing legislation that will force CEOs to return past pay — this is known as a “clawback” — in cases where compensation was based on profits that turned out to be illusory. Frederick E. Rowe, a founder of Investors for Director Accountability, feels that “there is a fine line that separates fair compensation from stealing from shareholders. When managements ignore that line or can’t see it, then hell, yes, they should be required to give the money back” [Morgenson (2009)]. There is even talk of “clawing back” executives’ pensions in cases where the profit earned by the firm turned out to be imaginary.

Models have only limited value
Models are representations of reality. They will not work if conditions have changed dramatically. They certainly will not work if compensation is tied to risk taking and executives insist that employees take risks to enrich themselves. As noted above, executives were doing everything possible to show stellar performance — even if it meant taking on huge amounts of risk — since they wanted a fat yearly bonus. Whenever there are conflicts of interest, poor decisions are likely to be made. Models rely on interpretation by people. If the employees interpreting the model are afraid of losing their jobs, they will construe the models in a way that pleases top management.

As noted above, Fannie Mae decided to expand into riskier mortgages in the year 2000. They believed that their sophisticated models would allow them to purchase riskier mortgages and that these computer models would protect them from the increased risk. After all, Fannie Mae would charge higher fees for risky mortgages. The mortgages, though, became super risky because of very low down payments and unverified income on the part of the borrower. It is doubtful that any model could anticipate that millions of homeowners would find it easy to walk away from their mortgage. In the past, relatively few people defaulted on a mortgage because they had to make a substantial down payment [Duhigg (2008)].

Lewis and Einhorn (2009) feel that the rating agencies did not measure risk properly. American financial institutions took on a great deal more risk without having to face a downgrading of their securities. This was probably due to the fact that the credit rating agencies were making money on the risky securities being created by these very same financial institutions. Companies such as AIG, Fannie Mae, Freddie Mac, GE, and MBIA, which guarantee municipal bonds, kept their AAA rating for a very long time. Lewis and Einhorn (2009) make the point that MBIA deserved its AAA rating in 1990 when it insured municipal bonds and had U.S.$931 million in equity and U.S.$200 million of debt. By the year 2006, MBIA was insuring CDOs (collateralized debt obligations), which are extremely risky; the company had U.S.$26.2 billion in debt and only U.S.$7.2 billion in equity. It kept its AAA rating for quite a while until it was obvious to everyone that MBIA was no longer a secure firm. The models that were being used by the credit rating agencies may not have
worked quite well. However, no one wanted to kill the goose that was producing so much profit for the credit rating agencies. Indeed, the agencies, as noted above, were quite pleased with all the risky securities they were paid to evaluate.

Back in 2004, as noted above, the investment banks wanted the SEC to give them an exemption from a regulation that limited the amount of debt — known as the net capital rule — they could have on their balance sheets. By increasing the amount of debt, they would be able to invest in the “opaque world of mortgage-backed securities” and credit default swaps (CDSs). The SEC agreed and decided to allow the investment banks to monitor their own riskiness by using computer models to analyze the riskiness of various securities [Labaton (2008)]. One individual was opposed to the changes and said the computer models would not work in periods of “severe market turbulence.” He pointed out that computer models did not protect Long-Term Capital Management from collapse [Labaton (2008)]. Of course, no one listened to him.

One measure of risk that was very popular with the financial engineers was VaR (Value-at-Risk). It is a short-term measure of risk (daily, weekly, or for a few weeks), expressed in dollars as a single number, and is based on the normal distribution. A weekly VaR of U.S.$100 million indicates that, for that week, there is a 99% probability that the portfolio will not lose more than U.S.$100 million. Of course, this is based on what has happened in the past. Alan Greenspan noted the following when trying to explain what went wrong during the financial meltdown: “The whole intellectual edifice, however, collapsed in the summer of last year because the data input into the risk-management models generally covered only the past two decades, a period of euphoria. Had instead the models been fitted more appropriately to historic periods of stress, capital requirements would have been much higher and the financial world would be in far better shape today, in my judgment” [Nocera (2009a)].

The problem with VaR, and similar measures, is that it ignores what can happen 1% of the time. Taleb (2007) considers VaR a dishonest measure since it does not measure risks that come out of nowhere, the so-called “black swan.” It provides a false sense of security because over the long run, things that have low probabilities of occurrence do happen. Indeed, Long Term Capital Management collapsed because of a black swan, unexpected financial crises in Asia and Russia [Nocera (2009a)].
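The normal-distribution assumption is exactly where the trouble hides. The sketch below is our illustration, not any firm’s actual model: the portfolio size, the volatility, and the Student-t alternative are all assumptions. It computes a 99% weekly VaR under normal tails, the same VaR under fat (Student-t) tails with identical volatility, and then the expected shortfall, i.e., the average loss once the 1% tail is actually hit.

# Illustrative comparison of thin-tailed VaR with a fat-tailed alternative.
import numpy as np
from scipy.stats import norm, t

portfolio = 1_000_000_000  # hypothetical U.S.$1 billion portfolio
weekly_vol = 0.02          # hypothetical 2% weekly return volatility

# Parametric 99% weekly VaR under the normal distribution.
var_normal = portfolio * weekly_vol * norm.ppf(0.99)

# Same volatility, but Student-t tails (df=3), rescaled to match the std dev.
df = 3
scale = weekly_vol / np.sqrt(df / (df - 2))
var_t = portfolio * scale * t.ppf(0.99, df)

# Expected shortfall under the fat-tailed model: mean loss in the worst 1%
# of one million simulated weeks.
rng = np.random.default_rng(0)
sims = rng.standard_t(df, size=1_000_000) * scale * portfolio
es_t = -np.sort(sims)[:10_000].mean()

print(f"99% weekly VaR, normal tails: ${var_normal / 1e6:.0f}m")
print(f"99% weekly VaR, t(3) tails:   ${var_t / 1e6:.0f}m")
print(f"99% expected shortfall, t(3): ${es_t / 1e6:.0f}m")

The last figure, the average loss once the 99% threshold is breached, is precisely what a single VaR number conceals.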
Conclusion
Phelps (2009) observes: “Whether in Enron’s creative accounting, the packaging of high-risk subprime mortgages as top-grade collateralized debt obligations, or Bernard Madoff’s $50 billion scam operation, the recent riot of capitalist irresponsibility has shattered the fantasy that the free market, left to its own devices, will produce rationality and prosperity.”

The key lessons one can learn from the global financial crisis of 2008 are the following:
■■ Some self-interest is good — capitalism requires it and human beings are programmed that way. However, voracious self-interest with total disregard for everyone else is not good for society.
■■ Too much regulation may be a bad thing — it stifles innovation and is not good for business or society. On the other hand, too little regulation is even more dangerous for society and business. China is discovering this with its recent problem with tainted milk, which killed several children. Apparently, executives at several dairy companies sold dairy products adulterated with melamine, a toxic chemical, in order to make the protein count appear higher than it actually was. One executive, who has been sentenced to death, was convicted of selling 600 tons of “protein powder” contaminated with melamine to dairy firms [Barboza (2009)]. The poor conditions of the Peanut Corporation of America plant in Blakely, Georgia, which resulted in a salmonella outbreak, also demonstrated what can happen when there is too little regulation. ConAgra, manufacturer of Peter Pan peanut butter, had similar problems and upgraded its safety procedures after a salmonella outbreak in 2007. Government officials admitted that there were not enough agents (60) to monitor thousands of food businesses. It is clear that the plant inspectors missed many serious and chronic problems such as a leaky roof in the Blakely plant [Moss (2009)]. Coming up with the right amount of regulation is not impossible. What is needed is sufficient regulation to discourage huge risk taking but not so much that it discourages innovation [Knowledge@Wharton (2009)].
■■ Incentives are an effective way of motivating people — however, bonuses may not be the appropriate kind of incentive for executives. In fact, bonuses can help destroy firms if they encourage executives to focus on short-term profits rather than the long-term health of a company.
■■ Mathematical models — might provide some insights, but ultimately it is people who have to interpret them. They can be used to justify almost anything and can cause as much harm as good.

The global financial crisis would not have occurred if executives were truly ethical. There is no question that lack of ethics played a significant role in the meltdown. A large number of people knew that the mortgages they were dealing with were toxic. It does not take a great financial expert to know that mortgages with no down payments given to people with no income are extremely foolhardy. The excuse that they believed that housing prices would continue to keep going up (at one point houses were doubling in value every six or seven years) is not credible and indeed it is not even a legitimate justification for this behavior. The truth is that as long as there was money to be made virtually no one said anything. To hide how risky these toxic mortgages were, they turned them into securities. To make matters worse, mortgage brokers were encouraged to do
everything possible to get people to take out mortgages. The rating agencies were making money on rating these securities so they did not do their job. All the parties figured someone else would end up holding the bag.

One thing is clear: free markets do not work well unless there is accountability, responsibility, ethics, and transparency. Predatory capitalism that disregards the concern for others and is based purely on self-interest may even be more dangerous than communism.

References
• Alvey, J. E., 1999, “Why spirituality deserves a central place in liberal education,” Journal of Markets and Morality, 2:1, 53-73
• Andrews, E. L., 2008, “Greenspan concedes error on regulation,” New York Times, October 24, B1, B6
• Ariely, D., 2008, “What’s the value of a big bonus?” New York Times, OP-ED, November 20, A43
• Barboza, D., 2009, “China plans to execute 2 in scandal over milk,” New York Times, January 23, A5
• Becker, J., S. G. Stolberg, and S. Labaton, 2008, “White House philosophy stoked mortgage bonfire,” New York Times, December 21, 1, 36-37
• Burrows, P., 2007, “He’s making hay as CEOs squirm,” Business Week, January 15, 64-65
• Cohan, W. D., 2008, “Our risk, Wall Street’s reward,” New York Times, Week in Review, November 16, 13
• Cohen, P., 2007, “In economics departments, a growing will to debate fundamental assumptions,” New York Times, July 11, B6
• Cox, C., 2008, “Swapping secrecy for transparency,” New York Times, Week in Review, October 19, 12
• Curry, T., and L. Shibut, 2000, “The cost of the savings and loan crisis: Truth and consequences,” FDIC Banking Review, 13(2), retrieved December 29, 2008 from http://www.fdic.gov/bank/analytical/banking/2000dec/brv13n2_2.pdf
• Duhigg, C., 2008, “Pressured to take more risk, Fannie reached a tipping point,” New York Times, October 5, 1, 34
• Erman, B., 2009, “Big bonuses, costly commodes: Thain out at B of A,” Globeandmail.com, January 23, retrieved January 28, 2009 from http://www.theglobeandmail.com/servlet/story/LAC.20090123.RTHAIN23/TPStory/Business
• Friedman, T. L., 2008, “The great unraveling,” New York Times, OP-ED, December 17, A39
• Gandel, S., 2008, “The Madoff fraud: How culpable were the auditors?” Time, December 17, retrieved December 29, 2008 from http://www.time.com/time/printout/0,8816,1867092,00.html#
• Goodman, P. S., 2008, “Taking a hard look at a Greenspan legacy,” New York Times, October 9, A1, A29
• Goodman, P. S., and G. Morgenson, 2008, “By saying yes, WaMu built empire on shaky loans,” New York Times, December 28, 1, 22
• Howard, G. S., 1997, “The tragedy of maximization,” Ecopsychology Online, October, retrieved June 25, 2007 from http://ecopsychology.athabascau.ca/1097/index.htm#politics
• Keil, J. G., and C. Bennett, 2009, “Citigroup cancels plans to buy $50m jet,” New York Post, January 27, retrieved January 28, 2009 from http://www.nypost.com/seven/01272009/news/nationalnews/citigroup_cancels_plans_to_buy_50m_jet_152270.htm
• Kirkland, R., 2006, “The real CEO pay problem,” Fortune, June 30, retrieved July 29, 2008 from http://money.cnn.com/magazines/fortune/fortune_archive/2006/07/10/8380799/index.htm
• Knowledge@Wharton, 2009, “Getting it right: making the most of an opportunity to update market regulation,” January 21, retrieved January 22, 2009 from http://knowledge.wharton.upenn.edu/article.cfm?articleid=2145
• Kristof, N., 2008, “Need a job? $17,000 an hour. No success required,” New York Times, OP-ED, September 18, A29
• Kroft, S., 2008, “The bet that blew up Wall Street,” 60 Minutes, October 26, retrieved January 4, 2009 from http://www.cbsnews.com/stories/2008/10/26/60minutes/main4546199.shtml
• Labaton, S., 2008, “Agency’s ’04 rule let banks pile up new debt and risk,” New York Times, October 3, A1, A23
• Landy, H., 2008, “Credit default swaps oversight nears,” Washington Post, November 15, D03
• Lewis, M., and D. Einhorn, 2009, “The end of the financial world as we know it,” New York Times, Week in Review, January 4, 9-10
• Lichtblau, E., 2008, “Federal cases of stock fraud drop sharply,” New York Times, December 25, A1
• Lipton, E., and R. Hernandez, 2008, “A champion of Wall Street reaps benefits,” New York Times, December 14, 1, 36
• Lorsch, J. W., L. Berlowitz, and A. Zelleke, 2005, Restoring trust in American business, MIT Press, Cambridge, MA
• Los Angeles Times, 2009, “John Thain defends Merrill Lynch bonuses amid Bank of America takeover,” January 27, retrieved January 28, 2009 from http://www.latimes.com/business/la-fi-thain27-2009jan27,0,476991.story
• Lowenstein, R., 2008, “Long-Term Capital Management: It’s a short term memory,” International Herald Tribune, September 7, retrieved December 29, 2008 from http://www.iht.com/articles/2008/09/07/business/07ltcm.php
• Monaghan, P., 2003, “Taking on ‘rational man’,” Chronicle of Higher Education, January 24, A12-A15
• Morgenson, G., 2008a, “Debt watchdogs: Tamed or caught napping,” New York Times, December 7, 1, 40
• Morgenson, G., 2008b, “Behind insurer’s crisis, blind eye to a web of risk,” New York Times, September 28, 1, 28
• Morgenson, G., 2009, “After huge losses, a move to reclaim executives’ pay,” New York Times, Sunday Business, February 22, 1, 7
• Moss, M., 2009, “Peanut case shows holes in food safety net,” New York Times, February 9, A1, A12
• Nocera, J., 2008, “How India avoided a crisis,” New York Times, December 20, B1, B8
• Nocera, J., 2009a, “Risk mismanagement,” New York Times Magazine, January 4, 23-50
• Nocera, J., 2009b, “Getting theirs: It’s not the bonus money. It’s the principle,” New York Times, Business Day, January 31, 1, 4
• Norris, F., 2009, “Failing upward at the Fed,” New York Times, Business Day, February 27, B1, B4
• Pearlstein, S., 2007, “No money down falls flat,” Washington Post, March 14, D01
• Phelps, C., 2009, “Beyond economic revival: which New Deal approaches are worth emulating?” Chronicle Review, February 13, B10-B12
• Philips, M., 2008, “The monster that ate Wall Street,” Newsweek, October 6, retrieved January 4, 2009 from http://www.newsweek.com/id/161199
• Pitelis, C., 2002, “On economics and business ethics,” Business Ethics: A European Review, 11(2), 111-118
• Robinson, J., 1977, “Morality and economics,” Commencement Address, retrieved July 9, 2007 from Economist’s View, http://economistsview.typepad.com/economistsview/2007/07/morality-and-ec.html
• Roosevelt, F. D., 1937, “Second Inaugural Address,” retrieved October 25, 2008 from Bartleby.com, http://www.bartleby.com/124/pres50.html
• Satow, J., 2008, “Default swaps may be next in credit crisis,” New York Sun, September 22, http://www.nysun.com/business/default-swaps-may-be-next-in-credit-crisis/86312/
• Siegel, J., 2009, “Lesson one: what really lies behind the financial crisis,” Knowledge@Wharton, January 21, retrieved January 22, 2009 from http://knowledge.wharton.upenn.edu/article/2148.cfm
• Skidelsky, R., 2008, “The remedist,” New York Times Magazine, December 14, 21-22
• Stolberg, S. G., and S. Labaton, 2009, “Banker bonuses are ‘shameful,’ Obama declares,” New York Times, January 30, A1, A20
• Story, L., 2008, “On Wall Street, bonuses, not profits, were real,” New York Times, December 18, A1, A35
• Suskind, R., 2008, “The crisis last time,” New York Times, OP-ED, September 25, A29
• Taleb, N. N., 2007, The black swan: the impact of the highly improbable, Random House, New York

54 – The   journal of financial transformation


Articles

What can leaders learn from cybernetics? Toward a strategic framework for managing complexity and risk in socio-technical systems

Alberto Gandolfi
Professor, Department of Management and Social Sciences, SUPSI University of Applied Sciences of Southern Switzerland

Abstract
How can leaders cope with the seemingly inexorable increase in
complexity of their world? Based on the cybernetic “law of requi-
site variety,” formulated almost fifty years ago by W. R. Ashby, we
propose a pragmatic framework that might help strategic decision
makers to face and manage complexity and risk in organizational,
political, and social systems. The law of requisite variety is a funda-
mental hypothesis in the general theory of regulation and defines
an upper limit to the controlling ability of a system, based on its
“variety” (complexity) level. The strategic framework discussed
here assumes that complexity breeds systemic risks and suggests
three mutually complementary approaches to managing complexity
at a strategic level: 1) where possible, reduce complexity in the sys-
tem to be controlled; 2) effectively manage residual complexity; and
3) increase complexity of the controlling system. Each approach is
then discussed, and practical examples provided.


Decision makers in literally every area are facing an increasing level of complexity. This inexorable trend has in turn led to increased difficulty in managing organizational, technological, political, and socio-economic systems and their related risks. Everything seems to become more complex, and therefore more difficult to manage and control: markets and business opportunities, corporate governance, projects, fiscal systems, technology, demands from all kinds of stakeholders, management systems, and supply chains.

Partly as a response to this important trend, the interdisciplinary study of the structure and behavior of complex systems — to which we will refer here as complexity science — has grown considerably over the last three decades and has brought relevant inputs to many scientific fields, such as biochemistry, ecology, physics, chemistry, genetics, and biology [see for example Kauffman (1993) for genetics and evolutionary biology or Pimm (1991) for ecology]. Many scholars have since tried to apply the instruments and analytical concepts developed by complexity science to improve the control of socio-technical systems, including organizational, economic, technological, social, and political systems [Iansiti and Levien (2004), Kelly (1994), Senge (1990), Malik (1984)]. Even though this transfer from "hard" to "soft" sciences has brought some interesting insights for the latter, a general model addressing complexity management in socio-technical systems is still lacking.

What we still need is a more pragmatic approach, enabling managers, politicians, and other decision makers in the real world to better understand and actively influence the behavior, and control the risks, of the systems they are supposed to control. Our research has led to a conceptual model for complexity management in social and socio-technical systems, which is outlined in this paper. The current model should not be considered complete, but rather work in progress that will be further enhanced and modified over the coming years, aiming primarily not at theoretical elegance but at usefulness in real decision-making environments.

Complexity and risk
Before we can proceed to explain our framework, it is necessary to clarify what we mean by the term "complexity." In this paper we adopt a relatively simple, management-oriented definition, viewing complexity from the perspective of the decision makers who are supposed to plan, manage, and control real socio-technical systems, surrounded by real environments. Although it is not in the scope of this paper to discuss the different meanings of "complexity," for our purposes two complementary dimensions of complexity become particularly relevant [Casti (1994), Malik (1984); see also Sommer and Loch (2004) for applications in project management, and Perrow (1999) for applications in risk management]:

■ The number of states of the system (system variety) — what does cybernetics mean by the "state" of a system? In short, it is a particular configuration of the different elements of a system, exhibiting a particular behavior [Ashby (1956)]. System variety depends, therefore, on the total number of different elements in the system and on the number of different internal (behavior-relevant) configurations of each element. For example, in a productive system, a state could be a particular combination of all relevant system elements: machines, raw material, components, people, schedule, data, procedures, and orders.
■ The interactions among elements (system dynamics) — leading to discontinuous and nonlinear changes in system behavior. From a decision-making point of view, the perceived complexity increases with the number of relationships between elements, and with the nonlinearity and opacity (i.e., non-transparency) of these relationships. Opacity occurs when an observer (in this case, the decision maker) is no longer able to determine clear and straightforward cause-effect relationships in the system dynamics. Nonlinearity occurs when the system changes in an erratic, discontinuous way, or when the size of a change is not correlated with the size of the input that caused it.

We can note that these two aspects represent, in a way, a conceptual extension of the NK-model for evolutionary fitness landscapes [Kauffman (1993)], where N is the number of elements of the landscape and K is the total number of relationships between elements. Following these preliminary considerations, in the remainder of this paper we will adopt this twofold definition of complexity.
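To make the first of these two dimensions concrete, a simple combinatorial bound can be written down (this formalization is our illustration, not notation taken from Ashby): if a system has N elements and element i can assume c_i behavior-relevant configurations, then the variety of the system, i.e., the number of states it can display, is bounded by the product of the c_i:

\[
V_{\text{system}} \;\le\; \prod_{i=1}^{N} c_i .
\]

Even a modest production cell of ten elements with four configurations each can thus reach up to \(4^{10} \approx 10^{6}\) states, which illustrates how quickly variety outgrows any decision maker's capacity to enumerate it.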


A very important point is the relationship between complexity and risk. Results from several research areas have consistently demonstrated that an increase in complexity breeds more, and more diverse, potential risks for a socio-technical system, as our daily experience also suggests [Johnson (2006)]. The dynamics of complex systems show a set of well-known features, such as feedback loops [Johnson (2006), Sterman (2000), Forrester (1971)], nonlinear behavior (phase transitions) [Perrow (1999), Sterman (2000), Ball (2005), Kauffman (1993)], network effects (network dynamics) [Barabasi (2002)], emergent properties (system effects, self-organization) [Johnson (2006), Holland (2000), Kauffman (1993), Ashby (1956)], delayed effects [Jervis (1997), Sterman (2000)], and indirect, unintended consequences (side effects) [Jervis (1997), Sterman (2000), Tenner (1997)].

As Jervis (1997, p. 29) summarized it: "many crucial effects are delayed and indirect; the relations between two actors often are determined by each one's relation with others; interactions are central and cannot be understood by additive operations; many outcomes are unintended; regulation is difficult."

Casti (1994) suggests that complexity generates risks for management and produces system 'surprises.' Such surprises make the system's behavior opaque, unstable, counterintuitive, sometimes chaotic, and eventually unpredictable [Forrester (1971)]. Under such circumstances, control, regulation, and, consequently, effective risk management are extremely difficult or even impossible to achieve. Errors and failures, and in the worst cases catastrophes, are the often unavoidable consequences [Choo (2008), Perrow (1999)].

Ashby's seminal hypothesis: "only variety can destroy variety"
In the 1950s, Ashby proposed a law of 'requisite variety,' which states that "only variety can destroy variety" [Ashby (1956)], where variety means the number of different states (or behaviors) a system can display. To fully understand this metaphor, one has to consider the cybernetic definition of control, as suggested by Ashby: to control a system means to reduce the variety of its potential behaviors. One controls a system if one is able to limit the number of different states the system may exhibit. In the extreme case, the system is entirely under control, such that it may exhibit only one particular behavior (or state), the one the controlling system has chosen [Hare (1967)]. An alternative way of understanding the law of requisite variety, sometimes called Cybernetics I [De Leeuw (1982)], is to view control as the ability to reduce or block the flow of variety from environmental disturbances, thus enabling an internal "physiological" stability and the survival of the system [Ashby (1956)].

In this paper, we will assume that the cybernetic concept of variety may be substituted by the broader (and bolder) term 'complexity' [Jackson (2000), Malik (1984), Van Court (1967)]. In fact, we believe the term complexity better captures the rich nature of real socio-technical systems, where variety is only one aspect of complexity [Johnson (2006)].
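Ashby's hypothesis can be stated compactly. In the usual cybernetic formulation (our rendering of the standard result, following Ashby (1956)): let \(V_D\) denote the variety of disturbances impinging on the system, \(V_R\) the variety of responses available to the regulator, and \(V_O\) the variety of outcomes that can still occur. Then

\[
V_O \;\ge\; \frac{V_D}{V_R}, \qquad \text{or, in logarithmic form,} \qquad \log V_O \;\ge\; \log V_D - \log V_R .
\]

Keeping outcomes within a narrow target set (a small \(V_O\)) is therefore possible only if the regulator's variety grows in step with the variety of disturbances; this is precisely why the three levers introduced below work on both sides of the inequality.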
The three levers
On the basis of the conceptual schema provided by the law of requisite variety, our research has led to the elaboration of a strategic framework with a pragmatic orientation. Its objective, and the criterion by which we will measure its effectiveness, is to help decision makers out in the field manage their complex work environment. The target audience for the strategic framework is those people faced with problems, environments, and situations of increasing complexity, such as board members, executives of private or publicly owned companies, consultants, politicians, planners, and project managers.

Van Court, in one of the rare comments on and conceptual developments of Ashby's law of requisite variety, noted that "there are only two major ways to adjust the ability of the analyst (or controller) to the requirements of the system to be controlled: (1) increase the variety of the controller, or (2) reduce the variety of the system to be controlled" [Van Court (1967)]. This idea was initially proposed by Stafford Beer with his seminal concept of 'variety engineering' [Beer (1979, 1981)]. Beer argued that management can balance its 'variety (complexity) equation' by using both 'variety reducers' for filtering out environmental variety and 'variety amplifiers' for increasing its own variety vis-à-vis its environment.

Basically, our model extends this approach, suggesting three methods, or strategic levers, for the decision maker to better manage, and survive, complexity. Assuming that a controlling system C wants to control and manage a system S, the three strategic levers are:

■ First lever — reduce complexity in the controlled system S.
■ Second lever — manage the remaining complexity.
■ Third lever — increase complexity of the controlling system C, and understand and accept complexity.

First lever — reduce complexity in the controlled system S
The first lever suggested by our strategic framework aims to reduce the complexity of the controlled system — where and when such a reduction is possible, realistic, and adequate. One can adopt different approaches in order to reduce the overall complexity of a system, such as:

Complexity transfer — in this first approach a certain 'amount of complexity' is transferred elsewhere, outside the system to be controlled.

■ Example 1 — outsourcing decisions usually allow a considerable amount of the organizational and technical complexity of a process to be transferred to a specialized partner outside the company. Here, it is interesting to note that the overall net management burden and costs are often reduced, due to the specialization of the outsourcing partner in a few specific processes, such as transportation of goods, IT management, or market research. The result is that the complexity decrease in the outsourcing company is greater than the complexity increase in the specialized partner.
■ Example 2 — the introduction of traffic circles leads to a substantial reduction in the complexity of local traffic management by public authorities, usually achieved by traffic lights. In this very successful case, the complexity of coordination has been effectively distributed to thousands of individual car drivers, each of them managing his/her own space- and time-limited complexity when approaching the crossing.
■ Example 3 — from an industrial point of view, Alberti has suggested a general concept for the reduction of complexity in production planning and production management [Alberti (1996)], which includes the transfer of complexity from department and plant managers to other elements of the manufacturing system, such as product structure, production infrastructure (i.e., factory layout), workers, and suppliers.

Simplification of system elements (reduction of both number and variety of elements) — in this approach, complexity is reduced by standardizing and rationalizing the individual elements of a socio-technical system. The result is a reduction in the overall number and variety of elements of the system, and hence of system variety [Gottfredson and Aspinall (2005), Schnedl and Schweizer (2000)].

■ Example 1 — some airlines have dramatically reduced the number of different planes in their fleet. Southwest Airlines, for example, ended up having only one kind of plane [Freiberg and Freiberg (1996)]. This leads to reduced complexity in training pilots, crew, and technical personnel, in maintenance and safety management, and in purchasing new planes and spare parts.
■ Example 2 — standardizing the documents that are created and circulated in an organization is also a complexity-reducing measure. For example, a company could decide that all sales reps use only one standard format to register new sales orders. For them and for their colleagues from all other departments this would mean a less demanding task, reduced error probability, and a reduction in training time. By the same token, imposing a standardization of the different software packages used by an organization will lead to a significant complexity reduction, particularly in the IT department and at the help desk.

Simplification of the interaction between elements (reduction of dynamical complexity) — in complex systems the relationships between elements are usually numerous, nonlinear, and nontransparent. This approach focuses on moving the system in the opposite direction, i.e., reducing the number of interactions and making them more linear or more transparent for the decision maker. The objective is to obtain a system with lower overall dynamic complexity [Gottfredson and Aspinall (2005)].

■ Example — a growing number of project leaders have adopted web-based communication platforms (often with a "pull" philosophy for getting the data) to better manage communication and collaboration in complex projects with many stakeholders involved. Such a tool may help to organize and simplify the communication flow between project members, which is usually characterized by an overwhelming number of disturbing emails, the coexistence of different versions of the same document, and frequent mistakes. Here, complexity reduction is obtained by reducing the number of interactions in the communication network between project members, that is, moving from an n:n communication topology (everybody interacts with everybody) to an n:1 topology (every project member interacts only with the communication platform). The channel count sketched below makes this gain explicit.
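The gain from this rewiring is easy to quantify (a textbook count, not a figure from our research). With n project members, an n:n topology admits up to

\[
\binom{n}{2} \;=\; \frac{n(n-1)}{2}
\]

bidirectional communication channels, whereas the n:1 platform topology needs only n. For a project with n = 15 stakeholders, that is 105 potential channels versus 15, a sevenfold reduction in dynamic complexity before any message content is even considered.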


Second lever — managing residual complexity
The second approach suggested by our framework concentrates on managing the complexity that remains in the controlled system. Obviously, the potential for reducing complexity in the controlled system has its limits, due either to physical-structural boundaries or to limitations in the actual power of the controlling system (in this case, the decision maker managing a complex socio-technical system). In recent decades several scientific disciplines have developed supporting tools and concepts that can be extremely helpful in enabling decision makers to better manage existing socio-technical complexities. We will focus here on two such tools and concepts: using decision support software and segmenting complexity levels.

Decision support software — we refer here to software tools that enable the effective simulation and optimization of the behavior of complex systems. One should distinguish between (a) general-purpose, generic applications and (b) task- or system-specific applications (i.e., for logistics, transportation, or production scheduling). Indeed, even many functionalities of traditional Enterprise Resource Planning (ERP) applications aim to reduce complexity for decision makers in organizations by enabling centralized, non-redundant, fully integrated data management [Davenport and Harris (2005)].

Segmenting complexity levels — another approach tries to segment (or segregate) complexity, that is, to set apart different levels of complexity in different processes, with specific rules, procedures, resources, and strategies. The segmentation of process complexity is one of the basic tenets of Business Process Reengineering (BPR) [Hammer and Champy (1993)]. Many insurance companies, for example, have subdivided the process of evaluating requests for compensation according to the complexity of the individual cases: simple cases follow a simplified procedure, handled by less specialized collaborators, while more complex cases are managed by a team of specialists through a more elaborate process. This permits the allocation of complex tasks to the best (and most expensive) specialized resources, while avoiding the overburdening of these resources with numerous simple and non-critical cases. A minimal sketch of such a routing rule follows.
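The sketch below, written in Python, is purely illustrative: the claim fields, weights, and thresholds are hypothetical, chosen only to show the kind of routing logic an insurer might apply when segmenting complexity levels.

    # Illustrative only: route compensation claims by a crude complexity score,
    # so that scarce specialist capacity is reserved for genuinely hard cases.
    from dataclasses import dataclass

    @dataclass
    class Claim:
        amount: float      # claimed compensation
        n_parties: int     # number of parties involved
        disputed: bool     # is liability contested?

    def complexity_score(claim: Claim) -> int:
        """Additive score; the weights and cut-offs here are hypothetical."""
        score = 0
        if claim.amount > 50_000:
            score += 2
        if claim.n_parties > 2:
            score += 1
        if claim.disputed:
            score += 2
        return score

    def route(claim: Claim) -> str:
        """Segmentation rule: simple cases follow the standard procedure,
        complex ones go to the (expensive) specialist team."""
        return "specialist team" if complexity_score(claim) >= 3 else "standard process"

    print(route(Claim(amount=1_200.0, n_parties=1, disputed=False)))  # -> standard process
    print(route(Claim(amount=80_000.0, n_parties=3, disputed=True)))  # -> specialist team

The point of the design is not the particular score but the separation itself: the standard path can be optimized for throughput, while the specialist path is optimized for judgment.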
Third lever — increase the complexity of the decision maker
After having reduced the complexity of the controlled system, and having better managed the remaining complexity of the system, a third approach suggests increasing the variety (or complexity) of the controlling system, the decision-making body. What does "increasing the complexity of the decision maker" mean? According to our working definition of complexity, to increase complexity means here to increase the number of potential perspectives, opinions, and decisions, and the variety of approaches for selecting among them the best, most effective one. This translates into two complementary options: (a) increase the number or the (internal) complexity of the elements of the decision-making system, and (b) improve the number and variety of the interactions among these elements. In real life — say, for a manager who has to lead a complex organization — we can envisage the following approaches to reach this goal.

Increase the "decision complexity" of human resources — through different training and development measures, one has to enable individual employees and managers to think and act in a more complex way: considering more potential frames and solution ideas when solving a problem; understanding the relationship between the activities and objectives of different processes, teams, and departments; communicating and collaborating with others more effectively; acting as an interconnected element serving the long-term interests of the whole system; and deliberately delaying judgment and remaining open to alternative systems [McKenzie et al. (2009)]. For example, decision makers should understand and accept complexity, the logic behind the structure and behavior of complex socio-technical systems, and its profound consequences for the management of those systems. One essential conceptual frame for understanding complexity is System Thinking [Sterman (2000), Vester (1999), Senge (1990)]. Basically, System Thinking is a holistic way of seeing and interpreting the world in terms of systems, made up of networks of interconnected elements. Often the emergent behavior of a system is determined by the relationships between elements, rather than by the behavior of the individual elements. System Thinking has been implicitly adopted and integrated in strategic planning tools such as the well-known Balanced Scorecard (BSC) [Kaplan and Norton (2001)], developed in order to escape the limits of traditional strategic planning, which was too focused on financial indicators — and hence not systemic. The BSC, on the contrary, recognizes the role of many other dimensions and objectives (market, internal, and learning and growth objectives) in the long-term success of an organization.

Increase the variety of the whole decision system [Schwaninger (2001)] — a first, simple step is to go from individual to collective decision making [Van Buuren and Edelenbos (2005)]. Further, one can increase complexity by diversifying the decision-making group along different dimensions: diversity of age, professional or cultural background, thinking style, race, sex, knowledge of the problem, role in the organization, experience, and interests. A more diverse team is potentially able to generate more diverse mental states, solutions, or decisions [Page (2007)]. For example, some innovative companies have made their research and development teams more diverse by integrating "outsiders," that is, people with little or no knowledge (or a very naïve knowledge) of a specific technological product, such as children, older people, housewives, and artists. At the opposite end, there are clues which show that some big decision mistakes might be traced back to an insufficient variety of decision bodies (i.e., boards of directors). Schütz (1999) describes the case of the Union Bank of Switzerland (UBS) in the 1990s. He shows that the board of directors of UBS was in those years very homogeneous: members were all men, Swiss, more than 50 years old, army officers, with an economic or legal background, having followed a traditional banking career. This complexity (variety) level was apparently inadequate to cope with the increasing complexity of the financial market environment. Schütz argues that such a board could not effectively control or manage some critical and complex processes of the bank. Eventually, UBS ended up in serious financial distress and had to accept a merger with a smaller and less powerful competitor, SBS (the new bank maintained the name UBS, although most of the new management came from SBS).
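Page (2007), cited above, makes the benefit of diversity precise with an algebraic identity sometimes called the diversity prediction theorem. If n group members give estimates \(s_i\) of an unknown quantity \(\theta\), and \(\bar{s}\) is their average, then

\[
\underbrace{(\bar{s}-\theta)^2}_{\text{collective error}} \;=\; \underbrace{\frac{1}{n}\sum_{i=1}^{n}(s_i-\theta)^2}_{\text{average individual error}} \;-\; \underbrace{\frac{1}{n}\sum_{i=1}^{n}(s_i-\bar{s})^2}_{\text{diversity of estimates}} .
\]

For a given level of individual accuracy, greater dispersion of views mechanically reduces the group's error, a formal counterpart to the homogeneous-board failure in the UBS example above.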
Improve knowledge management in the organization — more knowledge enables access to more diverse mental states and more dynamic interactions, and thus, potentially, to more complex decisions. In this way, effective knowledge management can be a powerful tool for increasing the complexity of decision making throughout the organization.

Create networks and decide and operate in networks — there is some evidence that a network can generate higher decision and behavior complexity than other topological configurations, such as hierarchies or chains, or a single decision maker [Surowiecki (2004), Vester (1999), Kelly (1994), Peters (1992)]. One of the most effective ways of dealing with very complex problems is to create a network of different and complementary competencies, as in the case of scientific research, where the trend is to build large research networks (competence centers) on specific complex topics. At the Civico City Hospital of Lugano (Switzerland), a medical competence center for the diagnosis and therapy of breast cancer was established in 2005, in order to integrate into a virtual network all the involved specialists (oncologists, radiologists, surgeons, pharmacologists, nurses, psychologists, and researchers). One of the most important features of such networks is a massive improvement in communication between the nodes, which leads to an increase in the relationship complexity of the system [Evans and Wolf (2005)]. The effectiveness of networked decision making can be further improved by implementing methodologies and approaches designed to foster cohesion, communication, and synergy in larger groups of individuals, as defined for example by Stafford Beer with his Team Syntegrity Model [Schwaninger (2001)].

Conclusion
Complexity and the risks of markets, legal systems, technological landscapes, and environmental and social issues, to mention only a few, will further increase in the coming decades. Drawing on the insightful intuition of W. R. Ashby, this paper has outlined a pragmatic strategic framework for managing complexity in socio-technical systems, as summarized in Figure 1. The model should be further improved in the next few years; very important progress is to be expected if researchers eventually become able to measure complexity levels, or at least to quantify changes in the complexity levels of socio-technical systems.

We believe that the application of the concepts and insights of complexity science to management still has a huge development potential. We must narrow the gap between the two worlds: complexity science and system thinking on the one hand, and "down-to-earth," empirical management reality on the other. We also believe that the model discussed in this paper can be a humble contribution to this fascinating challenge on the edge of these two worlds.

Strategic lever | Some approaches
First lever — reduce complexity in the controlled system S | Complexity transfer; simplification of system elements (reduction of variety); simplification of the interaction between elements (reduction of dynamical complexity)
Second lever — manage remaining complexity | Decision support applications; segmenting complexity levels; …
Third lever — increase complexity of the controlling system C | Increasing the decision complexity of human resources; increasing the variety of the whole decision system; improving knowledge management; deciding and operating in networks

Figure 1 – Overview of the strategic framework to better manage complexity in socio-technical systems – the three strategic levers and the corresponding approaches (the list is not exhaustive)

References
• Alberti, G., 1996, Erarbeitung von optimalen Produktionsinfrastrukturen und Planungs- und Steuerungssystemen, Dissertation Nr. 11841, ETH, Zürich
• Ashby, W. R., 1956, An introduction to cybernetics, Chapman & Hall, London
• Beer, S., 1985, Diagnosing the system, Wiley, New York
• Beer, S., 1981, Brain of the firm, second edition, Wiley, Chichester
• Beer, S., 1979, The heart of enterprise, Wiley, Chichester
• Casti, J., 1994, Complexification, Harper Collins, New York
• Choo, C. W., 2008, "Organizational disasters: why they happen and how they may be prevented," Management Decision, 46:1, 32-45
• Davenport, T. H., and J. G. Harris, 2005, "Automated decision making comes of age," Sloan Management Review, Summer, 83-89
• De Leeuw, A., 1982, "The control paradigm: an integrating systems concept in organization theory," in Ansoff, H. I., A. Bosman, and P. M. Storm (eds.), Understanding and managing strategic change, North Holland, New York
• De Leeuw, A., and H. Volberda, 1996, "On the concept of flexibility: a dual control perspective," Omega, International Journal of Management Sciences, 24:2, 121-139
• Evans, P., and B. Wolf, 2005, "Collaboration rules," Harvard Business Review, July-August, 96-104
• Forrester, J. W., 1971, "Counterintuitive behavior of social systems," Theory and Decision, 2:2, 109-140
• Freiberg, K. L., and J. A. Freiberg, 1996, Nuts!, Bard Press, New York
• Gandolfi, A., 2004, La foresta delle decisioni, Casagrande, Bellinzona
• Gottfredson, M., and K. Aspinall, 2005, "Innovation versus complexity: what is too much of a good thing?" Harvard Business Review, November, 62-71
• Hammer, M., and J. Champy, 1993, Reengineering the corporation, Harper Business, New York
• Holland, J. H., 2000, Emergence – from chaos to order, Oxford University Press, Oxford
• Iansiti, M., and R. Levien, 2004, The keystone advantage, Harvard Business School Press, Boston
• Jackson, M. C., 2000, Systems approaches to management, Kluwer Academic, New York
• Jervis, R., 1997, System effects, Princeton University Press, Princeton
• Johnson, J., 2006, "Can complexity help us better understand risk?" Risk Management, 8, 227-267
• Kauffman, S. A., 1993, The origins of order, Oxford University Press, Oxford
• Kaplan, R. S., and D. P. Norton, 2001, The strategy-focused organization, Harvard Business Press, Boston
• Kelly, K., 1994, Out of control: the new biology of machines, social systems and the economic world, Addison-Wesley, Reading
• Küppers, B. O., 1987, "Die Komplexität des Lebendigen," in Küppers, B. O. (ed.), Ordnung aus dem Chaos, Piper, München
• Malik, F., 1984, Strategie des Managements komplexer Systeme, Haupt, Bern
• McKenzie, J., N. Woolf, and C. Van Winkelen, 2009, "Cognition in strategic decision making," Management Decision, 47:2, 209-232
• Mintzberg, H., 1994, The rise and fall of strategic planning, The Free Press, New York
• Ormerod, P., 1998, Butterfly economics, Faber and Faber, London
• Page, S., 2007, The difference, Princeton University Press, Princeton
• Perrow, C., 1999, Normal accidents, Princeton University Press, Princeton
• Peters, T., 1992, Liberation management, Fawcett Columbine, New York
• Pimm, S. L., 1991, The balance of nature?, University of Chicago Press, Chicago
• Schnedl, W., and M. Schweizer, 2000, Dynamic IT management in the 21st century, PricewaterhouseCoopers
• Schütz, D., 1999, Der Fall der UBS, Weltwoche-ABC-Verlag, Zürich
• Schwaninger, M., 2001, "Intelligent organizations: an integrative approach," Systems Research and Behavioral Science, 18, 137-158
• Senge, P. M., 1990, The fifth discipline, Doubleday, New York
• Sommer, S. C., and C. H. Loch, 2004, "Selectionism and learning in projects with complexity and unforeseeable uncertainty," Management Science, 50:10, 1334-1347
• Sterman, J., 2000, Business dynamics, Irwin McGraw-Hill, Boston
• Surowiecki, J., 2004, The wisdom of crowds, Little Brown, London
• Tenner, E., 1997, Why things bite back, Vintage Books, New York
• Van Buuren, A., and J. Edelenbos, 2005, "Innovations in the Dutch polder," Science and Society Conference, Liverpool, 11-15 September
• Van Court, H., 1967, Systems analysis: a diagnostic approach, Harcourt, Brace & World, New York
• Vester, F., 1999, Die Kunst vernetzt zu denken, DVA, Stuttgart


Articles

Financial stability, fair value accounting, and procyclicality1

Alicia Novoa
Economist, Financial Oversight Division, IMF

Jodi Scarlata
Deputy Division Chief, Financial Analysis Division,
IMF

Juan Solé
Economist, Global Financial Stability Division, IMF

Abstract
In light of the uncertainties about valuation highlighted by the
2007–2008 market turbulence, this paper provides an empirical
examination of the potential procyclicality that fair value account-
ing (FVA) could introduce in bank balance sheets. The paper finds
that, while weaknesses in the FVA methodology may introduce
unintended procyclicality, it is still the preferred framework for
financial institutions. It concludes that capital buffers, forward-
looking provisioning, and more refined disclosures can mitigate the
procyclicality of FVA. Going forward, the valuation approaches for
accounting, prudential measures, and risk management need to be
reconciled and will require adjustments on the part of all parties.

1 The authors are thankful to Kenneth Sullivan, and participants at the 10th Accountants Forum of the IMF, as well as a number of staff of the IMF, for their comments and suggestions. We are also indebted to Yoon Sook Kim and Xiaobo Shao for their research and technical support.

Since the 2007 market turmoil surrounding complex structured credit products, fair value accounting and its application through the business cycle have been a topic of considerable debate. As the illiquidity of certain products became more severe, financial institutions turned increasingly to model-based valuations that, despite increased disclosure requirements, were nevertheless accompanied by growing opacity in the classification of products across the fair value spectrum. Moreover, under stressed liquidity conditions, financial institutions made wider use of unobservable inputs in their valuations, increasing uncertainty among financial institutions, supervisors, and investors regarding the valuation of financial products under such conditions.

It has been during this period that the procyclical impact of fair value accounting on bank balance sheets, and, more specifically, the valuation of complex financial instruments in illiquid markets, came to the fore, raising questions about the use of market prices below "theoretical valuation" and the validity of "distressed sales." Financial products were fair valued despite concerns that current market prices were not an accurate reflection of the products' underlying cash flows or of the price at which the instruments might eventually be sold. Sales decisions based on fair value pricing in a weak market with already falling prices resulted in further declines in market prices, reflecting a market illiquidity premium. Additionally, falling prices can, and did, activate margin calls and sale triggers that are components of risk management criteria, contributing further to the downward trend. As bank net worth is positively correlated with the business cycle, and as the fair market values of collateral fall, losses have been passed through to banks' capital [Kashyap (2005)]. The weakening of bank balance sheets and regulatory requirements for prudential capital replenishment have served to heighten concerns as to the future course of some markets, the health of banks, and, more broadly, the financial system.

This paper reviews the principles and application of fair value accounting (FVA), the implications of its features, and how these impact bank balance sheets. Using a simple model, it provides empirical support for the public discussions regarding the procyclicality of FVA on bank balance sheets. Utilizing representative bank balance sheets from a sample of actual institutions, it examines the application of FVA to banks' balance sheets during the course of a normal business cycle, as well as during extreme shocks such as those that have recently occurred, to distill in what manner FVA may contribute to procyclicality. The paper examines the results obtained, discusses actual and proposed alternatives to FVA, and elaborates on policy implications going forward.
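The mechanism that such simulations explore can be sketched in a few lines of code. The snippet below is our own stylized illustration, emphatically not the authors' simulation model, of how one and the same market price shock reaches capital immediately under fair value measurement but only later, via impairment, under amortized cost:

    # Stylized illustration (not the paper's model): one price shock,
    # two measurement regimes, two very different capital paths.

    def capital_after_shock(trading_book: float, banking_book: float,
                            capital: float, price_shock: float,
                            impairment: float = 0.0) -> tuple[float, float]:
        """Return post-shock capital under (i) fair value accounting applied
        to the trading book and (ii) amortized cost on the banking book.

        price_shock: fractional change in market prices (e.g., -0.10)
        impairment:  fraction of the banking book written down, recognized
                     only once a loss event is judged to have occurred
        """
        fva_capital = capital + trading_book * price_shock   # loss hits capital at once
        cost_capital = capital - banking_book * impairment   # loss deferred until impairment
        return fva_capital, cost_capital

    # A 10 percent fall in market prices: under FVA the loss passes straight
    # through to capital; under amortized cost nothing happens until an
    # impairment is booked, typically late in the downturn.
    fva, cost = capital_after_shock(trading_book=40.0, banking_book=60.0,
                                    capital=10.0, price_shock=-0.10)
    print(fva, cost)  # 6.0 10.0

Procyclicality enters when the lower fair value capital figure forces deleveraging: asset sales that depress prices further and feed the next round of the same calculation.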
tion to balance sheets produces accounting volatility and may not
The paper finds that, while the application of FVA methodology fully capture the effects of economic events in all instruments
introduces unwanted volatility and measurement difficulties, FVA included in the banks’ financial statements.
nevertheless is the correct direction for proceeding as it provides

2 Non-derivative financial assets with fixed or determinable payments and fixed maturity that an entity has the intention and ability to hold to maturity.
3 Namely, when they are risk-managed on a FV basis, though differences remain between FAS 159 and IAS 39.
4 The paper utilizes the major classification of financial assets as set out by the IASB and FASB, recognizing that there are additional, specific categories of instruments subject to fair valuation.

What is fair value?
IFRS and U.S. GAAP similarly define FV as the amount for which an asset could be exchanged, or a liability settled, between knowledgeable, willing parties in an arm's length, orderly transaction. U.S. GAAP (FAS 157) is more prescriptive than IFRS because it considers FV to be an "exit" or "selling" price5. Both accounting frameworks prescribe a hierarchy of FV methodologies that starts with observable prices in active markets (Level 1), moves to prices for similar instruments in active or inactive markets or to valuation models using observable inputs (Level 2), and ends with mark-to-model methodologies relying on unobservable inputs and model assumptions (Level 3)6. The absence of market prices, trading activity, or comparable instruments' prices and inputs is a prominent feature of complex structured credit products, many of which are held off balance sheet. Consequently, both frameworks require extensive disclosure of information on the FV methodologies used, specific assumptions, risk exposures, sensitivities, etc.

Thus defined, FV does not require the presence of deep and liquid markets to be applied. FV can be estimated when a market does not exist, as FV valuation models comprise the expected, risk-discounted cash flows that market participants could obtain from a financial instrument at a certain point in time. While FV incorporates forward-looking assessments, it must also reflect current market conditions and measures of risk-return factors7, and incorporate all factors that market participants consider relevant, with firm-specific risk preferences or inputs kept to a minimum. Under this definition, two key issues underlying the FV methodology present a challenge: what constitutes an active market, and what can be considered an observable price or input.
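In its simplest stylized form (our notation, not a formula prescribed by either accounting framework), such a mark-to-model valuation discounts the cash flows market participants expect at the valuation date at a rate embedding current premia:

\[
FV_t \;=\; \sum_{k=1}^{K} \frac{\mathbb{E}_t\!\left[ CF_{t+k} \right]}{\bigl(1 + r_f + \delta_t\bigr)^{k}} ,
\]

where \(r_f\) is the risk-free rate and \(\delta_t\) collects the credit and liquidity premia prevailing at time \(t\). It is through \(\delta_t\) that current market conditions enter even a Level 3 valuation: in distressed markets, a swollen liquidity premium depresses \(FV_t\) regardless of expected cash flows.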

Forced or "fire" sales would not be valid determinants of market prices, because the accounting frameworks presume that a reporting entity is a going concern that does not need or intend to liquidate its assets or materially curtail the scale of its operations. Yet accounting standard setters have decided to leave to judgment (namely, that of management, supervisors, and auditors) how to determine "regularly occurring" or "distressed" sales, and when sales in thin markets, at heavy discounts, could be used for balance sheets' FVA8. Consequently, market participants and supervisors would expect banks' external auditors to take a very cautious approach to examining the prices and inputs used to fair value financial instruments, in order to minimize late write-downs or write-offs and opportunities for management to "cherry-pick" the accounting treatment of financial instruments.

Disclosures of FVA
Both IFRS and U.S. GAAP mandate various disclosures, particularly when information other than market inputs is used to estimate FV. For example, IFRS 7 requires disclosure (i) if the transaction price of a financial instrument differs from its FV when it is first recorded in the balance sheet; and (ii) of the implications of using "reasonably possible alternative assumptions" to reflect the sensitivities of FV measurement9. IFRS 7 also contains reporting requirements that include the publication of sensitivity tests for individual items of the financial statements. Similarly, FAS 157 requires banks' balance sheets to be sufficiently clear and transparent as to fully explain to market participants, through quantitative and qualitative notes to the financial statements, the nature of the changes and the methodologies used, to name a few items10.

Although some U.S. and European Union (E.U.) financial institutions voluntarily provide such disclosures, neither IFRS nor U.S. GAAP require disclosure of the governance and management control processes11 surrounding FV valuation12. Enhanced disclosure in this direction could increase confidence in banks' balance sheets and lower investors' aversion to transacting in instruments whose valuations may not be well understood13. This would not necessarily indicate a need for more disclosures, but for a more appropriate composition, medium (i.e., websites), and frequency of disclosures.

Along this line, at the request of the Financial Stability Forum (FSF), a Senior Supervisors Group conducted a survey of disclosure practices for selected financial exposures, such as special-purpose entities and collateralized debt obligations, among others, and issued a report concluding that currently observed disclosure practices could be enhanced without amending existing accounting disclosure requirements14. The FSF is encouraging financial institutions to use these disclosure practices for their 2008 financial reports and urging supervisors to improve risk disclosure requirements in Pillar 3 of Basel II.

5 Nevertheless, differences are disappearing given the international convergence to IFRS currently underway, led jointly by the International Accounting Standards Board (IASB) and the U.S. Financial Accounting Standards Board (FASB), which is aimed to achieve a single set of high-quality international accounting standards.
6 This language is U.S. GAAP-specific and not IFRS, but it is used extensively in the banking industry and in financial statements of IFRS users as well.
7 IFRS do not explicitly mention some risk factors (i.e., counterparty credit risk, liquidity risk), which may have added confusion to financial statement preparers during the 2007–08 turmoil. An International Accounting Standards Board Expert Advisory Group is currently working on this and other FV issues. The U.S. Financial Accounting Standards Board is reevaluating some disclosure requirements (i.e., credit derivatives) and has issued new standards (i.e., FAS 161 on derivatives and hedging). Both Boards are working jointly on FV issues and examining requirements for off-balance sheet entities as well.
8 White papers prepared by the six largest international audit firms and other audit firms summarize guidance on what constitutes an active market, FV measurement in illiquid markets, and forced sales, CAQ (2007) and GPPC (2007). Further, on September 30, 2008, the U.S. SEC jointly with the U.S. FASB issued new guidance clarifying the use of FVA under the current environment and, on October 10, 2008, the U.S. FASB staff issued Staff Position No. 157-3 providing guidance on how to determine the FV of a financial asset when the market for that asset is not active.
9 IFRS 7, "Financial instruments: disclosures," became effective on January 1, 2007.
10 For those financial assets measured at amortized cost, the entity must also disclose the FV in the notes to the statements.
11 Including audit-related programs.
12 The FSF recommends disclosures about price verification processes to enhance governance and controls over valuations and related disclosures. Disclosures regarding risk management governance structures and controls would also be welcome.
13 Examples are the U.S. SEC letters of March 2007 and March 2008 to major financial institutions outlining the nature of recommended disclosures and the most current letter of September 16, 2008.
14 "Leading-practice disclosures for selected exposures," April 11, 2008. Twenty large, internationally oriented financial firms were surveyed (15 banks and five securities firms) as of end-2007.

A preliminary reading of financial reports prepared for mid-2008 by some U.S., European Union, and Canadian banks shows that U.S. banks are including more quantitative notes in their financial statements than in their end-2007 reporting15, typically providing information on financial assets securitized, cash flows received on Special Purpose Entity (SPE) retained interests, assets in non-consolidated variable-interest entities (VIEs), and maximum exposures to loss in consolidated and non-consolidated VIEs, with details broken down by instrument.

Figure 1 – Aggregate fair value hierarchy, end-2007 (in percent)

                                     Level 1    Level 2    Level 3
U.S. financial institutions             22         72          6
European financial institutions         28         67          5
Total                                   25         69          6

Source: Fitch Ratings

Volatility and procyclicality of FVA
Barth (1994 and 2004) argues that there are three potential channels through which FV may introduce volatility into financial statements. The first is the volatility associated with changes in the underlying economic parameters. The second is the volatility produced by measurement errors and/or changing views regarding economic prospects throughout the business cycle. As to the third, volatility may be introduced by relying on the "mixed attributes" model, which applies FVA to certain instruments and amortized cost to others, reducing the netting effect that full fair valuation of assets and liabilities would produce16. Each of these sources of volatility is either explicitly or implicitly present in the simulation exercises examined later in the paper.

The mixed attributes model adopted by IFRS and U.S. GAAP has embedded volatility and procyclicality aspects17. On the one hand, historical cost accounting, applicable to HTM investments and loans, is less volatile and also backward looking. When such an investment or loan is correctly priced at origination, its FV equals its face value. Over the life of the asset and until maturity, its reported stream of profits is stable and its carrying value is based on its value at origination. But if market conditions negatively affect these portfolios and there is evidence of a credit loss event and asset impairment, then the reported values must be reassessed and provisions for losses must be accrued or write-offs recorded. The latter is often a late recognition of excess risk taken earlier, in good times. In this sense, historical costs are subject to a backward-looking assessment of value. Thus, the amortization of loans, when combined with procyclical provisioning, often coincides with a downturn of the economic cycle, adding to stresses.

On the other hand, FVA introduces more volatility in earnings and capital during the life of an asset or liability than historical cost accounting, and incorporates forward-looking assessments18. Gains and losses in fair-valued instruments can generally affect the income statement, and this increased volatility of FVA and the resulting procyclical effects may create incentives for banks to restructure their balance sheets (i.e., lower loan originations, higher/lower securitization, the introduction of hedging, etc.)19. Nevertheless, higher FV volatility per se would not necessarily be a problem if market participants were well informed and could correctly interpret the information provided in the financial statements. In this sense, increased volatility may be thought of as part of the process of fair valuing financial instruments, and a reflection of genuine economic volatility, not as a cause itself of procyclicality.

However, in some cases the symmetrical treatment within FVA produces seemingly misleading results. For example, the use of FVA on a bank's own debt, where the price of the bank's bonds and notes falls due to a decline in its own creditworthiness, will result in a gain that must be recognized in the bank's financial statements, equal to the difference between the original value of the debt and its market price. As counterintuitive as this situation may be, it is still a faithful representation of FV, and it is a signal to supervisors and other users of financial statements to have appropriate tools (i.e., prudential filters)20 for understanding the implications of FVA and the impact on regulatory capital.
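A stylized numerical example of this own-debt effect (our numbers, purely illustrative): suppose a bank issued a bond at par, carried at 100, and the bond's market price falls to 92 solely because the bank's own creditworthiness has deteriorated. Under FV measurement of the liability, the bank books

\[
\text{FV gain} \;=\; 100 - 92 \;=\; 8
\]

through its income statement, even though nothing about its obligations has improved. This is exactly the kind of gain a prudential filter (see footnote 20) would strip back out of regulatory own funds.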
As valuation moves from market prices to mark-to-model valuation, FVA poses reliability challenges to which markets, particularly under distress, are sensitive21. These "subjective" aspects of FVA may compound market illiquidity or price spirals if they increase uncertainty around valuations. In both the United States and the European Union, financial institutions' balance sheets are heavily represented in Level 2 instruments, a possible indication that financial institutions are biased towards using Level 2 methods, due to their flexibility, as well as a desire to avoid "obscure" Level 3 assets and liabilities (Figure 1). Falling valuations can activate certain management decision rules that trigger the liquidation of certain assets or portfolios, adding additional stress. Hence, there is a need for good risk management practices to be consistent with FV mark-to-model valuations. Clear and transparent quantitative and qualitative notes to the financial statements regarding the nature of the changes and the methodologies used could enhance the reliability of mark-to-model valuations.

15 Canada has postponed adoption of the full International Financial Reporting Standards until 2011.
16 Barth (2004) argues that mixed-attributes models impair the relevance and reliability of financial statements and that this constitutes one of the primary reasons behind hedge accounting. IAS 39 was aimed to alleviate mismatches in assets and liabilities valuations due to the mixed-attributes model and the complexities of hedge accounting.
17 It should be noted that procyclicality of accounting and reporting standards existed prior to the recent attention to FVA. It has long been recognized that as the business cycle and market sentiment change, so too will valuations of assets and liabilities.
18 IFRS and U.S. GAAP accounting standards – and FVA is no exception – are applicable to reporting entities irrespective of their size or systemic importance.
19 One intention of the FVO in both accounting frameworks is to enable entities to reduce accounting mismatches by applying FV on matching assets and liabilities.
20 Bank supervisors use prudential filters as a tool to adjust changes in the (accounting) equity of a bank due to the application of the accounting framework, so that the quality of regulatory capital may be properly assessed. For example, when the gains that result from a deterioration in a bank's own creditworthiness (fair valued liability) are included in a bank's prudential own funds, they must be "filtered out" by the supervisor in order to determine the true amount of regulatory own funds.
21 In principle, valuations are thus better aligned with the prevailing mark-to-model techniques used in risk management.

Although more volatile, FVA could play a role in partially mitigating a crisis if warning signals are heeded, thereby helping markets to recover before damaging self-fulfilling downturns worsen. FVA that captures and reflects current market conditions on a timely basis could lead to a better identification of a bank's risk profile, if better information is provided. An earlier warning that can prompt corrective action by shareholders, management, and supervisors allows for a timelier assessment of the impact of banks' risky actions on regulatory capital and financial stability. Moreover, since FVA should lead to earlier recognition of bank losses, it could have a less protracted impact on the economy than, for example, loan portfolios whose provisions for losses are usually made when the economy is already weak22. Raising new capital at an earlier stage might enable banks to retain written-down assets, or other assets originally not for sale, on their balance sheets and thus to avoid asset price spirals.

On the prudential front, the negative impact of the vastly lower valuations stemming from recent market conditions raises questions as to whether increases in regulatory capital may be needed for complex structured products, off-balance sheet entities (OBSEs), or other risks. Basel II, Pillar 2 guidance could encourage banks to pay greater attention to FV during periods of falling or rising asset prices, so that they may better control for the procyclical aspects of FVA. Pillar 3 disclosures could improve the transparency of valuations, methodologies, and uncertainties. Nevertheless, FVA can serve as […]

[…] behavior by helping to create a market perception that the banks were standing behind their OBSEs. Both the IASB and the U.S. FASB have different projects under way to improve OBSE disclosures and enhance the criteria for derecognition and consolidation of OBSEs. Examples are the IASB's consolidation and derecognition projects, and the FASB's changes to FAS 140 and Interpretation 46(R). The FASB's recently revised standard, FAS 140, will go into effect at the end of 2009.

Regardless, OBSEs require financial supervisors to revisit prudential reporting so that the integrity of banks' risk exposures can be better captured and explained, as well as adequately buffered (i.e., with capital) to the satisfaction of supervisors.

Procyclicality in the Basel II framework
A key improvement in the Basel II framework is its enhanced risk sensitivity. Yet this very feature is associated with the unintended effect of heightening its procyclical propensity. Basel II recognizes possible business cycle effects and how they should be addressed in both Pillar 1 (minimum capital requirements) and Pillar 2 (supervisory review process) of the framework. If Basel II is properly implemented, then greater risk sensitivity can lead banks to restore capital earlier in a cyclical downturn, thus preventing a build-up of required capital when it could amplify the cycle.

Under Basel II's Standardized Approach, risk weights are based on external ratings constructed to see through the cycle, so that cyclical effects are muted. It is in the internal-ratings-based (IRB) approaches that deterioration in credit risk feeds more directly into the capital requirements. The three main risk components in the IRB approaches (i.e., probability of default, loss given default, and exposure at default) are themselves influenced by cyclical movements and may give rise to a cyclical impact on banks' capital requirements.
methodologies, and uncertainties. Nevertheless, FVA can serve as Although Pillar 1 does not mandate the use of through-the-cycle mod-
an early warning system for supervisors to pursue closer scrutiny els, it promotes estimates of risk components based on observations
of a bank’s risk profile, risk-bearing capacity, and risk management that “ideally cover at least one economic cycle,” and whose valida-
practices. tion must be based on data histories covering one or more complete
business cycles. Sound stress testing processes must be in place
Off-balance-sheet entities and procyclicality that involve scenarios based on economic or industry downturns and
Recent market turmoil has heightened public awareness of the include specific credit risk stress tests that take into account a mild
extensive use of off-balance-sheet entities (OBSEs) by financial recession to assess the effects on the bank’s risk parameters.
institutions. With variations, both IFRS and U.S. GAAP have spe-
cific criteria to determine when instruments transferred to OBSEs Pillar 2 places the onus on both banks and supervisors to assess
should be consolidated on-balance-sheet. Any retained interest business cycle risk and take appropriate measures to deal with it.
in securitized financial assets should be on-balance-sheet and Banks are required to be “mindful of the stage of the business cycle
accounted for at FV, usually in the trading book. in which they are operating” in their internal assessment of capital
adequacy, perform forward-looking stress tests, address capital
Mandatory disclosures on OBSEs are not prevalent. Their absence volatility in their capital allocation, and define strategic plans for
may have added to market confusion and contributed to procyclical raising capital. In turn, encouraging forward-looking credit risk
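To make the cyclicality of the IRB approach concrete, the following minimal sketch computes the Basel II risk-weight (capital requirement) function for corporate exposures at the trough and peak PDs used later in Table 2. The correlation and maturity-adjustment formulas follow the published Basel II framework; the LGD and maturity inputs are illustrative choices for this example, not values taken from the paper.

    import math
    from statistics import NormalDist

    N = NormalDist()  # standard normal distribution

    def irb_capital_requirement(pd_, lgd, maturity=2.5):
        """Basel II IRB capital requirement K per unit of corporate exposure."""
        w = (1 - math.exp(-50 * pd_)) / (1 - math.exp(-50))
        r = 0.12 * w + 0.24 * (1 - w)                  # asset correlation falls as PD rises
        b = (0.11852 - 0.05478 * math.log(pd_)) ** 2   # maturity adjustment coefficient
        stressed_pd = N.cdf((N.inv_cdf(pd_) + math.sqrt(r) * N.inv_cdf(0.999))
                            / math.sqrt(1 - r))        # PD conditional on a 99.9th percentile downturn
        return lgd * (stressed_pd - pd_) * (1 + (maturity - 2.5) * b) / (1 - 1.5 * b)

    # PD deteriorates from the cycle peak (0.73 percent) to the trough (1.40 percent):
    for pd_ in (0.0073, 0.0140):
        print(f"PD = {pd_:.2%} -> K = {irb_capital_requirement(pd_, lgd=0.462):.2%}")

The capital requirement rises from roughly 6.7 percent to 8.5 percent of the exposure, illustrating how a cyclical deterioration in PDs feeds directly into IRB capital requirements.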

22 IASB's November 2009 exposure draft, Financial instruments: amortized cost and impairment, proposes an expected loss model for provisioning.

In turn, encouraging forward-looking credit risk assessments or higher provisioning for loan losses (that consider losses over the loans' whole life) is left to national supervisors23. Thus, where Pillar 1 does not adequately capture business cycle effects, supervisors should take remedial action under Pillar 2, including through additional capital buffers.

The capital disclosures required by Pillar 3 may assist markets and stakeholders in exercising pressure on the banks to maintain their capital levels throughout the full business cycle. In its recent report, "Enhancing market and institutional resilience," the Financial Stability Forum called for the Basel Committee to develop Pillar 2 guidance on stress testing practices and their use in assessing capital adequacy through the cycle, to examine the balance between risk sensitivity and cyclicality, and to update the risk parameters and the calibration of the framework, if needed [Financial Stability Forum (2008)]. In response, the committee is establishing a data collection framework to monitor Basel II's impact on the level and cyclicality of prudential capital requirements over time across member countries. The committee is expected to use these results to further calibrate the capital adequacy framework.

Options for the application of fair value accounting to mitigate procyclicality
The procyclicality of FVA has prompted the search for options that allow financial institutions to cope with situations of market turmoil. Alternatives range from considering a wider selection of "observable" prices or inputs to a change in the accounting treatment of financial instruments, as follows:

Consensus pricing services
Consensus pricing services, often independent brokers and agencies, can provide price quotes for complex or illiquid financial instruments, often using prices based on their own sales of relevant instruments that allow them to observe price behavior and market-test their estimates. Through this approach, illiquid products could obtain a Level 2 price, potentially limiting valuation uncertainty and underpricing in downturns. However, difficulties may remain if there is a wide dispersion of values that do not reflect the features of the specific financial product or if banks contend that values do not reflect market conditions, thereby obliging banks to use internal valuation methodologies.

Valuation adjustments
Banks could estimate the "uncertainty" surrounding the price of certain assets and make a valuation adjustment to the carrying value of an instrument disclosed in the financial statements. Valuation adjustments would allow banks to work with less perfect prices that are corrected to reflect current market conditions. These estimates of "uncertainty" might incorporate the liquidity of inputs, counterparty risk, or any market reaction likely to occur when the bank's position is realized. Valuation adjustments could improve fair value measurements and discipline in reporting, yet they need close monitoring to ensure that this practice does not evolve into management "cherry picking," providing a means to evade a certain accounting fair value level classification or to improve the balance sheet; a numerical sketch of such an adjustment follows this set of options.

Reclassifications
The transfer of assets from available-for-sale or trading to the held-to-maturity (HTM) category could avoid the volatility resulting from valuation changes amid a downward spiral. However, from an accounting perspective, reclassifications could be penalized by not allowing banks to revert to the trading book when markets rebound. Further, assets transferred from the trading category to HTM would be subject to impairment assessment (as they would be were they moved into the AFS category). From a prudential standpoint, deteriorated HTM assets would require higher regulatory capital, while changes in AFS assets would be considered additional but not core capital. Allowing reclassifications, particularly if not fully disclosed, may postpone the recognition of weaknesses in balance sheets and promote cherry-picking elements of the accounting framework24.

Full fair value accounting
Recognizing the significant challenges that FVA poses, a longer-term alternative would be to adopt a full-fair-value (FFV) model for all financial assets and liabilities in a balance sheet, irrespective of an entity's intention in holding them. One single FV principle, with some limited exceptions, would reduce the complexity of financial instruments reporting, balance sheet window dressing, and cherry picking, and allow for more transparent representations of the financial condition of an entity. It could improve the comparability of financial information across balance sheets and enhance market discipline, but it would pose significant challenges for implementation, modeling capabilities, and auditing estimates.

Internal decision rules
Without searching for a FVA alternative, regulators could require banks to have internal decision rules based on FV that require a careful review of all the implications of changing FV and the specific occasions when such changes could trigger management decisions, so that these decisions do not adversely affect regulatory capital or accentuate downward price spirals.

Smoothing techniques and circuit breakers
Smoothing asset prices and circuit breakers could be used as price adjusters to FVA to reduce excessive price volatility in the balance sheet. Smoothing techniques involve the averaging of asset prices over a given period. A circuit breaker imposes rules to stem the recognition of a fall in asset prices. However, both reduce the information content of financial statements by suspending equity at an artificially higher-than-fair-value level. The simulation exercises below examine the following alternatives: FFV accounting, smoothing techniques, circuit breakers, and reclassifications.
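As a small numerical illustration of the valuation adjustments option above, the sketch below deducts an explicit "uncertainty" adjustment, covering the liquidity of inputs, counterparty risk, and the expected market impact of realizing the position, from a mid-market quote. All figures and component names are invented for the example.

    def adjusted_carrying_value(mid_quote, bid_ask_spread, counterparty_cva, exit_impact):
        """Carrying value = quote less a disclosed valuation adjustment made up of
        half the bid-ask spread (liquidity of inputs), a credit valuation adjustment
        (counterparty risk), and the expected impact of unwinding the position."""
        adjustment = 0.5 * bid_ask_spread + counterparty_cva + exit_impact
        return mid_quote - adjustment, adjustment

    value, adjustment = adjusted_carrying_value(mid_quote=98.40, bid_ask_spread=1.20,
                                                counterparty_cva=0.35, exit_impact=0.50)
    print(f"carrying value {value:.2f} (disclosed valuation adjustment {adjustment:.2f})")

Disclosing the adjustment alongside the carrying value is what distinguishes this practice from quietly marking the position to a more convenient level.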

23 The U.S. Financial Accounting Standards Board has a project under way to address provisioning and related credit risk disclosures.
24 In mid-October 2008, the IASB amended IAS 39 to allow some reclassifications of financial instruments held for trading or AFS to the HTM category, meeting certain criteria, with the desire to reduce differences between IFRSs and U.S. GAAP. As of November 2009, IFRS 9, Financial instruments, requires reclassifications between amortized cost and fair value classification when the entity's business model changes.

Modeling FVA through the business cycle using simulations
Using model simulations, this section assesses the effects that changes in financial instruments' fair value have on the balance sheets of three types of large, internationally active financial institutions — U.S. commercial banks, U.S. investment banks, and European banks — as well as more retail-oriented U.S. and E.U. banks (Table 1). The balance sheets of a sample of representative institutions were taken as of end-2006 to construct prototypical institutions. The simulations illustrate the impact of changes in valuations and, ultimately, the effect on these representative banks' equity capital. The section also explores possible alternatives related to FVA and its current application — full fair value, smoothing techniques, circuit breakers, and reclassifications — that aim to reduce its volatility on balance sheets.

The first simulation serves as the baseline for subsequent scenarios and consists of tracking the evolution of the banks' balance sheets throughout a normal business cycle. Four scenarios are applied to the normal cycle with the goal of gauging the degree to which fair valuations amplify fluctuations in balance sheet components and, more notably, in accounting capital25. The sources of increased cyclicality are (i) a bust-boom cycle in equity valuations; (ii) a bust-boom cycle in the housing market; (iii) a widening and then contraction of banks' funding spreads; and (iv) a bust-boom cycle in debt securities' valuations, all of which are calibrated using the most current cyclical movements (Table 2). As noted by Fitch (2008a and 2008b), among others, the sensitivities of FV measurements to changes in significant assumptions are particularly important when valuations are model-based and/or markets become highly illiquid. Specifically, the method by which an institution chooses to value components of its balance sheet constitutes one of the three main transmission channels through which FVA introduces volatility into the balance sheet [Barth (2004)]. The simulations help underscore this point and provide a sense of the magnitude of these effects. In addition, the simulations illustrate how a sudden tightening in banks' funding conditions, or changes in the liquidity conditions in securities markets, exacerbates cyclical fluctuations in balance sheets.

It is worth noting that, from a cash flow perspective, the changes in assumptions underlying valuations (such as those made in the simulations below) may not necessarily be of future consequence to the reporting institution, as those gains and losses have not been realized and may never be. In this sense, the ensuing changes in regulatory capital produced by the updated valuations are somewhat artificial. With these considerations in mind, the simulation results should be interpreted as a simple exercise to gauge how changes in the underlying valuation parameters in the presence of FVA may lead to substantial fluctuations in banks' equity.

Data and modeling assumptions
This section presents the construction of the simulation exercises and reviews the assumptions underlying the various scenarios.

Banks' balance sheets
To accurately reflect the balance sheets of a representative large U.S. commercial bank, a large U.S. investment bank, a large European bank, and retail-oriented U.S. and European banks, the financial statements at end-2006 for these five banking groups were compiled from the institutions' annual reports and the U.S. Securities and Exchange Commission's 10-K filings26. Individual bank balance sheets were then used to construct a weighted average for each type of institution; the resulting representative balance sheets are shown in Table 1. Table 1 also indicates the line items that were fair valued in the simulations27, 28. Not all the items in the balance sheet were fair valued in the simulations: items that are typically not available for sale (i.e., securities in the banking book) and items that fall under the "other" categories were held constant29.

                                            U.S.        U.S.                   U.S.       European
                                            commercial  investment  European   retail-    retail-
                                            bank        bank        bank       oriented   oriented
                                                                               bank       bank
Financial assets
 Securities
  Debt securities                           21.82       27.85       15.71      14.96      17.72
   Trading book FV1                         21.82       27.85       14.98       5.09      16.59
   Banking book2                              —           —          0.73       9.87       1.13
  Shares                                     6.73        7.50        6.55       0.64       2.96
   Trading book FV1                          6.73        7.50        6.32       0.47       2.96
   Banking book2                              —           —          0.23       0.17        —
  Derivatives (trading)                      2.67        5.28       14.71       1.19       4.44
   Interest rate swaps                       1.48        1.87        7.76        ...        ...
   Other derivatives                         1.20        3.41        6.96        ...        ...
 Loans
  Corporate/Consumer                        10.11        5.63       23.77      23.00      25.84
   Short-term (fixed rate) < 1 year FV1      4.72        2.82       11.88       6.84      12.92
   Medium-term (> 1 year, < 5 years)         3.66        2.82        3.57      10.97       3.88
    Fixed rate FV1                           0.72        1.41        1.78       1.71       1.94
    Variable rate FV1                        2.94        1.41        1.78       9.26       1.94
   Long-term (> 5 years)                     1.73        n.a.        8.32       5.19       9.04
    Fixed rate FV1                           0.46        n.a.        4.16       2.03       4.52
    Variable rate FV1                        1.27        n.a.        4.16       3.16       4.52
  Mortgages                                 16.51        n.a.        6.54      37.44      26.43
   Fixed rate FV1                           12.83        n.a.        1.40      29.09      10.78
   Variable rate FV1                         3.68        n.a.        5.14       8.35      15.65
 Other assets                               28.60       43.27       20.93      17.34       5.41
Financial liabilities
 Debt securities/equity (trading) FV1        4.68        8.68       12.77       0.01      12.71
 Derivatives (trading)                       3.20        5.49       15.34       0.96       3.47
  Interest rate swaps                        2.09        1.73        7.84        ...        ...
  Other derivatives                          1.10        3.76        7.49        ...        ...
 Short-term and long-term financial
  liabilities/Bonds FV1                     18.25       27.21       10.35      19.56      18.97
 Other liabilities                          65.26       51.52       56.23      69.72      61.16
  of which: deposits and
  interbank borrowing                       42.44        3.72       24.88      60.12      56.72
Net equity3                                  7.65        3.71        2.86       9.75       4.36

Table 1 – Balance sheet of representative U.S. and European financial institutions (in percent of total assets, December 31, 2006)
Sources: Annual Reports; and SEC's 10-K filings.
Note: Columns may not add to 100 percent as some balance sheet items are not displayed in the table.
1 Valued at fair value.
2 Annual statements showed negligible or zero holdings for the sampled U.S. banks.
3 Net equity in percent of total (non-risk weighted) assets.

25 See Enria et al. (2004), who examine the impact of several one-off shocks on the balance sheet of a representative European bank under alternative accounting frameworks.

Valuation of assets and liabilities under fair value
Loans and debt securities are valued at their expected net present value (NPV), which takes into account the probability of default and the loss given default of each instrument. In other words, the value of a given security (or loan) with a maturity of T years is given by the expression

NPV = Σ_{t=1}^{T} E(CF_t) / (1 + δ_t)^t

where δ_t is the discount rate for year t, and E(CF_t) is the expected cash flow for year t, factoring in the possibility that the security (or loan) defaults:

E(CF_t) = [PD_t · (1 + r_t) · N · (1 − LGD_t)] + [(1 − PD_t) · r_t · N] for all t < T, and
E(CF_T) = [PD_T · (1 + r_T) · N · (1 − LGD_T)] + [(1 − PD_T) · (1 + r_T) · N]

where PD_t stands for the probability of default30, r_t is the interest rate on the loan, N is the notional amount of the loan, and LGD_t is the loss-given-default.

Under FV, traded shares are valued at their market price. Since the detailed composition of the shares portfolio of banks was not available, it was assumed that banks hold a generic type of share which represents the Standard & Poor's 500 stock market index. Therefore, the number of shares for each type of bank was obtained by dividing the value of their shares portfolio at end-2006 by the value of the S&P 500 index at the same date.

Characterization of the business cycles
To simplify the analysis, the paper considers a stylized business cycle consisting of four periods representing different points in a typical business cycle: trend, trough, peak, and back to trend. Each point in the business cycle is characterized by a different probability of default (PD) on securities and loans. To construct the normal business cycle, the PDs on loans and debt securities were assumed to change with the pulse of the cycle, increasing during economic downturns and decreasing during upswings. To isolate the effect of the evolving PDs on valuations, the baseline simulation abstracts from changes in interest rates during the cycle and initially assumes a flat yield curve.

In principle, different classes of securities and loans may have different PDs and evolve differently throughout the cycle. For simplicity, however, this paper assumes that all securities and loans have the same PD and display the same cyclical behavior, except for the scenario of the bust-boom cycle in real estate, where a different PD for mortgages is assumed. In addition, loans are assumed to be bullet instruments, whose principal is repaid in full upon maturity. The specific values for these PDs were derived from Nickell et al. (2000), who investigate the dependence of securities rating transition probabilities on the state of the economy [see also Pederzoli and Torricelli (2005), Bangia et al. (2002), and Altman et al. (2005)]. The probabilities of default at different stages of the business cycle were computed using their estimated transition matrices at different points in the cycle (Table 2)31.

                                                     Business      Business      Business
                                                     cycle trend   cycle trough  cycle peak
                                                     points        points        points
Normal cycle
 PD for all loans and securities                     1.18          1.40          0.73
 LGD for mortgages                                   20.30         20.30         20.30
 LGD for loans1 and securities                       46.20         46.20         46.20
 Stock market index                                  100.00        100.00        100.00
Stock market cycle
 PD for all loans and securities                     1.18          1.40          0.73
 LGD for mortgages                                   20.30         20.30         20.30
 LGD for loans1 and securities                       46.20         46.20         46.20
 Stock market index                                  100.00        80.00         120.00
Real estate market cycle
 PD for mortgages                                    1.18          5.29          0.73
 PD for loans1 and securities                        1.18          1.40          0.73
 LGD for mortgages                                   20.30         30.50         20.30
 LGD for loans1 and securities                       46.20         46.20         46.20
 Stock market index                                  100.00        100.00        100.00

Table 2 – Parameter values for each simulation (in percent)
Sources: IMF staff estimates; and Nickell et al. (2000).
Note: PD = probability of default; LGD = loss given default.
1 Loans excluding mortgages.

To compute the net present value (NPV) of loans and securities, it is also necessary to have a measure of losses in the event of default. Thus, loss-given-default (LGD) rates were taken from the BIS's Fifth quantitative impact study (QIS-5) [BIS (2006a)]; they equal 20.3 percent for mortgage loans and 46.2 percent for corporate loans. To isolate the effect of the evolving PDs, the LGD rates were held constant through the cycle (except in the bust-boom cycle in the housing market and in the downward price spiral for debt securities)32.
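The valuation formulas above can be made concrete in a few lines. The sketch below prices a bullet loan under the paper's expected cash flow expression, with the conditional PDs of footnote 30 weighted by the probability of surviving to each period; the five-year term, coupon, and discount rate are illustrative assumptions, while the PDs and LGD come from Table 2.

    def loan_npv(pds, r, notional, lgd, delta):
        """Expected NPV of a bullet loan: E(CF_t) combines the default leg,
        PD*(1+r)*N*(1-LGD), and the survival leg (coupon, plus principal at
        maturity), discounted at a flat rate delta as in the baseline."""
        npv, survival = 0.0, 1.0
        T = len(pds)
        for t, pd in enumerate(pds, start=1):
            claim = (1 + r) * notional
            coupon = claim if t == T else r * notional  # principal repaid only at maturity
            ecf = survival * (pd * claim * (1 - lgd) + (1 - pd) * coupon)
            npv += ecf / (1 + delta) ** t
            survival *= 1 - pd  # probability of reaching the next period without default
        return npv

    # Five-year bullet corporate loan; LGD 46.2 percent (QIS-5), PDs from Table 2.
    trend = loan_npv([0.0118] * 5, r=0.05, notional=100, lgd=0.462, delta=0.05)
    trough = loan_npv([0.0140] * 5, r=0.05, notional=100, lgd=0.462, delta=0.05)
    print(f"NPV at trend PDs: {trend:.2f}; at trough PDs: {trough:.2f}")  # about 97.6 vs. 97.1

Repricing the same loan with trough PDs lowers its fair value; aggregated over a loan book, this is the channel through which cyclical PDs move the balance sheet under FVA.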

26 The filing period was chosen to be December 2006 in order to obtain balance sheets that are relatively recent, while at the same time do not reflect too closely banks' balance sheet structures in the run-up or fall-out of the 2007-08 U.S. sub-prime meltdown.
27 For simulation purposes, all banks were assumed to be newly established, so that all balance sheet items are at FV at the start of the simulations. Thus, the shocks applied to the baseline reflect only the pure impact of the shocks, and not a combination of the imposed shock plus any initial deviations from fair value.
28 IAS 39 prevents the valuation of demand deposits at a value less than face value, even if a significant portion of these display economic characteristics of a term deposit. Consequently, deposits remain at face value in the exercise.
29 Despite being a central element in the 2007–08 turmoil, an explicit breakdown of credit derivative exposures was unavailable in the 2006 reports. Some mortgage-backed securities were included in the debt securities category.
30 Strictly speaking, PDt is the conditional probability of default at time t. That is, the probability that, conditional on not having defaulted before, a loan defaults in period t.
31 It should be noted that the Quantitative impact study 5 (QIS-5) estimated the PD for a group of G-10 (ex-U.S.) banks' retail mortgage portfolio at 1.17 percent, very close to the estimate of 1.18 percent for the trend period used here.
32 Although this may be a less realistic assumption than allowing LGDs to evolve through the cycle, the qualitative results of the simulations would not be altered.

                                   Baseline   Period 1   Period 2   Period 3   Period 4
                                   Business   Business   Business   Business   Business
                                   cycle      cycle      cycle      cycle      cycle
                                   trend      trough     trend      peak       trend
U.S. commercial banks
 Normal cycle                      7.6        7.5        7.6        7.9        7.6
 Bust-boom cycle in share prices   7.6        6.3        7.3        9.1        7.6
 Bust-boom cycle in real estate    7.6        5.4        7.6        7.9        7.6
U.S. investment banks
 Normal cycle                      3.7        3.8        3.7        3.6        3.7
 Bust-boom cycle in share prices   3.7        2.3        3.4        5.0        3.7
 Bust-boom cycle in real estate    3.7        3.8        3.7        3.6        3.7
European banks
 Normal cycle                      2.9        2.8        2.9        3.0        2.9
 Bust-boom cycle in share prices   2.9        1.6        2.6        4.2        2.9
 Bust-boom cycle in real estate    2.9        1.9        2.9        3.0        2.9

Table 3 – Equity to assets ratio through the business cycle (in percent)
Source: IMF staff estimates.

Characterization of the economic shocks
The first scenario considered is a bust-boom cycle in stock market valuations where, concurrent with a normal cycle, share prices initially plummet by 20 percent during the downturn of the economic cycle and then surge to a level that is 20 percent above the original level, to ultimately return to their trend value (Table 3)33.

The second scenario is a bust-boom cycle in the housing market, in which mortgage default rates and LGD rates dramatically increase during the downturn, and then rebound during the recovery. In this scenario, PDs of mortgage loans increase to 5.29 percent in the trough of the cycle — a magnitude which is commensurate with the recent meltdown in the U.S. housing market34. Additionally, the reduction in house values — and thus the expected decline in recoveries — was factored in through a 50 percent increase in the LGD rate over the average values reported in the QIS-5 (i.e., from 20.3 percent to 30.5 percent).

To simulate the cycle in funding conditions, the paper assumes that during the business cycle trough, banks' cost of funding increases by 58.7 basis points. This increase in spreads was obtained by computing the average rise in Libor-OIS spreads for U.S. and European banks during the summer of 2007. Conversely, to analyze the effects of ample liquidity conditions, the simulation assumes that banks' funding costs decrease by the same amount during the cycle peak.

To construct the scenario of distressed securities markets and then recovery, it was assumed that the LGD rates for debt securities sharply increase during troughs and decrease by the same amount during peaks35. During the cycle trough, the LGD rate for debt securities increases to 67.3 percent36 from its initial base of 46.2 percent. Subsequently, the simulation applies the same shock magnitude (but with reversed sign) to the LGD during the cycle peak; that is, LGD decreases to 25.1 percent.

Simulation results
The simulations highlight three key points regarding FVA and its potential regulatory and financial stability implications: (i) strong capital buffers are crucial to withstand business cycle fluctuations in balance sheet components, especially when FV is applied more extensively to assets than liabilities; (ii) fair valuing an expanded set of liabilities acts to dampen the overall procyclicality of the balance sheet; and (iii) when combined with additional liquidity shortages in financial markets, the FVA framework magnifies the cyclical volatility of capital.

The effects of economic shocks under full fair value
In the normal cycle, fair valuing both sides of the balance sheet produces fluctuations that are mild compared to the bust-boom scenarios below (Figure 2), an intuitive result37. However, it is worth noting that, in the case of the representative U.S. investment bank, equity behaves in a countercyclical manner due to the strong effect of fair valuing the liabilities. Under full FV (FFV), the value of the bank's liabilities declines as economic activity weakens and probabilities of default (PDs) rise, mitigating the decline in equity. This effect arises because of the asset/liability structure of the investment banks' balance sheet, which consists of a large proportion of financial liabilities that are fair valued. Liabilities at FFV, as is done by some U.S. investment banks, can introduce an element of countercyclicality by serving as an implicit counter-balancing hedge to the fair valuation of assets38, 39. This phenomenon has raised related concerns by some market observers who regard with unease a bank's ability to record revaluation gains as its own creditworthiness weakens and the price of its own debt declines40. The presence of gains that are a construct of the particular technique chosen for valuation signals the need for clear disclosure of underlying assumptions to avoid misrepresentation of financial statements.

In both the bust-boom cycles in equity valuations and in the housing market, the European banks exhibit the largest deviations from trend.
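The mechanics behind the equity paths in Table 3 and Figures 2 to 5 reduce to revaluing the fair-valued share of each side of the balance sheet and letting equity absorb the difference. The sketch below is a stylized version of that exercise; the balance sheet weights are rounded from Table 1 for the representative U.S. commercial bank, while the revaluation factors are invented stand-ins for the NPV repricing described above, not the paper's computed outputs.

    def normalized_equity(equity0, fv_assets, fv_liabilities, asset_reval, liability_reval):
        """Equity absorbs the revaluation of fair-valued items only:
        delta_equity = delta_FV(assets) - delta_FV(liabilities).
        Returns equity as a percent of its starting level."""
        delta = fv_assets * (asset_reval - 1) - fv_liabilities * (liability_reval - 1)
        return 100 * (equity0 + delta) / equity0

    # Rounded Table 1 weights (percent of total assets), U.S. commercial bank:
    # about 58 of assets and 26 of liabilities at fair value; net equity 7.65.
    dip = normalized_equity(7.65, 58, 26, asset_reval=0.985, liability_reval=0.99)
    print(f"normalized equity at the trough: {dip:.1f}")  # about 92.0

Because fair-valued assets exceed fair-valued liabilities by a wide margin, a small mark-down in assets produces a much larger proportional fall in equity; increasing the fraction of liabilities at fair value (or the size of their mark-down) dampens the swing, which is the countercyclical liability effect discussed above.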

33 The initial price of the representative stock held by banks was normalized to the value of the S&P 500 index at end-2006, which closed at 1418 on December 29th, 2006.
34 To estimate the PDs during the 2007–08 U.S. housing crisis, it was assumed that 100 percent of foreclosures and 70 percent of delinquencies beyond 90 days end up in default. These percentages are then combined with the respective PDs to yield an overall estimated PD of 5.29 percent for all mortgages. See UBS (2007); data source: Merrill Lynch, April 2008.
35 The rationale behind this characterization of distressed markets follows Altman et al. (2005) in that during times of distress, the demand for securities declines, hence reducing both the market price and the recovery rate (i.e., one minus the LGD) of securities. See Acharya et al. (2007), Altman et al. (2005), and Bruche and González-Aguado (2008) for papers discussing the link between distressed markets and increases in LGD rates.
36 Derived from Bruche and González-Aguado (2008).
37 The results are presented in terms of the evolution of banks' normalized equity through the cycle — that is, at each point in the cycle, banks' equity is divided by their initial level of equity (i.e., at end-2006). All figures for this section are presented at the end.
38 Note, however, that this result reflects only one element of countercyclical forces, as "other liabilities" represents about 50 percent of the balance sheet and can potentially introduce additional countercyclicality.
39 Chapter 4 of IMF (2008a) examines procyclicality of leverage ratios of U.S. investment banks, finding their extreme variation across the cycle. Note this is consistent with the scenario conducted later in this paper where funding spreads vary through the cycle, producing the same procyclicality found in IMF (2008a).
40 See Guerrera and White (2008). Additionally, Barth et al. (2008) suggest that these counterintuitive effects are attributable primarily to incomplete recognition of contemporaneous changes in asset values.

For the equity price shock, despite roughly comparable magnitudes of equity shares across the three banks' portfolios, a combination of two effects is at work. First, there is the countercyclical effect of the relatively greater proportion of FV liabilities for investment banks. Second, the European bank has a lower capital base and thus the relative size of valuation changes to normalized equity capital is larger. In the housing market scenario, the European bank exhibits wider fluctuations, despite the fact that the U.S. commercial bank holds a much larger fraction — about two-and-a-half times greater — of its loan portfolio in mortgages. In both scenarios, the lower capital base of the European bank vis-à-vis the U.S. commercial bank is a key element. Similar results in terms of capital-to-assets ratios are presented in Table 3, but reflect a less dramatic impact on European banks41. More generally, a bank's balance sheet would evolve through the cycle — contracting in downturns and expanding in upturns — such that it would restore the bank's capital adequacy ratio, a result that is not testable in this simple framework.

Recent events have raised two interesting scenarios regarding increased funding costs and a downward spiral in the valuation of debt securities. Sudden changes in banks' ability to obtain funding largely exacerbate the fluctuations in balance sheets (Figure 3). This exercise underscores the significance of general liquidity conditions in driving balance sheet fluctuations, and how the FVA framework recognizes these changes promptly. Interestingly, the countercyclical behavior observed in the U.S. investment banks' equity disappears. In fact, the U.S. investment bank is hardest hit by both the tightening of funding conditions and the distress in securities markets. This should not be surprising given that, contrary to the U.S. commercial and European banks, the U.S. investment bank does not rely on deposits — which are not fair valued — to fund its activities. Note, too, that these simulations do not account for structured credit products or the OBSEs that were so central to much of the 2007–08 turmoil and would likely increase the procyclicality of the balance sheets. Such a deterioration of banks' balance sheets could affect market confidence and overall share prices, which in turn could generate additional volatility in banks' balance sheets.

The results presented thus far have focused on the balance sheets of large, internationally active institutions. Comparatively, the more retail-oriented banks tend to have larger loan and mortgage portfolios and rely more extensively on deposits for their funding42. To illustrate the effects of these two structural characteristics, simulations comprising the cycle in funding spreads and the bust-boom cycle in real estate were conducted for all banks, excluding the representative U.S. investment bank. The results corroborate the supposition that the more retail-oriented institutions are less vulnerable to changes in funding conditions than their internationally active counterparts (Figure 4). Conversely, the retail-oriented banks are harder hit by a bust in the housing market than the internationally active banks.

The effects of mixed-attributes models
Using two versions of the mixed-attributes model, this exercise shows how the degree to which financial institutions apply FV to their assets and liabilities affects the extent to which there can be offsetting volatility effects. Table 4 shows that financial institutions apply FV differentially. What is not shown in the table is the extent to which the vast majority of banks continue to use amortized cost to value their loan portfolios. Thus, for the purposes of the simulations, two variations of the model are considered: (i) "financial liabilities and bonds" are valued at amortized cost throughout the cycle; and then (ii) in addition, "loans" and "mortgages" are also valued at amortized cost43.

Figure 5 underscores the idea that the asymmetric application of a mixed-attributes model, where FV is applied more extensively to value assets than liabilities, has the effect of increasing the procyclical behavior of the balance sheet. In other words, the fluctuations in equity — for all types of institutions and for all the scenarios considered — are larger when a smaller fraction of liabilities is fair valued (compare with Figure 2, the results under FFV). Thus, the benefits intended by the introduction of the FVO, which were to reduce the accounting volatility of the mixed-attributes methods and the need for FV hedge accounting techniques, are lessened. This supports an expanded application of FV, rather than a reduced application, as some would like to propose.

Financial institutions          Assets at FV     Liabilities at FV   Return on
                                on a recurring   on a recurring      equity
                                basis            basis
JPMorgan Chase & Co.            41               16                  12.86
Citigroup                       39               22                  3.08
Bank of America                 27               6                   10.77
Goldman Sachs                   64               43                  31.52
Lehman Brothers                 42               22                  20.89
Merrill Lynch                   44               33                  -25.37
Morgan Stanley                  44               27                  9.75
Credit Suisse                   64               39                  17.88
Societe Generale                46               32                  3.36
Royal Bank of Scotland          45               31                  15.13
BNP Paribas                     65               55                  16.98
Deutsche Bank                   75               48                  18.55
UBS                             54               35                  -10.28
HSBC                            40               25                  16.18
Barclays                        52               39                  20.50
Credit Agricole                 44               24                  10.67

Table 4 – Application of fair value by U.S. and European banks, 2007 (in percent of total balance sheet)
Sources: Fitch; and Bloomberg L.P.

41 Some portion of the lower equity position in European banks may stem from differences in IFRS versus U.S. GAAP accounting treatments [Citigroup (2008), Financial Times (2008)].
42 Note, however, that retail-oriented European banks also have a larger fraction of debt securities and financial liabilities than the larger European banks.
43 In effect, valuing these instruments at amortized cost would produce comparable results to being classified as HTM.

Bear in mind, however, that the application of FV to banks' own debt may produce revaluation gains as the value of liabilities declines on their balance sheet, and that this should be properly disclosed.

This simulation highlights that the greater the imbalance of the mixed-attributes application to assets and liabilities, the greater is the accounting volatility. When financial instruments are valued at a historical cost that does not represent current market conditions, an accurate picture of a bank's equity becomes blurred and the informational content of the accounting statements weakens. Historical costs have low information content for investors who rely on current financial figures as a basis for investment decisions. For a regulator, making an accurate assessment of the health of a bank, and formulating the appropriate regulatory response, becomes increasingly difficult44.

The second simulation (not shown), where financial liabilities plus loans and mortgages are all valued at amortized cost, showed that the range of fluctuations diminished further than in the above simulation. Thus, although the wider application of the mixed-attributes model can reduce fluctuations in the balance sheet, the cost comes in the form of a further reduction in up-to-date information.

Smoothing techniques and circuit breakers on reporting prices
Simulations using proposed alternatives to smooth balance sheet volatility show that a smoothing/averaging technique for falling asset prices blurs the bank's capital position, in magnitudes varying by the amount and period over which the averages are calculated. Smoothing techniques and other impediments to allowing valuations to adjust, so-called "circuit breakers," make it harder for regulators and investors to accurately assess the financial position of a bank, as they hide the economic volatility that should be accounted for in the balance sheet.

To illustrate, two simulations were conducted, each averaging share prices over a different length. The first simulation uses a two-period average, whereas the second is extended to three periods. As shown in Figure 6, the longer the averaging length, not surprisingly, the smoother is the path of the balance sheet. Notably, the application of a smoothing technique might reduce the occasion for forced sales, as it could avoid sale triggers in some cases. Accordingly, this could lessen a downward price spiral in the market for a financial product by avoiding forced sales, but it comes at the expense of a reduction in the informational content of financial statements and potentially lengthens the resolution period.

Similarly, concepts such as a circuit breaker, whereby rules stem the recognition of a fall in asset prices, mask the underlying equity position by suspending equity at an artificially higher level than under FV and, more generally, may hamper price discovery. However, in this case, the cycle may be extended even longer than with a smoothing technique because the circuit breaker can maintain the same value for a given period, while the smoothing is a rolling average that is updated during each period of the cycle. Additionally, this measure is asymmetrically applied, as the circuit breaker has generally been proposed for when valuations are falling. Even though it is not a preferred technique, for symmetry one could apply circuit breakers during bubble periods to stop the artificial inflation of equity. If not, the asymmetric treatment of valuations may create perverse risk-taking incentives for managers, as financial institutions would be able to benefit from the upside in valuation while the downside would remain capped.
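The two price adjusters simulated in this subsection can be stated compactly. In the sketch below, which is a minimal illustration with an invented price path matching the 100/80/120 bust-boom pattern of Table 2, the smoother reports a rolling average of the last n observed prices and the circuit breaker caps the recognized period-on-period fall at 10 percent (an illustrative threshold).

    def smoothed(prices, n):
        """Rolling n-period average of observed prices (two- and three-period in the text)."""
        window = lambda t: prices[max(0, t - n + 1): t + 1]
        return [sum(window(t)) / len(window(t)) for t in range(len(prices))]

    def circuit_breaker(prices, max_fall=0.10):
        """Recognize at most a max_fall decline per period; the upside is unrestricted."""
        reported = [prices[0]]
        for p in prices[1:]:
            reported.append(max(p, reported[-1] * (1 - max_fall)))
        return reported

    path = [100, 80, 100, 120, 100]  # bust-boom share-price cycle
    print(smoothed(path, 2))         # [100.0, 90.0, 90.0, 110.0, 110.0]
    print(smoothed(path, 3))         # [100.0, 90.0, 93.3, 100.0, 106.7] (rounded)
    print(circuit_breaker(path))     # [100, 90.0, 100, 120, 108.0]

Note how the circuit breaker never reports the full fall to 80 and ends the cycle at 108 even though the observed price has returned to 100: the recognized value is suspended above fair value, which is exactly the loss of information content described above.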
The effects of a changing yield curve
Yield curve effects are introduced to the baseline scenario to evaluate how the change in interest rates over the cycle affects the balance sheet45. The paper follows Keen (1989) and assumes the following stylized facts regarding the cyclical behavior of yield curves [Piazzesi and Schneider (2006), Keen (1989)]: (i) both short- and long-term rates tend to decline during business cycle downturns and rise during expansions; and (ii) short rates tend to rise more relative to long rates during expansions (i.e., the yield curve flattens) and fall more relative to long rates during recessions (i.e., the yield curve steepens) (Figure 7)46.

The influence of interest rates tends to dominate the effect of the change in PDs, such that the interest rate effect dampens the magnitude of procyclical equity fluctuations for the European bank, and even becomes countercyclical for the U.S. commercial bank (Figure 8). For the U.S. investment bank, the change in interest rates renders the evolution of equity procyclical, rather than countercyclical as in the baseline simulation. This reversal in behavior is due to the fact that the U.S. investment bank has a slightly larger share of FV liabilities than assets being revalued when interest rates change47. This also highlights the European banks as an intermediate structure between the investment bank and retail bank characteristics. Regardless of the balance sheet structure, changes to interest rates and other monetary policy tools can dampen procyclical influences, suggesting that countercyclical monetary policy could have the beneficial outcome of also helping to counteract the effects of the asset valuation cycles on banks' equity. Note, however, that these simulations do not allow the financial institutions to respond to policy changes, and thus these results, while informative, should be taken with caution.

44 Moving to an expected loss model of provisioning could decrease volatility.
45 Although this simulation is subject to the Lucas critique in that bank behavior is assumed not to change in response to policy adjustments, it provides some insights into the interaction between FVA and interest rates.
46 Interestingly, the addition of changes in the yield curve counteracts the effect of the evolution of PDs. The drop in the yield curve in the downturn results in higher valuations and thus counterbalances the downward effect of the PDs, while the positive effect on valuations stemming from lower PDs is counterbalanced by a higher yield curve in the upturn.
47 This simulation abstracts from the effect of revaluing interest rate swaps. Unfortunately, it was not possible to obtain a sufficiently complete and consistent dataset on these instruments to include them in the simulation. Nevertheless, preliminary results using available data on interest rate swaps showed similar qualitative results.

Conclusions and policy recommendations
The financial turmoil that started in July 2007 unveiled weaknesses in the application of some accounting standards48 and in the valuation and reporting of certain structured products. While these weaknesses may have contributed to the current crisis, they also provide an opportunity to understand them better.

The paper finds that, despite concerns about volatility and measurement difficulties, FVA is the appropriate direction forward and can provide a measure that best reflects a financial institution's current financial condition, though various enhancements are needed to allow FVA to reinforce good risk management techniques and improved prudential rules. Nevertheless, the application of FVA makes more transparent the effects of economic volatility on balance sheets which, under certain risk management frameworks, could exacerbate cyclical movements in asset and liability values. Exaggerated profits in good times create the wrong incentives. Conversely, more uncertainty surrounding valuation in downturns may translate into overly tight credit conditions and negatively affect growth at a time when credit expansion is most needed. This is not to say that alternative accounting frameworks, such as historical cost accounting, avoid such fluctuations, but rather that FVA recognizes them as they develop. Regardless, accounting frameworks are not meant to address the market-wide or systemic outcomes of their application, as they are applied only to individual institutions. Nevertheless, much of the controversy surrounding FV stems more from the risk management and investment decision rules using FV outcomes than from the framework itself. Delinking FV estimates from specific covenants, such as sales triggers, margin calls, or additional collateral requirements during downturns, or from compensation tied to short-term profits during upturns, is an option that could mitigate the procyclical impact of FVA.

Overall, the simulations confirmed a number of issues in the ongoing FVA debate and underscored three key points regarding FVA and its potential regulatory and financial stability implications: (i) strong capital buffers and provisions make an important contribution to withstanding business cycle fluctuations in balance sheets, especially when FVA is applied more extensively to assets than liabilities; (ii) when combined with additional liquidity shortages in financial markets, the FVA framework magnifies the cyclical volatility of capital; and (iii) fair valuing an expanded set of liabilities acts to dampen the overall procyclicality of the balance sheet. However, the latter may also give rise to the counterintuitive outcome of producing gains when the valuation of liabilities worsens. This is of particular concern when a deterioration in a bank's own creditworthiness, and the subsequent decline in the value of its own debt, results in profits and a false sense of improvement in the bank's equity position.

Proposals for alternative accounting methods, such as historical cost or simplistic mechanisms to smooth valuation effects on bank balance sheets, reduce the transparency of a financial institution's health by blurring the underlying capital position. While these techniques may avoid sale triggers incorporated in risk management covenants and limit downward price spirals, the measurement variance introduced by such techniques can increase uncertainties regarding valuations. The loss of transparency makes it more difficult for all users of financial statements: for supervisors, to conduct adequate oversight of financial institutions and recommend appropriate regulatory measures to deal with prudential concerns; and for investors, who will demand increased risk premia in the face of uncertainty.

Policy proposals
Most proposals should aim to deal with the use of FV estimates to lessen the volatility that FVA can introduce to the balance sheet. Assessments of provisioning and capital adequacy should take better account of the business cycle. Improved transparency can be achieved not necessarily by more disclosures, but by better disclosures. Financial, accounting, and regulatory bodies are already providing guidance and recommendations in this direction.

■■ The simulations support the relevance of establishing a capital buffer that looks through the cycle, augmenting the capital position during boom cycles to withstand the burden on capital that stems from economic downturns. Although a partial analysis, the simulations show that FVA can introduce financial statement volatility and provide a first indication that buffers of around 24 percent of additional capital would help banks weather normal cyclical downturns, whereas higher buffers — on the order of 30–40 percent extra capital — would be needed to offset more severe shocks. Recognizing that these estimates do not reflect concurrent changes in risk-weighted assets, they nevertheless provide an initial estimate of the magnitude of the needed capital buffer, as well as the direction for further analysis. Note that these are not adjustments to FV calculations, per se, but are adjustments meant to help mitigate the impact on bank balance sheets. Consideration of other changes to the accounting framework, so that the FV calculations themselves obviate the need for these other adjustments, would be useful at this juncture.
■■ Broadening the current narrow concept of provisions to incorporate additional methods of retaining income in upswings could provide a way of better offsetting balance sheets' procyclical effects for assets that are not fair valued. It is generally agreed that provisions protect against expected losses and capital protects against unexpected losses.

48 Although the weaknesses are related more to issues of OBSEs, consolidation, and derecognition, than to FV.

A build-up of provisions better linked to the expected volatility, higher risks, and potentially larger losses of an asset could better anticipate the potential negative effects on the balance sheet that would be reflected through the cycle, as long as the build-up does not provide a way for manipulating earnings. Coordination between accounting standard setters and supervisors would be needed to effect such changes.
■■ Similarly, the use of forward-looking provisioning49, combined with a supervisor's experienced credit judgment in assessing the probability of default, loss given default, and loan loss provisioning50, could mitigate the procyclical forces on the balance sheet. The recognition of credit losses in the loan portfolio earlier in a downward cycle would lessen an accompanying decline in bank profits and the potential for a squeeze in credit extension that could contribute to a further downward economic trend. Similarly, on the upside, dividend distributions should only come from realized earnings that are not biased by upward cyclical moves.
■■ From an oversight perspective, the simulations underscore the importance of understanding the cyclical implications of FVA. An enhanced role for prudential supervisors will be needed to ensure close inspection of a bank's risk profile and risk management practices, and to make appropriate recommendations for augmented capital buffers and provisions, as needed. A comprehensive bank supervisory framework should include stress tests of FV positions through the business cycle. Similarly, auditors will have a critical role to play in ensuring credibility, consistency, and neutrality in the application of FVA, and overall in supporting market confidence rather than appearing to augment procyclicality by encouraging lower valuations during a downturn. A closer collaborative framework among audit and accounting standard setters and supervisors would be highly beneficial for markets and financial stability, to ensure greater consistency in assessing and interpreting financial statements.
■■ In light of the different dynamics through the financial cycle and the doubts that can surround valuations, FV estimates should be supplemented by information on a financial instrument's price history, the variance around the FV calculations, and management's forward-looking view of asset price progression and how it will impact the institution's balance sheet. Reporting a range within which the FV price could fall would help users of financial statements to better understand and utilize the volatilities with which they are dealing. FV estimates should be supplemented with detailed notes on the assumptions underlying the valuations and sensitivity analyses, so that investors can conduct their own scenario analyses and determine whether the FV price is representative of market conditions.
■■ More refined disclosures could meet the expanding needs of various users, including investors, supervisors, and depositors, in a common framework of disclosure. For example, a series of shorter reports that would be available on websites51 and issued more frequently (i.e., quarterly)52, and that cater to a narrower group of users' needs, could highlight the most relevant information, with a particular emphasis on risk developments. Further, the volatility associated with a FV balance sheet may mean that the balance sheet is no longer the primary medium for evaluating bank capital. Market participants and supervisors may increasingly turn to cash flow statements, income and equity statements, and risk measures — all of which provide distinct financial information and must evolve in response to users' needs.
■■ Albeit of a simple structure and subject to isolated shock scenarios, the simulations point to the fact that the application of FV to both sides of the balance sheet would introduce a countercyclical component that may cushion some of the financial shocks that can result in large swings in bank equity. This result, however, arises in the shock scenarios, in part, from a deterioration in own-debt values as risk premia rise on the liability side of the balance sheet. This logically compensates for the deterioration of the asset side during a downturn. From the viewpoint of assessing the riskiness of the financial institution or its future prospects, the result can be viewed as paradoxical, as it can hardly be regarded as a positive factor for the financial institution to have its own debt values deteriorate. The simulations also illustrate how a bank's response to a particular shock varies substantially depending on the specific balance sheet structure, and thus there is a need to discern the source of the cyclicality through additional disclosures.

A key challenge going forward will be to enrich the FVA framework so that market participants and supervisors are better informed, in order to promote market discipline and financial stability. The fragmented solution that currently exists between the accounting, prudential, and risk management approaches to valuation is insufficient and must be reconciled. Importantly, this will require adjustments on the part of all three disciplines to resolve these tensions.

49 Forward-looking provisioning denotes provisions based on the likelihood of default over the lifetime of the loan, reflecting any changes in the probability of default (after taking into account recovery rates). Dynamic (or statistical) provisioning can be considered an extension of forward-looking provisioning, with reliance on historical loss data for the provisioning calculations. Conceptually, dynamic provisioning entails that during the upside of the cycle, specific provisions are low and the statistical provision builds up, generating a fund; during the downturn, the growth in specific provisions can be met using the statistical fund instead of the profit and loss account [Enria et al. (2004) and Bank of Spain (www.bde.es)].
50 Basel Committee on Banking Supervision (2006b) and IAS 2001.
51 FASB's XBRL project for financial institutions would provide data online in about three years, as discussed in the IMF April 2008 edition of the Global financial stability report [IMF (2008b)].
52 This would be separate from U.S. SEC 10-Q filings.
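Footnote 49's description of dynamic provisioning can be illustrated in a few lines. In the sketch below, the through-the-cycle provisioning rate and the path of specific provisions are invented for the example: the statistical fund builds while specific provisions run below the through-the-cycle rate and is released when they exceed it, so the profit and loss account is charged a smoothed amount.

    def dynamic_provisioning(specific_provisions, through_cycle_rate):
        """Returns the remaining statistical fund and the P&L charge per period.
        The fund absorbs the gap between specific provisions and the
        through-the-cycle rate, but cannot be drawn below zero."""
        fund, pnl_charges = 0.0, []
        for specific in specific_provisions:
            statistical = through_cycle_rate - specific  # > 0 builds the fund, < 0 releases it
            statistical = max(statistical, -fund)        # cannot release more than the fund holds
            fund += statistical
            pnl_charges.append(specific + statistical)
        return fund, pnl_charges

    # Three benign years followed by a downturn (specific provisions, percent of loans):
    fund, charges = dynamic_provisioning([0.1, 0.1, 0.2, 0.9, 1.4], through_cycle_rate=0.5)
    print([round(c, 2) for c in charges])  # [0.5, 0.5, 0.5, 0.5, 0.7]
    print(round(fund, 2))                  # 0.0, the fund is exhausted by the downturn

The downturn charge of 0.7 (instead of the raw 1.4) shows how income retained in the upswing cushions the profit and loss account when credit losses materialize, which is the countercyclical effect the policy proposal aims for.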

[Figure 2 – Simulation of full fair value. Panels: normal cycle, bust-boom cycle in share prices, and bust-boom cycle in real estate; series for U.S. commercial banks, U.S. investment banks, and European banks over the business cycle (trend, trough, trend, peak, trend).]

[Figure 4 – Simulation of full fair value: international versus retail-oriented banks. Panels: normal cycle, cycle in funding spreads, and bust-boom cycle in real estate; series for retail-oriented U.S. banks and retail-oriented European banks.]

[Figure 3 – Simulation of full fair value: changes in funding conditions and financial market distress. Panels: U.S. commercial bank, U.S. investment bank, and European bank; series for the normal cycle, the cycle in funding spreads, and the cycle in debt securities' LGDs.]

[Figure 5 – Simulation of partial fair value (includes short-term and long-term financial liabilities valued at amortized cost). Panels: normal cycle, bust-boom cycle in share prices, and bust-boom cycle in real estate; series for U.S. commercial banks, U.S. investment banks, and European banks.]


[Figure 6 – Simulation of smoothing techniques. Panels: U.S. commercial banks, U.S. investment banks, and European banks; series for the bust-boom cycle in share prices under a two-period average, a three-period average, and a circuit breaker.]

[Figure 8 – Simulation of full fair value with upward sloping yield curve. Panels: normal cycle, bust-boom cycle in share prices, and bust-boom cycle in real estate; series for U.S. commercial banks, U.S. investment banks, and European banks.]

[Figure 7 – Yield curves and business cycles (in percent). Yield curves at the business cycle trend, trough, and peak, plotted over maturities of 1 to 30 years.]

75
Financial stability, fair value accounting, and procyclicality

References
• Acharya, V. V., S. T. Bharath, and A. Srinivasan, 2007, "Does industry-wide distress affect defaulted firms? Evidence from creditor recoveries," Journal of Financial Economics, 85:3, 787–821
• Altman, E. I., B. Brady, A. Resti, and A. Sironi, 2005, "The link between default and recovery rates: theory, empirical evidence, and implications," Journal of Business, 78:6, 2203–2227
• Bangia, A., F. Diebold, A. Kronimus, C. Schagen, and T. Schuermann, 2002, "Ratings migration and the business cycle, with application to credit portfolio stress testing," Journal of Banking and Finance, 26
• Bank of Spain, "Dynamic provisioning in Spain: impacts of the new Spanish statistical provision in the second half of 2000," available at www.bde.es
• Barth, M., 2004, "Fair values and financial statement volatility," in Borio, C., W. C. Hunter, G. G. Kaufman, and K. Tsatsaronis (eds), The market discipline across countries and industries, MIT Press, Cambridge, MA
• Barth, M., L. D. Hodder, and S. R. Stubben, 2008, "Fair value accounting for liabilities and own credit risk," The Accounting Review, 83:3
• Basel Committee on Banking Supervision, 2006a, Results of the fifth quantitative impact study (QIS 5), Bank for International Settlements, Basel
• Basel Committee on Banking Supervision, 2006b, Sound credit risk assessment and valuation for loans, Bank for International Settlements, Basel
• Borio, C., and K. Tsatsaronis, 2005, "Accounting, prudential regulations and financial stability: elements of a synthesis," BIS Working Papers No. 180, Bank for International Settlements, Basel
• Bruche, M., and C. González-Aguado, 2008, "Recovery rates, default probabilities, and the credit cycle," CEMFI Working Paper, Centro de Estudios Monetarios y Financieros, Madrid
• Calza, A., T. Monacelli, and L. Stracca, 2006, "Mortgage markets, collateral constraints, and monetary policy: do institutional factors matter?" Center for Financial Studies, University of Frankfurt, Working Paper No. 2007/10
• Center for Audit Quality, 2007, "Measurements of fair value in illiquid (or less liquid) markets," White Papers, October
• Citigroup, 2008, "There's a hole in my bucket: further deterioration in European banks' capital ratios," Industry focus, Citigroup Global Markets: Equity Research
• Enria, A., L. Cappiello, F. Dierick, S. Grittini, A. Haralambous, A. Maddaloni, P. A. M. Molitor, F. Pires, and P. Poloni, 2004, "Fair value accounting and financial stability," ECB Occasional Paper Series No. 13, European Central Bank
• Financial Stability Forum, 2008, "Report of the Financial Stability Forum on enhancing market and institutional resilience," April 7
• Financial Times, 2008, "Banks according to GAAP," available at www.ft.com
• Fitch Ratings, 2008a, "Fair value accounting: is it helpful in illiquid markets?" Credit Policy Special Report, April 28
• Fitch Ratings, 2008b, "Fair value disclosures: a reality check," Credit Policy Special Report, June 26
• Global Public Policy Committee, 2007, "Determining fair value of financial instruments under IFRS in current market conditions," available at www.pwc.com
• Guerrera, F., and B. White, 2008, "Banks find way to cushion losses," Financial Times, July 8
• International Accounting Standards Board, 2001, "Financial instruments: recognition and measurement," IAS 39, available at www.iasb.org
• International Accounting Standards Board, 2007, "Disclosure of financial instruments," IFRS 7, available at www.iasb.org
• International Monetary Fund (IMF), 2008a, "Financial stress and economic downturns," Chapter 4, World Economic Outlook, World Economic and Financial Surveys, October
• International Monetary Fund (IMF), 2008b, Global financial stability report, World Economic and Financial Surveys, April
• Kashyap, A., 2005, "Financial system procyclicality," Joint Conference of the IDB and Federal Reserve Bank of Atlanta's Americas Center: Toward better banking in Latin America, Inter-American Development Bank, Washington, January 10
• Keen, H., 1989, "The yield curve as a predictor of business cycle turning points," Business Economics, October, available at http://findarticles.com/p/articles/mi_m1094/is_n4_v24/ai_7987167
• Nickell, P., W. Perraudin, and S. Varotto, 2000, "Stability of rating transitions," Journal of Banking and Finance, 24:1–2, 203–227
• Office of Federal Housing Enterprise Oversight, 2008, Data statistics
• Pederzoli, C., and C. Torricelli, 2005, "Capital requirements and business cycle regimes: forward-looking modeling of default probabilities," Journal of Banking and Finance, 29:12, 3121–3140
• Piazzesi, M., and M. Schneider, 2005, "Equilibrium yield curves," NBER Working Paper 12609
• Union Bank of Switzerland (UBS), 2007, "Subprime loss protection – a do-it-yourself kit," Mortgage Strategist, June 26

Articles

Can ARMs' mortgage servicing portfolios be delta-hedged under gamma constraints?

Carlos E. Ortiz
Associate Professor, Department of Mathematics and Computer Science, Arcadia University

Charles A. Stone
Associate Professor, Department of Economics, Brooklyn College, City University of New York

Anne Zissu
Associate Professor and Chair, Department of Business, CityTech, City University of New York, and Research Fellow, Department of Financial Engineering, New York University, The Polytechnic Institute

Abstract
Ortiz et al. [2008, 2009] develop models for portfolios of mortgage servicing rights (MSR) to be delta-hedged against interest rate risk. Their models rely on the fundamental relationship between prepayment rates (cpr) and interest rates, represented as a sigmoid (S-shaped) function. Defaults that lead to foreclosures or loan modifications on mortgages will either truncate or extend the stream of servicing income generated by pools of adjustable rate mortgages. Ortiz et al.'s previous research focuses on mortgage servicing rights for fixed rate mortgages. In this paper we extend their research to MSR for adjustable rate mortgages (ARMs).


The market for ARMs and the current financial crisis are closely related. The story goes like this: home values were increasing at increasing rates between 1994 and 2007, making the public goal of widespread home ownership more difficult to achieve. Not only were house prices increasing, the rate of increase was going up for most of this period as well (Figure 1).

[Figure 1 – Percentage change in Case-Shiller national home price index (Q1-1988 to Q1-2009)]

As home prices increased, home ownership became too costly for large segments of the population, particularly people with subprime credit scores. Subprime lending filled the gap, and subprime lenders bent over backwards to accommodate more and more borrowers who wanted to get involved with the home buying frenzy. Underwriting standards were lowered and speculative/ponzi loans cloaked in the fabric of thirty-year amortizing loans were originated. Rising home values increased the value of leverage. Leverage was more affordable when the borrower assumed the interest rate risk (adjustable rate mortgages). Competition among mortgage lenders for origination and servicing fees created a supply of unstable mortgage loans that were effectively short term loans with an option to extend or refinance. Rising home prices were expected to act as cover for weak underwriting. Ponzi finance always depends on rising asset values and underpriced credit. Both of these elements became increasingly and extremely scarce at the end of 2006 and the beginning of 2007. As refinancing for subprime borrowers became less available, default became the only exit for a large percent of the subprime marketplace.

Adjustable rate mortgages encompass a wide range of contract designs. The significant differences across ARM contracts are the frequency of interest rate adjustment, the index used to determine the interest rate, the margin over the index, the caps and floors in the contract, and the required minimum interest payment. The basic ARM product will use a money market index such as LIBOR or the constant maturity treasury (CMT) as the benchmark. The interest rate the mortgagor is required to pay will be set at a margin above this index and will adjust periodically. The adjustment period may be a year or longer. The amount of rate adjustment that can take place within an adjustment period and over the length of the loan may be capped and floored. Some ARMs, known as hybrid mortgages, have a fixed rate period that may range from three to ten years. After the fixed rate period is over, the adjustable rate phase begins: the mortgage rate begins to float at a margin above an index. Another product that became common during the extreme acceleration of home prices in 2005 is the option ARM. The distinguishing feature of this ARM product is the choice it gives the borrower to defer interest payments. Deferred interest payments create negative amortization of the loan as the amounts deferred are added onto the outstanding mortgage loan principal. Negative amortization increases the leverage the borrower is using to finance the property. Negative amortization was a short-term solution for borrowers who were trying to conserve cash in the short run. Negative amortization creates an unstable situation when house prices begin to decline and the thin equity cushion the owner has is quickly eroded. This is what happened at the end of the housing boom.

The underwriters underpriced the credit risk because they overestimated the future value of the mortgaged property. It appears that the premise that was prevalent in the mortgage market between 2005 and 2007 was that rising home prices would erase all traces of weak or questionable underwriting. Risky underwriting includes basing loan amounts on the discounted rates as opposed to fully indexed rates. This technique was used extensively to tease borrowers into using more leverage than was prudent. The unsurprising result of offering below market rates is payment shock when the discounted rate is adjusted to a fully indexed rate. Optimists believed that payment shock could be dealt with either by refinancing an outstanding loan with a loan that was more affordable or by sale of the house for more than the amount of the mortgage principal. When both the mortgage and housing markets collapsed in 2007, payment shock led to accelerating default rates.

Servicing income has become an increasingly important source of income for financial institutions as securitization became the preferred method of financing mortgages. Securitization enables financial institutions to originate mortgages in greater volumes than they are able to finance. This is true at the firm level and the industry level. Financial institutions (FI) have transformed their balance sheets from spaces where long-term financing takes place to refineries where the raw material of MBS and ABS is distilled into primary components and then refinanced in the broader capital markets all over the world. What stays behind on the balance sheets of the FI is the servicing contract. The benefits that accrue to FI from operating a financial refinery are the fees associated with originating and servicing financial assets. In this paper we examine how servicers can construct delta hedges to offset losses in value to ARM servicing portfolios from changes in interest rates. Interest rate changes affect the rates of mortgagor prepayment and default.
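As an illustration of the contract features described above (index, margin, reset frequency, caps and floors, and a hybrid fixed period), the following Python sketch models a single rate reset. The class and every parameter value in it are hypothetical, chosen only to mirror the terms discussed in the text.

```python
from dataclasses import dataclass

# Hypothetical sketch of the ARM contract terms discussed above;
# names and parameter values are illustrative, not from the paper.

@dataclass
class ARMContract:
    margin: float            # spread over the index, e.g., 250 bps
    periodic_cap: float      # max rate change per reset
    life_cap: float          # max rate over the life of the loan
    life_floor: float        # min rate over the life of the loan
    fixed_years: int = 0     # >0 for hybrid ARMs (e.g., 5 for a 5/1)
    initial_rate: float = 0.05

    def reset(self, current_rate: float, index_rate: float, year: int) -> float:
        """Move toward the fully indexed rate, subject to caps and floors."""
        if year < self.fixed_years:              # hybrid: rate fixed at first
            return current_rate
        target = index_rate + self.margin        # fully indexed rate
        step = max(-self.periodic_cap,
                   min(self.periodic_cap, target - current_rate))
        return max(self.life_floor, min(self.life_cap, current_rate + step))

# Example: a 5/1 hybrid resetting toward a money market index + 250 bps.
arm = ARMContract(margin=0.025, periodic_cap=0.02, life_cap=0.11,
                  life_floor=0.03, fixed_years=5, initial_rate=0.05)
rate = arm.initial_rate
for year, index in enumerate([0.02, 0.025, 0.04, 0.05, 0.055, 0.05, 0.045]):
    rate = arm.reset(rate, index, year)  # payment shock appears after year 5
```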


Servicing income is derived from the outstanding mortgage principal that is being serviced. Servicing fees range from 25 to 38 basis points of serviced principal. Servicing costs are not constant. The servicer is responsible for the costs associated with delinquencies, workouts, and foreclosures on top of the normal costs of operating servicing centers. The value of servicing is higher when mortgagors make payments on time. Delinquencies, defaults, and foreclosures are costly. Workouts, while costly, are also a source of revenue: while defaults truncate the stream of servicing income, workouts extend it. The fact that mortgage defaults were too many and too fast ("too many to fail") has prompted the government, investors, and lenders to promote workout loans rather than move to foreclose on properties. This makes sense in a deflating property market. Not only does the relationship of fixed costs to variable costs make the value of servicing uncertain, but the principal amount that is serviced at any point in time is also not known in advance because mortgagors can prepay or default. While both prepayments and defaults reduce the future stream of servicing income, their impacts are not equivalent. As already mentioned, a default is more costly to service than a prepayment. If the prepayment is financed with a mortgage that is serviced by the same servicer as the original mortgage, then servicing income may be boosted by the profits associated with the origination of a new loan. Full defaults simply end the servicing income because the principal of the loan is settled with the investors from the sale of the foreclosed property.

The common elements that drive prepayment of ARMs are the relative cost of available fixed rate financing at a point in time to the expected cost of floating rate financing going forward. If an ARM is designed to adjust upward or downward every six months and mortgage rates continue to increase after the mortgage is originated, the cause for refinancing will not be to secure an immediate saving but rather to lower the expected present value of future funding costs. Fixed rates will always be above adjustable rates except for the rare case of a very steep downward sloping yield curve. As long as fixed rates are above adjustable rates, the exercise of the ARM prepayment option would be prompted by a mortgagor trying to protect resources from future increases in the ARM index. The incentive to refinance an ARM is not one sided as it is for a fixed rate mortgage. When fixed rates fall by enough from the levels that existed when the ARM was originated, the mortgagor may have the incentive to switch into the less risky fixed rate mortgage. Hybrid ARMs that combine features of fixed rate and adjustable rate instruments have become popular. Mortgages such as the 5/1 ARM offer fixed rates for a five-year period after which the rates begin to adjust. There is evidence that at the cusp between the adjustable rate and fixed rate periods, mortgagors act to avoid a sharp upward revision in interest payments by attempting to refinance out of the ARM into a more affordable loan if one is available [Pennington-Cross and Ho (2006)].

The flow of subprime risk into the mortgage market increased dramatically in the years leading up to the crash in the housing market and the collapse of the capital and money markets in 2008. For example, as a percent of total mortgage originations in terms of principal in 2006, 2007, and 2008, the percentages of MBSs backed by subprime mortgages that were issued were 16.8%, 9.01%, and 0.13%, respectively. These numbers are important because servicing income became more unstable as it was backed by mortgages that were speculative in nature. If borrowers were unable to call their mortgage by refinancing with a more affordable loan, then they often exercised the other embedded option, the right to put the mortgaged property to the lender. The current wave upon wave of foreclosures was triggered by the collapse of the subprime market. Going forward, what will have an important impact on servicing income derived from ARMs are the public policy and private initiatives that are being executed to modify outstanding mortgage loans, many of which are and will be subprime ARMs. "Currently, about 21 percent of subprime ARMs are ninety days or more delinquent, and foreclosure rates are rising sharply." (Chairman Ben S. Bernanke, Financial Markets, the Economic Outlook, and Monetary Policy, speech at the Women in Housing and Finance and Exchequer Club Joint Luncheon, Washington, D.C., January 10, 2008)

To arrive at a rough idea of how significant the market for mortgage servicing rights is, we assume there is a 25 bps servicing fee charged against all outstanding first lien mortgage loans financing 1 to 4 family residences, whether securitized or not. The approximate servicing income generated from 2005 through Q2 2009 is provided in Figure 2. Hedging this income correctly at a reasonable cost can protect the capital of banks and make cash flows less volatile. Figure 4 illustrates the breakdown between fixed rate mortgages and adjustable rate mortgages. As of May 2007, subprime ARMs accounted for two-thirds of the first lien subprime mortgage market and 9% of all first lien mortgages. "After rising at an annual rate of nearly 9 percent from 2000 through 2005, house prices have decelerated, even falling in some markets. At the same time, interest rates on both fixed- and adjustable-rate mortgage loans moved upward, reaching multi-year highs in mid-2006. Some subprime borrowers with ARMs, who may have counted on refinancing before their payments rose, may not have had enough home equity to qualify for a new loan given the sluggishness in house prices." (Chairman Ben S. Bernanke, The Subprime Mortgage Market, speech at the Federal Reserve Bank of Chicago's 43rd Annual Conference on Bank Structure and Competition, Chicago, Illinois, May 17, 2007)

                                   2005          2006          2007          Q4-2008       Q2-2009
Outstanding mortgage               U.S.$9.385    U.S.$10.451   U.S.$11.140   U.S.$11.042   U.S.$10.912
principal (first lien)             trillion      trillion      trillion      trillion      trillion
Servicing income = 25 bp           U.S.$23.462   U.S.$26.128   U.S.$27.852   U.S.$27.606   U.S.$27.281
x outstanding mortgage principal   billion       billion       billion       billion       billion

Figure 2 – Estimate of annual mortgage servicing income. Source for outstanding mortgage principal is Freddie Mac.
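The servicing income figures in Figure 2 follow from simple arithmetic, which the short Python check below reproduces. The principal figures are the Freddie Mac numbers quoted in the figure.

```python
# Back-of-the-envelope check of Figure 2: a 25 bp servicing fee applied to
# outstanding first-lien principal (U.S.$ trillions, per Freddie Mac).
principal_tn = {"2005": 9.385, "2006": 10.451, "2007": 11.140,
                "Q4-2008": 11.042, "Q2-2009": 10.912}
fee = 0.0025  # 25 basis points

income_bn = {period: p * 1000 * fee for period, p in principal_tn.items()}
# e.g., 2005: 9.385 trillion x 25 bp = ~U.S.$23.46 billion, matching the figure
```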


If we look at the CMT (constant maturity Treasury) rate or the six-month LIBOR, we see that rates rise incrementally, placing increasing pressure on the borrowers' ability to repay as ARM rates increase. At some point the mortgagor will have to decide if further increases in rates are sustainable. Rising ARM indices will prompt the mortgagor to search for a way out of the increasingly costly contract. The three paths out are prepayment, modification, or default. It must be noted that while conventional mortgages are typically issued without prepayment penalties, subprime mortgages often do have prepayment penalties attached. These penalties make prepayment more costly.

Institutions such as Citibank, Wells Fargo, Bank of America, or Countrywide rely on servicing income and take measures to hedge this income. We are offering a technique for hedging the negative fallout to servicing income derived from ARM loans that face increasing risks of prepayment and default as interest rates rise.

The subprime mortgage loan cohorts that had the worst performance are 2005, 2006, and 2007. Subprime mortgages originated in 2007 experienced the most rapid foreclosure rates. The 2007 cohort loans were originated as home prices began their steep decline and refinancing became next to impossible (Testimony before the Joint Economic Committee, U.S. Congress, Home Mortgages: Recent Performance of Nonprime Loans Highlights the Potential for Additional Foreclosures, statement of William B. Shear, Director, Financial Markets and Community Investment, July 28, 2009).

It is estimated that 21.9% of subprime and Alt-A loans set to reset in the third quarter of 2009 are already 30+ days overdue. This makes these loans likely candidates for default or prime candidates for loan workouts (Data report No. 1, February 2008, State Foreclosure Prevention Working Group).

To stem the tide of foreclosures, the U.S. Government has initiated the "Making Home Affordable Program" (HAMP). This program offers mortgagors the chance to refinance with a more affordable mortgage or to modify a mortgage that is delinquent or at risk of becoming delinquent. Servicers are offered compensation to facilitate the modification of loans that qualify for the program. These fees, along with the savings that servicers will gain from avoiding foreclosure and the increased longevity of the loan, make HAMP interesting. The program is being ramped up so that the rate of foreclosures can be stemmed, which would help to stabilize the housing market. HAMP supplements the efforts of lenders who are taking actions to save value by working with mortgagors. Loan modifications extend the life of mortgages, enhancing the value of servicing contracts. Rather than lose the mortgage principal that is being serviced via a foreclosure, the idea is to modify loans so that borrowers can stay current with the new terms (OCC and OTS Mortgage Metrics Report: Disclosure of National Bank and Federal Thrift Mortgage Loan Data, Second Quarter 2009, Office of the Comptroller of the Currency and Office of Thrift Supervision, Washington, D.C., September 2009).

Prepayment function for variable rate mortgages
Our model of ARM prepayments is based on a number of general assumptions about prepayment behavior. First of all, the diversity of ARM contracts means that it is not possible to make accurate general statements about all ARMs. The ARM contracts to which our analysis is most applicable are subprime ARMs that were issued with initial index discounts, option ARMs, and hybrid ARMs. All of these mortgage contracts set up the potential for an extreme increase in required payments once the teaser period or the fixed rate period is over and the interest is fully indexed to current money market rates. A glance at Figure 3 shows how rapidly the money market rates to which many ARMs are indexed (the six-month LIBOR rate and the 1-year CMT) increased between 2004 and 2006. This rapid increase in interest rates resulted in serious stress on the ability of many households that had issued ARMs to continue to make monthly payments. Since the rate increases happened discretely and index resets are not continuous, mortgagors have time to make decisions that lower the value of their payment burdens. These options include prepayment and default [Pennington-Cross and Ho (2006)]. This increases the likelihood of a delinquency leading to default or a delinquency leading to prepayment.

The following assumptions about mortgagor prepayment and servicing are integral to our model of delta-hedging servicing income.

■■ The prepayment rate, cpr, for ARMs (adjustable rate mortgages) is a function of the market interest rate y and default rates d. In our model, the cpr of ARMs increases when interest rates go up, or are expected to increase, because mortgagors have an incentive to lock in current rates using fixed rate mortgages or to search for alternative ARMs with lower rates. Mortgage defaults impact the cpr in the same way as prepayments. As a result of this assumption, the cpr function is no longer the S-shape found with fixed-rate mortgages [Stone and Zissu (2005)]. The cpr function for ARMs that prompt prepayments or defaults in rising interest rate environments becomes an inverted S-shape (the mirror of the prepayment function for fixed rate mortgages). As interest rates decline relative to the contract rate, borrowers will still have an incentive to refinance when the obtainable fixed rate is less costly than the expected adjustable rates in the future.
■■ Default rates on ARMs are positively correlated with rising interest rates and diminishing home values.


■■ The magnitude and value of income derived from mortgage servicing rights (MSR) depend on default rates and prepayment rates.

[Figure 3 – ARM indices: 6-month LIBOR rates and 1-year CMT rates, 1998 to 2009]

[Figure 4 – Conventional 30-year mortgage rate versus Treasury-indexed 5/1 hybrid adjustable rate mortgages, 2005 to 2009]

We delta-hedge a portfolio of mortgage servicing rights for ARMs with other fixed income securities such that the value of the servicing portfolio is not affected by increases or decreases in market rates. In order to obtain the portfolio that requires the lowest cost delta hedge, we compare hedge ratios dynamically. This paper applies Ortiz et al.'s model of a delta-hedge-ratio function, developed for a portfolio of fixed rate mortgages, to a portfolio of servicing rights derived from a pool of ARMs. We develop the gamma-hedge-ratio function using three types of fixed-income securities: a coupon paying bond with n years to maturity (case 1), a zero coupon bond with n years to maturity (case 2), and a bond that pays coupons only twice in the life of the bond, at n/2 and then at n (case 3).

"The fair value of the MSRs is primarily affected by changes in prepayments that result from shifts in mortgage interest rates. In managing this risk, Citigroup hedges a significant portion of the value of its MSRs through the use of interest rate derivative contracts, forward purchase commitments of mortgage-backed securities, and purchased securities classified as trading (primarily mortgage-backed securities including principal-only strips)." (Citigroup's 2008 annual report on Form 10-K). We have selected cash purchases of bonds to effect our delta hedge. Our research can be extended to incorporate the instruments that banks like Citigroup employ to hedge MSR.

The option to prepay an ARM is not as straightforward as it is for a fixed rate mortgage. Mortgagors have financial incentives to prepay ARMs when interest rates are either rising or falling. This is because ARM mortgages shift the interest rate risk to the borrower. Our contribution is to set up a delta hedge for MSR that diminish in value as mortgagors try to avoid further rate increases.

"The adjustable-rate mortgage loans in the trust generally adjust after a one month, six month, one year, two year, three year, five year, or seven year initial fixed-rate period. We are not aware of any publicly available statistics that set forth principal prepayment experience or prepayment forecasts of mortgage loans of the type included in the trust over an extended period of time, and the experience with respect to the mortgage loans included in the trust is insufficient to draw any conclusions with respect to the expected prepayment rates on such mortgage loans." (Prospectus supplement to prospectus dated June 28, 2006, DLJ Mortgage Capital, Inc., sponsor and seller, adjustable rate mortgage-backed pass-through certificates, series 2006-3)

Model and applications
We express the relationship between the prepayment rate and the change in basis points in equation (1). The prepayment rate, cpr, is primarily a function of the difference between the new mortgage rate y in the market and the contracted mortgage rate r.

cpr = a/(1 + e^(b(y-r))) (1)

Figure 5 shows the relationship between the spread between market rates y and the contractual rate r on an adjustable-rate mortgage and the cpr of the mortgage. The greater the spread, the higher the incentive mortgagors will have to switch from an ARM to a fixed rate mortgage, or to default on their ARM loan if new affordable financing is not available. The default option is typically exercised when home equity has become negative. Declining home equity can be the result of either an increase in the value of the mortgage liability or a decline in the value of the mortgaged property, or both.

[Figure 5 – Prepayment function: plot of cpr versus (y - r)]

Of course, declining home values are the fundamental cause of negative home equity. Default rates have been stronger than refinancing rates during the subprime crisis. Default rates by subprime borrowers have swamped refinancing rates by these borrowers because of rapidly declining values of home equity and increasing joblessness, even as rates have come down to historical lows.

Kalotay and Fu (2009) illustrate that the option value offered to mortgagors differs across mortgage contracts. They show that the 5/1 ARM has a lower option value than the fixed rate mortgage and a higher option value than the 1-year ARM. The option value increases as the mortgage incorporates more elements of the 30-year fixed rate mortgage. A lower option value has generally translated into a lower initial interest rate for the borrower [Kalotay and Fu (2009)].

Valuation of MSR
The cash flow of a MSR portfolio at time t is equal to the servicing rate s times the outstanding pool in the previous period:

MSR_t = (s)m0(1 - cpr)^(t-1)B_(t-1) (2)

where:
m0 = number of mortgages in the initial pool at time zero
B0 = original balance of individual mortgage at time zero
r = mortgage coupon rate
cpr = prepayment rate
m0(1 - cpr)^t = number of mortgages left in the pool at time t
B_t = outstanding balance of individual mortgage at time t
s = servicing rate

We express the value of mortgage servicing rights as:

V(MSR) = (s)m0 Σ [(1 - cpr)^(t-1)B_(t-1) ÷ (1 + y)^t] (3)

with t = 1, …, n (throughout the entire paper).

Equation (3) values a MSR portfolio by adding each discounted cash flow generated by the portfolio to the present, where n is the time at which the mortgages mature, and y is the yield to maturity.

After replacing equation (1) in equation (3) we obtain the MSR function as:

V(MSR) = (s)m0 Σ [(1 - a/(1 + e^(b(y-r))))^(t-1)B_(t-1) ÷ (1 + y)^t] (3a)

We use the following values for equation (3a) to generate the MSR over a range of market rates y, and we present the results in Figure 6:
m0 = 100
B0 = $100,000
r = 6%
a = 0.4
b = 100
cpr = prepayment rate
s = 0.25%
n = 30

Before commenting on Figure 6, it is important to understand that a mortgage servicing right is equivalent to an interest only security (IO). The ultimate cash flow that an IO generates over time is directly tied to the amount of underlying principal outstanding at any point in time. An interest only security is derived by separating interest from principal cash flows when the mortgagor's payment is made, before distributing it to investors. The interest component is distributed to the IO investors and the principal component is distributed to the PO (principal only) investors. The typical graph of an IO security over market rates shows an increase in the value of MSR as market rates increase, because the prepayment rate decreases (the prepayment effect is stronger than the discount effect), until its value reaches a maximum (the prepayment effect equals the discount effect) and then starts to decrease (the discount effect is greater than the prepayment effect).

[Figure 6 – Value of mortgage servicing rights: plot of V(IO) versus y]
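A minimal Python sketch of equations (1) and (3a), using the parameter values listed above, makes the valuation mechanical. One simplifying assumption is flagged in the comments: the outstanding balance B is held constant rather than amortized, so the sketch reproduces the shape, not the exact level, of the Figure 6 profile.

```python
import math

# Sketch of equations (1) and (3a): prepayment function and MSR valuation,
# with the parameter values given in the text.
a, b = 0.4, 100.0          # prepayment function parameters
r = 0.06                   # contract mortgage rate
s = 0.0025                 # servicing rate (25 bp)
m0, B0, n = 100, 100_000.0, 30

def cpr(y: float) -> float:
    """Equation (1): prepayment rate as a logistic function of y - r.
    The sign of b controls whether the S-shape is standard or inverted."""
    return a / (1.0 + math.exp(b * (y - r)))

def msr_value(y: float) -> float:
    """Equation (3a). Assumption: B is held at B0 (no amortization), so
    only pool survival and discounting drive the value."""
    rate = cpr(y)
    return sum(s * m0 * (1.0 - rate) ** (t - 1) * B0 / (1.0 + y) ** t
               for t in range(1, n + 1))

# Valuing the servicing strip across market rates traces the non-linear
# profile discussed around Figure 6: a prepayment-driven region followed
# by a discount-driven one.
curve = [(y / 100.0, msr_value(y / 100.0)) for y in range(1, 16)]
```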


The value of MSR derived from a pool of ARMs with respect to changes in market interest rates differs from the value of MSR for fixed rate mortgages because the prepayment function is inverted. This is illustrated in Figure 6. When market rates start to increase, mortgagors who anticipate that further rate increases are on the horizon will attempt to refinance their mortgages with fixed rate mortgages. Those who cannot secure a fixed rate mortgage at acceptable terms may have to endure further rate increases, which will lead to higher default rates. The effect of rising interest rates for certain classes of subprime ARMs will be a diminution of outstanding principal. Since this principal is the source of servicing fees and the discount rate is now higher, the value of MSR must fall: we are discounting a smaller stream of cash flows at a higher rate.

ARMs do not all reset at the same margin above an index. Rates that were initially set at a discount from market value will at some point reset to the fully indexed margin. An extreme change in rates when the teaser period is over leads to what is known as payment shock. Anticipation of payment shock increases the incentive to prepay. In 2004, the Fed began raising the Fed funds rate; in December 2003 the rate was 1%, and by June 2007 it had risen to 5.25%. This increase in money market rates led to increases in ARM indices, a lower demand for mortgage finance, higher default rates, and the beginning of the reduction in the supply of mortgage credit. A negative feedback loop was set in motion: falling house prices increased default rates, which lowered the supply of mortgage credit even further, which placed further downward pressure on house prices.

The LIBOR index rose by more than the CMT index, especially in 2008 when interbank lending became severely curtailed. Rising interest rates can lead to payment shock for initially discounted ARMs. If mortgagors who have issued ARMs expect rates to continue rising, they can see that their household finances will become stressed. While prepayment into a fixed rate mortgage will not necessarily lower payments, it may lower the value of the mortgage liability relative to the initial ARM by shielding the borrower from further rate increases.

Rapidly increasing interest rates will increase the incentive to refinance and the likelihood of default. The incentive to get out of ARMs that are becoming too costly will depress the value of MSR. Declining interest rates and lower default rates will enhance the value of MSR in our model over a range of rates. Notice that the plot of MSR in Figure 6 is far from linear. At first, rising rates diminish the value of the MSR, the IO strip. Once the MSR reaches a minimum, further rate increases lead to an increase in the value of MSR. This is explained by the absence of refinancing opportunities that offer savings, perhaps coupled with rising home values. Again, at some point the discount rate effect takes over and the discounted value of MSR begins to decline.

The delta hedge ratio
We use the same model (OSZ) [Ortiz et al. (2008)] developed to obtain a delta-hedged portfolio of fixed rate bonds and MSRs derived from fixed rate mortgages. In this paper we again hedge MSRs with fixed rate bonds, with the difference that the MSRs are backed by adjustable rate mortgages.

αV(MSR) + βV(B) = K (4)

where K is the constant value of the portfolio of MSRs and bonds, and α and β are the shares of MSRs and bonds, respectively, that are consistent with a zero-delta portfolio satisfying the constraint K.

OSZ obtain the hedge ratio α as a function of the MSR's and bond's deltas and their respective values:

α = [-K(dV(B)/dy)] ÷ [(dV(MSR)/dy)V(B) - (dV(B)/dy)V(MSR)] (5)

OSZ simulations and analysis
In the following three cases we use the OSZ model to create the delta hedge of ARM mortgage servicing rights; a numerical sketch of the computation follows the case definitions.

Case 1
A regular bond with yearly coupons and face value paid at maturity:

V(B) = c Σ 1/(1+y)^t + Face/(1+y)^n

c = $350,000
Face = $5,000,000
y = 5%
n = 10

Case 2
A zero-coupon bond:

V(B) = Face/(1+y)^n

We keep the values of the variables the same as before:
Face = $5,000,000
y = 5%
n = 10

Case 3
A bond that pays coupons only twice in the life of the bond, at n/2 and then at n:

V(B) = c/(1+y)^(n/2) + c/(1+y)^n + Face/(1+y)^n

c = $350,000
n = 10
Face = $5,000,000
y = 5%
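The hedge ratio in equation (5) can be evaluated numerically for all three bond cases, as in the minimal Python sketch below. It restates the MSR value from the previous sketch so the block is self-contained, approximates the deltas by central finite differences, and normalizes K to 1, which is an assumption; α simply scales with K.

```python
import math

# Restated from the previous sketch: equation (3a) MSR value (B held at B0).
a, b, r, s, m0, B0, n = 0.4, 100.0, 0.06, 0.0025, 100, 100_000.0, 30
def msr_value(y):
    rate = a / (1.0 + math.exp(b * (y - r)))
    return sum(s * m0 * (1.0 - rate) ** (t - 1) * B0 / (1.0 + y) ** t
               for t in range(1, n + 1))

# Bond values for the three cases (c, Face, n as given in the text).
c, face, n_b = 350_000.0, 5_000_000.0, 10
def bond_case1(y):  # annual coupons plus face at maturity
    return sum(c / (1 + y) ** t for t in range(1, n_b + 1)) + face / (1 + y) ** n_b
def bond_case2(y):  # zero-coupon bond
    return face / (1 + y) ** n_b
def bond_case3(y):  # coupons only at n/2 and at n
    return c / (1 + y) ** (n_b // 2) + c / (1 + y) ** n_b + face / (1 + y) ** n_b

def delta(f, y, h=1e-6):
    """Central finite-difference approximation of dV/dy."""
    return (f(y + h) - f(y - h)) / (2 * h)

def alpha(bond, y, K=1.0):
    """Equation (5): the zero-delta share of MSR in the hedged portfolio."""
    dB, dM = delta(bond, y), delta(msr_value, y)
    return -K * dB / (dM * bond(y) - dB * msr_value(y))

# Tracing alpha for each case over a grid of market rates, in the spirit of
# Figure 7; the sign pattern of each curve identifies the zone boundaries
# discussed in the text.
grid = [y / 1000 for y in range(5, 155, 5)]
alphas = {fn.__name__: [alpha(fn, y) for y in grid]
          for fn in (bond_case1, bond_case2, bond_case3)}
```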


α corresponds to the share of MSR in the portfolio of MSR and bonds that delta-hedges the portfolio of MSR for small changes in interest rates. A high α means that less bond principal must be positioned to delta-hedge the MSR portfolio, making for a less costly hedge. The optimal hedge is the one that requires the highest-α bond portfolio.

We now examine Figure 7, which plots on the vertical axis the value of α and on the horizontal axis the market interest rate available to mortgagors who have issued ARMs but are now considering prepaying their mortgage. The interest rate on the underlying ARM mortgage pool is 5%. Again, α is the share of the MSR in the hedged portfolio.

We have divided the horizontal axis into three zones.

Zone 1: starts at a market rate of 0% and ends where the three α functions cross the horizontal axis for the first time. This crossover point corresponds to the minimum value of the MSR in Figure 6. In zone 1 the maximum α (most efficient hedge) is achieved by purchasing zero coupon bonds. As market rates begin to rise, the value of MSR derived from certain pools of ARMs will begin to decline as defensive prepayments begin to increase.

Zone 2: corresponds to the range of market rates where the three α functions are negative. In this range, the optimal delta hedge is obtained by shorting n-year coupon bonds.

Zone 3: starts where the three α functions cross the horizontal axis for the second time; that point corresponds to the maximum value of MSR in Figure 6. In that zone, the highest level of α is achieved by adding zero-coupon bonds (case 2) to the MSR.

[Figure 7 – The share of MSR (α) in a delta-hedged portfolio. Plots of α for the three cases: case 1, n-year annual coupon bonds (red line); case 2, n-year zero coupon bonds (blue line); case 3, a bond with duration less than case 1 and greater than case 2 (green line).]

Conclusion
Adjustable rate mortgages shift, to varying degrees, interest rate risk from lenders/investors to borrowers. The hybrid mortgage combines elements of the fixed-rate mortgage and the adjustable rate mortgage by offering borrowers a fixed period prior to the adjustable period. Adjustable-rate mortgages were a significant component of the subprime mortgage market. The value of mortgage servicing rights is extremely sensitive to changes in market rates and, of course, to default rates, which are directly impacted by market rates. Holders of MSR can delta-hedge their MSR portfolios against interest rate risk. This paper simulates three different portfolios of MSR and bonds. While the servicer will sometimes choose to take a long position in zero-coupon bonds and sometimes to short the n-year coupon paying bond, the intermediate case is never selected. The intermediate bond has a shorter duration than the zero-coupon bond and a longer duration than the n-year coupon bond.

We have illustrated that the financial instrument that servicers must use to effectively delta-hedge a portfolio of servicing rights that is losing value as interest rates rise changes as the spread between the market mortgage rate and the contract rate changes. We hedge servicing rights derived from ARMs that expose the mortgagor to payment shock as fixed rates become adjustable and the contract rate is fully marked to a margin above the ARM index. In addition to the changing incentive to refinance into a more affordable mortgage, there is the increasing risk of default if the opportunity to refinance is not available. The hedging of ARMs in our model also incorporates the discount effect on future servicing income.

A very important dimension of the market for mortgage servicing rights, and particularly servicing income derived from subprime mortgages, and even more specifically from subprime ARMs, is the effort the government and the private sector are making to modify loans before they default. These efforts will boost the value of MSRs because the servicers will continue to service principal that would otherwise have been lost to foreclosures. In addition, servicers that participate in the government's HAMP program will retain servicing income that could have been lost to other lenders/servicers.

References
• Bierwag, G. O., 1987, Duration analysis: managing interest rate risk, Ballinger Publishing Company, Cambridge, MA
• Boudoukh, J., M. Richardson, R. Stanton, and R. F. Whitelaw, 1995, "A new strategy for dynamically hedging mortgage-backed securities," The Journal of Derivatives, 2, 60-77
• Ederington, L. H., and W. Guan, 2007, "Higher order Greeks," Journal of Derivatives, 14:3, 7-34
• Eisenbeis, R. A., W. S. Frame, and L. D. Wall, 2006, "An analysis of the systemic risks posed by Fannie Mae and Freddie Mac and an evaluation of the policy options for reducing those risks," Federal Reserve Bank of Atlanta, Working Paper 2006-2
• Goodman, L. S., and J. Ho, 1997, "Mortgage hedge ratios: which one works best?" The Journal of Fixed Income, 7:3, 23-34
• Kalotay, A. J., and Q. Fu, 2009, "Consumer mortgage decisions," Research Institute for Housing America, June
• Office of Thrift Supervision, 2007, "Hedging – mortgage servicing rights," Examination book, 750.46-750.52, July
• Ortiz, C., C. A. Stone, and A. Zissu, 2008, "Delta hedging of mortgage servicing portfolios under gamma constraints," Journal of Risk Finance, 9:4, 379-390
• Ortiz, C., C. A. Stone, and A. Zissu, 2009, "Delta hedging a multi-fixed-income-securities portfolio under gamma and vega constraints," Journal of Risk Finance, 10:2, 169-172
• Pennington-Cross, A., and G. Ho, 2006, "The termination of subprime hybrid and fixed rate mortgages," Research Division, Federal Reserve Bank of St. Louis Working Paper Series, July
• Posner, K., and D. Brown, 2005, "Fannie Mae, Freddie Mac, and the road to redemption," Morgan Stanley, Mortgage Finance, July 6
• Stone, C. A., and A. Zissu, 2005, The securitization markets handbook: structure and dynamics of mortgage- and asset-backed securities, Bloomberg Press

Articles

The time-varying risk of listed private equity

Christoph Kaserer
Full Professor, Department of Financial Management and Capital Markets, Center for Entrepreneurial and Financial Studies (CEFS), Technische Universität München

Henry Lahr
Center for Entrepreneurial and Financial Studies (CEFS), Technische Universität München

Valentin Liebhart
Center for Entrepreneurial and Financial Studies (CEFS), Technische Universität München

Alfred Mettler
Clinical Associate Professor, Department of Finance, J. Mack Robinson College of Business, Georgia State University

Abstract
Structure and stability of private equity market risk are still nearly
unknown, since market prices are mostly unobservable for this
asset class. This paper aims to fill this gap by analyzing market
risks of listed private equity vehicles. We show that aggregate
market risk varies strongly over time and is positively correlated
with the market return variance. Cross-sections of market risk are
highly unstable, whereas ranks of individual vehicles within yearly
subsamples change slightly less over time. Individual CAPM betas
are predictable only up to two to three years into the future and
quickly converge to a stationary distribution when measured in
risk classes in an empirical Markov transition matrix. We suspect
that market risk of private equity is affected by factors unique to
this sector: acquisitions and divestments that constantly rebalance
portfolios, scarcity of information about portfolio companies, and
rapid changes within portfolio companies. Unstable market risk
seems to be a fundamental characteristic of private equity assets,
which must be incorporated into valuation and portfolio allocation
processes by long-term private equity investors. Large increases in
systematic risk in recent years cast doubt on diversification ben-
efits of private equity in times of crisis.


Performance measurement and portfolio allocation are notoriously difficult when dealing with private equity assets due to the non-transparent nature of this asset class. To evaluate the success of an investment, its risk premium as derived from factor pricing models can serve as a benchmark for required returns in excess of the risk-free rate. Private equity's market risk, albeit hard to measure in traditional private equity funds, can be obtained from listed private equity (LPE) vehicles. In this paper, we measure aggregate and individual market risk and its variability over time. Private equity investors can benefit from a quantification of private equity market risks and their variability, since this asset class represents a substantial share of international investment opportunities. The private equity sector had more than U.S.$2.5 trillion of capital under management in 2008 according to International Financial Services London [IFSL (2009)]. This large volume demands a time-variation analysis of systematic risks. In addition, industry-specific characteristics caused by private equity business models shape the evolution of risk within this asset class: acquisitions and divestments of portfolio companies constantly rebalance fund portfolios, which should lead to highly unstable market risk.

Several authors have focused on non-constant risk premia in equities, bonds, and REITs. Early papers discussed the impact of risk variability on portfolio decisions [Levy (1971), Blume (1971)]. Later work developed the conditional capital asset pricing model and similar models using mostly public market data [Lettau and Ludvigson (2001), Ghysels (1998), De Santis and Gérard (1997), Jagannathan and Wang (1996), Bollerslev et al. (1988), Ferson et al. (1987)]. Time-varying risk properties of private equity, however, have not been examined by empirical research.

The difficulty with risk measurement in traditional (unlisted) private equity vehicles lies in the opacity of their price formation. Time series of returns are hardly observable, which renders estimation of market risk nearly impossible. Many attempts at measuring systematic risk are based on voluntarily reported returns of private equity funds, internal rate of return (IRR) distributions [Kaplan and Schoar (2005)], cash flows [Driessen et al. (2009)], transaction values of portfolio companies [Cochrane (2005), Cao and Lerner (2009)], or the matching of portfolio companies to publicly listed companies with similar risk determinants [Ljungqvist and Richardson (2003), Groh and Gottschalg (2009)].

Private equity vehicles that are listed on international stock exchanges provide a natural alternative to unlisted ones when estimating their risk. Return data are readily available and can be used to answer risk-related questions: what are the market risk patterns of listed private equity throughout the life of the security? How stable are the market risks of listed private equity companies?

We first analyze the market risk structure of listed private equity. For this purpose, we measure market risk in an international capital asset pricing model (CAPM) using Dimson (1979) betas. While Lahr and Herschke (2009) measure constant betas over the lifetime of listed private equity vehicles, we take a step further in considering their time series properties. In our model, market risk is measured over a rolling window to generate a continuous set of beta observations, which describes the aggregate asset class risk over time. Second, we examine the stability of individual betas. Correlations of cross-sections for consecutive years can offer insights into relative changes of risk within the asset class. We find that market risk of listed private equity is quite unstable if time periods longer than two years are considered. In a second step, we compute transition probabilities between risk classes. Our results reflect the instability of risk in general, but highlight a moderate persistence of exceptionally high and low risks.

Stability of market risk
A broad picture of how market risk evolves in listed private equity can be seen from yearly cross-sections. In this section, we show the main properties of market risk measurement for our listed private equity sample over different time horizons: rolling windows of one and two years, and total lifetime. Our sample of listed private equity vehicles is based on the data from Lahr and Herschke (2009). They generate a comprehensive list of listed private equity companies, which we extend to account for recent listings. Our final sample includes 278 vehicles, the largest proportions being investment companies and funds that are managed externally. The time horizon chosen for our analysis is January 1, 1989 to March 25, 2009, although not all companies have return data during the entire time period due to new listings and delistings.

To measure the market risk of our sample vehicles, we use individual return indices from Thomson Datastream in U.S. dollars, the 3-month Treasury bill rate as a proxy for the risk-free rate, and MSCI World index returns in U.S. dollars as a proxy for market returns. All return data are converted to logarithmic returns. During the time period studied in this paper, 33 companies were delisted. All companies enter our analysis when return data become available and drop out if they delist or if trading volume is zero after some date.

Market risk estimation
To obtain market risks we regress excess LPE stock returns on excess market returns (MSCI World). We employ a Dimson regression to account for autocorrelation in asset returns caused by illiquidity in LPE vehicles. Early studies showed that in similar settings autocorrelation on the portfolio level can be a problem [Fisher (1966), French et al. (1987)]. We use the results of Dimson (1979) and incorporate 7 lagged market returns in the estimation model to adjust for autocorrelation. In a second step we aggregate the lags as proposed by Dimson to obtain a measure of market risk. Since our sample consists of vehicles traded on international exchanges,
Market risk estimation
To obtain market risks we regress excess LPE stock returns on excess market returns (MSCI World). We employ a Dimson regression to account for autocorrelation in asset returns caused by illiquidity in LPE vehicles. Early studies showed that in similar settings autocorrelation on the portfolio level can be a problem [Fisher (1966), French et al. (1987)]. We use the results of Dimson (1979) and incorporate 7 lagged market returns in the estimation model to adjust for autocorrelation. In a second step we aggregate the lags as proposed by Dimson to obtain a measure of market risk. Since our sample consists of vehicles traded on international exchanges, we account for currency risk by introducing risk factors for the euro, yen, and pound sterling, which represent excess returns of currency portfolios in U.S. dollars. The international capital asset pricing model is thus given by

r_{i,t} = \alpha_i + \sum_{k=0}^{7} \beta_{i,k} r_{m,t-k} + \gamma_{i,1} GBP_t + \gamma_{i,2} EUR_t + \gamma_{i,3} JPY_t + \varepsilon_{i,t}

where r_{i,t} and r_{m,t-k} are the respective observed logarithmic (excess) individual vehicle and market returns at times t and t-k, and k corresponds to the respective lag. The intercept \alpha_i represents a vehicle-specific constant excess return, the \beta and \gamma terms are the slope coefficients, and \varepsilon_{i,t} is an error term. We use this regression equation to calculate market risks for different time periods: one-year, two-year, and lifetime market risks.
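A minimal sketch of this Dimson-style estimation is given below, assuming that excess returns and the three currency factor series are already aligned weekly data. The function and variable names are invented for illustration, and statsmodels is used for the OLS fit; this is not the authors' implementation.

import pandas as pd
import statsmodels.api as sm

def dimson_beta(excess_ret: pd.Series, excess_mkt: pd.Series,
                fx: pd.DataFrame, lags: int = 7) -> float:
    """Regress a vehicle's excess returns on contemporaneous and lagged
    excess market returns plus currency factors (GBP, EUR, JPY columns),
    then aggregate the market slopes as proposed by Dimson (1979)."""
    X = pd.concat(
        {f"mkt_lag{k}": excess_mkt.shift(k) for k in range(lags + 1)}, axis=1
    )
    X = X.join(fx)  # currency portfolio excess returns in U.S. dollars
    data = pd.concat([excess_ret.rename("r"), X], axis=1).dropna()
    fit = sm.OLS(data["r"], sm.add_constant(data.drop(columns="r"))).fit()
    # The Dimson beta is the sum of the market slope coefficients.
    return fit.params[[f"mkt_lag{k}" for k in range(lags + 1)]].sum()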
Yearly cross-sections and aggregate market risk
We first illustrate the behavior of aggregate market risk in a time series context before taking a closer look at individual risk stability. Time series of aggregate risk can be constructed from measures of market risk in rolling windows. We define two such rolling windows: the first spans 52 weekly observations and the second has 104 observations. Similar windows are used by Bilo (2002), who measures historical return variances but does not examine the time series properties of systematic risk.

Table 1 summarizes the main market risk statistics for different observation windows. Mean one-year betas range from a minimum of 0.22 in 1993 to a maximum of 1.36 in 2000. Two-year betas are highest for the periods 1999-2000 and 2007-2008. One-year betas as well as two-year betas vary around an average that is almost equal to unity. All periods and estimation windows exhibit a large cross-sectional variation in market risk. This might be caused by the huge diversity within the listed private equity asset class, which includes many small vehicles with strongly differing business models. Interestingly, mean betas are positively correlated with their standard deviation. This suggests that mean betas are driven by vehicles with huge betas, as indicated by a skewed distribution of betas.

Figure 1 shows a more detailed picture over time. Listed private equity betas are first estimated for each vehicle in rolling windows. These individual betas are then weighted equally over all vehicles for which a beta could be calculated at a given point in time. Figure 1 reveals the volatile nature of private equity market risk. Even mean betas vary widely over time. Beta variability is smaller for two-year betas due to smoothing but higher at the beginning of our observation period. This higher variability could be caused by the smaller number of vehicles compared to later years, which makes the sample mean a less efficient estimator of the true mean. The one-year beta time series exhibits mean-reverting behavior, oscillating between low values around zero during the early 1990s and peaks in 1996 and 2000. Its long-term average, however, is a moderate 0.98. Two-year betas behave similarly around a time series mean of 1.07. They are lower than the market average during the 1990s and again from 2006 through 2008, with a large hike during the financial crisis. Both charts have more or less pronounced peaks during the Asian Financial Crisis (1997-1998), the dot-com bubble (1999-2001), the year 2004, and the recent financial crisis (2007-2009).
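The rolling aggregation behind Figure 1 could be sketched as follows, reusing the hypothetical dimson_beta() above; again, this illustrates the described procedure rather than reproducing the authors' implementation.

import pandas as pd

def aggregate_beta_series(excess_vehicles: pd.DataFrame,
                          excess_mkt: pd.Series,
                          fx: pd.DataFrame,
                          window: int = 52) -> pd.Series:
    """Equal-weighted mean Dimson beta across all vehicles with a full
    window (52 or 104 weekly observations) of return data."""
    dates = excess_vehicles.index
    means = {}
    for end in range(window, len(dates) + 1):
        idx = dates[end - window:end]
        betas = []
        for col in excess_vehicles:
            r = excess_vehicles.loc[idx, col].dropna()
            if len(r) == window:  # vehicle has a beta at this point in time
                betas.append(dimson_beta(r, excess_mkt.loc[idx], fx.loc[idx]))
        if betas:
            means[idx[-1]] = sum(betas) / len(betas)
    return pd.Series(means)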

Lifetime beta

Mean   SD     No. > 0   No. < 0   Min     Deciles:  0.1   0.2   0.3   0.4   0.5   0.6   0.7   0.8   0.9   Max
1.31   0.93   270       8         -2.89             0.32  0.62  0.85  1.03  1.18  1.41  1.62  1.88  2.44  5.28

Yearly cross-sections

One-year betas                                          Two-year betas
Period ending   Mean   SD     Max-Min   No. < 0   N     Period ending   Mean   SD     Max-Min   No. < 0   N
1989            0.51   1.58   6.74      13        35
1990            0.84   0.86   4.01      6         41    1990            0.93   0.71   2.69      5         41
1991            0.84   1.40   8.45      9         48
1992            0.26   1.74   8.72      22        53    1992            0.56   1.06   6.75      12        53
1993            0.22   1.52   8.25      27        57
1994            0.52   1.68   8.77      20        61    1994            0.65   1.32   8.14      16        61
1995            0.64   2.47   16.61     25        69
1996            1.15   4.32   38.72     27        74    1996            0.86   2.06   15.00     23        74
1997            0.97   2.23   16.94     32        88
1998            1.16   1.33   7.34      14        103   1998            1.26   1.26   6.45      7         103
1999            1.31   2.53   20.75     32        120
2000            1.36   2.31   17.89     32        135   2000            1.33   1.30   8.49      17        135
2001            1.19   1.36   9.96      21        160
2002            1.16   1.45   11.49     29        177   2002            1.19   1.16   7.12      11        177
2003            1.09   2.09   15.21     42        182
2004            1.35   2.95   30.14     47        183   2004            1.30   2.07   16.94     42        182
2005            1.05   2.63   20.78     58        190
2006            0.95   1.51   13.83     40        203   2006            1.16   1.29   11.00     23        202
2007            1.04   2.12   23.33     52        226
2008            1.20   1.33   12.02     30        244   2008            1.37   1.29   8.15      18        243
Mean            0.94   1.97   15.00                     Mean            1.06   1.35   9.07

Table 1 – Beta cross-sections from 1989 to 2008
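The yearly cross-sections reported in Table 1 could be tabulated along the following lines, assuming a hypothetical panel of individual betas with one column per period end; this is an illustrative sketch, not the authors' code.

import pandas as pd

def beta_cross_sections(betas: pd.DataFrame) -> pd.DataFrame:
    """Summary statistics per period end, mirroring the layout of Table 1:
    mean, standard deviation, range, number of negative betas, and N."""
    return pd.DataFrame({
        "Mean": betas.mean(),
        "SD": betas.std(),
        "Max-Min": betas.max() - betas.min(),
        "No. < 0": (betas < 0).sum(),
        "N": betas.count(),
    })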


[Figure 1 – Mean one-year beta (upper panel) and two-year beta (lower panel) from 1989 to 2009. Left axis: market risk; right axis: variance of weekly returns of the MSCI World. Series: total, mean, variance MSCI.]

Exogenous shocks and extreme market movements are possible causes of beta variability. The green lines in Figure 1 show market return variances for the corresponding rolling windows (one-year and two-year) for weekly MSCI World return data. Betas and the market return variance are significantly correlated with a coefficient of 0.28 (p<0.05), which is surprising, since betas are inversely related to the market's variance in a CAPM context by definition. This result suggests a large increase in covariance between listed private equity vehicles and the market in times of uncertainty. If the systematic risk of private equity is about the same as the market's, and worse still rises in times when investors seek portfolio insurance, the often purported benefits of this asset class might turn out to be hard to achieve.

Do risks move together?
Although aggregate market risk seems to be rather unstable over medium to long time periods, individual risks might still move parallel to the general mean. There could thus be considerable relative stability within the listed private equity asset class despite its apparently irregular behavior. We measure risk stability relative to the listed private equity asset class by estimating Pearson correlation coefficients. Betas can be huge in magnitude, which can strongly influence linear estimators such as Pearson correlation coefficients. Spearman's rank correlation coefficient can provide a robust measure of relative beta stability.

We first calculate the Pearson correlation to capture beta movements between two points in time. Correlation coefficients are calculated from all vehicles with a risk estimate available for two consecutive years. Figure 2 shows that one-year betas are correlated especially at lags one and two, but only for observation periods after 1993. Betas prior to 1994 seem to behave randomly. Correlations of one-year betas after 1995 for lag one vary between 0.16 and 0.3. Interestingly, betas are not always significantly correlated even for recent years. There is, for example, almost no correlation between 2006 and 2008. An explanation for weak relations in general could be mean reversion. LPE vehicles tend to have a beta around an individual mean. If market risk deviates from this mean in a given year due to an exogenous shock affecting an individual vehicle, it tends to drift back to the individual mean after some time. This would reduce the correlation between individual betas.

[Figure 2 – Correlation of cross-sectional betas between observation periods. Pearson correlation coefficients are below the main diagonal, Spearman rank correlation coefficients are above.]
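The correlation matrices of Figure 2 can be illustrated with the following sketch, which fills Pearson coefficients below the main diagonal and Spearman rank coefficients above it. The input layout (vehicles in rows, one column of betas per period end) is an assumption carried over from the sketches above.

import pandas as pd

def beta_correlations(betas: pd.DataFrame) -> pd.DataFrame:
    """Correlations of cross-sectional betas between observation periods,
    computed from all vehicles with estimates in both periods."""
    years = list(betas.columns)
    out = pd.DataFrame(index=years, columns=years, dtype=float)
    for i, y1 in enumerate(years):
        for j, y2 in enumerate(years):
            if i == j:
                continue
            pair = betas[[y1, y2]].dropna()  # vehicles observed in both
            # Pearson below the main diagonal, Spearman rank above it.
            method = "pearson" if i > j else "spearman"
            out.loc[y1, y2] = pair[y1].corr(pair[y2], method=method)
    return out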


Another explanation is real changes in the vehicles' underlying businesses, which cause beta variability. Private equity funds buy portfolio companies to realize capital gains and management fees over some holding period, which is usually less than ten years, depending on the funds' focus. Portfolio turnover is thus higher than in other companies that grow organically or merge with newly acquired subsidiaries. If the portfolio companies' market risk is diverse across portfolio acquisitions, private equity betas experience jumps according to the market risk of the often substantial amounts of assets acquired or sold.

The portfolio companies' risk itself might not be constant either. Private equity funds and firms that specialize in turnaround management and venture capital in particular can experience large swings in market risk. Restructuring can affect operational risk as well as market risk, while rapid growth of successful companies causes changes in portfolio risk even if individual portfolio company risk remains constant. This effect also depends on portfolio size. Acquisitions and divestments must be large relative to portfolio size to cause substantial risk changes on the portfolio level. It seems reasonable to expect such a rebalancing effect over the medium to long term.

Short-term beta variability is likely caused by estimation errors, which in turn arise for primarily two reasons. First, listed private equity vehicles are often quite small by market capitalization and illiquid. The low informational quality of prices in thinly traded stocks — although mitigated by the Dimson regression — can carry over to insignificant or unstable beta coefficients. Second, the nontransparent structure and characteristics of portfolio companies can reduce informational efficiency. Since most portfolio companies are not listed and investors in private equity must rely on the information provided by the fund manager, this information can sometimes be as scarce as it is unreliable. If company betas are seen as a moving average of partially unreliable information about true market risk, betas become increasingly unstable over short horizons.

Rank correlations
As a robustness check for the Pearson correlation matrices, we calculate the Spearman rank correlation to capture relative rank movements. Instead of correlating betas directly, the Spearman correlation computes a coefficient based on the ranks of individual betas in two consecutive time periods. This calculation yields a matrix with yearly entries from 1990 to 2008 in the one-year case and with two-year intervals where betas are estimated over 104 weeks. Estimation of correlations is based on all vehicles with observable betas in two consecutive periods, which leads to a changing number of degrees of freedom.

The entries above the diagonal in both panels in Figure 2 show rank correlation matrices for one-year and two-year betas. Similar to the one-year Pearson correlation matrix, betas are correlated at the first few lags in the one-year case. The highest correlation for one-year betas is 0.319 at the first lag. Betas begin to be significantly correlated from 1996 on, which is partly due to their higher magnitude and partly due to increasing degrees of freedom.

Results are different in the two-year case but similar to Pearson correlations. Coefficients are about 0.2 higher than in the one-year beta case. This is likely the result of better estimates due to the increasing degrees of freedom when estimating beta over 104 weeks, which makes estimates less sensitive to outliers in the return distribution. If betas are measured more accurately, correlations increase as well.

Our results suggest that opacity in portfolio companies and illiquidity lead to estimation error in the short run, while portfolio rebalancing changes individual betas over the medium to long term. Estimated market risk seems to be most stable over horizons spanning two to three years. The two-year beta correlations are slightly higher than in the one-year case, while significant correlations can be observed over the last decade only.

Evolution of risk over time
The time series perspective can be combined with an assessment of cross-sectional stability if one assumes that individual betas are generated by a Markov process common to all vehicles. We estimate an empirical Markov transition matrix under the assumption that future betas depend only on their current value. Transition probabilities from one risk class to another within the empirical Markov transition matrix (MTM) are calculated as the relative frequency of moving from one risk class to another in the next observation period. For this purpose, we construct risk classes in two different ways: beta deciles and fixed beta classes.

Persistency in beta deciles
The deciles-oriented MTM is based on quantiles of the distribution of individual company market risk. All companies in a decile are assigned to one risk class. As a result of changes in aggregate beta over time, decile boundaries may change as well. Since betas are measured with error, boundaries of upper and lower deciles fluctuate most. Quantiles around the median are more stable.
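A minimal sketch of the decile-based transition matrix follows; it assumes the same hypothetical beta panel as before and is illustrative rather than the authors' code.

import pandas as pd

def decile_transition_matrix(betas: pd.DataFrame) -> pd.DataFrame:
    """Empirical Markov transition matrix over beta deciles. Deciles are
    recomputed each period, so risk classes are relative to the whole
    listed private equity cross-section at that date."""
    classes = betas.apply(
        lambda col: pd.qcut(col, 10, labels=range(1, 11)), axis=0
    )
    counts = pd.DataFrame(0, index=range(1, 11), columns=range(1, 11))
    for t0, t1 in zip(classes.columns[:-1], classes.columns[1:]):
        pair = classes[[t0, t1]].dropna()  # vehicles observed in both periods
        for a, b in zip(pair[t0], pair[t1]):
            counts.loc[a, b] += 1
    # Row-normalize the counts into transition probabilities.
    return counts.div(counts.sum(axis=1), axis=0)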


Because a company's risk class is assigned by the decile function, its risk class depends on the risk exposure of the entire listed private equity market. An increase in beta, however, does not change the risk class if all companies are affected by this increase to the same extent. This property has an important influence on the interpretation of our results. The probability of a transition from one risk class into another does not reflect changes in absolute risk exposure but gives insight into how the risk of one company behaves compared to the risk of all other companies. If, for example, a flat transition matrix is found, betas are equally likely to move from one risk class to any other. In other words, the relative risk structure would be completely random over time. This could hold for companies in the venture capital market in particular, whose risks are driven by real options and less by fundamental data. If, to the contrary, only positive entries along the main diagonal exist, relative risks within the industry do not change over time.

When interpreting results, we have to take account of the fact that new companies get listed and some become delisted. If we allow companies to move in and out of our sample, decile boundaries can change even without any variation in risks, which may force a company to change its risk class. This can be a problem if the sample is not random.

[Figure 3 – Markov transition matrices with risk classes constructed from deciles. Left panel: one-year betas from 1990 to 2008, N = 2158; right panel: two-year betas from 1991 to 2007, N = 914.]

Transition probabilities shown in Figure 3 confirm our results for cross-sectional rank correlations. Betas are moderately stable relative to the listed private equity sector over short time periods. Interestingly, high-risk companies remain in the highest risk class with a 24% (one-year) and a 25% (two-year) probability. This suggests that there might be companies that have a persistently higher risk than other companies. The same result can be seen for low risk classes. For both observation horizons, risk is comparably stable (17.8% for the one-year betas and even 18.3% for the two-year betas). Note that in both cases a relatively high proportion of companies switch from the highest risk class into the lowest risk class from one period to the next and vice versa. These companies are most likely outliers, whose betas cannot be estimated reliably and are therefore unstable compared to the industry as a whole. Considering the almost random correlation structure prior to 1994 in Figure 2, we calculated transition matrices excluding these early betas, but find similar results.

The flat probability structure in deciles 5 and 6 can be observed for several reasons: first, it can be driven by listings and delistings. These changes in the underlying sample can influence decile boundaries if newly added companies do not follow the same risk distribution as existing ones. A second explanation could be that the risk of private equity changes fast. Companies having extremely high or low risk remain in their respective risk classes if their risk does not change too much, whereas medium-risk companies switch between deciles more often for similar beta changes. Incomplete and noisy information about portfolio companies might not allow the market to generate stable betas over short time periods.

Moderately stable ranks are good news for private equity investors. If private equity vehicles remain in their risk deciles over periods of two years and even longer (results for three and four years are not shown here), investors can base their portfolio allocation decisions on betas relative to the private equity sector. Although betas exhibit large absolute swings, which will be shown below, long-term investors can still target specific high- or low-risk vehicles for inclusion in their portfolio.

Persistency in fixed beta classes
Absolute persistency of market risks can be examined by using fixed risk classes. In this case, we do not measure the behavior of company risk relative to other companies but absolute risk changes. We define the following ten risk classes for individual betas \beta_{i,t} for vehicle i at time t: q_{k-1} < \beta_{i,t} < q_k with {q_0, ..., q_{10}} = {-\infty, -3, -2, -1, 0, 1, 2, 3, 4, 5, \infty}, where each class is denoted by its upper boundary. If risks were stable in this sense, we would expect matrix entries along the main diagonal. If risks changed their size randomly, each row should reflect the unconditional distribution of betas.

The MTM with fixed risk classes in Figure 4 yields a different impression of market risk. When using fixed class boundaries, transition matrices cannot be used to reach conclusions about risk movements within the listed private equity sector anymore. Instead, transition probabilities reflect the behavior of individual risks, which include changes in market risk as well as idiosyncratic exogenous factors. As expected from our correlation analysis, betas show a highly mean-reverting behavior. This effect can be seen from classes five and six (representing the industry's general mean market risk), which have the highest transition probability from almost every other risk class. Transition probabilities seem to converge to the stationary distribution after a short time, which again indicates that betas become unstable due to portfolio rebalancing and time-varying risk within portfolio companies.

Our suspicion that extreme betas are due to estimation error is confirmed by the fact that one-year betas in risk classes 1 and 10 behave quite randomly between two observation periods. An economic reason for unstable negative betas could be that private equity has no short selling strategies. Although results are similar for one-year and two-year betas, there are a few differences. Except for one outlier in element p(-3,5) (not shown), the lowest risk class for two-year betas does not contain any entries. High risks are more persistent than medium risks, as can be seen from risk classes 9 and 10, whose transition probabilities are shifted to the right. This effect is strongest in one-year betas in the high risk classes, which are more stable than two-year betas.
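Under the class definition stated above, the fixed classes could be assigned as in the following sketch; the transition-counting logic from the decile example applies unchanged. This is again only an illustration, not the authors' code.

import numpy as np
import pandas as pd

# Fixed boundaries from the text; each class is labeled by its upper bound,
# with open-ended classes below -3 and above 5.
BOUNDS = [-np.inf, -3, -2, -1, 0, 1, 2, 3, 4, 5, np.inf]
LABELS = [-3, -2, -1, 0, 1, 2, 3, 4, 5, "max"]

def fixed_risk_classes(betas: pd.DataFrame) -> pd.DataFrame:
    """Map each individual beta to one of the ten fixed risk classes."""
    return betas.apply(lambda col: pd.cut(col, BOUNDS, labels=LABELS), axis=0)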


[Figure 4 – Markov transition matrices with fixed risk classes. Left panel: one-year betas from 1990 to 2008, N = 2158; right panel: two-year betas from 1991 to 2007, N = 914.]

Conclusion
Aggregate market risk of listed private equity vehicles varies strongly over time and is positively correlated with the market return variance. Individual CAPM betas are highly unstable, whereas ranks of individual vehicles within a cross-section change slightly less over time. Individual CAPM betas are predictable only up to 2 to 3 years into the future and quickly converge to a stationary distribution when measured in risk classes in an empirical Markov transition matrix. High- and low-risk companies, however, are more likely to remain within their risk classes than medium-risk companies. The probability that a company can be found in the same decile in the next observation period is about 25% for high-risk companies and 18% for low-risk ones.

We suspect that market risk of private equity is affected by factors unique to this sector: acquisitions and divestments that constantly rebalance portfolios, scarcity of information about portfolio companies, and rapid changes within portfolio companies. Unstable market risk seems to be a fundamental characteristic of private equity assets, which must be incorporated in the valuation process and which casts doubt on the diversification benefits of private equity in times of crisis. Most importantly, investors usually hold traditional private equity shares to maturity, which can be up to 10 years. Unpredictable changes in market risk pose a challenge for portfolio allocation, since investors would be buying assets that behave entirely differently from what they were supposed to when first included in the investor's portfolio. However, targeting vehicles with specific risks relative to the asset class might be a feasible strategy for long-term private equity investors.

References
• Bilo, S., 2002, "Alternative asset class: publicly traded private equity, performance, liquidity, diversification potential and pricing characteristics," Doctoral dissertation, University of St. Gallen
• Blume, M. E., 1971, "On the assessment of risk," Journal of Finance, 26:1, 1-10
• Bollerslev, T., R. F. Engle, and J. M. Wooldridge, 1988, "A capital asset pricing model with time-varying covariances," Journal of Political Economy, 96:1, 116-131
• Cao, J., and J. Lerner, 2009, "The performance of reverse leveraged buyouts," Journal of Financial Economics, 91:2, 139-157
• Cochrane, J. H., 2005, "The risk and return of venture capital," Journal of Financial Economics, 75:1, 3-52
• De Santis, G., and B. Gérard, 1997, "International asset pricing and portfolio diversification with time-varying risk," Journal of Finance, 52:5, 1881-1912
• Dimson, E., 1979, "Risk measurement when shares are subject to infrequent trading," Journal of Financial Economics, 7, 197-226
• Driessen, J., T. Lin, and L. Phalippou, 2009, "A new method to estimate risk and return of non-traded assets from cash flows: the case of private equity funds," NBER Working Paper No. W14144
• Ferson, W. E., S. Kandel, and R. F. Stambaugh, 1987, "Tests of asset pricing with time-varying expected risk premiums and market betas," Journal of Finance, 42:2, 201-220
• Fisher, L., 1966, "Some new stock market indices," Journal of Business, 39:1, 191-225
• French, K., W. Schwert, and R. Stambaugh, 1987, "Expected returns and volatility," Journal of Financial Economics, 19:1, 3-29
• Ghysels, E., 1998, "On stable factor structures in the pricing of risk: do time-varying betas help or hurt?" Journal of Finance, 53:2, 549-573
• Groh, A., and O. Gottschalg, 2009, "The opportunity cost of capital of US buyouts," Working Paper
• International Financial Services London, 2009, "Private equity 2009"
• Jagannathan, R., and Z. Wang, 1996, "The conditional CAPM and the cross-section of expected returns," Journal of Finance, 51:1, 3-53
• Kaplan, S. N., and A. Schoar, 2005, "Private equity performance: returns, persistence, and capital flows," Journal of Finance, 60, 1791-1823
• Lahr, H., and F. T. Herschke, 2009, "Organizational forms and risk of listed private equity," The Journal of Private Equity, 13:1, 89-99
• Lettau, M., and S. Ludvigson, 2001, "Resurrecting the (C)CAPM: a cross-sectional test when risk premia are time-varying," Journal of Political Economy, 109:6, 1238-1287
• Levy, R. A., 1971, "On the short-term stationarity of beta coefficients," Financial Analysts Journal, 27:6, 55-62
• Ljungqvist, A., and M. Richardson, 2003, "The cash flow, return and risk characteristics of private equity," NBER Working Paper No. 9454

Articles

The developing legal risk management environment

Marijn M.A. van Daelen
Law and Management lecturer / researcher, Department of Business Law / Center for Company Law (CCL), Tilburg University

Abstract
Financial institutions are facing an increasing number of risk management regulations with different national and international approaches. The legal framework, including the risk management provisions that existed prior to the current financial crisis, has been severely tested in 'real life' and did not live up to expectations (whether reasonable or not). As a reaction to the financial crisis, lawmakers and policymakers have been focusing on, inter alia, risk management regulations to restore public confidence in companies and the overall market. The underlying question here is whether new regulations can indeed prevent the next crisis.


As a reaction to the financial crisis, lawmakers and policymakers have been focusing on, inter alia, risk management regulations to restore public confidence in companies and the overall market. The underlying question here is whether new regulations can indeed prevent the next crisis. Of course, new regulations can improve the legal environment of financial institutions, thereby reducing the imperfections shown by this crisis. However, there seems to be nothing to gain from extensive additional regulation that can only prevent a crisis with attributes similar to the one the market is facing now. First of all, previous crises have had their specificities and the next crisis will most likely have its own features. Hence it is more than doubtful whether a lot of this kind of regulation is needed to prevent the next crisis. Secondly, introducing new rules without reducing existing rules that should have (but might not have) tackled the same or related problems will lead to a pile-up of regulations. Compliance with all applicable rules and regulations becomes prohibitively costly for companies. At the same time, the compliance risk will increase, which in turn can also increase costs. Compliance risk can be defined as "the risk of legal or regulatory sanctions, material financial loss, or loss to reputation a [financial institution] may suffer as a result of its failure to comply with laws, regulations, rules, related self-regulatory organization standards, and codes of conduct applicable to its […] activities"1. This is why, in the words of Enriques (2009), "excessive reregulation today is the best guarantee of effective pressure towards deregulation tomorrow" and "regulators should make a lot of noise and show a lot of activism, all the while producing very little change"2. Excessive regulation will have a lot of negative side effects and may not even ensure justified public faith in the reliability of companies and the overall market in the future. The upshot is that it makes sense first to have a closer look at existing risk management regulation and to determine the goals in the long run before deciding on the need for more financial regulation.

Two types of rules provide the basis for the legal risk management environment. Firstly, there are regulations that demand that companies have an appropriate risk management system in place. This includes not only ensuring that there is a system to identify and analyze risks, but also ensuring that the system is adequately maintained and monitored. Secondly, there are regulations that require the company to disclose information on (a) the company's risk management system or (b) the risks that the company faces (and thus indirectly require a system). This firm-specific information set is important for minimizing the information asymmetry between managers and shareholders. After all, managers are involved with the day-to-day business and have better access to all company information, whereas the shareholders, the providers of capital, only receive publicly available information. The information needed to reduce this asymmetry must be disclosed in prospectuses (one-time disclosure documents) and in half-year or annual reports (ongoing disclosure documents).

Specifically, financial institutions are facing an increasing number of risk management regulations with different national and international approaches. The legal framework, including the risk management provisions that existed prior to the current financial crisis, has been severely tested in 'real life' and did not live up to expectations (whether reasonable or not). These risk management provisions can be divided into general and sector-specific provisions. As the sector-specific rules supplement the general requirements, the latter will be discussed in the next section. The following section will address the sector-specific risk management regulation and the final section provides some concluding remarks.

General risk management regulation
To start off, in general the board of directors is responsible for managing the company. For example, the U.S. statutory model states that the board of directors has to manage and oversee the business and exercise corporate powers, and the U.K. model articles of association state that the board has to manage the company's business3. In both U.S. and U.K. practice, management directs operations since delegation of board authority is recognized, but policymaking remains a task of the board of directors4. Throughout the years, the duty of directors has been further developed. In the 20th century the duties of directors included maintaining a system of internal controls and disclosing the company's risks5. For years, the U.S. was the forerunner, with requirements that focused solely on the financial reporting aspect of the internal control and risk management framework. The foundations of American federal securities law are laid down in the 1933 Securities Act, demanding that issuers publicly disclose significant information about the company and the securities, and the 1934 Securities Exchange Act, requiring mandatory corporate reporting and independent audits. Congress enacted these two securities Acts in response to the American stock market crash of 1929 and the Great Depression in order to reduce the information asymmetry between managers and shareholders. The Securities and Exchange Commission (SEC) was established pursuant to the 1934 Act. In the 1933 Act, regulatory recognition was given to the importance of internal control (which had a more limited scope than risk management as we know it today). Regulation S-X Rule 2-02(b) of the 1933 Act required external auditors to give appropriate consideration to the adequacy of the system of internal control implemented by the corporation. In its final report of 1940, the SEC recommended a more thorough assessment of the internal control system. After a number of corporate scandals, which were related to the bribery of foreign officials in the mid-1970s, the American regulatory approach to internal control changed.

1 Basel Committee on Banking Supervision, 2005, "Compliance and the compliance function in banks," Bank for International Settlements, p. 1
2 Enriques, L., 2009, "Regulators' response to the current crisis and the upcoming reregulation of financial markets: one reluctant regulator's view," University of Pennsylvania Journal of International Economic Law, 30:4, 1147-1155
3 Section 8.01(b) of the U.S. MBCA 2005 and Regulation 70 of the U.K. Table A as amended on 1 October 2007, as well as Article 3 of the Model Articles for Public Companies of the Companies Act 2006
4 van Daelen, M. M. A., and C. F. Van der Elst, 2009, "Corporate regulatory frameworks for risk management in the US and EU," Corporate Finance and Capital Markets Law Review, 1:2, 83-94, p. 84
5 van Daelen, M. M. A., forthcoming 2010, "Risk management from a business law perspective," in van Daelen, M. M. A., and C. F. Van der Elst, eds., Risk management and corporate governance: interconnections in law, accounting and tax, Edward Elgar Publishing, Cheltenham

The 1977 Foreign Corrupt Practices Act (FCPA) required reporting companies to keep books, records, and accounts as well as to maintain a system of internal accounting controls in order to control management activities6. A few years later, companies needed to assess their risks as item 303 of the MD&A was added to Regulation S-K. It required management's discussion and analysis report to include "material events and uncertainties known to management that would cause reported financial information not to be necessarily indicative of future operating results or of future financial condition"7. Around that time, U.S. recommendation reports and guidelines — such as the 1978 Cohen Report and the 1979 Minahan Report and later the 1987 Treadway Report and 1992 COSO I Report — were starting to stress a broader internal control framework. Moreover, recommendations towards the oversight duty of audit committees of the board of directors regarding the financial reporting process and internal controls started to develop8.

Years after the U.S. 1933 Act, the FCPA, and the MD&A, the U.K. followed with self-regulatory but more detailed provisions to manage companies' risks. The main rules on mandatory disclosure were given by the Companies Act of 1985 and the Listing Rules. Section 221 of the Companies Act required companies to keep accounting records in order to show and explain their transactions and to disclose their financial position. The Listing Rules required listed companies to include a statement of compliance with the provisions of the 1992 Cadbury Report9 in their annual report and accounts on a comply-or-explain basis. This self-regulatory report provided that the board of directors had to maintain a system of internal control over the financial management of the company — including procedures to mitigate corporate governance risks and failures — and that the directors had to make a statement in the annual report on the effectiveness of their internal control system10. The Cadbury Report also recommended that all listed companies should establish an audit committee comprising at least three non-executives. The report gave the audit committee's duties, which include reviewing the company's statement on internal control systems11. The 1998 Hampel Report broadened the U.K. internal control perspective by arguing that the system did not only have to cover financial controls but also operational and compliance controls, as well as risk management12. As the Hampel Committee suggested, the London Stock Exchange issued the Combined Code on Corporate Governance, which included the provisions of, inter alia, the Cadbury Report and the Hampel Report. Later, other European member states followed the U.K. with internal control and risk management regulations. For instance, the Netherlands issued a self-regulatory code (the 1997 Peters Report) that stressed the board of directors' responsibility for effective systems of internal control and recommended that the supervisory board consider whether to appoint an audit committee. This committee was recommended specific duties, such as supervising external financial reports, compliance, and the control of company risks13.

Obviously, the legal internal control and risk management environment significantly changed when the U.S. Congress passed the Sarbanes-Oxley Act (SOX) after the corporate failures and fraud cases between 2000 and 2003. It has been said to be the culmination of a century of internal control developments14. This 2002 federal law was intended to restore public faith and trust by, inter alia, improving the accuracy and reliability of corporate disclosures. It contains not only disclosure requirements but also substantive corporate governance mandates15. The legal duties of corporate constituents regarding a system of internal controls are further developed by this Act and other legislative measures. The well-known SOX Section 404 demands an annual internal control report in which management's responsibility for "establishing and maintaining an adequate internal control structure and procedures for financial reporting" is stressed. The report also has to include an assessment of the effectiveness of these structures and procedures. Section 302 requires the CEO and CFO — thus not management, as Section 404 does — to certify the fairness of the financial statements and information as well as their responsibility for establishing and maintaining internal controls. The CEO and CFO also have to present their conclusions — not the total evaluation of Section 404 — about the effectiveness of the internal controls based on their evaluation. The duties of audit committees also continued to evolve, as Section 301 requires that audit committees establish procedures for "the receipt, retention, and treatment of complaints […] regarding accounting, internal accounting controls, or auditing matters." Section 205(a) stresses that the purpose of the audit committee is to oversee the company's accounting and financial reporting processes and audits of the financial statements. Other U.S. legislative measures as well as guidelines have a wider internal control and risk management perspective, such as the MBCA and the COSO II Report. The MBCA provides the scope of the board's oversight responsibilities relating to the company's major risks and the effectiveness of the company's internal financial, operational, and compliance controls16. The COSO II Report broadens reporting to encompass non-financial information and internal reporting, and it adds a fourth category, the strategic objectives, to the existing financial reporting, operational, and compliance objectives17.

6 15 U.S.C. section 78m(b)(2)(B)
7 17 CFR 229.303.a-3-ii and "Instructions to paragraph 303(a)," under no. 3
8 See, for instance, the Treadway Commission, 1987, Report of the National Commission on Fraudulent Financial Reporting, New York: AICPA Inc., p. 12. Establishing an audit committee was already recommended by the SEC in 1972 and demanded by the NYSE in 1978.
9 The Cadbury Committee was set up by the Financial Reporting Council, the London Stock Exchange, and the accountancy profession. Its recommendations are focused on the control and reporting functions of boards, and on the role of auditors.
10 Cadbury Committee, 1992, Report on the financial aspects of corporate governance, London: Gee, Recommendations 4.31 and 4.32
11 Cadbury Report 1992, Recommendation 4.35, section (e), under (v)
12 Hampel Committee, 1998, Committee on Corporate Governance – final report, London: Gee, Section D (Accountability and Audit) under II and subsection 2.20, p. 21
13 Committee on Corporate Governance, 1997, Corporate governance in the Netherlands – forty recommendations, Paragraphs 4.2, 4.3, 3.2 and 6.4
14 Heier, J. R., M. T. Dugan, and D. L. Sayers, 2004, "Sarbanes-Oxley and the culmination of internal control development: a study of reactive evolution," American Accounting Association 2004 Mid-Atlantic region meeting paper, p. 14
15 Romano, R., 2005, "The Sarbanes-Oxley Act and the making of quack corporate governance," Yale ICF Working Paper 05(23), p. 1
16 Section 8.01(c), subsections (2) and (6) of the Model Business Corporation Act 2005. This Act has been adopted in whole or in part by more than 30 U.S. states. Amendments to the act were adopted December 2009 regarding proxy voting.
17 Committee of Sponsoring Organizations of the Treadway Commission, 2004, Enterprise risk management – integrated framework, executive summary, New York: AICPA Inc., p. 3

Ever since the corporate failures manifested themselves, it has been argued that new regulations such as SOX might not succeed in regulating fraud or that their effectiveness would be limited, as the frauds that preceded this legal response occurred despite several levels of monitoring in place at the time18. For example, Cunningham notes that "[h]istory offers no reason to expect that new rules will prevent a repeat of accounting scandals even of this large size or frequency"19 and Bratton, that "[t]he costs of any significant new regulation can outweigh the compliance yield, particularly in a system committed to open a wide field for entrepreneurial risk taking"20.

Within Europe, the legal internal control and risk management environment also changed after the failures and frauds around the new millennium, but with a more principle-based and self-regulatory approach. Following the 2003 European Commission's Plan to Move Forward, E.U. member states have drawn up or updated their national corporate governance codes for listed companies. In the U.K., due to the Hampel Report, the 2000 Combined Code already underlined the board's duty to maintain a sound system of internal controls, on a comply-or-explain basis. The Code added that the board has to report annually to the shareholders that it has reviewed the effectiveness of the group's internal control system, covering financial, operational, and compliance controls and risk management21. A few years later, the provisions dealing with the audit committee's duties were updated due to the 2003 Higgs and Smith Reports, requiring the committee to review the company's internal control and risk management systems22. The 2009 Review of the Combined Code announced amendments to the internal control principle in order to stress "the board's responsibility for defining the company's risk appetite and tolerance and maintaining a sound risk management system" and to the provisions in order to include that the board has to "satisfy itself that appropriate systems are in place to enable it to identify, assess and manage key risks"23. Other E.U. member states also issued corporate governance codes that emphasized the board's and audit committee's duties. For instance, the Dutch code provides that the board is responsible for complying with all relevant primary and secondary legislation and managing the risks associated with the company's activities. In addition, the board has to report related developments to, and discuss the internal risk management and control systems with, the supervisory board and the audit committee24. The audit committee has to monitor the activities of the board with respect to the operation of the internal risk management and control systems25.

Traditionally, E.U. lawmakers focused mainly on corporate disclosure rules and not so much on requiring management systems to endorse the reliability of the reporting and an internal control framework26. Responding to the corporate failures and fraud around the new millennium, the E.U. became more active in areas such as company law, accounting, and auditing law, although parts of these areas remain controlled by national legislators. The E.U. legislative movement brought forward several general as well as sector-specific directives and recommendations. One of these general directives is the Prospectus Directive, with the purpose of harmonizing, inter alia, the information contained in the prospectus in order to provide equivalent investor protection. It requires the prospectus to include key information on the company's risk factors and a summary in which "the essential characteristics and risks associated with the issuer, any guarantor and the securities" are disclosed27. In addition, the Transparency Directive 2004/109/EC harmonizes transparency requirements to ensure a higher level of investor protection and market efficiency within the E.U. The directive acknowledges the importance of disclosure of the companies' main risks, as Articles 4 and 5 of the directive require the annual and half-yearly financial reports to include a management report in which a description is given of the "principal risks and uncertainties" that the company faces. Next to disclosing general information on their risks, companies are required to disclose information on the risk management systems regarding the financial reporting process. That is because Articles 1 and 2 of the 2006/46/EC amendment to the Accounting Directives — the Fourth and Seventh Company Law Directives — provide that the annual corporate governance statement must include a description of the main features of the company's (or group's) internal control and risk management systems for the financial reporting process. Moreover, specific features of the audit committee's duties are given at E.U. level. In a Commission Recommendation (2005/162/EC), the audit committee is recommended to assist the board in reviewing the internal control and risk management systems. The committee should do so in order to ensure that the main risks the company faces are properly identified, managed, and disclosed. This monitoring role of the audit committee is further developed in the 2006/43/EC Audit Directive, which assigns the committee responsibility for monitoring the financial reporting process and the effectiveness of the company's internal control and risk management systems. Thus, the legislative measures at the E.U. level require reporting on the main features of the systems for the financial reporting process and monitoring the effectiveness of the systems. Before implementing these directives in national laws and regulations, most member states had already issued corporate governance codes focusing on a broader concept of internal control and risk management, emphasizing the financial reporting, operational, and compliance aspects.

18 Ribstein, L. E., 2002, "Market vs. regulatory responses to corporate fraud: a critique of the Sarbanes-Oxley Act of 2002," Journal of Corporation Law, 28:1, p. 5
19 Cunningham, L. A., 2002, "Sharing accounting's burden: business lawyers in Enron's dark shadows," Boston College Working Paper, pp. 16-17
20 Bratton, W. W., 2002, "Enron and the dark side of shareholder value," available at SSRN: <http://ssrn.com/abstract=301475>, p. 13
21 Committee on Corporate Governance, 2000, The Combined Code – principles of good governance and code of best practice (Combined Code 2000), Principle D.2 and Provision D.2.1 (Principle C.2 and Provision C.2.1 of the 2008 Combined Code)
22 Committee on Corporate Governance, 2003, The Combined Code – principles of good governance and code of best practice (Combined Code 2003), Provision C.3.2
23 Financial Reporting Council, 2009, Review of the Combined Code: Final Report, p. 27
24 Corporate Governance Code Monitoring Committee, 2008, The Dutch corporate governance code – principles of good corporate governance and best practice provisions (DCGC 2008), Principle II.1
25 DCGC 2008, Best practice provision III.5.4
26 The 1984 Eighth Company Law Directive (84/253/EEC, OJ L 126, 12 May 1984, p. 20–26) harmonized the approval of persons responsible for carrying out the statutory audits of accounting documents. Articles 3 and 24 demanded such persons to be independent and of good repute.
27 Article 5 and IV of Annex I of Directive 2003/71/EC of the European Parliament and of the Council of 4 November 2003 on the prospectus to be published when securities are offered to the public or admitted to trading and amending Directive 2001/34/EC, OJ L 345, 31 December 2003, p. 64–89

The upshot is that, despite some inconsistencies, in both the E.U. and the U.S. most parts of the reporting and monitoring level are covered, as shown in Table 1. In the U.S. this is accompanied by legislative measures focusing on establishing and maintaining an internal control system for financial reporting, whereas E.U. member states further the framework with provisions related to the establishment and maintenance of the overall system. In addition, in the U.S. the establishing, maintaining, reporting, and monitoring provisions regarding the financial reporting systems are provided by law. In the E.U., on the contrary, the establishing, maintaining, and reporting provisions regarding the overall systems are provided by self-regulation, although this regulation has a legal basis in several member states.

                              Establish/identify    Maintain/manage       Report on     Monitor
General risks                 (E.U. / U.S. state)   (E.U. / U.S. state)   E.U. / U.S.   E.U. MS / U.S. state
General systems               E.U. MS               E.U. MS               E.U. MS       E.U. / E.U. MS / U.S. state
Financial reporting systems   U.S.                  U.S.                  E.U. / U.S.   U.S.

Table 1 – Aspects of the main general E.U. and U.S. internal control and risk management provisions

Sector-specific risk management regulation
Next to these general provisions, the legal risk management environment — especially with regard to the financial industry — is shaped by sector-specific legal measures. The financial industry-specific provisions cover mainly the banking system, insurance, and securities. The previous section shows that the foundation of the legal risk management environment is given by mainly two types of rules: firstly, having — including maintaining and monitoring — internal control and risk management systems and, secondly, disclosing information about those systems and the risks the company faces. As the current section will show, the financial industry regulations and guidelines supplement these two levels — though in much more detail — but also expand the first level. Indeed, the general rules provide that a system must be in place and stress the board's duty to maintain internal control and risk management systems and the audit committee's monitoring role therein. The financial industry-specific provisions, however, regulate certain internal functions within the organization and emphasize the external monitoring role of the supervisory authorities. Several financial industry-specific provisions are described below in order to further explain these expansions28.

At E.U. level, financial industry-specific directives that refer to risk management are, inter alia, the Capital Requirements Directives, the Solvency Directive, and the MiFID. The Basel Committee on Banking Supervision was established in 1974 by the central bank governors of the Group of Ten countries29. Without formal supranational supervisory authority, the committee issues supervisory standards and guidelines which national authorities can implement. The 1988 Basel Capital Accord introduced a capital measurement system which provided for the implementation of a credit risk measurement framework. In 2004 a revised framework was issued. This 2004 Basel II Accord provides for requirements relating to minimum capital, supervisory review, and market discipline and disclosure30. It stresses that risk management is fundamental for an effective assessment of the adequacy of a bank's capital position. The Basel II framework is introduced into European legislation through the Capital Requirements Directives, comprising Directive 2006/48/EC and Directive 2006/49/EC. It affects credit institutions and certain types of investment firms. In line with the general legal provisions described above, Article 138 of Directive 2006/48/EC requires credit institutions to have adequate risk management processes and internal control mechanisms in place, including reporting and accounting procedures. Article 135 of that Directive reads that E.U. member states have to demand that "persons who effectively direct the business of a financial holding company be of sufficiently good repute and have sufficient experience to perform those duties." Consequently, where general legal provisions develop what the duty of the board and managers includes, this sector-specific provision regulates what a proper person for performing certain duties would be like. In addition, credit institutions and certain types of investment firms must have effective processes to identify, manage, monitor, and report their risks as well as adequate internal control mechanisms. Their management body — consisting of at least two persons of sufficiently good repute and with sufficient experience to perform such duties — should approve and review the strategies and policies for identifying, managing, monitoring, and mitigating the risks, taking into account specific criteria regarding credit and counterparty risk, residual risk, concentration risk, securitization risk, market risk, interest rate risk arising from non-trading activities, operational risk, and liquidity risk31. Obviously, these requirements, especially the specific criteria, are much more detailed than the general ones described in the previous section.

Another sector-specific European directive that introduces a comprehensive framework for risk management and regulates certain internal functions within the organization is the Solvency II Directive32. It has a much wider scope than the Solvency I Directive and contains a thorough revision and realignment of the E.U. Directives relevant to (re)insurers. The Solvency II Directive has similarities to the Basel II banking regulation. This set of regulatory measures for insurance companies includes provisions regarding capital, governance, and risk management, effective supervision, and disclosure.

28 See for a more thorough analysis of risk management within financial law: Van der Elst, C. F., forthcoming 2010, "Risk management in financial law," in van Daelen, M. M. A., and C. F. Van der Elst, eds., Risk management and corporate governance: interconnections in law, accounting and tax, Edward Elgar Publishing, Cheltenham
29 Belgium, Canada, France, Germany, Italy, Japan, the Netherlands, Sweden, the U.K., and the U.S., and later also Switzerland
30 Basel Committee on Banking Supervision, 2004, International convergence of capital measurement and capital standards – a revised framework
31 Articles 11 and 22 and Annex V of Directive 2006/48/EC of the European Parliament and of the Council of 14 June 2006 relating to the taking up and pursuit of the business of credit institutions, OJ L 177, 30 June 2006, p. 1–200, and Article 34 of Directive 2006/49/EC of the European Parliament and of the Council of 14 June 2006 on the capital adequacy of investment firms and credit institutions, OJ L 177, 30 June 2006, p. 201–255
32 Directive 2009/138/EC of the European Parliament and of the Council of 25 November 2009 on the taking-up and pursuit of the business of Insurance and Reinsurance (Solvency II), OJ L 335, 17 December 2009, p. 1–155

It requires written and implemented policies for the company's risk management, internal control, and internal audit. In addition, the (re)insurance companies must have an effective and well integrated risk management system in order to identify, measure, monitor, manage, and report risks, covering, inter alia, market, credit, liquidity, concentration, and operational risks. The risk management system must include risk mitigation techniques and the companies have to conduct risk and solvency assessments. Next to the risk management system, an effective internal control system with administrative and accounting procedures, an internal control framework, and appropriate reporting arrangements is required33. To sum up, compared to the general provisions described above, this directive introduces a more comprehensive set of risk management and internal control requirements. The directive also prescribes internal functions and the duties of, as well as the personal qualifications for, those functions. For instance, for the evaluation of the adequacy and effectiveness of the internal control system, an internal audit function is required. Furthermore, the (re)insurance companies are instructed to have a compliance function within their organization. This compliance function has the duty to advise the management or supervisory body on compliance with laws and regulations, to identify and assess compliance risks, and to assess the possible impact of any changes in the legal environment. Moreover, the directive demands that the "persons who effectively run the undertaking or have other key functions" are fit and proper, that is, have adequate professional qualifications, knowledge, and experience and are of good repute and integrity, respectively34.

A third financial industry-specific piece of E.U. legislation is the MiFID, the markets in financial instruments directive, which provides organizational requirements and operating conditions for investment firms35. Like the previous directives, it requires companies to establish, implement, and maintain risk management policies and procedures that detect risks, set the level of risk tolerance, and include risk-minimizing procedures. The directive further demands that investment companies monitor the adequacy and effectiveness of these risk management policies and procedures. With regard to the internal functions of the organization, it provides that investment companies have to establish and maintain a compliance function, for which a compliance officer must be appointed, with the duty to monitor and assess the adequacy and effectiveness of the company's measures and procedures. It goes on to describe that this compliance function must have "the necessary authority, resources, expertise, and access to all relevant information" in order to create an environment in which it can discharge its responsibilities properly and independently. In addition, like the insurance companies, the investment companies need to have an internal audit function for the evaluation of the adequacy and effectiveness of the internal control system36.

As regulatory reform is making its entrance at E.U. level, E.U. member states are introducing their own financial industry-specific guidelines and the U.S. is developing general regulations regarding certain internal functions within the organization. In the U.S., the New York Stock Exchange corporate governance rules require audit committees to discuss the guidelines and policies that govern the process of risk assessment and risk management37. In addition, in May 2009 legislation entitled 'Shareholder Bill of Rights Act of 2009' was introduced in the U.S. Senate by Senator Charles Schumer that would, if passed, mandate risk committees for publicly traded companies in general. The role of these risk committees — which are to be composed of independent directors — is to be responsible for the establishment and evaluation of the risk management practices of the issuer38. As in the U.S., in E.U. member states the idea of requiring companies to have a risk committee is also starting to be considered. For instance, the Dutch self-regulatory Banking Code requires banks — not listed companies in general — to have a risk committee39. Furthermore, the U.K. Walker Review recommends that certain listed banks and insurance companies establish a risk committee in order to, inter alia, oversee and advise the board on current risk exposures and the future risk strategy40. Similar to the Netherlands, but contrary to the U.S., this U.K. recommendation is not extended to non-financial listed companies41. In general, from a legal perspective, reform as a reaction to the financial crisis includes the (further) development of the duties of the board, senior management, the supervisory body or non-executives, the audit committee, the internal audit, the compliance function, and the risk committee.

Next to regulating the duties of, and personal qualifications for, certain internal functions within the organization, the more recent financial industry-specific provisions emphasize the external monitoring role of the supervisory authorities. Financial markets are more global than they used to be, and since the financial crisis can be defined as a global systemic crisis, lawmakers are searching for ways to address regulatory repair internationally. The reaction at

33 Articles 41, 44, 46 and 101 of Directive 2009/138/EC
34 Articles 42, 46 and 47 of Directive 2009/138/EC
35 Commission Directive 2006/73/EC of 10 August 2006 implementing Directive 2004/39/EC of the European Parliament and of the Council as regards organisational requirements and operating conditions for investment firms and defined terms for the purposes of that Directive, OJ L 241, 2 September 2006, p. 26–58 (MiFID level 2 Directive)
36 Articles 6 and 7 of Commission Directive 2006/73/EC
37 Section 303A of the NYSE's Listed Company Manual
38 COSO, Effective enterprise risk oversight: the role of the board of directors, 2009; Section 5 of the Shareholder Bill of Rights Act of 2009 (S. 1074) of 19 May 2009. The bill proposes to amend the Securities Exchange Act of 1934 (15 U.S.C. 78a et seq.) by inserting after section 14 section 14A and by adding at the end subsection (e) 'corporate governance standards,' (5) 'risk committee'
39 Section 4.2 of the Dutch Banking Code: Nederlandse Vereniging van Banken (NVB), Code Banken, 9 September 2009, p. 10. This self-regulatory code will most likely receive a legal basis in 2010.
40 Recommendations 23-27 and Annex 10 (Elements in a board risk committee report) of the Walker Review, 2009, A review of corporate governance in U.K. banks and other financial industry entities – Final recommendations. In addition, see the U.K. Turner Review, 2009, A regulatory response to the global banking crisis, p. 93.
41 Financial Reporting Council, 2009, Review of the Combined Code: Final Report, p. 25

the E.U. level42 includes introducing European supervisory authorities to supplement the national supervision in order to repair the lack of cohesiveness and form a supervisory front. The European Commission mandated the de Larosière Group, a high-level group on financial supervision in the E.U., to propose recommendations on the future of European financial regulation and supervision. The report of the de Larosière Group provides for a framework that points out three main items to strive for: a new regulatory agenda (to reduce risk and improve risk management), stronger coordinated supervision (macro- and micro-prudential), and effective crisis management procedures (to build confidence among supervisors)43. In reaction to these recommendations, the European Commission proposes reforms relating to the way financial markets are regulated and supervised. It recommends a European financial supervisory framework composed of two new pillars. The first pillar is a European Systemic Risk Board (ESRB) "which will monitor and assess potential threats to financial stability that arise from macro-economic developments and from developments within the financial system as a whole ('macro-prudential supervision')"44. The second pillar is a European System of Financial Supervisors (ESFS) that should consist of a network of national financial supervisors as well as new European Supervisory Authorities. At the moment, three financial sector-specific committees are already in place at the E.U. level: the Committee of European Banking Supervisors (CEBS), the Committee of European Insurance and Occupational Pensions Supervisors (CEIOPS), and the Committee of European Securities Regulators (CESR). In order to establish the European Supervisory Authorities, a directive is proposed which transforms these committees into a European Banking Authority (EBA), a European Insurance and Occupational Pensions Authority (EIOPA), and a European Securities and Markets Authority (ESMA)45.

Concluding remarks
Since the financial crisis started, the legal risk management environment has been under construction. Especially for the financial services industry, regulations relating to the internal organization of the company and the external monitoring role of the supervisory authorities are piling up. As argued above, the underlying question here is whether new regulations can indeed prevent the next crisis. For new regulations to improve the legal environment of financial institutions — thereby reducing the imperfections shown by this crisis — firstly the current legal environment must be clear and, secondly, the primary problem has to be understood. This seems only logical, but gauging the precise problem is far from easy. Where Fein argues that it might not be bank regulation that is broken, but rather bank supervision, Kashyap argues that regulation is broken at the most basic level46. Even so, if new regulations can prevent a crisis such as this one, there is no guarantee that this type of regulation is needed to prevent the next one, as the next crisis will most likely have its own features. Besides that, at a roundtable on the future of financial regulation Harring argued that there might almost never be a perfect time for reform: "When profits are high and markets are buoyant, it's only we ivory tower types who think about it. And when there's a crash, risk aversion rises to such an extent that tightening regulations is unnecessary because institutions and markets are already too risk-averse to rekindle economic growth"47. To conclude, it might not be the right time for regulatory reform, but when it is, it might be best to focus on what we want to achieve in the long run and how that can be achieved, keeping in mind an adequate balance between the interests of corporate constituents, consumers, and investors, on the one hand, and the overall costs, on the other. After all, the financial services industry is already one of the most heavily regulated industries.

42 See for the U.S. reaction: H.R.4173 – Wall Street Reform and Consumer Protection Act of 2009. The U.S. House passed this financial services reform bill on 11 December 2009, which is said to be 'the biggest change in financial regulation since the Great Depression.' When the bill becomes law, it would provide for, inter alia, protection of consumers and investors, enhanced Federal understanding of insurance issues, and regulated over-the-counter derivatives markets. This legislation would also establish a Consumer Financial Protection Agency and give the Treasury Department new authority. See in addition, Department of the Treasury, Financial regulatory reform – a new foundation: rebuilding financial supervision and regulation, 17 June 2009 (the White Paper on Financial Regulatory Reform of June 2009 from the Obama Administration).
43 The de Larosière Group, 2009, Report of the high-level group on financial supervision in the EU, Brussels, p. 4
44 Communication from the Commission, European financial supervision, 27 May 2009, COM(2009) 252
45 Proposal for a directive of the European Parliament and of the Council amending Directives 1998/26/EC, 2002/87/EC, 2003/6/EC, 2003/41/EC, 2003/71/EC, 2004/39/EC, 2004/109/EC, 2005/60/EC, 2006/48/EC, 2006/49/EC, and 2009/65/EC in respect of the powers of the European Banking Authority, the European Insurance and Occupational Pensions Authority, and the European Securities and Markets Authority, 26 October 2009, COM(2009) 576. Legislative measures to implement the European Systemic Risk Board are not included in this proposal.
46 See for this discussion Morley, J. D., and R. Romano, eds., 2009, "The future of financial regulation", John M. Olin Center for Studies in Law, Economics, and Public Policy, Research Paper No. 386, 108-116
47 Morley, J. D., and R. Romano, eds., 2009, "The future of financial regulation", John M. Olin Center for Studies in Law, Economics, and Public Policy, Research Paper No. 386, p. 88
Articles

Interest rate risk hedging demand under a Gaussian framework

Sami Attaoui
Economics and Finance Department, Rouen Business School

Pierre Six
Economics and Finance Department, Rouen Business School

Abstract
This article analyzes the Merton-Breeden state-variable hedging demand for an investor endowed with a utility function over both intermediate consumption and terminal wealth. Based on the three-factor model of Babbs and Nowman (1999), we show that this demand can be expressed simply as a weighted average of zero-coupon bond sensitivities to these factors. The weighting parameter is the proportion of wealth our investor sets aside for future consumption rather than for terminal wealth.


Portfolio management of fixed-income securities has recently received a lot of attention in the literature. However, to the best of our knowledge, none of these studies focus on the Merton-Breeden hedging demand [Merton (1973), Breeden (1979)] for an arbitrary fixed income security. Considering an investor with a utility function over intermediate consumption and terminal wealth, we focus in this study on a stochastic opportunity set composed of stochastic interest rates modeled by the three-factor model of Babbs and Nowman (1999). Indeed, Litterman and Scheinkman (1991) have identified three common factors that account for more than 82% of innovations in bond returns.

Relying on the martingale approach for portfolio management [Karatzas et al. (1987), Cox and Huang (1989, 1991)], we provide useful insights on the factors related to interest rate risk hedging demand. Our results show that this demand is intimately linked to the sensitivities of zero-coupon bonds to the various factors, as well as to the proportion of wealth an investor sets aside for future consumption rather than for terminal wealth. In addition, these quantities can be straightforwardly computed.

Setting
We consider the three-factor interest rate model of Babbs and Nowman (1999)1. The three state variables Xi(t), i = 1, 2, 3 are assumed to be governed by the following dynamics:

$$dX(t) = -\mathrm{diag}(\kappa)X(t)\,dt + \sigma\,dz(t); \quad X(0) \equiv X, \tag{1}$$

where dz(t) ≡ [dz1(t) dz2(t) dz3(t)]′ is a vector of independent Brownian motions defined on a filtered probability space (Ω, (Ft)0≤t≤T, P), T designates the end of the economy, and P the historical probability. The market prices of risk, θ ≡ [θ1 θ2 θ3]′, linked to these Brownian motions are assumed to be constant. Moreover, κ ≡ [κ1 κ2 κ3]′ is a vector of positive constant mean-reversion speeds of adjustment and diag(κ) is the diagonal matrix whose ith element is κi. σ ≡ (σij), 1≤i,j≤3, is a lower triangular matrix of constants that represents the sensitivity of the factors to the various shocks.

The instantaneous risk-free interest rate is specified as follows:

$$r(t) \equiv \mu - X_1(t) - X_2(t) - X_3(t), \tag{2}$$

where μ is a positive constant.

Under these assumptions, Babbs and Nowman (1999) give a closed-form formula for the time-0 price of a zero-coupon bond maturing at time TB:

$$B(T_B, X) = \exp\left[-T_B\left(r_{\infty} - w(T_B)\right) - D_K(T_B)'X\right], \tag{3}$$

where $D_K(T_B) \equiv \left[D_{\kappa_1}(T_B)\; D_{\kappa_2}(T_B)\; D_{\kappa_3}(T_B)\right]'$, with $D_x(y) \equiv \frac{1}{x}\left(1 - \exp(-x y)\right)$.

These three functions represent the sensitivity of the bond price to the three factors. The long term interest rate r∞ and the deterministic function w(TB) are given in the appendix. Applying Ito's lemma to Equation (3), we obtain the zero-coupon volatility vector:

$$\sigma_B(T_B) = \sigma' D_K(T_B) \tag{4}$$

We consider an individual who invests in a riskless asset and in a portfolio composed of three non-redundant fixed-income securities. Let π(t) denote the vector of risky proportions, and W(t) and C(t) the investor's wealth and consumption, respectively. Furthermore, we assume that our investor has a constant relative risk aversion2, γ, and a finite investment horizon, TI.

The investor maximizes his/her expected utility over consumption and terminal wealth subject to a budget constraint. Formally,

$$J \equiv \sup_{\pi(s),\,0\leq s\leq T_I} E\left[\int_0^{T_I}\frac{C_s^{1-\gamma}}{1-\gamma}\,ds + \frac{W_{T_I}^{1-\gamma}}{1-\gamma}\right] \tag{5}$$

$$\frac{dW(t)}{W(t)} + \frac{C(t)}{W(t)}\,dt = \left[r(t) + \left(\varpi(t)'\pi(t)\right)'\theta\right]dt + \left(\varpi(t)'\pi(t)\right)'dz(t); \quad W(0) \equiv W, \tag{6}$$

and C(0) ≡ C is the investor's optimal initial consumption.

Using the dynamic programming approach3, Merton (1973) has shown that:

$$\pi(0) = \frac{J_W}{-WJ_{WW}}\,\Sigma^{-1}(0)\,e(0) + \sum_{i=1}^{3}\frac{J_{W_i}}{-WJ_{WW}}\,\Sigma^{-1}(0)\,\omega_i(0), \tag{7}$$

where JW and JWW denote the first and second partial derivatives of the value function J with respect to wealth, respectively. JWi is the second cross partial derivative of J with respect to wealth and state variable Xi. Σ(t) ≡ ϖ(t)ϖ(t)′ is the variance-covariance matrix, with ϖ(t) denoting the matrix of volatility of assets. e(t) represents the vector of expected returns in excess of the risk-free rate of these assets. ωi(t) stands for the vector of covariances between state variable Xi and the three risky assets.

The second component4 of the right hand side of Equation (7) is the so-called Merton-Breeden hedging demand for risky assets, whose number is equal to that of state variables.

As stated above, we focus on the part of the hedging demand that does not depend on the type of asset selected in the portfolio: JWi(t) ÷ [-W(t)JWW(t)], i = 1, 2, 35.
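To make the bond-price sensitivities concrete, the following minimal Python sketch evaluates Dx(y) and the zero-coupon volatility vector of Equation (4). It uses the Babbs and Nowman base-case values reported in Table 1 of the numerical illustration below; the variable and function names are ours, not the authors'.

```python
import numpy as np

# Sketch of D_x(y) and Equation (4): sigma_B(T_B) = sigma' D_K(T_B).
# kappa and sigma follow the base-case parameters of Table 1 below.
kappa = np.array([0.6553, 0.0705, 0.0525])      # mean-reversion speeds kappa_1..3
sigma = np.array([[ 0.0214,  0.0,     0.0   ],  # lower triangular shock matrix (sigma_ij)
                  [-0.0178,  0.0065,  0.0   ],
                  [ 0.0143, -0.0046,  0.0064]])

def D(x, y):
    """Bond-price sensitivity D_x(y) = (1/x) * (1 - exp(-x*y))."""
    return (1.0 - np.exp(-x * y)) / x

def zero_coupon_vol(T_B):
    """Zero-coupon volatility vector of Equation (4)."""
    return sigma.T @ D(kappa, T_B)

print(D(kappa, 10.0))         # factor sensitivities D_kappa_i(10)
print(zero_coupon_vol(10.0))  # sigma_B(10)
```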

1 Our analysis can also be carried out in the Dai and Singleton (2000) framework.
2 We consider the case γ>1. This is standard in the theoretical literature and has been supported by empirical evidence.
3 See Cox and Huang (1991) for the equivalence between the martingale and dynamic programming approaches.
4 The first component is the speculative mean-variance demand, which depends on the asset direction and can be much more easily studied than the Merton-Breeden hedging demand.
5 The study of the variance-covariance matrix as well as the covariance of assets with state variables can be easily carried out for various assets in our affine Gaussian framework and is thus omitted in the sequel.

Merton (1973) has shown that JWi(t) ÷ [-JWW(t)] could be couched in terms of two sensitivities:

$$\frac{J_{W_i}}{-J_{WW}} = -\frac{\partial_i C}{\partial_W C} \tag{8}$$

where ∂iC denotes the partial derivative of consumption with respect to state variable Xi and ∂WC is the partial derivative of consumption with respect to wealth. Since -∂iC ÷ ∂WC can be positive or negative, we provide below a comprehensive investigation of this term6.

In order to study this hedging component, we use the martingale approach [Karatzas et al. (1987), Cox and Huang (1989, 1991)], where the budget constraint (6) is cast into a martingale:

$$E\left[\int_0^{T_I}\frac{C_s}{G_s}\,ds + \frac{W_{T_I}}{G_{T_I}}\right] = W \tag{9}$$

The numeraire Gt is the growth portfolio whose dynamics obeys the following equation:7

$$\frac{dG(t)}{G(t)} \equiv \left[r(t) + \|\theta\|^2\right]dt + \theta'\,dz(t); \quad G(0) = 1 \tag{10}$$

Using the results of Munk and Sørensen (2007), the value function can be computed as follows:

$$J(\gamma,T_I,X,W) = \frac{W^{1-\gamma}}{1-\gamma}\left[\frac{W}{C}(\gamma,T_I,X)\right]^{\gamma}. \tag{11}$$

The optimal ratio of wealth to consumption is given only in terms of the growth portfolio:

$$\frac{W}{C}(\gamma,T_I,X) = Q(\gamma,T_I,X) + q(\gamma,T_I,X), \tag{12}$$

with $Q(\gamma,T_I,X) \equiv \int_0^{T_I} q(\gamma,u,X)\,du$ and $q(\gamma,u,X) \equiv E\left[G(u)^{\frac{1}{\gamma}-1}\right]$.

Equation (12) stresses the importance of the function q(γ, u, X), which describes how state variables affect the value function and hence the hedging component under scrutiny. The closed-form expression of q(γ, u, X) can be obtained by a change of probability measures where a particular numeraire8, whose volatility vector is (1/γ)θ + [1 - 1/γ]σB(TI), is introduced. We obtain the following proposition:

Proposition 1 — The functions q(γ, TI, X) are given by:

$$q(\gamma,T_I,X) = B(T_I,X)^{1-\frac{1}{\gamma}}\exp\left(-\frac{\gamma-1}{2\gamma^2}\int_0^{T_I}\left\|\sigma_B(v) - \theta\right\|^2 dv\right), \tag{13}$$

where the explicit expression of $\int_0^{T_I}\|\sigma_B(v)-\theta\|^2\,dv$ is given in the appendix.

Proof — Available from the authors upon request.

The first term of Equation (13) is a positive decreasing function of the investment horizon, and the second term is obviously also a positive decreasing function of the investment horizon. Hence, q(γ, TI, X) and Q(γ, TI, X) are decreasing and increasing functions of the investment horizon, respectively. As a consequence, the behavior of the wealth to consumption ratio [Equation (12)] as a function of the investment horizon has to be examined numerically. Moreover, Equation (12) shows that the wealth to consumption ratio does not depend on either wealth or consumption. Thus, using Equation (11), our hedging term can be couched solely in terms of consumption:

$$\frac{J_{W_i}}{-WJ_{WW}}(\gamma,T_I,X) = -\frac{\partial_i C}{C}(\gamma,T_I,X) \tag{14}$$

The hedging demand component is the opposite of the elasticity of consumption with respect to the state variables. To the best of our knowledge, this feature has not been much emphasized in the literature.

To further study this elasticity, we need to define another variable which measures, in proportion, how much wealth an investor sets aside to satisfy future consumption rather than terminal wealth. Building on Karatzas et al. (1987)9, we show that this proportion, πC(γ, TI, X), is given by:

$$\pi_C(\gamma,T_I,X) = \left[1 + \frac{q(\gamma,T_I,X)}{Q(\gamma,T_I,X)}\right]^{-1}. \tag{15}$$

πC(γ, TI, X) is an increasing function of the investment horizon, is positive, and lies between 0 and 1. When πC(γ, TI, X) ≡ 1, the investor focuses only on consumption, and in the case πC(γ, TI, X) ≡ 0, he/she is solely concerned with terminal wealth. We are now able to state the main result of our article:

Proposition 2 — The elasticity of consumption with respect to the state variables is given by:

$$\frac{\partial_i C}{C}(\gamma,T_I,X) = \pi_C(\gamma,T_I,X)\,H_i\!\left(\gamma,T_i(\gamma,T_I,X)\right) + \left[1 - \pi_C(\gamma,T_I,X)\right]H_i(\gamma,T_I) \tag{16}$$

with $H_i(\gamma,T(\cdot)) \equiv \left(1 - \frac{1}{\gamma}\right)D_{\kappa_i}(T(\cdot))$, and Ti(γ, TI, X) is given in the appendix.

Proof — Available from the authors upon request.

Equation (16) states that the elasticity of consumption can be decomposed into a weighted average of bond sensitivities. The weighting parameter is the proportion of wealth set aside for future consumption. Furthermore, Hi(γ, TI) and Hi(γ, Ti(γ, TI, X)) are the risk-aversion adjusted sensitivities of a zero-coupon bond price with respect to the state variable. They are related to a bond having TI as terminal maturity and to a bond maturing at an intermediate horizon Ti(γ, TI, X), respectively10. Moreover, in the case of the first bond the investor is solely concerned about terminal wealth, whereas in the case of the second bond he/she is concerned about intermediate consumption.
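As an illustration of how Proposition 1 feeds into Equations (12) and (15), the sketch below evaluates q, Q, and πC by numerical quadrature. It reuses kappa and zero_coupon_vol from the earlier sketch; bond_price(T) is a hypothetical helper standing in for Equation (3), so this is a sketch under stated assumptions rather than the authors' implementation.

```python
import numpy as np
from scipy.integrate import quad

theta = np.array([0.1582, 0.0961, 0.0173])  # market prices of risk (Table 1)

def q(gamma, T):
    """q(gamma, T, X) of Proposition 1; bond_price is a hypothetical
    helper implementing Equation (3)."""
    integrand = lambda v: float(np.sum((zero_coupon_vol(v) - theta) ** 2))
    integral, _ = quad(integrand, 0.0, T)
    return bond_price(T) ** (1.0 - 1.0 / gamma) * np.exp(
        -(gamma - 1.0) / (2.0 * gamma ** 2) * integral)

def pi_C(gamma, T_I):
    """Proportion of wealth set aside for consumption, Equation (15)."""
    Q, _ = quad(lambda u: q(gamma, u), 0.0, T_I)  # Q = integral of q over [0, T_I]
    return 1.0 / (1.0 + q(gamma, T_I) / Q)
```

With γ > 1, q falls and Q rises in the horizon, so πC computed this way drifts toward 1, consistent with Table 2 below.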

6 If ∂iC<0 (>0) then the investor will demand more of the risky asset whose returns are positively (negatively) correlated with changes in the state variable i. Therefore, an unfavorable shift in the opportunity set is offset by a higher level of wealth [see Merton (1973) for a more detailed discussion of this point].
7 ||·|| stands for the usual Euclidean norm.
8 Full demonstration is available from the authors upon request.
9 The proof is available from the authors upon request.
10 We show in the appendix that Ti(γ, TI, X) < TI, which implies that Hi(γ, Ti(γ, TI, X)) < Hi(γ, TI).

Finally, we point out that the elasticity of consumption is always positive.

Numerical illustration
In this section, we provide various numerical analyses of πC(γ, TI, X), Hi(γ, TI), Hi(γ, Ti(γ, TI, X)), Ti(γ, TI, X), and ∂iC/C(γ, TI, X). The base case parameters (Table 1) are taken from the empirical study in Babbs and Nowman (1999). The initial values for the state variables are set equal to their long-term means, i.e., zero. We consider three levels of risk aversion, γ = 3, 6, 9, and three investment horizons, TI = 3, 10, 30.

κ1 = 65.53%   κ2 = 7.05%   κ3 = 5.25%   θ1 = 15.82%   θ2 = 9.61%   θ3 = 1.73%   μ = 7.01%
σ11 = 2.14%   σ21 = -1.78%   σ22 = 0.65%   σ31 = 1.43%   σ32 = -0.46%   σ33 = 0.64%

Table 1 – Base case parameters

Table 2 provides results for πC(γ, TI, X). It is an increasing function of both the risk aversion and the investment horizon. It reaches 100% for a very long investment horizon. The investor, in this case, solely considers the hedging component linked to consumption.

        πC(γ,TI,X)
        TI=3     TI=10    TI=30
γ=3     79.21%   98.39%   100%
γ=6     80.15%   99.02%   100%
γ=9     80.46%   99.17%   100%

Table 2 – πC(γ, TI, X) as a function of γ and TI

Table 3 reports the bond sensitivities in the case of terminal maturity only (Panel A) and in the case of intermediate maturity (Panel B). Both components are increasing in risk aversion and investment horizon. However, they differ in size: the sensitivity linked to terminal wealth is higher than that linked to consumption. Moreover, the magnitude across the three factors varies significantly, especially for the component linked to terminal wealth. For example, for γ = 6 and TI = 10, we obtain, in the case of terminal wealth, H1(γ,TI) = 1.27, H2(γ,TI) = 5.98, and H3(γ,TI) = 6.48, and, in the case of intermediate consumption, H1(T1(γ,TI,X)) = 0.89, H2(T2(γ,TI,X)) = 2.08, and H3(T3(γ,TI,X)) = 2.16.

Panel A — linked to terminal wealth
        H1(γ,TI)                 H2(γ,TI)                 H3(γ,TI)
        TI=3   TI=10  TI=30     TI=3   TI=10  TI=30     TI=3   TI=10  TI=30
γ=3     0.875  1.02   1.02      1.8    4.78   8.32      1.85   5.19   10.1
γ=6     1.09   1.27   1.27      2.25   5.98   10.4      2.31   6.48   12.6
γ=9     1.17   1.35   1.36      2.4    6.38   11.1      2.47   6.92   13.4

Panel B — linked to intermediate consumption
        H1(T1(γ,TI,X))           H2(T2(γ,TI,X))           H3(T3(γ,TI,X))
        TI=3   TI=10  TI=30     TI=3   TI=10  TI=30     TI=3   TI=10  TI=30
γ=3     0.546  0.738  0.748     0.875  1.81   1.94      0.89   1.88   2.04
γ=6     0.675  0.89   0.897     1.08   2.08   2.16      1.1    2.16   2.25
γ=9     0.718  0.938  0.945     1.14   2.16   2.23      1.16   2.24   2.32

Table 3 – Bond sensitivities

The size and pattern of the consumption-linked sensitivity are reflected in the behavior of the intermediate horizon (Table 4). This consumption horizon is decreasing in risk aversion and increasing in the investor's terminal horizon.

        T1(γ,TI,X)               T2(γ,TI,X)                  T3(γ,TI,X)
        TI=3   TI=10  TI=30     TI=3     TI=10   TI=30      TI=3     TI=10    TI=30
γ=3     0.504  0.847  0.872     0.00684  0.015   0.0162     0.00381  0.00843  0.00918
γ=6     0.496  0.788  0.801     0.00673  0.0136  0.0142     0.00375  0.00767  0.00804
γ=9     0.493  0.771  0.781     0.0067   0.0132  0.0137     0.00373  0.00744  0.00774

Table 4 – Intermediate horizon

We finally turn to the ultimate variable of the paper, that is, the elasticity of consumption with respect to the state variables (Table 5). Despite the conflicting behavior of the various components of demand, the pattern of this component is monotonic. For all factors, this term is increasing in risk aversion and investment horizon. Moreover, the elasticity of consumption is substantially higher for the third factor than for the first one.

        ∂1C/C(γ,TI,X)             ∂2C/C(γ,TI,X)                  ∂3C/C(γ,TI,X)
        TI=3    TI=10   TI=30    TI=3     TI=10    TI=30        TI=3     TI=10    TI=30
γ=3     61.46%  74.24%  74.84%   106.77%  185.66%  194.45%      108.94%  193.72%  203.78%
γ=6     75.83%  89.36%  89.7%    131.04%  211.49%  216.26%      133.68%  219.93%  225.38%
γ=9     80.54%  94.18%  94.46%   138.93%  219.17%  223.02%      141.72%  227.68%  232.08%

Table 5 – Elasticity of consumption as a function of γ and TI
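As a quick sanity check on the reconstruction of Proposition 2, the risk-aversion adjusted sensitivities Hi(γ, TI) = (1 - 1/γ)Dκi(TI) can be recomputed directly from the Table 1 parameters; for γ = 6 and TI = 10 this reproduces the corresponding entries of Table 3, Panel A.

```python
import numpy as np

# H_i(gamma, T_I) = (1 - 1/gamma) * D_kappa_i(T_I), checked against
# Table 3, Panel A for gamma = 6 and T_I = 10.
kappa = np.array([0.6553, 0.0705, 0.0525])
gamma, T_I = 6.0, 10.0
H = (1.0 - 1.0 / gamma) * (1.0 - np.exp(-kappa * T_I)) / kappa
print(np.round(H, 2))  # [1.27 5.98 6.48], matching the table
```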



Concluding remarks
This article provides an in-depth analysis of the state-variable Merton-Breeden hedging demand when the opportunity set is affected by interest rates only. Relying on the three-factor model of Babbs and Nowman (1999), we show that the Merton-Breeden hedging demand boils down to a portfolio of risk-adjusted bond sensitivities with respect to the state variables. The portfolio weighting parameter is the wealth proportion that will be used for future consumption. Our analysis could be extended to take into account a stochastic behavior of the market price of risk.

Appendix
A. Zero coupon parameters
Following Babbs and Nowman (1999), the bond parameters are as follows:

$$r_{\infty} = \mu + \sum_{j=1}^{3}\theta_j\sum_{i=1}^{3}\frac{\sigma_{ij}}{\kappa_i} - \frac{1}{2}\sum_{j=1}^{3}\left(\sum_{i=1}^{3}\frac{\sigma_{ij}}{\kappa_i}\right)^{2} \tag{A1}$$

$$w(u) = \sum_{i=1}^{3}D_{\kappa_i}(u)\left(\sum_{j=1}^{3}\theta_j\frac{\sigma_{ij}}{\kappa_i} - \sum_{j=1}^{3}\sum_{k=1}^{3}\frac{\sigma_{kj}\,\sigma_{ij}}{\kappa_k\,\kappa_i}\right) + \frac{1}{2}\sum_{i,j}D_{\kappa_i+\kappa_j}(u)\sum_{k=1}^{3}\frac{\sigma_{ik}\,\sigma_{jk}}{\kappa_i\,\kappa_j} \tag{A2}$$

B. Closed-form expression of $\int_0^{T_I}\|\sigma_B(v)-\theta\|^2\,dv$
Direct computation leads to:

$$\int_0^{T_I}\left\|\sigma_B(v)-\theta\right\|^2 dv = \|\theta\|^2\,T_I + 2\sum_{i,j}\sigma_{ij}\,\theta_j\,\frac{1}{\kappa_i}\left(T_I - D_{\kappa_i}(T_I)\right) + \sum_{i,j}m_{ij}\,\frac{1}{\kappa_i\,\kappa_j}\left(T_I - D_{\kappa_i}(T_I) - D_{\kappa_j}(T_I) + D_{\kappa_i+\kappa_j}(T_I)\right) \tag{A3}$$

with $m_{ij} \equiv (\sigma\sigma')_{ij}$.

C.
It can be shown that Ti(γ, TI, X) is given by:

$$T_i(\gamma,T_I,X) \equiv -\kappa_i^{-1}\log\left(1 - \kappa_i\int_0^{T_I}\psi(\gamma,u,X)\,D_{\kappa_i}(u)\,du\right) \tag{C1}$$

with $\psi(\gamma,u,X) \equiv \frac{q(\gamma,u,X)}{Q(\gamma,T_I,X)}$. Since $-\kappa_i^{-1}\log\left(1-\kappa_i x\right)$ is the inverse function of $D_{\kappa_i}(x)$, Equation (C1) clearly demonstrates a weighted-average formula, which implies that Ti(γ, TI, X) < TI.

References
• Babbs, S., and K. Nowman, 1999, "Kalman filtering of generalized Vasicek term structure models," Journal of Financial and Quantitative Analysis, 34:1, 115-130
• Breeden, D., 1979, "An intertemporal asset pricing model with stochastic consumption and investment opportunities," Journal of Financial Economics, 7, 263-296
• Cox, J., and C. Huang, 1989, "Optimum consumption and portfolio policies when asset prices follow a diffusion process," Journal of Economic Theory, 49, 33-83
• Cox, J., and C. Huang, 1991, "A variational problem arising in financial economics," Journal of Mathematical Economics, 21, 465-488
• Dai, Q., and K. Singleton, 2000, "Specification analysis of affine term structure models," The Journal of Finance, 55, 1943-1978
• Karatzas, I., J. Lehoczky, and S. Shreve, 1987, "Optimal portfolio and consumption decisions for a small investor on a finite horizon," SIAM Journal on Control and Optimization, 25, 1557-1586
• Litterman, R., and J. Scheinkman, 1991, "Common factors affecting bond returns," Journal of Fixed Income, 1, 62-74
• Merton, R., 1973, "An intertemporal capital asset pricing model," Econometrica, 41, 867-887
• Munk, C., and C. Sørensen, 2007, "Optimal real consumption and investment strategies in dynamic stochastic economies," in Jensen, B. S., and T. Palokangas, (eds), Stochastic economic dynamics, CBS Press, 271-316
Articles

Non-parametric liquidity-adjusted VaR model: a stochastic programming approach

Emmanuel Fragnière
Professor, Haute École de Gestion de Genève, and Lecturer, University of Bath

Jacek Gondzio
Professor, School of Mathematics, University of Edinburgh

Nils S. Tuchschmid
Professor, Haute École de Gestion de Genève

Qun Zhang
School of Mathematics, University of Edinburgh

Abstract
This paper proposes a Stochastic Programming (SP) approach for the calculation of the liquidity-adjusted Value-at-Risk (LVaR). The model presented in this paper offers an alternative to Almgren and Chriss's mean-variance approach (1999 and 2000). In this research, a two-stage stochastic programming model is developed with the intention of deriving the optimal trading strategies that respond dynamically to a given market situation. The sample paths approach is adopted for scenario generation. The scenarios are thus represented by a collection of simulated sample paths rather than the tree structure usually employed in stochastic programming. Consequently, the SP LVaR presented in this paper can be considered as a non-parametric approach, which is in contrast to Almgren and Chriss's parametric solution. Initially, a set of numerical experiments indicates that the LVaR figures are quite similar for both approaches when all the underlying financial assumptions are identical. Following this sanity check, a second set of numerical experiments shows how randomness of different types (i.e., bid and ask spread) can be easily incorporated into the problem due to the stochastic programming formulation, and how optimal and adaptive trading strategies can be derived through a two-stage structure (i.e., a recourse problem). Hence, the results presented in this paper allow for the introduction of new dimensionalities into the computation of LVaR by incorporating different market conditions.


Developed over the last couple of decades, Value-at-Risk (VaR) models have been widely used as the main market risk management tool in the financial world [Jorion (2006)]. VaR estimates the likelihood of a portfolio loss caused by normal market movements over a given period of time. However, VaR fails to take into consideration the market liquidity impact. Its estimate is quite often based on mid-prices and the assumption that transactions do not affect market prices. Nevertheless, large trading blocks might impact prices, and trading activity is always costly. To overcome these problems, some researchers have proposed the calculation of liquidity-adjusted VaR (LVaR) [Dowd (1998)]. Differing from the conventional VaR, LVaR takes both the size of the initial holding position and the liquidity impact into account. The liquidity impact is commonly subcategorized into exogenous and endogenous illiquidity factors. The former is normally measured by the bid-ask spread, and the latter is expressed as the price movement caused by market transactions [Bangia et al. (1999)]. From this perspective, LVaR can be seen as a complementary tool for risk managers who need to estimate market risk exposure and are unwilling to disregard the liquidity impact.

Bangia et al. (1999) proposed a simple but practical solution that is directly derived from the conventional VaR model, in which an illiquidity factor is expressed as the bid-ask spread. Although this approach avoids many complicated calculations, it fails to take into consideration endogenous illiquidity factors. Hence, liquidity risk and LVaR are underestimated. A more promising solution for LVaR estimation stems from the derivation of optimal trading strategies, as suggested by Almgren and Chriss (1999 and 2000). In their model, Almgren and Chriss adopted the permanent and temporary market impact mechanisms from Holthausen et al.'s work (1987) and assumed linear functions for both of them. By externally setting a sales completion period, they derived an optimal trading strategy, defined as the strategy with the minimum variance of transaction cost, or of shortfall, for a given level of expected transaction cost or, inversely, as the strategy that has the lowest level of expected transaction cost for a given level of variance. With the normal distribution and the mean and variance of transaction cost, LVaR can also be determined and minimized to derive optimal trading strategies. In this setting, LVaR can be understood as the pth percentile possible loss that a trading position can encounter when liquidity effects are incorporated into the risk measure computation. Later on, Almgren (2003) extended this model by using a continuous-time approximation, and also introduced a non-linear and stochastic temporary market impact function. Another alternative is the liquidity discount approach presented by Jarrow and Subramanian (1997 and 2001). Similar to Almgren and Chriss's approach (1999 and 2000), the liquidity discount approach requires that the sales completion period be given as an exogenous factor. The optimal trading strategy is then derived by maximizing an investor's expected utility of consumption. Note that both approaches require externally setting a fixed horizon for liquidation. Aiming to overcome this problem, Hisata and Yamai (2000) extended Almgren and Chriss's approach by assuming a constant speed of sales and by using a continuous approximation. They could derive a closed-form analytical solution for the optimal holding period. In this setting, the sales completion time thus becomes an endogenous variable. Yet, Hisata and Yamai's model relies on the strong assumption of a constant speed of sales.

Krokhmal and Uryasev (2006) argued that the solution offered by Almgren and Chriss and that of Jarrow and Subramanian were unable to dynamically respond to changes in market conditions. Therefore, they suggested a stochastic dynamic programming method and derived an optimal trading strategy by maximizing the expected stream of cash flows. Under their framework, the optimal trading strategy becomes highly dynamic as it can respond to market conditions at each time step. Another methodology that incorporates these dynamics into an optimal trading strategy is that of Bertsimas and Lo (1998). They applied a dynamic programming approach to optimal liquidation problems. Analytical expressions of the dynamic optimal execution strategies are derived by minimizing the expected trading cost over a fixed time horizon.

In this paper, we present a new framework for the calculation of non-parametric LVaR by using stochastic programming (SP) techniques. Over the past few years, stochastic programming has grown into a mature methodology used to approach decision making problems in uncertain contexts. The main advantage of SP is its ability to better tackle optimization problems under conditions of uncertainty over time. Due to the fast development of computing power, it has been used to solve large scale optimization problems1. Therefore, we believe it is a promising methodology for LVaR modeling.

The SP approach presented in this paper is extended from Almgren and Chriss's framework (1999 and 2000). The sample path approach is adopted for scenario generation, rather than the scenario tree structure usually employed in SP. The scenario set is represented by a collection of simulated sample paths. Differing from Almgren and Chriss's parametric formulation of LVaR, we present a non-parametric formulation for LVaR. Both exogenous and endogenous illiquidity factors are taken into account. The former is measured by the bid-ask spread, and the latter is expressed by linear market impact functions, which are related to the quantity of sales. The model in this paper is built in a discrete-time manner, and the holding period is required to be determined externally. The permanent and temporary market impact mechanism proposed by Holthausen et al. (1987) is adopted to formulate the market impact, and both permanent and temporary market impacts are assumed to be linear functions.

1 Gondzio and Grothey (2006) showed in their research that they could solve a quadratic financial planning problem exceeding 10^9 decision variables by applying a structure-exploiting parallel primal-dual interior-point solver.

Stochastic programming LVaR model
This paper proposes a SP approach to estimate the non-parametric LVaR, which is based on Almgren and Chriss's mean-variance approach (1999 and 2000). While their model has been shown to be an interesting methodology for the calculation of LVaR and has a huge potential in practice, the optimal trading strategies derived by their model fail to respond dynamically to the market situation, as they rely on a 'closed-form' or 'static' framework. For instance, if an increasing trend is observed in the market price, investors may decide to slow their liquidation process. If, on the other hand, unexpected market shocks occur, investors may decide to adjust their trading strategy and to speed up the completion of their sale. These market situations can be simulated and incorporated into scenarios. Clearly, any 'closed form' solution cannot deal with this type of uncertainty in such a dynamic manner. The LVaR formulation in Almgren and Chriss's model is based on the mean-variance framework; thus, it can be considered as a parametric approach. In contrast, the LVaR formulation presented in this paper is non-parametric and allows for the incorporation of various dynamics in the liquidation process. Thus, we propose a new framework for LVaR modeling.

Almgren and Chriss's mean-variance model
According to Almgren and Chriss's framework, a holding period T is required to be set externally. Then, this holding period is divided into N intervals of equal length (τ = T/N). The trading strategy is defined as the quantity of shares sold in each time interval, which is denoted by a list n1,…, nk,…, nN, where nk is the number of shares that the trader plans to sell in the kth interval. Accordingly, the quantity of shares that the trader plans to hold at time tk = kτ is denoted by xk. Suppose a trader has a position X that needs to be liquidated before time T; then we have:

$$\sum_{k=1}^{N}n_k = X \quad\text{and}\quad x_k = X - \sum_{j=1}^{k}n_j = \sum_{j=k+1}^{N}n_j, \quad k = 0,\ldots,N.$$

Price dynamics in Almgren and Chriss's model are formulated as an arithmetic random walk as follows:

$$S_k = S_{k-1} + \sigma\tau^{1/2}\xi_k + \mu\tau - \tau\,g(n_k/\tau) \tag{1}$$

where Sk is the equilibrium price after a sale, μ and σ are the drift and volatility of the asset price, respectively, and ξk is a random number that follows a standard normal distribution N(0, 1). The last term, g(nk/τ), describes the permanent market impact from a sale. The actual sale price is calculated by subtracting the temporary impact, h(nk/τ), from the equilibrium price:

$$\tilde{S}_k = S_k - h(n_k/\tau) \tag{2}$$

According to Almgren and Chriss's framework (1999 and 2000), both g(nk/τ) and h(nk/τ) are assumed to be linear functions:

$$g(n_k/\tau) = \gamma\,\frac{n_k}{\tau} \tag{3} \qquad\qquad h(n_k/\tau) = \frac{\epsilon}{2} + \eta\,\frac{n_k}{\tau} \tag{4}$$

where γ and η are the permanent and temporary market impact coefficients2, respectively, and ε denotes the bid-ask spread. They are all assumed to be constant.

Based on the previously presented equations, the formula for the actual sale price is derived as:

$$\tilde{S}_k = \underbrace{S_0 + \sigma\sum_{j=1}^{k}\tau^{1/2}\xi_j + \mu k\tau}_{\mathrm{I}} \;-\; \underbrace{\gamma\sum_{j=1}^{k}n_j}_{\mathrm{II}} \;-\; \underbrace{\left(\frac{\epsilon}{2} + \eta\,\frac{n_k}{\tau}\right)}_{\mathrm{III}} \tag{5}$$

As can be seen from this formula, the actual sale price can be decomposed into three parts. Part I is the price random walk, which describes the price dynamics without any market impact. Parts II and III are the price declines caused by the permanent and temporary market impact, respectively.

Then the total proceeds can be calculated by summing the sale values over the entire holding period:

$$\text{total proceeds} = \sum_{k=1}^{N}n_k\tilde{S}_k = XS_0 + \sigma\sum_{k=1}^{N}\tau^{1/2}\xi_k x_{k-1} + \mu\tau\sum_{k=1}^{N}x_{k-1} - \gamma\sum_{k=1}^{N}n_k(X - x_k) - \frac{\epsilon}{2}X - \frac{\eta}{\tau}\sum_{k=1}^{N}n_k^2$$
$$= XS_0 + \sigma\sum_{k=1}^{N}\tau^{1/2}\xi_k x_{k-1} + \mu\tau\sum_{k=1}^{N}x_{k-1} - \frac{\gamma}{2}X^2 - \frac{\epsilon}{2}X - \left(\frac{\gamma}{2} + \frac{\eta}{\tau}\right)\sum_{k=1}^{N}n_k^2 \tag{6}$$

Consequently, the 'liquidation cost' (LC)3 can be derived by subtracting the total actual sale proceeds from the trader's initial holding value, that is:

$$LC = XS_0 - \sum_{k=1}^{N}n_k\tilde{S}_k = -\sigma\sum_{k=1}^{N}\tau^{1/2}\xi_k x_{k-1} - \mu\tau\sum_{k=1}^{N}x_{k-1} + \frac{\gamma}{2}X^2 + \frac{\epsilon}{2}X + \left(\frac{\gamma}{2} + \frac{\eta}{\tau}\right)\sum_{k=1}^{N}n_k^2. \tag{7}$$

Almgren and Chriss derive the formulae for the mean and variance of the liquidation cost as:

$$E[LC] = -\mu\tau\sum_{k=1}^{N}x_{k-1} + \frac{\gamma}{2}X^2 + \frac{\epsilon}{2}X + \left(\frac{\gamma}{2} + \frac{\eta}{\tau}\right)\sum_{k=1}^{N}n_k^2 \tag{8}$$

$$V[LC] = \sigma^2\tau\sum_{k=1}^{N}x_{k-1}^2 \tag{9}$$

Finally, they formulate the LVaR by using the parametric approach with the mean and variance of the LC as:

$$LVaR = E[LC] + \alpha_{cl}\sqrt{V[LC]} \tag{10}$$

where cl denotes the confidence level for the LVaR estimation, and αcl is the corresponding percentile of the standard normal distribution. As expressed, LVaR measures a possible loss that a given position can encounter when liquidity effects are incorporated into the risk measure computation.
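A minimal sketch of Equations (8)-(10) in Python for a naive strategy that sells the position in equal blocks. All parameter values are illustrative assumptions of ours, not the paper's JP Morgan calibration.

```python
import numpy as np
from scipy.stats import norm

# E[LC], V[LC] and parametric LVaR (Equations (8)-(10)) for a uniform
# liquidation schedule; parameter values are assumed for illustration.
X, N, T = 100_000, 10, 5.0                 # position, number of sales, holding period (days)
tau = T / N                                # interval length
mu, sigma = 0.0, 0.60                      # drift and volatility of the price (per day)
gamma_c, eta, eps = 2.5e-6, 2.5e-6, 0.08   # permanent/temporary impact, spread

n = np.full(N, X / N)                      # n_k: shares sold per interval
x = X - np.cumsum(n)                       # x_k: holdings after sale k
x_prev = np.concatenate(([X], x[:-1]))     # x_{k-1}

E_LC = (-mu * tau * x_prev.sum() + 0.5 * gamma_c * X**2 + 0.5 * eps * X
        + (0.5 * gamma_c + eta / tau) * np.sum(n**2))
V_LC = sigma**2 * tau * np.sum(x_prev**2)
lvar_95 = E_LC + norm.ppf(0.95) * np.sqrt(V_LC)  # Equation (10), cl = 95%
print(E_LC, V_LC, lvar_95)
```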

2 For the estimation of the temporary and permanent market impact coefficients, Almgren and Chriss did not propose a specific method. They assumed that: for the temporary market impact, trading each 1% of the daily volume incurs a price depression of one bid-ask spread, and for the permanent market impact, trading 10% of the daily volume will have a significant impact on price and incur a price depression of one bid-ask spread. Since this paper focuses on the LVaR modeling, not the estimation of market impact coefficients, Almgren and Chriss's simple assumption is adopted for all the numerical experiments in this paper.
3 In Almgren and Chriss's paper, this 'cost' is referred to as the transaction cost. However, the transaction cost is commonly known as the fees involved for participating in the market, such as the commission to the brokers. Therefore, in order to avoid any confusion, it is named 'liquidation cost' in this paper.

The optimal trading strategy could be derived by minimizing the LVaR. The mathematical programming formulation of this optimization problem is thus written as:

$$\begin{aligned} \min_{n_k}\quad & E[LC] + \alpha_{cl}\sqrt{V[LC]} \\ \text{s.t.}\quad & x_k = \sum_{j=k+1}^{N}n_j \qquad k = 0,\ldots,N-1 \\ & X = \sum_{k=1}^{N}n_k \\ & n_1,\ldots,n_N \geq 0. \end{aligned}$$

Based on this brief introduction to Almgren and Chriss's mean-variance approach, we can thereon proceed to present the SP approach to LVaR modeling.

Stochastic programming transformation
In stochastic programming, uncertainty is modeled with scenarios that are generated by using available information to approximate future conditions. Before conducting the SP transformation, we need to briefly introduce the scenario generation technique used in this paper.

The liquidation process of investors' positions is a multi-period problem. The most commonly used technique is to model the evolution of stochastic parameters with multinomial scenario trees, as shown in Figure 1(a).

However, the use of scenario tree structures often leads to considerable computational difficulty, especially when dealing with large scale practical problems. In the scenario tree structure, the uncertainties are represented by the branches that are generated from each node. Increasing the number of branches per node can improve the quality of the approximation of the uncertainty. However, it causes an exponential growth in the number of nodes. Indeed, in order to approximate the future value of the uncertain parameters with a sufficient degree of accuracy, the resulting scenario tree could be of a huge size. This is commonly known as the "curse of dimensionality" [Bellman (1957)]. It is a significant obstacle for dynamic or stochastic optimization problems. An alternative method to overcome this problem is to simulate a collection of sample paths to reveal the future uncertainty, as shown in Figure 1(b). Each simulated path represents a scenario. These sample paths can be generated by using Monte Carlo simulation, historical simulation, or bootstrapping. There have been several interesting papers regarding the application of the sample paths method in stochastic programming [Hibiki (2000), Krokhmal and Uryasev (2006)]. Using sample paths is advantageous because increasing the number of paths to achieve a better approximation causes the number of nodes to increase linearly rather than exponentially. This advantage is also present when the time period is increased: the number of nodes increases linearly with the sample paths method and exponentially with the scenario tree structure.

[Figure 1 – Scenario generation: (a) scenario tree; (b) sample paths, both evolving over the time period]

Let $\Omega = \left\{(C_0, C_{1,s}, C_{2,s}, \ldots, C_{k,s}, \ldots, C_{N,s})\,|\, s = 1,\ldots,S_c\right\}$ be a collection of sample paths, where Ck,s represents the information about the relevant parameters. In Almgren and Chriss's model (1999 and 2000), we should recall that the only randomness considered is the market price. Hence, to set a point of comparison between their results and the results from the SP approach, we first assume that the only randomness considered in the sample paths comes from the market price component, ξk. Yet, this restrictive assumption can clearly be easily relaxed, and randomness can be added to other parameters, such as the bid-ask spread or the temporary and permanent market impact coefficients4.

Under the SP framework, the trading strategy is no longer a vector but a two-dimensional matrix

$$\text{strategy} = \begin{bmatrix} n_{1,1} & \cdots & n_{k,1} & \cdots & n_{N,1} \\ \vdots & & \vdots & & \vdots \\ n_{1,s} & \cdots & n_{k,s} & \cdots & n_{N,s} \\ \vdots & & \vdots & & \vdots \\ n_{1,S_c} & \cdots & n_{k,S_c} & \cdots & n_{N,S_c} \end{bmatrix}$$

whose first column collects the first stage variables and whose remaining columns collect the second stage variables. Here nk,s is the quantity of shares sold in the kth interval on path s, s is the index of scenarios, and Sc is the number of scenarios. This is a two-stage SP problem: n1,s (s = 1,...,Sc) are the first stage variables, and nk,s (k = 2,…,N and s = 1,...,Sc) are the second stage variables. Due to the nonanticipativity in the first stage, the first stage variables must be locked:

$$n_{1,s} = n_{1,s-1} \qquad s = 2,\ldots,S_c.$$

4 An extension is presented below.
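The sample-path generation described above is straightforward to sketch in Python. The snippet below simulates a matrix of price scenarios by Monte Carlo under a geometric Brownian motion (the paper's specification appears in Equation (14) on the next page); apart from the initial price of 37.72, the parameter values are assumptions of ours.

```python
import numpy as np

# Monte Carlo sample paths: one row per sale interval, one column per scenario.
rng = np.random.default_rng(0)
S0, T, N, Sc = 37.72, 5.0, 10, 10_000  # initial price, horizon, sales, scenarios
tau = T / N
mu, sigma = 0.0, 0.02                   # assumed drift and volatility (per day)
xi = rng.standard_normal((N, Sc))       # iid N(0, 1) shocks
log_incr = (mu - 0.5 * sigma**2) * tau + sigma * np.sqrt(tau) * xi
S_hat = S0 * np.exp(np.cumsum(log_incr, axis=0))  # 10-by-10,000 price matrix
```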



For the actual sale price formulation, recall Equation (5). Taking into account the scenarios and replacing part I with Ŝk,s (i.e., the asset price without market impacts in the kth interval for each scenario), the actual sale price is reformulated as:

$$\tilde{S}_{k,s} = \hat{S}_{k,s} - \gamma\sum_{j=1}^{k}n_{j,s} - \frac{\epsilon}{2} - \eta\,\frac{n_{k,s}}{\tau} \tag{11}$$

As we now have the sale price formulation, the total sale proceeds corresponding to each scenario are naturally obtained by summing up the sale proceeds over the entire set of N intervals:

$$\text{total proceeds} = \sum_{k=1}^{N}n_{k,s}\tilde{S}_{k,s} = \sum_{k=1}^{N}\hat{S}_{k,s}n_{k,s} - \gamma\sum_{k=1}^{N}n_{k,s}\sum_{j=1}^{k}n_{j,s} - \frac{\epsilon}{2}\sum_{k=1}^{N}n_{k,s} - \frac{\eta}{\tau}\sum_{k=1}^{N}n_{k,s}^2$$
$$= \sum_{k=1}^{N}\hat{S}_{k,s}n_{k,s} - \frac{\gamma}{2}X^2 - \frac{\epsilon}{2}X - \left(\frac{\gamma}{2} + \frac{\eta}{\tau}\right)\sum_{k=1}^{N}n_{k,s}^2 \tag{12}$$

Consequently, the liquidation cost under scenario s is derived by subtracting the corresponding total sale proceeds from the trader's initial holding value, that is:

$$LC_s = XS_0 - \sum_{k=1}^{N}n_{k,s}\tilde{S}_{k,s} = XS_0 + \frac{\epsilon}{2}X + \frac{\gamma}{2}X^2 - \sum_{k=1}^{N}\hat{S}_{k,s}n_{k,s} + \left(\frac{\gamma}{2} + \frac{\eta}{\tau}\right)\sum_{k=1}^{N}n_{k,s}^2 \tag{13}$$

The deterministic equivalent formulation of this SP problem with nonanticipativity constraints is:

$$\begin{aligned} \min_{n_{k,s}}\quad & \sum_{s=1}^{S_c}p_s\,LC_s \\ \text{s.t.}\quad & X = \sum_{k=1}^{N}n_{k,s} \qquad s = 1,\ldots,S_c \\ & n_{1,s},\ldots,n_{N,s} \geq 0 \qquad s = 1,\ldots,S_c \\ & n_{1,s} = n_{1,s-1} \qquad s = 2,\ldots,S_c \end{aligned}$$

where ps is the probability of scenario s. Since the scenarios are obtained by the Monte Carlo simulation, they are equally probable, with ps = 1/Sc.

The resulting problem is a quadratic optimization problem. The objective, the expected value of the liquidation cost, is a quadratic function, and all constraints are linear.

Non-parametric LVaR formulation
Depending on the set of assumptions, the calculation methodology, and their uses, two different types of VaR usually exist, i.e., the parametric VaR and the non-parametric VaR. The same categorization obviously applies to LVaR. The LVaR estimated by Almgren and Chriss (1999 and 2000) is parametric, as shown in Equation (10). In this paper, we have to rely on a non-parametric formulation because it stems from the SP solution that we have adopted. More precisely, the calculation procedure is as follows:

1 Solve the stochastic optimization problem stated above and obtain the optimal trading strategy matrix.
2 Apply the optimal trading strategies to the corresponding scenario and calculate the liquidation cost for that specific scenario. That is to say, we substitute the optimal trading strategy matrix into the liquidation cost formula (Equation [13]), and calculate LC, which is a vector indexed by s.
3 Sort the vector LC, and find the value of the αth percentile of LC, i.e., the α%-LVaR. The most commonly used α values are 95 and 99.

Numerical experiments I
As previously mentioned, we first conducted a sanity check. This section details the numerical experiments for both the SP model and Almgren and Chriss's mean-variance model with the restriction of randomness to one component only, i.e., the 'pure market price.'

JP Morgan's stock data was collected for the numerical experiments. The holding period, T, was set to be 5 days, and we selected the time interval to be 0.5 day. Thus, the total number of sales, N, was 10. The selection of the holding period and time interval was arbitrary.

For the price sample paths generation, the Monte Carlo simulation was applied. The stochastic evolution of the price was assumed to follow a geometric Brownian motion:

$$\hat{S}_k = \hat{S}_{k-1}\exp\left[\left(\mu - \frac{\sigma^2}{2}\right)\tau + \sigma\tau^{1/2}\xi_k\right] \tag{14}$$

Under Almgren and Chriss's mean-variance framework, the market price was assumed to follow an arithmetic random walk because it is ultimately rather difficult to derive a closed-form solution based on an assumption of geometric Brownian motion. Yet, with Monte Carlo simulations, formulating the price evolution under different assumptions creates no issues related to the underlying distributions that could generate prices and returns. Since the geometric random walk is the most commonly used assumption for price stochastic processes, it is used in this paper, even though the differences between these two random walks are almost negligible over a short period of time.

10,000 sample paths were generated by using the Monte Carlo simulation. The simulated prices form a 10-by-10,000 matrix. The initial price is 37.72. These simulated sample paths are displayed in Figure 2.

Five different initial holdings were chosen for the numerical experiments with the aim of observing how the initial position affected the LVaR estimation. The LVaRs were calculated with the most commonly seen confidence levels of 95% and 99%. The results are summarized in Table 1.
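The three-step procedure above can be sketched with the cvxpy modeling package: solve the deterministic equivalent QP with the nonanticipativity lock, then evaluate Equation (13) at the optimum and read off the percentile. S_hat is the simulated price matrix from the previous sketch (subsampled to keep the sketch light); the position size and impact parameters are illustrative values of ours, not the paper's calibration.

```python
import numpy as np
import cvxpy as cp

S = S_hat[:, :500]                        # use 500 of the 10,000 paths for speed
N, Sc = S.shape
S0, X_pos, tau = 37.72, 100_000.0, 0.5
gamma_c, eta, eps = 2.5e-7, 2.5e-6, 0.08  # assumed impact coefficients and spread

n = cp.Variable((N, Sc), nonneg=True)     # n[k, s]: shares sold in interval k on path s
fixed = X_pos * S0 + 0.5 * eps * X_pos + 0.5 * gamma_c * X_pos**2
LC = (fixed - cp.sum(cp.multiply(S, n), axis=0)           # Equation (13), per scenario
      + (0.5 * gamma_c + eta / tau) * cp.sum(cp.square(n), axis=0))
problem = cp.Problem(cp.Minimize(cp.sum(LC) / Sc),        # step 1: expected liquidation cost
                     [cp.sum(n, axis=0) == X_pos,          # full liquidation on every path
                      n[0, 1:] == n[0, :-1]])              # nonanticipative first stage
problem.solve()

# Steps 2-3: plug the optimal strategy into Equation (13), take the percentile.
LC_opt = (fixed - np.sum(S * n.value, axis=0)
          + (0.5 * gamma_c + eta / tau) * np.sum(n.value**2, axis=0))
print(np.percentile(LC_opt, 95))          # 95%-LVaR
```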

The numerical results show that the LVaR ratios computed by the SP model are slightly lower than those computed by Almgren and Chriss's model at both the 95% and 99% confidence levels. Also, as expected, the numerical results show that the LVaR estimates increase when the initial holdings increase. This is true for both Almgren and Chriss's model and the SP model. As previously stated, as the initial holding becomes larger, the market impact becomes stronger when a trader liquidates his position. Consequently, the LVaR will increase as well. This is clearly a characteristic that distinguishes the LVaR from the traditional VaR.

[Figure 2 – Simulated price scenarios (simulated price random walks over the five-day horizon, starting from the initial price of 37.72)]

Initial holding (shares)          1000000    500000     100000     50000      10000
Almgren and Chriss's mean-variance model
  Parametric LVaR 95%             1.425E+06  6.186E+05  1.010E+05  4.845E+04  9.311E+03
  Parametric LVaR per share       1.425      1.237      1.010      0.969      0.931
  Parametric LVaR ratio           3.78%      3.28%      2.68%      2.57%      2.47%
  Parametric LVaR 99%             1.863E+06  8.216E+05  1.388E+05  6.718E+04  1.304E+04
  Parametric LVaR per share       1.863      1.643      1.388      1.344      1.304
  Parametric LVaR ratio           4.94%      4.36%      3.68%      3.56%      3.46%
Stochastic programming model
  Non-parametric LVaR 95%         1.290E+06  5.399E+05  8.144E+04  3.832E+04  7.218E+03
  Non-parametric LVaR per share   1.290      1.080      0.814      0.766      0.722
  Non-parametric LVaR ratio       3.42%      2.86%      2.16%      2.03%      1.91%
  Non-parametric LVaR 99%         1.811E+06  7.944E+05  1.342E+05  6.442E+04  1.249E+04
  Non-parametric LVaR per share   1.811      1.589      1.342      1.288      1.249
  Non-parametric LVaR ratio       4.80%      4.21%      3.56%      3.42%      3.31%

Table 1 – Numerical results summary
*LVaR per share = LVaR/initial holding; LVaR ratio = LVaR per share/initial price

SP LVaRs are lower than Almgren and Chriss's LVaRs because the SP model's optimal trading strategies can dynamically adapt to the market situation. This fits investors' actual trading behaviors in the market, as they will adjust their trading plans according to the market environment. Therefore, the SP model can provide more precise LVaR estimates due to the characteristic of the SP model's adaptive trading strategies.

Generalization of the stochastic programming LVaR model
A simple stochastic programming LVaR model that was transformed from Almgren and Chriss's mean-variance model was presented above in order to compare it with the other models discussed (i.e., both LVaR approaches were used under the same setting). Let us now extend the analysis and show some of the advantages provided by the SP approach. Contrary to Almgren and Chriss's model, which assumes that both the bid-ask spread and the market impact coefficients are constants, we generalize the SP LVaR model by relaxing this assumption and treating these components as random variables.

By incorporating randomness in the bid-ask spread and both the permanent and temporary market impact coefficients, the formula of the actual sale price needs to be rewritten as:

$$\tilde{S}_{k,s} = \hat{S}_{k,s} - \gamma_{k,s}\sum_{j=1}^{k}n_{j,s} - \frac{\epsilon_{k,s}}{2} - \eta_{k,s}\,\frac{n_{k,s}}{\tau} \tag{15}$$

For the formulation of εk,s, we employ a standardization process. Since bid-ask spreads tend to be proportional to asset prices, past observations may not accurately reflect the current variations. Bangia et al. (1999) suggested calculating a relative bid-ask spread that is equal to the bid-ask spread divided by the mid-price. By employing this calculation, the bid-ask spread is expressed as a proportion of the asset price; thus, the current bid-ask spread variation is sensitive to the current asset price rather than to past observations. The relative bid-ask spread, as a normalizing device, can improve the accuracy of the bid-ask spread variation estimation. The bid-ask spread is thus formulated as:

$$\epsilon_{k,s} = \hat{S}_{k,s}\,\tilde{\epsilon}_{k,s} \tag{16}$$

where ε̃k,s is the relative bid-ask spread at time tk on path s. Recall the sample path set



{
= (C 0, C1 , s , C 2, s , … , C k , s , … , C N , s )| s = 1, … , Sc . } Initial holding
(shares) 1000000 500000 100000 50000 10000

By incorporating the randomness into the relative bid-ask spread Non-parametric


1.084E+06 4.740E+05 7.800E+04 3.739E+04 7.119E+03
LVaR 95%
and market impact coefficients, Ck,s is extended to
Non-parametric
1.084 0.948 0.780 0.748 0.712
LVaR per share
Ck,s (
= Sˆk , s , k,s , k,s , k,s ). Non-parametric
LVaR ratio
2.87% 2.51% 2.07% 1.98% 1.89%

Non-parametric
In other words, in the generalized SP model, each node on the simu- 1.621E+06 7.389E+05 1.312E+05 6.403E+04 1.243E+04
LVaR 99%
lated sample paths contains information for the asset price, the Non-parametric
1.621 1.478 1.312 1.281 1.243
LVaR per share
relative bid-ask spread, and the permanent and temporary market
Non-parametric
impact coefficients. LVaR ratio
4.30% 3.92% 3.48% 3.40% 3.30%

As we now have the new sale price formulation, the formula for Table 2 – Numerical results

liquidation cost under scenario s (as shown in Equation [13]) is


rewritten as: In Figure  3, we can see that the LVaR ratios computed by the SP
2
model with the incorporation of randomness into the bid-ask spread
N N
1 ˆ k n
LC s = XS0 nk , s S�k , s = XS0 Sˆk , snk , s Sk , s k,s nk , s nk , s k,s nj , s k,s k,s
 (17) and the market impact coefficients are slightly lower than those
k= 1 k= 1 2 j=1
computed by the SP model with the constant bid-ask spread and
The deterministic equivalent formulation and the LVaR calculation market impact coefficients. When the initial holding is small, incor-
procedure are the same as shown above. By generalizing the SP porating these new random variables does not cause a significant
model, more parameters are incorporated in the sample path set, change to the LVaR estimate. However, when the initial holding is
which leads to a more accurate approximation of future uncertain- large, the differences are substantial. For instance, when the initial
ties. holding is 1,000,000, incorporating randomness reduces the 95%
LVaR ratio from 3.42% to 2.87% and the 99% LVaR ratio from
Numerical experiments II 4.80% to 4.30%.
This section reports the numerical experiments for the generalized
SP LVaR model presented above. We use the same dataset that The main reason for these differences must lie in the way the opti-
was used for aforementioned numerical experiments. The holding mal trading strategies that are derived by the SP model respond to
period and time interval remain identical, i.e., 5 days and half a day, the variation of the bid-ask spread and market impact coefficients.
respectively. For example, if we assume that the bid-ask spread is constant, the

For the sample paths’ generation of the relative bid-ask spread, and
the permanent and temporary market impacts, we assumed that
LVaR ratio
they followed independent lognormal distributions and simulated
5,0%
each of them simply as a white noise:
4,5%
1
k,s = exp µ 2
+ k,s
(18)
2 4,0%

1
k,s = exp µ 2
+ k,s (19) 3,5%
2
3,0%
1
k,s = exp µ 2
+ k,s (20)
2 2,5%

2,0%
where m and s are the means and standard deviations, respectively,
of the three random variables (i.e., e, g and h). 1,5%
1.000.000 500.000 100.000 50.000 10.000
Initial holding (shares)
Once again 10,000 sample paths were generated by using the
LVaR ratio 95% with incorporation of the variation of spread and market impact coefficients
Monte Carlo simulation for each parameter. The LVaRs at the 95% LVaR ratio 95% with constant spread and market impact coefficients
and 99% confidence levels were computed for the same five initial LVaR ratio 99% with incorporation of the variation of spread and market impact coefficients
LVaR ratio 99% with constant spread and market impact coefficients
holding scenarios employed above. The results are summarized in
Table 2. Figure 3: LVaR ratio comparison
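The mechanics of the calculation are easy to sketch. The following Python fragment simulates price paths, draws the lognormal white-noise coefficients of Equations (18)-(20), evaluates the liquidation cost of Equation (17), and reads the LVaR off the empirical quantiles of the liquidation-cost distribution. All numerical values are illustrative placeholders rather than the paper's calibration, and the equal-split trading schedule stands in for the SP-optimized strategy n_k,s; this is a minimal sketch of the calculation procedure, not the model itself.

    import numpy as np

    rng = np.random.default_rng(42)

    # Illustrative setup (placeholder values, not the paper's calibration)
    X = 100_000            # initial holding (shares)
    S0 = 25.0              # initial price
    N = 10                 # trading intervals: 5 days, half a day each
    n_paths = 10_000       # Monte Carlo sample paths
    dt = 0.5 / 250         # half a trading day, in years
    sigma = 0.30           # price volatility

    # Price paths S_hat[k, s]; a driftless lognormal walk is assumed here
    z = rng.standard_normal((N, n_paths))
    S_hat = S0 * np.exp(np.cumsum(-0.5 * sigma**2 * dt + sigma * np.sqrt(dt) * z, axis=0))

    def white_noise(mu_log, sd, size):
        """Lognormal white noise as in Equations (18)-(20)."""
        return np.exp(mu_log - 0.5 * sd**2 + sd * rng.standard_normal(size))

    eps   = white_noise(np.log(2e-3), 0.25, (N, n_paths))   # relative bid-ask spread
    gamma = white_noise(np.log(2e-7), 0.25, (N, n_paths))   # permanent impact coefficient
    eta   = white_noise(np.log(2e-6), 0.25, (N, n_paths))   # temporary impact coefficient

    # Equal-split schedule; the paper instead optimizes n[k, s] with the SP model
    n = np.full((N, n_paths), X / N)

    # Liquidation cost, Equation (17)
    cum_n = np.cumsum(n, axis=0)                            # sum of n_{j,s} for j <= k
    proceeds = (S_hat * n - 0.5 * S_hat * eps * n
                - gamma * n * cum_n - eta * n**2).sum(axis=0)
    LC = X * S0 - proceeds

    # Non-parametric LVaR: empirical quantiles of the liquidation costs
    print("LVaR 95%:", np.quantile(LC, 0.95))
    print("LVaR 99%:", np.quantile(LC, 0.99))

Because the LVaR is read directly from the simulated cost distribution, no distributional assumption is imposed at this final step; that is the non-parametric feature discussed in the text.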


The main reason for these differences must lie in the way the optimal trading strategies derived by the SP model respond to the variation of the bid-ask spread and market impact coefficients. For example, if we assume that the bid-ask spread is constant, the loss caused by the spread over the whole liquidation process is εX/2 for each scenario, as shown in Equation (13) (ε being the mean value of the bid-ask spread). With the incorporation of randomness into the bid-ask spread, the optimal trading strategies are adjusted in accordance with its variation. When the spread is high, the optimal trading strategy may suggest selling less. On the contrary, when it is low, the optimal trading strategy may suggest selling more. Therefore, the average loss caused by the spread can be expected to be lower than εX/2. Stated otherwise, the SP model's optimal trading strategies can take advantage of changes by acting in a flexible and timely manner. Also note that introducing the calculation of the relative bid-ask spread and the Monte Carlo simulation itself can cause certain differences. However, these effects are presumably small.

Finally, it is worth mentioning that incorporating randomness into the bid-ask spread and market impact coefficients within Almgren and Chriss's model would definitely enlarge the resulting LVaR estimates. Indeed, it would add new variance terms to the variance of the liquidation cost, since the variations of the parameters are represented by their variances. This would lead to an increase in the LVaR estimates. The SP solution and its numerical experiments indicate that if uncertainty is handled well, it does not necessarily cause an increase in the LVaR estimates. This highlights the strength of the SP approach, which provides adaptive strategies (or 'recourse strategies'). Moreover, adding new random variables to the model does not increase the difficulty of the problem, thanks to the non-parametric nature of the SP LVaR.

Conclusion
This paper presents a stochastic programming approach to LVaR modeling, which extends Almgren and Chriss's mean-variance approach. In contrast to their approach, the optimal trading strategies are derived by minimizing the expected liquidation cost. Thus, the SP strategies dynamically adapt to new market situations. This is the strength of SP in the context of decision making under uncertainty. Another contribution of this paper is the non-parametric formulation of the SP LVaR. It contrasts with LVaR modeling methodologies that quite often rely on parametric approaches. Overall, the numerical results indicate that the two approaches are not identical. Indeed, the LVaRs computed using the SP model in this paper are lower than those computed by Almgren and Chriss's model. Yet LVaR modeling still remains in its infancy, especially when using SP in this context.

References
• Almgren, R., and N. Chriss, 1999, "Value under liquidation," Risk, 12, 61-63
• Almgren, R., and N. Chriss, 2000, "Optimal execution of portfolio transactions," Journal of Risk, 3, 5-39
• Almgren, R., 2003, "Optimal execution with nonlinear impact functions and trading-enhanced risk," Applied Mathematical Finance, 10, 1-50
• Bangia, A., F. Diebold, T. Schuermann, and J. Stroughair, 1999, "Liquidity on the outside," Risk, 12, 68-73
• Bellman, R. E., 1957, Dynamic programming, Princeton University Press, Princeton, New Jersey
• Bertsimas, D., and A. Lo, 1998, "Optimal control of execution costs," Journal of Financial Markets, 1, 1-50
• Dowd, K., 1998, Beyond Value at Risk: the new science of risk management, Wiley and Sons, New York
• Gondzio, J., and A. Grothey, 2006, "Solving nonlinear financial planning problems with 10^9 decision variables on massively parallel architectures," in Costantino, M., and C. A. Brebbia (eds.), Computational finance and its applications II, WIT Transactions on Modelling and Simulation, 43, WIT Press, Southampton
• Hibiki, N., 2000, "Multi-period stochastic programming models for dynamic asset allocation," Proceedings of the 31st ISCIE International Symposium on Stochastic Systems Theory and Its Applications, 37-42
• Hisata, Y., and Y. Yamai, 2000, "Research toward the practical application of liquidity risk evaluation methods," Monetary and Economic Studies, 83-128
• Holthausen, R. W., R. W. Leftwich, and D. Mayers, 1987, "The effect of large block transactions on security prices: a cross-sectional analysis," Journal of Financial Economics, 19, 237-268
• Holthausen, R. W., R. W. Leftwich, and D. Mayers, 1990, "Large-block transactions, the speed of response, and temporary and permanent stock-price effects," Journal of Financial Economics, 26, 71-95
• Jarrow, R. A., and A. Subramanian, 1997, "Mopping up liquidity," Risk, 10, 170-173
• Jarrow, R. A., and A. Subramanian, 2001, "The liquidity discount," Mathematical Finance, 11:4, 447-474
• Jorion, P., 2006, Value at Risk: the new benchmark for managing financial risk, 3rd edition, McGraw-Hill, New York
• Krokhmal, P., and S. Uryasev, 2006, "A sample-path approach to optimal position liquidation," Annals of Operations Research, published online, November, 1-33



Articles

Optimization in
financial engineering —
an essay on ‘good’
solutions and
misplaced exactitude
Manfred Gilli1
University of Geneva and Swiss Finance Institute

Enrico Schumann
University of Geneva

Abstract
We discuss the precision with which financial models are handled,
in particular optimization models. We argue that precision is only
required to a level that is justified by the overall accuracy of the
model, and that this required precision should be specifically ana-
lyzed in order to better appreciate the usefulness and limitations of
a model. In financial optimization, such analyses are often neglect-
ed; operators and researchers rather show an a priori preference
for numerically-precise methods. We argue that given the (low)
empirical accuracy of many financial models, such exact solutions
are not needed; ‘good’ solutions suffice. Our discussion may appear
trivial: everyone knows that financial markets are noisy, and that
models are not perfect. Yet the question of the appropriate preci-
sion of models with regard to their empirical application is rarely
discussed explicitly; specifically, it is rarely discussed in university
courses on financial economics and financial engineering. Some
may argue that the models’ errors are understood implicitly, or
that in any case more precision does no harm. Yet there are costs.
We seem to have a built-in incapacity to intuitively appreciate ran-
domness and chance. All too easily then, precision is confused with
actual accuracy, with potentially painful consequences.

1 This paper is based on Manfred Gilli's farewell lecture (leçon d'adieu), given at the Conférence Luigi Solari 2009 in Geneva. Both authors gratefully acknowledge financial support from the E.U. Commission through MRTN-CT-2006-034270 COMISEF.

Nothing reveals a lack of mathematical education so strikingly as immoderate precision in numerical calculation. (Carl Friedrich Gauß)

Imagine you take a trip to the Swiss city of Geneva, known for international organizations, expensive watches and, not least, its lake. After leaving the train station, you ask a passerby how far it is to the lakefront. You are told, 'Oh, just this direction, along the Rue du Mont-Blanc. It's 512.9934 meters.'

This is not a made-up number. Google Earth allows you to track your path with such precision. We measured this route around 10 times and found that it ranged between roughly 500 and 520 meters. (Unfortunately, if you had actually tried to take this route, during most of 2009 you would have found that the street was blocked by several construction sites.) When we are told 'it's 500 meters to the lake,' we know that this should mean about, say, between 400 and 600 meters. We intuitively translate the point estimate into a range of reasonable outcomes.

In other fields, we sometimes seem to lack such an understanding. In this short article, we shall look at one such field, financial engineering. We will argue that a misplaced precision can sometimes be found here, and we will discuss it in the context of financial optimization.

Financial modeling
In setting up and solving an optimization model, we necessarily commit a number of approximation errors. [A classic reference on the analysis of such errors is von Neumann and Goldstine (1947). See also the discussion in chapter 6 of Morgenstern (1963).] 'Errors' does not mean that something went wrong; these errors will occur even if all procedures work as intended. The first approximation is from the real problem to the model. For instance, we may move from actual prices in actual time to a mathematical description of the world, where both prices and time are continuous (i.e., infinitely-small steps are possible). Such a model, if it is to be empirically meaningful, needs a link to the real world, which comes in the form of data, or parameters that have to be forecast, estimated, simulated, or approximated in some way. Again, we have a likely source of error, for the available data may or may not well reflect the true, unobservable process.

When we solve such models on a computer, we approximate a solution; such approximations are the essence of numerical analysis. At the lowest level, errors come with the mere representation of numbers. A computer can only represent a finite set of numbers exactly; any other number has to be rounded to the closest representable number, hence we have what is called roundoff error. Then, many functions (e.g., the logarithm) cannot be computed exactly on a computer, but need to be approximated. Operations like differentiation or integration, in mathematical formulation, require 'going to the limit,' i.e., we let numbers tend to zero or infinity. But that is not possible on a computer; any quantity must stay finite. Consequently, we have so-called truncation error. For optimization models, we may incur a similar error. Some algorithms, in particular the methods that we describe below, are stochastic. As a result, we do not (in finite time) obtain the model's 'exact' solution, but only an approximation (notwithstanding other numerical errors).

In sum, we can roughly divide our modeling into two steps: from reality to the model, and then from the model to its numerical solution. Unfortunately, large parts of the quantitative finance literature seem only concerned with assessing the quality of the second step, from model to implementation, and attempt to improve here. In the past, a certain division of labor has been necessary: the economist created his model, and the computer engineer put it into numerical form. But today, there is little distinction left between the researcher who creates the model and the numerical analyst who implements it. Modern computing power allows us to solve incredibly complex models on our desktops. (John von Neumann and Herman Goldstine, in the above-cited paper, describe the inversion of 'large' matrices, where large meant n > 10; in a footnote, they 'anticipate that n ~ 100 will become manageable' (fn. 12). Today, Matlab inverts a 100x100 matrix on a normal desktop PC in a millisecond. But please, you will not solve equations by matrix inversion.) But then of course, the responsibility to check the reasonableness of the model and its solution lies — at all approximation steps — with the financial engineer. And then only evaluating problems with respect to their numerical implementation falls short of what is required: any error in this step must be set into context; we need to compare it with the error introduced in the first step. But this is, even conceptually, much more difficult.

Even if we accepted a model as 'true,' the quality of the model's solution would be limited by the attainable quality of the model's inputs. Appreciating these limits helps us decide how 'exact' a solution we actually need. This decision is relevant for many problems in financial engineering since we generally face a trade-off between the precision of a solution and the effort required (most apparently, computing time). Surely, the numerical precision with which we solve a model matters; we need reliable methods. Yet, empirically, there must be a required-precision threshold for any given problem. Any improvement beyond this level cannot translate into gains regarding the actual problem anymore, only into costs (increased computing time or development costs). For many finance problems, we guess, this required precision is not high.

Example 1 — in numerical analysis, the sensitivity of a problem is defined as follows: if we perturb an input to a model, the change in the model's solution should be proportional. If the impact is far larger, the problem is called sensitive. Sensitivity is often not a numerical problem; it rather arises from the model or the data.


In finance, many models are sensitive. Figure 1 shows the S&P 500 from 31 December 2008 to 31 December 2009, i.e., 253 daily prices. The index level rose by 23%. 23.45%, to be more precise (from 903.25 to 1115.10). But does it make sense to report this number to such a precision?

[Figure 1 – Left: the S&P 500 in 2009. Right: annual returns after jackknifing 2 observations. The vertical line gives the realized return.]

Suppose we randomly pick two observations (less than one percent of the daily returns), delete them, and recompute the yearly return. Repeating this jackknifing 5,000 times, we end up with a distribution of returns (right panel of Figure 1). The median return is about 23%, but the 10th quantile is 20% and the 90th quantile is 27% (the minimum is only about 11%, the maximum about 34%!). Apparently, tiny differences like adding or deleting a couple of days cause very meaningful changes.

This sensitivity has been documented in the literature [for instance in Acker and Duck (2007), or Dimitrov and Govindaraj (2007)], but it is rarely heeded. The precision with which point estimates are sometimes reported must not be confused with accuracy. We may still be able to give qualitative findings (like 'this strategy performed better than another'), but we should not make single numbers overly precise; we need robustness checks. Returns are the empirical building blocks of many models. If these simple calculations are already that sensitive, we should not expect more complex computations to be more accurate.

Example 2 — the theoretical pricing of options, following the papers of Black, Scholes, and Merton in the 1970s, is motivated by an arbitrage argument according to which we can replicate an option by trading in the underlier and a riskless bond. A replication strategy prescribes holding a certain quantity of the underlier, the delta. The delta changes with time and with moves in the underlier's price, hence the options trader needs to rebalance his positions. Suppose you live in a Black-Scholes-Merton world. You have just sold a one-month call (strike and spot price are 100, no dividends, the riskfree rate is at 2%, volatility is constant at 30%), and you wish to hedge the position. There is one deviation from Black-Scholes-Merton, though: you cannot hedge continuously, but only at fixed points in time [Kamal and Derman (1999)].

We simulate 100,000 paths of the stock price, and delta-hedge along each path. We compute two types of delta: one is the delta as precise as Matlab can get, and the other is rounded to two digits (i.e., 0.23 or 0.67). Table 1 shows the volatility of the hedging error (i.e., the difference between the achieved payoff and the contractual payoff) as a percentage of the initial option price. (It is often helpful to scale option prices, e.g., price to underlier, or price to strike.) Figure 2 shows replicated option payoffs.

Frequency of rebalancing | With exact delta | With delta to two digits
Once per day             | 18.2%            | 18.2%
Five times per day       | 8.3%             | 8.4%

Table 1 – Volatility of profit-and-loss under different hedging schemes

[Figure 2 – Payoff of replicating portfolios with delta to double precision (left), and delta to 2 digits (right).]

The volatility of the profit-and-loss is practically the same, so even in the model world nothing is lost by not computing delta to a high precision. Yet in research papers on option pricing, we often find prices and Greeks to 4 or even 6 decimals.

Here is a typical counterargument: 'True, for one option we don't need much precision. But what if we are talking about one million options? Then small differences matter.' We agree; but the question is not whether differences matter, but whether we can meaningfully compute them. (Your accountant may disagree. Here is a simple rule: whenever you sell an option, round up, and when you buy, round down.) Between buying one share of IBM stock and buying one million shares, there is an important difference: you take more risk. We can rephrase our initial example: you arrive at the train station in Geneva, and ask for the distance to Lake Zurich.
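Example 2 is straightforward to replay. Below is a minimal Python sketch of such a discrete delta-hedging simulation, assuming the parameters quoted in the text (one-month call, strike and spot 100, riskfree rate 2%, volatility 30%) and a stock drifting at the riskfree rate; the path count and discretization details are our own choices, so the figures it prints will only approximate Table 1.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(7)

    S0 = K = 100.0
    r, sigma, T = 0.02, 0.30, 1.0 / 12.0      # one-month call, as in the text
    n_paths = 50_000

    def d1(S, tau):
        return (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))

    premium = (S0 * norm.cdf(d1(S0, T))
               - K * np.exp(-r * T) * norm.cdf(d1(S0, T) - sigma * np.sqrt(T)))

    def pnl_volatility(rebalances_per_day, digits=None):
        """Volatility of the hedging error, as a fraction of the option premium."""
        n_steps = 21 * rebalances_per_day     # roughly 21 trading days in the month
        dt = T / n_steps
        S = np.full(n_paths, S0)
        delta = norm.cdf(d1(S, T))
        if digits is not None:
            delta = np.round(delta, digits)   # delta rounded to two digits
        cash = premium - delta * S            # short call, hedged with delta shares
        for k in range(1, n_steps + 1):
            S = S * np.exp((r - 0.5 * sigma**2) * dt
                           + sigma * np.sqrt(dt) * rng.standard_normal(n_paths))
            cash = cash * np.exp(r * dt)
            if k < n_steps:
                new_delta = norm.cdf(d1(S, T - k * dt))
                if digits is not None:
                    new_delta = np.round(new_delta, digits)
                cash -= (new_delta - delta) * S   # rebalance the stock position
                delta = new_delta
        pnl = cash + delta * S - np.maximum(S - K, 0.0)   # achieved minus contractual payoff
        return pnl.std() / premium

    for freq in (1, 5):
        print(f"{freq} per day: exact {pnl_volatility(freq):.1%}, "
              f"two digits {pnl_volatility(freq, digits=2):.1%}")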

Optimization in financial engineering
Heuristics
The obsession with precision is also found in financial optimization; researchers are striving for exact solutions, better even if in closed form. Finding these exact solutions is not at all straightforward; for most problems it is not possible. Importantly, optimization methods like linear or quadratic programming place — in exchange for exact solutions — considerable constraints on the problem formulation. We are often required to shape the problem such that it can be solved by such methods. Thus, we get a precise solution, but at the price of possibly incurring more approximation error at an earlier stage. An example from portfolio optimization can illustrate this point. Markowitz (1959, chapter 9) compares two risk measures, variance and semi-variance, along the dimensions of cost, convenience, familiarity, and desirability, and concludes that variance is superior in terms of cost, convenience, and familiarity. For variance, we can compute the exact solution to the portfolio selection problem; for semi-variance, we can only approximate the solution. But with today's computing power (the computing power we have on our desktops), we can test whether, even with an inexact solution for semi-variance, the gains in desirability are worth the effort.

To solve such a problem, we can use optimization heuristics. The term heuristic is used in different fields with different, though related, meanings. In mathematics, it is used for derivations that are not provable (sometimes even incorrect), but lead to correct conclusions nonetheless. [The term was made famous by George Pólya (1957).] Psychologists use the word for simple 'rules of thumb' for decision making. The term acquired a negative connotation through the works of Kahneman and Tversky in the 1970s, since their 'heuristics and biases' program involved a number of experiments that showed the apparent suboptimality of such simple decision rules. More recently, however, an alternative interpretation of these results has been advanced [Gigerenzer (2004, 2008)]. Studies indicate that while simple rules underperform in stylized settings, they yield (often surprisingly) good results in more realistic situations, in particular in the presence of uncertainty. The term heuristic is also used in computer science; Pearl (1984) describes heuristics as methods or rules for decision making that are (i) simple, and (ii) give good results sufficiently often.

In numerical optimization, heuristics are methods that aim to provide good and fast approximations to optimal solutions [Michalewicz and Fogel (2004)]. Conceptually, they are often very simple; implementing them rarely requires high levels of mathematical sophistication or programming skills. Heuristics are flexible: we can easily add, remove, or change constraints, or modify the objective function. Well-known examples of such techniques are Simulated Annealing and Genetic Algorithms. Heuristics employ strategies that differ from classical optimization approaches, but exploit the processing power of modern computers; in particular, they include elements of randomness. Consequently, the solution obtained from such a method is only a stochastic approximation of the optimum; we trade off approximation error at the solution step against approximation error when formulating the model. Thus, heuristics are not 'better' methods than classical techniques. The question is rather when to use which approach [Zanakis and Evans (1981)]. In finance, heuristics are appropriate [Maringer (2005) gives an introduction and presents several case studies].

Minimizing downside risk
In this section, we consider a concrete example: portfolio optimization. Our first aim is to evaluate the precision provided by a heuristic technique. To do that, we need to compare the in-sample quality of a solution with its out-of-sample quality. Then, we will compare several selection criteria for portfolio optimization, and discuss the robustness of the results.

Required precision — we use a database of several hundred European stocks to run a backtest for a simple portfolio strategy: minimize semi-variance, subject to (i) the number of assets in the portfolio being between 20 and 50, and (ii) any weight of an included asset being between 1% and 5%. We construct a portfolio using data from the last year, hold the portfolio for three months, and record its performance; then we rebalance. In this manner, we 'walk forward' through the data, which spans the period from January 1999 to March 2008. [Details can be found in Gilli and Schumann (2009).]

The solution to this optimization problem cannot be computed exactly. We use a heuristic method called Threshold Accepting. This method, however, only returns stochastic solutions: running the method twice on the same dataset will lead to different optimal portfolios. With this method, we face an explicit trade-off between computing time and precision.
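To give a flavor of how little machinery Threshold Accepting needs, here is a minimal Python sketch for the semi-variance objective. The data, neighborhood definition, and threshold sequence are simplified stand-ins; in particular, the cardinality and 1%-5% weight constraints of the actual study are not enforced here.

    import numpy as np

    rng = np.random.default_rng(1)

    # Stylized data: 250 daily returns for 30 assets (replace with real data)
    R = rng.normal(0.0004, 0.01, size=(250, 30))

    def semivariance(w):
        """Objective: mean squared shortfall of portfolio returns below their mean."""
        r = R @ w
        d = np.minimum(r - r.mean(), 0.0)
        return np.mean(d**2)

    def neighbour(w, step=0.01, w_max=0.10):
        """Shift a little weight from one random asset to another (long-only, capped)."""
        v = w.copy()
        i, j = rng.choice(v.size, size=2, replace=False)
        amount = min(step * rng.random(), v[i], w_max - v[j])
        v[i] -= amount
        v[j] += amount
        return v

    # Threshold Accepting: like a local search, but a deterioration is accepted
    # as long as it stays below a threshold that is lowered over time.
    w = np.full(30, 1 / 30)
    f = semivariance(w)
    thresholds = np.linspace(0.1 * f, 0.0, 10)   # simple, data-driven sequence
    for tau in thresholds:
        for _ in range(1000):
            w_new = neighbour(w)
            f_new = semivariance(w_new)
            if f_new - f < tau:                  # accept small deteriorations
                w, f = w_new, f_new

    print("in-sample semi-variance:", f)

Each run of this loop ends in a slightly different portfolio, which is exactly the stochastic behavior discussed next.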




So when we let the algorithm search for longer, then on average we get better solutions. We execute, for the same dataset, 600 optimization runs with differing numbers of iterations, and obtain 600 solutions that differ in their in-sample precision. Higher in-sample precision is associated with more computing time. We rank these portfolios by their in-sample objective function (i.e., in-sample risk), so that rank 1 is the best portfolio and rank 600 is the worst.

[Figure 3 – Risk and risk-adjusted return. The grey dots give the actual portfolios, the dark line is a local average.]

The left-hand panel of Figure 3 shows the resulting out-of-sample risk of the portfolios, sorted by in-sample rank. We observe an encouraging picture: as the in-sample risk goes down, so does the out-of-sample risk. In other words, increasing the precision of the in-sample solutions does improve the out-of-sample quality of the model. At least up to a point: for the best 200 portfolios or so, the out-of-sample risk is practically the same. So once we have a 'good' solution, further improvements are only marginal. We also compute risk-adjusted returns (Sortino ratios with a required return of zero), shown in the right-hand panel. It shows a similar pattern, even though it is much noisier. Table 2 gives details (all annualized). The numbers in parentheses are the standard deviations of the out-of-sample results.

Ranks   | Average risk | Average risk-adjusted return
all     | 9.55 (1.67)  | 0.48 (0.19)
1-50    | 8.03 (0.04)  | 0.66 (0.05)
51-100  | 8.07 (0.05)  | 0.66 (0.05)
101-150 | 8.09 (0.04)  | 0.64 (0.05)
151-200 | 8.17 (0.08)  | 0.63 (0.07)
201-300 | 8.51 (0.13)  | 0.59 (0.08)
301-400 | 9.16 (0.24)  | 0.49 (0.11)
401-500 | 10.94 (0.40) | 0.35 (0.09)
501-600 | 12.49 (0.55) | 0.19 (0.11)

Table 2 – Risk and risk-adjusted return (out-of-sample)

This randomness, it must be stressed, follows from the optimization procedure: in each of our 600 runs we obtained slightly different portfolios, and each portfolio maps into a different out-of-sample performance. For the best portfolios, the improvements are minuscule. For example, the average risk per year of the best 50 portfolios is 4 basis points lower than the risk of the next-best 50 portfolios.

To judge the relevance of this randomness introduced by our numerical technique, we need to compare it with the uncertainty coming from the data. To build intuition, we jackknife the out-of-sample paths just as we did in Example 1 above. An example is illustrated in Figure 4. In the upper panel, we picture the out-of-sample performance of one euro that was invested in the best portfolio (rank 1; this path corresponds to the left-most grey dot in Figure 3). In the lower panel, we see several paths computed after having randomly selected and deleted one percent of the data points.

[Figure 4 – Upper panel: out-of-sample performance of one euro invested in the rank-1 portfolio. Lower panel: out-of-sample performance after jackknifing.]

The average risk of the best 50 portfolios in our tests was 8.03% per year, with a standard deviation of 4 basis points. With jackknifing one percent of the data, we obtain risk between 7.75% and 8.17%; with jackknifing five percent, we get risk between 7.58% and 8.29%, far greater than the randomness introduced by our method. For risk-adjusted return – in which we are naturally more interested – things are even worse. The best 50 portfolios had an average risk-return ratio of 0.66. Just jackknifing the paths of these portfolios by one percent, we already get a range of between 0.42 and 0.89.

In sum, heuristics may introduce approximation error into our analysis, but it is swamped by the sensitivity of the problem with respect to even slight data changes. Hence, objecting to heuristics because they do not provide exact solutions is not a valid argument in finance.

Robustness checks — we run backtests for a large number of alternative selection criteria, among them partial moments (e.g., semi-variance), conditional moments (e.g., Expected Shortfall), and quantiles (e.g., Value-at-Risk). In the study described above, we only investigated the approximation errors of the optimization method, compared with the errors coming from the data. But now we wish to compare different models.

We implement a backtest like the one described above, but we also add a robustness check, again based on a jackknifing of the data: suppose a small number of in-sample observations were randomly selected and deleted (we delete 10% of the data). The data has changed, and hence the composition of the computed portfolio will change. If the portfolio selection strategy is robust, we should expect the resulting portfolio to be similar to the original one, as the change in the historical data is only small, and we would also expect the new portfolio to exhibit a similar out-of-sample performance. Repeating this procedure many times, we obtain a sampling distribution of portfolio weights, and consequently also a sampling distribution of out-of-sample performance. We do not compare the differences in the portfolio weights, since it is difficult to judge what a given norm of the difference between two weight vectors practically means. Rather, we look at the changes in out-of-sample results. This means that for any computed quantity that we are interested in, we have a distribution of outcomes. Figure 5 gives some examples for different strategies.

[Figure 5 – Annualized returns and Sharpe ratios for different portfolio selection strategies. Upper panel: annualized return in %; lower panel: annualized Sharpe ratios. Each panel shows minimum variance, the upside-potential ratio, and Value-at-Risk.]

(More details can be found in Gilli and Schumann, forthcoming.) The figure shows the out-of-sample returns of three strategies: minimum variance, the upside-potential ratio [Sortino et al. (1999)], and Value-at-Risk; we also plot Sharpe ratios. Portfolios constructed with the upside-potential ratio, for instance, have a median return that is more than a percentage point higher than the return of the minimum-variance portfolio; Sharpe ratios are also higher. Even VaR seems better than its reputation. But most remarkable is the range of outcomes: given a 10% perturbation of the in-sample data, returns differ by more than 5 percentage points per year.

Conclusion
In this article, we have discussed the precision with which financial models are handled, in particular optimization models. We have argued that precision is only required to a level that is justified by the overall accuracy of the model. Hence, the required precision should be specifically analyzed, so that the usefulness and limitations of a model can be better appreciated. Our discussion may appear trivial; everyone knows that financial markets are noisy, and that models are not perfect. Yet the question of the appropriate precision of models with regard to their empirical application is rarely discussed explicitly. In particular, it is rarely discussed in university courses on financial economics and financial engineering. Again, some may argue, the errors are understood implicitly (just like '500 meters' means 'between 400 and 600 meters'), or that in any case more precision does no harm; but here we disagree. We seem to have a built-in incapacity to intuitively appreciate randomness and chance, hence we strive for ever more precise answers. All too easily then, precision is confused with accuracy; acting on the former instead of the latter may lead to painful consequences.

References
• Acker, D., and N. W. Duck, 2007, "Reference-day risk and the use of monthly returns data," Journal of Accounting, Auditing and Finance, 22, 527-557
• Dimitrov, V., and S. Govindaraj, 2007, "Reference-day risk: observations and extensions," Journal of Accounting, Auditing and Finance, 22, 559-572
• Gigerenzer, G., 2004, "Fast and frugal heuristics: the tools of bounded rationality," in Koehler, D. J., and N. Harvey (eds.), Blackwell handbook of judgment and decision making, Blackwell Publishing
• Gigerenzer, G., 2008, "Why heuristics work," Perspectives on Psychological Science, 3, 20-29
• Gilli, M., and E. Schumann, 2009, "Optimal enough?" COMISEF Working Paper Series, No. 10
• Gilli, M., and E. Schumann, forthcoming, "Risk-reward optimisation for long-run investors: an empirical analysis," European Actuarial Journal
• Kamal, M., and E. Derman, 1999, "When you cannot hedge continuously: the corrections of Black-Scholes," Risk, 12, 82-85
• Maringer, D., 2005, Portfolio management with heuristic optimization, Springer
• Markowitz, H. M., 1959, Portfolio selection, Wiley
• Michalewicz, Z., and D. B. Fogel, 2004, How to solve it: modern heuristics, 2nd edition, Springer
• Morgenstern, O., 1963, On the accuracy of economic observations, 2nd edition, Princeton University Press
• Pearl, J., 1984, Heuristics, Addison-Wesley
• Pólya, G., 1957, How to solve it, 2nd edition, Princeton University Press (expanded reprint, 2004)
• Sortino, F., R. van der Meer, and A. Plantinga, 1999, "The Dutch triangle," Journal of Portfolio Management, 26, 50-58
• von Neumann, J., and H. H. Goldstine, 1947, "Numerical inverting of matrices of high order," Bulletin of the American Mathematical Society, 53, 1021-1099
• Zanakis, S. H., and J. R. Evans, 1981, "Heuristic 'optimization': why, when, and how to use it," Interfaces, 11, 84-91



Articles

A VaR too far?


The pricing of
operational risk

Rodney Coleman
Department of Mathematics,
Imperial College London

Abstract
This paper is a commentary on current and emerging statistical
practices for analyzing operational risk losses according to the
Advanced Measurement Approaches of Basel II, the New Basel
Accord. In particular, it exposes the limitations of modeling operational risk loss data to obtain high-severity quantiles when sample sizes are small. The viewpoint is that of a
mathematical statistician.


Value-at-Risk (VaR) entered the financial lexicon as a measure of volatility matched to riskiness. Basel I saw that it could be a risk-sensitive pricing mechanism for computing regulatory capital for market and credit risks. Basel II extended its scope to operational risk, the business risk of loss resulting from inadequate or failed internal processes, people, or systems, or from external events. The European Parliament agreed and has passed legislation ensuring that internationally active banks and insurers throughout the E.U. will have adopted its provisions from 2012.

However, to put the problem of applying the Basel II risk-sensitive advanced measurement approaches [BCBS (2006)] in perspective, consider the role of quantitative risk modeling in the recent global financial crisis. This began with the credit crunch in August 2007, following problems created by subprime mortgage lending in the U.S. A bubble of cheap borrowing was allowed to develop, with debt securitized (i.e., off-loaded) and insured against default, creating a vicious spiral. In their underwriting, the insurers failed to adjust for the risk that property values might fall, with consequent defaults on mortgage repayment. This bad business decision caused havoc at the banks who lent the money, with the investors who bought the debt, and with the insurers without the means to pay out on the defaults. Where was VaR when it was needed? That is to say, what part did financial modeling play in preventing all this? The answer has to be: nothing much. Further, it can only marginally be put down to operational risk, since operational risk excludes risks which result in losses from poor business decisions.

Oprisk losses will often stem from weak management and from external events. Basel II, the New Basel Accord, set out a regulatory framework for operational risk at internationally active banks. In so doing, it gave a generally accepted definition for operational risk, previously categorized as 'other risks.' More importantly, it is Basel II's role in driving developments in operational risk management practices, the search for robust risk measurement, and the prospect of being subject to regulatory supervision and transparency through disclosure (Pillars 2 and 3 of the Accord) that have led in a short time to operational risk becoming a significant topic in business and management studies.

Basel II
The Accord sets out a risk-sensitive way of calculating reserve capital to cover possible defaults. Institutions are required to categorize operational risk losses by event type, promoting identification of risk drivers. There is no mandated methodology.

Pillar 1 of Basel II gives three ways of calculating the operational risk capital charge, with increasing complexity but benefiting from a reducing charge. We shall be considering its requirements in respect of its highest level, the Advanced Measurement Approaches (AMA), which requires that the banks model loss distributions of cells over a business line/loss event type grid using operational risk loss data that they themselves have collected, supplemented as required by data from external sources.

Pillar 2 of the Accord requires banks to demonstrate that their management and supervisory systems are satisfactory. Pillar 3 relates to transparency, requiring them to report on their operational risk management. It is these two latter pillars that are probably going to have a greater impact in protecting global finance than loss modeling.

Solvency II, the European Union's regulatory directive for insurers, has adopted the same three pillars. This directive will come into force throughout the E.U. in 2012.

In November 2007, the U.S. banking agencies approved the U.S. Final Rule for Basel II.

Banks will be grouped into the large or internationally active banks that will be required to adopt AMA, those that voluntarily opt in to it, and the rest, who will adopt an extended version of the earlier Basel I.

We note that in the rule book, summarized in BCBS (2006), operational risk sits in paragraphs 644 to 679, occupying the final 12 pages, pages 140 to 151.

Advanced measurement approaches
Attention is directed to the following passages taken from BCBS (2006) that cover the modeling requirements for the advanced measurement approaches.

(665) — A bank's internal measurement system must reasonably estimate unexpected losses based on the combined use of internal and relevant external loss data, scenario analysis, and bank-specific business environment and internal control factors (BEICF).

(667) — The Committee is not specifying the approach or distributional assumptions used to generate the operational risk measure for regulatory capital purposes. However, a bank must be able to demonstrate that its approach captures potentially severe 'tail' loss events. Whatever approach is used, a bank must demonstrate that its operational risk measure meets a soundness standard [...] comparable to a one-year holding period and a 99.9th percentile confidence interval.
124 – The   journal of financial transformation


A VaR too far? The pricing of operational risk

(669b) — Supervisors will require the bank to calculate its regulatory capital requirement as the sum of expected loss (EL) and unexpected loss (UL), unless the bank can demonstrate that it is adequately capturing EL in its internal business practices. That is, to base the minimum regulatory capital requirement on UL alone, the bank must be able to demonstrate [...] that it has measured and accounted for its EL exposure.

(669c) — A bank's risk measurement system must be sufficiently 'granular' to capture the major drivers of operational risk affecting the shape of the tail of the loss estimates.

(669d) — The bank may be permitted to use internally determined correlations in operational risk losses across individual operational risk estimates [...]. The bank must validate its correlation assumptions.

(669f) — There may be cases where estimates of the 99.9th percentile confidence interval based primarily on internal and external loss event data would be unreliable for business lines with a heavy-tailed loss distribution and a small number of observed losses. In such cases, scenario analysis, and business environment and control factors, may play a more dominant role in the risk measurement system. Conversely, operational loss event data may play a more prominent role in the risk measurement system for business lines where estimates of the 99.9th percentile confidence interval based primarily on such data are deemed reliable.

(672) — Internally generated operational risk measures used for regulatory capital purposes must be based on a minimum five-year observation period of internal loss data.

(673) — A bank must have an appropriate de minimis gross loss threshold for internal loss data collection, for example, €10,000.

(675) — A bank must use scenario analysis of expert opinion in conjunction with external data to evaluate its exposure to high-severity events. [...] These expert assessments could be expressed as parameters of an assumed statistical loss distribution.

Event types and business lines
Table 1 shows the seven designated loss event types (ETs) given in Basel II, also adopted by Solvency II. Table 2 shows the eight broad business lines (BLs) within the banking sector given in Basel II. Together they create an 8 by 7 grid of 56 BL/ET cells.

Event types
Internal fraud
External fraud
Employment practices and workplace safety
Clients, products, and business practice
Damage to physical assets
Business disruption and system failures
Execution, delivery, and process management

Table 1 – Event types for banking and insurance under Basel II and Solvency II

Business unit      | Business line
Investment banking | Corporate finance
                   | Trading and sales
Banking            | Retail banking
                   | Commercial banking
                   | Payment and settlement
                   | Agency services
Others             | Asset management
                   | Retail brokerage

Table 2 – Business units and business lines for international banking under Basel II

The Operational Risk Consortium Ltd (ORIC) database of operational risk events, established in 2005 by the Association of British Insurers (ABI), has nearly 2,000 events showing losses exceeding £10,000 in the years 2000 to 2008. Table 3, based on a report for the ABI [Selvaggi (2009)], gives percentages of loss amounts for those BL/ET cells having at least 4% of the total loss amount.

Business line           | CP&BP | ED&PM | BD&SF | Others | Total
Sales and distribution  | 18.9  | 6.8   | 0.7   |        | 26.4
Customer service/policy |       | 13.2  |       | 2.3    | 15.5
Accounting/finance      |       | 23.4  |       | 0.1    | 23.5
IT                      |       |       | 6.0   | 6.6    | 12.6
Claims                  |       | 4.0   |       | 1.4    | 5.4
Underwriting            |       | 6.3   |       | 0.3    | 6.6
Others                  | 5.1   | 11.8  |       | 1.4    | 10.0
Total                   | 24.0  | 65.5  | 7.4   | 11.4   | 100.0

Table 3 – Business line/event type grid showing percentages of loss amounts (min 4%) in the ORIC database (2000-08). Source: Selvaggi (2009)

This information tells little about the actual events. For this we need Level 2 and Level 3 categories, the seven event types being Level 1. To illustrate this, again from Selvaggi (2009), Table 4 shows the most significant Level 2 and Level 3 event types in terms of both severity and frequency (values over 4%) from ORIC. This table excludes losses arising from the widespread mis-selling of endowment policies in the U.K. in the 1990s. The victims were misled with over-optimistic forecasts that their policies would pay off their mortgages on maturity. Among the insurers, Abbey Life was fined a record £1 million in December 2002, and paid £160 million in compensation to 45,000 policy holders. The following March, Royal & Sun Alliance received a fine of £950,000. Later that year, Friends Provident was fined £675,000 for mis-handling complaints, also an operational risk loss.


Level 2                                     | Level 3                               | Size% | Freq%
Advisory activities                         | Mis-selling (non-endowment)           | 13    | 9
Transaction capture, execution, maintenance | Accounting error                      | 12    |
                                            | Inadequate process documentation      | 8     |
                                            | Transaction system error              | 8     | 6
                                            | Management information error          | 7     |
                                            | Data entry errors                     | 7     | 5
                                            | Management failure                    | 5     |
                                            | Customer service failure              | 4     | 16
Suitability, disclosure, fiduciary          | Customer complaints                   | 6     | 4
Systems                                     | Software                              | 6     |
Customer or client account management       | Incorrect payment to client/customer  |       | 9
                                            | Payment to incorrect client/customer  |       | 4
Theft and fraud                             | Fraudulent claims                     |       | 4
Total                                       |                                       | 76    | 57

Table 4 – Level 2 and Level 3 event categories from insurance losses in ORIC (2000-08) showing loss amount or loss frequency of 4% or more. Source: Selvaggi (2009)

A widely accepted approach in risk management is to identify the major risks faced by an organization, measured by their impact in terms of their frequency and severity. In many cases, not much more than a handful of operational risks will give rise to the most serious of the loss events and loss amounts.

However, day-to-day management requires bottom-up as well as top-down risk control. There needs to be an understanding of the risk in all activities. Aggregating losses over business lines and activities would tend to conceal low-impact risks behind those having a more dramatic effect. The bottom-up approach is thus a necessary part of seeing the complete risk picture. Aggregation at each higher level will inform management throughout the business.

Expected loss and unexpected loss
An internal operational loss event register would typically show high-impact events at low frequency among events of high frequency but low impact. A financial institution might therefore sort its losses into: 'expected loss' (EL), to be absorbed by net profit; 'unexpected loss' (UL), to be covered by risk reserves (so not totally unexpected); and 'stress loss' (SL), requiring core capital or hedging for cover. The expected loss per transaction can easily be embedded in the transaction pricing. It is the rare but extreme stress losses that the institution must be most concerned with. This structure is the basis of the loss data analysis approach to operational risk. Hard decisions need to be made in choosing the EL/UL and UL/SL boundaries. As well as these, a threshold 'petty cash' limit is needed to set a minimum loss for recording it as an operational loss. Loss events with recovery and other near-miss events will also need to be entered into the record.

We see that, for regulatory charges, Basel specifies EL to be the expectation of the fitted loss distribution and UL to be its 99.9th percentile. For insurers, Solvency II sets UL at the 99.5th percentile. Basel further permits the use of UL as a single metric for charging.

Let us compare this with the classical VaR, the difference between UL and EL on a profit-and-loss plot. Expectation measures location, and VaR measures volatility. Oprisk losses are non-negative, so Basel appears to be using zero as its location measure, making it possible to rely on just UL as its opVaR metric. Basel's referring to a 'confidence interval' for the confidence limit UL adds further weight to thinking it refers to the interval (0, UL). Now, UL gives no information about potential losses larger than itself. In fact, the very high once-in-a-thousand-years return value is deemed by Basel to capture pretty much all potential losses. This will not always be the case for heavy-tailed loss distribution models. In survival analysis and extreme value theory we would also estimate the expected shortfall, the expectation of the loss distribution conditioned on the values in excess of UL. This has been called CVaR (conditional VaR).
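In symbols, with L the loss over the one-year holding period, the two tail measures just contrasted can be written as follows (the notation is ours, not Basel's):

$\mathrm{UL} = Q_{0.999}(L) = \inf\{x : P(L \le x) \ge 0.999\}, \qquad \mathrm{CVaR} = \mathrm{E}\,[\,L \mid L > \mathrm{UL}\,]$

so CVaR, unlike UL alone, always carries information about the losses beyond the quantile.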




There being a minimum threshold to registered losses, we really need to reconstruct the loss distribution for under-the-threshold losses to include them in the expectation and 99.9th percentile.

I have always felt that the use of the difference between a moment measure (expectation) and a quantile measure (percentile) in the VaR calculation unnecessarily complicates any investigation of its statistical properties. Why not have median and high percentile?

External data
During a recent supervisory visit to an insurance company, the FSA was told that they had only two operational risk loss events to show, with another six possibles. This extreme example of the paucity of internally collected oprisk data, particularly the large losses that would have a major influence in estimating reserve funding, means that data publicly available or from commercial or consortia databases needs to be explored to supplement internal loss events.

Fitch's OpVar is a database of publicly reported oprisk events showing nearly 500 losses of more than ten million dollars between 1978 and 2005 in the U.S. The 2004 Loss Data Collection Exercise (LDCE) collected more than a hundred loss events in the U.S. of 100 million dollars or more in the ten years to 2003.

The Operational Riskdata eXchange Association (ORX) is well established as a database of oprisk events in banking. It is a consortium collecting data from thirty member banks from twelve countries. It has more than 44,000 losses, each over €20,000 in value. Apart from ORIC, individual insurance companies will have their own claims records containing accurate settlement values.

Combining internal and external data
When data from an external database is combined with an in-house dataset, the former are relocated and scaled to match. The external data are standardized by subtracting the estimated location parameter value (i.e., the sample mean m) from each datum and dividing the result by the estimated scale value (i.e., the sample standard deviation s). Then we adopt the location m' and scale s' of the internal data. So, if y is a standardized external datum, the new value of the external datum is z = m' + s'y. Finally, we check the threshold used for the internal database against the transformed threshold for the relocated and rescaled external data, and set the larger of the two as the new threshold, eliminating all data points that fall below it.

This can lead to strange statistical results. Table 5 shows that sample statistics for pooled data can lie outside the range given by the internal and external data sets [Giacometti et al. (2007, 2008)].

Business line      | Sample statistic | Internal data | External data | Pooled data
Retail banking     | mean             | 15,888        | 37,917        | 11,021
                   | std deviation    | 97,668        | 184,787       | 59,968
                   | skewness         | 20.17         | 26.53         | 26.60
                   | kurtosis         | 516           | 910           | 953
                   | min              | 500           | 5,000         | 1,230
                   | max              | 2,965,535     | 8,547,484     | 2,965,535
Commercial banking | mean             | 28,682        | 40,808        | 19,675
                   | median           | 2,236         | 12,080        | 1,347
                   | std deviation    | 133,874       | 653,409       | 83,253
                   | min              | 500           | 5,000         | 521.54
                   | max              | 1,206,330     | 20,000,000    | 2,086,142

Table 5 – Some descriptive statistics for pooled data [abstracted from Giacometti et al. (2007)]

We note that the same statistics using logscale data show this phenomenon for commercial banking skewness and kurtosis. Perhaps the location and scaling metrics are inappropriate, and we need model-specific metrics for these. We are told that the sample size of the external data is about 6 times that of the internal set.

The small sample problem
An extreme loss in a small sample is overrepresentative of its 1 in a 1,000 or 1 in a 10,000 chance, yet under-represented if not observed. Indeed, we find the largest losses are overly influential in the fitted model. So, we must conclude that fitting a small dataset cannot truly represent the loss process, whatever the model used.

As a statistician I am uncomfortable with external data. An alternative device which I feel to be more respectable is to fit a heavy-tailed distribution to the data and then simulate a large number of values from this fitted distribution, large enough to catch sufficiently many high values for statistical analysis. A basic principle in statistics is that inferences about models in regions far outside the range of the available data must be treated as suspect. Yet here we are expected to do just that when estimating high quantiles.

Stress and scenario testing
Scenario analysis has as its object to foresee and consider responses to severe oprisk events. However, merging scenario data with actual loss data will corrupt that data and distort the loss distribution. These scenario losses would be those contributing to a higher capital charge than would otherwise be the case. Stress testing is about contingency planning for these adverse events based on the knowledge of business experts.

A potential loss event could arise under three types of scenario:
■ Expected loss (optimistic scenario)
■ Unexpected serious case loss (pessimistic scenario)
■ Unexpected worst case loss (catastrophic scenario)

This mimics the expected loss, unexpected loss, and stress loss of the actuarial approach.

[Figure 1]

Probability modeling of loss data
Loss data is not Gaussian. The normal distribution model that backs so much of statistical inference will not do. Loss data by its nature has no negative values. There are no profits to be read as negative losses. Operational losses are always a cost.


Further, a truncated normal distribution taking only positive values gives too little probability to its tail, making insufficient allowance for large and very large losses. The lognormal distribution has historically been used instead in econometric theory, and the Weibull in reliability modeling. In practice, even the lognormal will fail to pick up on the extremely large losses.

Two models that can allow large observations come from Extreme Value Theory. These are the Generalized Extreme Value distribution (GEV) and the Generalized Pareto Distribution (GPD). They are limit distributions as sample sizes increase to infinity. These distributions are used in environmental studies (hydrology, pollution, sea defenses, etc.) as well as in finance. The GEV and GPD each have three parameters: μ giving location, σ scale, and ξ shape, which we vary to obtain a good fit. The location μ and shape ξ are not to be identified with the population mean and population variance. For the GPD, μ is the lower bound of the range. Figure 1 shows the form of their respective probability density functions. The lognormal and Weibull have just two parameters, and so lack flexibility. Models with four and more parameters, such as Tukey's g-and-h class of distributions, are also gaining users, but require more data than is usually available, though they have been seen to capture the loss distribution of aggregated firm-wide losses. An extensive list with plots and properties can be found in Young and Coleman (2009).

Fitting severity models
We fit the GEV and GPD to the 75 losses given in Cruz (2002, p. 83). For each model we use two fitting processes: maximum likelihood for all three parameters, and maximum likelihood for the location and scale with the Hill estimator for the shape parameter [Hill (1975)]. Figure 2 shows the sample cumulative distribution function (the observed proportion of values less than x) shown as steps, together with the four fitted cumulative distribution functions (the height y is the probability of obtaining a future value less than x). The range of observation is (143, 3822). From Figure 2 we can see a good fit in each case. What we also see, in Table 6, is that the estimated losses at large quantiles (reading x-values from fitted y-values) differ greatly between the fitted models, that is to say, at values way beyond the largest observation.

[Figure 2]

Fitted model | GEV   | GEV  | GPD   | GPD

Parameter estimates
μ            | 230   | 230  | 135   | 150
σ            | 130   | 100  | 165   | 125
ξ            | 0.53  | 0.70 | 0.44  | 0.70

Quantiles
Q(0.9)       | 777   | 793  | 866   | 793
Q(0.95)      | 1230  | 1169 | 1425  | 1161
Q(0.975)     | 1960  | 1706 | 2333  | 1661
Q(0.99)      | 3663  | 2794 | 4457  | 2605
Q(0.995)     | 5906  | 4046 | 7258  | 3619
Q(0.999)     | 18066 | 9525 | 22452 | 7595

Data         | Model values
917          | 1089  | 1057 | 1251  | 1053
1299         | 1288  | 1214 | 1497  | 1204
1416         | 1614  | 1459 | 1902  | 1435
2568         | 2280  | 1925 | 2733  | 1857
3822         | 4842  | 3470 | 5929  | 3160

Table 6 – The parameters, quantiles, and fitted values of GEV and GPD models when fitted to loss data. Source: Young and Coleman (2009, pp. 400-403)

Estimation far outside a dataset is always fraught and can lead to significant errors in high-quantile estimation. Basel II asks for the 99.9th percentile, Q(0.999), Solvency II for the 99.5th percentile, Q(0.995). These quantiles are the opVaR.

A simulation study of GPD (0.70, 150, 125) gave an estimated 95% confidence interval for Q(0.999) of (5200, 9990), very wide indeed.

             | μ   | σ   | ξ
GEV          | 230 | 130 | 0.53
Simulation 1 | 234 | 135 | 0.56
Simulation 2 | 224 | 120 | 0.49
Simulation 3 | 221 | 119 | 0.45
Simulation 4 | 230 | 131 | 0.51
Average      | 227 | 126 | 0.50

Table 7 – Parameter estimates of GEV (ξ, μ, σ) from four simulations of 1000 values from GEV (0.53, 230, 130)

The computations were made using Academic Xtremes [Reiss and Thomas (2007)]. The statistical operations were carried out without seeking any great precision. The data in thousands of dollars were rounded to the nearest thousand dollars, the parameters of the fitted models were rounded to two significant digits, the fits were judged by eye, and the simulation for the estimated confidence interval was based on a simulation of only 4000 values.

judged by eye, and the simulation for the estimated confidence interval was based on only 4000 values. My first big surprise was that a good fit could be achieved; my second, that it could be achieved so easily; my third, that we could have four close fits with such a variety of parameter values. Table 7 shows parameter estimation variability through four simulations of 1000 values from GEV (0.53, 230, 130). We see that 1000 values can be a small sample in that it may not be enough to provide precise estimation.

               μ      σ      x
GEV            230    130    0.53
Simulation 1   234    135    0.56
Simulation 2   224    120    0.49
Simulation 3   221    119    0.45
Simulation 4   230    131    0.51
Average        227    126    0.50

Table 7 – Parameter estimates of GEV (x, μ, σ) from four simulations of 1000 values from GEV (0.53, 230, 130)

Why such lack of concern for precision? Highly sophisticated methods can do no better when we have such variability in the resulting output at high quantiles far beyond the data. We see this variability right away in the estimated parameter values. The fit was judged by eye. Why were goodness-of-fit tests not used? They are by their nature conservative, requiring strong evidence for rejecting a fit (evidence not available here).
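For concreteness, the following is a minimal sketch of both computations in Python with SciPy, rather than the Academic Xtremes package used above. The losses are simulated stand-ins, not the Cruz (2002) data, so the numbers will differ; note that SciPy's shape parameter c equals -x for the GEV and +x for the GPD.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical losses in $000s, standing in for the 75 losses of Cruz (2002).
losses = stats.genpareto.rvs(0.5, loc=150, scale=130, size=75, random_state=rng)

# Maximum-likelihood fits of GEV and GPD, and the high quantiles they imply.
for name, dist in [("GEV", stats.genextreme), ("GPD", stats.genpareto)]:
    params = dist.fit(losses)            # returns (shape c, location, scale)
    for q in (0.95, 0.99, 0.999):        # Q(0.999) is the Basel II opVaR level
        print(f"{name} Q({q}) = {dist.ppf(q, *params):8.0f}")

# Variability in the spirit of Table 7: refit four samples of 1000 values
# simulated from GEV (x=0.53, mu=230, sigma=130).
x, mu, sigma = 0.53, 230.0, 130.0
for i in range(4):
    sample = stats.genextreme.rvs(-x, loc=mu, scale=sigma, size=1000,
                                  random_state=rng)
    c_hat, mu_hat, sig_hat = stats.genextreme.fit(sample)
    print(f"simulation {i + 1}: x = {-c_hat:.2f}, "
          f"mu = {mu_hat:.0f}, sigma = {sig_hat:.0f}")

Repeated runs make the point of Table 7 directly: each refit of 1000 values gives visibly different parameter estimates, and the implied Q(0.999) moves with them.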
Modeling body and tail separately
With sufficient data we may see that the tail data needs to be modeled separately from the main body. We might consider fitting a GEV to the body and a GPD to the tail, reflecting the Extreme Value Theory properties. Three parameters each and one for the location of the join between the two parts makes seven, but having the two probability densities meeting smoothly provides two relations between them, and using the same shape parameter brings the problem down to four unknowns [Giacometti et al. (2007), Young and Coleman (2009, p. 397)].
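One way to write the spliced model down (my notation, not necessarily that of the cited papers): let g be the GEV density for the body, h the GPD density for the tail, u the join point, and x the shared shape, so that

f(y) = \begin{cases} g(y;\, x, \mu_1, \sigma_1), & y \le u \\ h(y;\, x, \mu_2, \sigma_2), & y > u \end{cases}, \qquad g(u) = h(u), \quad g'(u) = h'(u).

Seven quantities (two shapes, two locations, two scales, and u), minus one for the shared shape and two for the smoothness conditions, leave the four free parameters mentioned above.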
Modeling frequency
Standard statistical methods can be used to fit Poisson or negative binomial probability distributions to frequency counts. Experience shows that the daily, weekly, or monthly frequencies of loss events tend to occur in a more irregular pattern than can be fitted by either of these models.
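The usual first check here is for overdispersion. A minimal sketch with hypothetical monthly counts (not data from the text): a Poisson model forces variance = mean, a method-of-moments negative binomial absorbs extra variation, and counts more irregular still would defeat both, as noted above.

import numpy as np

# Hypothetical monthly loss counts, chosen to be more variable than Poisson.
counts = np.array([3, 0, 7, 2, 11, 1, 0, 5, 14, 2, 1, 8])

m, v = counts.mean(), counts.var(ddof=1)
print(f"mean = {m:.2f}, variance = {v:.2f}")  # Poisson needs variance ~ mean

# Method-of-moments negative binomial NB(r, p): mean = r(1-p)/p, var = mean/p,
# so p = mean/var and r = mean^2 / (var - mean) whenever var > mean.
if v > m:
    p = m / v
    r = m * m / (v - m)
    print(f"negative binomial fit: r = {r:.2f}, p = {p:.2f}")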
Basel asks that we combine the fitted frequency model with the fitted severity model to obtain a joint model. This loses the correlation structure from the loss events.
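The combination is usually carried out by Monte Carlo: draw a year's loss count from the frequency model, a severity for each loss, and sum. The sketch below uses hypothetical Poisson and GPD parameters; because every draw is independent, it makes concrete the loss of correlation structure just mentioned.

import numpy as np

rng = np.random.default_rng(2)
n_years, lam = 100_000, 6.0                # hypothetical annual frequency

counts = rng.poisson(lam, size=n_years)    # one loss count per simulated year
u = rng.random(counts.sum())
# Hypothetical GPD(x=0.5, mu=150, sigma=130) severities by inverse transform:
# Q(u) = mu + (sigma / x) * ((1 - u)**(-x) - 1)
sev = 150.0 + (130.0 / 0.5) * ((1.0 - u) ** -0.5 - 1.0)

# Sum the severities within each year to get the aggregate annual loss.
totals = np.zeros(n_years)
np.add.at(totals, np.repeat(np.arange(n_years), counts), sev)

print(f"aggregate Q(0.999) = {np.quantile(totals, 0.999):,.0f}")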
Some topics left out:

■■ Basel asks for correlation analysis: a big problem with small datasets. Mathematical finance has provided us with a correlation theory based on copulas, but it is not useful here.
■■ Validation techniques, such as the resampling methods of the jackknife and bootstrap [Efron and Tibshirani (1993)], can be used to obtain sampling properties of the estimates, such as confidence intervals (see the sketch after this list).
■■ Bayes hierarchical modeling [Medova (2000), Kyriacou and Medova (2000), Coles and Powell (1996)] treats non-stationarity by letting the GPD parameters be themselves random, drawn from distributions with their own parameters (hyper-parameters).
■■ Dynamic financial analysis refers to enterprise-wide integrated financial risk management. It involves (mathematical) modeling of every business line, with dynamic updating in real time.
■■ Bayes belief networks (BBNs) are acyclic graphs of nodes connected by directed links of cause and effect. The nodes are events, and the states are represented by random variables. Each node event is conditioned on every path to it. For the link A to B, the random variable X associated with event A is given by the multivariate history leading to A. The BBN requires starting probabilities for each node, and the calculation of P(B | A), where A incorporates all links to it. This is a formidable task. The introspection forces risk assessment for every activity, and, once the network is up and running, it can be used for stress testing.
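As promised in the validation bullet, a minimal bootstrap sketch (again on simulated stand-in losses, not the Cruz data): resample with replacement, refit, and take percentiles of the refitted quantile. On heavy-tailed data the resulting interval for Q(0.999) is typically very wide, echoing the (5200, 9990) interval reported earlier.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
losses = stats.genpareto.rvs(0.5, loc=150, scale=130, size=75, random_state=rng)

boot_q = []
for _ in range(500):                        # 500 bootstrap resamples
    resample = rng.choice(losses, size=losses.size, replace=True)
    c, loc, scale = stats.genpareto.fit(resample)
    boot_q.append(stats.genpareto.ppf(0.999, c, loc, scale))

lo, hi = np.percentile(boot_q, [2.5, 97.5])
print(f"bootstrap 95% CI for Q(0.999): ({lo:,.0f}, {hi:,.0f})")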

To sum up
The point being emphasized is that no methodology on its own can provide an answer. Multiple approaches should be used, both qualitative and quantitative, to aid management in acquiring a sensitivity to data, its interpretation, and its use in decision making.

References
• Basel Committee on Banking Supervision, 2006, "International convergence of capital measurement and capital standards," Bank for International Settlements
• Coles, S., and E. Powell, 1996, "Bayesian methods in extreme value modelling: a review and new developments," International Statistical Review, 64, 119-136
• Cruz, M. G., 2002, Modeling, measuring and hedging operational risk, Wiley
• Efron, B., and R. Tibshirani, 1993, An introduction to the bootstrap, Chapman & Hall
• Embrechts, P. (ed.), 2000, Extremes and integrated risk management, Risk Books
• Giacometti, R., S. Rachev, A. Chernobai, and M. Bertocchi, 2008, "Aggregation issues in operational risk," The Journal of Operational Risk, 3:3, 3-23
• Giacometti, R., S. Rachev, A. Chernobai, M. Bertocchi, and G. Consigli, 2007, "Heavy-tailed distributional model for operational risk," The Journal of Operational Risk, 3:3, 3-23
• Hill, B. M., 1975, "A simple general approach to inference about the tail of a distribution," Annals of Statistics, 3, 1163-1174
• Kyriacou, M. N., and E. A. Medova, 2000, "Extreme values and the measurement of operational risk II," Operational Risk, 1:8, 12-15
• Lloyd's of London, 2009, "ICA: 2009 minimum standards and guidance," Lloyd's
• Medova, E. A., 2000, "Extreme values and the measurement of operational risk I," Operational Risk, 1:7, 13-17
• Reason, J., 1997, Managing the risk of organisational accidents, Ashgate
• Reiss, R.-D., and M. Thomas, 2007, Statistical analysis of extreme values (3rd edition), Birkhäuser
• Selvaggi, M., 2009, "Analysing operational risk in insurance," ABI Research Paper 16
• Young, B., and R. Coleman, 2009, Operational risk assessment, Wiley
Articles

Risk management
after the Great Crash

Hans J. Blommestein1
PwC Professor of Finance, Tilburg University and
Head of Bond Markets and Public Debt, OECD

Abstract
This study takes a closer look at the role of risk (mis)management by financial institutions in the emergence of the Great Crash. It is explained that prior to the crisis too much reliance was placed on the quantitative side of risk management, while not enough attention was paid to qualitative risk management. In this context it is argued that there is an urgent need for dealing more effectively with inherent weaknesses related to institutional and organizational aspects, governance issues, and incentives. More sophistication and a further refinement of existing quantitative risk management models and techniques is not the most important or effective response to the uncertainties and risks associated with a fast-moving financial landscape. In fact, too much faith in a new generation of complex risk models might lead to even more spectacular risk management problems than the one we experienced during the Great Crash. Against this backdrop, the most promising approach for improving risk management systems is by providing a coherent framework for addressing systematic weaknesses and problems that are of a qualitative nature. However, given the inadequate and very imperfect academic knowledge and tools that are available, risk management as a scientific discipline is not capable of dealing adequately with fundamental uncertainties in the financial system, even if a coherent picture associated with the aforementioned qualitative issues and problems is provided. This perspective provides an additional motivation to those authorities that are contemplating constraining or restructuring parts of the architecture of the new financial landscape.

1 The views expressed are personal ones and do not represent the organisations with which the author is affiliated. All errors are mine.

The literature on the causes of the global credit/liquidity crisis (Great Crash for short) is growing exponentially [see, for example, Senior Supervisors Group (2008), FSA (2009), BIS (2009), among others]. The many studies and policy reports reveal serious failures at both the micro- (financial institutions) and macro-levels (financial system as a whole) and mistakes made by different actors (bankers, rating agencies, supervisors, monetary authorities, etc.). This paper (i) takes a closer look at the role of risk (mis)management by financial institutions in the creation of the Great Crash; and (ii) outlines the most promising ways for improving risk management, in particular by paying much more attention to qualitative risk management issues. However, it will also be explained why it would still be impossible to prevent major crises in the future, even if a coherent picture involving qualitative issues or aspects were provided.

In doing so, we do not exclude the importance of the role of other (important) factors in the origin of the Great Crash. On the contrary, mistaken macroeconomic policies as well as deficiencies in the official oversight of the financial system were also significant factors in the global financial crisis, especially in light of global imbalances and the emergence of systemic risks and network externalities during the evolution of the crisis. However, most commentators would agree that serious failures in risk management at the level of major (and even some medium-sized) financial institutions played an important role in the origin and dynamics of the Great Crash.

Why did risk management fail at the major financial institutions?
The conventional storyline before the Great Crash was that risk management as a science had made considerable progress in the past two decades or so and that financial innovations such as risk transfer techniques had actually made the balance sheets of financial institutions stronger. However, my analysis of risk management failures associated with the global financial crisis (supplemented by insights gained from studying earlier crisis episodes such as the crash of LTCM in 1998) [Blommestein (2000)] uncovers deep flaws in the effectiveness of risk management tools used by financial institutions.

The crisis has not only shown that many academic theories were (are) not well-equipped to properly price the risks of complex instruments such as CDOs [Blommestein (2008b, 2009)], especially during market downturns [Rajan (2009)], but also that risk management methodologies and strategies based on basic academic finance insights were not effective or even misleading. A core reason is that academic risk-management methodologies are usually developed for a system with 'well-behaved or governed' financial institutions and markets that operate within a behavioral framework with 'well-behaved' incentives (that is, risk management is not hampered by dysfunctional institutions, markets, or financial instruments and/or undermined by perverse incentives).

In this paper, I will, therefore, not only focus on the quantitative dimension of risk management systems, such as the mispricing of risks (of complex products) and measurement problems, but also on the qualitative dimension, covering issues such as the above-mentioned 'practical' obstacles of an institutional, organizational, and incentive nature. In addition to the well-documented role of the perverse incentives associated with misconstrued compensation schemes, it will be argued that the institutional, or organizational, embedding of risk management is of crucial importance as well. It will be suggested that the qualitative dimension of risk management is as important as (or perhaps even more important than) the measurement or quantitative side of the process of risk management.

This view implies that even if the risk management divisions of these financial institutions had acted with the longer-term interests of all stakeholders in mind (which many of them did not), they would still have had an uphill battle in effectively managing the risks within their firms because of (a) the inadequate tools that were at their disposal (based, crucially, on insights from academic finance); (b) the complex organizational architecture, and sometimes dysfunctional institutional environment, in which many risk managers had (have) to operate; and (c) excessive risk-taking associated with business strategies that incorporated perverse compensation schemes. In other words, had the risk management divisions of these institutions effectively implemented the state-of-the-art tools that were provided to them by academic finance (including, crucially, reliable information about the riskiness of complex financial instruments such as structured products), they would still have had to struggle with the complex institutional or organizational embedding of risk management as well as the various channels through which an unsound incentive structure (operating between financial institutions and the market) can have an adverse structural impact on the pricing of financial assets2.

2 There are various channels through which an unsound incentive structure can have an adverse structural influence on the pricing of financial assets. For example, Fuller and Jensen (2002) illustrate (via the experiences of Enron and Nortel) the dangers of conforming to market pressures for growth that are essentially impossible, leading to an overvalued stock. Other channels through which perverse incentives are being transmitted originate from situations where traders and CEOs pursue aggressive risk strategies, while they face largely an upside in their rewards structures (and hardly a downside). For example, successful traders can make fortunes, while those that fail simply lose their jobs (in most cases they move on to a trading job in other financial institutions). CEOs of institutions that suffer massive losses walk away with very generous severance and retirement packages. There are also fundamental flaws in the bonus culture. Costs and benefits associated with risk-taking are not equally shared, and the annual merry-go-round means financial institutions can end up paying bonuses on trades and other transactions that subsequently prove extremely costly [Thal Larsen (2008)]. Rajan (2008) points out that bankers' pay is deeply flawed. He explains that employees at banks (CEOs, investment managers, traders) generate jumbo rewards by creating bogus 'alpha' by hiding long-tail risks. In a similar spirit, Moody's arrives at a very strong conclusion that the financial system suffers from flawed incentives that encourage excessive risk-taking [Barley (2008)].

This perspective can then also be used to explain why it is very difficult or even impossible to effectively manage the totality of complex risks faced by international banks and other financial institutions. More specifically, it is the reason why effective enterprise-wide risk management is an extremely hard objective, especially in

a rapidly-changing environment. This echoes a conclusion from a 2005 paper on this topic: "But successful implementation of ERM is not easy. For example, a recent survey by the Conference Board, a business research organization, shows that only 11 per cent of companies have completed the implementation of ERM processes, while more than 90 per cent are building or want to build such a framework" [Blommestein (2005b)].

For all these reasons we have to have a realistic attitude towards the practical capacity of risk management systems, even advanced ones based on the latest quantitative risk measures developed in the academic literature. An additional reason to take a very modest view of the real abilities of available risk control technologies is the fact that academically developed risk measures predominantly deal with market and credit risk. Academic finance has much less to say about the analytical basis for liquidity risk, operational risk, and systemic risk. Unfortunately, the latter types of risks played major roles in the origin of the Great Crash [Blommestein (2008a)], showing that they were not adequately diagnosed, managed, and/or supervised.

Against this backdrop, we will, first, cast a critical eye at what has been suggested to be the principal cause of the recent financial crisis: the systemic mispricing of risks and the related degeneration of the risk management discipline into a pseudo-quantitative science. After that we will show that the disappearance or weakening of due diligence by banks in the securitization process was an important crisis factor that was not detected by conventional, quantitative risk management systems. In fact, it is another important example of why individual financial institutions can fail, and complete systems collapse, when not enough attention is being paid to the qualitative dimension of risk management [Blommestein et al. (2009)].

How important was the mispricing of risks?
An increasingly interconnected and complex financial system made it harder to price risks correctly. Both market participants and supervisors underestimated the increase in systemic risk. Early on I concluded in this context: "The sheer complexity of derivatives instruments, coupled with consolidation in the financial industry, has made it increasingly hard for regulators and bankers to assess levels of risk. In the credit derivatives market, risks that have been noted include a significant decline in corporate credit quality, little information on counterparties, operational weaknesses that may result from the novelty of these instruments, and a disincentive to manage actively portfolio credit risk. As a result, systemic risk in this complex, often opaque financial landscape is likely to be higher than before" [Blommestein (2005b)].

Although risk managers had more rigorous risk management tools at their disposal than in the past, the rapidly changing financial landscape (characterized by more complex products and markets, a higher level of systemic risk, and increased financial fragility) weakened the applicability and conditions under which these quantitative tools and techniques can be used effectively. Many market participants (including sophisticated ones) had difficulties in understanding the nature and pricing of new products and markets, due to the sheer complexity of many new financial instruments and the underlying links in the new financial landscape. In a 2005 study I noted: "Even sophisticated market participants might, at times, have difficulties understanding the nature of these new products and markets. Consequently, risks may be seriously mispriced. In the market for collateralized debt obligations (CDOs), the high pace of product development requires the rapid adaptation of pricing machines and investment strategies. Although the ability to value risky assets has generally increased, concerns have been raised about the complex risks in this fast-growing market segment and, more to the point, whether investors really understand what they are buying" [Blommestein (2005b)].

Moreover, as explained above, securitization was adversely affected by problems with incentives and information as well as the pricing of tail events [Cechetti (2009)]. More generally, a number of widely held assumptions proved to be wrong and costly, in particular that the originate-and-distribute (securitize) model would decrease residual risk on the balance sheet of banks, that the growing use of credit risk transfer instruments would result in better allocated risks and a more stable banking sector, and that ample market liquidity would always be available.

The outbreak of the financial crisis proved these assumptions wrong, whereby the dark side of the risk paradox became visible [Blommestein (2008c)]. In effect, it became increasingly clear that there had been a significant and widespread underestimation of risks across financial markets, financial institutions, and countries [Trichet (2009)]. For a variety of reasons, market participants did not accurately measure the risk inherent in financial innovations and/or understand the impact of financial innovations on the overall liquidity and stability of the financial system. Indeed, there is growing evidence that some categories or types of risks associated with financial innovations were not internalized by markets; for example, tail risks were underpriced and systematic risk (as externality) was not priced, or was priced inadequately. This widespread and systemic underestimation of risks turned out to be at the core of the financial crisis. At the same time, the availability of sophisticated quantitative risk tools created a false sense of security and induced people to take greater risks. "Professional enthusiasm about new risk control technology may give rise to overconfidence and even hubris" [Helwig (2009)].

The underestimation of risks reflected to an important degree mistakes in both the strategic use of risk management systems and the technically inadequate risk management tools. During the unfolding of the crisis, many financial institutions revealed a huge

concentration of risks, suggesting that risk management systems failed (a) to identify key sources of risks, (b) to assess how much risk was accumulated, and (c) to price financial risks properly (or to use reliable market prices, in particular for structured products). The underlying problem was that risk management did not keep pace with the risks and uncertainty inherent in financial innovations and the fast-changing financial landscape. Risk managers placed too much trust in the existing risk models and techniques (see below), while underlying assumptions were not critically evaluated. Unfortunately, the use of these models proved to be inadequate, both from a technical and a conceptual point of view. On top of this, risk management fell short from a qualitative perspective; that is, too little attention was paid to corporate governance processes, the architecture and culture of organizations, business ethics, incentives, and people.

Fatal flaws in the origination and securitization process: failures in quantitative and qualitative risk management
The (impact of the) securitization of mortgages and financial innovations such as CDO and CDS markets came under heavy criticism as being an important cause of the global financial crisis [Blommestein (2008a); Tucker (2010); Jacobs (2009)]. Risks were significantly underpriced [Blommestein (2008b)] (in particular by rating agencies) while risk management systems failed.

Naturally, regulators and central bankers also made mistakes. Most studies, however, seem to suggest that had we gotten a better understanding of the correct pricing of complex structured products such as CDOs, CLOs, and CDSs over the cycle, then we might have been able to prevent the seriousness of the global financial crisis. For example, popular CDO pricing models such as the Gaussian copula function are based on the dubious key assumption that correlations are constant over the cycle. The above reasoning implies that had we been able to employ a far superior method (in terms of accuracy and/or robustness) than the Gaussian copula function, then we would have valued structured products more accurately over the cycle. As a result, the Great Crash would not have occurred.
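To make the object of this criticism concrete, the sketch below implements the standard one-factor Gaussian copula simulation of portfolio defaults (with an illustrative portfolio size, default probability, and correlations, not any calibrated model from the text). The single constant correlation rho is the input called dubious here: it alone determines how strongly defaults cluster.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n_names, p, n_sims = 100, 0.02, 50_000     # hypothetical portfolio and PD

threshold = norm.ppf(p)                    # default if latent variable < threshold
for rho in (0.0, 0.3, 0.6):                # the constant-correlation assumption
    M = rng.standard_normal((n_sims, 1))             # common market factor
    Z = rng.standard_normal((n_sims, n_names))       # idiosyncratic factors
    X = np.sqrt(rho) * M + np.sqrt(1.0 - rho) * Z    # latent "asset values"
    defaults = (X < threshold).sum(axis=1)           # defaults per scenario
    # Higher rho leaves mean defaults unchanged but fattens the loss tail.
    print(f"rho = {rho:.1f}: mean defaults = {defaults.mean():5.2f}, "
          f"99.9th percentile = {np.percentile(defaults, 99.9):4.0f}")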
The limits of pricing models and quantitative risk management
However, the conclusion that better quantifications would have prevented a major crisis can be challenged on the following three key grounds. First, as noted above, pricing in the fast-moving, complex financial landscape is a huge challenge. Pricing models are therefore subjected to significant model risk. For example, the foundation of the pricing of risk in structured products such as CDOs and CDSs is based on the key theoretical notion of perfect replication. Naturally, perfect replication does not exist in reality and has to be approximated by historical data, which in many cases is very incomplete and of poor quality. Instead, researchers and practitioners had to rely on simulation-based pricing machines. The input for these simulations was very shaky as it was based on "relatively arbitrary assumptions on correlations between risks and default probabilities" [Colander et al. (2009)].

Second, many institutions have major difficulties in quantifying the 'regular' risks associated with credit and market instruments. However, the Great Crash demonstrated that this was even more so the case for firms' operational and liquidity risks. These complications multiply when one tries to aggregate the various risks of divisions or departments within larger financial institutions3.

Third, from a more conceptual perspective, the financial crisis brought to light that the risk management discipline had developed too much into a pseudo-quantitative science with pretensions beyond its real risk management capabilities4. The over-reliance on sophisticated though inadequate risk management models and techniques contributed to a false sense of security [Honohan (2008)]. Indeed, many professionals were too confident in the ability of quantitative models to reliably measure correlations and default probabilities [Helwig and Staub (1996)]. It was assumed that quantitative risk management models represented stable and reliable stochastic descriptions of reality. Ironically, by relying to an increasing degree on sophisticated mathematical models and techniques, the risk management discipline lost its ability to deal with the fundamental role of uncertainty in the financial system5. In addition to this fundamental methodological problem, the financial crisis revealed technical failures in risk management in the sense that even sophisticated methods and techniques turned out not to be refined enough. At the core of many risk management systems was (is) the concept of Value-at-Risk (VaR), which became a key tool in the quantification of risk, the evaluation of risk/return tradeoffs, and the disclosure of risk appetite to regulators and shareholders6. This concept is effectively based on the idea that the analysis of past price movement patterns could deliver statistically robust inferences relating to the probability of price movements in the future [FSA (2009)]. However, the financial crisis revealed severe problems with applying the VaR concept to the world of complex longer-term social and economic relationships [Danielsson (2002)].
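A minimal sketch of that idea in its simplest, historical-simulation form (using illustrative simulated P&L, not any institution's data): the empirical quantile of past profit-and-loss is read off as a statement about future losses, which is precisely the inference questioned here.

import numpy as np

rng = np.random.default_rng(5)
# Illustrative daily P&L history in $m; a real desk would use actual P&L.
pnl = rng.standard_t(df=4, size=750) * 1.5

# 99% one-day historical-simulation VaR: the loss exceeded on 1% of past
# days, taken as an estimate of the loss exceeded on 1% of future days.
var_99 = -np.percentile(pnl, 1)
print(f"99% one-day historical VaR: {var_99:.2f} $m")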

3 Let us focus on operational risk (op risk) in a large bank as an example. Shojai and Feiger (2010) note that many institutions have only recently started to come to grips with the fact that operational risk is a major risk fraught with obstacles. First, quantification is a very challenging task. Second, once op risk has been measured properly, we need to be able to compute the operational risks of each division. Third, how do firms aggregate op risk across organisations as a whole? Many larger institutions still segregate their different businesses (perhaps for good reasons). Hence it is nearly impossible (a) to quantify operational risk for the group and (b) to determine what diversification benefits could be derived. Moreover, in some banks, FX, credit, and equities each have their own quants teams, whose aggregate risks for the bank no one can really understand [or even for the financial system as a whole, Patterson (2010)].
4 See Patterson (2010) for a non-technical account of the role of Process Driven Trading (PDT) during the crisis. The formulas and complicated models of quants traded huge quantities of securities and, as the housing market began to crash, the models collapsed. In the words of Patterson (2010): "The result was a catastrophic domino effect. The rapid selling scrambled the models that quants used to buy and sell stocks, forcing them to unload their own holdings. By early August, the selling had taken on a life of its own, leading to billions in losses. The meltdown also revealed dangerous links in the financial system few had previously realized — that losses in the U.S. housing market could trigger losses in huge stock portfolios that had nothing to do with housing. It was utter chaos driven by pure fear. Nothing like it had ever been seen before. This wasn't supposed to happen!"
5 Some academic economists were certainly aware of the limitations and weaknesses of these models for use in the financial sector. For example, Merton (1994) gave the following general warning: "The mathematics of hedging models are precise, but the models are not, being only approximations to the complex, real world. Their accuracy as a useful approximation to that world varies considerably across time and place. The practitioner should therefore apply the models only tentatively, assessing their limitations carefully in each application". However, other academics and many users of academic models in the financial industry were often ill-informed and ignorant about the deeper weaknesses of using these kinds of models across time and different market places [Blommestein (2009)].
6 Ironically enough, the October 1987 crash marked the birth of VaR as a key risk management tool. For a very brief history of the birth of VaR, see Haldane (2009).

did not dampen the demand from institutional investors for AAA
Complex risks and uncertainties and the importance of qualitative risk management
Focusing too much on the technical intricacies of models for the 'correct' pricing of these assets, although important, ensures that we ignore another, often neglected, crucial reason why markets for securitized assets became so big and fragile and finally collapsed7. In fact, as noted before, the quantitative approach to risk management does not fully cover the range of important risks and uncertainties.

First, there is the insight that the 'originate-to-securitize' process (and its embedded risks) was capable of generating not very well understood negative spillovers via the shadow banking sector to commercial banks [Tucker (2010)].

Second, the originate-to-securitize business model or process had as an (unintended) consequence the fatal weakening of due diligence undertaken by originators8. With the originators relinquishing their role as the conductors of due diligence, it was left to the credit rating agencies (CRAs) to fill this information gap. But, on top of their inadequate pricing methodologies, CRAs never had sufficient access to the required information about underlying borrowers to have any idea of their true state of health. That crucial information was in principle in the hands of the issuing bank, but, as noted, they had stopped caring about collecting that kind of information when they started selling the mortgages on to other investors [Keys et al. (2010)]. So, the rating agencies had to use aggregate data to rate these instruments, or to rely on the credit quality of the insurer who had provided credit enhancement to the security [Fabozzi and Kothari (2007)]. In either case, neither the credit enhancer nor the rating agency had any idea about the underlying quality of the borrowers [Shojai and Feiger (2010)].

Third, perverse incentives, associated with flawed compensation structures, keep valuations from their 'true' (equilibrium) prices [Blommestein (2008b)]. As a result, excessive risk-taking was (and is) manifesting itself in part through asset bubbles with significantly underpriced risks. Moreover, experience and tests show that humans have an ingrained tendency to underestimate outliers [Taleb (2007)] and that asset markets have a tendency to generate a pattern of bubbles (with prices much higher than the intrinsic value of the asset), followed by crashes (rapid drops in prices) [Haruvy et al. (2007)].

However, prior to the crisis, these underlying structural problems did not dampen the demand from institutional investors for AAA paper. Institutional investors believed that it was possible to squeeze out a large quantity of paper rated AAA via the slicing and dicing of repeated securitization of the original package of assets (mortgages, other loans). Consequently, all that was needed to expose the underlying weaknesses was a correction in house prices, which is exactly what happened. Moreover, it became indeed painfully clear that the complex securities issued by CDOs are very hard to value, especially when housing prices started to drop and defaults began to increase.

The key insight of this overview is that modern risk transfer schemes (which reallocate risk or, in many cases more accurately stated, uncertainty) may undermine due diligence (and prudence more generally), especially in combination with compensation schemes that encourage excessive risk-taking. This structural lack of prudential behavior infected not only the structured finance segments but the entire financial system. All types of players (bankers, brokers, rating agencies, lawyers, analysts, etc.) were operating under business plans with an (implicit) short-term horizon that put institutions and systems at risk. Deeply flawed incentive schemes encouraged dangerous short-cuts, excessive risk-taking, but also unethical practices9. The culture of excessive risk-taking and dubious ethics [Blommestein (2005a)] in the banking industry spread like a virus during the past decades and became firmly entrenched [Blommestein (2003)]. Even if the top management of banks aims to maximize long-term bank value, it may be extremely hard to impose incentives and control systems that are consistent with this objective. In fact, prior to the crisis, there was a move away from this long-term objective, with an increasing number of bank CEOs encouraging business strategies based on aggressive risk-taking. This in turn engendered excess risk-taking and non-ethical practices within firms and at all levels (traders, managers, CEOs).

Hence, a relatively small and local crisis could transform itself into the Great Crash.

Clearly, the many complex and new risks and uncertainties in the fast-moving financial landscape could not be effectively diagnosed and managed via a purely quantitative approach. In fact, it encouraged additional risk-taking induced by a false sense of confidence in sophisticated risk-control technologies.

7 A similar problem would occur when one would focus only on 'macro' factors such as global imbalances and low interest policies. Even if interest rates had not been kept so low for as long as they were, or global imbalances had been smaller, we would still not have been able to prevent the erosion of financial stability and the structural weaknesses in the financial sector (in both the banking sector and security markets). In retrospect, the global crisis of the new financial landscape was an accident waiting to happen due to (semi-)hidden unsound structural features (see below).
8 Rajan (2009) notes in this context that "originators could not completely ignore the true quality of borrowers because they were held responsible for initial defaults." However, he concludes that even this weak source of discipline was undermined by steadily rising housing prices.
9 A clear example is the trading strategy used by a number of Citigroup employees. On 2 August 2004, Citigroup pushed through €11 billion in paper sales in two minutes over the automated MTS platform, throwing the market into confusion. As the value of futures contracts fell and traders moved to cover their positions, Citigroup reentered the market and bought back about €4 billion of the paper at cheaper prices. The strategy was dubbed Dr Evil in an internal e-mail circulated by the traders. In 2007, an Italian court indicted seven (by that time former) Citigroup traders on charges of market manipulation in the sale and repurchase of government bonds on the MTS electronic fixed income network. (Citi bond traders indicted over 'Dr Evil' trade, http://www.finextra.com/fullstory.asp?id=17210, 19 July 2007.)

We, therefore, need a paradigm shift in risk management that also includes an assessment of uncertainties through the lens of qualitative risk management. Only in this way would we be able to tackle the adverse influences of organizational issues, human behavior, and incentive schemes [Blommestein (2009)]. This would also allow us to account for the fact that all risk measurement systems are far more subjective than many experts want to accept or admit. Empirical risk control systems are the result of subjective decisions about what should be incorporated into the risk model and what should not10.

Conclusions
The first key conclusion is that prior to the Great Crash too much reliance was placed on the quantitative side of risk management and too little on the qualitative dimension. In this context it was argued that there was an urgent need for dealing effectively with inherent weaknesses related to institutional, organizational, governance, and incentive aspects11. More sophistication and a further refinement of existing quantitative risk management models and techniques is not the most important or effective response to the uncertainties and risks associated with a fast-moving financial landscape. In fact, too much faith in a new generation of complex risk models might even lead to more spectacular risk management problems than the one witnessed during the last decade. Instead, as noted by Blommestein et al. (2009), a more holistic and broader approach to risk management is needed as part of a paradigm shift where more attention is given to the qualitative dimension of risk management.

A final key finding is that it would still be impossible to prevent major crises in the future, even if a coherent picture associated with the aforementioned qualitative issues is provided. The underlying epistemological reason is the (by definition) imperfect state of academic knowledge about new uncertainties and risks associated with a fast-moving society, on the one hand, and the inherently inadequate risk-management responses that are available as tools to risk managers, their top management and, indeed, also to their supervisors, on the other. Indeed, we have shown in a related analysis that the Great Crash is another illustration of the fact that risk management as a scientific discipline is not capable of dealing adequately with fundamental uncertainties in the financial system [Blommestein et al. (2009)]. From this perspective it is therefore no surprise that some authorities are considering constraining or restructuring parts of the architecture of the new financial landscape [see Annex below and Group of Thirty (2009)].

Annex: Structural weaknesses waiting to erupt in the new financial landscape
Tucker (2010) recently analyzed the question of whether the structure or fundamental architecture of the new financial landscape needs to be constrained or restructured by the authorities. In doing so he focused on a key weakness in the new financial landscape: the shadow banking sector.

In Tucker's analysis, the dangerous side of 'shadow banking' refers to "those instruments, structures, firms or markets which, alone or in combination, and to a greater or lesser extent, replicate the core features of commercial banks: liquidity services, maturity mismatch and leverage." The un(der)regulated shadow banking activities can then create an unstable and fragile banking sector. For example, the money fund industry is a major supplier of short-term funding to banks, while its own maturity mismatch served to mask the true liquidity position of the banking sector. This in turn fatally injected additional fragility into the financial system as a whole. Warnings were published quite a few years ago [Edwards (1996)], while Paul Volcker, former chairman of the U.S. Federal Reserve, is reported to have expressed serious concerns at internal Federal Reserve meetings around thirty years ago [Tucker (2010)].

So, like in the case of the 'originate-to-securitize' process, this was an example of a structural weakness waiting to erupt, although the wait was longer. But during the global financial crisis they both became a reality. "When the Reserve Fund 'broke the buck' after Lehman's failure, there was a run by institutional investors" ... "Echoing Paul Volcker's concerns, the Bank of England believes that Constant-NAV money funds should not exist in their current form" [Group of Thirty (2009)].

10 This point is also emphasized by the CFO of the Dutch KAS Bank. He observes that many risk management models work fine when there is no crisis. However, they can fail spectacularly during a crisis because of left-out factors. It is important (1) to be aware which risk factors have been (deliberately) omitted and why, and (2) which actions to take when a crisis erupts [Kooijman (2010)].
11 Of interest is that the conclusions from a 2008 report on risk management practices during the crisis, drafted by a group of 8 financial supervisors from 5 countries, focused predominantly on organizational and institutional issues [Senior Supervisors Group (2008)].

References
• Barley, R., 2008, "Ability to track risk has shrunk 'forever': Moody's," Reuters.com, 6 January
• Blommestein, H. J., 2009, "The financial crisis as a symbol of the failure of academic finance? (A methodological digression)," Journal of Financial Transformation, Fall
• Blommestein, H. J., 2008a, "Grappling with uncertainty," The Financial Regulator, March
• Blommestein, H. J., 2008b, "Difficulties in the pricing of risks in a fast-moving financial landscape (A methodological perspective)," Journal of Financial Transformation, 22
• Blommestein, H. J., 2008c, "No going back (derivatives in the long run)," Risk, August
• Blommestein, H. J., 2006, The future of banking, SUERF Studies
• Blommestein, H. J., 2005a, "How to restore trust in financial markets?" in Dembinski, P. H. (ed.), Enron and world finance – a case study in ethics, Palgrave
• Blommestein, H. J., 2005b, "Paying the price of innovative thinking," Financial Times (Mastering Risk), 23 September
• Blommestein, H. J., 2003, "Business morality audits are needed," Finance & Common Good, No. 15, Summer
• Blommestein, H. J., 2000, "Challenges to sound risk management in the global financial landscape," Financial Market Trends, No. 75, April, OECD
• Blommestein, H. J., L. H. Hoogduin, and J. J. W. Peeters, 2009, "Uncertainty and risk management after the Great Moderation: the role of risk (mis)management by financial institutions," paper presented at the 28th SUERF Colloquium on "The Quest for Stability," 3-4 September 2009, Utrecht, The Netherlands, forthcoming in the SUERF Studies Series
• Boudoukh, J., M. Richardson, and R. Stanton, 1997, "Pricing mortgage-backed securities in a multifactor interest rate environment: a multivariate density estimation approach," Review of Financial Studies, 10, 405-446
• Cechetti, S. G., 2009, "Financial system and macroeconomic resilience," opening remarks at the 8th BIS Annual Conference, 25-26 June 2009
• Colander, D., H. Föllmer, A. Haas, M. Goldberg, K. Juselius, A. Kirman, and T. Lux, 2009, "The financial crisis and the systematic failure of academic economics," Kiel Working Papers No. 1489, Kiel Institute for the World Economy
• Danielsson, J., 2002, "The emperor has no clothes: limits to risk modelling," Journal of Banking and Finance, 26:7, 1273-1296
• Edwards, F., 1996, The new finance: regulation and financial stability, American Enterprise Institute
• Fabozzi, F. J., and V. Kothari, 2007, "Securitization: the tool of financial transformation," Journal of Financial Transformation, 20, 33-45
• Financial Services Authority, 2009, "The Turner review: a regulatory response to the global banking crisis," March
• Fuller, J., and M. C. Jensen, 2002, "Just say no to Wall Street," Journal of Applied Corporate Finance, 14:4, 41-46
• Group of Thirty, 2009, "Financial reform: a framework for financial stability," January 15
• Haldane, A. G., 2009, "Why banks failed the stress test," speech, 13 February
• Haruvy, E., Y. Lahav, and C. N. Noussair, 2007, "Traders' expectations in asset markets: experimental evidence," American Economic Review, 97:5, 1901-1920
• Honohan, P., 2008, "Risk management and the cost of the banking crisis," IIIS Discussion Paper, No. 262
• Jacobs, B. I., 2009, "Tumbling tower of Babel: subprime securitization and the credit crisis," Financial Analysts Journal, 66:2, 17-31
• Jensen, M. C., and W. H. Meckling, 1976, "Theory of the firm: managerial behavior, agency costs and ownership structure," Journal of Financial Economics, 3:4, 305-360
• Jorion, P., 2006, Value at risk: the new benchmark for managing financial risk, 3rd edition, McGraw-Hill
• Keys, B. J., T. Mukherjee, A. Seru, and V. Vig, 2010, "Did securitization lead to lax screening? Evidence from subprime loans," Quarterly Journal of Economics, forthcoming
• Kooijman, R., 2010, "Risk management: a managerial perspective," presentation at the IIR Seminar on Risk Management for Banks 2010, Amsterdam, 26-27 January
• Kuester, K., S. Mittnik, and M. S. Paolella, 2006, "Value-at-Risk prediction: a comparison of alternative strategies," Journal of Financial Econometrics, 4:1, 53-89
• Markowitz, H., 1952, "Portfolio selection," Journal of Finance, 7:1, 77-91
• Merton, R. C., 1994, "Influence of mathematical models in finance on practice: past, present and future," Philosophical Transactions (Royal Society), 347:1684, 451-462
• Patterson, S., 2010, The quants: how a new breed of math whizzes conquered Wall Street and nearly destroyed it, Random House
• Rajan, R. G., 2009, "The credit crisis and cycle-proof regulation," Federal Reserve Bank of St. Louis Review, September/October
• Rajan, R., 2008, "Bankers' pay is deeply flawed," Financial Times, 8 January
• Senior Supervisors Group, 2008, "Observations on risk management practices during the recent market turbulence," March 6
• Shojai, S., and G. Feiger, 2010, "Economists' hubris: the case of risk management," Journal of Financial Transformation, forthcoming
• Taleb, N. N., 2007, The black swan – the impact of the highly improbable, Random House, New York
• Thal Larsen, P., 2008, "The flaw in the logic of investment banks' largesse," Financial Times, 6 January
• Trichet, J.-C., 2009, "(Under-)pricing of risks in the financial sector," speech, 19 January
• Tucker, P., 2010, "Shadow banking, capital markets and financial stability," remarks at the Bernie Gerald Cantor (BGC) Partners Seminar, Bank of England, London, 21 January

Guidelines for manuscript submissions

Guidelines for authors

In order to aid our readership, we have established some guidelines to ensure that published papers meet the highest standards of thought leadership and practicality. The articles should, therefore, meet the following criteria:

1. Does this article make a significant contribution to this field of research?
2. Can the ideas presented in the article be applied to current business models? If not, is there a road map on how to get there?
3. Can your assertions be supported by empirical data?
4. Is my article purely abstract? If so, does it picture a world that can exist in the future?
5. Can your propositions be backed by a source of authority, preferably yours?
6. Would senior executives find this paper interesting?

Subjects of interest
All articles must be relevant and interesting to senior executives of the leading financial services organizations. They should assist in strategy formulation. The topics that are of interest to our readership include:

• Impact of e-finance on financial markets & institutions
• Marketing & branding
• Organizational behavior & structure
• Competitive landscape
• Operational & strategic issues
• Capital acquisition & allocation
• Structural readjustment
• Innovation & new sources of liquidity
• Leadership
• Financial regulations
• Financial technology

Manuscript submissions should be sent to
Prof. Shahin Shojai, Ph.D.
The Editor
Editor@capco.com

Capco
Broadgate West
9 Appold Street
London EC2A 2AP
Tel: +44 207 426 1500
Fax: +44 207 426 1501

Manuscript guidelines

All manuscript submissions must be in English.

Manuscripts should not be longer than 7,000 words each. The maximum number of A4 pages allowed is 14, including all footnotes, references, charts, and tables.

All manuscripts should be submitted by e-mail directly to editor@capco.com in the PC version of Microsoft Word. They should all use Times New Roman font, and font size 10.

Where tables or graphs are used in the manuscript, the respective data should also be provided within a Microsoft Excel spreadsheet format.

The first page must provide the full name(s), title(s), organizational affiliation of the author(s), and contact details of the author(s). Contact details should include address, phone number, fax number, and e-mail address.

Footnotes should be double-spaced and be kept to a minimum. They should be numbered consecutively throughout the text with superscript Arabic numerals.

For monographs
Aggarwal, R., and S. Dahiya, 2006, "Demutualization and cross-country merger of exchanges," Journal of Financial Transformation, Vol. 18, 143-150

For books
Copeland, T., T. Koller, and J. Murrin, 1994, Valuation: measuring and managing the value of companies, John Wiley & Sons, New York, New York

For contributions to collective works
Ritter, J. R., 1997, "Initial public offerings," in Logue, D., and J. Seward, eds., Warren Gorham & Lamont Handbook of Modern Finance, South-Western College Publishing, Ohio

For periodicals
Griffiths, W., and G. Judge, 1992, "Testing and estimating location vectors when the error covariance matrix is unknown," Journal of Econometrics, 54, 121-138

For unpublished material
Gillan, S., and L. Starks, 1995, "Relationship investing and shareholder activism by institutional investors," Working Paper, University of Texas



Request for papers — Deadline 8 July, 2010

The world of finance has undergone tremendous change in recent years.


Physical barriers have come down and organizations are finding it harder
to maintain competitive advantage within today’s truly global market place.
This paradigm shift has forced managers to identify new ways to manage
their operations and finances. The managers of tomorrow will, therefore,
need completely different skill sets to succeed.

It is in response to this growing need that Capco is pleased to publish the 'Journal of financial transformation,' a journal dedicated to the advancement of leading thinking in the field of applied finance.

The Journal, which provides a unique linkage between scholarly


research and business experience, aims to be the main source of
thought leadership in this discipline for senior executives, management
consultants, academics, researchers, and students. This objective can
only be achieved through relentless pursuit of scholarly integrity and
advancement. It is for this reason that we have invited some of the world’s
most renowned experts from academia and business to join our editorial
board. It is their responsibility to ensure that we succeed in establishing a
truly independent forum for leading thinking in this new discipline.

You can also contribute to the advancement of this field by submitting your
thought leadership to the Journal.

We hope that you will join us on our journey of discovery and help shape the
future of finance.

Prof. Shahin Shojai


Editor@capco.com

For more info, see opposite page

2010 The Capital Markets Company. VU: Prof. Shahin Shojai,


Prins Boudewijnlaan 43, B-2650 Antwerp
All rights reserved. All product names, company names and registered trademarks in
this document remain the property of their respective owners.

139
Design, production, and coordination: Cypres — Daniel Brandt and Pieter Vereertbrugghen
© 2010 The Capital Markets Company, N.V.
All rights reserved. This journal may not be duplicated in any way without the express
written consent of the publisher except in the form of brief excerpts or quotations for review
purposes. Making copies of this journal or any portion thereof for any purpose other than
your own is a violation of copyright law.
Capco offices

Amsterdam
Antwerp
Bangalore
Chicago
Frankfurt
Geneva
London
Luxembourg
Mumbai
New York
Paris
Pune
San Francisco
Toronto
Washington DC

www.capco.com T +32 3 740 10 00
