
Quantifying Operational Risk:

Possibilities and Limitations


Hansjörg Furrer
Swiss Life
Hansjoerg.Furrer@swisslife.ch
Risk-Based Supervision, ETH Zurich Summer 2004
1 June 2004
Quantifying OpRisk
Contents
A. The New Basel Capital Accord (Basel II)
B. The OpRisk management process
C. OpRisk measurement for regulatory purposes
D. Conclusions
E. References
Quantifying OpRisk 1
A. The New Basel Capital Accord (Basel II)
1988: Basel Accord (Basel I): minimum capital requirements against credit risk.
One standardized approach
1996: Amendment to Basel I: market risk.
1999: First Consultative Paper on the New Basel Capital Accord (Basel II)
April 2003: Third Consultative Paper (CP3) on Basel II
(www.bis.org/bcbs/bcbscp3.htm)
Aug 2003: Publication of the comments received on CP3:
- criticism of credit risk approach
- some comments on OpRisk methodology
Quantifying OpRisk 2
Oct 2003: Committee meeting in Madrid. Press release:
- credit risk approach will be reviewed
- no mention of OpRisk aspects?!
May 2004: Committee meeting in Basel: Consensus achieved on the remaining
technical issues regarding Basel II (e.g. specification of LGD parameters in
credit risk)
June 2004: Publication of the new Basel II framework
year-end 2006: Implementation of the standardized approaches
year-end 2007: Implementation of the advanced approaches
Quantifying OpRisk 3
What's new?
Standards on capital adequacy substantially revised
Rationale for the New Accord:
- improve risk management
- enhance financial stability
- provide incentives to adopt the more advanced risk-sensitive approaches
Structure of the New Accord (compare with Solvency II):
Three-pillar framework:
Pillar 1: minimal capital requirements (risk measurement)
Pillar 2: supervisory review of capital adequacy
Pillar 3: public disclosure
Quantifying OpRisk 4
What's new? (contd)
Two options for the measurement of credit risk:
- standard approach
- internal rating based approach (IRB)
Pillar 1 sets out the minimum capital requirements:

\[
\frac{\text{total amount of capital}}{\text{risk-weighted assets}} \geq 8\%
\]

MRC (minimum regulatory capital) \(\stackrel{\text{def}}{=}\) 8% of risk-weighted assets
Quantifying OpRisk 5
What's new? (contd)
New regulatory capital for OpRisk:
OpRisk: The risk of losses resulting from inadequate or
failed internal processes, people and systems or
from external events
Note: This definition excludes
- strategic risk
- reputational risk
- systemic risk
Quantifying OpRisk 6
Rationale for the explicit treatment of OpRisk
Banks' activities have become more complex and diverse (due to
globalization, deregulation, improved technology; specifically: e-commerce,
large-volume transactions, outsourcing, ...)
Risk mitigation techniques for credit and market risk may produce
other forms of risk
Growing number of operational loss event types
* Credit Suisse Chiasso affair (1977)
* Nick Leeson/Barings Bank, 1.3b (1995)
* Enron (largest US bankruptcy so far) (2001)
* Banque Cantonale Vaudoise, KBV Winterthur (2003)
Quantifying OpRisk 7
Rationale for the treatment of OpRisk (contd)
OpRisk management viewed as a comprehensive inclusive discipline
(comparable to credit and market risk mgmt)
7 Loss event types: Internal fraud
External fraud
Employment practices and workplace safety
Clients, products & business practices
Damage to physical assets
Business disruption and system failures
Execution, delivery & process management
Quantifying OpRisk 8
Basel II ≠ Solvency II
The difference between the two prudential regimes goes further in that
their actual objectives differ. The prudential objective of the Basel Accord is
to reinforce the soundness and stability of the international banking system.
To that end, the initial Basel Accord and the draft New Accord are directed
primarily at banks that are internationally active. The draft New Accord
attaches particular importance to the self-regulating mechanisms of a
market where practitioners are dependent on one another. In the insurance
sector, the purpose of prudential supervision is to protect policyholders
against the risk of (isolated) bankruptcy facing every insurance company.
The systematic risk, assuming that it exists in the insurance sector, has not
been deemed to be of sufficient concern to warrant minimum harmonization
of prudential supervisory regimes at international level; nor has it been the
driving force behind European harmonization in this field.
(EU Insurance Solvency Sub-Committee (2001))
Quantifying OpRisk 9
Risks within the SST framework
Quantifying OpRisk 10
B. The OpRisk management process
Reference: Basel Sound practices [4] for the management and
supervision of OpRisks
The Sound practices principles form the basis for Pillar 2 and 3
of the Basel proposals
OpRisk management process:
Identification
Assessment
Monitoring
Control/Mitigation
Quantifying OpRisk 11
The OpRisk management process (contd)
Previous wording: identification, measurement, monitoring and
control.
compulsory for those banks that want to adopt the standardized or
AMA approach
Possibly intended for all banks
Quantifying OpRisk 12
The OpRisk management process (contd)
Responsibilities:
- Banking supervisors/board of directors: ensure that an OpRisk management
framework is in place
- Senior management: responsible for the implementation
Identification and assessment of OpRisk:
- identify and assess the OpRisks inherent in all existing products, activities,
processes and systems
- assess OpRisks before new products, activities, processes and systems are
introduced or undertaken
Quantifying OpRisk 13
The OpRisk management process (contd)
How to identify/assess OpRisk?
- Self- or risk assessment (questionnaires, checklists, workshops,. . . )
- risk mapping
- KRI (key risk indicators). State variables: to be assessed. For example #
failed trades, staff turnover, system downtime, etc.
- Scorecards, KRD (key risk drivers): control variables, translating qualitative
assessments into quantitative assessments. Examples of KRD: # transactions,
system quality, ... Relation between KRIs and KRDs (see the sketch after this list):

\[
\mathrm{KRI} = \alpha + \sum_{k} \beta_k \, \mathrm{KRD}_k + \varepsilon
\]
- thresholds, limits (tied to risk indicators)
- measurement (tracking and recording systematically the frequency and severity
of individual loss events)
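A minimal sketch of how such a linear KRI/KRD relation could be calibrated by least squares; the driver and indicator data below are entirely hypothetical:

```python
import numpy as np

# Hypothetical monthly observations: two key risk drivers (KRD), e.g.
# # transactions and a system-quality score, and one KRI, e.g. # failed trades.
krd = np.array([[1200, 0.80], [1500, 0.70], [900, 0.90], [1800, 0.60], [1100, 0.85]])
kri = np.array([14.0, 22.0, 9.0, 31.0, 12.0])

# Fit KRI = alpha + sum_k beta_k * KRD_k + eps by ordinary least squares
design = np.column_stack([np.ones(len(kri)), krd])
coef, *_ = np.linalg.lstsq(design, kri, rcond=None)
alpha, beta = coef[0], coef[1:]

print("alpha =", alpha, "beta =", beta)
print("predicted KRI for (1300 transactions, quality 0.75):", alpha + beta @ [1300, 0.75])
```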
Quantifying OpRisk 14
Example: Questionnaire of the Erste Bank
Question 38: How would you rate the workload of the staff in your department with respect to overtime worked?
(d) The workload can be handled within normal working hours
(c) 80%: The workload can mostly be handled within normal working hours, but overtime occurs at peak times
(b) 20%: The workload can mostly only be handled with a considerable amount of overtime
(a) The workload is barely manageable; staff regularly work on weekends as well

Question 22: What is the availability of qualified staff like? Is it easy or difficult to refill open positions?
(d) It is no problem to find suitable qualified staff
(c) It is not always easy to find the right person for an open/new position immediately, but in the end it always works out without too much time and effort
(b) 100%: It is very difficult and time-consuming to fill an open/new position, since suitable staff are hard to find and often only become productive after intensive training
(a) It is almost impossible to always find the right staff for open/new positions; usually a new/open position can only be refilled through time-consuming internal training
Quantifying OpRisk 15
C. OpRisk measurement for regulatory purposes
Recall:
\[
\frac{\text{total amount of capital}}{\text{risk-weighted assets}} \geq 8\%
\]

Under Basel II: risk-weighted assets must include a capital charge for OpRisk

Notation: \(C_{\mathrm{OP}}\): capital charge for OpRisk

Three distinct approaches for determining \(C_{\mathrm{OP}}\):
Basic Indicator Approach
Standardized Approach
Advanced Measurement Approaches (AMA)
Quantifying OpRisk 16
Basic Indicator Approach
Capital charge:

\[
C^{\mathrm{BIA}}_{\mathrm{OP}} = \alpha \cdot GI
\]

\(C^{\mathrm{BIA}}_{\mathrm{OP}}\): capital charge under the Basic Indicator Approach

\(GI\): average annual gross income over the previous three years

\(\alpha = 15\%\) (set by the Committee)

No risk mitigation via insurance allowed
Quantifying OpRisk 17
Standardized Approach
Similar to the BIA, but on the level of each business line:
\[
C^{\mathrm{SA}}_{\mathrm{OP}} = \sum_{i=1}^{8} \beta_i \, GI_i, \qquad \beta_i \in [12\%, 18\%], \quad i = 1, 2, \ldots, 8
\]

8 business lines:
Corporate finance, Trading & sales, Retail banking, Commercial banking,
Payment & settlement, Agency services, Asset management, Retail brokerage
No risk mitigation via insurance allowed
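A minimal numerical sketch of the BIA and SA capital charges; the gross-income figures and the concrete choice of \(\beta_i\) are hypothetical, only \(\alpha = 15\%\) and the range \(\beta_i \in [12\%, 18\%]\) are from the Accord:

```python
# Hypothetical three-year average gross income per business line (CHF m)
gi = {"Corporate finance": 120, "Trading & sales": 300, "Retail banking": 450,
      "Commercial banking": 280, "Payment & settlement": 90, "Agency services": 60,
      "Asset management": 110, "Retail brokerage": 70}

# Basic Indicator Approach: alpha * total gross income, alpha = 15%
c_bia = 0.15 * sum(gi.values())

# Standardized Approach: sum of beta_i * GI_i, beta_i in [12%, 18%] (values assumed here)
beta = {"Corporate finance": 0.18, "Trading & sales": 0.18, "Retail banking": 0.12,
        "Commercial banking": 0.15, "Payment & settlement": 0.18, "Agency services": 0.15,
        "Asset management": 0.12, "Retail brokerage": 0.12}
c_sa = sum(beta[bl] * gi[bl] for bl in gi)

print(f"C_OP(BIA) = {c_bia:.1f}m, C_OP(SA) = {c_sa:.1f}m")
```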
Quantifying OpRisk 18
Advanced Measurement Approaches (AMA)
Allows banks to use their own methods for assessing their exposure to OpRisk
significant capital reduction should result compared to BIA and SA
Preconditions: Bank must meet
- qualitative and
- quantitative
standards before using the AMA
Qualitative standards: OpRisk management framework must be in place, see
Section B.
Quantifying OpRisk 19
Quantitative standards
the Committee does not stipulate which (analytical) approach banks must use. . .
. . . however, banks must demonstrate that their measurement system is
sufficiently granular to capture severe tail loss events, e.g. 99.9% VaR
Capital charge defined in the following way:
MRC = EL + UL
If ELs are already captured in a bank's internal business practices, then the capital
charge is set at the unexpected loss alone:
MRC = UL
Quantifying OpRisk 20
Quantitative standards (contd)
In a normally distributed world with \(\alpha = 0.999\):

\[
\mathrm{UL} = \mathrm{VaR}_\alpha(X) = \mu + 3.09\,\sigma, \qquad X \sim \mathcal{N}(\mu, \sigma^2)
\]
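The factor 3.09 is just the 99.9% standard normal quantile; a quick check, with hypothetical \(\mu\) and \(\sigma\):

```python
from scipy.stats import norm

# 99.9% quantile of the standard normal: Phi^{-1}(0.999) ~ 3.09
print(norm.ppf(0.999))                                  # 3.0902...

# with (hypothetical) mu = 10, sigma = 4: VaR_0.999 = mu + 3.09 * sigma
mu, sigma = 10.0, 4.0
print(norm.ppf(0.999, loc=mu, scale=sigma), mu + 3.09 * sigma)
```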
Internally generated OpRisk measures must be based on a 5-year observation
period
Internal data: banks must record internal loss data. Problem: the data for
many (business line/loss event type) cells is sparse. Therefore: aggregation with
External data: a bank's OpRisk measurement system must also include external
data
Quantifying OpRisk 21
External consortium data
Data consortia (e.g. Operational Risk data eXchange ORX)
exchange loss data from member banks anonymously
ORX member banks: ABN Amro, BNP Paribas, Commerzbank,
Deutsche Bank, JP Morgan Chase,. . .
(number of member banks increasing)
Mathematical problem: How to scale severity and frequency of the
external data to fit into the internal loss data collection?
Quantifying OpRisk 22
Scaling of OpRisk Data
Example:
- Bank ABC: B/S: CHF 250b
- Bank ORX: B/S: CHF 415b, OpRisk loss CHF 82m for cell (i, k)
- Scaling process: 82 × (250/415) = CHF 49.4m
Is this reasonable? Does a relationship between size and risk exist?
If so, is this relationship linear?
Quantifying OpRisk 23
OpRisk Loss Data
QIS2 exercise ([3]): gathering information concerning OpRisk loss events
loss data mapped according to the 8 × 7 business line/loss event type cells
- # of participating banks: 30 (11 countries)
- time span: three years (1998-2000)
- # reported loss events: 11,300
- total loss amount EUR 2.6b (only events exceeding EUR 10,000)
- largest contributions:
(Retail Banking/Clients, Products and Business Services)
(Trading and Sales/Execution, Delivery, and Process Management)
(Commercial banking/External Fraud)
- two cells have no reported losses
- extraordinary losses presumably absent from this database
Quantifying OpRisk 24
OpRisk Loss Data (contd)
LDCE exercise ([5]): Loss Data Collection Exercise for OpRisk
extension and refinement of the QIS2 exercise
Characteristics of the LDCE:
- # of participating banks: 89 (19 countries)
- time span: 1 year (2001)
- # reported loss events: 47,200
- total loss amount EUR 7.8b (only events exceeding EUR 10,000)
- considerable clustering around certain business lines/event type cells
Quantifying OpRisk 25
Comparison of the LDCE with the QIS2 data

[Figure 1a: Percent frequency by business line, 2000 vs. 2001 (Corporate Finance, Trading and Sales, Retail Banking, Commercial Banking, Payment and Settlement, Agency and Custody Services, Asset Management, Retail Brokerage, No Information)]

[Figure 1b: Percent frequency by event type, 2000 vs. 2001 (Internal Fraud, External Fraud, Employment Practices and Workplace Safety, Clients, Products and Business Services, Damage to Physical Assets, Business Disruption and System Failures, Execution, Delivery, and Process Management, No Information)]
Quantifying OpRisk 26
Comparison of the LDCE with the QIS2 data

[Figure 2a: Percent severity by business line, 2000 vs. 2001]

[Figure 2b: Percent severity by event type, 2000 vs. 2001]
Quantifying OpRisk 28
Comparison of the LDCE with the QIS2 data
Note:
- the QIS2 data shown in the previous graphs (highlighted in orange) only
depicts the data collected in 2000 (enabling an annual comparison)
- significant differences in the samples of participating banks!
- Frequency of loss events:
roughly no changes by business line (Figure 1a)
by event type (Figure 1b), we observe an increase in External Fraud and Employment
Practices, whereas the share of Execution, Delivery, and Process Management decreased
- Severity of loss events:
Striking changes by event types (Figure 2b): the distribution of loss amounts is sensitive
to low frequency/high severity events (e.g. 09/11)
Quantifying OpRisk 30
Comparison of the LDCE with the QIS2 data
To assess the extent of risk, it is necessary to assess the extent of
variability of both number and amount of loss events.
The RMG (the Basel Committee's Risk Management Group) is undertaking internal analysis to address such issues,
see also, among others, the papers by Pezier [14].
Quantifying OpRisk 31
Some internal data
[Figure: internal operational loss data 1992–2002 for loss types 1, 2 and 3, and the pooled operational losses]
Quantifying OpRisk 32
Modeling issues
Stylized facts about OP risk losses
- Loss occurrence times are irregularly spaced in time
(selection bias, economic cycles, regulation, management interactions,. . . )
- Loss amounts show extremes
Large losses are of main concern!
Repetitive vs non-repetitive losses ([14])
Repetitive losses: loss events that may occur more than once a
week (e.g. settlement risk, minor external fraud, human error in transaction
processing)
Quantifying OpRisk 33
Modeling issues (contd)
Non-repetitive losses:
- ordinary: from once a week to once in a generation (≈ 2/3 of the risk
categories reported in the QIS studies)
- extraordinary: large but rare
Immaterial losses: to be ignored (both EL and UL are negligible)
Red flag: Are observations in line with modeling assumptions?
Example: the iid assumption implies
- NO structural changes in the data as time evolves
- irrelevance of which loss is denoted \(X_1\), which one \(X_2\), ...
Quantifying OpRisk 34
A mathematical (actuarial) model
OpRisk loss database

\[
\Big\{ X^{(t,k)}_\ell \Big\}, \qquad t \in \{T-n+1, \ldots, T-1, T\} \ \text{(years)},
\]
\[
k \in \{1, 2, \ldots, 7\} \ \text{(loss event type)}, \qquad \ell \in \big\{1, 2, \ldots, N^{(t,k)}\big\} \ \text{(number of losses)}
\]

7 loss event types: Internal fraud
External fraud
Employment practices and workplace safety
Clients, products & business practices
Damage to physical assets
Business disruption and system failures
Execution, delivery & process management
Quantifying OpRisk 35
A mathematical (actuarial) model (contd)
Truncation:

\[
\tilde{X}^{(t,k)}_\ell = X^{(t,k)}_\ell \, \mathbf{1}\big\{ X^{(t,k)}_\ell > d^{(t,k)} \big\}
\]

A further index \(j\), \(j \in \{1, \ldots, 8\}\), indicating the business line can be introduced
(suppressed in the sequel)

Estimate a risk measure for \(F_{L^{(T+1,k)}}(x) = \mathbb{P}\big[ L^{(T+1,k)} \le x \big]\) like

- Op-VaR\(^{(T+1,k)}_\alpha = F^{\leftarrow}_{L^{(T+1,k)}}(\alpha)\)
- Op-ES\(^{(T+1,k)}_\alpha = \mathbb{E}\big[ L^{(T+1,k)} \mid L^{(T+1,k)} > \text{Op-VaR}^{(T+1,k)}_\alpha \big]\)

where

\[
L^{(T+1,k)} = \sum_{\ell=1}^{N^{(T+1,k)}} X^{(T+1,k)}_\ell
\]

is the OpRisk loss for loss event type \(k\) over the period \([T, T+1]\).
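A minimal Monte Carlo sketch of Op-VaR and Op-ES for a single loss event type; the Poisson frequency and lognormal severity below are illustrative assumptions only, not models prescribed by the Accord:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu, sigma, alpha, n_sim = 30.0, 10.0, 2.0, 0.999, 200_000   # assumed parameters

# simulate the annual aggregate loss L = sum_{l=1}^{N} X_l in each scenario
n_losses = rng.poisson(lam, size=n_sim)
agg = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in n_losses])

op_var = np.quantile(agg, alpha)          # Op-VaR_alpha: empirical 99.9% quantile
op_es = agg[agg > op_var].mean()          # Op-ES_alpha: mean loss beyond Op-VaR
print(f"Op-VaR_{alpha} = {op_var:.3e}, Op-ES_{alpha} = {op_es:.3e}")
```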
Quantifying OpRisk 36
A mathematical (actuarial) model (contd)
Discussion: Recall the stylized facts
- Xs are heavy-tailed
- N shows non-stationarity
Conclusions:
- \(F_X(x)\) and \(F_{L^{(t,k)}}(x)\) difficult to estimate
- In-sample estimation of \(\mathrm{VaR}_\alpha\) (\(\alpha\) large) almost impossible!
- actuarial tools may be useful:
Approximation (translated gamma/lognormal)
Inversion methods (FFT)
Recursive methods (Panjer; see the sketch below)
Simulation
Extreme Value Theory (EVT)
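A minimal sketch of Panjer's recursion for a compound Poisson loss with a discretized severity distribution; the frequency and severity values are made up for illustration:

```python
import numpy as np

def panjer_poisson(lam, f, n_max):
    """Distribution of a compound Poisson sum via Panjer's recursion.
    f[j] = P(severity = j monetary units), j = 0, ..., len(f)-1 (discretized severity)."""
    g = np.zeros(n_max + 1)
    g[0] = np.exp(-lam * (1.0 - f[0]))                  # P(aggregate loss = 0)
    for n in range(1, n_max + 1):
        j = np.arange(1, min(n, len(f) - 1) + 1)
        g[n] = (lam / n) * np.sum(j * f[j] * g[n - j])  # Poisson case: a = 0, b = lam
    return g

# toy example: on average 3 losses/year, severity uniform on {1, ..., 10} units
f = np.zeros(11)
f[1:] = 0.1
g = panjer_poisson(3.0, f, 300)
cdf = np.cumsum(g)
print("99.9% quantile of the aggregate loss:", np.searchsorted(cdf, 0.999), "units")
```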
Quantifying OpRisk 37
How accurate are VaR-estimates?
Make inference about the tail decay of the aggregate loss \(L^{(t,k)}\) via the tail
decay of the individual losses \(X^{(t,k)}_\ell\):

\[
L = \sum_{\ell=1}^{N} X_\ell, \qquad 1 - F_X(x) \sim x^{-\alpha} h(x), \ x \to \infty
\quad \Longrightarrow \quad 1 - F_L(x) \sim \mathbb{E}[N]\, x^{-\alpha} h(x), \ x \to \infty.
\]

Assumptions: \((X_m)\) iid \(\sim F\) and, for some \(\xi\), \(\beta\) and \(u\) large,

\[
F_u(x) := \mathbb{P}[X - u \le x \mid X > u] = G_{\xi, \beta(u)}(x)
\]

where

\[
G_{\xi,\beta}(x) =
\begin{cases}
1 - \big(1 + \xi x / \beta\big)^{-1/\xi}, & \xi \neq 0, \\
1 - e^{-x/\beta}, & \xi = 0.
\end{cases}
\]
Quantifying OpRisk 38
How accurate are VaR-estimates? (contd)
Tail and quantile estimates:

\[
1 - \widehat{F}_X(x) = \frac{N_u}{n} \Big( 1 + \hat{\xi}\, \frac{x - u}{\hat{\beta}} \Big)^{-1/\hat{\xi}}, \qquad x > u.
\]

\[
\widehat{\mathrm{VaR}}_\alpha = \hat{q}_\alpha = u + \frac{\hat{\beta}}{\hat{\xi}} \left[ \Big( \frac{N_u}{n(1-\alpha)} \Big)^{\hat{\xi}} - 1 \right] \tag{1}
\]
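A minimal sketch of the GPD tail fit and the quantile estimator (1), applied to simulated Pareto losses; the data, threshold choice and sample size are purely illustrative:

```python
import numpy as np
from scipy.stats import genpareto, pareto

rng = np.random.default_rng(1)
losses = pareto(b=2.0).rvs(size=1000, random_state=rng)   # 1 - F(x) = x^{-2}, x >= 1

u = np.quantile(losses, 0.90)                  # threshold = empirical 90% quantile
exc = losses[losses > u] - u                   # excesses over u
xi, _, beta = genpareto.fit(exc, floc=0)       # MLE of the GPD shape/scale

n, N_u, alpha = len(losses), len(exc), 0.999
var_hat = u + beta / xi * ((N_u / (n * (1 - alpha))) ** xi - 1)   # estimator (1)

print(f"xi = {xi:.2f}, beta = {beta:.2f}, VaR_0.999 estimate = {var_hat:.1f}")
print("true 99.9% quantile:", pareto(b=2.0).ppf(0.999))
```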
Idea: Comparison of estimated quantiles with the corresponding
theoretical ones by means of a simulation study ([13]).
Quantifying OpRisk 39
How accurate are VaR-estimates? (contd)
Simulation procedure:
- Choose \(F\) and fix \(0 < \alpha < 1\), \(N_u \in \{25, 50, 100, 200\}\)
  (\(N_u\): # of data points above \(u\))
- Calculate \(u = q_{\alpha_0}\) and the true value of the quantile \(q_\alpha\)
- Sample \(N_u\) independent points of \(F\) above \(u\) by the rejection method. Record
  the total number \(n\) of sampled points this requires
- Estimate \(\xi, \beta\) by fitting the GPD to the \(N_u\) exceedances over \(u\) by means of MLE
- Determine \(\hat{q}_\alpha\) according to (1)
- Repeat \(N\) times the above to arrive at estimates of \(\mathrm{Bias}(\hat{q}_\alpha)\) and \(\mathrm{SE}(\hat{q}_\alpha)\)
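A compact sketch of this procedure under illustrative assumptions (Pareto severity with tail index 2, threshold at the 90% quantile, \(N_u = 100\), \(N = 500\) repetitions), reporting the percentage bias and percentage SE introduced on the next slide:

```python
import numpy as np
from scipy.stats import genpareto, pareto

rng = np.random.default_rng(2)
dist, alpha, N_u, N_rep = pareto(b=2.0), 0.999, 100, 500
u, q_true = dist.ppf(0.90), dist.ppf(alpha)             # threshold and true quantile

estimates = np.empty(N_rep)
for i in range(N_rep):
    x = dist.rvs(size=50 * N_u, random_state=rng)       # ample sample from F
    idx = np.flatnonzero(x > u)[:N_u]                    # first N_u exceedances of u
    n = idx[-1] + 1                                      # total # of points this required
    xi, _, beta = genpareto.fit(x[idx] - u, floc=0)      # GPD fit to the excesses by MLE
    estimates[i] = u + beta / xi * ((N_u / (n * (1 - alpha))) ** xi - 1)   # estimator (1)

print("percentage bias:", (estimates.mean() - q_true) / q_true)
print("percentage SE  :", np.sqrt(np.mean((estimates - q_true) ** 2)) / q_true)
```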
Quantifying OpRisk 40
How accurate are VaR-estimates? (contd)
Accuracy of the quantile estimate expressed in terms of bias and
standard error:

\[
\mathrm{Bias}(\hat{q}_\alpha) = \mathbb{E}[\hat{q}_\alpha - q_\alpha], \qquad
\mathrm{SE}(\hat{q}_\alpha) = \mathbb{E}\big[ (\hat{q}_\alpha - q_\alpha)^2 \big]^{1/2}
\]

estimated by

\[
\widehat{\mathrm{Bias}}(\hat{q}_\alpha) = \frac{1}{N} \sum_{j=1}^{N} \big( \hat{q}^{\,j}_\alpha - q_\alpha \big), \qquad
\widehat{\mathrm{SE}}(\hat{q}_\alpha) = \left[ \frac{1}{N} \sum_{j=1}^{N} \big( \hat{q}^{\,j}_\alpha - q_\alpha \big)^2 \right]^{1/2}
\]

For comparison purposes (different distributions) introduce

\[
\text{Percentage Bias} := \frac{\widehat{\mathrm{Bias}}(\hat{q}_\alpha)}{q_\alpha}, \qquad
\text{Percentage SE} := \frac{\widehat{\mathrm{SE}}(\hat{q}_\alpha)}{q_\alpha}
\]
Quantifying OpRisk 41
How accurate are VaR-estimates? (contd)
Criterion for a good estimate: Percentage Bias and
Percentage SE should be small, e.g.

                   α = 0.99    α = 0.999
Percentage Bias    ≤ 0.05      ≤ 0.10
Percentage SE      ≤ 0.30      ≤ 0.60
Quantifying OpRisk 42
Example: Pareto distribution, \(1 - F_X(x) = x^{-\alpha}\), \(\alpha = 2\)

Threshold \(u = F^{\leftarrow}(x_q)\); goodness of \(\widehat{\mathrm{VaR}}_\alpha\):

q = 0.7:
- confidence level 0.99: a minimum number of 100 exceedances (corresponding to 333 observations) is required to ensure accuracy w.r.t. bias and standard error.
- confidence level 0.999: a minimum number of 200 exceedances (corresponding to 667 observations) is required to ensure accuracy w.r.t. bias and standard error.

q = 0.9:
- confidence level 0.99: full accuracy can be achieved with the minimum number of 25 exceedances (corresponding to 250 observations).
- confidence level 0.999: a minimum number of 100 exceedances (corresponding to 1000 observations) is required to ensure accuracy w.r.t. bias and standard error.
Quantifying OpRisk 43
Example: Pareto distribution, \(1 - F_X(x) = x^{-\alpha}\), \(\alpha = 1\)

Threshold \(u = F^{\leftarrow}(x_q)\); goodness of \(\widehat{\mathrm{VaR}}_\alpha\):

q = 0.7:
- confidence level 0.99: for all numbers of exceedances up to 200 (corresponding to a minimum of 667 observations), the VaR estimates fail to meet the accuracy criteria.
- confidence level 0.999: for all numbers of exceedances up to 200 (corresponding to a minimum of 667 observations), the VaR estimates fail to meet the accuracy criteria.

q = 0.9:
- confidence level 0.99: a minimum number of 100 exceedances (corresponding to 1000 observations) is required to ensure accuracy w.r.t. bias and standard error.
- confidence level 0.999: a minimum number of 200 exceedances (corresponding to 2000 observations) is required to ensure accuracy w.r.t. bias and standard error.
Quantifying OpRisk 44
How accurate are VaR-estimates? (contd)
Large number of observations necessary to achieve targeted
accuracy.
Minimum number of observations increases as the tails become
thicker ([13]).
Remember: The simulation study was done under idealistic
assumptions. OpRisk losses, however, typically do NOT fulfill
these assumptions.
Quantifying OpRisk 45
Pricing risk under incomplete information
Recall: \(L^{(T+1,k)}\): OpRisk loss for loss event type \(k\) over the period \([T, T+1]\),

\[
L^{(T+1,k)} = \sum_{\ell=1}^{N^{(T+1,k)}} X^{(T+1,k)}_\ell
\]

Question: Suppose we have calculated risk measures \(\varrho^{(T+1,k)}\), \(k \in \{1, \ldots, 7\}\),
for each loss event type. When can we consider

\[
\sum_{k=1}^{7} \varrho^{(T+1,k)}
\]

a good risk measure for the total loss \(L_{T+1} = \sum_{k=1}^{7} L^{(T+1,k)}\)?
Quantifying OpRisk 46
Pricing risk under incomplete information (contd)
Answer: Ingredients
- (non-) coherence of risk measures (Artzner, Delbaen, Eber, Heath
framework)
- optimization problem: given \(\varrho^{(T+1,k)}\), \(k \in \{1, \ldots, 7\}\), what is
the worst case for the overall risk \(L_{T+1}\)?
Solution: using copulas in [9] and references therein (a simulation sketch follows below).
- aggregation of banking risks [1]
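A minimal simulation sketch of the aggregation question, restricted to two loss types with assumed lognormal margins and a Gaussian copula: it compares the VaR of the total loss under this one dependence model with the sum of the stand-alone VaRs, which, without knowledge of the dependence, need not bound the worst case:

```python
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(3)
alpha, n, rho = 0.999, 200_000, 0.5
margins = [lognorm(s=1.5, scale=1.0), lognorm(s=2.0, scale=0.5)]   # assumed loss-type margins

# Gaussian copula: correlated normals -> uniforms -> marginal quantiles
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = norm.cdf(z)
losses = np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(margins)])

var_sum = np.quantile(losses.sum(axis=1), alpha)                     # VaR of the aggregate loss
sum_var = sum(np.quantile(losses[:, i], alpha) for i in range(2))    # sum of stand-alone VaRs
print(f"VaR_0.999(L1 + L2) = {var_sum:.1f},  VaR_0.999(L1) + VaR_0.999(L2) = {sum_var:.1f}")
```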
Quantifying OpRisk 47
A ruin-theoretic problem motivated by OpRisk
OpRisk process:

\[
V^{(k)}_t = u^{(k)} + p^{(k)}(t) - L^{(k)}_t, \qquad t \ge 0,
\]

for some initial capital \(u^{(k)}\) and a premium function \(p^{(k)}(t)\) satisfying
\(\mathbb{P}\big[ L^{(k)}_t - p^{(k)}(t) \to -\infty \big] = 1\).

Given \(\varepsilon > 0\), calculate \(u^{(k)}(\varepsilon)\) such that

\[
\mathbb{P}\Big[ \inf_{T \le t \le T+1} \Big( u^{(k)}(\varepsilon) + p^{(k)}(t) - L^{(k)}_t \Big) < 0 \Big] \le \varepsilon \tag{2}
\]

\(u^{(k)}(\varepsilon)\) is a risk capital charge (internal)
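A minimal Monte Carlo sketch of the capital charge \(u^{(k)}(\varepsilon)\) in (2), under purely illustrative assumptions (Poisson loss arrivals, Pareto severities, linear premium \(p(t) = ct\)); it uses the fact that, with continuously accruing premiums, ruin can only occur at loss instants:

```python
import numpy as np

rng = np.random.default_rng(4)
lam, c, eps, n_sim = 25.0, 65.0, 0.01, 100_000     # frequency, premium rate, target eps

worst_deficit = np.empty(n_sim)
for i in range(n_sim):
    n = rng.poisson(lam)                                  # number of losses in [T, T+1]
    t = np.sort(rng.uniform(0.0, 1.0, size=n))            # loss occurrence times
    x = (1.0 - rng.uniform(size=n)) ** (-0.5)             # Pareto severities: 1 - F(x) = x^{-2}
    s = np.cumsum(x)                                      # aggregate loss just after each event
    worst_deficit[i] = np.max(s - c * t, initial=0.0)     # largest shortfall before capital

u_eps = np.quantile(worst_deficit, 1.0 - eps)             # smallest u with ruin prob <= eps
print(f"estimated capital charge u({eps}) = {u_eps:.1f}")
```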
Quantifying OpRisk 48
Ruin-theoretic problem (contd)
Solving for (2) is difficult:
- complicated loss process \(L^{(k)}_t\)
- heavy-tailed case
- finite time horizon \([T, T+1]\)

Classical risk theory revisited:

\[
Y(t) = u + ct - \sum_{k=1}^{N(t)} X_k = u + ct - S^N_X(t)
\]

\[
\psi(u) = \mathbb{P}\Big[ \sup_{t \ge 0} \big( S^N_X(t) - ct \big) > u \Big]
\]
Quantifying OpRisk 49
Ruin-theoretic problem (contd)
Pro memoria: Cramér-Lundberg Theorem (light-tailed case):

Assume that there exists a constant \(\kappa > 0\) fulfilling

\[
h(\kappa) = \frac{\kappa c}{\lambda}, \qquad h(r) := \mathbb{E}\big[ e^{rX} \big] - 1,
\]

and that \(\int_0^\infty x\, e^{\kappa x}\, d\big( \lambda G_I(x)/c \big) =: \kappa_1 < \infty\),
where \(\lambda\) is the Poisson intensity, \(\mu = \mathbb{E}[X_1]\) and \(G_I(x) = \int_0^x \mathbb{P}[X_1 > y]\, dy\). Then

\[
\psi(u) \sim \frac{1 - \lambda\mu/c}{\kappa\, \kappa_1}\, e^{-\kappa u}, \qquad u \to \infty.
\]
Quantifying OpRisk 50
Ruin-theoretic problem (contd)
Important: Net-profit condition:

\[
\mathbb{P}\Big[ \lim_{t \to \infty} \big( S^N_X(t) - ct \big) = -\infty \Big] = 1
\]

Heavy-tailed case: Let the claim size distribution be such that
\(\mathbb{P}[X_1 > x] \sim x^{-(\alpha+1)} L(x)\), \(L\) slowly varying. Then

\[
\psi(u) \sim \mathrm{const} \cdot u^{-\alpha} L(u), \qquad u \to \infty.
\]

Now assume for some general loss process \((S(t))\):

\[
\mathbb{P}\Big[ \lim_{t \to \infty} \big( S(t) - ct \big) = -\infty \Big] = 1
\]

\[
\psi_1(u) = \mathbb{P}\Big[ \sup_{t \ge 0} \big( S(t) - ct \big) > u \Big] \sim \mathrm{const} \cdot u^{-\alpha} L(u), \qquad u \to \infty. \tag{3}
\]
Quantifying OpRisk 51
Ruin-theoretic problem (contd)
Question: How much can we change S keeping (3)?
Solution: use the time change \(S_\Delta(t) = S(\Delta(t))\):

\[
\psi_\Delta(u) = \mathbb{P}\Big[ \sup_{t \ge 0} \big( S(\Delta(t)) - ct \big) > u \Big]
\]

Under some technical conditions on \(\Delta\) and \(S\), general models are
given so that

\[
\lim_{u \to \infty} \frac{\psi_\Delta(u)}{\psi_1(u)} = 1,
\]
i.e. ultimate ruin behaves similarly under the time change
Quantifying OpRisk 52
Ruin-theoretic problem (contd)
Example:
- start from the homogeneous Poisson case (classical Cramer-
Lundberg, heavy-tailed case)
- use the time change \(\Delta\) to transform to changes in intensities motivated by
operational risk, see [11].
Quantifying OpRisk 53
D. Conclusions
OpRisk ≠ market risk, credit risk
all risk types (market, credit, Op) must be considered
simultaneously, as the reduction in one risk type is at the expense
of an increase in another type
(Internal) OpRisk loss databases must grow
Standard actuarial methods (including EVT) aiming to derive
capital charges for OpRisks are of limited use due to
- lack of data
- inconsistency of the data with the modeling assumptions
Quantifying OpRisk 54
Conclusions (contd)
interesting source of mathematical problems
challenges: choice of risk measures, aggregation of risk measures
Pillar 2 and 3 (cf. Basel Sound Practices [4]) play an important part in
the OpRisk management process
Quantifying OpRisk 55
E. References
[1] Alexander, C., and Pezier, P. (2003). Assessment and aggregation of banking
risks. 9th IFCI Annual Risk Management Round Table, International Financial
Risk Institute (IFCI).
[2] Basel Committee on Banking Supervision. Working Paper on the Regulatory
Treatment of Operational Risk. September 2001. BIS, Basel, Switzerland,
www.bis.org/publ/bcbs_wp8.htm
[3] Basel Committee on Banking Supervision. The Quantitative Impact
Study for Operational Risk: Overview of Individual Loss Data
and Lessons Learned. January 2002. BIS, Basel, Switzerland,
www.bis.org/bcbs/qis/qisopriskresponse.pdf
[4] Basel Committee on Banking Supervision. Sound Practices for the
Management and Supervision of Operational Risk. July 2002. BIS, Basel,
Switzerland, www.bis.org/publ/bcbs91.htm
Quantifying OpRisk 56
[5] Basel Committee on Banking Supervision. The 2002 Loss Data
Collection Exercise for Operational Risk: Summary of the Data Collected.
March 2003. BIS, Basel, Switzerland, www.bis.org/bcbs/qis/ldce2002.pdf
[6] Basel Committee on Banking Supervision. Third Consultative Paper on
The New Basel Capital Accord. 29 April 2003. BIS, Basel, Switzerland,
www.bis.org/bcbs/bcbscp3.htm
[7] Center for Financial Studies. Latest Developments in Managing Operational
Risk. March 2004. CFS Research Conference, Eltville. www.ifk-cfs.de/English
[8] Embrechts, P., Furrer, H.J., and Kaufmann, R. (2003). Quantifying Regulatory
Capital for Operational Risk. Derivatives Use, Trading and Regulation,
Vol. 9, No. 3, 217-233. Also available on www.bis.org/bcbs/cp3comments.htm
[9] Embrechts, P., Hoeing, A., and Juri, A. (2003). Using copulae to bound
the Value-at-Risk for functions of dependent risks. Finance and Stochastics,
Vol. 7, No 2, 145-167.
Quantifying OpRisk 57
[10] Embrechts, P., Kaufmann, R., and Samorodnitsky, G. (2002). Ruin theory
revisited: stochastic models for operational risk. Submitted.
[11] Embrechts, P., and Samorodnitsky, G. (2003). Ruin problem and how fast
stochastic processes mix. Annals of Applied Probability, Vol. 13, 1-36.
[12] Geiger, H. (2000). Regulating and Supervising Operational Risk for Banks.
Working paper, University of Zurich.
[13] McNeil, A. J., and Saladin, T. (1997) The peaks over thresholds method
for estimating high quantiles of loss distributions. Proceedings of XXVIIth
International ASTIN Colloquium, Cairns, Australia, 23-43.
[14] Pezier, J. (2002) Operational Risk Management. ISMA Discussion Papers
in Finance 2002-21. To appear in Mastering Operational Risk, FT-Prentice
Hall, 2003.
Quantifying OpRisk 58
