- PROPERTIES
- REGULAR MARKOV CHAINS
- ABSORBING MARKOV CHAINS
Introduction
Powers of Matrices
Applications
Andrei Markov
1856 -- 1922
2) Brand loyalty: stay with brand A, switch to brand A, or switch away from brand A
3) Brownian motion
Product Loyalty
A marketing campaign has the effect that:
- 80% of consumers who use brand A stay with it (so 20% switch away from it)
- 60% of consumers who use other brands switch to brand A
What happens in the long run?
Problem: FEEDBACK!
[Transition diagram: from A, stay with probability 0.8 or switch away with probability 0.2; from A', switch to A with probability 0.6 or stay away with probability 0.4.]

                    next state
                     A     A'
current state   A  [ 0.8  0.2 ]
                A' [ 0.6  0.4 ]
[Probability tree: initial split 0.1 / 0.9 between A and A', then weekly branches with probabilities 0.8 / 0.2 out of A and 0.6 / 0.4 out of A', giving the states after weeks 1, 2, and 3.]
Probabilities

Probability of using brand A after one week of marketing:

P(A1) = P(A0 and A1) + P(A0' and A1)
      = P(A0)P(A1|A0) + P(A0')P(A1|A0')
      = 0.1*0.8 + 0.9*0.6 = 0.62

First state matrix:

S1 = S0 P = [ 0.1  0.9 ] [ 0.8  0.2 ] = [ 0.62  0.38 ]
                         [ 0.6  0.4 ]
After week 2:

S1 = S0 P
S2 = S1 P = S0 P P = S0 P^2
S3 = S2 P = S0 P^2 P = S0 P^3
...
Sk = Sk-1 P = S0 P^k
P^2 = [ 0.76  0.24 ]    P^3 = [ 0.752  0.248 ]
      [ 0.72  0.28 ]          [ 0.744  0.256 ]

P^4 = [ 0.7504  0.2496 ]    P^16 = [ 0.7500000001  0.2499999999 ]
      [ 0.7488  0.2512 ]           [ 0.7500000001  0.2499999999 ]

Limiting matrix: P^k approaches [ 0.75  0.25 ]
                                [ 0.75  0.25 ]
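These powers are easy to verify numerically; a minimal sketch with NumPy, using the transition matrix from the example:

```python
import numpy as np

# Transition matrix for the brand-loyalty chain (states A, A').
P = np.array([[0.8, 0.2],
              [0.6, 0.4]])

# Rows of the powers converge to the steady state [0.75, 0.25].
P4 = np.linalg.matrix_power(P, 4)
P16 = np.linalg.matrix_power(P, 16)
print(P4)
print(P16)
```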
Running the marketing campaign for a long time is ineffective: after 4 weeks, 74.896% are already buying brand A, and in the next 12 weeks only about 0.104% more switch to brand A. (Note that these numbers are overly precise; the model cannot be that good.)
Question: Does P^k always approach a limiting matrix?

P = [ 0  1 ]    P^2 = [ 1  0 ]    P^3 = [ 0  1 ]
    [ 1  0 ]          [ 0  1 ]          [ 1  0 ]

P^(2k) = [ 1  0 ]    P^(2k+1) = [ 0  1 ]
         [ 0  1 ]               [ 1  0 ]

NO! The powers oscillate between two matrices, so there is no limiting matrix.
- STATIONARY MATRICES
- REGULAR MARKOV CHAINS
- APPLICATIONS
- APPROXIMATIONS
Transition matrix:

        A     A'
A    [ 0.8   0.2 ]
A'   [ 0.6   0.4 ]

Initial shares: A: 10%, A': 90%

S0 = [ 0.1  0.9 ]
S1 = S0 P = [ 0.62  0.38 ]
S2 = S0 P^2 = [ 0.724  0.276 ]
S4 = [ 0.74896  0.25104 ]
S10 = [ 0.7499  0.2501 ]
S20 = [ 0.749999  0.250001 ]
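The sequence S0, S1, S2, ... can be reproduced by repeated multiplication; a small NumPy sketch:

```python
import numpy as np

# Week-by-week state matrices S_k = S_0 P^k for the brand-loyalty
# example, starting from 10% brand A / 90% other brands.
P = np.array([[0.8, 0.2],
              [0.6, 0.4]])
S = np.array([0.1, 0.9])

for week in range(1, 21):
    S = S @ P
    if week in (1, 2, 4, 10, 20):
        print(week, S)
```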
Questions: Does Sk always approach a steady state matrix? Does P^k always approach a limiting matrix?
Regular Matrices
Regular Markov Chains
P = [ 0    1   ]    P^2 = [ 0.5   0.5  ]
    [ 0.5  0.5 ]          [ 0.25  0.75 ]

P = [ 0    0.2  0.8 ]    P^2 = [ 0.5   0.38  0.12 ]
    [ 0.1  0.3  0.6 ]          [ 0.39  0.35  0.26 ]
    [ 0.6  0.4  0   ]          [ 0.04  0.24  0.72 ]

In both cases every entry of P^2 is positive, so the matrices are regular.
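The regularity test (some power of P has only positive entries) can be automated; a sketch, where the helper name is_regular and the n*n power bound are illustrative choices, not part of the slides:

```python
import numpy as np

# A transition matrix is regular if some power has all entries > 0.
# Checking up to n*n powers is a crude but sufficient bound for
# these small examples.
def is_regular(P, max_power=None):
    n = P.shape[0]
    k = max_power or n * n
    Pk = np.eye(n)
    for _ in range(k):
        Pk = Pk @ P
        if (Pk > 0).all():
            return True
    return False

P1 = np.array([[0.0, 1.0],
               [0.5, 0.5]])
P2 = np.array([[0.0, 0.2, 0.8],
               [0.1, 0.3, 0.6],
               [0.6, 0.4, 0.0]])
P3 = np.array([[0.0, 1.0],
               [1.0, 0.0]])   # powers oscillate: not regular

print(is_regular(P1), is_regular(P2), is_regular(P3))
```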
Examples Of
Regular Markov Chains
[Transition diagram for the first matrix: A -> B with probability 1, B -> A with 0.5, B -> B with 0.5.] We may leave out loops of zero probability.

[Transition diagram for the second matrix on states A, B, C, with arrow labels taken from its rows.]
Theorem 1

Let P be the transition matrix of a regular Markov chain. Then:
(A) there is a unique stationary matrix S, found by solving the equation S = SP;
(B) for any initial state S0, the state matrices Sk approach S;
(C) the powers P^k approach a limiting matrix whose rows all equal S.

Example 2 (Insurance Statistics)

[Transition diagram: a driver with an accident this year has an accident next year with probability 0.23 (no accident: 0.77); a driver with no accident this year has one next year with probability 0.11 (no accident: 0.89).]
Example 2 (continued)
If 5% of all drivers had an accident one year, what is
the probability that a driver, picked at random, has an
accident in the following year?
              Next year
               A      A'
This   A   [ 0.23   0.77 ]
year   A'  [ 0.11   0.89 ]

P = [ 0.23  0.77 ]
    [ 0.11  0.89 ]
S0 = [ 0.05 0.95 ]
S1 = S0P = [ 0.116 0.884 ], Prob(accident)=0.116
Example 2 (continued)
What about the long run behavior? What percentage
of drivers will have an accident in a given year?
Since all entries in P are greater than 0, this is a
regular Markov Chain and thus has a steady state:
P^2 = [ 0.1376  0.8624 ]    P^3 = [ 0.126512  0.873488 ]
      [ 0.1232  0.8768 ]          [ 0.124784  0.875216 ]

P^20 = [ 0.125  0.875 ]
       [ 0.125  0.875 ]
Exact solution

By Theorem 1 part (A): solve the equation S = SP.

S = [ s1  s2 ], where s1 + s2 = 1,   P = [ 0.23  0.77 ]
                                         [ 0.11  0.89 ]

SP = [ 0.23 s1 + 0.11 s2    0.77 s1 + 0.89 s2 ]

S = SP gives

s1 = 0.23 s1 + 0.11 s2
s2 = 0.77 s1 + 0.89 s2

With s2 = 1 - s1, the first equation becomes s1 = 0.23 s1 + 0.11 (1 - s1), so 0.88 s1 = 0.11 and s1 = 0.125.
s2 = 1 - s1 = 1 - 0.125 = 0.875
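The same steady state can be obtained by solving the linear system numerically; a sketch (replacing one redundant equation by the normalization row s1 + s2 = 1 is an implementation choice):

```python
import numpy as np

# Solve S = S P with s1 + s2 = 1 for the insurance example:
# (P^T - I) S^T = 0, with one equation swapped for normalization.
P = np.array([[0.23, 0.77],
              [0.11, 0.89]])
n = P.shape[0]

A = P.T - np.eye(n)
A[-1, :] = 1.0              # s1 + s2 = 1
b = np.zeros(n)
b[-1] = 1.0

S = np.linalg.solve(A, b)
print(S)   # approximately [0.125, 0.875]
```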
Definition

A state in a Markov chain is absorbing if, once entered, it is never left: the probability of remaining in that state is 1.

Example 1

[Transition diagram; surviving arrow labels: 0.25, 0.15, 0.85.]
Observation
The number on an entering arrow gives the
probability of entering that state from the
state where the arrow started.
Another Example

                  To
              A      B      C
From   A  [   1      0      0   ]
       B  [ 0.75     0    0.25  ]
       C  [   0    0.85   0.15  ]

Probability:
From A to A is 1
From A to B or C it is 0
A is absorbing

P = [ 1     0     0    ]
    [ 0.75  0     0.25 ]
    [ 0     0.85  0.15 ]
Theorem 1 does not extend to every chain with an absorbing state:

P = [ 0  1  0 ]    P^2 = [ 1  0  0 ]
    [ 1  0  0 ]          [ 0  1  0 ]
    [ 0  0  1 ]          [ 0  0  1 ]

P^(2n) = P^2,   P^(2n+1) = P

The third state is absorbing, yet the powers oscillate. So an absorbing state does not mean there is a limiting matrix.
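The oscillation is easy to confirm; a short NumPy check:

```python
import numpy as np

# Third state is absorbing, yet the chain has no limiting matrix:
# states 1 and 2 just swap forever.
P = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 1]])

P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)
print(np.array_equal(P2, np.eye(3)))   # P^2 is the identity
print(np.array_equal(P3, P))           # P^3 is P again
```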
Definition:

A Markov chain is absorbing if:
1) it has at least one absorbing state, and
2) it is possible to go from each non-absorbing state to at least one absorbing state in a finite number of steps.
Another Definition:

A transition matrix for an absorbing Markov chain is in standard form if the rows and columns are labeled so that all the absorbing states precede all the non-absorbing states:

          Abs.  NA.
Abs.   [   I     0  ]
NA.    [   R     Q  ]
Example:

P = [ 0.5  0.5   0  ]
    [  0    1    0  ]
    [ 0.5   0   0.5 ]

The second state is absorbing, but P is not in standard form.

[Transition diagram with arrow labels 0.5 and 1.]
Example (contd.):

Relabeling the states so that the absorbing state comes first puts P in standard form:

P = [  1    0    0  ]
    [ 0.5  0.5   0  ]
    [  0   0.5  0.5 ]
Limiting Matrix:

If P is the matrix of an absorbing Markov chain and P is in standard form

          Abs.  NA.
Abs.   [   I     0  ]
NA.    [   R     Q  ]

then there is a limiting matrix

          Abs.  NA.
Abs.   [   I     0  ]
NA.    [  FR     0  ]

where F = (I - Q)^-1 is the Fundamental Matrix.
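For the standard-form example above, R, Q, and F can be computed directly; a NumPy sketch:

```python
import numpy as np

# Standard-form matrix from the example: absorbing state first.
P = np.array([[1.0, 0.0, 0.0],
              [0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5]])

R = P[1:, :1]                        # non-absorbing -> absorbing block
Q = P[1:, 1:]                        # non-absorbing -> non-absorbing block
F = np.linalg.inv(np.eye(2) - Q)     # fundamental matrix (I - Q)^-1

print(F)       # [[2. 0.] [2. 2.]]
print(F @ R)   # [[1.] [1.]]: absorption is certain from both states
```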
More Examples:

P = [  1    0    0  ]
    [ 0.5  0.5   0  ]
    [  0   0.5  0.5 ]

R = [ 0.5 ],   Q = [ 0.5   0  ]
    [  0  ]        [ 0.5  0.5 ]

F = (I - Q)^-1 = [  0.5   0  ]^-1 = 4 [ 0.5   0  ] = [ 2  0 ]
                 [ -0.5  0.5 ]        [ 0.5  0.5 ]   [ 2  2 ]

FR = [ 2  0 ] [ 0.5 ] = [ 1 ]
     [ 2  2 ] [  0  ]   [ 1 ]

Limiting matrix: [ 1  0  0 ]
                 [ 1  0  0 ]
                 [ 1  0  0 ]
Powers of P:

P^2 = [  1     0     0   ]     P^4 = [   1       0        0    ]
      [ 0.75  0.25   0   ]           [ 0.9375  0.0625     0    ]
      [ 0.25  0.5   0.25 ]           [ 0.6875  0.25     0.0625 ]

P^16 = [ 1             0             0            ]
       [ 0.9999847412  0.0000152588  0            ]
       [ 0.9997406006  0.0002441406  0.0000152588 ]

The powers approach the limiting matrix whose rows are all [ 1  0  0 ].
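The convergence of the powers to the limiting matrix can be checked numerically; a short sketch for the standard-form example:

```python
import numpy as np

# Powers of the standard-form absorbing matrix approach the
# limiting matrix whose rows are all [1, 0, 0].
P = np.array([[1.0, 0.0, 0.0],
              [0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5]])

P16 = np.linalg.matrix_power(P, 16)
Pbar = np.array([[1.0, 0.0, 0.0]] * 3)
print(np.abs(P16 - Pbar).max())   # already below 3e-4
```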