
POLITECNICO DI BARI

DIPARTIMENTO DI INGEGNERIA ELETTRICA E


DELL’INFORMAZIONE

Master Degree in

TELECOMMUNICATIONS ENGINEERING

Course: Traffic Theory

Prof. Pietro Camarda

September 2018
POLITECNICO DI BARI - MASTER DEGREE IN TELECOMMUNICATIONS ENGINEERING
Course: Traffic Theory

Probability theory deals with the study of random phenomena, which under repeated
experiments yield different outcomes.

When an experiment is performed, certain elementary events occur in different but completely
uncertain ways.

Let us call Sample Space S the set of all possible outcomes of the random phenomena. S can
be:

• S finite
  Ex. S = {T, C}: a single toss of a two-faced coin
  Ex. S = {1, 2, 3, 4, 5, 6}: a single roll of a six-sided die

• S infinite countable
  Ex. S = {T, CT, CCT, CCCT, CCCCT, …}: successive tosses of a two-faced coin until a value T is experienced

• S infinite uncountable
  Ex. S = [0, 1]: the set of all possible real numbers in the interval [0, 1]

Any subset of S is called an Event. The empty set ∅ and S itself are Events.


Probability theory can be developed axiomatically, based on the following three axioms (Kolmogorov):

1. P(A) ≥ 0
2. P(S) = 1
3. P(A ∪ B) = P(A) + P(B), when A and B are disjoint.

Corollaries: P(∅) = 0, P(Ā) = 1 − P(A) (Ā is the complement of A).


In general P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Two Events A and B are said to be statistically independent when

P(A, B) = P(A) P(B)

The conditional probability is defined as

P(A\B) = P(A, B) / P(B),  when P(B) ≠ 0

When A and B are independent

P(A\B) = P(A, B)/P(B) = P(A) P(B)/P(B) = P(A)

Example

A random experiment consists in tossing a two-faced coin 4 times.

The sample space S is finite and contains 2^4 = 16 results.
S = {(TTTT), (CTTT), (TCTT), …, (CCCC)}

An example of event is
A={the number of cross is one}={(CTTT), (TCTT), (TTCT), (TTTC)}
Other examples of events are
B={the number of cross is at most two}={(TTTT), (CTTT), (TCTT), (TTCT), (TTTC),
(CCTT), (CTCT), (CTTC), (TCCT), (TCTC), (TTCC)}
C={cross in the first two positions}={(CCTT), (CCCT), (CCTC), (CCCC)}

The probability of an event E, in this case, can be defined as

P(E) = (number of results in E) / (number of results in S)

We have

P(A) = 4/16,  P(B) = 11/16,  P(C) = 4/16,  P(A\B) = P(A, B)/P(B) = (4/16)/(11/16) = 4/11

The pairs of events (A, B) and (B, C) are not disjoint, while the pair (A, C) is disjoint (they do not have outcomes in common).
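The counts above can be checked by enumerating the sample space directly; a minimal sketch in Python, using the same T/C labels as the text:

```python
from itertools import product
from fractions import Fraction

# Sample space of 4 tosses: 'T' or 'C' (cross) in each position.
S = list(product("TC", repeat=4))
assert len(S) == 2**4   # 16 outcomes

A = {s for s in S if s.count("C") == 1}            # exactly one cross
B = {s for s in S if s.count("C") <= 2}            # at most two crosses
C = {s for s in S if s[0] == "C" and s[1] == "C"}  # cross in the first two positions

def P(E):
    return Fraction(len(E), len(S))

print(P(A), P(B), P(C))   # 1/4 11/16 1/4
print(P(A & B) / P(B))    # P(A\B) = 4/11
print(A & C)              # set(): A and C are disjoint
```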

Total Probability Theorem

Let A1, A2, A3, …, An be events of a random experiment that are mutually exclusive and exhaustive, i.e.

• Ai ∩ Aj = ∅, i ≠ j (Ai and Aj have null intersection)
• ⋃_{i=1}^{n} Ai = S

The probability of a generic event B can be evaluated by

P(B) = Σ_{i=1}^{n} P(B, Ai) = Σ_{i=1}^{n} P(B\Ai) P(Ai)

Proof:

B = B ∩ S = B ∩ (⋃_{i=1}^{n} Ai) = ⋃_{i=1}^{n} (B ∩ Ai)

Considering that the events B ∩ Ai are mutually disjoint, i.e.

(B ∩ Ai) ∩ (B ∩ Aj) = ∅, i ≠ j

we have

P(B) = P(⋃_{i=1}^{n} B ∩ Ai) = Σ_{i=1}^{n} P(B, Ai)

In the same context, the following Bayes Theorem can be evaluated:

P(Ai\B) = P(Ai, B)/P(B) = P(B\Ai) P(Ai) / Σ_{k=1}^{n} P(B\Ak) P(Ak)

The Bernoulli process

A Bernoulli process is a sequence Z1, Z2, …, of i.i.d. binary random variables. A generic r.v. Zi can assume only the two values 1 and 0, with probability

P(Zi = 1) = p,  P(Zi = 0) = 1 − p

The partial sum in a Bernoulli process is

Sn = Σ_{i=1}^{n} Zi

We are interested in evaluating P(Sn = k), k = 0, 1, 2, …, n. We have

P(Sn = 0) = P(Z1 = 0, Z2 = 0, …, Zn = 0) = P(Z1 = 0) P(Z2 = 0) ⋯ P(Zn = 0) = (1 − p)^n

P(Sn = 1) = P({Z1 = 1, Z2 = 0, …, Zn = 0} ∪ {Z1 = 0, Z2 = 1, …, Zn = 0} ∪ ⋯ ∪ {Z1 = 0, Z2 = 0, …, Zn = 1})
= P(Z1 = 1, Z2 = 0, …, Zn = 0) + P(Z1 = 0, Z2 = 1, …, Zn = 0) + ⋯ + P(Z1 = 0, Z2 = 0, …, Zn = 1)
= n p (1 − p)^{n−1} = C(n, 1) p (1 − p)^{n−1}

In general

P(Sn = k) = C(n, k) p^k (1 − p)^{n−k}

where C(n, k) = n!/(k!(n − k)!) is the binomial coefficient.
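The general formula can be cross-checked against a brute-force enumeration of all binary sequences; a small sketch with assumed values n = 6, p = 0.3:

```python
from math import comb
from itertools import product

p, n = 0.3, 6   # assumed example values

# Brute-force P(S_n = k): sum the probabilities of all binary sequences with k ones.
def pmf_enum(k):
    return sum(
        p**sum(z) * (1 - p)**(n - sum(z))
        for z in product((0, 1), repeat=n)
        if sum(z) == k
    )

for k in range(n + 1):
    print(k, pmf_enum(k), comb(n, k) * p**k * (1 - p)**(n - k))
```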

Example: Communication over Noisy Channel

The transmission of digital signals is carried out by a Binary Communication Channel, whose model is
represented in the figure.

The channel maps each transmitted bit (0T, 1T) into a received bit (0R, 1R):

0T → 0R with probability 1 − p0,  0T → 1R with probability p0
1T → 1R with probability 1 − p1,  1T → 0R with probability p1

i.e.,

P(1R\0T) = p0,  P(0R\1T) = p1,  P(0R\0T) = 1 − p0,  P(1R\1T) = 1 − p1

In this system, an Error occurs when the received bit is different from the transmitted bit, i.e.

E = {Received bit different from the Transmitted bit}

To evaluate the probability of this event, we can exploit the Total Probability Theorem:

P(E) = P(E, 0T) + P(E, 1T) = P(E\0T) P(0T) + P(E\1T) P(1T) = P(1R\0T) P(0T) + P(0R\1T) P(1T)
= p0 P(0T) + p1 P(1T)

The Channel is Symmetric (BSC) when p0 = p1 = p, and we have P(E) = p P(0T) + p P(1T) = p (P(0T) + P(1T)) = p.

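The total-probability computation can be checked by simulation; a sketch with assumed values p0 = 0.1, p1 = 0.2, P(0T) = 0.6:

```python
import random

random.seed(1)
p0, p1, P0T = 0.1, 0.2, 0.6   # assumed example values

# Estimate P(E) over many transmissions and compare with the exact value
# P(E) = p0 P(0T) + p1 P(1T) given by the Total Probability Theorem.
N = 200_000
errors = 0
for _ in range(N):
    bit = 0 if random.random() < P0T else 1                # source bit
    flipped = random.random() < (p0 if bit == 0 else p1)   # channel error
    errors += flipped
estimate = errors / N
exact = p0 * P0T + p1 * (1 - P0T)
print(estimate, exact)   # estimate close to the exact value 0.14
```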

Example: Communication over Noisy Channel

When the available BSC has an error probability 𝑝 too high for the considered application, the criteria of
Channel Encoding can be applied. The system can be represented by the following model

Binary information → Channel Encoder → BSC → Channel Decoder → Delivered information

A useful example of an application of basic probability concepts is to consider two simple cases of
channel Encoding: Parity Check Code, Repetition Code.

Parity Check Code


In this case, the sequence of source information bits is divided into blocks of n − 1 bits (n ≥ 2, even), and the Channel Encoder adds a single bit to each block. The added bit is chosen in such a way that the block of n bits has an even number of bits equal to one.

Ex: n = 6, 11001 → 110011, 11000 → 110000, 00111 → 001111, 01011 → 010111

Each bit is transmitted individually on the BSC. It is supposed that an error on one bit does not influence any other bit (independence). The Channel Decoder checks the parity of each block of n bits.

It is quite easy to realize that an odd number of bits in error can be revealed by the Channel Decoder, while an even number of incorrect bits is not revealed by the Decoder; in this case incorrect information is delivered to the destination. Three possible events can be derived:

C = {Correct Transmission}, R = {Revealed Error}, Es = {Not revealed Error, i.e., System Error}



Example: Communication over Noisy Channel

The probability of these events is

P(C) = P(all bits are transmitted correctly) = P(1st bit correct, 2nd bit correct, …, 6th bit correct) = (1 − p)^6

P(R) = P(an odd number of incorrect bits have been received)
= P({1 bit incorrectly received} ∪ {3 bits incorrectly received} ∪ …)

P(Es) = P(an even number of incorrect bits have been received)
= P({2 bits incorrectly received} ∪ {4 bits incorrectly received} ∪ …)

The unions consider mutually exclusive events, and we have

P(R) = P(1 bit incorrectly received) + P(3 bits incorrectly received) + …

P(1 bit incorrectly received) = C(6, 1) p (1 − p)^5
P(3 bits incorrectly received) = C(6, 3) p^3 (1 − p)^3

In general, for a generic n (n ≥ 2, even), we have

P(C) = (1 − p)^n
P(R) = Σ_{k=1,3,…,n−1} C(n, k) p^k (1 − p)^{n−k}
P(Es) = Σ_{k=2,4,…,n} C(n, k) p^k (1 − p)^{n−k}

HW: Verify that 𝑃 𝐶 + 𝑃 𝑅 + 𝑃 𝐸𝑆 = 1
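The three formulas can be evaluated directly; a small sketch (the function name parity_probs is illustrative), which also checks numerically that the three probabilities sum to one:

```python
from math import comb

def parity_probs(n, p):
    """P(C), P(R), P(Es) for an even-parity block of n bits over a BSC(p)."""
    PC = (1 - p)**n
    PR = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(1, n, 2))
    PEs = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(2, n + 1, 2))
    return PC, PR, PEs

PC, PR, PEs = parity_probs(6, 0.01)   # assumed values n = 6, p = 0.01
print(PC, PR, PEs)
print(PC + PR + PEs)   # ≈ 1: the three events partition the sample space
```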


Example: Communication over Noisy Channel

Repetition Code

In this case, each bit of the source information sequence is transmitted n times (n is an odd value).

Ex: n = 5, 11001… → 11111 11111 00000 00000 11111 …

Each bit is transmitted individually on the BSC. It is supposed that an error on one bit does not influence any other bit (independence). The Channel Decoder decides on a majority basis on each block of n bits.

Ex: 11111 → 1, 11101 → 1, 00011 → 0, 11100 → 1, 01111 → 1, 00000 → 0

It is easy to check that, in this case, with n = 5, up to two errors the system is able to deliver the correct emitted bit to the destination, while for a greater number of errors an incorrect bit is delivered.

Two possible events can be derived: C = {Correct Transmission}, E = {Error in transmission}

Developing in the same way as the previous example, in this case for n = 5 we have

P(C) = Σ_{k=0}^{2} C(5, k) p^k (1 − p)^{5−k},  P(E) = 1 − P(C) = Σ_{k=3}^{5} C(5, k) p^k (1 − p)^{5−k}

while in general, for an odd n, we have

P(C) = Σ_{k=0}^{(n−1)/2} C(n, k) p^k (1 − p)^{n−k},  P(E) = 1 − P(C) = Σ_{k=(n+1)/2}^{n} C(n, k) p^k (1 − p)^{n−k}

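The closed form for P(C) can be validated against direct enumeration of the error patterns; a sketch with assumed values n = 5, p = 0.1:

```python
from math import comb
from itertools import product

n, p = 5, 0.1   # assumed example values

# Majority decoding is correct iff at most (n-1)//2 of the n repeated bits flip.
PC_enum = sum(
    p**sum(e) * (1 - p)**(n - sum(e))
    for e in product((0, 1), repeat=n)    # e is the pattern of bit errors
    if sum(e) <= (n - 1) // 2
)
PC_formula = sum(comb(n, k) * p**k * (1 - p)**(n - k)
                 for k in range((n - 1) // 2 + 1))
print(PC_enum, PC_formula)   # identical up to rounding, ≈ 0.99144
```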

Example: Communication over Unreliable Channel

Ideal Feedback Channel

Binary information → Channel Encoder → BSC → Channel Decoder → Delivered information
(with an ideal feedback channel from the Channel Decoder back to the Channel Encoder)

Exploiting the Parity Check Code, the following events can occur:

C = {Correct Transmission}, R = {Revealed Error}, Es = {Not Revealed Error, i.e., System Error}

When events C and Es occur, the information is transferred to the destination. In case of a Revealed Error, the Channel Decoder can ask the Channel Encoder, exploiting an ideal feedback channel, to retransmit the same information. In the second transmission of the same information, the same events C, R, Es can occur, and again the Channel Decoder can ask for the retransmission of the same information. Theoretically, this loop can go on forever.

In the described situation the destination can receive only Correct (CT) or Incorrect Information (ET). We have P(CT) + P(ET) = 1.

P(CT) = P({C at the 1st transmission} ∪ {R at the 1st, C at the 2nd} ∪ {R at the 1st, R at the 2nd, C at the 3rd} ∪ …)
= P(C) + P(R) P(C) + P(R)^2 P(C) + … = P(C) Σ_{k=0}^{∞} P(R)^k = P(C) / (1 − P(R))

and, in the same way, P(ET) = P(Es) / (1 − P(R)).
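The retransmission loop described above leads to geometric series in P(R); a numeric sketch combining the parity-check probabilities of the previous slides (assumed values n = 6, p = 0.05):

```python
from math import comb

# Parity-check event probabilities for one transmission (n = 6, p = 0.05 assumed).
n, p = 6, 0.05
PC = (1 - p)**n
PR = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(1, n, 2))
PEs = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(2, n + 1, 2))

# With retransmission on every revealed error, the delivery probabilities form
# geometric series: P(CT) = PC (1 + PR + PR^2 + ...) = PC / (1 - PR), and
# similarly P(ET) = PEs / (1 - PR); the two must sum to one.
PCT = PC / (1 - PR)
PET = PEs / (1 - PR)
print(PCT, PET)
```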

Random Variables
Random Variable (r.v.): a finite single-valued function X(ω) that maps the set S of all experimental outcomes into the set of real numbers R is said to be a r.v. if the set {ω | X(ω) ≤ x} is an event (a subset of S) for every x in R.

The r.v. can be classified as Discrete, Continuous or Hybrid.

For any type of r.v., we can consider

P({ω | X(ω) ≤ x}) = FX(x)

FX(x) is said to be the Probability Distribution Function (PDF) associated with the r.v. X.

Properties of FX(x):
• FX(x) ≥ 0, FX(∞) = P(X ≤ ∞) = P(S) = 1, FX(−∞) = 0
• FX(x⁺) = FX(x) for all x (right-continuous function)
• If x1 < x2 then FX(x1) ≤ FX(x2) (monotone nondecreasing function)
• FX(x2) − FX(x1) = P(X ≤ x2) − P(X ≤ x1) = P(x1 < X ≤ x2)

Probability density function (p.d.f)

For continuous r.v., we can evaluate the derivative of the distribution function FX(x), called the probability density function fX(x) of the r.v. X. Thus

fX(x) = dFX(x)/dx

We have

FX(x) = ∫_{−∞}^{x} fX(α) dα

Properties of fX(x):

• fX(x) ≥ 0, considering that FX(x) is a monotone nondecreasing function

• ∫_{−∞}^{+∞} fX(α) dα = FX(∞) = 1 (the area under the p.d.f. is unitary)

• P(x1 < X ≤ x2) = FX(x2) − FX(x1) = ∫_{−∞}^{x2} fX(α) dα − ∫_{−∞}^{x1} fX(α) dα = ∫_{x1}^{x2} fX(α) dα


Probability mass function (p.m.f)

For discrete r.v., we can define the p.m.f. p(X = xi) for all values xi that the r.v. can take.

We have

Σ_{xi} p(X = xi) = 1

Ex. Poisson r.v.

X ∈ {0, 1, 2, 3, …, n, …},  P(X = i) = (λ^i / i!) e^{−λ}

We have

Σ_{i=0}^{∞} P(X = i) = Σ_{i=0}^{∞} (λ^i / i!) e^{−λ} = e^{−λ} Σ_{i=0}^{∞} λ^i / i! = e^{−λ} e^{+λ} = 1, as it should be.

Ex. Binomial r.v. with parameters n and p (n independent repeated trials)

X ∈ {0, 1, 2, 3, …, n},  P(X = k) = C(n, k) p^k (1 − p)^{n−k}

Σ_{k=0}^{n} p(X = k) = Σ_{k=0}^{n} C(n, k) p^k (1 − p)^{n−k} = (p + 1 − p)^n = 1, as it should be.


Moments
The nth moment of a continuous r.v. is defined as

E[X^n] = ∫_{−∞}^{∞} x^n fX(x) dx

For discrete r.v., we have

E[X^n] = Σ_{xi} xi^n p(X = xi)

The first-order moment E[X] is also called Expectation, Mean, or Average Value.

The nth central moment of a r.v. is

E[(X − E[X])^n] = ∫_{−∞}^{∞} (x − E[X])^n fX(x) dx

The variance of a r.v. X, σX^2, is defined as the second central moment:

σX^2 = E[(X − E[X])^2] = ∫_{−∞}^{∞} (x − E[X])^2 fX(x) dx = … = E[X^2] − E^2[X]

Examples of Random Variables

Uniform r.v.

In this case the p.d.f. is constant in a finite interval [a, b]. We have

fX(x) = 1/(b − a),  x ∈ [a, b]

fX(x) = { 1/(b − a),  x ∈ [a, b];  0 otherwise }

FX(x) = { 0, x < a;  (x − a)/(b − a), x ∈ [a, b];  1, x > b }

E[X] = ∫_{−∞}^{∞} x fX(x) dx = ∫_a^b x/(b − a) dx = … = (a + b)/2

E[X^2] = ∫_{−∞}^{∞} x^2 fX(x) dx = ∫_a^b x^2/(b − a) dx = … = (b^3 − a^3)/(3(b − a))

σX^2 = E[X^2] − E^2[X] = … = (b − a)^2/12
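The closed-form moments can be checked by crude numerical integration of the uniform p.d.f.; a sketch with assumed endpoints a = 2, b = 5 (midpoint rule):

```python
# Numerical check of the uniform-[a, b] moments via the midpoint rule.
a, b = 2.0, 5.0   # assumed example interval
N = 100_000
dx = (b - a) / N
m1 = m2 = 0.0
for i in range(N):
    x = a + (i + 0.5) * dx     # midpoint of the i-th subinterval
    m1 += x * dx / (b - a)     # contribution to E[X]
    m2 += x * x * dx / (b - a) # contribution to E[X^2]
var = m2 - m1 * m1
print(m1, var)   # ≈ (a+b)/2 = 3.5 and ≈ (b-a)^2/12 = 0.75
```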

Exponential Random Variable

fX(x) = λe^{−λx}, λ > 0, x > 0,  FX(x) = 1 − e^{−λx}

E[X] = ∫_0^∞ x fX(x) dx = ∫_0^∞ x λe^{−λx} dx = … = 1/λ

E[X^2] = ∫_0^∞ x^2 fX(x) dx = ∫_0^∞ x^2 λe^{−λx} dx = … = 2/λ^2

It can be shown the following important Memoryless Property:

p(X > s + t \ X > s) = p(X > t)

In fact

p(X > s + t \ X > s) = p(X > s + t, X > s)/p(X > s) = p(X > s + t)/p(X > s) = e^{−λ(t+s)}/e^{−λs} = e^{−λt} = p(X > t)
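The memoryless property can be illustrated by simulation; a sketch with assumed values λ = 1.5, s = 0.4, t = 0.7:

```python
import random
from math import exp

random.seed(0)
lam, s, t = 1.5, 0.4, 0.7   # assumed example values
samples = [random.expovariate(lam) for _ in range(500_000)]

# P(X > s+t \ X > s) vs the unconditional P(X > t): both should match exp(-lam*t).
survivors = [x for x in samples if x > s]
cond = sum(x > s + t for x in survivors) / len(survivors)
uncond = sum(x > t for x in samples) / len(samples)
expected = exp(-lam * t)
print(cond, uncond, expected)
```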


Functions of r.v.

Given a r.v. X with p.d.f. fX(x), applying a deterministic function g to X produces a new r.v. Y = g(X), whose p.d.f. can be evaluated as a function of fX(x) and g.

FY(α) = P(Y ≤ α) = P(g(X) ≤ α) = ∫_{Dα} fX(x) dx,  where Dα = {x | g(x) ≤ α}

Ex. Y = X^2

t.b.d.


In the case of an invertible function, i.e., X = g^{−1}(Y), it can be shown that

fY(α) = fX(g^{−1}(α)) |dg^{−1}(α)/dα|

Ex. Y = aX + b ⟹ g^{−1}(α) = (α − b)/a,  dg^{−1}(α)/dα = 1/a ⟹ fY(α) = (1/|a|) fX((α − b)/a)

Obviously, once fY(α) is evaluated, the first moments of Y can be evaluated by the previous standard formulas:

E[Y] = ∫_{−∞}^{∞} α fY(α) dα

However, E[Y] can be evaluated more simply, exploiting fX(x). In fact, it can be shown the Fundamental Theorem of Expectation:

E[Y] = ∫_{−∞}^{∞} α fY(α) dα = E[g(X)] = ∫_{−∞}^{∞} g(α) fX(α) dα


Characteristic Function
The Characteristic Function of a r.v. X is defined as

φX(ν) = E[e^{jνX}]

In the discrete case, we have

φX(ν) = E[e^{jνX}] = Σ_{xi} e^{jνxi} P(X = xi)

while in the continuous case, we have

φX(ν) = E[e^{jνX}] = ∫_{−∞}^{∞} e^{jνx} fX(x) dx

In the continuous case, the value of fX(x) can be recovered by

fX(x) = (1/2π) ∫_{−∞}^{∞} φX(ν) e^{−jνx} dν

i.e., φX(ν) is the inverse Fourier Transform of fX(x): φX(ν) ↔ fX(x)

Consequently, the properties of the Fourier transform can be exploited. It is well known the convolution property of the FT, i.e.,

φX(ν) = φX1(ν) φX2(ν) ↔ fX(x) = fX1(x) ∗ fX2(x) = ∫_{−∞}^{∞} fX1(α) fX2(x − α) dα


Properties of the Characteristic Function



It is easy to check that φX(0) = ∫_{−∞}^{∞} fX(x) dx = 1. This property can be exploited to verify the correctness of the evaluation of the characteristic function.

The moments of a r.v. can be evaluated by taking the successive derivatives of φX(ν). In fact

dφX(ν)/dν = φX^(1)(ν) = j ∫_{−∞}^{∞} x e^{jνx} fX(x) dx

Evaluating the previous expression for ν = 0, we have

φX^(1)(0) = j ∫_{−∞}^{∞} x fX(x) dx = j E[X]

Considering the successive derivatives, we have

φX^(k)(0) = j^k ∫_{−∞}^{∞} x^k fX(x) dx = j^k E[X^k]

i.e.,

E[X^k] = φX^(k)(0) j^{−k}

Example
Let X be an exponential r.v. with parameter λ (i.e., fX(x) = λe^{−λx}, λ > 0, x > 0). The characteristic function of X is

φX(ν) = ∫_{−∞}^{∞} e^{jνx} fX(x) dx = ∫_0^∞ e^{jνx} λ e^{−λx} dx = … = λ/(λ − jν)


HW: Exploiting the Characteristic Function, evaluate the first and second moments of the Exponential r.v. (The results are E[X] = 1/λ, E[X^2] = 2/λ^2.)

In general, all moments of the Exponential r.v. can be evaluated. In fact, the function e^{jνX} can be developed in series:

e^{jνX} = 1 + jνX + (jνX)^2/2! + (jνX)^3/3! + … + (jνX)^n/n! + … = Σ_{n=0}^{∞} (jνX)^n/n!

Exploiting the linearity of Expectation (as shown later), we have

φX(ν) = E[e^{jνX}] = E[Σ_{n=0}^{∞} (jνX)^n/n!] = Σ_{n=0}^{∞} j^n ν^n E[X^n]/n!

For the exponential r.v., we have

φX(ν) = E[e^{jνX}] = λ/(λ − jν) = 1/(1 − jν/λ) = Σ_{n=0}^{∞} j^n ν^n/λ^n

Obviously, it results

Σ_{n=0}^{∞} j^n ν^n E[X^n]/n! = Σ_{n=0}^{∞} j^n ν^n/λ^n

Equating term by term, we have for an Exponential r.v. a generic nth moment

E[X^n] = n!/λ^n
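The result E[X^n] = n!/λ^n can be compared with the empirical moments of simulated exponential samples; a sketch with λ = 2 (assumed):

```python
import random
from math import factorial

random.seed(2)
lam = 2.0   # assumed example value
samples = [random.expovariate(lam) for _ in range(400_000)]

# Compare empirical moments with E[X^n] = n!/lam^n from the characteristic function.
emp = {n: sum(x**n for x in samples) / len(samples) for n in (1, 2, 3)}
for n in (1, 2, 3):
    print(n, emp[n], factorial(n) / lam**n)
```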


Erlang Random Variable

Let X1, X2, …, Xn be i.i.d. exponential r.v. with parameter λ and p.d.f. fXi(x) = λe^{−λx}, λ > 0, x > 0.

The Erlang r.v. is defined as

X = X1 + X2 + … + Xn

The p.d.f. of X results

fX(x) = λ^n x^{n−1} e^{−λx} / (n − 1)!,  λ > 0, x > 0

The proof can be developed by induction as follows.

• The result is true for n = 1. In fact, the previous expression, for n = 1, becomes fX(x) = λe^{−λx}, i.e., the p.d.f. of a single Exponential r.v.
• Let us suppose that the previous p.d.f. is true for a generic value of n and show that it remains true for n + 1. In fact, let Y = X + X_{n+1}:

fY(α) = fX(α) ∗ fX_{n+1}(α) = ∫_{−∞}^{∞} fX(β) fX_{n+1}(α − β) dβ = ∫_0^α [λ^n β^{n−1} e^{−λβ}/(n − 1)!] λ e^{−λ(α−β)} dβ
= [λ^{n+1}/(n − 1)!] e^{−λα} ∫_0^α β^{n−1} dβ = λ^{n+1} α^n e^{−λα}/n!

c.v.d.
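Since X is a sum of n i.i.d. exponentials, E[X] = n/λ and, by independence, the variance is n/λ^2; a quick empirical check with assumed values λ = 2, n = 4:

```python
import random

random.seed(3)
lam, n = 2.0, 4   # assumed example values

# X = X1 + ... + Xn, Xi i.i.d. exponential(lam): mean n/lam, variance n/lam**2.
M = 200_000
sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(M)]
mean = sum(sums) / M
var = sum((s - mean)**2 for s in sums) / M
print(mean, var)   # ≈ 2.0 and ≈ 1.0
```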


Gaussian Random Variables


The Gaussian or Normal r.v. with average value μ and variance σ^2 has p.d.f.

fX(x) = (1/(√(2π) σ)) e^{−(x−μ)^2/(2σ^2)}

The characteristic function results

φX(ν) = E[e^{jνX}] = e^{jμν − σ^2 ν^2/2}

Remembering that φX(ν) is the inverse Fourier Transform of fX(x), i.e., φX(ν) ↔ fX(x), the proof is based on some properties of the Fourier Transform. In particular:

• x(t) = e^{−πt^2} ⟷ X(f) = e^{−πf^2}
• z(t) = x(at) ↔ Z(f) = (1/|a|) X(f/a); from this property, we have z(t) = x(√(2π) σ t) ↔ Z(f) = (1/(√(2π) σ)) X(f/(√(2π) σ)), and from the first property

z(t) = e^{−π(√(2π) σ t)^2} = e^{−2π^2 σ^2 t^2} ↔ Z(f) = (1/(√(2π) σ)) e^{−π (f/(√(2π) σ))^2} = (1/(√(2π) σ)) e^{−f^2/(2σ^2)}

Identifying fX(x) with Z(f), and φX(ν) with z(t) evaluated at t = ν/2π, we have

φX(ν) = e^{−σ^2 ν^2/2} ↔ fX(x) = (1/(√(2π) σ)) e^{−x^2/(2σ^2)}

Remembering also that

• z(t) = x(t) e^{j2πf0 t} ↔ Z(f) = X(f) ∗ δ(f − f0) = X(f − f0)

and that

fX(x) = (1/(√(2π) σ)) e^{−(x−μ)^2/(2σ^2)} = [(1/(√(2π) σ)) e^{−x^2/(2σ^2)}] ∗ δ(x − μ)

we have

φX(ν) = e^{−2π^2 σ^2 t^2} e^{j2πμt} |_{t = ν/2π} = e^{jμν − σ^2 ν^2/2}

Gaussian Random Variables


From the previous result, we have

φX(0) = e^{jμν − σ^2 ν^2/2} |_{ν=0} = e^0 = 1

as it should be.

φX^(1)(ν) |_{ν=0} = e^{jμν − σ^2 ν^2/2} (jμ − σ^2 ν) |_{ν=0} = jμ ⇒ E[X] = μ

φX^(2)(ν) |_{ν=0} = [e^{jμν − σ^2 ν^2/2} (jμ − σ^2 ν)^2 + e^{jμν − σ^2 ν^2/2} (−σ^2)] |_{ν=0} = j^2 μ^2 − σ^2 = j^2 (μ^2 + σ^2)

And thus

E[X^2] = μ^2 + σ^2

σX^2 = E[X^2] − E^2[X] = σ^2

It is confirmed that the average value is E[X] = μ and the variance is σX^2 = σ^2.


Multiple Random Variables

Many random variables X, Y, … can be defined on the same Sample Space.

The joint distribution FXY(x, y) of the r.v. X, Y can be defined as

FXY(x, y) = P(X ≤ x, Y ≤ y)

Properties:
• FXY(∞, ∞) = P(X ≤ ∞, Y ≤ ∞) = 1
• FXY(x, ∞) = P(X ≤ x, Y ≤ ∞) = P(X ≤ x) = FX(x)
• FXY(x2, y2) ≥ FXY(x1, y1), if x2 > x1 and y2 > y1 (nondecreasing function)

The joint probability density function fXY(x, y) is defined as

fXY(x, y) = ∂^2 FXY(x, y) / ∂x∂y

Properties:
• FXY(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} fXY(α, β) dβ dα ⇒ FXY(∞, ∞) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY(α, β) dβ dα = 1


• fXY(x, y) ≥ 0
• FX(x) = FXY(x, ∞) = ∫_{−∞}^{x} ∫_{−∞}^{∞} fXY(α, β) dβ dα
• fX(x) = dFX(x)/dx = ∫_{−∞}^{∞} fXY(x, β) dβ

Two r.v. are independent when fXY(x, y) = fX(x) fY(y).

Function of two Random Variables

Let Z = g(X, Y) be a transformation of the pair of random variables (X, Y) having joint p.d.f. fXY(x, y); we have:

E[Z] = E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) fXY(x, y) dy dx

Ex. Z = g(X, Y) = X + Y

E[Z] = ∫∫ (x + y) fXY(x, y) dy dx = ∫∫ x fXY(x, y) dy dx + ∫∫ y fXY(x, y) dy dx = E[X] + E[Y]

i.e., the Expectation of a sum of Random Variables is always equal to the sum of the Expectations.

Ex. Z = g(X, Y) = X Y

E[Z] = E[X Y] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y fXY(x, y) dx dy

In case of independence of the two r.v., i.e., fXY(x, y) = fX(x) fY(y), we have

E[X Y] = ∫∫ x y fX(x) fY(y) dx dy = E[X] E[Y]

i.e., the Expectation of a product of r.v. is equal to the product of the Expectations only in case of independence of the r.v.

Covariance
The Covariance of two r.v. X, Y is

COV(X, Y) = E[(X − E[X])(Y − E[Y])] = … = E[X Y] − E[X] E[Y]

Properties:
• Two independent r.v. have Covariance equal to zero.
• When two r.v. have Covariance equal to zero, these two r.v. are not necessarily independent; i.e., it can be shown that there are dependent r.v. that have Covariance equal to zero.

HW: Let X be a r.v. uniformly distributed in the interval [−1, 1] and Y = X^2. Show that COV(X, Y) = 0.

Correlation Coefficient

The correlation coefficient is defined as

ρXY = Cov(X, Y) / (σX σY)

Properties

• When X, Y are independent, the Correlation Coefficient is equal to zero. In fact, remembering that for independent r.v. it results E[XY] = E[X] E[Y], this property follows immediately from the definition of Covariance.

• −1 ≤ ρXY ≤ 1. In fact, consider the following positive expectation:

E[((X − E[X])/σX ∓ (Y − E[Y])/σY)^2] ≥ 0

Developing the square, it turns out

E[(X − E[X])^2/σX^2 + (Y − E[Y])^2/σY^2 ∓ 2 (X − E[X])(Y − E[Y])/(σX σY)] ≥ 0

This expression, taking into account the linearity of expectation, becomes 1 + 1 ∓ 2 COV(X, Y)/(σX σY) ≥ 0, i.e., −1 ≤ ρXY ≤ 1.

• HW: Let α be a real number and Y = αX; show that ρXY = sign(α).


Conditional Density Functions


The distribution function of X given an event B is

FX(x\B) = P(X ≤ x \ B) = P({X ≤ x} ∩ B) / P(B),  when P(B) ≠ 0

Suppose we let B = {y1 < Y ≤ y2}:

FX(x\y1 < Y ≤ y2) = P(X ≤ x \ y1 < Y ≤ y2) = P(X ≤ x, y1 < Y ≤ y2) / P(y1 < Y ≤ y2)

Exploiting previous results, we have

FX(x\y1 < Y ≤ y2) = [∫_{−∞}^{x} ∫_{y1}^{y2} fXY(u, v) dv du] / [∫_{y1}^{y2} fY(α) dα]

In the limit case, when B = {Y = y}, we can let y1 = y and y2 = y + Δy. This gives

FX(x\Y = y) = lim_{Δy→0} FX(x\y < Y ≤ y + Δy) = [∫_{−∞}^{x} ∫_{y}^{y+Δy} fXY(u, v) dv du] / [∫_{y}^{y+Δy} fY(α) dα]
= [∫_{−∞}^{x} fXY(u, y) du Δy] / [fY(y) Δy] = ∫_{−∞}^{x} fXY(u, y) du / fY(y)

The corresponding p.d.f. is

fX(x\Y = y) = fXY(x, y)/fY(y),  which is usually indicated by fX|Y(x|y) = fXY(x, y)/fY(y)

This can be rewritten as fXY(x, y) = fX|Y(x|y) fY(y) = fY|X(y|x) fX(x).

Remembering that fX(x) = ∫_{−∞}^{∞} fXY(x, y) dy and fY(y) = ∫_{−∞}^{∞} fXY(x, y) dx, we have

fX(x) = ∫_{−∞}^{∞} fX|Y(x|y) fY(y) dy

fY(y) = ∫_{−∞}^{∞} fY|X(y|x) fX(x) dx

which represent a continuous version of the Total Probability Theorem.

Developing the previous results, we have a continuous form of the Bayes Theorem:

fX|Y(x|y) = fXY(x, y)/fY(y) = fY|X(y|x) fX(x) / ∫_{−∞}^{∞} fY|X(y|x) fX(x) dx


Conditional Expectation

The Conditional Expectation, in the continuous case, is defined as

E[Y\X = x] = ∫_{−∞}^{∞} y fY|X(y|x) dy

The definition shows that the result is a function of X, i.e., g(X) = E[Y\X = x]. Taking the Expectation of this function of X results in

E[g(X)] = E[E[Y\X = x]] = ∫_{−∞}^{∞} g(x) fX(x) dx = ∫_{−∞}^{∞} [∫_{−∞}^{∞} y fY|X(y|x) dy] fX(x) dx
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} y fXY(x, y) dx dy = ∫_{−∞}^{∞} y fY(y) dy = E[Y]

Similar results can be derived for the discrete case. Let X, Y be discrete r.v. The Conditional Expectation, in the discrete case, is defined as

E[Y\X = xi] = Σ_{yi} yi P(Y = yi \ X = xi)

The definition shows that the result is a function of X, i.e., g(X) = E[Y\X = xi]. Taking the Expectation of this function of X results in

E[g(X)] = E[E[Y\X = xi]] = Σ_{xi} g(xi) P(X = xi) = Σ_{xi} Σ_{yi} yi P(Y = yi \ X = xi) P(X = xi)
= Σ_{xi} Σ_{yi} yi P(Y = yi, X = xi) = Σ_{yi} yi Σ_{xi} P(Y = yi, X = xi) = Σ_{yi} yi P(Y = yi) = E[Y]
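The identity E[E[Y\X]] = E[Y] can be verified directly on a small discrete example; a sketch with an assumed joint p.m.f.:

```python
# A small assumed joint p.m.f. on {0, 1} x {0, 1}.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

pX = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
EY = sum(y * p for (_, y), p in joint.items())

# E[Y\X = x] = sum_y y P(Y = y, X = x) / P(X = x)
EY_given = {x: sum(y * p for (a, y), p in joint.items() if a == x) / pX[x]
            for x in (0, 1)}

# Tower property: E[E[Y\X]] = sum_x E[Y\X = x] P(X = x) = E[Y]
tower = sum(EY_given[x] * pX[x] for x in (0, 1))
print(EY, tower)   # both ≈ 0.7
```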


Probability Generating Function.

Let N be a discrete r.v. that assumes non-negative integer values, i.e., N ∈ {0, 1, 2, …, n, …}, with p.m.f. pn = P(N = n). Let z be a complex value; the Probability Generating Function of N is indicated by GN(z), and is defined as

GN(z) = E[z^N] = Σ_{n=0}^{∞} z^n pn = p0 + p1 z + p2 z^2 + p3 z^3 + … + pn z^n + …

Ex – N Poisson r.v. with pn = (λ^n/n!) e^{−λ} (n = 0, 1, 2, …); we have

GN(z) = Σ_{n=0}^{∞} z^n (λ^n/n!) e^{−λ} = e^{−λ} Σ_{n=0}^{∞} (λz)^n/n! = e^{−λ} e^{λz} = e^{λ(z−1)}

Ex – N Geometric r.v. with pn = p^{n−1}(1 − p), n = 1, 2, 3, …

GN(z) = E[z^N] = Σ_{n=1}^{∞} z^n pn = Σ_{n=1}^{∞} z^n (1 − p) p^{n−1} = (1 − p) z Σ_{n=1}^{∞} (pz)^{n−1} = (1 − p) z / (1 − pz)

Properties of GN(z):
• GN(z)|_{z=1} = GN(1) = Σ_{n=0}^{∞} 1^n pn = 1
• G′N(z)|_{z=1} = Σ_{n=0}^{∞} n z^{n−1} pn |_{z=1} = Σ_{n=0}^{∞} n pn = E[N]
• G″N(z)|_{z=1} = Σ_{n=0}^{∞} n(n − 1) z^{n−2} pn |_{z=1} = Σ_{n=0}^{∞} n^2 pn − Σ_{n=0}^{∞} n pn = E[N^2] − E[N]
• GN(z)|_{z=0} = p0
• G′N(z) = Σ_{n=0}^{∞} n z^{n−1} pn = p1 + 2 p2 z + 3 p3 z^2 + … + n pn z^{n−1} + … ⇒ G′N(z)|_{z=0} = p1
• G″N(z) = Σ_{n=0}^{∞} n(n − 1) z^{n−2} pn = 2 p2 + 6 p3 z + … + n(n − 1) pn z^{n−2} + … ⇒ G″N(z)|_{z=0} = 2 p2

In general

pn = (1/n!) d^n GN(z)/dz^n |_{z=0}

HW: apply the previous properties to the Poisson and Geometric r.v.

Ex. X = X1 + X2 + X3 + … + XN

N is a discrete r.v. with p.m.f. P(N = n) = pn and average value E[N] = Σn n pn.

X1, X2, X3, … are i.i.d. continuous r.v. with p.d.f. fXi(x) and average value E[Xi].

E[X] = E[E[X\N]] = Σn E[X\N = n] pn = Σn pn n E[Xi] = E[Xi] E[N]

In a more general case, the evaluation of the p.d.f. of X, fX(x), could be necessary.

For a fixed value of N = n, we already know that

fX(x) = fX1(x) ∗ fX2(x) ∗ ⋯ ∗ fXn(x)
φX(ν) = φX1(ν) φX2(ν) … φXn(ν) = (φXi(ν))^n

being φX1(ν) = φX2(ν) = ⋯ = φXn(ν) = φXi(ν).

In general, when the value of N is a r.v., we have

φX(ν) = E[e^{jνX}] = E[E[e^{jνX} | N = n]]

The external expectation refers to the r.v. N. We have

φX(ν) = Σ_{n=1}^{∞} E[e^{jνX} | N = n] P(N = n) = Σ_{n=1}^{∞} (φXi(ν))^n P(N = n) = GN(φXi(ν))

where GN(·) is the Probability Generating Function of N.

The successive moments become

E[X^n] = j^{−n} φX^(n)(ν)|_{ν=0}

We have

φX^(1)(ν) = G′N(φXi(ν)) φXi^(1)(ν) ⇒

E[X] = j^{−1} φX^(1)(0) = j^{−1} G′N(φXi(0)) φXi^(1)(0) = j^{−1} E[N] j E[Xi] = E[N] E[Xi]

φX^(2)(ν) = G″N(φXi(ν)) φXi^(1)(ν) φXi^(1)(ν) + G′N(φXi(ν)) φXi^(2)(ν)

E[X^2] = j^{−2} φX^(2)(0) = j^{−2} G″N(φXi(0)) (φXi^(1)(0))^2 + j^{−2} G′N(φXi(0)) φXi^(2)(0)
= j^{−2} G″N(1) (φXi^(1)(0))^2 + j^{−2} G′N(1) φXi^(2)(0)
= j^{−2} (E[N^2] − E[N]) j^2 (E[Xi])^2 + j^{−2} E[N] j^2 E[Xi^2] = (E[N^2] − E[N]) (E[Xi])^2 + E[N] E[Xi^2]


In the specific case when N is a geometric r.v. with parameter p and X1, X2, …, Xn are i.i.d. Exponential r.v. with parameter μ, i.e., fXi(x) = μe^{−μx}, μ > 0, x > 0, we have

GN(z) = (1 − p) z / (1 − pz),  φXi(ν) = μ/(μ − jν)

and thus

φX(ν) = GN(φXi(ν)) = (1 − p) μ/(μ − jν) / [1 − p μ/(μ − jν)] = … = (1 − p) μ / ((1 − p) μ − jν)

It results that

fX(x) = μ(1 − p) e^{−μ(1−p)x}

i.e., X is an exponential r.v. with parameter μ(1 − p).
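The result can be checked by simulating the random sum; a sketch with assumed values μ = 3, p = 0.4 (geometric N supported on 1, 2, 3, …):

```python
import random

random.seed(4)
mu, p = 3.0, 0.4   # assumed example values

# Geometric N on {1, 2, ...} with P(N = n) = p**(n-1) * (1 - p): after each stage,
# continue with probability p.  X is the sum of N i.i.d. exponential(mu) r.v.,
# so E[X] should equal 1/(mu*(1-p)), the mean of an exponential(mu*(1-p)).
M = 300_000
total = 0.0
for _ in range(M):
    x = random.expovariate(mu)
    while random.random() < p:
        x += random.expovariate(mu)
    total += x
mean = total / M
print(mean, 1 / (mu * (1 - p)))   # both ≈ 0.5556
```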

Another specific case happens when N assumes only a single value n, i.e., p(N = k) = 0 for k ≠ n, while p(N = n) = 1, with X1, X2, …, Xn as before.

GN(z) = Σ_{k=0}^{∞} pk z^k = z^n

GN(φXi(ν)) = (μ/(μ − jν))^n

It follows that

fX(x) = fX1(x) ∗ fX2(x) ∗ ⋯ ∗ fXn(x)

i.e., X is an Erlang r.v.

