
Shannon's Channel Capacity

Shannon derived the following capacity formula (1948) for an additive
white Gaussian noise (AWGN) channel:

    C = W log_2(1 + S/N)   [bits/second]

where
W is the bandwidth of the channel in Hz,
S is the signal power in watts,
N is the total noise power of the channel in watts.
Channel Coding Theorem (CCT):
The theorem has two parts.
1. Its direct part says that for rate R < C there exists a coding system
with arbitrarily low block and bit error rates as we let the codelength
N → ∞.
2. The converse part states that for R ≥ C the bit and block error
rates are strictly bounded away from zero for any coding system.
The CCT therefore establishes rigid limits on the maximal supportable
transmission rate of an AWGN channel in terms of power and bandwidth.
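As a quick numerical check of the capacity formula, the following sketch evaluates C = W log_2(1 + S/N); the 3 kHz bandwidth and 30 dB SNR are illustrative telephone-channel values, not figures from the text.

```python
import math

def awgn_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon capacity C = W log2(1 + S/N) of an AWGN channel, in bits/s."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative example: a 3 kHz channel at 30 dB SNR
snr = 10 ** (30 / 10)          # 30 dB -> linear ratio 1000
c = awgn_capacity(3000.0, snr)
print(f"C = {c:.0f} bits/s")   # roughly 29.9 kbit/s
```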
Bandwidth Efficiency characterizes how efficiently a system uses its
allotted bandwidth and is defined as

    η = Transmission Rate R / Channel Bandwidth W   [bits/s/Hz].

From it we calculate the Shannon limit as

    η_max = log_2(1 + S/N)   [bits/s/Hz].   (1)

Note: In order to calculate η, we must suitably define the channel bandwidth W. One
commonly used definition is the 99% bandwidth definition, i.e., W is defined such that 99%
of the transmitted signal power falls within the band of width W.
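The 99% bandwidth definition can be applied numerically. A minimal sketch, assuming a one-sided PSD sampled on a uniform frequency grid; the sinc²-shaped PSD of rectangular NRZ pulses is only an illustrative choice, not taken from the text.

```python
import numpy as np

def percent_bandwidth(psd, f, fraction=0.99):
    """Smallest W such that `fraction` of the total power lies in [0, W].

    `psd` holds one-sided PSD samples on the uniform frequency grid `f` (f >= 0).
    """
    cum = np.cumsum(psd)                           # running power (factor df cancels)
    k = np.searchsorted(cum, fraction * cum[-1])
    return f[min(k, len(f) - 1)]

# Illustrative example: sinc^2 PSD of rectangular NRZ pulses, symbol time T = 1.
# Truncating the grid at f = 100/T slightly biases the slow 1/f^2 tail.
f = np.linspace(0.0, 100.0, 100_001)
psd = np.sinc(f) ** 2
print(f"99% bandwidth ≈ {percent_bandwidth(psd, f):.1f} / T")
```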
EE 7950: Statistical Communication Theory 2
Average Signal Power S can be expressed as

    S = k E_b / T = R E_b,

where
E_b is the energy per bit,
k is the number of bits transmitted per symbol,
T is the duration of a symbol,
R = k/T is the transmission rate of the system in bits/s.
S/N is called the signal-to-noise ratio, where

    N = N_0 W

is the total noise power and N_0 is the one-sided noise power spectral density.
Substituting, we obtain the Shannon limit in terms of the bit energy and noise
power spectral density, given by

    η_max = log_2(1 + R E_b / (N_0 W)).
This can be resolved to obtain the minimum bit energy required for reliable
transmission, called the Shannon bound:

    E_b/N_0 ≥ (2^{η_max} − 1) / η_max,
Fundamental limit: For infinite amounts of bandwidth, i.e., η_max → 0,
we obtain

    E_b/N_0 ≥ lim_{η_max → 0} (2^{η_max} − 1)/η_max = ln(2) = −1.59 dB.

This is the absolute minimum signal energy to noise power spectral density
ratio required to reliably transmit one bit of information.
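The trade-off expressed by the Shannon bound and its −1.59 dB limit can be tabulated directly (a small sketch; the η_max sample values are arbitrary choices):

```python
import math

def ebn0_min_db(eta_max: float) -> float:
    """Shannon bound Eb/N0 >= (2**eta_max - 1)/eta_max, expressed in dB."""
    return 10.0 * math.log10((2.0 ** eta_max - 1.0) / eta_max)

for eta in (6.0, 2.0, 1.0, 0.5, 1e-6):
    print(f"eta_max = {eta:8g} bits/s/Hz  ->  Eb/N0 >= {ebn0_min_db(eta):6.2f} dB")
# As eta_max -> 0 the bound approaches 10*log10(ln 2) ≈ -1.59 dB
```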
Normalized Capacity

The dependence on the arbitrary definition of the bandwidth W is not
satisfactory. We prefer to normalize our formulas per signal dimension,
as given in [5]. This is useful when the question of waveforms and pulse
shaping is not a central issue, since it allows one to eliminate these
considerations by treating signal dimensions [2]:

    C_d = (1/2) log_2(1 + 2 R_d E_b/N_0)   [bits/dimension]

    C_c = log_2(1 + R_c E_b/N_0)   [bits/complex dimension]
Applying similar manipulations as above, we obtain the Shannon bound
normalized per dimension as

    E_b/N_0 ≥ (2^{2 C_d} − 1)/(2 C_d);   E_b/N_0 ≥ (2^{C_c} − 1)/C_c.
System Performance Measure: In order to compare different communications
systems we need a parameter expressing the performance level. It is the
information bit error probability P_b, which typically falls into the range
10^{-3} ≥ P_b ≥ 10^{-6}.
Examples:

Spectral efficiencies versus power efficiencies of various coded and uncoded
digital transmission systems, plotted against the theoretical limits imposed
by the discrete constellations.

[Figure: E_b/N_0 in dB (from −1.59 to 15 dB) versus C_c in bits/complex
dimension. Systems shown include BPSK, QPSK, 8PSK, 16PSK, 16QAM, 32QAM, and
64QAM (uncoded and with convolutional codes, TCM, BTCM, TTCM, BTC, sequential
decoding of 2^14-state codes, and turbo codes up to N = 65536), together with
the Shannon bound and the unachievable region.]
Discrete Capacities
In the case of discrete constellations, Shannon's formula needs to be
altered. We begin with its basic formulation:

    C = max_q [H(Y) − H(Y|X)]

      = max_q [ −∫_y Σ_{a_k} q(a_k) p(y|a_k) log( Σ_{a'_k} q(a'_k) p(y|a'_k) ) dy
              + ∫_y Σ_{a_k} q(a_k) p(y|a_k) log( p(y|a_k) ) dy ]

      = max_q Σ_{a_k} q(a_k) ∫_y p(y|a_k) log( p(y|a_k) / Σ_{a'_k} q(a'_k) p(y|a'_k) ) dy,

where {a_k} are the K discrete signal points, q(a_k) is the probability with
which a_k is selected, and

    p(y|a_k) = 1/√(2πσ²) exp( −(y − a_k)²/(2σ²) )

in the one-dimensional case, and

    p(y|a_k) = 1/(2πσ²) exp( −|y − a_k|²/(2σ²) )

in the complex case.
Symmetrical Capacity: When q(a_k) = 1/K, then

    C = log(K) − (1/K) Σ_{a_k} E_n[ log( Σ_{a'_k} exp( −((n + a_k − a'_k)² − n²)/(2σ²) ) ) ],

where the expectation E_n is over the Gaussian noise n.
This equation has to be evaluated numerically.
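One way to carry out that numerical evaluation is Monte Carlo averaging over the noise. A sketch for the one-dimensional case; the BPSK constellation and noise level in the example are illustrative choices, not from the text.

```python
import numpy as np

def symmetric_capacity(points, sigma, trials=200_000, seed=0):
    """Monte Carlo estimate of the symmetric capacity (in bits) of a real
    constellation {a_k}, used equiprobably on an AWGN channel with noise
    standard deviation sigma."""
    rng = np.random.default_rng(seed)
    a = np.asarray(points, dtype=float)
    K = len(a)
    n = rng.normal(0.0, sigma, size=trials)                 # noise samples
    # exponent -((n + a_k - a'_k)^2 - n^2)/(2 sigma^2) for all (trial, k, k') triples
    d = n[:, None, None] + a[None, :, None] - a[None, None, :]
    expo = -(d ** 2 - n[:, None, None] ** 2) / (2.0 * sigma ** 2)
    inner = np.log2(np.exp(expo).sum(axis=2))               # log2 of the sum over a'_k
    return np.log2(K) - inner.mean()                        # mean over n and a_k

# Illustrative example: BPSK points {-1, +1} with sigma = 0.5
print(f"C ≈ {symmetric_capacity([-1.0, 1.0], sigma=0.5):.3f} bits/symbol")
```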
Code Eciency
Shannon et al. [4] proved the following lower bound on the codeword
error probability P_B:

    P_B > 2^{−N(E_sp(R) + o(N))};   E_sp(R) = max_q max_{ρ > 0} (E_0(q, ρ) − ρR).
The bound is plotted for rate R = 1/2 for BPSK modulation [3], together
with selected Turbo coding schemes and classic concatenated methods:
[Figure: required E_b/N_0 in dB (from −1 to 4 dB) versus codeword length N
(from 10 to 10^6) for the sphere-packing bound, together with 4- and 16-state
turbo codes (N = 360, 448, 1334, 2048, 10200, 16384, 65536), concatenated
schemes combining (2,1,6) and (2,1,8) convolutional codes with RS (255,223)
and RS (511,479) codes (N = 2040, 4599), and an N = 1024 block turbo code
using (32,26,4) BCH codes; the Shannon capacity and the unachievable region
are marked.]
Code Performance
The performance of various coding schemes is shown below (see [2]). Note
the steep behavior of the turbo codes, known as the turbo cliff.
[Figure: bit error probability (BER) from 1 down to 10^{-6} versus E_b/N_0
from 0 to 10 dB, for uncoded BPSK, convolutional codes (CC) with 4, 16, 64,
2^14, and 2^40 states, and a turbo code.]
References
[1] R.G. Gallager, Information Theory and Reliable Communication, John Wiley & Sons,
Inc., New York, 1968.
[2] C. Schlegel, Trellis Coding, IEEE Press, Piscataway, NJ, 1997.
[3] C. Schlegel and L.C. Perez, "On error bounds and turbo codes," IEEE Communications
Letters, vol. 3, no. 7, July 1999.
[4] C.E. Shannon, R.G. Gallager, and E.R. Berlekamp, "Lower bounds to error probabilities
for coding on discrete memoryless channels," Inform. Contr., vol. 10, pt. I, pp. 65-103,
1967; also Inform. Contr., vol. 10, pt. II, pp. 522-552, 1967.
[5] J.M. Wozencraft and I.M. Jacobs, Principles of Communication Engineering, John Wiley
& Sons, Inc., New York, 1965; reprinted by Waveland Press, 1993.