1 This work was supported in part by the Swedish Research Council for Engineering Sciences under Grant 95-164.
I. INTRODUCTION
The principle of trellis-coded modulation and the concept of set partitioning were introduced by Ungerboeck in his 1982 paper [7]. In [4], Imai and Hirakawa proposed a coded modulation scheme in which the labels on the branches from one level of the partition chain to the next are encoded by independent codes. This scheme enables the use of a suboptimal multistage decoder, which offers a performance/complexity advantage over the maximum likelihood decoder.
The aim of this work is to compare the capacities of an eight-level modulation scheme using rectangular QAM-signaling and a five-level modulation scheme using hexagonal QAM-signaling. Both schemes are versions of the one proposed in [4]. We consider the discrete memoryless Gaussian channel. The multistage decoder consists of a set of suboptimal decoders matched to the codes used on the corresponding levels of encoding, and uses as metric the squared Euclidean distance from the received point to the nearest points of the corresponding signal subset [2], [4].
First, in Section II, we give a description of the systems considered. Then, in Section III, the main results of [2] are presented, namely the expressions for the Chernoff bounding parameter Z, the cutoff rate, and the capacity for rectangular QAM-signaling. The Chernoff bounding parameter Z for hexagonal QAM-signaling is calculated in Section IV, and the corresponding cutoff rate and capacity are given. A comparison analysis is performed in Section V. It is shown that for certain signal-to-noise ratios the capacity of the hexagonal constellation is almost 1 dB better than that of the rectangular constellation.
II. SYSTEM DESCRIPTION
The schemes used are shown in Figures 1 and 2. An information sequence u (binary in Figure 1, ternary in Figure 2) is partitioned into K subsequences (K = 8 in Figure 1, K = 5 in Figure 2), and each subsequence is encoded by an independent code C_k. The output code sequences are v^{(1)}, v^{(2)}, ..., v^{(K)}, where v^{(k)} = v^{(k)}(1), v^{(k)}(2), ..., v^{(k)}(n), ... for k = 1, 2, ..., K. A set of K symbols, {v^{(1)}(n), v^{(2)}(n), ..., v^{(K)}(n)}, one from each code sequence, is synchronously mapped onto one of the QAM signal points, s(n). The channel considered in this paper is the discrete memoryless Gaussian channel with complex input sequence s = s(1), s(2), ..., s(n), ... and complex output sequence r = s + e. The sequence e = e(1), e(2), ..., e(n), ... is an error sequence with e(n) = e^{(I)}(n) + j e^{(Q)}(n), where e^{(I)}(n) and e^{(Q)}(n) are independent Gaussian random variables with zero mean and variance \sigma^2. The multistage suboptimal decoder is a modified version of the one proposed in [4]. Each decoding stage consists of calculating distances (metrics) from the received sequence r to all possible codewords on the corresponding level of set partitioning. The side information from the previous decoding stages determines, according to the set partitioning structure, the signal set over which the metrics are calculated.
When the decoder calculates the metrics, it uses the following suboptimal principle. Suppose that a block code of length N is used on the kth level of the encoding, and that the decoding in the previous (k-1) stages has determined the subsets S^{(k-1)}(1), S^{(k-1)}(2), ..., S^{(k-1)}(N) to which the transmitted symbols of the codeword v^{(k)} = v^{(k)}(1), v^{(k)}(2), ..., v^{(k)}(N) belong. Let S_i^{(k-1)}(n) (i = 0, 1 in Figure 1, and i = 0, 1, 2 in Figure 2) be the subsets of S^{(k-1)}(n) corresponding to transmission of v^{(k)}(n) = i, respectively. Let s^{(k)}(n) \in S^{(k-1)}(n), n = 1, 2, ..., N, and s^{(k)} = s^{(k)}(1), s^{(k)}(2), ..., s^{(k)}(N). Finally, let S_{v^{(k)}}^{(k-1)} = S_{v^{(k)}(1)}^{(k-1)}(1), S_{v^{(k)}(2)}^{(k-1)}(2), ..., S_{v^{(k)}(N)}^{(k-1)}(N) be the sequence of subsets corresponding to transmission of the codeword v^{(k)}. Then the distance (metric) between the received sequence r = r(1), r(2), ..., r(N) and the codeword v^{(k)} is determined as

\mu(r, v^{(k)}) = \min_{s^{(k)} \in S_{v^{(k)}}^{(k-1)}} d_E(r, s^{(k)}),     (1)

where d_E(x, y) is the squared Euclidean distance between the N-dimensional vectors x and y. The decoding consists of choosing the codeword v^{(k)} for which the metric \mu(r, v^{(k)}) above is minimal.
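As an illustrative sketch (the toy 4-PAM partition and the repetition codebook below are invented for the example and are not from the paper, and the subset labeling is taken to be the same in every position for simplicity), the per-position decomposition of the metric (1) and the stage decision can be written as:

```python
def metric(r, codeword, subsets):
    """Metric (1): squared Euclidean distance from the received sequence r to
    the nearest signal sequence whose n-th point lies in the subset selected
    by the n-th codeword symbol.  Because the subset choice is independent
    per position, the minimum over signal sequences splits into a sum of
    per-position minima."""
    return sum(min(abs(r_n - s) ** 2 for s in subsets[v_n])
               for r_n, v_n in zip(r, codeword))

def decode_stage(r, codebook, subsets):
    """One stage of the multistage decoder: pick the codeword whose metric
    to the received sequence is minimal."""
    return min(codebook, key=lambda v: metric(r, v, subsets))

# Toy example: a 4-PAM set {-3, -1, 1, 3} partitioned by the level bit,
# and a length-2 repetition code on this level.
subsets = {0: (-3.0, 1.0), 1: (-1.0, 3.0)}
codebook = [(0, 0), (1, 1)]
decision = decode_stage((0.9, 1.2), codebook, subsets)  # -> (0, 0)
```

The per-position split is what makes each stage decoder cheap compared with a maximum likelihood search over the full signal set.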
and when a convolutional code is used, the upper bound on the burst error (first-event) probability P(E) is [8]

where G(D) (and T(D), respectively) is the generating function of the code. The values of Z that give exponentially tight bounds in (2) and (3) were derived in [2] using the Chernoff bounding method [10], and are given below.
On the last decoding level, where the signal constellation consists of two points, we have

Z_2 = e^{-\Delta^2/8},     (4)

where \Delta is the least Euclidean distance between any two points in the signal constellation (on the last level there is only one distance), normalized by the standard deviation \sigma of the noise. If \Delta = \Delta_0 on the first decoding level, then \Delta = \sqrt{2}^{\,k-1} \Delta_0 on the kth decoding level. On the last but one level, where the signal constellation consists of four points,
Z_4 = \min_{s \ge 0} 2\, e^{2(s\Delta)^2}\, Q\!\left(-\sqrt{2}\, s\Delta\right) \left( e^{-s\Delta^2}\, Q\!\left(\frac{\Delta}{\sqrt{2}}(2s-1)\right) + e^{s\Delta^2}\, Q\!\left(\frac{\Delta}{\sqrt{2}}(2s+1)\right) \right),     (5)
where Q(x) = \frac{1}{\sqrt{2\pi}} \int_x^{\infty} e^{-t^2/2}\, dt. On each of the other levels the signal constellation is approximated by a QAM signal set with an infinite number of signal points, and thus we get an upper bound on Z for any level of decoding:
Z_{\infty,R} = \min_{s \ge 0} 4\, e^{2(s\Delta)^2 - s\Delta^2} \left( \sum_{j=-\infty}^{\infty} e^{-2sj\Delta^2} \left( Q\!\left(\sqrt{2}\, j\Delta - \sqrt{2}\, s\Delta\right) - Q\!\left(\sqrt{2}\, j\Delta - \sqrt{2}\, s\Delta + \frac{\Delta}{\sqrt{2}}\right) \right) \right)^2.     (6)
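As a numerical sketch (not part of the paper), the closed-form parameters (4) and (5) can be evaluated directly; the crude grid search over s is an arbitrary choice (any one-dimensional minimizer works), and the Q-function is computed via the complementary error function:

```python
import math

def qfunc(x):
    # Gaussian tail function Q(x) = (1/sqrt(2*pi)) * integral_x^inf e^{-t^2/2} dt
    return 0.5 * math.erfc(x / math.sqrt(2))

def z2(delta):
    # Chernoff parameter on the last (two-point) level, eq. (4)
    return math.exp(-delta ** 2 / 8)

def z4(delta):
    # Chernoff parameter on the four-point level, eq. (5), minimized over
    # s >= 0 by a grid search (the minimizer sits near s = 1/4)
    def phi(s):
        return (2 * math.exp(2 * (s * delta) ** 2) * qfunc(-math.sqrt(2) * s * delta)
                * (math.exp(-s * delta ** 2) * qfunc(delta / math.sqrt(2) * (2 * s - 1))
                   + math.exp(s * delta ** 2) * qfunc(delta / math.sqrt(2) * (2 * s + 1))))
    return min(phi(i / 1000) for i in range(1001))  # s in [0, 1]
```

At s = 0 both bounding expressions equal 1, so the minimization can only improve on the trivial bound Z <= 1, and the parameters shrink as the normalized distance \Delta grows.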
The cutoff rate for each level can be calculated as

R_c = \log_2 \frac{2}{1 + Z},     (7)

and the total cutoff rate is obtained by adding the cutoff rates of all the levels. The capacity of each level is

C = H(\mu) - H(\mu \mid I),     (8)

where \mu = \sqrt{2}(X + Y)\Delta - \Delta^2 for the case with an infinite number of signal points, (X, Y) are the coordinates of the received signal point, and I indicates the symbol sent on the corresponding level. For details we refer to [2]. The total capacity is calculated analogously to the total cutoff rate.
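A small sketch of (7) and of the summation over levels (the Z values fed in are placeholders, not values from the paper):

```python
import math

def cutoff_rate(z):
    # Per-level cutoff rate, eq. (7): Rc = log2( 2 / (1 + Z) )
    return math.log2(2 / (1 + z))

def total_cutoff_rate(z_per_level):
    # The total cutoff rate is the sum of the per-level cutoff rates.
    return sum(cutoff_rate(z) for z in z_per_level)

# A useless level (Z = 1) contributes 0 bits; a noiseless level (Z = 0)
# contributes a full bit.
```

Since Z falls off quickly with the per-level distance, the early levels dominate the loss in total cutoff rate.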
where

f_X(x; b) = \frac{3}{\sqrt{2\pi}} \sum_{k=-\infty}^{\infty} \sum_{j=-\infty}^{\infty} \Big( e^{-\frac{1}{2}(x-3j\Delta)^2} \big( Q(\sqrt{3}\, k\Delta - b) - Q(\sqrt{3}\, k\Delta + b) \big) + e^{-\frac{1}{2}\left(x-3(j+\frac{1}{2})\Delta\right)^2} \big( Q(\sqrt{3}\, (k+\tfrac{1}{2})\Delta - b) - Q(\sqrt{3}\, (k+\tfrac{1}{2})\Delta + b) \big) \Big).     (11)
Proof. The set partitioning corresponding to a hexagonal QAM signal set with an infinite number of signal points is shown in Figure 4. Without loss of generality we suppose that the transmitted codeword on the kth level is v_0^{(k)}, the all-zero codeword, corresponding to sending points from the reference set, Figure 4. We also suppose that v_l^{(k)} is a codeword of Hamming weight w_l^{(k)}. To simplify the analysis, we change the order of the transmitted symbols such that the first w_l^{(k)} symbols of v_l^{(k)} are non-zero. From [2] we have that
Z^{(k)} = \min_{s \ge 0} \varphi^{(k)}(s).     (12)
To simplify notation in the following, we leave out the superscript (k) and the argument (n). Thus, to calculate the Chernoff bounding parameter Z we need to study the metric \mu, which is the difference between the squared distances from the received point to the nearest reference point and to the nearest opposite point. Conditioned on the received point, with coordinates (x, y), lying in the region marked in Figure 4,

\mu = (x^2 + y^2) - \left((\Delta - x)^2 + y^2\right) = 2x\Delta - \Delta^2.     (15)

The conditional probability density function of (X, Y), given that a point of the reference set was transmitted and that the received point is in the marked region, is
f_{(X,Y)}(x, y) = \frac{3}{2\pi} \sum_{k=-\infty}^{\infty} \sum_{j=-\infty}^{\infty} \left( e^{-\frac{1}{2}\left((x-3j\Delta)^2 + (y-\sqrt{3}k\Delta)^2\right)} + e^{-\frac{1}{2}\left((x-3(j+\frac{1}{2})\Delta)^2 + (y-\sqrt{3}(k+\frac{1}{2})\Delta)^2\right)} \right).     (16)
Consequently, the probability density function of X under the same conditions is

f_X(x) =
\begin{cases}
f_X(x; \sqrt{3}\, x), & 0 \le x \le \Delta/2, \\
f_X(x; \sqrt{3}\, (\Delta - x)), & \Delta/2 \le x \le \Delta,
\end{cases}     (17)

where

f_X(x; b) = \int_{y=-b}^{b} f_{(X,Y)}(x, y)\, dy,     (18)
the result of which is (11). If the received point is in any other region, we can introduce an alternative coordinate system such that we have the same situation as in Figure 4, and thus (15)-(18) remain valid. Following the technique used in [2] we thus get
Z_{\infty,H} = \min_{s \ge 0} \int e^{s\mu} f(\mu)\, d\mu = \min_{s \ge 0} \int_{x=0}^{\Delta} e^{s\Delta(2x-\Delta)} f_X(x)\, dx,     (19)
coordinates (x, y) given that the received signal point is in the integration area, and given that the sent symbol was i, where i = 0, 1, 2. The total capacity and the total cutoff rate are computed as before.
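The algebraic reduction in (15) — the y-terms cancel, leaving a metric that depends on the received point only through its x-coordinate — can be checked directly (a quick verification, not part of the paper):

```python
# Reference point at the origin, nearest opposite point at (delta, 0);
# the difference of squared distances reduces to 2*x*delta - delta**2.
def metric_difference(x, y, delta):
    return (x ** 2 + y ** 2) - ((delta - x) ** 2 + y ** 2)

delta = 1.7
for x, y in [(0.2, 0.9), (1.1, -0.4), (0.85, 0.0)]:
    assert abs(metric_difference(x, y, delta) - (2 * x * delta - delta ** 2)) < 1e-12
```

This is why the Chernoff integral in (19) reduces to a one-dimensional integral against the marginal density f_X.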
V. COMPARISON ANALYSIS
We want to compare an eight-level rectangular QAM system to a five-level hexagonal QAM system, Figure 3, under the condition that the energy per channel use is the same for the two systems. The average energy per channel use is, for the rectangular system,

E_R = \frac{10880}{256} (\Delta_R \sigma)^2,     (22)
and for the hexagonal system

To calculate the capacity and the cutoff rate in the rectangular QAM case we use the results in [2], and approximate all but the last two levels by rectangular QAM signal sets with an infinite number of signal points. In the hexagonal QAM case we approximate all but the last level by hexagonal QAM signal sets with an infinite number of signal points, Figure 4. The resulting cutoff rates and capacities are shown in Figures 5, 6, 7 and 8, respectively. It can be seen from Figures 7 and 8 that in terms of capacity the hexagonal system is close to 1 dB better than the rectangular one for signal-to-noise ratios between 10 dB and 25 dB. In terms of cutoff rate the hexagonal system is also better, but there the difference is approximately 0.7 dB.
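The constant in (22) can be checked by direct enumeration of the 256-point rectangular constellation, taken here as the standard 16 x 16 grid with minimum distance d (an assumption consistent with 10880/256 = 42.5):

```python
# 16 coordinate levels per dimension: ±d/2, ±3d/2, ..., ±15d/2, with d = 1.
d = 1.0
levels = [(2 * i - 15) * d / 2 for i in range(16)]

# Average energy per channel use over the 16 x 16 = 256 signal points.
energy = sum(a ** 2 + b ** 2 for a in levels for b in levels) / 256
assert abs(energy - 10880 / 256 * d ** 2) < 1e-9
```

The sum of the squared coordinates over the whole grid is exactly 10880 d^2, which is where the numerator in (22) comes from.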
References

[1] E. Biglieri, D. Divsalar, P. J. McLane and M. K. Simon, Introduction to Trellis-Coded Modulation with Applications, Macmillan, 1991.

[2] K. Engdahl and K. Sh. Zigangirov, "On the Calculation of the Error Probability for a Multilevel Modulation Scheme Using QAM-signaling," submitted to IEEE Trans. Information Theory.

[3] J. Huber, "Multilevel Codes: Distance Profiles and Channel Capacity," in ITG-Fachbericht 130, pp. 305-319, Oct. 1994. Conference Record.

[4] H. Imai and S. Hirakawa, "A New Multilevel Coding Method Using Error-Correcting Codes," IEEE Trans. Information Theory, vol. IT-23, pp. 371-377, May 1977.

[6] G. Pottie and D. Taylor, "Multilevel Codes Based on Partitioning," IEEE Trans. Information Theory, vol. IT-35, pp. 87-98, Jan. 1989.

[9] S. G. Wilson, Digital Modulation and Coding, Prentice Hall, 1996.
LIST OF FIGURE CAPTIONS

Figure 1: System description of the eight-level modulation scheme using rectangular QAM-signaling.

Figure 5: Comparison of the total cutoff rates of the two systems, rectangular QAM (dashed) and hexagonal QAM (solid). SNR is the ratio of the average energy per channel use to 2\sigma^2.
[Block diagram: the information sequence u is split into u(1), ..., u(8); independent encoders C1, ..., C8 produce v(1), ..., v(8); the partition-of-rectangular-QAM mapper (2^8 signal points) outputs s; after the AWGN channel, the suboptimal decoder maps r to the information estimate.]

Figure 1:
[Block diagram: the information sequence u is split into u(1), ..., u(5); independent encoders C1, ..., C5 produce v(1), ..., v(5); the partition-of-hexagonal-QAM mapper (3^5 signal points) outputs s; after the AWGN channel, the suboptimal decoder maps r to the information estimate.]

Figure 2:
Δ      Z_3      Z_∞,H    Z_∞,H / Z_3
0.5    0.9692   1.0000   1.03
1.0    0.8825   1.0000   1.13
1.5    0.7548   0.9991   1.32
2.0    0.6065   0.9747   1.61
2.5    0.4578   0.8805   1.92
3.0    0.3247   0.7190   2.21
3.5    0.2163   0.5310   2.46
4.0    0.1353   0.3572   2.64
4.5    0.0796   0.2206   2.77
5.0    0.0439   0.1258   2.86
5.5    0.0228   0.0666   2.92
6.0    0.0111   0.0328   2.96
6.5    0.0051   0.0151   2.98
7.0    0.0022   0.0065   2.99

Table 1:
[Signal-point diagrams showing the chain of set partitioning of the QAM signal sets: successive partition steps, indicated by arrows, thin out the constellation into sparser subsets.]

Figure 3:
[The hexagonal signal set in (x, y) coordinates: reference points (filled) and opposite points (open); the marked region lies between a reference point and its nearest opposite point.]

Figure 4:
[Plot: total cutoff rate (bits/ch. use) versus SNR (dB), from -10 to 30 dB.]

Figure 5:
[Plot: total cutoff rate (bits/ch. use), from 3 to 5, versus SNR (dB), from 10 to 15 dB.]

Figure 6:
[Plot: capacity C (bits/ch. use) versus SNR (dB), from -10 to 30 dB.]

Figure 7:
[Plot: capacity C (bits/ch. use), from 3 to 5, versus SNR (dB), from 10 to 15 dB.]

Figure 8: