A Comparison Analysis of Hexagonal Multilevel QAM and Rectangular Multilevel QAM

Karin Engdahl and Kamil Sh. Zigangirov
Department of Information Technology, Telecommunication Theory Group
Lund University, Box 118, S-221 00 Lund, Sweden
Phone +46 46 2223450, Fax +46 46 2224714, E-mail [email protected]

Abstract: The performances of two multilevel modulation systems are compared in terms of capacity and cutoff rate, under the condition that the average energy per channel use is the same for the two systems. The systems are versions of the scheme of Imai and Hirakawa, and we consider the multistage suboptimal receiver for the Gaussian channel. The first system is an eight-level modulation system using rectangular QAM signaling and a binary alphabet; the second is a five-level modulation system using hexagonal QAM signaling and a ternary alphabet. It is shown that, under these conditions, the capacity of the hexagonal system is close to 1 dB better than the capacity of the rectangular system for a range of signal-to-noise ratios. The Chernoff bounding parameter is also calculated for hexagonal QAM with an infinite number of signal points.

Keywords: Multilevel modulation, hexagonal QAM signaling, rectangular QAM signaling, QAM signaling, set partitioning, ternary signaling.

This work was supported in part by the Swedish Research Council for Engineering Sciences under Grant 95-164.

I. INTRODUCTION

The principle of trellis-coded modulation and the concept of set partitioning were described by Ungerboeck in his 1982 paper [7]. In [4], Imai and Hirakawa proposed a coded modulation scheme in which the labels on the branches from one level of the partition chain to the next are encoded by independent codes. This scheme enables the use of a suboptimal multistage decoder, which demonstrates a performance/complexity advantage over the maximum likelihood decoder. The aim of this work is to compare the capacities of an eight-level modulation scheme using rectangular QAM signaling and a five-level modulation scheme using hexagonal QAM signaling. Both schemes are versions of the one proposed in [4]. We consider the discrete memoryless Gaussian channel. The multistage decoder consists of a set of suboptimal decoders matched to the codes used on the corresponding levels of encoding, and uses as metric the squared Euclidean distance from the received point to the nearest points of the corresponding signal subset [2], [4].

First, in Section II, we give a description of the systems considered. Then, in Section III, the main results of [2] are presented: the expressions for the Chernoff bounding parameter Z, the cutoff rate, and the capacity for rectangular QAM signaling. The Chernoff bounding parameter Z for hexagonal QAM signaling is calculated in Section IV, where the cutoff rate and capacity are also given. A comparison analysis is performed in Section V. It is shown that for certain signal-to-noise ratios the capacity of the hexagonal constellation is almost 1 dB better than that of the rectangular constellation.


II. SYSTEM DESCRIPTION

The schemes used are shown in Figures 1 and 2. An information sequence u (binary in Figure 1, ternary in Figure 2) is partitioned into K subsequences (K = 8 in Figure 1, K = 5 in Figure 2), where each subsequence is encoded by an independent code C_k. The output code sequences are v^{(1)}, v^{(2)}, ..., v^{(K)}, where

v^{(k)} = \{ v^{(k)}(1), v^{(k)}(2), \ldots, v^{(k)}(n), \ldots \} \quad \text{for } k = 1, 2, \ldots, K.

A set of K symbols, v^{(1)}(n), v^{(2)}(n), ..., v^{(K)}(n), one from each code sequence, is synchronously mapped onto one of the QAM signal points, s(n). The channel considered in this paper is the discrete memoryless Gaussian channel with complex input sequence s = s(1), s(2), ..., s(n), ... and complex output sequence r = s + e. The sequence e = e(1), e(2), ..., e(n), ... is an error sequence, e(n) = e^{(I)}(n) + j e^{(Q)}(n), where e^{(I)}(n) and e^{(Q)}(n) are independent Gaussian random variables with zero mean and variance \sigma^2.

The multistage suboptimal decoder is a modified version of the one proposed in [4]. Each decoding stage consists of calculating distances (metrics) from the received sequence r to all possible codewords on the corresponding level of set partitioning. The side information from the previous decoding stages determines, according to the set partitioning structure, the signal set over which the metrics are calculated. When the decoder calculates the metrics, it uses the following suboptimal principle. Suppose that a block code of length N is used on the kth level of the encoding, and that the decoding in the previous (k-1) decoding stages determines the subsets S^{(k-1)}(1), S^{(k-1)}(2), ..., S^{(k-1)}(N) to which the transmitted symbols of the codeword v^{(k)} = v^{(k)}(1), v^{(k)}(2), ..., v^{(k)}(N) belong. Let S_i^{(k-1)}(n) (i = 0, 1 in Figure 1; i = 0, 1, 2 in Figure 2) be the subsets of S^{(k-1)}(n) corresponding to transmission of v^{(k)}(n) = i, respectively. Let s^{(k)}(n) \in S^{(k-1)}(n), n = 1, 2, \ldots, N, and s^{(k)} = s^{(k)}(1), s^{(k)}(2), ..., s^{(k)}(N). Finally, let

S^{(k-1)}_{v^{(k)}} = S^{(k-1)}_{v^{(k)}(1)}(1), S^{(k-1)}_{v^{(k)}(2)}(2), \ldots, S^{(k-1)}_{v^{(k)}(N)}(N)

be the sequence of subsets corresponding to transmission of the codeword v^{(k)}. Then the distance (metric) between the received sequence r = r(1), r(2), ..., r(N) and the codeword v^{(k)} is determined as

\mu\left(\mathbf{r}, \mathbf{v}^{(k)}\right) = \min_{\mathbf{s}^{(k)} \in S^{(k-1)}_{v^{(k)}}} d_E\left(\mathbf{r}, \mathbf{s}^{(k)}\right),    (1)

where d_E(x, y) is the squared Euclidean distance between the N-dimensional vectors x and y. The decoding consists of choosing the codeword v^{(k)} for which the metric \mu(\mathbf{r}, \mathbf{v}^{(k)}) above is minimal.
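The suboptimal metric of Eq. (1) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; the toy constellation and all names are hypothetical.

```python
def metric(r, v, subsets):
    """Suboptimal metric of Eq. (1): at each position n, take the squared
    Euclidean distance from the received sample r[n] to the NEAREST signal
    point in the subset selected by the code symbol v[n], then sum over n.
    `r` is a list of complex received samples, `v` a tuple of code symbols,
    and `subsets[i]` the list of complex signal points labelled i."""
    return sum(min(abs(rn - s) ** 2 for s in subsets[vn])
               for rn, vn in zip(r, v))

def decode_stage(r, codewords, subsets):
    """One decoding stage: choose the codeword whose metric is minimal."""
    return min(codewords, key=lambda v: metric(r, v, subsets))

# Toy example: a hypothetical 4-point set split into two 2-point subsets.
subsets = {0: [1 + 0j, -1 + 0j], 1: [0 + 1j, 0 - 1j]}
r = [0.9 + 0.1j, 0.1 - 1.1j]
best = decode_stage(r, [(0, 0), (0, 1), (1, 1)], subsets)  # -> (0, 1)
```

Note that only the nearest point of each subset contributes, which is exactly the suboptimality: the metric ignores the other points of the subset.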

III. CAPACITY AND CUTOFF RATE FOR RECTANGULAR QAM

In [2] we analyzed a multilevel modulation scheme using rectangular QAM signaling. The main result was the derivation of explicit expressions for the Chernoff bounding parameter Z, which is used in the calculation of error probabilities as described below. The performance of a multilevel coded modulation system employing a multistage decoder is commonly estimated by the average bit error probability of each component code, or by the block error probability and burst error probability for block and convolutional coding, respectively [1], [3], [5], [6]. When a linear block code is used on a signaling level, an upper bound on the probability of decoding error P(\varepsilon) for that level is [9]

P(\varepsilon) \le G(D)\big|_{D=Z},    (2)

and when a convolutional code is used, the upper bound on the burst error (first-event) probability P(\varepsilon) is [8]

P(\varepsilon) < T(D)\big|_{D=Z},    (3)

where G(D) (respectively T(D)) is the generating function of the code. The values of Z that give exponentially tight bounds in (2) and (3) were derived in [2] using the Chernoff bounding method [10], and are given below. On the last decoding level, where the signal constellation consists of two points, we have

Z_2 = e^{-\Delta^2/8},    (4)

where \Delta is the least Euclidean distance between any two points in the signal constellation (on the last level there is only one distance), normalized by the standard deviation \sigma of the noise. If \Delta = \Delta_0 on the first decoding level, then \Delta = (\sqrt{2})^{k-1} \Delta_0 on the kth decoding level. On the last level but one, where the signal constellation consists of four points,

Z_4 = \min_{s \ge 0} \left[ 2 e^{2(s\Delta)^2} Q\left(\sqrt{2}\, s\Delta\right) - e^{-s\Delta^2} Q\left(\frac{\Delta}{\sqrt{2}}(2s-1)\right) + e^{s\Delta^2} Q\left(\frac{\Delta}{\sqrt{2}}(2s+1)\right) \right],    (5)

where Q(x) = \frac{1}{\sqrt{2\pi}} \int_x^{\infty} e^{-t^2/2}\, dt. On each of the other levels the signal constellation is approximated by a QAM signal set with an infinite number of signal points, and thus we get an upper bound on Z for any level of decoding:

Z_{\infty,R} = \min_{s \ge 0} \left[ 4\, e^{(s\Delta)^2/2} \sum_{j=-\infty}^{\infty} e^{\sqrt{2}\, s j \Delta} \left( Q\left(\frac{\Delta}{\sqrt{2}}(2j-s)\right) - Q\left(\frac{\Delta}{\sqrt{2}}(2j+s)\right) \right) \right]^{1/2}.    (6)

The cutoff rate for each level can be calculated as

R_c = -\log_2 \frac{1 + Z}{2},    (7)

and the total cutoff rate is obtained by adding the cutoff rates of all the levels. The capacity of each level is

C = H(\rho) - H(\rho \mid I),    (8)

where \rho = \sigma^{-2}\left(X^2 + Y^2\right) for the case with an infinite number of signal points, (X, Y) are the coordinates of the received signal point, and I indicates the symbol sent on the corresponding level. For details we refer to [2]. The total capacity is calculated analogously to the total cutoff rate.
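The last-level parameter (4) and the per-level cutoff rate (7) are easy to evaluate numerically. The sketch below is illustrative only: it reuses the two-point formula (4) on every level as a stand-in for the level-dependent values of (5) and (6), and all names are hypothetical.

```python
import math

def q_func(x):
    """Gaussian tail function Q(x) = (1/sqrt(2*pi)) * integral_x^inf e^{-t^2/2} dt,
    expressed through the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def z_two_point(delta):
    """Chernoff parameter on the last decoding level, Eq. (4): Z2 = exp(-Delta^2/8)."""
    return math.exp(-delta ** 2 / 8.0)

def cutoff_rate_binary(z):
    """Per-level cutoff rate for a binary alphabet, Eq. (7)."""
    return -math.log2((1.0 + z) / 2.0)

# The normalized distance grows as sqrt(2)^(k-1) * Delta_0 across the eight
# levels; the total cutoff rate is the sum of the per-level rates.
delta0 = 1.0
deltas = [math.sqrt(2.0) ** (k - 1) * delta0 for k in range(1, 9)]
total_rc = sum(cutoff_rate_binary(z_two_point(d)) for d in deltas)
```

As a sanity check, Z -> 0 (noiseless) gives the full 1 bit per binary level, and Z = 1 gives rate 0.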

IV. CAPACITY AND CUTOFF RATE FOR HEXAGONAL QAM

We now consider a ternary alphabet, as opposed to the binary alphabet used in the case of rectangular QAM. An example of the set partitioning in a multilevel modulation scheme with five levels is shown in Figure 3. On the last decoding level, where the signal constellation consists of three signal points, the value of the parameter Z is the same as in the case of two signal points, that is

Z_3 = e^{-\Delta^2/8},    (9)

where \Delta is defined as before. Here, if \Delta = \Delta_0 on the first decoding level, then \Delta = (\sqrt{3})^{k-1} \Delta_0 on the kth decoding level. Once again, the signal constellation on all the other levels is approximated by a hexagonal QAM signal set with an infinite number of signal points (Figure 4), and thus we get an upper bound on Z.

Theorem 1. On each level of suboptimal decoding of a multilevel coded hexagonal QAM signal set with an infinite number of signal points, the Chernoff bounding parameter Z has the value

Z_{\infty,H} = \min_{s \ge 0} e^{-s\Delta^2} \left[ \int_0^{\Delta/2} e^{2sx\Delta} f_X\left(x; \sqrt{3}\,x\right) dx + \int_{\Delta/2}^{\Delta} e^{2sx\Delta} f_X\left(x; \sqrt{3}\,(\Delta - x)\right) dx \right],    (10)

where

f_X(x; b) = \frac{1}{\sqrt{2\pi}} \sum_{k=-\infty}^{\infty} \sum_{j=-\infty}^{\infty} \Bigl[ e^{-\frac{1}{2}(x - 3j\Delta)^2} \left( Q\left(\sqrt{3}\,k\Delta - b\right) - Q\left(\sqrt{3}\,k\Delta + b\right) \right) + e^{-\frac{1}{2}\left(x - 3\left(j + \frac{1}{2}\right)\Delta\right)^2} \left( Q\left(\sqrt{3}\left(k + \tfrac{1}{2}\right)\Delta - b\right) - Q\left(\sqrt{3}\left(k + \tfrac{1}{2}\right)\Delta + b\right) \right) \Bigr].    (11)

Proof. The set partitioning corresponding to a hexagonal QAM signal set with an infinite number of signal points is shown in Figure 4. Without loss of generality we suppose that the transmitted codeword on the kth level is v_0^{(k)}, the all-zero codeword, corresponding to sending points from the reference set (Figure 4). We also suppose that v_l^{(k)} is a codeword of Hamming weight w_l^{(k)}. To simplify the analysis, we change the order of the transmitted symbols such that the first w_l^{(k)} symbols of v_l^{(k)} are non-zero. From [2] we have that

Z^{(k)} = \min_{s \ge 0} \varphi^{(k)}(s),    (12)

where \varphi^{(k)}(s) is the generating function of the metric

\lambda_l^{(k)}(n) = d_E\left(r(n), v_0^{(k)}(n)\right) - d_E\left(r(n), v_l^{(k)}(n)\right),    (13)

and where

d_E\left(r(n), v_l^{(k)}(n)\right) = \min_{s^{(k)}(n) \in S^{(k-1)}_{v_l^{(k)}(n)}(n)} d_E\left(r(n), s^{(k)}(n)\right).    (14)

To simplify notation in the following, we leave out the superscript (k) and the argument (n). Thus, to calculate the Chernoff bounding parameter Z we need to study the metric \lambda, which is the difference between the distances from the received point to the nearest reference point and to the nearest opposite point. Conditioned on the received point, with coordinates (x, y), lying in the region marked in Figure 4,

\lambda = \left(x^2 + y^2\right) - \left((\Delta - x)^2 + y^2\right) = 2x\Delta - \Delta^2.    (15)

The conditional probability density function of (X, Y), given that a point of the reference set was transmitted and that the received point is in the marked region, is

f_{(X,Y)}(x, y) = \frac{1}{2\pi} \sum_{k=-\infty}^{\infty} \sum_{j=-\infty}^{\infty} \left( e^{-\frac{1}{2}\left((x - 3j\Delta)^2 + \left(y - \sqrt{3}k\Delta\right)^2\right)} + e^{-\frac{1}{2}\left(\left(x - 3\left(j+\frac{1}{2}\right)\Delta\right)^2 + \left(y - \sqrt{3}\left(k+\frac{1}{2}\right)\Delta\right)^2\right)} \right).    (16)

Consequently, the probability density function of \lambda, with the same conditions as above, is given by

f_X(x) = \begin{cases} f_X\left(x; \sqrt{3}\,x\right), & 0 \le x \le \Delta/2, \\ f_X\left(x; \sqrt{3}\,(\Delta - x)\right), & \Delta/2 \le x \le \Delta, \end{cases}    (17)

where

f_X(x; b) = \int_{-b}^{b} f_{(X,Y)}(x, y)\, dy,    (18)

the result of which is (11). If the received point is in any other region, we can introduce an alternative system of coordinates such that we have the same situation as in Figure 4, and thus (15)-(18) remain valid. Following the technique used in [2] we thus get

Z_{\infty,H} = \min_{s \ge 0} \int e^{s\lambda} f(\lambda)\, d\lambda = \min_{s \ge 0} \int_{x=0}^{\Delta} e^{s(2x\Delta - \Delta^2)} f_X(x)\, dx,    (19)

which is the same as (10).
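The Chernoff minimization over s in (12) and (19) can be illustrated on the simplest case, the two-point constellation, where the generating function of \lambda = 2X\Delta - \Delta^2 with X ~ N(0,1) has the closed form exp(2s^2\Delta^2 - s\Delta^2) and the minimum, at s = 1/4, recovers (4). This is a sketch of the bounding technique only, with hypothetical names.

```python
import math

def phi(s, delta):
    """Generating function E[exp(s*lambda)] of lambda = 2*X*Delta - Delta^2,
    X ~ N(0, 1): equals exp(2*s^2*Delta^2 - s*Delta^2) for the two-point case."""
    return math.exp(2.0 * s * s * delta * delta - s * delta * delta)

def chernoff_z(delta, grid=10001):
    """Z = min over s >= 0 of phi(s), located here by a simple grid
    search on s in [0, 1] (the minimizer s = 1/4 lies in this range)."""
    return min(phi(i / (grid - 1), delta) for i in range(grid))

delta = 2.0
z_numeric = chernoff_z(delta)            # numeric minimum of phi(s)
z_closed = math.exp(-delta ** 2 / 8.0)   # analytic minimum, attained at s = 1/4
```

The same grid-search-over-s pattern applies to (10), with the integrals evaluated numerically for each candidate s.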

Numerical values of Z_{\infty,H} and Z_3 are shown in Table 1. There it can be seen that when \Delta is large, the ratio Z_{\infty,H}/Z_3 approaches 3. This is an expected result, due to the "nearest neighbor error events" principle and the fact that every signal point in the constellation with infinitely many signal points has exactly three nearest neighbors of each sort. The cutoff rate for each level is calculated as

R_c = -\log_2 \frac{1 + 2Z}{3}.    (20)

The capacity of each level is

C = H(X, Y) - H(X, Y \mid I) = -\iint \left( \frac{1}{3} \sum_{i=0}^{2} f^{(i)}_{(X,Y)}(x, y) \right) \log_2 \left( \frac{1}{3} \sum_{i=0}^{2} f^{(i)}_{(X,Y)}(x, y) \right) dx\, dy + \frac{1}{3} \sum_{i=0}^{2} \iint f^{(i)}_{(X,Y)}(x, y) \log_2 f^{(i)}_{(X,Y)}(x, y)\, dx\, dy,    (21)

where the integration area is the upper triangle of the region marked in Figure 4 (for the last level, when the signal constellation consists of three points, the integration area is R^2), and f^{(i)}_{(X,Y)}(x, y) is the conditional probability density function of the coordinates (x, y) given that the received signal point is in the integration area and that the sent symbol was i, where i = 0, 1, 2. The total capacity and the total cutoff rate are computed as before.
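The ternary cutoff rate (20) behaves as one would expect at the extremes: Z -> 0 gives the full log2(3) bits of a noiseless ternary symbol, and Z = 1 gives zero rate. A minimal numeric check (function name hypothetical):

```python
import math

def cutoff_rate_ternary(z):
    """Per-level cutoff rate for the ternary alphabet, Eq. (20):
    R_c = -log2((1 + 2Z)/3), decreasing in Z on [0, 1]."""
    return -math.log2((1.0 + 2.0 * z) / 3.0)

# Noiseless limit: log2(3) ~ 1.585 bits; useless-channel limit: 0 bits.
r_best = cutoff_rate_ternary(0.0)
r_worst = cutoff_rate_ternary(1.0)
```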

V. COMPARISON ANALYSIS

We want to compare the eight-level rectangular QAM system to the five-level hexagonal QAM system (Figure 3), under the condition that the energy per channel use is the same for the two systems. The average energy per channel use is, for the rectangular system,

E_R = \frac{10880}{256} \left(\Delta_R\right)^2,    (22)

and for the hexagonal system,

E_H = \frac{6942}{243} \left(\Delta_H\right)^2.    (23)

If these energies are to be equal, we get for the normalized Euclidean distances

\Delta_H = \sqrt{\frac{10880}{256} \cdot \frac{243}{6942}}\, \Delta_R \approx 1.2197\, \Delta_R.    (24)

To calculate the capacity and cutoff rate in the rectangular QAM case we use the results in [2], and approximate all but the last two levels with rectangular QAM signal sets with an infinite number of signal points. In the case of hexagonal QAM we approximate all but the last level with hexagonal QAM signal sets with an infinite number of signal points (Figure 4). The resulting cutoff rates and capacities are shown in Figures 5, 6, 7 and 8, respectively. It can be seen from Figures 7 and 8 that in terms of capacity the hexagonal system is close to 1 dB better than the rectangular one for signal-to-noise ratios between 10 dB and 25 dB. In terms of cutoff rate the hexagonal system is also better, but there the difference is approximately 0.7 dB.
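The energy-matching step of (22)-(24) is a one-line computation; the sketch below (hypothetical names) reproduces the stated factor 1.2197.

```python
import math

def energy_rect(delta_r):
    """Average energy per channel use of the rectangular system, Eq. (22)."""
    return 10880.0 / 256.0 * delta_r ** 2

def energy_hex(delta_h):
    """Average energy per channel use of the hexagonal system, Eq. (23)."""
    return 6942.0 / 243.0 * delta_h ** 2

# Setting E_R = E_H and solving for Delta_H / Delta_R gives Eq. (24):
ratio = math.sqrt((10880.0 / 256.0) / (6942.0 / 243.0))  # ~ 1.2197
```

So for equal energy, the hexagonal constellation affords a roughly 22% larger normalized minimum distance, which is the source of its capacity advantage.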

References

[1] E. Biglieri, D. Divsalar, P. J. McLane and M. K. Simon, Introduction to Trellis-Coded Modulation with Applications, Macmillan, 1991.

[2] K. Engdahl and K. Sh. Zigangirov, "On the Calculation of the Error Probability for a Multilevel Modulation Scheme Using QAM-signaling," submitted to IEEE Trans. Information Theory.

[3] J. Huber, "Multilevel Codes: Distance Profiles and Channel Capacity," in ITG-Fachbericht 130, pp. 305-319, Oct. 1994. Conference Record.

[4] H. Imai and S. Hirakawa, "A New Multilevel Coding Method Using Error-Correcting Codes," IEEE Trans. Information Theory, vol. IT-23, pp. 371-377, May 1977.

[5] Y. Kofman, E. Zehavi and S. Shamai, "Performance Analysis of a Multilevel Coded Modulation System," IEEE Trans. Communications, vol. COM-42, pp. 299-312, Feb./Mar./Apr. 1994.

[6] G. Pottie and D. Taylor, "Multilevel Codes Based on Partitioning," IEEE Trans. Information Theory, vol. IT-35, pp. 87-98, Jan. 1989.

[7] G. Ungerboeck, "Channel Coding with Multilevel/Phase Signals," IEEE Trans. Information Theory, vol. IT-28, pp. 55-67, Jan. 1982.

[8] A. J. Viterbi and J. K. Omura, Principles of Digital Communication and Coding, McGraw-Hill, 1979.

[9] S. G. Wilson, Digital Modulation and Coding, Prentice Hall, 1996.

[10] J. M. Wozencraft and I. M. Jacobs, Principles of Communication Engineering, Wiley, 1965.

LIST OF FIGURE CAPTIONS

Figure 1: System description of the eight-level modulation scheme using rectangular QAM signaling.

Figure 2: System description of the five-level modulation scheme using hexagonal QAM signaling.

Figure 3: Set partitioning of 243-point hexagonal QAM.

Figure 4: A hexagonal QAM signal constellation with an infinite number of signal points.

Figure 5: Comparison of the total cutoff rates of the two systems, rectangular QAM (dashed) and hexagonal QAM (solid). SNR is the ratio of the average energy per channel use to 2\sigma^2.

Figure 6: Detail of Figure 5.

Figure 7: Comparison of the total capacities of the two systems, rectangular QAM (dashed) and hexagonal QAM (solid). SNR is the ratio of the average energy per channel use to 2\sigma^2.

Figure 8: Detail of Figure 7.

Table 1: Numerical values of Z in the case of hexagonal QAM. For numerical values of Z in the case of rectangular QAM we refer to [2].

[Figure 1: block diagram. The information sequence u is partitioned into subsequences u(1), ..., u(8), each encoded by an independent code C1, ..., C8 into v(1), ..., v(8); a 2^8-point rectangular QAM mapper produces s, which passes through the AWGN channel; the received sequence r is processed by the suboptimal multistage decoder, yielding the estimate û. Graphics omitted.]

[Figure 2: block diagram, analogous to Figure 1, with five subsequences u(1), ..., u(5), codes C1, ..., C5, and a 3^5-point hexagonal QAM mapper. Graphics omitted.]

Table 1:

  Delta   Z_3      Z_{inf,H}   Z_{inf,H}/Z_3
  0.5     0.9692   1.0000      1.03
  1.0     0.8825   1.0000      1.13
  1.5     0.7548   0.9991      1.32
  2.0     0.6065   0.9747      1.61
  2.5     0.4578   0.8805      1.92
  3.0     0.3247   0.7190      2.21
  3.5     0.2163   0.5310      2.46
  4.0     0.1353   0.3572      2.64
  4.5     0.0796   0.2206      2.77
  5.0     0.0439   0.1258      2.86
  5.5     0.0228   0.0666      2.92
  6.0     0.0111   0.0328      2.96
  6.5     0.0051   0.0151      2.98
  7.0     0.0022   0.0065      2.99
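The Z_3 column of Table 1 follows directly from Eq. (9) and can be reproduced to the table's four decimals (function name hypothetical):

```python
import math

def z3(delta):
    """Chernoff parameter on the last hexagonal decoding level, Eq. (9):
    Z3 = exp(-Delta^2 / 8), Delta normalized by the noise standard deviation."""
    return math.exp(-delta ** 2 / 8.0)

# Spot-check a few entries of the Z3 column of Table 1.
expected = {0.5: 0.9692, 1.0: 0.8825, 2.0: 0.6065, 4.0: 0.1353, 7.0: 0.0022}
computed = {d: round(z3(d), 4) for d in expected}
```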

[Figure 3: the set partitioning chain of the 243-point hexagonal QAM constellation, shown as a sequence of constellation diagrams with arrows indicating the successive partitions into subsets. Graphics omitted.]

[Figure 4: a hexagonal QAM signal constellation with an infinite number of signal points, drawn in (x, y) coordinates, with the reference points and opposite points marked, together with the region used in the proof of Theorem 1. Graphics omitted.]

[Figure 5: total cutoff rate (bits/channel use, 0 to 8) versus SNR (dB, -10 to 30). Plot omitted.]

[Figure 6: detail of Figure 5, cutoff rate 3 to 5 bits/channel use over SNR 10 to 15 dB. Plot omitted.]

[Figure 7: total capacity C (bits/channel use, 0 to 8) versus SNR (dB, -10 to 30). Plot omitted.]

[Figure 8: detail of Figure 7, capacity 3 to 5 bits/channel use over SNR 10 to 15 dB. Plot omitted.]
