Lecture 17

COMMUNICATION SYSTEMS, Lecture # 17
AWGN Channel
4th Apr 2007
Instructor: Waseem Khan

A channel which adds Gaussian-distributed white noise to the signal is called an Additive White Gaussian Noise (AWGN) channel. The term additive means that the noise is simply superimposed on, or added to, the signal. This channel affects each transmitted symbol independently; such a channel is called a memoryless channel.
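As a sketch, an AWGN channel can be simulated by drawing an independent Gaussian noise sample for each transmitted sample; the helper name, the SNR value, and the +/-1 signalling below are illustrative assumptions, not part of the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def awgn_channel(signal, snr_db):
    """Hypothetical helper: superimpose white Gaussian noise on `signal`
    at the given SNR (dB). Each sample is corrupted independently,
    reflecting the memoryless character of the channel."""
    signal_power = np.mean(signal ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=signal.shape)
    return signal + noise

# A +/-1 antipodal bit stream passed through the channel at 10 dB SNR
bits = rng.integers(0, 2, 1000)
tx = 2.0 * bits - 1.0
rx = awgn_channel(tx, snr_db=10)
```

Because the noise samples are independent of one another, the error a symbol suffers does not depend on what happened to earlier symbols.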

Centre for Advanced Studies in Engineering

Effect of Noise

Noise added to the information signal may cause errors in the received information bits. Bit-error rate (BER) is the basic criterion for assessing the performance of a communication system. Usually BER is plotted against Eb/N0, where Eb is the bit energy: Eb = S · Tb, with S the signal power and Tb the bit duration.

Bit-energy Eb

Suppose that in a binary digital communication system a symbol is defined as

    s0(t) = A cos(ωt),    0 ≤ t ≤ Tb

The bit energy can be calculated as

    Eb = ∫₀^Tb s0²(t) dt = ∫₀^Tb [A cos(ωt)]² dt
       = A²Tb/2 + A² sin(2ωTb)/(4ω)
       ≈ A²Tb/2    (since Tb >> sin(2ωTb)/(2ω))

Eb/N0

N0 is the noise power spectral density, so we can write N0 = N/W, where N is the total noise power and W is the bandwidth. Since Eb = S·Tb = S/Rb, where Rb is the bit rate, we can define Eb/N0 as

    Eb/N0 = (S/Rb) / (N/W) = (S/N) · (W/Rb)

Eb/N0 is dimensionless and is usually expressed in decibels (dB):

    Eb/N0 (dB) = 10 log₁₀(Eb/N0)

Demodulation and Detection of Digital Signals

[Block diagram: transmitted waveform → AWGN channel → received waveform → frequency down-conversion (for bandpass signals) → receiving filter → equalizing filter (compensation for channel-induced ISI) → demodulate & sample (sample at t = T) → detect (threshold comparison) → message symbol or channel symbol]

The digital receiver performs two basic functions:
- Demodulation: recovering a waveform to be sampled at t = nT.
- Detection: the decision-making process of selecting the possible digital symbol.
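The bit-energy approximation Eb ≈ A²Tb/2 and the dB form of Eb/N0 can be checked numerically; the amplitude, bit duration, carrier frequency, and noise density below are illustrative values, not taken from the lecture.

```python
import numpy as np

# Illustrative parameters (assumed): amplitude A, bit duration Tb,
# and a carrier frequency fc chosen so that Tb spans many cycles.
A, Tb, fc = 1.0, 1e-3, 10e3
t = np.linspace(0.0, Tb, 100_000)
s = A * np.cos(2 * np.pi * fc * t)

# Numerical bit energy (Riemann sum) vs the closed form A^2 * Tb / 2
dt = t[1] - t[0]
Eb_numeric = float(np.sum(s[:-1] ** 2) * dt)
Eb_closed = A ** 2 * Tb / 2

# Eb/N0 in dB for an assumed noise power spectral density N0
N0 = 1e-6
EbN0_db = 10 * np.log10(Eb_closed / N0)
```

With Tb spanning ten full carrier cycles, the residual sin(2ωTb)/(4ω) term vanishes and the two energy values agree closely.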

Detection of Binary Signal in Gaussian Noise

For any binary channel, the transmitted signal over a symbol interval (0, T) is

    si(t) = s0(t),  0 ≤ t ≤ T,  for a binary 0
            s1(t),  0 ≤ t ≤ T,  for a binary 1

[Figure: example waveforms of the original signal, the noise, and the resulting noisy signal]

If the channel is AWGN, the received signal will be

    r(t) = si(t) + n(t),    i = 0, 1,    0 ≤ t ≤ T

The recovery of the signal at the receiver consists of two parts:
- Waveform-to-sample transformation: a demodulator (filter) followed by a sampler. Each symbol is sampled at t = T, reducing the received signal to a single variable z(T), called the test statistic.
- Detector (decision circuit): compares z(T) to a threshold level γ0, i.e. choose H1 if z(T) > γ0 and H0 if z(T) < γ0, where H1 and H0 are the two possible binary hypotheses.
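A minimal sketch of this sample-and-threshold receiver, assuming antipodal levels a0 and a1 and a noise spread sigma0 of our choosing; the measured BER is compared against the Q-function prediction for this setup.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)

# Illustrative values (ours, not the lecture's)
a0, a1, sigma0 = -1.0, 1.0, 0.5
n_bits = 200_000

bits = rng.integers(0, 2, n_bits)
a = np.where(bits == 1, a1, a0)            # desired component a_i(T)
z = a + rng.normal(0.0, sigma0, n_bits)    # test statistic z(T)

gamma0 = (a0 + a1) / 2                     # threshold for equally likely symbols
decisions = (z > gamma0).astype(int)       # choose H1 when z(T) > gamma0

ber = float(np.mean(decisions != bits))
# Theoretical BER for this threshold: Q((a1 - a0) / (2 * sigma0))
ber_theory = 0.5 * erfc((a1 - a0) / (2 * sigma0) / sqrt(2))
```

Placing the threshold midway between the two levels is what the MAP rule of the next section reduces to when the symbols are equally likely.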

The sample can be written as

    z(T) = ai(T) + n0(T),    i = 0, 1

where ai(T) is the desired signal component and n0(T) is the noise component.

Detection of symbol: assume that the input noise is a Gaussian random process, i.e.

    p(n0) = [1 / (σ0 √(2π))] exp[ -(1/2) (n0/σ0)² ]

The sample z(T) will be another Gaussian random variable:

    p(z|s0) = [1 / (σ0 √(2π))] exp[ -(1/2) ((z - a0)/σ0)² ]
    p(z|s1) = [1 / (σ0 √(2π))] exp[ -(1/2) ((z - a1)/σ0)² ]

Probabilities Review

- P[s0], P[s1]: a priori probabilities, known before transmission
- p(z): probability density of the received sample
- p(z|s0), p(z|s1): conditional pdfs of the received sample z, conditioned on the transmitted symbol si
- P[s0|z], P[s1|z]: a posteriori probabilities

Choosing the Threshold

Maximum a posteriori (MAP) criterion:

    If p(s0|z) > p(s1|z), choose H0
    If p(s1|z) > p(s0|z), choose H1

The problem is that the a posteriori probabilities are not known. Solution: use Bayes' theorem,

    p(si|z) = p(z|si) P(si) / p(z)

so the MAP rule becomes: choose H1 if p(z|s1)P(s1)/p(z) > p(z|s0)P(s0)/p(z), and H0 otherwise. Since p(z) is common to both sides, this is equivalent to the likelihood ratio test (LRT):

    L(z) = p(z|s1) / p(z|s0)  >  P(s0)/P(s1)  →  choose H1  (otherwise H0)

When the two signals, s0(t) and s1(t), are equally likely, i.e. P(s0) = P(s1) = 0.5, the decision rule becomes

    p(z|s1) / p(z|s0)  >  1  →  choose H1  (otherwise H0)

This is known as the maximum likelihood ratio test, because we select the hypothesis corresponding to the signal with the maximum likelihood. In terms of the Bayes criterion, it implies that the cost of both types of error is the same.
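The LRT above can be sketched directly, assuming Gaussian likelihoods with illustrative means and unit spread; it shows how unequal priors move the MAP decision away from the maximum-likelihood one.

```python
import numpy as np

# Illustrative parameters (ours): means a0, a1 and noise spread sigma0
a0, a1, sigma0 = 0.0, 2.0, 1.0

def gaussian_pdf(z, mean, sigma):
    return np.exp(-0.5 * ((z - mean) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def lrt_decide(z, P_s0, P_s1):
    """Choose H1 when L(z) = p(z|s1)/p(z|s0) exceeds P(s0)/P(s1)."""
    L = gaussian_pdf(z, a1, sigma0) / gaussian_pdf(z, a0, sigma0)
    return 1 if L > P_s0 / P_s1 else 0

# Equal priors reduce the LRT to the ML rule (threshold midway between a0, a1);
# a strong prior for s0 pushes the same sample to the H0 decision.
d_ml = lrt_decide(1.2, 0.5, 0.5)
d_map = lrt_decide(1.2, 0.9, 0.1)
```

The same sample z = 1.2 is decided as a 1 under equal priors but as a 0 when P(s0) = 0.9, because the LRT threshold P(s0)/P(s1) rises from 1 to 9.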

Announcements
- Class on 6th April (6:00 to 7:30 pm) in lieu of 7th April.
- Second sessional on 13th April (Friday), 6:00 to 7:30 pm.


