COMMUNICATION SYSTEMS
Lecture # 16, 31st Mar 2007
Instructor: Waseem Khan
Centre for Advanced Studies in Engineering

Statistical Characterization of Random Variables

The cdf or pdf is sufficient to fully characterize a random variable. However, a random variable can be PARTIALLY characterized by other measures.

Expected value or mean (1st moment):

    E[X] = μ_X = ∫ x f_X(x) dx

Mean-square value (2nd moment):

    E[X²] = ∫ x² f_X(x) dx

Central Moments

Variance (2nd central moment):

    σ_X² = var[X] = E[(X − μ_X)²] = ∫ (x − μ_X)² f_X(x) dx

Expanding the square gives the convenient identity

    var[X] = E[X²] − E[X]²

Standard deviation:

    STD[X] = σ_X = √var[X]

3rd, 4th and Nth moments can also be calculated, but the first two are the most useful.

Some Important Distributions

  • Rayleigh distribution
  • Gaussian distribution*
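The moment definitions above can be checked numerically. The following is a minimal NumPy sketch; the mean and variance values are hypothetical choices for illustration:

```python
import numpy as np

# Hypothetical example: estimate E[X], E[X^2], var[X] and the standard
# deviation from samples of a Gaussian random variable with mu=2, sigma=3.
rng = np.random.default_rng(0)
mu, sigma = 2.0, 3.0
x = rng.normal(mu, sigma, 1_000_000)

mean = x.mean()                      # E[X], 1st moment
mean_square = (x ** 2).mean()        # E[X^2], 2nd moment
var = mean_square - mean ** 2        # var[X] = E[X^2] - E[X]^2
std = np.sqrt(var)                   # STD[X]

print(round(mean, 1), round(var, 1), round(std, 1))  # prints: 2.0 9.0 3.0
```

With a million samples, the estimates land on the true values mu = 2, sigma² = 9 and sigma = 3 to well within one decimal place.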

Gaussian Distribution in Matlab

x = -4:0.01:4;
g = exp(-x.^2/2)/sqrt(2*pi);   % standard Gaussian pdf on the grid
N = 1000000;                   % number of random values
y = randn(1,N);
h = hist(y,x);                 % histogram counts over the same grid
plot(x,g);
hold on
plot(x,h/10000,'r');           % h/(N*0.01) scales counts to a density
legend('Gaussian distribution', 'pdf of random numbers');

    f_X(x) = (1 / (σ√(2π))) exp(−(x − μ)² / (2σ²))

*The Gaussian distribution is also called the normal distribution.
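The Matlab demo above can also be run as a purely numerical check, without plotting. This NumPy sketch mirrors it (same grid and sample count), comparing a histogram of samples against the Gaussian pdf:

```python
import numpy as np

# Compare an empirical histogram of standard-normal samples with the
# Gaussian pdf f_X(x); bin width 0.1 over [-4, 4].
rng = np.random.default_rng(1)
y = rng.standard_normal(1_000_000)

edges = np.arange(-4.0, 4.01, 0.1)
counts, _ = np.histogram(y, bins=edges)
centers = (edges[:-1] + edges[1:]) / 2
empirical = counts / (len(y) * 0.1)                 # counts -> density

g = np.exp(-centers ** 2 / 2) / np.sqrt(2 * np.pi)  # f_X(x), mu=0, sigma=1
max_err = np.abs(empirical - g).max()
print(max_err < 0.01)  # prints: True
```

The histogram of randn-style samples tracks the analytic pdf to within about 0.005 everywhere on the grid.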

Autocorrelation

Correlation is a matching process; autocorrelation refers to the matching of a signal with a delayed version of itself. The autocorrelation function of a real-valued energy signal x(t) is defined as:

    R_x(τ) = ∫ x(t) x(t + τ) dt

[Figure: 100 random numbers with a "hidden" sine function (top), and the autocorrelation of the series (bottom).]

The autocorrelation function R_x(τ) provides a measure of how closely the signal matches a copy of itself shifted by τ units in time. R_x(τ) is not a function of time t; it is only a function of the time shift τ.
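A small NumPy sketch of this idea (signal length, period and noise level are hypothetical): a sine buried in noise is hard to see in x(t), but its period reappears as a peak in the autocorrelation.

```python
import numpy as np

# A sine of period 50 samples hidden in unit-variance noise.
rng = np.random.default_rng(2)
n = 20_000
t = np.arange(n)
x = np.sin(2 * np.pi * t / 50) + rng.standard_normal(n)

def autocorr(sig, max_lag):
    """Biased sample autocorrelation R_x(k) for k = 0 .. max_lag-1."""
    sig = sig - sig.mean()
    m = len(sig)
    return np.array([np.dot(sig[:m - k], sig[k:]) / m for k in range(max_lag)])

r = autocorr(x, 75)
peak = 25 + int(np.argmax(r[25:75]))  # strongest match away from lag 0
print(peak)  # lands near 50, the hidden period
```

The sine contributes 0.5·cos(2πτ/50) to R_x(τ), so the strongest non-zero-lag match sits at the hidden period even though the noise dominates the raw waveform.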


Random Processes

A random process X(A, t) can be viewed as a function of two variables: an event A and time t. For a random variable, the outcome of a random experiment is mapped into a number. For a random process, the outcome of a random experiment is mapped into a waveform that is a function of time.

Statistical Averages of a Random Process

Statistical averages for a random process are calculated for a particular instant in time. The first moment or mean of a random process X(t) is

    E[X(t1)] = μ_X(t1) = ∫ x f_X(t1)(x) dx

The autocorrelation function of the random process X(t) is

    R_X(t1, t2) = E[X(t1) X(t2)] = ∫∫ x1 x2 f_X(t1),X(t2)(x1, x2) dx1 dx2

Here f_X(t1),X(t2)(x1, x2) is the joint probability density of X(t1) and X(t2).
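These ensemble averages can be estimated by generating many realizations of a process. A NumPy sketch, using the hypothetical random-phase sinusoid X(t) = A cos(2πft + Θ) with Θ uniform on [0, 2π):

```python
import numpy as np

# Estimate E[X(t1)] and R_X(t1, t2) across an ensemble of realizations
# of X(t) = A*cos(2*pi*f*t + theta), theta ~ Uniform[0, 2*pi).
rng = np.random.default_rng(3)
n_real = 200_000
A, f = 1.0, 1.0
t1, t2 = 0.3, 0.5

theta = rng.uniform(0, 2 * np.pi, n_real)
x1 = A * np.cos(2 * np.pi * f * t1 + theta)
x2 = A * np.cos(2 * np.pi * f * t2 + theta)

mean_t1 = x1.mean()       # E[X(t1)]; analytically 0
r_12 = (x1 * x2).mean()   # R_X(t1, t2); analytically (A^2/2)cos(2*pi*f*(t2-t1))
theory = (A ** 2 / 2) * np.cos(2 * np.pi * f * (t2 - t1))
print(abs(mean_t1) < 0.01, abs(r_12 - theory) < 0.01)
```

For this process the mean is constant and R_X(t1, t2) depends only on the difference t2 − t1, which is exactly the behaviour formalized by wide-sense stationarity.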

A Typical Random Process

[Figure: sample waveforms of a random process.]

Stationarity

A random process X(t) is said to be stationary in the strict sense if none of its statistics are affected by a shift in the time origin. A random process is said to be wide-sense stationary (WSS) if two of its statistics, its mean and autocorrelation function, do not vary with a shift in the time origin:

    E[X(t1)] = μ_X   for all t1
    R_X(t1, t2) = R_X(t2 − t1)   for all t1 and t2

Energy and Power of Signals

Energy is usually calculated for a finite-duration signal, which is called an energy signal. It is given by

    E_x = ∫ x²(t) dt

Power is the rate at which energy is delivered per unit time. Power is generally calculated for an infinite-duration signal. It is given by

    P_x = lim (T→∞) (1/T) ∫ from −T/2 to T/2 of x²(t) dt

An energy signal has finite energy but zero average power. A power signal has finite average power but infinite energy.

Noise in Communication Systems

Noise is an unwanted signal added to the original signal. Noise can be generated by galactic objects such as the sun and stars, by switching transients in electrical equipment, and by the thermal motion of electrons in electronic components such as resistors and wires. Of these, thermal noise is the most prominent. Thermal noise is a zero-mean Gaussian random process.
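The energy/power distinction can be illustrated numerically. This sketch uses a hypothetical rectangular pulse (energy signal) and a persistent cosine (power signal), approximating the integrals by Riemann sums:

```python
import numpy as np

dt = 0.001
t = np.arange(0, 1, dt)

# Energy signal: amplitude-2 pulse lasting 0.5 s -> E_x = 4 * 0.5 = 2
pulse = np.where(t < 0.5, 2.0, 0.0)
energy = np.sum(pulse ** 2) * dt

# Power signal: unit cosine averaged over a long window -> P_x = 1/2
T = 1000.0
tt = np.arange(0, T, dt)
sine = np.cos(2 * np.pi * tt)
power = np.sum(sine ** 2) * dt / T

print(round(energy, 2), round(power, 2))  # prints: 2.0 0.5
```

The pulse has finite energy (and, averaged over all time, zero power), while the cosine has finite average power but its energy grows without bound as the window lengthens.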


A zero-mean Gaussian random process n(t) is described by its density function

    p(n) = (1 / (σ√(2π))) exp(−n² / (2σ²))

where σ² is the variance of the noise.

White Noise

A characteristic spectral feature of thermal noise is that its power spectral density is the same for all frequencies of interest in most communication systems. It is called white in the sense that it contains all frequencies, just as white light contains all frequencies in the visible band. The power spectral density of thermal noise is

    G_n(f) = N_0 / 2   watts/Hz

Here the factor of 2 indicates that G_n(f) is a two-sided power spectral density. The average power of white noise is infinite.
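The flat spectrum can be observed in discrete-time simulation. A NumPy sketch (variance and FFT length are hypothetical choices) averages periodograms of white Gaussian noise; every frequency bin settles at the same level:

```python
import numpy as np

# Average many periodograms of white Gaussian noise; the estimated PSD
# is flat, at a level equal to the noise variance (the discrete-time
# counterpart of the constant N0/2 level).
rng = np.random.default_rng(4)
sigma2 = 1.0
nfft, n_avg = 256, 2000

psd = np.zeros(nfft)
for _ in range(n_avg):
    noise = rng.normal(0.0, np.sqrt(sigma2), nfft)
    psd += np.abs(np.fft.fft(noise)) ** 2 / nfft
psd /= n_avg

print(round(psd.mean(), 1), psd.std() < 0.1)  # prints: 1.0 True
```

A single periodogram is very noisy; averaging 2000 of them shrinks the bin-to-bin scatter to a few percent, making the flatness visible.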

AWGN Channel

A channel which adds Gaussian-distributed white noise to the signal is called an Additive White Gaussian Noise (AWGN) channel. The term additive means that the noise is simply superimposed on, or added to, the signal. This channel affects each transmitted symbol independently; such a channel is called a memoryless channel.

Effect of Noise

Noise, when added to the information signal, may cause errors in the received information bits. Bit-error rate (BER) is the basic criterion for checking the performance of a communication system. Usually BER is plotted against Eb/N0, where Eb is the bit energy (Eb = S · Tb, with S the signal power and Tb the bit duration).
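A single BER-versus-Eb/N0 point can be estimated by simulation. The sketch below assumes BPSK signalling over an AWGN channel, for which the theoretical BER is Q(√(2Eb/N0)) = 0.5·erfc(√(Eb/N0)); the 4 dB operating point is a hypothetical choice:

```python
import math
import numpy as np

# Simulated BER of BPSK over AWGN at Eb/N0 = 4 dB, compared with theory.
rng = np.random.default_rng(5)
eb_n0 = 10 ** (4.0 / 10)                 # 4 dB in linear units

n_bits = 2_000_000
bits = rng.integers(0, 2, n_bits)
symbols = 2 * bits - 1                   # BPSK mapping, Eb = 1
sigma = np.sqrt(1 / (2 * eb_n0))         # noise std for sigma^2 = N0/2
received = symbols + sigma * rng.standard_normal(n_bits)  # additive channel
decided = (received > 0).astype(int)

ber = np.mean(decided != bits)
theory = 0.5 * math.erfc(math.sqrt(eb_n0))
print(abs(ber - theory) < 0.001)  # prints: True
```

With two million bits the simulated error rate matches the closed-form BPSK curve to better than one part in a thousand; sweeping eb_n0 would trace out the usual waterfall plot.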


