Basics of Signals and Systems Feb 5th, 2007

Random Variables and Random Processes

Joint Probability distribution

Let X and Y be two random variables. Two probability distribution functions can be defined on them: the joint cumulative probability distribution function and the joint probability density function. The joint cumulative probability distribution function is given by

F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y)

Similarly, the joint probability density function is given by

f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / (∂x ∂y)

Also,

∫_{−∞}^{+∞} ∫_{−∞}^{+∞} f_{X,Y}(x, y) dx dy = 1
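As a quick numerical sketch of these two properties, assume a simple example density f_{X,Y}(x, y) = e^{−x−y} for x, y ≥ 0 (two independent unit exponentials); the Python snippet below checks the normalization and evaluates the joint cumulative distribution by integrating the density.

```python
import numpy as np
from scipy import integrate

# Assumed example joint density: f(x, y) = exp(-x - y) for x, y >= 0
# (two independent unit-exponential random variables), zero elsewhere.
def f_xy(x, y):
    return np.exp(-x - y) if (x >= 0 and y >= 0) else 0.0

# Normalization: the density must integrate to 1 over the whole plane.
total, _ = integrate.dblquad(lambda y, x: f_xy(x, y), 0, np.inf, 0, np.inf)
print(total)  # ~1.0

# Joint CDF F(x, y) = P(X <= x, Y <= y): integrate the density up to (x, y).
def F_xy(x, y):
    val, _ = integrate.dblquad(lambda eta, eps: f_xy(eps, eta), 0, x, 0, y)
    return val

# For this particular density, F(x, y) = (1 - exp(-x)) * (1 - exp(-y)).
x0, y0 = 1.0, 0.5
print(F_xy(x0, y0), (1 - np.exp(-x0)) * (1 - np.exp(-y0)))  # both ~0.249
```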

Marginal Distribution functions

∫_{−∞}^{+∞} ∫_{−∞}^{x} f_{X,Y}(ε, η) dε dη = F_X(x)

∫_{−∞}^{+∞} ∫_{−∞}^{y} f_{X,Y}(ε, η) dη dε = F_Y(y)

F_X(x) and F_Y(y) are called the marginal distribution functions of X and Y.

Marginal Density functions

∫_{−∞}^{+∞} f_{X,Y}(x, η) dη = f_X(x)

∫_{−∞}^{+∞} f_{X,Y}(ε, y) dε = f_Y(y)

f_X(x) and f_Y(y) are called the marginal density functions of X and Y.
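For an illustrative sketch, assume the joint density f_{X,Y}(x, y) = x + y on the unit square (a valid pdf, since it integrates to 1); the marginal densities are obtained by integrating out the other variable.

```python
import numpy as np
from scipy import integrate

# Assumed example joint density: f(x, y) = x + y on the unit square [0,1]^2.
def f_xy(x, y):
    return x + y if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

# Marginal density of X: integrate the joint density over all y.
def f_x(x):
    val, _ = integrate.quad(lambda eta: f_xy(x, eta), 0, 1)
    return val

# Marginal density of Y: integrate the joint density over all x.
def f_y(y):
    val, _ = integrate.quad(lambda eps: f_xy(eps, y), 0, 1)
    return val

# For this density the marginals are x + 1/2 and y + 1/2.
print(f_x(0.3), 0.3 + 0.5)   # ~0.8 in both cases
print(f_y(0.7), 0.7 + 0.5)   # ~1.2 in both cases
```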

Conditional probability density functions

f_Y(y/x) = f_{X,Y}(x, y) / f_X(x)

f_X(x/y) = f_{X,Y}(x, y) / f_Y(y)

∫_{−∞}^{+∞} f_Y(y/x) dy = ∫_{−∞}^{+∞} f_X(x/y) dx = 1
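Continuing with the same assumed example density f_{X,Y}(x, y) = x + y on the unit square, the sketch below forms the conditional density f_Y(y/x) and checks that it integrates to 1 for a fixed x.

```python
import numpy as np
from scipy import integrate

# Same assumed example density as before: f(x, y) = x + y on [0,1]^2.
def f_xy(x, y):
    return x + y if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

# Marginal density of X.
def f_x(x):
    val, _ = integrate.quad(lambda eta: f_xy(x, eta), 0, 1)
    return val

# Conditional density f_Y(y/x) = f_XY(x, y) / f_X(x).
def f_y_given_x(y, x):
    return f_xy(x, y) / f_x(x)

# The conditional density must integrate to 1 over y for any fixed x.
x0 = 0.4
total, _ = integrate.quad(lambda y: f_y_given_x(y, x0), 0, 1)
print(total)  # ~1.0
```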

Statistical Averages

Mean

The mean of a random variable X, denoted µ_X, or the expectation of X, E[X], is computed as

µ_X = E[X] = ∫_{−∞}^{+∞} x · f_X(x) dx

If Y = g(X), then

E[Y] = ∫_{−∞}^{+∞} g(x) · f_X(x) dx
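As a small numerical illustration, assume X has the unit-exponential density f_X(x) = e^{−x} for x ≥ 0; the sketch below evaluates E[X] and E[g(X)] with g(x) = x² directly from these integrals.

```python
import numpy as np
from scipy import integrate

# Assumed example density: unit exponential, f_X(x) = exp(-x) for x >= 0.
def f_x(x):
    return np.exp(-x) if x >= 0 else 0.0

# Mean: mu_X = E[X] = integral of x * f_X(x).
mu, _ = integrate.quad(lambda x: x * f_x(x), 0, np.inf)
print(mu)  # ~1.0 for the unit exponential

# Expectation of a function of X: E[g(X)] with g(x) = x**2.
def g(x):
    return x ** 2

eg, _ = integrate.quad(lambda x: g(x) * f_x(x), 0, np.inf)
print(eg)  # ~2.0, i.e. E[X^2] for the unit exponential
```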


Moments of a random variable

Moments about origin:

E[X^n] = ∫_{−∞}^{+∞} x^n · f_X(x) dx

Moments about mean:

E[(X − µ_X)^n] = ∫_{−∞}^{+∞} (x − µ_X)^n · f_X(x) dx

The second moment about the mean is called the variance, denoted σ_X² or var[X], and is defined as

σ_X² = E[(X − µ_X)²] = ∫_{−∞}^{+∞} (x − µ_X)² · f_X(x) dx

σ_X is called the standard deviation of the variable and is a measure of the variable's randomness (its spread about the mean). A relation between the variance and the mean of a random variable can be developed as follows:

σ_X² = E[(X − µ_X)²]
     = E[X²] − 2µ_X E[X] + E[µ_X²]
     = E[X²] − 2µ_X² + µ_X²
     = E[X²] − µ_X²
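This identity can be checked with a quick sample-based sketch (using an assumed normal distribution with mean 3 and standard deviation 2), comparing E[(X − µ_X)²] with E[X²] − µ_X²:

```python
import numpy as np

# Assumed example: samples from a normal distribution with mean 3 and std 2.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=1_000_000)

mu = x.mean()
var_direct = np.mean((x - mu) ** 2)       # E[(X - mu_X)^2]
var_identity = np.mean(x ** 2) - mu ** 2  # E[X^2] - mu_X^2

print(var_direct, var_identity)  # both ~4.0 (the true variance is 2^2)
```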

Correlation of Random Variables

covariance[X, Y] = E[(X − E[X])(Y − E[Y])]
                 = E[XY] − E[X]E[Y] − E[Y]E[X] + E[X]E[Y]
                 = E[XY] − µ_X µ_Y

From this relation we can see that when the two random variables are statistically independent, E[XY] = E[X]E[Y], so covariance[X, Y] becomes zero.
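A sample-based sketch of this relation (using assumed normal variables) compares an independent pair, whose covariance estimate is near zero, with a dependent pair, whose covariance is clearly nonzero:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Independent pair: the covariance estimate should be close to zero.
x = rng.normal(size=n)
y = rng.normal(size=n)
cov_indep = np.mean(x * y) - x.mean() * y.mean()

# Dependent pair: Z = X + noise, so the covariance with X is nonzero.
z = x + 0.5 * rng.normal(size=n)
cov_dep = np.mean(x * z) - x.mean() * z.mean()

print(cov_indep)  # ~0.0
print(cov_dep)    # ~1.0 (equals var[X] here, since the noise is independent of X)
```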
