
B–9 Multivariate Statistics 711

Proof. First, show that this result is correct if N = 2 and L = 1:

$$\int_{-\infty}^{\infty} f(x_1, x_2)\, dx_2 = \int_{-\infty}^{\infty} f(x_1)\, f(x_2 \mid x_1)\, dx_2 = f(x_1) \int_{-\infty}^{\infty} f(x_2 \mid x_1)\, dx_2 = f(x_1) \quad \text{(B–90)}$$

since the area under $f(x_2 \mid x_1)$ is unity. This procedure is readily extended to prove the L-dimensional case of Eq. (B–89).
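The marginalization step in Eq. (B–90) can be checked numerically. The sketch below is an illustrative assumption, not part of the text: it uses an independent standard bivariate Gaussian for the joint density and a simple Riemann sum over a finite grid, and verifies that integrating out $x_2$ recovers the marginal density of $x_1$.

```python
import numpy as np

def f_joint(x1, x2):
    # Joint density: standard bivariate Gaussian with independent components
    # (an assumed example distribution).
    return np.exp(-0.5 * (x1**2 + x2**2)) / (2.0 * np.pi)

def f_marginal(x1):
    # Marginal density f(x1): standard univariate Gaussian.
    return np.exp(-0.5 * x1**2) / np.sqrt(2.0 * np.pi)

x2 = np.linspace(-8.0, 8.0, 4001)   # integration grid for x2 (tails negligible)
dx = x2[1] - x2[0]
x1 = 0.7                            # an arbitrary test point

# Integrate the joint density over x2, as in Eq. (B-90).
integral = np.sum(f_joint(x1, x2)) * dx
print(abs(integral - f_marginal(x1)) < 1e-6)
```

The agreement holds at any test point $x_1$, since the conditional density integrates to unity regardless of the conditioning value.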

Bivariate Statistics

Bivariate (or joint) distributions are the N = 2 case. In this section, the definitions from the previous section will be used to evaluate two-dimensional moments. As shown in Chapter 6, bivariate statistics have some very important applications to electrical engineering problems, and some additional definitions need to be studied.


DEFINITION. The correlation (or joint mean) of $x_1$ and $x_2$ is

$$m_{12} = \overline{x_1 x_2} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_1 x_2\, f(x_1, x_2)\, dx_1\, dx_2 \quad \text{(B–91)}$$

DEFINITION. Two random variables $x_1$ and $x_2$ are said to be uncorrelated if

$$m_{12} = \overline{x_1 x_2} = \overline{x_1}\, \overline{x_2} = m_1 m_2 \quad \text{(B–92)}$$

If $x_1$ and $x_2$ are independent, it follows that they are also uncorrelated, but the converse is not generally true. However, as we will see, the converse is true for bivariate Gaussian random variables.
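A classic counterexample illustrates why the converse fails: if $x_1$ is a zero-mean Gaussian and $x_2 = x_1^2$, the two are completely dependent, yet $\overline{x_1 x_2} = \overline{x_1^3} = 0 = \overline{x_1}\,\overline{x_2}$, so they are uncorrelated by Eq. (B–92). A minimal Monte Carlo sketch (the distribution, seed, and tolerance are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.standard_normal(200_000)
x2 = x1**2                         # x2 is fully determined by x1 (dependent)

m12 = np.mean(x1 * x2)             # sample estimate of the joint mean, Eq. (B-91)
m1m2 = np.mean(x1) * np.mean(x2)   # product of the individual means

# Uncorrelated (Eq. B-92) within Monte Carlo sampling error, despite dependence.
print(abs(m12 - m1m2) < 0.05)
```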

DEFINITION. Two random variables are said to be orthogonal if

$$m_{12} = \overline{x_1 x_2} = 0 \quad \text{(B–93)}$$

Note the similarity of the definition of orthogonal random variables to that of orthogonal functions given by Eq. (2–73).
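As a quick sketch of Eq. (B–93), take $x_1 = \cos\theta$ and $x_2 = \sin\theta$ with $\theta$ uniform on $[0, 2\pi)$; then $\overline{x_1 x_2} = \tfrac{1}{2}\overline{\sin 2\theta} = 0$, so the pair is orthogonal. The construction, seed, and tolerance below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
theta = rng.uniform(0.0, 2.0 * np.pi, 200_000)
x1, x2 = np.cos(theta), np.sin(theta)

# Sample estimate of m12 = E[x1 x2]; zero implies orthogonality (Eq. B-93).
m12 = np.mean(x1 * x2)
print(abs(m12) < 0.01)
```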

DEFINITION. The covariance is

$$\mu_{11} = \overline{(x_1 - m_1)(x_2 - m_2)} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x_1 - m_1)(x_2 - m_2)\, f(x_1, x_2)\, dx_1\, dx_2 \quad \text{(B–94)}$$

It should be clear that if $x_1$ and $x_2$ are independent, the covariance is zero (and $x_1$ and $x_2$ are uncorrelated). The converse is not generally true, but it is true for the case of bivariate Gaussian random variables.
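The sample analog of Eq. (B–94) replaces the expectation with an average over realizations. In the sketch below, $x_2 = 0.5\,x_1 + n$ with $x_1$ and $n$ independent standard Gaussians, so the true covariance is $0.5\,\mathrm{Var}(x_1) = 0.5$; the construction, seed, and tolerance are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.standard_normal(100_000)
x2 = 0.5 * x1 + rng.standard_normal(100_000)   # correlated pair by construction

m1, m2 = x1.mean(), x2.mean()
# Sample form of the covariance, Eq. (B-94).
mu11 = np.mean((x1 - m1) * (x2 - m2))

print(abs(mu11 - 0.5) < 0.05)   # true covariance is 0.5*Var(x1) = 0.5
```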

DEFINITION. The correlation coefficient is

$$\rho = \frac{\mu_{11}}{\sigma_1 \sigma_2} = \frac{\overline{(x_1 - m_1)(x_2 - m_2)}}{\sqrt{\overline{(x_1 - m_1)^2}}\, \sqrt{\overline{(x_2 - m_2)^2}}} \quad \text{(B–95)}$$
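Because the same normalization appears in numerator and denominator of Eq. (B–95), $\rho$ is dimensionless and insensitive to scaling of either variable. A minimal sketch, computing $\rho$ directly from Eq. (B–95) and cross-checking against NumPy's `corrcoef` (the test pair $x_2 = 0.5\,x_1 + n$ and the seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
x1 = rng.standard_normal(100_000)
x2 = 0.5 * x1 + rng.standard_normal(100_000)

# rho per Eq. (B-95): covariance normalized by the two standard deviations.
mu11 = np.mean((x1 - x1.mean()) * (x2 - x2.mean()))
rho = mu11 / (x1.std() * x2.std())

rho_np = np.corrcoef(x1, x2)[0, 1]   # the same quantity via NumPy
print(abs(rho - rho_np) < 1e-6)
```

For this construction the true value is $\rho = 0.5/\sqrt{1.25} \approx 0.447$; the sample estimate lands close to it, and the two computations agree to floating-point precision.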
