Slides Chapter 1. Measure Theory and Probability


1.8. RANDOM VARIABLES

Definition 1.49 The k-th moment of X with respect to the mean µ is
$$\mu_k := \alpha_{k,\mu} = \int_{\mathbb{R}} (x - \mu)^k \, dF_X(x).$$
In particular, the second moment with respect to the mean is called the variance,
$$\sigma_X^2 = V(X) = \mu_2 = \int_{\mathbb{R}} (x - \mu)^2 \, dF_X(x).$$
The standard deviation is $\sigma_X = \sqrt{V(X)}$.

Definition 1.50 The k-th moment of X with respect to the origin is
$$\alpha_k := \alpha_{k,0} = E(X^k) = \int_{\mathbb{R}} x^k \, dF_X(x).$$
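Both kinds of moments can be computed directly from the definitions for a discrete distribution, where the integral reduces to a sum over the support. The sketch below uses a hypothetical three-point distribution (an assumption for illustration, not from the slides):

```python
from fractions import Fraction

# Hypothetical three-point distribution (an illustration, not from the slides):
# P(X = 0) = 1/4, P(X = 1) = 1/2, P(X = 2) = 1/4.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def raw_moment(pmf, k):
    """alpha_k = E(X^k): the k-th moment with respect to the origin."""
    return sum(x**k * p for x, p in pmf.items())

def central_moment(pmf, k):
    """mu_k = E[(X - mu)^k]: the k-th moment with respect to the mean."""
    mu = raw_moment(pmf, 1)
    return sum((x - mu)**k * p for x, p in pmf.items())

print(raw_moment(pmf, 1))      # mean E(X) = 1
print(central_moment(pmf, 2))  # variance sigma_X^2 = 1/2
```

Exact rational arithmetic via `Fraction` avoids floating-point error, so the computed moments can be compared exactly against hand calculations.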

Proposition 1.20 It holds that
$$\mu_k = \sum_{i=0}^{k} (-1)^{k-i} \binom{k}{i} \mu^{k-i} \alpha_i.$$
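The expansion of central moments in terms of raw moments stated in Proposition 1.20 can be checked numerically. The distribution below is a hypothetical example chosen for illustration:

```python
from fractions import Fraction
from math import comb

# Hypothetical distribution (an assumption for illustration).
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def alpha(k):
    """Raw moment alpha_k = E(X^k)."""
    return sum(x**k * p for x, p in pmf.items())

mu = alpha(1)  # the mean E(X)

def mu_direct(k):
    """Central moment mu_k = E[(X - mu)^k], from the definition."""
    return sum((x - mu)**k * p for x, p in pmf.items())

def mu_via_raw(k):
    """mu_k = sum_{i=0}^{k} (-1)^(k-i) C(k, i) mu^(k-i) alpha_i  (Proposition 1.20)."""
    return sum((-1)**(k - i) * comb(k, i) * mu**(k - i) * alpha(i)
               for i in range(k + 1))

# The two computations agree for every order k.
assert all(mu_direct(k) == mu_via_raw(k) for k in range(6))
```

The identity follows from expanding $(x - \mu)^k$ with the binomial theorem and taking expectations term by term, which is exactly what `mu_via_raw` does.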

Lemma 1.1 If $\alpha_k = E(X^k)$ exists and is finite, then $\alpha_m$ exists and is finite for all $m \le k$.

One way of obtaining information about the distribution of a random variable is to calculate the probability of intervals of the type $(E(X) - \epsilon, E(X) + \epsilon)$. If we do not know the theoretical distribution of the random variable but we do know its expectation and variance, Tchebychev's inequality gives a lower bound for this probability. This inequality is a straightforward consequence of the following one.
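The lower bound described above, $P(|X - E(X)| < \epsilon) \ge 1 - V(X)/\epsilon^2$, can be verified exactly for a small discrete distribution; the three-point distribution below is a hypothetical example, not from the slides:

```python
from fractions import Fraction

# Hypothetical distribution (an assumption for illustration).
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
mu = sum(x * p for x, p in pmf.items())             # E(X) = 1
var = sum((x - mu)**2 * p for x, p in pmf.items())  # V(X) = 1/2

def prob_within(eps):
    """Exact P(|X - E(X)| < eps) for the discrete distribution."""
    return sum(p for x, p in pmf.items() if abs(x - mu) < eps)

# Tchebychev's lower bound: P(|X - E(X)| < eps) >= 1 - V(X)/eps^2.
for eps in (Fraction(1, 2), Fraction(1), Fraction(3, 2), Fraction(2)):
    assert prob_within(eps) >= 1 - var / eps**2
```

Note that for $\epsilon = 1$ the bound is attained with equality here ($P = 1/2 = 1 - (1/2)/1$), showing the inequality can be tight.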

ISABEL MOLINA 44
