
M11 Stochastic Processes III/IV – MATH 3251/4091 HW1: solutions

Homework 1 (due date: 08.11.11). To hand in: problems 1–5; other problems are optional.

Problem 1. Let the r.v. Z_1 have mean E[Z_1] = m, variance Var(Z_1) = σ², and generating function ϕ(s). Let Z_n be the size of the n-th generation of the related branching process; we write ϕ_n(s) for the corresponding generating function.

a) Using ϕ_n(s) ≡ ϕ_{n−1}(ϕ(s)) or otherwise, deduce that E[Z_n] = m^n;

b) Using ϕ_n(s) ≡ ϕ(ϕ_{n−1}(s)) or otherwise, deduce that

    Var(Z_n) = σ² m^{n−1} (m^n − 1)/(m − 1),   m ≠ 1,
    Var(Z_n) = σ² n,                           m = 1.

Solution. Thinking of Z_n as a random sum of random variables, we have Z_n ∼ S_N with N ∼ Z_1 and X ∼ Z_{n−1}; therefore E[Z_n] = E[Z_1] · E[Z_{n−1}] = m E[Z_{n−1}] and

    Var(Z_n) = E[Z_1] Var(Z_{n−1}) + Var(Z_1) (E[Z_{n−1}])² = m Var(Z_{n−1}) + σ² m^{2(n−1)};

a straightforward induction now gives the result (do this carefully!).

Alternatively, differentiate the equalities and use the fact that if X has generating function ϕ_X(s), then E[X] = ϕ′_X(1) and Var(X) = ϕ″_X(1) + ϕ′_X(1) − (ϕ′_X(1))²; next, carefully apply induction.
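Remark (a numerical sketch, not part of the original sheet): the moment formulas can be sanity-checked by simulation. The example below assumes Poi(λ) offspring, for which m = σ² = λ, and uses the fact that the total offspring of z i.i.d. Poi(λ) individuals is again Poisson, with mean λz.

```python
import numpy as np

def branching_moments(lam=1.5, n_gen=5, n_runs=200_000, seed=0):
    """Monte Carlo estimate of E[Z_n] and Var(Z_n) for a branching
    process with Poisson(lam) offspring (so m = sigma^2 = lam)."""
    rng = np.random.default_rng(seed)
    z = np.ones(n_runs, dtype=np.int64)   # Z_0 = 1 in every run
    for _ in range(n_gen):
        # total offspring of z i.i.d. Poisson(lam) individuals is Poisson(lam*z);
        # z = 0 stays 0, so extinction is absorbing
        z = rng.poisson(lam * z)
    return z.mean(), z.var()

m = sigma2 = 1.5
n = 5
mean_est, var_est = branching_moments(lam=m, n_gen=n)
mean_th = m**n                                      # = 7.59375
var_th = sigma2 * m**(n - 1) * (m**n - 1) / (m - 1)
print(mean_est, mean_th)
print(var_est, var_th)
```

Both estimates should land within Monte Carlo error of the formulas from part a) and b).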

Problem 2. In a branching process with generating function

    ϕ(s) = a s² + b s + c,

where a > 0, b > 0, c > 0 and ϕ(1) = 1, compute the extinction probability ρ and give the condition for sure extinction. Can you interpret your results?

Solution. Since ϕ(1) = 1 implies a + b + c = 1, the fixed point equation ϕ(s) = s becomes (as − c)(s − 1) = 0, giving the extinction probability ρ = min(c/a, 1).

Similarly, sure extinction happens when ϕ′(1) ≡ 2a + b ≤ 1, which is equivalent to a ≤ c. (Can you interpret these results?)
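Remark (an added check, not in the original sheet): ρ is also the limit of the increasing iterates ρ ← ϕ(ρ) started from 0, which gives an easy numerical cross-check of ρ = min(c/a, 1); the coefficient values below are arbitrary choices.

```python
def extinction_prob(a, b, c, tol=1e-12):
    """Extinction probability for offspring GF phi(s) = a*s^2 + b*s + c
    (with a + b + c = 1): iterate rho <- phi(rho) from 0; the iterates
    increase to the smallest fixed point of phi in [0, 1]."""
    assert abs(a + b + c - 1.0) < 1e-9
    rho = 0.0
    for _ in range(100_000):
        new = a * rho**2 + b * rho + c
        if abs(new - rho) < tol:
            return new
        rho = new
    return rho

# supercritical example: 2a + b = 1.2 > 1, so rho = c/a = 0.6 < 1
print(extinction_prob(0.5, 0.2, 0.3))
# sure-extinction example: 2a + b = 0.6 <= 1 (a <= c), so rho = 1
print(extinction_prob(0.2, 0.2, 0.6))
```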

Problem 3. Let (Z_n)_{n≥0} be a supercritical branching process¹ with offspring distribution Poi(λ), λ > 1. Let T_0 = min{n ≥ 0 : Z_n = 0} be its extinction time, and let ρ = P(T_0 < ∞) > 0 be its extinction probability. Define (Ẑ_n)_{n≥0} as Z_n conditioned on extinction, i.e., Ẑ_n = (Z_n | T_0 < ∞).

a) Show that the transition probabilities p̂_{xy} of (Ẑ_n)_{n≥0} and the transition probabilities p_{xy} of the original process (Z_n)_{n≥0} are related via p̂_{xy} = p_{xy} ρ^{y−x}, x, y ≥ 0;

b) Deduce that the generating functions ϕ̂(s) ≡ E_1[s^{Ẑ_1}] and ϕ(s) ≡ E_1[s^{Z_1}] satisfy the identity ϕ̂(s) = ϕ(ρs)/ρ, 0 ≤ s ≤ 1;

c) Use the fixed point equation ρ = e^{λ(ρ−1)} to show that ϕ̂(s) = e^{λρ(s−1)}, i.e., that the offspring distribution for (Ẑ_n)_{n≥0} is just Poi(λρ).

¹ i.e., with the expected offspring size m larger than one.

O.H. http://maths.dur.ac.uk/stats/courses/StochProc34/



Solution. a) If we start with x individuals at time zero, P_x(T_0 < ∞) = ρ^x. Then the Markov property for (Z_n)_{n≥0} implies²

    p̂_{xy} = P_x(Z_1 = y, T_0 < ∞) / P_x(T_0 < ∞)
            = P_x(Z_1 = y) P_y(T_0 < ∞) / P_x(T_0 < ∞)
            = p_{xy} ρ^y / ρ^x = p_{xy} ρ^{y−x}.

b) The result in a) implies

    ϕ̂(s) ≡ E_1[s^{Ẑ_1}] = Σ_{y≥0} p̂_{1y} s^y = (1/ρ) Σ_{y≥0} p_{1y} (ρs)^y ≡ ϕ(ρs)/ρ.

c) Using the fixed-point equation, we get

    ϕ̂(s) = ϕ(ρs)/ρ ≡ exp{λ(ρs − 1)} / exp{λ(ρ − 1)} = e^{λρ(s−1)},

which is the generating function of Poi(λρ), as requested. Now use the uniqueness result (state it!).
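Remark (an added numerical check, not in the original sheet): part c) can be verified directly: solve ρ = e^{λ(ρ−1)} by fixed-point iteration and compare ϕ(ρs)/ρ with the Poi(λρ) generating function at a few points; λ = 2 below is an arbitrary choice.

```python
import math

def poisson_extinction_prob(lam, tol=1e-14):
    """Solve rho = exp(lam*(rho - 1)) by fixed-point iteration from 0;
    for lam > 1 this converges to the extinction probability rho < 1."""
    rho = 0.0
    for _ in range(100_000):
        new = math.exp(lam * (rho - 1.0))
        if abs(new - rho) < tol:
            return new
        rho = new
    return rho

lam = 2.0
rho = poisson_extinction_prob(lam)
for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    lhs = math.exp(lam * (rho * s - 1.0)) / rho   # phi(rho*s)/rho
    rhs = math.exp(lam * rho * (s - 1.0))         # GF of Poi(lam*rho)
    assert abs(lhs - rhs) < 1e-9
print("phi(rho*s)/rho matches the Poi(lam*rho) generating function")
```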

Problem 4. Let (Z_n)_{n≥0} be a supercritical branching process¹ with offspring distribution {p_k}_{k≥0}, generating function ϕ(s) and extinction probability ρ ∈ [0, 1).

a) If Z_0 = 1, let p̃_k be the probability that, conditioned on survival, the first generation has exactly k individuals with an infinite line of descent. Show that

    p̃_k = (1/(1 − ρ)) Σ_{n≥k} p_n C(n, k) (1 − ρ)^k ρ^{n−k}.

b) Let (Z̃_n)_{n≥0} count only those individuals in (Z_n)_{n≥0} who, conditioned on survival, have an infinite line of descent. Show that (Z̃_n)_{n≥0} is a branching process with offspring generating function

    ϕ̃(s) = (1/(1 − ρ)) ( ϕ((1 − ρ)s + ρ) − ρ ).

Solution. a) Use the partition theorem for probabilities: for a single individual (Z_0 = 1), conditional on survival (prob. 1 − ρ), to produce exactly k individuals in the next generation with an infinite line of descent one has to produce n ≥ k individuals, of which exactly n − k die out (total prob. ρ^{n−k}) and the remaining k individuals have an infinite line of descent (total prob. (1 − ρ)^k); there are C(n, k) ways to choose these k individuals, hence the result.

b) In view of the previous part, we just need to compute the generating function

    ϕ̃(s) = Σ_{k≥1} s^k p̃_k = (1/(1 − ρ)) Σ_{k≥1} Σ_{n≥k} p_n C(n, k) (1 − ρ)^k ρ^{n−k} s^k
          = (1/(1 − ρ)) Σ_{n≥1} p_n Σ_{k=1}^{n} C(n, k) (1 − ρ)^k ρ^{n−k} s^k.

² You might also wish to show that (Ẑ_n)_{n≥0} is a Markov chain! In this case the following observation might be useful: if events B, C and D are such that for every event A one has P(A ∩ B | C) = P(A ∩ B | D), then P(A | B ∩ C) = P(A | B ∩ D); next apply it to A = {Z_{n+1} = z_{n+1}}, B = {T_0 < ∞}, C = {Z_n = z_n} and D = {Z_n = z_n, …, Z_0 = z_0}.




Notice that the internal sum is just ((1 − ρ)s + ρ)^n − ρ^n (and that this expression vanishes for n = 0), so that

    ϕ̃(s) = (1/(1 − ρ)) Σ_{n≥0} p_n ( ((1 − ρ)s + ρ)^n − ρ^n )
          = (1/(1 − ρ)) ( ϕ((1 − ρ)s + ρ) − ϕ(ρ) ),

as required, since ϕ(ρ) = ρ.
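Remark (an added check, not in the original sheet): for Poi(λ) offspring one can sum the p̃_k formula of part a) directly and compare it against the closed form of part b); λ = 2, the truncation level N = 60, and the test point s = 0.7 are arbitrary choices.

```python
import math

lam = 2.0
# extinction probability: iterate rho <- exp(lam*(rho - 1)) from 0
rho = 0.0
for _ in range(10_000):
    rho = math.exp(lam * (rho - 1.0))

N = 60   # truncation level; the Poisson tail beyond N is negligible
p = [math.exp(-lam) * lam**n / math.factorial(n) for n in range(N)]

def p_tilde(k):
    """p~_k = (1-rho)^{-1} * sum_{n>=k} p_n C(n,k) (1-rho)^k rho^(n-k)."""
    return sum(p[n] * math.comb(n, k) * (1 - rho)**k * rho**(n - k)
               for n in range(k, N)) / (1 - rho)

s = 0.7
lhs = sum(p_tilde(k) * s**k for k in range(1, N))          # part a), summed
rhs = (math.exp(lam * ((1 - rho) * s + rho - 1.0)) - rho) / (1 - rho)  # part b)
assert abs(lhs - rhs) < 1e-9
print("part a) and part b) agree at s =", s)
```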

Problem 5. For n ≥ 1 let X_k^{(n)}, k = 1, …, n, be independent Bernoulli random variables with

    P(X_k^{(n)} = 1) = 1 − P(X_k^{(n)} = 0) = p_k^{(n)}.

Assume that as n → ∞

    δ^{(n)} := max_{1≤k≤n} p_k^{(n)} → 0

and that

    Σ_{k=1}^{n} p_k^{(n)} ≡ E[Σ_{k=1}^{n} X_k^{(n)}] → λ ∈ (0, ∞).

Using generating functions or otherwise, show that the distribution of Σ_{k=1}^{n} X_k^{(n)} converges to that of a Poisson(λ) random variable. This result is known as the law of rare events.

Solution. By independence, the generating function of the sum is Π_{k=1}^{n} (1 + (s − 1) p_k^{(n)}), so it is enough to show that its logarithm converges to λ(s − 1) for every fixed s, |s| ≤ 1. By the elementary inequality³ |y − log(1 + y)| ≤ y², valid for all |y| ≤ 1/2, we get⁴

    | Σ_{k=1}^{n} log(1 + (s − 1) p_k^{(n)}) − (s − 1) Σ_{k=1}^{n} p_k^{(n)} | ≤ (s − 1)² Σ_{k=1}^{n} (p_k^{(n)})²,

and it remains to observe that the last sum is bounded above by δ^{(n)} · Σ_{k=1}^{n} p_k^{(n)} → 0 as n → ∞. Similarly, the assumptions above imply (s − 1) Σ_{k=1}^{n} p_k^{(n)} → (s − 1)λ as n → ∞ for every fixed s, |s| ≤ 1. The result follows.
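Remark (an added check, not in the original sheet): in the equal-probability case p_k^{(n)} = λ/n (so the sum is Bin(n, λ/n)) one can watch the generating function of the sum approach e^{λ(s−1)}; λ = 2 and s = 0.5 below are arbitrary choices.

```python
import math

def sum_gf(probs, s):
    """Generating function of a sum of independent Bernoulli(p_k)'s,
    evaluated at s: the product of the individual GFs 1 + (s-1)p_k."""
    out = 1.0
    for p in probs:
        out *= 1.0 + (s - 1.0) * p
    return out

lam, s = 2.0, 0.5
for n in (10, 100, 10_000):
    probs = [lam / n] * n        # p_k^(n) = lam/n, so the sum is Bin(n, lam/n)
    gap = abs(sum_gf(probs, s) - math.exp(lam * (s - 1.0)))
    print(n, gap)                # the gap shrinks as n grows
```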

Optional problems

Problem 6. In a sequence of independent Bernoulli experiments with success probability p ∈ (0, 1), let D be the first moment of two consecutive successful outcomes. Show that d_n = P(D = n) satisfies d_2 = p², d_3 = qp², and, by conditioning on the value of the first outcome, that

    d_n = q d_{n−1} + pq d_{n−2},   n > 3.

Use these relations to derive the generating function G_D(s). Compute E[D].

³ Indeed, since y − log(1 + y) = ∫_0^y (1 − 1/(1 + x)) dx = ∫_0^y x/(1 + x) dx provided y > −1, and |x/(1 + x)| ≤ 2|x| for |x| ≤ 1/2, taking absolute values and integrating we get |y − log(1 + y)| ≤ y².

⁴ as long as |1 − s| δ^{(n)} ≤ 1/2.




Solution. The first failure (prob. q = 1 − p) is followed by a valid string of n − 1 attempts (prob. d_{n−1}); similarly, the first success is followed by a failure at the second attempt (prob. pq) and then by a valid string of n − 2 attempts (prob. d_{n−2}). Hence the recursion. A standard computation thus gives

    G_D(s) = p² s² / (1 − qs − pqs²),   and   E[D] = (d/ds) G_D(s) |_{s=1} = 1/p + 1/p².

Thus the average time in a die-tossing experiment until a double-six appears is 42.

Problem 7. A biased coin showing 'heads' with probability p ∈ (0, 1) is flipped repeatedly. Let C_w be the first moment when the word w appears in the observed sequence of results. Find the generating function of C_w and the expectation E[C_w] for each of the following words: HH, HT, TH and TT.

Solution. A standard method gives⁵

    G_HH(s) = p² s² / (1 − qs − pqs²) = p² s² (1 − ps) / (1 − s + p² q s³),
    G_HT(s) = pq s² / ((1 − ps)(1 − qs)).

For the other two cases, interchange H ↔ T and p ↔ q.

⁵ Notice that (1 − qs − pqs²)^{−1} is the generating function of all finite strings ending with T which never have two H in a row! One can use a similar observation to immediately write down the result for HHH, HHHH, etc. (can you?!)
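Remark (an added check, not in the original sheet): simulation confirms the perhaps surprising consequence of these generating functions that, for a fair coin, E[C_HH] = 6 while E[C_HT] = 4.

```python
import random

def waiting_time(word, p, rng):
    """Number of coin flips until `word` (e.g. 'HT') first appears."""
    seq = ""
    while True:
        seq += "H" if rng.random() < p else "T"
        if seq.endswith(word):
            return len(seq)

rng = random.Random(0)
runs = 100_000
for word in ("HH", "HT"):
    avg = sum(waiting_time(word, 0.5, rng) for _ in range(runs)) / runs
    print(word, avg)   # E[C_HH] = 6 but E[C_HT] = 4 for a fair coin
```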

