SOLUTION FOR HOMEWORK 3, STAT 4352
6. Problem 10.66. Let $X_1, \ldots, X_n$ be iid according to the pdf
\[
g_\theta(x) = \theta^{-1} e^{-(x - \delta)/\theta} I(x > \delta).
\]
Then
\[
L_{X^n}(\delta, \theta) = \theta^{-n} e^{-\sum_{l=1}^{n} (X_l - \delta)/\theta} I(X_{(1)} > \delta).
\]

Recall that $X_{(1)} = \min(X_1, \ldots, X_n)$ is the minimal observation [the first ordered observation].

This is the case that I wrote you about earlier: it is absolutely crucial to take into account the indicator function (the support), because here the parameter $\delta$ defines the support.
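As a quick check (this step is routine and is not written out above), the likelihood is just the product of the individual densities:
\[
\prod_{l=1}^{n} \theta^{-1} e^{-(X_l - \delta)/\theta} I(X_l > \delta)
= \theta^{-n} e^{-\sum_{l=1}^{n} (X_l - \delta)/\theta} \prod_{l=1}^{n} I(X_l > \delta),
\]
and $\prod_{l=1}^{n} I(X_l > \delta) = I(X_{(1)} > \delta)$ because all observations exceed $\delta$ exactly when the smallest one does.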

By its definition,
\[
(\hat\delta_{MLE}, \hat\theta_{MLE}) := \arg\max_{\delta \in (-\infty,\infty),\, \theta \in (0,\infty)} \ln(L_{X^n}(\delta, \theta)).
\]
Note that
\[
L(\delta, \theta) := \ln(L_{X^n}(\delta, \theta)) = -n \ln(\theta) - \theta^{-1} \sum_{l=1}^{n} (X_l - \delta) + \ln I(X_{(1)} \ge \delta).
\]

Now the crucial step: graph the log-likelihood $L$ as a function of $\delta$ and observe that it is increasing in $\delta$ for $\delta \le X_{(1)}$ and equals $-\infty$ for $\delta > X_{(1)}$, so the maximum is attained at $\delta = X_{(1)}$. Thus $\hat\delta_{MLE} = X_{(1)}$. Then, by taking the derivative with respect to $\theta$, we get $\hat\theta_{MLE} = n^{-1} \sum_{l=1}^{n} (X_l - X_{(1)})$.
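If you want to see the derivative step written out (it is omitted above), plug $\delta = X_{(1)}$ into $L$ and solve:
\[
\frac{\partial L(X_{(1)}, \theta)}{\partial \theta}
= -\frac{n}{\theta} + \frac{1}{\theta^{2}} \sum_{l=1}^{n} (X_l - X_{(1)}) = 0
\quad \Longrightarrow \quad
\hat\theta_{MLE} = n^{-1} \sum_{l=1}^{n} (X_l - X_{(1)}).
\]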

Answer: $(\hat\delta_{MLE}, \hat\theta_{MLE}) = (X_{(1)},\, n^{-1} \sum_{l=1}^{n} (X_l - X_{(1)}))$. Please note that $\hat\delta_{MLE}$ is a biased estimator; this is a rather typical outcome.
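To quantify the bias (a standard fact about the minimum of iid shifted exponential observations, added here as an aside): $X_{(1)} - \delta$ is exponential with mean $\theta/n$, so
\[
E[\hat\delta_{MLE}] = E[X_{(1)}] = \delta + \frac{\theta}{n} > \delta,
\]
and the bias $\theta/n$ vanishes as $n \to \infty$.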

7. Problem 10.73. Consider iid uniform observations $X_1, \ldots, X_n$ with the parametric pdf
\[
f_\theta(x) = I(\theta - 1/2 < x < \theta + 1/2).
\]

As soon as the parameter appears in the indicator function you should be very cautious: typically a graph, rather than differentiation, will help you find the MLE. It is also very helpful to figure out the nature of the parameter. Here it is obviously a location parameter, and you can write

\[
X = \theta + Z, \qquad Z \sim \text{Uniform}(-1/2, 1/2).
\]

The latter helps you to guess a correct estimator, to check a suggested one, and, if necessary, to simplify the calculation of descriptive characteristics (mean, variance, etc.).
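As a quick illustration of this point (the estimator below is only an example and is not part of the solution): since $E[Z] = 0$ and $\mathrm{Var}(Z) = 1/12$,
\[
E[\bar{X}] = \theta
\qquad \text{and} \qquad
\mathrm{Var}(\bar{X}) = \frac{1}{12n},
\]
so the sample mean is an unbiased estimator of $\theta$, and its descriptive characteristics follow immediately from those of $Z$.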

Well, now we need to write down the likelihood function (recall that this is just the joint density, only considered as a function of the parameter given a vector of observations):
\[
L_{X^n}(\theta) = \prod_{l=1}^{n} I(\theta - 1/2 < X_l < \theta + 1/2) = I(\theta - 1/2 < X_{(1)} \le X_{(n)} < \theta + 1/2).
\]
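It may help to rewrite the indicator as a condition on $\theta$ alone (a small rearrangement that is not spelled out on this page):
\[
L_{X^n}(\theta) = I\bigl(X_{(n)} - 1/2 < \theta < X_{(1)} + 1/2\bigr),
\]
so the likelihood equals $1$ exactly when $\theta$ lies in this interval, equals $0$ otherwise, and depends on the data only through $X_{(1)}$ and $X_{(n)}$.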

Note that the latter expression implies that $(X_{(1)}, X_{(n)})$ is a sufficient statistic (due to the Factorization Theorem). As a result, any good estimator, and the MLE in particular, must be a function of only these two statistics. Another remark: it is possible to show (there is a technique for doing this, which is beyond the objectives of this class) that this pair of

