SOLUTION FOR HOMEWORK 3, STAT 4352 Welcome to your third ...

extreme observations is also the minimal sufficient statistic. Note the situation: we have 1 parameter and need 2 univariate statistics (X(1), X(n)) to get a sufficient statistic; this is the limit of data reduction here. Nonetheless, this is a huge data reduction whenever n is large. Just think about it: to estimate θ you do not need any observation that lies between the two extreme ones! This is not a trivial assertion.

Well, now let us return to the problem at hand. If you look at the graph of the likelihood function as a function of θ, then you may conclude that it attains its maximum at every θ such that

X(n) − 1/2 < θ < X(1) + 1/2. (1)

As a result, we get a very curious MLE: any point within this interval can be declared the MLE (the MLE is not unique!).
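As an illustration (not part of the original solution), the flatness of the likelihood on interval (1) can be checked numerically. The sketch below assumes i.i.d. Uniform(θ − 1/2, θ + 1/2) observations, for which each density is 1 on its support, so the likelihood is 1 for every θ inside interval (1) and 0 outside:

```python
import random

# Assumed setup: X1..Xn i.i.d. Uniform(theta - 1/2, theta + 1/2).
# The likelihood is the product of indicator functions, so it equals 1
# exactly when theta - 1/2 <= X(1) and X(n) <= theta + 1/2, i.e. on interval (1).
def likelihood(theta, xs):
    return 1.0 if all(theta - 0.5 <= x <= theta + 0.5 for x in xs) else 0.0

random.seed(0)
true_theta = 3.0
xs = [random.uniform(true_theta - 0.5, true_theta + 0.5) for _ in range(50)]
lo, hi = max(xs) - 0.5, min(xs) + 0.5  # the endpoints of interval (1)

inside = (lo + hi) / 2   # any point strictly inside interval (1)
outside = hi + 0.01      # a point just above the upper endpoint
print(likelihood(inside, xs))   # 1.0: the maximal likelihood value
print(likelihood(outside, xs))  # 0.0: outside the interval the likelihood vanishes
```

Note that lo < hi always holds here, since every observation lies in an interval of length 1, forcing X(n) − X(1) < 1.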

Now we can consider the particular questions at hand.

(a) Let Θ̂1 = (1/2)(X(1) + X(n)). We need to check that this estimator satisfies (1). We just plug this estimator into (1) and get

X(n) − 1/2 < (1/2)(X(1) + X(n)) < X(1) + 1/2.

The latter relation is true because it is equivalent to the following valid inequality:

X(n) − X(1) < 1.
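The algebra in (a) can also be confirmed by simulation (an illustrative sketch, again assuming Uniform(θ − 1/2, θ + 1/2) samples): the midrange always lands inside interval (1), precisely because X(n) − X(1) < 1 always holds.

```python
import random

# Check over many random samples that Theta1 = (1/2)(X(1) + X(n))
# satisfies X(n) - 1/2 < Theta1 < X(1) + 1/2, i.e. interval (1).
random.seed(1)
for _ in range(1000):
    theta = random.uniform(-10, 10)
    xs = [random.uniform(theta - 0.5, theta + 0.5) for _ in range(20)]
    x1, xn = min(xs), max(xs)
    mid = 0.5 * (x1 + xn)
    # equivalent to X(n) - X(1) < 1, which is guaranteed by the model
    assert xn - 0.5 < mid < x1 + 0.5
print("midrange satisfies (1) in every trial")
```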

(b) Let Θ̂2 = (1/3)(X(1) + 2X(n)) be another candidate for the MLE. Then it should satisfy (1). In particular, if this is the MLE then

(1/3)(X(1) + 2X(n)) < X(1) + 1/2

should hold. The latter inequality is equivalent to

X(n) − X(1) < 3/4,

which obviously may fail to hold. This contradiction shows that this estimator, despite being a function of the sufficient statistic, is not the MLE.
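A concrete counterexample makes the failure in (b) explicit. The numbers below are hypothetical (any sample with X(n) − X(1) ≥ 3/4 works, and such spreads are perfectly possible since the model only forces X(n) − X(1) < 1):

```python
# Hypothetical sample extremes with spread 0.9 (legal, since 0.9 < 1):
x1, xn = 0.0, 0.9
theta2 = (x1 + 2 * xn) / 3   # = 0.6, the candidate estimator from (b)
upper = x1 + 0.5             # = 0.5, the upper endpoint of interval (1)
print(theta2 > upper)        # True: Theta2 falls outside interval (1)
```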

8. Problem 10.74. Here we are exploring the Bayesian approach, where the parameter of interest is treated as a realization of a random variable. For the problem at hand X ∼ Binom(n, θ), and θ is a realization (which we do not directly observe) of a beta random variable Θ ∼ Beta(α, β).

[Please note that here your knowledge of basic/classical distributions becomes absolutely crucial: you cannot solve any problem without knowing the formulae for the pmf/pdf, so it is time to refresh them.]

In other words, here we are observing a binomial random variable whose parameter (the probability of success) has a beta prior.

To find a Bayes estimator, we need to find the posterior distribution of the parameter of interest and then calculate its mean. By the standard beta-binomial conjugacy, the posterior given X = x is Beta(α + x, β + n − x), so its mean is (α + x)/(α + β + n). [Please note that your knowledge of the means of classical distributions becomes very handy here: as soon as you recognize the underlying posterior distribution, you can use the known formula for its mean.]
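The conjugacy argument can be sketched and cross-checked numerically (an illustration with made-up values of α, β, n, x, not from the problem statement): the closed-form posterior mean (α + x)/(α + β + n) should match a direct numerical integration of θ against the unnormalized posterior θ^(α+x−1)(1 − θ)^(β+n−x−1).

```python
# Bayes estimator from the Beta-Binomial conjugacy (standard result):
# prior Beta(a, b) + x successes in n trials -> posterior Beta(a + x, b + n - x).
def posterior_mean(a, b, n, x):
    return (a + x) / (a + b + n)

# Cross-check: numerically integrate theta against the unnormalized posterior.
def numeric_posterior_mean(a, b, n, x, steps=200_000):
    num = den = 0.0
    for i in range(1, steps):
        t = i / steps
        w = t ** (a + x - 1) * (1 - t) ** (b + n - x - 1)  # unnormalized posterior density
        num += t * w
        den += w
    return num / den

a, b, n, x = 2.0, 3.0, 10, 4   # illustrative values only
print(posterior_mean(a, b, n, x))                    # (2+4)/(2+3+10) = 0.4
print(round(numeric_posterior_mean(a, b, n, x), 3))  # agrees with the closed form
```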

