
SOLUTION FOR HOMEWORK 3, STAT 4352 Welcome to your third ...


Given this information, the posterior distribution of Θ given the observation X is

$$f_{\Theta|X}(\theta|x) = \frac{f_\Theta(\theta)\, f_{X|\Theta}(x|\theta)}{f_X(x)} = \frac{\Gamma(n+\alpha+\beta)}{\Gamma(x+\alpha)\,\Gamma(n-x+\beta)}\;\theta^{x+\alpha-1}(1-\theta)^{(n-x+\beta)-1}.$$

The algebra leading to the last equality is explained on page 345.

Now you can see that the posterior distribution is again Beta(x+α, n−x+β). There are two consequences of this fact. First, by definition, if the prior density and the posterior density belong to the same family of distributions, then the prior is called conjugate. Bayesian statisticians like this case a lot because it methodologically supports the Bayesian approach and also simplifies the formulae. Second, we know the formula for the mean of a beta RV, and using it we get the Bayesian estimator

$$\hat\Theta_B = E(\Theta|X) = \frac{X+\alpha}{(\alpha+X)+(n-X+\beta)} = \frac{X+\alpha}{\alpha+n+\beta}.$$
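As a sanity check on the closed-form posterior mean, the sketch below integrates θ against the Beta(x+α, n−x+β) posterior density numerically and compares the result with (X+α)/(α+n+β). The parameter values α = 2, β = 3, n = 10, x = 4 are illustrative, not taken from the exercise.

```python
import math

# Illustrative (made-up) values of the prior parameters and the data.
alpha, beta_, n, x = 2.0, 3.0, 10, 4

# Posterior is Beta(a, b) with a = x + alpha, b = n - x + beta.
a, b = x + alpha, n - x + beta_
B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)  # normalizing constant

# Crude Riemann sum of theta * posterior density over (0, 1).
m = 100_000
mean_numeric = sum(
    (k / m) * ((k / m) ** (a - 1)) * ((1 - k / m) ** (b - 1)) / B
    for k in range(1, m)
) / m

# The closed form derived above.
closed_form = (x + alpha) / (alpha + n + beta_)
print(round(mean_numeric, 4), closed_form)  # both are approximately 0.4
```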

Now we can actually consider the exercise at hand. A general remark: a Bayesian estimator is typically a linear combination of the prior mean and the MLE, with weights depending on the variances of these two estimates. In general, as n → ∞, the Bayesian estimator approaches the MLE.

Let us check that this is the case for the problem at hand. Write

$$\hat\Theta_B = \frac{X}{n}\cdot\frac{n}{\alpha+\beta+n} + \frac{\alpha}{\alpha+\beta}\cdot\frac{\alpha+\beta}{\alpha+\beta+n}.$$

Now, if we denote

$$w := \frac{n}{\alpha+\beta+n},$$

we get the desired representation

$$\hat\Theta_B = w\bar{X} + (1-w)\theta_0,$$

where $\bar{X} = X/n$ and $\theta_0 = E(\Theta) = \alpha/(\alpha+\beta)$ is the prior mean of Θ.
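The weighted-average form can be checked directly: for any illustrative values of α, β, n, and X (the ones below are made up), w·(X/n) + (1−w)·θ0 reproduces (X+α)/(α+n+β) exactly.

```python
# Illustrative (made-up) parameter values.
alpha, beta_, n, x = 2.0, 3.0, 10, 4

w = n / (alpha + beta_ + n)        # weight on the MLE X/n
theta0 = alpha / (alpha + beta_)   # prior mean of Theta

weighted = w * (x / n) + (1 - w) * theta0   # weighted-average form
closed = (x + alpha) / (alpha + n + beta_)  # direct posterior mean
print(weighted, closed)  # the two agree (up to floating point)
```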

Now, the problem at hand asks us to work a bit further on the weight. The variance of the beta RV Θ is

$$Var(\Theta) := \sigma_0^2 = \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}.$$

Well, it is plain to see that

$$\theta_0(1-\theta_0) = \frac{\alpha\beta}{(\alpha+\beta)^2}.$$

Then simple algebra yields

$$\sigma_0^2 = \frac{\theta_0(1-\theta_0)}{\alpha+\beta+1}.$$
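The variance identity can also be confirmed numerically; the sketch below evaluates both expressions for made-up prior parameters α = 2, β = 3.

```python
# Illustrative (made-up) prior parameters.
alpha, beta_ = 2.0, 3.0

theta0 = alpha / (alpha + beta_)  # prior mean

# Direct beta-variance formula.
var_direct = alpha * beta_ / ((alpha + beta_) ** 2 * (alpha + beta_ + 1))
# The identity derived above.
var_identity = theta0 * (1 - theta0) / (alpha + beta_ + 1)

print(round(var_direct, 10), round(var_identity, 10))  # both ≈ 0.04
```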

