
• The fairness of the coin does not change in time; that is, it is stationary.

With these assumptions in mind, we can now begin discussing the Bayesian procedure.

3.3 Recalling Bayes’ Rule

In the previous chapter we outlined Bayes' rule. I've repeated it here for completeness:

P(θ|D) = P(D|θ) P(θ) / P(D)    (3.1)

Where:

• P(θ) is the prior. This is the strength of our belief in θ without considering the evidence D: our prior view on the probability of how fair the coin is.

• P(θ|D) is the posterior. This is the (refined) strength of our belief in θ once the evidence D has been taken into account. After seeing 4 heads out of 8 flips, say, this is our updated view on the fairness of the coin.

• P(D|θ) is the likelihood. This is the probability of seeing the data D as generated by a model with parameter θ. If we knew the coin was fair, this tells us the probability of seeing a given number of heads in a particular number of flips.

• P(D) is the evidence. This is the probability of the data as determined by summing (or integrating) across all possible values of θ, weighted by how strongly we believe in those particular values of θ. If we had multiple views of what the fairness of the coin is (but didn't know for sure), then this tells us the probability of seeing a certain sequence of flips across all possibilities of our belief in the coin's fairness.

Note that we have three separate components to specify in order to calculate the posterior: the likelihood, the prior and the evidence. In the following sections we are going to discuss exactly how to specify each of these components for our particular case of inference on a binomial proportion.
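Before treating each component in turn, it may help to see them combined numerically. The sketch below (my own illustration, not taken from the text: the discrete grid of θ values and the uniform prior are assumptions) updates a belief about the coin's fairness after observing 4 heads in 8 flips, the example used above:

```python
from math import comb

# Candidate fairness values theta on a discrete grid (an illustrative
# discretisation -- the text works with continuous theta).
thetas = [i / 100 for i in range(101)]

# Prior P(theta): uniform belief across all candidate fairness values
prior = [1.0 / len(thetas)] * len(thetas)

# Likelihood P(D|theta): probability of k = 4 heads in n = 8 flips,
# binomial in theta
k, n = 4, 8
likelihood = [comb(n, k) * t**k * (1 - t)**(n - k) for t in thetas]

# Evidence P(D): the likelihood weighted by the prior, summed over theta
evidence = sum(l * p for l, p in zip(likelihood, prior))

# Posterior P(theta|D) via Bayes' rule
posterior = [l * p / evidence for l, p in zip(likelihood, prior)]

# The posterior sums to one and, for 4 heads in 8 flips with a uniform
# prior, peaks at theta = 0.5
best = max(zip(posterior, thetas))[1]
print(best)  # 0.5
```

Note how the evidence P(D) plays the role of a normalising constant: dividing by it is exactly what makes the posterior a proper probability distribution.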

3.4 The Likelihood Function

We have just outlined Bayes' rule and have seen that we must specify a likelihood function, a prior belief and the evidence (i.e. a normalising constant). In this section we are going to consider the first of these components, namely the likelihood.

3.4.1 Bernoulli Distribution

Our example is that of a sequence of coin flips. We are interested in the probability of the coin coming up heads. In particular, we are interested in the probability of the coin coming up heads as a function of the underlying fairness parameter θ.

This will take a functional form, f. If we denote by k the random variable that describes the result of the coin toss, drawn from the set {1, 0}, then k = 1 represents a head and k = 0 represents a tail.