
We then generate a uniform random number on the interval [0, 1]. If this number is contained within the interval [0, p] then we accept the move; otherwise we reject it.
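This accept/reject step can be sketched in plain Python. This is our own illustrative helper (the function name and log-space formulation are not from the text); working with log-posteriors is a standard trick to avoid numerical underflow:

```python
import math
import random

def metropolis_accept(log_post_new, log_post_current, rng=random):
    """Accept a proposed move with probability p = min(1, posterior ratio).

    The ratio is computed in log-space and exponentiated, which avoids
    underflow when the (unnormalised) posterior values are tiny.
    """
    p = min(1.0, math.exp(log_post_new - log_post_current))
    # Draw u ~ Uniform[0, 1]; accept the move if u lies in [0, p].
    return rng.random() <= p
```

Note that a proposal with a higher posterior gives p = 1 and is always accepted, while a worse proposal is still accepted occasionally, with probability equal to the posterior ratio.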

While this is a relatively simple algorithm, it isn't immediately clear why it makes sense or how it helps us avoid the intractable problem of calculating the evidence P(D), a high-dimensional integral.

As Thomas Wiecki[102] points out in his article on MCMC sampling, we are actually dividing the posterior of the proposed parameter by the posterior of the current parameter. Utilising Bayes' Rule eliminates the evidence P(D) from the ratio:

\frac{P(\theta_{\text{new}} \mid D)}{P(\theta_{\text{current}} \mid D)}
  = \frac{P(D \mid \theta_{\text{new}})\, P(\theta_{\text{new}}) / P(D)}{P(D \mid \theta_{\text{current}})\, P(\theta_{\text{current}}) / P(D)}
  = \frac{P(D \mid \theta_{\text{new}})\, P(\theta_{\text{new}})}{P(D \mid \theta_{\text{current}})\, P(\theta_{\text{current}})}
  \tag{4.4}

The right-hand side of the latter equality contains only the likelihoods and the priors, both of which we can calculate easily. Hence by dividing the posterior at one position by the posterior at another, we sample regions of higher posterior probability more often than not, in a manner that fully reflects the probability of the data.
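We can verify this cancellation numerically. In the sketch below (our own example, with made-up coin-flip data and a Beta(2, 2) prior, neither of which appears in the text), the ratio of unnormalised posteriors, which ignores P(D) and all normalising constants, agrees exactly with the ratio of the true, fully normalised posteriors:

```python
import math

# Hypothetical coin-flip data: 6 heads, 4 tails, with a Beta(2, 2) prior.
HEADS, TAILS = 6, 4
A, B = 2, 2

def log_unnormalised_posterior(theta):
    """log[P(D|theta) P(theta)] up to an additive constant.

    The evidence P(D) and the Beta normalising constant are both omitted,
    exactly as the cancellation in the Metropolis ratio permits.
    """
    return ((HEADS + A - 1) * math.log(theta)
            + (TAILS + B - 1) * math.log(1.0 - theta))

def log_normalised_posterior(theta):
    """Exact log posterior: a Beta(HEADS + A, TAILS + B) density."""
    a, b = HEADS + A, TAILS + B
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return log_norm + (a - 1) * math.log(theta) + (b - 1) * math.log(1.0 - theta)

theta_new, theta_current = 0.7, 0.4
ratio_unnorm = math.exp(log_unnormalised_posterior(theta_new)
                        - log_unnormalised_posterior(theta_current))
ratio_exact = math.exp(log_normalised_posterior(theta_new)
                       - log_normalised_posterior(theta_current))
# The two ratios agree: P(D) cancels, so the acceptance probability
# never requires the intractable evidence integral.
```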

4.4 Introducing PyMC3

PyMC3[10] is a Python library that carries out "Probabilistic Programming". That is, we can define a probabilistic model specification and then carry out Bayesian inference on the model, using various flavours of Markov Chain Monte Carlo. In this sense it is similar to the JAGS and Stan packages. PyMC3 has a long list of contributors and is currently under active development.

PyMC3 has been designed with a clean syntax that allows extremely straightforward model specification, with minimal "boilerplate" code. There are classes for all major probability distributions and it is easy to add more specialist distributions. It has a diverse and powerful suite of MCMC sampling algorithms, including the Metropolis algorithm that we discussed above, as well as the No-U-Turn Sampler (NUTS). This allows us to define complex models with many thousands of parameters. It also makes use of the Python Theano[94] library, often used for highly GPU-intensive Deep Learning applications, in order to maximise efficiency in execution speed.

In this chapter we will use PyMC3 to carry out a simple example of inferring a binomial proportion. This is sufficient to express the main ideas of MCMC without getting bogged down in implementation specifics. In later chapters we will explore more features of PyMC3 by carrying out inference on more sophisticated models.

4.5 Inferring a Binomial Proportion with Markov Chain Monte Carlo

If you recall from the previous chapter on inferring a binomial proportion using conjugate priors, our goal was to estimate the fairness of a coin by carrying out a sequence of coin flips.

The fairness of the coin is given by a parameter θ ∈ [0, 1], where θ = 0.5 means a coin equally likely to come up heads or tails.
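Before turning to PyMC3, the whole procedure can be sketched by hand in plain Python. This is not the book's code: the data (30 heads, 20 tails), the uniform prior, the Gaussian proposal step size, and the burn-in length are all illustrative assumptions, and PyMC3 automates exactly this loop (with far better samplers) in what follows:

```python
import math
import random

def log_posterior(theta, heads, tails):
    """Unnormalised log posterior for the coin fairness theta.

    Bernoulli likelihood with a uniform Beta(1, 1) prior; the evidence
    P(D) is omitted because it cancels in the Metropolis ratio.
    """
    if theta <= 0.0 or theta >= 1.0:
        return -math.inf  # zero posterior mass outside [0, 1]
    return heads * math.log(theta) + tails * math.log(1.0 - theta)

def metropolis_coin(heads, tails, n_steps=20000, step=0.1, seed=42):
    """Random-walk Metropolis sampler for the coin's fairness theta."""
    random.seed(seed)
    theta = 0.5  # start the chain at a fair coin
    samples = []
    for _ in range(n_steps):
        # Propose a new theta by perturbing the current one.
        proposal = theta + random.gauss(0.0, step)
        log_ratio = (log_posterior(proposal, heads, tails)
                     - log_posterior(theta, heads, tails))
        # Accept with probability min(1, posterior ratio).
        if random.random() < math.exp(min(0.0, log_ratio)):
            theta = proposal
        samples.append(theta)
    return samples

samples = metropolis_coin(heads=30, tails=20)[2000:]  # discard burn-in
posterior_mean = sum(samples) / len(samples)
```

Under the uniform prior the exact posterior is Beta(31, 21), whose mean is 31/52 ≈ 0.60, so the chain's sample mean should land close to that value.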
