
9.3 White Noise

We will begin by motivating the concept of White Noise.

Above, we mentioned that our basic approach was to try fitting models to a time series until

the remaining series lacked any serial correlation. This motivates the definition of the residual

error series:

Definition 9.3.1. Residual Error Series. The residual error series, or residuals, $x_t$, is a time series of the differences between the observed values and the values predicted by a time series model, at each time $t$.

If $y_t$ is the observed value and $\hat{y}_t$ is the predicted value, then the residuals are given by $x_t = y_t - \hat{y}_t$.
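As a concrete illustration, the following minimal Python sketch (using NumPy, with hypothetical placeholder values for the observations and predictions) forms a residual series from an observed series and a set of model predictions:

import numpy as np

# Hypothetical observed values y_t and model predictions yhat_t
y = np.array([1.2, 0.8, 1.5, 1.1, 0.9])
yhat = np.array([1.0, 0.9, 1.4, 1.2, 1.0])

# Residual error series: x_t = y_t - yhat_t
residuals = y - yhat
print(residuals)  # [ 0.2 -0.1  0.1 -0.1 -0.1]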

The key point is that if our chosen time series model is able to "explain" the serial correlation in the observations, then the residuals themselves are serially uncorrelated. Ideally, each element of the residual series can then be treated as an independent realisation from some probability distribution. That is, the residuals behave as independent and identically distributed (i.i.d.) random variables.

Hence, if we are to begin creating time series models that explain away any serial correlation,

it seems natural to begin with a process that produces independent random variables from some

distribution. This directly leads on to the concept of (discrete) white noise:

Definition 9.3.2. Discrete White Noise. Consider a time series $\{w_t : t = 1, \ldots, n\}$. If the elements of the series, $w_t$, are independent and identically distributed (i.i.d.), with a mean of zero, variance $\sigma^2$ and no serial correlation (i.e. $\mathrm{Cor}(w_i, w_j) = 0, \; \forall i \neq j$), then we say that the time series is discrete white noise (DWN).

In particular, if the values $w_t$ are drawn from a normal distribution with zero mean and variance $\sigma^2$ (i.e. $w_t \sim \mathcal{N}(0, \sigma^2)$), then the series is known as Gaussian White Noise.

White Noise is useful in many contexts. In particular, it can be used to simulate a synthetic

series.

Recall that a historical time series is only one observed instance. If we can simulate multiple

realisations then we can create "many histories" and thus generate statistics for some of the

parameters of particular models. This will help us refine our models and thus increase accuracy

in our forecasting.
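As an illustration, here is a minimal Python sketch (assuming NumPy is available; the number of realisations, the series length and the variance are arbitrary choices) that simulates many independent realisations of Gaussian white noise and summarises the sample statistics across them:

import numpy as np

rng = np.random.default_rng(42)

n_realisations = 1000   # "many histories"
n_obs = 500             # length of each simulated series
sigma = 1.0             # standard deviation of the white noise

# Each row is one realisation of Gaussian white noise w_t ~ N(0, sigma^2)
w = rng.normal(loc=0.0, scale=sigma, size=(n_realisations, n_obs))

# Sample statistics across realisations; these should be close to the
# theoretical values mu_w = 0 and variance sigma^2 = 1
print("mean of sample means:    ", w.mean(axis=1).mean())
print("mean of sample variances:", w.var(axis=1, ddof=1).mean())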

Now that we have defined Discrete White Noise we are going to examine some of its attributes, including its second-order properties and correlogram.

9.3.1 Second-Order Properties

The second-order properties of DWN are straightforward and follow directly from the definition. In particular, the mean of the series is zero and, by definition, there is no autocorrelation at non-zero lags:

\mu_w = E(w_t) = 0 \qquad (9.1)

\rho_k = \mathrm{Cor}(w_t, w_{t+k}) =
\begin{cases}
1 & \text{if } k = 0 \\
0 & \text{if } k \neq 0
\end{cases}
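As a quick numerical check of these properties, the sketch below (a minimal illustration in Python using only NumPy, with a hand-rolled sample autocorrelation rather than a statistics library) simulates a Gaussian white noise series and estimates its autocorrelations; $\rho_0$ is exactly 1 and the remaining lags should be close to zero:

import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, size=5000)  # simulated Gaussian white noise

def acf(x, max_lag):
    # Sample autocorrelation rho_k for lags k = 0, ..., max_lag
    x = x - x.mean()
    denom = np.sum(x ** 2)
    return np.array([np.sum(x[:len(x) - k] * x[k:]) / denom
                     for k in range(max_lag + 1)])

print(acf(w, max_lag=5))  # first entry is 1, the rest are approximately 0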
