MACHINE LEARNING TECHNIQUES - LASA


4 Regression Techniques

There is a growing interest in machine learning in designing powerful algorithms for performing nonlinear regression. We will consider a few of these here. The principle behind the technique presented here is the same as that used in most other variants found in the literature, and it therefore offers a good background for the interested reader.

Consider a multi-dimensional (zero-mean) variable $x \in \mathbb{R}^N$ and a one-dimensional variable $y \in \mathbb{R}$. Regression techniques aim at approximating a relationship $f$ between $y$ and $x$ by building a model of the form:

$$y = f(x) \qquad (4.1)$$

4.1 Linear Regression

The most classical technique, with which the reader will probably be familiar, is the linear regression model, whereby one assumes that $f$ is a linear function parametrized by $w \in \mathbb{R}^N$, that is:

$$y = f(x, w) = w^T x \qquad (4.2)$$

For a given instance of the pair $x$ and $y$ one can solve explicitly for $w$. Consider now the case where we are provided with a set of $M$ observed instances $X = \{x^i\}_{i=1}^{M}$ and $Y = \{y^i\}_{i=1}^{M}$ of the variables $x$ and $y$, such that the observation of $y$ has been corrupted by some noise, which may or may not be a function of $x$, i.e. $\varepsilon(x)$:

$$y = w^T x + \varepsilon(x) \qquad (4.3)$$

The classical means to estimate the parameters $w$ is least squares (minimizing the mean squared error), which we review next.
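As a concrete illustration of the noisy linear model (4.3) and its least-squares estimate, here is a minimal NumPy sketch; the synthetic data, the noise level, and all variable names are illustrative assumptions, not part of the original text:

```python
import numpy as np

# Generate M noisy observations of the linear model y = w^T x + eps (Eq. 4.3).
rng = np.random.default_rng(42)
M, N = 100, 3
w_true = np.array([2.0, -1.0, 0.5])            # ground-truth parameters (assumed)
X = rng.normal(size=(M, N))                     # observed inputs x^i
y = X @ w_true + 0.01 * rng.normal(size=M)      # outputs y^i corrupted by noise eps

# Least-squares estimate: w_hat = argmin_w ||X w - y||^2
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With low noise and many observations, `w_hat` closely recovers `w_true`; the next section discusses what happens when the inputs are high-dimensional or collinear, which motivates latent-variable methods such as PLS.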

4.2 Partial Least Squares Methods

Adapted from R. Rosipal and N. Kramer, Overview and Recent Advances in Partial Least Squares, C. Saunders et al. (Eds.): SLSFS 2005, LNCS 3940, pp. 34–51, 2006.

Partial Least Squares (PLS) refers to a wide class of methods for modeling relations between sets of observed variables by means of latent variables. It comprises regression and classification tasks as well as dimension reduction techniques and modeling tools. As the name implies, PLS is based on least-squares regression. Consider a set of $M$ pairs of variables $X = \{x^i\}_{i=1}^{M}$ and $Y = \{y^i\}_{i=1}^{M}$; least-squares regression looks for a mapping $w$ that sends $X$ onto $Y$.
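For a one-dimensional response, the classical PLS1 procedure extracts latent components by the NIPALS iteration. The following is a minimal NumPy sketch under that assumption; the function name `pls1` and all variable names are my own, and this is an illustration of the standard algorithm, not the implementation referenced above:

```python
import numpy as np

def pls1(X, y, n_components):
    """NIPALS sketch of PLS1 for centered X (M x N) and centered y (M,)."""
    X, y = X.astype(float).copy(), y.astype(float).copy()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)        # weight vector (direction of max covariance)
        t = X @ w                     # latent score
        tt = t @ t
        p = X.T @ t / tt              # X loading
        qk = y @ t / tt               # y loading
        X -= np.outer(t, p)           # deflate X
        y -= qk * t                   # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    # Regression coefficients B such that y_hat = X @ B (in the original X)
    return W @ np.linalg.solve(P.T @ W, q)
```

When the number of components equals the rank of $X$, the PLS1 coefficients coincide with the ordinary least-squares solution; using fewer components regularizes the estimate, which is the main appeal of PLS for collinear or high-dimensional inputs.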

© A.G.Billard 2004 – Last Update March 2011
