

bought from, or sold to firms outside the state, statistics on which are not available.

Define

$y_t$ = number (in millions) of growing pullets in the state, on the first day of month $t$.

$d_t$ = number (in millions) of day-old chickens hatched by hatcheries in the state in month $t$ (from government statistics).

Here the $d_t$ are not variables, but given data. People in the business of producing chicken feed are very much interested in getting estimates of $y_t$ from the $d_t$, since this provides useful information for their production planning. Not all the day-old chickens placed by hatcheries in a month may be alive in a future month. Also, after five months of age they are recorded as hens and no longer form part of the population of growing pullets. So the appropriate linear regression model for $y_t$ as a function of the $d_t$'s seems to be $y_t = \beta_0 + \sum_{i=1}^{5} \beta_i d_{t-i}$, where $\beta_0$ is the number of pullets in the census that are not registered as being hatched (pullets imported into the State, or chickens exported from the State), and $\beta_i$ is a survival rate (the proportion of chickens placed in month $t-i$ that are alive in month $t$, $i = 1$ to $5$). We, of course, expect the parameters $\beta_i$ to satisfy the constraints

$$
0 \le \beta_5 \le \beta_4 \le \beta_3 \le \beta_2 \le \beta_1 \le 1. \tag{1.21}
$$

To get the best estimates for the parameters $\beta = (\beta_0, \beta_1, \beta_2, \beta_3, \beta_4, \beta_5)^T$ from past data, the least squares method could be used. Given data on $y_t$, $d_t$ over a period of time (say for the last 10 years), define $L_2(\beta) = \sum_t \bigl( y_t - \beta_0 - \sum_{i=1}^{5} \beta_i d_{t-i} \bigr)^2$. Under the least squares method, the best values for $\beta$ are taken to be those that minimize $L_2(\beta)$ subject to the constraints (1.21). This is clearly a quadratic programming problem.
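As a concrete illustration, the sketch below sets up this constrained least-squares problem numerically. It is only a sketch under stated assumptions: the placement data `d`, the "true" survival rates used to simulate `y`, and the choice of SciPy's SLSQP solver are all illustrative and not part of the text.

```python
# A minimal sketch (assumptions only) of the constrained least-squares
# problem above, using synthetic data and SciPy's SLSQP solver.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
T = 120                                   # ten years of monthly data (assumed)
d = rng.uniform(20.0, 40.0, size=T)       # hypothetical day-old-chicken placements

# Hypothetical "true" parameters respecting (1.21), used only to simulate y_t.
beta_true = np.array([2.0, 0.98, 0.95, 0.92, 0.90, 0.88])
rows = range(5, T)
X = np.array([[1.0] + [d[t - i] for i in range(1, 6)] for t in rows])
y = X @ beta_true + rng.normal(0.0, 0.5, size=len(X))

def L2(beta):
    # L2(beta) = sum_t (y_t - beta_0 - sum_i beta_i d_{t-i})^2
    r = y - X @ beta
    return r @ r

# Constraints (1.21): 0 <= beta_5 <= beta_4 <= beta_3 <= beta_2 <= beta_1 <= 1.
cons = [{"type": "ineq", "fun": lambda b, k=k: b[k] - b[k + 1]} for k in range(1, 5)]
cons += [{"type": "ineq", "fun": lambda b: b[5]},          # beta_5 >= 0
         {"type": "ineq", "fun": lambda b: 1.0 - b[1]}]    # beta_1 <= 1

res = minimize(L2, x0=np.full(6, 0.5), method="SLSQP", constraints=cons)
print(res.x)   # estimated (beta_0, beta_1, ..., beta_5)
```

Any quadratic programming solver could be substituted for SLSQP here; the essential point is that the ordering constraints (1.21) enter the problem as linear inequalities.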

One may be tempted to simplify this problem by ignoring the constraints (1.21) on the parameters $\beta$. The unconstrained minimum of $L_2(\beta)$ can be found very easily by solving the system of equations $\partial L_2(\beta) / \partial \beta = 0$.
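For comparison, a minimal sketch of this unconstrained step (assuming the same design matrix `X` and observations `y` as in the previous sketch):

```python
# Minimal sketch (assumed, not from the book): ignoring (1.21), setting
# dL2/dbeta = 0 gives the normal equations (X^T X) beta = X^T y.
# np.linalg.lstsq solves the same problem without forming X^T X explicitly.
import numpy as np

def unconstrained_least_squares(X, y):
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta_hat
```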

There are two main difficulties with this approach. The first is that the solution of this system of equations requires the handling of a square matrix $(a_{ij})$ with $a_{ij} = 1/(i + j - 1)$, known as the Hilbert matrix, which is difficult to use in actual computation because of ill-conditioning. It magnifies the uncertainty in the data by very large factors. We will illustrate this using the Hilbert matrix of order 2. This matrix is

$$
H_2 = \begin{pmatrix} 1 & \tfrac{1}{2} \\ \tfrac{1}{2} & \tfrac{1}{3} \end{pmatrix}.
$$

Consider the following system of linear equations with $H_2$ as the coefficient matrix.

$$
\begin{pmatrix} 1 & \tfrac{1}{2} \\ \tfrac{1}{2} & \tfrac{1}{3} \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
=
\begin{pmatrix} b_1 \\ b_2 \end{pmatrix}
$$
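To see the magnification effect numerically, here is a small sketch (not from the book) that computes the condition number of $H_2$ and solves the system for two nearby right-hand sides; the particular $b$ and the size of the perturbation are assumptions chosen for illustration.

```python
# Small illustration (assumed, not from the book) of ill-conditioning
# in the order-2 Hilbert matrix.
import numpy as np

H2 = np.array([[1.0, 1.0 / 2.0],
               [1.0 / 2.0, 1.0 / 3.0]])

b = np.array([1.5, 5.0 / 6.0])              # chosen so the exact solution is x = (1, 1)
b_perturbed = b + np.array([0.01, -0.01])   # small uncertainty in the data

x = np.linalg.solve(H2, b)
x_perturbed = np.linalg.solve(H2, b_perturbed)

print("cond(H2):", np.linalg.cond(H2))      # about 19.3, even at order 2
print("x        =", x)                      # [1. 1.]
print("x'       =", x_perturbed)            # roughly [1.10, 0.82]
```

A change of 0.01 in each component of $b$ moves the solution by up to 0.18 in this example, and the magnification grows rapidly with the order of the Hilbert matrix.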
