Greville's Method for Preconditioning Least Squares ...


RIF-preconditioned GMRES did not converge within 2000 steps for any of the dropping tolerances τ_d we tried. The column "deficiency detected" gives the linearly dependent columns detected by our algorithm; −1260 means that the 1260th column, which is a rank-deficient column, was missed by our algorithm. For example, when τ_d = 10⁻⁶ and τ_s = 10⁻¹⁰, our preconditioning algorithm detected 12 rank-deficient columns, missed the 1239th, 1261st, and 1278th columns, and did not wrongly flag any linearly independent column as dependent. Hence, Assumption 1 is satisfied. When τ_d = 10⁻⁶ and τ_s = 10⁻⁷, our preconditioning algorithm found exactly the 15 linearly dependent columns. When τ_d = 10⁻⁶ and τ_s = 10⁻⁵, many more than 15 columns were detected; thus, Assumption 1 is not satisfied, and hence the preconditioned GMRES did not converge. From this example we can see that the numerical results agree with our theory very well. From

Figure 1, we obtain a better insight into this situation. We observe that when the switching tolerance τ_s = 10⁻⁵, ‖Aᵀr‖₂/‖Aᵀb‖₂ stagnates at the level 10⁻¹ in the end, which means that the computed solution is not a solution of the original least squares problem. This phenomenon illustrates that Assumption 1 is necessary.

[Figure 1: Residual curves for lp_cycle with τ_d = 10⁻⁶; vertical axis log₁₀(‖Aᵀr‖₂/‖Aᵀb‖₂), horizontal axis iteration number (0–200); one curve each for τ_s = 10⁻⁵, 10⁻⁷, and 10⁻¹⁰.]

Fig. 1 Convergence curve for lp_cycle
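The tolerance-driven behavior discussed above can be illustrated with a small sketch. The snippet below is not the paper's Greville-based algorithm; it is a hedged stand-in that flags a column as numerically linearly dependent when, after projecting out the columns already accepted (modified Gram–Schmidt), its remaining component is smaller than a tolerance times its original norm. The function name, the test matrix, and the tolerance values are all illustrative assumptions; the point is only that a tolerance that is too tight misses dependent columns, while a suitable one finds them.

```python
import numpy as np

def detect_dependent_columns(A, tol):
    """Flag columns of A that are numerically dependent on the columns
    already accepted, via modified Gram-Schmidt: a column is declared
    dependent when the norm of its component orthogonal to the accepted
    columns falls below tol * ||column||.  (Illustrative stand-in only,
    not the paper's Greville-based detection.)"""
    m, n = A.shape
    Q = []          # accepted orthonormal directions
    dependent = []  # indices of columns flagged as dependent
    for j in range(n):
        v = A[:, j].astype(float).copy()
        norm0 = np.linalg.norm(v)
        for q in Q:
            v -= (q @ v) * q          # project out accepted directions
        if np.linalg.norm(v) <= tol * max(norm0, 1.0):
            dependent.append(j)       # numerically dependent column
        else:
            Q.append(v / np.linalg.norm(v))
    return dependent

# Hypothetical example: columns 0 and 1 are independent,
# column 2 = column 0 + column 1 plus tiny (1e-9) noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2))
A = np.hstack([A, (A[:, :1] + A[:, 1:2]) + 1e-9 * rng.standard_normal((6, 1))])

print(detect_dependent_columns(A, tol=1e-6))  # → [2]: dependent column found
print(detect_dependent_columns(A, tol=0.0))   # → []: tolerance too tight, missed
```

In the same spirit as the experiment in the text, loosening the tolerance too far would start flagging genuinely independent columns as dependent, which is exactly the failure mode that violates Assumption 1.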

9 Conclusion<br />

In this paper, we proposed a new preconditioner for least squares problems. When the matrix A has full column rank, our preconditioning method is similar to the RIF preconditioner [3]. When A is rank-deficient, our preconditioner is also rank-deficient. We proved that under Assumption 1, using our preconditioners, the preconditioned
