Greville's Method for Preconditioning Least Squares ... - Projects
Theorem 10 Let A ∈ R^{m×n}. If Assumption 1 holds, then GMRES determines a solution to the right preconditioned problem

min_{y ∈ R^m} ‖b − AMy‖_2    (7.21)

before breakdown happens.
Proof The proof is the same as that of Theorem 6. ✷
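As a concrete illustration (a hypothetical sketch, assuming NumPy: the exact pseudoinverse A^+ stands in for the approximate generalized inverse M, and a dense least squares solve stands in for GMRES), solving the right preconditioned problem (7.21) and mapping back via x = My recovers a minimizer of the original problem:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 5                       # overdetermined case: m >= n
A = rng.standard_normal((m, n))   # generically of full column rank
b = rng.standard_normal(m)

# Idealized stand-in for the preconditioner M: the exact generalized
# inverse A^+ (Algorithm 1 would only approximate it).
M = np.linalg.pinv(A)             # n x m, so AM is a square m x m matrix

# Solve the right preconditioned problem min_{y in R^m} ||b - AMy||_2,
# with a dense solver standing in for GMRES.
y, *_ = np.linalg.lstsq(A @ M, b, rcond=None)
x = M @ y                         # map back to the original unknown

# Compare with the minimum of the original problem min_x ||b - Ax||_2.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
r_prec = np.linalg.norm(b - A @ x)
r_min = np.linalg.norm(b - A @ x_ref)
assert abs(r_prec - r_min) < 1e-10
```

With an exact generalized inverse, AM is the orthogonal projector onto range(A), so the preconditioned residual coincides with the minimal residual of the original problem.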
Theorem 11 Let A ∈ R^{m×n}. If Assumption 1 holds, then for any b ∈ R^m, GMRES determines a least squares solution of

min_{y ∈ R^m} ‖b − AMy‖_2    (7.22)

before breakdown, and this solution attains min_{x ∈ R^n} ‖b − Ax‖_2, where M is computed by Algorithm 1.
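Algorithm 1 is not reproduced in this excerpt; it builds on Greville's classical column-by-column recursion for the generalized inverse. A minimal dense sketch of that recursion follows (the variable names, the dense representation, and the exact form of the dependence tolerance τ_s are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

def greville_pinv(A, tau_s=1e-12):
    """Greville's column-recursive generalized inverse (dense sketch).

    A column a_i is treated as linearly dependent on the previous
    columns when its residual c_i = a_i - A_{i-1} d_i satisfies
    ||c_i||_2 <= tau_s * ||A_{i-1}||_F * ||a_i||_2.
    """
    m, n = A.shape
    a1 = A[:, :1]
    # First column: a_1^+ = a_1^T / ||a_1||_2^2 (assumes a_1 != 0).
    Apinv = a1.T / (a1.T @ a1)
    for i in range(1, n):
        Ai_1 = A[:, :i]                     # A_{i-1}
        ai = A[:, i:i+1]                    # a_i
        d = Apinv @ ai                      # d_i = A_{i-1}^+ a_i
        c = ai - Ai_1 @ d                   # residual of a_i
        if np.linalg.norm(c) > tau_s * np.linalg.norm(Ai_1) * np.linalg.norm(ai):
            bT = c.T / (c.T @ c)            # independent column: b_i^T = c_i^+
        else:
            bT = d.T @ Apinv / (1.0 + d.T @ d)  # dependent column
        # A_i^+ = [A_{i-1}^+ - d_i b_i^T ; b_i^T]
        Apinv = np.vstack([Apinv - d @ bT, bT])
    return Apinv

A = np.random.default_rng(2).standard_normal((6, 4))
assert np.allclose(greville_pinv(A), np.linalg.pinv(A), atol=1e-8)
```

The recursion appends one row of the generalized inverse per column of A processed, which is the row-by-row construction referred to in the remark below.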
We would like to remark that when m < n it is preferable to apply Algorithm 1 and Algorithm 2 to A^T rather than to A, for the following three reasons. By doing so, we construct M̂, an approximate generalized inverse of A^T; we can then use M̂^T as the preconditioner for the original least squares problem.
1. In Algorithm 1 and Algorithm 2, the approximate generalized inverse is constructed row by row. Hence, we perform a loop that goes through all the columns of A once. When m ≥ n, this loop is relatively short. However, when m < n, this loop can become very long, and the preconditioning becomes more time-consuming.
2. Another reason is that linear dependence will always occur in this case, even though matrix A has full row rank: if m < n, the n columns of A are vectors in R^m and hence cannot all be linearly independent. In the column loop, a column a_i is accepted as linearly independent only when its residual norm with respect to A_{i−1} exceeds τ_s ‖A_{i−1}‖_F ‖a_i‖_2.
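The transpose trick described in the remark above can be sketched as follows (again a hypothetical NumPy illustration: the exact pseudoinverse serves as an idealized M̂, and a dense least squares solve stands in for GMRES):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 4, 9                       # underdetermined case: m < n
A = rng.standard_normal((m, n))   # generically of full row rank
b = rng.standard_normal(m)

# Apply the construction to A^T (n x m, of full column rank): Mhat
# approximates the generalized inverse of A^T; the exact pseudoinverse
# is used here as an idealized stand-in.
Mhat = np.linalg.pinv(A.T)        # m x n

# Use Mhat^T (n x m) as the right preconditioner for the original problem.
M = Mhat.T
y, *_ = np.linalg.lstsq(A @ M, b, rcond=None)
x = M @ y

# The residual matches the minimum of the unpreconditioned problem
# min_x ||b - Ax||_2 (zero here, since A has full row rank).
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
assert abs(np.linalg.norm(b - A @ x) - np.linalg.norm(b - A @ x_ref)) < 1e-10
```

Since (A^T)^+ transposed equals A^+ exactly, the idealized M̂^T coincides with the idealized preconditioner for A itself; the point of the trick is that the approximate construction loops over only m columns of A^T instead of n columns of A.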