
RIF preconditioned GMRES did not converge within 2000 steps for any of the dropping tolerances τ_d we tried. The column "deficiency detected" lists the linearly dependent columns detected by our algorithm; an entry such as −1260 means that the 1260th column, which is a rank deficient column, was missed by our algorithm. For example, when τ_d = 10^{-6} and τ_s = 10^{-10}, our preconditioning algorithm detected 12 rank deficient columns, missed the 1239th, 1261st, and 1278th columns, and did not falsely flag any linearly independent columns; hence Assumption 1 is satisfied. When τ_d = 10^{-6} and τ_s = 10^{-7}, our algorithm found exactly the 15 linearly dependent columns. When τ_d = 10^{-6} and τ_s = 10^{-5}, many more than 15 columns were flagged, so Assumption 1 is not satisfied, and consequently the preconditioned GMRES did not converge. This example shows that the numerical results agree very well with our theory.

Figure 1 gives further insight into this situation.

[Fig. 1: Convergence curves for lp_cycle with τ_d = 10^{-6}; log_{10}(‖A^T r‖_2 / ‖A^T b‖_2) versus iteration number for τ_s = 10^{-5}, 10^{-7}, and 10^{-10}.]

We observe that when the switching tolerance is τ_s = 10^{-5}, the relative residual ‖A^T r‖_2 / ‖A^T b‖_2 stagnates at the level 10^{-1}, which means that the computed solution is not a solution to the original least squares problem. This phenomenon illustrates that Assumption 1 is necessary.
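The quantity plotted in Figure 1 is straightforward to monitor in practice. The sketch below (a minimal illustration, not the authors' implementation; the small rank-deficient test matrix is an invented placeholder, not the lp_cycle problem) computes the relative residual ‖A^T r‖_2 / ‖A^T b‖_2 used above to judge whether an iterate is a least squares solution of the original problem.

```python
import numpy as np

def relative_normal_residual(A, b, x):
    """Return ||A^T r||_2 / ||A^T b||_2 with r = b - A x.

    A value that stalls well above the convergence tolerance (e.g. near 1e-1,
    as for tau_s = 1e-5 in Fig. 1) indicates that x does not solve the
    original least squares problem min ||b - A x||_2.
    """
    r = b - A @ x
    return np.linalg.norm(A.T @ r) / np.linalg.norm(A.T @ b)

# Hypothetical usage with a small rank deficient matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
A[:, 5] = A[:, 3]                            # make column 5 depend on column 3
b = rng.standard_normal(50)
x, *_ = np.linalg.lstsq(A, b, rcond=None)    # reference LS solution
print(relative_normal_residual(A, b, x))     # ~1e-15 for a true LS solution
```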

9 Conclusion

In this paper, we proposed a new preconditioner for least squares problems. When the matrix A has full column rank, our preconditioning method is similar to the RIF preconditioner [3]. When A is rank deficient, our preconditioner is also rank deficient. We proved that, under Assumption 1, the preconditioned problems are equivalent to the original problems, and that GMRES determines a solution to the preconditioned problem. Numerical experiments confirmed our theory and Assumption 1. The results showed that our preconditioner performs competitively for both ill-conditioned full column rank matrices and ill-conditioned rank deficient matrices. For rank deficient problems, the RIF preconditioner may fail, whereas our preconditioner worked efficiently.

References

1. M. Benzi and M. Tůma, A sparse approximate inverse preconditioner for nonsymmetric linear systems, SIAM Journal on Scientific Computing, 19 (1998), pp. 968–994.
2. M. Benzi and M. Tůma, A robust incomplete factorization preconditioner for positive definite matrices, Numerical Linear Algebra with Applications, 10 (2003), pp. 385–400.
3. M. Benzi and M. Tůma, A robust preconditioner with low memory requirements for large sparse least squares problems, SIAM Journal on Scientific Computing, 25 (2003), pp. 499–512.
4. Å. Björck, Numerical Methods for Least Squares Problems, SIAM, Philadelphia, 1996.
5. M. Bollhöfer and Y. Saad, On the relations between ILUs and factored approximate inverses, SIAM Journal on Matrix Analysis and Applications, 24 (2002), pp. 219–237.
6. P. N. Brown and H. F. Walker, GMRES on (nearly) singular systems, SIAM Journal on Matrix Analysis and Applications, 18 (1997), pp. 37–51.
7. R. Bru, J. Cerdán, J. Marín, and J. Mas, Preconditioning sparse nonsymmetric linear systems with the Sherman–Morrison formula, SIAM Journal on Scientific Computing, 25 (2003), pp. 701–715.
8. R. Bru, J. Marín, J. Mas, and M. Tůma, Balanced incomplete factorization, SIAM Journal on Scientific Computing, 30 (2008), pp. 2302–2318.
9. S. L. Campbell and C. D. Meyer, Jr., Generalized Inverses of Linear Transformations, Pitman, London, 1979. Reprinted by Dover, New York, 1991.
10. E. Chow and Y. Saad, Approximate inverse preconditioners via sparse-sparse iterations, SIAM Journal on Scientific Computing, 19 (1998), pp. 995–1023.
11. T. A. Davis, University of Florida sparse matrix collection, NA Digest, 92 (1994).
12. J. A. Fill and D. E. Fishkind, The Moore–Penrose generalized inverse for sums of matrices, SIAM Journal on Matrix Analysis and Applications, 21 (2000), pp. 629–635.
13. T. N. E. Greville, Some applications of the pseudoinverse of a matrix, SIAM Review, 2 (1960), pp. 15–22.
14. K. Hayami, J.-F. Yin, and T. Ito, GMRES methods for least squares problems, Tech. Report NII-2007-009E, National Institute of Informatics, Tokyo, July 2007 (also to appear in SIAM Journal on Matrix Analysis and Applications).
15. Y. Saad, Iterative Methods for Sparse Linear Systems, Society for Industrial and Applied Mathematics, Philadelphia, PA, USA, second ed., 2003.
16. Y. Saad and M. H. Schultz, GMRES: A generalized minimal residual algorithm for solving nonsymmetric linear systems, SIAM Journal on Scientific and Statistical Computing, 7 (1986), pp. 856–869.
17. G. Wang, Y. Wei, and S. Qiao, Generalized Inverses: Theory and Computations, Science Press, Beijing, 2003.
18. P.-Å. Wedin, Perturbation theory for pseudo-inverses, BIT Numerical Mathematics, 13 (1973), pp. 217–232.
19. J.-F. Yin and K. Hayami, Preconditioned GMRES methods with incomplete Givens orthogonalization method for large sparse least-squares problems, Journal of Computational and Applied Mathematics, 226 (2009), pp. 177–186.
20. N. Zhang and Y. Wei, On the convergence of general stationary iterative methods for range-Hermitian singular linear systems, Numerical Linear Algebra with Applications, published online (2009).

