# Acceleration of convergence and extrapolation methods

In several scientific domains one often has to deal with sequences and series that converge slowly, and this slow convergence is a serious drawback to their effective use. Extrapolation methods form the basis of new methods for solving these and related problems.
Extrapolation methods are now a specific area of numerical analysis with connections to many other important topics (Padé approximation, projection methods, ...). Besides accelerating the convergence of (scalar, vector, or matrix) sequences, they can also furnish new solution methods for various problems. In the past, particular attention was given to new approaches for recovering known sequence transformations and for obtaining new ones, in particular in the vector case. Useful tools leading to new vector sequence transformations have been the Schur complement and its extension to rectangular matrices, together with determinantal properties such as Sylvester's determinantal identity.
The corresponding recursive algorithms have to be derived and programmed with great care, and theoretical results on convergence and acceleration must also be sought.
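As a minimal illustration of what a sequence transformation does, here is a sketch (in Python, since the research codes themselves are not shown here) of the classical Aitken Δ² process applied to the partial sums of the Leibniz series for π/4 — a textbook example of a slowly converging sequence, chosen here only for illustration:

```python
import math

def aitken(s):
    """Aitken's delta-squared transformation of a scalar sequence.

    t_n = s_n - (s_{n+1} - s_n)^2 / (s_{n+2} - 2 s_{n+1} + s_n)
    """
    return [s[n] - (s[n + 1] - s[n]) ** 2 / (s[n + 2] - 2 * s[n + 1] + s[n])
            for n in range(len(s) - 2)]

# Partial sums of the Leibniz series 1 - 1/3 + 1/5 - ... -> pi/4,
# which converges very slowly (error ~ 1/n after n terms).
s, total = [], 0.0
for k in range(20):
    total += (-1) ** k / (2 * k + 1)
    s.append(total)

t = aitken(s)  # the transformed sequence converges much faster
```

After 20 terms the raw partial sums are still correct to only about two digits, while a single Aitken pass already gains several more; the transformation can also be iterated.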

For a book on the subject and a software library, follow this link.

Two main applications are considered:

• An important application of extrapolation methods is the acceleration of the iterative methods used in Web search. The most popular algorithm is PageRank (used by Google), and the power method is the underlying mathematical tool. But the dimension is huge and convergence is slow. One possibility for accelerating this process is to use extrapolation methods, also on parallel computers. Such procedures can also be applied to the computation of eigenvectors of reducible stochastic matrices.
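To fix ideas, here is a sketch of the basic power method for PageRank on a tiny, entirely hypothetical 4-page link graph (the matrix and damping value are illustrative; this is the slow baseline that extrapolation methods aim to accelerate, not Google's implementation):

```python
import numpy as np

def pagerank_power(A, alpha=0.85, tol=1e-10, max_iter=1000):
    """Power method for the PageRank vector of a column-stochastic matrix A.

    Iterates v <- alpha * A v + (1 - alpha)/n, the damped Google iteration;
    convergence is linear with rate ~alpha, hence slow for alpha near 1.
    """
    n = A.shape[0]
    v = np.ones(n) / n
    for _ in range(max_iter):
        v_new = alpha * (A @ v) + (1 - alpha) / n
        if np.linalg.norm(v_new - v, 1) < tol:
            return v_new
        v = v_new
    return v

# Hypothetical 4-page web graph; each column sums to 1 (column-stochastic).
A = np.array([[0.0, 0.0, 1.0, 0.5],
              [1/3, 0.0, 0.0, 0.0],
              [1/3, 0.5, 0.0, 0.5],
              [1/3, 0.5, 0.0, 0.0]])
r = pagerank_power(A)  # stationary ranking vector, entries sum to 1
```

On real Web matrices n is in the billions and alpha close to 1 makes this iteration very slow, which is precisely where (vector) extrapolation of the iterates pays off.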

• A well-known technique for solving an ill-conditioned linear system is regularization. When the regularization parameter tends to zero, the solution of the regularized system converges to the exact solution. Unfortunately, if the parameter is close to zero then, due to the ill-conditioning, the solution is computed inaccurately, while if the parameter is far from zero the solution is computed accurately but differs greatly from the exact one. The choice of a good value for the parameter is therefore an important problem. In 1998 we computed the regularized solutions for several values of the regularization parameter and then extrapolated them at zero. The numerical results obtained proved the effectiveness of this technique. A first attempt to extend it to the multiparameter case (which allows several regularization terms to be added simultaneously) was proposed in 2003, but it needs additional study and testing.
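The idea can be sketched as follows (a simplified illustration, not the 1998 algorithm: Tikhonov regularization of a Hilbert-matrix system, three hypothetical parameter values, and plain polynomial extrapolation to zero are all choices made here for the example):

```python
import numpy as np

# Classic ill-conditioned test problem: a Hilbert matrix with known solution.
n = 8
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_exact = np.ones(n)
b = A @ x_exact

def tikhonov(A, b, lam):
    """Tikhonov-regularized solution: minimize ||A x - b||^2 + lam ||x||^2."""
    m = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ b)

# Regularized solutions for several (well-conditioned) parameter values...
lams = [1e-4, 5e-5, 2.5e-5]
xs = [tikhonov(A, b, lam) for lam in lams]

# ...then extrapolate componentwise at lam = 0 with a quadratic in lam.
V = np.vander(lams, 3)                     # columns: lam^2, lam, 1
coeffs = np.linalg.solve(V, np.array(xs))  # interpolating polynomial
x_extrap = coeffs[-1]                      # its value at lam = 0
```

Each individual solve is well-conditioned because lam is kept away from zero; the extrapolation step then removes (most of) the regularization bias without ever solving the nearly singular system itself.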

Several of the proposed research subjects are related to structured matrices and their treatment. Last year, two of the participants released a beta version of an object-oriented specialized Matlab toolbox. Particular attention was paid to user-friendliness, to compatibility with the Matlab environment and its intrinsic computing functions, and to the optimization of computing time and memory requirements, focusing first on a few classes of structured matrices and on the basic algorithms for solving linear systems by preconditioned iterative methods. This toolbox, named SMT (Structured Matrices Toolbox), makes it possible to use structured matrices in a natural way, with the arithmetic operations implemented through fast algorithms. The package is completed by a set of test matrices and the most common preconditioners. Its beta version can be obtained on this page. We plan to extend it soon by including more classes of structured matrices (Hankel, Vandermonde, displacement structure, ...) and by adding optimized implementations of the best available fast algorithms for solving structured algebraic problems.

Michela Redivo Zaglia
Michela.RedivoZaglia@unipd.it