Speaker
Description
When efficient linear solvers for shifted systems are unavailable, polynomial Krylov subspace methods are often the only viable choice for computing $f(A)b$, the action of a matrix function on a vector. For less well-conditioned problems, the number of required Arnoldi steps may become so large that storing the Arnoldi vectors exceeds the available memory and the orthogonalization costs become prohibitive.
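To fix ideas, here is a minimal sketch (in Python with NumPy/SciPy, not the speaker's code) of the standard Arnoldi approximation to $f(A)b$: project $A$ onto the Krylov subspace built from $b$ and evaluate $f$ on the small Hessenberg matrix. The function name arnoldi_fab and the default choice $f = A^{1/2}$ are illustrative assumptions.

import numpy as np
from scipy.linalg import sqrtm

def arnoldi_fab(A, b, m, f=sqrtm):
    """Approximate f(A) b by m Arnoldi steps: f(A) b ~ beta * V_m f(H_m) e_1."""
    n = b.size
    V = np.zeros((n, m + 1))            # orthonormal Krylov basis (memory grows with m)
    H = np.zeros((m + 1, m))            # Hessenberg projection of A
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt against all previous vectors;
            H[i, j] = V[:, i] @ w       # this full orthogonalization is what gets expensive
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:         # (lucky) breakdown: the Krylov subspace is invariant
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m); e1[0] = 1.0
    return beta * V[:, :m] @ (f(H[:m, :m]) @ e1)

The snippet makes the memory and cost issue concrete: all m basis vectors must be kept, and the inner loop performs O(m^2) inner products overall.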
This talk will deal with different ways to overcome these shortcomings. We will first review restart procedures, in particular for matrix Stieltjes functions and their recent generalization to matrix Laplace transforms [A. Frommer, K. Kahl, M. Schweitzer, and M. Tsolakis, Krylov subspace restarting for matrix Laplace transforms, SIAM J. Matrix Anal. Appl. 44, 693-717 (2023)]. We will then continue with the sketching approach developed in [S. Güttel and M. Schweitzer, Randomized sketching for Krylov approximations of large-scale matrix functions, SIAM J. Matrix Anal. Appl. 44, 1073-1095 (2023)]. Finally, we will show that polynomial preconditioning can also be used to keep recursions short and memory requirements low in the case of the matrix square root and inverse square root. This is joint work with Gustavo Ramirez, Marcel Schweitzer and Manuel Tsolakis.
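As a rough illustration of the sketching idea (a simplified toy version, not the algorithm of the cited paper; the dense Gaussian sketching matrix, the truncation depth k, the sizes s and m, and the name sketched_fab are all illustrative assumptions): build a non-orthogonal Krylov basis with truncated orthogonalization, so the recursion stays short, and recover accuracy through small sketched least-squares problems.

import numpy as np
from scipy.linalg import sqrtm

def sketched_fab(A, b, m, s, k=2, f=sqrtm, seed=0):
    """Sketched Krylov approximation of f(A) b from a truncated (k-term) Arnoldi basis."""
    n = b.size
    rng = np.random.default_rng(seed)
    S = rng.standard_normal((s, n)) / np.sqrt(s)   # dense Gaussian sketch, for illustration only
    V = np.zeros((n, m))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(1, m):
        w = A @ V[:, j - 1]
        for i in range(max(0, j - k), j):          # orthogonalize only against the last k vectors
            w = w - (V[:, i] @ w) * V[:, i]
        V[:, j] = w / np.linalg.norm(w)
    SV, SAV, Sb = S @ V, S @ (A @ V), S @ b        # sketched quantities of size s x m and s
    M = np.linalg.lstsq(SV, SAV, rcond=None)[0]    # (S V)^+ (S A V), a small m x m matrix
    c = np.linalg.lstsq(SV, Sb, rcond=None)[0]     # (S V)^+ (S b)
    return V @ (f(M) @ c)                          # sketched FOM-type approximation

Note that this toy version still stores the whole basis V to form the final approximation; the sketching mainly removes the cost of full orthogonalization, and basis storage could be reduced further, for instance by a second pass over the short recursion.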