The f(A)bulous workshop on matrix functions and exponential integrators

Sep 25 – 27, 2023
Max Planck Institute for Dynamics of Complex Technical Systems
Europe/Berlin timezone

Computational graphs for matrix functions

Sep 27, 2023, 9:00 AM
45m
Main/groundfloor-V0.05/2+3 - Prigogine (Max Planck Institute for Dynamics of Complex Technical Systems)

Sandtorstr. 1, 39106 Magdeburg
Plenary

Speaker

Massimiliano Fasi (Durham University)

Description

Numerical methods for evaluating a function $f$ at an $n \times n$ matrix $A$ can be based on a variety of approaches, but for a large class of algorithms the matrix $f(A)$ is approximated using only three operations:

1. $Z \gets c_X X + c_Y Y$ (linear combination of matrices),
2. $Z \gets X \cdot Y$ (matrix multiplication),
3. $Z \gets X^{-1} Y$ (solution of a linear system with $n$ right-hand sides),

where $X$, $Y$, and $Z$ are $n \times n$ matrices and $c_{X}$ and $c_{Y}$ are scalars.
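As a minimal sketch (plain NumPy, not the talk's software), the three operations can be written as functions and combined; for instance, the degree-2 Taylor approximant $T_2(A) = I + A + A^2/2$ of $e^A$ needs one multiplication and two linear combinations:

```python
import numpy as np

# The three building blocks: X, Y are n x n matrices, cX, cY are scalars.
def lincomb(cX, X, cY, Y):
    return cX * X + cY * Y            # Z <- cX*X + cY*Y

def mult(X, Y):
    return X @ Y                      # Z <- X*Y

def inv_times(X, Y):
    return np.linalg.solve(X, Y)      # Z <- X^{-1}*Y

# Combining them: T2(A) = I + A + A^2/2, a degree-2 Taylor approximant of exp(A).
def exp_taylor2(A):
    I = np.eye(A.shape[0])
    A2 = mult(A, A)
    return lincomb(1.0, lincomb(1.0, I, 1.0, A), 0.5, A2)
```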

Algorithms that combine only these three basic building blocks are particularly attractive, as they correspond to functions that are easy to work with: if an expression for the scalar function $g$ features only linear combinations, multiplications, and inversions, and $g$ is defined on the spectrum of $A$, then a formula for $g(A)$ can be obtained by replacing all occurrences of $z$ in the formula for $g(z)$ with $A$.
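A concrete instance (an illustrative sketch, not the talk's software): for $g(z) = (1 + z/2)/(1 - z/2)$, the $[1/1]$ Padé approximant to $e^z$, replacing $z$ with $A$ turns the division into a linear system solve; the formula is valid as long as $2$ is not an eigenvalue of $A$, i.e. as long as $g$ is defined on the spectrum of $A$:

```python
import numpy as np

def g_scalar(z):
    # [1/1] Pade approximant to exp(z): g(z) = (1 + z/2) / (1 - z/2).
    return (1 + z / 2) / (1 - z / 2)

def g_matrix(A):
    # Same formula with z replaced by A; the scalar division becomes the
    # solution of a linear system.  Requires 2 not to be an eigenvalue of A.
    I = np.eye(A.shape[0])
    return np.linalg.solve(I - A / 2, I + A / 2)
```

On a diagonal matrix, $g(A)$ simply applies $g$ entrywise to the eigenvalues, which makes the scalar/matrix correspondence easy to check.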

Rephrasing these methods as directed acyclic graphs (DAGs) is a particularly effective approach to studying existing techniques, improving them, and eventually deriving new ones. The accuracy of these matrix techniques can be characterized by the accuracy of their scalar counterparts, so designing algorithms for matrix functions can be regarded as a scalar-valued optimization problem. The derivatives needed during the optimization can be calculated automatically by exploiting the structure of the DAG, in a fashion analogous to backpropagation.
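As a hedged sketch (a hypothetical encoding, not GraphMatFun.jl's internal representation), a DAG can be stored as a topologically ordered table of nodes built from the three operations; the same traversal then evaluates the graph at either a scalar or a matrix argument, which is what makes the scalar view of accuracy usable:

```python
import numpy as np

# Hypothetical DAG encoding: each node is (op, inputs, coefficients), listed in
# topological order (dicts preserve insertion order in Python 3.7+).
# This graph computes p(z) = 1 + z + z^2/2.
graph = {
    "I":  ("identity", (), ()),
    "A":  ("argument", (), ()),
    "A2": ("mult", ("A", "A"), ()),
    "B":  ("lincomb", ("I", "A"), (1.0, 1.0)),
    "P":  ("lincomb", ("B", "A2"), (1.0, 0.5)),
}

def eval_graph(graph, A, out="P"):
    """Evaluate the DAG at A, which may be a scalar or a square NumPy matrix."""
    n = A.shape[0] if hasattr(A, "shape") else 0
    vals = {}
    for name, (op, ins, cs) in graph.items():
        if op == "argument":
            vals[name] = A
        elif op == "identity":
            vals[name] = np.eye(n) if n else 1.0
        elif op == "mult":
            X, Y = vals[ins[0]], vals[ins[1]]
            vals[name] = X @ Y if n else X * Y
        elif op == "lincomb":
            vals[name] = cs[0] * vals[ins[0]] + cs[1] * vals[ins[1]]
        elif op == "solve":  # Z <- X^{-1} Y (scalar case: Y / X)
            X, Y = vals[ins[0]], vals[ins[1]]
            vals[name] = np.linalg.solve(X, Y) if n else Y / X
    return vals[out]
```

Because the coefficients live on the `lincomb` nodes, optimizing them (and propagating derivatives backwards through the node table) is a finite-dimensional problem over the graph's parameters.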

GraphMatFun.jl is a Julia package that offers the means to generate and manipulate computational graphs, optimize their coefficients, and generate Julia, MATLAB, and C code to evaluate them efficiently at a matrix argument. The software also provides tools to estimate the accuracy of a graph-based algorithm and thus obtain numerically reliable methods. For the matrix exponential, for example, polynomials in a particular (degree-optimal) form yield implementations that are in many cases cheaper, in terms of computational cost, than the Padé-based techniques typically used in mathematical software.
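The degree-optimal polynomials themselves are beyond a short example, but the underlying cost accounting can be hinted at with a classical baseline (an illustrative sketch, not the package's method): the Paterson–Stockmeyer splitting below reaches degree 4 with two matrix multiplications, where Horner's rule would need three.

```python
import numpy as np

def exp_taylor4_ps(A):
    """Degree-4 Taylor approximant of exp(A) with 2 matrix multiplications.

    Illustrative only: this is the classical Paterson-Stockmeyer splitting,
    not the degree-optimal forms of the talk; Horner's rule would need 3
    multiplications to reach the same degree.
    """
    I = np.eye(A.shape[0])
    A2 = A @ A                      # multiplication 1
    low = I + A + A2 / 2            # terms of degree <= 2
    high = A / 6 + A2 / 24          # coefficients of A^3 and A^4
    return low + high @ A2          # multiplication 2: adds A^3/6 + A^4/24
```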

Primary author

Massimiliano Fasi (Durham University)

Co-authors

Prof. Elias Jarlebring (KTH Royal Institute of Technology), Dr. Emil Ringh (Ericsson Research)

Presentation materials

There are no materials yet.