New Results on Block Diagonalization of Matrix Pencils

27 May 2025, 17:30
30m
Faculty of Mathematics

TU Berlin
Talk

Speaker

Vasile Sima (National Institute for Research & Development in Informatics (retired))

Description

Block diagonalization algorithms have many applications, including the computation of reciprocal condition numbers for eigenvalues and for invariant or deflating subspaces of matrices or matrix pencils, respectively, the fast computation of matrix functions, and the evaluation of the response of linear time-invariant standard or descriptor systems. For matrix pencils, the algorithm is applied to a matrix pair in generalized (real) Schur form, obtained using unitary transformations. The off-diagonal blocks are annihilated by solving generalized Sylvester equations, but this involves non-unitary transformations. To ensure good accuracy, the numerical condition of these transformations should be bounded. Specifically, if any element of the solution of a Sylvester equation is larger than a specified constant $b$, the process of solving that equation is stopped, a new pair of diagonal blocks is selected, and an attempt is made to solve the corresponding generalized Sylvester equation.
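The mechanism can be illustrated for a single matrix; the pencil case replaces the ordinary Sylvester equation with a coupled generalized Sylvester equation (as solved, e.g., by LAPACK's *TGSYL). A minimal sketch, with hypothetical block data, of how solving $A_{11} X - X A_{22} = -A_{12}$ annihilates the off-diagonal block, and how the bound $b$ rejects an ill-conditioned split:

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Block upper triangular matrix in (real) Schur-like form:
# A = [[A11, A12], [0, A22]], with disjoint spectra of A11 and A22.
A11 = np.array([[4.0, 1.0], [0.0, 3.0]])
A22 = np.array([[1.0, 2.0], [0.0, -1.0]])
A12 = np.array([[1.0, 0.5], [2.0, 1.0]])
A = np.block([[A11, A12], [np.zeros((2, 2)), A22]])

# Annihilate A12: solve A11 X - X A22 = -A12, written as
# A11 X + X (-A22) = -A12 in scipy's A X + X B = Q convention.
X = solve_sylvester(A11, -A22, -A12)

# Reject the split if any element of X exceeds the bound b: the
# transformation T = [[I, X], [0, I]] would then be badly conditioned.
b = 5000.0
assert np.max(np.abs(X)) <= b, "split rejected; choose another block pair"

T = np.block([[np.eye(2), X], [np.zeros((2, 2)), np.eye(2)]])
T_inv = np.block([[np.eye(2), -X], [np.zeros((2, 2)), np.eye(2)]])
D = T_inv @ A @ T  # block diagonal: diag(A11, A22), since the
                   # (1,2) block equals A11 @ X + A12 - X @ A22 = 0
print(np.linalg.norm(D[:2, 2:]))
```

Note that $T$ is non-unitary: the larger the elements of $X$, the worse the conditioning of the similarity transformation, which is exactly why the elementwise bound $b$ is imposed.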

There are several strategies for choosing a new pair of blocks after each failure. For real matrices, the original strategy chooses a pair of diagonal blocks of order 1 or 2 whose eigenvalue(s) are closest to the mean of the eigenvalues of the current pair of diagonal blocks. The chosen pair is moved by unitary equivalence transformations to the bottom of the current pair, and another attempt is made to solve the corresponding generalized Sylvester equation. If all elements of the new transformation matrices have magnitude bounded by $b$, the added blocks are accepted and the same procedure is applied to the remaining part. Another strategy selects the pair of diagonal blocks whose eigenvalues are closest to the eigenvalues of the current pair. Two pairs of blocks are deemed to belong to the same cluster if all their eigenvalues satisfy an absolute or relative distance condition.
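The first selection rule above can be sketched as follows; the function name, the plain absolute-value distance, and the example eigenvalues are illustrative assumptions, not the paper's exact criterion:

```python
import numpy as np

def pick_next_block(current_eigs, candidate_blocks):
    """Choose the diagonal block (order 1 or 2) whose eigenvalue(s)
    are closest to the mean of the current pair's eigenvalues.
    candidate_blocks: one list of eigenvalues per candidate block."""
    target = np.mean(current_eigs)
    # distance of a block = smallest distance of any of its eigenvalues
    dists = [np.min(np.abs(np.asarray(e) - target)) for e in candidate_blocks]
    return int(np.argmin(dists))

# Hypothetical data: current block has eigenvalues {2, 2.1};
# candidates have eigenvalues {5}, {2.05, 1.9}, and {-3, 0.5}.
idx = pick_next_block([2.0, 2.1], [[5.0], [2.05, 1.9], [-3.0, 0.5]])
print(idx)  # index of the block with eigenvalues {2.05, 1.9}
```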

Such strategies are of the bottom-up type, and they are very efficient for matrix pencils of relatively small order with well-separated eigenvalues, in which case the solver often succeeds in obtaining diagonal blocks of order at most two. But for large-order problems with clustered eigenvalues, the solution time can be high, due to the possibly large number of unsuccessful attempts to split the blocks.

A preliminary analysis of the clustered structure of the spectrum, followed by an appropriate reordering of the eigenvalues, could improve the efficiency. Specifically, starting with the most separated eigenvalues, the solver could quickly decouple them, leaving any possibly large clusters of eigenvalues to the end. A few failed attempts to split such a cluster could signal that there is no reason to continue the computations and that the process should finish with one or more large blocks. Such strategies can be considered to be of ``top-down'' type. This paper investigates such a top-down technique.
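The preliminary spectral analysis could take the form of a simple clustering pass; the greedy single-linkage scheme and the plain absolute tolerance below are assumptions for illustration (the paper's analysis uses the chordal metric and absolute or relative tolerances):

```python
import numpy as np

def cluster_eigs(eigs, tol):
    """Greedy single-linkage clustering of eigenvalues: an eigenvalue
    joins a cluster if it lies within tol of any of its members.
    Returns the clusters with well separated (small) ones first, so
    the solver can decouple them quickly and leave big clusters last."""
    clusters = []
    for lam in sorted(eigs, key=abs):
        for c in clusters:
            if any(abs(lam - mu) < tol for mu in c):
                c.append(lam)
                break
        else:
            clusters.append([lam])
    return sorted(clusters, key=len)

print(cluster_eigs([1.0, 1.01, 1.02, 5.0, -3.0], tol=0.1))
```

On this hypothetical spectrum, the three eigenvalues near 1 form one cluster, while 5 and -3 stay as singletons that can be split off cheaply.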

Numerical tests have been performed to assess the possible performance improvement using a clustering technique. For random problems of order 100, the mean CPU time has been reduced by a factor of about 2 compared to the results using bottom-up strategies. For a real problem of order 999, with a clustered block of order 734 (for $b = 5000$), the computing time has been reduced from about 62.7 to 4.7 seconds. The symmetric chordal metric is used as a suitable ``distance'', which seems appropriate for problems that may have infinite eigenvalues. The properties of the chordal metric are also studied, and reliable and efficient algorithms for its computation are proposed.
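In the pair representation $(\alpha, \beta)$ of a generalized eigenvalue $\lambda = \alpha/\beta$, with $(\alpha, 0)$ encoding an infinite eigenvalue, one common definition of the chordal distance is $\chi = |\alpha_1 \beta_2 - \beta_1 \alpha_2| / \big(\sqrt{|\alpha_1|^2 + |\beta_1|^2}\,\sqrt{|\alpha_2|^2 + |\beta_2|^2}\big)$; the exact definition and scaling used in the paper may differ. A minimal sketch:

```python
import numpy as np

def chordal(a1, b1, a2, b2):
    """Chordal distance between generalized eigenvalues a1/b1 and
    a2/b2 given as pairs (a, b); (a, 0) encodes an infinite
    eigenvalue.  The value lies in [0, 1] and is symmetric in
    its two arguments."""
    num = abs(a1 * b2 - b1 * a2)
    den = np.hypot(abs(a1), abs(b1)) * np.hypot(abs(a2), abs(b2))
    return num / den

# A finite and an infinite eigenvalue remain at a finite distance,
# which is what makes the metric suitable for descriptor problems:
print(chordal(1.0, 1.0, 1.0, 0.0))  # distance between 1 and infinity
```

Unlike $|\lambda_1 - \lambda_2|$, this quantity stays bounded when either eigenvalue is infinite, so cluster tests based on it remain meaningful for singular $B$ in the pencil $A - \lambda B$.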
