

The Generalised Eigenvalue Problem

A common generalisation of the simple eigenvalue problem involves two matrices:
\begin{displaymath}
\bss{A}\bi{x} = \alpha\bss{B}\bi{x}.
\end{displaymath} (3.29)

This can easily be transformed into a simple eigenvalue problem by multiplying both sides by the inverse of either $\bss{A}$ or $\bss{B}$. However, this has the disadvantage that, even when both matrices are Hermitian, $\bss{B}^{-1}\bss{A}$ in general is not, so the advantages of the symmetry are lost, possibly together with some important physics. There is a more efficient way of handling the transformation. Using Cholesky factorisation, an $\bss{L}\bss{U}$ decomposition of the positive definite matrix $\bss{B}$ can be carried out in which $\bss{L}$ is lower triangular and $\bss{U} = \bss{L}^\dagger$, so that
\begin{displaymath}
\bss{B} = \bss{L}\bss{L}^\dagger
\end{displaymath} (3.30)

which can be interpreted as a kind of square root of $\bss{B}$. Using this factorisation we can transform the problem into the form
\begin{displaymath}
\left[\bss{L}^{-1}\bss{A}\left(\bss{L}^\dagger\right)^{-1}\right]
\left[\bss{L}^\dagger \bi{x}\right] = \alpha \left[\bss{L}^\dagger \bi{x}\right]
\end{displaymath} (3.31)

\begin{displaymath}
\bss{A}'\bi{y} = \alpha\bi{y},
\end{displaymath} (3.32)

where $\bss{A}' = \bss{L}^{-1}\bss{A}\left(\bss{L}^\dagger\right)^{-1}$ is Hermitian whenever $\bss{A}$ is, and $\bi{y} = \bss{L}^\dagger\bi{x}$. Once the standard problem (3.32) has been solved, the original eigenvectors are recovered from $\bi{x} = \left(\bss{L}^\dagger\right)^{-1}\bi{y}$.
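As an illustration of these steps, the following sketch (in Python with NumPy and SciPy, which are assumptions here rather than anything prescribed by these notes) carries out the Cholesky reduction explicitly: factorise $\bss{B}$, form $\bss{A}'$ with triangular solves, diagonalise the standard Hermitian problem, and map the eigenvectors back to $\bi{x}$.

\begin{verbatim}
import numpy as np
from scipy.linalg import cholesky, solve_triangular, eigh

def generalised_eig_cholesky(A, B):
    """Solve A x = alpha B x for Hermitian A and positive definite B
    via the Cholesky reduction sketched above (illustrative only,
    not a library routine)."""
    # B = L L^dagger with L lower triangular                 (3.30)
    L = cholesky(B, lower=True)
    # A' = L^{-1} A (L^dagger)^{-1}, built from two triangular solves
    Z = solve_triangular(L, A, lower=True)                  # L^{-1} A
    Aprime = solve_triangular(L, Z.conj().T, lower=True).conj().T
    # Standard Hermitian problem A' y = alpha y              (3.32)
    alpha, Y = eigh(Aprime)
    # Recover the original eigenvectors: x = (L^dagger)^{-1} y
    X = solve_triangular(L.conj().T, Y, lower=False)
    return alpha, X
\end{verbatim}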

Most libraries contain routines for solving the generalised eigenvalue problem for Hermitian and real symmetric matrices, using Cholesky factorisation followed by a standard routine. Problem 6 contains a simple and informative example.
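For example (again a Python sketch; these particular library calls are an assumption, not the routines referred to above), SciPy's eigh accepts the second matrix directly and performs the Cholesky reduction internally using LAPACK's drivers for the Hermitian-definite problem (e.g. ZHEGV).

\begin{verbatim}
import numpy as np
from scipy.linalg import eigh

# Illustrative Hermitian A and Hermitian positive definite B
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M + M.conj().T
N = rng.standard_normal((4, 4))
B = (N @ N.T + 4.0 * np.eye(4)).astype(complex)

# The generalised problem A x = alpha B x in a single call
alpha, X = eigh(A, B)

# Verify the generalised eigenvalue relation column by column
print(np.allclose(A @ X, (B @ X) * alpha))   # True to rounding error
\end{verbatim}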