2x2 Jacobi Algorithm
Finding eigenvalues and eigenvectors of symmetric matrices is pretty useful. Typically you don't have to implement your own algorithms to do this since there are so many packages around that can do it really well. But there are still times you might want to do it yourself, particularly when dealing with small matrices. For 2x2 matrices there's a fast analytic approach, and this can be leveraged to iteratively converge for "larger" matrices. The technique I'm going to use here is called the "Jacobi Algorithm". I used
Chapter 3 of Calculus++: The Symmetric Eigenvalue Problem by Eric A. Carlen to figure out how to get this working.
For a symmetric 2x2 matrix \(A\), here's how the eigenvalues \(\mu_+\) and \(\mu_-\) are found,
\begin{align}
A={A}^\intercal&=\begin{bmatrix}a & b\\b & d\end{bmatrix}\\
\mu_+&=\dfrac{a+d}{2}+\sqrt{b^2+\left(\dfrac{a-d}{2}\right)^2}\\
\mu_-&=\dfrac{a+d}{2}-\sqrt{b^2+\left(\dfrac{a-d}{2}\right)^2}\\
\end{align}
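As a quick sanity check, here's a minimal Python sketch of those two formulas (the name eigenvalues_2x2 is just an illustrative choice, not something from the notes):

```python
import math

def eigenvalues_2x2(a, b, d):
    # Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, d]].
    mid = (a + d) / 2.0                     # (a + d) / 2
    radius = math.hypot(b, (a - d) / 2.0)   # sqrt(b^2 + ((a - d) / 2)^2)
    return mid + radius, mid - radius       # (mu_plus, mu_minus)
```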
With the eigenvalues it's easy to find the eigenvectors \(\textbf{u}_0\) and \(\textbf{u}_1\): each row of \(A-I\mu_+\) must be orthogonal to the eigenvector for \(\mu_+\), so \(\textbf{u}_0\) is just the normalized perpendicular of the first row \(\textbf{r}_0\),
\begin{align}
\begin{bmatrix}\textbf{r}_0\\\textbf{r}_1\end{bmatrix}&=A-I\mu_+\\
\textbf{u}_0&=\dfrac{1}{||\textbf{r}_0||}\textbf{r}_0^\perp
\end{align}
Finding \(\textbf{v}^\perp\) from \(\textbf{v}\) is straightforward in 2D,
\begin{align}
\textbf{v}&=\begin{bmatrix}v_0\\v_1\end{bmatrix}\\
\textbf{v}^\perp&=\begin{bmatrix}-v_1\\v_0\end{bmatrix}\\
\end{align}
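Here's a hedged Python sketch of that eigenvector step (the name eigenvector_2x2 is mine, and it expects the \(\mu_+\) computed earlier):

```python
import math

def eigenvector_2x2(a, b, d, mu_plus):
    # First row r_0 of A - mu_plus * I.
    r0 = (a - mu_plus, b)
    # u_0 is the perpendicular of r_0, normalized (note ||r_0^perp|| == ||r_0||).
    norm = math.hypot(r0[0], r0[1])
    return (-r0[1] / norm, r0[0] / norm)
```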
For symmetric matrices, eigenvectors belonging to distinct eigenvalues are orthogonal, so \(\textbf{u}_1=\textbf{u}_0^\perp\). With all this we can perform a diagonalization:
\begin{align}
A&={A}^\intercal\\
U&=\begin{bmatrix}\textbf{u}_0 & \textbf{u}_1\end{bmatrix}\\
\Lambda = {U}^\intercal AU&=\begin{bmatrix}\mu_+ & \\ & \mu_-\end{bmatrix}\\
A&=U\Lambda{U}^\intercal\\
\end{align}
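Combining the two sketches above, here's one way the full diagonalization might look, along with a quick check that \({U}^\intercal AU\) really comes out (nearly) diagonal; the naming is illustrative rather than anything from the notes:

```python
def diagonalize_2x2(a, b, d):
    # Returns U with columns u_0, u_1 and the eigenvalues (mu_plus, mu_minus).
    mu_plus, mu_minus = eigenvalues_2x2(a, b, d)
    u0 = eigenvector_2x2(a, b, d, mu_plus)
    u1 = (-u0[1], u0[0])          # u_1 = u_0 perp
    U = [[u0[0], u1[0]],
         [u0[1], u1[1]]]
    return U, (mu_plus, mu_minus)

# Quick check on A = [[2, 1], [1, 3]]: the off-diagonals of U^T A U should be ~0.
A = [[2.0, 1.0], [1.0, 3.0]]
U, (mu_plus, mu_minus) = diagonalize_2x2(2.0, 1.0, 3.0)
UtAU = [[sum(U[k][i] * A[k][m] * U[m][j] for k in range(2) for m in range(2))
         for j in range(2)] for i in range(2)]
print(UtAU)                       # roughly [[3.618, 0], [0, 1.382]]
```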
Below you can modify an arbitrary \(A\) matrix and see the resulting orthogonal eigenvector matrix \(U\) and the result of \({U}^\intercal AU\), which should be the diagonal \(\Lambda\) if \(A={A}^\intercal\).
figure 1 : interactive demo showing \(A\), \(U\), and \(\Lambda={U}^\intercal AU\); modify matrix A by typing in entries.
Some practical notes when you implement this. If the off-diagonal element \(b\) is zero and \(a \ge d\), then \(\mu_+=a\), so the first row of \(A-I\mu_+\) is the zero vector and the process outlined above spits out NaN's due to a division by zero in the normalization step. I found an easy way to fix this is to just add some very small value to the off-diagonals to ensure they're non-zero, which prevents the division by zero. Technically there's no point even doing anything in this case because the matrix is already diagonal and your eigenvalues ARE the diagonal entries, however maybe it's advantageous to code it this way to avoid conditional branching for performance reasons on some hardware (GPU implementations?).
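One possible way to code that nudge without a branch, building on the diagonalize_2x2 sketch above (the 1e-20 value is an arbitrary pick, not from the article; scale it to your data and precision):

```python
import math

def diagonalize_2x2_safe(a, b, d, tiny=1e-20):
    # Push b away from exactly zero (copysign keeps its sign and never branches),
    # so the first row of A - mu_plus * I can't collapse to the zero vector and
    # the eigenvector normalization never divides by zero.
    return diagonalize_2x2(a, b + math.copysign(tiny, b), d)
```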
Check out
3x3 jacobi algorithm for an example of using the algorithm to iteratively diagonalize in the 3x3 case.