※ To facilitate visualization and understanding, the field in which vectors and matrices are defined is limited to real numbers.
Original linear transformation $A$
Linear transformation $A$ decomposed into three steps using EVD
EVD allows us to think of the original linear transformation as a sequence of three steps: 'rotate', 'stretch', and 'rotate'.
Prerequisites
To better understand this post, it is recommended that you be familiar with the following topic: eigenvalues and eigenvectors.
Background Knowledge
A Matrix is a Collection of Vectors
Suppose we have $n$ $n$-dimensional vectors $a_i\in \Bbb{R}^{n \times 1} \text{ for } i = 1, 2, \cdots, n$.
Then we can gather these $n$ vectors and construct an $n\times n$-dimensional matrix.
\[A = \begin{bmatrix} | & | & \text{ } & | \\ a_1 & a_2 & \cdots & a_n \\ | & | & \text{ } & | \end{bmatrix}\in \Bbb{R}^{n\times n}\]
How to factor each column of a matrix
Suppose an $n\times n$-dimensional matrix $A$ is constructed as follows:
\[A = \begin{bmatrix} | & | & \text{ } & | \\ \lambda_1 a_1 & \lambda_2 a_2 & \cdots & \lambda_n a_n \\ | & | & \text{ } & | \end{bmatrix}\in \Bbb{R}^{n\times n}\]
Here, $\lambda_i \text{ for } i = 1, 2, \cdots, n$ are arbitrary real numbers.
Then the matrix $A$ in equation (2) can be factored as follows:
\[A = \begin{bmatrix} | & | & \text{ } & | \\ a_1 & a_2 & \cdots & a_n \\ | & | & \text{ } & | \end{bmatrix}\begin{bmatrix}\lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n\end{bmatrix}\]
This multiplication may look complicated, but if you treat each column vector of the left matrix as if it were a single number and carry out the multiplication, you can easily verify that equation (3) is correct.
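As a quick numerical check of equation (3), here is a minimal sketch using numpy (the library choice and the example values are assumptions for illustration, not part of the original post):

```python
import numpy as np

# Three arbitrary column vectors a_1, a_2, a_3 gathered into one matrix
a = np.array([[1.0, 4.0, 7.0],
              [2.0, 5.0, 8.0],
              [3.0, 6.0, 9.0]])

# Arbitrary real numbers lambda_1, lambda_2, lambda_3
lam = np.array([0.5, 2.0, -1.0])

# Matrix whose i-th column is lambda_i * a_i, as in equation (2)
A_scaled = a * lam  # broadcasting scales column i by lam[i]

# Factored form of equation (3): [a_1 a_2 a_3] times diag(lambda)
A_factored = a @ np.diag(lam)

print(np.allclose(A_scaled, A_factored))  # True
```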
Eigenvalue decomposition
Suppose we have obtained the eigenvalues and eigenvectors of an arbitrary $n \times n$ matrix $A$.
In other words, suppose we have $n$ eigenvalue-eigenvector pairs that satisfy the following equation:
\[Av_i = \lambda_i v_i\text{ for } i = 1, 2, \cdots, n\]
Then we can think of the matrix $V\in\Bbb{R}^{n\times n}$, which collects all the eigenvectors, as follows:
\[V = \begin{bmatrix} | & | & \text{ } & | \\ v_1 & v_2 & \cdots & v_n \\ | & | & \text{ } & | \end{bmatrix}\]
Using equation (4), we can express the relationship between matrix $A$ and matrix $V$ as follows:
\[AV = \begin{bmatrix} | & | & \text{ } & | \\ \lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n \\ | & | & \text{ } & | \end{bmatrix}\]
Next, let us define the matrix $\Lambda \in \Bbb{R}^{n \times n}$, which collects all the eigenvalues on its diagonal, as follows:
\[\Lambda = \begin{bmatrix}\lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n\end{bmatrix}\]
Using the relationship shown in equations (2) and (3), we can write equation (6) as follows:
\[\text{Equation (6)} \Rightarrow AV = V\Lambda\]
If all eigenvectors are linearly independent, then $V$ is invertible, and we can express matrix $A$ as follows by multiplying both sides of equation (8) on the right by $V^{-1}$:
\[A = V\Lambda V^{-1}\]
Geometric interpretation of eigendecomposition
Eigendecomposition shows that a linear transformation represented by a matrix can be decomposed into three processes: ‘rotate’, ‘stretch’, and ‘rotate’.
To understand the geometric interpretation of eigendecomposition, let's try to eigendecompose the following matrix $A$:
\[A = \begin{bmatrix}1.2 & -0.5 \\ -1.5 & 1.7\end{bmatrix}\]
First, let's visualize the linear transformation of matrix $A$ as shown below:
Animation 1. Linear transformation of matrix A
The eigenvalues and eigenvectors of the above matrix are as follows:
\[\lambda_1 = 0.5486, \quad \lambda_2 = 2.3514\]
\[\vec{v}_1 = \begin{bmatrix}0.6089 \\ 0.7933\end{bmatrix}\]
\[\vec{v}_2 = \begin{bmatrix}-0.3983 \\ 0.9172\end{bmatrix}\]
Using the eigenvalues and eigenvectors of matrix $A$, we can write the matrices for decomposing matrix $A$ as shown below:
\[V = \begin{bmatrix}0.6089 & -0.3983 \\ 0.7933 & 0.9172\end{bmatrix}\]
\[\Lambda = \begin{bmatrix}0.5486 & 0 \\ 0 & 2.3514\end{bmatrix}\]
\[V^{-1} = \begin{bmatrix}1.0489 & 0.4555 \\ -0.9072 & 0.6963\end{bmatrix}\]
We can think of matrix $A$ as being split into the following:
\[A = V\Lambda V^{-1} \notag\]
\[\Rightarrow \begin{bmatrix}1.2 & -0.5 \\ -1.5 & 1.7\end{bmatrix}=\begin{bmatrix}0.6089 & -0.3983 \\ 0.7933 & 0.9172\end{bmatrix} \begin{bmatrix}0.5486 & 0 \\ 0 & 2.3514\end{bmatrix}\begin{bmatrix}1.0489 & 0.4555 \\ -0.9072 & 0.6963\end{bmatrix}\]
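Before interpreting each factor geometrically, the decomposition can be verified numerically. Below is a minimal sketch using numpy (an assumed tool choice; note that the ordering and signs of the eigenvectors returned by numpy.linalg.eig may differ from the values printed above):

```python
import numpy as np

A = np.array([[1.2, -0.5],
              [-1.5, 1.7]])

# Columns of V are unit-norm eigenvectors; eigvals holds the eigenvalues
eigvals, V = np.linalg.eig(A)
Lam = np.diag(eigvals)

# Reconstruct A as V @ Lambda @ V^{-1}, as in equation (9)
A_reconstructed = V @ Lam @ np.linalg.inv(V)
print(np.allclose(A, A_reconstructed))  # True
```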
The following is the linear transformation of matrix $V$, which is composed of the eigenvectors.
If you look closely, the linear transformation of matrix $V$ looks similar to a rotation. This is because all the column vectors of $V$ are eigenvectors, which represent only a direction and have a length of 1.
In other words, since two unit-length vectors starting from the origin point in different directions, $V$ can perform a rotation-like transformation.
However, it cannot be exactly the same as a rotation transformation, because the orientation of the space may be flipped during the transformation.
In addition, it should be noted that the angles between the basis vectors after transformation by $V$ do not form 90 degrees.
\[V = \begin{bmatrix}0.6089 & -0.3983 \\ 0.7933 & 0.9172\end{bmatrix}\notag\]
Animation 2. Linear transformation of V
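These two observations about $V$, unit-length columns but a non-right angle between them, can be checked numerically. A minimal sketch (numpy assumed):

```python
import numpy as np

V = np.array([[0.6089, -0.3983],
              [0.7933, 0.9172]])

v1, v2 = V[:, 0], V[:, 1]

# Each eigenvector has length 1
print(np.linalg.norm(v1), np.linalg.norm(v2))  # both approximately 1.0

# ...but the angle between them is not 90 degrees
cos_angle = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
print(np.degrees(np.arccos(cos_angle)))  # approximately 61 degrees
```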
The linear transformation of the diagonal matrix $\Lambda$, which has the eigenvalues as its diagonal elements, is shown below. Since only the diagonal elements are nonzero, the transformation only stretches or shrinks the space along the horizontal and vertical directions.
\[\Lambda = \begin{bmatrix}0.5486 & 0 \\ 0 & 2.3514\end{bmatrix}\notag\]
Animation 3. Linear transformation of $\Lambda$
The linear transformation of $V^{-1}$, the inverse of the eigenvector matrix $V$, is shown below.
The linear transformation of $V^{-1}$ also shows a rotation-like shape similar to $V$. However, it can be seen that it rotates in the opposite direction to $V$.
\[V^{-1} = \begin{bmatrix}1.0489 & 0.4555 \\ -0.9072 & 0.6963\end{bmatrix}\notag\]
Animation 4. Linear transformation of $V^{-1}$
If the linear transformations of $V^{-1}$, $\Lambda$, and $V$ are applied in order, it can be seen that the result is the same as the original linear transformation $A$.
The three slides in the applet below correspond to $V$, $\Lambda$, and $V^{-1}$, respectively.
Animation 5. Apply linear transformations of $V$, $\Lambda$, and $V^{-1}$ independently.
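The same three-step sequence can be reproduced in code: applying $V^{-1}$, then $\Lambda$, then $V$ to a point gives the same result as applying $A$ directly. A minimal sketch (numpy assumed; the test point is arbitrary, and the tolerance accounts for the rounding of the printed matrix entries):

```python
import numpy as np

A = np.array([[1.2, -0.5],
              [-1.5, 1.7]])
V = np.array([[0.6089, -0.3983],
              [0.7933, 0.9172]])
Lam = np.diag([0.5486, 2.3514])
V_inv = np.linalg.inv(V)

x = np.array([1.0, 2.0])  # arbitrary test point

step1 = V_inv @ x    # "rotate" by V^{-1}
step2 = Lam @ step1  # "stretch" along the axes by Lambda
step3 = V @ step2    # "rotate" by V

# Equal to A @ x up to the rounding of the printed entries
print(np.allclose(step3, A @ x, atol=1e-3))  # True
```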
Eigenvalue decomposition of symmetric matrices
The eigenvalue decomposition of a symmetric matrix has a somewhat special feature compared to that of a general square matrix. This is because symmetric matrices have the property that their eigenvectors are orthogonal to each other.
A symmetric matrix is a matrix that satisfies the following property:
\[A=A^T\]
If the matrix $A$ can be decomposed into eigenvalues as in equation (9), the following can be derived:
\[A = V\Lambda V^{-1} = A^T = (V\Lambda V^{-1})^T = (V^{-1})^T\Lambda^TV^T\]
Here, since $\Lambda$ is a diagonal matrix, $\Lambda^T = \Lambda$, and since $(V^{-1})^T = (V^T)^{-1}$, equation (12) can also be written as follows:
\[V\Lambda V^{-1} = (V^T)^{-1}\Lambda V^T\]
Comparing both sides of this equation, we can identify $V$ with $(V^T)^{-1}$. Therefore, the matrix $V$ consisting of the eigenvectors of a symmetric matrix is an orthogonal matrix. In other words, the following holds:
\[V^T = V^{-1}\]
and
\[VV^T = V^TV = I\]
Thus, a symmetric matrix can be decomposed into eigenvalues and eigenvectors as follows:
\[A= V\Lambda V^T\]
Generally, the matrix $Q$ is often used instead of $V$ to denote the matrix that collects the eigenvectors of a symmetric matrix $A$, so that we write
\[A = Q\Lambda Q^T\]
when $A$ is a symmetric matrix.
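To see this property in code, here is a minimal sketch (numpy assumed). numpy.linalg.eigh is designed for symmetric matrices and returns orthonormal eigenvectors:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # a symmetric matrix: A == A.T

# eigh is specialized for symmetric (Hermitian) matrices
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q^T Q = I, so Q^T = Q^{-1}
print(np.allclose(A, Q @ Lam @ Q.T))    # True: A = Q Lambda Q^T
```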
Geometric interpretation of eigenvalue decomposition of a symmetric matrix
To understand the geometric interpretation of eigenvalue decomposition of a symmetric matrix, let's try to decompose the matrix $A$ below into eigenvalues and eigenvectors.
\[A = \begin{bmatrix}2 & 1 \\ 1 & 2\end{bmatrix}\]
First, if we visualize the linear transformation of matrix $A$, it looks like the following.
Animation 6. Linear transformation of matrix A
The eigenvalues and eigenvectors of the matrix are as follows.
\[\lambda_1 = 1, \quad \lambda_2 = 3\]
\[\vec{q}_1 = \begin{bmatrix}0.7071 \\ -0.7071\end{bmatrix}\]
\[\vec{q}_2 = \begin{bmatrix}0.7071 \\ 0.7071\end{bmatrix}\]
When we use the eigenvalues and eigenvectors of matrix $A$ to decompose it as in equation (9), the matrices we need are as follows:
\[Q = \begin{bmatrix}0.7071 & 0.7071 \\ -0.7071 & 0.7071\end{bmatrix}\]
\[\Lambda = \begin{bmatrix}1 & 0 \\ 0 & 3\end{bmatrix}\]
\[Q^T = \begin{bmatrix}0.7071 & -0.7071 \\ 0.7071 & 0.7071\end{bmatrix}\]
From the perspective of eigenvalues and eigenvectors, the difference between a general square matrix and a symmetric matrix is that the eigenvectors of a symmetric matrix are orthogonal to each other.
Therefore, when the space is linearly transformed by matrix $Q$, the result looks like a rotation of the vector space, since the basis vectors remain of length 1 and stay orthogonal to each other even after the transformation. (We say "similar to rotating" because, depending on the order of the column vectors of $Q$, the transformation may also flip the space.)
Thus, the matrix $Q$ obtained by collecting the eigenvectors of the symmetric matrix has a similar meaning to a rotation matrix.
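For the matrix $Q$ above we can check this directly: a pure rotation matrix is orthogonal and has determinant $+1$ (a determinant of $-1$ would indicate a flip). A minimal sketch (numpy assumed; the tolerance accounts for the rounded entries):

```python
import numpy as np

Q = np.array([[0.7071, 0.7071],
              [-0.7071, 0.7071]])

# Orthogonality: Q^T Q = I (up to rounding of the 0.7071 entries)
print(np.allclose(Q.T @ Q, np.eye(2), atol=1e-4))  # True

# Determinant +1: orientation is preserved, so Q acts like a rotation
print(np.linalg.det(Q))  # approximately 1.0
```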
By applying linear transformations of $Q^T$, $\Lambda$, and $Q$ in order, we can obtain the original linear transformation $A$.
The three slides in the following Applet correspond to $Q$, $\Lambda$, and $Q^T$, respectively.
Animation 7. Let's apply linear transformations $Q$, $\Lambda$, and $Q^T$ independently.
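Finally, mirroring the slides above, the sketch below (numpy assumed; the test point is arbitrary) applies $Q^T$, $\Lambda$, and $Q$ one step at a time and confirms that the composition matches applying $A$ directly:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
Q = np.array([[0.7071, 0.7071],
              [-0.7071, 0.7071]])
Lam = np.diag([1.0, 3.0])

x = np.array([1.0, 0.5])  # arbitrary test point

rotated = Q.T @ x          # step 1: rotate
stretched = Lam @ rotated  # step 2: stretch along the axes
result = Q @ stretched     # step 3: rotate back

# Equal to A @ x up to the rounding of Q's printed entries
print(np.allclose(result, A @ x, atol=1e-3))  # True
```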