Linear Algebra: XI Eigenvalues and Eigenvectors
Eigenvalues and Eigenvectors
Let \( A \) be an \( n \) by \( n \) square matrix, defining a linear transformation \( T: \mathbb{R}^n \to \mathbb{R}^n \). A non-zero vector \( \vec{v} \) is called an eigenvector of the matrix \( A \) (equivalently, of the linear transformation \( T \)) if:
$$ T(\vec{v}) = A\vec{v} = \lambda \vec{v} $$
for some scalar $\lambda$, called its eigenvalue. A matrix \( A \) does not change its eigenvectors' directions, since each \( A\vec{v} \) is a scalar multiple of \( \vec{v} \). An eigenvalue tells you the change in magnitude, since \( \| \lambda \vec{v} \| = |\lambda| \|\vec{v}\| \), and its sign tells you whether the orientation is preserved or reversed.
If $\vec{v}$ is an eigenvector of $A$ with eigenvalue $\lambda$, then any non-zero multiple $c \vec{v}$ ($c \neq 0$) is also an eigenvector of $A$ with eigenvalue $\lambda$, since $A(c\vec{v}) = cA\vec{v} = c\lambda\vec{v} = \lambda(c\vec{v})$.
We observe that:
$$ (A - \lambda I)\vec{v} = 0 $$
In order for \( \mathrm{Null}(A - \lambda I) \) to be non-trivial, so that a non-zero eigenvector exists, we require
$$ \det(A - \lambda I) = 0 $$
where $\det(A - \lambda I)$ is the characteristic polynomial of \( A \), a polynomial of degree \( n \) in \( \lambda \).
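As a quick numerical sketch of these definitions (the matrix below is an arbitrary example, not from the text), NumPy's `eig` returns eigenvalue/eigenvector pairs that we can check against both \( A\vec{v} = \lambda\vec{v} \) and \( \det(A - \lambda I) = 0 \):

```python
import numpy as np

# An arbitrary 2x2 example matrix, chosen for illustration
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A v = lambda v for each eigenpair
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    assert np.allclose(A @ v, eigenvalues[i] * v)

# Each eigenvalue is a root of det(A - lambda I) = 0
for lam in eigenvalues:
    assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)
```

For this matrix the characteristic polynomial is \( \lambda^2 - 7\lambda + 10 \), with roots \( 5 \) and \( 2 \).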
Identity Matrix
The eigenvalues of the $n \times n$ identity matrix $I$ are
$$ \lambda_{1,\ldots,n}=1 $$
since
$$ 0 = \det(I-\lambda I) = \det((1-\lambda)I) = (1-\lambda)^n $$
All non-zero $n$-dimensional vectors are eigenvectors of $I$, since $I\vec{v} = 1\cdot\vec{v}$.
Triangular and Diagonal Matrices
The eigenvalues of an $n \times n$ upper triangular matrix $U$ are its diagonal entries
$$ \lambda_{i}= u_{ii} $$
If $U$ is upper triangular, then $U - \lambda I$ is also upper triangular. Recall that the determinant of an upper triangular matrix is the product of its diagonal entries. Therefore,
$$ (u_{11}-\lambda) \cdots (u_{nn}-\lambda) = 0 $$
The eigenvalues of an $n \times n$ lower triangular matrix $L$ as well as an $n \times n$ diagonal matrix $D$ are also their diagonal entries using similar reasoning.
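This is easy to confirm numerically. The triangular matrix below is an arbitrary example; its computed eigenvalues should match its diagonal:

```python
import numpy as np

# Upper triangular example: eigenvalues should equal the diagonal entries
U = np.array([[ 2.0, 7.0, -1.0],
              [ 0.0, 5.0,  3.0],
              [ 0.0, 0.0, -4.0]])

eigenvalues = np.linalg.eigvals(U)

# Compare the (sorted) computed eigenvalues with the diagonal of U
assert np.allclose(np.sort(eigenvalues), np.sort(np.diag(U)))
```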
Projection Matrices
The eigenvalues of an $n \times n$ projection matrix $P$ are either $0$ or $1$. Using the idempotent property $P^2 = P$:
$$ \lambda \vec{v} = P\vec{v} = P^2\vec{v} = P(\lambda \vec{v}) = \lambda P \vec{v} = \lambda^2 \vec{v} $$
And so $\lambda = \lambda^2$, which gives $\lambda(\lambda-1)=0$.
For the vector projection matrix $P = \frac{\vec{b}\vec{b}^\top}{\vec{b}^\top\vec{b}}$, any vector parallel to $\vec{b}$ is an eigenvector with eigenvalue $1$ and any vector perpendicular to $\vec{b}$ is an eigenvector with eigenvalue $0$.
For the least squares projection matrix $P = A(A^\top A)^{-1}A^\top$, any vector in the column space of $A$ is an eigenvector with eigenvalue $1$ and any vector in the left null space of $A$ is an eigenvector with eigenvalue $0$.
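A small check of these claims for the vector projection matrix (with an arbitrary choice of \( \vec{b} \)):

```python
import numpy as np

# Vector projection matrix onto b (b chosen arbitrarily)
b = np.array([3.0, 4.0])
P = np.outer(b, b) / (b @ b)

# Idempotent: P^2 = P
assert np.allclose(P @ P, P)

# The eigenvalues of a projection are 0 and 1
eigenvalues = np.sort(np.linalg.eigvals(P))
assert np.allclose(eigenvalues, [0.0, 1.0])

# b itself is an eigenvector with eigenvalue 1,
# and any vector perpendicular to b has eigenvalue 0
perp = np.array([-4.0, 3.0])
assert np.allclose(P @ b, b)
assert np.allclose(P @ perp, 0.0)
```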
Symmetric Matrices
Eigenvectors of a symmetric matrix $A$ where $A = A^\top$ are orthogonal if their eigenvalues are distinct. Suppose that
$$ A \vec{v}_1 = \lambda_1 \vec{v}_1, \quad A \vec{v}_2 = \lambda_2 \vec{v}_2 $$
And consider the scalar $\vec{v}_2^\top A \vec{v}_1$:
$$ \vec{v}_2^\top A \vec{v}_1 = \lambda_1 \vec{v}_2^\top \vec{v}_1 $$
$$ \vec{v}_2^\top A \vec{v}_1 = (A^\top \vec{v}_2)^\top \vec{v}_1 = (A \vec{v}_2)^\top \vec{v}_1 = \lambda_2 \vec{v}_2^\top \vec{v}_1 $$
Therefore,
$$ \lambda_1 \vec{v}_2^\top \vec{v}_1 = \lambda_2 \vec{v}_2^\top \vec{v}_1 \quad \Rightarrow \quad (\lambda_1 - \lambda_2) \vec{v}_2^\top \vec{v}_1 = 0 $$
So if $\lambda_1 \neq \lambda_2$, then $\vec{v}_2^\top \vec{v}_1 = 0$.
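The orthogonality argument above can be verified numerically. The symmetric matrix below is an arbitrary example with distinct eigenvalues; `np.linalg.eigh` is NumPy's routine specialized for symmetric matrices:

```python
import numpy as np

# A symmetric matrix with distinct eigenvalues (arbitrary example)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
assert np.allclose(A, A.T)

# eigh is specialized for symmetric (Hermitian) matrices
eigenvalues, eigenvectors = np.linalg.eigh(A)

# Distinct eigenvalues -> orthogonal eigenvectors
v1, v2 = eigenvectors[:, 0], eigenvectors[:, 1]
assert not np.isclose(eigenvalues[0], eigenvalues[1])
assert np.isclose(v1 @ v2, 0.0)
```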
Trace
The trace of an \( n \) by \( n \) square matrix \( A \) is the sum of its diagonal entries:
$$ \operatorname{tr}(A) = \sum_{i=1}^{n} a_{ii} $$
For example, the trace of the \( n \) by \( n \) identity matrix \( I \) is \( n \):
$$ \operatorname{tr}(I) = n $$
The trace is linear, meaning that the trace of a sum is the sum of the traces:
$$ \operatorname{tr}(A+B) = \operatorname{tr}(A) + \operatorname{tr}(B) $$
And the trace of a square matrix multiplied by a scalar is the scalar multiplied by the trace:
$$ \operatorname{tr}(cA) = c\operatorname{tr}(A) $$
The trace is also cyclic, meaning that the order of a product does not change its value:
$$ \operatorname{tr}(AB) = \operatorname{tr}(BA) $$
For this reason, the trace of an outer product equals the inner product:
$$ \operatorname{tr}(\vec{u}\vec{v}^\top) = \vec{v}^\top \vec{u} $$
It follows that the trace of a vector projection matrix for any non-zero vector \( \vec{b} \) is 1:
$$ \operatorname{tr} \Big(\frac{\vec{b}\vec{b}^\top}{\vec{b}^\top\vec{b}} \Big) = \frac{\operatorname{tr}(\vec{b}\vec{b}^\top)}{\vec{b}^\top\vec{b}} = \frac{\vec{b}^\top\vec{b}}{\vec{b}^\top\vec{b}} = 1 $$
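Both identities are easy to spot-check (the vectors here are arbitrary examples):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Trace of the outer product equals the inner product: tr(u v^T) = v^T u
assert np.isclose(np.trace(np.outer(u, v)), v @ u)

# Trace of a vector projection matrix is 1 for any non-zero b
b = np.array([3.0, 4.0])
P = np.outer(b, b) / (b @ b)
assert np.isclose(np.trace(P), 1.0)
```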
If \( A^\top A \) is invertible, that is, if the \( m \times n \) matrix \( A \) has \( n \) linearly independent columns, then the trace of the least squares projection matrix of \( A \) is \( n \):
$$ \operatorname{tr}(A(A^\top A)^{-1}A^\top) = \operatorname{tr}((A^\top A)^{-1}A^\top A) = \operatorname{tr}(I) = n $$
More generally, the trace of the least squares projection matrix of \( A \) is \( \mathrm{rank}(A) \), the dimension of the column space being projected onto.
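As a sketch of the full-column-rank case (the tall matrix below is an arbitrary example with two independent columns):

```python
import numpy as np

# A 4x2 matrix with 2 independent columns, so rank(A) = 2
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])

# Least squares projection matrix onto the column space of A
P = A @ np.linalg.inv(A.T @ A) @ A.T

# trace(P) equals n = 2, which is also rank(A)
assert np.isclose(np.trace(P), 2.0)
assert np.linalg.matrix_rank(A) == 2
```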
Two Dimensional Spaces
For a general $2$ by $2$ square matrix
$$ A = \begin{bmatrix} a & b \\ c & d \end{bmatrix} $$
The characteristic quadratic polynomial is given by:
$$ \det(A - \lambda I) = \det \Big(\begin{bmatrix} a-\lambda & b \\ c & d-\lambda \end{bmatrix}\Big) $$
$$ = (a-\lambda)(d-\lambda) - bc = \lambda^2 - (a+d)\lambda + (ad-bc) $$
$$ = \lambda^2 - \mathrm{tr}(A)\lambda + \mathrm{det}(A) $$
The roots of this characteristic quadratic polynomial give the eigenvalues of $A$:
$$ \lambda_{1,2} = \frac{\mathrm{tr}(A) \pm \sqrt{\mathrm{tr}(A)^2 - 4 \ \mathrm{det}(A)}}{2} $$
Where:
$$ \lambda_1 + \lambda_2 = \mathrm{tr}(A), \quad \lambda_1 \cdot \lambda_2 = \det(A) $$
The discriminant
$$\Delta = \mathrm{tr}(A)^2 - 4 \ \mathrm{det}(A)$$
determines the number of distinct eigenvalues. If $\Delta > 0$, then there are two distinct real eigenvalues. If $\Delta = 0$, then there is one real eigenvalue with algebraic multiplicity $2$. If $\Delta < 0$, then there are two distinct complex conjugate eigenvalues.
For each eigenvalue $\lambda$, a line of eigenvectors $\mathrm{span}\{\vec{v}\} = \{c\vec{v} \mid c \in \mathbb{R}\}$ (every non-zero member of which is an eigenvector) is found by solving:
$$ (A-\lambda I) \vec{v} = \vec{0} $$
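Putting the $2 \times 2$ case together (reusing the arbitrary example matrix from earlier): the trace/determinant formula reproduces the eigenvalues, and the null space of $A - \lambda I$ gives the eigenvectors.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigenvalues from the quadratic formula in terms of trace and determinant
tr, det = np.trace(A), np.linalg.det(A)
disc = tr**2 - 4 * det                  # discriminant; > 0 here, two real roots
lam1 = (tr + np.sqrt(disc)) / 2
lam2 = (tr - np.sqrt(disc)) / 2

# Sum and product recover the trace and determinant
assert np.isclose(lam1 + lam2, tr)
assert np.isclose(lam1 * lam2, det)

# Eigenvectors span the null space of A - lambda I;
# here (A - 5I) v = 0 gives v parallel to (1, 1)
v = np.array([1.0, 1.0])
assert np.allclose(A @ v, lam1 * v)
```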