Linear Algebra: XII Symmetric Matrices and Quadratic Forms
Symmetric Matrices and Quadratic Forms
A symmetric matrix $S$ is a square matrix that is equal to its transpose:
\[ S = S^{T} \]
A symmetric matrix is self-adjoint, meaning that the inner product / dot product of two vectors is unchanged when the matrix is moved from one argument to the other:
$$ (S\vec{x})^\top \vec{y} = \vec{x}^\top (S\vec{y}) $$
A skew-symmetric matrix $K$ is a square matrix whose transpose equals its negative:
\[ K^{T} = -K \]
A skew-symmetric matrix has diagonal entries that are all zero, since
$$ k_{ii} = (K^\top)_{ii} = -k_{ii} \quad \Rightarrow \quad 2k_{ii} = 0 \quad \Rightarrow \quad k_{ii} = 0 $$
A skew-symmetric matrix is skew-adjoint, meaning that the inner product / dot product only changes sign when the matrix is moved from one argument to the other:
$$ (K\vec{x})^\top \vec{y} = -\vec{x}^\top (K\vec{y}) $$
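As a quick numerical sanity check (a minimal sketch assuming NumPy, with arbitrary random matrices and vectors), the adjoint identities and the zero diagonal of a skew-symmetric matrix can be verified directly:

```python
# Verify the (skew-)adjoint identities above on random examples.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
S = A + A.T          # symmetric:      S == S.T
K = A - A.T          # skew-symmetric: K.T == -K

x = rng.standard_normal(4)
y = rng.standard_normal(4)

# self-adjoint: (Sx)^T y == x^T (Sy)
print(np.isclose((S @ x) @ y, x @ (S @ y)))   # True
# skew-adjoint: (Kx)^T y == -x^T (Ky)
print(np.isclose((K @ x) @ y, -x @ (K @ y)))  # True
# the diagonal of a skew-symmetric matrix is zero
print(np.allclose(np.diag(K), 0.0))           # True
```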
Vector Subspace Properties
If \(A\) and \(B\) are symmetric, then \(A + B\) is symmetric since:
\[ (A+B)^\top = A^\top + B^\top = A + B \]
If \(c \in \mathbb{R}\) and \(A\) is symmetric, then \(cA\) is symmetric since:
\[ (cA)^\top = cA^\top = cA \]
Finally, the $n$ by $n$ zero matrix $\vec{0} \vec{0}^\top$ is symmetric. Thus, symmetric matrices form a vector subspace of $\mathbb{R}^{n \times n}$.
If \(A\) and \(B\) are skew-symmetric, then \(A + B\) is skew-symmetric since:
\[ (A+B)^\top = A^\top + B^\top = -A - B = -(A+B) \]
If \(c \in \mathbb{R}\) and \(A\) is skew-symmetric, then \(cA\) is skew-symmetric since:
\[ (cA)^\top = cA^\top = -cA \]
Finally, the $n$ by $n$ zero matrix $\vec{0} \vec{0}^\top$ is skew-symmetric. Thus, skew-symmetric matrices form a vector subspace of $\mathbb{R}^{n \times n}$.
Real Eigenvalues
All of the eigenvalues of a symmetric matrix where $A = A^\top$ are real numbers. Let $A$ be a symmetric $n$ by $n$ matrix and suppose that $A \vec{x} = \lambda \vec{x}$ for some $\lambda \in \mathbb{C}$ and $\vec{x} \in \mathbb{C}^n$ with $\vec{x} \neq \vec{0}$. We consider the quadratic form built with the conjugate of $\vec{x}$:
$$ \bar{\vec{x}}^\top A \vec{x} = \lambda \bar{\vec{x}}^\top \vec{x} = \lambda \|\vec{x}\|^2 $$
On the other hand, since $A$ is real and symmetric, $A \bar{\vec{x}} = \overline{A \vec{x}} = \bar{\lambda} \bar{\vec{x}}$, so
$$ \bar{\vec{x}}^\top A \vec{x} = (A^\top \bar{\vec{x}})^\top \vec{x} = (A \bar{\vec{x}})^\top \vec{x} = \bar{\lambda} \bar{\vec{x}}^\top \vec{x} = \bar{\lambda} \|\vec{x}\|^2 $$
Since $\|\vec{x}\|^2 > 0$, it follows that $\lambda = \bar{\lambda}$ and so $\lambda$ is real.
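A short NumPy sketch (using an arbitrary random symmetric matrix) illustrates this: the general complex eigenvalue routine returns eigenvalues whose imaginary parts vanish up to round-off.

```python
# Eigenvalues of a random symmetric matrix come out real.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2                    # symmetrize

eigvals = np.linalg.eigvals(A)       # general eigenvalue routine
print(np.allclose(eigvals.imag, 0))  # True: all eigenvalues are real
```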
Orthogonal Eigenspaces
Eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal. Suppose that
$$ A \vec{v}_1 = \lambda_1 \vec{v}_1, \quad A \vec{v}_2 = \lambda_2 \vec{v}_2 $$
and consider the bilinear form $\vec{v}_2^\top A \vec{v}_1$, evaluated in two ways:
$$ \vec{v}_2^\top A \vec{v}_1 = \lambda_1 \vec{v}_2^\top \vec{v}_1 $$
$$ \vec{v}_2^\top A \vec{v}_1 = (A^\top \vec{v}_2)^\top \vec{v}_1 = (A \vec{v}_2)^\top \vec{v}_1 = \lambda_2 \vec{v}_2^\top \vec{v}_1 $$
Therefore,
$$ \lambda_1 \vec{v}_2^\top \vec{v}_1 = \lambda_2 \vec{v}_2^\top \vec{v}_1 \quad \Rightarrow \quad (\lambda_1 - \lambda_2) \vec{v}_2^\top \vec{v}_1 = 0 $$
So if $\lambda_1 \neq \lambda_2$, then $\vec{v}_2^\top \vec{v}_1 = 0$.
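This is easy to observe numerically. The sketch below (assuming NumPy; the matrix is an arbitrary random example) uses `np.linalg.eigh`, which returns an orthonormal set of eigenvectors for a symmetric matrix, so $V^\top V$ should be the identity.

```python
# Eigenvectors of a symmetric matrix form an orthonormal basis.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2

eigvals, V = np.linalg.eigh(A)          # columns of V are eigenvectors
print(np.allclose(V.T @ V, np.eye(5)))  # True: mutually orthogonal unit vectors
```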
Orthogonal Subspace Invariance
The subspace orthogonal to an eigenvector is preserved under a symmetric matrix. Let $A$ be a symmetric matrix with associated linear transformation $T(\vec{x}) = A\vec{x}$, and let $\vec{v}$ be an eigenvector of $A$ with eigenvalue $\lambda$. Define the orthogonal complement of $\vec{v}$:
$$ V = \{ \vec{x} \in \mathbb{R}^n : \vec{v}^\top \vec{x} = 0 \} $$
We will show that $V$ is invariant under $A$, meaning that:
$$ T(V) \subseteq V $$
Take any $\vec{x} \in V$. Since $A$ is symmetric, we use the self-adjoint property:
$$ \vec{v}^\top (A\vec{x}) = (A\vec{v})^\top \vec{x} $$
Because $\vec{v}$ is an eigenvector:
$$ A\vec{v} = \lambda \vec{v} $$
Substitute:
$$ (A\vec{v})^\top \vec{x} = (\lambda \vec{v})^\top \vec{x} = \lambda\, \vec{v}^\top \vec{x} = 0 $$
since $\vec{v}^\top \vec{x} = 0$ for $\vec{x} \in V$. Therefore:
$$ \vec{v}^\top (A\vec{x}) = 0 $$
So:
$$ A\vec{x} \in V $$
Hence:
$$ T(V) \subseteq V $$
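The following NumPy sketch illustrates the invariance claim on a random example: a vector $\vec{x}$ orthogonal to an eigenvector $\vec{v}$ is built by projecting out its $\vec{v}$-component, and $A\vec{x}$ remains orthogonal to $\vec{v}$.

```python
# If x is orthogonal to the eigenvector v, then A x is also orthogonal to v.
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2

eigvals, V = np.linalg.eigh(A)
v = V[:, 0]                        # a unit-length eigenvector of A

x = rng.standard_normal(5)
x = x - (v @ x) * v                # project x onto V = {x : v^T x = 0}

print(np.isclose(v @ x, 0))        # True: x is in V
print(np.isclose(v @ (A @ x), 0))  # True: A x stays in V
```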
S + K Decomposition
Let $A$ be an $n$ by $n$ square matrix. Then $A + A^\top$ is symmetric since:
\[ (A + A^\top)^\top = A^\top + (A^\top)^\top = A^\top + A = A + A^\top \]
Also, \(A - A^\top\) is skew-symmetric since:
\[ (A - A^\top)^\top = A^\top - (A^\top)^\top = A^\top - A = -(A - A^\top) \]
We define:
\[ S = \frac{A + A^\top}{2}, \quad K = \frac{A - A^\top}{2} \]
Then \(S\) is symmetric, \(K\) is skew-symmetric, and
\[ A = S + K \]
Thus, every square matrix can be uniquely written as the sum of a symmetric matrix and a skew-symmetric matrix. Uniqueness follows because the only matrix that is both symmetric and skew-symmetric is the zero matrix.
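A minimal NumPy sketch of the decomposition, applied to an arbitrary random matrix:

```python
# S + K decomposition of a square matrix.
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))

S = (A + A.T) / 2                # symmetric part
K = (A - A.T) / 2                # skew-symmetric part

print(np.allclose(S, S.T))       # True: S is symmetric
print(np.allclose(K, -K.T))      # True: K is skew-symmetric
print(np.allclose(A, S + K))     # True: A = S + K
```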
Quadratic Forms
Let $A$ be an $n \times n$ matrix. The quadratic form associated with \(A\) is the function that maps an $n$-dimensional vector $\vec{x}$ to a scalar:
\[ Q_A : \mathbb{R}^n \to \mathbb{R}, \quad Q_A(\vec{x}) = \vec{x}^T A \vec{x} \in \mathbb{R} \]
Equivalently, if $B_A(\vec{x},\vec{y}) = \vec{x}^T A \vec{y}$ is the bilinear form associated with $A$, then the quadratic form is given by:
\[ Q_A(\vec{x}) = B_A(\vec{x}, \vec{x}) \]
If $K$ is a skew-symmetric matrix, then $Q_K(\vec{x}) = \vec{x}^T K \vec{x} = 0$. Since $Q_K(\vec{x})$ is a scalar, it is equal to its transpose:
\[ Q_K(\vec{x}) = Q_K(\vec{x})^\top \]
Thus,
$$Q_K(\vec{x}) = Q_K(\vec{x})^\top = (\vec{x}^\top K \vec{x})^\top$$
$$= \vec{x}^\top K^\top \vec{x} = - \vec{x}^\top K \vec{x} = -Q_K(\vec{x})$$
Hence, $Q_K(\vec{x}) = 0$. This means that the quadratic form of any square matrix depends only on its symmetric part $S = \frac{A + A^\top}{2}$:
$$ Q_A(\vec{x}) = Q_S(\vec{x}) $$
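As a numerical check (assuming NumPy, with an arbitrary random $A$ and $\vec{x}$), the skew-symmetric part contributes nothing to the quadratic form:

```python
# The quadratic form of A equals that of its symmetric part S.
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))
S = (A + A.T) / 2
K = (A - A.T) / 2
x = rng.standard_normal(4)

Q_A = x @ A @ x
Q_S = x @ S @ x
Q_K = x @ K @ x

print(np.isclose(Q_K, 0))    # True: skew part gives a zero quadratic form
print(np.isclose(Q_A, Q_S))  # True: Q_A(x) = Q_S(x)
```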
Exercises
- Show that if $A$ is symmetric, then
$$ B_A(\vec{x}, \vec{y}) = \frac{1}{2}\big( Q_A(\vec{x} + \vec{y}) - Q_A(\vec{x}) - Q_A(\vec{y}) \big) $$
This shows that symmetric bilinear forms are completely determined by their quadratic forms.