Linear Algebra I: Scalars and Vectors
A scalar is defined as a real number $c \in \mathbb{R}$ with one-dimensional magnitude $|c|$ and a direction given by its sign. A vector is an ordered collection of scalars with dimension $n \geq 1$:
- Written as a column vector
$$\vec{a} =\begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix}\in \mathbb{R}^n $$
- Written as a row vector
$$\vec{a}^\top =\begin{bmatrix} a_1 & \ldots & a_n \end{bmatrix}\in \mathbb{R}^n$$
Where $\top$ is the transpose operator that exchanges column vectors with row vectors. Note $(\vec{a}^\top)^\top = \vec{a}$.
- With magnitude
$$\|\vec{a}\| = \|\vec{a}^\top\|=\sqrt{a_1^2+\ldots+a_n^2} \geq 0$$
And direction (with the exception of $\vec{0}$ where $\|\vec{0}\|=0$ and $\vec{0}$ has no direction).
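As a numeric illustration (using NumPy, which is not part of these notes), the magnitude can be computed directly from its component formula and matches NumPy's built-in Euclidean norm:

```python
import numpy as np

# Magnitude of a vector, computed two equivalent ways.
a = np.array([3.0, 4.0])

mag_manual = np.sqrt(np.sum(a**2))  # sqrt(a_1^2 + ... + a_n^2)
mag_numpy = np.linalg.norm(a)       # NumPy's built-in Euclidean norm

print(mag_manual)  # 5.0
print(mag_numpy)   # 5.0
```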
Row vectors can be treated as transposes of column vectors, and the same operations apply via this identification.
Vector addition for column vectors is defined as
$$\vec{a} +\vec{b} = \begin{bmatrix} a_1+b_1 \\ \vdots \\ a_n+b_n \end{bmatrix}$$
Which is associative
$$(\vec{a} +\vec{b}) + \vec{c} = \vec{a} + (\vec{b} + \vec{c})$$
And commutative
$$\vec{a} +\vec{b} = \vec{b} + \vec{a}$$
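A quick numeric check (a sketch using NumPy, with arbitrary example vectors) confirms that addition is componentwise and that both properties hold:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
c = np.array([7.0, 8.0, 9.0])

s = a + b  # componentwise addition: [5, 7, 9]

# Both properties hold componentwise because they hold for real numbers.
assoc = np.allclose((a + b) + c, a + (b + c))
comm = np.allclose(a + b, b + a)
print(s, assoc, comm)
```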
Scalar multiplication for column vectors is defined as
$$c\vec{a}= \begin{bmatrix} ca_1 \\ \vdots \\ ca_n \end{bmatrix}$$
Which is associative,
$$c_1 (c_2 \vec{a}) = (c_1 c_2) \vec{a}$$
Commutative,
$$c_1 c_2 \vec{a} = c_2 c_1 \vec{a}$$
And distributive.
$$c (\vec{a} + \vec{b}) = c \vec{a} + c \vec{b}$$
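The same kind of numeric spot check (again a NumPy sketch with arbitrary example values) covers scalar multiplication and its properties:

```python
import numpy as np

a = np.array([1.0, -2.0, 3.0])
b = np.array([0.5, 4.0, -1.0])
c1, c2 = 2.0, -3.0

scaled = c1 * a  # componentwise scaling: [2, -4, 6]

assoc = np.allclose(c1 * (c2 * a), (c1 * c2) * a)  # associativity
dist = np.allclose(c1 * (a + b), c1 * a + c1 * b)  # distributivity
print(scaled, assoc, dist)
```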
The unit vector of a nonzero vector $\vec{a}$ is the vector scaled to magnitude $1$ in the same direction as $\vec{a}$:
$$\hat{a} = \frac{\vec{a}}{\|\vec{a}\|}$$
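Normalization is a one-liner in practice; a minimal NumPy sketch, including the guard for $\vec{0}$ (which has no direction and hence no unit vector):

```python
import numpy as np

def unit(a):
    """Return a / ||a||; defined only for nonzero a."""
    norm = np.linalg.norm(a)
    if norm == 0:
        raise ValueError("the zero vector has no unit vector")
    return a / norm

a_hat = unit(np.array([3.0, 4.0]))
print(a_hat)                   # [0.6, 0.8]
print(np.linalg.norm(a_hat))   # 1.0
```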
Two vectors $\vec{a}$ and $\vec{b}$ are parallel / pairwise linearly dependent if one is a scalar multiple of the other:
$$\vec{b} = c\vec{a}$$
They have the same direction if $c>0$ and opposite directions if $c<0$. Vectors with the same direction share the same unique unit vector. Two vectors $\vec{a}$ and $\vec{b}$ are pairwise linearly independent if neither is a scalar multiple of the other: $\vec{b} \neq c\vec{a}$ for every $c \in \mathbb{R}$.
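One robust way to test this numerically (a sketch, using the fact that two parallel vectors stacked as rows form a matrix of rank at most $1$):

```python
import numpy as np

def parallel(a, b):
    """Two vectors are parallel iff the matrix with them as rows has rank <= 1."""
    return np.linalg.matrix_rank(np.vstack([a, b])) <= 1

print(parallel(np.array([1.0, 2.0]), np.array([-2.0, -4.0])))  # True  (b = -2a)
print(parallel(np.array([1.0, 2.0]), np.array([2.0, 1.0])))    # False
```

Using the rank rather than dividing components avoids special cases when some components are zero.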
A linear combination of $m$ vectors $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_m \in \mathbb{R}^n$ is written with $m$ scalars $c_1, c_2, \ldots, c_m \in \mathbb{R}$ as:
$$\sum_{i=1}^m c_i \vec{v}_i \in \mathbb{R}^n$$
A set of $m$ vectors $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_m \in \mathbb{R}^n$ is linearly dependent if there exist $m$ scalars $c_1, c_2, \ldots, c_m \in \mathbb{R}$, not all zero, such that their linear combination is zero:
$$ \sum_{i=1}^m c_i \vec{v}_i = \vec{0} $$
The set of vectors $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_m \in \mathbb{R}^n$ is linearly independent if the only solution to
$$ \sum_{i=1}^m c_i \vec{v}_i = \vec{0} $$
Is
$$ c_1 = c_2 = \cdots = c_m = 0 $$
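This definition generalizes the rank test for two vectors: $m$ vectors are linearly independent exactly when the matrix with them as columns has rank $m$. A NumPy sketch:

```python
import numpy as np

def linearly_independent(vectors):
    """Vectors are independent iff their column matrix has full column rank."""
    V = np.column_stack(vectors)
    return np.linalg.matrix_rank(V) == V.shape[1]

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2  # a linear combination of v1 and v2, so the set is dependent

print(linearly_independent([v1, v2]))      # True
print(linearly_independent([v1, v2, v3]))  # False
```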
A set of $k \geq n$ vectors in $n$ dimensions $\{\vec{v}_1, \ldots, \vec{v}_k\}$ spans $\mathbb{R}^n$ if every vector $\vec{v} \in \mathbb{R}^n$ can be written as a linear combination of these vectors.
A set of $n$ vectors $\{\vec{v}_1, \ldots, \vec{v}_n\}$ is called a basis of $\mathbb{R}^n$ if:
- The set $\{\vec{v}_1, \ldots, \vec{v}_n\}$ spans $\mathbb{R}^n$.
- The set $\{\vec{v}_1, \ldots, \vec{v}_n\}$ is linearly independent.
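For exactly $n$ vectors in $\mathbb{R}^n$, both conditions hold together precisely when the square matrix of columns is invertible, i.e. has nonzero determinant. A sketch of that check:

```python
import numpy as np

def is_basis(vectors):
    """n vectors form a basis of R^n iff their square column matrix is invertible."""
    V = np.column_stack(vectors)
    n, m = V.shape
    return n == m and not np.isclose(np.linalg.det(V), 0.0)

print(is_basis([np.array([1.0, 1.0]), np.array([1.0, -1.0])]))  # True
print(is_basis([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))   # False (parallel)
```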
Let $\vec{a}$ and $\vec{b}$ be $n$ dimensional column vectors. Then the inner product / dot product is defined as:
$$\vec{a}^\top \vec{b} = \begin{bmatrix} a_1 & \ldots & a_n \end{bmatrix} \begin{bmatrix} b_1 \\ \vdots \\ b_n \end{bmatrix} = \sum_{i=1}^n a_i b_i \in \mathbb{R}$$
Which is commutative
$$\vec{a}^\top \vec{b} = \vec{b}^\top \vec{a}$$
And distributive.
$$\vec{a}^\top (\vec{b} + \vec{c}) = \vec{a}^\top \vec{b} + \vec{a}^\top \vec{c}$$
It follows that $\|\vec{a}\|^2 = \vec{a}^\top \vec{a}$.
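In NumPy (a minimal sketch with arbitrary example vectors), the dot product is the `@` operator between 1-D arrays, and the identity $\|\vec{a}\|^2 = \vec{a}^\top \vec{a}$ checks out numerically:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, -5.0, 6.0])

dot = a @ b                                          # 1*4 + 2*(-5) + 3*6 = 12
comm = np.isclose(a @ b, b @ a)                      # commutativity
norm_sq = np.isclose(a @ a, np.linalg.norm(a) ** 2)  # ||a||^2 = a^T a
print(dot, comm, norm_sq)
```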
According to the Law of Cosines,
$$\|\vec{a}-\vec{b}\|^2 = \|\vec{a}\|^2 + \|\vec{b}\|^2 - 2\|\vec{a}\|\|\vec{b}\|\cos{\theta}$$
Where $\theta$ is the angle between $\vec{a}$ and $\vec{b}$. Also,
$$\|\vec{a}-\vec{b}\|^2 = (\vec{a}-\vec{b})^\top (\vec{a}-\vec{b}) = \vec{a}^\top\vec{a} - \vec{a}^\top \vec{b} - \vec{b}^\top \vec{a} + \vec{b}^\top \vec{b} = \|\vec{a}\|^2 + \|\vec{b}\|^2 - 2 \vec{a}^\top \vec{b}$$
Therefore,
$$\vec{a}^\top \vec{b} = \|\vec{a}\|\|\vec{b}\|\cos{\theta}$$
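Solving this identity for $\theta$ gives a practical way to compute the angle between two nonzero vectors; a NumPy sketch (the `clip` guards against floating-point values slightly outside $[-1, 1]$):

```python
import numpy as np

def angle(a, b):
    """Angle between nonzero vectors via cos(theta) = a.b / (||a|| ||b||)."""
    cos_theta = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

theta = angle(np.array([1.0, 0.0]), np.array([0.0, 2.0]))
print(np.degrees(theta))  # 90.0
```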
We say that two vectors $\vec{a}$ and $\vec{b}$ are orthogonal $\vec{a} \perp \vec{b}$ if $\vec{a}^\top \vec{b} = 0$.
A set of $n$ dimensional unit vectors $\{\hat{v}_1, \ldots, \hat{v}_n\}$ is an orthonormal basis of $\mathbb{R}^n$ if the set $\{\hat{v}_1, \ldots, \hat{v}_n\}$ is a basis of $\mathbb{R}^n$ and the unit vectors are mutually orthogonal: $\hat{v}_i^\top\hat{v}_j=0$ for $i \neq j$.
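Both conditions (unit length and mutual orthogonality) can be checked at once: if $Q$ is the matrix whose columns are the $\hat{v}_i$, they form an orthonormal basis exactly when $Q^\top Q = I$. A sketch:

```python
import numpy as np

def is_orthonormal_basis(vectors):
    """n unit vectors in R^n are an orthonormal basis iff Q^T Q = I."""
    Q = np.column_stack(vectors)
    n, m = Q.shape
    return n == m and np.allclose(Q.T @ Q, np.eye(n))

q1 = np.array([1.0, 1.0]) / np.sqrt(2)
q2 = np.array([1.0, -1.0]) / np.sqrt(2)
print(is_orthonormal_basis([q1, q2]))  # True
```

The diagonal entries of $Q^\top Q$ being $1$ encode unit length; the off-diagonal entries being $0$ encode orthogonality.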
Let $\vec{a}$ and $\vec{b}$ be $n$ dimensional vectors. Then the scalar projection / component of $\vec{a}$ onto $\vec{b}$ describes the length of the shadow of $\vec{a}$ in the direction of $\vec{b}$:
$$\text{Comp}_{\vec{b}} \ \vec{a} = \frac{\vec{b}^\top \vec{a}}{\|\vec{b}\|} = \|\vec{a}\| \cos{\theta}$$
The vector projection of $\vec{a}$ onto $\vec{b}$ describes the vector in the direction of $\vec{b}$:
$$\text{Proj}_{\vec{b}} \ \vec{a} = (\text{Comp}_{\vec{b}} \ \vec{a}) \frac{\vec{b}}{\|\vec{b}\|} = \frac{\vec{b}^\top \vec{a}}{\|\vec{b}\|^2} \vec{b}$$
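Both projections translate directly into code; a NumPy sketch with a simple example where $\vec{b}$ points along the $x$-axis, so the "shadow" of $\vec{a}$ is just its first component:

```python
import numpy as np

def comp(a, b):
    """Scalar projection of a onto b: (b.a) / ||b||."""
    return (b @ a) / np.linalg.norm(b)

def proj(a, b):
    """Vector projection of a onto b: ((b.a) / ||b||^2) * b."""
    return ((b @ a) / (b @ b)) * b

a = np.array([3.0, 4.0])
b = np.array([1.0, 0.0])
print(comp(a, b))  # 3.0  (the shadow of a on the x-axis)
print(proj(a, b))  # [3. 0.]
```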