# Section 4.3 Orthogonality

Definition: Given two vectors, $\vec{u}$ and $\vec{v}$ in $\mathbb{R}^n$, the number $(\vec{u})^{T}\vec{v}$ is called the inner product of $\vec{u}$ and $\vec{v}$ or the dot product of $\vec{u}$ and $\vec{v}$. We often write it as $\vec{u} \cdot \vec{v}$.

Theorem: Let $\vec{u}, \vec{v}$, and $\vec{w}$ be vectors in $\mathbb{R}^n$, and let $c$ be a scalar. Then

a. $\vec{u} \cdot \vec{v} = \vec{v} \cdot \vec{u}$

b. $(\vec{u} + \vec{v}) \cdot \vec{w} = \vec{u} \cdot \vec{w} + \vec{v} \cdot \vec{w}$

c. $(c\vec{u}) \cdot \vec{v} = c(\vec{u} \cdot \vec{v}) = \vec{u} \cdot (c\vec{v})$

d. $\vec{u} \cdot \vec{u} \geq 0$, and $\vec{u} \cdot \vec{u} = 0$ if and only if $\vec{u} = \vec{0}$

Definition: The length (or the norm) of $\vec{v}$ is the nonnegative scalar $||\vec{v}||$ defined by $||\vec{v}|| = \sqrt{\vec{v}\cdot\vec{v}} = \sqrt{v_{1}^{2} + \cdots + v_{n}^{2}}$ and $||\vec{v}||^2 = \vec{v}\cdot\vec{v}$ where $\vec{v} = \begin{bmatrix}v_1\\ \vdots \\v_n\end{bmatrix}$

Remark: 1. For any scalar $c$, the length of $c\vec{v}$ is $|c|$ times the length of $\vec{v}$. That is, $||c\vec{v}|| = |c|\,||\vec{v}||$.

2. A vector whose length is 1 is called a unit vector.

3. If we divide a nonzero vector $\vec{v}$ by its length—that is, multiply by $\frac{1}{||\vec{v}||}$—we obtain a unit vector $\vec{u}$ because the length of $\vec{u}$ is $(\frac{1}{||\vec{v}||})||\vec{v}|| = 1$.

4. The process of creating $\vec{u} = \frac{\vec{v}}{||\vec{v}||}$ from $\vec{v}$ is sometimes called normalizing $\vec{v}$, and $\vec{u}$ is in the same direction as $\vec{v}$.

Definition: For $\vec{u}$ and $\vec{v}$ in $\mathbb{R}^n$, the distance between $\vec{u}$ and $\vec{v}$, written as dist($\vec{u}$, $\vec{v}$), is the length of the vector $\vec{u} - \vec{v}$. That is, dist($\vec{u}$, $\vec{v}$) $=||\vec{u} - \vec{v}||$.
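As a quick computational sketch of these definitions (the helper names `dot`, `norm`, and `dist` are our own choices, not from the text):

```python
import math

def dot(u, v):
    """Inner product u . v = u1*v1 + ... + un*vn."""
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(v):
    """||v|| = sqrt(v . v)."""
    return math.sqrt(dot(v, v))

def dist(u, v):
    """dist(u, v) = ||u - v||."""
    return norm([ui - vi for ui, vi in zip(u, v)])

u = [7, 1]
v = [3, 2]
print(dot(u, v))   # 23
print(norm(v))     # sqrt(13) ~ 3.6055...
print(dist(u, v))  # ||(4, -1)|| = sqrt(17) ~ 4.1231...
```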

Remark: $(\text{dist}(\vec{u}, \vec{v}))^2 = ||\vec{u} - \vec{v}||^2 = (\vec{u} - \vec{v}) \cdot (\vec{u} - \vec{v}) = ||\vec{u}||^2 + ||\vec{v}||^2 - 2\,\vec{u}\cdot\vec{v}$. This says $(\text{dist}(\vec{u}, \vec{v}))^2 = ||\vec{u}||^2 + ||\vec{v}||^2$ if and only if $\vec{u} \cdot \vec{v} = 0$, if and only if the vectors $\vec{u}, \vec{v}$, and $\vec{u} - \vec{v}$ form a right triangle with legs $\vec{u}$ and $\vec{v}$ (the Pythagorean theorem).

Definition: Two vectors $\vec{u}$ and $\vec{v}$ in $\mathbb{R}^n$, are orthogonal (to each other) if $\vec{u} \cdot \vec{v} = 0$.

Remark: The zero vector is orthogonal to every vector in $\mathbb{R}^n$.

Remark: In $\mathbb{R}^2$ and $\mathbb{R}^3$, the inner product also carries geometric information: $\vec{u} \cdot \vec{v} = ||\vec{u}||\,||\vec{v}||\cos\theta$, where $\theta$ is the angle between the vectors $\vec{u}$ and $\vec{v}$.

Example 1: Find the angle between $\vec{u} = \begin{bmatrix}-1\\0\end{bmatrix}$ and $\vec{v} = \begin{bmatrix}1\\1\end{bmatrix}$
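A numerical check of Example 1, sketched in Python (the `angle` helper is our own name; it just inverts the formula $\cos\theta = \frac{\vec{u}\cdot\vec{v}}{||\vec{u}||\,||\vec{v}||}$):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def angle(u, v):
    """Angle between u and v via cos(theta) = (u . v) / (||u|| ||v||)."""
    cos_theta = dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))
    return math.acos(cos_theta)

theta = angle([-1, 0], [1, 1])
print(theta, math.degrees(theta))  # 3*pi/4, i.e. 135 degrees
```

Here $\vec{u}\cdot\vec{v} = -1$, $||\vec{u}|| = 1$, and $||\vec{v}|| = \sqrt{2}$, so $\cos\theta = -\frac{1}{\sqrt{2}}$ and $\theta = \frac{3\pi}{4}$.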

Exercise 1: Find the angle between $\vec{u} = \begin{bmatrix}-1\\-1\\-1\end{bmatrix}$ and $\vec{v} = \begin{bmatrix}1\\1\\1\end{bmatrix}$

Definition: A set of vectors $\begin{Bmatrix}\vec{u_1}, \cdots, \vec{u_p}\end{Bmatrix}$ in $\mathbb{R}^n$ is said to be an orthogonal set if each pair of distinct vectors from the set is orthogonal, that is $\vec{u_i}\cdot\vec{u_j} = 0$ whenever $i \neq j$.

Theorem: If $S = \begin{Bmatrix}\vec{u_1}, \cdots, \vec{u_p}\end{Bmatrix}$ is an orthogonal set of nonzero vectors in $\mathbb{R}^n$, then $S$ is linearly independent and hence is a basis for the subspace spanned by $S$.

Proof: Suppose $c_1\vec{u_1} + \cdots + c_p\vec{u_p} = \vec{0}$. Taking the dot product of both sides with $\vec{u_i}$ kills every term except the $i$th, leaving $c_i(\vec{u_i} \cdot \vec{u_i}) = 0$. Since $\vec{u_i} \neq \vec{0}$, we have $\vec{u_i} \cdot \vec{u_i} \neq 0$, so $c_i = 0$ for each $i$.

Example 2: Show $S = \begin{Bmatrix}\begin{bmatrix}1\\0\\1\end{bmatrix}, \begin{bmatrix}0\\1\\0\end{bmatrix}, \begin{bmatrix}1\\0\\-1\end{bmatrix}\end{Bmatrix}$ is an orthogonal set in $\mathbb{R}^3$.
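A quick check of Example 2, sketched in Python: $S$ is orthogonal exactly when every pair of distinct vectors has dot product zero.

```python
from itertools import combinations

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

S = [[1, 0, 1], [0, 1, 0], [1, 0, -1]]

# S is an orthogonal set iff u_i . u_j = 0 for every pair with i != j.
print(all(dot(u, v) == 0 for u, v in combinations(S, 2)))  # True
```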

Exercise 2: Show $S = \begin{Bmatrix}\begin{bmatrix}1\\0\\0\end{bmatrix}, \begin{bmatrix}0\\2\\3\end{bmatrix}, \begin{bmatrix}0\\-3\\2\end{bmatrix}\end{Bmatrix}$ is an orthogonal set in $\mathbb{R}^3$.

Definition: An orthogonal basis for a subspace $W$ of $\mathbb{R}^n$ is a basis for $W$ that is also an orthogonal set.

Theorem: Let $\begin{Bmatrix}\vec{u_1}, \cdots, \vec{u_p}\end{Bmatrix}$ be an orthogonal basis for a subspace $W$ of $\mathbb{R}^n$. For each $\vec{y}$ in $W$, the weights in the linear combination $\vec{y} = c_1\vec{u_1} + \cdots + c_p\vec{u_p}$ are given by $c_j = \frac{\vec{y}\cdot\vec{u_j}}{\vec{u_j}\cdot\vec{u_j}}$.

Proof: Take the dot product of both sides of $\vec{y} = c_1\vec{u_1} + \cdots + c_p\vec{u_p}$ with $\vec{u_j}$; orthogonality leaves $\vec{y}\cdot\vec{u_j} = c_{j}(\vec{u_j}\cdot\vec{u_j})$, and solving for $c_j$ gives the formula.

Example 3: Show $S = \begin{Bmatrix}\begin{bmatrix}1\\0\\1\end{bmatrix}, \begin{bmatrix}0\\1\\0\end{bmatrix}, \begin{bmatrix}1\\0\\-1\end{bmatrix}\end{Bmatrix}$ is an orthogonal basis of $\mathbb{R}^3$. Let $\vec{y} = \begin{bmatrix}1\\2\\3\end{bmatrix}$. Write $\vec{y}$ as a linear combination of the vectors in $S$.
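The weight formula can be checked numerically for Example 3; this sketch uses exact `Fraction` arithmetic so no rounding enters.

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

S = [[1, 0, 1], [0, 1, 0], [1, 0, -1]]
y = [1, 2, 3]

# c_j = (y . u_j) / (u_j . u_j) for each basis vector u_j.
weights = [Fraction(dot(y, u), dot(u, u)) for u in S]
print(weights)  # weights 2, 2, -1

# Reconstruct y from the weights to confirm y = 2*u1 + 2*u2 - u3.
recon = [sum(c * u[k] for c, u in zip(weights, S)) for k in range(3)]
print(recon == y)  # True
```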

Exercise 3: Show $S = \begin{Bmatrix}\begin{bmatrix}1\\0\\0\end{bmatrix}, \begin{bmatrix}0\\2\\3\end{bmatrix}, \begin{bmatrix}0\\-3\\2\end{bmatrix}\end{Bmatrix}$ is an orthogonal basis of $\mathbb{R}^3$. Let $\vec{y} = \begin{bmatrix}5\\6\\2\end{bmatrix}$. Write $\vec{y}$ as a linear combination of the vectors in $S$.

Definition: A set $\begin{Bmatrix}\vec{u_1}, \cdots, \vec{u_p}\end{Bmatrix}$ is an orthonormal set if it is an orthogonal set of unit vectors. If $W$ is the subspace spanned by such a set, then $\begin{Bmatrix}\vec{u_1}, \cdots, \vec{u_p}\end{Bmatrix}$ is an orthonormal basis for $W$, since the set is automatically linearly independent, by the first theorem of this section.

Example 4: Show $S = \begin{Bmatrix}\begin{bmatrix}1/\sqrt{10}\\3/\sqrt{10}\\0\end{bmatrix}, \begin{bmatrix}-3/\sqrt{10}\\1/\sqrt{10}\\0\end{bmatrix}, \begin{bmatrix}0\\0\\1\end{bmatrix}\end{Bmatrix}$ is an orthonormal set in $\mathbb{R}^3$.

Exercise 4: Show $S = \begin{Bmatrix}\begin{bmatrix}1/\sqrt{6}\\-2/\sqrt{6}\\1/\sqrt{6}\end{bmatrix}, \begin{bmatrix}0/\sqrt{5}\\1/\sqrt{5}\\2/\sqrt{5}\end{bmatrix}, \begin{bmatrix}-5/\sqrt{30}\\-2/\sqrt{30}\\1/\sqrt{30}\end{bmatrix}\end{Bmatrix}$ is an orthonormal set in $\mathbb{R}^3$.

Remark: When the vectors in an orthogonal set of nonzero vectors are normalized to have unit length, the new vectors will still be orthogonal, and hence the new set will be an orthonormal set.
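This remark can be illustrated by normalizing the orthogonal set from Example 2 and checking the result is orthonormal; a sketch (the `normalize` helper is our own name):

```python
import math
from itertools import combinations

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def normalize(v):
    """Divide v by its length to get a unit vector in the same direction."""
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

S = [[1, 0, 1], [0, 1, 0], [1, 0, -1]]   # orthogonal set from Example 2
T = [normalize(v) for v in S]

# Unit lengths and pairwise orthogonality (up to floating-point error).
print(all(abs(dot(v, v) - 1) < 1e-12 for v in T))                  # True
print(all(abs(dot(u, v)) < 1e-12 for u, v in combinations(T, 2)))  # True
```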

Theorem: An $m \times n$ matrix $U$ has orthonormal columns if and only if $U^{T}U = I$.

Proof: Use $(U^{T}U)_{ij} = (\vec{u_i})^{T}\vec{u_j} = \vec{u_i} \cdot \vec{u_j}$, which equals $1$ when $i = j$ and $0$ when $i \neq j$ precisely when the columns of $U$ are orthonormal.

Example 5: Show $U = \begin{bmatrix}1/\sqrt{10} & -3/\sqrt{10} & 0\\3/\sqrt{10} & 1/\sqrt{10} & 0\\0 & 0 & 1\end{bmatrix}$ has orthonormal columns.
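For Example 5, the condition $U^{T}U = I$ can be verified entry by entry; a sketch computing each entry of $U^{T}U$ as a dot product of columns:

```python
import math

s = 1 / math.sqrt(10)
U = [[s, -3 * s, 0],
     [3 * s, s, 0],
     [0, 0, 1]]

# (U^T U)[i][j] = (column i of U) . (column j of U)
UtU = [[sum(U[k][i] * U[k][j] for k in range(3)) for j in range(3)]
       for i in range(3)]

# Compare with the 3x3 identity, allowing floating-point error.
close_to_I = all(abs(UtU[i][j] - (1.0 if i == j else 0.0)) < 1e-12
                 for i in range(3) for j in range(3))
print(close_to_I)  # True
```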

Exercise 5: Show $U = \begin{bmatrix}1/\sqrt{6} & 0/\sqrt{5} & -5/\sqrt{30}\\-2/\sqrt{6} & 1/\sqrt{5} & -2/\sqrt{30}\\1/\sqrt{6} & 2/\sqrt{5} & 1/\sqrt{30}\end{bmatrix}$ has orthonormal columns.

Group Work 1: Mark each statement True or False. Justify each answer.

a. $\vec{v} \cdot \vec{v} = ||\vec{v}||^2$.

b. For any scalar $c$, $\vec{u} \cdot (c\vec{v}) = c(\vec{u} \cdot \vec{v})$.

c. If the vectors in an orthogonal set of nonzero vectors are normalized, then some of the new vectors may not be orthogonal.

d. If a set $S = \begin{Bmatrix}\vec{u_1}, \cdots, \vec{u_p}\end{Bmatrix}$ has the property that $\vec{u_i} \cdot \vec{u_j} = 0$ whenever $i \neq j$, then $S$ is an orthonormal set.

e. $\vec{u} \cdot \vec{v} - \vec{v} \cdot \vec{u} = 0$.

f. For any scalar $c, ||c\vec{u}|| = c||\vec{u}||$.

g. Not every linearly independent set in $\mathbb{R}^n$ is an orthogonal set.

h. If $\vec{y}$ is a linear combination of nonzero vectors from an orthogonal set, then the weights in the linear combination can be computed without row operations on a matrix.

Group Work 2: Let $\vec{u} = \begin{bmatrix}-2\\1\\3\end{bmatrix}$ and let $W$ be the set of all $\vec{x}$ in $\mathbb{R}^3$ such that $\vec{u} \cdot \vec{x} = 0$. Show $W$ is a subspace and describe $W$ in geometric language.

Group Work 3: Prove $||\vec{u} + \vec{v}||^2 + ||\vec{u} - \vec{v}||^2 = 2||\vec{u}||^2 + 2||\vec{v}||^2$

Group Work 4: In each case either show that the statement is true or give an example showing that it is false.

a. If $\begin{Bmatrix}\vec{u},\vec{v}\end{Bmatrix}$ is an orthogonal set in $\mathbb{R}^n$, then $\begin{Bmatrix}\vec{u},\vec{u} + \vec{v}\end{Bmatrix}$ is also orthogonal.

b. If $\begin{Bmatrix}\vec{u},\vec{v}\end{Bmatrix}$ and $\begin{Bmatrix}\vec{x},\vec{y}\end{Bmatrix}$ are both orthogonal in $\mathbb{R}^n$, then $\begin{Bmatrix}\vec{u},\vec{v}, \vec{x},\vec{y}\end{Bmatrix}$ is also orthogonal.

c. If $\begin{Bmatrix}\vec{x_1},\vec{x_2},\cdots,\vec{x_n}\end{Bmatrix}$ is orthogonal in $\mathbb{R}^n$, then $\mathbb{R}^n = \text{Span}\begin{Bmatrix}\vec{x_1},\vec{x_2}, \cdots,\vec{x_n}\end{Bmatrix}$.

Group Work 5: Let $\vec{v} = \begin{bmatrix}a\\b\end{bmatrix}$. Describe the set $H$ of vectors $\begin{bmatrix}x\\y\end{bmatrix}$ that are orthogonal to $\vec{v}$.

Group Work 6: Let $\vec{u} = (u_1, u_2, u_3)$. Explain why $\vec{u} \cdot \vec{u} \geq 0$. When is $\vec{u} \cdot \vec{u} = 0$?