Section 2.1 Matrix Addition, Scalar Multiplication, and Transposition

Definition: 1. For any [latex]m \times n[/latex] matrix [latex]A = \begin{bmatrix}\vec{v_{1}} & \cdots & \vec{v_{n}}\end{bmatrix}[/latex], the [latex]i[/latex]-th entry of the vector [latex]\vec{v_{j}}[/latex] is called the (i, j)-entry of [latex]A[/latex], denoted [latex]a_{ij}[/latex]:

 

[latex]\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1j} & \cdots & a_{1n}\\ a_{21} & & & & & a_{2n}\\ \vdots & & & & & \vdots\\ a_{i1} & a_{i2} & \cdots & a_{ij} & \cdots & a_{in}\\ \vdots & & & & & \vdots\\ a_{m1} & a_{m2} & \cdots & a_{mj} & \cdots & a_{mn} \end{bmatrix}[/latex]
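For instance, in the [latex]2 \times 3[/latex] matrix [latex]A=\begin{bmatrix} 7 & 0 & 2\\ 1 & 5 & 3\end{bmatrix}[/latex] (chosen here only for illustration), the (1, 3)-entry is [latex]a_{13} = 2[/latex] and the (2, 1)-entry is [latex]a_{21} = 1[/latex].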

 

2. A square matrix is an [latex]n \times n[/latex] matrix.

 

3. The diagonal entries of an [latex]m \times n[/latex] matrix [latex]A = \begin{bmatrix}a_{ij}\end{bmatrix}[/latex] are [latex]a_{11}, a_{22}, \cdots[/latex], and they form the main diagonal of [latex]A[/latex]. A diagonal matrix is a square [latex]n \times n[/latex] matrix whose non-diagonal entries are all zero.

 

4. The square diagonal matrix with 1's on the main diagonal and 0's elsewhere is called an identity matrix and is denoted by [latex]I[/latex].

 

5. A zero matrix is an [latex]m \times n[/latex] matrix whose entries are all zero; it is written as 0.
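For example, [latex]\begin{bmatrix} 2 & 0 & 0\\ 0 & -1 & 0\\ 0 & 0 & 5\end{bmatrix}[/latex] is a [latex]3 \times 3[/latex] diagonal matrix, [latex]I = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 1\end{bmatrix}[/latex] is the [latex]3 \times 3[/latex] identity matrix, and [latex]\begin{bmatrix} 0 & 0 & 0\\ 0 & 0 & 0\end{bmatrix}[/latex] is the [latex]2 \times 3[/latex] zero matrix.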

 

6. Let [latex]A = \begin{bmatrix}a_{ij}\end{bmatrix}[/latex] and [latex]B = \begin{bmatrix}b_{ij}\end{bmatrix}[/latex] be two [latex]m \times n[/latex] matrices. We say [latex]A[/latex] is equal to [latex]B[/latex] if
[latex]a_{ij} = b_{ij}[/latex] for all [latex]i, j[/latex]. The sum of [latex]A[/latex] and [latex]B[/latex] is
[latex]A + B = \begin{bmatrix}a_{ij}+b_{ij}\end{bmatrix}[/latex].
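For example, [latex]\begin{bmatrix} 1 & 2\\ 3 & 4\end{bmatrix} + \begin{bmatrix} 0 & -1\\ 5 & 2\end{bmatrix} = \begin{bmatrix} 1 & 1\\ 8 & 6\end{bmatrix}[/latex], where each entry of the sum is obtained by adding the corresponding entries.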

 

 

 

7. If [latex]r[/latex] is a scalar and [latex]A = \begin{bmatrix}a_{ij}\end{bmatrix}[/latex]
is a matrix, then the scalar multiple is [latex]rA = \begin{bmatrix}ra_{ij}\end{bmatrix}[/latex], whose entries are [latex]r[/latex] times the entries of [latex]A[/latex].
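For example, [latex]3\begin{bmatrix} 1 & -2\\ 0 & 4\end{bmatrix} = \begin{bmatrix} 3 & -6\\ 0 & 12\end{bmatrix}[/latex].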

 

Note: Two matrices can be equal only when they have the same size. Likewise, the sum of two matrices is defined only when the two matrices are of the same size.

 

Theorem: Let [latex]A, B,[/latex] and [latex]C[/latex] be matrices of the same size, and let [latex]r[/latex] and [latex]s[/latex] be scalars. Then:

(a) [latex]A + B = B + A[/latex]

 

(b) [latex](A + B) + C = A + (B + C)[/latex]

 

(c) [latex]A + 0 = A[/latex]

 

(d) [latex]r(A + B) = rA + rB[/latex]

 

(e) [latex](r + s)A = rA + sA[/latex]

 

(f) [latex]r(sA) = (rs)A[/latex]
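As an illustration of (e), take [latex]r = 2[/latex], [latex]s = 3[/latex], and [latex]A=\begin{bmatrix} 1 & -1\\ 0 & 2\end{bmatrix}[/latex] (chosen here only for concreteness): [latex](2 + 3)A = \begin{bmatrix} 5 & -5\\ 0 & 10\end{bmatrix} = 2A + 3A[/latex].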

 

 

 

Definition: 1. The transpose of [latex]A[/latex], written [latex]A^T[/latex], is the matrix whose columns are the rows of [latex]A[/latex] (equivalently, whose rows are the columns of [latex]A[/latex]). If [latex]A[/latex] is [latex]m \times n[/latex], then [latex]A^T[/latex] is [latex]n \times m[/latex].

2. The matrix [latex]A[/latex] is called symmetric if and only if [latex]A = A^T[/latex]. Note that this immediately implies that [latex]A[/latex] is a square matrix.
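For example, if [latex]A=\begin{bmatrix} 1 & 2 & 3\\ 4 & 5 & 6\end{bmatrix}[/latex], then [latex]A^T=\begin{bmatrix} 1 & 4\\ 2 & 5\\ 3 & 6\end{bmatrix}[/latex]. The matrix [latex]\begin{bmatrix} 1 & 2\\ 2 & 3\end{bmatrix}[/latex] is symmetric, since it equals its own transpose.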

 

 

 

Theorem: Let [latex]A[/latex] and [latex]B[/latex] be matrices whose sizes are appropriate for the following sums and scalar multiples. Then

a. [latex](A^T)^T = A[/latex]

 

b. [latex](A + B)^T = A^T + B^T[/latex]

 

c. For any scalar [latex]r[/latex], [latex](rA)^T = rA^T[/latex]

 

Example 1: Verify [latex](A + B)^T = A^T + B^T[/latex].
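One possible check, using matrices chosen here only for illustration: take [latex]A=\begin{bmatrix} 1 & 2\\ 3 & 4\end{bmatrix}[/latex] and [latex]B=\begin{bmatrix} 0 & 5\\ 6 & 7\end{bmatrix}[/latex]. Then [latex]A + B = \begin{bmatrix} 1 & 7\\ 9 & 11\end{bmatrix}[/latex], so [latex](A + B)^T = \begin{bmatrix} 1 & 9\\ 7 & 11\end{bmatrix}[/latex], while [latex]A^T + B^T = \begin{bmatrix} 1 & 3\\ 2 & 4\end{bmatrix} + \begin{bmatrix} 0 & 6\\ 5 & 7\end{bmatrix} = \begin{bmatrix} 1 & 9\\ 7 & 11\end{bmatrix}[/latex], which agrees.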

 

 

Exercise 1: Verify [latex](A^T)^T = A[/latex].

 

Example 2: Find [latex](2A + B^T)^T[/latex] where [latex]A=\begin{bmatrix} 1 & 2 & 3\\ 4 & 5 & 6\end{bmatrix}[/latex] and [latex]B=\begin{bmatrix} 1 & 4\\ 2 & 5\\ 3 & 6\end{bmatrix}[/latex].
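One way to carry out the computation: [latex]B^T=\begin{bmatrix} 1 & 2 & 3\\ 4 & 5 & 6\end{bmatrix}[/latex], so [latex]2A + B^T = \begin{bmatrix} 2 & 4 & 6\\ 8 & 10 & 12\end{bmatrix} + \begin{bmatrix} 1 & 2 & 3\\ 4 & 5 & 6\end{bmatrix} = \begin{bmatrix} 3 & 6 & 9\\ 12 & 15 & 18\end{bmatrix}[/latex], and therefore [latex](2A + B^T)^T = \begin{bmatrix} 3 & 12\\ 6 & 15\\ 9 & 18\end{bmatrix}[/latex].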

 

 

Exercise 2: Find [latex](A^T - 3B)^T[/latex] where [latex]A=\begin{bmatrix} 1 & -2\\ -2 & 0 \\ 3 & 1\end{bmatrix}[/latex] and [latex]B=\begin{bmatrix} 0 & 3 & 4\\ -2 & 1 & 1\end{bmatrix}[/latex].

 

Group Work 1: Show [latex]A + 2B[/latex] is a diagonal matrix if both [latex]A[/latex] and [latex]B[/latex] are diagonal matrices.

 

Group Work 2: In each case either show that the statement is true or give an
example showing it is false.

a. If [latex]A + B = A + C[/latex], then [latex]B[/latex] and [latex]C[/latex] have the same size.

 

b. If [latex]A + B = 0[/latex], then [latex]B = 0[/latex].

 

c. If the (3,1)-entry of [latex]A[/latex] is 5, then the (1,3)-entry of [latex]A^T[/latex] is 5.

 

d. [latex]A[/latex] and [latex]A^T[/latex] have the same main diagonal for every matrix [latex]A[/latex].

 

e. If [latex]B[/latex] is symmetric and [latex]A^T = 3B[/latex], then [latex]A = 3B[/latex].

 

f. If [latex]A[/latex] and [latex]B[/latex] are symmetric, then [latex]kA + mB[/latex] is symmetric for any scalars [latex]k[/latex] and [latex]m[/latex].

 

g. [latex]A + A^T[/latex] is symmetric for any square matrix [latex]A[/latex].

 

h. If [latex]Q + A = A[/latex] holds for every [latex]m \times n[/latex] matrix [latex]A[/latex], then [latex]Q[/latex]
is the zero matrix.

 

Group Work 3: Show [latex]A^T + 3B[/latex] is a symmetric matrix if both [latex]A[/latex] and [latex]B[/latex] are symmetric matrices.

 

License


Matrices Copyright © by Kuei-Nuan Lin is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
