Vectors and Matrices

Linear Algebra

Karthik Thiagarajan

Data

  • Vectors
  • Matrices

Vectors

\[ (85, 75), (89, 50), (95, 100), (56, 99), (68, 98) \]

85 and 75 are components of the vector \((85, 75)\)

Matrices

\[ \begin{bmatrix} 85 & 75 \\ 89 & 50 \\ 95 & 100 \\ 56 & 99 \\ 68 & 98 \end{bmatrix} \]

Column Vector

\[ \begin{pmatrix} 85 \\ 75 \end{pmatrix} \quad \begin{bmatrix} 85 \\ 75 \\ 89 \\ 50 \\ 95 \\ 100 \\ 56 \\ 99 \\ 68 \\ 98 \end{bmatrix} \]

Row Vector

\[ (85, 75) \quad \begin{bmatrix} 85 & 75 & 89 & 50 & 95 & 100 & 56 & 99 & 68 & 98 \end{bmatrix} \]
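The same vectors can be sketched in NumPy (an illustrative addition, not part of the original slides); a row vector is a \(1 \times n\) array and a column vector an \(n \times 1\) array:

```python
import numpy as np

row = np.array([[85, 75]])         # row vector, shape (1, 2)
col = np.array([[85], [75]])       # column vector, shape (2, 1)
print(row.shape, col.shape)        # (1, 2) (2, 1)
print(np.array_equal(row.T, col))  # True: transposing swaps the two forms
```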

Vector Addition

\[(1, 2, 3) + (4, 5, 6) = (5, 7, 9)\]

\[(x_1, \ldots, x_n) + (y_1, \ldots, y_n) = (x_1 + y_1, \ldots, x_n + y_n)\]

Components are added
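As a quick NumPy check (illustrative):

```python
import numpy as np

x = np.array([1, 2, 3])
y = np.array([4, 5, 6])
print(x + y)  # [5 7 9]: components are added
```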

Scalar Multiplication

\[3 \cdot (1, 2, 3) = (3, 6, 9)\]

\[c \cdot (x_1, \ldots, x_n) = (cx_1, \ldots, cx_n)\]

Components are scaled
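And the corresponding sketch:

```python
import numpy as np

x = np.array([1, 2, 3])
print(3 * x)  # [3 6 9]: each component is scaled by 3
```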

Linear Combination

\[2 \cdot (1, 2) + 3 \cdot (-1, 1) = (-1, 7)\]

\[c_1x_1 + \ldots + c_mx_m\]

where \(x_i = (x_{i1}, \ldots, x_{in})\)
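The worked example above, as a NumPy sketch:

```python
import numpy as np

x1 = np.array([1, 2])
x2 = np.array([-1, 1])
print(2 * x1 + 3 * x2)  # [-1  7]
```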

\(\mathbb{R}^n\)

  • \(\mathbb{R}\) : line
  • \(\mathbb{R}^2 = \{(x, y) | x, y \in \mathbb{R}\}\) : plane
  • \(\mathbb{R}^3 = \{(x, y, z) | x, y, z \in \mathbb{R}\}\) : space
  • \(\mathbb{R}^n = \{(x_1, \ldots, x_n) | x_1, \ldots, x_n \in \mathbb{R}\}\)

\(M_{m \times n}(\mathbb{R})\)

\[ A = \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 9 & 10 & 11 & 12 \end{bmatrix} = \begin{bmatrix} A_{11} & A_{12} & A_{13} & A_{14} \\ A_{21} & A_{22} & A_{23} & A_{24} \\ A_{31} & A_{32} & A_{33} & A_{34} \end{bmatrix} \]

\(3 \times 4\) matrix

\(M_{m \times n}(\mathbb{R})\)

  • \(M_{3 \times 4}(\mathbb{R})\): set of all real \(3 \times 4\) matrices
  • \(M_{m \times n}(\mathbb{R})\): set of all real \(m \times n\) matrices
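A sketch of the \(3 \times 4\) example in NumPy; note that NumPy indexes from 0, so \(A_{12}\) is A[0, 1]:

```python
import numpy as np

A = np.array([[1,  2,  3,  4],
              [5,  6,  7,  8],
              [9, 10, 11, 12]])
print(A.shape)  # (3, 4): an element of M_{3x4}(R)
print(A[0, 1])  # 2, i.e. the entry A_{12}
```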

Matrix-Vector Multiplication

\[ \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix} \begin{pmatrix} 3 \\ -1 \\ 2 \end{pmatrix} = 3 \cdot \begin{pmatrix} 1 \\ 2 \end{pmatrix} + (-1) \cdot \begin{pmatrix} 3 \\ 4 \end{pmatrix} + 2 \cdot \begin{pmatrix} 5 \\ 6 \end{pmatrix} \]

Linear combination of the columns

Matrix-Vector Multiplication

\[ \begin{bmatrix} c_1 & \ldots & c_n \end{bmatrix} \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix} = x_1c_1 + \ldots + x_nc_n \]

\(m \times n\) · \(n \times 1\) = \(m \times 1\)

\(M_{m \times n}(\mathbb{R}) \cdot \mathbb{R}^n \rightarrow \mathbb{R}^m\)
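A sketch checking, on the example above, that \(Ax\) is the stated linear combination of the columns:

```python
import numpy as np

A = np.array([[1, 3, 5],
              [2, 4, 6]])
x = np.array([3, -1, 2])
print(A @ x)                                    # [10 14]
print(3 * A[:, 0] - 1 * A[:, 1] + 2 * A[:, 2])  # [10 14]: same combination of columns
```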

Vector-Matrix Multiplication

\[ (3, -1) \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix} = 3 \cdot (1, 3, 5) + (-1) \cdot (2, 4, 6) \]

Linear combination of the rows

Vector-Matrix Multiplication

\[ \begin{pmatrix} x_1 & \ldots & x_m \end{pmatrix} \begin{bmatrix} r_1^T \\ \vdots \\ r_m^T \end{bmatrix} = x_1r_1^T + \ldots + x_mr_m^T \]

\(1 \times m\) · \(m \times n\) = \(1 \times n\)

\(\mathbb{R}^m \cdot M_{m \times n}(\mathbb{R}) \rightarrow \mathbb{R}^n\)
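The row-side analogue, again on the example above:

```python
import numpy as np

A = np.array([[1, 3, 5],
              [2, 4, 6]])
x = np.array([3, -1])
print(x @ A)                      # [1 5 9]
print(3 * A[0, :] - 1 * A[1, :])  # [1 5 9]: same combination of rows
```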

Vector-Vector Multiplication (Inner Product)

\[ (1, 0, 2, -1) \begin{pmatrix} 2 \\ -1 \\ 1 \\ 3 \end{pmatrix} = 1 \cdot 2 + 0 \cdot (-1) + 2 \cdot 1 + (-1) \cdot 3 = 1 \]

Dot product

Vector-Vector Multiplication (Inner Product)

\[ \begin{pmatrix} x_1 & \ldots & x_n \end{pmatrix} \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix} = x_1y_1 + \ldots + x_ny_n \]

\(1 \times n\) · \(n \times 1\) = \(1 \times 1\)

\(\mathbb{R}^n \cdot \mathbb{R}^n \rightarrow \mathbb{R}\)
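The inner product above, as a NumPy sketch:

```python
import numpy as np

x = np.array([1, 0, 2, -1])
y = np.array([2, -1, 1, 3])
print(np.dot(x, y))  # 1, from 1*2 + 0*(-1) + 2*1 + (-1)*3
```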

Vector-Vector Multiplication (Outer Product)

\[ \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} (5, 6, 7) = \begin{bmatrix} 5 & 6 & 7 \\ 10 & 12 & 14 \\ 15 & 18 & 21 \end{bmatrix} \]

Every entry is a product \(x_iy_j\)

Vector-Vector Multiplication (Outer Product)

\[ \begin{pmatrix} x_1 \\ \vdots \\ x_m \end{pmatrix} \begin{pmatrix} y_1 & \ldots & y_n \end{pmatrix} = \begin{bmatrix} x_1y_1 & \ldots & x_1y_n \\ \vdots & & \vdots \\ x_my_1 & \ldots & x_my_n \end{bmatrix} \]

\(m \times 1\) · \(1 \times n\) = \(m \times n\)

\(\mathbb{R}^m \cdot \mathbb{R}^n \rightarrow M_{m \times n}(\mathbb{R})\)
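The outer product above, as a sketch:

```python
import numpy as np

x = np.array([1, 2, 3])
y = np.array([5, 6, 7])
print(np.outer(x, y))
# [[ 5  6  7]
#  [10 12 14]
#  [15 18 21]]
```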

Matrix-Matrix Multiplication

\(AB = C\)

\(m \times n\) · \(n \times p\) = \(m \times p\)

  • Only matrices of compatible dimensions can be multiplied
    • # columns of \(A\) = # rows of \(B\)

  • Matrix multiplication is not commutative
    • In general \(AB \neq BA\)
    • If \(AB = BA\), we say that \(A\) and \(B\) commute (see the sketch below)
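A minimal sketch of non-commutativity; the matrices here are my own illustrative choice:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])       # permutation matrix, chosen for illustration
print(A @ B)                         # [[2 1] [4 3]]: columns of A swapped
print(B @ A)                         # [[3 4] [1 2]]: rows of A swapped
print(np.array_equal(A @ B, B @ A))  # False: A and B do not commute
```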

Matrix-Matrix Multiplication

\(AB = C\)

  • Matrix-Vector: \(A \begin{bmatrix} b_1 & \ldots & b_p \end{bmatrix} = \begin{bmatrix} Ab_1 & \ldots & Ab_p \end{bmatrix}\) (each column of \(C\) is \(A\) times a column of \(B\))
  • Vector-Matrix: \(A = \begin{bmatrix} a_1^T \\ \vdots \\ a_m^T \end{bmatrix} \rightarrow AB = \begin{bmatrix} a_1^TB \\ \vdots \\ a_m^TB \end{bmatrix}\) (each row of \(C\) is a row of \(A\) times \(B\))
  • Vector-Vector (Inner Product): \(\begin{bmatrix} a_1^T \\ \vdots \\ a_m^T \end{bmatrix} \begin{bmatrix} b_1 & \ldots & b_p \end{bmatrix} = \begin{bmatrix} a_i^Tb_j \end{bmatrix}\), i.e. \(C_{ij} = a_i^Tb_j\)
  • Vector-Vector (Outer Product): \(\begin{bmatrix} a_1 & \ldots & a_n \end{bmatrix} \begin{bmatrix} b_1^T \\ \vdots \\ b_n^T \end{bmatrix} = a_1b_1^T + \ldots + a_nb_n^T\)
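All four views can be checked numerically; a sketch with matrices of my own choosing:

```python
import numpy as np

A = np.array([[1, 3, 5],
              [2, 4, 6]])    # 2 x 3
B = np.array([[1, 0],
              [0, 1],
              [1, 1]])       # 3 x 2
C = A @ B                    # 2 x 2

print(np.array_equal(C[:, 0], A @ B[:, 0]))  # True: columns of C are A times columns of B
print(np.array_equal(C[0, :], A[0, :] @ B))  # True: rows of C are rows of A times B
print(C[1, 0] == np.dot(A[1, :], B[:, 0]))   # True: C_ij is an inner product
S = sum(np.outer(A[:, k], B[k, :]) for k in range(3))
print(np.array_equal(C, S))                  # True: C is a sum of outer products
```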

Special Matrices

Diagonal: \[ D = \begin{bmatrix} d_1 & 0 & 0 \\ 0 & \ddots & 0 \\ 0 & 0 & d_n \end{bmatrix} \]

Scalar: \[ S = \begin{bmatrix} c & 0 & 0 \\ 0 & \ddots & 0 \\ 0 & 0 & c \end{bmatrix} = cI \]

Identity: \[ I = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \ddots & 0 \\ 0 & 0 & 1 \end{bmatrix} \]

Square matrix: \[ A_{n \times n} \rightarrow \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \]
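These can be constructed directly in NumPy (a sketch):

```python
import numpy as np

D = np.diag([1, 2, 3])  # diagonal matrix with entries d = (1, 2, 3)
I = np.eye(3)           # 3 x 3 identity
S = 5 * I               # scalar matrix cI with c = 5
print(np.array_equal(S, np.diag([5, 5, 5])))  # True
```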

Special Matrices (cont.)

Upper Triangular: \[ U = \begin{bmatrix} a_{11} & \ldots & a_{1n} \\ & \ddots & \vdots \\ 0 & & a_{nn} \end{bmatrix} \]

Lower Triangular: \[ L = \begin{bmatrix} a_{11} & & 0 \\ \vdots & \ddots & \\ a_{n1} & \ldots & a_{nn} \end{bmatrix} \]
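NumPy's triu and tril extract these parts (a sketch):

```python
import numpy as np

A = np.arange(1, 10).reshape(3, 3)  # [[1 2 3] [4 5 6] [7 8 9]]
print(np.triu(A))  # upper triangular part: zeros below the diagonal
print(np.tril(A))  # lower triangular part: zeros above the diagonal
```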

Transpose

\[ \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix}^T = \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix} \]

\[ \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix}^T = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix} \]

Transpose Properties

  • \(A_{m \times n} \rightarrow A^T_{n \times m}\)
  • \((A^T)^T = A\)
  • \(A_{ij} = A^T_{ji}\)
  • \((AB)^T = B^TA^T\) (the order reverses; see the check below)
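A sketch checking the properties on small matrices of my own choosing:

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])  # 3 x 2
B = np.array([[1, 0, 2], [0, 1, 1]])    # 2 x 3
print(A.T.shape)                             # (2, 3)
print(np.array_equal(A.T.T, A))              # True: (A^T)^T = A
print(np.array_equal((A @ B).T, B.T @ A.T))  # True: (AB)^T = B^T A^T
```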

Symmetric and Skew-symmetric

Symmetric: \[ A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 0 & 4 \\ 3 & 4 & 1 \end{bmatrix} \] \[ A^T = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 0 & 4 \\ 3 & 4 & 1 \end{bmatrix} \] \(A = A^T\)

Skew-Symmetric: \[ A = \begin{bmatrix} 0 & 2 & 3 \\ -2 & 0 & 4 \\ -3 & -4 & 0 \end{bmatrix} \] \[ A^T = \begin{bmatrix} 0 & -2 & -3 \\ 2 & 0 & -4 \\ 3 & 4 & 0 \end{bmatrix} \] \(A = -A^T\)

Symmetric and Skew-symmetric (cont.)

For any square matrix \(A\):

\[ A = \underbrace{\frac{A + A^T}{2}}_{\text{symmetric}} + \underbrace{\frac{A - A^T}{2}}_{\text{skew-symmetric}} \]
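A sketch of the decomposition on an arbitrary square matrix:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])
S = (A + A.T) / 2  # symmetric part:      S equals S.T
K = (A - A.T) / 2  # skew-symmetric part: K equals -K.T
print(np.array_equal(S, S.T), np.array_equal(K, -K.T))  # True True
print(np.array_equal(A, S + K))                         # True: A = S + K
```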

Inverse

\(A_{n \times n} \rightarrow B_{n \times n}\)

\(AB = BA = I \implies B = A^{-1}\) and \(A = B^{-1}\)

For a \(2 \times 2\) matrix: \[ \begin{bmatrix} a & b \\ c & d \end{bmatrix}^{-1} = \frac{1}{ad-bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix} \quad \text{if } ad-bc \neq 0 \]
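A sketch comparing the \(2 \times 2\) formula with NumPy's general inverse:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
det = 1 * 4 - 2 * 3                      # ad - bc = -2, nonzero, so A is invertible
B = (1 / det) * np.array([[ 4., -2.],
                          [-3.,  1.]])   # the 2 x 2 formula above
print(np.allclose(A @ B, np.eye(2)))     # True: AB = I
print(np.allclose(B, np.linalg.inv(A)))  # True: matches the general inverse
```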