Linear Algebra
\[ (85, 75), (89, 50), (95, 100), (56, 99), (68, 98) \]
85 and 75 are the components of the vector \((85, 75)\)
\[ \begin{bmatrix} 85 & 75 \\ 89 & 50 \\ 95 & 100 \\ 56 & 99 \\ 68 & 98 \end{bmatrix} \]
\[ \begin{pmatrix} 85 \\ 75 \end{pmatrix} \quad \begin{bmatrix} 85 \\ 75 \\ 89 \\ 50 \\ 95 \\ 100 \\ 56 \\ 99 \\ 68 \\ 98 \end{bmatrix} \]
\[ (85, 75) \quad \begin{bmatrix} 85 & 75 & 89 & 50 & 95 & 100 & 56 & 99 & 68 & 98 \end{bmatrix} \]
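A minimal sketch of these representations, assuming NumPy (the notes themselves don't name a library):

```python
import numpy as np

# the five pairs above as a 5x2 matrix (one row per pair)
data = np.array([(85, 75), (89, 50), (95, 100), (56, 99), (68, 98)])
print(data.shape)               # (5, 2)
column = data[0].reshape(2, 1)  # (85, 75) as a 2x1 column vector
row = data.reshape(1, -1)       # all ten components as a single 1x10 row vector
```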
\[(1, 2, 3) + (4, 5, 6) = (5, 7, 9)\]
\[(x_1, \ldots, x_n) + (y_1, \ldots, y_n) = (x_1 + y_1, \ldots, x_n + y_n)\]
Components are added
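The addition example as a NumPy sketch:

```python
import numpy as np

x = np.array([1, 2, 3])
y = np.array([4, 5, 6])
print(x + y)  # [5 7 9] -- corresponding components are added
```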
\[3 \cdot (1, 2, 3) = (3, 6, 9)\]
\[c \cdot (x_1, \ldots, x_n) = (cx_1, \ldots, cx_n)\]
Components are scaled
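Scalar multiplication, same sketch:

```python
import numpy as np

x = np.array([1, 2, 3])
print(3 * x)  # [3 6 9] -- every component is scaled by 3
```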
\[2 \cdot (1, 2) + 3 \cdot (-1, 1) = (-1, 7)\]
\[c_1x_1 + \ldots + c_mx_m\]
where \(x_i = (x_{i1}, \ldots, x_{in})\)
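The linear combination \(2 \cdot (1, 2) + 3 \cdot (-1, 1)\) in the same NumPy sketch:

```python
import numpy as np

x1 = np.array([1, 2])
x2 = np.array([-1, 1])
print(2 * x1 + 3 * x2)  # [-1  7] -- the linear combination 2*x1 + 3*x2
```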
\[ A = \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 9 & 10 & 11 & 12 \end{bmatrix} = \begin{bmatrix} A_{11} & A_{12} & A_{13} & A_{14} \\ A_{21} & A_{22} & A_{23} & A_{24} \\ A_{31} & A_{32} & A_{33} & A_{34} \end{bmatrix} \]
\(3 \times 4\) matrix
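Indexing the entries \(A_{ij}\), assuming NumPy's zero-based indexing:

```python
import numpy as np

A = np.array([[1,  2,  3,  4],
              [5,  6,  7,  8],
              [9, 10, 11, 12]])
print(A.shape)  # (3, 4): 3 rows, 4 columns
print(A[0, 1])  # 2, i.e. the entry A_12 (NumPy counts rows and columns from 0)
```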
\[ \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix} \begin{pmatrix} 3 \\ -1 \\ 2 \end{pmatrix} = 3 \cdot \begin{pmatrix} 1 \\ 2 \end{pmatrix} + (-1) \cdot \begin{pmatrix} 3 \\ 4 \end{pmatrix} + 2 \cdot \begin{pmatrix} 5 \\ 6 \end{pmatrix} \]
Linear combination of the columns
\[ \begin{bmatrix} c_1 & \ldots & c_n \end{bmatrix} \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix} = x_1c_1 + \ldots + x_nc_n \]
\(m \times n\) · \(n \times 1\) = \(m \times 1\)
\(M_{m \times n}(\mathbb{R}) \cdot \mathbb{R}^n \rightarrow \mathbb{R}^m\)
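The column picture as a NumPy sketch; `A @ x` and the explicit combination of the columns agree:

```python
import numpy as np

A = np.array([[1, 3, 5],
              [2, 4, 6]])
x = np.array([3, -1, 2])
print(A @ x)                                    # [10 14]
print(3 * A[:, 0] - 1 * A[:, 1] + 2 * A[:, 2])  # [10 14] -- same linear combination of the columns
```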
\[ (3, -1) \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix} = 3 \cdot (1, 3, 5) + (-1) \cdot (2, 4, 6) \]
Linear combination of the rows
\[ \begin{pmatrix} x_1 & \ldots & x_m \end{pmatrix} \begin{bmatrix} r_1^T \\ \vdots \\ r_m^T \end{bmatrix} = x_1r_1^T + \ldots + x_mr_m^T \]
\(1 \times m\) · \(m \times n\) = \(1 \times n\)
\(\mathbb{R}^m \cdot M_{m \times n}(\mathbb{R}) \rightarrow \mathbb{R}^n\)
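The row picture, same kind of sketch:

```python
import numpy as np

A = np.array([[1, 3, 5],
              [2, 4, 6]])
x = np.array([3, -1])
print(x @ A)                      # [1 5 9]
print(3 * A[0, :] - 1 * A[1, :])  # [1 5 9] -- same linear combination of the rows
```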
\[ (1, 0, 2, -1) \begin{pmatrix} 2 \\ -1 \\ 1 \\ 3 \end{pmatrix} = 2 + 0 + 2 - 3 = 1 \]
Dot product
\[ \begin{pmatrix} x_1 & \ldots & x_n \end{pmatrix} \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix} = x_1y_1 + \ldots + x_ny_n \]
\(1 \times n\) · \(n \times 1\) = \(1 \times 1\)
\(\mathbb{R}^n \cdot \mathbb{R}^n \rightarrow \mathbb{R}\)
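The dot product example as a sketch:

```python
import numpy as np

x = np.array([1, 0, 2, -1])
y = np.array([2, -1, 1, 3])
print(np.dot(x, y))  # 1  (= 2 + 0 + 2 - 3)
```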
\[ \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} (5, 6, 7) = \begin{bmatrix} 5 & 6 & 7 \\ 10 & 12 & 14 \\ 15 & 18 & 21 \end{bmatrix} \]
Outer product
\[ \begin{pmatrix} x_1 \\ \vdots \\ x_m \end{pmatrix} \begin{pmatrix} y_1 & \ldots & y_n \end{pmatrix} = \begin{bmatrix} x_1y_1 & \ldots & x_1y_n \\ \vdots & & \vdots \\ x_my_1 & \ldots & x_my_n \end{bmatrix} \]
\(m \times 1\) · \(1 \times n\) = \(m \times n\)
\(\mathbb{R}^m \cdot \mathbb{R}^n \rightarrow M_{m \times n}(\mathbb{R})\)
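The outer product example, using NumPy's `np.outer`:

```python
import numpy as np

x = np.array([1, 2, 3])
y = np.array([5, 6, 7])
print(np.outer(x, y))
# [[ 5  6  7]
#  [10 12 14]
#  [15 18 21]]
```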
\(AB = C\), where \(C_{ij} = A_{i1}B_{1j} + \ldots + A_{in}B_{nj}\)
\(m \times n\) · \(n \times p\) = \(m \times p\)
\(M_{m \times n}(\mathbb{R}) \cdot M_{n \times p}(\mathbb{R}) \rightarrow M_{m \times p}(\mathbb{R})\)
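A sketch of the general product and its dimensions; the sizes \(3 \times 4\) and \(4 \times 2\) are an arbitrary choice for illustration:

```python
import numpy as np

A = np.random.rand(3, 4)  # m x n
B = np.random.rand(4, 2)  # n x p
C = A @ B
print(C.shape)            # (3, 2): m x p
# entrywise: C[i, j] is the sum over k of A[i, k] * B[k, j]
print(np.isclose(C[0, 0], sum(A[0, k] * B[k, 0] for k in range(4))))  # True
```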
Diagonal: \[ D = \begin{bmatrix} d_1 & 0 & 0 \\ 0 & \ddots & 0 \\ 0 & 0 & d_n \end{bmatrix} \]
Scalar: \[ S = \begin{bmatrix} c & 0 & 0 \\ 0 & \ddots & 0 \\ 0 & 0 & c \end{bmatrix} = cI \]
Identity: \[ I = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \ddots & 0 \\ 0 & 0 & 1 \end{bmatrix} \]
Square matrix (same number of rows and columns): \[ A_{n \times n}, \quad \text{e.g. } \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \]
Upper Triangular: \[ U = \begin{bmatrix} a_{11} & \ldots & a_{1n} \\ & \ddots & \vdots \\ 0 & & a_{nn} \end{bmatrix} \]
Lower Triangular: \[ L = \begin{bmatrix} a_{11} & 0 \\ \vdots & \ddots \\ a_{n1} & \ldots & a_{nn} \end{bmatrix} \]
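Each of these special matrices can be built with a NumPy helper (`np.diag`, `np.eye`, `np.triu`, `np.tril`); the sample entries are arbitrary:

```python
import numpy as np

D = np.diag([1.0, 2.0, 3.0])        # diagonal matrix with d_1, d_2, d_3 on the diagonal
S = 5 * np.eye(3)                   # scalar matrix cI with c = 5
I = np.eye(3)                       # identity matrix
A = np.arange(1, 10).reshape(3, 3)  # an arbitrary 3x3 square matrix
U = np.triu(A)                      # upper triangular part (zeros below the diagonal)
L = np.tril(A)                      # lower triangular part (zeros above the diagonal)
```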
\[ \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix}^T = \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix} \]
\[ \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix}^T = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix} \]
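Transposition as a sketch; transposing twice gives back the original matrix:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
print(A.T)    # [[1 3 5]
              #  [2 4 6]]
print(A.T.T)  # transposing twice returns the original 3x2 matrix
```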
Symmetric: \[ A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 0 & 4 \\ 3 & 4 & 1 \end{bmatrix} \] \[ A^T = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 0 & 4 \\ 3 & 4 & 1 \end{bmatrix} \] \(A = A^T\)
Skew-Symmetric: \[ A = \begin{bmatrix} 0 & 2 & 3 \\ -2 & 0 & 4 \\ -3 & -4 & 0 \end{bmatrix} \] \[ A^T = \begin{bmatrix} 0 & -2 & -3 \\ 2 & 0 & -4 \\ 3 & 4 & 0 \end{bmatrix} \] \(A = -A^T\)
For any square matrix \(A\):
\[ A = \underbrace{\frac{A + A^T}{2}}_{\text{symmetric}} + \underbrace{\frac{A - A^T}{2}}_{\text{skew-symmetric}} \]
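A numerical check of the split on an arbitrary \(3 \times 3\) matrix (entries chosen only for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [4.0, 3.0, 5.0],
              [6.0, 7.0, 8.0]])      # arbitrary square matrix
sym = (A + A.T) / 2                  # symmetric part
skew = (A - A.T) / 2                 # skew-symmetric part
print(np.allclose(sym, sym.T))       # True
print(np.allclose(skew, -skew.T))    # True
print(np.allclose(A, sym + skew))    # True: the two parts add back to A
```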
Inverse: for square matrices \(A_{n \times n}\) and \(B_{n \times n}\),
\(AB = BA = I \implies B = A^{-1}\) and \(A = B^{-1}\)
For a \(2 \times 2\) matrix: \[ \begin{bmatrix} a & b \\ c & d \end{bmatrix}^{-1} = \frac{1}{ad-bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix} \quad \text{if } ad-bc \neq 0 \]
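The \(2 \times 2\) formula checked against `np.linalg.inv` (the example matrix is an arbitrary choice):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]  # ad - bc = -2, nonzero so A is invertible
B = np.array([[ A[1, 1], -A[0, 1]],
              [-A[1, 0],  A[0, 0]]]) / det   # the 2x2 inverse formula
print(np.allclose(B, np.linalg.inv(A)))      # True
print(np.allclose(A @ B, np.eye(2)))         # True: AB = I
```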