I. How to write the solution of $AX= b$ in parametric vector form.

1) Find the reduced echelon form of the augmented matrix $[A \mid b]$.

2) If the system is consistent, solve for the basic variables in terms of the free variables, say $x_{i_1}, \dots , x_{i_s}$. Write the solution $X$ as a vector with each entry expressed in terms of the free variables.

3) Set $x_{i_1}= x_{i_2}= \cdots = x_{i_s}= 0$ to find a particular solution $X_0$.

4) Let $Y= X-X_0$, so that $Y$ represents all solutions of the homogeneous system $AX = \vec 0$.

5) Set $x_{i_1}= 1$ and $x_{i_2}= \cdots = x_{i_s}= 0$ inside $Y$ to build $Y_{i_1}$. Similarly, build $Y_{i_j}$ from $Y$ by setting $x_{i_j}= 1$ and $x_{i_k}= 0$ for all $k \neq j$.

6) Now $X$ can be written in parametric vector form as $$X= X_0 + x_{i_1} Y_{i_1} + \cdots + x_{i_s} Y_{i_s}. $$
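The steps above can be checked numerically. Below is a minimal sketch with a small illustrative system (the matrix $A$, vector $b$, and the resulting $X_0$, $Y_3$ are examples chosen here, not taken from the notes): once $X_0$ and the homogeneous vectors are in hand, every choice of the free variable yields a solution of $AX = b$.

```python
import numpy as np

# Illustrative system whose augmented matrix is already in RREF:
#   x1 - 2*x3 = 4,   x2 + 3*x3 = 5   (x1, x2 basic; x3 free)
A = np.array([[1., 0., -2.],
              [0., 1.,  3.]])
b = np.array([4., 5.])

# Step 3: particular solution with the free variable x3 = 0.
X0 = np.array([4., 5., 0.])
# Step 5: homogeneous solution with x3 = 1.
Y3 = np.array([2., -3., 1.])

# Step 6: X = X0 + x3*Y3 solves AX = b for every value of x3.
for x3 in (0.0, 1.0, -2.5):
    X = X0 + x3 * Y3
    assert np.allclose(A @ X, b)
print("parametric form verified")
```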

II. Find basis of Null space:

Let $A$ be an $m \times n$-matrix and the null space $N(A)= \{X \in \mathbb R^n \mid A X= \vec 0 \}\subset \mathbb R^n$. To find a basis of $N(A)$, apply the steps of I to write the solution of $AX= \vec 0$ in parametric vector form: $$X= x_{i_1} Y_{i_1} + \cdots + x_{i_s} Y_{i_s}. $$ (Here $X_0 = \vec 0$ since the system is homogeneous.) Then $Y_{i_1}, \dots , Y_{i_s}$ is a basis of $N(A)$.
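A quick numerical cross-check of a null-space basis, sketched with the SVD rather than row reduction (the right-singular vectors belonging to zero singular values span $N(A)$); the matrix $A$ below is an illustrative example, not from the notes:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])   # rank 1, so dim N(A) = 3 - 1 = 2

# Right-singular vectors with (near-)zero singular values span N(A).
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
basis = Vt[rank:].T              # columns: an orthonormal basis of N(A)

assert basis.shape == (3, 2)     # s = n - rank basis vectors
assert np.allclose(A @ basis, 0.0)   # each basis vector solves AX = 0
```

Row reduction as in section I produces a different (non-orthonormal) basis of the same subspace; both are equally valid.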

III. Find basis of column space and row Space:

Let $A$ be an $m \times n$-matrix, written $A= (v_1 , \dots, v_n)$ with $v_j \in \mathbb R^m$ its columns. To find a basis of Col$(A)= \text{Span} (v_1, \dots , v_n)$ and of the row space of $A$ (the subspace of $\mathbb R_n$ spanned by the rows of $A$), we do the following steps:

1) Find an echelon form $E$ of $A$ and locate the columns of $E$ containing pivots; say the $j_1, \dots, j_r$-th columns of $E$ have pivots.

2) Then the original columns of $A$, namely $v_{j_1}, \dots , v_{j_r}$, form a basis of Col$(A)$.

3) The nonzero rows of $E$ form a basis of the row space of $A$.
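A sketch of step 2 with an illustrative matrix (chosen here so that column 3 = column 1 + column 2, giving pivots in columns 1 and 2): the pivot columns are independent and span the same space as all the columns.

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [2., 3., 5.]])    # col 3 = col 1 + col 2, so rank(A) = 2

r = np.linalg.matrix_rank(A)
pivot_cols = A[:, [0, 1]]       # the original columns v_1, v_2

assert r == 2
# v_1, v_2 are linearly independent...
assert np.linalg.matrix_rank(pivot_cols) == 2
# ...and adjoining the remaining columns does not increase the rank,
# so Span(v_1, v_2) = Col(A).
assert np.linalg.matrix_rank(np.hstack([pivot_cols, A])) == 2
print("pivot columns form a basis of Col(A)")
```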

IV. Full rank matrix:

Let $A \in \mathbb R ^{m \times n}$ be an $m \times n$-matrix. $A$ has full rank if rank$(A)=\min\{m, n\}$. If $m = n$, we know $A$ has full rank if and only if $A$ is invertible. The following discusses the case $m \neq n$. Let $T_A : \mathbb R^n \to \mathbb R^m$ be the linear transformation given by $T_A(X) = AX$ for all $X \in \mathbb R^n$.

Case I, $m < n$ and rank$(A)= m$. Then

1) $\rm{Col}(A) = \mathbb R^m$.

2) $\dim$Nul$(A)= n-m > 0$.

3) $\forall b \in \mathbb R^m$, $AX=b$ always has a solution, in fact infinitely many solutions.

4) No zero rows appear in the REF of $A$.

5) The linear transformation $T_A : \mathbb R^n \to \mathbb R^m$ is onto.

6) Rows of $A$ are linearly independent.
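Properties 1)–3) of Case I can be checked on a small example (the $2 \times 3$ matrix and right-hand side below are illustrative):

```python
import numpy as np

# A wide matrix with full row rank: m = 2 < n = 3, rank(A) = 2.
A = np.array([[1., 0., 1.],
              [0., 1., 2.]])
m, n = A.shape

assert np.linalg.matrix_rank(A) == m      # full rank: rank = m

# dim Nul(A) = n - m > 0 (count zero singular values).
s = np.linalg.svd(A, compute_uv=False)
assert n - int(np.sum(s > 1e-10)) == n - m == 1

# AX = b is solvable for every b (T_A is onto); lstsq returns one
# solution, and adding any null-space vector gives infinitely many.
b = np.array([3., -1.])
X, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(A @ X, b)
```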

Case II, $m > n$ and rank$(A)= n$. Then

1) $\rm{Row}(A) = \mathbb R_n$.

2) $\dim$Nul$(A)= n-n = 0$.

3) $\forall b \in \mathbb R^m$, $AX= b$ may not have a solution; if it does, the solution is unique.

4) No free columns appear in the REF of $A$.

5) The linear transformation $T_A : \mathbb R^n \to \mathbb R^m$ is one-to-one.

6) Columns of $A$ are linearly independent.
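Case II can likewise be checked on a small example (the $3 \times 2$ matrix and the two right-hand sides below are illustrative): a $b$ in Col$(A)$ has a unique solution, while a $b$ outside Col$(A)$ has none.

```python
import numpy as np

# A tall matrix with full column rank: m = 3 > n = 2, rank(A) = 2.
A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
m, n = A.shape

assert np.linalg.matrix_rank(A) == n      # columns linearly independent

# b in Col(A): the (unique) solution is recovered exactly.
b_good = np.array([1., 2., 3.])           # = 1*col1 + 2*col2
X, *_ = np.linalg.lstsq(A, b_good, rcond=None)
assert np.allclose(A @ X, b_good)
assert np.allclose(X, [1., 2.])

# b not in Col(A): the augmented matrix has larger rank, so AX = b
# is inconsistent.
b_bad = np.array([1., 2., 0.])
assert np.linalg.matrix_rank(np.column_stack([A, b_bad])) == n + 1
```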

V. Steps to (orthogonal) diagonalize a (symmetric) matrix $A$:

1) Compute characteristic polynomial $f_A(t) = |A-tI_n|.$

2) Factorize $f_A(t)= (-1)^n (t-\lambda_1)^{k_1} \cdots (t-\lambda_m)^{k_m}$ so that $\lambda_1, \dots, \lambda_m$ are distinct. Then $\lambda_i$ are eigenvalues of $A$ with multiplicity $k_i$.

3) For each eigenvalue $\lambda_i$, solve homogeneous equation $(A-\lambda_iI_n)X = \vec 0$ to find a basis of the eigenspace $E_{\lambda_i}$: $$ v^{(i)}_1 , \cdots, v^{(i)}_{s_i}. $$

4) If one of $k_i > s_i$ then STOP, $A$ is not diagonalizable.

5) If all $k_i = s_i$ then $A$ is diagonalizable. Then $A= P \Lambda P^{-1}$ with $$\Lambda= \begin{pmatrix} \lambda_1 & & & & & & \\ & \ddots & & & & & \\ & & \lambda_1 & & & & \\ & & & \ddots & & & \\ & & & & \lambda_m & & \\ & & & & & \ddots & \\ & & & & & & \lambda_m \end{pmatrix} \ \ \ \text{and } \ \ \ P = (v^{(1)}_1, \cdots , v^{(1)}_{k_1}, \cdots, v^{(m)}_1 , \dots , v^{(m)}_{k_m}). $$
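Steps 1)–5) can be verified numerically; `numpy.linalg.eig` returns the eigenvalues and eigenvectors directly, so the sketch below (with an illustrative $2 \times 2$ matrix) only has to check $A = P \Lambda P^{-1}$:

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])        # eigenvalues 5 and 2 (distinct)

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors
Lam = np.diag(eigvals)

# A is diagonalizable iff P is invertible; then A = P Lam P^{-1}.
assert abs(np.linalg.det(P)) > 1e-10
assert np.allclose(A, P @ Lam @ np.linalg.inv(P))
```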

6) In case $A$ is symmetric, $A$ is always diagonalizable; that is, $k_i = s_i$ always holds. Moreover, we need to select an orthonormal basis for each $E_{\lambda_i}$. Thus we apply the Gram-Schmidt process to $ v^{(i)}_1 , \cdots, v^{(i)}_{k_i} $ to obtain an orthonormal basis $ u^{(i)}_1 , \cdots, u^{(i)}_{k_i} $ of $E_{\lambda_i}$. Then we obtain the orthogonal diagonalization of $A$: $$A = Q \Lambda Q^T$$ where $$\Lambda= \begin{pmatrix} \lambda_1 & & & & & & \\ & \ddots & & & & & \\ & & \lambda_1 & & & & \\ & & & \ddots & & & \\ & & & & \lambda_m & & \\ & & & & & \ddots & \\ & & & & & & \lambda_m \end{pmatrix} \ \ \ \text{and } \ \ \ Q = (u^{(1)}_1, \cdots , u^{(1)}_{k_1}, \cdots, u^{(m)}_1 , \dots , u^{(m)}_{k_m}). $$
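For symmetric matrices, `numpy.linalg.eigh` produces the orthogonal $Q$ directly, so step 6 reduces to checking $Q Q^T = I$ and $A = Q \Lambda Q^T$ (the matrix below is an illustrative example):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
assert np.allclose(A, A.T)                # A is symmetric

eigvals, Q = np.linalg.eigh(A)            # real eigenvalues, orthonormal Q
Lam = np.diag(eigvals)

assert np.allclose(Q @ Q.T, np.eye(2))    # Q is orthogonal
assert np.allclose(A, Q @ Lam @ Q.T)      # orthogonal diagonalization
```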