Math 353 Homework

Problem Set 1, Due Thursday August 29th in Class

§1.2: 1, 13, 17; §1.3: 1, 8a,b,c, 18, 22; §1.4: 1, 4a,b, 5a,g, 13; and the following question:

Let $\mathbb R^+:= \{ x\in \mathbb R \mid x > 0 \}$ be the set of positive real numbers. Define addition $\oplus$ and scalar multiplication $\odot$ as follows: $$ x\oplus y = xy , \qquad a\odot x = x ^a \ \text{ for all } a \in \mathbb R. $$ Show that $\mathbb R^+$ is a vector space over $\mathbb R$.
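As a sanity check (not a substitute for verifying all the axioms), note that the "zero vector" of $\mathbb R^+$ is the number $1$, since $x \oplus 1 = x \cdot 1 = x$, and the additive inverse of $x$ is $1/x$, since $x \oplus (1/x) = 1$. For instance, one of the distributivity axioms reads $$ (a+b)\odot x = x^{a+b} = x^a x^b = (a \odot x) \oplus (b \odot x). $$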

Extra Credit: Due Sep. 5th

Let $V$ be a vector space over a field $F$. Assume that $W_1$ and $W_2$ are two subspaces of $V$. Is $W_1 \cup W_2$ always a subspace? If not, provide a necessary and sufficient condition for $W_1 \cup W_2$ to be a subspace, and explain why your condition works.

Solution: $W_1 \cup W_2$ is not always a subspace. Indeed, $W_1\cup W_2$ is a subspace if and only if $W_1 \subset W_2$ or $W_2 \subset W_1$. One direction is clear: if $W_1 \subset W_2$ or $W_2 \subset W_1$, then $W_1 \cup W_2$ equals $W_2$ or $W_1$, which is a subspace. Conversely, assume that $W=W_1 \cup W_2$ is a subspace. Either $W_1 \subset W_2$ or $W_1 \not \subset W_2$, and we only need to treat the case $W_1 \not \subset W_2$; in this case there exists $x \in W_1$ with $x \not \in W_2$. Now for every $y\in W_2$, we have $x+y \in W$ because $W$ is a subspace, so $x+y \in W_1$ or $x + y \in W_2$. But if $x+y \in W_2$, then $x = (x+y)-y \in W_2$, which contradicts the assumption that $x \not \in W_2$. So we must have $x+y \in W_1$, and then $y = (x+y) -x \in W_1$. This proves that $W_2 \subset W_1$.
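For a concrete counterexample (illustrative, not part of the graded solution), take $V = \mathbb R^2$ with $W_1 = \{(a,0) \mid a \in \mathbb R\}$ and $W_2 = \{(0,b) \mid b \in \mathbb R\}$. Neither axis contains the other, and the union fails closure under addition: $$ (1,0) + (0,1) = (1,1) \not\in W_1 \cup W_2 . $$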


Problem Set 2, Due Friday Sept. 5th in Class

§1.5: 1, 2c,h, 9, 13 (at the end of the first sentence, change "field of characteristic not equal to two" to "field inside $\mathbb C$ (the set of complex numbers)"); §1.6: 1, 2a,b, 3c, 12, 14, and the following question:

Let $V$ be a vector space over a field $F$ with $\dim_F V = n$, and let $S= \{w_1 , \dots , w_m\}\subset V$ be a set of linearly independent vectors. Use the Replacement Theorem (Theorem 1.10) to show that $m \leq n$. Furthermore, show that $m = n$ if and only if $S$ is a basis of $V$.


Problem Set 3, Due Thursday Sept. 12th in Class

§2.1: 1, 2, 4, 10, 14; §2.2: 1, 4, 8, 10.

Extra Credit: For Theorem 2.6, what happens if we do not require $\{v_1 , \dots , v_n\}$ to be a basis? What happens if we only require that $\{v_i\}$ spans $V$, or only that it is linearly independent?

Solution to the extra credit question: If we do not require $\{v_1 , \dots , v_n\}$ to be a basis, then the linear transformation may not exist, and even if it exists it may not be unique. Consider the two situations separately:

1) If $\{v_1 , \dots , v_n\}$ is linearly independent but not a basis: In this case, we can add vectors $v_{n+1}, \dots, v_m$ so that $\{v_1 , \dots , v_m\}$ forms a basis. Now apply Theorem 2.6 to the basis $v_1, \dots , v_m$ and the targets $w_1 , \dots , w_n, w_{n+1}, \dots , w_m $, where $w_{n+1}, \dots, w_m $ may be chosen to be any vectors in $W$. Then there exists a linear transformation $T$ with $T(v_i) = w_i$ for $i =1 , \dots , m $. Since the choice of $w_{n+1}, \dots, w_m $ is not unique, $T$ is not unique (as long as $W \neq \{\vec 0\}$).
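For instance (a concrete illustration of ours, not from the textbook), take $V = W = \mathbb R^2$ and the linearly independent set $\{v_1\} = \{e_1\}$. Extend it to the basis $\{e_1, e_2\}$; then for any fixed $w_1$, every choice of $w_2 \in \mathbb R^2$ gives a different linear map with $T(e_1) = w_1$: $$ T(a,b) = a\, w_1 + b\, w_2 . $$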

2) If $\{v_1 , \dots , v_n\}$ spans $V$ but is not a basis: In this case, the required linear transformation may not exist. For example, suppose $\{v_i\}$ contains two disjoint bases $\{u_1, \dots , u_m\}$ and $\{x_1, \dots , x_m\}$, and choose the targets $w_i$ so that each $u_i$ is sent to $y_i$ and each $x_i$ is sent to $\vec 0$, where not all $y_i = 0$. If Theorem 2.6 also held in this case, then $T$ restricted to the first basis $\{u_i\}$ would imply that $T$ is not the zero transformation, while $T$ restricted to the second basis $\{x_i\}$ would force $T= 0$. This is a contradiction, so in general $T$ does not exist. On the other hand, since $\{v_i\}$ spans $V$, it contains a basis, and a linear transformation is determined by its values on a basis. Therefore, if such a $T$ exists, it is necessarily unique.
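A minimal concrete instance (our example, not from the textbook): let $V = W = \mathbb R$ and $v_1 = 1$, $v_2 = 2$, which span $\mathbb R$ but are linearly dependent. Any linear $T$ must satisfy $$ T(v_2) = T(2 v_1) = 2\, T(v_1), $$ so the targets can be hit only when $w_2 = 2 w_1$: for $w_1 = 1$, $w_2 = 0$ no such $T$ exists, while any consistent choice of targets determines $T$ uniquely.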


Problem Set 4, Due Thursday Sept. 19th in Class

§2.3: 1, 3a, 11 ($T_0$ means the zero transformation, i.e., the one sending every vector of $V$ to $\vec 0$), 12; §2.4: 1, 3 (tr($A$) means the sum of the diagonal entries of $A$), 16, 17.

Problem Set 5, Due Tuesday Oct. 1st in Class

§2.5: 1, 2a,b, 3a,b, 4, 10; §3.1: 1, 2, 3; §3.2: 1, 2a,b,e, 7, 21.

Problem Set 6, Due Thursday Oct. 10th in Class

§3.3: 1, 2c,d, 4; §3.4: 1, 2a,c, 4a,c, 5, 9, 12; §4.1: 1, 2, 5, 7.

Extra Credit: Suppose $S := \{v_1, \dots, v_n\}$ spans $V$. In class we saw the following method to find a subset $S'\subset S$ that is a basis of $V$: Consider the vector equation $$\sum_{i = 1}^n x_i v_i = \vec 0,$$ and rewrite it as an equivalent system of linear equations $AX= 0$ (the $j$-th column of $A$ holds the coordinates of $v_j$ with respect to some basis of $V$). Let $E$ be the reduced echelon form of $A$. Then $$S'= \{v_j\in S \mid j\text{-th column of } E \text{ has a pivot}\}$$ forms a basis of $V$. Prove that this method is valid.

Proof: Suppose that $S'= \{v_{i_1}, \dots , v_{i_m} \}$. We need to show that $S'$ is linearly independent and that $S' \cup \{v_j\}$ is linearly dependent for every $v_j \not \in S'$ (the latter implies that each such $v_j$ lies in the span of $S'$, so $S'$ spans $V$ because $S$ does). To see this, recall that for any elementary matrix $Q$, the systems $AX= 0$ and $(QA)X= 0$ are equivalent; consequently, $AX= 0$ and $EX= 0$ have the same solutions. Note that $\{x_j \mid v_j \in S' \}$ are the "pivot" unknowns, and the remaining unknowns $x'_j$ are the "free" unknowns. In particular, each pivot unknown $x_j$ can be written as a linear combination of the free unknowns $x'_j$. Now consider the equation $$ \sum_{v_j \in S'} x_j v_j = \vec 0. \tag{1} $$ Solving (1) amounts to solving the system $x_1 v_1 + \cdots + x_n v_n = \vec 0$ with $x'_j = 0$ for every free unknown $x'_j$. Since each $x_j$ is a linear combination of the $x'_j$, this forces all $x_j= 0$. That is, equation (1) has only the zero solution, so $S'$ is linearly independent. Now suppose $v_i \not \in S'$ and consider the equation $$ \sum_{v_j \in S'} x_j v_j + x_i v_i = \vec 0. $$ Since each $x_j$ is a linear combination of the free unknowns and $x_i$ is one of the free unknowns, we can set $x_i = 1$ (and the other free unknowns to $0$) to get a nontrivial solution of the above equation. This shows that $S' \cup \{v_i\}$ is linearly dependent.
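To illustrate the method on a concrete example (ours, not from the textbook), take $V = \mathbb R^3$ and $v_1 = (1,1,0)$, $v_2 = (2,2,0)$, $v_3 = (0,1,1)$, $v_4 = (0,0,1)$, written as the columns of $A$. Then $$ A = \begin{pmatrix} 1 & 2 & 0 & 0 \\ 1 & 2 & 1 & 0 \\ 0 & 0 & 1 & 1 \end{pmatrix}, \qquad E = \begin{pmatrix} 1 & 2 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}. $$ The pivots sit in columns $1$, $3$, $4$, so $S' = \{v_1, v_3, v_4\}$ is a basis; the non-pivot column records the dependency $v_2 = 2 v_1$.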

Problem Set 7, Due Thursday Oct. 17th in Class

§4.2: 1, 2, 5, 8, 26; §4.3: 1, 2, 15 ($A$ is similar to $B$ if there exists an invertible matrix $S$ such that $A= S^{-1}B S$); §4.4: 1, 3a,b,g.

Extra Credit: §4.4: 6.

Proof: We aim to prove $$\begin{vmatrix} A & B \\ 0 & C \end{vmatrix}= |A|\, |C|.$$ If $A$ is a $1 \times 1$ matrix, this is an easy consequence of cofactor expansion along the first column, and an easy induction shows that the formula holds whenever $A$ is a diagonal matrix. Similarly, if $C$ is a $1 \times 1$ matrix, cofactor expansion along the last row shows that the formula holds, and an easy induction extends it to the case where $C$ is a diagonal matrix.

Now let us treat the general case, where neither $A$ nor $C$ need be diagonal. Note that we have the factorization $$\begin{pmatrix} A & B \\ 0 & C \end{pmatrix}=\begin{pmatrix} I_m & 0 \\ 0 & C \end{pmatrix} \begin{pmatrix} A & B \\ 0 & I_k \end{pmatrix}.$$ So, since the determinant is multiplicative, the two special cases above give $$\begin{vmatrix} A & B \\ 0 & C \end{vmatrix}=\begin{vmatrix} I_m & 0 \\ 0 & C \end{vmatrix}\, \begin{vmatrix} A & B \\ 0 & I_k \end{vmatrix}= |C|\, |A|. $$
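A quick numerical check (our own example): with $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$, $B = \begin{pmatrix} 7 \\ 8 \end{pmatrix}$, $C = (5)$, $$ \begin{vmatrix} 1 & 2 & 7 \\ 3 & 4 & 8 \\ 0 & 0 & 5 \end{vmatrix} = 5 \begin{vmatrix} 1 & 2 \\ 3 & 4 \end{vmatrix} = 5 \cdot (-2) = -10 = |A|\, |C|, $$ where the first equality is cofactor expansion along the last row.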


Problem Set 8, Due Tuesday Oct. 29th in Class

§5.1: 1, 3a,d, 5a,e, 13, 14; §5.2: 1, 2a,f, 3a,f, 7, 13; §6.1: 1, 3, 5, 10, 11, 17.


Problem Set 9, Due Thursday Nov. 14th in Class

§5.3: 2 a,d, 4, 13; §6.2: 1, 2 a,b,c,h, 3, 9, 15; §6.3: 19, 21, 23.

Problem Set 10, Due Thursday Nov. 21st in Class

§6.4: 1, 2 a,d, 3, 6; §6.5: 1, 2 c,d, 3, 9.

Problem Set 11, Due Tuesday Dec. 3rd in Class

§6.7: 1b,c,e, 3b,c,e,f.

Extra Credit: Let $V$ be an inner product space over $\mathbb C$ and $T: V \rightarrow V$ a linear operator. Suppose that $|| T(x)|| = ||x|| $ for all $x \in V$. Prove that $\langle T(x), T(y) \rangle= \langle x, y\rangle$ for all $x, y \in V$.
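Hint (a sketch of one standard approach, assuming the inner product is linear in the first slot, as in the textbook): use the complex polarization identity $$ \langle x, y \rangle = \frac{1}{4}\sum_{k=0}^{3} i^k\, || x + i^k y ||^2 , $$ which can be checked by expanding each $|| x + i^k y ||^2 = ||x||^2 + ||y||^2 + 2\,\mathrm{Re}\big(\overline{i^k}\langle x, y\rangle\big)$. Applying the identity to $T(x), T(y)$ and using $T(x + i^k y) = T(x) + i^k T(y)$ together with $||T(z)|| = ||z||$ gives the claim.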