Math 260, Fall 2018, Assignment 4
I was at the mathematical school, where the master taught his pupils after a method scarce imaginable to us in Europe. The proposition and demonstration were fairly written on a thin wafer, with ink composed of a cephalic tincture. This the student was to swallow upon a fasting stomach, and for three days following eat nothing but bread and water. As the wafer digested the tincture mounted to the brain, bearing the proposition along with it.
-- Jonathan Swift, Gulliver's Travels
Read:
- Section 2.1.
- Section 2.2.
Carefully define the following terms, then give one example and one non-example of each:
- Product (of two matrices).
- Matrix form (of a linear system).
- Linear (function, from $\mathbb{R}^m$ to $\mathbb{R}^n$).
Carefully state the following theorems (you do not need to prove them):
- Formula for the entries of the matrix product (this is Theorem 2.3.4 in the text).
- Formula for $A\vec{x}$ in terms of the columns of $A$ and the entries of $\vec{x}$ (this is Theorem 1.3.8 in the text).
- Theorem concerning commutativity of matrix multiplication (this is Theorem 2.3.3 in the text).
- Theorem concerning associativity of matrix multiplication (this is Theorem 2.3.6 in the text).
- Distributive properties of matrix multiplication (Theorem 2.3.7).
- Theorem relating matrix multiplication and scalar multiplication (Theorem 2.3.8).
- Theorem concerning the linearity of functions of the form $f(\vec{x})=A\vec{x}$.
- Theorem concerning the representability by matrices of linear functions.
Solve the following problems:
- Section 2.1, problems 1, 2, 3, 4, and 6.
- Section 2.2, problems 2 and 5.
- Section 2.3, problems 1, 2, 3, 4, 5, 6, 7, 8, 10, 11, and 13.
Questions:
Solutions:
Definitions:
- Given an $n\times m$ matrix $A$ and an $m\times l$ matrix $B$, the product $AB$ is the $n\times l$ matrix whose $(i,j)$ entry is the dot product of the $i$th row of $A$ with the $j$th column of $B$.
- Given a linear system $$\begin{align*}a_{1,1}x_1+\dots+a_{1,m}x_m&=b_1\\&\vdots\\a_{n,1}x_1+\dots+a_{n,m}x_m&=b_n,\end{align*}$$ put $$A=\begin{bmatrix}a_{1,1}&\dots&a_{1,m}\\\vdots&&\vdots\\a_{n,1}&\dots&a_{n,m}\end{bmatrix}\qquad\vec{x}=\begin{bmatrix}x_1\\\vdots\\x_m\end{bmatrix}\qquad\vec{b}=\begin{bmatrix}b_1\\\vdots\\b_n\end{bmatrix}.$$ Then the matrix form of the given system is the single matrix equation $A\vec{x}=\vec{b}$.
- A function $f:\mathbb{R}^m\rightarrow\mathbb{R}^n$ is said to be linear if, for any $\vec{x},\vec{y}\in\mathbb{R}^m$ and any scalar $k$, we have (i) $f(\vec{x}+\vec{y})=f(\vec{x})+f(\vec{y})$ and (ii) $f(k\vec{x})=kf(\vec{x})$.
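For illustration, here are examples and non-examples of the three definitions above (the specific matrices, systems, and functions are chosen here for illustration and are not taken from the text):
- Product: with $A=\begin{bmatrix}1&2\\3&4\end{bmatrix}$ and $B=\begin{bmatrix}5&6&7\\8&9&10\end{bmatrix}$ we have $$AB=\begin{bmatrix}1\cdot5+2\cdot8&1\cdot6+2\cdot9&1\cdot7+2\cdot10\\3\cdot5+4\cdot8&3\cdot6+4\cdot9&3\cdot7+4\cdot10\end{bmatrix}=\begin{bmatrix}21&24&27\\47&54&61\end{bmatrix}.$$ On the other hand, $BA$ is undefined, since the number of columns of $B$ (three) does not equal the number of rows of $A$ (two).
- Matrix form: the system $$\begin{align*}x_1+2x_2&=3\\4x_1+5x_2&=6\end{align*}$$ has matrix form $$\begin{bmatrix}1&2\\4&5\end{bmatrix}\begin{bmatrix}x_1\\x_2\end{bmatrix}=\begin{bmatrix}3\\6\end{bmatrix}.$$
- Linearity: the function $f:\mathbb{R}^2\rightarrow\mathbb{R}^2$ defined by $f(\vec{x})=\begin{bmatrix}x_1+2x_2\\3x_1\end{bmatrix}$ satisfies conditions (i) and (ii), so it is linear. The function $g:\mathbb{R}^2\rightarrow\mathbb{R}^2$ defined by $g(\vec{x})=\begin{bmatrix}x_1+1\\x_2\end{bmatrix}$ is not linear: for example, $g(2\vec{x})=\begin{bmatrix}2x_1+1\\2x_2\end{bmatrix}$ while $2g(\vec{x})=\begin{bmatrix}2x_1+2\\2x_2\end{bmatrix}$, so condition (ii) fails.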
Theorems:
- 7. If $f$ has the form $f(\vec{x})=A\vec{x}$ for some matrix $A$, then $f$ is linear.
- 8. If $f$ is linear, then there exists a unique matrix $A$ such that $f(\vec{x})=A\vec{x}$. In fact, the $j$th column of $A$ is just $f(\vec{e}_j)$.
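As an illustration of the last theorem (using the linear function $f$ from the examples above, which is chosen for illustration rather than taken from the text), let $f:\mathbb{R}^2\rightarrow\mathbb{R}^2$ be given by $f(\vec{x})=\begin{bmatrix}x_1+2x_2\\3x_1\end{bmatrix}$. Then $$f(\vec{e}_1)=\begin{bmatrix}1\\3\end{bmatrix}\qquad\text{and}\qquad f(\vec{e}_2)=\begin{bmatrix}2\\0\end{bmatrix},$$ so the representing matrix is $$A=\begin{bmatrix}1&2\\3&0\end{bmatrix},$$ and indeed $$A\vec{x}=\begin{bmatrix}1&2\\3&0\end{bmatrix}\begin{bmatrix}x_1\\x_2\end{bmatrix}=\begin{bmatrix}x_1+2x_2\\3x_1\end{bmatrix}=f(\vec{x}).$$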