Math 260, Fall 2019, Assignment 4
From cartan.math.umb.edu
Revision as of 20:27, 26 September 2019 by Steven.Jackson
I was at the mathematical school, where the master taught his pupils after a method scarce imaginable to us in Europe. The proposition and demonstration were fairly written on a thin wafer, with ink composed of a cephalic tincture. This the student was to swallow upon a fasting stomach, and for three days following eat nothing but bread and water. As the wafer digested the tincture mounted to the brain, bearing the proposition along with it.
-- Jonathan Swift, Gulliver's Travels
Read:
- Section 2.3 (note that the material on block matrices and transition matrices is optional).
Carefully define the following terms, then give one example and one non-example of each:
- Orthogonal (vectors).
- Addition (of matrices).
- Scalar multiplication (of matrices).
- Product (of matrices).
- Commute (i.e. "two specific matrices $A$ and $B$ are said to commute when...").
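As a concrete check on the last definition, the pair of matrices below (a standard non-example, not taken from the textbook) shows that two matrices need not commute; a minimal sketch in plain Python, with matrices as nested lists:

```python
def matmul(X, Y):
    # (X Y)_{ij} = sum_k X_{ik} Y_{kj} for square matrices of the same size
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]

print(matmul(A, B))   # [[2, 1], [1, 1]]
print(matmul(B, A))   # [[1, 1], [1, 2]]
```

Since the two products differ, this $A$ and $B$ do not commute; by contrast, any matrix commutes with the identity matrix of the same size.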
Carefully state the following theorems (you do not need to prove them):
- Properties of the dot product (i.e. valid rewritings of the expressions $\vec{u}\cdot(\vec{v}+\vec{w})$, $(c\vec{u})\cdot\vec{v}$, $\vec{v}\cdot\vec{u}$, and $\vec{v}\cdot\vec{v}$).
- Law of cosines (in vector form; also known as the "geometric interpretation of the dot product").
- Formula for the angle between two non-zero vectors.
- Procedure to determine whether two non-zero vectors make an acute, obtuse, or right angle.
- Theorem concerning the associativity of matrix multiplication.
- Theorem concerning the commutativity of matrix multiplication.
- Distributive laws for matrix multiplication.
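The angle formula and the acute/obtuse/right test can be checked numerically. A minimal sketch in plain Python (the function names are made up for illustration; the formula $\theta=\arccos\bigl(\vec{u}\cdot\vec{v}/(\|\vec{u}\|\,\|\vec{v}\|)\bigr)$ and the sign test on $\vec{u}\cdot\vec{v}$ are the statements asked for above):

```python
import math

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def angle(u, v):
    # theta = arccos( u.v / (|u| |v|) ), defined for non-zero u and v
    return math.acos(dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v))))

def classify(u, v):
    # the sign of u.v decides the kind of angle:
    # positive -> acute, zero -> right, negative -> obtuse
    d = dot(u, v)
    return "acute" if d > 0 else "right" if d == 0 else "obtuse"

print(classify([1, 0], [0, 1]))                       # right
print(round(math.degrees(angle([1, 0], [1, 1])), 1))  # 45.0
```

Note that the sign test needs only the dot product itself, with no square roots or inverse cosines, which is why it is stated as a separate procedure.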
Solve the following problems:
- Section 5.1, problems 4, 5, 6, 7, 8, 9, and 10.
- Section 2.3, problems 1, 3, 5, 7, 9, 11, 13, and 17. (Hint for 17: if $B$ commutes with $A$, then $B$ must be a $2\times 2$ matrix, so write it as $B=\begin{bmatrix}b_{1,1}&b_{1,2}\\b_{2,1}&b_{2,2}\end{bmatrix}$. Then the condition $AB=BA$ is equivalent to a certain system of linear equations in the four unknowns $b_{1,1},b_{1,2},b_{2,1},b_{2,2}$. You can find all solutions of this system using Gauss-Jordan elimination. The set of all matrices commuting with a given matrix $A$ is called the centralizer of $A$; this exercise shows that the Gauss-Jordan algorithm can be used to compute centralizers.)
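The hint can be carried out mechanically. The sketch below (plain Python over exact fractions) uses a made-up matrix $A$, not the one in problem 17, so it spoils nothing; it builds the linear system $AB-BA=0$ in the unknowns $b_{1,1},b_{1,2},b_{2,1},b_{2,2}$ and row-reduces it:

```python
from fractions import Fraction

# A hypothetical 2x2 matrix (NOT the matrix from the textbook's problem 17);
# the same steps work for any 2x2 choice of A.
A = [[Fraction(1), Fraction(2)],
     [Fraction(3), Fraction(4)]]

# Each entry of B is a coefficient vector over the unknowns (b11, b12, b21, b22).
def unit(k):
    v = [Fraction(0)] * 4
    v[k] = Fraction(1)
    return v

B = [[unit(0), unit(1)],
     [unit(2), unit(3)]]

def add(u, v):
    return [x + y for x, y in zip(u, v)]

def scale(c, v):
    return [c * x for x in v]

# (A B)_{ij} = sum_k A[i][k] B[k][j], computed entrywise on the
# symbolic coefficient vectors; similarly for B A.
def num_times_sym(A, B):
    return [[add(scale(A[i][0], B[0][j]), scale(A[i][1], B[1][j]))
             for j in range(2)] for i in range(2)]

def sym_times_num(B, A):
    return [[add(scale(A[0][j], B[i][0]), scale(A[1][j], B[i][1]))
             for j in range(2)] for i in range(2)]

AB = num_times_sym(A, B)
BA = sym_times_num(B, A)

# AB = BA is a homogeneous linear system: one equation per matrix entry,
# so a 4x4 coefficient matrix.
C = [add(AB[i][j], scale(Fraction(-1), BA[i][j]))
     for i in range(2) for j in range(2)]

def rref(M):
    # Gauss-Jordan elimination with exact arithmetic
    M = [row[:] for row in M]
    r = 0
    for c in range(4):
        piv = next((i for i in range(r, 4) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(4):
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        r += 1
    return M

R = rref(C)
for row in R:
    print([str(x) for x in row])
```

For this $A$ the reduced system has two pivot rows and two zero rows, so the centralizer is two-dimensional ($b_{1,1}$ and $b_{1,2}$ are determined by the free unknowns $b_{2,1},b_{2,2}$); the textbook's matrix will give its own reduced system, but the procedure is identical.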