126 Linear Algebra, by Hefferon

As in the prior item, a check provides some confidence that this calculation was performed without mistakes. We can, for instance, fix the vector
\[ \vec{v} = \begin{pmatrix} -1 \\ 2 \end{pmatrix} \]
(this is selected for no reason, out of thin air). Now we have
\[ \operatorname{Rep}_{B,D}(t)\cdot\operatorname{Rep}_{B}(\vec{v}) = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} -1 \\ 2 \end{pmatrix} = \begin{pmatrix} 3 \\ 5 \end{pmatrix}_{D} \]
and so t(⃗v) is this vector.
\[ 3\cdot\begin{pmatrix} 1 \\ 1 \end{pmatrix} + 5\cdot\begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} 8 \\ -2 \end{pmatrix} \]
With respect to B̂, D̂ we first calculate, using Rep_{B̂}(⃗v) = (1, −2),
\[ \begin{pmatrix} -28/3 & -8/3 \\ 38/3 & 10/3 \end{pmatrix} \begin{pmatrix} 1 \\ -2 \end{pmatrix} = \begin{pmatrix} -4 \\ 6 \end{pmatrix}_{\hat{D}} \]
and, sure enough, that gives the same result for t(⃗v).
\[ -4\cdot\begin{pmatrix} 1 \\ 2 \end{pmatrix} + 6\cdot\begin{pmatrix} 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 8 \\ -2 \end{pmatrix} \]

Three.V.2.13 Where H and Ĥ are m×n, the matrix P is m×m while Q is n×n.

Three.V.2.14 Any n×n matrix is nonsingular if and only if it has rank n, that is, by Theorem 2.6, if and only if it is matrix equivalent to the n×n matrix whose diagonal is all ones.

Three.V.2.15 If PAQ = I then QPAQ = Q, so QPA = I, and so QP = A⁻¹.

Three.V.2.16 By the definition following Example 2.2, a matrix M is diagonalizable if it represents a transformation, M = Rep_{B,D}(t), with the property that there is some basis B̂ such that Rep_{B̂,B̂}(t) is a diagonal matrix; the starting and ending bases must be equal. But Theorem 2.6 says only that there are B̂ and D̂ such that we can change to a representation Rep_{B̂,D̂}(t) and get a diagonal matrix. We have no reason to suspect that we could pick the two, B̂ and D̂, so that they are equal.

Three.V.2.17 Yes. Row rank equals column rank, so the rank of the transpose equals the rank of the matrix. Same-sized matrices with equal ranks are matrix equivalent.

Three.V.2.18 Only a zero matrix has rank zero.

Three.V.2.19 For reflexivity, to show that any matrix is matrix equivalent to itself, take P and Q to be identity matrices. For symmetry, if H₁ = PH₂Q then H₂ = P⁻¹H₁Q⁻¹ (inverses exist because P and Q are nonsingular). Finally, for transitivity, assume that H₁ = P₂H₂Q₂ and that H₂ = P₃H₃Q₃.
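The two-representation check above can also be replayed numerically. Here is a minimal sketch; NumPy, and the explicit codomain bases D = {(1,1), (1,−1)} and D̂ = {(1,2), (2,1)} as read off from the expansions above, are assumptions of this sketch rather than part of the text.

```python
import numpy as np

# Apply Rep_{B,D}(t) to Rep_B(v) = (-1, 2), then expand against D = {(1,1), (1,-1)}
H = np.array([[1.0, 2.0], [3.0, 4.0]])
coords_D = H @ np.array([-1.0, 2.0])           # D-coordinates (3, 5)
t_v = coords_D[0] * np.array([1.0, 1.0]) + coords_D[1] * np.array([1.0, -1.0])

# Same map via the hatted bases: apply the other representation to (1, -2)
H_hat = np.array([[-28/3, -8/3], [38/3, 10/3]])
coords_Dhat = H_hat @ np.array([1.0, -2.0])    # D-hat coordinates (-4, 6)
t_v_hat = coords_Dhat[0] * np.array([1.0, 2.0]) + coords_Dhat[1] * np.array([2.0, 1.0])

# Both routes produce the same vector t(v) = (8, -2)
assert np.allclose(t_v, [8.0, -2.0]) and np.allclose(t_v_hat, t_v)
```

Agreement of the two expansions is exactly the consistency the prose check is after.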
Then substitution gives H₁ = P₂(P₃H₃Q₃)Q₂ = (P₂P₃)H₃(Q₃Q₂). A product of nonsingular matrices is nonsingular (we have shown that the product of invertible matrices is invertible; in fact, we have shown how to calculate the inverse), and so H₁ is matrix equivalent to H₃.

Three.V.2.20 By Theorem 2.6, a zero matrix is alone in its class because it is the only m×n matrix of rank zero. No other matrix is alone in its class: any nonzero scalar multiple of a matrix has the same rank as that matrix, so for a nonzero H the distinct matrix 2H lies in the same class.

Three.V.2.21 There are two matrix-equivalence classes of 1×1 matrices: those of rank zero and those of rank one. The 3×3 matrices fall into four matrix-equivalence classes, one for each of the possible ranks 0, 1, 2, and 3.

Three.V.2.22 For m×n matrices there is a class for each possible rank: where k is the minimum of m and n, there are classes for the matrices of rank 0, 1, ..., k. That's k + 1 classes. (Of course, totaling over all sizes of matrices we get infinitely many classes.)

Three.V.2.23 They are closed under nonzero scalar multiplication, since a nonzero scalar multiple of a matrix has the same rank as does the matrix. They are not closed under addition; for instance, H + (−H) has rank zero.

Three.V.2.24 (a) We have
\[ \operatorname{Rep}_{B,\mathcal{E}_2}(\text{id}) = \begin{pmatrix} 1 & -1 \\ 2 & -1 \end{pmatrix} \qquad \operatorname{Rep}_{\mathcal{E}_2,B}(\text{id}) = \operatorname{Rep}_{B,\mathcal{E}_2}(\text{id})^{-1} = \begin{pmatrix} -1 & 1 \\ -2 & 1 \end{pmatrix} \]
and thus the answer is this.
\[ \operatorname{Rep}_{B,B}(t) = \begin{pmatrix} -1 & 1 \\ -2 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 3 & -1 \end{pmatrix} \begin{pmatrix} 1 & -1 \\ 2 & -1 \end{pmatrix} = \begin{pmatrix} -2 & 0 \\ -5 & 2 \end{pmatrix} \]
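The triple product in 2.24(a) is easy to confirm by machine. A quick sketch, assuming NumPy:

```python
import numpy as np

M = np.array([[1.0, -1.0], [2.0, -1.0]])  # Rep_{B,E2}(id): columns are the B vectors
T = np.array([[1.0, 1.0], [3.0, -1.0]])   # the map with respect to the standard bases

# Change basis on both sides: Rep_{B,B}(t) = M^{-1} T M
rep_BB = np.linalg.inv(M) @ T @ M
assert np.allclose(rep_BB, [[-2.0, 0.0], [-5.0, 2.0]])
```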
Answers to Exercises 127

As a quick check, we can take a vector at random,
\[ \vec{v} = \begin{pmatrix} 4 \\ 5 \end{pmatrix} \]
giving
\[ \begin{pmatrix} 1 & 1 \\ 3 & -1 \end{pmatrix} \begin{pmatrix} 4 \\ 5 \end{pmatrix} = \begin{pmatrix} 9 \\ 7 \end{pmatrix} = t(\vec{v}) \]
while the calculation with respect to B, B
\[ \operatorname{Rep}_{B}(\vec{v}) = \begin{pmatrix} 1 \\ -3 \end{pmatrix}_{B} \qquad \begin{pmatrix} -2 & 0 \\ -5 & 2 \end{pmatrix} \begin{pmatrix} 1 \\ -3 \end{pmatrix} = \begin{pmatrix} -2 \\ -11 \end{pmatrix}_{B} \]
yields the same result.
\[ -2\cdot\begin{pmatrix} 1 \\ 2 \end{pmatrix} - 11\cdot\begin{pmatrix} -1 \\ -1 \end{pmatrix} = \begin{pmatrix} 9 \\ 7 \end{pmatrix} \]

(b) We have the arrow diagram

    R² w.r.t. E₂  --t-->  R² w.r.t. E₂      (matrix T)
        id ↓ Q                id ↓ Q
    R² w.r.t. B   --t-->  R² w.r.t. B       (matrix T̂)

so that Rep_{B,B}(t) = Rep_{E₂,B}(id) · T · Rep_{B,E₂}(id) and, as in the first item of this question,
\[ \operatorname{Rep}_{B,\mathcal{E}_2}(\text{id}) = \begin{pmatrix} \vec{\beta}_1 & \cdots & \vec{\beta}_n \end{pmatrix} \qquad \operatorname{Rep}_{\mathcal{E}_2,B}(\text{id}) = \operatorname{Rep}_{B,\mathcal{E}_2}(\text{id})^{-1} \]
so, writing Q for the matrix whose columns are the basis vectors, we have that Rep_{B,B}(t) = Q⁻¹TQ.

Three.V.2.25 (a) The adapted form of the arrow diagram is this.

    V w.r.t. B₁  --h-->  W w.r.t. D        (matrix H)
        id ↓ Q               id ↓ P
    V w.r.t. B₂  --h-->  W w.r.t. D        (matrix Ĥ)

Since there is no need to change bases in W (or, we can say that the change of basis matrix P is the identity), we have Rep_{B₂,D}(h) = Rep_{B₁,D}(h) · Q, where Q = Rep_{B₂,B₁}(id).

(b) Here is the arrow diagram.

    V w.r.t. B  --h-->  W w.r.t. D₁        (matrix H)
        id ↓ Q              id ↓ P
    V w.r.t. B  --h-->  W w.r.t. D₂        (matrix Ĥ)

We have that Rep_{B,D₂}(h) = P · Rep_{B,D₁}(h), where P = Rep_{D₁,D₂}(id).

Three.V.2.26 (a) Here is the arrow diagram, together with a version of that diagram for the inverse functions.

    V w.r.t. B  --h-->  W w.r.t. D        (matrix H)
        id ↓ Q              id ↓ P
    V w.r.t. B̂  --h-->  W w.r.t. D̂        (matrix Ĥ)

    V w.r.t. B  <--h⁻¹--  W w.r.t. D      (matrix H⁻¹)
        id ↓ Q                id ↓ P
    V w.r.t. B̂  <--h⁻¹--  W w.r.t. D̂      (matrix Ĥ⁻¹)

Yes, the inverses of the matrices represent the inverses of the maps. That is, we can move from the lower right to the lower left by moving up, then left, then down. In other words, where Ĥ = PHQ (with P and Q invertible) and H, Ĥ are invertible, Ĥ⁻¹ = Q⁻¹H⁻¹P⁻¹.

(b) Yes; this is the prior part repeated in different terms.

(c) No; we need another assumption: if H represents h with respect to the same starting and ending basis, B, B for some B, then H² represents h ∘ h.
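The identity Ĥ⁻¹ = Q⁻¹H⁻¹P⁻¹ from 2.26(a) can be spot-checked with concrete matrices. In this sketch the particular H, P, Q are arbitrary invertible choices of mine, not from the exercise:

```python
import numpy as np

# Arbitrary invertible 2x2 matrices (each has determinant 1)
H = np.array([[2.0, 1.0], [1.0, 1.0]])
P = np.array([[1.0, 1.0], [0.0, 1.0]])
Q = np.array([[1.0, 0.0], [2.0, 1.0]])

H_hat = P @ H @ Q  # a matrix equivalent to H

# Inverting H_hat = P H Q reverses and inverts the factors
assert np.allclose(np.linalg.inv(H_hat),
                   np.linalg.inv(Q) @ np.linalg.inv(H) @ np.linalg.inv(P))
```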
As a specific example, these two matrices are both of rank one and so they are matrix equivalent,
\[ \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \qquad \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} \]
but their squares are not matrix equivalent: the square of the first has rank one while the square of the second is the zero matrix, of rank zero.
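The counterexample can be confirmed with a rank computation. A short sketch, assuming NumPy:

```python
import numpy as np

A = np.array([[1, 0], [0, 0]])
B = np.array([[0, 0], [1, 0]])

# Equal rank, so A and B are matrix equivalent ...
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B) == 1
# ... but the squares have different ranks, so A^2 and B^2 are not
assert np.linalg.matrix_rank(A @ A) == 1
assert np.linalg.matrix_rank(B @ B) == 0
```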