Linear Algebra Exercises-n-Answers.pdf
118 Linear Algebra, by Hefferon

As a check, note that the third column of the starting matrix is 3/2 times the second, and so it is indeed singular and therefore has no inverse.

Three.IV.4.17 We can use Corollary 4.12.
\[
\frac{1}{1\cdot 5-2\cdot 3}\cdot\begin{pmatrix}5&-3\\-2&1\end{pmatrix}
=\begin{pmatrix}-5&3\\2&-1\end{pmatrix}
\]

Three.IV.4.18 (a) The proof that the inverse is $(rH)^{-1}=r^{-1}H^{-1}=(1/r)\cdot H^{-1}$ (provided, of course, that the matrix is invertible) is easy.
(b) No. For one thing, the fact that $H+G$ has an inverse doesn't imply that $H$ has an inverse or that $G$ has an inverse. Neither of these matrices is invertible but their sum is.
\[
\begin{pmatrix}1&0\\0&0\end{pmatrix}\qquad\begin{pmatrix}0&0\\0&1\end{pmatrix}
\]
Another point is that just because $H$ and $G$ each has an inverse doesn't mean $H+G$ has an inverse; here is an example.
\[
\begin{pmatrix}1&0\\0&1\end{pmatrix}\qquad\begin{pmatrix}-1&0\\0&-1\end{pmatrix}
\]
Still a third point is that even if the two matrices have inverses and the sum has an inverse, the equation $(H+G)^{-1}=H^{-1}+G^{-1}$ need not hold:
\[
\begin{pmatrix}2&0\\0&2\end{pmatrix}^{-1}=\begin{pmatrix}1/2&0\\0&1/2\end{pmatrix}
\quad\text{and}\quad
\begin{pmatrix}3&0\\0&3\end{pmatrix}^{-1}=\begin{pmatrix}1/3&0\\0&1/3\end{pmatrix}
\]
but
\[
\begin{pmatrix}5&0\\0&5\end{pmatrix}^{-1}=\begin{pmatrix}1/5&0\\0&1/5\end{pmatrix}
\]
and $(1/2)+(1/3)$ does not equal $1/5$.

Three.IV.4.19 Yes: $T^k(T^{-1})^k=(TT\cdots T)\cdot(T^{-1}T^{-1}\cdots T^{-1})=T^{k-1}(TT^{-1})(T^{-1})^{k-1}=\cdots=I$.

Three.IV.4.20 Yes, the inverse of $H^{-1}$ is $H$.

Three.IV.4.21 One way to check that the first is true is with the angle sum formulas from trigonometry.
\[
\begin{pmatrix}\cos(\theta_1+\theta_2)&-\sin(\theta_1+\theta_2)\\ \sin(\theta_1+\theta_2)&\cos(\theta_1+\theta_2)\end{pmatrix}
=\begin{pmatrix}\cos\theta_1\cos\theta_2-\sin\theta_1\sin\theta_2 & -\sin\theta_1\cos\theta_2-\cos\theta_1\sin\theta_2\\ \sin\theta_1\cos\theta_2+\cos\theta_1\sin\theta_2 & \cos\theta_1\cos\theta_2-\sin\theta_1\sin\theta_2\end{pmatrix}
=\begin{pmatrix}\cos\theta_1&-\sin\theta_1\\ \sin\theta_1&\cos\theta_1\end{pmatrix}\begin{pmatrix}\cos\theta_2&-\sin\theta_2\\ \sin\theta_2&\cos\theta_2\end{pmatrix}
\]
Checking the second equation in this way is similar. Of course, the equations can be not just checked but also understood by recalling that $t_\theta$ is the map that rotates vectors about the origin through an angle of $\theta$ radians.

Three.IV.4.22 There are two cases. For the first case we assume that $a$ is nonzero.
Then
\[
\left(\begin{array}{cc|cc}a&b&1&0\\c&d&0&1\end{array}\right)
\xrightarrow{-(c/a)\rho_1+\rho_2}
\left(\begin{array}{cc|cc}a&b&1&0\\0&-(bc/a)+d&-c/a&1\end{array}\right)
=\left(\begin{array}{cc|cc}a&b&1&0\\0&(ad-bc)/a&-c/a&1\end{array}\right)
\]
shows that the matrix is invertible (in this $a\neq 0$ case) if and only if $ad-bc\neq 0$. To find the inverse, we finish with the Jordan half of the reduction.
\[
\xrightarrow[(a/(ad-bc))\rho_2]{(1/a)\rho_1}
\left(\begin{array}{cc|cc}1&b/a&1/a&0\\0&1&-c/(ad-bc)&a/(ad-bc)\end{array}\right)
\xrightarrow{-(b/a)\rho_2+\rho_1}
\left(\begin{array}{cc|cc}1&0&d/(ad-bc)&-b/(ad-bc)\\0&1&-c/(ad-bc)&a/(ad-bc)\end{array}\right)
\]
The other case is the $a=0$ case. We swap to get $c$ into the $1,1$ position.
\[
\xrightarrow{\rho_1\leftrightarrow\rho_2}
\left(\begin{array}{cc|cc}c&d&0&1\\0&b&1&0\end{array}\right)
\]
This matrix is nonsingular if and only if both $b$ and $c$ are nonzero (which, under the case assumption that $a=0$, holds if and only if $ad-bc\neq 0$). To find the inverse we do the Jordan half.
\[
\xrightarrow[(1/b)\rho_2]{(1/c)\rho_1}
\left(\begin{array}{cc|cc}1&d/c&0&1/c\\0&1&1/b&0\end{array}\right)
\xrightarrow{-(d/c)\rho_2+\rho_1}
\left(\begin{array}{cc|cc}1&0&-d/(bc)&1/c\\0&1&1/b&0\end{array}\right)
\]
(Note that this is what is required, since $a=0$ gives that $ad-bc=-bc$.)
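The formula just derived can be checked numerically. Below is a minimal sketch in Python (the helper names `inverse_2x2` and `mat_mul_2x2` are ours, not the book's), using exact rational arithmetic so the check against the Corollary 4.12 example is not clouded by rounding.

```python
from fractions import Fraction

def inverse_2x2(a, b, c, d):
    """Inverse of ((a, b), (c, d)) via the ad - bc formula; None if singular."""
    det = a * d - b * c
    if det == 0:
        return None
    return ((d / det, -b / det),
            (-c / det, a / det))

def mat_mul_2x2(m, n):
    """Product of two 2x2 matrices given as nested tuples."""
    return tuple(
        tuple(sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2))
        for i in range(2)
    )

# The Corollary 4.12 example above: det = 1*5 - 3*2 = -1.
a, b, c, d = (Fraction(x) for x in (1, 3, 2, 5))
inv = inverse_2x2(a, b, c, d)
assert inv == ((-5, 3), (2, -1))

# Multiplying back gives the identity, as the derivation promises.
assert mat_mul_2x2(((a, b), (c, d)), inv) == ((1, 0), (0, 1))

# A singular matrix (ad - bc = 0) is rejected, matching the
# "invertible if and only if ad - bc != 0" conclusion.
assert inverse_2x2(Fraction(1), Fraction(2), Fraction(2), Fraction(4)) is None
```

Note that the single formula covers both the $a\neq 0$ and $a=0$ cases worked out above.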
Answers to Exercises 119

Three.IV.4.23 With $H$ a $2\times 3$ matrix, in looking for a matrix $G$ such that the combination $HG$ acts as the $2\times 2$ identity we need $G$ to be $3\times 2$. Setting up the equation
\[
\begin{pmatrix}1&0&1\\0&1&0\end{pmatrix}
\begin{pmatrix}m&n\\p&q\\r&s\end{pmatrix}
=\begin{pmatrix}1&0\\0&1\end{pmatrix}
\]
and solving the resulting linear system
\[
\begin{array}{rcrcl}
m&+&r&=&1\\
n&+&s&=&0\\
&&p&=&0\\
&&q&=&1
\end{array}
\]
gives infinitely many solutions.
\[
\{\begin{pmatrix}m\\n\\p\\q\\r\\s\end{pmatrix}
=\begin{pmatrix}1\\0\\0\\1\\0\\0\end{pmatrix}
+r\cdot\begin{pmatrix}-1\\0\\0\\0\\1\\0\end{pmatrix}
+s\cdot\begin{pmatrix}0\\-1\\0\\0\\0\\1\end{pmatrix}
\;\big|\; r,s\in\mathbb{R}\}
\]
Thus $H$ has infinitely many right inverses. As for left inverses, the equation
\[
\begin{pmatrix}a&b\\c&d\\e&f\end{pmatrix}
\begin{pmatrix}1&0&1\\0&1&0\end{pmatrix}
=\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\end{pmatrix}
\]
gives rise to a linear system with nine equations and six unknowns.
\[
\begin{array}{rcl@{\qquad}rcl@{\qquad}rcl}
a&=&1 & b&=&0 & a&=&0\\
c&=&0 & d&=&1 & c&=&0\\
e&=&0 & f&=&0 & e&=&1
\end{array}
\]
This system is inconsistent (the first equation conflicts with the third, as do the seventh and ninth) and so there is no left inverse.

Three.IV.4.24 With respect to the standard bases we have
\[
\mathrm{Rep}_{\mathcal{E}_2,\mathcal{E}_3}(\eta)=\begin{pmatrix}1&0\\0&1\\0&0\end{pmatrix}
\]
and setting up the equation to find the matrix inverse
\[
\begin{pmatrix}a&b&c\\d&e&f\end{pmatrix}
\begin{pmatrix}1&0\\0&1\\0&0\end{pmatrix}
=\begin{pmatrix}1&0\\0&1\end{pmatrix}
=\mathrm{Rep}_{\mathcal{E}_2,\mathcal{E}_2}(\mathrm{id})
\]
gives rise to a linear system.
\[
a=1\quad b=0\qquad d=0\quad e=1
\]
There are infinitely many solutions in $a,\dots,f$ to this system because two of these variables are entirely unrestricted.
\[
\{\begin{pmatrix}a\\b\\c\\d\\e\\f\end{pmatrix}
=\begin{pmatrix}1\\0\\0\\0\\1\\0\end{pmatrix}
+c\cdot\begin{pmatrix}0\\0\\1\\0\\0\\0\end{pmatrix}
+f\cdot\begin{pmatrix}0\\0\\0\\0\\0\\1\end{pmatrix}
\;\big|\; c,f\in\mathbb{R}\}
\]
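The conclusion of Three.IV.4.23 can also be checked by computation. The sketch below (the helper names `mat_mul` and `right_inverse` are ours for illustration) verifies that every member of the solution family is a right inverse of $H$, and that none of them is a left inverse.

```python
from fractions import Fraction

def mat_mul(m, n):
    """Product of two matrices given as nested tuples."""
    return tuple(
        tuple(sum(m[i][k] * n[k][j] for k in range(len(n))) for j in range(len(n[0])))
        for i in range(len(m))
    )

H = ((1, 0, 1),
     (0, 1, 0))

def right_inverse(r, s):
    """A member of the Three.IV.4.23 family: m = 1 - r, n = -s, p = 0, q = 1."""
    return ((1 - r, -s),
            (0,     1),
            (r,     s))

I2 = ((1, 0), (0, 1))
I3 = ((1, 0, 0), (0, 1, 0), (0, 0, 1))

# Every choice of the parameters r, s gives H * G = I, a right inverse ...
for r, s in [(0, 0), (1, -2), (Fraction(1, 3), Fraction(5, 7))]:
    G = right_inverse(r, s)
    assert mat_mul(H, G) == I2
    # ... but G * H is a 3x3 matrix that is never the identity,
    # consistent with the inconsistent nine-equation system above.
    assert mat_mul(G, H) != I3
```

Spot checks like this do not replace the proof (they test only finitely many parameter choices), but they catch algebra slips in the solution family cheaply.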