Linear Algebra Exercises-n-Answers.pdf
120 Linear Algebra, by Hefferon
and so there are infinitely many solutions to the matrix equation.
$$\{\,\begin{pmatrix} 1 & 0 & c \\ 0 & 1 & f \end{pmatrix} \bigm| c, f \in \mathbb{R}\,\}$$
With the bases still fixed at $\mathcal{E}_3$, $\mathcal{E}_2$, for instance taking $c = 2$ and $f = 3$ gives a matrix representing this map.
$$\begin{pmatrix} x \\ y \\ z \end{pmatrix} \xmapsto{f_{2,3}} \begin{pmatrix} x + 2z \\ y + 3z \end{pmatrix}$$
The check that $f_{2,3} \circ \eta$ is the identity map on $\mathbb{R}^2$ is easy.
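The composition can be verified numerically. Here $\eta$ is taken to be the embedding $(x, y) \mapsto (x, y, 0)$; that choice is an assumption, since $\eta$'s definition does not appear in this excerpt.

```python
import numpy as np

# Matrix of f_{2,3}: (x, y, z) |-> (x + 2z, y + 3z)
F = np.array([[1, 0, 2],
              [0, 1, 3]])

# Assumed matrix of eta: (x, y) |-> (x, y, 0)  (not shown in this excerpt)
H = np.array([[1, 0],
              [0, 1],
              [0, 0]])

# f_{2,3} composed with eta should act as the identity on R^2
print(F @ H)
```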
Three.IV.4.25 By Lemma 4.3 it cannot have infinitely many left inverses, because a matrix with both left and right inverses has only one of each (and that one of each is one of both: the left and right inverse matrices are equal).
Three.IV.4.26 The associativity of matrix multiplication gives on the one hand $H^{-1}(HG) = H^{-1}Z = Z$, and on the other that $H^{-1}(HG) = (H^{-1}H)G = IG = G$.
Three.IV.4.27 Multiply both sides of the first equation by $H$.
Three.IV.4.28 It is easy to check that when $I - T$ is multiplied on either side by that expression (assuming that $T^4$ is the zero matrix) the result is the identity matrix. The obvious generalization is that if $T^n$ is the zero matrix then $(I - T)^{-1} = I + T + T^2 + \cdots + T^{n-1}$; the check again is easy.
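A quick numerical sketch of the generalization, using a strictly upper triangular (hence nilpotent) $4 \times 4$ matrix as an illustrative $T$:

```python
import numpy as np

# A strictly upper triangular T is nilpotent: here T^4 is the zero matrix
T = np.array([[0, 1, 1, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]])
I = np.eye(4, dtype=int)
assert not np.any(np.linalg.matrix_power(T, 4))   # T^4 = Z

# The claimed inverse: (I - T)^{-1} = I + T + T^2 + T^3
S = I + T + np.linalg.matrix_power(T, 2) + np.linalg.matrix_power(T, 3)
print((I - T) @ S)   # the identity matrix
```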
Three.IV.4.29 The powers of the matrix are formed by taking the powers of the diagonal entries. That is, $D^2$ is all zeros except for diagonal entries of $d_{1,1}^2$, $d_{2,2}^2$, etc. This suggests defining $D^0$ to be the identity matrix.
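A small check of the diagonal-power observation, with illustrative diagonal entries 2, 3, 5:

```python
import numpy as np

D = np.diag([2.0, 3.0, 5.0])
# D^2 is again diagonal, with entries d_{1,1}^2, d_{2,2}^2, d_{3,3}^2
D2 = np.linalg.matrix_power(D, 2)
print(np.diag(D2))                     # squares of the diagonal entries
# and D^0 is the identity, matching the suggested definition
print(np.linalg.matrix_power(D, 0))
```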
Three.IV.4.30 Assume that B is row equivalent to A and that A is invertible. Because they are row-equivalent, there is a sequence of row steps to reduce one to the other. That reduction can be done with matrices; for instance, A can be changed by row operations to B as $B = R_n \cdots R_1 A$. This equation gives B as a product of invertible matrices and by Lemma 4.5 then, B is also invertible.
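The argument can be sketched with two concrete elementary matrices; the particular matrices below (a row swap and a row combination) are illustrative.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])     # invertible: determinant 1
R1 = np.array([[0.0, 1.0],     # elementary matrix: swap the two rows
               [1.0, 0.0]])
R2 = np.array([[1.0, 0.0],     # elementary matrix: add 3*(row 1) to row 2
               [3.0, 1.0]])

B = R2 @ R1 @ A                # B is row equivalent to A
# a product of invertible matrices, so B is invertible (nonzero determinant)
print(np.linalg.det(B))
```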
Three.IV.4.31 (a) See the answer to Exercise 28.
(b) We will show that both conditions are equivalent to the condition that the two matrices be nonsingular.
As $T$ and $S$ are square and their product is defined, they are equal-sized, say $n \times n$. Consider the $TS = I$ half. By the prior item the rank of $I$ is less than or equal to the minimum of the rank of $T$ and the rank of $S$. But the rank of $I$ is $n$, so the rank of $T$ and the rank of $S$ must each be $n$. Hence each is nonsingular.
The same argument shows that $ST = I$ implies that each is nonsingular.
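The rank argument can be sampled numerically for one illustrative pair with $TS = I$:

```python
import numpy as np

T = np.array([[1.0, 2.0],
              [0.0, 1.0]])
S = np.linalg.inv(T)                  # so TS = I here, with n = 2
assert np.allclose(T @ S, np.eye(2))

# rank(I) = n forces rank(T) = rank(S) = n, i.e. both are nonsingular
print(np.linalg.matrix_rank(T), np.linalg.matrix_rank(S))
```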
Three.IV.4.32 Inverses are unique, so we need only show that it works. The check appears above as Exercise 31.
Three.IV.4.33 (a) See the answer for Exercise 25.
(b) See the answer for Exercise 25.
(c) Apply the first part to $I = AA^{-1}$ to get $I = I^{\text{trans}} = (AA^{-1})^{\text{trans}} = (A^{-1})^{\text{trans}} A^{\text{trans}}$.
(d) Apply the prior item with $A^{\text{trans}} = A$, as $A$ is symmetric.
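Item (c), that the transpose of the inverse is the inverse of the transpose, can be spot-checked numerically; the particular matrix is illustrative.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])        # invertible: determinant -1
lhs = np.linalg.inv(A).T          # (A^{-1})^trans
rhs = np.linalg.inv(A.T)          # (A^trans)^{-1}
print(np.allclose(lhs, rhs))      # the two agree
```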
Three.IV.4.34 For the answer to the items making up the first half, see Exercise 30. For the proof in the second half, assume that A is a zero divisor, so there is a nonzero matrix B with $AB = Z$ (or else $BA = Z$; this case is similar). If A were invertible then $A^{-1}(AB) = (A^{-1}A)B = IB = B$ but also $A^{-1}(AB) = A^{-1}Z = Z$, contradicting that B is nonzero.
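For contrast, a singular matrix can be a zero divisor; the pair below is chosen for illustration.

```python
import numpy as np

A = np.array([[1, 0],
              [0, 0]])            # singular
B = np.array([[0, 0],
              [0, 1]])            # nonzero
print(A @ B)                      # the zero matrix, so A is a zero divisor
# An invertible A cannot do this: A^{-1}(AB) = B, so AB = Z would force B = Z.
```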
Three.IV.4.35 No, there are at least four.
$$\begin{pmatrix} \pm 1 & 0 \\ 0 & \pm 1 \end{pmatrix}$$
Three.IV.4.36
It is not reflexive since, for instance,
$$H = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$$
is not a two-sided inverse of itself. The same example shows that it is not transitive. That matrix has this two-sided inverse.
$$G = \begin{pmatrix} 1 & 0 \\ 0 & 1/2 \end{pmatrix}$$
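The two-sided inverse claim for this pair is easy to confirm numerically:

```python
import numpy as np

H = np.array([[1.0, 0.0],
              [0.0, 2.0]])
G = np.array([[1.0, 0.0],
              [0.0, 0.5]])

# G is a two-sided inverse of H: both products are the identity
print(G @ H)
print(H @ G)
```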