
Lin Transform

1. A linear transformation T from a vector space V to a vector space W preserves vector addition and scalar multiplication.
2. The kernel of a linear transformation T, denoted ker(T), is the set of vectors in V that T maps to the zero vector of W.
3. The rank of a linear transformation T is the dimension of its image, i.e. the size of a largest linearly independent set of images under T.


Linear Transformation

Def: Let V and W be vector spaces over the same field F. We call a function
T : V → W a linear transformation from V to W if, for all x, y ∈ V and c ∈ F,
we have
a. T(x + y) = T(x) + T(y) (preserving vector addition)
b. T(cx) = cT(x) (preserving scalar multiplication)

Properties of a linear function T

1. If T is linear, then T(0_V) = 0_W.
2. T is linear iff T(cx + y) = cT(x) + T(y) for all x, y ∈ V and c ∈ F.
3. If T is linear, then T(x − y) = T(x) − T(y) for all x, y ∈ V.
4. T is linear iff, for x1, ..., xn ∈ V and a1, ..., an ∈ F,
T(a1x1 + ... + anxn) = a1T(x1) + ... + anT(xn)

Ex: Define T : R^2 → R^2 by T(a1, a2) = (2a1 + a2, a1).

To show that T is linear, let c ∈ R and x, y ∈ R^2, where x = (b1, b2) and y = (d1, d2). Then
cx + y = c(b1, b2) + (d1, d2) = (cb1 + d1, cb2 + d2)
T(cx + y) = (2(cb1 + d1) + cb2 + d2, cb1 + d1) = (2cb1 + 2d1 + cb2 + d2, cb1 + d1)
Also
cT(x) + T(y)
= cT(b1, b2) + T(d1, d2)
= c(2b1 + b2, b1) + (2d1 + d2, d1)
= (2cb1 + cb2, cb1) + (2d1 + d2, d1)
= (2cb1 + cb2 + 2d1 + d2, cb1 + d1)
The two results agree, so T(cx + y) = cT(x) + T(y) and T is linear.
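The identity T(cx + y) = cT(x) + T(y) can be spot-checked numerically. The following is an illustrative Python sketch (one sample point, not a proof; the helper names are ours, not from the notes):

```python
# Spot-check that T(a1, a2) = (2*a1 + a2, a1) satisfies T(c*x + y) = c*T(x) + T(y)
def T(v):
    a1, a2 = v
    return (2 * a1 + a2, a1)

def scale_add(c, x, y):
    # computes c*x + y componentwise
    return (c * x[0] + y[0], c * x[1] + y[1])

x, y, c = (3.0, -1.0), (2.0, 5.0), 4.0
lhs = T(scale_add(c, x, y))
rhs = tuple(c * tx + ty for tx, ty in zip(T(x), T(y)))
assert lhs == rhs
```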

Ex: For any angle θ, define T_θ : R^2 → R^2 by the rule: T_θ(a1, a2) is the
vector obtained by rotating (a1, a2) counterclockwise by θ if (a1, a2) ≠ (0, 0),
and T_θ(0, 0) = (0, 0). Then T_θ : R^2 → R^2 is a linear transformation that is
called rotation by θ.

Determine an explicit formula for T_θ. Fix a nonzero vector (a1, a2) ∈ R^2.
Let α be the angle (a1, a2) makes with the positive x-axis, and let r = (a1^2 + a2^2)^(1/2).
Then a1 = r cos α and a2 = r sin α.
T_θ(a1, a2) has length r and makes an angle α + θ with the positive x-axis, so
(a1', a2') = T_θ(a1, a2) = (r cos(α + θ), r sin(α + θ))
= (r cos α cos θ − r sin α sin θ, r cos α sin θ + r sin α cos θ)
= (a1 cos θ − a2 sin θ, a1 sin θ + a2 cos θ)
In matrix form,
[a1']   [cos θ  −sin θ] [a1]
[a2'] = [sin θ   cos θ] [a2]
This is the matrix representation of rotation by an angle θ in R^2.

To verify linearity directly, take (a1, a2), (b1, b2) ∈ R^2 and c ∈ R, and show
T_θ(c(a1, a2) + (b1, b2)) = cT_θ(a1, a2) + T_θ(b1, b2).
c(a1, a2) + (b1, b2) = (ca1, ca2) + (b1, b2) = (ca1 + b1, ca2 + b2)
so, according to the formula
T_θ(a1, a2) = (a1 cos θ − a2 sin θ, a1 sin θ + a2 cos θ)
we get
T_θ(c(a1, a2) + (b1, b2)) = ((ca1 + b1) cos θ − (ca2 + b2) sin θ, (ca1 + b1) sin θ + (ca2 + b2) cos θ)
On the other side,
cT_θ(a1, a2) = (ca1 cos θ − ca2 sin θ, ca1 sin θ + ca2 cos θ)
T_θ(b1, b2) = (b1 cos θ − b2 sin θ, b1 sin θ + b2 cos θ)
cT_θ(a1, a2) + T_θ(b1, b2)
= (ca1 cos θ − ca2 sin θ + b1 cos θ − b2 sin θ, ca1 sin θ + ca2 cos θ + b1 sin θ + b2 cos θ)
= ((ca1 + b1) cos θ − (ca2 + b2) sin θ, (ca1 + b1) sin θ + (ca2 + b2) cos θ)
which matches, so T_θ is linear.
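The rotation formula can be sanity-checked numerically for one angle and a pair of vectors. An illustrative Python sketch (sample values are ours):

```python
import math

# The rotation formula T_theta(a1, a2) = (a1*cos t - a2*sin t, a1*sin t + a2*cos t)
def rot(t, v):
    a1, a2 = v
    return (a1 * math.cos(t) - a2 * math.sin(t),
            a1 * math.sin(t) + a2 * math.cos(t))

t, c = 0.7, 3.0
x, y = (1.0, 2.0), (-4.0, 0.5)
# linearity: rot(c*x + y) == c*rot(x) + rot(y), up to float rounding
lhs = rot(t, (c * x[0] + y[0], c * x[1] + y[1]))
rhs = tuple(c * a + b for a, b in zip(rot(t, x), rot(t, y)))
assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
# rotation preserves the length r = (a1^2 + a2^2)^(1/2)
assert abs(math.hypot(*rot(t, x)) - math.hypot(*x)) < 1e-12
```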

Ex: Define T : R^2 → R^2 by T(a1, a2) = (a1, −a2). This is reflection about the x-axis:
[1  0] [a1]   [ a1]
[0 −1] [a2] = [−a2]

Ex: Define T : R^2 → R^2 by T(a1, a2) = (a1, 0). T is called projection on the x-axis.

Ex: Define T : M_{m×n}(F) → M_{n×m}(F) by T(A) = A^T, where A^T is the transpose of A.

Ex: Let V denote the set of all real-valued functions defined on the real line
that have derivatives of every order.
Define T : V → V by T(f) = f', the derivative of f. Let g, h ∈ V and a ∈ R.
Then T(ag + h) = (ag + h)' = ag' + h' = aT(g) + T(h), so T is linear.

Ex: Let V = C(R), the vector space of continuous real-valued functions on
R. Let a, b ∈ R, a < b. Define T : V → R by T(f) = ∫_a^b f(t) dt for all f ∈ V.

T is a linear transformation because the definite integral of a linear combination of functions equals the same linear combination of their definite integrals:
∫_a^b [cg(t) + h(t)] dt = c ∫_a^b g(t) dt + ∫_a^b h(t) dt
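Linearity of the integral can be illustrated numerically, approximating the definite integral by a midpoint Riemann sum. A Python sketch (the approximation scheme and sample functions are ours, not from the notes):

```python
# Numeric illustration of linearity of f -> integral of f over [a, b],
# using a midpoint Riemann sum as a stand-in for the exact integral.
def integral(f, a, b, n=10000):
    step = (b - a) / n
    return sum(f(a + (i + 0.5) * step) for i in range(n)) * step

g = lambda t: t * t          # g(t) = t^2
h_ = lambda t: 3 * t + 1     # h(t) = 3t + 1
a, b, c = 0.0, 2.0, 5.0
lhs = integral(lambda t: c * g(t) + h_(t), a, b)
rhs = c * integral(g, a, b) + integral(h_, a, b)
assert abs(lhs - rhs) < 1e-9
```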

Def: For vector spaces V and W over F, define
the identity transformation I_V : V → V by I_V(x) = x for all x ∈ V, and
the zero transformation T_0 : V → W by T_0(x) = 0 for all x ∈ V.

Def: Let V and W be vector spaces, and let T : V → W be linear. Define the
kernel (or nullspace) N(T) or ker(T) of T to be the set of all vectors x in V
such that T(x) = 0; that is, ker(T) = {x ∈ V : T(x) = 0}.

The kernel generalizes familiar notions:
linear system → homogeneous solution
coefficient matrix → nullspace
linear transformation → kernel

Def: The range (or image) R(T) or image(T) of T is
the subset of W consisting of all images (under T) of vectors in V; that is,
image(T) = {T(x) : x ∈ V}.
[Figure: the plane {x + y + z = 0} in R^3.]

Ex: Let V and W be vector spaces, and let I_V : V → V and T_0 : V → W
be the identity and zero transformations, respectively. Then ker(I_V) = {0},
image(I_V) = V, ker(T_0) = V, and image(T_0) = {0}.

Ex: T : R^2 → R^3 defined by Tx = Ax, where A is the 3 × 2 matrix with rows (1, 2), (3, 4), (5, 6):
T(x1, x2) = (x1 + 2x2, 3x1 + 4x2, 5x1 + 6x2)
Check on preservation of vector addition. Let x = (x1, x2) and x' = (x1', x2'). Then
T(x + x') = T(x1 + x1', x2 + x2')
= ((x1 + x1') + 2(x2 + x2'), 3(x1 + x1') + 4(x2 + x2'), 5(x1 + x1') + 6(x2 + x2'))
= (x1 + 2x2, 3x1 + 4x2, 5x1 + 6x2) + (x1' + 2x2', 3x1' + 4x2', 5x1' + 6x2')
= Tx + Tx'
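The additivity check above can be run numerically for sample vectors. An illustrative pure-Python sketch (sample values are ours):

```python
# Check that x -> Ax with A = [[1,2],[3,4],[5,6]] preserves vector addition
A = [[1, 2], [3, 4], [5, 6]]

def apply(M, x):
    # matrix-vector product: (Mx)_i = sum_j M[i][j] * x[j]
    return [sum(m * xj for m, xj in zip(row, x)) for row in M]

x, xp = [2, -1], [3, 7]
s = [u + v for u, v in zip(x, xp)]                      # x + x'
assert apply(A, s) == [u + v for u, v in zip(apply(A, x), apply(A, xp))]
```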

Generally, any m × n matrix gives rise to a linear transformation from R^n to R^m.

Ex: T : R^3 → R^3 given by Tx = 5x corresponds to the matrix
[5 0 0]
[0 5 0]
[0 0 5]

Ex: A permutation as a linear transformation. Take R^4 and consider
the operation of switching the first and third components:
T(x1, x2, x3, x4) = (x3, x2, x1, x4)
[0 0 1 0] [x1]   [x3]
[0 1 0 0] [x2]   [x2]
[1 0 0 0] [x3] = [x1]
[0 0 0 1] [x4]   [x4]
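The permutation and its matrix can be checked against each other. A small Python sketch (sample vector is ours):

```python
# The permutation T(x1,x2,x3,x4) = (x3,x2,x1,x4) as matrix multiplication
P = [[0, 0, 1, 0],
     [0, 1, 0, 0],
     [1, 0, 0, 0],
     [0, 0, 0, 1]]

def apply(M, x):
    # matrix-vector product
    return [sum(m * xj for m, xj in zip(row, x)) for row in M]

x = [10, 20, 30, 40]
assert apply(P, x) == [30, 20, 10, 40]   # first and third components swapped
```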

Ex: The right-shift operation as a linear transformation.
R^∞ is the space of all sequences; it contains elements (x1, x2, x3, x4, ...).
Define the right-shift operator U : R^∞ → R^∞ by
U(x1, x2, x3, x4, ...) = (0, x1, x2, x3, x4, ...)

Lemma: Let T : V → W be a linear transformation. Then T is one-to-one
iff (if and only if) ker(T) = {0}.
Proof:
Suppose that T is one-to-one; we need to show that ker(T) = {0}. First,
0 ∈ ker(T), because T(0_V) = 0_W. Now we show there are no other elements in
ker(T). Suppose for contradiction there were a nonzero vector v ∈ V with
v ∈ ker(T). Then Tv = 0 = T0, and since T is one-to-one this forces v = 0,
a contradiction.

Now suppose that ker(T) = {0}; we have to show that T is one-to-one, i.e.
that Tv = Tv' implies v = v'. Suppose Tv = Tv'. Then Tv − Tv' = 0, so
T(v − v') = 0, hence v − v' ∈ ker(T) = {0}. Thus v − v' = 0, i.e. v = v',
which shows T is one-to-one.

Def: The rank rank(T) of a linear transformation T : V → W is defined to
be the dimension of image(T); thus rank(T) = dim(image(T)).

Ex: The zero transformation has rank 0.

Ex: π(x1, x2, x3, x4, x5) := (x1, x2, x3, 0, 0)
image(π) = {(x1, x2, x3, 0, 0) : x1, x2, x3 ∈ R}
rank(π) = 3
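The rank of this projection can be computed from its matrix. The following Python sketch uses a small hand-rolled Gaussian-elimination rank routine (an assumption of ours for illustration, not part of the notes):

```python
# Rank of the projection (x1,...,x5) -> (x1,x2,x3,0,0) from its 5x5 matrix
def rank(M, eps=1e-9):
    # row-reduce a copy of M and count the pivots
    M = [row[:] for row in M]
    r = 0
    for col in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if abs(M[i][col]) > eps), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and abs(M[i][col]) > eps:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

PI = [[1, 0, 0, 0, 0],
      [0, 1, 0, 0, 0],
      [0, 0, 1, 0, 0],
      [0, 0, 0, 0, 0],
      [0, 0, 0, 0, 0]]
assert rank(PI) == 3
```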

The rank measures how much information, or how many degrees of freedom, the transformation T retains.

The Replacement Theorem

Let V be a vector space that is generated by a set G containing exactly n
vectors, and let L be a linearly independent subset of V containing exactly m
vectors. Then m ≤ n, and there exists a subset H of G containing n − m vectors
such that L ∪ H generates V.
Proof: By induction on m.
Base case m = 0: L = ∅, and so taking H = G gives the desired result.
Suppose the theorem is true for some m ≥ 0; we show it is true for m + 1.
Let L = {v1, ..., v(m+1)} be a linearly independent subset of V containing
m + 1 vectors.
Use the following theorem:
Let V be a vector space, and let S1 ⊆ S2 ⊆ V. If S1 is linearly dependent,
then S2 is linearly dependent; equivalently, if S2 is linearly independent, then
S1 is linearly independent.
In particular {v1, ..., vm} is linearly independent, so we may apply the induction
hypothesis to conclude that m ≤ n and that there is a subset {u1, ..., u(n−m)} of G
such that {v1, ..., vm} ∪ {u1, ..., u(n−m)} generates V. Thus there exist scalars
a1, ..., am, b1, ..., b(n−m) such that
a1v1 + ... + amvm + b1u1 + ... + b(n−m)u(n−m) = v(m+1)
If n − m = 0, this would make v(m+1) a linear combination of v1, ..., vm, and then
by the following theorem
Let S be a linearly independent subset of a vector space V, and let v be a
vector in V that is not in S. Then S ∪ {v} is linearly dependent if and only if
v ∈ span(S).

this contradicts the assumption that L is linearly independent. Hence n − m > 0;
that is, n ≥ m + 1. Moreover some bi is nonzero, say b1, and we can solve for u1:
u1 = (−a1/b1)v1 + (−a2/b1)v2 + ... + (−am/b1)vm + (1/b1)v(m+1)
+ (−b2/b1)u2 + ... + (−b(n−m)/b1)u(n−m)
Let H = {u2, ..., u(n−m)}. Then u1 ∈ span(L ∪ H), and because v1, ..., vm, u2, ..., u(n−m)
are in span(L ∪ H), it follows that
{v1, ..., vm, u1, ..., u(n−m)} ⊆ span(L ∪ H)
Because {v1, ..., vm, u1, ..., u(n−m)} generates V, the following theorem
The span of any subset S of a vector space V is a subspace of V that contains S.

implies that span(L ∪ H) = V. Since H is a subset of G that contains
(n − m) − 1 = n − (m + 1) vectors, the theorem is true for m + 1.

The Dimension Theorem

Let V be a finite-dimensional space, and let T : V → W be a linear transformation. Then
nullity(T) + rank(T) = dim(V)
This corresponds to the rank theorem (version 1):
(number of free variables) + (number of pivots) = (total number of variables)
Proof: By hypothesis, dim(V) is finite; let n := dim(V). Since ker(T) is a
subspace of V, it has some dimension k := dim(ker(T)) = nullity(T), with 0 ≤ k ≤ n.
We need to show k + rank(T) = n, i.e. dim(image(T)) = n − k.

ker(T) has a basis {v1, ..., vk} of k elements. Since this basis lies in ker(T),
it also lies in V, and its vectors are linearly independent; therefore it can be
extended to a basis of V, which must have n = dim(V) elements. That is, we can
add n − k extra vectors v(k+1), ..., vn to obtain a basis {v1, ..., vn} for V.

Since v(k+1), ..., vn lie in V, the elements Tv(k+1), ..., Tvn lie in image(T). We
claim that {Tv(k+1), ..., Tvn} is a basis for image(T), so that image(T) has
dimension n − k.

To verify that {Tv(k+1), ..., Tvn} forms a basis, we need to show that these
vectors span image(T) and are linearly independent. First, the span: pick any
vector w in image(T); we show w is a linear combination of Tv(k+1), ..., Tvn.
By definition of image, w = Tv for some v ∈ V. Write

v = a1v1 + ... + anvn for scalars a1, ..., an

Apply T:
w = Tv = (a1Tv1 + a2Tv2 + ... + akTvk) + a(k+1)Tv(k+1) + ... + anTvn

Since v1, ..., vk are in ker(T), we have Tv1 = ... = Tvk = 0, so

w = Tv = a(k+1)Tv(k+1) + ... + anTvn, and w is a linear combination of
Tv(k+1), ..., Tvn.

Now we show that {Tv(k+1), ..., Tvn} is linearly independent. Suppose for
contradiction that it is dependent:
a(k+1)Tv(k+1) + ... + anTvn = 0 for some scalars a(k+1), ..., an that are not all zero.
By linearity,
T(a(k+1)v(k+1) + ... + anvn) = 0
so a(k+1)v(k+1) + ... + anvn ∈ ker(T), which is spanned by {v1, ..., vk}; hence
a(k+1)v(k+1) + ... + anvn = a1v1 + ... + akvk for some scalars a1, ..., ak
−a1v1 − ... − akvk + a(k+1)v(k+1) + ... + anvn = 0
Since {v1, ..., vn} is linearly independent, all the coefficients must be zero,
contradicting the assumption that a(k+1), ..., an are not all zero. Thus
{Tv(k+1), ..., Tvn} must be linearly independent.

Ex: Let T : R^2 → R^2 denote the linear transformation
T(x, y) = (x + y, 2x + 2y) = (x + y, 2(x + y))
Find ker(T) and image(T).
ker(T) = {(x, y) ∈ R^2 : x + y = 0}
dim(ker(T)) = 1
image(T) = {(t, 2t) : t ∈ R}
dim(image(T)) = 1
dim(ker(T)) + dim(image(T)) = 1 + 1 = 2 = dim(R^2)
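The kernel and image claims in this example can be spot-checked numerically. An illustrative Python sketch (sample points are ours):

```python
# Dimension-theorem example: T(x, y) = (x + y, 2x + 2y)
def T(x, y):
    return (x + y, 2 * x + 2 * y)

# kernel: the line x + y = 0, spanned by (1, -1)
assert T(1, -1) == (0, 0)
# image: every output lies on the line (t, 2t)
outs = [T(x, y) for x in range(-2, 3) for y in range(-2, 3)]
assert all(b == 2 * a for a, b in outs)
# nullity 1 + rank 1 = dim(R^2) = 2
nullity, rank_ = 1, 1
assert nullity + rank_ == 2
```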

Lemma: Let V and W be finite-dimensional vector spaces of the same dimension,
dim(V) = dim(W), and let T : V → W be a linear transformation
from V to W. Then T is one-to-one iff T is onto.

Proof: If T is one-to-one, then nullity(T) = 0, so by the dimension theorem
rank(T) = dim(V). Since dim(V) = dim(W), we have dim(image(T)) = dim(W). Since
image(T) is a subspace of W of full dimension, image(T) = W; thus T is onto.
The converse runs the same argument in reverse: if T is onto, then
rank(T) = dim(W) = dim(V), so nullity(T) = 0 and T is one-to-one.

Theorem: Let T : V → W be a linear transformation, and let {v1, ..., vn} span
V. Then {Tv1, ..., Tvn} spans image(T).
Proof: Choose any vector w in image(T); we show that w can be written
as a linear combination of Tv1, ..., Tvn.
w = Tv for some v ∈ V. Since {v1, ..., vn} spans V, v = a1v1 + ... + anvn.
Applying T to both sides, w = Tv = T(a1v1 + ... + anvn) = a1Tv1 + ... + anTvn,
so w is a linear combination of Tv1, ..., Tvn.

Theorem: If T : V → W is a linear transformation that is one-to-one, and
{v1, ..., vn} is linearly independent, then {Tv1, ..., Tvn} is also linearly
independent.
Proof: Suppose 0 = a1Tv1 + ... + anTvn;
we need to show that a1, ..., an are all zero. By linearity,
0 = T(a1v1 + ... + anvn)
Since T is one-to-one, ker(T) = {0}, and thus
0 = a1v1 + ... + anvn
But {v1, ..., vn} is linearly independent, which implies a1, ..., an are all zero.

Corollary: If T : V → W is both one-to-one and onto, and {v1, ..., vn} is
a basis for V, then {Tv1, ..., Tvn} is a basis for W. In particular, dim(V) =
dim(W).
Proof: Since {v1, ..., vn} is a basis for V, it spans V, so {Tv1, ..., Tvn}
spans image(T). Since T is onto, image(T) = W.
Since {v1, ..., vn} is linearly independent and T is one-to-one, {Tv1, ..., Tvn}
is linearly independent. Both requirements for a basis are therefore satisfied.

A converse also holds: if T : V → W is one-to-one and {Tv1, ..., Tvn} is
a basis for W, then {v1, ..., vn} is a basis for V.

Theorem: Let V be a finite-dimensional vector space, and let {v1, ..., vn} be
a basis for V. Let W be another vector space, and let w1, ..., wn be some vectors
in W. Then there exists exactly one linear transformation T : V → W such
that Tvj = wj for each j = 1, ..., n.

Definition: Let V be a finite-dimensional vector space. An ordered basis of
V is an ordered sequence (v1, ..., vn) of vectors in V such that the set {v1, ..., vn}
is a basis.

Ex: The sequence ((1, 0, 0), (0, 1, 0), (0, 0, 1)) is an ordered basis for R^3; the
sequence ((0, 1, 0), (1, 0, 0), (0, 0, 1)) is a different ordered basis for R^3.

Ex: Let v := (3, 4, 5). If β := ((1, 0, 0), (0, 1, 0), (0, 0, 1)), then
[v]_β = (3, 4, 5)^T
and (3, 4, 5) = 3(1, 0, 0) + 4(0, 1, 0) + 5(0, 0, 1).

Ex: In P2(R), let f = 3x^2 + 4x + 6. If β := (1, x, x^2), then
[f]_β = (6, 4, 3)^T

Def: Let V and W be vector spaces, and let S : V → W and T : V → W be
two linear transformations from V to W. We define the sum S + T of these
transformations to be a third transformation S + T : V → W defined by
(S + T)(v) := Sv + Tv

Def: Let T : V → W be a linear transformation, and let c be a scalar.
We define the scalar multiple cT of c and T to be the transformation
cT : V → W defined by
(cT)(v) := c(Tv)

Def: Let L(V, W) be the space of linear transformations from V to W.

Lemma: The space L(V, W) is a subspace of F(V, W), the space of all functions
from V to W. In particular, L(V, W) is a vector space.

Def: Let U, V, W be vector spaces. Let S : V → W be a linear transformation
from V to W, and let T : U → V be a linear transformation from U to V. Then
we define the product or composition ST : U → W by
ST(u) := S(T(u))

Ex: Let U : R^∞ → R^∞ be the right-shift operator
U(x1, x2, ...) := (0, x1, x2, ...)
Then the operator UU = U^2 is given by
U^2(x1, x2, ...) = U(U(x1, x2, ...)) = U(0, x1, x2, ...) = (0, 0, x1, x2, ...)

Matrix Representation of Linear Transformations

Let V and W be finite-dimensional vector spaces, and let β := (v1, ..., vn) and
γ := (w1, ..., wm) be ordered bases for V and W, respectively; thus {v1, ..., vn}
is a basis for V and {w1, ..., wm} is a basis for W, so that V is n-dimensional
and W is m-dimensional. Let T be a linear transformation from V to W.

Ex: Let V = P3(R), W = P2(R), and T : V → W be the differentiation map
Tf := f'. We use the standard ordered basis β := (1, x, x^2, x^3) for V, and the
standard ordered basis γ := (1, x, x^2) for W.

Ex: Pick a v ∈ P3(R), say v = 3x^2 + 7x + 5. Then
[v]_β = (5, 7, 3, 0)^T
We have Tv = dv/dx = 6x + 7, so that
[Tv]_γ = (7, 6, 0)^T
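The differentiation example can be checked with an explicit matrix: in the bases β = (1, x, x^2, x^3) and γ = (1, x, x^2), the matrix of d/dx is determined by where it sends each basis polynomial. An illustrative Python sketch:

```python
# Matrix of T f = f' from P3(R) to P2(R) in the standard ordered bases
D = [[0, 1, 0, 0],   # d/dx(1) = 0, d/dx(x) = 1
     [0, 0, 2, 0],   # d/dx(x^2) = 2x
     [0, 0, 0, 3]]   # d/dx(x^3) = 3x^2

def apply(M, x):
    # matrix-vector product
    return [sum(m * xj for m, xj in zip(row, x)) for row in M]

v_beta = [5, 7, 3, 0]                  # coordinates of 3x^2 + 7x + 5
assert apply(D, v_beta) == [7, 6, 0]   # coordinates of 6x + 7
```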

In a general basis, write
v = x1v1 + ... + xnvn
Apply T to both sides:
Tv = x1Tv1 + ... + xnTvn
On the other hand, writing [Tv]_γ = (y1, ..., ym)^T, we have
Tv = y1w1 + ... + ymwm
The vectors Tv1, ..., Tvn lie in W, and so they are linear combinations of
w1, ..., wm:
Tv1 = a11w1 + a21w2 + ... + am1wm
Tv2 = a12w1 + a22w2 + ... + am2wm
...
Tvn = a1nw1 + a2nw2 + ... + amnwm

Lemma: Let V, W be finite-dimensional spaces with ordered bases β, γ respectively.
Let S : V → W and T : V → W be linear transformations from V
to W, and let c be a scalar. Then
[S + T]_β^γ = [S]_β^γ + [T]_β^γ
[cT]_β^γ = c[T]_β^γ
Proof of the scalar multiplication part:
Let β = (v1, ..., vn) and γ = (w1, ..., wm), and denote the matrix [T]_β^γ by
[a11 a12 ... a1n]
[a21 a22 ... a2n]
[ ...           ]
[am1 am2 ... amn]
so that
Tv1 = a11w1 + a21w2 + ... + am1wm
Tv2 = a12w1 + a22w2 + ... + am2wm
...
Tvn = a1nw1 + a2nw2 + ... + amnwm
Multiply by c:
(cT)v1 = ca11w1 + ca21w2 + ... + cam1wm
(cT)v2 = ca12w1 + ca22w2 + ... + cam2wm
...
(cT)vn = ca1nw1 + ca2nw2 + ... + camnwm
Hence
[cT]_β^γ =
[ca11 ca12 ... ca1n]
[ca21 ca22 ... ca2n]
[ ...              ]
[cam1 cam2 ... camn]
= c[T]_β^γ

Def: Let T : V → W be a linear transformation. We say that a linear
transformation S : W → V is the inverse of T if TS = I_W and ST = I_V.
We say that T is invertible if it has an inverse, and call the inverse T^(-1); thus
TT^(-1) = I_W and T^(-1)T = I_V.

Ex: Let T(x1, x2) = (x1 − x2, x1 + x2) and S(y1, y2) = (2y1, y2). Compute the
composition S∘T:
(S∘T)(x1, x2) = S(x1 − x2, x1 + x2) = (2(x1 − x2), x1 + x2)
which is the map
[2 −2] [x1]
[1  1] [x2]
In matrix form, T corresponds to
[1 −1]
[1  1]
and S to
[2 0]
[0 1]
and the composition corresponds to the matrix product
[2 0] [1 −1]   [2 −2]
[0 1] [1  1] = [1  1]
Note that ST ≠ I_2, so this S is not the inverse of T.
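The composition-as-matrix-product computation can be replayed numerically. An illustrative Python sketch, using the pair T(x1, x2) = (x1 − x2, x1 + x2), S(y1, y2) = (2y1, y2) from the example above (variable names are ours):

```python
# Matrix of S o T equals the matrix product S * T
def matmul(A, B):
    # standard matrix product (A is m x k, B is k x n)
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

T_m = [[1, -1], [1, 1]]   # matrix of T
S_m = [[2, 0], [0, 1]]    # matrix of S
assert matmul(S_m, T_m) == [[2, -2], [1, 1]]
```

Note that matrix multiplication is not commutative: the product in the other order gives a different matrix.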
Lemma: Let T : V → W be a linear transformation, and let S : W → V
and S' : W → V both be inverses of T. Then S = S'.
Proof: S = S I_W = S(TS') = (ST)S' = I_V S' = S'

Lemma: If T : V → W has an inverse S : W → V, then T must be one-to-one
and onto.
Proof: First we show that T is one-to-one. Suppose that Tv = Tv'; we have to
show that v = v'. Applying S to both sides we get STv = STv', i.e. I_V v = I_V v',
thus v = v' as desired. Now we show that T is onto. Let w ∈ W; we have to
find v such that Tv = w. But w = I_W w = TSw = T(Sw), so if we let v := Sw
then we have Tv = w as desired.
The converse of this lemma is also true.

Lemma: If T : V → W is a one-to-one and onto linear transformation, then it
has an inverse S : W → V, which is also a linear transformation.
Proof: Let T : V → W be one-to-one and onto. Let w be an element of W.
Since T is onto, we have w = Tv for some v in V; since T is one-to-one, this v
is unique. Define Sw to be this v; thus S is a transformation from W to
V. For any w ∈ W, we have w = Tv and Sw = v for some v ∈ V, and hence
TSw = w; thus TS is the identity I_W.
Now we show that ST = I_V, i.e. that for every v ∈ V we have STv = v. Since
we already know that TS = I_W, we have TSw = w for all w ∈ W. In
particular, since Tv ∈ W, we have T(STv) = Tv. But since T is one-to-one, this
implies that STv = v as desired.
Finally we show that S is linear, i.e. that it preserves addition and scalar
multiplication. We will just show that it preserves addition. Let w, w' ∈ W; we need
to show that S(w + w') = Sw + Sw'.
T(S(w + w')) = TS(w + w') = I_W(w + w') = I_W w + I_W w'
= TSw + TSw' = T(Sw + Sw')
Since T is one-to-one, this implies that S(w + w') = Sw + Sw' as desired.
Thus a linear transformation is invertible if and only if it is one-to-one and onto.
Invertible linear transformations are also known as isomorphisms.

Def: Two vector spaces V and W are said to be isomorphic if there is an
invertible linear transformation T : V → W from one space to the other.

Example: The map T : R^3 → P2(R) defined by
T(a, b, c) := ax^2 + bx + c
is linear, one-to-one, and onto, and hence an isomorphism. Thus R^3 and
P2(R) are isomorphic.
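The isomorphism R^3 ≅ P2(R) can be modeled concretely by representing a polynomial c0 + c1·x + c2·x^2 as its coefficient tuple (c0, c1, c2). An illustrative Python sketch (the encoding is our assumption):

```python
# T(a, b, c) = a*x^2 + b*x + c, with polynomials as coefficient tuples (c0, c1, c2)
def T(a, b, c):
    return (c, b, a)       # coefficients of a*x^2 + b*x + c in the basis (1, x, x^2)

def T_inv(p):
    # the inverse map: read the coefficients back off
    c0, c1, c2 = p
    return (c2, c1, c0)

v = (3, 4, 6)
assert T_inv(T(*v)) == v           # S o T = identity on R^3
assert T(*T_inv((6, 4, 3))) == (6, 4, 3)   # T o S = identity on P2(R)
```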

Lemma: Two finite-dimensional spaces V and W are isomorphic if and only
if dim(V) = dim(W).
Proof: If V and W are isomorphic, then there is an invertible linear transformation
T : V → W from V to W, which by the lemma on one-to-one and
onto properties of invertible linear transformations is one-to-one and onto. Since
T is one-to-one, nullity(T) = 0. Since T is onto, rank(T) = dim(W). By the
dimension theorem we have dim(V) = nullity(T) + rank(T) = dim(W).

Now suppose that dim(V) and dim(W) are equal, say dim(V) = dim(W) = n.
Then V has a basis (v1, ..., vn), and W has a basis {w1, ..., wn}.
We can find a linear transformation T : V → W such that Tv1 = w1, ..., Tvn = wn.
The set {Tv1, ..., Tvn} = {w1, ..., wn} must span image(T). But since w1, ..., wn
span W, we have image(T) = W, so T is onto; because dim(V) = dim(W), T is
therefore also one-to-one, and hence is an isomorphism. Thus V and W are
isomorphic.
Ex: If
A = [2 0 0]
    [0 3 0]
    [0 0 4]
then
A^(-1) = [1/2  0   0 ]
         [ 0  1/3  0 ]
         [ 0   0  1/4]
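That a diagonal matrix is inverted by inverting its diagonal entries can be confirmed numerically. An illustrative Python sketch:

```python
# Check that A = diag(2, 3, 4) has inverse diag(1/2, 1/3, 1/4)
def matmul(A, B):
    # 3x3 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A = [[2, 0, 0], [0, 3, 0], [0, 0, 4]]
A_inv = [[1 / 2, 0, 0], [0, 1 / 3, 0], [0, 0, 1 / 4]]
prod = matmul(A, A_inv)
# product is the identity, up to float rounding
assert all(abs(prod[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(3) for j in range(3))
```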

Theorem: Let V be a vector space with finite ordered basis β, and let W
be a vector space with finite ordered basis γ. Then a linear transformation
T : V → W is invertible if and only if the matrix [T]_β^γ is invertible. Furthermore,
([T]_β^γ)^(-1) = [T^(-1)]_γ^β

Proof: Suppose that V is n-dimensional and W is m-dimensional; this makes
[T]_β^γ an m × n matrix.
First suppose that T : V → W has an inverse T^(-1) : W → V. Then
[T]_β^γ [T^(-1)]_γ^β = [T T^(-1)]_γ^γ = [I_W]_γ^γ = I_m
[T^(-1)]_γ^β [T]_β^γ = [T^(-1) T]_β^β = [I_V]_β^β = I_n
Thus [T^(-1)]_γ^β is the inverse of [T]_β^γ, and so [T]_β^γ is invertible.

Now suppose that [T]_β^γ is invertible, with inverse B. We will show that there
exists a linear transformation S : W → V with [S]_γ^β = B. Given such an S,
[ST]_β^β = [S]_γ^β [T]_β^γ = B[T]_β^γ = [I_V]_β^β
and hence ST = I_V. A similar argument gives TS = I_W, and so S is the
inverse of T and so T is invertible.
It remains to show that we can find a transformation S : W → V with
[S]_γ^β = B. Write β = (v1, ..., vn) and γ = (w1, ..., wm). Then we want a linear
transformation S : W → V such that
Sw1 = B11v1 + B21v2 + ... + Bn1vn
Sw2 = B12v1 + B22v2 + ... + Bn2vn
...
Swm = B1mv1 + B2mv2 + ... + Bnmvn
and such a transformation exists by the theorem on defining a linear transformation
by its values on a basis.

Corollary: An m × n matrix A is invertible if and only if the linear transformation
L_A : R^n → R^m is invertible. Furthermore, the inverse of L_A is L_(A^(-1)).
Proof: If β is the standard basis for R^n and γ is the standard basis for R^m,
then
[L_A]_β^γ = A
By the theorem above, A is invertible if and only if L_A is. Also, from the
theorem above we have
[(L_A)^(-1)]_γ^β = ([L_A]_β^γ)^(-1) = A^(-1) = [L_(A^(-1))]_γ^β
and hence (L_A)^(-1) = L_(A^(-1)) as desired.

Corollary: In order for a matrix A to be invertible, it must be square.
