Linear Algebra
Real Vector Spaces
Copyright 2005, W.R. Winfrey
Topics
Preliminaries
Vectors in the Plane and in 3-Space
Vector Spaces
Subspaces
Span and Linear Independence
Basis and Dimension
Homogeneous Systems
Coordinates and Isomorphisms
Rank of a Matrix
Four Fundamental Subspaces
Span and Linear Independence
Defn - Let S = { v1, v2, …, vk } be a set of vectors
in a vector space V. A vector v ∈ V is called a
linear combination of the vectors in S if
v = a1 v1 + a2 v2 + … + ak vk
for some real numbers a1, a2, …, ak
Example
Consider the three-dimensional vectors
v1 = [1 2 1]^T, v2 = [1 0 2]^T, v3 = [1 1 0]^T
Express the vector v = [2 1 5]^T as a linear combination
of v1, v2 and v3
Example (continued)
a1 [1 2 1]^T + a2 [1 0 2]^T + a3 [1 1 0]^T = [2 1 5]^T
Equating components gives
a1 + a2 + a3 = 2
2a1 + a3 = 1
a1 + 2a2 = 5
or, in matrix form,
[1 1 1] [a1]   [2]
[2 0 1] [a2] = [1]
[1 2 0] [a3]   [5]
The linear system may be solved to yield a1 = 1,
a2 = 2 and a3 = −1. So, v = v1 + 2 v2 − v3
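The coefficient system above can also be solved numerically; a minimal sketch with NumPy (not part of the original slides):

```python
import numpy as np

# Columns of A are v1, v2, v3 from the example above.
A = np.array([[1.0, 1.0, 1.0],
              [2.0, 0.0, 1.0],
              [1.0, 2.0, 0.0]])
v = np.array([2.0, 1.0, 5.0])

# Solve A a = v for the coefficients a1, a2, a3.
a = np.linalg.solve(A, v)
print(a)  # approximately [1., 2., -1.], i.e. v = v1 + 2 v2 - v3
```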
Defn - Let S = { v1, v2, …, vk } be a set of vectors
in a vector space V. The span of S is the set of all
linear combinations of the elements of S
To determine if a vector v belongs to the span of
S, we examine the corresponding system of
linear equations. If that system has a solution (or
solutions), then v belongs to the span of S
Theorem - Let S = { v1, v2, …, vk } be a set of vectors
in a vector space V. The span of S is a subspace of V.
Proof - To show that span(S) is a subspace of V, we have
to show
If u, v ∈ span(S), then u + v ∈ span(S)
If c is a real number and u ∈ span(S), then cu ∈ span(S)
Let u, v ∈ span(S). Then u = a1 v1 + … + ak vk and
v = b1 v1 + … + bk vk for some real numbers
a1, a2, …, ak and b1, b2, …, bk.
u + v = (a1 + b1) v1 + … + (ak + bk) vk ∈ span(S).
Let c be real and u ∈ span(S). Then u = a1 v1 + … + ak vk for
some real numbers a1, a2, …, ak.
cu = (c a1) v1 + … + (c ak) vk ∈ span(S).
QED
Defn - Let S = { v1, v2, …, vk } be a set of vectors
in a vector space V. The set S spans V if every
vector in V is a linear combination of the vectors
in S
Example
Consider the homogeneous system Ax = 0 where
A = [1 1 0 2]
    [2 2 1 5]
    [1 1 1 3]
Form the augmented matrix and put it into reduced row
echelon form:
[1 1 0 2 | 0]
[0 0 1 1 | 0]
[0 0 0 0 | 0]
Example (continued)
The corresponding equations are
x1 + x2 + 2x4 = 0
x3 + x4 = 0
Let the solution be x = [x1 x2 x3 x4]^T
Set x4 = s, so x3 = −s
Set x2 = r, so x1 = −r − 2s
x = [−r − 2s, r, −s, s]^T = r [−1 1 0 0]^T + s [−2 0 −1 1]^T
So { [−1 1 0 0]^T, [−2 0 −1 1]^T } spans the solution space
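The spanning set can be checked mechanically; a sketch with SymPy, assuming the matrix A reconstructed above:

```python
from sympy import Matrix

A = Matrix([[1, 1, 0, 2],
            [2, 2, 1, 5],
            [1, 1, 1, 3]])

# nullspace() returns a basis for the solution space of A x = 0,
# built by setting each free variable to 1 in turn.
for v in A.nullspace():
    print(v.T)  # [-1, 1, 0, 0] and [-2, 0, -1, 1]
```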
Defn - Let S = { v1, v2, …, vk } be a set of distinct
vectors in a vector space V. Then S is said to be
linearly dependent if there exist constants,
a1, a2, …, ak, not all zero, such that
a1 v1 + a2 v2 + … + ak vk = 0
Defn - Let S = { v1, v2, …, vk } be a set of distinct
vectors in a vector space V. If S is not linearly
dependent, then S is said to be linearly
independent. That is, the only way to have
a1 v1 + a2 v2 + … + ak vk = 0
is for a1 = a2 = … = ak = 0
Two Views of Linear Dependence
Linear dependence means that some member of S
can be expressed as a linear combination of the
others
Linear dependence also means that the span of S
can be expressed as the span of some proper
subset of S
Example
The vectors [−1 1 0 0]^T and [−2 0 −1 1]^T that span the solution
space of the previous example are linearly
independent:
a1 [−1 1 0 0]^T + a2 [−2 0 −1 1]^T = [0 0 0 0]^T
gives
−a1 − 2a2 = 0
a1 = 0
−a2 = 0
a2 = 0
The only solution is a1 = 0, a2 = 0
Example
Let V be R4 and v1 = [ 1 0 1 2 ], v2 = [ 0 1 1 2 ]
and v3 = [ 1 1 1 3 ]. Determine if S = { v1, v2, v3 }
is linearly independent or linearly dependent
a1 v1 + a2 v2 + a3 v3 = 0 gives
a1 + a3 = 0
a2 + a3 = 0
a1 + a2 + a3 = 0
2a1 + 2a2 + 3a3 = 0
Subtract the second equation from the third and get a1 = 0.
The first equation gives a3 = 0, then the second
equation gives a2 = 0. So the vectors are linearly
independent
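Independence of a finite set of vectors in Rn can also be read off from the rank of the matrix they form; a quick NumPy sketch (not part of the slides):

```python
import numpy as np

v1 = np.array([1, 0, 1, 2])
v2 = np.array([0, 1, 1, 2])
v3 = np.array([1, 1, 1, 3])

# Stack the vectors as rows; rank 3 (full) means they are
# linearly independent.
M = np.vstack([v1, v2, v3])
print(np.linalg.matrix_rank(M))  # 3
```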
Theorem - Let S1 and S2 be finite subsets of a
vector space and let S1 be a subset of S2. Then
a) If S1 is linearly dependent, so is S2
b) If S2 is linearly independent, so is S1
Proof - Introduce some notation:
S1 = { v1, v2, …, vk } and
S2 = { v1, v2, …, vk, vk+1, …, vm }
a) Since S1 is linearly dependent, there exist
constants a1, a2, …, ak, not all zero, such that
a1 v1 + a2 v2 + … + ak vk = 0
Proof (continued) - Then
a1 v1 + a2 v2 + … + ak vk + 0 vk+1 + … + 0 vm = 0
Since not all of the coefficients are 0, S2 is linearly
dependent.
b) Let S2 be linearly independent. Then S1 is either
linearly independent or linearly dependent.
Suppose S1 is linearly dependent; then by part (a)
S2 must be linearly dependent, which is a
contradiction. So, S1 must be linearly independent
QED
Consequence of the Theorem
The set S = { 0 } is linearly dependent since, for
example, 5 · 0 = 0 and 5 ≠ 0. By the theorem, any
set of vectors that contains 0 must be linearly
dependent
Also, any set containing a single nonzero vector
must be independent. (Have a theorem that says if
c v = 0 then either c = 0 or v = 0)
Geometric Interpretations in R2 and R3
Suppose { v1, v2 } is linearly dependent in R2.
There exist a1 and a2, not both zero, such that
a1v1 + a2v2 = 0. If a1 ≠ 0, then v1 = (−a2 / a1) v2.
If a2 ≠ 0, then v2 = (−a1 / a2) v1. Either way, one
vector is a multiple of the other.
Let { v1, v2, v3 } be a set of linearly dependent
vectors in R3. There are three possibilities
1) All three vectors are the zero vector
2) All three vectors lie on the same line through the origin
3) All three vectors lie in a plane through the origin
spanned by two of the vectors
Theorem - Let S = { v1, v2, …, vn } be a set of
nonzero vectors in a vector space V. Then S is
linearly dependent if and only if one of the vectors vj
is a linear combination of the preceding vectors in S.
Proof - Let S be linearly dependent. Then there
exist scalars a1, a2, …, an, not all zero, such that
a1 v1 + a2 v2 + … + an vn = 0
Let j be the largest subscript for which aj ≠ 0. We know
that j > 1, since j = 1 would imply a1 v1 = 0, which implies
v1 = 0, contradicting the hypothesis of the theorem. Then
vj = −(a1 / aj) v1 − (a2 / aj) v2 − … − (aj−1 / aj) vj−1
Proof (continued) - Let vj be a linear combination of the preceding
vectors in S. Then vj = a1 v1 + a2 v2 + … + aj−1 vj−1. So,
a1 v1 + a2 v2 + … + aj−1 vj−1 − vj + 0 vj+1 + … + 0 vn = 0
Since the coefficient of vj is −1 ≠ 0, S is linearly dependent.
QED
Basis and Dimension
Defn - A set of vectors S = { v1, v2, …, vk } in a
vector space V is called a basis for V if S is
linearly independent and S spans V.
Informally, S is a basis if
It doesn't have too many vectors (linearly independent)
It doesn't have too few vectors (spans V)
Example - Let V = R3 and
S = { [1 0 0]^T, [0 1 0]^T, [0 0 1]^T } = { e1, e2, e3 } = { i, j, k }
S is a basis for R3, called the natural basis
Example
Let V = R3 and S = { [1 0 0], [0 1 0], [0 0 1] }. S
is the natural basis for R3
Example
The natural basis for Rn is denoted by e1, e2, …, en
where ei = [0 … 0 1 0 … 0]^T, with the 1 in the ith row
Example
Legendre Polynomials - Ln( x )
L0( x ) = 1
L1( x ) = x
L2( x ) = ( 3x^2 − 1 ) / 2
L3( x ) = ( 5x^3 − 3x ) / 2
In general,
(n+1) Ln+1( x ) = (2n+1) x Ln( x ) − n Ln−1( x )
S = { L0( x ), L1( x ), L2( x ) } is a basis for P2
In general,
S = { L0( x ), L1( x ), …, Ln( x ) } is a basis for Pn
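The three-term recurrence can be checked numerically with NumPy's built-in Legendre series routines; a sketch for n = 2 (assumed illustration, not from the slides):

```python
import numpy as np
from numpy.polynomial import legendre

# legval(x, c) evaluates a series in the Legendre basis, so the
# coefficient vector [0, 0, 0, 1] represents L3, [0, 0, 1] is L2, etc.
x = np.linspace(-1.0, 1.0, 7)

# Check (n+1) L_{n+1}(x) = (2n+1) x L_n(x) - n L_{n-1}(x) for n = 2.
n = 2
lhs = (n + 1) * legendre.legval(x, [0, 0, 0, 1])
rhs = (2 * n + 1) * x * legendre.legval(x, [0, 0, 1]) - n * legendre.legval(x, [0, 1])
print(np.allclose(lhs, rhs))  # True
```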
Theorem - If S = { v1, v2, …, vn } is a basis for a
vector space V, then every vector in V can be
written in one and only one way as a linear
combination of the vectors in S
Proof - Let v ∈ V be arbitrary. Since S spans V, we
can write v as v = a1 v1 + a2 v2 + … + an vn. Now,
suppose that v can also be written as
v = b1 v1 + b2 v2 + … + bn vn. Subtracting, we have
0 = (a1 − b1) v1 + (a2 − b2) v2 + … + (an − bn) vn
Since the elements of S are linearly independent,
(a1 − b1) = 0, (a2 − b2) = 0, …, (an − bn) = 0.
So a1 = b1, a2 = b2, …, an = bn, i.e. the
coefficients are unique
QED
Theorem - If S = { v1, v2, …, vn } is a set of vectors
spanning a vector space V, then S contains a basis T
for V
Proof - If S is linearly independent, we are done. So,
suppose S is not linearly independent. By a previous
theorem, some vj can be written as a linear
combination of the preceding vectors. Form a new
set S1 = { v1, v2, …, vj−1, vj+1, …, vn }, i.e. delete vj
from S. S1 also spans V since any v ∈ V can be
expressed as v = a1 v1 + … + aj vj + … + an vn and
vj can be expressed as a linear combination of
v1, v2, …, vj−1. So, v can be expressed as a linear
combination of elements of S1; thus S1 spans V.
Proof (continued) - If S1 is linearly independent then we are done. If
S1 is linearly dependent, then some vk ∈ S1 can be
expressed as a linear combination of preceding
vectors in S1. Define S2 = S1 − { vk }
Since S is a finite set, we will eventually find a
subset T of S which is linearly independent and
spans V.
QED
Theorem - If S = { v1, v2, …, vn } is a basis for a
vector space V and T = { w1, w2, …, wr } is a
linearly independent set of vectors in V, then r ≤ n
Proof - Consider the set T1 = { w1, v1, v2, …, vn }.
Since S spans V, so does T1. T1 is linearly
dependent since w1 is a linear combination of
vectors in S. Since T1 is linearly dependent, some
vj is a linear combination of the preceding vectors.
Delete vj from T1 to form S1, i.e. S1 = T1 − { vj }
= { w1, v1, v2, …, vj−1, vj+1, …, vn }. Note that S1
spans V since T1 does and vj can be expressed as a
linear combination of elements of S1.
Proof (continued) - Form T2 = { w2, w1, v1, v2, …, vj−1, vj+1, …, vn }.
T2 is linearly dependent and some vector in T2 is a
linear combination of the preceding vectors. It
cannot be w1 since T is linearly independent. So it
must be some vi, i ≠ j. Then form S2 = T2 − { vi }. S2
spans V
If this process is continued, at each step we will
generate a set Tk of linearly dependent vectors that
spans V. In this set, some vp ∈ S can be expressed
as a linear combination of preceding vectors. Then
form a set Sk = Tk − { vp } that spans V.
Proof (continued) - There are two possibilities: r ≤ n, and r > n
If r ≤ n, the process will stop when Tr is formed,
since Tr+1 cannot be formed
If r > n, then when Tn is formed, the formation of
Sn by deleting the last element of S from Tn
will give a set consisting entirely of elements of T.
Since Sn spans V, the remaining r − n vectors in T
may be expressed as linear combinations of
vectors in Sn, which contradicts the linear
independence of T. So, we cannot have r > n
QED
Corollary - If S = { v1, v2, …, vn } and
T = { w1, w2, …, wm } are bases for a vector space
V, then n = m, i.e. every basis for V contains the
same number of vectors
Proof - Since S and T are both bases for V and are
both linearly independent, the previous theorem
implies that n ≤ m and m ≤ n. So n = m
QED
Defn - The dimension of a nonzero vector space V
is the number of vectors in a basis for V. Notation
is dim V.
The dimension of the trivial vector space { 0 } is
defined as 0
Corollary - If a vector space V has dimension n, then
a largest linearly independent subset of vectors in V
contains n vectors and is a basis for V
Proof - Let S = { v1, v2, …, vk } be a largest linearly
independent set of vectors in V. By the theorem,
k ≤ n. Let v ∈ V be arbitrary and consider the set of
vectors S′ = { v1, v2, …, vk, v }. Since S is a largest
linearly independent set, S′ must be linearly
dependent since it contains more vectors. Some
vector in S′ must be expressible as a linear
combination of preceding vectors. The only choice is v,
since S is linearly independent. So, S is a basis since
it is linearly independent and spans V. By the first
corollary, k = n
QED
Corollary - If a vector space V has dimension n,
the smallest set of vectors that spans V contains n
vectors and is a basis for V
Proof - Let S = { v1, v2, …, vk } be a smallest set
of vectors that spans V. Suppose S is linearly
dependent. Then some vector in S could be
expressed as a linear combination of preceding
vectors and that vector could be eliminated from S
to give a smaller set of vectors that spans V. This
contradicts the definition of S. So, S is linearly
independent. Thus, S is a basis for V. By the first
corollary, k = n
QED
Corollary - If vector space V has dimension n,
then any subset of m > n vectors must be linearly
dependent
Proof - Let S be a set of m > n vectors. S is either
linearly independent or linearly dependent. If S is
linearly independent, then by the theorem we must
have m ≤ n, which is a contradiction. So S is
linearly dependent
QED
Corollary - If vector space V has dimension n,
then any subset of m < n vectors cannot span V
Proof - Let S be a set of m < n vectors in V.
Without loss of generality, assume that S is
linearly independent. (If not, can remove vectors
and get an independent set.) If S spans V, then S is
a basis and by the first corollary, m = n , which is
a contradiction. So, S cannot span V.
QED
Theorem - If S is a linearly independent set of
vectors in a finite dimensional vector space V,
then there is a basis T for V which contains S
Proof - Let S = { v1, v2, …, vm } be a linearly
independent set of vectors in the n-dimensional
vector space V, where m < n. Let { w1, w2, …, wn }
be a basis for V and consider S1 = { v1, v2, …, vm,
w1, w2, …, wn }. Since S1 spans V, it contains a
basis T for V, which is obtained by deleting from
S1 every vector that is a linear combination of the
preceding vectors. Since S is linearly independent,
none of the vi can be a linear combination of
preceding vectors, so none are deleted. So T will contain S.
QED
Theorem - Let V be an n-dimensional vector space
a) If S = { v1, v2, …, vn } is a linearly independent
set of vectors in V, then S is a basis for V
b) If S = { v1, v2, …, vn } spans V, then S is a
basis for V
Defn - Let S be a set of vectors in a vector space
V. A subset T of S is called a maximal
independent subset of S if T is a linearly
independent set of vectors, and if there is no
linearly independent subset of S having more
vectors than T does.
Theorem - Let S be a finite subset of the vector
space V that spans V. A maximal independent
subset T of S is a basis for V
Homogeneous Systems
Example
Consider a homogeneous system Ax = 0 in six
unknowns x1, …, x6. The augmented matrix reduces
to the reduced row echelon form
[1 0 2 0 −3  4 | 0]
[0 1 2 0  1  0 | 0]
[0 0 0 1  2 −2 | 0]
[0 0 0 0  0  0 | 0]
Example (continued)
The corresponding system of equations is
x1 + 2x3 − 3x5 + 4x6 = 0
x2 + 2x3 + x5 = 0
x4 + 2x5 − 2x6 = 0
Let x6 = t, x5 = s, x3 = r. Then x4 = −2s + 2t,
x2 = −2r − s, x1 = −2r + 3s − 4t
Let the solution be x = [x1 x2 x3 x4 x5 x6]^T. Then
x = [−2r + 3s − 4t, −2r − s, r, −2s + 2t, s, t]^T
  = r [−2 −2 1 0 0 0]^T + s [3 −1 0 −2 1 0]^T + t [−4 0 0 2 0 1]^T
Example (continued)
The null space of A is spanned by the independent
set of vectors
{ [−2 −2 1 0 0 0]^T, [3 −1 0 −2 1 0]^T, [−4 0 0 2 0 1]^T }
and thus has dimension 3.
The dimension of the null space is called the nullity of A
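Since row-equivalent matrices share a null space, the nullity can be computed directly from the reduced row echelon form above; a SymPy sketch (illustration, not from the slides):

```python
from sympy import Matrix

# Reduced row echelon form from the example; its null space equals
# the null space of the original matrix A.
R = Matrix([[1, 0, 2, 0, -3, 4],
            [0, 1, 2, 0, 1, 0],
            [0, 0, 0, 1, 2, -2]])

basis = R.nullspace()
print(len(basis))  # 3, the nullity
for v in basis:
    print(v.T)
```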
Example (continued)
The null basis may be obtained by the procedure
on the preceding slides or by the following
procedure
1. In the reduced row echelon form, identify the
columns corresponding to the free variables
(i.e. the non-pivot columns). In this example:
columns 3, 5 and 6. Since there are three
non-pivot columns, the null basis will contain
three vectors
Example (continued)
2. Since there are six unknowns, initialize three 6 x 1
vectors with 1's and 0's in the free-variable rows:
row 3:  1  0  0
row 5:  0  1  0
row 6:  0  0  1
Example (continued)
3. For the pivot positions, 1, 2 and 4, insert the
coefficients of the free variables after changing
their signs:
[−2 −2 1 0 0 0]^T, [3 −1 0 −2 1 0]^T, [−4 0 0 2 0 1]^T
Nonhomogeneous and Homogeneous Systems
Consider the linear system
[1 2 1] [x1]   [3]
[2 4 2] [x2] = [6]
[3 6 3] [x3]   [9]
The solution consists of all vectors of the form
x = [−1 1 2]^T + r [−2 1 0]^T + s [−1 0 1]^T
for arbitrary r and s
Nonhomogeneous and Homogeneous Systems
The set of linear combinations r [−2 1 0]^T + s [−1 0 1]^T
forms a plane passing through the origin of R3
The solution vectors [−1 1 2]^T + r [−2 1 0]^T + s [−1 0 1]^T
form a plane parallel to the plane above, displaced
from the origin by the vector [−1 1 2]^T
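The particular-plus-homogeneous decomposition can be spot-checked numerically; a NumPy sketch using the system as reconstructed above:

```python
import numpy as np

A = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 2.0],
              [3.0, 6.0, 3.0]])
b = np.array([3.0, 6.0, 9.0])

xp = np.array([-1.0, 1.0, 2.0])   # particular solution of A x = b
h1 = np.array([-2.0, 1.0, 0.0])   # solutions of A x = 0
h2 = np.array([-1.0, 0.0, 1.0])

# Any x = xp + r*h1 + s*h2 satisfies A x = b.
r, s = 2.0, -3.0
x = xp + r * h1 + s * h2
print(np.allclose(A @ x, b))  # True
```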
Coordinates and Isomorphisms
Defn - Let S = { v1, v2, …, vn } be an ordered
basis for V. Let v ∈ V be expressed as
v = a1 v1 + a2 v2 + … + an vn.
The numbers a1, a2, …, an are called the
coordinates of the vector v with respect to the
basis S
Comments
The conventional notation for the coordinate
vector of v with respect to S is
[ v ]S = [a1 a2 … an]^T
If a different basis for V is selected, the coordinate
vector of v will be different
No matter what the elements of V are, the
coordinate vectors belong to Rn
Example
Consider the vector space P1 of all polynomials of
degree at most 1. Let S = { v1, v2 } be an ordered basis
for P1 with v1 = t and v2 = 1. Let
v = 5t − 2; then [ v ]S = [5 −2]^T. Consider another
ordered basis T = { t + 1, t − 1 }. Then
5t − 2 = (3/2)( t + 1 ) + (7/2)( t − 1 ), so [ v ]T = [3/2 7/2]^T
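Coordinates with respect to a non-standard basis amount to solving a small linear system: match the coefficients of t and of the constant term. A NumPy sketch for the example above:

```python
import numpy as np

# Columns: coefficient vectors of t+1 and t-1 in the basis {t, 1}.
M = np.array([[1.0, 1.0],    # coefficients of t
              [1.0, -1.0]])  # constant terms
v = np.array([5.0, -2.0])    # 5t - 2 in the basis {t, 1}

# Solve M c = v for the T-coordinates of 5t - 2.
c = np.linalg.solve(M, v)
print(c)  # [1.5 3.5], i.e. [v]_T = [3/2 7/2]^T
```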
Preliminaries
Note the correspondence between the vector space
P1 of polynomials and the vector space R2. This
correspondence will be developed further
Let v and w be vectors in an arbitrary n-dimensional
vector space V which has an ordered
basis S = { v1, v2, …, vn }
v = a1 v1 + a2 v2 + … + an vn
w = b1 v1 + b2 v2 + … + bn vn
Preliminaries (continued)
Let [ v ]S and [ w ]S be the coordinate vectors of v
and w
v + w = (a1 + b1)v1 + (a2 + b2)v2 + … + (an + bn)vn
[ v + w ]S = [a1 + b1  a2 + b2  …  an + bn]^T = [ v ]S + [ w ]S
So, v + w is associated with [ v ]S + [ w ]S
Preliminaries (continued)
c v = (c a1) v1 + (c a2) v2 + … + (c an) vn
[ c v ]S = [ca1  ca2  …  can]^T = c [ v ]S
So c v is associated with c [ v ]S
Defn - Let V be a real vector space with
operations + and ·, and let W be a real vector
space with operations ⊕ and ⊙. A one-to-one
function L mapping V onto W is called an
isomorphism of V onto W if
a) L ( v + u ) = L ( v ) ⊕ L ( u ) for every v, u ∈ V
b) L ( c · v ) = c ⊙ L ( v ) for every v ∈ V and every real
number c
The function L in the above definition is called a
linear transformation
Comments
To show that two vector spaces are isomorphic,
have to
Find a candidate mapping L
Prove that the mapping L is one-to-one
Prove that the mapping L is onto
Prove property (a)
Prove property (b)
Two isomorphic vector spaces differ only in the
nature of the elements that are in the spaces.
Algebraically, they are identical
Theorem - If V is an n-dimensional vector space,
then V is isomorphic to Rn
Proof - Let S = { v1, v2, …, vn } be an ordered
basis for V, and let L : V → Rn be defined by
L( v ) = [ v ]S = [a1 a2 … an]^T where v = a1v1 + a2v2 + … + anvn
To show that L is an isomorphism, we have to
show that it is one-to-one and onto and preserves
vector addition and scalar multiplication.
Proof (continued) - L is one-to-one. Let [ v ]S = [a1 a2 … an]^T and
[ w ]S = [b1 b2 … bn]^T, and suppose L(v) = L(w).
Then [ v ]S = [ w ]S and thus v = w. So L is one-to-one.
L is onto. Let [x1 x2 … xn]^T ∈ Rn be arbitrary. Then L maps
x = x1v1 + x2v2 + … + xnvn to this vector. So L is
onto.
Proof (continued) - L preserves vector addition. Let [ v ]S = [a1 a2 … an]^T
and [ w ]S = [b1 b2 … bn]^T. By the properties of the coordinate
vector established earlier,
L ( v + w ) = [ v + w ]S = [ v ]S + [ w ]S = L ( v ) + L ( w )
L preserves scalar multiplication. Let [ v ]S = [a1 a2 … an]^T
and let c be a real number. By the
properties of the coordinate vector
established earlier,
L ( cv ) = [ cv ]S = c [ v ]S = c L( v )
QED
Theorem - a) Every vector space V is
isomorphic to itself
b) If V is isomorphic to W, then W is isomorphic
to V
c) If U is isomorphic to V and V is isomorphic to
W, then U is isomorphic to W
Theorem - Two finite-dimensional vector spaces are
isomorphic if and only if their dimensions are equal.
Proof - Let V and W be isomorphic finite-dimensional
vector spaces; let L : V → W be an
isomorphism. Suppose that dim V = n and let
S = { v1, v2, …, vn } be a basis for V. Consider the
set T = { L(v1), L(v2), …, L(vn) } and show that it is
a basis for W.
Firstly, T spans W. Let w be an arbitrary vector in
W; then w = L(v) for some v in V. Since S is a basis
for V, v = a1v1 + a2v2 + … + anvn. Then
w = L( v ) = L( a1v1 + a2v2 + … + anvn )
  = a1 L( v1 ) + a2 L( v2 ) + … + an L( vn )
Proof (continued) - Secondly, T is linearly independent. Consider
0W = a1 L( v1 ) + a2 L( v2 ) + … + an L( vn )
   = L( a1v1 + a2v2 + … + anvn )
By the properties of a linear transformation, L(0V) = 0W.
Since L is one-to-one, a1v1 + a2v2 + … + anvn = 0V.
Linear independence of S means a1 = a2 = … = an = 0.
So T is linearly independent.
Conversely, let dim V = dim W = n. Then V and Rn are
isomorphic and W and Rn are isomorphic. By the
preceding theorem, V and W are isomorphic. QED
Corollary - If V is a finite dimensional vector
space that is isomorphic to Rn, then dim V = n
Proof - Since V is isomorphic to Rn, the preceding
theorem says that V and Rn must have the same
dimension. Therefore dim V = n
QED
Relationships Between Bases
Let S = { v1, v2, …, vn } and T = { w1, w2, …, wn }
be two ordered bases for a vector space V and let v
be a vector in V. Establish a relationship between
[ v ]S and [ v ]T
v = b1v1 + b2v2 + … + bnvn, so [ v ]S = [b1 b2 … bn]^T
v = c1w1 + c2w2 + … + cnwn, so [ v ]T = [c1 c2 … cn]^T
Relationships Between Bases
[ v ]S = [ c1w1 + c2w2 + … + cnwn ]S
       = [ c1w1 ]S + [ c2w2 ]S + … + [ cnwn ]S
       = c1 [ w1 ]S + c2 [ w2 ]S + … + cn [ wn ]S
Let [ wj ]S = [a1j a2j … anj]^T
Relationships Between Bases
[ v ]S = [b1 b2 … bn]^T
       = c1 [a11 a21 … an1]^T + c2 [a12 a22 … an2]^T + … + cn [a1n a2n … ann]^T

         [a11 a12 … a1n] [c1]
       = [a21 a22 … a2n] [c2]  = P [ v ]T
         [ …           ] [ …]
         [an1 an2 … ann] [cn]
Relationships Between Bases
Defn - The matrix P on the previous slide is called
the transition matrix from the T-basis to the S-basis
Comments
a) The transition matrix is computed from the
elements of S and T alone and is independent of v
b) Once the transition matrix has been computed, it
can be used to convert [ v ]T to [ v ]S for every v ∈ V
c) To go from the S-basis to the T-basis, use P^−1
Relationships Between Bases - Example
Let V be R3 and let S = { v1, v2, v3 } and
T = { w1, w2, w3 } be ordered bases for R3 where
v1 = [2 0 1]^T, v2 = [1 2 0]^T, v3 = [1 1 1]^T,
w1 = [6 3 3]^T, w2 = [4 −1 3]^T, w3 = [5 5 2]^T
Relationships Between Bases - Example
Need to find coefficients such that
a1v1 + a2v2 + a3v3 = w1
b1v1 + b2v2 + b3v3 = w2
c1v1 + c2v2 + c3v3 = w3
Note: Three systems of three equations in three
unknowns
Relationships Between Bases - Example
a1v1 + a2v2 + a3v3 = w1
[ v1 v2 v3 | w1 ] =
[2 1 1 | 6]
[0 2 1 | 3]
[1 0 1 | 3]
Solve by Gauss-Jordan reduction to get a1, a2 and a3
Relationships Between Bases - Example
b1v1 + b2v2 + b3v3 = w2
[ v1 v2 v3 | w2 ] =
[2 1 1 |  4]
[0 2 1 | −1]
[1 0 1 |  3]
Solve by Gauss-Jordan reduction to get b1, b2 and b3
Relationships Between Bases - Example
c1v1 + c2v2 + c3v3 = w3
[ v1 v2 v3 | w3 ] =
[2 1 1 | 5]
[0 2 1 | 5]
[1 0 1 | 2]
Solve by Gauss-Jordan reduction to get c1, c2 and c3
Relationships Between Bases - Example
The steps of the Gauss-Jordan reduction are
identical in each case. So, it is simpler to form the
partitioned matrix (the "To" basis S on the left, the
"From" basis T on the right)
[ v1 v2 v3 | w1 w2 w3 ] =
[2 1 1 | 6  4 5]
[0 2 1 | 3 −1 5]
[1 0 1 | 3  3 2]
and do Gauss-Jordan reduction to put the matrix into
reduced row echelon form
Relationships Between Bases - Example
Reduced row echelon form is
[1 0 0 | 2  2 1]
[0 1 0 | 1 −1 2]
[0 0 1 | 1  1 1]
The transition matrix from the T-basis to the S-basis
is
P = [2  2 1]
    [1 −1 2]
    [1  1 1]
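The same transition matrix can be obtained by solving V P = W column by column, where V and W have the basis vectors as columns; a NumPy sketch (illustration, not part of the slides):

```python
import numpy as np

# Columns of V are v1, v2, v3; columns of W are w1, w2, w3.
V = np.array([[2.0, 1.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 0.0, 1.0]])
W = np.array([[6.0, 4.0, 5.0],
              [3.0, -1.0, 5.0],
              [3.0, 3.0, 2.0]])

# np.linalg.solve handles all three right-hand sides at once,
# giving the transition matrix from the T-basis to the S-basis.
P = np.linalg.solve(V, W)
print(P)
# approximately [[2, 2, 1], [1, -1, 2], [1, 1, 1]]
```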
Rank of a Matrix
Defn - Let A be an m x n matrix. The rows of A,
considered as vectors in Rn, span a subspace of Rn
called the row space of A. Similarly, the columns
of A, considered as vectors in Rm, span a subspace
of Rm called the column space of A
A = [a11 a12 … a1n]
    [a21 a22 … a2n]
    [ …           ]
    [am1 am2 … amn]
Note: The system Ax = b has a solution if and only if b is
in the column space of A
Theorem - If A and B are m x n row (column)
equivalent matrices, then the row (column) spaces
of A and B are equal.
Proof - If A and B are row equivalent, then the rows
of B are obtained from the rows of A by a finite
number of the three elementary row operations.
Thus each row of B is a linear combination of the
rows of A. So, the row space of B is contained in
the row space of A. Apply inverse elementary row
operations to B to get A. So, the row space of A is
contained in the row space of B. Thus, the row
spaces of A and B are the same. Similar argument
for the column spaces.
QED
Example - Find a basis for the subspace V of R3
spanned by S = { v1, v2, v3, v4, v5 } where
v1 = [ 1 2 1 ], v2 = [ 1 1 −1 ], v3 = [ 1 3 3 ],
v4 = [ 3 5 1 ], v5 = [ 1 4 5 ]. Note that V is the
row space of the matrix
A = [1 2  1]
    [1 1 −1]
    [1 3  3]
    [3 5  1]
    [1 4  5]
Example (continued) - Apply elementary row
operations to A to reduce it to B, in reduced row
echelon form
B = [1 0 −3]
    [0 1  2]
    [0 0  0]
    [0 0  0]
    [0 0  0]
A basis for the row space of B consists of the vectors
w1 = [ 1 0 −3 ] and w2 = [ 0 1 2 ].
So { w1, w2 } is a basis for V
v1 = w1 + 2w2, v2 = w1 + w2,
v3 = w1 + 3w2, v4 = 3w1 + 5w2,
v5 = w1 + 4w2
If A is row equivalent to a matrix B that is in row
echelon form, the nonzero rows of B form a basis for
the row space of A
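The row-space basis can be extracted programmatically from the reduced row echelon form; a SymPy sketch using the matrix A as reconstructed above:

```python
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [1, 1, -1],
            [1, 3, 3],
            [3, 5, 1],
            [1, 4, 5]])

# The nonzero rows of the reduced row echelon form are a basis
# for the row space of A; len(pivots) is the row rank.
R, pivots = A.rref()
print(R[0, :])  # Matrix([[1, 0, -3]])
print(R[1, :])  # Matrix([[0, 1, 2]])
print(len(pivots))  # 2
```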
Defn - The dimension of the row space of a matrix
A is called the row rank of A
Defn - The dimension of the column space of a
matrix A is called the column rank of A
Theorem - The row rank and the column rank of
an m x n matrix A = [ aij ] are equal
Proof - Let v1, v2, …, vm be the row vectors of A,
where vi = [ ai1 ai2 … ain ] for i = 1, 2, …, m
Let row rank A = r and let the set of vectors
{ w1, w2, …, wr } be a basis for the row space of
A, where wi = [ bi1 bi2 … bin ] for i = 1, 2, …, r
Proof (continued) - Each row vector of A is a linear combination of
the wi vectors:
v1 = c11w1 + c12w2 + … + c1rwr
v2 = c21w1 + c22w2 + … + c2rwr
…
vm = cm1w1 + cm2w2 + … + cmrwr
Equate matrix entries to get (jth column of A)
a1j = c11b1j + c12b2j + … + c1rbrj
a2j = c21b1j + c22b2j + … + c2rbrj
…
amj = cm1b1j + cm2b2j + … + cmrbrj
Proof (continued) -
For j = 1, 2, …, n
[a1j a2j … amj]^T = b1j [c11 c21 … cm1]^T + b2j [c12 c22 … cm2]^T + … + brj [c1r c2r … cmr]^T
Since every column of A is a linear combination of r
vectors, the dimension of the column space of A is at
most r, i.e. column rank A ≤ r = row rank A. We can
repeat the argument with the column vectors of A (or
apply the above process to A^T) and conclude that
row rank A ≤ column rank A.
So row rank A = column rank A
QED
Theorem - If A is an n x n matrix, then rank A = n
if and only if A is row equivalent to In
Proof - Let rank A = n. A is row equivalent to
a matrix B in reduced row echelon form and
rank B = n. Since rank B = n, the dimension of
the row space of B is n. Since B does not have any
zero rows, B = In .
Let A be row equivalent to In . Then
rank A = rank In = n
QED
Corollary - Let A be an n x n matrix. A is
nonsingular if and only if rank A = n
Proof - Let A be nonsingular. Then A is row
equivalent to In . So, rank A = n
Let rank A = n. By the theorem, A is row
equivalent to In . So, A is nonsingular
QED
Corollary - If A is an n x n matrix, then rank A = n
if and only if det(A) ≠ 0.
Proof - Let rank A = n. By the preceding
corollary, rank A = n means that A is nonsingular.
By the properties of determinants, if A is
nonsingular then det(A) ≠ 0.
Let det(A) ≠ 0. By the properties of
determinants, A must be nonsingular. By the
preceding corollary, if A is nonsingular, then
rank A = n.
QED
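A quick numerical illustration of this corollary (numpy assumed; the matrices are chosen for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # det(A) = 1, so rank A should be 2
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # det(B) = 0, so rank B should be < 2

rank_A = np.linalg.matrix_rank(A)
rank_B = np.linalg.matrix_rank(B)
det_A = np.linalg.det(A)
det_B = np.linalg.det(B)
```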
Corollary - The homogeneous system Ax = 0 where
A is an n x n matrix has a nontrivial solution if and
only if rank A < n
Proof - Let Ax = 0 have a nontrivial solution.
Then A is singular. Since A is singular, by the
previous corollary we cannot have rank A = n, so
rank A < n
Let rank A < n. A cannot be nonsingular since
rank(A) < n. So A is singular. Since A is singular,
Ax = 0 has a nontrivial solution
QED
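One way to exhibit a nontrivial solution when rank A < n is the singular value decomposition: a right singular vector belonging to a zero singular value lies in the null space. A sketch (numpy assumed; the matrix is illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # rank 1 < n = 2, so Ax = 0 has nontrivial solutions

# The last row of Vt is a right singular vector for the smallest
# singular value (here zero), hence a unit vector in the null space
_, s, Vt = np.linalg.svd(A)
x = Vt[-1]
residual = np.linalg.norm(A @ x)
```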
Corollary - Let A be an n x n matrix. The linear
system Ax = b has a unique solution for every n x 1
matrix b if and only if rank A = n.
Proof - Let Ax = b have a unique solution for
every n x 1 matrix b. Then Ax = 0 has the unique
trivial solution x = 0. Since A has n columns,
rank A ≤ n. By the preceding corollary, rank A < n
would mean Ax = 0 has a nontrivial solution. Therefore
rank A = n.
Proof (continued) - Let rank A = n. Then A is nonsingular, A⁻¹
exists, and Ax = b has the solution x = A⁻¹b. To argue
uniqueness, suppose there are two solutions x1 and
x2, i.e. Ax1 = b and Ax2 = b. Then
Ax1 − Ax2 = A(x1 − x2) = b − b = 0. Since A is
nonsingular, this homogeneous system has the
trivial solution only, i.e. x1 − x2 = 0, or x1 = x2. Thus
the solution is unique.
QED
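A minimal numerical check of this result (numpy assumed): for a rank-n matrix, np.linalg.solve returns the unique solution A⁻¹b.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # rank 2, nonsingular
b = np.array([3.0, 2.0])

x = np.linalg.solve(A, b)    # the unique solution of A x = b
```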
Theorem - Let S = { v1, v2, …, vn } be a set of n
vectors in Rn, written as columns (or as rows). Let A
be the matrix whose columns (rows) are the elements
of S. Then S is linearly independent if and only if det( A ) ≠ 0.
Proof - Suppose the elements of S are columns; the
row case is similar. Let S be linearly independent.
Then the dimension of the column space of A is n,
so rank A = n, so A is nonsingular, so det( A ) ≠ 0.
If det( A ) ≠ 0, then rank A = n, so the columns
of A, i.e. the elements of S, are independent.
QED
Example
Let S = { (1, 1), (−1, 1) } and S′ = { (1, 2), (2, 4) } in R2.
S is linearly independent, since det [ 1 −1 ; 1 1 ] = 2 ≠ 0
S′ is linearly dependent, since det [ 1 2 ; 2 4 ] = 0
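The determinant test is easy to run numerically. A sketch (numpy assumed) with two small illustrative sets, placing the vectors of each set as the columns of a matrix:

```python
import numpy as np

# Columns of A1 are (1, 1) and (-1, 1); columns of A2 are (1, 2) and (2, 4)
A1 = np.array([[1.0, -1.0],
               [1.0,  1.0]])
A2 = np.array([[1.0, 2.0],
               [2.0, 4.0]])

d1 = np.linalg.det(A1)   # nonzero -> the columns are independent
d2 = np.linalg.det(A2)   # zero -> the columns are dependent
```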
Theorem - The linear system Ax = b has a solution
if and only if rank A = rank [A:b]. That is, if and
only if the ranks of the coefficient and augmented
matrices are equal.
Proof - Let A = [ aij ] be an m x n matrix. The
linear system Ax = b can be expressed as
x1 ( a11, a21, …, am1 )T + x2 ( a12, a22, …, am2 )T + … + xn ( a1n, a2n, …, amn )T = ( b1, b2, …, bm )T
Let Ax = b have a solution. Existence of a solution
means that b is in the column space of A. Thus the
column spaces of A and [A:b] are the same and their
ranks are equal.
Proof (continued) - Conversely, let rank A = rank [A:b]. Then the
column spaces of A and [A:b] are the same, which means
that b can be expressed as a linear combination of
the columns of A:
x1 ( a11, a21, …, am1 )T + x2 ( a12, a22, …, am2 )T + … + xn ( a1n, a2n, …, amn )T = ( b1, b2, …, bm )T
The coefficients x1, x2, …, xn give a solution of
Ax = b.
QED
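This rank criterion translates directly into a numerical consistency check (numpy assumed; the matrices are illustrative):

```python
import numpy as np

def consistent(A, b):
    # Ax = b has a solution iff rank A == rank of the augmented matrix [A:b]
    aug = np.hstack([A, b.reshape(-1, 1)])
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(aug)

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])       # rank 1
b_in = np.array([1.0, 2.0])      # lies in the column space of A
b_out = np.array([1.0, 0.0])     # does not
```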
Equivalent statements for an n x n matrix A
A is nonsingular
AX = 0 has the trivial solution only
A is row equivalent to In
The system AX = B has a unique solution for
every n x 1 matrix B
A is a product of elementary matrices
A has rank n
The rows (columns) of A form a linearly
independent set of vectors in Rn
Rank Plus Nullity Theorem
Will prove later that if A is an m x n matrix, then
rank A + nullity A = n, where nullity A is the
dimension of the null space of A. This is called
the rank plus nullity theorem
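The statement can already be checked numerically: the nullity is the count of columns not accounted for by nonzero singular values. A sketch (numpy assumed; the matrix is illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [2.0, 4.0, 1.0, 3.0],
              [3.0, 6.0, 1.0, 4.0]])   # 3 x 4, rank 2

n = A.shape[1]
rank = np.linalg.matrix_rank(A)
# nullity = n minus the number of nonzero singular values
s = np.linalg.svd(A, compute_uv=False)
nullity = n - np.sum(s > 1e-10)
```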
Linear Algebra
Real Vector Spaces
Four Fundamental Subspaces
For every m x n matrix A, there are four subspaces
based on the rows and columns of A that reveal
much about the structure of A. Let r be the rank of A
The column space, C (A), dimension = r
The row space, C (AT), dimension = r
The null space, N (A), will show dimension = n − r
The left null space, N (AT), will show dimension = m − r
In comments on these subspaces, will not make a
careful distinction between row vectors and column
vectors. For example, the row space consists of 1 x n
rows and the null space of n x 1 columns, but we will
usually call both subspaces of Rn
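All four dimensions can be read off numerically from the rank (numpy assumed; the 2 x 3 matrix is illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # m = 2, n = 3, rank 1
m, n = A.shape
r = np.linalg.matrix_rank(A)

dim_col = r        # dimension of C(A)
dim_row = r        # dimension of C(A^T)
dim_null = n - r   # dimension of N(A)
dim_left = m - r   # dimension of N(A^T)
```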
The row space of A and null space of A coincide
with the row space and null space of any matrix
that is row equivalent to A. In particular, bases for
these spaces can be obtained from the reduced row
echelon form of A
Similarly, bases for the column space of A and left
null space of A can be obtained from the reduced
column echelon form of A.
Also, the pivot columns of A are a basis for C (A).
Suppose that R is the reduced row echelon form of
A. Then Ax = 0 and Rx = 0 have exactly the same
solutions. Every linear dependence Ax = 0 among
the columns of A is matched by a corresponding
linear dependence Rx = 0 among the columns of
R, with exactly the same coefficients. If a set of
columns of A is linearly independent then so are
the corresponding columns of R, and vice versa
Example
A = [ 1 2 ; 3 6 ] has m = n = 2 and rank r = 1
Column space contains all multiples of (1, 3)T
Null space consists of all multiples of (−2, 1)T
Row space consists of all multiples of (1, 2)T (actually
multiples of the row [ 1 2 ])
Left null space consists of all multiples of (−3, 1)T
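For the 2 x 2 example A = [ 1 2 ; 3 6 ], the claimed basis vectors for the null space and left null space can be verified directly (numpy assumed):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 6.0]])

null_vec = np.array([-2.0, 1.0])   # should satisfy A x = 0
left_vec = np.array([-3.0, 1.0])   # should satisfy y^T A = 0
```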
General Case
An m x n matrix A of rank r maps Rn to Rm:
In Rn: the row space of A (dim r) and the null space of A (dim n − r)
In Rm: the column space of A (dim r) and the left null space of A (dim m − r)
Every x in Rn splits as x = xr + xn with xr in the row
space and xn in the null space. Then Axr = b and
Axn = 0, so Ax = b lands in the column space
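The splitting x = xr + xn can be computed with the pseudoinverse: A⁺A projects orthogonally onto the row space. A sketch (numpy assumed; the rank-1 matrix is illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 6.0]])   # rank 1
x = np.array([3.0, 1.0])

P = np.linalg.pinv(A) @ A    # orthogonal projector onto the row space
xr = P @ x                   # row-space component
xn = x - xr                  # null-space component, so A xn = 0
```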
Comments
From the row space to the column space, every
matrix A is invertible: every vector b in the
column space comes from exactly one vector xr in
the row space. Can define a pseudoinverse A+
such that for every vector xr in the row space,
A+ A xr = xr
The linear system Ax = b has a solution if and
only if b lies in C (A). This is often
expressed as the Fredholm alternative:
Either Ax = b is consistent,
Or there exists a z such that zT A = 0 and zT b ≠ 0
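For the singular example A = [ 1 2 ; 3 6 ], the Fredholm alternative can be seen concretely: a b outside the column space is witnessed by a left null vector z (numpy assumed):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 6.0]])
b = np.array([1.0, 0.0])    # not a multiple of (1, 3), so outside C(A)
z = np.array([-3.0, 1.0])   # left null vector: z^T A = 0

zA = z @ A                  # should be the zero row
zb = z @ b                  # nonzero, so A x = b is inconsistent
```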
Existence and Uniqueness
Existence - Full row rank, r = m: Ax = b has at
least one solution x for every b if and only if the
columns span Rm. Then A has an n x m right
inverse C such that AC = Im . This is possible only
if m ≤ n
Can show C = AT ( A AT )⁻¹
Uniqueness - Full column rank, r = n: Ax = b
has at most one solution x for every b if and only
if the columns are independent. Then A has an
n x m left inverse B such that BA = In . This is
possible only if m ≥ n
Can show B = ( AT A )⁻¹ AT
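Both formulas are easy to verify numerically (numpy assumed; the full-rank matrices are illustrative):

```python
import numpy as np

# Full row rank (r = m = 2, n = 3): right inverse C = A^T (A A^T)^-1
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
C = A.T @ np.linalg.inv(A @ A.T)   # A C should equal I_2

# Full column rank (r = n = 2, m = 3): left inverse B = (M^T M)^-1 M^T
M = A.T
B = np.linalg.inv(M.T @ M) @ M.T   # B M should equal I_2
```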
Existence and Uniqueness
Informally, an inverse exists only when the rank is
as large as possible
For a square matrix A (m = n), one property cannot
hold without the other: A has a left inverse if
and only if it has a right inverse. Existence implies
uniqueness and uniqueness implies existence.