
The Rank of a Matrix

Francis J. Narcowich
Department of Mathematics
Texas A&M University

January 2005

1 Rank and Solutions to Linear Systems


The rank of a matrix A is the number of leading entries in a row-reduced form
R for A. This also equals the number of nonzero rows in R. For any system
with A as a coefficient matrix, rank[A] is the number of leading variables. Now,
two systems of equations are equivalent if they have exactly the same solution
set. When we discussed the row-reduction algorithm, we also mentioned that
row-equivalent augmented matrices correspond to equivalent systems:

Theorem 1.1 If [A|b] and [A′|b′] are augmented matrices for two linear systems
of equations, and if [A|b] and [A′|b′] are row equivalent, then the corresponding
linear systems are equivalent.
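To see the definition of rank in action, here is a minimal sketch (added for illustration; it is not part of the original notes) that uses the sympy library to row reduce an arbitrarily chosen matrix and count its leading entries.

from sympy import Matrix

# An arbitrary 3 x 4 matrix chosen only for illustration.
A = Matrix([[1, 2, 0, 1],
            [2, 4, 1, 3],
            [3, 6, 1, 4]])

# rref() returns the row-reduced form R and the tuple of pivot (leading-entry) columns.
R, pivots = A.rref()

print(R)            # the row-reduced form R; it has two nonzero rows
print(len(pivots))  # 2 = number of leading entries
print(A.rank())     # 2, agreeing with the definition above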

By examining the possible row-reduced matrices corresponding to the augmented
matrix, one can use Theorem 1.1 to obtain the following result, which we state
without proof.

Theorem 1.2 Consider the system Ax = b, with coefficient matrix A and augmented
matrix [A|b]. As above, the sizes of b, A, and [A|b] are m × 1, m × n, and
m × (n + 1), respectively; in addition, the number of unknowns is n. Below, we
summarize the possibilities for solving the system.

i. Ax = b is inconsistent (i.e., no solution exists) if and only if rank[A] < rank[A|b].

ii. Ax = b has a unique solution if and only if rank[A] = rank[A|b] = n.

iii. Ax = b has infinitely many solutions if and only if rank[A] = rank[A|b] < n.

To illustrate this theorem, let’s look at the simple systems below.
First system:   x1 + 2x2 = 1,    3x1 + x2 = −2
Second system:  3x1 + 2x2 = 3,   −6x1 − 4x2 = 0
Third system:   3x1 + 2x2 = 3,   −6x1 − 4x2 = −6
The augmented matrices for these systems are, respectively,

[ 1  2 |  1 ]      [  3   2 | 3 ]      [  3   2 |  3 ]
[ 3  1 | −2 ],     [ −6  −4 | 0 ],     [ −6  −4 | −6 ].
Applying the row-reduction algorithm yields the row-reduced form of each of these
augmented matrices. The results are, again respectively,

[ 1  0 | −1 ]      [ 1  2/3 | 0 ]      [ 1  2/3 | 1 ]
[ 0  1 |  1 ],     [ 0   0  | 1 ],     [ 0   0  | 0 ].
From each of these row-reduced versions of the augmented matrices, one can read
off the rank of the coefficient matrix as well as the rank of the augmented matrix.
Applying Theorem 1.2 to each of these tells us the number of solutions to expect
for each of the corresponding systems. We summarize our findings in the table
below.
System     rank[A]   rank[A|b]   n   # of solutions
First         2          2       2   1
Second        1          2       2   0 (inconsistent)
Third         1          1       2   ∞
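
The same bookkeeping can be automated. The following sketch (an added illustration, not part of the original notes) uses sympy to compute rank[A] and rank[A|b] for each of the three systems and classifies them according to Theorem 1.2.

from sympy import Matrix

systems = {
    "First":  (Matrix([[1, 2], [3, 1]]),   Matrix([1, -2])),
    "Second": (Matrix([[3, 2], [-6, -4]]), Matrix([3, 0])),
    "Third":  (Matrix([[3, 2], [-6, -4]]), Matrix([3, -6])),
}

for name, (A, b) in systems.items():
    n = A.cols                      # number of unknowns
    rank_A = A.rank()               # rank of the coefficient matrix
    rank_Ab = A.row_join(b).rank()  # rank of the augmented matrix [A|b]
    if rank_A < rank_Ab:
        outcome = "inconsistent (case i)"
    elif rank_A == n:
        outcome = "unique solution (case ii)"
    else:
        outcome = "infinitely many solutions (case iii)"
    print(name, rank_A, rank_Ab, n, outcome)

Running this reproduces the table above: the first system has a unique solution, the second is inconsistent, and the third has infinitely many solutions.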

Homogeneous systems. A homogeneous system is one in which the vector b = 0. By
simply plugging x = 0 into the equation Ax = 0, we see that every homogeneous
system has at least one solution, the trivial solution x = 0. Are there any
others? Theorem 1.2 provides the answer.
Corollary 1.3 Let A be an m × n matrix. A homogeneous system of equations
Ax = 0 will have a unique solution, the trivial solution x = 0, if and only if
rank[A] = n. In all other cases, it will have infinitely many solutions. As a
consequence, if n > m (i.e., if the number of unknowns is larger than the number
of equations), then the system will have infinitely many solutions.

Proof: Since x = 0 is always a solution, case (i) of Theorem 1.2 is eliminated
as a possibility. Therefore, we must always have rank[A] = rank[A|0] ≤ n. By
Theorem 1.2, case (ii), equality will hold if and only if x = 0 is the only
solution. When it does not hold, we are always in case (iii) of Theorem 1.2;
there are thus infinitely many solutions for the system. If n > m, then we need
only note that rank[A] ≤ m < n to see that the system has to have infinitely
many solutions. □
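
As a small illustration of Corollary 1.3 (added here; not part of the original notes), take a homogeneous system with n = 3 unknowns and only m = 2 equations, so n > m; sympy's nullspace() exhibits the nontrivial solutions directly.

from sympy import Matrix

# A 2 x 3 coefficient matrix chosen only for illustration: n = 3 > m = 2.
A = Matrix([[1, 2, -1],
            [2, 1,  1]])

print(A.rank())       # 2, so rank[A] <= m = 2 < n = 3
print(A.nullspace())  # one basis vector [-1, 1, 1]^T, so Ax = 0 has infinitely many solutions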

2 Linear Independence and Dependence
A set of k vectors {u1, u2, . . . , uk} is linearly independent (LI) if the equation

∑_{j=1}^{k} cj uj = 0,

where the cj's are scalars, has only c1 = c2 = · · · = ck = 0 as a solution. Otherwise,
the vectors are linearly dependent (LD). Let's assume the vectors are all m × 1
column vectors. If they are rows, just transpose them. Now, use the basic matrix
trick (BMT) to put the equation in matrix form:

Ac = ∑_{j=1}^{k} cj uj = 0,   where A = [u1 u2 · · · uk] and c = (c1 c2 · · · ck)^T.

The question of whether the vectors are LI or LD is now a question of whether the
homogeneous system Ac = 0 has a nontrivial solution. Combining this observation
with Corollary 1.3 gives us a handy way to check for LI/LD by finding the rank of
a matrix.

Corollary 2.1 A set of k column vectors {u1, u2, . . . , uk} is linearly independent
if and only if the associated matrix A = [u1 u2 · · · uk] has rank[A] = k.
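
The corollary turns the LI/LD question into a single rank computation. Here is a minimal sketch (added for illustration, not part of the original notes) that packages this check with sympy.

from sympy import Matrix

def is_linearly_independent(vectors):
    """Return True when the given column vectors are LI, i.e. rank[A] = k."""
    A = Matrix.hstack(*vectors)        # A = [u1 u2 ... uk]
    return A.rank() == len(vectors)

# Two quick checks on small vectors chosen only for illustration.
print(is_linearly_independent([Matrix([1, 0]), Matrix([0, 1])]))   # True
print(is_linearly_independent([Matrix([1, 2]), Matrix([2, 4])]))   # False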

Example 2.2 Determine whether the vectors below are linearly independent or
linearly dependent.

u1 = [ 1, −1, 2, 1 ]^T,   u2 = [ 2, 1, 1, −1 ]^T,   u3 = [ 0, 1, −1, −1 ]^T.

Solution. Form the associated matrix A and perform row reduction on it:

    [  1   2   0 ]          [ 1  0  −2/3 ]
A = [ −1   1   1 ]   ⇐⇒    [ 0  1   1/3 ]
    [  2   1  −1 ]          [ 0  0    0  ]
    [  1  −1  −1 ]          [ 0  0    0  ]

The rank of A is 2 < k = 3, so the vectors are LD. The row-reduced form of
A gives us additional information; namely, we can read off the cj's for which
∑_{j=1}^{k} cj uj = 0. We have c1 = (2/3)t, c2 = −(1/3)t, and c3 = t. As usual, t is a
parameter.
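
To double-check Example 2.2 (an added illustration, not part of the original notes), sympy reproduces both the rank computation and the dependence coefficients read off above.

from sympy import Matrix, Rational

u1 = Matrix([1, -1, 2, 1])
u2 = Matrix([2, 1, 1, -1])
u3 = Matrix([0, 1, -1, -1])

A = Matrix.hstack(u1, u2, u3)
print(A.rank())     # 2 < k = 3, so the vectors are linearly dependent

# The null space of A consists of the coefficient vectors c with Ac = 0.
(c,) = A.nullspace()
print(c.T)          # Matrix([[2/3, -1/3, 1]]): c1 = 2/3, c2 = -1/3, c3 = 1, i.e. t = 1

# Verify that (2/3) u1 - (1/3) u2 + u3 is the zero vector.
print(Rational(2, 3)*u1 - Rational(1, 3)*u2 + u3)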
