hw2 Solutions
Problem 1
[Heath 3.29, page 152] Let v be a nonzero n-vector. The hyperplane normal to v is the
(n − 1)-dimensional subspace of all vectors z such that v^T z = 0. A reflector is a linear
transformation R such that Rx = −x if x is a scalar multiple of v, and Rx = x if v^T x = 0.
Thus, the hyperplane acts as a mirror: for any vector, its component within the hyperplane
is invariant, whereas its component orthogonal to the hyperplane is reversed.
4. Show that for any two vectors s and t such that s ≠ t and ||s||_2 = ||t||_2, there is a
reflector R such that Rs = t.
Solution
1. We can obtain the reflection Rx of a vector x with respect to a hyperplane through
the origin by adding to x twice the vector from x to Px, where Px is the projection
of x onto the same hyperplane (see figure 1). Thus

Rx = x + 2(Px − x) = (2P − I)x,  i.e.  R = 2P − I
2. A reflection with respect to a hyperplane through the origin does not change the
magnitude of the reflected vector (see figure 1). Therefore ||Rx||_2 = ||x||_2 for all x,
which gives R^T R = I, i.e. R is orthogonal. Since reflecting twice returns the original
vector, we also have R^2 = I, and therefore

R^T R = I ⇒ R^T R^2 = R ⇒ R^T = R

so R is also symmetric.

Figure 1: Reflector
4. Any two vectors s and t are reflections of each other with respect to the hyperplane
normal to the vector s − t that passes through the midpoint of s and t. When
||s||_2 = ||t||_2, that hyperplane passes through the origin and can be written as
{x : (s − t)^T x = 0}. Therefore the Householder transform

H = I − 2 (s − t)(s − t)^T / ((s − t)^T (s − t))

is a reflector with Hs = t: since ||s||_2 = ||t||_2, we have (s − t)^T (s − t) = 2(s − t)^T s,
so Hs = s − (s − t) = t.
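The construction above can be checked numerically. The sketch below builds H from v = s − t for two arbitrary random vectors rescaled to have equal 2-norms, and verifies the reflector properties derived in parts 1, 2, and 4:

```python
import numpy as np

# Numerical check: for s != t with ||s||_2 = ||t||_2, the Householder
# matrix built from v = s - t maps s to t and is a reflector.
# The vectors below are arbitrary illustrative choices.
rng = np.random.default_rng(0)
s = rng.standard_normal(5)
t = rng.standard_normal(5)
t *= np.linalg.norm(s) / np.linalg.norm(t)  # rescale so ||t||_2 = ||s||_2

v = s - t
H = np.eye(5) - 2.0 * np.outer(v, v) / (v @ v)

assert np.allclose(H @ s, t)            # H maps s to t
assert np.allclose(H, H.T)              # H is symmetric
assert np.allclose(H @ H, np.eye(5))    # H is an involution
assert np.allclose(H.T @ H, np.eye(5))  # H is orthogonal
```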
Problem 2
Let A be a rectangular m × n matrix with full column rank and m > n. Consider the QR
decomposition of A.
1. Show that P0 = I − QQ^T is the projection matrix onto the nullspace of A^T.
2. Show that for every x we have ||Ax − b||_2^2 = ||A(x − x0)||_2^2 + ||Ax0 − b||_2^2,
where x0 is the least squares solution of Ax = b.
3. Show that the minimum value for the 2-norm of the residual is attained when x is
equal to the least squares solution and that this minimum value is equal to ||P0 b||_2.
Solution
1. We know that the nullspace of A^T and the column space of A are the orthogonal
complements of each other. Therefore, any vector x can be written as x = x1 + x2,
where x1 is in the nullspace of A^T and x2 is in the column space of A.
For x1 this means that A^T x1 = 0. Since A has full column rank, the QR decomposition
is defined and

A^T x1 = 0 ⇒ R^T Q^T x1 = 0 ⇒ Q^T x1 = 0

since R is nonsingular. On the other hand, x2 belongs to the column space of A,
therefore it can be written as x2 = Ay = QRy where y ∈ R^n.
Thus, the action of P0 on x amounts to

P0 x = (I − QQ^T)(x1 + x2) = x1 + x2 − QQ^T x1 − QQ^T QRy
     = x1 + x2 − QRy = x1

Therefore, P0 is the projection matrix onto the nullspace of A^T (and QQ^T is the
projection matrix onto the column space of A).
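A quick floating-point sketch of this argument, using a random full-column-rank A and the reduced QR factorization:

```python
import numpy as np

# P0 = I - Q Q^T should annihilate the column space of A (vectors x2 = Ay)
# and map every vector into the nullspace of A^T. A is a random example.
rng = np.random.default_rng(1)
m, n = 6, 3
A = rng.standard_normal((m, n))
Q, R = np.linalg.qr(A)          # reduced QR: Q is m x n, R is n x n
P0 = np.eye(m) - Q @ Q.T

x2 = A @ rng.standard_normal(n)   # a vector in the column space of A
assert np.allclose(P0 @ x2, 0)    # P0 annihilates column-space components
y = rng.standard_normal(m)
assert np.allclose(A.T @ (P0 @ y), 0)  # range of P0 lies in null(A^T)
assert np.allclose(P0 @ P0, P0)   # P0 is idempotent, i.e. a projection
```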
2. We have

||Ax − b||_2^2 = ||A(x − x0) + (Ax0 − b)||_2^2
             = ||A(x − x0)||_2^2 + 2(x − x0)^T A^T (Ax0 − b) + ||Ax0 − b||_2^2
             = ||A(x − x0)||_2^2 + ||Ax0 − b||_2^2

where the cross term vanishes because the least squares solution satisfies the normal
equations A^T (Ax0 − b) = 0.
3. From the equation above, we have that the minimum value for ||Ax − b||_2 is attained
for x = x0, since the term ||Ax0 − b||_2^2 does not depend on x. The least squares
solution is given as

Rx0 = Q^T b ⇒ x0 = R^{-1} Q^T b

Therefore the minimum value for the residual ||Ax − b||_2 is

||Ax0 − b||_2 = ||QRR^{-1}Q^T b − b||_2 = ||(I − QQ^T) b||_2 = ||P0 b||_2

Intuitively, this means that the least squares solution annihilates the component of the
residual in the column space of A, and the minimum value of the residual is exactly
the component of b that is not contained in the column space of A.
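Parts 2 and 3 can be verified together on a random overdetermined system:

```python
import numpy as np

# The QR-based least squares solution x0 = R^{-1} Q^T b should match
# numpy's lstsq, the minimum residual should equal ||P0 b||_2, and the
# Pythagorean identity of part 2 should hold for an arbitrary x.
rng = np.random.default_rng(2)
m, n = 8, 3
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

Q, R = np.linalg.qr(A)
x0 = np.linalg.solve(R, Q.T @ b)
P0 = np.eye(m) - Q @ Q.T

r0 = np.linalg.norm(A @ x0 - b)
assert np.allclose(r0, np.linalg.norm(P0 @ b))                 # min residual = ||P0 b||_2
assert np.allclose(x0, np.linalg.lstsq(A, b, rcond=None)[0])   # agrees with lstsq

x = rng.standard_normal(n)   # arbitrary x for the identity of part 2
lhs = np.linalg.norm(A @ x - b)**2
rhs = np.linalg.norm(A @ (x - x0))**2 + r0**2
assert np.allclose(lhs, rhs)
```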
Problem 3
State whether the following classes of matrices are positive (semi-)definite, negative
(semi-)definite, indefinite, or whether their definiteness cannot be determined in general.
1. Orthogonal matrices
3. Projection matrices
4. Matrices of the form I − P, where P is a projection matrix
5. Householder matrices
6. Upper triangular matrices with positive diagonal elements
7. Symmetric strictly diagonally dominant matrices with positive diagonal elements
Solution
1. Any diagonal matrix with values +1 or −1 on the diagonal is orthogonal. Nevertheless,
it can be positive definite (if it equals I), negative definite (if it equals −I), or indefinite
(in any other case). Thus the definiteness of orthogonal matrices cannot be determined
in general.
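The three cases from the argument above, checked via eigenvalues:

```python
import numpy as np

# All three diagonal matrices are orthogonal, yet their definiteness differs.
I2 = np.diag([1.0, 1.0])     # positive definite
N2 = np.diag([-1.0, -1.0])   # negative definite
M2 = np.diag([1.0, -1.0])    # indefinite

for D in (I2, N2, M2):
    assert np.allclose(D.T @ D, np.eye(2))   # each is orthogonal
assert np.all(np.linalg.eigvalsh(I2) > 0)
assert np.all(np.linalg.eigvalsh(N2) < 0)
ev = np.linalg.eigvalsh(M2)
assert ev.min() < 0 < ev.max()               # mixed signs: indefinite
```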
3. Let V be the vector subspace that a projection matrix P projects onto, and V⊥ its
orthogonal complement. Let x = x1 + x2 be an arbitrary vector, where x1 is the
component of x in V and x2 its component in V⊥. Therefore

x^T P x = (x1 + x2)^T P (x1 + x2) = x1^T P x1 = ||x1||_2^2 ≥ 0

where we used the fact that P is symmetric and Px2 = 0. Therefore a projection
matrix is always positive semi-definite.
4. The matrix I − P is the projection onto the orthogonal complement of the space P
projects onto. Therefore it is a projection matrix itself and thus positive semi-definite.
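Parts 3 and 4 can be checked on a random orthogonal projection P = QQ^T:

```python
import numpy as np

# P = Q Q^T projects onto the column space of a random orthonormal Q;
# both P and I - P should be symmetric projections with eigenvalues in {0, 1}.
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((5, 2)))
P = Q @ Q.T

assert np.allclose(P, P.T) and np.allclose(P @ P, P)         # symmetric projection
assert np.all(np.linalg.eigvalsh(P) >= -1e-12)               # P is PSD
assert np.all(np.linalg.eigvalsh(np.eye(5) - P) >= -1e-12)   # I - P is PSD too
```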
7. A symmetric matrix A with positive diagonal entries that is strictly diagonally
dominant, i.e. |Aii| > Σ_{j≠i} |Aij| for every i, is positive definite. Indeed, for any
nonzero x,

x^T A x = Σ_i Aii xi^2 + Σ_{i,j≠i} Aij xi xj
        ≥ Σ_i |Aii| |xi|^2 − Σ_{i,j≠i} |Aij| |xi| |xj|
        = (1/2) Σ_i (|Aii| + |Aii|) |xi|^2 − Σ_{i,j≠i} |Aij| |xi| |xj|
        > (1/2) Σ_{i,j≠i} (|Aij| + |Aji|) |xi|^2 − Σ_{i,j≠i} |Aij| |xi| |xj|
        = Σ_{i,j≠i} |Aij| ((1/2) |xi|^2 + (1/2) |xj|^2 − |xi| |xj|)
        = (1/2) Σ_{i,j≠i} |Aij| (|xi| − |xj|)^2 ≥ 0

where the first inequality uses Aii = |Aii| > 0, the strict inequality uses strict diagonal
dominance (applied once to the rows and once, via symmetry, to the columns), and the
following equality reindexes Σ_{i,j≠i} |Aji| |xi|^2 as Σ_{i,j≠i} |Aij| |xj|^2 using
|Aij| = |Aji|.
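A numerical check of part 7, using an arbitrary matrix satisfying the hypotheses:

```python
import numpy as np

# A symmetric matrix with positive diagonal that is strictly diagonally
# dominant should have only positive eigenvalues. The matrix below is an
# arbitrary example satisfying those hypotheses.
A = np.array([[ 4.0, -1.0,  2.0],
              [-1.0,  5.0, -2.0],
              [ 2.0, -2.0,  6.0]])

off = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))  # off-diagonal row sums
assert np.all(np.diag(A) > off)             # strict diagonal dominance
assert np.allclose(A, A.T)                  # symmetry
assert np.all(np.linalg.eigvalsh(A) > 0)    # hence positive definite
```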
Problem 4
[Heath 3.12 page 150]
1. Let A be an n × n matrix. Show that any two of the following conditions imply the
third.
(a) AT = A
(b) AT A = I
(c) A2 = I
2. Give a specific example, other than the identity matrix I or a permutation of it, of a
3 × 3 matrix that has all three of these properties.
3. Name a nontrivial class of matrices that have all three of these properties.
Solution
1.
A^T = A and A^T A = I ⇒ A^2 = A^T A = I [by substitution]

A^T = A and A^2 = I ⇒ A^T A = A^2 = I [by substitution]

A^T A = I and A^2 = I ⇒ A^{-1} = A^T and A^{-1} = A ⇒ A^T = A
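As a quick numerical sanity check, a Householder matrix satisfies all three conditions at once (the vector v below is an arbitrary choice):

```python
import numpy as np

# H = I - 2 v v^T / (v^T v) is symmetric, orthogonal, and involutory.
v = np.array([1.0, 2.0, 3.0])
H = np.eye(3) - 2.0 * np.outer(v, v) / (v @ v)

assert np.allclose(H, H.T)              # (a) H^T = H
assert np.allclose(H.T @ H, np.eye(3))  # (b) H^T H = I
assert np.allclose(H @ H, np.eye(3))    # (c) H^2 = I
```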
2.
[ 1    0    0 ]       [  1/3  -2/3  -2/3 ]
[ 0   -1    0 ]   ,   [ -2/3   1/3  -2/3 ]
[ 0    0    1 ]       [ -2/3  -2/3   1/3 ]

Both are symmetric, orthogonal, and involutory; the second is the Householder matrix
I − 2vv^T/(v^T v) with v = (1, 1, 1)^T.
3. Householder matrices (reflectors) have all three properties, as shown in Problem 1.
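The two example matrices (with third rows completed so that each is symmetric and orthogonal: diag(1, −1, 1) and the Householder matrix built from v = (1, 1, 1)^T) can be checked directly:

```python
import numpy as np

# Verify that both example matrices are symmetric, orthogonal, and involutory.
A1 = np.diag([1.0, -1.0, 1.0])
A2 = np.array([[ 1/3, -2/3, -2/3],
               [-2/3,  1/3, -2/3],
               [-2/3, -2/3,  1/3]])

for A in (A1, A2):
    assert np.allclose(A, A.T)              # symmetric
    assert np.allclose(A.T @ A, np.eye(3))  # orthogonal
    assert np.allclose(A @ A, np.eye(3))    # involutory
```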
Problem 5
[Heath 3.16 page 150] Consider the vector a as an n × 1 matrix. What are Q and R in
its QR factorization, and what is the resulting least squares solution of ax ≈ b?
Solution
1. By simple application of the algorithm, we have

Q = a / ||a||_2 ,    R = [ ||a||_2 ]

The least squares solution of ax ≈ b then follows from

Rx = Q^T b ⇒ ||a||_2 x = (a^T b) / ||a||_2 ⇒ x = a^T b / (a^T a)
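These factors and the resulting solution can be verified against NumPy's least squares routine:

```python
import numpy as np

# For an n x 1 matrix a, the reduced QR factors are Q = a/||a||_2 and
# R = [||a||_2], and the least squares solution of a x ≈ b is a^T b / a^T a.
rng = np.random.default_rng(4)
a = rng.standard_normal(6)
b = rng.standard_normal(6)

Q = (a / np.linalg.norm(a)).reshape(-1, 1)
R = np.array([[np.linalg.norm(a)]])
assert np.allclose(Q @ R, a.reshape(-1, 1))   # QR reproduces a
assert np.allclose(Q.T @ Q, np.eye(1))        # Q has an orthonormal column

x = (a @ b) / (a @ a)
assert np.allclose(x, np.linalg.lstsq(a.reshape(-1, 1), b, rcond=None)[0][0])
```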