
The University of Sydney
MATH3906 Representation Theory
(http://www.maths.usyd.edu.au/u/UG/IM/MATH3906/)
Semester 2, 1997                                            Lecturer: R. Howlett

Tutorial 1

1. If $A$ is an $m \times n$ matrix over the field $\mathbb{C}$ (complex numbers) then $\overline{A}$ is the $m \times n$ matrix whose entries are the complex conjugates of the entries of $A$.

(i) Show that an arbitrary complex matrix can be written as $P + iQ$ with $P$ and $Q$ real.

(ii) $A$ is Hermitian if $A = \overline{A}^t$ (the transpose of $\overline{A}$). Show that $A$ is Hermitian if and only if its real part ($P$) is symmetric and its imaginary part ($Q$) is skew-symmetric.

(iii) Show that if $A$ is Hermitian then $\overline{v}^t A v$ is a real number for all complex column vectors $v$ (of appropriate size).

(iv) A Hermitian matrix $A$ is said to be positive definite if $\overline{v}^t A v > 0$ for all nonzero $v$. Prove that positive definite matrices are nonsingular.

(v) Show that a Hermitian matrix $A$ is positive definite if and only if there exists a nonsingular $B$ such that $A = \overline{B}^t B$. (The "if" part is OK. For the "only if" you have to use row and column operations. Start by showing that the diagonal entries of $A$ are real and positive.)

(vi) Show that the sum of two positive definite matrices is positive definite.

(vii) Let $G$ be a finite subgroup of $\mathrm{GL}_n(\mathbb{C})$. Prove that there exists a positive definite matrix $A$ such that $\overline{Y}^t A Y = A$ for all $Y \in G$. (Hint: try $A = \sum_{X \in G} \overline{X}^t X$.)

(viii) Prove that if $G$ is a finite subgroup of $\mathrm{GL}_n(\mathbb{C})$ then there exists a nonsingular $B$ such that $B X B^{-1}$ is unitary for all $X \in G$. (A matrix is unitary if its inverse is the transpose of its conjugate.)

Solution.

(i) Let $A$ have $(r,s)$-entry $\alpha_{rs} \in \mathbb{C}$. Writing $\alpha_{rs} = \beta_{rs} + i\gamma_{rs}$ with $\beta_{rs}, \gamma_{rs} \in \mathbb{R}$, we see that $A = P + iQ$ where $P$ and $Q$ have $(r,s)$-entries $\beta_{rs}$ and $\gamma_{rs}$ (respectively).

(ii) Since $\overline{(P + iQ)}^t = (P - iQ)^t = P^t - iQ^t$, we see that $A = P + iQ$ is Hermitian if and only if $P^t = P$ and $Q^t = -Q$.

(iii) Recall that transposing reverses products; that is, $(XY)^t = Y^t X^t$ whenever the left hand side is defined. (Note that this implies that $(A^{-1})^t = (A^t)^{-1}$ whenever $A$ is nonsingular. It is also clear that taking complex conjugates preserves sums and products, and commutes with the maps $A \mapsto A^{-1}$ and $A \mapsto A^t$.) Let $v$ be an arbitrary column vector and let $z = \overline{v}^t A v$. Since $z$ is a $1 \times 1$ matrix we have $z^t = z$, and so
$$\overline{z} = \overline{(\overline{v}^t A v)}^{\,t} = \overline{v}^t \overline{A}^t v = \overline{v}^t A v = z.$$
Thus $z$ is real.
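The following Python/numpy check (not part of the original sheet; the matrix and vector are made up for illustration) shows parts (ii) and (iii) numerically: a matrix $P + iQ$ with $P$ symmetric and $Q$ skew-symmetric is Hermitian, and $\overline{v}^t A v$ comes out real.

    import numpy as np

    # Hypothetical Hermitian matrix A = P + iQ: P symmetric, Q skew-symmetric
    P = np.array([[2.0, 1.0], [1.0, 3.0]])     # symmetric real part
    Q = np.array([[0.0, -1.0], [1.0, 0.0]])    # skew-symmetric imaginary part
    A = P + 1j * Q
    assert np.allclose(A, A.conj().T)          # A = conj(A)^t, so A is Hermitian

    rng = np.random.default_rng(0)
    v = rng.standard_normal(2) + 1j * rng.standard_normal(2)
    z = v.conj().T @ A @ v                     # conj(v)^t A v
    assert abs(z.imag) < 1e-12                 # z is (numerically) real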


(iv) Suppose that $A$ is positive definite. Note first that since $A$ is Hermitian it must be square (as its transpose is the same shape as itself). Now let $v$ be in the nullspace of $A$; that is, $v$ is a column vector such that $Av = 0$. Then $\overline{v}^t A v = \overline{v}^t 0 = 0$, and positive definiteness of $A$ gives $v = 0$. So the nullspace of $A$ is $\{0\}$; this implies that $A$ is nonsingular.

(v) Recall that if $v \in \mathbb{C}^n$ and the $k$th entry of $v$ is $x_k + iy_k$ (for $k = 1, 2, \ldots, n$) then $\overline{v}^t v = \sum_{k=1}^{n} (x_k^2 + y_k^2)$, which is real, nonnegative, and zero only if $v = 0$. This shows that the identity matrix is positive definite. Suppose now that $A = \overline{B}^t C B$ where $B, C \in \mathrm{GL}_n(\mathbb{C})$ and $C$ is positive definite, and let $v \in \mathbb{C}^n$ be nonzero. Then $Bv \ne 0$, since $B$ is nonsingular, and since $C$ is positive definite it follows that $\overline{(Bv)}^t C (Bv) > 0$. But $\overline{v}^t A v = \overline{(Bv)}^t C (Bv)$, and since this is positive for all nonzero $v$ it follows that $A$ is positive definite. Putting $C = I$ gives the "if" part.

Let $A$ be an arbitrary positive definite $n \times n$ Hermitian matrix. We use induction on $n$ to prove that $A$ has the desired form; note that in the case $n = 1$ the matrix $A$ is simply a positive real number, and we may take $B = \sqrt{A}$. Let $e_l$ be the $l$th column of the identity matrix (so that $e_1, e_2, \ldots, e_n$ comprise the standard basis of $\mathbb{C}^n$). The $(l,l)$-entry of $A$ is $\overline{e_l}^{\,t} A e_l$, which must be positive since $A$ is positive definite. Thus we can write
$$A = \begin{pmatrix} a & \overline{x}^t \\ x & A' \end{pmatrix}$$
where $a$ is real and positive, $x \in \mathbb{C}^{n-1}$ and $A'$ is some $(n-1) \times (n-1)$ Hermitian matrix. Now set
$$D = \begin{pmatrix} \sqrt{a}^{\,-1} & 0 \\ -a^{-1}x & I \end{pmatrix}$$
and observe that $D$ is nonsingular; indeed, as a row operation matrix the effect of $D$ is to divide the first row by $\sqrt{a}$ and add multiples of the first row to the others. We see that the first column of $DA$ is $\sqrt{a}\,e_1$. Now postmultiplication by $\overline{D}^t$ performs a corresponding sequence of column operations, and we find that
$$D A \overline{D}^t = \begin{pmatrix} 1 & 0 \\ 0 & A'' \end{pmatrix}.$$
Since $A$ was positive definite, this must be too. Hence $A''$ is an $(n-1) \times (n-1)$ positive definite Hermitian matrix. By induction we can write $A'' = \overline{Y}^t Y$, and this gives
$$A = D^{-1} \begin{pmatrix} 1 & 0 \\ 0 & \overline{Y}^t \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & Y \end{pmatrix} (\overline{D}^t)^{-1} = \overline{B}^t B$$
where $B = \begin{pmatrix} 1 & 0 \\ 0 & Y \end{pmatrix} (\overline{D}^t)^{-1}$.
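The induction in (v) is, in effect, a Cholesky-style factorisation computed one row and column at a time. The sketch below (not part of the original sheet; the function name and the test matrix are made up for illustration, and numpy's built-in Cholesky routine would serve equally well) follows the proof directly: split off the positive $(1,1)$-entry $a$, form $A''$, recurse, and assemble $B$.

    import numpy as np

    def hermitian_sqrt_factor(A):
        """Return B with conj(B)^t B = A, following the reduction in (v).
        A is assumed Hermitian positive definite (a sketch, not robust code)."""
        A = np.asarray(A, dtype=complex)
        n = A.shape[0]
        if n == 1:
            return np.sqrt(A.real)                        # 1x1 case: A is a positive real
        a = A[0, 0].real                                  # positive diagonal entry
        x = A[1:, 0]                                      # column below the (1,1)-entry
        A2 = A[1:, 1:] - np.outer(x, x.conj()) / a        # A'' from D A conj(D)^t
        Y = hermitian_sqrt_factor(A2)                     # induction: A'' = conj(Y)^t Y
        B = np.zeros((n, n), dtype=complex)
        B[0, 0] = np.sqrt(a)
        B[0, 1:] = x.conj() / np.sqrt(a)                  # first row of [1 0; 0 Y](conj(D)^t)^{-1}
        B[1:, 1:] = Y
        return B

    # quick check on a hypothetical positive definite matrix M conj(M)^t + I
    rng = np.random.default_rng(1)
    M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
    A = M @ M.conj().T + np.eye(3)
    B = hermitian_sqrt_factor(A)
    assert np.allclose(B.conj().T @ B, A)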
(vi) If $A, B \in \mathrm{GL}_n(\mathbb{C})$ are positive definite and $0 \ne v \in \mathbb{C}^n$ then
$$\overline{v}^t (A + B) v = \overline{v}^t A v + \overline{v}^t B v > 0$$
(since $\overline{v}^t A v > 0$ and $\overline{v}^t B v > 0$).

(vii) Put $A = \sum_{X \in G} \overline{X}^t X$, as the hint suggests. Each summand $\overline{X}^t X = \overline{X}^t I X$ is positive definite by (v), and hence $A$ is positive definite by (vi). We see that
$$\overline{Y}^t A Y = \sum_{X \in G} \overline{(XY)}^{\,t} (XY) = \sum_{Z \in G} \overline{Z}^t Z = A$$
(since $Z = XY$ runs through all elements of $G$ as $X$ does).

(viii) By (vii) we can find a positive definite $A$ such that $\overline{Y}^t A Y = A$ for all $Y \in G$, and by (v) we can put $A = \overline{B}^t B$. But the equation $\overline{Y}^t \overline{B}^t B Y = \overline{B}^t B$ can be written as $B Y^{-1} B^{-1} = (\overline{B}^{-1})^t\, \overline{Y}^t\, \overline{B}^t$, or, equivalently,
$$(B Y B^{-1})^{-1} = \overline{(B Y B^{-1})}^{\,t},$$
showing that $B Y B^{-1}$ is unitary for all $Y \in G$.
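A small numerical sketch of (vii) and (viii) (not part of the original sheet; the generator $X_0$ and the finite group are a made-up example, and numpy's Cholesky is used in place of the hand construction from (v)): $G$ is cyclic of order 4 and non-unitary, $A = \sum_{X \in G} \overline{X}^t X$ is $G$-invariant, and conjugating by $B$ with $A = \overline{B}^t B$ makes every element unitary.

    import numpy as np

    # G is generated by a non-unitary matrix X0 with X0**4 = I
    X0 = np.array([[1.0, -2.0], [1.0, -1.0]], dtype=complex)

    # generate G by repeated multiplication
    G = [np.eye(2, dtype=complex)]
    X = X0.copy()
    while not np.allclose(X, np.eye(2)):
        G.append(X)
        X = X @ X0

    # (vii): A = sum of conj(X)^t X over G is G-invariant
    A = sum(np.conj(X).T @ X for X in G)
    for Y in G:
        assert np.allclose(np.conj(Y).T @ A @ Y, A)

    # (v): numpy's Cholesky gives A = L conj(L)^t, so B = conj(L)^t has conj(B)^t B = A
    L = np.linalg.cholesky(A)
    B = np.conj(L).T
    Binv = np.linalg.inv(B)

    # (viii): B X B^{-1} is unitary for every X in G
    for X in G:
        U = B @ X @ Binv
        assert np.allclose(np.conj(U).T @ U, np.eye(2))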
2. Recall that the dot product on $\mathbb{C}^n$ is defined by $u \cdot v = \overline{u}^t v$, and that unitary matrices preserve it (in the sense that $(Xu) \cdot (Xv) = u \cdot v$ for all $u$ and $v$ if $X$ is unitary). Recall also that if $U$ is a subspace of $\mathbb{C}^n$ then $\mathbb{C}^n = U \oplus U^\perp$, where
$$U^\perp = \{\, v \in \mathbb{C}^n \mid u \cdot v = 0 \text{ for all } u \in U \,\}$$
(the orthogonal complement of $U$).

Let $G$ be a finite group of $n \times n$ unitary matrices, and let $U$ be a $G$-invariant subspace of $\mathbb{C}^n$. (That is, if $X \in G$ and $u \in U$ then $Xu \in U$.) Prove that the orthogonal complement of $U$ is also $G$-invariant.

Solution.

Let $v \in U^\perp$ and let $X \in G$. Then for all $u \in U$ we have that $X^{-1}u \in U$ (since $X^{-1} \in G$ and $U$ is $G$-invariant), and so
$$(Xv) \cdot u = (Xv) \cdot \bigl(X(X^{-1}u)\bigr) = v \cdot (X^{-1}u) = 0,$$
where the second equality holds since $X$ is unitary and the third since $v \in U^\perp$. Hence $Xv \in U^\perp$, and since this holds for all $X \in G$ and $v \in U^\perp$ we have shown that $U^\perp$ is $G$-invariant.
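A quick numerical illustration of question 2 (the group and subspace are just a convenient made-up example): $G$ is the group of $3 \times 3$ permutation matrices (which are unitary), $U = \mathrm{span}\{(1,1,1)\}$ is $G$-invariant, and every $X \in G$ maps $U^\perp$ back into $U^\perp$.

    import numpy as np
    from itertools import permutations

    G = [np.eye(3)[list(p)] for p in permutations(range(3))]   # permutation matrices

    u = np.ones(3) / np.sqrt(3)                                 # unit vector spanning U
    # orthonormal basis of U^perp (columns of W)
    W = np.array([[1, 1], [-1, 1], [0, -2]]) / np.array([np.sqrt(2), np.sqrt(6)])

    for X in G:
        assert np.allclose(np.conj(X).T @ X, np.eye(3))         # X is unitary
        for w in W.T:                                            # w runs over a basis of U^perp
            assert abs(np.vdot(u, X @ w)) < 1e-12                # X w is still orthogonal to U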
3. Let $H$ and $N$ be groups and $\varphi\colon H \to \mathrm{Aut}(N)$ a homomorphism. Define
$$H \times N = \{\, (h, x) \mid h \in H,\ x \in N \,\}$$
with multiplication given by
$$(h, x)(k, y) = (hk,\ x^{\varphi(k)} y)$$
for all $h, k \in H$ and $x, y \in N$. Prove that this makes $H \times N$ into a group. (Such a group is called a semidirect product of $N$ by $H$. If $\varphi$ is the trivial homomorphism ($h \mapsto 1 \in \mathrm{Aut}(N)$ for all $h \in H$) we get the direct product of $N$ and $H$.)

Solution.

Since $\varphi$ is a homomorphism we have $\varphi(1) = 1$, where the 1 on the left hand side is the identity element of $H$ and the 1 on the right hand side is the identity automorphism of $N$. Hence our multiplication rule gives
$$(h, x)(1, 1) = (h1,\ x^{\varphi(1)}1) = (h, x).$$
Since all automorphisms of $N$ map 1 to 1 we also find that
$$(1, 1)(h, x) = (1h,\ 1^{\varphi(h)}x) = (h, x).$$
So $H \times N$ has an identity element. The following calculation proves associativity:
$$\bigl((h, x)(k, y)\bigr)(l, z) = (hk,\ x^{\varphi(k)}y)(l, z) = \bigl(hkl,\ (x^{\varphi(k)}y)^{\varphi(l)}z\bigr) = \bigl(hkl,\ x^{\varphi(k)\varphi(l)}y^{\varphi(l)}z\bigr) = \bigl(hkl,\ x^{\varphi(kl)}y^{\varphi(l)}z\bigr) = (h, x)\bigl(kl,\ y^{\varphi(l)}z\bigr) = (h, x)\bigl((k, y)(l, z)\bigr).$$

Let $(h, x)$ be an arbitrary element of $H \times N$, and let $k = h^{-1}$ and $y = (x^{-1})^{\varphi(k)}$. Since $(x^{-1})^{\varphi(k)} = (x^{\varphi(k)})^{-1}$ we see that $(h, x)(k, y) = (hk,\ x^{\varphi(k)}y) = (1, 1)$. Moreover, since $\varphi(k)\varphi(h) = \varphi(kh) = 1$ we also have $y^{\varphi(h)} = x^{-1}$, and so $(k, y)(h, x) = (kh,\ y^{\varphi(h)}x) = (1, 1)$; thus $(k, y)$ is indeed the inverse of $(h, x)$.
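A concrete instance of question 3 (the groups chosen here are just an example, not part of the original sheet): $H = \mathbb{Z}/2$, $N = \mathbb{Z}/3$, with $\varphi$ sending the nontrivial element of $H$ to the inversion automorphism of $N$; the resulting semidirect product is $S_3$. The sketch below implements the multiplication rule and checks the group axioms exhaustively.

    from itertools import product

    H = [0, 1]                      # Z/2 under addition mod 2
    N = [0, 1, 2]                   # Z/3 under addition mod 3

    def phi(k):
        """The automorphism of N attached to k in H (identity or inversion)."""
        return (lambda x: x % 3) if k == 0 else (lambda x: (-x) % 3)

    def mult(p, q):
        """(h, x)(k, y) = (hk, x^phi(k) y), written additively in H and N."""
        (h, x), (k, y) = p, q
        return ((h + k) % 2, (phi(k)(x) + y) % 3)

    G = list(product(H, N))
    e = (0, 0)

    assert all(mult(g, e) == g == mult(e, g) for g in G)                     # identity
    assert all(mult(mult(a, b), c) == mult(a, mult(b, c))
               for a in G for b in G for c in G)                             # associativity
    assert all(any(mult(g, g2) == e == mult(g2, g) for g2 in G) for g in G)  # inverses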
