7 Bilinear forms II

IB Linear Algebra



7.1 Symmetric bilinear forms and quadratic forms
Definition (Symmetric bilinear form). Let $V$ be a vector space over $F$. A bilinear form $\phi: V \times V \to F$ is symmetric if
$$\phi(v, w) = \phi(w, v)$$
for all $v, w \in V$.
Example. If $S \in \mathrm{Mat}_n(F)$ is a symmetric matrix, i.e. $S^T = S$, the bilinear form $\phi: F^n \times F^n \to F$ defined by
$$\phi(x, y) = x^T S y = \sum_{i,j=1}^n x_i S_{ij} y_j$$
is a symmetric bilinear form.
This example is typical in the following sense:
Lemma. Let $V$ be a finite-dimensional vector space over $F$, and $\phi: V \times V \to F$ a symmetric bilinear form. Let $(e_1, \cdots, e_n)$ be a basis for $V$, and let $M$ be the matrix representing $\phi$ with respect to this basis, i.e. $M_{ij} = \phi(e_i, e_j)$. Then $\phi$ is symmetric if and only if $M$ is symmetric.
Proof. If $\phi$ is symmetric, then
$$M_{ij} = \phi(e_i, e_j) = \phi(e_j, e_i) = M_{ji}.$$
So $M^T = M$, i.e. $M$ is symmetric.
If $M$ is symmetric, then
$$\phi(x, y) = \phi\left(\sum_i x_i e_i, \sum_j y_j e_j\right) = \sum_{i,j} x_i M_{ij} y_j = \sum_{i,j} y_j M_{ji} x_i = \phi(y, x).$$
We are going to see what happens when we change basis. As in the case of endomorphisms, we will have to change basis in the same way on both sides.
Lemma. Let $V$ be a finite-dimensional vector space, and $\phi: V \times V \to F$ a bilinear form. Let $(e_1, \cdots, e_n)$ and $(f_1, \cdots, f_n)$ be bases of $V$ such that
$$f_i = \sum_{k=1}^n P_{ki} e_k.$$
If $A$ represents $\phi$ with respect to $(e_i)$ and $B$ represents $\phi$ with respect to $(f_i)$, then
$$B = P^T A P.$$
Proof. This is a special case of the general formula proved before.
This motivates the following definition:
Definition (Congruent matrices). Two square matrices $A, B$ are congruent if there exists some invertible $P$ such that
$$B = P^T A P.$$
It is easy to see that congruence is an equivalence relation. Two matrices are congruent if and only if they represent the same bilinear form with respect to different bases.
Thus, to classify (symmetric) bilinear forms is the same as classifying (sym-
metric) matrices up to congruence.
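As a quick numerical sanity check of the congruence relation, here is a sketch in exact rational arithmetic (the symmetric matrix $A$ and invertible $P$ are arbitrary examples, not from the text): $B = P^T A P$ is again symmetric, and represents the same form in the new coordinates.

```python
from fractions import Fraction as Fr

def matmul(X, Y):
    # naive matrix product, exact over the rationals
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

# arbitrary symmetric matrix A and invertible change-of-basis matrix P
A = [[Fr(1), Fr(2)], [Fr(2), Fr(5)]]
P = [[Fr(1), Fr(3)], [Fr(0), Fr(1)]]

B = matmul(transpose(P), matmul(A, P))   # B = P^T A P
print(B)                                 # congruent to A, still symmetric

# the form is unchanged: if v = P a, then a^T B a = v^T A v
a = [[Fr(1)], [Fr(1)]]
v = matmul(P, a)
lhs = matmul(transpose(a), matmul(B, a))[0][0]
rhs = matmul(transpose(v), matmul(A, v))[0][0]
print(lhs == rhs)  # True
```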
Before we do the classification, we first look at quadratic forms, which are
something derived from bilinear forms.
Definition (Quadratic form). A function $q: V \to F$ is a quadratic form if there exists some bilinear form $\phi$ such that
$$q(v) = \phi(v, v)$$
for all $v \in V$.
Note that quadratic forms are not linear maps (they are quadratic).
Example. Let $V = \mathbb{R}^2$ and $\phi$ be represented by $A$ with respect to the standard basis. Then
$$q\begin{pmatrix}x\\y\end{pmatrix} = \begin{pmatrix}x & y\end{pmatrix}\begin{pmatrix}A_{11} & A_{12}\\ A_{21} & A_{22}\end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix} = A_{11}x^2 + (A_{12} + A_{21})xy + A_{22}y^2.$$
Notice that if $A$ is replaced by the symmetric matrix $\frac{1}{2}(A + A^T)$, then we get a different $\phi$, but the same $q$. This is in fact true in general.
Proposition (Polarization identity). Suppose that $\operatorname{char} F \neq 2$, i.e. $1 + 1 \neq 0$ in $F$ (e.g. if $F$ is $\mathbb{R}$ or $\mathbb{C}$). If $q: V \to F$ is a quadratic form, then there exists a unique symmetric bilinear form $\phi: V \times V \to F$ such that
$$q(v) = \phi(v, v).$$
Proof. Let $\psi: V \times V \to F$ be a bilinear form such that $\psi(v, v) = q(v)$. We define $\phi: V \times V \to F$ by
$$\phi(v, w) = \frac{1}{2}(\psi(v, w) + \psi(w, v))$$
for all $v, w \in V$. This is clearly a bilinear form, it is clearly symmetric, and it satisfies the required condition. So we have proved the existence part.
To prove uniqueness, we want to find out the values of $\phi(v, w)$ in terms of what $q$ tells us. Suppose $\phi$ is such a symmetric bilinear form. We compute
$$q(v + w) = \phi(v + w, v + w) = \phi(v, v) + \phi(v, w) + \phi(w, v) + \phi(w, w) = q(v) + 2\phi(v, w) + q(w).$$
So we have
$$\phi(v, w) = \frac{1}{2}(q(v + w) - q(v) - q(w)).$$
So it is determined by q, and hence unique.
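The uniqueness formula can be checked numerically; here is a small sketch in exact rational arithmetic (the symmetric matrix $S$ and the vectors are arbitrary examples, not from the text):

```python
from fractions import Fraction as Fr

# an arbitrary symmetric matrix defining phi(v, w) = v^T S w on Q^2
S = [[Fr(2), Fr(3)], [Fr(3), Fr(-1)]]

def phi(v, w):
    return sum(v[i] * S[i][j] * w[j] for i in range(2) for j in range(2))

def q(v):
    return phi(v, v)

v, w = [Fr(1), Fr(4)], [Fr(-2), Fr(5)]
vw = [v[i] + w[i] for i in range(2)]

# polarization: phi(v, w) = (q(v + w) - q(v) - q(w)) / 2
recovered = (q(vw) - q(v) - q(w)) / 2
print(recovered == phi(v, w))  # True
```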
Theorem. Let $V$ be a finite-dimensional vector space over $F$, and $\phi: V \times V \to F$ a symmetric bilinear form. Then there exists a basis $(e_1, \cdots, e_n)$ for $V$ such that $\phi$ is represented by a diagonal matrix with respect to this basis.
This tells us that classifying symmetric bilinear forms is easier than classifying endomorphisms: for endomorphisms, even over $\mathbb{C}$, we cannot always diagonalize, but we can for symmetric bilinear forms over arbitrary fields.
Proof. We induct on $n = \dim V$. The cases $n = 0$ and $n = 1$ are trivial, since all matrices are diagonal.
Suppose we have proven the result for all spaces of dimension less than $n$.
First consider the case where $\phi(v, v) = 0$ for all $v \in V$. We want to show that we must have $\phi = 0$. This follows from the polarization identity, since this $\phi$ induces the zero quadratic form, and we know that there is a unique symmetric bilinear form that induces the zero quadratic form. Since the zero bilinear form also induces the zero quadratic form, we must have $\phi = 0$. Then $\phi$ is represented by the zero matrix with respect to any basis, which is trivially diagonal.
If not, pick $e_1 \in V$ such that $\phi(e_1, e_1) \neq 0$. Let
$$U = \ker \phi(e_1, \cdot) = \{u \in V : \phi(e_1, u) = 0\}.$$
Since $\phi(e_1, \cdot) \in V^* \setminus \{0\}$, we know that $\dim U = n - 1$ by the rank-nullity theorem.
Our objective is to find other basis elements $e_2, \cdots, e_n$ such that $\phi(e_1, e_j) = 0$ for all $j > 1$. For this to happen, we need to find them inside $U$.
Now consider $\phi|_{U \times U}: U \times U \to F$, a symmetric bilinear form. By the induction hypothesis, we can find a basis $e_2, \cdots, e_n$ for $U$ such that $\phi|_{U \times U}$ is represented by a diagonal matrix with respect to this basis.
Now by construction, $\phi(e_i, e_j) = 0$ for all $1 \le i \neq j \le n$, and $(e_1, \cdots, e_n)$ is a basis for $V$. So we're done.
Example. Let $q$ be a quadratic form on $\mathbb{R}^3$ given by
$$q\begin{pmatrix}x\\y\\z\end{pmatrix} = x^2 + y^2 + z^2 + 2xy + 4yz + 6xz.$$
We want to find a basis $f_1, f_2, f_3$ for $\mathbb{R}^3$ such that $q$ is of the form
$$q(a f_1 + b f_2 + c f_3) = \lambda a^2 + \mu b^2 + \nu c^2$$
for some $\lambda, \mu, \nu \in \mathbb{R}$.
There are two ways to do this. The first way is to follow the proof we just had. We first find our symmetric bilinear form. It is the bilinear form represented by the matrix
$$A = \begin{pmatrix}1 & 1 & 3\\ 1 & 1 & 2\\ 3 & 2 & 1\end{pmatrix}.$$
We then find $f_1$ such that $\phi(f_1, f_1) \neq 0$. We note that $q(e_1) = 1 \neq 0$. So we pick
$$f_1 = e_1 = \begin{pmatrix}1\\0\\0\end{pmatrix}.$$
Then
$$\phi(e_1, v) = \begin{pmatrix}1 & 0 & 0\end{pmatrix}\begin{pmatrix}1 & 1 & 3\\ 1 & 1 & 2\\ 3 & 2 & 1\end{pmatrix}\begin{pmatrix}v_1\\v_2\\v_3\end{pmatrix} = v_1 + v_2 + 3v_3.$$
Next we need to pick our $f_2$. Since it is in the kernel of $\phi(f_1, \cdot)$, it must satisfy
$$\phi(f_1, f_2) = 0.$$
To continue our proof inductively, we also have to pick an $f_2$ such that $\phi(f_2, f_2) \neq 0$. For example, we can pick
$$f_2 = \begin{pmatrix}3\\0\\-1\end{pmatrix}.$$
Then we have $q(f_2) = -8$.
Then we have
$$\phi(f_2, v) = \begin{pmatrix}3 & 0 & -1\end{pmatrix}\begin{pmatrix}1 & 1 & 3\\ 1 & 1 & 2\\ 3 & 2 & 1\end{pmatrix}\begin{pmatrix}v_1\\v_2\\v_3\end{pmatrix} = v_2 + 8v_3.$$
Finally, we want $\phi(f_1, f_3) = \phi(f_2, f_3) = 0$. Then
$$f_3 = \begin{pmatrix}5\\-8\\1\end{pmatrix}$$
works. We have $q(f_3) = 8$.
With these basis elements, we have
$$q(a f_1 + b f_2 + c f_3) = \phi(a f_1 + b f_2 + c f_3, a f_1 + b f_2 + c f_3) = a^2 q(f_1) + b^2 q(f_2) + c^2 q(f_3) = a^2 - 8b^2 + 8c^2.$$
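We can verify this diagonalization numerically; here is a sketch in exact rational arithmetic, where the matrix $P$ simply collects $f_1, f_2, f_3$ as columns, so that $P^T A P$ should be the diagonal matrix $\operatorname{diag}(1, -8, 8)$.

```python
from fractions import Fraction as Fr

def matmul(X, Y):
    # naive matrix product, exact over the rationals
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

# the matrix of the bilinear form from the example
A = [[Fr(1), Fr(1), Fr(3)],
     [Fr(1), Fr(1), Fr(2)],
     [Fr(3), Fr(2), Fr(1)]]

# columns are f1 = (1,0,0), f2 = (3,0,-1), f3 = (5,-8,1)
P = [[Fr(1), Fr(3), Fr(5)],
     [Fr(0), Fr(0), Fr(-8)],
     [Fr(0), Fr(-1), Fr(1)]]

D = matmul(transpose(P), matmul(A, P))   # should be diag(1, -8, 8)
print(D)
```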
Alternatively, we can solve the problem by completing the square. We have
$$x^2 + y^2 + z^2 + 2xy + 4yz + 6xz = (x + y + 3z)^2 - 2yz - 8z^2 = (x + y + 3z)^2 - 8\left(z + \frac{y}{8}\right)^2 + \frac{1}{8}y^2.$$
We now see that
$$\phi\left(\begin{pmatrix}x\\y\\z\end{pmatrix}, \begin{pmatrix}x'\\y'\\z'\end{pmatrix}\right) = (x + y + 3z)(x' + y' + 3z') - 8\left(z + \frac{y}{8}\right)\left(z' + \frac{y'}{8}\right) + \frac{1}{8}yy'.$$
Why do we know this? This is clearly a symmetric bilinear form, and it also clearly induces the $q$ given above. By uniqueness, we know that this is the right symmetric bilinear form.
We now use this form to find our $f_1, f_2, f_3$ such that $\phi(f_i, f_j) = 0$ for $i \neq j$.
To do so, we just solve the equations
$$x + y + 3z = 1, \quad z + \frac{1}{8}y = 0, \quad y = 0.$$
This gives our first vector as
$$f_1 = \begin{pmatrix}1\\0\\0\end{pmatrix}.$$
We then solve
$$x + y + 3z = 0, \quad z + \frac{1}{8}y = 1, \quad y = 0.$$
So we have
$$f_2 = \begin{pmatrix}-3\\0\\1\end{pmatrix}.$$
Finally, we solve
$$x + y + 3z = 0, \quad z + \frac{1}{8}y = 0, \quad y = 1.$$
This gives
$$f_3 = \begin{pmatrix}-5/8\\1\\-1/8\end{pmatrix}.$$
Then we can see that the result follows, and we get
$$q(a f_1 + b f_2 + c f_3) = a^2 - 8b^2 + \frac{1}{8}c^2.$$
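The second basis can be checked the same way; a sketch where $P_2$ collects the new $f_1, f_2, f_3$ as columns, so that $P_2^T A P_2$ should be $\operatorname{diag}(1, -8, 1/8)$.

```python
from fractions import Fraction as Fr

def matmul(X, Y):
    # naive matrix product, exact over the rationals
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

A = [[Fr(1), Fr(1), Fr(3)],
     [Fr(1), Fr(1), Fr(2)],
     [Fr(3), Fr(2), Fr(1)]]

# columns are f1 = (1,0,0), f2 = (-3,0,1), f3 = (-5/8, 1, -1/8)
P2 = [[Fr(1), Fr(-3), Fr(-5, 8)],
      [Fr(0), Fr(0),  Fr(1)],
      [Fr(0), Fr(1),  Fr(-1, 8)]]

D2 = matmul(transpose(P2), matmul(A, P2))  # should be diag(1, -8, 1/8)
print(D2)
```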
We see that the diagonal matrix we get is not unique. We can rescale our basis vectors by any non-zero constants and get an equivalent expression.
Theorem. Let $\phi$ be a symmetric bilinear form on a finite-dimensional complex vector space $V$. Then there exists a basis $(v_1, \cdots, v_n)$ for $V$ such that $\phi$ is represented by
$$\begin{pmatrix}I_r & 0\\ 0 & 0\end{pmatrix}$$
with respect to this basis, where $r = r(\phi)$.
Proof. We've already shown that there exists a basis $(e_1, \cdots, e_n)$ such that $\phi(e_i, e_j) = \lambda_i \delta_{ij}$ for some $\lambda_i$. By reordering the $e_i$, we can assume that $\lambda_1, \cdots, \lambda_r \neq 0$ and $\lambda_{r+1}, \cdots, \lambda_n = 0$.
For each $1 \le i \le r$, there exists some $\mu_i$ such that $\mu_i^2 = \lambda_i$. For $r + 1 \le i \le n$, we let $\mu_i = 1$ (or anything non-zero). We define
$$v_i = \frac{e_i}{\mu_i}.$$
Then
$$\phi(v_i, v_j) = \frac{1}{\mu_i \mu_j}\phi(e_i, e_j) = \begin{cases}0 & i \neq j \text{ or } i = j > r\\ 1 & i = j \le r.\end{cases}$$
So done.
Note that it follows that for the corresponding quadratic form $q$, we have
$$q\left(\sum_{i=1}^n a_i v_i\right) = \sum_{i=1}^r a_i^2.$$
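The rescaling step $\mu_i^2 = \lambda_i$ is exactly where we need to work over $\mathbb{C}$: negative (or any non-zero) diagonal entries can be scaled to 1. A sketch with hypothetical diagonal entries (not from the text):

```python
import cmath

# hypothetical diagonal entries of a symmetric bilinear form over C
lam = [4 + 0j, -9 + 0j, 0j]

# mu_i with mu_i^2 = lam_i for non-zero entries; anything non-zero otherwise
mu = [cmath.sqrt(l) if l != 0 else 1 for l in lam]

# after setting v_i = e_i / mu_i we get phi(v_i, v_i) = lam_i / mu_i^2
rescaled = [l / (m * m) for l, m in zip(lam, mu)]
print(rescaled)  # entries become 1, 1, 0
```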
Corollary. Every symmetric $A \in \mathrm{Mat}_n(\mathbb{C})$ is congruent to a unique matrix of the form
$$\begin{pmatrix}I_r & 0\\ 0 & 0\end{pmatrix}.$$
Now this theorem is a bit too strong, and we are going to fix that next lecture,
by talking about Hermitian forms and sesquilinear forms. Before that, we do
the equivalent result for real vector spaces.
Theorem. Let $\phi$ be a symmetric bilinear form on a finite-dimensional vector space over $\mathbb{R}$. Then there exists a basis $(v_1, \cdots, v_n)$ for $V$ such that $\phi$ is represented by
$$\begin{pmatrix}I_p & 0 & 0\\ 0 & -I_q & 0\\ 0 & 0 & 0\end{pmatrix},$$
with $p + q = r(\phi)$, $p, q \ge 0$. Equivalently, the corresponding quadratic form is given by
$$q\left(\sum_{i=1}^n a_i v_i\right) = \sum_{i=1}^p a_i^2 - \sum_{j=p+1}^{p+q} a_j^2.$$
Note that we have seen these things in special relativity, where the Minkowski inner product is given by the symmetric bilinear form represented by
$$\begin{pmatrix}1 & 0 & 0 & 0\\ 0 & -1 & 0 & 0\\ 0 & 0 & -1 & 0\\ 0 & 0 & 0 & -1\end{pmatrix},$$
in units where $c = 1$.
Proof. We've already shown that there exists a basis $(e_1, \cdots, e_n)$ such that $\phi(e_i, e_j) = \lambda_i \delta_{ij}$ for some $\lambda_1, \cdots, \lambda_n \in \mathbb{R}$. By reordering, we may assume
$$\lambda_i > 0 \text{ for } 1 \le i \le p, \quad \lambda_i < 0 \text{ for } p + 1 \le i \le r, \quad \lambda_i = 0 \text{ for } i > r.$$
We let $\mu_i$ be defined by
$$\mu_i = \begin{cases}\sqrt{\lambda_i} & 1 \le i \le p\\ \sqrt{-\lambda_i} & p + 1 \le i \le r\\ 1 & i > r.\end{cases}$$
Defining
$$v_i = \frac{1}{\mu_i} e_i,$$
we find that $\phi$ is indeed represented by
$$\begin{pmatrix}I_p & 0 & 0\\ 0 & -I_q & 0\\ 0 & 0 & 0\end{pmatrix}.$$
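Over $\mathbb{R}$ we can only divide by the real number $\sqrt{|\lambda_i|}$, so the magnitudes normalize to 1 but the signs survive. A sketch with hypothetical diagonal entries (not from the text):

```python
import math

# hypothetical diagonal entries of a real symmetric bilinear form
lam = [4.0, -9.0, 0.0]

# over R we can only divide by real mu_i = sqrt(|lam_i|)
mu = [math.sqrt(abs(l)) if l != 0 else 1.0 for l in lam]

# the magnitudes normalize to 1, but the signs cannot be removed
rescaled = [l / (m * m) for l, m in zip(lam, mu)]
print(rescaled)  # [1.0, -1.0, 0.0]
```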
We will later show that this form is indeed unique. Before that, we will make a few definitions that really only make sense over $\mathbb{R}$.
Definition (Positive/negative (semi-)definite). Let $\phi$ be a symmetric bilinear form on a finite-dimensional real vector space $V$. We say
(i) $\phi$ is positive definite if $\phi(v, v) > 0$ for all $v \in V \setminus \{0\}$;
(ii) $\phi$ is positive semi-definite if $\phi(v, v) \ge 0$ for all $v \in V$;
(iii) $\phi$ is negative definite if $\phi(v, v) < 0$ for all $v \in V \setminus \{0\}$;
(iv) $\phi$ is negative semi-definite if $\phi(v, v) \le 0$ for all $v \in V$.
We are going to use these notions to prove uniqueness. It is easy to see that if $p = 0$ and $q = n$, then $\phi$ is negative definite; if $p = 0$ and $q \neq n$, then $\phi$ is negative semi-definite, etc.
Example. Let $\phi$ be a symmetric bilinear form on $\mathbb{R}^n$ represented by
$$\begin{pmatrix}I_p & 0\\ 0 & 0_{n-p}\end{pmatrix}.$$
Then $\phi$ is positive semi-definite, and $\phi$ is positive definite if and only if $n = p$.
If instead $\phi$ is represented by
$$\begin{pmatrix}-I_q & 0\\ 0 & 0_{n-q}\end{pmatrix},$$
then $\phi$ is negative semi-definite, and $\phi$ is negative definite precisely if $n = q$.
We are going to use this to prove the uniqueness part of our previous theorem.
Theorem (Sylvester's law of inertia). Let $\phi$ be a symmetric bilinear form on a finite-dimensional real vector space $V$. Then there exist unique non-negative integers $p, q$ such that $\phi$ is represented by
$$\begin{pmatrix}I_p & 0 & 0\\ 0 & -I_q & 0\\ 0 & 0 & 0\end{pmatrix}$$
with respect to some basis.
Proof. We have already proved the existence part, and we just have to prove uniqueness. To do so, we characterize $p$ and $q$ in a basis-independent way. We already know that $p + q = r(\phi)$ does not depend on the basis. So it suffices to show $p$ is unique.
To see that $p$ is unique, we show that $p$ is the largest dimension of a subspace $P \subseteq V$ such that $\phi|_{P \times P}$ is positive definite.
First we show we can find such a $P$. Suppose $\phi$ is represented by
$$\begin{pmatrix}I_p & 0 & 0\\ 0 & -I_q & 0\\ 0 & 0 & 0\end{pmatrix}$$
with respect to $(e_1, \cdots, e_n)$. Then $\phi$ restricted to $\langle e_1, \cdots, e_p\rangle$ is represented by $I_p$ with respect to $e_1, \cdots, e_p$. So $\phi$ restricted to this subspace is positive definite.
Now suppose $P$ is any subspace of $V$ such that $\phi|_{P \times P}$ is positive definite. To show $P$ has dimension at most $p$, we find a subspace complementary to $P$ with dimension $n - p$.
Let $Q = \langle e_{p+1}, \cdots, e_n\rangle$. Then $\phi$ restricted to $Q \times Q$ is represented by
$$\begin{pmatrix}-I_q & 0\\ 0 & 0\end{pmatrix}.$$
Now if $v \in P \cap Q \setminus \{0\}$, then $\phi(v, v) > 0$ since $v \in P \setminus \{0\}$, and $\phi(v, v) \le 0$ since $v \in Q$, which is a contradiction. So $P \cap Q = 0$.
We have
$$\dim V \ge \dim(P + Q) = \dim P + \dim Q = \dim P + (n - p).$$
Rearranging gives
$$\dim P \le p.$$
A similar argument shows that $q$ is the maximal dimension of a subspace $Q \subseteq V$ such that $\phi|_{Q \times Q}$ is negative definite.
Definition (Signature). The signature of a bilinear form $\phi$ is the number $p - q$, where $p$ and $q$ are as above.
Of course, we can recover p and q from the signature and the rank of φ.
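Since $p + q = r(\phi)$ and $p - q$ is the signature, the recovery is just linear arithmetic; a one-line sketch:

```python
def p_q(rank, signature):
    # solve p + q = rank, p - q = signature
    return (rank + signature) // 2, (rank - signature) // 2

print(p_q(3, 1))   # a form of rank 3 and signature 1 has p = 2, q = 1
```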
Corollary. Every real symmetric matrix is congruent to precisely one matrix of the form
$$\begin{pmatrix}I_p & 0 & 0\\ 0 & -I_q & 0\\ 0 & 0 & 0\end{pmatrix}.$$