4 Vector bundles

III Differential Geometry



4.1 Tensors
The tensor product is a very important concept in Linear Algebra. It is something
that is taught in no undergraduate courses and assumed knowledge in all graduate
courses. For the benefit of the students, we will give a brief introduction to
tensor products.
A motivation for tensors comes from the study of bilinear maps. A bilinear
map is a function that takes in two vectors and returns a number, and is
linear in each variable. An example is the inner product, and another example
is the volume form, which tells us the volume of the parallelepiped spanned by the
two vectors.
Definition (Bilinear map). Let $U, V, W$ be vector spaces. We define
$\mathrm{Bilin}(V \times W, U)$ to be the set of functions $V \times W \to U$ that are bilinear, i.e.
$$\alpha(\lambda_1 v_1 + \lambda_2 v_2, w) = \lambda_1 \alpha(v_1, w) + \lambda_2 \alpha(v_2, w)$$
$$\alpha(v, \lambda_1 w_1 + \lambda_2 w_2) = \lambda_1 \alpha(v, w_1) + \lambda_2 \alpha(v, w_2).$$
It is important to note that a bilinear map is not a linear map. This is bad. We
spent so much time studying linear maps, and we now have to go back to our
linear algebra book and rewrite everything to talk about bilinear maps as well.
But bilinear maps are not enough. We want to do this for multilinear maps!
But bilinear maps were already complicated enough, so this must be much worse.
We want to die.
Tensors are a trick to turn the study of bilinear maps into the study of linear
maps (from a different space).
Definition (Tensor product). A tensor product of two vector spaces $V, W$ is
a vector space $V \otimes W$ together with a bilinear map $\pi: V \times W \to V \otimes W$ such that a
bilinear map from $V \times W$ is "the same as" a linear map from $V \otimes W$. More
precisely, given any bilinear map $\alpha: V \times W \to U$, we can find a unique linear
map $\tilde{\alpha}: V \otimes W \to U$ such that the following diagram commutes:
$$\begin{array}{ccc}
V \times W & \xrightarrow{\;\pi\;} & V \otimes W \\
& {\searrow}^{\alpha} & \downarrow{\tilde{\alpha}} \\
& & U
\end{array}$$
So we have
$$\mathrm{Bilin}(V \times W, U) \cong \mathrm{Hom}(V \otimes W, U).$$
Given $v \in V$ and $w \in W$, we obtain $\pi(v, w) \in V \otimes W$, called the tensor product
of $v$ and $w$, written $v \otimes w$.
We say $V \otimes W$ represents bilinear maps from $V \times W$.
It is important to note that not all elements of $V \otimes W$ are of the form $v \otimes w$.
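As a concrete illustration (not part of the course, and the names here are ours): after choosing bases, $v \otimes w$ is represented by the outer product of the coordinate vectors, and the last remark corresponds to the fact that not every matrix is an outer product. A minimal pure-Python sketch:

```python
# Sketch: after choosing bases, v ⊗ w is represented by the n×m matrix
# of products v_i * w_j (the outer product of coordinate vectors).

def outer(v, w):
    """Coordinate representation of v ⊗ w."""
    return [[vi * wj for wj in w] for vi in v]

v, w = [1, 2], [3, 4, 5]
T = outer(v, w)  # a 2×3 matrix representing v ⊗ w
assert T == [[3, 4, 5], [6, 8, 10]]

# Bilinearity in the first slot: (2v) ⊗ w = 2 (v ⊗ w).
T2 = outer([2 * vi for vi in v], w)
assert T2 == [[2 * x for x in row] for row in T]

# Not every element of V ⊗ W is of the form v ⊗ w: the element
# v1 ⊗ w1 + v2 ⊗ w2 corresponds to the identity matrix, which has
# rank 2, while any outer product [[a*c, a*d], [b*c, b*d]] has
# determinant 0.
S = [[1, 0], [0, 1]]
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
assert det != 0  # so S is not an outer product
```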
Now the key thing we want to prove is the existence and uniqueness of tensor
products.
Lemma.
Tensor products exist (and are unique up to isomorphism) for all pairs
of finite-dimensional vector spaces.
Proof. We can construct $V \otimes W = \mathrm{Bilin}(V \times W, \mathbb{R})^*$. The verification is left as
an exercise on the example sheet.
We now write down some basic properties of tensor products.
Proposition. Given maps $f: V \to W$ and $g: V' \to W'$, we obtain a map
$$f \otimes g: V \otimes V' \to W \otimes W'$$
given by the bilinear map
$$(f \otimes g)(v, w) = f(v) \otimes g(w).$$
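In coordinates this map is the Kronecker product of the matrices of $f$ and $g$ (with the basis $v_i \otimes w_j$ ordered lexicographically); this identification is our addition, not stated in the notes. A sketch checking the defining property:

```python
# Sketch: with bases ordered so that v_i ⊗ w_j maps to the flattened
# index (i, j), the map f ⊗ g is represented by the Kronecker product.

def kron(A, B):
    """Kronecker product of matrices A and B."""
    return [[A[i][j] * B[k][l]
             for j in range(len(A[0])) for l in range(len(B[0]))]
            for i in range(len(A)) for k in range(len(B))]

def matvec(A, x):
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

def flatten_outer(v, w):
    """Coordinates of v ⊗ w: the outer product, flattened row by row."""
    return [vi * wj for vi in v for wj in w]

A = [[1, 2], [3, 4]]  # matrix of f : V -> W
B = [[0, 1], [1, 0]]  # matrix of g : V' -> W'
v, w = [1, -1], [2, 5]

# The defining property: (f ⊗ g)(v ⊗ w) = f(v) ⊗ g(w).
lhs = matvec(kron(A, B), flatten_outer(v, w))
rhs = flatten_outer(matvec(A, v), matvec(B, w))
assert lhs == rhs
```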
Lemma. Given $v, v_i \in V$, $w, w_i \in W$ and $\lambda_i \in \mathbb{R}$, we have
$$(\lambda_1 v_1 + \lambda_2 v_2) \otimes w = \lambda_1 (v_1 \otimes w) + \lambda_2 (v_2 \otimes w)$$
$$v \otimes (\lambda_1 w_1 + \lambda_2 w_2) = \lambda_1 (v \otimes w_1) + \lambda_2 (v \otimes w_2).$$
Proof. Immediate from the definition of bilinear map.
Lemma. If $v_1, \cdots, v_n$ is a basis for $V$, and $w_1, \cdots, w_m$ is a basis for $W$, then
$$\{v_i \otimes w_j : i = 1, \cdots, n;\ j = 1, \cdots, m\}$$
is a basis for $V \otimes W$. In particular, $\dim (V \otimes W) = \dim V \cdot \dim W$.
Proof. We have $V \otimes W = \mathrm{Bilin}(V \times W, \mathbb{R})^*$. We let $\alpha_{pq}: V \times W \to \mathbb{R}$ be given
by
$$\alpha_{pq}\left(\sum a_i v_i, \sum b_j w_j\right) = a_p b_q.$$
Then $\alpha_{pq} \in \mathrm{Bilin}(V \times W, \mathbb{R})$, and the $(v_i \otimes w_j)$ are dual to the $\alpha_{pq}$. So it suffices to
show that the $\alpha_{pq}$ are a basis. It is clear that they are independent, and any bilinear
map can be written as
$$\alpha = \sum c_{pq} \alpha_{pq},$$
where $c_{pq} = \alpha(v_p, w_q)$. So done.
Proposition. For any vector spaces $V, W, U$, we have (natural) isomorphisms:
(i) $V \otimes W \cong W \otimes V$
(ii) $(V \otimes W) \otimes U \cong V \otimes (W \otimes U)$
(iii) $(V \otimes W)^* \cong V^* \otimes W^*$
Definition (Covariant tensor). A covariant tensor of rank $k$ on $V$ is an element
$$\alpha \in \underbrace{V^* \otimes \cdots \otimes V^*}_{k \text{ times}},$$
i.e. $\alpha$ is a multilinear map $V \times \cdots \times V \to \mathbb{R}$.
Example. A covariant 1-tensor is an $\alpha \in V^*$, i.e. a linear map $\alpha: V \to \mathbb{R}$.
A covariant 2-tensor is a $\beta \in V^* \otimes V^*$, i.e. a bilinear map $V \times V \to \mathbb{R}$, e.g.
an inner product.
Example. If $\alpha, \beta \in V^*$, then $\alpha \otimes \beta \in V^* \otimes V^*$ is the covariant 2-tensor given
by
$$(\alpha \otimes \beta)(v, w) = \alpha(v)\beta(w).$$
More generally, if $\alpha$ is a rank $k$ tensor and $\beta$ is a rank $\ell$ tensor, then $\alpha \otimes \beta$ is a
rank $k + \ell$ tensor.
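The formula $(\alpha \otimes \beta)(v, w) = \alpha(v)\beta(w)$ is easy to play with directly if we represent covariant tensors as functions of vectors; the following sketch (our illustration, with made-up helper names) does this for 1-tensors:

```python
# Sketch: covariant 1-tensors as Python functions V -> R, and their
# tensor product as the 2-tensor (α ⊗ β)(v, w) = α(v) β(w).

def dual(u):
    """The 1-tensor v ↦ Σ u_i v_i (a linear functional on R^n)."""
    return lambda v: sum(ui * vi for ui, vi in zip(u, v))

def tensor2(alpha, beta):
    """The covariant 2-tensor (α ⊗ β)(v, w) = α(v) β(w)."""
    return lambda v, w: alpha(v) * beta(w)

alpha = dual([1, 0, 2])
beta = dual([0, 1, 1])
t = tensor2(alpha, beta)

v, w = [3, 1, 1], [1, 4, 2]
assert t(v, w) == alpha(v) * beta(w) == 30

# t is bilinear, e.g. additive in the first slot:
v2 = [0, 2, 1]
assert t([a + b for a, b in zip(v, v2)], w) == t(v, w) + t(v2, w)
```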
Definition (Tensor). A tensor of type $(k, \ell)$ is an element in
$$T^k_\ell(V) = \underbrace{V^* \otimes \cdots \otimes V^*}_{k \text{ times}} \otimes \underbrace{V \otimes \cdots \otimes V}_{\ell \text{ times}}.$$
We are interested in alternating bilinear maps, i.e. maps with $\alpha(v, w) = -\alpha(w, v)$, or
equivalently, $\alpha(v, v) = 0$ (if the characteristic is not 2): expanding
$0 = \alpha(v + w, v + w)$ gives $\alpha(v, w) + \alpha(w, v) = 0$.
Definition (Exterior product). Consider
$$T(V) = \bigoplus_{k \geq 0} V^{\otimes k}$$
as an algebra (with multiplication given by the tensor product) (with $V^{\otimes 0} = \mathbb{R}$).
We let $I(V)$ be the ideal (as algebras!) generated by $\{v \otimes v : v \in V\} \subseteq T(V)$.
We define
$$\Lambda(V) = T(V)/I(V),$$
with a projection map $\pi: T(V) \to \Lambda(V)$. This is known as the exterior algebra.
We let
$$\Lambda^k(V) = \pi(V^{\otimes k}),$$
the $k$-th exterior product of $V$.
We write $\alpha \wedge \beta$ for $\pi(\alpha \otimes \beta)$.
The idea is that $\Lambda^p V$ is the dual of the space of alternating multilinear maps
$V \times \cdots \times V \to \mathbb{R}$.
Lemma.
(i) If $\alpha \in \Lambda^p V$ and $\beta \in \Lambda^q V$, then $\alpha \wedge \beta = (-1)^{pq} \beta \wedge \alpha$.
(ii) If $\dim V = n$ and $p > n$, then we have
$$\dim \Lambda^0 V = 1, \quad \dim \Lambda^n V = 1, \quad \Lambda^p V = \{0\}.$$
(iii) The multilinear map $\det: V \times \cdots \times V \to \mathbb{R}$ spans $\Lambda^n V^*$.
(iv) If $v_1, \cdots, v_n$ is a basis for $V$, then
$$\{v_{i_1} \wedge \cdots \wedge v_{i_p} : i_1 < \cdots < i_p\}$$
is a basis for $\Lambda^p V$.
Proof.
(i) We clearly have $v \wedge v = 0$. So
$$v \wedge w = -w \wedge v.$$
Then
$$(v_1 \wedge \cdots \wedge v_p) \wedge (w_1 \wedge \cdots \wedge w_q) = (-1)^{pq}\, w_1 \wedge \cdots \wedge w_q \wedge v_1 \wedge \cdots \wedge v_p,$$
since we have $pq$ swaps. Since
$$\{v_{i_1} \wedge \cdots \wedge v_{i_p} : i_1, \cdots, i_p \in \{1, \cdots, n\}\} \subseteq \Lambda^p V$$
spans $\Lambda^p V$ (by the corresponding result for tensor products), the result
follows from linearity.
(ii) Exercise.
(iii) The $\det$ map is non-zero. So it follows from the above.
(iv) We know that
$$\{v_{i_1} \wedge \cdots \wedge v_{i_p} : i_1, \cdots, i_p \in \{1, \cdots, n\}\} \subseteq \Lambda^p V$$
spans, but the elements are not independent, since there is a lot of redundancy (e.g.
$v_1 \wedge v_2 = -v_2 \wedge v_1$). By requiring $i_1 < \cdots < i_p$, we obtain a unique
copy for each combination.
To check independence, we write $I = (i_1, \cdots, i_p)$ and let $v_I = v_{i_1} \wedge \cdots \wedge v_{i_p}$.
Then suppose
$$\sum_I a_I v_I = 0$$
for $a_I \in \mathbb{R}$. For each $I$, we let $J$ be the multi-index $J = \{1, \cdots, n\} \setminus I$. So
if $I \neq I'$, then $v_{I'} \wedge v_J = 0$. So wedging with $v_J$ gives
$$\sum_{I'} a_{I'} v_{I'} \wedge v_J = a_I v_I \wedge v_J = 0.$$
So $a_I = 0$. So done, by (ii).
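The antisymmetry in (i) can be seen in coordinates: for $p = q = 1$, the wedge $v \wedge w \in \Lambda^2(\mathbb{R}^n)$ can be represented by the antisymmetric matrix of $2 \times 2$ minors $v_i w_j - v_j w_i$. This representation is our choice for illustration, not from the notes:

```python
# Sketch: v ∧ w ∈ Λ²(R^n) represented by the antisymmetric matrix
# of 2×2 minors (v_i w_j - v_j w_i).

def wedge(v, w):
    n = len(v)
    return [[v[i] * w[j] - v[j] * w[i] for j in range(n)]
            for i in range(n)]

v, w = [1, 2, 3], [4, 5, 6]

# v ∧ v = 0:
assert all(x == 0 for row in wedge(v, v) for x in row)

# v ∧ w = -(w ∧ v):
assert wedge(v, w) == [[-x for x in row] for row in wedge(w, v)]
```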
If $F: V \to W$ is a linear map, then we get an induced linear map $\Lambda^p F:
\Lambda^p V \to \Lambda^p W$ in the obvious way, making the following diagram commute:
$$\begin{array}{ccc}
V^{\otimes p} & \xrightarrow{F^{\otimes p}} & W^{\otimes p} \\
\downarrow{\pi} & & \downarrow{\pi} \\
\Lambda^p V & \xrightarrow{\Lambda^p F} & \Lambda^p W
\end{array}$$
More concretely, we have
$$\Lambda^p F(v_1 \wedge \cdots \wedge v_p) = F(v_1) \wedge \cdots \wedge F(v_p).$$
Lemma. Let $F: V \to V$ be a linear map. Then $\Lambda^n F: \Lambda^n V \to \Lambda^n V$ is
multiplication by $\det F$.
Proof. Let $v_1, \cdots, v_n$ be a basis. Then $\Lambda^n V$ is spanned by $v_1 \wedge \cdots \wedge v_n$. So we
have
$$(\Lambda^n F)(v_1 \wedge \cdots \wedge v_n) = \lambda\, v_1 \wedge \cdots \wedge v_n$$
for some $\lambda$. Write
$$F(v_i) = \sum_j A_{ji} v_j$$
for some $A_{ji} \in \mathbb{R}$, i.e. $A$ is the matrix representation of $F$. Then we have
$$(\Lambda^n F)(v_1 \wedge \cdots \wedge v_n) = \left(\sum_j A_{j1} v_j\right) \wedge \cdots \wedge \left(\sum_j A_{jn} v_j\right).$$
If we expand the thing on the right, a lot of things die. The only terms that
survive are those where we get each of the $v_i$ once in the wedges, in some order. Then
this becomes
$$\sum_{\sigma \in S_n} \varepsilon(\sigma)(A_{\sigma(1),1} \cdots A_{\sigma(n),n})\, v_1 \wedge \cdots \wedge v_n = \det(F)\, v_1 \wedge \cdots \wedge v_n,$$
where $\varepsilon(\sigma)$ is the sign of the permutation, which comes from rearranging the
$v_i$ into the right order.
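The permutation sum appearing at the end of the proof is exactly the Leibniz formula for the determinant, which we can evaluate directly as a sanity check (a pure-Python sketch, practical only for tiny $n$):

```python
# Sketch: det A = Σ_{σ ∈ S_n} ε(σ) A_{σ(1),1} · · · A_{σ(n),n},
# the permutation-sum formula from the proof.
from itertools import permutations

def sign(sigma):
    """Sign ε(σ) of a permutation, counted via inversions."""
    s = 1
    for i in range(len(sigma)):
        for j in range(i + 1, len(sigma)):
            if sigma[i] > sigma[j]:
                s = -s
    return s

def det(A):
    n = len(A)
    total = 0
    for sigma in permutations(range(n)):
        term = sign(sigma)
        for col in range(n):
            term *= A[sigma[col]][col]  # the factor A_{σ(col), col}
        total += term
    return total

A = [[2, 0, 1],
     [1, 3, 0],
     [0, 1, 4]]
assert det(A) == 25  # agrees with cofactor expansion
```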