5.3 Transformation matrices
How do the components of a vector or a matrix change when we change the
basis?
Let $\{\mathbf{e}_1, \mathbf{e}_2, \cdots, \mathbf{e}_n\}$ and $\{\tilde{\mathbf{e}}_1, \tilde{\mathbf{e}}_2, \cdots, \tilde{\mathbf{e}}_n\}$ be 2 different bases of $\mathbb{R}^n$ or $\mathbb{C}^n$.
Then we can write
\[
  \tilde{\mathbf{e}}_j = \sum_{i=1}^n P_{ij} \mathbf{e}_i,
\]
i.e. $P_{ij}$ is the $i$th component of $\tilde{\mathbf{e}}_j$ with respect to the basis $\{\mathbf{e}_1, \mathbf{e}_2, \cdots, \mathbf{e}_n\}$. Note that the sum is taken as $\sum_i P_{ij}\mathbf{e}_i$, not $\sum_j P_{ij}\mathbf{e}_j$. This is different from the formula for matrix multiplication.
Matrix $P$ has as its columns the vectors $\tilde{\mathbf{e}}_j$ relative to $\{\mathbf{e}_1, \mathbf{e}_2, \cdots, \mathbf{e}_n\}$. So
\[
  P = (\tilde{\mathbf{e}}_1\; \tilde{\mathbf{e}}_2\; \cdots\; \tilde{\mathbf{e}}_n) \quad\text{and}\quad P(\mathbf{e}_i) = \tilde{\mathbf{e}}_i.
\]
Similarly, we can write
\[
  \mathbf{e}_i = \sum_{k=1}^n Q_{ki} \tilde{\mathbf{e}}_k
\]
with $Q = (\mathbf{e}_1\; \mathbf{e}_2\; \cdots\; \mathbf{e}_n)$, where the columns are now the $\mathbf{e}_i$ written relative to $\{\tilde{\mathbf{e}}_1, \cdots, \tilde{\mathbf{e}}_n\}$.
Substituting this into the equation for $\tilde{\mathbf{e}}_j$, we have
\[
  \tilde{\mathbf{e}}_j = \sum_{i=1}^n\left(\sum_{k=1}^n Q_{ki}\tilde{\mathbf{e}}_k\right)P_{ij} = \sum_{k=1}^n \tilde{\mathbf{e}}_k\left(\sum_{i=1}^n Q_{ki}P_{ij}\right).
\]
But $\tilde{\mathbf{e}}_1, \tilde{\mathbf{e}}_2, \cdots, \tilde{\mathbf{e}}_n$ are linearly independent, so this is only possible if
\[
  \sum_{i=1}^n Q_{ki}P_{ij} = \delta_{kj},
\]
which is just a fancy way of saying $QP = I$, or $Q = P^{-1}$.
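Indeed, the sum on the left is precisely the $(k, j)$ entry of the product $QP$:
\[
  (QP)_{kj} = \sum_{i=1}^n Q_{ki}P_{ij} = \delta_{kj} = I_{kj},
\]
so the condition holds entry by entry exactly when $QP = I$.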
5.3.1 Transformation law for vectors
With respect to basis $\{\mathbf{e}_i\}$, $\mathbf{u} = \sum_{i=1}^n u_i \mathbf{e}_i$. With respect to basis $\{\tilde{\mathbf{e}}_i\}$, $\mathbf{u} = \sum_{i=1}^n \tilde{u}_i \tilde{\mathbf{e}}_i$. Note that this is the same vector $\mathbf{u}$ but has different components with respect to different bases. Using the transformation matrix above for the basis, we have
\[
  \mathbf{u} = \sum_{j=1}^n \tilde{u}_j \sum_{i=1}^n P_{ij}\mathbf{e}_i = \sum_{i=1}^n \left(\sum_{j=1}^n P_{ij}\tilde{u}_j\right)\mathbf{e}_i.
\]
By comparison, we know that
\[
  u_i = \sum_{j=1}^n P_{ij}\tilde{u}_j.
\]
Theorem. Denote the vector as $\mathbf{u}$ with respect to $\{\mathbf{e}_i\}$ and as $\tilde{\mathbf{u}}$ with respect to $\{\tilde{\mathbf{e}}_i\}$. Then
\[
  \mathbf{u} = P\tilde{\mathbf{u}} \quad\text{and}\quad \tilde{\mathbf{u}} = P^{-1}\mathbf{u}.
\]
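The first relation is just the component formula $u_i = \sum_j P_{ij}\tilde{u}_j$ written in matrix form; multiplying on the left by $P^{-1}$ (which exists, since the columns of $P$ form a basis) gives the second:
\[
  \tilde{\mathbf{u}} = P^{-1}P\tilde{\mathbf{u}} = P^{-1}\mathbf{u}.
\]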
Example. Take the first basis as $\{\mathbf{e}_1 = (1, 0), \mathbf{e}_2 = (0, 1)\}$ and the second as $\{\tilde{\mathbf{e}}_1 = (1, 1), \tilde{\mathbf{e}}_2 = (-1, 1)\}$. So $\tilde{\mathbf{e}}_1 = \mathbf{e}_1 + \mathbf{e}_2$ and $\tilde{\mathbf{e}}_2 = -\mathbf{e}_1 + \mathbf{e}_2$. We have
\[
  P = \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}.
\]
Then for an arbitrary vector $\mathbf{u}$, we have
\begin{align*}
  \mathbf{u} &= u_1 \mathbf{e}_1 + u_2 \mathbf{e}_2\\
  &= u_1 \cdot \frac{1}{2}(\tilde{\mathbf{e}}_1 - \tilde{\mathbf{e}}_2) + u_2 \cdot \frac{1}{2}(\tilde{\mathbf{e}}_1 + \tilde{\mathbf{e}}_2)\\
  &= \frac{1}{2}(u_1 + u_2)\tilde{\mathbf{e}}_1 + \frac{1}{2}(-u_1 + u_2)\tilde{\mathbf{e}}_2.
\end{align*}
Alternatively, using the formula above, we obtain
\[
  \tilde{\mathbf{u}} = P^{-1}\mathbf{u} = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}\begin{pmatrix} u_1 \\ u_2 \end{pmatrix} = \begin{pmatrix} \frac{1}{2}(u_1 + u_2) \\ \frac{1}{2}(-u_1 + u_2) \end{pmatrix},
\]
which agrees with the direct expansion above.
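As a quick consistency check, multiplying back by $P$ recovers the original components:
\[
  P\tilde{\mathbf{u}} = \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}\begin{pmatrix} \frac{1}{2}(u_1 + u_2) \\ \frac{1}{2}(-u_1 + u_2) \end{pmatrix} = \begin{pmatrix} u_1 \\ u_2 \end{pmatrix} = \mathbf{u}.
\]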
5.3.2 Transformation law for matrices
Consider a linear map $\alpha\colon \mathbb{C}^n \to \mathbb{C}^n$ with associated $n\times n$ matrix $A$. We have
\[
  \mathbf{u}' = \alpha(\mathbf{u}) = A\mathbf{u}.
\]
Denote $\mathbf{u}$ and $\mathbf{u}'$ as being with respect to basis $\{\mathbf{e}_i\}$ (i.e. the same basis in both spaces), and $\tilde{\mathbf{u}}, \tilde{\mathbf{u}}'$ with respect to $\{\tilde{\mathbf{e}}_i\}$.
Using what we've got above, we have
\begin{align*}
  \mathbf{u}' &= A\mathbf{u}\\
  P\tilde{\mathbf{u}}' &= AP\tilde{\mathbf{u}}\\
  \tilde{\mathbf{u}}' &= P^{-1}AP\tilde{\mathbf{u}} = \tilde{A}\tilde{\mathbf{u}}.
\end{align*}
So

Theorem. $\tilde{A} = P^{-1}AP$.
Example. Consider the shear
\[
  S_\lambda = \begin{pmatrix} 1 & \lambda & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}
\]
with respect to the standard basis. Choose a new set of basis vectors by rotating by $\theta$ about the $\mathbf{e}_3$ axis:
\begin{align*}
  \tilde{\mathbf{e}}_1 &= \cos\theta\,\mathbf{e}_1 + \sin\theta\,\mathbf{e}_2\\
  \tilde{\mathbf{e}}_2 &= -\sin\theta\,\mathbf{e}_1 + \cos\theta\,\mathbf{e}_2\\
  \tilde{\mathbf{e}}_3 &= \mathbf{e}_3.
\end{align*}
So we have
\[
  P = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix},\quad
  P^{-1} = \begin{pmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
Now use the basis transformation laws to obtain
\[
  \tilde{S}_\lambda = \begin{pmatrix} 1 + \lambda\sin\theta\cos\theta & \lambda\cos^2\theta & 0 \\ -\lambda\sin^2\theta & 1 - \lambda\sin\theta\cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
Clearly this is much more complicated than in our original basis. This shows that choosing a sensible basis is important.
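To verify this result, the product can be carried out in two stages: first
\[
  S_\lambda P = \begin{pmatrix} \cos\theta + \lambda\sin\theta & -\sin\theta + \lambda\cos\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix},
\]
and then left-multiplying by $P^{-1}$ and using $\cos^2\theta + \sin^2\theta = 1$ gives $\tilde{S}_\lambda = P^{-1}S_\lambda P$ as above.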
More generally, consider $\alpha\colon \mathbb{C}^m \to \mathbb{C}^n$ and $\mathbf{x} \in \mathbb{C}^m$, $\mathbf{x}' \in \mathbb{C}^n$ with $\mathbf{x}' = A\mathbf{x}$. We know that $A$ is an $n\times m$ matrix.
Suppose $\mathbb{C}^m$ has a basis $\{\mathbf{e}_i\}$ and $\mathbb{C}^n$ has a basis $\{\mathbf{f}_i\}$. Now change bases to $\{\tilde{\mathbf{e}}_i\}$ and $\{\tilde{\mathbf{f}}_i\}$.
We know that $\mathbf{x} = P\tilde{\mathbf{x}}$ with $P$ an $m\times m$ matrix, and $\mathbf{x}' = R\tilde{\mathbf{x}}'$ with $R$ an $n\times n$ matrix.
Combining both of these, we have
\begin{align*}
  R\tilde{\mathbf{x}}' &= AP\tilde{\mathbf{x}}\\
  \tilde{\mathbf{x}}' &= R^{-1}AP\tilde{\mathbf{x}}.
\end{align*}
Therefore $\tilde{A} = R^{-1}AP$.
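Note that when $m = n$ and the same change of basis is used in both copies of the space, we have $R = P$ and this reduces to the earlier law
\[
  \tilde{A} = P^{-1}AP.
\]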
Example. Consider $\alpha\colon \mathbb{R}^3 \to \mathbb{R}^2$, with respect to the standard bases in both spaces,
\[
  A = \begin{pmatrix} 2 & 3 & 4 \\ 1 & 6 & 3 \end{pmatrix}.
\]
Use a new basis
\[
  \begin{pmatrix} 2 \\ 1 \end{pmatrix},\quad \begin{pmatrix} 1 \\ 5 \end{pmatrix}
\]
in $\mathbb{R}^2$ and keep the standard basis in $\mathbb{R}^3$. The basis change matrix in $\mathbb{R}^3$ is simply $I$, while
\[
  R = \begin{pmatrix} 2 & 1 \\ 1 & 5 \end{pmatrix},\quad R^{-1} = \frac{1}{9}\begin{pmatrix} 5 & -1 \\ -1 & 2 \end{pmatrix}
\]
is the transformation matrix for $\mathbb{R}^2$. So
\begin{align*}
  \tilde{A} &= \begin{pmatrix} 2 & 1 \\ 1 & 5 \end{pmatrix}^{-1}\begin{pmatrix} 2 & 3 & 4 \\ 1 & 6 & 3 \end{pmatrix} I\\
  &= \frac{1}{9}\begin{pmatrix} 5 & -1 \\ -1 & 2 \end{pmatrix}\begin{pmatrix} 2 & 3 & 4 \\ 1 & 6 & 3 \end{pmatrix}\\
  &= \begin{pmatrix} 1 & 1 & 17/9 \\ 0 & 1 & 2/9 \end{pmatrix}.
\end{align*}
We can alternatively do it this way: we know that
\[
  \tilde{\mathbf{f}}_1 = \begin{pmatrix} 2 \\ 1 \end{pmatrix},\quad \tilde{\mathbf{f}}_2 = \begin{pmatrix} 1 \\ 5 \end{pmatrix}.
\]
Then we know that
\begin{align*}
  \tilde{\mathbf{e}}_1 = \mathbf{e}_1 &\mapsto 2\mathbf{f}_1 + \mathbf{f}_2 = \tilde{\mathbf{f}}_1\\
  \tilde{\mathbf{e}}_2 = \mathbf{e}_2 &\mapsto 3\mathbf{f}_1 + 6\mathbf{f}_2 = \tilde{\mathbf{f}}_1 + \tilde{\mathbf{f}}_2\\
  \tilde{\mathbf{e}}_3 = \mathbf{e}_3 &\mapsto 4\mathbf{f}_1 + 3\mathbf{f}_2 = \frac{17}{9}\tilde{\mathbf{f}}_1 + \frac{2}{9}\tilde{\mathbf{f}}_2,
\end{align*}
and we can construct the matrix correspondingly.
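Assembling these images as the columns of the new matrix gives
\[
  \tilde{A} = \begin{pmatrix} 1 & 1 & 17/9 \\ 0 & 1 & 2/9 \end{pmatrix},
\]
in agreement with the earlier computation.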