
1 Eigenvalues and Eigenvectors

◼ Eigenvalue problem (one of the most important problems in linear algebra):
If A is an n × n matrix, do there exist nonzero vectors x in R^n
such that Ax is a scalar multiple of x?
(The term eigenvalue comes from the German word Eigenwert, meaning
"proper value")

◼ Eigenvalue and Eigenvector:
A: an n × n matrix
λ: a scalar (could be zero)
x: a nonzero vector in R^n

Ax = λx
(λ is called an eigenvalue of A, and x an eigenvector of A corresponding to λ)

※ Geometric interpretation: Ax = λx means that multiplying x by A only
scales x by the factor λ, so Ax stays on the line through the origin and x.
◼ Ex 1: Verifying eigenvalues and eigenvectors

A = [2 0; 0 −1],  x1 = [1 0]^T,  x2 = [0 1]^T

Ax1 = [2 0; 0 −1][1 0]^T = [2 0]^T = 2[1 0]^T = 2x1
⇒ λ = 2 is an eigenvalue with corresponding eigenvector x1

Ax2 = [2 0; 0 −1][0 1]^T = [0 −1]^T = (−1)[0 1]^T = (−1)x2
⇒ λ = −1 is an eigenvalue with corresponding eigenvector x2

※ In fact, each eigenvalue has infinitely many eigenvectors.
For λ = 2, [3 0]^T and [5 0]^T are both corresponding eigenvectors.
Moreover, [3 0]^T + [5 0]^T is still an eigenvector. The proof is in Thm. 1.
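※ A quick numerical check of Ex 1 (a minimal NumPy sketch, not part of the original slides):

```python
import numpy as np

A = np.array([[2.0,  0.0],
              [0.0, -1.0]])
x1 = np.array([1.0, 0.0])
x2 = np.array([0.0, 1.0])

# A x1 = 2 x1 and A x2 = (-1) x2
print(np.allclose(A @ x1, 2 * x1))    # True
print(np.allclose(A @ x2, -1 * x2))   # True

# [3 0]^T + [5 0]^T = [8 0]^T is again an eigenvector for lambda = 2
v = np.array([8.0, 0.0])
print(np.allclose(A @ v, 2 * v))      # True
```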
◼ Thm. 1: The eigenspace corresponding to λ of matrix A
If A is an n × n matrix with an eigenvalue λ, then the set of all
eigenvectors of λ together with the zero vector is a subspace
of R^n. This subspace is called the eigenspace of λ.
Pf:
Let x1 and x2 be eigenvectors corresponding to λ
(i.e., Ax1 = λx1, Ax2 = λx2)
(1) A(x1 + x2) = Ax1 + Ax2 = λx1 + λx2 = λ(x1 + x2)
(i.e., x1 + x2 is also an eigenvector corresponding to λ)
(2) A(cx1) = c(Ax1) = c(λx1) = λ(cx1)
(i.e., cx1 is also an eigenvector corresponding to λ)
Since this set is closed under vector addition and scalar
multiplication, it is a subspace of R^n according to Theorem 4.5
◼ Ex 3: Examples of eigenspaces on the xy-plane
For the matrix A below, the corresponding eigenvalues
are λ1 = −1 and λ2 = 1:
A = [−1 0; 0 1]
Sol:
For the eigenvalue λ1 = −1, the corresponding eigenvectors are the nonzero vectors on the x-axis:
A[x 0]^T = [−1 0; 0 1][x 0]^T = [−x 0]^T = −1[x 0]^T
※ Thus, the eigenspace corresponding to λ = −1 is the x-axis, which is a subspace of R^2

For the eigenvalue λ2 = 1, the corresponding eigenvectors are the nonzero vectors on the y-axis:
A[0 y]^T = [−1 0; 0 1][0 y]^T = [0 y]^T = 1[0 y]^T
※ Thus, the eigenspace corresponding to λ = 1 is the y-axis, which is a subspace of R^2
※ Geometrically speaking, multiplying a vector (x, y) in R^2 by the matrix A
corresponds to a reflection about the y-axis:

Av = A[x y]^T = A([x 0]^T + [0 y]^T) = A[x 0]^T + A[0 y]^T
   = −1[x 0]^T + 1[0 y]^T = [−x y]^T
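※ The same eigenvalues and eigenspaces can be computed numerically; a small NumPy sketch (not part of the slides; np.linalg.eig returns eigenvalues together with unit-length eigenvector columns):

```python
import numpy as np

# The reflection matrix A of Ex 3
A = np.array([[-1.0, 0.0],
              [ 0.0, 1.0]])

vals, vecs = np.linalg.eig(A)
print(vals)   # [-1.  1.]
print(vecs)   # columns [1, 0] and [0, 1]: the x-axis and the y-axis

# Vectors on the x-axis are flipped, vectors on the y-axis are unchanged
print(A @ np.array([3.0, 0.0]))   # [-3.  0.]
print(A @ np.array([0.0, 2.0]))   # [0.  2.]
```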

◼ Thm. 2: Finding eigenvalues and eigenvectors of a matrix A ∈ M_{n×n}
Let A be an n × n matrix.
(1) An eigenvalue of A is a scalar λ such that det(λI − A) = 0
(2) The eigenvectors of A corresponding to λ are the nonzero
solutions of (λI − A)x = 0
◼ Note: following the definition of the eigenvalue problem,
Ax = λx ⇒ Ax = λIx ⇒ (λI − A)x = 0 (homogeneous system)
(λI − A)x = 0 has nonzero solutions for x iff det(λI − A) = 0
(This iff result comes from the equivalent conditions on Slide 4.101)

◼ Characteristic equation of A:
det(λI − A) = 0
◼ Characteristic polynomial of A ∈ M_{n×n}:
det(λI − A) = |λI − A| = λ^n + c_{n−1}λ^{n−1} + ... + c_1λ + c_0
◼ Ex 4: Finding eigenvalues and eigenvectors
A = [2 −12; 1 −5]

Sol: Characteristic equation:
det(λI − A) = |λ−2 12; −1 λ+5|
            = λ^2 + 3λ + 2 = (λ + 1)(λ + 2) = 0
⇒ λ = −1, −2

Eigenvalues: λ1 = −1, λ2 = −2
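※ A SymPy sketch (not from the slides) reproducing the characteristic equation of Ex 4; per Thm. 2, the eigenvectors are the nonzero nullspace vectors of (λI − A):

```python
import sympy as sp

lam = sp.symbols('lam')
A = sp.Matrix([[2, -12],
               [1,  -5]])

# Characteristic polynomial det(lam*I - A)
p = (lam * sp.eye(2) - A).det()
print(sp.factor(p))       # (lam + 1)*(lam + 2)
print(sp.solve(p, lam))   # [-2, -1]  -> eigenvalues -1 and -2

# Eigenvectors: nonzero solutions of (lam*I - A)x = 0
for ev in (-1, -2):
    print(ev, (ev * sp.eye(2) - A).nullspace())
```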

◼ Ex 5: Finding eigenvalues and eigenvectors
Find the eigenvalues and corresponding eigenvectors for
the matrix A. What is the dimension of the eigenspace of
each eigenvalue?
A = [2 1 0; 0 2 0; 0 0 2]

Sol: Characteristic equation:
|λI − A| = |λ−2 −1 0; 0 λ−2 0; 0 0 λ−2| = (λ − 2)^3 = 0
Eigenvalue: λ = 2
The eigenspace of λ = 2:
(λI − A)x = [0 −1 0; 0 0 0; 0 0 0][x1 x2 x3]^T = [0 0 0]^T
⇒ [x1 x2 x3]^T = [s 0 t]^T = s[1 0 0]^T + t[0 0 1]^T, (s, t) ≠ (0, 0)

{ s[1 0 0]^T + t[0 0 1]^T : s, t ∈ R } is the eigenspace of A corresponding to λ = 2

Thus, the dimension of its eigenspace is 2.


◼ Notes:
(1) If an eigenvalue λ1 occurs as a multiple root (k times) of
the characteristic polynomial, then λ1 has multiplicity k.
(2) The multiplicity of an eigenvalue is greater than or equal
to the dimension of its eigenspace. (In Ex 5, k is 3 and
the dimension of the eigenspace is 2.)
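※ The gap between multiplicity and eigenspace dimension in Ex 5 can be checked with SymPy (a sketch, not part of the slides; eigenvects() reports each eigenvalue with its algebraic multiplicity and an eigenspace basis):

```python
import sympy as sp

A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 2]])

for val, mult, basis in A.eigenvects():
    print(val, mult, len(basis))
# 2 3 2  -> lambda = 2 has multiplicity k = 3, but its eigenspace has dimension 2
```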

◼ Ex 6: Find the eigenvalues of the matrix A and find a basis
for each of the corresponding eigenspaces
A = [1 0 0 0; 0 1 5 −10; 1 0 2 0; 1 0 0 3]

Sol: Characteristic equation:
|λI − A| = |λ−1 0 0 0; 0 λ−1 −5 10; −1 0 λ−2 0; −1 0 0 λ−3|
         = (λ − 1)^2 (λ − 2)(λ − 3) = 0
Eigenvalues: λ1 = 1, λ2 = 2, λ3 = 3

※ According to the note on the previous slide, the dimension of the
eigenspace of λ1 = 1 is at most 2
※ For λ2 = 2 and λ3 = 3, the dimensions of their eigenspaces are at most 1
(1) λ1 = 1:
(λ1I − A)x = [0 0 0 0; 0 0 −5 10; −1 0 −1 0; −1 0 0 −2][x1 x2 x3 x4]^T = [0 0 0 0]^T

G.-J.E. (Gauss–Jordan elimination) ⇒
[x1 x2 x3 x4]^T = [−2t s 2t t]^T = s[0 1 0 0]^T + t[−2 0 2 1]^T, (s, t) ≠ (0, 0)

{[0 1 0 0]^T, [−2 0 2 1]^T} is a basis for the eigenspace
corresponding to λ1 = 1

※ The dimension of the eigenspace of λ1 = 1 is 2


(2) λ2 = 2:
(λ2I − A)x = [1 0 0 0; 0 1 −5 10; −1 0 0 0; −1 0 0 −1][x1 x2 x3 x4]^T = [0 0 0 0]^T

G.-J.E. ⇒ [x1 x2 x3 x4]^T = [0 5t t 0]^T = t[0 5 1 0]^T, t ≠ 0

{[0 5 1 0]^T} is a basis for the eigenspace corresponding to λ2 = 2

※ The dimension of the eigenspace of λ2 = 2 is 1


(3) λ3 = 3:
(λ3I − A)x = [2 0 0 0; 0 2 −5 10; −1 0 1 0; −1 0 0 0][x1 x2 x3 x4]^T = [0 0 0 0]^T

G.-J.E. ⇒ [x1 x2 x3 x4]^T = [0 −5t 0 t]^T = t[0 −5 0 1]^T, t ≠ 0

{[0 −5 0 1]^T} is a basis for the eigenspace corresponding to λ3 = 3

※ The dimension of the eigenspace of λ3 = 3 is 1
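※ A SymPy sketch reproducing the three eigenspace bases of Ex 6 in one call (not part of the slides; the exact scaling of the basis vectors may differ from the hand computation):

```python
import sympy as sp

A = sp.Matrix([[1, 0, 0,   0],
               [0, 1, 5, -10],
               [1, 0, 2,   0],
               [1, 0, 0,   3]])

for val, mult, basis in A.eigenvects():
    print(val, mult, [list(v) for v in basis])
# lambda = 1: multiplicity 2, eigenspace spanned by (0,1,0,0) and (-2,0,2,1)
# lambda = 2: multiplicity 1, eigenspace spanned by (0,5,1,0)
# lambda = 3: multiplicity 1, eigenspace spanned by (0,-5,0,1)
```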
◼ Thm. 3: Eigenvalues for triangular matrices
If A is an n × n triangular matrix, then its eigenvalues are
the entries on its main diagonal
◼ Ex 7: Finding eigenvalues for triangular and diagonal matrices
(a) A = [2 0 0; −1 1 0; 5 3 −3]
(b) A = [−1 0 0 0 0; 0 2 0 0 0; 0 0 0 0 0; 0 0 0 −4 0; 0 0 0 0 3]
Sol:
(a) |λI − A| = |λ−2 0 0; 1 λ−1 0; −5 −3 λ+3| = (λ − 2)(λ − 1)(λ + 3) = 0
⇒ λ1 = 2, λ2 = 1, λ3 = −3
※ Recall that the determinant of a triangular matrix is the product of
the entries on its main diagonal, which is why Thm. 3 holds
(b) λ1 = −1, λ2 = 2, λ3 = 0, λ4 = −4, λ5 = 3
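※ A quick numerical illustration of Thm. 3 with the triangular matrix of Ex 7(a) (a NumPy sketch, not part of the slides):

```python
import numpy as np

A = np.array([[ 2.0, 0.0,  0.0],
              [-1.0, 1.0,  0.0],
              [ 5.0, 3.0, -3.0]])

# For a triangular matrix the eigenvalues are exactly the diagonal entries
print(sorted(np.linalg.eigvals(A)))   # [-3.0, 1.0, 2.0]
print(sorted(np.diag(A)))             # [-3.0, 1.0, 2.0]
```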
◼ Eigenvalues and eigenvectors of linear transformations:
A number λ is called an eigenvalue of a linear transformation
T: V → V if there is a nonzero vector x such that T(x) = λx.
The vector x is called an eigenvector of T corresponding to λ,
and the set of all eigenvectors of λ (together with the zero
vector) is called the eigenspace of λ
※ Remember: the definition of a linear transformation
※ Here the linear transformation and some of its basic properties are briefly introduced
※ In a typical example of a linear transformation, each component of the resulting
vector is a linear combination of the components of the input vector x

◼ An example of a linear transformation T: R^3 → R^3:
T(x1, x2, x3) = (x1 + 3x2, 3x1 + x2, −2x3)
◼ Theorem: Standard matrix for a linear transformation
Let T: R^n → R^n be a linear transformation such that
T(e1) = [a11 a21 ... an1]^T, T(e2) = [a12 a22 ... an2]^T, ..., T(en) = [a1n a2n ... ann]^T,
where {e1, e2, ..., en} is the standard basis for R^n. Then the n × n
matrix A whose i-th column is T(ei),
A = [T(e1) T(e2) ... T(en)] = [a11 a12 ... a1n; a21 a22 ... a2n; ...; an1 an2 ... ann],
satisfies T(x) = Ax for every x in R^n, and A is called the standard matrix for T.
◼ Consider the same linear transformation T(x1, x2, x3) = (x1 + 3x2, 3x1 + x2, −2x3):
T(e1) = T([1 0 0]^T) = [1 3 0]^T,  T(e2) = T([0 1 0]^T) = [3 1 0]^T,  T(e3) = T([0 0 1]^T) = [0 0 −2]^T

◼ Thus, the above linear transformation T has the following corresponding
standard matrix A such that T(x) = Ax:
A = [1 3 0; 3 1 0; 0 0 −2]
Ax = [1 3 0; 3 1 0; 0 0 −2][x1 x2 x3]^T = [x1 + 3x2  3x1 + x2  −2x3]^T

※ The statement on Slide 18 is valid because for any linear transformation T: V → V,
there is a corresponding square matrix A such that T(x) = Ax. Consequently, the
eigenvalues and eigenvectors of a linear transformation T are in essence the
eigenvalues and eigenvectors of the corresponding square matrix A
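※ A minimal NumPy sketch of this construction (the helper name T is illustrative): build A column by column from T(e1), T(e2), T(e3) and check that T(x) = Ax:

```python
import numpy as np

# The linear transformation from the slide
def T(x):
    x1, x2, x3 = x
    return np.array([x1 + 3 * x2, 3 * x1 + x2, -2 * x3])

# Columns of the standard matrix A are T(e1), T(e2), T(e3)
E = np.eye(3)
A = np.column_stack([T(E[:, i]) for i in range(3)])
print(A)
# [[ 1.  3.  0.]
#  [ 3.  1.  0.]
#  [ 0.  0. -2.]]

# T(x) = Ax for any x
x = np.array([5.0, -1.0, 4.0])
print(np.allclose(T(x), A @ x))   # True
```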
◼ Ex 8: Finding eigenvalues and eigenvectors for standard matrices
Find the eigenvalues and corresponding eigenvectors for
A = [1 3 0; 3 1 0; 0 0 −2]
※ A is the standard matrix for T(x1, x2, x3) = (x1 + 3x2, 3x1 + x2, −2x3) (see Slides 19 and 20)
Sol:
|λI − A| = |λ−1 −3 0; −3 λ−1 0; 0 0 λ+2| = (λ + 2)^2 (λ − 4) = 0
⇒ eigenvalues λ1 = 4, λ2 = −2

For λ1 = 4, a corresponding eigenvector is (1, 1, 0).
For λ2 = −2, corresponding eigenvectors are (1, −1, 0) and (0, 0, 1).
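※ A NumPy check of Ex 8 (a sketch; np.linalg.eig normalizes its eigenvectors, so they agree with (1, 1, 0), (1, −1, 0), (0, 0, 1) only up to scaling and up to the choice of basis within the two-dimensional eigenspace of λ = −2):

```python
import numpy as np

A = np.array([[1.0, 3.0,  0.0],
              [3.0, 1.0,  0.0],
              [0.0, 0.0, -2.0]])

vals, vecs = np.linalg.eig(A)
print(vals)   # 4 once and -2 twice (in some order)

# Each returned column is indeed an eigenvector for its eigenvalue
for val, v in zip(vals, vecs.T):
    print(val, np.allclose(A @ v, val * v))   # True for every pair
```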
◼ Transformation matrix A' for nonstandard bases
T(x) = Ax ⇔ [T(x)]_B = A[x]_B, where A = [[T(e1)]_B [T(e2)]_B ... [T(en)]_B]
is the standard matrix for T (the matrix of T relative to the standard
basis B = {e1, e2, ..., en})

The above theorem can be extended to a nonstandard basis B', which
consists of {v1, v2, ..., vn}:
[T(x)]_B' = A'[x]_B', where A' = [[T(v1)]_B' [T(v2)]_B' ... [T(vn)]_B']
is the transformation matrix for T relative to the basis B'

※ On the next two slides, an example is provided to verify numerically that this
extension is valid
◼ Ex: Consider an arbitrary nonstandard basis B' = {v1, v2, v3} = {(1, 1, 0), (1, −1, 0), (0, 0, 1)},
and find the transformation matrix A' such that [T(x)]_B' = A'[x]_B' for the same
linear transformation T(x1, x2, x3) = (x1 + 3x2, 3x1 + x2, −2x3)

[T(v1)]_B' = [T([1 1 0]^T)]_B' = [[4 4 0]^T]_B' = [4 0 0]^T,
[T(v2)]_B' = [T([1 −1 0]^T)]_B' = [[−2 2 0]^T]_B' = [0 −2 0]^T,
[T(v3)]_B' = [T([0 0 1]^T)]_B' = [[0 0 −2]^T]_B' = [0 0 −2]^T

⇒ A' = [4 0 0; 0 −2 0; 0 0 −2]
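※ A NumPy sketch of this computation (names P, coords, T are illustrative): each column of A' is the B'-coordinate vector of T(vi), obtained by solving P c = T(vi), where the columns of P are the basis vectors:

```python
import numpy as np

def T(x):
    x1, x2, x3 = x
    return np.array([x1 + 3 * x2, 3 * x1 + x2, -2 * x3])

# Columns of P are the nonstandard basis vectors v1, v2, v3
P = np.column_stack([[1.0,  1.0, 0.0],
                     [1.0, -1.0, 0.0],
                     [0.0,  0.0, 1.0]])

def coords(w):
    # [w]_B' : coordinates of w relative to B', i.e. the solution of P c = w
    return np.linalg.solve(P, w)

# The i-th column of A' is [T(v_i)]_B'
A_prime = np.column_stack([coords(T(P[:, i])) for i in range(3)])
print(A_prime)
# [[ 4.  0.  0.]
#  [ 0. -2.  0.]
#  [ 0.  0. -2.]]

# Check with x = (5, -1, 4) as on the next slide: [T(x)]_B' = A'[x]_B'
x = np.array([5.0, -1.0, 4.0])
print(np.allclose(coords(T(x)), A_prime @ coords(x)))   # True
```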
◼ Consider x = (5, −1, 4), and check that [T(x)]_B' = A'[x]_B' for the linear
transformation T(x1, x2, x3) = (x1 + 3x2, 3x1 + x2, −2x3)

[T(x)]_B' = [T([5 −1 4]^T)]_B' = [[2 14 −8]^T]_B' = [8 −6 −8]^T,
[x]_B' = [[5 −1 4]^T]_B' = [2 3 4]^T

A'[x]_B' = [4 0 0; 0 −2 0; 0 0 −2][2 3 4]^T = [8 −6 −8]^T = [T(x)]_B'

◼ For a special basis B' = {v1, v2, ..., vn}, where the vi's are eigenvectors of
the standard matrix A, A' is obtained immediately and is diagonal, because
T(vi) = Avi = λi vi
and
λi vi = 0v1 + 0v2 + ... + λi vi + ... + 0vn, so [T(vi)]_B' = λi ei (the i-th column of A')

Let B' be a basis of R^3 made up of the three linearly independent eigenvectors
of A found in Ex. 8, i.e.,
B' = {v1, v2, v3} = {(1, 1, 0), (1, −1, 0), (0, 0, 1)}
(an eigenvector for λ1 = 4 followed by two eigenvectors for λ2 = −2). Then
A' = [4 0 0; 0 −2 0; 0 0 −2],
with the eigenvalues of A on its main diagonal.
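※ An equivalent numerical check (a sketch; the slides build A' column by column, while the shortcut used here is the similarity form P⁻¹AP with the eigenvectors of Ex. 8 as the columns of P, which yields the same diagonal matrix):

```python
import numpy as np

A = np.array([[1.0, 3.0,  0.0],
              [3.0, 1.0,  0.0],
              [0.0, 0.0, -2.0]])

# Columns of P: the eigenvector basis B' from Ex. 8
P = np.column_stack([[1.0,  1.0, 0.0],
                     [1.0, -1.0, 0.0],
                     [0.0,  0.0, 1.0]])

# Relative to an eigenvector basis, the matrix of T is diagonal,
# with the eigenvalues 4, -2, -2 on the diagonal
A_prime = np.linalg.inv(P) @ A @ P
print(np.round(A_prime, 12))
# [[ 4.  0.  0.]
#  [ 0. -2.  0.]
#  [ 0.  0. -2.]]
```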
Keywords in Section 1:
◼ eigenvalue problem
◼ eigenvalue
◼ eigenvector
◼ characteristic equation
◼ characteristic polynomial
◼ eigenspace
◼ multiplicity
◼ linear transformation
◼ diagonalization

