
Properties and significance

Geometric interpretation of Eigenvalues and Eigenvectors
An n×n matrix A multiplied by an n×1 vector x results in another
n×1 vector y = Ax. Thus A can be considered a
transformation matrix.

In general, a matrix acts on a vector by changing both its
magnitude and its direction. However, a matrix may act on
certain vectors by changing only their magnitude, and leaving
their direction unchanged (or possibly reversing it). These
vectors are the eigenvectors of the matrix.

A matrix acts on an eigenvector by multiplying its magnitude by
a factor, which is positive if its direction is unchanged and
negative if its direction is reversed. This factor is the eigenvalue
associated with that eigenvector.
Eigenvalues
Let x be an eigenvector of the matrix A. Then there must exist an
eigenvalue λ such that Ax = λx or, equivalently,
Ax − λx = 0 or
(A − λI)x = 0
If we define a new matrix B = A − λI, then
Bx = 0
If B has an inverse then x = B⁻¹0 = 0. But an eigenvector cannot
be zero.
Thus, it follows that x will be an eigenvector of A if and only if B
does not have an inverse, or equivalently det(B) = 0, or
det(A − λI) = 0
This is called the characteristic equation of A. Its roots
determine the eigenvalues of A.
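As a quick numerical illustration (a minimal sketch assuming NumPy is available; the 2×2 matrix is an arbitrary example chosen here), the roots of the characteristic polynomial det(A − λI) = 0 coincide with the eigenvalues returned by numpy.linalg.eigvals:

```python
import numpy as np

# Arbitrary 2x2 example matrix (for illustration only)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Characteristic polynomial of a 2x2 matrix: lambda^2 - tr(A)*lambda + det(A)
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.roots(coeffs)               # roots of det(A - lambda*I) = 0

eigvals = np.linalg.eigvals(A)         # eigenvalues computed directly

print(np.sort(roots), np.sort(eigvals))   # both give 2.0 and 5.0
```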
Eigenvectors




To each distinct eigenvalue of a matrix A there will correspond at
least one eigenvector, which can be found by solving the appropriate
set of homogeneous equations. If λᵢ is an eigenvalue, then the
corresponding eigenvector xᵢ is the solution of (A − λᵢI)xᵢ = 0.
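A sketch of this step, assuming NumPy and SciPy and reusing the same arbitrary 2×2 example matrix from above: each eigenvector is obtained as a basis of the null space of A − λI.

```python
import numpy as np
from scipy.linalg import null_space

# Same illustrative matrix as above; its eigenvalues are 2 and 5
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

for lam in (2.0, 5.0):
    # Eigenvectors are the non-zero solutions of (A - lam*I) x = 0,
    # i.e. a basis of the null space of A - lam*I
    x = null_space(A - lam * np.eye(2))
    print(lam, x.ravel())   # any non-zero scalar multiple is also an eigenvector
```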
Properties of Eigenvalues and Eigenvectors
Definition: The trace of a matrix A, designated by tr(A), is the sum
of the elements on the main diagonal.
Property 1: The sum of the eigenvalues of a matrix equals the
trace of the matrix.

Property 2: A matrix is singular if and only if it has a zero
eigenvalue.

Property 3: The eigenvalues of an upper (or lower) triangular
matrix are the elements on the main diagonal.

Property 4: If λ is an eigenvalue of A and A is invertible, then 1/λ
is an eigenvalue of A⁻¹.
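These four properties can be checked numerically; the sketch below assumes NumPy and uses an illustrative symmetric 3×3 matrix chosen only for the check.

```python
import numpy as np

# Illustrative symmetric 3x3 matrix (invertible, real eigenvalues)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
eig = np.linalg.eigvals(A)

# Property 1: the sum of the eigenvalues equals tr(A)
print(np.isclose(eig.sum(), np.trace(A)))

# Property 2: a zero eigenvalue would force det(A) = 0, i.e. A singular;
# here the product of the eigenvalues equals det(A) != 0
print(np.isclose(np.prod(eig), np.linalg.det(A)))

# Property 3: the eigenvalues of a triangular matrix are its diagonal entries
U = np.triu(A)
print(np.allclose(np.sort(np.linalg.eigvals(U)), np.sort(np.diag(U))))

# Property 4: the eigenvalues of A^-1 are the reciprocals 1/lambda
print(np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))), np.sort(1.0 / eig)))
```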
Properties of Eigenvalues and Eigenvectors
Property 5: If λ is an eigenvalue of A then kλ is an eigenvalue of
kA, where k is any arbitrary scalar.

Property 6: If λ is an eigenvalue of A then λᵏ is an eigenvalue of
Aᵏ for any positive integer k.

Property 8: If λ is an eigenvalue of A then λ is an eigenvalue of Aᵀ.

Property 9: The product of the eigenvalues (counting multiplicity)
of a matrix equals the determinant of the matrix.
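A similar numerical check of Properties 5, 6, 8 and 9, again assuming NumPy and reusing the same illustrative matrix:

```python
import numpy as np

# Same illustrative symmetric matrix as before
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
eig = np.sort(np.linalg.eigvals(A))

# Property 5: kA has eigenvalues k*lambda
k = 2.5
print(np.allclose(np.sort(np.linalg.eigvals(k * A)), k * eig))

# Property 6: A^k has eigenvalues lambda^k (k a positive integer)
print(np.allclose(np.sort(np.linalg.eigvals(np.linalg.matrix_power(A, 3))), eig**3))

# Property 8: A and A^T have the same eigenvalues
print(np.allclose(np.sort(np.linalg.eigvals(A.T)), eig))

# Property 9: the product of the eigenvalues equals det(A)
print(np.isclose(np.prod(eig), np.linalg.det(A)))
```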
Linearly independent eigenvectors
Theorem: Eigenvectors corresponding to distinct (that is, different)
eigenvalues are linearly independent.
Theorem: If λ is an eigenvalue of multiplicity k of an n×n matrix
A, then the number of linearly independent eigenvectors of A
associated with λ is given by m = n − r(A − λI). Furthermore,
1 ≤ m ≤ k.

Example 2 (cont.): The eigenvectors associated with λ = 2 have the form
x = s·x₁ + t·x₂, with s and t not both zero, so λ = 2 has two linearly
independent eigenvectors x₁ and x₂.
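The formula m = n − r(A − λI) is easy to verify numerically. The sketch below assumes NumPy and SciPy and uses a hypothetical stand-in matrix (Example 2's matrix is not reproduced here) whose eigenvalue λ = 2 also has multiplicity 2.

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical stand-in matrix: lambda = 2 has algebraic multiplicity 2
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 2.0]])
lam, n = 2.0, A.shape[0]

# m = n - r(A - lambda*I) gives the number of linearly independent eigenvectors
m = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(m)                                 # 2: two independent eigenvectors for lambda = 2

# The eigenvectors themselves span the null space of A - lambda*I
print(null_space(A - lam * np.eye(n)))   # two basis columns
```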
Significance of eigenvalues and eigenvectors
Schrödinger equation
An example of an eigenvalue equation where the transformation is represented
in terms of a differential operator is the time-independent Schrödinger
equation in quantum mechanics:
Hψ_E = Eψ_E
where H, the Hamiltonian, is a second-order differential operator and ψ_E,
the wavefunction, is one of its eigenfunctions corresponding to the eigenvalue E,
interpreted as its energy.
However, in the case where one is interested only in the bound state solutions
of the Schrödinger equation, one looks for ψ_E within the space of square
integrable functions. Since this space is a Hilbert space with a well-
defined scalar product, one can introduce a basis set in which ψ_E and H can be
represented as a one-dimensional array (a vector) and a matrix, respectively.
This allows one to represent the Schrödinger equation in matrix form.

Schrödinger equation
The bra–ket notation is often used in this context. A vector, which represents a
state of the system, in the Hilbert space of square integrable functions is
represented by |ψ_E⟩. In this notation, the Schrödinger equation is:
H|ψ_E⟩ = E|ψ_E⟩
where |ψ_E⟩ is an eigenstate of H and E represents the eigenvalue. H is a
self-adjoint operator, the infinite-dimensional analog of Hermitian matrices
(see Observable). As in the matrix case, H|ψ_E⟩ in the equation above is
understood to be the vector obtained by application of the transformation H
to |ψ_E⟩.
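As an illustration of the matrix form, the sketch below (assuming NumPy; the discretization, box length and units are illustrative choices, not part of the original text) builds a finite-difference Hamiltonian for a particle in a 1-D box and diagonalizes it; the lowest eigenvalues approximate the known energies k²π²/2.

```python
import numpy as np

# Minimal sketch of the matrix form H psi = E psi: a 1-D particle in a box of
# length 1 (units hbar = m = 1), with H = -(1/2) d^2/dx^2 discretized on a grid
n = 500
dx = 1.0 / (n + 1)
main = np.full(n, 1.0 / dx**2)        # from -(1/2)(psi[i+1] - 2 psi[i] + psi[i-1]) / dx^2
off = np.full(n - 1, -0.5 / dx**2)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E, psi = np.linalg.eigh(H)            # H is Hermitian (real symmetric here)
print(E[:3])                          # close to k^2 * pi^2 / 2 = 4.93, 19.74, 44.41
```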

Molecular orbitals

In quantum mechanics, and in particular in atomic and molecular physics,
within the Hartree–Fock theory, the atomic and molecular orbitals can be
defined by the eigenvectors of the Fock operator. The corresponding
eigenvalues are interpreted as ionization potentials via Koopmans' theorem. In
this case, the term eigenvector is used in a somewhat more general meaning,
since the Fock operator is explicitly dependent on the orbitals and their
eigenvalues. If one wants to underline this aspect one speaks of nonlinear
eigenvalue problem. Such equations are usually solved by
an iteration procedure, called in this case self-consistent field method.
In quantum chemistry, one often represents the Hartree–Fock equation in a
non-orthogonal basis set. This particular representation is a generalized
eigenvalue problem called the Roothaan equations.
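A minimal sketch of such a generalized eigenvalue problem, assuming NumPy and SciPy; the small Fock-like matrix F and overlap matrix S below are hypothetical numbers, not the output of an actual Hartree–Fock calculation.

```python
import numpy as np
from scipy.linalg import eigh

# Generalized eigenvalue problem F C = S C e, as in the Roothaan equations.
# F (Fock-like) and S (overlap) are small hypothetical symmetric matrices.
F = np.array([[-1.5, -0.4],
              [-0.4, -0.8]])
S = np.array([[1.0, 0.3],
              [0.3, 1.0]])

orbital_energies, C = eigh(F, S)   # solves F C = S C diag(e) for S positive definite
print(orbital_energies)            # eigenvalues interpreted as orbital energies
```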

Geology and glaciology

In geology, especially in the study of glacial till, eigenvectors and eigenvalues
are used as a method by which a mass of information of a clast fabric's
constituents' orientation and dip can be summarized in a 3-D space by six
numbers. In the field, a geologist may collect such data for hundreds or
thousands of clasts in a soil sample, which can only be compared graphically
such as in a Tri-Plot (Sneed and Folk) diagram, or as a Stereonet on a Wulff
Net.
The output for the orientation tensor is in the three orthogonal
(perpendicular) axes of space. The three eigenvectors are ordered v1, v2, v3
by their eigenvalues E1 ≥ E2 ≥ E3; v1 is then the primary orientation/dip of
the clasts, v2 the secondary and v3 the tertiary, in terms of strength. The
clast orientation is defined as the direction of the eigenvector, on a compass
rose of 360°. Dip is measured as the eigenvalue, the modulus of the tensor:
this is valued from 0° (no dip) to 90° (vertical). The relative values of E1,
E2, and E3 are dictated by the nature of the sediment's fabric. If E1 = E2 = E3,
the fabric is said to be isotropic. If E1 = E2 > E3, the fabric is said to be
planar. If E1 > E2 > E3, the fabric is said to be linear.
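A sketch of how such an orientation tensor might be computed, assuming NumPy; the clast orientation data are synthetic, and the construction used here (mean of outer products of unit orientation vectors) is one common convention, shown only for illustration.

```python
import numpy as np

# Synthetic clast fabric: each clast orientation is a unit vector; the
# orientation tensor is the mean of their outer products, and its eigenvalues
# E1 >= E2 >= E3 summarize the fabric.
rng = np.random.default_rng(0)
v = rng.normal(size=(500, 3))
v[:, 0] += 3.0                                 # bias toward one direction
v /= np.linalg.norm(v, axis=1, keepdims=True)

T = (v[:, :, None] * v[:, None, :]).mean(axis=0)   # 3x3 orientation tensor
E, vecs = np.linalg.eigh(T)                        # ascending eigenvalues
E1, E2, E3 = E[::-1]
print(E1, E2, E3)        # here E1 > E2 ≈ E3, i.e. a roughly linear fabric
```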

Tensor of moment of inertia and
stress tensor
Tensor of moment of inertia
In mechanics, the eigenvectors of the moment of inertia tensor define
the principal axes of a rigid body. The tensor of moment of inertia is a key
quantity required to determine the rotation of a rigid body around its center of
mass.

Stress tensor
In solid mechanics, the stress tensor is symmetric and so can be decomposed
into a diagonal tensor with the eigenvalues on the diagonal and eigenvectors as
a basis. Because it is diagonal, in this orientation, the stress tensor has
no shear components; the components it does have are the principal
components.
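A minimal sketch, assuming NumPy; the symmetric stress tensor below is hypothetical, and its eigendecomposition gives the principal stresses and principal axes.

```python
import numpy as np

# Hypothetical symmetric stress tensor (values in MPa, illustrative only)
sigma = np.array([[ 50.0,  30.0,   0.0],
                  [ 30.0, -20.0,   0.0],
                  [  0.0,   0.0,  10.0]])

principal_stresses, principal_axes = np.linalg.eigh(sigma)
print(principal_stresses)          # diagonal of the stress tensor in the principal frame

# In the eigenvector basis the shear components vanish:
print(np.round(principal_axes.T @ sigma @ principal_axes, 6))
```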

Eigenvalues of a graph


In spectral graph theory, an eigenvalue of a graph is defined as an eigenvalue of
the graph's adjacency matrix A, or (increasingly) of the graph's Laplacian
matrix (see also Discrete Laplace operator), which is either D − A (sometimes
called the combinatorial Laplacian) or I − D^(−1/2) A D^(−1/2) (sometimes called
the normalized Laplacian), where D is a diagonal matrix with D_ii equal to the
degree of vertex v_i, and in D^(−1/2), the i-th diagonal entry is 1/√deg(v_i).
The k-th principal eigenvector of a graph is defined as either the eigenvector
corresponding to the k-th largest or k-th smallest eigenvalue of the Laplacian.
The first principal eigenvector of the graph is also referred to merely
as the principal eigenvector.
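A small sketch of these definitions, assuming NumPy and using a 4-vertex cycle graph as an arbitrary example:

```python
import numpy as np

# Adjacency matrix A of a 4-cycle, degree matrix D, combinatorial Laplacian
# L = D - A, and normalized Laplacian I - D^(-1/2) A D^(-1/2)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))                    # D_ii = degree of vertex i

L = D - A                                     # combinatorial Laplacian
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(D)))
L_norm = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt  # normalized Laplacian

print(np.linalg.eigvalsh(L))                  # eigenvalue 0 appears once per connected component
print(np.linalg.eigvalsh(L_norm))
```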

Eigenvalues of a graph

The principal eigenvector is used to measure the centrality of its vertices. An
example is Google's PageRank algorithm. The principal eigenvector of a
modified adjacency matrix of the World Wide Web graph gives the page ranks
as its components. This vector corresponds to the stationary distribution of
the Markov chain represented by the row-normalized adjacency matrix;
however, the adjacency matrix must first be modified to ensure a stationary
distribution exists. The eigenvector corresponding to the second smallest
eigenvalue of the Laplacian can be used to partition the graph into clusters,
via spectral clustering. Other methods are also available
for clustering.
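A toy sketch of the PageRank idea, assuming NumPy; the 4-page link graph and damping factor 0.85 are made up for illustration, and this is not Google's actual implementation.

```python
import numpy as np

# The page ranks are the principal eigenvector of a modified (damped,
# row-normalized) adjacency matrix, found here by power iteration.
A = np.array([[0, 1, 1, 0],     # A[i, j] = 1 if page i links to page j
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
P = A / A.sum(axis=1, keepdims=True)          # row-normalized adjacency matrix
d = 0.85
G = d * P + (1 - d) / 4                       # modification ensuring a stationary distribution

r = np.full(4, 0.25)
for _ in range(100):                          # power iteration toward the principal eigenvector
    r = r @ G
print(r / r.sum())                            # page ranks
```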

Basic reproduction number

The basic reproduction number (R0) is a fundamental number in the study of
how infectious diseases spread. If one infectious person is put into a
population of completely susceptible people, then R0 is the average number of
people that one typical infectious person will infect. The generation time of an
infection is the time, t_G, from one person becoming infected to the next person
becoming infected. In a heterogeneous population, the next generation matrix
defines how many people in the population will become infected after
time t_G has passed. R0 is then the largest eigenvalue of the next generation
matrix.
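A minimal sketch, assuming NumPy; the 2×2 next generation matrix below is hypothetical (two illustrative population groups), and R0 is taken as its largest eigenvalue (spectral radius).

```python
import numpy as np

# Hypothetical next generation matrix K for two groups: entry [i, j] is the
# mean number of new infections in group i caused by one case in group j
K = np.array([[2.0, 0.5],
              [1.0, 1.2]])

R0 = max(abs(np.linalg.eigvals(K)))   # spectral radius = basic reproduction number
print(R0)                             # about 2.4 for these illustrative numbers
```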
Eigenfaces
In image processing, processed images of faces can be seen as vectors whose
components are the brightnesses of each pixel.[26] The dimension of this vector
space is the number of pixels. The eigenvectors of the covariance
matrix associated with a large set of normalized pictures of faces are
called eigenfaces; this is an example of principal components analysis. They
are very useful for expressing any face image as a linear combination of some of
them. In the facial recognition branch of biometrics,
eigenfaces provide a means of applying data compression to faces
for identification purposes. Research related to eigen vision systems
determining hand gestures has also been carried out.
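A minimal eigenfaces sketch, assuming NumPy; random data stand in for real, normalized face images, and the number of retained components is an arbitrary choice.

```python
import numpy as np

# Each "image" is a vector of pixel brightnesses; the eigenvectors of the
# covariance matrix of the mean-centred image set are the eigenfaces.
rng = np.random.default_rng(1)
images = rng.random((200, 32 * 32))            # 200 hypothetical 32x32 images, flattened

mean_face = images.mean(axis=0)
centred = images - mean_face
cov = np.cov(centred, rowvar=False)            # pixel-by-pixel covariance matrix

eigvals, eigvecs = np.linalg.eigh(cov)         # ascending eigenvalue order
eigenfaces = eigvecs[:, ::-1][:, :10]          # top 10 principal components (eigenfaces)

# Any face can then be approximated as the mean face plus a linear
# combination of eigenfaces (data compression for identification)
weights = centred[0] @ eigenfaces
approx = mean_face + eigenfaces @ weights
print(weights.shape, approx.shape)
```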


Similar to this concept, eigenvoices represent the general direction of
variability in human pronunciations of a particular utterance, such as a word in
a language. Based on a linear combination of such eigenvoices, a new voice
pronunciation of the word can be constructed. These concepts have been
found useful in automatic speech recognition systems, for speaker adaptation.
