International Journal of Scientific & Engineering Research, Volume 2, Issue 6, June-2011 1

ISSN 2229-5518

Physical Behavior of Eigenvalues and Singular Values in Matrix Decompositions

Azmol Huda, R. L. Aguiar

Abstract— A well-founded and practical treatment of eigenvalue and singular value problems is potentially of interest to a wide variety of people, including design engineers, theoretical physicists, classical applied mathematicians, and numerical analysts who wish to carry out research in the matrix field. These concepts are used extensively in the real world, but they are sometimes poorly understood. This paper focuses on building a solid perception of eigenvalues and singular values together with their substantial meanings. Its main goal is to present an intuitive account of both eigenvalues and singular values in matrix decompositions, built largely on ideas from linear algebra, so that the reader can gain a better understanding of their physical meanings from graphical representations.

Index Terms—Eigenvalue, singular value, matrix decomposition, orthonormal basis, linear mapping.

—————————— • ——————————

1 INTRODUCTION

Eigenvalues and singular values have widespread application in diverse fields of empirical science, from mathematics to neuroscience, because they are straightforward, non-parametric quantities that extract pertinent information from a large matrix. With nominal effort they provide a roadmap to divulge the hidden, simplified structures that frequently lie beneath a large matrix. Eigenvalues play a significant role in situations where the matrix is a transformation from one vector space onto itself. The primary examples are systems of linear ordinary differential equations. The eigenvalues of a matrix can correspond to frequencies of vibration, critical values of stability factors, or energy levels of atoms. The most common application of eigenvalues is probably in the field of dimension reduction.
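As a minimal numerical illustration of the eigenvalue relation described above (a hypothetical 2-by-2 example in NumPy; the paper's own experiments in section 6 use MATLAB):

```python
import numpy as np

# A symmetric 2-by-2 matrix of the kind arising from a two-mass
# vibration problem; its eigenvalues play the role of squared
# natural frequencies. The matrix itself is a made-up example.
A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each eigenpair satisfies A x = lambda x.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)

print(np.sort(eigenvalues))   # the two eigenvalues, here 1 and 3
```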

Singular values also play a vital role where the matrix is a transformation from one vector space to a different vector space, possibly with a dissimilar dimension. Systems of over- or under-determined algebraic equations are the most important examples. The term "singular value" relates to the distance between a matrix and the set of singular matrices.

Since the seminal studies of eigenvalues and singular values, a great deal of research has been devoted to this field [1], [2], [3], and the concepts are familiar to most mathematicians and engineers; nevertheless, there is very little work that directly addresses the relation between eigenvalues and singular values together with their graphical properties in an efficient and representative way. In paper [3], the author addresses a number of numerical issues of singular values arising in the study of models of linear systems and their applications. Numerical computation of the characteristic values of a real symmetric matrix and a numerically stable, fairly fast technique for obtaining the singular values are discussed in papers [4] and [5]. The authors of paper [6] describe the solution of large-scale eigenvalue problems. In papers [7] and [8], the authors show that the largest eigenvalue of the sample covariance matrix tends to a limit under certain conditions, and that the limit of the cumulative distribution function of the eigenvalues can be determined by a technique of moments. In paper [9], C.B. Moler presents a graphical depiction, but the singular value decomposition and its relation with the eigenvalues are not discussed in broader detail. In this paper, we put in plain words an intuitive feel for the physical meaning of eigenvalues and singular values, with emphasis on their graphical consequences, supported by a numerical example worked in a meticulous way.

The rest of this paper is structured as follows. The notation used throughout the paper is given in section 2. We discuss the eigenvalue and singular value matrix decompositions in section 3. The relation between eigenvalues and singular values and their geometrical elucidation are given in sections 4 and 5 respectively. Graphical representations are presented in section 6. Concluding remarks are presented in section 7.

2 NOTATIONS

Symbol : Meaning
A : square or rectangular matrix of order n over a field F;
λ_i : eigenvalues of matrix A;
Λ : n-by-n diagonal matrix with the λ_j on the diagonal;
X : n-by-n matrix whose j-th column is x_j;
U, V : orthogonal or unitary matrices;
σ_i : singular values of A;
Σ : diagonal matrix with the σ_i on the diagonal;
A^T : transpose of matrix A;
A^H : Hermitian (conjugate) transpose of matrix A;
ℝ : set of real numbers.

IJSER © 2011 http://www.ijser.org
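The symbols above can be made concrete with a short sketch (NumPy standing in for the MATLAB environment of section 6; the matrix here is an arbitrary illustration, not one from the paper):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])        # A : a square matrix of order n = 2 over the reals

lam, X = np.linalg.eig(A)         # lam[i] : the eigenvalues; X : j-th column is x_j
Lam = np.diag(lam)                # Lambda : diagonal matrix of the eigenvalues

U, sigma, Vh = np.linalg.svd(A)   # sigma[i] : the singular values
Sigma = np.diag(sigma)            # Sigma : diagonal matrix of the singular values

At = A.T                          # A^T : transpose; for real A it coincides with A^H

# Consistency with the definitions: A X = X Lambda and A = U Sigma V^H.
assert np.allclose(A @ X, X @ Lam)
assert np.allclose(U @ Sigma @ Vh, A)
```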


3 EIGENVALUE AND SINGULAR VALUE MATRIX DECOMPOSITIONS

Let A be a square matrix of order n over a field F. A scalar λ ∈ F is an eigenvalue of the matrix A if there exists a non-zero column vector x for which

Ax = λx    (1)

The eigenvalue–eigenvector equation for a square matrix can be written as (A − λI)x = 0, x ≠ 0, where I denotes the identity matrix of order n. This implies that (A − λI) is singular and hence that det(A − λI) = 0. This is called the characteristic equation, and its left-hand side the characteristic polynomial, of A. The degree of the polynomial is the order of the matrix, and an n-by-n matrix has n eigenvalues, counting repeated roots. Let λ_1, λ_2, ..., λ_n be the eigenvalues of a matrix A and x_1, x_2, ..., x_n be a set of corresponding eigenvectors. Let Λ denote the n-by-n diagonal matrix with the λ_j on the diagonal, and let X denote the n-by-n matrix whose j-th column is x_j; then we can write

AX = XΛ    (2)

In equation (2), it is essential to put Λ on the right side so that each column of X is multiplied by its corresponding eigenvalue. A noteworthy caveat is that the next step is not valid for all matrices. At this juncture, we assume that the eigenvectors are linearly independent; then the inverse of the matrix X exists and

A = XΛX^-1    (3)

with the non-singular matrix X. This is well known as the eigenvalue decomposition of the matrix A. When it exists, it allows us to investigate the properties and characterization of A by exploring the diagonal elements of Λ. For illustration, repeated matrix powers can be expressed in terms of powers of scalars, i.e.

A^k = XΛ^k X^-1    (4)

But if the eigenvectors of A are not linearly independent, then such a diagonalization does not exist and the powers of A exhibit a more complex behavior. If T is any non-singular matrix, then A = TQT^-1 is known as a similarity transformation, and A and Q are said to be similar. If Ax = λx and x = Ty, then Qy = λy. So a similarity transformation preserves eigenvalues. In general, the eigenvalue decomposition is an attempt to find a similarity transformation to diagonal form.

Now let A be a rectangular matrix of order m-by-n and let the rank of A^T A be r. A^T A is a square symmetric matrix of order n-by-n. Let us define some more quantities: let V = {v_1, v_2, ..., v_r} be the set of orthonormal n×1 eigenvectors with associated eigenvalues {λ_1, λ_2, ..., λ_r} of the symmetric matrix A^T A, so that (A^T A)v = λv. Also let U = {u_1, u_2, ..., u_r} be the set of m×1 vectors defined by

u_i = (1/σ_i) Av_i,  where  u_i · u_j = 1 if i = j and 0 otherwise,

and σ_i is related to the eigenvalues by σ_i = √λ_i. This yields the perhaps unexpected property ||Av_i|| = σ_i. A scalar σ ∈ F is called a singular value of a rectangular matrix A, with a pair of singular vectors u and v, if it satisfies the relation Av = σu. This can be written in matrix form as

AV = UΣ    (5)

or A^H U = VΣ^H, where Σ stands for the diagonal matrix with the σ_i on the diagonal. The superscript H stands for Hermitian transpose. The mathematical intuition behind the construction of the matrix Σ is that we wish to express all n scalar equations in just one matrix equation. The resulting form of the SVD is compact, and the process is easy to comprehend graphically. The complete structures of the singular value decomposition are depicted in Figure 1. Figure 1.(a) is the basic form of the singular value decomposition of a matrix, whereas Figures 1.(b) and 1.(c) are the forms of the singular value decomposition when m < n and m > n respectively. It turns out that singular vectors can always be chosen to be perpendicular to each other, so the matrices U and V, whose columns are the normalized singular vectors, satisfy U^H U = I and V^H V = I. In other words, U and V are orthogonal if they are real, and unitary if they are complex. From equation (5), we can effortlessly derive

A = UΣV^H    (6)

with the diagonal matrix Σ and orthogonal or unitary matrices U and V. This is well known as the singular value decomposition, or SVD, of the matrix A.

Fig. 1. Structures of singular value decompositions: (a) the general form A = UΣV^H; (b) the form when m < n; (c) the form when m > n.
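Both decompositions of this section can be checked numerically. The sketch below (NumPy, with an arbitrary 2-by-2 example not taken from the paper) verifies the eigenvalue decomposition, the matrix-power identity, and the SVD:

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [2.0, 2.0]])   # arbitrary example; its eigenvalues are 4 and -1

# Eigenvalue decomposition A = X Lambda X^-1, equation (3)
lam, X = np.linalg.eig(A)
Lam = np.diag(lam)
assert np.allclose(A, X @ Lam @ np.linalg.inv(X))

# Repeated powers, equation (4): A^k = X Lambda^k X^-1
k = 5
assert np.allclose(np.linalg.matrix_power(A, k),
                   X @ np.diag(lam**k) @ np.linalg.inv(X))

# Singular value decomposition A = U Sigma V^H, equation (6)
U, sigma, Vh = np.linalg.svd(A)
assert np.allclose(A, U @ np.diag(sigma) @ Vh)

# The columns of U and V are orthonormal: U^H U = I, V^H V = I
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(Vh @ Vh.T, np.eye(2))
```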


4 RELATION BETWEEN EIGENVALUE AND SINGULAR VALUE

In the earlier section, we established that A = UΣV^H. Now we can obtain the relation between eigenvalues and singular values by a straightforward computation:

A^H A = (UΣV^H)^H (UΣV^H) = VΣ^H U^H UΣV^H = V(Σ^H Σ)V^H    (7)

AA^H = (UΣV^H)(UΣV^H)^H = UΣV^H VΣ^H U^H = U(ΣΣ^H)U^H

The right-hand sides of these two equations are the eigenvalue decompositions of the left-hand sides. It is evident that the squares of the non-zero singular values of A are equal to the non-zero eigenvalues of both A^H A and AA^H. Furthermore, the columns of U are eigenvectors of AA^H and the columns of V are eigenvectors of A^H A.

5 GEOMETRICAL ELUCIDATION

Let us take a closer look at equation (1), Ax = λx, and ask whether there are any vectors which are not changed in direction by the deformation. The answer may be found easily from equation (1) itself. The square matrix A can be thought of as a transformation matrix: multiplying a vector x on the left by the matrix A yields another vector, transformed from its original position; an eigenvector is precisely a vector whose direction survives this transformation, being only scaled by λ. Moreover, it is easily seen that even if we scale the vector x by a non-zero constant c, the direction is preserved, since A(cx) = λ(cx).

On the contrary, the SVD applies to a possibly rectangular m-by-n matrix, which maps n-space onto m-space. The eigenvalue decomposition is applicable only to square matrices; in contrast, the singular value decomposition is possible for any rectangular matrix. In the SVD, we strive to find one change of basis in the domain and a typically different change of basis in the range, with the intention that the matrix becomes diagonal. Such bases always exist, and are real if A is real. Usually, the transforming matrices are orthogonal or unitary, so that they preserve lengths and angles and do not magnify errors. The geometrical outlook of the SVD can be summarized as follows: for every linear map Q : F^n → F^m over the field F, one can find orthonormal bases of F^n and F^m such that Q maps the i-th basis vector of F^n to a non-negative multiple of the i-th basis vector of F^m. With respect to these bases, the map Q is therefore represented by a diagonal matrix with non-negative real diagonal entries. Finally, for a more visual flavor of singular values and singular value decompositions, at least over a real vector space: if S is a sphere of radius one in ℝ^n, then the linear map Q : ℝ^n → ℝ^m maps this sphere onto an ellipsoid in ℝ^m, and the non-zero singular values are simply the lengths of the semi-axes of this ellipsoid.
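The consequence of equation (7), that the non-zero eigenvalues of A^H A and AA^H are the squares of the singular values, can be checked numerically; a sketch with an arbitrary real 2-by-3 matrix (not one from the paper):

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])   # 2-by-3, so A maps 3-space onto 2-space

sigma = np.linalg.svd(A, compute_uv=False)           # singular values, descending

eig_AHA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]  # eigenvalues of A^H A (3 of them)
eig_AAH = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]  # eigenvalues of A A^H (2 of them)

# The non-zero eigenvalues of both products agree with sigma_i^2.
assert np.allclose(eig_AAH, sigma**2)
assert np.allclose(eig_AHA[:2], sigma**2)
assert np.isclose(eig_AHA[2], 0.0)   # A has rank 2, so A^H A has one zero eigenvalue
```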

Fig. 2. Plot of eigenvalues.


Fig. 3. Plot of singular values.
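The orbit experiments of Figs. 2 and 3 can be reproduced numerically. The sketch below uses NumPy in place of the paper's MATLAB script, with the classic eigshow matrix A = [1/4 3/4; 1 1/2], which appears to be the transformation matrix of this section; it traces Ax as x traverses the unit circle and recovers the singular values as the extreme lengths of the resulting ellipse:

```python
import numpy as np

A = np.array([[0.25, 0.75],
              [1.00, 0.50]])          # assumed: the 2-by-2 matrix used in the text

# x traverses the unit circle, starting from x = [1 0]'.
t = np.linspace(0.0, 2.0 * np.pi, 2000)
X = np.vstack([np.cos(t), np.sin(t)])  # columns are the unit vectors x
AX = A @ X                             # transformed trajectory; Ax traces an ellipse
lengths = np.linalg.norm(AX, axis=0)

# The semi-axes of the ellipse are the singular values of A.
sigma = np.linalg.svd(A, compute_uv=False)
assert np.isclose(lengths.max(), sigma[0], atol=1e-3)
assert np.isclose(lengths.min(), sigma[1], atol=1e-3)
```

Plotting the columns of X and AX side by side reproduces the circle-to-ellipse picture of Fig. 3.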

6 GRAPHICAL REPRESENTATION

Here we demonstrate a graphical view of eigenvalues and singular values in MATLAB with a numerical example. We use the transformation matrix

A = [ 1/4  3/4
      1    1/2 ]

of order 2 and a unit vector x = [1 0]' to facilitate drawing a unit circle. The resulting trajectory of Ax is plotted. In Figure 2, the first four subplots, Figure 2.(a), 2.(b), 2.(c) and 2.(d), show the intermediate steps of the traversed orbits. The goal is to find the vectors x for which Ax is parallel to x; such a direction is an eigendirection of A. In the last subplots of Figure 2, the first eigenvalue is positive, so Ax lies on the same side as x.

7 CONCLUSION

In this paper, we have discussed several aspects of eigenvalues and singular values from the point of view of their underlying relations. To foster a better understanding, a pictorial visualization and elucidation are presented. Here, we have characterized and discussed the traversed orbits of both eigenvalues and singular values, which display a mapping from a sphere onto an ellipsoid: as the vector x moves on the unit circle, the vector Ax traverses an ellipse whose semi-axes are the singular values.

REFERENCES

[1] G.W. Stewart, "Matrix Algorithms: Basic Decompositions," SIAM, Philadelphia, 1998.
[2] H. Anton, C. Rorres, "Elementary Linear Algebra with Applications," 9th Edition, John Wiley and Sons, 2005.
[3] V.C. Klema, "The Singular Value Decomposition: Its Computation and Some Applications," IEEE Trans. Automatic Control, vol. 25, pp. 164-176, 1980.
[4] W. Givens, "Numerical Computation of the Characteristic Values of a Real Symmetric Matrix," ORNL Rep. 1574.
[5] G.H. Golub and W. Kahan, "Calculating the Singular Values and Pseudo-Inverse of a Matrix," SIAM J. Numer. Anal., vol. 2, pp. 205-224, 1965.
[6] R.B. Lehoucq, D.C. Sorensen and C. Yang, "ARPACK Users' Guide: Solution of Large-Scale Eigenvalue Problems with Implicitly Restarted Arnoldi Methods," SIAM, Philadelphia, 1998.
[7] D. Jonsson, "Some Limit Theorems for the Eigenvalues of a Sample Covariance Matrix," J. Multivariate Analysis, vol. 12, pp. 1-38, 1982.
[8] Y.Q. Yin, Z.D. Bai and P.R. Krishnaiah, "On the Limit of the Largest Eigenvalue of the Large Dimension Sample Covariance Matrix," Probability Theory and Related Fields, vol. 78, no. 4, pp. 509-521, 1988.
[9] C.B. Moler, "Numerical Computing with MATLAB," February 15, 2008.
[10] Y. Jiao, B. Yang, H. Wang, X. Niu, "SVD Based Robust Image Content Retrieval," International Conference on Intelligent Information Hiding and Multimedia Signal Processing, pp. 351-354, Dec. 2006.
[11] Y.H. Wang, T.N. Tan and Y. Zhu, "Face Verification Based on Singular Value Decomposition and Radial Basis Function Neural Network," (NLPR), Institute of Automation, Chinese Academy of Sciences, 2005.
[12] C. Sun, J. Hahn, "Parameter Reduction for Stable Dynamical Systems Based on Hankel Singular Values and Sensitivity Analysis," Chem. Eng. Sci., vol. 61, no. 16, pp. 5393-5403, 2006.
[13] S. Lall, J.E. Marsden, S. Glavaski, "Empirical Model Reduction of Controlled Nonlinear Systems," 14th IFAC World Congress, Beijing, 1999.
[14] C. Sun, J. Hahn, "Reduction of Differential-Algebraic Equation Systems Via Projections and System Identification," Journal of Process Control, vol. 15, pp. 639-650, 2005.
[15] G.H. Golub and C. Reinsch, "Singular Value Decomposition and Least Squares Solutions," Numer. Math., vol. 14, pp. 403-420, 1970.
[16] M.H.A. Biswas, M.H. Azmol, Munnujahan Ara, M.A. Rahman, "Optimal Control Theory and its Applications in Aerospace Engineering," International Journal on Academic Research, vol. 3, no. 2, March 2011.
