# Info

On the left-hand side of the equation, matrix **A** is postmultiplied by the matrix **U** of the eigenvectors whereas, on the right-hand side, the matrix of eigenvalues $\Lambda$ is premultiplied by **U**. It follows that **U** achieves a two-way transformation (rows, columns), from reference system **A** to system $\Lambda$. This transformation can go both ways, as shown by the following equations, which are both derived from eq. 2.27:

$$\Lambda = \mathbf{U}^{-1}\mathbf{A}\mathbf{U} \qquad\text{and}\qquad \mathbf{A} = \mathbf{U}\,\Lambda\,\mathbf{U}^{-1}$$

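The two-way transformation can be checked numerically. The sketch below uses NumPy's `numpy.linalg.eig`; the 2×2 matrix is an arbitrary illustrative choice, not one from the text.

```python
import numpy as np

# Illustrative symmetric matrix (not from the text).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of U are eigenvectors; lam holds the corresponding eigenvalues.
lam, U = np.linalg.eig(A)
Lam = np.diag(lam)          # the diagonal matrix of eigenvalues (Lambda)
U_inv = np.linalg.inv(U)

# The two-way transformation: Lambda = U^-1 A U  and  A = U Lambda U^-1
print(np.allclose(U_inv @ A @ U, Lam))   # True
print(np.allclose(U @ Lam @ U_inv, A))   # True
```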
A simple formula may be derived from $\mathbf{A} = \mathbf{U}\,\Lambda\,\mathbf{U}^{-1}$, which can be used to raise matrix **A** to any power $x$:

$$\mathbf{A}^{x} = \mathbf{U}\,\Lambda^{x}\,\mathbf{U}^{-1}$$

Raising a matrix to some high power is greatly facilitated by the fact that $\Lambda^{x}$ is a power of the matrix of eigenvalues, which is diagonal. Indeed, a diagonal matrix can be raised to any power $x$ by raising each of its diagonal elements to power $x$. It follows that the last equation may be rewritten as:

$$\mathbf{A}^{x} = \mathbf{U}\,\mathrm{diag}(\lambda_1^{x}, \ldots, \lambda_n^{x})\,\mathbf{U}^{-1}$$

This may easily be verified using the above example.
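As a numerical check, the sketch below raises a small matrix to the fifth power through its eigendecomposition and compares the result with repeated multiplication; the matrix and the exponent are illustrative assumptions.

```python
import numpy as np

# Illustrative symmetric matrix (not from the text).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, U = np.linalg.eig(A)
U_inv = np.linalg.inv(U)

x = 5
# Raise A to power x by raising each (diagonal) eigenvalue to power x.
A_x = U @ np.diag(lam ** x) @ U_inv

# Compare with direct repeated matrix multiplication.
print(np.allclose(A_x, np.linalg.matrix_power(A, x)))  # True
```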

Second property. — It was shown in Section 2.7 that, when the rank ($r$) of matrix $\mathbf{A}_{(n \times n)}$ is smaller than its order ($r < n$), determinant $|\mathbf{A}|$ equals 0. It was also shown that, when it is necessary to know the rank of a matrix, as for instance in dimensional analysis (Section 3.3), $|\mathbf{A}| = 0$ indicates that one must test for rank. Such a test naturally follows from the calculation of eigenvalues. Indeed, the determinant of a matrix is equal to the product of its eigenvalues:

$$|\mathbf{A}| = \prod_{i=1}^{n} \lambda_i$$

so that $|\mathbf{A}| = 0$ if one or several of the eigenvalues $\lambda_i = 0$. When the rank of a matrix is smaller than its order ($r < n$), this matrix has $(n - r)$ null eigenvalues. Thus, eigenvalues can be used to determine the rank of a matrix: the rank is equal to the number of nonzero eigenvalues. In the case of an association matrix among variables, the number of nonzero eigenvalues (i.e. the rank of **A**) is equal to the number of independent dimensions which are required to account for all the variance (Chapter 9).
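This property can be illustrated with a deliberately rank-deficient matrix; the example matrix below (third row equal to the sum of the first two) is an assumption for illustration, not one from the text.

```python
import numpy as np

# Rank-deficient symmetric matrix: row 3 = row 1 + row 2 (illustrative).
A = np.array([[2.0, 1.0, 3.0],
              [1.0, 2.0, 3.0],
              [3.0, 3.0, 6.0]])

lam = np.linalg.eigvals(A)

# det(A) equals the product of the eigenvalues (both ~0 here).
print(np.isclose(np.prod(lam), np.linalg.det(A)))   # True

# Rank = number of nonzero eigenvalues; one eigenvalue is ~0, so rank is 2.
rank = int(np.sum(~np.isclose(lam, 0.0)))
print(rank, np.linalg.matrix_rank(A))               # 2 2
```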

Third property. — It was implicitly assumed, up to this point, that the eigenvalues were all different from one another. It may happen, however, that some (say, $m$) eigenvalues are equal. These are known as multiple eigenvalues. In such a case, the question is whether or not matrix $\mathbf{A}_{(n \times n)}$ has $n$ distinct eigenvectors. In other words, are there $m$ linearly independent eigenvectors which correspond to the same eigenvalue?

By definition, the determinant of $(\mathbf{A} - \lambda_i\mathbf{I})$ is null (eq. 2.23):

$$|\mathbf{A} - \lambda_i\mathbf{I}| = 0$$

which means that the rank of $(\mathbf{A} - \lambda_i\mathbf{I})$ is smaller than $n$. In the case of multiple eigenvalues, if there are $m$ distinct eigenvectors corresponding to the $m$ identical eigenvalues $\lambda_i$, the determinant of $(\mathbf{A} - \lambda_i\mathbf{I})$ must be null for each of these eigenvalues, but in a different way each time. When $m = 1$, the condition for $|\mathbf{A} - \lambda_i\mathbf{I}| = 0$ is for its rank to be $r = n - 1$. Similarly, in a case of multiplicity, the condition for $|\mathbf{A} - \lambda_i\mathbf{I}|$ to be null $m$ times, but distinctly, is for its rank to be $r = n - m$. Consequently, for $n$ distinct eigenvectors to exist, the rank of $(\mathbf{A} - \lambda_i\mathbf{I})$ must be $r = n - m$, and this for any eigenvalue $\lambda_i$ of multiplicity $m$.
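The rank condition can be tested numerically. The sketch below uses a Householder reflection, chosen here (as an assumption, not a matrix from the text) because its eigenvalues are 1 with multiplicity 2 and −1 with multiplicity 1, so a full set of 3 distinct eigenvectors exists.

```python
import numpy as np

# Householder reflection I - 2vv^T/(v^T v): eigenvalues 1, 1, -1 (illustrative).
v = np.array([[1.0], [1.0], [1.0]])
A = np.eye(3) - 2 * (v @ v.T) / (v.T @ v)

n = 3
# For lambda = 1 (multiplicity m = 2), rank(A - lambda*I) must be n - m = 1.
print(np.linalg.matrix_rank(A - 1 * np.eye(n)))   # 1
# For lambda = -1 (multiplicity m = 1), rank(A + I) must be n - 1 = 2.
print(np.linalg.matrix_rank(A + np.eye(n)))       # 2
```

Since both ranks satisfy $r = n - m$, this matrix has $n = 3$ linearly independent eigenvectors despite the repeated eigenvalue.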

Numerical example. The following matrix has eigenvalues $\lambda_1 = \lambda_2 = 1$ and $\lambda_3 = -1$: