In linear algebra, a Jordan normal form, also known as a Jordan canonical form,[1][2] is an upper triangular matrix of a particular form called a Jordan matrix representing a linear operator on a finite-dimensional vector space with respect to some basis. Such a matrix has each non-zero off-diagonal entry equal to 1, immediately above the main diagonal (on the superdiagonal), and with identical diagonal entries to the left and below them.
Let V be a vector space over a field K. Then a basis with respect to which the matrix has the required form exists if and only if all eigenvalues of the matrix lie in K, or equivalently if the characteristic polynomial of the operator splits into linear factors over K. This condition is always satisfied if K is algebraically closed (for instance, if it is the field of complex numbers). The diagonal entries of the normal form are the eigenvalues (of the operator), and the number of times each eigenvalue occurs is called the algebraic multiplicity of the eigenvalue.[3][4][5]
If the operator is originally given by a square matrix M, then its Jordan normal form is also called the Jordan normal form of M. Any square matrix has a Jordan normal form if the field of coefficients is extended to one containing all the eigenvalues of the matrix. In spite of its name, the normal form for a given M is not entirely unique, as it is a block diagonal matrix formed of Jordan blocks, the order of which is not fixed; it is conventional to group blocks for the same eigenvalue together, but no ordering is imposed among the eigenvalues, nor among the blocks for a given eigenvalue, although the latter could for instance be ordered by weakly decreasing size.[3][4][5]
The Jordan–Chevalley decomposition is particularly simple with respect to a basis for which the operator takes its Jordan normal form. The diagonal form for diagonalizable matrices, for instance normal matrices, is a special case of the Jordan normal form.[6][7][8]
The Jordan normal form is named after Camille Jordan, who first stated the Jordan decomposition theorem in 1870.[9]
Some textbooks have the ones on the subdiagonal, that is, immediately below the main diagonal instead of on the superdiagonal. The eigenvalues are still on the main diagonal.[10][11]
An n × n matrix A is diagonalizable if and only if the sum of the dimensions of the eigenspaces is n, or, equivalently, if and only if A has n linearly independent eigenvectors. Not all matrices are diagonalizable; matrices that are not diagonalizable are called defective matrices. Consider the following matrix:

$$A = \begin{bmatrix} 5 & 4 & 2 & 1 \\ 0 & 1 & -1 & -1 \\ -1 & -1 & 3 & 0 \\ 1 & 1 & -1 & 2 \end{bmatrix}$$

Including multiplicity, the eigenvalues of A are λ = 1, 2, 4, 4. The dimension of the eigenspace corresponding to the eigenvalue 4 is 1 (and not 2), so A is not diagonalizable. However, there is an invertible matrix P such that J = P−1AP, where

$$J = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 4 & 1 \\ 0 & 0 & 0 & 4 \end{bmatrix}.$$

The matrix J is almost diagonal. This is the Jordan normal form of A. The section Example below fills in the details of the computation.
In general, a square complex matrix A is similar to a block diagonal matrix

$$J = \begin{bmatrix} J_1 & & \\ & \ddots & \\ & & J_p \end{bmatrix}$$

where each block Ji is a square matrix of the form

$$J_i = \begin{bmatrix} \lambda_i & 1 & & \\ & \lambda_i & \ddots & \\ & & \ddots & 1 \\ & & & \lambda_i \end{bmatrix}.$$

So there exists an invertible matrix P such that P−1AP = J is such that the only non-zero entries of J are on the diagonal and the superdiagonal. J is called the Jordan normal form of A. Each Ji is called a Jordan block of A. In a given Jordan block, every entry on the superdiagonal is 1.
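As a concrete check, SymPy can compute both P and J in exact arithmetic; a minimal sketch, using the 4×4 matrix that serves as this article's running example (reconstructed here from its stated eigenvalues and eigenvectors, so treat the entries as an assumption):

```python
from sympy import Matrix

# Running example of the article (reconstruction -- assumed): eigenvalues
# 1, 2, 4, 4, with only one independent eigenvector for the eigenvalue 4.
A = Matrix([[5, 4, 2, 1],
            [0, 1, -1, -1],
            [-1, -1, 3, 0],
            [1, 1, -1, 2]])

P, J = A.jordan_form()   # J = P**-1 * A * P is block diagonal
print(J)
```

Exact rational arithmetic sidesteps the numerical pitfalls discussed at the end of the article.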
Assuming this result, we can deduce the following properties:
- Counting multiplicities, the eigenvalues of J, and therefore of A, are the diagonal entries.
- Given an eigenvalue λi, its geometric multiplicity is the dimension of Ker(A − λiI), and it is the number of Jordan blocks corresponding to λi.
- The sum of the sizes of all Jordan blocks corresponding to an eigenvalue λi is its algebraic multiplicity.
- A is diagonalizable if and only if, for every eigenvalue λ of A, its geometric and algebraic multiplicities coincide.
- The Jordan block corresponding to λ is of the form λI + N, where N is a nilpotent matrix.
Consider the matrix A from the example in the previous section. The Jordan normal form is obtained by some similarity transformation:

$$P^{-1}AP = J, \quad \text{that is,} \quad AP = PJ.$$

Let P have column vectors pi, i = 1, ..., 4; then

$$A \begin{bmatrix} p_1 & p_2 & p_3 & p_4 \end{bmatrix} = \begin{bmatrix} p_1 & p_2 & p_3 & p_4 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 4 & 1 \\ 0 & 0 & 0 & 4 \end{bmatrix} = \begin{bmatrix} p_1 & 2p_2 & 4p_3 & p_3 + 4p_4 \end{bmatrix}.$$

We see that

$$Ap_1 = p_1, \quad Ap_2 = 2p_2, \quad Ap_3 = 4p_3, \quad Ap_4 = p_3 + 4p_4.$$

For i = 1, 2, 3 we have $Ap_i = \lambda_i p_i$, that is, $p_i$ is an eigenvector of A corresponding to the eigenvalue $\lambda_i$. For i = 4, multiplying both sides by $(A - 4I)$ gives

$$(A - 4I)^2 p_4 = (A - 4I) p_3.$$

But $(A - 4I) p_3 = 0$, so

$$(A - 4I)^2 p_4 = 0.$$

Thus,

$$p_4 \in \ker(A - 4I)^2.$$

Vectors such as $p_4$ are called generalized eigenvectors of A.
This example shows how to calculate the Jordan normal form of a given matrix.
Consider the matrix

$$A = \begin{bmatrix} 5 & 4 & 2 & 1 \\ 0 & 1 & -1 & -1 \\ -1 & -1 & 3 & 0 \\ 1 & 1 & -1 & 2 \end{bmatrix},$$
which is mentioned in the beginning of the article.
The characteristic polynomial of A is

$$\chi(\lambda) = \det(\lambda I - A) = \lambda^4 - 11\lambda^3 + 42\lambda^2 - 64\lambda + 32 = (\lambda - 1)(\lambda - 2)(\lambda - 4)^2.$$
This shows that the eigenvalues are 1, 2, 4 and 4, according to algebraic multiplicity. The eigenspace corresponding to the eigenvalue 1 can be found by solving the equation Av = 1v. It is spanned by the column vector v = (−1, 1, 0, 0)T. Similarly, the eigenspace corresponding to the eigenvalue 2 is spanned by w = (1, −1, 0, 1)T. Finally, the eigenspace corresponding to the eigenvalue 4 is also one-dimensional (even though this is a double eigenvalue) and is spanned by x = (1, 0, −1, 1)T. So, the geometric multiplicity (that is, the dimension of the eigenspace of the given eigenvalue) of each of the three eigenvalues is one. Therefore, the two eigenvalues equal to 4 correspond to a single Jordan block, and the Jordan normal form of the matrix A is the direct sum

$$J = J_1(1) \oplus J_1(2) \oplus J_2(4) = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 4 & 1 \\ 0 & 0 & 0 & 4 \end{bmatrix}.$$
There are three Jordan chains. Two have length one: {v} and {w}, corresponding to the eigenvalues 1 and 2, respectively. There is one chain of length two corresponding to the eigenvalue 4. To find this chain, calculate

$$\ker(A - 4I)^2 = \operatorname{span}\left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 0 \\ -1 \\ 1 \end{bmatrix} \right\},$$

where I is the 4 × 4 identity matrix. Pick a vector in the above span that is not in the kernel of A − 4I; for example, y = (1, 0, 0, 0)T. Now, (A − 4I)y = x and (A − 4I)x = 0, so {y, x} is a chain of length two corresponding to the eigenvalue 4.
The transition matrix P such that P−1AP = J is formed by putting these vectors next to each other as follows:

$$P = \begin{bmatrix} v & w & x & y \end{bmatrix} = \begin{bmatrix} -1 & 1 & 1 & 1 \\ 1 & -1 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 1 & 1 & 0 \end{bmatrix}.$$
A computation shows that the equation P−1AP = J indeed holds.
If we had interchanged the order in which the chain vectors appeared, that is, changing the order of v, w and {x, y} together, the Jordan blocks would be interchanged. However, the resulting Jordan forms are equivalent.
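The whole worked example can be replayed symbolically; a sketch, assuming the reconstructed example matrix and the chain vectors v, w, x, y quoted above:

```python
from sympy import Matrix, eye, zeros

# Example matrix of the article (reconstruction -- assumed)
A = Matrix([[5, 4, 2, 1],
            [0, 1, -1, -1],
            [-1, -1, 3, 0],
            [1, 1, -1, 2]])

v = Matrix([-1, 1, 0, 0])   # eigenvector for eigenvalue 1
w = Matrix([1, -1, 0, 1])   # eigenvector for eigenvalue 2
x = Matrix([1, 0, -1, 1])   # eigenvector for eigenvalue 4
y = Matrix([1, 0, 0, 0])    # generalized eigenvector: (A - 4I)y = x

# Jordan chain relations for the eigenvalue 4
assert (A - 4*eye(4)) * y == x
assert (A - 4*eye(4)) * x == zeros(4, 1)

P = Matrix.hstack(v, w, x, y)   # columns ordered chain by chain
J = P.inv() * A * P
print(J)
```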
Given an eigenvalue λ, every corresponding Jordan block gives rise to a Jordan chain of linearly independent vectors pi, i = 1, ..., b, where b is the size of the Jordan block. The generator, or lead vector, pb of the chain is a generalized eigenvector such that $(A - \lambda I)^b p_b = 0$. The vector $p_1 = (A - \lambda I)^{b-1} p_b$ is an ordinary eigenvector corresponding to λ. In general, pi is a preimage of pi−1 under A − λI, that is, $(A - \lambda I) p_i = p_{i-1}$. So the lead vector generates the chain via multiplication by A − λI.[13][2] Therefore, the statement that every square matrix A can be put in Jordan normal form is equivalent to the claim that the underlying vector space has a basis composed of Jordan chains.
We give a proof by induction that any complex-valued square matrix A may be put in Jordan normal form. Since the underlying vector space can be shown[14] to be the direct sum of invariant subspaces associated with the eigenvalues, A can be assumed to have just one eigenvalue λ. The 1 × 1 case is trivial. Let A be an n × n matrix. The range of A − λI, denoted by Ran(A − λI), is an invariant subspace of A. Also, since λ is an eigenvalue of A, the dimension r of Ran(A − λI) is strictly less than n, so, by the inductive hypothesis, Ran(A − λI) has a basis {p1, ..., pr} composed of Jordan chains.
Next consider the kernel, that is, the subspace Ker(A − λI). If

$$\operatorname{Ran}(A - \lambda I) \cap \operatorname{Ker}(A - \lambda I) = \{0\},$$

the desired result follows immediately from the rank–nullity theorem. (This would be the case, for example, if A were Hermitian.)
Otherwise, if

$$Q := \operatorname{Ran}(A - \lambda I) \cap \operatorname{Ker}(A - \lambda I) \neq \{0\},$$

let the dimension of Q be s ≤ r. Each vector in Q is an eigenvector, so Ran(A − λI) must contain s Jordan chains corresponding to s linearly independent eigenvectors. Therefore the basis {p1, ..., pr} must contain s vectors, say {p1, ..., ps}, that are lead vectors of these Jordan chains. We can "extend the chains" by taking the preimages of these lead vectors. (This is the key step.) Let qi be such that

$$(A - \lambda I) q_i = p_i, \qquad i = 1, \ldots, s.$$
Finally, we can pick any basis for the quotient space

$$\operatorname{Ker}(A - \lambda I) / Q$$

and then lift to vectors {z1, ..., zt} in Ker(A − λI). Each zi forms a Jordan chain of length 1. We just need to show that the union of {p1, ..., pr}, {z1, ..., zt}, and {q1, ..., qs} forms a basis for the vector space.
By the rank–nullity theorem, dim Ker(A − λI) = n − r, so t = (n − r) − s, and the number of vectors in the potential basis is r + s + t = n. To show linear independence, suppose some linear combination of the vectors is 0. Applying A − λI, we get some linear combination of the pi, with the qi becoming lead vectors among the pi. From linear independence of the pi, it follows that the coefficients of the vectors qi must be zero. Furthermore, no non-trivial linear combination of the zi can equal a linear combination of the pi, because then it would belong to Ran(A − λI) ∩ Ker(A − λI) = Q, which is impossible by the construction of the zi. Therefore the coefficients of the zi will also be 0. This leaves in the original linear combination just the pi terms, which are assumed to be linearly independent, and so their coefficients must be zero too. We have found a basis composed of Jordan chains, and this shows A can be put in Jordan normal form.
It can be shown that the Jordan normal form of a given matrix A is unique up to the order of the Jordan blocks.
Knowing the algebraic and geometric multiplicities of the eigenvalues is not sufficient to determine the Jordan normal form of A. Assuming the algebraic multiplicity m(λ) of an eigenvalue λ is known, the structure of the Jordan form can be ascertained by analyzing the ranks of the powers $(A - \lambda I)^k$ for k = 1, ..., m(λ). To see this, suppose an n × n matrix A has only one eigenvalue λ. So m(λ) = n. The smallest integer k1 such that

$$(A - \lambda I)^{k_1} = 0$$

is the size of the largest Jordan block in the Jordan form of A. (This number k1 is also called the index of λ. See discussion in a following section.) The rank of

$$(A - \lambda I)^{k_1 - 1}$$

is the number of Jordan blocks of size k1. Similarly, the rank of

$$(A - \lambda I)^{k_1 - 2}$$

is twice the number of Jordan blocks of size k1 plus the number of Jordan blocks of size k1 − 1. The general case is similar.
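The rank bookkeeping above is mechanical enough to automate; a sketch in exact arithmetic (the helper `jordan_block_sizes` is a name of our choosing, and the test matrix is the article's reconstructed example, which has a single size-2 block for the eigenvalue 4):

```python
from sympy import Matrix, eye

def jordan_block_sizes(A, lam):
    """Sizes of the Jordan blocks of A for eigenvalue lam, recovered
    from the ranks r[k] = rank((A - lam*I)**k)."""
    n = A.rows
    N = A - lam * eye(n)
    r = [n]
    power = eye(n)
    while True:
        power = power * N
        r.append(power.rank())
        if r[-1] == r[-2]:          # ranks have stabilized
            break
    sizes = []
    for k in range(1, len(r) - 1):
        # number of blocks of size exactly k: r[k-1] - 2*r[k] + r[k+1]
        sizes.extend([k] * (r[k - 1] - 2 * r[k] + r[k + 1]))
    return sorted(sizes, reverse=True)

A = Matrix([[5, 4, 2, 1], [0, 1, -1, -1], [-1, -1, 3, 0], [1, 1, -1, 2]])
print(jordan_block_sizes(A, 4))   # a single block of size 2
```

The identity used is that rank((A − λI)^(k−1)) − rank((A − λI)^k) counts the blocks of size at least k.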
This can be used to show the uniqueness of the Jordan form. Let J1 and J2 be two Jordan normal forms of A. Then J1 and J2 are similar and have the same spectrum, including algebraic multiplicities of the eigenvalues. The procedure outlined in the previous paragraph can be used to determine the structure of these matrices. Since the rank of a matrix is preserved by similarity transformation, there is a bijection between the Jordan blocks of J1 and J2. This proves the uniqueness part of the statement.
If A is a real matrix, its Jordan form can still be non-real. Instead of representing it with complex eigenvalues and ones on the superdiagonal, as discussed above, there exists a real invertible matrix P such that P−1AP = J is a real block diagonal matrix with each block being a real Jordan block.[15] A real Jordan block is either identical to a complex Jordan block (if the corresponding eigenvalue is real), or is a block matrix itself, consisting of 2×2 blocks (for a non-real eigenvalue λ = a + ib with given algebraic multiplicity) of the form

$$C = \begin{bmatrix} a & -b \\ b & a \end{bmatrix}.$$

These 2×2 blocks describe multiplication by λ = a + ib in the complex plane. The superdiagonal blocks are 2×2 identity matrices, and hence in this representation the matrix dimensions are larger than those of the complex Jordan form. The full real Jordan block is given by

$$J_i = \begin{bmatrix} C & I & & \\ & C & \ddots & \\ & & \ddots & I \\ & & & C \end{bmatrix}.$$

This real Jordan form is a consequence of the complex Jordan form. For a real matrix the non-real eigenvectors and generalized eigenvectors can always be chosen to form complex conjugate pairs. Taking the real and imaginary part (linear combination of the vector and its conjugate), the matrix has this form with respect to the new basis.
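The 2×2 block C is just multiplication by a + ib written in real coordinates; a small numerical sketch with arbitrarily chosen values:

```python
import numpy as np

a, b = 2.0, 3.0                     # the non-real eigenvalue a + bi
C = np.array([[a, -b],
              [b,  a]])             # 2x2 block of the real Jordan form

x, y = 0.7, -1.2                    # identify (x, y) with x + iy
u = C @ np.array([x, y])
z = (a + 1j * b) * (x + 1j * y)     # the same operation in the complex plane
print(u, z)
```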
Jordan reduction can be extended to any square matrix M whose entries lie in a field K. The result states that any M can be written as a sum D + N where D is semisimple, N is nilpotent, and DN = ND. This is called the Jordan–Chevalley decomposition. Whenever K contains the eigenvalues of M, in particular when K is algebraically closed, the normal form can be expressed explicitly as the direct sum of Jordan blocks.
Similar to the case when K is the complex numbers, knowing the dimensions of the kernels of $(M - \lambda I)^k$ for 1 ≤ k ≤ m, where m is the algebraic multiplicity of the eigenvalue λ, allows one to determine the Jordan form of M. We may view the underlying vector space V as a K[x]-module by regarding the action of x on V as application of M and extending by K-linearity. Then the polynomials $(x - \lambda)^k$ are the elementary divisors of M, and the Jordan normal form is concerned with representing M in terms of blocks associated to the elementary divisors.
The proof of the Jordan normal form is usually carried out as an application to the ring K[x] of the structure theorem for finitely generated modules over a principal ideal domain, of which it is a corollary.
One can see that the Jordan normal form is essentially a classification result for square matrices, and as such several important results from linear algebra can be viewed as its consequences.
Using the Jordan normal form, direct calculation gives a spectral mapping theorem for the polynomial functional calculus: Let A be an n × n matrix with eigenvalues λ1, ..., λn; then for any polynomial p, p(A) has eigenvalues p(λ1), ..., p(λn).
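A quick numerical illustration of this spectral mapping theorem, with an arbitrarily chosen triangular matrix (so its eigenvalues can be read off the diagonal) and polynomial:

```python
import numpy as np

A = np.array([[1.0, 5.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])        # triangular: eigenvalues 1, 2, 3

def p(M):
    return M @ M - 3 * M               # the polynomial p(z) = z**2 - 3z

eigs_pA = np.sort(np.linalg.eigvals(p(A)).real)
print(eigs_pA)                          # p(1), p(2), p(3) in sorted order
```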
The characteristic polynomial of A is $p_A(\lambda) = \det(\lambda I - A)$. Similar matrices have the same characteristic polynomial. Therefore,

$$p_A(\lambda) = \prod_i (\lambda - \lambda_i)^{m_i},$$

where λi is the ith root of pA and mi is its multiplicity, because this is clearly the characteristic polynomial of the Jordan form of A.
The Cayley–Hamilton theorem asserts that every matrix A satisfies its characteristic equation: if p is the characteristic polynomial of A, then $p(A) = 0$. This can be shown via direct calculation in the Jordan form, since if λi is an eigenvalue of multiplicity mi, then its Jordan block Ji clearly satisfies $(J_i - \lambda_i I)^{m_i} = 0$. As the diagonal blocks do not affect each other, the ith diagonal block of $(A - \lambda_i I)^{m_i}$ is $(J_i - \lambda_i I)^{m_i} = 0$; hence

$$p(A) = \prod_i (A - \lambda_i I)^{m_i} = 0.$$
The Jordan form can be assumed to exist over a field extending the base field of the matrix, for instance over the splitting field of p; this field extension does not change the matrix p(A) in any way.
The minimal polynomial P of a square matrix A is the unique monic polynomial of least degree, m, such that P(A) = 0. Alternatively, the set of polynomials that annihilate a given A form an ideal I in C[x], the principal ideal domain of polynomials with complex coefficients. The monic element that generates I is precisely P.
Let λ1, ..., λq be the distinct eigenvalues of A, and si be the size of the largest Jordan block corresponding to λi. It is clear from the Jordan normal form that the minimal polynomial of A is $\prod_{i=1}^{q} (x - \lambda_i)^{s_i}$, which has degree Σ si.
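For the article's example matrix (reconstructed; largest blocks of sizes 1, 1 and 2 for the eigenvalues 1, 2 and 4), the minimal polynomial should therefore be (x − 1)(x − 2)(x − 4)² of degree 4; a sketch checking this by substitution:

```python
from sympy import Matrix, eye, zeros

# Example matrix of the article (reconstruction -- assumed)
A = Matrix([[5, 4, 2, 1], [0, 1, -1, -1], [-1, -1, 3, 0], [1, 1, -1, 2]])
I = eye(4)

# (x - 1)(x - 2)(x - 4)**2 annihilates A ...
annihilates = (A - I) * (A - 2*I) * (A - 4*I)**2 == zeros(4, 4)
# ... but dropping one factor of (x - 4) does not
too_short = (A - I) * (A - 2*I) * (A - 4*I) == zeros(4, 4)
print(annihilates, too_short)
```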
While the Jordan normal form determines the minimal polynomial, the converse is not true. This leads to the notion of elementary divisors. The elementary divisors of a square matrix A are the characteristic polynomials of its Jordan blocks. The factors of the minimal polynomial m are the elementary divisors of largest degree corresponding to distinct eigenvalues.
The degree of an elementary divisor is the size of the corresponding Jordan block, and therefore the dimension of the corresponding invariant subspace. If all elementary divisors are linear, A is diagonalizable.
The Jordan form of an n × n matrix A is block diagonal, and therefore gives a decomposition of the n-dimensional Euclidean space into invariant subspaces of A. Every Jordan block Ji corresponds to an invariant subspace Xi. Symbolically, we put

$$\mathbb{C}^n = \bigoplus_{i=1}^{k} X_i,$$

where each Xi is the span of the corresponding Jordan chain, and k is the number of Jordan chains.
One can also obtain a slightly different decomposition via the Jordan form. Given an eigenvalue λi, the size of its largest corresponding Jordan block si is called the index of λi and denoted by ν(λi). (Therefore, the degree of the minimal polynomial is the sum of all indices.) Define a subspace Yi by

$$Y_i = \operatorname{Ker}(\lambda_i I - A)^{\nu(\lambda_i)}.$$

This gives the decomposition

$$\mathbb{C}^n = \bigoplus_{i=1}^{l} Y_i,$$
where l is the number of distinct eigenvalues of A. Intuitively, we group together the Jordan block invariant subspaces corresponding to the same eigenvalue. In the extreme case where A is a multiple of the identity matrix we have k = n and l = 1.
The projection onto Yi and along all the other Yj (j ≠ i) is called the spectral projection of A at λi and is usually denoted by P(λi; A). Spectral projections are mutually orthogonal in the sense that P(λi; A) P(λj; A) = 0 if i ≠ j. Also they commute with A and their sum is the identity matrix. Replacing every λi in the Jordan matrix J by one and zeroing all other entries gives P(λi; J); moreover, if U J U−1 is the similarity transformation such that A = U J U−1, then P(λi; A) = U P(λi; J) U−1. Spectral projections are not confined to finite dimensions. See below for their application to compact operators, and holomorphic functional calculus for a more general discussion.
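These properties can be checked in exact arithmetic by building P(λi; A) exactly as described: replace each diagonal occurrence of λi in J by one, zero everything else, and conjugate back. A sketch (example matrix reconstructed, helper name ours):

```python
from sympy import Matrix, eye, zeros

# Example matrix of the article (reconstruction -- assumed)
A = Matrix([[5, 4, 2, 1], [0, 1, -1, -1], [-1, -1, 3, 0], [1, 1, -1, 2]])
U, J = A.jordan_form()              # A == U * J * U**-1

def spectral_projection(lam):
    # P(lam; J): ones at the diagonal positions of lam, zeros elsewhere;
    # conjugating by U gives P(lam; A)
    E = zeros(4, 4)
    for i in range(4):
        if J[i, i] == lam:
            E[i, i] = 1
    return U * E * U.inv()

projections = [spectral_projection(lam) for lam in (1, 2, 4)]
print(sum(projections, zeros(4, 4)))   # should be the identity
```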
Comparing the two decompositions, notice that, in general, l ≤ k. When A is normal, the subspaces Xi in the first decomposition are one-dimensional and mutually orthogonal. This is the spectral theorem for normal operators. The second decomposition generalizes more easily for general compact operators on Banach spaces.
It might be of interest here to note some properties of the index, ν(λ). More generally, for a complex number λ, its index can be defined as the least non-negative integer ν(λ) such that

$$\operatorname{Ker}(A - \lambda I)^{\nu(\lambda)} = \operatorname{Ker}(A - \lambda I)^{m} \quad \text{for all } m \geq \nu(\lambda).$$

So ν(λ) > 0 if and only if λ is an eigenvalue of A. In the finite-dimensional case, ν(λ) ≤ the algebraic multiplicity of λ.
The Jordan form is used to find a normal form of matrices up to conjugacy such that the set of normal-form representatives makes up an algebraic variety of a low fixed degree in the ambient matrix space.
Sets of representatives of matrix conjugacy classes for Jordan normal form orrational canonical forms in general do not constitute linear or affine subspaces in the ambient matrix spaces.
Vladimir Arnold posed[16] the following problem: find a canonical form of matrices over a field for which the set of representatives of matrix conjugacy classes is a union of affine linear subspaces (flats). In other words, map the set of matrix conjugacy classes injectively back into the initial set of matrices so that the image of this embedding (the set of all normal matrices) has the lowest possible degree; it is a union of shifted linear subspaces.
It was solved for algebraically closed fields by Peteris Daugulis.[17] The construction of a uniquely defined plane normal form of a matrix starts by considering its Jordan normal form.
Iteration of the Jordan chain motivates various extensions to more abstract settings. For finite matrices, one gets matrix functions; this can be extended to compact operators and the holomorphic functional calculus, as described further below.
The Jordan normal form is the most convenient for computation of matrix functions (though it may not be the best choice for computer computations). Let f(z) be an analytic function of a complex argument. Applying the function to an n × n Jordan block J with eigenvalue λ results in an upper triangular matrix:

$$f(J) = \begin{bmatrix} f(\lambda) & f'(\lambda) & \frac{f''(\lambda)}{2} & \cdots & \frac{f^{(n-1)}(\lambda)}{(n-1)!} \\ 0 & f(\lambda) & f'(\lambda) & \cdots & \frac{f^{(n-2)}(\lambda)}{(n-2)!} \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & \cdots & 0 & f(\lambda) & f'(\lambda) \\ 0 & \cdots & 0 & 0 & f(\lambda) \end{bmatrix},$$

so that the elements of the k-th superdiagonal of the resulting matrix are $\frac{f^{(k)}(\lambda)}{k!}$. For a matrix of general Jordan normal form the above expression should be applied to each Jordan block.
The following example shows the application to the power function f(z) = zn: the element on the k-th superdiagonal of $J^n$ is

$$\binom{n}{k} \lambda^{n-k},$$

where the binomial coefficients are defined as $\binom{n}{k} = \frac{n(n-1)\cdots(n-k+1)}{k!}$. For a positive integer n this reduces to the standard definition of the coefficients. For negative n the identity $\binom{-n}{k} = (-1)^k \binom{n+k-1}{k}$ may be of use.
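A symbolic check of the superdiagonal formula for f(z) = zn on a single Jordan block (block size, eigenvalue and exponent chosen arbitrarily):

```python
from sympy import Matrix, binomial, Integer

lam, n = Integer(3), 5
J = Matrix([[lam, 1, 0],
            [0, lam, 1],
            [0, 0, lam]])           # one 3x3 Jordan block

Jn = J**n
# entry on the k-th superdiagonal should be binomial(n, k) * lam**(n - k)
ok = all(
    Jn[i, j] == (binomial(n, j - i) * lam**(n - (j - i)) if j >= i else 0)
    for i in range(3) for j in range(3)
)
print(Jn)
```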
A result analogous to the Jordan normal form holds for compact operators on a Banach space. One restricts to compact operators because every point x in the spectrum of a compact operator T is an eigenvalue, with the only possible exception being when x is the limit point of the spectrum. This is not true for bounded operators in general. To give some idea of this generalization, we first reformulate the Jordan decomposition in the language of functional analysis.
Let X be a Banach space, L(X) be the bounded operators on X, and σ(T) denote the spectrum of T ∈ L(X). The holomorphic functional calculus is defined as follows:
Fix a bounded operator T. Consider the family Hol(T) of complex functions that are holomorphic on some open set G containing σ(T). Let Γ = {γi} be a finite collection of Jordan curves such that σ(T) lies in the inside of Γ. We define f(T) by

$$f(T) = \frac{1}{2 \pi i} \oint_{\Gamma} f(z) (zI - T)^{-1} \, dz.$$

The open set G could vary with f and need not be connected. The integral is defined as the limit of Riemann sums, as in the scalar case. Although the integral makes sense for continuous f, we restrict to holomorphic functions to apply the machinery from classical function theory (for example, the Cauchy integral formula). The assumption that σ(T) lies in the inside of Γ ensures f(T) is well defined; it does not depend on the choice of Γ. The functional calculus is the mapping Φ from Hol(T) to L(X) given by

$$\Phi(f) = f(T).$$
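For matrices the contour integral can be evaluated numerically; a sketch approximating exp(T) with the trapezoidal rule on a circle enclosing σ(T) (matrix, radius and node count are arbitrary choices):

```python
import numpy as np

T = np.array([[1.0, 2.0],
              [0.0, 3.0]])                 # spectrum {1, 3}

# f(T) = (1 / (2*pi*i)) * integral over Gamma of f(z) (zI - T)^{-1} dz,
# with Gamma a circle of radius r about the origin, discretized at M nodes
M, r = 400, 5.0
fT = np.zeros((2, 2), dtype=complex)
for k in range(M):
    z = r * np.exp(2j * np.pi * k / M)
    dz = 2j * np.pi * z / M                # z'(t) dt along the circle
    fT += np.exp(z) * np.linalg.inv(z * np.eye(2) - T) * dz
fT /= 2j * np.pi
print(fT.real)                             # should approximate exp(T)
```

The trapezoidal rule converges very fast here because the integrand is analytic and periodic along the contour.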
We will require the following properties of this functional calculus:
1. Φ extends the polynomial functional calculus.
2. The spectral mapping theorem holds: σ(f(T)) = f(σ(T)).
3. Φ is an algebra homomorphism.
In the finite-dimensional case, σ(T) = {λi} is a finite discrete set in the complex plane. Let ei be the function that is 1 in some open neighborhood of λi and 0 elsewhere. By property 3 of the functional calculus, the operator

$$e_i(T)$$

is a projection. Moreover, let νi be the index of λi and

$$f(z) = (z - \lambda_i)^{\nu_i}.$$
The spectral mapping theorem tells us

$$f(T) e_i(T) = (T - \lambda_i I)^{\nu_i} e_i(T)$$

has spectrum {0}. By property 1, f(T) can be directly computed in the Jordan form, and by inspection, we see that the operator f(T)ei(T) is the zero matrix.
By property 3, f(T)ei(T) = ei(T)f(T). So ei(T) is precisely the projection onto the subspace

$$\operatorname{Ran} e_i(T) = \operatorname{Ker}(T - \lambda_i I)^{\nu_i}.$$
The relation

$$\sum_i e_i = 1$$

implies

$$\mathbb{C}^n = \bigoplus_i \operatorname{Ker}(T - \lambda_i I)^{\nu_i},$$

where the index i runs through the distinct eigenvalues of T. This is the invariant subspace decomposition

$$\mathbb{C}^n = \bigoplus_i Y_i$$
given in a previous section. Each ei(T) is the projection onto the subspace spanned by the Jordan chains corresponding to λi and along the subspaces spanned by the Jordan chains corresponding to λj for j ≠ i. In other words, ei(T) = P(λi; T). This explicit identification of the operators ei(T) in turn gives an explicit form of the holomorphic functional calculus for matrices:

$$f(T) = \sum_i \sum_{k=0}^{\nu_i - 1} \frac{f^{(k)}(\lambda_i)}{k!} (T - \lambda_i I)^k e_i(T).$$
Notice that the expression of f(T) is a finite sum because, on each neighborhood of λi, we have chosen the Taylor series expansion of f centered at λi.
Let T be a bounded operator and let λ be an isolated point of σ(T). (As stated above, when T is compact, every point in its spectrum is an isolated point, except possibly the limit point 0.)
The point λ is called a pole of operator T with order ν if the resolvent function RT defined by

$$R_T(z) = (zI - T)^{-1}$$

has a pole of order ν at λ.
We will show that, in the finite-dimensional case, the order of an eigenvalue coincides with its index. The result also holds for compact operators.
Consider the annular region A centered at the eigenvalue λ with sufficiently small radius ε such that the intersection of the open disc Bε(λ) and σ(T) is {λ}. The resolvent function RT is holomorphic on A. Extending a result from classical function theory, RT has a Laurent series representation on A:

$$R_T(z) = \sum_{m=-\infty}^{\infty} a_m (z - \lambda)^m,$$

where

$$a_{-m} = \frac{1}{2 \pi i} \oint_{C} (z - \lambda)^{m-1} R_T(z) \, dz$$

and C is a small circle centered at λ. By the previous discussion on the functional calculus,

$$a_{-m} = (T - \lambda I)^{m-1} e_\lambda(T),$$

where eλ is 1 on Bε(λ) and 0 elsewhere.
But we have shown that the smallest positive integer m such that

$$(T - \lambda I)^{m} e_\lambda(T) = 0$$

is precisely the index of λ, ν(λ). In other words, the function RT has a pole of order ν(λ) at λ.
If the matrix A has multiple eigenvalues, or is close to a matrix with multiple eigenvalues, then its Jordan normal form is very sensitive to perturbations. Consider for instance the matrix

$$A = \begin{bmatrix} 1 & 1 \\ \varepsilon & 1 \end{bmatrix}.$$

If ε = 0, then the Jordan normal form is simply

$$\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}.$$

However, for ε ≠ 0, the Jordan normal form is

$$\begin{bmatrix} 1 + \sqrt{\varepsilon} & 0 \\ 0 & 1 - \sqrt{\varepsilon} \end{bmatrix}.$$
This ill conditioning makes it very hard to develop a robust numerical algorithm for the Jordan normal form, as the result depends critically on whether two eigenvalues are deemed to be equal. For this reason, the Jordan normal form is usually avoided in numerical analysis; the stable Schur decomposition[18] or pseudospectra[19] are better alternatives.
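The sensitivity is easy to observe numerically; a sketch assuming the perturbed matrix discussed above, [[1, 1], [ε, 1]], whose eigenvalues are 1 ± √ε:

```python
import numpy as np

def eigenvalues(eps):
    # perturbed matrix from the article (reconstruction -- assumed)
    A = np.array([[1.0, 1.0],
                  [eps, 1.0]])
    return np.sort(np.linalg.eigvals(A).real)

print(eigenvalues(0.0))      # double eigenvalue 1: one 2x2 Jordan block
print(eigenvalues(1e-10))    # simple eigenvalues 1 +/- 1e-5: diagonalizable
```

A perturbation of size 1e-10 moves the eigenvalues apart by 2e-5, five orders of magnitude more, which is why deciding "equal or not" is so fragile in floating point.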