Diagonalization

Similar Matrices

We have seen that the commutative property does not hold for matrix multiplication, so if A is an n x n matrix, then P^{-1}AP is not necessarily equal to A. For different nonsingular matrices P, the expression P^{-1}AP represents different matrices. However, all such matrices share some important properties, as we shall soon see.

Definition  Let A and B be n x n matrices. Then A is similar to B if there is a nonsingular matrix P with

    B = P^{-1}AP
Example  Consider a matrix A and a nonsingular matrix P (the specific matrices are omitted in this copy). Then

    B = P^{-1}AP

is similar to A.
Notice the following three facts:

1. A is similar to A (reflexivity).
2. If A is similar to B, then B is similar to A (symmetry).
3. If A is similar to B and B is similar to C, then A is similar to C (transitivity).

We call a relation with these three properties an equivalence relation. We will prove the third property. If A is similar to B and B is similar to C, then there are nonsingular matrices P and Q with

    B = P^{-1}AP    and    C = Q^{-1}BQ

We need to find a matrix R with

    C = R^{-1}AR

We have

    C = Q^{-1}BQ = Q^{-1}(P^{-1}AP)Q = (Q^{-1}P^{-1})A(PQ) = (PQ)^{-1}A(PQ) = R^{-1}AR

where R = PQ. There is a wonderful fact that we state below.
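The transitivity computation can be checked numerically. The sketch below uses NumPy with small hypothetical matrices A, P, and Q (chosen purely for illustration; any nonsingular P and Q would do) and verifies that R = PQ satisfies C = R^{-1}AR.

```python
import numpy as np

# Hypothetical 2 x 2 matrices for illustration; P and Q are nonsingular.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
Q = np.array([[2.0, 0.0],
              [1.0, 1.0]])

B = np.linalg.inv(P) @ A @ P   # B is similar to A
C = np.linalg.inv(Q) @ B @ Q   # C is similar to B

# Transitivity: with R = PQ, we should have C = R^{-1} A R.
R = P @ Q
print(np.allclose(C, np.linalg.inv(R) @ A @ R))  # True
```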
Theorem If A and B are similar matrices, then they have the same eigenvalues.
Proof  It is enough to show that they have the same characteristic polynomial. We have

    det(λI - B) = det(λI - P^{-1}AP) = det(P^{-1}λIP - P^{-1}AP)
                = det(P^{-1}(λI - A)P) = det(P^{-1}) det(λI - A) det(P) = det(λI - A)

Diagonalized Matrices

The easiest kind of matrix to deal with is a diagonal matrix. Determinants are simple, the eigenvalues are just the diagonal entries, and the eigenvectors are just elements of the standard basis. Even the inverse is a piece of cake (if the matrix is nonsingular). Although most matrices are not diagonal, many are diagonalizable, that is, they are similar to a diagonal matrix.

Definition  A matrix A is diagonalizable if A is similar to a diagonal matrix D:

    D = P^{-1}AP

The following theorem tells us when a matrix is diagonalizable and, if it is, how to find its similar diagonal matrix D.

Theorem  Let A be an n x n matrix. Then A is diagonalizable if and only if A has n linearly independent eigenvectors. If so, then

    D = P^{-1}AP

where, if {v_{1}, ... , v_{n}} are the eigenvectors of A and {λ_{1}, ... , λ_{n}} are the corresponding eigenvalues, then v_{j} is the j^th column of P and [D]_{jj} = λ_{j}.
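The theorem translates directly into a short NumPy computation. The matrix below is a hypothetical example (not from the original notes) with distinct eigenvalues, so it is diagonalizable; `np.linalg.eig` returns the eigenvalues and a matrix whose columns are the corresponding eigenvectors, which is exactly the P of the theorem.

```python
import numpy as np

# A hypothetical matrix with distinct eigenvalues (2 and 5), so it is
# diagonalizable.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns eigenvalues and a matrix whose columns are the
# corresponding eigenvectors -- the P of the theorem.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Verify D = P^{-1} A P.
print(np.allclose(D, np.linalg.inv(P) @ A @ P))  # True
```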
Example In the last discussion, we saw that the matrix
has 1 and 4 as eigenvalues with associated eigenvectors
Hence we can form P and D. You can verify that

    D = P^{-1}AP

Proof of the Theorem

If D = P^{-1}AP for some diagonal matrix D and nonsingular matrix P, then

    AP = PD

Let v_{j} be the j^th column of P and [D]_{jj} = λ_{j}. Then the j^th column of AP is Av_{j} and the j^th column of PD is λ_{j}v_{j}. Hence

    Av_{j} = λ_{j}v_{j}

so that v_{j} is an eigenvector of A with corresponding eigenvalue λ_{j}. Since P is nonsingular, rank(P) = n, and the columns of P (the eigenvectors of A) are linearly independent.

Next suppose that A has n linearly independent eigenvectors. Form D and P as above. Then since

    Av_{j} = λ_{j}v_{j}

the j^th column of AP equals the j^th column of PD, hence AP = PD. Since the columns of P are linearly independent, P is nonsingular, so that

    D = P^{-1}AP

Theorem  Let A be an n x n matrix with n real and distinct eigenvalues. Then A is diagonalizable.
Proof  Let {λ_{1}, ... , λ_{n}} be the eigenvalues of A and {v_{1}, ... , v_{n}} be corresponding eigenvectors. We need to show that these eigenvectors are linearly independent, that is, that none of them can be written as a linear combination of the others. We argue by induction on k: suppose {v_{2}, ... , v_{k}} are linearly independent, but that (without loss of generality)

    v_{1} = c_{2}v_{2} + ... + c_{k}v_{k}        (1)

We can multiply both sides of the equation by A to get

    λ_{1}v_{1} = c_{2}Av_{2} + ... + c_{k}Av_{k} = c_{2}λ_{2}v_{2} + ... + c_{k}λ_{k}v_{k}        (2)

Multiply (1) by λ_{1} and subtract it from (2) to get

    c_{2}(λ_{2} - λ_{1})v_{2} + ... + c_{k}(λ_{k} - λ_{1})v_{k} = 0

Since {v_{2}, ... , v_{k}} are linearly independent and the λ's are distinct, the c_{i}'s must all be zero. But then (1) gives v_{1} = 0, which is a contradiction, since eigenvectors are nonzero. Hence {v_{1}, ... , v_{k}} are linearly independent for every k. In particular, let k = n: then A has n linearly independent eigenvectors, and the result follows from the previous theorem.
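The theorem can be checked numerically. The sketch below uses a hypothetical 2 x 2 matrix with the distinct eigenvalues 1 and 4 (matching the eigenvalues of the earlier example, whose matrix is not reproduced here), with eigenvectors found by hand rather than by a library routine.

```python
import numpy as np

# Hypothetical matrix with distinct eigenvalues 1 and 4
# (characteristic polynomial λ^2 - 5λ + 4 = (λ - 1)(λ - 4)).
A = np.array([[2.0, 1.0],
              [2.0, 3.0]])

# Eigenvectors found by hand: (A - I)v = 0 gives v1 = (1, -1);
# (A - 4I)v = 0 gives v2 = (1, 2).
v1 = np.array([1.0, -1.0])
v2 = np.array([1.0, 2.0])

# Distinct eigenvalues, so the eigenvectors are independent and A is
# diagonalizable: P has the eigenvectors as columns, D the eigenvalues.
P = np.column_stack([v1, v2])
D = np.diag([1.0, 4.0])

print(np.allclose(np.linalg.inv(P) @ A @ P, D))  # True
```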
Note that the converse certainly does not hold. For example, the identity matrix I has 1 as its only eigenvalue (repeated n times), but it is diagonalizable (it is already diagonal).
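This counterexample is easy to see numerically: the identity matrix has a repeated eigenvalue, yet it still has n linearly independent eigenvectors.

```python
import numpy as np

# The 3 x 3 identity has the single repeated eigenvalue 1, yet its
# eigenvector matrix has full rank (the standard basis), so it is
# diagonalizable: diagonalizable does not imply distinct eigenvalues.
I = np.eye(3)
eigenvalues, P = np.linalg.eig(I)
print(eigenvalues)               # [1. 1. 1.]
print(np.linalg.matrix_rank(P))  # 3
```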
Steps to Diagonalize a Matrix

1. Find the eigenvalues of A, the roots of the characteristic polynomial det(λI - A).
2. For each eigenvalue λ, find a basis for the corresponding eigenspace, the null space of λI - A.
3. If these bases together contain n eigenvectors, form P with the eigenvectors as columns and D with the corresponding eigenvalues on the diagonal; then D = P^{-1}AP. If there are fewer than n eigenvectors, A is not diagonalizable.
Example Diagonalize the matrix
Solution We find the characteristic polynomial
The roots are 1 (with multiplicity 2) and 2 (with multiplicity 1). Now we find the eigenspaces associated with these eigenvalues. First, for the eigenvalue 1, we have
A basis for this null space is
Next we find a basis for the eigenspace associated with the eigenvalue 2. We have
A basis for this null space is
Now put this all together to get
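The full procedure can be sketched in NumPy. Since the example's matrix survives only as an image, the code below uses a hypothetical stand-in with the same eigenvalue pattern: 1 with multiplicity 2 and 2 with multiplicity 1, and a two-dimensional eigenspace for the repeated eigenvalue, so it is diagonalizable.

```python
import numpy as np

# Hypothetical stand-in for the example's matrix: eigenvalues are
# 1 (multiplicity 2) and 2 (multiplicity 1).
A = np.array([[2.0, 1.0, 1.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# Eigenspace for 1: the null space of A - I is two-dimensional,
# spanned by (1, -1, 0) and (1, 0, -1).
# Eigenspace for 2: the null space of A - 2I is spanned by (1, 0, 0).
P = np.column_stack([[1.0, -1.0, 0.0],
                     [1.0, 0.0, -1.0],
                     [1.0, 0.0, 0.0]])
D = np.diag([1.0, 1.0, 2.0])

# Three independent eigenvectors, so A is diagonalizable: D = P^{-1}AP.
print(np.allclose(np.linalg.inv(P) @ A @ P, D))  # True
```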
