The Exponential of a Matrix
The solution to the exponential growth equation $\dfrac{dx}{dt} = kx$ is $x = c e^{kt}$.
It is natural to ask whether you can solve a constant coefficient linear system
$$\frac{dx}{dt} = A x$$
in a similar way.
If a solution to the system is to have the same form as the growth equation solution, it should look like
$$x = e^{tA} x_0,$$
where $x_0$ is a constant vector.
The first thing I need to do is to make sense of the matrix exponential $e^{tA}$.
The Taylor series for $e^z$ is
$$e^z = \sum_{n=0}^\infty \frac{z^n}{n!} = 1 + z + \frac{z^2}{2!} + \frac{z^3}{3!} + \cdots.$$
It converges absolutely for all $z$.
If A is an $n \times n$ matrix with real entries, define
$$e^{tA} = \sum_{k=0}^\infty \frac{t^k}{k!} A^k = I + tA + \frac{t^2}{2!} A^2 + \frac{t^3}{3!} A^3 + \cdots.$$
The powers $A^k$ make sense, since A is a square matrix. It is possible to show that this series converges for all $t$ and every matrix $A$.
Differentiating the series term-by-term,
$$\frac{d}{dt} e^{tA} = A + t A^2 + \frac{t^2}{2!} A^3 + \cdots = A\left(I + tA + \frac{t^2}{2!} A^2 + \cdots\right) = A\, e^{tA}.$$
This shows that $x = e^{tA} c$ solves the differential equation $x' = Ax$ for any constant vector $c$. The initial condition vector $x(0) = x_0$ yields the particular solution
$$x = e^{tA} x_0.$$
This works, because $e^{0 \cdot A} = I$ (by setting $t = 0$ in the power series).
Another familiar property of ordinary exponentials holds for the matrix exponential: If A and B commute (that is, $AB = BA$), then
$$e^{A + B} = e^A e^B.$$
You can prove this by multiplying the power series for the exponentials on the left. ($e^A$ is just $e^{tA}$ with $t = 1$.)
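Here is a minimal numerical sketch of the series definition (not part of the original notes): it sums the first few terms of the series for $e^{tA}$ and compares the result with SciPy's built-in matrix exponential. The matrix A, the value of t, and the function name are my own choices for illustration.

```python
# Sketch: approximate e^{tA} by a truncated power series and compare with
# scipy.linalg.expm.  The matrix A below is a hypothetical example.
import numpy as np
from scipy.linalg import expm

def expm_series(A, t, terms=30):
    """Approximate e^{tA} by summing the first `terms` terms of the series."""
    n = A.shape[0]
    result = np.zeros((n, n))
    term = np.eye(n)                       # the k = 0 term: (tA)^0 / 0! = I
    for k in range(terms):
        result = result + term
        term = term @ (t * A) / (k + 1)    # (tA)^{k+1}/(k+1)! from (tA)^k/k!
    return result

A = np.array([[1.0, 2.0], [0.0, 3.0]])
t = 0.5
print(np.allclose(expm_series(A, t), expm(t * A)))   # True
```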
Example. Compute $e^{tA}$ if
Compute the successive powers of A:
Therefore,
You can compute the exponential of an arbitrary diagonal matrix in the same way: if
$$D = \begin{bmatrix} d_1 & & & \\ & d_2 & & \\ & & \ddots & \\ & & & d_n \end{bmatrix}, \quad\text{then}\quad e^{tD} = \begin{bmatrix} e^{d_1 t} & & & \\ & e^{d_2 t} & & \\ & & \ddots & \\ & & & e^{d_n t} \end{bmatrix}.$$
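For instance, with a diagonal matrix of my own choosing (not the matrix in the next example): if $D = \begin{bmatrix} 2 & 0 \\ 0 & -1 \end{bmatrix}$, then $D^k = \begin{bmatrix} 2^k & 0 \\ 0 & (-1)^k \end{bmatrix}$ for every $k$, so the series gives $e^{tD} = \begin{bmatrix} e^{2t} & 0 \\ 0 & e^{-t} \end{bmatrix}$.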
Example. Compute $e^{tA}$ if
Compute the successive powers of A:
Hence,
Here's where the last equality came from:
Example. Compute $e^{tA}$, if
If you compute powers of A as in the last two examples, there is no evident pattern. Therefore, it would be difficult to compute the exponential using the power series.
Instead, set up the system whose coefficient matrix is A:
The solution is
Next, note that if B is a $2 \times 2$ matrix, then $B \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ is the first column of B and $B \begin{bmatrix} 0 \\ 1 \end{bmatrix}$ is the second column of B.
In particular, this is true for $B = e^{tA}$. Now
is the solution satisfying , but
Set $x(0) = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ to get the first column of $e^{tA}$:
Hence, , . So
Set $x(0) = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$ to get the second column of $e^{tA}$:
Therefore, , . Hence,
Therefore,
I found $e^{tA}$, but I had to solve a system of differential equations in order to do it.
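The column-by-column idea can be checked numerically. The sketch below (with a coefficient matrix of my own choosing, not the one above) integrates $x' = Ax$ from the initial conditions $(1, 0)$ and $(0, 1)$, assembles the resulting values as columns, and compares them with SciPy's matrix exponential.

```python
# Sketch: the columns of e^{tA} are the solutions of x' = Ax with initial
# conditions (1,0) and (0,1).  The matrix A here is a hypothetical example.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
t_final = 1.0

def rhs(t, x):
    return A @ x

cols = []
for e in (np.array([1.0, 0.0]), np.array([0.0, 1.0])):
    sol = solve_ivp(rhs, (0.0, t_final), e, rtol=1e-10, atol=1e-12)
    cols.append(sol.y[:, -1])              # value of the solution at t = t_final

Phi = np.column_stack(cols)
print(np.allclose(Phi, expm(t_final * A), atol=1e-6))   # True
```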
In some cases, it's possible to use linear algebra to compute the exponential of a matrix. An $n \times n$ matrix A is diagonalizable if it has $n$ independent eigenvectors. (This is true, for example, if A has $n$ distinct eigenvalues.)
Suppose A is diagonalizable with independent eigenvectors $v_1, v_2, \ldots, v_n$ and corresponding eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$. Let S be the matrix whose columns are the eigenvectors:
$$S = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}.$$
Then
$$S^{-1} A S = D = \begin{bmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{bmatrix}.$$
As I observed above,
$$e^{tD} = \begin{bmatrix} e^{\lambda_1 t} & & & \\ & e^{\lambda_2 t} & & \\ & & \ddots & \\ & & & e^{\lambda_n t} \end{bmatrix}.$$
On the other hand, since $A = S D S^{-1}$ and hence $(S D S^{-1})^k = S D^k S^{-1}$,
$$e^{tA} = \sum_{k=0}^\infty \frac{t^k}{k!}\,(S D S^{-1})^k = S \left(\sum_{k=0}^\infty \frac{t^k}{k!} D^k\right) S^{-1} = S\, e^{tD}\, S^{-1}.$$
Hence,
$$e^{tA} = S\, e^{tD}\, S^{-1}.$$
I can use this approach to compute $e^{tA}$ in case A is diagonalizable.
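Here is a short computational sketch of the diagonalization approach (my own illustration, not from the notes); it assumes the matrix passed in really is diagonalizable.

```python
# Sketch of e^{tA} = S e^{tD} S^{-1} for a diagonalizable A.
# The matrix A below is a hypothetical example with distinct real eigenvalues.
import numpy as np
from scipy.linalg import expm

def expm_by_diagonalization(A, t):
    eigvals, S = np.linalg.eig(A)             # columns of S are eigenvectors
    exp_tD = np.diag(np.exp(t * eigvals))     # exponential of the diagonal part
    return S @ exp_tD @ np.linalg.inv(S)

A = np.array([[4.0, 1.0], [2.0, 3.0]])
t = 0.7
print(np.allclose(expm_by_diagonalization(A, t), expm(t * A)))   # True
```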
Example. Compute $e^{tA}$ if
The eigenvalues are , . Since there are two different eigenvalues and A is a $2 \times 2$ matrix, A is diagonalizable. The corresponding eigenvectors are and . Thus,
Hence,
Example. Compute $e^{tA}$ if
The eigenvalues are and (double). The corresponding eigenvectors are for , and and for . Since I have 3 independent eigenvectors, the matrix is diagonalizable.
I have
From this, it follows that
Here's a quick check on the computation: If you set $t = 0$ in the right side, you get
This checks, since $e^{0 \cdot A} = I$.
Note that this check isn't foolproof --- just because you get I by setting $t = 0$ doesn't mean your answer is right. However, if you don't get I, your answer is surely wrong!
How do you compute $e^{tA}$ if A is not diagonalizable?
I'll describe an iterative algorithm for computing $e^{tA}$ that only requires that one know the eigenvalues of A. There are various algorithms for computing the matrix exponential; this one, which is due to Williamson [1], seems to me to be the easiest for hand computation.
(Note that finding the eigenvalues of a matrix is, in general, a difficult problem: Any method for finding $e^{tA}$ will have to deal with it.)
Let A be an $n \times n$ matrix. Let $\lambda_1, \lambda_2, \ldots, \lambda_n$ be a list of the eigenvalues, with multiple eigenvalues repeated according to their multiplicity.
Let
$$a_1(t) = e^{\lambda_1 t}, \qquad a_{k+1}(t) = \int_0^t e^{\lambda_{k+1}(t - u)}\, a_k(u)\, du \quad (k = 1, \ldots, n - 1),$$
$$B_1 = I, \qquad B_{k+1} = (A - \lambda_k I)\, B_k \quad (k = 1, \ldots, n - 1).$$
Then
$$e^{tA} = \sum_{k=1}^n a_k(t)\, B_k.$$
To prove this, I'll show that the expression on the right satisfies the differential equation $y' = Ay$. To do this, I'll need two facts about the characteristic polynomial $p(\lambda)$.
1. $p(\lambda) = (\lambda - \lambda_1)(\lambda - \lambda_2) \cdots (\lambda - \lambda_n)$.
2. (Cayley-Hamilton Theorem) $p(A) = 0$.
Observe that if $p(\lambda)$ is the characteristic polynomial, then using the first fact and the definition of the B's,
$$(A - \lambda_n I)\, B_n = (A - \lambda_n I)(A - \lambda_{n-1} I) \cdots (A - \lambda_1 I) = p(A).$$
By the Cayley-Hamilton Theorem,
$$(A - \lambda_n I)\, B_n = p(A) = 0. \qquad (*)$$
I will use this fact in the proof below.
Example. I'll illustrate the Cayley-Hamilton theorem with the matrix
The characteristic polynomial is . The Cayley-Hamilton theorem asserts that if you plug A into $p(\lambda)$, you'll get the zero matrix.
First,
Therefore,
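Here is a quick numerical check of the Cayley-Hamilton theorem, using a matrix of my own choosing rather than the one in the example above.

```python
# Verify p(A) = 0 numerically for a hypothetical 2x2 matrix.
import numpy as np

A = np.array([[2.0, 1.0], [3.0, 4.0]])
coeffs = np.poly(A)     # characteristic polynomial coefficients, highest power first
p_of_A = sum(c * np.linalg.matrix_power(A, len(coeffs) - 1 - k)
             for k, c in enumerate(coeffs))
print(np.allclose(p_of_A, np.zeros_like(A)))   # True: p(A) = 0
```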
Proof of the algorithm. First, write
$$a_{k+1}(t) = \int_0^t e^{\lambda_{k+1}(t - u)}\, a_k(u)\, du = e^{\lambda_{k+1} t} \int_0^t e^{-\lambda_{k+1} u}\, a_k(u)\, du.$$
Recall that the Fundamental Theorem of Calculus says that
$$\frac{d}{dt} \int_0^t f(u)\, du = f(t).$$
Applying this and the Product Rule, I can differentiate $a_{k+1}$ to obtain
$$a_{k+1}'(t) = \lambda_{k+1} e^{\lambda_{k+1} t} \int_0^t e^{-\lambda_{k+1} u}\, a_k(u)\, du + e^{\lambda_{k+1} t} \cdot e^{-\lambda_{k+1} t}\, a_k(t) = \lambda_{k+1}\, a_{k+1}(t) + a_k(t).$$
Therefore,
$$\frac{d}{dt} \sum_{k=1}^n a_k(t)\, B_k = a_1'(t)\, B_1 + \sum_{k=2}^n a_k'(t)\, B_k = \lambda_1 a_1(t)\, B_1 + \sum_{k=2}^n \bigl(\lambda_k a_k(t) + a_{k-1}(t)\bigr)\, B_k.$$
Expand the terms using
$$\lambda_k B_k = A B_k - (A - \lambda_k I)\, B_k = A B_k - B_{k+1}, \qquad \text{where } B_{n+1} = (A - \lambda_n I)\, B_n.$$
Making this substitution and telescoping the sum, I have
$$\frac{d}{dt} \sum_{k=1}^n a_k(t)\, B_k = \sum_{k=1}^n a_k(t)\,\bigl(A B_k - B_{k+1}\bigr) + \sum_{k=2}^n a_{k-1}(t)\, B_k = A \sum_{k=1}^n a_k(t)\, B_k - a_n(t)\, B_{n+1} = A \sum_{k=1}^n a_k(t)\, B_k - a_n(t) \cdot 0 = A \sum_{k=1}^n a_k(t)\, B_k.$$
(The result (*) proved above was used in the next-to-the-last equality.) Combining the results above, I've shown that
$$\frac{d}{dt} \sum_{k=1}^n a_k(t)\, B_k = A \sum_{k=1}^n a_k(t)\, B_k.$$
This shows that $\displaystyle \sum_{k=1}^n a_k(t)\, B_k$ satisfies the differential equation $y' = Ay$.
Using the power series expansion, I have $\dfrac{d}{dt} e^{-tA} = -A\, e^{-tA}$ and $e^{-tA} A = A\, e^{-tA}$, since $e^{-tA}$ is a sum of powers of $A$. So
$$\frac{d}{dt}\left(e^{-tA} \sum_{k=1}^n a_k(t)\, B_k\right) = -A\, e^{-tA} \sum_{k=1}^n a_k(t)\, B_k + e^{-tA} A \sum_{k=1}^n a_k(t)\, B_k = -e^{-tA} A \sum_{k=1}^n a_k(t)\, B_k + e^{-tA} A \sum_{k=1}^n a_k(t)\, B_k = 0.$$
(Remember that matrix multiplication is not commutative in general!) It follows that $e^{-tA} \displaystyle\sum_{k=1}^n a_k(t)\, B_k$ is a constant matrix.
Set $t = 0$. Since $a_1(0) = 1$ and $a_k(0) = 0$ for $k > 1$, it follows that $\displaystyle\sum_{k=1}^n a_k(0)\, B_k = B_1 = I$. In addition, $e^{-0 \cdot A} = I$. Therefore, $e^{-tA} \displaystyle\sum_{k=1}^n a_k(t)\, B_k = I$ for all $t$, and hence $e^{tA} = \displaystyle\sum_{k=1}^n a_k(t)\, B_k$.
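The recipe above is easy to mechanize. The following sketch (my own, using SymPy for the symbolic integrals, not part of the original notes) builds the $a_k$'s and $B_k$'s exactly as defined and returns $\sum_k a_k(t)\, B_k$; the test matrix is a hypothetical non-diagonalizable example.

```python
# Sketch of the eigenvalue algorithm: e^{tA} = sum_k a_k(t) B_k.
import sympy as sp

def expm_by_eigenvalues(A, t):
    """Return e^{tA} as sum_k a_k(t) B_k, using only the eigenvalues of A."""
    n = A.shape[0]
    # Eigenvalues, repeated according to multiplicity.
    lams = []
    for lam, mult in A.eigenvals().items():
        lams.extend([lam] * mult)

    u = sp.symbols('u')
    a = [sp.exp(lams[0] * t)]              # a_1(t) = e^{lambda_1 t}
    B = [sp.eye(n)]                        # B_1 = I
    for k in range(1, n):
        # a_{k+1}(t) = integral_0^t e^{lambda_{k+1}(t - u)} a_k(u) du
        a.append(sp.integrate(sp.exp(lams[k] * (t - u)) * a[-1].subs(t, u), (u, 0, t)))
        # B_{k+1} = (A - lambda_k I) B_k
        B.append((A - lams[k - 1] * sp.eye(n)) * B[-1])

    total = sp.zeros(n)
    for ak, Bk in zip(a, B):
        total += ak * Bk
    return total.applyfunc(sp.simplify)

t = sp.symbols('t')
A = sp.Matrix([[3, 1], [0, 3]])            # hypothetical non-diagonalizable matrix
print(expm_by_eigenvalues(A, t))           # agrees with (A * t).exp()
```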
Example. Use the matrix exponential to solve
The characteristic polynomial is . You can check that there is only one independent eigenvector, so I can't solve the system by diagonalizing. I could use generalized eigenvectors to solve the system, but I will use the matrix exponential to illustrate the algorithm.
First, list the eigenvalues: . Since is a double root, it is listed twice.
First, I'll compute the 's:
Here are the 's:
Therefore,
(As a check, note that setting $t = 0$ produces the identity.)
The solution to the given initial value problem is
You can get the general solution by replacing the initial condition vector with a vector of arbitrary constants.
Example. Find $e^{tA}$ if
The eigenvalues are obviously (double) and .
First, I'll compute the 's. I have , and
Next, I'll compute the 's. , and
Therefore,
Example. Use the matrix exponential to solve
This example will demonstrate how the algorithm for $e^{tA}$ works when the eigenvalues are complex.
The characteristic polynomial is . The eigenvalues are . I will list them as .
First, I'll compute the 's. , and
Next, I'll compute the 's. , and
Therefore,
I want a real solution, so I'll use DeMoivre's Formula to simplify:
Plugging these into the expression for $e^{tA}$ above, I have
Notice that all the i's have dropped out! This reflects the obvious fact that the exponential of a real matrix must be a real matrix.
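(The identity doing the work here, in general form, is $e^{(a + bi)t} = e^{at}(\cos bt + i \sin bt)$, together with its conjugate $e^{(a - bi)t} = e^{at}(\cos bt - i \sin bt)$; when the corresponding terms are combined, the imaginary parts cancel.)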
Finally, the general solution to the original system is
Example. I'll compare the matrix exponential and the eigenvector solution methods by solving the following system both ways:
The characteristic polynomial is . The eigenvalues are .
Consider :
As this is an eigenvector matrix, it must be singular, and hence the rows must be multiples. So ignore the second row. I want a vector such that . To get such a vector, switch the and -1 and negate one of them: , . Thus, is an eigenvector.
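(In general, if the surviving row is $\begin{bmatrix} a & b \end{bmatrix}$, then $v = \begin{bmatrix} b \\ -a \end{bmatrix}$ works, since $a \cdot b + b \cdot (-a) = 0$.)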
The corresponding solution is
Take the real and imaginary parts:
The solution is
Now I'll solve the equation using the exponential. The eigenvalues are . Compute the 's. , and
(Here and below, I'm cheating a little in the comparison by not showing all the algebra involved in the simplification. You need to use DeMoivre's Formula to eliminate the complex exponentials.)
Next, compute the 's. , and
Therefore,
The solution is
Taking into account some of the algebra I didn't show for the matrix exponential, I think the eigenvector approach is easier.
Example. Solve the system
For comparison, I'll do this first using the generalized eigenvector method, then using the matrix exponential.
The characteristic polynomial is . The eigenvalue is (double).
Ignore the first row, and divide the second row by 2, obtaining the vector . I want such that . Swap 1 and -2 and negate the -2: I get . This is an eigenvector for .
Since I only have one eigenvector, I need a generalized eigenvector. This means I need a vector $w$ such that $(A - \lambda I)\, w = v$, where $v$ is the eigenvector found above:
Row reduce:
This means that . Setting yields . The generalized eigenvector is .
The solution is
Next, I'll solve the system using the matrix exponential. The eigenvalues are . First, I'll compute the 's. , and
Next, compute the 's. , and
Therefore,
The solution is
In this case, finding the solution using the matrix exponential may be a little bit easier.
[1] Richard Williamson, Introduction to differential equations. Englewood Cliffs, NJ: Prentice-Hall, 1986.
Copyright 2012 by Bruce Ikenaga
Source: https://sites.millersville.edu/bikenaga/linear-algebra/matrix-exponential/matrix-exponential.html