We now come to the other big topic in linear algebra after linear systems: eigenvalues and eigenvectors. The setup is that A is an n-by-n matrix, and the defining condition is that A times v equals a scalar lambda times v. If you can find a solution of this equation, then the v here is an eigenvector, provided it is nonzero, because v = 0 is always a solution. At the same time, the lambda that goes with it is an eigenvalue. We won't get to talk about the significance of this condition for a while, so you'll just have to take it on faith that it is important.

Now, it's trivial to rewrite this condition by taking the difference of the two sides and, remembering how the identity matrix works, inserting I and factoring out v on the left. So there is a connection to a homogeneous linear system; the catch is just that v and lambda are unknown simultaneously. That viewpoint turns out to be very important. So again, v is an eigenvector if it solves (A - lambda I) v = 0 while v is also nonzero. In the language we've developed, we would say that v is a solution of the homogeneous equation for the matrix A - lambda I. We know that this homogeneous problem has a general solution, which in this context is called the eigenspace associated with lambda. In order for v to be nonzero and the equation to hold, A - lambda I must be a singular matrix, and therefore its determinant must be zero. That condition no longer has a v in it, so it gives us a way to determine lambda.

Let's do an example, a little two-by-two. There are always two steps: the first step is to find the eigenvalues, and the second step is to find the eigenvectors. To find the eigenvalues, we take the determinant of A - lambda I. Lambda I is just the matrix with lambdas on the diagonal and zeros everywhere else, so all we have to do is subtract lambda from the diagonal entries of A. This is a two-by-two determinant, so it's very easy. Since we're interested in the values of lambda that make it equal to zero, we factor it and see that it has two roots. Those roots make the determinant zero, therefore they make the matrix singular, therefore we will have an eigenvector. So lambda_1 = 3 and lambda_2 = -1 are the eigenvalues of this matrix.

Now that we have those, we can go on to the eigenvectors. For lambda_1, we again just subtract lambda_1 along the diagonal. We want the general solution of the homogeneous problem for this matrix, which means we want to reduce it to RREF. The first thing we do is divide the first row by negative two to get a leading one, which gives us (1, -2). That then cancels out the row below it, so we get a row of zeros; there has to be one, because the matrix is singular. We have a single leading one, so x_2 is free: x_2 = s, and the first row tells us x_1 = 2s. So v_1, an eigenvector to go with lambda_1, is any s times the vector (2, 1); the way we would say it is that (2, 1) forms a basis for the eigenspace. Now we do lambda_2 by the same process. We subtract negative one on the diagonal entries, which is just adding one. We find the RREF: divide the first row by two, and the second row becomes zero. We get x_2 = s and x_1 = -2s, so v_2, an eigenvector to go with lambda_2, is any nonzero multiple of (-2, 1).
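As a quick numerical check of this example, here is a minimal numpy sketch. The matrix entries are never read aloud in the recording, so the A below is an assumed matrix chosen to match the numbers in the walkthrough (eigenvalues 3 and -1, eigenvectors (2, 1) and (-2, 1)); it is an illustration, not necessarily the matrix on the board.

```python
import numpy as np

# Assumed matrix: its entries are consistent with the walkthrough above
# (eigenvalues 3 and -1, eigenvectors (2, 1) and (-2, 1)).
A = np.array([[1.0, 4.0], [1.0, 1.0]])

# numpy computes eigenvalues and (unit-length) eigenvectors directly.
evals, evecs = np.linalg.eig(A)
print(evals)                              # approximately [3., -1.] (order may vary)

# Check the defining condition A v = lambda v for each pair.
for lam, v in zip(evals, evecs.T):        # eigenvectors are the columns of evecs
    print(np.allclose(A @ v, lam * v))    # True, True

# The hand computation: A - 3I is singular, and (2, 1) is in its null space.
v1 = np.array([2.0, 1.0])
print(np.linalg.det(A - 3 * np.eye(2)))   # 0 (up to rounding)
print((A - 3 * np.eye(2)) @ v1)           # [0, 0]
```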
In general, when we take that determinant, A is a fixed matrix, so det(A - lambda I) is a function of lambda. In fact, it turns out to be a polynomial in lambda, a polynomial of degree n in the n-by-n case. We just had an example where n = 2, and it was a quadratic polynomial. It's known as the characteristic polynomial of the matrix, and the roots of the characteristic polynomial are the eigenvalues of the matrix. Since they are polynomial roots, they may be complex numbers, but they will come in conjugate pairs as long as A was real in the first place.

Here's another example. The first step is to find the eigenvalues, so we find the characteristic polynomial. By the quadratic formula, the roots of this polynomial are negative one plus or minus the square root of one squared minus two, that is, negative one plus or minus i. We have a pair of conjugate eigenvalues. Now that we have the eigenvalues, we have to find the eigenvectors, or the eigenspaces, to go with them. We subtract lambda_1 from the diagonal entries and get a matrix with complex numbers in it. Don't freak out; they're just numbers. Divide the first row by negative i to get a leading one, remembering that one over i is negative i, and then that cancels out the row below it; we always have to have a row of zeros. This says that x_2 is free and x_1 = -i times s, which tells us the eigenspace to go with lambda_1; here I'm just taking a representative vector, v_1. Now we catch a break: since lambda_2 is the complex conjugate of lambda_1, it always works out (for a real matrix) that v_2 is the complex conjugate of v_1. So we're actually done.
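Here is a similar sketch for the complex case, again with an assumed matrix: the entries are not stated in the recording, so A = [[-1, 1], [-1, -1]] is used because its characteristic polynomial is lambda^2 + 2 lambda + 2, whose roots are the conjugate pair -1 plus or minus i worked out above, and (-i, 1) is an eigenvector for -1 + i just as in the lecture.

```python
import numpy as np

# Assumed matrix with trace -2 and determinant 2, so its characteristic
# polynomial is lambda^2 + 2*lambda + 2 and its eigenvalues are -1 +/- i.
A = np.array([[-1.0, 1.0], [-1.0, -1.0]])

# For a 2x2 matrix, det(A - lambda I) = lambda^2 - trace(A)*lambda + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
print(np.roots(coeffs))                    # [-1.+1.j, -1.-1.j]

# The eigenvector found by hand for lambda = -1 + i, and its conjugate,
# which is automatically an eigenvector for the conjugate eigenvalue.
lam1 = -1 + 1j
v1 = np.array([-1j, 1.0])
print(np.allclose(A @ v1, lam1 * v1))                          # True
print(np.allclose(A @ v1.conj(), np.conj(lam1) * v1.conj()))   # True
```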
I.13 Eigenvalues
Tobin Driscoll, February 2, 2021