Eigenvector
Latest revision as of 17:21, 10 June 2024
Given a matrix <math>A</math>, its '''eigenvectors''' are special vectors that satisfy the following property:
<math>
A\vec{x}=\lambda\vec{x}
</math>
where <math>\lambda</math> is the eigenvalue associated with the eigenvector <math>\vec{x}</math>.
The definition of eigenvectors is also frequently written in this form:
<math>
(A-\lambda I)\vec{x}=0
</math>
Eigenvectors are the foundation of the [[diagonalization]] technique.
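As a concrete check, here is a plain-Python sketch of the defining property (the matrix and vectors are made-up 2x2 examples, not taken from this article): multiplying an eigenvector by its matrix returns the same vector scaled by the eigenvalue.

```python
def matvec(A, x):
    """Multiply a 2x2 matrix A by a 2-vector x."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

# Made-up example: a diagonal matrix, whose eigenvectors are the axes.
A = [[2, 0], [0, 3]]
v = [1, 0]                 # candidate eigenvector

print(matvec(A, v))        # [2, 0], i.e. 2 * v, so v has eigenvalue 2
print(matvec(A, [0, 1]))   # [0, 3], i.e. 3 * [0, 1], eigenvalue 3
```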
= Intuition =
If we think of a matrix as a linear transformation, eigenvectors do not change direction. Instead, they simply scale by an eigenvalue.
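This intuition can be sketched in plain Python (the shear matrix below is a made-up example): the transformation leaves its eigenvector's direction unchanged, while a non-eigenvector is turned.

```python
def matvec(A, x):
    """Multiply a 2x2 matrix A by a 2-vector x."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

shear = [[1, 1], [0, 1]]        # horizontal shear transformation

print(matvec(shear, [1, 0]))    # [1, 0]: eigenvector, direction unchanged
print(matvec(shear, [0, 1]))    # [1, 1]: not an eigenvector, direction changes
```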
= Eigenspace =
Given <math>A</math> and <math>\lambda</math>, the '''eigenspace''' is the set of all eigenvectors associated with <math>\lambda</math>, together with the zero vector.
Because any linear combination of eigenvectors sharing the same eigenvalue yields another eigenvector of that eigenvalue (or the zero vector), the eigenspace is a vector space.
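A quick plain-Python sketch of this closure property (the matrix is a made-up scalar multiple of the identity, chosen so its eigenspace for <math>\lambda = 2</math> is the whole plane):

```python
def matvec(A, x):
    """Multiply a 2x2 matrix A by a 2-vector x."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

A = [[2, 0], [0, 2]]            # lambda = 2, two-dimensional eigenspace
v, w = [1, 0], [0, 1]           # two eigenvectors with the same eigenvalue

combo = [3 * v[0] + 4 * w[0],   # linear combination 3*v + 4*w = [3, 4]
         3 * v[1] + 4 * w[1]]

print(matvec(A, combo))         # [6, 8] == 2 * combo: still an eigenvector
```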
The following definition of eigenvectors helps explore the eigenspace:
<math>
(A-\lambda I)\vec{x}=0
</math>
We can find the determinant of the preceding matrix; nontrivial solutions <math>\vec{x}</math> exist exactly when it is zero:
<math>
\det(A-\lambda I)=(-1)^n[\lambda^n+c_{n-1}\lambda^{n-1}+\dots+c_0]
</math>
This is the ''characteristic polynomial'' of <math>A</math>; its roots are the eigenvalues, so an <math>n \times n</math> matrix has at most <math>n</math> distinct eigenvalues.
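In the 2x2 case the characteristic polynomial reduces to <math>\lambda^2 - \operatorname{tr}(A)\lambda + \det A</math>, so its roots follow from the quadratic formula. A plain-Python sketch (the helper name <code>eigenvalues_2x2</code> and the sample matrix are made up for illustration, and real eigenvalues are assumed):

```python
import math

def eigenvalues_2x2(A):
    """Roots of the 2x2 characteristic polynomial
    lambda^2 - trace*lambda + det, via the quadratic formula."""
    trace = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    disc = trace * trace - 4 * det
    root = math.sqrt(disc)          # assumes real eigenvalues (disc >= 0)
    return sorted([(trace - root) / 2, (trace + root) / 2])

print(eigenvalues_2x2([[2, 1], [1, 2]]))   # [1.0, 3.0]
```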
= Symmetric matrices =
For any symmetric matrix, one can form an orthonormal basis out of its eigenvectors.
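A plain-Python sketch of this property for a made-up symmetric matrix, <math>\begin{pmatrix}2&1\\1&2\end{pmatrix}</math>: its two eigenvectors, once normalized, are orthogonal unit vectors.

```python
import math

s = 1 / math.sqrt(2)
v1 = [s, s]      # eigenvector of [[2,1],[1,2]] with eigenvalue 3, normalized
v2 = [s, -s]     # eigenvector with eigenvalue 1, normalized

dot = v1[0] * v2[0] + v1[1] * v2[1]             # inner product
print(abs(dot) < 1e-12)                         # True: orthogonal
print(abs(v1[0]**2 + v1[1]**2 - 1) < 1e-12)     # True: unit length
```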