Eigenvector
[[Category:Linear Algebra]]
Given a square [[matrix]] <math>A</math>, its '''eigenvectors''' are the nonzero vectors that satisfy the following property:
<math>
A\vec{x}=\lambda\vec{x}
</math>
where <math>\lambda</math> is the '''eigenvalue''' associated with the eigenvector <math>\vec{x}</math>.
The definition of eigenvectors is also frequently written in the equivalent form:
<math>
(A-\lambda I)\vec{x}=\vec{0}
</math>
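The defining identity can be checked directly. A minimal sketch in plain Python, using a small symmetric matrix and eigenpairs chosen purely for illustration:

```python
def mat_vec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

# Example matrix; (1, 1) and (1, -1) are eigenvectors of it,
# with eigenvalues 3 and 1 respectively.
A = [[2, 1],
     [1, 2]]

x1, lam1 = [1, 1], 3
x2, lam2 = [1, -1], 1

# A x should equal lambda x for each eigenpair.
print(mat_vec(A, x1), [lam1 * v for v in x1])  # [3, 3] [3, 3]
print(mat_vec(A, x2), [lam2 * v for v in x2])  # [1, -1] [1, -1]
```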
Eigenvectors are the foundation of the [[diagonalization]] technique.
= Intuition =
If we think of a matrix as a linear transformation, its eigenvectors do not change direction (a negative eigenvalue reverses it): they are simply scaled by their eigenvalue.
= Eigenspace =
Given <math>A</math> and an eigenvalue <math>\lambda</math>, the '''eigenspace''' is the set of all eigenvectors associated with <math>\lambda</math>, together with the zero vector.
Because any linear combination of eigenvectors with the same eigenvalue yields another eigenvector of that eigenvalue (or the zero vector), the eigenspace is a vector space.
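To illustrate the closure property, the sketch below uses a diagonal matrix (chosen for illustration) with the repeated eigenvalue 2; a linear combination of two of its eigenvectors for that eigenvalue is again an eigenvector for 2:

```python
def mat_vec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

# e1 = (1, 0, 0) and e2 = (0, 1, 0) are both eigenvectors for lambda = 2.
A = [[2, 0, 0],
     [0, 2, 0],
     [0, 0, 5]]

# An arbitrary combination a*e1 + b*e2 stays inside the eigenspace.
a, b = 3.0, -4.0
x = [a * u + b * v for u, v in zip([1, 0, 0], [0, 1, 0])]  # x = (3.0, -4.0, 0.0)

print(mat_vec(A, x))  # [6.0, -8.0, 0.0], i.e. 2 * x
```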
The following characterization of eigenvectors helps describe the eigenspace:
<math>
(A-\lambda I)\vec{x}=\vec{0}
</math>
That is, the eigenspace for <math>\lambda</math> is the null space of <math>A-\lambda I</math>.
Nonzero solutions exist exactly when <math>A-\lambda I</math> is singular, i.e. when its determinant vanishes:
<math>
\det(A-\lambda I)=(-1)^n[\lambda^n+c_{n-1}\lambda^{n-1}+\cdots+c_0]
</math>
This is the ''characteristic polynomial'' of <math>A</math>. Since a degree-<math>n</math> polynomial has at most <math>n</math> roots, an <math>n \times n</math> matrix has at most <math>n</math> distinct eigenvalues.
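In the <math>2 \times 2</math> case the characteristic polynomial reduces to <math>\lambda^2 - \operatorname{tr}(A)\lambda + \det(A)</math>, so the eigenvalues follow from the quadratic formula. A small sketch, with the matrix made up for illustration:

```python
import math

# Example 2x2 matrix.
A = [[2, 1],
     [1, 2]]

# det(A - lambda*I) = lambda^2 - trace*lambda + det for a 2x2 matrix.
trace = A[0][0] + A[1][1]                      # 4
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]    # 3

# Roots of the characteristic polynomial = the eigenvalues.
disc = trace * trace - 4 * det
lam1 = (trace + math.sqrt(disc)) / 2
lam2 = (trace - math.sqrt(disc)) / 2

print(lam1, lam2)  # 3.0 1.0 -- at most n = 2 eigenvalues, as expected
```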
= Symmetric matrices =
For any symmetric matrix, one can form an orthonormal basis out of its eigenvectors; this is the spectral theorem.
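As a quick check, the symmetric matrix with rows (2, 1) and (1, 2) has eigenvectors (1, 1) and (1, &minus;1); normalizing them yields an orthonormal pair:

```python
import math

# Normalized eigenvectors of the symmetric matrix [[2, 1], [1, 2]].
u = [1 / math.sqrt(2), 1 / math.sqrt(2)]    # eigenvalue 3
v = [1 / math.sqrt(2), -1 / math.sqrt(2)]   # eigenvalue 1

def dot(a, b):
    """Standard dot product of two vectors."""
    return sum(x * y for x, y in zip(a, b))

print(dot(u, u))  # ~1.0 (unit length, up to floating-point rounding)
print(dot(v, v))  # ~1.0
print(dot(u, v))  # ~0.0 (orthogonal)
```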
Latest revision as of 17:21, 10 June 2024