Eigenvector

Given a matrix <math>A</math>, its eigenvectors are special vectors that satisfy the following property:
<math>
A\vec{x}=\lambda\vec{x}
</math>
where <math>\lambda</math> is the eigenvalue associated with the eigenvector <math>\vec{x}</math>.

The definition of eigenvectors is also frequently written in this form:
<math>
(A-\lambda I)\vec{x}=0
</math>
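For a concrete example, take <math>A=\begin{pmatrix}2&1\\1&2\end{pmatrix}</math> and <math>\vec{x}=\begin{pmatrix}1\\1\end{pmatrix}</math>. Then
<math>
A\vec{x}=\begin{pmatrix}2&1\\1&2\end{pmatrix}\begin{pmatrix}1\\1\end{pmatrix}=\begin{pmatrix}3\\3\end{pmatrix}=3\vec{x},
</math>
so <math>\vec{x}</math> is an eigenvector of <math>A</math> with eigenvalue <math>\lambda=3</math>.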
= Intuition =
If we think of a matrix as a linear transformation, eigenvectors are the vectors whose direction is not changed by the transformation; they are simply scaled by their eigenvalue.
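Using the example matrix <math>A=\begin{pmatrix}2&1\\1&2\end{pmatrix}</math> from above, the eigenvector <math>\begin{pmatrix}1\\1\end{pmatrix}</math> is mapped to <math>\begin{pmatrix}3\\3\end{pmatrix}</math>, which points in the same direction, while a non-eigenvector such as <math>\begin{pmatrix}1\\0\end{pmatrix}</math> is mapped to <math>\begin{pmatrix}2\\1\end{pmatrix}</math>, which points in a different direction.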
= Eigenspace =
Given A and <math>\lambda</math>, the '''eigenspace''' is the set of all eigenvectors with that particular eigenvalue, together with the zero vector.
Because any linear combination of eigenvectors with the same eigenvalue is either another such eigenvector or the zero vector, the eigenspace is a vector space.
The following form of the definition of eigenvectors helps explore the eigenspace:
<math>
(A-\lambda I)\vec{x}=0
</math>
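For the example matrix above with <math>\lambda=3</math>, this system reads
<math>
(A-3I)\vec{x}=\begin{pmatrix}-1&1\\1&-1\end{pmatrix}\begin{pmatrix}x_1\\x_2\end{pmatrix}=\vec{0},
</math>
whose solutions satisfy <math>x_1=x_2</math>, so the eigenspace for <math>\lambda=3</math> is the line spanned by <math>\begin{pmatrix}1\\1\end{pmatrix}</math>.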
Nontrivial solutions <math>\vec{x}\neq\vec{0}</math> exist exactly when <math>A-\lambda I</math> is singular, so we examine its determinant:
<math>
\det(A-\lambda I)=(-1)^n[\lambda^n+c_{n-1}\lambda^{n-1}+\dots+c_0]
</math>
This is the ''characteristic polynomial'' of <math>A</math>. Its roots are exactly the eigenvalues, and since a polynomial of degree <math>n</math> has at most <math>n</math> distinct roots, any <math>n\times n</math> matrix <math>A</math> has at most <math>n</math> eigenvalues.
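For the <math>2\times 2</math> example matrix above,
<math>
\det(A-\lambda I)=\det\begin{pmatrix}2-\lambda&1\\1&2-\lambda\end{pmatrix}=(2-\lambda)^2-1=(\lambda-1)(\lambda-3),
</math>
so the eigenvalues are <math>\lambda=1</math> and <math>\lambda=3</math>: exactly <math>n=2</math> of them, consistent with the bound above.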
