
© 2026 LIBREUNI PROJECT



Spectral Theory: Eigenvalues, Eigenvectors, and Diagonalization

Spectral theory is the study of linear operators through their invariant subspaces and characteristic values. In the context of finite-dimensional vector spaces, this reduces to the analysis of the structure of square matrices.

1. The Eigenvalue Equation

Let V be a vector space over a field F (typically ℝ or ℂ), and let T : V → V be a linear operator. A scalar λ ∈ F is called an eigenvalue of T if there exists a non-zero vector v ∈ V, called an eigenvector, such that:

T(v) = λv

In matrix form, if A represents T with respect to some basis, the equation becomes:

(A − λI)v = 0

where I is the n × n identity matrix. The eigenvector must be non-zero because v = 0 satisfies (A − λI)v = 0 for every scalar λ and therefore provides no information about A.
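As a quick numerical sanity check, here is a minimal sketch using numpy; the 2 × 2 matrix, eigenvector, and eigenvalue are assumptions chosen for illustration.

```python
import numpy as np

# Assumed example: a symmetric matrix whose eigenvector (1, 1)
# corresponds to the eigenvalue 3
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])
lam = 3.0

# The eigenvalue equation A v = lambda v holds for this pair
assert np.allclose(A @ v, lam * v)

# The zero vector satisfies the equation for ANY scalar, which is
# exactly why eigenvectors are required to be non-zero
zero = np.zeros(2)
assert np.allclose(A @ zero, 5.0 * zero)  # trivially true, uninformative
```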

2. The Characteristic Polynomial and the Spectrum

The condition (A − λI)v = 0 for some v ≠ 0 implies that the operator A − λI is not injective, which for finite-dimensional spaces is equivalent to saying the matrix A − λI is singular. Thus, its determinant must vanish:

det(A − λI) = 0

p(λ) = det(A − λI) is a polynomial in λ of degree n, known as the characteristic polynomial. The set of all eigenvalues of A is called the spectrum of A, denoted by σ(A):

σ(A) = { λ ∈ ℂ : det(A − λI) = 0 }

By the Fundamental Theorem of Algebra, p(λ) has exactly n complex roots (counting multiplicity).
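For a concrete 2 × 2 instance (the matrix is an assumed example), np.poly exposes the characteristic polynomial's coefficients explicitly, and its roots are exactly the eigenvalues:

```python
import numpy as np

# Assumed example matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly returns the monic characteristic polynomial, highest degree
# first; for a 2x2 it is lambda^2 - trace(A)*lambda + det(A)
coeffs = np.poly(A)
assert np.allclose(coeffs, [1.0, -np.trace(A), np.linalg.det(A)])

# The spectrum is the set of roots of the characteristic polynomial
spectrum = np.roots(coeffs)
evals = np.linalg.eig(A)[0]
assert np.allclose(sorted(spectrum), sorted(evals))
```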

3. Algebraic vs. Geometric Multiplicity

When analyzing the roots of p(λ), we distinguish between two types of multiplicity for an eigenvalue λ₀:

  1. Algebraic Multiplicity (μ(λ₀)): The number of times λ₀ appears as a root of the characteristic polynomial. If p(λ) = (λ − λ₀)^k q(λ) where q(λ₀) ≠ 0, then μ(λ₀) = k.
  2. Geometric Multiplicity (γ(λ₀)): The dimension of the eigenspace E(λ₀) = ker(A − λ₀I) = {v : Av = λ₀v}. It represents the number of linearly independent eigenvectors associated with λ₀.

Theorem: For any eigenvalue λ₀, 1 ≤ γ(λ₀) ≤ μ(λ₀). If γ(λ₀) < μ(λ₀), the eigenvalue is said to be defective. A matrix is defective if it has at least one defective eigenvalue.
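The gap between the two multiplicities can be observed numerically. The 2 × 2 Jordan block below is a standard example of a defective matrix (assumed here purely for illustration):

```python
import numpy as np

# A classic defective matrix: a 2x2 Jordan block
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# Algebraic multiplicity of lambda = 2 is 2: it is a double root
# of the characteristic polynomial (lambda - 2)^2
evals = np.linalg.eig(J)[0]
assert np.allclose(evals, [2.0, 2.0])

# Geometric multiplicity = dim ker(J - 2I) = n - rank(J - 2I)
geo_mult = 2 - np.linalg.matrix_rank(J - 2.0 * np.eye(2))
assert geo_mult == 1  # strictly less than the algebraic multiplicity 2
```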

4. Similarity Transformations

Two matrices A and B are similar (A ∼ B) if there exists an invertible matrix P such that:

B = P⁻¹AP

Similarity is an equivalence relation, and similar matrices represent the same linear operator with respect to different bases. Theorem: Similar matrices share the same characteristic polynomial, and thus the same eigenvalues, trace, and determinant. Proof sketch:

det(B − λI) = det(P⁻¹AP − λP⁻¹P) = det(P⁻¹) det(A − λI) det(P) = det(A − λI)
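This invariance is easy to check numerically; both matrices below are assumptions chosen for the sketch (P was picked to be invertible, with det P = 3):

```python
import numpy as np

# Assumed example matrices: A arbitrary, P invertible (det P = 3)
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [2.0, 0.0, 1.0]])
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

B = np.linalg.inv(P) @ A @ P  # B is similar to A by construction

# Similar matrices share the characteristic polynomial, hence the
# same eigenvalues, trace, and determinant
assert np.allclose(np.poly(A), np.poly(B))
assert np.allclose(np.trace(A), np.trace(B))
assert np.allclose(np.linalg.det(A), np.linalg.det(B))
```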

5. Diagonalizability

A matrix A is diagonalizable if it is similar to a diagonal matrix D. That is, A = PDP⁻¹ for some invertible matrix P.

Condition for Diagonalizability: A (of size n × n) is diagonalizable if and only if it possesses n linearly independent eigenvectors. This occurs if and only if for every eigenvalue λ₀, the geometric multiplicity equals the algebraic multiplicity (γ(λ₀) = μ(λ₀)).

If A is diagonalizable, the columns p₁, …, pₙ of P are the eigenvectors of A, and the diagonal entries of D are the corresponding eigenvalues:

A pᵢ = λᵢ pᵢ,   D = diag(λ₁, …, λₙ)
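One immediate payoff of diagonalization is cheap computation of matrix powers, since A^k = P D^k P⁻¹ and D^k only requires raising the diagonal entries to the k-th power. A minimal sketch, using an assumed symmetric example (symmetric matrices are always diagonalizable):

```python
import numpy as np

# Assumed example: symmetric, hence guaranteed diagonalizable
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

evals, P = np.linalg.eig(A)  # columns of P are eigenvectors

# A^k = P D^k P^-1, where D^k is just elementwise powers of evals
k = 5
A_pow = P @ np.diag(evals**k) @ np.linalg.inv(P)
assert np.allclose(A_pow, np.linalg.matrix_power(A, k))
```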

6. The Cayley-Hamilton Theorem

The Cayley-Hamilton Theorem states that every square matrix satisfies its own characteristic equation. If the characteristic polynomial is written in monic form, p(λ) = λ^n + c_(n−1)λ^(n−1) + … + c_1λ + c_0, then substituting A yields the zero matrix:

p(A) = A^n + c_(n−1)A^(n−1) + … + c_1A + c_0 I = 0

This theorem is powerful for computing:

  • Matrix Powers: A^n (and hence any higher power) can be expressed as a linear combination of the lower powers I, A, …, A^(n−1).
  • Inverses: If A is invertible (equivalently, c_0 = (−1)^n det(A) ≠ 0), then A⁻¹ = −(1/c_0)(A^(n−1) + c_(n−1)A^(n−2) + … + c_1 I).
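For a 2 × 2 matrix the inverse formula becomes especially concrete: p(λ) = λ² − tr(A)λ + det(A), so Cayley-Hamilton gives A² − tr(A)A + det(A)I = 0 and hence A⁻¹ = (tr(A)I − A)/det(A). The matrix below is an assumed example:

```python
import numpy as np

# Assumed example matrix: tr(A) = 10, det(A) = 10
A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

# Inverse via Cayley-Hamilton for the 2x2 case
A_inv = (np.trace(A) * np.eye(2) - A) / np.linalg.det(A)

assert np.allclose(A_inv, np.linalg.inv(A))
assert np.allclose(A @ A_inv, np.eye(2))
```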

7. Numerical implementation in Python

We can verify the diagonalization property using numpy.

import numpy as np

# Define a non-defective matrix: its eigenvalues are 2, 2, and 3,
# and the repeated eigenvalue 2 has a full 2-dimensional eigenspace,
# so three linearly independent eigenvectors exist
A = np.array([[4, -2,  0],
              [1,  1,  0],
              [0,  0,  2]])

# Compute eigenvalues and eigenvectors
# evals: eigenvalues, evecs: matrix where columns are eigenvectors
evals, evecs = np.linalg.eig(A)

print("Eigenvalues:", evals)
print("Eigenvectors matrix P:\n", evecs)

# Create diagonal matrix D
D = np.diag(evals)

# Verify A = P D P^-1
# Using np.allclose to handle floating point precision
P = evecs
P_inv = np.linalg.inv(P)
A_reconstructed = P @ D @ P_inv

print("\nReconstructed A:\n", A_reconstructed)
print("\nVerification (A == PDP^-1):", np.allclose(A, A_reconstructed))

# Verify Characteristic Polynomial via Cayley-Hamilton
# p(lambda) = det(A - lambda*I)
# For this 3x3, let's find coefficients using np.poly
coeffs = np.poly(A) 
# coeffs[0]*A^3 + coeffs[1]*A^2 + coeffs[2]*A + coeffs[3]*I
A_sq = A @ A
A_cub = A_sq @ A
CH_check = coeffs[0]*A_cub + coeffs[1]*A_sq + coeffs[2]*A + coeffs[3]*np.eye(3)
print("\nCayley-Hamilton check (should be ~0):\n", np.round(CH_check, 10))

8. Physical and Practical Relevance

Stability Analysis

In differential equations, a linear system ẋ = Ax is asymptotically stable if all eigenvalues of A have negative real parts. The eigenvectors define the "modes" of the system: decoupled directions along which the system evolves independently.
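A minimal stability check along these lines; the system matrix is an assumed toy example (a damped oscillator with eigenvalues −1 and −2):

```python
import numpy as np

# Assumed example: companion matrix of x'' + 3x' + 2x = 0,
# whose characteristic polynomial (lam+1)(lam+2) has roots -1, -2
A = np.array([[ 0.0,  1.0],
              [-2.0, -3.0]])

evals = np.linalg.eig(A)[0]

# Asymptotic stability <=> every eigenvalue has negative real part
is_stable = bool(np.all(evals.real < 0))
assert is_stable
```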

Vibration and Resonance

In mechanical engineering, the eigenvalues of the generalized mass-stiffness problem Kφ = ω²Mφ are the squares of the natural frequencies of a structure. The eigenvectors are the mode shapes, describing how the structure deforms at those frequencies.
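A sketch of this computation for an assumed two-mass spring chain, simplified to unit masses so that M = I and the squared natural frequencies reduce to the eigenvalues of K:

```python
import numpy as np

# Assumed 2-DOF spring-mass chain with unit masses and unit springs:
# M x'' + K x = 0 with M = I and the stiffness matrix
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])

# K is symmetric, so eigh applies; with M = I the eigenvalues of K
# are the squared natural frequencies, the eigenvectors the mode shapes
omega_sq, modes = np.linalg.eigh(K)
omega = np.sqrt(omega_sq)  # natural frequencies (rad/s)

assert np.allclose(omega_sq, [1.0, 3.0])
```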

Information Retrieval: PageRank

The PageRank algorithm treats the internet as a massive graph and finds the principal eigenvector (the one corresponding to the eigenvalue λ = 1) of a modified adjacency matrix. This eigenvector represents the stationary distribution of a random walk, highlighting important nodes.
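A toy version of this idea via power iteration; the 3-page link structure and the damping factor 0.85 are assumptions of this sketch:

```python
import numpy as np

# Assumed toy web of 3 pages: column j holds the out-link
# probabilities of page j, so each column sums to 1
L = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

# Damped "Google-style" matrix: still column-stochastic, so its
# largest eigenvalue is 1
d, n = 0.85, 3
G = d * L + (1 - d) / n * np.ones((n, n))

# Power iteration converges to the eigenvector for lambda = 1
r = np.ones(n) / n
for _ in range(100):
    r = G @ r
    r /= r.sum()

assert np.allclose(G @ r, r)  # r is the stationary distribution
```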


Knowledge Check

Conceptual Check

A 3x3 matrix A has eigenvalues 2, 2, and 5. The eigenspace for lambda=2 is 1-dimensional. Which of the following is true?

Conceptual Check

How do the eigenvalues of matrix A relate to the eigenvalues of A^2?

Conceptual Check

If two matrices A and B are similar, which property are they NOT guaranteed to share?