Eigenvalues and Eigenvectors - Interactive Tutorial

Eigenvalues and eigenvectors are fundamental concepts in linear algebra with far-reaching applications in physics, engineering, machine learning, and data science. This interactive tutorial provides a geometric understanding of these concepts through visualization.

🎯 The Core Insight

Eigenvectors are special vectors that don't get "knocked off" their span during a linear transformation. They only get stretched or compressed (scaled) by a factor called the eigenvalue.
1. Introduction: What Are Eigenvalues and Eigenvectors?

1.1 The Definition

Given a square matrix A, a non-zero vector v is called an eigenvector if:
A · v = λ · v
Where:
- A is an n×n square matrix
- v is the eigenvector (a non-zero vector whose direction is unchanged)
- λ (lambda) is the eigenvalue (the scalar factor by which v is scaled)
1.2 Geometric Interpretation

When you apply a linear transformation to most vectors, they change both direction and magnitude. But eigenvectors are special:
Transformation of a generic vector:
→ v
⟶ A ⟶
↗ Av (different direction)
Transformation of an eigenvector:
→ v
⟶ A ⟶
→→ λv (same direction, scaled)
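The picture above is easy to check numerically. A minimal Python sketch (the matrix A = [[2, 1], [1, 2]] and the test vectors are illustrative choices, not taken from the simulation):

```python
# Apply a 2x2 matrix to a vector: (Av)_i = sum_j A[i][j] * v[j]
def apply(A, v):
    return [A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1]]

A = [[2.0, 1.0],
     [1.0, 2.0]]

# v = (1, 1) is an eigenvector of A with eigenvalue 3: Av = 3v
print(apply(A, [1.0, 1.0]))   # [3.0, 3.0] - same direction, scaled by 3

# A generic vector gets knocked off its span: Av is not a multiple of v
print(apply(A, [1.0, 0.0]))   # [2.0, 1.0] - new direction
```

The first vector stays on its span (only its length changes); the second does not, so it is not an eigenvector.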
2. Finding Eigenvalues and Eigenvectors

2.1 The Characteristic Equation

To find eigenvalues, we solve the characteristic equation:
det(A - λI) = 0
For a 2×2 matrix:
A = | a b |
| c d |
Characteristic equation:
det | a-λ b | = 0
| c d-λ |
(a-λ)(d-λ) - bc = 0
λ² - (a+d)λ + (ad-bc) = 0
λ² - trace(A)·λ + det(A) = 0
2.2 Solving for Eigenvalues

Using the quadratic formula:
λ = (trace ± √(trace² - 4·det)) / 2
Where: trace = a + d and det = ad - bc

2.3 The Discriminant

The discriminant Δ = trace² - 4·det determines the nature of the eigenvalues:
- Δ > 0: two distinct real eigenvalues
- Δ = 0: one repeated real eigenvalue
- Δ < 0: a complex conjugate pair (rotation-like behavior)
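The quadratic formula translates directly into code. A minimal sketch using `cmath.sqrt`, so a negative discriminant automatically yields the complex conjugate pair (the example matrices are illustrative):

```python
import cmath

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the quadratic formula."""
    trace = a + d
    det = a * d - b * c
    disc = trace * trace - 4 * det
    root = cmath.sqrt(disc)          # complex sqrt handles disc < 0
    return (trace + root) / 2, (trace - root) / 2

print(eigenvalues_2x2(2, 1, 1, 2))   # real pair: (3+0j), (1+0j)
print(eigenvalues_2x2(0, -1, 1, 0))  # 90-degree rotation: 1j, -1j
```

For the rotation matrix the discriminant is -4, so the eigenvalues are purely imaginary, exactly the Δ < 0 case above.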
3. Geometric Visualization

3.1 How Linear Transformations Warp Space

A 2×2 matrix transforms the entire 2D plane. In the visualization below:
3.2 The "Aha!" Moment

Try this in the simulation: move your mouse around the canvas. The cyan vector is your input, and the red vector is the output after transformation. When you align the input vector with an eigenvector direction, the output vector points in the exact same direction (or the opposite direction if λ < 0). The vectors turn GOLD to celebrate!

3.3 The Determinant and Area

The determinant of a matrix has a beautiful geometric meaning:
- |det(A)| is the factor by which areas are scaled
- det(A) < 0 means the orientation of the plane is flipped
- det(A) = 0 means the plane collapses onto a line (or a point)
In the simulation, the Input Area (green square) is always 1. The Output Area (red parallelogram) equals |det(A)|. Watch how different matrices stretch, compress, or flip the unit square!

3.4 Complex Eigenvalues: Rotation and Spirals

When eigenvalues are complex, there are no real eigenvector directions - every vector gets rotated! Complex eigenvalues come in conjugate pairs: λ = r·e^(±iθ)
The simulation shows a spiral trajectory when you click "▶ Spiral" - this visualizes repeated application of the matrix: v, Av, A²v, A³v, ...
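That trajectory is nothing more than repeated matrix application. A sketch assuming an illustrative rotation-plus-shrink matrix (20° rotation, 5% shrink per step, so |λ| = 0.95 and the orbit spirals inward):

```python
import math

# Rotation by 20 degrees combined with a 5% shrink: complex eigenvalues
# with |lambda| = 0.95 < 1, so repeated application spirals inward.
theta, r = math.radians(20), 0.95
A = [[r * math.cos(theta), -r * math.sin(theta)],
     [r * math.sin(theta),  r * math.cos(theta)]]

v = [1.0, 0.0]                       # start point
for _ in range(40):                  # v, Av, A^2 v, ..., A^40 v
    v = [A[0][0]*v[0] + A[0][1]*v[1],
         A[1][0]*v[0] + A[1][1]*v[1]]

# Each step shrinks the length by r, so after 40 steps it is r**40
length = math.hypot(v[0], v[1])
print(length)   # about 0.1285, i.e. 0.95 ** 40
```

With r > 1 the same loop spirals outward, matching the "Spiral In/Out" presets.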
4. Common Matrix Types and Their Eigenvectors

The simulation includes 13 preset matrices demonstrating different transformation types.

4.1 Scaling Matrix (Preset: "Scale")
A = | 2 0 | λ₁ = 2, v₁ = (1, 0) ← x-axis
| 0 3 | λ₂ = 3, v₂ = (0, 1) ← y-axis
Every axis-aligned vector is an eigenvector! This matrix scales x by 2 and y by 3. det = 6 (areas are multiplied by 6).

4.2 Shear Matrix (Preset: "Shear")
A = | 1 1 | λ₁ = λ₂ = 1 (repeated)
| 0 1 | Only one eigenvector direction: (1, 0)
Only horizontal vectors stay on their span. det = 1 (area preserved!). This is called an "area-preserving shear".

4.3 Rotation Matrices (Presets: "Rotate 45°", "Rotate 90°")
A = | cos(θ) -sin(θ) | λ = cos(θ) ± i·sin(θ)
| sin(θ) cos(θ) | COMPLEX eigenvalues!
No real eigenvectors! Every vector gets rotated - none stay on their span. det = 1 (rotation preserves area). Try the "▶ Spiral" button to see a circular orbit!

4.4 Reflection Matrices (Presets: "Reflect (x-axis)", "Reflect (y=x)")
Reflect x-axis:        Reflect y=x:
A = | 1  0 |           A = | 0 1 |
    | 0 -1 |               | 1 0 |
λ = +1, -1             λ = +1, -1
Reflections always have det = -1 (area preserved but orientation flipped). Notice the "(flipped)" indicator in the simulation!

4.5 Projection Matrix (Preset: "Project (onto y=x)")
A = | 0.5 0.5 | λ₁ = 1 (onto the line y=x)
| 0.5 0.5 | λ₂ = 0 (perpendicular direction)
Projection onto the line y=x. det = 0 (singular!) - the entire plane collapses to a line. The red parallelogram becomes a line segment with zero area.

4.6 Symmetric Matrix (Preset: "Symmetric")
A = | 2 1 | Symmetric matrices (A = Aᵀ) always have:
| 1 2 | - REAL eigenvalues (λ = 3, 1)
- ORTHOGONAL eigenvectors
Notice that the eigenvector directions are perpendicular! This is a fundamental property of symmetric matrices.

4.7 Spiral Matrices (Presets: "Spiral In/Out Fast/Slow")

These matrices combine rotation + scaling and demonstrate complex eigenvalues.
Try it: Select a spiral preset, drag the purple "START" marker to set your initial point, then click "▶ Spiral" to watch the trajectory!

[Interactive simulation: an editable 2×2 matrix A (entries a, b, c, d); an Eigenvalue Analysis panel with live readouts for trace (a+d), det (ad-bc), the discriminant, λ₁, λ₂, and the input/output areas; and a Linear Transformation Visualizer canvas showing the input vector x, the output vector Ax, the eigenvector directions, and the input unit square with its transformed image. Align the input vector with a dashed line to find an eigenvector!]
5. Applications of Eigenvalues and Eigenvectors

5.1 Principal Component Analysis (PCA)

In machine learning, PCA uses eigenvectors of the covariance matrix to find the directions of maximum variance in data: the eigenvector with the largest eigenvalue points along the direction of greatest variance, and each eigenvalue equals the variance along its eigenvector.
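A self-contained 2-D PCA sketch using only the 2×2 formulas from Section 2 (the data set is a made-up toy example; points are spread roughly along the line y = x, so the first principal component should come out near 45°):

```python
import math

# Toy 2-D data set (illustrative): points roughly along y = x
data = [(1, 1.1), (2, 1.9), (3, 3.2), (4, 3.8), (5, 5.1)]

# Sample covariance matrix of the centered data
n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n
cxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
cyy = sum((y - my) ** 2 for _, y in data) / (n - 1)
cxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)

# Covariance matrices are symmetric, so both eigenvalues are real
trace, det = cxx + cyy, cxx * cyy - cxy * cxy
root = math.sqrt(trace * trace - 4 * det)
l1, l2 = (trace + root) / 2, (trace - root) / 2

# First principal component: eigenvector for the larger eigenvalue l1.
# From (C - l1*I)v = 0, one solution is v = (cxy, l1 - cxx).
angle = math.degrees(math.atan2(l1 - cxx, cxy))
print(l1, l2, angle)   # dominant direction comes out close to 45 degrees
```

l1 captures almost all of the variance here, which is exactly why PCA can discard the l2 direction with little loss.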
5.2 Stability Analysis

In dynamical systems dx/dt = Ax, eigenvalues determine stability:
- all Re(λ) < 0: stable (trajectories decay to the origin)
- any Re(λ) > 0: unstable (trajectories grow without bound)
- purely imaginary eigenvalues: oscillation without growth or decay
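A sketch classifying stability from the signs of the eigenvalues' real parts (all negative: stable; any positive: unstable; otherwise marginal). The three test matrices are illustrative:

```python
import cmath

def eigenvalues_2x2(a, b, c, d):
    trace, det = a + d, a * d - b * c
    root = cmath.sqrt(trace * trace - 4 * det)
    return (trace + root) / 2, (trace - root) / 2

def stability(A):
    """Classify the equilibrium of dx/dt = Ax by eigenvalue real parts."""
    l1, l2 = eigenvalues_2x2(A[0][0], A[0][1], A[1][0], A[1][1])
    if l1.real < 0 and l2.real < 0:
        return "stable"
    if l1.real > 0 or l2.real > 0:
        return "unstable"
    return "marginal"

print(stability([[-1, 0], [0, -2]]))   # both eigenvalues negative -> stable
print(stability([[0, -1], [1, 0]]))    # purely imaginary (a center) -> marginal
print(stability([[1, 1], [0, 2]]))     # positive eigenvalue -> unstable
```

The "marginal" case is the rotation matrix from Section 4.3: trajectories orbit forever, neither decaying nor growing.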
5.3 Google PageRank

The PageRank algorithm finds the dominant eigenvector of the web link matrix: a page's rank is its entry in the eigenvector for eigenvalue 1, computed in practice by power iteration.
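A toy sketch of power iteration on a tiny, made-up 3-page web (not Google's actual algorithm, which also adds a damping factor): repeatedly applying a column-stochastic link matrix converges to its dominant eigenvector.

```python
# Column-stochastic link matrix for a 3-page web (illustrative):
# page 0 links to 1; page 1 links to 0 and 2; page 2 links to 0.
# Column j distributes page j's rank evenly over its outgoing links.
M = [[0.0, 0.5, 1.0],
     [1.0, 0.0, 0.0],
     [0.0, 0.5, 0.0]]

rank = [1/3, 1/3, 1/3]                 # start from a uniform distribution
for _ in range(100):                    # power iteration: rank <- M @ rank
    rank = [sum(M[i][j] * rank[j] for j in range(3)) for i in range(3)]

print(rank)   # converges to the eigenvector for eigenvalue 1: [0.4, 0.4, 0.2]
```

Convergence is geometric at the rate |λ₂|/|λ₁|; here the subdominant eigenvalues have modulus √0.5 ≈ 0.71, so 100 iterations are far more than enough.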
5.4 Quantum Mechanics

In quantum physics, observable quantities are the eigenvalues of Hermitian operators: a measurement always yields one of the operator's eigenvalues, and the state collapses onto the corresponding eigenvector.
5.5 Vibration Analysis

Natural frequencies of mechanical systems are determined by eigenvalues: each eigenvalue of the system's mass-stiffness problem gives a natural frequency, and its eigenvector gives the corresponding mode shape.
6. Important Properties

6.1 Key Theorems
6.2 Diagonalization

If a matrix A has n linearly independent eigenvectors, it can be diagonalized:
A = PDP⁻¹
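A quick numerical check of this decomposition, and of why it makes matrix powers cheap (the matrix A = [[2, 1], [1, 2]], its eigenvectors, and n = 5 are illustrative choices):

```python
# A = [[2, 1], [1, 2]] has eigenvalues 3 and 1 with eigenvectors
# (1, 1) and (1, -1). Then A^n = P D^n P^-1 needs only scalar powers.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[1, 1], [1, -1]]                 # eigenvectors as columns
P_inv = [[0.5, 0.5], [0.5, -0.5]]     # inverse of P
n = 5
Dn = [[3 ** n, 0], [0, 1 ** n]]       # D^n: just raise the eigenvalues

A_to_n = matmul(matmul(P, Dn), P_inv)
print(A_to_n)   # [[122.0, 121.0], [121.0, 122.0]], the same as A*A*A*A*A
```

Multiplying A by itself five times gives the same result, but the diagonalized route replaces matrix products with two scalar exponentiations.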
Where P = the matrix of eigenvectors (as columns) and D = the diagonal matrix of eigenvalues. This makes computing powers of A easy: Aⁿ = PDⁿP⁻¹

6.3 When Are Eigenvectors Orthogonal?

⚠️ Common Misconception

Eigenvectors are NOT always perpendicular! They are only guaranteed to be orthogonal for symmetric matrices.

Symmetric Matrices: The Spectral Theorem

A matrix is symmetric if A = Aᵀ (i.e., a[i][j] = a[j][i]). For symmetric matrices:
- all eigenvalues are real
- eigenvectors belonging to distinct eigenvalues are orthogonal
This is guaranteed by the Spectral Theorem - one of the most important results in linear algebra!

Try it: Select the "Symmetric" preset in the simulation. Notice that the two dashed eigenvector lines are exactly perpendicular!

General Matrices: No Orthogonality Guarantee

For non-symmetric matrices, eigenvectors can be at any angle:
Example: A = | 2 1 | (upper triangular, NOT symmetric)
| 0 3 |
λ₁ = 2 → v₁ = (1, 0) ← horizontal
λ₂ = 3 → v₂ = (1, 1) ← 45° diagonal
Angle between eigenvectors: 45° (NOT orthogonal!)
Summary: Eigenvector Orthogonality
🔬 Experiment: Enter the matrix a=2, b=1, c=0, d=3 in the simulation. You'll see the gold and orange eigenvector lines are at 45° - clearly NOT perpendicular! Then switch to the "Symmetric" preset and see them become exactly 90° apart.
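The same experiment can be run in code. A sketch that computes the angle between the two eigenvector directions (assumes two distinct real eigenvalues and b ≠ 0, which holds for both matrices here):

```python
import math

def eigvec_angle(a, b, c, d):
    """Angle (degrees) between the eigenvector directions of [[a, b], [c, d]].
    Assumes two distinct real eigenvalues and b != 0."""
    trace, det = a + d, a * d - b * c
    root = math.sqrt(trace * trace - 4 * det)
    l1, l2 = (trace + root) / 2, (trace - root) / 2
    # From (A - l*I)v = 0, an eigenvector for eigenvalue l is (b, l - a)
    v1, v2 = (b, l1 - a), (b, l2 - a)
    cos = (v1[0]*v2[0] + v1[1]*v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(cos))

print(eigvec_angle(2, 1, 0, 3))   # non-symmetric example: 45 degrees apart
print(eigvec_angle(2, 1, 1, 2))   # symmetric: exactly 90 degrees apart
```

Swapping in any other symmetric matrix (b = c) with distinct eigenvalues will also return 90°, as the Spectral Theorem promises.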
7. Using the Simulation

7.1 Simulation Controls
7.2 Visual Elements

● Cyan Vector: input vector (mouse position)
● Red Vector: output vector (Ax)
● Gold Vectors: eigenvector found!
● Green Square: input unit square (area = 1)
● Red Parallelogram: output area (|det(A)|)
● Purple Marker: draggable start point for spiral

8. Exercises to Try

8.1 Find the Eigenvectors

Use the simulation to find eigenvector directions for these matrices:
Matrix 1:
| 3 1 |
| 1 3 |
Hint: λ = 4 and 2. Check: det = 8 (area scaled ×8)
Matrix 2:
| 0 1 |
| 1 0 |
Hint: This is a reflection! Check: det = -1 (flipped)
Matrix 3:
| 2 0 |
| 0 2 |
What's special about this one? (Every direction is an eigenvector!)

8.2 Predict Before You Check
8.3 Explore Edge Cases
8.4 Spiral Challenge
9. Mathematical Summary

For a 2×2 Matrix A = [a b; c d]
Eigenvalue Classification
Determinant and Area Transformation
Complex Eigenvalues: λ = r·e^(±iθ)