Poor Man's QR

Written by Ian, last updated 25.06.2024 at 21:40

Introduction

This article is about computing eigenvalues. I assume readers have a rudimentary understanding of matrices and eigenvalues; if you aren’t familiar with the concept of eigenvalues, Wikipedia is a pretty good starting point.

Eigenvalues and Why They Matter

To emphasize the importance of eigenvalues and eigenvectors, let’s take a look at a practical application.

There are three groups of people: users of Rust, users of C++, and non-programmers. Let’s see how these groups change from month to month:

Since Rust is all the rage right now, 20% of non-programmers are learning Rust each month, 1% start learning C++, and the remaining 79% stay non-programmers. Of the C++ users, 50% switch to Rust, 20% stay with C++, and the other 30% quit programming. The Rust users are more loyal: every month, 90% stay Rust users, 5% switch back to C++, and the remaining 5% quit programming.

Let’s put that in a table:

From \ To    C++     Rust    Non-Dev
C++          0.2     0.5     0.3
Rust         0.05    0.9     0.05
Non-Dev      0.01    0.2     0.79

The keen-eyed among you probably see where this is going… we’re building a matrix. The following matrix represents the changes in these groups:

A = \begin{pmatrix} 0.2 & 0.05 & 0.01 \\ 0.5 & 0.9 & 0.2 \\ 0.3 & 0.05 & 0.79 \end{pmatrix}

You might notice that the matrix contains the transposed values of the table. This transposition is needed for matrix-vector multiplication to make sense: entry (i, j) of A holds the fraction of group j that moves into group i, so row i of the product collects all the inflows into group i.

Say we take a vector x whose components represent the numbers of C++ users, Rust users, and non-devs, in that order. Multiplying by our matrix, we get:

\begin{pmatrix} 0.2 & 0.05 & 0.01 \\ 0.5 & 0.9 & 0.2 \\ 0.3 & 0.05 & 0.79 \end{pmatrix} \begin{pmatrix} 100 \\ 100 \\ 100 \end{pmatrix} = \begin{pmatrix} 26 \\ 160 \\ 114 \end{pmatrix}

As you can see, the total number of people hasn’t changed; only the distribution has. And with a simple multiplication, we were able to compute the distribution after a month. Now you might be wondering: What will happen if we let this go on forever? Will it ever stabilize?
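The conservation of the total is no accident: every column of A sums to 1, so people are only moved between groups, never created or destroyed. A quick sanity check of that claim, sketched with plain arrays so it runs without any external crate:

```rust
fn main() {
    // The transition matrix A from above, stored row-major.
    let a = [
        [0.2, 0.05, 0.01],
        [0.5, 0.9, 0.2],
        [0.3, 0.05, 0.79],
    ];
    // Each column lists where one group ends up, so it must sum to 1.
    for col in 0..3 {
        let sum: f64 = (0..3).map(|row| a[row][col]).sum();
        assert!((sum - 1.0).abs() < 1e-12);
    }
    println!("all columns of A sum to 1");
}
```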

We can put this to the test using a simple Rust script:

// The transition matrix from above, and a starting
// distribution of 100 people per group.
let matrix = nalgebra::dmatrix![
    0.2, 0.05, 0.01;
    0.5, 0.9, 0.2;
    0.3, 0.05, 0.79
];
let mut vec = nalgebra::dvector![100.0, 100.0, 100.0];

// Apply the matrix until the distribution stops changing.
loop {
    let prev = vec.clone_owned();
    vec = &matrix * vec;
    // Note: a threshold like 1e-20 is below what f64 can resolve
    // at this magnitude and might never be reached; 1e-10 is safe.
    if (&prev - &vec).norm() < 1e-10 {
        break;
    }
}
println!("{:.04}", vec);

And we get:

  ┌          ┐
  │  14.2857 │
  │ 214.2857 │
  │  71.4286 │
  └          ┘
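Up to display rounding, this is the vector (100/7, 1500/7, 500/7). We can verify with plain arrays (no nalgebra needed) that applying A to it changes nothing at all:

```rust
fn main() {
    // The transition matrix from the article, row-major.
    let a = [
        [0.2, 0.05, 0.01],
        [0.5, 0.9, 0.2],
        [0.3, 0.05, 0.79],
    ];
    // The limit found by the script: (100/7, 1500/7, 500/7),
    // i.e. (14.2857, 214.2857, 71.4286) up to rounding.
    let v = [100.0_f64 / 7.0, 1500.0 / 7.0, 500.0 / 7.0];
    for i in 0..3 {
        let av_i: f64 = (0..3).map(|j| a[i][j] * v[j]).sum();
        // Multiplying by A reproduces every component: Av = v.
        assert!((av_i - v[i]).abs() < 1e-9);
    }
    println!("A * v == v");
}
```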

By definition, an eigenvector x must satisfy the condition:

Ax = \lambda x

This means that our vector is an eigenvector and its corresponding eigenvalue is one. As we have just seen, eigenvalues can be used to analyze graph-based systems like the one in this example (you can think of it as a fully connected graph with the edge weights given by the table). We’ve also seen a rudimentary algorithm, known as power iteration, for computing an eigenvector and its corresponding eigenvalue (the eigenvalue can be obtained by dividing the lengths of consecutive vectors). However, there are better ways of computing eigenvalues, as we will see shortly.
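To make that last remark concrete, here is a minimal sketch of power iteration with plain arrays (no external crate), estimating the eigenvalue as the ratio of vector lengths |Av| / |v|:

```rust
fn main() {
    // The transition matrix from the article, row-major,
    // and the same starting vector as before.
    let a = [
        [0.2, 0.05, 0.01],
        [0.5, 0.9, 0.2],
        [0.3, 0.05, 0.79],
    ];
    let mut v = [100.0_f64, 100.0, 100.0];

    // Helper: multiply A by a vector.
    let mul = |x: &[f64; 3]| {
        let mut y = [0.0; 3];
        for i in 0..3 {
            for j in 0..3 {
                y[i] += a[i][j] * x[j];
            }
        }
        y
    };

    // Power iteration: repeatedly apply A. (Usually each step is
    // normalized to avoid overflow; here the dominant eigenvalue
    // is 1, so the vector stays bounded anyway.)
    for _ in 0..1000 {
        v = mul(&v);
    }

    // Estimate the dominant eigenvalue as |Av| / |v|.
    let norm = |x: &[f64; 3]| x.iter().map(|c| c * c).sum::<f64>().sqrt();
    let lambda = norm(&mul(&v)) / norm(&v);
    println!("{:.4}", lambda); // prints 1.0000
}
```

This only finds the eigenvalue of largest magnitude; the QR-based methods discussed next recover all of them.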