*Update: 2023-03-10*.
SRS should be especially suitable for math. Definitions, examples, and theorems are not only hard to understand, they are hard to remember. Would you remember what a normal subgroup is \(\sim10\) years after taking my algebra class? Probably not, unless you had to use the concept several times in the intervening years.

The common wisdom about spaced repetition systems (SRS) is that they work best for retaining what you already know. Is this true? Probably not. You don’t learn mathematics, you get used to it. You should probably make cards even for theorems you don’t understand – repeated exposure should help you understand them. Of course, you need to actually *do* math to become better at math, but you won’t be able to do math very efficiently if you haven’t internalized the most important results, definitions, (counter)examples, and concepts.

Making an SRS deck involves two difficult problems: (1) choosing what to repeat, and (2) making the cards.

## Choosing what to repeat

For the last couple of days I’ve been restudying linear algebra for the third time. In the process I’ve gained an understanding of fundamental ideas I had either forgotten or never known.

These loosely fall into definitions, examples, concepts, and results.

For many topics in mathematics, you can find very good lists of things to remember on Wikipedia. For instance, the page on normal matrices contains 11 equivalent definitions of normality. Why not try to remember them all?

Here are some examples. The answers to these questions are short and should take only 2 seconds to give if you actually know the result.

### Results

- State the Cayley-Hamilton Theorem.
- State the Spectral Theorem.
- State the Jordan Decomposition Theorem.
- State the Jordan–Chevalley Decomposition Theorem.
- State the Schur Decomposition Theorem.
- State the Gershgorin disc theorem. (This is somewhat more laborious to do than the preceding theorems.)
- What is the QR decomposition?
- How do you usually calculate the QR decomposition?
- What is the algebraic expression for the least squares solution minimizing \((Y-X\beta)^T(Y-X\beta)\)?
- How is the minimizer of \((Y-X\beta)^T(Y-X\beta)\) calculated in `R`?
- Calculate the minimizer of \((Y-X\beta)^T(Y-X\beta)\) using the SVD.
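The least squares questions above are easy to check numerically. Here is a minimal sketch in Python/NumPy (the post asks about `R`, but the same ideas carry over; the random data is just for illustration), computing the minimizer of \((Y-X\beta)^T(Y-X\beta)\) three ways:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))   # design matrix
Y = rng.standard_normal(20)        # response vector

# Normal equations: beta = (X^T X)^{-1} X^T Y
beta_normal = np.linalg.solve(X.T @ X, X.T @ Y)

# QR decomposition: X = QR, so solve the triangular system R beta = Q^T Y
Q, R = np.linalg.qr(X)
beta_qr = np.linalg.solve(R, Q.T @ Y)

# SVD: X = U S V^T, so beta = V S^{-1} U^T Y (the pseudo-inverse applied to Y)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
beta_svd = Vt.T @ ((U.T @ Y) / s)

# All three agree (the QR and SVD routes are the numerically stable ones)
assert np.allclose(beta_normal, beta_qr)
assert np.allclose(beta_normal, beta_svd)
```

In practice one would call a library solver (e.g. `np.linalg.lstsq`, or `lm` in `R`) rather than form the normal equations, which square the condition number of \(X\).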

### Concepts and examples involving diagonalization

- When are two diagonalizable (or triangularizable) matrices simultaneously diagonalizable (or triangularizable)?
- Give an example matrix \(A=P^{-1}DP\) that does not have orthogonal eigenvectors.
- Provide a \(k\times k\) matrix whose eigenspace does not span \(\mathbb{R}^k\).
- What is a generalized eigenspace?
- Suppose \(A\) is diagonalizable. When does \(A\) have a cyclic vector?
- When is a matrix \(A\) unitarily diagonalizable?
- What is a *defective* matrix?
- Let \(A\) be a symmetric matrix and \(\lambda_i\) its eigenvalues. When is \(A\) positive (semi-)definite?
- When does a matrix \(A\) admit a Cholesky decomposition?
- What eigenvalues does an orthogonal matrix \(Q\) have?
- What eigenvalues will a matrix that is both orthogonal and symmetric have?
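Several of these answers are tiny concrete examples that can be verified in a few lines. As a sketch (in Python/NumPy, my choice of tool here), the classic \(2\times 2\) Jordan block is defective: its single eigenvalue has a one-dimensional eigenspace, which does not span \(\mathbb{R}^2\):

```python
import numpy as np

# Classic defective matrix: a 2x2 Jordan block. The eigenvalue 1 has
# algebraic multiplicity 2 but geometric multiplicity 1.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
assert np.allclose(eigvals, 1.0)  # repeated eigenvalue

# The computed eigenvectors are (numerically) parallel, so the matrix of
# eigenvectors has rank 1: the eigenspace does not span R^2, and A is
# not diagonalizable.
assert np.linalg.matrix_rank(eigvecs) < 2
```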

### Remembering different kinds of matrices

- What is a Householder matrix?
- What is a nilpotent matrix?
- What is an orthogonal projection matrix?
- What is a partial isometry?
- What is an orthogonal matrix?
- What is a companion matrix?
- Define the Moore–Penrose inverse in terms of its singular value decomposition.
- Given a matrix \(A\), how can you express the projection onto its range using the Moore–Penrose pseudo-inverse?
- How are the kernel of \(A^T\) and the range of \(A\) related?
- Express \((I+AB)^{-1}\) in terms of \((I+BA)^{-1}\), \(A\), \(B\), and \(I\).
- What is the Neumann series of \(A\)?
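The pseudo-inverse questions above also have a quick numerical sanity check. A hedged sketch in Python/NumPy (random test matrix, names mine): build the Moore–Penrose inverse from the SVD, form the projection \(P = AA^+\) onto the range of \(A\), and confirm that \(\ker(A^T)\) is its orthogonal complement:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))  # tall matrix, full column rank (generically)

# Moore–Penrose inverse via the SVD: A = U S V^T  =>  A^+ = V S^{-1} U^T
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T
assert np.allclose(A_pinv, np.linalg.pinv(A))

# Orthogonal projection onto the range (column space) of A: P = A A^+
P = A @ A_pinv
assert np.allclose(P @ P, P)   # idempotent
assert np.allclose(P, P.T)     # symmetric, hence an orthogonal projection
assert np.allclose(P @ A, A)   # fixes the range of A

# ker(A^T) is the orthogonal complement of range(A): the residual of the
# projection lies in ker(A^T).
x = rng.standard_normal(5)
x_perp = x - P @ x
assert np.allclose(A.T @ x_perp, 0)
```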

## Making the cards

You need to be able to make the cards quickly. I would suggest not bothering to write the back side; provide references or screenshots instead. This works especially well for the basic material that is easily summarized on Wikipedia or in books.

You also need to make sure the questions do not answer themselves. Say I ask “Suppose that \(A\) satisfies \(||A^*x|| = ||Ax||\) for all \(x\). What matrix class does \(A\) belong to?” and *every* question of this type has the answer “\(A\) is normal”. That’s no good, obviously.
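Incidentally, that characterization of normality is easy to sanity-check numerically. A sketch in Python/NumPy (real matrices, so \(A^* = A^T\); the helper `is_normal` and the test matrices are mine, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# A symmetric matrix is normal; a generic random matrix is not.
S = rng.standard_normal((4, 4))
normal = S + S.T
non_normal = rng.standard_normal((4, 4))

def is_normal(A, trials=100):
    """Probe ||A^T x|| == ||A x|| on random vectors
    (equivalent to A^T A == A A^T for real A)."""
    for _ in range(trials):
        x = rng.standard_normal(A.shape[1])
        if not np.isclose(np.linalg.norm(A.T @ x), np.linalg.norm(A @ x)):
            return False
    return True

assert is_normal(normal)
assert not is_normal(non_normal)
```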

## In practice

I’ll probably try to do this in practice, starting now. While restudying linear algebra, I’ll write down all “essential” facts I don’t remember or understand properly. I’ll try to make a new post in a couple of weeks’ time.