Note: This is a follow-up to my previous post “Outer Products: The Dual View of Linear Algebra”. It might be worth checking that post out if you’d like more of a background on what outer products are, and why I think they’re neat.
Let’s say you’re under pressure to do some quick math – maybe about to save the world from some doomsday plot or villain – and in order to defeat this villain you need to decompose some matrix you are handed into a sum of outer products. Luckily for you, you read this blog post just last week, and you know that there are two trivial decompositions of any matrix into a sum of outer products.
You already know that any matrix multiplication between two matrices can be rewritten as a sum over the outer product of the columns of the left matrix (denoted $\bm{a}_i^{\text{col}}$) with the rows of the right matrix (denoted $\bm{b}_i^{\text{row}}$):
\[\mathbf{C} = \mathbf{AB} = \begin{bmatrix} \bm{a}_1^{\text{col}} & \cdots & \bm{a}_p^{\text{col}} \end{bmatrix} \begin{bmatrix} \bm{b}_1^{\text{row}} \\ \vdots \\ \bm{b}_p^{\text{row}} \end{bmatrix} = \sum_{k = 1}^p \bm{a}_k^{\text{col}} \otimes [\bm{b}_k^{\text{row}}]^\top\]While this doesn’t immediately tell you how to decompose a single matrix into outer products, there is one simple trick we can use to apply that result. The trick is multiplying (on either side) by the identity matrix. So, two answers to whatever weird riddle you’ve been given in this scenario are:
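To sanity-check this identity numerically, here is a minimal NumPy sketch (the matrix shapes are arbitrary choices for illustration): the sum of outer products of the columns of $\mathbf{A}$ with the rows of $\mathbf{B}$ reproduces the ordinary matrix product.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # shape (m, p)
B = rng.standard_normal((3, 5))  # shape (p, n)

# Sum over k of (k-th column of A) outer (k-th row of B)
C = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))

# Matches the usual matrix product
assert np.allclose(C, A @ B)
```

Each `np.outer` term is a rank-one matrix, so this expresses $\mathbf{AB}$ as a sum of $p$ rank-one pieces.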
\[\mathbf{W}\mathcal{I} = \begin{bmatrix} \bm{w}_1^{\text{col}} & \cdots & \bm{w}_m^{\text{col}} \end{bmatrix} \begin{bmatrix} \bm{e}_1^{\text{row}} \\ \vdots \\ \bm{e}_m^{\text{row}} \end{bmatrix} = \sum_{k = 1}^m \bm{w}_k^{\text{col}} \otimes [\bm{e}_k^{\text{row}}]^\top\]and
\[\mathcal{I}\mathbf{W} = \begin{bmatrix} \bm{e}_1^{\text{col}} & \cdots & \bm{e}_n^{\text{col}} \end{bmatrix} \begin{bmatrix} \bm{w}_1^{\text{row}} \\ \vdots \\ \bm{w}_n^{\text{row}} \end{bmatrix} = \sum_{k = 1}^n \bm{e}_k^{\text{col}} \otimes [\bm{w}_k^{\text{row}}]^\top\]where $\bm{e}_i$ is the $i$th row or column of the identity matrix $\mathcal{I}$.
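Both trivial decompositions are easy to check in code. A small sketch (again with an arbitrarily chosen shape for $\mathbf{W}$, here $n = 5$ rows and $m = 3$ columns): the rows and columns of `np.eye` play the role of the $\bm{e}_k$ vectors.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((5, 3))  # n = 5 rows, m = 3 columns
n, m = W.shape

# W * I: columns of W outer rows of the (m x m) identity
col_decomp = sum(np.outer(W[:, k], np.eye(m)[k]) for k in range(m))

# I * W: columns of the (n x n) identity outer rows of W
row_decomp = sum(np.outer(np.eye(n)[k], W[k, :]) for k in range(n))

assert np.allclose(col_decomp, W)
assert np.allclose(row_decomp, W)
```

Note the two decompositions use different numbers of terms: $m$ rank-one terms for $\mathbf{W}\mathcal{I}$ and $n$ for $\mathcal{I}\mathbf{W}$.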
One way of framing what this trivial decomposition does is that it “paints in” the matrix $\mathbf{W}$ row-wise or column-wise. Recalling the following visual from my previous post (pasted below), taking the outer product with a row/column of the identity matrix “paints in” whatever the other vector is into the resulting matrix.
If $x$ (above) is replaced with the vector from the identity matrix, then each term of the summation adds one row to our matrix $\mathbf{W}$. If $y$ is replaced with the vector from the identity matrix instead, then each term of the summation adds one column to our matrix.
Neat!