In general, notice that the eigenvectors (from different eigenvalues) that form the $S$ matrix are not orthogonal, but can be chosen to have norm 1. Now, consider this decomposition for a square and symmetric matrix, such as:

$$A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}$$

The eigenvalues are easily seen to be zero (nothing special about it) and 5. The eigenvector matrix $S$ is given by:

$$S = \frac{1}{\sqrt{5}} \begin{bmatrix} 2 & 1 \\ -1 & 2 \end{bmatrix}$$

Notice that the eigenvectors have been normalized to have length one. Notice that $S$ satisfies the property:

$$S^T S = I, \qquad \text{i.e.,} \quad S^T = S^{-1}$$

so that the decomposition $A = S \Lambda S^{-1}$, with $\Lambda = \mathrm{diag}(0, 5)$, can be effectively written as:

$$A = Q \Lambda Q^T$$
where $Q$ effectively denotes what we have been calling $S$ so far. It is not an accident that the transpose of $S$ (now $Q$) also happens to be its inverse. In fact, we can show that all symmetric matrices satisfy this property for their eigenvectors. In other words, the eigenvectors of a symmetric matrix can be chosen orthogonal (and to have a norm of 1); thus they are said to be orthonormal. Hence the new symbol $Q$.
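As a quick sanity check, here is a small MATLAB sketch of this computation, assuming the example matrix above (for a symmetric input, `eig` returns orthonormal eigenvectors):

```matlab
% Eigendecomposition of a symmetric matrix: the eigenvector
% matrix comes out orthonormal, so its transpose is its inverse.
A = [1 2; 2 4];
[Q, L] = eig(A);      % columns of Q are orthonormal eigenvectors
disp(Q' * Q)          % identity matrix: Q' = inv(Q)
disp(Q * L * Q')      % recovers A
```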
To extend the above eigenvalue decomposition to all matrices (not necessarily square and symmetric), we obviously have to make some concessions: the matrices on the two sides can no longer be the same, and the diagonal matrix in the middle may be rectangular:

$$A = U \Sigma V^T$$

Notice that we would still like:

$$U^T U = I, \qquad V^T V = I$$
There is a simple way to determine what these matrices should be. Consider two specific cases of using this decomposition: the products $A^T A$ and $A A^T$. Thus,

$$A^T A = (U \Sigma V^T)^T (U \Sigma V^T) = V \Sigma^T U^T U \Sigma V^T = V (\Sigma^T \Sigma) V^T$$

and

$$A A^T = (U \Sigma V^T)(U \Sigma V^T)^T = U \Sigma V^T V \Sigma^T U^T = U (\Sigma \Sigma^T) U^T$$

so that: $V$ is the orthonormal eigenvector matrix of the symmetric matrix $A^T A$, $U$ is the orthonormal eigenvector matrix of $A A^T$, and the diagonal entries of $\Sigma^T \Sigma$ (equivalently $\Sigma \Sigma^T$) are the common nonzero eigenvalues of both products. The singular value decomposition is thus:

$$A = U \Sigma V^T$$
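A short MATLAB sketch of this connection (the rectangular matrix here is an arbitrary example; `svd` and `eig` may order their outputs differently, so the eigenvalues are sorted before comparing):

```matlab
% The squared singular values of A are the eigenvalues of A'*A.
A = [3 1; 1 3; 0 2];                    % an arbitrary rectangular example
s = svd(A);                             % singular values, largest first
lambda = sort(eig(A' * A), 'descend');  % eigenvalues of A'*A
disp([s.^2, lambda])                    % the two columns agree
```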
The diagonal elements of $\Sigma$ are called the singular values (NOT the eigenvalues) of $A$; they are the square roots of the eigenvalues of $A^T A$.
Normally the singular values are arranged in decreasing order from top-left to bottom-right. We could drop some percentage of the diagonal values of $\Sigma$ (replace them with 0) and reconstruct an approximation to the original matrix from what remains. Keeping only the $k$ largest singular values can be shown mathematically (the Eckart–Young theorem) to give the best rank-$k$ approximation to the original matrix $A$.
Here's a small MATLAB script to tinker with these ideas.
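A minimal sketch along these lines (the test matrix and the truncation rank `k` are arbitrary choices):

```matlab
% Build a matrix, truncate its SVD, and measure the error.
B = magic(6);                   % any square test matrix
[U, S, V] = svd(B);
k = 2;                          % rank of the approximation
Sk = S;
Sk(k+1:end, k+1:end) = 0;       % drop all but the k largest singular values
Bk = U * Sk * V';               % best rank-k approximation of B
disp(svd(B)')                   % singular values, largest first
disp(norm(B - Bk))              % 2-norm error of the approximation
```

The reported error equals the first dropped singular value $\sigma_{k+1}$, which is exactly what the Eckart–Young theorem predicts for the 2-norm.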