You can see the eigendecomposition as a special case of the SVD: for a symmetric positive semidefinite matrix, the two decompositions coincide.

SVD: provides sets of basis vectors--called singular vectors--for the four matrix subspaces (the row space, the null space, the column space, and the left-null space), and scalar singular values that encode the importance of each singular vector.

A = U Sigma V^T

A is the MxN matrix to be decomposed: square or rectangular, any rank.
U is the left singular vectors matrix (MxM), which provides an orthonormal basis for R^M. This includes the column space of A and its complementary left-null space.
Sigma is the singular values matrix (MxN), which is diagonal and contains the singular values. All singular values are non-negative and real.
V is the right singular vectors matrix (NxN), which provides an orthonormal basis for R^N. This includes the row space of A and its complementary null space. Notice that the decomposition contains V^T; hence, although the right singular vectors are the columns of V, it is usually more convenient to speak of them as the rows of V^T.

The sizes of U, Sigma, and V depend on the size of A. The size of U corresponds to the number of rows of A, the size of V corresponds to the number of columns of A, and the size of Sigma is the same as that of A. These sizes allow U to be an orthonormal basis for R^M and V to be an orthonormal basis for R^N.

In short, the SVD represents a matrix as two orthogonal matrices surrounding a diagonal matrix.

Computing the SVD:

Eigendecomposition is not defined for non-square matrices, but it is defined for A^T A (and A A^T), which are square and symmetric.

1. Compute the eigendecomposition of A^T A: its eigenvectors give V, and the square roots of its eigenvalues give the singular values in Sigma.
2. Compute the eigendecomposition of A A^T: its eigenvectors give U.

(Caveat: eigenvectors are only determined up to sign, so the signs of U and V must be matched for A = U Sigma V^T to hold; numerical libraries use more stable algorithms than this two-eigendecomposition recipe.)

SVD of a symmetric matrix: A = U Sigma U^T. The left and right singular vectors coincide up to sign when A is symmetric, and coincide exactly when A is also positive semidefinite.
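A minimal NumPy sketch of the definitions above (the 4x3 matrix is an illustrative example, not from the notes): it checks the shapes of U, Sigma, and V^T, the reconstruction A = U Sigma V^T, the orthogonality of U and V, and that the singular values are non-negative and sorted in descending order.

```python
import numpy as np

# Illustrative matrix: 4x3 (M=4, N=3), rank 2 by construction.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))

# Full SVD: U is MxM, s holds the min(M, N) singular values, Vt is NxN.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the MxN diagonal matrix Sigma from the singular values.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# A = U Sigma V^T, up to floating-point error.
assert np.allclose(A, U @ Sigma @ Vt)

# U and V are orthogonal: their columns are orthonormal bases for R^M and R^N.
assert np.allclose(U.T @ U, np.eye(4))
assert np.allclose(Vt @ Vt.T, np.eye(3))

# Singular values are non-negative and come out sorted in descending order.
assert np.all(s >= 0) and np.all(np.diff(s) <= 0)
```

Note that `np.linalg.svd` returns the singular values as a 1-D array and the right singular vectors already transposed (as V^T), matching the convention in the notes.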
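The two-step recipe can be sketched as follows (illustrative 5x3 matrix, assumed full column rank). One detail beyond the recipe as stated: instead of a second, independent eigendecomposition of A A^T, whose eigenvector signs need not match V, this sketch takes u_i = A v_i / sigma_i, which pins down the signs so that A = U Sigma V^T holds.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))  # M=5, N=3, full column rank (illustrative)

# Step 1: eigendecomposition of A^T A (symmetric, so eigh applies)
# gives V and the squared singular values.
evals, V = np.linalg.eigh(A.T @ A)      # eigh returns ascending order
order = np.argsort(evals)[::-1]         # re-sort descending, SVD convention
evals, V = evals[order], V[:, order]
s = np.sqrt(np.clip(evals, 0, None))    # singular values = sqrt(eigenvalues)

# Step 2: u_i = A v_i / sigma_i gives the left singular vectors with
# signs consistent with V (this is the "economy" MxN version of U).
U = (A @ V) / s

# Sanity checks against NumPy's own SVD.
assert np.allclose(s, np.linalg.svd(A, compute_uv=False))
assert np.allclose(A, U @ np.diag(s) @ V.T)
```

Dividing by sigma_i fails for zero singular values, which is one reason production SVD routines do not use this recipe; it is fine here because the example matrix has full column rank.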
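A quick check of the symmetric case (the matrix B^T B is an illustrative symmetric positive semidefinite example): here the SVD takes the form A = U Sigma U^T, and the singular values equal the eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3))
A = B.T @ B                      # symmetric positive semidefinite by construction

U, s, Vt = np.linalg.svd(A)

# Left and right singular vectors coincide, so A = U Sigma U^T.
assert np.allclose(U, Vt.T)
assert np.allclose(A, U @ np.diag(s) @ U.T)

# For a symmetric PSD matrix, singular values = eigenvalues.
evals = np.sort(np.linalg.eigvalsh(A))[::-1]
assert np.allclose(s, evals)
```

For a symmetric matrix with negative eigenvalues the singular values are the absolute values of the eigenvalues, and the corresponding columns of U and V differ by a sign, so the check `U == Vt.T` would no longer hold exactly.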