Singular Value Decomposition (SVD) is one of the most powerful factorizations in linear algebra. Every real \( m \times n \) matrix \( A \) of rank \( r \) admits a decomposition \( A = U \Sigma V^T \), where \( U \) and \( V \) are orthogonal matrices and \( \Sigma \) is a diagonal matrix carrying the singular values \( \sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > 0 \). Mastering SVD opens the door to topics such as low-rank approximation, the Moore–Penrose pseudoinverse, principal component analysis (PCA), least-squares problems, and matrix norm analysis. The problems below are organized from foundational computation to advanced applications and cover the full range of difficulty levels tested in undergraduate and graduate linear algebra courses.
Computing Singular Values from \( A^T A \) and \( A A^T \)
The singular values of \( A \) are the positive square roots of the eigenvalues of the symmetric positive semidefinite matrix \( A^T A \) (equivalently of \( A A^T \)). Practicing their computation on small matrices builds the intuition needed for every subsequent topic in SVD theory.
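This relationship is easy to check numerically. The sketch below (using NumPy, with an arbitrary illustrative matrix) compares the square roots of the eigenvalues of \( A^T A \) against the singular values computed directly:

```python
import numpy as np

# Arbitrary illustrative matrix; any real matrix works.
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])

# Eigenvalues of the symmetric PSD matrix A^T A, sorted in decreasing order.
eigvals = np.linalg.eigvalsh(A.T @ A)[::-1]
sigma_from_eig = np.sqrt(eigvals)

# Singular values computed directly.
sigma_direct = np.linalg.svd(A, compute_uv=False)
```

Both routes produce the same decreasing sequence of singular values.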
Problem 1: Singular Values of a 2 × 2 Diagonal Matrix
Easy
Consider the matrix \( A = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix} \).
- Compute \( A^T A \) and find its eigenvalues.
- State the singular values \( \sigma_1 \geq \sigma_2 \) of \( A \) and explain why the SVD of \( A \) is trivial in this case.
Solution to question 1:
\[
A^T A = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}^T \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 9 & 0 \\ 0 & 1 \end{pmatrix}.
\]
The eigenvalues of \( A^T A \) are \( \lambda_1 = 9 \) and \( \lambda_2 = 1 \).
Solution to question 2:
The singular values are \( \sigma_1 = \sqrt{9} = 3 \) and \( \sigma_2 = \sqrt{1} = 1 \). Since \( A \) is already diagonal with positive entries, we have \( U = V = I_2 \) and \( \Sigma = A \), giving the trivial SVD \( A = I \cdot A \cdot I \).
Problem 2: Singular Values of a 2 × 3 Rectangular Matrix
Easy
Let \( A = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 2 & 0 \end{pmatrix} \).
- Compute \( A^T A \) (a \( 3 \times 3 \) matrix) and determine its eigenvalues.
- List the singular values of \( A \) in decreasing order and determine the rank of \( A \).
Solution to question 1:
Although the problem asks for \( A^T A \), it is quicker to use the smaller matrix \( AA^T \):
\[
AA^T = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 2 & 0 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 2 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 2 & 0 \\ 0 & 4 \end{pmatrix}.
\]
The nonzero eigenvalues of \( A^T A \) equal those of \( AA^T \): \( \lambda_1 = 4,\; \lambda_2 = 2 \). The third eigenvalue of \( A^T A \) is \( 0 \).
Solution to question 2:
\[
\sigma_1 = \sqrt{4} = 2, \quad \sigma_2 = \sqrt{2}.
\]
(A \( 2 \times 3 \) matrix has \( \min(2,3) = 2 \) singular values; the zero eigenvalue of \( A^T A \) does not contribute one.)
The rank of \( A \) equals the number of nonzero singular values, so \( \text{rank}(A) = 2 \).
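A quick NumPy check of this computation (a sketch, not part of the hand solution):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, 0.0]])

# A 2x3 matrix has min(2, 3) = 2 singular values.
sigma = np.linalg.svd(A, compute_uv=False)

# Rank = number of singular values above a small tolerance.
rank = int(np.sum(sigma > 1e-12))
```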
Problem 3: Singular Values via the Characteristic Polynomial
Medium
Let \( A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \\ 1 & -1 \end{pmatrix} \).
- Compute \( A^T A \) and find its characteristic polynomial.
- Determine the singular values of \( A \).
- Identify the rank of \( A \) and state the dimensions of the four fundamental subspaces.
Solution to question 1:
\[
A^T A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & -1 \end{pmatrix} \begin{pmatrix} 2 & 1 \\ 1 & 2 \\ 1 & -1 \end{pmatrix} = \begin{pmatrix} 6 & 3 \\ 3 & 6 \end{pmatrix}.
\]
Characteristic polynomial:
\[
\det(A^T A - \lambda I) = (6-\lambda)^2 - 9 = \lambda^2 - 12\lambda + 27 = 0.
\]
Solution to question 2:
\[
\lambda = \frac{12 \pm \sqrt{144 - 108}}{2} = \frac{12 \pm 6}{2} \implies \lambda_1 = 9,\; \lambda_2 = 3.
\]
Therefore \( \sigma_1 = 3 \) and \( \sigma_2 = \sqrt{3} \).
Solution to question 3:
Both singular values are nonzero, so \( \text{rank}(A) = 2 \). The four subspaces have dimensions: \( \text{Col}(A) \subseteq \mathbb{R}^3 \) has dimension \( 2 \); \( \text{Null}(A) \subseteq \mathbb{R}^2 \) has dimension \( 0 \); \( \text{Row}(A) \subseteq \mathbb{R}^2 \) has dimension \( 2 \); \( \text{Null}(A^T) \subseteq \mathbb{R}^3 \) has dimension \( 1 \).
Full SVD Computation: Constructing U, Σ, and V
Once the singular values are known, the right singular vectors (columns of \( V \)) are obtained as unit eigenvectors of \( A^T A \), and the left singular vectors (columns of \( U \)) follow from \( u_i = \frac{1}{\sigma_i} A v_i \). Constructing the complete factorization \( A = U \Sigma V^T \) by hand is the core computational skill in SVD.
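The same recipe can be mirrored in NumPy step by step; the sketch below (on an arbitrary full-rank example) builds \( V \) from the eigenvectors of \( A^T A \), then \( U \) from \( u_i = \frac{1}{\sigma_i} A v_i \):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # arbitrary full-rank example

# Step 1: unit eigenvectors of A^T A give the columns of V.
eigvals, V = np.linalg.eigh(A.T @ A)     # returned in ascending order
order = np.argsort(eigvals)[::-1]        # re-sort descending
eigvals, V = eigvals[order], V[:, order]
sigma = np.sqrt(eigvals)

# Step 2: left singular vectors u_i = A v_i / sigma_i (valid since all sigma_i > 0).
U = (A @ V) / sigma

# Step 3: reassemble A = U diag(sigma) V^T.
A_rebuilt = U @ np.diag(sigma) @ V.T
```

This route breaks down when some \( \sigma_i = 0 \); those columns of \( U \) must instead be completed to an orthonormal basis by hand, as in the problems below.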
Problem 4: Full SVD of a 2 × 2 Matrix
Easy
Let \( A = \begin{pmatrix} 0 & 2 \\ 0 & 0 \end{pmatrix} \).
- Compute \( A^T A \) and find its eigenvalues and unit eigenvectors.
- Construct the matrices \( V \), \( \Sigma \), and \( U \), then write the full SVD \( A = U \Sigma V^T \).
- Verify the factorization by multiplying \( U \Sigma V^T \) and checking that the product equals \( A \).
Solution to question 1:
\[
A^T A = \begin{pmatrix} 0 & 0 \\ 2 & 0 \end{pmatrix}\begin{pmatrix} 0 & 2 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 4 \end{pmatrix}.
\]
Eigenvalues: \( \lambda_1 = 4 \), \( \lambda_2 = 0 \). Unit eigenvectors: \( v_1 = \begin{pmatrix}0\\1\end{pmatrix} \), \( v_2 = \begin{pmatrix}1\\0\end{pmatrix} \).
Solution to question 2:
\[
V = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad \Sigma = \begin{pmatrix} 2 & 0 \\ 0 & 0 \end{pmatrix}.
\]
\[
u_1 = \frac{1}{2} A v_1 = \frac{1}{2}\begin{pmatrix}0&2\\0&0\end{pmatrix}\begin{pmatrix}0\\1\end{pmatrix} = \begin{pmatrix}1\\0\end{pmatrix}.
\]
Choose \( u_2 = \begin{pmatrix}0\\1\end{pmatrix} \) to complete an orthonormal basis. Thus:
\[
U = \begin{pmatrix}1&0\\0&1\end{pmatrix} = I_2, \quad A = I_2 \begin{pmatrix}2&0\\0&0\end{pmatrix} \begin{pmatrix}0&1\\1&0\end{pmatrix}.
\]
Solution to question 3:
\[
U\Sigma V^T = \begin{pmatrix}1&0\\0&1\end{pmatrix}\begin{pmatrix}2&0\\0&0\end{pmatrix}\begin{pmatrix}0&1\\1&0\end{pmatrix} = \begin{pmatrix}2&0\\0&0\end{pmatrix}\begin{pmatrix}0&1\\1&0\end{pmatrix} = \begin{pmatrix}0&2\\0&0\end{pmatrix} = A. \checkmark
\]
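The same verification in NumPy (matrices transcribed from the solution above):

```python
import numpy as np

U = np.eye(2)
Sigma = np.array([[2.0, 0.0],
                  [0.0, 0.0]])
V = np.array([[0.0, 1.0],
              [1.0, 0.0]])
A = np.array([[0.0, 2.0],
              [0.0, 0.0]])

# Multiply the three factors and compare with A.
product = U @ Sigma @ V.T
```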
Problem 5: Full SVD of a 2 × 3 Matrix
Medium
Let \( A = \begin{pmatrix} 1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix} \).
- Find all singular values of \( A \).
- Determine the right singular vectors (columns of \( V \)), including those spanning \( \text{Null}(A) \).
- Determine the left singular vectors (columns of \( U \)) and write the full SVD.
Solution to question 1:
\[
AA^T = \begin{pmatrix}1&1&0\\0&0&1\end{pmatrix}\begin{pmatrix}1&0\\1&0\\0&1\end{pmatrix} = \begin{pmatrix}2&0\\0&1\end{pmatrix}.
\]
Eigenvalues of \( AA^T \): \( \lambda_1 = 2 \), \( \lambda_2 = 1 \). Singular values: \( \sigma_1 = \sqrt{2} \), \( \sigma_2 = 1 \).
Solution to question 2:
Unit eigenvectors of \( AA^T \): \( u_1 = \begin{pmatrix}1\\0\end{pmatrix} \), \( u_2 = \begin{pmatrix}0\\1\end{pmatrix} \). Right singular vectors:
\[
v_1 = \frac{1}{\sqrt{2}}A^T u_1 = \frac{1}{\sqrt{2}}\begin{pmatrix}1\\1\\0\end{pmatrix}, \quad v_2 = A^T u_2 = \begin{pmatrix}0\\0\\1\end{pmatrix}.
\]
For \( v_3 \), solve \( Ax = 0 \): the null space is spanned by \( \begin{pmatrix}1\\-1\\0\end{pmatrix} \), so \( v_3 = \frac{1}{\sqrt{2}}\begin{pmatrix}1\\-1\\0\end{pmatrix} \).
Solution to question 3:
\[
U = I_2, \quad \Sigma = \begin{pmatrix}\sqrt{2}&0&0\\0&1&0\end{pmatrix}, \quad V = \frac{1}{\sqrt{2}}\begin{pmatrix}1&0&1\\1&0&-1\\0&\sqrt{2}&0\end{pmatrix}.
\]
The full SVD is \( A = U\Sigma V^T \), which can be verified by direct multiplication.
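That direct multiplication can be carried out numerically (factors transcribed from the solution):

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

U = np.eye(2)
Sigma = np.array([[np.sqrt(2.0), 0.0, 0.0],
                  [0.0,          1.0, 0.0]])
V = (1.0 / np.sqrt(2.0)) * np.array([[1.0, 0.0,           1.0],
                                     [1.0, 0.0,          -1.0],
                                     [0.0, np.sqrt(2.0),  0.0]])

product = U @ Sigma @ V.T
```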
Problem 6: SVD of a Rank-1 Matrix
Medium
Let \( A = \mathbf{a}\mathbf{b}^T \) where \( \mathbf{a} = \begin{pmatrix}2\\1\\2\end{pmatrix} \in \mathbb{R}^3 \) and \( \mathbf{b} = \begin{pmatrix}3\\4\end{pmatrix} \in \mathbb{R}^2 \).
- Without computing \( A^T A \), identify the single nonzero singular value of \( A \) in terms of \( \|\mathbf{a}\| \) and \( \|\mathbf{b}\| \).
- Write the complete SVD \( A = U\Sigma V^T \) explicitly, including the null-space columns of \( U \) and \( V \).
Solution to question 1:
\[
A^T A = (\mathbf{a}\mathbf{b}^T)^T(\mathbf{a}\mathbf{b}^T) = \mathbf{b}\mathbf{a}^T\mathbf{a}\mathbf{b}^T = \|\mathbf{a}\|^2\, \mathbf{b}\mathbf{b}^T.
\]
The only nonzero eigenvalue of \( \mathbf{b}\mathbf{b}^T \) is \( \|\mathbf{b}\|^2 \), so the sole nonzero eigenvalue of \( A^T A \) is \( \|\mathbf{a}\|^2\|\mathbf{b}\|^2 \). Hence:
\[
\sigma_1 = \|\mathbf{a}\|\,\|\mathbf{b}\| = 3 \cdot 5 = 15.
\]
(Here \( \|\mathbf{a}\| = \sqrt{4+1+4}=3 \) and \( \|\mathbf{b}\|=\sqrt{9+16}=5 \).)
Solution to question 2:
\[
v_1 = \frac{\mathbf{b}}{\|\mathbf{b}\|} = \frac{1}{5}\begin{pmatrix}3\\4\end{pmatrix}, \quad u_1 = \frac{1}{\sigma_1}Av_1 = \frac{\mathbf{a}}{\|\mathbf{a}\|} = \frac{1}{3}\begin{pmatrix}2\\1\\2\end{pmatrix}.
\]
Complete \( V \) with any unit vector orthogonal to \( v_1 \), e.g., \( v_2 = \frac{1}{5}\begin{pmatrix}4\\-3\end{pmatrix} \). Complete \( U \) with any orthonormal basis for the orthogonal complement of \( u_1 \) in \( \mathbb{R}^3 \). The SVD is:
\[
A = \frac{1}{3}\begin{pmatrix}2\\1\\2\end{pmatrix} \cdot 15 \cdot \frac{1}{5}\begin{pmatrix}3&4\end{pmatrix} = \begin{pmatrix}6&8\\3&4\\6&8\end{pmatrix}.
\]
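The identity \( \sigma_1 = \|\mathbf{a}\|\,\|\mathbf{b}\| \) for a rank-1 matrix is easy to confirm numerically:

```python
import numpy as np

a = np.array([2.0, 1.0, 2.0])
b = np.array([3.0, 4.0])
A = np.outer(a, b)   # the rank-1 matrix a b^T

sigma = np.linalg.svd(A, compute_uv=False)
```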
Fundamental Subspaces and the Geometry of SVD
The columns of \( U \) and \( V \) provide orthonormal bases for the four fundamental subspaces of \( A \): the column space, row space, null space, and left null space. Understanding this geometric interpretation is essential for applying SVD to projection, least-squares, and dimensionality reduction problems.
Problem 7: Fundamental Subspaces from the SVD
Easy
Suppose the full SVD of a \( 4 \times 3 \) matrix \( A \) is \( A = U\Sigma V^T \), where \( \Sigma \) has diagonal entries \( \sigma_1 = 5,\, \sigma_2 = 2,\, \sigma_3 = 0 \). Let \( u_1, u_2, u_3, u_4 \) be the columns of \( U \) and \( v_1, v_2, v_3 \) be the columns of \( V \).
- Write a basis for each of the four fundamental subspaces of \( A \) in terms of the \( u_i \) and \( v_j \).
- State the rank of \( A \) and confirm the rank–nullity theorem for both \( A \) and \( A^T \).
Solution to question 1:
The rank is \( r = 2 \) (two nonzero singular values).
- Column space \( \text{Col}(A) = \text{span}\{u_1, u_2\} \subseteq \mathbb{R}^4 \).
- Left null space \( \text{Null}(A^T) = \text{span}\{u_3, u_4\} \subseteq \mathbb{R}^4 \).
- Row space \( \text{Row}(A) = \text{span}\{v_1, v_2\} \subseteq \mathbb{R}^3 \).
- Null space \( \text{Null}(A) = \text{span}\{v_3\} \subseteq \mathbb{R}^3 \).
Solution to question 2:
\[
\text{rank}(A) = 2.
\]
Rank–nullity for \( A \colon \mathbb{R}^3 \to \mathbb{R}^4 \): \( \dim\text{Row}(A) + \dim\text{Null}(A) = 2 + 1 = 3 \). \( \checkmark \)
Rank–nullity for \( A^T \colon \mathbb{R}^4 \to \mathbb{R}^3 \): \( \dim\text{Col}(A) + \dim\text{Null}(A^T) = 2 + 2 = 4 \). \( \checkmark \)
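These counts can be reproduced numerically. A minimal instance of such a matrix takes \( U = I_4 \) and \( V = I_3 \) (any orthogonal choices would give the same dimensions):

```python
import numpy as np

# A 4x3 matrix with singular values 5, 2, 0 (U = I_4, V = I_3 for simplicity).
A = np.zeros((4, 3))
A[0, 0], A[1, 1] = 5.0, 2.0

rank = np.linalg.matrix_rank(A)
dim_null = A.shape[1] - rank        # nullity of A   (rank-nullity in R^3)
dim_left_null = A.shape[0] - rank   # nullity of A^T (rank-nullity in R^4)
```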
Problem 8: Orthogonal Projection via SVD
Medium
Let \( A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{pmatrix} \) with SVD \( A = U\Sigma V^T \).
- Compute the SVD of \( A \) (find \( U \), \( \Sigma \), \( V \)).
- Use the SVD to write the orthogonal projection matrix \( P \) onto \( \text{Col}(A) \) as \( P = U_r U_r^T \), where \( U_r \) contains only the columns of \( U \) corresponding to nonzero singular values.
- Verify that \( P^2 = P \) and \( P^T = P \).
Solution to question 1:
\[
A^T A = \begin{pmatrix}2&1\\1&2\end{pmatrix}, \quad \lambda_1 = 3,\; \lambda_2 = 1, \quad \sigma_1=\sqrt{3},\; \sigma_2=1.
\]
Unit eigenvectors of \( A^T A \): \( v_1 = \tfrac{1}{\sqrt{2}}\begin{pmatrix}1\\1\end{pmatrix} \), \( v_2 = \tfrac{1}{\sqrt{2}}\begin{pmatrix}1\\-1\end{pmatrix} \). Then:
\[
u_1 = \frac{1}{\sqrt{3}}Av_1 = \frac{1}{\sqrt{6}}\begin{pmatrix}1\\1\\2\end{pmatrix}, \quad u_2 = Av_2 = \frac{1}{\sqrt{2}}\begin{pmatrix}1\\-1\\0\end{pmatrix}.
\]
Choose \( u_3 \perp u_1, u_2 \) in \( \mathbb{R}^3 \), e.g., \( u_3 = \frac{1}{\sqrt{3}}\begin{pmatrix}1\\1\\-1\end{pmatrix} \).
Solution to question 2:
Since \( \text{rank}(A) = 2 \), take \( U_r = [u_1 \mid u_2] \). Then:
\[
P = U_r U_r^T = u_1 u_1^T + u_2 u_2^T = \frac{1}{6}\begin{pmatrix}1\\1\\2\end{pmatrix}\begin{pmatrix}1&1&2\end{pmatrix} + \frac{1}{2}\begin{pmatrix}1\\-1\\0\end{pmatrix}\begin{pmatrix}1&-1&0\end{pmatrix}.
\]
\[
P = \frac{1}{3}\begin{pmatrix}2&-1&1\\-1&2&1\\1&1&2\end{pmatrix}.
\]
Solution to question 3:
\( P^2 = (U_r U_r^T)(U_r U_r^T) = U_r (U_r^T U_r) U_r^T = U_r I U_r^T = P \). \( \checkmark \)
\( P^T = (U_r U_r^T)^T = U_r U_r^T = P \). \( \checkmark \)
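A NumPy sketch of the same construction, also comparing \( P \) against the classical formula \( A(A^TA)^{-1}A^T \) (valid here because \( A \) has full column rank):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Thin SVD: both singular values are nonzero, so the columns of U_r span Col(A).
U_r, sigma, Vt = np.linalg.svd(A, full_matrices=False)
P = U_r @ U_r.T

# Classical projection formula for comparison.
P_classic = A @ np.linalg.inv(A.T @ A) @ A.T
```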
Low-Rank Approximation and the Eckart–Young Theorem
The Eckart–Young–Mirsky theorem states that the best rank-\(k\) approximation of \( A \) in both the 2-norm and the Frobenius norm is the truncated SVD \( A_k = \sum_{i=1}^k \sigma_i u_i v_i^T \). This result underpins image compression, noise reduction, and principal component analysis.
Problem 9: Best Rank-1 Approximation
Medium
Let \( A = \begin{pmatrix} 3 & 0 \\ 0 & 2 \end{pmatrix} \).
- Write the SVD of \( A \) and identify its outer-product expansion \( A = \sigma_1 u_1 v_1^T + \sigma_2 u_2 v_2^T \).
- Compute the best rank-1 approximation \( A_1 = \sigma_1 u_1 v_1^T \) and find the approximation error \( \|A - A_1\|_F \).
- Verify that no rank-1 matrix \( B \) satisfies \( \|A - B\|_F < \|A - A_1\|_F \).
Solution to question 1:
\( A \) is already diagonal with positive entries, so \( U = V = I_2 \) and \( \Sigma = A \). The outer-product expansion is:
\[
A = 3 \begin{pmatrix}1\\0\end{pmatrix}\begin{pmatrix}1&0\end{pmatrix} + 2\begin{pmatrix}0\\1\end{pmatrix}\begin{pmatrix}0&1\end{pmatrix}.
\]
Solution to question 2:
\[
A_1 = 3\begin{pmatrix}1\\0\end{pmatrix}\begin{pmatrix}1&0\end{pmatrix} = \begin{pmatrix}3&0\\0&0\end{pmatrix}.
\]
\[
\|A - A_1\|_F = \left\|\begin{pmatrix}0&0\\0&2\end{pmatrix}\right\|_F = 2 = \sigma_2.
\]
Solution to question 3:
By the Eckart–Young theorem, \( A_1 \) is the best rank-1 approximation and \( \|A - A_1\|_F = \sigma_2 = 2 \) is the minimum possible error over all rank-1 matrices. Any other rank-1 matrix \( B \) satisfies \( \|A - B\|_F \geq \sigma_2 = 2 \).
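A numerical confirmation of the rank-1 truncation and its error:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 2.0]])

U, sigma, Vt = np.linalg.svd(A)

# Best rank-1 approximation: keep only the leading outer product.
A1 = sigma[0] * np.outer(U[:, 0], Vt[0, :])

err = np.linalg.norm(A - A1, 'fro')   # should equal sigma_2
```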
Problem 10: Low-Rank Approximation and Frobenius Error
Hard
A data matrix \( A \in \mathbb{R}^{4 \times 4} \) has singular values \( \sigma_1 = 10,\, \sigma_2 = 6,\, \sigma_3 = 2,\, \sigma_4 = 1 \).
- Compute \( \|A\|_F^2 \), \( \|A\|_2 \), and the condition number \( \kappa(A) = \sigma_1/\sigma_4 \).
- Determine the rank-2 approximation error in both the Frobenius norm and the spectral (2-norm).
- What percentage of the total “energy” (Frobenius norm squared) is captured by the rank-2 approximation \( A_2 \)?
- If a rank-1 approximation must capture at least 90% of the energy, is \( A_1 \) sufficient? Justify your answer.
Solution to question 1:
\[
\|A\|_F^2 = 10^2 + 6^2 + 2^2 + 1^2 = 100 + 36 + 4 + 1 = 141.
\]
\[
\|A\|_2 = \sigma_1 = 10, \quad \kappa(A) = \frac{\sigma_1}{\sigma_4} = \frac{10}{1} = 10.
\]
Solution to question 2:
\[
\|A - A_2\|_F = \sqrt{\sigma_3^2 + \sigma_4^2} = \sqrt{4+1} = \sqrt{5} \approx 2.24.
\]
\[
\|A - A_2\|_2 = \sigma_3 = 2.
\]
Solution to question 3:
\[
\frac{\|A_2\|_F^2}{\|A\|_F^2} = \frac{100 + 36}{141} = \frac{136}{141} \approx 96.5\%.
\]
The rank-2 approximation captures approximately \( 96.5\% \) of the total energy.
Solution to question 4:
\[
\frac{\|A_1\|_F^2}{\|A\|_F^2} = \frac{100}{141} \approx 70.9\% < 90\%.
\]
A rank-1 approximation captures only about \( 70.9\% \) of the energy, which is below the 90% threshold. Therefore \( A_1 \) is not sufficient; at least a rank-2 approximation is required.
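All four parts reduce to arithmetic on the singular values, which a few lines of NumPy reproduce:

```python
import numpy as np

sigma = np.array([10.0, 6.0, 2.0, 1.0])

total_energy = np.sum(sigma**2)                        # ||A||_F^2 = 141
frob_err_rank2 = np.sqrt(np.sum(sigma[2:]**2))         # ||A - A_2||_F = sqrt(5)
spectral_err_rank2 = sigma[2]                          # ||A - A_2||_2 = 2
energy_rank1 = sigma[0]**2 / total_energy              # ~70.9%
energy_rank2 = np.sum(sigma[:2]**2) / total_energy     # ~96.5%
```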
Moore–Penrose Pseudoinverse and Least-Squares Problems
The Moore–Penrose pseudoinverse \( A^+ = V \Sigma^+ U^T \) generalizes matrix inversion to non-square and rank-deficient matrices. It yields the minimum-norm least-squares solution \( \hat{x} = A^+ b \) to inconsistent or underdetermined linear systems, making it indispensable in data fitting, signal processing, and control theory.
Problem 11: Pseudoinverse of a Rank-Deficient Matrix
Medium
Let \( A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} \).
- Verify that \( A \) is rank-deficient and compute its SVD.
- Construct the pseudoinverse \( A^+ = V\Sigma^+ U^T \), where \( \Sigma^+ \) replaces each nonzero \( \sigma_i \) by \( 1/\sigma_i \).
- Find the minimum-norm least-squares solution to \( Ax = b \) for \( b = \begin{pmatrix}3\\6\end{pmatrix} \).
Solution to question 1:
The rows of \( A \) are proportional, so \( \text{rank}(A) = 1 \). We have:
\[
A^T A = \begin{pmatrix}5&10\\10&20\end{pmatrix}, \quad \text{eigenvalues: } \lambda_1 = 25,\; \lambda_2 = 0.
\]
Singular value: \( \sigma_1 = 5 \). Unit eigenvector: \( v_1 = \tfrac{1}{\sqrt{5}}\begin{pmatrix}1\\2\end{pmatrix} \). Left singular vector: \( u_1 = \tfrac{1}{5}Av_1 = \tfrac{1}{\sqrt{5}}\begin{pmatrix}1\\2\end{pmatrix} \).
Solution to question 2:
\[
\Sigma^+ = \begin{pmatrix}1/5 & 0 \\ 0 & 0\end{pmatrix}, \quad A^+ = v_1 \cdot \frac{1}{\sigma_1} \cdot u_1^T = \frac{1}{25}\begin{pmatrix}1&2\\2&4\end{pmatrix}.
\]
Solution to question 3:
\[
\hat{x} = A^+ b = \frac{1}{25}\begin{pmatrix}1&2\\2&4\end{pmatrix}\begin{pmatrix}3\\6\end{pmatrix} = \frac{1}{25}\begin{pmatrix}15\\30\end{pmatrix} = \begin{pmatrix}0.6\\1.2\end{pmatrix}.
\]
Since \( b = \begin{pmatrix}3\\6\end{pmatrix} \) lies in \( \text{Col}(A) \), the system is consistent and \( \hat{x} \) is the minimum-norm solution.
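NumPy's `np.linalg.pinv` computes \( A^+ \) via exactly this SVD construction, so it can serve as a check on the hand computation:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([3.0, 6.0])

A_pinv = np.linalg.pinv(A)   # Moore-Penrose pseudoinverse via SVD
x_hat = A_pinv @ b           # minimum-norm least-squares solution
```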
Problem 12: SVD, Pseudoinverse, and Theoretical Properties
Hard
Let \( A \) be a real \( m \times n \) matrix with full SVD \( A = U\Sigma V^T \).
- Prove that \( |\det(A)| = \sigma_1 \sigma_2 \cdots \sigma_n \) when \( A \) is square (\( m = n \)).
- Show that \( \|A\|_F^2 = \text{tr}(A^T A) = \sigma_1^2 + \sigma_2^2 + \cdots + \sigma_r^2 \).
- Let \( A = U\Sigma V^T \) be invertible. Derive an explicit SVD for \( A^{-1} \) and identify the singular values of \( A^{-1} \) in terms of those of \( A \).
- Using the pseudoinverse, show that \( A A^+ A = A \) and \( A^+ A A^+ = A^+ \) (two of the four Moore–Penrose conditions).
Solution to question 1:
\[
\det(A) = \det(U\Sigma V^T) = \det(U)\det(\Sigma)\det(V^T).
\]
Since \( U \) and \( V \) are orthogonal, \( |\det(U)| = |\det(V)| = 1 \). The matrix \( \Sigma \) is diagonal with entries \( \sigma_1, \ldots, \sigma_n \), so \( \det(\Sigma) = \sigma_1 \cdots \sigma_n \). Therefore:
\[
|\det(A)| = \sigma_1 \sigma_2 \cdots \sigma_n. \qquad \square
\]
Solution to question 2:
\[
\|A\|_F^2 = \text{tr}(A^T A) = \text{tr}(V\Sigma^T U^T U \Sigma V^T) = \text{tr}(V\Sigma^T\Sigma V^T) = \text{tr}(\Sigma^T\Sigma V^T V) = \text{tr}(\Sigma^T\Sigma).
\]
The matrix \( \Sigma^T \Sigma \) is diagonal with entries \( \sigma_1^2, \ldots, \sigma_r^2, 0, \ldots, 0 \), so:
\[
\|A\|_F^2 = \sigma_1^2 + \sigma_2^2 + \cdots + \sigma_r^2. \qquad \square
\]
Solution to question 3:
If \( A \) is invertible, then \( m = n \) and all \( \sigma_i > 0 \). From \( A = U\Sigma V^T \) we get:
\[
A^{-1} = (V^T)^{-1}\Sigma^{-1}U^{-1} = V\Sigma^{-1}U^T.
\]
This is itself an SVD (with the roles of \( U \) and \( V \) swapped), since \( V \) and \( U \) are orthogonal and \( \Sigma^{-1} \) is diagonal with positive entries; to respect the convention of decreasing diagonal entries, reverse the order of the entries of \( \Sigma^{-1} \) together with the corresponding columns of \( V \) and \( U \). The singular values of \( A^{-1} \) are \( 1/\sigma_n \geq \cdots \geq 1/\sigma_1 \).
Solution to question 4:
Substitute \( A = U\Sigma V^T \) and \( A^+ = V\Sigma^+ U^T \):
\[
AA^+A = (U\Sigma V^T)(V\Sigma^+ U^T)(U\Sigma V^T) = U\Sigma(\Sigma^+\Sigma) V^T.
\]
The product \( \Sigma^+\Sigma \) is diagonal with \( (\Sigma^+\Sigma)_{ii} = (1/\sigma_i)\,\sigma_i = 1 \) for \( i \leq r \) and \( 0 \) for \( i > r \). Hence \( \Sigma\,\Sigma^+\Sigma = \Sigma \), giving \( AA^+A = U\Sigma V^T = A \). \( \checkmark \)
Similarly:
\[
A^+AA^+ = (V\Sigma^+ U^T)(U\Sigma V^T)(V\Sigma^+ U^T) = V\Sigma^+(\Sigma\Sigma^+)U^T = V\Sigma^+ U^T = A^+. \checkmark \qquad \square
\]
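The two Moore–Penrose conditions just proved hold for every matrix; a quick numerical spot check on an arbitrary rank-deficient example:

```python
import numpy as np

# Arbitrary rank-deficient example (second column = 2 x first column).
A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 0.0],
              [0.0, 0.0, 3.0],
              [1.0, 2.0, 1.0]])

A_pinv = np.linalg.pinv(A)

cond1 = np.allclose(A @ A_pinv @ A, A)             # A A+ A = A
cond2 = np.allclose(A_pinv @ A @ A_pinv, A_pinv)   # A+ A A+ = A+
```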