Introduction to the Singular Value Decomposition
$$
A=U \Sigma V^{T}
$$
Here $U=\left(u_{i j}\right)_{m \times m}$, $\Sigma=\left(\sigma_{i j}\right)_{m \times n}$, $V^{T}=\left(v_{i j}\right)_{n \times n}$, where the square matrices $U$ and $V$ are real orthogonal matrices, i.e. $U^T = U^{-1}$, $V^T=V^{-1}$, and $\Sigma$ is a rectangular diagonal matrix of the form
$$
\begin{align}
\Sigma=
\begin{bmatrix}
{\sigma_{1}} & {0} & {0} & {0} & {\cdots} & {0}\\
{0} & {\ddots} & {0} & {\vdots} & {\cdots} & {0}\\
{0} & {0} & {\sigma_{r}} & {0} & {\cdots} & {0}\\
{0} & {\cdots} & {0} & {0} & {\cdots} & {0}\\
{\vdots} & {\ddots} & {\vdots} & {\vdots} & {\ddots} & {\vdots}\\
{0} & {\cdots} & {0} & {0} & {\cdots} & {0}
\end{bmatrix}
\end{align}
$$
The diagonal entries $\sigma_{i}>0, \quad i=1,2, \ldots, r$, with $\sigma_{r+1}=\cdots=\sigma_{p}=0, \quad p=\min \{m, n\}$, are called the singular values. Note that the SVD is not unique. For convenience in applications, the singular values are conventionally arranged in decreasing order: $\sigma_{1} \geq \sigma_{2} \geq \cdots \geq \sigma_{r}>0$.
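As a quick numerical check, the decomposition can be computed with NumPy's `numpy.linalg.svd` (the example matrix below is made up for illustration):

```python
import numpy as np

# A small illustrative matrix (values chosen arbitrarily)
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# full_matrices=True returns square U (m x m) and V^T (n x n)
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n rectangular diagonal matrix Sigma from the singular values
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# U and V are orthogonal, and A = U Sigma V^T
assert np.allclose(U @ U.T, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(2))
assert np.allclose(U @ Sigma @ Vt, A)

# NumPy already returns the singular values in decreasing order
assert np.all(np.diff(s) <= 0)
print(s)
```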
Let the column vectors of $U$ be $\mathbf{u}_i, i = 1, \dots, m$, and the column vectors of $V$ be $\mathbf{v}_{j}, j=1, \dots, n$. Then $A=U \Sigma V^{T}$ can be written as a sum of $r$ rank-one matrices:
$$
\begin{align}
\begin{split}
A &=
\begin{bmatrix}
\mathbf{u}_{1}&\mathbf{u}_{2}&\cdots &\mathbf{u}_{m}
\end{bmatrix}
\begin{bmatrix}
\sigma_1&&&&\\
&\sigma_2&&&\\
&&\ddots&&\\
&&&\sigma_r &\\
&&&&\ddots
\end{bmatrix}
\begin{bmatrix}
\mathbf{v}_{1}^{T}\\\mathbf{v}_{2}^{T}\\\vdots\\\mathbf{v}_{n}^{T}
\end{bmatrix}\\
&=\left[ \begin{array}{lllllll}{\sigma_{1} \mathbf{u}_{1}} & {\sigma_{2} \mathbf{u}_{2}} & {\cdots} & {\sigma_{r} \mathbf{u}_{r}} & {\mathbf{0}} & {\cdots} & {\mathbf{0}}\end{array}\right] \left[ \begin{array}{c}{\mathbf{v}_{1}^{T}} \\ {\mathbf{v}_{2}^{T}} \\ {\vdots} \\ {\mathbf{v}_{n}^{T}}\end{array}\right]\\
&=\sigma_{1} \mathbf{u}_{1} \mathbf{v}_{1}^{T}+\sigma_{2} \mathbf{u}_{2} \mathbf{v}_{2}^{T}+\cdots+\sigma_{r} \mathbf{u}_{r} \mathbf{v}_{r}^{T}
\end{split}
\end{align}
$$
This expansion shows that $A$ is determined entirely by the first $r$ columns of $U$, the first $r$ rows of $V^T$, and the leading $r\times r$ block of $\Sigma$. Keeping only these parts is called the thin (compact) singular value decomposition.
It follows that storing a low-rank matrix in thin-SVD form can save a large amount of storage space.
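The storage argument can be sketched in NumPy (a made-up random rank-$r$ matrix; all sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 100, 80, 5

# Build a rank-r matrix as a sum of r rank-one terms
A = sum(rng.standard_normal((m, 1)) @ rng.standard_normal((1, n))
        for _ in range(r))

U, s, Vt = np.linalg.svd(A)

# Thin SVD: the first r columns of U, r singular values, and the
# first r rows of V^T suffice to reconstruct A exactly
A_thin = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
assert np.allclose(A, A_thin)

# Storage cost: m*n numbers for A vs r*(m + n + 1) for the thin factors
print(m * n, r * (m + n + 1))  # 8000 905
```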
The SVD can also be understood from the viewpoint of linear transformations:
We can view the SVD as factoring the action of $A$ into three steps: a rotation $V^{T}$, a scaling $\Sigma$, and another rotation $U$; for a $2\times 2$ matrix this sequence of transformations can be pictured directly. Equivalently, $\Sigma$ is the diagonal matrix representing the transformation $A$ with respect to the bases $\left\{\mathbf{v}_{1}, \dots, \mathbf{v}_{n}\right\}$ and $\left\{\mathbf{u}_{1}, \dots, \mathbf{u}_{m}\right\}$.
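The rotate–scale–rotate reading can be checked on a small example (the matrix is made up; note that an orthogonal factor may also be a reflection, with determinant $-1$):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
U, s, Vt = np.linalg.svd(A)

# Each orthogonal factor is a rotation (det = +1) or reflection (det = -1)
print(np.linalg.det(U), np.linalg.det(Vt))

# Applying A to x = rotate by V^T, scale the axes by Sigma, rotate by U
x = np.array([1.0, 2.0])
y = U @ (np.diag(s) @ (Vt @ x))
assert np.allclose(y, A @ x)
```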
In practice, computing the SVD relies mainly on the following properties:
The nonzero eigenvalues of $A^{T} A$ and $A A^{T}$ are $\sigma_{i}^{2}, i=1,2, \ldots, r$, where $r=\operatorname{rank} A$. Note that $A^{T} A$ and $A A^{T}$ are positive semidefinite, so their eigenvalues are never negative.
The unit eigenvectors of $A^{T} A$ are the $\mathbf{v}_j$:
$$
A^{T} A \mathbf{v}_{j}=\sigma_{j}^{2} \mathbf{v}_{j}, \quad j=1,2, \ldots, n
$$
The unit eigenvectors of $A A^{T}$ are the $\mathbf{u}_j$:
$$
A A^{T} \mathbf{u}_{j}=\sigma_{j}^{2} \mathbf{u}_{j}, \quad j=1,2, \ldots, m
$$
For $j=1,2, \ldots, r$, the vectors $\mathbf{u}_j$ and $\mathbf{v}_j$ are related by:
$$
\begin{aligned}
A \mathbf{v}_{j} &=\sigma_{j} \mathbf{u}_{j}\\
A^{T} \mathbf{u}_{j} &=\sigma_{j} \mathbf{v}_{j}
\end{aligned}
$$
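These properties can be verified numerically on a made-up random matrix (a sketch; variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A)
V = Vt.T
r = np.linalg.matrix_rank(A)

for j in range(r):
    u, v, sigma = U[:, j], V[:, j], s[j]
    assert np.allclose(A @ v, sigma * u)            # A v_j = sigma_j u_j
    assert np.allclose(A.T @ u, sigma * v)          # A^T u_j = sigma_j v_j
    assert np.allclose(A.T @ A @ v, sigma**2 * v)   # v_j: eigenvector of A^T A
    assert np.allclose(A @ A.T @ u, sigma**2 * u)   # u_j: eigenvector of A A^T
```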
For the singular value $\sigma_j$, $\mathbf{u}_j$ is called the left singular vector and $\mathbf{v}_j$ the right singular vector.
The relation between the SVD and the eigendecomposition
The SVD applies to any $m\times n$ matrix, whereas the eigendecomposition applies only to certain classes of square matrices, so the SVD has a wider range of applicability. The two decompositions are nevertheless related. Given the SVD of a matrix $M$, we have
$$
\begin{align}
\begin{split}
M^{\star} M &=V \Sigma^{\star} U^{\star} U \Sigma V^{\star}=V\left(\Sigma^{\star} \Sigma\right) V^{\star}\\
M M^{\star} &=U \Sigma V^{\star} V \Sigma^{\star} U^{\star}=U\left(\Sigma \Sigma^{\star}\right) U^{\star}
\end{split}
\end{align}
$$
From these identities:
- The columns of $V$ (the right singular vectors) are eigenvectors of $M^{\star} M$.
- The columns of $U$ (the left singular vectors) are eigenvectors of $M M^{\star}$.
- The nonzero diagonal entries of $\Sigma$ (the nonzero singular values) are the square roots of the nonzero eigenvalues of $M^{\star} M$ or $M M^{\star}$.
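This correspondence is easy to confirm numerically (a sketch with a made-up random matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 3))

# Singular values of M, in decreasing order
s = np.linalg.svd(M, compute_uv=False)

# Eigenvalues of the symmetric matrix M^T M, sorted decreasing
eigvals = np.linalg.eigvalsh(M.T @ M)[::-1]

# Nonzero singular values = square roots of the nonzero eigenvalues
assert np.allclose(s, np.sqrt(eigvals))
```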
Computation
The following simple example illustrates the procedure for computing an SVD. Consider the matrix:
$$
\begin{align}
A=
\begin{bmatrix}
{1} & {0} & {0} & {0} & {2}\\
{0} & {0} & {3} & {0} & {0}\\
{0} & {0} & {0} & {0} & {0}\\
{0} & {4} & {0} & {0} & {0}
\end{bmatrix}
\end{align}
$$
STEP 1: Compute $A^{T} A$:
$$
\begin{align}
A^{T} A=
\begin{bmatrix}
{1} & {0} & {0} & {0} & {2}\\
{0} & {16} & {0} & {0} & {0}\\
{0} & {0} & {9} & {0} & {0}\\
{0} & {0} & {0} & {0} & {0}\\
{2} & {0} & {0} & {0} & {4}
\end{bmatrix}
\end{align}
$$
STEP 2: Find the eigenvalues and eigenvectors of $A^{T} A$, sorting the eigenvalues in decreasing order:
$$
\lambda_{1}=16, \lambda_{2}=9, \lambda_{3}=5, \lambda_{4}=0, \lambda_{5}=0
$$
The corresponding unit eigenvectors are:
$$
\begin{align}
\mathbf{v}_{1}=
\begin{bmatrix}
{0} \\{1} \\{0}\\{0} \\{0}
\end{bmatrix}
, \mathbf{v}_{2}=
\begin{bmatrix}
{0} \\{0} \\{1} \\{0} \\{0}
\end{bmatrix}
, \mathbf{v}_{3}=
\begin{bmatrix}
{\sqrt{0.2}} \\{0} \\{0} \\{0} \\{\sqrt{0.8}}
\end{bmatrix}
, \mathbf{v}_{4}=
\begin{bmatrix}
{0} \\{0} \\{0} \\{1} \\{0}
\end{bmatrix}
, \mathbf{v}_{5}=
\begin{bmatrix}
{-\sqrt{0.8}} \\{0} \\{0} \\{0} \\\sqrt{0.2}
\end{bmatrix}
\end{align}
$$
By the properties above, the nonzero singular values are $\sigma_{1}=\sqrt{\lambda_{1}}=4, \sigma_{2}=\sqrt{\lambda_{2}}=3, \sigma_{3}=\sqrt{\lambda_{3}}=\sqrt{5}$.
STEP 3: Assemble $\Sigma$ and $V$:
$$
\begin{align}
\Sigma=
\begin{bmatrix}
{4} & {0} & {0} & {0} & {0} \\
{0} & {3} & {0} & {0} & {0} \\
{0} & {0} & {\sqrt{5}} & {0} & {0} \\
{0} & {0} & {0} & {0} & {0}
\end{bmatrix}
\end{align}
$$
$$
\begin{align}
V=
\begin{bmatrix}
{0} & {0} & {\sqrt{0.2}} & {0} & {-\sqrt{0.8}} \\
{1} & {0} & {0} & {0} & {0} \\
{0} & {1} & {0} & {0} & {0} \\
{0} & {0} & {0} & {1} & {0} \\
{0} & {0} & {\sqrt{0.8}} & {0} & {\sqrt{0.2}}
\end{bmatrix}
\end{align}
$$
STEP 4: Finally, compute the matrix $U$. For $j=1,2, \ldots, r$ (here $r = 3$), we can compute $\mathbf{u}_{j}=\frac{1}{\sigma_{j}} A \mathbf{v}_{j}$ directly:
$$
\begin{align}
\mathbf{u}_{1}=
\begin{bmatrix}
{0} \\
{0} \\
{0} \\
{1}
\end{bmatrix}
, \mathbf{u}_{2}=
\begin{bmatrix}
{0} \\
{1} \\
{0} \\
{0}
\end{bmatrix}
, \mathbf{u}_{3}=
\begin{bmatrix}
{1} \\
{0} \\
{0} \\
{0}
\end{bmatrix}
\end{align}
$$
The remaining column is found by Gram-Schmidt orthogonalization (any unit vector orthogonal to $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$ will do):
$$
\begin{align}
\mathbf{u}_{4}=
\begin{bmatrix}
{0} \\
{0} \\
{1} \\
{0}
\end{bmatrix}
\end{align}
$$
Combining these results,
$$
\begin{align}
U=
\begin{bmatrix}
{0} & {0} & {1} & {0} \\
{0} & {1} & {0} & {0} \\
{0} & {0} & {0} & {1} \\
{1} & {0} & {0} & {0}
\end{bmatrix}
\end{align}
$$
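The hand-computed factors from the four steps above can be checked in NumPy:

```python
import numpy as np

A = np.array([[1., 0., 0., 0., 2.],
              [0., 0., 3., 0., 0.],
              [0., 0., 0., 0., 0.],
              [0., 4., 0., 0., 0.]])

# Factors obtained in STEPs 1-4
U = np.array([[0., 0., 1., 0.],
              [0., 1., 0., 0.],
              [0., 0., 0., 1.],
              [1., 0., 0., 0.]])
Sigma = np.zeros((4, 5))
Sigma[0, 0], Sigma[1, 1], Sigma[2, 2] = 4., 3., np.sqrt(5.)
V = np.array([[0., 0., np.sqrt(.2), 0., -np.sqrt(.8)],
              [1., 0., 0., 0., 0.],
              [0., 1., 0., 0., 0.],
              [0., 0., 0., 1., 0.],
              [0., 0., np.sqrt(.8), 0., np.sqrt(.2)]])

# The product reproduces A exactly
assert np.allclose(U @ Sigma @ V.T, A)

# NumPy's own SVD recovers the same singular values
assert np.allclose(np.linalg.svd(A, compute_uv=False),
                   [4., 3., np.sqrt(5.), 0.])
```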