Paper Interpretation: The Third-Generation GCN, "Unsupervised Deep Embedding for Clustering Analysis" (4)

  $\tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}}=\left\{\begin{array}{cccccc}\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & 0 & 0 & \frac{1}{\sqrt{3}} & 0 \\\frac{1}{\sqrt{4}} & \frac{1}{\sqrt{4}} & \frac{1}{\sqrt{4}} & 0 & \frac{1}{\sqrt{4}} & 0 \\0 & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & 0 & 0 \\0 & 0 & \frac{1}{\sqrt{4}} & \frac{1}{\sqrt{4}} & \frac{1}{\sqrt{4}} & \frac{1}{\sqrt{4}} \\\frac{1}{\sqrt{4}} & \frac{1}{\sqrt{4}} & 0 & \frac{1}{\sqrt{4}} & \frac{1}{\sqrt{4}} & 0 \\0 & 0 & 0 & \frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}}\end{array}\right\}\left\{\begin{array}{cccccc}\frac{1}{\sqrt{3}} & 0 & 0 & 0 & 0 & 0 \\0 & \frac{1}{\sqrt{4}} & 0 & 0 & 0 & 0 \\0 & 0 & \frac{1}{\sqrt{3}} & 0 & 0 & 0 \\0 & 0 & 0 & \frac{1}{\sqrt{4}} & 0 & 0 \\0 & 0 & 0 & 0 & \frac{1}{\sqrt{4}} & 0 \\0 & 0 & 0 & 0 & 0 & \frac{1}{\sqrt{2}}\end{array}\right\}$

  $\tilde{D}^{-\frac{1}{2}} \tilde{A} \tilde{D}^{-\frac{1}{2}}=\left\{\begin{array}{cccccc}\frac{1}{3}         & \frac{1}{\sqrt{12}} & 0                   & 0                                & \frac{1}{\sqrt{12}} & 0 \\\frac{1}{\sqrt{12}} & \frac{1}{4}         & \frac{1}{\sqrt{12}} & 0                                & \frac{1}{4}         & 0 \\0                   & \frac{1}{\sqrt{12}} & \frac{1}{3}         & \frac{1}{\sqrt{12}}              & 0                  & 0 \\0                   & 0                   & \frac{1}{\sqrt{12}} & \frac{1}{4}                      & \frac{1}{4}       & \frac{1}{\sqrt{8}} \\\frac{1}{\sqrt{12}} & \frac{1}{4}         & 0                   & \frac{1}{4}                       & \frac{1}{4}      & 0 \\0& 0& 0& \frac{1}{\sqrt{8}}               & 0                  & \frac{1}{2}\end{array}\right\}$
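  A minimal NumPy sketch of this symmetric normalization, assuming the hypothetical 6-node graph read off from the matrices above (1-indexed edges 1-2, 1-5, 2-3, 2-5, 3-4, 4-5, 4-6, plus self-loops):

```python
import numpy as np

# Hypothetical 6-node example graph implied by the matrices above.
edges = [(0, 1), (0, 4), (1, 2), (1, 4), (2, 3), (3, 4), (3, 5)]

A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

A_tilde = A + np.eye(6)                        # A_tilde = A + I (add self-loops)
d_tilde = A_tilde.sum(axis=1)                  # degrees of A_tilde
D_inv_sqrt = np.diag(1.0 / np.sqrt(d_tilde))   # D_tilde^(-1/2)

A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt      # D_tilde^(-1/2) A_tilde D_tilde^(-1/2)
print(np.round(A_hat, 3))
# First row matches the result above: [1/3, 1/sqrt(12), 0, 0, 1/sqrt(12), 0]
```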

2.1 Spectral graph convolutions

2.1.1 Foundation of spectral graph convolutions

  We consider spectral convolutions on graphs:

  Suppose $x$ is the feature signal and $g$ is the convolution kernel. The graph convolution is then:

    $(g * x)=F^{-1}[F[g] \odot F[x]]$

    $\left(g * x\right)=U\left(U^{T} x \odot U^{T} g\right)=U\left(U^{T} g \odot U^{T} x\right)$
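  A minimal NumPy sketch of this convolution-theorem form, assuming $U$ is the eigenvector matrix of the normalized Laplacian $L=I-D^{-\frac{1}{2}} A D^{-\frac{1}{2}}$ and reusing the hypothetical 6-node graph from above:

```python
import numpy as np

# Spectral convolution via the convolution theorem on the hypothetical
# 6-node example graph; U is assumed to come from the eigendecomposition
# of the normalized Laplacian L = I - D^(-1/2) A D^(-1/2).
edges = [(0, 1), (0, 4), (1, 2), (1, 4), (2, 3), (3, 4), (3, 5)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
L = np.eye(6) - D_inv_sqrt @ A @ D_inv_sqrt    # normalized Laplacian
lam, U = np.linalg.eigh(L)                     # L = U diag(lam) U^T

rng = np.random.default_rng(0)
x = rng.standard_normal(6)                     # signal on the graph
g = rng.standard_normal(6)                     # filter in the vertex domain

x_hat = U.T @ x                                # graph Fourier transform F[x]
g_hat = U.T @ g                                # graph Fourier transform F[g]
conv = U @ (g_hat * x_hat)                     # g * x = U (U^T g ⊙ U^T x)
print(conv)
```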

  Treat $U^{T} g$ as a whole as the learnable convolution kernel:

    $U^{T} g=\left[\begin{array}{c}\hat{g}_{\theta}\left(\lambda_{0}\right) \\ \hat{g}_{\theta}\left(\lambda_{1}\right) \\ \vdots \\ \hat{g}_{\theta}\left(\lambda_{n-1}\right)\end{array}\right]$

  where $\theta$ denotes the parameters of $g$.

  We then obtain:

    $\begin{aligned}\left(U^{T} g\right) \odot\left(U^{T} x\right)&=\left[\begin{array}{c}\hat{g}_{\theta}\left(\lambda_{0}\right) \\ \hat{g}_{\theta}\left(\lambda_{1}\right) \\ \vdots \\ \hat{g}_{\theta}\left(\lambda_{n-1}\right)\end{array}\right] \odot\left[\begin{array}{c}\hat{x}\left(\lambda_{0}\right) \\ \hat{x}\left(\lambda_{1}\right) \\ \vdots \\ \hat{x}\left(\lambda_{n-1}\right)\end{array}\right]\\&=\left[\begin{array}{c}\hat{g}_{\theta}\left(\lambda_{0}\right) \cdot \hat{x}\left(\lambda_{0}\right) \\ \hat{g}_{\theta}\left(\lambda_{1}\right) \cdot \hat{x}\left(\lambda_{1}\right) \\ \vdots \\ \hat{g}_{\theta}\left(\lambda_{n-1}\right) \cdot \hat{x}\left(\lambda_{n-1}\right)\end{array}\right]\\&=\left[\begin{array}{cccc}\hat{g}_{\theta}\left(\lambda_{0}\right) & 0 & \cdots & 0 \\ 0 & \hat{g}_{\theta}\left(\lambda_{1}\right) & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \hat{g}_{\theta}\left(\lambda_{n-1}\right)\end{array}\right]\left[\begin{array}{c}\hat{x}\left(\lambda_{0}\right) \\ \hat{x}\left(\lambda_{1}\right) \\ \vdots \\ \hat{x}\left(\lambda_{n-1}\right)\end{array}\right]\\&=g_{\theta}(\Lambda) U^{T} x\end{aligned}$

  The final convolution formula on the graph is therefore:

    $(g * x)=U\left(U^{T} g \odot U^{T} x\right)=U g_{\theta}(\Lambda) U^{T} x$
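  As a numerical sanity check (same hypothetical 6-node graph and assumed normalized-Laplacian basis as above), the diagonal form can be compared against the convolution-theorem form directly; here $\theta$ is simply taken as $U^{T} g$ rather than learned:

```python
import numpy as np

# Check that g * x = U g_theta(Lambda) U^T x, with theta_i = g_hat_theta(lambda_i),
# equals the convolution-theorem form U((U^T g) ⊙ (U^T x)).
edges = [(0, 1), (0, 4), (1, 2), (1, 4), (2, 3), (3, 4), (3, 5)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
L = np.eye(6) - D_inv_sqrt @ A @ D_inv_sqrt    # normalized Laplacian (assumed basis)
lam, U = np.linalg.eigh(L)

rng = np.random.default_rng(0)
x = rng.standard_normal(6)
g = rng.standard_normal(6)
theta = U.T @ g                                # theta = [g_hat(lambda_0), ..., g_hat(lambda_{n-1})]

lhs = U @ ((U.T @ g) * (U.T @ x))              # U (U^T g ⊙ U^T x)
rhs = U @ np.diag(theta) @ (U.T @ x)           # U g_theta(Lambda) U^T x
print(np.allclose(lhs, rhs))                   # True
```

  In a spectral GCN, $\theta$ would be the learnable filter parameters rather than being computed from a fixed $g$ as in this sketch.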
