1 Title
Simplifying Graph Convolutional Networks (Felix Wu, Tianyi Zhang, Amauri Holanda de Souza Jr., Christopher Fifty, Tao Yu, Kilian Q. Weinberger) [ICML 2019]
2 Conclusion
This paper proposes a simplified graph convolutional method, SGC. It removes the nonlinear computation between GCN layers and collapses the resulting function into a single linear transformation, reducing the excess complexity of GCNs. The paper also grounds SGC theoretically in spectral analysis, the root of graph convolution, showing that SGC corresponds to a fixed low-pass filter followed by a linear classifier.
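To make the low-pass claim concrete, here is a one-line spectral sketch using the paper's augmented normalized Laplacian $\tilde{L} = I - \tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2}$ (adjacency and degree taken after adding self-loops); the notation $S$ for the propagation matrix follows the paper:

$$S^{K} = (I - \tilde{L})^{K} \quad\Longrightarrow\quad g(\lambda_i) = (1 - \lambda_i)^{K}$$

That is, $K$-step propagation scales the $i$-th graph-frequency component by $(1-\lambda_i)^{K}$. Since adding self-loops shrinks the spectrum of $\tilde{L}$, this factor decays as $\lambda_i$ grows, which is exactly a fixed low-pass filter; the learned $W$ then acts as the linear classifier on top.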
3 Good Sentences
1. However, possibly because GCNs were proposed after the recent "renaissance" of neural networks, they tend to be a rare exception to this trend. GCNs are built upon multi-layer neural networks, and were never an extension of a simpler (insufficient) linear counterpart. (GCNs are nonlinear, which makes them harder to optimize and interpret.)
2. In contrast to its nonlinear counterparts, the SGC is intuitively interpretable and we provide a theoretical analysis from the graph convolution perspective. Notably, feature extraction in SGC corresponds to a single fixed filter applied to each feature dimension. (The advantages of SGC compared with GCNs.)
3. The algorithm is almost trivial, a graph based pre-processing step followed by standard multi-class logistic regression. However, the performance of SGC rivals — if not surpasses — the performance of GCNs and state-of-the-art graph neural network models across a wide range of graph learning tasks. (The contribution of SGC.)
The goal of this paper is to turn the nonlinear GCN into a simple linear model, SGC, reducing the excess complexity of GCNs by repeatedly removing the nonlinearities between GCN layers and collapsing the resulting function into a single linear transformation.
In a GCN, each layer computes $H^{(l+1)} = \sigma\big(\hat{A} H^{(l)} W^{(l)}\big)$, where $\hat{A} = D^{-1/2} A D^{-1/2}$ is the normalized adjacency matrix (A is the adjacency matrix and D the degree matrix; the paper normalizes after adding self-loops), $H^{(l)}$ is the hidden state of layer $l$, $W^{(l)}$ is the weight matrix of layer $l$, and $\sigma$ is the nonlinear activation function.

In SGC, the nonlinear activation $\sigma$ is omitted, so each layer becomes $H^{(l+1)} = \hat{A} H^{(l)} W^{(l)}$. The operations of all layers can then be merged into a single one:

$$\hat{Y} = \mathrm{softmax}\big(\hat{A}^{K} X W\big)$$

where $K$ is the number of layers, $X$ is the input feature matrix, and $W = W^{(1)} W^{(2)} \cdots W^{(K)}$ is a single weight matrix to be learned.

The advantage of this approach is computational efficiency: the propagation $\hat{A}^{K} X$ involves no learned parameters, so it can be precomputed once and stored as a matrix. And because the nonlinear activation is omitted, training SGC is also more stable. However, this also means SGC may fail to capture some complex nonlinear patterns.
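A minimal sketch of this two-step pipeline in Python, assuming NumPy and scikit-learn are available; the toy 4-node graph, random features, and $K = 2$ are illustrative choices, not the paper's experimental setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def normalize_adjacency(adj):
    """Compute A_hat = D^{-1/2} (A + I) D^{-1/2}, the normalized
    adjacency with self-loops used for GCN/SGC propagation."""
    a_tilde = adj + np.eye(adj.shape[0])   # add self-loops
    deg = a_tilde.sum(axis=1)              # degree vector of A + I
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    return d_inv_sqrt @ a_tilde @ d_inv_sqrt

def sgc_precompute(features, adj, k):
    """Graph-based pre-processing: apply the fixed filter A_hat^K to X.
    No learned parameters are involved, so this runs once before training."""
    a_hat = normalize_adjacency(adj)
    x = features
    for _ in range(k):
        x = a_hat @ x
    return x

# Toy 4-node path graph with 3-dimensional features and 2 classes.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
features = np.random.randn(4, 3)
labels = np.array([0, 0, 1, 1])

x_smoothed = sgc_precompute(features, adj, k=2)   # A_hat^2 X, computed once

# The remaining model is plain multi-class logistic regression on A_hat^K X,
# i.e., the single weight matrix W of the collapsed linear model.
clf = LogisticRegression().fit(x_smoothed, labels)
print(clf.predict(x_smoothed))
```

Note how the design splits cleanly: all graph structure lives in the one-off `sgc_precompute` step, and training touches only a standard linear classifier, which is why SGC is fast and stable to train.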