Hypergraph Convolution and Hypergraph Attention
Pattern Recognition (IF 8), Pub Date: 2021-02-01, DOI: 10.1016/j.patcog.2020.107637
Song Bai, Feihu Zhang, Philip H.S. Torr

Recently, graph neural networks have attracted great attention and achieved prominent performance in various research fields. Most of these algorithms assume pairwise relationships between the objects of interest. However, in many real applications, the relationships between objects are of higher order, beyond a pairwise formulation. To efficiently learn deep embeddings on such high-order graph-structured data, we introduce two end-to-end trainable operators to the family of graph neural networks: hypergraph convolution and hypergraph attention. Whilst hypergraph convolution defines the basic formulation of performing convolution on a hypergraph, hypergraph attention further enhances the capacity of representation learning by leveraging an attention module. With these two operators, a graph neural network is readily extended to a more flexible model and applied to diverse applications where non-pairwise relationships are observed. Extensive experimental results on semi-supervised node classification demonstrate the effectiveness of hypergraph convolution and hypergraph attention.
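As a concrete illustration of the convolution operator the abstract describes, below is a minimal NumPy sketch of one hypergraph convolution step. It assumes the commonly stated symmetric normalization X' = σ(Dv^{-1/2} H W De^{-1} Hᵀ Dv^{-1/2} X Θ), where H is the node-hyperedge incidence matrix, W holds hyperedge weights, Dv and De are node and hyperedge degree matrices, and Θ is a learnable weight matrix; the function name and shapes are illustrative, not the authors' reference implementation.

```python
import numpy as np

def hypergraph_conv(X, H, Theta, w=None):
    """One hypergraph convolution step (sketch):
    X' = ReLU(Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta)

    X:     (n, f)  node feature matrix
    H:     (n, e)  incidence matrix; H[i, j] = 1 if node i is in hyperedge j
    Theta: (f, f') learnable weight matrix
    w:     (e,)    hyperedge weights (defaults to all ones)
    """
    n, e = H.shape
    w = np.ones(e) if w is None else np.asarray(w, dtype=float)
    W = np.diag(w)
    Dv = H @ w                  # node degrees (weighted)
    De = H.sum(axis=0)          # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    # Normalized hypergraph "adjacency", analogous to the GCN propagation rule.
    A = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)   # ReLU nonlinearity

# Toy example: 4 nodes, 2 hyperedges (node 1 sits in both).
H = np.array([[1, 0],
              [1, 1],
              [0, 1],
              [1, 0]], dtype=float)
X = np.random.default_rng(0).standard_normal((4, 3))
Theta = np.random.default_rng(1).standard_normal((3, 2))
out = hypergraph_conv(X, H, Theta)   # shape (4, 2)
```

Hypergraph attention, in the same spirit, would replace the fixed incidence entries of H with learned attention scores between a node and its hyperedges before applying the propagation above.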

Updated: 2021-02-01