Mutual teaching for graph convolutional networks
Future Generation Computer Systems (IF 7.5) Pub Date: 2020-10-18, DOI: 10.1016/j.future.2020.10.016
Kun Zhan , Chaoxi Niu

Graph convolutional networks generate reasonable predictions for unlabeled samples because of transductive label propagation. Samples can have different prediction confidences, so we treat high-confidence predictions as pseudo labels to select additional samples for updating the models. We propose a new training strategy called mutual teaching, wherein two models are first trained and then teach each other during each batch. Each network feeds forward all samples, and the samples with high-confidence predictions are used to expand the label set; each model is then updated with the samples selected by its peer network. We regard these high-confidence predictions as useful knowledge, and each network teaches its peer network with this knowledge. In the proposed strategy, the pseudo-label set of a network is derived from its peer network, which improves performance significantly. Experiments on three citation network datasets demonstrate that our method outperforms state-of-the-art methods under very low label rates.
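The training loop described above can be sketched as follows. This is not the authors' code: a minimal NumPy sketch in which a linear softmax classifier stands in for a GCN, and the threshold `tau`, the learning rate, and all function names are illustrative assumptions. It shows the core mechanism: each model selects its high-confidence predictions on unlabeled samples as pseudo labels, and its *peer* is updated with them.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class LinearModel:
    """Stand-in for a GCN: a linear softmax classifier."""
    def __init__(self, n_features, n_classes, seed):
        rng = np.random.default_rng(seed)
        self.W = 0.01 * rng.standard_normal((n_features, n_classes))

    def predict_proba(self, X):
        return softmax(X @ self.W)

    def fit_step(self, X, y, lr=0.5):
        # One gradient step on the cross-entropy loss.
        p = self.predict_proba(X)
        onehot = np.eye(self.W.shape[1])[y]
        self.W -= lr * X.T @ (p - onehot) / len(X)

def mutual_teaching(m1, m2, X_lab, y_lab, X_unlab, tau=0.9, epochs=200):
    for _ in range(epochs):
        # Each network feeds forward all unlabeled samples.
        p1 = m1.predict_proba(X_unlab)
        p2 = m2.predict_proba(X_unlab)
        # High-confidence predictions become pseudo labels.
        keep1 = p1.max(axis=1) >= tau
        keep2 = p2.max(axis=1) >= tau
        # Each model is updated with the labeled set plus the
        # pseudo-labeled samples selected by its peer network.
        m1.fit_step(np.vstack([X_lab, X_unlab[keep2]]),
                    np.concatenate([y_lab, p2[keep2].argmax(axis=1)]))
        m2.fit_step(np.vstack([X_lab, X_unlab[keep1]]),
                    np.concatenate([y_lab, p1[keep1].argmax(axis=1)]))
```

Early in training, neither model is confident, so both learn from the labeled set only; as confidence crosses `tau`, each model's pseudo labels start expanding its peer's training set.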




Updated: 2020-10-19