Message Passing Neural Processes
arXiv - CS - Machine Learning. Pub Date: 2020-09-29, DOI: arXiv:2009.13895
Ben Day, Cătălina Cangea, Arian R. Jamasb, Pietro Liò

Neural Processes (NPs) are powerful and flexible models able to incorporate uncertainty when representing stochastic processes, while maintaining a linear time complexity. However, NPs produce a latent description by aggregating independent representations of context points and lack the ability to exploit relational information present in many datasets. This renders NPs ineffective in settings where the stochastic process is primarily governed by neighbourhood rules, such as cellular automata (CA), and limits performance for any task where relational information remains unused. We address this shortcoming by introducing Message Passing Neural Processes (MPNPs), the first class of NPs that explicitly makes use of relational structure within the model. Our evaluation shows that MPNPs thrive at lower sampling rates, on existing benchmarks and newly-proposed CA and Cora-Branched tasks. We further report strong generalisation over density-based CA rule-sets and significant gains in challenging arbitrary-labelling and few-shot learning setups.
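The contrast the abstract draws can be made concrete: a standard NP embeds each context point independently and aggregates by a permutation-invariant mean, so the latent description cannot see edges between points, while an MPNP passes messages along a graph before aggregating. The sketch below is illustrative only: the fixed `embed` function and the residual mean-of-neighbours update are hypothetical stand-ins for the learned networks and message function in the paper.

```python
import numpy as np

def np_encoder(context_feats, embed):
    """Standard NP: embed each context point independently, then mean-aggregate.
    The result is blind to any graph structure over the points."""
    return embed(context_feats).mean(axis=0)

def mpnp_encoder(context_feats, adj, embed, rounds=2):
    """MPNP-style sketch: a few rounds of neighbour averaging over the context
    graph before the permutation-invariant aggregation.
    (Hypothetical update rule; the paper's message function is learned.)"""
    h = embed(context_feats)
    deg = adj.sum(axis=1, keepdims=True).clip(min=1.0)  # avoid divide-by-zero
    for _ in range(rounds):
        h = h + (adj @ h) / deg  # residual mean-of-neighbours message pass
    return h.mean(axis=0)

# Toy fixed "embedding" so the example runs without a trained network.
embed = lambda x: np.tanh(x @ (0.5 * np.ones((2, 4))))
feats = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # 0-1-2 chain

r_np = np_encoder(feats, embed)          # ignores the graph entirely
r_mp = mpnp_encoder(feats, path, embed)  # changes whenever the edges change
```

Rewiring `path` changes `r_mp` but leaves `r_np` untouched, which is exactly the relational information the abstract says plain NPs discard.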

Updated: 2020-09-30