Automatic Construction of Multi-layer Perceptron Network from Streaming Examples
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2019-10-08, DOI: arXiv:1910.03437
Mahardhika Pratama, Choiru Za'in, Andri Ashfahani, Yew Soon Ong and Weiping Ding

Autonomous construction of deep neural networks (DNNs) is desirable for data streams because it potentially offers two advantages: an appropriate model capacity and a quick reaction to drift and shift. While the self-organizing mechanism of DNNs remains an open issue, this task is even more challenging for standard multi-layer DNNs than for different-depth structures, because the addition of a new layer causes loss of previously trained knowledge. A Neural Network with Dynamically Evolved Capacity (NADINE) is proposed in this paper. NADINE features a fully open structure: its network depth and width can be automatically evolved from scratch in an online manner, without the use of problem-specific thresholds. NADINE is built on a standard MLP architecture, and the catastrophic-forgetting issue during the hidden-layer addition phase is resolved by the proposed soft-forgetting and adaptive-memory methods. NADINE's advantages, namely its elastic structure and online learning trait, are numerically validated on nine data-stream classification and regression problems, where it improves upon prominent algorithms in all cases. In addition, it handles data-stream regression and classification problems equally well.
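To make the idea of a self-evolving network concrete, the following is a minimal toy sketch of an MLP whose width grows during online (sample-by-sample) learning when the streaming error stays high. This is an illustrative assumption-laden simplification, not the actual NADINE algorithm: NADINE also evolves depth, prunes units, and uses soft-forgetting and adaptive memory, none of which are reproduced here. All names (`EvolvingMLP`, `grow_threshold`, `err_ema`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

class EvolvingMLP:
    """Toy one-hidden-layer MLP whose width grows when streaming error is high."""

    def __init__(self, n_in, n_out, lr=0.05, grow_threshold=0.5):
        self.lr = lr
        self.grow_threshold = grow_threshold
        self.W1 = rng.normal(scale=0.5, size=(n_in, 1))  # start with one hidden unit
        self.W2 = rng.normal(scale=0.5, size=(1, n_out))
        self.err_ema = 0.0  # exponential moving average of squared error

    def forward(self, x):
        h = np.tanh(x @ self.W1)
        return h, h @ self.W2

    def partial_fit(self, x, y):
        """Process a single streaming example: one SGD step, then maybe grow."""
        h, y_hat = self.forward(x)
        err = y_hat - y
        # One SGD step through both layers (squared-error loss).
        self.W2 -= self.lr * np.outer(h, err)
        dh = (err @ self.W2.T) * (1.0 - h ** 2)
        self.W1 -= self.lr * np.outer(x, dh)
        # Track streaming error; add one hidden unit if it stays high.
        self.err_ema = 0.95 * self.err_ema + 0.05 * float(np.mean(err ** 2))
        if self.err_ema > self.grow_threshold:
            new_in = rng.normal(scale=0.5, size=(self.W1.shape[0], 1))
            self.W1 = np.hstack([self.W1, new_in])
            # New unit's outgoing weights start at zero, so growth does not
            # disturb the current predictions (a crude forgetting safeguard).
            self.W2 = np.vstack([self.W2, np.zeros((1, self.W2.shape[1]))])
            self.err_ema = 0.0
```

Usage follows the single-pass streaming protocol the abstract describes: each example is seen once, via `model.partial_fit(x, y)`, and the structure adapts as the stream progresses.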

Updated: 2020-01-10