Combinatorial designs for deep learning
Journal of Combinatorial Designs ( IF 0.7 ) Pub Date : 2020-05-04 , DOI: 10.1002/jcd.21720
Shoko Chisaki 1 , Ryoh Fuji‐Hara 2 , Nobuko Miyamoto 1

Deep learning is a machine learning methodology using multi-layer neural networks. A multi-layer neural network can be regarded as a chain of complete bipartite graphs. The nodes of the first partita form the input layer and those of the last form the output layer. The edges of each bipartite graph function as weights, which are represented as a matrix. The values of the i-th partita are computed by multiplying the weight matrix by the values of the (i-1)-th partita. Using large amounts of training and teacher data, the weight parameters are estimated little by little. Overfitting (or overlearning) refers to a model that fits the training data too well; such a model then generalizes poorly to new data that were not in the training set. The most popular method to avoid overfitting is called dropout. Dropout sets a random sample of activations (nodes) to zero during the training process. Randomly sampling nodes, however, makes the frequency with which edges are dropped irregular. There is a similar sampling concept in the area of design of experiments. We propose a combinatorial design on the dropout nodes of each partita which balances the frequency of edges. We analyze and construct such designs in this paper.
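The setup in the abstract can be sketched in a few lines: each layer's values are the weight matrix times the previous partita's values, and dropout zeroes a random sample of nodes, which silences every edge incident to them. The sketch below is a minimal, hypothetical illustration (function name, plain-list matrix format, and the per-node Bernoulli dropout are all assumptions for illustration; it is not the authors' construction).

```python
import random

def forward(values, weight_matrices, dropout_rate=0.0, rng=None):
    """Forward pass through a chain of complete bipartite graphs.

    values: list of floats on the first partita (the input layer).
    weight_matrices: weight_matrices[i][j][k] is the weight on the edge
    from node k of the i-th partita to node j of the (i+1)-th partita.
    With dropout, a random sample of nodes in each intermediate partita
    is set to zero, which also drops every edge incident to those nodes.
    """
    rng = rng or random.Random(0)
    for i, w in enumerate(weight_matrices):
        # values of the (i+1)-th partita = weight matrix times
        # values of the i-th partita
        values = [sum(w[j][k] * values[k] for k in range(len(values)))
                  for j in range(len(w))]
        # dropout: zero a random sample of nodes (not the output partita);
        # independent sampling makes edge-drop frequencies irregular,
        # which is what the proposed designs aim to balance
        if dropout_rate > 0 and i < len(weight_matrices) - 1:
            for j in range(len(values)):
                if rng.random() < dropout_rate:
                    values[j] = 0.0
    return values
```

For example, with identity weights and no dropout the input passes through unchanged, while `dropout_rate=1.0` zeroes every intermediate partita and hence the output.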

Updated: 2020-05-04