Exponential Discretization of Weights of Neural Network Connections in Pre-Trained Neural Networks
Optical Memory and Neural Networks, Pub Date: 2020-02-10, DOI: 10.3103/s1060992x19040106. M. Yu. Malsagov, E. M. Khayrov, M. M. Pushkareva, I. M. Karandashev
Abstract
To reduce random-access memory (RAM) requirements and to increase the speed of recognition algorithms, we consider the weight-discretization problem for trained neural networks. We show that exponential discretization is preferable to linear discretization, since it achieves the same accuracy with 1–2 fewer bits. The quality of the VGG-16 network is already satisfactory (top-5 accuracy 69%) with 3-bit exponential discretization. The ResNet50 network shows a top-5 accuracy of 84% at 4 bits. Other networks perform fairly well at 5 bits (top-5 accuracies of Xception, Inception-v3, and MobileNet-v2 were 87%, 90%, and 77%, respectively). With fewer bits, accuracy decreases rapidly.