Named data networking with neural networks for intelligent image processing information systems
Enterprise Information Systems ( IF 4.4 ) Pub Date : 2020-12-06 , DOI: 10.1080/17517575.2020.1856424
Zhengzhou Han 1 , Zhuo Li 1 , Kaihua Liu 1 , Liu Yan 1
ABSTRACT

Rapid retrieval of differentiated name data and efficient storage of information are essential in intelligent image processing information systems. To meet the requirements and challenges of rapid retrieval of differentiated name data and efficient storage of image information across the different application scenarios of the pending interest table (PIT) in named data networking (NDN), an overall learning-based PIT scheme is proposed. This scheme, termed the Learning Tree PIT (LT-PIT), is built on a neural network framework. The LT-PIT index structure improves image-information storage efficiency by learning the distribution of the indexed content in memory. Experimental results show that the overall performance of LT-PIT is better than that of competing schemes: when the number of names reaches 2 million, the memory consumption of the LT-PIT index structure is 253.129 MB, of which 53.129 MB is on-chip memory that can be deployed in high-speed SRAM. The technique is applicable to intelligent image processing information systems and can be used to process image big data in the future.
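The core idea of learning the distribution of indexed content in memory can be illustrated with a minimal learned-index sketch: a simple model predicts where a name's key sits in a sorted table, and a bounded local search corrects any prediction error. All names below (`LearnedPIT`, `name_key`, the linear model) are hypothetical illustrations; the paper's LT-PIT uses a neural-network-based tree structure, which this linear toy only approximates.

```python
import bisect
import hashlib

def name_key(name: str) -> int:
    # Map a hierarchical NDN-style name to an integer key via hashing.
    return int.from_bytes(hashlib.sha1(name.encode()).digest()[:8], "big")

class LearnedPIT:
    """Toy learned index: a linear model predicts the rank of a name's
    key in a sorted array; a bounded search window corrects the guess."""

    def __init__(self, names):
        self.keys = sorted(name_key(n) for n in names)
        n = len(self.keys)
        # Fit position ≈ a*key + b by least squares over (key, rank) pairs.
        xs, ys = self.keys, range(n)
        mx = sum(xs) / n
        my = sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var = sum((x - mx) ** 2 for x in xs)
        self.a = cov / var if var else 0.0
        self.b = my - self.a * mx
        # Record the worst prediction error to bound the local search.
        self.err = max(abs(self._predict(k) - i) for i, k in enumerate(xs))

    def _predict(self, key: int) -> int:
        return int(self.a * key + self.b)

    def lookup(self, name: str) -> bool:
        key = name_key(name)
        pos = self._predict(key)
        # Search only within the model's known error bound.
        lo = max(0, pos - self.err)
        hi = min(len(self.keys), pos + self.err + 1)
        i = bisect.bisect_left(self.keys, key, lo, max(lo, hi))
        return i < len(self.keys) and self.keys[i] == key

pit = LearnedPIT([f"/video/frame/{i}" for i in range(1000)])
print(pit.lookup("/video/frame/42"))   # stored name: found
print(pit.lookup("/video/other/42"))   # unknown name: not found
```

Because the maximum model error is recorded at build time, each lookup touches only a small window of the sorted key array instead of the whole table; this is the sense in which learning the key distribution reduces per-lookup memory traffic.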




Updated: 2020-12-06