Reducing Cumulative Errors of Incremental CP Decomposition in Dynamic Online Social Networks
ACM Transactions on Knowledge Discovery from Data (IF 3.6). Pub Date: 2021-04-21. DOI: 10.1145/3441645
Jingjing Wang 1, Wenjun Jiang 1, Kenli Li 1, Keqin Li 2

CANDECOMP/PARAFAC (CP) decomposition is widely used in online social network (OSN) applications. However, it is inefficient when dealing with massive and incremental data. Incremental CP decomposition (ICP) methods have been proposed to improve efficiency and handle evolving data by updating the decomposition results with the newly added data. ICP methods are efficient but inaccurate, because the approximations made in incremental updating cause serious error accumulation. To promote the wide use of ICP, we strive to reduce its cumulative errors while keeping its high efficiency. We first divide the possible errors in ICP into two types: the cumulative reconstruction error and the prediction error. Next, we formulate two optimization problems for reducing these two errors. Then, we propose several restarting strategies to address the two problems. Finally, we evaluate their effectiveness in three typical dynamic OSN applications. To the best of our knowledge, this is the first work on reducing the cumulative errors of ICP methods in dynamic OSNs.
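
The restart idea described above can be pictured as: run a full CP decomposition once, keep updating it incrementally as new time slices arrive, and fall back to a full recomputation whenever the tracked reconstruction error drifts past a budget. The NumPy sketch below is only an illustration under simplified assumptions, not the paper's algorithms: cp_als, rel_error, and icp_with_restart are hypothetical names, the incremental step simply freezes the non-temporal factor matrices and appends new time-mode rows, and the threshold tau stands in for whatever restart criterion the paper derives.

import numpy as np

def cp_als(X, rank, n_iter=100, seed=0):
    # Plain CP-ALS for a 3-way tensor: each factor matrix is updated by
    # solving a least-squares problem with the other two factors held fixed.
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        A = np.einsum('ijk,jr,kr->ir', X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.einsum('ijk,ir,kr->jr', X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.einsum('ijk,ir,jr->kr', X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

def rel_error(X, A, B, C):
    # Relative reconstruction error ||X - [[A, B, C]]|| / ||X||.
    X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
    return np.linalg.norm(X - X_hat) / np.linalg.norm(X)

def icp_with_restart(X_slices, rank, tau=0.05, n_iter=50):
    # Error-triggered restart policy (hypothetical): update incrementally,
    # but recompute a full CP whenever the cumulative reconstruction error
    # on the accumulated tensor exceeds the budget tau.
    X = X_slices[0]
    A, B, C = cp_als(X, rank, n_iter)
    for X_new in X_slices[1:]:
        # Simplified incremental step: keep A and B frozen and solve only
        # for the time-mode rows corresponding to the new slices.
        G = np.linalg.pinv((A.T @ A) * (B.T @ B))
        C_new = np.einsum('ijk,ir,jr->kr', X_new, A, B) @ G
        C = np.vstack([C, C_new])
        X = np.concatenate([X, X_new], axis=2)   # kept here only to measure error
        if rel_error(X, A, B, C) > tau:          # restart trigger
            A, B, C = cp_als(X, rank, n_iter)
    return A, B, C

The trade-off this sketch exposes is the one the abstract points at: incremental updates are cheap but let error accumulate, while an occasional full recomputation bounds the drift. A practical system would track the error without storing the full accumulated tensor.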

Updated: 2021-04-21