A Boosting Framework of Factorization Machine
International Journal of Pattern Recognition and Artificial Intelligence (IF 1.5), Pub Date: 2021-08-05, DOI: 10.1142/s0218001421590369
Jun Zhou, Longfei Li, Ziqi Liu, Chaochao Chen

Recently, the Factorization Machine (FM) has become increasingly popular in recommender systems due to its effectiveness in capturing informative interactions between features. Usually, the interaction weights are learned as a low-rank weight matrix, formulated as the product of two low-rank factor matrices. This low-rank structure helps improve the generalization ability of the Factorization Machine. However, choosing the rank properly usually requires running the algorithm many times with different ranks, which is clearly inefficient on large-scale datasets. To alleviate this issue, we propose an Adaptive Boosting framework of Factorization Machine (AdaFM), which can adaptively search for a proper rank on different datasets without re-training. Instead of using a fixed rank for the FM, the proposed algorithm gradually increases its rank according to its performance, stopping once the performance no longer improves. Extensive experiments on multiple large-scale datasets validate the proposed method, and the results demonstrate that it can be more effective than state-of-the-art Factorization Machines.
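To make the two ideas in the abstract concrete, the sketch below shows (a) the standard second-order FM prediction, where each pairwise interaction weight is the inner product of two latent vectors (computed with the usual O(nk) identity), and (b) the rank-growth stopping rule. This is a minimal illustrative sketch, not the authors' AdaFM implementation: the `train_eval` callback, `max_rank`, and the tolerance `tol` are assumptions for illustration, and the actual AdaFM grows the rank inside a boosting framework without re-training from scratch.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order FM prediction for one sample x (shape (n,)).

    y(x) = w0 + <w, x> + sum_{i<j} <V[i], V[j]> x_i x_j,
    where the pairwise term is computed via the O(n*k) identity
    0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ].
    """
    linear = w0 + w @ x
    s = V.T @ x                    # (k,): per-factor weighted sums
    s2 = (V ** 2).T @ (x ** 2)     # (k,): per-factor squared sums
    return linear + 0.5 * np.sum(s ** 2 - s2)

def adaptive_rank_search(train_eval, max_rank=32, tol=1e-3):
    """Illustrative rank-growth loop: increase the rank while the
    validation score still improves by at least `tol`.

    `train_eval(k)` is an assumed callback that trains (or extends)
    an FM of rank k and returns its validation score.
    """
    best_score, best_k = -np.inf, 0
    for k in range(1, max_rank + 1):
        score = train_eval(k)
        if score - best_score < tol:   # performance stopped growing
            break
        best_score, best_k = score, k
    return best_k, best_score
```

For example, if the validation score saturates at some rank, `adaptive_rank_search` returns that rank instead of scanning all candidates, which is the efficiency gain the abstract claims over re-running FM once per rank.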
