
Item Cold-Start Recommendation with Personalized Feature Selection

  • Regular Paper
  • Published:
Journal of Computer Science and Technology

Abstract

The problem of recommending new items to users (often referred to as item cold-start recommendation) remains a challenge due to the absence of users’ past preferences for these items. Item features from side information are typically leveraged to tackle the problem. Existing methods formulate regression models that take item features as input and user ratings as output; such models suffer from overfitting when item features are high-dimensional, which greatly degrades recommendation quality. In this work, given high-dimensional item features, we opt for feature selection to solve the problem of recommending top-N new items. Existing feature selection methods find a common set of features for all users, which fails to differentiate users’ preferences over item features. To personalize feature selection, we propose to select item features discriminately for different users, and we study this personalization at the level of either individual users or user groups. We fulfill the task by proposing two embedded feature selection models. The process of personalized feature selection filters out the dimensions that are irrelevant to recommendation or unappealing to a user. Experimental results on real-life datasets with high-dimensional side information reveal that the proposed method is effective in singling out features that are crucial to top-N recommendation and hence in improving recommendation performance.
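
To make the abstract concrete, the sketch below illustrates one possible (and deliberately simplified) form of per-user embedded feature selection: each user gets an L1-regularized linear scoring function over item side-information features, so that dimensions irrelevant or unappealing to that user are driven to zero, and cold-start items are then ranked per user. This is an assumption-laden illustration, not the paper's two proposed models; the function names and toy data are hypothetical.

```python
# Minimal sketch of personalized (per-user) embedded feature selection for
# item cold-start top-N recommendation. NOT the paper's models; an L1-regularized
# per-user linear scorer is assumed purely for illustration.
import numpy as np
from sklearn.linear_model import Lasso

def fit_user_selectors(X_train, R_train, alpha=0.1):
    """Fit one sparse linear model per user.

    X_train : (n_items, n_features) item side-information features.
    R_train : (n_users, n_items) observed feedback (ratings or 0/1 interactions).
    Returns a (n_users, n_features) weight matrix; zero entries are the
    features deselected for that user.
    """
    n_users, n_features = R_train.shape[0], X_train.shape[1]
    W = np.zeros((n_users, n_features))
    for u in range(n_users):
        model = Lasso(alpha=alpha, max_iter=5000)
        model.fit(X_train, R_train[u])   # regress user u's feedback on item features
        W[u] = model.coef_               # sparse weights = user u's selected features
    return W

def recommend_new_items(W, X_new, top_n=10):
    """Score unseen (cold-start) items for every user and return top-N item indices."""
    scores = W @ X_new.T                 # (n_users, n_new_items)
    return np.argsort(-scores, axis=1)[:, :top_n]

# Toy usage with random data, only to show the shapes involved.
rng = np.random.default_rng(0)
X_train = rng.random((50, 20))                        # 50 warm items, 20-dim features
R_train = (rng.random((8, 50)) > 0.7).astype(float)   # 8 users, binary feedback
X_new = rng.random((15, 20))                          # 15 cold-start items
W = fit_user_selectors(X_train, R_train)
print(recommend_new_items(W, X_new, top_n=5))
```

Group-level personalization (the paper's second setting) could analogously be sketched by clustering users and fitting one sparse model per user group, trading some personalization for more training data per model.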



Author information

Corresponding author

Correspondence to Xiang Zhao.

Electronic supplementary material

ESM 1

(PDF 287 kb)


About this article


Cite this article

Chen, YF., Zhao, X., Liu, JY. et al. Item Cold-Start Recommendation with Personalized Feature Selection. J. Comput. Sci. Technol. 35, 1217–1230 (2020). https://doi.org/10.1007/s11390-020-9864-z
