Fake online reviews: Literature review, synthesis, and directions for future research

https://doi.org/10.1016/j.dss.2020.113280

Highlights

  • The antecedent–consequence–intervention (ACI) conceptual framework is proposed.

  • A review of fake online reviews is presented.

  • Twenty future research questions are identified.

  • Eighteen propositions are proposed to guide the future research.

  • The publicly available datasets are compiled and summarized.

Abstract

Fake online reviews in e-commerce significantly affect online consumers, merchants, and, as a result, market efficiency. Despite scholarly efforts to examine fake reviews, the literature still lacks a survey that systematically analyzes and summarizes their antecedents and consequences. This study proposes an antecedent–consequence–intervention conceptual framework to develop an initial research agenda for investigating fake reviews. Based on a review of the extant literature on this issue, we identify 20 future research questions and suggest 18 propositions. Notably, research on fake reviews is often limited by a lack of high-quality datasets. To alleviate this problem, we comprehensively compile and summarize the existing public datasets related to fake reviews. We conclude by presenting the theoretical and practical implications of the current research.

Introduction

Online reviews form a critical and unavoidable facet of e-commerce. These reviews have a significant impact on consumers' purchase decisions as well as on the amount of money consumers spend. As e-commerce grows, however, so does the prevalence of fake online reviews (from here on, "fake reviews"). Estimates of the proportion of fake reviews range from 16% [1] and 20% [2] to 25% [3] and 33.3% [4]. As early as 2012, approximately 10.3% of online products were subjected to review manipulation [5].

There also exist infamous cases that demonstrate the seriousness of fake reviews in e-commerce. For example, in 2012, the UK Advertising Standards Authority found TripAdvisor to be involved in creating fake reviews: approximately 50 million online reviews on its site could not be verified as trusted [6]. In 2013, Samsung was ordered to pay a fine of $340,000 by the Taiwan Federal Trade Commission for posting negative fake reviews against its competitor HTC [7]. In 2015, Amazon sued 1114 unidentified people for posting fake reviews [8]. In 2018, Mafengwo.com, a famous tourism platform in China, was involved in review fraud, which included activities such as duplicating online reviews from competitors; the platform subsequently admitted to the issue with fake reviews [9].

Both industry and academia have expended serious efforts in detecting fake reviews and penalizing their perpetrators in order to restrict the prevalence of fake reviews. Governments, for instance, continually attempt to perfect legal systems in response to the crisis of fake reviews, often taking extensive measures to supervise online sellers and platforms. In 2013, the Attorney General of the State of New York spearheaded "Operation Clean Turf," a year-long undercover investigation to identify and expose firms that create fake reviews [10]. In 2018, China enacted the first "E-commerce Law." The law stipulates that merchants cannot conduct false or misleading commercial promotions to defraud or mislead consumers with fictional transactions, by fabricating online reviews, or through any other means [11]. Industry professionals and scholars continue to develop algorithms to detect fake reviews to assist in this process [[12], [13], [14], [15], [16], [17], [18], [19], [20]].

However, when only external efforts such as those above are considered, their effect on reducing fake reviews is unclear. Developing computerized algorithms to identify fake reviews predominantly emphasizes "treatment" of the "symptom." Such measures seldom grasp the underlying causes and mechanisms of fake reviews. Many fake reviews persist despite algorithms with high detection rates, because savvy promulgators frequently post fake reviews endowed with new features that evade filter detection.
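
To make the detection approach concrete, the following is a minimal, illustrative sketch of a text-feature-based classifier built with scikit-learn; it is not the method of any reviewed study, and the review texts, labels, and feature choices are assumptions for illustration only.

    # A minimal sketch of a text-feature-based fake-review classifier.
    # The corpus below is hypothetical; real studies use much larger labeled
    # datasets and often add behavioral features (rating patterns, timing, etc.).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    texts = [
        "Best hotel ever, amazing, amazing, must stay here!",   # label 1 = fake
        "Room was clean but the wifi kept dropping at night.",  # label 0 = genuine
    ]
    labels = [1, 0]

    # TF-IDF word unigrams/bigrams capture linguistic cues (extreme terms,
    # repetition) that many detection studies rely on.
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=1),
        LogisticRegression(max_iter=1000),
    )
    model.fit(texts, labels)
    print(model.predict(["Absolutely perfect, five stars, best purchase of my life!"]))

The sketch also illustrates the evasion problem noted above: once spammers learn which textual cues such a filter relies on, they can write reviews that avoid them.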

Fake reviews seriously affect the development of online product reviews and their stakeholders. Because such reviews can have a significant effect on product perception, many vendors, retailers, and platforms often manipulate online reviews [[21], [22], [23]]. Online sellers tend to publish positive fake reviews for their own products or negative fake reviews against competitors' products for financial gain [24]. Platforms are inclined to acquiesce to review manipulation and add fake reviews to increase traffic and consumer engagement [23]. Opportunity seeking is one example of why individuals post fake reviews [4].

Compared with genuine online reviews, fake reviews tend to be more influential and complex in structure, impressing upon us the need to rigorously explore them. There is a need for a historiography of the research on fake reviews, which would offer insights into future research prospects. Despite recent and increasing research, scholars are yet to clearly define fake reviews. For example, what types of fake reviews have already been explored by the extant studies? Further, how do we conduct more interesting studies on fake reviews? Research on fake reviews is also limited by the available datasets; in fact, there is little information on public datasets that could be used to study fake reviews.

Systematic literature reviews of fake reviews are extremely limited. A rare review on false information does mention three types of false information: fake reviews in e-commerce, hoaxes on collaborative platforms, and fake news in social media. However, it does not singularly focus on fake reviews [25]. To the best of our knowledge, there exist nine relevant reviews on this topic [[12], [13], [14], [15], [16], [17], [18], [19], [20]]. These works, however, largely explore the detection of fake reviews, without an overall analysis of the antecedents and consequences of fake reviews. In the present study, we address this gap in the literature in order to guide future research.

The remainder of this paper is structured as follows. Section 2 presents the working definition of fake reviews, the research design, and general quantitative findings from the reviewed literature. It also puts forth the analytical framework of this study, based on which we thoroughly investigate the existing literature to understand the antecedents and consequences of fake reviews in Section 3. We then identify future research questions and propose propositions in Section 4. Section 5 summarizes and analyzes the dataset resources, followed by a conclusion in Section 6.

Section snippets

Definition of “fake review”

So far, there has been no universally accepted definition of fake reviews. Hu et al. define review fraud as the act of vendors, publishers, writers, or any third party monitoring online reviews and posting non-authentic online reviews as real customers in order to boost product sales [5]. In this definition, fake reviews are mainly adopted by online merchants, such as vendors, publishers, and retailers, for profit maximization. Similarly, Banerjee and Chua define fake reviews in tourism as

Why: reasons causing fake reviews

The essential reason for manipulating reviews is pecuniary motivation. Studies confirm that online product reviews affect consumers' purchase decisions [15,32], product reputations [33,34], sales volumes, and merchants' profits [35]. For instance, a 1% increase in hotel review ratings may increase sales per room by approximately 2.6% [22]. An extra half-star rating causes restaurants to sell out 19 percentage points more frequently [36].
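
As a back-of-the-envelope illustration of these effect sizes, the baseline hotel figures below are hypothetical; only the 2.6% elasticity comes from the cited study [22].

    # Hypothetical hotel: 100 rooms, $150 average nightly rate, 70% occupancy.
    rooms, rate, occupancy = 100, 150.0, 0.70
    baseline_revenue = rooms * rate * occupancy   # $10,500 per night
    lift = 0.026                                  # ~2.6% per 1% rating increase [22]
    print(baseline_revenue * lift)                # ~$273 of extra revenue per night

Even at this modest scale, the implied payoff helps explain why sellers are tempted to manipulate ratings.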

Another dominant motivation for posting fake reviews

Future research questions and propositions

We now identify 20 future research questions and 18 corresponding propositions (Table 1) to better guide future research.

Data sources

Fake reviews are a highly practical research topic, and hence data sources are very important for studying them. However, high-quality datasets are limited, as accurately identifying fake reviews is a challenging task [20]. Generally, we can acquire the required datasets through three approaches.

First, online review fact-checking systems such as Fakespot (https://www.fakespot.com/) offer a way to spot fake reviews. To obtain the datasets, scholars appoint human annotators to manually create
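
Where this human-annotation route is taken, a minimal sketch of aggregating manual labels and checking their consistency might look as follows; the file name, column names, and two-annotator setup are assumptions for illustration, not details from the reviewed studies.

    import pandas as pd
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical file: one row per review, one "fake vs. genuine" label per annotator.
    df = pd.read_csv("annotated_reviews.csv")  # columns: review_text, annotator_1, annotator_2

    # Inter-annotator agreement indicates how reliable the manual labels are.
    kappa = cohen_kappa_score(df["annotator_1"], df["annotator_2"])
    print(f"Cohen's kappa: {kappa:.2f}")

    # Keep only reviews on which both annotators agree, as a conservative gold-standard set.
    gold = df[df["annotator_1"] == df["annotator_2"]]
    gold.to_csv("gold_labels.csv", index=False)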

Conclusions

A critical component of any new research venture is the timely establishment of a referential collection of the literature and its forward-looking analyses—the research on fake reviews is no exception. Although studies do explore various aspects of fake reviews, they still fail to comprehensively grasp this issue. For example, the majority of the extant studies only focus on fake reviews posted by merchants, while disregarding reviews from individual consumers and review platforms. Thus, a

CRediT authorship contribution statement

Yuanyuan Wu: Conceptualization, Methodology, Data curation, Formal analysis, Writing - original draft. Eric W.T. Ngai: Methodology, Supervision, Funding acquisition, Writing - review & editing. Pengkun Wu: Conceptualization, Software, Funding acquisition, Writing - review & editing. Chong Wu: Supervision, Project administration, Validation.

Acknowledgments

The authors are grateful for the constructive comments of the three anonymous referees on an earlier version of this paper. Yuanyuan Wu was supported in part by the Joint PhD Programmes (PolyU-HIT) leading to Dual Awards. Pengkun Wu was supported in part by the Humanities and Social Sciences Fund of Ministry of Education of China, the Natural Science Foundation of Hunan Province (2019JJ50403) and the Fundamental Research Funds for the Central Universities.


References (198)

References marked with an asterisk (*) indicate the articles included in the systematic review.

  • *N. Hu et al., Fraud detection in online consumer reviews, Decis. Support. Syst. (2011)
  • *L. Chen et al., Detection of fake reviews: analysis of sellers’ manipulation behavior, Sustainability (2019)
  • *N. Feng et al., Effects of review spam in a firm-initiated virtual brand community: evidence from smartphone customers, Inf. Manag. (2018)
  • *T. Zhang et al., Welfare economics of review information: implications for the online selling platform owner, Int. J. Prod. Econ. (2017)
  • *W. Ahmad et al., Modeling consumer distrust of online hotel reviews, Int. J. Hosp. Manag. (2018)
  • I. Pranata et al., Are the most popular users always trustworthy? The case of Yelp, Electron. Commer. Res. Appl. (2016)
  • *Y.K. Huang et al., Judgment criteria for the authenticity of internet book reviews, Libr. Inf. Sci. Res. (2012)
  • *T. Ong et al., Linguistic characteristics of shill reviews, Electron. Commer. Res. Appl. (2014)
  • *S. Moon et al., Estimating deception in consumer reviews based on extreme terms: comparison analysis of open vs. closed hotel reservation platforms, J. Bus. Res. (2019)
  • *D. Savage et al., Detection of opinion spam based on anomalous rating deviation, Expert Syst. Appl. (2015)
  • *A. Heydari et al., Detection of fake opinions using time series, Expert Syst. Appl. (2016)
  • *Y. Ren et al., Neural networks for deceptive opinion spam detection: an empirical study, Inf. Sci. (2017)
  • *W. Zhang et al., CoFea: a novel approach to spam review identification based on entropy and co-training, Entropy (2016)
  • *C. Sun et al., Exploiting product related review features for fake review detection, Math. Probl. Eng. (2016)
  • *L. Li et al., Document representation and feature combination for deceptive spam review detection, Neurocomputing (2017)
  • *M. Luca et al., Fake it till you make it: reputation, competition, and Yelp review fraud, Management Science (2016)
  • *M. Schuckert et al., Insights into suspicious online ratings: direct evidence from TripAdvisor, Asia Pacific Journal of Tourism Research (2016)
  • T. Reporter, TripAdvisor told to stop claiming reviews are ‘trusted and honest’
  • D. Bates, Samsung ordered to pay $340,000 after it paid people to write negative online reviews about HTC phones
  • A. Gani, Amazon sues 1,000 ‘fake reviewers’
  • R. Zhao, Mafengwo accused of faking 85% of all user-generated content
  • L. Whitney, Companies to pay $350,000 fine over fake online reviews
  • CIRS, China's first e-commerce law published
  • *R.K. Dewang et al., State-of-art approaches for review spammer detection: a survey, J. Intell. Inf. Syst. (2018)
  • *M. Crawford et al., Survey of review spam detection using machine learning techniques, Journal of Big Data (2015)
  • *A. Rastogi et al., Opinion spam detection in online reviews, J. Inf. Knowl. Manag. (2017)
  • *N. Hussain et al., Spam review detection techniques: a systematic literature review, Appl. Sci. (2019)
  • *D.U. Vidanagama et al., Deceptive consumer review detection: a survey, Artif. Intell. Rev. (2020)
  • *Y. Ren et al., Learning to detect deceptive opinion spam: a survey, IEEE Access (2019)
  • *U. Aslam et al., A survey on opinion spam detection methods, Int. J. Sci. Technol. Res. (2019)
  • *S. Gössling et al., The manager’s dilemma: a conceptualization of online review manipulation strategies, Curr. Issue Tour. (2018)
  • *S.Y. Lee et al., Sentiment manipulation in online platforms: an analysis of movie tweets, Prod. Oper. Manag. (2018)
  • *Z. Wang et al., GSLDA: LDA-based group spamming detection in product reviews, Appl. Intell. (2018)
  • S. Kumar et al., False Information on Web and Social Media: A Survey (2018)
  • *S. Banerjee et al., Theorizing the textual differences between authentic and fictitious reviews: validation across positive, negative and moderate polarities, Internet Res. (2017)
  • *K.M. Hunt, Gaming the system: fake online reviews v. consumer law, Computer Law & Security Review (2015)
  • *E.T. Anderson et al., Reviews without a purchase: low ratings, loyal customers, and deception, J. Mark. Res. (2014)
  • *P. Sudhakaran et al., A framework investigating the online user reviews to measure the biasness for sentiment analysis, Asian Journal of Information Technology (2016)
  • J. Mathieu et al., Team effectiveness 1997–2007: a review of recent advancements and a glimpse into the future, J. Manag. (2008)

    Yuanyuan Wu is currently a joint-PhD student at The Hong Kong Polytechnic University and Harbin Institute of Technology. Her research interests are fake reviews, fake news, and E-commerce. She has published papers in some international journals, such as Decision Support Systems, Applied Mathematical Modelling, and Social Indicators Research.

    Eric W. T. Ngai is a Professor in MIS at Department of Management and Marketing at the Hong Kong Polytechnic University. His research interests are in the areas of E-commerce, Decision Support Systems, RFID research and Social Media Technology and Applications. He has over 130 journal publications in a number of international journals including MIS Quarterly, Journal of Operations Management, Decision Support Systems, European Journal of Operational Research, IEEE Transactions on Systems, Man and Cybernetics, Information & Management, Production & Operations Management, and others.

    Pengkun Wu is an Associate Professor in the Business School at Sichuan University. He received two PhD degrees from The Hong Kong Polytechnic University (2018) and Harbin Institute of Technology (2019). His research interests are fake reviews, fake news, E-commerce, and spatial crowdsourcing. He has published over 10 papers in some international journals including Decision Support Systems, International Journal of Production Research, Applied Mathematical Modelling, Journal of the Operational Research Society, and others.

    Chong Wu is a Professor in the School of Economics and Management at Harbin Institute of Technology. His research interests are fuzzy mathematics and decision science. He has over 170 journal publications in a number of international journals including IEEE Transactions on Knowledge and Data Engineering, Journal of the Association for Information Science and Technology, Information Science, Fuzzy Sets and Systems, Expert Systems with Applications and others.

