Inconsistency Investigation between Online Review Content and Ratings
Type of Work: 10 pages
Citation of Original Publication: Guohou Shan, Dongsong Zhang, Lina Zhou, Lingge Suo, Jaewan Lim, Chunming Shi, eBusiness and eCommerce Digital Commerce (SIGeBIZ), 2018
Rights: This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please contact the author.
Despite the important role of online consumer reviews (OCRs) in facilitating consumer purchase decisions, potential inconsistency between product ratings and review content can cause uncertainty and confusion among prospective consumers considering a product. This research aims to investigate such inconsistency so as to better assist potential consumers in making purchase decisions. First, this study extracted a reviewer's sentiment from review text via sentiment analysis. Then, it examined the correlation and inconsistency between product ratings and review sentiments using Pearson correlation coefficients (PCC) and box plots. Next, we compared inconsistency patterns between fake and authentic reviews. Based on an analysis of 24,539 Yelp reviews, we find that although ratings and sentiments are highly correlated, the inconsistency between the two is more salient in fake reviews than in authentic reviews. The comparison also reveals different inconsistency patterns between the two types of reviews.
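The rating–sentiment comparison described above can be sketched as follows. This is a minimal illustration, not the authors' code: the paper does not specify its sentiment-analysis tool, so the sentiment scores below are hypothetical values in [-1, 1], and the PCC is computed directly from its definition over toy data standing in for the Yelp sample.

```python
from statistics import mean, stdev

def pcc(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    # Sample covariance divided by the product of sample standard deviations
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Toy data: star ratings (1-5) and hypothetical sentiment scores
# extracted from the corresponding review text.
ratings = [5, 4, 1, 2, 5, 3]
sentiments = [0.9, 0.6, -0.8, -0.4, 0.7, 0.1]

print(round(pcc(ratings, sentiments), 3))  # → 0.988
```

A high PCC on a corpus, as in the paper's finding, still leaves room for individual reviews whose rating and sentiment diverge; those per-review gaps are what the box-plot comparison between fake and authentic reviews examines.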