Q&A: The Influence of Online Movie Reviews
Once upon a time, box office sales were the make-or-break bellwether in the movie business. Studios were eager to shepherd their audiences to the big screen, where, the Columbia Journalism Review estimates, a film needs to bring in $74 million just to break even.
But with theater audiences declining, the competition for viewers' attention on TVs, laptops and phones has only grown. For many consumers, making a decision on movie night starts with combing online ratings and reviews.
“Social media platforms have redefined the way consumers get information,” says Tianjie Deng, an assistant professor in the Department of Business Information and Analytics at the Daniels College of Business. “Every player in business is trying to leverage the power of these platforms in an effort to maximize their profit and exposure. As a result, social platforms offer a variety of reviews simultaneously: user ratings, user comments, critic ratings and critic comments. But we still don’t have a full picture of how exactly each of these different forms of reviews impacts actual sales.”
In other words, which matters more to consumers? Numeric ratings? Written reviews? Do they trust the critics or their peers? To find the answer, Deng focused her latest research on the popular Rotten Tomatoes site, mining more than 180,000 user reviews and nearly 13,000 critic reviews in the process.
With the 93rd Academy Awards just around the corner, Deng answered a few questions from the DU Newsroom about the power and influence of online movie reviews.
First of all, why do reviews matter?
Social media platforms have really redefined the way consumers get information. Instead of hearing passively from a friend of a friend that a movie was good, now you can go online and see what the 10,000 people who saw the movie thought. Not only that, but these platforms have begun to tailor profiles, so you are all but guaranteed to see the reviews of a person who buys what you buy and watches the same TV shows you watch — all in an effort to make the opinions you see much more compelling and influential. A survey conducted by Dimensional Research shows that nearly 90% of consumers who read online reviews use them to assist their purchase decisions. Specifically, in the movie industry, research has confirmed the economic impact of online reviews. Therefore, a good understanding of how exactly these reviews impact sales, as well as the magnitude of such impacts, would have valuable implications for movie distributors, online rating platforms and consumers themselves.
You note in your research that many people consider critics obsolete in an internet age of consumer reviews. Did you find that to be true?
Quite the contrary. Readers may not care so much about the numeric ratings critics give to a movie, but they do pay attention to the textual content of their reviews. The sentiments in such texts are positively related to movie sales. So as a movie distributor, you want the critics to say positive things in their reviews to persuade moviegoers to choose your movie. Maybe this is because the moviegoing public has come to expect that critics are “tough nuts to crack” and not so easily swayed by fluff, so that when the critics endorse something, it must be good.
This notion actually inspired other research I am currently conducting: Are critics truly unbiased? There is already evidence of a “herding behavior” in the user reviewer community: A user tends to herd with their online friends when rating movies. Does this effect also exist within the critic reviewer community?
What are some of the key differences between user reviews and critic reviews?
In general, users give more positive reviews — higher numeric ratings and a more positive tone in their textual comments — while critics tend to offer reviews with lower ratings and more neutral sentiment. That raised a very interesting question for me: As a consumer, how does this discrepancy impact your decision? Is there a premium placed on the consistency found in critics’ reviews, or is it preferable to refer to people more like yourself? Moreover, is the impact of this bias between critics and users strong enough to impact sales?
Why do you think users leave more positive reviews than critics?
There could be a few explanations. One is that users only review movies that they choose to watch, which already reflects a positive sentiment toward the movie. Another, advanced by some scholars, is that negative evaluators are more likely to be viewed as more intelligent and competent than positive evaluators. As a result, critic reviewers may tend to provide more negative reviews in order to be considered more knowledgeable and professional.
Is there any difference in the impact of the reviews on consumers and/or sales?
Yes. While users influence sales through their numeric ratings, critics impact sales through their textual comments. This shows that consumers do use both kinds of reviews, but they selectively rely on certain aspects of each. This makes sense: the sheer number of user reviews makes each written review less individually informative, but it improves the reliability of the aggregate numeric rating compared with critics. Readers mainly use numeric ratings from users as heuristic quality signals, while relying on critics’ textual narratives to identify the rationale and sentiment behind their assessments. As a result, critic reviews and user reviews both impact sales, though through different forms.
What are the practical implications of your research?
The findings in this research can provide valuable insights into social media marketing strategies for movie distributors and studios. Since the study confirms the economic value of both critic reviews and user reviews, movie distributors can consider increasing their investment in social media marketing to boost their sales. Also, movie production companies can better assess whether a movie will deliver a profitable return on investment. The different sales impacts of critic reviews and user reviews found in this study can guide distributors to prioritize their investments more strategically. They may want to provide incentives to their two major groups of online word-of-mouth contributors — critics and users — to encourage them to focus on different forms of reviews and recommendations. For example, when promoting a movie, studios and distributors should pay close attention to what critics say, rather than how they rate.
One example of a change that movie production companies could implement based on my findings is in the production of trailers. Frequently, movie previews and trailers feature quotes from prominent critics and their star ratings. I would contend that the companies designing these trailers should also include the user ratings from IMDb, Rotten Tomatoes and similar sites alongside the quotes from critics when promoting their movie.
For review platform managers, my findings can help them enhance their reviewers’ contributions and improve the quality of the reviews generated by both users and critics. These platforms should also consider providing different metrics from critic reviews and user reviews, to help their audiences better process them. Many review platforms already offer aggregated numeric ratings from consumer reviews. At the same time, they can consider extracting keywords, overall sentiment and sentiment associated with different features of movies from critic reviews, to reduce information seekers’ information-processing effort and improve their experience using the platform.
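To give a concrete sense of what "extracting overall sentiment" from a critic's text could look like, here is a minimal illustrative sketch in Python. It uses a tiny hypothetical word lexicon and simple counting — this is not Deng's methodology or any platform's actual algorithm, just a toy example of the general idea of scoring review text rather than relying on a star rating.

```python
import re

# Hypothetical mini-lexicons; a real system would use a large,
# validated sentiment dictionary or a trained model.
POSITIVE = {"brilliant", "masterful", "gripping", "moving", "superb"}
NEGATIVE = {"dull", "bloated", "tedious", "forgettable", "clumsy"}

def sentiment_score(review: str) -> float:
    """Score in [-1, 1]: +1 all positive sentiment words, -1 all negative,
    0 if the text contains no lexicon words at all."""
    words = re.findall(r"[a-z']+", review.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

# Two positive hits ("gripping", "superb") and one negative ("dull"):
print(sentiment_score("A gripping, superb thriller that is never dull."))  # 0.3333...
```

A platform could surface a score like this next to a critic quote, letting readers see at a glance whether the written narrative leans positive or negative without reading every review in full.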