A collection of five-star ratings can do amazing things for a business. It’s not all that surprising, then, that hotels, restaurants, local services, product manufacturers and even doctors and dentists are known to plant glowing reviews, pay for perfect ratings and otherwise manipulate the system. What can be done to stop them?
A recent New York Times story about how one manufacturer gave away free tablet cases to customers who posted reviews of the product on Amazon brought to light one of the many shady ways that businesses game online reviews and ratings to their benefit — and to the detriment of unsuspecting consumers.
The net result is that while the Internet should increase transparency and give shoppers access to loads of information and the honest, unbiased assessments of the masses, consumers often feel engulfed in a murky Web that’s not entirely trustworthy.
Just how prevalent are fake online reviews? One indication is how overtly some businesses pay for them. Fake Review Writer is one of the categories of gigs listed at Freelancer.com. The Web page doesn’t beat around the bush, boldly encouraging businesses to “Outsource fake review writer projects!” Nobody’s getting rich writing fake reviews; the pay might be as little as $1 for 500 words. But online ratings and reviews affect consumer perceptions as well as how high a business appears in online search results, both of which can translate into big money.
A sample listing, for a writer to post fake reviews of products on Amazon, requests that job candidates refrain from plagiarism — which is an easy way to get flagged by website security software — and “be a native English Speaker, or have EXCELLENT knowledge of the language.” As for the job:
Topics are varied, your main task is to create a believable review, with input from your own personal perspective. I DON’T want a simple rewrite of what Amazon or other product sites have posted. I will also give a “template” of what the review should basically look like.
Bing Liu, a University of Illinois at Chicago computer-science professor, has made a habit of calling attention to fake online reviews. He is one of several experts trying to develop sophisticated detection software to rid the Web of “opinion spam,” as he calls it, which includes fake reviews, fake comments and fake blogs. Liu has estimated that for some products, it’s a safe bet that 30% of the reviews are fake. As things now stand, it’s easy to see why businesses are interested in pumping up online ratings, even if they have to resort to fraud. Liu told the Times:
“More people are depending on reviews for what to buy and where to go, so the incentives for faking are getting bigger,” said Mr. Liu. “It’s a very cheap way of marketing.”
A group of Cornell researchers is also working on algorithms that would out fake reviews. In their study, “Finding Deceptive Opinion Spam by Any Stretch of the Imagination,” researchers used their software to sift through a pool of hotel reviews — 400 fake, 400 real. The software detected the fakes 90% of the time, while a group of humans challenged with naming the fakes was correct only slightly more than half the time.
One of the report’s authors, Myle Ott, spoke with me last week and explained that the software takes note of subtle signs that most people overlook. “Truthful reviews tend to have more punctuation, such as dollar signs, which indicate a specific detail that would only be known to someone who has been there,” he said. “There are also more specific details, like the hotel location or that the room was small or large.”
Fake reviews, by contrast, tended to have more superlatives and adverbs in the writing (makes sense) and more details that were “external to the hotel,” such as whom the reviewer was traveling with. The fakes were also filled with pronouns, rather than proper names — because someone who had never been to a hotel wouldn’t know the name of the bellman or the woman at the front desk.
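The cues Ott describes lend themselves to simple counting. Below is a toy sketch of that idea in Python; it is emphatically not the Cornell classifier, which trained a machine-learning model over many features, and the word lists are illustrative assumptions rather than anything from the study.

```python
import re

# Illustrative word lists -- assumptions for this sketch, not taken
# from the Cornell paper.
SUPERLATIVES = {"amazing", "best", "perfect", "wonderful", "incredible"}
PRONOUNS = {"i", "me", "my", "we", "us", "our"}

def deception_signals(review: str) -> dict:
    """Count a few crude cues: concrete specifics (e.g. '$' amounts)
    lean truthful; superlatives and heavy pronoun use lean deceptive."""
    words = re.findall(r"[a-z']+", review.lower())
    return {
        "dollar_signs": review.count("$"),  # concrete-detail cue
        "superlatives": sum(w in SUPERLATIVES for w in words),
        "pronouns": sum(w in PRONOUNS for w in words),
        "word_count": len(words),
    }

print(deception_signals("My wife and I had the most amazing, perfect stay ever!"))
print(deception_signals("Room 412 was small but clean; parking cost $30 a night."))
```

A real system would feed hundreds of such features, including word and punctuation frequencies, into a trained classifier rather than eyeballing raw counts.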
While no major websites are using Cornell’s software, anyone who is interested can give it a try at ReviewSkeptic.com. It’s still marked Beta, but you can cut and paste any online review and the software will instantly do an assessment and state whether the review is truthful or fake. The site notes that it “works best on English hotels” and that it’s “currently offered for entertainment purposes only.”
The Cornell team’s latest research, which Ott wasn’t fully at liberty to discuss, expands well beyond online hotel reviews. It was prompted in part by the team’s discovery that fake online reviews are being solicited by doctors.
“That was really disturbing,” said Ott. “The worst thing that could happen because of some fake reviews at a hotel site is that you spend the night in a bad hotel. But doctors? We’re talking really high stakes.”
According to Ott’s research, doctors’ reviews at least don’t have the highest prevalence of deception online. That ignominious distinction belongs to sites such as Yelp and TripAdvisor, which allow anyone to post a review without requiring proof that the reviewer has actually done business with the hotel or restaurant being rated. (Some sites, including Hotels.com and Priceline, accept reviews only from customers who have booked rooms.) Ott says that at the sites with the highest rates of deception, around 10% of reviews are flagged as fake by the software.
TripAdvisor, which, interestingly enough, last fall changed the slogan in its online reviews section from “Reviews you can trust” to “Reviews from our community,” responded to the Cornell team’s findings by pointing to a survey it commissioned in which 98% of respondents said TripAdvisor hotel reviews accurately reflected their experience. The company also told me that “attempts to manipulate TripAdvisor are extremely rare,” and that “we have a zero-tolerance approach to all fraudulent activity and we have measures to penalize businesses for attempts to game the system, including affecting their popularity rating on the site and posting public warning notices on properties that have made attempts to manipulate their rating.”
Brad Tuttle is a reporter at TIME.