Five Stars, or Five Lies?

Retailers are cracking down on fake product reviews, which analysts estimate now make up as much as 30% of the total. The question is whether it’s working.

January 20, 2025

“Works great.” “Best product ever!” “We now have three in our home.”

Are these reviews to be trusted? Perhaps not, and several big retailers say they are launching new crackdowns so consumers know what they’re buying. Experts say it’s far from certain how well the efforts will work, but the need is clearly mounting. With fraud exploding, analysts estimate that as many as 30% of reviews may be fake, creating levels of consumer distrust that hurt business. “If reviews aren’t helpful, the retailer loses immediate sales,” says retail expert Craig Rowley, senior client partner at Korn Ferry.

To be sure, consumers have long understood that fake reviews exist. But the game has grown exponentially: Today’s technology allows bad actors to create millions of AI-generated reviews, and US online retail grew 8.7% in 2024 to $240 billion, according to data from Adobe Analytics. For online sellers, reviews are not just a supplementary feature. Many consumers depend on them to quickly ascertain which product to buy. For example, a consumer purchasing a vacuum might scan reviews of three similar models to learn which performs best. If she doesn’t trust the reviews, she might leave the website to seek more information—and the retailer likely loses that sale. “She might go into a store and ask the clerks which model they prefer and why,” says Rowley.

This is a far cry from the early days of faux reviews, when bad actors simply posted faked positive or negative comments. Retailers quickly caught on to their game, and shifted review systems to deter them—creating algorithms to reveal suspicious patterns, monitoring review forums, and removing reviews known to be fakes. These practices were successful enough that they spurred the rise of for-hire fake-review farms, which typically operate out of sight to evade detection. The most proactive retailers seek to identify and prosecute them.

Last year, the Federal Trade Commission entered the fray, with a new rule that prohibits the selling or purchasing of fake reviews, and bans reviews—including AI-generated reviews—that misrepresent the author. Yet the psychological damage has already been done. Once consumers begin to distrust reviews, their belief in the system gradually unravels. “People stop believing any and all of the reviews,” says business psychologist James Bywater, senior client partner at Korn Ferry. “Unless there is a technical fix, the reviews cease to be a useful signal for anyone.”

One seemingly simple safeguard is disallowing reviews that give only stars, without editorial commentary. “That eliminates many of the computer-generated reviews,” says Denise Kramp, senior client partner at Korn Ferry. Other verification measures attempt to confirm valid purchases.

For retailers, the first question is identifying who is supplying the reviews, especially when a range of competing products and retailers exists. “The interesting question is, ‘Who’s paying for that?’” says Rowley. Firms based in countries with no regulations against faux reviews are a common source.

As for how to approach reviews as a consumer, Rowley suggests first getting the overall direction of the reviews—which is easy using AI-powered summaries—then skimming the best and worst of them to suss out why the reviewer liked or disliked a product or experience. For example, a bad waiter is not synonymous with a bad restaurant. “I want details,” he says. “But you have to read judiciously. A great review or three doesn’t mean a great product.”

Learn more about Korn Ferry’s Consumer Markets capabilities.