A Brief Review of Customer Reviews

We’ve recently seen evidence that more companies are creating space on their sites for customer reviews. According to Alterian, 74% of Americans use online comparison sites to gather product information. While this seems like a straightforward case of increasing adoption, we think it might signal a shift in how people use reviews.

While writing Liminal, we saw that customer reviews did not perform well with respect to the six engagement elements. People are becoming skeptical of reviews and no longer trust what they read. Those of us who read customer reviews regularly expect to find the glowing alongside the devastating. Evidence from some industries indicates that people with very good or very bad experiences are among the most likely to post in the first place. Unfortunately, the extremity of these reviews makes them useless to most of us.

Corporate America is starting to pick up on this trend. The article “Amazon Fake Reviews Get More Subversive” shows how companies are taking a new tack in creating fake product reviews. They’ve given up on the “rah-rah” and started simply mentioning the names of their products. To wit:

“I used these with an Onkyo HT6100 HTIB system, and everything worked out fine. I was a little worried at first because the mounts are plastic, but they were plenty strong enough to handle the Onkyo speakers, which are about 12 inches tall. Good mounts at a good price!” – Nick

This is clever for a few reasons. First, it gets the product name out in a positive light, but still sounds natural enough to have come from a real person. Second, it makes us think that the writer has experience with the actual product.

Performing due diligence while shopping means fact-checking anything that you read – period – but that’s what makes these new reviews so devious. The point of product reviews is to give us a shortcut. We hope that somebody with more time and interest in a product has already done the legwork. Just give us the expert opinion! And honestly, who is going to fact-check a product review?

Retailers have little incentive to pull positive reviews, but if we don’t want customers to lose all faith, there needs to be a way to see what’s real and what isn’t. “How useful was this comment to you?” ratings just ask people we don’t know to tell us what they think about opinions from other people we don’t know. Not good. Where does it end?

One potential answer lies in leveraging social graphs to surface reviews from friends, or friends of friends. This circumvents the issue of trust by letting you know exactly where a review came from and how much you should – or shouldn’t – trust it.
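One way this friend-of-friend lookup could work is a breadth-first search over the social graph, surfacing reviews from the closest connections first. This is only a sketch of the idea; the data shapes and the `trusted_reviews` function are illustrative, not any site’s actual API:

```python
from collections import deque

def trusted_reviews(user, friends, reviews, max_depth=2):
    """Surface reviews written by people within `max_depth` hops of `user`.

    friends: dict mapping each person to a set of their friends
    reviews: dict mapping a reviewer to their review text
    Returns (reviewer, review, hops) tuples, closest connections first.
    """
    seen = {user}
    queue = deque([(user, 0)])
    results = []
    while queue:
        person, depth = queue.popleft()
        # Collect any review written by someone in the user's extended circle.
        if person != user and person in reviews:
            results.append((person, reviews[person], depth))
        if depth < max_depth:
            for friend in friends.get(person, set()):
                if friend not in seen:
                    seen.add(friend)
                    queue.append((friend, depth + 1))
    return results
```

For example, if Ann is my friend and Bob is Ann’s friend, `trusted_reviews("me", {"me": {"ann"}, "ann": {"bob"}}, {"bob": "Solid mounts."})` surfaces Bob’s review tagged as two hops away, so I know exactly how far to extend my trust.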

A less-explored avenue would use advanced diagnostic techniques like Cultural Consensus Analysis to validate new reviews against what has already been written – but that, too, could fall prey to the fakers.
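As a rough illustration of the consensus idea – a much-simplified stand-in for full Cultural Consensus Analysis, not the real technique – a new star rating could be checked against the spread of existing ratings and flagged when it sits far outside the consensus:

```python
from statistics import mean, stdev

def consensus_outlier(new_rating, existing_ratings, threshold=2.0):
    """Flag a rating that sits far from the consensus of prior ratings.

    Treats the mean of existing ratings as the consensus answer and flags
    anything more than `threshold` standard deviations away from it.
    """
    if len(existing_ratings) < 2:
        return False  # not enough data to establish a consensus
    mu = mean(existing_ratings)
    sigma = stdev(existing_ratings)
    if sigma == 0:
        # Unanimous prior ratings: anything different breaks consensus.
        return new_rating != mu
    return abs(new_rating - mu) / sigma > threshold
```

A five-star rave landing on a product whose prior ratings cluster around one or two stars would be flagged for a closer look – though, as noted above, a determined faker who studies the existing reviews could write to the consensus and slip through.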

Until then, read carefully, and don’t forget to comment with your great ideas!

2 Comments

  1. Cari
    Posted March 11, 2011 at 11:01 pm

    I think some system of identifying trustworthy reviewers is good, but the problem with social networks is that nobody knows enough people (even counting friends of friends) to cover the millions of products on Amazon. Another interesting approach is using an algorithm to determine which registered reviewers are legitimate and trustworthy. Do they post reviews on a variety of products? Does the distribution of stars they give out suggest they are posting real opinions? Does the timing of their reviews (daily, weekly, 20 in one day) raise suspicions?

    Yelp seems to use an approach like this. Sometimes when I look at their “hidden” reviews it seems like they successfully caught a lot of spam, but there are also a lot of false positives. New reviewers who post one positive review tend to be filtered out, which discourages many people from sticking around and writing more. As a result, the most visible Yelp reviews are written by people who have invested a LOT of time in becoming elite Yelp reviewers. This makes the site more trustworthy, but if an establishment doesn’t cater to the demographics of those elite users, it might not have many elite Yelp users writing positive reviews, while actual customers get filtered out.

    But if the system of treating all users equally is failing because of spam, an algorithm that often gets it wrong might ultimately be better for giving consumers an idea of what real people think.

  2. Posted March 16, 2011 at 1:06 pm

    Thanks, Aryeh and Mark, for this great post. I wanted to bring up another point of view shared by MediaPost this morning on the flip side of ratings and reviews: the detractor. A couple of interesting statistics worth reading:

    By listening and proactively responding on the social web, says the report, retailers have a chance to turn disgruntled customers into social advocates. The survey found that, of those who received a reply in response to their negative review:

    - 33% turned around and posted a positive review.
    - 34% deleted their original negative review.
    - 61% of consumers would be shocked if a retailer responded to their negative comment on the social web.

    Even more of a reason to monitor ratings and reviews. When done right, responding can be a powerful tool.
