It all started with a dog guard
I was looking to buy a dog guard for my car – a random purchase, I know, and not one I'm ever likely to make again. How do you decide which one to go for when there are so many options? Well, you go to Amazon and filter by star rating, of course!
Looking through the top 5 products, it struck me that 3 of them had only one review each (the other two had over 200). Naturally, I was drawn straight to the products with more reviews. When I read those reviews, though, some of them looked a bit strange. One just read “not much memory but still good value”.
Wait a sec… what? Why would a dog guard need memory? There’s something weird going on here…
Written in the stars?
Star ratings have become a powerful tool for helping consumers make purchase decisions. The idea behind them is great: people buy products, then rate them so that others can see what fellow buyers think. Research shows that ‘more stars’ really does lead to ‘more sales’, with a BBC report citing the Competition and Markets Authority, which estimates that £23 billion of annual UK consumer spending is influenced by reviews.
A recent BBC article also points to the psychology of reading reviews.
“Online reviews work because people try to take an “effortless route” when they have to make decisions. When it comes to purchasing, especially for items which are easy to buy, we expect this level of convenience and ease”. (Nathalie Nahai, author of Webs of Influence: The Psychology of Online Persuasion, on BBC)
In a world where instant gratification is so key, having a quick check of a product’s star rating can be a useful tool to help with a purchase decision.
But things might not be as rosy as they appear on the surface. Companies clearly have a vested interest in making sure their products have good ratings in order to attract buyer attention.
Recently, various news outlets reported findings from Which?, whose investigation concluded that Amazon “is being deluged with fake tech reviews that propel unknown brands into top-rated lists”. Which? also calls on consumers to “not rely on ratings – delve deeper and read the reviews”.
Companies are trying to show their commitment to stopping fake reviews from being posted; however, it’s not always easy. Trustpilot, whose platform allows anyone to post a review, claims it has “specialist software that screens reviews against 100’s of data points around the clock to automatically identify and remove fakes”. Even so, as the BBC found, some are still slipping through the net: through eBay, the BBC was able to pay someone to publish a fake review on its behalf.
So the question is – what can you do to filter out the fakes and make the most from product review data?
We do a lot of work with product reviews and wanted to share a few tips for working with them so that you can be confident in what they’re telling you:
- Human judgement is still the best tool to flag and re-examine suspect content, such as dodgy translations, descriptions seemingly borrowed from another product category or bodies of text suspiciously similar to one another.
- Verify that the products themselves are genuine. You can check this by reading or watching coverage from credible professional reviewers and the press.
- Focus on the ‘verified’ reviews. These are much harder to fake, as they indicate that the purchaser is a genuine buyer of the product and has bought from the site directly.
- Examine the date range of the reviews. Make sure a large proportion of them isn’t concentrated in a very short time span (e.g. posted on the same day).
- Don’t only examine the first or last reviews as they could be strategically positioned due to their positive nature (noted by consumer psychologist Cathrine Jansson-Boyd on BBC).
- Exclude extremely vague and generic reviews from any analysis (“it was great”; “I love it”). Looking at character counts can help with this.
- Examine star rating distribution, to ensure that there is a reasonable breadth of ratings (e.g. 1-4*) rather than suspect dominance of 5* reviews.
- Consider tools like FakeSpot – they claim to check the status of the reviewers and use this to determine the validity of the reviews for a product. This sounds great, but they’ve declined to comment on how this actually works. More black box wizardry isn’t the answer.
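Some of the checks above lend themselves to simple automation. Here is a minimal sketch in Python of three of them – filtering out very short reviews, measuring how concentrated the posting dates are, and measuring the share of 5-star ratings. The review records, field names, and the 20-character threshold are all assumptions for illustration, not a standard.

```python
from collections import Counter
from datetime import date

# Hypothetical review records; in practice these would come from a scrape
# or an API, and the field names here are assumptions for illustration.
reviews = [
    {"text": "Great!", "stars": 5, "posted": date(2019, 4, 1)},
    {"text": "Sturdy guard, fits my estate car well after some fiddling.",
     "stars": 4, "posted": date(2019, 3, 12)},
    {"text": "Love it", "stars": 5, "posted": date(2019, 4, 1)},
    {"text": "Bars rattled loose after a week; returned it.",
     "stars": 2, "posted": date(2019, 2, 8)},
]

MIN_CHARS = 20  # threshold is a judgement call, not a fixed rule

def substantive(revs):
    """Drop extremely short, generic reviews before any analysis."""
    return [r for r in revs if len(r["text"].strip()) >= MIN_CHARS]

def same_day_share(revs):
    """Fraction of reviews posted on the single busiest day.

    A value close to 1.0 suggests a suspicious burst of activity.
    """
    days = Counter(r["posted"] for r in revs)
    return max(days.values()) / len(revs)

def five_star_share(revs):
    """Fraction of 5-star ratings; near 1.0 warrants a closer look."""
    return sum(r["stars"] == 5 for r in revs) / len(revs)

kept = substantive(reviews)
print(len(kept))                 # 2 – the two one-liners are dropped
print(same_day_share(reviews))   # 0.5 – half were posted on the same day
print(five_star_share(reviews))  # 0.5
```

None of these signals proves a review is fake on its own; they are cheap ways to flag products whose reviews deserve the human judgement mentioned above.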
This is important. Peer reviews are a vital shortcut for consumers, and those who host review platforms owe it to consumers to be open and transparent about the challenges they’re facing.
A final thought from us: for some companies, reviews are starting to become a two-way process. Platforms such as Airbnb and Uber use a process of review exchange, where customers are also reviewed.
This could help with the review verification process, although it isn’t without its dangers (it could lead to both parties favouring positive ratings in order to protect their own).
It will be interesting to see the extent to which this model can be adopted into services where the supplier never meets the buyer. If it becomes more widespread it could add a whole new level of review verification.
We help organisations turn this kind of feedback into insights that help improve life for their customers. Give us a shout, we’d love to talk it over.