Five-star reviews: Artificial Intelligence and Astroturfing

Imagine this. You’re surfing the web, trying to find a particular product or service.

You’re reasonably well informed – you’ve checked out specifications, fitness for purpose and perhaps even canvassed friends for opinions. Before you part with your hard-earned cash, you’ll probably check out some reviews online about the product or service.

At this point you’ve almost come to a decision, but you want to confirm your logic. So you look at a bunch of reviews from complete strangers and make your final decision based on potentially questionable claims and star ratings. You look for a five-star review.

But can you trust that five-star review?

Online shopping success is built on two core concepts – perception of risk, and perception of value.

Drive the perception of risk down, the perception of value up and you’re in a good position to move your boxes.

The easiest way to drop perception of risk is to have plenty of five-star reviews about the product or service along with positive commentary.

But some marketers go a little further – they create a ‘backstory’ where fake reviews and ratings are either created by their own people, or by paid ‘customers’. This practice is called ‘astroturfing’.

How can Artificial Intelligence help to detect Astroturfing?

Amazon is currently trialling a system designed to weed out fake commentary and inflated five-star reviews.

In an online world where so many people simply believe what they read, I personally hope that this kind of algorithm becomes widely used, if not mandatory, for any site that uses ratings to convince buyers of the quality or efficacy of their products.