REVAIN aims to bring credibility back to the Internet
Before we dig into the question of whether online reviews are credible, let’s first figure out whether they carry much significance at all nowadays by answering these two questions:
- Do higher ratings always lead to increases in sales?
- Do reviews from verified buyers have a greater impact on sales than those from anonymous sources?
It has long been a common assumption that online reviews have a quantifiable impact on purchase decisions. On closer inspection, though, the degree of that impact depends on a number of factors: star ratings, the nature of the review content, the number of reviews, the price of the item, and the source of the review. Research shows that the initial reviews have the greatest impact: as products begin displaying reviews, conversion rates rise rapidly, but once the total number of reviews reaches a certain mark, the conversion rate levels off.
54% of UK adults used online reviews, and many found them valuable. Meanwhile, reviews from verified buyers are substantially more influential than reviews written by anonymous sources. Verified buyers are also more likely to give higher ratings, while anonymous reviewers are more likely to give one- and two-star reviews. This suggests that customers who are unhappy with a purchase are far more likely to make the effort to post an unprompted review.
Most importantly, research shows that displaying reviews carrying a “verified buyer” badge has a positive impact on sales. This is natural: reviews written by people with direct experience of using a product are considered more credible. The badge shows that the reviewer is a real consumer and not someone who was paid to write a review. Reviews that are seen as representative and credible can be a significant competitive advantage for businesses. REVAIN comes into play beautifully here, as it eliminates the need for brands to prompt verified buyers to leave reviews: the platform itself provides a review-exchange interface for both brands and their customers.
Unfortunately, the current situation on the review market is close to a disaster. A recent BBC investigation revealed not only a global market for fake review writers, but also cases of stolen identities being used to post reviews and of customers using the threat of online criticism to win discounts from companies. Review websites themselves have engaged in malicious practices: cherry-picking positive reviews and allowing businesses to address negative reviews before they are published, so users never see the complete picture.
Moreover, with AI entering the market, and with research from the University of Chicago showing that a machine can be trained on review data to write reviews indistinguishable from human ones, the state of affairs looks quite discouraging. Being able to buy fake reviews from humans online is one thing; it is quite another when the process can be completely automated, with AI bots churning out highly realistic reviews on Amazon, Yelp, TripAdvisor, or anywhere else you look.
Of course, review websites have been fighting fraud for quite a while. But with AI now being used to create fake reviews, identifying them has become almost impossible.
That’s why the market needs a breakthrough solution.
The first and foremost feature motivating users and companies to produce only genuine reviews is the platform’s economic model: users are rewarded with tokens taken from a company’s account. A user may submit 5 reviews per day, and in the future some users will be allowed to submit more based on their history. All users receive equal rewards, which are paid out every two weeks.
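The mechanics described above can be sketched in a few lines of code. This is a minimal illustration, not REVAIN’s implementation; the reward amount per review is a hypothetical placeholder, while the 5-per-day limit and the two-week payout period come from the description.

```python
from collections import defaultdict

DAILY_REVIEW_LIMIT = 5    # from the platform description
PAYOUT_PERIOD_DAYS = 14   # rewards are sent every two weeks
REWARD_PER_REVIEW = 1     # hypothetical token amount; equal for all users

class RewardLedger:
    """Tracks accepted reviews and accrues equal token rewards per user."""

    def __init__(self):
        self.daily_counts = defaultdict(int)  # (user_id, day) -> reviews that day
        self.pending = defaultdict(int)       # user_id -> tokens awaiting payout

    def submit_review(self, user_id, day):
        """Accept a review only if the user is under the daily limit."""
        if self.daily_counts[(user_id, day)] >= DAILY_REVIEW_LIMIT:
            return False  # over the 5-per-day cap
        self.daily_counts[(user_id, day)] += 1
        self.pending[user_id] += REWARD_PER_REVIEW
        return True

    def payout(self, day):
        """At the end of each two-week period, pay out all pending rewards."""
        if day % PAYOUT_PERIOD_DAYS != 0:
            return {}
        paid, self.pending = dict(self.pending), defaultdict(int)
        return paid
```

For example, a user submitting six reviews in one day would have the sixth rejected, and the five accrued tokens would be paid out at the next biweekly payout.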
Businesses are in a strong position as well, since they can set up their own metrics to assess the quality of reviews: for instance, a restaurant might require a photo of the dish or menu item that disappointed the user.
Users and companies are also kept honest by a system of warnings: reaching a certain number of warnings leads to being blocked from using the platform and from withdrawing rewards or funds.
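The warning system reduces to a simple state machine. The sketch below assumes a threshold of three warnings purely for illustration; the platform defines the actual number.

```python
WARNING_LIMIT = 3  # hypothetical threshold; the platform sets the real value

class Account:
    """Tracks warnings; reaching the limit blocks platform use and withdrawals."""

    def __init__(self):
        self.warnings = 0
        self.blocked = False

    def warn(self):
        """Record one warning; block the account once the limit is reached."""
        self.warnings += 1
        if self.warnings >= WARNING_LIMIT:
            self.blocked = True  # no further reviews or withdrawals allowed

    def can_withdraw(self):
        return not self.blocked
```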
On the technical side, fragments of reviews are saved to the blockchain and cannot be edited later. Before a review is stored, it passes a double filtration mechanism: automated moderation by the IBM Watson AI, followed by manual filtration performed by the company itself. This is a core difference from other review platforms. REVAIN employees are not involved in the moderation process and therefore cannot be corrupted, as happened in the examples cited above.
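The two-stage pipeline can be modeled as a pair of predicates that a review must pass before it is written to the chain. The filters below are trivial stand-ins (the real first stage is the IBM Watson AI; the real second stage applies the company’s own metrics), but the composition logic is the point.

```python
def ai_filter(review: str) -> bool:
    """Stage 1: automated moderation (IBM Watson in REVAIN's design).
    A trivial stand-in here: reject empty or very short texts."""
    return len(review.strip()) >= 20

def company_filter(review: str, required_terms: list) -> bool:
    """Stage 2: manual filtration by the company itself, modeled as a
    check against company-defined quality criteria (e.g. a required photo)."""
    return all(term in review.lower() for term in required_terms)

def double_filtration(review: str, required_terms: list) -> bool:
    """A review reaches the blockchain only if it passes both stages."""
    return ai_filter(review) and company_filter(review, required_terms)
```

A restaurant could, for instance, require the words “photo” and “dish” to appear, approximating its rule that disappointing meals must be documented.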
REVAIN also uses mathematical formulas to calculate bonuses in a way that makes submitting a fake review economically unprofitable.
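The actual formulas are not spelled out here, but the intended economics can be illustrated with a simple expected-value argument: if a fake review is caught with some probability and a resulting warning forfeits the reviewer’s pending balance, the expected payoff of faking turns negative. All parameters below are hypothetical, chosen only to show the shape of the disincentive.

```python
def expected_fake_payoff(reward, detect_prob, forfeited_balance):
    """Expected tokens from submitting one fake review, assuming a
    detected fake triggers a warning that forfeits the pending balance.
    Illustrative only; not REVAIN's actual formula."""
    return (1 - detect_prob) * reward - detect_prob * forfeited_balance

# Even a moderate detection rate makes faking unprofitable once the
# user has any meaningful balance at stake:
payoff = expected_fake_payoff(reward=1, detect_prob=0.5, forfeited_balance=10)
```

Under these illustrative numbers the expected payoff is negative, so an economically rational user is better off writing genuine reviews.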
On a larger scale, therefore, REVAIN is creating an unbiased system that does not require users to trust the platform in order to act on it. A DAO and smart contracts will defend it against all kinds of fake reviews. Companies will be able to improve their products and business processes by getting quality feedback based on the real experience of hundreds of users.
Once online reviews are credible again, they might actually help regulate the market in a fair and productive way: companies with bad ratings will lose clients unless they improve their products, while the market will help companies with good reviews gain popularity.