In this article, we present a case study assessing whether Trustpilot reviews increase conversions.
Plenty of research has been done on the effectiveness of online reviews. One widely cited study found that 88% of consumers trust online reviews and that 72% say positive reviews make them trust businesses more.
So, is this really the case? I decided to find out in this case study.
And the result will probably surprise you.
First, a bit of background.
What that research rarely quantifies is how much impact online reviews actually have on conversion rates. Are they a minor factor that won't make much of a difference, or a key part of your conversion rate strategy?
So I ran a small experiment on an online course website. Unfortunately, I cannot name the website, so I'll have to ask you to take the setup and numbers on trust.
Why do people trust reviews?
People trust reviews because they provide social proof.
What is social proof?
Social proof is based on the idea of normative social influence, which states that people will conform in order to be liked by, similar to, or accepted by the influencer (or society).
When you are visiting a landing page and you see a testimonial from a respected expert, that’s social proof. When you check the pricing page and you see a well-known company already bought that product, that’s social proof.
Trustpilot reviews: experiment setup
Trustpilot is not cheap. The website in this case study uses the Pro version, which costs $599/month. At that price, you want a measurable impact on your conversions and a clear ROI.
The key element of this experiment is the Trustpilot widget, the component that showcases all your reviews and embeds into your website.
The widget looks like this:
So, using Google Optimize, I created two landing pages:
- The original landing page, with the widget showcasing all 323 aggregated positive reviews
- A duplicate landing page without the widget
The test settings were:
- Traffic: 50% of visitors sent to the first landing page, 50% to the second
- Segment: UK only, since 99% of our customers are from the UK
- Positioning: on the control page, the widget sat right at the top of the page, above the fold
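Google Optimize handles the 50/50 split for you, but the idea behind deterministic visitor bucketing can be sketched in a few lines. This is a simplified illustration, not Optimize's actual implementation, and the visitor IDs and experiment name are made up:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "trustpilot-widget") -> str:
    """Deterministically bucket a visitor into one of two 50/50 variants.

    Hashing the visitor id together with the experiment name gives a stable,
    roughly uniform assignment: the same visitor always sees the same page.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # 0-99, roughly uniform
    return "with_widget" if bucket < 50 else "without_widget"

# The same visitor always lands in the same bucket:
print(assign_variant("visitor-123") == assign_variant("visitor-123"))  # True
```

Keying the hash on the experiment name means each experiment reshuffles visitors independently, so running several tests at once doesn't always show the same people all the variants.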
At the time of the experiment (last November), the site had 323 reviews with an overall score of 4.5/5, which I took as the mark of a strong, healthy brand.
So, the questions I wanted an answer to were:
- Is the widget pushing more conversions?
- If so, by how much?
- Is the $599 monthly price worth paying?
I was expecting the 50% of traffic that saw the reviews to be impressed by the number of great testimonials from happy customers and to convert more readily.
The results were…interesting, to say the least.
I started the A/B experiment in Google Optimize and let it run for 10 days, long enough to collect data and pick up any social proof signal that might differ when people read the 323 positive reviews.
After 10 days, the conversion rate of the page WITHOUT the Trustpilot widget was practically identical to that of the original page:
- Original page with the widget: 222 conversions, 2.19% conversion rate
- Variant without the widget: 226 conversions, 2.18% conversion rate
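A quick two-proportion z-test confirms that this difference is statistical noise rather than a real effect. The visitor counts below are back-calculated from the reported conversions and rates (222 / 0.0219 ≈ 10,137 and 226 / 0.0218 ≈ 10,367), so they are estimates, not measured figures:

```python
from math import sqrt, erfc

# Back-calculated visitor counts (conversions / conversion rate) - estimates
conv_a, n_a = 222, round(222 / 0.0219)   # with widget    -> ~10,137 visitors
conv_b, n_b = 226, round(226 / 0.0218)   # without widget -> ~10,367 visitors

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_a - p_b) / se                               # z-statistic
p_value = erfc(abs(z) / sqrt(2))                   # two-sided p-value

print(f"z = {z:.3f}, p = {p_value:.3f}")
```

Under these assumptions the z-statistic is close to zero and the p-value is far above 0.05: the experiment found no detectable effect from the widget.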
All other website metrics were largely the same on both pages:
- Pages/sessions: 2.00 for both pages
- Avg. session duration: 00:01:35 for the page with the reviews widget vs 00:01:37 for the page WITHOUT it, over the 10 days of the experiment
- Bounce rate: 60.91% for the original with the widget vs 59.90% without the widget.
Adding the Trustpilot reviews widget made so little difference to conversions that we decided to remove it from the page entirely.
It’s hard to draw any sort of firm conclusion based on a single experiment on a single page. Plus, a few other competitors’ websites have roughly the same amount of positive reviews from Trustpilot.
So, it could be that the 323 positive reviews just were not enough to increase conversions because our users see them on other websites too.
One competitor's website shows 321 reviews, also from Trustpilot, with a 4.8/5 rating, so a single positive widget might not be enough to increase conversions. Having positive reviews may simply be nothing special when every one of your competitors has them too.
Things might have been different if we were the only website showcasing this number of positive reviews.
That said, if online reviews were a major conversion rate factor, I would have expected to see at least some lift in conversions, especially given the high number of positive reviews (323).
So, this data suggests that online reviews might not be some sort of magic bullet, at least according to this small experiment for an online course provider.