
A/B testing: Everything you need to know about split tests

I really hate it when I’m browsing Netflix and every show I scroll through starts autoplaying. As if I’m not allowed two seconds to scan the rest of the screen. It’s the worst thing to happen to Netflix since “Are you still watching?” (It’s Sunday! Stop judging me!)

People have been complaining about this feature since Netflix began rolling it out in 2016. There’s a Twitter account, a petition, a Chrome extension and a Star Wars director all pleading with Netflix to remove autoplay previews, and yet they haven’t. Why not?

You’ve probably noticed that when you log into Netflix, you’re greeted with an advert for a film or TV show. You’re probably already aware on some level that this advert has been specially chosen for you, based on your viewing habits. Currently mine is reminding me I still haven’t watched Seasons 4 and 5 of Peaky Blinders. (Yes, I know…)

What you might not expect is this: you are the subject in a series of experiments.

Every time you log in, whenever you browse through the available shows, whenever you click on something that looks interesting, Netflix is making notes. Scribbling down what caught your eye and, more importantly, what didn’t. They are then using this information to conduct more and more experiments with one goal in mind: to keep you watching as long as possible.

They aren’t removing that autoplay feature because they know it’s working for them. How? Netflix does A/B testing.

What is A/B testing?

A/B testing, also known as split testing, is a way of comparing the performance of two versions of the same thing. It often goes entirely unnoticed by the user, but it can be an incredibly effective way of determining the best ways to reach your customers. It takes away the guesswork and gives you solid evidence of what works and what doesn’t.

For example, you might split your mailing list in half to test two different versions of the same email campaign. By tracking open and click-through rates, you could determine which performed better. You might use it to test different subject lines, or the text on your call-to-action button.
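If you wanted to roll the split yourself, it’s only a few lines of code. Here’s a minimal sketch in TypeScript; the subscribers array and the sendCampaign function are hypothetical stand-ins for whatever your email provider actually gives you:

```ts
// Randomly assign each subscriber to variant A or B of a campaign.
type Subscriber = { email: string };

function splitList(subscribers: Subscriber[]): { a: Subscriber[]; b: Subscriber[] } {
  const a: Subscriber[] = [];
  const b: Subscriber[] = [];
  for (const s of subscribers) {
    // A coin flip per subscriber keeps the groups roughly equal and
    // avoids any bias creeping in from, say, signup date.
    (Math.random() < 0.5 ? a : b).push(s);
  }
  return { a, b };
}

// Hypothetical usage — sendCampaign stands in for your provider's API:
// const { a, b } = splitList(subscribers);
// a.forEach((s) => sendCampaign(s.email, "Subject line A"));
// b.forEach((s) => sendCampaign(s.email, "Subject line B"));
```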

It’s easy to imagine how you might go about it with an email. After all, it’s not hard to split a mailing list down the middle and send a different email to each half. But what about your website? Or your social media posts?

How to A/B test your website

Fortunately, there are tons of great products out there now that will help you conduct and analyse your split tests. There are far too many to mention them all, but we’ll come back to a couple of our favourites further down.

Let’s take your current site, for example. Maybe you’re not 100% sold on the wording of your positioning statement. And are people even noticing the menu icon? Should that button say ‘Buy now’ or ‘Find out more’? No more guesswork: let’s find out by conducting an experiment.

Using one of these split test tools, let’s set up your current website as Website A, and a version with some different text on the button as Website B. We’ll put a tracking event on that button to count how many people click it, and then eat some crisps while we wait for the data to roll in.
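To give a feel for what that tracking event boils down to, here’s a rough sketch in TypeScript. The /events endpoint and the cta_click event name are made up for illustration; in practice, your split-testing tool supplies its own snippet:

```ts
// Record a click on the call-to-action button for whichever variant
// the visitor was served. Endpoint and payload shape are hypothetical.
const variant = "B"; // in a real test, set by the split-testing tool

document.querySelector<HTMLButtonElement>("#cta-button")?.addEventListener("click", () => {
  void fetch("/events", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ event: "cta_click", variant }),
  });
});
```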

If more people click on the button when they are served Website B, let’s keep the new version! We can make that change, and begin a new experiment, maybe with the layout this time. It’s all about constant iteration and improvement.
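One caveat before crowning a winner: a small gap in clicks can easily be luck. A quick two-proportion z-test, sketched below in TypeScript with made-up numbers, gives a rough check on whether the difference is likely to be real:

```ts
// Two-proportion z-test: is B's click-through rate genuinely higher
// than A's, or within the range of random noise?
function zScore(clicksA: number, visitsA: number, clicksB: number, visitsB: number): number {
  const pA = clicksA / visitsA;
  const pB = clicksB / visitsB;
  // Pooled rate under the assumption that A and B perform the same.
  const pooled = (clicksA + clicksB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / se;
}

// Made-up numbers: 120 clicks from 2,400 visits on A vs 156 from 2,400 on B.
const z = zScore(120, 2400, 156, 2400); // ≈ 2.2
// |z| above ~1.96 suggests a real difference at the usual 95% level.
console.log(z > 1.96 ? "B looks like a genuine winner" : "Keep the test running");
```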

Why experiment?

A bedtime story favourite of marketers everywhere is the tale of Google and the 41 Shades of Blue. Once upon a time, Google conducted a series of experiments on their users to find the perfect shade of blue for outgoing links in Google search and Gmail. They found that people were more likely to click on a link if it was a particular shade. Quite a bit more likely, in fact. As a result, Google increased their revenue by an extra $200m a year, and lived happily ever after.

Setting up a bunch of split tests might sound like a lot of work if your website is already doing pretty well, or you don’t get many visitors on which to experiment. But tools like Unbounce and Netlify are now making split testing so easy, and the gains can be so significant, that it’s almost a no-brainer.

Cancer Research UK

When Cancer Research UK conducted split tests on their landing page for World Cancer Day, they increased their click-through rate by 294%. More click-throughs meant more donations.

It would have taken a whole series of tests to get from their original design to the final version. Some of the changes they tried may have proven less popular than the original. But over time, they transformed their landing page into a strong, effective call to action for donations.

Data-driven design

We work incredibly closely with our clients to get their website right. As Marcus wrote in a previous post, our process is less “Ta dah!” and more “Is this right? Is this right?”

But sometimes the client doesn’t know either. And that’s a perfect time to gather some data to make the decision.

Would you get more conversions if you used a different image on your homepage? Is the length of your form putting people off from contacting you? Are your customers finding your product pages easily?

Forget the guesswork and speak to us today about A/B testing your site.

We create brands, sites and products that move the dial for businesses on their journey from launch to sale. Lock in prospects, wow users, drive sales and win investment.