Learn how to think like an A/B tester, and get better results every time.
How do you know what actually works on your website? Your design could be gorgeous and appealing to you, but is it actually encouraging people to read your content, buy your stuff and engage with your company?
This is where A/B testing comes in. A/B testing is a way to measure whether version A or version B of something works better. Lots of websites use A/B testing to make teeny tiny changes and experiment to see which ones drive more sales. You can use A/B testing on basically any feature of your website - design, wording, pricing models, pop-ups, contact capabilities, etc. - to see where you should make changes.
What do I need before I run a test?
You might have a hunch about something that doesn’t really work well on your site. But even if you don’t, you can use Google Analytics and Hotjar to see where people are and aren’t looking at your website. Identify the links nobody is clicking on, the titles nobody is looking at, and the calls to action that aren’t being taken. Then, you can come up with an alternative version that might work better.
Another way to think about your A/B tests is that version A is a control version, while version B is a variant version. Control means your normal version that you are using, without any changes. Your variant version is what you’re experimenting with; the single change you’re implementing. Before you run a test, figure out exactly what you’re going to try changing and only change that one thing. This is called “A/B split testing”.
There are so many variables at play on your website at any one time. Beyond the fact that people are different and have different preferences, your website, ad or email has a multitude of text, colours and images happening at once. If you change more than one thing at a time, you won't actually know which change made the difference. Some people still do this, testing two or three variables at once. The pro is that you get through your experiments quicker, and you can group elements that belong together, like a headline and image that match; the con is that you can't isolate what is making the difference. This is called "multivariate testing".
The larger the sample size (the more people looking at your website/email/ad), the more valid your results will be. Carmilla The Goth won’t come through stronger with her preferences of everything being scrawly, black and red compared to Ruby the Prep who likes neat, clean lines and pastel colours. The more people you have to experiment with, the less individual preferences make a difference in the results. Instead, you’ll learn aggregate preferences, or all of the individual preferences stacked up and averaged.
You can use a sample size calculator to see how many people need to be looking at your website monthly to understand any real changes. If you don’t have enough people yet, focus your marketing strategy on driving more traffic to your website first. Or, test things out but take your results with a heavy grain of salt.
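If you're curious about the arithmetic behind those calculators, here's a rough sketch of the standard two-proportion sample-size formula. The numbers below (95% confidence, 80% power, and the example conversion rates) are illustrative assumptions; the calculator you use may make slightly different choices.

```python
import math

def sample_size_per_variant(baseline_rate, expected_rate,
                            z_alpha=1.96,  # 95% confidence (two-sided)
                            z_beta=0.84):  # 80% power
    """Roughly how many visitors each variant needs to detect the change."""
    p1, p2 = baseline_rate, expected_rate
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2) * variance / (p2 - p1) ** 2
    return math.ceil(n)

# Example: detecting a lift from a 3% to a 4% conversion rate
print(sample_size_per_variant(0.03, 0.04))
```

Notice that detecting a small lift takes thousands of visitors per variant, while a bigger expected lift needs far fewer - which is why low-traffic sites struggle to get trustworthy results.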
Once you have two versions of your design and a lot of people looking at your website, you need to be able to actually run an experiment. There are a few websites out there that offer A/B testing services, and you’ll put a little line of code from them onto your website backend so they can track results. They can then take two design versions and split your viewers into two equal groups. You can run an experiment for a little while (the timing depends on how many viewers you have, but a couple weeks should do it).
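Under the hood, those services typically split visitors by hashing some visitor ID, so the same person always sees the same version on every visit. A minimal sketch of that idea (the experiment name and visitor IDs here are made up for illustration):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "homepage-headline") -> str:
    """Deterministically bucket a visitor into group A or B.

    Hashing (rather than picking at random on each page load) means a
    returning visitor always lands in the same group, which keeps the
    experiment clean.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

for vid in ["visitor-001", "visitor-002", "visitor-003"]:
    print(vid, "->", assign_variant(vid))
```

Including the experiment name in the hash means the same visitor can fall into different groups across different experiments, so one test doesn't bias the next.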
Optimizely is a popular one because it’s easy to use and highly visual. It has a 30 day free trial and paid plans starting at $17.
CrazyEgg gives you a heatmap to show you where your visitors are looking and clicking. They’re one of the cheapest experimentation platforms out there at $9 a month.
VWO (Visual Website Optimizer) is another commonly-used A/B testing tool with pretty robust features, and has a 30 day free trial.
Google Analytics is free for smaller businesses and you might already be using it. Unfortunately, it’s pretty clunky, doesn’t have a visual editor, and you need to do all the interpretation yourself. (P.S. here’s a guide on how to get the most out of the Google Analytics Optimize feature).
Five Second Test lets you test images by asking people what they remember after seeing one for five seconds. Though it’s not traditional split testing, it gives you a lot of information.
Kissmetrics is known for focusing on the human side of analytics. Their platform has become really popular for testing out why people are dropping off your website.
Hotjar is not strictly A/B testing, but does give you lots of ability to try new things out and analyse the results. It integrates automatically with Shopify, which is a big plus.
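Whichever tool you pick, the question at the end is always the same: is the gap between A and B big enough to trust? If your tool leaves the interpretation to you (as Google Analytics largely does), a two-proportion z-test is one common way to check. This is a simplified sketch with made-up numbers, not a substitute for your platform's own reporting:

```python
import math

def z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: returns the z statistic.

    As a rule of thumb, |z| > 1.96 corresponds to roughly 95%
    confidence that the two conversion rates genuinely differ.
    """
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    return (p_b - p_a) / se

# Made-up example: B converted 120/2000 (6%) vs. A's 90/2000 (4.5%)
z = z_test(90, 2000, 120, 2000)
print(f"z = {z:.2f}")
```

In this made-up example z comes out a little above 1.96, so the lift would count as statistically significant at the 95% level; with half the traffic, the same percentage gap would not.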
Give it a try
Here are a few things you can try A/B testing right now.
- Headlines and subheadings
- Longform vs. shortform paragraphs
- Images
  - Placement and wording
  - The image itself
- Calls to action
  - Type of button
  - What they say
  - Where they’re placed
- Free trial vs. money-back guarantee
- Subscription plan vs. regular à la carte pricing
- Discounts vs. free shipping
- Logos of partner brands
- Testimonials and reviews
- Media mentions
- Awards and badges
The gist of it
A/B experiments are actually way more than an add-on you use to tweak your website design. Finding the right answers by experimenting and testing is something you can apply in your day-to-day life. For example, say you and your housemate can never agree on the fastest way to get home. You’re sure you’re right and they’re sure they are. How do you find out? Leave at the same time and see who gets there first.
By thinking in terms of running little experiments, rather than just believing you’re right, you can start making more informed decisions. The only way to truly know what’s working in your business (and elsewhere) is to be a scientist about it. For more info on experimenting with conversion rate optimisation, click here.
Elkfox can help you out with A/B testing. If running your own A/B tests isn’t your thing, don’t worry - it’s ours. We can hypothesise what’s going wrong, design variants, test them out and help you move forward. Talk to us about getting started with A/B testing.