When it comes to marketing your site, it’s often very useful to carry out A/B testing to see which elements of a design get the best engagement from the user. However, it can be tempting to test too many elements of a page at a time or difficult to know just what to test. With that in mind, let’s have a look at what you should be testing and why.
Firstly, it’s worth pointing out that successful A/B tests:
- are generally carried out on only one or two page elements at a time,
- have a large enough sample size to return accurate results, and
- are left to run for long enough to give reliable conclusions.
Many people stop running a test as soon as they see results in favour of one design, but stopping early like this often produces false positives and should be avoided. Plan to run a split test for a minimum of a month if you want accurate, actionable results.
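One way to make “long enough” concrete is to estimate, before you start, how many visitors each variant needs in order to detect the improvement you’re hoping for. Here’s a minimal sketch in Python using only the standard library; the 3% baseline conversion rate and 4% target used at the bottom are hypothetical figures for illustration:

```python
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a change
    in conversion rate from p1 to p2 (two-sided z-test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # ~1.96 for a 5% significance level
    z_beta = z.inv_cdf(power)           # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return numerator / (p1 - p2) ** 2

# Hypothetical: baseline 3% conversion, hoping to lift it to 4%
print(round(sample_size_per_variant(0.03, 0.04)))
```

With these example numbers the answer is on the order of five thousand visitors per variant, which shows why a low-traffic page genuinely needs weeks, not days, before you can trust the result.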
When it comes to what you should test, this somewhat depends on what type of page you’re testing. If it’s a landing page, then CTAs are going to be important, so you could opt to test a CTA button in different colours. However, you can test anything that you like, including:
- Sales copy
- Product descriptions
A/B or Multivariate Testing?
If you want to test more than one or two elements at a time, then you’d be running a multivariate test. Rather than splitting traffic between two versions of a page that differ in a single element, you’d be comparing pages that differ in many elements at once. Multivariate tests juggle multiple variables and are really best used by advanced marketers who are already very familiar with split testing.
With an A/B test, the differences between the two pages that you’re testing are very apparent. For example, on one page the CTA button may be green, whilst on the other it’s red (example below). Or one page may have an image with a person in it whilst the other has a plainer image. The objective is to discover which one prompts the user to take the action that you’d like them to, and thus increases leads.
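Behind the scenes, a split like this is typically done by bucketing each visitor deterministically, so that a returning visitor always sees the same variant rather than being re-randomised on every visit. A minimal sketch of that idea (the function and experiment names here are hypothetical, not part of any particular tool):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "cta-colour") -> str:
    """Deterministically bucket a visitor into variant 'A' or 'B'.

    Hashing the visitor ID (rather than picking at random) means the
    same visitor sees the same variant on every visit, and the split
    comes out roughly 50/50 across many visitors."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variant("visitor-42"))  # same visitor, same variant every time
```

Keying the hash on both the experiment name and the visitor ID also means the same visitor can land in different buckets for different experiments.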
According to Kissmetrics, “A/B testing, done consistently, can improve your bottom line substantially. If you know what works and what doesn’t, and have evidence to back it up, it’s easier to make decisions, and you can often craft more effective marketing materials from the outset.”
Choosing What to Test
You should first study your analytics to get a good idea of where people are dropping off after landing on your site, and some insight into why they might not be clicking on your call to action buttons. If visitors are simply landing on the home page and then bouncing right back off again, then obviously there’s something there that needs looking into, such as the navigation, sign-up buttons, or other CTAs.
Recently, I came across a story in which a company had successfully carried out an A/B test on their navigation. They did this because, whilst they were getting traffic to the site, the conversion rate was very low and nobody was clicking through to the products. The navigation was made up of a lot of textual links, and after reading about the ‘jam test’, which found that too much choice often results in the user making no choice at all, they decided to test the navigation system.
The result was a pared down navigation with far fewer links to choose from and a vastly improved conversion rate.
Of course, it’s not always easy to see what’s going wrong, and as the culprit can be something as simple as the colour of one button, a little trial and error may be necessary. That said, you can often track user actions through analytics, and this should throw up any issues.
In the above test carried out for Performable, it’s easy enough to see what was being tested. What’s interesting about the results though is that even though most of us associate green with going ahead and red with stop or delete, in the test the red button performed 21% better than the green. So even if you believe that the design is doing everything right, people can surprise you as they don’t necessarily follow what you feel to be a convention.
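Before acting on a lift like that 21%, it’s worth checking whether the difference could just be noise. A standard two-proportion z-test does this with only the standard library; the visitor and conversion counts below are hypothetical (the article doesn’t give the Performable sample sizes), chosen to produce roughly a 21% relative lift:

```python
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical: green converts 100/2000 (5%), red 121/2000 (6.05%, ~21% lift)
p = two_proportion_p_value(100, 2000, 121, 2000)
print(f"p-value: {p:.3f}")
```

With these made-up numbers the p-value comes out around 0.15, well above the conventional 0.05 threshold, so even a 21% lift wouldn’t yet be trustworthy on a sample this small. That is exactly why tests need a large enough sample and a long enough run.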
There are plenty of examples online that prove this, so don’t make assumptions about your users; instead, make A/B testing an ongoing process that continually tests elements to ensure that you’re getting the most conversions that you can. Also remember that things change over time, such as people’s tastes, so once you feel that you’ve successfully tested and tweaked everything, test it again.
You don’t have to be a web designer or developer to successfully carry out an A/B test these days, nor do you have to work through difficult maths by hand. There are plenty of commercial options to help marketers carry out split testing easily, including elcomCMS’s own A/B Testing module.
Carrying out testing can seriously improve your profits, so it’s worth doing it properly. Test one (or at the most two) elements at a time and ensure that you have enough traffic going to each page. Remember also to let the test run for a good length of time in order to get the most accurate results from your efforts.