UX

Increase your conversions by having a strong A/B Testing plan

A/B Testing is a simple way to measure how different styles and page elements affect your conversion rate.

Regularly doing A/B testing will allow you to come up with more accurate, research-driven assumptions about your users. Over time you will gather a better understanding of their preferences and how they move around your website.

A/B Testing usually starts with one existing “Control” design, which is compared against new “Challenger” variants. Although we call this process A/B testing, there is no reason why additional variants can’t be thrown into the mix, ending up with what looks like an A/B/C/D test but is still referred to as an A/B test.
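To make any of these comparisons fair, traffic needs to be split consistently between the control and its challengers, so a returning visitor always sees the same variant. Here is a minimal Python sketch of deterministic bucketing (the user ID and variant names are illustrative assumptions):

```python
import hashlib

def assign_variant(user_id: str, variants: list) -> str:
    """Deterministically bucket a user into one of the test variants.

    Hashing the user ID (rather than choosing at random on each visit)
    guarantees a returning visitor always lands in the same bucket.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# An A/B/C/D test is still just an A/B test with extra challengers
variants = ["control", "challenger-b", "challenger-c", "challenger-d"]
print(assign_variant("user-1234", variants))
```

In practice your testing tool will handle this for you; the point is only that assignment must be sticky per user.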

Categorizing the testing plans:

1. A/B Split Testing
Split testing is a method used to compare two versions of the same page. With this test you change only one element on the page to see how it affects the behaviour of the user.

2. Multivariate Testing
This test is similar to a split test, but between page variants you can change several elements at once. This method works well when a cluster of elements on the page all play a part in allowing the user to convert.

3. Experimental Design
This type of test breaks the rules in that there are no rules. You can come up with your own way of selecting one element on a page and analysing its impact on conversion rates.

Now that you are familiar with the types of testing plans that you can do, we can start to think about how we can use these to improve our current systems.

A/B testing should be a process that is repeated throughout the life of the project. You should clearly understand what you would like the results to achieve and how you would like to improve the journey for your users.

Having a testing cycle is vital to ensuring you are getting the most valuable results.

Step 1. Measure the performance of your current website

1.1 – Business Objectives
The question you should be asking yourself here is: what should my website be producing, in terms of conversions, to benefit my business as a whole? Objectives for your website should be reviewed and renewed at least quarterly.

Some examples of business objectives:
1. Increase overall conversions from online bookings within the next 6 months
2. Increase user database
3. Increase social engagement

1.2 – Website Objectives
Drilling down into the analytics aimed specifically at your website should reveal how long people are spending on pages, how many products they are viewing per visit, how far the users scroll down the page etc. These are all measurable metrics which can develop website specific goals.

Digital Marketing Metrics may also be applicable to your website. How many people should you be signing up to your newsletter on a monthly basis?

Some examples of website objectives:
1. Users should be more engaged by the product imagery
2. More users should be signing up to the newsletter
3. More users should add a product to their basket

1.3 – Defining your KPIs
Is your website achieving the objectives it is supposed to? Without a clearly defined set of KPIs (Key Performance Indicators), how will you ever know whether your website is over- or under-performing?

KPIs relevant to the website could include sales metrics if you are running an e-commerce store. Take a look at the most recent sales targets the website should be achieving. Are you hitting them?

KPIs relevant to a news website may be more focused on engaging the user to the point where they sign up more often or share more content to social networks.

Again, KPIs should be tracked and monitored by everyone involved in the running of the website. They should be the primary objectives that everyone, from UX designers to digital directors, is working to achieve.

1.4 – Defining the Metrics
Now that we understand what we are going to use to measure the website’s performance, how are we going to analyse and define the results?

First, agree how often these results are going to be reviewed. Is it a monthly, quarterly or yearly review? Website objectives should be reviewed at least quarterly, as the website should constantly be evolving and improving.

Let’s say you want to increase newsletter sign-ups. You should probably review this monthly, as it can vary a great deal depending on how much it is advertised; you may run a campaign encouraging sign-ups one month and not the next. The data gathered by your website analytics should give you a good idea of the average sign-ups during a month. If that figure should be higher, set an achievable numerical target for the next month as a website objective, such as 100 sign-ups.

Reaching the 100 mark will allow you to say with confidence that your website is performing strongly in this area.
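The arithmetic above can be sketched in a few lines of Python (the monthly figures and the target of 100 are invented for illustration):

```python
# Monthly newsletter sign-ups pulled from analytics (illustrative figures)
monthly_signups = [82, 95, 78, 110, 88, 91]
target = 100  # the website objective for next month

average = sum(monthly_signups) / len(monthly_signups)
print(f"Average sign-ups per month: {average:.0f}")

latest = monthly_signups[-1]
if latest >= target:
    print("Objective met: strong performance in this area")
else:
    print(f"Objective missed by {target - latest} sign-ups")
```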

1.5 – A complete performance measurement!
Now you know that every change you are making to your website is going to achieve something of value for the business.

Step 2. Test the right stuff!

2.1 – Selecting your tests
When deciding what you are going to change on your website, ensure this decision is backed up by data from your analytics.

“once you know the costs, you can work out if the tangible benefits outweigh those costs. In this respect, you should treat data like any other key business investment. You need to make a clear case for the investment that outlines the long-term value of data to the business strategy.”
– Bernard Marr (LinkedIn – https://uk.linkedin.com/in/bernardmarr)

“45% of all software features in production today are never used (or deliver NO value)”
– David Rice (LinkedIn – https://www.linkedin.com/in/daverice/)

Your analytics should show you the problematic pages: high bounce rates, slow page loading, rage clicks and low page interaction.
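As a sketch of how such pages might be flagged from an analytics export (the pages, session counts and 60% bounce threshold are all illustrative assumptions):

```python
# Per-page analytics export: page -> (sessions, bounced sessions)
page_stats = {
    "/home": (5000, 1500),
    "/products": (3200, 2400),
    "/checkout": (800, 560),
}

BOUNCE_THRESHOLD = 0.6  # flag pages where over 60% of sessions bounce

for page, (sessions, bounces) in page_stats.items():
    rate = bounces / sessions
    if rate > BOUNCE_THRESHOLD:
        print(f"{page}: bounce rate {rate:.0%} - candidate for an A/B test")
```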

Look for the elements on the page that are causing these issues. Is there a better way you could display them?

Is page interaction an issue? Some of the following fixes might be worth an A/B test:
– Create more engaging content
– Use a chat feature to talk to your customers
– Make your CTAs more visually attractive
– Turn off annoying content (autoplay videos, heavy images, adverts)
– Add a comments section
– Make your navigation easier to use
– Improve your social sharing tools

2.2 – Prioritising your tests
There are a couple of prioritisation models that help you select the right tests to follow up on. The most recent of these was created by ConversionXL.

Their model asks you a set of questions about the element you are about to test, producing a set of scores you can quickly re-order so that the changes with the highest user impact sit at the top of the pile.

https://conversionxl.com/blog/better-way-prioritize-ab-tests/
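As an illustration of the general idea (a simplified sketch, not ConversionXL’s actual PXL model; the criteria and candidate tests here are invented):

```python
# Each candidate test is answered against a few yes/no criteria (1 = yes)
candidate_tests = [
    {"name": "Bigger CTA on listings page",
     "above_fold": 1, "backed_by_data": 1, "easy_to_build": 1},
    {"name": "Rewrite product copy",
     "above_fold": 0, "backed_by_data": 1, "easy_to_build": 0},
    {"name": "Add quick-view modal",
     "above_fold": 1, "backed_by_data": 1, "easy_to_build": 0},
]

def impact_score(test):
    # Sum the answers; weight individual criteria if some matter more
    return test["above_fold"] + test["backed_by_data"] + test["easy_to_build"]

# Highest-impact changes end up at the top of the pile
for test in sorted(candidate_tests, key=impact_score, reverse=True):
    print(impact_score(test), test["name"])
```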

Step 3. Testing

3.1 – Creating a Hypothesis

You create a hypothesis to define why you think there is a specific problem on a page.

For example:

Problem – Users are not adding the product to their cart from the listings page.

Hypothesis 1 – I think some users like to view the product information before adding it to their cart.
Idea 1 – We could add a product “quick-view” to allow users to see a snippet of detail directly from the listings page.

Hypothesis 2 – I think our calls to action are too small and easy to miss.
Idea 2 – We should make our CTAs larger and a more vibrant colour.

Hypothesis 3 – I think the language we are using on the page is not persuasive enough.
Idea 3 – We should encourage users to buy by changing the website’s tone of voice to something more persuasive.

Sometimes you may not get the result you were expecting. Users may actually like the original concept. At least you still have a better understanding of how users want to use your site.

3.2 Choosing your testing software

Here are some of the testing tools we recommend:

https://www.optimizely.com/ – Great for testing live campaigns and directing a portion of your userbase between several designs.

https://zurb.com/helio – A must-use when designing interfaces with large teams of collaborators: it allows you to quickly test static variants of your product within a couple of hours, and it is relatively cheap too.

https://unbounce.com/ – A great piece of software with some useful tools to help you quickly create variants of the same page from within the browser.

3.3 – Testing for gradual gains

You should only ever implement a challenger over your control design once the results reach at least 95% statistical significance. This ensures that your results are valid and not based on chance. It may take a couple of tests before you get there, but if you achieve a 5%–15% increase in conversions every time you test a new challenger, it won’t take long for you to be confident in the experiment and have a change ready to implement.
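One common way to check whether a challenger’s lift is real or down to chance is a two-proportion z-test. The sketch below (with invented conversion figures) computes the one-sided confidence that the challenger genuinely beats the control:

```python
import math

def significance(control_conv, control_n, challenger_conv, challenger_n):
    """Two-proportion z-test: is the challenger's lift real or chance?"""
    p1 = control_conv / control_n
    p2 = challenger_conv / challenger_n
    pooled = (control_conv + challenger_conv) / (control_n + challenger_n)
    se = math.sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / challenger_n))
    z = (p2 - p1) / se
    # One-sided confidence that the challenger beats the control
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Illustrative figures: 200/4000 control vs 260/4000 challenger conversions
conf = significance(200, 4000, 260, 4000)
print(f"Confidence: {conf:.1%}")  # implement the challenger only if >= 95%
```

Most testing tools report this figure for you, but it is worth understanding what sits behind the 95% rule.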

Step 4. Learn, Test, Repeat!

You should be consistently testing your website throughout the build and well after it has launched. Leaving your website to gather dust will surely leave you behind the competition and under-performing against your KPIs.

Here are some A/B testing success stories that really demonstrate its value.

https://www.behave.org/tests-of-the-month/

Conclusion

– Ensure you always have your analytics to hand. It is no good suggesting that your users do things based on a feeling you have. Back it up!
– Know your business goals. Translate these into KPIs. Ensure your website objectives can meet target metrics which subsequently indicate good performance.
– Prioritise which pages to test by using a prioritisation model.
– Create a hypothesis for each test
– Reach 95% significance before implementing the challenger design.
– Keep Testing!

URLs
https://www.linkedin.com/pulse/why-45-all-software-features-production-never-used-david-rice