
6 Must-Know Tips for Successful A/B Testing

by George Mathew

Conversion rate optimization (CRO) lets you achieve something quite wonderful: it helps you get more out of your existing traffic.

A/B testing is the secret sauce that makes that happen.

But what happens when you finally decide to carry out A/B tests? You find tons of articles on the subject and are bombarded with advice—advice which is often conflicting in nature.

Today I am going to tell you the top six things you should know before carrying out an A/B test.

1. Begin A/B testing without any assumptions at all

Don’t assume anything about your audience when you conduct an A/B test. Don’t assume, just because the big orange button works for Unbounce and Oli Gardner sings praises about it, that it will improve your conversions.

That may not happen. Worse: your test version may get lower conversion rates.

Let’s say it is a CTA (call to action) button you’re testing. You may expect conversions to improve if the buttons are bigger. Or you may believe yellow converts best (after all, that’s what the gurus tell us).

But what if the results show otherwise?

In another post, I discussed a case study about a site that tested yellow versus violet buttons. Despite assertions that yellow buttons perform best, in this case, the violet button converted more.

The reason? It stood out in contrast to the site’s brand colors.

Start your tests with a hypothesis, but don’t presume to know what the results will be. More times than not, they aren’t what you expect.

2. Do a qualitative analysis to understand what your audience needs.

Qualitative analysis is when you ask your audience direct questions about your website or business, and it’s a fantastic way to gather detailed information for your testing hypotheses.

Let’s say you are selling a dog training course/eBook. You have a great sales page, plenty of testimonials, excellent design and a bonus eBook to go along with the course. However, conversions are dismal; on some days, you don’t get any conversions at all.

What can you do?

You sign up for Qualaroo and ask your visitors for feedback. Create a spot in the sidebar or on a slide-in that asks people, “What are the problems you are facing with this site?”

The answer you get may actually be the solution to your conversion problem. For instance, visitors may tell you that your site loads too slowly. Armed with that knowledge, you can fix your site speed, and…

Boom! Your conversions double.

But you can use this tactic to fix your landing pages too. For that, you might ask, “What’s the number 1 problem you face with your dog?”

You likely receive many answers, but the most common problem your visitors report is that their dog doesn’t listen to them.

You can now frame your sales page in a manner that resonates with this demand. You can also set up an opt-in pop-up that pre-sells them the same idea with a free eBook. Best of all, you know exactly how to test your sales page: one without the opt-in pop-up and one with the pop-up.

With targeted email subscribers coming in, your conversions can climb much higher.

GrooveHQ did something even more impressive: they telephoned their customers to find out what they expected from Groove.

Oli Gardner applied the same philosophy to build a viral landing page.

“After watching the conversion rate hover around 25% I decided to try and figure out why more people weren’t clicking my CTA. To do this I added the Qualaroo widget” ~Oli Gardner

[Image: KISSinsights survey]

The secret is to allow users to give you direct feedback. Then apply that information to smarter A/B tests that truly impact your conversion rate.

3. Make sure you reach statistical significance

Statistical significance refers to the low probability of obtaining results at least as extreme as the ones you observed if the null hypothesis were true. In plain language, it means you can be reasonably confident that your test results weren’t a fluke.

Statistical confidence is the likelihood that you would get the same results if you ran the test again. We worry about statistical significance in A/B tests because chance alone can produce differences between versions.

There are other factors to consider as well.

Sample Size:

If the sample size is too small, you can’t be confident that the results can be reproduced.

A sample size of 10 to 100 people is generally considered low. In the example below, two versions of the landing page were used.

Version A: Upload button bold; convert button bold; convert button has a right arrow.

Version B: All buttons regular weight; no right arrow on convert button.

But the sample sizes were too small; only 128 users in version A and 108 in version B.

At first glance, the bold CTAs seem to have made version A more usable, but with samples that small you can’t draw that conclusion.

[Image: "Statistical significance and other A/B pitfalls" by Cennydd Bowles (Image Source)]
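
If you want a ballpark figure for how large your sample needs to be before you start, a standard power calculation will give you one. The sketch below uses Python’s statsmodels library with made-up numbers (a 5% baseline conversion rate and a one-point lift you want to detect); it is only an illustration, not a tool from this article.

```python
# A minimal sketch (not a tool from this article): estimate how many visitors
# each variation needs before a test can reliably detect a given lift.
# The baseline rate and target lift below are made-up numbers.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline_rate = 0.05   # assumed current conversion rate (5%)
target_rate = 0.06     # smallest lift worth detecting (5% -> 6%)

effect_size = proportion_effectsize(baseline_rate, target_rate)
n_per_variation = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,        # 5% chance of a false positive
    power=0.8,         # 80% chance of catching a real lift
    alternative="two-sided",
)
print(f"Visitors needed per variation: {n_per_variation:.0f}")
# Roughly 4,000 visitors per variation -- far more than the 128 and 108
# in the example above, which is why that test can't be trusted.
```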

Role of chance:

It’s always possible that the conversions improved not because of the changes you made, but because of the visitor’s mood, the time of year, or maybe something else.

Statistical significance doesn’t mean practical significance: just because a test result is statistically significant doesn’t mean the improvement is worth acting on.

As you increase the sample size, you may detect small differences in conversions on the order of 1 to 2%. However, for most websites, these small changes mean very little, and the cost of obtaining those results may not be worth the limited improvement.

To determine the statistical significance of A/B tests, you can try this free tool from Kissmetrics.

A/B Testing Significance Calculator
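
Under the hood, calculators like this typically run a two-proportion z-test. Here is a rough sketch in Python (using statsmodels) with illustrative visitor and conversion counts, just to show what the calculation looks like.

```python
# A rough sketch of the two-proportion z-test this kind of calculator runs.
# The visitor and conversion counts below are illustrative, not real data.
from statsmodels.stats.proportion import proportions_ztest

conversions = [130, 165]   # conversions on version A and version B
visitors = [2500, 2480]    # visitors who saw version A and version B

z_stat, p_value = proportions_ztest(conversions, visitors)
print(f"p-value: {p_value:.4f}")
if p_value < 0.05:
    print("Statistically significant at the 95% confidence level.")
else:
    print("Not significant -- keep the test running or call it inconclusive.")
```

Even when the p-value clears the bar, remember the point above: check whether the lift itself is large enough to be worth the change.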

4. Do Not Stop Early

You should not stop a test early, even if it appears that one version of the test is winning. Until you reach the predetermined sample size you set for the test, chance may still be at play.

Say you are planning on running an A/B test for one full week to meet a sample size of 10,000. What if after two or three days you see conversion rates of 4% on one version and 5% on another?

You should keep going until the sample size is met (and possibly even beyond).

Will Critchlow from Distilled ran tests two times, four times, and eight times longer than planned.

[Image: increased trial size (Image Source)]

His conclusions show that “running tests for 8 times as long as we previously thought might only get you back to a 90% confidence level.”
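
If you want to see why peeking is dangerous, you can simulate it. The sketch below (my own illustration, not Distilled’s data) runs repeated A/A tests, where both versions share the same conversion rate, and compares a single end-of-test check against checking every day and stopping at the first “significant” reading.

```python
# Illustration only (not Distilled's data): simulate A/A tests -- both versions
# share the SAME 5% conversion rate -- and compare an honest end-of-test check
# against "peeking" every day and stopping at the first significant reading.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(0)
rate, visitors_per_day, days, runs = 0.05, 500, 14, 2000

peeking_false_positives = 0
waiting_false_positives = 0
for _ in range(runs):
    a = rng.binomial(visitors_per_day, rate, size=days)  # daily conversions, version A
    b = rng.binomial(visitors_per_day, rate, size=days)  # daily conversions, version B
    for day in range(1, days + 1):
        n = day * visitors_per_day
        _, p = proportions_ztest([a[:day].sum(), b[:day].sum()], [n, n])
        if p < 0.05:                 # the peeker stops here and declares a winner
            peeking_false_positives += 1
            break
    _, p_final = proportions_ztest([a.sum(), b.sum()],
                                   [days * visitors_per_day] * 2)
    waiting_false_positives += int(p_final < 0.05)

print(f"False positives when peeking daily: {peeking_false_positives / runs:.1%}")
print(f"False positives when waiting:       {waiting_false_positives / runs:.1%}")
# Peeking typically "finds" a winner in roughly a fifth of these A/A tests,
# versus about 5% when you wait for the full sample.
```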

5. Test Multiple Variables

Although A/B testing traditionally tests just one element at a time, there’s a lot more you can do with multivariate testing. Once you are done testing headlines and CTA buttons alone, test combinations of these variables.

[Image: Comparing a multivariate test to an A/B test (Image Source)]
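
As a quick illustration of what “combinations” means in practice, the sketch below enumerates a full-factorial set of variations from a few hypothetical elements (the headlines, CTA labels, and colors are placeholders, not from this article).

```python
# A small sketch of how a full-factorial multivariate test enumerates every
# combination of the elements being varied. The specific headlines, CTA labels
# and colors are placeholders.
from itertools import product

headlines = ["Train Your Dog in 30 Days", "The Obedient Dog Guide"]
cta_labels = ["Buy Now", "Get the Course"]
button_colors = ["orange", "violet"]

variations = list(product(headlines, cta_labels, button_colors))
for i, (headline, cta, color) in enumerate(variations, start=1):
    print(f"Variation {i}: headline={headline!r}, cta={cta!r}, color={color}")

print(f"Total variations to test: {len(variations)}")
# 2 x 2 x 2 = 8 variations, and each one needs enough traffic on its own --
# which is why multivariate tests demand far more visitors than A/B tests.
```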

To learn more about multivariate testing, you can refer to this article here or this one here. Read here for 6 tools to help you run your tests.

6. Remember that popular A/B tests may not work for you

This goes back to the point I made above: Don’t assume anything. Just because a test works for another brand doesn’t mean it will work for you.

For example, in most case studies, videos increase conversions—but when Device Magic tried it on their own site, they found that videos decreased conversions.
[Image: control home page in VWO (Image Source)]

Their existing image slider, by contrast, increased conversions from the homepage to the signup page by 35% and increased subsequent signups by 31%.

GrooveHQ took inspiration from popular A/B testing case studies to run their own tests, but most of those tests ended up inconclusive.

I have seen several examples where changing the button color produced significant changes in conversions.

[Image: signup button color variations]

However, for Groove’s customers, red, blue and green buttons all seemed to convert the same way.

In one of Neil Patel’s earlier posts here, he suggested that introducing urgency in the CTA buttons may help conversions.

[Image: signup button wording variations]

But urgency seemed to have no impact for Groove.

Basecamp has conducted several A/B tests on pricing options. For them, some options converted better than others, but for Groove the results were again inconclusive.
[Image: price variations (Image Source)]

That said, just because it didn’t work for Groove doesn’t mean it won’t work for you. You should keep testing and take nothing for granted.

Here are 6 more such case studies with A/B tests producing unexpected outcomes.

Concluding thoughts

Have you conducted any A/B tests for your website? If so, how did the results turn out? Were you able to improve conversions after implementing the changes?

Read other Crazy Egg articles by George Mathew.



George Mathew

George has marketed million-dollar startups and has contributed to leading conversion optimization blogs like ConversionXL. You can hire him for your content marketing needs.
