
Never A/B Test Without Doing These 2 Things First

by Christina Gillick

“Give me six hours to chop down a tree, and I’ll spend the first four sharpening the ax.”

That quote, made famous by Abraham Lincoln, tells us a lot about how we should approach any A/B test…

In fact, as Michael Aagaard points out in an Unbounce article:

Running a series of random A/B tests on your landing page with no insight and no underlying hypothesis is like charging blindfolded through the woods, swinging a dull ax and hoping to hit a tree.

It may sound like a ton of fun, but if your goal is to chop down a lot of trees (or get a lot of conversions), it isn’t a very effective strategy by any means.

So what would ol’ Honest Abe suggest when it comes to your A/B test?

He would likely say we should spend a lot of time—maybe even the majority of our time—prepping for any A/B test. We should gather data, gain insight into our visitors, and create our tests based on solid evidence.

Simply put, Lincoln understood the importance of adequately preparing for the task at hand, whatever that task might be. And, in reality, it’s a fairly universal truth.

Preparation greatly improves implementation, which naturally yields better results.

Yet many people jump straight into their A/B test… without any groundwork whatsoever.

Why prepare for an A/B test anyway?

The purpose of an A/B test is to find what works best. So it may seem you could simply perform test after test until you hit on something that works.

But, as Michael Aagaard points out:

Jumping headfirst into a series of landing page tests with no data and insight is like chopping blindly away at a tree for hours with a dull ax, hoping that the tree will eventually give way to the blade and fall over.

The tree is not likely to give way to the dull ax. And, running tests without data is not likely to yield good results. In fact, it’s likely to be a pure waste of your time and money.

Performing an A/B test correctly takes time, effort, research, monitoring, and analysis.

So, how can we prepare?

2 Steps to Effectively Prepare for Your A/B Test

Here are two steps to take before launching any A/B test:

Step 1 – Research

The first step—as it is with most advertising and marketing—is to do your research. Michael Aagaard relates this step to “sharpening the ax.”

Basically you want to learn as much as you can about your target market…

First, the essentials:

  • Who are they? What is their gender, age, income, occupation, education, etc.?
  • What do they want? Or what are their desires (or problems)?
  • How does your product or service solve those problems for them?
  • Where in the buying process are they?

Next, dig a little deeper:

  • How do they feel? (Both about your product and life in general.)
  • What are they interested in?
  • What are their hobbies?

And, get inside their head:

  • What are their values?
  • What do they care about?
  • How do they speak? (More specifically, what words and phrases do they use?)

Knowing the answers to these questions will help you in the next step of A/B test preparation (which we’ll cover in just a moment).

For instance, if you know that your target market is made up mostly of men—let’s say ages 35–50—you probably don’t want to test a headline that would appeal to young women.

So, how can you go about learning the answers to the questions above?

By researching every resource available to you.

4 Ways to Gather Insight for an A/B Test

Anything that helps you better understand your target market (or website visitor) is fair game when it comes to research. Here are just a few options:

1. Study your current customers.

What do they have in common? By better understanding those who have already purchased—and why they did so—you can better appeal to new potential customers.

Which of the above questions can you answer by studying your past and current customers?

2. Use surveys.

If you can’t answer the above questions by studying your customers, why not ask them directly?

Try sending out a simple survey. Remember to give your customers something in return for their time; you’ll get more responses. Some companies will offer a discount, an entry into a contest, or a freebie in exchange for completing a survey.

Also, you should consider surveying your website visitors (and potential customers). Tools like Qualaroo make it easy to set up and run surveys on your website. This is often an eye-opening exercise, even when you think you have nothing left to learn about your visitors.

3. Gather data.

If you have previous A/B tests or past campaigns, view the results from those. Do you see any patterns?

Also, watch what your visitors are doing on your website. Technology like heatmaps and eye tracking makes it possible to see which elements of your page your visitors focus on.

You can also see whether they’re taking the action you want them to take and, if not, what they’re doing instead.
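If your past results live in a spreadsheet, even a few lines of code can help surface patterns. Here is a minimal sketch in Python (the file name and column names are placeholders, not a real data set) that turns past campaign data into conversion rates you can compare:

```python
# A minimal sketch for spotting patterns in past campaign data.
# Assumed (hypothetical) CSV columns: campaign, variant, visitors, conversions
import pandas as pd

df = pd.read_csv("past_campaigns.csv")

# Conversion rate for each row of campaign data
df["conversion_rate"] = df["conversions"] / df["visitors"]

# Average conversion rate per variant across campaigns, highest first
summary = (
    df.groupby("variant")["conversion_rate"]
      .mean()
      .sort_values(ascending=False)
)
print(summary)
```

Even a rough summary like this can point you toward which elements (headlines, forms, offers) have historically moved the needle.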

4. Track your target market.

Using the same language as your target market will help you create tests that resonate with them—and ultimately perform better.

Go where your audience hangs out and listen to them, both in the real world and virtually. Online you can go to forums, chat rooms, and social networks to read what your audience writes. Notice how they talk and what words they use.

Once you complete step one (research), you have enough information to move on to the next step.

Step 2 – Create a hypothesis.

Now that you’ve researched your target market and understand what they want, look for, and respond to, it’s time to form a hypothesis.

Or in this case, a “scientific hypothesis” because our final action will be to test it. (To qualify as a scientific hypothesis, the scientific method requires that you can test the hypothesis.)

Maybe you’ve heard a “hypothesis” defined as an “educated guess.”

That’s because we’re aiming to come up with a test that will likely beat the control. We won’t be sure until we perform the actual test, but we want our hypothesis to be as educated as possible (based on our research).

Your hypothesis might be a statement as simple as:

If I reduce the amount of information requested in my form, I’ll get more opt-ins.

Another example might be:

If I change the color of my “Submit” button to orange, it will stand out more, grab more attention, and ultimately increase conversions.

Of course, your own hypothesis would be based on your research, not a blind guess. For example:

Because my target market is cautious about giving out their information online, I’ll likely get more opt-ins by reducing the amount of information requested in the form.

After we have a hypothesis—again, based on solid research—our next step would be to create and run our A/B test… either proving or disproving our hypothesis.
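Most A/B testing tools handle the traffic split for you, but if you’re curious what that split looks like under the hood, here is a minimal Python sketch (the function and visitor IDs are illustrative, not any particular tool’s API) of assigning each visitor to the control or the variation:

```python
# A minimal sketch of splitting visitors between the control (A) and the
# variation (B). Hashing the visitor ID keeps the assignment stable, so a
# returning visitor always sees the same version.
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically assign a visitor to 'A' (control) or 'B' (variation)."""
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variant("visitor-123"))  # same ID always returns the same version
```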

Let’s look at an example from Conversion Rate Experts that resulted in a 25.9% increase in opt-ins:

[Image: Conversion Rate Experts A/B test example]

In this example, Conversion Rate Experts did their research by using heatmap software (from Crazy Egg) to see which part of their page was getting the most attention.

From there they made a hypothesis:

The sidebar was distracting from opt-ins.

For their test, they removed the right sidebar.

The new version resulted in 25.9% more opt-ins than the previous page.

Without the research and heatmap technology, Conversion Rate Experts might not have realized what their sidebar was doing to their conversions. Instead, they may have spent time and money testing other elements on the page.
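One note on reading results like that 25.9% lift: before declaring a winner, it’s worth checking that the difference is bigger than chance alone would explain. Here is a minimal Python sketch of a two-proportion z-test; the traffic numbers are made up for illustration and are not from the Conversion Rate Experts test:

```python
# A minimal sketch of a two-proportion z-test for A/B results (made-up numbers).
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for variant B vs. variant A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Hypothetical results: 200/5000 opt-ins on the control, 252/5000 on the new page
z, p = two_proportion_z(200, 5000, 252, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests the lift is unlikely to be chance
```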

Let’s look at one more example, this one from WhichTestWon:

[Image: WhichTestWon A/B test example showing pop-up versions A and B]

Roughly double the number of respondents guessed version B would win the test:

[Image: WhichTestWon reader poll results]

Maybe they figured, “Bigger is better.” Possibly… if you’re relying only on conventional ideas without your own research.

In actuality, version A increased form sign-ups by 8.8%.

WhichTestWon’s analysis said, “Perhaps seeing more of the page behind the overlay convinced more visitors to opt-in.”

Of course, these are just assumptions. Hopefully the person in charge of the test did their research and created a hypothesis. It might have been:

Based on our research, our target market doesn’t like to be bombarded with large pop-ups. If we reduce the size of the pop-up, maybe it will be less threatening, appeal to more of our target market, and ultimately increase signups.

They would have been right.

Your turn…

Do you research before running an A/B test? If so, what insights have you gained and how has it affected your tests?

Read other Crazy Egg articles by Christina Gillick.



Christina Gillick

Christina Gillick is a direct-response copywriter. She helps her clients create loyal customers and raving fans through relationship building copy and marketing. She is also an entrepreneur and founder of ComfyEarrings – The Most Comfortable Earrings on Earth.

