The Hard Truth About A/B Testing I Had to Swallow – By Dennis van der Heijden, CEO of Convert.com

by Today's Eggspert

We lied to you.

For years, we, as providers of an A/B testing tool, told you it was easy. We made a visual editor and pretty graphs and gave you wins on engagement or a lower bounce rate, but we did not really contribute to your bottom line.

My apologies for making it look easy and dragging you into A/B testing when, in fact, it is very hard to do right.

Flashback: It was July 2012, and on a sunny afternoon at the Blue Dahlia Cafe in Austin, I had lunch with Bryan and Jeffrey Eisenberg, both recognized authorities and pioneers in online marketing and conversion optimization. I was presenting our just-funded startup Convert.com, and they very politely explained that the road to the top of the A/B testing tool market was a hard one.

They were advising another company that had already made a significant impact on the A/B testing market, and they were kind enough to sit with me for an hour and share some of their knowledge of how conversion rate optimization was shaping the online market.

During our meeting, they were not convinced that our then-revolutionary visual editor, which we designed based on an idea we saw at Dapper, would change the industry they had helped define. They both smiled and were friendly, but they said we should focus on other key features, and they named a few.

We thought that making A/B testing easy would change the world and that the path forward was to make the tool simpler. As I now know, I ignored their advice because I thought I had come up with something that would change it all. It would, but not in the way I thought.

You Need at Least 400,000 Visitors

Fast forward: In July 2016 we killed all plans under 400,000 tested visitors since these customers don’t win much. They don’t improve their conversion rates, and it’s partly my fault. Three years ago I did not see that Bryan and Jeffrey were right. It’s not the tool that makes the conversions go up. It’s the people and the process behind it. So when we looked at which customers really benefitted from A/B testing, we saw little-to-no success in smaller accounts.

There are two things you need to be aware of when setting up your marketing mix and considering A/B testing. First, we made testing look easy, but it’s not. Second, if you want a good return on the time invested, strategy is what makes your conversion optimization work. A/B testing is not something you try for just a couple of months.

You Need 20-60 Hours of Human Resources Time

A/B testing requires a decent amount of human resources time that can’t be provided on smaller budgets. You will need a team of people, beginning with a conversion expert who does all the research and analysis and creates test strategies. Then you will need a designer to design new test variations and, finally, a developer who can implement the changes via the testing tool. So on average you are looking at 20-60 hours of human resources time to get one test live and running.

The exact amount of time depends on the type of test. A simple CTA (call to action) test is generally fast and easy to set up, so fewer hours will be needed. Tests with bolder changes will consume much more of your human resources time. And there is your cost.

Top Conversion Agencies Use Their Own Methods

Just to add to the complexity, top conversion agencies use their proven methodologies to deliver the best possible results.

For example, ConversionXL uses its PXL framework to prioritize tests and its ResearchXL methodology to discover problems and opportunity areas that are unique to the client’s site. ResearchXL is a six-part research process that consists of technical analysis, heuristic analysis, web analytics analysis, mouse tracking analysis, qualitative research/surveys, and user testing.

Their testing methodology consists of four broad steps:

  1. Discovery and research
  2. Prioritization and roadmapping
  3. Running tests correctly
  4. Conducting post-test analysis

[Image: The ResearchXL methodology]
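To make the prioritization step more concrete, here is a minimal sketch of how a PXL-style scoring model could be implemented: each hypothesis is rated on simple, mostly binary criteria and ranked by total score. The criteria names and weights below are hypothetical stand-ins for illustration, not ConversionXL’s actual spreadsheet.

```python
# A hypothetical PXL-style prioritization: score each test idea on
# simple, mostly binary criteria, then rank ideas by total score.
# These criteria are illustrative stand-ins, not the real PXL sheet.

def pxl_score(idea):
    return (
        idea["above_the_fold"]            # 1 if the change is above the fold
        + idea["noticeable_in_5s"]        # 1 if users would notice it in 5 seconds
        + idea["adds_or_removes_element"]
        + idea["backed_by_user_testing"]
        + idea["backed_by_analytics"]
        + idea["ease_of_implementation"]  # graded, e.g. 0 = hard ... 3 = trivial
    )

ideas = [
    {"name": "Rewrite hero headline", "above_the_fold": 1,
     "noticeable_in_5s": 1, "adds_or_removes_element": 0,
     "backed_by_user_testing": 0, "backed_by_analytics": 1,
     "ease_of_implementation": 3},
    {"name": "Redesign checkout flow", "above_the_fold": 0,
     "noticeable_in_5s": 1, "adds_or_removes_element": 1,
     "backed_by_user_testing": 1, "backed_by_analytics": 1,
     "ease_of_implementation": 0},
]

# Highest-scoring ideas get tested first.
for idea in sorted(ideas, key=pxl_score, reverse=True):
    print(f'{idea["name"]}: {pxl_score(idea)}')
```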

Another conversion agency is ConversionSciences. Its initial six-month Conversion Catalyst™ process ensures that the agency is collecting good data, that client analytics are accurate, and that it can find ideas that are both creative and supported by data. The Conversion Catalyst™ process consists of a conversion audit, a hypothesis list (prioritized for ROI), an analytics audit, and user intelligence.

[Image: The 5 steps of the Conversion Catalyst™ process]

The conversion agency WiderFunnel uses its Infinity Optimization Process, a structured approach to growth marketing strategy and execution. It encompasses the yin and yang of marketing: the qualitative side, which imagines potential insights, and the quantitative side, which proves whether those insights really work. The process is ongoing, not just a one-off strategy or an ad hoc plan, hence the infinity-loop shape with progress arrows. And there are two equally important outputs: growth and insights are both intended, central results.

[Image: The Infinity Optimization Process]

You Need Enough Traffic for Statistical Validity

A/B testing requires significant volumes of traffic for your test to reach statistical validity. Before I dive into the volumes of traffic required to run your test, I will first explain what statistical validity means, as it’s very important to understand. Our testing tool, like some others, provides a lot of data about your test. One of those figures is statistical validity, or the confidence level of a test. The higher the confidence (you want a 95-99% confidence level to be reached before declaring a variation the winner), the more confident you can be about the test result.

The statistical engine behind our tool derives the confidence level from several factors, such as the detected conversion improvement and the volume of traffic. If the detected improvement of your winning variation is, say, 50%, then you need less traffic to validate it. But if the detected improvement is, say, 3% (which can still mean a significant revenue increase for a website with lots of traffic), then you need several tens of thousands of visitors to validate it.
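To make the relationship between lift, traffic, and confidence concrete, here is a minimal sketch that computes a confidence level for a two-variation test using a standard two-proportion z-test. This is a generic illustration, not the actual statistical engine behind Convert Experiences, and the visitor and conversion numbers are made up.

```python
# Generic confidence calculation for an A/B test using a two-proportion
# z-test. Illustrative only; real testing engines may differ.
from math import sqrt
from scipy.stats import norm

def confidence_level(visitors_a, conversions_a, visitors_b, conversions_b):
    """Return the one-sided confidence (in %) that variation B beats A."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis that A and B convert equally.
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return norm.cdf(z) * 100

# A large detected lift reaches ~99% confidence on little traffic...
print(confidence_level(500, 50, 500, 75))  # 10% vs. 15% conversion rate
# ...while a small lift on the same traffic stalls around 70%.
print(confidence_level(500, 50, 500, 55))  # 10% vs. 11% conversion rate
```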

Unfortunately, not every test will see increases in conversion of 30% or more. Usually, you will see increases in single digits, and that’s where your traffic volume starts to play a role in validating your test results.

So let’s go back to traffic volume. What’s the optimal or minimum volume of traffic for testing? Ask different conversion experts and most will give you different answers. Generally, there are two rules of thumb. If your website gets 1,000,000 visitors per month, then your test page or pages should be exposed to at least several tens of thousands of test users. On sites with fewer than 50,000 visitors per month, a tested page should get a minimum of 3,000-7,000 visitors per month.

Because of that, you will run into statistical validity issues if your detected improvement is in the single digits and your tested page gets very low traffic. At these minimum visitor levels, you will also be limited in the number of variations you can test at once.
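As a rough illustration of why small lifts demand so much traffic, the sketch below estimates the visitors needed per variation using the standard sample-size formula for comparing two proportions at 95% confidence and 80% power. The baseline rate and lifts are hypothetical, and real tools may apply different assumptions or corrections.

```python
# Rough per-variation sample size for detecting a relative lift in
# conversion rate at 95% confidence and 80% power. Illustrative only.
from math import ceil, sqrt
from scipy.stats import norm

def visitors_per_variation(baseline_rate, relative_lift,
                           alpha=0.05, power=0.80):
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided 95% confidence -> 1.96
    z_beta = norm.ppf(power)          # 80% power -> 0.84
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p1 - p2) ** 2)
    return ceil(n)

# At a 3% baseline conversion rate:
print(visitors_per_variation(0.03, 0.50))  # 50% lift: ~2,500 visitors each
print(visitors_per_variation(0.03, 0.03))  # 3% lift: ~570,000 visitors each
```

Numbers like these show why a page with a few thousand visitors a month can only validate large lifts, and why splitting that traffic across many variations makes validation even harder.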

As you can see, testing can be complicated, and understanding all these nuances is important if you want to execute your testing strategy successfully.

We Provide Two Dozen Conversion Rate Experts to Help You

Even though we offered one of the best testing tools out there, with lots of cool features and a WYSIWYG editor, I realized that some of our users were still struggling to deliver results. So I had to pause and think again about what we had done wrong. After a little bit of thinking (it took several months, actually), I decided to make a change.

Yes, we may be offering a very good tool (in fact, one of the best testing tools), but testing requires more than just a tool. It requires experience, knowledge, a good process, traffic volume, and good people, too. It is exactly as Bryan and Jeffrey Eisenberg said during our first meeting in 2012.

So I was thinking, OK, how can we help our customers get the best out of our testing tool? I came up with two ways. First, be honest about the hype, and remove every option for customers to buy a plan without actually having the traffic. We killed all plans with fewer than 400,000 unique visitors of traffic, and we even give double traffic to anyone buying a yearly plan. Volume matters: with good volume you can afford some errors in your tests (and you will make them). That is why we made Convert Experiences the most affordable tool for high-traffic websites.

Second, get two dozen conversion rate experts together and offer their experience in an easy and affordable format. We are about to launch our own conversion optimization marketplace where our users can hire certified conversion experts with one click to help them with their conversion projects.

These are experts who offer gigs that go way beyond A/B testing. You can find user session analysis or heatmap studies for Hotjar, assistance with test prioritization models, etc. It doesn’t matter which tool you use to test. The conversion optimization marketplace will offer experts who help you close that temporary knowledge gap.

Conclusion

We must face some hard truths about A/B testing. But should they stop us from improving the experience on our websites in return for more revenue and more satisfied customers? I think not! Becoming aware of the complexity of testing can be an eye-opening experience.

Hopefully, I have managed to shed a little more light on the whole A/B testing process and the role of the Convert.com testing platform. I hope that, with this information, you will feel more empowered and can make better decisions when planning your next conversion optimization strategy.

About the Author: Dennis van der Heijden, CEO and Co-Founder of Convert.com, is a learner for life, an entrepreneur, and an advocate of simple living. He is enthusiastic about building a better world. Dennis has ensured that Convert Experiences is not just a feature-rich, ever-evolving A/B testing platform, but a platform run by a company with an all-inclusive family that cares about the success and agility of its members. Having worked with brands like Microsoft and SAP and as an advisor to startups, Dennis is a believer in communities. He has fostered a culture of cooperation and paying sincere heed to customer needs and requirements.
