
How to Cheat at A/B Testing (And Win)

by Ivan Guzenko

A/B testing isn’t for the faint of heart. So everyone working in CRO is looking for cheats: simple ways to get great results with minimal effort.

Let me start with the obvious. The best way to cheat in A/B testing is to not cut corners. If you’re looking for quick ways to avoid the heavy lifting it takes to get results, here’s our cheat: work smarter, not harder.

Easier said than done, right? Wrong. In this article, I’m sharing some of our favorite hacks: six cheats proven to deliver the results you need without a lot of hard work.

Cheat #1: Don’t rush to conclusions

A/B tests are a waste of your time if you draw conclusions too quickly or from too small a sample. A good rule of thumb is to wait until you reach at least 1,000 views before forming opinions or switching your strategy; however, what counts as a large enough sample depends on how many views you normally get.

1,000 is the absolute bare minimum. If you regularly get many more views, your minimum needs to scale up accordingly.
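To put a number on that minimum, you can estimate the required sample size before the test even starts. Here is a minimal sketch using statsmodels; the 5% baseline conversion rate, the 6% target rate, and the 95% confidence / 80% power settings are illustrative assumptions, not figures from this article.

```python
# Minimal sketch: how many visitors each variant needs before a
# conversion-rate difference of this size can be detected reliably.
# The rates and settings below are illustrative assumptions only.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

baseline_rate = 0.05   # assumed current conversion rate (5%)
target_rate = 0.06     # smallest lift worth detecting (6%)

effect_size = proportion_effectsize(baseline_rate, target_rate)
visitors_per_variant = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,             # 95% confidence (5% false-positive risk)
    power=0.80,             # 80% chance of detecting a real lift
    alternative="two-sided",
)
print(f"Visitors needed per variant: {visitors_per_variant:.0f}")
```

With those assumed numbers, the answer comes out to roughly 4,000 visitors per variant, a useful reminder that 1,000 views is a floor, not a target.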

Don’t draw conclusions when your sample is too small, or before enough time has passed for time-dependent patterns to emerge. Always run a test for at least a full week (preferably a full month, or even a year, according to this Kissmetrics post) so the averages you calculate account for daily and weekly traffic spikes and lulls.

As the example below shows, there can be enormous differences from hour to hour, day to day, or month to month.

Revenue tracking chart. Image from ConversionXL.com
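If you log each visit with a timestamp, a quick breakdown by weekday and hour makes those swings visible in your own data. This is a rough sketch only; the file name and the "timestamp", "variant", and "converted" columns are assumptions for illustration.

```python
# Rough sketch: conversion rate per variant broken down by weekday
# and hour, so weekly and daily swings are visible in your own data.
# The file name and column names are assumptions for illustration.
import pandas as pd

events = pd.read_csv("ab_test_events.csv", parse_dates=["timestamp"])
events["weekday"] = events["timestamp"].dt.day_name()
events["hour"] = events["timestamp"].dt.hour

by_time = (
    events.groupby(["variant", "weekday", "hour"])["converted"]
          .mean()                      # share of visits that converted
          .rename("conversion_rate")
          .reset_index()
)
print(by_time.head())
```

If one variant launched mid-week or ran mostly over a weekend, this kind of view makes the imbalance obvious before you trust the headline conversion rates.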

Cheat #2: Identify audience-specific variations

Let’s say you read a statistic that says the afternoon is the best time to post on Facebook or Twitter. Before you start running A/B tests that release social media posts at that time, take a second to ask where the statistic came from. Which audience was tested? Which countries did it cover? Why might the results be completely irrelevant to you?

For example, it’s generally understood that drip marketing campaigns are most effective right before the workday starts or a few hours before it ends; however, that isn’t accurate for everyone. That statistic mostly relates to B2B, and in both B2B and B2C, every customer is different. Maybe your target audience is in an age group or has social reasons for not working traditional hours, or maybe the people you target are more likely to respond to social media than email during those hours.

Additionally, peak hours vary across time zones and cultures. Test several different times, then ask yourself why you get the results you do. It’s not enough to see which time works best; understanding why can give you valuable insight into your target audience.

Cheat #3: Change up your copy

Sometimes what makes or breaks responses isn’t what you say, but how you say it. Making slight changes to your copy can garner big results. Outline the goals for what you want to communicate and create multiple variations before running your A/B test.

And of course, don’t make assumptions! Many guidelines for writing and formatting copy, such as using bullets and lists to break up text, keeping headlines short and catchy, and paying attention to design and font, are all great, but what works for others won’t necessarily be the best method for your objective. Guidelines are a useful starting point, not the final word.

Cheat #4: Automate consistency

See the below example of a study from Which Test Won.

Image: Which Test Won study, Version A vs. Version B

While most people surveyed believed Version B had better copy, it was Version A that increased leads by over 100%. Why? Version A’s copy was designed to complement the copy of PPC ads driving users to the page.

As the study shows, each element of an A/B test is important in and of itself, but no single element completely controls results. The conversion process relies on many factors, one of the most important of which is synchronizing copy across communications.

One example of this is ensuring the tone and design of your advertisements complement your other marketing communications. Consumers subconsciously sense inconsistency, and it can throw off both their attention and your message.

Marketing copy that doesn’t match the tone, color scheme, or photos of the ad that brought visitors in will seem out of place, and it can hurt you by confusing your audience and distorting your message. Make sure you prioritize consistency across communications. And once A/B testing proves which copy garners responses, invest in advertising that semantically matches your landing page copy.

Cheat #5: Surprise your audience

Clearly, you need enough traffic to run reliable A/B tests. However, traffic alone is not enough; it has to be responsive enough to give you results.

Let’s go to some basics:

#1: Boredom is the opposite of engagement.

#2: Bored audiences are less likely than engaged audiences to respond.

#3: Lack of response from your audience can seriously affect your A/B test results.

So what can we conclude from this? Audiences who know what to expect from you may tune you out automatically and not respond, which means you won’t get the results you need.

For example, outreach emails with consistently similar subject lines and copy often go straight to the trash folder. Small changes to call-to-action text or button design could yield small gains, but it’s the willingness to make big changes that really gives you a comprehensive view of what works. While sticking to company branding is important, keeping users on their toes can pull them back in so they respond to your A/B tests.

Take a look at Facebook, which is famous for rolling out design changes to users slowly.

This method doubles as a way to build anticipation and engagement as well as a useful tool for testing changes and gauging users’ responses.

After you have taken the time to assess how possible changes can affect engagement and conversions, remember to think big. Change more than just the placement of sidebars or font size. Dare to make drastic changes to your overall branding tone, site design, and more.

The biggest cheat? Letting go of your assumptions

One of the most dangerous threats to A/B testing is gut instinct. No test should be run without a valid hypothesis and a reason for running it, but holding on too tightly to that hypothesis can kill your results.

A good illustration comes from VWO’s collection of A/B test results that surprised experts. It highlights websites that got unexpected wins by going against trends. For example, while including photos or videos in marketing is often called a must, some A/B tests have shown that this isn’t always the most effective method. Simply put, recommended best-practice guidelines aren’t always an accurate reflection of what audiences want.

Remember: The tests that prove your assumptions wrong are the ones that often teach you the most.

Your turn

There you have our top ways to cheat at A/B testing: simple hacks that will give you more reliable results. Let us know if you have any other tricks that have helped you cheat the A/B testing process, and tell us how these worked for you.



Ivan Guzenko

Ivan Guzenko is VP and Founder of SmartyAds. His passions are advertising and social engineering, and he is an active fighter against digital fraud.

