You’ve reached article 4 in our A/B testing series. In case you missed the first three, here’s a recap:
- A Beginner’s Guide to A/B Testing with Crazy Egg
- A/B Testing: How and Where to Start
- Best Practices and Pro Tips for Using an A/B Testing Tool
Now it’s time to move on to the final chapter: how to interpret your results!
I’ll be specifically covering Crazy Egg’s own results setup, but you can use these tips no matter what service you end up using.
Keep in mind that we use a Multi-Armed Bandit approach to A/B testing, so analyzing your results is much more hands-off than it would be with a traditional 50/50 split test.
Since we automatically adjust how much traffic is directed to each variant based on its conversion rate, you can rest assured that you won’t need to sacrifice conversions for the sake of finding the winning version.
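If you’re curious how a Multi-Armed Bandit actually decides where to send the next visitor, here’s a minimal sketch of one common strategy, Thompson sampling. It’s purely illustrative (the variant names and counts are made up, and this is not our production code), but it captures the core idea: variants with stronger conversion evidence win more of the random draws, and therefore more of the traffic, without ever starving the alternatives completely.

```typescript
// Illustrative Thompson-sampling sketch (not Crazy Egg's actual code).
// Each variant keeps a Beta(conversions + 1, misses + 1) posterior over its
// conversion rate. For each visitor we draw one sample per variant and send
// the visitor to the highest draw, so stronger variants earn more traffic
// while weaker ones still get an occasional look.

interface Variant {
  name: string;
  visitors: number;
  conversions: number;
}

// Standard normal draw via Box–Muller, used to approximate the Beta posterior.
function gaussian(): number {
  const u1 = Math.random() || Number.MIN_VALUE;
  const u2 = Math.random();
  return Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
}

// Normal approximation to a Beta(a, b) draw; a reasonable shortcut once
// a and b are both large, as they are after a few hundred visitors.
function sampleBetaApprox(a: number, b: number): number {
  const mean = a / (a + b);
  const variance = (a * b) / ((a + b) ** 2 * (a + b + 1));
  return mean + Math.sqrt(variance) * gaussian();
}

function pickVariant(variants: Variant[]): Variant {
  let best = variants[0];
  let bestDraw = -Infinity;
  for (const v of variants) {
    const draw = sampleBetaApprox(v.conversions + 1, v.visitors - v.conversions + 1);
    if (draw > bestDraw) {
      bestDraw = draw;
      best = v;
    }
  }
  return best;
}

// Hypothetical numbers: variant-a converts better, so over many visitors
// it wins most of these draws and receives most of the traffic.
const test: Variant[] = [
  { name: "control", visitors: 1000, conversions: 30 },
  { name: "variant-a", visitors: 1000, conversions: 45 },
];
console.log(pickVariant(test).name);
```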
And with that, let’s dive right in!
Understanding and Evaluating Your A/B Test Results
This is what a typical results page looks like.
I’ve taken the liberty of annotating this page for clarification:
- Total traffic: This metric doesn’t show you the percentage of traffic that has already seen each variant. Instead, it shows how incoming traffic will be divided among the variants going forward. These percentages will continue to change as we track more conversions and home in on the winning version.
- Visitors: This is the number of visitors who have seen each variant so far.
- Conversions: This is the number of people who have completed the goal that you specified during the goals setup phase.
- Conversion rate: This is the number of visitors who completed the goal divided by the total number of visitors to that variant, expressed as a percentage.
- Improvement: This shows whether the variant’s conversion rate is higher or lower than the control’s (both calculations are sketched just below).
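To make those last two metrics concrete, here’s how they’re typically computed, using hypothetical numbers (this sketch assumes improvement is reported as relative lift over the control):

```typescript
// Conversion rate and improvement for a hypothetical control/variant pair.
const controlVisitors = 2000;
const controlConversions = 80;
const variantVisitors = 2000;
const variantConversions = 100;

// Conversion rate: conversions divided by visitors, shown as a percentage.
const controlRate = controlConversions / controlVisitors; // 0.04 -> 4.0%
const variantRate = variantConversions / variantVisitors; // 0.05 -> 5.0%

// Improvement: the variant's lift relative to the control's rate
// (assumed convention for this sketch).
const improvement = (variantRate - controlRate) / controlRate;
console.log(`Improvement: ${(improvement * 100).toFixed(1)}%`); // Improvement: 25.0%
```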
On the results page, you can also easily see each of the variants that you currently have running:
Since we automatically run and monitor your A/B tests based on our own algorithms, there’s not much you need to do to keep a test going. We recommend that you check back at least once a week just to see the differences in the conversion and traffic rates, but otherwise, you can relax and spend time in other areas of the Crazy Egg dashboard (like Recordings).
Understanding the Performance of Variants using Snapshot Reports
We always recommend that you create Snapshots of each of your variants. This lets you understand exactly how each of the changes you made is impacting your visitors’ click behavior. Checking the data in your Snapshot reports can also give you ideas for what test to launch next.
Pro Tip: If you are using a service other than Crazy Egg to run your A/B tests, you can still create Snapshots of the variants by following the directions in this article.
Hiding Pop-ups In a Snapshot
Have you noticed that a pop-up is covering a key section of your page and making it difficult to analyze your Snapshot data?
All you need to do is click on the gear icon to edit the Snapshot, scroll down to “Show more settings,” and choose “Omit pop-ups.” And you’re all set! You can also customize any of the other settings.
Compare the Snapshot Reports of Your Variants
While we recommend that you start by going into each Snapshot report individually to see whether visitors are behaving differently because of the changes you’re testing, you can also compare two variants, or the control and a variant, side by side.
This lets you see whether your changes have impacted how far people scroll down your page, whether different segments respond differently to each variant, and whether any differences between your visitor segments are growing.
Snapshot Analysis Inspiration
If you’re in need of some guidance when it comes to conducting Snapshot analysis, check out these step-by-step guides and resources:
- A/B Testing: How and Where to Start
- What Should I A/B Test? An Ecommerce Homepage Analysis
- Converting More Customers For The Elephants: A Homepage Analysis
- When’s the Last Time You Analyzed Your Conversion Funnel?
- Tools, Tips, and Getting Ready for Testing: A Quick-start Guide to CRO
- A Recipe for CRO Success: Step-by-Step Instructions for Website Optimization
Launching a New A/B Test and Retiring Variants
While there is no set length of time to run a test, we would say that at least a month is a good starting point.
You should consider retiring underperforming variants and launching new tests once our Multi-Armed Bandit algorithm is diverting the vast majority of traffic to the winning version of your page (this could be the control, or a new variant).
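In practice, “the vast majority of traffic” goes hand in hand with a high probability that the leader genuinely beats the control. If you’d like to sanity-check that probability yourself from raw visitor and conversion counts, here’s one common Bayesian approach, sketched with hypothetical numbers (a generic method, not a description of our internal math):

```typescript
// Monte Carlo estimate of P(variant beats control), using Beta posteriors
// over each conversion rate, approximated by normals (a reasonable shortcut
// once each arm has a few hundred visitors). All counts are hypothetical.

// Standard normal draw via Box–Muller.
function gaussian(): number {
  const u1 = Math.random() || Number.MIN_VALUE;
  const u2 = Math.random();
  return Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
}

// One plausible conversion rate, drawn from Beta(conversions + 1, misses + 1).
function sampleRate(conversions: number, visitors: number): number {
  const a = conversions + 1;
  const b = visitors - conversions + 1;
  const mean = a / (a + b);
  const sd = Math.sqrt((a * b) / ((a + b) ** 2 * (a + b + 1)));
  return mean + sd * gaussian();
}

const trials = 100_000;
let wins = 0;
for (let i = 0; i < trials; i++) {
  // Variant: 110 conversions from 2,000 visitors; control: 80 from 2,000.
  if (sampleRate(110, 2000) > sampleRate(80, 2000)) wins++;
}
console.log(`P(variant > control) ≈ ${(wins / trials).toFixed(3)}`); // ≈ 0.99
// A probability of 0.95 or higher is a common informal threshold.
```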
Here is an example of an A/B test that has reached statistical certainty:
At this point, there are a couple of different options based on your in-house design and development resources.
Action Plan #1: Minimal/No In-House Design or Development Team
Unlike a traditional 50/50 A/B test, you don’t actually ever need to stop a Multi-Armed Bandit test.
Since we don’t actually change your source code (your changes are injected and rendered on top of the page in each visitor’s browser), stopping the Multi-Armed Bandit test means your visitors will go back to seeing the control, even if the variant won.
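To see why, it helps to picture how client-side testing tools generally work: a small script restyles the page in the browser after it loads, while the HTML your server sends never changes. A simplified, hypothetical sketch (not our actual script):

```typescript
// Simplified, hypothetical sketch of client-side variant injection.
// The server always sends the control page; this script restyles it in the
// visitor's browser. Stop serving the script (i.e., stop the test) and
// every visitor sees the control again, no matter which variant won.
function applyVariant(variantName: string): void {
  if (variantName === "yellow-headline") {
    const headline = document.querySelector<HTMLHeadingElement>("h1");
    if (headline) {
      headline.style.color = "yellow"; // the only change the variant makes
    }
  }
  // "control" applies no changes: the page renders exactly as served.
}

// In a real tool the assigned variant comes from the testing service;
// here it is hard-coded for illustration.
applyVariant("yellow-headline");
```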
If the variant wins your A/B test, you should only stop the test if you have the capability to change your source code to match the change you made in the variant.
If you don’t have that ability, then we recommend that you keep the A/B test continuously running, and that you only retire losing variants.
It’s important to note that you cannot retire the control, even if it is losing, because then we will be unable to display the winning variant to your visitors.
Instead, the workflow we suggest is that you retire the losing variant(s), duplicate the winning variant, and then launch a new test from that.
To illustrate this point:
Let’s say that the control has a headline that is black, but the winning variant has a headline that is yellow.
I would first retire any losing variants, then create a new test that experiments with a different CTA color alongside the winning yellow headline, and see how that impacts conversions.
How to retire losing variants
How to duplicate the winning variant and make changes
Don’t worry about the control that you’ll keep running: our algorithm directs only 5% of traffic to it, a negligible amount, so the losing control will rarely be seen by visitors.
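If you’d like to put a number on “negligible,” here’s a quick back-of-the-envelope calculation with hypothetical figures:

```typescript
// Back-of-the-envelope cost of keeping a losing control live (hypothetical).
const monthlyVisitors = 10_000;
const controlShare = 0.05; // ~5% of traffic still routed to the control
const controlRate = 0.04;  // losing control converts at 4%
const winnerRate = 0.05;   // winning variant converts at 5%

const lostConversions = monthlyVisitors * controlShare * (winnerRate - controlRate);
console.log(lostConversions.toFixed(1)); // 5.0 conversions per month
```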
Action Plan #2: In-House Design and Development Team
If you have a team with the technical know-how or resources, then we recommend that you make changes to your source code reflecting the winning variant, and then stop the test before launching a new one.
You could even batch your list of requested site updates so that you only launch a major design overhaul after four to six tests have run. That way, you reduce the load on your team.
This approach also lets you launch new tests off of winning variants so you can see how all of your changes work together for your website visitors.
Rinse and Repeat
Believe it or not, we’ve reached the end of the Crazy Egg Guide to A/B Testing! There are countless ways to go about improving your website, your user experience, and your conversion rates.
It’s our hope that this guide has given you a starting point for how to:
- Establish a baseline for your site
- Understand your website visitors
- Come up with ideas that will move the needle
- Launch tests that will help you accomplish your goals
- Evaluate the results of the tests
- Create new tests that will continue to build on previous winning variants
In need of A/B testing inspiration? Learn from those who have come before:
- How ecommerce company WallMonkeys increased their conversion rate by 550%
- How Intuit uses Crazy Egg with Optimizely
- How Radio Free Europe uses Crazy Egg’s heatmaps and scrollmaps to shed light on the user experience
Launched any A/B tests of your own? Let us know the outcomes by commenting on the post or sharing on social media!