Editor’s Note: This is the third article in a four-part series.
- Part 1 – A Beginner’s Guide to A/B Testing with Crazy Egg
- Part 2 – A/B Testing: How and Where to Start
- Part 4 – How to Understand and Act On Your A/B Test Results
The smallest design changes can sometimes have the biggest impact on your conversion rates.
And sometimes the most popular fads will actually backfire horrendously on your site (I’m looking at you, infinite scroll).
The only way to know for sure is to A/B test those changes to see what resonates with your website visitors and target audience. In the two previous articles of this series, we outlined how to set goals, know what to A/B test, and come up with ideas.
Now we’re moving onto creating the actual test!
The tips and advice I’m offering in this article are specific to our in-house A/B testing tool, but you can apply the best practices no matter which service you choose.
Limit the Number of Factors You’re Testing
Typically, the whole reason you do an A/B test is that you want to find out which changes impact conversions.
So, let’s say that you make a change to the header, and you also change the color of a CTA, and you reorder your navigation. If you do all these things within a single variant (the “B” to your “A,” which is your original design), then you won’t know which change impacted your conversion rate.
Each of these three changes should be its own test, so you can better track what caused a rise or a dip in your conversions, and the tests should be launched consecutively, not concurrently.
Note: You can launch separate tests on different pages on your site. Just try not to change too much in any given test.
Ideally, you should come up with ideas based on an analysis of your site and then prioritize those ideas by difficulty of change.
For example, based on our previous article where my goal was to redesign my homepage, my first order of business would be to change the color of the CTA. My aim is to pull attention away from all the competing links on the nav bar – specifically the “Blog” link, which we saw was an attention hotspot on the heatmap.
Once I run this test and I find that the winning variant is red, I can then move onto the next idea: changing the header. We saw in the List report that people were clicking on this non-linked element, so I want to see if conversions improve if I make it interactive.
By running these two A/B tests separately, I can be absolutely sure of what moved the needle on my conversion rates.
Setting Up the A/B Test
The next step in creating a test is to choose the page that you want to experiment with and specify the device type you want it to display on.
Since visitor experiences can be drastically different depending on the device being used to access your site, it’s important to customize your experiments accordingly.
Based on the findings from my site analysis, I’m going to focus on the homepage and opt for the desktop view, since that’s what I conducted my analysis on.
Crazy Egg Editor
If your A/B test involves making small adjustments like CTA color, headline copy, or removing elements, the Crazy Egg editor is your new best friend. It lets you make changes to your site with just the click of a button.
Let’s introduce you to the various Editor tools at your disposal.
First up is the control bar at the bottom. It allows you to:
- Create a new variant or multiple variants at a time: For example, I could create one variant where I change the CTA color to red. Or I could create additional variants where I change it to green, change it to yellow, and so on.
- Change the device type: This is handy if I want to either check what device type I’m setting my A/B test up on, or switch to a different device.
- Undo or redo: Convenient if I make a mistake!
- Preview the change: Clicking on this will open the preview in a new tab so you can see what your visitors will experience when you make the change.
- Interactive mode: Notice a pesky pop-up or banner that you don’t want on your screen? By switching to interactive mode, you’ll be able to access dynamic elements on your page. You can open dropdown menus, close pop-up modals, or tab through slideshows. Once you’ve finished, click on the Interactive Mode icon to get back to editing.
- Move element options: Customize how you want to be able to move and shift elements on your page as you’re setting up your A/B test.
When it comes to making changes on your site, you have a few options available to you:
- Change the color of the background, hide an element, or modify its HTML code.
- Change the text font, style, and color, or add a link to the text.
- Move an element from one location on the page to another.
- Replace an image that’s already on your site. Unfortunately, we can’t yet add a new image to your site unless it’s already in your source code.
Note: If this is a feature that you are interested in, please email us at firstname.lastname@example.org and let us know!
Note: Page updates that you make in Editor are merely changes that we present to your visitors on our end when we run your A/B test. For any change to be permanent, you will need to modify your own source code.
Now that you know how to use the Editor, it’s time to dive into A/B test best practices.
Number of Variants
Remember how we recommended that you only change one thing per page at a time? When it comes to the number of variants, that restriction doesn’t apply. You can create as many variants of a single change as you’d like (some of our experienced customers have up to 22 different variants in a single test!).
For example, I could change the color of the CTA button five different ways: red, yellow, green, pink, and light blue. Any one of these changes could be what convinces my audience to click on the CTA and convert.
This is one of the benefits of our Multi-Armed Bandit approach. You can create tests for each page on your site with multiple variants, and we’ll run them all.
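Crazy Egg doesn’t publish the internals of its algorithm, but the general idea behind a Multi-Armed Bandit can be sketched with Thompson sampling: each variant’s conversion rate is modeled as a probability distribution, and each visitor is routed to whichever variant wins a random draw from those distributions. Here’s a minimal, illustrative Python sketch (all names and numbers are my own assumptions, not Crazy Egg’s implementation), simulating two variants where variant B truly converts better:

```python
import random

def thompson_sample(stats):
    """Pick a variant index by Thompson sampling.

    stats: list of [conversions, visitors] pairs, one per variant.
    Each variant's conversion rate is modeled as a Beta distribution;
    we draw one sample per variant and send traffic to the highest draw.
    """
    draws = [
        random.betavariate(1 + conversions, 1 + visitors - conversions)
        for conversions, visitors in stats
    ]
    return max(range(len(draws)), key=lambda i: draws[i])

# Simulate 5,000 visitors: variant B (index 1) truly converts at 5%,
# variant A (index 0) at only 2%.
random.seed(42)
true_rates = [0.02, 0.05]
stats = [[0, 0], [0, 0]]
for _ in range(5000):
    i = thompson_sample(stats)
    stats[i][1] += 1                     # one more visitor to variant i
    if random.random() < true_rates[i]:
        stats[i][0] += 1                 # that visitor converted

# Over time, the bandit shifts most of the traffic to the better variant.
winner_share = stats[1][1] / 5000
```

The key property, and the reason bandits pair well with many variants, is that underperforming variants automatically receive less and less traffic, so you waste fewer visitors on losing ideas than with a fixed 50/50 split.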
If you’re new to A/B testing, we recommend that you stick to three variants to start. If you’re more experienced, then we recommend between five and 10 variants, depending on the test.
We don’t usually recommend more than 10, but you know your goals best!
Now that you’ve made the changes that you think will make a positive impact, it’s time to define what counts as a conversion during your A/B test.
We provide three central goals:
- Sell more products
- Get more registrations
- Get more pageviews
And then we provide advanced users with a customized option.
Once you pick a goal, you’ll then choose how our system determines that the goal was met. We provide three preset options as well as a custom one.
Once you choose an option (in the case of my homepage A/B test where I’m changing the CTA, I chose “when someone clicks on an element”), then you’ll be taken to the next page where you can specify which element, page, or form to track.
We make it easy for you to do this by pulling up an example page so you can just point and click.
The last stop is our review page where you can double check that all the settings look good before you launch.
Snapshots for your A/B Test
One of the most important steps you can take on the review page is to enable Snapshots of your variants. While we provide conversion data on how each variant performed, seeing Snapshots of each one will help you fill in the blanks on the visitor behavior that leads to conversions (or doesn’t).
Pro Tip: Using an external A/B testing tool, but still want to benefit from Snapshot Reports? Not a problem! All you need to do is follow the directions in this help article: Snapshots for External A/B Tests.
Once you’re done adjusting the settings, just click “Start testing my ideas” and you’re all set!
How Long Should I Run an A/B Test?
We recommend that you run each test for about a month, though this is flexible depending on how much traffic your site receives. The most important signal to watch for is one of your variants receiving 95% or more of the traffic. This is usually a marker that our Multi-Armed Bandit algorithm has reached statistical certainty.
Since we handle the traffic splits on our end, you don’t need to specify an end date for your A/B test. Just check back every now and then to see how each variant is performing!
How Frequently Should I be Launching A/B Tests?
Again, this depends on your site and your traffic. You could run a test with multiple variants on each page of your site. Once you’ve reached statistical certainty, move on to the next round of testing.
You could also stagger your tests so you start on the homepage, then move onto a product page the week after, and then the contact page the week after that.
The important thing is to always have at least one test running so you can keep improving your website.
Ready to Dive In?
Congrats! You now know how to set up and launch an A/B test!
Up next, we’re going to help you understand and act on your A/B test results.
If you have any questions about what you’ve read, feel free to shoot us a note at email@example.com. While everything that was covered here applies to our in-house tool, most other A/B testing services should allow you to customize your tests and evaluate your results based on your goals.