With 94% of internet users regularly checking their email accounts, it’s little surprise that marketers spend hours tweaking and optimizing their email campaigns.
At Zopim we’re firm believers that email marketing is the lifeblood of any online business. But putting together a strong email marketing campaign that converts customers is no easy feat.
So in the latter half of last year, we decided to find out what our customers really wanted from an email. Coinciding with the launch of our new and improved mobile-optimized chat widget, we constructed a comprehensive A/B split testing campaign to determine what kinds of subject lines, email content, and calls to action our customers preferred.
In this article I’ll look at how we planned our split testing campaign and the results we saw.
A few caveats
Although email marketing can and should be used by all businesses, what works for one company might not work for another.
Zopim is a cloud-based SaaS (Software-as-a-Service) platform providing live chat to a variety of businesses around the world, so our customers are primarily other businesses. If your customer base is radically different, you’ll need to adapt these findings accordingly.
This article has two sections. In the first, I explain the methods of email A/B testing, and in the second, I describe how to construct an email that gets opened and read. The former applies to any company or individual who sends emails. The latter, however, might not apply to every kind of business; if your business sells shoes, for example, your customers will likely respond to very different emails than ours did.
How We Conducted our Email A/B Testing Campaign
Before starting your email marketing campaign, there are a number of important elements you need to consider.
1. What to Test
First, you need to figure out what you’d like to test. This might seem straightforward, but with over 25 distinct elements in an email, it’s important to have a clear goal in mind before starting out.
Since we had never tested email marketing campaigns with our customers, we wanted to look at the following parameters:
- The email subject line (“SL”),
- The email content (“Content”), and
- The call-to-action used (“CTA”)
2. How to Determine the Winner
Once you’ve identified what you are testing, you must decide how you will determine the winner.
In our case, we decided on the winner of each campaign using the following rules:
- Winner of SL was based on open rates (i.e., how many times the email had been opened)
- Winner of Content and CTA was determined by click-through rates (“CTRs”) (i.e., how many links, including the CTA, had been clicked in the email)
After sending out the emails, the results were collected from MailChimp’s Dashboard.
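The two winner metrics above are simple ratios. Here is a minimal sketch of how they are computed (the function names and figures are illustrative; in practice, we read these numbers straight off MailChimp’s dashboard):

```python
def open_rate(unique_opens: int, delivered: int) -> float:
    """Open rate: unique opens divided by emails delivered."""
    return unique_opens / delivered

def click_through_rate(unique_clicks: int, delivered: int) -> float:
    """CTR: unique clicks on any link in the email (including the
    CTA) divided by emails delivered."""
    return unique_clicks / delivered

# Example: a variant delivered to 3,300 customers
print(f"Open rate: {open_rate(932, 3300):.2%}")     # 28.24%
print(f"CTR: {click_through_rate(186, 3300):.2%}")  # 5.64%
```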
3. The Sample Size
The next thing you need to determine is the sample size you’d like to test. There’s no general rule of thumb for how large a sample size needs to be (in fact, there’s a whole scientific basis for calculating the appropriate size of an A/B testing campaign); however, it should not be so small that the results are meaningless.
We had over 80,000 customers we wanted to test our emails on, and so we roughly divided them into individual segments of approximately 6,600. For every A/B split test we conducted, there would be an average of 3,300 customers in each group.
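The random assignment itself is straightforward. Here is a rough sketch (the customer IDs are stand-ins; in reality the segmentation is handled inside your mailing tool):

```python
import random

customers = list(range(80_000))  # stand-in IDs for our customer list
random.shuffle(customers)        # randomize order before segmenting

# Carve off one segment of ~6,600 and split it into two equal test groups
segment = customers[:6_600]
group_a, group_b = segment[:3_300], segment[3_300:]

print(len(group_a), len(group_b))  # 3300 3300
```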
To ensure our results had statistical significance, we used this A/B Calculator.
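Under the hood, calculators like this typically run a two-proportion z-test. A minimal sketch, using stand-in open counts chosen only for illustration, looks like this:

```python
import math

def z_test_two_proportions(conv_a: int, n_a: int,
                           conv_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Example: 3,300 recipients per group, open rates of ~28.2% vs ~30.6%
z = z_test_two_proportions(931, 3300, 1010, 3300)
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

A |z| above 1.96 means the difference is significant at the 95% confidence level; below that, the "winner" may just be noise.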
4. The Email Platform
We decided to use MailChimp as it has detailed reporting and very flexible pricing options.
5. The Plan
Once you’ve got all the pieces ready, you need to create a plan for what you’re going to test in the email campaign.
Our basic plan, which evolved quite a bit along the way, was focused on testing SLs, Content, and CTAs:
| Test | Variants Compared |
| --- | --- |
| Test 1 | SL 1 v SL 2 v SL 3 v SL 4 v SL 5 |
| Test 2 | Content 1 v Content 2 v Content 3 |
| Test 3 | CTA 1 v CTA 2 v CTA 3 v CTA 4 |
* Since the CTA is integrally linked to the Content, we had to test both together. In some cases, a particular CTA would only work with one type of Content (for example, if the Content was longer).
6. The Email
The final piece of the puzzle is the email itself.
We knew what we wanted to test, but we had to actually have something of value to send to our customers. In this case, we were launching a new feature and we also wanted to give all our “Lite” (free) customers a discount code for a free upgrade to one of our paid plans.
With all the various elements of an A/B testing campaign in place, we spent the next 3 months sending a series of emails at exactly the same time in the week.
Here’s what we found.
Subject Line: Don’t Sell your Features, Demonstrate Value to Customers
With over 80% of emails going unopened, there’s a good chance that a majority of your recipients will only ever see your SL. So it’s important to get this right. In fact, many would argue that this is the most important element of any email.
Our email was about the upgrade discount code we were offering and so we came up with five different variants of the SL.
Over the years at Zopim, we had used a few different types of SLs—without any real testing—and we were unsure which style would work best. So, we decided to take our top five performing styles of SL and modify them to fit the content of this email. The open rates of the SLs were as follows:
| Key | Subject Line | Open Rate |
| --- | --- | --- |
| SL 1 | Unlock Zopim’s Exclusive Features with a Complimentary Upgrade | 25.50% |
| SL 2 | Thanks for Being a Loyal Zopim User – Enjoy the Free Upgrade | 28.20% |
| SL 3 | Try Zopim’s Basic Plan Free for 1 Month | 22.90% |
| SL 4 | Try our New Mobile-Optimized Live Chat Widget with a Free Upgrade | 21.90% |
| SL 5 | [Zopim] You’ve earned a free upgrade – Thanks for sticking with us | 30.60% |
The results of this phase of the split testing campaign were quite clear: SL 2 and SL 5 were more popular than the rest. However, we wanted to have our company’s name featured prominently, so we were leaning towards SL 5.
Instead of picking one at random, we tested SL 5 and a reworked SL 2 (“[Zopim] Thanks for being a loyal user—Enjoy the free upgrade”):
Although the difference in open rates was a little under 2%, we decided SL 5 worked better overall, and so we went with: “[Zopim] You’ve earned a free upgrade—Thanks for sticking with us”.
One of our interesting findings was that having the word “free” in our subject line did not negatively impact open rates, despite many articles suggesting the contrary.
Based on our results, we came to the following conclusions for the SL:

- Make sure your business’s name stands out in the subject line. In our case we put “Zopim” in square brackets at the start of the subject line: [Zopim]. This improved our open rates by approximately 2.5% when tested against an SL that did not include the company name. In AWeber’s study of three high-performing email newsletters, they discovered that including the name of the company increased open rates.
- Give some kind of tangible value to your readers. Our emails had the words “free upgrade” in the first half of the subject, which helped readers decide whether they wanted an upgrade or not. Jamie Turner, founder of the 60 Second Marketer, agrees with our thesis and writes that including an incentive in the email’s subject line can “increase open rates by as much as 50%”.
- Remember customers are interested in a solution to their problems. They don’t care about your awesome features and won’t open emails along the lines of “check out our new features.” In fact the lowest performing emails were the ones that mentioned our new features.
Content: Don’t Overwhelm with Information and Leave Plenty of Whitespace
Getting the customer to open your email is undoubtedly a challenge, but it’s only half the battle. Once you’ve got their attention, you need to provide them with useful information and get them to perform an action.
With that in mind, we came up with three different styles for our emails:
- Content 1 was a short and sweet email with only 1 CTA, a few paragraphs of text, and a single image. The goal was to provide the customer with the most relevant points and push them immediately towards an action.
- Content 2 was a much longer email with multiple images, CTAs, and paragraphs of text. The goal here was to provide as much information about the new feature and use that to push customers towards upgrading.
- Content 3 was our longest email. The goal here was to demonstrate the value of the new features as well as the benefits of upgrading.
The CTRs for the three different types of Content tested were as follows:
The differences between Content 1, 2, and 3 were not huge, and we chalked that up to the fact that the CTAs were identical in all three emails (see the next section for CTA testing).
But because we were testing a few different elements, namely images, text, and bullet points, it’s hard to say with certainty which factors affected CTRs the most.
However, since Content 2 had the highest CTRs, we decided that the sweet spot for our emails would be to have a few paragraphs of text explaining the features, a couple of images, and a couple of CTAs.
Based on our results, we came to the following conclusions for the Content:
- Keep your email in moderation: not too much text, too many images, or too many CTAs. However, this really depends on what kind of email you are sending. In our case, we noticed that customers wanted a little more information before they would act, so adding bullet points explaining our new feature helped a lot.
- Use plenty of images. Although we didn’t A/B test images, Unhaggle did and they recorded an incredible 378% increase in their CTRs solely with the addition of a few pictures. In our emails, Content 2 had an additional image and it performed better.
- Present the key points of your email as easy-to-read bullet points. This isn’t definitive as we didn’t test it thoroughly, but again, Content 2 performed better than Content 1 (which had no bulleted information). Additionally, Pardot published a case study which indicated that bulleted information was likely to increase readership.
CTAs: Be Aggressive and Keep Asking
The final piece of the email marketing puzzle is the CTA. After all, the aim of the email is to have your customers perform an action.
So, after we had narrowed down the SL and Content, we decided to test out 4 sets of CTAs.
*Depending on the length of the Content, we had to use one or more CTAs. For example, if the Content was 1–2 paragraphs, one CTA was sufficient. On the other hand, if the Content ran to 4–6 paragraphs, two CTAs were required.
We added the CTAs to our emails in a variety of formats and our CTRs (averaged out) were as follows:
CTA 2 (both “Upgrade Now” and “Free Upgrade”) turned out to be the most popular, yielding the highest CTR. These calls to action are short and to the point.
Combining our information on CTA and Content CTRs, we came up with a formula to maximize the CTRs of emails:
- If your email is longer than a few paragraphs, include additional CTAs. Ideally, you want to ensure the reader does not have to scroll up or down to find a CTA. In our emails, the ideal number of CTAs was 2, evenly spaced out within the email. However, the number of CTAs you have in your email really depends on your customer base. For example, Whirlpool reduced the number of CTAs in its emails from four to one and achieved a 42% increase in clicks. Try a few variations and see what works best for you.
- CTAs that are in capital letters worked well for us. We didn’t want to spam our readers with ALL CAPS CTAs, but we found that for some reason these performed the best. We would encourage everyone to try a few different variations (some caps, some not) to see what works for you. Although some sources discourage the use of all caps, we found that it worked for our campaigns.
- The text inside the CTA should be easily legible. For example, don’t use words that are difficult to read or that clash with the design. CTA 1 was hard to read and so didn’t perform as well as the others. Another way to put it: ensure your CTAs don’t have too many characters (90–150 characters is the sweet spot).
- Ensure your CTAs are concise and to the point. In our case having 2 words in the CTA worked the best (i.e., “FREE UPGRADE” and “UPGRADE NOW”). Additionally, MathOffice.com found that adding value to the CTA could increase conversion by 14.79%—so make it interesting.
Results and Final Thoughts: Email A/B Testing
In total, we sent out 80,000 emails, broken down into 12 separate campaigns. On average, the open rate across all emails was 28.25%, while our average CTR was 5.65%.
The email with the subject line “[Zopim] You’ve earned a free upgrade – Thanks for sticking with us” earned the highest open rate at 38.8%. Meanwhile, the email with the highest CTR at 7.4% was the one focusing on the upgrade, with a few paragraphs of text and information about the new feature.
According to MailChimp’s own industry benchmarks, our emails performed well above average:
Our main takeaway from the entire campaign is that it’s necessary to put away any email marketing preconceptions before undertaking such a campaign. Not only is there a distinct difference between B2B and B2C marketing (ours being the former), but individuals in different industries also have different tastes when it comes to their emails.
At the end of our campaign, we successfully completed both our objectives. We had a good sense of the kinds of emails our customers liked and we managed to obtain thousands of upgrades.
Going forward we will be testing our landing pages and homepages. Anyone know of a good tool to do that?
Read other Crazy Egg articles about A/B testing.